Here are definitions of some key SEO terms you're likely to come across.
Meta title: The main title of a page as defined in its underlying HTML code. Search engines such as Google typically display it as the clickable headline on their results pages, although they sometimes rewrite it. For example, the meta title for this page is “The Best Website Builder for SEO.”
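In HTML, the meta title is set with the standard title element inside the page's head; the sketch below uses this page's own title as the example:

```html
<head>
  <!-- The <title> element defines the meta title shown in search results -->
  <title>The Best Website Builder for SEO</title>
</head>
```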
Meta description: A brief overview of a web page designed to summarize its content for users on a search engine results page. A good meta description helps users decide whether your page has what they're looking for. For example, the meta description for this page would be: “Find the best website builder to help your business appear in Google searches and attract more customers.”
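The meta description is set with a standard meta tag in the page's head, as in this sketch using the description quoted above:

```html
<head>
  <!-- The description meta tag supplies the snippet search engines may show -->
  <meta name="description" content="Find the best website builder to help your business appear in Google searches and attract more customers.">
</head>
```

Note that search engines treat this as a suggestion and may generate their own snippet from the page content instead.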
Permalink: A static, unchanging URL for a page. A descriptive permalink tells search engines more about your content, and some website builders let you customize permalinks to target specific keywords.
Responsive design: A responsive website is just as easy to read on a laptop as on a mobile device, dynamically adapting its layout to fit the screen and give users the best experience.
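Two common building blocks of responsive design are the viewport meta tag and CSS media queries; the class name in this sketch is illustrative:

```html
<head>
  <!-- Tells mobile browsers to scale the page to the device width -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    /* On narrow screens, stack the (hypothetical) column layout vertically */
    @media (max-width: 600px) {
      .columns { display: block; }
    }
  </style>
</head>
```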
301 redirect: A signal that tells browsers and search engines to permanently send visitors from one URL to another. Websites use 301 redirects when they replace an old or obsolete page with a new one and want search engines to index the new version.
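How you set up a 301 depends on your host or website builder. On an Apache server, for example, one common approach is a rule in the .htaccess file; the paths and domain below are illustrative:

```apache
# Permanently redirect the old page to its replacement
Redirect 301 /old-page https://www.example.com/new-page
```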
SSL certificate: A digital certificate, verified by a third party, that authenticates a website's identity and enables encrypted (HTTPS) connections. Search engines such as Google treat HTTPS as a positive ranking signal.
Canonical tag: A small piece of HTML code that tells search engines which page to index from a set of duplicates. For example, a website might have two versions of a page for two different regions; tagging one as the canonical version prevents the duplicate from appearing on search engine results pages.
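The canonical tag is a standard link element placed in the head of each duplicate page, pointing at the preferred version; the URL here is illustrative:

```html
<head>
  <!-- On each regional duplicate, point search engines at the preferred URL -->
  <link rel="canonical" href="https://www.example.com/page">
</head>
```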
Sitemap: A file that lists all the pages on a website and how they relate to each other. Search engines use it to index the site more efficiently.
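Sitemaps for search engines usually follow the standard XML sitemap format; this minimal sketch lists a single page, with an illustrative URL and date:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per page; <lastmod> is optional -->
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```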
Robots.txt: A file that controls how web robots (“bots”) make requests to a website. You can use it to determine which parts of your site bots can and cannot “crawl”. For example, you may want to prevent Googlebot from crawling your login page.
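A robots.txt file lives at the root of the site and pairs user-agent names with allow/disallow rules; this sketch blocks Googlebot from the login example above, with an illustrative path:

```
# Block Googlebot from the (hypothetical) login page
User-agent: Googlebot
Disallow: /login

# Allow all other bots to crawl everything
User-agent: *
Allow: /
```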
Schema/structured data: Structured data is a standardized format for web content that search engines can use for classification. For example, if you search for “events in London,” Google can show a list of upcoming events because event websites mark up that information with structured data.
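One common way to add structured data is a JSON-LD script using schema.org vocabulary; the Event type is standard, while the name, date, and venue in this sketch are illustrative:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Example Concert",
  "startDate": "2024-06-01T19:00",
  "location": {
    "@type": "Place",
    "name": "Example Hall",
    "address": "London"
  }
}
</script>
```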
HTML: The standard markup language used to build web pages.
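A complete web page can be as small as this sketch, which shows the basic HTML skeleton the terms above plug into:

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <!-- Meta title, meta description, and canonical tags all go here -->
    <title>Example Page</title>
  </head>
  <body>
    <h1>Hello</h1>
    <p>A minimal web page.</p>
  </body>
</html>
```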
URL: A URL (Uniform Resource Locator) is the human-readable address of a web page on the internet.