A beginner’s guide to common SEO terms
1. SEM, SEO & PPC
SEM stands for Search Engine Marketing, and as the name implies, it involves marketing services or products through search engines. SEM is divided into two branches: SEO and PPC. SEO stands for Search Engine Optimization, the practice of optimizing websites so that their pages appear in the organic search results. PPC stands for Pay-Per-Click, the practice of purchasing clicks from search engines; these clicks come from the sponsored listings in the search results.
2. Backlink
Also referred to as an inlink or simply a link, a backlink is a hyperlink on another website pointing back to your own website. Backlinks are important for SEO because they directly affect a page's PageRank and, through it, its search engine rankings.
3. PageRank
PageRank is an algorithm that Google uses to estimate the relative importance of pages around the web. The basic idea behind the algorithm is that a link from page A to page B can be seen as a vote of trust from page A to page B. The higher the number of links to a page (weighted by their value), therefore, the higher the probability that the page is relevant.
4. Linkbait
A linkbait is a piece of web content published on a website or blog with the goal of attracting as many backlinks as possible (to improve search rankings). Usually it's written content, but it can also be a video, a picture, a quiz or anything else. A classic example of linkbait is the "Top 10" list that became so popular on social bookmarking sites.
5. Link farm
A link farm is a group of websites where each site links to every other site, with the goal of increasing the PageRank of all the sites in the farm. This practice was common in the early days of search engines, but today it is considered a spam technique (and it can get you penalized by Google).
6. Anchor Text
The anchor text of a backlink is the clickable text on the web page. Having keyword-rich anchor text helps with SEO because Google will associate those keywords with the content of the linked website. If you have a weight loss blog, for instance, it would help your search engine rankings if some of your backlinks had "weight loss" as their anchor text.
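In HTML, the anchor text is simply the text between the opening and closing anchor tags. A minimal example (the URL is a placeholder):

```html
<!-- "weight loss tips" is the anchor text Google associates with the linked page -->
<a href="https://www.example.com/weight-loss/">weight loss tips</a>
```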
7. Nofollow
Nofollow is a link attribute used by website owners to signal to Google that they don't vouch for the site they are linking to. This can happen when the link was created by the users themselves (e.g., blog comments) or when the link was paid for (e.g., sponsors and advertisers). When Google sees the nofollow attribute, it essentially does not count that link in its PageRank and search algorithms.
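The attribute is added to the link's rel value, for example (placeholder URL):

```html
<!-- rel="nofollow" tells Google not to pass PageRank through this link -->
<a href="https://www.example.com/" rel="nofollow">a site we don't vouch for</a>
```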
8. Link Sculpting
By using the nofollow attribute strategically, webmasters were able to control the flow of PageRank within their websites, thus increasing the search engine rankings of desired pages. This practice is no longer effective, as Google changed how it handles the nofollow attribute.
9. Title Tag
The title tag is the title of a web page, and it is one of the most important factors in Google's search ranking algorithm. Ideally, your title tag should be unique and contain the main keywords of your page. You can see the title tag of any web page at the top of the browser (or on its tab) while navigating it.
10. Meta Tags
Meta tags are used to give search engines more information about the content of your pages. They are placed inside the HEAD section of the HTML code, so they are not visible to human visitors.
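Putting the last two terms together, a page's HEAD section might look like this (the title and description text are made-up examples):

```html
<head>
  <!-- the title tag: shown on the browser tab and in the search results -->
  <title>Weight Loss Tips for Beginners | Example Blog</title>
  <!-- a meta tag: invisible to visitors, but search engines may use it as the snippet -->
  <meta name="description" content="Practical, beginner-friendly weight loss tips.">
</head>
```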
11. Search Algorithm
Google's search algorithm is used to find the most relevant web pages for any search query. The algorithm considers over 200 factors (according to Google), including the PageRank value, the content of the page, the title tag, the meta tags, the age of the domain and much more.
12. SERP
SERP stands for Search Engine Results Page: the page you get when you search for a particular keyword on Google or another search engine. The amount of search traffic a website receives depends on its rankings within the SERPs.
13. Sandbox
Google maintains a separate index, the sandbox, where it places newly discovered websites. While a website is in the sandbox, it does not appear in the search results for normal search queries. Once Google verifies that the website is legitimate, it moves it out of the sandbox and into the main index.
14. Keyword Density
To find the keyword density of a particular page, you divide the number of times a keyword is used by the total number of words on the page. Keyword density used to be an important SEO factor, as the early algorithms placed heavy weight on it. This is no longer the case.
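The calculation above can be sketched in a few lines of Python (using a naive whitespace word count; real SEO tools tokenize more carefully):

```python
def keyword_density(text: str, keyword: str) -> float:
    """Return keyword occurrences divided by the total word count."""
    words = text.lower().split()  # naive whitespace tokenization
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

page = "weight loss tips for fast weight loss results today now"
print(keyword_density(page, "weight"))  # 2 occurrences / 10 words = 0.2
```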
15. Keyword Stuffing
Since keyword density was an important part of the early search algorithms, webmasters started to game the system by artificially inflating the keyword density on their websites. This is referred to as keyword stuffing. These days the practice won't help you, and it can even get you penalized by Google.
16. Cloaking
Cloaking is a technique that involves making the same web page show different content to search engines and to human visitors. The goal is to get the page ranked for specific keywords, and then use the incoming traffic to promote unrelated products or services. This practice is considered spam and can get you penalized (if not banned) by most search engines.
17. Web Crawler
Also known as a search bot or spider, a web crawler is a computer program that browses the web on behalf of search engines, trying to discover new links and new pages. This is the first step in the indexing process.
18. Duplicate Content
Duplicate content refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar. You should avoid having duplicate content on your website because it can get you penalized.
19. Canonical URL
Canonicalization is the process of transforming data that has more than one possible representation into a standard, canonical representation. A canonical URL, therefore, is the official URL for accessing a particular page on your website. For instance, the canonical version of your domain might be http://www.domain.com instead of http://domain.com.
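You can declare the canonical URL of a page with a link tag in its HEAD section (the domain here is the placeholder from the example above):

```html
<!-- placed on every variant of the page (http://domain.com, URLs with tracking parameters, etc.) -->
<link rel="canonical" href="http://www.domain.com/">
```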
20. Robots.txt
Robots.txt is simply a text file, placed in the root of the domain, that is used to give search bots information about the structure of the website. For instance, via the robots.txt file it's possible to block specific search robots and to restrict their access to specific folders or sections of the website.
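A minimal robots.txt might look like this (the folder and bot names are made-up examples):

```
# robots.txt, served at http://www.domain.com/robots.txt
User-agent: *
Disallow: /private/

# block one specific (hypothetical) bot from the whole site
User-agent: BadBot
Disallow: /
```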
We hope this quick guide to common SEO terms helps you polish your SEO skills. If we missed something, let us know; we would love to hear from you.