Strategies involving SEO have become very popular because of the results and the return on investment they deliver.
Trillions of pages are listed in Google’s index and there are only two ways to get featured in search results: SEO and Google Ads.
SEO is the acronym for Search Engine Optimization, that is, the practice of optimizing websites for search engines.
Whenever a page is created, search engines try to index it so that it is found in search results.
That’s where website optimization comes in as the only viable way to achieve organic results, through techniques that show Google your website is more relevant than competing sites.
However, if you don’t have a website, you need to hire a website building service to get started with SEO.
In this complete guide to SEO, I will teach you the concepts and techniques that allow your website to climb to and remain in the top positions on Google.
A White Hat SEO professional is one who follows all the rules of Google and other search engines to rank websites organically.
In general, White Hat SEO results take longer to appear but are more durable, and this practice keeps the site safe from any penalty.
A Black Hat SEO professional is not necessarily unaware of search engine rules; rather, they use “illegal” shortcuts to speed up the organic ranking process.
The problem with Black Hat SEO is that it leaves the website vulnerable to penalties from search engines; in the worst-case scenario, the site disappears completely from the search results, the so-called de-indexing.
Most of the time, Black Hat SEO results aren’t durable enough to get a good return on investment.
The most used Black Hat techniques are:
A backlink is nothing more than a link placed on another website pointing to yours.
There are two types of backlinks:
In addition to types, backlinks can have relationships, they are:
Linkbuilding is composed of a set of strategies that aim to get contextual and quality backlinks for one or more domains.
Linkbuilding provides:
I always recommend setting your internal and external links to open in a new tab, so that the user does not leave your page.
Benefits of configuring links to open in another tab:
Every domain has an authority, which is assigned from the backlinks.
There are several metrics from different tools, see the main ones below.
PageRank is a scale from 0 to 10 calculated by Google according to the quality, context and relevance of the backlinks received.
Domain Authority (DA) measures the authority of a domain and Page Authority (PA) measures the authority of a page; both are calculated by Moz on a scale from 1 to 100.
Domain Rating (DR) measures the rating of a domain and URL Rating (UR) measures the rating of a page; both are calculated by Ahrefs, also on a scale from 1 to 100.
Considered the first internet search engine, Architext appeared in mid-1993, later being renamed Excite.
With the success of the search engine, Yahoo (in 1994) and Google (in 1997) soon appeared.
Google was created by Larry Page and Sergey Brin to be a tool capable of cataloging the internet through keywords, bringing relevant results according to the search performed by the user.
Initially, the only ranking factor for pages was through citations (backlinks), inspired by articles and scientific research, therefore, the more citations a page received, the better positioned it would be in the search results.
At the time, this made it possible to assume that the highest-ranked page was reliable, thanks to the reputation conferred by the citations it received.
The algorithm that made this happen was PageRank, a scale from 0 to 10 that was calculated according to the quantity and quality of links directed to the page.
PageRank was conceived by Larry Page and was already functional in the first version of the search engine.
Due to the quality of the search results, PC Magazine published an article in 1997 praising the search engine and ranked it as the best search engine in the “Top 100 Web Sites”.
In addition, Search Engine Land, one of the largest portals about search engines, published an article identifying the first mention of the term SEO, in 1997, in the book Net Results, written by Bob Heyman, Rick Bruner and Leland Harden.
According to the writers, the term was used in a conversation about the positioning of the Jefferson Starship website in search engines.
As they added more keywords containing the band’s name to the site, they realized that it climbed to the first position in the search results, and so the term Search Engine Optimization was coined.
In those days, SEO was limited to keyword repetition in site content and backlinks.
It was in mid-1999 that the link building strategy emerged, in which “legal” practices were used to get backlinks. Some people, however, also used “illegal” means to obtain more links, and this practice came to be classified as Black Hat SEO.
To make link building more measurable, Google launched the Google Toolbar in 2000, an extension for Internet Explorer that displayed the PageRank of the websites being visited.
Also in 2000, Google launched Google AdWords, allowing people to pay to appear in the top positions.
With the popularization of Black Hat techniques, Google released an algorithm on November 16, 2003 called Florida that permanently changed SEO.
Google’s Florida algorithm acted as a filter for commercial terms that removed several sites from the search results.
When launched, this algorithm caused a stir among merchants who used multiple sites to drive traffic and sales.
Nevertheless, thanks to the impact of this update, several entrepreneurs began to invest more in their own websites, which made the quality of the websites significantly increase, improving the user experience.
But this was just the first update to Google’s search engine; in the following years, several more updates were made to keep improving search results, which helped Google become the most used search engine in the world.
With each update, many SEO professionals speculate about the end of this strategy; however, site optimization is about improving and highlighting the site itself, that is, the better the site, the better it will be positioned in the results.
Ever wonder how Google works? I find it amazing that any search returns results in a matter of seconds. See below the steps that go from discovering a page to displaying it on the results page.
Google has robots called Googlebot whose objective is to scan the internet in search of new pages; this discovery process is called crawling.
Crawling can be accelerated through crawl requests, sitemaps and backlinks.
As Googlebot identifies new pages, it includes all of them in an analysis to later decide whether it is appropriate to index them or not.
A curiosity is that whenever Google goes through a page, it lists all internal and external links for future crawling, ensuring that no page or site is skipped.
After crawling the new pages and deciding whether to include them in the results, the pages are indexed according to the page content, the publication date, the region the publication is aimed at, the structured data, the title and the description.
If the page does not have a description (meta-description), Google takes care of using a snippet of the page’s content so that the user knows what it is about.
After the aforementioned steps and the processes carried out by the different algorithms, Google determines in which position the page should appear in the search results.
Currently, there are over 200 ranking factors to determine the relevance and positioning of a page on Google.
Google’s algorithms aim to provide the best possible results for users’ searches.
As I mentioned in the previous topic, there are currently more than 200 ranking factors in the Google search tool.
Each algorithm works as a filter to narrow the results according to the user’s needs.
Learn below about the main Google algorithms and how they impact the organic positioning of websites.
The Florida algorithm was Google’s first update; released in 2003, it contributed to the first SEO strategies.
When this update was released, more than 50% of the sites listed were downgraded or removed from indexing.
The Florida algorithm targeted domains with exact match keywords and sites with:
The Panda algorithm was another major update; released in 2011, it affected approximately 12% of all search results worldwide.
The purpose of the Panda algorithm was to identify and penalize sites with plagiarized and/or low quality content.
Panda has received more than 27 updates since its release, always aiming to identify, classify and penalize low-quality content; the last version was released in 2015.
Penguin, which some people still call the Webspam Update, was released in 2012 and was responsible for curbing the excessive use of SEO on several sites.
Approximately 3% of all English websites were affected by Penguin at launch alone.
The main goals of the Penguin Algorithm were to identify and penalize sites that use keyword stuffing (excessive keyword repetition) and black hat techniques to generate backlinks, as is the case with PBNs.
Penguin had a series of updates until 2016, when it started working in real time.
Hummingbird, released in 2013, was quite different from previous algorithms: its goal was to completely overhaul Google’s search algorithm.
With Hummingbird, search results didn’t just depend on keywords.
Since the launch of this algorithm, the entire semantics of the search term has had an impact on the results, as well as purchase intent, meaning, context, etc.
In addition, the user’s location was used to bring more relevant results, which made the results more accurate according to the user’s intention and need.
After many years of recommending that Webmasters use HTTPS on their sites, Google released an algorithm in 2014 that made using SSL certificates a ranking factor.
With this update, the internet became more secure, as many sites were forced to use encryption in communication with users’ computers, preventing data from being intercepted by hackers.
Nicknamed Mobilegeddon in reference to the movie Armageddon, the 2015 update began prioritizing responsive websites, since most searches are done on mobile devices.
Thanks to this update, almost all websites on the internet are responsive, that is, they adapt to the screen size of cell phones, tablets and other devices, bringing more comfort to users.
RankBrain, an algorithm launched in 2015, was the first search engine algorithm to use artificial intelligence and machine learning.
RankBrain has become one of the top three positioning factors, looking at the context and semantics of content.
Focusing once again on low-quality content, the Fred algorithm was launched in 2017, also targeting sites overloaded with advertising banners.
The BERT algorithm (Bidirectional Encoder Representations from Transformers) was launched in mid-2018, focused on interpreting the context and semantics of search terms to bring more accurate results.
The BERT algorithm is what made possible the accuracy of results we have today, including in voice searches.
The Mobile-first Index was announced in late 2016 and gradually rolled out over the following months.
But it was only on July 1, 2019 that websites were required to adhere to mobile-first indexing, that is, Google began to consider only the mobile version as decisive for organic ranking.
With this update, the mobile version of websites has become primary, and this is how Google ranks websites.
Because of this update, several sites that were not responsive simply disappeared from the search results.
As I mentioned, there are more than 200 ranking factors, however, it is impossible to optimize a site for all of them.
The best thing to do is to follow all Google guidelines and observe the main factors, check some of them below.
Page authority is assigned considering the quality, quantity and contextualization of the backlinks.
Domain authority is assigned according to the quality, quantity and contextualization of backlinks.
Backlinks are still considered the primary ranking factor, so it’s best to focus on quality and context rather than quantity.
As you read earlier, most algorithms are geared towards the content of the pages, which means that the better and more original your content is, the better your organic positioning will be.
But while long content helps, size and word count alone are not what matters; the purpose of an article is to solve the user’s pain point.
Using keywords in the body of the page, in the title and in the description is mandatory to achieve a good positioning.
Google has been paying more and more attention to user experience on websites, so the time a user spends on the page is very important.
When a user lands on your page and leaves right away, it shows Google that your site doesn’t have what the user was looking for, or isn’t good enough.
Since most searches are performed on mobile devices over the mobile network, Google has decided to give priority to sites that load faster.
Ideally, the total page load time should be less than 3 seconds on the 3G network.
Considering the Mobilegeddon algorithm, websites are analyzed in their mobile version, that is, the better your page is on mobile, the better your ranking will be.
SEO can be divided into three parts: on-page SEO, off-page SEO and technical SEO.
Read on to learn the composition of these three parts of SEO strategies.
On-page SEO or “on-page optimization” aims to apply a series of techniques and practices on the website itself.
Read below some techniques that make up on-page SEO.
The title of the page does not have to be the same as the title of the article: the page title is what goes inside the <title> tag, while the content title is usually the <h1>.
This is one of the most important points for SEO; it is recommended to use a maximum of 56 characters.
The meta description is the description of the page; ideally, use a maximum of 155 characters.
Using the keyword in the title and in the description helps with the organic positioning of the page; in addition, using mental triggers in the description helps your site get clicked in the search results.
The content of a page is without a doubt one of the most important elements of on-page SEO.
Avoid repeating the keyword excessively, use it naturally and if possible, use synonyms.
Regarding the content of your site, consider the following aspects.
It’s no use having a perfect website with a great backlink profile and gigantic content if your article was the result of plagiarism.
For this reason, always write authored articles for your site.
In the WhatsApp groups I participate in, I always mention that the structure of an article is fundamental for its scannability.
Therefore, whenever possible, try to use the following elements:
Using a keyword in your article doesn’t mean Google will rank it in exactly the right context; you need to contribute to that.
Therefore, use complete sentences that can be searched by users in search engines.
As mentioned in the topic above, using complete terms and phrases that users use in searches is of great value.
Do an analysis of the subheadings of this article and look at the structure as a whole.
An article’s architecture, when done well, not only helps with scannability but also contributes to the virality of your content.
The URL is the address of your page, for example, the URL of this page is “https://seolinklord.com/seo/all-about-seo/”.
Using friendly URLs is essential for SEO strategies.
Example of unfriendly URL: “https://seolinklord.com/?p=14567”.
Example of friendly URL: “https://seolinklord.com/seo/all-about-seo/”.
Check out this example page using keywords in the title, meta description and URL:
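A minimal sketch of what such a page head might look like, assuming the hypothetical keyword “all about SEO” and the friendly URL shown above:

```html
<!-- Hypothetical example: keyword "all about SEO" used in the URL, title and meta description -->
<!-- URL: https://seolinklord.com/seo/all-about-seo/ -->
<head>
  <!-- Page title: keyword near the beginning, at most ~56 characters -->
  <title>All About SEO: The Complete Website Optimization Guide</title>
  <!-- Meta description: keyword plus a call to action, at most ~155 characters -->
  <meta name="description" content="Learn all about SEO in this complete guide: concepts, techniques and tools to put your website in the top positions of Google.">
</head>
```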
As mentioned earlier, links are extremely important for SEO, the use of internal links is no exception to the rule.
The main benefits of internal linking are:
The main attribute of a link is the anchor text, i.e., the clickable term. Ideally, use keywords as anchor text to associate the term with the page.
A good practice is to always include the title attribute in links and to use target="_blank" so that, when clicked, the link opens in a new tab.
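A minimal sketch of an internal link following these recommendations (the URL, title and anchor text are hypothetical):

```html
<!-- Keyword-rich anchor text, a descriptive title attribute, and target="_blank" to open in a new tab -->
<!-- rel="noopener" is a common security addition when using target="_blank" -->
<a href="https://seolinklord.com/seo/link-building/"
   title="Complete link building guide"
   target="_blank" rel="noopener">link building strategies</a>
```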
It is well known that most pages on a website have at least one image.
Considering that images can impact a website’s loading speed, always use optimized images; for this, I recommend the TinyPNG tool.
In addition, other elements can be assigned to images to enhance user experience and accessibility, they are:
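Typical candidates are descriptive alt text, a descriptive file name, explicit dimensions and lazy loading; a minimal sketch assuming those attributes (file name and alt text are hypothetical):

```html
<!-- Descriptive, keyword-relevant file name; alt text for accessibility and image SEO -->
<img src="/images/website-optimization-guide.png"
     alt="Diagram summarizing the main on-page SEO techniques"
     width="800" height="450"
     loading="lazy">
```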
Different from on-page, off-page SEO is a set of activities applied outside the site, but which have a direct impact on organic positioning.
Learn about some practices that make up off-page SEO.
Without a doubt, the main organic ranking factors today are: quality authorial content and contextual backlinks.
The more domains pointing to your site, the better your domain authority, organic placement, traffic acquisition and lead generation tend to be.
But thanks to Penguin, it’s not just about the number of links, it’s about the quality and context of the links.
Learn below the best strategies to get backlinks and brand mentions.
Without a doubt, writing rich articles that break the whole subject down is a differentiator.
An article that contains all the information on a certain subject is more likely to go viral and receive organic backlinks.
Evergreen content is timeless content, meaning this article or material will always be relevant.
Consider positioning evergreen content organically and you’ll find that this page will get a lot of backlinks over time.
Guest posts or guest publications consist of producing an article for another website in exchange for a link to a page on your website.
It is worth mentioning that a guest post must follow all quality standards in structure and information to generate traffic to the partner site.
As I mentioned earlier, many websites and news portals may mention your brand without including a link to your website.
Therefore, it is very important to monitor your mentions and request that a link be included in these mentions.
It is very common to find broken links on websites, especially on news portals, probably because the cited website no longer exists.
Identifying broken links can be very helpful, as you can contact the author of the publication and inform him of the problem.
The highlight of this strategy is that you can suggest that the link be directed to some page on your site.
Don’t think that experts in your segment are unreachable; interviewing these personalities, in addition to generating quality content, allows your site to receive many organic backlinks.
Depending on how well this person is recognized outside their niche, the interview can be used by newspapers, magazines and portals.
It’s true that conducting a survey is not an easy task, but even so, doing a survey to get data and information about your industry can make this content go viral.
An important detail about surveys is that they can even be cited by your competitors, generating the best backlinks to your site.
Using press relations to obtain backlinks is a great strategy.
Not only does this help link building, it also increases your brand presence.
In addition, you can also send a mailing to journalists who write or have written stories related to your niche.
Use backlink monitoring tools to find out which sites point to your competitors.
After putting together a list of all the sites that link to your competition and don’t point to your site, contact all of them and propose a guest post.
Increasing your brand’s presence on the internet will not directly increase your site’s authority, but having a recognized brand makes Google more confident in it.
To increase your brand’s presence on the internet, you need to:
A differential is to monitor mentions of your brand and request the inclusion of a backlink to your website in these publications.
Contrary to what many people think, technical SEO is not classified as on-page SEO.
Technical SEO is a great differentiator in many campaigns, as many SEO professionals neglect these practices.
Read below for the main factors of technical SEO.
Structured data, or schema markup, is information made available to search engines in a standardized format.
I recommend you visit Schema.org to learn more about this subject.
The most commonly used structured data are the following:
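One widely used type is the Article schema from Schema.org; a minimal sketch in JSON-LD format, with hypothetical values for the author, date and image:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "All About SEO: The Complete Website Optimization Guide",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2021-01-15",
  "image": "https://seolinklord.com/images/seo-guide.png"
}
</script>
```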
The XML sitemap is a file that represents the map of the site, that is, it contains a list of all the pages on the website.
The sitemap is very important for a successful SEO campaign as Google can use this sitemap to crawl and index new pages.
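A minimal sketch of a sitemap.xml with a single entry (the URL and dates are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> block per page of the site -->
  <url>
    <loc>https://seolinklord.com/seo/all-about-seo/</loc>
    <lastmod>2021-01-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```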
Since mobile-first is an organic ranking factor, validate your site’s mobile version to ensure it meets this requirement.
The robots.txt is a file whose purpose is to inform search engines whether or not the site can be crawled and indexed.
In addition, within this file it is possible to point these search engines’ robots to the site’s sitemap.
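A minimal sketch of a robots.txt that allows crawling, blocks a hypothetical administrative area, and points the robots to the sitemap:

```
# Applies to all crawlers
User-agent: *
# Hypothetical area that should not be crawled
Disallow: /admin/
# Tell the robots where the sitemap is
Sitemap: https://seolinklord.com/sitemap.xml
```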
UX (User Experience) is made up of various metrics based on users’ browsing data on a website.
Providing a good browsing experience for users brings the following benefits:
Also, user experience is a ranking factor, as Conversion cites.
Check out below the main factors that should be observed to improve the user experience of your site.
Site speed is not just a metric, it is directly related to user experience.
Ideally, your site should fully load in less than 3 seconds over the 3G network.
For this, consider:
The HTTPS or encryption of a domain is a ranking factor and is categorized within the user experience.
Even so, HTTPS should not be seen only in this way, but as a security measure for your website’s users.
When setting up an SSL certificate on your website, note the following points:
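One point that typically belongs on such a checklist is redirecting all HTTP traffic to its HTTPS equivalent. A minimal sketch, assuming an Apache server with mod_rewrite enabled (other servers use equivalent rules):

```apache
# Permanently (301) redirect every HTTP request to the HTTPS version of the same URL
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```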
To provide a unique experience on your website:
Surely you have come across a 404 page; this occurs when a page is deleted and not redirected to another.
Another point is that a 404 page causes your site to lose link juice, so I recommend scanning your website with the Dead Link Checker tool.
If any 404 pages are identified on your site, I recommend that you redirect to another page dealing with a similar topic or to the main page.
Another good practice is to create a friendly, lighthearted 404 page, giving the visitor the option to go back to the home page or to search your site for the subject they want.
There are two tags that cannot be missing from your site, learn more about them below.
The canonical tag tells search engines what the main page is, for example:
The canonical URL of this page is: https://seolinklord.com/seo/all-about-seo/. Therefore, when search engine algorithms access this page in the AMP version, they will know which version to index.
The Alternate Tag aims to guide search engines that other versions of the page exist, for example:
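A minimal sketch of how these tags might appear, assuming an AMP version of this page exists at a hypothetical /amp/ URL and a hypothetical Portuguese-language version also exists:

```html
<!-- On the AMP (or any alternate) version: point back to the main page -->
<link rel="canonical" href="https://seolinklord.com/seo/all-about-seo/">

<!-- On the main page: declare that an AMP version exists -->
<link rel="amphtml" href="https://seolinklord.com/seo/all-about-seo/amp/">

<!-- Alternate tag: declare a version of the page in another language (hypothetical URL) -->
<link rel="alternate" hreflang="pt-br" href="https://seolinklord.com/pt/seo/tudo-sobre-seo/">
```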
I mentioned redirects a few times above; learn about some types of redirects below:
AMP (Accelerated Mobile Pages) aims to create a mobile version of your website.
The advantage of AMP technology is that a website loads much faster than a conventional website.
The AMP architecture consists of:
Having a blog without the AMP version of the pages is like starting a car without filling the tank.
What I saw on my site after activating the AMP version:
If your site refers to a physical store or local business, local SEO may be the best strategy for you.
Local SEO is about positioning your site for keywords related to your business in your neighborhood and/or city.
With this, the effort involved is less, because you don’t have to compete with large companies in organic positioning.
There are two tools that will help you with local SEO, they are:
Local keywords are basically the terms that users use to search for your services and products followed by the name of the city, for example:
Main keyword: website optimization.
Local keyword in Belo Horizonte: website optimization in BH.
It is possible to apply some SEO techniques to YouTube, making your videos appear in the first positions on the platform.
Today, some of the main ranking factors for YouTube are:
Below are answers to the most frequently asked questions about SEO for beginners.
SEO or website optimization is a set of practices whose objective is to get the pages of a website to be positioned in the first positions of Google and other search engines.
Googlebot is a Google robot that constantly scans the internet for new pages, articles and content. In addition, Googlebot also fetches and stores page updates.
Bingbot is a robot developed by Microsoft to crawl and catalog web pages and content.
I hope this complete website optimization guide has contributed to your knowledge.
Remember that SEO is a medium and long term strategy, that is, results may take a few months.
It all depends on your keyword research and other strategies like content marketing.