What is Technical SEO? Your Practical Guide For Best Practices!


We know what you are thinking: why bother with technical SEO when you have already implemented SEO in general?

To get to the answer, first you should understand:

What is Technical SEO?

Technical SEO is a process to ensure that your website matches the modern technical standards of search engines.

Search engines update themselves continuously. Google, for example, launches at least three major algorithm updates every year, and well over a hundred minor ones.

With each major algorithm update comes a need to update your SEO. That’s where technical SEO comes into the picture.

It is a process of website and server optimizations that improves organic rankings in line with the latest search engine updates.

It helps search engine crawlers crawl and index your web pages quickly and efficiently.

Note: Technical SEO is a part of overall SEO.

Importance of Technical SEO

Like other SEO strategies, technical SEO is crucial to building a strong overall SEO foundation.

Its core aspects, a sound website structure and proper crawling of your pages, are major factors in search engine rankings.

These aspects signal to search engines that your website is of high value. As a result, you rank higher in the SERPs (Search Engine Results Pages).

Google only wishes to share the best possible web pages in results for the users’ queries.

Implementing technical SEO ensures that your website meets these criteria.

Strong technical SEO goes a long way in delighting internet users. In return, crawlers/bots rank your website above others that offer inferior experiences.

Note: No matter how good your content and design is, without technical SEO, your website will not rank high on search engines.

How Can You Improve Your Technical SEO?

There are many points to focus on in this regard:

Keep Your Website Structure Organized

Website structure is the way your pages, and the links between them, are organized.

Think of it as a map of your website's pages, with lines representing the links between them. The shape of this structure matters for the relevance of the site, so it should be clear and logical.

Such a structure makes it easier for your visitors to find what they are looking for, and it makes the site easier to navigate.

As good structure equals good User Experience (UX), the structure of your website has an influence on your ranking in search engines.

Search engine robots find it easy to navigate and crawl such websites and thus give them preference in organic listings.

Thus, it increases conversions and Click Through Rate (CTR).

The optimal structure of a website should:

  • Have a clear sense of hierarchy.
  • Follow a tree structure of categories and sub-categories.
  • Have logical internal linking.
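As an illustration, a clear tree structure for a hypothetical store could look like this (the page names are purely illustrative):

```
Home
├── Products
│   ├── Water Bottles
│   │   └── Blender Bottle
│   └── Accessories
├── Blog
│   └── Individual posts
└── Contact
```

Each level links down to its children and up to its parent, so both visitors and crawlers can reach any page in a few hops.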

Keep A Simple Navigation Menu

When you visit a place for the first time, isn’t it better to have a good map so you can get around without getting lost? That is exactly the purpose of a website’s navigation menu.

It is the guide for your visitors once they are on your site. Like a map, it should be clear, precise and – if possible – visually appealing too.

The content, format and design of this section should help visitors to find their way around quickly.

But what does an ideal menu look like? Here’s the answer:

Easy To Use:

The menu is the centerpiece of your site navigation, and visitors should be able to find what they’re looking for easily and quickly.

A visitor should not have to click more than 3 times to arrive on a specific page regardless of their starting point on your site.

Don’t Have Too Many Tabs in Main Menu:

The ideal number of links in the navigation menu is around seven. According to research on short-term memory, we can only hold about seven items in mind at a time.

So, if you offer more than that, visitors may overlook important tabs.

Avoid Overly Long Drop-down Menus:

An overly complex drop-down menu risks frustrating the visitor. Faced with an extended list, they may not know where to go and leave your site quickly.

Help Users Know Where They Are:

Your menu should tell visitors where they are at all times. This is very important when a visitor takes their first steps on your site. This offers a lot in terms of navigation comfort.

Have A Search Bar:

Around 30% of internet users use the search bar on a website, and these users convert 5 to 6 times more than the others.

It is therefore essential to highlight a search bar on your navigation menu. Ensure that visitors can use it at any time even when scrolling down a page.

Maintain A Consistency in Navigation Web Design

Let’s talk about navigation first.

The reason why a website should be consistent in navigation is psychological. Humans are comfortable in places that are familiar and predictable.

If the navigation works the same throughout the website, the user will feel more relaxed, more likely to stay on the website and therefore more likely to convert.

So, consistency in navigation is a crucial element for functionality and good user experience. Now, we will discuss consistency in website design.

Consistency in web design is maintaining the same style in all of your web pages. By “same style” we mean to be consistent in the colors, fonts, style of icons, style of images that you use throughout your website.

Imagine for a moment that on your homepage you use some fonts and in the rest of your pages you use completely different ones. Or that you use different colors of text on each page.

Of course, it won’t look nice.

Moreover, without consistency in design, it is difficult for your visitors to remember who you are.

This is because a brand’s logo, colors, fonts, and design are what make the audience remember it and its services.

To create consistency on your website, you first have to know which visual elements make up a brand:

  1. Logo
  2. Colors
  3. Page Layout
  4. Fonts
  5. Language
  6. Images
  7. Icons
  8. Interactions

For Logo:

Make it more striking than the rest of your fonts, but without going overboard. Remember that it has to be easy to identify and remember. Your logo should appear on every page.

For Colors:

For the colors of your website, try to use a maximum of two. Choose a main color and another to complement or contrast it.

Create a harmony in using different tones of colors for the different sections of the site. If you have an ecommerce website, you can associate colors with products.

Page Layouts:

Always try to have consistent elements on each page:

  • Same position of navigation menu on each page.
  • Similar fonts and colours throughout the webpages.
  • Search box should be in the same spot on each page.
  • Clear and identical hierarchy of the page elements.

For The Font Style on Your Website:

Use just three.

  • One for your logo that is usually different from the other fonts on your website.
  • Another for the headlines.
  • Last one for the paragraphs.

Remember headlines should vary in thickness or shape and paragraphs must give priority to readability. The font style for each element must be the same on every page.


For Language:

A website cannot say one thing and contradict it at the same time. That is to say, the tone cannot be diametrically opposite across pages or sections.

Tip: Use the AIDA (Attention, Interest, Desire, Action) method as a writing pattern. In addition to being effective and facilitating content writing, it helps the user a lot.

For Images:

Use the same size of images on a web page, or across all web pages, as you prefer. Also, use the same background color for images.

Let’s say, you have an online store and you use white background for the product images. In that case, ensure that all of them are white, not some white, others green, or blue.

Also, there should be a consistent image placement structure. If an image sits on the right side of the banner on one page, it should be on the right side on all other pages.

Additional Tip: Define a style first by asking yourself what kind of images your client would like to see. You can add a few funny images to attract attention.

For the Icons:

These are generally used on the home page, where you explain the benefits, the services you offer, etc. There are several formats for icons: flat icons, line icons, and icons that mix graphics with fonts. Whichever format you choose, use the same style of icon for every item in a list.


For Interactions:

Consistency in the response to the user’s actions is directly related to consistency in navigation.

For example, if the menu tabs unfold to the right on the home page, the user assumes the rest of the pages will do the same. It is key to improving usability, a factor that search engines prioritize when ranking a website.

Enable HTTPs

Since 2014, there has been a sharp increase in HTTPS protocol sites in Google search results.

This is in response to Google’s official announcement about consideration of the secure HTTPS protocol as a ranking factor the same year.

What is the HTTPS protocol?

HTTPS is short for HyperText Transfer Protocol Secure. It is a secure protocol that has existed since 1994 to overcome the main flaw of classic HTTP: data travels in plain text and can be intercepted.

This is where the S for “Secure” comes in. HTTPS combines HTTP with SSL (Secure Sockets Layer) or TLS (Transport Layer Security) encryption, which encrypts the transmitted information.

The HTTPS protocol is a guarantee against personal data piracy and therefore makes your website safer for the Internet users who visit it.

Why enable HTTPS?

Here are the main reasons:

  1. The Google Chrome browser tells internet users whether they are on a page protected by the HTTPS protocol.
  2. You reassure internet users by securing your site, which strengthens the trust they place in you. Messages like “This site is not secure” are not good for your website.
  3. Your website benefits from the advantage Google grants to HTTPS: HTTPS URLs get a “bonus” in terms of positioning. Simply put, HTTPS is a ranking factor.
  4. After migrating to HTTPS, you will be able to use HTTP/2, which boosts your website’s page speed.

How To Enable HTTPS?

Be careful, if you want to switch your website from an unsecured HTTP version to an HTTPS secure version. It’s not a piece of cake. Any mistake can hamper your SEO ranking.

For the redirection to go as smoothly as possible at the SEO level, it is essential to seek an SEO expert’s assistance.

Below is a list of the steps for setting up the HTTPS protocol:

  • Choose a quality SSL/TLS security certificate.
  • Redirect all the URLs on your site (via .htaccess files or other means).
  • Update all of your internal links (internal netlinking).
  • Update the URLs of your images and all other resources.
  • Update Google Search Console.
  • Test your new secure HTTPS setup.
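On an Apache server, the redirect step can be sketched in .htaccess as follows. This is a minimal example; your server setup may differ, so test it before deploying:

```apache
RewriteEngine On
# Send every HTTP request to its HTTPS counterpart in a single 301 hop
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

Nginx and other servers have their own equivalents; the important part is a single permanent (301) redirect per URL.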

Things To Note:

  • Use 301 redirects: no 302s, no JavaScript redirects, etc.
  • Use a single redirection per URL, not a chain of several.
  • List your social media accounts for which you will need to update your site URL to indicate the HTTPS version.
  • Do an audit of your backlinks to identify the most strategic. Then ask the people concerned if they can update their link to point to your HTTPS version. Use the 301 redirects where they don’t agree.
  • List your ad campaigns to identify where you need to update your site URL to reflect the HTTPS version.
  • Update the URLs in your email signatures.

Changes in Google Search Console for enabling HTTPS

In Google Search Console, you will need to do this for each subdomain of your site after migrating to HTTPS:

  • Add a new website property for the HTTPS version
  • Declare sitemaps with URLs in HTTPS.
  • Check that each URL in the inventory is redirected with a single 301 to the correct HTTPS URL. (For this, you need a tool that tests a list of URLs and gives you all the details of redirects and HTTP codes.)

Example to check website HTTPS status:
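A minimal sketch of such a check in Python. The helper below inspects a recorded redirect chain, a list of (status, location) pairs, so it can be illustrated without real network calls; the URLs and chain data are hypothetical:

```python
# Verify that an HTTP URL answers with a single 301 hop
# to its HTTPS counterpart (hypothetical helper for illustration).

def is_clean_https_redirect(url, chain):
    """True if `chain` is exactly one 301 hop from the HTTP
    version of `url` to its HTTPS counterpart."""
    if len(chain) != 1:
        return False  # several hops: redirect chains waste crawl budget
    status, location = chain[0]
    return status == 301 and location == url.replace("http://", "https://", 1)

print(is_clean_https_redirect(
    "http://example.com/page", [(301, "https://example.com/page")]))  # True
print(is_clean_https_redirect(
    "http://example.com/page", [(302, "https://example.com/page")]))  # False
```

In practice you would feed the function chains recorded by a crawler or an HTTP client that follows redirects.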

Add Robots.txt

As the name suggests, this file is for search engine robots. Crawlers/bots read this file first on your website. Through it, you can authorize or prohibit search engine robots/crawlers from exploring a specific section or page of your website.
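A minimal robots.txt might look like this (the blocked paths and sitemap URL are purely illustrative):

```
User-agent: *
Disallow: /admin/
Disallow: /cart/

Sitemap: https://www.example.com/sitemap.xml
```

The file lives at the root of the domain, and each `Disallow` line keeps the matching section out of the crawl.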

Search engine robots then treat your web pages as per your requests. To control indexing at the page level, you must also know about the meta robots tag.

This tag is taken into account by Google (and other engines). It conveys messages to the robot that crawls the page.

Here are the different values for the robots meta tag, and their meanings:

  • noindex: Tells robots not to index the page.
  • nofollow: Tells robots not to follow the links on the page. Google will not crawl the pages linked from a page carrying this value.
  • index: Tells the robot it can index the page. Since this is the default, there is no need to specify it.
  • follow: Tells the robot it can follow the links on the page. Also the default, so there is no need to specify it.
  • all: Equivalent to index, follow. Since this is the default, there is no need to specify it.
  • none: Equivalent to noindex, nofollow.
  • nosnippet: Tells the robot not to display a snippet on the results page.
  • noarchive: Tells the robot not to allow access to the cached version, so the “Cached” link on the results page will not be displayed. This can be useful for those who move content from a publicly accessible version to a paid archived version.
  • unavailable_after: [date]: Tells the robot that the page should not appear in results after the indicated date.
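As a sketch, these values go in a meta tag in the page’s head; for example, to keep a page out of the index while still letting robots follow its links:

```html
<head>
  <!-- noindex: do not index this page; follow: still crawl its links -->
  <meta name="robots" content="noindex, follow">
</head>
```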

SEO Tips on robots.txt file:

  • Google has downloaded the robots.txt file on average once every 24 hours since 2000, so any changes you make to robots.txt should be reflected within about 24 hours.
  • If a URL has already been indexed by Google, blocking it in robots.txt will not change anything.
  • To deindex a URL, you must allow it to be crawled and use a robots noindex meta tag. You can also request removal of the URL in Google Search Console.
  • Do not block the crawl of URLs that are redirected, otherwise the engines will not be able to see the redirection.
  • The maximum size of a robots.txt file is 500KB (be careful: anything beyond that is ignored by Google).
  • Google should only receive a 200, 403 or 404 code from your robots.txt file.
    Code 200: The file exists and is accessible.
    Code 403 or 404: The file is not accessible, but the HTTP code is consistent.
  • The robots.txt file may itself be indexed by Google. To deindex it, either use the X-Robots-Tag header, or prohibit crawling of the file and then have it removed from the index in Google Search Console.

SEO Friendly Url Structure

What is a URL?

URL stands for Uniform Resource Locator. It designates the location of an internet resource according to a precise encoding; in other words, it is the address of a part of a website.

The URL makes it possible to find that resource among the billions of existing sites on the web. It is universal data, with a standard structure that is identical throughout the world.

It consists of four parts:

  • Scheme/protocol
  • Sub domain
  • Domain name
  • Extension of the domain name

  • https://: this is the most widely used protocol on the web. It allows web pages to be exchanged in a secure manner.
  • www: this is the sub domain.
  • The domain name: this is the middle part of the URL, often corresponding to the name of the brand.
  • .com: this is the extension of the domain name. .com is the most common in general. You also see extensions like “.ca” and “.us”, which represent countries (.ca for Canada, .us for the United States), while .org is generally used by organizations and non-profits.
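Putting the parts together, a hypothetical URL can be broken down with Python’s standard library (the URL itself is purely illustrative):

```python
from urllib.parse import urlparse

# Hypothetical URL used purely for illustration
url = "https://www.example.com/blog/technical-seo-guide"
parts = urlparse(url)

print(parts.scheme)  # https  -> the protocol
print(parts.netloc)  # www.example.com  -> sub domain + domain + extension
print(parts.path)    # /blog/technical-seo-guide  -> appended as required
```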

Further elements are added as required. For a blog post, for example, the post’s slug is appended, and if the post sits under a blog category, the category appears in the path as well.

Classification of URL Structures

There are two types of URL structure:

Static URL:

A static URL is a URL that does not change and does not contain any parameters.
Parameters are additional information at the end of the URL, placed after the ? symbol. They also help to track information about clicks on the URL.

These URLs remain the same throughout the website’s life unless the HTML changes. Hence, they are used for web pages that are not updated often. Updating this type of page is heavy, because you have to modify the source code every time there is a change.


Dynamic URL:

Dynamic URLs are used when the content of a page lives in a database and is only served on a circumstantial basis. Dynamic URLs contain parameters such as a session identifier or product identifier, and include signs such as ?, = and &.

Dynamic URLs have two drawbacks:

  1. Internet users cannot understand them as easily as static URLs.
  2. Several dynamic URLs can point to the same content. This can create inconsistency which search engines don’t like.


Can Google correctly read dynamic URLs?

Google can read dynamic URLs just as well as static URLs, as long as they are not too long. A dynamic URL must not contain more than three parameters.

In the case of content managed by databases, dynamic URLs are better than static URLs. This is because Google knows how to find its path through the relevant parameters and does not take into account the irrelevant ones.

You will find more information about it on the Google Webmaster-Blog.

How to optimize your URLs for SEO?

Here are pro tips to optimize URL for SEO:

Present the Theme of Your Page

URLs are a key element for the search engine and the Internet user. They are analyzed by the search engine as well as by readers.

It is therefore important to personalize them. They must indicate the theme of each page.

This contributes to the understanding of your content by both – search engines as well as the Internet users. In other words, your URL must make sense.

It should also make readers want to click on the link to read your content.

For example, the right URL for a blog post like this one would be a short, descriptive slug such as /technical-seo-guide, and not an opaque string of numbers or parameters.

Use the Keywords Wisely in Particular URLs

Logically, it is wise to include the keywords targeted by your content in the URL of your pages.

This can have a positive impact on SEO but not every time. Here’s how you should plan it:

For Service /Product Page URL:

Yes, definitely go for it. This way, search engines and users will have better visibility of what you offer on each of your pages, and it is easier to keep track of your product and service pages.

However, before using the main keyword in the URL, check that it is present in your Title tag, your H1, Sub Headings and your content.

For Blog Post URLs:

As a general rule, we recommend including in the URL the main keywords present in your blog title.

For example, if your blog post is called “Google’s Algorithm Updates Of The Last 10 Years”, an appropriate URL would end in something like /googles-algorithm-updates-last-10-years.

Note: As with content, do not overuse keywords in your URL. In this case, repetition becomes counterproductive.

Two Major Benefits of Using Keywords in URLs:

The keywords present in your URL will also benefit your communication strategy. For example, if you distribute some of your URLs on social media or in your emails, clear and relevant URLs will attract as many clicks as possible.

Keywords also play a role in your backlinking strategy. If you get backlinks from other sites and the link anchor is your URL, it will contain your keyword, which sends a good signal to search engine crawlers.

Take Care of URL Length

Long URLs are much harder to communicate and use. They are also not practical to embed in a forum, in a blog or on social media.

So we strongly advise you to go as short as possible. Typically, a full-size URL contains 50 to 60 characters; beyond 100, you should shorten it.

So how do you go about shortening URLs?

Start by getting rid of all the words that add no value, also known as “stop words”. In most cases these are coordinating conjunctions or little words like “the” and “of”. You can easily remove them without making the URL hard to understand.

For example, rather than writing “the-use-of-breadcrumbs-in-seo”, prefer “breadcrumbs-in-seo”. This is much more practical and aesthetic, and doesn’t compromise the understanding of the subject.

It will greatly improve the user experience and the convenience of your URLs. Short URLs are much easier to share, and easier to memorize.
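The shortening steps above can be sketched as a small Python helper. This is a simplistic illustration, not a production slug generator; the stop-word list is deliberately minimal and hypothetical:

```python
import re

# Hypothetical, deliberately small stop-word list for illustration
STOP_WORDS = {"the", "of", "a", "an", "in", "on", "and", "or", "to", "for"}

def slugify(title):
    """Build a short, hyphenated slug: lowercase the title,
    keep only alphanumeric words, and drop common stop words."""
    title = title.lower().replace("'", "").replace("\u2019", "")
    words = re.findall(r"[a-z0-9]+", title)
    return "-".join(w for w in words if w not in STOP_WORDS)

print(slugify("The Use of Breadcrumbs in SEO"))  # use-breadcrumbs-seo
```

A real implementation would also handle accents, length limits, and collisions with existing slugs.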

Signs & Special Characters in URL

  • The comma (,) and the semicolon (;) are not allowed, as they can cause confusion.
  • The slash (/), on the other hand, does not pose a real problem. If you use it, still pay attention to the directory structure it implies, especially in your HTML coding and for files (scripts or images).
  • The vertical bar, or “pipe” (|), is allowed but should be avoided. It is not easy to type on a keyboard, nor easy to dictate orally, so it is likely to cause typing errors when entering your URL.
  • The percent (%), asterisk (*) and at sign (@) can be used, but none of them adds value.
  • The underscore (_) is not treated as a word separator, so avoid it and use hyphens instead.
  • The equals sign (=) and the ampersand (&) are reserved for dynamic URLs.
  • Finally, when the # sign appears in a URL, it usually corresponds to an anchor, i.e. a link to only part of the page. For search engines, anything after the # sign is ignored.

Avoid Parameters/dynamic URLs

Many people think dynamic websites are the best option available today. However, search engine crawlers can struggle to read information through long or parameter-heavy dynamic URLs.

When using dynamic URLs, the creation of duplicate content is very common. This is because the same content can be accessed through different URLs.

On the other hand, dynamic URLs tend to have a lower Click-Through Rate (CTR).

It is also important to analyze the semantic side of a dynamic URL, since it does not usually read like a static URL.

When creating a dynamic URL, ensure that it is “friendly” in the eyes of search engines. The structure should be simple and clear, and there should be no unnecessary parameters.

Menus, navigation or footer should be written with static URLs so that search engines can interpret information and links correctly. In this way, there will be no loss of link juice.

Example of dynamic url:
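A hypothetical dynamic URL, shown next to a static counterpart for the same content (both addresses are illustrative):

```
Dynamic: https://www.example.com/products?cat=12&id=712&sessionid=93fb2
Static:  https://www.example.com/products/blender-bottle
```

The static form is readable, memorable, and cannot multiply into several addresses for the same page.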

Improve Breadcrumbs Navigation

The breadcrumb trail, as its name suggests, is a thread that shows users the path from the page they landed on to where they are now. They can return to previous pages by clicking on the different levels in the trail.

Present at the top of all web pages, the breadcrumb trail offers navigation support to the user. Users often click the logo to return to the home page; the breadcrumb trail makes this possible too, by clicking on “Home”.

Why Use Breadcrumbs Navigation?

  • 90% of internet users do not revisit a website that has navigation or usability issues.
  • Bad user experience is behind the failure of three out of four online companies.
  • Still, only half of all websites take user experience testing seriously.

What are the Types of Breadcrumbs Navigation?

There are three types of breadcrumbs navigation:

  1. History: It shows you all the pages that you visited to reach the current page. You can go back to any of the previous pages by just a click.
  2. Hierarchy Level: In this you get the option to visit the parent pages of the page you are currently on. Let’s say, you are on a product page – “Blender Bottle” on an ecommerce site. You can visit the “Personal Care section”, “Sports & Outdoor section” , “Water Bottles” and other similar parent pages from the breadcrumb navigation.
  3. Attributes: Mostly product websites use this level. This level displays attributes of the products on the navigation menu. For example, imagine you are on an automotive website looking at a particular car model page. Attribute breadcrumb navigation can take you to its technical specification page, its interior 360 images page and other pages devoted to its different attributes.

How To Setup A Breadcrumb Navigation?

You can implement breadcrumb navigation with a CMS in just a few clicks. However, if your CMS does not offer this function, it probably will have corresponding plugins available. It is also possible to set up a breadcrumb trail on dynamic websites using PHP or JavaScript.

When setting up the breadcrumb trail, it is better to respect these few things:

  • Place the breadcrumb trail at the top of the page, preferably on the left above the current page’s content. This way it does not interfere with the content and remains easily accessible.
  • Use the “>” symbol between each item. This sign has become the standard; choosing another sign demands extra interpretation effort from the internet user.
  • The breadcrumb trail must remain readable without interfering with the content of the page.
  • Indicate the title of the current page in the breadcrumb trail.
  • For the user to have a good reference point, the page title must be present both in the breadcrumbs and on the page. This redundancy helps the internet user find their way around.
  • The breadcrumbs must be integrated, visually and in the source code, in the same place on all pages of the site, except the home page, where they are not compulsory.
  • Use the same names in the breadcrumb trail as in the main menu.
  • A breadcrumb trail should be logical and should not confuse the user.

Sites that offer many browsing options run the risk of confusing internet users, especially when there are duplicates. Breadcrumbs are low-profile and fit well into most designs, but if they offer nothing more than the classic navigation bar, then no one benefits.

A good breadcrumb trail is beneficial, subtle and easy to find at the same time: logical, clear and well integrated into the design of the page.

Major Advantages Of Integrating Breadcrumbs On Your Website:

  • Improve internet users’ understanding of the site’s structure. This navigation element is a must for SXO (Search eXperience Optimization).
  • Help search engines understand how each page connects to the others.
  • Increase session duration and optimize the conversion rate: visitors stay on your site longer, which promotes conversion.
  • Strengthen internal linking by creating links with optimized anchors on each landing page.
  • Improve the click-through rate from the SERP thanks to several links in the same result.
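Breadcrumbs can also be declared as structured data, so search engines can display them in the SERP. A minimal sketch using schema.org’s BreadcrumbList (the names and URLs are illustrative):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Blog",
      "item": "https://www.example.com/blog/" },
    { "@type": "ListItem", "position": 3, "name": "Technical SEO" }
  ]
}
</script>
```

The last item is the current page, so it carries no `item` URL.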

Keep an Eye on Crawling, Indexing, and Rendering

Since nothing is perfect, sometimes there are errors in indexing and the most common are:

Slow loading of the website:

Web speed is a key factor in positioning and usability, and it can also affect indexing. Google does not like its robots to waste time: if they arrive at your page and have to sit and wait for it to load, they simply “skip it” and do not crawl it. As a consequence, that page will not be indexed and therefore will not show up in the search results.

Do Not Repeat Content

We already know that Google likes originality. If it detects pages with duplicate content, it may decide not to play spot-the-difference with them and simply not index any of them.

4XX errors

If you are going to delete a URL, do not forget to tell Google through a redirect; otherwise you will accumulate 4XX errors that harm your positioning.

To detect these and other errors, go to the Search Console panel > Coverage, and resolve all the crawling and indexing issues you find there.

Note: Apart from Google Search Console, we also use the Screaming Frog and Ahrefs tools.

How to find the crawl errors your website has

The first thing we need to know is what type of errors our website currently has in the eyes of Google (and how many). To do this, access your Search Console account and click the side menu section ‘Crawl’ > ‘Crawl Errors’.

Once there, Google will indicate any of these failures:

Site errors

They are crawl errors that affect the entire website rather than specific addresses. If you have not had any problems in the last 90 days, you will see a green ‘check’ next to the corresponding section.

If instead you had an error, it could be any of these:

DNS errors

This problem means Google cannot access your website due to a DNS connection error: either a timeout or a misconfiguration.

How to solve it?
Contact the company that manages your DNS; the IP configured for your website may be incorrect, or there may be a technical problem that needs to be solved.

Server connectivity errors

They differ from DNS errors in that, in this case, Google CAN reach the URL of your web page, but it returns a loading error or takes too long to display the content correctly. It is usually common in pages with excess user traffic.

How to solve it?
If you have load problems due to continuous user traffic, contact your hosting provider and upgrade to a new server with greater capacity that can withstand a greater load. Thus, it will not crash and return this type of error.

Information errors in robots.txt

With this error, Google tells us that it has not been able to access our robots.txt file. This document is important since in it we establish which pages the robots of the different search engines can or cannot crawl.

How to solve it?
First, make sure you really have a robots.txt file and that it is located at the correct URL (at the root of your domain, i.e. /robots.txt). If so, check that it is correctly configured and that you are not blocking Google’s robots on necessary pages, or even the entire domain.

URL errors

They are crawl errors that do not affect the website as a whole, but specific URLs. There are several types:

Soft 404 errors

These occur when a URL no longer exists or cannot be found by robots, but at the same time the HTTP header of that address indicates that the page is displaying correctly (a 200 code).

How do I solve it?
The best solution is to create a permanent 301 redirect for each erroring URL to another, similar page. This way the user sees nothing strange, Google considers the error solved, and the new URL inherits whatever authority the old address had.

404 errors

This is when a URL does not exist and the server returns a 404 (file not found) error. Similar to the previous point, but in this case there is no doubt in Google’s eyes that the URL no longer exists: it does not load, it returns an error, and the HTTP headers do not indicate otherwise.

How do I solve it?
The same way you would solve a soft 404: redirect to related pages through permanent 301s.

Access denied

Unlike the previous errors, the robots could find these addresses, but they cannot access them.

How do I solve it?
These errors can be due to three factors:

  • You are blocking certain URLs through robots.txt.
  • Your hosting provider is preventing Google from accessing certain sections of your page.
  • Some part of the site sits behind a username and password, which prevents robots from accessing the content.

Review these points, find which of them is generating the errors, then resolve it.

Not Followed

Do not confuse this with the ‘nofollow’ attribute we can put on a link. These errors indicate that Google was unable to crawl a specific URL, usually because of Flash, JavaScript or something similar that makes it difficult for robots to do their work.

How do I solve it?
Avoid placing links inside this type of code, to make crawling easier for the robots.

Server and / or DNS errors

They are errors that occur when accessing specific addresses on the page, caused by the same problems explained in the server errors section: bad DNS configuration, technical problems, or pages collapsing under web traffic.

How do I solve it?
As said before, the solution will generally consist of contacting your hosting provider and explaining the problem to them. It is very likely that the solution depends on them.

How to Fix Coverage Errors in Google Search Console?

Crawling errors help you understand the problems Google's spiders encounter when crawling your website. Google Search Console informs you of your website's errors and the specific URLs involved, giving you the opportunity to solve them.

Here’s what you can do with it:

  • Check that your hosting server is working well and has no crashes (500 errors).
  • In the admin panel you will find activity statistics, where you can see if there have been drops in activity.
  • Check for 404 errors in bulk and fix them with Redirect 301 or RedirectMatch 301 rules. You can also remove URLs that return 404 errors from your website's sitemaps. Google's own guide shows how to fix 404 errors.
  • Check whether you have a badly configured robots.txt, or a conflict between what the robots.txt says and what your sitemap says.
  • Check that the errors have been fixed.
  • The last step is to validate the errors in Google Search Console. This webmaster tool only shows up to 1,000 errors that it considers the most important.
  • Keeping the tool “clean” (free of errors) lets you monitor new errors as they arrive and be sure that the site is crawlable and presents no serious indexing errors.
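
The status checks above can be sketched in a small script. This is a rough illustration, not a Search Console feature; the classify helper and the sample codes are our own:

```python
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def classify(status: int) -> str:
    """Map an HTTP status code to a coverage-report style bucket."""
    if 200 <= status < 300:
        return "ok"
    if 300 <= status < 400:
        return "redirect"
    if status == 404:
        return "not found (404)"
    if 400 <= status < 500:
        return "access denied / client error"
    return "server error"

def check(url: str) -> str:
    """Fetch a URL and classify its response (needs network access)."""
    try:
        with urlopen(url) as resp:
            return classify(resp.status)
    except HTTPError as e:   # 4xx/5xx responses raise HTTPError
        return classify(e.code)
    except URLError:
        return "server or DNS error"

# Offline demonstration with sample status codes:
for code in (200, 301, 404, 403, 500):
    print(code, "->", classify(code))
```

Feeding the URLs from a Search Console error export through check() gives a quick picture of which problems are still live.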

Focus on Internal Linking To Avoid Indexing And Crawling Errors

  • Ensure that external links open in a new tab.
  • There should be no broken links on your website.
  • Check and make sure there are no orphan pages on the website.
  • Match internal linking to content length. Cramming many internal links into a short piece of content is not a good approach.

Use an XML or Html Sitemap

What is a Sitemap?

An XML sitemap, or site map, is a file that helps search engines follow the URLs of a specific website. In this way, Google understands our website better.

How does an XML Sitemap work?

The operation is simple. Google crawls your website, checks its content, and indexes it. With every new change you make to your website, the XML sitemap plugin generates a new report for search engines, speeding up the process.
Indexing is the process by which Google crawls the content of your website and stores it in its database, so that it can later appear in search results. Besides containing the structure of the website, the XML sitemap provides other data, such as how often the site is updated or the duration of any videos that are available.
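
A minimal XML sitemap looks like the fragment below; the domain, dates, and values are made-up examples:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2021-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/black-leather-shoes/</loc>
    <lastmod>2021-05-20</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

The lastmod, changefreq, and priority fields correspond to the frequency and priority settings discussed in the plugin steps below.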

Plugins to use in your WordPress

If you want the Google robot (Googlebot) to visit your website every day, because you add new product pages daily or insert new articles and content into your WordPress blog, you must use a WordPress plugin that keeps the XML sitemap (website map) updated.

Using Google XML Sitemaps Saves Time

When you update something on a website, it is essential to communicate it to Google. The moment you do, its robot will start evaluating the changes, giving you a better chance of improved positioning.

In addition, by informing Google of the new updates, you save it work, since it will not waste time analyzing your URLs one by one to find out which ones have been updated or are important.

Google XML Sitemaps makes this whole process automatic: it notifies Google of any updates on your website.

How Can I Create an XML sitemap?

There are two available options:

  • Using Yoast SEO plugin
  • Google XML sitemap plugin

How to configure the Google XML sitemap plugin?

Google xml sitemap is a very necessary plugin, so you should know what each option is for.

#Step 1: Basic configuration for Google XML sitemaps

Notify Google about changes to your blog:

If you activate this tab, the XML sitemap file will be linked to this tool. Within it, when you click on “add property”, insert your website; a ping will then be sent to notify Google every time your XML sitemap changes. Leave this option checked.

Add the sitemap URL to the virtual robots.txt file:

This option inserts the URL of our sitemap into the robots.txt file, so that search engines that do not support pings or notifications can find your sitemap more easily.

The first thing the Google robot does is read your robots.txt, and from there it follows the search pattern. If you cannot find the file in your WordPress installation, it most likely does not exist and you will have to create it. You can create a .txt file with the instructions and upload it to the root of your hosting, or use the option that the Yoast SEO plugin provides for this.
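
The result, once the sitemap line is inserted, looks roughly like this (example.com is a placeholder domain, and the Disallow/Allow lines are typical WordPress defaults):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.example.com/sitemap.xml
```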

Increase the memory limit:

This option allows us to increase the memory of our xml sitemap in case there are loading problems.

Compress the sitemap if the requesting client supports it:

Option to compress the sitemap. Compression makes it easier for Google's robots to read. You can disable it if you see encoding error warnings or unreadable content in the sitemap.

Include the Sitemap in HTML format:

Generate the XML sitemap in HTML format as well, for greater compatibility with some robots. Leave this checked.

Googlebot actually reads our HTML code. As a machine, it sees absolutely everything in the markup: structure, color definitions, images, text, links. But it will not visually see the photos or the colors of your WordPress site, for example.

Allow anonymous statistics:

If we activate this option, anonymous statistics are sent to the creator of the Google Xml sitemaps generator plugin to improve the product.

#Step 2: Additional pages

In this step you can specify files or URLs that you wish to include in the XML sitemap even though they are not generated by this WordPress installation. It also gives you the option to add pages from other WordPress installations on the same hosting.

#Step 3: Blogs Priority and Comments Settings

In this step, you specify which posts on your site take priority over others; the crawlers will give these primary attention. Do not use the automatic priority option. Initially, it is best to give everything equal priority, then adjust it later according to which pages are performing well and which are not.

Also, here you will have a comment counter option. You can enable or disable comments on your blogs from here.

#Step 4: Sitemap Content

In this section, you will mark the elements that you want search engines to index. You have to check the checkbox button for various options. These options are:

Homepage:
It is the most important page on your website. It must be checked, as we want Google to see it.

Posts:
These are the posts or publications of your site. For the same reason, this must be checked.

Static pages:
They are the pages created: about me, my services, and contact us. This option must be checked

Categories:
With categories you have to be careful. Only check this option if you already have enough articles within each category. Leave it unchecked if you have just started.

Archives:
As a general rule, archives do not need to be indexed.

Author Pages:
Do not index them if you are the only author on your blog; they would be treated as duplicate content, since the author archive would contain the same posts as the blog itself.

Only check this option if multiple writers create articles for the same blog and you want them distinguished from each other.

Tag Pages:
As a general rule, like the categories, do not include them in the sitemap.xml.

Include the date of the last modification:
Check it as it will help search engines to know if the content is fresh.

#Step 5: Exclude Items

Excluded categories:

Here we will mark what we do not want search engines to index. They will appear blank if you have not created categories yet.

Items excluded:
In this section, we can insert the IDs of the pages that we do not want Google to index, separated by commas.

#Step 6: Frequency of Sitemap Changes

It shows the periods of time in which we want search engines to visit us. Depending on the kind of website you have, you will have to configure the frequency in one way or another.

If you have a static website with hardly any changes, it is advisable to have Google visit every week or month.

If you have a blog with daily updated content, you should assign the frequency in days or even hours.

For magazines or newspapers with content updated almost to the minute, it would be advisable to assign the frequencies to “always”.

#Step 7: Priorities of Elements in the Sitemap

Assign the maximum value to the landing pages, since they are the most important on the website. Assign the next highest values to articles and static pages. Set the rest to your liking.

#Steps 8 and 9: Validate Your XML sitemap:
You need to notify search engines about your sitemap.

  • Log in to Google Search Console and select your website.
  • Go to Index > Sitemaps.
  • Type in your sitemap URL and submit it.
  • Optionally, allow the plugin to notify Bing of any updates.
  • Save changes and you’re done.

What is an HTML sitemap?

Unlike an XML sitemap, an HTML sitemap is an actual webpage that helps visitors navigate. It contains links to all of your important web pages. The key difference between the two: HTML sitemaps are for visitors, while XML sitemaps are for search engines. According to Google, having both sitemaps, XML and HTML, is advantageous for websites.

How To Create an HTML sitemap?

There are many plugins available for different CMSs that can create an HTML sitemap. You can also create one manually in WordPress; doing so is a good option for small websites. Just create a normal web page containing links to all of your important pages.

However, if your website requires frequent content updates, for example a news website, it will be impossible to update your HTML sitemap manually for every change. For such websites, using a plugin is the right option.
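
For a small site, the hand-made version can be as simple as this page fragment; the page names are hypothetical:

```html
<h1>Sitemap</h1>
<ul>
  <li><a href="/">Home</a></li>
  <li><a href="/about/">About Us</a></li>
  <li>
    <a href="/services/">Services</a>
    <ul>
      <li><a href="/services/seo-audit/">SEO Audit</a></li>
    </ul>
  </li>
  <li><a href="/contact/">Contact</a></li>
</ul>
```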

Build A Successful Content Strategy

Content is the backbone of a successful SEO strategy. In technical SEO, we focus on certain aspects of content:

Finding And Eliminating Duplicate Content

For this you can use SEO audit tools, many of which are available online. We use Copyscape, a premium tool. There are also free tools such as Small SEO Tools and Duplichecker.

If you find any duplicate content on your website, replace it with new content. When writing content, keep these things in mind:

  • Make use of active voice wherever possible.
  • Write for users not for search engines.
  • Include research and stats – it makes your content trustworthy.
  • Give proper credit to the source when citing a stat or piece of research.

Write content for Thin Pages

Thin content pages are those that:

  • Have low-quality content
  • Have fewer than 100 words of content
  • Are poor-quality affiliate pages
  • Offer no value to the user.

Google considers such pages to be of low value and thus does not rank them in SERPs.

Meta Titles & Descriptions


The Title tag and Meta Description tag provide a short description of the content of the web page. They have two roles:

  • Let search engines know about the context of the web page.
  • To create the desire to click on your site.

Good tags are those that describe the landing page and include prompt action verbs and keywords. They must comply with Google’s length guidelines. Plugins such as Yoast and online SERP snippet optimization tools like SEOMOFO are very helpful.
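
As a sketch, with a made-up shop as the example (titles are usually kept around 50-60 characters and descriptions around 150-160 to avoid truncation in the SERP):

```html
<head>
  <title>Buy Black Leather Shoes Online | ExampleShop</title>
  <meta name="description"
        content="Shop handmade black leather shoes in every size. Free shipping and 30-day returns. Order yours today.">
</head>
```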

There are also heading tags (h1, h2, h3 … hn) on webpages. Placing terms related to the keyword you are targeting in these tags is a good way to improve your positioning on Google. But be careful not to overdo it: reading all your headings alone should let you understand the content of the page, without heavy repetition.

Use Canonical Tags

Canonical tags are related to canonical URLs.

When you have a page with duplicate content, you can either replace the content or tag it as noindex. But there is also one other option: the canonical URL.

It indicates web pages with almost the same content. For example, assume you operate an ecommerce site that sells shoes. There is a page specific to black leather shoes.

Now there can be numerous URLs depending on product attributes such as size, color, and material. Google will see them as duplicate pages, which isn’t good.

Fortunately, with a canonical URL, you can let Google know which version of your product page is the “main” one and which ones are variations. For this you will have to use the Canonical tag.
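
A sketch of the tag, using the hypothetical shoe shop from above: each size and color variation includes this line in its head section, pointing at the main page:

```html
<link rel="canonical" href="https://www.example.com/black-leather-shoes/">
```

Google then consolidates the variations' ranking signals onto the canonical URL instead of treating them as duplicates.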

Optimize Page Speed

First, know the loading speed of your website. The easiest option is to open your website in incognito mode. This will show you roughly how much time your website takes to open on a new device, without a primed cache.

Also, you can test page loading time using online tools. Here are three free tools you can use:

  • PageSpeed Insights : A free tool by Google. It is always interesting to know how much Google likes you. It gives you a score out of 100 and a color code, plus a whole series of tips to optimize your website.
  • GTMetrix : A popular website speed measurement tool that also offers recommendations for optimizing your website's performance.
  • Pingdom : A free online tool that measures the loading speed of a web page from different locations around the world. Pingdom also identifies the causes of slowdowns.
  • Bonus Tip: Compare the three results. This way, you get more recommendations to optimize your website and tips to speed it up.

What Are Possible Reasons Behind A Slow Site?

  • Your Web Hosting: Maybe the server at your web host is not properly configured. This affects the speed of your WordPress site.
  • The Configuration of WordPress: If your WordPress site is not delivering pages properly, it overloads your server, causing your entire site to slow down or even crash completely.
  • The Weight of the Pages: Usually, the weight of a web page comes from images that have not been optimized for the web.
  • Extensions: A poorly designed extension can also significantly slow down your website.
  • Scripts: External scripts such as advertisements and font loaders also have a huge impact on the performance of your website.

Pro Tips To Optimize Page Speed

Use Caching For WordPress Sites:

WordPress web pages are “dynamic”. That is to say each time an Internet user visits a page, WordPress performs a whole complex process to find the right information, assemble it and then send it to the user.

Technically, WordPress needs PHP and a MySQL database to function. Your server retrieves a whole series of information from your MySQL database and your PHP files, then assembles everything into HTML content ready to be served to your visitor.

This complex processing can really slow down your website when several people visit your site at the same time. Moreover, this processing is not always necessary, since the content of your web pages does not change constantly.

To bypass this processing, one can use “caching”.

How Caching Works:

Instead of fully generating a page on each request, the caching plugin makes a copy of the assembled page the first time it loads. This copy is then served to subsequent visitors. The cache is updated automatically when the content of your page changes.

“With caching, your WordPress site can be 2-5 times faster”
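
Conceptually, the plugin's behavior can be sketched in a few lines. This is an illustration only; render_page stands in for WordPress's expensive PHP and MySQL assembly work:

```python
page_cache = {}  # path -> ready-made HTML

def render_page(path: str) -> str:
    """Stand-in for the expensive PHP + MySQL page assembly."""
    return f"<html><body>Content for {path}</body></html>"

def serve(path: str) -> str:
    """Serve from cache when possible; build and store on the first hit."""
    if path not in page_cache:
        page_cache[path] = render_page(path)  # expensive, done once
    return page_cache[path]                   # cheap for every later visitor

def invalidate(path: str) -> None:
    """Called when the page's content changes, so the copy is rebuilt."""
    page_cache.pop(path, None)

print(serve("/about/"))   # first request: page is built and cached
print(serve("/about/"))   # second request: served straight from the cache
```

Real caching plugins add expiry times and per-user rules on top of this idea, but the build-once, serve-many principle is the same.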

Choose a Good Web Hosting

You cannot build massive structures if the foundations are not good. And the quality level of your web host is the foundation of your website. It plays a crucial role in the overall performance of your website.

A good host takes care of certain actions to optimize the performance of your server, and consequently that of your website. One of these actions is keeping the software on your server up to date. All of this helps boost the loading speed of your website.

Keep A Check on Your Extensions

Even if all of your extensions are up to date, some can be bad for your website's optimization. There may be some that you don't use; such extensions just increase the site's load, so remove the ones you don't need.

For each extension, ask yourself:

    • Is the feature it offers still useful?
    • Is more than one extension serving the same purpose?

If so, then you know what to do. Also, some extensions are poorly designed or run constantly, which overloads your server unnecessarily. To identify them, run a scan of your website with the Plugin Performance Profiler extension.

Compress Your Images

Images are a great help in stimulating the engagement of your visitors. But if your images aren’t optimized properly, they do more harm than good.

In fact, oversized images are the most common cause of slowness on a site.

What to do: Try to keep the weight of each image below 100 KB.

How to: Before uploading a photo to WordPress directly from your library, run it through photo editing software to compress it, or try Imagify or WP Smush. They can cut an image's size in half without visibly affecting its quality.

Reduce the Length of Your Pages

The heavier and longer the content of a page, the more time it takes to display.

Use PageSpeed Insights or Pingdom to find out the size of your webpages. Under 500 KB is awesome; under 1 MB is good; over 3 MB means you should act.

Tips: The size of a web page mainly depends on the size of the images. So, start by optimizing your images. Then get rid of 3rd party scripts.

Pro Tip: Use Google Tag Manager to avoid multiple code installations.

Ensure Your Website is Mobile-Friendly

You no longer have a choice, your website must be “mobile-friendly”.

80% of people who use the internet have a smartphone and more than half of the traffic comes from mobile devices. Clearly, when a visitor arrives on your website, there is a greater chance that they are behind their smartphone screen than in front of their computer.

Moreover, search engines like Google offer first priority to mobile friendly sites on SERPS.

A mobile-friendly website simply means a website that displays well not only on computers, but also on mobile devices, with their smaller screens. On a mobile-friendly website, the text is easy to read, the links and the menu are easily clickable.

There are several tools to find out if your site is mobile-friendly. One of the best is the one offered by Google.

      • Type “mobile-friendly” in the search bar of your favorite search engine.
      • Choose tools just below the search bar.
      • And just after that, Google puts its in-house tool forward.

This tool will help you check mobile usability issues. Enter the URL of your website (or your web page if you want to test the display of a particular page of your site). Google gets to work and tells you if your site is responding well to the criteria of being mobile-friendly or not.

How To Make Your Website Mobile Friendly


Simplify The Menus

Mobile screens are significantly smaller than computer screens. You should always keep this in mind when creating your menus. The menu of the desktop version of your site can be larger and offer more options.

However, on mobile, things get complicated. You should be concise and make use of space as per mobile display. You need to simplify the menus so as not to force your users to scroll and zoom to navigate your site.

Keep a Responsive Design:

A responsive design website is a website whose content automatically adjusts according to the size of the screen to display correctly. If you have a website that is not at all suitable for mobile and you want to fix it quickly, the best solution is probably to change your graphic theme completely.

However, if you already have a well-established site that is running well, this might not be the best option. But if you have low traffic or are just getting started, installing a responsive theme is the best thing you can do.

If you are using WordPress, changing the theme is very easy. Just go to Appearance > Themes and activate a new theme. You can type “responsive” in the search bar to find a responsive theme.

Create Forms as Short as Possible:

Not only do long forms convert less well (all other things being equal), they also reduce the quality of the user experience on mobile. On a computer, a form that does not fit fully on the screen is not a big deal; navigation is simple thanks to the mouse. But on a mobile, it’s complicated.

Therefore, go through all your forms and try to reduce their size if they are too long, by removing unnecessary fields. On a newsletter registration form, for example, you do not need to ask for the contact number.

Integrate an Internal Search Bar:

This tip is in relation to simplifying the menus. There may be websites that have more than seven items in their menus. It’s hard to fit a menu with many items on a mobile screen. In this case, the option is to integrate a search bar so that the Internet users can more quickly access the content they are looking for. This avoids ending up with a menu that is too big.

Pro Tip: If you have an e-commerce site, the internal search engine is clearly a must-have feature. Especially when you have a catalog with more than 12 million products like Amazon!

Size Matters:

Navigating a website from a computer is easy and simple: you control everything with your mouse cursor or your keyboard. But navigating with your thumbs on a 4-inch screen is not the same thing. Never forget this when you create the various elements of your mobile website.

Buttons should be large enough to be clickable with a finger. The distance between the buttons / links must be sufficient so that the user does not click on the wrong button. You should always keep in mind the clickable areas on a screen.

Pro Tip: 75% of users navigate on their mobile with their thumb. Avoid integrating buttons in the corners and at the top of the screen: these are the least accessible places for the thumb. Regardless of the size of the phone, the elements should ideally be placed towards the middle of the screen.

Avoid large Blocks of text:

You need to reduce the amount of text displayed on your mobile website. Of course, you need to communicate with your visitors, so you need texts, words. But choose short sentences, shorten paragraphs if you want your content to be read.

Don’t forget: a paragraph of 4 lines on the desktop version could very well be 12 on the mobile version. So keep the content short and informative, and make use of numbered lists or bullet points.

Create Accelerated Mobile Pages (AMP)

Accelerated Mobile Pages (AMP) are HTML pages that use a specific format. Google supports this format. Pages that are in AMP format are highlighted by Google on mobile in search results.

The advantage of AMPs is their loading speed, which is much faster than traditional pages. It is for this reason that Google favors them. If you use WordPress, creating accelerated mobile pages is very simple. There is a plugin for it.

More Technical SEO Tips

Use hreflang Tag For Multilingual Website

If you are targeting multiple countries, you will need a multilingual website. For this, use the hreflang tag: it tells search engines which language or regional version of a page to show a visitor, based on their language and location. Note that it does not translate anything itself; each language version must exist as its own page.
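
A sketch of the annotations, with hypothetical URLs. The same block goes in the head section of every language version of the page:

```html
<link rel="alternate" hreflang="en-us" href="https://www.example.com/en/" />
<link rel="alternate" hreflang="es-es" href="https://www.example.com/es/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

The x-default entry names the fallback page for visitors whose language matches none of the listed versions.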

Remove Dead Links from Site

Login to your search console and look out for any broken or dead links. Get rid of them. You can read more about them here.

Set up Structured Data

Also known as schema implementation, the markup to set up depends on your website: whether it is service-based or product-based. Here’s a complete guide on it.

Lastly, Focus on Technical SEO User Experience:

For this, you need to measure core web vitals:

      • Largest Contentful Paint (LCP)
      • First Input Delay (FID)
      • Cumulative Layout Shift (CLS)

Read: How Core Web Vitals Work

Technical SEO Tools

You can check the SEO tool list here.

Claim a free technical SEO Audit of your website: Contact Us!

Google Search Console – How to Add & Remove Users


In the time it takes you to read this blog, more than 175 new websites will have been added to Google.

To stand out from that crowd, you must have adequate knowledge about different SEO tools – their usage and operation.

One such prominent SEO tool is the Google search console.

Often considered as a complementary tool to Google Analytics (whose role is to analyze your site traffic), the Search Console gives you all the data you need to:

  • Optimize Your Web Pages As Per Engagement Stats
  • Get Rid of Indexing And Crawling Issues
  • Resolve Server errors and Site Load Issues
  • Identify the Right Keywords via Preferences of Internet Users
  • Re-index updated Web pages
  • Get Solid Backlinks
  • Rectify Security Issues
  • Site Maintenance
  • Troubleshoot Mobile Usability And AMP Issues

More specifically, Google Search Console lets you know how frequently Google's indexing robots (Googlebot) visit and index websites.

The tool even goes as far as generating a robots.txt file and suggesting HTML improvements to website administrators. On top of that, it is totally FREE.

The Search Console is therefore an essential service for all those who handle SEO.

Let’s move on to:

Steps to Add or Remove User From GSC

**The first four steps are common to both actions.**

Step 1: Go to the Google Search Console page and sign in with your email ID and password.

Note: Make sure you have added your website to Google Search Console.

Step 2: Choose the property on which you want to add or remove a user:

Properties here are the websites you own. You may have multiple websites and thus multiple properties. Here you can see our website:

Choose gsc property

Step 3: Then go to settings:

It will be in the left corner of your screen along with the icon.

GSC Settings

Step 4: Choose Users & Permissions:

Under the Property settings, you will find “Users and permissions” just below “Ownership verification”. This is where you can see who has access to your Google Search Console account and verify that there are no unauthorized users on it.

Users & Permissions

Steps To Add New User in GSC

1. Click “New User” in the top right corner:
It is a blue box with a plus symbol and a person icon.

New User

2. Enter the email address of the concerned person and also select the type of permission.

  • Full Access: As the name conveys, full access grants the user total access. They can perform any action, such as disavowing links, submitting reconsideration requests in response to manual penalties, re-requesting indexing, or submitting sitemaps.
  • Restricted Access: Restricted access grants limited access. These users can view performance data but cannot perform any actions. This level is for internal teams who need the performance data but don’t need to take manual actions.

User Access

3. Lastly, click on Add button

Add button

That’s it. You will receive an email notification about the granted user access. After adding a user, you will see them in the list.

Steps To Remove a User in Google Search Console:

Note: Only the property owner can remove a user from GSC (Google Search Console).

1. Log in and go to “User and Permissions” using the four common steps stated above.

2. Click on the three dots next to the user, then click “Remove access”.

remove access

3. Confirm your choice.

Confirm your choice

Note: From the Users and permissions list, you can directly modify each user's permission status, from “full” to “restricted” and vice versa.

Why Add or Remove Users From GSC?

There can be multiple reasons for this:

  • New members joining the team.
  • Employees leaving.
  • Granting access to the data to different teams – content, SMO, etc.
  • Removing any unauthorized users from the account.

Concept of User Access In The Google Search Console

On Google Search Console, users have a certain “level” of permission of a web property. A web-property is a website.

There are four different levels of access in Google Search Console:

1. The Owner level: Roughly speaking, the owner of the account is the manager of your website. This is the person who controls everything in the account, for example:

  • Access to all the information & data available
  • Create, modify, delete users
  • Use all the tools offered. Etc.

By default, the creator of the account on Google Search Console for your website has owner-level access rights.

Further, a user of owner level can have two statuses:

  • Confirmed: An owner is confirmed when they have verified ownership of the website via a verification token.
  • Delegated: Delegated owners are those who have not completed the confirmation process themselves. Verified owners can create a delegated owner using Webmaster Central. Deleting a delegated owner does not require a verification token, so any other owner can remove them in Search Console.

2. The Full User Level: As its name suggests, full user-level access allows someone to use Search Console, but not to administer it. For example, a full user cannot add other users, and they may have limited access to some functionality.

3. Restricted User: They can only observe data and can’t perform any actions.

4. Associate Level: Associates are people who have rights to take certain defined actions on behalf of your site, or to access certain data. Associations link your Search Console property with other platforms, such as the Chrome Web Store. Associates cannot view your Search Console account or data directly; the permissions vary depending on the type of association.

How To Confirm Your Status as a Website Owner in Google Search Console

The details of the methods of confirming your status are a bit technical.

Confirming that you are the owner of a website, in Google Search Console, can be done via:

  • Connection with your Google Analytics account
  • Uploading an HTML file (provided by Search Console) via FTP
  • Adding an HTML tag in your source code
  • Via the Google Tag Manager,
  • Or filling in technical information in the customer area of your domain name provider.

Only one of the above methods is needed to become a “confirmed” owner. But if you set up more than one, it helps Google provide you with even more relevant information.

If you need help with these settings, your best bet is to call on a competent developer, in order to save time on these technical aspects.

Or contact us, so that someone from the team can help you!

How To Add New Owner Access in Search Console

Let’s say you created your Search Console account 2 years ago. And you have just integrated a new partner in your company.

He needs to be on top of everything, so you decide to create owner access for him.

  • In the “user and permissions”, you will see an “Add Owner” button:
  • If you click on it, a mini-window appears to insert the email of the person to add as the owner:
  • All you have to do is click on “Continue”. That’s it!
  • The user who has been added as the owner will receive an email notification.

And as a security measure, Google sends an email to all owners of this web property, which informs that new owner access has been created.

How To Remove Existing Owner Access

Now let’s say you have a partner in your business who decides to take his well-deserved retirement.

In this case, he probably no longer needs access to the Google Search Console tool. In the list of owner accesses, at the far right of the table, you will find “Cancel validation” links.

These simply remove the “Owner” status from the person concerned. It’s fast and efficient. In general, it is better to limit access to the minimum possible number of people.

Indeed, it is complicated to secure a web tool when many people can access it. Some could forget their password or choose one that is too easy to guess; others could leave the company on bad terms and cause damage. We are never safe from problems on the Web.

Common FAQs on Google Search Console

Who is Google Search Console for?

It is aimed at anyone administering a website, from business managers to application developers, including SEO specialists and digital marketers.

How does Google Search Console work?

Its use is quite simple. You can observe and manage your website’s data daily and monthly. Google will let you know via reports if there are any irregularities on your site. Every month, a dashboard will be sent to you to help you quickly monitor your site. The dashboard contains statistics, the number of clicks, etc.

If you want to add or modify content, you can update your sitemap, or indicate in your robots.txt file which pages should not be crawled, and submit it to Google.
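For example, a minimal robots.txt might look like this (the paths shown are hypothetical; the Sitemap line tells crawlers where to find your sitemap):

```text
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```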

How To Set Up Your Site on Google Search Console

  • Go to Google Search Console;
  • Add your site and subdomains to Search Console;
  • Confirm your site;
  • Specify if it is with or without the www;
  • Add the users who will have access to the information;
  • Add the country or countries that will be able to find your website;
  • Reread the best practices and instructions for webmasters;
  • Submit a sitemap or robots.txt file to tell Google which pages to index, if you want;
  • Monitor stats and dashboard for follow-up.

Can I recreate an access level in Google Search Console?

Yes, it is easy to recreate an access level if you made a mistake, except for owner access: this type of user can see, do, and edit everything in your account, so grant it carefully.

What are the advantages of the Google Search Console?

Above all, the tool helps you monitor and maintain good SEO for your site by providing you with the data you need to plan and prioritize your future actions. Google Search Console helps you understand which of your website’s pages are performing well (or not) in organic results, and why.

For this, the tool gives you three essential features to improve your SEO:

Search Performance Analysis

The “Search traffic” section provides information on the most frequent queries in Google Search. You are thus informed about the traffic and the keywords used to find your site. To obtain detailed data on your pages and queries, choose your metrics carefully: Clicks, Impressions, CTR, and Position.

Using this tool, you can see how your pages are behaving on specific queries. You will then use this data to improve your SEO.
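If you prefer pulling the same data programmatically, the Search Console Search Analytics API accepts a JSON query with these dimensions and date ranges. A minimal Python sketch of building that request body (the dates and dimensions are example values; authentication and the HTTP call are omitted):

```python
# Sketch: build the JSON body for a Search Console Search Analytics
# API query. Field names follow the API; the values are examples.
def build_search_analytics_query(start_date, end_date, dimensions, row_limit=100):
    return {
        "startDate": start_date,    # inclusive, YYYY-MM-DD
        "endDate": end_date,        # inclusive, YYYY-MM-DD
        "dimensions": dimensions,   # e.g. ["query"], ["page"], or both
        "rowLimit": row_limit,      # maximum number of rows returned
    }

payload = build_search_analytics_query("2021-01-01", "2021-01-31", ["query", "page"])
print(payload["dimensions"])
```

The response rows then carry the same four metrics you see in the interface: clicks, impressions, CTR, and position.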

The Number of Your Pages Indexed by Google

With the “Indexing Status” feature of Search Console, you can see how many URLs on your website Google has indexed over the last twelve months.

This indexing status is essential for the organic ranking of your web pages. You can also view the pages that are blocked by the robots.txt file and the number of deleted web pages.

If this number is unusually high or not what you expected, it is a sign that something is wrong. And if your site is not readable by Google’s robots, its positioning will be heavily penalized.
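As a side note, you can sanity-check locally how a robots.txt rule blocks a crawler, using Python’s standard library (the rules below are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for illustration
rules = """User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A URL under /private/ is blocked; everything else is allowed
print(parser.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))     # True
```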

Resolving Crawling Errors on Your Web Pages: The Google Search Console also helps identify errors in:

Site: DNS errors, server-related errors, and robots.txt errors that prevent Googlebot from accessing your entire site.

URLs: 404 errors, access denied, server errors, unfollowed URLs: the specific errors Google encounters when trying to crawl particular pages on desktop or mobile.

These errors impact your SEO because they slow down the passage of Google’s robots over your web pages (crawling) and consequently delay the indexing of your website on Google.

Final Words

Google provides you with many free tools to optimize your website and generate more traffic. Among them, the Search Console is essential and its main function is to improve the SEO quality of your website.

So grab all the features of the Google Search Console to boost your SEO!

** Please feel free to send us your comments and remarks below **

SEO: Google’s Algorithm Updates Of The Last 10 Years

Everything About Google SEO Updates of The Last Decade

The slightest tremor in Google’s algorithm can shake the whole SEO world. For Google, these changes help it respond to user queries with ever more precise results. But for online businesses, they can ruin the hard work of several months or even years.

The famous search engine released several algorithms with names as charming as Penguin, Panda, or Hummingbird. Below we have explained all of these along with their impacts.

But before that, you should know:

Google Algorithm – What is it?

It is a system that helps Google to fetch relevant data from its database in return for a search query. Thus, it helps Google deliver precise information instantly. It follows several ranking factors to deliver the right web pages on the top of search results.

What are updates in Google Algorithm and Why Are They Necessary?

Everything needs updates, and so do Google’s algorithms. Updates refine Google’s ranking criteria for websites. They are necessary because, without them, internet users would not get precise results for their search queries.

Google makes a plethora of updates to its algorithms every year, but plenty of them have no major effects and thus go unnoticed.

Now let’s discuss the Google Algorithms updates from 2010:

Google Caffeine Algorithm – June 2010

This update was announced in August 2009 and rolled out in June 2010. Caffeine redesigned Google’s indexing system. With Caffeine, Google could crawl and index pages faster on its servers, making it possible to add newer content more quickly.

Caffeine wasn’t an algorithm by Google to manipulate rankings. Rather, it was just a reconstruction of its indexing system.

Google Panda Update – February 2011

The Google Panda algorithm was a search filter that targeted the content of sites. The goal of this update was to highlight unique, good-quality content in Google’s results.

More specifically, Google Panda excludes:

Content Spinning: a method of automatically generating content using software by taking text and replacing certain words with synonyms.

Duplicate Content: Internal (a page that includes the exact content of another page within the same domain name) or external (a page that includes the exact content of a page on another website).

Spam Comments: Identical texts automatically published in order to generate links to another site.

To avoid being penalized by Google Panda, websites must therefore offer unique and useful content, i.e. content that answers the user’s question.

Major Google Panda Updates

  • In mid-March 2013, Matt Cutts announced that Google Panda would now be integrated into ongoing algorithm updates.
  • In mid-June 2013, Matt Cutts also revealed that an update for Google Panda had been launched in early May (Google Panda 26) and that Google was working on less severe Panda filters.
  • In mid-July 2013, Google confirmed having deployed this reworked version of Panda (Google Panda 27), which according to the engine “incorporates new signals to target more finely”.
  • In 2014, Google began to communicate again on Panda deployments, in May and September in particular, with versions 4.0 and 4.1.
  • At the beginning of 2016, Google announced that Panda was now integrated into the core algorithm, and there have been no separate Panda updates since.

(Chart: the sites that were hit the most by the Panda update.)

Google Penguin – April 2012

This is another Google search filter algorithm, but this time it tackles spam backlinking. Just like Google Panda, it has become so important for the engine that it was integrated into the core algorithm in September 2016, thus becoming an essential criterion in search engine result positioning.

Google Penguin aims to punish poor quality backlinking, in particular by addressing:

Spamdexing: This is when the backlinking of a site is framed in an artificial way, most often using link directories. It involves purchasing links. Very often, they come from URLs unrelated to the theme, or from poor quality sites from an SEO point of view. For Google, this is clearly a fraud on the engine index.

Over-Optimization of Link Anchors: When the anchor of internal links or external links pointing to the same page is the same.

For Google, the goal for Penguin Algorithm was to create the cleanest and most natural backlinking network possible between sites, in particular avoiding link purchases.

Major Google Penguin Updates

  • Shortly before Penguin’s 4th deployment, in May 2013, Matt Cutts explained that the filter would now be able to “analyze more deeply” and also have “more impact” than the first version. It was therefore to be a new major version of the filter, which earned it the name “Penguin 2.0”


  • In 2014, Google rolled out “Penguin 3.0” more than a year after the previous rollout. The first effects were seen on October 18, but aftershocks were still being noticed more than six weeks later.
  • On December 10, Google claimed that Penguin was now deployed on a continuous basis, with no end-of-deployment date.
  • Finally, in September 2016, Google announced the last update of Penguin. It was then integrated into the core algorithm, where it has been operating continuously ever since.

Exact Match Domain (EMD) Update – September 2012

This update was announced by Matt Cutts on Twitter on September 28, 2012.

This algorithm was deployed to close a loophole in the system: it prevents poor-quality sites from appearing in the first positions of the SERPs simply because their domain name matches the query.


Some people, knowing the impact of the domain name on SEO, had taken advantage of this flaw in Google’s algorithm to improve the ranking of their sites. Thanks to this update, Google was able to remedy the problem.

The motive behind this algorithm was not to exclude all websites with exact match domain names, but only those exact match domains with poor, unreliable, or thin content.

Have an EMD? Wish to know how to avoid getting penalized?

  1. If your site uses an EMD, you must first and foremost focus on quality content: original content, added to regularly.
  2. Go for creating content clusters. Keep in mind that for Google, the most important thing is to give relevant content to the internet user.
  3. Adapt your site with an optimal design (UX) and a mobile version (mobile-first index).
  4. Think about diversifying your outgoing links, and finally, create a community in order to create interactions.

These points will allow you to keep a good position or limit losses with the use of an EMD.

Hummingbird Update – September 2013

The Hummingbird algorithm, formalized by Google in September 2013, made significant changes to the SERPs. Thanks to this update, the search engine became able to understand what SEO experts call conversational search: a whole sentence or even a question, and no longer just a succession of keywords.

Hummingbird allows Google to understand a query in its entirety and is no longer based on one or more keywords. To achieve this, this tool has been programmed to better understand both Internet users’ requests and indexed content.

Due to this, Google better understands the request made by the Internet user, which makes it possible to offer more precise and relevant results.

The idea was: if an Internet user writes a question, he can obtain a result that answers his request. Depending on the search intent of the Internet user, the results differ.

It is thanks to the Hummingbird algorithm that queries made orally via voice assistants began returning more relevant results.

Payday Loan 2013 – 2014

On June 11, 2013, Google announced this update to target spammy search results for queries such as porn and payday loans. It took around sixty days to roll out. In 2014, the Payday Loan 2.0 and 3.0 anti-spam updates followed. Some well-known experts suggested that 2.0 targeted specific sites while 3.0 targeted spammy queries.

Google Rankbrain – Early 2015

RankBrain is part of Hummingbird. Google said that with RankBrain, it built machine-learning skills into its algorithm so that it can understand the need behind a query, what experts call search intent.

It aims to understand the implicit searches of Internet users via artificial intelligence. For example, it will understand that to the query “golden shoe”, it is relevant to provide answers concerning “Messi or Ronaldo”, even if their name is not given in the search.

So, it relies on the whole long-tail phenomenon. After the quality of content and links, RankBrain is the third most important ranking signal for a site.

Mobile-Friendly / Compatibility – April 2015

SEO people around the world nicknamed it Mobilegeddon!

Since 2015, Google has made mobile compatibility a priority and a primary factor in the search engine optimization of a website.

Deployed with the objective of favoring mobile-friendly sites in the SERPs, the “Mobile compatibility” update appeared on April 21, 2015. Later, Google launched its mobile-first index in 2017 to adapt to user behavior.

Knowing that users are increasingly going to search via their mobile phone and not their computer, Google has therefore made responsive design a necessity by classifying websites in relation to their mobile version.

It is a question of giving preference to websites that have an interface adapted to mobile browsing. Mobile-Friendly was the forerunner of the Mobile-First Index.

Pigeon Algorithm – June 2015

Google Pigeon is an algorithm that strengthens local search. It was launched internationally in June 2015.

With the aim of providing ever more precise responses to Internet users, the algorithm differentiates between local results (via cities) and more general results.

This update is very useful for businesses such as stores and restaurants that want to develop their visibility at the local level.

Core Algorithm Update (Quality, Phantom Fred, Medic) 2015 -2017

2015 to 2018 was an eventful time for the SEO world: Google rolled out a number of very important updates. The search engine announces updates to its algorithm approximately quarterly. These regular updates, called “Core Updates”, impact the way Google ranks the pages it references in its search engine.

Algorithm updates of this type are called Core Update because they relate to Google’s main algorithm and not a particular indexing method or SEO criterion. Let’s have a look at them:

  • The “Phantom” update, launched in 2015, allowed Google to penalize websites that did not present relevant content to users.
  • Phantom 2, also called the “Quality Update”, was launched on May 20, 2015, with the objective of improving the user experience by combating low-quality content.
  • In March 2017, Google rolled out its “Fred” update. This Core Update led to many changes in the search results. It was launched to combat sites displaying so many advertisements that they were detrimental to the user experience.
  • The effects of “Fred” were also felt on all sites using Black Hat SEO practices.

Speed Update – July 2018

Page speed was first made a ranking signal for desktop searches in April 2010. In July 2018, the mobile version of this signal came into existence. On desktop, its goal was to promote sites with a good loading speed; on mobile, it penalizes sites that are too slow. It now serves as part of Core Web Vitals, an update for 2021.

Local Search Update – November 2019

In November 2019, Google announced the “Local Search Update”. It acts on the understanding of queries: Google was now able to analyze the link between the words used in a local query and its meaning, to provide more precise results.

Core Update 2019

Google has rolled out a major new update in 2019. This time the search engine imposes new technical quality criteria on the sites to be well ranked:

  • Importance of page loading time, which must be as low as possible
  • Switching to HTTPS to secure the data passing through the site (payment, personal data, etc.)
  • Importance of UX with fully responsive design

Site Diversity Update – 2019

Launched almost simultaneously with the previous one, this update wants to offer more diversity in the SERPs. Google no longer wants to bring up several pages of the same site, but to offer several domain names to respond to a request.

Google EAT Algorithm Update – 2019

Google’s EAT (Expertise, Authoritativeness, Trustworthiness) criteria were established in 2014 and updated in 2019. This algorithm concerns the criteria the search engine evaluates in terms of content. For the textual content of your site to be rated well by Google, it must respond favorably to these three criteria:

  • Expertise
  • Authority
  • Reliability

Many sites were impacted by this update, particularly YMYL websites. YMYL stands for Your Money, Your Life. These sites have a direct impact on “people’s happiness, health and wealth”. In general, YMYL sites concern themes such as health, finance, legal, insurance, security, etc. In other words, a very large part of the web.

Google wants to provide the best possible answers to Internet users, which is why it set up Google EAT. The content must then be of high quality, 100% authentic and respond perfectly to the requests of Internet users on a specific request.

Why Did Google Create the EAT Criteria?

Because a website that meets these criteria naturally distributes quality content that perfectly meets the demands of Internet users on a specific request. Thus, the search engine can provide a very satisfactory response to its users, based on quality, reliability, and relevance. Please note that the content of a web page is not limited to its text but includes all its functionalities and its design.

You must therefore include these criteria in your SEO strategy, especially as the results in the SERPs become more and more refined. Google tends to respond effectively to the problems of Internet users and the EAT plays an important role in the display of the first search results.

What are the important EAT criteria for YMYL sites?

Site Reputation: The reputation of a YMYL website is very important and negative ratings may penalize it.

The Main Content Quantity: Whether for the YMYL site or for the other sites, the content must be highly qualitative.

Supplementary Content: Content allowing a good user experience is mandatory.

Advertisements: The advertisements which are in the web page must not disturb the visitor, whether it is for the YMYL sites or the other sites.

Reviews: They ensure the reliability of the site.

Recommendations and Awards: Awards greatly increase the score of YMYL sites.

Other than that, one should work on:

  • Quality of Information on the website.
  • The level of expertise.
  • The online reputation of the author.

How to improve your “EAT score”?

To show Google that you are legitimate, you need to improve the “EAT” score of each of your pages. Here are some tips that will help you present yourself as an expert in your industry.

  1. Put Author Bio On Each Content:
    Google attaches importance to the author of the content. It is therefore important to put an author bio on each blog post. This allows Internet users and Google to see if the article was written by an expert on the subject.
  2. Encourage Contributors:
    You can encourage your community to comment on your articles. Interactions do good for the EAT and allow you to gain reliability (if the feedback is positive).
  3. Have a site that facilitates navigation:
    On e-commerce sites, all pages should be easily accessible from the home page. Also, all the information such as contact information must be easily available.
  4. Work on your brand image:
    To take care of your brand image, you must:

    Be active on social networks; present your values; tell a story to your audience.

  5. Delete or refresh pages with a low EAT score:
    Each page on a website is rated by Google and may have a different EAT score. All of these scores are combined with the reputation of the website to produce the final EAT score. If you have pages with a low score, update them. You can also remove them if they do not add value.
  6. Secure the Site:
    HTTP sites without an SSL certificate are flagged as insecure. You need to ensure that every page on your site has an HTTPS URL to achieve good EAT scores.
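As an illustration, forcing HTTPS is often just a server-level redirect. A minimal nginx sketch (the domain is a placeholder, and your SSL certificate must already be configured in a separate HTTPS server block):

```nginx
# Redirect all plain-HTTP requests to their HTTPS equivalent
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://$host$request_uri;
}
```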

Google BERT Update – 2019

Arguably the most significant update of the past 5 years, BERT was officially launched in October 2019.

BERT stands for “Bidirectional Encoder Representations from Transformers”. This is the deployment of a form of artificial intelligence in the algorithm to better understand the requests of Internet users.

The objective of the BERT update is to achieve a better understanding of user requests by prioritizing the terms and expressions used by users. With this update, Google was able to better understand the relationships between words within an entire sentence, rather than processing the keywords (or phrases) one by one.

Thus, since its launch, this update has enabled Internet users to see results appearing in the SERPs which are precisely linked to their searches.

Core Updates – 2020

In 2020, three Core Updates were deployed by Google: the “January Core Update”, followed by the “May Core Update”, and finally, at the end of the year, the “December Core Update”.

The search engine focused on four elements that are of paramount importance for a site’s SEO:

  • Quality Content
  • Efficient Sites
  • A well-thought-out Website Structure
  • A Fluid User Experience

Featured Snippet Repetition – January 2020

Danny Sullivan from Google announced this update on Twitter on January 22.

According to this update, if a page is displayed as a featured snippet, it will no longer be repeated in the page-1 organic search results. It affected all Google search results.

Passage Ranking Algorithm – February 2021

Danny Sullivan introduced Passage Ranking to the world on February 12, 2021, via Twitter. For now, this algorithm is only launched for search queries in the US. It helps Google understand specific passages of your content, and Google treats passages as an additional ranking signal.

Product Reviews Update – April 2021

This algorithm was designed by Google to rate product reviews. It was officially launched in April 2021. Concretely, with the Product Reviews update, Google wants to favor the sites with the best reviews: sites whose reviews are based on relevant data and genuine research carried out by enthusiasts and professionals who master the subject concerned.

Currently, the Product Reviews update only applies to English content, but Google plans to expand it to various other languages.

Google Page Experience Update – June 15, 2021

This update concerns the “Google Page Experience”. Google now takes into account more criteria to classify web pages. These criteria relate to the state of a site in terms of UX. They include:

  • The loading experience;
  • Interactivity;
  • The visual stability of the content of a site’s pages.


In order not to be penalized in the ranking of websites within the SERP, it is necessary to take upstream actions:

  • Test the mobile compatibility of your site
  • Test the speed of your site
  • Check the mobile efficiency of your site

The objective is always the same for the search engine: to ensure that the sites offered to Internet users are always more relevant and efficient. But the user experience is also taken into account this time.

Google relates UX to three “Core Web Vitals”:

LCP: Largest Contentful Paint. This indicator concerns the loading time of a web page. The main elements should display in less than 2.5 seconds according to Google.

FID: First Input Delay. This is the time between the user’s click to view a page and the browser response. For Google, the FID should not be greater than 100 milliseconds.

CLS: Cumulative Layout Shift. It measures the visual stability of a page during loading time.

Google Spam Algorithm Update – June 2021

This update was rolled out on June 23. However, Google gave no specific details on what the spam update was targeting. Its second version was rolled out on June 28.

July Core Update – 2021

In July 2021, Google deployed a Core Update. The engine specified that the goal was to “always focus on the quality of content”. On July 12, Google confirmed via Twitter that the July 2021 Core Update had been completed. Although no deep details were provided, one can read about it on the Google Search Central Blog.

Think You’ve Been Impacted By Any of These Updates?

Ask Our Experts For An Audit



Ten Questions To Help You Hire The Best CRO Agency

Ten Questions To Help You Hire The Best CRO Agency

Unbounce reports that 44% of brands across the world spend more than $10,000 every year on A/B testing their products! How much are you investing in growing your brand?

The good news is you don’t need a big budget to outpace your competition. However, you do need a dedicated Conversion Rate Optimization (CRO) team to ensure you’re constantly growing.

  • Did you know former President Barack Obama raised $60 million through A/B testing?
  • In 2011, even Google ran 7,000 A/B tests.

CRO tools can increase your ROI by an average of 223%.

It should come as no surprise that outsourcing to the top CRO agency will still cost you much less than hiring in-house CRO specialists.

So, let’s walk you through why you need CRO and how you can select the best CRO partner for your brand.

Value Added Through Conversion Rate Optimization

Just a refresher, Conversion rate optimization (CRO) includes all the practices and strategies that help increase the percentage of users performing the action you want them to take on your site.

From macro-conversions like clicking the ‘add to cart’ or purchasing your product to micro-conversions like signing up for your services or filling out forms, you need CRO to succeed, period.

Why Your Brand Needs Conversion Rate Optimization

Hint: Not following the “best practices” can be in your brand’s best interest. Take a look at these famous CRO case studies:

1. Basecamp’s Counterintuitive Decision To Remove Trust Elements

In 2007, Basecamp, a powerful project management tool, released Highrise, a CRM solution that allowed users to easily share contacts and manage communications. Since its launch, Basecamp wanted to scale its signup rates. So, they experimented with a completely new design.

Take a look at their original page:

(Screenshot: Highrise’s original page.)

It checks all best practices:

  • Testimonials
  • Bold CTAs and other visual cues
  • Good use of color contrast

And yet, they wanted better conversions. So, they decided to revamp the entire site!

Take a look at the six redesigns and their conversion rates:

(Image: the six redesigns and their conversion rates.)

  • Smiling faces are welcoming and boost trust.
  • Highlighting the best testimonial that addresses their services and the value provided made more impact.
  • They had a cleaner, simpler design.

The result? A 102% rise in their conversion rate!

2. World Wildlife Fund’s Straightforward And Intuitive Web Page

The famous World Wildlife Fund wanted to get more signups to their newsletters. Here’s how their landing page originally looked:

(Screenshot: the original landing page.)

Following their intuition and experimenting allowed them to stand against “best practices,” and here’s the redesign:

(Screenshot: the redesigned landing page.)

“Best practices” at the time dictated:

  • having a clear CTA that stands out
  • gathering as much data as possible from your readers
  • having more fields on the contact form.

Although their new design wasn’t too different from the original, it went against all best practices.

But adding a few extra details (like telling readers exactly what they’re signing up for), moving the CTA to the left, and removing unnecessary fields was the best conversion rate optimization strategy to follow.

The result? An 83% increase in newsletter signups!

There are many other examples of how conversion rate optimization can help your brand make the right decisions. Blindly relying on industry standards or “best practices” isn’t always in your best interest.

Many conversion rate optimization agencies don’t have the experience to suggest the right strategies. That’s what differentiates the top CRO agencies from the rest.

Let’s understand how to select the right CRO agency for your brand:

Questions To Help You Select The Right Conversion Rate Optimization Agency

There are multitudes of CRO marketing agencies today, and to help you select the right partner for your brand, here are a few questions to ask the sales representatives of your shortlisted CRO agencies.

1. Do you have a CRO specialty?

A jack of all trades is bound to be a master of none.

Sure, a CRO consultant can provide services like web design, front-end development, or copywriting, but a specialist will help you bring home the prize.

2. What is your process/ methodology?

CRO is solely data-based. Let no one tell you otherwise.

  • Is the answer to this question providing you with real insights into the agency’s process or methodology of working?
  • Are they instead talking about their intuition never failing them?
  • Are they thinking strategically?
  • Do they ask you questions about your metrics (traffic overview, average order value, current conversion rate, and revenue)?
  • Are their strategies user-driven?

Don’t just partner with the first result you find when you type “CRO agency near me” on search engines.

3. What shapes consumer choices?

Simply changing the prices or mindlessly switching up your web design won’t result in increased conversions.

The agency needs to live and breathe consumer psychology.

Ask them about their knowledge:

  • What books do they consume?
  • How many clients have they worked with?
  • Examples from their work on how consumer psychology drove sales.

Don’t forget to ask them for their case studies for successful projects so you know what to expect.

4. Which was your least successful project, and why?

Every brand faces setbacks. If the conversion optimization agency denies this, it can mean one of three things:

  • They’re very new to the industry
  • They’re lying
  • They don’t have concrete learnings from their setbacks.

Better to move on.

5. What is the average experience of your CRO experts?

Experience says a lot about expertise. And conversion optimization needs the expertise, versatility, and patience that only experience can bring.

Let’s look at the skills a manager from a CRO marketing agency should have:

  • Analytical skills
  • Well-versed with consumer psychology
  • Sound knowledge of HTML
  • Proficient in SEO & digital marketing
  • A clear understanding of web design

When you get on a call with the sales rep of a conversion optimization agency, they usually connect you with the manager.

That’s why you must explicitly ask for the portfolio of the CRO expert that would eventually manage your account. Then, get on a call with your potential agents. Ask them a few of these qualifying questions:

  • What’s the difference between a Conversion Rate Optimization Agency, an SEO agency, and a Digital Marketing Consultancy? Of course, do your research beforehand. But this question will give you some insight into their understanding and expertise.
  • Many firms believe that an agency specializing in eCommerce acquisition (AdWords agency or an SEO agency) would optimize your landing pages for conversion. However, don’t expect web designers, AdWords specialists, or SEO experts to specialize in CRO. Far from it. If this is your consultant’s understanding, you’re better off continuing your search.
  • With knowledge so freely available, many agencies know basic CRO techniques. But only true CRO specialists know how to execute these methods strategically. For example, scaling your leads isn’t the same as scaling your ROI. A non-specialist might simply tell you to discount your products. That increases traffic to your website, but your bottom line stays the same.

If they’re hesitant to share those details with you, you should move on to the next agency on your list.

6. What tools do you use?

CRO marketing agencies have access to multiple tools. Most of them are for A/B testing:

  • AB Tasty
  • Google Optimize 360

Apart from A/B testing platforms, a few other useful CRO tools include heatmap tools, funnel analysis platforms, and survey platforms.

Your conversion optimization agency should help you understand these tools and how they would use them for your brand.

7. Are there any additional costs apart from your quoted charges?

Conversion rate optimization tends to involve hidden costs, but if you ask about them upfront, they need not stay hidden. Here are some cases where your CRO marketing agency might charge you additionally:

  • Suppose you don’t have access to an AB Testing tool to monitor your site. Then, depending on your volume of visitors, these charges can go upwards of £1000 each month!
  • Heatmap tools, Funnel Analysis platforms, and Survey platforms can quickly increase your overhead.
  • Depending on the complexity of your project, staff costs could increase for the number of hours spent.

This is why we suggest you ask about these details upfront. Make an informed decision.

8. How will you measure success in this project?

As with any partnership, ask for, and make sure you understand, the goals and key performance indicators (KPIs) proposed by the conversion optimization agency.

  • If you’re an eCommerce site, you want increased sales that don’t shrink your margins. However, if your CRO agent is insistent on you focusing primarily on micro-conversions (the clicks on an ‘add to cart’ button), they’re probably not worth their salt.
  • If you’re in the SaaS industry, you need increased subscribers who eventually become paid users. Here, micro-conversions count.
  • If you have a lead generation site, your conversions are an increase in lead generations.

The points above assume high traffic to your website. If your traffic isn’t high enough to yield statistically significant insights, your CRO agency will instead rely on evaluating qualitative data.

Also, note that measuring success isn’t about the immediate goals. You can evaluate your future CRO partnership through the lens of:

  • Transfer of skills from the agency to your team
  • Well-defined KPIs at the start of your project
  • A clear-cut strategy for success

9. What do you need from my team?

CRO is a complicated process, and remember: it is a partnership you’re entering. There will be give-and-take. For example, your conversion optimization agency might need resources from your side, such as time from your web developers and design team.

Some questions to ask:

  • What format will you deliver the new designs in? For example, do you generally use simple wireframes, PSD, or HTML files?
  • Do you need assistance with Front-End resources for launching A/B tests?

10. Do you guarantee increased conversion rates?

While NOTHING can be guaranteed, we’ve added this question to help you screen out agencies that do. CRO is too complex, with too many variables to have written guarantees. Top CRO agencies know this and will always explain to you all the variables and factors outside their control.

As an example: if you decide to run a native advertising campaign when all your visitors so far have come from search AdWords campaigns, the intent of your traffic will certainly change, and that change will show up in your conversion rates.

How dishonest would it be then for an agency to assure you a rise in your conversion rates?

In the end

You now have ten great questions to help you weed out all the agencies that don’t have your best interest at heart. From the two case studies we shared, you know how important it is for you to scale your conversion rates. So, next time you search for ‘CRO agency near me,’ you’ll be prepared to select the right partner for your brand.

For the past decade, WebSpero has increased clients’ conversion rates by up to 15X. As a result, we’ve grown our clientele to more than 700 brands across industries worldwide. So, if you’re looking for a conversion optimization agency, schedule a free consultation now.

Types of SEO Keywords – Your Guide To SEO Success!


Google processes more than 40,000 keyword searches every SECOND. But how many of these searches help businesses gain profits? 

To know the answer, you need to be familiar with the types of SEO keywords and their classification. It will help you understand which type of keywords brings you just traffic and which brings you sales. Today, we will cover this topic in detail.

Let’s dive in.

What Are SEO Keywords? (Because Basics Are a Must!)

SEO keywords are the words and phrases internet users type into search engines. Basically, these are the query phrases people use.

SEO keywords example:

Suppose an internet user searches “red dress” on a search engine; the phrase “red dress” serves as the keyword.

For good SEO, you must understand the search queries related to your niche and answer them as precisely as possible. That comes first. Then comes the technical part: placing keywords at certain spots on your website, like in header tags.

When you do so, it serves two purposes:

1. It helps search engines to relate your web pages with relevant Internet users’ searches. 

That will help your web pages to rank high on particular search queries.


When you create a site, Google’s crawlers, called “spiders,” will crawl your pages. They analyze the keywords and phrases that appear regularly on your web pages and associate your pages with those terms. If they find a page relevant to a query, they show it in the search results.

2. It helps in smooth navigation on websites with multiple pages, especially e-commerce websites. Amazon is the perfect example of this.

On the Amazon homepage, type a product-related search query into the search bar. Hit enter, and you will see a list of product pages that match your query. You can then navigate to the product page you wish to visit. This works because of the keywords present on those web pages.

What Differentiates Types of Keywords in SEO?

Experts say many things in this regard, but the best answer is search intent. What internet users wish to achieve with a query is what differentiates the keyword types.

Types of SEO Keywords

Three types of search intent help classify SEO keywords:

Informational Intent

When people search just for information, without any transactional goal, their intent is informational. For example, someone searching “iPhone” simply wishes to learn about it.

On the other hand, a person searching “iPhone 12 pro price” has some other intention. The chances are high that this person is looking to make a purchase. This brings us to internet users’ transactional intent. 

Transactional Intent

Users with transactional intent are looking to make a transaction, be it purchasing a product or service or even subscribing to a newsletter. So the user in the example above, searching “iPhone 12 Pro price,” has the transactional intent to make a purchase.

Navigational Intent

This is the intent of internet users performing a navigational search. They already know the brand; they just wish to find the right website or webpage offering the brand’s products.

Note: Search intent is the basis for classifying keywords, which in turn differentiates the types of SEO keywords.

What Are Different Types of SEO Keywords? 

Searcher intent defines several types of SEO keywords, but it is not the only classification factor; keyword length is the other. We will discuss both, starting with intent.

Informational Intent Keyword Types:

Niche Defining Keywords

These are keywords relating to a specific market or industry. People use them to search for industry-specific information. These keywords may be specifically or broadly related to a niche. 

Examples: Running Shoes, Plumbing Issues, Restaurant Ideas

Customer Persona Defining Keywords

These search phrases help you target specific sets of audiences. When using these keywords, the users define themselves. You can therefore align such search queries with your products or services to rank higher in search results.

Examples:  Gym Tips For Women, Shoes For Men 

Transactional / Buyer Intent Keywords Types

Product Defining Keywords:

As the name indicates, these search terms are linked with the details of a specific product. Here, the user’s intention is to complete a transaction, such as purchasing the product, not just to gather information.

Examples: Nike Air Max 270 Price, Google Newsletter, Spotify Subscription 

Location Specific Keywords:

As the name indicates, people use these keywords to find something online in a particular area or location. It may be their current location or a city they are planning to visit. Using these keywords serves the best for local SEO. That’s why they are also known as local SEO keywords.

Examples: Auto Repair Near Me, Wedding Planner in LA

Navigational Intent Keywords Type

Brand Specific Keywords:

Keywords that include the name of a brand or other terms specific to a brand come under this category. Generally, these search phrases include the brand name and a product type or other brand descriptive terms. 

Examples: Reebok Running Shoes, Puma Gym Outfits

Product Specific Keywords:

These search queries contain the exact product name with some other terms to specify criteria for search results. You will better understand these types of keywords with their examples.

Example: iPhone 12, Air max under $50

Types of Keywords in SEO by Length

Short Tail SEO Keywords:

These are also known as generic, broad, or seed keywords. Search terms of one or two words with very high search volume belong to this category. They are directly related to the product or service you offer, but since countless other sites offer the same thing, competition for these keywords is very high.

Example: Men’s Shirts

Mid-Tail SEO Keywords:

These search phrases typically contain two or three words and are more specific than generic keywords, which means they face less competition than short-tail terms.

Example: Plumbing Repair Company

Long-Tail Keywords

Long-tail keywords in SEO are search phrases people use to look something up in a very precise way. These phrases are longer, usually three to five words or more, and describe exactly what the searcher wants.

Long-tail keywords examples: “cheap men’s brown shoes” “Navy blue shirts for men” 

Why Do Experts Prefer Long-Tail SEO Keywords?

There are two reasons behind it:

1. Less Competition:
Consider the following search results:

  • “men’s shoes” – About 10,130,000,000 results on Google
  • “designer men’s shoes”- About 385,000,000 results on Google
  • “branded men’s shoes” – About 74,300,000 results on Google
  • “cheap branded men’s shoes” – About 25,200,000 results on Google

As you can see, the longer the expression, the fewer matching sites there are. Therefore, long-tail keywords are easier to rank for. So instead of working hard to try and rank yourself on “men’s shoes”, why not optimize a page for “cheap branded men’s shoes”?

2.  Higher Conversion Rate
When the police ask someone to describe a thief, they hope for as many details as possible, because details help them find the right person. The same concept applies to long-tail keywords. People using long-tail keywords are usually further along in the purchasing cycle, so long-tail phrases tend to bring in visitors who convert better. Investing in long keywords is a winning “quality over quantity” strategy.

What are good SEO keywords?

Good SEO keywords for a fruitful keyword strategy depend on certain factors:

1. Monthly Searches: the average number of times the keyword is typed into a search engine each month. This gives you an approximate idea of the traffic the keyword could generate for your site.

2. Competition: how many other websites also want to rank for this keyword. The more competition there is, the less chance you have of appearing on the first page of Google. A keyword analysis tool can help you here.

3. Relevance: the chosen keywords must align with your business niche and your site’s content for better SEO keyword optimization.
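
The three factors can be folded into a single number for rough comparison. The sketch below is illustrative only: the scoring formula, the weights, and the sample metrics are assumptions, not output from any real keyword tool.

```python
import math

def keyword_score(monthly_searches, competition, relevance):
    """Combine volume, competition, and relevance into one comparable score.

    competition and relevance are assumed normalized to the 0..1 range.
    """
    # Log-scale volume so one huge head term doesn't dominate everything.
    traffic = math.log10(monthly_searches + 1)
    # Reward relevance, penalize crowded keywords.
    return traffic * relevance * (1 - competition)

# Illustrative numbers only -- real values come from a keyword research tool.
candidates = {
    "men's shoes":               (1_000_000, 0.95, 0.6),
    "cheap branded men's shoes": (5_000,     0.40, 0.9),
}
ranked = sorted(candidates, key=lambda k: keyword_score(*candidates[k]),
                reverse=True)
print(ranked[0])  # the long-tail keyword wins despite far lower volume
```

Note how the high-competition head term loses to the long-tail phrase even though it has 200x the search volume, which is exactly the “quality over quantity” logic discussed above.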

How To Choose The Right Keywords For SEO?

To choose the right SEO keywords for your site to rank well, there are a few points to consider:

  • Establish a list of priority keywords you want to rank for.
  • Perform a competitor keyword analysis.
  • Make sure the chosen keywords are relevant.
  • If you offer several products from different categories or various services, create a separate page for each main keyword.
  • Make a list of secondary keywords for each page. A free keyword tool like Google Keyword Planner can help with this task.
  • Evaluate the search volume for each chosen keyword using an SEO keyword research tool.

What is an SEO keyword tool?

SEO keyword tools help you find the right keywords for your website. They give you data and metrics about keyword analysis such as the number of monthly searches, country-based searches, keyword competition, relative search terms, and other things. Google ads keyword planner and SEMrush are two leading keyword planner tools that you can use.  

How Are SEO Keywords Different From SEM Keywords?

In SEM, you target users through paid advertising, so your keywords are mostly transactional. In SEO keyword analysis, by contrast, you consider all the keyword types mentioned in this blog.

How do I work on keywords on my website?

The most important rule is: one main keyword = one page. Ideally, each page’s content is written around one defined keyword. This is where web content writing comes in, which also means respecting certain criteria:

  • Include your keyword regularly in the content without overdoing it.
  • Keep keyword density at or below 3%.
  • If possible, insert your keyword in the first paragraph of the text.
  • Insert your keyword in the page title, the meta description, and the subheadings.
  • Each keyword must match a specific URL.
  • Use keyword variations in your content.

Google’s algorithms will look at the richness of the content of your page, its URL, and the relevant inbound links to judge your authority on the main keyword and position you higher or lower in the SERPs.
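
To see the 3% density guideline in action, here is a small Python sketch. The sample copy is made up for illustration; a real check would run over your full page text.

```python
import re

def keyword_density(text, keyword):
    """Share of the page's words taken up by the keyword phrase, in percent."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw = keyword.lower().split()
    if not words or not kw:
        return 0.0
    # Count exact occurrences of the keyword phrase in the word stream.
    hits = sum(1 for i in range(len(words) - len(kw) + 1)
               if words[i:i + len(kw)] == kw)
    # Density = words belonging to the keyword / total words, as a percentage.
    return 100.0 * hits * len(kw) / len(words)

text = ("Cheap running shoes should feel light. Our cheap running shoes are "
        "tested on real roads, so every pair of cheap running shoes lasts.")
print(round(keyword_density(text, "cheap running shoes"), 1))  # 39.1 -- stuffed
```

A result like 39.1% is classic keyword stuffing; rewriting the copy until the figure sits at or below 3% keeps it readable for users and safe for crawlers.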

One bonus tip: create keyword clusters.

How to create Keyword clusters?

Creating keyword clusters means grouping keywords that are worded differently but share the same search intent. This lets you target all the searchers who share that intent.

For example, “shorts women”, “cotton shorts for women”, and “women’s cargo shorts” are differently worded keywords, but they all show that the searcher wants to buy women’s shorts.


Monthly search volumes:

  • shorts women – 74,000
  • cotton shorts for women – 12,100
  • women’s cargo shorts – 18,100

Now let’s say you sell women’s shorts online. If you only target the first keyword, you will leave out the traffic that other keywords can bring. 

Instead, if you target the primary keyword along with its LSI keywords, long-tail variations, semantic keywords, and other related terms, chances are you will get far more traffic because your webpage will rank for multiple keywords. To understand this concept in detail, read Keyword Clusters by SEJ.
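
A crude way to build such clusters programmatically is to fold each keyword down to its core tokens and group by a shared head term. This is a minimal sketch: the stopword list and the head terms are hand-picked assumptions, and real clustering tools use much richer signals such as shared SERP results.

```python
from collections import defaultdict

# Hypothetical seed list; in practice this comes from a keyword research tool.
keywords = [
    "shorts women",
    "cotton shorts for women",
    "women's cargo shorts",
    "men's shirts",
    "linen shirts for men",
]

STOPWORDS = {"for", "the", "a", "in", "under"}
HEADS = {"short", "shirt"}  # hypothetical product head terms for this niche

def core_tokens(keyword):
    """Lowercase, drop stopwords, and crudely fold plurals ('shorts' -> 'short')."""
    out = set()
    for w in keyword.lower().replace("'s", " ").split():
        if w in STOPWORDS:
            continue
        out.add(w[:-1] if w.endswith("s") and len(w) > 3 else w)
    return out

# Group each keyword under whichever head term its core tokens contain.
clusters = defaultdict(list)
for kw in keywords:
    head = sorted(core_tokens(kw) & HEADS)
    clusters[head[0] if head else "other"].append(kw)

for head, group in clusters.items():
    print(head, group)
```

Run on the sample list, the three shorts variations land in one cluster and the two shirts variations in another, so a single “women’s shorts” page could be optimized for the whole first group.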

Final Words

There are many types of keywords in SEO. Finding the ones that can profit your business requires understanding searcher intent. Once you understand that, put yourself in your prospect’s shoes. Ask yourself what they would type into a search engine that could lead them to your site or blog.

Finally, be realistic. If your site is fresh, don’t aim too high right away. You will be competing with sites that have been around for years and have many more pages than you. Take the time to work on your strategy. Start with the most accessible keywords. Once you position yourself better, then you can consider more competitive keywords.

Have A Project? We would love to discuss it with you! 

Contact Us