What is Technical SEO? Your Practical Guide For Best Practices!

We know what you are thinking – why bother with technical SEO when you have already implemented SEO overall?

To get to the answer, first you should understand:

What is Technical SEO?

Technical SEO is a process to ensure that your website matches the modern technical standards of search engines.

Search engines are continuously updating themselves. Google, for example, rolls out at least three major algorithm updates every year, along with more than a hundred smaller ones.

With each major algorithm update comes a need to update your SEO. That’s where technical SEO comes into the picture.

It is a process of website and server optimizations that improves organic rankings in line with the latest search engine updates.

It helps search engine crawlers crawl and index your web pages quickly and efficiently.

Note: Technical SEO is a part of overall SEO.

Importance of Technical SEO

Like other SEO strategies, technical SEO is crucial to building a strong overall SEO foundation.

The core aspects of technical SEO – a sound website structure and proper crawling of your website – are major factors in search engine rankings.

These aspects make search engines aware that your website is of high value. As a result, you rank higher in the SERPs (Search Engine Results Pages).

Google only wishes to share the best possible web pages in results for the users’ queries.

Implementing technical SEO ensures that your website meets these criteria.

A strong technical SEO foundation goes a long way in delighting internet users. In return, crawlers/bots rank your website over others that offer inferior experiences.

Note: No matter how good your content and design are, without technical SEO your website will not rank high on search engines.

How Can You Improve Your Technical SEO?

There are many points to focus on in this regard:

Keep Your Website Structure Organized

Website structure is the way your pages are organized and linked together.

It is like a map of your website’s pages, with lines representing the links between them. The shape of that map matters for the relevance of the site, so it should be clear and logical.

Such a site structure makes it easier for your visitors to find what they are looking for. It also makes the site easier to navigate.

As good structure equals good User Experience (UX), the structure of your website has an influence on your ranking in search engines.

Search engine robots find it easy to navigate and crawl such websites and thus give them preference in organic listings.

Thus, it increases conversions and Click Through Rate (CTR).

The optimal structure of a website should:

  • Have a clear sense of hierarchy.
  • Follow a tree structure of categories and sub-categories.
  • Have logical internal linking.

Keep A Simple Navigation Menu

When you visit a place for the first time, don’t you think it is better to have a good map to get around without getting lost? The same is the purpose of the navigation menu in a website.

It is the guide for your visitors once they are on your site. Like a map, it should be clear, precise and – if possible – visually appealing too.

The content, format and design of this section should help visitors to find their way around quickly.

But what does an ideal menu look like? Here’s the answer:

Easy To Use:

The menu is the centerpiece of your site navigation, and visitors should be able to find what they’re looking for easily and quickly.

A visitor should not have to click more than 3 times to arrive on a specific page regardless of their starting point on your site.

Don’t Have Too Many Tabs in Main Menu:

The ideal number of links in the navigation menu is seven. This is because according to research, our memory can only focus on seven items at a time.

So, if you offer more than that, visitors will miss main tabs.

Avoid Overly Long Drop-down Menus:

An overly complex drop-down menu risks upsetting the visitor. Faced with an extended list, they may not know where to go and therefore leave your site soon.

Help Users Know Where They Are:

Your menu should tell visitors where they are at all times. This is very important when a visitor takes their first steps on your site. This offers a lot in terms of navigation comfort.

Have A Search Bar:

30% of internet users use the search bar on a website. These users convert 5 to 6 times more than the others.

It is therefore essential to highlight a search bar on your navigation menu. Ensure that visitors can use it at any time even when scrolling down a page.

Maintain A Consistency in Navigation Web Design

Let’s talk about navigation first.

The reason why a website should be consistent in navigation is psychological. Humans are comfortable in places that are familiar and predictable.

If the navigation works the same throughout the website, the user will feel more relaxed, more likely to stay on the website and therefore more likely to convert.

So, consistency in navigation is a crucial element for functionality and good user experience. Now, we will discuss consistency in website design.

Consistency in web design is maintaining the same style in all of your web pages. By “same style” we mean to be consistent in the colors, fonts, style of icons, style of images that you use throughout your website.

Imagine for a moment that on your homepage you use some fonts and in the rest of your pages you use completely different ones. Or that you use different colors of text on each page.

Of course, it won’t look nice.

Moreover, without any consistency in design, it is difficult for your visitors to remember who you are.

This is because the logo, colors, fonts, and design of a brand make the audience remember its services.

To create consistency on your website, you first have to know the visual elements that make up a brand:

  1. Logo
  2. Colors
  3. Page Layout
  4. Fonts
  5. Language
  6. Images
  7. Icons
  8. Interactions

For Logo:

Make it more striking than the rest of your fonts, but without going overboard. Remember that it has to be easy to identify and remember. Your logo should appear on every page.

For Colors:

For the colors of your website, try to use a maximum of two. Choose a main color and another to complement or contrast it.

Create a harmony in using different tones of colors for the different sections of the site. If you have an ecommerce website, you can associate colors with products.

Page Layouts:

Always try to have consistent elements on each page:

  • Same position of navigation menu on each page.
  • Similar fonts and colours throughout the webpages.
  • Search box should be in the same spot on each page.
  • Clear and identical hierarchy of the page elements.

For The Font Style on Your Website:

Use just three.

  • One for your logo that is usually different from the other fonts on your website.
  • Another for the headlines.
  • Last one for the paragraphs.

Remember headlines should vary in thickness or shape and paragraphs must give priority to readability. The font style for each element must be the same on every page.

Language:

On a website, we cannot write one thing and contradict it elsewhere. That is to say, the tone should not be diametrically opposite from one page or section to the next.

Tip: Use the AIDA (Attention, Interest, Desire, Action) method as a writing pattern. In addition to being effective and facilitating content writing, it helps the user a lot.

For Images:

Use the same size of images on a web page – or across all web pages, as you prefer. Also, use the same color for the backgrounds.

Let’s say, you have an online store and you use white background for the product images. In that case, ensure that all of them are white, not some white, others green, or blue.

Also there should be a consistent image placement structure. If there is one image on the right side of the banner, it should be on the right side on all other pages.

Additional Tip: Define a style first by asking yourself what kind of images your client would like to see. You can add a few funny images to grab attention.

For the Icons:

These are generally used on the home page where you explain the benefits, the services you offer, etc. There are several formats for icons: flat icons, line icons and icons that mix graphics with fonts. Whichever format you choose, use the same style of icon for the rest of the items in the list.

Interactions:

The consistency in response to the user’s action is directly related to consistency in navigation.

For example, if the menu tabs unfold to the right on the home page, the user assumes that the rest of the pages will do the same. It is key to improving usability – a factor that search engines prioritize when ranking a website.

Enable HTTPs

Since 2014, there has been a sharp increase in HTTPS protocol sites in Google search results.

This is in response to Google’s official announcement about consideration of the secure HTTPS protocol as a ranking factor the same year.

What is the HTTPS protocol?

HTTPS is short for HyperText Transfer Protocol Secure. It is a secure protocol that has existed since 1994 to overcome the main flaw of classic HTTP: data sent over plain HTTP travels unencrypted.

This is where the S for “Secure” of HTTPS comes in. The HTTPS protocol is the combination of HTTP with the security of SSL (Secure Sockets Layer) or TLS (Transport Layer Security) encryption, which encrypts the transmitted information.

The HTTPS protocol is a guarantee against personal data piracy and therefore makes your website safer for the Internet users who visit it.

Why enable HTTPS?

Here are the main reasons:

  1. Google Chrome shows Internet users whether the page they are on is served over the secure HTTPS protocol.
  2. You reassure Internet users by securing your site, which helps strengthen the trust they place in you. Messages like “This site is not secure” are not good for your website.
  3. Your website benefits from the advantage Google grants to HTTPS: HTTPS URLs get a “bonus” in terms of positioning. Simply put, HTTPS is a ranking factor.
  4. After migrating to HTTPS, you will be able to use HTTP/2, which boosts your website’s page speed.

How To Enable HTTPS?

Be careful if you want to switch your website from an unsecured HTTP version to a secure HTTPS version. It is not a piece of cake: any mistake can hamper your SEO rankings.

For the redirection to go as smoothly as possible at the SEO level, it is wise to seek an SEO expert’s assistance.

Below is a list of the steps for setting up the HTTPS protocol:

  • Choose a quality SSL/TLS security certificate.
  • Redirect all the URLs on your site (via .htaccess files or other means – see the sketch just after this list).
  • Update all of your internal links (internal netlinking).
  • Update the URLs of your images and all other URLs.
  • Refresh your Google Search Console.
  • Test your new secure HTTPS protocol.
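
As a minimal sketch of the redirect step, an Apache .htaccess rule like the one below sends every HTTP request to its HTTPS equivalent with a single 301 redirect (adapt it to your own server; the domain is a placeholder):

    # Force HTTPS with a single 301 redirect (Apache mod_rewrite)
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]

Nginx and most hosting control panels offer an equivalent setting, so check your host’s documentation before editing server files.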

Things To Note:

  • Use 301 redirects – not 302 redirects, JavaScript redirects, etc.
  • Use a single redirect hop, not a chain of several.
  • List your social media accounts for which you will need to update your site URL to indicate the HTTPS version.
  • Do an audit of your backlinks to identify the most strategic. Then ask the people concerned if they can update their link to point to your HTTPS version. Use the 301 redirects where they don’t agree.
  • List your ad campaigns to identify where you need to update your site URL to reflect the HTTPS version.
  • Update the URLs in your email signatures.

Changes in Google Search Console for enabling HTTPS

In Google Search Console, you will need to do this for each subdomain of your site after migrating to HTTPS:

  • Add a new website property for the HTTPS version
  • Declare sitemaps with URLs in HTTPS.
  • Check that each URL in your inventory is redirected with a single 301 redirect to the correct HTTPS URL. (For this, you need a tool that tests a list of URLs and gives you all the details of redirects and HTTP codes.)

Example to check website HTTPS status: https://httpstatus.io/

Add Robots.txt

As the name suggests, this file is for the search engine robots. The crawlers/bots read this file first on your website. From this file you can authorize or prohibit search engine robots/crawlers from exploring a specific section or page of your website.

Thus, search engine robots crawl your web pages as per your instructions. Alongside robots.txt, you should also know about the meta robots tag.

This tag is taken into account by Google (and other engines). It conveys messages to the robot that crawls the page.

Here are the different values for the robots meta tag and their meanings (a markup sketch follows the list):

  • Noindex : Conveys to robots not to index the page.
  • Nofollow : Conveys to robots not to follow the links in the page. This means that Google will not crawl the pages linked from the page containing this robots meta tag.
  • Index : Tells the robot that it can index the page. Since this value is the default, there is no need to indicate it.
  • Follow : Conveys to the robot that it can follow the links in the page. Since this value is also the default, there is no need to indicate it.
  • All : Equivalent to index, follow. Since this is the default behavior, there is no need to indicate it.
  • None : Equivalent to noindex, nofollow.
  • Nosnippet : Tells the robot not to display a snippet in the results page.
  • Noarchive : Conveys to the robot not to allow access to the cached version. The “Cached” link in the results page will therefore not be displayed. This can be useful to those who switch their content from a publicly accessible version to a paid archived version.
  • Unavailable_after: [date] : Indicates to the robot that the page should not appear in the results after the indicated date.
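
As a minimal sketch, a page you want kept out of the index while still letting robots follow its links would carry a tag like this in its <head> (the values are just an example):

    <head>
      <!-- Keep this page out of the index, but let crawlers follow its links -->
      <meta name="robots" content="noindex, follow">
    </head>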

SEO Tips on robots.txt file:

  • Google typically re-fetches the robots.txt file about once every 24 hours, so any change you make to robots.txt should be picked up within roughly a day.
  • If a URL has already been indexed by Google, blocking it in robots.txt will not remove it from the index.
  • To deindex a URL, you must allow its crawl and use a robots noindex meta tag. You can also request removal of the URL in Google Search Console.
  • Do not block the crawl of URLs that are redirected, otherwise the engines will not be able to notice the redirection.
  • The maximum size of a robots.txt file is 500KB (anything beyond that will be ignored by Google).
  • Google should only obtain a 200, 403 or 404 code from your robots.txt file.
    Code 200 : The file exists and is accessible.
    Code 403 or 404 : The file is not accessible, but the HTTP code is consistent.
  • The robots.txt file may itself be indexed in Google. To deindex it, you must either use the X-Robots-Tag HTTP header or prohibit crawling of the file and then have it removed from the index in Google Search Console.
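
Putting these rules together, a minimal robots.txt sketch might look like the one below (the blocked path and sitemap URL are hypothetical placeholders):

    # Example robots.txt – placed at the root of the domain
    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    Sitemap: https://www.example.com/sitemap.xml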

SEO Friendly URL Structure

What is a URL?

URL stands for Uniform Resource Locator. This element designates the location of an Internet resource according to a precise encoding. In other words, it is the address of a part of a website.

The URL makes it possible to find that resource among the billions of existing pages on the web. This is universal data, with a standard structure that is identical throughout the world.

It consists of four parts:

  • Scheme/protocol
  • Sub domain
  • Domain name
  • Extension of domain name

  • https:// : This is the scheme/protocol, the most widely used on the web. It allows you to exchange web pages in a secure manner.
  • www : This is the sub domain.
  • The domain name : This is the middle part of the URL, often corresponding to the name of the brand.
  • .com : This is the extension of the domain name. .com is the most common. Sometimes you see “.ca” or “.us” – these represent a country (.ca for Canada and .us for the United States) – while .org is typically used by organizations and non-profits.

Further, other elements are appended as required. For a blog post filed under the blog category, for example, the path follows the domain name: https://www.webspero.com/blog/technical-seo-guide/

Classification of URL Structures

There are two types of URL structure:

Static URL:

A static URL is a URL that does not change and does not contain any kind of parameters.
Parameters are additional information at the end of the URL, placed after the ? symbol. They also help to track information about clicks on the URL.

These URLs remain the same throughout the website’s life unless the HTML itself changes. Hence, they are used for web pages that are not updated often, such as www.example.com. Updating this type of page is heavy because you have to modify the source code every time there is a change.

Example (illustrative): https://www.example.com/about-us/

Dynamic URL:

Dynamic URLs are used when the content of a page is stored in a database and is only served on request. Dynamic URLs contain parameters such as a session identifier or product identifier and include signs such as ?, = and &.

Dynamic URLs have two drawbacks:

  1. Internet users cannot understand them as easily as static URLs.
  2. Several dynamic URLs can point to the same content. This can create inconsistency which search engines don’t like.

Example (illustrative): https://www.example.com/products?category=shoes&sessionid=1234

Can Google correctly read dynamic URLs?

Google can read dynamic URLs just as well as static URLs, as long as they are not too long. A dynamic URL must not contain more than three parameters.

In the case of content managed by databases, dynamic URLs are better than static URLs. This is because Google knows how to find its path through the relevant parameters and does not take into account the irrelevant ones.

You will find more information about it on the Google Webmaster-Blog.

How to Optimize Your URLs for SEO?

Here are pro tips to optimize URL for SEO:

Present the theme of your page

URLs are a key element for the search engine and the Internet user. They are analyzed by the search engine as well as by readers.

It is therefore important to personalize them. They must indicate the theme of each page.

This contributes to the understanding of your content by both – search engines as well as the Internet users. In other words, your URL must make sense.

It should also make readers want to click on the link to read your content.

For example:

The right URL for this blog is:

https://www.webspero.com/blog/technical-seo-guide/

And not:

https://www.webspero.com/blog/page/2/?et_blog

Use the keywords wisely in particular URLs

Logically, it is wise to include the keywords targeted by your content in the URL of your pages.

This can have a positive impact on SEO but not every time. Here’s how you should plan it:

For service /product page URL:

Yes, definitely go for it. With this approach, search engines and users will have better visibility into what you offer on each of your pages. Moreover, keyword-rich URLs are easier to follow on the product and service pages of your website.

However, before using the main keyword in the URL, check that it is present in your Title tag, your H1, Sub Headings and your content.

For blog post URLs:

As a general rule, we recommend that you include in URL the main keywords present in your blog title.

For example, if your blog post is called “Google’s Algorithm Updates Of The Last 10 Years”, the appropriate url would be https://www.webspero.com/blog/google-algorithm-updates/

Note: As with content, do not overuse keywords in your URL. In this case, repetition becomes counterproductive.

Two major benefits of using keywords in URLs:

The keywords present in your URL will be beneficial to your communication strategy. For example, if you choose to distribute some of your URLs on social media or in your emails, it is better if they are clear and relevant if you want to get as many clicks as possible.

Keywords also play a role in your backlinking strategy. If you are getting backlinks from other sites and the link anchor is your URL, it will contain your keyword, which sends a good signal to search engine crawlers.

Take care of Url Length

Long URLs are much harder to communicate and use. They are also not practical to embed in a forum, in a blog or on social media.

So we strongly advise you to keep URLs as short as possible. Typically, a full-size URL contains 50 to 60 characters. Beyond 100, you should shorten it.

So how do you go about shortening URLs?

Start by getting rid of all the words that add no value, also known as “stop words”. In most cases, these are coordinating conjunctions or little words like “the” or “of”. You can easily remove them without making the URL hard to understand.

For example, rather than writing “SEO-in-Los-Angeles”, prefer “Los-Angeles-SEO”. This is much more practical and aesthetic, and doesn’t compromise the understanding of the subject.

It will greatly improve the user experience and the convenience of your URLs. It’s much easier to share short URLs. Also, they are easier to memorize

Signs & Special Characters in URL

  • The comma (,) and the semicolon (;) are not allowed, as they can cause confusion.
  • The slash (/), on the other hand, does not pose a real problem. If you choose to use it, still pay attention to the directory structure. Think about it especially in your HTML coding, particularly for files (scripts or images);
  • The vertical bar, or “pipe” (|), is allowed but should be avoided. It is not easy to type on the keyboard, nor is it easy to explain orally, so it is likely to cause typing errors when someone enters your URL;
  • Percent (%), asterisk (*) and at sign (@) can be used, but they are unhelpful;
  • The underscore (_) is not treated as a word separator by Google, so avoid it and use hyphens instead;
  • The equal sign (=) and the ampersand (&) are reserved for dynamic URLs;
  • Finally, when the # sign appears in a URL, it usually corresponds to an anchor. It is therefore a link to only part of the page. For search engines, anything after the # sign is ignored.

Avoid Parameters/dynamic URLs

Many people think dynamic websites are the best option available today. However, search engine crawlers can struggle to read information from overly complex dynamic URLs.

When using dynamic URLs, the creation of duplicate content is very common. This is because the same content can be accessed through different URLs.

On the other hand, dynamic URLs tend to have a lower Click-Through Rate (CTR).

It is also important to analyze the semantic aspects of the URL since it does not usually work like static URLs.

When creating a dynamic URL, ensure that it is “friendly” in the eyes of search engines. The structure should be simple and clear, and there should be no unnecessary parameters.

Menus, navigation or footer should be written with static URLs so that search engines can interpret information and links correctly. In this way, there will be no loss of link juice.

Example of dynamic url:

https://www.example.com/dp/B08SW6MQMD/ref=redir_mobile_desktop?_encoding=UTF8&aaxitk=0a758f783baae7ec96b880612c0efbd3&hsa_cr_id=4890370850002&pd_rd_plhdr=t&pd_rd_r=4cbc644f-8b66-47d9-864c-1d07a4c76f75&pd_rd_w=P4Zcz&pd_rd_wg=eftP9&ref_=sbx_be_s_sparkle_mcd_asin_0_img

Improve Breadcrumbs Navigation

A breadcrumb trail, as its name suggests, is a common thread that shows users where the current page sits and how far they have come from the page they landed on. It is possible to return to previous pages by clicking on the different pages present in the breadcrumb trail.

Displayed near the top of every web page, the breadcrumb trail offers navigation support to the user. Users often click on the logo to return to the home page; the breadcrumb trail offers the same shortcut via its “Home” link.

Why Use Breadcrumbs Navigation?

  • 90% Internet users do not revisit a website that has issues in navigation or use.
  • The reason behind the failure of three out of four online companies is bad user experience.
  • Still, only half of the total websites on the internet take user experience testing seriously.

What are the Types of Breadcrumbs Navigation?

There are three types of breadcrumbs navigation:

  1. History: It shows you all the pages that you visited to reach the current page. You can go back to any of the previous pages by just a click.
  2. Hierarchy Level: In this you get the option to visit the parent pages of the page you are currently on. Let’s say, you are on a product page – “Blender Bottle” on an ecommerce site. You can visit the “Personal Care section”, “Sports & Outdoor section” , “Water Bottles” and other similar parent pages from the breadcrumb navigation.
  3. Attributes: Mostly product websites use this level. This level displays attributes of the products on the navigation menu. For example, imagine you are on an automotive website looking at a particular car model page. Attribute breadcrumb navigation can take you to its technical specification page, its interior 360 images page and other pages devoted to its different attributes.

How To Setup A Breadcrumb Navigation?

You can implement breadcrumb navigation with a CMS in just a few clicks. If your CMS does not offer this function out of the box, there are probably plugins available for it. It is also possible to set up a breadcrumb trail on dynamic websites using PHP or JavaScript.
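
As a minimal sketch, the markup for a breadcrumb trail can be as simple as an ordered list of links (the page names and URLs here are hypothetical):

    <!-- A simple breadcrumb trail: Home > Water Bottles > Blender Bottle -->
    <nav aria-label="Breadcrumb">
      <ol>
        <li><a href="/">Home</a> &gt;</li>
        <li><a href="/water-bottles/">Water Bottles</a> &gt;</li>
        <li>Blender Bottle</li>
      </ol>
    </nav>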

When setting up the breadcrumb trail, it is better to respect these few things:

  • Place the breadcrumb trail at the top of the page, preferably on the left above the current page. This way it does not interfere with the content and remains easily accessible.
  • Use the “>” symbol between each item. This sign has become standardized. Choosing another sign will require additional efforts of understanding from the Internet user.
  • The breadcrumb trail must remain readable without interfering with the content of the page.
  • Indicate the title of the current page in the breadcrumb trail.
  • For the user to have a good benchmark, the page title must be present both in the breadcrumbs and on the page. This redundancy allows the Internet user to find their way around.
  • The breadcrumbs must be integrated, visually and in the source code, in the same place on all the pages of the site, except on the home page, where their presence is not compulsory.
  • Use the same names in the breadcrumb trail as in the main menu.
  • A breadcrumb trail should be logical and should not confuse the user.

Sites that offer many browsing options run the risk of confusing Internet users, especially when there are duplicates. Breadcrumbs are low-profile and fit well into most designs. But if they offer nothing more than the classic navigation bar, then no one benefits.

In a good implementation, breadcrumbs are helpful, subtle and easy to find at the same time: the navigation is logical, clear and well integrated into the design of the page.

Major Advantages Of Integrating Breadcrumbs On Your Website:

  • Improve the understanding of the structure of the site for Internet users. This navigation element is a must for SXO – Search eXperience Optimization.
  • Help search engines understand how each page connects with the others.
  • Increase the duration of sessions and optimize the conversion rate: visitors stay on your site longer, which promotes conversion.
  • Strengthen the internal linking by creating links with optimized anchors on each landing page.
  • Improve the click-through rate from the SERP thanks to several links appearing in the same result (see the sketch after this list).
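
That last benefit – breadcrumb links showing up in the search result itself – relies on breadcrumb structured data. A minimal, hypothetical sketch using schema.org’s BreadcrumbList vocabulary looks like this:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "BreadcrumbList",
      "itemListElement": [
        { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.example.com/" },
        { "@type": "ListItem", "position": 2, "name": "Water Bottles", "item": "https://www.example.com/water-bottles/" },
        { "@type": "ListItem", "position": 3, "name": "Blender Bottle" }
      ]
    }
    </script>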

Keep an Eye on Crawling, Indexing, and Rendering

Since nothing is perfect, sometimes there are errors in indexing and the most common are:

Slow page loading:

Page speed is a key factor in rankings and usability, and it can also affect indexing. Google does not like its robots to waste time: if they arrive at your page and have to sit and wait for it to load, they may simply skip it and not crawl it. As a consequence, that page will not be indexed and therefore will not show up in the search results.

Do Not Repeat Content

We already know that Google likes originality. If it detects pages with duplicate content, it may decide not to play spot-the-difference with your pages and simply not index any of them.

4XX errors

If you are going to delete a URL, do not forget to communicate it to Google through a redirect; otherwise you will accumulate 4XX errors that will harm your rankings.

To detect these and other errors, go to the Search Console panel > Coverage. Once there, resolve all the crawling and indexing issues you find.

Note: Apart from Google Search Console, we also use the Screaming Frog and Ahrefs tools.

How to Know Which Crawl Errors Your Website Has

The first thing we need to know is what type of crawl errors our website currently has in the eyes of Google (and how many). To do this, access your Search Console account and open the ‘Crawl’ > ‘Crawl Errors’ section of the side menu (the Coverage report in the newer Search Console).

Once here, Google will indicate any of these failures:

Site errors

These are crawl errors that affect the entire website rather than specific addresses. If you have not had any problems in the last 90 days, you will see a green ‘check’ next to the corresponding section.

If instead you had an error, it could be any of these:

DNS errors

This problem means that Google is unable to access your website because of a DNS-related connection error, either a timeout or a misconfiguration.

How to solve it?
Contact the company that manages your DNS. It is possible that the IP configured for your website is incorrect or that there is a technical problem that needs to be solved.

Server connectivity errors

They differ from DNS errors in that, in this case, Google CAN reach the URL of your web page, but the page returns a loading error or takes too long to display its content correctly. This is common on pages with excess user traffic.

How to solve it?
If you have load problems due to continuous user traffic, contact your hosting provider and upgrade to a new server with greater capacity that can withstand a greater load. Thus, it will not crash and return this type of error.

Information errors in robots.txt

With this error, Google tells us that it has not been able to access our robots.txt file. This document is important since in it we establish which pages the robots of the different search engines can or cannot crawl.

How to solve it?
First, we have to make sure that we really have a robots.txt file created and it is located in the correct url (http://yourdomain.com/robots.txt). If so, we will have to check that it is correctly configured and we are not blocking Google robots on necessary pages or even the entire domain.

URL errors

These are crawl errors that do not affect the website as a whole, but specific URLs. There are several types:

Soft 404 errors

These occur when a URL no longer exists or cannot be found by robots, while at the same time the HTTP response for that address indicates that the page is displaying correctly.

How do I solve it?
The best solution is to create a permanent 301 redirect from each erroring URL to a similar page. This way the user will not see anything strange, Google will consider the error solved, and the new URL inherits whatever authority the old address had.
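
As a minimal sketch, a single-page 301 redirect in an Apache .htaccess file can look like this (the paths are hypothetical placeholders):

    # Permanently redirect a removed page to its closest equivalent
    Redirect 301 /old-page/ https://www.example.com/new-page/

WordPress users can achieve the same thing with a redirection plugin instead of editing server files.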

404 errors

This happens when a URL does not exist and the server returns a 404 (file not found) error. It is similar to the previous point, but in this case there is no doubt in the eyes of Google that the URL no longer exists: it does not load, it returns an error, and not even the HTTP headers indicate otherwise.

How do I solve it?
In the same way you would solve a soft 404 error: by redirecting to related pages through permanent 301s.

Access denied

Unlike the previous errors, the robots could find these addresses, but they cannot access them.

How do I solve it?
These errors can be due to three factors:

  • You are blocking certain URLs through robots.txt.
  • Your hosting provider is preventing Google from accessing certain sections of your page.
  • Part of the site sits behind a username and password, which prevents robots from accessing the content.

Review these different points, find which of them is generating the errors, and then resolve it.

Not Followed

Do not confuse this with the ‘nofollow’ attribute that can be placed on a link. These errors indicate that Google was unable to crawl a specific URL, usually because of Flash, JavaScript or a similar technology that makes it difficult for robots to do their work.

How do I solve it?
Avoid placing important links inside this type of code, so that crawlers can follow them easily.

Server and / or DNS errors

They are errors when accessing specific addresses on the page due to the same problems that we explained in the server errors sections – bad DNS configuration, technical problems, pages collapsed due to web traffic.

How do I solve it?
As said before, the solution will generally consist of contacting your hosting provider and exposing the problem to them. It is very likely that the solution depends on them.

How to Fix Coverage Errors in Google Search Console?

Crawl errors help you understand the problems Google’s spiders encounter when crawling your website. Google Search Console informs you of the errors on your website and the specific URLs involved, giving you the chance to solve them.

Here’s what you can do with it:

  • Check that your hosting server is working well and has no crashes (500 errors).
  • In the admin panel you will find activity statistics, where you can see if there have been drops in activity.
  • Check for 404 errors and find out how to fix them. You can use Redirect 301 / RedirectMatch 301 redirects, and you can also remove URLs with 404 errors from your website’s sitemaps. In this Google guide you can see how to fix 404 errors.
  • Check whether you have a badly configured robots.txt or whether there is a conflict between what the robots.txt says and your sitemap.
  • Check that the errors have been fixed.
  • The last step is to validate the errors with Google Search Console. This Google webmaster tool only shows up to 1,000 errors that it considers the most important.
  • Keeping the tool “clean” (free of errors) ensures you can monitor new errors as they arrive and be 100% sure that the site is crawlable and does not present serious indexing errors.

Focus on Internal Linking To Avoid Indexing And Crawling Errors

  • Ensure that the link opens in a new tab.
  • There should be no broken links on your website.
  • Check and make sure there are no orphan pages on the website.
  • Perform internal linking as per the content length. Too many internal links in a short length of content is not a good approach.

Use an XML or Html Sitemap

What is a Sitemap?

An XML sitemap, or site map, is a file that lists the URLs of a website so search engines can find and crawl them. In this way, Google better understands your website.

How does Xml Sitemap work?

The operation is simple. Google crawls your website, checks what content it has and indexes it. With every new change you make to your website, the XML sitemap plugin generates a new report for search engines, which speeds things up.
Indexing is basically the process by which Google crawls the content of your website and stores it in its database so that it can later appear in the search results. In addition to containing the structure of the site, the XML sitemap provides other data, such as how often the site is updated or the duration of any videos.
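
As a minimal sketch, an XML sitemap is just a list of <url> entries with optional metadata (the URLs and dates below are hypothetical):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2021-06-01</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>https://www.example.com/blog/technical-seo-guide/</loc>
        <lastmod>2021-06-15</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>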

Plugins to use in your WordPress

If you want the Google robot (Googlebot) to come to your website every day to review it – because you add new product pages daily, publish new articles on your WordPress blog or add other new content regularly – you must use a WordPress plugin that keeps the XML sitemap (website map) up to date.

Using a Google XML Sitemap Saves Time

When you update something on a website, it is essential that you communicate it to Google. The moment you do it, your robot will start to evaluate the changes, having a better chance of better positioning.

In addition, by informing Google of new updates, you save it work, since it will not waste time analyzing your URLs one by one to figure out which ones have been updated or are important.

The Google XML Sitemaps plugin makes this whole process automatic: it notifies Google of any updates on your website.

How Can I Create an XML sitemap?

There are two available options:

  • Using Yoast SEO plugin
  • Google XML sitemap plugin

How to configure the Google XML sitemap plugin?

Google xml sitemap is a very necessary plugin, so you should know what each option is for.

#Step 1: Basic configuration for Google XML sitemaps

Notify Google about changes to your blog:

If you activate this option, the XML sitemap file will be linked to Google’s tools: once you add your website as a property, a ping is sent to notify Google every time your XML sitemap changes. Leave this option checked.

Add the sitemap URL to the virtual robots.txt file:

This option inserts the URL of our sitemap in the Robot.txt file so that search engines that do not allow ping or notifications can find your sitemap more easily.

The first thing the Google robot does is read your robots.txt, and from there it follows its crawl pattern. If you cannot find the file in your WordPress installation, it most likely does not exist and you will have to create it. You can create a .txt file with the instructions and upload it to the root of your hosting, or use the option that the Yoast SEO plugin provides for this.

Increase the memory limit:

This option allows us to increase the memory of our xml sitemap in case there are loading problems.

Compress the sitemap if the requesting client supports it:

This option compresses the sitemap, which makes it easier for Google’s robots to fetch. You can disable it if you see encoding error warnings or unreadable content in the sitemap.

Include the Sitemap in HTML format:

Generate the XML sitemap also in the HTML format to have more compatibility with some robots. You must leave it marked.

Googlebot actually reads your HTML code. Being a machine, it sees everything in the markup – the structure, colors, images, text and links – but it does not see the photos or colors visually the way a person viewing your WordPress site does.

Allow anonymous statistics:

If we activate this option, anonymous statistics are sent to the creator of the Google Xml sitemaps generator plugin to improve the product.

#Step 2: Additional pages

In this step you can specify files or URLs that you wish to include in the XML sitemap even though they are not generated by your WordPress installation. It also gives you the option to add pages from other WordPress installations on the same hosting.

#Step 3: Blogs Priority and Comments Settings

In this step, you specify which posts on your site should take priority over others; the crawlers will pay primary attention to them. Do not use the automatic priority option. Initially, it is best to give everything equal priority, then adjust it later according to which pages are performing well and which are not.

Also, here you will have a comment counter option. You can enable or disable comments on your blogs from here.

#Step 4: Sitemap Content

In this section, you will mark the elements that you want search engines to index. You have to check the checkbox button for various options. These options are:

Homepage:
It is the most important page on your website. It must be checked as we want Google to see it.

Articles:
They are the posts or publications of your site. For the same reason it must be checked.

Static pages:
They are the pages you have created: About Me, My Services, Contact Us. This option must be checked.

Categories:
With the categories you have to be careful. Only use it in case you already have enough articles within each category. Leave it unchecked if you just started.

Archives :
As a general rule, archives do not have to be indexed.

Author Pages:
Do not index them if you are the only author on your blog, as they would be treated as duplicate content – the author page would simply repeat what is already on the blog.

Only check this option if there are multiple writers creating articles for the same blog and you want them to be distinguished from each other.

Tag Pages:
As a general rule, like the categories, do not include them in the sitemap.xml.

Include the date of the last modification:
Check it as it will help search engines to know if the content is fresh.

#Step 5: Exclude Items

Excluded categories:

Here we will mark what we do not want search engines to index. They will appear blank if you have not created categories yet.

Items excluded:
In this section, we can insert the IDs of the pages that we do not want Google to index, separated by commas.

#Step 6: Frequency of Sitemap Changes

It shows the periods of time in which we want search engines to visit us. Depending on the kind of website you have, you will have to configure the frequency in one way or another.

If you have a static website with hardly any movements, it would be advisable to notify Google to visit us every week or month.

If you have a blog with daily updated content, you should assign the frequency in days or even hours.

For magazines or newspapers with content updated almost to the minute, it would be advisable to assign the frequencies to “always”.

#Step 7: Priorities of Elements in the Sitemap

Assign the maximum value to the landing pages, since they are the most important pages on the website. Assign the next highest values to articles and static pages. The rest is up to you.

#Steps 8 and 9: Validate Your XML sitemap:
You need to notify search engines about your sitemap.

  • Upload your sitemap to Google.
  • Then, login to Google Search Console and go to your website.
  • Then go to Index > Sitemaps.
  • Type in your sitemap URL.
  • Optionally, allow the plugin to notify Bing of any updates as well.
  • Save changes and you’re done.

What is an HTML sitemap?

Unlike an XML sitemap, an HTML sitemap is an actual webpage that assists visitors with navigation. It contains links to all of your important web pages. The key difference is that HTML sitemaps are for visitors while XML sitemaps are for search engines. As per Google, having both sitemaps – XML and HTML – is advantageous for websites.

How To Create an HTML sitemap?

There are many plugins available for different CMSs that can create an HTML sitemap. You can also create one manually in WordPress, which is a fine option for small websites: just create a normal web page containing links to all of your important pages.

However, if you have a website that requires frequent content updates – for example a news website – it will be impossible to update your HTML sitemap manually for every change. For such websites, using a plugin is the right option.
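
For a small site, a hand-made HTML sitemap can be as simple as a page with a categorized list of links – a hypothetical sketch:

    <h1>Sitemap</h1>
    <ul>
      <li><a href="/">Home</a></li>
      <li><a href="/services/">Services</a></li>
      <li><a href="/blog/">Blog</a>
        <ul>
          <li><a href="/blog/technical-seo-guide/">Technical SEO Guide</a></li>
        </ul>
      </li>
      <li><a href="/contact-us/">Contact Us</a></li>
    </ul>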

Build A Successful Content Strategy

Content is the backbone of a successful SEO strategy. In technical SEO, we focus on certain aspects of content:

Finding and Eliminating Duplicate Content

For this you can use SEO audit tools; there are many available online. We use Copyscape, which is a premium tool. There are also free tools – Small SEO Tools, Plagiarismdetector.net, Duplichecker, etc.

If you find any duplicate content on your website, replace it with new content. When writing content, keep these things in mind:

  • Make use of active voice wherever possible.
  • Write for users not for search engines.
  • Include research and stats – it makes your content trustworthy.
  • Give proper credit to the source when citing a stat or piece of research.

Write content for Thin Pages

Thin content pages are those that:

  • Have low-quality content
  • Have fewer than 100 words of content
  • Are poor-quality affiliate pages
  • Offer no value to the user.

Google considers such pages to be of low value and thus doesn’t rank them in SERPs.

Meta Titles & Descriptions

Optimize Your Tags and Metadata

The Title tag and Meta Description tag provide a short description of the content of the web page. They have two roles:

  • Let search engines know about the context of the web page.
  • To create the desire to click on your site.

Good tags are those that describe the landing page and include action verbs and keywords. They must comply with Google’s guidelines in terms of length. Plugins such as Yoast and online SERP snippet optimization tools like SEOMOFO are very helpful.
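
A minimal sketch of these two tags in a page’s <head> (the wording is purely illustrative):

    <head>
      <!-- Shown as the clickable headline in the SERP -->
      <title>Technical SEO Guide: Best Practices</title>
      <!-- Shown as the snippet under the headline -->
      <meta name="description" content="Learn how to improve crawling, indexing and site speed with this practical technical SEO checklist.">
    </head>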

There are also heading tags – h1, h2, h3 … hn – on your webpages. Placing terms related to your target keyword in these tags is a good way to improve your positioning on Google. But be careful not to overdo it: reading just your headings should be enough to understand the content of the page, without heavy repetition.

Use Canonical Tags

Canonical tags are related to canonical URLs.

When you have a page with duplicate content you either replace the content or tag them as noindex. But there is also one other option – canonical URL.

It indicates the preferred version among web pages with almost the same content. For example, assume you operate an ecommerce site that sells shoes and there is a page specific to black leather shoes.

Now there can be numerous URLs depending on product attribute – size, color and material. Google will see them as duplicate pages – which isn’t good.

Fortunately, with a canonical URL, you can let Google know which version of your product page is the “main” one and which ones are variations. For this you will have to use the Canonical tag.
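
A minimal sketch: each variation page points back to the main version from its <head> (the URLs are hypothetical):

    <!-- Placed on https://www.example.com/shoes/black-leather?size=9 -->
    <link rel="canonical" href="https://www.example.com/shoes/black-leather/">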

Optimize Page Speed

First, know the loading speed of your website. The easiest option is to open your website in incognito mode, which shows you roughly how long your website takes to open for a first-time visitor on a new device.

Also, you can test page loading time using online tools. There are three free tools that you can use for this purpose:

  • PageSpeed Insights : PageSpeed Insights is a free tool by Google – and it is always interesting to know how much Google likes you. It gives you a rating out of 100 and a color code, and it also suggests a whole series of tips to optimize your website.
  • GTmetrix : A popular website speed measurement tool that also offers recommendations for optimizing your website’s performance.
  • Pingdom : A free online tool that gives the loading speed of a web page from different locations around the world. Pingdom also shows you the causes of slowdowns.
  • Bonus Tip: Compare the three results. This way, you get more recommendations to optimize your website and more tips to speed it up.

What Are Possible Reasons Behind A Slow Site?

  • Your Web Hosting: Maybe the server at your web host is not properly configured. This affects the speed of your WordPress site.
  • The Configuration of WordPress: If your WordPress site is not serving pages efficiently, it overloads your server, causing your entire site to slow down or even crash completely.
  • The Weight of the Pages: Usually, the weight of a web page comes from images that have not been optimized for the web.
  • Extensions: A poorly designed extension can also significantly slow down your website.
  • Scripts: External scripts such as advertisements and font loaders also have a huge impact on the performance of your website.

Pro Tips To Optimize Page Speed

Use Caching For WordPress Sites:

WordPress web pages are “dynamic”. That is to say each time an Internet user visits a page, WordPress performs a whole complex process to find the right information, assemble it and then send it to the user.

Technically, WordPress needs PHP and a MySQL database to function. Your server retrieves a whole series of information from your MySQL database and your PHP files, then everything is assembled into HTML content ready to be served to your visitor.

This complex processing can really slow down your website if multiple people visit your site at the same time. In addition, it is not always necessary, since the content of your web pages does not change constantly.

To bypass this processing, one can use “caching”.

How Caching Works:

Instead of generating the page from scratch on each request, the caching plugin makes a copy of the assembled page the first time it loads. That copy is then served to subsequent visitors. The cache is updated automatically if the content of your page changes.

“With caching, your WordPress site can be 2-5 times faster”

Choose a Good Web Hosting

You cannot build massive structures if the foundations are not good. And the quality level of your web host is the foundation of your website. It plays a crucial role in the overall performance of your website.

A good host takes care of certain actions to optimize the performance of your server and, consequently, that of your website. One of these actions is keeping the software on your server up to date. All of this helps to boost the loading speed of your website.

Keep A Check on Your Extensions

Even if all of your extensions are up to date, some can be bad for your website’s performance. For example, there might be some that you don’t use; such extensions only add to the site’s load.
So you need to remove the extensions you don’t need.

Check:

    • Whether a feature offered by an extension is still useful or not.
    • Whether more than one extension is serving the same purpose.

If yes, then you know what to do. Also, some extensions are either poorly designed or run constantly which overloads your server unnecessarily. To identify them, run a scan of your website with the Plugin Performance Profiler extension.

Compress Your Images

Images are a great help in stimulating the engagement of your visitors. But if your images aren’t optimized properly, they do more harm than good.

In fact, oversized images are the most common cause of slowness on a site.

What to do: Try to keep the weight of each image below 100 KB.

How to: Before uploading a photo to WordPress directly from your library, run it through photo editing or compression software. Try Imagify or WP Smush; they can cut the size of an image in half without affecting the quality.

Reduce the Length of Your Pages

The heavier and lengthier the content of a page, the more time it takes to display.

Use PageSpeed Insights or Pingdom to find out the size of your webpages. If it is less than 500 KB, that’s awesome. If it is less than 1 MB, that’s good. If it is more than 3 MB, you should act.

Tips: The size of a web page mainly depends on the size of the images. So, start by optimizing your images. Then get rid of 3rd party scripts.

Pro Tip: Use Google Tag Manager to avoid multiple code installations.

Ensure Your Website is Mobile-Friendly

You no longer have a choice, your website must be “mobile-friendly”.

80% of people who use the internet have a smartphone, and more than half of all traffic comes from mobile devices. Clearly, when a visitor arrives on your website, there is a greater chance that they are behind a smartphone screen than in front of a computer.

Moreover, search engines like Google give priority to mobile-friendly sites in SERPs.

A mobile-friendly website simply means a website that displays well not only on computers, but also on mobile devices, with their smaller screens. On a mobile-friendly website, the text is easy to read, the links and the menu are easily clickable.

There are several tools to find out if your site is mobile-friendly. One of the best is the one offered by Google.

      • Type “mobile-friendly” in the search bar of your favorite search engine.
      • Choose tools just below the search bar.
      • And just after that, Google puts its in-house tool forward.

This tool will help you check mobile usability issues. Enter the URL of your website (or your web page if you want to test the display of a particular page of your site). Google gets to work and tells you if your site is responding well to the criteria of being mobile-friendly or not.

How To Make Your Website Mobile Friendly


Simplify The Menus

Mobile screens are significantly smaller than computer screens. You should always keep this in mind when creating your menus. The menu of the desktop version of your site can be larger and offer more options.

However, on mobile, things get complicated. You should be concise and make use of space as per mobile display. You need to simplify the menus so as not to force your users to scroll and zoom to navigate your site.

Keep a Responsive Design:

A responsive design website is a website whose content automatically adjusts according to the size of the screen to display correctly. If you have a website that is not at all suitable for mobile and you want to fix it quickly, the best solution is probably to change your graphic theme completely.

However, if you have a site that is already well set up and running well, this might not be the best option. But if you have low traffic or are just getting started, installing a responsive theme is the best thing you can do.

If you are using WordPress, changing the theme is very easy. Just go to Appearance > Themes and activate a new theme. You can type “responsive” in the search bar to find a responsive theme.
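
Whichever theme you use, a responsive page relies on the viewport meta tag and CSS media queries. A minimal sketch (the class name and breakpoint are just examples):

    <head>
      <!-- Tell mobile browsers to use the device width instead of a zoomed-out desktop view -->
      <meta name="viewport" content="width=device-width, initial-scale=1">
      <style>
        /* Stack the hypothetical two-column layout once the screen is narrower than 768px */
        @media (max-width: 768px) {
          .two-columns { display: block; }
        }
      </style>
    </head>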

Create Forms as Short as Possible:

Not only do long forms convert less well (all other things being equal), they reduce the quality of the user experience on mobile. On a computer, if a form is not fully displayed on the screen, it is not a big deal, navigation is quite simple because of the mouse. But on a mobile, it’s complicated.

Therefore, go through all your forms and try to reduce their size if they are too long, by removing unnecessary fields. On a newsletter registration form, for example, you do not need to ask for the contact number.

Integrate an Internal Search Bar:

This tip is in relation to simplifying the menus. There may be websites that have more than seven items in their menus. It’s hard to fit a menu with many items on a mobile screen. In this case, the option is to integrate a search bar so that the Internet users can more quickly access the content they are looking for. This avoids ending up with a menu that is too big.

Pro Tip: If you have an e-commerce site, the internal search engine is clearly a must-have feature. Especially when you have a catalog with more than 12 million products like Amazon!

Size Matters:

Navigating a website from a computer is easy: you control everything with your mouse cursor or your keyboard. But navigating with your thumbs on a 4-inch screen is not the same thing. Never forget this when you create the various elements of your mobile website.

Buttons should be large enough to be clickable with a finger. The distance between the buttons/links must be sufficient so that the user does not click on the wrong button. You should always keep in mind the clickable areas on a screen.

Pro Tip: 75% of users navigate on their mobile with their thumb. Avoid integrating buttons in the corners and at the top of the screen: these are the least accessible places for the thumb. Regardless of the size of the phone, the elements should ideally be placed towards the middle of the screen.

Avoid large blocks of text:

You need to reduce the amount of text displayed on your mobile website. Of course, you need to communicate with your visitors, so you need texts, words. But choose short sentences, shorten paragraphs if you want your content to be read.

Don’t forget: if a paragraph is 4 lines on the desktop version, it could very well be 12 on the mobile version. So keep the content short and informative, and make use of numbered lists or bullet points.

Create Accelerated Mobile Pages (AMP)

Accelerated Mobile Pages (AMP) are HTML pages that use a specific format. Google supports this format. Pages that are in AMP format are highlighted by Google on mobile in search results.

The advantage of AMPs is their loading speed, which is much faster than traditional pages. It is for this reason that Google favors them. If you use WordPress, creating accelerated mobile pages is very simple. There is a plugin for it.

More Technical SEO Tips

Use hreflang Tag For Multilingual Website

If you are targeting multiple countries or languages, you will need a multilingual website. For each page, use the hreflang tag to tell search engines which language and regional versions exist, so they can serve visitors the version that matches their language and location. Note that hreflang does not translate anything by itself – you still need the translated pages.
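
A minimal sketch of hreflang annotations in the <head> of the English page (the URLs are hypothetical, and every language version should carry the same set of tags):

    <link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/" />
    <link rel="alternate" hreflang="fr-ca" href="https://www.example.com/fr-ca/" />
    <link rel="alternate" hreflang="x-default" href="https://www.example.com/" />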

Remove Dead Links from Site

Login to your search console and look out for any broken or dead links. Get rid of them. You can read more about them here.

Set up Structured Data

Wondering what is structured data?

Structured data is metadata you add to your pages to make them easier for search engines to understand. For this to work, search engines need a “vocabulary”. The vocabulary used by the major search engines is called schema.org.

Schema.org provides a number of tags and properties to categorize pages or sections of pages. Examples of categories are products, reviews, local business listings, job postings, etc. The major search engines, Google, Bing, Yandex, and Yahoo, have jointly developed this vocabulary to better understand websites.

When implemented correctly, structured data helps search engines better understand the content of your web pages. This makes your website eligible for enhanced listings in the search results, in the form of rich results or rich snippets. However, there is no guarantee that you will get rich results – that is in the hands of the search engines.

Rich results are the additional information and interactive functions that are displayed on a search results page. By using structured data, the snippets you create in your CMS are turned into rich results which provide search engine users with even more information at a glance and encourage them to click. These include reviews, product prices, images, or event locations.
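
As a minimal, hypothetical sketch, the JSON-LD below marks up a product page with schema.org’s Product vocabulary so that the price and rating can appear as a rich result:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Blender Bottle",
      "image": "https://www.example.com/images/blender-bottle.jpg",
      "description": "A shaker bottle with a mixing ball (illustrative description).",
      "aggregateRating": { "@type": "AggregateRating", "ratingValue": "4.6", "reviewCount": "132" },
      "offers": { "@type": "Offer", "price": "12.99", "priceCurrency": "USD", "availability": "https://schema.org/InStock" }
    }
    </script>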

Here’s a detailed guide on it – Schema markup in SEO

Lastly, Focus on Technical SEO User Experience:

For this, you need to measure core web vitals:

      • Largest Contentful Paint (LCP)
      • First Input Delay (FID)
      • Cumulative Layout Shift (CLS)

Read: How Core Web Vitals Work

Technical SEO Tools

You can check the SEO tool list here.

Claim a free technical SEO Audit of your website: Contact Us!
