Have you ever had trouble getting your website indexed? Have you ever asked yourself why Google does not index your site, or what keeps it from appearing in Google search? What should you do when this happens? Usually the best move is to open your Search Console account and read the relevant reports to find out why the site is not indexed in Google. If you have never visited the Coverage section of Search Console before, you may be surprised by the errors listed there. If so, don’t worry: in this article from the Fa Host knowledge base, we investigate why a site is not indexed in Google and the proper way to fix the related errors. Use the following tutorial to analyze your index status reports.
A guide to the Coverage section of the new Google Search Console
Topics that will be addressed in this article:
- What is the reason for the site not appearing in Google search?
- The site is not indexed due to site errors
- Site is not indexed due to URL errors
What is the reason for the site not appearing in Google search?
The index errors section of Search Console, named Coverage, is found in the sidebar menu of the tool. In this section, website errors are displayed in general: the errors you see in the Coverage report are the ones that affect the performance of your website as a whole. In the Coverage section of Google Search Console you can fully review, for the last 90 days, everything that has kept the site from being indexed.
It is better to visit Search Console every day, check the errors in the Coverage report, and keep fixing them. Everything may seem routine and monotonous at first, but this habit helps you quickly spot the important, impactful errors on your website and work on solving them. If you don’t have the opportunity to check Search Console daily, do it at least once every 90 days; that way you can fix the current errors before the data from the previous 90 days is lost.
As mentioned earlier, the errors in the Google Search Console Coverage section fall into clear categories. In the following, we introduce each of them and explain how to fix it.
1. The site is not indexed due to site errors
Errors that fall into this category affect your entire website. For this reason, site errors are considered errors of high importance. Below, we introduce the types of errors that can affect the performance of the whole site.
DNS error
DNS errors are very important for the management of a website and can play a major role in the site not being indexed in Google. DNS stands for Domain Name System. DNS errors are the first and most important errors worth pointing out.
Having a DNS problem means that Googlebot cannot connect to your domain because of a DNS timeout or a DNS lookup failure. Every domain is hosted by a hosting company, and if you encounter such a problem you should act to fix it immediately, because resolving the domain is the first step Googlebot takes to reach your website.
Fix the DNS error
To fix the DNS error that keeps the site from being indexed, take the following steps:
- In the first step, use the URL inspection tool of Google Search Console to see how the Google robot crawls the page at the desired URL.
- If Google cannot fetch the desired page correctly, you need to dig deeper. In this case, review your DNS settings and investigate the problem (a minimal DNS-resolution check is sketched after this list).
- Check whether your hosting server returns a 404 or 500 error. Instead of a failed connection, your server should return error 404 (not found) or error 500 (server error); these responses are more precise than a DNS error.
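As a rough first sanity check, the minimal Python sketch below tests whether a hostname resolves at all. The domain name is a placeholder, and a lookup from your own machine only approximates what Googlebot's resolver sees.

```python
# Minimal DNS resolution check using only the Python standard library.
# "example.com" is a placeholder; replace it with your own domain.
import socket

def check_dns(hostname: str) -> None:
    try:
        ip = socket.gethostbyname(hostname)
        print(f"{hostname} resolves to {ip}")
    except socket.gaierror as err:
        # A lookup failure here roughly mirrors the DNS errors
        # that Search Console reports.
        print(f"DNS lookup failed for {hostname}: {err}")

check_dns("example.com")
```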
Server errors
Another reason for the site not being indexed is a server error. This error usually means the server is taking too long to respond, and that is why you face a server error. When crawling a website, Googlebot can only wait a certain amount of time for the site to load; if loading takes too long, it stops trying to crawl your website.
Server errors are different from DNS errors. A DNS error means that Googlebot cannot even find your URL, while with a server error Googlebot can connect to the website but cannot load the page. One common cause of server errors is excessive traffic to your website; to prevent this, make sure your web hosting provider can handle high volumes of traffic.
Fix server error
When such an incident occurs on your website, use the URL inspection tool to find out whether Google robots can crawl your website or not. If Google Search Console can fetch the website’s homepage without any problems, you can be sure Google has access to your site; otherwise, the reason the site is not indexed is a server error.
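As a simple illustration (not the way Googlebot itself measures response time), the sketch below requests a page with a timeout and reports the status code. It assumes the third-party requests library is installed and uses a placeholder URL.

```python
# Rough server-response check: does the page answer in time, and with
# which status code? Assumes the "requests" library is installed.
import requests

def check_server(url: str, timeout_seconds: float = 10.0) -> None:
    try:
        response = requests.get(url, timeout=timeout_seconds)
        print(f"{url} answered with status {response.status_code}")
        if response.status_code >= 500:
            print("The server itself reported an error (5xx).")
    except requests.exceptions.Timeout:
        print("The server took too long to respond; a crawler may give up too.")
    except requests.exceptions.ConnectionError as err:
        print(f"Could not connect to the server: {err}")

check_server("https://example.com/")  # placeholder URL
```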
Robots error
In this case, the Google robot cannot retrieve the robots.txt file, which leads to the site not being indexed. You may be interested to know that a robots.txt file is necessary only when you intend to limit Google’s access to some of your pages; if you want search engines to index all the information on your website, you don’t need one.
For more on this file, see the guide on how to create a robots.txt file for WordPress and optimize it.
Fix the robots error
Make sure the robots.txt file is configured correctly. Check which pages you have blocked Google from accessing in this file: go through every Disallow: line in your robots.txt and make sure that access to each listed path is restricted exactly as you intend.
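One way to double-check the file is Python's standard urllib.robotparser module, shown in the small sketch below. The domain and paths are placeholders, and this only mirrors how a well-behaved crawler reads the rules, not Google's exact implementation.

```python
# Test which URLs your robots.txt allows Googlebot to fetch.
# Uses only the Python standard library; URLs are placeholders.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser()
robots.set_url("https://example.com/robots.txt")
robots.read()  # download and parse the robots.txt file

for page in ["https://example.com/", "https://example.com/private/report.html"]:
    allowed = robots.can_fetch("Googlebot", page)
    print(f"Googlebot {'may' if allowed else 'may NOT'} crawl {page}")
```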
2. The site is not indexed due to URL errors
Errors in this category are different from site errors, because they affect only one specific page of the website rather than the whole site. Many sites face a high volume of URL errors, and this worries them. The good news is that the Coverage section of the new Google Search Console shows these errors in a categorized manner so you can work through them; after fixing an error, you can use the Validate Fix option to make sure it is really resolved.
404 error
This error is actually one of the most confusing errors in the discussion of crawling and indexing. When Google bots try to crawl a page whose address is no longer valid, they encounter a 404 error. Depending on whether a software factor caused it or the page simply no longer exists, this error falls into two categories: soft and hard.
Hard 404 error
This error is at once one of the most serious and one of the simplest errors you can face. The 404 error matters most when it appears on important pages of the website. It shows up when the desired page no longer exists at all, so when a user or Googlebot visits that page, they encounter this error.
Fix hard 404 error
To fix this error, you must do the following:
- In the first step, make sure you have actually published the desired page through your content management system: the content should not have been deleted, and it should not be sitting in draft mode.
- Then make sure the corresponding URL is correct and contains no mistakes.
- Next, check whether the error appears for the www version of the URL, for the non-www version, and for both the https and http versions of the link.
- If you want the page to act as a redirect, make sure the link points to a relevant page through a proper 301 redirect.
In the simplest case, if the page is dead, you need to revive it. If you don’t want to revive that page, then you need to redirect it to another suitable page.
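To speed up the checks in the list above, a quick sketch like the following can print the status code (and redirect target, if any) for the www / non-www and http / https variants of a link. It assumes the requests library and uses placeholder URLs.

```python
# Print the raw status code and redirect target for each URL variant.
# Assumes the "requests" library; the page path is a placeholder.
import requests

variants = [
    "https://example.com/some-page/",
    "https://www.example.com/some-page/",
    "http://example.com/some-page/",
    "http://www.example.com/some-page/",
]

for url in variants:
    response = requests.get(url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    # A 404 means this variant is dead; a 301 should point to the
    # correct, relevant destination page.
    print(f"{url} -> {response.status_code} {location}")
```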
Soft 404 error
A soft 404 error occurs when the URL of a page loses its validity because of a software factor. This factor can be rules defined in .htaccess, a plugin, or anything similar. Users who see a hard 404 message know the page no longer exists, but with a soft 404 the story is a little different.
Fix soft 404 error
In order to fix the problem related to pages that no longer exist, you should consider the following issues:
- If a page is dead and is not getting any traffic or important links, let it return a 404 or 410 error. Just make sure the server really responds with a 404 or 410 status code and not with a 200.
- Redirect old pages to relevant pages on your site using 301 redirects.
- Be careful not to redirect too many of these dead pages to your home page. These pages should either return a 404 error or redirect to an appropriate related page.
With the help of the 404 page tutorial in WordPress, you can design a custom page to be displayed in such cases.
If your pages are live but showing a 404 error, you should take the following actions:
- Make sure there is enough content on the target page, because pages with little or no content sometimes produce a soft 404.
- Also make sure the page is not displaying a 404 message while returning a 200 status code (a quick check for this is sketched after this list).
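The sketch below is a very rough heuristic for spotting a soft 404: the page returns a 200 status code but its content is nearly empty or reads like an error page. The URL, length threshold, and wording check are illustrative assumptions, not rules Google publishes.

```python
# Heuristic soft-404 check: a 200 response whose body looks like an
# error page or is very thin. Assumes the "requests" library.
import requests

def looks_like_soft_404(url: str) -> bool:
    response = requests.get(url, timeout=10)
    if response.status_code != 200:
        return False  # a real 404/410 is not a *soft* 404
    body = response.text.lower()
    too_thin = len(body) < 1000            # arbitrary "thin content" threshold
    error_wording = "page not found" in body
    return too_thin or error_wording

print(looks_like_soft_404("https://example.com/old-page/"))  # placeholder URL
```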
If you want to learn how to redirect, you can read the article on how to redirect WordPress media pages to the main post.
The soft 404 error is a bit difficult to recognize and understand, because pages with this error are stuck somewhere between a normal page and a page with a 404 error. For this reason, make sure the main and important pages of your site do not produce soft 404 errors.
Access denied error
With an access denied error, the Google robot cannot crawl a page of your website because it lacks permission to access it. The reasons for this error include:
- Users must first log in to the site to be able to see the URL, so Googlebot cannot access it either.
- Through the robots.txt file, Google’s access to a URL, folder or the entire site is blocked.
- The web hosting service provider has blocked the Google bot’s access to your website or the server requires users to authenticate themselves through a proxy.
- And so on.
Of course, when the access of Google bots to your website is limited, you will have problems in indexing the content. That is why it is important to investigate this issue and solve it.
Fix Access denied error
To fix the access denied error and eliminate the indexing problem, you must remove all the factors that limit Googlebot’s access. These steps include:
- Remove the login requirement from pages you want Google to crawl.
- Check the robots.txt file to make sure that the pages you want are not blocked.
- Use the URL inspection tool to check whether the Google search engine can fetch and display your page or not (a quick status check is sketched after this list).
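As a rough check, the sketch below requests a URL and reports whether the server answers with an access-denied style status (401 or 403). It assumes the requests library; the URL is a placeholder, and sending Googlebot's public user-agent string from your own machine only approximates what the real crawler experiences.

```python
# Report whether a URL answers with 401/403 ("access denied" style codes).
# Assumes the "requests" library; the URL is a placeholder.
import requests

url = "https://example.com/members-only/"
headers = {
    # Googlebot's public user-agent string, used here only for testing.
    "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
}

response = requests.get(url, headers=headers, timeout=10)
if response.status_code in (401, 403):
    print(f"{url} is blocked for this request (HTTP {response.status_code}).")
else:
    print(f"{url} answered with HTTP {response.status_code}.")
```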
Problems related to access denied can affect your site’s ranking. For this reason, it is very important to check the errors related to the access permission.
Not followed error
You should not confuse this error with the “nofollow” attribute used in linking. A “not followed” error means that Google cannot follow a specific URL. Most of the time it happens because Google still has trouble processing Flash content, JavaScript code, and problematic redirects. So if main pages are not followed, you should take action to fix them.
Fix the Not followed error
Google has identified the features that search engines have problems with when crawling. These features include:
- JavaScript
- Cookies
- Session IDs
- Frames
- DHTML
- Flash
If the page you’re having trouble with uses one of the above features, you have likely found the cause of the error. When the content and links on a page cannot be seen, Google robots cannot crawl that page, and this keeps the site from being indexed, so remove the offending feature. The other cause mentioned for the not followed error was page redirection. In this regard, do the following:
- Check the redirect chains. If a chain has too many hops, Google will not follow it (a sketch for counting the hops follows this list).
- If you have enough time, instead of using redirect, try to update your website architecture.
- Don’t put your redirected URLs in the sitemap. Only the final URL should be included in the sitemap.
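The following sketch counts the hops in a redirect chain, as suggested in the first item of the list. It assumes the requests library and a placeholder starting URL.

```python
# Follow a redirect chain and count its hops. Assumes the "requests"
# library; the starting URL is a placeholder.
import requests

start_url = "https://example.com/old-url/"
response = requests.get(start_url, timeout=10)

print(f"Final URL: {response.url} (HTTP {response.status_code})")
print(f"Redirect hops: {len(response.history)}")
for hop in response.history:
    print(f"  {hop.status_code} {hop.url}")

# Many hops (or loops) make it likelier that crawlers stop following;
# only the final URL should appear in the sitemap.
```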
DNS and server errors
In the URL errors category we can also encounter DNS and server errors. The method of solving and managing these errors for a specific URL is the same as for an entire website, so we do not explain how to fix them again in a separate section. There is one difference worth noting, though: if specific URLs of your domain have their own separate configuration, their errors belong in the URL errors category.