Even if you are not a programmer, you have probably seen these strange codes on your screen while loading a website:
When a browser loads a page, the server can return various kinds of responses that give indications about the state of the page and of the website.
These responses are three-digit codes.
The first digit identifies the response category; the other two digits describe the specific response.
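As a quick illustration, the first-digit rule can be sketched in a few lines of Python. The category names follow the HTTP standard; the function itself is just a didactic helper, not something your server needs:

```python
def status_category(code: int) -> str:
    """Map an HTTP status code to its category via the first digit."""
    categories = {
        1: "informational",
        2: "success",
        3: "redirection",
        4: "client error",
        5: "server error",
    }
    return categories.get(code // 100, "unknown")

print(status_category(200))  # success
print(status_category(301))  # redirection
print(status_category(404))  # client error
print(status_category(500))  # server error
```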
But to make sure the crawler (the search engine robot) reaches every corner of your site, you have to ensure that nothing is blocking the robots from reading it.
SEMRush, for example, is a search engine emulator, not a search engine itself.
You have to configure your site carefully so that the SEMRush engine can reach every part of it for a proper analysis.
Sometimes SEMRush reports errors that do not actually exist, simply because its engine did not have adequate access to your site.
To provide adequate access, you need to set up the following files correctly:
IMPORTANT: obviously, meu-site.com is purely an example. You must enter your own site's name.
So here you can see the list of directives set up in your robots.txt, that is, which parts of your site are not to be crawled by Google (or another search engine), along with other information.
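You can also check programmatically whether a given crawler is allowed to read a URL, using Python's standard `urllib.robotparser`. The rules below are a hypothetical example; to test your live file, point the parser at `https://www.meu-site.com/robots.txt` instead:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules, parsed inline for the example.
# For your real file, use:
#   parser.set_url("https://www.meu-site.com/robots.txt"); parser.read()
parser = RobotFileParser()
parser.parse([
    "User-agent: *",
    "Disallow: /admin/",
])

# Can SEMRush's crawler (SemrushBot) fetch these URLs?
print(parser.can_fetch("SemrushBot", "https://www.meu-site.com/blog/"))   # True
print(parser.can_fetch("SemrushBot", "https://www.meu-site.com/admin/"))  # False
```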
The dot ‘.’ before .htaccess is not a typo. It simply indicates that this is a configuration file read by the server.
The response categories are:
- 1xx: informational responses
- 2xx: success
- 3xx: redirection
- 4xx: client errors
- 5xx: server errors
The best-known response in the server-error (5xx) category is 500, which means there is a problem accessing the content on the server.
In this case there is no point in leaving the post super optimized if you do not first solve the problem in the server's response.
The most common cause of this type of error is the server’s response time.
This can happen when you share the server with other users: the server's operating system cannot cope efficiently with all the simultaneous demands in the same place.
It is as if, on the same computer, you tried to open the same folder from several places at once: the folder opens again and again until, at some point, the system hangs.
We went through this ourselves on the NoTopo site and decided to migrate our website to the Hostgator server to resolve the server response problem.
Another solution might be to rent a dedicated server.
In this case the response time is excellent, but the cost is much higher, since the server space is dedicated only to you. This solution is recommended for websites with many hits a day, because a shared server can seriously impair the performance of your website and your SEO in general.
The best-known response in the success (2xx) category is 200, which means everything is okay: you do not have to do anything.
Ideally, every page of your site would return a 200 response when loaded.
The best-known response in the redirection (3xx) category is the 301 redirect.
For example, when we delete a page or no longer want it to be served by the browser, we should redirect it to another page of the site.
To do this you can use a 301 for a permanent redirect, or a 302 for a temporary one.
I think that nowadays very few people use the 302, so let’s focus on the 301.
The 301 redirect can be configured in the .htaccess file if you are familiar with your server and FTP. If your site runs on WordPress, you can set it up through a simple plugin.
We like using a WordPress plugin, since it does not require knowing the code behind it.
However, for those who want to know it, the directive to include in .htaccess is as follows:
Redirect 301 /pagina-antiga https://www.site-novo.com/pagina-nova
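One quick way to verify that the redirect is active is to request the old URL and inspect the status code and `Location` header without following the redirect. The sketch below is self-contained: it stands up a throwaway local server that plays the role of your real site, using our usual placeholder paths:

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# A tiny stand-in server that answers /pagina-antiga with a 301,
# the way .htaccess would on the real site.
class RedirectHandler(BaseHTTPRequestHandler):
    def do_HEAD(self):
        if self.path == "/pagina-antiga":
            self.send_response(301)
            self.send_header("Location", "/pagina-nova")
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# HEAD request to the old URL: we want the status, not the content.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("HEAD", "/pagina-antiga")
resp = conn.getresponse()
print(resp.status, resp.getheader("Location"))  # 301 /pagina-nova
conn.close()
server.shutdown()
```

Against your real site you would replace the local connection with `http.client.HTTPSConnection("www.site-novo.com")` and request the old path directly.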
After a while, it is worth checking whether the redirected pages are still indexed in Google.
If they are no longer indexed, you can remove the redirect and leave everything in order.
To see if your page is indexed, go to Google and type (here with our example domain):
site:meu-site.com/pagina-antiga
The site: command is a direct query against the Google index.
If the page you were looking for appears as a result, it is still indexed, and it is best to keep the redirect in place.
If Google does not display the page in the results, it is likely no longer indexed, and you can try removing the 301 rule.
But be careful: there is no 100% certainty that this URL will never return to the Google index. If an external link still points to the page, it could be indexed again, and you would need to reinsert the 301 redirect.
The best-known response in the client-error (4xx) category is 404.
People who work with SEO refer to it simply as: it "gave a 404".
And everyone understands what that means.
A 404 error means that the page does not exist.
This is the response the server gives when it cannot find any content at the URL you requested; it happens especially when the content has been removed.
According to Google's official communication, 404s do not directly impact your SEO.
However, we believe a high number of 404s does have consequences for your website's optimization, since you are spending your crawl budget on content that does not exist.
The crawl budget is the number of your site's URLs that Google (or another search engine) accesses per day.
This number depends on your domain authority. So if you have a high number of 404s, you are wasting your crawl budget on non-existent pages and may end up reducing your site's authority.
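To make the waste concrete, here is a toy calculation; both numbers are invented purely for illustration:

```python
# Hypothetical figures: neither number comes from a real site.
crawl_budget = 500   # URLs the search engine crawls on your site per day
broken_urls = 50     # URLs currently answering 404

wasted_share = broken_urls / crawl_budget
useful_crawls = crawl_budget - broken_urls
print(f"{wasted_share:.0%} of the daily crawl budget is wasted")
print(f"{useful_crawls} useful crawls remain per day")
```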
When the Google crawler enters your site, it records the URLs it visits in its own database.
This database is called the Index.
A page is indexed when it is inserted into this Google database.
Now imagine Google entering your site and finding a lot of 404 errors, that is, content that does not exist.
As mentioned above, several platforms point out these errors.
Google Search Console, for instance, shows how many URLs were accessed and how many were indexed.
The first, quickest answer might be: set up a 301 redirect and you are done.
Of course, a 301 redirect solves the issue for the crawler: when it tries to access the URL without content, it is sent straight to the redirect target before loading.
But there is a little more to it than that.
In analysis platforms (such as Google Search Console) you can see where the links pointing to this URL come from.
However, you have to be careful, because these platforms do not always identify the external links leading to the 404 URL.
Screaming Frog, for example, only crawls pages it discovers through internal links, so if a URL is reached only through external links, Screaming Frog will not catch that 404.
Google Webmaster Tools, on the other hand, gives some indication of links from platforms you manage yourself (Pinterest, LinkedIn, Facebook, etc.).
If you control those links, rather than redirecting the 404 to another page, it is better to fix the link so it no longer points to a non-existent page. That way you leave everything clean.
If you have no way to control the link because it comes from an external source, then yes, the 301 redirect is needed.
More and more, we see that SEO work is not just about filling the site with keywords. Before that, we need to go through the hygiene phase of SEO, which evaluates the more technical parts.
Analyzing the responses pages return when they load is only one part of this hygiene optimization: perhaps tedious, but necessary.
If you want to understand more about this type of analysis, had a different experience, or want to add to and comment on this article, leave a comment here below.