How do most people do website SEO audits?


To develop any strategy aimed at achieving an objective, we must first determine, as accurately as possible, our current situation and, from there, which actions or tactics we can carry out.

That is, knowing where we are in order to identify the roads or alternatives available that will bring us closer to our goal.

Organic positioning (SEO) strategies do not escape this reality, and SEO audits are the mechanism that allows us to establish the status of a website and, depending on the objective, design the best strategy.

An SEO audit also provides a “diagnosis” of the quality of a website, not only in relation to its content and structure, but also to other aspects such as performance, user experience, navigability or relevance.

These aspects are closely related to the guidelines that Google recommends for optimizing the organic positioning of a website’s pages and which, in turn, are part of the ranking factors of its search algorithms.

Therefore, the degree of success of our SEO strategies for a website will depend directly on the quality and scope of the SEO audit carried out.

Now, what is an SEO audit and what is its scope? What answers does it offer us and what can we do with them?

Next, we will look in detail at all the fundamental aspects that the SEO analysis of a website should cover, setting out guidelines to optimize its positioning and develop an optimal SEO strategy.

What is an SEO audit?

First of all, we must understand that, to perform an SEO audit, the website must exist and be published. Normally, it will also have several months or years of life behind it.

It makes sense: after all, the SEO audit aims to analyze the “current state” of a website.

However, this does not mean that you must wait for a website to be built and spend a few months on the Internet before doing an SEO audit and optimizing it for organic search.

An SEO audit is carried out on an existing product (the word itself says it: something to “audit”), but you can also build a website from scratch following minimum quality parameters for SEO optimization.

In this case, it would no longer be an audit as such, but rather a consultancy or advisory service, in which the SEO professional is part of the multidisciplinary team that builds the website and guides or advises it during the construction process.

Even so, this advisory work does not exempt us from carrying out a subsequent SEO analysis, once the website has been built, to ensure that it complies with the consulting recommendations, especially those related to the technical and content aspects.

At this point, we know the purpose of an SEO audit (optimizing the website for positioning) and when to do it, but what does it consist of and how is it done? In other words, what tasks should we undertake during the audit?

 Areas covered by an SEO audit

Although the SEO audit is carried out on a clearly defined product (that is, an operational website), the diversity of structural, content and functional elements that make up a website means that the audit has different areas of application depending on its purpose.

In general, these areas would be the following:

    • Technical analysis. This section focuses on the highly technical aspects of the website, related to the technological platform that hosts it. Here, factors such as download speed, the configuration of the hosting server or the administration of the content manager (e.g. WordPress) would be assessed, among others.
    • Structure and content optimization. It is not enough for a website to have quality content that is useful to users (in fact, this is already taken for granted); it must also be presented in a format that is, on the one hand, easy to read (actually, to scan) by people and, on the other, allows search engine crawlers to distinguish the different parts of the content and their relative importance.
    • Website architecture. Conceptually, a website represents a hierarchical structure, with some pages more important than others, depending on their content or the importance that we, as owners, want to give a page, in addition to the links between them.

      The SEO audit verifies that the organization and relationship of the web pages (that is, the architecture of the site) coincide with the desired hierarchy and that this is clear to both the users and the crawlers.

    • Usability and navigability. Currently, an important part of SEO optimization lies in providing a satisfactory user experience during the visit, one that leaves a good impression, extends the stay and makes the user want to return.

      Although I have listed it separately, this section depends directly on areas such as the technical or content structure.

    • Indexing and relevance of the website. When the website has been in operation for some time, we must evaluate the impact it has had so far, be it positive or negative. Factors such as domain authority or the profile of incoming links are especially important here.
    • Positioning of keywords. Once a website is on the Internet, even if there has been no explicit organic positioning strategy, from the moment the crawlers locate and crawl it, it starts to rank for some keywords. We must analyze them and determine how they fit into our objectives or strategies.
    • State of the competition. We must not forget that we are not alone. On the Internet there are dozens, if not hundreds or thousands of websites that offer products, services or content similar to ours. 

      How are we different? What makes them better? What can we learn from them? How do they rank? Analyzing the competition’s websites, their successes and their failures, will help us improve our own analysis.

As a final result of doing an SEO audit, a report is issued that includes all the incidents detected, recommendations for their resolution as well as improvement, and a first proposal of the organic positioning strategy.

What advantages does an SEO audit have for my strategy?

So far we have seen that if we want to design an optimal organic positioning strategy, we must start with the SEO analysis of the website. This is the ultimate goal of the audit and almost its reason for being.

However, behind this premise, conducting a complete SEO audit provides a set of advantages that we must bear in mind both during the development of the audit and when implementing the recommendations of the resulting report:

    • Establish a reference point. We are in marketing and, let’s not forget, we must measure everything to evaluate the impact of our actions, whether positive or negative. If we go in blind, without properly analyzing the current situation, we can hardly know what we are doing right, what is wrong, or what we should strengthen or correct.
    • Identify points of improvement in the website that affect its positioning. The audit not only detects problems or errors that may be a burden on positioning, but also which aspects of the website can be improved to optimize results.
    • Identify the keywords for which you are already ranking. By the mere fact of being available on the Internet, within reach of the crawlers, the pages of a website are already ranking for some keywords, which need not coincide with those initially intended or those intended in the future.

      Knowing what these keywords are can help us design a strategy that harnesses their potential to reinforce the authority of the website.

    • Know the direct competition and counteract their positioning actions. We should have no doubts in this regard: we are not alone out there. There are a variety of competitors who have and want almost the same as us. We must identify and study them, so that our website provides differentiation and added value with respect to them.
    • Possibility of identifying new opportunities. In addition to serving as an evaluation tool, SEO analysis allows us to examine with a critical eye the sector the website is aimed at, which can give us ideas to develop new concepts.

      For example, the creation of new content, products or services, optimization for new keywords, or the identification of promising niches ignored by our competition, among others.

    • Discard actions or tactics with little or no chance of success. Similarly to the new opportunities, the SEO audit can help us identify what is a burden on the website’s optimization, such as keywords without search volume, content that does not generate interest among users, or saturated niches with no possibility of expansion.
    • Improve the acquisition of qualified users and the conversion rate. We not only want to increase visits to a website; we want those visits to come from users who are interested in its contents, who solve or satisfy some need and who, as a final goal, complete a conversion, whatever its type.

      It is useless to have millions of visits if none of them stops to interact with the website.

How to do an SEO audit step by step

[Image: checklist of the areas covered by an SEO audit]

Having covered the fundamentals, the justification and the advantages of SEO audits, we will now detail the steps to take.

Depending on the complexity of the website, some steps will be unnecessary or trivial, but as the website grows in size, the analysis and evaluation requirements will increase at each step.

To facilitate its understanding and given that the field of application of an audit encompasses several sections of diverse technical nature, we will divide this sequence following those sections.

Although it is not essential to strictly follow the proposed order, it is advisable to do so. On the one hand, it imposes a certain order on our SEO analysis work; on the other, some steps, which we will identify, may need the conclusions of previous steps.

 Technical SEO

In this section we will assess aspects directly related to the technological platform that hosts the website and make it available through the Internet.

Although most of these checks do not require a high degree of technical qualification, modifying some of these parameters can negatively affect the website, or even stop it from working, so they should be handled with caution and always with the necessary guarantees.

    • Check the robots.txt file

The robots.txt file includes directives on how crawlers should crawl the website, from which directories they may access to which types of files they should ignore.

Some directives can instruct one crawler or another not to crawl the website at all, so a misconfigured robots.txt file can cause the most important parts of a website to be ignored, however optimized they may be. In fact, on many occasions, the best robots.txt is… not having one!
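As an illustration (a minimal sketch; the sitemap URL and paths are placeholders, assuming a WordPress site), a conservative robots.txt might look like this:

```text
# Apply to all crawlers
User-agent: *
# Keep crawlers out of the WordPress admin area...
Disallow: /wp-admin/
# ...except the AJAX endpoint, which some themes rely on
Allow: /wp-admin/admin-ajax.php

# Tell crawlers where the sitemap lives (placeholder URL)
Sitemap: https://www.example.com/sitemap.xml
```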

    • Review the .htaccess file

This file governs the behavior of the web server (not to be confused with the content manager), and its correct configuration can improve the performance of the server (that is, reduce its response time), harden security against possible attacks, or optimize 30x redirects.

It should not be touched unless you are VERY sure of what you are doing. While a robots.txt file may, at worst, prevent search engines from crawling a website, a misconfigured .htaccess file may cause the site to stop responding altogether.
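As a hedged sketch (assuming an Apache server with mod_rewrite enabled; the domain is a placeholder), a typical .htaccess fragment that consolidates all traffic onto the https://www variant with a single 301 redirect might look like this:

```apacheconf
# Redirect everything to https://www.example.com with one 301 hop
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^ https://www.example.com%{REQUEST_URI} [L,R=301]
```

Combining the HTTPS and www conditions into one rule avoids chaining two separate 301 redirects, which is exactly the kind of optimization mentioned above.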

    • Review the sitemap file

The sitemap file includes a kind of index or guide to the pages and resources of a website (such as images or documents).

Although current crawlers are smart enough to end up “inferring” the structure of a website, through the sitemap we ensure that they “see” the structure and content that interest us, rather than their own “particular” vision, which may or may not coincide with ours.

In addition to having a properly configured and updated sitemap file (there are tools or automatic plugins for this), we must indicate it to the search engines (in Google, through its search console):

[Image: sitemap submission in Google Search Console]
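For reference (all URLs and dates are placeholders), a minimal sitemap.xml follows the sitemaps.org protocol:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-08-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/what-is-an-seo-audit/</loc>
    <lastmod>2018-07-15</lastmod>
  </url>
</urlset>
```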
    • Check the download speed of the website

Although it is only one step, its scope can cover many areas of the website, since a slow download speed can be due to many factors, from the purely technical (which may be solved in the .htaccess file) to others related to the content (such as the size of the images).

There are multiple tools for this evaluation (such as GT-Metrix, Pingdom Tools or Google PageSpeed Insights) which, in addition to assigning a score for the performance level of the website, each issue a report with the elements that have room for improvement.

    • Review the mobile version of the website

Users increasingly use mobile devices to navigate through the Internet. We must ensure, therefore, that the website has a responsive design that adapts to the dimensions of the display screen.

Google provides a tool (the mobile-friendly test) that checks whether a website is optimized for mobile devices. In addition, Search Console, in the “Search Traffic” menu, also includes an option that analyzes mobile usability:

[Image: mobile usability report in Google Search Console]
    • Optimized configuration of the content manager (WordPress)

Currently, most websites are made on a content manager, such as WordPress or Joomla, which allows you to manage and update your content through user interfaces based on very easy-to-use forms.

As part of the platform that hosts and serves the website, we must review its configuration to verify that it is adapted and optimized for the characteristics of the website. In WordPress, for example, a frequent problem often comes from the excessive installation of plugins that can be unnecessary or redundant.

 Keyword analysis

Search engines show their results based on the keywords entered by users. The suitability of our website’s keywords and their correlation with user searches are essential to optimize our presence in search engines. Therefore, it is an aspect that we should not overlook when doing the SEO audit.

    • Current status of keywords positioned

We have already mentioned it: by the mere fact of existing on the Internet, it is almost certain that the website has already been crawled by search engines and associated with certain search keywords. Again, the Google Search Console provides information in this regard:

[Image: search queries report in Google Search Console]

Although these keywords may not coincide with those that interest us in principle, it is worth taking them into account, because they can open up new opportunities to continue generating content, or identify which content we must modify to avoid them.

    • Keyword research. While the previous point focuses on how we are currently positioned, this section focuses on how we would like to be positioned; that is, with which keywords we want to appear in searches.

      To do this, we will use tools like KW Finder or the Google Keyword Planner, which allow us to analyze and evaluate various combinations of keywords and choose those that maintain a good compromise between level of competition and search volume within our area of interest.

    • Cannibalization

Cannibalization occurs when two or more pages of the same website compete for the same keyword. One possible negative consequence of this scenario is that search engines index and rank only one of them, the one they consider most appropriate.

Another consequence is that, by spreading a keyword across several pages, we dilute the strength that a single page would have for that keyword, weakening its positioning in the search engines.

The Google Search Console, within “Search Analytics” in the “Search Traffic” menu, can also offer many clues in this regard.
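The check itself can be partly automated. As a minimal sketch (assuming a CSV export of query/page pairs, e.g. from Search Console; the column names and sample data are assumptions), the following Python groups pages by query and flags any query served by more than one page:

```python
import csv
from collections import defaultdict
from io import StringIO

def find_cannibalization(csv_text):
    """Group pages by query; queries answered by 2+ pages are
    candidates for keyword cannibalization.
    Expects CSV columns: query,page (an assumed export format)."""
    pages_by_query = defaultdict(set)
    for row in csv.DictReader(StringIO(csv_text)):
        pages_by_query[row["query"].strip().lower()].add(row["page"])
    # Keep only the queries competed for by several pages
    return {q: sorted(p) for q, p in pages_by_query.items() if len(p) > 1}

# Made-up sample export
sample = """query,page
seo audit,https://example.com/seo-audit/
seo audit,https://example.com/what-is-seo/
keyword research,https://example.com/keywords/
"""
print(find_cannibalization(sample))
```

Queries that appear in the result are the ones worth reviewing by hand, deciding which page should keep the keyword and how to refocus the others.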

    • Positioning of the competition

We know how we are doing, but how, and for what, is the competition ranking? Through indirect methods and with tools like SemRUSH or Ahrefs, we can discover their keywords and analyze their feasibility for our own website, depending on their level of competition, difficulty or search volume.

 SEO OnPage

Here we identify which internal aspects of the website, beyond the technical section that we have already analyzed, can be revised to optimize it.

To differentiate it from technical SEO, which is also “on the site”, SEO OnPage is independent of the platform and technology used, focusing on the content, the structure of the pages or the link between them.

    • Analyze indexing, authority, domain and possible penalties

Either through the Google Search Console, or with the “site:” command of the search engine, we can see the degree of indexation of the website and how many of its pages are indexed. A discrepancy with the actual values may indicate crawlability problems (either in the robots.txt or in the sitemap).

We must also check the authority value of the domain of the website (with tools such as Moz or Majestic) and identify possible penalties, which may make it advisable to desist from that domain.

    • Titles and meta descriptions of the website and the pages

We must ensure that each page of the website has a title that matches its content and is optimized for the keywords of that page, as well as its meta description, eliminating any repetition or, as is very common, titles and descriptions that are blank or left at their default values.

The Google Search Console, under “HTML Improvements” in the “Search Appearance” menu, will show us when there are duplicate titles or meta descriptions, but it will be our task to check, page by page, that they are optimized:

[Image: HTML improvements report in an SEO audit]
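A first pass over this check can be scripted. As a hedged sketch using only the Python standard library (the sample HTML is made up), this parser pulls the title and meta description out of a page so they can be reviewed in bulk:

```python
from html.parser import HTMLParser

class TitleMetaParser(HTMLParser):
    """Extract the <title> and the meta description from an HTML page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if (a.get("name") or "").lower() == "description":
                self.description = a.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Made-up page to demonstrate the extraction
html = ('<html><head><title>SEO Audit Guide</title>'
        '<meta name="description" content="How to audit a website.">'
        '</head><body></body></html>')
parser = TitleMetaParser()
parser.feed(html)
print(parser.title, "|", parser.description)
```

Run over every page of the site, it quickly surfaces blank, default or duplicate titles and descriptions.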
    • Images optimized in dimensions and size

A subject that is still VERY much pending, even today. Although images are usually optimized in terms of file size, it is still very common to find large images (thousands of pixels wide and high) being displayed in a much smaller box.

We must ensure that all the images are optimized for the location where they will be displayed, both in terms of their dimensions and the size of the file.

    • Optimized content for keywords and users

Each page must be written with a keyword as its objective, using it appropriately throughout the text. That is, not only exact matches, but also spelling variations, related expressions and so on.

In addition, HTML text formatting resources (H1, H2 and H3 headings; bold; italics; lists) must be used appropriately to highlight the important elements of the text, so that they can be easily identified by both users and crawlers.

    • Optimized URLs

A readable URL, with semantic meaning and related to the keywords we identified previously, provides information to the user about the content of the page, and to the crawlers when it comes to ranking it.

It can extend or complement the information that the title of the page (not to be confused with the H1 heading) already includes.

    • Website architecture and internal links

A fundamental aspect of a website, as already mentioned, is how the contents (that is, the pages) are organized and how they are related to each other (that is, their internal linking), following a hierarchical structure that is easy to understand and follow for both users and crawlers.

In this section we check that the architecture of the website responds to this structure, with the most important contents or pages located at prominent levels of the architecture and conveniently linked to facilitate navigation between them.

    • Broken links (HTTP 404 error) and redirects (30x)

When a link leads nowhere, it frustrates the user, who may decide to go to another website, and it wastes the crawl budget of the search engine, which has followed a link with no destination. Detecting broken links by hand is laborious, but online tools such as Dead Link Checker or plugins like Broken Link Checker do it automatically.

We must also analyze the redirects, both permanent (301) and temporary (302), and decide whether they are appropriate; that is, whether to eliminate the redirect and put the final URL in its place (only possible if we are talking about internal links).

From the point of view of performance, if WordPress is used with a redirection plugin, we must evaluate whether it is really necessary, since almost the same can be achieved, and much more efficiently, with the .htaccess file.
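Flattening redirect chains, in particular, is easy to sketch in code. Assuming we have already extracted a mapping of 301 redirects (the URLs below are invented), the following Python resolves each old URL straight to its final destination so internal links can skip the intermediate hops:

```python
def resolve_redirect_chains(redirects):
    """Given a mapping {old_url: new_url} of 301 redirects, return a
    mapping from each old URL directly to its final destination."""
    resolved = {}
    for start in redirects:
        seen = {start}
        url = start
        while url in redirects:
            url = redirects[url]
            if url in seen:      # redirect loop: stop and leave as-is
                break
            seen.add(url)
        resolved[start] = url
    return resolved

# Invented chain: /old-post -> /new-post -> /final-post
chains = {
    "/old-post": "/new-post",
    "/new-post": "/final-post",
}
print(resolve_redirect_chains(chains))
```

Updating internal links to the resolved URLs removes one HTTP round trip per hop for both users and crawlers.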

    • Structured data markup (Schema)

Structured data markup allows us to assign semantic meaning to the information contained in web pages so that search engine crawlers can identify and interpret it.

For example, while for a person it is easy to recognize a postal address, for a crawler it is only a text string indistinguishable from the rest.

Correct structured data markup allows search engines to generate the rich snippets in the search results, which make our links stand out even more:

[Image: structured data in an SEO analysis]

In the case of e-commerce websites, adequate data markup becomes critical, because it means that our products can appear directly on the Google results pages, with everything that this could bring in terms of qualified visits.
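As an illustration (all product details are placeholders), a product marked up with the schema.org vocabulary in JSON-LD might look like this:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example product",
  "image": "https://www.example.com/images/product.jpg",
  "description": "Placeholder product description.",
  "offers": {
    "@type": "Offer",
    "priceCurrency": "EUR",
    "price": "19.99",
    "availability": "https://schema.org/InStock"
  }
}
```

A fragment like this, embedded in a `<script type="application/ld+json">` tag, is what lets search engines build price and availability rich snippets for the product page.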

 SEO OffPage

By themselves, an efficient website and optimized content do not ensure good positioning; the site also needs relevance and “contacts” with the outside world. This section of the SEO audit evaluates this, along with an analysis of how the competition is doing.

    • Analysis of incoming links

Among the positioning factors used by search engines, the structure of external links to a website is one of the most prominent since, to a certain extent, it measures its “reputation”.

There are several parameters to consider in this analysis. Not only the number of links, but also the “reputation” of the origin website, the correlation between its theme and ours, the anchor text, as well as other factors related to the context where that link is placed (that is, the text around the link at the origin).

In addition to the Google search console, we have multiple tools to facilitate this work, which we have already used in other sections of the audit, such as SemRUSH, Ahrefs, Majestic or Moz OSE:

[Image: inbound links report]
    • Presence in social networks

In the “Acquisition” menu of Google Analytics we can see that one of the channels through which visitors can reach us is Social Networks:

[Image: social networks traffic in an SEO analysis]

Although links from social networks do not directly affect positioning, analyzing this traffic will help us define a social media distribution strategy with a greater response from users, selecting the networks with the best prospects and attracting an audience more qualified for our interests.

    • Duplicate contents

Search engines do not like duplicate content, since it contributes nothing to what already exists, besides being an ethical grievance (we are talking about plagiarism). Consequently, when they detect it, they usually penalize the “copying” website while keeping the original.

Even when we ourselves (either directly or through a team of trusted writers) have written the contents, it is always advisable to verify that there are no duplications with other websites (which, for example, may have copied us).

Even writing the contents ourselves, similarities can occasionally occur that could be interpreted as duplicate content. Tools like Siteliner and Copyscape will help us in this task, both inside and outside our website.

    • Relevance of the competition

In the previous sections we already identified and analyzed the websites of our main competitors and, even if indirectly, their keywords.

In the SEO OffPage we must continue this research work, including an analysis of the incoming external links of their websites, practically following the same methodology and tools that we used on our own website (first point of this section), as well as their presence on social networks.

The structure of incoming links that we obtain from our competitors will serve, in addition to assessing their relevance on the Internet, to reinforce our own structure by adding links from websites that they have but we do not.

 User experience

We can have the best content and an excellent structure of external links, but if the users who visit our website cannot find the information they are looking for, or feel confused or lost when they try to navigate through the pages, then we will lose the opportunity to capture them, perhaps forever.

This section tries to measure the satisfaction of the user experience, within the limits imposed by a criterion based on very subjective evaluations and, often, on the latest trends in website design.

    • Hierarchical structure of menus and navigability

During the OnPage SEO review, the importance of an adequate website architecture was already pointed out. This architecture and, especially, the hierarchy of the main pages should also be reflected in the menus of the website.

There is no point in having a perfect web architecture if the menus are then confusing, or show an apparent structure different from the real one, confusing both users and crawlers when it comes to identifying the pages and navigating between them.

    • Third party advertising

It is not intrinsically negative to place third-party advertising on a website, but it must be done with respect for the user, in a non-intrusive way, and only with relevant and potentially useful messages; that is, messages related to the context of the page.

We must not forget that if users come to our page, it is because something in our content has interested them and they want an answer. They are not there to be bombarded with advertising.

On the other hand, many advertising platforms involve connecting to an external server to download the ad, which can slow down the loading of the pages and negatively affect the user experience.

    • Trends in competition

Web design tends to advance by trends, as design and development techniques improve and the usability factors that favor the user experience become better understood.

Being aware of what our competition does allows us to see the practical application of these trends, analyze them in real time and evaluate them to decide their suitability for our website.