Get a free SEO audit

THE SEO METHOD

Fill in this information and instantly receive a report covering more than 50 key SEO factors and whether your website complies with them. Don't wait any longer!

SEO analysis

What is the SEO Analyzer?

The SEO analyzer gives you a general overview of the current state of the on-page elements of your business website. It is simple to use. To run a free website analysis with this SEO tool, you only have to follow three simple steps:
  • Enter the URL of your website.
  • Enter the phrase or keyword you want to rank for.
  • Enter your email address.
After completing these fields, a window will open with the results of your SEO analysis, which you can download as a PDF. In less than a minute you will know what is failing and what is working on your business website. When we analyze the SEO of our own business, it is easy to overlook certain aspects, or to underestimate certain actions that the website analyzer does take into account.

What does SEO analysis offer you?

Errors

Here is an explanation of the most common errors that free web analysis can detect so that you know where to apply changes:

Internal links are broken

This message indicates that broken internal links have been detected: links on your website that point to other pages of your own site but return an error when loaded, because the inserted link is incorrect.

This wastes search engine crawl resources, lowers the quality of your website, and also provides a bad user experience.
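
As an illustration only (this is not the analyzer's own implementation), the sketch below shows how a basic broken-internal-link check could be written in Python, assuming the requests and beautifulsoup4 packages are installed and using https://www.example.com as a placeholder for your site:

import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

START_URL = "https://www.example.com/"  # placeholder: replace with a page of your site

html = requests.get(START_URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")
domain = urlparse(START_URL).netloc

for a in soup.find_all("a", href=True):
    link = urljoin(START_URL, a["href"])  # resolve relative links against the page URL
    if urlparse(link).netloc != domain:
        continue  # keep only internal links
    try:
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        print("Broken internal link:", link, "->", status)

A real crawler would repeat this over every page of the site rather than a single URL, but the idea is the same: request each internal link and flag any response of 400 or above.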

Issues with broken internal JavaScript and CSS files

If this error is displayed, we recommend that you check how these files load, since they are usually responsible for rendering the page.

Broken JavaScript and CSS files can degrade navigation and give a bad user experience, in addition to wasting crawler resources on requests to files that do not exist.

Pages returned 4XX status code

These are linked pages that return a 4XX response, meaning the page failed to load, in most cases because it does not exist or because access to it is restricted.

Pages returned 5XX status code

This type of error is usually caused by the server, and you will usually need to contact your hosting provider to solve it. 5XX errors make pages inaccessible even though they exist.

Pages don’t have title tags

This indicates that there are pages without a title tag. The title tag is the title that is displayed when the page appears in search results.

Issues with duplicate title tags

This error tells us that there are pages sharing the same title tag, which is usually treated as duplicate content. It is recommended that every page has a title that is different from the rest.
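
As a hedged example of how both of these title checks could be reproduced by hand, here is a minimal Python sketch. It is not the analyzer's implementation; it assumes requests and beautifulsoup4 are installed, and the PAGES list is a placeholder you would replace with your own URLs:

import requests
from bs4 import BeautifulSoup
from collections import defaultdict

# Placeholder list: replace with the pages you want to audit.
PAGES = ["https://www.example.com/", "https://www.example.com/services"]

titles = defaultdict(list)
for url in PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("title")
    if tag is None or not tag.get_text(strip=True):
        print("Missing title tag:", url)  # covers the previous error type as well
    else:
        titles[tag.get_text(strip=True)].append(url)

for title, urls in titles.items():
    if len(urls) > 1:
        print("Duplicate title", repr(title), "used on:", urls)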

Pages have duplicate content issues

This message is displayed when duplicate content issues are detected. All pages must have unique content to avoid being penalized by search engine algorithms.

Pages couldn’t be crawled

When pages cannot be crawled, this error is displayed. We must make sure that search engine robots can crawl the pages we want them to index.

This problem can also occur for two specific reasons:

DNS resolution issues

The crawling issue is caused by a failure to resolve the domain's DNS.

Incorrect URL formats

The crawling issue is caused by incorrectly formatted URLs.
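
If you want to verify these two causes yourself, a rough Python sketch like the one below can help. It is only an approximation of what a crawler does, and the URLS list is a placeholder:

import socket
from urllib.parse import urlparse

# Placeholder URLs: replace with the links you want to validate.
URLS = ["https://www.example.com/page", "htps://broken-scheme.example", "https://no-such-host.invalid/"]

for url in URLS:
    parts = urlparse(url)
    host = parts.hostname
    if parts.scheme not in ("http", "https") or not host:
        print("Incorrect URL format:", url)
        continue
    try:
        socket.gethostbyname(host)  # basic DNS resolution test
    except socket.gaierror:
        print("DNS resolution issue:", url)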

Internal images are broken

This message is displayed when images with incorrect URLs are detected, preventing them from being displayed to the user and search engines.

Pages have duplicate meta descriptions

Each page should contain its own meta description, since that text is normally displayed below the title in search results. It should communicate what the page is about so that users can decide whether it matches what they are looking for.

Robots.txt file has format errors

If the robots.txt file is not correctly created, this problem will occur. We can check the file in Search Console to confirm whether there really is an error and, if so, what type of error it is.
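
Besides Search Console, you can do a quick sanity check with Python's standard library. The sketch below is not a full validator (robotparser is tolerant of many mistakes); it simply confirms that the file can be fetched and behaves as you expect. The domain is a placeholder:

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")  # placeholder domain
rp.read()  # download and parse the file

# Ask whether a given URL may be crawled by any robot ("*").
print(rp.can_fetch("*", "https://www.example.com/private/"))

# Sitemap lines declared in robots.txt (Python 3.8+); returns None if there are none.
print(rp.site_maps())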

Sitemap.xml files have format errors

If the sitemap.xml file has not been created correctly, this message will be displayed. You can also check with Search Console to see where the problem occurs.

Incorrect pages found in sitemap.xml

This error occurs when the pages shown in the sitemap do not exist.
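
One hedged way to spot such entries yourself is to parse the sitemap and request every URL it lists, as in this Python sketch (the sitemap address is a placeholder and the requests package is assumed to be installed):

import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder: replace with your own

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
ns = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

for loc in root.iter(ns + "loc"):
    url = (loc.text or "").strip()
    try:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None
    if status != 200:
        print("Sitemap entry is not reachable:", url, "->", status)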

Pages have a WWW resolve issue

This problem appears when errors occur while resolving the domain with or without the www prefix.

Pages have no viewport tag

This tag is essential if you want to make your website accessible and optimized for mobile devices.
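
A typical value is <meta name="viewport" content="width=device-width, initial-scale=1">. As a small illustrative check (not the analyzer's code, placeholder URL), you could look for the tag like this:

import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/"  # placeholder URL

soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")
viewport = soup.find("meta", attrs={"name": "viewport"})

if viewport is None:
    print("No viewport tag found; the page does not declare how to scale on mobile devices.")
else:
    print("Viewport tag found:", viewport.get("content"))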

Excessive HTML code

If the pages contain too much unnecessary HTML code, it can slow down the loading of your website, so it is recommended to create only the necessary code.

AMP pages have no canonical tag

AMP pages must include a canonical tag pointing to the standard version of the page (or to themselves if no standard version exists), so that search engines know which version to index.

Issues with hreflang values

The values inserted in hreflang tags are incorrect. This can cause problems with search engines interpreting the language and country the content is targeting.

Hreflang conflicts within page source code

It is important to review the hreflang tags to fix possible conflicts and correctly indicate to search engines where each translated version of the content is located.

Issues with incorrect hreflang links

The links that have been inserted in the hreflang tags are incorrect and should be reviewed and modified.
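
To review the hreflang annotations discussed in the last three points, a simple listing of each declared language and the status of its linked URL is often enough to spot wrong codes or broken links. The Python sketch below is only illustrative and uses a placeholder URL:

import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/"  # placeholder URL

soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")

for link in soup.find_all("link", hreflang=True):
    lang = link["hreflang"]
    href = link.get("href")
    if not href:
        print("hreflang entry without href:", lang)
        continue
    try:
        status = requests.head(href, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None
    print(lang, "->", href, "status:", status)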

Non-secure pages

These are pages that are not served over the HTTPS protocol. For some time now, Google Chrome has displayed a warning that a page is not secure when it is not served over HTTPS.

Issues with expiring or expired certificate

If the security certificate is going to expire soon or has expired, it must be renewed to fix the problem.

Issues with old security protocol

If the server still supports an old SSL/TLS protocol version, it should be updated to a current one, since outdated protocols are considered insecure.

Issues with incorrect certificate name

If the certificate name is incorrect, it can cause problems with web access, so it must match the domain or subdomain name.

Issues with mixed content

This problem happens when the page contains elements loaded over both HTTP and HTTPS. We recommend moving the whole site to HTTPS so that every resource loads over secure connections.

No redirect or canonical to HTTPS homepage from HTTP version

If the website is reachable over both HTTPS and HTTP, this error is displayed. You must redirect, or add a canonical tag, from the HTTP version to the HTTPS version.
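
A quick way to confirm the redirect is in place is to request the HTTP version and see where you end up. This Python sketch is only an example, with a placeholder domain:

import requests

resp = requests.get("http://www.example.com/", allow_redirects=True, timeout=10)  # placeholder domain

if resp.url.startswith("https://"):
    print("OK: the HTTP version redirects to", resp.url)
else:
    print("Problem: the HTTP version ends up at", resp.url, "without reaching HTTPS")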

Redirect chains and loops

If there are several redirects chained to each other, or redirect loops, we waste crawl resources and give the user a bad browsing experience. Links should point directly to the final URL.
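
If you want to see a chain or loop for yourself, you can follow redirects one hop at a time, as in this illustrative Python sketch (placeholder URL, requests assumed installed):

import requests
from urllib.parse import urljoin

url = "http://www.example.com/old-page"  # placeholder URL
chain = []

# Follow redirects one hop at a time so chains and loops become visible.
while len(chain) < 10:
    if url in chain:
        print("Redirect loop detected:", chain + [url])
        break
    chain.append(url)
    resp = requests.head(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location")
    if resp.status_code in (301, 302, 303, 307, 308) and location:
        url = urljoin(url, location)  # the Location header may be relative
    else:
        break

if len(chain) > 2:
    print("Redirect chain of", len(chain) - 1, "hops:", chain)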

AMP HTML issues

If your HTML does not follow AMP standards this problem occurs. It should be revised to match the standards set.

AMP style and layout issues

If the style and layout of your website does not follow AMP standards this issue occurs. It should be revised to match the standards.

AMP templating issues

If the AMP page has template syntax, it can cause problems.

Pages with a broken canonical link

The canonical tag on these pages points to a URL that does not exist or returns an error. Review the canonical links and point them to valid, accessible pages.

Pages have multiple canonical URLs

If a page has multiple canonical tags this problem occurs. Pages can only contain a single canonical tag.
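
The following Python sketch covers this check and the previous one about broken canonical links. It is only an illustration with a placeholder URL, not the analyzer's implementation:

import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/"  # placeholder URL

soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")

canonicals = []
for link in soup.find_all("link"):
    rel = link.get("rel") or []
    if isinstance(rel, str):
        rel = rel.split()
    if "canonical" in rel and link.get("href"):
        canonicals.append(link["href"])

if len(canonicals) != 1:
    print("Expected exactly one canonical tag, found:", canonicals)
elif requests.head(canonicals[0], allow_redirects=True, timeout=10).status_code >= 400:
    print("The canonical tag points to a broken URL:", canonicals[0])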

Pages have a meta refresh tag

The meta refresh tag is used to redirect a user to another page after a few seconds. It is not recommended because it tends to cause SEO and user experience problems.

Subdomains don’t support secure encryption algorithms

This problem occurs when the security protocol is too old or outdated.

Sitemap.xml files are too large

The sitemap.xml file should not contain more than 50,000 URLs or weigh more than 50 MB.

Pages have slow load speed

Pages should load as fast as possible to improve the user experience and reduce the resources used by search engines. You can use tools like GTmetrix or PageSpeed Insights to check what you can improve on your website.

Warnings

Below is an explanation of the most common warnings that the free website analysis can detect so you know where to apply changes:

Issues with unminified JavaScript and CSS files

Minifying files reduces the weight of the files, making them load much faster.

Issues with uncompressed JavaScript and CSS files

Compressing files reduces the weight of the files, making them load much faster.

Issues with uncached JavaScript and CSS files

Caching these files allows browsers to reuse them on repeat visits instead of downloading them again, making loading much faster.
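
For these three warnings, the relevant information is mostly in the HTTP response headers of each asset. As a rough illustration (placeholder asset URL, requests assumed installed):

import requests

ASSET = "https://www.example.com/static/main.css"  # placeholder: point at one of your CSS or JS files

resp = requests.get(ASSET, headers={"Accept-Encoding": "gzip, br"}, timeout=10)

print("Content-Encoding:", resp.headers.get("Content-Encoding", "none (uncompressed)"))
print("Cache-Control:", resp.headers.get("Cache-Control", "none (no caching policy)"))
# A very rough minification hint: minified files rarely contain many line breaks.
print("Number of line breaks (a high value suggests the file is not minified):", resp.text.count("\n"))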

Outgoing internal links contain nofollow attribute

The nofollow attribute tells robots not to follow a link, so if we add it to links pointing to our own internal pages, crawlers will not discover those pages and they will rank worse, or not at all.

Images don’t have alt attributes

The alt attribute allows search engines to understand semantically what the image is about, so it is highly recommended to fill it in.
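
Finding the images that are missing the attribute is straightforward; here is an illustrative Python sketch with a placeholder URL:

import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/"  # placeholder URL

soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")

for img in soup.find_all("img"):
    if not img.get("alt"):  # missing or empty alt attribute
        print("Image without alt text:", img.get("src"))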

Uncompressed pages

Compressing files reduces their weight, making them load much faster.

Pages have low text-HTML ratio

It is always advisable to have more text than HTML, although if we work with content management systems, this can be difficult.

Pages have a low word count

It is recommended that all pages have a minimum of 300 words for them to be indexed, since content with fewer words makes indexing and positioning more difficult.
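
Both this word count and the text-to-HTML ratio from the previous point can be approximated with a few lines of Python. This is only a rough illustration with a placeholder URL; real tools count visible text more carefully:

import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/"  # placeholder URL

html = requests.get(URL, timeout=10).text
text = BeautifulSoup(html, "html.parser").get_text(separator=" ", strip=True)

ratio = len(text) / len(html) if html else 0
print("Word count:", len(text.split()))
print("Text-to-HTML ratio: {:.1%}".format(ratio))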

Pages don’t have meta descriptions

Meta descriptions help users get a preview of the content before visiting the page.

Pages have duplicate H1 and title tags

The H1 and title tags used on each page should be different to semantically help search engines understand what the site is about.

Pages have too much text within the title tags

If the title tag is too long, it will not be displayed in its entirety in search results, so a size of about 50-55 characters is recommended.

Page doesn’t have enough text within the title tags

If the title of your website is too short, it may not stand out or convey the information correctly, compared to other search engine results.

External links are broken

If external broken links are detected, it is recommended that these are corrected.

External images are broken

If external images are broken, it is recommended that these are corrected.

Links on HTTPS pages lead to HTTP pages

If the page is under HTTPS protocol, all links should point to the HTTPS version and not to the HTTP version, so the detected links should be corrected.

Pages don’t have an h1 heading

The h1 is very powerful for SEO and helps the user to understand what the page is about.

Pages have too many on-page links

If the page has too many outbound links, it can cause problems with indexing and positioning, so it is best to limit their number.

Pages have temporary redirects

Temporary redirects do not transfer authority. They should be used only for pages that you want to make publicly available for a limited period of time.

Pages have too many parameters in their URLs

Search engines have difficulty ranking URLs that rely on many parameters; in addition, if the URL is too long, it is harder for the search engine to interpret its full semantic value.

Pages have no hreflang and lang attributes

Pages that target users in different languages or countries should declare hreflang and lang attributes so that search engines can serve the correct version to each user.

Pages don’t have character encoding declared

Not declaring the character encoding can lead to errors when search engines and browsers interpret special characters.

Pages don’t have doctype declared

Declaring a doctype (for example, <!DOCTYPE html> for HTML5) helps browsers and search engines interpret the page correctly, so every page should include one.

Sitemap.xml not indicated in robots.txt

The robots.txt file should indicate where the sitemap.xml file is located by specifying its full path in a Sitemap line.

Sitemap.xml not found

It is recommended that all websites have a sitemap.xml to facilitate the work of crawlers.

Homepage does not use HTTPS encryption

Google considers HTTPS encryption as a quality value for websites, so it is recommended to have it.

Subdomains don’t support SNI

Using SNI allows you to support multiple servers and host multiple certificates on the same IP address, which can improve security and trust.

HTTP URLs in sitemap.xml for HTTPS site

If the site uses the HTTPS protocol, the sitemap should list all URLs with HTTPS and not HTTP.

Issues with blocked internal resources in robots.txt

The problem arises because the robots.txt file is blocking resources, which can result in the site being rendered differently for robots than for users.

Pages have a JavaScript and CSS total size that is too large

If the total weight of your CSS and JavaScript exceeds 2 MB, this warning is displayed. It is advisable to optimize your website to keep that weight as small as possible.

Pages use too many JavaScript and CSS files

If the page uses too many JavaScript and CSS files (more than 100), this warning is displayed. Making separate requests for many files slows down page loading, so it is recommended to keep only the files each page actually needs.

Alerts

Below is an explanation of the most common alerts that the free website analysis can detect so that you know where to apply changes:

Outgoing external links contain nofollow attributes

Having nofollow attributes on outgoing links can have a negative impact on how your site is crawled. These links do not pass authority to the sites they point to.

URLs with a permanent redirect

It is advisable to use redirects in some situations, but be aware that they consume crawl budget.

Pages have only one incoming internal link

Pages with few incoming links signal that they have little relevance within your site, so they will have little ranking power.

Pages are blocked from crawling

These pages are blocked from crawling, for example by robots.txt or a robots meta tag. Check whether the block is intentional; otherwise search engines will not be able to access them.

Subdomains don’t support HSTS

It is recommended that your server has HSTS support.

Pages contain more than one H1 tag

It is recommended that each page has only one h1.

URLs on “x” pages are too long

If URLs are too long, part of the string will not be interpreted at a semantic SEO level, so it is recommended to use short URLs.

Robots.txt not found

It is recommended that all websites have a robots.txt file to block the access of robots to certain parts of our website and indicate where the sitemap is located.

Pages have hreflang language mismatch issues

This problem arises when the detected language of the page and the declared language do not match, so it must be corrected.

Orphaned pages in sitemaps

Orphaned pages are pages with no incoming internal links. It is recommended that all pages receive links, and pages that are set to noindex should not be included in the sitemap.xml.

Pages blocked by X-Robots-Tag: noindex HTTP header

The X-Robots-Tag HTTP header sent with the page prevents it from being indexed. We must check whether this is intentional.
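
Since this directive travels in the HTTP response rather than in the HTML, you can check it by looking at the headers, as in this small illustrative sketch (placeholder URL):

import requests

URL = "https://www.example.com/"  # placeholder URL

header = requests.head(URL, timeout=10).headers.get("X-Robots-Tag", "")
if "noindex" in header.lower():
    print("This page is blocked from indexing by the X-Robots-Tag header:", header)
else:
    print("No noindex directive in the X-Robots-Tag header.")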

Issues with blocked external resources in robots.txt

If a robots.txt file external to our site blocks robots from crawling files we are using, this problem occurs. It is recommended to host your own files whenever possible.

Issues with broken external JavaScript and CSS files

If we use external JavaScript and CSS files and they no longer exist, or the link is inserted incorrectly, this problem occurs. It is recommended to host your own files whenever possible.

Pages need more than 3 clicks to be reached

It is recommended that all pages can be reached within 3 clicks from the homepage. This also makes it easier for robots to reach as many pages as possible.
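
Click depth can be estimated with a breadth-first crawl from the homepage, where the depth of a page is the number of clicks needed to reach it. The sketch below is only an approximation (placeholder homepage, internal links only, capped at depth 3); comparing the pages it finds against your sitemap shows which ones need more than 3 clicks:

import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse, urldefrag
from collections import deque

START = "https://www.example.com/"  # placeholder homepage
domain = urlparse(START).netloc

depth = {START: 0}
queue = deque([START])

# Breadth-first crawl: the first time we see a page is also its minimum click depth.
while queue:
    page = queue.popleft()
    if depth[page] >= 3:  # no need to crawl deeper than 3 clicks
        continue
    try:
        soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
    except requests.RequestException:
        continue
    for a in soup.find_all("a", href=True):
        link = urldefrag(urljoin(page, a["href"]))[0]  # resolve and drop #fragments
        if urlparse(link).netloc == domain and link not in depth:
            depth[link] = depth[page] + 1
            queue.append(link)

print("Pages discovered within 3 clicks of the homepage:", len(depth))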

Frequently asked questions about free SEO analysis

What tools are used for SEO analysis?

An SEO analysis allows us to know the current state of our website and the factors that are affecting its positioning on Google and other search engines, and that should therefore be optimized to achieve more conversions. The tools used for SEO analysis range from Google's own tools, such as Search Console, Analytics, or the search engine itself, to complementary professional tools such as Sistrix, Semrush, Ahrefs, or Screaming Frog, among others.

What is the difference between on page and off page SEO analysis?

On-page SEO analysis refers to the aspects of our own website that have an impact on search engine rankings: loading speed, page indexing, content quality, images, and so on. Off-page SEO analysis, on the other hand, covers everything done outside our website that ends up affecting our positioning, such as acquiring external links, mentions and citations, etc.

Why is it important to know my organic competition?

The SEO analyzer also allows you to learn some things about your competition. If you add to the SEO tool the URL of a website that you consider a direct competitor for the same term or niche, the report will offer you a comparison between the two. Getting a general idea of where you stand relative to your competition can help you improve. Finally, keep in mind that the SEO analyzer is simply a rough guide that can give you small tips for improvement. If you want to grow your digital marketing strategy, it is best to rely on experts. At Agencia SEO we want to help you grow. Shall we grow together?