If your SEO strategy is not working as expected, the algorithm is not necessarily to blame. You can have an excellent content strategy, with every optimization in place, and your website still fails to rank. Do you know why this happens? Because in most cases, we forget about technical SEO.

You see, Google weighs roughly 200 factors when ranking search results. And while many relate to content and domain authority, a large share depends on user experience: speed, design, usability, and other factors that keep users' attention for longer.

Good content and technical improvements go hand in hand; one does not rank without the other.

Most common SEO mistakes

Next, we will explain which technical factors you should review regularly to keep your website in good SEO health. These fixes can be applied by your webmaster or tech team and usually don't take much time.

1. Server problems 

HTTP 4xx codes

HTTP status codes are your server's response to a request from the browser. A 4xx code means the request was malformed or could not be completed, most often because the page no longer exists. This error appeared in 47.3% of the cases analyzed.
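A quick way to surface these is a script that requests each URL and flags 4xx responses. Below is a minimal sketch using Python's requests library; the URL list is a placeholder for your own pages:

```python
import requests

# Placeholder list of pages to audit; swap in your own URLs.
urls = [
    "https://example.com/",
    "https://example.com/old-page",
]

for url in urls:
    try:
        # A HEAD request is enough to read the status code without the body.
        response = requests.head(url, allow_redirects=True, timeout=5)
        if 400 <= response.status_code < 500:
            print(f"4xx error ({response.status_code}): {url}")
    except requests.RequestException as exc:
        print(f"Request failed for {url}: {exc}")
```

Some servers answer HEAD requests differently than GET; if results look odd, swap `requests.head` for `requests.get`.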

Broken internal links

Broken internal links cost 42.4% of the analyzed sites ranking positions. This happens when a link points from one page to another that no longer exists, and it tells Google that the site is not being regularly maintained. (A sketch for detecting these, together with broken external links and images, follows the "Broken internal images" item below.)

Broken external links

This error is similar to the previous one, but involves URLs that point to external domains. It affected 46.5% of the websites.

Pages not crawled

It usually occurs when the server’s response time is greater than 5 seconds or access to the web page is denied. This error was detected in 29% of the cases.
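You can time your own pages before crawlers give up on them. A minimal sketch along the same lines, using the 5-second threshold mentioned above:

```python
import requests

url = "https://example.com/"  # placeholder URL

try:
    response = requests.get(url, timeout=10)
    # elapsed measures the time from sending the request
    # to receiving the response headers.
    seconds = response.elapsed.total_seconds()
    if seconds > 5:
        print(f"Slow response ({seconds:.1f}s): crawlers may give up on {url}")
except requests.Timeout:
    print(f"{url} did not respond within 10 seconds")
```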

Broken internal images

It happens when the image file no longer exists or the URL doesn’t work. This was found in 16% of the projects analyzed by Semrush.
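Broken internal links, external links, and images can all be caught in one audit pass, as noted above: fetch a page, collect every link href and image src, and test each URL. A sketch under the same assumptions (the page URL is a placeholder):

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

page = "https://example.com/"  # placeholder page to audit

html = requests.get(page, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Resolve every link target and image source to an absolute URL.
targets = [urljoin(page, a["href"]) for a in soup.find_all("a", href=True)]
targets += [urljoin(page, img["src"]) for img in soup.find_all("img", src=True)]

site = urlparse(page).netloc
for url in sorted(set(targets)):
    if not url.startswith("http"):
        continue  # skip mailto:, tel:, javascript:, etc.
    kind = "internal" if urlparse(url).netloc == site else "external"
    try:
        status = requests.head(url, allow_redirects=True, timeout=5).status_code
        if status >= 400:
            print(f"Broken {kind} resource ({status}): {url}")
    except requests.RequestException:
        print(f"Unreachable {kind} resource: {url}")
```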

Redirects

The status of your temporary (302) and permanent (301) redirects also affects organic positioning: long or broken redirect chains can waste crawl time and dilute ranking signals.
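requests records every hop of a redirect chain in response.history, which makes chains easy to inspect. A minimal sketch with a placeholder URL:

```python
import requests

url = "http://example.com/old-path"  # placeholder URL

response = requests.get(url, allow_redirects=True, timeout=10)
if response.history:
    print(f"{len(response.history)} redirect(s) before reaching {response.url}:")
    for hop in response.history:
        # 301 = permanent, 302/307 = temporary
        print(f"  {hop.status_code}: {hop.url}")
else:
    print(f"No redirects: {url} answered {response.status_code} directly")
```

Chains longer than one or two hops are worth collapsing into a single direct redirect.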

2. SEO errors in meta tags

Meta tags are signals that tell search engines what your content is about. You probably know them best as the SEO title, meta description, and image ALT attributes. All of these fields should specify your keywords clearly to send a direct message to Google.

When these fields are not optimized, search engines generate them automatically, causing duplication and mismatch problems that negatively affect your SEO.

Duplicate meta tags 

Your content must have original titles and descriptions; duplicates often confuse the search engine during ranking. The error is so common that it was found in 50.8% of the web pages evaluated by Semrush.
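To catch these, collect each page's title (and, with the same loop, its meta description) and group the repeats. A sketch using placeholder URLs:

```python
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

urls = ["https://example.com/", "https://example.com/copy"]  # placeholders

titles = defaultdict(list)
for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else "(missing)"
    titles[title].append(url)

for title, pages in titles.items():
    if len(pages) > 1:
        print(f"Duplicate title {title!r} on: {', '.join(pages)}")
```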

Lack of H1 tags

This error was present in 64% of cases. The H1 tag is the main title of your article and one of the most important signals for Google; without it, the search engine cannot properly understand your content. (The audit sketch after the "ALT attributes missing" item covers this check.)

Lack of meta descriptions

If your website has a very low CTR, it may be due to missing meta descriptions. The description is what users see in the SERPs (search engine results pages), so it has a big impact on click-through rate; a poor CTR in turn weakens other signals that matter during ranking. Missing meta descriptions hurt the positioning of approximately 67% of the websites.

ALT attributes missing

For Google to understand the images on your website, it is important to include keywords in their ALT attributes; otherwise, the images lose relevance.
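The three checks in this section (H1, meta description, ALT attributes) fit naturally into one on-page audit, as mentioned above. A minimal sketch with BeautifulSoup and a placeholder URL:

```python
import requests
from bs4 import BeautifulSoup

url = "https://example.com/"  # placeholder page to audit
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

# H1: the main title Google uses to understand the page.
if soup.find("h1") is None:
    print("Missing H1 tag")

# Meta description: what users see in the SERPs, so it drives CTR.
description = soup.find("meta", attrs={"name": "description"})
if description is None or not description.get("content", "").strip():
    print("Missing or empty meta description")

# ALT attributes: how Google understands images.
for img in soup.find_all("img"):
    if not img.get("alt", "").strip():
        print(f"Image without ALT text: {img.get('src', '(no src)')}")
```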

3. Duplicate content

Duplicate content can be very detrimental to your organic positioning, and it can occur in blog posts, descriptions, and meta tags.

It can happen, for example, if your website serves content in two languages. In these cases there are fast, effective solutions, such as adding a rel="canonical" tag or creating a 301 (permanent) redirect.
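You can verify that each variant declares a canonical URL by reading its rel="canonical" link tag. A sketch with placeholder URLs (the two language variants are assumptions for illustration):

```python
import requests
from bs4 import BeautifulSoup

# Placeholder: two language variants that should point to one canonical URL.
pages = ["https://example.com/en/post", "https://example.com/es/post"]

for url in pages:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    canonical = soup.find("link", rel="canonical")
    if canonical and canonical.get("href"):
        print(f"{url} -> canonical: {canonical['href']}")
    else:
        print(f"{url} declares no canonical URL")
```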

This error affected more than 50% of the web pages considered in the study. 

4. SEO errors in your sitemap.xml

There are two very common SEO errors related to sitemap.xml. The first, and most basic, is not having a sitemap.xml at all.

The sitemap.xml is, as the name implies, a map of your website. It helps Google understand the structure of your project and greatly benefits SEO. In fact, its URL should be referenced in your robots.txt file.

Despite its importance, more than 17% of websites do not have one. And those that do often commit the second most common mistake: broken links inside the sitemap.xml.

Google frowns on this failure, because the algorithm highly values regular updates and maintenance; these signal that the website is constantly improving for its users.
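Checking the sitemap is easy to automate: fetch it, pull each <loc> URL, and flag entries that no longer resolve. A minimal sketch with the standard library's XML parser and a placeholder sitemap URL:

```python
import xml.etree.ElementTree as ET

import requests

sitemap_url = "https://example.com/sitemap.xml"  # placeholder
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
for loc in root.iter(f"{NS}loc"):
    url = loc.text.strip()
    try:
        status = requests.head(url, allow_redirects=True, timeout=5).status_code
        if status >= 400:
            print(f"Broken sitemap entry ({status}): {url}")
    except requests.RequestException:
        print(f"Unreachable sitemap entry: {url}")
```

And remember the robots.txt reference mentioned above: a single line such as `Sitemap: https://example.com/sitemap.xml` is all it takes.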

5. Low word count

In the study, 72.3% of web pages proved to have a low word count, which for Google can mean the content is not high quality.
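Counting the visible words on a page takes only a few lines with BeautifulSoup's get_text; the 300-word floor here echoes the guideline discussed next, not a hard rule:

```python
import requests
from bs4 import BeautifulSoup

url = "https://example.com/post"  # placeholder article URL

soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
# Remove scripts and styles so only readable text is counted.
for tag in soup(["script", "style"]):
    tag.decompose()

words = len(soup.get_text(separator=" ").split())
if words < 300:
    print(f"Possible thin content: only {words} words on {url}")
else:
    print(f"{words} words on {url}")
```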

One of the most popular SEO guidelines holds that texts must be at least 300 words long. The number of words alone probably does not define an article's quality, but in a context where a million other pages offer the user the same information in more detail or better explained, Google puts your content to the test.

According to recent research from HubSpot (2020), most articles on the first page of Google average around 1,500 words. And that tells us a lot about user preferences.

So, always try to keep your content competitive. It’s not about writing for the sake of writing, either. Offer quality in your content and consider the tastes of your audience.