Top frequent SEO mistakes to avoid!

Whether we are unsure or simply don't know, here is a list of frequent minor SEO errors to keep in mind when doing SEO or managing a site.

Minor SEO errors can have a big impact on your search engine ranking, Google first among them. Believe me, these minor SEO errors sometimes have huge consequences on your SEO and may even cause crawler robots not to index your site at all.

1- Silly SEO Mistake # 1: telling your CMS not to index your site

When using a CMS, you have a multitude of options to check, uncheck, and edit that influence the SEO of your website. One of these options ensures that your website will NEVER be indexed by search engines.

This is the option that asks search engines not to index your site. Yes, when you realize that you checked this option in WordPress during installation, you feel stupid.
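To make this concrete: when WordPress's "Discourage search engines from indexing this site" setting (under Settings → Reading) is checked, WordPress adds a robots meta tag to every page, roughly like this:

```html
<!-- Added by WordPress when "Discourage search engines" is checked -->
<meta name='robots' content='noindex, nofollow' />
```

If you find a tag like this in your page source and did not intend it, uncheck the option and the tag disappears.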

2- Silly SEO Mistake # 2: not checking the quality of your code with the W3C validator

You may already know this, but everything on the internet is code. Code has its rules, and search engines love it when those rules are respected.
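As a minimal illustration of "respecting the rules", here is a bare-bones HTML5 document with the structure the W3C validator expects (a sketch, not a full template):

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8">
    <title>My page</title>
  </head>
  <body>
    <p>Hello, world.</p>
  </body>
</html>
```

Missing pieces like the doctype, the `lang` attribute, or the `<title>` element are exactly the kind of errors the validator flags.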

If your site does not follow a valid structure, which you can check with the W3C validator, your SEO may suffer. Our advice: stay DRY (Don't Repeat Yourself), and whenever you change your theme or the core of your CMS, test your website again on the W3C HTML validator. That will help you avoid this SEO error.

3- Silly SEO Mistake # 3: forgetting to set external links to NoFollow and losing SEO juice

Not setting the external links on your site to NoFollow is an SEO mistake that can cost you dearly. Every SEO specialist will tell you: if you take the trouble to build a site and work on your organic SEO (link building, on-page and off-page optimization …), do not ruin it by leaving the links on your site in DoFollow! Once the robots are on your site: keep them there!

By setting the external links in your articles to NoFollow, the robots will see the link and understand the process, but they will not leave your site and will continue their visit!

This is what a NoFollow link looks like:

<a href="/monlien/" rel="nofollow">my anchor</a>

and that is what Google says about NoFollow.

4- Silly SEO Mistake # 4: forgetting to work on your robots.txt file

To put it quickly: the robots.txt file sits at the root of your website, and it is the first file that robots (Google, Google Images, Bing …) read when they arrive on your site.

This robots.txt file allows you to prioritize or restrict their work. By reading this file, they know whether or not they are allowed to crawl a given page, file, or folder. If you leave your robots.txt as it was initially, it will look like this:

User-Agent: *

Allow: /

If we translate this piece of code: all robots (User-Agent designates the robots, and * means all of them) are allowed to crawl all pages (Allow means authorized, and / means every folder on your server). Imagine the SEO mistake! The goal in SEO is not to tire the robots, so that they understand your site is well maintained and come back to crawl it as often as possible.

If a robot wastes time, the Google / Bing / Yahoo servers waste time, and therefore money, so they will send their robots to your site less often. So if you allow everything to be crawled, you will feed the robots unnecessary pages and files, e.g. WordPress configuration files, JavaScript files … Tidy up your robots.txt to avoid tiring the robots and to get them to come back as often as possible.
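As a sketch of a tightened robots.txt for a WordPress site (the paths below are the common WordPress defaults; adapt them to your own setup):

```text
# Apply to all crawlers
User-Agent: *
# Keep robots out of the WordPress admin area
Disallow: /wp-admin/
# But allow the AJAX endpoint, which some themes and plugins rely on
Allow: /wp-admin/admin-ajax.php
```

The idea is simply to exclude folders that carry no SEO value so that crawl time is spent on your actual content.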

5- Silly SEO Mistake # 5: forgetting to deindex your pre-production site


The day before yesterday, when I opened my email and saw a subject line reading "URGENT: OUR CUSTOMERS CAN ACCESS THE PRE-PROD", I laughed a little (I am not the one who codes the site) and then thought: battle stations, what happened?


Answer: the nice development agency had broken the .htaccess and .htpasswd of the pre-prod site, and Google's bots had started indexing it.

Result: 1853 pages indexed in the search results. The mistake was not putting "Disallow: /" in the robots.txt at the root of the pre-prod domain, and/or a robots meta tag with "noindex" in the header of the pages, and/or password protection via .htaccess / .htpasswd.
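The three protections mentioned above can be sketched as follows (the file paths are placeholders to adapt to your server). A robots.txt at the root of the pre-prod domain that blocks all crawling:

```text
User-Agent: *
Disallow: /
```

A robots meta tag in the <head> of every pre-prod page, telling search engines not to index it:

```html
<meta name="robots" content="noindex">
```

And Basic Auth via Apache's .htaccess, so that the pre-prod is simply not reachable without a password:

```text
AuthType Basic
AuthName "Pre-production"
# Placeholder path: point this at your actual .htpasswd file
AuthUserFile /path/to/.htpasswd
Require valid-user
```

The password protection is the most reliable of the three, since a page that robots cannot open at all cannot end up in the index.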
