Rambettina238: How do duplicate file and directory names affect SEO?

@Rambettina238

Posted in: #301Redirect #Filenames #Php #Seo #SharedHosting

I have some 40 crawl errors on my website; they point to files that do not exist, and I want to resolve all of the 404 errors.

I don't have access to the .htaccess file because I am on shared hosting.

So I thought I would create the missing files at the corresponding paths and have each one issue a 301 redirect.

Examples:

Correct URL: www.myapp.com/folder/oracle-sql-course.php
404 error URL: www.myapp.com/foder/sub_folder/oracle-sql-course.php
404 error URL: www.myapp.com/foder/oracle-sql-course.php
Like this, I have around 40 crawl errors, so I would have to recreate folders with the same names at different levels, and the same files as well.

Do these changes affect SEO?


2 Comments


 

@Jamie184

Just to clarify... 404 crawl errors are not necessarily a bad thing if the content genuinely does not exist. That is what 404s are for. You will not get penalised for having legitimate 404s.

However, in this case it looks as if these are typos in the source URLs, or people and search engines are linking to content that has moved. If so, you are potentially missing out on visitor traffic, and you should 301 redirect to the correct URL. Suppressing the 404 by any other means will not help.

It is OK to create a file at the "incorrect URL" that 301 redirects to the correct URL. This will resolve your 404 issue. However, it is a lot more work than using .htaccess, harder to maintain, and messy.
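
For illustration only, a minimal sketch of such a placeholder file, assuming PHP (per the question's tags); the paths are the asker's examples and the https scheme is an assumption:

<?php
// Hypothetical placeholder saved at the incorrect path, e.g. /foder/oracle-sql-course.php.
// It serves no content of its own; it only sends a permanent (301) redirect and stops.
header("Location: https://www.myapp.com/folder/oracle-sql-course.php", true, 301);
exit;

Each of the ~40 incorrect paths would need its own copy pointing at its correct URL, which is why this approach gets messy compared to a few lines in .htaccess.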

Provided you do a 301 redirect, and do not simply serve content from the "incorrect URL", users and search engines won't actually know whether a file is stored there or not (it will be indistinguishable from a redirect in .htaccess). However, do not simply serve a duplicate file from the "incorrect URL", as that results in duplicate content and could cause you more problems in the long run.
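
For comparison, the .htaccess equivalent (not an option here, since the asker has no access to it) would be one line per broken URL; a sketch only, using the question's example paths with the scheme assumed:

# Hypothetical mod_alias rule in .htaccess
Redirect 301 /foder/oracle-sql-course.php https://www.myapp.com/folder/oracle-sql-course.php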



 

@BetL925

Avoiding 404 crawl errors is good practice.

If you don't have access to .htaccess and you can't correct them by hand, you can block these URLs in robots.txt with Disallow rules, which is simple for only 40 crawl errors. After that, if any of these bad URLs are indexed in the SERPs, you can request their removal in Webmaster Tools to resolve the crawl errors.
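
For example, since all of the broken URLs in the question sit under the mistyped /foder/ path, a robots.txt sketch (paths taken from the asker's examples, not a confirmed layout) could be:

# Hypothetical rule: stop crawlers from requesting the mistyped /foder/ paths
User-agent: *
Disallow: /foder/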

If these files create duplicate content on your website, resolving them can also help you avoid an SEO penalty.
