Video: HTTP vs HTTPS – Duplicate content issues


The solution using .htaccess


# Serve a separate robots file on the HTTPS port so search engines
# stop crawling the https version and duplicate content is avoided.
RewriteEngine On
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^robots\.txt$ robots_ssl.txt [L]
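
For this to work, robots_ssl.txt (the file name referenced in the rule above) has to exist in the site root. A minimal version that blocks all crawling of the HTTPS copy would be:

# robots_ssl.txt – served instead of robots.txt on port 443
User-agent: *
Disallow: /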

There are other ways to solve it too, such as adding a noindex meta tag with PHP:

<?php
// On the HTTPS port, tell search engines not to index this copy of the page.
if ($_SERVER["SERVER_PORT"] == 443) {
    echo '<meta name="robots" content="noindex,nofollow">';
}
?>
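
Depending on the server setup, HTTPS is not always signalled by the port alone; many PHP environments also set $_SERVER['HTTPS']. A slightly more defensive version of the same check (a sketch, not tied to any particular host) could look like:

<?php
// Treat the request as HTTPS if either the HTTPS flag or the port says so.
$isHttps = (!empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off')
        || (isset($_SERVER['SERVER_PORT']) && $_SERVER['SERVER_PORT'] == 443);

if ($isHttps) {
    // Keep the HTTPS copy of the page out of the index.
    echo '<meta name="robots" content="noindex,nofollow">';
}
?>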

You can even use canonicalization, but it is not always an option, since you may want both URLs to remain accessible. A minimal PHP sketch is shown below.
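
With the canonical approach, the HTTPS pages stay reachable but point back at their HTTP counterparts. A minimal sketch (example.com is a placeholder for your own domain):

<?php
// On the HTTPS copy, declare the HTTP version as the canonical URL.
if ($_SERVER["SERVER_PORT"] == 443) {
    echo '<link rel="canonical" href="http://www.example.com'
       . htmlspecialchars($_SERVER['REQUEST_URI']) . '">';
}
?>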

4 Replies to “Video: HTTP vs HTTPS – Duplicate content issues”

  1. Great videos! You mention running your site through a webmaster tool to see how it’s reading the robots.txt files; do you have a tool you would recommend?

    Thanks!

  2. Hi Miles, great to have your comment.

    Google provides Webmaster Tools, which is really great; you can check and verify in detail: https://www.google.com/webmasters/tools/ (create an account if you haven’t).

    I am working on a new project based on supplemental results for WordPress. I am using http://www.meabhi.com/blog/find-supplemental-ratio/ for the project. One shouldn’t miss this experiment.

    Please let me know if I can be of any further help.

    Regards,
    Aji aka AjiNIMC

  3. Thanks a lot for the tips! I have a similar problem with my https version being displayed, which I don’t want.

    I am not sure I understand where you would add the PHP code you mentioned.

    thanks
    Gabi.
