Video: Http Vs Https – Duplicate content issues


The solution using .htaccess


# Serve a separate robots file for HTTPS requests so crawlers stop
# crawling the https:// version and causing duplicate content.
RewriteEngine On
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^robots\.txt$ robots_ssl.txt [L]
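For this rule to work, a robots_ssl.txt file must exist next to the regular robots.txt. A minimal version that blocks all crawling of the HTTPS site might look like this (the file name comes from the rule above; the contents are a common pattern, not taken from the original post):

```
# robots_ssl.txt - served only for HTTPS requests
User-agent: *
Disallow: /
```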

There are other ways of solving it too, for example emitting a noindex meta tag from PHP whenever the page is served over HTTPS:

<?php
if ($_SERVER["SERVER_PORT"] == 443) {
    echo '<meta name="robots" content="noindex,nofollow">';
}
?>

You can even use canonicalization, but it is not always suitable when you want both URLs to remain accessible.
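As a sketch of the canonical approach (example.com is a placeholder domain, not from the original post), both the HTTP and HTTPS versions of a page would carry the same tag pointing search engines at the HTTP URL:

```html
<link rel="canonical" href="http://www.example.com/page.html">
```

The tag tells search engines which URL to index, while both versions stay reachable for visitors.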

4 Replies to “Video: Http Vs Https – Duplicate content issues”

  1. Hi Miles, great to have your comment.

Google provides Webmaster Tools, which is really great; you can check and verify things in detail: https://www.google.com/webmasters/tools/ (create an account if you haven’t)

I am working on a new project based on supplemental results for WordPress. I am using http://www.meabhi.com/blog/find-supplemental-ratio/ for the project. One shouldn’t miss this experiment.

    Please let me know if I can be of any further help.

    Regards,
    Aji aka AjiNIMC

  2. Thanks a lot for the tips! I have a similar problem with my https version being indexed, which I don’t want.

    I am not sure I understand where you would add that PHP code you mentioned.

    thanks
    Gabi.
