How To Get Rid of Duplicate Content - A Solution

  • #1

    Well... now that Googlebot accepts wildcards in robots.txt, I have put together some disallows that will hopefully solve all of vB's duplicate-content penalties (pages dropped into the supplemental index). One or two dupes are no big deal, but if Google picks up every link variation on your forums like it did on mine, you can end up with 8+ dupes of the same page, and that is NO GOOD: they all go to the supplemental index.

    Someone please check this robots.txt entry and let me know if it is safe to use. I am not too hip on robots exclusion syntax and would appreciate a friendly "looks good to me" for the lines below.

    NOTE: I WANT to have showthread.php indexed... just without the params listed below:

    Code:
    User-agent: Googlebot
    # goto= jumps (e.g. goto=newpost) land in a thread the canonical URL already shows
    Disallow: /*goto=
    # mode= switches linear/threaded/hybrid display of the same posts
    Disallow: /*mode=
    # showthread.php?p= is a per-post permalink into a thread
    Disallow: /*showthread.php?p=
    # pp= changes posts-per-page, re-paginating the same thread
    Disallow: /*&pp=
    # postcount= targets one post's position in a thread
    Disallow: /*postcount=
    # daysprune= and sort= filter/reorder the same forum listing
    Disallow: /*daysprune=
    Disallow: /*&sort=
    Any flaws with this? Thanks!
    Last edited by kevinmanphp; Wed 14 Feb '07, 6:47am.
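
    If you want to sanity-check the matching before trusting it, you can mimic Googlebot-style wildcard rules in a few lines of Python. This is only a sketch under assumptions not in the post above ('*' matches any run of characters, and patterns anchor at the start of the URL path); the URLs and IDs are made up for illustration:

    Code:
    import re

    # Patterns from the robots.txt draft above.
    DISALLOWS = [
        "/*goto=", "/*mode=", "/*showthread.php?p=",
        "/*&pp=", "/*postcount=", "/*daysprune=", "/*&sort=",
    ]

    def to_regex(pattern):
        # Escape regex metacharacters, then turn the escaped '*'
        # back into '.*' so it behaves as a wildcard.
        return re.compile("^" + re.escape(pattern).replace(r"\*", ".*"))

    RULES = [to_regex(p) for p in DISALLOWS]

    def blocked(path_and_query):
        return any(r.match(path_and_query) for r in RULES)

    # Hypothetical vBulletin URLs; IDs are made up.
    for url in [
        "/showthread.php?t=12345",               # canonical thread URL
        "/showthread.php?p=99999",               # per-post permalink
        "/showthread.php?t=12345&pp=40",         # posts-per-page override
        "/showthread.php?goto=newpost&t=12345",  # new-post jump
    ]:
        print(url, "->", "blocked" if blocked(url) else "allowed")
    Only the first URL comes back "allowed", which matches the stated goal of keeping the plain showthread.php thread URLs indexable.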

  • #2
    Just curious, what will showthread.php do without parameters?

    • #3
      showthread.php is the file that serves up all the posts in a thread. The problem is that it accepts a lot of different parameters that can display the same content in slightly different ways.
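
      To make that concrete, here are a few URL variants (thread and post IDs made up for illustration) that can all render essentially the same thread:

      Code:
      showthread.php?t=12345                 (canonical thread view)
      showthread.php?p=99999                 (permalink to a single post in the thread)
      showthread.php?t=12345&goto=newpost    (jump to the first unread post)
      showthread.php?t=12345&pp=40           (different posts-per-page pagination)
      showthread.php?t=12345&mode=threaded   (threaded instead of linear display)
      To a crawler each of those is a distinct URL, which is exactly the duplication the robots.txt in the first post tries to block.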

      • #4
        Ok I know I'm late to the party...
        Not sure how big a flaw it is for you, but with that robots.txt you'll lose the value of any third-party links that match those lines. And there could be quite a lot of them; people tend to bookmark and re-use some really weird URLs.
