ROBOTS.TXT help please

  • ROBOTS.TXT help please

    My site is getting HAMMERED by Inktomi (whatever that is) and I want it to slow down. Will this slow it down???


    # robots.txt for http://www.yesfans.com/
    # mail [email protected] for constructive criticism

    User-agent: Slurp
    Crawl-delay: 60
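    # Crawl-delay is in seconds, so this asks Yahoo's Slurp crawler to wait 60 seconds between fetches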
    Tim L
    www.yesfans.com

  • #2
    It should, but remember it may take up to 30 days for the spider to re-fetch your robots.txt file.

    • #3
      30 days for that one to see the TXT and adjust?
      Tim L
      www.yesfans.com

      • #4
        Can I block the site so unregistered users can't see it unless they are a member and log in, and will this specific Slurp see that robots.txt and slow down??
        Tim L
        www.yesfans.com

        • #5
          If you only allow registered users to view the forums, the bots will not be able to spider any content at all.

          Yes, most SE bots only re-check robots.txt once every few weeks.

          • #6
            I thought I would add my question to this since it's the same subject. I am getting hammered too: 175+ Yahoo Slurp spiders on top of 100-150+ regular users is slowing things down a bit. What can I do to immediately get rid of Yahoo? Just change guest viewing permissions? I hate to do that as we have 30-40 legitimate guests viewing our content, but I need Yahoo gone. Any suggestions?

            • #7
              I added a disallow for the IP# to an .htaccess file and the Slurp spider immediately started disappearing. Next question, if anyone knows: if I leave it like this for just a few days, will it hurt me at all next month when Slurp comes around again?
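
              For anyone wanting to try the same thing, an .htaccess deny along those lines would look roughly like the sketch below, assuming Apache's standard Order/Allow/Deny directives and using a placeholder range (192.0.2.0/24) instead of the real Slurp IPs:

              # Rough sketch: deny requests from a crawler's address range
              # (192.0.2.0/24 is only a placeholder, not Slurp's actual IPs)
              Order Allow,Deny
              Allow from all
              Deny from 192.0.2.0/24

              Unlike robots.txt, a deny like this takes effect on the very next request, which matches the spider dropping off right away.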
