  • Pesky Guest is being tricky??

    Ok what's going on? Last time I had my forums I woke up one morning... and I had 2 members online and 38 guests (all with a similar IP down to the last digit place X.X.X.#), most doing stuff that guests shouldn't be doing... "Moderating Duties", "Editing Posts", "Quoting Post", "Viewing Memberlist", "Viewing User Control Panel"

    So I blocked the IP, turned off the forums, etc., and the dude stuck around! And about 2 hours later my site went down... without a trace...

    Now I have a new set of forums at my new host for my site... and immediately it's happening again... I've tried the same things and, well, little has happened. I deleted some of the files for a bit to get him/her away, and the most he/she has gotten to was 2 at a time (people on at the same time)... I'm kinda worried this time.

    My site is: (if it doesn't work for you, it's because of your ISP, try

    Attached is an image of the problem...

    I am Rob, my host who I made an Admin is Moiph

    Help please? (PS: Guests can't do anything with posts or check the memberlist)
    Last edited by Forest Sage; Sat 9 Aug '03, 6:48pm.

  • #2
    I have the exact same problem! 30-odd guests, all with similar IPs, doing stuff they shouldn't! Help please!

    And the IPs are similar to the ones giving Forest Sage problems.
    Last edited by jon fuller; Sun 10 Aug '03, 12:13am.


    • #3
      Do NOT block them. Chances are they are search engine spiders, and blocking them will prevent them from indexing your board.


        • #5
          If you had left in the IP of the guest visiting your forums, we could have better told you whether it was a search engine spider. But you also say your forums went down; that is more worrying. Any idea why that happened?

          Blocking an IP only stops registration. If you want to stop search engines and the like entirely, put a robots.txt file in your public_html folder with the content below.

          User-agent: * # applies to all robots
          Disallow: / # disallow indexing of all pages
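
          The example above shuts out every robot. If you only wanted to turn away one misbehaving crawler while still letting the big search engines index the board, robots.txt can also target crawlers by name. A sketch (the bot name here is a placeholder, not the actual spider from this thread):

          User-agent: ExampleBot # applies only to the crawler with this name
          Disallow: /            # that crawler may fetch nothing

          User-agent: *          # every other robot
          Disallow:              # an empty Disallow means nothing is blocked

          Note that well-behaved robots honour robots.txt voluntarily; a hostile visitor will simply ignore it.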


          • #6
            It also may have been because of another site on the server (Zelda Universe) going down...

            But my host said that also didn't happen, and that the guy never renewed his server purchase... which just confuses me. Now that I've found out that it was a search engine robot, I'm comforted... but I made a robots.txt, and they're not stopping.


            • #7
              They cache robots.txt and only fetch it again before each new spidering attempt, so a new file won't take effect until the current crawl is over.

              You'll just have to wait until this one is finished unless you just block them.
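
              If you did decide to block the spider at the web server rather than waiting, and assuming your host runs Apache with .htaccess overrides enabled (typical for shared hosting of that era), a sketch of an .htaccess in the forum directory would look like this. The address range is a placeholder, since the actual IPs aren't shown in the thread:

              # Deny one address range, allow everyone else (Apache mod_access)
              Order Allow,Deny
              Allow from all
              Deny from 192.0.2.0/24   # substitute the spider's actual range

              Unlike the vBulletin IP ban, this refuses the connection at the server, so the guest never reaches the board at all.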
              Scott MacVicar


