  • Way to avoid being penalised by google and yahoo

    If a search engine such as Google or Yahoo delists or penalises a site, it can kill that site. For a forum, it is difficult to control the content and outbound links in members' posts. Is there any way to avoid being penalised by Google and Yahoo?

    One case comes to mind: if a member posts a bad link (i.e. a link to a site that has already been penalised by a search engine, or in the extreme a link farm), what will the search engines do to my forum?

    Are there any other cases that would lead to a penalty?
    May we discuss them here? Thank you.

  • #2
    Personally I see SEO as modern-day snake oil. Just build content and the backlinks will come naturally.

    • #3
      Originally posted by TruthElixirX
      Personally I see SEO as modern-day snake oil. Just build content and the backlinks will come naturally.
      Yep, I agree with that

      • #4
        Don't worry too much about it

        • #5
          Originally posted by funfun168
          One case comes to mind: if a member posts a bad link (i.e. a link to a site that has already been penalised by a search engine, or in the extreme a link farm), what will the search engines do to my forum?
          Add rel="nofollow" to all such links.

          • #6
            Originally posted by Icheb
            Add rel="nofollow" to all such links.
            Yes, I think it is necessary to "kill" such outbound links for our own good.

            For example, www.yahoo.com

            How do I remove an outbound link like this while preserving its meaning?
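            One rough way to do that, sketched here purely for illustration (again, not vBulletin code): strip the anchor tag and keep its visible text, so readers still see the URL but it is no longer a hyperlink.

            import re

            def delink(html: str) -> str:
                """Replace every <a ...>text</a> with just its visible text.
                (A real version would first check that the href points off-site.)"""
                return re.sub(r'<a\s[^>]*>(.*?)</a>', r'\1', html, flags=re.S)

            print(delink('Visit <a href="http://www.yahoo.com">www.yahoo.com</a>'))
            # -> Visit www.yahoo.com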

            • #7
              Originally posted by funfun168
              If a search engine such as Google or Yahoo delists or penalises a site, it can kill that site. For a forum, it is difficult to control the content and outbound links in members' posts. Is there any way to avoid being penalised by Google and Yahoo?

              One case comes to mind: if a member posts a bad link (i.e. a link to a site that has already been penalised by a search engine, or in the extreme a link farm), what will the search engines do to my forum?

              Are there any other cases that would lead to a penalty?
              May we discuss them here? Thank you.
              Use the rel="nofollow" attribute on links. Currently this is hard-coded (I do not know if 3.6.0 changes this), so it is impossible to change without modifying files, which would nullify support from Jelsoft.

              • #8
                Originally posted by Shining Arcanine
                Use the rel="nofollow" attribute on links. Currently this is hard-coded (I do not know if 3.6.0 changes this), so it is impossible to change without modifying files, which would nullify support from Jelsoft.
                I think I saw a product on vb.org that added a rel="nofollow" to all external links.

                • #9
                  Hell... the real problem arises with duplicate content and numerous URLs pointing to the same file. I asked whether a solution I came up with would work... but have yet to hear a response. Personally... I can't believe no one has really looked at this:

                  http://www.vbulletin.com/forum/showthread.php?t=187235
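                  To make the "many URLs, one thread" point concrete, here is a rough sketch of the idea (the parameter names are assumptions for illustration, not vBulletin's actual list): strip the query parameters that only change presentation, so the duplicate URLs collapse to one canonical form.

                  from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

                  NOISE_PARAMS = {"highlight", "s", "styleid", "goto"}  # assumed "noise" keys

                  def canonical(url: str) -> str:
                      """Drop presentation-only query parameters and sort the rest."""
                      parts = urlparse(url)
                      kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in NOISE_PARAMS]
                      return urlunparse(parts._replace(query=urlencode(sorted(kept))))

                  a = canonical("http://example.com/showthread.php?t=187235&highlight=seo")
                  b = canonical("http://example.com/showthread.php?s=abc123&t=187235")
                  print(a == b)  # -> True: both reduce to .../showthread.php?t=187235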

                  • #10
                    We're not looking at it because Google says the duplicate content issue isn't real; it's been posted on their development blog that the algorithms they use are smart enough to know that it's not real duplicate content.

                    Read http://www.bytestart.co.uk/content/p...-content.shtml

                    Then read http://www.searchenginepromotionhelp...nalty-real.php

                    What is the proof that duplicate content penalties actually exist?

                    I'm all for making things easier for a search engine, but going through the files, affecting usability by removing features that some people use, making the code more complex and potentially using more resources, all based on something that may or may not exist?

                    Google implements a filter, not a penalty: only one copy of duplicate content will be shown.
                    Scott MacVicar

                    My Blog | Twitter

                    • #11
                      I am well aware of the status of duplicate content with Google. However, the problem is the "filtering process" you mentioned... and that is the Supplemental Index, which as we all know is not included in the main search results, since everything in that index is deemed "untrustworthy".

                      Furthermore, if Google notices that a large portion (say 80%) of your website is being filtered into the supplemental index (which is exactly what is happening to my forums), then Google will deliberately slow its crawling of your links and may even halt deeper crawling of certain sections of the site, because it is a complete waste of Googlebot's time to index a site when 80% of its crawl will be dumped into the supplemental index.

                      This is the very reason a lot of dynamic sites, such as e-commerce sites, have a very hard time getting Google to crawl deep pages past the initial 30% scrape. I am certainly not trying to bust anyone's balls here, so don't get me wrong. But I do want to at least point out that there are "deficiencies" that can be cured.

                      • #12
                        Originally posted by kevinmanphp
                        I am well aware of the status of duplicate content with Google. However, the problem is the "filtering process" you mentioned... and that is the Supplemental Index, which as we all know is not included in the main search results, since everything in that index is deemed "untrustworthy".

                        Furthermore, if Google notices that a large portion (say 80%) of your website is being filtered into the supplemental index (which is exactly what is happening to my forums), then Google will deliberately slow its crawling of your links and may even halt deeper crawling of certain sections of the site, because it is a complete waste of Googlebot's time to index a site when 80% of its crawl will be dumped into the supplemental index.

                        This is the very reason a lot of dynamic sites, such as e-commerce sites, have a very hard time getting Google to crawl deep pages past the initial 30% scrape. I am certainly not trying to bust anyone's balls here, so don't get me wrong. But I do want to at least point out that there are "deficiencies" that can be cured.
                        Sources?

                        • #13
                          I'd like to read your sources on this too; you quote figures and behaviours, but I presumed this information was only available to Google engineers.

                          I'm just curious about what is truth and what is myth, since most of the SEO documents are either syndicated from another site or appear to be pure speculation.
                          Scott MacVicar

                          My Blog | Twitter
