Robots Directives

  • 3 Things I Learnt From Spamming Matt Cutts’ Blog

    … that the right combination of rules and directives works for you. A robots.txt disallow is NOT the best way to keep content out of Google. The best way to get rid of such pages is via a page-level meta “noindex”, though there are other methods to keep indexation from happening. Of course, if the content is already indexed, you may want to use…

    RefuGeeks in SEO - 2 readers
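
    The distinction the excerpt draws can be sketched as follows. This is a minimal illustration, not taken from the original post: a robots.txt `Disallow` only blocks crawling (an already-known URL can still appear in results), while a page-level meta “noindex” asks engines to drop the page from the index — which only works if the page remains crawlable.

    ```
    # robots.txt — blocks crawling of /private/, but does NOT guarantee
    # the URLs stay out of the index if they are linked elsewhere.
    User-agent: *
    Disallow: /private/

    <!-- Page-level directive (in the page's <head>) — the crawler must be
         able to fetch the page to see this tag, so do NOT also block it
         in robots.txt: -->
    <meta name="robots" content="noindex">
    ```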