Robots Directives

3 Things I Learnt From Spamming Matt Cutts’ Blog

I absolutely love running random experiments in SEO. Even more, I like to keep an eye on others’ experiments, see how they worked, and try to replicate them. In my book, it’s all part of being a competitive SEO. With that in mind, imagine my pleasure at seeing a particularly interesting SERP by SEOmofo: The World’s Greatest SEO? Here is the actual text: SEOmofo is the World’s Greatest SEO &...

Using URL Shorteners, 301 and 302 Redirects to Spam Google

It has been a while since I have covered black-hat tactics, partly because I have been busy, but partly because I didn’t want to be part of the problem by promoting them. However, I decided I would rather cover them, so that sites that fall prey to these tactics can take preventative action. Typically, sites that link out but may not want to pass equity, or that want to track outbound clicks or cloak the URLs, would...
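For context on the mechanics this excerpt touches on: a 301 is a permanent redirect (search engines typically consolidate link equity onto the target), while a 302 is temporary (the original URL historically stays in the index). Below is a minimal sketch of the kind of redirect endpoint that URL shorteners and outbound-click trackers use; the /out/<key> path and the DESTINATIONS map are hypothetical, purely for illustration.

    # Minimal sketch of a redirect endpoint, as used by link
    # shorteners and outbound-click trackers. The /out/<key> path
    # and DESTINATIONS map are hypothetical examples.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    DESTINATIONS = {
        "example": "https://example.com/landing-page",
    }

    class RedirectHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Expect paths like /out/example
            key = self.path.rsplit("/", 1)[-1]
            target = DESTINATIONS.get(key)
            if target is None:
                self.send_error(404, "Unknown link")
                return
            # 301 = permanent (equity is consolidated onto target);
            # 302 = temporary (original URL is kept in the index).
            # A click tracker would typically log the hit here,
            # then issue a 302.
            self.send_response(302)
            self.send_header("Location", target)
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("localhost", 8080), RedirectHandler).serve_forever()

The choice of status code is exactly what the tactics in the post exploit: which URL the engine indexes, and where the equity flows, depends on whether the hop is declared permanent or temporary.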

Search Results pages – Should we “No Index” them in Robots txt files?

I often get questions from SEOs and people working on SEO for sites. One interesting question that I see time and time again is: “Should we use ‘No Index’ for stuff we no longer want search engines to show in the results pages?” If this were a few years back, my standard response would have been to use the “Disallow” line in robots.txt. You can see that recommendation from Mat...
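For reference, the two mechanisms this excerpt contrasts look like the following. This is a minimal sketch, assuming the site’s internal search results live under a hypothetical /search/ path:

    # robots.txt: blocks crawling of the hypothetical /search/ path.
    # Note that a disallowed URL can still end up indexed if other
    # pages link to it.
    User-agent: *
    Disallow: /search/

The alternative is a meta robots tag in the <head> of the result pages themselves:

    <!-- Leaves the page crawlable, but asks engines to drop it
         from the index -->
    <meta name="robots" content="noindex">

The key distinction is that Disallow prevents crawling, not indexing, while noindex requires the page to be crawled to be seen. Combining both on the same URL is self-defeating: a crawler that is forbidden to fetch the page can never read its noindex tag.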