3 Things I Learnt From Spamming Matt Cutts’ Blog

I absolutely love running random experiments in SEO. Even more, I like to keep an eye on other people's experiments, see how they work, and try to replicate them. In my book, it's all part of being a competitive SEO. Keeping this in mind, imagine my pleasure at seeing a particularly interesting SERP engineered by SEOmofo: Worlds Greatest SEO

The Worlds Greatest SEO?

Here is the actual text of the result: "SEOmofo is the World’s Greatest SEO – Matt Cutts / mattcutts.com/files/seomofo-is-worlds-greatest-seo.html / A description for this result is not available because of this site’s robots.txt – learn more." So Matt Cutts is saying SEOmofo is the world's greatest SEO? Let's click on that link: Matt Cutts 404 It's a 404 page. Darren manipulated this result. The question was: could I do it too? And could I do it some other way?

Spamming Matt Cutts

I decided that instead of trying to rank for a generic query, I would try to rank a blocked Matt Cutts URL for my own brand, right under the sitelinks. Guess what? It worked :) Refugeeks recommended by Matt Cutts! See the result for the domain search too: Refugeeks.com recommended by Matt Cutts. What about "Worlds Best SEO Website"? Well, I didn't do as well there, simply because I didn't throw enough link equity at it. It is still climbing for that term, but it might take some more links: worlds best SEO website

How Did I Do It?

It's no wonder that the page is a 404, as it's not really a page that exists on Matt's website. It comes down to the way Google treats URLs blocked with a "Disallow" rule in robots.txt: Google can't crawl the page, but it can still index and rank the bare URL based on external signals such as links.

Matt Cutts Robots.txt File

The meta description doesn't exist, and even if it did, the robots exclusion, which basically says "do not crawl this page", won't allow it to show up. So if the page doesn't exist, how the hell did we get the title tag right? There are two ways of manipulating title tags in SERPs. The first is to manipulate anchor text; the second is to find which URL combinations trigger automatic title tags, though this is CMS dependent and so not easy to do. The URL I used was https://www.mattcutts.com/files/refugeeks.com-is-worlds-best-seo-website.html. What I did was format the anchor text to say "Refugeeks.com is worlds best seo website", and this triggered the title tag.
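
To make the mechanics concrete, here is a minimal sketch of the two ingredients. The robots.txt pattern shown for mattcutts.com and the page hosting my link are assumptions for illustration; only the target URL and the anchor text are from the actual test. The point is that because the blocked URL is never crawled, the anchor text pointing at it becomes the strongest title signal Google has.

```
# Illustrative robots.txt rule on the target domain (assumed pattern):
# everything under /files/ is disallowed, so Googlebot never fetches the page
User-agent: *
Disallow: /files/
```

And the link placed on a page I control, whose anchor text supplies the title shown in the SERP:

```html
<!-- Anchor text formatted to match the title I wanted Google to display -->
<a href="https://www.mattcutts.com/files/refugeeks.com-is-worlds-best-seo-website.html">
  Refugeeks.com is worlds best seo website
</a>
```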

What Did I Learn? (or reconfirm!)

A lot of what follows I was already pretty sure of, but this particular test reaffirms it. Here are the 3 big takeaways from the test.

Google Bombing: Rishi Lakhani Certainly Does

Interestingly, one of the people linking to the site decided to switch their anchor text to see what would happen. The anchor text here was "Rishi Lakhani Certainly Does". What happens when you Google that?

Rishi Lakhani certainly does

I don’t know about you, but I think that’s pretty cool. For those of you who haven’t been in SEO as long as some of us old timers, there was a practice called “Google Bombing” where you link to a particular page with anchor text of choice and let that page rank for that phrase. The most famous example for that was getting the Scientology site ranking for “Dangerous Cult”:

Scientology Google Bombing

The other famous example was ranking ex-President Bush's biography for "Miserable Failure". Although that result has been cleaned out, there seems to be a new target for the phrase (not at the top, but at the bottom of page 10): Michael Moore - Miserable Failure

Here are the backlinks to his site filtered for “miserable” in the anchor text: Michael Moore  - Miserable Failure links

So this tells me that Google Bombing via anchor text IS still possible, though it now takes careful manipulation rather than the brute-force links it used to. Of course, if you want to take it a step further, try Google Bombing 3.0, manipulating Google in a different way (click to see the results yourself!):

Dangerous Cult - Google Bombing 3 point O

Protecting your important private pages and keeping track of them

Most devs would follow Google's suggestions blindly, and this is where having an SEO matters: a good SEO will not only understand Google's rules, but also interpret how they play out in reality, as I have done above. Simply following Google's suggestions on how to keep your important private URLs out of the index can cause issues; you have to make sure the right combination of rules and directives works for you. A robots.txt disallow is NOT the best way to keep content out of Google: it stops crawling, not indexing.

The best way to keep such pages out of the results is a page-level meta "noindex", though there are other methods to stop indexation. Of course, if the content is already indexed, you may also want to use a removal request.
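
For reference, here is what that page-level directive looks like, either as a meta tag in the page's head or as an HTTP response header (both are standard robots directives). The crucial detail, and the "right combination" point above, is that the page must not also be disallowed in robots.txt, otherwise Googlebot can never crawl it to see the noindex.

```html
<!-- In the <head> of the page: allow crawling, but keep the page out of the index -->
<meta name="robots" content="noindex">
```

Or, for non-HTML files such as PDFs, the equivalent HTTP header:

```
X-Robots-Tag: noindex
```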

It is advisable to monitor these URLs for indexation on an ongoing basis. Of course, if you didn't realise you had these URLs, you can't monitor them. But take a quick look at your robots.txt file, see which folders are disallowed, and simply run a site: search on each of them:

site command for blocked directories
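
If you want to semi-automate that check, here is a minimal sketch in Python: it fetches your robots.txt, pulls out the Disallow paths and prints the site: queries to run by hand. The domain is a placeholder, and the line-by-line parsing assumes the usual one-directive-per-line robots.txt format.

```python
import urllib.request

# Placeholder domain -- swap in your own site
DOMAIN = "www.example.com"

def disallowed_paths(domain):
    """Fetch robots.txt and return the paths listed in Disallow rules."""
    url = f"https://{domain}/robots.txt"
    with urllib.request.urlopen(url) as resp:
        lines = resp.read().decode("utf-8", errors="replace").splitlines()
    paths = []
    for line in lines:
        line = line.split("#", 1)[0].strip()      # drop trailing comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:                              # a bare "Disallow:" allows everything
                paths.append(path)
    return paths

if __name__ == "__main__":
    # Each query surfaces URLs Google has indexed from a blocked folder
    for path in disallowed_paths(DOMAIN):
        print(f"site:{DOMAIN}{path}")
```

Paste each printed query into Google; any results that come back are blocked URLs that have nevertheless made it into the index.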

Reputation Management 4.0?

As you can see, I got Matt's site ranking for my brand name. I must note, of course, that the search volume and the number of competing pages for the query are very low. However, if you subscribe to the theory of "domain strength", whereby it is easier to rank content from a powerful domain (and I do!), then you could potentially use this tactic to rank a number of blocked URLs from high-authority domains for your brand name.

Rep man 4 point oh

The reasons why this *could* be an easier strategy:

  • A strong domain may mean easier rankings.
  • Because Google doesn't crawl the actual content on these blocked URLs, there is no need for unique content. Typically, when you set up multiple sites and social profiles to rank for a phrase purely for reputation management, you have to build a fair bit of unique content relating to the phrase, and often you have to keep those social profiles relatively active. In this situation, you need not bother.
  • You don't need permission to rank those URLs, which means the whole web could be your playground: all you need to do is find the RIGHT combination of site and robots.txt directives.
  • You could employ cheap, spammy links to get the URL ranking, greatly reducing your cost.

I haven’t actually tried this technique, but I think it *could* work. I do have to warn you, this is grayhat, and probably not for those who like to stick to the white side of the fence.

Summary

Although nothing I did here was revolutionary, it does show that a few very simple tests can teach you a lot about the way search engines work. I always advise people to run their own tests, or to replicate other people's, so that they learn techniques themselves rather than just relying on what they read. Also, Matt, if you are reading this: sorry, I'm taking the links off now :)

Please share and rate this post if you enjoyed it :)

Rishi has been a consultant in online marketing for over 10 years, specialising in SEO, PPC, Affiliate Marketing, Email and Social Media. Over the years he has worked with many brands as well as many small businesses.