Posted by Tom Anthony
Google's relationship with links has changed over the last 15 years – it started out as a love affair, but nowadays the Facebook status would probably read: "It's Complicated". I think Google are beginning to suffer from trust issues, brought about by well over a decade of the SEO community manipulating the link graph. In this post I'm going to lay out how I think Authorship and Google+ are one of the ways that Google are trying to remedy this situation.
I'll move on to what that means we should be thinking about doing differently in the future, and I'm sharing a free link-building tool you can all try out to experiment with these ideas. The tool will let you see who is linking to you rather than just where the links come from, and will provide you with social profiles for these authors, as well as details of where else they write.
To start I want to quickly look at a brief history of Google's view of links.
Are links less important than they were?
Back in the early days Google treated all links as being equal. A link in the footer was as good as a link in the main content, a link in bad content was as good as a link in good content, and so on. However, then the new generation of SEOs arrived and started 'optimizing' for links. The black hats created all sorts of problems, but the white hats were also manipulating the link graph. What this meant was that Google now had to begin scrutinizing links to decide how trustworthy they were.
Every link would be examined for various accompanying signals, and it would be weighted according to these signals. It was no longer a case of all links being equal. Reciprocal links began to have a diminished effect, links in footers were also not as powerful, and so it went for a variety of other signals. Over the last decade Google have begun using a wide range of new signals for determining the answer to the question they have to answer for every single link: How much do we trust this link?
They've also introduced an increasing number of signals for evaluating pages beyond the link-based signals they started with. If we look at the ranking factors survey results from SEOmoz for 2011, we see that link-based factors make up just over 40% of the algorithm. However, in the 2009 survey they were closer to 55% of the algorithm.
So in the last two years, roughly 15 percentage points of the algorithm's relative importance have shifted from links to other signals. The results are from a survey, but a survey of people who live and breathe this stuff, and it seems to match up well with what the community as a whole believes, and with what we observe in the increasing importance of social signals and the like.
This reduction in the relative power of links seems to imply that Google aren't able to trust links as much as they once did. Whilst it is clear links are still the backbone of the algorithm, it is also clear Google have been constantly searching for other factors to offset the 'over-optimization' that links have suffered from.
Are social signals the answer?
The SEO community has been talking a lot about social signals the last couple of years, and whether they are going to replace links. I'd argue that social signals can tell you a lot about trust, timeliness, perhaps authority and other factors, but that they are quite limited in terms of relevancy. Google still need the links – they aren't going anywhere anytime soon.
To visualise this point in a different way, let's look at a toy example of the Web Graph. The nodes represent websites (or webpages), and the connections between them are the links between these websites:
And a corresponding toy example of the Social Graph:
We can now visualise Social 'Votes' (be they likes/tweets/+1s/pins or shares of some other type) for different websites. We can see that nodes on the Social Graph send their votes to nodes on the Web Graph:
The Social Graph is sending signals over to the websites. They are basically saying 'Craig likes this site', or 'Rand shared this page'. In other words, the social votes are signals about web sites/pages and not about the links — they don't operate on the graph in the same manner as links.
Whilst social signals do give Google an absolute wealth of information, they don't directly help improve the situation with links and how some links are more trustworthy than others.
Putting the trust back into links
So Google have needed to find a way to give people the ability to improve the quality of a link, to verify that links are trustworthy. I believe that verifying the author of a link is a fantastic way to achieve this, and it fits neatly into the model.
In June last year Google introduced rel="author", the method that allows a web page to announce its author by pointing to a Google+ profile page (which has to link back to the site for two-way verification).
With this model it isn't: 'Distilled linked to SEOmoz' but it is 'Tom Anthony linked on Distilled to Rand Fishkin on SEOmoz'. It's the first time there has been a robust mechanism for this.
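To make the mechanism concrete, here's a minimal sketch of how a crawler might pull rel="author" links out of a page's HTML. This is my own illustrative code, not Google's implementation; it uses only Python's standard library:

```python
from html.parser import HTMLParser

class RelAuthorParser(HTMLParser):
    """Collects href values of <a> or <link> tags carrying rel="author"."""

    def __init__(self):
        super().__init__()
        self.author_links = []

    def handle_starttag(self, tag, attrs):
        if tag in ("a", "link"):
            attrs = dict(attrs)
            # rel can hold several space-separated tokens, e.g. rel="author me"
            rels = (attrs.get("rel") or "").split()
            if "author" in rels and attrs.get("href"):
                self.author_links.append(attrs["href"])

def find_author_links(html):
    """Return every rel="author" target declared in the given HTML."""
    parser = RelAuthorParser()
    parser.feed(html)
    return parser.author_links
```

Run against a page that uses the markup, this would return the Google+ profile URL; the crawler would then need to confirm that the profile links back to the site to complete the two-way verification.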
This is incredibly powerful for Google as it allows them to do exactly what I mentioned above – they can now verify the author of a web page. This gives 2 advantages:
- Knowing this is an authored link, by a human they have data about, they can place far more trust in it. It's likely that a link authored manually by a human is of higher quality, and a human is unlikely to claim responsibility for a link if it is spammy.
- Furthermore it allows them to change the weighting of links according to the AuthorRank of the author who placed the link.
The latter point is very important: it could change how links pass link juice. I believe this will shift the link juice model towards:
I've shown it here as a simple multiplication (and without all the other factors I imagine go into this), but it highlights the main principle: authors with a higher AuthorRank (as determined, I'd imagine, by both their social standing and by the links coming into their authored pages) give a bigger boost to the links they place:
The base strength of the link still comes from the website, but Rand is a verified author whom Google know a lot about, and as he has a strong online presence, he multiplies the power of links that he authors.
I'm a less well-known author, so I don't give as much of a boost to my links as Rand would give to his. However, I still give links a boost over those from anonymous authors, because Google now trust me a bit more: they know where else I write, and that I'm active in the niche and on social networks.
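The multiplication model above can be expressed as a toy formula. This is purely illustrative of the principle; the function and the multiplier values are my own assumptions, not Google's actual weighting:

```python
def authored_link_strength(base_value, author_multiplier=1.0):
    """Toy model: a link's base strength comes from the linking
    page/site; a verified author scales it by a multiplier derived
    from their AuthorRank. Anonymous links keep a neutral 1.0."""
    return base_value * author_multiplier

# Illustrative multipliers (made up for the example):
ANON, LESSER_KNOWN, WELL_KNOWN = 1.0, 1.5, 3.0
```

Under this sketch, the same link from the same page would be worth three times as much when authored by a well-known, verified author as when it is anonymous.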
Where to Who
So what does all this imply that you do? The obvious things are ensuring that you (and your clients) are using authorship markup, and of course you should try to become trustable in the eyes of Google. However, if you're interested in doing that stuff, you probably were already doing it.
The big thing is that we need a shift in our mindset from where we are getting links from to who we are getting links from. We still need to do the traditional stuff, sure, but we need to start thinking about 'who' more and more. Of course, we do that some of the time already. Distilled noticed when Seth Godin linked to our Linkbait Guide. I noticed when Bruce Schneier linked to me recently. But we need to begin doing this in a scalable fashion.
With Open Site Explorer, Majestic and many others, we already have a wide array of tools that allow us to look at where we are getting links from in a scalable way.
I hope I've managed to convince you that we need to begin to examine this from the perspective that Google increasingly will be. We need tools for looking at who is linking to who. Here's the thing – all the information we need for this is out there. Let me show you…
Authored links – A data goldmine
We'll examine an example post from Gianluca Fiorelli that he published in December. Gianluca is using Google's authorship markup to highlight that he is the author of this post.
Let's take a look at what information we can pull out from this markup.
The rel author attribute in the HTML source of the page points to his Google+ page, and from there we can establish a lot of details about Gianluca:
From his Google+ profile we can establish where Gianluca lives, his bio, where he works, and so on. We can also get an indicator of his social popularity from the number of Circles he is in, and by examining the other social profiles he links to (for example, following the link to his Twitter profile and seeing how many followers he has).
We've talked a lot in the industry in the last couple of years about identifying influencers in a niche, and about building relationships with people. Yet, there is an absolute abundance of information available about authors of links we or our competitors already have — why are we not using it!?!
All of this data can be crawled and gathered automatically, exactly in the way that Google crawls the authorship markup, which allows us to begin thinking about building the scalable sorts of tools I have mentioned. In the absence of any tools, I went right ahead and built one…
AuthorCrawler – A tool for mining Author Data for Linkbuilding
I first unveiled this tool a couple of weeks ago at LinkLove London, but I'm pleased to release it publicly today. (As an aside, if you like getting exclusive access to cool toys like this then you should check out SearchLove San Fran in June or MozCon in July).
AuthorCrawler is a free, open-source tool that pulls the backlinks to a URL, crawls the authorship markup on the page, and gives you a report of who is linking to a URL. It is fully functional, but it is a proof-of-concept tool, and isn't intended to be an extensive or robust solution. However, it does allow us to get started experimenting with this sort of data in a scalable way.
When you run the report, you'll get something similar to this example report (or take a look at the interactive version) I ran for SEOmoz.org:
It pulls the top 1,000 backlinks for the homepage, then crawls each of them looking for authorship markup. When markup is found, it is followed to crawl the author's data (number of Circles, Twitter followers), and, very importantly, the tool also pulls the 'Contributes to' field from Google+ so you can see where else this author writes. You might find people linking to your site who also write elsewhere, perhaps on more powerful sites, so these are great people to build a relationship with – they are already aware of you, warm to you (they're already linking) and could provide links from other domains.
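The crawl-and-report flow just described can be sketched roughly like this. Note that `find_author` and `fetch_profile` are hypothetical stand-ins for real crawling code (they would hit the web), injected as callables here so the sketch stays offline and testable:

```python
def build_author_report(backlinks, find_author, fetch_profile):
    """Sketch of the AuthorCrawler flow: for each backlink, look for
    authorship markup; where an author profile is found, collect the
    profile data (Circles, Twitter followers, 'Contributes to')."""
    report = []
    for url in backlinks:
        profile_url = find_author(url)
        if profile_url is None:
            continue  # page carries no authorship markup
        profile = fetch_profile(profile_url)
        report.append({
            "link": url,
            "author_profile": profile_url,
            "circles": profile.get("circles", 0),
            "twitter_followers": profile.get("twitter_followers", 0),
            "contributes_to": profile.get("contributes_to", []),
        })
    # surface the most-followed authors first
    report.sort(key=lambda row: row["twitter_followers"], reverse=True)
    return report
```

In the real tool the backlink list comes from a backlink index and the profile fields are scraped from Google+ pages; the structure of the loop is the part worth taking away.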
You can sort the report by the PA/DA of the page where the link was placed, or by the social follower counts of the authors. You can also click through to the authors' Google+ and Twitter profiles to quickly see what they're currently up to.
I'm pretty excited by this sort of report and I think it opens up some creative ideas for new approaches to building both links and relationships. However, I still felt we could take this a little bit further.
I'm sure many of you will know the Link Intersect tool in the Labs section of SEOmoz. It allows you to enter your URL and the URLs of other domains in your niche (most likely your competitors, but not necessarily), and it examines the backlinks to each of these and reports on domains/pages that are linking to multiple domains in your niche. It also reports whether you currently have a link from each page, so you can quickly identify some possible places to target for links. It's a great tool!
So, I took the principle from the link intersect tool and I applied the authorship crawling code to create an Author Intersect tool. It will give you a report that looks like this (you can check the interactive example report also):
Now what you have is really cool – a list of people who are writing about your niche, who are possibly linking to your competitors, and whose social presence you can see at a glance. These are great people to reach out to and build relationships with – they are primed to link to you!
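The intersect logic itself boils down to counting which authors appear across multiple domains. A minimal sketch (my own illustration of the idea, not the tool's actual code):

```python
from collections import Counter

def author_intersect(authors_by_domain):
    """Given {domain: set of author profile URLs found linking to it},
    return the authors who link to more than one domain in the niche –
    prime relationship-building targets."""
    counts = Counter()
    for authors in authors_by_domain.values():
        counts.update(authors)  # sets ensure one count per domain
    return {author for author, n in counts.items() if n > 1}
```

Using sets per domain means an author who links to one competitor many times still only counts once for that domain, so the result really does reflect breadth across the niche.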
The tool is pretty simple to use – if you're unsure there is an instructions page on the site to get you started.
We are in the early days of authorship, but I think Google are going to keep pushing Google+ hard, and I think authorship's importance is only going to increase. Correspondingly, I think tools such as this are going to become an increasing part of an SEO's toolkit over the next 12 months, and I'm excited to see where it goes.
I've only just begun to dig into the ways we can use tools like these – so I'd love to hear from others what they get up to with it. So go and download the tool and try it out. Have fun!