How Should Blogs be Ranked?
I came across a recent article in Marketing Magazine (our nemesis for the time being...) about their favorite marketing and advertising blogs. What interested me wasn't the list itself, but how the authors chose the blogs in the first place. In typical fashion, the picks were simply listed under a headline - no reasoning or descriptions provided.
While I agree with a number of their choices, I'd like to know a bit more about the blogs they considered but that didn't make the final 'cut'. (yes...this is AdJoke related...)
The list aside, however, it got me thinking about how blogs should be ranked. Sites like Technorati or Google Blog Search provide users with different methods of ranking and creating lists - from authority scores to tag-based results. Alexa provides a decent look at web numbers in terms of traffic and page views, but is this really the furthest we've come?
Obviously, any list should be based on a combination of a few key metrics (daily uniques, bounce rate, long-term return visits, tagging, etc.). But I also think we should have a forum of 'marketing experts' - people like Seth Godin, Armano, Mitch Joel, Advergirl - who maintain a single list, updated monthly, of the blogs they find most valuable for keeping themselves informed and educated on everything marketing and ad related.
Think about how cool that would be for ad folks out there. I'd much rather read about what the top folks in the industry find interesting than the top 10 results that come up for 'Advertising blogs' in a Google search. Plus, I don't think it would be that hard to do.
Ad Age's Power 150 ranking includes a component where one guy (Todd) reads every blog on the list (there are over 960) each month and gives each blog a personal score out of 20 (it can change on a monthly basis if the content gets better or worse...pretty much whatever Todd thinks).
The total ranking uses a number of different metrics (like Google, Yahoo tags, Technorati score, etc.), but I like the idea that there is some "Super-blog reader" out there who influences the final outcome of the list.
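Just to make the idea concrete, a composite score like that could be sketched roughly as below. The metric names, weights, and sample numbers are my own invention for illustration - this is not Ad Age's actual formula:

```python
# Hypothetical sketch: blend automated metrics with a human reviewer's
# score (like Todd's "out of 20") into one composite ranking.
# All weights and sample values are made up for illustration.

def composite_score(metrics, human_score, human_weight=0.3):
    """metrics: dict of metric name -> value, each normalized to 0..1.
    human_score: a reviewer's score out of 20."""
    auto = sum(metrics.values()) / len(metrics)  # average of the normalized metrics
    human = human_score / 20.0                   # scale the 0-20 score to 0..1
    return (1 - human_weight) * auto + human_weight * human

# Two fictional blogs: one with modest traffic but a strong human score,
# one with big numbers but a weaker review.
blogs = {
    "AdJoke":  ({"technorati": 0.4, "inbound_links": 0.3, "uniques": 0.2}, 13),
    "BigBlog": ({"technorati": 0.9, "inbound_links": 0.8, "uniques": 0.9}, 9),
}

ranked = sorted(blogs, key=lambda b: composite_score(*blogs[b]), reverse=True)
```

The point of the `human_weight` knob is exactly the argument above: turn it up and the "Super-blog reader" matters more; turn it to zero and you're back to raw traffic numbers.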
Although AdJoke isn't very high in the rankings, Todd has given us a 13 out of 20. Considering that only 5 of the top 50 blogs have a higher Todd score (14 is the highest I've seen), that's not bad...not bad at all.
What's the point of all this? That traditional metrics aren't enough. We need a human factor, because the quality of the content can't simply be defined by unique visits or page views.
There are thousands of great blogs out there, and most might never be discovered. Send us any that you love, that aren't in the top 10 of some random list, and that are worth reading. We promise to share them (in no particular order, of course).