It is symptomatic of the era we’re living in that websites like Twitter or Summly can attract massive investments while traditional content creators worldwide struggle to make ends meet, cutting back headcounts and reducing editorial budgets.
A quick look at Alexa’s list of the biggest websites in the world shows that there are only four English-speaking news websites in the top 100 (HuffPo, BBC, CNN and About), with none of them featuring in the top 50.
Clearly the internet has moved on from its early days as a repository of “dumb” information silos to one where information is produced overwhelmingly by “the crowd” and by a growing armada of automated entities (bots, avatars, scrapers, spiders and sensors).
Indeed, some of the biggest websites in the world thrive on “outsourcing” content production to the crowd: Wikipedia, Twitter, Facebook, Tumblr and Wordpress.
Others, like Google, try to make the most of organizing that data in a meaningful way, while still others, like Flipboard, Summly and Flite, try to do a better job of presenting content produced by third parties.
It’s interesting to see that Yahoo! acquired BuzzTracker, a news aggregator, back in 2007, merged it into Yahoo! News and later shifted its focus to producing more original content. It also acquired Associated Content, a content farm, in 2010, renaming it Yahoo Voices.
To be clear, there’s nothing wrong with aggregating content from content producers. News is very rarely original or exclusive and, more often than not, it will be rehashed, reworked, “respun” and regurgitated by multiple news sources during its life cycle.
Content producers and distributors do, however, need to embrace rather than combat these practices, as they are the new norm.
But my fear is that the urge for data-driven content production brought about by aggregators, search engines and scrapers makes the mere process of economically producing quality news a daily struggle.