It seems quaint to imagine now, but the original vision for the web was not an information superhighway. Instead, it was a newspaper that fed us only the news we wanted. This was the central thesis brought forward in the late 1990s and prophesied by thinkers like Bill Gates – who expected a beautiful, customized “road ahead” – and Clifford Stoll, who saw only snake oil. At the time, it was the most compelling use of the Internet those thinkers could imagine. This concept – that we were to be coddled by a hive brain designed to show us exactly what we needed to know when we needed to know it – continued apace until it was supplanted by User-Generated Content – UGC – a related movement that tore down gatekeepers and all but destroyed propriety in the online world.
That was the arc of Web 2.0: the move from one-to-one conversations on Usenet or IRC into the global newspaper, and from there into a million one-to-many conversations targeted at tailor-made audiences of fans, supporters, and, more often, trolls. This change gave us what we have today: a broken prism that refracts humanity into no colors except black and white. UGC, that once-great idea that anyone could be as popular as a rock star, gave way to an unmonetizable free-for-all that forced brands and advertisers to rethink how they reached audiences. After all, on a UGC site it’s not a lot of fun for Procter & Gamble to have Downy Fabric Softener advertised next to someone’s racist rant against Muslims in a Starbucks.
Still, the Valley took these concepts and built monetized cesspools of self-expression.
Facebook, Instagram, YouTube, and Twitter are the biggest beneficiaries of outrage culture, and the eyeballs brought in by its continuous refreshment feed their further growth. These sites are, says TechCrunch, the darkest epitome of Web 2.0 – a quiver of arrows that strikes at our deepest, most cherished institutions and bleeds us of kindness and forethought.