White Listing the Web by Mark Stone
November 2011. I had just stepped out of the room for a moment, but when I returned Nathan had a look of terror on his face and his hands clamped firmly over his ears. On the screen was a great white shark, bursting out of frothing water. Nathan is five. He doesn't know any better. The charming whale video we had been watching had ended, and he had simply clicked on a link for another "related" video. Such are the risks of letting your kids watch YouTube unsupervised, even briefly.
Media creators, media censors, and parents have led a delicate dance around the issue of parental supervision since the beginning of television. The challenges are only magnified by the quantity, variety, and accessibility of online content. Approaches that were adequate in the past are failing today, and the problem will only get worse as media extends to mobile devices and an ever-younger demographic starts using them.
In its simplest terms, the two basic approaches to content filtering are:
- Black listing. Making a list of content that is unacceptable, and filtering out that content.
- White listing. Making a list of content that is acceptable, and filtering out everything else.
Of course, with a finite set of content, white listing and black listing are functionally equivalent. Thus the choice between approaches may not seem important, or may not even seem like a choice. In the early days of television, with three commercial networks and one public channel, there just wasn't much at stake. But as we added cable and satellite TV, expanded the number of channels to hundreds, and thus the number of programs to tens of thousands, the blacklist approach became less and less practical.
Here's the problem. If you have an infinite, or even just very large, pool of content whose status you do not know, and a short list of content censored as not child friendly, then you really haven't put a dent in the problem. You just can't say much with confidence about an item selected at random from the large pool of unvetted content. And thus you haven't addressed the fundamental parental question: "What can I safely allow my kids to watch?" This problem is exacerbated by the inherent negative focus of blacklisting. My sense is that parents are much more concerned about what's good for their kids than obsessed over what might be bad for them. Furthermore, at least in the States, our content censorship and blacklisting policies reflect an old-fashioned and often puritanical preoccupation with sex, and to a lesser extent violence. I'm not that worried if my five-year-old is inadvertently exposed to images of breasts or genitalia; they are, after all, a pretty natural part of the human experience. Violence is a more complicated question, but I certainly don't subscribe to the view that violent images cause violent behavior.
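The asymmetry is easy to state in code. Here's a minimal sketch (the video IDs and list contents are hypothetical): under a blacklist, anything unvetted passes by default; under a white list, anything unvetted is held back until someone vouches for it.

```python
# Hypothetical sketch of the two filtering approaches.
# All video IDs here are made up for illustration.

def blacklist_filter(video_id, blacklist):
    """Allow anything not explicitly known to be bad."""
    return video_id not in blacklist

def whitelist_filter(video_id, whitelist):
    """Allow only what has been explicitly vetted as good."""
    return video_id in whitelist

blacklist = {"shark-attack-417"}
whitelist = {"whale-watching-001", "gentle-cartoon-042"}

# An unvetted video sails through the blacklist...
assert blacklist_filter("unknown-video-999", blacklist) is True
# ...but is held back by the whitelist until someone vets it.
assert whitelist_filter("unknown-video-999", whitelist) is False
```

The point is that both filters give the same verdicts on a small, fully known catalog; they only come apart when the pool of unknown content dwarfs the list, which is exactly the situation online.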
What I want is content, and more generally a pool of content in which my child can "swim" unsupervised, that I'm confident is generally good (and not obviously bad). I'm okay with the high level of violence in the movie "Cars 2" because it's embedded within reasonably high-quality storytelling. I'm not okay with "SpongeBob", whether violent or not, because the content is completely vapid. So what I'm looking for is curated content: a white list of known good, not a black list of known bad.
So let's come at the problem from another direction, and return to Nathan's whale watching experience. Take a look at the page on which the video in question is displayed.
The video itself is fine, completely G-rated, and very representative of our family's whale watching experience in the Puget Sound. The problem is the related videos suggested by the thumbnail links in the right-hand column. I won't list them all, and most of them are equally tame. But here are a few that a parent might find troubling for a five-year-old to stumble across:
- "Great white shark jumps into boat"
- "Killer whale vs. sea lions"
- "Don't play with an orca!"
- "Whale hits and smashes yacht in South Africa"
- "MEGALADON PREHISTORIC SHARK"
- "Killer whale attacks trainer"
While it's tempting and easy to blame YouTube, I don't. First, YouTube's Terms of Service clearly state that even a viewer of YouTube is expected to be 13 or older, "as the Service is not intended for children under 13." Second, and more importantly, YouTube has to deal with the delicate balancing act known as the Digital Millennium Copyright Act (DMCA). In fact, its terms of service spend more verbiage on the DMCA than on any other topic.
Here's the DMCA issue in a nutshell. Websites like YouTube pass a lot of copyrighted material to end users. They don't want to be in the business of policing the copyright status of that material, as the overhead and liability associated with that task would ruin the business. And they argue that, after all, they are not content providers but content couriers; merely the delivery man in the middle. If that sounds like a subtle distinction, think of the U.S. Postal Service by comparison. The Post Office does not, and I'm sure we'd all agree should not, examine the contents of the mail to make sure it's legal. It's just a courier. It is in no way responsible for what people choose to pass through its service. Thus the DMCA contains a "safe harbor" provision that says, in essence, that websites that merely pass along content are not liable for the legal status of that content, provided they respond in a timely manner to remove content that has not been legally posted to the site.
How does all this relate to whales and sharks? A content courier cannot also be a content curator. Any attempt to provide an editorial filter on the basis of content itself could jeopardize the safe harbor standing of a mere courier. Even if YouTube wanted to take on the daunting task of sorting out content on the basis of age appropriateness, its hands are tied by the DMCA's pressure to remain blind to the content passing through the site. Any solution to sorting out quality, child-friendly content on a site like YouTube will have to come from outside YouTube.
Thus the DMCA creates a peculiar "church and state" separation between sites that host content and sites that curate content. We actually benefited from this effect at Slashdot, which at its heart is a geek news and discussion site about content on other sites. We could not be targeted by DMCA takedown requests, because we didn't host any of the curated content; we simply linked to it. Further, the staff of Slashdot did relatively little in the way of editorializing or curation. That was provided by the Slashdot audience. This is actually a really important point, to which I will return in a moment.
When it comes to children's content online, we still follow the tired black list approaches borrowed from television rather than the white list approach of Slashdot. There are filtering sites like YouTube for Children, but the filters are still more negative than positive, and you'd need such a filtering site for each destination that you and your child were interested in. That's not a scalable approach. There are more general Internet filtering tools like "Net Nanny" and "CyberSitter", but these are hyper-obsessed with screening out porn (which they do poorly and in a heavy-handed manner), and have little other value.
No, what we need is a children's Slashdot ("Sites for kids; stuff that's cool").
Ironically, the Slashdot approach to content curation is one of the oldest on the Web, but it is no longer very popular. In the pre-Google Web, when we didn't rely on search engines to feed us everything, Yahoo was a manually curated index of the Web rather than a search engine. You could browse through Yahoo's hierarchy of links much like thumbing through encyclopedia entries, and have some confidence that an actual human being had looked at the site to which Yahoo linked, and applied some minimum editorial standard when including the link in the index. The problem with early Yahoo was that it didn't scale. The quantity of content on the Web proliferated much faster than manual indexing could keep pace.
The only site I know of to have effectively solved this scaling problem is Slashdot. Slashdot harnesses a powerful network effect by engaging its own audience to curate content. Lots of sites let their audience rate and rank content, but only Slashdot has a process that doesn't commoditize or trivialize the rating activity, and only Slashdot has a meta-moderation process that provides a quality bar not just for content, but for the activity of curation itself. This is a topic I've discussed at length elsewhere, and I'm still astonished that no one else has tried to emulate Slashdot's best practices.
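To make the idea concrete, here's a toy sketch of audience moderation with a meta-moderation layer on top. This is my own simplified illustration, not Slashdot's actual algorithm: moderators rate content, meta-moderators judge the fairness of those ratings, and a moderator's future influence rises or falls accordingly, so the curation itself gets curated.

```python
# Toy illustration of moderation plus meta-moderation.
# Loosely inspired by Slashdot's scheme; not its actual algorithm.

def content_score(ratings, moderator_weight):
    """Weight each rating (+1 kid-friendly, -1 not) by the trust its moderator has earned."""
    total = sum(moderator_weight[m] * r for m, r in ratings)
    weight = sum(moderator_weight[m] for m, _ in ratings)
    return total / weight if weight else 0.0

def meta_moderate(moderator_weight, verdicts, step=0.1):
    """Meta-moderators mark individual ratings fair or unfair; trust shifts accordingly."""
    for moderator, fair in verdicts:
        delta = step if fair else -step
        moderator_weight[moderator] = min(1.0, max(0.0, moderator_weight[moderator] + delta))

weights = {"alice": 1.0, "bob": 1.0}
ratings = [("alice", 1), ("bob", -1)]
print(content_score(ratings, weights))  # 0.0: equal trust, moderators cancel out

# Meta-moderation finds bob's rating unfair; his influence shrinks,
# so the same two votes now lean toward alice's judgment.
meta_moderate(weights, [("alice", True), ("bob", False)])
print(content_score(ratings, weights))
```

The key design point the sketch tries to capture is the feedback loop: ratings decide what children see, while meta-moderation decides whose ratings count, which is what keeps the white list from being gamed or diluted as the audience grows.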
So why isn't there a children's Slashdot? One big problem is that we just don't take children, or the networked family, seriously as a demographic that needs to be addressed online. The bulk of venture capital funding goes to start-ups that target the young and the hip, whose branding preferences have yet to be formed and whose income is not yet tied to family responsibilities. We also built the Web so fast that we never really stopped to take children into consideration. Picking Netscape's 1995 IPO as the approximate "year zero" for the modern Web, children born that year are now 16. So we've yet to raise an entire generation in the modern Web era. We are close, though. The high school class of 2013, and every subsequent class, will have spent their entire childhood in this era, and our generation, which created this era, has done very little to figure out the right parenting approach to steward our children through it. We're just making it up as we go along.
A children's Slashdot could be a great step in the right direction. It would be a white list approach that says what's good content, and why. And it could use the network effect of its own audience to scale its white listed pool of content to some meaningful size. So why isn't there such a site already? Sounds like a business opportunity to me.