Date: August 10, 2007
“Ms. Supan [Director of Marketing for YouTube] said that among the videos removed were those that ‘display graphic depictions of violence in addition to any war footage (U.S. or other) displayed with intent to shock or disgust, or graphic war footage with implied death (of U.S. troops or otherwise).'” – NY Times, Oct. 6, 2006
Could it be that you do not enforce your own policies on reviewing and removing flagged videos with any seriousness or consistency? If you question the standards by which we choose videos to flag, then please, by all means, visit our website and pick a few videos from our archive of flagged videos at random. Pick a dozen. If you find them worthy of remaining online, then your error has been to delete the 10-20% that you did, for there is no appreciable difference in their content. If you find them as objectionable as we do, why have they been left up after being flagged and reviewed? In fact, these videos often recycle the exact same footage. Why remove some and leave the others online? Looking over the archives, it’s easy to imagine the removal rate is more a function of who is on staff at any given time than of the actual content of the videos.
YouTube removes videos on grounds of hate speech. Recently, a video was removed, and the uploader’s account closed, that contained nothing but a person sitting in front of a camera explaining how pleased he was that journalist Daniel Pearl had his head cut off. If that’s sufficient grounds to qualify as hate speech, why aren’t videos celebrating attacks on, and the deaths of, Coalition soldiers in Iraq and Afghanistan also hate speech? YouTube removes videos of fights in schoolyards that kids upload from their cell phones. If it’s unacceptable violence to show schoolkids wrestling and pulling at each other’s hair, why is it acceptable to show videos of IED detonations, especially when they celebrate those deaths?
YouTube also claims it is not responsible for content uploaded by its users, and that if content is found in violation of its standards, it will be removed after users flag it as ‘inappropriate’. When you review flagged videos and choose to keep them online, do you assume responsibility for those videos then? You might wish to consult your legal staff on this question. If you aren’t responsible for content even after YouTube personnel have reviewed it, why review or ever delete anything at all? Perhaps you should add a note to videos that have been flagged, stating that YouTube has reviewed them and found the contents acceptable. It could prevent us from bothering to flag them in the future, and give everyone a clearer idea of what you consider ‘appropriate content’.
Lastly, would you be so good as to inform us how many languages your staff speaks fluently? I’m sure you wouldn’t want to remove videos celebrating the deaths of soldiers, or of journalists like Daniel Pearl, only because they were in English, while leaving up similar speech spoken in any other language. Could you tell us more about how you handle ‘hate speech’ flags when the videos are recorded in languages other than English? You must have some way of confirming whether or not those videos truly contain hate speech.
We look forward to your answers to the above questions. They will help us answer our own questions as to whether YouTube wishes to follow in the footsteps of its new owner when it comes to living up to the “Don’t Be Evil” standard.
The Blog Admins for
Operation YouTube Smackdown