Even though the internet has been part of everyday life for a couple of decades now, we're still finding out how it affects our minds. It all sounded so great when it started, didn't it? We'd jack all our computers together and share ideas easily and openly. But today, we're finding out that an idea can be dangerous when the internet gets its hands on it; if that idea happens to be particularly harmful, it can fester and grow toxic.
YouTube is the most prominent dealer of ideas in the world. The entire platform is about letting anyone with a camera put their ideas out into the world. Therefore, YouTube bears the enormous responsibility of monitoring which ideas get out into the world.
YouTube has finally decided to get tougher on which ideas it allows to manifest on the platform. The company put out a blog post outlining some changes that it has made to its policies regarding hateful videos, announcing that it would finally — finally — outright ban videos that are explicitly hateful.
Read on to find out what specifically will be changing about YouTube's policies, and what some of the potential implications may be...
YouTube's been around since 2005.
In those early days, it was kind of inane.
The first YouTube video, "Me at the Zoo," featured YouTube co-founder Jawed Karim just... chillin' at the zoo. It was, to be charitable, not the most entertaining video that the site would ever see.
Google bought YouTube in 2006.
YouTube evolved.
More or less overnight, YouTube went from being populated by nonsensical videos about Pokemon to leading a media revolution.
Today, there are hundreds of millions of videos on YouTube.
Some would say that's too many videos...
The platform's openness is what's made it so successful.
And while some of them are silly as hell...
Full disclosure — we searched "silly videos" on YouTube, and this was one of the search results. But we don't think it's silly because everyone's worried about being pregnert.
A lot of them are very straight-laced.
Oh boy, just what we were hoping for — an explanation of tariffs. Why don't we all just call over our friends, pop some popcorn, and get rowdy watching this beauty?
Some of the videos on YouTube are capital-I Important.
The platform has the power to immediately show us what's going on in the world. The incident captured in this video, one in which a university police officer sprayed a number of UC Davis students with pepper spray during a protest, could have easily been ignored if the entire internet hadn't seen how awful it was for themselves.
But the way that YouTube works can make it a less-than-ideal news source.
The YouTube economy tends to prioritize opinions over facts. The more scorching hot your take is, the more likely it is to be liked and shared. Facts just don't have as much sway on the platform.
After your video plays, an unseen algorithm provides recommendations.
If YouTube recommends me a Fortnite video one more time.... — sprEEEzy (@sprEEEzy)
That leads to YouTube recommending videos that are... not exactly what you're looking for.
Sometimes, simply watching a video on one topic prompts the algorithm to suggest many more videos on that topic, even ones pushing an opposing viewpoint.
This has led to YouTube taking a ton of flak for letting violent ideas spread.
Not only does YouTube recommend videos with awful ideas, albeit unknowingly, but plenty of rotten creators use a disingenuous "both sides" framing in their titles and content to dissuade YouTube from banning them, despite the fact that they're clearly preaching hatred.
And, now, YouTube is addressing that hate.
Today has generated a lot of questions and confusion. We know it hasn't been easy for everyone. Going forward, we'l… https://t.co/I9YPO30PHA — TeamYouTube (@TeamYouTube)
That means that YouTube will now officially ban videos advocating hatred.
Amid a torrent of criticism, YouTube has announced plans to take down some hateful content, including videos endors… https://t.co/W44ijXk2nD — Los Angeles Times (@Los Angeles Times)
YouTube also acknowledged some of the reasons that this sort of content might stay on the site.
Sometimes videos with strong, harmful opinions are used by researchers to study hateful ideologies.
Of course, some mistakes will be made.
YouTube have banned me for 'hate speech', I think due to clips on Nazi policy featuring propaganda speeches by Nazi… https://t.co/LaFw21aU6k — Mr Allsop History (@Mr Allsop History)
This change comes just days after YouTube decided not to punish YouTuber Steven Crowder for relentlessly harassing journalist Carlos Maza.
When Vox's Carlos Maza repeatedly reported the abuse he was receiving from Steven Crowder, YouTube initially said it wouldn't throw down a ban, explaining that Crowder's speech did not "violate [their] policies." It sure feels like YouTube's "borderline content" policy shift is a response to this incident, as Crowder has been repeatedly reported for harassment but has stayed just barely on the right side of the line vis-à-vis YouTube's stated policies. The new policy allows YouTube to act even without explicit rule violations.
There have been questions concerning freedom of speech.
But why does free speech automatically equate to awful speech?
The reason that free speech is so important is that it allows us to speak the truth to power. The things that these hateful YouTubers are saying aren't speaking truth to power; they're harming people for no reason.
It's clear that YouTube has to draw some kind of line.
As the entity putting all this speech into the world, it does ultimately fall to YouTube to decide what is, and what is not, acceptable.
Earlier this year, YouTube worked to change its recommendation algorithm.
"In response to the criticism, YouTube announced in January that it would recommend fewer objectionable videos, suc… https://t.co/4fT2epo1RR — Zerah (@Zerah)
That means trying to de-emphasize all that "borderline content."
@drdrew They've gone farther than just "hate speech" they said content that is borderline will be demonetized as well. — George Martinez (@George Martinez)
But the "borderline" isn't well-defined.
YouTube is still permitting videos it classifies as "borderline," which isn't clearly defined. This is the latest… https://t.co/DgA3Q53J2Z — Vox (@Vox)
All of these steps are vital because hate can be like a virus.
It's not healthy to be so deep inside anger and hatred that you have to put it on YouTube for the world to see. Once it's out there, others can find it, and ideas can be contagious.
If someone follows a YouTube breadcrumb trail, it can make their already negative state of mind even worse.
That's why taking these steps is so important for YouTube.
I would like to thank all the YouTube commenters for being extra crazy today and making it much easier to remove co… https://t.co/ga6surUncz — Gary (@Gary)