YouTube has a set of community guidelines aimed at reducing abuse of the site's features. Generally prohibited material includes sexually explicit content, videos of animal abuse, shock videos, content uploaded without the copyright holder's consent, hate speech, spam, and predatory behavior.[362] Despite the guidelines, YouTube has faced criticism from news sources for retaining content that violates them.
At the time of uploading a video, YouTube users are shown a message asking them not to violate copyright laws.[363] Despite this advice, many unauthorized clips of copyrighted material remain on YouTube. YouTube does not view videos before they are posted online, and it is left to copyright holders to issue a DMCA takedown notice pursuant to the terms of the Online Copyright Infringement Liability Limitation Act. Any successful complaint about copyright infringement results in a YouTube copyright strike. Three successful complaints for copyright infringement against a user account will result in the account and all of its uploaded videos being deleted.[364][365] Organizations including Viacom, Mediaset, and the English Premier League have filed lawsuits against YouTube, claiming that it has done too little to prevent the uploading of copyrighted material.[366][367][368] Viacom, demanding $1 billion in damages, said that it had found more than 150,000 unauthorized clips of its material on YouTube that had been viewed "an astounding 1.5 billion times". YouTube responded by stating that it "goes far beyond its legal obligations in assisting content owners to protect their works".[369]
During the same court battle, Viacom won a court ruling requiring YouTube to hand over 12 terabytes of data detailing the viewing habits of every user who has watched videos on the site. The decision was criticized by the Electronic Frontier Foundation, which called the court ruling "a setback to privacy rights".[370][371] In June 2010, Viacom's lawsuit against Google was rejected in a summary judgment, with U.S. federal Judge Louis L. Stanton stating that Google was protected by provisions of the Digital Millennium Copyright Act. Viacom announced its intention to appeal the ruling.[372] On April 5, 2012, the United States Court of Appeals for the Second Circuit reinstated the case, allowing Viacom's lawsuit against Google to be heard in court again.[373] On March 18, 2014, the lawsuit was settled after seven years with an undisclosed agreement.[374]
In August 2008, a US court ruled in Lenz v. Universal Music Corp. that copyright holders cannot order the removal of an online file without first determining whether the posting reflected fair use of the material. The case involved Stephanie Lenz from Gallitzin, Pennsylvania, who had made a home video of her 13-month-old son dancing to Prince's song "Let's Go Crazy", and posted the 29-second video on YouTube.[375] In the case of Smith v. Summit Entertainment LLC, professional singer Matt Smith sued Summit Entertainment for the wrongful use of copyright takedown notices on YouTube.[376] Smith asserted seven causes of action, and four were ruled in his favor.[377]
In April 2012, a court in Hamburg ruled that YouTube could be held responsible for copyrighted material posted by its users. The performance rights organization GEMA argued that YouTube had not done enough to prevent the uploading of German copyrighted music. YouTube responded by stating:
We remain committed to finding a solution to the music licensing issue in Germany that will benefit artists, composers, authors, publishers, and record labels, as well as the wider YouTube community.[378]
On November 1, 2016, the dispute with GEMA was resolved, with Google's Content ID system being used to add advertisements to videos containing GEMA-protected content.[379]
In April 2013, it was reported that Universal Music Group and YouTube had a contractual agreement that prevented content blocked on YouTube at UMG's request from being restored, even if the uploader of the video filed a DMCA counter-notice. When a dispute occurred, the uploader of the video had to contact UMG directly.[380][381] YouTube's owner Google announced in November 2015 that it would help cover legal costs in select cases where it believes fair use defenses apply.[382]
In June 2007, YouTube began trials of a system for automatic detection of uploaded videos that infringe copyright. Google CEO Eric Schmidt regarded this system as necessary for resolving lawsuits such as the one from Viacom, which alleged that YouTube profited from content that it did not have the right to distribute.[383] The system, which was initially called "Video Identification"[384][385] and later became known as Content ID,[386] creates an ID File for copyrighted audio and video material and stores it in a database. When a video is uploaded, it is checked against the database, and the video is flagged as a copyright violation if a match is found.[387] When this occurs, the content owner has the choice of blocking the video to make it unviewable, tracking the video's viewing statistics, or adding advertisements to the video.
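YouTube has not published Content ID's internals, but the workflow just described — register a reference file, fingerprint each upload, and apply the owner's chosen policy on a match — can be sketched as follows. This is a minimal illustration only: every name is hypothetical, and the cryptographic hash stands in for the proprietary perceptual fingerprinting the real system uses (a real fingerprint matches re-encoded or trimmed copies, which a plain hash cannot).

```python
# Illustrative sketch of a Content ID-style matching flow.
# All names are hypothetical stand-ins for YouTube's internal system.
import hashlib
from enum import Enum

class Policy(Enum):           # the choices offered to content owners
    BLOCK = "block"           # make the video unviewable
    TRACK = "track"           # collect viewing statistics
    MONETIZE = "monetize"     # run ads, with revenue to the claimant

def fingerprint(media_bytes: bytes) -> str:
    # Stand-in for a perceptual fingerprint; a cryptographic hash only
    # matches byte-identical files, unlike real audio/video fingerprints.
    return hashlib.sha256(media_bytes).hexdigest()

class ReferenceDatabase:
    def __init__(self):
        self._refs: dict[str, tuple[str, Policy]] = {}

    def register(self, media: bytes, owner: str, policy: Policy):
        # The content owner supplies a reference ("ID File") and a policy.
        self._refs[fingerprint(media)] = (owner, policy)

    def check_upload(self, media: bytes):
        # Every new upload is checked against the reference database.
        return self._refs.get(fingerprint(media))

db = ReferenceDatabase()
db.register(b"<reference audio>", owner="LabelX", policy=Policy.MONETIZE)

match = db.check_upload(b"<reference audio>")
if match:
    owner, policy = match
    print(f"Claimed by {owner}; applying policy: {policy.value}")
```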
By 2010, YouTube had "already invested tens of millions of dollars in this technology".[385] In 2011, YouTube described Content ID as "very accurate in finding uploads that look similar to reference files that are of sufficient length and quality to generate an effective ID File".[387] By 2012, Content ID accounted for over a third of the monetized views on YouTube.[388]
In an independent test in 2009, multiple versions of the same song were uploaded to YouTube; the test concluded that while the system was "surprisingly resilient" in finding copyright violations in the audio tracks of videos, it was not infallible.[389] The use of Content ID to remove material automatically has led to controversy in some cases, as the videos have not been checked by a human for fair use.[390] If a YouTube user disagrees with a decision by Content ID, it is possible to fill out a form disputing the decision.[391]
Before 2016, videos were not monetized until the dispute was resolved. Since April 2016, videos continue to be monetized while the dispute is in progress, and the money goes to whoever wins the dispute.[392] Should the uploader want to monetize the video again, they may remove the disputed audio in the "Video Manager".[393] YouTube has cited the effectiveness of Content ID as one of the reasons why the site's rules were modified in December 2010 to allow some users to upload videos of unlimited length.[394]
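The post-April 2016 arrangement amounts to escrowing ad revenue for the life of the dispute and releasing it to the prevailing party. A minimal sketch of that flow, with all names hypothetical:

```python
# Sketch of the post-April 2016 dispute flow: the video stays monetized
# while the dispute is open, and accrued revenue is released to whichever
# party prevails. All names are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class Dispute:
    uploader: str
    claimant: str
    escrow: float = 0.0          # ad revenue held during the dispute
    resolved: bool = False

    def accrue(self, ad_revenue: float):
        # Since April 2016, the video keeps earning while disputed.
        if not self.resolved:
            self.escrow += ad_revenue

    def resolve(self, winner: str) -> tuple[str, float]:
        # Escrowed money goes to whoever wins the dispute.
        assert winner in (self.uploader, self.claimant)
        self.resolved = True
        payout, self.escrow = self.escrow, 0.0
        return winner, payout

d = Dispute(uploader="creator", claimant="label")
d.accrue(12.50)
d.accrue(7.25)
print(d.resolve("creator"))   # ('creator', 19.75)
```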
YouTube has also faced criticism over its handling of offensive content in some of its videos. The uploading of videos containing defamation, pornography, and material encouraging criminal conduct is forbidden by YouTube's "Community Guidelines".[362] YouTube relies on its users to flag videos as inappropriate, and a YouTube employee will view a flagged video to determine whether it violates the site's guidelines.[362]
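That flag-then-review loop can be sketched as a simple queue feeding a human decision. The sketch below is illustrative only; the names and violation categories are hypothetical stand-ins, not YouTube's actual tooling:

```python
# Illustrative sketch of the moderation loop described above: user flags
# only enqueue a video; a human reviewer decides whether it comes down.
from collections import deque

review_queue: deque[tuple[str, str]] = deque()

def flag(video_id: str, reason: str):
    # A user flag never removes a video; it only queues it for review.
    review_queue.append((video_id, reason))

def review_next(is_violation) -> str | None:
    # `is_violation` stands in for the human reviewer's judgment
    # against the site's guidelines.
    if not review_queue:
        return None
    video_id, reason = review_queue.popleft()
    verdict = "removed" if is_violation(video_id, reason) else "kept"
    return f"{video_id}: {verdict} (flagged for {reason})"

flag("abc123", "defamation")
print(review_next(lambda vid, r: r in {"defamation", "pornography"}))
```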
To limit the spread of misinformation and fake news via YouTube, the company has rolled out a comprehensive policy on how it plans to deal with technically manipulated videos.[395]
YouTube has also been criticized for suppressing opinions dissenting from governments' positions, especially related to the COVID-19 pandemic.[396][397][398]
Controversial content has included material relating to Holocaust denial and the Hillsborough disaster, in which 96 football fans from Liverpool were crushed to death in 1989.[399][400] In July 2008, the Culture, Media and Sport Committee of the House of Commons of the United Kingdom stated that it was "unimpressed" with YouTube's system for policing its videos, and argued that "proactive review of content should be standard practice for sites hosting user-generated content". YouTube responded by stating:
We have strict rules on what's allowed, and a system that enables anyone who sees inappropriate content to report it to our 24/7 review team and have it dealt with promptly. We educate our community on the rules and include a direct link from every YouTube page to make this process as easy as possible for our users. Given the volume of content uploaded on our site, we think this is by far the most effective way to make sure that the tiny minority of videos that break the rules come down quickly.[401] (July 2008)
In October 2010, U.S. Congressman Anthony Weiner urged YouTube to remove from its website videos of imam Anwar al-Awlaki.[402] YouTube pulled some of the videos in November 2010, stating they violated the site's guidelines.[403] In December 2010, YouTube added the ability to flag videos for containing terrorism content.[404]
Following media reports in June 2013 about PRISM, the NSA's massive electronic surveillance program, several technology companies were identified as participants, including YouTube. According to leaks from the program, YouTube joined PRISM in 2010.[405]
YouTube's policies on "advertiser-friendly content" restrict what may be incorporated into videos being monetized; this includes strong violence, language,[406] sexual content, and "controversial or sensitive subjects and events, including subjects related to war, political conflicts, natural disasters and tragedies, even if graphic imagery is not shown", unless the content is "usually newsworthy or comedic and the creator's intent is to inform or entertain".[407] In September 2016, after introducing an enhanced notification system to inform users of these violations, YouTube's policies were criticized by prominent users, including Phillip DeFranco and Vlogbrothers. DeFranco argued that not being able to earn advertising revenue on such videos was "censorship by a different name". A YouTube spokesperson stated that while the policy itself was not new, the service had "improved the notification and appeal process to ensure better communication to our creators".[408][409][410] Boing Boing reported in 2019 that LGBT keywords resulted in demonetization.[411]