
Promotion of conspiracy theories and fringe discourse

YouTube has been criticized for using an algorithm that gives great prominence to videos that promote conspiracy theories, falsehoods, and incendiary fringe discourse.

According to an investigation by The Wall Street Journal, "YouTube's recommendations often lead users to channels that feature conspiracy theories, partisan viewpoints, and misleading videos, even when those users haven't shown interest in such content. When users show a political bias in what they choose to view, YouTube typically recommends videos that echo those biases, often with more-extreme viewpoints."

After drawing controversy for giving top billing to videos promoting misinformation when people made breaking-news queries during the 2017 Las Vegas shooting, YouTube changed its algorithm to give greater prominence to mainstream media sources.

In 2018, it was reported that YouTube was again promoting fringe content about breaking news, giving great prominence to conspiracy videos about Anthony Bourdain's death.

In 2017, it was revealed that advertisements were being placed on extremist videos, including videos by rape apologists, anti-Semites and hate preachers who received ad payouts.

After firms began pulling their advertising from YouTube in the wake of this reporting, YouTube apologized and said that it would give firms greater control over where their ads were placed.

Alex Jones, known for far-right conspiracy theories, has built a massive audience on YouTube. YouTube drew criticism in 2018 when it removed a video by a left-wing watchdog group compiling offensive statements made by Jones, claiming that the compilation constituted "harassment and bullying".

University of North Carolina professor Zeynep Tufekci has referred to YouTube as "The Great Radicalizer", saying "YouTube may be one of the most powerful radicalizing instruments of the 21st century."

Revenue

Google does not provide detailed figures for YouTube's running costs, and YouTube's revenues in 2007 were noted as "not material" in a regulatory filing.

In June 2008, a Forbes magazine article projected the 2008 revenue at $200 million, noting progress in advertising sales.

In January 2012, it was estimated that visitors to YouTube spent an average of 15 minutes a day on the site, in contrast to the four or five hours a day spent by a typical US citizen watching television.

In 2012, YouTube's revenue from its ads program was estimated at $3.7 billion.

In 2013, that figure nearly doubled, with eMarketer estimating it would reach $5.6 billion while others estimated $4.7 billion.

The vast majority of videos on YouTube are free to view and supported by advertising.

In May 2013, YouTube introduced a trial scheme of 53 subscription channels with prices ranging from $0.99 to $6.99 a month.

The move was seen as an attempt to compete with other providers of online subscription services such as Netflix and Hulu.

In 2017, viewers on average watched YouTube on mobile devices for more than an hour every day.

Advertisement partnerships

YouTube entered into a marketing and advertising partnership with NBC in June 2006.

In March 2007, it struck a deal with BBC for three channels with BBC content, one for news and two for entertainment.

In November 2008, YouTube reached an agreement with MGM, Lions Gate Entertainment, and CBS, allowing the companies to post full-length films and television episodes on the site, accompanied by advertisements in a section for U.S. viewers called "Shows".

The move was intended to create competition with websites such as Hulu, which features material from NBC, Fox, and Disney.

In November 2009, YouTube launched a version of "Shows" available to UK viewers, offering around 4,000 full-length shows from more than 60 partners.

In January 2010, YouTube introduced an online film rental service, which as of 2010 was available only to users in the United States, Canada, and the UK. The service offered over 6,000 films.

Partnership with video creators

In May 2007, YouTube launched its Partner Program (YPP), a system based on AdSense which allows the uploader of the video to share the revenue produced by advertising on the site.

YouTube typically takes 45 percent of the advertising revenue from videos in the Partner Program, with 55 percent going to the uploader.

There are over a million members of the YouTube Partner Program. According to TubeMogul, in 2013 a pre-roll advertisement on YouTube (one that is shown before the video starts) cost advertisers on average $7.60 per 1000 views. Usually no more than half of the eligible videos have a pre-roll advertisement, due to a lack of interested advertisers.
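As a rough back-of-the-envelope illustration of the figures above, the following sketch combines the $7.60 average pre-roll CPM, the roughly 50% pre-roll fill rate, and the 55% uploader share. This is a minimal sketch, not YouTube's actual accounting; the function name and the assumption that these rates compose multiplicatively are ours.

```python
# A minimal sketch (not YouTube's actual accounting) combining the figures
# above: a $7.60 CPM for pre-roll ads, pre-rolls on roughly half of eligible
# views, and the 55% uploader share under the Partner Program.

PRE_ROLL_CPM_USD = 7.60   # average advertiser cost per 1,000 pre-roll views (TubeMogul, 2013)
AD_FILL_RATE = 0.5        # assumption: about half of eligible videos carry a pre-roll
UPLOADER_SHARE = 0.55     # uploader's share of ad revenue in the Partner Program

def estimate_uploader_revenue(views: int) -> float:
    """Rough uploader payout (USD) for a given number of views."""
    monetized_views = views * AD_FILL_RATE
    gross_ad_revenue = (monetized_views / 1000) * PRE_ROLL_CPM_USD
    return gross_ad_revenue * UPLOADER_SHARE

# 1,000,000 views -> 500,000 monetized -> $3,800 gross -> $2,090 to the uploader
print(f"${estimate_uploader_revenue(1_000_000):,.2f}")
```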

In 2013, YouTube introduced an option for channels with at least a thousand subscribers to require a paid subscription in order for viewers to watch videos.

In April 2017, YouTube set an eligibility requirement of 10,000 lifetime views before a channel could monetize its videos.

On January 16, 2018, the eligibility requirement for monetization was changed to 4,000 hours of watch time within the past 12 months and 1,000 subscribers.

The move was seen as an attempt to ensure that videos being monetized did not lead to controversy, but was criticized for penalizing smaller YouTube channels.
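For illustration, the January 2018 thresholds described above amount to a simple check. The sketch below is hypothetical, with invented field and function names; the real review also includes a manual policy compliance step.

```python
# A hypothetical sketch of the January 2018 monetization thresholds described
# above; field and function names are invented for illustration.

from dataclasses import dataclass

@dataclass
class Channel:
    subscribers: int
    watch_hours_past_12_months: float

def meets_2018_thresholds(channel: Channel) -> bool:
    """Post-January-2018 Partner Program thresholds: 1,000 subscribers
    and 4,000 hours of watch time within the past 12 months."""
    return (channel.subscribers >= 1_000
            and channel.watch_hours_past_12_months >= 4_000)

print(meets_2018_thresholds(Channel(1_500, 5_200.0)))   # True
print(meets_2018_thresholds(Channel(900, 10_000.0)))    # False: too few subscribers
```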

YouTube Play Buttons, a part of the YouTube Creator Rewards, are a recognition by YouTube of its most popular channels. The trophies, made of nickel-plated copper-nickel alloy, gold-plated brass, silver-plated metal, and ruby, are given to channels with at least one hundred thousand, one million, ten million, and fifty million subscribers, respectively.

Users of the YPP can have their videos "demonetized" if YouTube feels that the content is not advertiser-friendly. If a video receives this status, it stops earning ad revenue, and a yellow coin symbol appears next to it in the partner's YouTube dashboard.

Revenue to copyright holders

Further information: § Copyrighted material

Much of YouTube's revenue goes to the copyright holders of the videos. In 2010, it was reported that nearly a third of the videos with advertisements were uploaded without permission of the copyright holders.

YouTube gives an option for copyright holders to locate and remove their videos or to have them continue running for revenue. In May 2013, Nintendo began enforcing its copyright ownership and claiming the advertising revenue from video creators who posted screenshots of its games. In February 2015, Nintendo agreed to share the revenue with the video creators.

Community policy

YouTube has a set of community guidelines aimed to reduce abuse of the site's features. Generally prohibited material includes sexually explicit content, videos of animal abuse, shock videos, content uploaded without the copyright holder's consent, hate speech, spam, and predatory behavior. Despite the guidelines, YouTube has faced criticism from news sources for content in violation of these guidelines.

Copyrighted material

At the time of uploading a video, YouTube users are shown a message asking them not to violate copyright laws. Despite this advice, there are still many unauthorized clips of copyrighted material on YouTube.

YouTube does not view videos before they are posted online, and it is left to copyright holders to issue a DMCA takedown notice pursuant to the terms of the Online Copyright Infringement Liability Limitation Act.

Any successful complaint about copyright infringement results in a YouTube copyright strike. Three successful complaints about copyright infringement against a user account will result in the account and all of its uploaded videos being deleted.
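The three-strike rule amounts to a simple counter; the sketch below is a simplification, since the real system also lets strikes expire and allows counter-notifications.

```python
# A simplified model of the three-strike rule described above; strike
# expiry and counter-notifications are omitted for brevity.

def apply_copyright_strike(strikes: int) -> tuple[int, bool]:
    """Return the new strike count and whether the account is terminated."""
    strikes += 1
    terminated = strikes >= 3   # a third strike deletes the account and its uploads
    return strikes, terminated

strikes, terminated = 0, False
for _ in range(3):              # three successful complaints
    strikes, terminated = apply_copyright_strike(strikes)
print(strikes, terminated)      # 3 True
```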

Organizations including Viacom, Mediaset, and the English Premier League have filed lawsuits against YouTube, claiming that it has done too little to prevent the uploading of copyrighted material.

Viacom, demanding $1 billion in damages, said that it had found more than 150,000 unauthorized clips of its material on YouTube that had been viewed "an astounding 1.5 billion times". YouTube responded by stating that it "goes far beyond its legal obligations in assisting content owners to protect their works".

During the same court battle, Viacom won a court ruling requiring YouTube to hand over 12 terabytes of data detailing the viewing habits of every user who has watched videos on the site. The decision was criticized by the Electronic Frontier Foundation, which called the court ruling "a setback to privacy rights".

In June 2010, Viacom's lawsuit against Google was rejected in a summary judgment, with U.S. federal Judge Louis L. Stanton stating that Google was protected by provisions of the Digital Millennium Copyright Act. Viacom announced its intention to appeal the ruling.

On April 5, 2012, the United States Court of Appeals for the Second Circuit reinstated the case, allowing Viacom's lawsuit against Google to be heard in court again. On March 18, 2014, the lawsuit was settled after seven years with an undisclosed agreement.

In August 2008, a US court ruled in Lenz v. Universal Music Corp. that copyright holders cannot order the removal of an online file without first determining whether the posting reflects fair use of the material. The case involved Stephanie Lenz from Gallitzin, Pennsylvania, who had made a home video of her 13-month-old son dancing to Prince's song "Let's Go Crazy" and posted the 29-second clip on YouTube.

In the case of Smith v. Summit Entertainment LLC, professional singer Matt Smith sued Summit Entertainment for the wrongful use of copyright takedown notices on YouTube. He asserted seven causes of action, and four were ruled in Smith's favor.

In April 2012, a court in Hamburg ruled that YouTube could be held responsible for copyrighted material posted by its users. The performance rights organization GEMA argued that YouTube had not done enough to prevent the uploading of German copyrighted music. YouTube responded by stating:

We remain committed to finding a solution to the music licensing issue in Germany that will benefit artists, composers, authors, publishers and record labels, as well as the wider YouTube community.

On November 1, 2016, the dispute with GEMA was resolved, with Google's Content ID system being used to allow advertisements to be added to videos containing content protected by GEMA.

In April 2013, it was reported that Universal Music Group and YouTube had a contractual agreement that prevents content blocked on YouTube by a request from UMG from being restored, even if the uploader of the video files a DMCA counter-notice.

When a dispute occurs, the uploader of the video has to contact UMG. YouTube's owner Google announced in November 2015 that it would help cover legal costs in select cases where it believes fair use defenses apply.


Content ID

See also: Criticism of Google § YouTube

In June 2007, YouTube began trials of a system for automatic detection of uploaded videos that infringe copyright. Google CEO Eric Schmidt regarded this system as necessary for resolving lawsuits such as the one from Viacom, which alleged that YouTube profited from content that it did not have the right to distribute.

The system, which became known as Content ID, creates an ID File for copyrighted audio and video material and stores it in a database. When a video is uploaded, the system checks it against the database and flags the video as a copyright violation if a match is found.

When this occurs, the content owner has the choice of blocking the video to make it unviewable, tracking the viewing statistics of the video, or adding advertisements to the video.
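Conceptually, the flow works like a fingerprint lookup. In the sketch below a cryptographic hash stands in for the ID File; this is an assumption for illustration, as the real system uses proprietary perceptual fingerprinting that also matches modified or partial copies, which a plain hash cannot.

```python
# A conceptual sketch of the Content ID flow described above. A SHA-256 hash
# stands in for the "ID File"; the real system uses perceptual fingerprints
# that also match altered or partial copies. Policy names are illustrative.

import hashlib

reference_db: dict[str, str] = {}   # fingerprint -> owner's chosen policy

def register_reference(media: bytes, policy: str) -> None:
    """A content owner registers reference material with a policy:
    'block', 'track', or 'monetize'."""
    reference_db[hashlib.sha256(media).hexdigest()] = policy

def check_upload(media: bytes) -> str:
    """Flag an upload if its fingerprint matches a registered reference."""
    return reference_db.get(hashlib.sha256(media).hexdigest(), "no_match")

register_reference(b"<copyrighted audio>", "monetize")
print(check_upload(b"<copyrighted audio>"))   # "monetize": ads run, revenue to owner
print(check_upload(b"<original content>"))    # "no_match": upload proceeds normally
```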

YouTube describes Content ID as "very accurate in finding uploads that look similar to reference files that are of sufficient length and quality to generate an effective ID File". Content ID accounts for over a third of the monetized views on YouTube.

An independent test in 2009 uploaded multiple versions of the same song to YouTube and concluded that while the system was "surprisingly resilient" in finding copyright violations in the audio tracks of videos, it was not infallible.

The use of Content ID to remove material automatically has led to controversy in some cases, as the videos have not been checked by a human for fair use.

If a YouTube user disagrees with a decision by Content ID, it is possible to fill in a form disputing the decision. Prior to 2016, videos were not monetized until the dispute was resolved. Since April 2016, videos continue to be monetized while the dispute is in progress, with the accrued money going to whichever party wins the dispute.

Should the uploader want to monetize the video again, they may remove the disputed audio in the "Video Manager". YouTube has cited the effectiveness of Content ID as one of the reasons why the site's rules were modified in December 2010 to allow some users to upload videos of unlimited length.
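The post-April-2016 behavior is essentially an escrow: revenue keeps accruing while the dispute is open and is paid out to the winner. The sketch below is schematic; the class and method names are invented.

```python
# A schematic of the post-April-2016 escrow behavior described above;
# names and structure are invented for illustration.

class DisputeEscrow:
    def __init__(self) -> None:
        self.held_revenue = 0.0

    def accrue(self, amount: float) -> None:
        """The video keeps earning during the dispute; money is held."""
        self.held_revenue += amount

    def resolve(self, winner: str) -> tuple[str, float]:
        """Pay the full held amount to whichever party wins."""
        payout = (winner, self.held_revenue)
        self.held_revenue = 0.0
        return payout

escrow = DisputeEscrow()
escrow.accrue(120.0)
escrow.accrue(80.0)
print(escrow.resolve("uploader"))   # ('uploader', 200.0)
```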

Controversial content

See also: Criticism of Google § YouTube, and Censorship by Google § YouTube

YouTube has also faced criticism over the handling of offensive content in some of its videos. The uploading of videos containing defamation, pornography, and material encouraging criminal conduct is forbidden by YouTube's "Community Guidelines".

YouTube relies on its users to flag the content of videos as inappropriate, and a YouTube employee will view a flagged video to determine whether it violates the site's guidelines.

Controversial content has included material relating to Holocaust denial and the Hillsborough disaster, in which 96 football fans from Liverpool were crushed to death in 1989.

In July 2008, the Culture and Media Committee of the House of Commons of the United Kingdom stated that it was "unimpressed" with YouTube's system for policing its videos, and argued that "proactive review of content should be standard practice for sites hosting user-generated content". YouTube responded by stating:

We have strict rules on what's allowed, and a system that enables anyone who sees inappropriate content to report it to our 24/7 review team and have it dealt with promptly. We educate our community on the rules and include a direct link from every YouTube page to make this process as easy as possible for our users. Given the volume of content uploaded on our site, we think this is by far the most effective way to make sure that the tiny minority of videos that break the rules come down quickly. (July 2008)

In October 2010, U.S. Congressman Anthony Weiner urged YouTube to remove from its website videos of imam Anwar al-Awlaki. YouTube pulled some of the videos in November 2010, stating they violated the site's guidelines. In December 2010, YouTube added the ability to flag videos for containing terrorism content.

Following media reports in June 2013 about PRISM, the NSA's massive electronic surveillance program, several technology companies were identified as participants, including YouTube. According to the leaked documents, YouTube joined the PRISM program in 2010.

YouTube's policies on "advertiser-friendly content" restrict what may be incorporated into videos being monetized; this includes strong violence, language, sexual content, and "controversial or sensitive subjects and events, including subjects related to war, political conflicts, natural disasters and tragedies, even if graphic imagery is not shown", unless the content is "usually newsworthy or comedic and the creator's intent is to inform or entertain".

In September 2016, after introducing an enhanced notification system to inform users of these violations, YouTube's policies were criticized by prominent users, including Phillip DeFranco and Vlogbrothers. DeFranco argued that not being able to earn advertising revenue on such videos was "censorship by a different name".

A YouTube spokesperson stated that while the policy itself was not new, the service had "improved the notification and appeal process to ensure better communication with our creators".

In March 2017, the government of the United Kingdom pulled its advertising campaigns from YouTube, after reports that its ads had appeared on videos containing extremism content. The government demanded assurances that its advertising would "be delivered in a safe and appropriate way".

The Guardian newspaper, as well as other major British and U.S. brands, similarly suspended their advertising on YouTube in response to their advertising appearing near offensive content. Google stated that it had "begun an extensive review of our advertising policies and have made a public commitment to put in place changes that give brands more control over where their ads appear".

In early April 2017, the YouTube channel h3h3Productions presented evidence claiming that a Wall Street Journal article had fabricated screenshots showing major brand advertising on an offensive video containing Johnny Rebel music overlaid on a Chief Keef music video, noting that the video itself had not earned any ad revenue for the uploader. h3h3Productions retracted the video after it was found that the ads had actually been triggered by the use of copyrighted content in the video.

On April 6, 2017, YouTube announced that, in order to "ensure revenue only flows to creators who are playing by the rules", it would change its practices to require that a channel undergo a policy compliance review and have at least 10,000 lifetime views before it may join the Partner Program.

On January 16, 2018, YouTube announced tighter thresholds where creators must have at least 4,000 hours of watch time within the past 12 months and at least 1,000 subscribers.

Child protection

See also: DaddyOFive and Elsagate

In 2017, YouTube was associated with several controversies related to child safety. During the second quarter of the year, the owners of the popular channel DaddyOFive, which featured them playing "pranks" on their children, were accused of child abuse. Their videos were eventually deleted, and two of their children were removed from their custody.

Later that year, YouTube came under criticism for showing inappropriate videos targeted at children and often featuring popular characters in violent, sexual or otherwise disturbing situations, many of which appeared on YouTube Kids and attracted millions of views. The term "Elsagate" was coined on the Internet and then used by various news outlets to refer to this controversy.

On November 11, 2017, YouTube announced it was strengthening site security to protect children from unsuitable content. Later that month, the company started to mass delete videos and channels that made improper use of family-friendly characters.

As part of a broader concern regarding child safety on YouTube, the wave of deletions also targeted channels that showed children taking part in inappropriate or dangerous activities under the guidance of adults.

Most notably, the company removed Toy Freaks, a channel with over 8.5 million subscribers that featured a father and his two daughters in odd and upsetting situations. According to analytics specialist SocialBlade, the channel earned up to £8.7 million annually prior to its deletion.

Also in November 2017, it was revealed in the media that many videos featuring children – often uploaded by the minors themselves, and showing innocent content – were attracting comments from pedophiles and circulating on the dark web, with predators finding the videos by typing in certain keywords in Russian.

As a result of the controversy, which added to the concern about "Elsagate", several major advertisers whose ads had been running against such videos froze spending on YouTube.

User comments

See also: Criticism of Google § YouTube user comments

Most videos enable users to leave comments, and these have attracted attention for the negative aspects of both their form and content.

In 2006, Time praised Web 2.0 for enabling "community and collaboration on a scale never seen before", and added that YouTube "harnesses the stupidity of crowds as well as its wisdom. Some of the comments on YouTube make you weep for the future of humanity just for the spelling alone, never mind the obscenity and the naked hatred". In 2009, The Guardian described users' comments on YouTube as:

Juvenile, aggressive, misspelled, sexist, homophobic, swinging from raging at the contents of a video to providing a pointlessly detailed description followed by a LOL, YouTube comments are a hotbed of infantile debate and unashamed ignorance – with the occasional burst of wit shining through.

In September 2008, The Daily Telegraph commented that YouTube was "notorious" for "some of the most confrontational and ill-formed comment exchanges on the internet", and reported on YouTube Comment Snob, "a new piece of software that blocks rude and illiterate posts".

The Huffington Post noted in April 2012 that finding comments on YouTube that appear "offensive, stupid and crass" to the "vast majority" of the people is hardly difficult.

On November 6, 2013, Google implemented a comment system oriented on Google+ that required all YouTube users to use a Google+ account in order to comment on videos.

The stated motivation for the change was giving creators more power to moderate and block comments, thereby addressing frequent criticisms of their quality and tone. The new system restored the ability to include URLs in comments, which had previously been removed due to problems with abuse.

In response, YouTube co-founder Jawed Karim posted the question "why the fuck do I need a Google+ account to comment on a video?" on his YouTube channel to express his negative opinion of the change. The official YouTube announcement received 20,097 "thumbs down" votes and generated more than 32,000 comments in two days.

Writing in the Newsday blog Silicon Island, Chase Melvin noted that "Google+ is nowhere near as popular a social media network as Facebook, but it's essentially being forced upon millions of YouTube users who don't want to lose their ability to comment on videos" and "Discussion forums across the Internet are already bursting with outcry against the new comment system". In the same article, Melvin went on to say:

Perhaps user complaints are justified, but the idea of revamping the old system isn't so bad.

Think of the crude, misogynistic and racially-charged mudslinging that has transpired over the last eight years on YouTube without any discernible moderation. Isn't any attempt to curb unidentified libelers worth a shot? The system is far from perfect, but Google should be lauded for trying to alleviate some of the damage caused by irate YouTubers hiding behind animosity and anonymity.

On July 27, 2015, Google announced in a blog post that it would be removing the requirement to sign up for a Google+ account to post comments to YouTube.

On November 3, 2016, YouTube announced a trial scheme which allows the creators of videos to decide whether to approve, hide or report the comments posted on videos based on an algorithm that detects potentially offensive comments.

Creators may also choose to keep or delete comments with links or hashtags in order to combat spam. They can also allow other users to moderate their comments.
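The moderation options described above can be pictured as a simple routing rule. The scoring threshold, function name, and score input in the sketch below are assumptions for illustration, not YouTube's implementation.

```python
# An illustrative routing rule for the 2016 comment-review options described
# above; the offensiveness score and threshold are assumptions.

import re

def route_comment(text: str, offensiveness: float,
                  hold_links_and_hashtags: bool = True) -> str:
    """Return 'published' or 'held_for_review' (the creator then approves,
    hides, or reports a held comment)."""
    if offensiveness >= 0.8:    # threshold chosen for illustration
        return "held_for_review"
    if hold_links_and_hashtags and re.search(r"https?://|#\w+", text):
        return "held_for_review"    # optional anti-spam hold for links/hashtags
    return "published"

print(route_comment("Great video!", 0.05))                   # published
print(route_comment("Check out http://spam.example", 0.1))   # held_for_review
```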

View counts

In December 2012, two billion views were removed from the view counts of Universal and Sony music videos on YouTube, prompting a claim by The Daily Dot that the views had been deleted due to a violation of the site's terms of service, which ban the use of automated processes to inflate view counts.

This was disputed by Billboard, which said that the two billion views had been moved to Vevo since the videos were no longer active on YouTube.


On August 5, 2015, YouTube removed the feature that caused a video's view count to freeze at "301" (later "301+") until the actual views were verified, a measure that had been intended to prevent view count fraud. View counts once again updated in real time.
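The retired behavior can be modeled as a display rule: counts above 301 were held at "301+" until verification completed. The sketch below is a toy model; the verification logic itself was never made public.

```python
# A toy model of the retired "301+" display rule described above; the
# actual verification pipeline was never made public.

def displayed_count(raw_views: int, verified: bool) -> str:
    """Pre-2015 display logic: freeze at '301+' pending anti-fraud checks."""
    if raw_views > 301 and not verified:
        return "301+"
    return str(raw_views)   # post-August-2015: counts update in real time

print(displayed_count(5_000, verified=False))   # 301+
print(displayed_count(5_000, verified=True))    # 5000
```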
