
Google to pay $170 million for YouTube child privacy breaches

By Bloomberg News (TNS) - Sep 05, 2019 - Last updated at Sep 05, 2019

Photo courtesy of bestofmicro.com

WASHINGTON — Google’s YouTube agreed on Wednesday to pay a $170 million fine and limit ads on kids’ videos to settle claims that the company violated children’s privacy laws.

The world’s largest video-sharing site agreed to pay the fine, which is a record for a children’s privacy case, to the US Federal Trade Commission and New York State for failing to obtain parental consent in collecting data on kids under the age of 13, the FTC said. Starting in four months, Google also will limit data collection and turn off commenting on videos aimed at kids, YouTube announced at the same time, moves that will hamstring its ability to sell advertising against a massive portion of its media library.

The settlement under the 1998 Children’s Online Privacy Protection Act, or COPPA, represents the most significant US enforcement action against a big technology company in at least five years over its practices involving minors. Washington is stepping up privacy and antitrust scrutiny of the big internet platforms that have largely operated with few regulatory constraints.

“The $170 million total monetary judgment is almost 30 times higher than the largest civil penalty previously imposed under COPPA,” FTC Chairman Joe Simons said in a joint statement with fellow Republican Commissioner Christine Wilson. “This significant judgment will get the attention of platforms, content providers and the public.”

The commission’s two Democrats broke from its three Republicans, however, saying the settlement did not go far enough to fix the problems. Some consumer advocates have slammed earlier reports of the fine as an insufficient deterrent, given the size of the company.

YouTube said it will rely on both machine learning and video creators themselves to identify what content is aimed at children. The algorithms will look at cues such as kids’ characters and toys, although the identification of youth content can be tricky. Content creators are being given four months to adjust before changes take effect, the company said.

The company will also spend more to promote its kids app and establish a $100 million fund, disbursed over three years, “dedicated to the creation of thoughtful, original children’s content”, Chief Executive Officer Susan Wojcicki wrote in a blog post.

“Today’s changes will allow us to better protect kids and families on YouTube,” Wojcicki wrote in the blog, which acknowledged the rising chances that children are watching the site alone. “In the coming months, we’ll share details on how we’re rethinking our overall approach to kids and families, including a dedicated kids experience on YouTube,” she said.

YouTube has already begun plans to strip videos aimed at kids of “targeted” ads, which rely on information such as web-browsing cookies, Bloomberg has reported. The company violated COPPA with data collection to serve these ads, the FTC alleged. Some consumer advocates say the move away from targeted ads would do little to stop tracking of kids when they watch content aimed at general audiences, and that relying on video creators to make the changes could hurt compliance.

The FTC has been cracking down on firms that violate COPPA. It fined the popular teen app now known as TikTok $5.7 million in February to resolve claims the video service failed to obtain parental consent before collecting names, e-mail addresses and other information from children under 13. The agency is also planning to revamp its rules around children’s online privacy.

Alphabet Inc.’s Google doesn’t break out sales for the video site, but the company has reported that YouTube is its second-largest source of revenue behind search advertising. Research firm Loup Ventures estimates that 5 per cent of YouTube’s annual revenue, or roughly $750 million a year, comes from content aimed at children.

YouTube had long maintained that children under 13 don’t use its site without parental supervision, as its terms of service stipulate, but according to the FTC, it touted young users in advertising materials. There is ample evidence that young viewers flock to the site, and consumer groups complained about the practice last year.

The site has already made tweaks as it tries to create a safer destination for children. In recent months, it changed its algorithm to promote what it called “quality” kids’ videos, a shift that alarmed many of its video creators. Wojcicki said the newest transitions “won’t be easy for some creators” and the company would work with them and provide resources to navigate the changes.

The company also introduced more parental controls for YouTube Kids, the app it launched in 2015 to offer a smaller selection of YouTube’s massive library, and created a web version of the app. The kids app’s audience is far smaller than YouTube’s primary audience of more than 2 billion monthly visitors, and data show the main site is used by more children than the kids app.

Democratic Senator Ed Markey of Massachusetts, who was a key force behind the passage of COPPA, wrote in a June 25 letter that besides deleting kids’ data, the FTC should make YouTube start a campaign to warn parents about minors’ use of the platform, create ways to identify users under 13 and prohibit it from launching new kids’ services without the approval of independent experts.

Google isn’t the only big internet platform facing pressure for its practices with minors. Children’s advocacy organisations have filed complaints with the FTC accusing Facebook Inc. of tricking children into making purchases while playing games on the social network. The company recently disclosed it has discussed its children’s chat app with the FTC, although it’s not clear whether there was a formal probe. Kids advocates have also alleged that Amazon.com Inc.’s Echo kids smart speaker violates privacy law.

Google and other tech giants have faced fines over their practices involving children before. In 2014, Google agreed to refund at least $19 million to settle with the FTC for failing to get parental consent for charges racked up by children playing games on mobile devices. Apple Inc. also agreed in 2014 to refund at least $32.5 million and change its billing practices after similar complaints. Yelp Inc. previously said it paid $450,000 over allegations that it failed to test the age-registration feature on its applications and collected names and e-mail addresses from children as young as nine years old without their parents’ consent.

By Ben Brody and Mark Bergen

 

 
