In December, when Facebook launched Messenger Kids, an app for preteens and children as young as 6, the company stressed that it had worked closely with parenting experts in order to safeguard its young users. What Facebook didn’t say is that many of those experts had received funding from Facebook.

Equally notable are the experts Facebook did not consult. Although Facebook says it spent 18 months developing the app, Common Sense Media and the Campaign for a Commercial-Free Childhood, two large nonprofits in the field, say they weren’t informed about it until weeks or days before the app’s debut. “They reached out to me personally the Friday before it launched, when clearly it was a fait accompli,” says Josh Golin, executive director of the Campaign for a Commercial-Free Childhood. Facebook, he says, is “trying to represent that they have much better support for this than they really do.” Academics Sherry Turkle and Jean Twenge, well-known researchers whose work on children and technology is often cited, didn’t know about the app until after it launched.

The omissions promptly came back to bite Facebook. Eight weeks after the Messenger Kids debut, Golin helped organize a group of nearly 100 child-health advocates who asked Facebook CEO Mark Zuckerberg to kill the app because it could undermine healthy child development. That same week, Common Sense Media announced that it would help fund a lobbying effort around the dangers of addictive technology, including a curriculum distributed among 55,000 public schools that would highlight concerns, such as a possible link between heavy social media use and depression.

Antigone Davis, Facebook’s global head of safety, says Facebook solicited, and listened to, input from a variety of people before launching the app. “We took much of what we heard and incorporated it into the app,” she says. “For example, we heard from parents and privacy advocates that they did not want ads in the app, and we made the decision not to have ads.”

Fixing Problems

Facebook’s approach to outside voices on Messenger Kids is echoed in its efforts to “fix” other controversial issues, such as fake news and election interference. As pressure mounts, Facebook touts its commitment to solving a difficult problem, often citing partnerships with third-party experts as a sign of its seriousness. Behind the scenes, however, the company sometimes obscures financial ties with experts, discounts high-profile critics, or co-opts outside efforts to address the same concerns.

Last week, for example, frustrated fact-checkers pressured Facebook into a meeting at the company’s headquarters, claiming they had been shut out of vital data necessary to assess whether their efforts to combat fake news were working. Days after social-media analyst Jonathan Albright discovered that Russian propaganda may have been viewed hundreds of millions of times around the presidential election, Facebook called Albright, but then scrubbed the data from the internet. Cindy Southworth, one of the experts often cited in support of Facebook’s controversial project to combat revenge porn, belongs to a nonprofit that has received funding from Facebook. After former Google design ethicist Tristan Harris popularized the phrase “time well spent” to warn against the dangers of addictive technology, Zuckerberg adopted the phrase as well. Several times in recent months, he has promised to make sure that “time spent on Facebook is time well spent.” But Harris doesn’t believe Facebook is sincere. “It’s too bad to see Facebook co-opt the term without taking its meaning seriously beyond asking what are ‘meaningful interactions,’” he tweeted Monday.


The debate over kids and smartphones is far from settled, including disagreement over studies tying social media use to depression in teens, such as research conducted by Twenge. One side argues that kids are already on social media and need guidance to learn how to use it safely. The other side says tech giants have crossed the line by targeting young children and are charging ahead without understanding the effects. The only thing everyone agrees on? The need for more research and better parental controls. In this polarized climate, Facebook initially deflected criticism by presenting Messenger Kids as the outcome of careful consultation with a range of outside experts, even as it subtly stacked the deck.

Facebook has toyed with targeting kids under 13 since at least 2011, when Zuckerberg swore to someday “fight” the Children’s Online Privacy Protection Act, which requires companies to obtain parental permission before collecting data on anyone under 13. Until December, though, it had not overtly targeted younger children.

Facebook’s blog post announcing Messenger Kids emphasizes that the app was “co-developed” with parents and experts, through conversations with the National PTA, Blue Star Families, and an advisory board with more than a dozen experts, from groups such as the Yale Center for Emotional Intelligence, Connect Safely, and Sesame Workshop. In an accompanying press release, Facebook quoted comments from roundtable discussions held by the National PTA and a parent who gave feedback to New Mexico State University’s Learning Games Lab.

Financial Ties

One Facebook post said the company had “collaborated” with the National PTA, but it did not mention Facebook’s financial ties to the group, or to others among its advisers. The National PTA says Facebook donated money for the first time in 2017, which the organization used to fund a survey and roundtables. Facebook says it previously donated “small amounts” unrelated to the app to Blue Star Families, a nonprofit for military families. Facebook funded the research at New Mexico State. At least seven of Facebook’s 13-person advisory board have some kind of financial tie to the company. In 2017, Facebook donated money to the Family Online Safety Institute, which has two representatives on the board, as well as Connect Safely, the Yale Center for Emotional Intelligence, and Telefono Azzurro, which each have one representative on the board. In 2017, Facebook also donated at least $50,000 to MediaSmarts, which has two members on the board. One board member, former Sesame Workshop executive Lewis Bernstein, now works as a consultant advising Facebook on developing content for teens, unrelated to Messenger Kids. Bernstein and other board members have gone on to write op-eds in The Hill and the San Jose Mercury News supporting Facebook’s app. WIRED previously reported that Facebook had donated to FOSI, Connect Safely, and MediaSmarts.

“There was no attempt to not be upfront about it,” says Davis, the Facebook executive. Many of the groups on the Messenger Kids advisory board are also on Facebook’s safety advisory board, which was created in 2009. Davis says that Facebook’s financial support for safety advisory board members has previously been reported. “We’ve had that conversation publicly many, many times,” she says. The board is featured at the top of Facebook’s Safety Center, without disclosing that some members receive funding. On a linked page, the company says “Facebook consults” with these organizations. In a statement, Davis says, “We do not want there to be a financial burden to cooperating with Facebook.” She says the company sometimes offers “funding to cover programmatic or logistics expenses” of partner organizations.

Funding from Facebook may not have affected the feedback or research around Messenger Kids. The Facebook advisers who spoke to WIRED offered thoughtful views, based on personal experience or supported by research. Board member Michael Rich, who founded the Center on Media and Child Health at Boston Children’s Hospital, also partnered with Apple shareholders on a widely circulated letter asking the company to research potential impacts of smartphones on children and build better tools for parents. Kristelle Lavallee, a content strategist at Rich’s center, who is also on Facebook’s kids advisory board, compared the desire to shut down Messenger Kids to abstinence-only education. “Nobody is saying they have the answers, because no one does,” she says, but as researchers and educators, “it’s really our job to understand these tools.” Barbara Chamberlin, who runs the New Mexico lab, says she agreed to work with Facebook only after the company promised that her lab’s research would be fully integrated into the development process. National PTA president Jim Accomando says, “It is important that families are armed with resources and tools to help them take advantage of the opportunities the digital world provides while building good digital habits and ensuring children have the skills they need to be responsible online.”

Participants in such discussions say some of the outside perspectives helped shape Messenger Kids, but that Facebook appeared to have already decided some issues. Bernstein, the former Sesame Workshop executive, says that at the one session he attended in Palo Alto, advisers brought up the age range. “We said 6 is really young, 7 is young for this, even 8 is young,” he says. Facebook responded by saying the children of deployed service members would find it useful. “We said OK, but note those precautions,” Bernstein says.

Parents have to set up their child’s account on Messenger Kids, confirming their identity by logging into Facebook. Children can’t be found in search, so parents control initiating and responding to friend requests. The app does not include advertising, and the company says it will not use the data for ad purposes, topics that Facebook’s Davis says came up often. But the terms of service allow the company to collect information like the content of children’s messages, photos they send, and what features they’re using, and then share that info among companies owned by Facebook, as well as third-party vendors handling functions including customer support and analysis.

Facebook’s support of academics and advocacy groups is not uncommon. Google’s academic influence campaign has been well documented, and Google has also donated to both the Family Online Safety Institute and Connect Safely. The Family Online Safety Institute’s board includes executives from Facebook, Google, Comcast, Amazon, Twitter, Microsoft, AT&T, Netflix, and others. Common Sense Media works with Apple, AT&T, Comcast, DirecTV, Netflix, Microsoft, and others as distribution partners for its content. Comcast and DirecTV donated a combined $50 million in media and airtime for the anti-tech-addiction lobbying campaign.

Golin, of the Campaign for a Commercial-Free Childhood, says Facebook, for all its faults, was more responsive than Google. Golin says his organization offered to meet with YouTube over its concerns about YouTube Kids but didn’t hear back. At least, he says, “I’ve met with Facebook.” Still, he says Facebook’s refusal so far to consider shutting down the app is telling. “If the parameters are just ‘How can we make this app a little safer and a little less harmful?’ then the conversation is already so restricted,” he says.

Facing the Music

For two years, Mark Zuckerberg has battled crises around bias, fake news, extremism, and Russia’s interference in the 2016 election. Read WIRED’s cover story on the internal drama.

Facebook said Messenger Kids would help safeguard preteens who may be using unauthorized and unsupervised social-media accounts.

Child-health advocates asked Facebook to discontinue Messenger Kids, saying it could undermine healthy child development.