Roger McNamee, an early Facebook investor and former mentor to CEO Mark Zuckerberg, has spent a week attacking his former protégé and his company for harming users and for spreading fake news. McNamee has written articles for The Washington Monthly, The Washington Post, and The Guardian over the last week, accusing Facebook of ignoring “bad actors” who manipulate its platform. He said Facebook needs to completely retool its business model, and that Zuckerberg should appear before Congress to justify the company’s “refusal to accept responsibility” for harming users.

Here is what McNamee wrote:

Facebook chief executive Mark Zuckerberg announced last week that he would spend 2018 fixing the problems with his platform that enable bad actors to do harm, such as Russia’s interference in the U.S. election. As Zuckerberg’s former mentor, I applaud this commitment and would like to offer my friend a road map to protect our democracy.

I first noticed bad actors exploiting Facebook in early 2016 and contacted Zuckerberg and Sheryl Sandberg, Facebook’s chief operating officer, just before the election. I spent four months trying to convince Facebook that its algorithms and advertising business model were vulnerable to bad actors. They were reluctant to accept my conclusion then and continued to deny and deflect through the end of 2017. The company still argues that it is not responsible for the actions of third parties on its platform.

I can understand that it was initially difficult for Facebook to believe that its product was at fault, but there is no longer any excuse for inaction. What we need from Zuckerberg is an acknowledgment that Facebook has some responsibility for what others do on its platform and that it is prepared to make fundamental changes to limit future harm. This week’s announcement of changes to Facebook’s News Feed may be a positive step, but it’s not a solution. Had this change been in place in 2016, it might even have exacerbated the Russian interference by increasing the exposure of Facebook group users to misinformation.

I recommend that Facebook follow the example of Johnson & Johnson during the Tylenol poisonings in 1982. Johnson & Johnson did not cause the tampering. It was not technically required to take responsibility, but it knew that doing so was the right thing to do. The company took immediate and aggressive action to protect its customers: it took every bottle of Tylenol off every retail shelf and redesigned the packaging to make it tamper-proof. There was a substantial economic cost in the short run, but the company built trust with customers that eventually offset it.

Following this model, the first step for Facebook is to admit it has a problem. Zuckerberg did that in his blog post. The next step is for Facebook to admit that its algorithms and advertising business model invite attacks by bad actors. By giving users only “what they want,” Facebook reinforces existing beliefs, makes them more extreme, and makes it hard for users to accept unpleasant facts. Instead of bringing people together, Facebook drives us apart.

The same tools that make Facebook so addictive for users and so effective for advertisers are dangerous in the hands of bad actors. And thanks to automation, Facebook cannot currently prevent harm. It will happen again and again until Facebook takes aggressive action. The problem cannot be fixed by hiring contractors to review problematic posts. The company needs to change the priorities of its algorithms and retool its business model. It needs to act like Johnson & Johnson.
Facebook also owes its users a personal apology. Thanks to Facebook’s negligence, 126 million Americans were exposed to Russian manipulation, and most of them do not realize it. To compensate, Facebook must notify every user touched by Russian election interference with a personal message explaining how the platform was manipulated and how that manipulation harmed users and the country. These messages should include copies of every post, group, event and ad each user received. Facebook is the only entity able to break through to users trapped in its filter bubbles. Sen. Richard Blumenthal (D-Conn.) made this request several months ago. Facebook’s response was a “portal” that was as hard to find as it was inadequate.

Finally, Zuckerberg should volunteer to testify in an open hearing before Congress. The country needs to hear him explain Facebook’s strategy and design choices and justify its refusal to accept responsibility for what bad actors are doing on the platform.

Facebook is tailor-made for abuse by bad actors, and unless the company takes immediate action, we should expect a lot more of it, including interference in upcoming elections. If Facebook chooses to protect its current business model, it has enough power and influence to skate by without implementing the changes needed to protect democracy and public health in the United States and across the world. But users and regulators are watching. Zuckerberg and Sandberg have an opportunity to be heroes or villains. The choice is theirs.

According to McNamee’s version of events, he received a call from Facebook’s then-chief privacy officer Chris Kelly in 2006. Kelly was hoping McNamee, as an experienced investor and disinterested party, could advise a young Mark Zuckerberg on whether or not to sell Facebook. McNamee met with Zuckerberg and advised him not to sell the company. “I was convinced that Mark had created a game-changing platform that would eventually be bigger than Google was at the time,” McNamee wrote. What followed, he said, “was the most painful silence of my professional career.” As it happened, Yahoo had just offered $1 billion (then around £509 million) for Facebook. Though everyone was advising the opposite, Zuckerberg took McNamee’s advice and turned the offer down.

McNamee went on to mentor Zuckerberg and invest in Facebook through Elevation Partners, the investment firm he cofounded with U2 frontman Bono.