The House Committee on Energy and Commerce has been underwhelmed with Facebook’s response to the doctored video of Speaker of the House Nancy Pelosi (D-Calif.) that went viral last month.
Saying, “We are concerned that there may be a potential conflict of interest between Facebook’s bottom line and immediately addressing political discrimination on your platform,” the committee requested answers to the following questions by July 17:
- When was the original version of the video, which was altered to make Pelosi appear drunk, first reviewed by Facebook? What triggered the review? What standards were used to conduct the review? What was the outcome?
- Did Facebook review the original video or copies to determine whether Ad Breaks should be allowed, if it should be boosted, if it should be delivered via other forms of ads or if it should appear in connection with ads?
- Did any of the actions mentioned in the previous question occur?
- Did Facebook generate any revenue, directly or indirectly, from the original video or copies?
- On average, what is the time gap between content being posted to Facebook and flagged to third-party fact-checkers; review by those fact-checkers; and actions once that review is complete? What were those times in the case of the Pelosi video?
- How many full-time product managers does Facebook employ to address disinformation? How many third-party fact-checkers does the social network retain?
- What determines if content gets reviewed by third-party fact-checkers? Must those fact-checkers abide by standards or time frames?
- What steps is Facebook taking ahead of the 2020 presidential election in the U.S. to stop the spread of political disinformation?
Facebook co-founder and CEO Mark Zuckerberg addressed some of those questions at the Aspen Ideas Festival Wednesday, as reported by Queenie Wong of CNET.
He said of the social network’s response time, “During that time, it got more distribution than our policy … should have allowed, so that was an execution mistake.”
Zuckerberg also addressed the “fine line” between fake news and satire and opinion, Wong reported, saying, “This is a topic that can be very easily politicized. People who don’t like the way that something was cut … will kind of argue that … it did not reflect the true intent or was misinformation. But we exist in a society … where we value and cherish free expression.”
And on deepfakes—the use of artificial intelligence to alter videos and make it seem as though their subjects did or said something that never actually occurred—he said there is a “good case” that those videos are different from traditional misinformation, adding, “The policies continue to evolve. As technology develops, we continue to think through them.”
The letter was signed by Committee on Energy and Commerce chairman Rep. Frank Pallone Jr. (D-N.J.), ranking member Greg Walden (R-Ore.) and Reps. Nanette Diaz Barragán (D-Calif.), G.K. Butterfield (D-N.C.), Tony Cárdenas (D-Calif.), Kathy Castor (D-Fla.), Yvette Clarke (D-N.Y.), Diana DeGette (D-Colo.), Debbie Dingell (D-Mich.), Mike Doyle (D-Pa.), Eliot Engel (D-N.Y.), Anna Eshoo (D-Calif.), Robin Kelly (D-Ill.), Joseph Kennedy III (D-Mass.), Ann McLane Kuster (D-N.H.), David Loebsack (D-Iowa), Ben Ray Luján (D-N.M.), Doris Matsui (D-Calif.), Jerry McNerney (D-Calif.), Lisa Blunt Rochester (D-Del.), Raul Ruiz (D-Calif.), Bobby Rush (D-Ill.), John Sarbanes (D-Md.), Jan Schakowsky (D-Ill.), Darren Soto (D-Fla.), Paul Tonko (D-N.Y.), Marc Veasey (D-Texas) and Peter Welch (D-Vt.).