Frances Haugen, the Facebook whistleblower with Massachusetts ties, testified before Congress Tuesday.
Haugen, a graduate of Needham’s Olin College of Engineering and Harvard Business School, testified to the Senate Commerce subcommittee on consumer protection at a 10 a.m. hearing Tuesday. The data scientist removed a stash of private Facebook research when she left the company earlier this year.
“I have to get out enough that no one can question that this is real,” she told “60 Minutes” on Sunday.
Of her motivation, she said, “Imagine you know what’s going on inside Facebook and you know no one on the outside knows.”
In the interview, Haugen describes how Facebook’s own research shows that hateful, divisive content helps inspire users and keeps them engaged.
“If they change the algorithm to be safer, people will spend less time on the site, they’ll click less on ads and they’ll make less money,” she said.
U.S. Sen. Ed Markey, D-Mass., sits on the committee before which Haugen testified Tuesday.
“I will be asking her about conversations which she had with her Facebook bosses,” Markey said in an interview Monday.
He said he is concerned about Facebook research that shows the harmful effects of Instagram on teenage girls. He said the insider intel proves the need for regulation.
“Facebook says the problems of democracy are already there. Facebook says the problems of exploitation of children are already there,” Markey said. “They worsen those problems. They don’t solve them, they make them even harder to bring a resolution to.”
Asked whether he thinks anything will actually change because of the whistleblower’s revelations, Markey said he does.
“I’ve been waiting for the moment where I had the evidence so that I could provide a policy Bill of Rights for every child in America,” he said.
Markey also said the Jan. 6 Capitol insurrection is another topic sure to be discussed at the hearing.
In response to the “60 Minutes” report, Facebook said it has to balance protecting people’s right to express themselves openly with keeping the platform safe and positive.
“To suggest we encourage bad content and do nothing is just not true,” the company wrote in a statement.
Emerson College social media expert David Gerzof Richard said Haugen’s claims confirm “what we’ve suspected for a long time.”
Facebook has made the business decision that it’s okay to push misinformation and hate speech as long as it’s getting people to engage, he said: “They decided to go with profits over what is better for the public.”
Gerzof Richard said that all of the attention may motivate Facebook to make some changes for fear of the alternative, adding, “There’s going to be some pressure coming down.”