By Nikhila Natarajan
New York, Oct 7 (IANS): After throwing the kitchen sink at her former employer, Facebook whistleblower Frances Haugen laid out an insider's view of the simple "frictions" that could cool off the "toxic" and "divisive" algorithms she says are driving teens and vulnerable populations off a cliff on the world's largest social networking platform.
Starting in mid-September, leaked internal Facebook studies on Instagram's harms to teenagers - including triggering eating disorders and suicidal thoughts - have set off alarm bells across the country.
In three hours of Congressional testimony on Tuesday, Haugen accused Facebook of intentionally refusing to make changes to its algorithms because it put "profits" before people. She is urging US lawmakers to act swiftly. "I came forward at great personal risk because I believe we still have time to act. But we must act now. I'm asking you, our elected representatives, to act," Haugen said.
"The kids who are bullied on Instagram, the bullying follows them home. It follows them into their bedrooms. The last thing they see before they go to bed at night is someone being cruel to them," Haugen said.
"Kids are learning that their own friends, people who they care about, are cruel to them."
Haugen listed a series of small tweaks as a starting point. "A lot of the changes that I'm talking about are not going to make Facebook an unprofitable company, it just won't be a ludicrously profitable company like it is today," she said.
Haugen said her knowledge of "problematic use" suggests that the minimum age for using social media should be raised to 16 or 18.
"I strongly encourage raising age limits to 16 or 18 years old based on looking at the data around problematic use or addition on the platform and children's self regulation issues," Haugen told lawmakers.
Haugen said one of the simplest changes would be to make chronological ordering of posts the default, rather than bombarding the audience with hyper-targeted posts based on their inferred identity. As currently built, Facebook - and other social media platforms - let their algorithms predict what people want to see based on their engagement history over time.
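For readers unfamiliar with the distinction, here is a minimal sketch of the two ranking modes in Python, assuming a hypothetical Post record with a timestamp and a model-predicted engagement score. It is an illustration of the general idea, not Facebook's actual code:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Post:
    author: str
    created_at: datetime
    predicted_engagement: float  # hypothetical score from an engagement model

def rank_chronological(posts: list[Post]) -> list[Post]:
    # The default Haugen proposes: newest first, no personalization.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

def rank_by_engagement(posts: list[Post]) -> list[Post]:
    # The status quo: surface whatever a model predicts will keep the user scrolling.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

if __name__ == "__main__":
    now = datetime.now()
    feed = [
        Post("friend", now - timedelta(hours=1), predicted_engagement=0.2),
        Post("outrage_page", now - timedelta(days=2), predicted_engagement=0.9),
    ]
    # Chronological ordering puts the friend's recent post first; engagement
    # ranking resurfaces the two-day-old high-arousal post instead.
    print([p.author for p in rank_chronological(feed)])  # ['friend', 'outrage_page']
    print([p.author for p in rank_by_engagement(feed)])  # ['outrage_page', 'friend']
```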
Haugen pointed to Twitter as an example of the upside of inserting frictions. Twitter now routinely asks users whether they want to read an article before retweeting it. Such prompts are not hard interventions like deplatforming, yet Haugen said Facebook knows that features like these can dramatically reduce misinformation and hate speech and still refuses to act.
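The friction she describes can be as small as a single confirmation step. The sketch below, built around a hypothetical confirm_reshare helper, mimics the Twitter behaviour described above; it is illustrative and does not reproduce any platform's real code:

```python
def confirm_reshare(url: str, user_opened_link: bool) -> bool:
    """Return True if the reshare should proceed.

    Illustrative only: models the Twitter-style nudge that asks users
    to read an article before amplifying it.
    """
    if user_opened_link:
        return True  # no friction needed; the user has read the article
    answer = input(f"You haven't opened {url}. Share it anyway? [y/N] ")
    return answer.strip().lower() == "y"

# Example: a user tries to retweet an article they never clicked on.
if confirm_reshare("https://example.com/story", user_opened_link=False):
    print("Resharing...")
else:
    print("Reshare cancelled.")
```

The point of such a prompt is not to block sharing outright but to add a pause, which is why Haugen frames it as friction rather than censorship.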
Haugen made the case that these frictions will actually work better for the company in the longer term. According to Haugen, the company's own research shows that people tend to avoid the platform as the volume of exposure to toxic content increases.
"One could reason a kinder, friendlier, more collaborative Facebook might actually have more users five years from now, so it's in everyone's interest," she said.