
Why information exposed by the Facebook whistleblower matters

A University of St. Thomas professor says the information exposed by Frances Haugen could prove critical, but only if change follows.

ST PAUL, Minn. — From the time she began sharing thousands of documents with the Wall Street Journal to the moment she revealed her identity on CBS' 60 Minutes, Facebook whistleblower Frances Haugen has continued to criticize her former employer, which she says prioritized profits at the expense of safety.

"It's a hugely courageous act," said Katherina Pattit, a University of St. Thomas professor who has spent years studying how social media intersects with, and influences, business law and ethics.

Pattit says there is well-established research into the addictive nature of social media and the role it can play in shaping emotions, but she says the whistleblower's disclosures take our understanding of the issue a step further.

Pattit: "When you see internal conversations between people that, essentially, recognize that, yes, we have been lying - and you see it in front of you - you can't really deny that anymore."

"It reminded me kind of how the scandal with the opioids started to unravel. It became obvious, through internal documents, how much the (drug) companies knew about their addictive nature and the way they were actually marketing these opioids, which was intentionally trying to drive up prescriptions rates."

Kent Erdahl: "There might be some people saying, come on, Facebook is not on the same level as the opioid crisis."

Pattit: "Well, addiction is the core element in both of those things, and we've seen, from studies for example, that cortisol - stress level hormones - go up in these (social media) sites. People get increasingly depressed, and even the rage and anger has all kinds of negative biological effects as well. We might be talking, on one hand, about a drug that causes addiction and also harmful effects in humans, but Facebook and engaging in this sort of repeated consumption of those types of content, has just as much psychological and biological harm of those associated with it, in the end."

But unlike the opioid crisis, in which whistleblower evidence eventually led to landmark lawsuits and legislation, Pattit says she's more skeptical of the road ahead for prosecuting or policing Facebook.

Pattit: "Let's say someone decides to say, 'storm the Capitol,' or they get depressed and kill themselves. It's very difficult to establish that kind of causality and that's often the most important piece that will make a company liable in the end, for a certain act."

Erdahl: "Do you believe Facebook is killing people?"

Pattit: "It's not that simple. I think if it was that simple, the solutions would have been here already, but it's not."

In the meantime, her solution is simple.

"I am not on Facebook," she said. "I noticed that I was spending way too much time on Facebook, despite teaching it and knowing all about it and so I decided I had to go cold turkey."

Pattit says a fix that would help society as a whole would likely require Facebook to change its algorithm and share data about it more widely. She advocates for using artificial intelligence that isn't built on machine learning. Instead, she hopes to see algorithms that abide by a set of guiding principles and ethics.
