A mass shooting live-streamed on Facebook. The plan for another mass murder posted online before the shots rang out.
In light of these recent connections between social media and violent acts, a Senate committee put Google, Facebook, and Twitter in the hot seat Wednesday.
These platforms say they’ve become a way to communicate with loved ones during tragedies, to organize events where people can gather and grieve, and to raise money to support victims.
But they say they’re also working to prevent these incidents altogether.
“We just don’t allow the violence, period,” said Monika Bickert, Facebook’s Head of Global Policy Management.
Facebook, Twitter, and Google told lawmakers they have made progress toward protecting users from online hate and violence.
The companies say over the past two years they have removed millions of violent posts, suspended thousands of accounts, and hired hundreds of employees to search out hate speech that might encourage terrorism.
Mississippi Senator Roger Wicker says the social media giants still have work to do.
“This is a matter of serious importance to the safety and wellbeing of our nation’s communities.”
And Florida Senator Rick Scott pointed to a major failure.
“Someone with the username Nicholas Cruz had posted a comment on a YouTube video that said, ‘I am going to be a professional school shooter,’” said Scott.
Less than six months later, a gunman with the same name killed 17 people at a high school in Parkland, Florida, while Scott was governor.
He asked the representative from Google, which owns YouTube, what happened.
“We strive to be vigilant, to invest heavily, to proactively report when we see an imminent threat. I don’t have the details on the specific facts you’re describing,” said Derek Slater, Google’s Global Director of Information Policy.
But the companies did say Parkland and other tragedies highlight why they have made it a priority to work more closely with each other and with law enforcement.
“Removing content alone will not stop those who are determined to cause harm,” said Nick Pickles, Twitter’s Public Policy Director.
The companies and agencies now share information and technology more effectively. But they say the content often just moves to darker corners of the internet.
“The threat environment that we are in today as a country has changed and evolved in the past 24 to 36 months,” said George Selim, Senior Vice President of Programs at the Anti-Defamation League.
As the platforms work to keep up, members of Congress plan to ask the Department of Justice for its help, too.
Some lawmakers are hoping the DOJ will police hate crimes in the darker corners of the web, as it has with child pornography in recent years.