The State of Online Harassment on Social Media
- How could platform moderation change with Musk?
- What are the risks of unmoderated social media platforms?
- How effectively are social media companies addressing online harassment?
This problem runs far deeper than Trust and Safety teams: software engineers, community managers, and marketers are also being tasked with navigating these reports in their mentions and direct messages. We are already seeing the first signs of sexual assault and harassment in the metaverse. Law enforcement is ill-equipped to handle these reports because it does not treat digital crimes with the same severity as physical crimes. But police aren't the only ones falling short.
HR departments are onboarding new hires in the metaverse, which can put women at risk of being assaulted in front of their colleagues. Who is liable if this happens? Have companies thought through the severity of sexual assault in the metaverse? Or are we setting women up for failure?
Who will protect these women if law enforcement won't, and HR and legal don't even have guidelines for it?
Companies have over-invested in woke talking points and line items at the expense of investing in online harassment training. The result is that employees are tasked with handling something they have zero experience handling, and their response can lead to further escalation that puts someone's life at risk.
Transparency reports across the major social media platforms show that online abuse and harassment are their largest issue.
The answer to abuse and harassment isn't to empower abusers to abuse and harassers to harass. The answer to stalking isn't to empower stalkers to stalk. The answer starts with acknowledging that the offline problems courts are trained to deal with have moved online, at scale.
We are essentially asking trust and safety teams to enforce a digital restraining order, without ever being handed one.
We need better solutions to create a safer Internet. That doesn't start with decentralization. It starts with centralized coordination between law enforcement and social media companies, and with acknowledging the severity of the issue.