April 24, 2024

Can Threads Hold It Together? 3 Things Meta Should Get Right

In my little corner of the tech world, everyone is talking about Threads – the short-text platform that Meta launched earlier this month as a move to replace Twitter, which has struggled since Elon Musk took over last year, losing users and ad revenue. The opportunity was not lost on Mark Zuckerberg, CEO of Meta. “Twitter hasn’t been as successful as I think it should have been,” he told The Guardian, “and we want to do it differently.”

Zuckerberg and his team are definitely on to something. Threads racked up more than 100 million users in a few days. Whether or not they do it differently remains to be seen. As a former Trust and Safety domain expert for Twitter and Facebook, I have some concerns – concerns that led me to found T2.social, a new alternative platform that keeps Trust and Safety at its core. I worry that the mistakes of the past could be repeated: growth could once again become a safety hazard.

With major launches by companies such as Meta and Twitter, the emphasis is almost invariably on going live at all costs. Risks raised by researchers and operations colleagues are addressed only after the “successful” launch. This backward prioritization can have disastrous consequences.

Consider an example. In May 2021, Twitter launched Spaces, offering live audio chats. Before that launch, people across the company had expressed concern internally about how Spaces could be misused if the right safeguards were not in place. The company chose to move forward quickly, ignoring the warnings.

The following Christmas, the Washington Post reported that Spaces had become a megaphone for “Taliban supporters, white nationalists, and anti-vaccine activists spreading coronavirus misinformation,” and that some spaces hosted content disparaging transgender people and Black Americans. This happened largely because Twitter had not invested in human moderators or technologies capable of real-time audio monitoring. It could have been avoided if the company had made safety as important as shipping.

I’d like to think that the teams at Meta kept Twitter’s missteps in mind when preparing to release Threads, but I’ve yet to see clear indicators that prove it. Facebook has a troubled history on these matters, especially in new markets where the platform was not prepared for integrity issues. A few days ago, civil society organizations asked the company in an open letter to share what’s different this time: how is the company prioritizing healthy interactions? What plans does Meta have to combat abuse on the platform and prevent Threads from falling apart at the seams like its predecessors? In a reply sent to Insider’s Grace Eliza Goodwin, Meta said its enforcement tools and human review processes are “hardwired into Threads.”

Ultimately, there are three main initiatives that I know help build safe online communities for the long term. I hope Meta is taking these steps.

1. Set Healthy Norms And Make Them Easy To Follow

The first (and best) thing a platform can do to protect its community from abuse is to make sure abuse doesn’t happen in the first place. Platforms can firmly establish norms by carefully designing site guidelines in ways that are easy to read and easy to find. No one joins an online community to read a wall of legalese, so the most important points need to be stated in plain language and easily located on the site. Ideally, subtle reminders can be integrated into the UI to reinforce the most important rules. Then, of course, the team needs to enforce these guidelines quickly and consistently so that users know the rules actually mean something.

2. Encourage Positive Behavior

There are product elements that can encourage healthy behavior, working in conjunction with established norms and enforced guidelines. Nudges, for example, worked well on Twitter before the teams behind them were disbanded.

Starting in 2020, teams at Twitter ran experiments with a series of automated nudges that gave users a moment to reconsider posting potentially problematic responses. A prompt would appear if a user tried to post something with hateful language, giving them a chance to edit or delete their Tweet.

Although they could continue with their original versions if they wanted, the prompted users canceled their initial responses 9% of the time, and another 22% revised them before posting. This successful safety feature was discontinued after Elon Musk took control of the platform and let most of the team go, but it remains a proven strategy.
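The nudge flow described above is simple to reason about: check a draft before it posts, and if it looks potentially harmful, interrupt with a prompt instead of publishing. Here is a minimal sketch of that flow. Twitter’s real system relied on trained ML classifiers; this toy version substitutes a placeholder keyword check, and all names (`needs_nudge`, `submit_reply`, `FLAGGED_TERMS`) are hypothetical, not Twitter’s actual API.

```python
# Hypothetical sketch of a "reconsider before posting" nudge.
# The real system used ML classifiers; this illustration uses a
# placeholder keyword list purely to show the interaction flow.

FLAGGED_TERMS = {"idiot", "moron"}  # placeholder list, not Twitter's

def needs_nudge(draft: str) -> bool:
    """Return True if the draft contains potentially harmful language."""
    words = {w.strip(".,!?").lower() for w in draft.split()}
    return not FLAGGED_TERMS.isdisjoint(words)

def submit_reply(draft: str, confirm_anyway: bool = False) -> str:
    """Post the reply, or interrupt with a nudge first.

    The user keeps full control: they can edit, delete, or
    confirm and post the original text unchanged.
    """
    if needs_nudge(draft) and not confirm_anyway:
        return "nudge: edit, delete, or confirm to post as-is"
    return "posted"
```

The key design choice is that the nudge never blocks the user outright: passing `confirm_anyway=True` (i.e., tapping “post anyway”) always succeeds, which is what made the intervention feel like a pause rather than censorship.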

3. Keep an Open Dialogue With People

I’m lucky because my co-founders at T2 share my view of methodical growth that favors user experience over rapid scale. This approach gave me a unique opportunity to have deep, direct conversations with our early users as we built the platform. The users we spoke to at T2 have become skeptical of a “growth at all costs” approach. They say they don’t want to deal with platforms that prize scale above all else when the cost is toxicity and abuse.

Now, Meta is a public company focused on shareholder interests and, therefore, does not have that luxury. And by building on Instagram’s existing user base, Meta had a switch it could easily flip to flood the platform with engagement – an opportunity too good to pass up. It’s no surprise that the Threads team took this route.

That said, a company this size has huge teams and tons of tools at its disposal that can help monitor community health and open avenues for dialogue. I hope Meta uses them. Currently, Threads’ algorithms seem to prioritize high-visibility influencers and celebrities over everyone else, which already sets one-way conversations as the standard.

What I’ve learned over the years in the trenches working on trust and safety is that if you want to build a healthy community, listening to people and building with them is key. If the teams behind Threads neglect to listen, and favor engagement over healthy interactions, Threads will quickly become another unsatisfying experience that drives users away and misses an opportunity to deepen human connection. It won’t be any different from Twitter, no matter what Zuck says he wants.
