Online communities have become a staple of today’s digital world. From social media groups to topical forums to direct interactions between brands and customers, audience participation is driving engagement.
But, a challenge has also arisen: a high volume of inflammatory or inappropriate comments in these interactions. It’s apparent that many users of these platforms feel comfortable making bold statements behind the digital curtain. Some use the veil of anonymity, while others are fine posting under identifiable user names. Either way, anyone who participates in any of the countless digital communities has seen how content in these channels has changed over the years. Perhaps that’s not surprising, given that some of our most public figures have ditched any filters on their interactions with each other and the public.
What’s clear is that moderation is required, especially given the influence digital interactions can have in both public and private forums. The Brainy Insights projects the content moderation market will grow from $13 billion in 2022 to $40.4 billion by 2032.
But, given the high volume of interactions, moderation can hardly be accomplished by humans alone. Sendbird, an all-in-one communications API platform that helps drive positive interactions across web and mobile applications, has a solution.
“It takes 40 positive interactions to overcome a single negative interaction,” says Sendbird CEO and co-founder John Kim. “Brands, whether communicating directly with their customers or relying on their customers to communicate with one another in private communities, can use moderation to build better user experiences. That leads to long-term brand equity and loyalty.”
Sendbird has been a pioneer in API technology, enabling diverse applications to integrate chat, video calls, and live streaming functionalities. Its clientele includes industry giants like Yahoo, Hinge, Krafton, and Paytm. With over 7 billion interactions monthly across more than 4,000 applications, Sendbird's influence on digital communication is substantial.
Today, Sendbird takes its innovation a step further to help ensure the safety and integrity of online interactions, introducing a new content moderation solution as it looks to set new industry standards for online community safety and engagement.
Sendbird Advanced Moderation is a timely response to the growing concerns around online safety. As digital platforms become increasingly central to social interaction, the challenge of managing user-generated content without stifling community engagement has become more complex.
Sendbird Advanced Moderation marks a significant shift in content moderation, blending the speed and accuracy of automation with the nuanced understanding of human moderators. This hybrid approach is a recognition of the limitations inherent in both purely automated systems and solely human moderation.
At the core of the solution is the Sendbird Rule Engine, an advanced tool that scans user interactions, from text to multimedia, for inappropriate content, based on customer-defined rules. What sets the Sendbird engine apart is its adaptability – it allows moderators to tailor rules to reflect their communities’ unique standards and values. This customization ensures a consistent, fair, and safe environment across various social platforms, forums, and applications.
In cases where automated systems face ambiguity, Sendbird's Moderation Review Queue comes into play. This feature allows for human intervention, ensuring that the complexity of human communication is adequately addressed. The Live Moderation Dashboard further enhances real-time interaction oversight, providing moderators with a direct window into chat groups and channels.
Programmatic rules could include things like detecting bad behavior or curse words, message or user reporting, and deleted messages. Rules can also be conditional, based on time-framed repetition (e.g., curse words sent multiple times within 10-15 seconds) or frequency (e.g., three users reporting the same message).
The rule engine can also be programmed to take specific actions based on identified rule violations, such as muting or banning users for a period of time (or permanently), deleting messages, or referring the violation to a human moderator for review and action.
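To make those mechanics concrete, here is a minimal sketch of how conditional rules and their actions might fit together. This is a hypothetical illustration in Python, not Sendbird’s actual API; the word list, thresholds, and function names are all placeholder assumptions.

```python
import time
from collections import defaultdict, deque

# Hypothetical rule-engine sketch (not Sendbird's API): conditional rules and actions.
PROFANITY = {"badword1", "badword2"}   # placeholder word list
REPEAT_WINDOW_SECS = 15                # time-framed repetition window
REPORT_THRESHOLD = 3                   # e.g., 3 users reporting the same message

profanity_hits = defaultdict(deque)    # user_id -> timestamps of recent violations
reports = defaultdict(set)             # message_id -> distinct reporting user_ids

def on_message(user_id: str, text: str) -> str:
    """Return an action for a new message: 'allow', 'delete', or 'mute'."""
    now = time.time()
    if any(word in PROFANITY for word in text.lower().split()):
        hits = profanity_hits[user_id]
        hits.append(now)
        while hits and now - hits[0] > REPEAT_WINDOW_SECS:
            hits.popleft()             # discard violations outside the window
        if len(hits) > 1:
            return "mute"              # repeated profanity within the window
        return "delete"                # single violation: remove the message
    return "allow"

def on_report(message_id: str, reporter_id: str) -> str:
    """Frequency rule: escalate to a human moderator once enough distinct users report."""
    reports[message_id].add(reporter_id)
    if len(reports[message_id]) >= REPORT_THRESHOLD:
        return "escalate"              # route to the moderation review queue
    return "allow"
```

In a real deployment, the thresholds, word lists, and resulting actions would be the customer-defined configuration described above, with ambiguous cases routed to the review queue rather than resolved automatically.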
While the automated rule engine can be used as a standalone tool, Kim explains that it works best when automation and human moderation are combined.
“It is the combination of automation with human insight that yields the best trust and safety results for online communities,” Kim told TMC. “Automation ensures consistency and scalability, while moderators handle more complex cases and manage rule improvement and maintenance to adapt to the community’s constant evolution.”
Sendbird's commitment to transparency and continuous improvement is furthered by its Moderation Logs. These logs record every action taken, forming a transparent trail for audits and rule refinement.
The impact of Sendbird's Advanced Moderation is already being felt across various industries. Kakao Entertainment, for instance, has praised the system for its flexibility and efficiency in tailoring moderation rules to their specific needs, enhancing the safety and vibrancy of their online community.
Moderation, though, isn’t important merely for social platforms and large online communities. Many businesses have a very real need to ensure their communications platforms are used appropriately to avoid putting their brands or their customers at risk.
“The impact of moderation on social platforms, like X (fka Twitter), Reddit, or Facebook is easy to grasp, but moderation also plays a significant role for companies with different business models,” Kim explains.
For instance, take a company like Uber, where the ability to manage driver-rider interactions helps define the customer experience. Or take a financial services firm like Wells Fargo, where advisors and clients engage in in-app conversations regularly. Moderation is critical to compliance and reputation management.
Naturally, all of this logging raises the question of privacy, which is inherent to any digital product. Sendbird Advanced Moderation either anonymizes or hashes sensitive information to protect both brands and their users/customers.
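As a rough sketch of what hashing sensitive fields can look like in practice (an assumption about one common approach, not Sendbird’s documented internals), identifiers can be replaced with keyed one-way hashes before moderation logs are stored:

```python
import hashlib
import hmac

# Hypothetical pseudonymization sketch (not Sendbird's documented internals).
SECRET_KEY = b"rotate-me-regularly"  # placeholder; use a managed secret in practice

def pseudonymize(value: str) -> str:
    """Map an identifier to a keyed one-way hash: the same user always yields the
    same token, so audits can correlate actions without exposing the raw value."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

log_entry = {
    "action": "mute",
    "user": pseudonymize("user@example.com"),  # deterministic token, not the email
    "reason": "repeated profanity",
}
```

The keyed hash preserves auditability, since every action by the same user maps to the same token, while keeping raw identifiers out of the logs.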
Kim emphasizes the importance of language and intent in online communities. His vision for Sendbird is to foster safer, more engaging online spaces, where interactions are not only secure but enriching. That vision is now being realized through this hybrid moderation system.
Unfortunately, today’s reality is that abuse by bad actors can and will continue to occur in private or public settings, from social media and gaming apps to dating and healthcare apps.
“Sendbird Advanced Moderation helps foster a trusted and safe digital environment anywhere customer communications occur,” Kim concludes. “It combines the speed and precision of automation with human moderators' invaluable insight to ensure fast, scalable, and consistent treatment of problematic content, ensuring a quick and fair treatment of any incidents.”
But, as we’ve seen over the past several years, there’s a line between moderation and censorship. Does moderation run the risk of becoming censorship?
Kim acknowledges the risk of moderation tools being used for censorship. But, he is also correct in noting that Sendbird delivers the moderation technology and is not involved in deciding how a brand approaches moderation.
“This risk arises when those in charge of moderation impose overly restrictive rules or use their authority to suppress dissenting voices or unpopular opinions,” he says. “It's crucial to ensure that moderation practices are transparent, fair, and aligned with community guidelines to prevent them from being used as censorship. Striking the right balance between moderation and censorship is essential for maintaining healthy online communities. We certainly encourage our customers to conduct business in an ethical manner.”
As digital platforms and brands grapple with the dual need for openness and safety, solutions like Sendbird Advanced Moderation offer a way forward that balances these priorities. As digital interactions continue to evolve, Sendbird’s role in helping shape safe and engaging online communities will be crucial. The success in deploying these new tools will not only benefit clients and their user communities, but also set a precedent for the future of digital communication and community management.
Sendbird Advanced Moderation is currently in private beta. For an opportunity to test it for your digital interactions, request access here.
Edited by Erik Linask