
Research may help combat abusive online comments


Researchers at the Georgia Institute of Technology’s School of Interactive Computing have come up with a novel computational approach that could provide a more cost- and resource-effective way for internet communities to moderate abusive content.


They call it the Bag of Communities (BoC), a technique that leverages large-scale, preexisting data from other internet communities to train an algorithm to identify abusive behavior within a separate target community.


Specifically, they identified nine different communities. Five, such as the free-for-all community 4chan, are rife with abusive behavior from commenters; the other four, like the heavily moderated MetaFilter, are helpful, positive, and supportive.

Using linguistic characteristics from these two types of communities, the researchers built an algorithm that learns from those comments and, when a new post appears in a target community, predicts whether or not it is abusive.
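A minimal sketch of that idea, assuming scikit-learn and invented toy posts (the paper's actual feature set and classifier are not specified here), could look like this: posts collected from other communities serve as labelled training examples, and the resulting model scores new posts in the target community.

```python
# Sketch of the Bag of Communities idea (not the authors' exact features or
# model): train a text classifier on posts drawn from other communities --
# abusive communities labelled 1, supportive ones 0 -- then apply it to
# posts from a separate target community it has never seen.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical out-of-domain training data gathered from other communities.
source_posts = [
    "you are all idiots and should leave",        # from an abusive community
    "get lost, nobody asked for your opinion",    # from an abusive community
    "here is a thoughtful answer with sources",   # from a supportive community
    "thanks for sharing, this was really helpful" # from a supportive community
]
source_labels = [1, 1, 0, 0]  # 1 = abusive-community post, 0 = supportive

# Simple TF-IDF word features stand in for the paper's richer linguistic
# characteristics; this is only illustrative.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(source_posts, source_labels)

# A new post arriving in the target community is scored without any
# moderated examples from that community ("static" use of the model).
new_post = ["go away, nobody wants you here"]
print(model.predict_proba(new_post)[0, 1])  # estimated probability of abuse
```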

“MetaFilter is known around the internet as a good, helpful, supportive community,” said Eric Gilbert, an associate professor in the School of Interactive Computing and a member of the team of researchers on the project. “That’s an example of how, if your post is closer to that, it’s more likely that it should stay on the site. Conversely, if your post is closer to 4chan, then maybe it should come off.”

The researchers provide two algorithms. One is a static model, used off the shelf with no training examples from the target community, that achieves roughly 75 percent accuracy. In other words, with access only to posts from the other nine communities, the model correctly flags abusive posts in the target community roughly three quarters of the time.

“A new community that does not have enough resources to actually build automated algorithms to detect abusive content could use the static model,” said Georgia Tech doctoral student Eshwar Chandrasekharan, who led the team.

A dynamic model, one that mimics scenarios in which newly moderated data arrives in batches, learns over time and can achieve 91.18 percent accuracy after seeing 100,000 human-moderated posts.

“Over time, as new moderator labels come in, when it has seen examples of things that have been moderated from the site, it can learn more site-specific information,” Chandrasekharan said. “It can learn the type of comments that get moderated, and if there is a level of tolerance that is different from what you see in the static model, it could learn that over time.”
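The batch-update behavior Chandrasekharan describes resembles standard incremental learning. As a hedged illustration (again using scikit-learn and invented example posts, not the authors' training procedure), a model can fold in each new batch of moderator-labelled posts and gradually pick up site-specific norms:

```python
# Sketch of the "dynamic" variant: keep updating the classifier as batches
# of human-moderated posts arrive from the target site. Uses scikit-learn's
# incremental-learning API; illustrative only, not the paper's exact setup.
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

vectorizer = HashingVectorizer(n_features=2 ** 18, alternate_sign=False)
clf = SGDClassifier(loss="log_loss")  # logistic-regression-style online learner

def update(posts, labels):
    """Fold one batch of moderator-labelled posts into the model."""
    X = vectorizer.transform(posts)
    clf.partial_fit(X, labels, classes=[0, 1])

# The first batch could come from the out-of-domain "bag of communities"...
update(["you are all idiots", "great write-up, thanks"], [1, 0])
# ...while later batches come from the target community's own moderators,
# so the model slowly learns that site's particular level of tolerance.
update(["this breaks our house rules", "welcome to the forum!"], [1, 0])

score = clf.predict_proba(vectorizer.transform(["go away, troll"]))[0, 1]
print(score)  # estimated probability the new post is abusive
```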

Both the static and dynamic models outperformed a solely in-domain model from a major internet community.

Anyone who has managed an online community has encountered problems with abusive content from users. From social media to message boards to the comments sections of online news publications, regulating what is and isn't allowed has become costly and taxing for human moderators.

Founders of the social media startup Yik Yak spent months in the company's early days removing hate speech by hand, and Twitter has stated publicly that dealing with abusive behavior remains its most pressing challenge. Major news organizations are buried under the demands of strict moderation, and many have shut down their comments sections altogether.

Prior research into abuse detection and online content moderation has focused on in-domain methods – using data collected from within your own community – but those face challenges in obtaining enough data to build and evaluate algorithms. In a BoC-based method, algorithms would leverage out-of-domain data from other existing online communities.

Gilbert said that the applications from such a model could be widespread.

“This is a core internet problem,” he said. “So many places struggle with this, and many are shutting comments off because they just don’t want to deal with the trouble they cause.”

The research is presented in the paper "The Bag of Communities: Identifying Abusive Behavior Online with Preexisting Internet Data" at the 2017 ACM CHI Conference on Human Factors in Computing Systems.

