Twitter disbands Trust & Safety Council after key members step down • businessroundups.org


by Ana Lopez

Twitter today disbanded the Trust & Safety Council, an advisory group made up of about 100 independent researchers and human rights activists. Founded in 2016, the group provided the social network with input on content and human rights-related issues, such as the removal of child sexual abuse material (CSAM), suicide prevention, and online safety. The move could have implications for Twitter’s global content moderation, as the group consisted of experts from around the world.

According to multiple reports, council members received an email Monday from Twitter saying the council is “not the best structure” for getting outside insights into company product and policy strategy. While the company said it “will continue to welcome ideas from councillors,” it offered no guarantee that those ideas would be considered. With the advisory group designed to contribute such ideas now disbanded, the message reads as “thanks, but no thanks.”

A Wall Street Journal report notes that the email was sent an hour before the council had a scheduled meeting with Twitter staff, including new head of trust and safety Ella Irwin and senior director of public policy Nick Pickles.

This development comes after three key members of the Trust & Safety Council stepped down last week. The members said in a letter that Elon Musk has ignored the group despite claiming to focus on user safety on the platform.

“The formation of the Council represented Twitter’s commitment to moving away from a US-centric approach to user safety, stronger collaboration between regions, and the importance of having highly experienced people on the safety team. The latter promise is no longer evident, given Twitter’s recent statement that it will rely more heavily on automated content moderation. Algorithmic systems can only go so far in protecting users from ever-evolving abuse and hate speech before detectable patterns have developed,” the letter reads.

After taking over Twitter, Musk said he was going to form a new content moderation board with “diverse opinions,” but there’s been no development on that front. As my colleague, Taylor Hatmaker, pointed out in her August story, the lack of a robust set of content filtering systems can lead to harm to underrepresented groups like the LGBTQ community.
