Our mission at Discord is to give people the power to create belonging in their lives. We recognize that safety enables people to find belonging, and that's why safety is one of our most important investments and priorities.

Safety is a collaborative and cross-functional effort at Discord. Our Engineering, Data, and Product teams build products with safety principles in mind. Our Policy team takes a nuanced and sophisticated approach to developing our Community Guidelines and forms strategic partnerships with academics, civil society, industry peers, and community moderators to advance our collective understanding of online safety. Our Safety team works with cutting-edge technology to detect and respond to abuse, both proactively and from reports received from users, moderators, and trusted third-party reporters.

Safety is a vital priority for our company. Around 15% of all Discord employees are dedicated to this area, and every employee shares in the commitment to keeping Discord safe.

These Transparency Reports provide insight into our continued investment in keeping Discord a safe place for people to find belonging. This Transparency Report, our ninth since 2019, covers the third quarter of 2022, from July to September.

Discord publishes and maintains a comprehensive set of Community Guidelines that explains what content and behavior is and isn't allowed on Discord. We invest heavily in our proactive efforts to detect and remove abuse before it's reported to us. Through advanced tooling, machine learning, specialized teams, and partnerships with external experts, we work to remove high-harm abuse before it is viewed or experienced by others.

We encourage users, moderators, and trusted reporters to submit reports if they believe an account or server is violating our Community Guidelines. We analyze these reports to determine whether the content or behavior violates our Guidelines. We may take a number of enforcement actions, including but not limited to: issuing warnings; removing content; temporarily or permanently disabling or removing the accounts and/or servers responsible; and potentially reporting them to law enforcement. This report details the actions that Discord has taken on accounts and servers that have violated our Community Guidelines and Terms of Service.

A text version of the number of accounts disabled by category chart is available here.

We disabled 185,756 accounts between July and September 2022 for policy violations not including spam, a 74.5% decrease compared to the 726,759 accounts disabled in the previous quarter. The Child Safety category accounted for 90% of this decrease in accounts disabled. This decline is a direct result of our ongoing improvements in our efforts to detect and proactively remove child-exploitative content from Discord. Our ability to more rapidly detect networks of bad actors limits the number of accounts engaging with the content, resulting in fewer accounts disabled overall.