Michael H. Keller / New York Times:

To avoid erroneously flagging CSAM, Meta’s training docs tell content moderators to “err on the side of an adult” when judging people’s age in photos or videos  —  The company reports millions of photos and videos of suspected child sexual abuse each year.

