The private messages Facebook users send to each other through Messenger aren’t so private. Facebook, already under fire for how it handles profile data, confirmed to Bloomberg on Wednesday that it scans the text and images people send to one another on Messenger to make sure they follow the company’s content rules.
And it blocks messages that don’t comply.
The company said it uses the same automated tools to scan Messenger for abuse as it does on Facebook in general.
“For example, on Messenger, when you send a photo, our automated systems scan it using photo matching technology to detect known child exploitation imagery or when you send a link, we scan it for malware or viruses,” a Facebook Messenger spokeswoman said in a statement to Bloomberg. “Facebook designed these automated tools so we can rapidly stop abusive behavior on our platform.”
The company told Bloomberg it doesn’t use data from scanned messages for advertising.
Concerns about whether Facebook was snooping on Messenger rose this week after the site’s founder, Mark Zuckerberg, alluded to it in an interview with Vox’s Ezra Klein. Zuckerberg told Klein that Facebook had stopped sensational messages about ethnic cleansing in Myanmar from being sent through Messenger.
Facebook launched Messenger as a stand-alone app in 2014. That same year, Facebook paid $19 billion for WhatsApp, a chat app similar to Messenger. Messenger topped 1 billion monthly users in 2017. WhatsApp had 1 billion daily users in 2017.
WhatsApp encrypts messages on both ends of the conversation, so the company cannot see them, according to Bloomberg.
Facebook has been under intense scrutiny since news broke that private information from about 50 million users was accessed by Cambridge Analytica, a political consulting firm with connections to President Donald Trump’s 2016 campaign. Zuckerberg agreed to testify before Congress next week.
The company also announced Wednesday a slew of changes to…