Facebook Found Lax in Enforcement of Own Privacy Rules. Surprised?

March 4, 2021

Facebook is refining its filtering AI for app data after investigators at New York’s Department of Financial Services found the company was receiving sensitive information it should not have. The Jakarta Post reports, “Facebook Blocks Medical Data Shared by Apps.” Facebook regularly accepts app-user information and feeds it to an analysis tool that helps developers improve their apps. It never really wanted responsibility for safeguarding medical and other sensitive data, but it did little to block such data until now. The write-up quotes state financial services superintendent Linda Lacewell:

“Facebook instructed app developers and websites not to share medical, financial, and other sensitive personal consumer data but took no steps to police this rule. By continuing to do business with app developers that broke the rule, Facebook put itself in a position to profit from sensitive data that it was never supposed to receive in the first place.”

Facebook is now stepping up its efforts to block sensitive information from reaching its databases. We learn:

“Facebook created a list of terms blocked by its systems and has been refining artificial intelligence to more adaptively filter sensitive data not welcomed in the analytics tool, according to the report. The block list contains more than 70,000 terms, including diseases, bodily functions, medical conditions, and real-world locations such as mental health centers, the report said.”
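The report describes a straightforward mechanism: a large blocklist of sensitive terms checked against incoming analytics data. A minimal sketch of that kind of term filter might look like the following. Note this is purely illustrative; the term list, field names, and functions here are hypothetical, and Facebook's actual system is not public.

```python
# Illustrative sketch of a term-based blocklist filter, as described in the
# report. All names and terms below are hypothetical stand-ins; this is not
# Facebook's actual implementation.
import re

# A real blocklist would hold 70,000+ terms (diseases, medical conditions,
# sensitive locations); three stand-ins suffice for illustration.
BLOCKLIST = {"diabetes", "pregnancy", "mental health center"}

def contains_blocked_term(event: dict, blocklist: set = BLOCKLIST) -> bool:
    """Return True if any field of an analytics event matches a blocked term."""
    text = " ".join(str(value).lower() for value in event.values())
    return any(
        re.search(r"\b" + re.escape(term) + r"\b", text)
        for term in blocklist
    )

def filter_events(events: list) -> list:
    """Drop events containing sensitive terms before they reach analytics."""
    return [event for event in events if not contains_blocked_term(event)]
```

The article notes Facebook is also refining AI to filter "more adaptively," presumably because exact-match lists like this miss misspellings and paraphrases.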

A spokesperson says the company is also “doing more to educate advertisers on how to set up and use our business tools.” We shall see whether these efforts will be enough to satisfy investigators next time around.

Cynthia Murrell, March 4, 2021
