SOCIAL MEDIA: Facebook's Propaganda Failure is a Feature, not a Bug

The best thing we've seen on Zuckerberg and Cambridge Analytica is this piece on Slate by Will Oremus. Cambridge Analytica and data scientist Alex Kogan did pull large amounts of Facebook data out of the system and build "psychographic" profiles of users. This means that advertising could be targeted at particular belief groups, surrounding each with different messages designed to change behavior at the margin. This is mass-customized propaganda, and it had real impact on the 2016 elections.
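To make the mechanism concrete, here is a minimal sketch of segment-based message targeting. It is an illustration only: the trait scores, thresholds, segment names, and messages are all invented, not anything Cambridge Analytica or Facebook actually used.

```python
# Toy illustration of psychographic targeting: score each user on one
# hypothetical personality trait, bucket them into belief segments, and
# serve each segment a different message. All data here is invented.

def segment(openness_score: float) -> str:
    """Bucket a user by a single hypothetical trait score in [0, 1]."""
    if openness_score >= 0.66:
        return "change-seeking"
    if openness_score >= 0.33:
        return "undecided"
    return "tradition-minded"

# One tailored message per segment -- the essence of mass-customized messaging.
MESSAGES = {
    "change-seeking": "Candidate X will shake up the system.",
    "undecided": "See what Candidate X says about the issues you care about.",
    "tradition-minded": "Candidate X will protect what you value.",
}

def pick_message(openness_score: float) -> str:
    """Select the message matched to the user's inferred segment."""
    return MESSAGES[segment(openness_score)]

print(pick_message(0.9))  # serves the change-seeking message
print(pick_message(0.1))  # serves the tradition-minded message
```

The point of the sketch is that once a profile exists, tailoring is a trivial lookup; the hard (and contested) part was whether the inferred trait scores meant anything at all.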

But the real takeaways are that (1) Cambridge Analytica wasn't actually that good at its job and was largely pretending its software worked, (2) Facebook has always been in the business of monetizing user data, from Farmville to Tinder, and (3) Facebook's current third-party data-sharing policies no longer allow companies like Cambridge Analytica to grab the data to do AI-based advertising, because Facebook does the work of mass-targeting itself. There's no need for a malicious third party -- just use the native Facebook tools.

This is what happens when we put no value on human data and put it up for rent. Machines can use that data to manufacture preferences and behaviors at scale. This is not a surprise or a malfunction -- quantitative advertising technology has been a massive venture investment sector for years, seeing $3 billion in funding in 2011. Since then, GAFA has swallowed up the market. And the technology of this sector, in particular artificial intelligence for profiling customers through unstructured information, has spread everywhere, including financial services. See for example the $30 million investment into Digital Reasoning by BNP Paribas, Barclays, and Goldman Sachs, with prior investors including Square Capital, Nasdaq, and others. The product processes audio and text data, overlaid on internal communications, to prevent fraud or surface customer insights.


What symptoms like this mean in the long run is that we don't even need a Facebook data leak to be trapped in the AI bubble. Our interactions with each other are now nearly all digital, which means they can be used to impute a personality and a profile that we may never have shared. And AI hooks live everywhere -- from media, to finance, to commerce. Mass customization of our products and information is inevitable, and Facebook is not special in empowering this trend. Rather, we need a new literacy to live in an AI-first world.