ARTIFICIAL INTELLIGENCE: "Financial Deadbeats" map is the worst thing about Chinese Fintech

In our continued amazed gawking at the Chinese fintech landscape, we bring you the following. WeChat, one of the two channels that dominate mobile chat in China, now includes a feature that shows a map of "financial deadbeats" around you. That's right -- a shaming visualization of people in financial trouble, like some sort of public sex offender registry. We link to the article below, and assume it is true despite how preposterous the whole thing seems.

Offenses that can land you on the blacklist range from the serious, like being the founder of a digital lender that collapsed with 12 million unpaid accounts, to the trivial, like being a single mother embroiled in divorce proceedings. Once you are on the list, not only are your full name and financial information public entertainment on the app, but your access to credit, commerce, and even university admission can be revoked. To add insult to injury, a special ringback tone is attached to the "discredited" person's mobile phone, alerting any potential caller to their poor financial management skills.

We add to this soup the idea of algorithmic bias exhibited by AI trained on historical data. We've covered this issue in the past, but point to Rep. Alexandria Ocasio-Cortez (D-NY) recently bringing it into mainstream conversation. From propaganda bots to algo-racism, these arcane issues are starting to concern the broader Western polity. So when you combine historical training data reflecting past social and economic biases with social media enforcement systems, dystopia calls. One of the most important financial innovations in the West was bankruptcy, allowing entrepreneurs to fail and start over. This normalization of financial wipe-out led to an equilibrium with higher risk-taking and innovation. It is chilling to see technology, with all its potential for error and misuse, deployed to stifle that spirit. Based on the US personal bankruptcy data below, about 6 out of every 1,000 people would be guilty by WeChat's standard, skewed in large part toward minority populations. No thanks.
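To make the bias mechanism concrete, here is a minimal sketch using entirely hypothetical numbers (the group labels, approval rates, and "majority vote" model are our own illustration, not anything from the articles cited): a model trained on historically skewed lending decisions simply learns to repeat the skew as policy.

```python
# Toy illustration of training-data bias with made-up numbers.
# Identical applicants, but group "A" was historically approved 80% of
# the time and group "B" only 30%. A naive model trained on that record
# turns the historical disparity into a hard rule.
from collections import Counter

history = (
    [("A", "approve")] * 80 + [("A", "deny")] * 20
    + [("B", "approve")] * 30 + [("B", "deny")] * 70
)

def train(records):
    """'Train' by predicting the majority historical outcome per group."""
    counts = {}
    for group, outcome in records:
        counts.setdefault(group, Counter())[outcome] += 1
    return {g: c.most_common(1)[0][0] for g, c in counts.items()}

model = train(history)
print(model)  # -> {'A': 'approve', 'B': 'deny'}: past bias becomes policy
```

The point of the sketch is that nothing in the pipeline is malicious; the model is faithfully reproducing the record it was given, which is exactly why feeding such systems into public enforcement tools is alarming.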


Source: Abacus News (deadbeat map), Independent (deadbeats), Vox (algo-racism), On bankruptcy normalization and bankruptcy zip codes