Let’s say you open Facebook and see an alert. You click, and to your horror you see that someone has posted a terribly unflattering picture and tagged you. That tag is nefarious, because it means that whenever anyone searches the internet for you or anyone else tagged in that photo, or for that matter anything related to the event you were attending, that cringeworthy image will show up. It sticks to your identity and is dragged along, like a piece of toilet paper on a shoe.

The misery we inflict on others via digital shame machines, often without knowing it, accounts for only the most obvious grief. The more pervasive abuses are engineered to whir away on their own. And this automated poison is progressing at such a furious pace that science fiction from only a few years ago now reads like today’s news. Gary Shteyngart’s 2010 novel, Super Sad True Love Story, for example, describes a futuristic world in which radically open data is the norm, and the possibility of shaming lurks around every corner. Credit scores appear on a public display when characters walk past a “credit pole” in the neighborhood. The advanced cellphones, or äppäräts, can scan the net worth and financial history of each passerby. If someone in a bar tells a joke that falls flat, their “hotness” and “personality” scores plummet in real time.

Abuses very much like these are already spreading, especially in China, where state surveillance operates without even a hint of restraint. There are constellations of government-sanctioned social credit scores, some of which ding a person if a surveillance camera catches them lighting up in a no-smoking zone or playing too many video games. Others use cameras equipped with AI that can identify individuals by a combination of facial features, posture, and gait. So if, say, someone heading to work gets caught jaywalking, the smart cam can tag the offender's name and personal information and flash them across a digital billboard. You might likewise be punished for littering in the subway or for denigrating the ruling party online. Your various infractions might also be announced, by name, on Weibo or WeChat, China's internet giants.

No matter where we live, some of us fare far better than others in our relations with the expanding network linking data to shame and stigma. The easiest people to exploit tend to be the most desperate, the ones who lack the money, the knowledge, or the leisure time to tend to the digital baggage that trails them, or simply those who have traditionally been treated badly. These are folks who are disproportionately poor or otherwise marginalized and have the least control over their identities. Their lives can be defined, and poisoned, by shame machines: the diet industry, opioid merchants, for-profit prisons, welfare bureaucracies, the list goes on. Those machines punch down on them relentlessly.

But shame has a second life in the data economy. Evictions, brushes with child protective services or the law, trips to casinos—all leave rich trails of information, creating a bonanza for the many institutions that feed on shame data. These stretch far beyond the social networks to the formal economy of credit rating companies, mortgage brokers, and parole boards, as well as a vast who’s who of hucksters and scam artists. The episodes that trigger the most shame are digitized, codified, and then processed by hundreds or thousands of different algorithms to size up the people involved, make money from them, and deprive them of opportunities, often permanently.