The problem with tech, many declare, is its quantitative inclination, its “hard” math deployed in the softer human world. Tech is Mark Zuckerberg: all turning pretty girls into numbers and raving about the social wonders of the metaverse while being so awkward in every human interaction that he is instantly memed. The human world contains Zuck, but it is also everything he fails at so spectacularly. That failure, the lack of social and ethical chops, is one many believe he shares with the industry he has come to represent.

And so, because Big Tech is failing at understanding humans, we often hear that it simply needs to employ more people who do understand. Headlines like “Liberal arts majors are the future of the tech industry” and “Why computer science needs the humanities” have been a recurring feature of tech and business articles over the past few years. It has been suggested that social workers might help the tech industry curb social media’s harm to Black youth, and that librarians might help stem its proliferation of disinformation. Many anthropologists, sociologists, and philosophers—especially those with advanced degrees who are feeling the financial squeeze of academia’s favoring of STEM—are rushing to demonstrate their utility to tech behemoths whose starting salaries would make the average humanities professor blush.

I’ve been studying nontechnical workers in the tech and media industries for the past several years. Arguments to “bring in” sociocultural experts elide the truth that these roles and workers already exist in the tech industry and, in varied ways, always have. For example, many current UX researchers have advanced degrees in sociology, anthropology, and library and information sciences. And teachers and EDI (Equity, Diversity, and Inclusion) experts often occupy roles in tech HR departments.

Recently, however, the tech industry has been exploring where nontechnical expertise might counter some of the social problems associated with its products. Increasingly, tech companies look to law and philosophy professors to help them navigate the legal and moral intricacies of platform governance, to activists and critical scholars to help protect marginalized users, and to other specialists to assist with platform challenges like algorithmic oppression, disinformation, community management, user wellness, and digital activism and revolutions. These data-driven industries are trying hard to augment their technical know-how and troves of data with social, cultural, and ethical expertise, or what I often refer to as “soft” data.

But you can add all of the soft data workers you want and little will change unless the industry values that kind of data and expertise. In fact, many academics, policy wonks, and other sociocultural experts in the AI and tech ethics space are noticing a disturbing trend of tech companies seeking their expertise and then disregarding it in favor of more technical work and workers.

Such experiences make particularly clear how fraught this moment is for the burgeoning field of AI ethics, in which the tech industry may claim to incorporate nontechnical roles while actually adding ethical and sociocultural framings to job titles that are ultimately meant to be held by the “same old” technologists. More importantly, in our affection for these often underappreciated “soft” professions, we must not ignore their limitations when it comes to achieving the lofty goals set out for them.

While it is important to champion the critical work performed by these undervalued and under-resourced professions, there is no reason to believe their members are inherently better equipped to be the arbiters of what’s ethical. These individuals have very real and important social and cultural expertise, but their fields are all reckoning with their own structural dilemmas and areas of weakness.

Take anthropology, a discipline that emerged as part and parcel of the Western colonial project. Though cultural anthropology now often espouses social justice aims, there is no guarantee that an anthropologist (85% of US anthropologists are white) would orient or deploy algorithms in a less biased way than, say, a computer scientist. Perhaps the most infamous example is PredPol, the multimillion-dollar predictive policing company that Ruha Benjamin identified as part of the New Jim Code. PredPol was created by Jeff Brantingham, an anthropology professor at UCLA.

Other academic communities championed by those pushing for soft data are similarly conflicted. Sociology’s early surveillance and quantification of Black populations played a role in shaping today’s surveillance technologies, which overwhelmingly monitor Black communities. My own research area, critical internet studies, skews very white and has failed to center concerns about race and racism. Indeed, I am often one of only a handful of Black and brown researchers in attendance at our field’s conferences. There have been times when I was surrounded by more diversity in tech industry spaces than in the academic spaces from which the primary critiques of Big Tech derive.

Social workers would likely add some much-needed diversity to tech. Social work is overwhelmingly performed by women and is a pretty diverse profession: more than 22% of social workers are Black and 14% are Hispanic/Latinx. However, social workers are also implicated in state violence toward marginalized communities. For example, a social worker coauthored a controversial paper with Brantingham that extended his predictive policing work to automated gang classification.