You might say I faked it. You might say I paid attention to the macro rather than the micro. I prefer to be generous to myself.

I managed to convince many scholars, deans, reporters, and even the editors of this esteemed publication (to which I have subscribed since about 1996) that I am, in fact, an expert. I have written or edited six books related to the effects of technology on democracy and culture, including one devoted to the consequences of our collective dependence on Google and another to the uses and dangers of Facebook.

Am I really an expert on Google and Facebook? Or, more appropriately, who is an expert on these companies? Is anyone?

I have some nominees. There are journalists like Steven Levy or Kara Swisher, who have been covering the personalities and policies of these companies for decades. But do they understand the code, the server farms, the global networks of undersea cables? Can they discuss the fragile treaties and legal settlements that have let these companies transfer sensitive user data from Europe to North America and back?

There are former friends of Mark Zuckerberg, like the investor and writer Roger McNamee or the investor and writer Chris Hughes. But do they know how to code? Do they grasp the ways in which societies and cultures reshape themselves around mobile devices and flows of data?

The best candidates are scholars like danah boyd of Data & Society, Zeynep Tufekci of the University of North Carolina, and Ian Bogost of Georgia Tech. They all have deep backgrounds in coding and working for technology companies, and have deployed academic expertise and writing skills to influence public understanding of these industries.

There are former employees of these companies, like Antonio Garcia Martinez, who helped build Facebook’s advertising systems after founding a couple of Silicon Valley startups. Tristan Harris used to work on Google’s email services before quitting to criticize the company for building all its systems to maximize user engagement and leverage attention for revenue. They both understand the mechanisms of the portions of the companies in which they worked. But did they ever get to see how the whole system works? And what qualifies them to comment on the big picture?

Does anyone, even Mark Zuckerberg and Sundar Pichai, really understand these massive, complex, global information systems with their acres of infrastructure, billions in revenue, and billions of users almost as diverse as humanity itself?

I think not. That’s the thing about complex systems: almost no one understands any of them. As technology writer Samuel Arbesman argues in his important book, Overcomplicated: Technology at the Limits of Comprehension, the messiness of complex systems, in which teams of people each understand one aspect yet no one grasps the whole, has invited such calamities as the May 2010 “flash crash” of global financial markets. A complex system like a computer-driven securities market has multiple points of failure: a tangle of computer code, human actions, laws and regulations, and massive amounts of financial data that no one fully comprehends. Many people have theories about what went wrong that day. No one knows for sure, or how to avoid another such collapse.

Consider Google. It’s a 22-year-old company that started out complex. It was a collection of servers and some brilliant code that crawled the growing web, making copies of every new page and indexing the terms (and later images) to rank them based on a dynamic assessment of “relevance” to users typing terms into a box. Only later did the company add advertising auctions, productivity applications, maps, self-driving cars, books, mobile operating systems, videos, Wi-Fi routers, home surveillance devices, thermostats, and who-knows-what-next to its collection of services that somehow promise to work in concert. I would love to meet the person at Google who understands Google, or, even better, a person at Alphabet who truly understands Alphabet. That would be a busy, and brilliant, person.

So as we look at the myriad ways Google and Facebook have let us down and led us astray, let’s remember that no one has the manual. No one fully understands these systems, not even the people who designed them at their birth. The once impressive, now basic, algorithms that made Google and Facebook distinct and useful have long been eclipsed by ever more sophisticated and opaque machine learning models trained on vast data sets. They are not just black boxes to regulators, journalists, and scholars. They are black boxes to the very engineers who work there.

As Arbesman writes of other complex systems, “While many of us continue to convince ourselves that experts can save us from this massive complexity—that they have the understanding that we lack—that moment has passed.”

So the next time Congress calls technology company leaders up to testify, we should remember that no one really understands these behemoths. They sure do understand us.