Do We Need More Software?
The industrial revolution wasn't a mistake, but the de-industrial one might be
What is Tech?
In recent weeks, I’ve read tons of news concerning layoffs across the tech industry. These have occurred at both brand names like Stitch Fix and Netflix and smaller firms that few people have heard of. While reading about the layoffs, two questions came to mind. First, what the heck are so many people doing at these companies? I understand why Tesla employs tons of people. They build cars. On the other hand, I can’t imagine what thousands of tech workers do at Netflix. I understand that it takes tons of people to film TV shows, but how many people does it take to list these shows on a webpage? How many Meta employees does it take to recommend posts from the same six friends ad nauseam? Then again, at least I know what products Meta and Netflix produce. Pick a couple of random firms on this layoff list, and tell me if you can explain what they do. Secondly, and more importantly, are these firms actually “tech companies?” Websites don’t exactly represent cutting-edge technology. If the things we call “tech companies” aren’t really that, we might overestimate our level of technological progress.
In other words, I think we’ve seen some untechnical firms slip under the “tech company” umbrella. Take the aforementioned Stitch Fix, which consists of a website that recommends and sells clothes. That’s cool, I guess, but why do we consider that a tech company? It doesn’t automate the production of clothes or produce clothes with any enhanced features. It just provides a (presumably) more efficient way of marketing items to customers. Sure, Stitch Fix employs technology in the sense that it includes software and data analytics, but that also applies to Kohl’s. I even question the tech-ness of some heavy hitters like Uber and Amazon. We never referred to taxi firms as tech companies, so why do so with Uber? Just because it has an app? Similarly, Amazon has built a remarkable logistics network, but so did Walmart, and I’ve never heard anyone refer to them as a tech company.
Some commentators take it all the way and declare that every company is a tech company. When I hear a statement like that, it’s easy to think “if every company is a tech company, then no company is a tech company.” However, that’s not quite right. Rather, if every company is a tech company, then the term “tech company” no longer does the thing it’s supposed to do. We use the term “tech company” to differentiate such firms from non-tech ones. Imagine that a trove of new biological research proved that, genetically speaking, all animals were mammals. It wouldn’t be true that “if all animals are mammals, no animals are mammals.” It would be true, however, that the term “mammal” no longer served its original purpose. We’d need another word to refer to furry animals with breasts. I think something similar has happened with the concept of tech companies. Everything is tech now, so we lack the vocabulary to differentiate meaningful technological innovation from stuff on a webpage. Basically, we need the technological equivalent of “furry company with breasts,” and I don’t see that term catching on.
I don’t point this out because I think language matters that much. Maybe, in twenty years, the term “tech company” will refer solely to unlicensed hotdog stands in New York City. That’s fine; words always evolve. Rather, I worry that our vocabulary masks a serious problem: our economy puts a lot of resources into technology of dubious value.
Things on Screens
To pick a random example, I found Tray.io on the aforementioned layoff list. The platform claims to “[i]ntegrate Salesforce directly to the rest of your tech stack to sync mission-critical data to any other application.” Maybe this platform solves a critical problem, but I suspect we don’t need to add Salesforce integration to Maslow’s hierarchy of needs. To be clear, I’m not trying to ridicule anyone involved. I imagine the guys and gals who build and maintain this platform are smart, conscientious, and creative people. In other words, they’re likely the exact sort of people who should be working on the world’s most pressing problems and the exact sort of people who aren’t.
I feel the same way towards my discipline: data science. I’ve seen many brilliant people find their way into the field. Unfortunately, as I’ve written about before, a lot of this work goes nowhere. Executives often dismiss analytics when it contradicts their intuitions, and many companies don’t know how to derive value from their analytics departments. Even when companies make proper use of their data, they’re often doing so to trick people into seeing more ads or putting more items in their cart.
In Bullshit Jobs, author David Graeber attributed much of the bullshit job phenomenon to IT. He noted that workers spend a lot of time converting information to computer-readable formats. For example, a human can read a handwritten table. A computer, meanwhile, needs that information converted to a small number of specified formats. I’ve noticed this a lot at fast food restaurants. I order something, and the employee struggles to place the order into the computer. There’s no human-to-human misunderstanding here; the person at the counter knows what I asked for. Instead, we spend a lot of time telling a computer something that the surrounding humans already understand. When I worked for a hotel firm, I remember the small armies of IT and analytics people it took to produce a simple daily revenue report. I wondered if the whole thing ran more efficiently when guys kept paper spreadsheets in cigar-smoke-filled rooms.
In other words, I’m applying the same criticism to “tech” that we’ve long heard about finance. Brilliant people spend their prime years building complicated financial assets of questionable moral and social value. I once listened to a podcast where an author claimed that only 15% of the financial industry produces anything real. I wouldn’t bet my life on that figure, but it’s hard to imagine we’d live in a poorer society if fewer Ivy League graduates went to Goldman Sachs.
The Material World
It’s difficult to see what we’ve achieved through the software-ification of our economy. Life expectancy plateaued, even before the pandemic, and we haven’t seen any long-term dip in poverty. While one would expect a drastic rise in worker productivity during a technology boom, that rise does not seem to have occurred.
However, I should mention a counterargument that I’ve heard regarding productivity growth. Some argue that productivity statistics underestimate reality since consumers can now access a lot of free materials that don’t count towards GDP. We can all use Google Maps, Wikipedia, and Substack without paying a cent. This idea, strangely, mirrors the classic feminist idea that traditional economic metrics don’t account for women’s work. As the great Paul Samuelson once quipped, GDP falls when a man marries his maid.
This critique makes some sense, and I think we can all agree that no economic calculation can ever represent the whole of human output. Still, I take some issue with the idea. First, I don’t imagine our productivity calculations accounted for the full value of previous technologies either. Household appliances and frozen dinners, for instance, saved people a lot of time and hassle that doesn’t manifest in the productivity data. Secondly, if we’re going to acknowledge that GDP doesn’t reflect the benefits of our digital world, we should also acknowledge that it doesn’t reflect the drawbacks. I’m not going to attribute every human problem to social media, but it’s played some role in rising mental health issues, shortened attention spans, political polarization, and various other societal ills. If we think we should add the convenience of Google Maps into our output data, then we should probably subtract my inability to make it more than ten consecutive seconds without checking my phone.
Furthermore, software can’t solve many material problems. Our economy still depends on oil, and apps can’t create energy. Likewise, I’ve recently become fascinated by organisms that can digest plastic. Scientists are attempting to extract the enzymes from these organisms in order to clean up landfills. This requires physical enzymes breaking up physical pieces of plastic. You can’t do that through an API. I’m excited by the possibility of self-driving cars, but they still can’t replace mass transit. Trains take up much less space per person than cars, and they allow their passengers a greater freedom of movement. Self-driving AIs will save lives, but they can’t create additional space.
I also worry about pseudo-automation. In “true” automation, the technology wholly replaces the task that a human used to complete. For instance, my washing machine replaces the manual labor needed to clean my clothing. On the other hand, pseudo-automation occurs when the new technology doesn’t quite replace the human version. Traffic lights offer one example. In ye olden days, a man would stand at the intersection and direct traffic. Today, a machine fills that role. However, we’ve probably all seen instances where an empty road gets a green light while a bunch of drivers stare at a red. A physical person would notice this and direct the traffic as appropriate. The same holds for automated customer service, such as hotel kiosks and call center bots. These replace a human in a literal sense, but they’re much harder to communicate with than their ape-like predecessors. I could see a lot more of this as programmers automate more tasks. I’m worried about a future where computers do a lot more things, and they do all of them poorly.
I also don’t value the alleged openness of software. I understand the potential of a project that anyone can participate in, but I think the concept suffers from some practical limitations. On multiple occasions, I’ve struggled and failed to install a piece of open source software. If a regular company sold this, I’d contact their (hopefully human) support, and that firm’s customer service department would have a monetary incentive to ensure that their product works. With open source products, I’m often directed to years-old Stack Overflow threads, poorly formatted message boards, and YouTube videos with 15 views. These rarely solve the issue. Furthermore, the notion of open source only applies to the small handful of people who know how to code. Most of us couldn’t contribute to an open source product any more than we could augment Microsoft Excel.
Before moving on, I want to thank subscriber Ying B for pointing me toward an interesting writeup about China’s policy on the same issue. I don’t have much to add except for “hey, this article agrees with me.” Here’s a good quote:
It’s become apparent in the last few months that the Chinese leadership has moved towards the view that hard tech is more valuable than products that take us more deeply into the digital world. Xi declared this year that while digitization is important, “we must recognize the fundamental importance of the real economy… and never deindustrialize.” This expression preceded the passage of securities and antitrust regulations, thus also pummeling finance, which along with tech make up the most glamorous sectors today. The optimistic scenario is that these actions compress the wage and status premia of the internet and finance sectors, such that we’ll see fewer CVs that read: “BS Microelectronics, Peking; software engineer, Airbnb” or “PhD Applied Mathematics, Princeton; VP, Citibank.”
Why Is This Happening?
A left wing outlet would pin the software glut on capitalism and neoliberalism, and I think they would have a point. Many of the technologies that I think we should be working on instead (mass transit, improved energy tech, plastic decomposition) provide a degree of public benefit that corporations don’t profit from. As such, we’d expect the market to underinvest in these technologies. I can see corporate short-termism encouraging “quick wins” over expensive research endeavors. Software also promises capital the possibility of high rewards with relatively little investment. A developer can build an app and watch millions of people download it without incurring large fixed or marginal costs. The physical world doesn’t offer this benefit. Finally, the quoted article makes a great observation about network effects. People use Facebook or Board Game Geek, mainly, because everyone else uses it. Hence, there’s a lot of profit associated with becoming the first company to offer something in a digital space, so firms will spend a lot of resources to gain that first-mover advantage.
Still, I don’t think that tells the whole story. For one, it’s hard to attribute this trend to the profit motive when many of these companies don’t make any profits. I also have to imagine that government regulations play some role. Entrepreneurs often complain about excessive and expensive regulatory hurdles associated with creating a new business. Given the regulatory costs of opening a simple coffee shop, I can only imagine what those costs look like for constructing a nuclear power plant or releasing a novel enzyme into landfills. In such a regime, I could see business moving to the less-regulated cyberspace. In fact, companies like Uber seem to exist mostly as a circumvention of regulations. Cities limited their taxi supply via medallions, so programmers figured out how to bypass these limitations. I’ve also heard people argue that IP laws apply too broadly in software, granting firms a monopoly that would be hard to obtain in the physical world. One example is Amazon’s 1-Click checkout, which doesn’t feel like the sort of innovation that patents exist to protect.
I hope the previous paragraphs have earned me some credibility, because I’m gonna risk some of it on one last hypothesis. Maybe the lack of a clear, existential threat has eliminated a valuable incentive structure for technological development. The first World War bestowed stainless steel and wristwatches while improving our aviation capabilities. Its successor gave us penicillin and nuclear energy. Even the comparatively boring Cold War put a man on the moon. Though today’s major powers still compete through the international market and proxy wars, these don’t provide the same sense of urgency. It’s probably true, for example, that Western reliance on oil is funding Russia’s invasion of Ukraine. Yet, let’s be honest, I don’t think the average American or German loses much sleep over Mariupol. Meanwhile, if anyone feared a reconquest of Alaska or a Second Battle of Berlin, I’d imagine that the US and Germany could figure out a different way to power their economies. If the US needed to quickly move materials and people across the country, I think we’d suddenly find a way to build otherwise infeasible infrastructure projects. Sure, maybe we don’t need a giant war. The COVID-19 pandemic spurred the creation of mRNA vaccines, which will probably remain one of the great inventions of my lifetime. Still, I think there’s some level of achievement that people won’t reach when they’re trying to grow Q3 revenues.
Gross Domestic Product. Basically, the monetary value of everything produced within a country.
This might seem a bit too commonplace to footnote, but experience has taught me that many smart people remain unfamiliar with economics jargon.
If you pay a maid for her services, that payment is calculated in GDP as consumer spending. If a wife cleans the house, there’s no payment and therefore no contribution to GDP.
The argument would be that these savings do show up, as the saved time allowed people to consume or produce something else. But doesn’t that apply to software as well? Shouldn’t the value of Wikipedia “show up” in GDP?