Here are some notes toward an essay that I would like to structure more, but the more I type, the more I find, so I am trying to bring it to a point where I can call it finished and expand on these ideas in other pieces.
My music recommendation for today is NOW by RENT STRIKE
In her rant about technology, Ursula K. Le Guin describes technology as the “active human interface with the world”. She reminds us that technology’s prime function is to interact with our surroundings, and that this privilege is not restricted to what she calls “hi tech”:
“Technology” and “hi tech” are not synonymous, and a technology that isn’t “hi,” isn’t necessarily “low” in any meaningful sense.
The defining factor is that they have material effects: effects that are goal-oriented and would have been more difficult to achieve otherwise. From the broom sweeping your floor to the computer piping music or this essay into your home, what you are doing is trying to change something about the world (clean floors, a room filled with music). There are many ways to achieve the same results with different tools: you could vacuum, for instance, or use a CD, a record player, or any other device to listen to your tunes. Technology is always a means to achieving a goal.
I want to extend this claim with a few definitions in order to allow us to think about technology in more diverse terms, especially when faced with the ongoing fascist takeover of big tech. How do these companies leverage interfacing technologies for profit?
Abstractions and Extensions
We have already heard the thesis that most big tech companies deal in interfacing technologies. But if Le Guin’s claim holds true, then ALL technology interfaces with the world. I propose that they do so in different ways: I think we can divide most technologies into two rough categories, extensions and abstractions.
Extending technologies are technologies that allow us to do more things and directly interface with the world. Brooms, backpacks, heating, the hardware of modern computers, all these are extensions of our abilities to interact with the material world.
Abstractions on the other hand help us to deal more efficiently with existing technology or with data. An abstraction might be a computer program or even language. They are layers that rest over the world in order to help us interface with it more efficiently without having to know how exactly everything functions. The map, not the territory.
I don’t want to open up an opposition here. Both types of technology are essential and important. In fact, a technology is rarely only one of the two; most are interconnected systems of extending and abstracting technologies that in the end allow us to make use of them more intuitively. (It is nearly impossible for most people to use the extending technology of the computer without the abstraction of a programming language > a dedicated piece of software > a GUI.)
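To make that stack concrete, here is a small Python sketch (file names and text are arbitrary) of the same material goal achieved through three stacked abstraction layers, each hiding more of the machinery beneath it:

```python
import os
import tempfile
from pathlib import Path

tmp = Path(tempfile.mkdtemp())

# Layer 1: a thin wrapper over the kernel's file-descriptor interface.
fd = os.open(tmp / "low.txt", os.O_WRONLY | os.O_CREAT, 0o644)
os.write(fd, b"clean floors\n")
os.close(fd)

# Layer 2: Python's buffered file object handles buffering and encoding for us.
with open(tmp / "mid.txt", "w") as f:
    f.write("clean floors\n")

# Layer 3: pathlib collapses open/write/close into a single call.
(tmp / "high.txt").write_text("clean floors\n")

# The material effect is identical at every layer.
assert (tmp / "low.txt").read_text() == (tmp / "high.txt").read_text()
```

Each layer is strictly more convenient and strictly more opaque: by the third, you no longer see that a file descriptor is involved at all.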
The Big Tech Abstractionists
So how do the big tech companies play into this? How do they continuously reinvent the wheel? They present themselves as the big problem solvers for the issues of the modern day, conveniently glossing over the fact that they are frequently the ones who created the problem in the first place. In his book To Save Everything, Click Here1 Evgeny Morozov introduces the concept of Techno-Solutionism: the idea that all problems can be solved by technology, where the “problems” are frequently created in order to justify the existence of the technology. This premise of problem solving and optimization leads us directly into the “Internet of Things”, where toasters that use fancy algorithms to make your toast badly are only one of the ridiculous things you encounter.
The IoT and its mission creep of inter-connectivity and data harvesting are one thing. Take the toaster example: with the added abstraction layers of a touch-screen interface and proprietary software, the user relinquishes full control of the device to the manufacturer’s imposed limits. But the model of Techno-Solutionism creeps even further, with the abstractions big tech layers over every other aspect of our lives.
The strategy in most of these cases is to produce ‘services’. These services aim to streamline or simplify existing processes you didn’t know you had to streamline and fix problems you didn’t know you had to fix. They are the middle-men to the world.
Whether this is in the way you listen to music, the way you order food, track your habits and physical bodily functions, or the way you meet new people, big tech companies are constantly maneuvering themselves into positions where they provide additional interfacing layers between us and the world, attempting to make themselves seem ubiquitous and necessary. Of course these services are essential! Without Spotify, Netflix, Apple Fitness, how would you get your music, your entertainment, know how to manage your body?
The universal presence of these services attempts to make them essential, or else threatens to lock the user out of the fabric of the (online) world.
This platform lock-in also allows companies to more brazenly take advantage of data harvesting and brokering, leading to more aggressive ad campaigning, because that is truly where the money lies for these companies. Their services exist as an interfacing layer to how we interact with the world, and they siphon off that data in order to sell it to data brokers that will in turn try to sell us bullshit.
So how can we think about these things in a way that can release us from this polyester pipe dream? We can comprehend the underlying motivations of these companies by applying a critique of capitalism, but that alone will not get us to understand the exit routes open to us right now. So how can we rethink the way we approach technological problem solving in our everyday lives? Let’s try an approach!
Abstracting Life
Big tech deals in abstraction layers designed to streamline the rough lines of the every-day, not to extend where those lines may flow. I believe we can understand the big tech grift as an effort to increase the level of abstraction from the material world. This serves multiple purposes.
For one thing, added layers of abstraction make it easier for companies to scrape user data. Each abstraction layer requires a different way of interaction or input. Every input is a data point.
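As a hypothetical sketch of how this works, here is a toy “service” in Python that records every user action as a structured event before doing what the user actually asked. All the names here (`tracked`, `play_song`, `event_log`) are invented for illustration; no real platform’s code is implied:

```python
import time

event_log = []  # in a real service this would ship to an analytics backend

def tracked(action_name):
    """Decorator that records every invocation as a data point."""
    def wrap(fn):
        def inner(user, *args, **kwargs):
            event_log.append({
                "user": user,
                "action": action_name,
                "args": args,
                "ts": time.time(),
            })
            return fn(user, *args, **kwargs)
        return inner
    return wrap

@tracked("play_song")
def play_song(user, title):
    # The thing the user wanted; the logging above is pure surplus.
    return f"now playing {title} for {user}"

play_song("alice", "NOW")
play_song("alice", "NOW")

# Two plays of the same song are already a tiny behavioral profile.
assert len(event_log) == 2
```

The user asked for music; the abstraction layer quietly produced two timestamped records of taste and habit as a side effect.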
Apart from the data gathering, these abstraction layers are also used to directly make more money. It’s easy to slap an abstracting technological interface on an extending technology in order to control it. In fact, that is exactly what DRM is. It’s an abstracting interface that prevents you from interfacing with your technology directly, without input from tech companies. Just check out the antics HP, John Deere, and Microsoft are pulling.
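As a toy illustration of this pattern (not any vendor’s actual scheme), here is a Python sketch of a printer whose firmware refuses any cartridge that doesn’t carry the vendor’s cryptographic tag, even though nothing about putting ink on paper requires the check:

```python
import hashlib
import hmac

VENDOR_KEY = b"not-a-real-key"  # hypothetical; real schemes bury this in silicon

def vendor_sign(serial: str) -> str:
    """The tag an 'official' cartridge ships with."""
    return hmac.new(VENDOR_KEY, serial.encode(), hashlib.sha256).hexdigest()

class Printer:
    def insert_cartridge(self, serial: str, signature: str) -> bool:
        # The DRM check: an abstracting interface whose only job is to gate
        # the extending technology underneath.
        return hmac.compare_digest(signature, vendor_sign(serial))

p = Printer()
assert p.insert_cartridge("C-1234", vendor_sign("C-1234"))  # official: accepted
assert not p.insert_cartridge("third-party", "0" * 64)      # anything else: refused
```

The hardware could use any ink; the abstraction layer is what converts that capability into a revenue stream.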
The more our everyday activity is abstracted by big tech interfacing layers, the more our daily lives become quantifiable as data.
The effects of this have become common knowledge since the Facebook-Cambridge Analytica data scandal. It results in hyper-targeted advertising and the erosion of any kind of privacy. The further you abstract, and the more you get people to leave a trail of digital data in their wake while trying to use your service, the easier it is to quantify that person’s behavior. This in turn makes it easier to infer their interests and display hyper-targeted advertising. If you want a direct example of how users’ emotional states and desires are recorded and quantified, just take a look at Spotify.
This information is well known at this point, and while I think it is important not to forget about it, I want to focus more closely on the other effect of mass-abstraction technologies: The alienation of people from their labor and their lives.
These mass-abstraction machines make our methods of interfacing with the world so abstract that we can no longer do anything but rely on the borders of interaction that the companies making them draw for us. Design dictates interfacing. If a design only allows a small field of interactions, and that design is so ubiquitous in our lives that we don’t even think about it, our scope of interaction begins to limit itself to the tools we know.
An acquaintance of mine who is very gadget-savvy showed me how seamlessly her device ecosystem functioned after making the switch to Apple. All she had to do was place her iPad next to her MacBook and the laptop would recognize the iPad as an external display. “Isn’t that cool? It just works!”
In their graphic and essay Anatomy of an AI System Kate Crawford and Vladan Joler attempt to map the vast network of interconnected systems of labor and exploitation that culminate in the form of an Amazon Echo. Their work tracks the device from lithium mines in Bolivia to the mega-datacenters that churn through every query posed to its sleek, unnoticeable user interface.
The point they draw out in their graphic is this: none of the interactions with these technologies (AI, the Cloud, etc.) is ever as instantaneous or magical as it seems. They rely on and obfuscate thousands of avenues of exploitation, from resource extraction to the exploitation of labor. And they do all this to service the hyper-individualized needs of the user.
This isn’t an invention of the 21st century. This is simply Marx’s theory of alienation and the commodity fetish extended to even the “simplest” actions in our lives. It is the service fetishized.
Summed up by tumblr user beadyeyes:
Where’s it made? Who brought it here? How much were they paid? Who makes it? Is it made in separate parts and put together? How much were they all paid to do this? Where do they get the materials? Who paid for that? Who brings it there? How much were they paid? Who streamlined the base materials? How much were they paid? Who gathered the base materials? Where? How much were they paid? Is it good for them? Is it good for us? Is it good for the land? Is it necessary? Is it biodegradable? How much does it hurt? Do I need it? Do I even want it?
All of these aspects of production and daily use are completely obfuscated by abstraction layers. It just works.
Design as Social Function
This sleekness is one of the primary end forms of the techno-capitalism we inhabit today. Its goal is to achieve the total smoothness of everyday life, driven by its abstractions. Life is relegated to a subsystem of the global project of profit, cordoned off from the unimaginable human and ecological cost that enables it.
The design philosophy of Big Tech is a driver of totalitarianism.
I’m not trying to make the case that this is done deliberately, in a top-down sort of way. The claim is rather that the way this system functions brings with it a close relationship to the objects and tools we use in everyday life, and that these objects and tools in turn shape how we can imagine interacting with the world. Our skills and our knowledge of the world give each of us what I like to call an action horizon: the options for action open to us in the world. Anything beyond our action horizon remains invisible to us, and therefore we do not even consider doing it.
What this type of smooth design does is restrict our action horizon to the actions the tools big tech provides allow. And the more we rely on those technologies, these abstractions, to manage our daily lives, the more our action horizon shrinks to operate exclusively within those fences.
Let’s consider Theodor Adorno’s aphorism 19 from Minima Moralia2 in order to think through the limits of our action horizons:
Technology is making gestures precise and brutal, and with them men. It expels from movements all hesitation, deliberation, civility. It subjects them to the implacable, as it were ahistorical demands of objects. Thus the ability is lost, for example, to close a door quietly and discreetly, yet firmly. Those of cars and refrigerators have to be slammed, others have the tendency to snap shut by themselves, imposing on those entering the bad manners of not looking behind them, not shielding the interior of the house which receives them. The new human type cannot be properly understood without awareness of what he is continuously exposed to from the world of things about him, even in his most secret innervations. What does it mean for the subject that there are no more casement windows to open, but only sliding frames to shove, no gentle latches but turnable handles, no forecourt, no doorstep before the street, no wall around the garden? And which driver is not tempted, merely by the power of his engine, to wipe out the vermin of the street, pedestrians, children and cyclists? The movements machines demand of their users already have the violent, hard-hitting, unresting jerkiness of Fascist maltreatment. Not least to blame for the withering of experience is the fact that things, under the law of pure functionality, assume a form that limits contact with them to mere operation, and tolerates no surplus, either in freedom of conduct or in autonomy of things, which would survive as the core of experience, because it is not consumed by the moment of action.
If industrial capitalism demands “the violent, hard-hitting, unresting jerkiness of Fascist maltreatment”, what are the motions that techno-capitalism imbues upon its subjects? It functions through command. Whether this is a request posed to an Alexa or a prompt to an LLM, the primary interaction method is one of commanding that an action be taken, and expecting it to be executed. But these commands only appear simple. As the Anatomy of an AI System essay relays, each command is actually an activation of thousands of moving parts that cater to the user’s needs. We are all commanders in our individualized lives. Hyper-individualized micro-CEOs of the self.
Add to this the way that technology is made not to be opened, modified, fixed, or altered, but replaced as soon as it has outgrown its use, and we can begin to see how these daily interactions and assumptions might work on us. Customer service employees become daemons to be commanded to fulfill a task, just as we would command Siri to order an Uber.
The idea is to remove all lines of flight from the outset. And if they can’t be removed, to anticipate which escapes are possible by curating and anticipating the tools that will be used to make the escape, so as to be part of the next world.
The complete smoothing of everyday life and its processes removes any cracks and crevices that might be used to pry open the layers of abstraction, making it difficult to think of any future where these abstractions might not exist.
Recognizing and Reducing Abstractions
So what’s the way out? In the case of big tech, the path out may seem difficult, since many of these services actually do do the things we want them to do. They are additionally packed with many added interaction layers: likes, comments, shares, remixes, tips, subscriptions, recommended pages, etc. These interaction layers add up to become platforms. Platforms work by keeping their users locked in, implicitly threatening them with the loss of all the social functions they offer. Cory Doctorow has written much more on the concepts of enshittification and platform decay, which I won’t repeat here because he does a much better job of it.
But if we learn to see the world and the tools we use through these concepts of extension and abstraction, it allows us to cut them down to more user-friendly solutions. We can learn to look past the useless abstractions and to see which abstractive layers are truly necessary. For example: I am writing this blog post as a plaintext markdown file. There is still an abstraction between me and the device that is displaying the content of the file, but it is an abstraction not mediated by further abstraction layers in the way a more complex text document might be encoded.
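A small Python sketch can make that difference tangible: the same sentence stored as plaintext markdown is readable byte-for-byte, while a .docx-style container (zipped XML) needs an extra unpacking layer before the text is recoverable. File names here are arbitrary:

```python
import tempfile
import zipfile
from pathlib import Path

tmp = Path(tempfile.mkdtemp())
sentence = "Technology is always a means to achieving a goal."

# Plaintext markdown: the bytes on disk are the text itself.
md = tmp / "post.md"
md.write_text(f"# Notes\n\n{sentence}\n")
assert sentence in md.read_text()

# A .docx-style container: the text lives in XML inside a zip archive.
docx = tmp / "post.docx"
with zipfile.ZipFile(docx, "w") as z:
    z.writestr("word/document.xml", f"<w:t>{sentence}</w:t>")

# The raw bytes begin with the zip magic number, not readable prose;
# recovering the sentence requires the extra unpacking abstraction.
assert docx.read_bytes()[:2] == b"PK"
with zipfile.ZipFile(docx) as z:
    assert sentence in z.read("word/document.xml").decode()
```

The markdown file will outlive any particular editor; the container format quietly makes every future reading of the text depend on software that understands the wrapper.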
The key here is to see what material change you want to achieve in the world, then to consider yourself the agent of that change, while bearing in mind which abstractions might be mobilized to aid you. If we consider again the Alexa example, with our new knowledge of the vast networks mobilized to our aid that we have no control over, we may also consider other options for achieving the same goal. It is a certain mindfulness about the way we are tied into systems of production. We can never fully extricate ourselves from this web, but we can consider our position in it and work to reduce the abstractions we allow into our lives, to be more closely in tune with the reality of the world. This obviously isn’t an individualized effort. It must be a collective and organized struggle: since the active pathways that enable these changes are communal, so must our efforts to change them be. The micro-CEO must be dispossessed.