Friday, January 12, 2024

Which one is autonomous?

Next up on the bookshelf in the philosophy of technology project is Langdon Winner's Autonomous Technology: Technics-out-of-Control as a Theme in Political Thought.  Since Winner is a professor of political science, the tone of this book differs markedly from the frankly speculative efforts of Simondon and Arthur.  While it's both accessible and fairly well written, it still suffers from the long-windedness of someone who constantly feels the need to justify their academic existence.  So, for example, we only come to understand the overall design of the book at the very end.

We began our inquiry with the simple recognition that ideas and images of technology-out-of-control have been a persistent obsession in modern thought. Rather than dismiss this notion out of hand, I asked the reader to think through some ways in which the idea could be given reasonable form. The hope was that such an enterprise could help us reexamine and revise our conceptions about the place of technology in the world. In offering this perspective, I have tried to indicate that many of our present conceptions about technics are highly questionable, misleading, and sometimes positively destructive. I have also tried to lay some of the early groundwork for a new philosophy of technology, one that begins in criticism of existing forms but aspires to the eventual articulation of genuine, practical alternatives. (AT, 306)

This is a nice summary, and explains the rationale for what was an otherwise puzzling duplication of material between the first third of the book and its remaining two parts.  Winner spends quite a lot of time establishing that 1) we're worried about technology and 2) while some of these worries don't make a lot of rational sense, there is some real reason to worry.  I think both were important points to make, especially at the time of publication in 1977; it's clear that even today not everyone would agree with them.  And in making them, Winner addresses some interesting questions that often go unnoticed in our thinking about technology.  For example, asking if we should be worried about technology allows us to back up and suspend judgement for a moment so that we can see what's at stake in the question itself.  If we're afraid that technology is now somehow "out of control", then we clearly need to interrogate our notions of control and power.  With this question we immediately enter a political realm. 

But in the end, Winner's answer to this question is sufficiently not-earth-shattering (we thought we were the goal-setting master and technology the goal-obeying slave), and our perspective sufficiently jaded (by the century-long failure of the modern, technocratic, or internet revolutions to bring about anything resembling utopia), that this first third of the book now reads as an elaborate and unnecessary justification of why you should think about these issues at all.  The attempt to maintain a facade of neutrality throughout this section -- "some people have said that modern technology qualitatively changes human experience" -- when it will later become clear that Winner is himself almost entirely a techno-critic who agrees with 'some people', introduces a lot of duplicate material.  So, for example, we see Jacques Ellul presented first as potentially just a wild-eyed prophet of techno-doom, only to find these same basic ideas re-presented, in the chapter on "Technological Politics", as compelling observations on the changing nature of human autonomy in a technological society.  It's enough to make us wonder whether Winner devoted a third of the book to shadowboxing a straw man just to keep in top academic condition, or whether political science was backwards enough (in 1977) that this approach was necessary to keep your job.

As the summary quoted above indicates, though, once Winner finally legitimizes criticism of technology, he goes on to articulate two basic approaches to the problem of improving things.  He dedicates one chapter to "Technocracy" as illustrated by the theories of Don Price and John Kenneth Galbraith.  While the details differ, these theories hold that any political problems caused by the unintended side effects of technology can be resolved with better political solutions.  Both solutions may require a new elite interest group to implement them, but they still fit squarely within our understanding of politics as the process of human self-determination.  A second chapter illustrates a much more radical take on the problem that Winner calls "Technological Politics".  This is the view associated with figures like Ellul, Mumford, and Marcuse.  They argue that modern technology so permeates human experience that to imagine we are free to direct it as we see fit is pure hubristic fantasy.  Technology now has a logic of its own, together with its own autonomous momentum.  And we can neither control it in any straightforward sense, nor even get off the train.  Winner summarizes the two positions here:

   The first, the utilitarian-pluralist approach, sees that technology is problematic in the sense that it now requires legislation. An ever increasing array of rules, regulations, and administrative personnel is needed to maximize the benefits of technological practice while limiting its unwanted maladies. Politics is seen as the process in representative government and interest group interplay whereby such legislation takes shape.
   The second approach, disjointed and feeble though it still may be, begins with the crucial awareness that technology in a true sense is legislation. It recognizes that technical forms do, to a large extent, shape the basic pattern and content of human activity in our time. Thus, politics becomes (among other things) an active encounter with the specific forms and processes contained in technology.
   Along several lines of analysis this book has tried to advance the idea central to all thinking in the second domain - that technology is itself a political phenomenon. A crucial turning point comes when one is able to acknowledge that modern technics, much more than politics as conventionally understood, now legislates the conditions of human existence. (AT, 323)

As you can see, Winner himself falls squarely into the latter, "technological politics", approach.  But because he conceives of his work as overcoming the objections of some putative technocratically optimistic Chad, he spends a great deal of time trying to specify exactly how the (assumed to be muddle-headed) idea that technology already is politics can be made to fit with a naively dualistic view of human agency.  This is a very useful exercise.  However, it's also an exercise that tends to dissolve its own starting point in a way that Winner doesn't quite appreciate.  Fortunately, this undermining actually takes us to the heart of the issue, and carries us right back into the terrain that Simondon explored  -- what does it even mean to be an 'autonomous' individual?  So in what remains, I'll trace Winner's careful dualistic reconstruction of how the sixties critics of Technological Politics might well have been onto something, and then look at how the means of this reconstruction paint him into a corner that he could only have gotten out of with the help of Simondon's concept of reciprocal causality.  

---------

Winner's central contention is that technology has changed how power functions in human society.  Whereas previously the power to steer society towards particular values may have rested with kings, or capitalists, or even 'the people', now this power increasingly rests with 'technology'.  This is the conclusion of the school of Technological Politics that Winner wants to reconstruct.

In twentieth-century social philosophy the conception of a self-maintaining technological society has recurred in a number of interesting and disturbing books - Oswald Spengler's Man and Technics, Friedrich Georg Juenger's The Failure of Technology, Karl Jaspers's Man in the Modern Age, Lewis Mumford's The Myth of the Machine, Herbert Marcuse's One-Dimensional Man, Siegfried Giedion's Mechanization Takes Command, and Jacques Ellul's The Technological Society. These works contain a widely diverse collection of arguments and conclusions, but in them one finds a roughly shared notion of society and politics, a common set of observations, assumptions, modes of thinking and sense of the whole, which, I believe, unites them as an identifiable tradition. Taken together they express an inchoate theory which takes modern technology as its domain. (AT, 174)

But what is 'technology'?  Isn't it just a neutral collection of means for achieving what are always ultimately human ends?  How could these means, when we loosely group them together as a whole, themselves gain the power to determine the ends of human society?  Aren't we just anthropomorphizing 'technology' by treating it as an entity or organism or subject in its own right, and then turning this fiction into a political actor?  Does this represent progress in political theory, or its mystification?  To reconstruct what it might mean for technology to be a political actor, Winner first has to rescue the idea from those hard-headed critics who still believe that only human individuals can have political agency.  In other words, he has to rephrase what sounds like the description of an organism -- "a self-maintaining technological society" -- as nothing more than some peculiar thing that happens when humans have a whole lot of tools at their disposal.  

Are there certain conditions, constraints, necessities, requirements, or imperatives effectively governing how an advanced technological society operates? Do such conditions predominate regardless of the specific character of the men who ostensibly hold power? This, it seems to me, is the most crucial problem raised by the conjunction of politics and technics. It is certainly the point at which the idea of autonomous technology has its broadest significance. (AT, 173)

What characteristics allow us to confuse technology with a self-maintaining organism?  Well, one is Winner's very broad use of the term 'technology'.  He doesn't simply mean the various apparatus that we commonly refer to as machines.  He also includes the entire body of human "technical activities", that is, all goal-seeking rational behavior, which he refers to as "technique" (following the French use of the term).  Finally, he also means to address the various organizations, or networks, that function like machines, regardless of whether they consist of entirely human or entirely artificial parts, or some mix of the two.  So it's clear from the outset (AT, 11-12) that there is no clear dividing line between human society and technology.  The latter is being construed not only to include much of human social behavior, but even the parts of individual behavior that conform to a 'rational' logic of means and ends.  Already, we can see that the question of which one is an autonomous organism is complex and messy.  We might be talking about technology, or society, or even, if we stop taking it for granted for a moment, a human person.  In fact, most of the time Winner's term "autonomous technology" means the complicated systemic interaction of these three.  One of the secrets of thinking about technology, which to me was not apparent when I began with Simondon, is the way it reveals how badly we need a better definition of 'organism' or 'individual', along with the closely related notions of 'control' and 'autonomy'.

Winner, however, is a social scientist, not a philosopher, so he doesn't try to give us a definition of an organism.  Instead, he appeals again and again to the various ways that technology appears to acquire a life of its own by calling into question humans' ability to control and limit it.  In other words, the proof of technology's autonomy lies in the way we lose our own.  For example, modern technology seems to constantly expand its reach, and at an ever-accelerating rate that hairless chimps find difficult to keep up with.  In the process, it also seems to suck in everything in its environment and reshape it as raw material for technological growth.  And this logic applies not only to natural resources, but even tends to reshape human beings into suitably standardized and machined parts, as if technology were incorporating or disciplining us.  Technical systems seem to inevitably become megatechnical systems, to use Mumford's term.  The scale of an individual apparatus or organization tends to increase in an endless quest for efficiency, and somehow each tends towards a complex interaction with others.  In addition, technical development doesn't always seem to respond to human goals.  If it's really just working for us, then why are there so many unintended consequences that don't seem to benefit any human?  We wanted universally accessible knowledge and instead we got TikTok.  We wanted to get from A to B, and somehow we got an endless parade of SUVs.  Technological means seem to become ends in themselves with surprising regularity.  Finally, technology doesn't ever seem to want to be turned off.  A decade ago we might have illustrated this with the mushrooming of highly redundant data center bunkers.  Today we can imagine adding an AI whose job is to motivate those notoriously fickle humans to continue supplying fuel and replacing servers.  But Winner has in mind something more systemic that has long involved the interaction of human desire and technical means, which he calls the threat of apraxia.  Imagine a politician who suggests shutting down the internet and you'll immediately comprehend the idea.  Even if technology seems to require human participation in it as both efficient means and final justifying end, and even if we could theoretically abandon it and survive just fine, somehow the idea of stopping is always made to seem unthinkably apocalyptic.  For Winner, all of these examples serve to illustrate that we are not in control of our modern technology the way we imagine ourselves in control of, say, a hammer.

In its centrality to the daily activity and consciousness of the "employee," the function-serving human component, the technical order is more properly thought of as a way of life. Whatever else it may be, a way of life is certainly not neutral. Opportunities for "use" or "control" that the human components have within this system are minimal, for what kind of "control" is it that at every step requires strict obedience to technique or the necessities of technical organization? One can say that the "control" is exercised from the center or apex of the system; this is true, although we shall soon see that even this has a paradoxical character. But in terms of the functioning of individual components and the complex social interconnections, "control" in the sense of autonomous individuals directing technical means to predetermined ends has virtually no significance. "Control" and "use" simply do not describe anything about relationships of this kind. The direction of governance flows from the technical conditions to people and their social arrangements, not the other way around. What we find, then, is not a tool waiting passively to be used but a technical ensemble that demands routinized behavior.
   In this way of seeing, therefore, the tool-use model is a source of illusions and misleading cues. We do not use technologies so much as live them. (AT, 201)

But if humans aren't in charge of technology, then what is?  Has Winner given us any reason to think that technology is in charge of itself, as the word autonomy suggests?  And if modern technology constitutes a new organism, when was it born, and above all, what does it want?  Here, Winner's methodology begins to create problems.  On the one hand, he's sought to be hard-headed and ignore the idea that there's anything mystical or vital involved here.  'Technology' is really just a shorthand for the systemic interaction of mechanical apparatus, human technique, and organizational networks that we see in the world around us.  Humans are the only real agents in this system, and the problem is simply that we have lost this agency, this control.  The idea that it has thereby passed to some other entity is just an expression of our Frankensteinian fear.  On the other hand, Winner has actually described something that walks and talks like an organism in its own right.  He hasn't emphasized this emergence, but as I suggested earlier, as the book progresses he increasingly talks as if there is a real "self-maintaining technological society".  He has even sketched what this system or organism wants.  It seems to want to expand and incorporate everything, and it seems to want to turn its ever-evolving means into ends in themselves.

In fact, Winner will even go on to talk about how these tendencies to expansion ("technological imperative") and self-justification ("reverse adaptation") actually serve to meld together the two perspectives of Technocracy and Technological Politics we discussed earlier.  Does technology require legislation or is it legislation?  Apparently the answer is yes.

  On the basis of the revised notions of technological politics presented here, however, I want to suggest an alternative conception of technocracy. It is one which avoids the pitfalls of the centralist-elitist notion and which, I believe, comes closer to defining the crucial problem that writers on this question have been aiming at for some time. The conception, although difficult to formulate in terms of precise indexes of measurement, could certainly be tested.
   I offer it as follows. Technocracy is a manifestation of two influences upon public life, which we have dealt with at some length: the technological imperative and reverse adaptation as they appear to a whole society with the force of overwhelming necessity. From this point of view it matters little who in specific obeys the imperative or enacts the adaptation. It is of little consequence what the nature of the education, technical training, or specialized position of such persons may be; indeed, they need not be technically trained at all. Similarly, the issue has little to do with whether there is a "center" of decisions, who occupies it, or what their personal, professional, or class interests may be. (AT, 258)

Here Winner reinterprets the technocrats themselves as mere dispensable tools of technology.  Technocracy becomes a description of how the system works, not who controls it.  Technology gives itself the legislation it requires, though this legislation only serves to unleash more technology rather than allow some human elite to limit it.  This view seems perfectly accurate, and is very close to the critique of Marxism that we saw in Simondon.  The elites are slaves of the machines just as much as the workers, notwithstanding their more comfortable yachts.  Placed in the same situation, anyone would act as they do, because the system has made any other behavior unthinkable.

However, this insistence that the system runs itself, that it doesn't require an elite who understands it, or even a center from which power is constructed, poses a tricky question.  What leads us to call it a single system at all?  Winner has described something that we can only recognize because it has a coherent logic of its own that resists human efforts at control.  But then he claims that this same thing is so decentralized that each megatechnical system charts its own course.

Finally, the view I am taking here does not find the essence of technocracy at the center of all centers. If what we have seen is correct, then one would expect a dispersion of power into the functionally specific large-scale systems of the technological order. The systems do on occasion appeal to the central decision-making organs of the state for support and assistance, but it is incorrect to say that in so doing they necessarily yield control of their affairs. Their tendency is, in fact, to resist the final centralization Ellul predicted. The direction of governance by technological imperatives and reverse adaptation runs from megatechnical systems to the state. (AT, 261)

I think Winner sees this idea of a decentralized technocracy as his crucial contribution to the theory of Technological Politics.  He thinks his more rigorous materialistic methodology allows him to reach the same type of conclusions as Mumford and Ellul, but without the whiff of vitalism or human essentialism that (supposedly) clings to those authors.  This is why he emphasizes here that his theory requires none of the state centralization that the others apparently see as inevitable.

But this idea won't work as it stands.  If each megatechnical system has its own "affairs" to control, and these are not organized by a central state, then we really don't have a system at all.  Why would all these individual megatechnical systems share common goals, and why would the resulting overall behavior have any coherent logic that we can recognize?  And if megatechnical systems resist final integration by the mega-mega-technical state, then why doesn't this logic apply inductively, and so prevent technical systems from ever forming megatechnical systems in the first place?  Where's the secret sauce that allows certain technical systems to come together so that they act like distinct centers that can control their affairs and resist centralization?  Conversely, if these systems are the only real agents, then how is it that we've identified a characteristic expansion and self-justification of 'technology' as a whole?  Was this just a sloppy shorthand for describing the combined effect of distinct megatechnical systems?  But then why do they all seem to have the same combined effect, where they don't just individually expand but also constantly produce new megatechnical systems?  Winner appears to be arguing that we are slaves to many machines which are not necessarily compatible.  This is probably true, but it doesn't help us see how the individual machines formed or why they seem to come together to form a system.

What we need here is the idea of causal feedback loops.  This was the core of Simondon's theory of the technical individual.  We have to distinguish the individual from the element and ensemble levels of organization, and the key to that is examining where self-reinforcing causal loops get formed.  At this point, it feels like an anti-climax to admit that I don't completely understand how to construct the theory we need here.  It seems that Simondon developed the idea in his book on individuation, and merely applied it in On the Mode of Existence of Technical Objects.  So presumably we'll return to this point.  For now, I can only sketch a few things that our discussion of Winner has gotten me thinking about.  I think we need something like an ecology of feedback loops.  On the one hand, the existence of individuals clearly requires a positive feedback loop.  I'm imagining something like an autocatalytic set of causal relations.  If the internal functioning of the individual doesn't cause the individual to continue functioning as it does, then we're not going to see that individual around for very long.  On the other hand, it seems individuals also require negative feedback loops to maintain a separate identity.  We need a condition of closure to create a distinct self that can be maintained.  Left to itself, a self-reinforcing autocatalytic set wouldn't maintain the equilibrium we associate with individuals, but would simply expand without limit.  It wouldn't make a copy of itself but would simply make more of itself.  This is the kind of behavior we associate with crystals, not living individuals.
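
To make that contrast a bit more concrete, here is a toy numerical sketch of my own -- nothing like it appears in Winner or Simondon, and the growth rule, rate, and capacity parameters are purely illustrative.  A loop that only reinforces itself expands without limit, like the crystal, while the same loop coupled to a negative, limiting loop settles into the bounded, self-maintaining behavior we associate with an individual.

# Toy illustration only: two difference equations for feedback, not anything from Winner or Simondon.

def pure_positive_feedback(x0=1.0, rate=0.3, steps=25):
    """Self-reinforcement with no closure: growth is proportional to what already exists."""
    x, history = x0, [x0]
    for _ in range(steps):
        x += rate * x                       # the more there is, the more gets added
        history.append(x)
    return history                          # expands without limit: crystal-like growth

def closed_feedback(x0=1.0, rate=0.3, capacity=10.0, steps=25):
    """The same positive loop damped by a negative one as x nears a limit (logistic form)."""
    x, history = x0, [x0]
    for _ in range(steps):
        x += rate * x * (1 - x / capacity)  # self-reinforcement checked by self-limitation
        history.append(x)
    return history                          # settles near 'capacity': a maintained equilibrium

print(round(pure_positive_feedback()[-1], 1))  # in the hundreds and still climbing
print(round(closed_feedback()[-1], 1))         # hovering at roughly 10.0

Only the second trajectory looks like something with an identity: it stops changing not because it runs out of world to absorb, but because its own dynamics include a limit.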

This is about as far as I've gotten with the idea so far.  It's not obvious to me that one of these loops should operate exclusively internally and the other externally, between the individual and the environment.  This was one of the interesting, though confusing, points in Simondon.  The individual and its associated milieu are held together like two poles of a teepee.  The individual can't be what it is without a suitable associated milieu, but this in turn can't be as it is without the actions of the individual.  Winner describes a similar phenomenon when he talks about the way technology reshapes both the natural and human environment in a way that leads to more technology.  At the same time, the individual is not just the same thing as its associated milieu, so there has to be some mechanism that creates this distinctness. 
