━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
ELEMENTS OF REFUSAL
John Zerzan
(transcribed by this site’s author out of the archive.org PDF of the book, and some on-line pages)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━


Table of Contents
─────────────────

Preface to the Second Edition
Introduction to the First Edition

PART ONE
.. Beginning of Time, End of Time
.. Language: Origin and Meaning
.. Number: Its Origin and Evolution
.. The Case Against Art
.. Agriculture

PART TWO
.. Industrialism and Domestication
.. Who Killed Ned Ludd?
.. Axis Point of American Industrialism
.. The Practical Marx
.. Origins and Meaning of WWI
.. Taylorism and Unionism
.. Unionization in America
.. Organized Labor vs. “The Revolt Against Work”
.. New York, New York
.. The Refusal of Technology
.. Anti-work and the Struggle for Control

PART THREE
.. The Promise of the ‘80s
.. The ‘80s So Far
.. Present-Day Banalities
.. Media, Irony and “Bob”
.. Afterword

Commentary on Form and Content in /Elements of Refusal/ by Paul Z. Simons

TODO: Transcribe PART TWO missing chapters, transcribe entirety of PART THREE, fix up formatting, add footnotes


Preface to the Second Edition
═════════════════════════════

This collection of writings was published by Left Bank in 1988, and went out of print fairly quickly. I believe most of it holds up rather well, in part because of a totality that keeps giving us new evidence, on every level, of its fundamental destructiveness. The magnitude of these challenges, created by such a depth of peril and falsity, is the strongest impetus behind efforts to question every component of our truly frightening reality. Unfortunately, stark reality has far more often brought the opposite response, based on fear and denial. More and more we are immersed in a postmodern ethos of appearances, images, and veneers. Everyone can feel the nothingness, the void, just beneath the surface of everyday routines and securities. How tempting, apparently, to avoid asking why, thus elevating the superficial as the only appropriate, indeed the only possible response. The fragmentary, the cynical, and the partial define an extremely pervasive postmodern stance—if such a cowardly, shifting outlook even qualifies as a stance. It is hardly surprising that the high-tech juggernaut, embodying all the bereft features of the social order as a whole, rushes into this intellectual and moral vacuum with an increasing acceleration.

I live in the Pacific Northwest, where I was born and where the final traces of the natural forests are being systematically eradicated. The vista of cloned humans looms, as we struggle to maintain some undamaged humanness in a bleak, artificialized panorama. The group suicide of techno-occultists at Rancho Santa Fe (March 1997) is too faithful a reflection of the desperation generated by engulfing emptiness. One of the would-be UFO voyagers spoke for so many others: “Maybe I’m crazy but I don’t care. I’ve been here thirty-one years and there’s nothing for me here.”

The first five essays in this volume, written during the mid-1980s, are the basis for more recent efforts such as “Future Primitive” (1992) and “Running on Emptiness” (1997). The question of the origins of our estrangement is refused by a reigning culture that recognizes neither origins nor estrangement. I feel that this question must be explored, in the face of this stunning, still-unfolding enormity: the entire absence of free or whole life.

Time, language, number, art, agriculture. On the other hand, maybe there are no foundations of alienation to be found in these categories, or anywhere else. Certainly these five explorations, and the others that followed, have elicited some very negative reactions. When they were published in Fifth Estate in the ‘80s, FE never failed to run accompanying commentaries rejecting their conclusions. This line of originary studies has been called absolutist, moralistic, religious, paralyzing, even anti-pleasure[1], among other things. To me they are none of the above. In trying to put forth lines of thought, I may have seemed definitively closed to other perspectives. If so, I regret it.

“Industrialism and Domestication” and “Who Killed Ned Ludd?” appear later in the book, but were written earlier. Discovering the intentional social control built into industrial technology and the factory system was part of a questioning that led not only to a re-appraisal of technology itself, but also to a search for the remote origins of our present captivity, all the way back to the beginnings of symbolic culture.

Many of the remaining contributions deal with anti-work phenomena and other recent evidence of the erosion of belief in society’s dominant values. These writings often implied that a collapse of the transcendent order was all but imminent. Here I was obviously a bit too sanguine. The onrushing impoverishment of daily life, not to neglect contracting economic pressures, has led many to cling to any semblance of content or meaning, even when found in the context of work. Thus trends of social and workplace alienation that some of us saw as promising have yet to move to the stage of significant resistance, even if the method of being attentive to barely-concealed indices of disaffection remains valid.

I hope that aspects of /Elements of Refusal/ may be useful to those who are appalled by the nightmare we face, and who are determined not to go along. This edition I dedicate to the Unabomber. As Arleen Davila put it, “He tried to save us.”


Introduction to the First Edition
═════════════════════════════════

/Elements of Refusal/ is the first comprehensive collection of John Zerzan’s writings. Appearing over the past decade in primarily marginal or “underground” publications, this collection is long overdue. No less than as they appeared, these essays are provocative and important. For me John’s writings have always contained that critical spirit which best characterized both the old “Frankfurt School” and the Situationists—but are more radical, and without the debilitating despair of the former or the disgusting love affair with technology and “progress” afflicting the latter.

Present-day “reality,” as constituted by those with vested interests in maintaining this domination, is touted as the “best”, if not the only possible reality. Accordingly, history is shaped like a monstrous land-fill to legitimize this contemporary high-rise shill. Still, the designated social straitjacket ill-fits and the social fabric isn’t so smooth as appearances dictate. Daily life, as John makes clear, with its increasingly intensifying alienations, schizophrenia and psychopathology, becomes more spectacular and bizarre. No, all is not well in Utopia. It is a weird and peculiar world where the growing destruction of the earth is touted as “progress,” an advance for humanity.
Every technological innovation promising to bring us closer together drives us farther apart; every revolution promises to liberate us from want, but leaves us more in need. We grow more dependent on glitter and distraction to fill the void where all that is human is gutted. Our noses are shoved to the window of consumption (a display of lies) and we are told that here is life. Life is reduced to a game where, for a price, anyone can play; but there is nowhere to play. Indeed, the word “survive” replaces the word “life” more and more in our everyday speech, as if they were equivalent. A kind of social terror permeates everything, becoming a commonplace in our lives. Because, contrary to the glib, superficial aura (desperately and massively touted by mass media), this “work-buy-consume-die” paradise teeters on the brink of collapse and dissolution.

But it is not enough to suspect something awry, to buy bicycles instead of cars, or eat more grain, less meat. It is not enough to affirm the coherency of our feelings and/or insights through alternative groupings, structures, cultures, and so forth. We must go much further. Failure to press coherently to the sources of our malaise simply leaves us carrying this offal about, endlessly failing to understand anything, repeating forever the stupidities trapping us here, reducing everything to a cynical charade. We will be continually victimized, our best insights nothing if we are not to become visionaries, insisting on more of life than a never-ending series of computer gadgets, new “causes,” new mysticisms or re-runs of Dr. Strangelove ad nauseam.

John’s essays make all this abundantly clear. Here it is axiomatic that time, technology, work and other aspects of our social lives—hailed as the liberators of humanity—are, in fact, the co-conspirators of domestication and domination. Today, more than ever—as you will see from this modest collection—they stand exposed. If some think these efforts are simply a theory of spontaneity, they will fail to understand anything, much less the end of illusion, how to separate the authentic from the corrupt and recuperable.

If de-mystification is difficult, finding those prepared to listen or to undertake the necessary doings is more so. The blat of everyday survival threatens to drown out some important voices of our time. A few I would point out, for example, are Fredy Perlman, Frederick Turner, Jacques Camatte, Pierre Clastres, Marshall Sahlins, Richard Drinnon, Stanley Diamond, Howard Zinn and the lively, changing groups of people who have been involved in marginal and periodical publications, such as the Fifth Estate in Detroit. These people constitute no school or homogeneous group. They are diverse individuals whose disagreements, oppositions and arguments are as integral to their activity as the commonality of their projects. At the core we see much of what is vital to any authentic revolution: to have done with the “civilizing” myths destroying us.

Much of their work is necessarily “anthropologically” grounded. The importance of this digging cannot be underestimated. It isn’t a rooting about for utopia or silly sociological role-models. We are so locked in mentally and physically to “what is” that we fail to recognize that our kingdom is a prison. The overwhelming power of present-day ruling notions and the requirement of sheer survival leave many of us virtually incapable of recognizing how diverse are the possibilities of life.

It is not the power of the State, of capitalism, mass media, nationalism, racism, sexism, work routine, class, language, schooling, or culturalization doing us in, but the total ensemble that must be attacked. John’s writings are an important part of this effort—divested of the dross always undermining the best-intentioned movements—to begin anew rather than on or within the ash-heaps of the old society, for we are not rid of a plague while trucking its diseased baggage all about.

/Elements of Refusal/ is the result of one person’s pursuits, musings, concerns, discoveries, possibilities, researches and clarifications where so little is understood. The ideological landscape is insidious in its need to prevail. Everywhere this is confirmed. Even the suspicious, the unmarginalized or the refusers have few places to turn. This small book is not a how-to manual nor a blueprint of an alternative future, but begins where we must all begin: by questioning the whole in each of its parts. And it reflects the attendant problems of rummaging and researching where so little is understood. This is, ultimately, a book of on-going explorations—not equations.

These articles are loosely grouped in three sections: the first encompasses the more fundamental, sweeping, speculative searches for the sources of our contemporary malaise—origins so deep as to require digging into pre-history; the second group is oriented to events and movements over the past 100 years or so, debunking certain mythologies surrounding technology, the origins of WWI, a variety of “breakdowns,” and industrialism with its concomitant actors and movements; and the last section, focused on the 1980s, draws especially upon mass media’s own disparate materials, helping us to understand present-day diversions and the radical contexts of its “breakdowns.”

Every pocket of refusal gives us hope and every element of refusal keeps this hope burning; in the “past,” as we are the legatees of those before us, “presently,” amongst each other; to the “future,” absolutely. Of some primitive past, some so-called “Golden Age,” we cannot and do not want to re-implement its time or character; but we can, now, recover and cleave to its temper. And here, lastly, if John’s tone is often apocalyptic, so be it; indeed, it is in this spirit /Elements of Refusal/ is presented—as a series of provocations and challenges.

David Brown
Left Bank Books


PART ONE
════════


Beginning of Time, End of Time
──────────────────────────────

Just as today’s most obsessive notion is that of the material reality of time, self-existent time was the first lie of social life. As with nature, time did not exist before the individual became separate from it. Reification of this magnitude—the beginning of time—constitutes the Fall: the initiation of alienation, of history.

Spengler observed that one culture is differentiated from another by the intuitive meanings assigned to time, Canetti that the regulation of time is the primary attribute of all government. But the very movement from community to civilization is also predicated there. It is the fundamental language of technology and the spirit of domination. Today the feverish acceleration of time, as well as the failure of the “solution” of spatializing it, is exposing it as an artificial, oppressive force along with its corollaries, Progress and Becoming. More concretely, technology and work are being revealed by the palpable thrall of time.
Either way, the pressure to dissolve history and the rule of time hasn’t been so strong since the Middle Ages, before that, since the Neolithic revolution establishing agriculture. When the humanization of technology and work appear as dubious propositions, the humanization of time itself is also called into question. The questions forming are, how can basic oppressions be effectively controlled or reformed? Why not abolished?

Quoting Hegel approvingly, Debord wrote, “Man, ‘the negative being who is only to the extent that he suppresses Being,’ is identical to time.” This equation is being refused, a situation perhaps best illumined by looking at the origins, evolution and present status of time. If “all reification is forgetting,” in Horkheimer and Adorno’s pregnant phrase, it seems equally true that all “forgetting”—in the sense of loss of contact with our time-less beginnings, of constant “falling into time”—is a reification. All the other reifications, in fact, follow this one. It may be due to the huge implications involved that no one has satisfactorily defined the objectification called time and its course. From time, into history, through progress, and so to the murderous idolatry of the future, which now kills species, languages, cultures, and possibly the entire natural world. This essay should go no further without declaring an intent and strategy: technological society can only be dissolved (and prevented from recycling) by annulling time and history. “History is eternal becoming and therefore eternal future; Nature is become and therefore eternally past,” as Spengler put it. This movement is also well captured by Marcuse’s “History is the negation of Nature,” the increasing speed of which has carried man quite outside of himself.

At the heart of the process is the reigning concept of temporality itself, which was unknown to early humans. Levy-Bruhl provides an introduction: “Our idea of time seems to be a natural attribute of the human mind. But that is a delusion. Such an idea scarcely exists where primitive mentality is concerned…” The Frankforts concluded that primeval thought “does not know time as uniform duration or as a succession of qualitatively indifferent moments.” Rather, early individuals “lived in a stream of inner and outer experience which brought along a different cluster of coexisting events at every moment, and thus constantly changed, quantitatively and qualitatively.” Meditating on the skull of a plains hunter-gatherer woman, Jacquetta Hawkes could imagine the “eternal present in which all days, all the seasons of the plain stand in an enduring unity.” In fact, “life was lived in a continuous present,” underlying the point that historical time is not inherent in reality, but an imposition on it. The concept of time itself as an abstract, continuing “thread,” unravelling in an endless progression that links all events together while remaining independent of them, was completely unknown. Henri-Charles Puech’s term “articulated atemporality” is a useful one, which reflects the fact that awareness of intervals, for instance, existed with the absence of an explicit sense of time. The relationship of subject to object was radically different, clearly, before temporal distance intruded into the psyche. Perception was not the detached act we know now, involving the distance that allows an externalization and domination of nature. Of course, we can see the reflections of this original condition in surviving tribal peoples, in varying degrees.

Wax said of the nineteenth century Pawnee Indians, “Life had a rhythm but not a progression.” The Hopi language employs no references to past, present or future. Further in the direction of history, time is explicit in Tiv thought and speech, but it is not a category of it, just as another African group, the Nuer, have no concept of time as a separate idea. The fall into time is a gradual one; just as the early Egyptians kept two clocks, measuring everyday cycles and uniform “objective” time, the Balinese calendar “doesn’t tell what time it is, but rather what kind of time it is.”

In terms of the original hunter-gatherer humanity generally referred to above, a few words may be in order, especially inasmuch as there has been a “nearly complete reversal in anthropological orthodoxy” concerning it since the end of the 1960s. Life prior to the earliest agricultural societies of about 10,000 years ago had been seen as nasty, short and brutish, but the research of Marshall Sahlins, Richard Lee and others has changed this view very drastically. Foraging now represents the original affluent society in that it provided life and pleasures with a minimum of effort; work was regarded strictly as a social cost and the spirit of the gift predominated. This, then, was the basis of no-time, bringing to mind Whitrow’s remark that “Primitives live in a now, as we all do when we are having fun” and Nietzsche’s that “All pleasure desires eternity—deep, deep eternity.”

The idea of an original state of pleasure and perfection is very old and virtually universal. The memory of a “Lost Paradise”—and often an accompanying eschatology that demands the destruction of subsequent existence—is seen in the Taoist idea of a Golden Age, the Cronia and Saturnalia of Rome, the Greeks’ Elysium, and the Christian Garden of Eden and the Fall (probably deriving from the Sumerian laments for lost happiness in lordless society), to name but a few. The loss of a paradisal situation with the dawn of time reveals time as the curse of the Fall, history seen as a consequence of Original Sin. Norman O. Brown felt that “Separateness, then, is the Fall—the fall into division, the original lie,” Walter Benjamin that “the origin of abstraction… is to be sought in the Fall.” Conversely, Eliade discerned in the shamanic experience a “nostalgia for paradise,” in exploring the belief that “what the shaman can do today in ecstasy” could, prior to the hegemony of time, “be done by all human beings in concreto.” Small wonder that Loren Eiseley saw in aboriginal people “remarkably effective efforts to erase or ignore all that is not involved with the transcendent search for timelessness, the happy land of no change,” or that Levi-Strauss found primitive societies determined to “resist desperately any modification in their structure that would enable history to burst forth into their midst.”

If all this seems a bit too heady for such a sober topic as time, a few modern cliches may give pause as to where an absence of wisdom really lies. John G. Gunnell tells us that “Time is a form of ordering experience,” an exact parallel to the equally fallacious assertion of the neutrality of technology.
Even more extreme in its fealty to time is Clark and Piggott’s bizarre claim that “human societies differ from animal ones, in the final resort, through their consciousness of history.” Erich Kahler has it that “Since primitive peoples have scarcely any feeling of individuality, they have no individual property.” A notion as totally wrong as Leslie Paul’s “In stepping out of nature, man makes himself free of the dimension of time.” Kahler, it might be added, is on vastly firmer ground in noting that the early individual’s “primitive participation with his universe and community begins to disintegrate” with the acquiring of time. Seidenberg also detected this loss, in which our ancestor “found himself diverging ever further from his instinctual harmony along a precarious path of unstable synthesis. And that path is history.”

Coming back to the mythic dimension, as in the generalized ancient memory of an original Eden—the reality of which was hunter-gatherer life—we confront the magical practices found in all races and early societies. What is seen here, as opposed to the timebound mode of technology, is an atemporal intervention aimed at the “reinstatement of the usual uniformities of nature.” It is this primary human interest in the regularity, not the supersession, of the processes of nature that bears emphasizing. Related to magic is totemism, in which the kinship of all living things is paramount; with magic and its totemic context, participation with nature underlies all. “In pure totemism,” says Frazer, “…the totem [ancestor, patron] is never a god and is never worshipped.” The step from participation to religion, from communion with the world to externalized deities for worship, is a part of the alienation process of emerging time. Ratschow held the rise of historical consciousness responsible for the collapse of magic and its replacement by religion, an essential connection. In much the same sense, then, did Durkheim consider time to be a “product of religious thought.” Eliade saw this gathering separation and related it to social life: “the most extravagant myths and rituals, Gods and Goddesses of the most various kinds, the Ancestors, masks and secret societies, temples, priesthoods, and so on—all this is found in cultures that have passed beyond the stage of gathering and small-game hunting…” Elman Service found the band societies of the hunter-gatherer stage to have been “surprisingly” egalitarian and marked by the absence not only of authoritarian chiefs, but of specialists, intermediaries of any kind, division of labor, and classes. Civilization, as Freud repeatedly pointed out, with alienation at its core, had to break the early hold of timeless and non-productive gratification.

In that long, original epoch, alienation first began to appear in the shape of time, although many tens of thousands of years’ resistance stayed its definitive victory, its conversion into history. Spatialization, which is the motor of technology, can be traced back to the earliest sad experiences of deprivation through time, back to the beginning efforts to offset the passage of time by extension in space. The injunction in Genesis to “Be fruitful and multiply” was seen by Cioran as “criminal.” Possibly he could see in it the first spatialization—that of humans themselves—for division of labor and the other ensuing separations may be said to stem from the large growth of human numbers, with the progressive breakdown of hunter-gatherer life.
The bourgeois way of stating this is the cliche that domination (rulers, cities, the state, etc.) was the natural outcome of “population pressures.” In the movement from the hunter-gatherer to the nomad we see spatialization in the form, at about 1200 B.C., of the war chariot (and the centaur figure). The intoxication with space and speed, as compensation for controlling time, is obviously with us yet. It is a kind of sublimation; the anxious energy of the sense of time is converted toward domination spatially, most simply. With the end of a nomadic existence, the social order is created on a basis of fixed property, a further spatialization. Here enters Euclid, whose geometry reflects the needs of the early agricultural systems and which established science on the wrong track by taking space as the primary concept.

In attempting a typology of the egalitarian society Morton Fried declared that it had no regular division of labor (and thus no political power accrued therefrom) and that “Almost all of these societies are founded upon hunting and gathering and lack significant harvest periods when large reserves of food are stored.” Agricultural civilization changed all of this, introducing production via the development of surplus and specialization. Supported by surplus, the priest measured time, traced celestial movement, and predicted future events. Time, controlled by a powerful elite, was used directly to control the lives of great numbers of men and women. The masters of the early calendars and their attendant lore “became a separate priestly caste,” according to Lawrence Wright. A prime example was the very time-obsessed Mayans; G.J. Whitrow tells us that “of all ancient peoples, the Mayan priests developed the most elaborate and accurate astronomical calendar, and thereby gained enormous influence over the masses.” Generally speaking, Harry Elmer Barnes is quite correct that formal time concepts came with the development of agriculture. One is reminded here of the famous Old Testament curse of agriculture (Genesis 3:17-18) at the expulsion from Paradise, which announces work and domination. “With the advance of farming culture the idea of time became more defined and conceptual, and differences in the interpretation of time constituted a demarcation line between a state of nature and one of civilization, between the educated classes and the masses.” It is recognized as a defining mode of the new Neolithic phenomena, as expressed by Nilsson’s comment that “ancient civilized peoples appear in history with a fully-developed system of time-reckoning,” and by Thompson’s that “the form of the calendar is basic to the form of a civilization.” The Babylonians gave the day 12 hours, the Hebrews gave the week 7 days, and the early notion of cyclical time, with its partial claim to a return to the beginnings, gradually succumbed to time as a linear progression.

Time and domestication of nature advanced, at a price unrivalled. “The discovery of agriculture,” as Eliade claimed, “provoked upheavals and spiritual breakdowns whose magnitude the modern mind finds it well-nigh impossible to conceive.” A world fell before this virulent partnership, but not without a vast struggle. So with Jacob Burckhardt we must approach history “as it were as a pathologist”; with Holderlin we still seek to know “How did it begin? Who brought the curse?” Resuming the narrative, even up to Greek civilization did resistance flourish.
In fact, even with Socrates and Plato and the primacy of systematic philosophy, was time at least held at bay, precisely because “forgetting” timeless beginnings was still regarded as the chief obstacle to wisdom or salvation. J.B. Bury’s classic /The Idea of Progress/ pointed out the “widely-spread belief” in Greece that the human race had decidedly degenerated from an initial “golden age of simplicity”—a longstanding bar to the progress of the idea of progress. Christianson found the anti-progress attitude later yet: “The Romans, no less than the Greeks and Babylonians, also clung to various notions of cyclical recurrence in time…”

With Judaism and Christianity, however, time very clearly sharpened itself into linear progression. Here was a radical departure, as the urgency of time seized upon humanity. Its standard features were outlined by Augustine, not coincidentally at one of the most catastrophic moments of history—the collapse of the ancient world and the fall of Rome. Augustine definitely attacked cyclical time, portraying a unitary mankind that advances irreversibly through time; appearing at about 400 A.D., it is the first notable theory of history. As if to emphasize the Christian stamp on triumphant linear time, one soon finds, in feudal Europe, the first instance of daily life ruled by a strict timetable: the monastery. Run like a clock, organized and absolute, the monastery confined the individual in time just as its walls confined him in space. The Church was the first power to conjoin the measurement of time and a temporally ordered mode of life, a project it pursued vigorously. The invention of the striking and wheeled clock by Pope Sylvester II, in the year 1000, is thus quite fitting. The Benedictine order, in particular, has been seen by Coulton, Sombart, Mumford and others as perhaps the original founder of modern capitalism. The Benedictines, who ruled 40,000 monasteries at their height, helped crucially to yoke human endeavor to the regular, collective beat and rhythm of the machine, reminding us that the clock is not merely a means of keeping track of the hours, but of synchronizing human action.

In the Middle Ages, specifically the 14th century, the march of time met resistance unequalled in scope, quite possibly, since the Neolithic revolution of agriculture. This claim can be assessed by a comparison of the very basic developments of time and social revolt, which seems to indicate a definite and profound collision of the two. With the 1300s quantified, official time staked its claim to the colonization of modern life; time then became fully abstracted into a uniform series of units, points and sections. The technology of the verge escapement early in the century produced the first modern mechanical clock, symbol of a qualitatively new era of confinement now dawning as temporal associations became completely separate from nature. Public clocks appeared, and around 1345 the division of hours into sixty minutes and of minutes into sixty seconds became common, among other new conventions and usages across Europe. The new exactitude carried a tighter synchronization forward, essential to a new level of domestication. Glasser remarked on the “loss of poetry and immediacy in personal experience” caused by time’s new power, and reflected that this manifestation of time replaced the movement and radiance of the day by its utilization as a temporal unit. Days, hours, and minutes became interchangeable like the standardized parts and work processes they prefigured.

These decisive and oppressive changes must have been at the heart of the great social revolts that coincided with them. Textile workers, peasants, and city poor shook the norms and barriers of society to the point of dissolution, in risings such as that of Flanders between 1323 and 1328, the Jacquerie of France of 1358, and the English revolt of 1381, to name only the three most prominent. The millennial character of revolutionary insurgence at this time, which in Bohemia and Germany persisted even into the early 16th century, underlines the unmistakable time element and recalls earlier examples of longing for an original, unmediated condition. The mystical anarchism of the Free Spirit in England sought the state of nature, for example, as did the famous proverb stressed by the rebel John Ball: “When Adam delved and Eve span, who then was a gentleman?”

Very instructive is a meditation of the radical mystic Suso, of Cologne, at about 1330:

‘Whence have you come?’ The image (appearing to Suso) answers ‘I come from nowhere.’
‘Tell me, what are you?’ ‘I am not.’
‘What do you wish?’ ‘I do not wish.’
‘This is a miracle! Tell me, what is your name?’ ‘I am called Nameless Wildness.’
‘Where does your insight lead to?’ ‘To untrammelled freedom.’
‘Tell me, what do you call untrammelled freedom?’ ‘When a man lives according to all his caprices without distinguishing between God and himself, and without looking before or after…’

The desire “to hold all things in common,” to abolish rank and hierarchy, and, even more so, Suso’s explicitly anti-time utterance, reveal the most extreme desires of the 14th century social revolt and demonstrate its element of time refusal.

This watershed in the late medieval period can also be understood via art, where the measured space of perspective followed the measured time of the clocks. Before the 14th century there was no attempt at perspective because the painter attempted to record things as they are, not as they look. After the 14th century, an acute time sense informs art; “Not so much a place as a moment is fixed for us, and a fleeting moment: a point of view in time more than in space,” as Bronowski described it. Similarly, Yi-Fu Tuan pointed out that the landscape picture, which appeared only with the 15th century, represented a major re-ordering of time as well as space with its perspective. Motion is stressed by perspective’s transformation of the similarity of space into a happening in time, which, returning to the theme of spatialization, shows in another way that a “quantum leap” in time had occurred. Movement again became a source of values following the defeat of the 14th century resistance to time; a new level of spatialization was involved, as seen most clearly in the emergence of the modern map, in the 15th century, and the ensuing age of the great voyages. Braudel’s phrase, modern civilization’s “war against empty space,” is best understood in this light. “The new valuation of Time, which then broke to the surface, actually became one of the most powerful agencies by which Western thought, at the end of the Middle Ages, was transformed…” was Kantorowicz’s way of expressing the new, strengthened hegemony of time. If in this objective temporal order of official, legal, factual time only the spatial found the possibility of real expression, all thinking would be necessarily shifted, and also brought to heel.
A good deal of this reorientation can be found in Le Goff’s simple observation concerning the early 15th century, that “the first virtue of the humanist is a sense of time.” How else could modernity be achieved but by the new dimensions reached by time and technology together, their distinctive and perfected mating? Lilley noted that “the most complex machines produced by the Middle Ages were mechanical clocks,” just as Mumford saw that “the clock, not the steam engine, is the key machine of the modern industrial age.” Marx too found here the first basis of machine industry: “The clock is the first automatic machine applied to practical purposes, and the whole theory of production of regular motion was developed on it.” Another telling congruence is the fact that, in the mid-15th century, the first document known to have been printed on Gutenberg’s press was a calendar (not a bible). And it is noteworthy that the end of the millenarian revolt, such as that of the Taborites of Bohemia in the 15th century and the Anabaptists of Munster in the early 16th century, coincided with the perfection and spread of the mechanical clock. In Peter Breughel’s /The Triumph of Time/ (1574), the many objects and ideas of the painting are dominated by the figure of a modern clock.

This triumph, as noted above, awakened a great spatial urge by way of compensation: circumnavigating the globe and the discovery, suddenly, of vast new lands, for example. But just as certain is its relationship to “the progressive disrealization of the world,” in the words of Charles Newman, which began at this time. Extension, in the form of domination, obviously accentuated alienation from the world: a totally fitting accompaniment to the dawning of modern history.

Official time had become a barrier both palpable and all-pervasive, filtering and distorting what people said to each other. As of this time, it unmistakably imposed a new distance on human relations and restraint on emotional responses. A Renaissance hallmark, the search for rare manuscripts and classical antiquities, is one form of longing to withstand this powerful time. But the battle had been decided, and abstract time had become the milieu, the new framework of existence. When Ellul opined that “the whole structure of being” was now permeated by “mechanical abstraction and rigidity,” he referred most centrally to the time dimension. All this bloomed in the 1600s, from Bacon, who first proclaimed modernity’s domination of nature, and Descartes’ formulation regarding the /maîtres et possesseurs de la nature/, which “predicted the imperialistic control of nature which characterizes modern science,” including Galileo and the whole ensemble of the century’s scientific revolution. Life and nature became mere quantity, the unique lost its strength, and soon the Newtonian image of the world as a clock-like mechanism prevailed. Equivalence—with uniform time as its real model—came to rule, in a development that made “the dissimilar comparable by reducing it to abstract quantities.”

The poet Ciro di Pers understood that the clock made time scarce and life short. To him, it

Speeds on the course of the fleeing century,
And to make it open up,
Knocks every hour at the tomb.

Later in the 17th century, Milton’s /Paradise Lost/ sides with victorious time, to the point of denigrating the timeless, paradisiacal state:

with labour I must earn
My bread; what harm? Idleness had been worse.

Well before the beginnings of industrial capitalism, then, had time substantially subdued and synchronized life; advancing technology can be said to have been borne by the earlier breakthroughs of time. “It was the beginning of modern time that made the speed of technology possible,” concluded Octavio Paz. E.P. Thompson’s widely-known “Time, Work-Discipline, and Industrial Capitalism” described the industrialization of time, but, more fundamentally, it was time that did the industrializing, the great daily life struggles of the late 18th and early 19th centuries against the factory system notwithstanding.

In terms of the modern era, again one can discern in social revolts the definite aspect of time refusal, however inchoate. In the very late 18th century, for instance, the context of two revolutions, one must judge, helped Kant see that space and time are not part of the empirical world but part of our acquired intersubjective faculties. It is a non-revolutionary twist that a new, short-lived calendar was introduced by the French Revolution—not resistance to time, but its renewal under new management! Walter Benjamin wrote of actual time refusal vis-a-vis the July revolution of 1830, noting the fact that in early fighting “the clocks in towers were being fired on simultaneously and independently from several places in Paris.” He quoted the following verse from an eyewitness:

Who would have believed? We are told that new Joshuas at the foot of every tower, as though irritated with time itself, fired at the dials in order to stop the day.

Not that moments of insurgence are the only occasions of sensitivity to time’s tyranny. According to Poulet, no one felt more grievously the metamorphosis of time into something quite infernal than did Baudelaire, who wrote of the malcontents “who have refused redemption by work,” who wanted “to possess immediately, on this earth, a Paradise”; these he termed “Slaves martyred by Time,” a notion echoed by Rimbaud’s denunciation of the scandal of an existence in time. These two poets suffered in the long, dark night of capital’s mid- and late-19th century ascendancy, though it could be argued that their awareness of time was made clearest via their active participation, respectively, in the 1848 revolution and the Commune of 1871.

Samuel Butler’s utopian /Erewhon/ portrayed workers who destroyed their machines lest their machines destroy them. Its opening theme derives from the incident of wearing a watch, and later a visitor’s watch is rather forcibly retired to a museum of bygone evils. Very much in this spirit, and from the same era, are these lines of Robert Louis Stevenson:

You may dally as long as you like by the roadside. It is almost as if the millennium were arrived, when we shall throw our clocks and watches over the housetop, and remember time and seasons no more. Not to keep hours for a lifetime is, I was going to say, to live forever. You have no idea, unless you have tried it, how endlessly long is a summer’s day, that you measure only by hunger, and bring to an end only when you are drowsy.

Referring to such phenomena as huge political rallies, Benjamin’s “The Work of Art in the Age of Mechanical Reproduction” made the point that “Mass reproduction is aided especially by the reproduction of masses…” But one could go much further and say simply that mass reproduction is the reproduction of masses, or the mass-man.

Mass production itself with its standardized, interchangeable parts and wage-labor to match constitutes a fascism of everyday life long predating the fascist rallies Benjamin had in mind. And, as described above, it was time, several hundred years before that, which provided the categorical paradigm to mass production, in the form of uniform but discrete quantities ordering life. Stuart Ewen held that during the 19th and early 20th centuries, “the industrial definition of social time and space stood at the core of social unrest,” and this is certainly true; however, the breadth of the time and space “issue” requires a rather broad historical perspective to allow for a comprehension of modernity’s unfolding mass age.

That the years immediately preceding World War I expressed a rising radical challenge requiring the fearful carnage of the war to divert and destroy it is a thesis I have argued elsewhere. The depth of this challenge can best be plumbed in terms of the refusal of time. The contemporary tension between the domains of being and of time was first elucidated by Bergson in the pre-war period in his protest against the fragmentary and repressive character of mechanistic time. With his distrust of science, Bergson argued that a qualitative sense of time, of lived experience or durée, requires a resistance to formalized, spatialized time. Though limited, his outlook announced the renewal of a developing opposition to a tyranny that had come to inform so many elements of subjugation. Most of this century’s anti-time impulse was rather fully articulated in the quickening movement just prior to the war. Cubism’s urgent re-examination of appearances belongs here, of course; by smashing visual perspective, which had prevailed since the early Renaissance, the Cubists sought to apprehend reality as it was, not as it looked at a moment of time. It is this which enabled John Berger to judge that “the Cubist formula presupposed… for the first time in history, man living unalienated from nature.” Einstein and Minkowski also bespoke the time revolt context with the well-known scrapping of the Newtonian universe based on absolute time and space. In music, Arnold Schoenberg liberated dissonance from the prevailing false positivity’s restraints, and Stravinsky explicitly attacked temporal limitations in a variety of new ways, as did Proust, Joyce, and others in literature. All modes of expression, according to Donald Lowe, “rejected the linear perspective of visuality and Archimedean reason, in that crucial decade of 1905-1915!”

In the 1920s Heidegger emphasized time as the central concept for contemporary metaphysics and as forming the essential structure of subjectivity. But the devastating impact of the war had deeply altered the sense of possibilities within social reality. /Being and Time/ (1927), in fact, far from questioning time, surrendered to it completely as the only vantage that allows understanding of being. Related, in the parallel provided by Adorno, is “the trick of military command, which dressed up imperative in the guise of a predicative sentence… Heidegger, too, cracks the whip when he italicizes the auxiliary verb in the sentence, ‘Death is.’” Indeed, for almost forty years after World War I the anti-time spirit was essentially suppressed. By the 1930s, one could still find signs of it in, say, the Surrealist movement, or the novels of Aldous Huxley, but predominant was the renewed rush of technology and domination, as reflected by Katayev’s Five-Year-Plan novel /Time, Forward!/
or the bestial deformation expressed in the literally millenarian symbol, the Thousand Year Reich.

Nearer to our contemporary situation, a restive awareness of time began to re-emerge as a new round of contestation neared. In the mid-1950s the scientist N.J. Berrill interrupted a fairly dispassionate book to comment on the predominant desire in society “to get from nowhere to nowhere in nothing flat,” observing, “And still a minute can embrace eternity and a month be empty of meaning.” Still more startling, he cried out that “For a long time I have felt trapped in time, like a prisoner searching for some sense of escape.” Perhaps an unlikely quarter from which to hear such an articulation, but another man of science made a similar statement forty years before, just as World War I was about to quell insurgence for decades, when Wittgenstein noted, “Only a man who lives not in time but in the present is happy.”

Children, of course, live in a now and want their gratification now, if we are looking for subjects for the idea that only the present can be total. Alienation in time, the beginning of time as an alien “thing,” begins in early infancy, as early as the maternity ward, though Joost Meerloo is correct that “With every trauma in life, every new separation, the awareness of time grows.” Raoul Vaneigem supplied the conscious element, outlining perfectly the function of schooling: “The child’s days escape adult time; their time is swollen by subjectivity, passion, dreams haunted by reality. Outside, the educators look on, waiting, watch in hand, till the child joins and fits the cycle of the hours.” The levels of conditioning reflect, of course, the dimensions of a world so emptied, so exquisitely alienated that time has completely robbed us of the present. “Every passing second drags me from the moment that was to the moment that will be. Every second spirits me away from myself; now never exists.”

The repetitious, routine nature of industrial life is the obvious product of time and technology. An important aspect of time-less hunter-gatherer life was the unique, sporadic quality of its activities, rather than the repetitive; numbers and time apply to the quantitative, not the qualitative. In this regard Richard Schlegel judged that if events were always novel, not only would order and routine be impossible, but so would notions of time itself. In Beckett’s play, /Waiting for Godot/, the two main characters receive a visitor, after which one of them sighs, “Well at least it helped to pass the time.” The other replies, “Nonsense, time would have passed anyway.” In this prosaic exchange the basic horror of modern life is plumbed. The meta-presence of time is by this time felt as a heavily oppressive force, standing over its subjects quite autonomously. Very apropos is this summing up by George Morgan: “A fretful business to ‘kill time’ and restless movement from novelty to novelty bury an ever-present sense of futility and vacuousness.
In the midst of his endless achievements, modern man is losing the substance of human life.” Loren Eiseley once described “a feeling of inexplicable terror,” as if he and his companion, who were examining a skull, were in the path of “a torrent that was sweeping everything to destruction.” Understanding Eiseley’s sensation completely, his friend paraphrased him as saying, “to know time is to fear it, and to know civilized time is to be terror-stricken.” Given the history of time and our present plight in it, it would be hard to imagine a more prescient bit of communication.

In the 1960s Robert Lowell gave succinct expression to the extremity of alienation of time:

I am learning to live in history.
What is history? What you cannot touch.

Fortunately, also in the ‘60s, many others were beginning the unlearning of how to live in history, as evidenced by the shedding of wristwatches, the use of psychedelic drugs, and paradoxically perhaps, by the popular single-word slogan of the French insurrectionaries of May 1968—“Quick!” The element of time refusal in the revolt of the ‘60s was strong and there are signs—such as the revolt against work—that it continues to deepen even as it contends with extreme new spatializations of time.

Since Marcuse wrote of “the alliance between time and the order of repression,” and Norman O. Brown on the sense of time or history as a function of repression, the vividness of the connection has powerfully grown. Christopher Lasch, in the late ‘70s, noticed that “A profound shift in our sense of time has transformed work habits, values, and the definition of success.” And if work is being refused as a key component of time, it is also becoming obvious how consumption gobbles up time alive. Today’s perfect spatial symbol of the latter is the Pac-Man video game figure, which literally eats up space to kill time. As with Aldous Huxley’s Mr. Propter, millions have come to find time “a thing intrinsically nightmarish.” A fixation with age and the pro-longevity movement, as discussed by Lasch and others, are two signs of its torment. Adorno once said, “As the subjects live less, death grows more precipitous, more terrifying.” There seems to be a new generation among the young virtually every three or four years, as time, growing more palpable, has accelerated since the ‘60s. Science has provided a popular reflection of time resistance in at least two phenomena: the widespread appeal of anti-time concepts more or less derived from physical theory, such as black holes, time warps, spacetime singularities and the like, and the comforting appeal of the “deep time” of the so-called geological romances, such as John McPhee’s /Basin and Range/ (1981).

When Benjamin assayed that “The concept of the historical progress of mankind cannot be sundered from the concept of its progression through a homogenous time,” he called for a critique of both, little realizing how resonant this call might someday become. Still less, of course, could Goethe’s dictum that “No man can judge history but one who has himself experienced history” have been foreseen to apply in such a wholesale way as it does now, with time the most real and onerous dimension. The project of annulling time and history will have to be developed as the only hope of human liberation. Of course, there is no dearth of the wise who continue to assert that consciousness itself is impossible without time and its spatialization, overlooking somehow an overwhelmingly massive period of humanity’s existence.

Some concluding words from William Morris’s /News from Nowhere/ are a fitting hope in reply to such sages of domination: “In spite of all the infallible maxims of your day there is yet a time of rest in store for the world, when mastery has changed into fellowship.”


Language: Origin and Meaning
────────────────────────────

Fairly recent anthropology (e.g. Sahlins, R.B. Lee) has virtually obliterated the long-dominant conception which defined prehistoric humanity in terms of scarcity and brutalization. As if the implications of this are already becoming widely understood, there seems to be a growing sense of that vast epoch as one of wholeness and grace. Our time on earth, characterized by the very opposite of those qualities, is in the deepest need of a reversal of the dialectic that stripped that wholeness from our life as a species. Being alive in nature, before our abstraction from it, must have involved a perception and contact that we can scarcely comprehend from our levels of anguish and alienation. The communication with all of existence must have been an exquisite play of all the senses, reflecting the numberless, nameless varieties of pleasure and emotion once accessible within us.

To Levy-Bruhl, Durkheim and others, the cardinal and qualitative difference between the “primitive mind” and ours is the primitive’s lack of detachment in the moment of experience; “the savage mind totalizes,” as Levi-Strauss put it. Of course we have long been instructed that this original unity was destined to crumble, that alienation is the province of being human: consciousness depends on it. In much the same sense that objectified time has been held to be essential to consciousness–Hegel called it “the necessary alienation”–so has language, and equally falsely. Language may be properly considered the fundamental ideology, perhaps as deep a separation from the natural world as self-existent time. And if timelessness resolves the split between spontaneity and consciousness, languagelessness may be equally necessary. Adorno, in /Minima Moralia/, wrote: “To happiness the same applies as to truth: one does not have it, but is in it.” This could stand as an excellent description of humankind as we existed before the emergence of time and language, before the division and distancing that exhausted authenticity.

Language is the subject of this exploration, understood in its virulent sense. A fragment from Nietzsche introduces its central perspective: “words dilute and brutalize; words depersonalize; words make the uncommon common.” Although language can still be described by scholars in such phrases as “the most significant and colossal work that the human spirit has evolved,” this characterization occurs now in a context of extremity in which we are forced to call the aggregate of the work of the “human spirit” into question. Similarly, if in Coward and Ellis’ estimation, the most “significant feature of twentieth-century intellectual development” has been the light shed by linguistics upon social reality, this focus hints at how fundamental our scrutiny must yet become in order to comprehend maimed modern life. It may sound positivist to assert that language must somehow embody all the “advances” of society, but in civilization it seems that all meaning is ultimately linguistic; the question of the meaning of language, considered in its totality, has become the unavoidable next step.
Earlier writers could define consciousness in a facile way as that which can be verbalized, or even argue that wordless thought is impossible (despite the counter-examples of chess-playing or composing music). But in our present straits, we have to consider anew the meaning of the birth and character of language rather than assume it to be merely a neutral, if not benign, inevitable presence. The philosophers are now forced to recognize the question with intensified interest; Gadamer, for example: “Admittedly, the nature of language is one of the most mysterious questions that exists for man to ponder on.”

Ideology, alienation’s armored way of seeing, is a domination embedded in systematic false consciousness. It is easier still to begin to locate language in these terms if one takes up another definition common to both ideology and language: namely, that each is a system of distorted communication between two poles and predicated upon symbolization. Like ideology, language creates false separations and objectifications through its symbolizing power. This falsification is made possible by concealing, and ultimately vitiating, the participation of the subject in the physical world. Modern languages, for example, employ the word “mind” to describe a thing dwelling independently in our bodies, as compared with the Sanskrit word, which means “working within,” involving an active embrace of sensation, perception, and cognition. The logic of ideology, from active to passive, from unity to separation, is similarly reflected in the decay of the verb form in general. It is noteworthy that the much freer and sensuous hunter-gatherer cultures gave way to the Neolithic imposition of civilization, work and property at the same time that verbs declined to approximately half of all words of a language; in modern English, verbs account for less than 10% of words. Though language, in its definitive features, seems to be complete from its inception, its progress is marked by a steadily debasing process. The carving up of nature, its reduction into concepts and equivalences, occurs along lines laid down by the patterns of language. And the more the machinery of language, again paralleling ideology, subjects existence to itself, the more blind its role in reproducing a society of subjugation.

Navajo has been termed an “excessively literal” language, from the characteristic bias of our time for the more general and abstract. In a much earlier time, we are reminded, the direct and concrete held sway; there existed a “plethora of terms for the touched and seen.” (Mellersh 1960) Toynbee noted the “amazing wealth of inflexions” in early languages and the later tendency toward simplification of language through the abandonment of inflexions. Cassirer saw the “astounding variety of terms for a particular action” among American Indian tribes and understood that such terms bear to each other a relation of juxtaposition rather than of subordination. But it is worth repeating once more that while very early on a sumptuous prodigality of symbols obtained, it was a closure of symbols, of abstract conventions, even at that stage, which might be thought of as adolescent ideology. Considered as the paradigm of ideology, language must also be recognized as the determinant organizer of cognition.
As the pioneer linguist Sapir noted, humans are very much at the mercy of language concerning what constitutes “social reality.” Another seminal anthropological linguist, Whorf, took this further to propose that language determines one’s entire way of life, including one’s thinking and all other forms of mental activity. To use language is to limit oneself to the modes of perception already inherent in that language. The fact that language is only form and yet molds everything goes to the core of what ideology is. It is reality revealed only ideologically, as a stratum separate from us. In this way language creates, and debases, the world. “Human speech conceals far more than it confides; it blurs much more than it defines; it distances more than it connects,” was George Steiner’s conclusion. More concretely, the essence of learning a language is learning a system, a model, that shapes and controls speaking. It is easier still to see ideology on this level, where due to the essential arbitrariness of the phonological, syntactic, and semantic rules of each, every human language must be learned. The unnatural is imposed, as a necessary moment of reproducing an unnatural world. Even in the most primitive languages, words rarely bear a recognizable similarity to what they denote; they are purely conventional. Of course this is part of the tendency to see reality symbolically, which Cioran referred to as the “sticky symbolic net” of language, an infinite regression which cuts us off from the world. The arbitrary, self-contained nature of language’s symbolism creates growing areas of false certainty where wonder, multiplicity and non-equivalence should prevail. Barthes’ depiction of language as “absolutely terrorist” is much to the point here; he saw that its systematic nature “in order to be complete needs only to be valid, and not to be true.” Language effects the original split between wisdom and method. Along these lines, in terms of structure, it is evident that “freedom of speech” does not exist; grammar is the invisible “thought control” of our invisible prison. With language we have already accommodated ourselves to a world of unfreedom. Reification, the tendency to take the conceptual as the perceived and to treat concepts as tangible, is as basic to language as it is to ideology. Language represents the mind’s reification of its experience, that is, an analysis into parts which, as concepts, can be manipulated as if they were objects. Horkheimer pointed out that ideology consists more in what people are like–their mental constrictedness, their complete dependence on associations provided for them–than in what they believe. In a statement that seems as pertinent to language as to ideology, he added that people experience everything only within the conventional framework of concepts. It has been asserted that reification is necessary to mental functioning, that the formation of concepts which can themselves be mistaken for living properties and relationships does away with the otherwise almost intolerable experience of relating one experience to another. Cassirer said of this distancing from experience, “Physical reality seems to recede in proportion as man’s symbolic activity advances.” Representation and uniformity begin with language, reminding us of Heidegger’s insistence that something extraordinarily important has been forgotten by civilization.
Civilization is often thought of not as a forgetting but as a remembering, wherein language enables accumulated knowledge to be transmitted forward, allowing us to profit from others’ experiences as though they were our own. Perhaps what is forgotten is simply that others’ experiences are not our own, that the civilizing process is thus a vicarious and inauthentic one. When language, for good reason, is held to be virtually coterminous with life, we are dealing with another way of saying that life has moved progressively farther from directly lived experience. Language, like ideology, mediates the here and now, attacking direct, spontaneous connections. A descriptive example was provided by a mother objecting to the pressure to learn to read: “Once a child is literate, there is no turning back. Walk through an art museum. Watch the literate students read the title cards before viewing the paintings to be sure that they know what to see. Or watch them read the cards and ignore the paintings entirely…As the primers point out, reading opens doors. But once those doors are open, it is very difficult to see the world without looking through them.” The process of transforming all direct experience into the supreme symbolic expression, language, monopolizes life. Like ideology, language conceals and justifies, compelling us to suspend our doubts about its claim to validity. It is at the root of civilization, the dynamic code of civilization’s alienated nature. As the paradigm of ideology, language stands behind all of the massive legitimation necessary to hold civilization together. It remains for us to clarify what forms of nascent domination engendered this justification, made language necessary as a basic means of repression. It should be clear, first of all, that the arbitrary and decisive association of a particular sound with a particular thing is hardly inevitable or accidental. Language is an invention for the reason that cognitive processes must precede their expression in language. To assert that humanity is only human because of language generally neglects the corollary that being human is the precondition of inventing language. The question is how did words first come to be accepted as signs at all? How did the first symbol originate? Contemporary linguists find this “such a serious problem that one may despair of finding a way out of its difficulties.” Among the more than ten thousand works on the origin of language, even the most recent admit that the theoretical discrepancies are staggering. The question of when language began has also brought forth extremely diverse opinions. There is no cultural phenomenon that is more momentous, but no other development offers fewer facts as to its beginnings. Not surprisingly, Bernard Campbell is far from alone in his judgement that “We simply do not know, and never will, how or when language began.” Many of the theories that have been put forth as to the origin of language are trivial: they explain nothing about the qualitative, intentional changes introduced by language. The “ding-dong” theory maintains that there is somehow an innate connection between sound and meaning; the “pooh-pooh” theory holds that language at first consisted of ejaculations of surprise, fear, pleasure, pain, etc.; the “ta-ta” theory posits the imitation of bodily movements as the genesis of language, and so on among explanations that only beg the question.
The hypothesis that the requirements of hunting made language necessary, on the other hand, is easily refuted; animals hunt together without language, and it is often necessary for humans to remain silent in order to hunt. Somewhat closer to the mark, I believe, is the approach of contemporary linguist E.H. Sturtevant: since all intentions and emotions are involuntarily expressed by gesture, look, or sound, voluntary communication, such as language, must have been invented for the purpose of lying or deceiving. In a more circumspect vein, the philosopher Caws insisted that “truth…is a comparative latecomer on the linguistic scene, and it is certainly a mistake to suppose that language was invented for the purpose of telling it.” But it is in the specific social context of our exploration, the terms and choices of concrete activities and relationships, that more understanding of the genesis of language must be sought. Olivia Vlahos judged that the “power of words” must have appeared very early; “Surely…not long after man had begun to fashion tools shaped to a special pattern.” The flaking or chipping of stone tools, during the million or two years of Paleolithic life, however, seems much more apt to have been shared by direct, intimate demonstration than by spoken directions. Nevertheless, the proposition that language arose with the beginnings of technology–that is, in the sense of division of labor and its concomitants, such as a standardizing of things and events and the effective power of specialists over others–is at the heart of the matter, in my view. It would seem very difficult to disengage the division of labor–“the source of civilization,” in Durkheim’s phrase–from language at any stage, perhaps least of all the beginning. Division of labor necessitates a relatively complex control of group action; in effect it demands that the whole community be organized and directed. This happens through the breakdown of functions previously performed by everybody, into a progressively greater differentiation of tasks, and hence of roles and distinctions. Whereas Vlahos felt that speech arose quite early, in relation to simple stone tools and their reproduction, Julian Jaynes has raised perhaps a more interesting question, which is assumed in his contrary opinion that language showed up much later. He asks how it is, if humanity had speech for a couple of million years, that there was virtually no development of technology? Jaynes’s question implies a utilitarian value inhering in language, a supposed release of latent potentialities of a positive nature. But given the destructive dynamic of the division of labor, referred to above, it may be that while language and technology are indeed linked, they were both successfully resisted for thousands of generations. At its origins language had to meet the requirements of a problem that existed outside language. In light of the congruence of language and ideology, it is also evident that as soon as a human spoke, he or she was separated. This rupture is the moment of dissolution of the original unity between humanity and nature; it coincides with the initiation of division of labor. Marx recognized that the rise of ideological consciousness was established by the division of labor; language was for him the primary paradigm of “productive labor.” Every step in the advancement of civilization has meant added labor, however, and the fundamentally alien reality of productive labor/work is realized and advanced via language.
Ideology receives its substance from division of labor, and, inseparably, its form from language. Engels, valorizing labor even more explicitly than Marx, explained the origin of language from and with labor, the “mastery of nature.” He expressed the essential connection by the phrase, “first labor, after it and then with it speech.” To put it more critically, the artificial communication which is language was and is the voice of the artificial separation which is (division of) labor. (In the usual, repressive parlance, this is phrased positively, of course, in terms of the invaluable nature of language in organizing “individual responsibilities.”) Language was elaborated for the suppression of feelings; as the code of civilization it expresses the sublimation of Eros, the repression of instinct, which is the core of civilization. Freud, in the one paragraph he devoted to the origin of language, connected original speech to sexual bonding as the instrumentality by which work was made acceptable as “an equivalence and substitute for sexual activity.” This transference from a free sexuality to work is original sublimation, and Freud saw language constituted in the establishing of the link between mating calls and work processes. The neo-Freudian Lacan carries this analysis further, asserting that the unconscious is formed by the primary repression of acquisition of language. For Lacan the unconscious is thus “structured like a language” and functions linguistically, not instinctively or symbolically in the traditional Freudian sense. To look at the problem of origin on a figurative plane, it is interesting to consider the myth of the Tower of Babel. The story of the confounding of language, like that other story in Genesis, the Fall from the grace of the Garden, is an attempt to come to terms with the origin of evil. The splintering of an “original language” into mutually unintelligible tongues may best be understood as the emergence of symbolic language, the eclipse of an earlier state of more total and authentic communication. In numerous traditions of paradise, for example, animals can talk and humans can understand them. I have argued elsewhere that the Fall can be understood as a fall into time. Likewise the failure of the Tower of Babel suggests, as Russell Fraser put it, “the isolation of man in historical time.” But the Fall also has a meaning in terms of the origin of language. Benjamin found it in the mediation which is language and the “origin of abstraction, too, as a faculty of language-mind.” “The fall is into language,” according to Norman O. Brown. Another part of Genesis provides Biblical commentary on an essential of language, names, and on the notion that naming is an act of domination. I refer to the creation myth, which includes “and whatsoever Adam called every living creature, that was the name thereof.” This bears directly on the necessary linguistic component of the domination of nature: man became master of things only because he first named them, in the formulation of Dufrenne. As Spengler had it, “To name anything by a name is to win power over it.” The beginning of humankind’s separation from and conquest of the world is thus located in the naming of the world. Logos itself as god is involved in the first naming, which represents the domination of the deity.
The well-known passage is contained in the Gospel of John: “In the beginning was the Word, and the Word was with God, and the Word was God.” Returning to the question of the origin of language in real terms, we also come back to the notion that the problem of language is the problem of civilization. The anthropologist Lizot noted that the hunter-gatherer mode exhibited that lack of technology and division of labor that Jaynes felt must have bespoken an absence of language; “(Primitive people’s) contempt for work and their disinterest in technological progress per se are beyond question.” Furthermore, “the bulk of recent studies,” in Lee’s words of 1981, shows the hunter-gatherers to have been “well nourished and to have (had) abundant leisure time.” Early humanity was not deterred from language by the pressures of constant worries about survival; the time for reflection and linguistic development was available but this path was apparently refused for many thousands of years. Nor did the conclusive victory of agriculture, civilization’s cornerstone, take place (in the form of the Neolithic revolution) because of food shortages or population pressures. In fact, as Lewis Binford has concluded, “The question to be asked is not why agriculture and food-storage techniques were not developed everywhere, but why they were developed at all.” The dominance of agriculture, including property ownership, law, cities, mathematics, surplus, permanent hierarchy and specialization, and writing, to mention a few of its elements, was no inevitable step in human “progress”; neither was language itself. The reality of pre-Neolithic life demonstrates the degradation or defeat involved in what has been generally seen as an enormous step forward, an admirable transcending of nature, etc. In this light, many of the insights of Horkheimer and Adorno in the Dialectic of Enlightenment (such as the linking of progress in instrumental control with regression in affective experience) are made equivocal by their false conclusion that “Men have always had to choose between their subjugation to nature or the subjugation of nature to the Self.” “Nowhere is civilization so perfectly mirrored as in speech,” as Pei commented, and in some very significant ways language has not only reflected but determined shifts in human life. The deep, powerful break that was announced by the birth of language prefigured and overshadowed the arrival of civilization and history, a mere 10,000 years ago. In the reach of language, “the whole of History stands unified and complete in the manner of a Natural Order,” says Barthes. Mythology, which, as Cassirer noted, “is from its very beginning potential religion,” can be understood as a function of language, subject to its requirements like any ideological product. The nineteenth-century linguist Muller described mythology as a “disease of language” in just this sense; language deforms thought by its inability to describe things directly. “Mythology is inevitable, it is natural, it is an inherent necessity of language…(It is) the dark shadow which language throws upon thought, and which can never disappear till language becomes entirely commensurate with thought, which it never will.” It is little wonder, then, that the old dream of a lingua Adamica, a “real” language consisting not of conventional signs but expressing the direct, unmediated meaning of things, has been an integral part of humanity’s longing for a lost primeval state.
As remarked upon above, the Tower of Babel is one of the enduring significations of this yearning to truly commune with each other and nature. In that earlier (but long enduring) condition nature and society formed a coherent whole, interconnected by the closest bonds. The step from participation in the totality of nature to religion involved a detaching of forces and beings into outward, inverted existences. This separation took the form of deities, and the religious practitioner, the shaman, was the first specialist. The decisive mediations of mythology and religion are not, however, the only profound cultural developments underlying our modern estrangement. Also in the Upper Paleolithic era, as the species Neanderthal gave way to Cro-Magnon (and the brain actually shrank in size), art was born. In the celebrated cave paintings of roughly 30,000 years ago is found a wide assortment of abstract signs; the symbolism of late Paleolithic art slowly stiffens into the much more stylized forms of the Neolithic agriculturalists. During this period, which is either synonymous with the beginnings of language or registers its first real dominance, a mounting unrest surfaced. John Pfeiffer described this in terms of the erosion of the egalitarian hunter-gatherer traditions, as Cro-Magnon established its hegemony. Whereas there was “no trace of rank” until the Upper Paleolithic, the emerging division of labor and its immediate social consequences demanded a disciplining of those resisting the gradual approach of civilization. As a formalizing, indoctrinating device, the dramatic power of art fulfilled this need for cultural coherence and the continuity of authority. Language, myth, religion and art thus advanced as deeply “political” conditions of social life, by which the artificial media of symbolic forms replaced the directly-lived quality of life before division of labor. From this point on, humanity could no longer see reality face to face; the logic of domination drew a veil over play, freedom, affluence. At the close of the Paleolithic Age, as a decreased proportion of verbs in the language reflected the decline of unique and freely chosen acts in consequence of division of labor, language still possessed no tenses. Although the creation of a symbolic world was the condition for the existence of time, no fixed differentiations had developed before hunter-gatherer life was displaced by Neolithic farming. But when every verb shows a tense, language is “demanding lip service to time even when time is furthest from our thoughts.” (Van Orman Quine 1960) From this point one can ask whether time exists apart from grammar. Once the structure of speech incorporates time and is thereby animated by it at every expression, division of labor conclusively destroyed an earlier reality. With Derrida, one can accurately refer to “language as the origin of history.” Language itself is a repression, and along its progress repression gathers–as ideology, as work–so as to generate historical time. Without language all of history would disappear. Pre-history is pre-writing; writing of some sort is the signal that civilization has begun.
“One gets the impression,” Freud wrote in The Future of an Illusion, “that civilization is something which was imposed on a resisting majority by a minority which understood how to obtain possession of the means of power and coercion.” If the matter of time and language can seem problematic, writing as a stage of language makes its appearance contributing to subjugation in rather naked fashion. Freud could have legitimately pointed to written language as the lever by which civilization was imposed and consolidated. By about 10,000 B.C., extensive division of labor had produced the kind of social control reflected by cities and temples. The earliest writings are records of taxes, laws, terms of labor servitude. This objectified domination thus originated from the practical needs of political economy. An increased use of letters and tablets soon enabled those in charge to reach new heights of power and conquest, as exemplified in the new form of government commanded by Hammurabi of Babylon. As Levi-Strauss put it, writing “seems to favor rather the exploitation than the enlightenment of mankind…Writing, on this its first appearance in our midst, had allied itself with falsehood.” Language at this juncture becomes the representation of representation, in hieroglyphic and ideographic writing and then in phonetic-alphabetic writing. The progress of symbolization, from the symbolizing of words, to that of syllables, and finally to letters in an alphabet, imposed an increasingly irresistible sense of order and control. And in the reification that writing permits, language is no longer tied to a speaking subject or community of discourse, but creates an autonomous field from which every subject can be absent. In the contemporary world, the avant-garde of art has, most noticeably, performed the gestures of refusal of the prison of language. Since Mallarme, a good deal of modernist poetry and prose has moved against the taken-for-grantedness of normal speech. To the question “Who is speaking?” Mallarme answered, “Language is speaking.” After this reply, and especially since the explosive period around World War I when Joyce, Stein and others attempted a new syntax as well as a new vocabulary, the restraints and distortions of language have been assaulted wholesale in literature. Russian futurists, Dada (e.g. Hugo Ball’s efforts in the 1920s to create “poetry without words”), Artaud, the Surrealists and lettristes were among the more exotic elements of a general resistance to language. The Symbolist poets, and many who could be called their descendants, held that defiance of society also includes defiance of its language. But inadequacy in the former arena precluded success in the latter, bringing one to ask whether avant-garde strivings can be anything more than abstract, hermetic gestures. Language, which at any given moment embodies the ideology of a particular culture, must be ended in order to abolish both categories of estrangement; a project of some considerable dimensions, let us say. That literary texts (e.g. Finnegans Wake, the poetry of e.e. cummings) break the rules of language seems mainly to have the paradoxical effect of evoking the rules themselves. By permitting the free play of ideas about language, society treats these ideas as mere play. The massive amount of lies–official, commercial and otherwise–is perhaps in itself sufficient to explain why Johnny Can’t Read or Write, why illiteracy is increasing in the metropole.
In any case, it is not only that “the pressure on language has gotten very great,” according to Canetti, but that “unlearning” has come “to be a force in almost every field of thought,” in Robert Harbison’s estimation. Today “incredible” and “awesome” are applied to the most commonly trivial and boring; it is no accident that powerful and shocking words barely exist anymore. The deterioration of language mirrors a more general estrangement; it has become almost totally external to us. From Kafka to Pinter silence itself is a fitting voice of our times. “Few books are forgivable. Black on the canvas, silence on the screen, an empty white sheet of paper, are perhaps feasible,” as R.D. Laing put it so well. Meanwhile, the structuralists–Levi-Strauss, Barthes, Foucault, Lacan, Derrida–have been almost entirely occupied with the duplicity of language in their endless exegetical burrowings into it. They have virtually renounced the project of extracting meaning from language. I am writing (obviously) enclosed in language, aware that language reifies the resistance to reification. As T.S. Eliot’s Sweeney explains, “I’ve gotta use words when I talk to you.” One can imagine replacing the imprisonment of time with a brilliant present–only by imagining a world without division of labor, without that divorce from nature from which all ideology and authority accrue. We couldn’t live in this world without language and that is just how profoundly we must transform this world. Words bespeak a sadness; they are used to soak up the emptiness of unbridled time. We have all had that desire to go further, deeper than words, the feeling of wanting only to be done with all the talk, knowing that being allowed to live coherently erases the need to formulate coherence. There is a profound truth to the notion that “lovers need no words.” The point is that we must have a world of lovers, a world of the face-to-face, in which even names can be forgotten, a world which knows that enchantment is the opposite of ignorance. Only a politics that undoes language and time and is thus visionary to the point of voluptuousness has any meaning.

Number: Its Origin and Evolution
────────────────────────────────

The wrenching and demoralizing character of the crisis we find ourselves in, above all, the growing emptiness of spirit and artificiality of matter, lead us more and more to question the most commonplace of “givens.” Time and language begin to arouse suspicions; number, too, no longer seems “neutral.” The glare of alienation in technological civilization is too painfully bright to hide its essence now, and mathematics is the schema of technology. It is also the language of science–how deep we must go, how far back to reveal the “reason” for damaged life? The tangled skein of unnecessary suffering, the strands of domination, are unavoidably being unreeled, by the pressure of an unrelenting present. When we ask, to what sorts of questions is the answer a number, and try to focus on the meaning or the reasons for the emergence of the quantitative, we are once again looking at a decisive moment of our estrangement from natural being. Number, like language, is always saying what it cannot say. As the root of a certain kind of logic or method, mathematics is not merely a tool but a goal of scientific knowledge: to be perfectly exact, perfectly self-consistent, and perfectly general.
Never mind that the world is inexact, interrelated, and specific, that no one has ever seen two leaves, trees, clouds, or animals that are the same, just as no two moments are identical. As Dingle said, “All that can come from the ultimate scientific analysis of the material world is a set of numbers,” reflecting upon the primacy of the concept of identity in math and its offspring, science. A little further on I will attempt an “anthropology” of numbers and explore their social embeddedness. Horkheimer and Adorno point to the basis of the disease: “Even the deductive form of science reflects hierarchy and coercion…the whole logical order, dependency, progression, and union of [its] concepts is grounded in the corresponding conditions of social reality–that is, the division of labor.” If mathematical reality is the purely formal structure of normative or standardizing measure (and later, science), the first thing to be measured at all was time. The primal connection between time and number becomes immediately evident. Authority, first objectified as time, becomes rigidified by the gradually mathematized consciousness of time. Put slightly differently, time is a measure and exists as a reification or materiality thanks to the introduction of measure. The importance of symbolization should also be noted, in passing, for a further interrelation consists of the fact that while the basic feature of all measurement is symbolic representation, the creation of a symbolic world is the condition of the existence of time. To realize that representation begins with language, actualized in the creation of a reproducible formal structure, is already to apprehend the fundamental tie between language and number. An impoverished present renders it easy to see, as language becomes more impoverished, that math is simply the most reduced and drained language. The ultimate step in formalizing a language is to transform it into mathematics; conversely, the closer language comes to the dense concretions of reality, the less abstract and exact it can be. The symbolizing of life and meaning is at its most versatile in language, which, in Wittgenstein’s later view, virtually constitutes the world. Further, language, based as it is on a symbolic faculty for conventional and arbitrary equivalences, finds in the symbolism of math its greatest refinement. Mathematics, as judged by Max Black, is the “grammar of all symbolic systems.” The purpose of the mathematical aspect of language and concept is the more complete isolation of the concept from the senses. Math is the paradigm of abstract thought for the same reason that Levy termed pure mathematics “the method of isolation raised to a fine art.” Closely related are its character of “enormous generality,” as discussed by Parsons, and its refusal of limitations on said generality, as formulated by Whitehead. This abstracting process and its formal, general results provide a content that seems to be completely detached from the thinking individual; the user of a mathematical system and his/her values do not enter into the system. The Hegelian idea of the autonomy of alienated activity finds a perfect application with mathematics; it has its own laws of growth, its own dialectic, and stands over the individual as a separate power. Self-existent time and the first distancing of humanity from nature, it must be preliminarily added, began to emerge when we first began to count. Domination of nature, and then of humans, is thus enabled.
In abstraction is the truth of Heyting’s conclusion that “the characteristic of mathematical thought is that it does not convey truth about the external world.” Its essential attitude toward the whole colorful movement of life is summed up by, “Put this and that equal to that and this!” Abstraction and equivalence of identity are inseparable; the suppression of the world’s richness which is paramount in identity brought Adorno to the “primal world of ideology.” The untruth of identity is simply that the concept does not exhaust the thing conceived. Mathematics is reified, ritualized thought, the virtual abandonment of thinking. Foucault found that “in the first gesture of the first mathematician one saw the constitution of an ideality that has been deployed throughout history and has been questioned only to be repeated and purified.” Number is the most momentous idea in the history of human nature. Numbering or counting (and measurement, the process of assigning numbers to represent qualities) gradually consolidated plurality into quantification, and thereby produced the homogenous and abstract character of number, which made mathematics possible. From its inception in elementary forms of counting (beginning with a binary division and proceeding to the use of fingers and toes as bases) to the Greek idealization of number, an increasingly abstract type of thinking developed, paralleling the maturation of the time concept. As William James put it, “the intellectual life of man consists almost wholly in his substitution of a conceptual order for the perceptual order in which his experience originally comes.” Boas concluded that “counting does not become necessary until objects are considered in such generalized form that their individualities are entirely lost sight of.” In the growth of civilization we have learned to use increasingly abstract signs to point at increasingly abstract referents. On the other hand, prehistoric languages had a plethora of terms for the touched and felt, while very often having no number words beyond one, two and many. Hunter-gatherer humanity had little if any need for numbers, which is the reason Hallpike declared that “we cannot expect to find that an operational grasp of quantification will be a cultural norm in many primitive societies.” Much earlier, and more crudely, Allier referred to “the repugnance felt by uncivilized men towards any genuine intellectual effort, more particularly towards arithmetic.” In fact, on the long road toward abstraction, from an intuitive sense of amount to the use of different sets of number words for counting different kinds of things, along to fully abstract number, there was an immense resistance, as if the objectification involved was somehow seen for what it was. This seems less implausible in light of the striking, unitary beauty of tools of our ancestors half a million years ago, in which the immediate artistic and technical (for want of better words) touch is so evident, and by “recent studies which have demonstrated the existence, some 300,000 years ago, of mental ability equivalent to modern man,” in the words of British archeologist Clive Gamble. Based on observations of surviving tribal peoples, it is apparent, to provide another case in point, that hunter-gatherers possessed an enormous and intimate understanding of the nature and ecology of their local places, quite sufficient to have inaugurated agriculture perhaps hundreds of thousands of years before the Neolithic revolution.
But a new kind of relationship to nature was involved; one that was evidently refused for so many, many generations. To us it has seemed a great advantage to abstract from the natural relationship of things, whereas in the vast Stone Age being was apprehended and valued as a whole, not in terms of separable attributes. Today, as ever, when a large family sits down to dinner and it is noticed that someone is missing, this is not accomplished by counting. Or when a hut was built in prehistoric times, the number of required posts was not specified or counted, rather they were inherent to the idea of the hut, intrinsically involved in it. (Even in early agriculture, the loss of a herd animal could be detected not by counting but by missing a particular face or characteristic features; it seems clear, however, as Bryan Morgan argues, that “man’s first use for a number system” was certainly as a control of domesticated flock animals, as wild creatures became products to be harvested.) In distancing and separation lies the heart of mathematics: the discursive reduction of patterns, states and relationships which we initially perceived as wholes. In the birth of controls aimed at control of what is free and unordered, crystallized by early counting, we see a new attitude toward the world. If naming is a distancing, a mastery, so too is number, which is impoverished naming. Though numbering is a corollary of language, it is the signature of a critical breakthrough of alienation. The root meanings of number are instructive: “quick to grasp or take” and “to take, especially to steal,” also “taken, seized, hence…numb.” What is made an object of domination is thereby reified, becomes numb. For hundreds of thousands of years hunter-gatherers enjoyed a direct, unimpaired access to the raw materials needed for survival. Work was not divided nor did private property exist. Dorothy Lee focused on a surviving example from Oceania, finding that none of the Trobrianders’ activities are fitted into a linear, divisible line. “There is no job, no labor, no drudgery which finds its reward outside the act.” Equally important is the “prodigality,” “the liberal customs for which hunters are properly famous,” “their inclination to make a feast of everything on hand,” according to Sahlins. Sharing and counting or exchange are, of course, relative opposites. Where articles are made, animals killed or plants collected for domestic use and not for exchange, there is no demand for standardized numbers or measurements. Measuring and weighing possessions develops later, along with the measurement and definition of property rights and duties to authority. Isaac locates a decisive shift toward standardization of tools and language in the Upper Paleolithic period, the last stage of hunter-gatherer humanity. Numbers and less abstract units of measurement derive, as noted above, from the equalization of differences. Earliest exchange, which is the same as earliest division of labor, was indeterminate and defied systematization; a table of equivalences cannot really be formulated. As the predominance of the gift gave way to the progress of exchange and division of labor, the universal interchangeability of mathematics finds its concrete expression. What comes to be fixed as a principle of equal justice–the ideology of equivalent exchange–is only the practice of the domination of division of labor. 
Lack of a directly-lived existence and the loss of autonomy that accompany separation from nature are the concomitants of the effective power of specialists. Mauss stated that exchange can be defined only by all the institutions of society. Decades later Belshaw grasped division of labor as not merely a segment of society but the whole of it. Likewise sweeping, but realistic, is the conclusion that a world without exchange or fractionalized endeavor would be a world without number. Clastres, and Childe among others well before him, realized that people’s ability to produce a surplus, the basis of exchange, does not necessarily mean that they decide to do so. Concerning the nonetheless persistent view that only mental/cultural deficiency accounts for the absence of surplus, “nothing is more mistaken,” judged Clastres. For Sahlins, “Stone Age economics” was “intrinsically an anti-surplus system,” using the term system extremely loosely. For long ages humans had no desire for the dubious compensations attendant on assuming a divided life, just as they had no interest in number. Piling up a surplus of anything was unknown, apparently, before Neanderthal times passed to the Cro-Magnon; extensive trade contacts were nonexistent in the earlier period, becoming common thereafter with Cro-Magnon society. Surplus was fully developed only with agriculture, and characteristically the chief technical advancement of Neolithic life was the perfection of the container: jars, bins, granaries and the like. This development also gives concrete form to a burgeoning tendency toward spatialization, the sublimation of an increasingly autonomous dimension of time into spatial forms. Abstraction, perhaps the first spatialization, was the first compensation for the deprivation caused by the sense of time. Spatialization was greatly refined with number and geometry. Ricoeur notes that “Infinity is discovered…in the form of the idealization of magnitudes, of measures, of numbers, figures,” to carry this still further. This quest for unrestricted spatiality is part and parcel of the abstract march of mathematics. So then is the feeling of being freed from the world, from finitude that Hannah Arendt described in mathematics. Mathematical principles and their component numbers and figures seem to exemplify a timelessness which is possibly their deepest character. Hermann Weyl, in attempting to sum up (no pun intended) the “life sum of mathematics,” termed it the science of the infinite. How better to express an escape from reified time than by making it limitlessly subservient to space–in the form of math. Spatialization–like math–rests upon separation; inherent in it are division and an organization of that division. The division of time into parts (which seems to have been the earliest counting or measuring) is itself spatial. Time has always been measured in such terms as the movement of the earth or moon, or the hands of a clock. The first time indications were not numerical but concrete, as with all earliest counting. Yet, as we know, a number system, paralleling time, becomes a separate, invariable principle. The separations in social life–most fundamentally, division of labor–seem alone able to account for the growth of estranging conceptualization. In fact, two critical mathematical inventions, zero and the place system, may serve as cultural evidence of division of labor.
Zero and the place system, or position, emerged independently, “against considerable psychological resistance,” in the Mayan and Hindu civilizations. Mayan division of labor, accompanied by enormous social stratification (not to mention a notorious obsession with time, and large-scale human sacrifice at the hands of a powerful priest class), is a vividly documented fact, while the division of labor reflected in the Indian caste system was “the most complex that the world had seen before the Industrial Revolution.” (Coon 1954) The necessity of work (Marx) and the necessity of repression (Freud) amount to the same thing: civilization. These false commandments turned humanity away from nature and account for history as a “steadily lengthening chronicle of mass neurosis.” (Turner 1980) Freud credits scientific/mathematical achievement as the highest moment of civilization, and this seems valid as a function of its symbolic nature. “The neurotic process is the price we pay for our most precious human heritage, namely our ability to represent experience and communicate our thoughts by means of symbols.” The triad of symbolization, work and repression finds its operating principle in division of labor. This is why so little progress was made in accepting numerical values until the huge increase in division of labor of the Neolithic revolution: from the gathering of food to its actual production. With that massive changeover mathematics became fully grounded and necessary. Indeed it became more a category of existence than a mere instrumentality. The fifth century B.C. historian Herodotus attributed the origin of mathematics to the Egyptian king Sesostris (1300 B.C.), who needed to measure land for tax purposes. Systematized math–in this case geometry, which literally means “land measuring”–did in fact arise from the requirements of political economy, though it predates Sesostris’ Egypt by perhaps 2000 years. The food surplus of Neolithic civilization made possible the emergence of specialized classes of priests and administrators which by about 3200 B.C. had produced the alphabet, mathematics, writing and the calendar. In Sumer the first mathematical computations appeared, between 3500 and 3000 B.C., in the form of inventories, deeds of sale, contracts, and the attendant unit prices, units purchased, interest payments, etc. As Bernal points out, “mathematics, or at least arithmetic, came even before writing.” The number symbols are most probably older than any other elements of the most ancient forms of writing. At this point domination of nature and humanity are signaled not only by math and writing, but also by the walled, grain-stocked city, along with warfare and human slavery. “Social labor” (division of labor), the coerced coordination of several workers at once, is thwarted by the old, personal measures; lengths, weights, volumes must be standardized. In this standardization, one of the hallmarks of civilization, mathematical exactitude and specialized skill go hand in hand. Math and specialization, requiring each other, developed apace and math became itself a specialty. The great trade routes, expressing the triumph of division of labor, diffused the new, sophisticated techniques of counting, measurement, and calculation. In Babylon, merchant-mathematicians contrived a comprehensive arithmetic between 3000 and 2500 B.C., which system “was fully articulated as an abstract computational science by about 2000 B.C.”
(Brainerd 1979) In succeeding centuries the Babylonians even invented a symbolic algebra, though Babylonian-Egyptian math has been generally regarded as extremely trial-and-error or empiricist compared to that of the much later Greeks. To the Egyptians and Babylonians mathematical figures had concrete referents: algebra was an aid to commercial transactions, a rectangle was a piece of land of a particular shape. The Greeks, however, were explicit in asserting that geometry deals with abstractions, and this development reflects an extreme form of division of labor and social stratification. Unlike Egyptian or Babylonian society, in Greece, a large slave class performed all productive labor, technical as well as unskilled, such that the ruling class milieu that included mathematicians disdained practical pursuits or applications. Pythagoras, more or less the founder of Greek mathematics (6th century B.C.), expressed this rarefied, abstract bent in no uncertain terms. To him numbers were immutable and eternal. Directly anticipating Platonic idealism, he declared that numbers were the intelligible key to the universe. Usually encapsulated as “everything is number,” the Pythagorean philosophy held that numbers exist in a literal sense and are quite literally all that does exist. This form of mathematical philosophy, with the extremity of its search for harmony and order, may be seen as a deep fear of contradiction or chaos, an oblique acknowledgement of the massive and perhaps unstable repression underlying Greek society. An artificial intellectual life that rested so completely on the surplus created by slaves was at pains to deny the senses, the emotions and the real world. Greek sculpture is another example, in its abstract, ideological conformations, devoid of feeling or their histories. Its figures are standardized idealizations; the parallel with a highly exaggerated cult of mathematics is manifest. The independent existence of ideas, which is Plato’s fundamental premise, is directly derived from Pythagoras, just as his whole theory of ideas flows from the special character of mathematics. Geometry is properly an exercise of disembodied intellect, Plato taught, in character with his view that reality is a world of form from which matter, in every important respect, is banished. Philosophical idealism was thus established out of this world-denying impoverishment, based on the primacy of quantitative thinking. As C.I. Lewis observed, “from Plato to the present day, all the major epistemological theories have been dominated by, or formulated in the light of, accompanying conceptions of mathematics.” It is no less accidental that Plato wrote, “Let only geometers enter,” over the door to his Academy, than that his totalitarian Republic insists that years of mathematical training are necessary to correctly approach the most important political and ethical questions. Consistently, he denied that a stateless society ever existed, identifying such a concept with that of a “state of swine.” Systematized by Euclid in the third century B.C., about a century after Plato, mathematics reached an apogee not to be matched for almost two millennia; the patron saint of intellect for the slave-based and feudal societies that followed was not Plato, but Aristotle, who criticized the former’s Pythagorean reduction of science to mathematics. The long non-development of math, which lasted virtually until the end of the Renaissance, remains something of a mystery.
But growing trade began to revive the art of the quantitative by the twelfth and thirteenth centuries. The impersonal order of the counting house in the new mercantile capitalism exemplified a renewed concentration on abstract measurement. Mumford stresses the mathematical prerequisite of later mechanization and standardization; in the rising merchant world, “counting numbers began here and in the end numbers alone counted.” (Mumford 1967) But the Renaissance conviction that mathematics should be applicable to all the arts (not to mention such earlier and atypical forerunners as Roger Bacon’s 13th century contribution toward a strictly mathematical optics) was a mild prelude to the magnitude of number’s triumph in the seventeenth century. Though they were soon eclipsed by other advances of the 1600’s, Johannes Kepler and Francis Bacon revealed its two most important and closely related aspects early in the century. Kepler, who completed the Copernican transition to the heliocentric model, saw the real world as composed of quantitative differences only; its differences are strictly those of number. Bacon, in The New Atlantis (c. 1620), depicted an idealized scientific community, the main object of which was domination of nature; as Jaspers put it, “Mastery of nature…‘knowledge is power,’ has been the watchword since Bacon.” The century of Galileo and Descartes–pre-eminent among those who deepened all the previous forms of quantitative alienation and thus sketched a technological future–began with a qualitative leap in the division of labor. Franz Borkenau provided the key as to why a profound change in the Western world-view took place in the seventeenth century, a movement to a fundamentally mathematical-mechanistic outlook. According to Borkenau, a great extension of division of labor, occurring from about 1600, introduced the novel notion of abstract work. This reification of human activity proved pivotal. Along with degradation of work, the clock is the basis of modern life, equally “scientific” in its reduction of life to measurability, via objective, commodified units of time. The increasingly accurate and ubiquitous clock reached a real domination in the seventeenth century, as, correspondingly, “the champions of the new sciences manifested an avid interest in horological matters.” Thus it seems fitting to introduce Galileo in terms of just this strong interest in the measurement of time; his invention of the first mechanical clock based on the principle of the pendulum was likewise a fitting capstone to his long career. As increasingly objectified or reified time reflects, at perhaps the deepest level, an increasingly alienated social world, Galileo’s principal aim was the reduction of the world to an object of mathematical dissection. Writing a few years before World War II and Auschwitz, Husserl located the roots of the contemporary crisis in this objectifying reduction and identified Galileo as its main progenitor. The life-world has been “devalued” by science precisely insofar as the “mathematization of nature” initiated by Galileo has proceeded–clearly no small indictment.
(Husserl 1970) For Galileo, as with Kepler, mathematics was the “root grammar of the new philosophical discourse that constituted modern scientific method.” He enunciated the principle, “to measure what is measurable and try to render measurable what is not so yet.” Thus he resurrected the Pythagorean-Platonic substitution of a world of abstract mathematical relations for the real world and its method of absolute renunciation of the senses’ claim to know reality. Observing this turning away from quality to quantity, this plunge into a shadow-world of abstractions, Husserl concluded that modern, mathematical science prevents us from knowing life as it is. And the rise of science has fueled ever more specialized knowledge, that stunning and imprisoning progression so well-known by now. Collingwood called Galileo “the true father of modern science” for the success of his dictum that the book of nature “is written in mathematical language” and its corollary that therefore “mathematics is the language of science.” Due to this separation from nature, Gillispie evaluated, “After Galileo, science could no longer be humane.” It seems very fitting that the mathematician who synthesized geometry and algebra to form analytic geometry (1637) and who, with Pascal, is credited with inventing calculus, should have shaped Galilean mathematicism into a new system of thinking. The thesis that the world is organized in such a way that there is a total break between people and the natural world, contrived as a total and triumphant world-view, is the basis for Descartes’ renown as the founder of modern philosophy. The foundation of his new system, the famous “cogito ergo sum,” is the assigning of scientific certainty to separation between mind and the rest of reality. This dualism provided an alienated means for seeing only a completely objectified nature. In the Discourse on Method…Descartes declared that the aim of science is “to make us as masters and possessors of nature.” Though he was a devout Christian, Descartes renewed the distancing from life that an already fading God could no longer effectively legitimize. As Christianity weakened, a new central ideology of estrangement came forth, this one guaranteeing order and domination based on mathematical precision. To Descartes the material universe was a machine and nothing more, just as animals “indeede are nothing else but engines, or matter sent into a continual and orderly motion.” He saw the cosmos itself as a giant clockwork just when the illusion that time is a separate, autonomous process was taking hold. Also as living, animate nature died, dead, inanimate money became endowed with life, as capital and the market assumed the attributes of organic processes and cycles. Lastly, Descartes’ mathematical vision eliminated any messy, chaotic or alive elements and ushered in an attendant mechanical world-view that was coincident with a tendency toward central government controls and concentration of power in the form of the modern nation-state. “The rationalization of administration and of the natural order were occurring simultaneously,” in the words of Merchant. The total order of math and its mechanical philosophy of reality proved irresistible; by the time of Descartes’ death in 1650 it had become virtually the official framework of thought throughout Europe. Leibniz, a near-contemporary, refined and extended the work of Descartes; the “pre-established harmony” he saw in existence is likewise Pythagorean in lineage.
This mathematical harmony, which Leibniz illustrated by reference to two independent clocks, recalls his dictum, “There is nothing that evades number.” Leibniz, like Galileo and Descartes, was deeply interested in the design of clocks. In the binary arithmetic he devised, an image of creation was evoked; he imagined that one represented God and zero the void, that unity and zero expressed all numbers and all creation. He sought to mechanize thought by means of a formal calculus, a project which he too sanguinely expected would be completed in five years. This undertaking was to provide all the answers, including those to questions of morality and metaphysics. Despite this ill-fated effort, Leibniz was perhaps the first to base a theory of math on the fact that it is a universal symbolic language; he was certainly the “first great modern thinker to have a clear insight into the true character of mathematical symbolism.” Furthering the quantitative model of reality was the English royalist Hobbes, who reduced the human soul, will, brain, and appetites to matter in mechanical motion, thus contributing directly to the current conception of thinking as the “output” of the brain as computer. The complete objectification of time, so much with us today, was achieved by Isaac Newton, who mapped the workings of the Galilean-Cartesian clockwork universe. Product of the severely repressed Puritan outlook, which focused on sublimating sexual energy into brutalizing labor, Newton spoke of absolute time, “flowing equably without regard to anything external.” Born in 1642, the year of Galileo’s death, Newton capped the Scientific Revolution of the seventeenth century by developing a complete mathematical formulation of nature as a perfect machine, a perfect clock. Whitehead judged that “the history of seventeenth-century science reads as though it were a vivid dream of Plato or Pythagoras,” noting the astonishingly refined mode of its quantitative thought. Again the correspondence with a jump in division of labor is worth pointing out; as Hill described mid-seventeenth century England, “…significant specialization began to set in. The last polymaths were dying out…” The songs and dances of the peasants slowly died, and in a rather literal mathematization, the common lands were enclosed and divided. Knowledge of nature was part of philosophy until this time; the two parted company as the concept of mastery of nature achieved its definitive modern form. Number, which first issued from dissociation from the natural world, ended up describing and dominating it. Fontenelle’s Preface on the Utility of Mathematics and Physics (1702) celebrated the centrality of quantification to the entire range of human sensibilities, thereby aiding the eighteenth century consolidation of the breakthroughs of the preceding era. And whereas Descartes had asserted that animals could not feel pain because they are soulless, and that man is not exactly a machine because he had a soul, La Mettrie, in 1747, went the whole way and made man completely mechanical in his L’Homme Machine. Bach’s immense accomplishments in the first half of the eighteenth century also throw light on the spirit of math unleashed a century earlier and helped shape culture to that spirit. In reference to the rather abstract music of Bach, it has been said that he “spoke in mathematics to God.” (LeShan & Margenau 1982) At this time the individual voice lost its independence and tone was no longer understood as sung but as a mechanical conception.
Bach, treating music as a sort of math, moved it out of the stage of vocal polyphony to that of instrumental harmony, based always upon a single, autonomous voice fixed by instruments, instead of somewhat variable with human voices. Later in the century Kant stated that in any particular theory there is only as much real science as there is mathematics, and devoted a considerable part of his Critique of Pure Reason to an analysis of the ultimate principles of geometry and arithmetic. Descartes and Leibniz strove to establish a mathematical science method as the paradigmatic way of knowing, and saw the possibility of a singular universal language, on the model of empirical symbols, that could contain the whole of philosophy. The eighteenth century Enlightenment thinkers actually worked at realizing this latter project. Condillac, Rousseau and others were also characteristically concerned with origins–such as the origin of language; their goal of grasping human understanding by taking language to its ultimate, mathematized symbolic level made them incapable of seeing that the origin of all symbolizing is alienation. Symmetrical plowing is almost as old as agriculture itself, a means of imposing order on an otherwise irregular world. But as the landscape of cultivation became distinguished by linear forms of an increasingly mathematical regularity–including the popularity of formal gardens–another eighteenth-century mark of math’s ascendancy can be gauged. In the early 1800s, however, the Romantic poets and artists, among others, protested the new vision of nature as a machine. Blake, Goethe and John Constable, for example, accused science of turning the world into a clockwork, with the Industrial Revolution providing ample evidence of its power to violate organic life. The debasing of work among textile workers, which caused the furious uprisings of the English Luddites during the second decade of the nineteenth century, was epitomized by such automated and cheapened products as those of the Jacquard loom. This French device not only represented the mechanization of life and work unleashed by seventeenth century shifts, but directly inspired the first attempts at the modern computer. The designs of Charles Babbage, unlike the “logic machines” of Leibniz and Descartes, involved both memory and calculating units under the control of programs via punched cards. The aims of the mathematical Babbage and the inventor-industrialist J.M. Jacquard can be said to rest on the same rationalist reduction of human activity to the machine as was then beginning to boom with industrialism. Quite in character, then, were the emphasis in Babbage’s mathematical work on the need for improved notation to further the processes of symbolization, his Principles of Economy, which contributed to the foundations of modern management–and his contemporary campaign against London “nuisances,” such as street musicians! Paralleling the full onslaught of industrial capitalism and the hugely accelerated division of labor that it brought was a marked advance in mathematical development. According to Whitehead, “During the nineteenth century pure mathematics made almost as much progress as during the preceding centuries from Pythagoras onwards.” The non-Euclidean geometries of Bolyai, Lobachevski, Riemann and Klein must be mentioned, as well as the modern algebra of Boole, generally regarded as the basis of symbolic logic.
Boolean algebra made possible a new level of formalized thought, as its founder pondered “the human mind…an instrument of conquest and dominion over the powers of surrounding nature,” (Boole 1952) in an unthinking mirroring of the mastery mathematized capitalism was gaining in the mid-1800s. (Although the specialist is rarely faulted by the dominant culture for his “pure” creativity, Adorno adroitly observed that “The mathematician’s resolute unconsciousness testifies to the connection between division of labor and ‘purity.’”) If math is impoverished language, it can also be seen as the mature form of that sterile coercion known as formal logic. Bertrand Russell, in fact, determined that mathematics and logic had become one. Discarding unreliable, everyday language, Russell, Frege and others believed that in the further degradation and reduction of language lay the real hope for “progress in philosophy.” The goal of establishing logic on mathematical grounds was related to an even more ambitious effort by the end of the nineteenth century, that of establishing the foundations of math itself. As capitalism proceeded to redefine reality in its own image and became desirous of securing its foundations, the “logic” stage of math in the late 19th and early 20th centuries, fresh from new triumphs, sought the same. David Hilbert’s theory of formalism, one such attempt to banish contradiction or error, explicitly aimed at safeguarding “the state power of mathematics for all time from all ‘rebellions.’” Meanwhile, number seemed to be doing quite well without the philosophical underpinnings. Lord Kelvin’s late nineteenth century pronouncement that we don’t really know anything unless we can measure it bespoke an exalted confidence, just as Frederick Taylor’s Scientific Management was about to lead the quantification edge of industrial management further in the direction of subjugating the individual to the lifeless Newtonian categories of time and space. Speaking of the latter, Capra has claimed that the theories of relativity and quantum physics, developed between 1905 and the late 1920s, “shattered all the principal concepts of the Cartesian world view and Newtonian mechanics.” But relativity theory is certainly mathematical formalism, and Einstein sought a unified field theory by geometrizing physics, such that success would have enabled him to have said, like Descartes, that his entire physics was nothing other than geometry. That measuring time and space (or “space-time”) is a relative matter hardly removes measurement as its core element. At the heart of quantum theory, certainly, is Heisenberg’s Uncertainty Principle, which does not throw out quantification but rather expresses the limitations of classical physics in sophisticated mathematical ways. As Gillispie succinctly had it, Cartesian-Newtonian physical theory “was an application of Euclidean geometry to space, general relativity a spatialization of Riemann’s curvilinear geometry, and quantum mechanics a naturalization of statistical probability.” More succinctly still: “Nature, before and after the quantum theory, is that which is to be comprehended mathematically.” During the first three decades of the 20th century, moreover, the great attempts by Russell & Whitehead, Hilbert, et al., to provide a completely unproblematic basis for the whole edifice of math, referred to above, went forward with considerable optimism.
But in 1931 Kurt Gödel dashed these bright hopes with his Incompleteness Theorem, which demonstrated that any symbolic system can be either complete or fully consistent, but not both. Gödel’s devastating mathematical proof of this not only showed the limits of axiomatic number systems, but also rules out enclosing nature by any closed, consistent language. If there are theorems or assertions within a system of thought which can neither be proved nor disproved internally, if it is impossible to give a proof of consistency within the language used, as Gödel and immediate successors like Tarski and Church convincingly argued, “any system of knowledge about the world is, and must remain, fundamentally incomplete, eternally subject to revision.” (Rucker 1982) Morris Kline’s Mathematics: The Loss of Certainty related the “calamities” that have befallen the once seemingly inviolable “majesty of mathematics,” chiefly dating from Gödel. Math, like language, used to describe the world and itself, fails in its totalizing quest, in the same way that capitalism cannot provide itself with unassailable grounding. Further, with Gödel’s Theorem mathematics was not only “recognized to be much more abstract and formal than had been traditionally supposed,” but it also became clear that “the resources of the human mind have not been, and cannot be, fully formalized.” (Nagel & Newman 1958) But who could deny that, in practice, quantity has been mastering us, with or without definitively shoring up its theoretical basis? Human helplessness seems to be directly proportional to mathematical technology’s domination over nature, or as Adorno phrased it, “the subjection of outer nature is successful only in the measure of the repression of inner nature.” And certainly understanding is diminished by number’s hallmark, division of labor. Raymond Firth accidentally exemplified the stupidity of advanced specialization, in a passing comment on a crucial topic: “the proposition that symbols are instruments of knowledge raises epistemological issues which anthropologists are not trained to handle.” The connection with a more common degradation is made by Singh, in the context of an ever more refined division of labor and a more and more technicised social life, noting that “automation of computation immediately paved the way for automatizing industrial operations.” The heightened tedium of computerized office work is today’s very visible manifestation of mathematized, mechanized labor, with its neo-Taylorist quantification via electronic display screens, announcing the “information explosion” or “information society.” Information work is now the chief economic activity and information the distinctive commodity, in large part echoing the main concept of Shannon’s information theory of the late 1940s, in which “the production and the transmission of information could be defined quantitatively.” (Feinstein 1958) From knowledge, to information, to data, the mathematizing trajectory moves away from meaning–paralleled exactly in the realm of “ideas” (those bereft of goals or content, that is) by the ascendancy of structuralism. The “global communications revolution” is another telling phenomenon, by which a meaningless “input” is to be instantly available everywhere among people who live, as never before, in isolation. Into this spiritual vacuum the computer boldly steps.
In 1950 Turing said, in answer to the question ‘can machines think?’, “I believe that at the end of the century the use of words and general educated opinion will have altered so much that one will be able to speak of machines thinking without expecting to be contradicted.” Note that his reply had nothing to do with the state of machines but wholly that of humans. As pressures build for life to become more quantified and machine-like, so does the drive to make machines more life-like. By the mid-’60s, in fact, a few prominent voices already announced that the distinction between human and machine was about to be superseded–and saw this as positive. Mazlish provided an especially unequivocal commentary: “Man is on the threshold of breaking past the discontinuity between himself and machines…We cannot think any longer of man without a machine…Moreover, this change…is essential to our harmonious acceptance of an industrialized world.” By the late 1980s thinking sufficiently impersonates the machine that Artificial Intelligence experts, like Minsky, can matter-of-factly speak of the symbol-manipulating brain as a “computer made of meat.” Cognitive psychology, echoing Hobbes, has become based almost entirely on the computational model of thought in the decades since Turing’s 1950 prediction. Heidegger felt that there is an inherent tendency for Western thinking to merge into the mathematical sciences, and saw science as “incapable of awakening, and in fact emasculating, the spirit of genuine inquiry.” We find ourselves, in an age when the fruits of science threaten to end human life altogether, when a dying capitalism seems capable of taking everything with it, more apt to want to discover the ultimate origins of the nightmare. When the world and its thought (Lévi-Strauss and Chomsky come immediately to mind) reach a condition that is increasingly mathematized and empty (where computers are widely touted as capable of feelings and even of life itself), the beginnings of this bleak journey, including the origins of the number concept, demand comprehension. It may be that this inquiry is essential to save us and our humanness.

The Case Against Art
────────────────────

Art is always about “something hidden.” But does it help us connect with that hidden something? I think it moves us away from it. During the first million or so years as reflective beings humans seem to have created no art. As Jameson put it, art had no place in that “unfallen social reality” because there was no need for it. Though tools were fashioned with an astonishing economy of effort and perfection of form, the old cliche about the aesthetic impulse as one of the irreducible components of the human mind is invalid. The oldest enduring works of art are hand-prints, produced by pressure or blown pigment—a dramatic token of direct impress on nature. Later in the Upper Paleolithic era, about 30,000 years ago, commenced the rather sudden appearance of the cave art associated with names like Altamira and Lascaux. These images of animals possess an often breathtaking vibrancy and naturalism, though concurrent sculpture, such as the widely-found “venus” statuettes of women, was quite stylized. Perhaps this indicates that domestication of people was to precede domestication of nature. Significantly, the “sympathetic magic” or hunting theory of earliest art is now waning in the light of evidence that nature was bountiful rather than threatening.
The veritable explosion of art at this time bespeaks an anxiety not felt before: in Worringer’s words, “creation in order to subdue the torment of perception.” Here is the appearance of the symbolic, as a moment of discontent. It was a social anxiety; people felt something precious slipping away. The rapid development of the earliest ritual or ceremony parallels the birth of art, and we are reminded of the earliest ritual re-enactments of the moment of “the beginning,” the primordial paradise of the timeless present. Pictorial representation roused the belief in controlling loss, the belief in coercion itself. And we see the earliest evidence of symbolic division, as with the half-human, half-beast stone faces at El Juyo. The world is divided into opposing forces, by which binary distinction the contrast of culture and nature begins and a productionist, hierarchical society is perhaps already prefigured. The perceptual order itself, as a unity, starts to break down in reflection of an increasingly complex social order. A hierarchy of senses, with the visual steadily more separate from the others and seeking its completion in artificial images such as cave paintings, moves to replace the full simultaneity of sensual gratification. Lévi-Strauss discovered, to his amazement, a tribal people that had been able to see Venus in daytime; but not only were our faculties once so very acute, they were also not ordered and separate. Part of training sight to appreciate the objects of culture was the accompanying repression of immediacy in an intellectual sense: reality was removed in favor of merely aesthetic experience. Art anesthetizes the sense organs and removes the natural world from their purview. This reproduces culture, which can never compensate for the disability. Not surprisingly, the first signs of a departure from those egalitarian principles that characterized hunter-gatherer life show up now. The shamanistic origin of visual art and music has been often remarked, the point here being that the artist-shaman was the first specialist. It seems likely that the ideas of surplus and commodity appeared with the shaman, whose orchestration of symbolic activity portended further alienation and stratification. Art, like language, is a system of symbolic exchange that introduces exchange itself. It is also a necessary device for holding together a community based on the first symptoms of unequal life. Tolstoy’s statement that “art is a means of union among men, joining them together in the same feeling,” elucidates art’s contribution to social cohesion at the dawn of culture. Socializing ritual required art; art works originated in the service of ritual; the ritual production of art and the artistic production of ritual are the same. “Music,” wrote Seu-ma-tsen, “is what unifies.” As the need for solidarity accelerated, so did the need for ceremony; art also played a role in its mnemonic function. Art, with myth closely following, served as the semblance of real memory. In the recesses of the caves, earliest indoctrination proceeded via the paintings and other symbols, intended to inscribe rules in depersonalized, collective memory. Nietzsche saw the training of memory, especially the memory of obligations, as the beginning of civilized morality. Once the symbolic process of art developed it dominated memory as well as perception, putting its stamp on all mental functions.
Cultural memory meant that one person’s action could be compared with that of another, including portrayed ancestors, and future behavior anticipated and controlled. Memories became externalized, akin to property but not even the property of the subject. Art turns the subject into object, into symbol. The shaman’s role was to objectify reality; this happened to outer nature and to subjectivity alike because alienated life demanded it. Art provided the medium of conceptual transformation by which the individual was separated from nature and dominated, at the deepest level, socially. Art’s ability to symbolize and direct human emotion accomplished both ends. What we were led to accept as necessity, in order to keep ourselves oriented in nature and society, was at base the invention of the symbolic world, the Fall of Man. The world must be mediated by art (and human communication by language, and being by time) due to division of labor, as seen in the nature of ritual. The real object, its particularity, does not appear in ritual; instead, an abstract one is used, so that the terms of ceremonial expression are open to substitution. The conventions needed in division of labor, with its standardization and loss of the unique, are those of ritual, of symbolization. The process is at base identical, based on equivalence. Production of goods, as the hunter-gatherer mode is gradually liquidated in favor of agriculture (historical production) and religion (full symbolic production), is also ritual production. The agent, again, is the shaman-artist, enroute to priesthood, leader by reason of mastering his own immediate desires via the symbol. All that is spontaneous, organic and instinctive is to be neutered by art and myth. Recently the painter Eric Fischl presented at the Whitney Museum a couple in the act of sexual intercourse. A video camera recorded their actions and projected them on a TV monitor before the two. The man’s eyes were riveted to the image on the screen, which was clearly more exciting than the act itself. The evocative cave pictures, volatile in the dramatic, lamp-lit depths, began the transfer exemplified in Fischl’s tableau, in which even the most primal acts can become secondary to their representation. Conditioned self-distancing from real existence has been a goal of art from the beginning. Similarly, the category of audience, of supervised consumption, is nothing new, as art has striven to make life itself an object of contemplation. As the Paleolithic Age gave way to the Neolithic arrival of agriculture and civilization—production, private property, written language, government and religion—culture could be seen more fully as spiritual decline via division of labor, though global specialization and a mechanistic technology did not prevail until the late Iron Age. The vivid representation of late hunter-gatherer art was replaced by a formalistic, geometric style, reducing pictures of animals and humans to symbolic shapes. This narrow stylization reveals the artist shutting himself off from the wealth of empirical reality and creating the symbolic universe. 
The aridity of linear precision is one of the hallmarks of this turning point, calling to mind the Yoruba, who associate line with civilization: “This country has become civilized,” literally means, in Yoruba, “this earth has lines upon its face.” The inflexible forms of truly alienated society are everywhere apparent; Gordon Childe, for example, referring to this spirit, points out that the pots of a Neolithic village are all alike. Relatedly, warfare in the form of combat scenes makes its first appearance in art. The work of art was in no sense autonomous at this time; it served society in a direct sense, an instrument of the needs of the new collectivity. There had been no worship-cults during the Paleolithic, but now religion held sway, and it is worth remembering that for thousands of years art’s function will be to depict the gods. Meanwhile, what Glück stressed about African tribal architecture was true in all other cultures as well: sacred buildings came to life on the model of those of the secular ruler. And though not even the first signed works show up before the late Greek period, it is not inappropriate to turn here to art’s realization, some of its general features. Art not only creates the symbols of and for a society, it is a basic part of the symbolic matrix of estranged social life. Oscar Wilde said that art does not imitate life, but vice versa; which is to say that life follows symbolism, not forgetting that it is (deformed) life that produces symbolism. Every art form, according to T.S. Eliot, is “an attack upon the inarticulate.” Upon the unsymbolized, he should have said. Both painter and poet have always wanted to reach the silence behind and within art and language, leaving the question of whether the individual, in adopting these modes of expression, didn’t settle for far too little. Though Bergson tried to approach the goal of thought without symbols, such a breakthrough seems impossible outside our active undoing of all the layers of alienation. In the extremity of revolutionary situations, immediate communication has bloomed, if briefly. The primary function of art is to objectify feeling, by which one’s own motivations and identity are transformed into symbol and metaphor. All art, as symbolization, is rooted in the creation of substitutes, surrogates for something else; by its very nature therefore, it is falsification. Under the guise of “enriching the quality of human experience,” we accept vicarious, symbolic descriptions of how we should feel, trained to need such public images of sentiment that ritual art and myth provide for our psychic security. Life in civilization is lived almost wholly in a medium of symbols. Not only scientific or technological activity but aesthetic form are canons of symbolization, often expressed quite unspiritually. It is widely averred, for example, that a limited number of mathematical figures account for the efficacy of art. There is Cézanne’s famous dictum to “treat nature by the cylinder, the sphere and the cone,” and Kandinsky’s judgement that “the impact of the acute angle of a triangle on a circle produces an effect no less powerful than the finger of God touching the finger of Adam in Michelangelo.” The sense of a symbol, as Charles Peirce concluded, is its translation into another symbol, this an endless reproduction, with the real always displaced. Though art is not fundamentally concerned with beauty, its inability to rival nature sensuously has evoked many unfavorable comparisons.
“Moonlight is sculpture,” wrote Hawthorne; Shelley praised the “unpremeditated art” of the skylark; Verlaine pronounced the sea more beautiful than all the cathedrals. And so on, with sunsets, snowflakes, flowers, etc., beyond the symbolic products of art. Jean Arp, in fact, termed “the most perfect picture” nothing more than “warty, threadbare approximation, a dry porridge.” Why then would one respond positively to art? As compensation and palliative, because our relationship to nature and life is so deficient and disallows an authentic one. As Montherlant put it, “One gives to one’s art what one has not been capable of giving to one’s own existence.” It is true for artist and audience alike; art, like religion, arises from unsatisfied desire. Art should be considered a religious activity and category also in the sense of Nietzsche’s aphorism, “We have Art in order not to perish of Truth.” Its consolation explains the widespread preference for metaphor over a direct relationship to the genuine article. If pleasure were somehow released from every restraint, the result would be the antithesis of art. In dominated life freedom does not exist outside art, however, and so even a tiny, deformed fraction of the riches of being is welcomed. “I create in order not to cry,” revealed Klee. This separate realm of contrived life is both important and in complicity with the actual nightmare that prevails. In its institutionalized separation it corresponds to religion and ideology in general, where its elements are not, and cannot be, actualized; the work of art is a selection of possibilities unrealized except in symbolic terms. Arising from the sense of loss referred to above, it conforms to religion not only by reason of its confinement to an ideal sphere and its absence of any dissenting consequences, but it can hence be no more than thoroughly neutralized critique at best. Frequently compared to play, art and culture—like religion—have more often worked as generators of guilt and oppression. Perhaps the ludic function of art, as well as its common claim to transcendence, should be estimated as one might reassess the meaning of Versailles: by contemplating the misery of the workers who perished draining its marshes. Clive Bell pointed to the intention of art to transport us from the plane of daily struggle “to a world of aesthetic exaltation,” paralleling the aim of religion. Malraux offered another tribute to the conservative office of art when he wrote that without art works civilization would crumble “within fifty years” … becoming “enslaved to instincts and to elementary dreams.” Hegel determined that art and religion also have “this in common, namely, having entirely universal matters as content.” This feature of generality, of meaning without concrete reference, serves to introduce the notion that ambiguity is a distinctive sign of art. Usually depicted positively, as a revelation of truth free of the contingencies of time and place, the impossibility of such a formulation only illuminates another moment of falseness about art. Kierkegaard found the defining trait of the aesthetic outlook to be its hospitable reconciliation of all points of view and its evasion of choice. This can be seen in the perpetual compromise that at once valorizes art only to repudiate its intent and contents with “well, after all, it is only art.” Today culture is commodity and art perhaps the star commodity.
The situation is understood inadequately as the product of a centralized culture industry, a la Horkheimer and Adorno. We witness, rather, a mass diffusion of culture dependent on participation for its strength, not forgetting that the critique must be of culture itself, not of its alleged control. Daily life has become aestheticized by a saturation of images and music, largely through the electronic media, the representation of representation. Image and sound, in their ever-presence, have become a void, ever more absent of meaning for the individual. Meanwhile, the distance between artist and spectator has diminished, a narrowing that only highlights the absolute distance between aesthetic experience and what is real. This perfectly duplicates the spectacle at large: separate and manipulating, perpetual aesthetic experience and a demonstration of political power. Reacting against the increasing mechanization of life, avant-garde movements have not, however, resisted the spectacular nature of art any more than orthodox tendencies have. In fact, one could argue that Aestheticism, or “art for art’s sake,” is more radical than an attempt to engage alienation with its own devices. The late 19th century art pour l’art development was a self-reflective rejection of the world, as opposed to the avant-garde effort to somehow organize life around art. A valid moment of doubt lies behind Aestheticism, the realization that division of labour has diminished experience and turned art into just another specialisation: art shed its illusory ambitions and became its own content. The avant-garde has generally staked out wider claims, projecting a leading role denied it by modern capitalism. It is best understood as a social institution peculiar to technological society that so strongly prizes novelty; it is predicated on the progressivist notion that reality must be constantly updated. But avant-garde culture cannot compete with the modern world’s capacity to shock and transgress (and not just symbolically). Its demise is another datum that the myth of progress is itself bankrupt. Dada was one of the last two major avant-garde movements, its negative image greatly enhanced by the sense of general historical collapse radiated by World War I. Its partisans claimed, at times, to be against all “isms,” including the idea of art. But painting cannot negate painting, nor can sculpture invalidate sculpture, keeping in mind that all symbolic culture is the co-opting of perception, expression and communication. [nor can writing negate writing, nor can typing radical essays onto diskettes to assist in their publication ever be liberating—even if the typer breaks the rules and puts in an uninvited comment] In fact, Dada was a quest for new artistic modes, its attack on the rigidities and irrelevancies of bourgeois art a factor in the advance of art; Hans Richter’s memoirs referred to “the regeneration of visual art that Dada had begun.” If World War I almost killed art, the Dadaists reformed it. Surrealism is the last school to assert the political mission of art. Before trailing off into Trotskyism and/or art-world fame, the Surrealists upheld chance and the primitive as ways to unlock “the Marvellous” which society imprisons in the unconscious. The false judgement that would have re-introduced art into everyday life and thereby transfigured it certainly misunderstood the relationship of art to repressive society. 
The real barrier is not between art and social reality, which are one, but between desire and the existing world. The Surrealists’ aim of inventing a new symbolism and mythology upheld these categories and mistrusted unmediated sensuality. Concerning the latter, Breton held that “enjoyment is a science; the exercise of the senses demands a personal initiation and therefore you need art.” Modernist abstraction resumed the trend begun by Aestheticism, in that it expressed the conviction that only by a drastic restriction of its field of vision could art survive. With the least strain of embellishment possible in a formal language, art became increasingly self-referential, in its search for a “purity” that was hostile to narrative. Guaranteed not to represent anything, modern painting is consciously nothing more than a flat surface with paint on it. But the strategy of trying to empty art of symbolic value, the insistence on the work of art as an object in its own right in a world of objects, proved a virtually self-annihilating method. This “radical physicality,” based on aversion to authority though it was, never amounted to more, in its objectiveness, than simple commodity status. The sterile grids of Mondrian and the repeated all-black squares of Reinhardt echo this acquiescence no less than hideous 20th century architecture in general. Modernist self-liquidation was parodied by Rauschenberg’s 1953 _Erased Drawing_, exhibited after his month-long erasure of a de Kooning drawing. The very concept of art, Duchamp’s showing of a urinal in a 1917 exhibition notwithstanding, became an open question in the ‘50s and has grown steadily more undefinable since. Pop Art demonstrated that the boundaries between art and mass media (e.g. ads and comics) are dissolving. Its perfunctory and mass-produced look is that of the whole society and the detached, blank quality of a Warhol and his products sum it up. Banal, morally weightless, depersonalized images, cynically manipulated by a fashion-conscious marketing stratagem: the nothingness of modern art and its world revealed. The proliferation of art styles and approaches in the ‘60s—Conceptual, Minimalist, Performance, etc.—and the accelerated obsolescence of most art brought the “postmodern” era, a displacement of the formal “purism” of modernism by an eclectic mix from past stylistic achievements. This is basically a tired, spiritless recycling of used-up fragments, announcing that the development of art is at an end. Against the global devaluing of the symbolic, moreover, it is incapable of generating new symbols and scarcely even makes an effort to do so. Occasionally critics, like Thomas Lawson, bemoan art’s current inability “to stimulate the growth of a really troubling doubt,” little noticing that a quite noticeable movement of doubt threatens to throw over art itself. Such “critics” cannot grasp that art must remain alienation and as such must be superseded, that art is disappearing because the immemorial separation between nature and art is a death sentence for the world that must be voided. Deconstruction, for its part, announced the project of decoding Literature and indeed the “texts,” or systems of signification, throughout all culture. But this attempt to reveal supposedly hidden ideology is stymied by its refusal to consider origins or historical causation, an aversion it inherited from structuralism/poststructuralism. 
Derrida, Deconstruction’s seminal figure, deals with language as a solipsism, consigned to self-interpretation; he engages not in critical activity but in writing about writing. Rather than a de-constructing of impacted reality, this approach is merely a self-contained academicism, in which Literature, like modern painting before it, never departs from concern with its own surface. Meanwhile, since Piero Manzoni canned his own feces and sold them in a gallery and Chris Burden had himself shot in the arm, and crucified to a Volkswagen, we have seen in art ever more fitting parables of its end, such as the self-portraits drawn by Anastasi—with his eyes closed. “Serious” music is long dead and popular music deteriorates; poetry nears collapse and retreats from view; drama, which moved from the Absurd to Silence, is dying; and the novel is eclipsed by non-fiction as the only way to write seriously. In a jaded, enervated age, where, it seems, to speak is to say less, art is certainly less. Baudelaire was obliged to claim a poet’s dignity in a society which had no more dignity to hand out. A century and more later how inescapable is the truth of that condition and how much more threadbare the consolation or station of “timeless” art. Adorno began his book thusly: “Today it goes without saying that nothing concerning art goes without saying, much less without thinking. Everything about art has become problematic; its inner life, its relation to society, even its right to exist.” But _Aesthetic Theory_ affirms art, just as Marcuse’s last work did, testifying to despair and to the difficulty of assailing the hermetically sealed ideology of culture. And although other “radicals,” such as Habermas, counsel that the desire to abolish symbolic mediation is irrational, it is becoming clearer that when we really experiment with our hearts and hands the sphere of art is shown to be pitiable. In the transfiguration we must enact, the symbolic will be left behind and art refused in favor of the real. Play, creativity, self-expression and authentic experience will recommence at that moment.

Agriculture
───────────

Agriculture, the indispensable basis of civilization, was originally encountered as time, language, number and art won out. As the materialization of alienation, agriculture is the triumph of estrangement and the definite divide between culture and nature and humans from each other. Agriculture is the birth of production, complete with its essential features and deformation of life and consciousness. The land itself becomes an instrument of production and the planet’s species its objects. Wild or tame, weeds or crops speak of that duality that cripples the soul of our being, ushering in, relatively quickly, the despotism, war and impoverishment of high civilization over the great length of that earlier oneness with nature. The forced march of civilization, which Adorno recognized in the “assumption of an irrational catastrophe at the beginning of history,” which Freud felt as “something imposed on a resisting majority,” of which Stanley Diamond found only “conscripts, not volunteers,” was dictated by agriculture. And Mircea Eliade was correct to assess its coming as having “provoked upheavals and spiritual breakdowns” whose magnitude the modern mind cannot imagine. “To level off, to standardize the human landscape, to efface its irregularities and banish its surprises,” these words of E.M.
Cioran apply perfectly to the logic of agriculture, the end of life as mainly sensuous activity, the embodiment and generator of separated life. Artificiality and work have steadily increased since its inception and are known as culture: in domesticating animals and plants man necessarily domesticated himself. Historical time, like agriculture, is not inherent in social reality but an imposition on it. The dimension of time or history is a function of repression, whose foundation is production or agriculture. Hunter-gatherer life was anti-time in its simultaneous and spontaneous openness; farming life generates a sense of time by its successive-task narrowness, its directed routine. As the non-closure and variety of Paleolithic living gave way to the literal enclosure of agriculture, time assumed power and came to take on the character of an enclosed space. Formalized temporal reference points — ceremonies with fixed dates, the naming of days, etc. — are crucial to the ordering of the world of production; as a schedule of production, the calendar is integral to civilization. Conversely, not only would industrial society be impossible without time schedules, the end of agriculture (basis of all production) would be the end of historical time. Representation begins with language, a means of reining in desire. By displacing autonomous images with verbal symbols, life is reduced and brought under strict control; all direct, unmediated experience is subsumed by that supreme mode of symbolic expression, language. Language cuts up and organizes reality, as Benjamin Whorf put it, and this segmentation of nature, an aspect of grammar, sets the stage for agriculture. Julian Jaynes, in fact, concluded that the new linguistic mentality led very directly to agriculture. Unquestionably, the crystallization of language into writing, called forth mainly by the need for record-keeping of agricultural transactions, is the signal that civilization has begun. In the non-commodified, egalitarian hunter-gatherer ethos, the basis of which (as has so often been remarked) was sharing, number was not wanted. There was no ground for the urge to quantify, no reason to divide what was whole. Not until the domestication of animals and plants did this cultural concept fully emerge. Two of number’s seminal figures testify clearly to its alliance with separateness and property: Pythagoras, center of a highly influential religious cult of number, and Euclid, father of mathematics and science, whose geometry originated to measure fields for reasons of ownership, taxation and slave labor. One of civilization’s early forms, chieftainship, entails a linear rank order in which each member is assigned an exact numerical place. Soon, following the anti-natural linearity of plow culture, the inflexible 90-degree gridiron plan of even earliest cities appeared. Their insistent regularity constitutes in itself a repressive ideology. Culture, now numberized, becomes more firmly bounded and lifeless. Art, too, in its relationship to agriculture, highlights both institutions. It begins as a means to interpret and subdue reality, to rationalize nature, and conforms to the great turning point which is agriculture in its basic features. The pre-Neolithic cave paintings, for example, are vivid and bold, a dynamic exaltation of animal grace and freedom. 
The Neolithic art of farmers and pastoralists, however, stiffens into stylized forms; Franz Borkenau typified its pottery as a “narrow, timid botching of materials and forms.” With agriculture, art lost its variety and became standardized into geometric designs that tended to degenerate into dull, repetitive patterns, a perfect reflection of standardized, confined, rule-patterned life. And where there had been no representation in Paleolithic art of men killing men, an obsession with depicting confrontation between people advanced with the Neolithic period, scenes of battles becoming common. Time, language, number, art and all the rest of culture, which predate and lead to agriculture, rest on symbolization. Just as autonomy preceded domestication and self-domestication, the rational and the social precede the symbolic. Food production, it is eternally and gratefully acknowledged, “permitted the cultural potentiality of the human species to develop.” But what is this tendency toward the symbolic, toward the elaboration and imposition of arbitrary forms? It is a growing capacity for objectification, by which what is living becomes reified, thing-like. Symbols are more than the basic units of culture; they are screening devices to distance us from our experiences. They classify and reduce, “to do away with,” in Leakey and Lewin’s remarkable phrase, “the otherwise almost intolerable burden of relating one experience to another.” Thus culture is governed by the imperative of reforming and subordinating nature. The artificial environment which is agriculture accomplished this pivotal mediation, with the symbolism of objects manipulated in the construction of relations of dominance. For it is not only external nature that is subjugated: the face-to-face quality of pre-agricultural life in itself severely limited domination, while culture extends and legitimizes it. It is likely that already during the Paleolithic era certain forms or names were attached to objects or ideas, in a symbolizing manner but in a shifting, impermanent, perhaps playful sense. The will to sameness and security found in agriculture means that the symbols became as static and constant as farming life. Regularization, rule patterning, and technological differentiation, under the sign of division of labor, interact to ground and advance symbolization. Agriculture completes the symbolic shift and the virus of alienation has overcome authentic, free life. It is the victory of cultural control; as anthropologist Marshall Sahlins puts it, “The amount of work per capita increases with the evolution of culture and the amount of leisure per capita decreases.” Today, the few surviving hunter-gatherers occupy the least “economically interesting” areas of the world where agriculture has not penetrated, such as the snows of the Inuit or desert of the Australian aborigines. And yet the refusal of farming drudgery, even in adverse settings, bears its own rewards. The Hadza of Tanzania, Filipino Tasaday, !Kung of Botswana, or the Kalahari Desert !Kung San, who were seen by Richard Lee as easily surviving a serious, several years’ drought while neighboring farmers starved, also testify to Hole and Flannery’s summary that “No group on earth has more leisure time than hunters and gatherers, who spend it primarily on games, conversation and relaxing.” Service rightly attributed this condition to “the very simplicity of the technology and lack of control over the environment” of such groups.
And yet simple Paleolithic methods were, in their own way, “advanced.” Consider a basic cooking technique like steaming foods by heating stones in a covered pit; this is immemorially older than any pottery, kettles or baskets (in fact, is anti-container in its non-surplus, non-exchange orientation) and is the most nutritionally sound way to cook, far healthier than boiling food in water, for example. Or consider the fashioning of such stone tools as the long and exceptionally thin “laurel leaf” knives, delicately chipped but strong, which modern industrial techniques cannot duplicate. The hunting and gathering lifestyle represents the most successful and enduring adaptation ever achieved by humankind. In occasional pre-agriculture phenomena like the intensive collection of food or the systematic hunting of a single species can be seen signs of impending breakdown of a pleasurable mode that remained so static for so long precisely because it was pleasurable. The “penury and day-long grind” of agriculture, in Clark’s words, is the vehicle of culture, “rational” only in its perpetual disequilibrium and its logical progression toward ever-greater destruction, as will be outlined below. Although the term hunter-gatherer should be reversed (and has been by not a few current anthropologists) because it is recognized that gathering constitutes by far the larger survival component, the nature of hunting provides salient contrast to domestication. The relationship of the hunter to the hunted animal, which is sovereign, free and even considered equal, is obviously qualitatively different from that of the farmer or herdsman to the enslaved chattels over which he rules absolutely. Evidence of the urge to impose order or subjugate is found in the coercive rites and uncleanness taboos of incipient religion. The eventual subduing of the world that is agriculture has at least some of its basis where ambiguous behavior is ruled out, purity and defilement defined and enforced. Lévi-Strauss defined religion as the anthropomorphism of nature; earlier spirituality was participatory with nature, not imposing cultural values or traits upon it. The sacred means that which is separated, and ritual and formalization, increasingly removed from the ongoing activities of daily life and in the control of such specialists as shamans and priests, are closely linked with hierarchy and institutionalized power. Religion emerges to ground and legitimize culture, by means of a “higher” order of reality; it is especially required, in this function of maintaining the solidarity of society, by the unnatural demands of agriculture. In the Neolithic village of Catal Hüyük in Turkish Anatolia, one of every three rooms was used for ritual purposes. Plowing and sowing can be seen as ritual renunciations, according to Burkert, a form of systematic repression accompanied by a sacrificial element. Speaking of sacrifice, which is the killing of domesticated animals (or even humans) for ritual purposes, it is pervasive in agricultural societies and found only there. Some of the major Neolithic religions often attempted a symbolic healing of the agricultural rupture with nature through the mythology of the earth mother, which needless to say does nothing to restore the lost unity. Fertility myths are also central; the Egyptian Osiris, the Greek Persephone, Baal of the Canaanites, and the New Testament Jesus, gods whose death and resurrection testify to the perseverance of the soil, not to mention the human soul. 
The first temples signified the rise of cosmologies based on a model of the universe as an arena of domestication or barnyard, which in turn serves to justify the suppression of human autonomy. Whereas precivilized society was, as Redfield put it, “held together by largely undeclared but continually realized ethical conceptions,” religion developed as a way of creating citizens, placing the moral order under public management. Domestication involved the initiation of production, vastly increased divisions of labor, and the completed foundations of social stratification. This amounted to an epochal mutation both in the character of human existence and its development, clouding the latter with ever more violence and work. Contrary to the myth of hunter-gatherers as violent and aggressive, by the way, recent evidence shows that existing non-farmers, such as the Mbuti (“pygmies”) studied by Turnbull, apparently do what killing they do without any aggressive spirit, even with a sort of regret. Warfare and the formation of every civilization or state, on the other hand, are inseparably linked. Primal peoples did not fight over areas in which separate groups might converge in their gathering and hunting. At least “territorial” struggles are not part of the ethnographic literature and they would seem even less likely to have occurred in pre-history when resources were greater and contact with civilization non-existent. Indeed, these peoples had no conception of private property, and Rousseau’s figurative judgment, that divided society was founded by the man who first sowed a piece of ground, saying “This land is mine,” and found others to believe him, is essentially valid. “Mine and thine, the seeds of all mischief, have no place with them,” reads Pietro’s 1511 account of the natives encountered on Columbus’ second voyage. Centuries later, surviving Native Americans asked, “Sell the Earth? Why not sell the air, the clouds, the great sea?” Agriculture creates and elevates possessions; consider the longing root of belongings, as if they ever make up for the loss. Work, as a distinct category of life, likewise did not exist until agriculture. The human capacity of being shackled to crops and herds devolved rather quickly. Food production overcame the common absence or paucity of ritual and hierarchy in society and introduced civilized activities like the forced labor of temple-building. Here is the real “Cartesian split” between inner and outer reality, the separation whereby nature became merely something to be “worked.” On this capacity for a sedentary and servile existence rests the entire superstructure of civilization with its increasing weight of repression. Male violence toward women originated with agriculture, which transmuted women into beasts of burden and breeders of children. Before farming, the egalitarianism of foraging life “applied as fully to women as to men,” judged Eleanor Leacock, owing to the autonomy of tasks and the fact that decisions were made by those who carried them out. In the absence of production and with no drudge work suitable for child labor such as weeding, women were not consigned to onerous chores or the constant supply of babies. 
Along with the curse of perpetual work, via agriculture, in the expulsion from Eden, God told woman, “I will greatly multiply thy sorrow and thy conception; in sorrow thou shalt bring forth children; and thy desire shall be to thy husband, and he shall rule over thee.” Similarly, the first known codified laws, those of the Sumerian king Ur-Nammu, prescribed death to any woman satisfying desires outside of marriage. Thus Whyte referred to the ground women “lost relative to men when humans first abandoned a simple hunting and gathering way of life,” and Simone de Beauvoir saw in the cultural equation of plow and phallus a fitting symbol of the oppression of women. As wild animals are converted into sluggish meat-making machines, the concept of becoming “cultivated” is a virtue enforced on people, meaning the weeding out of freedom from one’s nature, in the service of domestication and exploitation. As Rice points out, in Sumer, the first civilization, the earliest cities had factories with their characteristic high organization and refraction of skills. Civilization from this point exacts human labor and the mass production of food, buildings, war and authority. To the Greeks, work was a curse and nothing else. Their name for it, ponos, has the same root as the Latin poena, sorrow. The famous Old Testament curse on agriculture as the expulsion from Paradise (Genesis 3:17–18) reminds us of the origin of work. As Mumford put it, “Conformity, repetition, patience were the keys to this [Neolithic] culture…the patient capacity for work.” In this monotony and passivity of tending and waiting is born, according to Paul Shepard, the peasant’s “deep, latent resentments, crude mixtures of rectitude and heaviness, and absence of humor.” One might also add a stoic insensitivity and lack of imagination inseparable from religious faith, sullenness, and suspicion among traits widely attributed to the domesticated life of farming. Although food production by its nature includes a latent readiness for political domination and although civilizing culture was from the beginning its own propaganda machine, the changeover involved a monumental struggle. Fredy Perlman’s Against His-Story, Against Leviathan! is unrivaled on this, vastly enriching Toynbee’s attention to the “internal” and “external proletariats,” discontents within and without civilization. Nonetheless, along the axis from digging stick farming to plow agriculture to fully differentiated irrigation systems, an almost total genocide of gatherers and hunters was necessarily effected. The formation and storage of surpluses are part of the domesticating will to control and make static, an aspect of the tendency to symbolize. A bulwark against the flow of nature, surplus takes the forms of herd animals and granaries. Stored grain was the earliest medium of equivalence, the oldest form of capital. Only with the appearance of wealth in the shape of storable grains do the gradations of labor and social classes proceed. While there were certainly wild grains before all this (and wild wheat, by the way, is 24 percent protein compared to 12 percent for domesticated wheat), the bias of culture makes every difference. Civilization and its cities rested as much on granaries as on symbolization. The mystery of agriculture’s origin seems even more impenetrable in light of the recent reversal of long-standing notions that the previous era was one of hostility to nature and an absence of leisure.
“One could no longer assume,” wrote Arme, “that early man domesticated plants and animals to escape drudgery and starvation. If anything, the contrary appeared true, and the advent of farming saw the end of innocence.” For a long time, the question was “Why wasn’t agriculture adopted much earlier in human evolution?” More recently, we know that agriculture, in Cohen’s words, “is not easier than hunting and gathering and does not provide a higher quality, more palatable, or more secure food base.” Thus the consensus question now is, “Why was it adopted at all?” Many theories have been advanced, none convincingly. Childe and others argue that population increase pushed human societies into more intimate contact with other species, leading to domestication and the need to produce in order to feed the additional people. But it has been shown rather conclusively that population increase did not precede agriculture but was caused by it. “I don’t see any evidence anywhere in the world,” concluded Flannery, “that suggests that population pressure was responsible for the beginning of agriculture.” Another theory has it that major climatic changes occurred at the end of the Pleistocene, about 11,000 years ago, that upset the old hunter-gatherer life-world and led directly to the cultivation of certain surviving staples. Recent dating methods have helped demolish this approach; no such climatic shift happened that could have forced the new mode into existence. Besides, there are scores of examples of agriculture being adopted, or refused, in every type of climate. Another major hypothesis is that agriculture was introduced via a chance discovery or invention as if it had never occurred to the species before a certain moment that, for example, food grows from sprouted seeds. It seems certain that Paleolithic humanity had a virtually inexhaustible knowledge of flora and fauna for many tens of thousands of years before the cultivation of plants began, which renders this theory especially weak. Agreement with Carl Sauer’s summation that, “Agriculture did not originate from a growing or chronic shortage of food” is sufficient, in fact, to dismiss virtually all originary theories that have been advanced. A remaining idea, presented by Hahn, Isaac and others, holds that food production began at base as a religious activity. This hypothesis comes closest to plausibility. Sheep and goats, the first animals to be domesticated, are known to have been widely used in religious ceremonies, and to have been raised in enclosed meadows for sacrificial purposes. Before they were domesticated, moreover, sheep had no wool suitable for textile purposes. The main use of the hen in southeastern Asia and the eastern Mediterranean, the earliest centers of civilization, “seems to have been,” according to Darby, “sacrificial or divinatory rather than alimentary.” Sauer adds that the “egg laying and meat producing qualities” of tamed fowl “are relatively late consequences of their domestication.” Wild cattle were fierce and dangerous; neither the docility of oxen nor the modified meat texture of such castrates could have been foreseen. Cattle were not milked until centuries after their initial captivity, and representations indicate that their first known harnessing was to wagons in religious processions. Plants, next to be controlled, exhibit similar backgrounds so far as is known. Consider the New World examples of squash and pumpkin, used originally as ceremonial rattles.
Johannessen discussed the religious and mystical motives connected with the domestication of maize, Mexico’s most important crop and center of its native Neolithic religion. Likewise, Anderson investigated the selection and development of distinctive types of various cultivated plants because of their magical significance. The shamans, I should add, were well-placed in positions of power to introduce agriculture via the taming and planting involved in ritual and religion, sketchily referred to above. Though the religious explanation of the origins of agriculture has been somewhat overlooked, it brings us, in my opinion, to the very doorstep of the real explanation of the birth of production: that non-rational, cultural force of alienation which spread, in the forms of time, language, number and art, to ultimately colonize material and psychic life in agriculture. “Religion” is too narrow a conceptualization of this infection and its growth. Domination is too weighty, too all-encompassing to have been solely conveyed by the pathology that is religion. But the cultural values of control and uniformity that are part of religion are certainly part of agriculture, and from the beginning. Noting that strains of corn cross-pollinate very easily, Anderson studied the very primitive agriculturalists of Assam, the Naga tribe, and their variety of corn that exhibited no differences from plant to plant. True to culture, showing that it is complete from the beginning of production, the Naga kept their varieties so pure “only by a fanatical adherence to an ideal type.” This exemplifies the marriage of culture and production in domestication, and its inevitable progeny, repression and work. The scrupulous tending of strains of plants finds its parallel in the domesticating of animals, which also defies natural selection and re-establishes the controllable organic world at a debased, artificial level. Like plants, animals are mere things to be manipulated; a dairy cow, for instance, is seen as a kind of machine for converting grass to milk. Transmuted from a state of freedom to that of helpless parasites, these animals become completely dependent on man for survival. In domestic mammals, as a rule, the size of the brain becomes relatively smaller as specimens are produced that devote more energy to growth and less to activity. Placid, infantilized, typified perhaps by the sheep, most domesticated of herd animals; the remarkable intelligence of wild sheep is completely lost in their tamed counterparts. The social relationships among domestic animals are reduced to the crudest essentials. Non-reproductive parts of the life cycle are minimized, courtship is curtailed, and the animal’s very capacity to recognize its own species is impaired. Farming also created the potential for rapid environmental destruction and the domination over nature soon began to turn the green mantle that covered the birthplaces of civilization into barren and lifeless areas. “Vast regions have changed their aspect completely,” estimates Zeuner, “always to quasi-drier condition, since the beginnings of the Neolithic.” Deserts now occupy most of the areas where the high civilizations once flourished, and there is much historical evidence that these early formations inevitably ruined their environments. Throughout the Mediterranean Basin and in the adjoining Near East and Asia, agriculture turned lush and hospitable lands into depleted, dry, and rocky terrain. 
In Critias, Plato described Attica as “a skeleton wasted by disease,” referring to the deforestation of Greece and contrasting it to its earlier richness. Grazing by goats and sheep, the first domesticated ruminants, was a major factor in the denuding of Greece, Lebanon, and North Africa, and the desertification of the Roman and Mesopotamian empires. Another, more immediate impact of agriculture, brought to light increasingly in recent years, involved the physical well-being of its subjects. Lee and Devore’s researches show that “the diet of gathering peoples was far better than that of cultivators, that starvation is rare, that their health status was generally superior, and that there is a lower incidence of chronic disease.” Conversely, Farb summarized, “Production provides an inferior diet based on a limited number of foods, is much less reliable because of blights and the vagaries of weather, and is much more costly in terms of human labor expended.” The new field of paleopathology has reached even more emphatic conclusions, stressing, as does Angel, the “sharp decline in growth and nutrition caused by the changeover from food gathering to food production.” Earlier conclusions about life span have also been revised. Although eyewitness Spanish accounts of the sixteenth century tell of Florida Indian fathers seeing their fifth generation before passing away, it was long believed that primitive people died in their 30s and 40s. Robson, Boyden and others have dispelled the confusion of longevity with life expectancy and discovered that current hunter-gatherers, barring injury and severe infection, often outlive their civilized contemporaries. During the industrial age only fairly recently did life span lengthen for the species, and it is now widely recognized that in Paleolithic times humans were long-lived animals, once certain risks were passed. DeVries is correct in his judgment that duration of life dropped sharply upon contact with civilization. “Tuberculosis and diarrheal disease had to await the rise of farming, measles and bubonic plague the appearance of large cities,” wrote Jared Diamond. Malaria, probably the single greatest killer of humanity, and nearly all other infectious diseases are the heritage of agriculture. Nutritional and degenerative diseases in general appear with the reign of domestication and culture. Cancer, coronary thrombosis, anemia, dental caries, and mental disorders are but a few of the hallmarks of agriculture; previously women gave birth with no difficulty and little or no pain. People were far more alive in all their senses. !Kung San, reported R.H. Post, have heard a single-engine plane while it was still 70 miles away, and many of them can see four moons of Jupiter with the naked eye. The summary judgment of Harris and Ross, as to “an overall decline in the quality-and probably in the length-of human life among farmers as compared with earlier hunter-gatherer groups,” is understated. One of the most persistent and universal ideas is that there was once a Golden Age of innocence before history began. Hesiod, for instance, referred to the “life-sustaining soil, which yielded its copious fruits unbribed by toil.” Eden was clearly the home of the hunter-gatherers and the yearning expressed by the historical images of paradise must have been that of disillusioned tillers of the soil for a lost life of freedom and relative ease. 
The history of civilization shows the increasing displacement of nature from human experience, characterized in part by a narrowing of food choices. According to Rooney, prehistoric peoples found sustenance in over 1500 species of wild plants, whereas “All civilizations,” Wenke reminds us, “have been based on the cultivation of one or more of just six plant species: wheat, barley, millet, rice, maize, and potatoes.” It is a striking truth that over the centuries “the number of different edible foods which are actually eaten,” Pyke points out, “has steadily dwindled.” The world’s population now depends for most of its subsistence on only about 20 genera of plants while their natural strains are replaced by artificial hybrids and the genetic pool of these plants becomes far less varied. The diversity of food tends to disappear or flatten out as the proportion of manufactured foods increases. Today the very same articles of diet are distributed worldwide, so that an Inuit Eskimo and an African may soon be eating powdered milk manufactured in Wisconsin or frozen fish sticks from a single factory in Sweden. A few big multinationals such as Unilever, the world’s biggest food production company, preside over a highly integrated service system in which the object is not to nourish or even to feed, but to force an ever-increasing consumption of fabricated, processed products upon the world. When Descartes enunciated the principle that the fullest exploitation of matter to any use is the whole duty of man, our separation from nature was virtually complete and the stage was set for the Industrial Revolution. Three hundred and fifty years later this spirit lingered in the person of Jean Vorst, Curator of France’s Museum of Natural History, who pronounced that our species, “because of intellect,” can no longer re-cross a certain threshold of civilization and once again become part of a natural habitat. He further stated, expressing perfectly the original and persevering imperialism of agriculture, “As the earth in its primitive state is not adapted to our expansion, man must shackle it to fulfill human destiny.” The early factories literally mimicked the agricultural model, indicating again that at base all mass production is farming. The natural world is to be broken and forced to work. One thinks of the mid-American prairies where settlers had to yoke six oxen to plows in order to cut through the soil for the first time. Or a scene from the 1870s in The Octopus by Frank Norris, in which gang-plows were driven like “a great column of field artillery” across the San Joaquin Valley, cutting 175 furrows at once. Today the organic, what is left of it, is fully mechanized under the aegis of a few petrochemical corporations. Their artificial fertilizers, pesticides, herbicides and near-monopoly of the world’s seed stock define a total environment that integrates food production from planting to consumption. Although Lévi-Strauss is right that “Civilization manufactures monoculture like sugar beets,” only since World War II has a completely synthetic orientation begun to dominate. Agriculture takes more organic matter out of the soil than it puts back, and soil erosion is basic to the monoculture of annuals. Regarding the latter, some are promoted with devastating results to the land; along with cotton and soybeans, corn, which in its present domesticated state is totally dependent on agriculture for its existence, is especially bad. 
J.Russell Smith called it “the killer of continents…and one of the worst enemies of the human future.” The erosion cost of one bushel of Iowa corn is two bushels of topsoil, highlighting the more general large-scale industrial destruction of farmland. The continuous tillage of huge monocultures, with massive use of chemicals and no application of manure or humus, obviously raises soil deterioration and soil loss to much higher levels. The dominant agricultural mode has it that soil needs massive infusions of chemicals, supervised by technicians whose overriding goal is to maximize production. Artificial fertilizers and all the rest from this outlook eliminate the need for the complex life of the soil and indeed convert it into a mere instrument of production. The promise of technology is total control, a completely contrived environment that simply supersedes the natural balance of the biosphere. But more and more energy is expended to purchase great monocultural yields that are beginning to decline, never mind the toxic contamination of the soil, ground water and food. The U.S. Department of Agriculture says that cropland erosion is occurring in this country at a rate of two billion tons of soil a year. The National Academy of Sciences estimates that over one third of topsoil is already gone forever. The ecological imbalance caused by monocropping and synthetic fertilizers causes enormous increases in pests and crop diseases; since World War II, crop loss due to insects has actually doubled. Technology responds, of course, with spiraling applications of more synthetic fertilizers, and “weed” and “pest” killers, accelerating the crime against nature. Another post-war phenomenon was the Green Revolution, billed as the salvation of the impoverished Third World by American capital and technology. But rather than feeding the hungry, the Green Revolution drove millions of poor people from farmlands in Asia, Latin America and Africa as victims of the program that fosters large corporate farms. It amounted to an enormous technological colonization creating dependency on capital-intensive agribusiness, destroying older agrarian communalism, requiring massive fossil fuel consumption and assaulting nature on an unprecedented scale. Desertification, or loss of soil due to agriculture, has been steadily increasing. Each year, a total area equivalent to more than two Belgiums is being converted to desert worldwide. The fate of the world’s tropical rainforests is a factor in the acceleration of this desiccation: half of them have been erased in the past thirty years. In Botswana, the last wilderness region of Africa has disappeared like much of the Amazon jungle and almost half of the rainforests of Central America, primarily to raise cattle for the hamburger markets in the U.S. and Europe. The few areas safe from deforestation are where agriculture doesn’t want to go. The destruction of the land is proceeding in the U.S. over a greater land area than was encompassed by the original thirteen colonies, just as it was at the heart of the severe African famine of the mid-1980s, and the extinction of one species of wild animal and plant after another. 
Returning to animals, one is reminded of the words of Genesis in which God said to Noah, “And the fear of you and the dread of you shall be upon every fowl of the air, upon all that moveth upon the earth, and upon all the fishes of the sea; into your hands are they delivered.” When newly discovered territory was first visited by the advance guard of production, as a wide descriptive literature shows, the wild mammals and birds showed no fear whatsoever of the explorers. The agriculturalized mentality, however, so aptly foretold in the biblical passage, projects an exaggerated belief in the fierceness of wild creatures, which follows from progressive estrangement and loss of contact with the animal world, plus the need to maintain dominance over it. The fate of domestic animals is defined by the fact that agricultural technologists continually look to factories as models of how to refine their own production systems. Nature is banished from these systems as, increasingly, farm animals are kept largely immobile throughout their deformed lives, maintained in high-density, wholly artificial environments. Billions of chickens, pigs, and veal calves, for example, no longer even see the light of day much less roam the fields, fields growing more silent as more and more pastures are plowed up to grow feed for these hideously confined beings. The high-tech chickens, whose beak ends have been clipped off to reduce death from stress-induced fighting, often exist four or even five to a 12” by 18” cage and are periodically deprived of food and water for up to ten days to regulate their egg-laying cycles. Pigs live on concrete floors with no bedding; foot-rot, tail-biting and cannibalism are endemic because of physical conditions and stress. Sows nurse their piglets separated by metal grates, mother and offspring barred from natural contact. Veal calves are often raised in darkness, chained to stalls so narrow as to disallow turning around or other normal posture adjustment. These animals are generally under regimens of constant medication due to the tortures involved and their heightened susceptibility to diseases; automated animal production relies upon hormones and antibiotics. Such systematic cruelty, not to mention the kind of food that results, brings to mind the fact that captivity itself and every form of enslavement has agriculture as its progenitor or model. Food has been one of our most direct contacts with the natural environment, but we are rendered increasingly dependent on a technological production system in which finally even our senses have become redundant; taste, once vital for judging a food’s value or safety, is no longer experienced, but rather certified by a label. Overall, the healthfulness of what we consume declines and land once cultivated for food now produces coffee, tobacco, grains for alcohol, marijuana, and other drugs, creating the context for famine. Even the non-processed foods like fruits and vegetables are now grown to be tasteless and uniform because the demands of handling, transport and storage, not nutrition or pleasure, are the highest considerations. Total war borrowed from agriculture to defoliate millions of acres in Southeast Asia during the Vietnam War, but the plundering of the biosphere proceeds even more lethally in its daily, global forms. Food as a function of production has also failed miserably on the most obvious level: half of the world, as everyone knows, suffers from malnourishment ranging to starvation itself. 
Meanwhile, the “diseases of civilization,” as discussed by Eaton and Konner in the January 31, 1985 New England Journal of Medicine and contrasted with the healthful pre-farming diets, underline the joyless, sickly world of chronic maladjustment we inhabit as prey of the manufacturers of medicine, cosmetics, and fabricated food. Domestication reaches new heights of the pathological in genetic food engineering, with new types of animals in the offing as well as contrived microorganisms and plants. Logically, humanity itself will also become a domesticate of this order as the world of production processes us as much as it degrades and deforms every other natural system. The project of subduing nature, begun and carried through by agriculture, has assumed gigantic proportions. The “success” of civilization’s progress, a success earlier humanity never wanted, tastes more and more like ashes. James Serpell summed it up this way: “In short we appear to have reached the end of the line. We cannot expand; we seem unable to intensify production without wreaking further havoc, and the planet is fast becoming a wasteland.” Physiologist Jared Diamond termed the initiation of agriculture “a catastrophe from which we have never recovered.” Agriculture has been and remains a “catastrophe” at all levels, the one which underpins the entire material and spiritual culture of alienation now destroying us. Liberation is impossible without its dissolution.

PART TWO
════════

Industrialism and Domestication
───────────────────────────────

The modern definitions of division of labor, progress, ideology, and the workers’ movement were inscribed by the coming of industrial capitalism and the factory system. The dynamics of what Hobsbawm termed “the most fundamental transformation of human life” in written history—specifically the reasons why it happened—explain the legacy and value of these institutions. Not surprisingly, much at the core of Marx’s thought can also be evaluated against the reality of the Industrial Revolution. Eighteenth-century England, where it all began, had long since seen the demise of feudalism; capitalist social relations, however, had been unable to establish a definitive hegemony. Gwyn Williams (Artisans and Sans-Culottes) found it hard to find a single year free from popular uprisings; “England was pre-eminently the country of the eighteenth-century mob,” he wrote. Peter Laslett (The World We Have Lost) surveyed the scene at the beginning of the century, noting the general consciousness that working people were openly regarded as a proletariat, and the fact, as “everyone was quite well aware,” that violence posed a constant threat to the social order. Laslett further noted that enclosure, or the fencing off of lands previously pastured, ploughed, and harvested cooperatively, commenced at this time and “destroyed communality altogether in English rural life.” Neither was there, by 1750, a significant land-owning peasantry; the great majority on the land were either tenant-farmers or agricultural wage laborers. T.S. Ashton, who wrote a classic economic history of 18th century England, identified a crucial key to this development by his observation that “Enclosure was desirable if only because rights of common led to irregularity of work,” as was widely believed. Britain in 1750, in any case, engendered a number of foreign visitors’ accounts that its common people were much “given to riot,” according to historian E.J. Hobsbawm. 
The organization of manufacture prevailing then was the domestic, or ‘putting out’ system, in which workers crafted goods in their own homes, and the capitalists were mainly merchants who supplied the raw materials and then marketed the finished products. At first these craftsmen generally owned their own tools, but later came to rent them. In either case, the relationship to the ‘means of production’ afforded great strategic strength. Unsupervised, working for several masters, and with their time their own, a degree of independence was maintained. “Luddism,” as E.P. Thompson (The Making of the English Working Class) reminds us, “was the work of skilled men in small workshops.” The Luddites (c. 1810-1820), though they belong toward the end of the period surveyed here, were perhaps the machine-breakers par excellence—textile knitters, weavers, and spinners who exemplify both the relative autonomy and anti-employer sentiment of the free craftsman. Scores of commentators have discussed the independence of such domestic workers as the handloom weavers; Muggeridge’s report on Lancashire craftsmen (from Exell, Brief History of the Weavers of the County of Gloucester), for example, notes that this kind of work “gratifies that innate love of independence… by leaving the workman entirely a master of his own time, and the sole guide of his actions.” These workers treasured their versatility, and their right to execute individual designs of their own choosing rather than the standardization of the new factory employment (which began to emerge in earnest about 1770). Witt Bowden (Industrial Society in England Towards the End of the Eighteenth Century) noted that earlier processes of production had indeed often “afforded the workers genuine opportunities for the expression of their personalities in their work,” and that, in these pre-specialization times, craftsmen could pursue “artistic conceptions” in many cases. A non-working class observer (Malachy Postlethwayt, c. 1750), in fact, expressed the view that the high quality of English manufactures was to be attributed to the frequent “relaxation of the people in their own way.” Others discerned in the workers’ control over time a distinct threat to authority as well as to profits; Ashton wrote how “very serious was the almost universal practice of working a short week,” adding a minister’s alarum (1752) that “It is not those who are absolutely idle that injure the public so much as those who work but half their time.” If anything, Ashton understated the case when he concluded that “…leisure, at times of their own choice, stood high on the workers’ scale of preferences.” William Temple’s admonition (1739) that the only way to insure temperance and industry on the part of laborers was to make it necessary that they work all the time physically possible “in order to procure the common necessaries of life,” was a frequent expression of ruling-class frustration. Temple’s experience with the turbulent weavers of Gloucestershire had thus led him to agree with Arthur Young’s “everyone but an idiot knows that the lower classes must be kept poor or they will never be industrious” dictum. Among the craftsmen of cloth, the insistence on their own methods (including, at times, the ingenious sabotage of finished goods) was matched by another weapon, that of embezzlement of the raw materials assigned to them. 
As Ashton reports, “A survey of the measures passed to suppress embezzlement and delay in returning materials shows a progressive increase in penalties.” But throughout the 18th century, according to Wadsworth and Mann (The Cotton Trade and Industrial Lancashire, 1600-1780), “the execution of the anti-embezzlement acts…lagged behind their letter. Their effectiveness was limited by the ‘resentment of the spinners and workpeople,’ which prosecutors incurred and by the difficulty of detection without regular inspection.” James’ History of the Worsted Manufacture echoes this finding: “Justices of the Peace…until compelled by mandamus, refused to entertain charges against or convict upon proper evidence, embezzlers or false reelers.” Wadsworth and Mann perceived in the embezzlement issue the relationship between the prevailing ‘work ethic’ and the prevailing mode of production: “The fact is simply that a great many…have never seen eye to eye with their employers on the rights and sanctity of ownership. The home worker of the eighteenth century, living away from the restraints of the factory and workshop and the employer’s eye, had every inducement [to try] to defeat the hard bargain the employer had driven.” The independent craftsman was a threatening adversary to the employing class, and he clung strongly to his prerogatives: his well-known propensity, for instance, to reject “the higher material standard of the factory towns,” in Thompson’s phrase, to gather his own fruits, vegetables and flowers, to largely escape the developing industrial blight and pollution, to gather freely with his neighboring workers at the dinner hour. Thompson noted a good example of the nature of the domestic worker in ‘the Yorkshire reputation for bluntness and independence” which could be traced to what local historian Frank Peel (early 19th century) saw as “men who doffed their caps to no one, and recognized no right in either squire or parson to question or meddle with them.” Turning to some of the specifics of pre-factory system revolt in England, the following from Ashton provides a good introduction: “Following the harvest failure of 1709 the keelmen of the Tyne took to rioting. When the price of food rose sharply in 1727 the tin-miners of Cornwall plundered granaries at Falmouth, and the coal-miners of Somerset broke down the turnpikes on the road to Bristol. Ten years later the Cornish tinners assembled again at Falmouth to prevent the exportation of corn, and in the following season there was rioting at Tiverton. The famine of 1739-40 led to a ‘rebellion’ in Northumberland and Durham in which women seem to have taken a leading part: ships were boarded, warehouses broken open, and the Guild at Newcastle was reduced to ruins. At the same time attacks on corn dealers were reported from North and South Wales. The years 1748 and 1753 saw similar happenings in several parts of the country; and in 1756-7 there was hardly a county from which no report reached the Home Office of the pulling down of corn mills or Quaker meeting-houses, or the rough handling of bakers and grain dealers. 
In spite of drastic penalties the same thing occurred in each of the later dearths of the century: in 1762, 1765-7, 1774, 1783, 1789, 1795, and 1800.” This readiness for direct action informs the strife in textiles, the industry so important to England and to capitalist evolution, where, for example, “discontent was the prevalent attitude of the operatives engaged in the wool industries for centuries,” said Burnley in his History of Wool and Woolcombing. Popular ballads give ample evidence to this, as does the case of rioting London weavers, who panicked the government in 1675. Lipson’s History of the Woollen and Worsted Industries provides many instances of the robustness of domestic textile workers’ struggles, including that of a 1728 weavers’ strike which was intended to have been pacified by a meeting of strike leaders and employers; a “mob” of weavers “burst into the room in which the negotiations were taking place, dragged back the clothiers as they endeavored to escape from the windows, and forced them to concede all their demands.” Or these additional accounts by Lipson: “The Wiltshire weavers were equally noted for their turbulent character and the rude violence with which they proclaimed the wrongs under which they smarted. In 1738 they assembled together in a riotous manner from the villages round Bradford and Trowbridge, and made an attack upon the house of a clothier who had reduced the price of weaving. They smashed open the doors, consumed or spoiled the provisions in the cellar, drank all the wine they could, set the casks running, and ended up by destroying great quantities of raw materials and utensils. In addition to this exploit they extorted a promise from all the clothiers in Melksham that they would pay fifteen pence a yard for weaving…Another great tumult occurred at Bradford (Wiltshire) in 1752. Thirty weavers had been committed to prison; the next day above a thousand weavers assembled, armed with bludgeons and firearms, beat the guard, broke open the prison, and rescued their companions.” Similarly, John Kay was driven from Leeds in 1745 and from Bury in 1753, as outbreaks of violence flared in many districts in response to his invention, the flying shuttle for mechanizing weaving. Wadsworth and Mann found the Manchester Constables Accounts to have reported “great Riots, Tumults, and Disorders” in the late 1740s, and that “After 1750 food riots and industrial disputes grow more frequent,” with outbreaks in Lancashire (the area of their study) virtually every year. These historians further recount “unrest and violence in all parts of the country” in the middle to late 1750s, with Manchester and Liverpool frequently in alarm and “panic among the propertied classes.” After sporadic risings, such as Manchester, 1762, the years 1764-68 saw rioting in almost every county in the country; as the King put it in 1766, “a spirit of the most daring insurrection has in divers parts broke forth in violence of the most criminal nature.” Although the smashing of stocking frames had been made a capital offense in 1727, in a vain attempt to stem worker violence, Hobsbawm counted 24 incidents of wages and prices being forcibly set by exactly this type of riotous destruction in 1766 alone. Sporadic rioting occurred in 1769, such as the anti-spinning jenny outbursts which menaced the inventor Hargreaves and during which buildings were demolished at Oswaldtwistle and Blackburn in order to smash the hated mechanization. A whole new wave began in 1772. 
Sailors in Liverpool, for example, responded to a wage decrease proposal in 1775 by “sacking the owners’ houses, hoisting the bloody flag, and bringing cannon ashore which they fired on the Exchange,” according to Wadsworth and Mann. The very widespread anti-machinery risings of 1779 saw the destruction of hundreds of weaving and spinning devices which were too large for domestic use. The rioters’ sentiments were very widely shared, as evidenced by arrest records that included miners, nailmakers, laborers, joiners—a fair sample of the entire industrial population. The workers’ complaint averred that the smaller machines are “in the Hands of the Poor and the larger ‘Patent Machines’ in the Hands of the Rich,” and “that the work is better manufactured by small [textile machines] than by large ones.” This list, very incomplete as it is, could be easily extended into the many early 19th century outbreaks, all of which seem to have enjoyed great popular support. But perhaps a fitting entry on which to close this sample would be these lines from a public letter written by Gloucestershire shearmen in 1802: “We hear in Formed that you got Shear in mee sheens and if you Don’t Pull them Down in a Forght Nights Time we will pull them Down for you Wee will you Damd infernold Dog.” This brief look at the willfulness of the 18th century proletariat serves to introduce the conscious motivation behind the factory system. Sidney Pollard (The Genesis of Modern Management) recognized the capitalists’ need of “breaking the social bonds which had held the peasants, the craftsmen and the town poor of the eighteenth century together in opposition to the new order.” Pollard saw too the essential nature of the domestic system, that the masters “had to depend on the work performed in innumerable tiny domestic-workshop units, unsupervised and unsupervisable. Such incompatibility,” he concluded, “was bound to set up tensions and to drive the merchants to seek new ways of production, imposing their own managerial achievements and practices in the productive sector.” This underlying sense of the real inadequacy of existing powers of control was also firmly grasped by David Landes (The Unbound Prometheus): “One can understand why the thoughts of employers turned to workshops where the men would be brought together to labour under the watchful overseers, and to machines that would solve the shortage of manpower while curbing the insolence and dishonesty of the men.” According to Wadsworth and Mann, in fact, many employers definitely felt that “the country would perish if the poor—that is, the working classes—were not brought under severe discipline to habits of industry and docile subordination.” Writing on the evolution of the ‘central workshop’ or factory, historian N.S.B. Gras saw its installation strictly in terms of control of labor: “It was purely for purposes of discipline, so that the workers could be effectively controlled under the supervision of foremen.” Factory work itself became the central weapon to force an enemy character into a safe, reliable mold following the full realization that they were dealing with a recalcitrant, hostile working class whose entire morale, habits of work, and culture had to be broken. 
Bowden described this with great clarity: “More directly as a result of the introduction of machinery and of large-scale organization was the subjection of the workers to a deadening mechanical and administrative routine.” Adam Smith, in his classic Wealth of Nations, well understood that the success of industrial capitalism lies with nothing so much as with the division of labor, that is, with ever-increasing specialization and the destruction of versatility in work. He also knew that the division of labor is about much more than the production and allocation of commodities. And certainly the new order is related as much to consumption as to the need to guarantee control of production; in fact, there are those who see its origin almost strictly in terms of market demand for mass production, but who do not see the conscious element here either. In passing, Bishop Berkeley’s query of 1755, “whether the creation of wants be not the likeliest way to produce industry in a people?” is eminently relevant. As Hobsbawm pointed out, the populace was definitely not originally attracted by novelties or standardized products; industrialization gradually enabled production “to expand its own markets, if not actually to create them.” The lure of cheap, identical goods succeeded essentially due to the enforced absence of earlier pleasures. When independence and variety of pursuits were more possible, a different kind of leisure and consumption was the norm. This, of course, was in itself a target of the factory system, “the tendency, so deplored by economists, to work less when food was cheap,” as Christopher Hill put it. Exports, too, were an obvious support of the emerging regime, backed by the systematic and aggressive help of government, another artificial demand mechanism. But the domestic market was at least as important, stemming from the “predisposing condition” that specialization and discipline of labor makes for further ‘progress,’ as Max Weber observed. Richard Arkwright (1732-1792) agreed completely with those who saw the need for consciously spurring consumption, “as to the necessity of arousing and satisfying new wants,” in his phrase. But it is as the developer of cotton spinning machinery that he deserves a special word here, because he is generally regarded as the most prominent figure in the history of the textile industries and even as ‘the founder of the factory system.’ Arkwright is a clear illustration of the political and social character of the technology he did so much to advance. His concern with social control is very evident from his writings and correspondence, and Mantoux (The Industrial Revolution in the Eighteenth Century) discerned that “His most original achievement was the discipline he established in the mills.” Arkwright also saw the vital connection between work discipline and social stability: “Being obliged to be more regular in their attendance on their work, they became more orderly in their conduct.” For his pioneering efforts, he received his share of appropriate response; Lipson relates that in 1767, with “the news of the riots in the neighborhood of Blackburn which had been provoked by Hargreaves’ spinning jenny,” he and his financial backer Smalley, “fearing to draw upon themselves the attention of the machine-wreckers, removed to Nottingham.” Similarly, Arkwright’s Birkacre mill was destroyed by workers in 1779. 
Lipson ably summarizes his managerial contribution: “In coordinating all the various parts of his vast industrial structures; in organising and disciplining large bodies of men, so that each man fitted into his niche and the whole acted with the mechanical precision of a trained army. in combining division of labour with effective supervision from a common centre…a new epoch was inaugurated.” Andrew Ure’s Philosophy of Manufactures is one of the major attempts at an exposition of the factory system, a work cited often by Marx in Capital. Its revealing preface speaks of tracing “the progression of the British system of industry, according to which every process peculiarly nice, and therefore liable to injury from the ignorance and waywardness of workmen, is withdrawn from handicraft control, and placed under the guidance of self-acting machinery.” Examining the nature of the new system, then, we find, instead of domestic craft labor, “industrial labor…[which] imposes a regularity, routine, and monotony…which conflicts…with all the inclinations of a humanity as yet unconditioned into it,” in the words of Hobsbawm. Factory production slowly supplanted that of the domestic system in the face of fierce opposition (discussed below), and workers experienced the feeling of daily entering a prison to meet the new “strain and violence” of work, as the Hammonds put it. Factories often resembled pauper work-houses or prisons, after which they had actually often been modeled; Max Weber saw a strong initial similarity between the modern factory and the Russian serf-labor workshops, wherein the means of production and the workers themselves were appropriated by the masters. The Hammonds’ Town Labourer saw “the depreciation of human life” as the leading fact about the new system for the working classes: “The human material was used up rapidly; workmen were called old at forty.” Possibly just as important was the novel, “inhuman” nature of its domination, as if all “were in the grasp of a great machine that threatened to destroy all sense of the dignity of human life,” as the Hammonds described it. A famous characterization by J.P. Kay (1832) put the everyday subjugation in hard to forget terms: “Whilst the engine runs the people must work—men, women and children are yoked together with iron and steam. The animal machine—breakable in the best case, subject to a thousand sources of suffering—is chained fast to the iron machine, which knows no suffering and no weariness.” Resistance to industrial labor displayed a great strength and persistence, reflecting the latent anti-capitalism of the domestic worker who was “the despair of the masters” in a time when a palpable aura of unfreedom clung to wage-labor. 
Lipson tells us, for example, of Ambrose Crowley, perhaps the very first factory owner and organizer (from 1691), that he showed an obsession with the problem of disciplining his workers to “an institution so alien in its assumptions about the way in which people should spend their lives.” Lewis Paul wrote from his London firm in 1742 that “I have not half my people come to work today and I have no fascination in the prospect that I have to put myself in the power of such people.” In 1757 Josiah Tucker noted that factory-type machinery is highly provocative to the populace who “never fail to break out into Riots and Insurrections whenever such things are proposed.” As we have seen, and as Christopher Hill put it, “Machine-breaking was the logical reaction of free men…who saw the concentration of machinery in factories as the instrument of their enslavement.” A hosiery capitalist, in admitting defeat to the Committee on Woollen Manufacture, tells us much of the independent spirit that had to be broken: “I found the utmost distaste on the part of the men, to any regular hours or regular habits…The men themselves were considerably dissatisfied, because they could not go in and out as they pleased, and go on just as they had been used to do…to such an extent as completely to disgust them with the whole system, and I was obliged to break it up.” The famous early entrepreneurs, Boulton and Watt, were likewise dismayed to find that the miners they had to deal with were “strong, healthy and resolute men, setting the law at defiance; no officer dared to execute a warrant against them.” Wedgwood, the well-known pottery and china entrepreneur, had to fight “the open hostility of his work-people” when he tried to develop division of labor in his workshops, according to Mantoux. And Jewitt’s The Wedgwoods, exposing the social intent of industrial technology, tells us “It was machinery (which) ultimately forced the worker to accept the discipline of the factory.” Considering the depth of workers’ antipathy to the new regimen, it comes as no surprise that Pollard should speak of “the large evidence which all points to the fact that continuous employment was precisely one of the most hated aspects of factory work.” This was the case because the work itself, as an agent of pacification, was perceived ‘precisely’ in its true nature. Pollard later provides the other side of the coin to the workers’ hatred of the job; namely, the rulers’ insistence on it for its own (disciplinary) sake: “Nothing strikes so modern a note in the social provisions of the factory villages as the attempts to provide continuous employment.” Returning to the specifics of resistance, Sir Frederic Eden, in his State of the Poor (1797), stated that the industrial laborers of Manchester “rarely work on Monday and that many of them keep holiday two or three days in the week.” Thus Ure’s tirades about the employees’ “unworkful impulses,” their “aversion to the control and continuity of factory labor,” are reflected in such data as the fact that as late as 1800, spinners would be missing from the factories on Mondays and Tuesdays. Absenteeism, as well as turnover, then, was part of the syndrome of striving to maintain a maximum of personal liberty. 
Max Weber spoke of the “immensely stubborn resistance” to the new work discipline, and a later social scientist, Reinhard Bendix, saw also that the drive to establish the management of labor on “an impersonal, systematic basis” was opposed “at every point.” Ure, in a comment worth quoting at length, discusses the fight to master the workers in terms of Arkwright’s career: “The main difficulty [he faced was] above all, in training human beings to renounce their desultory habits of work, and to identify themselves with the unvarying regularity of the complex automaton. To devise and administer a successful code of factory discipline, suited to the necessities of factory diligence, was the Herculean enterprise, the noble achievement of Arkwright. Even at the present day, when the system is perfectly organized, and its labour lightened to the utmost, it is found nearly impossible to convert persons past the age of puberty, whether drawn from rural or from handicraft occupations, into useful factory hands.” We also encounter in this selection from Ure the reason why early factory labor was so heavily comprised of the labor of children, women, and paupers threatened with loss of the dole. Thompson quotes a witness before a Parliamentary investigative committee, that “all persons working on the power-loom are working there by force because they cannot exist any other way.” Hundreds of thousands clung to the deeply declining fortunes of hand-loom weaving for decades, in a classic case of the primacy of human dignity, which Mathias (The First Industrial Nation) notes “defied the operation of simple economic incentives.” What Hill termed the English craftsmen’s tradition “of self-help and self-respect” was a major source of that popular will which denied complete dominion by capital, the “proud awareness that voluntarily going into a factory was to surrender their birth-right.” Thompson demonstrates that the work rules “appeared as unnatural and hateful restraints” and that everything about factory life was an insult. “To ‘stand at their command’—this was the most deeply resented indignity. For he felt himself, at heart, to be the real maker of the cloth…” This spirit was why, for example, paper manufacturers preferred to train inexperienced labor for the new (post-1806) machine processes, rather than employ skilled hand paper makers. And why Samuel Crompton, inventor of the spinning mule, lamented, relatively late in this period, “To this day, though it is more than thirty years since my first machine was shown to the public, I am hunted and watched with as much never-ceasing care as if I was the most notorious villain that ever disgraced the human form; and I do affirm that if I were to go to a smithy to get a common nail made, if opportunity offered to the bystanders, they would examine it most minutely to see if it was anything but a nail.” The battle raged for decades, with victories still being won at least as late as that over a Bradford entrepreneur in 1822, who tried to secretly install a power-loom but was discovered by the domestic workers. 
“It was therefore immediately taken down, and placed in a cart under a convoy of constables, but the enraged weavers attacked and routed the constables, destroyed the loom, and dragged its roller and warp in triumph through Baildon.” Little wonder that Ure wrote of the requirement of “a Napoleon nerve and ambition to subdue the refractory tempers of work-people.” Without idealizing the earlier period, or forgetting that it was certainly defined by capitalist relationships, it is also true, as Hill wrote, “What was lost by factories and enclosure was the independence, variety and freedom which small producers had enjoyed.” Adam Smith admitted the “mental mutilation” due to the new division of labor, the destruction of both an earlier alertness of mind and a previous “vivacity of both pain and pleasure.” Robert Owen likewise discussed this transformation when he declared, in 1815, that “The general diffusion of manufactures throughout a country generates a new character…an essential change in the general character of the mass of the people.” Less abstractly, the Hammonds harkened back to the early 19th century and heard the “lament that the games and happiness of life are disappearing,” and that soon “the art of living had been degraded to its rudest forms.” In 1819 the reformer Francis Place, speaking of the population of industrial Lancashire, was pleased to note that “Until very lately it would have been very dangerous to have assembled 500 of them on any occasion…Now 100,000 people may be collected together and no riot ensue.” It was as Thompson summarized: gradually, between 1780 and 1830, “the ‘average’ English working man became more disciplined, more subject to the productive tempo of the clock, more reserved and methodical, less violent and less spontaneous.” A rising at the end of this period, the “last Labourers’ Revolt” of agricultural workers in 1830, says a good deal about the general change that had occurred. Similar to outbreaks of 1816 and 1822, much rural property had been destroyed and large parts of Kent and East Anglia were in the rebels’ control. The Duke of Buckingham, reflecting the government’s alarm, declared the whole country to have been taken over by the rioters. But despite several weeks’ success, the movement collapsed at the first show of real force. Historian Pauline Gregg described the sudden relapse into apathy and despair; they were “unused to asserting themselves,” their earlier tradition of vigor and initiative conquered by the generalized triumph of the new order. Also concerning this year as marking a watershed is Mantoux’s remark about Arkwright, that “About 1830 he became the hero of political economy.” Absurd, then, are the many who date the “age of revolution” as beginning at this time, such as the Tillys’ The Rebellious Century, 1830-1930. Only with the defeat of the workers could Arkwright, the architect of the factory system, be installed as the hero of the bourgeoisie; this defeat of authentic rebellion also gave birth to political ideology. Socialism, a caricature of the challenge that had existed, could have begun no other way. The German businessman Harkort wrote in 1844 of the “new form of serfdom,” the diminution of the strength and intelligence of the workers that he saw. 
The American Colman witnessed (1845) nothing less than “Wretched, defrauded, oppressed, crushed human nature, lying in bleeding fragments all over the face of society.” Amazing that another businessman of this time could, in his Condition of the Working Class, glory that the “factory hands, eldest children of the industrial revolution, have from the beginning to the present day formed the nucleus of the Labour Movement.” But Engels’ statement at least contains no internal contradiction; the tamed, defeated factory operative has clearly been the mainstay of the labor movement and socialist ideology among the working class. As Rexford Tugwell admitted in his Industrial Discipline and the Governmental Arts: “When the factory came into existence…work became an indignity rather than a matter for pride… Organized labor has always consented to this entirely uncreative subjection.” Thus “the character structure of the rebellious pre-industrial labourer or artisan was violently recast into that of the submissive industrial worker,” in Thompson’s words; by trade unionism, the fines, firings, beatings, factory rules, Methodism, the education system, the diversion known as ideology—the entire battery of institutions that have never achieved unchallenged success. Thompson recognized the essentially “repressive and disabling” discipline of industrialization and yet, as if remembering that he is a Marxist historian, somehow finds the process good and inevitable. How could the Industrial Revolution have happened without this discipline, he asks, and in fact finds, in the production of “sober and disciplined” workers, “this growth in self-respect (!) and political consciousness” to have been “one real gain” of the transformation of society. If this appears as insanity to the healthy reader, it is wholly consistent with the philosophy of Marx. “Division of labor,” said the young Marx, “increases with civilization.” It is a fundamental law, as is its concomitant, the total victory of the capitalist system. In Volume I of Capital, Marx described the inevitable and necessary “movement of the proletariat”: “In the ordinary run of things, the worker can be left to the action of the natural laws of production, i.e. to his dependence on capital, a dependence springing from, guaranteed, and perpetuated by the very mechanism of production.” Until, as he says elsewhere, on the day of the Revolution the proletariat will have been “disciplined, united, and organized by the very mechanism of production.” Then they will have achieved that state whereby they can totally transform the world; “completely deprived of any self activity” or “real life content,” as the young Marx prescribed. To back-track for a moment, consider the conservative historian Ashton’s puzzlement at such workers as “the west-country weavers who destroyed tenter frames, or of the colliers who frequently smashed the pit gear, and sometimes even set the mines on fire: they must have realized that their action would result in unemployment, but their immediate concern was to assert their strength and inflict loss on stubborn employers. 
There seems to have been little or no social theory in the minds of the rioters and very little class consciousness in the Marxist sense of the term.” This orthodox professor would certainly have understood Marx’s admonition to just such workers, “to direct their attacks, not against the material instruments of production, but against the mode in which they are used.” Marx understood, after all, that “the way machinery is utilized is totally distinct from the machinery itself,” as he wrote in 1846! Similarly, Engels destroyed the logic of the anarchists by showing that the well-known neutrality of technology necessitates subordination, authority and power. How else, he asks, could a factory exist? In fact, Marx and Engels explain worker resistance to “scientific socialism” largely in terms of the survival of artisan-type jobs; those who are the more beaten and subordinated resist it the least. It is historical fact that those closest to the category artisan (“undeveloped”) actually have felt the most capacity to abolish the wage system, precisely because they still exercise some control of work processes. Throughout nearly all his writings, however, Marx managed to return to the idea that, in socialist society, individuals would develop fully in and through their work. But by the third volume of Capital his attitude had changed and the emphasis was upon the “realm of freedom” which “only begins, in fact, where that labor, which is determined by need and external purpose, ceases,” lying “outside the sphere of material production proper.” Thus Marx admits that not even under socialism will the degradation of labor be undone. (This is closely related to the Marxist notion of revolutionary preservation, in which the acquisitions and productivity of the capitalist economic system are not disturbed by proletarian revolution.) The free creation of life is hence banished, reduced to the marginalia of existence much like hobbies in class society. Despite his analysis of alienated labor, much of the explicit core of his philosophy is virtually a consecration of work as tyranny. Durkheim, writing of the late 19th century, saw as the main social problem the need for a cohesive social integration. Much like Marx, who also desired the consolidation and maturation of capitalism, albeit for different reasons, Durkheim thought he found the key in the division of labor. In the need for coordination engendered by the division of labor, he discerned the essential source of solidarity. Today this grotesque inversion of human values is recognized rather fully; the hostility to specialization and its always authoritarian expertise is strongly present. A look at the recent opinion polls, or articles like Fortune’s “The Senseless War on Science” (March, 1971) will suffice. The perennial struggle against integration by the dominant system now continues as a struggle for disintegration, a more and more consciously nihilist effort. The progress of ‘progress’ is left with few partisans, and its enemies with few illusions as to what is worth preserving.

Who Killed Ned Ludd?
────────────────────

In England, the first industrial nation, and beginning in textiles, capital’s first and foremost enterprise there, arose the widespread revolutionary movement (between 1810 and 1820) known as Luddism. The challenge of the Luddite risings–and their defeat–was of very great importance to the subsequent course of modern society. 
Machine-wrecking, a principal weapon, pre-dates this period to be sure; historian Frank Darvall accurately termed it “perennial” throughout the 18th century, in good times and bad. And it was certainly not confined to either textile workers or England. Farm workers, miners, millers, and many others joined in destroying machinery, often against what would generally be termed their own “economic interests.” Similarly, there were the workers of Eupen and Aix-la-Chapelle who destroyed the important Cockerill Works, the spinners of Schmollen and Crimmitschau who razed the mills of those towns, and countless others at the dawn of the Industrial Revolution. Nevertheless, it was the English cloth workers–knitters, weavers, spinners, croppers, shearmen, and the like–who initiated a movement, which “in sheer insurrectionary fury has rarely been more widespread in English history,” as E.P. Thompson wrote, in what is probably an understatement. Though generally characterized as a blind, unorganized, reactionary, limited and ineffective upheaval, this ‘instinctive’ revolt against the new economic order was very successful for a time and had revolutionary aims. Strongest in the more developed areas, the central and northern parts of the country especially, the movement was described by The Times of February 11, 1812 as “the appearance of open warfare” in England. Vice-Lieutenant Wood wrote to Fitzwilliam in the government on June 17, 1812 that “except for the very spots which were occupied by soldiers, the country was virtually in the possession of the lawless.” The Luddites indeed were irresistible at several moments in the second decade of the century and developed a very high morale and self-consciousness. As Cole and Postgate put it, “Certainly there was no stopping the Luddites. Troops ran up and down helplessly, baffled by the silence and connivance of the workers.” Further, an examination of newspaper accounts, letters, and leaflets reveals insurrection as the stated intent; for example, “all Nobles and tyrants must be brought down,” read part of a leaflet distributed in Leeds. Evidence of explicit general revolutionary preparations was widely available in both Yorkshire and Lancashire, for instance, as early as 1812. An immense amount of property was destroyed, including vast amounts of textile frames which had been redesigned for the production of inferior goods. In fact, the movement took its name from young Ned Ludd, who, rather than do the prescribed shoddy work, took a sledge-hammer to the frames at hand. This insistence on either the control of the productive processes or the annihilation of them fired the popular imagination and brought the Luddites virtually unanimous support. Hobsbawm declared that there existed an “overwhelming sympathy for machine-wreckers in all parts of the population,” a condition which by 1813, according to Churchill, “had exposed the complete absence of means of preserving public order.” Frame-breaking had been made a capital offense in 1812 and increasing numbers of troops had to be dispatched, to a point exceeding the total Wellington had under his command against Napoleon. The army, however, was not only spread very thin, but was often found unreliable due to its own sympathies and the presence of many conscripted Luddites in the ranks. Likewise, the local magistrates and constabulary could not be counted upon, and a massive spy system proved ineffective against the real solidarity of the populace. 
As might be guessed, the volunteer militia, as detailed under the Watch and Ward Act, served only to “arm the most powerfully disaffected,” according to the Hammonds, and thus the modern professional police system had to be instituted. Intervention of this nature could hardly have been basically sufficient, though, especially given the way Luddism seemed to grow more revolutionary from event to event. Cole and Postgate described the post-1815 Luddites as more radical than those previous, noting that from this point they “set themselves against the factory system as a whole.” Also, Thompson observed that as late as 1819 the way was still open for a successful general insurrection. Required against what Mathias termed “the attempt to destroy the new society” was a weapon much closer to the point of production, namely the furtherance of an acceptance of the fundamental order in the form of trade unionism.

Though it is clear that the promotion of trade unionism was a consequence of Luddism as much as the creation of the modern police was, it must also be realized that there had existed a long-tolerated tradition of unionism among the textile workers and others prior to the Luddite risings. Hence, as Morton and Tate almost alone point out, the machine-wrecking of this period cannot be viewed as the despairing outburst of workers having no other outlet. Despite the Combination Acts, which were an unenforced ban on unions between 1799 and 1824, Luddism did not move into a vacuum but was successful for a time in opposition to the refusal of the extensive union apparatus to compromise capital. In fact, the choice between the two was available and the unions were thrown aside in favor of the direct organization of the workers and their radical aims. During the period in question it is quite clear that unionism was seen as basically distinct from Luddism and promoted as such, in the hope of absorbing the Luddite autonomy. Contrary to the fact of the Combination Acts, unions were often held to be legal in the courts, for example, and when unionists were prosecuted they generally received light punishment or none whatever, whereas the Luddites were usually hanged. Some members of Parliament openly blamed the owners for the social distress, for not making full use of the trade union path of escape. This is not to say that union objectives and control were as clear or pronounced as they are today, but the indispensable role of unions vis-a-vis capital was becoming clear, illumined by the crisis at hand and the felt necessity for allies in the pacification of the workers.

Members of Parliament in the Midlands counties urged Gravenor Henson, head of the Framework Knitters Union, to combat Luddism–as if this were needed. His method of promoting restraint was of course his tireless advocacy of the extension of union strength. The Framework Knitters Committee of the union, according to Church’s study of Nottingham, “issued specific instructions to workmen not to damage frames.” And the Nottingham Union, the major attempt at a general industrial union, likewise set itself against Luddism and never employed violence. If unions were hardly the allies of the Luddites, it can only be said that they were the next stage after Luddism in the sense that unionism played the critical role in its defeat through the divisions, confusion, and deflection of energies the unions engineered.
It “replaced” Luddism in the same way that it rescued the manufacturers from the taunts of the children in the streets, from the direct power of the producers. Thus the full recognition of unions in the repeal in 1824 and 1825 of the Combination Acts “had a moderating effect upon popular discontent,” in Darvall’s words. The repeal efforts, led by Place and Hume, easily passed an unreformed Parliament, by the way, with much pro-repeal testimony from employers as well as from unionists, with only a few reactionaries opposed. In fact, while the conservative arguments of Place and Hume included a prediction of fewer strikes post-repeal, many employers understood the cathartic, pacific role of strikes and were not much dismayed by the rash of strikes which attended repeal. The repeal Acts of course officially delimited unionism to its traditional marginal wages and hours concern, a legacy of which is the universal presence of “management’s rights” clauses in collective bargaining contracts to this day. The mid-1830’s campaign against unions by some employers only underlined in its way the central role of unions: the campaign was possible only because the unions succeeded so well against the radicality of the unmediated workers in the previous period. Hence Lecky was completely accurate later in the century when he judged that “there can be little doubt that the largest, wealthiest and best-organized Trade-Unions have done much to diminish labor conflicts,” just as the Webbs also conceded in the 19th century that there existed much more labor revolt before unionism became the rule.

But to return to the Luddites, we find very few first-person accounts and a virtually secret tradition, mainly because they projected themselves through their acts, not ideology. And what was it really all about? Stearns, perhaps as close as the commentators come, wrote “The Luddites developed a doctrine based on the presumed virtues of manual methods.” He all but calls them ‘backward-looking wretches’ in his condescension, yet there is a grain of truth here certainly. The attack of the Luddites was not occasioned by the introduction of new machinery, however, as is commonly thought, for there is no evidence of such in 1811 and 1812 when Luddism proper began. Rather, the destruction was leveled at the new slip-shod methods which were ordered into effect on the extant machinery. Not an attack against production on economic grounds, it was above all the violent response of the textile workers (soon joined by others) to their attempted degradation in the form of inferior work; shoddy goods–the hastily-assembled “cut-ups,” primarily–were the root issue at hand. While Luddite offensives generally corresponded to periods of economic downturn, it was because employers often took advantage of these periods to introduce new production methods. But it was also true that not all periods of privation produced Luddism, just as it was true that Luddism appeared in areas not particularly depressed. Leicestershire, for instance, was the least hit by hard times and it was an area producing the finest quality woolen goods; yet it was a strong center of Luddism. To wonder what was so radical about a movement which seemed to demand “only” the cessation of fraudulent work is to fail to perceive the inner truth of the valid assumption, made on every side at the time, of the connection between frame-breaking and sedition.
As if the fight by the producer for the integrity of his work-life can be made without calling the whole of capitalism into question. The demand for the cessation of fraudulent work necessarily becomes a cataclysm, an all-or-nothing battle insofar as it is pursued; it leads directly to the heart of the capitalist relationship and its dynamic. Another element of the Luddite phenomenon generally treated with condescension, by the method of ignoring it altogether, is the organizational aspect. Luddites, as we all know, struck out wildly and blindly, while the unions provide the only organized form to the workers. But in fact, the Luddites organized themselves locally and even federally, including workers from all trades, with an amazing coordination. Eschewing an alienating structure, their organization was without a center and existed largely as an “unspoken code;” theirs was a non-manipulative, community organization which trusted itself. All this, of course, was essential to the depth of Luddism, to the appeal at its roots. In practice, “no degree of activity by the magistrates or by large reinforcements of military deterred the Luddites. Every attack revealed planning and method,” stated Thompson, who also gave credit to their “superb security and communications.” An army officer in Yorkshire understood their possession of “a most extraordinary degree of concert and organization.” William Cobbett wrote, concerning a report to the government in 1812: “And this is the circumstance that will most puzzle the ministry. They can find no agitators. It is a movement of the people’s own.” Coming to the rescue of the authorities, however, despite Cobbett’s frustrated comments, was the leadership of the Luddites. Theirs was not a completely egalitarian movement, though this element may have been closer to the mark than was their appreciation of how much was within their grasp and how narrowly it eluded them. Of course, it was from among the leaders that “political sophistication” issued most effectively in time, just as it was from them that union cadres developed in some cases. In the “pre-political” days of the Luddites–now developing in our “post-political” days, also–the people openly hated their rulers. They cheered Pitt’s death in 1806 and, more so, Perceval’s assassination in 1812. These celebrations at the demise of prime ministers bespoke the weakness of mediations between rulers and ruled, the lack of integration between the two. The political enfranchisement of the workers was certainly less important than their industrial enfranchisement or integration, via unions; it proceeded more slowly for this reason. Nevertheless, it is true that a strong weapon of pacification was the strenuous effort made to interest the population in legal activities, namely the drive to widen the electoral basis of Parliament. Cobbett, described by many as the most powerful pamphleteer in English history, induced many to join Hampden Clubs in pursuit of voting reform, and was also noted, in the words of Davis, for his “outspoken condemnation of the Luddites.” The pernicious effects of this divisive reform campaign can be partially measured by comparing such robust earlier demonstrations of anti-government wrath as the Gordon Riots (1780) and the mobbing of the King in London (1795) with such massacres and fiascoes as the Pentridge and Peterloo “risings.” But to return, in conclusion, to more fundamental mechanisms, we again confront the problem of work and unionism. 
The latter, it must be agreed, was made permanent upon the effective divorce of the worker from control of the instruments of production–and of course, unionism itself contributed most critically to this divorce, as we have seen. Some, certainly including the marxists, see this defeat and its form, the victory of the factory system, as both an inevitable and desirable outcome, though even they must admit that in work execution resides a significant part of the direction of industrial operations even now. A century after Marx, Galbraith located the guarantee of the system of productivity over creativity in the unions’ basic renunciation of any claims regarding work itself.

But work, as all ideologists sense, is an area closed off to falsification. Work activities are the kernel, impervious to the intrusion of ideology and its forms, such as mediation and representation. Thus ideologists ignore the unceasing universal luddite contest over control of the productive processes. Thus class struggle is something quite different to the producer than to the ideologue. In the early trade union movement there existed a good deal of democracy. Widespread, for example, was the practice of designating delegates by rotation or by lot. But what cannot be legitimately democratized is the real defeat at the root of the unions’ victory, which makes them the organization of complicity, a mockery of community. Form on this level cannot disguise unionism, the agent of acceptance and maintenance of a grotesque world. The marxian quantification elevates output-per-hour over creation as the highest good, as leftists likewise ignore the real story of the Luddites (the ending of the direct power of the producers) and so manage, incredibly, to espouse unions as all that “untutored” workers can have. The opportunism and elitism of all the Internationals, indeed the history of leftism, sees its product finally in fascism when accumulated ideological confines bring their result. When fascism can successfully appeal to workers as the removal of inhibitions, as the “Socialism of Action,” etc.–as revolutionary–it should be made clear how much was buried with the Luddites and what a terrible anti-history was begun.

There are those who again fix the label of “age of transition” on today’s growing crisis–hoping all will turn out nicely in another defeat for the luddites. We see today the same need to enforce work discipline as in the earlier period, and simultaneously the same awareness by the population of the meaning of “progress.” But quite possibly we now can recognize all our enemies the more clearly, so that this time the transition can be in the hands of the creators.

Axis Point of American Industrialism
────────────────────────────────────

The 1820’s constituted a watershed in United States life. By the end of that decade, about ten years after the last of the English Luddite risings had been suppressed, industrialism secured its decisive American victory; by the end of the 1830’s all of its cardinal features were definitively present. The many overt threats to the coherence of emerging industrial capitalism, the ensemble of forms of resistance to its hegemony, were blunted at this time and forced into the current of that participation so vital to modern domination.
In terms of technology, work, politics, sexuality, culture, and the whole fabric of ordinary life, the struggles of an earlier, relative autonomy, which threatened both old and new forms of authority, fell short, and a dialectic of domestication, so familiar to us today, broke through. The reactions engendered in the face of the new dynamic in this epoch of its arrival seem, by the way, to offer some implicit parallels to present trends as technological civilization likely enters its terminal crisis: the answers of progress, now anything but new or promising, encounter a renewed legitimation challenge that can be informed, even inspired, by understanding the past.

American “industrial consciousness”, which Samuel Rezneck judged to have triumphed by 1830, 1 was in large measure and from the outset a virtual project of the State. In 1787, generals and government officials sponsored the first promotional effort, the Pennsylvania Society for the Encouragement of Manufactures and the Useful Arts. With Benjamin Franklin as the Society’s official patron, capital was raised and a factory equipped, but arson put an end to this venture early in 1790. Another benchmark of the period was Alexander Hamilton’s Report on the Subject of Manufactures, drafted by his tirelessly pro-factory-technology assistant secretary of the Treasury, Tench Coxe. It is noteworthy that Coxe received government appointments from both the Federalist Hamilton and his arch-rival Jefferson, Republican and career celebrator of the yeoman freeholder as the basis of independent values. While Hamilton pushed industrialization, arguing, for example, that children were better off in mills than at home or in school, 2 Jefferson is remembered as a constant foe of that evil, alien import, manufacturing. To correct the record is to glimpse the primacy of technology over ideological rhetoric as well as to remember that no Enlightenment man was not also an enthusiast of science and technology. In fact, it is fitting that Jefferson, the American most closely associated with the Enlightenment, introduced and promoted the idea of interchangeability of parts, key to the modern factory, from France as early as 1785. 3 Also to the point is Charles V. Hagnar’s remark that in the 1790’s “Thomas Jefferson, […] a personal friend of my father, […] indoctrinated him with the manufacturing fever”, and induced him to start a cotton mill. 4 As early as 1805 Jefferson, at least in private, complained that his earlier insistence on independent producers as the bedrock of national virtue was misunderstood, that his condemnation of industrialism was only meant to apply to the cities of Europe. 5

Political foliage aside, it was becoming clear that mechanization was in no way impeded by government. The role of the State is tellingly reflected by the fact that the “armory system” now rivals the older “American system of manufactures” term as the more accurate to describe the new system of production methods. 6 It is along these lines that Cochran referred to the need for the Federal authority to “keep up the pressure”, around 1820, in order to soften local resistance to factories and their methods. 7 In the ‘twenties, a fully developed industrial lobby in Congress and the extensive use of the technology fair and exhibit—not to mention nationalist pro-development appeals such as that to anti-British sentiment after the War of 1812, and other non-political factors to be discussed below—contributed to the assured ascendancy of industrialization by 1830.
Ranged against the efforts to achieve that ascendancy was an unmistakable antipathy, observed in the references to its early manifestations in classic historical works. Norman Ware found that the Industrial Revolution “was repugnant to an astonishingly large section of the earlier American community”, and Victor S. Clark noted the strong popular prejudice that existed “against factory industries as detrimental to the welfare of the working-people”. 9 Later, too, this aversion was still present, if declining, as a pivotal force. The July 4, 1830 oratory of pro-manufacture Whig Edward Everett contained a necessary reference to the “suffering, depravity, and brutalism” 10 of industrialism—in Europe—for the purpose of deflecting hostility from its American counterpart. Later in the ‘thirties the visiting English liberal Harriet Martineau, in her efforts to defend manufacturing, indicated that her difficulties were precisely her audiences’ antagonism to the subject. 11

Yet despite the “slow and painful” 12 nature of the changeover, and especially the widespread evidence of deep-seated resistance (of which the foregoing citations are a minute sample), there lingers the notion of an enthusiastic embrace of mechanization in America by craftsmen as well as capitalists. 13 Fortunately, recent scholarship has been contributing to a better grasp of the struggles of the early to mid nineteenth century, Merritt Roe Smith’s excellent Harpers Ferry Armory and the New Technology, 14 for example. “The Harpers Ferry story diverges sharply from oft-repeated generalizations that ‘most Americans accepted and welcomed technological change with uncritical enthusiasm’”, 15 Smith declares in his introduction. Suffice it to interject here that no valid separation exists between anti-technology feelings and the more commonly recognized elements of contestation of classes that proceeded from the grounding of that technology; in practice the two strands were and are obviously intertwined. Reference to the “massive and irrefutable” 16 class opposition of early industrialism, or to Taft’s and Ross’s dictum that “The United States has had the bloodiest and most violent labor history of any industrial nation”, 17 finds its full meaning when we appraise both levels of anti-authoritarianism, especially in the watershed period of the 1820’s.

In early 1819 the English visitor William Faux declared that “Labour is quite as costly as in England, whether done by slaves, or by hired whites, and it is also much more troublesome.” 18 Later that year his travel journal further testified to the “very villainous” character of American workers, who “feel too free to work in earnest, or at all, above two or three days in a week”. 19 Indeed, travelers seemed invariably to remark on “the independent manners of the laboring classes”, 20 in slightly softer language. More specifically, dissent by skilled workers, as has often been noted, was the sharpest and most durable. Given the “astonishing versatility of the average native laborer”, 21 however, it is also true that a generalized climate of resistance confronted the impending debasement of work by the factory. Those most clearly identified as artisans give us the clearest look at resistance, owing to the self-reliant culture that was a function of autonomous handicraft production. Bruce Laurie, on some Philadelphia textile craftsmen, illustrates the vibrant pre-industrial life in question, with its blasé attitude toward work.
On a muggy summer day in August 1828 Kensington’s hand loom weavers announced a holiday from their daily toil. News of the affair circulated throughout the district and by mid-afternoon the hard-living frame tenders and their comrades turned the neighborhood avenues into a playground. Knots of lounging workers joked and exchanged gossip […]. The more athletic challenged one another to foot races and games, [and] quenched their thirst with frequent drams. The spree was a classic celebration of St. Monday. 22

It was no accident that mass production—primarily textile factories—first appeared in New England, with its relative lack of strong craft traditions, rather than in, say, Philadelphia, the center of American artisan skills. 23 Traditions of independent creativity obviously posed an obstacle to manufacturing innovation, causing Carl Russell Fish to assay that “Such craftsmen were the only actively dissatisfied class in the country.” 24

The orthodox explanation of industrialism’s triumph stresses the much higher United States wage levels, compared to Europe, and an alleged shortage of skilled workers. These are, as a rule, considered the primary factors that produced “an environment affording every suggestion and inducement to substitute machinery for men”, and which nurtured that “inventiveness and mechanical intuition which are sometimes regarded as a national trait”, in the descriptive phrases of Clark. 25 But the preceding discussion should already be enough to indicate that it was the presence of work skills that challenged the new technology, not their absence. Research shows no dearth of skilled workers, 26 and there is abundant evidence that “the trend toward mechanization came more from cultural and managerial bias than from carefully calculated marginal costs.” 27 Habakkuk’s comparison of American and British antebellum technology and labor economics cites the “scarcity and belligerency of the available skilled labour”, 28 and we must accent the latter quality, while realizing that scarcity can also mean the ability to make oneself scarce—namely, the oft-remarked high turnover rates. 29 It was industrial discipline that was missing, especially among craftsmen. At mid century Samuel Colt confided to a British engineering group that “uneducated laborers” made the best workers in his new mass-production arms factory because they had so little to unlearn; 30 skills—and the recalcitrance accompanying them—were hardly at a premium.

Strikes and unionization (though certainly not always linked) became common from 1823 forward, 31 and the modern labor movement showed particular vitality during the militant “great uprising” period of 1833-37. 32 However, especially by the ‘thirties, these struggles (largely for shorter hours, secondarily over wages) were essentially situated within the world of a standardizing, regimenting technology, predicated on the worker as a component of it. And although this distinction is not total, it was the “unorganized” workers who mounted the most extreme forms of opposition, Luddite in many instances, contrary to the time-honored wisdom that Luddism and America were strangers. Gary Kulik’s excellent scholarship on industrial Rhode Island determined that in Pawtucket alone more than five arson attempts were made against cotton-mill properties, and that the deliberate burning of textile mills was far from uncommon throughout early-nineteenth-century New England, declining by the ‘thirties. 33
Jonathan Prude reached a similar conclusion: “Rumors abounded in antebellum New England that fires suffered by textile factories were often of ‘incendiary origin’.” 34 The same reaction was felt in Philadelphia, albeit slightly later: “Several closely spaced mill burnings triggered cries of ‘incendiarism’ in the 1830’s, a decade of intense industrial conflict.” 35 The hand-sawyers who burned Oliver Evans’s new steam mill at New Orleans in 1813 36 also practiced machine-wrecking by arson, like their Northeastern cousins, and shortly thereafter Massachusetts rope-makers attacked machine-made yarn, boasting that their handspun product was stronger. 37 Sailors in New York often inflicted damage on vessels during strikes, according to Dulles, who noted: “The seamen were not organized and were an especially obstreperous lot.” 38

Though its impact, as with resistance in general, declined after the ‘twenties, Luddite-type violence continued. The unpopular superintendent of the Harpers Ferry Armory, Thomas Dunn, 39 was shot dead in his office in early 1830 by an angry craftsman named Ebenezer Cox. Though Cox was hanged for his act, he was a folk hero among the Harpers Ferry workers, who hated Dunn’s emphasis on supervision and factory-type discipline, and never tired of citing Dunn’s fate as a blunt reminder to superintendents of what could be expected if they became overzealous in executing their duties and impinged on the traditional freedoms of employees. 40 Construction laborers, especially in railroad work, frequently destroyed property; Gutman provides an example from 1831 in which about three hundred of them punished a dishonest contractor by tearing up the track they had built. 41 The destructive fury of Irish strikers on the Chesapeake and Ohio Canal in 1834 occasioned the inaugural use of Federal troops in a labor dispute, on orders of Andrew Jackson. And in the mid ‘thirties anti-railroad teamsters still waylaid trains and shot at their crews from ambush. 42 In the Philadelphia handloom weavers’ strike of 1842, striking artisans used machine breaking, intimidation, destruction of unwoven wool and finished cloth, house wrecking, and threats of even worse violence. During this riotous struggle, weavers marched on a water-powered, mass-production mill to burn it; the attack was driven off, with two constables wounded. 43 Returning to the New England textile mills and incendiary Luddism, Prude describes the situation after 1840: “Managers were rarely directly challenged by their hands; and although mills continued to burn down, contemporaries did not as quickly assume that workers were setting the fires.” 44

Looking for social-political reasons for the culture of industrialism, one finds that official efforts to domesticate the ruled via the salutary effects of factory labor date back to the mid seventeenth century. The costs of poor relief led Boston officials to put widows and orphans to work, beginning in 1735, in what amounted to a major experiment to inculcate habits of industry and routine. But even threats of denial of subsistence aid failed to establish industrial discipline over irregular work habits and independent attitudes. 45 Artisanal (and agricultural) work was far more casual than that regimented by modern productionist models. Unlike that of the factory, for example, it could almost always be interrupted in favor of an encounter, an adventure, or simply a distraction.
This easy entry to gaming, drinking, personal projects, hunting, extended and often raucous revelry on a great variety of occasions, among other interruptions, was a preserve of independence from authority in general. And, on the other hand, the regulation and monotony that adhere to the work differentiation of industrial technology combat such casual, undomesticated tendencies. Division of labor embodies, as an implicit purpose, the control and domination of the work process and those tied to it. Adam Smith saw this, and so did Tocqueville, in the 1830’s: “As the principle of the division of labor is ever more completely applied, the workman becomes weaker, more limited, and more dependent. […] Thus, at the same time that industrial science constantly lowers the standing of the working class, it raises that of the masters.” 46 This subordination, including its obvious benefit, social control, was widely appreciated, especially, but not exclusively, by the early industrialists. Manufacturers, with unruliness very visible to them, came quickly to identify technological progress with a more subdued populace. In 1816 Walton Felch, for instance, claimed that the “restless dispositions and insatiate prodigality” of working people were altered, by “manufacturing attendance”, into patterns of regularity and calmness. 47 Another New England mill-owner, Smith Wilkinson, judged in 1835 that factory labor imposed a “restraining influence” on people who “are often very ignorant, and too often vicious”. 48 The English visitor Harriet Martineau, introduced above, was of like mind in the early ‘forties: “The factories are found to afford a safe and useful employment for much energy which would otherwise be wasted and misdirected.” She determined that, unlike the situation that had prevailed before the introduction of manufactures, “now the same society is eminently orderly. […] disorders have almost entirely disappeared.” 49

Eli Whitney provides another case in point of the social designs inhering in mechanization, namely that of his Mill Rock armory, which moved from craft shop to factory status during the period of the late 1790’s to Whitney’s death in 1825. Long associated with the birth of the “American System” of interchangeable-parts production, he was thoroughly unpopular with his employees for the regimentation he developed via increasing division of labor. His penchant for order and discipline was embodied in his view of Mill Rock as a “moral gymnasium”, where “correct habits” 50 of diligence and industry were inculcated through systematic control of all facets of the work day. 51

As skill levels were forcibly reduced, the art of living was also purposefully degraded by the sheer number of hours involved in industrial work. Emerson, usually thought of in terms of a vague philosophy of human possibilities, applauded the suppression of potential enacted by the work hours of 1830’s railroad building. He observed the long, hard construction shifts as “safe vents for peccant humors; and this grim day’s work of fifteen or sixteen hours, though deplored by all humanity of the neighborhood, is a better police than the sheriff and his deputies.” 52 A hundred years later Simone Weil supplied a crucial part of the whole equation of industrialization: “No one would accept two daily hours of slavery.
To be accepted, slavery must be of such a daily duration as to break something in a man.” 53 Similar is Cochran’s more recent (and more conservative) reference to the twelve-hour day, that it was “maintained in part to keep workers under control”. 54 Pioneer industrialist Samuel Slater wondered, in the 1830’s, whether national institutions could survive “amongst a people whose energies are not kept constantly in play by the pursuit of some incessant productive employment”. 55 Indeed, technological “progress” and the modern wage-slavery accompanying it offered a new stability to representative government, owing essentially to its magnified powers for suppressing the individual. Slater’s biographer recognized that “To maintain good order and sound government, [modern industry] is more efficient than the sword or bayonet.” 56 A relentless assault on the worker’s historic rights to free time, self-education, craftsmanship and play was at the heart of the rise of the factory system; “increasingly, a feeling of degradation spread among factory hands”, according to Rex Burns. 57 By the mid ‘thirties a common refrain in the working-class press was that the laborer had been debased “into a necessary piece of machinery”. 58 Assisted by sermons, a growing public school system, a new didactic popular literature, and other social institutions that sang the praises of industrial discipline, the factory had won its survival by 1830. From this point on, and with increasing visibility by the end of the ‘thirties, conditions worsened and pay decreased. 59 No longer was there a pressing need to lure first-time operatives into industrialized life, and curry their favor with high wages and relatively light duties. Beginning before 1840, for example, the pace of work in textile mills was greatly speeded up, facilitated also by the first major immigration influx, that of impoverished Irish and French Canadians. 60 Henry Clay asked: “Who has not been delighted with the clockwork movements of a large cotton factory?”, 61 reminding us that concomitant with such regimentation was the spread of a new conception of time. Although certainly things did not always go “like clockwork” for the industrialists—"punctuality and absenteeism remained intractable problems for management” throughout the first half of the nineteenth century, 62 for example—a new, industrial time, against great resistance, made gradual headway. In the task-oriented labors of artisans and farmers, work and play were freely mixed; a constant pace of unceasing labor was the ideal not of the mechanic, but of the machine—more specifically, of the clock. The largely spontaneous games, fairs, festivals and excursions gave way, along with working at one’s own pace, to enslavement to the uniform, unremitting technological time of the factory whistle, centralized power and unvarying routine. For the Harpers Ferry armorers early in the century, the workshops opened at sunrise and closed at sunset, but they were free to come and go as they pleased. They had long been accustomed to controlling the duration and scheduling of their tasks, and “the idea of a clocked day seemed not only repugnant but an outrageous insult to their self-respect and freedom.” 63 Hence, the opposition to 1827 regulations that installed a clock and announced a ten-hour day was bitter and protracted. For those already under the regimen of factory production, struggles against the alien time were necessarily of a lingering, rear-guard character by the late ‘twenties. 
An interesting illustration is that of Pawtucket, Rhode Island, a mill village whose denizens built a town clock by public subscription in 1828. 64 In their efforts to counter the monopoly of recording time which had been the mill-owner’s factory bell, one can see that by this time the whole level of contestation had degenerated: the issue was not industrial time itself, but merely the democratization of its measurement. The clock, favorite machine of the Enlightenment, is a master device in the depiction of American political economy by Thoreau and others. Its function is decisive because it links the industrial apparatus with consciousness. 65 It is fitting that clockmaking, along with gun manufacture, was a model of the new technology; the United States led the world in the production of inexpensive time-pieces by the 1820’s, a testimony to the encroaching industrial value system—and the marked anxiety about the passage of time that was part of it. 66 Though even in the first decades of the Republic there was a permanent operative class in at least three urban centers of the Mid Atlantic seaboard, 67 industrialization began in earnest with New England cloth production twenty years after the Constitution was adopted. For example, forty-one new woolen mills were built in the United States, chiefly along New England streams, between 1807 and 1813. 68 The textile industry selected the most economically deprived areas, and with cheery propaganda and, initially, relatively good working conditions, enticed women and children (who had no other options) into the mills. That they “came from families which could no longer support them at home” 69 means that theirs was essentially forced labor. In 1797 Obadiah Brown, in a letter to a partner regarding the selection of a mill site, determined that “the inhabitants appear to be poor, their homes very much on the decline. I apprehend it might be a very good place for a Cotton Manufactory, Children appearing very plenty.” 70 “In collecting our help”, a Connecticut mill-owner said thirty years later, “we are obliged to employ poor families and generally those having the greatest number of children.” 71 New England factory-cloth output increased from about 2.4 million yards in 1815 to approximately 13.9 million yards in 1820, and the shift of weaving from home to factory was virtually completed by 1824. 72 Despite arson, absenteeism, stealing and sabotage persisting with particular emphasis into the ‘thirties, 73 the march of industrialization proceeded in textiles as elsewhere. If, as Inkeles and Smith 74 (among others) have contended, a prime element of modernity is the amount of time spent in factories, the 1820’s was indeed a watershed. “Certainly by 1825 the first stage of the industrialization of the United States was over”, 75 in Cochran’s estimation. In 1820, factories were capitalized to $ 50,000,000; by 1840, to $ 250,000,000, and the number of people working in them had more than doubled. 76 Also by the ‘twenties the whole direction of specialized bureaucratic control, realized a generation later in such large corporations as the railroads, had already become clear. 77 As the standardizing, quasi-military machine replaced the individual’s tools, it provided authority with an invaluable, “objective” ally against “disorder”. 
Not coincidentally, modern mass politics also labored to implant itself in the ‘twenties: political hegemony, as a necessary part of social power, had also failed to fully resolve the issue in its favor in the struggles of the early Republic. 78 Conflict of all kinds was rampant, and a “terrible precariousness”, in Page Smith’s phrase, 79 characterized the cohesion of national power. In fact, by the early ‘twenties a virtual breakdown of the legitimacy of traditional rule by informal elites was underway and a serious re-structuring of American politics was required. Part of the re-structuring dealt with law, in a parallel to the social meaning of technology: “neutral” universal principles came to the fore to justify increased coercion. Modern bourgeois society was forced to rely on an increasingly objectified legal system, which reflected, at base, the progress of division of labor. It must, in David Grimsted’s words, “elevate law because of what it is creating and what it has to destroy”. 80 By the time of Jackson’s ascendancy in the late ‘twenties, America had become largely a government of laws not men (though juries mitigated legality), despite the unpopularity of this development as seen, for example, in the widespread scorn of lawyers. 81

Along with the need to mobilize the lower orders into industrial work, it was important to greatly increase political participation in the interests of legitimizing the whole. Although by the mid ‘twenties almost every state had extended the franchise to include all white males, the numbers of voters remained very low during the decade. 82 By this time newspapers had proliferated and were playing a key role in working toward the critical integration achieved with Jackson and new, mass-political machinery. In 1826, a workingman was chosen for the first time as a mayoral candidate in Baltimore, explicitly in order to attract workingmen’s participation, 83 an early example of a necessary part of moving away from narrow-based, old-style rule. However, John Quincy Adams, who had become President in 1825, “failed to comprehend that voters needed at least the appearance of consultation and participation in making decisions.” 84 A conservative and a nationalist, he was at least occasionally candid: as he told Tocqueville, there is “a great equality before the law, [which] ceases absolutely in the habits of life. There are upper classes and working classes.” 85

Following Adams, the election of Andrew Jackson in 1828 symbolized and accelerated a shift in American life. At the moment that mechanization was securing its domination of life and culture, the Jacksonian era signalled the arrival of professional politics and a crucial diversion of the remaining potentially dangerous energies. Embodying this domestication in his successful appeal to the “common man”, the old General was in reality a plantation-owner, land speculator and lawyer, whose first case in 1788 defended the interests of Tennessee creditors against debtors. He reversed the decline in executive strength that had plagued his three predecessors, essentially renewing State power by a direct appeal to the working classes for the first time in United States history. The mob at the 1829 White House inaugural, celebrated in history text-books with its smashing of china and trampling on the furniture, did in fact “symbolize a new power”, in Curti’s phrase 86—a power tamed and delivering itself to government.
Jackson’s “public statements address a society divided into classes invidiously distinguished and profoundly antagonistic.” 87 And yet, employing the Jeffersonian argot, he regularly identified the class enemy in misleading terms as the money power, the moneyed aristocracy, etc. By the presidential contest of 1832 the gentleman-leader had certainly been rendered an anachronism, 88 in large part via the use of class-oriented rhetoric. In Jackson’s second term, after he had been overwhelmingly re-elected on the strength of his attacks on the Bank of the United States, 89 he vetoed the re-chartering of the Bank in the most popular act of his administration. Although many conservatives feared that Jackson’s policies and conduct would result in a “disastrous, perhaps a fatal” revolution, 90 that the Jacksonians “had raised up forces greater than they could control”, 91 the Bank proved a safe target for the Jacksonian project of deflecting popular anger. As Fish noted, “hostility was merely keenest against banks; it existed against all corporations.” 92 Thus, the “Monster” Bank, which did reap outrageous profits and openly purchased members of Congress, was inveighed against as the incarnation of aristocracy, privilege and the spirit of luxury, while, missing the essential point, Daniel Webster and others warned against such inflaming of the poor against the rich. 93 Needless to say, the growth of an enslaving technology was never attacked; rather, as Bray Hammond maintained, Jackson represented “a blow at an older set of capitalists by a newer, more numerous set”. 94 And meanwhile, along with the phrase-making of this “frontier democrat”, class distinctions widened, and tensions increased, minus the means to successfully overcome them. In the mid ‘thirties various workers’ parties also sprang up. Many were far from totally proletarian in composition, and few went much further than Jacksonian Democracy, in their denunciations of the “monopolists” and such demands as free public schools and equality of “opportunity”. This political workerism only advanced the absorption of working people into the new political system and displayed, for the first time, the now familiar interchangeability of labor leader and politician. But integration was not accomplished smoothly or automatically. For one thing, political insurrection was a legacy of the eighteenth century: from Bacon’s Rebellion in Virginia (1675), by 1760 there had been eighteen uprisings aimed at overthrowing colonial governments, 95 and more recently there had appeared Shays’s Rebellion in Massachusetts (1786-87), the Whiskey Rebellion in Western Pennsylvania (1794), and Fries’s Rebellion in Eastern Pennsylvania (1798-99). Twenty-five years after the Constitution was signed, extensive anti-Federalist rioting in Baltimore seemed to connect with this legacy, rather than to less authentic political alternatives to the old informal means of social control. Significantly, over the course of the summer 1812 upheavals, the composition of the mob shifted toward an exclusively proletarian, unpropertied make-up. 96 Moving into the period under particular scrutiny, the depth of general contestation is somewhat reflected by a most unlikely revolt, that of a “vicious cadet mutiny” at West Point in 1826. 
On Christmas morning in that year, “drunken and raging cadets endeavored to kill at least one of their superior officers and converted their barracks into a bastion which they proposed to defend, armed, against assault by relieving Regular Army troops on the Academy reservation.” 97 The fury of this amazing turn of events, though detailed in much Board of Inquiry and courts-martial testimony, remains a little-known episode in United States history; it can be seen to have introduced a whole chapter of wholesale tumult, nonetheless. By the late ‘twenties group violence had reached great prominence in American life, such that within a few years “many Americans had a strong sense of social disintegration”. 98 The annual New York parade of artisans in November 1830 was another incident that told a great deal about the mounting unruliness. Printers, coopers, furniture-makers and a great many other tradesmen assembled at the culmination of the procession, to hear speeches expressing the usual Republican virtues. But on this day politicians mouthing the old ritual phrases about political freedom and the dignity of labor were suddenly confronted by curses, scuffling and a defiant temper. “As the militia tried to quiet the militants, the dissatisfied crowd knocked out the support of the scaffolding, causing the entire stage to crash to the ground”, 99 and bringing the ceremonies to an undignified end.

The public violence of the ‘thirties was more a prolonged aftershock, however, than a moment of revolutionary possibility. For the reasons given above, the triumph of industrial technology was a fact by the end of the ‘twenties, and the ensuing aftermath, though major, could not be decisive. But it is true that, by Hammett’s reckoning, “A climate of disorder prevailed, […] which seemed to be moving the nation to the edge of disaster.” 100 As Page Smith described urban life in the early ‘thirties, “What is hard to comprehend today is the constant ferment of social unrest and bitterness that manifested itself almost monthly in violent riots and civic disorders.” 101 Gilje’s research revealed “nearly 200 instances of riot between 1793 and 1829 in New York City alone”, 102 for example, and Weinbaum counted 116 in that city just in the period of 1821 to 1837. 103 Philadelphia, Baltimore and Boston witnessed outbreaks on a similar scale, often directed at bankers and “monopolists”. Michael Chevalier wrote a chapter entitled “Symptoms of Revolution”, against the backdrop of four days of rioting in Baltimore over exploitative practices of the Bank of Maryland in the summer of 1835. 104 Also in that year, disorders that caused Jackson to increasingly resort to the use of Federal troops occasioned William Ellery Channing’s report from Boston: “The cry is, ‘Property is insecure, law a rope of sand, and the mob sovereign.’” 105 Likewise, the Boston Evening Journal pondered the “disorganizing, anarchical spirit” of the times in an August 7, 1835 editorial. February 1836 saw hundreds of debtor farmers attack and burn offices of the Holland Land Company in Western New York. 106 During 1836 and 1837 crowds in New York City broke into warehouses several times, furious over high food, rent and fuel prices. The Workingmen’s Party in New York, known as the Locofoco Party, has been linked with these “flour riots”, but, interestingly, at the February 1837 outburst most closely tied to Locofoco speech-making, of fifty-three rioters arrested none was a party member. 107
Despite the narrow chances for the ultimate success of the uprisings of the ‘thirties, it is impossible to deny the existence of deep and bitter class feelings, of the notion that the promise of equality contained in the Declaration of Independence was mocked by reality. Serious disturbances continued: the 1838 “Buckshot War”, in which Harrisburg was seized by an irate, armed crowd in a Pennsylvania state-election dispute, for example; the “Anti-rent” riots by New York tenants of the Van Rensselaer family in 1839; the “Dorr War” of 1842 (somewhat reminiscent of the independent “Indian Stream Republic” of 1832-35 in New Hampshire), in which thousands in Rhode Island approached civil war in a fight over rival state constitutions; and the sporadic anti-railroad riots in the Kensington section of Philadelphia from 1840 to 1842 were among major hostilities. But ethnic, racial and religious disputes began fairly early in the decade to supersede class-conscious struggles, though disparate elements often coexisted on the same occasions. This decline in consciousness was manifested largely in anti-Irish, anti-abolitionist and anti-Catholic riots, and must be seen in the context of the earlier, principal defeat of working people by the factory system, in the ‘twenties. Cut off from the only terrain on which challenge could gain basic victories, could change life, the upheaval in the ‘thirties was destined to sour. Characteristically, the end of the ‘thirties saw both the professionalization of urban police forces and organized gang violence in place as permanent fixtures.

If by 1830 virtually every aspect of American life had undergone major alteration, the startling changes in drinking habits shed particular light on the industrialism behind this transformation. The “great alcoholic binge of the early nineteenth century”, and its precipitous decline in the early ‘thirties, have much to say about how the culture of the new technology took shape. Drinking, on the one hand, was a part of the pre-industrial blurring of the distinction between work and leisure. On into the early decades of the century, small amounts of alcohol were commonly consumed throughout the day, at work and at home (sometimes the same place); reference has been made above to the frequent, spontaneous holidays of all kinds, and the widespread observance of “blue Mondays” or three-day weekends, “which run pretty well into the week”, according to one complaining New York employer. 109 Drinking was the universal accompaniment to these parties, celebrations and extended weekends, as it was to the normal work-day. The tavern or grog-shop, with its “unstructured, leisurely, and wholly unproductive, even anti-productive, character”, 110 was a social center well-suited to a non-mechanized age, and in fact became more than ever the workingman’s club as modernization cut him off from other emotional outlets. 111 But drunkenness—binge-drinking and solitary drinking, most importantly—was increasing by 1820; significantly, alcoholic delirium, or delirium tremens, first appeared in the United States during the ‘twenties. 112 Alcoholism is an obvious register of strains and alienation, of the inability of people to cope with the burden of daily life which a society places on them. Clearly, there is little healthy or resistant about the resort to such drinking practices.
Temperance reform was a part of the larger syndrome of social disciplining expressed in industrialization, as irregular drinking habits were an obstacle to a well-managed population. Not surprisingly, factory-owners were in the forefront of such efforts, having to contend with troublesome wage-earners who had little taste for such dictums as “the steady arm of industry withers from drink”. 113 Tyrrell’s examination of Worcester, Massachusetts, also found that “the leading temperance reformers were those with a hand in the work of inventions and of innovations in factory and machine production”. 114 While at one point workers considered a daily-liquor issue a non-negotiable right and an emblem of their independence, increasing reliance on alcohol signified the debility that went along with their domination by machine culture. The Secretary of War estimated in 1829 that “three-quarters of the nation’s laborers drank daily at least 4 ounces of distilled spirits”, 115 and in 1830 the average annual consumption of liquor exceeded five gallons, nearly triple the amount one hundred and fifty years later. 116 The anti-alcohol crusade began in earnest in 1826 with the formation of the American Temperance Society, and other local groups such as the Society in Lynn (Massachusetts) for the Promotion of Industry, Frugality and Temperance (also 1826). In the same year Beecher wrote his Six Sermons on Intemperance, the leading statement of antidrinking of the period, which pronounced tippling to be politically dangerous. In Gusfield’s excellent summation, Beecher’s writings “displayed the classic fear the creditor has of the debtor, the propertied of the propertyless, and the dominant of the subordinate—the fear of disobedience, renunciation, and rebellion.” 117 Temperance exertions in the ‘twenties revealed in their propaganda the tenuous influence that the respectable held over the laboring classes during the height of the battle to establish industrial values and a predictable workforce. As this battle was won, drinking suddenly leveled off at the end of the ‘twenties and began to plummet in the early ‘thirties toward an unprecedented low. 118 As working people became domesticated, the temperance movement shifted toward the goal of complete abstinence, and in the ‘forties a “dry” campaign swept the nation. 119 The other major reform movement, also arising in the mid ‘twenties, was for a public school system, and like the temperance campaign it was explicitly undertaken to “make the dangerous classes trustworthy”. 120 The concept of mass schooling had arrived by the early Jacksonian period, when innovative forms of coercion were demanded by deteriorating restraints on social behavior, and auxiliary institutions came to the aid of the factory. The “willingness of early nineteenth-century school promoters to intervene directly and without invitation in the lives of the working class” 121 was a consequence of the notion that education was something the ruling orders did to the rest to make them orderly and tractable. Thus “the first compulsory schools were alien institutions set in hostile territory”, 122 as Katz put it, owing largely to the spirit of autonomy and egalitarianism that parents had instilled in their children. Faux noted, in 1819, the “prominent want of respect for rule and rulers”, which he connected with a common refusal of “strict discipline” in schools; 123 Marryat’s diary reported that students “learn precisely what they please and no more”. 
124 Drunkenness and rioting occurred in schools as well as in the rest of society, and educators interpreted the overall situation as announcing general subversion; in an 1833 address on education, John Armstrong declared: “When Revolution threatens the overthrow of our institutions, everything depends upon the character of the people.” 125 Industrial morality - obedience, self-sacrifice, restraint and order - constituted the most important goal of public education; character was of far greater importance than intellectual development. 126 The school system came into existence to shape behavior and attitudes, and thus reinforce the emerging world. The belief that attendance should be universal and compulsory followed logically from assumptions about its importance. 127 Moral instruction was also amplified by the churches during the ‘twenties and ‘thirties, an antidote to that tendency to “rejoice in casting off restraints & unsettling the foundations of social order”, 128 woefully recorded by the Reverend Charles Hall. Sunday schools and societies for the diffusion of religious tracts were two new ecclesiastical contributions to social control in this period. The Jacksonian period is also synonymous with the “Age of the Asylum”, a further development in the quest for civic docility. The regularity and efficiency of the factory was the model for the penitentiaries, insane asylums, orphanages and reformatories that now appeared. 129 Embodying uniformity and regularity, the factory was indeed the model, as we have seen, for the whole of society.

Religious revivalism and millenarianism grew in strength after the mid ‘twenties, and one of the new denominations to appear was the Millerites (today’s Seventh-day Adventists). On October 22, 1844, the group gathered to await what they predicted would be the end of the world. Their expectation was but the most literal manifestation of a feeling that began to pervade the country after 1830; 130 without unduly elevating the pre-industrial past, one can recognize the lament for a world that was indeed ended.

The early stages of industrial capitalism introduced a sharpened division between the worlds of work and home, male and female, and private and public life, with large extended families eroding toward small, isolated nuclear families. Along with this process of increasing separation and isolation came a focused repression of personal feelings, stemming from new requirements for rationalized, predictable behavior. As planning and organization moved ahead via the progress of the machine model of the individual, the range of human sentiments became suspect, a target for suppression. For example, whereas in 1800 it was not considered “unmanly” for a man to weep openly, by the ‘thirties a proscription against any extreme emotional display, especially crying, was gaining strength. 131 Similarly, in child training this tendency became very pronounced; in the widely distributed Advice to Christian Parents (1839), the Reverend John Hersey emphasized that “In every stage of domestic education, children should be disciplined to restrain their appetites and desires.” 132 The seventeenth-century Puritans were hardly “puritanical” about sexual matters, and eighteenth-century American society, especially in the latter part of the century, was characterized by very open sexuality; 133 during the seventeenth and eighteenth centuries, moreover, much emphasis was placed on the arousal, pleasure and satisfaction of women.
Aristotle’s Master Piece, for example, was a very popular work of erotica and anatomy in the eighteenth and early nineteenth centuries, predicated on the sexual interest of women. There were at least one hundred editions of the book prior to 1830—and no known complaints about it in any newspapers or periodicals. 134 In 1831, the year that the last edition of Aristotle’s Master Piece was published, J. N. Bolles’s Solitary Vice Considered appeared, an anti-masturbation booklet of a type that would proliferate from the early ‘thirties on. 135 While the advice books on sex of the early part of the century could be quite explicit concerning women’s sexual satisfaction, the trend was that “medical, biological, instructional, and popular literature contained countless defenses of extreme modern moderation and self-control”. 136 The turning-point, again, in this area as elsewhere, was the ‘twenties. By the ‘forties the very idea of women’s sexuality was becoming virtually erased. In the middle years of the century Dr William Acton’s Functions and Disorders of the Reproductive Organs was a popular standby; it summed up the official view on the subject thus: “The majority of women (happily for them) are not very much troubled with sexual feelings of any kind. What men are habitually, women are only exceptionally.” 137 Among working and non-white women (not exclusive categories, obviously) this ideology had less impact than among those of higher station, for whom the relentless quelling of the recognition of “animal passions” caused vast physical and psychological damage. 138 The cult of female purity, or cult of the lady, or “true womanhood”, emerged among the latter in the ‘thirties, stressing piety and domesticity. 139 This American woman was now exclusively a consumer of her husband’s income, at a period when advertising developed on a scale and sophistication unique in the world. Not surprisingly, national expansionist policy came into its own now, too. The claim of hemispheric rights proclaimed in late 1823—the Monroe Doctrine - coincided with the beginnings of real Indian genocide, both occurring, of course, against the backdrop of a gathering industrial ethos. The Seminoles and Creeks were crushed at this time, an answer to the “especially menacing” specter of a combined Indian and runaway-slave coalition: the First Seminole War was in large part undertaken “to secure Indian lands and therewith deny sanctuary to runaway slaves”. 140 From 1814 to 1824, Jackson had been “the moving force behind southern Indian removal”, 141 a policy inherited from Jefferson and one which he completed upon becoming President in 1828. Indian destruction, surely one of the major horror tales of the modern age, was more than an ugly stain on American politics and culture; indeed, Rogin’s argument that its scope “defines for America the stage of primitive capitalist accumulation” 142 is at least partly true. At the very least it presaged the further acquisitiveness that blossomed in the Manifest Destiny conquest spirit of the ‘forties. But the more monstrous perhaps is its moral dimension, committed under Jackson’s description of “extending the area of freedom”. 143 The Red Man, as Noble Savage, had to disappear; he was “savage”, after all. 
The Dead Indian is obviously a more apt symbol for the trajectory of industrial capitalism, though the romantic use of the Indian reached its height at the moment of capital’s victory, when, by the ‘thirties, Nature truly became an evil to be subdued, while the machine was the fountainhead of all values that counted. Nevertheless, voices and symbols of opposition survived. Johnny Appleseed (John Chapman), for instance, was respected by the Indians during the first forty years of the century, and represents riches of a wholly non-productionist, non-commodity type. There were such doubters of the period as Thoreau, Hawthorne, Poe and Melville. Lee Clark Mitchell, among other contemporary scholars, has found, in letters, diaries and essays, the record of a popular sense of deep foreboding about the conquest of the wilds by technological progress. 144 The victories of the dominant order have certainly never completely erased this alternative spirit of refusal, a spirit renewing itself today.

The Practical Marx
──────────────────

Karl Marx is always approached as so many thoughts, so many words. What connection is there between lived choices—one’s willful lifetime—and the presentation of one’s ideas? Marx in his dealings with family and associates, his immediate relations to contemporary politics and to survival, the practical pattern and decisions of a life: this is perhaps worth a look. Despite my rejection of basic conceptions he formulated, I aim not at character assassination in lieu of tackling those ideas, but at reminding myself and others that our many compromises and accommodations with a grisly world are the real field of our effort to break free, more so than stating our ideas. It is in disregarding abstractions for a moment that we see our actual equality, in the prosaic courses of our common nightmare. A brief sketch of the “everyday” Marx, introducing the relationship between his private and public lives as a point of entry, may serve to underline this. By 1843 Marx had become a husband and father, roles predating that of Great Thinker. In this capacity, he was to see three of his six children die, essentially of privation. Guido in 1850, Francesca in 1852, and Edgar in 1855 perished not because of poverty itself, so much as from his desire to maintain bourgeois appearances. David McLellan’s Marx: His Life and Thought, generally accepted as the definitive biography, makes this point repeatedly. Despite the fairly constant domestic deficiencies, Marx employed Helene Demuth as maid from 1845 until his death in 1883, and a second servant was added as of 1857. Beyond any question of credibility, it was Demuth who bore Marx’s illegitimate son Frederick in 1851. To save Marx from scandal, and “a difficult domestic conflict” according to Louise Freyberger, Engels accepted paternity of the child. From the end of the 1840’s onward, the Marx household lived in London and endured a long cycle of hardship which quickly dissipated the physical and emotional resources of Jenny Marx. The weight of the conflicting pressures involved in being Mrs. Marx was a direct cause of her steadily failing health, as were the deaths of the three children in the ’50s. By July 1858 Marx was accurate in conceding to Engels that “My wife’s nerves are quite ruined…” In fact, her spirit had been destroyed by 1856 when she gave birth to a stillborn infant, her seventh pregnancy.
Toward the end of that year she spoke of the “misery” of financial disasters, of having no money for Christmas festivities, as she completed copying out work on Marx’s The Critique of Political Economy. Despite several inheritances, the begging letters to Engels remained virtually non-stop; by 1860 at the latest, Jenny’s once very handsome appearance had given way to gray hair, bad teeth, and obesity. It was in that year that smallpox, contracted after transcribing the very lengthy and trivial Herr Vogt diatribe, left her deaf and pockmarked. As secretary to Marx and under the steady strain of creditors, caused pre-eminently by the priority of maintaining appearances, Jenny’s life was extremely difficult. Marx to Engels, 1862: “In order to preserve a certain facade, my wife had to take to the pawnbroker’s everything that was not actually nailed down.” The mid-’60s saw money spent on private lessons for the eldest of the three daughters and tuition at a “ladies seminary” or finishing school, as Marx escaped the bill-collectors by spending his days at the British Museum. He admitted in 1866, in a letter to his future son-in-law, Paul Lafargue, that his wife’s “life had been wrecked.” Dealing with nervous breakdowns and chronic chest ailments, Jenny was harried by ever-present household debt. One partial solution was to withhold a small part of her weekly allowance in order to deal with their arrears, the extent of which she tended to hide from Marx. In July 1869 the Great Man exploded upon learning of this frugal effort; to Engels he wrote, “When I asked why, she replied that she was frightened to come out with the vast total (owed). Women plainly always need to be controlled!” Speaking of Engels, we may turn from Marx the “family man” to a fairly chronological treatment of Marx in his immediate connections with contemporary politics. It may be noted here that Engels, his closest friend, was, from 1838 on, a representative of the firm of Engels and Ermen; in fact, throughout the 1850s and ’60s he was a full-time capitalist in Manchester. Thus his Condition of the Working Class in England was the fruit of a practical businessman, a man of precisely that class responsible for the terrible misery he so clearly chronicled. By 1846 Marx and Engels had written The German Ideology, which made a definitive break with the Young Hegelians and contains the full and mature ideas of the materialist concept of the progress of history. Along with this tome were the practical activities in politics, also by now receiving their characteristic stamp. In terms of his Communist Correspondence Committee and its propaganda work, Marx (also in 1846) stated: “There can be no talk at present of achieving communism; the bourgeoisie must first come to the helm.” In June of the same year he sent instructions to supporters to act “jesuitically,” to not have “any tiresome moral scruples” about acting for bourgeois hegemony. The inexorable laws of capitalist development, necessarily involving the sacrifice of generations of “insufficiently developed” proletarians, would bring capital to its full plenitude–and the workers to the depths of enslavement. Thus in 1847, following a congress of professional economists in Brussels to which he was invited, Marx publicly noted the disastrous effect of free trade upon the working class, and embraced this development.
In a subsequent newspaper article, he likewise found colonialism, with its course of misery and death to be, on the whole, a good thing: like the development of capitalism itself, inevitable and progressive, working toward eventual revolution. In 1847 the Communist League was formed in London, and at its second Congress later in the year Marx and Engels were given the task of drafting its manifesto. Despite a few ringing anti-capitalist phrases in its general opening sections, the concrete demands by way of conclusions are gradualist, collaborationist, and highly statist (e.g. for an inheritance tax, graduated income tax, centralization of credit and communications). Ignoring the incessant fight waged since the mid-18th century and culminating with the Luddites, and unprepared for the revolutionary upheavals that were to shake Europe in less than a year, the Communist Manifesto sees, again, only an “insufficiently developed” proletariat. From this policy document arises one of the essential tactical mysteries of Marx, that of the concomitant rise of both capitalism and the proletariat. The development of capital is clearly portrayed as the accumulation of human misery, degradation and brutality, but along with it grows, by this process itself, a working class steadily more “centralized, united, disciplined, and organized.” How is it that from the extreme depths of physical and cultural oppression issues anything but a steadily more robotized, powerless, de-individualized proletariat? In fact, the history of revolts and militance of the 19th and 20th centuries shows that the majority do not come from those most herdlike and deprived, but from those least disciplined and with something to lose. In April of 1848, Marx went to Germany with the Manifesto plus the utterly reformist “Demands of the Communist Party in Germany.” The “Demands,” also by Marx and Engels, were constituent of a bourgeois revolution, not a socialist one, appealing to many of the elements that directly fought the March outbreak of the revolution. Considering Marx’ position as vice-president of the non-radical Democratic Association in Brussels during the previous year, and, naturally, his support of a prerequisite bourgeois ascendancy, he quickly came into conflict with the revolutionary events of 1848 and with much of the Communist League. Marx helped found a Democratic Society in Cologne, which ran candidates for the Frankfurt Parliament, and he vigorously opposed any League support for armed intervention in support of the revolutionaries. Using the opportunist rationale of not wanting to see the workers become “isolated,” he went so far as to use his “discretionary powers,” as a League official, to dissolve it in May as too radical, an embarrassment to his support of bourgeois elements. With the League out of the way, Marx concentrated his 1848 activities in Germany on support for the Democratic Society and his dictatorial editorship of the Neue Rheinische Zeitung. In both capacities he pursued a “united front” policy, in which working people would be aligned with all other “democratic forces” against the remnants of feudalism. Of course, this arrangement would afford the workers no autonomy, no freedom of movement; it chose to see no revolutionary possibilities residing with them. As editor of the NRZ, Marx gave advice to Camphausen, businessman head of the provisional government following the defeat of the proletarian upsurge. 
And further, astounding as it sounds, he supported the Democratic Society’s newspaper despite the fact that it condemned the June 1848 insurrection of the Paris proletariat. As politician and newspaper editor, Marx was increasingly criticized for his consistent refusal to deal with the specific situation or interests of the working class. By the fall of 1848, the public activities of Marx began to take on a somewhat more activist, pro-worker coloration, as the risings of the workers resumed in Germany. By December, however, disturbances were on the wane, and the volatile year in Germany appeared to be ending with no decisive revolutionary consequences. Now it was, and only now, that Marx in his paper declared that the working class would have to depend on itself, and not upon the bourgeoisie for a revolution. But because it was rather clearly too late for this, the source of revolution would have to come, he divined, from a foreign external shock: namely, war between France and England, preceded by a renewed French proletarian uprising. Thus at the beginning of 1849, Marx saw in a Franco-British war the social revolution, just as in early 1848 he had located it in war between Prussia and Russia. This was not to be the last time, by the way, that Marx saw in the slaughter of national wars the spark of revolution; the worker-as-subject again fails to occur to Marx, that they could act–and did act–on their own initiatives without first having to be sacrificed, by the generation, as factory slaves or cannon fodder. There were radicals who had seen the openings to revolution in 1848, and who were shocked by the deterministic conservatism of Marx. Louis Gottschalk, for example, attacked him for positing the choice for the working class as between bourgeois or feudal rule; “what of revolution?” he demanded. And so although Marx supported bourgeois candidates in the February (1849) elections, by April the Communist League (which he had abolished) had been re-founded without him, effectively forcing him to leave the moderate Democratic Association. By May, with its week of street fighting in Dresden, revolts in the Ruhr, and extensive insurgency in Baden, events–as well as the reactions of the German radical community–continued to leave Marx far behind. Thus in that month, he closed down the NRZ with a defiant–and manifestly absurd–editorial claiming that the paper had been revolutionary and openly so throughout 1848 and 1849. By 1850 Marx had joined other German refugees in London, upon the close of the insurrectionary upheavals on the continent of the previous two years. Under pressure from the left, as noted above, he now came out in favor of an independently organized German proletariat and a highly centralized state for the (increasingly centralized) working class to seize and make its own. Despite the ill-will caused by his anything-but-radical activities in Germany, Marx was allowed to rejoin the Communist League and eventually resumed his dominance therein. In London, he found support among the Chartists and other elements devoted to electoral reform and trade unionism, shunning the many radical German refugees whom he often branded as “agitators” and “assassins.” This behavior gained him the support of a majority of those present in London and enabled him to triumph over those in the League who had called him a “reactionary” for the minimalism of the Manifesto and for his disdain of a revolutionary practice in Germany. 
But from the early ’50s Marx had begun to spend most of his time in studies at the British Museum, where he could ponder the course of world revolution away from the noisome hubbub of his precarious household. From this time, he quickly jettisoned the relative radicality of his new-found militance and foresaw a general prosperity ahead, hence no prospects for revolution. The coincidence of economic crisis with proletarian revolt is, of course, mocked by the real history of our world. From the Luddites to the Commune, France in 1968 to the multitude of struggles opening on the last quarter of the 20th century, insurrection has been its own master; the great fluctuations of unemployment or inflation have often served, on the contrary, to deflect class struggles to the lower, survivalist plane rather than to fuel social revolution. The Great Depression of the 1930s brought a diminished vision, for example, perhaps best characterized by German National Socialism and its cousin, the American New Deal: nothing approaching the destruction of capitalism. (The Spanish Revolution, bright light of the ’30s, had nothing to do with the Depression gripping the industrialized nations.) Marx’s overriding concern with externalities–principally economic crises, of course–was a trademark of his practical as well as theoretical approach; it obviously reflects his slight regard for the subjectivity of the majority of people, for their potential autonomy, imagination and power. The distanciation from actual social struggles of his day is seemingly closely linked with the correct bourgeois life he led. In terms of his livelihood, one is surprised by the gap between his concrete activities and his reputation as revolutionary theorist. From 1852 into the 1860s, he was “one of the most highly valued” and “best paid” columnists of the N.Y. Daily Tribune, according to its editor. In fact, one hundred and sixty-five of his articles were used as editorials by this not-quite-revolutionary metropolitan daily, which could account for the fact that Marx requested in 1855 that his subsequent pieces be printed anonymously. But if he wanted not to appear as the voice of a huge bourgeois paper, he wanted still more–as we have seen in his family role–to appear a gentleman. It was “to avoid a scandal” that he felt compelled to pay the printer’s bill in 1859 for the reformist Das Volk newspaper in London. In 1862 he told Engels of his wish to engage in some kind of business: “Grey, dear friend, is all theory and only business is green. Unfortunately, I have come too late to this insight.” Though he declined the offers, Marx received, in 1865 and 1867, two invitations which are noteworthy for the mere fact that they would have been extended to him at all: the first, via a messenger from Bismarck, to “put his great talents to the service of the German people,” the second, to write financial articles, from the Prussian Government’s official journal. In 1866 he claimed to have made four hundred pounds by speculating in American funds, and his good advice to Engels on how to play the stock market is well authenticated. 1874 saw Marx and two partners wrangle in court over ownership of a patent to a new engraving device, intending to exploit the rights and reap large profits. To these striking suggestions of ruling-class mentality must be added the behavior of Marx toward his children, the three daughters who grew to maturity under his thoroughly Victorian authority.
In 1866 he insisted on economic guarantees for Paul Lafargue’s future, criticizing his lack of “diligence,” and lecturing him in the most prudish terms regarding his intentions toward Laura, who was almost twenty-one. Reminding Lafargue that he and Laura were not yet engaged, and if they were to become so, that it would constitute a “long-term affair”, he went on to express very puritanical strictures: “To my mind, true love expresses itself in the lover’s restraint, modest bearing, even diffidence toward the adored one, and certainly not in unconstrained passion and manifestations of premature familiarity.” In 1868 he opposed the taking of a job by Jenny, who was then twenty-two; later he forbade Eleanor from seeing Lissagaray, a Communard who happened to have defended single-handed the last barricade in Paris. Turning back to politics, the economic crisis Marx avidly awaited in the ’50s had come and gone in 1857, awakening no revolutionary activity. But by 1863 and the Polish insurrection of that year, unrest was in the air, providing the background for the formation of the International Workingmen’s Association. Marx put aside his work on Capital and was most active in the affairs of the International from its London inception in September, 1864. Odger, President of the Council of all London Trades Unions, and Cremer, Secretary of the Mason’s Union, called the inaugural meeting, and Wheeler and Dell, two other British union officials, formally proposed an international organization. Marx was elected to the executive committee (soon to be called the General Council), and at its first business meeting was instrumental in establishing Odger and Cremer as President and Secretary of the International. Thus from the start Marx’s allies were union bureaucrats, and his policy approach was a completely reformist one with “plain speaking” as to radical aims disallowed. One of the first acts of the General Council was the sending of Marx’s spirited, fraternal greetings to Abraham Lincoln, that “single-minded son of the working class.” Other early activities by Marx included the formation, as part of the International, of the Reform League, dedicated to manhood suffrage. He boasted to Engels that this achievement “is our doing,” and was equally enthusiastic when the National Reform League, sole surviving Chartist organization, applied for membership. This latter proved too much even for the faithful Engels, who for some time after refused to even serve as correspondent to the International for Manchester, where he was still a full-time capitalist. During this practice of embracing every shade of English gradualism, principally by promoting the membership of London trade unions, he penned his famous “the proletariat is revolutionary or it is nothing” line, in a letter to the German socialist Ferdinand Lassalle. Lassalle and his General Union of German Workers (ADAV) harbored transparently serious illusions about the state; namely, that Bismarck was capable of genuinely socialist policies as Chancellor of Prussia. Yet Marx in 1866 agreed to run for the presidency of the ADAV in the hopes of incorporating it into the International. At the same time, he wrote (to a cousin of Engels): “the adherence of the ADAV will only be of use at the beginning, against our opponents here.
Later the whole institution of this Union, which rests on a false basis, must be destroyed.” Volumes could be written, and possibly have, on the manipulations of Marx within the International, the maneuverings of places, dates, and lengths of meetings, for example, in the service of securing and centralizing his authority. To the case of the ADAV could be added, among a multitude of others, his cultivation of the wealthy bourgeois Lefort, so as to keep his wholly non-radical faction within the organization. By 1867 his dedicated machinations were felt to have reaped their reward; to Engels he wrote, “we (i.e. you and I) have this powerful machine in our hands.”

War Progressive and Inevitable

Also, in 1867 he availed himself publicly once more of one of his favorite notions, that a war between Prussia and Russia would prove both progressive and inevitable. Such a war would involve the German proletariat versus despotic Eastern barbarism and would thus be salutary for the prospects of European revolution. This perennial “war games” type of mentality somehow manages to equate victims, set in motion precisely as chattels of the state, with proletarian subjects acting for themselves; it would seem to parallel the substitution of trade union officials for workers, the hallmark of his preferred strategy as bureaucrat of the International. Marx naturally ridiculed anyone–such as his future son-in-law, Lafargue–for suggesting that the proper role of revolutionaries did not lie in such a crass game of weighing competing nationalisms. And in 1868 when the Belgian delegation to the International’s Brussels Congress proposed the response of a general strike to war, Marx dismissed the idea as a “stupidity,” owing to the “underdeveloped” status of the working class. The weaknesses and contradictions of the adherents of Proudhon and Bakunin are irrelevant here, but we may observe 1869 as the high-water mark of the influence of Marx, due to the approaching decline of the Proudhonists and the infancy of Bakunin’s impact in that year. With mid-1870 and the Napoleon III-engineered Franco-Prussian War, we see once more the pre-occupation with “progressive” vs. “non-progressive” military exploits of governments. Marx to Engels: “The French need a drubbing. If the Prussians are victorious then the centralization of the working class… the superiority of the Germans over the French in the world arena would mean at the same time the superiority of our theory over Proudhon’s and so on.” By July 1870, in an Address endorsed by the International’s General Council, Marx added to this outlook a warning: “If the German working class allow the present war to lose its strictly defensive character and degenerate into a war against the French people, victory or defeat will prove alike disastrous.” Thus the butchery of French workers is fine and good–but only up to a point. This height of cynical calculation appears almost too incredible–and after the Belgians and others were loudly denounced for imagining that the proletariat could be a factor for themselves, in any case. How now could the “German working class” (Prussian army) decide how far to carry out the orders of the Prussian ruling class–and if they could, why not “instruct” them to simply ignore any and all of these class orders? This kind of public statement by Marx, so devoid of revolutionary content, was naturally well received by the bourgeois press.
In fact, none other than the patron saint of British private property, John Stuart Mill, sent a message of congratulations to the International for its wise and moderate Address. When the war Napoleon III had begun turned out as a Prussian victory, by the end of summer 1870, Marx protested, predictably, that Germany had dropped its approved “defensive” posture and was now an aggressor demanding annexation of the Alsace-Lorraine provinces. The defeat of France brought the fall of Louis Napoleon and his Second Empire, and a provisional Republican government was formed. Marx decided that the aims of the International were now two-fold: to secure the recognition of the new, Republican regime by England, and to prevent any revolutionary outbreak by the French workers. His policy advised that “any attempt to upset the new government in the present crisis, when the (Prussian) army is almost knocking at the doors of Paris, would be a desperate folly.” This shabby, anti-revolutionary strategy was publicly promoted quite vigorously–until the Commune itself made a most rude and “unscientific” mockery of it in short order. Well-known, of course, is Marx’s negative reaction to the rising of the Parisians; it is over-generous to say that he was merely pessimistic about the future of the Commune. Days after the successful insurrection began he failed to applaud its audacity, and satisfied himself with grumbling that “it had no chance for success.” Though he finally recognized the fact of the Commune (and was thereby forced to revise his reformist ideas regarding proletarian use of existing state machinery), his lack of sympathy is amply reflected by the fact that throughout the Commune’s two-month existence, the General Council of the International spoke not a single word about it. It often escapes notice when an analysis or tribute is delivered well after the living struggle is safely living no longer. The masterful polemicizing about the triumphs of the Commune in The Civil War in France constitutes an obituary, in just the same way that Class Struggles in France did at a similarly safe distance from the events he failed to support at the time of revolutionary Paris, 1848. After a very brief period–again like his public attitude just after the 1848 through 1849 outbreaks in Europe–of stated optimism as to proletarian successes in general, Marx returned to his more usual colors. He denied the support of the International to the scattered summer 1871 uprisings in Italy, Russia, and Spain–countries mainly susceptible to the doctrines of anarchy, by the way. September witnessed the last meeting of the International before the Marx faction effectively disbanded it, rather than accept its domination by more radical elements such as the Bakuninists, in the following year. The bourgeois gradualism of Marx was much in evidence at the fall 1871 London Conference, then, as exemplified by such remarks as: “To get workers into parliament is equivalent to a victory over the governments, but one must choose the right man.” Between the demise of the International and his own death in 1883, Marx lived in a style that varied little from that of previous decades. Shunning the Communard refugees, by and large, as he had shunned the radical Germans in the ’50s after their exile following 1848 through 1849, Marx kept company with men like Maxim Kovalevsky, a non-socialist Russian aristocrat, the well-to-do Dr. Kugelmann, the businessman Max Oppenheim, H.M.
Hyndman, a very wealthy social democrat, and, of course, the now-retired capitalist, Engels. With such a circle as his choice of friends, it is not surprising that he continued to see little radical capacity in the workers, just as he had always failed to see it. In 1874, he wrote, “The general situation of Europe is such that it moves to a general European war. We must go through this war before we can think of any decisive external effectiveness of the European working class.” Looking, as ever, to externalities–and of course to the “immutable laws of history”–he contributes to the legacy of the millions of World War I dead, sacrificed by the capitulation of the Marxist parties to the support of war in 1914. Refusing throughout his lifetime to see the possibilities of real class struggle, to understand the reality of the living negation of capitalism, Marx actively and concretely worked for the progress and fullness of capitalist development, which prescribed that generations would have to be sacrificed to it. I think that the above observations of his real life are important and typical ones, and suggest a consistency between that life and his body of ideas. The task of moving the exploration along to encompass the “distinctly theoretical” part of Marx is expressly beyond the scope of this effort; possibly, however, the preceding will throw at least indirect light on the more “disembodied” Marx.

Origins and Meaning of WWI
──────────────────────────

TODO: Transcribe this chapter (twenty pages, by hand).

Taylorism and Unionism
──────────────────────

In 1973 David Jenkins, in one of the many recent works on the “revolt against work” phenomenon, observed that “The impression has begun to get about that the Industrial Revolution is not going to work out after all.” [2] In light of the profound malaise of blue and white collar workers brought out so stunningly in Studs Terkel’s Working, for example, the decline of output per worker since early 1973, and increasing signs of a pervasive anti-union sentiment complementing anti-management restiveness, Jenkins’ remark does not, after all, seem so shocking. The 1973 Health, Education and Welfare report, Work In America, remarked, in a similar vein, that “absenteeism, wildcat strikes, turnover, and industrial sabotage [have] become an increasingly significant part of the cost of doing business.” [3] In using the last quote, from the HEW report, I was influenced not only by its succinct accuracy and the “high level” nature of the source, but by its placement in the report within a section called, “The Anachronism of Taylorism.” Owing to the many misunderstandings about scientific management’s historical role—including its relevance to the current crisis in industrial relations—much that is basic to our industrial society is not seen for what it is. The genesis of Taylorism, or scientific management, and the developing relationship of this system to trade unionism are especially crucial, and I hope to illuminate these areas. As Frederick Taylor was engaged in his pioneering efforts at the Midvale Steel Company, in the 1880s, several members of the American Society of Mechanical Engineers (ASME) were likewise interested in the problems of labor management. The development of capitalism was meeting sharp resistance from the growing ranks of labor, desirous of a sense of work integrity and craftsmanship. ASME member William Partridge spoke to the Society in 1887 of the crisis in industrial relations.
“More than one hundred years ago, France found herself in a condition not unlike that which is prevailing today.” Continuing his reference to the French Revolution, he underlined the urgency of efforts to resolve “the labor problem.” “It is a question which has great interest to us whether 1898 will mark a period of equally disastrous uprisings with us. Certainly there are some things in the history of labor and capital which make it seem almost probable.” Task management, or scientific management as it came to be called, began taking shape in the eighties as the way to break the workers’ threatening resistance. The heart of this approach, which Peter Drucker has called the most effective idea of the past century, is the systematic reduction of work into discrete, routinized tasks, totally separated from any policy decisions about the job. Taylor realized that employees exert vital influence because they possess the crucial talents needed in any productive process. As he put it in his Principles of Scientific Management, “foremen and superintendents know, better than anyone else, that their own knowledge and personal skill falls far short of the combined knowledge and dexterity of all the workmen under them.” [4] If Robert Hoxie understood the point, that “this unique possession of craft knowledge and craft skill on the part of a body of wage workers, that is, their possession of these things and the employer’s ignorance of them” is the key to worker strength on the job, management experts, like Taylor, knew just what had to be done to break that strength. For capitalism to be firmly in control, it must monopolize the information and techniques of output as surely as it controls the rest of the means of production. The worker must only be permitted to perform certain narrow, specific actions as planned by management. “For one of the most important general principles of Taylor’s system was that the man who did the work could not derive or fully understand its science,” as Samuel Haber accurately observed. Naturally it was beneficial to publicly promote scientific management as geared directly to problems of price and productivity, though its motivating concern was with the control of production itself. In fact, capital’s problem was less and less one of productivity at this time, as Edwin Perkins points out, and Siegfried Giedion’s comparison of American and German industry shows that Germany’s greater reliance on worker skill was cheaper than the American tendency to mechanize. [5] C. Bertrand Thompson made, in effect, the same point in 1917 when he remarked on the absence of a competitive pressure behind firms employing scientific management, “for the reason that most of them now using it stand in a quasi-monopoly position in which there is no necessity to reduce their prices…” Thus the introduction of Taylorism may be seen as primarily a social and even political response, rather than a matter of economics or technology. Concerning its effect, Robert Hoxie noted in 1915 that “the whole scheme of scientific management, especially the gathering up and systematization of the knowledge formerly the possession of the workmen, tends enormously to add to the strength of capitalism.” The proponents of the new regimentation sought to invest it with an aura of impartiality, to evoke a theoretical legitimacy useful to capitalism as a whole.
[6] Mary Follett of the Taylor Society, for example, claimed that with scientific management, “authority is derived from function” and thus “has little to do with hierarchy of position as such.” Typical pronouncements claimed that it embodied “a new kind of authority which stemmed from the unveiling of scientific law,” and that it “substitutes joint obedience of employers and workers for obedience to personal authority.” The time-study man, measuring and manipulating the worker with his stopwatch, relies on “unimpeachable data.” Despite these efforts on behalf of the Taylorist approach, the public rapidly derived a very malignant view of the subject. As the Taylor Society admitted with surprising candor, scientific management became known as “the degradation of workmen into obedient oxen under the direction of a small body of experts—into men debarred from creative participation in their work.” [7] The public’s very accurate impression of scientific management practice finds its source in the contempt in which Taylor and his followers held workers. Referring to his experience at Bethlehem Steel, Taylor described the iron handler he encountered as stupid, phlegmatic, and ox-like. [8] H.L. Gantt, one of Taylor’s leading disciples, spoke of implementing the task system as “the standard method of teaching and training children.” As “the worker became an object in Taylor’s hands,” in Jacques Ellul’s phrase, it follows that he was seen as an animal or child by the Taylorites. Another part of the justification was Taylor’s concept of the “economic man,” that a worker’s real motivation is money and nothing else. Despite the attempts to downgrade their subjects and discount their motivations, the scientific management tracts and guidelines are full of admonitions to proceed slowly, due to the workers’ resistance. It was regularly repeated, in fact, that several years are needed to get control of a plant on the scientific management plane. [9] The Taylor Society warned employers to expect strikes and sabotage, to proceed with cunning so as to infiltrate under false appearances, and to expect opposition at every step. [10] The struggle has been clearly over work and the progressive attempts to debase it. [11] The fight to control it has been the heart of the contest, as manifested in such articles as “Who’s Boss in your Shop?” from the August 1917 Bulletin of the Taylor Society. In fact, the first effort of Taylor to lay out his theory, in “A Piece-Rate System” (1895), underlines the fact that the problem to be solved is the antagonism between workers and employers. Although a survey of management and personnel journals [12] makes it clear that scientific management is the foundation of work organization today, our everyday experiences of work bring the point home with painful clarity. Control assumed “unprecedented dimensions” with Taylor, to use Harry Braverman’s assessment, and it has engendered a serious stage of opposition today which is calling work as we know it (wage labor) into question. Through the recent work of Harry Braverman, Stephen Marglin, and others, we now see the social/political control essence of Taylorism. What is less understood, however, is the nature of the fight between workers and controllers, and the role of unionism in that—continuing—fight.
From the two standard works on the subject, Jean Trepp McKelvey’s AFL Attitudes Toward Production and Milton Nadworny’s Scientific Management and the Unions, has emerged the thesis that organized labor switched from a hostile attitude toward Taylorism before World War I, to a warmly receptive one thereafter. The evidence shows this judgment to be mistaken, the error stemming from the perennial confusion of union attitude with rank and file attitude. It would be much more accurate to say that workers seem to have opposed scientific management all along, while the unions gave only a brief show of opposition and have never really been against it. Turning first to the union attitudes toward Taylorism in the pre-War period, we find anything but concerted opposition. In 1895, for example, upon the occasion of Taylor’s first presentation of his ideas to the American Society of Mechanical Engineers, John A. Penton, ex-president of the Brotherhood of Machine Moulders, was in attendance and joined the discussion of Taylor’s paper. This former union official, speaking “as a workman,” was more lavish in his praise than any of the others; urging that the paper be put into the hands of every employer and employee, he termed it “perhaps the most remarkable thing of its kind I ever heard in my life. I can sympathize with every word. His paper, I think, is a landmark in the field of political economy.” [13] In 1907, David Van Alstyne of the American Locomotive Company secured an agreement with the molders’ and blacksmiths’ union for the introduction of Taylorism into the company’s U.S. and Canada shops. Though the molders and blacksmiths were thus prevented from fighting the degrading methods, the unorganized machinists in Pittsburgh walked out, “seething” with anger. [14] Professor John Commons provided the cardinal reason for the unions’ absence of hostility to Taylorism in a 1906 issue of The Outlook magazine: “… the unions have generally come to the point of confining their attention to wages—that is, to distribution—leaving to employers the question of production.” [15] If either McKelvey or Nadworny had examined collective bargaining agreements reached prior to World War I, [16] they would have most likely discovered that the “management’s rights” clause vests the sole right to set work methods, job design, assignments, etc. with management, and is of fundamental importance in understanding why unionism was incapable of hostility to scientific management or any other kind of management system. It is easy to see why, when Taylorism became a public issue in 1911, AFL officials could not have found historical grounds for opposition. [17] Thus when Nadworny mentions the arrangement made between Plimpton Press and the Typographical Union in 1914, whereby the union agreed to accept scientific management in return for closed shop recognition, or the arrangement between the New York garment industry and the International Ladies Garment Workers’ Union in 1916, involving the same exchange, we are not at all dealing with aberrations. In fact, the idea began spreading well before the war that unionization, with its standard “management’s rights” clause contracts, was the best approach for fitting the Taylorist yoke on the workers. The efficacy of this “trojan horse” tactic of union mediation led Harvard Professor C. Bertrand Thompson, in a book published in 1917, to prescribe industrial unionism over the AFL’s craft unionism as the best way to secure the Taylor system in industry.
Describing “one plant where scientific management was fully developed and in complete operation, the management has itself authorized and aided the organization of its employees,” Thompson went so far as to urge recognition of the Industrial Workers of the World, to secure “the necessary unanimity of action” in linking all the workers, not only the skilled ones, to Taylorism. [18] The ostensibly radical IWW might seem an unlikely candidate for the job of Taylorizing the workers, but actually several Wobbly spokesmen saw in scientific management much of value toward stabilizing and rationalizing production “after the Revolution.” And from the rest of the American Left, many another sympathetic voice could be heard, such as that of left-wing Socialist, Algie Simons. Enthusiasm for the system seemed to cut across ideological lines. Lenin’s support of Taylorism is well known, while John Spargo, an American Socialist, denounced everything about the Bolshevik Revolution save Lenin’s adoption of scientific management. [19] Henry L. Gantt, on the other hand, a conservative Taylor disciple, admired the Leninist dictatorship, especially of course, its Taylorist component. And Morris L. Cooke, a liberal Taylorite, of whom it was said in 1915 that “no one has done more to broaden the scope of scientific management,” was one of the first spokesmen to publicly urge the Taylor Society to realize in unionism its natural partner. He became in the 1930s a prominent CIO advocate. While the official union and radical spokesmen for the workers were finding no fault with scientific management, the workers were acting on their own against it. An attempt to introduce Taylorism at the huge Rock Island government arsenal in 1908 was defeated by the intense opposition it aroused. It is interesting that these “unorganized” workmen appealed not to a union for help, but confronted the setting of piece rates and the division of tasks by themselves and immediately demanded that the methods be discontinued. Likewise, the beginnings of Taylorism at the Frankfort arsenal were defeated by the hostility of the “unorganized” employees there in 1910 and 1911. In October 1914, the 3,000 garment workers of Sonnenborn and Company in Baltimore walked out spontaneously upon hearing that Taylorism was to be installed. [20] These examples could be multiplied ad infinitum. What may be of at least as much value is a more detailed look at a particular plant’s experience. The case of Taylorism at the U.S. arsenal at Watertown, Mass. in 1911, wrongly termed in 1917 the only instance of real union opposition to Taylorism, clearly demonstrates the need for not confusing union with workers, “organized” or not. If this is as close as unions came in practice to opposing the new system, it is very safe to say that they did not oppose it at all. When the ideas of Taylorizing Watertown first arose in 1908, Taylor warned that the government managers must have the complete system. 
“Anything short of this leaves such a large part of the game in the hands of the workmen that it becomes largely a matter of whim or caprice on their part as to whether they will allow you to have any real results or not.” [21] Hugh Aitken, in his excellent study of the Watertown situation, is correct that control of the entire work environment was at issue and that no move by Taylor and his associates “was merely technological or administrative.” It is clear that Taylor himself mistook the quiescence of the AFL unions, who represented various arsenal workers, for passivity on the part of the employees. He counseled a Watertown manager in 1910 “not to bother too much about what the AFL write (sic) concerning our system,” and in March 1911, just before the strike, again tried to allay any management fears of worker resistance by pooh-poohing any AFL correspondence which might be received in the future. [22] He knew the unions wouldn’t seriously interfere; his elitism prevented a clear appraisal of worker attitudes. When the time-study man Merrick openly timed foundry workers with a stopwatch, action was immediately forthcoming. Though union members, they did not call the union, but instead drew up a petition demanding the cessation of any further Taylorist intrusions, and, being rebuffed, walked out. Joseph Cooney, a molder in the foundry, testified early in 1912 to the Congressional committee examining Taylor’s system that there had been no contact between the workers and any union official and that the strike had been completely spontaneous. [23] Other testimony made it clear, furthermore, that workers’ resentment was fueled by the anti-workmanship aspects of Taylorism. Isaac Godstray and Alexander Crawford, for example, spoke of the pressures to slight their work and reduce their level of craftsmanship. Though an overwhelming majority of Watertown employees questioned by a consultant (hired by a group of workers) felt that the unions had no interest in agitating against scientific management, [24] the International Association of Machinists (IAM) publicly proclaimed union opposition to the system shortly after the 1911 strike. Because this public opposition by the IAM in 1911 is practically the sole evidence supporting the thesis of pre-War union hostility to Taylorism, [25] it deserves a closer look. In 1909, as McKelvey notes, the initial features of scientific management were installed at Watertown without the slightest protest from the unions, including the IAM. [26] At about this time, the National League of Government Employees began to make inroads on the IAM, due to the dissatisfaction of the latter group’s members. The rival organization had drawn away many members by the time of the 1911 strike, [27] and the IAM was thus forced to make a show of opposition if it wished to retain its hold among the workers. In a similar fashion, the International Molders’ Union had to give grudging support to a strike of Boston molders in the same year; the strike had occurred without so much as informing the local union. The union leaders involved frequently made statements showing their actual support of Taylorism, and a careful reading of the 1911 AFL convention record, also cited as evidence of anti-Taylorism by the unions, shows that Samuel Gompers avoided directly attacking the new work system in any substantial way.
Burt Alpert’s judgment that the “basis of modern trade unionism is its role in bargaining away the right of the worker to exercise control over the quality of his or her work” could have easily been reached via a study of the betrayal of the workers to Taylorism. G.T.W. Patrick’s dictum that “a mere increase of wages will never redeem the evils of the industrial system” also seems to be to the point here. The 1920s, which saw the not-unrelated phenomena of unionism’s public embrace of scientific management and the falling away of union membership, was a victorious period for Taylorism. The age of the consumer was begun, out of the systematic destruction of much of the last autonomy of the producer. With the invaluable aid of the unions, a healthy share of the content of work lives had been removed. James Rorty saw the lack of militancy and initiative from workers in the early 1930s as stemming directly from their lack of understanding of the technological processes to which they were enslaved. [28] The re-awakening of the struggle for a life of quality and meaning in the 1970s is informed with the knowledge that work itself is the issue. [29]

Unionization in America
───────────────────────

TODO: Transcribe this chapter. No on-line text found, and the green-anarchy.wikidot.com PDF scan is too skewed for reliable OCR, so its 13 pages will have to be typed by hand.

Organized Labor vs. “The Revolt Against Work”
─────────────────────────────────────────────

Serious commentators on the labor upheavals of the Depression years seem to agree that disturbances of all kinds, including the wave of sit-down strikes of 1936 and 1937, were caused by the ‘speed-up’ above all. Dissatisfaction among production workers with their new CIO unions set in early, however, mainly because the unions made no efforts to challenge management’s right to establish whatever kind of work methods and working conditions they saw fit. The 1945 Trends in Collective Bargaining study noted that “by around 1940” the labor leader had joined the business leader as an object of “widespread cynicism” to the American employee. Later in the 1940s C. Wright Mills, in his The New Men of Power: America’s Labor Leaders, described the union’s role thusly: “the integration of union with plant means that the union takes over much of the company’s personnel work, becoming the discipline agent of the rank-and-file.” In the mid-1950s, Daniel Bell realized that unionization had not given workers control over their job lives. Struck by the huge, spontaneous walk-out at River Rouge in July, 1949, over the speed of the Ford assembly line, he noted that “sometimes the constraints of work explode with geyser suddenness.” And as Bell’s Work and Its Discontents (1956) bore witness that “the revolt against work is widespread and takes many forms,” so had Walker and Guest’s Harvard study, The Man on the Assembly Line (1953), testified to the resentment and resistance of the man on the line. Similar, and from a writer with much working class experience himself, was Harvey Swados’ “The Myth of the Happy Worker,” published in The Nation, August, 1957. Workers and the unions continued to be at odds over conditions of work during this period. In auto, for example, the 1955 contract between the United Auto Workers and General Motors did nothing to check the ‘speed-up’ or facilitate the settlement of local shop grievances.
Immediately after Walter Reuther made public the terms of the contract he’d just signed, over 70% of GM workers went on strike. An even larger percentage ‘wildcatted’ after the signing of the 1958 agreement because the union had again refused to do anything about the work itself. For the same reason, the auto workers walked off their jobs again in 1961, closing every GM plant and a large number of Ford plants. Paul Jacobs’ The State of the Unions, Paul Sultan’s The Disenchanted Unionist, and B.J. Widick’s The Triumphs and Failures of Unionism in the United States were some of the books written in the early 1960s by pro-union figures, usually former activists, who were disenchanted with what they had only lately and partially discovered to be the role of the unions. A black worker, James Boggs, clarified the process in a sentence: “Looking backwards, one will find that side by side with the fight to control production, has gone the struggle to control the union, and that the decline has taken place simultaneously on both fronts.” What displeased Boggs, however, was lauded by business. In the same year that his remarks were published, Fortune, American capital’s most authoritative magazine, featured as a cover story in its May, 1963 issue Max Ways’ “Labor Unions Are Worth the Price.” But by the next year, the persistent dissatisfaction of workers was beginning to assume public prominence, and a June, 1964 Fortune article reflected the growing pressure for union action: “Assembly-line monotony, a cause reminiscent of Charlie Chaplin’s Modern Times, is being revived as a big issue in Detroit’s 1964 negotiations,” it reported. In the mid-1960s, another phenomenon was dramatically and violently making itself felt. The explosions in the black ghettoes appeared to most to have no connection with the almost underground fight over factory conditions. But many of the participants in the insurrections in Watts, Detroit and other cities were fully employed, according to arrest records. The struggle for dignity in one’s work certainly involved the black workers, whose oppression was, as in all other areas, greater than that of non-black workers. Jessie Reese, a Steelworkers’ union organizer, described the distrust his fellow blacks felt toward him as an agent of the union: “To organize that black boy out there today you’ve got to prove yourself to him, because he don’t believe nothing you say.” Authority is resented, not color. Turning to more direct forms of opposition to an uncontrolled and alien job world, we encounter the intriguing experience of Bill Watson, who spent 1968 in an auto plant near Detroit. There he witnessed activity distinctly post-union in practice: the systematic, planned efforts of the workers to substitute their own production plans and methods for those of management. He described it as “a regular phenomenon” brought out by the refusal of management and the UAW to listen to workers’ suggestions as to modifications and improvements in the product. “The contradictions of planning and producing poor quality, beginning as the stuff of jokes, eventually became a source of anger… temporary deals unfolded between inspection and assembly and between assembly and trim, each with planned sabotage… the result was stacks upon stacks of motors awaiting repair… it was almost impossible to move… the entire six-cylinder assembly and inspection operation was moved away, where new workers were brought in to man it.
In the most dramatic way, the necessity of taking the product out of the hands of laborers who insisted on planning the product became overwhelming.” The extent and co-ordination of the workers’ own organization in the plant described by Watson was very advanced indeed, causing him to wonder if it wasn’t a glimpse of a new social form altogether, arising from the failure of unionism. Stanley Weir, writing at this time of similar if less highly developed phenomena, found that “in thousands of industrial establishments across the nation, workers have developed informal underground unions” due to the deterioration or lack of improvement in the quality of their daily job lives. Until the 1970s—and very often still—the wages and benefits dimension of a work dispute, that part over which the union would become involved, received almost all the attention. In 1965 Thomas Brooks observed that the “apathy” of the union member stemmed from precisely this false emphasis: “…grievances on matters apart from wages are either ignored or lost in the limbo of union bureaucracy.” A few years later, Dr. David Whitter, industrial consultant to GM, admitted, “That [more money] isn’t all they want; it’s all they can get.” As the 1960s drew to a close, some of the more perceptive business observers were about to discover this distinction and were soon forced by pressure from below to discuss it publicly. While the October, 1969 Fortune stressed the preferred emphasis on wages as the issue in Richard Armstrong’s “Labor 1970: Angry, Aggressive, Acquisitive” (while admitting that the rank and file was in revolt “against its own leadership, and in important ways against society itself”), the July, 1970 issue carried Judson Gooding’s “Blue-Collar Blues on the Assembly Line: Young auto workers find job disciplines harsh and uninspiring, and they vent their feelings through absenteeism, high turnover, shoddy work, and even sabotage. It’s time for a new look at who’s down on the line.” With the 1970s there has at last begun to dawn the realization that on the most fundamental issue, control of the work process, the unions and the workers are very much in opposition to each other. A St. Louis Teamster commented that traditional labor practice has as a rule involved “giving up items involving workers’ control over the job in exchange for cash and fringe benefits.” Acknowledging the disciplinary function of the union, he elaborated on this time-honored bargaining: “Companies have been willing to give up large amounts of money to the union in return for the union’s guarantee of no work stoppages.” Daniel Bell wrote in 1973 that the trade union movement has never challenged the organization of work itself, and summed up the issue thusly: “The crucial point is that however much an improvement there may have been in wage rates, pension conditions, supervision, and the like, the conditions of work themselves—the control of pacing, the assignments, the design and layout of work—are still outside the control of the worker himself.” Although the position of the unions is usually ignored, since 1970 there has appeared a veritable deluge of articles and books on the rebellion against arbitrary work roles, a rebellion now impossible to ignore. From the covers of a few national magazines: Barbara Garson’s “The Hell with Work,” Harper’s, June, 1972; Life magazine’s “Bored on the Job: Industry Contends with Apathy and Anger on the Assembly Line,” September 1, 1972; and “Who Wants to Work?” in the March 26, 1973 Newsweek.
Other articles have brought out the important fact that the disaffection is definitely not confined to industrial workers. To cite just a few: Judson Gooding’s “The Fraying White Collar” in the December, 1970 Fortune; Timothy Ingram’s “The Corporate Underground,” in The Nation of September 13, 1971; Marshall Kilduff’s “Getting Back at a Boss: The New Underground Papers,” in the December 27, 1971 San Francisco Chronicle; and Seashore and Barnowe’s “Collar Color Doesn’t Count,” in the August, 1972 Psychology Today. In 1971 The Workers, by Kenneth Lasson, was a representative book, focusing on the growing discontent via portraits of nine blue-collar workers. The Job Revolution by Judson Gooding appeared in 1972, a management-oriented discussion of liberalizing work management in order to contain employee pressure. The Report of a Special Task Force to the Secretary of Health, Education, and Welfare on the problem, titled Work in America, was published in 1973. Page 19 of the study admits the major facts: “absenteeism, wildcat strikes, turnover, and industrial sabotage [have] become an increasingly significant part of the cost of doing business.” The scores of people interviewed by Studs Terkel in his Working: People Talk About What They Do All Day and How They Feel About What They Do (1974) reveal a depth to the work revolt that is truly devastating. His book uncovers a nearly unanimous contempt for work and the fact that active resistance is fast replacing the quiet desperation silently suffered by most. From welders to editors to former executives, those questioned spoke up readily as to the feelings of humiliation and frustration. If most of the literature of “the revolt against work” has left the unions out of its discussion, a brief look at some features of specific worker actions from 1970 through 1973 will help underline the comments made above concerning the necessarily anti-union nature of this revolt. During March 1970, a wildcat strike of postal employees, in defiance of union orders, public employee anti-strike law, and federal injunctions, spread across the country, disabling post offices in more than 200 cities and towns. In New York, where the strike began, an effigy of Gus Johnson, president of the letter carriers’ union local there, was hung at a tumultuous meeting on March 21, where the national union leaders were called “rats” and “creeps.” In many locations, the workers decided not to handle business mail as part of their work action, and only the use of thousands of National Guardsmen ended the strike, major issues of which were the projected layoff of large numbers of workers and methods of work. In July, 1971, New York postal workers tried to renew their strike activity in the face of a contract proposal made by the new letter carrier president, Vincent Sombrotto. At the climax of a stormy meeting of 3,300 workers, Sombrotto and a lieutenant were chased from the hall and down 33rd Street, narrowly escaping 200 enraged union members, who accused them of “selling out” the membership. Returning to the Spring of 1970, 100,000 Teamsters in 16 cities wildcatted between March and May to overturn a national contract signed March 23 by IBT President Fitzsimmons. The ensuing violence in the Middle West and West Coast was extensive, and in Cleveland involved no less than a thirty-day blockade of main city thoroughfares and 67 million dollars in damages.
On May 8, 1970, a large group of hard-hat construction workers assaulted peace demonstrators in Wall Street and invaded Pace College and City Hall itself to attack students and others suspected of not supporting the prosecution of the Vietnam War. The riot, in fact, was supported and directed by construction firm executives and union leaders, in all likelihood to channel worker hostility away from themselves. Perhaps alone in its comprehension of the incident was public television (WNET, New York) and its “Great American Dream Machine” program aired May 13. A segment of that production uncovered the real job grievances that apparently underlay the affair. Intelligent questioning revealed, in a very few minutes, that “commie punks” were not wholly the cause of the workers’ outburst, as an outpouring of gripes about unsafe working conditions, the strain of the work pace, the fact that they could be fired at any given moment, etc., was recorded. The head of the New York building trades union, Peter Brennan, and his union official colleagues were feted at the White House on May 26 for their patriotism—and for diverting the workers?—and Brennan was later appointed Secretary of Labor. In July, 1970, on a Wednesday afternoon swing shift, a black auto worker at a Detroit Chrysler plant pulled out an M-1 carbine and killed three supervisory personnel before he was subdued by UAW committeemen. It should be added that two others were shot dead in separate auto plant incidents within weeks of the Johnson shooting spree, and that in May, 1971 a jury found Johnson innocent because of insanity after visiting and being shocked by what they considered the maddening conditions at Johnson’s place of work. The sixty-seven day strike at General Motors by the United Auto Workers in the Fall of 1970 is a classic example of the anti-employee nature of the conventional strike, perfectly illustrative of the ritualized manipulation of the individual which is repeated so often and which changes absolutely nothing about the nature of work. A Wall Street Journal article of October 29, 1970 discussed the reasons why union and management agreed on the necessity of a strike. The UAW saw that a walk-out would serve as “an escape valve for the frustrations of workers bitter about what they consider intolerable working conditions,” and a long strike would “wear down the expectations of members.” The Journal went on to point out that “among those who do understand the need for strikes to ease intra-union pressures are many company bargainers… They are aware that union leaders may need such strikes to get contracts ratified and get re-elected.” Or, as William Serrin succinctly put it: “A strike, by putting the workers on the street, rolls the steam out of them; it reduces their demands and thus brings agreement and ratification; it also solidifies the authority of the union hierarchy.” Thus, the strike was called. The first order of the negotiating business was the dropping of all job condition demands, which were only raised in the first place as a public relations gesture to the membership. With this understood, the discussions and publicity centered around wages and early retirement benefits exclusively, and the charade played itself out to its preordained end. “The company granted each demand [UAW president] Woodcock had made, demands he could have had in September.” Hardly surprising, then, that GM loaned the union $23 million per month during the strike.
As Serrin conceded, the company and the union are not even adversaries, much less enemies. In November, 1970, the fuel deliverers of New York City, exasperated by their union president’s resistance to pleas for action, gave him a public beating. Also in New York, in the following March the Yellow Cab drivers ravaged a Teamsters’ Union meeting hall in Manhattan in response to their union officials’ refusal to yield the floor to rank and file speakers. In January, 1971, the interns at San Francisco General Hospital struck, solely over hospital conditions and patient care. Eschewing any ties to organized labor, their negotiating practice was to vote publicly on each point at issue, with all interns present. The General Motors strike of 1970 discussed above in no way dealt with the content of jobs. Knowing that it would face no challenge from the UAW, especially, it was thought, so soon after a strike and its cathartic effects, GM began in 1971 a co-ordinated effort at speeding up the making of cars, under the name General Motors Assembly Division, or GMAD. The showplace plant for this re-organization was the Vega works at Lordstown, Ohio, where the workforce was 85% white and the average age 27. With cars moving down the line almost twice as fast as in pre-GMAD days, workers resorted to various forms of on-the-job resistance to the terrific pace. GM accused them of sabotage and had to shut down the line several times. Some estimates set the number of deliberately disabled cars as high as 500,000 for the period of December, 1971 to March, 1972, when a strike was finally called following a 97% affirmative vote of Lordstown’s Local 1112. But a three-week strike failed to check the speed of the line, the union, as always, having no more desire than management to see workers effectively challenging the control of production. The membership lost all confidence in the union; Gary Bryner, the 29-year-old president of Local 1112, admitted: “They’re angry with the union; when I go through the plant I get catcalls.” In the GMAD plant at Norwood, Ohio, a strike like that at Lordstown broke out in April and lasted until September. The 174 days constituted the longest walkout in GM history. The Norwood workers had voted 98% in favor of striking in the previous February, but the UAW had forced the two locals to go out separately, first Lordstown and later Norwood, thus isolating them and protecting the GMAD program. Actually, the anti-worker efforts of the UAW go even further back, to September of 1971, when the Norwood Local 674 was put in receivership, or taken over, by the central leadership when members had tried to confront GMAD over the termination of their seniority rights. In the summer of 1973, three wildcat strikes involving Chrysler facilities in Detroit took place in less than a month. Concerning the successful one-day wildcat at the Jefferson assembly plant, UAW vice president Doug Fraser said Chrysler had made a critical mistake in “appeasing the workers,” and the Mack Avenue walkout was effectively suppressed when a crowd of “UAW local union officers and committeemen, armed with baseball bats and clubs, gathered outside of the plant gates to ‘urge’ the workers to return.” October, 1973 brought the signing of a new three-year contract between Ford and the UAW.
But with the signing appeared fresh evidence that workers intend to involve themselves in decisions concerning their work lives: “Despite the agreement, about 7,700 workers left their jobs at seven Ford plants when the strike deadline was reached, some because they were unhappy with the secrecy surrounding the new agreement.” With these brief remarks on a very small number of actions by workers, let us try to arrive at some understanding of the overall temper of American wage-earners since the mid-1960s. Sidney Lens found that the number of strikes during 1968, 1969, and 1971 was extremely high, and that only the years 1937, 1944-46, and 1952-53 showed comparable totals. More interesting is the growing tendency of strikers to reject the labor contracts negotiated for them. In those contracts in which the Federal Mediation and Conciliation Service took a hand (the only ones for which there are statistics), contract rejections rose from 8.7% of the cases in 1964, to 10% in 1965, to 11% in 1966, to an amazing 14.2% in 1967, levelling off since then to about 12% annually. And the ratio of work stoppages occurring during the period when a contract was in effect has changed, which is especially significant when it is remembered that most contracts specifically forbid strikes. Bureau of Labor Statistics figures reveal that while about one-third of all stoppages in 1968 occurred under existing agreements, “an alarming number,” almost two-fifths of them in 1972 took place while contracts were in effect. In 1973 Aronowitz provided a good summary: “The configuration of strikes since 1967 is unprecedented in the history of American workers. The number of strikes as a whole, as well as rank-and-file rejections of proposed union settlements with employers and wildcat actions, has exceeded that in any similar period in the modern era.” And as Sennett and Cobb, writing in 1971, made clear, the period has involved “the most turbulent rejection of organized union authority among young workers.” The 1970 GM strike was mentioned as an example of the usefulness of a sham struggle in safely releasing pent-up employee resentment. The nation-wide telephone workers’ strike of July, 1971 is another example, and the effects of the rising tide of anti-union hostility can also be seen in it. Rejecting a Bell System offer of a 30% wage increase over three years, the Communication Workers’ union called a strike, publicly announcing that the only point at issue was that “we need 31 to 32 per cent,” as union president Joseph Beirne put it. After a six-day walkout, the 1% was granted, as was a new Bell policy requiring all employees to join the union and remain in good standing as a condition of employment. But while the CWA was granted the standard ‘union-shop’ status, a rather necessary step for the fulfillment of its role as a discipline agent of the work force, thousands of telephone workers refused to return to their jobs, in some cases staying out for weeks in defiance of CWA orders. The calling of the 90-day wage-price freeze on August 15 was in large part a response to the climate of worker unruliness and independence, typified by the defiant phone workers. Aside from related economic considerations, the freeze and the ensuing controls were adopted because the unions needed government help in restraining the workers. Sham strikes clearly lose their effectiveness if employees refuse to play their assigned roles, remaining, for example, on strike on their own.
George Meany, head of the AFL-CIO, had been calling for a wage-price freeze since 1969, and in the weeks prior to August 15 had held a number of very private meetings with President Nixon. Though he was compelled to publicly decry the freeze as “completely unfair to the worker” and “a bonanza to big business,” he did not even call for an excess profits tax; he did come out strongly for a permanent wage-price control board and labor’s place on it, however. It seems clear that business leaders understood the need for government assistance. In September, a Fortune article proclaimed that “A system of wage-price review boards is the best hope for breaking the cost-push momentum that individual unions and employers have been powerless to resist.” As workers try to make partial compensation for their lack of autonomy on the job by demanding better wages and benefits, the only approved concessions, they create obvious economic pressure, especially in an inflationary period. Arthur M. Louis, in November’s Fortune, realized that the heat had been on labor officials for some time. Speaking of the “rebellious rank and file” of longshoremen, miners, and steelworkers, he said, “Long before President Nixon announced his wage-price freeze, many labor leaders were calling for stabilization, if only to get themselves off the hook.” A Fortune editorial of January (1972) predicted that by the Fall, a national “wave of wildcat strikes” might well occur and the labor members of the tripartite control board would resign. In fact, Meany and Woodcock quit the Pay Board much earlier in the year than that, due precisely to the rank and file’s refusal to support the plainly anti-labor wage policies of the board. Though Fitzsimmons of the Teamsters stayed on, and the controls continued through a total of four “Phases” until early 1974, the credibility of the controls program was crippled, and its influence waned rapidly. Though the program was brought to a premature end, the Bureau of Labor Statistics gave its ceiling on wage increases much of the credit for the fact that the number of strikes in 1972 was the smallest in five years. During “Phase One” of the controls, the 90-day freeze, David Deitch wrote that “the new capitalism requires a strong centralized trade union movement with which to bargain.” He made explicit exactly what kind of “strength” would be needed: “The labor bureaucracy must ultimately silence the rank and file if it wants to join in the tripartite planning, in the same sense that the wildcat strike cannot be tolerated.” In this area, too, members of the business community have shown an understanding of the critical role of the unions. In May, 1970, within hours of the plane crash that claimed UAW chief Walter Reuther, there was publicly expressed corporate desire for a replacement who could continue to effectively contain the workers. “It’s taken a strong man to keep the situation under control,” Virgil Boyd, Chrysler vice chairman, told the New York Times. “I hope that whoever his successor is can exert great internal discipline.” Likewise, Fortune bewailed the absence of a strong union in the coalfields in a 1971 article subtitled “The nation’s fuel supply, as well as the industry’s prosperity, depends on a union that has lost control of its members.” Despite the overall failure of the wage control program, the government has been helping the unions in several other ways.
Since 1970, for example, it has worked to reinforce the conventional strike—again due to its important safety-valve function. In June, 1970, the U.S. Supreme Court ruled that an employer could obtain an injunction to force employees back to work when a labor agreement contains a no-strike pledge and an arbitration clause. “The 1970 decision astonished many observers of the labor relations scene,” directly reversing a 1962 decision of the Court, which ruled that such walkouts were merely labor disputes and not illegal. Also in 1970, during the four-month General Electric strike, Schenectady, New York, officials “pleaded with non-union workers to refrain from crossing picket lines on the grounds that such action might endanger the peace.” A photo of the strike scene in Fortune was captioned: “Keeping workers out—workers who were trying to cross picket lines and get to their jobs—became the curious task of Schenectady policemen.” A Supreme Court decision in 1972 indicated how far state power will go to protect the spectacle of union strikes. “Four California Teamsters were ordered reinstated with five years’ back pay as a unanimous Supreme Court ruled [November 7, 1972] that it is an unfair labor practice for an employer to fire a worker solely for taking part in a strike.” Government provides positive as well as negative support to approved walkouts, too. An 18-month study by the Wharton School of Finance and Commerce found that welfare benefits, unemployment compensation, and food stamps to strikers mean that “the American taxpayer has assumed a significant share of the cost of prolonged work stoppages.” But in some areas, unions would rather not even risk official strikes. The United Steelworkers of America—which allows only union officials to vote on contract ratifications, by the way—agreed with the major steel companies in March, 1973 that only negotiations and arbitration would be used to resolve differences. The Steelworkers’ contract approved in April, 1974 declared that the no-strike policy would be in effect until at least 1980. A few days before, in March, a federal court threw out a suit filed by rank and file steelworkers, ruling that the union needn’t be democratic in reaching its agreements with management. David Deitch, quoted above, said that the stability of the system required a centralized union structure. The process of centralization has been a fact, and its acceleration has followed the increasing militancy of wage-earners since the middle-1960s. A June, 1971 article in the federal Monthly Labor Review discussed the big increase in union mergers over the preceding three years. August, 1972 saw two such mergers: the union of the United Papermakers and Paperworkers and the International Brotherhood of Pulp, Sulphite, and Paper Mill Workers, and that of the United Brewery Workers with the Teamsters. In a speech made on July 5, 1973, Longshoremen’s president Harry Bridges called for the formation of “one big national labor movement or federation.” The significance of this centralization movement is that it places the individual even further from a position of possible influence over the union hierarchy—at a time when he is more and more likely to be obliged to join a union as a condition of employment. The situation is beginning to resemble in some ways the practice in National Socialist Germany, of requiring the membership of all workers in ‘one big, national labor movement or federation,’ the Labor Front.
In the San Francisco Bay area, for example, in 1969, “A rare—and probably unique—agreement that will require all the employees of a public agency to join a union or pay it the equivalent of union dues was reported in Oakland by the East Bay Regional Park District.” And in the same area this process was upheld in 1973: “A city can require its employees to pay the equivalent of initiation fees and dues to a union to keep their jobs, arbitrator Robert E. Burns has ruled in a precedent-setting case involving the city of Hayward.” This direction is certainly not limited to public employees, according to the Department of Labor. Their “What Happens When Everyone Organizes” article implied the inevitability of total unionization. Though a discussion of the absence of democracy in unions is outside the scope of this essay, it is important to emphasize the lack of control possessed by the rank and file. In 1961 Joel Seidman commented on the subjection of the typical union membership: “It is hard to read union constitutions without being struck by the many provisions dealing with the obligations and the disciplining of members, as against the relatively small number of sections concerned with members’ rights within the organization.” Two excellent offerings on the subject written in the 1970s are Autocracy and Insurgency in Organized Labor by Burton Hall and “Apathy and Other Axioms: Expelling the Union Dissenter from History,” by H.W. Benson. Relatively unthreatened by their memberships, the unions have entered into ever-closer relations with government and business. A Times-Post Service story of April, 1969 disclosed a three-day meeting between AFL-CIO leadership and top Nixon administration officials, shrouded in secrecy at the exclusive Greenbrier spa. “Big labor and big government have quietly arranged an intriguing tryst this week in the mountains of West Virginia—for a private meeting involving at least half a dozen cabinet members.” Similarly, a surprising New York Times article appearing on the last day of 1972 is worth quoting for the institutionalizing of government-labor ties it augurs: “President Nixon has offered to put a labor union representative at a high level in every federal government department, a well-informed White House official has disclosed. The offer, said to be unparalleled in labor history, was made to union members on the National Productivity Commission, including George Meany, president of the AFL-CIO, and Frank E. Fitzsimmons, president of the IBT, at a White House meeting last week… labor sources said that they understood the proposal to include an offer to place union men at the assistant secretary level in all relevant government agencies… should the President’s offer be taken up, it would mark a signal turning point in the traditional relations between labor and government.” In Oregon, the activities of the Associated Oregon Industries, representing big business, and the Oregon AFL-CIO by the early ‘70s reflected a close working relationship between labor and management on practically everything. Joint lobbying efforts, against consumer and environmentalist proposals especially, and other forms of cooperation led even to an exchange of speakers at each other’s conventions in the Fall of 1971. On September 2, the president of the AOI, Phil Bladine, addressed the AFL-CIO; on September 18, AFL-CIO president Ed Whalen spoke before the AOI.
In California, as in many other states, the pattern has been very much the same, with labor and business working together to attack conservationists in 1972 and defeat efforts to reform campaign spending in 1974, for example. Also revealing is the “Strange Bedfellows from Labor, Business’ Own Dominican Resort” article on the front page of the May 15, 1973 Wall Street Journal by Jonathan Kwitny. Among the leading stockholders in the 15,000-acre Punta Cana, Dominican Republic resort and plantation are George Meany and Lane Kirkland, president and secretary-treasurer of the AFL-CIO, and Keith Terpe, Seafarers’ Union official, as well as leading officers of Seatrain Lines, Inc., which employs members of Terpe’s union. Not seen for what they are, the striking cases of mounting business-labor-government collusion and cooperation have largely been overlooked. But those in a position to see that the worker is more and more actively intolerant of a daily work life beyond his control also realize that even closer cooperation is necessary. In early 1971 Personnel, the magazine of the American Management Association, said that “it is perhaps time for a marriage of convenience between the two [unions and management]” for the preservation of order, pointing out, however, that many members “tend to mistrust the union.” The reason for this “mistrust,” as we have seen, is the historical refusal of unions to interfere with management’s control of work. The AFL-CIO magazine, The American Federationist, admitted labor’s lack of interest and involvement in an article in the January, 1974 issue entitled “Work is here to stay, alas.” And the traditional union position on the matter is why, in turn, C. Jackson Grayson, Dean of the School of Business Administration at Southern Methodist University and former chairman of the Price Commission, called in early 1974 for union-management collaboration. The January 12 issue of Business Week contains his call for a symbolic dedication on July 4, 1976, “with the actual signing of a document—a Declaration of Interdependence” between labor and business, “inseparably linked in the productivity quest.” Productivity—output per hour of work—has of course fallen due to worker dissatisfaction and unrest. A basic indication of the continuing revolt against work is the joint campaigns for higher productivity, such as the widely publicized US Steel-United Steelworkers efforts. A special issue on productivity in Business Week for September 9, 1972 highlighted the problem, pointing out also the opposition workers had to union-backed drives of this kind. Closely related to low productivity, it seems, is the employee resistance to working overtime, even during economic recession. The refusal of thousands of Ford workers to work overtime prompted a Ford executive in April, 1974 to say, “We’re mystified by the experience in light of the general economic situation.” Also during April, the Labor Department reported that “the productivity of American workers took its biggest drop on record as output slumped in all sectors of the economy during the first quarter.” In 1935 the NRA issued the Henderson Report, which counseled that “unless something is done soon, they [the workers] intend to take things into their own hands.” Something was done: the hierarchical, national unions of the CIO finally appeared and stabilized relations.
In the 1970s it may be that a limited form of worker participation in management decisions will be required to prevent employees from “taking things into their own hands.” Irving Bluestone, head of the UAW’s GM department, predicted in early 1972 that some form of participation would be necessary, under union-management control, of course. As Arnold Tannenbaum of the Institute for Social Research in Michigan pointed out in the late 1960s, ceding some power to workers can be an excellent means of increasing their subjection, if it succeeds in giving them a sense of involvement. But it remains doubtful that token participation will assuage the worker’s alienation. More likely, it will underline it and make even clearer the true nature of the union-management relationship, which will still obtain. It may be more probable that traditional union institutions, such as the paid, professional stratum of officials and representatives, monopoly of membership guaranteed by management, and the labor contract itself, will be increasingly re-examined as workers continue to strive to take their work lives into their own hands.

New York, New York
──────────────────

“Amid All the Camaraderie is Much Looting this Time.” “Seeing the City Disappear”
—Wall Street Journal headline, 15 July 1977

The Journal went on to quote a cop on what he saw, as the great Bastille Day break-out unfolded: “People are going wild in the borough of Brooklyn. They are looting stores by the carload.” Another cop added later: “Stores were ripped open. Others have been leveled. After they looted, they burned.” At about 9:00 p.m. on July 13 the power went out in New York for 24 hours. During that period the complete impotence of the state in our most ‘advanced’ urban space could hardly have been made more transparent. As soon as the lights went out, cheers and shouts and loud music announced the liberation of huge sections of the city. The looting and burning commenced immediately, with whole families joining in the “carnival spirit”. In the University Heights section of the Bronx, a Pontiac dealer lost the 50 new cars in his showroom. In many areas, tow trucks and other vehicles were used to tear away the metal gates from stores. Many multistorey furniture businesses were completely emptied by neighborhood residents. Despite emergency alerts for the state troopers, FBI and National Guard, there was really nothing authority could do, and they knew it. A New York Times editorial of July 16 somewhat angrily waved aside the protests of those who wondered why there was almost no intervention on the side of property. “Are you kidding?” the Times snorted, pointing out that such provocation would only have meant that the entire city would still be engulfed in riots, adding that the National Guard is a “bunch of kids” who wouldn’t have had a chance. The plundering was completely multi-racial, with white, black and Hispanic businesses cleaned out and destroyed throughout major parts of Manhattan, Brooklyn, Queens and the Bronx. Not a single “racial incident” was reported during the uprising, while newspaper pictures and TV news bore witness to the variously coloured faces emerging from the merchants’ windows and celebrating in the streets. Similarly, looting, vandalism, and attacks on police were not confined to the City proper; Mount Vernon, Yonkers and White Plains were among suburbs in which the same things happened, albeit on a smaller scale.
Rioting broke out in the Bronx House of Detention, where prisoners started fires, seized dormitories, and almost escaped by ramming through a wall with a steel bed. Concerning the public, the Bronx District Attorney fumed, “It’s lawlessness. It’s almost anarchy.” Officer Gary Parlefsky, of the 30th Precinct in Harlem, said that he and other cops came under fire from guns, bottles and rocks. He continued: “We were scared to death… but worse than that, a blue uniform didn’t mean a thing. They couldn’t understand why we were arresting them.” At a large store at 110th Street and Eighth Avenue, the doors were smashed open and dozens of people carried off appliances. A woman in her middle-50s walked into the store and said laughingly: “Shopping with no money required!” Attesting to the atmosphere of a “collective celebration”, as one worried columnist put it, a distribution center was spontaneously organized at a Brooklyn intersection, with piles of looted goods on display for the taking. This was shown briefly on an independent New York station, WPIX-TV, but not mentioned in the major newspapers. The transformation of commodities into free merchandise was only aided by the coming of daylight, as the festivity and music continued. Mayor Beame, at a noon (July 15) press conference, spoke of the “night of terror”, only to be mocked heartily by the continuing liberation underway throughout New York as he spoke. Much, of course, was made of the huge contrast between the events of July 1977 and the relatively placid, law-abiding New York blackout of November 1965. One can only mention the obvious fact that the dominant values are now everywhere in shreds. The “social cohesion” of class society is evaporating. New York is no isolated example. Of course, there has been a progressive decay in recent times of restraint, hierarchy, and other enforced virtues; it hasn’t happened all at once. Thus, in the 1960s, John Leggett (in his Class, Race and Labour) was surprised to learn, upon examining the arrest records of those in the Detroit and Newark insurrections, that a great many of the participants were fully employed. This time, of the 176 people indicted as of August 8 in Brooklyn (1,004 were arrested in the borough), 48% were regularly employed. (The same article in the August 9 San Francisco Chronicle where these figures appeared also pointed out that only “six grocery stores were looted while 39 furniture stores, 20 drug stores and 17 jewelry stores and clothing stores were looted.”) And there are other similarities to New York, naturally; Life magazine of 4 August 1967 spoke of the “carnival-like revel of looting” in Detroit, and Professor Edward Banfield commented that “Negroes and whites mingled in the streets [of Detroit] and looted amicably side by side…” The main difference is probably one of scale and scope — that in New York virtually all areas, even the suburbs, took the offensive and did so from the moment the lights went out. Over $1 billion was lost in the thousands of stores looted and burned, while the cops were paralyzed. During the last New York rioting, the ‘Martin Luther King’ days of 1968, 32 cops were injured; in one day in July 1977, 418 cops were injured. The Left — all of it — has spoken only of the high unemployment, the police brutality; has spoken of the people of New York only as objects, and pathetic ones at that! The gleaming achievements of the unmediated / unideologized have all the pigs scared shitless.

The Refusal of Technology
─────────────────────────

Of course everybody had to be given a personal code! How else could the government do right by its citizens, keep track of the desires, tastes, preferences, purchases, commitments and above all location of a continentful of mobile, free individuals? So don’t dismiss the computer as a new kind of fetters. Think of it rationally, as the most liberating device ever invented, the only tool capable of serving the multifarious needs of modern man. Think of it, for a change, as him.
—John Brunner, /The Shockwave Rider/

Upon the utter destruction of wage-labor and the commodity, a new life will be situated and redefined, by the moment, in countless, unimagined forms. Launched by the abolition of every trace of authority and signified by the delights and surprises of an infinity of gift-creations, freely, spontaneously expressed by everyone. Concepts like “economy,” “exchange,” “production” will have no meaning. (What is worth preserving from this lunatic order?) Perhaps mobile celebrations will replace our sense of cities, maybe even language will be obsolete. But there are those who see revolutionary transformation in rather a different light; for them the Brunner quote is, tragically, not much of a burlesque. Consider—if your stomach is strong—the following, from a 1980 ultra-leftist flyer, typical of the high-tech approach to the revolutionary question: “The development of computer technologies, now a threat to our job security, could be used to develop a network of global communications. In this way, our needs can be directly coordinated with the available labor-power and raw materials.” Leaving aside the pro-wage-labor concern for our job security, we find human activity treated (electronically) as so much “available labor-power.” Is this the language of desire? Could freedom, love and play flourish along such lines? This computerized prescription is filled by taking “control of the global social reproduction network…” Capitalism, it need hardly be added, can be defined with some precision as the global social reproduction network. Looking at the foundations of “advanced” technology—which our ultra-leftists, in their instrumentalism, always wish to ignore—even the most visionary of intentions would founder. High-tech as a vehicle, far from aiding a qualitative regeneration, denies the possibility of visionary development. The “great height made possible” by computers and the like is, alas, only an expression of the perverse logic of historical class rule. Technology has not developed neutrally, as if in the right hands it could benignly transform reality into something importantly different. The means and methods of social reproduction are necessarily in keeping with the stability of a social order. The factory system expressed the need for a disciplined proletariat; more modern modes progressively extend this “civilizing” process via specialized, usually centralized, technologies. The individual is everywhere reduced by the instruments of capitalism, as surely as by its wage-labor/commodity essence. The purveyors of “alternative technology,” it should be noted, promote a different illusion. This illusion lies in ideologizing fragments of possibly acceptable technology while ignoring that which will shape all of the future: class struggles. Simple techniques (see Fukuoka, Mollison, etc.)
for growing a huge amount of food in a few hours per year, for instance, are fraught with extremely significant implications; they present, in fact, some of the practical possibilities of living life exquisitely—as in a garden. But they can only become real if linked to the gigantic, necessary destruction of a world which impedes every utopian project. Cioran asks, “If ‘progress’ is so great an evil, how is it that we do nothing to free ourselves from it without further delay?” In fact this “freeing” is well underway, as seen in the massive “turn-off” felt toward its continuance. General Dynamics vice president Veliotis gave vent to a bitter ruling class frustration on the subject (summer 1980): “I, for one, would be delighted if our vocational schools would bring us graduates who, if not trained, were simply trainable—who could understand basic manufacturing processes, who could do shop math, who could use standard tools and gauges.” More fundamental yet is a growing refusal to participate in education at all, given its direct linkage to “progress.” The drop-out rate in NYC high schools is now over 50 percent. The drop-out rate for all California high schools has risen from 12 percent in 1970 to 22 percent in 1980, occasioning predictions of “angry future workers and high juvenile crime rates.” The relationship between technology and education is also apt for the reason that the latter provides, in its progression, such a useful, if obvious, analogy to the former. The fragmentation of knowledge into separate, artificially constructed fields constitutes the modern university and social intelligence in general—in its ridiculous division of labor. This is the perfect analog to technology itself; rather, it is more, inasmuch as both clearly work in tandem toward the ever-shrunken individual, dominated by a contrived, fractionalized scale of “information.” The ignorance thus engendered and enforced reminds us of Khayati’s allusion to the university: “Everything is said about our society except what it is.” Government thinker Willis Harman writes of the coming “information society,” based on “revolutionizing everyday life with microcomputers.” A horrible history surfaces on these words, as well as a forewarning of our future as cast by all similar techno-junkies, benevolent or otherwise. Finally, we return to the personal, which is of course the real terrain of the revolutionary axis. A character in Bellow’s /Mr. Sammler’s Planet/ wonders: “And what is ‘common’ about the ‘common life’? What if [we] were to do with ‘common life’ what Einstein did with matter? Finding its energetics, uncovering its radiance.” The radiance and the energetics will be there when we are all that “Einstein”: when every productivist, standardized separation—and every other mediation (“coordinated” or not)—is destroyed by us forever. Everything in the past and present is waiting, waiting to detonate.

Anti-work and the Struggle for Control
──────────────────────────────────────

The debacle of the air controllers’ strike and the growing difficulties unions are having in attracting new members (and holding new ones—decertification elections have increased for the last 10 years) are two phenomena that could be used to depict American workers as quite tamed overall and adjusted to their lot. But such a picture of conservative stasis would be quite unfaithful to the reality of the work culture, which is now so un-tamed as to be evoking unprecedented attention and countermeasures.
Before tackling the subject of anti-work, a few words on the status of business might be in order. Bradshaw and Vogel’s Corporations and Their Critics sees enterprise today as “faced by uncertainty and hostility on every hand.” In fact, this fairly typical book finds that “latent mistrust has grown to the point at which lack of confidence in business’s motives has become the overwhelming popular response to the role of the large corporation in the United States.” An early ‘81 survey of 24,000 prominent students, as determined by Who’s Who Among American High School Students, showed a strong anti-business sentiment; less than 20 percent of the 24,000 agreed, for example, with the proposition that most companies charge fair prices. Not surprising, then, are Peter Berger’s conclusions about current attitudes. His “New Attack on the Legitimacy of Business” is summed up, in part, thusly: “When people genuinely believe in the ‘rightness’ of certain social arrangements, those arrangements are experienced as proper and worthy of support—that is, as legitimate… American business once enjoyed this kind of implicit social charter. It does not today.” Within business, one begins to see the spread of work refusal. Nation’s Business strikes what has become a familiar chord in its introduction to Dr. H.J. Freudenberger’s “How to Survive Burn-Out”: “For many business people, life has lost its meaning. Work has become mere drudgery, off-hours are spent in a miasma of dullness.” Similar is Datamation’s “Burnout: Victims and Avoidances,” because this disabling trauma “seems to be running rampant” among data processors. Veninga and Spradley’s The Work Stress Connection: How to Cope with Job Burnout was condensed by the December 1981 Reader’s Digest. To continue in this bibliographic vein, it is worth noting the sharp increase in scholarly articles such as Kahn’s “Work, Stress, and Individual Well-Being,” Abdel-Halim’s “Effects of Role Stress-Job Design-Technology Interaction on Employee Work Satisfaction,” and Behling and Holcombe’s “Dealing with Employee Stress.” Studies in Occupational Stress, a series initiated in 1978 by Cooper and Kasl, dates the formal study of this facet of organized misery. There is other related evidence of aversion to work, including this reaction in its literal sense, namely a growth of illnesses such as job-related allergies and at least a significant part of the advancing industrial accident rate since the early ‘60s. There comes to mind the machinist who becomes ill by contact with machine oil, the countless employees who seem to be accident-prone in the job setting. We are just beginning to see some awareness of this sort of phenomenon, the consequences of which may be very significant. And, of course, there is absenteeism, probably the most common sign of antipathy to work and a topic that has called forth a huge amount of recent attention from the specialists of wage-labor. Any number of remedies are hawked; Frank Kuzmits’ offering, “No Fault: A New Strategy for Absenteeism,” for example. Deitsch and Dill’s “Getting Absent Workers Back on the Job: The Case of General Motors” puts the annual cost to GM at $1 billion plus, and observes that “Absenteeism is of increasing concern to management and organized labor alike.” There are other well-known elements of the anti-work syndrome. The inability of some firms to get a shift working on time is a serious problem; this is why Nucor Corp.
offers a 4 percent pay hike for each ton of steel produced above a target figure, up to a 100 percent pay bonus for those who show up as scheduled and work the whole shift. The amount of drinking and drug-taking on the job is another form of protest, occasioning a great proliferation of employee alcoholism and drug abuse programs by every sort of company. Teresine and Russell confront the “staggering” employee theft phenomenon, observing that it has become “more widespread and professional in recent years.” Turnover (considered as a function of the quit rate and not due to layoffs, of course), very high since the early 1970s, has inched up further. All of these aspects come together to produce the much publicized productivity, or output per hour worked, crisis. Blake and Mouton provide some useful points; they recognize, for example, that the “declining productivity rate and the erosion of quality in industry have caused grave concern in this country” and that “industry is pouring more money than ever into training and development,” while “the productivity rate continues to fall.” Further, “attitudes among workers themselves,” including, most basically, an “erosion of obedience to authority,” are seen as at the root of the problem. Unlike many confused mainstream analyses of the situation—or the typical leftist denial of it as either a media chimera or an invention of the all-powerful corporations—our two professors can at least realize that “Basic to the decline in productivity is the breakdown of the authority-obedience means of control”; this trend, moreover, “which is one manifestation of the broader social disorder… will continue indefinitely without corrective action,” they say. Librarian R.S. Byrne gives a useful testimonial to the subject in her compendious “Sources on Productivity,” which lists some of the huge outpouring of articles, reports, books, newsletters, etc., from a variety of willing helpers of business, including those of the Work in America Institute, the American Productivity Center, the American Center for the Quality of Work Life, and the Project on Technology, Work and Character, to name a few. As Byrne notes, “One can scarcely pick up any publication without being barraged by articles on the topic written from every possible perspective.” The reason for the outpouring is of course available to her: “U.S. productivity growth has declined continuously in the past 15 years and the trend appears to be worsening.” The August 1981 Personnel Administrator, devoted entirely to the topic, declares that “Today poor productivity is the United States’ number one industrial problem.” Administrative Management reasons, in George Crosby’s “Getting Back to Basics on Productivity,” that no progress can occur “until all individuals begin viewing productivity as their own personal responsibility.” “How Deadly Is the Productivity Disease?” asked Stanley Henrici recently in the Harvard Business Review. An endless stream, virtually an obsession. Dissatisfaction with work and the consequences of this have even drawn the Pope’s attention. John Paul II, in his Laborem Exercens (Through Work) encyclical of September 1981, examines the idea of work and the tasks of modern management. On a more prosaic level, one discovers that growing employee alienation has forced a search for new forms of work organization.
The December 1981 Nation’s Business has located a new consensus in favor of “more worker involvement in decision-making.” James O’Toole’s “Making America Work” emphasizes the changed work culture with its low motivation and prescribes giving workers the freedom to design their own jobs, set their own work schedules and decide their own salaries. The productivity crisis has clearly led to the inauguration of worker participation, in a burgeoning number of co-determination arrangements since the mid-70s. The May 11, 1981 Business Week announced the arrival of a new day in U.S. management with its cover story and special report, “The New Industrial Relations.” Proclaiming the “almost unnoticed” ascendancy of a “fundamentally different way of managing people,” it claimed that the “authoritarian” approach of the “old, crude workplace ethos” is definitely passing, aided “immeasurably” by the growing collaboration of trade unions. “With the adversarial approach outmoded, the trend is toward more worker involvement in decisions on the shop floor—and more job satisfaction, tied to productivity.” Shortly after this analysis, Business Week’s “A Try at Steel-Mill Harmony” recounted the labor-management efforts made between the U.S. steel industry and the United Steelworkers to “create a cooperative labor climate where it matters most: between workers and bosses on the mill floor.” The arrangements, which are essentially production teams made up of supervisors, local union officials, and workers, were provided for in 1980 contracts with the nine major steel companies, but not implemented until after early 1981 union elections because of the unpopularity of the idea among many steelworkers. “The participation team concept… was devised as a means of improving steel’s sluggish productivity growth rate,” the obvious reason for a climate of disfavor in the mills. In a series of Fortune articles appearing in June, July and August 1981, the new system of industrial organization is discussed in some depth. “Shocked by faltering productivity,” according to Fortune, America’s corporate managers have moved almost overnight toward the worker involvement approach (after long ignoring the considerable Northern European experience), which “challenges a system of authority and accountability that has served most of history.” With a rising hopefulness, big capital’s leading magazine announces that “Companies which have had time to weigh the consequences of participative management are finding that it informs the entire corporate culture.” Employees “are no longer just workers, they become the lowest level of management,” it says, echoing such recent books as Myers’ Every Employee a Manager. The bottom line of such programs, which also go by the name “quality of work life,” is never lost sight of. G.P. Strippoli, a plant manager of the TRW Corp., provides the guiding principle: “The workers know that if I feel there’s no payback to the company in the solution they arrive at, there will be a definite no. I’m not here to give away the store or run a country club.” In effect in about 100 auto manufacturing and assembly plants, co-management replaces the traditional, failed ways of pushing productivity. Auto, with virtually nothing to lose, has jumped for the effort to get workers to help run the factories. “As far as I’m concerned, it’s the only way to operate the business—there isn’t another way in today’s world,” says GM President F. James McDonald.
United Auto Workers committeemen and stewards are key co-leaders with management in the drive to “gain higher product quality and lower absenteeism.” Similar is the campaign for worker involvement in the AT&T empire, formalized in the 1980 contract with the Communication Workers of America. The fight to bolster output per hour is as much the unions’ as it is management’s; anti-work feelings are as responsible for the decline of the bodyguards of capital as they are for the productivity crisis proper. AFL-CIO Secretary-Treasurer T.R. Donahue has found in the general productivity impasse the message that the time has come for a “limited partnership—a marriage of convenience” with business. Fortune sees in formal collaboration “interesting possibilities for reversing the decline” of organized labor. Business Week’s “Quality of Work Life: Catching On” observes that shop-floor worker participation and the rest of the QWL movement is “taking root in everyday life.” Along the same lines, the October 1981 issue of Productivity notes that half of 500 firms surveyed now have such involvement programs. William Ouchi’s 1981 contribution to the industrial relations literature, Theory Z, cites recent research, such as that of Harvard’s James Medoff and MIT’s Kathryn Abraham, to point out the productivity edge that unionized companies in the United States have over non-union ones. And David Lewin’s “Collective Bargaining and the Quality of Work Life” argues for a further union presence in the QWL movement, based on organized labor’s past ability to recognize the constraints of work and support the ultimate authority of the workplace. It is clear that unions hold the high ground in a growing number of these programs, and there seems to be a trend toward co-management at ever higher levels. Douglas Fraser, UAW president, sits on the board of directors at Chrysler—a situation likely to spread to the rest of auto—and the Teamsters union appears close to putting its representative on the board at Pan-American Airways. Joint labor-management efforts to boost productivity in construction have produced about a dozen important local collaborative setups involving the building trade unions, like Columbus’ MOST (Management and Organized Labor Striving Together), Denver’s Union Jack, and PEP (Planning Economic Progress) in Beaumont, Texas. Business Horizons editorialized in 1981 about “the newly established Industrial Board with such luminaries as Larry Shaprin of DuPont and Lane Kirkland of the AFL-CIO” as a “mild portent” of the growing formal collaboration. The board, a reincarnation of the Labor Management Board that expired in 1978, is chaired by Kirkland and the chairman of Exxon, Clifton C. Garvin Jr. The defeat in 1979 of the Labor Law Reform Act, which would have greatly increased government support to unionization, was seen by many as almost catastrophic given labor’s organizing failures. But the economic crisis, perhaps especially in light of generous union concessions to the auto, airlines, rubber, trucking and other industries, may provide the setting for a “revitalization” of the national order, including a real institutionalization of labor’s social potential to contain the mounting anti-work challenge. There is already much pointing to such a possibility, beyond even the huge worker participation-QWL movement with its vital union component.
The 1978 Trilateral Commission report on comparative industrial relations spoke in very glowing terms about the development of neo-corporatist institutions (with German “co-determination” by unions and management as its model). Business Week of June 30, 1980, a special issue on “The Reindustrialization of America,” proclaimed that “nothing short of a new social contract” between business, labor and government, and “sweeping changes in basic institutions” could stem the country’s industrial decline. Thus, when the AFL-CIO’s Kirkland called in late 1981 for a tripartite National Reindustrialization Board, a concept first specifically advanced by investment banker Felix Rohatyn, the recent theoretical precedents were well in place.

One of the main underlying arguments by Rohatyn and others is that labor will need the state to help enforce its productivity programs in its partnership with management. Thus would spreading “worker involvement” be utilized, but shepherded by the most powerful political arrangements. Wilber and Jameson’s “Hedonism and Quietism” puts the matter in general yet historical terms: “Ways must be found to revitalize mediating institutions from the bottom up. A good example is Germany’s efforts to bring workers into a direct role in decision-making.” A change of this sort might appear to be too directly counter to the ideology of the Reagan government, but it actually would be quite in line with the goal of renewed social control minus spending outlays. Washington, after all, has been trying to reduce its instrumentalities because this giant network of programs is past its ability to coherently manage, just as its cutbacks also reflect the practical failure of government social pacification programs.

Meanwhile, the refusal of work grows. One final example is the extremely high teenage unemployment rate, which continues to climb among all groups and is the object of a growing awareness that a very big element is simply a rejection of work, especially low-skill work, by the young. And legion are the reports that describe the habits of teenagers who do work as characterized by habitual tardiness, chronic absenteeism, disrespect for supervisors and customers, etc. Which recalls the larger picture drawn by Frederick Herzberg in his “New Perspectives on the Will to Work”: “the problem is work motivation—all over the world. It’s simply a matter of people not wanting to work.”

The gravity of the anti-work situation seems now to be approaching an unprecedented structural counter-revolution. Tripartism dates back to World War I, to Coolidge in peacetime, but the addition of a mass-participation schema is just beginning to emerge as a national hypothesis. Of course, this nascent reaction intersects with a political tide of non-participation (e.g. declining voter turnout, massive non-registration for the draft, growing tax evasion). The larger culture of withdrawal, from the state as from work, will make this integration effort highly problematic, and may even produce a more effective exposure of capital’s organization of life, given that organization’s heightened dependence on its victims’ participation.

PART THREE
══════════

The Promise of the ‘80s
───────────────────────

The ‘80s So Far
───────────────

Present-Day Banalities
──────────────────────

Media, Irony and “Bob”
──────────────────────

Afterword
─────────

Commentary on Form and Content in /Elements of Refusal/ by Paul Z. Simons
──────────────────────────────────────────────────────────────────────────
Footnotes
─────────