(slightly skeptical) Educational society promoting "Back to basics" movement against IT overcomplexity and bastardization of classic Unix
"Computers eat people" was a pretty prophetic warning. It took time to materialize but we are almost there.
For example, between 2000 and 2010 the jobs of 1.1 million secretaries were eliminated, replaced by internet services; the number of telephone operators dropped by 64%, travel agents by 46% and bookkeepers by 26%. Low-level managers survived slightly better than this only because they are important politically, serving as a buffer between the elite and those in the trenches. The real danger is drastic changes due to 24x7 surveillance of the workforce (Amazon warehouse workers, FedEx, truckers with GPS installed on the truck, etc.) and moving part of the job "up the chain" (the disappearance of secretaries is a good example here). Another area being hit by automation is the legal business. Scanning, text recognition and search have eliminated countless legal and paralegal jobs. Even the big, high-powered law firms are laying people off. Cashiers have already been hurt and will be hurt more in the next two or three years (ShopRite and several other large retail chains already have computerized checkout machines). Delivery truck drivers on some fixed routes might be next. Plane pilots, at least for cargo planes, are in danger as well.
At the same time, humans are the only ones able to handle the complexity of rules typical for many occupations. That means the number of jobs in government and healthcare, which are generally proportional to the size of the population, will somewhat grow. Guard labor, the labor engaged in protecting the elite, will also grow faster than other sectors, as rising inequality increases this type of danger dramatically.
Computers can put great pressure on people with low-level "floor" jobs even when they can't replace them, turning them into "human robots". The job becomes "plantation" work with a very cruel, unforgiving, never-sleeping guard, and a "performance review" torture chamber awaits those who can't keep pace. Amazon is one example here: Is Amazon Running a Sweatshop in Pennsylvania?
Everyone loves Amazon, except some employees in its Lehigh Valley, Pennsylvania warehouse. According to a report in local newspaper The Morning Call, working conditions within the facility are unbearable.
The report describes a factory with temperatures in the 100s, employees passing out from the heat and workers being terminated when they can't keep up the pace. Workers tolerate these conditions because the economy stinks and they need these jobs.
And these reports are not isolated to the 20 employees interviewed by The Morning Call. Numerous employees and security guards at Amazon complained to OSHA about these conditions. Even the local emergency room supposedly reported Amazon to federal regulators because it treated so many employees for heat-related illness.
And how the greedy elite will deal with surplus people is a very interesting question. Bill Joy, in "Why the future doesn't need us", was not optimistic (http://www.wired.com/wired/archive/8.04/joy.html):
As society and the problems that face it become more and more complex and machines become more and more intelligent, people will let machines make more of their decisions for them, simply because machine-made decisions will bring better results than man-made ones. Eventually a stage may be reached at which the decisions necessary to keep the system running will be so complex that human beings will be incapable of making them intelligently. At that stage the machines will be in effective control. People won't be able to just turn the machines off, because they will be so dependent on them that turning them off would amount to suicide.
On the other hand it is possible that human control over the machines may be retained. In that case the average man may have control over certain private machines of his own, such as his car or his personal computer, but control over large systems of machines will be in the hands of a tiny elite - just as it is today, but with two differences. Due to improved techniques the elite will have greater control over the masses; and because human work will no longer be necessary the masses will be superfluous, a useless burden on the system. If the elite is ruthless they may simply decide to exterminate the mass of humanity.
As for the right-wing assertion that "No amount of time and effort is going to transform the bottom 1/3 and middle 1/3 of the nation into people who think and behave like the top 1/3," you need to understand that part of the difference is due to different access to education and conditions at home, not to abilities.
Computerization leads to higher centralization and growth of inequality. Under these conditions the elite is already becoming more like the aristocracy of the past -- an inherited position. So the key question is what "standard of living" the elite will allow the "peasants". As Bill Joy noted, in conditions of shrinking natural resources, and first of all oil, "If the elite is ruthless they may simply decide to exterminate" the "non-performing assets", so to speak.
The hollowing out of middle-class jobs has become a non-stop process. Now computer programmers (and IT workers in general) are themselves finding difficulties in employment and are being sent packing in increasing numbers.
With the neoliberal "winner takes all" policy, inequality is going through the roof. The new jobs being created are mainly McJobs.
So we have, as the unforgettable Bush II quipped, the "have mores" -- say, families with annual income above $100K (the top 20%); many members of this blog belong to that top 20%. And we have the "have nones" (let's say families with less than $40K of annual income, with the bottom 20% getting less than $20K), with little in between. Anne probably can help me with stats.
In other words, computers are more disruptive than, say, the machines smashed by the Luddites. The current elites are increasingly greedy; little good can be said about them. Instead of upgrading the technical skills of the work force, advanced technology in the service of neoliberal capitalism makes skilled labor in the USA and other Western countries superfluous.
So it is an established fact that computers promote inequality as well as industrial (transnational mega-corporations) and political centralization, and in combination with neoliberalism create a very poisonous and maybe even explosive mix. That's why unemployment is structural and probably will stabilize at the current level. That's probably one reason why the S&P 500 is close to 1700 (which makes some people so excited about the performance of a balanced Bogle-style portfolio ;-)
That's also why the national security state is now a reality in the USA, and why the militarization of police forces proceeds non-stop, with drones now added to the arsenal and face recognition technology tested on crowds.
June 8, 2015 | blogs.nytimes.com
That is the world imagined by government officials and technologists at the Defense Advanced Research Projects Agency, the American military organization that is charged with the mission of avoiding a Sputnik-style technology threat to national security. Last weekend at the sprawling Los Angeles County Fairgrounds, Darpa concluded the Robotics Challenge, a two-year-long effort to jump start this next generation of smart and presumably helpful robots by offering a cash prize for the designers of a machine that could work in concert with human controllers in a hazardous environment.
The $3.5 million competition was won by a South Korean team from KAIST, formerly the Korea Advanced Institute of Science and Technology.
The technology may still seem far-fetched, but betting against the agency that has had a remarkably far-reaching effect on the modern world - from funding the work that led to both the personal computer and the Internet, to setting expectations that self-driving vehicles are only a matter of years away - might be a mistake.
Darpa officials have taken pains to assure anyone who would listen that it was not primarily interested in designing Terminators, or killer robots. The agency is an arm of the Pentagon, and its futuristic robots are an example of what is described as a "dual use" technology that will have both military and civilian uses.
Darpa, which is also known for pioneering the Internet surveillance system that was exposed last year by Edward J. Snowden, has, under its current director, Arati Prabhakar, expanded its watchfulness over the potential effect of the technologies it helps foster.
In introducing a workshop for discussion on the effect of robotics held at the end of the challenge competition on Sunday, Dr. Prabhakar described the agency as being committed to a broader mission: "We work together to build the future of robots that can help extend the capabilities that we have and build the technologies that will aid humanity in the future."
That has become something of a personal cause for Gill Pratt, the roboticist who has overseen the Robotics Challenge. A former Massachusetts Institute of Technology and Olin College engineering professor, Dr. Pratt gave an impassioned speech about the positive potential for humanoid robots.
"You would assume that all the people who watched these machines would be filled with fear and anxiety because after all, all we hear about in the news is people saying robots are going to take over the world, they are going to kill all of us, we should run away," he said. "There is a new discovery that we made here besides all the technology. That discovery is that there is some new untapped affinity between people and robots that we saw really for the first time today."
Under Tony Tether, who was Darpa's director from June 2001 to February 2009, there was less concern at the agency about questions of societal impact. Dr. Tether oversaw the DARPA Grand Challenge in 2004, which had a narrower goal of helping meet a congressional mandate that one-third of the military's land vehicles would become autonomous by 2015.
While that goal remains far off, the earlier contest has had an effect on the world's automakers after Google hired Sebastian Thrun, a Stanford professor who led the original winning team in 2005, to begin a self-driving car research project. Since then, virtually all of the world's major automobile makers have set up research laboratories in Silicon Valley.
In a similar fashion, it now appears that the new Robotics Challenge may have a direct role in jump starting an imaginative new commercial industry making mobile humanoid and even more flexible robots.
"You won't go in and replace all these people all at once with robots," said Rodney Brooks, who has founded several robotics companies including iRobot and Rethink Robotics. "We'll see robots creep into our lives, getting the robots to do a few of the tasks at a time."
Aug 25, 2013 | economistsview.typepad.com/economistsview
I've had many posts on the Autor and Dorn paper on the hollowing out of the middle class (see here too), and this is yet another, but let me add one thing. The paper explains the demise of the middle class as a result of technological change. However, there are those who argue that the troubles of the working class have other causes, e.g. the demise of unions as politicians favored business over labor, or other political/institutional changes that worked against the middle class. My own view is that it wasn't one or the other; both technology and politics mattered:
How Technology Wrecks the Middle Class, by David Autor and David Dorn, Commentary, NY Times: In the four years since the Great Recession officially ended, the productivity of American workers - those lucky enough to have jobs - has risen smartly. But the United States still has two million fewer jobs than before the downturn, the unemployment rate is stuck at levels not seen since the early 1990s and the proportion of adults who are working is four percentage points off its peak in 2000.
This job drought has spurred pundits to wonder whether a profound employment sickness has overtaken us. And from there, it's only a short leap to ask whether that illness isn't productivity itself. Have we mechanized and computerized ourselves into obsolescence?
Are we in danger of losing the "race against the machine," as the M.I.T. scholars Erik Brynjolfsson and Andrew McAfee argue in a recent book? Are we becoming enslaved to our "robot overlords," as the journalist Kevin Drum warned in Mother Jones? Do "smart machines" threaten us with "long-term misery," as the economists Jeffrey D. Sachs and Laurence J. Kotlikoff prophesied earlier this year? Have we reached "the end of labor," as Noah Smith laments in The Atlantic? ...
EMIchael:
Baker weighs in:
"The story told by Autor and Dorn is that technology displaces these jobs putting downward pressure on the wages of formerly middle class workers. At the time it creates more jobs for the people who program the machines, hence we see higher wages for high end workers.
This story is comforting to the affluent because it means that the upward redistribution of income that we have been seeing is simply an inevitable outcome of technological progress. It might be unfortunate, but what are we supposed to do, smash the machines?
This story should strike people as absurd on its face if they are interested in anything other than a rationale for inequality. After all, how many of the winners in today's economy are actually programming the robots, as the story implies? The group of big winners includes many doctors, lawyers, and dentists, most of whom have no more computer skills than your average high school senior.
They keep their position not by mastering the technology, but rather the old-fashioned way: by restricting supply. They use professional barriers and trade restrictions to limit competition. That's much easier than mastering the latest in computer technology.
This sort of abuse of market power applies to a large share, if not the majority, of the winners in today's economy. In fact, if anyone really gave a damn, they could see that the Autor-Dorn story simply does not fit the pattern of job creation that we have seen in the last decade. Their occupation analysis would show that low earning occupations have been the big job gainers since 2000. The employment share of the highest earning occupations has actually fallen slightly over this period.
However that story provides less comfort to the rich and powerful. It implies that upward redistribution is something that they did, rather than something that just happened. Therefore we will not likely see these data featured prominently in news stories and opinion pieces."
bakho:
Technology was supposed to free workers from low-paying menial jobs so they could work at more productive, higher-pay jobs. However, many of the gains from technology have been captured by the malefactors of great wealth, and those gains are not returned to society. Many workers have indeed benefited. However, the malefactors have pursued low-wage policies, including high unemployment and economic policy that favors stagnant wages for the middle class.
There is plenty that needs to be done. There is plenty of wealth. There is too much wealth concentrated in the hands of selfish hoarders who are only concerned with enhancing their personal relative wealth, not making everyone wealthier.
Jason Dick:
In some sense, it matters a bit less what the original cause was, as it doesn't really change all that much about what the solution must be. I don't think there really is any way to fix this problem except politically, and I worry immensely that the people that claim that it's all about technology seek to imply that there's nothing to be done about it.
I think Paul Krugman hit the nail on the head in his fairly recent posts on this subject, where it's not so much that the machines are taking over, but that earnings are shifting from labor to capital. And if earnings are shifting from labor to capital, that implies that we need to wildly rethink how we manage capital income. In essence, if there is any truth to the story that technology has damaged the middle class, then the answer is to enforce redistribution of wealth.
Not My Real Fake Name:
Don't forget that the practice of replacing US-based workers with US-based workers brought in on indentured "guest-worker" visas is also a "technology".
And for those of you who may object to the use of the word "indentured", what do you call it when you are required by the laws of your home country to re-pay all you prior earnings to your employer if you quit your job for another one with higher pay?
anne -> Not My Real Fake Name...
And for those of you who may object to the use of the word "indentured", what do you call it when you are required by the laws of your home country to re-pay all your prior earnings to your employer if you quit your job for another one with higher pay?
[ Reference this precisely, which, if I understand it, means that India will require an Indian worker at Apple in California to repay all earnings from Apple if the worker resigns to take a job at Google. This makes no sense. ]
kievite -> anne...
Anne,
I think the meaning is that employment at, say, Apple was obtained not directly but via some placement firm, which forced on the worker an additional contract with this kind of condition.
In any case, the mobility of workers with H-1B visas is very restricted, so calling them indentured servants until they gain a green card is not too much of a stretch. And typically five years or more is required to get a green card.
Using the same line of reasoning you can probably call undocumented workers in the USA "slaves".
Aug 24, 2013 | NYTimes.com
In the four years since the Great Recession officially ended, the productivity of American workers - those lucky enough to have jobs - has risen smartly. But the United States still has two million fewer jobs than before the downturn, the unemployment rate is stuck at levels not seen since the early 1990s and the proportion of adults who are working is four percentage points off its peak in 2000.
This job drought has spurred pundits to wonder whether a profound employment sickness has overtaken us. And from there, it's only a short leap to ask whether that illness isn't productivity itself. Have we mechanized and computerized ourselves into obsolescence?
Are we in danger of losing the "race against the machine," as the M.I.T. scholars Erik Brynjolfsson and Andrew McAfee argue in a recent book? Are we becoming enslaved to our "robot overlords," as the journalist Kevin Drum warned in Mother Jones? Do "smart machines" threaten us with "long-term misery," as the economists Jeffrey D. Sachs and Laurence J. Kotlikoff prophesied earlier this year? Have we reached "the end of labor," as Noah Smith laments in The Atlantic?
Of course, anxiety, and even hysteria, about the adverse effects of technological change on employment have a venerable history. In the early 19th century a group of English textile artisans calling themselves the Luddites staged a machine-trashing rebellion. Their brashness earned them a place (rarely positive) in the lexicon, but they had legitimate reasons for concern.
Economists have historically rejected what we call the "lump of labor" fallacy: the supposition that an increase in labor productivity inevitably reduces employment because there is only a finite amount of work to do. While intuitively appealing, this idea is demonstrably false. In 1900, for example, 41 percent of the United States work force was in agriculture. By 2000, that share had fallen to 2 percent, after the Green Revolution transformed crop yields. But the employment-to-population ratio rose over the 20th century as women moved from home to market, and the unemployment rate fluctuated cyclically, with no long-term increase.
Labor-saving technological change necessarily displaces workers performing certain tasks - that's where the gains in productivity come from - but over the long run, it generates new products and services that raise national income and increase the overall demand for labor. In 1900, no one could foresee that a century later, health care, finance, information technology, consumer electronics, hospitality, leisure and entertainment would employ far more workers than agriculture. Of course, as societies grow more prosperous, citizens often choose to work shorter days, take longer vacations and retire earlier - but that too is progress. ---[ Not under neoliberalism -- NNB]
So if technological advances don't threaten employment, does that mean workers have nothing to fear from "smart machines"? Actually, no - and here's where the Luddites had a point. Although many 19th-century Britons benefited from the introduction of newer and better automated looms - unskilled laborers were hired as loom operators, and a growing middle class could now afford mass-produced fabrics - it's unlikely that skilled textile workers benefited on the whole.
Fast-forward to the present. The multi-trillionfold decline in the cost of computing since the 1970s has created enormous incentives for employers to substitute increasingly cheap and capable computers for expensive labor. These rapid advances - which confront us daily as we check in at airports, order books online, pay bills on our banks' Web sites or consult our smartphones for driving directions - have reawakened fears that workers will be displaced by machinery. Will this time be different?
A starting point for discussion is the observation that although computers are ubiquitous, they cannot do everything. A computer's ability to accomplish a task quickly and cheaply depends upon a human programmer's ability to write procedures or rules that direct the machine to take the correct steps at each contingency. Computers excel at "routine" tasks: organizing, storing, retrieving and manipulating information, or executing exactly defined physical movements in production processes. These tasks are most pervasive in middle-skill jobs like bookkeeping, clerical work and repetitive production and quality-assurance jobs.
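The point above can be sketched in a few lines of code. This is only an illustration, not anything from the article: the category names, keywords and example transactions below are all hypothetical. A "routine" task like categorizing bookkeeping entries reduces to explicit rules that a programmer enumerates in advance -- which is exactly what makes it automatable -- while anything the rules fail to anticipate still requires a human.

```python
# Hypothetical rule table: every contingency must be spelled out in advance.
CATEGORY_RULES = {
    "payroll": ["salary", "wages"],
    "utilities": ["electric", "water", "gas"],
    "office": ["paper", "toner", "stapler"],
}

def categorize(description: str) -> str:
    """Apply fixed keyword rules to a transaction description."""
    text = description.lower()
    for category, keywords in CATEGORY_RULES.items():
        if any(word in text for word in keywords):
            return category
    # A "nonroutine" case: no rule covers it, so a human must step in.
    return "needs human review"

print(categorize("Monthly salary payment"))        # payroll
print(categorize("Electric bill for March"))       # utilities
print(categorize("Gift basket for retiring CEO"))  # needs human review
```

The fragility of the last case is the whole story: the machine is cheap and fast precisely where the rules are complete, and helpless wherever judgment, context or novelty enters.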
Logically, computerization has reduced the demand for these jobs, but it has boosted demand for workers who perform "nonroutine" tasks that complement the automated activities. Those tasks happen to lie on opposite ends of the occupational skill distribution.
At one end are so-called abstract tasks that require problem-solving, intuition, persuasion and creativity. These tasks are characteristic of professional, managerial, technical and creative occupations, like law, medicine, science, engineering, advertising and design. People in these jobs typically have high levels of education and analytical capability, and they benefit from computers that facilitate the transmission, organization and processing of information.
On the other end are so-called manual tasks, which require situational adaptability, visual and language recognition, and in-person interaction. Preparing a meal, driving a truck through city traffic or cleaning a hotel room present mind-bogglingly complex challenges for computers. But they are straightforward for humans, requiring primarily innate abilities like dexterity, sightedness and language recognition, as well as modest training. These workers can't be replaced by robots, but their skills are not scarce, so they usually make low wages.
Computerization has therefore fostered a polarization of employment, with job growth concentrated in both the highest- and lowest-paid occupations, while jobs in the middle have declined. Surprisingly, overall employment rates have largely been unaffected in states and cities undergoing this rapid polarization. Rather, as employment in routine jobs has ebbed, employment has risen both in high-wage managerial, professional and technical occupations and in low-wage, in-person service occupations.
So computerization is not reducing the quantity of jobs, but rather degrading the quality of jobs for a significant subset of workers. Demand for highly educated workers who excel in abstract tasks is robust, but the middle of the labor market, where the routine task-intensive jobs lie, is sagging. Workers without college education therefore concentrate in manual task-intensive jobs - like food services, cleaning and security - which are numerous but offer low wages, precarious job security and few prospects for upward mobility. This bifurcation of job opportunities has contributed to the historic rise in income inequality.
HOW can we help workers ride the wave of technological change rather than be swamped by it? One common recommendation is that citizens should invest more in their education. Spurred by growing demand for workers performing abstract job tasks, the payoff for college and professional degrees has soared; despite its formidable price tag, higher education has perhaps never been a better investment. But it is far from a comprehensive solution to our labor market problems. Not all high school graduates - let alone displaced mid- and late-career workers - are academically or temperamentally prepared to pursue a four-year college degree. Only 40 percent of Americans enroll in a four-year college after graduating from high school, and more than 30 percent of those who enroll do not complete the degree within eight years.
The good news, however, is that middle-education, middle-wage jobs are not slated to disappear completely. While many middle-skill jobs are susceptible to automation, others demand a mixture of tasks that take advantage of human flexibility. To take one prominent example, medical paraprofessional jobs - radiology technician, phlebotomist, nurse technician - are a rapidly growing category of relatively well-paid, middle-skill occupations. While these paraprofessions do not typically require a four-year college degree, they do demand some postsecondary vocational training.
These middle-skill jobs will persist, and potentially grow, because they involve tasks that cannot readily be unbundled without a substantial drop in quality. Consider, for example, the frustration of calling a software firm for technical support, only to discover that the technician knows nothing more than the standard answers shown on his or her computer screen - that is, the technician is a mouthpiece reading from a script, not a problem-solver. This is not generally a productive form of work organization because it fails to harness the complementarities between technical and interpersonal skills. Simply put, the quality of a service within any occupation will improve when a worker combines routine (technical) and nonroutine (flexible) tasks.
Following this logic, we predict that the middle-skill jobs that survive will combine routine technical tasks with abstract and manual tasks in which workers have a comparative advantage - interpersonal interaction, adaptability and problem-solving. Along with medical paraprofessionals, this category includes numerous jobs for people in the skilled trades and repair: plumbers; builders; electricians; heating, ventilation and air-conditioning installers; automotive technicians; customer-service representatives; and even clerical workers who are required to do more than type and file. Indeed, even as formerly middle-skill occupations are being "deskilled," or stripped of their routine technical tasks (brokering stocks, for example), other formerly high-end occupations are becoming accessible to workers with less esoteric technical mastery (for example, the work of the nurse practitioner, who increasingly diagnoses illness and prescribes drugs in lieu of a physician). Lawrence F. Katz, a labor economist at Harvard, memorably called those who fruitfully combine the foundational skills of a high school education with specific vocational skills the "new artisans."
The outlook for workers who haven't finished college is uncertain, but not devoid of hope. There will be job opportunities in middle-skill jobs, but not in the traditional blue-collar production and white-collar office jobs of the past. Rather, we expect to see growing employment among the ranks of the "new artisans": licensed practical nurses and medical assistants; teachers, tutors and learning guides at all educational levels; kitchen designers, construction supervisors and skilled tradespeople of every variety; expert repair and support technicians; and the many people who offer personal training and assistance, like physical therapists, personal trainers, coaches and guides. These workers will adeptly combine technical skills with interpersonal interaction, flexibility and adaptability to offer services that are uniquely human.
David H. Autor is a professor of economics at the Massachusetts Institute of Technology. David Dorn is an assistant professor of economics at the Center for Monetary and Financial Studies in Madrid.
A version of this article appears in print on 08/25/2013, on page SR6 of the New York edition with the headline: How Technology Wrecks the Middle Class.
potaytopotahto, Bowie, Arizona: As usual a pair of economists are underestimating technology. Let them remember what a great economist, John Maynard Keynes, said about their future in the age of technology:
"But, chiefly, do not let us overestimate the importance of the economic problem, or sacrifice to its supposed necessities other matters of greater and more permanent significance. It should be a matter for specialists - like dentistry. If economists could manage to get themselves thought of as humble, competent people, on a level with dentists, that would be splendid!"
The science fiction of the early 20th century... taking Soma so that all feelings are blunted... looks like it may have come to pass with the massive glut of antidepressants, mood stabilizers, and addiction to other sedating medications. Is it so unreasonable that there is a little predictive value in the science fiction of the early '80s? Terminator, anyone?
By Mark Berman Opposing Views, Tue, September 20, 2011
Amazon is the largest online retailer in the world -- you press a button on your computer and seemingly overnight your item arrives at your door. Such an operation requires a huge support staff, but workers at one Amazon warehouse in Pennsylvania are complaining that they are literally working under sweatshop conditions.
A very, very long story in The Morning Call details the working conditions at a warehouse in the Lehigh Valley. The site said it spoke with 20 former or current employees, and just one of them said it was a good place to work.
The Morning Call writes:
Workers said they were forced to endure brutal heat inside the sprawling warehouse and were pushed to work at a pace many could not sustain. Employees were frequently reprimanded regarding their productivity and threatened with termination, workers said. The consequences of not meeting work expectations were regularly on display, as employees lost their jobs and got escorted out of the warehouse. Such sights encouraged some workers to conceal pain and push through injury lest they get fired as well, workers said.
It was the heat aspect that drew the most complaints. Workers said temperatures routinely soar into the 100s during summer heat waves. It gets so bad, the workers said, that Amazon arranged to have paramedics standing by to treat workers who become dehydrated or suffer from other heat-related issues. It is not unusual, the workers claimed, to see employees being wheeled out of the warehouse on stretchers.
Amazon would let workers go home if they wanted to.
"When the heat index exceeded 110, they'd give you voluntary time off," former employee Robert Rivas said. "If you wanted to go home, they'd send you home. But if you didn't have a doctor's note saying you couldn't work in the heat, you'd get (disciplinary) points."
Get too many points and you're fired, so many workers would muddle through.
In June one worker called the Occupational Safety and Health Administration (OSHA). Following an inspection, OSHA issued recommendations to Amazon on how to better manage the heat in the warehouse.
"Several conditions and practices were observed which have the potential to adversely impact on employee safety and health," OSHA's area director Jean Kulp said in a letter to Amazon.
The recommendations included reducing heat and humidity in the warehouse, providing hourly breaks in a cool area and personal fans at each work station.
Workers said Amazon has installed cooling units and fans since the inspection. Breaks are longer, but they still must produce the same amount of work.
In an email to The Morning Call, warehouse general manager Michele Glisson wrote:
"The safety and welfare of our employees is our No. 1 priority at Amazon, and as the general manager, I take that responsibility seriously. We go to great lengths to ensure a safe work environment, with activities that include free water, snacks, extra fans and cooled air during the summer. I am grateful to work with such a fantastic group of employees from our community, and we partner with them every day to make sure our facility is a great place to work."
July 13, 2013 | Economist's View
This essay/report by Frank Levy and Richard J. Murnane attempts to answer the question "How do we ensure American middle class prosperity in an era of ever-intensifying globalization and technological upheaval?":
Dancing with Robots: Human Skills for Computerized Work, by Frank Levy and Richard J. Murnane: On March 22, 1964, President Lyndon Johnson received a short, alarming memorandum from the Ad Hoc Committee on the Triple Revolution. The memo warned the president of threats to the nation beginning with the likelihood that computers would soon create mass unemployment:
A new era of production has begun. Its principles of organization are as different from those of the industrial era as those of the industrial era were different from the agricultural. The cybernation revolution has been brought about by the combination of the computer and the automated self-regulating machine. This results in a system of almost unlimited productive capacity which requires progressively less human labor. Cybernation is already reorganizing the economic and social system to meet its own needs.

The memo was signed by luminaries including Nobel Prize winning chemist Linus Pauling, Scientific American publisher Gerard Piel, and economist Gunnar Myrdal (a future Nobel Prize winner). Nonetheless, its warning was only half right. There was no mass unemployment -- since 1964 the economy has added 74 million jobs. But computers have changed the jobs that are available, the skills those jobs require, and the wages the jobs pay.
For the foreseeable future, the challenge of "cybernation" is not mass unemployment but the need to educate many more young people for the jobs computers cannot do. Meeting the challenge begins by recognizing what the Ad Hoc Committee missed -- that computers have specific limitations compared to the human mind. Computers are fast, accurate, and fairly rigid. Human brains are slower, subject to mistakes, and very flexible. By recognizing computers' limitations and abilities, we can make sense of the changing mix of jobs in the economy. We can also understand why human work will increasingly shift toward two kinds of tasks: solving problems for which standard operating procedures do not currently exist, and working with new information -- acquiring it, making sense of it, communicating it to others. ...
pgl said...
Gunnar Myrdal was a Luddite?
Sandwichman said in reply to pgl... Yep. So to speak. Although technically, he was a "neo-Luddite" because to be a Luddite he would have had to have been around in the early 19th century.
The "system of almost unlimited productive capacity which requires progressively less human labor" is wishful thinking, one way or the other. Either we'll all be freed from having to work (but still have all the necessities and comforts of modern living) or there will be massive disruption as so many of us will be cut loose from having any viable means of support.
Actually, neither will happen -- but not for the reason that the anti-Luddites (the "Millocrats") suppose. Machines are just not all that productive. WHAT? I said, "Machines are just not all that productive."
A machine is no more "productive" than is a hedge that surrounds a field of wheat. A hedge does not supply sunlight, minerals or water to the growing wheat. All the hedge does is keep out the animals and maybe break the wind. It keeps people out, too, provided those people respect the symbolism and legality of the hedge and private property. There are, after all, tools that enable people to tear down hedges if they really want to.
It takes work to make machines, to operate the machines, to maintain machines and to supply the energy to run the machines. All the machines do is convert the labor and energy inputs into a more usable form. But they don't do so without some mechanical loss.
The problem that results from machines -- whether they are cotton looms or robots -- is not the elimination of all work but the concentration of control over who does the work, how the work is done and how the proceeds are distributed.
anne said in reply to Sandwichman... Machines are just not all that productive. WHAT? I said, "Machines are just not all that productive."
A machine is no more "productive" than is a hedge that surrounds a field of wheat. A hedge does not supply sunlight, minerals or water to the growing wheat. All the hedge does is keep out the animals and maybe break the wind. It keeps people out, too, provided those people respect the symbolism and legality of the hedge and private property. There are, after all, tools that enable people to tear down hedges if they really want to.
It takes work to make machines, to operate the machines, to maintain machines and to supply the energy to run the machines. All the machines do is convert the labor and energy inputs into a more usable form. But they don't do so without some mechanical loss.
The problem that results from machines -- whether they are cotton looms or robots -- is not the elimination of all work but the concentration of control over who does the work, how the work is done and how the proceeds are distributed.
[ Wonderful writing. ]
AND Wassily Leontief, believe it or not!
The Blorch:
anne said..."We can also understand why human work will increasingly shift toward two kinds of tasks: solving problems for which standard operating procedures do not currently exist, and working with new information- acquiring it, making sense of it, communicating it to others. ..."
[These are precisely the tasks for which productivity is poorly measured, which leaves the problem of what to pay these kinds of workers. I suggest letting these workers use whatever non-standard operating procedures they care to employ and communicate the results on the blogosphere. If the content attracts any eyeballs, these workers can monetize the traffic by selling advertising. Sounds real lucrative, doesn't it?]
The Blorch said in reply to anne...But computers have changed the jobs that are available, the skills those jobs require, and the wages the jobs pay.
[ A sentence such as this is written when the intent is to make it seem as though political policy has nothing at all to do with an economic occurrence. As though computers are superior beings who program us. This is nonsense, but it is much the sort of thinking that is reflected in the continually harmful advice and pushes that the International Monetary Fund foists on poorer countries. ]
Sandwichman said in reply to anne...But computers have changed the jobs that are available, the skills those jobs require, and the wages the jobs pay.
[The author of the comment never revealed any research data supporting the existence of these trends. And, of course, even if the trends exist, assigning cause and effect is always an art form.
More likely, the author is a lobbyist for an industry that wants to influence the number of students studying in a field pertaining to that industry. Not because there are likely to be more jobs in that industry, mind you, but because if more people study toward placement in a particular industry, that industry benefits by choosing from a larger pool.
For example, say one student is interested in a position in an industry with one job opening. Chances are, this student is of about average intellect.
But suppose a million students want a position in a given industry with one opening. In this case, the industry will probably get to interview a one-in-a-million super genius.
Think about it. Wall Street does.]
Sandwichman said in reply to anne...It's called "idolatry," anne. And it's the way the Millocrats speak. Nobody is ever responsible for the unpleasant consequences -- especially not the decision makers. But please give the entrepreneurs all the credit for all the benefits!
BTW: "Millocrats" is a lovely 19th century term that deserves to be hurled back whenever the spectre of "Luddism" appears.
kievite said...It's called "idolatry," anne. And it's the way the Millocrats speak. Nobody is ever responsible for the unpleasant consequences -- especially not the decision makers. But please give the entrepreneurs all the credit for all the benefits!
[ Perfect. ]
Ellis said..."Computers eat people" was a pretty prophetic warning. It took time to materialize but we are almost there.
For example, between 2000 and 2010, the jobs of 1.1 million secretaries were eliminated, replaced by internet services; the number of telephone operators dropped by 64%, travel agents by 46% and bookkeepers by 26%. Low-level managers survived better than this only because they are important politically, serving as a buffer between the elite and those in the trenches.
The hollowing out of middle-class jobs has become a non-stop process. Now computer programmers (and IT workers in general) are themselves finding it difficult to get employment and are being sent packing in increasing numbers.
With the neoliberal "winner takes all" policy, inequality is going through the roof. The new jobs created are mainly McJobs.
So we have, as the unforgettable Bush II quipped, the "have mores" -- say, those with annual income above $100K per family (the top 20%); many members of this blog belong to that top 20%. And we have the "have nones" (people with less than $40K of annual income per family, with the bottom 20% getting less than $20K), with little in between. Anne probably can help me with stats.
In other words, computers are more disruptive than, say, the machines smashed by the Luddites. The current elites are increasingly greedy. Little good can be said about them. Instead of upgrading the technical skills of the work force, advanced technology in the service of neoliberal capitalism makes skilled labor in the USA and other Western countries superfluous.
So it is an established fact that computers promote inequality and political centralization, and in combination with neoliberalism create a very poisonous and maybe even explosive mix. That's why unemployment is structural and probably will stabilize at the current level. That's probably one reason why the S&P 500 is close to 1700 (which makes some people so excited about mixed Boyle-style portfolio performance ;-)
Also that's why the national security state is now a reality in the USA. That's why the militarization of police forces goes on non-stop, with drones now added to the arsenal and face recognition technology tested on crowds.
carlyle said...Greater use of computers, greater productivity: is it a curse for employment? Only in an economy based on corporate profit. Instead of freeing humans, instead of giving us greater leisure time to do other things, it only creates more unemployment. And big business goes laughing all the way to the bank.
Oh, I forgot, it's the problem of the workforce. We aren't educated enough. That's why it is so important to slash education spending, raise college tuition to the sky... and create wonderful profit opportunities to provide loans, at wonderfully high rates.
Or maybe it's the problem of demographics... (I must become an economist, or at least a pundit.)
reason said...When the jobs we have pay as well as the jobs we used to have, there will be economic growth. We will refinance working Americans or the American dream is over.
Step one in refinancing workers is a large increase in the minimum wage. America's seven dollars and twenty-five cents is much lower than in other wealthy democracies, at about twenty-eight percent of our median wage. Most of Europe, as well as Canada and Australia, support minimum wages of forty-five to forty-eight percent of their median wage. Again, we are below world standards by a whole bunch.
kievite said in reply to reason...What computers do differently today is allow all sorts of things to be imported that previously were strictly domestic.
They enable transcontinental supply chains and the complex logistics behind them. This way a pool of cheap labor can be used instead of domestic labor.
Eric Blair:
"For the foreseeable future, the challenge of "cybernation" is not mass unemployment but the need to educate many more young people for the jobs computers cannot do."
Not totally false, but perhaps not particularly useful either. There are indeed a lot of jobs that computers can't do, but they are generally the jobs that are hardest to define and quantify, and therefore to train others for. In other words, if you can explain to a stranger what it is that you do for a living, then there is a possibility that your job can be either automated or offshored.
The Blorch said in reply to Eric Blair...
You're going in the same direction as my comment. The jobs for which productivity is most easily measured are the jobs that are most often offshored, computerized or automated.
The most important skill in the future will be bullshit artistry, because workers solving unstructured problems are still going to have bosses who will want to know "WHY isn't it done yet," and the worker is going to have to placate that impatient boss with some kind of novel, compelling excuse.
cm:
"Computers are fast, accurate, and fairly rigid. Human brains are slower, subject to mistakes, and very flexible."
I wouldn't say that is the primary characteristic at this point of technology development, though it is still relevant and important. (With the caveat that humans are not generally slower/less accurate - only for the things where computers are better. This may sound like a platitude, but it is a very important consideration/bias: computer vs. human performance is almost always only compared on tasks/domains that have been successfully automated, or at the most where automation (or its unsuccessful attempt) has been experimentally demonstrated.)
These days, I would say the primary "gap" is moving more towards the inability to capture human affairs and social relationships/interactions at minimally sufficient completeness and accuracy to be processed by computer. This is not just because of complexity, but because humans themselves don't really understand the related mechanisms in adequate detail. Maybe it is even fundamentally intractable - maybe the "robots" would have to be social beings in a social context as well to "get the hang of it", and and it is not unlikely the speed and precision of clear cut algorithms would likely be lost.
cm said in reply to cm...
OTOH a computer should have been able to spot the grammatical errors and redundancies in my last sentence. :-)
kievite said in reply to cm...
Compare your reasoning with Bill Joy in "Why the future doesn't need us"
http://www.wired.com/wired/archive/8.04/joy.html
As society and the problems that face it become more and more complex and machines become more and more intelligent, people will let machines make more of their decisions for them, simply because machine-made decisions will bring better results than man-made ones. Eventually a stage may be reached at which the decisions necessary to keep the system running will be so complex that human beings will be incapable of making them intelligently. At that stage the machines will be in effective control. People won't be able to just turn the machines off, because they will be so dependent on them that turning them off would amount to suicide.
On the other hand it is possible that human control over the machines may be retained. In that case the average man may have control over certain private machines of his own, such as his car or his personal computer, but control over large systems of machines will be in the hands of a tiny elite - just as it is today, but with two differences. Due to improved techniques the elite will have greater control over the masses; and because human work will no longer be necessary the masses will be superfluous, a useless burden on the system. If the elite is ruthless they may simply decide to exterminate the mass of humanity.
Christopher Mims, Quartz | Mar. 27, 2013, 7:19 AM |
Marc Andreessen is funding the companies making the software disrupting labor markets the world over.
Sixty percent of the jobs in the US are information-processing jobs, notes Erik Brynjolfsson, co-author of a recent book about this disruption, Race Against the Machine. It's safe to assume that almost all of these jobs are aided by machines that perform routine tasks. These machines make some workers more productive. They make others less essential.
The turn of the new millennium is when the automation of middle-class information processing tasks really got under way, according to an analysis by the Associated Press based on data from the Bureau of Labor Statistics. Between 2000 and 2010, the jobs of 1.1 million secretaries were eliminated, replaced by internet services that made everything from maintaining a calendar to planning trips easier than ever. In the same period, the number of telephone operators dropped by 64%, travel agents by 46% and bookkeepers by 26%. And the US was not a special case. As the AP notes, "Two-thirds of the 7.6 million middle-class jobs that vanished in Europe were the victims of technology, estimates economist Maarten Goos at Belgium's University of Leuven."
Economist Andrew McAfee, Brynjolfsson's co-author, has called these displaced people "routine cognitive workers." Technology, he says, is now smart enough to automate their often repetitive, programmatic tasks. "We are in a desperate, serious competition with these machines," concurs Larry Kotlikoff, a professor of economics at Boston University. "It seems like the machines are taking over all possible jobs."
Like farming and factory work before it, the labors of the mind are being colonized by devices and systems. In the early 1800s, nine out of ten Americans worked in agriculture -- now it's around 2%. At its peak, about a third of the US population was employed in manufacturing -- now it's less than 10%. How many decades until the figures are similar for the information-processing tasks that typify rich countries' post-industrial economies?
Web pioneer and venture capitalist Marc Andreessen describes this process as "software is eating the world." As he wrote in an editorial (paywall) for the Wall Street Journal, "More and more major businesses and industries are being run on software and delivered as online services-from movies to agriculture to national defense."
The hollowing out of the middle class
To see how the internet has disproportionately affected the jobs of people who process information, check out the gray bars dipping below the 0% line on the chart, below. (I've adapted this chart to show just the types of employment that lost jobs in the US during the great recession. Every other category continued to add jobs or was nearly flat.)
People who process information are losing their jobs at rates comparable to the rate of loss in manufacturing. (Source: St. Louis Fed)
What's apparent is that the same trend seen in making and processing things-represented by the "Production…" and "Operators…" categories-shows up for the routine cognitive workers in offices and, for related if not identical reasons, sales.
Here's another clue about what's been going on in the past ten years. "Return on capital" measures the return firms get when they spend money on capital goods like robots, factories, software-anything aside from people. (If this were a graph of return on people hired, it would be called "Return on labor".)
Not surprisingly, information processing tasks benefit as much from the application of capital -- including information technology -- as manufacturing does. (Source: St. Louis Fed)
Notice: the only industry where the return on capital is as great as manufacturing is "other industries"-a grab bag which includes all the service and information industries, as well as entertainment, health care and education. In short, you don't have to be a tech company for investing in technology to be worthwhile.
Companies that invest in IT do better
Here's yet a third clue about what's going on. For many years, the question of whether or not spending on information technology (IT) made companies more productive was highly controversial. Many studies found that IT spending either had no effect on productivity or was even counter-productive. But now a clear trend is emerging. More recent studies show that IT-and the organizational changes that go with it-are doing firms, especially multinationals (pdf), a great deal of good.
One reason for the delay is that it has taken some time for companies to learn how best to use IT. Economist Carlota Perez calls this the "installation phase." Moreover, the more recent rise of the internet has multiplied the power that IT has on its own.
In any case, if computers are the factory floor for routine cognitive workers, then when software and the internet makes some workers more productive, others are no longer needed.
Winner-take-all, the power of capital to exacerbate inequality
One thing all our machines have accomplished, and especially the internet, is the ability to reproduce and distribute good work in record time. Barring market distortions like monopolies, the best software, media, business processes and, increasingly, hardware, can be copied and sold seemingly everywhere at once. This benefits "superstars" -- the most skilled engineers or content creators. And it benefits the consumer, who can expect a higher average quality of goods. But it can also exacerbate income inequality, says Brynjolfsson. This contributes to a phenomenon called "skill-biased technological [or technical] change." "The idea is that technology in the past 30 years has tended to favor more skilled and educated workers versus less educated workers," says Brynjolfsson. "It has been a complement for more skilled workers. It makes their labor more valuable. But for less skilled workers, it makes them less necessary -- especially those who do routine, repetitive tasks."
The result is that, with the aid of machines, productivity increases -- the overall economic pie gets bigger -- but that's small consolation if all but a few workers are getting a smaller slice. "Certainly the labor market has never been better for very highly-educated workers in the United States, and when I say never, I mean never," MIT labor economist David Autor told American Public Media's Marketplace.
The other winners in this scenario are anyone who owns capital. Only about half of Americans own stock at all, and as more companies are taken private or never go public, more and more of that wealth is concentrated in the hands of fewer and fewer people. As Paul Krugman wrote, "This is an old concern in economics; it's "capital-biased technological change", which tends to shift the distribution of income away from workers to the owners of capital."
Unlike other technological revolutions, computers are everywhere
The ubiquity of smartphones in rich countries is just the tip of the silicon iceberg. Computers are more disruptive than, say, the looms smashed by the Luddites, because they are "general-purpose technologies," noted Peter Lindert, an economist at the University of California-Davis. Sensors, embedded systems, internet-connected devices, and an ever-expanding pool of cloud computing resources are all being put to the same use: figuring out, in the most efficient way possible, what to do next.
"The spread of computers and the Internet will put jobs in two categories," said Andreessen. "People who tell computers what to do, and people who are told by computers what to do." It's a glib remark-but increasingly true.
In a gleaming new warehouse in the old market town of Rugeley, England, Amazon directs the actions of hundreds of "associates" wielding hand-held computers. These computers tell workers not only which shelf to walk to when they're pulling goods to be shipped, but also the optimal route by which to get there. Each person's performance is monitored, and they are given constant feedback about whether or not they are performing their job quickly enough. Their bosses can even send them text messages via their handheld computers, urging them to speed up. "You're sort of like a robot, but in human form," one manager at Amazon's warehouse told the Financial Times. "It's human automation, if you like."
And yet despite this already high level of automation, Amazon is working on how to eliminate the humans in its warehouses altogether. In March 2012, Amazon acquired Kiva Systems, a warehouse robotics and automation company. In partnership with a company called Quiet Logistics, Kiva's combination of mobile shelving and robots has already automated a warehouse in Andover, Massachusetts. Here's a video showing how Kiva's robots, which look like oversize Roombas, can store, retrieve and sort goods with minimal involvement from humans.
This time it's faster
History is littered with technological transitions. Many of them seemed at the time to threaten mass unemployment of one type of worker or another, whether it was buggy whip makers or, more recently, travel agents. But here's what's different about information-processing jobs: The takeover by technology is happening much faster.
From 2000 to 2007, in the years leading up to the great recession, GDP and productivity in the US grew faster than at any point since the 1960s, but job creation did not keep pace. Brynjolfsson thinks he knows why: More and more people were doing work aided by software. And during the great recession, employment growth didn't just slow. As we saw above, in both manufacturing and information processing, the economy shed jobs, even as employment in the service sector and professional fields remained flat.
Especially in the past ten years, economists have seen a reversal of what they call "the great compression"-that period from the second world war through the 1970s when, in the US at least, more people were crowded into the ranks of the middle class than ever before. There are many reasons why the economy has reversed this "compression," transforming into an "hourglass economy" with many fewer workers in the middle class and more at either the high or the low end of the income spectrum. But whatever those forces, they are clearly being exacerbated by technological change.
The hourglass represents an income distribution that has been more nearly the norm for most of the history of the US. That it's coming back should worry anyone who believes that a healthy middle class is an inevitable outcome of economic progress, a mainstay of democracy and a healthy society, or a driver of further economic development. Indeed, some have argued that as technology aids the gutting of the middle class, it destroys the very market required to sustain it-that we'll see "less of the type of innovation we associate with Steve Jobs, and more of the type you would find at Goldman Sachs."
Is any job safe?
Recently I sat down with the team at Betterment, a tech startup to which people have already handed over $150 million in assets. For many, that money represents a significant chunk of their savings and retirement accounts. Betterment is the sort of company that, if it does well, will someday be a canonical example of the principle that "software eats everything." It's an attempt to replace the kind of job you might think is still beyond the reach of an algorithm: personal financial advice.
The legal field has been transformed by software too. For example, it replaced paralegals in the previously labor-intensive process of sifting through documents during the discovery phase of a lawsuit.
No one, it seems, is more aware of this phenomenon than the technologists themselves. In an interview with Pando Daily, Josh Kopelman, a venture capitalist with First Round Capital, said that even his industry is going to be eaten by software. "In fifteen years, will VCs make as much money as they do now?" he was asked. "They probably shouldn't," was his response.
Survival of the fittest-and the richest
Barring a civilization-ending event, technology is not going to move backward. More and more of our world will be controlled by software. It's already become so ubiquitous that, argues one of my colleagues, it's now ridiculous to call some firms "tech" companies when all companies depend on it so much.
So how do we deal with this trend? The possible solutions to the problems of disruption by thinking machines are beyond the scope of this piece. As I've mentioned in other pieces published at Quartz, there are plenty of optimists ready to declare that the rise of the machines will ultimately enable higher standards of living, or at least forms of employment as foreign to us as "big data scientist" would be to a scribe of the 17th century.
But that's only as long as you're one of the ones telling machines what to do, not being told by them. And that will require self-teaching, creativity, entrepreneurialism and other traits that may or may not be latent in children, as well as retraining adults who aspire to middle class living. For now, sadly, your safest bet is to be a technologist and/or own capital, and use all this automation to grab a bigger-than-ever share of a pie that continues to expand.
Read more: http://qz.com/67323/how-the-internet-made-us-poor/
A Society without Wisdom,
Amazon Verified Purchase
This review is from: The Revolt of the Elites and the Betrayal of Democracy (Paperback)

The elites of the present are an increasingly comical and childish bunch. Little good can be said about them. The prospects for a democratic society are increasingly grim. Much of the blame can be put on the degeneracy of the public, but this has only resulted from the dominant paradigms of regressive elites, who have sold the public utopian schemes of perpetual progress through limitless technological and scientific innovation and growth. The costs of their vision are very rarely discussed in a meaningful or serious way. In the process of throwing off all the conservative and cautious elements of traditional morality, our elites have ended up wallowing in a relativistic pragmatism and nihilism. They have mastered the art of the shameless transgression of authority without offering up a real vision of what constitutes the good life. Don't buy into any of the pseudo-radicalism, because they only pretend to care.
It should be noted that Lasch did understand the new "knowledge economy," as some reviewers claim he didn't. Read his review of Turkle's The Second Self from all the way back in 1984:
Chip of Fools
Before rushing blindly into the computer age, we need to remind ourselves that events have already falsified most of the predictions about "postindustrial society" issued with such authoritative assurance over the last forty years. According to a long line of social forecasters, the expansion of the service sector of the economy, the growing proportion of white-collar workers in relation to blue-collar workers, and the development of increasingly sophisticated technologies will homogenize the class structure of advanced capitalist societies. In the middle-class society of the future, "professional, technical, and managerial people," as Peter Drucker argued in 1958, will "become the largest group in the working population ... larger in fact than the blue-collar people."
Advanced technology, on this reasoning, will create an insatiable demand for trained personnel. In the 1950s and 1960s, those who predicted a "knowledge revolution" cited the growth of college enrollments after World War II and the upgrading of the educational credentials required for employment. The exponential increase of information, in their view, accounted not only for the decline of the working class and the rise of a "new middle class," but for the poverty of the newly discovered "underclass" of blacks, Puerto Ricans, and other "culturally deprived" minorities who lacked the skills essential to effective competition in the labor market. The drive for equal opportunity in education, in many ways the heart of the liberal program in the '50s and '60s, assumed that once the underclass had been equipped with technical skills, it would gain access to steady jobs and find its way into the mainstream of American life.
Today the same arguments are advanced in favor of compulsory training in the use of computers. Yet the "postindustrial society" routinely invoked in such appeals has demonstrably failed to materialize. The prolonged academic depression, the surplus of college graduates, and the growing unemployment even among people with advanced degrees, make a mockery of Drucker's optimistic assertion that "we cannot get enough educated people." It now appears that employers use educational credentials not to certify technical competence but merely to screen out applicants, mostly from the working and lower classes, who lack what has aptly been called a white-collar union card--that is, a college degree. The educational system has not served as an equalizer. For reasons explained by Christopher Jencks and others, it has reinforced class distinctions instead of breaking them down. Nor has the United States become a middle-class society. Thanks to the work of Richard Parker, Andrew Levinson, Paul Blumberg, and Harry Braverman, among others, it is now clear that most white-collar workers perform essentially unskilled labor. Distinctions between white-collar jobs and blue-collar jobs have become increasingly arbitrary and misleading. Advanced industrial society rests not on the upgrading of skills but on the systematic destruction of skills--on the "degradation of labor," in Braverman's phrase.
FAR FROM upgrading the technical skills of the work force, advanced technology in the service of corporate capitalism makes skilled labor superfluous. It promotes an interchangeability of personnel, a rapid movement from one type of work to another, and most important of all, a growing concentration of the labor force in technically backward, labor-intensive, and often un-unionized sectors of the economy. The rapid growth of employment in the service sector, which allegedly proves that America has become a middle-class, "postindustrial" society, reflects, on the contrary, the proletarianization of the work force and the growth of the reserve army of labor, as Marx called it--the casually or irregularly employed workers who constitute an "inexhaustible reservoir of disposable labor power." The same developments lead to high levels of unemployment, which can no longer be regarded as aberrational but have to be seen, as Braverman puts it, as a "necessary part of the working mechanisms of the capitalist mode of production."
I dwell on all this because apologies for the computer and for intensified computer instruction in the schools rest on the familiar claim that technological innovations will create an abundance of skilled jobs, eliminate disagreeable jobs, and make life easy for everyone. Everything we know about technological "progress" indicates, on the contrary, that it promotes inequality and political centralization. It commends itself to the masters of American industry for that very reason. Whenever we hear that some new technology is "inevitable" we should consult the historical record, which shows that technical innovations usually appeal to industrialists not because they are inevitable or even because they make for greater productive efficiency, but because they consolidate the industrialist's power over the work force. The triumph of industrial technology testifies not to the inexorable march of science, but to the defeat of working-class resistance.
It is a muddled, ahistorical view of the industrial revolution that dismisses this resistance as an attempt to "postpone the inevitable," as J. David Bolter writes in his study of the coming "computer age." It is equally muddled to argue that since the "computer age" is upon us, our best hope lies in "reforming the age of computers from within." In the past, efforts to reform industrial technology from within, usually led by engineers, served merely to reinforce the lessons already driven home by workers' resistance to the introduction of new technologies: that those technologies serve the interests of capital and that even those who design and manage the machines have little to say about the uses they are put to.
Over and over again, new technologies have reduced even the engineer's work to a routine. What originates as a craft degenerates into a series of automatic operations performed more or less unthinkingly. Computer programming is no exception to this pattern. As Sherry Turkle writes:
"In the course of the last decades programmers have watched their opportunities to exercise their expertise in a spontaneous way being taken away. Those who are old enough remember the time when things were different as a kind of golden age, an age when a programmer was a skilled artisan who was given a problem and asked to conceive of and craft a solution. ... Today, programs are written on a kind of assembly line. The professional programmer works as part of a large team and is in touch with only a small part of the problem being worked on."
IN THE early days of the computer, according to Turkle, many people hoped that electronic technology could be captured by the counterculture. "Personal computers became symbols of hope for a new populism in which citizens would band together to run information resources and local government." But things did not turn out that way. Computers encouraged centralization and bureaucracy. Instead of humanizing industry, the personal computer came to serve as an escape from industry for hobbyists and even for professional programmers seeking to achieve in the privacy of the home the control they could no longer exercise at work. Turkle reminds us that "people will not change unresponsive government or intellectually deadening work through involvement with their machines at home." But personal computers offer the illusion of control in "one small domain," if not in the larger world of work and politics. Sold to the public as a means of access to the new world of postindustrial technology, personal computers in fact provide escape from that world. They satisfy a need for mastery and control denied outlets elsewhere.
They provide other forms of escape as well. In interviews with people who use computers extensively, particularly with children, Turkle found that computers appeal to people who find their personal lives unmanageable, often to people afraid of being overwhelmed by uncontrollable emotions. "The greater the anxiety about being out of control, the greater the seduction of the material that offers the promise of perfect response." Playing video games and solving problems on a home computer help to dissociate thought from feeling. They encourage a cool, detached, cerebral state of mind. They allow the operator to feel at once "swept away and in control." The computer provides a lifelike response that can nevertheless be predicted and controlled. It is no wonder that many users find themselves becoming addicted to their computers. After contrasting early expectations about computer technology to the role it actually plays in people's lives, Turkle concludes: "It would certainly be inappropriate to rejoice at the `holistic' relationships that personal computers offer if it turns out that they serve as a kind of opiate."
EXERTION of control over a machine often leads to the further step of identification with the machine--to a new conception of the self as a machine in its own right. Turkle's principal contention is that technologies are "evocative," changing the way we think about ourselves and about human nature. The image evoked by computers is the image of the machine-like self. In this sense, it is a "mirror of the mind"--not because it accurately imitates the operation of the mind (as we are often told) but because it satisfies the wish to believe that thought can divorce itself from emotion. For those who have entered most fully into the world of the computer, the prospect that men and women can become machines is a hopeful promise, not a threat. The promise finds its most highly developed expression in the Utopia of "artificial intelligence"--the "next step in evolution," as Edward Fredkin of M.I.T. once proclaimed it. The theory of artificial intelligence rests on the premise that "thought does not need a unitary agent who thinks," as Turkle puts it. Thought can dispense with the thinking self, in other words. It can thus overcome the emotional and bodily limitations that have encumbered humanity in the past. Theorists of artificial intelligence celebrate the mind's clarity, as opposed to what one of them, Marvin Minsky, revealingly refers to as the "bloody mess of organic matter."
These dreamers hope to create a new race of supermen freed from nature and from man's oldest enemy, death. Listen to Fredkin's contemptuous recital of human limitations.
"Basically, the human mind is not most like a god or most like a computer. It's most like the mind of a chimpanzee and most of what's there isn't designed for living in high society [sic] but for getting along in the jungle or out in the fields. ... The mere idea that we have to be the best in the universe is kind of far-fetched. ... The fact is, I think we'll be enormously happier once our niche has limits to it. We won't have to worry about carrying the burden of the universe on our shoulders as we do today. We can enjoy life as human beings without worrying about it."
The social vision implied by this kind of thinking is as regressive as the escapist psychology behind it. The psychology is the fantasy of total control, absolute transcendence of the limits imposed on mankind by its lowly origins. As for the social vision, it carries one step further the logic of industrialism, in which the centralization of decision-making in an educated elite frees the rest of us from the burden of political participation.
According to J. David Bolter, the computer promotes a new understanding of human limitations; and Fredkin's statement might lend itself to just this sort of misinterpretation. Like Turkle, Bolter studies the computer's imaginative impact. He too concludes that computers evoke a new image of man as a machine. But he offers a more encouraging reading of this "major change in sensibilities," one reminiscent of those advanced by Marshall McLuhan and Alvin Toffler. The computer has undermined our "linear" conception of progress, Bolter thinks, and replaced it with a "finite world view." It has weakened the old Faustian "concern with depth" and encouraged a concern with surfaces. It has devalued emotional intensity; but "if the computer age does not produce a Michelangelo and a Goethe, it is perhaps less likely to produce a Hitler or even a Napoleon."
Bolter is not unaware of the computer's "Utopian" appeal; but he thinks that other considerations "balance" the Faustian implications of computer technology. "We are becoming aware of our own temporal limitations." But as Fredkin's feverish reflections show so clearly, this acknowledgment of limitations, prompted by a comparison of the slow-moving human mind with the computer's rapid calculations, does not mean what it appears to mean. Instead of accepting human limitations, theorists of artificial intelligence and other prophets of the new electronic age dream of overcoming them by creating a new race of machines, just as genetic engineers dream of redesigning the human body so as to free it from all the ills that flesh is heir to.
Like most people who write about computers, Bolter--a classicist with a master's degree in computer science--has no interest in politics and no conception of the political context of computer technology. His claim that computers foster a sense of limits rests, in part, on the irrelevant observation that "computers figure largely in all facets of conservation and rational consumption." Conservation and consumption are political issues, not technical issues, and computers in themselves will do nothing to bring about a rational allocation of scarce resources or a less exploitive attitude toward nature. Conservation runs counter to our entire system of large-scale capitalist enterprise. It demands small-scale production, political decentralization, and an abandonment of our consumer culture. It demands a change in the way we live, not a new technology, even a "revolutionary" technology. In any case, the revolutionary impact of information technology has been greatly exaggerated by students of "megatrends." Like all technologies, the computer solves problems that are defined not by technology itself but by the prevailing social priorities. In a society based on the ruthless exploitation of natural resources and on the dream that man can "raise himself above the status that nature seems to have assigned him," in Bolter's own words, the computer will serve as another weapon in man's war against nature. More prosaically, it will serve as a means of producing corporate profits. In a more democratic society, the computer might serve more constructive purposes.
Technology is a mirror of society, as Turkle insists, not a revolutionary force in its own right. It shows us ourselves as we are and as we would like to be; and what it reveals, in the case of the computer, is an unflattering image of the American at his most incorrigibly escapist, hoping to lose himself--in every sense of the term--in the cool precision of machines that know everything except everything pertaining to that "bloody mess of organic matter."