How Digital Devices Deprive Brain of Needed Downtime
In Greek mythology Sisyphus, an evil king,
was condemned in Hades to roll a big rock to the top of a mountain forever,
only for the rock to roll back down each time.
A similar version of Hell is suffered every day by people
managed by micromanagers and control freaks.
"Cram them full of non-combustible data, chock them so damned full of 'facts' they feel
stuffed, but absolutely brilliant with information. Then they'll feel they're thinking, they'll
get a sense of motion without moving." �
Ray Bradbury, Fahrenheit 451, 1951
"It is not only information that they need-in the age of Fact, information often dominates
their attention and overwhelms their capacities to assimilate it�.What they need , and what they
feel they need , is a quality of mind that will help them to use information and to develop reason
in order to achieve lucid summations of what is going on in the world and of what may be happening
within themselves."
� C. Wright Mills, The Sociological Imagination, 1959
The Internet now often means information overload. It has several dimensions:
Too much information ("pure informational overload," or "drinking from the fire hose," as in
Drinking from the Fire Hose at Microsoft). This situation is typical for college students (see
Mental overload). It is also typical for Unix/Linux sysadmins, as a modern OS is an extremely
complex maze of subsystems, each of them complex enough to be impossible to learn in full.
Self-inflicted overload, verging on obsessive-compulsive
disorder: a broad category of "voluntary workaholics" belongs here.
Smartphone addicts and computer gamers
are typical victims of this type of chronic overload. This kind of addiction is similar to eating
disorders: slavery to online services, games, or smartphones -- an obsession with computers that
borders on addiction. "Instead of having long relaxing breaks, like taking two hours for lunch,
we have a lot of these phone calls, checking email and playing games" that destroy our lives.
Overwhelming complexity of the situation, and/or the useful signal being lost in noise.
Typical examples include network and server troubleshooting, intelligence gathering, and pilots
in complex meteorological conditions or during "blind" landings. We can also add to this category
troubleshooting of complex programming and/or networking problems, when the complexity of the material
is beyond human capacity to comprehend and people spend days trying to figure out what is wrong and why.
Network administrators and security analysts know this type of situation pretty well. The inability
to find information in a sea of other information (finding a needle in a haystack) is highly stressful.
Force-feeding with "junk" information, when the
neoliberal social system produces more information than is
necessary for normal functioning, most of it of low quality (email spam is a typical phenomenon
in this category). This is a new kind of pollution largely unique to neoliberalism (although
in the advertising and newspaper domains it emerged in some areas even before 1973 -- the time
of the neoliberal counterrevolution in Chile): information smog. Information
smog replaced information scarcity with information overabundance, which has several dimensions
of its own.
Printing press smog. What started out as a liberating stream during the Renaissance
has turned into a deluge of chaos. In the USA, for example, there are ten thousand newspapers
and magazines. There are also more than 100,000 new book titles published every year (probably
more than a million worldwide), and, just for the record, over 60 billion pieces of advertising
junk mail come into our mailboxes every year. Everything from telegraphy and photography in the
19th century to the silicon chip in the twentieth has amplified the inflow of information, until
matters have reached such proportions today that for the average person, information no longer
has any relation to the solution of problems. In mild forms printing press smog is usually pretty
benign. You just need to eradicate the view that 'Knowledge is Power' and start regularly
throwing out computer magazines that were never opened ;-).
Web smog. The situation deteriorated with the Internet. A typical Google search
is an example of junk dominance: many of the sites it returns have no real value for the particular
query. It also vividly demonstrates that most information is of low
quality, repetitive, or outright deceptive (financial information belongs to the latter category).
Web smog is a larger and more dangerous problem than printing press smog: while newspapers
avoid useful information and generally repeat government talking points when depicting foreign
events, here useful
information is buried in an avalanche of useless and deceptive information (fake news), making it
essentially unavailable. Some forms of Internet smog, such as email overload
and its ultimate manifestation -- email spam -- can now be effectively controlled by technical
means.
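One classic technical means of controlling spam is a naive Bayes word-frequency filter. The sketch below is a minimal illustration of the idea, not any real filter's implementation; the training messages are invented examples, and production filters (e.g., SpamAssassin's Bayes module) are far more elaborate.

```python
import math
from collections import Counter

# Toy training corpus (invented examples, not real mail).
spam = ["win free money now", "free offer click now", "cheap meds free"]
ham = ["meeting moved to noon", "server backup finished", "lunch at noon"]

def word_counts(msgs):
    c = Counter()
    for m in msgs:
        c.update(m.split())
    return c

spam_counts, ham_counts = word_counts(spam), word_counts(ham)
vocab = set(spam_counts) | set(ham_counts)

def spam_score(msg):
    """Log-odds that msg is spam, with add-one (Laplace) smoothing."""
    score = 0.0
    for w in msg.split():
        p_spam = (spam_counts[w] + 1) / (sum(spam_counts.values()) + len(vocab))
        p_ham = (ham_counts[w] + 1) / (sum(ham_counts.values()) + len(vocab))
        score += math.log(p_spam / p_ham)
    return score

print(spam_score("free money offer"))  # positive score: looks like spam
print(spam_score("backup at noon"))    # negative score: looks legitimate
```

A message is flagged when its score exceeds a chosen threshold; real filters train on thousands of messages, so individual word frequencies become far more reliable than in this toy corpus.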
Email and instant messaging smog. It was a real problem in the late 1990s; it has since
diminished in importance, displaced by Facebook "visibility" mania. The younger generation
rarely uses email, preferring instant messaging via smartphones. The compulsive desire to check
your smartphone is a telling sign of the problem here.
Facebook "visibility" mania and similar additions.
Red-tape-induced, or bureaucratic, overload. Often connected with
Corporate bullshit as a communication method.
In large corporations, a typical situation creating overload (and "shadow IT") is that accomplishing
anything requires jumping through bureaucratic hoops and demands tremendous additional, counterproductive
effort. Red tape became a real problem for large corporations with the growth of
overcentralization and outsourcing. It is especially acute when working for a clueless
micromanager (see Control Freaks),
who makes accomplishing anything extremely difficult and drowns you in useless paperwork
supposedly needed for him/her to make a decision that often should be made on the spot. Often,
despite numerous documents and emails, it is completely unclear how to perform a particular activity,
especially after some arbitrary change, such as the introduction of DevOps or another
fashion-of-the-year hoopla.
Working in an environment where one person has too many responsibilities and simply
cannot cope with them (a typical situation in startups, where often the whole IT department is two
or three people). This also happens when you get into a new working environment and need time
to adapt but are put straight "in the trenches": there is too much new information, and you are
not yet able to process it and filter out what is vital.
The combination of information overload with prolonged exposure to stress is especially toxic.
A thunderstorm is God's way of saying that you
spend too much time
in front of the computer.
Prolonged exposure to stress (e.g., working for
a corporate psychopath) increases
the level of overload and contributes to the level of stress you experience.
Prolonged exposure to information overload produces so-called information fatigue syndrome.
Symptoms include paralysis of analytical capacity, increased anxiety, greater self-doubt, and a tendency
to blame others. Long exposure produces symptoms similar to post-traumatic stress disorder, and
in milder forms it is intrinsically connected with demoralization and burnout. Here the most helpful
page is probably the Softpanorama Humor Archive, a unique collection of
open-source-related humor. Humor is one of the most effective methods of fighting stress and
overload. It helps a person remain positive in difficult situations more effectively than most drugs.
When people are faced with more information than they can process, they become unable to make decisions
or take action. Both important aspects of this problem -- overwhelming complexity with the useful
signal lost in noise, and force-feeding with "junk" information (information smog) -- were covered above.
Information overload is often typical for high-tech startups. "Technology has changed, but human
nature hasn't. Whether it's the Gold Rush of 1849 or the Web Rush of 1999, people are people. More often
than not, they're miserable, nasty, selfish creatures, driven by vanity and greed, doing whatever they
can to get ahead, even if it means stepping on the person next to them, crushing the weak, and destroying
themselves in the process." Actually this is not true. The IT industry is a unique environment;
we are given considerably more choice as to where our priorities lie than in many other jobs. But there is
no free lunch. You want a cool job? Don't expect to work for a huge company and get paid the big bucks.
You want to make good money? Don't expect to be able to leave the office in the middle of the day just
to sit in the park and drink coffee. You want to make great money? Don't expect to work 40 or even 50
hours a week...
Startups aren't about paradise, nor are they viable for those who crave security. They
are about risk -- not just financial but also emotional and intellectual. Some think that the rewards
for success are worth it, some not... It's true that some startups hire, then harass and inflict burnout
on programmers and sysadmins. Life in the fast lane can be brutal: long hours, almost no employer-employee
loyalty, greed and moral cowardice, back-stabbing, pressure, etc. If you don't want to do what your
boss wants, a startup can probably find an immigrant who will do it for less money. That is the Silicon
Valley Way (TM).
Many visitors to this page are probably system administrators. It is sad to say, but sysadmins
are often the janitors of e-business. To clean up the messes left by ugly packages, superfast growth,
and unrealistic schedules, they often work long, late hours. It's a thankless job (although not the only
one and not the most miserable one...). The reality is that sysadmins and programmers in startups
and in small companies struggling to survive are sometimes put under substantial stress...
I'm surprised most of them aren't more neurotic from sleep deprivation.
At the same time, many sysadmins in established companies working with "Gold" coverage from Sun or
HP can surf the Web for 80% of the day... And if you rarely show up before 11 a.m., staying past
midnight once in a while is sometimes just a survival skill... In large companies most sysadmin
roles aren't constant firefighting, and there is not so much stress, but there is little or no training,
and unless you engage in self-study your skills degrade; the routine tends to wear on a capable person
pretty quickly.
Sometimes it really looks like a cleaning job with slightly better pay (not by much, if
you calculate the total number of hours worked per year and account for the fact that many sysadmins
have no real replacement and are always "on call"). You clean it today, but in a month everything
returns to the same state. Sometimes it makes sense to play the idiot in a large company, in the best
traditions of the Peter Principle. Officially recognized low performers often
can spend 90% of their time addressing only 10% of the problems a high performer needs to address. The
most valued employees in large companies are often on the verge of burnout because they are too overloaded
and have way too many pressures, conflicts, and demands combined with too few rewards, acknowledgments,
and successes.
For IS top guns it might make sense to stop digging infodirt for a moment and ask themselves a simple
question: "Is working with fancy hardware and software (let's assume for a moment that Unix can
be fancy for the first five years or so ;-) worth 60, or even 40, hours a week of cleaning infodirt?"
Independent of your answer, thinking about this may help you adjust your priorities :-).
Pseudo-Attention Deficit Disorder
Some programmers are perversely
wired. It is not uncommon for them to be sitting in a meeting and using a hand-held device
to exchange instant messages surreptitiously - with someone in the same meeting. In such cases we
can talk about Pseudo-attention
deficit disorder. Some signs:
I find my mind wandering from tasks that are uninteresting or difficult.
I say things without thinking and later regret having said them.
I make quick decisions without thinking enough about their possible bad results.
I have a quick temper, a short fuse.
I have trouble planning in what order to do a series of tasks or activities.
In group activities it is hard for me to wait my turn.
I usually work on more than one project at a time, and fail to finish many of them.
"... With the boundaries between home and the workplace blurred as the result of many people working from home, Friedman describes work burnout as a "pandemic within a pandemic." ..."
"... "It's quite natural to feel burnt out right now," Friedman said. "And it's because of the decimation between work and life boundaries and the fact that we're all juggling our kids on top of our basic work responsibilities." ..."
"... The better approach, he said, is to learn more rather than working less in order to increase your energy. Learning new things will provide a mood and confidence boost, while also fulfilling one's "basic psychological need for growth." ..."
"... "What we know from the research is that when you take care of the entire employee by fulfilling their basic human psychological needs of autonomy, competence, and relatedness, they tend to be more productive," Friedman said. "So this is something that should be top of mind for any leader hoping to motivate their staff." ..."
"... "[Employees are] having the ability to focus in a way that just isn't available to them in the office," Friedman said. "And I'm heartened by the fact that I think more organizations are aware of those biological needs." ..."
Changes to the workplace brought on by the COVID-19 pandemic will cause "a revolution in the way that organizations operate,"
Dr. Ron Friedman, social psychologist and author of "Decoding Greatness," told Yahoo Finance Live.
With the boundaries between home and the workplace blurred as the result of many people working from home, Friedman describes
work burnout as a "pandemic within a pandemic."
"It's quite natural to feel burnt out right now," Friedman said. "And it's because of the decimation between work and life
boundaries and the fact that we're all juggling our kids on top of our basic work responsibilities."
Amid a nationwide labor shortage, many Americans are returning to work in person, with the
CDC reporting that 52.6% of the population is inoculated with at least one dose and 43.9% are
considered fully vaccinated. However, a recent study found that 73% of U.S. workers
have some anxiety about returning to in-person work. And although some believe these concerns will
ease over time, working from home has taken a mental toll on many in the workforce.
Friedman, who has consulted for Fortune 500 companies, political leaders, and global non-profits, describes burnout as a situation
in which the requirements of an individual's tasks consistently outstrip the amount of energy they have available.
According to Friedman, there are two main ways of alleviating burnout. One of the strategies is to reduce the demands of work,
which may be difficult for many. Friedman admitted that a pitfall of this strategy is to attempt to cram more work into less time
when trying to work less, which ultimately elevates stress levels in the end.
The better approach, he said, is to learn more rather than working less in order to increase your energy. Learning new things
will provide a mood and confidence boost, while also fulfilling one's "basic psychological need for growth."
As for how companies and other organizations should approach the issue of burnout among their staff, Friedman argued that leaders
must take a more holistic approach to caring for employees. He stressed the need to care for the "entire employee," rather than just
the "sliver of them" who is in the office from 9 to 5.
"What we know from the research is that when you take care of the entire employee by fulfilling their basic human psychological
needs of autonomy, competence, and relatedness, they tend to be more productive," Friedman said. "So this is something that should
be top of mind for any leader hoping to motivate their staff."
Friedman cited realizations among workplaces that leaders must take additional steps to meet employees' biological needs if they
wish to fulfill their basic psychological needs. Because people have been doing things such as taking naps and going for walks during
the day, he suggested that peoples' biological needs have been better satisfied during the pandemic than they have been in generations.
These things allow for better focus that would not be possible in an office setting, according to Friedman.
"[Employees are] having the ability to focus in a way that just isn't available to them in the office," Friedman said. "And
I'm heartened by the fact that I think more organizations are aware of those biological needs."
Thomas Hum is a writer at Yahoo Finance. Follow him on Twitter: @thomashumTV
Jonathan Frostick, who does information technology work for financial services firm HSBC,
wrote on LinkedIn that his first thought while having a heart attack was "this isn't
convenient" for a meeting with his manager the next day. His second thought: "How do I secure
the funding for X (work stuff)."
His wife was fourth on the list of concerns, following worries about updating his will. But
since recovering in the hospital, he said he has re-evaluated his goals, outlining his
overhauled goals in a post that's gone viral
on the business-focused social network.
No more days packed with Zoom calls, for starters, the U.K.-based worker wrote. "I'm
restructuring my approach to work," Frostick continued. "I'm really not going to be putting up
with any s#%t at work ever again -- life literally is too short."
Frostick's post is striking a chord at a time when the boundaries between work and home life
have all but disappeared for millions of white-collar workers. With more than 203,000 likes and
more than 10,000 comments on LinkedIn, people are posting their own experiences with work,
health setbacks as well as sending him well-wishes.
Frostick updated his post to say that he's "up and walking."
"I never expected this post to hit home the message it did â€" but I'm pleased as
it has seemingly helped a lot of people," he wrote early on Wednesday.
... ... ...
Frostick, who didn't immediately respond to a request for comment,
told Bloomberg that his work days stretched to 12 hours, with him and his colleagues
spending long amounts of time on Zoom. The 45-year-old, who has three children, said he took
responsibility for blurring the line between work and home life.
"Whereas before I would finish sensibly anywhere between five and half six, I'd be finding
myself there on a Friday at 8 o'clock at night exhausted, thinking I need to prep up something
for Monday and I haven't got time, and I started then to actually work weekends," Frostick told
the publication. "That's my responsibility. I think that was probably for me where it was those
blurring of boundaries."
Many people have developed a love-hate relationship with Zoom during the pandemic. While it makes remote
work possible, it can also lead to burnout, with Citibank CEO Jane Fraser last month designating
Fridays as a Zoom-free day to battle video-call fatigue. She also urged workers to set "healthy
work boundaries" and avoid scheduling calls outside business hours. "[T]he blurring of lines
between home and work and the relentlessness of the pandemic workday have taken a toll on our
well-being," she said in a memo to employees.
In the meantime, Frostick said in a LinkedIn update that he has an excellent manager, and
added that he wasn't forced to work on weekends.
"Yes I shouldn't have, but I wasn't forced to. I am deeply passionate about what I do. I'm a
(fortunate) living example of getting the mix wrong," he noted. "You are in charge of YOUR life
â€" make changes.
When Jonny Frostick realised he was having a heart attack this month, the first thing that occurred
to the HSBC contractor was: “I needed to meet with my manager tomorrow, this isn’t convenient.”
Then he thought about funding for a project, his will, and finally, his wife.
Frostick, who manages more than 20 employees working on regulatory data projects, chronicled his
near-death experience in a viral LinkedIn post that had been viewed almost 8 million times. The
45-year-old Briton is the latest financial employee to weigh in on the work-till-you-drop culture
during a pandemic that’s obliterated the lines between office and home life for droves of workers.
“Whereas before I would finish sensibly anywhere between five and half six, I’d be finding myself there on a Friday at 8 o’clock
at night exhausted, thinking I need to prep up something for Monday and I haven’t got time, and I started then to actually work
weekends,” Frostick said in a phone interview from his home in Dorset, England. “That’s my responsibility. I think that was
probably for me where it was those blurring of boundaries.”
Jonny Frostick and his wife, Adel.
“We all wish Jonathan a full and speedy recovery,” said HSBC spokeswoman Heidi Ashley. “The response to this topic shows how
much this is on people’s minds and we are encouraging everyone to make their health and wellbeing a top priority.”
Isolation and hours of Zoom calls
Frostick said he and
colleagues spend a disproportionate amount of time on Zoom calls, and work days can stretch to 12 hours. The isolation of remote
work also takes a toll, he said.
“We’re not able to have those other conversations off the side of a desk or by the coffee machine, or take a walk and go and
have that chat,” he said. “That has been quite profound, not just in my work, but across the professional-services industry.”
The former construction worker took a different path into finance from many of his peers. A native
of Bournemouth, an English coastal town, he worked in his father’s building business and didn’t
get a bachelor’s degree until he was 29.
When he arrived in
London, the self-described country boy had to learn how to use the Underground subway system, and mixed for the first time with
ballet and theatre aficionados. From there, he went down a path of intense work that included stints at Accenture, JPMorgan, UK
government ministries and Deutsche Bank. He cultivated a so-called mask to fit into corporate culture.
Frostick, who has three young children, said he is responsible for the overwork and neglect of his health that culminated in the
heart attack. Now he wants to share his wake-up call with others.
‘This could happen to you’
“I owe a responsibility to myself and other people,” Frostick said. “This happened to me, this could happen to you. You need to
change that.”
He wants to drive conversation about the post-pandemic work culture and hopes employers will implement a more flexible approach.
In the post, Frostick vowed to make changes, including limiting Zoom calls, restructuring his approach to work and spending more
time with family. The post received more than 201,000 likes and generated thousands of messages from people who are rethinking
their attitudes.
Frostick is still recovering from his hospital stay, and only has enough energy to get out of bed for a couple of hours at a
time. He’s enjoying time with his wife and children, and eventually wants to do more work on a dilapidated Mercedes. There’s
some talk about non-executive director roles or advisory work. Someone suggested he write a book.
The decision to write the raw LinkedIn post comes at a precarious time in his life and finances, said Frostick. He’s racked up
costs from court proceedings with his ex-wife over child-care arrangements for their daughter.
“My back’s against the wall,” he said.
Still, he doesn’t blame HSBC for his health problems and is bullish about future prospects.
“I don’t think this should reflect badly on the place where I work, I think it’s fairly consistent across the industry, and I
think that’s why it’s resonated with so many people,” he said. “If an organisation didn’t want to employ me because I’d actually
taken a moment to reflect, and capture this, then that’s probably not the right place for me to be working.”
Not only social media but also regular MSM websites create a "dopamine loop" in which users
spend an inordinate amount of time browsing for news.
Notable quotes:
"... Thanks to neuroscience, we're beginning to understand that achieving a goal or anticipating the reward of new content for completing a task can excite the neurons in the ventral tegmental area of the midbrain, which releases the neurotransmitter dopamine into the brain's pleasure centers. ..."
"... Twitter is wonderful because a lot of journalists, writers, scientists, artists, and activists frequent it, so I get many fascinating links and insights from all over the world that I would never find otherwise. Twitter is horrible because it takes every aspect of American politics that is currently horrible, and amplifies it, and the short form may itself encourage more horribleness. ..."
"... Of course, neoliberalism produces plenty of desperation. "The good autocrat provides many opportunities for failure in the populace" –Frank Herbert, Children of Dune . ..."
"... I think that if one honestly mined the user population, they would find that dopamine rushes apply to only a segment of users – the socially insecure, which accounts for most children and many young adults. There is also the cofactor of the smartphone, which for some has become the technological equivalent of Linus' blue blanket. ..."
"... I find social media to be most useful to keep up with old friends scattered all over the world because smartphones make their sharing spontaneous ..."
"... Addiction is rarely likely to be the case, because craving is only one part of addiction. The other part is getting physically ill when you stop. Being cranky or preoccupied when deprived of social media is not illness – it is just annoying to others. ..."
"... Regardless, getting users addicted to running in the dopamine hamster wheel is exactly what the social media engineers have been designing to achieve on purpose. Because every turn of the wheel generates more money for the social media platform owners who pay the social media engineers to do the engineering. Except for those founding social media engineers who founded the platforms themselves, like Zuckerberg. Their incentive to addict as many hamsters as possible to running in the dopamine wheel is even stronger. ..."
"... We live in a society where people are lonely, isolated and insecure, and where they are officially encouraged to fight each other for financial or social/identity advantage. ..."
"... But people don't actually like doing this, and would rather be members of communities than be good liberal autonomy maximizers. But if you haven't got a real community any more, you're much more likely to adopt, and even use to excess, something that has the outward trappings of one. ..."
"... "Dopamine" is just a trendy term for "reinforcement" or, before that, "pleasure." So an important reminder: we're always – ALWAYS – "manipulating human nervous tissue." That's what it means to be an obligate social animal. ..."
"... i am addicted to Naked Capitalism, and proud of it. Both articles and comments. ..."
"... Another industry that has been in this business a long time is the gambling and casino industry. Slot machines, video poker, etc., are also software constructions explicitly designed to engage users as strongly as possible and keep them engaged for as long as possible, in order to generate as much profit as possible. ..."
"... I have to say that many programmers are very young, and mostly male, and when in groups, for whatever reason, I've observed among them a distinct lack of empathy, a lack of worldly wisdom and questioning, and an inability to imagine any other kind of life experience than what the engineer has personally known, no matter how well intentioned the individual is (and they are sometimes rather the opposite). Meanwhile the much more experienced, worldly and wise managers stand over the coding team, giving direction and applause and monetary rewards for every bit of "cleverness" the team comes up with, no matter how deranged. Every incentive is in favor of sociopathic mindless greed. And who goes to prison when something goes wrong? The engineer. ..."
The existence of a "dopamine loop" created by social media likes and clicks is conventional
wisdom in Silicon Valley, but I haven't been able to find the original science behind it. (It
is possible that the phrase is used because it sticks in the mind and makes the user
sound authoritative, like "kompromat.")
The Atlantic describes the dopamine loop as "neuroscience" (hmm) in 2012:
Thanks to neuroscience, we're beginning to understand that achieving a goal or
anticipating the reward of new content for completing a task can excite the neurons in the
ventral tegmental area of the midbrain, which releases the neurotransmitter dopamine into the
brain's pleasure centers. This in turn causes the experience to be perceived as pleasurable.
As a result, some people can
become obsessed with these pleasure-seeking experiences and engage in compulsive behavior
such as a need to keep playing a game, constantly check email, or compulsively gamble online.
A recent
Newsweek cover story described some of the harmful effects of being trapped in the
compulsion loop.
... ... ...
And so let me circle round to the programmer. Here's an example of
manipulating human nervous tissue at Instagram (owned by Facebook). From the
Toronto Globe and Mail:
The makers of smartphone apps rightly believe that part of the reason we're so curious
about those notifications is that people are desperately insecure and crave positive feedback
with a kneejerk desperation. Matt Mayberry, who works at a California startup called Dopamine
Labs, says it's common knowledge in the industry that Instagram exploits this craving by
strategically withholding "likes" from certain users. If the photo-sharing app
decides you need to use the service more often, it'll show only a fraction of the likes
you've received on a given post at first, hoping you'll be disappointed with your haul and
check back again in a minute or two. "They're tying in to your greatest insecurities," Mr.
Mayberry said.
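The mechanism Mayberry describes, revealing only a fraction of a user's likes at first so they check back for the rest, can be sketched in a few lines. This is purely illustrative: the function name, the "wants user back" flag, and the withhold fraction are all assumptions for the sketch, not Instagram's actual logic:

```python
def likes_to_show(total_likes: int, wants_user_back: bool,
                  withhold_fraction: float = 0.5) -> int:
    """Hypothetical sketch of 'strategic withholding': if the platform
    decides you should check back sooner, it reveals only part of
    your likes on first view and saves the rest for your return."""
    if wants_user_back:
        return int(total_likes * (1 - withhold_fraction))
    return total_likes

# First view for a user the platform wants back: half the haul...
print(likes_to_show(100, wants_user_back=True))   # 50
# ...versus the full count for everyone else.
print(likes_to_show(100, wants_user_back=False))  # 100
```

The disappointment gap between the first view and the eventual total is the hook; the code itself is trivially simple, which is rather the point of the article.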
NOTES
[1] I have a carefully curated list. Twitter is wonderful because a lot of journalists,
writers, scientists, artists, and activists frequent it, so I get many fascinating links and
insights from all over the world that I would never find otherwise. Twitter is horrible because
it takes every aspect of American politics that is currently horrible, and amplifies it, and
the short form may itself encourage more horribleness. On the whole, however, I prefer Twitter
because I curate my news feed -- I suppose it could be said that I titrate my dosage
-- and not Facebook's faceless engineers.
[2] Of course, neoliberalism produces plenty of desperation. "The good autocrat provides
many opportunities for failure in the populace" – Frank Herbert, Children of Dune.
The fact that people can't stop staring at and interacting with their phone screens while
driving, walking, conversing, or even having sex, if news reports are to be believed, is
indicative of the addictive nature of the technology. Or would "format" as opposed to
"technology" be the more appropriate term? The "Technology" section on Google news page seems
to consist largely of infomercials for social media.
I think this story raises an interesting, albeit fictional, account of why people use
social media so much. I think that if one honestly mined the user population, they would find
that dopamine rushes apply to only a segment of users – the socially insecure, which
accounts for most children and many young adults. There is also the cofactor of the
smartphone, which for some has become the technological equivalent of Linus' blue
blanket.
You do see evidence from posts on FB that some people are seeking approval of their daily
lifestyle. "Here I am at this club, this restaurant, this event, with this person." If they
do not get many likes, do they patronize alternatives? Perhaps they seek reassurance that
they are tasteful, "in," cool, not overdoing, etc.
As an elderly FB user, I grew up with computers (ARPANET, BITNET, the Internet), PCs,
laptops, tablets, cellphones, palm pilots, smartphones, etc. These are tools for various
purposes. When working, I would not tolerate people putting their smartphones on the meeting
room table – it was/is a rude distraction. It was okay to use them for scheduling the
next meeting date, or making a note of a new task. It was/is handy to have my rolodex IN my
phone now, and a diary that vibrates to remind me of my next appointment – even in
retirement! None of those uses smack of abusive use, and the vibration in my pocket does not
produce a dopamine rush.
I find social media to be most useful to keep up with old friends scattered all over the
world because smartphones make their sharing spontaneous. "Likes" for my peers are often
ratifications that grandparenting is indeed gratifying, isn't it nice that we can travel, or
sharing in the glee of a new puppy. Passé email is still a wonder because we can daily share
private ideas, experiences, new theories, or discuss world events just like when we were
teens or in college. Lastly, there are blogs. Like NC, I learn more and faster what is going
on in my world, and I can adjust the diversity of my input (which for me is quite high).
Finally, from a neuroscience perspective, dopamine is often characterized as if it were an
addictive neurotoxin like heroin or cocaine. We hear rants about how people are addicted to
their iPhones (a metaphor of sorts for social media). Addiction is rarely likely to be the
case, because craving is only one part of addiction. The other part is getting physically ill
when you stop. Being cranky or preoccupied when deprived of social media is not illness
– it is just annoying to others.
""The short-term, dopamine-driven feedback loops we've created are destroying how society
works," he said, referring to online interactions driven by "hearts, likes, thumbs-up." "No
civil discourse, no cooperation; misinformation, mistruth."
Likes makes right. Those posts with fewer likes become invisible compared to those with
more. Having a discussion might lead one or both sides to learn something and come to a place
of mutual understanding, if not agreement, but why bother with all that when it's a million
times easier to simply block out disagreeing voices? Hell, the apps do that for you.
If you want likes, keep it simplistic, feel-good and humorous. Posting anything
thought-provoking causes the dopamine machine gun to stutter, and that's poor form.
I remember reading/hearing that the "pleasure center" in the brain is supposed to be a
real bunch of neurons which really exists. Dopamine is supposed to be one of the
neurotransmitters secreted therein. Various other braincell fiber pathways are supposed to
connect to it such that when survival-necessary activities send related sensory-stimulus
impulses through those pathways, that some dopamine is secreted which makes the pleasure
center make the brain-at-large feel good. The brain will seek more such feel-good
dopamine-pellet rewards by driving the body to engage in more such survival-prolonging
activities such as eating food or having procreational sex.
It is so much easier to use and to hear the three-word phrase "dopamine feedback loop".
Perhaps "dopamine feedback loop" is a metaphorical word-model for the whole process alluded
to above, just as Niels Bohr's little solar system model was a metaphorical diagram-model for
an "atom".
Regardless, getting users addicted to running in the dopamine hamster wheel is exactly
what the social media engineers have been designing to achieve. Because every turn
of the wheel generates more money for the social media platform owners who pay the social
media engineers to do the engineering. Except for those founding social media engineers who
founded the platforms themselves, like Zuckerberg. Their incentive to addict as many hamsters
as possible to running in the dopamine wheel is even stronger.
Their statements of dismay are so much virtue signalling and mutual back patting. Their
actions all say: more hamsters, please. And spin the wheels faster.
(I don't have a cell phone because cell phones cause cancer. I don't do facebook because
facebook was never anything but a clever conspiracy to trick people into building dossiers on
themselves. I don't do twitter because I don't have the energy or the desire to be known and
followed. Reading and commenting on 3 or so blogs is the closest I come to running in the
dopamine hamster wheel).
I'm as anti these social media companies as anyone, and never use their products. But I
wonder if some of their success doesn't come from kicking into an open goal.
We live in a society where people are lonely, isolated and insecure, and where they are
officially encouraged to fight each other for financial or social/identity advantage.
But people don't actually like doing this, and would rather be members of communities than be
good liberal autonomy maximizers. But if you haven't got a real community any more, you're
much more likely to adopt, and even use to excess, something that has the outward trappings
of one.
"Dopamine" is just a trendy term for "reinforcement" or, before that, "pleasure." So an important reminder: we're always – ALWAYS – "manipulating human nervous
tissue." That's what it means to be an obligate social animal.
However, I have only a "dumb" phone (a lot of us, here on NC), and minimize my involvement
with Facebook; not on Twitter at all.
Of course, with a recent rash of babies in my family (my siblings are suddenly
grandparents – long generations in my family), I've been introduced to "23snaps," a
picture-sharing platform. It's annoying, no matter how cute the babies are.
I read Jaron Lanier's books, You Are Not a Gadget and Who Owns the Future?, when they came
out. He has been skeptical of Facebook all along. He is also highly skeptical of EULA
agreements–the idea that software is licensed to you and that the licenser then has
access to your computer because you are not the owner. He also pointed out several years ago
that certain assumptions about software, for instance, that text should go into a "file,"
have frozen innovation. As a musician, he is definitely not keen on musical software (neither
the software for storing / playing music nor composition software).
The dopamine connection sounds like a bunch of quant majors searching for something from
their required bio course. The problem with Facebook is that it is Pavlovian–you get
approval and go back for more approval. Ding, ding, ding. The reason that the dopamine
connection is popular is that it reinforces some currently received ideas about the chemical
brain. Pavlov was about behavior: But criticizing behavior is so darn patriarchal and
judgmental and old fashioned. With chemicals, no one has to answer for behavior. It's the
fault of covalent bonding.
The brain either has chemicals in it or it doesn't. If it does, the people who understand
that fact and figure out how to study what those chemicals have to do with what will know
more than those people who don't understand that fact and don't study anything to do with
that fact or how it operates.
I made a conscious decision to not belong to Facebook, to cancel Twitter, and to not use a
cell-phone except for its communication as a phone-thing. If we human beings had used our
techno time to help solve some of earth's problems (pollution, climate change,
over-population, poverty, inequality, etc.) we would have been on the path of solving these
most important problems already. Technology of the type named basically keeps us from
confronting and resolving these most important problems. Another fear I have of the overuse
of technology (along with a world run by billionaires) is the weakening and finally the
breaking down of democracy itself.
I will just say thank you for not having any type of 'likes' on NC. I enjoy the fact that
here, people's words stand on their own and people can make up their own minds what to
think.
And these days, how many 'likes' or 'followers' or whatever are from real human beings as
opposed to bots? Seem to remember reading about a whole cottage industry where one could
purchase followers to make themselves seem more popular.
But what about the software engineers who also "did it anyway"? That horrid little piece
of manipulation -- "strategically withholding 'likes'" -- was implemented by a team. There
was a manager, there was a whiteboard, there were design sessions, there was testing, there
was coding, all for software engineered to treat humans like cattle.
Another industry that has been in this business a long time is the gambling and casino
industry. Slot machines, video poker, etc., are also software constructions explicitly
designed to engage users as strongly as possible and keep them engaged for as long as
possible, in order to generate as much profit as possible.
By Facebook standards, gaming machines are quite crude: perform some physical act, then
get a monetary payout (or not), repeat. It's straight variable-ratio reinforcement, as
the behaviorists used to say. But it seems to work quite well, and no one can say it isn't
intentional.
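A variable-ratio schedule rewards after an unpredictable number of responses, which is exactly why it is so resistant to extinction. A minimal simulation, with an arbitrary 10% payout probability chosen only for illustration:

```python
import random

def pull_lever(p_win: float = 0.1) -> bool:
    """One trial on a variable-ratio schedule: each pull pays off with
    fixed probability, so the gap between rewards is unpredictable."""
    return random.random() < p_win

random.seed(42)  # reproducible run
gaps, count = [], 0
for _ in range(10_000):
    count += 1
    if pull_lever():
        gaps.append(count)  # pulls since the last payout
        count = 0

# The mean gap sits near 1/p (about 10 here), but individual gaps swing
# from 1 to dozens -- that unpredictability is what keeps the lever pressed.
print(round(sum(gaps) / len(gaps), 1))
```

The behaviorist point survives the simplicity: a fixed-ratio machine ("every tenth pull pays") would be abandoned the moment the pattern was noticed, while this one never gives the pattern away.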
It should be noted that no ethically-trained software engineer would ever consent to
write a DestroyBaghdad procedure. Basic professional ethics would instead require him to
write a DestroyCity procedure, to which Baghdad could be given as a parameter.
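The quip is usually attributed to Nathaniel Borenstein, and it lands precisely because the "ethical" fix is a pure refactoring. Rendered as hypothetical code, good engineering practice is fully satisfied while the act is untouched:

```python
def destroy_city(city: str) -> str:
    """Generalized, reusable, properly parameterized -- and exactly as
    monstrous as the hard-coded version. The 'ethics' here addressed
    the code structure, never the act itself."""
    return f"Destroying {city}"

def destroy_baghdad() -> str:
    # the objectionable hard-coded procedure is now a mere special case
    return destroy_city("Baghdad")

print(destroy_baghdad())  # Destroying Baghdad
```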
Also this post from
Clean Coder, about the VW diesel fraud and the engineer who's going to prison for "just
following orders":
Imagine the scene in that meeting room. What was said? What was agreed to? We may never
know all the details; but it's clear that the executives asked the engineers to find a way
to defeat the emission tests.
Now think of the engineers. What a cool problem to have to solve? No, really! Imagine
how much fun it would be to figure out some sneaky way to bypass the emission test.
[snip]
Imagine the brainstorming, the "good" ideas. The coolness of knowing that there's a
really nifty solution to this problem.
Imagine how pleased the executives would be with this really cool engineering solution.
Imagine how proud the engineers were.
I have to say that many programmers are very young, and mostly male, and when in groups,
for whatever reason, I've observed among them a distinct lack of empathy, a lack of worldly
wisdom and questioning, and an inability to imagine any other kind of life experience than
what the engineer has personally known, no matter how well intentioned the
individual is (and they are sometimes rather the opposite). Meanwhile the much more
experienced, worldly and wise managers stand over the coding team, giving direction and
applause and monetary rewards for every bit of "cleverness" the team comes up with, no matter
how deranged. Every incentive is in favor of sociopathic mindless greed. And who goes to
prison when something goes wrong? The engineer.
Robert Martin's speech "The Scribes' Oath" from GOTO 2017 also comes to mind. (The video
is very easy to find on YouTube, but the URL is blocked where I work so I'm not able to
provide it.) I've only read his "code of conduct" so I'm not certain whether his speech goes
into the ethics of certain programming decisions, as opposed to strictly technical decisions.
If there were some sort of "oath" required for the programming profession, I would
hope it placed ethical and moral considerations much more highly than merely technical ones
like requiring unit tests or not blocking other people's commits. While the scribes in
ancient Egypt were highly valued and technically skilled, they fundamentally served
autocratic power. And it is the same for us.
If we had a real profession, those programmers would be brought before that profession,
investigated, and if found guilty, drummed out of the profession in disgrace.
-- "VW" 14 October 2015
Of course, we don't have a real profession. Just some mystique stolen from actual
engineers.
Well if it were a profession there would be some kind of job protections as well, maybe. But
haha. So maybe people just do it because if they don't, some H-1B will.
How the actual weakening of people's morality goes is: one may start out all idealistic and
moral, but in order to stay employed or get employment one gradually must compromise more and
more, and one HAS TO deaden oneself to the effect of this compromising. So one may start
out idealistic at 20, but chances are one isn't going to be such an idealist by the time they
reach 50. Oh heck, one would sell their soul several times over just to get a job by the time
they reach 50.
Nir Eyal wrote a whole book on this topic called Hooked.
There is a ton of skepticism here, but keeping people in anticipation of the next hit is
why there is endless scroll on the most time-sucking applications.
"... There is undoubtedly a habit-forming component where the lever pressing becomes compulsive at some point. B ..."
"... When I grew up, there was the concept of "substitute gratification" ranging over thumbsucking, substance use, casual TV watching, and other compulsive behaviors, all of which emerge in response to frustration and are magically absent when a better gratification is available, whether intellectual stimulation of otherwise. ..."
There is undoubtedly a habit-forming
component where the lever pressing becomes compulsive at some point.
But I have found myself as well as heard the same from peers, that
especially in a workplace setting there is a large boredom-killing
aspect.
As engineers, we do get opportunity to work on interesting or
at least intellectually challenging and satisfying things. When
you are on something like this, there is no urge to text, check
updates, etc. But as soon as you get stuck on some issue and frustration
sets in, or you are made to partake in office bureaucracy and other
unpleasantries, boredom kicks in.
When I grew up, there was the concept of "substitute gratification"
ranging over thumbsucking, substance use, casual TV watching, and
other compulsive behaviors, all of which emerge in response to frustration
and are magically absent when a better gratification is available,
whether intellectual stimulation or otherwise.
"... The trap was set at least twenty-five years ago and the mice jumped at the smell of the cheese. I am referring to the introduction of the computer as a mass necessity and the Internet that followed. I was slow to enter the trap, "forced" finally in 2007 by the college where I was teaching. ..."
"... In 1960 the sociologist C. Wright Mills said that there was far too much information for people to assimilate and make sense of and that lucid summations were needed. He was echoing Thoreau who in 1854 said: ..."
"... If you are acquainted with the principle, what do you care for a myriad instances and applications?" ..."
"... The Internet is a double-bind because we are damned if we do and damned if we don't. News, writing, and information of all sorts is now often not available any other way. The era of paper newspapers is coming to an end. This was meant to be. ..."
"... To put you into a state of frenetic passivity while whispering in your ear that there is no escape, while allowing elements of truth to emerge to keep you addicted. ..."
The trap was set at least twenty-five years ago and the mice jumped at the smell of the
cheese. I am referring to the introduction of the computer as a mass necessity and the Internet
that followed. I was slow to enter the trap, "forced" finally in 2007 by the college where I
was teaching.
Up to that point, I was just a member of The Lead Pencil Club, whose motto was "a speed bump
on the information superhighway" and whose membership list numbered twenty-three and a half
people worldwide. When I slowly and reluctantly reached for the cheese the trap snapped, not on
my neck to finish me, but on my head that was half in and half out.
The out part kept thinking.
What follows are that half-head's musings on why I didn't follow my intuition, the whole
damn sorry situation we are all in, and what we might do to spring the trap and run free. I
don't like this trapped feeling. And, by the way, the cheese was American, which is not exactly
real cheese.
In 1960 the sociologist C. Wright Mills said that there was far too much information for
people to assimilate and make sense of and that lucid summations were needed. He was echoing
Thoreau who in 1854 said:
"If you are acquainted with the principle, what do you care for a myriad instances and
applications?"
Mills said people needed to develop what he called the sociological imagination that would
allow them to condense and simplify news and to connect personal and social matters within
historical and structural contexts.
That was the long-lost era of newspapers, long-form paper magazines, the reading of books,
and minimal television stations. To think that there was far too much information then can only
make one laugh, now that the digital revolution has buried us in data, information, and
"breaking news" at warp speed, usually contradictory and lacking context.
The internet has literally made people crazy, created schizoid or split personalities who
don't know whether they are coming or going or what world they are in, physical or virtual.
This is the era of social schizophrenia. It is also the era of Covid-19 lockdowns when a far
greater online life is promoted as the necessary future.
If people once felt that all the information was too confusing and they were ending up
thinking and doing things ass-backwards as a result, back then they might have understood it if
you told them that the only way you can do anything is ass-backwards. Today, many would
probably greet you with a look of bewilderment as they googled it to see if there was a way to
swivel their asses to the front to get adjusted to the way they feel while waiting online for
clear directions to emerge. Which way does an ass go?
They will be waiting for a long, long time.
The Internet is a double-bind because we are damned if we do and damned if we don't. News,
writing, and information of all sorts is now often not available any other way. The era of
paper newspapers is coming to an end. This was meant to be.
Other sources of fact and fiction have gradually been eliminated, while the content on the
Internet has been dramatically increased and progressively censored. The dream of an open
Internet is turning into a nightmare.
If you look at the Internet's creation and development by the US
military-intelligence-Silicon Valley network as a tool for social control, propaganda, and
total spying, if you grasp this nexus and their intentions, you will come away realizing that
the Internet and the total integrated digital world is a dystopian tool designed to make you
crazy. To sow confusion and endless contradictory information from minute to minute. To "flood
the zone" (see Event 201) with propaganda and disinformation. To give you a headache, keep you
agitated, destroy your genuine human experience in the physical world.
To put you into a state of frenetic passivity while whispering in your ear that there is no
escape, while allowing elements of truth to emerge to keep you addicted.
This is the double-bind. It is what Jacques Ellul in 1964 called the technological society
that is ruled by technique in every aspect of its life. Technique is a way of thinking that
emphasizes efficiency; it is a way of thinking that emphasizes order and standardized means to
a predetermined end. It is rational, deliberate, and focused on results. It is a way of
thinking that has penetrated deep into the psychic structures of society and opposes
spontaneity and unreflective action.
Machines grow out of technical thinking, and today the computer, the internet, and
artificial intelligence are the ideal manifestations of such thinking. They are the result, not
the cause.
As such, digital technology satisfies the technical mindsets that have been created over the
decades, which includes regular people who have been gradually softened up to believe these
machine dreams. Efficiency, results, practicality, and speed. The human body as a wonderful
machine.
We have all been so conditioned, even those of us old enough to have lived before the
computer era. Starting particularly in the early 1990s with the rat-a-tat electronic frenzy of
the U.S. televised aggressive war against Iraq, euphemistically called the Gulf War and
presented live with round-the-clock television coverage by ghoulish announcers more excited
than 13-year-old boys with a porn magazine, the speed of everyday life has increased.
If you lived through those years and were sensitive to the social drift, you could feel the
pace of life pick up year-to-year, as everyone was induced to get in the fast lane. On the
information superhighway, it is the only lane.
Paul Virilio, a French thinker, has focused on this issue of speed in his studies of
dromology, from dromos: a race, running. While his language is perhaps too academic, his
insights are profound, as with the following point:
The speed of the new optoelectronic and electroacoustic milieu becomes the final void (the
void of the quick), a vacuum that no longer depends on the interval between places or things
and so on the world's extension, but on the interface of an instantaneous transmission of
remote appearances, on a geographic and geometric retention in which all volume, all relief
vanishes.
This is the world of teleconferencing and the online life, existence shorn of physical space
and time and people. A world where shaking hands is a dissident act. A haunted world of
specters, words, and images that can appear and disappear in a nanosecond. A magic show. A
place where, in the words of Charles Manson, you can "get the fear," where fear is king. A
locus where, as we sit at home "sheltering in place," we are no longer there.
Ernest Hemingway sniffed the future when, in The Sun Also Rises, he has the
protagonist Jake Barnes say no to Robert Cohn, who wants him to travel to South America with
him, with these words: "All countries look like the moving pictures."
That was 1926.
Things have changed a wee bit since then. But the essence of propaganda and social control
remains the same. "All those people who seek to control the behavior of large numbers of
other people work on the experiences of those other people," wrote R.D. Laing, in The
Politics of Experience . "Once people can be induced to experience a situation in a
similar way, they can be expected to behave in similar ways."
Mystification takes place when people can be convinced that a social construction –
e.g. the Internet and the digital life – is part of "the natural order of things," like
the air we breathe. And that life online is real life, better and more real than physical
existence.
I believe the digital revolution has gone a long way toward destroying our experience as
persons. It is the endless magical mystery tour that goes nowhere. It is the ultimate
psychodrama conjured by a satanic magician.
Do I exaggerate? Perhaps. But how else explain the spell this medium has cast on billions of
people worldwide? Did the human race suddenly get smart? Or are many more people crazy?
I ask myself this question, and now I ask you. Has the Internet and the devices to access it
made your life better or worse? Has it made the life of humanity better or worse? Has its
essential role in globalization made for a better world?
Obviously, there are pluses to the Internet, just as there are pluses to almost everything.
I don't deny that. The plus side of death is that the thought of it reminds you that you are
alive. The plus side of television is you don't have to turn it on. Like you, I could rattle
off many good things about the Internet (not cell phones, sorry). But on the scale of good and
bad, where do you come down? Where do I?
Or is it possible we can't decide because we are too conflicted and caught in a
double-bind?
I am of two minds, or more accurately, two half-heads. The upper part, pinned in the trap
and dead to my situation, can only answer yes, sir, now that I am trapped, my life is
better.
I can debate endlessly the minutiae of every issue thrown out like pieces of meat for caged
lions. I can check the weather forecast for every hour of every day of the week, even though I
know they will probably be wrong. I can get directions even though I know you don't need a
director to know which way the roads go. I can research issues quickly and pontificate as if I
were an expert on every matter from a to z. I can feel I am informed while feeling deformed by
the contradictory information that appears and disappears every few minutes.
Essentially, I can feel in-touch and worthy of respect from friends and neighbors because I
can exchange empty words with them about nothing. I can feel so very normal and rejoice in
that. I can feel sane.
On the negative side, well, my lower half-head, the one that's still thinking lead-pencil
thoughts, the slow and easy stuff, the calm cool breeze oh what a lovely daydreams – you
don't really need to hear what it has to bitch about the Internet. You can probably guess.
In a fine article, Vicious Cycles: Theses on a philosophy of news, in Harper's
Magazine, Greg Jackson writes the following about our addiction to so-called "news" (the
Internet):
When we turn away from the news, we will confront a startling loneliness. It is
the loneliness of life. The loneliness of thinking, of having no one to think for us, and of
uncertainty.
It is a loneliness that was always there but that was obscured by an illusion, and we will
miss the illusion. And we will miss tuning in each day to hear that voice that cuts boredom
and loneliness in its solution of the present tense, that like Scheherazade assures us the
story is still unfolding and always will be.
Doctortrinate,
So the tech monster is taking over our lives. But who has control over it? I think the
illusion is the belief that we control it, when in truth we're being played, led deeper into
its web, trapped in its net and held under its power, restricted through dependence, slowly
vulgarized into ineffectual hollowness, deteriorated until so manipulated by it that folk
won't know of or care for a life in any other way but artificial. So where are the
calls to restrict its influence, to question those who built it and who would use it
against us, those whose continuation is reliant on it taking them to a managed repeat,
sustaining their control, and completing the apparatus's infinite circle of physical
dominion?
This thing, this game, this performance – even to my lesser intermediate self, all
is insignificant.
TrueNorth,
"The calm cool breeze" is exactly how it feels. It feels refreshing to be able to think
independently.
tonyopmoc,
I have always liked Italians but Sara Cunial is something else.
What courage. What courage!
BRILLIANT!
"Italian MP, Sara Cunial, Blasts Bill Gates in the Italian Parliament"
I love that woman. I don't care if someone says she's right-wing or this or that, blah blah
blah.
tonyopmoc,
"Hobbes said that absolute power does not come from an imposition from above but by the
choice of individuals who feel more protected renouncing to their own freedom and granting it
to a third party. With this, you are going on anesthetizing the minds with corrupted Mass
Media with Amuchina (a brand of disinfectant promoted by Mass Media) and NLP, with words like
"regime", "to allow" and "to permit", to the point of allowing you to regulate our emotional
ties and feelings and certify our affects.
So, in this way, Phase 2 is nothing else than the persecution/continuation of Phase 1
– you just changed the name, as you did with the European Stability Mechanism (ESM). We
have understood people, for sure, don't die for the virus alone. So people will be allowed to
die and suffer, thanks to you and your laws, for misery and poverty. And, as in the "best"
regimes, the blame will be dropped only on citizens. You take away our freedom and say that
we looked for it. Divide et Impera (Divide and Rule).
It is our children who will lose more, who are 'raped souls', with the help of the
so-called "guarantor of their rights" and of CISMAI (Italian Coordination of Services against
Child Abuse). In this way, the right to school will be granted only with a bracelet to get
them used to probation, to get them used to slavery – involuntary treatment and to
virtual lager. All this in exchange for a push-scooter and a tablet. All to satisfy the
appetites of a financial capitalism whose driving force is the conflict of interest, conflict
well represented by the WHO, whose main financier is the well-known "philanthropist and
savior of the world" Bill Gates.
We all know it, now. Bill Gates, already in 2018, predicted a pandemic, simulated in
October 2019 at the "Event 201", together with Davos (Switzerland). For decades, Gates has
been working on Depopulation policy and dictatorial control plans on global politics, aiming
to obtain the primacy on agriculture, technology and energy.
Gates said, I quote exactly from his speech:
"If we do a good job on vaccines, health and reproduction, we can reduce the world
population by 10-15%. Only a genocide can save the world".
With his vaccines, Gates managed to sterilize millions of women in Africa. Gates caused a
polio epidemic that paralyzed 500,000 children in India and still today with DTP, Gates
causes more deaths than the disease itself. And he does the same with GMOs designed by
Monsanto and "generously donated" to needy populations. All this while he is already thinking
about distributing the quantum tattoo for vaccination recognition and mRNA vaccines as tools
for reprogramming our immune system. In addition, Gates also does business with several
multinationals that own 5G facilities in the USA.
On this table there is the entire Deep State in Italian sauce: Sanofi, together with
GlaxoSmithKline, are friends of Ranieri Guerra, Ricciardi, and of the well-known
virologist whom we pay 2,000 Euro every 10 minutes for presentations on Rai (Italian state
TV; she's probably talking about Burioni). Sanofi and GlaxoSmithKline sign agreements with
medical societies to indoctrinate future doctors, making fun of their autonomy of judgment
and their oath.
Hi-Tech multinationals, like the Roman Engineering, which is a friend of the noble Mantoan,
or Bending Spoons, of Pisano, are there to control and manage our personal health
data in agreement with the European Agenda ID2020 of electronic identification, which aims
to use mass vaccination to obtain a digital ID platform. This is a continuation of
the transfer of data started by Renzi to IBM. Renzi, in 2016, gave a plus 30% to the Gates
Global Fund.
On the Deep State table there are the people of Aspen, like the Saxon Colao, who with his
4-page reports, paid 800 Euros/hour, with no scientific review, dictates his politics like the
Bilderberg general he is, staying away from the battlefield. The list is long. Very long.
In the list there is also Mediatronic, by Arcuri, and many more.
The Italian contribution to the International Alliance Against Coronavirus will be of 140
million Euros, of which 120 million Euros will be given to GAVI Alliance, the non-profit by
Gates Foundation. They are just a part of the 7.4 billion Euro fund by the EU to find a
vaccine against Coronavirus – vaccines which will be used as I said before.
No money, of course, for serotherapy, which has the collateral effect of being super cheap.
No money for prevention – real prevention, which includes our lifestyles, our food and our
relationship with the environment.
The real goal of all of this is total control. Absolute domination of human beings,
transformed into guinea pigs and slaves, violating sovereignty and free will. All this thanks
to tricks/hoaxes disguised as political compromises. While you rip up the Nuremberg code with
involuntary treatment, fines and deportation, facial recognition and intimidation, endorsed
by dogmatic scientism – protected by our "Multi-President" of the Republic, who is a real
cultural epidemic of this country.
We, with the people, will multiply the fires of resistance in a way that you won't be able
to repress all of us.
I ask you, President, to be the spokesperson and give this advice to our President Conte:
Dear Mr. President Conte, next time you receive a phone call from the philanthropist Bill
Gates forward it directly to the International Criminal Court for crimes against humanity. If
you won't do this, tell us how we should define you, the "friend lawyer" who takes orders
from a criminal."
Thank you.
nondimenticare ,
The daily double-bind I face with the Internet, while I long for the days without it, lost to
me forever: I need it to find information – the "truth" – I am denied by mass
media, and was denied even in their better times. How much earlier I could have learned the
sordid background of the Vietnam War, JFK's assassination, 9/11 – even the origins of
World War I – without devoting most hours of my day to the task!
Yet all the truths I search for are only tentatively available to me, come with extraneous
negative baggage, and are in the process of being gradually withdrawn. Thus the trap is
sprung.
In relation to COVID, I know more than I could have hoped to know (thanks in great part to
OffG) owing to my digital link to others. But the irony is that the most frightful plans for
our futures, referenced by Curtin, would not be possible without that digital world.
The problem is, how much of the truth you have found about JFK at Al. is not just limited
hangout?
I know, I take what they give us and triangulate, and intuit, and shake and bake, but the
fact is they litter the landscape with endless red herrings.
I was at an ROTC school in N. Hollywood in my teens and though I never spoke to him, he
was a star student and was asked to give a number of talks.
He's become a world expert on "true" JFK and as a publisher, or editor, is a major
gatekeeper for, wow, most of the anti-Warren Conspiracy Realists.
I remember making a note on him at 16 years old that I really didn't trust his vibe. He
edited our school mag, so I started an underground one, very successful, till they kicked us both
out in June 1969. My dismissal was real, but
I often mused that they were crafting a legend for him, like LHO. I'm damned if I can prove
he's real.
That was all fifty years ago, but he almost rules the roost of the Oliver Stone side of
things. It could all be smoke and mirrors. Our school was a hotbed, as a rich, rich Army
school, of future CIA. How could he advance so well against the CIA without them putting more
of a drag on him?
Reminds me of most of these former spooks turned whistleblowers. None of them could be so
real. None.
He wrote "The Devil's Chessboard" about Allen Dulles being the party who killed JFK.
But he was only a puppet. I believe the Kennedys were hit because they were putting down
solid, wonderful diplomatic roots with the Vatican, and going through the Pope via the Kremlin
to walk the world back from Nuclear Holocaust.
But that alone is enough to put ANYONE on the Hit Parade of the Freemasons, who are sworn
to the death to destroy as much of Catholicism as they possibly can. Just read the history of
the CRISTERO WAR in Mexico 95 years ago. The Mexican lodges of the Scottish Rite Freemasons
gave President Plutarco Calles a shining medal for his "work against the Catholic Church" in
Mexico (work that got 100,000 people murdered in what Graham Greene called "the fiercest
persecution of religion since Elizabeth").
dil pickles ,
Shit eff and other expletives
Beautifully put.
We are so many of us terrified of aloneness.
Loneliness is the name we give to the feeling we have when we are scared of aloneness.
Aloneness, when apprehended and experienced with brave abandon, may yield a new person, or a
person where there was not one before?
Very brave woman and a fantastic speech. She nailed it. Interesting that the bell chimed just
as she mentioned 'Bilderberg'. Coincidence?
when not if ,
After 10 years at the helm of Google, and currently chair of the US Department of Defense's
Defense Innovation Advisory Board, Eric Schmidt stated that Google does not cross the creepy
line in its use of our personal information. The Creepy Line is the point where people are
pushed into madness. While Schmidt is saying Google does not cross the creepy line, it is an
admission that Google, glaringly, is constantly placing people at the edge of a thin line
near insanity.
No wonder people are feeling insane, as they are indeed constantly driven into madness by
an ever-creepier algorithm. An algorithm that is impossible to quit, as many people's
livelihoods depend on it.
The news (propaganda) was on paper before the internet.
The internet has made the crazy louder, as every mad bugger can get their ideas propagated.
The ideas desired by the occult mind-controllers get made "viral" through the monopoly search
engine + "social media" (internet news).
The internet did not make people mad; whoever was mad was crazy before the internet – they
are just making more noise. The ego minds love creating false images of themselves, and the
internet is the petri dish for a new fake identity. One that is better than others, one that
totally identifies with thoughts and fights to defend them as though it is their very selves
they are defending.
If we do not know who we are, and so are run by the egomind (conditioned), we are
schizophrenic.
You don't have to give up the internet; you can use it to do what you need to do.
But that said, most people are addicted to the internet, computers and phones, and are on
them 24/7 trying to build their egos with twitters.
Screen-free days are a good idea, as are news/propaganda-free days/weeks/years.
when not if ,
Exactly my thoughts, from the start [of the article] to finish. Thanks Edward Curtin!
Has the Internet and the devices to access it made your life better or worse?
Each device makes certain tasks better and easier. However, all the devices and tasks
combined are making life worse and much more difficult. It is a negative synergy that in the
wrong hands can become destructive.
Insanity is not only becoming the new normal, it is fast becoming celebrated and
rewarded.
Dungroanin ,
But how can you leave out Marshall MacLuhan?
The Medium Is The Message.
I ask myself this question, and now I ask you. Has the Internet and the devices to
access it made your life better or worse? Has it made the life of humanity better or worse?
Has its essential role in globalization made for a better world?
Since Edward asks, my opinion is that it has made the life of humanity better.
Now the message can be resisted.
From and by anyone willing to RESIST.
¡No Pasaran!
tonyopmoc ,
Dungroanin,
About time you woke up, and recognised, and maybe even began to understand, why hardly
anyone wants to go back to work, whilst I want to write again on Facebook:
GET BACK TO WORK, you lazy sods!
As you can imagine, that did not go down too well, so I have kept quiet. I started off
with – well, we were all still going down the pub, and hugging and kissing (like we do) –
and I looked at the numbers. Far fewer people had died than normal. This would not go down too
well now, either. Not everyone shares my sense of humour, and reality, so I have banned
myself from social media. I do not yet know how to unbrainwash brainwashed people, but I am
working on a few ideas, and kind of testing them a bit, socially. No one has given me a hard
time yet. I always try to be helpful and friendly.
"Monty Python and the Holy Grail: Bring Out Your Dead"
Yeah, don't think you've quite thought that through – you are using it!
As I say (or McLuhan did), "The Medium is the message".
The medium is the internet, not any particular flavour of it.
But I don't need to tell you that, surely?
tonyopmoc ,
Dungroanin,
Stop trying to be clever, whilst I do like you – I still don't know if you are a boy
or a girl. Dunno about you, but mine still works.
Tony
bob ,
what, this Marshall MacLuhan?
https://player.vimeo.com/video/114022336
check out the Glasgow Media Group – their media work is exemplary
Dungroanin ,
Oi cant spellz 😉
Herbert Marshall McLuhan CC was a Canadian philosopher. His work is one of the
cornerstones of the study of media theory.
tonyopmoc ,
I am 66. I have kept the child alive inside my mind, by reading books – of all kinds.
We didn't have a lot of money, but my Mum gradually weaned me off The Beano and Dandy by
buying, every week, Mind Alive – it was a magazine that you could compile into an
encyclopedia. (Still in my attic.)
Even when thrown into the deep end many years later, and being introduced by my new boss,
who immediately went off on 2 weeks' holiday – to my new team, I told them the truth. I
couldn't bullshit this lot.
They slung me a book
"UNIX 101 for Dummies"
We got on really well.
I learn from clever people. I do not tell them how to do it, when I have not got a clue,
or they won't tell me or show me anything, and we will not be a team.
Asking questions is good, even if you think, they might think, you didn't quite
understand.
Tony
Lost in a dark wood ,
Re: Mind Alive – it was a magazine, that you could compile into an encyclopedia. (Still
in my attic)
Try looking up words which have now become commonplace, such as "autism". You can do the
same with old dictionaries.
--
It's Erik Satie's birthday today. He said: "I came into this world very young, at a very old
time", and also: "Although our information is incorrect, we do not vouch for it" – that rings
a few bells.
Dennis Brown ,
This is a very thought-provoking article by Mr. Curtin, which should be widely shared!!! And
once again a sterling example of the quality of the Off Guardian website.
I'd only add that we should pause to consider that technology – per se – is not
necessarily evil in itself. Rather it is the social relations that lurk behind the use of
technology that can pose a potential threat to human well-being.
For those not frightened by the name of Karl Marx it is worth noting that he addressed
many of Mr. Curtin's concerns in Das Kapital 150 years ago.
In Vol One of Capital, in a fairly obscure footnote, Marx made a passing reference to how
he personally viewed his intellectual quest. It was, indeed, to write the social history of
the evolution of technology. He equated his goal to being similar to that of Charles
Darwin's history of natural life in The Origin of Species.
To wit, Marx observed that technology is an extension of all human activity and therefore
all human relationships. Technology shapes and conditions what we do, where we live, how we
live, how we treat one another, how we treat the earth, etc. It is those social relationships
that in turn determine whether technology can be judged either "good" or "bad".
Societies and economies that are organized capitalistically – i.e. for the production
of commodities for exchange value and private profit, as opposed to use values and common
social well-being – are by definition based on exploitation, planetary destruction, social
domination and control.
The world is on the cusp of an unprecedented epochal paradigm shift, as so many of the
intelligent commentators on this site have already noted. We can respond to this fact and
relentlessly inquire as to why that might be so. Or we can accept the narrative of the
dominant elites that this is an unprecedented biological event and that our leaders, with
their superior wisdom, are simply trying to protect us – and that the suppression of basic
liberties, free speech and the destruction of the livelihoods of ordinary people is all
regrettable but unavoidably necessary.
Or we can do as Marx suggested and we can follow the money and see where it leads. We can
note that under the smokescreen of this "pandemic emergency" trillions of dollars are
being transferred by the State to the One Percent. We can also note that large sections of
the global workforce will be permanently rendered redundant, and be ultimately replaced by
artificial intelligence and robotics in order to squeeze the last dregs of surplus value from
what remains of the working class. All predicted in Volume One of Capital, and awaiting
re-interpretation by those of us willing to take up the challenge in the contemporary
context.
But there is a structural contradiction in all of this. If billions of people are
marginalized from the workforce, where will effective demand come from to buy all the junk
the capitalists produce? This is an issue Marx wrestled with, incompletely, in Vol 2 of
Capital. Ironically, the elites need us as consumers, yet strive to eliminate us as workers
in order to reduce labour costs and enhance profits.
That seems to me to be the central dilemma of our age. And that big transformative
struggle is now being played out under the convenient guise of the pandemic.
Their answer to the contradiction appears to be a cynical, and ultimately
penurious, form of "universal basic income", best administered through a cashless,
authoritarian cybernetic matrix. This prospect would be another example of Mr. Curtin's
metaphor of cheese in a mousetrap. Once we fall for it there will be no escape!
Seems to me our only hope, and sadly it is a distant one, is to keep our eyes wide open
and challenge in every way possible the Covid-19 psy-op. The truth in this regard is very
rapidly emerging!!!
Beyond that we need to use this traumatizing event to question what kind of a society we
wish to live in. One that is based on ever increasing exploitation, misery, and environmental
degradation to benefit the profit seeking gluttony of a tiny minority replete with an arsenal
of financial, technological, and military tools? Or one based on a sustainable economy
focused on human need, meaningful employment for all, and the production of use values
instead of exchange value, within a democratic consensus?
(I know that I'm a dreamer, but I'm not the only one!)
TrueNorth ,
Good comment. I like to analyze and understand the evolution of technology and how it is
shaping how we live. Technology has become an extension of ourselves. Currently,
technological advances are made by a small number of people who imbue the products with their
own values and ideas, born from their very narrow human experience. For example, people who
are submerged in the digital world might not appreciate the importance of diversity in the
natural environment or diversity of cultural heritage, simply because they have never
experienced it, and thus would not reflect any of it in their digital creations. Technological
advances are accelerating and are irreversible. The problem is that they are in the control
of the few and incomprehensible to the majority of others, who lack the tools to assess the
quality and value of these changes to the future of our civilization. In order to keep
technology from replacing humans in the near future, the majority of people need to get up to
speed with it and steer the direction of innovation toward the greater good that would
benefit this world.
wardropper ,
A marvellous article, which covers all the essential aspects of why actual human beings,
along with their irrefutable experiences, are suffering in today's world. It is the author's
broad, sweeping strokes which convince, and not the latest mainstream-media CoVid statistics
which prove beyond a shadow of doubt that I died two months ago, because, yes, the virus is
really THAT deadly.
tonyopmoc ,
I know we are making progress. I even chatted to my next door neighbour today. "How big is
your shed?" Well, I can't remember the numbers, but (whilst cleaning out our rainwater
barrel) I said I think I know where the plans are for my shed (built about 5 years ago
– the builders needed access to her garden, and she was really nice about it). I really
recommend them, and passed her the plans and final invoice (she must have looked and thought,
bloody hell, that was cheap) whilst wearing her rubber gloves, also digging the back garden
– which I found very impressive. We did not talk politics, nor COVID, but when they
lock themselves out, they come round: "Have you got our spare key?" – and we haven't. I do my
best, cough cough, to get them back into their own home. They are lovely people too.
It is really easy to grow potatoes and tomatoes, but my wife's spinach from last year
never stopped, even in midwinter.
Most people's minds have been scrambled by the incessant propaganda.
It is not easy to unscramble an egg, but we are making progress.
Tony,
Another good year-rounder is Curly Kale. And it's dead easy to grow.
ame ,
BREAKING NEWS – or is it?
UK to invest up to 93 million pounds in new coronavirus vaccine center
By REUTERS MAY 17, 2020
The British government will invest up to 93 million pounds ($112 million) to accelerate
construction of a new vaccines center, the Department for Business, Energy and Industrial
Strategy said on Saturday.
The funding will ensure the new center opens in Summer 2021, a year ahead of schedule, the
department said. The Vaccines Manufacturing and Innovation Centre (VMIC), which is currently
under construction, is a key component of the government's program to ensure that once a
coronavirus vaccine is available it can be rolled out quickly in mass quantities, the
department said.
https://www.jpost.com/Breaking-News/UK-to-invest-up-to-93-million-pounds-in-new-coronavirus-vaccine-center-628282
Grafter ,
Oh good, Billy Gates will be pleased. I would expect him and his family, along with the
buffoon Professor Ferguson, to be at the front of the queue. As for myself, you can gtf.
when not if ,
"UK to invest up to 93 million pounds in new coronavirus vaccine center"
The UK is also investing to train dogs to detect the new coronavirus in people. As the UK
is obsessed with austerity, it could do well to combine the two investments and
train dogs to sniff out the disease and deliver the vaccine at the same time. This fits the
obnoxious ruling elites' ideologies perfectly.
IANA ,
O/t, but interesting article in the Mail questioning just who is running the govt lockdown
policy.
Boris had to ask Sir Mark Sedwill 'who is in charge' of the policy, which re-iterated
what seemed apparent when Boris was forced to u-turn over the govt's initial policy response.
From that moment on he unfortunately was 'removed', due to his having caught cv19, not
returning until well after the lockdown was in full swing. It seems from his question it is
clear he doesn't think he is in charge, which is enlightening about who really runs the UK.
Very helpful of Sir Mark to confirm in this situation that it's he, in fact, who is in charge.
Just in time for any fallout that may result from all the questions being raised by another
guy who has fallen foul of the inner circle – Neil Ferguson.
The internet does one thing perhaps better than anything else – and far better than the
"real" world surrounding us: it shows us the fleetingness of permanence. In the old days, if
you saw a book or a record or anything else you wanted, your biggest worry was that it would
be gone by the time you were able to acquire it – that someone else would have beaten you
to it. There is no such worry on the internet.
There's no danger, for instance, that someone else might beat me to this article and I
might miss the chance to read it. It is ensconced in a veneer of permanence. Yet it and every
single trace of it could completely vanish in a heartbeat should the internet itself suddenly
go off grid. We depend on the internet to be there; the corollary being that we exist in a
perpetual state of anxiety lest we lose everything we cherish.
It is we who have become good at ignorance and fear through practice.
We can't blame the internet for what we have done.
The internet would be neutral, could be used for "good or bad" (if it weren't for the
censorship, privacy violations and monopoly search engine). That's why agent Assange is the
MSM's hero celeb poster boy for our "internet freedom" (haha) and the CIA's whistleblower
damage-control trap "wikileaks".
tonyopmoc ,
Howard, Whilst I kind of agree with you, I am an old person, who likes old, well crafted,
beautiful original things. I was extremely upset, when my favourite coffee mug, which I had
loved, and which had served me well for many years lay broken on the ground. I have been
searching for an identical replacement for 18 months, and I am almost certain I have found
it. Yes, it was expensive, nearly £18 including delivery. Hopefully, it will turn up
this week in one piece, if it survives being mangled through the delivery machine. I may be a
sentimental old sod, but I really like my beautiful coffee mug. It really brightens things up
in the morning, especially after a heavy one the night before.
Learning impermanence from the internet – that's something I have never considered.
What we truly cherish we can never lose.
Ort ,
I take your point, I think, but I also see a contradiction: it seems that you're actually
saying that the Internet is an exception, or antidote, to the "fleetingness of
permanence" – that it's like a vast, expanding, unbounded block of amber that traps all
of its content for eternity, just as ancient sap flows trapped prehistoric insects.
Also, the "permanence" depends on how one punctuates Internet experience. It's true that
virtual content is a "gift that keeps on giving", insofar as an infinite number of users can
access a given item without depleting or exhausting it.
But there are devils in these details: links famously "die", i.e. are broken and useless
when the target site becomes defunct; searching for elusive items can be labor-intensive,
frustrating, and fruitless. It's for the "web" to know, and the hapless user to find out.
And "improved" website bells and whistles exemplify Virilio's "void of the quick" cited in
the essay. I know I'm a dinosaur (age 65), but I became incensed and outraged when animated
features became standard web page "eyeball grabbers" several years ago. I don't know the
technical nomenclature, but I'm referring to, say, news sites that display a panel of "top
stories" that continuously change in rotating slide-show fashion.
This deliberate virtual buzzing, blooming confusion celebrates the ephemeral; if one is
not quick enough on the draw, an item of interest vanishes before one's eyes. The standard
logical rebuttal is to assert "Aha! But if the user is patient, that item of interest will
reappear momentarily – or, alternatively, can easily be recovered."
But my experience says otherwise. I've often navigated away from a page, suddenly
reconsidered and returned to pursue a featured item within seconds or minutes, and discovered
that it is no longer there. Something new has replaced it.
FWIW, YouTube is particularly vexing in this regard; it stuffs my home page with unwanted
"recommendations"; if I leave the page and return, or even refresh it, the page is
involuntarily "updated" by the relentless YT algorithms. Worse yet, I have even done searches
for a video I'd just seen and "lost", but even using keywords fails to retrieve it.
And then there are "innovations" like infinite page scrolling, or whatever it's
called – pointlessly turning discrete pages into one "bottomless page" that is
overwhelming. I have no doubt that these innovations are all imposed for some nefarious
self-serving purpose, probably commercial – either variations of "clickbait" or making
the page more suitable to hand-held devices like smartphones.
So the Internet's "permanence", such as it is, exists within a maddening perpetual
kaleidoscopic flux.
All a frightful mistake, old boy. No-one thought to check Ferguson's numbers. The Daily
Telegraph and The Daily Mail are both railing against Ferguson's broken adding machine.
MI6 – oath/motto "Semper Occultus", employer of Aleister Crowley, public budget GBP 3
billion, black budget unknown – simply didn't think to check Neil Ferguson's software or see
how he was calculating his projection of deaths by Covid-19.
Government Communications Headquarters (GCHQ), which employs hundreds of software
development engineers, security and public safety specialists, IT operations specialists,
mathematicians, and even medical technicians, with a declared budget of GBP 1.7 billion,
forgot to put anyone on the case.
The BBC has a declared budget of GBP 3.7 billion (but that's just the license fee. Total
budget is closer to 5 billion) and has 22,000 staff. None of them thought to ask how Neil
Ferguson was arriving at his numbers.
The MI5 Guardian has already seeded the ether with the idea that Neil Ferguson's fate could
cross paths with that of Dr David Kelly, the weapons scientist found dead in suspicious
circumstances in 2003. All to be blamed on "sceptics", of course.
"A similar ordeal apparently caused Dr David Kelly to take his own life after the
biological weapons expert was hounded for revealing that the threat posed by Iraq's weapons
of mass destruction had been exaggerated by Tony Blair's government.
[Scientists] are regularly attacked by many of the British media commentators who are
currently joining the pile-on to Ferguson."
John Pretty ,
I just looked at the background of the author of the piece you linked:
"Bob Ward is policy director at the Grantham Research Institute on Climate Change and the
Environment at the London School of Economics"
That's all I needed to know!
How is it that the average Guardian reader can't smell the bullshit?
wardropper ,
Because today's Guardian readers are not average people. The once-decent paper has lost the
plot, so, naturally, its readers are mentally at risk.
Waldorf ,
Are they below average or above average? They could be very clever but insane.
John Pretty ,
The software issue is relevant to a degree, but it's still a case of "garbage in, garbage
out".
And it's still guessing.
Lost in a dark wood ,
Due diligence is a well established concept. The incompetent failure to do due diligence may
be a criminal offence (e.g. manslaughter). The calculated failure to do due diligence is
complicity (e.g. treason, crimes against humanity, etc).
Hear ye, hear ye! And let it be known. I posted upstream, with timestamps abundant, that as
the news broke that Professor Lockdown had caught his projection in the wringer, it was the
first domino that would bring down all the others of this pathetic planetary lockdown.
As it begins to throb more and more and stick out like a sore, well, you know, that grim
sight will be noticed by more and more outlets around the globe, no matter the CYA.
WWW is great for access to information. Of course NOT ALL BRAINS are equipped to navigate
the WWW Ocean. Most just use it to publish selfies and moronic comments.
Even for those who navigate with greatest dexterity, the triple W's are fraught with
unparalleled peril, which I believe was Mr. Curtin's main point: the double bind.
Intrinsic to the medium, as eyestrain was to Gutenberg's first customers.
We're told that getting ahead at work and reorienting our lives around our jobs will make us
happy. So why hasn't it? Many of those who work in the corporate world are constantly peppered
with questions about their " career progression ." The Internet is
saturated with
articles providing tips and tricks on how to develop a never-fail game plan for
professional development. Millions of Americans are engaged in a never-ending cycle of
résumé-padding that mimics the accumulation of Boy Scout merit badges or A's on
report cards except we never seem to get our Eagle Scout certificates or academic diplomas.
We're told to just keep going until we run out of gas or reach retirement, at which point we
fade into the peripheral oblivion of retirement communities, morning tee-times, and long
midweek lunches at beach restaurants.
The idealistic Chris McCandless in Jon Krakauer's bestselling book Into the Wild
defiantly declares, "I think careers are a 20th century invention and I don't want one." Anyone
who has spent enough time in the career hamster wheel can relate to this sentiment. Is
21st-century careerism -- with its promotion cycles, yearly feedback, and little wooden plaques
commemorating our accomplishments -- really the summit of human existence, the paramount
paradigm of human flourishing?
Michael J. Naughton, director of the Center for Catholic Studies at the University of St.
Thomas, Minnesota, and board chair for Reel Precision Manufacturing, doesn't think so. In his
Getting Work Right: Labor and Leisure in a Fragmented World , Naughton provides a
sobering statistic: approximately two thirds of employees in the United States are "either
indifferent or hostile to their work." That's not just an indicator of professional
dissatisfaction; it's economically disastrous. The same survey estimates that employee
disengagement is costing the U.S. economy "somewhere between 450-550 billion dollars
annually."
The origin of this problem, says Naughton, is an error in how Americans conceive of work and
leisure. We seem to err in one of two ways. One is to label our work as strictly a job, a
nine-to-five that pays the bills. In this paradigm, leisure is an amusement, an escape from the
drudgery of boring, purposeless labor. The other way is that we label our work as a career that
provides the essential fulfillment in our lives. Through this lens, leisure is a utility,
simply another means to serve our work. Outside of work, we exercise to maintain our health in
order to work harder and longer. We read books that help maximize our utility at work and get
ahead of our competitors. We "continue our education" largely to further our careers.
Whichever error we fall into, we inevitably end up dissatisfied. The more we view work as a
painful, boring chore, the less effective we are at it, and the more complacent and
discouraged. Our leisure activities, in turn, no matter how distracting, only compound our
sadness, because no amount of games can ever satisfy our souls. Or, if we see our meaning in
our work and leisure as only another means of increasing productivity, we inevitably burn out,
wondering, perhaps too late in life, what exactly we were working for . As Augustine
of Hippo noted, our hearts are restless for God. More recently, C.S. Lewis noted that we yearn
to be fulfilled by something that nothing in this world can satisfy. We need both our work and
our leisure to be oriented to the transcendent in order to give our lives meaning and
purpose.
The problem is further compounded by the fact that much of the labor Americans perform
isn't actually good . There are "bad goods" that are detrimental to society and human
flourishing. Naughton suggests some examples: violent video games, pornography, adultery dating
sites, cigarettes, high-octane alcohol, abortifacients, gambling, usury, certain types of
weapons, cheat sheet websites, "gentlemen's clubs," and so on. Though not as clear-cut as the
above, one might also add working for the kinds of businesses that contribute to the
impoverishment or destruction of our communities,
as Tucker Carlson has recently argued .
Why does this matter for professional satisfaction? Because if our work doesn't offer goods
and services that contribute to our communities and the common good -- and especially if we are
unable to perceive how our labor plays into that common good -- then it will fundamentally
undermine our happiness. We will perceive our work primarily in a utilitarian sense, shrugging
our shoulders and saying, "it's just a paycheck," ignoring or disregarding the fact that as
rational animals we need to feel like our efforts matter.
Economic liberalism -- at least in its purest free-market expression -- is based on a
paradigm with nominalist and utilitarian origins that promote "freedom of indifference." In
rudimentary terms, this means that we need not be interested in the moral quality of our
economic output. If we produce goods that satisfy people's wants, increasing their "utils," as
my Econ 101 professor used to say, then we are achieving business success. In this paradigm, we
desire an economy that maximizes access to free choice regardless of the content of that
choice, because the more choices we have, the more we can maximize our utils, or sensory
satisfaction.
The freedom of indifference paradigm is in contrast to a more ancient understanding of
economic and civic engagement: a freedom for excellence. In this worldview, "we are made
for something," and participation in public acts of virtue is essential both to our
own well-being and that of our society. By creating goods and services that objectively benefit
others and contributing to an order beyond the maximization of profit, we bless both ourselves
and the polis . Alternatively, goods that increase "utils" but undermine the common
good are rejected.
Returning to Naughton's distinction between work and leisure, we need to perceive the latter
not as an escape from work or a means of enhancing our work, but as a true time of rest. This
means uniting ourselves with the transcendent reality from which we originate and to which we
will return, through prayer, meditation, and worship. By practicing this kind of true leisure,
well
treated in a book by Josef Pieper , we find ourselves refreshed, and discover renewed
motivation and inspiration to contribute to the common good.
Americans are increasingly aware of the problems with Wall Street conservatism and globalist
economics. We perceive that our post-Cold War policies are hurting our nation. Naughton's
treatise on work and leisure offers the beginnings of a game plan for what might replace
them.
Casey Chalk covers religion and other issues for The American Conservative and is a
senior writer for Crisis Magazine. He has degrees in history and teaching from the University
of Virginia, and a masters in theology from Christendom College.
Likbez, the psychology today article you cite does not match your characterization of
it, i.e., "Browsing Web for relevant articles", in that it is discussing the effects of
"aimlessly using the Internet, to no specific end"; one could hardly characterize the work
involved in Mark Thoma's or Yves Smith's aggregations as "aimless" or "to no specific
end"
True, but the psychological mechanisms involved are identical, and require an identical
psychological predisposition.
on the other hand, i didn't have any problem with your characterization of the
aggregator's behavior as "a compulsive self-destructive obsession"; i certainly see the
obsessive-compulsive behavior in my own work, and there is a self-destructiveness to it
as well, as i'll often forgo other activities and even eating decently to get my online
work done .
Yes, that's a very apt description, thank you: "I'll often forgo other activities and even
eating decently to get my online work done" is a classic description of a related set of
manic behaviors.
A modern version of the "labor of Sisyphus", maybe?
The thing is, however, that the "Internet hoarder" or "Internet pack rat" (fuzzy and
not very accurate terms for a person with such a disorder) inevitably tends to expand the
scope of aggregation, and that inevitably leads to burnout.
In other words, I would like to stress here this particular and more limited danger.
likbez, my impression from a quick read of the psychology today article was that they
were talking about people like gamers or youtubers, those whose activity was "aimless" so i
disagree that "the psychological mechanisms involved are identical" to those that drive
someone like mark thoma or yves smith
in re the obsessive-compulsiveness of my work, i understand that you are critical but i
understand that it is who i am; i've always been a workaholic. when i worked for a major
corporation (years ago), i'd go in on weekends, and would often put in 12 hour days. even as
a teenager, working for myself (over 50 years ago), i'd put in successive 18 hour days at
what i was working on, often to the abandonment of everything else. the thing is, even back
then i understood the Sisyphean nature of my work and i never had a problem with that; it's
those who don't understand that who get themselves in trouble
anyhow, this thread is not about me; i am just using myself to suggest the type of
person you'd need to take over running Economist's View and as i said in my first comment
here, i'm not volunteering
Economist's View deteriorated steadily during the last two years, as Mark failed to update posts for several weeks at a time
and comment threads became unmanageable, often exceeding 1K posts; surprisingly, the site still has a vibrant community of commenters.
It is kind of the last refuge of retired persons still interested in both economics and politics. There is a special term for this
category of people: "piqué vests" (playing on the fact that thermoregulation in older people is often broken, so they prefer to
put on more clothing than younger people).
The meaning is similar to "armchair strategists" but with emphasis on the Dunning–Kruger effect -- a cognitive bias in which
people assess their cognitive abilities as greater than they are. In other words, the complete inability of many people, especially
seniors, to recognize their lack of ability (an effect quite visible in Trump ;-).
The term implies a verifiable tendency for the Dunning–Kruger effect to increase with age, as it typically becomes really
pronounced in many retired seniors, who become attracted to discussions about politics.
It looks to me that with time such blogs as Economist's View naturally become a refuge for piqué vests.
It is not that difficult to recreate a similar aggregator blog (maybe on a better platform), but to launch it into the mainstream
you need to have your own strong personal or research interest in browsing the Web for interesting links, which few people possess.
And for those who do like to browse the Web for interesting articles, this often becomes a compulsive self-destructive obsession
that takes too much time and negatively affects their research work and their lives.
You also need a large dose of political correctness not to stray too much from neoliberal MSM views and be ostracized, which
kills the idea. So this is a delicate balancing act in which Thoma succeeded, but most wannabes might not.
Also, his status as a professor helped with the patina of respectability and gave him a little bit more freedom than
mere mortals have.
The problem here is how to attract a meaningful commenter community, which is difficult. Thoma's posts were at the beginning
informative enough to accomplish this feat and attract many people. For the first several years his selections were interesting
enough to browse the blog on a regular basis.
Later it became more questionable and many older commenters disappeared, but the comment
threads were still interesting.
In 2019 this community existed mainly due to inertia, as the quality of the blog deteriorated. The community also changed, with
very few survivors from earlier years (Paine, anne, ilism, Fred C. Dobbs).
The same process is observable in other blogs such as Naked Capitalism, which has also lost the lion's share of its early commenters.
One unsolvable problem of unmoderated comment threads is that there are commenters who are literally powerful spam generators
and who fill the threads with dozens of low-quality posts. And you can do nothing about it.
At the same time, heavier moderation like that at http://crookedtimber.org
creates animosity and makes the community an echo chamber of the blog owner's views and more conformist than desirable, and as
such far less interesting. Censored commenters often leave and never return.
The Economist's View blog recently became mostly political, not so much economic. The same trend can be observed with Naked
Capitalism. That probably reflects the fact that economics in and of itself is mostly a pseudo-science (especially neoclassical
economics with its mathiness and scholastic models, which destroy students' ability to think) and there is only political economy
and econometrics.
Another interesting effect is that most active commenters usually post their own links, which in some cases were much more
interesting/important than Thoma's links. So Thoma's links served as a catalyst for posting better links.
There were a couple of "super-reposters", and Anne was/is not so much a commenter as a "super-reposter". As run75441 correctly
observed, "Anne is fastidious", which is simultaneously an asset and a liability.
Another person with the same inclinations, but without Anne's tendency to post some useful statistical info from FED databases,
was/is Fred C. Dobbs. He actually reposted a useful article from NYT even in the last comment thread (
https://nyti.ms/34pZAbD )
As is typical in such blogs, the commenting community was polarized into two or three distinct camps (neoliberals/neocons and
anti-neoliberals/neocons, plus libertarians who were all over the place).
An interesting thing about this blog is that Thoma's posts with lists of links generated much more vibrant discussion than
posts with his short review of some article (often Krugman, whom for some reason he liked). So the blog became an aggregator
blog very early in its existence.
Lately, as Thoma lost interest in its maintenance, he switched exclusively to "recommended links" style posts, which gradually
became more and more rare.
It's a bit sad to see that blogging culture in general is now losing steam, with much of the discussion moving on to Twitter
and other social platforms.
run75441 , December 26, 2019 10:38 am
likbez:
You could say so much more by saying less. What the hell is this string of descriptors, "compulsive self-destructive obsession"
of the subject? Could you pick one from the string to get your point across? I read your words and you appear to be obsessed with
being heard or read.
likbez , December 26, 2019 8:18 pm
> You could say so much more by saying less.
That's always good advice if one has time for editing. Thank you.
> What the hell is this string of descriptors "compulsive self-destructive obsession" of the subject?
You never ran an aggregator blog and are therefore unable to understand what I am talking about:
1. It attracts a certain type of person, who tends to overextend the scope of aggregation and then burns out doing so.
In male internet users who reported using the Internet for 42 hours per week, those who displayed more symptoms of Internet
addiction -- such as experiencing more negative consequences of their internet use, feeling withdrawal symptoms when not using
the Internet, and an inability to control their internet use -- had less brain (grey) matter volume in an area of the brain known
as the right frontal pole.
This area of the brain is part of the prefrontal cortex, and under-activation of the prefrontal cortex is strongly linked
to poor decision-making, addictive behaviour and weak willpower.
The study linked further differences in other areas of brain circuitry and excessive Internet use, and this overall pattern
of difference associated with the brains of excessive Internet users resembles the changes in brain seen in substance addictions.
As with all cross-sectional studies, the cause and effect is not clear. The brain changes may be due to excessive Internet
use, but equally, brain volume differences could be a precondition for excessive Internet use.
Likbez: Rls, yes, that's a very apt description: "i'll often forgo other activities and even eating decently to get my online
work done." But the problem here is not only the "labor of Sisyphus" issue but also the tendency to expand the scope of the
collection (the "Internet pack rat" phenomenon), and this burden crushes the person, leading to burnout.
I'm a little surprised by how many people tell me they have no hobbies. It may seem a small thing, but -- at the risk of sounding
grandiose -- I see it as a sign of a civilization in decline. The idea of leisure, after all, is a hard-won achievement; it presupposes
that we have overcome the exigencies of brute survival. Yet here in the United States, the wealthiest country in history, we seem
to have forgotten the importance of doing things solely because we enjoy them.
Yes, I know: We are all so very busy. Between work and family and social obligations, where are we supposed to find the time?
But there's a deeper reason, I've come to think, that so many people don't have hobbies: We're afraid of being bad at them.
Or rather, we are intimidated by the expectation -- itself a hallmark of our intensely public, performative age -- that we must actually
be skilled at what we do in our free time. Our "hobbies," if that's even the word for them anymore, have become too serious, too
demanding, too much an occasion to become anxious about whether you are really the person you claim to be.
If you're a jogger, it is no longer enough to cruise around the block; you're training for the next marathon. If you're a
painter, you are no longer passing a pleasant afternoon, just you, your watercolors and your water lilies; you are trying to land
a gallery show or at least garner a respectable social media following. When your identity is linked to your hobby -- you're a yogi,
a surfer, a rock climber -- you'd better be good at it, or else who are you?
Lost here is the gentle pursuit of a modest competence, the doing of something just because you enjoy it, not because you
are good at it. Hobbies, let me remind you, are supposed to be something different from work. But alien values like "the pursuit
of excellence" have crept into and corrupted what was once the realm of leisure, leaving little room for the true amateur. The population
of our country now seems divided between the semipro hobbyists (some as devoted as Olympic athletes) and those who retreat into the
passive, screeny leisure that is the signature of our technological moment.
I don't deny that you can derive a lot of meaning from pursuing an activity at the highest level. I would never begrudge someone
a lifetime devotion to a passion or an inborn talent. There are depths of experience that come with mastery. But there is also a
real and pure joy, a sweet, childlike delight, that comes from just learning and trying to get better. Looking back, you will find
that the best years of, say, scuba-diving or doing carpentry were those you spent on the learning curve, when there was exaltation
in the mere act of doing.
In a way that we rarely appreciate, the demands of excellence are at war with what we call freedom. For to permit yourself to
do only that which you are good at is to be trapped in a cage whose bars are not steel but self-judgment. Especially when it comes
to physical pursuits, but also with many other endeavors, most of us will be truly excellent only at whatever we started doing in
our teens. What if you decide in your 40s, as I have, that you want to learn to surf? What if you decide in your 60s that you want
to learn to speak Italian? The expectation of excellence can be stultifying.
Liberty and equality are supposed to make possible the pursuit of happiness. It would be unfortunate if we were to protect the
means only to neglect the end. A democracy, when it is working correctly, allows men and women to develop into free people; but it
falls to us as individuals to use that opportunity to find purpose, joy and contentment.
Lest this sound suspiciously like an elaborate plea for people to take more time off from work -- well, yes. Though I'd like to
put the suggestion more grandly: The promise of our civilization, the point of all our labor and technological progress, is to free
us from the struggle for survival and to make room for higher pursuits. But demanding excellence in all that we do can undermine
that; it can threaten and even destroy freedom. It steals from us one of life's greatest rewards -- the simple pleasure of doing
something you merely, but truly, enjoy.
Tim Wu ( @superwuster ) is a law professor at Columbia, the author
of "The Attention Merchants: The Epic Struggle to Get Inside Our Heads" and a contributing opinion writer. A version of this article
appears in print on Sept. 30, 2018, on Page SR 6 of the New York edition with the headline: In Praise of Mediocrity.
The issue of Millennial 'burnout' has been an especially hot topic in recent years - and not
just because the election of President Trump ushered in an epidemic of co-occurring TDS (Trump
Derangement Syndrome) that sent millions of American twenty somethings on a never-ending quest
for a post-grad 'safe space'.
For those who aren't familiar with the subject, the World Health Organization recently
described burnout as "a syndrome conceptualized as resulting from chronic workplace stress that
has not been successfully managed." As birth rates plunge and so-called deaths of despair
(suicides and overdoses) climb, sending US life expectancy lower for multiple consecutive
years for the first time since the 1960s, many researchers see solving the problem of burnout
as critical to fixing many of our societal issues.
To try and dig deeper into the causes and impact of millennial burnout, Yellowbrick
, a national psychiatric organization, surveyed 2,000 millennials to identify what exactly is
making a staggering 96% of the generation comprising the largest cohort of the American labor
force say they feel "burned out" on a daily basis.
The answer is, unsurprisingly, finances and debt: These are
the leading causes of burnout (and one reason why Bernie Sanders latest proposal to wipe
out all $1.6 trillion in outstanding student debt might be more popular with millennial voters
than many other Americans realize).
At 6% interest with a 6-year amortization, that works out to monthly payments of $497 --
about what many of these folks spend on eating at restaurants or on tattoos or on drugs per
month.
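The quoted payment can be checked with the standard amortization formula. A minimal sketch, assuming a $30,000 principal -- a figure the text does not state, chosen here because it is roughly the average student-loan balance:

```python
# Standard amortizing-loan payment: P * r / (1 - (1 + r)**-n)
P = 30_000      # assumed principal (not given in the text)
r = 0.06 / 12   # monthly rate for 6% APR
n = 6 * 12      # 6-year amortization, in months

payment = P * r / (1 - (1 + r) ** -n)
print(f"monthly payment: ${payment:.2f}")  # ≈ $497
```

At a $30,000 principal the formula gives about $497/month, matching the figure above; a different principal scales the payment proportionally.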
It's a matter of priority -- and repaying the student loans isn't a priority for them,
which is why a report in '17 showed that, at 7 years after graduation, more than 45% of them
hadn't paid even one dollar of principal on their student loans.
This is becoming exhausting. The boomers and the like simply don't want to admit that it
is much harder today to make ends meet than it was when they were younger. That is a fact:
inflation and asset inflation have made the value of a dollar half of what it was 40 years
ago, meaning you would have to work 80 hours in today's money to match 40 hours in money
from the late 70's. Now, millennials don't get off easy either, because they think they deserve
that same standard, and since it does not and cannot exist in our monetary system, they try to
shirk personal responsibility, at any level, by finger pointing and apathy. Our society is
slowly collapsing.
You can imagine them rubbing their hands with glee as they quote statistics such as: the 53
metropolitan areas in the U.S. with populations of 1 million or more accounted for two-thirds of
the GDP growth and three-quarters of the job growth. A staggering 93% of the population growth in
the U.S. in the past decade occurred in these urban centers.
And this asymmetry is even greater
if we separate the top 10 metropolitan areas from the rest: super-cities with super-charged
economies, fueled by enormous influxes of capital and people, which just so happen to make life
unbearable as overcrowded, aging infrastructure breaks down and costs for housing, rent, taxes,
utilities, fees etc. skyrocket out of reach of the bottom 95%.
The well-paid pundits viewing glowing statistics of growth never get around to examining
the human costs of this lopsided "recovery":
the "winners" in increasingly unlivable urban
centers are cracking under the pressure-cooker stress, burning out, flaming out, crashing.
The residents of all the regions sucked dry of capital and talent--the "losers" of neoliberal
globalization's concentrations of mobile capital and talent in a few favored megalopolises--are
also cracking under the weight of a loss of dignity and secure livelihood, the two being intimately
bound, much to the dismay of the supporters of "just pay them to go away and not bother us"
Universal Basic Income (UBI).
In other words, the "winners" are losing, too.
They're losing their sanity in
3-hour daily commutes on jammed freeways and equally jammed streets as thousands of other commuters
seek a work-around to the endless congestion.
They're losing their dreams of a better life, as all the average-wage worker can afford to rent
is a bed in a cramped living room that has been converted into sleeping quarters for two workers
who don't make six-figure salaries and who don't have stock options in a Unicorn tech company.
They're fixated on FIRE--financial independence, retire early--because they hate their job,
their career and the sector they toil in, and they count the days until they're free, free, free of
the pressure, the stress, the BS work, and the insanity of daily life in a teeming rat-cage.
No wonder the FIRE movement is spreading like (ahem) wildfire. Nobody in their right mind wants
to do their job for another 10 years, much less 20 or 25 years. Everybody is bailing out the moment
they can, or if they burn out and crash, when they're forced to.
Let's say you want to start a business in a super-progressive city that fulfills all
your most cherished ideals:
paying your employees good wages, providing customers with
value, and paying all your taxes and fees, of course, as a responsible progressive citizen.
Where do we start? How about the reality that virtually no one employed in the restaurant sector
can afford to live in San Francisco unless they inherited a rent-controlled flat or scored one of
the few subsidized housing openings?
The city's solution--mandating a $15/hour minimum wage--doesn't magically make healthcare or
rent affordable; all it does is increase the burden on small businesses that are hanging on by a
thread.
Working 100 hours a week couldn't compensate for the crushing rent.
Even the well-paid are burning out.
Astronomical household incomes (say,
$300,000 annually) aren't enough to buy a decayed bungalow for $1.3 million and pay for childcare,
private-school tuition, healthcare, an aging parent and all the services the overworked
wage-earners don't have the time or energy to do themselves. Oh, and don't forget the taxes. You're
rich, people, so pay up.
No wonder people who can afford to retire are bailing at 55 or 60, on the first day they
qualify. Life's too short to put up with the insane pressure and stress a day longer than you have
to.
Not everybody feels it, of course.
People who bought their modest house for
$100,000 30 years ago can hug themselves silly that it's now worth $1,000,000 (but with a
still-modest property tax), and if they're retired with a plump pension and gold-plated medical
benefits, their biggest concern is finding ways to blow all the cash that's piling up.
These lucky retirees wonder what all the fuss is about. "We worked hard for what we have," etc.
It's easy to overlook being a lucky winner of the housing-bubble lottery and the equally
bubblicious pension lottery, and easy not to ask yourself how you'd manage if you arrived in NYC,
San Francisco, et al. now rather than 35 years ago.
The asymmetries are piling up and we're cracking under the weight.
When do we
recover from the "recovery"? The answer appears to be "never."
"To the brain, information is its own reward, above and beyond
whether it's useful," says Assoc. Prof. Ming Hsu, a neuroeconomist. "And just as our brains
like empty calories from junk food, they can overvalue information that makes us feel good but
may not be useful -- what some may call idle curiosity."
The paper, "Common neural code for
reward and information value," was published this month in the Proceedings of the National
Academy of Sciences. Authored by Hsu and graduate student Kenji Kobayashi, now a post-doctoral
researcher at the University of Pennsylvania, it demonstrates that the
brain converts information into the same common scale as it does for money.
It also lays
the groundwork for unraveling the neuroscience behind how we consume information -- and perhaps
even digital addiction.
This explains Wikipedia. You start by looking up "just one article." After that hit, you
click on a link to one more. And then another and another. Before you know it, ten hours have
passed and you're sprawled out half-reading an article about cat foot fungus. You realize you
should stop, but there's a link there about nails and your hand goes to click it without you
telling it to.
The economic and financial stresses will exceed the workforce's carrying capacity in the next recession.
A number of recent surveys reflect a widespread sense of financial stress and symptoms of poor health in America's workers,
particularly the younger generations. There's no real mystery as to the cause of this economic anxiety:
competition for secure, well-paid jobs that were once considered the birthright of the middle class is increasingly fierce;
the pay and predictability of the jobs that are available are low;
high-paying jobs are extraordinarily demanding, forcing workers to sacrifice everything else to keep the big-bucks position;
the much-lauded gig economy is tracking the Pareto Distribution, as 80% of the income accrues to the top 20%, and those trying
to earn a lower-middle class income in the gig economy are working long hours to do so;
housing costs are unaffordable in hot job markets;
commutes to jobs from lower-cost areas are brutal;
student loan debt taken on to earn low-value diplomas is crushing.
These are just the highlights, not an exhaustive list of the common stresses experienced by American workers of all ages.
The inevitable result of these pressures over time is burnout , which anecdotally is reaching epidemic proportions in the
U.S. and other nations.
While many of these stresses are unique to private-sector precariats in the gig economy or insecure positions in Corporate
America, many public-sector workers in public safety and healthcare are also prone to burnout due to increasing workloads and understaffing.
... .. ...
But why should workers tolerate high levels of chronic stress? Quitting the source of the stress and finding
a lower-wage, lower-pressure livelihood is an increasingly compelling alternative.
The New York Times has an
illuminating article today summarizing recent research on the gender effects of
mandatory overwork in professional jobs. Lawyers, people in finance and other
client-centered occupations are increasingly required to be available round-the-clock, with
50-60 or more hours of work per week the norm. Among other costs, the impact on wage inequality
between men and women is severe. Since women are largely saddled with primary responsibility
for child care, even when couples ostensibly embrace equality on a theoretical level, the
workaholic jobs are allocated to men. This shows up in dramatic differences between typical
male and female career paths. The article doesn't discuss comparable issues in working class
employment, but availability for last-minute changes in work schedules and similar demands are
likely to impact men and women differentially as well.
What the article doesn't point out is that the situation it describes is a classic prisoner's
dilemma.* Consider law firms. They compete for clients, and clients prefer attorneys who are
available on call, always prepared and willing to adjust to whatever schedule the client throws
at them. Assume that most lawyers want sane, predictable work hours if they are offered without
a severe penalty in pay. If law firms care about the well-being of their employees but also
about profits, we have all the ingredients to construct a standard PD payoff matrix:
There is a penalty to unilateral cooperation, cutting work hours back to a work-life balance
level. If your firm does it and the others don't, you lose clients to them.
There is a benefit to unilateral defection. If everyone else is cutting hours but you don't,
you scoop up the lion's share of the clients.
Mutual cooperation is preferred to mutual defection. Law firms, we are assuming, would
prefer a world in which overwork was removed from the contest for competitive advantage. They
would compete for clients as before, but none would require their staff to put in soul-crushing
hours. The alternative equilibrium, in which competition is still on the basis of the quality
of work but everyone is on call 24/7 is inferior.
If the game is played once, mutual defection dominates. If it is played repeatedly there is
a possibility for mutual cooperation to establish itself, but only under favorable conditions
(which apparently don't exist in the world of NY law firms). The logical solution is some
form of binding regulation.
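The payoff structure described above can be sketched with illustrative numbers. The specific payoff values below are assumptions, chosen only to satisfy the prisoner's-dilemma ordering (temptation > reward > punishment > sucker's payoff); only that ordering drives the conclusion:

```python
# Illustrative prisoner's-dilemma payoffs for two law firms deciding
# whether to Cooperate (sane hours) or Defect (24/7 availability).
# Numbers are hypothetical; only their ordering matters: T > R > P > S.
T, R, P, S = 5, 3, 1, 0  # temptation, reward, punishment, sucker's payoff

# payoff[(my_move, their_move)] = my payoff
payoff = {
    ("C", "C"): R,  # both keep sane hours: share clients as before
    ("C", "D"): S,  # I cut hours, rival doesn't: I lose clients
    ("D", "C"): T,  # I stay on call, rival doesn't: I scoop up clients
    ("D", "D"): P,  # both overwork: same client split, soul-crushing hours
}

def best_response(their_move):
    """In a one-shot game, defecting pays more whatever the rival does."""
    return max("CD", key=lambda my_move: payoff[(my_move, their_move)])

# Defection dominates in one-shot play...
assert best_response("C") == "D" and best_response("D") == "D"
# ...yet mutual cooperation beats the defect-defect equilibrium,
# which is why binding regulation can leave every firm better off:
assert payoff[("C", "C")] > payoff[("D", "D")]
```

This is why individual restraint fails: whatever the rival firm does, overworking is the locally better move, even though both firms would prefer the cooperative outcome.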
The reason for bringing this up is that it strengthens the case for collective action rather
than placing all the responsibility on individuals caught in the system, including for that
matter individual law firms. Or, the responsibility is political, to demand constraints on the
entire industry. One place to start would be something like France's
right-to-disconnect law .
*I haven't read the studies by economists and sociologists cited in the article, but I
suspect many of them make the same point I'm making here.
There will always be an endless list of chores to complete and work to do, and a culture of
relentless productivity tells us to get to it right away and feel terribly guilty about any
time wasted. But the truth is, a life spent dutifully responding to emails is a dull one
indeed. And "wasted" time is, in fact, highly fulfilling and necessary.
Don't believe me? Take it from the creator of "Inbox Zero." As Oliver Burkeman reports
in
The Guardian , Merlin Mann was commissioned to write a book about his streamlined email
system. Two years later, he abandoned the project and instead posted a (since deleted) blog
post on how he'd spent so long focusing on how to spend time well, he'd ended up missing
valuable moments with his daughter.
The problem comes when we spend so long frantically chasing productivity, we refuse to take
real breaks. We put off sleeping in, or going for a long walk, or reading by the window -- and,
even if we do manage time away from the grind, it comes with a looming awareness of the things
we should be doing, and so the experience is weighed down by guilt.
Instead, there's a tendency to turn to the least fulfilling option of them all: sitting at
our desk, in front of our computer, browsing websites and contributing to neither our happiness
nor our productivity.
"There's an idea we must always be available, work all the time," says Michael Guttridge, a
psychologist who focuses on workplace behavior. "It's hard to break out of that and go to the
park." But the downsides are obvious: We end up zoning out while at the computer -- looking for
distraction on social media, telling ourselves
we're "multitasking" while really spending far longer than necessary on the most basic
tasks.
Plus, says Guttridge, we're missing out on the mental and physical benefits of time spent
focused on ourselves. "People eat at the desk and get food on the computer -- it's disgusting.
They should go for a walk, to the coffee shop, just get away," he says. "Even Victorian
factories had some kind of rest breaks."
"... Try working construction for minimum wage and not knowing where your next job will come from. Then have your blood pressure tested. ..."
"... I've watched it drive many people out. My own mentor told me when I first started "I'll tell you the first thing my Mentor told me, 'Get out now'". A bit much for a new engineer to take in, but now I know why he said it. Right before he left the company, he started telling me he wasn't sure how much longer he could handle the pressure. ..."
"... I find most of the stress in this industry is self induced by clueless fucks being in charge. ..."
"... I work with people who proudly complain about "working until 2 am" or willingly take on all kinds of client work at ridiculous times because it burnishes their reputation. ..."
"... My understanding would be Apple, Amazon, Google, Facebook, etc. although I've only really heard from people that have worked at Amazon. They hire new young and eager workers who they can work and fire them when they burn out. However, just as many leave before that. It's all part of an understood system where new workers agree to be overworked while padding their resume and looking for a new job. This lasts for an average of 18 months before they have found a new job or get laid off. ..."
"... The no vacation thing pisses me off. My entire adult life, I've only had one "real" vacation if you define it as a whole week off. ..."
An anonymous reader writes: A survey conducted among the tech workers, including many
employees of Silicon Valley's elite tech companies, has revealed that
over 57% of respondents are suffering from job burnout . The survey was carried out by the
makers of an app that allows employees to review workplaces and have anonymous conversations at
work, behind their employers' backs. Over 11K employees answered one question -- if they suffer
from job burnout, and 57.16% said "Yes."
The company with the highest employee burnout rate was Credit Karma, with a whopping
70.73%, followed by Twitch (68.75%), Nvidia (65.38%), Expedia (65.00%), and Oath (63.03% --
Oath being the former Yahoo company Verizon bought in July 2017). On the other end of the
spectrum, Netflix ranked with the lowest burnout rate of only 38.89%, followed by PayPal
(41.82%), Twitter (43.90%), Facebook (48.97%), and Uber (49.52%).
This is usually the type of thing I tell myself to keep perspective. But the truth is that
tech jobs can be stressful too. I imagine people in blue collar jobs believe we are living
high on the hog with not a care in the world, but it's not really that way. But I also have
two brothers that work jobs requiring much more manual labor. It absolutely takes a toll on
your body.
We've recently had a few people come over to hardware management (I am a hardware
developer). Both my manager and I told them, hardware projects change EVERY DAY. Every day
its, "so and so (big customer) just had issues with this", or "The market is way behind on
these parts and we are short", or "The product you just designed is failing ____ test right
now, what are we doing to fix it".
I've watched it drive many people out. My own mentor told me when I first started "I'll
tell you the first thing my Mentor told me, 'Get out now'". A bit much for a new engineer to
take in, but now I know why he said it. Right before he left the company, he started telling
me he wasn't sure how much longer he could handle the pressure.
Honestly, I don't care as much about the pay, the fancy benefits, or any of the fluff.
What has nearly driven me out is when I feel like every day is just another barrage of
unbounded problems. Like you're the guy on the track, your problem is the chains holding you
there, and management is driving the train and they aren't slowing it down. You better get
those chains undone.
I've been an auto mechanic, welder, machinist, and now EE. My back-up plan / exit strategy
is machining. I enjoy it, it is so much more bounded (in my opinion), and still presents good
challenges to keep me engaged. I already have a colleague in another company on his way.
We've talked at length about it.
I worked for a large company that made networking equipment. My job was to run a sanity
test framework for their operating system. Developers load the images in a queue, the system
pulls them, loads them on real hardware, and executes a body of tests.
The problem was that a bad image would hose the system to where it couldn't reboot, and
then it would not be able to correct itself. Every image after that would fail. My job was to
come in, clean up the mess, and apologize to each developer. It was actually stressful.
I repeatedly told the manager how I could fix it, and he always said we didn't have time.
I waited for him to travel for a week, I shut down the system, and fixed it so that the
system got completely initialized between every run. From that point on, every failure was a
real failure caused by that developer's changes.
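The fix described above amounts to making the test rig stateless between runs. A toy sketch (the rig here is a simulation; all names are illustrative, not the actual framework's API):

```python
# Hypothetical sketch of the fix: fully reinitialize the test rig
# before each image, so one bad image can't poison every later run.

class Rig:
    """Toy stand-in for the real hardware under test."""
    def __init__(self):
        self.hosed = False
    def reimage(self):          # the fix: full reinit before every run
        self.hosed = False
    def run(self, image_ok):
        if self.hosed:
            return "FAIL (stale state)"   # inherited someone else's mess
        if not image_ok:
            self.hosed = True             # a bad image wedges the hardware
            return "FAIL (real)"
        return "PASS"

def process_queue(rig, images, reinit_between_runs):
    results = []
    for name, ok in images:
        if reinit_between_runs:
            rig.reimage()
        results.append((name, rig.run(ok)))
    return results

images = [("good1", True), ("bad", False), ("good2", True)]
# Without the fix, the bad image makes every later run fail too:
assert process_queue(Rig(), images, False)[2][1] == "FAIL (stale state)"
# With per-run reinitialization, only the bad image fails:
assert process_queue(Rig(), images, True)[2][1] == "PASS"
```

The cost of the reinit is a slower cycle; the payoff is that every red result now points at the developer who actually caused it.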
My job became a cake walk. I find most of the stress in this industry is self induced by
clueless fucks being in charge.
But the truth is that tech jobs can be stressful too. I imagine people in blue collar
jobs believe we are living high on the hog with not a care in the world, but it's not
really that way.
I was pulling long hours one week to try and finish a software update in time. The
deadline was fast approaching and the outlook was grim. As usual, the cleaning lady came by
to collect the trash that evening and we got to chit-chatting like we usually did (I arrived
late and stayed late back then, so my being there when she did her rounds was perfectly
normal). Part way through the conversation she paused for a moment, then said something to
the effect of, "You know, before I started working here I used to think that you guys all had
it easy with your cushy jobs and nice offices. But then I see people here with the look that
you have in your eyes right now and I realize I was wrong. It's just as tough. Different, but
just as tough, if not tougher."
I think I mustered a tired "Thanks?" in response.
I don't make any claim to having it tougher than anyone else (I have a MASSIVE
appreciation for manual workers, among many other fields, since I couldn't do that work), but
the only people I find suggesting that tech work is easy are those who either aren't in the
field and have no awareness of what it entails, or those who are a burden on everyone else
around them in the field.
Yes, but the stress that tech people experience is completely fake. It REALLY doesn't
matter if your work is done on time.
It does if you want to remain employed with your current company. If that doesn't matter
to you then you probably aren't stressed to begin with. If anyone who worked for me expressed
that attitude they would be "succeeding elsewhere" in short order.
No one is going to die if your software or network doesn't work.
I'd like to introduce you to some folks who work in medical IT who will disagree with you
rather strongly. Same thing with software that controls/drives cars or airplanes or manned
rockets or traffic signals or ocean navigation or food safety or electrical grids or nuclear
reactor controls or.... The list is very long for things that actually do matter. Yeah,
nobody probably cares if your word processor crashes but more than a few of us do things that
have serious consequences.
Amazingly humans survived for thousands of years without IT or computers.
Ok we're done here. Claiming people shouldn't have stress because computers didn't exist
200 years ago is irrelevant and stupid.
Tech work culture is seriously broken when 80 hour weeks and never going on vacation for
any reason is encouraged and celebrated. Burnout under such conditions is inevitable .
I work with people who proudly complain about "working until 2 am" or willingly take on
all kinds of client work at ridiculous times because it burnishes their reputation.
Some after hours work is unavoidable in IT, but I just refuse to work those kinds of hours
regularly without added compensation of some kind (added vacation days without strings and/or
more money).
As a more skilled/experienced/older worker, I think I can get away with it but I'm not
gonna lie, the people who do it seem to have more street cred in the organization because
they are willing to bend over.
I think it's highly organization dependent and sometimes individually dependent (ie, can
you get done what needs doing in normal work hours). And I think there are definitely orgs
where if you're not doing that, you might as well resign now because you will get shuffled to
the shit work.
I worked 55-60 hours a week for most of a year, mainly due to two senior people leaving
with a month's difference and a third knocked his head pretty bad leaving me and a few
juniors to sort it out. That was as an IT consultant job though so I had a billing bonus that
gave me pretty good kickback. If I recall correctly it kicked in at about 2/3rd = 67%
billable time and the company average was 75-80% somewhere, so your average consultant would
get bonus for like 10% while I could hit 50%+. Normally they wouldn't'
I hear this all the time but WTH actually does this? Anyone here at slashdot? Even when
I was younger I did an all nighter just once or twice. I've been working 8 hour days the
last 15 years.
My understanding would be Apple, Amazon, Google, Facebook, etc. although I've only really
heard from people that have worked at Amazon. They hire new young and eager workers who they
can work and fire them when they burn out. However, just as many leave before that. It's all
part of an understood system where new workers agree to be overworked while padding their
resume and looking for a new job. This lasts for an average of 18 months before they have
found a new job or get laid off. They hopefully hop to
The no vacation thing pisses me off. My entire adult life, I've only had one "real"
vacation if you define it as a whole week off.
One reason there's such a lack of vacation time here in Seattle is that in Washington
state, the law only requires less than 2/3 be paid out. In CA, we have to pay out 100%.
That's why in CA we require employees to take PTO to get it off of the books, but in WA we
basically don't allow vacation time. No company I've ever worked for let programmers take
even a fifth (as a guess)
I worked for a company where IT people used to look for places to go on vacation that
had no phones or pager service. One co-worker's rafting trip on the Colorado River
through the Grand Canyon started a trend among the IT staff: where can I go where the
phone/pager coverage is really poor or non-existent? Far, far North Canadian fishing trips
started getting considered. Can't have people actually having an outside-of-work life so the
companies bought satellite phones. No more vacations for you withou
If you work under such conditions by choice then it is on your shoulders alone.
No, you're wrong. Those working conditions are spreading everywhere. Companies have
figured out that instead of hiring more people, they can force others to work more for the
same pay.
It is very nice to be independently wealthy and not have to worry about getting a
paycheck, but for the rest of us we have to do it for a paycheck or face homelessness and
possibly starvation.
If all available work is under such conditions, is that really a choice?
It is very nice to be independently wealthy and not have to worry about getting a
paycheck, but for the rest of us we have to do it for a paycheck or face homelessness and
possibly starvation.
You don't have to be independently wealthy to make a living doing something that you don't
enjoy. If you hate IT work then go find something else to do. It's a big world with lots of
opportunity.
If all available work is under such conditions, is that really a choice?
Are you seriously claiming that someone who is bright enough to find work in the tech
sector will find it impossible to do something else if they put their mind to it? Possibly
even something they actually enjoy doing with reasonable hours and adequate pay. Point is
very few people are forced to work in IT. Arg
Old timer, this is no longer the case. It may have been true when you were young, but
these days it is IT, gigs, or unemployment. Too many people in a globally connected world
competing for the same few jobs.
That's hilarious. Do you have any idea how many jobs there are available in academia? Not
many. The issue is that if you do what you love, what's the incentive to stop? There's a
reason that the average age of professors always hovers in the 50s and 60s. It's not uncommon
to find semi-retired professors still kicking around well into their 70s teaching one or two
classes they love.
Do I really have to explain that some people don't really give a shit about what they are
doing? Sure everyone works to get paid but some people actually try to enjoy what they are
doing along the way so that the job is more than just a means to get money.
I've taken 4 weeks of vacation in 30 years. One week when my dad died. One week for a
camping trip, and the remaining two weeks were for things like my children being born.
Then you've been suckered, or have different priorities. One year, I took 6 weeks off to
travel around the country. Another year, I took 4 weeks off and went to Australia. Another
year, I took 6
Long on call hours. Declining inflation adjusted wages. Having to spend hours and hours of
your own time training because companies don't train anymore. Constant threats of outsourcing
or being replaced by an H-1B applicant (despite the fact that that is explicitly
illegal).
Does this result argue for wider adoption of Netflix's H.R. model, as expressed in the
manifesto
[slideshare.net] that went viral a few years back? Namely:
1. Hire "A" players, because the competence of one's coworkers is a large contributor to
employee satisfaction.
2. Don't use golden handcuffs as a means of mitigating hiring churn; you want employees to
stay at the company because they want to be there. Employees choose how much stock they want
vs. cash.
3. Don't use performance based bonuses; high performance is the base level expectation, not
something to be singled out and rewarded.
4. "We're a team, not a family." You don't "cut" people from a family; you do "cut" people
from a pro sports team.
5. "Hard work - Not Relevant". They care about productivity, not how hard you worked to be
productive.
6. Low tolerance for "brilliant jerks".
7. Pay "top of market" wages. "One outstanding employee gets more done and costs less than
two 'adequate' employees." "Employees should feel they are being paid well relative to other
options in the market."
A single data point is statistically meaningless "woe is us" wanking UNLESS other
industries are surveyed.
If the "burnout" rate for tech workers is 57%, but for medical workers is 75%, factory
line workers is 62%, and teachers is 60%, then the rate for tech workers is really not
bad.
If OTOH other industries scale at 20-30%, then the tech sector really is dire.
In short: I suspect that everyone feels like they are underappreciated, underpaid, and is
"fed up with all the bullshit at work"...like everyone else.
The office (Score: 4, Interesting) by Anonymous Coward on Tuesday June 26, 2018 @10:50AM (#56847616)
I've done a lot of Peopleware like consulting, mostly for software development teams. The
IT office space is in general the enemy of these teams. They are noisy and destroy your
concentration. You can only break someone's concentration a finite number of times per day,
especially with introverts; after that the dev is just exhausted. As a rule of thumb, the
correlation is: more people wearing headphones -> more burnout. It's fucked up that people
need to wear headphones to attempt to do their work, and a clear sign the environment is
poison to their jobs. Of course they put all these people in the same space, to save money.
Hardly ever do they do the math, and contemplate how much it costs them in burnout and
turnover.
Yep, so many folks LOOOVVVVEEE 50, 60, 70 hour weeks, and having to respond to the boss
24x7x365.25. Who needs a life?
UNIONS are why we have benefits, weekends, holidays and vacations. No company did that out
of the alleged kindness of their hearts.
But none of you here need them, they're *so* "ancient", never mind they could get you a 40
hour week and no being bothered off hours, no, enjoy your (non-) life.
What's wrong with not being promoted -- just do your job well, take your pay and vacation
time. Work to live, don't live to work. A snazzy job title isn't the pinnacle of human
achievement.
Because U.S. annual raises rarely keep up with annual inflation, you are forced
to move up the salary chain or effectively take a pay cut every year.
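The erosion compounds year over year. With hypothetical numbers (a 2% raise against 3% inflation), the effect over a decade looks like this:

```python
# Hypothetical figures: a 2% nominal raise against 3% inflation
# compounds into a real pay cut every year.
salary, raise_pct, inflation = 100_000.0, 0.02, 0.03

real = salary
for year in range(10):
    # each year, nominal pay grows 2% but prices grow 3%
    real = real * (1 + raise_pct) / (1 + inflation)

# After a decade, purchasing power is down roughly 9%:
assert 0.90 < real / salary < 0.92
```

A one-point gap looks small on any single review cycle, which is why staying put for a decade without a promotion quietly costs close to a tenth of one's real income.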
What's wrong with not being promoted -- just do your job well, take your pay and
vacation time. Work to live, don't live to work. A snazzy job title isn't the pinnacle of
human achievement.
While I agree with the sentiment that most people shouldn't feel pressured into living to
work, the pinnacle of human achievement in any discipline is nearly always achieved through
an insane devotion to the task. The people responsible for this level of excellence generally
live to work.
There is nothing wrong with working to live, but there often is nothing wrong with living
to work as long as it is a decision made freely.
Or at least raise the wage floor where overtime == time and a half. Obama tried this,
Trump unfortunately rolled it back. Also, sometimes you need to work overtime two weeks in a
row, crunch time to finish a project. I'd change that requirement to get the time back to
something like a 2-3 month period.
I'd support having all such things (including scheduled days off, vacation, overtime/comp
time, etc.) kept indefinitely, with maximum caps for each kind. If an employee leaves for any
reason, including being fired, they get paid out whatever they haven't used.
I'm quite happy to help my team meet their goals and go the extra mile to deliver a
quality product to our customer..... but I certainly expect that once that's done, I'll get
to go spend time with my family.
40 hour work weeks, enforced. 30 days paid vacation per year, plus holidays and
weekends.
Par for the course in the UK.
If you work overtime one week, you get those hours back the next week.
Not par for the course, but it's pretty common that you will get it back sometime. A busy
period coming up to a deadline could cover a few weeks.
Everyone gets two days off in a row every week.
.. usually happens
If you give up those days for some special reason, you get comp vacation time to be used
within the next month.
You would usually get this, but may have to wait until the peak is over before taking the
time back. Alternatively you could be paid - time and a half is quite common
Everyone takes all their vacation, every year.
In the UK it's exceptional for anyone not to take all their time. A company I worked for
switched the "holiday year" from a fixed January-December to a ye
$250k/yr if you have no time to enjoy it is worthless unless you plan to work for a few
years, live like a miser, and invest enough of it in rental property so you never have to
work again.
I work with several devs making nearly that much, and they most certainly are burned out.
When you work constant death marches with Seattle Hundreds (16 hours a day Mon-Thu and 12
hours a day Fri-Sun) that almost always happens. I work almost that much, and I moved over a
year ago and still haven't even unpacked yet. High pay helps, but you still have a breaking
point. There just aren't enough programmers to meet demand.
How does the company even end up with 100 hours of work per week for everyone? Is that all
essential work, or just busywork? If burnout rate is super high, wouldn't you end up with
even more work and fewer people to do it?
...end up with even more work and fewer people to do it?
The part I find fascinating about that is that the junior/recent college grads stick with
jobs despite the long hours for the experience and the most experienced people stick with
jobs because they know it's the same most everywhere else. I guess it's the devil you know.
The guys in the middle with five to fifteen years experience are the ones that keep jumping
ship to try to find somewhere better.
My company has about eighty people with less than three years experience and around twenty
with more than tw
I work with several devs making nearly that much, and they most certainly are burned
out. When you work constant death marches with Seattle Hundreds (16 hours a day Mon-Thu and
12 hours a day Fri-Sun) that almost always happens. I work almost that much, and I moved
over a year ago and still haven't even unpacked yet. High pay helps, but you still have a
breaking point. There just aren't enough programmers to meet demand.
I've never worked anywhere with that kind of schedule....or known anyone who has. Then
again, I have never lived in shit holes like Seattle or California.
I simply wouldn't work like that. If it were that, or go on welfare, I'd say fuck it and
go on welfare, or just rob houses for a living - leaving that kind of schedule to the
suckers.
If my employer required me to work more than 50 hours per week on anything other than a
rare occasion, I'd find a new employer. ASAP.
Too many tech jobs are just cleaning up after Indian disaster after Indian disaster. And
not in any sort of permanent way, just putting out the same fires over and over.
There are two kinds of IT people. Those who create. And those who fix creations. If you're
tired of doing one, then figure out how to get paid doing the other, and feel good knowing
you'll be working to fix
Recent studies have shown that 90% of Americans use digital devices for two or more hours each day and the average American spends
more time a day on high-tech devices than they do sleeping:
8 hours and 21 minutes to
be exact. If you've ever considered attempting a "digital detox", there are some health benefits to making that change and a
few tips to make things a little easier on yourself.
Many Americans are on their phones rather than playing with their children or spending quality family time together. Some people
give up technology, or certain aspects of it, such as social media for varying reasons, and there are some shockingly terrific health
benefits that come along with that type of a detox from technology. In fact, more and
more health experts and medical
professionals are suggesting a periodic digital detox; an extended period without those technology gadgets.
Studies continue to show that a digital detox has
proven to be beneficial for relationships, productivity, physical health, and mental health. If you find yourself overly stressed
or unproductive or generally disengaged from those closest to you, it might be time to unplug.
DIGITAL ADDICTION RESOLUTION
It may go unnoticed, but there are many who are actually addicted to their smartphones or tablets. It could be social media or YouTube
videos, but these are the people who never step away. They are the ones with their face in their phone while out to dinner with their
family. They can't have a quiet dinner without their phone on the table. We've seen them at the grocery store aimlessly pushing around
a cart while ignoring their children and scrolling on their phone. A whopping
83% of American teenagers claim to play video games while other people are in the same room, and
92% of teens report going online daily. 24% of those users access the internet via laptops, tablets, and mobile devices.
Addiction therapists who treat gadget-obsessed people say their patients aren't that different from other kinds of addicts. Whereas
alcohol, tobacco, and drugs involve a substance that a user's body gets addicted to, in behavioral addiction, it's the mind's craving
to turn to the smartphone or the Internet. Taking a break teaches us that we can live without constant stimulation, and lessens our
dependence on electronics. Trust us: that Facebook message with a funny meme attached or juicy tidbit of gossip can wait.
IMPROVE RELATIONSHIPS AND BE MORE PERSONABLE
Another benefit to keeping all your electronics off is that it will allow you to establish good manners and people skills and
build your relationships to a strong level of connection. If you have ever sat across from someone at the dinner table who made more phone
contact than eye contact, you know how it feels to take a backseat to a screen. Cell phones and other gadgets force people to look down
and away from their surroundings, giving them a closed off and inaccessible (and often rude) demeanor. A digital detox has the potential
of forcing you out of that unhealthy comfort zone. It could be a start toward rebuilding a struggling relationship too. In a
Forbes study,
3 out of 5 people claimed that they spend more time on their digital devices than they do with their partners. This can pose
a real threat to building and maintaining real-life relationships. The next time you find yourself going out on a dinner date, try
leaving your cell phone and other devices at home and actually have a conversation. Your significant other will thank you.
BETTER SLEEP AND HEALTHIER EATING HABITS
The sleep interference caused by these high-tech gadgets is another mental health concern. The
stimulation caused by artificial light can make
you feel more awake than you really are, which can potentially interfere with your sleep quality. It is recommended that you give
yourself at least two hours of technology-free time before bedtime. The "blue light" has been shown to interfere with
sleeping patterns by inhibiting the production of melatonin,
the hormone that controls our sleep/wake cycle (the circadian rhythm). Try shutting off your phone after dinner
and leaving it in a room other than your bedroom. Another great tip is to buy one of those old-school alarm clocks so the smartphone
isn't ever in your bedroom. This will help your body readjust to a normal and healthy sleep schedule.
Your eating habits can also suffer if you spend too much time checking your newsfeed.
The Rochester Institute of Technology released a study that
revealed students are more likely to eat while staring into digital media than they are to eat at a dinner table. This means that
eating has now become a multi-tasking activity, rather than a social and loving experience in which healthy foods meant to sustain
the body are consumed. This can prevent students from eating consciously, which promotes unhealthy eating habits such as overeating
and easy choices, such as a bag of chips as opposed to washing and peeling some carrots. Whether you're an overworked college student
checking your Facebook, or a single bachelor watching reruns of The Office, a digital detox is a great way to promote healthy and
conscious eating.
IMPROVE OVERALL MENTAL HEALTH
Social media addicts experience a wide array of emotions when looking at the photos of Instagram models and the exercise regimes
of others who live in exotic locations. These emotions can be mentally draining and psychologically unhealthy and lead to depression.
Smartphone use has been linked to loneliness, shyness, and less engagement at work. In other words,
one may have many "social media friends" while being lonely and unsatisfied because those friends are only accessible through
their screen. Start by limiting your time on social media. Log out of all social media accounts. That way, you've actually got to
log back in if you want to see what that Parisian Instagram vegan model is up to.
If you feel like a detox is in order but don't know how to go about it, start off small. Try shutting off your phone after dinner
and don't turn it back on until after breakfast. Keep your phone in another room besides your bedroom overnight. If you use your
phone as an alarm clock, buy a cheap alarm clock to use instead to lessen your dependence on your phone. Boredom is often the biggest
factor in the beginning stages of a detox, but try playing an undistracted board game with your children, leaving your phone at home
during a nice dinner out, or playing with a pet. All of these things are not only good for you but good for your family and beloved
furry critter as well!
Pity the poor blogger's lot: there are more interesting papers being published every week
than any essayist, however diligent, can possibly cope with. And there will be more, as the
vast genetic databases give up their secrets. No sooner does one team scoop the others with a
savage novelty than their rivals counter-attack with their own surprising findings. If you are
curious about mankind, it is the best time to be alive. We are likely to learn more about
ourselves in the next few decades than was possible in the last few centuries.
(usatoday.com) BeauHD on Thursday
February 22, 2018 @06:50PM from the welcome-to-2018 dept. According to a
new survey from Common Sense Media and SurveyMonkey,
47% of parents worry their child is addicted to their mobile device. By comparison, only
32% of parents say they're addicted themselves. USA Today reports: Half of parents also say
they are at least somewhat concerned about how mobile devices will affect their kids' mental
health. Nearly one in five say they're "extremely" or "very" concerned. According to the
survey, 89% of parents believe it's up to them to curb their children's smartphone usage. The
survey conducted between Jan. 25 and Jan. 29 included a sample of 4,201 adults, including 1,024
parents with children under age 18. Data was weighted to reflect the demographic composition of
the U.S. for adults over 18, based on Census data. Many devices and services feature parental
controls, but some parents may not be aware they exist. The Common Sense-SurveyMonkey survey
found 22% of parents did not know YouTube -- which has faced scrutiny over how easy it is for
kids to find inappropriate videos -- offered parental controls. Also, 37% have not used the
controls before. Among parents surveyed who say their kids watch YouTube videos, 62% said their
kids have seen inappropriate videos on the site. Most, or 81%, said it's the parents' job to
prevent kids from seeing these videos.
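The weighting step the summary mentions ("Data was weighted to reflect the demographic composition of the U.S. ... based on Census data") is a standard post-stratification technique: each respondent group is up- or down-weighted so the sample's demographic mix matches known population shares. A minimal sketch, with all group names, counts, and rates hypothetical (they are not the survey's actual figures):

```python
# Illustrative post-stratification weighting. Respondents in each (hypothetical)
# age stratum get weight = population share / sample share, so over-sampled
# groups count less and under-sampled groups count more.

def poststratify(sample_counts, population_shares):
    """Return one weight per stratum: population share / sample share."""
    total = sum(sample_counts.values())
    return {g: population_shares[g] / (sample_counts[g] / total)
            for g in sample_counts}

sample_counts = {"18-34": 500, "35-54": 300, "55+": 200}         # hypothetical respondents
population_shares = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}  # hypothetical Census mix

weights = poststratify(sample_counts, population_shares)

# Using the weights: a weighted proportion answering "yes", given hypothetical
# per-group agreement rates.
yes_rates = {"18-34": 0.60, "35-54": 0.45, "55+": 0.30}
num = sum(sample_counts[g] * weights[g] * yes_rates[g] for g in yes_rates)
den = sum(sample_counts[g] * weights[g] for g in yes_rates)
weighted_yes = num / den  # the estimate the survey would report
```

The unweighted sample here skews young (50% aged 18-34 versus a 30% population share), so the weighted estimate pulls the "yes" rate down toward the older groups' lower rates.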
So, what draws people to these phones? Surely, it is not just the groundbreaking design or
the connection with a community. As a minister, psychotherapist and scholar studying our
relationship with hand-held devices, I believe there is much more going on.
Our sense of self is shaped while we are still in the womb. The development of the self,
however,
accelerates after birth. A newborn, first and foremost, attaches herself to the primary
caregiver and later to things – acquiring what has been called an "extended
self."
The leading 19th-century American psychologist William James was among the first to
argue for an extended self. In his "Principles of Psychology," James
defined the self as "the sum total of all that a man can call his, not only his body and his
psychic powers, but his clothes and his house, his wife and children." Losing any of this
extended self, which could include money or another prized object, as he explained, could lead
to a sense of great loss. In early childhood, for example, babies and toddlers cry if they
suddenly lose their pacifier or favorite soft toy, objects that become part of their extended
selves.
Phones, I argue, play a similar role. It is not uncommon for me to feel a sudden onset of
anxiety should I drop my phone or be unable to find it. In my experience, many individuals feel
the same way. It is also reflected in how often many of us check our devices.
Psychologist Larry Rosen and
his colleagues at California State University found that 51 percent of individuals born in the
1980s and 1990s experienced moderate to high levels of anxiety when they were kept from
checking in with their devices for more than 15
minutes. Interestingly, the percentage drops slightly – to 42 percent – for
those born between 1965 and 1979.
This is primarily because they came of age at a time when hand-held technologies
were only beginning to make their entry. For this group, phones became part of their extended
self only in their late teens or as young adults.
"... It's fine to acknowledge a misstep. But spin the answer to focus on why this new situation is such an ideal match of your abilities to the employer's needs. ..."
I have been in my present position for over 25 years. Five years ago, I was assigned
a new boss, who has a reputation in my industry for harassing people in positions such as mine
until they quit. I have managed to survive, but it's clear that it's time for me to move along.
How should I answer the inevitable interview question: Why would I want to leave after so long?
I've heard that speaking badly of a boss is an interview no-no, but it really is the only reason
I'm looking to find something new. BROOKLYN
I am unemployed and interviewing for a new job. I have read that when answering interview
questions, it's best to keep everything you say about previous work experiences or managers positive.
But what if you've made one or two bad choices in the past: taking jobs because you
needed them, figuring you could make it work - then realizing the culture was a bad fit, or you
had an arrogant, narcissistic boss?
Nearly everyone has had a bad work situation or boss. I find it refreshing when I read
stories about successful people who mention that they were fired at some point, or didn't get
along with a past manager. So why is it verboten to discuss this in an interview? How can the
subject be addressed without sounding like a complainer, or a bad employee? CHICAGO
As these queries illustrate, the temptation to discuss a negative work situation can be strong
among job applicants. But in both of these situations, and in general, criticizing a current or past
employer is a risky move. You don't have to paint a fictitiously rosy picture of the past, but
dwelling on the negative can backfire. Really, you don't want to get into a detailed explanation
of why you have or might quit at all. Instead, you want to talk about why you're such a perfect fit
for the gig you're applying for.
So, for instance, a question about leaving a long-held job could be answered by suggesting that
the new position offers a chance to contribute more and learn new skills by working with a stronger
team. This principle applies in responding to curiosity about jobs that you held for only a short
time.
It's fine to acknowledge a misstep. But spin the answer to focus on why this new situation
is such an ideal match of your abilities to the employer's needs.
The truth is, even if you're completely right about the past, a prospective employer doesn't really
want to hear about the workplace injustices you've suffered, or the failings of your previous employer.
A manager may even become concerned that you will one day add his or her name to the list of people
who treated you badly. Save your cathartic outpourings for your spouse, your therapist, or, perhaps,
the future adoring profile writer canonizing your indisputable success.
Send your workplace conundrums to [email protected], including your name and contact
information (even if you want it withheld for publication). The Workologist is a guy with well-intentioned
opinions, not a professional career adviser. Letters may be edited.
"... To my generation computer games seem crazy but incredible amounts of money are spent developing each new game. Man's ingenuity has been turned against himself as mental addiction takes its place next to chemical addiction. ..."
"... You need to use AdBlock and NoScript (or the equivalent for whatever OS and browser you're using.) I don't see ads hardly anywhere. The main reason for using these tools is not only to get rid of ads, it's to enhance the security of your computer. ..."
Death of education by smartphones is a recent meme worrying educators. The ads, news bites, and
apps are crafted specifically to attract attention. They are the end result of marrying Madison
Avenue with Silicon Valley and only the most effective/annoying/distracting survive to become
the template for the next generation.
To my generation computer games seem crazy but incredible amounts of money are spent developing
each new game. Man's ingenuity has been turned against himself as mental addiction takes its place
next to chemical addiction.
"If I look up a news article on the Web, swarms of ads descend to interrupt, and we spend
precious time trying to delete them and move on as even as more continue to appear. The volume
of ads are so asphyxiating these days that it isn't worth the effort to get rid of them, and
so I turn them off., annoyed and exasperated."
You need to use AdBlock and NoScript (or the equivalent for whatever OS and browser you're
using.) I hardly see ads anywhere. The main reason for using these tools is not only to
get rid of ads, it's to enhance the security of your computer.
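The blockers the commenter recommends all work on the same principle: every outgoing request URL is checked against a filter list, and matching requests are cancelled before they load. A toy sketch of that matching step, assuming hypothetical filter patterns (real tools such as AdBlock or uBlock Origin use large community-maintained lists and a much faster matching engine):

```python
# Toy illustration of how an Adblock-style blocker decides whether to cancel
# a request: each URL is tested against a small list of regex filters.
# The patterns below are hypothetical examples, not real filter-list entries.
import re

filters = [
    r"://ads\.",          # block "ads." subdomains
    r"/banner/",          # block common banner paths
    r"doubleclick\.net",  # block a well-known ad network
]

def should_block(url):
    """Return True if the URL matches any blocking filter."""
    return any(re.search(pattern, url) for pattern in filters)

# A request to an ad subdomain is cancelled; an ordinary article loads.
should_block("http://ads.example.com/serve?id=1")   # blocked
should_block("http://example.com/article.html")     # allowed
```

This is also why such tools improve security, not just aesthetics: a request that is never made cannot deliver a malicious payload.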
"... Today, we learn in snatches or in brief bites. We don't settle down to learn comprehensively. We can't concentrate. Our life is one of incessant interruptions. If I look up a news article on the Web, swarms of ads descend to interrupt, and we spend precious time trying to delete them and move on as even as more continue to appear. The volume of ads are so asphyxiating these days that it isn't worth the effort to get rid of them, and so I turn them off., annoyed and exasperated. ..."
"... News items are intruders. Their origin is external to our thought. If outside events are always being dumped on our brains, it is hard to take the time to grade them in terms of our general knowledge. We do we really know? It takes a lot of reflection to answer that. Only by looking at our own knowledge from all sides, do we get a grasp of the insights that come from experience rather than the knowledge that come from foreign impressions. Schopenhauer once said that real thinking means "comparing truth with truth." To me that means deciding which truth had more meaning and priority in my own mental life? ..."
"... The ability to focus on a subject for a long time without fatigue was one of Napoleon's mottos. Who today can do that? What benefit to we get from blotting out distractions and learning to reason carefully for a long time without getting tired? It becomes harder for us to do everyday. Topics flock to our brains. The Middle East, President Trump, North Korea. Are these things really interesting? If we buckle down and concentrate on them, what will be the reward? To me, the rewards are always meager. There is a lot of competition when it comes to current affairs. If we fall behind, we suffer a pang of regret � some neighbor knows more about current affairs than I do. But so what? I want to ponder things that are unique to my own temper and mental capacity. I don't want to become a replica of my neighbor. There are few worse fates than that. I want to ponder things that are appropriate to my nature and experience. I want to encourage thoughts that have truth and life in them that occur naturally, not from without. ..."
"... Let's face it. Today we are all the junkies of daily news. "The Daily Fix" phrase is perhaps the most appropriate. ..."
"... One of the main villains of modern life is opinion. Popular opinion has replaced thought and reflection. Opinions are the product of ignorant hearsay. All of us see or view something and, without considering what it means, we rush to bray our reactions to anyone who can hear it. But is our reaction valid? Insightful? Useful? Enlightening? Opinions are unstable; they become outmoded, lacking in pertinence or validity and over time, are discarded. An opinion is the prisoner of the moment, a prisoner of the thoughtless and automatic the commonplace. For every thousand people cry a thing up only a pitiable few cry it down and their voices are drowned out. ..."
"... New York Times' ..."
"... You need to use AdBlock and NoScript (or the equivalent for whatever OS and browser you're using.) I don't see ads hardly anywhere. The main reason for using these tools is not only to get rid of ads, it's to enhance the security of your computer. ..."
"... Most people (68%) have an IQ that is within 1 standard deviation of average. These people are mediocre; functional, but mediocre. Of the remaining 32% we have 16% on the far left side of the bell curve. These people are truly stupid. That leaves only 16% (16 out of every hundred people you meet) that have some spark of intelligence above mediocrity. Of those, only 2% are truly bright. ..."
"... This, I think, is the root of the problems you discuss. Most people simply do not have the ability to do more than absorb and rote repeat the shallow informational garbage that is tossed at them. Their stunted intellectual capacities don't permit them to gain satisfaction from deep meditations. Rather, they prefer the gross pleasures of food, drink, slapstick and gossip. ..."
"... I think much of what is "modern life" is soul stifling. There are many ways to sidestep or repudiate the crassness and incivility of the world today, but for me, it has been to exit the metropolitan life. Going to my farm, where there is no cell service, no big highways and people still ride their horses down the roadways - I feel a palpable release and relief just driving into the area. ..."
"... The key to things, as has been taught throughout time, is to do things in moderation - and the internet and smartphones are no exception. However, the addictive appeal of instant everything is apparent to us here commenting, and is to be understood and moderated. In that vein, I want to thank the Colonel for giving us the opportunity to enjoy this little nook of cyberspace - thank you! ..."
Today, we learn in snatches or in brief bites. We don't settle down to learn comprehensively.
We can't concentrate. Our life is one of incessant interruptions. If I look up a news article on
the Web, swarms of ads descend to interrupt, and we spend precious time trying to delete them and
move on even as more continue to appear. The volume of ads is so asphyxiating these days that
it isn't worth the effort to get rid of them, and so I turn them off, annoyed and exasperated.
The chief point is that we cannot sit and think and read or reflect in peace any more. Everything
calls to us, tempts us, distracts us, befuddles and annoys us. Our brains are not what they once
were, not because of age, but because our culture works differently on them and hinders their further
development.
News items are intruders. Their origin is external to our thought. If outside events are always
being dumped on our brains, it is hard to take the time to grade them in terms of our general knowledge.
What do we really know? It takes a lot of reflection to answer that. Only by looking at our own knowledge
from all sides do we get a grasp of the insights that come from experience rather than the knowledge
that comes from foreign impressions. Schopenhauer once said that real thinking means "comparing truth
with truth." To me that means deciding which truth has more meaning and priority in my own mental
life.
The ability to focus on a subject for a long time without fatigue was one of Napoleon's mottos.
Who today can do that? What benefit do we get from blotting out distractions and learning to reason
carefully for a long time without getting tired? It becomes harder for us to do every day. Topics
flock to our brains. The Middle East, President Trump, North Korea. Are these things really interesting?
If we buckle down and concentrate on them, what will be the reward? To me, the rewards are always
meager. There is a lot of competition when it comes to current affairs. If we fall behind, we suffer
a pang of regret: some neighbor knows more about current affairs than I do. But so what? I want
to ponder things that are unique to my own temper and mental capacity. I don't want to become a replica
of my neighbor. There are few worse fates than that. I want to ponder things that are appropriate
to my nature and experience. I want to encourage thoughts that have truth and life in them, that occur
naturally, not from without.
I do not understand why so many people strive so hard to be up to date. They are always in a race
to try and announce headlines before their neighbors. They rarely study or master the stories the
headlines advertise. They evade the labor of memorizing. All they can recapitulate are the headlines.
If you ask about the stories, they hesitate, then falter out, "I only saw the headlines." I am sometimes
eager to have them summarize what they've read, but there is no there there, as Gertrude Stein said
about Oakland, CA.
Let's face it. Today we are all the junkies of daily news. "The Daily Fix" phrase is perhaps
the most appropriate. It is really shameful if you think about it, but no one does, or if you
protest about the meaningless deluge of daily news, you are labeled over-sensitive or nit-picking.
Most of us awake to news headlines. There is a hurricane, an accident that kills sailors, a helicopter
crashes, a new threat of annihilation from an Asian punk regime.
But do we learn anything from these? We are like those toy birds that dip their beaks into a dish
of water. They look as if they're drinking, but they don't. They are not built to absorb anything.
Their dipping looks like activity, but it is all counterfeit. Unfortunately the breathless topics
of today are not of permanent interest nor do they enrich the mind. They are transitory, destined
not to last. They keep us floating on the surface of life, preventing us from diving deep and discovering
something new and valuable and priceless.
We see lists of notable books on the Civil War, the downfall of the Soviet Union, the Fall of
the Bastille. We see new books on the French Revolution or the fall of Paris in 1870. We see histories
of the Balkans or the Ottoman Empire. We see books about the nature of power, religious or corporate
or military. Do we read them, study them?
As we get older, our minds get more introspective. We want to seize the enduring truths that reside
in our nature or our close friends. Such things sharpen the mind; help expand the range of our inner
insight.
Worthless Opinions
One of the main villains of modern life is opinion. Popular opinion has replaced thought and
reflection. Opinions are the product of ignorant hearsay. All of us see or view something and, without
considering what it means, we rush to bray our reactions to anyone who can hear it. But is our reaction
valid? Insightful? Useful? Enlightening? Opinions are unstable; they become outmoded, lacking in
pertinence or validity, and over time are discarded. An opinion is the prisoner of the moment, a
prisoner of the thoughtless, the automatic, the commonplace. For every thousand people who cry a thing
up, only a pitiable few cry it down, and their voices are drowned out.
We suffer from an increasing lack of sound judgment.
... ... ...
Isolation
Isolation plays a large part in retarding study. The pleasure of learning is a noble pleasure,
and like all good things, sharing what we learn with others increases its value. We are social creatures,
and it is part of our nature to share the excellent. But most of the time we lack people to share
the joy of our discoveries with. We are surrounded by mental lightweights who confine
their reading to New York Times bestsellers, people who lack the means to judge the merit
of what they're reading, who lack the talent to articulate its virtues. They lack the standards of
taste and the critical spirit required to evaluate them correctly.
Isolation has killed a lot of thinkers. I remember how Hume's Treatise of Human Nature
fell absolutely flat after it was published yet, over time, became a classic. But popularity can
kill as well. We think of how Mozart's amazing genius wowed and fascinated his audiences and followers,
and yet, for all his fame, he was buried in an unmarked grave for the poor. Crowds are dismayingly
fickle. Their interest lacks stamina.
Apparently the task of modern culture is to herd all of us onto well-traveled roads, never
taking the road less traveled. Few of us explore, and the few who do are not met with enthusiasm or
praise or appreciation but with polite indifference, mainly because their knowledge is not current or
popular.
Popularity is a trap. It retains a viselike grip on the ignorant. It is sinister because it is
addictive. If something is popular and makes money, then it must be successful, and if successful,
it must be superior. No one asks the fans of the popular why they admire as they do, because they
assume that everyone else thinks just as they do and everyone else suffers from the same mediocre
qualities of taste and narrowness of mind.
It is a hard truth that people of more talented intellectual capacity seek out people with similar
temperaments and natures. That is the key to all friendship. With the right people, they come alive.
They speak freely and honestly, relating facts that stimulate their listeners who then come forward
with their own treasured items of memory and knowledge that stimulate and reinforce the conversation.
Both sides leave the discussion strengthened and invigorated. Both are eager to hear more, learn more.
Both return feeling less isolated from the ephemeral things that matter so much in the world.
Divas
The purpose is to learn and share our knowledge for its own sake, not because we want to
become the center of attention. A neighbor's kid came to visit his parents. He was obsessed with
the Rubik's Cube. On the night of his arrival, there was a dinner in progress, but no sooner
had the guests entered the hallway than this kid was putting on an exhibition, wrestling with
his cube, blocking the entrance hallway, of course earning automatic applause from his audience.
A short time later, he went down to Miami to attend an international competition, and after
all his self-display his scores were mediocre, resting stolidly in the middle of the pack. I wondered
whether his interest was merely a desire to attract cheap applause, or whether he was a serious student
determined to become an expert, putting in those long hours of concentrated focus to improve his
skill. Of course, my hopes were misplaced. He moved on to something else where he could be the center
of attention and hog the spotlight.
How We Die
Am reading an excellent book, How We Die. The author, Sherwin Nuland, is a doctor, a
surgeon, who is a well-educated and deeply cultured man. He writes with eloquence. His prose is not
for the squeamish. He relates very grisly details about how we lose our lives. Each chapter documents
the chief causes of death in America: heart disease, Alzheimer's, accidents, suicides, "Murder and
Serenity", etc.
One death he documents was that of James McCarty who died of a heart attack. He was a successful
construction executive who led a "suicidal" life. He smoked, ate rich food, consumed a lot of red
meat, and grew flabby and overweight and never exercised. He arrived at the emergency room at 8 p.m.,
on a hot and humid Sept. evening. He complained of "a constrictive pressure behind the breastbone"
that radiated up into his throat and down his left arm. The pressure had begun after his usual heavy
dinner. His face was ashen and sweaty. His heartbeat was irregular but improved after initial treatment.
At 11:00 p.m., Nuland arrived. McCarty wasn't pleased to see him. McCarty greeted him with a thin,
forced smile. Nuland was 22 years of age at the time and this was one of his first cases. As Nuland
sat down, McCarty suddenly threw his head back and "bellowed a wordless roar that came out of his
throat from somewhere deep in his stricken heart." He hit his balled fists with surprising force
up against his chest as his face became swollen and purple.
Nuland explains how he opened up the chest cavity to massage the man's heart. The heart felt like
"an uncoordinated squirming, a jellylike bagful of hyperactive worms." The heart was wriggling under
his fingers, and he began a series of firm, syncopated compressions.
Then Nuland writes, "Suddenly something stupefying in its horror took place." (Excellent sentence.)
McCarty "threw back his head once more, and staring at the ceiling with his glassy, unseeing gaze
of open, dead eyes, roared at the distant heavens a hoarse, rasping whoop that sounded as if the
hounds of hell were barking." (Pat described this as McCarty's "last hurrah.") McCarty, of course,
was already dead when this happened.
The book is written in this effective pictorial style. It spares the reader nothing.
Of course, we all die from lack of oxygen. We cease breathing, and our esophagus muscles can constrict
and make us bark as we die, or there can be great heaving as our lungs fail. The myths that our
nails or hair grow after our death are simply myths. After we die, nothing grows. The lively, energetic
spirit that was once our deepest being has fled, leaving a pathetic shell behind that is not pleasant
to look at. The eyes, at first unfocused and glassy, soon become covered by a gray film that has
no expression at all. The body begins to shrink. We have become mere luggage. What will survive of
us has already been done. There is nothing else to look forward to.
I learned enough New Testament Greek to read St. Paul's letters, which were outstandingly articulate
in every way. But when I came to the Resurrection, I became skeptical. It was a lovely wish: to
be restored to your parents, your wife, and your friends. But St. Paul's belief had its antecedents.
Zoroaster, the great Persian religious leader, was said to have been torn to pieces by his followers,
but rose after three days. I don't like coincidences. Of course, Jesus appeared to his followers, but
there was little recorded of him after that. Was he resurrected a second time? There is little information.
You are spot on. The biggest problem we face is our own self and the delusion of searching for non-existent
knowledge outside of us. As you say, if we sit comfortably and contemplate our own experiences,
both good and bad, there will be a greater awakening to the world outside of us. But as we search
for knowledge outside of us, be it the internet or other mediums, we are bombarded with irrelevant information
such as pop-up ads etc. I have to plead guilty of the latter, but sometimes I do practice the
former!
In my early years I marched along the trail knowing that in the mist dimly seen was "The Wall".
Now at 81 "The Wall" is clear, spotlighted in bright sunlight.
I am overwhelmed by this gift of your constant thinking. I agree with you about not wanting to
live if my wits are gone, but I fear that it will be impossible for me to tell what that moment
might be. I am I guess still afraid of death even though I strive to overcome this feeling. We
would all like to die peacefully in our sleep one night but I think this rarely happens.
Death of education by smartphones is a recent meme worrying educators. The ads, news bites, and
apps are crafted specifically to attract attention. They are the end result of marrying Madison
Avenue with Silicon Valley and only the most effective/annoying/distracting survive to become
the template for the next generation. To my generation computer games seem crazy but incredible
amounts of money are spent developing each new game. Man's ingenuity has been turned against himself
as mental addiction takes its place next to chemical addiction.
"If I look up a news article on the Web, swarms of ads descend to interrupt, and we spend
precious time trying to delete them and move on, even as more continue to appear. The volume
of ads is so asphyxiating these days that it isn't worth the effort to get rid of them, and
so I turn them off, annoyed and exasperated."
You need to use AdBlock and NoScript (or the equivalent for whatever OS and browser you're
using). I hardly see ads anywhere. These tools not only get rid of ads; they also enhance
the security of your computer.
Having watched my father pass away in recent months, after several years confined to a wheelchair
and in the care of gifted, compassionate immigrants, I sincerely appreciate this post.
In those last weeks, the most help that I could offer was to play him any opera, musical, jazz,
or orchestral piece that he requested -- all via a quick search on my iTunes account. In the last
hours, when he could no longer speak, Indian Chakra music (also via iTunes) helped his breathing
and was a balm beyond what words could ever express.
What he taught me is that it is not how we die -- in his case, stoic, uncomplaining, loved,
and treasured -- but how we live, that matters.
His life, like so many of his generation, was shaped by several years spent in the US Army
between 1943 - 45, much of it in the South Pacific, then Japan. The catastrophic destruction that
he witnessed, which he did not share with me until he was well into his 80s, shaped the way that
he lived his life, and sharpened his priorities, his beliefs, his politics, his ethics, and his
capacity for friendship. Also, his capacity for making a decision, then sticking to it.
He once told me that after watching 'so many bodies stacked up like cordwood' in the cleanup
of Yokohama after it had been firebombed, he promised himself that he would never, ever remain
in any job if he was miserable after 72 hours. He kept that promise to himself, and helped countless
others also try to find meaningful work, be productive, and laugh through job losses, down cycles,
and lawsuits.
In other words, his military experiences in WWII seemed to liberate him in a sense to live
his life as fully as he possibly could, and he always felt grateful to have had a solid education,
a superb local library, and -- much later -- The Internet to help him reconnect with friends strewn
across the country.
Today, he would be called 'resilient'. Many of the traits that helped him be successful in
a long career were sharpened in the US Army, and he felt that 'kids today' would have enormous
benefits from some kind of national service. That generation knew how to pull together. Whether
today's kids can figure it out remains to be seen.
I'm in my 40's. I had a heart attack (MI) 3 years ago and a stroke 2 weeks ago. The MI felt like
1000-lbs of compressed air was shot into my lungs. When I had the stroke I was typing a report
at my desk around 7 pm. My wife was still at work. My right arm went completely numb and the right
side of my face felt partially numb. I was rushed to the ER at a local hospital outside Boston.
No major long-term effects. In both cases (MI and stroke) I was a bit freaked out because I
was conscious and knew that what was happening was grave. In both cases my overwhelming thought
(fear) was that I was about to enter eternity, and I wondered if I had led a good enough life
to avoid eternal isolation from God. During the stroke they were ready to use a very aggressive
treatment called TPA, which, the ER doctor told me, could result in bleeding in the brain and
fatality. I was frightened of death for the first time in my life. Because it was real. I asked
my wife if we might need to call a priest. She said I would be ok. The decision to not go forward
with TPA was made by a brother and sister-in-law (one a Harvard Med cardiologist and the other
a professor of medicine) who talked with the ER doctor by phone as this was going down (I'm sure
a first for him).
Anyway, crazy stuff. I will be changing my lifestyle in many ways: body, mind, and spirit.
I'm practicing my faith more diligently and plan to go to confession at least once per month and
say the rosary daily. I view these events as a wake-up call for my health and a severe mercy for
my eternal soul.
I was undergoing some sort of medical test and the technician noticed I was reading a book, one
of the Patrick O'Brian series which includes Master and Commander which was such a good
movie. I told him I was reading the series for what I thought was the sixth time and he was stunned.
He could not believe that anyone would read a book twice, let alone a series of twenty books six
times. I think Richard Sale understands why I'm reading it yet again.
I volunteer at a hospice home in my community. It's a nice place and people in the community
can spend their final days there, for free, well taken care of, with their families and friends,
in a clean, peaceful, respectful environment. The goal of the home is to provide as much dignity
in death as possible. I've seen a lot of people go through the dying process and have been there
at the final moment for some of them.
You'd be surprised at how many residents pass their last week and day and even moment with
some banal game show blaring away on the television. You might be surprised at how few conversations
there are about spiritual matters, how few reflections on what was learned during life, how few
conversations regarding great adventures, joys, loves, sorrows.
For most, death comes painlessly. There is a sigh and, perhaps, a brief rattle and then the
resident is gone. Quite uneventful. Quite mundane.
Most people (68%) have an IQ that is within 1 standard deviation of average. These people
are mediocre; functional, but mediocre. Of the remaining 32% we have 16% on the far left side
of the bell curve. These people are truly stupid. That leaves only 16% (16 out of every hundred
people you meet) that have some spark of intelligence above mediocrity. Of those, only 2% are
truly bright.
This, I think, is the root of the problems you discuss. Most people simply do not have
the ability to do more than absorb and rote repeat the shallow informational garbage that is tossed
at them. Their stunted intellectual capacities don't permit them to gain satisfaction from deep
meditations. Rather, they prefer the gross pleasures of food, drink, slapstick and gossip.
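For what it's worth, the percentages in the comment above fall straight out of the normal distribution that IQ scores are normed to. A minimal sketch (plain Python, standard library only; the variable names are mine, not the commenter's) recovers them from the standard normal CDF:

```python
from math import erf, sqrt

def norm_cdf(z: float) -> float:
    """Standard normal CDF, expressed via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Fractions of a normally distributed population, by distance from the mean.
within_1sd = norm_cdf(1) - norm_cdf(-1)  # the "functional but mediocre" middle
below_1sd = norm_cdf(-1)                 # the lower tail
above_1sd = 1.0 - norm_cdf(1)            # above one standard deviation
above_2sd = 1.0 - norm_cdf(2)            # the "truly bright" tail

print(f"within 1 SD: {within_1sd:.1%}")  # 68.3%
print(f"below -1 SD: {below_1sd:.1%}")   # 15.9%
print(f"above +1 SD: {above_1sd:.1%}")   # 15.9%
print(f"above +2 SD: {above_2sd:.1%}")   # 2.3%
```

The exact values are about 68.3%, 15.9%, 15.9%, and 2.3%, which is where the comment's rounded 68/16/16/2 figures come from.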
Sorry Eric but I don't buy your assumptions at all. I think if you were to look carefully and
without bias, you will find that the mediocrity you have perceived is almost entirely the result
of a very poor national (at least in the USA) public education system.
"Are these things really interesting? If we buckle down and concentrate on them, what will
be the reward? To me, the rewards are always meager. There is a lot of competition when it
comes to current affairs. If we fall behind, we suffer a pang of regret: some neighbor knows
more about current affairs than I do. But so what? I want to ponder things that are unique
to my own temper and mental capacity."
Dear sir, thank you for your essay. You are very right in your principles. One should meditate
and think deeply. One should not be distracted by passing fads and foolish fancies. I am a foolish
fellow. I fritter my time away on distractions. I know that I should say "no" to exciting projects
and focus on just one useful enterprise, but in general I fail.
One thing that I do focus on is putting together aggregated news of police misconduct, government
corruption, and conspiracy theories. Up through 2016, I thought it was just another foolish habit.
I had perhaps two dozen readers every day - I got no money for keeping them abreast of the headlines.
And then, in 2016, John Podesta was accused of human trafficking. If the allegations - known
as Pizzagate - are even close to true, then the entire USA government will be shaken when the
truth comes out. I reported on Pizzagate when it was news, just like I report on every other report
of government misconduct. And instead of two dozen visitors, I got thousands. For just one day,
or just one month, there were thousands of people who wanted to read the allegations, and I played
a very small role in delivering the truth that had been exposed by much braver and abler men.
I hope the corruption will be exposed, and then everyone will wake up, and my blogging efforts
will be obsolete. I would very much like to feel that I can ignore the news in good conscience.
I think much of what is "modern life" is soul stifling. There are many ways to sidestep or
repudiate the crassness and incivility of the world today, but for me, it has been to exit the
metropolitan life. Going to my farm, where there is no cell service, no big highways and people
still ride their horses down the roadways - I feel a palpable release and relief just driving
into the area.
My recommendation is simply to limit your drinking. Nobody gets drunk every day except alcoholics,
who have a sickness. My sense of things on the internet and in smartphone-land is similar - it's
like a drunk who needs to drink. If you have a little, it is fine, although you don't always need
it. If you have a lot, then you are like a drunk - because knowing things does not mean you can
affect them, and worrying over things you cannot affect is a recipe for many ills.
The craziness of the world will recede in the future - so much of what is considered 'normal'
now is not so, when viewed from the lens of history. Things go in cycles, and the current world
is the most technologically complex one in known history - and thus it has more innate vulnerabilities
than any other previous human existence. Simplification will come, and is likely on its way in
our children's or grandchildren's times here on Earth.
Concurrently, my focus has been on building the farm so that my children and possibly their
own, have a place to go that is not the city, that is simpler, that is closer to the Earth and
provides them with things impalpable. This has been, and is, a great source of happiness in this life
for me.
I haven't subscribed to the Judeo-Christian faith since I was originally indoctrinated in my
early teens via catechism. I never grokked a God that delivers binary choice - this world would
be anathema to that type of being. I believe reliving the wheel of life is a far more likely and
positive possibility for souls. Polishing one's soul in repeated attempts has an appeal much greater
than burning in hell eternally or playing a harp among identically blissful angels - the binaries
offered by many religions are not reflective of what humanity is, IMHO. I guess in the next years
I will discover what the truth of things is, and take comfort in my offspring moving through time
beyond my own.
The key to things, as has been taught throughout time, is to do things in moderation -
and the internet and smartphones are no exception. However, the addictive appeal of instant everything
is apparent to us here commenting, and is to be understood and moderated. In that vein, I want
to thank the Colonel for giving us the opportunity to enjoy this little nook of cyberspace - thank
you!
And for this essay - thank you. It surely needed to be written, as it is something we all should
acknowledge. Death is something natural, normal and inevitable. Easing the burden of loss to your
loved ones is an important responsibility as we pass through the veil.
At a cabin in the woods near Redmond, Washington, the reSTART center helps residents withdraw from
the technology that has consumed their lives.
By the time Marshall Carpenter's father broke down the barricaded door of his son's apartment
and physically ripped him away from his electronic devices, the 25-year-old was in a bad way. He
could not bear to live a life that didn't involve hours upon hours of uninterrupted screen time.
"I was playing video games 14 or 15 hours a day, I had Netflix on a loop in the background, and
any time there was the tiniest break in any of that, I would be playing a game on my phone or sending
lonely texts to ex-girlfriends," Carpenter says.
We are sitting in a small, plain apartment in a nondescript condo complex in Redmond, Washington,
on the outskirts of Seattle. Marshall shares the apartment with other men in their 20s, all of whom
have recently emerged from a unique internet addiction rehab program called reSTART Life.
"I was basically living on Dr Pepper, which is packed with caffeine and sugar. I would get weak
from not eating but I would only notice it when I got so shaky I stopped being able to think and
play well," he adds. By then, he'd already had to drop out of university in Michigan and had lost
his sports scholarship.
His new friends Charlie and Peter nod sagely. Charlie Bracke, 28, was suicidal and had lost his
job when he realized his online gaming was totally out of control. He can't remember a time in his
life when he was not playing video games of some kind: he reckons he began when he was about four
and was addicted by the age of nine.
Marshall and Charlie at reSTART, with Charlie's dog, Minerva. Photograph: Rafael Soldi for the
Guardian
For Peter, 31, who preferred to withhold his last name, the low came when he had been homeless
for six months and was living in his car.
"I would stay in church parking lots and put sunshades up on the windows and spend all day in
my car on my tablet device," he says.
He was addicted to internet porn, masturbating six to 10 times a day, to the point where he was
bleeding but would continue.
When he wasn't doing that, he was so immersed in the fantasy battle game World of Warcraft that
in his mind, he was no longer a person sitting at a screen, but an avatar: the bold dwarvish hero
Tarokalas, "shooting guns and assassinating the enemy" as he ran through a Tolkien-esque virtual
realm.
And when he wasn't doing that, he would read online news reports obsessively and exercise his
political opinions and a hair-trigger temper in the comment section of The Economist, projecting
himself pseudonymously as a swaggering blogger-cum-troll.
"I was a virgin until I was 29. Then I had sex with a lap dancer at a strip club. That's something
I never thought I would do," he says.
After completing the initial $25,000, 45-day residential stage at the main "campus" a few miles
away, clients move into the cheaper, off-site secondary phase. Here they get to share a normal apartment,
on the condition that they continue with psychotherapy, attend Alcoholics Anonymous-style 12-step
meetings, search for work and avoid the internet for a minimum of six months.
Marshall, Charlie and Peter successfully completed the second phase and have graduated from the
reSTART program, but they have chosen to stay in the same apartment complex and rent with other recovering
gamers as they continue to reboot their lives.
Mostly they carry only flip phones and have to go to the library when they want to check email.
"I'm taking my life in six-month chunks at this stage. So far I haven't relapsed into gaming and
I'm feeling optimistic," says Bracke.
An addiction overwhelmingly afflicting men
A climbing wall at the main reSTART campus, deep in the woods. Photograph: Rafael Soldi for the
Guardian
Nine miles east, down a dirt track off a country road that winds through forests, six young men
are sitting in a wooden cabin amid a cluster of moss-draped trees: the reSTART campus.
Spring sunshine is flooding through the windows and the only sounds are birds singing and the
men cracking their knuckles as they stare at the floor.
They have recently arrived at rehab.
Hilarie Cash, a psychotherapist and the chief clinical officer at reSTART, asks the guys to begin
a communication exercise.
Philip, 22, steps into the middle of the group. He's been here for three weeks and is on a year's
medical leave from Duke University after getting hooked on Dota 2, the sequel to the fantasy battle
game Defense of the Ancients. He asks Adam, who only arrived four days ago and is fidgeting awkwardly,
to stand up and face him. (The real names of those currently in the residential program have been
withheld.)
Kevin, who has been here for four weeks, coaches them through an exercise known in counseling
circles as the "listening cycle", designed to facilitate emotional conversations in relationships.
It's a basic introduction for the new guy.
Philip, who was underweight when he arrived, says to Adam, who is overweight: "I'm worried that
you're not eating healthily. I noticed you've been skipping dinner."
Adam is meant to repeat back to Philip what he heard him say the problem is. He mumbles, barely
audible, and can't seem to remember what he's just been told.
He's unable to focus, and the air is thick with reluctance and embarrassment.
Stephen, another newbie, is gazing at the ceiling, yawning, sighing, then looking mildly irritated.
Alex, 20, comes to the rescue. He arrived at rehab in January but has popped back to visit the
group and explains: "It's so hard at the beginning. Day one here, I was a wreck, and the first two
weeks I was backsliding."
His games of choice were The Legend of Zelda, a solo action adventure series, where "instead of
being the depressed piece of shit I was in real life" he could exist as a swashbuckling hero.
Adapting to a tech-free world structured around rural communal living and social skills was a
nightmare, he says. "I wouldn't join in at first and I got called out for it by the others."
The problem is that you can't learn IT well working 40 hours a week. It is too complex
a specialty, and it does require long hours. So only people who can put in long hours can survive in IT.
Notable quotes:
"... There's been a flurry of recent coverage praising Sheryl Sandberg, the chief operating officer of Facebook, for leaving the office every day at 5:30 p.m. to be with her kids. Apparently she's been doing this for years, but only recently "came out of the closet," as it were. ..."
"... They discovered that the "sweet spot" is 40 hours a week, and that, while adding another 20 hours provides a minor increase in productivity, that increase only lasts for three to four weeks, and then turns negative. ..."
"... Anyone who's spent time in a corporate environment knows that what was true of factory workers a hundred years ago is true of office workers today. People who put in a solid 40 hours a week get more done than those who regularly work 60 or more hours. ..."
"... However, the facts don't bear this out. In six of the top 10 most competitive countries in the world (Sweden, Finland, Germany, Netherlands, Denmark, and the United Kingdom), it's illegal to demand more than a 48-hour work week. You simply don't see the 50-, 60-, and 70-hour work weeks that have become de rigueur in some parts of the U.S. business world. ..."
"... In other words, nobody should be apologizing for leaving work at a reasonable hour like 5:30 p.m. In fact, people should be apologizing if they're working too long each week, because it's probably making the team less effective overall. ..."
You may think you're getting more accomplished by working longer hours. You're probably wrong.
There's been a flurry of recent coverage
praising Sheryl Sandberg, the chief operating officer of Facebook, for
leaving the office every day at 5:30 p.m. to be with her kids. Apparently she's been doing this
for years, but only recently "came out of the closet," as it were.
What's insane is that Sandberg
felt the need to hide it, since there's a century of research establishing the undeniable fact
that working more than 40 hours per week actually decreases productivity.
In the early 1900s, Ford Motor ran dozens of tests to discover the optimum work hours for worker
productivity. They discovered that
the "sweet
spot" is 40 hours a week, and that, while adding another 20 hours provides a minor increase in
productivity, that increase only lasts for three to four weeks, and then turns negative.
Anyone who's spent time in a corporate environment knows that what was true of factory workers
a hundred years ago is true of office workers today. People who put in a solid 40 hours a week get
more done than those who regularly work 60 or more hours.
The workaholics (and their profoundly misguided management) may think they're accomplishing more
than the less fanatical worker, but in every case that I've personally observed, the long hours result
in work that must be scrapped or redone.
Accounting for Burnout
What's more, people who consistently work long
work weeks get burned out and inevitably start having personal problems that get in the way of getting
things done.
I remember a guy in one company I worked for who used the number of divorces in his group as a
measure of its productivity. Believe it or not, his top management reportedly considered this a valid
metric. What's ironic (but not surprising) is that the group itself accomplished next to nothing.
In fact, now that I think about it, that's probably why he had to trot out such an absurd (and,
let's face it, evil) metric.
Proponents of long work weeks often point to the even longer average work weeks in countries like
Thailand, Korea, and Pakistan, with the implication that the longer work weeks are creating a competitive
advantage.
If U.S.
managers were smart, they'd end this "if you don't come in on Saturday, don't bother coming to
work on Sunday" idiocy. If you want employees (salaried or hourly) to get the most done, in the shortest
amount of time and on a consistent basis, 40 hours a week is just about right.
In other words, nobody should be apologizing for leaving work at a reasonable hour like
5:30 p.m. In fact, people should be apologizing if they're working too long each week, because it's
probably making the team less effective overall.
After all,
there are numerous high-profile billionaires who haven't called it quits despite possessing the luxury
to retire, including some of the world's top chief executives, such as Amazon's Jeff Bezos and Facebook's
Mark Zuckerberg. But it turns out, the suddenly rich who aren't running companies are also loath
to quit, even though they have plenty of money. That could be, in part, because the link between
salary and job satisfaction is very weak.
According to a meta-analysis by University of Florida business
school professor Timothy Judge and other researchers, there's less than a 2% overlap between the
two factors.
In the long run, we derive job satisfaction from non-monetary sources, which include
positive peer relationships, the ability to work on meaningful projects and even leadership opportunities.
"... If anything, the whole plagiarism scandal reflects somewhat poorly on Michelle Obama. One reason Obama's words were able to
play so well at the RNC was that in the lifted passages, Obama was speaking using the conservative language of "bootstrapping." Obama's
sentence, that "the only limit" to one's achievements is the height of one's goal and the "willingness to work" toward it, is the Republican
story about America. It's the story of personal responsibility, in which the U.S. is overflowing with opportunity, and anyone who fails
to succeed in such a land of abundance must simply not be trying hard enough. ..."
"... People on the left are supposed to know that it is a cruel lie to tell people that all they need to do is work hard. There
are plenty of people with dreams who work very hard indeed but get nothing, because the American economy is fundamentally skewed and
unfair. This rhetoric, about "hard work" being the only thing needed for the pursuit of prosperity, is an insult to every tomato-picker
and hotel cleaner in the country. It's a fact that those who work the hardest in this country, those come home from work exhausted and
who break their backs to feed their families, are almost always rewarded the least. Far from embarrassing Melania Trump and the GOP,
then, it should be deeply humiliating for Democrats that their rhetoric is so bloodless and hollow that it can easily be spoken word-for-word
in front of a gang of crazed racists. Instead of asking "why is Melania Trump using Michelle Obama's words?" we might think to ask "why
is Michelle Obama using the right-wing rhetoric of self-reliance?" ..."
"... This is, of course, the myth of "meritocracy" that Thomas Frank has exposed with scalpel-like precision in his latest book
Listen, Liberal . It's clear that the Democratic Party, at its core, believes with Michelle (and Barack) Obama the comfortable and self-serving
lie that no individual has anyone to blame but herself if she fails to achieve high goals. She should just have reached higher; she
should just have worked harder. ..."
"... It's not only a lie, it's a "cruel lie," as Nimni says. So why is she, Michelle Obama, telling it? Clearly it serves her interests,
her husband's interests, her party's interests, to tell the "rich person's lie," that his or her achievement came from his or her own
efforts. To call most people's success a product of luck (right color, right gender, right country, right neighborhood, right schools,
right set of un-birth-damaged brain cells) or worse, inheritance (right parents), identifies the fundamental unfairness of our supposed
"meritocratic" system of allocating wealth and undercuts the "goodness," if you look at it writ large, of predatory capitalism. By that
measure, neither the very wealthy themselves (Charles Koch, Jamie Dimon) nor those who serve them (Barack Obama et al.) are "good" in
any moral sense. ..."
"... U.S. cultural norms, as the piece describes accurately, glorify and misrepresent "work," especially of the "hard" kind. Hmm,
I wonder where that notion came from and why it gained such a foothold in the prevailing groupthink? ..."
"... The present regime of "teach to the test" here in America almost completely short circuits the teaching of critical thinking
skills. With stressed parents increasingly abdicating their responsibilities towards the upbringing of their offspring in favour of
the State, is it any wonder that the narrow interests of the State, such as the Iron Law of Institutions, are supplanting enlightenment
in the minds of the young? We now must begin to consider the divergence of the interests of the Society from the interests of the State.
With the balance of power swinging heavily in favour of the State these recent decades, I am not sanguine about the near term future
of our culture. ..."
"... As is so often the case in American culture, the "hard work" meme emerges from the slave system. Slaves had to be bullied and
terrorized in order to extract "hard work" from them, given that they had zero rewards of any tangible sort for it. So "hard work" required
constant vigilance and frequent punishments while slaves rationally attempted to do the least amount of work that enabled them to escape
the many types of tortures they were regularly threatened with. ..."
"... Then after "emancipation," plantation owners complained that they could not get any of those lazy, shiftless Negroes to perform
"hard work" for them, given that the newly freed men and women were much more interested in getting ahead for themselves than continuing
to pick cotton or harvest rice for starvation wages. ..."
"... I don't think you are over-simplifying, Clive; in Hong Kong, too, my experience has been that most people I deal with in the
work world take a great deal of intrinsic pride in doing a job efficiently and well, for its own sake, not because it will necessarily
make you more money. ..."
"... What I'm starting to sniff in the zeitgeist today is that Trump's kids are totally changing what people think of the father.
People are making the semi-rational assumption that anyone who can raise such good kids must be very different in private than he is
on the campaign trail. ..."
"... the genesis of the "plagiarism" attacks. The mud slinging has started early in this campaign. However, if Trumps' family can
exude some sense of charm and class, the entire mud slinging strategy can be 'stood on its' head.' ..."
"... Me, I'm terrified of Hillary Clinton and the devastation that her ascension to the Presidency might bring to this nation and
to the world. She is not only a liar, a blatantly self-dealing criminal, but more devastating yet, a sociopath of the first water, willing
to walk across the bodies to advance her personal and class agenda. ..."
"... Her time as President would go a long way toward cementing the Unitary Executive in place (i.e., a functional Dictator, as
understood in the Roman Republican meaning of the term, a Tribune, in which a chief magistrate of the State like the President under
our Constitution, whose writ as an authoritarian ruler ran so long as there was a national emergency. I serve as the clerk for government
documents in a university law library, and I can tell you that the number of House Documents announcing a "National Emergency" or the
continuation of a previously announced "National Emergency" is very alarming. These "emergencies" are the camel's nose under the tent
in my estimation for the slow accretion of Dictatorial powers (again, in the Roman Republican sense of the term "dictator") toward the
Caesar-like role of Unitary Executive. These "National Emergencies" functionally invest power into the hands of the President and those
forces military, legal, and regulatory under the control of the Executive by which the President can wage military, legal/diplomatic,
and economic warfare against those who refuse to bend the knee to US-dominated global hegemony. ..."
"... Hillary is practically salivating to grasp the rod of power embodied in the Unitary Executive. Warfare will follow her tenure
in office like a dire shadow, and due to her belief in the right of and necessity of the US to enforce a global hegemony, she is inevitably
moving toward a deadly clash with other nuclear powers unwilling to submit to the yoke of globalized, stateless, culturally-anodyne
finance capitalism. Good times await. ..."
"... "Our well-nigh useless Legislative branch has largely surrendered its Constitutional responsibilities to the Executive through
such trash as Authorizations of Military Force rather than engaging in the mandated procedure of the Declaration of War found in the
Constitution to authorize extended use of military (and legal and economic) force." ..."
"... That allows individuals to claim they had no responsibility for the war, something Pence and Clinton cannot claim because of
their votes. But on what other things do you see Obama as being a strong "unitary executive." I thought it was generally viewed that
Congress had thwarted his (almost) every wish. ..."
"... And how about that patriot act renewal, US out of iraq/afganistan? Vicky nuland and the ukraine? I guess the problem is that
you get your information as it is generally viewed, but you fail to indicate who it is that generally views things that way, however,
it should help you understand why trump will win because hillary is generally viewed as corrupt. ..."
"... I'm intrigued by author's concluding idea. "It involves another attempt to take over the Republican Party, this time by the
Clinton-led Democratic leadership. " ..."
"... And if the words were lies coming out of Obama's mouth, what are they coming out of Trump's mouth? ..."
"... "Far from embarrassing Melania Trump and the GOP, then, it should be deeply humiliating for Democrats that their rhetoric is
so bloodless and hollow that it can easily be spoken word-for-word in front of a gang of crazed racists. Instead of asking "why is Melania
Trump using Michelle Obama's words?" we might think to ask "why is Michelle Obama using the right-wing rhetoric of self-reliance?" ..."
"... A lot of this is related to the Democrats and what Bill made "successful" with his presidency. The lack of a truly left party
that works for average citizens has created this environment when a character like Trump can gain such support. This article illustrates
but another example of meritocratic nonsense being regurgitated by the party. ..."
"... A thought-provoking and unexpected take, Gaius Publius. I was struck by one item left off your list of lucky attributes: beauty.
Both Michelle Obama and Melania Trump are undeniably beautiful women: tall, slim, with the elegantly symmetrical features prized in
every culture. Sadly, in beauty-obsessed America the doors opened for women who look this lovely are shut hard against women who are
fat, or old, or ugly. ..."
"... I had pretty much the exact same thought as your second "blackbird" when the video of Melania Trump plagiarizing Michelle Obama's
speech came out and all my liberal friends were yukking it up. All I could think was "If the same speech could plausibly come out of either of
their mouths without alienating the audience, we have much worse problems than Mrs. Trump's copycatting." The fact that this seemed
to bother hardly anyone else made it worse. So much of these elections just get reduced down to rooting for your team at a sporting
event. This works well to keep people from having to deal with a lot of unpleasant questions and conclusions. ..."
"... Read Roosevelt's speech, Trump certainly did, for some real fear mongering and look at the coalition he has taken over the
Republican party to form. FDR 1932. ..."
"... > "another attempt to take over the Republican Party" Which shouldn't be that hard, since both the Democrat and Republican
parties are neoliberal. As always, the real enemy is the left. ..."
"... I'm surprised Gaius failed to address this portion of Michelle's speech which he quoted: "tell the truth; keep your promises;
treat others with dignity and respect." Since when has Obama told the truth, or kept his promises, or treated anyone except Jamie Dimon
and Lloyd Blankfein with respect? ..."
"... Put aside whether "Michelle Obama" or some speechwriting merc came up with the banal verbiage redolent of Sunday school and
Horatio Alger. What gives the snippet its special Trumpian turn into hyper-unreality, an ever-expanding balloon of hot boast and hyperbolic
deceit, is the way it transcends garden-variety plagiarism by laying claim to the very virtues that the appropriation itself falsifies.
..."
"... That's chutzpah! The stunning effrontery supersizes an overall meta-ness that's less indicative of middle-class morality and
meritocracy than the predatory opportunism of the exploitative rich, what C. Wright Mills might have recognized as the "higher immorality."
Here we have a colossally vain billionaire atop an empire of glitz and privilege kayfabing his way to a party nomination as the indignant
voice of the brutalized working class he's dedicated his life to disparaging as envious losers. The mind reels between giddiness and
nausea. ..."
"... You can't forever distract it away with Lifestyles of the Rich and Famous and color counter-revolutions against exploitative
freeloaders (the non-rich and famous ones, that is). It takes a philosophy of human worth apart from vanities over this or that temporarily
adaptive skill or happy accident. ..."
"... I think Oren Nimni basically gets it right: When you cut through the tautologies and the bromides that many parents deliver
to their children, what you have is the message, "don't expect government to be there for you; those days are over" (which, actually,
sounds like Bubba Bill's pitch: "the days of big government are over"). ..."
"... "If you work hard enough and have enough ambition you will succeed" is not a lie to those born on third base; it was true for them:
the Obamas, the Trumps. They are really just guilty of not understanding the plight of those who were born at bat against a major league
pitcher. ..."
Oren Nimni: Obama's statement "is an insult to every tomato-picker and hotel cleaner in the country"
The fact that Michelle Obama's statement is blatantly false (and that a woman of color in the United States said it) is revealing.
Current Affairs writer Oren Nimni
on that (emphasis in original):
If anything, the whole plagiarism scandal reflects somewhat poorly on Michelle Obama. One reason Obama's words were able
to play so well at the RNC was that in the lifted passages, Obama was speaking using the conservative language of "bootstrapping."
Obama's sentence, that "the only limit" to one's achievements is the height of one's goal and the "willingness to work" toward
it, is the Republican story about America. It's the story of personal responsibility, in which the U.S. is overflowing
with opportunity, and anyone who fails to succeed in such a land of abundance must simply not be trying hard enough.
People on the left are supposed to know that it is a cruel lie to tell people that all they need to do is work hard.
There are plenty of people with dreams who work very hard indeed but get nothing, because the American economy is fundamentally
skewed and unfair. This rhetoric, about "hard work" being the only thing needed for the pursuit of prosperity, is an insult to
every tomato-picker and hotel cleaner in the country. It's a fact that those who work the hardest in this country, those who come
home from work exhausted and who break their backs to feed their families, are almost always rewarded the least.
Far from embarrassing Melania Trump and the GOP, then, it should be deeply humiliating for Democrats that their rhetoric is
so bloodless and hollow that it can easily be spoken word-for-word in front of a gang of crazed racists. Instead of asking "why
is Melania Trump using Michelle Obama's words?" we might think to ask "why is Michelle Obama using the right-wing rhetoric of
self-reliance?"
This is, of course, the myth of
"meritocracy" that Thomas Frank has exposed with scalpel-like precision in his latest book Listen, Liberal . It's clear
that the Democratic Party, at its core, believes with Michelle (and Barack) Obama the comfortable and self-serving lie that no individual
has anyone to blame but herself if she fails to achieve high goals. She should just have reached higher; she should just have worked
harder.
It's not only a lie, it's a "cruel lie," as Nimni says. So why is she, Michelle Obama, telling it? Clearly it serves her interests,
her husband's interests, her party's interests, to tell the "rich person's lie," that his or her achievement came from his or her
own efforts. To call most people's success a product of luck (right color, right gender, right country, right neighborhood, right
schools, right set of un-birth-damaged brain cells) or worse, inheritance (right parents), identifies the fundamental unfairness
of our supposed "meritocratic" system of allocating wealth and undercuts the "goodness," if you look at it writ large, of predatory
capitalism. By that measure, neither the very wealthy themselves (Charles Koch, Jamie Dimon) nor those who serve them (Barack Obama
et al ) are "good" in any moral sense.
(The idea of the supposed "goodness" of the successful capitalist, by the way, his supposed "greater morality," goes all the way
back to the 18th Century attempt of the wealthy to counter the 17th Century bleakness of Protestant predestination. How could people,
especially the very rich, know whether they are among the "elect" or the damned?
God gives them wealth as a sign
of his plans for them, just as God gives them morally deficient poverty-wage workers to take advantage of.)
There's also a flip side to the main point drawn out in the above article ("if you work hard you'll be successful and rewarded")
which, dare I say, is rarely mentioned and even anathema in U.S. culture (not, mind you, that I think British culture isn't
going the same way, so I am not trying to throw stones in this glass house).
Which is: quite often, you are rewarded if you don't "work hard" and even if you work somewhat "hard" the rewards you receive
are out of all proportion to the effort you have to make. But no-one (or few people) are willing to admit, if they are in that
position, that - to put it crudely - they are really doing bugger all but raking it in.
I, for example, do very little. What I do do certainly isn't "hard work". Now, I have expended a certain amount of mental effort
on understanding the system - the dynamic - in play at my employer. And how to successfully exploit that to gain the maximum amount
of financial reward for the least amount of effort. But I would hardly call that "work", and it is certainly not of the "hard" variety.
U.S. cultural norms, as the piece describes accurately, glorify and misrepresent "work" especially of the "hard" kind.
Hmm I wonder where that notion came from and why it gained such a foothold in the prevailing groupthink?
In Japanese culture, to introduce another nuance, the concept of "hard work" is still present as a thing to be looked up to
but it is more tinged with an air of "doing your best" or "doing your utmost" rather than "hard" (i.e. demanding) work and lacks
the "you're going to get the payoff if you do" quid pro quo. The reward, in Japanese culture, comes from knowing you've done the
best you can which is more a personal satisfaction than a financial compensator. But I am glossing over some complexity here so
do not view what I've just said in this paragraph as anything other than a simplification.
May I suggest that the "simplification" you mention is an essential part of any group control strategy. Simplified thinking
may work wonders in efficiency studies or some sorts of high energy physics, but in the realm of social relations, simplicity
masks diversity and complexity to the detriment of any version of "truth." I was lucky in having skeptical parents and some excellent
minds among my High School teachers. The present regime of "teach to the test" here in America almost completely short circuits
the teaching of critical thinking skills. With stressed parents increasingly abdicating their responsibilities towards the upbringing
of their offspring in favour of the State, is it any wonder that the narrow interests of the State, such as the Iron Law of Institutions,
are supplanting enlightenment in the minds of the young? We now must begin to consider the divergence of the interests of the
Society from the interests of the State. With the balance of power swinging heavily in favour of the State these recent decades,
I am not sanguine about the near term future of our culture.
As is so often the case in American culture, the "hard work" meme emerges from the slave system. Slaves had to be bullied
and terrorized in order to extract "hard work" from them, given that they had zero rewards of any tangible sort for it. So "hard
work" required constant vigilance and frequent punishments while slaves rationally attempted to do the least amount of work that
enabled them to escape the many types of tortures they were regularly threatened with.
Then after "emancipation," plantation owners complained that they could not get any of those lazy, shiftless Negroes to
perform "hard work" for them, given that the newly freed men and women were much more interested in getting ahead for themselves
than continuing to pick cotton or harvest rice for starvation wages. Ever since, we have lived with the embittered voice
of the slaveowner infuriated at the loss of all that labor power he once had at his disposal for free. Thus the mythology that
"hard work" is all you need to perform to get ahead and the implicit wink-wink-we-know-who-won't-do-that racism that goes along
with it.
I don't think you are over-simplifying, Clive. In Hong Kong, too, my experience has been that most people I deal with in
the work world take a great deal of intrinsic pride in doing a job efficiently and well, for its own sake, not because it will
necessarily make you more money. (Although often that is the result: over-performing and exceeding expectations is a great
way of ensuring repeat customers and a thriving business.)
Coming from the US, where every corporate smile and "Have a Nice Day" is being recorded for performance review, I find this
a most refreshing cultural trait, one that I have tried my best to assimilate.
I would add to what Clive said that in Japan the ganbare ethos is also underlined by a certain expectation that your
wider social group will back you up, or at least make certain your life doesn't fall off a cliff. This doesn't always work in
practice, and there are obvious examples of social groups that the Japanese polity likes to pretend simply don't exist, but it
is a cultural expectation. You even see it among homeless camps in Japan, which constitute a very clear in-group.
In the US, a great deal of anxiety stems from the realization that you could do your best in all circumstances and still have your
life fall apart, since that social backstop just isn't there, especially not in the world of meritocracy, in which you're expected
to basically give up your pre-existing social networks in order to even participate.
I remember one job where my Boss warned me: "Nice guys finish last here."
Nice of him, eh?
Figure out the culture of your workplace, and if you can stomach it, do what you have to do to succeed. This is what the Obamas
and the Clintons have done. And geez, they can stomach a lot. But I do know people who have "worked hard" and been successful
in their own businesses, and musicians are a prime example of having to really do the work to get the work. It's who you want
to be recognized by, in my way of thinking.
I often think the better saying would be "Whom the gods would destroy, they first make outrageously successful."
With outrageous success (as in wildly disproportionate to effort and actual talent) comes a sense of infallibility, inevitability,
hubris. A self-centered personality-cult delusion, ergo a form of madness, which often ends in a spectacular undoing. Alas,
not nearly often enough when it comes to the DC cabal of hubristic upward-failing sociopaths.
GOP convention finished with a bang tonight, and thankfully the dire pre-convention worries about the streets of Cleveland
flowing with rivers of blood proved unfounded. I'd studiously avoided the previous evenings, aside from a few brief nauseating
while-channel-flipping glimpses, but happened to catch Trump himself tonight. While I disliked Trump's police-centric take on
American security at home, I thought he really effectively hammered the issues of economic inequality, including a mention of
soaring unemployment rates in the latino and black communities (I wish he would have said more in that vein, but he did at least
say something), and governmental corruption at the highest levels, as well as Hillary's multiple foreign-policy debacles; the whole
"what has 15 years of blowing shit up in the middle east done for us?" issue. He also made a very pronounced point of embracing Sanders'
"top issue" of bad so-called-free-trade deals, while emphasizing the degree to which things were rigged against Bernie. And he closed
with a nifty turning of Hillary's pet slogan against her [I paraphrase, too tired to dig the exact quote out]: "she demands a
three-word loyalty oath, 'I'm with her'; well I'm here to tell you tonight that I'm with you."
And the speeches by his kids (Donald Jr last night, Ivanka tonight) were both good, and I think likely surprising, in a positive
way, to many people. The image of the whole family onstage post-speech will likely resonate with the traditional Republican base:
clean-cut successful-looking guys and attractive ladies of a leggy-blond (but not Barbie-esque/ditzy) type. I expect even folks
of a conservative Mormon bent will have found something to like in that image.
Scott Adams comments on the kids
:
What I'm starting to sniff in the zeitgeist today is that Trump's kids are totally changing what people think of the
father. People are making the semi-rational assumption that anyone who can raise such good kids must be very different in private
than he is on the campaign trail.
Would be interested to hear the takes of other NC readers who watched the nomination acceptance speech.
Re "..a minority of one.." At least you go in for nuance and reflection. My take on H Clinton and her claque is that they all
perceive the Candidate as a 'majority of one.'
Your comment about the wife of Trump reminds me of the old saying by Caesar that "Caesar's wife must be above suspicion." Thus
the genesis of the "plagiarism" attacks. The mud slinging has started early in this campaign. However, if Trump's family can exude
some sense of charm and class, the entire mud slinging strategy can be 'stood on its head.'
Me, I'm terrified of Hillary Clinton and the devastation that her ascension to the Presidency might bring to this nation
and to the world. She is not only a liar, a blatantly self-dealing criminal, but more devastating yet, a sociopath of the first
water, willing to walk across the bodies to advance her personal and class agenda.
Her time as President would go a long way toward cementing the Unitary Executive in place (i.e., a functional Dictator,
as understood in the Roman Republican meaning of the term: a chief magistrate of the State, like the President
under our Constitution, whose writ as an authoritarian ruler ran so long as there was a national emergency). I serve as the clerk
for government documents in a university law library, and I can tell you that the number of House Documents announcing a "National
Emergency" or the continuation of a previously announced "National Emergency" is very alarming. These "emergencies" are the camel's
nose under the tent in my estimation for the slow accretion of Dictatorial powers (again, in the Roman Republican sense of the
term "dictator") toward the Caesar-like role of Unitary Executive. These "National Emergencies" functionally invest power into
the hands of the President and those forces military, legal, and regulatory under the control of the Executive by which the President
can wage military, legal/diplomatic, and economic warfare against those who refuse to bend the knee to US-dominated global hegemony.
Our well-nigh useless Legislative branch has largely surrendered its Constitutional responsibilities to the Executive through
such trash as Authorizations of Military Force rather than engaging in the mandated procedure of the Declaration of War found
in the Constitution to authorize extended use of military (and legal and economic) force. This gives the Executive carte blanche
to engage in unending wars (beginning to sound familiar?) with all that that implies concerning the dominance of the MIC in the
formulation of national policies.
Hillary is practically salivating to grasp the rod of power embodied in the Unitary Executive. Warfare will follow her
tenure in office like a dire shadow, and due to her belief in the right of and necessity of the US to enforce a global hegemony,
she is inevitably moving toward a deadly clash with other nuclear powers unwilling to submit to the yoke of globalized, stateless,
culturally-anodyne finance capitalism. Good times await.
And that is only the beginning, as the plans she has for the US citizenry are scarcely less dire, what with the inevitability
of the Grand Bargain in service of Finance Capitalism looming dead ahead.
"Our well-nigh useless Legislative branch has largely surrendered its Constitutional responsibilities to the Executive
through such trash as Authorizations of Military Force rather than engaging in the mandated procedure of the Declaration of
War found in the Constitution to authorize extended use of military (and legal and economic) force."
That allows individuals to claim they had no responsibility for the war, something Pence and Clinton cannot claim because
of their votes. But on what other things do you see Obama as being a strong "unitary executive." I thought it was generally viewed
that Congress had thwarted his (almost) every wish.
Indeed, the republicans twisted barack's arm behind his back and forced him to allow insurance company lobbyists to write the
"Affordable Care Act". Since you have tsa pre check I'll guess that your cadillac plan is still operational, or if not that that
all the people who pay for insurance they can't use are subsidising you, and your own health care costs have been ameliorated.
They also forced him to nominate merrick garland. They forced him to foam the runway for the banks and forced him to let all the
bankster crimes go unpunished. My view is that obama, like hillary, is a republican because for both of them the policies they
worked to advance are republican policies. TPP, ISDS, ACA, Edward Snowden, Chelsea Manning I could go on and on. I agree with
the author that dems like obama and hillary are interested in serving the top sliver of the population that has the lions share
of the wealth. It's not right/left anymore, it's top/bottom .
And how about that patriot act renewal, US out of iraq/afghanistan? Vicky nuland and the ukraine? I guess the problem is
that you get your information as it is generally viewed, but you fail to indicate who it is that generally views things that way.
However, it should help you understand why trump will win, because hillary is generally viewed as corrupt.
I think both Obama and Trump were reciting a standard variation of the American Dream. Horatio Alger stories are part of the
US mythos. Bill Clinton used a variation in 1992. Most US pols use the "up from nothing by dint of hard work and good morals"
line. The flap is that O and T used the exact same words instead of noting that the sentiment itself is boilerplate?
I'm intrigued by the author's concluding idea. "It involves another attempt to take over the Republican Party, this time
by the Clinton-led Democratic leadership. "
" And if the words were lies coming out of Obama's mouth, what are they coming out of Trump's mouth? "
They are still lies, but they are lies in keeping with the ideology that dominates the party of which Trump is the nominee.
Nimni summarized this well:
"Far from embarrassing Melania Trump and the GOP, then, it should be deeply humiliating for Democrats that their rhetoric
is so bloodless and hollow that it can easily be spoken word-for-word in front of a gang of crazed racists. Instead of asking
"why is Melania Trump using Michelle Obama's words?" we might think to ask "why is Michelle Obama using the right-wing rhetoric
of self-reliance?"
A lot of this is related to the Democrats and what Bill made "successful" with his presidency. The lack of a truly left
party that works for average citizens has created this environment when a character like Trump can gain such support. This article
illustrates but another example of meritocratic nonsense being regurgitated by the party.
I think she initially claimed she wrote it, didn't she? But yea it's clearly silly coming out of her mouth. Although being a
model may be hard work (it could very well be frankly), she hasn't worked hard for years by now, and didn't get into such a privileged
position by hard work (in whose definition exactly does marrying money count as hard work?).
So while in Michelle Obama's mouth the words are a lie, at least they might be a lie that's kind of true for her; in Mrs.
Trump's mouth it's beyond silly. I have no idea if Mr Inherited Wealth and Mrs Married Money do raise their kids that way or
not. Wow the rich are crazy!!!
A thought-provoking and unexpected take, Gaius Publius. I was struck by one item left off your list of lucky attributes:
beauty. Both Michelle Obama and Melania Trump are undeniably beautiful women: tall, slim, with the elegantly symmetrical features
prized in every culture. Sadly, in beauty-obsessed America the doors opened for women who look this lovely are shut hard against
women who are fat, or old, or ugly.
There is a strong correlation between height and compensation. "When it comes to height, every inch counts. In fact, in the
workplace, each inch above average may be worth $789 more per year, according to a study in the Journal of Applied Psychology
(Vol. 89, No. 3).
The findings suggest that someone who is 6 feet tall earns, on average, nearly $166,000 more during a 30-year career than someone
who is 5 feet 5 inches, even when controlling for gender, age and weight." http://www.apa.org/monitor/julaug04/standing.aspx
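The quoted figures are internally consistent; a quick back-of-the-envelope check, taking the 7-inch gap between 6'0" and 5'5" at the study's $789 per inch per year over a 30-year career:

```python
# Sanity-check of the APA-cited height/pay figures.
per_inch_per_year = 789          # dollars per inch above average, per year
height_gap_inches = 72 - 65      # 6'0" (72 in) vs 5'5" (65 in) = 7 inches
career_years = 30

career_gap = per_inch_per_year * height_gap_inches * career_years
print(career_gap)  # 165690, i.e. "nearly $166,000"
```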
That apparently is not true. One study of lawyers "found that those rated attractive on the basis of their graduation photographs
went on to earn higher salaries than their less well-favoured colleagues. Moreover, lawyers in private practice tended to be better
looking than those working in government departments." Even among economists, beauty pays and "attractive candidates were more
successful in elections for office in the American Economic Association."
http://www.economist.com/node/10311266
I had pretty much the exact same thought as your second "blackbird" when the video of Melania Trump plagiarizing Michelle
Obama's speech came out and all my liberal friends were yukking it up. All I could think was "If the same speech could plausibly come out
of either of their mouths without alienating the audience, we have much worse problems than Mrs. Trump's copycatting." The
fact that this seemed to bother hardly anyone else made it worse. So much of these elections just get reduced down to rooting
for your team at a sporting event. This works well to keep people from having to deal with a lot of unpleasant questions and conclusions.
Or worse still, they've become so used to neoliberal platitudes like "pulling yourself up by your bootstraps" that it's become
"common sense", or they don't even recognize it as such.
In case you missed it, T's entire speech was about the "forgotten man," those that work hard and still cannot make a living
wage. The height of their dreams counts for nothing. The system is rigged. Read Roosevelt's speech (Trump certainly did) for
some real fear mongering, and look at the coalition he has taken over the Republican party to form. FDR 1932.
> "another attempt to take over the Republican Party" Which shouldn't be that hard, since both the Democrat and Republican
parties are neoliberal. As always, the real enemy is the left.
I'm surprised Gaius failed to address this portion of Michelle's speech which he quoted: "tell the truth; keep your promises;
treat others with dignity and respect."
Since when has Obama told the truth, or kept his promises, or treated anyone except Jamie Dimon and Lloyd Blankfein with respect?
you work hard for what you want in life, that your word is your bond and you do what you say and keep your promise
Put aside whether "Michelle Obama" or some speechwriting merc came up with the banal verbiage redolent of Sunday school
and Horatio Alger. What gives the snippet its special Trumpian turn into hyper-unreality, an ever-expanding balloon of hot boast
and hyperbolic deceit, is the way it transcends garden-variety plagiarism by laying claim to the very virtues that the appropriation
itself falsifies.
That's chutzpah! The stunning effrontery supersizes an overall meta-ness that's less indicative of middle-class morality
and meritocracy than the predatory opportunism of the exploitative rich, what C. Wright Mills might have recognized as the "higher
immorality." Here we have a colossally vain billionaire atop an empire of glitz and privilege kayfabing his way to a party nomination
as the indignant voice of the brutalized working class he's dedicated his life to disparaging as envious losers. The mind reels
between giddiness and nausea.
What then exists outside the genteel social Darwinism of meritocratic ideology and further descent into a society of the spectacle,
the Reaganite sitcom devolved into the Trump unreality show? To the gnomic, sidelong mysticism of Stevens let's add the frontal
transvaluation of a sardonic Shaw:
What am I, Governors both? I ask you, what am I? I'm one of the undeserving poor: that's what I am. Think of what that means
to a man. It means that he's up agen middle class morality all the time. If there's anything going, and I put in for a bit
of it, it's always the same story: 'You're undeserving; so you can't have it.' But my needs is as great as the most deserving
widow's that ever got money out of six different charities in one week for the death of the same husband. I don't need less
than a deserving man: I need more. I don't eat less hearty than him; and I drink a lot more. I want a bit of amusement, cause
I'm a thinking man. I want cheerfulness and a song and a band when I feel low. Well, they charge me just the same for everything
as they charge the deserving. What is middle class morality? Just an excuse for never giving me anything.
Governors both, Democrats and Republicans, the meritocrats and the masters. What must be taken in is that the unskilled, the
uneducated, the out of step, the unlucky, all need the means to live. If that's taken from them by the self-described deserving
on the Acela and the higher immoralists in their towers and Gulfstreams, a democracy will begin to wobble like a spinning coin
on the verge. You can't educate that away. You can't forever distract it away with Lifestyles of the Rich and Famous and color
counter-revolutions against exploitative freeloaders (the non-rich and famous ones, that is). It takes a philosophy of human
worth apart from vanities over this or that temporarily adaptive skill or happy accident.
When the market is the be-all and end-all, an expression of natural law and supernatural giver of meaning, it's hard to see how
even a managed, minimal democracy can prevail except as grotesque, corrupt parody, a mood traced in the shadow, an indecipherable
cause. Or did I read something like that somewhere, like in a poem?
I don't recall Ms. Obama's speech, but the excerpts I heard during the recent news cycle (from both speeches) were pathetic.
I think Oren Nimni basically gets it right: When you cut through the tautologies and the bromides that many parents deliver
to their children, what you have is the message, "don't expect government to be there for you; those days are over" (which, actually,
sounds like Bubba Bill's pitch: "the days of big government are over").
"If you work hard enough and have enough ambition you will succeed" is not a lie to those born on third base; it was true for
them: the Obamas, the Trumps. They are really just guilty of not understanding the plight of those who were born at bat against
a major league pitcher.
"... By the time we had three young children, I was rarely home. ..."
"... After Cisco bought IronPort, I went to work for Cisco for a few years, then quit and took about 18 months off. During that time, my relationship with my family completely changed. I was packing lunches, driving carpools, making dinners; I began doing my part. With the help of my wife and other role-model dads, I essentially got re-programmed. In 2011, I joined Andreessen Horowitz as a partner. But my new role at home has continued to work for us even though I'm working full-time again. ..."
"... Scott Weiss is a general partner at Andreessen Horowitz. You can follow him on Twitter @W_ScottWeiss. This piece originally appeared on Medium. ..."
My brightest years running a startup were the darkest ones for my family.
My wife and I were college sweethearts. We delayed having children first by choice, then by necessity,
as we put ourselves through business school. But nearly six years into our marriage, we agreed it
was time. My wife and I both worked at startups and were committed to our careers; we expected that
we would both pursue our careers and raise our first child at the same time. To facilitate that,
we found an amazing, energetic, full-time nanny. In fact, my wife went back to work just two weeks
after our first child was born, because the startup she was with was approaching an IPO, and our
new nanny supported us through that period. When my wife became pregnant with our second child, I
was a managing partner at Idealab, a startup studio, where a large part of my job involved shutting
down companies that had been hurt by the dot-com bust. I planned to take some time off and stay at
home while my wife went back to work six months after the birth, but by the time our second child
was born, 22 months later, a lot had changed. Disenchanted with my work and eager to build something
of my own, I had decided to start a company. As we brought our daughter home from the hospital, I
had already launched into fundraising for what would be IronPort, an email-security startup.
It was just my co-founder and me in the beginning, and while we had an ambitious vision - to protect
enterprises against all Internet-related threats - we didn't yet have the resources to scale it.
So initially, we did everything by ourselves. The life one signs up for at an early-stage tech startup
involves getting in early, killing yourself to make something great, and getting a meaningful product
out before you run out of money. This was true even after we started hiring people. I didn't code,
but as the CEO, I felt it necessary to be physically present with the engineering team. Sometimes,
I would get everyone lunch or dinner. When we started pulling consistent coding weekends, we brought
in the entire management team to serve the engineers: We brought them food, washed their cars, got
oil changes, took in their dry cleaning, and arranged for childcare for their kids in the office.
Thanks to all the effort, IronPort ultimately grew to be very large and successful over its seven
years as an independent company, before being acquired by Cisco. It was an incredible, once-in-a-lifetime
professional experience. But those brightest
years at work were without a doubt the darkest years at home. We had added baby number three just
18 months after the second one, which had forced us to make a decision about how to parent going
forward. We did the math - and some soul-searching - and figured it would take two or more nannies
and other staffers to allow us to keep pursuing work at our current pace. So, after years of working
full-time in a startup with our first child, and continuing to work as its senior VP of business
development after its IPO, my Harvard MBA wife, who had had an amazing career in her own right, "decided"
to become a full-time mom and take care of our kids.
By the time we had three young children, I was rarely home. And when I was there, well, let's
just say I wasn't particularly helpful or cheery. My perspective at the time was: I'm killing myself
at work, so when I get home, I just want to kick back with a cocktail and watch some TV. All I do
is talk to people all day long, and so at home, I'd really prefer just quietly relaxing. Then, as IronPort grew, I was constantly on the road with customers, press, analysts, and of course, employees.
We ultimately got most of our revenue from outside of the U.S., and we all felt it to be very important
to support our disparate offices from Europe to Asia to South America. There were several times when
I was gone more than half of the days in a given month. Even when I was home, I was usually in this
brutal state of sleep deprivation and recovery from adjusting to yet another time zone and couldn't
be relied on to help with childcare.
My wife's experience was totally different. She was now home speaking in monosyllabic words to
kids all day and was starving for adult conversation by the time I got in the door. And that part
about sitting on my ass in front of the TV with a cocktail? This ran counter to all of her efforts
to teach the kids about pitching in together as a family. The message of everyone helping to cook,
clean, and be responsible for the household fell completely flat when daddy wouldn't so much as take
out the trash or change a light bulb. Nope, I was far too important for that and suggested she should
hire someone to keep the house clean or even cook, if that was "stressing her out."
Ugh. I was completely missing the point. I was setting such a great example at work, but such
a terrible one at home, where I often acted like a self-important asshole. Something had to change.
After Cisco bought IronPort, I went to work for Cisco for a few years, then quit and took about
18 months off. During that time, my relationship with my family completely changed. I was packing
lunches, driving carpools, making dinners; I began doing my part. With the help of my wife and other
role-model dads, I essentially got re-programmed. In 2011, I joined Andreessen Horowitz as a partner.
But my new role at home has continued to work for us even though I'm working full-time again.
My wife and I have now been married for 22 years. Reflecting on the years we've spent as parents,
here are the most critical things I needed to change:
Disconnect to Connect
During the IronPort days, when my children were young, I thought what I was doing at work was
far more important and urgent than what was going on at home. I was often accused of being physically
present without being mentally present. (If you find yourself sneaking into the bathroom to complete
emails, then you're certainly not in the moment.) My wife dropped a bunch of hints, but I was undeterred.
When I left IronPort, I realized that committing to my family required disconnecting from work (e.g.
turning off the computer and phone), and completely focusing all of my attention on the details of
the home. Cooking a great meal. Helping with a science project. Discussing the future with my partner.
Planning and Priorities
My wife and I have a weekly date night. My son and I are in a fantasy football league together.
I cook with my daughters. For the most part, these are immovable appointments on my calendar.
When my calendar reflects that I can't do a meeting on Wednesday and Friday mornings before
9 a.m., because I cook breakfast and drive a carpool, then it's amazing how meetings just don't get
scheduled. (If it's at all possible, living physically close to the office is also a huge
help to juggling the priorities. It means that I can cut out for a family dinner and then go back
to the office or have a late meeting afterwards.)
Communicate
When I was traveling at IronPort, I would sometimes go for days without communicating at all.
When friends would ask my wife, "Hey, where's Scott this week?" she would sincerely have to answer,
"I have no idea, you'll have to email him yourself." I was that sucked in. Now that I am completely
tuned in to the weekly family schedule, we plan and calendar family meals (perhaps the single most
important thing we do), pickups and drop-offs, and make adjustments on the fly. For example: Did
some time suddenly free up so I can catch the last 30 minutes of the kids' basketball game? Can I
pick something up on the way home? And so on. My norm is to check in between meetings, but if I'm
the "parent on duty" - i.e., if my wife is out of town - then I will start a meeting with, "You'll
have to excuse me, but I'm the only parent in town so I need to keep my phone handy in case of an
issue." Communication was by far my biggest area for improvement. Now, multiple, daily phone and
text check-ins are the norm. Communication is important in a broader sense, too. I believe that families - and
that includes everyone - need to discuss each parent's life-changing decisions, such as joining a
startup or becoming a CEO, together. And they should reserve the right to change the contract as
their life together evolves.
Participate
It's just not possible to be a real partner if you aren't deeply involved in all aspects of the
family; you can't just ask your partner to delegate certain tasks to you. Or maybe you can, but then
it needs to be a mutual, shared decision - one that honors your partner's choices and dreams, too.
But I personally believe that even the busiest CEOs should drive a carpool, pack a lunch, help with
homework, make a breakfast or dinner, and consistently attend school events. And note, my wife didn't
need another person to "manage" in the household; she needed me to "own" some of our family life
activities myself. Being involved every week is the only way to stay connected at home, and it cannot
be outsourced. It might even make you a better CEO since you're more sensitive to the needs of others.
There's a debate that rages in the corridors of VCs, startups, and other intense entrepreneurial
centers, which is: Can you have it all? Aren't the best CEOs and founders so ambitious, so driven,
that they must sacrifice everything to make it work? We have seen couples struggle with this on a
personal level, and there is almost always an imbalance that leads to deep sacrifices on one front
or the other.
What historically has been somewhat unique to Silicon Valley is the age and experience level of
CEOs; that role is often achieved a decade earlier than in traditional industries. I've observed
that CEOs in their 20s may be fully equipped and knowledgeable enough to handle leading a company,
but when their family life begins to expand and demand for their attention increases, they are at
a loss as to why things aren't just falling into place. The changes that I've described ideally should
be made before you get to that point.
It's easy for me to share this advice now - after I sold my company. The reality is that
it took certain sacrifices, in terms of my family life, to make IronPort a success. Still, I'm hopeful.
I'm hopeful that the new generation, having grown up with more permeable boundaries between work
and home, and being used to new technologies to keep them even more connected in ways we couldn't
be before, refuses to accept a world in which one can't have it all.
Scott Weiss is a general partner at Andreessen Horowitz. You can follow him on Twitter
@W_ScottWeiss. This piece originally appeared on Medium.
I'm a very high achiever. I know this. I am obsessive,
I am overly ambitious, and I am definitely out of balance at times (something I'm working on), but
that's just how I operate.
The rules I live by are strict but that's because they have to be.
As a result, I'm criticized a lot by people who aren't "high achievers." But that comes with the
territory. And as a result, I achieve what I set out to achieve.
Here is my mindset.
1) My Time Is Gold
Time is the only thing I have. Time is what creates my writing. Time is what makes me money. Time
is what allows me to eat, sleep, read, learn. Time is my most precious resource.
When deciding where to invest my time, I am extremely greedy. I have to be. I give my close, close
friends the time they deserve because I value our relationship. Casual friends and acquaintances
I give extra time I have to, when I can. Anyone else, I weigh the investment versus the return and
go from there. It might not be "normal" but it's required to reach the levels of success I know I
want for myself.
2) I Set Goals and I Reach Them
When I set a goal, I put a date to it. I tell myself when I'm going to have it done by. If I don't
have it done by then, I better have a good reason for not doing it. If I don't have a good reason,
I set another date and push myself harder to reach it. I do the same thing even if I had a good reason
in the first place.
The difference between those who "achieve" and those who don't is the follow through. It's the
ability to set a goal and walk through the finish line.
3) I See Every Decision As Crucial
Every decision I make has an effect. What time I go to bed, how much time I spend reading or writing,
how much time I spend with my friends, etc. Everything I do, every choice I make, I ask myself whether
or not it is moving me closer towards my goal. Will this burger make me feel sick and will I waste
an hour feeling groggy later? Yes? Ok, I don't eat it. Will going out late tonight keep
me from waking up early to write? Yes? Ok, I don't go out. Every single decision has to, in some way,
be contributing to my growth. Am I perfect? Am I 100% consistent? No. But I'd say I'm somewhere around
80-90%. And that percentage over a long period of time is insanely, profoundly, immeasurably valuable.
4) I Learn Something From Everyone
Every single person I meet, I try to learn something from. Whether it's a CEO of a major company
or a random person next to me on the train, I believe we all cross paths for a reason and there is
a lesson everywhere you turn. By seeing life this way you are always open to the process. Every moment
is an opportunity to grow. And the more moments you string together, the faster you learn, the more
you grow, and the better you become at everything you do.
5) I Invest In Skills, Not In Rewards
I can play classical piano. I can beatbox. I can write stories. I can sing. I can produce music.
I can rap. I can write songs. I can take pretty good pictures. I can lift weights with top athletes.
I can cook. I can do a lot of things. I don't say this to brag, I say this to point out the fact
that I am not a prodigy, I am not a genius, I am NOT ANY MORE GIFTED THAN YOU. The only difference
is that instead of spending my Friday nights going to clubs and getting drunk, my Saturday nights
hanging out at bars, my Sundays at brunch sipping mimosas, instead of being super social and Mr.
On-the-Town, I work. I work really hard. And to me it's not even work, it's fun. I'd rather learn
a new skill than get drunk. I'd rather socialize with people who I can learn from rather than having
the same repetitive conversations with inebriated acquaintances. And it's sad how this mentality
is seen as "above" other people. That's just part of the gig. People don't like it when you get good
at stuff. People want you to be lazy like them. Fuck that.
6) I Surround Myself With Likeminded People
There are people out there who live life like me. There are people who want to learn more than
they want to get rich. There are people who want to build something of their own more than they want
to climb their way up the corporate ladder. There are people out there like you, you just have to
find them. And once you do find them, become friends and help each other. Once a week I meet up with
a few entrepreneurs I know and we exchange ideas, set new goals, and hold each other accountable.
Once a week I also meet up with an artist group from my college and we help each other stay grounded,
meditate, and share our art. These sorts of groups of peers are beyond valuable. They will help you
remember what you're working toward.
7) I Read, A Lot
I read #ABookAWeek, minimum. On my website, I share which book I read last week and allow people
to sign up for my weekly newsletter:
www.nicolascole.com/blog.
I know you can learn without reading. I know that experience is immensely valuable. But if you're
not reading you're not learning fast enough, and that's just the truth of it. When someone asks me
what I'm reading, I say, "What genre?" I alternate between self-development how-to books, timeless
fiction, books on spirituality and meditation, books on the creative process, nonfiction memoirs,
and books on marketing and advertising.
Pick up a book. Now.
8) I Know The Value Of A Mentor
I write about mentorship a lot because I believe it is the single most effective way to learn,
period.
When I find a mentor, I give them everything. I throw everything I think I know out the window
and I allow myself to be completely open to what they have to teach. I work harder than they expect
me to work. I ask a million questions. I spend as much time around them as possible because I know
how rare and valuable a mentor can be.
Since I was 15 years old, I've had some sort of mentor in my life. To show you how crucial mentors
are, here's what happened:
15-18: Gaming Mentor. I sought out and played with one of the best World of Warcraft players
I could find. As a result, we became best friends and I went on to become one of the highest ranked
World of Warcraft players in North America.
19-22: Lifting Mentor: I became friends with a powerlifter at my gym. He took me under
his wing and taught me everything. We became great friends (still friends today) and he helped me
gain 40 lbs of muscle and lift more weight than I ever thought was possible for a once-skinny-kid
like me.
23-Present: My current mentor is also my boss -- a successful entrepreneur and marketing master.
He hasn't just taught me about business, he's taught me how to be my own man. He's taught me how
to carry myself, how to dress, how to handle clients, how to pitch clients, how to explain my creative
ideas, how to stand up for what I believe in, and how to be willing to pursue ideas that other people
would call "impossible."
9) I Care About What I Create
This might be the most important differentiating factor in being a high achiever: I care. I care
a lot. I care about what I create, I care about the difference I make, I care about helping people, I care
about helping others learn. I care, and as a result, I take things personally. I care if someone
doesn't like what I make. I care about what people think. I do. It doesn't deter me from what I want
to do, but I do care. And because I care, I put my everything into what it is I do.
People that don't care, go nowhere. And do you know why most people don't care? Because
it's hard. It leaves you vulnerable. It is a chink in the armor where people can point and aim and
say, "Hah, you care." Especially as a man, we're told not to care. And a lot of people don't care
out of fear that what they DO care about will make them look naive. What if other people don't care
about what you care about? How weird will you look then?
If you want to achieve, if you want to become successful -- use whatever words you want -- if you want
to reach something that is slightly out of your grasp, you have to care. You have to care a lot.
You have to allow yourself to feel all those emotions: excitement, fear, ambition, vulnerability.
And you have to use what you feel to propel you to create, create, create.
Nicolas Cole is an artist, writer,
creative marketing strategist and self development coach. He's also a Quora contributor.
New submitter
mirandakatz writes: Katie Hafner
has spent the last 23 days in rehab. Not for alcoholism or gambling, but for a
self-inflicted case of episodic partial attention thanks to her iPhone. On Backchannel, Hafner
writes about the detrimental effect the constant stream of pings has had on her, and how her life
has come to resemble a computer screen. "I sense a constant agitation when I'm doing something,"
she says, "as if there is something else out there, beckoning -- demanding -- my attention. And nothing
needs to be deferred."
"I blame electronics for my affliction," writes Hafner, who says the devices in her life
"teem with squirrels." "If I pick up my iPhone to send a text, damned if I don't get knocked off
task within a couple of seconds by an alert about Trump's latest tweet. And my guess is that if you
have allowed your mind to be as tyrannized by the demands of your devices as I have, you too suffer
to some degree from this condition." Hafner goes on to describe her symptoms of "episodic partial
attention" and provide potential fixes for it: "There are the obvious fixes. Address the electronics
first: Silence the phone as well as all alerts on your computer, and you automatically banish two
squirrels. But how do you shut down the micro-distractions that dangle everywhere in your physical
world, their bushy gray tails twitching seductively? My therapy, of my own devising, consists of
serial mono-tasking with a big dose of mindful intent, or intentional mindfulness -- which is really
just good, old-fashioned paying attention. At first, I took the tiniest of steps.
I celebrated the
buttoning of a blouse without stopping to apply the hand cream I spotted on the dresser as if I had
gotten into Harvard. Each task I took on -- however mundane -- I had to first announce, quietly,
to myself. I made myself vow that I would work on that task and only that task until it was finished.
Like a stroke patient relearning how to move an arm, I told myself not that I was making the entire
bed (too overwhelming), but that I had a series of steps to perform: first the top sheet, then the
blankets, then the comforter, then the pillows. Emptying the dishwasher became my Waterloo. Putting
dishes away takes time, and it's tedious. Perhaps the greatest challenge lies in the fact that the
job requires repeated kitchen crossings. There are squirrels everywhere, none more treacherous than
the siren song that is my iPhone."
(newscientist.com)
Posted by BeauHD on Thursday
September 07, 2017 @09:00AM from the creative-juices dept. An anonymous reader quotes a report from
New Scientist: Need inspiration? Happy background music can help get the creative juices flowing.
Simone Ritter, at Radboud University in the Netherlands, and Sam Ferguson, at the University of Technology
in Sydney, Australia, have been studying the effect of silence and different types of music on how
we think. They put 155 volunteers into five groups. Four of these were each given a type of music
to listen to while undergoing a series of tests, while the fifth group did the tests in silence.
The tests were designed to gauge two types of thinking: divergent thinking, which describes the process
of generating new ideas, and convergent thinking, which is how we find the best solutions for a problem.
Ritter and Ferguson found that people were more creative when listening to music they thought was
positive, coming up with more unique ideas than the people who worked in silence. However, happy
music -- in this instance, Antonio Vivaldi's Spring -- only boosted divergent thinking. No type of
music helped convergent thinking, suggesting that it's better to solve problems in silence. The study
was published in the journal PLoS One.
"... The spike in reported burnout is directly attributable to loss of control over work, increased performance measurement (quality, cost, patient experience), the increasing complexity of medical care, the implementation of electronic health records (EHRs), and profound inefficiencies in the practice environment, all of which have altered work flows and patient interactions. ..."
"... The rest of the items seem more plausible. However absent from the post is consideration of why physicians lost control over work, have been subject to performance measurement (often without good evidence that it improves performance, and particularly patients' outcomes), and have been forced to use often badly designed, poorly implemented EHRs ..."
"... In fact, we began the project that led to the establishment of Health Care Renewal because of our general perception that physician angst was worsening (in the first few years of the 21st century), and that no one was seriously addressing its causes. Our first crude qualitative research(8) suggested hypotheses that physicians' angst was due to perceived threats to their core values, and that these threats arose from the issues this blog discusses: concentration and abuse of power, leadership that is ill-informed , uncaring about or hostile to the values of health care professionals, incompetent, deceptive or dishonest, self-interested , conflicted , or outright corrupt , and governance that lacks accountability , and transparency . ..."
"... We have found hundreds of cases and anecdotes supporting this viewpoint. ..."
"... However, the biggest cause of physicians' loss of control over work may be the rising power of large health care organizations, in particular the large hospital systems that now increasingly employ physicians, turning them into corporate physicians . ..."
"... We have also frequently posted about what we have called generic management , the manager's coup d'etat , and mission-hostile management. Managerialism wraps these concepts up into a single package. The idea is that all organizations, including health care organizations, ought to be run people with generic management training and background, not necessarily by people with specific backgrounds or training in the organizations' areas of operation. Thus, for example, hospitals ought to be run by MBAs, not doctors, nurses, or public health experts. Furthermore, all organizations ought to be run according to the same basic principles of business management. These principles in turn ought to be based on current neoliberal dogma , with the prime directive that short-term revenue is the primary goal. ..."
Here is what the blog post said about the causes of burnout:
The spike in reported burnout is directly attributable to loss of control over work, increased
performance measurement (quality, cost, patient experience), the increasing complexity of medical
care, the implementation of electronic health records (EHRs), and profound inefficiencies in the
practice environment, all of which have altered work flows and patient interactions.
We dealt with the curious citation of inefficiencies as a cause of burnout above.
The rest of the items seem more plausible. However absent from the post is consideration of
why physicians lost control over work, have been subject to performance measurement (often
without good evidence that it improves performance, and particularly patients' outcomes), and have
been forced to use often badly designed, poorly implemented EHRs . Particularly absent was any
consideration of whether the nature or actions of large organizations, such as those led by the authors
of the blog post, could have had anything to do with physician burnout.
Contrast this discussion with how we on
Health Care Renewal have discussed burnout
in the past. In 2012, we
noted the first report on burnout by Shanafelt et al(2). At that time we observed that the already
voluminous literature on burnout often did not attend to the external forces and influences on physicians
that are likely to be producing burnout. Instead, burnout has been addressed as if it were a lack
of resilience, or even some sort of psychiatric disease of physicians.
In fact, we began the project that led to the establishment of
Health Care Renewal because of our general
perception that physician angst was worsening (in the first few years of the 21st century), and that
no one was seriously addressing its causes. Our first crude qualitative research(8) suggested hypotheses
that physicians' angst was due to perceived threats to their core values, and that these threats
arose from the issues this blog discusses: concentration and abuse of power; leadership that is
ill-informed, uncaring about or hostile to the values of health care professionals, incompetent,
deceptive or dishonest, self-interested, conflicted, or outright corrupt; and governance that lacks
accountability and transparency.
We have found hundreds of cases and anecdotes supporting this viewpoint.
... ... ...
Finally, the Health Affairs post mention of "loss of control over work" deserves special attention.
It could represent a catch-all of more "system factors" as noted above. However, the biggest
cause of physicians' loss of control over work may be the rising power of large health care organizations,
in particular the large hospital systems that now increasingly employ physicians, turning them into
corporate physicians.
In the US, home of the most commercialized health care system among developed countries, physicians
increasingly practice as employees of large organizations, usually hospitals and hospital systems,
sometimes for-profit corporations. The leaders of such systems, meanwhile, are now often generic
managers, people trained as managers without specific training or experience in medicine or health
care, and "managerialists" who apply generic management theory and dogma to medicine and health care
just as it might be applied to building widgets or selling soap.
We have also frequently posted about what we have called generic management, the manager's coup
d'etat, and mission-hostile management. Managerialism wraps these concepts up into a single package.
The idea is that all organizations, including health care organizations, ought to be run by people
with generic management training and background, not necessarily by people with specific backgrounds
or training in the organizations' areas of operation. Thus, for example, hospitals ought to be run
by MBAs, not doctors, nurses, or public health experts. Furthermore, all organizations ought to be
run according to the same basic principles of business management. These principles in turn ought
to be based on current neoliberal dogma, with the prime directive that short-term revenue is the
primary goal.
... ... ...
Summary
I am glad that physician burnout is getting less anechoic. However, in my humble opinion, the
last thing physicians at risk of or suffering burnout need is a top down diktat from CEOs of large
health care organizations. The CEOs who wrote the Health Affairs post may not have any personal responsibility
for any physicians' burnout. However, the transformation of medical practice by the influence of
large health care organizations run by the authors' fellow CEOs, particularly huge hospital systems,
often resulting in physicians practicing as hired employees of such corporations likely is a major
cause of burnout. If the leaders of such large organizations really want to reduce burnout, they
should first listen to their own physicians. But this might lead them to realize that reducing burnout
might require them to divest themselves of considerable authority, power, and hence remuneration.
True health care reform in this sphere will require the breakup of concentrations of power, and the
transformation of leadership to make it well-informed, supportive of and willing to be accountable
for the health care mission, honest and unconflicted.
Physicians need to join up with other health care professionals and concerned members of the public
to push for such reform, which may seem radical in our current era. Such reform may be made more
difficult because it clearly would threaten the financial status of some people who have gotten very
rich from the status quo, and can use their wealth and power to resist reform.
There are far too many to-do list apps to pick the perfect one. They're each so similar, yet different,
and they'd all take time to set up and learn to use. You already have too much to do, so why take
the time to learn a new to-do list app just to keep up with everything you have to do?
The simplest way to keep track of your tasks is to write them down on a piece of paper. You can
list them in the way that makes sense to you, with any extra info you want, and only have to carry
the paper around to keep track of what you need to do. It's simple, cheap, and just makes sense.
But perhaps you'd rather keep a digital to-do list, so it'll be on all your devices and you won't
have to worry about accidentally throwing it away. You just need a solution that's as simple as plain
paper and ink.
Enter Todo.txt. It's a system for keeping
track of your to-dos in a plain text file, and is the closest digital equivalent to keeping track
of your tasks on paper. In this tutorial, I'll show you how to use Todo.txt to replace those paper
lists and still ensure everything gets done.
What Is Todo.txt?
Todo.txt is a framework of guidelines through which a simple text file can become a feature-rich
to-do list. Instead of just writing your tasks in a list at random, its simple rules will help you
avoid creating a mess of tasks, and will make that plain text file into something much more useful
and interesting. That might sound confusing, but it's actually simple. Here's how it works:
The first rule in Todo.txt is that each to-do item is its own line in your text file. New to-do
item, new line. So let's give that a try. Open your favorite text editor (or just use Notepad on
a PC or TextEdit on a Mac), and type in some tasks you need to get done, each on its own line, like
so:
Do the dishes because they're starting to pile up and it really looks bad.
Do a load of laundry, preferably a light load.
Vacuum the house, making sure to get into all the little corners.
There are my first three tasks, each of which is rather long. You can include as much info as you
want in each task. Just make sure each task is on its own line.
Now, just save that file as todo.txt, and place it inside your Documents
folder or somewhere else you can access it easily. Better yet, place it in your Dropbox
folder so you can easily sync it later.
And just like that, you've started to use Todo.txt! Sure it doesn't seem all that impressive just
yet: a plain text file with your to-do items in it. Now we're ready to start using some of the text
formatting conventions Todo.txt supports, and use some of the tools that support Todo.txt. That's
when you'll see how useful this whole idea can be.
How to Speak the Lingo
We now have a text file called todo.txt that's stored in our Documents
folder. Inside it we have a few to-do items. Let's take a look at that file again (this time,
in the interests of brevity, I've shortened my todo items a bit):
Do the dishes.
Do a load of laundry.
Vacuum the house.
Ok, not bad so far, but we really aren't using the Todo.txt framework to the full. While Todo.txt
is supposed to be simple, it isn't featureless. Todo.txt is designed to help you prioritize
your to-do items, as well as organize them into projects and contexts . This is
largely following the spirit of David Allen's famous productivity
methodology known as "Getting Things Done", more often abbreviated to "GTD" - but you can
use these tools to organize your tasks however you'd like. If you don't like GTD, you can still use
Todo.txt to keep track of your tasks, and use these extra features to help you keep them organized.
Now, let's look at how projects, contexts, and priorities would apply to our sample list, and
how to actually mark tasks as complete. I'll keep using my simple to-do list - which, honestly, holds
tasks you'd probably never need to put down on a to-do list - but you can use the same ideas shown here to
keep track of any of your tasks.
Projects
In my list, all three items are related to cleaning the house. So we can group them all into a
project called "cleaning". Just add a "+" sign followed by the project name to your tasks, like so:
Do the dishes. +cleaning
Do a load of laundry. +cleaning
Vacuum the house. +cleaning
That's nice, but everything on my list falls into the same project, so it seems a little redundant.
I could break everything out further, especially the "Do a load of laundry" task that includes putting
the load in the washer, then the dryer, and finally folding the clothes. Todo.txt allows items to
be in more than one project; just add another +project tag to the end of the task to add it to
another project. Let's take advantage of this and split the "Do a load of laundry" to-do item into
multiple items, and then put them in their own "laundry" project.
Do the dishes. +cleaning
Put a load of laundry into the washer. +laundry +cleaning
Put the load into the dryer. +laundry +cleaning
Fold the load of laundry. +laundry +cleaning
Put away the folded clothes. +laundry +cleaning
Vacuum the house. +cleaning
Great. Now our to-do list is split into multiple projects, and our "laundry" project tasks are
categorized under the "cleaning" project as well.
Context
Context refers to a place or situation where you have certain things to do. In the case of our
sample list, the context for all of them is pretty obvious: at home. In a case like that, I don't
think adding a context is really all that useful, since it's an implied part of the project itself.
Let's add some more items so we can better understand context.
Do the dishes. +cleaning
Put a load of laundry into the washer. +laundry +cleaning
Put the load into the dryer. +laundry +cleaning
Fold the load of laundry. +laundry +cleaning
Put away the folded clothes. +laundry +cleaning
Vacuum the house. +cleaning
Buy eggs.
Buy juice.
Buy a new pair of jeans.
I added three new to-do items, all of which have to do with buying things. The first two are food
items, things I'll need to buy at the grocery store. The last one is an article of clothing, something
I'll probably buy at the mall. All of these items could be put into a "shopping" project. But the
location I'll buy them at is different. This is where contexts come in. Designate a context in Todo.txt
with an "@" sign followed by the name of your context. Here's what our new list, including contexts,
looks like:
Do the dishes. +cleaning
Put a load of laundry into the washer. +laundry +cleaning
Put the load into the dryer. +laundry +cleaning
Fold the load of laundry. +laundry +cleaning
Put away the folded clothes. +laundry +cleaning
Vacuum the house. +cleaning
Buy eggs. +shopping @grocery
Buy juice. +shopping @grocery
Buy a new pair of jeans. +shopping @mall
And there we go. Now the to-do items in our "shopping" project have been given a context. When
we're at the grocery store we can focus on the items we need to buy there, and the same goes for
when we're at the shopping mall.
Priority
The last feature we need to look at is priority. To do that, we'll add a few work-related tasks
to the list, then assign a priority to them and to some of our existing tasks. Just add a letter surrounded
by parentheses to the beginning of a task to give it a priority.
(A) Do the dishes. +cleaning
(B) Put a load of laundry into the washer. +laundry +cleaning
Put the load into the dryer. +laundry +cleaning
Fold the load of laundry. +laundry +cleaning
Put away the folded clothes. +laundry +cleaning
Vacuum the house. +cleaning
Buy eggs. +shopping @grocery
Buy juice. +shopping @grocery
(A) Buy a new pair of jeans. +shopping @mall
Email Matt about my new article idea. +work
(A) Finish rough draft of next article. +work
Priorities are designated by an uppercase letter, A-Z, enclosed in parentheses and followed
by a space. They always appear at the beginning of the to-do item and sort in alphabetical
order - that is, a task with a priority of (A) is more important than a (B) task, and so on. You'll
see why this matters when we get into some of the tools you can use to manipulate your Todo.txt file.
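Because a priority is just an uppercase letter at the start of the line, plain lexicographic sorting is all a tool needs to order your list. Here's a minimal sketch of that idea in Python - the helper `priority_key` is my own illustration, not part of any official Todo.txt tool:

```python
import re

def priority_key(task):
    """Return the priority letter, or '~' (which sorts after 'Z') if none."""
    match = re.match(r"\(([A-Z])\) ", task)
    return match.group(1) if match else "~"

tasks = [
    "Vacuum the house. +cleaning",
    "(B) Put a load of laundry into the washer. +laundry +cleaning",
    "(A) Do the dishes. +cleaning",
]

# (A) tasks come first, then (B), with unprioritized tasks at the bottom.
for task in sorted(tasks, key=priority_key):
    print(task)
```

The `'~'` fallback simply exploits the fact that `~` sorts after every uppercase letter in ASCII, so unprioritized tasks always sink below prioritized ones.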
Marking Items Complete
One final word on formatting your Todo.txt file: marking a task as complete. You could delete the
item once you're done with it, but that isn't the preferred way in Todo.txt. Instead, put a lowercase
"x" at the start of the to-do item, like so:
(A) Do the dishes. +cleaning
(B) Put a load of laundry into the washer. +laundry +cleaning
Put the load into the dryer. +laundry +cleaning
Fold the load of laundry. +laundry +cleaning
Put away the folded clothes. +laundry +cleaning
Vacuum the house. +cleaning
Buy eggs. +shopping @grocery
Buy juice. +shopping @grocery
x (A) Buy a new pair of jeans. +shopping @mall
Email Matt about my new article idea. +work
(A) Finish rough draft of next article. +work
You'll notice there's now a small "x" at the beginning of the line containing the item "Buy
a new pair of jeans." This signifies that the jeans have been bought and the item has been completed,
effectively "crossed off" my list. That way, you'll see what you've completed alongside the stuff
that still needs to be done.
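Following the "x" convention described above, marking a task complete in code amounts to prefixing the line. This is a minimal sketch with a helper of my own invention, not an official API:

```python
def mark_done(task):
    """Prefix a task with 'x ' unless it is already marked complete."""
    return task if task.startswith("x ") else "x " + task

print(mark_done("(A) Buy a new pair of jeans. +shopping @mall"))
# -> x (A) Buy a new pair of jeans. +shopping @mall
```

Checking for an existing "x " first makes the operation idempotent: marking an already-completed task a second time changes nothing.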
You now know how to assign to-do items both projects and contexts, as well as how to prioritize
various items inside your plain text to-do list. All of this helps make our to-do list more useful
to us than it was before, giving structure and organization to an otherwise basic, unordered list.
You can use each of these features, or none of them - it's your choice. Todo.txt, at its most basic, is whatever
you'd like it to be. It's a blank slate for your tasks, plus some rules that keep everything organized.
And you could stop here. That's enough to keep up with your tasks the way you want, in a plain
text file. You could easily find all of your projects or contexts with a Command-F or Control-F search,
and stay on top of what needs to be done with nothing else.
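That Command-F search can also be expressed programmatically: because projects and contexts are just "+name" and "@name" tokens, filtering is plain token matching. A minimal sketch (the `filter_tasks` helper is my own illustration, not from any official tool):

```python
def filter_tasks(lines, *tags):
    """Keep only tasks whose words include every given +project/@context tag."""
    return [line for line in lines if all(tag in line.split() for tag in tags)]

tasks = [
    "Buy eggs. +shopping @grocery",
    "Buy juice. +shopping @grocery",
    "Buy a new pair of jeans. +shopping @mall",
    "Do the dishes. +cleaning",
]

# Only the grocery-store shopping items survive both filters.
print(filter_tasks(tasks, "+shopping", "@grocery"))
```

Matching on whole words via `line.split()` rather than raw substrings keeps "+shop" from accidentally matching "+shopping".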
But because we're following conventions outlined by Todo.txt, we can make use of some other interesting
tools which give even more power to our humble little text file.
Desktop App Options
Being an open source project, Todo.txt also works in a variety of specialized apps outside of
your plain text editor. You'll find apps for almost any platform that work with Todo.txt, but one
of the best is a free app: Todour.
Todour is available for both Mac and Windows, and gives a simple graphical interface to our Todo.txt
file. And I mean simple. Take a look:
See what I mean? You should notice right away, though, that your items have been automatically sorted
by priority. You won't see much fancy stuff here in Todour, but it has all the essentials. You
can add and remove items, mark them as done or undone, and all of that is nicely supported within
your plain text file. Check the little box there to mark an item complete, and a lowercase "x" appears
at the start of that line in your text file. Nifty, isn't it?
The reason I really recommend Todour over using just a text editor is that it includes a search
filter. This lets you take full advantage of projects and contexts and can dynamically hide everything
else in your to-do list. Just search for a project or context, and only those tasks will appear.
Search for a project and a context, and you'll see only the tasks that have both.
Overall, Todour, like Todo.txt itself, doesn't have many flashy features. But it has the essentials
and it gets them right.
Mobile Access
Todo.txt was born from the command line, in a traditional computer world. But that doesn't mean
you can't use Todo.txt on mobile devices. In fact, there are Todo.txt apps for iOS and Android
for $2 each. They have all the same features and functionality that we've already discussed, including
projects, contexts, and priorities. The interface is clean and minimal, and focused on just letting
you quickly keep up with your Todo.txt tasks.
There isn't too much to say about the mobile apps, other than that they work just like you'd expect.
Like Todo.txt itself, these mobile apps are simple and straightforward. You can add tasks, filter
them by project and context, and edit or complete them on the go - and keep everything in sync with
your computer via Dropbox.
There's still one more tool to cover in the Todo.txt arsenal, and it's the most potent one-but
also the most geeky: the command line interface.
In a capitalist economy, the market rewards things that are rare and valuable. Social media
use is decidedly not rare or valuable. Any 16-year-old with a smartphone can invent a hashtag
or repost a viral article. The idea that if you engage in enough of this low-value activity, it
will somehow add up to something of high value in your career is the same dubious alchemy that
forms the core of most snake oil and flimflam in business.
Professional success is hard, but it's not complicated. The foundation to achievement and fulfillment,
almost without exception, requires that you hone a useful craft and then apply it to things that
people care about. [...] Interesting opportunities and useful connections are not as scarce as
social media proponents claim. In my own professional life, for example, as I improved my standing
as an academic and a writer, I began receiving more interesting opportunities than I could handle.
As you become more valuable to the marketplace, good things will find you.
To be clear, I'm not arguing that new opportunities and connections are unimportant. I'm instead
arguing that you don't need social media's help to attract them. My second objection concerns
the idea that social media is harmless. Consider that the ability to concentrate without distraction
on hard tasks is becoming increasingly valuable in an increasingly complicated economy. Social
media weakens this skill because it's engineered to be addictive. The more you use social media
in the way it's designed to be used -- persistently throughout your waking hours -- the more your
brain learns to crave a quick hit of stimulus at the slightest hint of boredom.
Once this Pavlovian connection is solidified, it becomes hard to give difficult tasks the unbroken
concentration they require, and your brain simply won't tolerate such a long period without a
fix. Indeed, part of my own rejection of social media comes from this fear that these services
will diminish my ability to concentrate -- the skill on which I make my living.
A dedication to cultivating your social media brand is a fundamentally passive approach to
professional advancement. It diverts your time and attention away from producing work that matters
and toward convincing the world that you matter. The latter activity is seductive, especially
for many members of my generation who were raised on this message, but it can be disastrously
counterproductive.
Modern life now forces us to do a multitude of things at once - but can we? Should we?
Forget invisibility or flight: the superpower we all want is the ability to do several things
at once. Unlike other superpowers, however, being able to multitask is now widely regarded as a basic
requirement for employability. Some of us sport computers with multiple screens, to allow tweeting
while trading pork bellies and frozen orange juice. Others make do with reading a Kindle while poking
at a smartphone and glancing at a television in the corner with its two rows of scrolling subtitles.
We think nothing of sending an email to a colleague to suggest a quick coffee break, because we can
feel confident that the email will be read within minutes.
All this is simply the way the modern world works. Multitasking is like being able to read or
add up, so fundamental that it is taken for granted. Doing one thing at a time is for losers - recall
Lyndon Johnson's often bowdlerised dismissal of Gerald Ford: "He can't fart and chew gum at the same
time."
The rise of multitasking is fuelled by technology, of course, and by social change as well. Husbands
and wives no longer specialise as breadwinners and homemakers; each must now do both. Work and play
blur. Your friends can reach you on your work email account at 10 o'clock in the morning, while your
boss can reach you on your mobile phone at 10 o'clock at night. You can do your weekly shop sitting
at your desk and you can handle a work query in the queue at the supermarket.
This is good news in many ways - how wonderful to be able to get things done in what would once
have been wasted time! How delightful the variety of it all is! No longer must we live in a monotonous,
Taylorist world where we must painstakingly focus on repetitive tasks until we lose our minds.
And yet we are starting to realise that the blessings of a multitasking life are mixed. We feel
overwhelmed by the sheer number of things we might plausibly be doing at any one time, and by the
feeling that we are on call at any moment.
And we fret about the unearthly appetite of our children to do everything at once, flipping through
homework while chatting on WhatsApp, listening to music and watching Game of Thrones. (According
to a recent study by Sabrina Pabilonia of the US Bureau of Labor Statistics, for over half the time
that high-school students spend doing homework, they are also listening to music, watching TV or
otherwise multitasking. That trend is on the increase.) Can they really handle all these inputs at
once? They seem to think so, despite various studies suggesting otherwise.
And so a backlash against multitasking has begun - a kind of Luddite self-help campaign. The poster
child for uni-tasking was launched on the crowdfunding website Kickstarter in December 2014. For
$499 - substantially more than a multifunctional laptop - "The Hemingwrite" computer promised a nice
keyboard, a small e-ink screen and an automatic cloud back-up. You couldn't email on the Hemingwrite.
You couldn't fool around on YouTube, and you couldn't read the news. All you could do was type. The
Hemingwrite campaign raised over a third of a million dollars.
The Hemingwrite (now rebranded the Freewrite) represents an increasingly popular response to the
multitasking problem: abstinence. Programs such as Freedom and Self-Control are now available to
disable your browser for a preset period of time. The popular blogging platform WordPress offers
"distraction-free writing". The Villa St�phanie, a hotel in Baden-Baden, offers what has been branded
the "ultimate luxury": a small silver switch beside the hotel bed that will activate a wireless blocker
and keep the internet and all its temptations away.
The battle lines have been drawn. On one side: the culture of the modern workplace, which demands
that most of us should be open to interruption at any time. On the other, the uni-tasking refuseniks
who insist that multitaskers are deluding themselves, and that focus is essential. Who is right?
The 'cognitive cost'
There is ample evidence in favour of the proposition that we should focus on one thing at a time.
Consider a study led by David Strayer, a psychologist at the University of Utah. In 2006, Strayer
and his colleagues used a high-fidelity driving simulator to compare the performance of drivers who
were chatting on a mobile phone to drivers who had drunk enough alcohol to be at the legal blood-alcohol
limit in the US. Chatting drivers didn't adopt the aggressive, risk-taking style of drunk drivers
but they were unsafe in other ways. They took much longer to respond to events outside the car, and
they failed to notice a lot of the visual cues around them. Strayer's infamous conclusion: driving
while using a mobile phone is as dangerous as driving while drunk.
Less famous was Strayer's finding that it made no difference whether the driver was using a handheld
or hands-free phone. The problem with talking while driving is not a shortage of hands. It is a shortage
of mental bandwidth.
Yet this discovery has made little impression either on public opinion or on the law. In the United
Kingdom, for example, it is an offence to use a hand-held phone while driving but perfectly legal
if the phone is used hands-free. We're happy to acknowledge that we only have two hands but refuse
to admit that we only have one brain.
Another study by Strayer, David Sanbonmatsu and others, suggested that we are also poor judges
of our ability to multitask. The subjects who reported doing a lot of multitasking were also the
ones who performed poorly on tests of multitasking ability. They systematically overrated their ability
to multitask and they displayed poor impulse control. In other words, wanting to multitask is a good
sign that you should not be multitasking.
We may not immediately realise how multitasking is hampering us. The first time I took to Twitter
to comment on a public event was during a televised prime-ministerial debate in 2010. The sense of
buzz was fun; I could watch the candidates argue and the twitterati respond, compose my own 140-character
profundities and see them being shared. I felt fully engaged with everything that was happening.
Yet at the end of the debate I realised, to my surprise, that I couldn't remember anything that Brown,
Cameron and Clegg had said.
A study conducted at UCLA in 2006 suggests that my experience is not unusual. Three psychologists,
Karin Foerde, Barbara Knowlton and Russell Poldrack, recruited students to look at a series of flashcards
with symbols on them, and then to make predictions based on patterns they had recognised. Some of
these prediction tasks were done in a multitasking environment, where the students also had to listen
to low- and high-pitched tones and count the high-pitched ones. You might think that making predictions
while also counting beeps was too much for the students to handle. It wasn't. They were equally competent
at spotting patterns with or without the note-counting task.
But here's the catch: when the researchers then followed up by asking more abstract questions
about the patterns, the cognitive cost of the multitasking became clear. The students struggled to
answer questions about the predictions they'd made in the multitasking environment. They had successfully
juggled both tasks in the moment - but they hadn't learnt anything that they could apply in a different
context.
That's an unnerving discovery. When we are sending email in the middle of a tedious meeting, we
may nevertheless feel that we're taking in what is being said. A student may be confident that neither
Snapchat nor the live football is preventing them taking in their revision notes. But the UCLA findings
suggest that this feeling of understanding may be an illusion and that, later, we'll find ourselves
unable to remember much, or to apply our knowledge flexibly. So, multitasking can make us forgetful
- one more way in which multitaskers are a little bit like drunks.
Early multitaskers
All this is unnerving, given that the modern world makes multitasking almost inescapable. But
perhaps we shouldn't worry too much. Long before multitasking became ubiquitous, it had a long and
distinguished history.
In 1958, a young psychologist named Bernice Eiduson embarked on a long-term research project
- so long-term, in fact, that Eiduson died before it was completed. Eiduson studied the working methods
of 40 scientists, all men. She interviewed them periodically over two decades and put them through
various psychological tests. Some of these scientists found their careers fizzling out, while others
went on to great success. Four won Nobel Prizes and two others were widely regarded as serious Nobel
contenders. Several more were invited to join the National Academy of Sciences.
After Eiduson died, some of her colleagues published an analysis of her work. These colleagues,
Robert Root-Bernstein, Maurine Bernstein and Helen Garnier, wanted to understand what determined
whether a scientist would have a long productive career, a combination of genius and longevity.
There was no clue in the interviews or the psychological tests. But looking at the early publication
record of these scientists - their first 100 published research papers - researchers discovered a
pattern: the top scientists were constantly changing the focus of their research.
Over the course of these first 100 papers, the most productive scientists covered five different
research areas and moved from one of these topics to another an average of 43 times. They would publish,
and change the subject, publish again, and change the subject again. Since most scientific research
takes an extended period of time, the subjects must have overlapped. The secret to a long and highly
productive scientific career? It's multitasking.
Charles Darwin thrived on spinning multiple plates. He began his first notebook on "transmutation
of species" two decades before The Origin of Species was published. His A Biographical Sketch of
an Infant was based on notes made after his son William was born; William was 37 when it was published.
Darwin spent nearly 20 years working on climbing and insectivorous plants. And Darwin published a
learned book on earthworms in 1881, just before his death. He had been working on it for 44 years.
When two psychologists, Howard Gruber and Sara Davis, studied Darwin and other celebrated artists
and scientists they concluded that such overlapping interests were common.
Another team of psychologists, led by Mihaly Csikszentmihalyi, interviewed almost 100 exceptionally
creative people from jazz pianist Oscar Peterson to science writer Stephen Jay Gould to double Nobel
laureate, the physicist John Bardeen. Csikszentmihalyi is famous for developing the idea of "flow",
the blissful state of being so absorbed in a challenge that one loses track of time and sets all
distractions to one side. Yet every one of Csikszentmihalyi's interviewees made a practice of keeping
several projects bubbling away simultaneously.
Just internet addiction?
If the word "multitasking" can apply to both Darwin and a teenager with a serious Instagram habit,
there is probably some benefit in defining our terms. There are at least four different things we
might mean when we talk about multitasking. One is genuine multitasking: patting your head while
rubbing your stomach; playing the piano and singing; farting while chewing gum. Genuine multitasking
is possible, but at least one of the tasks needs to be so practised as to be done without thinking.
Then there's the challenge of creating a presentation for your boss while also fielding phone
calls for your boss and keeping an eye on email in case your boss wants you. This isn't multitasking
in the same sense. A better term is task switching, as our attention flits between the presentation,
the telephone and the inbox. A great deal of what we call multitasking is in fact rapid task switching.
Task switching is often confused with a third, quite different activity - the guilty pleasure
of disappearing down an unending click-hole of celebrity gossip and social media updates. There is
a difference between the person who reads half a page of a journal article, then stops to write some
notes about a possible future project, then goes back to the article - and someone who reads half
a page of a journal article before clicking on bikini pictures for the rest of the morning. "What
we're often calling multitasking is in fact internet addiction," says Shelley Carson, a psychologist
and author of Your Creative Brain. "It's a compulsive act, not an act of multitasking."
A final kind of multitasking isn't a way of getting things done but simply the condition of having
a lot of things to do. The car needs to be taken in for a service. Your tooth is hurting. The nanny
can't pick up the kids from school today. There's a big sales meeting to prepare for tomorrow, and
your tax return is due next week. There are so many things that have to be done, so many responsibilities
to attend to. Having a lot of things to do is not the same as doing them all at once. It's just life.
And it is not necessarily a stumbling block to getting things done - as Bernice Eiduson discovered
as she tracked scientists on their way to their Nobel Prizes.
The fight for focus
These four practices - multitasking, task switching, getting distracted and managing multiple
projects - all fit under the label "multitasking". This is not just because of a simple linguistic
confusion. The versatile networked devices we use tend to blur the distinction, serving us as we
move from task to task while also offering an unlimited buffet of distractions. But the different
kinds of multitasking are linked in other ways too. In particular, the highly productive practice
of having multiple projects invites the less-than-productive habit of rapid task switching.
To see why, consider a story that psychologists like to tell about a restaurant near Berlin University
in the 1920s. (It is retold in Willpower, a book by Roy Baumeister and John Tierney.) The story has
it that when a large group of academics descended upon the restaurant, the waiter stood and calmly
nodded as each new item was added to their complicated order. He wrote nothing down, but when he
returned with the food his memory had been flawless. The academics left, still talking about the
prodigious feat; but when one of them hurried back to retrieve something he'd left behind, the waiter
had no recollection of him. How could the waiter have suddenly become so absent-minded? "Very simple,"
he said. "When the order has been completed, I forget it."
One member of the Berlin school was a young experimental psychologist named Bluma Zeigarnik. Intrigued,
she demonstrated that people have a better recollection of uncompleted tasks. This is called the
"Zeigarnik effect": when we leave things unfinished, we can't quite let go of them mentally. Our
subconscious keeps reminding us that the task needs attention.
The Zeigarnik effect may explain the connection between facing multiple responsibilities and indulging
in rapid task switching. We flit from task to task to task because we can't forget about all of the
things that we haven't yet finished. We flit from task to task to task because we're trying to get
the nagging voices in our head to shut up.
Of course, there is much to be said for "focus". But there is much to be said for copperplate
handwriting, too, and for having a butler. The world has moved on. There's something appealing about
the Hemingwrite and the hotel room that will make the internet go away, but also something futile.
It is probably not true that Facebook is all that stands between you and literary greatness. And
in most office environments, the Hemingwrite is not the tool that will win you promotion. You are
not Ernest Hemingway, and you do not get to simply ignore emails from your colleagues.
If focus is going to have a chance, it's going to have to fight an asymmetric war. Focus can only
survive if it can reach an accommodation with the demands of a multitasking world.
Loops and lists
The word "multitasking" wasn't applied to humans until the 1990s, but it has been used to describe
computers for half a century. According to the Oxford English Dictionary, it was first used in print
in 1966, when the magazine Datamation described a computer capable of appearing to perform several
operations at the same time.
Just as with humans, computers typically create the illusion of multitasking by switching tasks
rapidly. Computers perform the switching more quickly, of course, and they don't take 20 minutes
to get back on track after an interruption.
Nor does a computer fret about what is not being done. While rotating a polygon and sending text
to the printer, it feels no guilt that the mouse has been left unchecked for the past 16 milliseconds.
The mouse's time will come. Being a computer means never having to worry about the Zeigarnik effect.
Is there a lesson in this for distractible sacks of flesh like you and me? How can we keep a sense
of control despite the incessant guilt of all the things we haven't finished?
"Whenever you say to someone, 'I'll get back to you about that', you just opened a loop in your
brain," says David Allen. Allen is the author of a cult productivity book called Getting Things Done.
"That loop will keep spinning until you put a placeholder in a system you can trust."
Modern life is always inviting us to open more of those loops. It isn't necessarily that we have
more work to do, but that we have more kinds of work that we ought to be doing at any given moment.
Tasks now bleed into each other unforgivingly. Whatever we're doing, we can't escape the sense that
perhaps we should be doing something else. It's these overlapping possibilities that take the mental
toll.
The principle behind Getting Things Done is simple: close the open loops. The details can become
rather involved but the method is straightforward. For every single commitment you've made to yourself
or to someone else, write down the very next thing you plan to do. Review your lists of next actions
frequently enough to give you confidence that you won't miss anything.
This method has a cult following, and practical experience suggests that many people find it enormously
helpful - including me (see below). Only recently, however, did the psychologists E J Masicampo and
Roy Baumeister find some academic evidence to explain why people find relief by using David Allen's
system. Masicampo and Baumeister found that you don't need to complete a task to banish the Zeigarnik
effect. Making a specific plan will do just as well. Write down your next action and you quiet that
nagging voice at the back of your head. You are outsourcing your anxiety to a piece of paper.
A creative edge?
It is probably a wise idea to leave rapid task switching to the computers. Yet even frenetic flipping
between Facebook, email and a document can have some benefits alongside the costs.
The psychologist Shelley Carson and her student Justin Moore recently recruited experimental subjects
for a test of rapid task switching. Each subject was given a pair of tasks to do: crack a set of
anagrams and read an article from an academic journal. These tasks were presented on a computer screen,
and for half of the subjects they were presented sequentially - first solve the anagrams, then read
the article. For the other half of the experimental group, the computer switched every two-and-a-half
minutes between the anagrams and the journal article, forcing the subjects to change mental gears
many times.
Unsurprisingly, task switching slowed the subjects down and scrambled their thinking. They solved
fewer anagrams and performed poorly on a test of reading comprehension when forced to refocus every
150 seconds.
But the multitasking treatment did have a benefit. Subjects who had been task switching became
more creative. To be specific, their scores on tests of "divergent" thinking improved. Such tests
ask subjects to pour out multiple answers to odd questions. They might be asked to think of as many
uses as possible for a rolling pin or to list all the consequences they could summon to mind of a
world where everyone has three arms. Involuntary multitaskers produced a greater volume and variety
of answers, and their answers were more original too.
"It seems that switching back and forth between tasks primed people for creativity," says Carson,
who is an adjunct professor at Harvard. The results of her work with Moore have not yet been published,
and one might reasonably object that such tasks are trivial measures of creativity. Carson responds
that scores on these laboratory tests of divergent thinking are correlated with substantial creative
achievements such as publishing a novel, producing a professional stage show or creating an award-winning
piece of visual art. For those who insist that great work can only be achieved through superhuman
focus, think long and hard on this discovery.
Carson and colleagues have found an association between significant creative achievement and a
trait psychologists term "low latent inhibition". Latent inhibition is the filter that all mammals
have that allows them to tune out apparently irrelevant stimuli. It would be crippling to listen
to every conversation in the open-plan office and the hum of the air conditioning, while counting
the number of people who walk past the office window. Latent inhibition is what saves us from having
to do so. These subconscious filters let us walk through the world without being overwhelmed by all
the different stimuli it hurls at us.
And yet people whose filters are a little bit porous have a big creative edge. Think on that,
uni-taskers: while you busily try to focus on one thing at a time, the people who struggle to filter
out the buzz of the world are being reviewed in The New Yorker.
"You're letting more information into your cognitive workspace, and that information can be consciously
or unconsciously combined," says Carson. Two other psychologists, Holly White and Priti Shah, found
a similar pattern for people suffering from attention deficit hyperactivity disorder (ADHD).
It would be wrong to romanticise potentially disabling conditions such as ADHD. All these studies
were conducted on university students, people who had already demonstrated an ability to function
well. But their conditions weren't necessarily trivial - to participate in the White/Shah experiment,
students had to have a clinical diagnosis of ADHD, meaning that their condition was troubling enough
to prompt them to seek professional help.
It's surprising to discover that being forced to switch tasks can make us more creative. It may
be still more surprising to realise that in an age where we live under the threat of constant distraction,
people who are particularly prone to being distracted are flourishing creatively.
Perhaps we shouldn't be entirely surprised. It's easier to think outside the box if the box is
full of holes. And it's also easier to think outside the box if you spend a lot of time clambering
between different boxes. "The act of switching back and forth can grease the wheels of thought,"
says John Kounios, a professor of psychology at Drexel University.
Kounios, who is co-author of The Eureka Factor, suggests that there are at least two other potentially
creative mechanisms at play when we switch between tasks. One is that the new task can help us forget
bad ideas. When solving a creative problem, it's easy to become stuck because we think of an incorrect
solution but simply can't stop returning to it. Doing something totally new induces "fixation forgetting",
leaving us free to find the right answer.
Another is "opportunistic assimilation". This is when the new task prompts us to think of a solution
to the old one. The original Eureka moment is an example.
As the story has it, Archimedes was struggling with the task of determining whether a golden wreath
truly was made of pure gold without damaging the ornate treasure. The solution was to determine whether
the wreath had the same volume as a pure gold ingot with the same mass; this, in turn, could be done
by submerging both the wreath and the ingot to see whether they displaced the same volume of water.
This insight, we are told, occurred to Archimedes while he was having a bath and watching the
water level rise and fall as he lifted himself in and out. And if solving such a problem while having
a bath isn't multitasking, then what is?
Tim Harford is an FT columnist. His latest book is 'The Undercover Economist Strikes Back'.
Twitter: @TimHarford
Six ways to be a master of multitasking
1. Be mindful
"The ideal situation is to be able to multitask when multitasking is appropriate, and focus when
focusing is important," says psychologist Shelley Carson. Tom Chatfield, author of Live This Book,
suggests making two lists, one for activities best done with internet access and one for activities
best done offline. Connecting and disconnecting from the internet should be deliberate acts.
2. Write it down
The essence of David Allen's Getting Things Done is to turn every vague guilty thought into a
specific action, to write down all of the actions and to review them regularly. The point, says Allen,
is to feel relaxed about what you're doing - and about what you've decided not to do right now -
confident that nothing will fall through the cracks.
3. Tame your smartphone
The smartphone is a great servant and a harsh master. Disable needless notifications - most people
don't need to know about incoming tweets and emails. Set up a filing system within your email so
that when a message arrives that requires a proper keyboard to answer - ie 50 words or more - you
can move that email out of your inbox and place it in a folder where it will be waiting for you when
you fire up your computer.
4. Focus in short sprints
The "Pomodoro Technique" - named after a kitchen timer - alternates focusing for 25 minutes and
breaking for five minutes, across two-hour sessions. Productivity guru Merlin Mann suggests an "email
dash", where you scan email and deal with urgent matters for a few minutes each hour. Such ideas
let you focus intensely while also switching between projects several times a day.
5. Procrastinate to win
If you have several interesting projects on the go, you can procrastinate over one by working
on another. (It worked for Charles Darwin.) A change is as good as a rest, they say - and as psychologist
John Kounios explains, such task switching can also unlock new ideas.
6. Cross-fertilise
"Creative ideas come to people who are interdisciplinary, working across different organisational
units or across many projects," says author and research psychologist Keith Sawyer. (Appropriately,
Sawyer is also a jazz pianist, a former management consultant and a sometime game designer for Atari.)
Good ideas often come when your mind makes unexpected connections between different fields.
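The focus-sprint cadence in tip 4 above - 25 minutes of focus, a five-minute break, repeated across a two-hour session - can be sketched in a few lines. This is a hypothetical illustration of the schedule, not code from any official Pomodoro tool; the function name and the `tick` parameter (injected so the timing can be tested without waiting) are assumptions.

```python
import time

FOCUS_MINUTES = 25
BREAK_MINUTES = 5

def pomodoro_session(sprints=4, focus=FOCUS_MINUTES, rest=BREAK_MINUTES,
                     tick=time.sleep):
    """Run `sprints` focus/break cycles and return the schedule followed.

    `tick` is called with a duration in seconds; pass a stub to dry-run.
    Four 25+5-minute sprints cover roughly a two-hour session.
    """
    schedule = []
    for i in range(1, sprints + 1):
        schedule.append(("focus", focus))
        tick(focus * 60)
        if i < sprints:  # no break needed after the final sprint
            schedule.append(("break", rest))
            tick(rest * 60)
    return schedule
```

Merlin Mann's "email dash" fits the same shape: a short, fixed-length sprint (say, five minutes of `("email", 5)`) scheduled once an hour rather than four times back to back.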
Tim Harford's To-Do Lists
David Allen's Getting Things Done system - or GTD - has reached the status of a religion among
some productivity geeks. At its heart, it's just a fancy to-do list, but it's more powerful than
a regular list because it's comprehensive, specific and designed to prompt you when you need prompting.
Here's how I make the idea work for me.
Write everything down. I use Google Calendar for appointments and an electronic to-do list
called Remember the Milk, plus an ad hoc daily list on paper. The details don't matter. The principle
is never to carry a mental commitment around in your head.
Make the list comprehensive. Mine currently has 151 items on it. (No, I don't memorise the
number. I just counted.)
Keep the list fresh. The system works its anxiety-reducing magic best if you trust your calendar
and to-do list to remind you when you need reminding. I spend about 20 minutes once a week reviewing
the list to note incoming deadlines and make sure the list is neither missing important commitments
nor cluttered with stale projects. Review is vital - the more you trust your list, the more you
use it. The more you use it, the more you trust it.
List by context as well as topic. It's natural to list tasks by topic or project - everything
associated with renovating the spare room, for instance, or next year's annual away-day. I also
list them by context (this is easy on an electronic list). Things I can do when on a plane; things
I can only do when at the shops; things I need to talk about when I next see my boss.
Be specific about the next action. If you're just writing down vague reminders, the to-do list
will continue to provoke anxiety. Before you write down an ill-formed task, take the 15 seconds
required to think about exactly what that task is.
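The "list by context as well as topic" idea above is easy to see in miniature: give each task both a project tag and a context tag, and the same flat list can be sliced either way. A minimal sketch, with illustrative task names invented for the example (none of this reflects how Remember the Milk or any particular app stores its data):

```python
from collections import defaultdict

def by_context(tasks):
    """Group (project, context, action) triples by their context tag."""
    grouped = defaultdict(list)
    for project, context, action in tasks:
        grouped[context].append(action)
    return dict(grouped)

# Hypothetical next actions, each tagged by project and by context.
tasks = [
    ("spare room", "shops", "buy paint"),
    ("away-day", "boss", "agree budget"),
    ("spare room", "boss", "ask about time off to decorate"),
]
```

Calling `by_context(tasks)` yields one action list per context ("shops", "boss"), which is exactly the view you want in the moment: standing in the shop, you see only what the shop can satisfy, regardless of which project each task belongs to.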
A walk in the park may soothe the mind and, in the process, change the workings of our brains
in ways that improve our mental health, according to an interesting new study of the physical effects
on the brain of visiting nature.
Most of us today live in cities and spend far less time outside in green, natural spaces than
people did several generations ago.
City dwellers also have a higher risk for anxiety, depression and other mental illnesses than
people living outside urban centers, studies show.
These developments seem to be linked to some extent, according to a growing body of research.
Various studies have found that urban dwellers with little access to green spaces have a higher incidence
of psychological problems than people living near parks and that city dwellers who visit natural
environments have lower levels of stress hormones immediately afterward than people who have not
recently been outside.
But just how a visit to a park or other green space might alter mood has been unclear. Does experiencing
nature actually change our brains in some way that affects our emotional health?
That possibility intrigued Gregory Bratman, a graduate student at the Emmett Interdisciplinary
Program in Environment and Resources at Stanford University, who has been studying the psychological
effects of urban living. In an earlier study published last month, he and his colleagues found that volunteers who walked briefly through
a lush, green portion of the Stanford campus were more attentive and happier afterward than volunteers
who strolled for the same amount of time near heavy traffic.
But that study did not examine the neurological mechanisms that might underlie the effects of
being outside in nature.
Brooding, which is known among cognitive scientists as morbid rumination, is a mental state familiar
to most of us, in which we can't seem to stop chewing over the ways in which things are wrong with
ourselves and our lives. This broken-record fretting is not healthy or helpful. It can be a precursor
to depression and is disproportionately common among city dwellers compared with people living outside
urban areas, studies show.
Perhaps most interesting for the purposes of Mr. Bratman and his colleagues, however, such rumination
also is strongly associated with increased activity in a portion of the brain known as the subgenual
prefrontal cortex.
If the researchers could track activity in that part of the brain before and after people visited
nature, Mr. Bratman realized, they would have a better idea about whether and to what extent nature
changes people's minds.
Mr. Bratman and his colleagues first gathered 38 healthy, adult city dwellers and asked them to
complete a questionnaire to determine their normal level of morbid rumination.
The researchers also checked for brain activity in each volunteer's subgenual prefrontal cortex,
using scans that track blood flow through the brain. Greater blood flow to parts of the brain usually
signals more activity in those areas.
Then the scientists randomly assigned half of the volunteers to walk for 90 minutes through a
leafy, quiet, parklike portion of the Stanford campus or next to a loud, hectic, multi-lane highway
in Palo Alto. The volunteers were not allowed to have companions or listen to music. They were allowed
to walk at their own pace.
Immediately after completing their walks, the volunteers returned to the lab and repeated both
the questionnaire and the brain scan.
As might have been expected, walking along the highway had not soothed people's minds. Blood flow
to their subgenual prefrontal cortex was still high and their broodiness scores were unchanged.
But the volunteers who had strolled along the quiet, tree-lined paths showed slight but meaningful
improvements in their mental health, according to their scores on the questionnaire. They were not
dwelling on the negative aspects of their lives as much as they had been before the walk.
They also had less blood flow to the subgenual prefrontal cortex. That portion of their brains was quieter.
These results "strongly suggest that getting out into natural environments" could be an easy and
almost immediate way to improve moods for city dwellers, Mr. Bratman said.
But of course many questions remain, he said, including how much time in nature is sufficient
or ideal for our mental health, as well as what aspects of the natural world are most soothing. Is
it the greenery, quiet, sunniness, loamy smells, all of those, or something else that lifts our moods?
Do we need to be walking or otherwise physically active outside to gain the fullest psychological
benefits? Should we be alone or could companionship amplify mood enhancements?
"There's a tremendous amount of study that still needs to be done," Mr. Bratman said.
But in the meantime, he pointed out, there is little downside to strolling through the nearest
park, and some chance that you might beneficially muffle, at least for a while, your subgenual prefrontal
cortex.
Forget Russian figure skater Julia Lipnitskaia spinning in a blur with her leg impossibly held
straight up against her ear. Or the sight of skier Bode Miller collapsing with emotion at the end of
a race dedicated to his brother while NBC cameras lingered uncomfortably on the long shot. Or even
jubilant Noelle Pikus-Pace climbing into the stands to race into her family's arms after her silver-medal
finish in the skeleton.
The image that stands out most in my mind during the broadcast of the 2014 Winter Olympics? The Cadillac
commercial with a boxy, middle-aged white guy in a fancy house striding purposefully from his luxurious
swimming pool to his $75,000 luxury Cadillac ELR parked out front while extolling the virtues of
hard work, American style.
"Why do we work so hard? For stuff?" actor Neal McDonough asks in the commercial that has been
playing without cease. "Other countries work. They stroll home. They stop by a café. They take
the entire month of August off." "Off," he says again, to reinforce the point.
"Why aren't you like that? Why aren't WE like that?"
The first time the commercial aired during the Opening Ceremonies in Sochi, the slight pause after
those two questions made me hopeful. I sat up to listen closely.
Was he about to say that we should be more like that? Because Americans work among the most hours of
any advanced country in the world, save South Korea and Japan, where they've had to invent a word
for dying at your desk. (Karoshi. Death from Overwork.) We also work among the most extreme hours,
at 50 or more per week. The Bureau of Labor Statistics reports that the average American works about
one month more a year than in 1976.
Was he going to say that we Americans are caught up in what economist Juliet Schor calls a vicious
cycle of "work-and-spend", caught on a time-sucking treadmill of more spending, more stuff, more
debt, stagnant wages, higher costs and more work to pay for it all?
Would he talk about how we Americans, alone among the advanced economies whose athletes were competing
between the incessant commercials with such athleticism and grace, have no national vacation policy?
(So sacrosanct is time off in some countries that the Court of Justice of the European Union ruled
in 2012 that workers who get sick on vacation are entitled to take more time off "to enable the worker
to rest and enjoy a period of relaxation and leisure.")
American leisure? Don't let the averages fool you, he could say. While it looks like leisure time
has gone up, time diaries show that leisure and sleep time have gone up steeply since 1985 for those
with less than a high school degree. Why? They're becoming unemployed or underemployed. And leisure
and sleep time for the college educated, the ones working those crazy extreme hours, has fallen steeply.
Americans don't have two "nurture days" per child until age 8, as Denmark does. No year-long paid
parental leaves for mothers and fathers, as in Iceland. Nor a national three-month sabbatical
policy, which Belgium has.
Instead of taking the entire month of August off, the most employers voluntarily grant us American
workers tends to be two weeks. One in four workers gets no paid vacation or holidays at all,
one study found. And, in a telling annual report called the "Vacation Deprivation" study, travel
company Expedia figures that Americans left 577 million of those measly vacation days unused
last year.
Center for Economic and Policy Research, May 2013
So as I watched the Cadillac commercial, hanging onto that rich white guy's pause, I was hoping he'd
make a pitch to bring some sanity to American workaholic culture. It wouldn't have been a first for
the auto industry. Henry Ford outraged his fellow industrialists when he cut his workers' hours to
40 a week. (Standards in some industries at the time were for 12-hour workdays, 7 days a week.) Ford
did so because his internal research showed 40 hours was as far as you could push manual laborers
in a week before they got stupid and began making costly mistakes. He also wanted his workers to
have the leisure time to buy and use his cars.
The rich guy takes a breath and smirks. We work so much "Because we're crazy, driven hard-working
believers, that's why." Bill Gates. The Wright Brothers. Were they crazy? he asks. We went to the
moon and, you know what we got? Bored, he says.
"You work hard. You create your own luck. And you've gotta believe anything is possible." Fair
enough. "As for all the stuff?" he says as he knowingly unplugs his luxury electric car, "that's
the upside of only taking TWO weeks off in August, n'est-ce pas?"
THE predictions sounded like promises: in the future, working hours would be short and vacations
long. "Our grandchildren", reckoned John Maynard Keynes in 1930, would work around "three hours a
day" - and probably only by choice. Economic progress and technological advances had already shrunk
working hours considerably by his day, and there was no reason to believe this trend would not continue.
Whizzy cars and ever more time-saving tools and appliances guaranteed more speed and less drudgery
in all parts of life. Social psychologists began to fret: whatever would people do with all their
free time?
This has not turned out to be one of the world's more pressing problems. Everybody, everywhere
seems to be busy. In the corporate world, a "perennial time-scarcity problem" afflicts executives
all over the globe, and the matter has only grown more acute in recent years, say analysts at McKinsey,
a consultancy firm. These feelings are especially profound among working parents. As for all those
time-saving gizmos, many people grumble that these bits of wizardry chew up far too much of their
days, whether they are mouldering in traffic, navigating robotic voice-messaging systems or scything
away at e-mail - sometimes all at once.
Tick, tock
Why do people feel so rushed? Part of this is a perception problem. On average, people in rich
countries have more leisure time than they used to. This is particularly true in Europe, but even
in America leisure time has been inching up since 1965, when formal national time-use surveys began.
American men toil for pay nearly 12 hours less per week, on average, than they did 40 years ago - a
fall that includes all work-related activities, such as commuting and water-cooler breaks. Women's
paid work has risen a lot over this period, but their time in unpaid work, like cooking and cleaning,
has fallen even more dramatically, thanks in part to dishwashers, washing machines, microwaves and
other modern conveniences, and also to the fact that men shift themselves a little more around the
house than they used to.
The problem, then, is less how much time people have than how they see it. Ever since a clock
was first used to synchronise labour in the 18th century, time has been understood in relation to
money. Once hours are financially quantified, people worry more about wasting, saving or using them
profitably. When economies grow and incomes rise, everyone's time becomes more valuable. And the
more valuable something becomes, the scarcer it seems.
Individualistic cultures, which emphasise achievement over affiliation, help cultivate this time-is-money
mindset. This creates an urgency to make every moment count, notes Harry Triandis, a social psychologist
at the University of Illinois. Larger, wealthy cities, with their higher wage rates and soaring costs
of living, raise the value of people's time further still. New Yorkers are thriftier with their
minutes - and more harried - than residents of Nairobi. London's pedestrians are swifter than those in Lima. The
tempo of life in rich countries is faster than that of poor countries. A fast pace leaves most people
feeling rushed. "Our sense of time", observed William James in his 1890 masterwork, "The Principles
of Psychology", "seems subject to the law of contrast."
When people see their time in terms of money, they often grow stingy with the former to maximise
the latter. Workers who are paid by the hour volunteer less of their time and tend to feel more antsy
when they are not working. In an experiment carried out by Sanford DeVoe and Julian House at the
University of Toronto, two different groups of people were asked to listen to the same passage of
music - the first 86 seconds of "The Flower Duet" from the opera "Lakmé". Before the song, one group
was asked to gauge their hourly wage. The participants who made this calculation ended up feeling
less happy and more impatient while the music was playing. "They wanted to get to the end of the
experiment to do something that was more profitable," Mr DeVoe explains.
The relationship between time, money and anxiety is something Gary S. Becker noticed in America's
post-war boom years. Though economic progress and higher wages had raised everyone's standard of
living, the hours of "free" time Americans had been promised had come to nought. "If anything, time
is used more carefully today than a century ago," he noted in 1965. He found that when people are
paid more to work, they tend to work longer hours, because working becomes a more profitable use
of time. So the rising value of work time puts pressure on all time. Leisure time starts to seem
more stressful, as people feel compelled to use it wisely or not at all.
The harried leisure class
That economic prosperity would create feelings of time poverty looked a little odd in the 1960s,
given all those new time-saving blenders and lawnmowers. But there is a distinct correlation between
privilege and pressure. In part, this is a conundrum of wealth: though people may be earning more
money to spend, they are not simultaneously earning more time to spend it in. This makes time - that
frustratingly finite, unrenewable resource - feel more precious.
"In the name of simplicity, I even try to avoid instant messaging. But I also can't help worrying
that I am missing out. "
June 15, 2013 | NYT
I'M old enough to remember a simpler time in the office, when talking - whether in person or on
the phone - was the main way to communicate. I once had a job where I filled out those pink "While
You Were Out" slips for employees who had stepped away from their desks.
Then, in the 1990s, came e-mail, and things were never the same. Besides delivering a serious
blow to the sellers of those pieces of paper, e-mail made communicating with people incredibly -
and, at first, delightfully - easy.
Now, a few decades later, people constantly complain that their e-mail in-boxes are unmanageable.
And many more technologies have joined the workplace party. We can now use cellphones, texts, instant
messaging, text messaging, social media, corporate intranets and cloud applications to communicate
at work.
Something may have been lost as we adopted these new communication tools: the ability to concentrate.
"Nobody can think anymore because they're constantly interrupted," said
Leslie Perlow, a Harvard Business School professor and author
of "Sleeping With Your Smartphone." "Technology has enabled this expectation that we always be on."
Workers fear the repercussions that could result if they are unavailable, she said.
The intermingling of work and personal life adds to the onslaught, as people communicate about
personal topics during the workday, and about work topics when they are at home.
According to a 2011 article in The Ergonomics Open Journal, electronic communication tools can demand constant switching,
which contributes to a feeling of "discontinuity" in the workplace. On the other hand, people sometimes
deliberately introduce interruptions into their day as a way to reduce boredom and to socialize,
the article said.
We're only beginning to understand the workplace impact of new communication tools. The use of
such technology in the office is "less rational than we would like to think," said
Steve Whittaker, a professor of human-computer interaction at
the University of California, Santa Cruz. Sometimes, "it's one person who's an evangelist," he said.
"They will start using a particular thing, and they will bring other people along with them."
More tech-oriented types might favor the latest new communication "toy," while others, like me,
are less enthusiastic. In the name of simplicity, I even try to avoid instant messaging. But I also
can't help worrying that I am missing out.
Plenty of workplace advice focuses on how we, as individuals, can manage our technology, but in
many cases, this is a collective, team-level issue, Professor Perlow said.
As Professor Whittaker put it, "We haven't stabilized our regular practices," and these may need
to be negotiated among workers.
It's important to distinguish between collaborative and one-on-one communication, he said. Cloud-based
systems are meant for sharing and editing documents, and they can enable people in different cities
to work together in real time. Internal social media pages can be useful for seeking and sharing
knowledge.
But when one person wants to communicate with another privately, e-mail remains the go-to method,
Professor Whittaker said. That's why it is nearly universal, despite a general yearning for something
better.
To lessen the disruptive nature of e-mail and other messages, teams need to discuss how to alter
their work process to allow blocks of time where they can disconnect entirely, Professor Perlow said.
"I don't think you can do it without leadership support," she added.
MAYBE more managers, consulting with their teams, need to set up clear guidelines for communication.
When is it best to use the cloud? When is it best to use e-mail, or instant messaging? And when is
it acceptable, even preferable, to turn off all technology? Not that managers need to be dictators,
but a little clarity can lead to much more productivity.
Making it a priority to learn how to use the latest tools more effectively is a good idea, too.
For example, how do those filters that help prioritize messages really work?
And let's never forget the value of face-to-face, or voice-to-voice, communication. An actual
unrehearsed conversation - requiring sustained attention and spontaneous reactions - may be old-fashioned,
but it just might turn up something new.
I have long found inspiration and insight in Douglas Rushkoff's work, especially his keen understanding
of the pathologies of consumerism. In my 2009 book
Survival+, I wrote:
Rushkoff's reply to an interview question on the consequences of ubiquitous marketing reveals
how media/marketing has created an unquestioned politics of experience in which one's identity and
sense of self is constructed almost entirely by what one buys:
"Children are being adultified because our economy is depending on them to make purchasing
decisions. So they're essentially the victims of a marketing and capitalist machine gone awry.
You know, we need to expand, expand, expand. There is no such thing as enough in our current economic
model and kids are bearing the brunt of that.... So they're isolated, they're alone, they're desperate.
It's a sad and lonely feeling....The net effect of all of this marketing, all of this disorienting
marketing, all of the shock media, all of this programming designed to untether us from a sense
of self, is a loss of autonomy. You know, we no longer are the active source of our own experience
or our own choices. Instead, we succumb to the notion that life is a series of product purchases
that have been laid out and whose qualities and parameters have been pre-established."
In my view, this is a brilliant analysis of the rot at the heart of the American project.
In his new book, Rushkoff examines the telescoping of time and context wrought by ubiquitous digital
technologies. We're always accessible, always connected and every channel is always on; this
overload affects not just our ability to process information but our culture and the way media and
marketing are designed and delivered.
The title consciously plays off the influential 1970 book by Alvin Toffler,
Future Shock, which posited that our innate ability to process change was limited even as the
rate of change in our post-industrial world increased. That rate of change would soon overwhelm our
capacity to process new inputs and adapt to them.
In Rushkoff's view, we've reached that future:
the speed of change and the demands of the present are disorienting us in profound ways.
We all know what stress feels like: it often causes our view to narrow to the present stressor, and
we lose perspective and the ability to "make sense" of anything beyond managing the immediate situation.
Rushkoff identifies five symptoms of present shock:
Narrative collapse - the loss of linear stories and their replacement with both crass reality
programming and post-narrative shows like The Simpsons.
Digiphrenia - digitally provoked mental chaos as technology lets us be in more than one place
at any one moment. As Rushkoff notes in this chapter: "Our boss isn't the guy in the corner
office, but the PDA in our pocket. Our taskmaster is depersonalized and internalized."
Overwinding - trying to squish huge timescales into much smaller ones, for example, packing
a year's worth of retail sales expectations into a single Black Friday event.
Fractalnoia - making sense of our world entirely in the present tense, by drawing connections
between things with weak causal relationships, for example Big Data, which excels at identifying
correlations but is utterly incapable of identifying cause amidst the correlations.
Apocalypto - the intolerance for presentism leads us to fantasize a grand finale, the cultural
equivalent of a "market-clearing event."
As Janet Maslin of the New York Times wrote in her review: "How do we shield ourselves
from distraction, or gravitate to what really matters?"
Studies have shown that our innate ability to remember people and identify their relationships
with others is limited to around 150 people (Dunbar's number)--the size of a village or combat company. We undoubtedly
have similar innate limitations on how many channels of input we can absorb.
Clay Shirky (author of
Here Comes Everybody: The Power of Organizing Without Organizations) calls this filter failure,
his term for what used to be called information overload. Our filters become overloaded and
we lose the ability to "make sense" of what's going on around us.
As the phenomenologists discovered in the 20th century, our basic coping mechanism is to separate
the world (and inputs) into three basic categories: the focal point, the foreground and the deep
background. Being unable to sort out which input belongs in the three spaces leads to disorientation
and poor decisions.
The parallels between filter failure and stress are not coincidental, as we handle filter failure
and present shock the same way we handle stress: we limit inputs and make a concerted effort to reorient
our awareness and context, what some call "be still and know."
Another troubling parallel to present shock is addiction. People now respond to texts,
emails, alerts and phone calls like rats in the proverbial cage with the lever that releases another
dose of cocaine: they over-stimulate themselves to death but are incapable of restraining their impulse
for more.
The "obvious" solution is to turn off inputs as a way of restoring our ability to live in a present
without novelty and distraction. This is akin to withdrawal from a powerful opiate, and so we should
not be surprised that there are now treatment facilities for kids who need to detox from digital
inputs.
Rushkoff is especially attuned to the distortions in our experience of time created by digital
media and communication - present shock: "Time in the digital era is no longer linear but disembodied
and associative. The past is not something behind us on the timeline but dispersed through the sea
of information."
In effect, change no longer flows linearly like time; it flows in all directions at once.
History and meaningful context are both fatally disrupted by this non-linear flow of time and
narrative. Is it any wonder that we now read about young well-educated people who do not understand
the meaning of "policy"? To understand policy requires a grasp of the histories and narratives
that led to the policy, and the linear, causally-linked way that policy is designed to solve or ameliorate
a specific problem or challenge.
If the causal chains of history and narrative are disrupted, then how can anyone fashion a meaningful
context for actions and narratives, and effectively frame problems and solutions? If everything is
equally valid in a non-linear flood of data, then what roles can authenticity, experience and knowledge
play in making sense of our world?
These are knotty, complex issues, and you will find much to constructively ponder in
Present Shock.
For more than a decade, the most significant ritual in my work life has been to take on the
most important task of the day as my first activity, for 90 minutes, without interruption, followed
by a renewal break. I do so because mornings are when I have the highest energy and the fewest
distractions.
... Far and away the biggest work challenges most of us now face are cognitive overload
and difficulty focusing on one thing at a time.
Whenever I singularly devote the first 90 minutes of my day to the most challenging or important
task - they're often one and the same - I get a ton accomplished.
Following a deliberate break - even just a few minutes - I feel refreshed and ready to face the
rest of the day. When I don't start that way, my day is never quite as good, and I sometimes head
home at night wondering what I actually did while I was so busy working.
Performing at a sustainably high level in a world of relentlessly rising complexity requires that
we manage not just our time but also our energy - not just how many hours we work, but when we work,
on what and how we feel along the way.
Fail to take control of your days - deliberately, consciously and purposefully - and you'll be
swept along on a river of urgent but mostly unimportant demands.
It's all too easy to rationalize that we're powerless victims in the face of expectation from
others, but doing that is itself a poor use of energy. Far better to focus on what we can influence,
even if there are times when it's at the margins.
Small moves, it turns out, can make a significant difference.
When it comes to doing the most important thing first each morning, for example, it's best to
make that choice, along with your other top priorities, the night before.
Plainly, there are going to be times that something gets in your way and it's beyond your control.
If you can reschedule for later, even 30 minutes, or 45, do that. If you can't, so be it. Tomorrow
is another day.
If you're a night owl and you have more energy later in the day, consider scheduling your most
important work then. But weigh the risk carefully, because as your day wears on, the number of pulls
on your attention will almost surely have increased.
Either way, it's better to work highly focused for short periods of time, with breaks in between,
than to be partially focused for long periods of time. Think of it as a sprint, rather than a marathon.
You can push yourself to your limits for short periods of time, so long as you have a clear stopping
point. And after a rest, you can sprint again.
How you're feeling at any given time profoundly influences how effectively you're capable of working,
but most of us pay too little attention to these inner signals.
Fatigue is the most basic drag on productivity, but negative emotions like frustration, irritability
and anxiety are equally pernicious. A simple but powerful way to check in with yourself is to intermittently
rate the quantity and quality of your energy - say at midmorning, and midafternoon - on a scale from
1 to 10.
If you're a 5 or below on either one, the best thing you can do is take a break.
Even just breathing deeply for as little as one minute - in to a count of three, out to a count
of six - can quiet your mind, calm your emotions and clear your bloodstream of the stress hormone
cortisol.
Learn to manage your energy more skillfully, and you'll get more done, in less time, at a higher
level of focus. You'll feel better - and better about yourself - at the end of the day.
About the Author
Tony Schwartz is the chief executive of the Energy Project and the author, most recently, of
"Be Excellent at Anything: The Four Keys to Transforming the Way We Work and Live." Twitter: @tonyschwartz
Anne-Marie Hislop, Chicago
The key is figuring out when we are most productive and focused. Although a morning person,
I need early time for my rituals - exercise, shower, coffee, and NYT online (along with pop-ins
at other sites). Then, by the time I get to work around 8AM I will have my most focused, productive
hours.
I also find that standing while I work on my computer, which I do more and more when I
can (don't have a way to type extensively while standing) energizes me and helps me focus.
John Lamont, Pennsylvania
Frankly I think the core premise of the article, how to get more "done", needs to be questioned,
especially in the context to which it speaks, the corporate environment. We would be far better
served if Mr. Schwartz's audience spent the first 30 minutes of their day sitting back and thinking
about what they do, who it's really for, what are the consequences of what they and their company
do, and is it morally and ecologically ethical.
In the grand scheme of things I don't doubt mankind is now better off than it was 2,000 or
even 500 years ago, but in our relentless drive to produce, consume and sell we have developed,
and continue to develop technologies and complex global social interactions that have a good chance
of setting us back to the stone age.
Let's not be in such a hurry getting wherever we think we're going and spend a bit more time
pondering about where it is we actually want to be and who we want to be when we get there.
Everyone has a cell phone these days. Out here in my little corner of the world, in a county that
competes with the neighboring county for the poorest in the state, everyone can somehow afford smartphones
with generous data plans. I have no idea what people's eye colors are anymore, or if they even have
eyes, because all I see are the tops of their heads as they are bent over their tiny screens. This
stuff is not cheap-- I don't know anyone whose monthly bite is under a hundred dollars. Which is
why I have a cheapo TracFone, because I refuse to pay that much. Plus I like hoarding minutes, so
I turn it off. I don't have to be in constant contact with my eleventeen bestest friends at all times.
Michelle, Ma Bell
American telecoms are special beasts. Back in the olden days we had a single giant regulated monopoly,
AT&T. Technological progress was non-existent, and stuck at a level barely above Alexander Graham
Bell's original prototype. We could not own our telephones, but had to lease them from the phone
company, which made those old phones some of the most valuable hardware in existence because we kept
paying for them year after year after year. We could not add extensions, or attach any other equipment.
The one upside was rock-solid service, which set American telephone service apart from most other
countries, where unreliability was the norm.
Then Ma Bell was broken up and we got competition, sort of. We never got a choice of carriers
for local service, but long distance became competitive. Though again only sort-of, because in-state
calls cost more than cross-country calls, and other oddities. Now with mobile phones everywhere a
lot of people don't even bother with land lines, and they're all happy at getting free long distance,
even though it's not really free and they're paying a lot more. But even though mobile service costs
more, it includes more, like worse call quality and no-service areas. I estimate that 40% of all
cell traffic is "What? Are you there? Hello? What?"
When We Dialed Telephones
Where was I going with all this? Oh yeah, ubiquity. My grandmother had a single black dial telephone,
and it sat on a special table next to a chair in her entry hall. A phone call was a bit of an event--
she couldn't Web surf while half-listening, or watch TV, or go shopping, or put people on hold and
juggle multiple calls. She had conversations, one at a time. She couldn't just pick up and call someone
when she felt like it because she was on a party line. That is a shared phone line, which meant everyone
who shared the line could eavesdrop on your calls. When I grew up the other giant time- and attention-pit-of-suck,
television, was not yet everywhere, and a lot of our friends did not have TVs. So when we got together
we talked to each other. With eye contact and everything.
Now we're all proud that Android dominates mobile phones, rah rah Linux. Little kids have their
own phones, and again I marvel at how much people are willing to pay for their mobile fix. Sure,
for some it's a necessity, but in my somewhat humble opinion most of the time it's more akin to an
addiction. It's like the rats that push the button that stimulates the pleasure centers of their
brains, and then starve to death because they won't push the food button. Humans just plain love
to push buttons, and are willing to pay top dollar for the privilege-- vending machines, video poker,
serial channel-surfing on the TV, mobile phones; give them buttons to push and they're happy for
hours.
Whoa, you might be thinking, get off the grumpy train, because mobile phones are useful tools!
And you are right. But I'm still going to be grumpy at people who won't turn them off when we're
visiting or doing an activity together. You know those people who have to answer the phone no matter
what they're doing? Showering, sleeping, birthing babies? They're a thousand times worse with mobile
phones. In the olden days it was considered rude to leave the TV on when people came to visit. Unless
they came to watch a program, of course. Remember when call waiting was all new and special? And
an insult, like whoever you were talking to was hoping someone better than you would interrupt your
call. Now the phone is the TV, along with a million other interruptions, distractions, delights,
and portability. We can't escape the things.
Thinking. No Really.
One of the things I love about computer nerds is most of them understand the need for long stretches
of uninterrupted time to think, and to concentrate deeply on a task. It is impossible to master a
new skill or solve a problem when you're skittering randomly from one activity to the next, never
engaging more than the bare surface of your consciousness. It's unsatisfying, because you never accomplish
anything. Multi-tasking is a myth. It is the very rare human who can perform two or more tasks at
once. A "multi-tasker" is someone who juggles multiple chores and does a poor job at all of them.
I prefer total immersion: full attention and no distractions.
Magic happens in your brain when you achieve that state of total focus. It's almost a meditative
state. Obstacles fall away and your path becomes wide and clear. It's as though you're forging new
neural pathways and amping up your brainpower. Single-tasking has superpowers.
"At last I knew what had happened to Lucy. The television ate her. It must have been a terrible
thing to see. Now my parents were thinking of getting one. I was scared. They didn't understand what
television could do."
Beginning Android Programming
Pushing buttons is fun, and building the buttons is a million times more fun. Try Juliet Kemp's
excellent introduction to Android programming:
You may think you're getting more accomplished by working longer hours. You're probably wrong.
There's been a flurry of recent coverage praising Sheryl Sandberg, the chief operating officer
of Facebook, for leaving the office every day at 5:30 p.m. to be with her kids. Apparently she's
been doing this for years, but only recently "came out of the closet," as it were.
What's insane is that Sandberg felt the need to hide the fact, since there's a century of research
establishing the undeniable fact that working more than 40 hours per week actually decreases productivity.
In the early 1900s, Ford Motor ran dozens of tests to discover the optimum work hours for worker
productivity. They discovered that the "sweet spot" is 40 hours a week - and that, while adding another
20 hours provides a minor increase in productivity, that increase only lasts for three to four weeks,
and then turns negative.
Anyone who's spent time in a corporate environment knows that what was true of factory workers
a hundred years ago is true of office workers today. People who put in a solid 40 hours a week get
more done than those who regularly work 60 or more hours.
The workaholics (and their profoundly misguided management) may think they're accomplishing more
than the less fanatical worker, but in every case that I've personally observed, the long hours result
in work that must be scrapped or redone.
Accounting for Burnout
What's more, people who consistently work long work weeks get burned out
and inevitably start having personal problems that get in the way of getting things done.
I remember a guy in one company I worked for who used the number of divorces in his group as a
measure of its productivity. Believe it or not, his top management reportedly considered this a valid
metric. What's ironic (but not surprising) is that the group itself accomplished next to nothing.
In fact, now that I think about it, that's probably why he had to trot out such an absurd (and,
let's face it, evil) metric.
Proponents of long work weeks often point to the even longer average work weeks in countries like
Thailand, Korea, and Pakistan - with the implication that the longer work weeks are creating a competitive
advantage.
Europe's Ban on 50-Hour Weeks
However, the facts don't bear this out. In six of the top 10 most
competitive countries in the world (Sweden, Finland, Germany, Netherlands, Denmark, and the United
Kingdom), it's illegal to demand more than a 48-hour work week. You simply don't see the 50-, 60-,
and 70-hour work weeks that have become de rigueur in some parts of the U.S. business world.
If U.S. managers were smart, they'd end this "if you don't come in on Saturday, don't bother coming
to work on Sunday" idiocy. If you want employees (salaried or hourly) to get the most done - in the
shortest amount of time and on a consistent basis - 40 hours a week is just about right.
In other words, nobody should be apologizing for leaving work at a reasonable hour like 5:30
p.m. In fact, people should be apologizing if they're working too long each week�because it's probably
making the team less effective overall.
These models are interesting, but the mathematics underneath them can be challenging (here's
a taste):
'Rational Inattention' Guides Overloaded Brains, Helps Economists Understand Market Behavior, by Antonella
Tutino, Economic Letter, FRB Dallas: Between Internet news sources, social media and email,
people are awash in information, most of it accessible at near-zero cost. Yet, humans possess
only a finite capacity to process all of it. The average email user, for example, receives dozens
of messages per day. The messages can't all receive equal attention. How carefully does someone
read an email from a sibling or friend before crafting a reply? How closely does a person read
an email from the boss?
Limitations on the ability to process information force people to make choices regarding the
subjects to which they pay more or less attention. Economists have long acknowledged the existence
of human cognitive capacities, but only in recent years have models embodying such limits known
as "rational inattention" found their way into mainstream macroeconomics.
Rational inattention models have a broad range of applications. They may reconcile relatively
unchanged prices and volatile ones and how the two play out in aggregate demand in the U.S. economy.
Moreover, such models can capture salient features of the business cycle, providing a rationale
for sharp contractions or slower expansions. Finally, rational inattention models have significant
implications for monetary policy. Since the focus of these models revolves around formation of
people's expectations, understanding how individuals perceive the economy is instrumental to policymakers'
efforts to achieve output and price stabilization objectives.
Rational Inattention: A Primer
One macroeconomic school of thought-known as rational expectations-assumes that people fully
and quickly process all freely available information. By comparison, under rational inattention
theory, information is also fully and freely available, but people lack the capability to quickly
absorb it all and translate it into decisions. Rational inattention is based on a simple observation:
Attention is a scarce resource and, as such, it must be budgeted wisely.[1]
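The budgeting idea behind rational inattention can be made concrete with a toy Gaussian example (my own illustrative sketch, not taken from Tutino's article; the function name and numbers are assumptions). Suppose someone tracks several independent signals with prior variances v_i and can spend at most a fixed total budget of attention, measured in bits. With Gaussian signals, spending c_i bits on signal i cuts its posterior variance to v_i * 2^(-2*c_i), and the cost-minimizing budget equalizes the posterior variances of every signal that gets any attention at all - a "reverse water-filling" allocation in which sufficiently quiet signals are rationally ignored:

```python
import math

def allocate_attention(prior_vars, capacity_bits):
    """Reverse water-filling for a toy Gaussian rational-inattention model.

    Choose attention budgets c_i >= 0 (in bits), summing to capacity_bits,
    to minimize total posterior variance sum_i v_i * 2**(-2 * c_i).
    At the optimum every attended signal ends at the same posterior
    variance s, so c_i = max(0, 0.5 * log2(v_i / s)); find s by bisection.
    """
    lo, hi = 1e-12, max(prior_vars)
    for _ in range(200):
        s = (lo + hi) / 2.0
        used = sum(max(0.0, 0.5 * math.log2(v / s)) for v in prior_vars)
        if used > capacity_bits:
            lo = s  # budget overspent: raise the common posterior-variance floor
        else:
            hi = s  # budget left over: lower the floor
    return [max(0.0, 0.5 * math.log2(v / s)) for v in prior_vars]
```

With prior variances 4.0, 1.0, and 0.25 and a two-bit budget, the sketch spends roughly 1.5 bits on the noisiest signal, 0.5 bits on the middle one, and ignores the quietest entirely - the same qualitative logic these models use to explain why some prices respond quickly to news while others barely move.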
Part of the problem is information overload, and the author should call it that. The other
part is that the deluge of Internet information does not imply high quality of information. There is a
specific "Google effect," in which pages are created just to extract advertising fees and are promoted
using "link farms" to get a high Google ranking for the topic. And spending time on junk is spending
time on junk, whether it is electronic or not; it's just easier with a computer. Nicholas Carr's earlier
article in The Atlantic, "Is Google Making Us Stupid?", covers the same ground.
William Timothy Lukeman Death by a thousand distracting cuts, June 8, 2010
In this short but informative, thought-provoking book, Nicholas Carr presents an argument I've
long felt to be true on a humanist level, but supports it with considerable scientific research.
In fact, he speaks as a longtime computer enthusiast, one who's come to question what he once
wholeheartedly embraced ... and even now, he takes care to distinguish between the beneficial
& detrimental aspects of the Internet.
The argument in question?
Greater access to knowledge is not the same as greater knowledge.
An ever-increasing plethora of facts & data is not the same as wisdom.
Breadth of knowledge is not the same as depth of knowledge.
Multitasking is not the same as complexity.
The studies that Carr presents are troubling, to say the least. From what has been gleaned
to date, it's clear that the brain retains a certain amount of plasticity throughout life -- that
is, it can be reshaped, and the way that we think can be reshaped, for good or for ill.
Thus, if the brain is trained to respond to & take pleasure in the
faster pace of the digital world, it is reshaped to favor that approach to experiencing the world
as a whole. More, it comes to crave that experience, as the body increasingly craves
more of anything it's trained to respond to pleasurably & positively. The
more you use a drug, the more you need to sustain even the basic rush.
And where does that leave the mind shaped by deep reading? The mind that immerses itself in
the universe of a book, rather than simply looking for a few key phrases & paragraphs? The mind
that develops through slow, quiet contemplation, mulling over ideas in their entirety, and growing
as a result? The mature mind that ponders possibilities & consequences, rather than simply going
with the bright, dazzling, digital flow?
Nowhere, it seems.
Carr makes it clear that the digital world, like any other technology that undeniably makes
parts of life so much easier, is here to stay. All the more reason, then, to approach it warily,
suspiciously, and limit its use whenever possible, since it is so ubiquitous. "Yes, but," many
will say, "everything is moving so fast that we've got to adapt to it, keep up with it!" Not unlike
the Red Queen commenting that it takes all of one's energy & speed to simply remain in one place
while running. But what sort of life is that? How much depth does it really have?
Because some aspects of life -- often the most meaningful & rewarding aspects -- require time
& depth. Yet the digital world constantly makes us break it into discrete, interchangeable bits
that hurtle us forward so rapidly & inexorably that we simply don't have time to stop & think.
And before we know it, we're unwilling & even unable to think. Not in any way that allows true
self-awareness in any real context.
Emerson once said (as aptly quoted by Carr), "Things are in the saddle / And ride mankind."
The danger is that we'll not only willingly, even eagerly, wear those saddles, but that we'll
come to desire them & buckle them on ever more tightly, until we feel naked without them. And
we'll gladly pay anything to keep them there, even as we lose the capacity to wonder why we ever
put them on in the first place.
This book is a more fully fleshed out attempt to answer the question that Carr first posed
a couple of years ago in an article titled "Is Google Making Us Stupid?"
The Shallows: What the Internet is Doing to Our Brains explores the ideas of his Google article
in much more detail.
To fully understand what the Internet is doing to our brains, we must first understand our brains.
Carr highlights results from a variety of iconic and recent studies that illustrate the plasticity
of our thinking organs. We see experiments ranging from the severed sensory nerves of monkeys'
hands in the 1960s (and their brains' subsequent 'rewiring') to London taxi drivers whose posterior
hippocampi (the "part of the brain that plays a key role in storing and manipulating spatial
representations of a person's surroundings") were much larger than normal.
In short, we see plenty of evidence that the brain can reorganize itself, and is certainly
not fixed in one state for all of its adult life.
The Shallows then explores the history of the written word and its explosion due to Gutenberg's
invention, and even further back to the argument between Socrates and Plato concerning the value
of the written word. Socrates argued that if we committed all of our thoughts to paper, we would
not have to remember anything. How do we know this? From the writings of Plato, of course. The
soundwaves of Socrates' voice, as wise as he was, cannot travel through time like written words
can.
With the first half detailing the brain's plasticity and our species' history with the accumulation
of knowledge, Carr sets up the latter half of the book perfectly, and his ideas might be grossly
simplified into something like this:
P1: Experiments of brain plasticity have proven that our brains change over time.
P2: We are using the Internet for an increasing amount of our activities, including work, entertainment
and commerce.
P3: The Internet is a medium that encourages distractedness and makes our brains inept at remembering.
C: We are all becoming a lot more dependent upon our digital devices, and in doing so, are increasingly
distracted in everything we do, both online and off.
Carr's book is a giant caution sign on the side of the road that we ride into the increasingly
digital future. The caution sign might be too far behind us already, as we've blazed ahead and
rewired our minds to think like computers - logical, task-switching, and distracted at every second
of the day. If people in their 30s and 40s (who may have had the Internet for approximately
25-40% of their life times) are experiencing these changes in their brains, imagine the effect
the Internet is having on our youth. The Net Generation is defined to be those who have grown
up with the Net for more than half their lives. There are some who have had the Internet in 100%
of their life spans. Imagine that, never knowing a world without the Internet. Yes, some children
are younger than Google. Imagine explaining to your grandchildren that you grew up in a time that
didn't have the Internet, let alone the information organizing superpower known as Google.
Will we look back at this period of transition from print to digital and see it as being as momentous
as the shift from an oral culture to a print culture? What would Socrates have thought? Have we
become lesser human beings, inextricably tied to the addictive external memories of our computers
and mobile phones?
Could it be that George W. Bush's infamous "the Internets" quote was just a sign of the stupidity
to come? Perhaps Bush was ahead of his time. Perhaps the Flynn effect is about to peak, or already
has. Could the greatest learning tool ever created be so useful that we forget how to think as
we use it?
This is a great book for anyone who's interested in our society as a whole, how our brains work,
the effects of technology, and the process of learning. Highly recommended.
Carr, in his epilogue to this work, warns that we as a species have to be 'attentive to what
we stand to lose.' In his view, the brain's adaptation to the newer and newer technologies in
effect flattens our minds and deprives the individual human of the depth that once could be called
wisdom. I agree with him more because I am an avowed Luddite than for the wonderful argument he
lays out.
For example, I am enrolled in a science class at this time. As the semester comes to a close,
I have an average approaching 100. The problem is that I have actually learned little of the actual
science but I have instead learned to utilize the electronic tools built into the shell; the entire
evaluative framework of the class is on-line. I have learned how to find the answers but I do
not know the answers. Compare this to a literature class, where you have to maybe read and analyze
and memorize things and your own wisdom is grown because you build long-term memories that give
crucial context. This may not matter, per se, in terms of the professions of the future, but
it has a real impact when it comes to human-to-human interaction or aesthetic enjoyment.
The ironic thing is that I did not read this book. I instead listened to it on my mp3 player
as I worked and from time to time flipped through to look at Amazon or Facebook or Gmail. My own
mind has developed according to the standards of the internet -- I am hyperlinks not a straight
narrative. I can no longer read one book, but have to be 'reading' many simultaneously. This factor
will only increase as time and technology advance as we see ourselves more in terms of machines,
and the machines start to see themselves in terms of us.
The author's description of our decrease in attention and cognitive capabilities is at times
a bit exaggerated.
However, the idea of our mind and thought process physically changing and the descriptions
of the evolution of thought are really insightful and coherent.
SAN FRANCISCO - It's 1 p.m. on a Thursday and Dianne Bates, 40, juggles three screens. She listens
to a few songs on her
iPod, then taps out a quick e-mail on her
iPhone and turns her attention to the high-definition television.
Just another day at the gym.
As Ms. Bates multitasks, she is also churning her legs in fast loops on an elliptical machine
in a downtown fitness center. She is in good company. In gyms and elsewhere, people use phones and
other electronic devices to get work done - and as a reliable antidote to boredom.
Cellphones, which in the last few years have become full-fledged computers
with high-speed Internet connections, let people relieve the tedium of exercising, the grocery store
line, stoplights or lulls in the dinner conversation.
The technology makes the tiniest windows of time entertaining, and potentially productive. But
scientists point to an unanticipated side effect: when people keep their brains busy with digital
input, they are forfeiting downtime that could allow them to better learn and remember information,
or come up with new ideas.
Ms. Bates, for example, might be clearer-headed if she went for a run outside, away from her devices,
research suggests.
At the
University of California, San Francisco, scientists have found that when rats have a new experience,
like exploring an unfamiliar area, their brains show new patterns of activity. But only when
the rats take a break from their exploration do they process those patterns in a way that seems to
create a persistent memory of the experience.
The researchers suspect that the findings also apply to how humans learn.
"Almost certainly, downtime lets the brain go over experiences it's had, solidify them and turn
them into permanent long-term memories," said Loren Frank, assistant professor in the department
of physiology at the university, where he specializes in learning and memory.
He said he believed that when the brain was constantly stimulated, "you
prevent this learning process."
At the
University of Michigan, a study found that people learned significantly
better after a walk in nature than after a walk in a dense urban environment, suggesting that processing
a barrage of information leaves people fatigued.
Even though people feel entertained, even relaxed, when they multitask while exercising, or pass
a moment at the bus stop by catching a quick video clip, they might be taxing their brains, scientists
say.
"People think they're refreshing themselves, but they're fatiguing themselves," said Marc Berman,
a University of Michigan neuroscientist.
Regardless, there is now a whole industry of mobile software developers competing to help people
scratch the entertainment itch. Flurry, a company that tracks the use of apps, has found that mobile
games are typically played for 6.3 minutes, but that many are played for much shorter intervals.
One popular game that involves stacking blocks gets played for 2.2 minutes on average.
Today's game makers are trying to fill small bits of free time, said Sebastien de Halleux, a co-founder
of PlayFish, a game company owned by the industry giant Electronic Arts.
"Instead of having long relaxing breaks, like taking two hours for lunch, we have a lot of these
micro-moments," he said. Game makers like Electronic Arts, he added, "have reinvented the game experience
to fit into micro-moments."
Many business people, of course, have good reason to be constantly checking their phones. But
this can take a mental toll. Henry Chen, 26, a self-employed auto mechanic in San Francisco, has
mixed feelings about his BlackBerry habits.
"I check it a lot, whenever there is downtime," Mr. Chen said. Moments earlier, he was texting
with a friend while he stood in line at a bagel shop; he stopped only when the woman behind the counter
interrupted him to ask for his order.
Mr. Chen, who recently started his business, doesn't want to miss a potential customer. Yet he
says that since he upgraded his phone a year ago to a feature-rich BlackBerry, he can feel stressed
out by what he described as internal pressure to constantly stay in contact.
Do you ever wonder if you spend too much time online or find that
multitasking makes
homework feel like it takes forever? Do you end up getting less sleep because you're texting, chatting
or surfing or because you find that your brain can't seem to shut itself down for the night?
The
NY Times has an entire series devoted to
such topics called "Your
Brain on Computers." In
An Ugly Toll of Technology: Impatience and Forgetfulness, Dr. Kimberly Young is cited comparing
net addiction to eating disorders. Even adults are at risk for various problems involving
our ability to parent and nurture, as is highlighted by
The Risks of Parenting While Plugged In, which compared the addiction to alcoholism:
a parent might tell an objecting son or daughter "just one more text" while driving.
In
Attached to Computers and Paying a Price, our constant connection to information changes our
brain chemistry with squirts of dopamine (which are possibly addictive) in our
brains and alters our expectations of daily life in terms of boredom. More dangerous
are the risks involved in using devices while driving, yet it's easy to understand how we would reach
for them since so many of us are bound to be addicted to them. We read
of families missing big business deals, having family fights (and forgetting to pick up kids!) and
getting lower grades.
According to the article, "At home, people consume 12 hours of
media a day on average, when an hour spent with, say, the Internet and TV simultaneously counts
as two hours. That compares with five hours in 1960, say researchers at the University
of California, San Diego. Computer users visit an average of 40 Web sites a day, according to
research by RescueTime, which offers time-management tools.
As computers have changed, so has the understanding of the human brain. Until 15 years ago,
scientists thought the brain stopped developing after childhood. Now they understand that its
neural networks continue
to develop, influenced by things like learning skills."
If you'd like to test your own ability to focus, the NY Times features this
test. If you think you might be addicted and would like to find out more, check out the
Net Addiction site.
For what it's worth, I think technology can be helpful for practicing Spanish and staying up to
date on what is happening at school. It's even useful for contact between teachers, students
and parents (at times). What I take away from the NY Times series is how essential boundaries are
in terms of when/where my family and I are plugged in; there have to be concrete limits for us not
to get lost and fragmented.
I should have an overwhelming, Maalox-guzzling, stress-saturated schedule. Here's why: I'm a graduate
student in a demanding program. I'm working on several research papers while also attempting to nail
down some key ideas for my dissertation. I'm TA'ing and taking courses. I maintain this blog.
I'm a staff writer for
Flak Magazine. And to keep things interesting, I'm working on background research for a potential
new book project.
You would be reasonable to assume that I must get, on average, 7 to 8 minutes of sleep a night. But you would also be wrong. Let me explain...
For Some Reason It's Not...
Here is my actual schedule. I work:
From 9 to 5 on weekdays.
In the morning on Sunday.
That's it. Unless I'm bored, I have no need to even turn on a computer after 5 during the week
or any time on Saturday. I fill these times, instead, doing, well, whatever I want.
How do I balance an ambitious work load with an ambitiously sparse schedule? It's a simple idea
I call fixed-schedule productivity.
Fixed-Schedule Productivity
The system works as follows:
Choose a schedule of work hours that you think provides the ideal balance of effort and relaxation.
Do whatever it takes to avoid violating this schedule.
This sounds simple. But think about it for a moment. Satisfying rule 2 is not easy. If you took
your current projects, obligations, and work habits, you'd probably fall well short of satisfying
your ideal work schedule. Here's a simple truth: to stick to your ideal schedule will require
some drastic actions. For example, you may have to:
Dramatically cut back on the number of projects you are working on.
Ruthlessly cull inefficient habits from your daily schedule.
In the abstract, these all seem like hard things to do. But when you have the focus of a specific
goal - "I do not want to work past 5 on week days!" - you'd be surprised by how much easier
it becomes deploy these strategies in your daily life.
Let's look at an example...
Case Study: My Schedule
My schedule provides a good case study. To reach my relatively small work hour limit, I have to
be careful with how I go about my day. I see enough bleary-eyed insomniacs around here to know how
easy it is to slip into a noon to 3 am routine (the infamous "MIT cycle.") Here are some of the techniques
I regularly use to remain within the confines of my fixed schedule:
I serialize my projects. I keep two project queues - one for my student
projects and one for my writing projects. At any one moment I'm only working on the top project
from each queue. When I finish, I move on to the next. This focus lets me churn out quality results
without the wasted time of constantly dancing back and forth between multiple efforts. (As also
discussed here and
here.)
I'm ultra-clear about when to expect results from me. And it's not always soon.
If someone slips something onto my queue, I make an honest evaluation of when it will percolate
to the top. I communicate this date. Then I make it happen when the time comes. You can get away
with telling people to expect a result a long time in the future, if - and this is a big if -
you actually deliver when promised.
I refuse. If my queue is too crowded for a potential project to get done
in time, I turn it down.
I drop projects and quit. If a project gets out of control, and starts to
sap too much time from my schedule: I drop it. If something demonstrably more important comes
along, and it conflicts with something else in my queue, I drop the less important project. If
an obligation is taking up too much time: I quit. Here's a secret: no one really cares
what you do on the small scale. In the end you're judged on your large-scale list of
important completions.
I'm not available. I often work in hidden nooks of the various libraries
on campus. I check and respond to work e-mail only a few times a day. People have to wait for
responses from me. It's often hard to find me. Sometimes they get upset at first. But they don't
really need immediate access. And I will always respond within a reasonable timeframe and get
them what they need. So they adjust. And I get things done.
I batch and habitatize. Any regularly occurring work gets turned into a habit
- something I do at a fixed time on a fixed date. For example, I write blog posts on Sunday morning.
I do reading for my seminar on Friday and Monday mornings. Etc. Habit-based schedules for the
regular work makes it easier to tackle the non-regular projects. It also prevents schedule-busting
pile-ups.
I start early. Sometimes real early. On certain projects that I know are
important, I don't tolerate procrastination. It doesn't interest me. If I need to start something
2 or 3 weeks in advance so that my queue proceeds as needed, I do so.
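The queue discipline behind these techniques can be sketched in code. This is a hypothetical illustration of the idea, not anything from the post itself; the class, method names, and backlog limit are my own inventions:

```python
from collections import deque

class ProjectQueue:
    """Serialize work: only the project at the head of the queue is active."""

    def __init__(self, name):
        self.name = name
        self.projects = deque()  # FIFO: new work percolates up as older work finishes

    def accept(self, project, max_backlog=3):
        """Refuse new work when the queue is too crowded to deliver on time."""
        if len(self.projects) >= max_backlog:
            return False  # "I refuse."
        self.projects.append(project)
        return True

    def current(self):
        """The single project being worked on right now (or None)."""
        return self.projects[0] if self.projects else None

    def finish_current(self):
        """Complete the head project and move on to the next in line."""
        return self.projects.popleft() if self.projects else None

    def drop(self, project):
        """Drop a project that starts to sap too much time from the schedule."""
        try:
            self.projects.remove(project)
            return True
        except ValueError:
            return False

# Usage: two queues, as in the post -- one for student work, one for writing.
student = ProjectQueue("student")
student.accept("research paper")
student.accept("dissertation outline")
print(student.current())   # the head project: "research paper"
student.finish_current()
print(student.current())   # next in line: "dissertation outline"
```

The point of the sketch is that at any moment exactly one item per queue is active; everything else waits its turn, and work that can't fit is refused or dropped rather than parallelized.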
Why This Works
You could fill any arbitrary number of hours with what feels to be productive work. Between e-mail,
and "crucial" web surfing, and to-do lists that, in the age of David Allen, grow to lengths that
rival the bible, there is always something you could be doing. At some point, however, you have to
put a stake in the ground and say: I know I have a never-ending stream of work, but this
is when I'm going to face it. If you don't do this, you let the never-ending stream
of work push you around like a bully. It will force you into tiring, inefficient schedules. And you'll
end up more stressed and no more accomplished.
Fix the schedule you want. Then make everything else fit around your needs. Be flexible. Be efficient.
If you can't make it fit: change your work. But in the end, don't compromise. No
one really cares about your schedule except for yourself. So make it right.
1
Joyce Says:
That is awesome advice. Hats off to you for instilling such discipline in your work
but
as always it's predicated on the desire to set out and accomplish a goal - something that is often
missing.
On the BBC News site, Bill Thompson
takes the discussion
in an interesting new direction:
The Swiss developmental psychologist Jean Piaget described two processes that he believed lay
behind the development of knowledge in children. The first is assimilation, where new knowledge fits
into existing conceptual frameworks. More challenging is accommodation, where the framework itself
is modified to include the new information.
The current generation of 'search engines' seems to encourage a model of exploration that is disposed
towards assimilative learning, finding sources, references and documents which can be slotted into
existing frameworks, rather than providing material for deeper contemplation of the sort that could
provoke accommodation and the extension, revision or even abandonment of views, opinions or even
whole belief systems.
Perhaps the real danger posed by screen-based technologies is not that they are rewiring our brains
but that the collection of search engines, news feeds and social tools encourages us to link to,
follow and read only that which we can easily assimilate.
In the Globe and Mail, columnist Margaret Wente becomes the latest writer to
fess up to an evaporating ability to read long works of prose:
Google has done wondrous things for my stock of general knowledge. It also seems to have destroyed
my attention span. Like a flea with ADD, I jump back and forth from the Drudge Report to gardening
sites that list the growing time of Green Zebras ...
Thanks to Google, we're all turning into mental fast-food junkies. Google has taught us to be
skimmers, grabbing for news and insights on the fly. I skim books now too, even good ones. Once I
think I've got the gist, I'll skip to the next chapter or the next book. Forget the background, the
history, the logical progression of an argument. Just give me the takeaway.
Make information free, and we'll become gluttons of information, as Rob Horning
notes in an interesting post today:
As behavioral economists (most vociferously, Dan Ariely) have pointed out,
we find the promise of free things hard to resist (even
when a little thinking reveals that the free-ness is illusory). So when with very little effort
we can accumulate massive amounts of "free" stuff from various places on the internet, we can
easily end up with 46 days (and counting) worth of unplayed music on a hard drive. We end up with
a permanent 1,000+ unread posts in our RSS reader, and a lingering, unshakable feeling that we'll
never catch up, never be truly informed, never feel comfortable with what we've managed to take
in, which is always in the process of being undermined by the free information feeds we've set
up for ourselves. We end up haunted by the potential of the free stuff we accumulate, and our
enjoyment of any of it becomes severely impinged. The leisure and unparalleled bounty of a virtually
unlimited access to culture ends up being an endless source of further stress, as we feel compelled
to take it all in. Nothing sinks in as we try to rush through it all, and our rushing does nothing
to keep us from falling further behind-often when I attempt to tackle the unread posts in my RSS
reader, I end up finding new feeds to add, and so on, and I end up further behind than when I
started.
Information may be free, but, as Horning explains, it exacts a price in the time required to collect,
organize, and consume it. As we binge on the Net, the time available for other intellectual activities
- like, say, thinking - shrinks. Eventually, we get bloated, mentally, and a kind of intellectual
nausea sets in. But we can't stop because - hey - it's free.
Posted by kdawson on Monday June 23, @02:20AM from the but-we-knew-this dept. djvaselaar sends along an article
from The New Atlantis that summarizes recent research indicating that
multitasking
may be detrimental to work and learning. It begins, "In one of the many letters he wrote
to his son in the 1740s, Lord Chesterfield offered the following advice: 'There is time enough for
everything in the course of the day, if you do but one thing at once, but there is not time enough
in the year, if you will do two things at a time.' To Chesterfield, singular focus was not merely
a practical way to structure one's time; it was a mark of intelligence... E-mails pouring in, cell
phones ringing, televisions blaring, podcasts streaming--all this may become background noise, like
the 'din of a foundry or factory' that [William] James observed workers could scarcely avoid at first,
but which eventually became just another part of their daily routine. For the younger generation
of multitaskers, the great electronic din is an expected part of everyday life. And given what neuroscience
and anecdotal evidence have shown us, this state of constant intentional self-distraction could well
be of profound detriment to individual and cultural well-being."
In one of the many letters he wrote to his son in the 1740s, Lord Chesterfield offered the following
advice: "There is time enough for everything in the course of the day,
if you do but one thing at once, but there is not time enough in the year, if you will do two things
at a time." To Chesterfield, singular focus was not merely a practical way to structure
one's time; it was a mark of intelligence. "This steady and undissipated
attention to one object, is a sure mark of a superior genius; as hurry, bustle, and agitation, are
the never-failing symptoms of a weak and frivolous mind."
In modern times, hurry,
bustle, and agitation have become a regular way of life for many people-so much so that we have embraced
a word to describe our efforts to respond to the many pressing demands on our time: multitasking.
Used for decades to describe the parallel processing abilities of computers, multitasking is now
shorthand for the human attempt to do simultaneously as many things as possible, as quickly as possible,
preferably marshalling the power of as many technologies as possible.
In the late 1990s and early 2000s, one sensed a kind of exuberance about the possibilities of
multitasking. Advertisements for new electronic gadgets-particularly the first generation of handheld
digital devices-celebrated the notion of using technology to accomplish several things at once. The
word multitasking began appearing in the "skills" sections of résumés, as office workers restyled
themselves as high-tech, high-performing team players. "We have always multitasked-inability to walk
and chew gum is a time-honored cause for derision-but never so intensely or self-consciously as now,"
James Gleick wrote in his 1999 book
Faster. "We are multitasking connoisseurs-experts in crowding, pressing,
packing, and overlapping distinct activities in our all-too-finite moments." An article in the
New York Times Magazine in 2001 asked, "Who can remember life before multitasking? These days
we all do it." The article offered advice on "How to Multitask" with suggestions about giving your
brain's "multitasking hot spot" an appropriate workout.
But more recently, challenges to the ethos of multitasking have begun to emerge. Numerous studies
have shown the sometimes-fatal danger of using cell phones and other electronic devices while driving,
for example, and several states have now made that particular form of multitasking illegal. In the
business world, where concerns about time-management are perennial, warnings about workplace distractions
spawned by a multitasking culture are on the rise. In 2005, the BBC reported on a research study,
funded by Hewlett-Packard and conducted by the Institute of Psychiatry at the University of London,
that found, "Workers distracted by e-mail and phone calls suffer a fall in IQ more than twice
that found in marijuana smokers." The psychologist who led the study called this new "infomania"
a serious threat to workplace productivity. One of the Harvard Business Review's "Breakthrough
Ideas" for 2007 wasLinda Stone's notion of "continuous partial attention," which might be
understood as a subspecies of multitasking: using mobile computing power and the Internet, we are
"constantly scanning for opportunities and staying on top of contacts, events, and activities in
an effort to miss nothing."
Dr. Edward Hallowell, a Massachusetts-based psychiatrist who specializes in the treatment of attention
deficit/hyperactivity disorder and has written a book with the self-explanatory title
CrazyBusy, has been offering therapies to combat extreme multitasking
for years; in his book he calls multitasking a "mythical activity in which people believe they can
perform two or more tasks simultaneously." In a 2005 article, he described a new condition, "Attention
Deficit Trait," which he claims is rampant in the business world. ADT is "purely a response to the
hyperkinetic environment in which we live," writes Hallowell, and its hallmark symptoms mimic those
of ADD. "Never in history has the human brain been asked to track so many data points," Hallowell
argues, and this challenge "can be controlled only by creatively engineering one's environment and
one's emotional and physical health." Limiting multitasking is essential. Best-selling business advice
author Timothy Ferriss also extols the virtues of "single-tasking" in his book, The 4-Hour Workweek.
Multitasking might also be taking a toll on the economy. One study by researchers at the University
of California at Irvine monitored interruptions among office workers; they found that workers took
an average of twenty-five minutes to recover from interruptions such as phone calls or answering
e-mail and return to their original task. Discussing multitasking with the New York Times
in 2007, Jonathan B. Spira, an analyst at the business research firm Basex, estimated that extreme
multitasking-information overload-costs the U.S. economy $650 billion a year in lost productivity.
Changing Our Brains
To better understand the multitasking phenomenon, neurologists and psychologists have studied
the workings of the brain. In 1999, Jordan Grafman, chief of cognitive neuroscience at the National
Institute of Neurological Disorders and Stroke (part of the National Institutes of Health), used
functional magnetic resonance imaging (fMRI) scans to determine that when people engage in "task-switching"-that
is, multitasking behavior-the flow of blood increases to a region of the frontal cortex called Brodmann
area 10. (The flow of blood to particular regions of the brain is taken as a proxy indication of
activity in those regions.) "This is presumably the last part of the brain to evolve, the most mysterious
and exciting part," Grafman told the New York Times in 2001-adding, with a touch of hyperbole,
"It's what makes us most human."
It is also what makes multitasking a poor long-term strategy for learning. Other studies, such
as those performed by psychologist René Marois of Vanderbilt University, have used fMRI to demonstrate
the brain's response to handling multiple tasks. Marois found evidence of a "response selection bottleneck"
that occurs when the brain is forced to respond to several stimuli at once. As a result, task-switching
leads to time lost as the brain determines which task to perform. Psychologist David Meyer at the
University of Michigan believes that rather than a bottleneck in the brain, a process of "adaptive
executive control" takes place, which "schedules task processes appropriately to obey instructions
about their relative priorities and serial order," as he described to the New Scientist. Unlike
many other researchers who study multitasking, Meyer is optimistic that, with training, the brain
can learn to task-switch more effectively, and there is some evidence that certain simple tasks are
amenable to such practice. But his research has also found that multitasking contributes to the release
of stress hormones and adrenaline, which can cause long-term health problems if not controlled, and
contributes to the loss of short-term memory.
In one recent study, Russell Poldrack, a psychology professor at the University of California,
Los Angeles, found that "multitasking adversely affects how you learn. Even if you learn while multitasking,
that learning is less flexible and more specialized, so you cannot retrieve the information as easily."
His research demonstrates that people use different areas of the brain for learning and storing new
information when they are distracted: brain scans of people who are distracted or multitasking show
activity in the striatum, a region of the brain involved in learning new skills; brain scans of people
who are not distracted show activity in the hippocampus, a region involved in storing and recalling
information. Discussing his research on National Public Radio recently, Poldrack warned, "We have
to be aware that there is a cost to the way that our society is changing, that humans are not built
to work this way. We're really built to focus. And when we sort of force ourselves to multitask,
we're driving ourselves to perhaps be less efficient in the long run even though it sometimes feels
like we're being more efficient."
If, as Poldrack concluded, "multitasking changes the way people learn," what might this mean for
today's children and teens, raised with an excess of new entertainment and educational technology,
and avidly multitasking at a young age? Poldrack calls this the "million-dollar question." Media
multitasking-that is, the simultaneous use of several different media, such as television, the Internet,
video games, text messages, telephones, and e-mail-is clearly on the rise, as a 2006 report from
the Kaiser Family Foundation showed: in 1999, only 16 percent of the time people spent using any
of those media was spent on multiple media at once; by 2005, 26 percent of media time was spent multitasking.
"I multitask every single second I am online," confessed one study participant. "At this very moment
I am watching TV, checking my e-mail every two minutes, reading a newsgroup about who shot JFK, burning
some music to a CD, and writing this message."
The Kaiser report noted several factors that increase the likelihood of media multitasking, including
"having a computer and being able to see a television from it." Also, "sensation-seeking" personality
types are more likely to multitask, as are those living in "a highly TV-oriented household." The
picture that emerges of these pubescent multitasking mavens is of a generation of great technical
facility and intelligence but of extreme impatience, unsatisfied with slowness and uncomfortable
with silence: "I get bored if it's not all going at once, because everything has gaps-waiting for
a website to come up, commercials on TV, etc." one participant said. The report concludes on a very
peculiar note, perhaps intended to be optimistic: "In this media-heavy world, it is likely that brains
that are more adept at media multitasking will be passed along and these changes will be naturally
selected," the report states. "After all, information is power, and if one can process more information
all at once, perhaps one can be more powerful." This is techno-social Darwinism, nature red in pixel
and claw.
Other experts aren't so sure. As neurologist Jordan Grafman told Time magazine: "Kids that
are instant messaging while doing homework, playing games online and watching TV, I predict, aren't
going to do well in the long run." "I think this generation of kids is guinea pigs," educational
psychologist Jane Healy told the San Francisco Chronicle; she worries that they might become
adults who engage in "very quick but very shallow thinking." Or, as the novelist Walter Kirn suggests
in a deft essay in The Atlantic, we might be headed for an "Attention-Deficit Recession."
Paying Attention
When we talk about multitasking, we are really talking about attention: the art of paying attention,
the ability to shift our attention, and, more broadly, to exercise judgment about what objects are
worthy of our attention. People who have achieved great things often credit for their success a finely
honed skill for paying attention. When asked about his particular genius, Isaac Newton responded
that if he had made any discoveries, it was "owing more to patient attention than to any other talent."
William James, the great psychologist, wrote at length about the varieties of human attention.
In
The Principles of Psychology (1890), he outlined the differences among
"sensorial attention," "intellectual attention," "passive attention," and the like, and noted the
"gray chaotic indiscriminateness" of the minds of people who were incapable of paying attention.
James compared our stream of thought to a river, and his observations presaged the cognitive "bottlenecks"
described later by neurologists: "On the whole easy simple flowing predominates in it, the drift
of things is with the pull of gravity, and effortless attention is the rule," he wrote. "But at intervals
an obstruction, a set-back, a log-jam occurs, stops the current, creates an eddy, and makes things
temporarily move the other way."
To James, steady attention was thus the default condition of a mature mind, an ordinary state
undone only by perturbation. To readers a century later, that placid portrayal may seem alien-as
though depicting a bygone world. Instead, today's multitasking adult may find something more familiar
in James's description of the youthful mind: an "extreme mobility of the attention" that "makes the
child seem to belong less to himself than to every object which happens to catch his notice." For
some people, James noted, this challenge is never overcome; such people only get their work done
"in the interstices of their mind-wandering." Like Chesterfield, James believed that the transition
from youthful distraction to mature attention was in large part the result of personal mastery and
discipline-and so was illustrative of character. "The faculty of voluntarily bringing back a wandering
attention, over and over again," he wrote, "is the very root of judgment, character, and will."
Today, our collective will to pay attention seems fairly weak. We require advice books to teach
us how to avoid distraction. In the not-too-distant future we may even employ new devices to help
us overcome the unintended attention deficits created by today's gadgets. As one New York Times
article recently suggested, "Further research could help create clever technology, like sensors or
smart software that workers could instruct with their preferences and priorities to serve as a high
tech 'time nanny' to ease the modern multitasker's plight." Perhaps we will all accept as a matter
of course a computer governor-like the devices placed on engines so that people can't drive cars
beyond a certain speed. Our technological governors might prompt us with reminders to set mental
limits when we try to do too much, too quickly, all at once.
Then again, perhaps we will simply adjust and come to accept what James called "acquired inattention."
E-mails pouring in, cell phones ringing, televisions blaring, podcasts streaming-all this may become
background noise, like the "din of a foundry or factory" that James observed workers could scarcely
avoid at first, but which eventually became just another part of their daily routine. For the younger
generation of multitaskers, the great electronic din is an expected part of everyday life. And given
what neuroscience and anecdotal evidence have shown us, this state of constant intentional self-distraction
could well be of profound detriment to individual and cultural well-being. When people do their work
only in the "interstices of their mind-wandering," with crumbs of attention rationed out among many
competing tasks, their culture may gain in information, but it will surely weaken in wisdom.
This project combines ideas from mythology, folklore, and library and information science in an
effort to make sense of an aspect of modern culture that is frequently perceived as troublesome.
Discussions of information overload, "data glut," or "information anxiety" are abundant in popular
culture but do little to shed light on the origin of this problem. Library and information science
work sidesteps the need to verify the existence of information overload, seeking instead to mitigate
its effects. The discipline has produced a vast literature that addresses user perceptions, information
needs, and information-seeking behavior. Information management, information retrieval, and attendant
notions such as relevance have also received much attention. Within both popular culture and library
and information science research, information overload is usually described or defined by means of
anecdote or by associated symptoms.
However constituted, popular and scholarly attention confirms information overload as a recognized
and resonant cultural concept that persists even without solid corroboration. Mythology and folkloristics
are used here as analytic tools to suggest that information overload can be viewed as a myth of modern
culture. Here "myth" does not mean something untrue, but rather an overarching
prescriptive belief.
Trying to sip information from the fire hose is a challenging task :-). This memo
might help.
In today's world, mental overload is a fact of life. Fortunately, by applying some simple techniques
from the computer world, you can avoid some of the costly consequences of a too full brain!
SIGNS OF AN OVERLOAD
A too-full computer can:
• give you error messages
• run slower
• take longer to process information
• crash
A too-full brain can cause you to:
• make mistakes
• forget to do something
• let things slip through the cracks
• become sluggish
• lose creativity
• become unproductive
• procrastinate
• become indecisive
• get stressed out
• experience a total mental breakdown
Does excessive multitasking of the kind common among college students make us stupid? The answer,
according to the article below, is a tentative yes.
"You are trying to feed information through various kinds of processing channels in the brain
which have limited capacity and are really only available for one thing at a time."
"A lot of tasks we have to do, there are little moments of gaps which you can steal for another
task," said Hal Pashler, psychology professor at the University of California in San Diego. "The
interesting hidden cost ... is that (we) may be strikingly unable to recollect what happened."
"To the degree that tasks rely on similar processes, they are more likely
to interfere with each other. For instance, talking on the phone and writing an e-mail is
hard, because both involve language, Poldrack said."
"The answer is to choose carefully when you take on more than one job at once. For high-priority
or complex tasks, you might want to shut down your e-mail, turn off the phone and close your office
door.
Multi-tasking may be too much for the brain to handle
By Katherine Reynolds Lewis, Newhouse News Service, Friday, March 9, 2007
We feel so efficient, listening to a teleconference while sorting e-mail and eating lunch at the
same time. But experts warn that instead of completing three tasks in the space of one, we're really
spending more time to achieve mediocre results.
"Research that's looked at multi-tasking shows that you can't do it well -- no one can," said
Kristin Byron, assistant professor of management at Syracuse University. "You're fighting the way
your brain works."
The brain acts on just one task at a time. What we perceive as simultaneous multi-tasking is really
rapid switching back and forth to keep different tasks going -- even if one is as simple as deciding
to lift the sandwich for another bite.
It's like the classic vaudeville act of spinning plates. Your brain can set a task in motion,
then another, and then another, before returning to pick up the first task, explained David Strayer,
a psychology professor at the University of Utah in Salt Lake City.
"If the demands of any given task aren't too taxing, you can get two, three, four plates going
up, but at some point you're going to reach a threshold when they're going to crash."
You may avoid driving while talking on a cell phone because of the physical challenge of holding
both phone and steering wheel. But Strayer's research shows hands-free cell phone use is just as
dangerous while driving. The risk comes in toggling between the two mental demands.
Moreover, subjects in a recent study scored significantly lower on IQ tests they took while driving.
"When your attention is taken away from a task, you are not going to perform it as smartly," Strayer
said.
So does multi-tasking make us stupid?
It's not an outlandish conclusion. A 2005 study sponsored by Hewlett-Packard found the average
worker lost 10 IQ points when interrupted by ringing telephones and incoming e-mails -- about equal
to the cost of missing an entire night of sleep.
"Interruptions are time-consuming, and they are dangerous in the sense that they can lead to errors,"
said David E. Meyer, a psychology professor at the University of Michigan in Ann Arbor. "You are
trying to feed information through various kinds of processing channels in the brain which have limited
capacity and are really only available for one thing at a time."
Whenever we drop one task to perform another, we face "resumption costs" -- the time and energy
it takes to orient ourselves when we return to the original task. It's true that interweaving two
lengthy tasks can take less total time than performing the tasks separately. But there's a price.
"A lot of tasks we have to do, there are little moments of gaps which you can steal for another
task," said Hal Pashler, psychology professor at the University of California in San Diego. "The
interesting hidden cost ... is that (we) may be strikingly unable to recollect what happened."
That's because the free moments in each task -- such as waiting for a partner to respond in a
conversation -- appear to be used to store or consolidate memories. If we talk on the phone while
checking e-mail, it's at the expense of downtime our brains need.
"The conversation plus the e-mail may take less of your life, but the cost is that tomorrow you
may not know exactly what you said," Pashler said.
Thus, if you try to take in new material or facts while multi-tasking, you'll have a tougher time
learning, said Russell A. Poldrack, psychology professor at the University of California in Los Angeles.
Does all this mean we should never check our Blackberries while waiting in line at the grocery
store? Or even sip a cup of coffee while listening to a conference speaker? After all, multi-tasking
is woven into the fabric of modern life. More than 85 percent of people multi-task, and 67 percent
believe they do it well, according to a survey by Apex Performance, a Charlotte, N.C., training firm.
Fortunately, the experts give us some slack. "You can't say in every situation it would be better
to always focus on one task," Poldrack said.
If you're a stock trader who has to respond quickly to a lot of information, it makes sense to
monitor multiple televisions and computer screens at once, he said. It may not matter that the next
day you're hazy about which news anchor said what.
Certain physical actions, like walking or eating, are so hard-wired that they don't tax our brains
much. There's certainly no harm in combining simple, low-stakes tasks, like folding laundry and watching
television. And if background music energizes you to finish your work, that may outweigh the cost
of your mind shifting between listening and crafting a report, Poldrack said.
Similarly, talking to an adult passenger doesn't hurt your driving the way talking on a cell phone
does, Strayer has found. That's because the person in your car is attuned to the driving environment,
and will pause the conversation when a tricky maneuver approaches.
To the degree that tasks rely on similar processes, they are more likely to interfere with each
other. For instance, talking on the phone and writing an e-mail is hard, because both involve language,
Poldrack said.
The answer is to choose carefully when you take on more than one job at once. For high-priority
or complex tasks, you might want to shut down your e-mail, turn off the phone and close your office
door. Apex Performance founder Louis Czoka even recommends that clients shut their eyes to focus
on a teleconference.
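The "resumption costs" described in the article above can be illustrated with a toy simulation: two tasks run either sequentially or interleaved round-robin, with a fixed reorientation penalty each time a paused task is resumed. The task lengths, slice size, and per-switch cost below are made-up illustrative numbers, not measured data.

```python
# Toy model of multitasking overhead: sequential work vs. round-robin
# interleaving with a "resumption cost" paid on every return to a task.

def sequential_time(tasks):
    """Finish each task before starting the next: no switching overhead."""
    return sum(tasks)

def interleaved_time(tasks, slice_len, resume_cost):
    """Round-robin through the tasks in fixed-size slices, paying a
    reorientation cost every time we come back to a paused task."""
    remaining = list(tasks)
    first_visit = [True] * len(tasks)
    total = 0
    while any(r > 0 for r in remaining):
        for i, r in enumerate(remaining):
            if r <= 0:
                continue
            if not first_visit[i]:
                total += resume_cost      # time spent reorienting ourselves
            first_visit[i] = False
            work = min(slice_len, r)
            remaining[i] -= work
            total += work
    return total

tasks = [30, 30]                          # two 30-minute tasks
print(sequential_time(tasks))             # 60
print(interleaved_time(tasks, slice_len=5, resume_cost=2))   # 80
```

With these assumed numbers, constant switching inflates two hours' worth of 30-minute tasks by a third; shrinking the slice (i.e., switching more often) only makes the overhead worse.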
Just how bad have things gotten? That's the subject of Extreme Jobs: The Dangerous Allure of the
70-Hour Workweek, a recent study from the Center for Work-Life Policy. The study found that 1.7 million
people consider their jobs and their work hours extreme, thanks to globalization, BlackBerries, corporate
expectations and their own Type A personalities.
... .... ....
What Hewlett and Buck Luce found in their survey was that workers were themselves to blame.
Many of the people interviewed for the study say they love their jobs and are reluctant to lessen
their work load. In Agoglia's case, working for the small business consulting group was exactly what
she wished for. Now she only comes into the office on a need basis. "It offers an opportunity for
someone like me who needs more breathing room," she says, "but it also fulfills my desire to be challenged
in my job."
That kind of fulfillment has its hazards. Sixty-four percent of those surveyed said their work
pressures are self-inflicted but say it is taking a real toll on them individually. Nationally, 70
percent, and globally, 81 percent, say their jobs undermine their health in terms of exercise, diet
and the impact of stress. Nationally, 46 percent, and globally, 59 percent, say it gets in the way
of their relationships and nationally, 50 percent, say it affects their sex life.
Not surprisingly, men and women have a different take on the extreme nature of their jobs. In
the global survey, 58 percent of men and 80 percent of women say they didn't want to work these hours
for more than one more year. Says Buck Luce: "For women there's a flight risk. But men get burned
out and are able to stick with it. There's a tremendous stigma for men who say, 'I can't do this.'
That means there aren't going to be women at the top ranks of companies."
I wasn't surprised to read that 40% of Americans work 50 hours or more per week and rarely disconnect
from their work, even on vacation. I hear about it all the time in my seminars where people feel
like an 8 hour day is slacking off and working at night after the kids go to bed and in the morning
before the office really opens is the only way they can stay on top of things.
Is it that people have too much to do, or is it that they just don't have trusted systems (à la
GTD) to feel like they can disconnect?
I've heard David Allen mention that we've always had too much to do. I don't think BlackBerrys
necessarily create more work; it's just that now people have higher expectations about how fast the work
needs to get done.
Someone in one of my seminars recently told me she takes her laptop on vacation just to stay on
top of her email (people actually hissed when she said this, perhaps from the fear that this will
become expected.) The "vacation tax" of coming back to hundreds, if not thousands, of emails is just
not worth it to her.
It's vacation prime time. Millions of wage-earners are on the road, in the air or on the water
in search of overdue recreation, relaxation and adventure. But for too many, it will be a futile
quest, thanks to a big, fat killjoy stowed away on the trip: OCP, or obsessive-compulsive productivity,
a frantic fixation to wring results from every minute of the day, even our play.
Americans have always had an insistent work ethic. But thanks to technology that allows
us to get things done 24/7, growing job demands and the elevation of efficiency to an unofficial
national religion, many vacationers simply can't turn off their productive machinery. Every minute
of the day, even of play, must be productive.
It's a habit that's increasingly counterproductive, evident in soaring job-stress bills (a $300-billion-a-year
tab for U.S. business, according to the American Institute of Stress, a nonprofit organization) and
longer workweeks. Nearly 40% of Americans work more than 50 hours a week. The all-output, all-the-time
mandate of OCP wires us to do holidays like jobs. We cram downtime with to-do lists and a performance-review
mentality that dooms trips to disappointment because we couldn't see or do everything we wanted.
The trip's experience is an afterthought in a crazed race to polish off sights to the finish line
of the holiday.
But trying to make a vacation productive is like trying to get a cat to bark. It's the wrong animal
for the outcome, because vacations aren't about output. Instead, they're about the realm of an increasingly
rare species - input - that can't be measured by a performance yardstick. The most packed itinerary
can't quantify play, fun, wonder, discovery, adventure. How do you tally the spray of an exploding
waterfall? The pattern of ripples on a sand dune? How do you produce quiet?
The productivity of U.S. workers has doubled since 1969, according to Boston College economist
Juliet Schor. But none of the dividends have come back in additional free time. The added time that
greater productivity creates is simply fodder for more productivity increases - and OCP jitters that
we must get more done. How much production is enough?
Even on the job, too much time on task can lead to burnout, heart disease, carpal tunnel syndrome,
mistakes, costly do-overs and rote performance. A study last year by the University of Massachusetts
Medical School found that chronic 12-hour workdays increase your risk of illness or injury by 37%.
Work without time to think, analyze or recharge feeds knee-jerk performance and the hurry-worry
of stress. Everything appears urgent when there isn't time to judge what is truly urgent and what
isn't.
More than anybody else's, Americans' identity comes through labor. But the reflex to define self-worth
by what we get done makes it hard to relax without a heap of guilt because there's always something
next on the horizon to handle. Our focus on future results shrinks our experience of living and,
ironically, the very thing we need for optimum performance - input.
The consulting firm McKinsey & Co. asked managers where they got their best ideas. It wasn't at
the office. Rather, inspiration came when people were at play - on the golf course, running. Research
on fatigue in the workplace since the 1920s shows that performance rises after a break in the action,
whether a break of a few seconds or 15 minutes.
Studies have also found that job performance improves after a vacation. Income doubled at the
H Group, an investment services company in Salem, Ore., after owner Ron Kelemen increased employee
time off to 3 1/2 weeks. When Jancoa, a Cincinnati cleaning company, switched to a three-week vacation
policy, worker productivity soared enough to cut overtime. Profits jumped 15%.
The true source of productivity isn't nonstop output. It's a refreshed and energized mind, something
vacations specialize in.
But for that to happen, we must leave the OCP drill sergeant at home. Vacations require a different
skill set - leisure skills. Without them, we lapse into default mode - produce, produce, produce.
My retired father was stunned when he visited his former company and found a couple of his fellow
retirees back at their desks. They didn't know what else to do.
As kids, we knew how to entertain ourselves. But many of us lost the knack when we learned that
play for its own sake didn't produce rewards - status, pats on the back, money, goodies. Once we're
in OCP territory, we've forgotten how to do things simply because we enjoy doing them.
Researchers say we had it right as kids. "Quality of life does not depend on what others think
of us or what we own," contends psychology professor Mihaly Csikszentmihalyi in "Flow: The Psychology
of Optimal Experience." "The bottom line is, rather, how we feel about ourselves, and about what
happens to us. To improve life one must improve the quality of experience."
Famed for his studies on when people are at their happiest, Csikszentmihalyi adds that "when experience
is intrinsically rewarding, life is justified in the present."
Things we do for our amusement are particularly good at improving that experience, delivering
what's supposed to come out of all that production - self-worth, a sense of competence and, best
of all, life satisfaction. Upping levels of performance can't generate happiness, psychologists contend,
because production is tied to external approval, which is gone by the next morning's to-do list.
But research shows that the more active your leisure lifestyle is, the higher your life satisfaction.
Leisure also increases initiative, confidence and a positive mood.
So, if you haven't taken your vacation yet, maybe it's time to dust off the leisure portfolio
and resuscitate the childhood practice of play. The packing list should include participation, engagement,
spontaneity, a nonjudgmental attitude, the ability to ferret out amusements, take detours, wander
without aim, plunge into things you haven't done before, and get out of your head and into direct
experience. Along the way you may discover something long forgotten. Recess rules.
Computer addicts tend to lose all sense of time when they are on-line. They are
drawn so deeply into the world of bytes and bits that they do not notice entire days passing by.
They forget to eat, sleep, go to school, and even care for their children.
They shirk responsibilities, slack off at work, and miss appointments because they are unable to
pull themselves away.
The virtual world and the real world are competing for their attention, and
the virtual world often wins.
The Anxiety Disorders Education Program is a national education campaign developed
by the National Institute of Mental Health (NIMH) to increase awareness among the public and health
care professionals that anxiety disorders are real medical illnesses that can be effectively diagnosed
and treated. More than 19 million Americans suffer from anxiety disorders, which include panic disorder,
obsessive-compulsive disorder, post-traumatic stress disorder, phobias and generalized anxiety disorder.
They suffer from symptoms that are chronic, unremitting and usually grow progressively worse if left
untreated. Tormented by panic attacks, irrational thoughts and fears, compulsive behaviors or rituals,
flashbacks, nightmares, or countless frightening physical symptoms, people with anxiety disorders
are heavy utilizers of emergency rooms and other medical services. Their work, family and social
lives are disrupted, and some even become housebound. Many of them have co-occurring disorders such
as depression, alcohol or drug abuse, or other mental disorders. Because of widespread lack of understanding
and the stigma associated with these disorders, many people with anxiety disorders are not diagnosed
and are not receiving treatments that have been proven effective through research.
A reader from America, July 2, 1999
Excellent! A DIY approach to OCD and related disorders. A friend gave me this book and it
is excellent. If you have OCD or even a related disorder, it gives you a practical approach to learning
to deal with and outsmart your disorder.
Take me, for instance: while I do not have any checking compulsions, I have suffered
from anxiety disorder and occasionally intrusive, disturbing thoughts for a number of years. (Other
than that I am your regular guy; you wouldn't know I had a disorder if you saw me.) This book gives
you a 4-step method of "reframing" OCD in a way that makes it manageable. Ultimately, the authors
say, by using their method you can "retrain your brain" and actually alter your brain chemistry in
a positive direction and thus reduce the original symptoms to something liveable.
A reader from Santa Fe, NM, July 16, 1998
A good description of the problem and some solutions This book contains well-written descriptions
of obsessive-compulsive disorder -- it's informative, clear, and a pleasure to read. And for those
of us who either suffer from these disorders or are close to someone who does, it's an eye-opener:
you are NOT the only person who's ever had to deal with this problem, and there IS hope for curing
it! For all these reasons, I highly recommend the book. Two cautions, however: (1) The book gave
a good description of the ways of treating OCD as of the date it was written. Since then, however,
there have been many new developments, so, if you're specifically interested in treatments, you'll
need to look up some more recent books and articles. (2) "Obsessive-compulsive personality disorder"
(OCPD) is a related but different condition, and it's possible that someone who exhibits similar
symptoms but doesn't have full-blown OCD suffers from this instead. (My mother has never gone in
for compulsive hand-washing, but she's rigid, intolerant, controlling, and a pack rat on a truly
monumental scale. That's OCPD.) The treatments for the two conditions differ -- drugs are more helpful
for OCD than OCPD, for example. As with any mental condition, it's absolutely necessary to have a
thorough professional diagnosis; don't just march into your doctor's office demanding Prozac, or
stock up on St. John's Wort at your local herbalist's.
KIEV (Reuters) - A Ukrainian businessman who bought a pager for each member of his
staff as a New Year gift was so alarmed when all 50 of them went off at the same time that he drove
his car into a lamp post, a newspaper said Thursday.
The unnamed businessman was returning from the pager shop when the accident happened,
the Fakty daily reported. ''With no more than 100 meters to go to the office, the 50 pagers on the
back seat suddenly burst out screeching. The businessman's fright was such that he simply let go
of the steering wheel and the car ploughed into a lamp post.''
After he had assessed the damage to the car, the businessman turned his attention
to the message on the 50 pagers. It read: ''Congratulations on a successful purchase!''
The tremendous growth in the price-performance of networking and storage has fueled
the explosive growth of the web. The amount of information easily accessible from the desktop has
increased by several orders of magnitude in the last few years, and shows no signs of
abating. Users of the web are being confronted with the consequent information overload problem.
It can be exceedingly difficult to locate resources that are both high-quality and relevant to their
information needs. Traditional automated methods for locating information are easily overwhelmed
by low-quality and unrelated content. Thus, the second generation of search engines will have to
have effective methods for focusing on the most authoritative among these documents. The rich structure
implicit in the hyperlinks among Web documents offers a simple, and effective, means to deal with
many of these problems. The CLEVER search engine incorporates several algorithms that make use of
hyperlink structure for discovering high-quality information on the Web.
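CLEVER builds on link-analysis ideas in the spirit of Kleinberg's HITS algorithm, which scores pages as "hubs" (pages that link to many good sources) and "authorities" (pages that many good hubs link to). The following is a minimal illustrative sketch on a hypothetical five-page link graph, not IBM's actual implementation.

```python
# Hubs-and-authorities (HITS-style) iteration on a tiny made-up Web graph.
import math

# links[p] = pages that p points to (a hypothetical example graph)
links = {
    "a": ["c", "d"],
    "b": ["c", "d", "e"],
    "c": ["e"],
    "d": ["e"],
    "e": [],
}

pages = list(links)
hub = {p: 1.0 for p in pages}
auth = {p: 1.0 for p in pages}

for _ in range(50):  # power iteration until the scores stabilize
    # authority score: sum of hub scores of the pages linking to you
    auth = {p: sum(hub[q] for q in pages if p in links[q]) for p in pages}
    # hub score: sum of authority scores of the pages you link to
    hub = {p: sum(auth[q] for q in links[p]) for p in pages}
    # normalize so the values do not grow without bound
    na = math.sqrt(sum(v * v for v in auth.values()))
    nh = math.sqrt(sum(v * v for v in hub.values()))
    auth = {p: v / na for p, v in auth.items()}
    hub = {p: v / nh for p, v in hub.items()}

best = max(pages, key=lambda p: auth[p])
print(best)  # "e" -- the page everyone points to emerges as the top authority
```

The point of the sketch is that authority is inferred from the link structure alone: page "e" has no outgoing links and no keywords, yet it ranks first because the best hubs converge on it.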
The Last but not Least: "Technology is dominated by two types of people: those who understand what they do not manage and those who manage what they do not understand." ~ Archibald Putt, Ph.D.
Copyright © 1996-2021 by Softpanorama Society. www.softpanorama.org
The site was initially created as a service to the (now defunct) UN Sustainable Development Networking Programme (SDNP)
without any remuneration. This document is an industrial compilation designed and created exclusively
for educational use and is distributed under the Softpanorama Content License.
Copyright of original materials belongs to their respective owners. Quotes are made for educational purposes only
in compliance with the fair use doctrine.
FAIR USE NOTICE: This site contains
copyrighted material the use of which has not always been specifically
authorized by the copyright owner. We are making such material available
to advance understanding of computer science, IT technology, economic, scientific, and social
issues. We believe this constitutes a 'fair use' of any such
copyrighted material as provided by section 107 of the US Copyright Law according to which
such material can be distributed without profit exclusively for research and educational purposes.
This is a Spartan WHYFF (We Help You For Free)
site written by people for whom English is not a native language. Grammar and spelling errors should
be expected. The site contains some broken links, as it develops like a living tree...
You can use PayPal to buy a cup of coffee for the authors
of this site.
Disclaimer:
The statements, views, and opinions presented on this web page are those of the author (or
referenced source) and are
not endorsed by, nor do they necessarily reflect, the opinions of the Softpanorama Society. We do not warrant the correctness
of the information provided or its fitness for any purpose. The site uses AdSense, so you need to be aware of Google's privacy policy. If you do not want to be
tracked by Google, please disable JavaScript for this site. This site is perfectly usable without
JavaScript.