
The Decline of Productivity and How To Fix It – Part 2

Posted July 8th, 2015 in Articles, Blogs by admin

In Part 1 of “The Decline of Personal Productivity and How to Fix It,” I described how personal productivity has been declining in many people’s lives, with a negative impact on their performance and personal well-being. Part 2 of this article provides a number of practical suggestions on how you can deal with the problem and change your habits.

Strategies To Increase Productivity By Working Less, Taking Breaks, Controlling Disruptions and Developing a Personal Control System

  • Shorten your to-do list. Many of us unconsciously keep adding things to do in our lives without deleting any. So make an extensive list of the things that don’t really matter in your life (the things that are essentially just busyness) and delete them, so you can make more room for the things that really matter;
  • Change your have-to-do list to a love-to-do list. Prioritize all the things you do and make sure you commit corresponding time and energy to them rather than to less important things. Spend time on the things related to your most important values;
  • Rest in a do-nothing state. There is plentiful research evidence to show that doing nothing in a restful state (where there is nothing to accomplish) strengthens your sense of being, rather than doing, and can enhance your energy levels and creativity;
  • Learn how to say no and guard against energy vampires. Workplace culture often requires that you sacrifice time for others, whether that means acting as a mentor or maintaining an open-door policy. The benefit to others’ productivity often comes at a cost to your own. Greg McKeown, author of Essentialism: The Disciplined Pursuit of Less, recommends extreme selectivity as a check on your desire to always be accommodating. McKeown likes to ask people to imagine they have no to-do list, no inbox, no schedule of appointments. While it’s nice to help people and be the “go to” person at work, if you’re efficient and get your to-do list done, people will notice and readily ask you for help. Learning how to say no to those requests is critical for maintaining your physical and mental health;
  • Eliminate or seriously reduce multitasking. The research evidence on multitasking is pretty clear—it reduces productivity and cognitive functioning. Replace it with what has been termed “single tasking”—doing one thing at a time with complete focus. Learning how to meditate can help immensely;
  • Learn how to live in the present. This is an essential part of mindfulness. When we occupy our minds with thoughts and emotions by rehashing the past, hoping to change it, or constantly thinking about the future, particularly what could go wrong, we increase stress levels (including cortisol in our bodies) and become less productive. Focusing on the present moment, and on what you can do only then, is the answer;
  • Measure progress daily and stop measuring your worth by what you accomplish. In her book The Progress Principle, Teresa Amabile emphasizes progress (moving forward with one’s work) over productivity (getting things done well and efficiently, irrespective of their importance). A sense of making meaningful progress, she found, has a much greater positive impact on engagement and motivation. Her latest research suggests that the simple act of looking back on progress also positively affects your sense of accomplishment and how competent and effective you feel at work. Francesca Gino, also an HBS professor, asked some employees at an Indian company to spend 15 minutes at the end of each day writing about what had gone well. The group that took time to reflect had a performance level 23 percent higher than that of employees who spent those last 15 minutes simply working. If reviewing incomplete to-do lists brings us down, it appears compiling have-done lists bestows a sense of satisfaction and enhances performance;
  • Manage your energy, not your time. This concept is taken from Tony Schwartz’s book, The Power of Full Engagement, where he argues that managing your energy levels (physical, mental and emotional) is more important than trying to manage time to fit in all you want to do in life;
  • Stop writing down, visualizing and telling everyone about your goals. This may sound counterintuitive, because we’ve been brainwashed with the conventional wisdom of goal setting. Yet there is no research to show that writing down goals translates into success. And visualizing your goals being accomplished is more self-help-guru hype than a proven motivator;
  • Schedule breaks in your daily work: ideally, 17 minutes of rest for every 52 minutes of work, or even more frequently. Use a timer as a reminder (see the sketch after this list);
  • Schedule everything in your calendar rather than creating to-do lists. Scheduling forces you to confront what you have to do, rather than burying it on some hidden to-do list. This includes scheduling your free time. A study conducted at National Pingtung University of Science and Technology in Taiwan found a positive relationship between free time management and quality of life;
  • Plan your day in reverse. Start with your desired ending time (e.g., 5 p.m.), and then calendarize all you want or have to do in reverse order, including free time. This also gives you a greater sense of control over your work life, as opposed to reacting to time demands as they appear;
  • Take regular vacations or sabbaticals. A 2010 survey indicated that the average American accrues 18 vacation days and uses only 16. More than forty percent of American workers who received paid time off did not take all of their allotted time last year, despite the obvious personal benefits, according to “An Assessment of Paid Time Off in the U.S.,” commissioned by the U.S. Travel Association, a trade group, and completed by Oxford Economics. The average French worker takes more than twice the vacation time. To some, this statistic encapsulates the difference between American and European workers: Americans believe they are productive and Europeans are lazy. In fact, it’s the opposite. Europeans understand that breaks improve workplace efficiency, while Americans mistakenly believe that more hours will always increase output, ignoring the clear evidence that the secret to being an effective worker is not working too hard. Not taking vacation time is a bad idea, as it harms productivity and the economy; those are key findings of a new study released earlier this month. In one of the most watched TED Talks, Stefan Sagmeister, who takes a sabbatical every seven years, describes its incredible benefits;
  • Make the beginning of your day the most productive. Tim Ferriss, author of The 4-Hour Workweek, recommends not checking email for the first hour or two of the day. Dan Ariely, co-founder of the time management app Timeful and New York Times bestselling author of Predictably Irrational: The Hidden Forces That Shape Our Decisions, says you have 2-2.5 hours of peak productivity every day, usually the first two hours of the morning, during which you may actually be 30% more effective;
  • Control your email habits. Some organizations, particularly in Europe, have taken action to restrict internal emails to address their negative effects on productivity. Other studies have shown that attending to emails can occupy up to 30% of the working day without actually resulting in productive work. A study by the French business watchdog ORSE found that reading useless email messages damaged concentration, as it took more than five minutes to refocus on the task at hand. A study by the University of California, Irvine, co-written with United States Army researchers, found that people who do not look at email on a regular basis at work are less stressed and more productive. A study done at the University of London found that constant emailing and text-messaging reduces mental capability by an average of ten points on an IQ test (five points for women, fifteen for men). This effect is similar to missing a night’s sleep; for men, it’s around three times more than the effect of smoking cannabis;
  • Become a master at regulating your emotions. Research shows that how you start the day has an enormous effect on productivity, and that you procrastinate more when you’re in a bad mood. Researchers found that employees’ moods when they clocked in tended to affect how they felt the rest of the day, and that early mood was linked to their perceptions of customers and to how they reacted to customers’ moods;
  • Develop a personal system that simplifies things, is routinized and reduces decision fatigue. Roy Baumeister, author of the New York Times bestseller Willpower: Rediscovering the Greatest Human Strength, argues that every decision you make depletes your self-control: having too many choices or decisions to make depletes willpower, after which your self-control is impaired. Personal organization guru David Allen, author of the bestseller Getting Things Done, has great suggestions for such a personal system;
  • Engage in mindfulness practices. Mindfulness has been shown in research to help people be calmer and more grounded, handle stress better, and improve emotional regulation and cognitive functioning.
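For anyone who wants to experiment with the 52/17 rhythm suggested above, here is a minimal sketch of a reminder timer in Python. Only the 52- and 17-minute figures come from the DeskTime research cited in this article; everything else about the script (the names, the console-bell alert) is just one illustrative way to set up such a reminder.

```python
import time

# Work/break lengths (in minutes) from the DeskTime findings cited in this article.
WORK_MINUTES = 52
BREAK_MINUTES = 17

def countdown(minutes, label):
    """Wait the given number of minutes, then print an audible reminder."""
    time.sleep(minutes * 60)
    # "\a" rings the terminal bell; swap in a desktop notification if you prefer.
    print(f"\a{label} is over -- time to switch.")

if __name__ == "__main__":
    while True:
        print(f"Work with full focus for {WORK_MINUTES} minutes...")
        countdown(WORK_MINUTES, "Work block")
        print(f"Step away from the screen for {BREAK_MINUTES} minutes.")
        countdown(BREAK_MINUTES, "Break")
```

Start it in a terminal at the beginning of a work block and leave it running; stop it with Ctrl-C.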

We need to redefine productivity. Unfortunately, it has become synonymous with busyness. We are seduced by the cultural norm of measuring productivity against the goal of “getting all the work done.” So we invent checklists and to-do lists, and focus excessively on what hasn’t been done.

What if productivity were defined not by describing what we get done, but by doing things we want or love to do? Or by doing things well, instead of fast? What if we decided to take control over our lives and value time off, vacations and doing nothing as strategies for improving our productivity? Wouldn’t that create a different kind of life?

 

The Decline of Productivity and How To Fix It

Posted July 8th, 2015 in Articles, Blogs by admin

Part 1 of this article describes in detail the decline of personal productivity. Part 2 will provide some practical solutions.

The most common expressions I hear from my clients, colleagues and friends are “I don’t have enough time” and “I can’t seem to get everything done.” They are often amazed by the people who seem to be super productive without becoming workaholics.

Productivity, or the lack of it, seems to be a widespread personal and organizational problem.

At the organizational level, the emphasis on employee engagement levels, which is another way of defining productivity, has been a focus of many Gallup polls, other research and management fixes. At the personal level, the focus has been on work-life balance, workaholism, and stress.

A closer examination of the issue of productivity surfaces several important perspectives:

  • The applied definition of productivity
  • The relationship between productivity and working hours
  • The impact of technology on productivity
  • Our scattered and overstimulated lives
  • Solutions to the personal productivity problem.

The Definition of Productivity

The dictionary defines productivity as “the quality, state, or fact of being able to generate, create, enhance, or bring forth goods and services.” Since the industrial revolution began, we have equated productivity with other concepts and beliefs: progress and growth. The success of our free market capitalist system and our economic prosperity have since been based on structural systems and habits that require unending economic progress and growth. Yet we are now beginning to realize that this obsession with economic growth and productivity is creating huge problems. Growth requires a constant increase in the flow of raw materials extracted from the planet to be turned into goods, services and waste. The more we grow, certainly under current economic thinking, the more resources we need to use and the more pollution we create. Our definition of productivity takes a purely positive perspective, with no hint of detrimental effects; hence our belief that productivity is good, and that anything that can enhance it is good. But what if productivity were bad? What if the bad effects outweighed the good?

Productivity and Working Hours

The industrial revolution’s factory model of work ushered in the use of the average worker (though not the wealthy owner) as virtual slave labor, with 12- to 14-hour working days, six or seven days a week. Eventually the 40-hour workweek became the base upon which the workplace was structured. As global economic competition increased, working hours were assumed to be the driver of economic success. Indeed, the concept has been integrated into accepted measurements such as GDP and GNP, neither of which measures human well-being or social factors. And while the 40-hour workweek became the norm for a while, partly due to government policies and the power of unions, that norm has slowly been eroded, most notably in North America and Asian countries, but not in many European countries, where the workweek has been reduced.

In the late 1700s, Benjamin Franklin predicted we’d work a 4-hour week. In 1933 the U.S. Senate passed a bill for an official 30-hour workweek, which was vetoed by President Roosevelt. In 1965, a U.S. Senate subcommittee predicted a 22-hour workweek by 1985 and a 14-hour workweek by 2000. None of those predictions has come to pass. In fact, the opposite is true: the number of hours people work is increasing.

Working hours in North America and the U.K. have steadily risen in the last 20 years. A DTI research report found that 1 in 6 employees now works more than 60 hours a week. Full-time employees in the U.K. work the longest hours in Europe, and a British Medical Association report found that 77% of consultants work more than 50 hours a week and 46% more than 60 hours.

According to U.S. Census and CPS data, the number of employed American men regularly working more than 48 hours per week is higher today than it was 25 years ago. CPS data from 1979 to 2006 show that this increase was greatest among highly educated, highly paid, and older men, was concentrated in the 1980s, and was largely confined to workers paid on a salaried basis. A new study by the Organization for Economic Cooperation and Development (OECD) confirms that, on average, people in the U.S. are putting in 20 percent more hours of work than they did in 1970. It also shows that over the same period, the number of hours worked decreased in all the other industrialized countries except Canada. The average workweek in the U.S. is 54 hours, according to a 2007 Sage Software survey. In an average week, only 14 percent of workers put in 40 hours or less; one-third work 50-59 hours a week, and 80% work between 40 and 79 hours, according to a 2006 study of 2,500 Americans. In Japan, in contrast, annual work hours declined 17 percent, and in France they declined by 24 percent. In general, a third of all American workers could be viewed as chronically overworked in 2004, according to a report by the nonprofit Families and Work Institute in New York City.

So in many ways we have begun to accept overwork or workaholism as a necessity to drive productivity. At what cost?

In the U.S. and Canada, workaholism remains what it’s always been: the so-called “respectable addiction” that’s as dangerous as any other, and one that can afflict people whether or not they hold paying jobs. “Yes, workaholism is an addiction, an obsessive-compulsive disorder, and it’s not the same as working hard or putting in long hours,” says Bryan Robinson, PhD, one of the leading researchers on the disorder and author of Chained to the Desk and other books on workaholism. Workaholics’ obsession with work is all-occupying, which prevents them from maintaining healthy relationships or outside interests, or even taking measures to protect their health.

So who are these workaholics? According to several research studies, there is no typical profile, although Baby Boomers are more susceptible to workaholism than Generation Y workers. Most workaholics are successful, and they are more likely to be managers or executives, more likely to be unhappy about their work/life balance, and work on average more than 50 hours per week. They neglect their health, sometimes with devastating results, and ignore their friends and family. They avoid going on vacation so they don’t have to miss work, and even when they do go on vacation, they aren’t fully present because their minds are still on work.

It’s been my experience in working with many firms, particularly large ones, that overwork is the norm. In a society where job dedication is praised, workaholism is an invisible addiction. Work is at the core of much of modern life. If you work excessively, you can be both praised in the corporate world and criticized for a lack of work-life balance.

Workaholism is like a badge of courage for many. Professionals are working harder than ever and the 40-hour work week is a thing of the past. Workaholism is a reflection of our culture’s embrace of an extreme ethos. For many professionals, work is the center of their social life and friendships.

Personal connections, once made exclusively through family, friends and civic organizations, are now made in the workplace. In conversations with executives and employees alike in the boardrooms and lunchrooms I have visited, the most common comments I hear are phrases such as “I’m up to my neck in alligators,” or “I can’t keep up,” or “not enough time.”

The phenomenon of overwork can’t be blamed entirely on employers and bosses. Laura Vanderkam, author of What the Most Successful People Do on the Weekend, contends many workers lack the self-discipline to set proper boundaries between work and their personal lives. Many report that overwork gives them a feeling of being needed or important.

Do More Working Hours Mean Greater Productivity?

Not according to research. Economists have long argued that working longer hours negatively affects productivity. John Hicks, a British economist, was one of the first to look at the issue in the 1930s, and he concluded that productivity declined as working hours increased.

John Pencavel of Stanford University showed in his research that reduced working hours can be good for productivity. The study found that productivity declined markedly after more than 50 hours of work per week. It also showed that the absence of a rest day (such as Sunday) damaged productivity.

The Draugiem Group, a social networking company, used a time-tracking productivity app called DeskTime to conduct an experiment on what habits set its most productive employees apart. It found that the employees with the highest productivity didn’t work longer hours than anyone else; in fact, they didn’t even work full eight-hour days. What they did instead was take regular breaks (17 minutes for every 52 minutes of work). Other studies have shown that 90 minutes of continuous work without a break reduces cognitive performance. What was critical about the breaks was their focus: these productive people did something totally unrelated to work, rather than checking email, phone messages or other tasks. Instead, they took a walk, read a book, meditated, or engaged in social talk.

There’s more proof that working more hours per day doesn’t translate into greater productivity. In Greece, the average number of hours worked per worker is among the highest in the OECD, second only to Korea, yet the economy there has ground to a halt, partly because of problems with worker productivity. In contrast, the economies of Germany and Sweden, where workers put in considerably fewer hours, are robust.

Longer hours have also been connected to absenteeism and employee turnover. The Centers for Disease Control and Prevention even has an entire website devoted to the effects of long working hours, even when workers aren’t paid for the extra time.

A survey from UBS has shown that the French continue to work the fewest hours per year in the world. People in the surveyed cities work an average of 1,902 hours per year, with much longer hours in Asian and Middle Eastern cities. People in Lyon and Paris, by contrast, spend the least amount of time at work in the global comparison: 1,582 and 1,594 hours per year respectively. Nationmaster ranks France #18 in GDP per capita, at $36,500 per person, yet the French work much less than most developed nations. They achieve their high standard of living while working 16% fewer hours than the average world citizen, and almost 25% fewer than their Asian peers.

The Impact of Technology on Productivity

Technological progress was assumed to have driven productivity and economic growth. Yet there is evidence that it hasn’t contributed greatly to our standard of living. Between 1991 and 2012 the average annual increase in real wages in Britain was 1.5%, and in America 1%, according to the Organization for Economic Co-operation and Development, a club of mostly rich countries. That was less than the rate of economic growth over the period and far less than in earlier decades. Other countries fared even worse. Real wage growth in Germany from 1992 to 2012 was just 0.6%; Italy and Japan saw hardly any increase at all. And, critically, those averages conceal plenty of variation: real pay for most workers remained flat or even fell, whereas for the highest earners it soared.

It seems difficult to square this unhappy experience with the extraordinary technological progress during that period, but the same thing has happened before. Most economic historians reckon there was very little improvement in living standards in Britain in the century after the first Industrial Revolution. And in the early 20th century, as Victorian inventions such as electric lighting came into their own, productivity growth was every bit as slow as it has been in recent decades. This failure of new technology to boost productivity (apart from a brief period between 1996 and 2004) became known as the Solow paradox. Economists disagree on its causes. Robert Gordon of Northwestern University suggests that recent innovation is simply less impressive than it seems, and certainly not powerful enough to offset the effects of demographic change, inequality and sovereign indebtedness.

Technology (smartphones, tablets, email, and instant messaging) has allowed workers, both at work and at home, to be “on” and available at all times, even outside working hours. And increasingly, people are working on vacations or not taking vacations at all, particularly in the U.S.

Our Scattered and Overstimulated Lives

John Robinson, one of the leading researchers on the issue of time use, says the biggest problem we have today is not that we don’t have enough time; it’s that our lives are so fragmented, overstimulated and interrupted. Ed Hallowell, bestselling author of Driven to Distraction, argues that we have a “culturally generated ADD.” In other words, there are so many distractions and stimuli that we are losing our ability to focus.

Many studies have shown that most workers are frequently interrupted at work. Top CEOs and executives can be interrupted as often as every 20 minutes.

And research has shown that after every interruption it takes an average of 25 minutes to fully regain your cognitive focus. Dr. Gloria Mark, associate professor at the Donald Bren School of Information and Computer Sciences at the University of California, found that the average information worker is interrupted every three minutes – nearly twenty times per hour, or seventy-three times every day – and the average manager is interrupted every eight minutes. Interruptions include telephone calls, incoming email messages, interruptions by colleagues, and crises. On average, most of us experience one interruption every 8 minutes, or approximately 6-7 per hour; in an 8-hour day, that totals around 50-60 interruptions. The average interruption takes approximately 5 minutes, so 50 interruptions a day costs 250 minutes, or just over 4 hours out of 8 – about 50% of the workday. Cognitive studies on interruptions show that an interruption requires immediate attention and action, and most of us allow and even encourage interruptions to take precedence over other tasks. We often respond quickly to these interruptions because doing so gives us a feeling of closure, knowing we may not have to address the issue in the immediate future.
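The interruption arithmetic above is easy to adapt to your own workday. The small sketch below simply restates the article’s figures (50 interruptions a day, 5 minutes each, an 8-hour day); substitute your own numbers to estimate your personal cost:

```python
# Back-of-the-envelope cost of interruptions, using the figures cited above.
interruptions_per_day = 50     # roughly one every 8 minutes
minutes_per_interruption = 5   # average time per interruption
workday_minutes = 8 * 60       # an 8-hour day

lost_minutes = interruptions_per_day * minutes_per_interruption
share = lost_minutes / workday_minutes
print(f"Lost to interruptions: {lost_minutes} minutes "
      f"({share:.0%} of the workday)")
# Output: Lost to interruptions: 250 minutes (52% of the workday)
```

Note that this counts only the interruptions themselves, not the 25-minute refocusing cost mentioned above, which would push the real total even higher.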

And what about multitasking?

The evidence is pretty clear that multitasking is not efficient and takes a severe toll on productivity. No two tasks can be done at the same time with 100% efficiency. As multitasking increases, our ability to distinguish the relevant from the irrelevant declines. You’ve likely heard that multitasking is problematic, but new studies show that it kills your performance and may even damage your brain. Research conducted at Stanford University found that multitasking is less productive than doing a single thing at a time. The researchers also found that people who are regularly bombarded with several streams of electronic information cannot pay attention, recall information, or switch from one job to another as well as those who complete one task at a time. Interestingly, the research conducted at the University of London found that participants who multitasked during cognitive tasks experienced IQ score declines similar to what the researchers would expect from smoking marijuana or staying up all night; IQ drops of 15 points lowered multitasking men’s scores to the average range of an 8-year-old child. Finally, it was long believed that cognitive impairment from multitasking was temporary, but new research at the University of Sussex found that multitaskers had less brain density in the anterior cingulate cortex, a region responsible for empathy as well as cognitive and emotional control.

In summary, there are significant reasons why personal productivity is declining. Part 2 of this article will suggest strategies to fix the problem.

The Crisis of Fatherhood

Posted July 8th, 2015 in Blogs by admin

As we approach Father’s Day, it may be useful for us to reflect on what’s happened to fatherhood, and indeed male identity in America.

Some would argue that America is rapidly becoming a fatherless society, or perhaps more accurately, an absentee-father society. The importance and influence of fathers in families has been in significant decline since the Industrial Revolution and is now reaching critical proportions. The near-total absence of male role models has ripped a hole the size of half the population in many urban areas. For example, in Baltimore only 38 percent of families have two parents, and in St. Louis the proportion is 40 percent.

Across time and cultures, fathers have always been considered essential—and not just for their sperm. Indeed, no known society ever thought of fathers as potentially unnecessary. Marriage and the nuclear family—mother, father, and children—are the most universal social institutions in existence. In no society has the birth of children out of wedlock been the cultural norm. To the contrary, concern for the legitimacy of children is nearly universal.

As Alexander Mitscherlich argues in Society Without A Father, there has been a “progressive loss of the father’s authority and diminution of his power in the family and over the family.”

“If present trends continue,” writes David Popenoe, a professor of sociology at Rutgers University, “the percentage of American children living apart from their biological fathers will reach 50% by the next century.” He argues that “this massive erosion of fatherhood contributes mightily to many of the major social problems of our time…Fatherless children have a risk factor of two to three times that of fathered children for a wide range of negative outcomes, including dropping out of high school, giving birth as a teenager and becoming a juvenile delinquent.”

According to David Blankenhorn, author of Fatherless America, chair of the National Fatherhood Initiative and founder/president of the Institute for American Values, and according to research conducted by Popenoe and scores of other researchers:

  • Approximately 30% of all American children are born into single-parent homes, and for the black community, that figure is 68%;
  • Fatherless children are at a dramatically greater risk of drug and alcohol abuse, mental illness, suicide, poor educational performance, teen pregnancy, and criminality, according to the U.S. Department of Health and Human Services, National Center for Health Statistics.
  • Over half of all children living with a single mother are living in poverty, a rate 5 to 6 times that of kids living with both parents;
  • Child abuse is significantly more likely to occur in single parent homes than in intact families;
  • 63% of youth suicides are from fatherless homes according to the U.S. Bureau of the Census;
  • 72% of adolescent murderers grew up without fathers, and 60% of America’s rapists grew up the same way, according to a study by D. Cornell et al. in Behavioral Sciences and the Law;
  • 63% of 1500 CEOs and human resource directors said it was not reasonable for a father to take a leave after the birth of a child;
  • 71% of all high school dropouts come from fatherless homes according to the National Principals Association Report on the State of High Schools;
  • 80% of rapists motivated with displaced anger come from fatherless homes according to a report in Criminal Justice & Behavior;
  • In single-mother families in the U.S. about 66% of young children live in poverty;
  • 90% of all homeless and runaway children are from fatherless homes;
  • Children from low-income, two-parent families outperform students from high-income, single-parent homes. Almost twice as many high achievers come from two-parent homes as one-parent homes according to a study by the Charles F. Kettering Foundation.
  • 85% of all children who exhibit behavioral disorders come from fatherless homes, according to a study by the Centers for Disease Control;
  • Of all violent crimes against women committed by intimates, about 65% were committed by either boyfriends or ex-husbands, compared with 9% by husbands;
  • Girls living with non-natal fathers (boyfriends and stepfathers) are at higher risk for sexual abuse than girls living with natal fathers;
  • Daughters of single mothers are 53% more likely to marry as teenagers, 111% more likely to have children as teenagers, 164% more likely to have a premarital birth and 92% more likely to dissolve their own marriages.
  • A large survey conducted in the late 1980s found that about 20% of divorced fathers had not seen their children in the past year, and that fewer than 50% saw their children more than a few times a year.
  • Juvenile crime, the majority of which is committed by males, has increased six-fold since 1992;
  • In a longitudinal study of 1,197 fourth-grade students, researchers observed “greater levels of aggression in boys from mother-only households than from boys in mother-father households,” according to a study published in the Journal of Abnormal Child Psychology.
  • Scholastic Aptitude Test scores have declined more than 70 points in the past two decades; children in single-parent families tend to score lower on standardized tests and to receive lower grades in school, according to a Congressional Research Service report.

Blankenhorn argues that America is facing not just the loss of fathers, but also the erosion of the ideal of fatherhood. Few people doubt the fundamental importance of mothers, Popenoe comments, but increasingly the question is being raised of whether fathers are really necessary; many now claim fatherhood is merely a social role that others (mothers, partners, stepfathers, uncles and aunts, grandparents) can play.

“The scale of marital breakdowns in the West since 1960 has no historical precedent that I know of,” says Lawrence Stone, the noted Princeton University family historian. “There has been nothing like it for the last 2,000 years, and probably longer.” Consider what has happened to children. Most estimates are that only about 50% of the children born during the 1970-84 “baby bust” period will still live with their natural parents by age 17, a staggering drop from nearly 80%.

Despite current interest in father involvement in families, an extremely large proportion of family research focuses on mothers and children. Health care agencies and other organizations exclude fathers, often unwittingly. Starting with pregnancy, labor and delivery, most appointments are set up for mothers and held at times when fathers work.

The same is true for most pediatric visits. School records and files in family service organizations often have the child’s and mother’s name on the label, and not the father’s. In most family agency buildings, the walls are typically pastel colors, the pictures on the wall are of mothers, flowers and babies, the magazines in the waiting room are for women and the staff is predominantly female. In most welfare offices, fathers are not invited to case planning meetings, and when a home visitor is greeted at the door by a man, she often asks to speak with the mother. Given these scenarios, fathers are likely to get the message that they are invisible or irrelevant to their children’s welfare, unless it involves financial support.

Popenoe and others have examined the role of fathers in raising children and found that it differs significantly from the role of mothers, particularly in the emphasis fathers place on play; that special style of fathering is explored in more detail below.

At play and in other realms, fathers tend to stress competition, challenge, initiative, risk taking and independence. Mothers, as caretakers, stress emotional security and personal safety. Fathers’ involvement seems to be linked to improved quantitative and verbal skills, improved problem-solving ability and higher academic achievement for children. Men also have a vital role to play in promoting cooperation and other “soft” virtues. Involved fathers, according to one 26-year longitudinal research study, may be of special importance for the development of empathy in children.

Family life (marriage and child rearing) is a civilizing force for men. It encourages them to develop prudence, cooperativeness, honesty, trust, self-sacrifice and other habits that can lead to success as an economic provider and a good example.

Mark Finn and Karen Henwood, writing about masculinity and fatherhood in the British Journal of Social Psychology, argue that the tension between the traditional view of masculinity, with its focus on power, aggression, economic security and “maleness,” and the emerging new view of fatherhood, which incorporates many aspects of motherhood, is a source of struggle for many men who become fathers.

In a study of fatherhood in popular TV sitcoms, Timothy Allen Pehlke and his colleagues concluded that fathers are generally shown as relatively immature, unhelpful and incapable of taking care of themselves in comparison with other family members. In addition, the researchers found that fathers often served as the butt of family members’ jokes. However humorous the intention, these characterizations depict fathers as socially incompetent objects of derision.

In a study of depictions of fathers in best-selling children’s picture books, researcher Suzanne M. Flannery Quinn found that of the 200 books analyzed, there were only 24 in which the father appears alone and only 35 in which mother and father appear together. She concludes, “because fathers are not present or prominent in a large number of these books, readers are given only a narrow set of images and ideas from which they can construct an understanding of the cultural expectations of fatherhood and what it means to be a father.”

It seems to me that the issue of the decline of fatherhood and the problem of the male identity crisis are inextricably intertwined.

In my Psychology Today article, “Our male identity crisis: What will happen to men?” I said, “In a post-modern world lacking clear-cut borders and distinctions, it has been difficult to know what it means to be a man and even harder to feel good about being one. The many boundaries of a gendered world built around the opposition of work and family (production versus reproduction, competition versus cooperation, hard versus soft) have been blurred, and men are groping in the dark for their identity.”

The portrayal of men and the male identity in contemporary western societies is overwhelmingly negative. Men today are extensively demonized, marginalized and objectified, in a way reminiscent of what happened to women. The issue of the male identity is of crucial importance because males are falling behind in school, committing more suicides and crimes, dying younger and being treated for conditions such as ADHD more than females.

There has also been a loss of fatherhood in society as artificial insemination by anonymous donors is on the rise; medical experiments have even shown that sperm can now be grown artificially in a laboratory. Divorce rates have risen, and in most cases child custody is granted to mothers. The continuous negative portrayal of men in the media, along with the feminization of men and the loss of fatherhood in society, has caused confusion and frustration in younger-generation males, as they do not have a specific role model and are less able to define their role in society.

From once being seen as successful breadwinners, heads of families and being respected leaders, men today are the butt of jokes in the popular media. A Canadian research group, Nathanson and Young, conducted research on the changing role of men and media and concluded that widely popular TV programs such as The Simpsons present the father character, Homer, as lazy, chauvinistic, irresponsible, and stupid and his son, Bart, as mischievous, rude and cruel to his sister. By comparison, the mother and daughter are presented as thoughtful, considerate and mild-natured. The majority of TV shows and advertisements present men as stupid buffoons, or aggressive evil tyrants or insensitive and shallow “studs” for women’s pleasure.

According to J.R. Macnamara, in the book, Media and the Male Identity: The Making and Remaking of Men, less than 20% of media profiles reflected positive themes for men. Violent crimes, including murder, assault, and armed robberies accounted for over 55% of all media reporting of male activities. Macnamara says that over 30% of all discussion in the media of male sexuality was in relation to pedophilia, and males’ heterosexuality associated with masculinity is seen as violent, aggressive and dominating. Men are frequently shown in TV shows and movies as lacking in commitment in relationships and are shown as frequently cheating on women. And with increasing frequency, women are shown on TV shows and movies as being independent single mothers, not needing a man.

Guy Garcia, author of The Decline of Men: How The American Male is Tuning Out, Giving Up and Flipping Off His Future,  argues that many men bemoan a “fragmentation of male identity,” in which husbands are asked to take on unaccustomed familial roles such as child care and housework, while wives bring in the bigger paychecks. “Women really have become the dominant gender,” says Garcia, “what concerns me is that guys are rapidly falling behind. Women are becoming better educated than men, earning more than men, and, generally speaking, not needing men at all. Meanwhile, as a group, men are losing their way.”

“The crisis of fatherhood, then, is ultimately a cultural crisis, a sharp decline in the traditional sense of communal responsibility,” contends Popenoe. “It therefore follows that to rescue the endangered institution of fatherhood, we must regain our sense of community.”

Beyond that, fathers—men—bring an array of unique and irreplaceable qualities that women do not ordinarily bring. Some of these are familiar, if sometimes overlooked or taken for granted. The father as protector, for example, has by no means outlived his usefulness. And he is important as a role model. Teenage boys without fathers are notoriously prone to trouble. The pathway to adulthood for daughters is somewhat easier, but they still must learn from their fathers, as they cannot from their mothers, how to relate to men. They learn from their fathers about heterosexual trust, intimacy, and difference. They learn to appreciate their own femininity from the one male who is most special in their lives (assuming that they love and respect their fathers). Most important, through loving and being loved by their fathers, they learn that they are worthy of love.

Recent research has given us much deeper—and more surprising—insights into the father’s role in child rearing. It shows that in almost all of their interactions with children, fathers do things a little differently from mothers. What fathers do—their special parenting style—is not only highly complementary to what mothers do but is by all indications important in its own right.

For example, an often-overlooked dimension of fathering is play. From their children’s birth through adolescence, fathers tend to emphasize play more than caretaking. This may be troubling to egalitarian feminists, and it would indeed be wise for most fathers to spend more time in caretaking. Yet the fathers’ style of play seems to have unusual significance. It is likely to be both physically stimulating and exciting. With older children it involves more physical games and teamwork that require the competitive testing of physical and mental skills. It frequently resembles an apprenticeship or teaching relationship: Come on, let me show you how.

The way fathers play affects everything from the management of emotions to intelligence and academic achievement. It is particularly important in promoting the essential virtue of self-control. According to one expert, “Children who roughhouse with their fathers . . . usually quickly learn that biting, kicking, and other forms of physical violence are not acceptable.” They learn when enough is enough.

Children, a committee assembled by the Board on Children and Families of the National Research Council concluded, “learn critical lessons about how to recognize and deal with highly charged emotions in the context of playing with their fathers. Fathers, in effect, give children practice in regulating their own emotions and recognizing others’ emotional clues.” A study of convicted murderers in Texas found that 90 percent of them either didn’t play as children or played abnormally.

So as we annually celebrate Father’s Day and reflect on its importance to social stability, more men in our culture need to find their male identity and commit to the central importance of fatherhood.

Why the Business World Needs Liberal Arts Graduates

Posted July 8th, 2015 in Articles by admin

Liberal arts education is in a life-and-death struggle amid pressure from politicians, business leaders and educational administrators to diminish or eliminate its presence in our post-secondary institutions and replace it with a job-targeted educational system emphasizing technological and practical skills. Yet, ironically, the importance and utility of a liberal education has never been greater.

The Causes

Powerful forces have contributed to the perspective that a liberal education is no longer relevant, including:

  • Each year in the U.S. alone, more than 30 million workers are working in jobs that did not exist in the previous quarter;
  • Every year, more than 1/3 of the entire labor force changes jobs;
  • Students now graduating from post-secondary institutions will have 10-14 jobs by the time they are 38 years old;
  • Competition with Asian countries where technological education is stressed;
  • Unemployment rates for college graduates remain high despite the economic recovery from the recession.

Political and “Expert” Pressure on Colleges and Universities

Business and military leaders complain that students are ill-educated for the work that needs doing. Some, such as Walter Russell Mead, recommend scrapping the liberal arts in higher education and replacing them with skill-based certificates; others, such as the Council on Foreign Relations, recommend an education system that produces better soldiers, security analysts, managers and producers.

Liberal arts education programs are under duress in higher education, in an atmosphere of increasing anti-intellectualism in which uninformed opinions, based on few facts and even less study of our history and culture, are spouted daily by political and business leaders.

Part of the reason for the decline of the liberal arts in colleges and universities, and for their greater focus on the professions, technology and the sciences, is economic. The rising cost of post-secondary education has put a liberal arts education out of reach for most working-class and middle-class families, and these students are compelled to pursue vocationally oriented educations out of necessity. Second, higher education institutions have partially solved their funding problem by turning more and more to research grants and endowments provided by corporations, which are often driven by self-interest.

In an article, Joseph Epstein argues that the division between vocational and liberal arts education, which began during the 19th century with the advent of the land-grant state universities in the U.S., is today tilting further and further in favor of the vocational. Even within the liberal arts, more and more students are fleeing traditional courses such as English or History for marketable subjects such as Economics, in the hope that this will bring them the practical credentials that might impress prospective employers.

The war on the liberal arts is also born of the same impulse in right-wing America that has produced voter ID laws: an attempt to limit democratic participation. The goal of a liberal arts education was never primarily direct economic benefit for the recipients; it was to produce an educated citizenry.

Rosanna Warren, the Hanna Holborn Gray Distinguished Professor at the University of Chicago, argues “Most people need and want the arts in their lives. Our civilization may now be so coarsened that we will eliminate the humanities from our schools, and we will train citizens only for technical skills which give them no sense of what they are living for, or why.”

What is a liberal education and why is it important?

According to the Association of American Colleges and Universities, a liberal education can be defined as “an approach to learning that empowers individuals and prepares them to deal with complexity, diversity, and change. It provides students with broad knowledge of the wider world (e.g. science, culture, and society) as well as in-depth study in a specific area of interest. A liberal education helps students develop a sense of social responsibility, as well as strong and transferable intellectual and practical skills such as communication, analytical and problem-solving skills, and a demonstrated ability to apply knowledge and skills in real-world settings.”

The term “liberal education” was first used in classical Greek and Roman times, chosen to emphasize the fact that it helped people deal with their rulers critically. Through time, a liberal education was thought to help a person become wise.

As former U.S. Secretary of Education Richard Riley has so aptly put it, “We are currently preparing students for jobs that don’t yet exist, using technologies that haven’t been invented, in order to solve problems that we don’t even know are problems yet.”

“I think, increasingly, anything you learn is going to become obsolete within a decade,” says Lawrence Summers, a former president of Harvard University, “and so the most important kind of learning is about how to learn.”

David Autor, the MIT economist who has studied the impact of technology and globalization on labor, writes: “Human tasks that have proved most amenable to computerization are those that follow explicit, codifiable procedures where computers now vastly exceed human labor in speed, quality, accuracy and cost efficiency. Tasks that have proved most vexing to automate are those that demand flexibility, judgment and common sense.” In other words, the kinds of skills learned in a liberal arts education.

The humanities and social sciences are not merely elective, nor are they elite or elitist. They are necessary and require our support in challenging times as well as in times of prosperity. And our current education system in North America is losing that perspective. So says a report by the national commission on the humanities and social sciences of the American Academy of Arts and Sciences. 

It’s fashionable these days for many business leaders to lampoon liberal arts graduates and exalt those with professional degrees. Yet Peter Drucker, often acknowledged as the world’s foremost expert on management and leadership, considered this belief misplaced. Drucker drew many of his insights from literature and the social sciences, not economics and business. Rick Wartzman, executive director of the Drucker Institute, argues, “The problem is that the broad world of ideas has become largely separated from the world of business.”

In an article in the London Times entitled “Harvard’s Masters of the Apocalypse,” Philip Broughton, a Harvard Business School graduate and author of What They Teach You At Harvard, says, “You can draw up a list of the greatest entrepreneurs of recent history, from Larry Page and Sergey Brin of Google and Bill Gates of Microsoft, to Michael Dell, Richard Branson, Lakshmi Mittal – and there’s not an MBA among them.” Mark Zuckerberg was a classic liberal arts student who also happened to be passionately interested in computers. He studied ancient Greek intensively in high school and majored in psychology while he attended college.

Steve Jobs made a statement that points straight to the value of the liberal arts in the 21st century. “We’re not just a tech company, even though we invent some of the highest technology products in the industry,” said Jobs, “It’s the marriage of that plus the humanities or the liberal arts that distinguishes Apple.”

Norman Augustine, longtime chairman and CEO of Lockheed Martin, insists that liberal arts deficiencies are putting the U.S. at a strategic disadvantage. In a 2010 American Management Association study, less than 50% of the executives polled said their employees had effective communication and innovative-thinking skills, and 80% said colleges and universities could better prepare America’s future work force by placing more emphasis on the humanities.

E. O. Wilson, the world-renowned American biologist, declares: “We are drowning in information while starving for wisdom. The world henceforth will be run by synthesizers, people able to put together the right information at the right time, think critically about it, and make important choices wisely.”

“You need some people who are holistic thinkers and have liberal arts backgrounds and some who are deep functional experts,” Laszlo Bock, Google’s senior vice president who oversees the company’s hiring, told the New York Times. “Building that balance is hard, but that’s where you end up building great societies, great organizations.”

At the very moment when China, Singapore and some European countries are seeking to institute the concept of a broad liberal education, increasingly the U.S. and Canadian higher education institutions are narrowing their focus on scientific and technological enterprises.

What About Financial End Results?

There are lots of cause-and-effect arguments out there: science and technology graduates get paid more than liberal arts graduates, so the argument goes. Yet in a Wall Street Journal article, Melissa Korn cites research by the Association of American Colleges and Universities (AACU), which advocates a broad-based liberal arts education, showing that while liberal arts graduates initially make lower salaries than business graduates, in the long term the differences are minimal. An excerpt of the AACU’s report states: “The case for Liberal Arts goes beyond purely vocational or economic reasons, they are indispensable to the vitality of democracy and future of global understanding and community.”

Finally, Fareed Zakaria, in his recent article in the Washington Post, raises the alarm about the American push for STEM education and the diminishment of a liberal education, saying, “America will not dominate the 21st century by making cheaper computer chips but instead by constantly reimagining how computers and other new technologies will interact with human beings.”

In summary, it’s clear that a core Liberal Education is necessary for a thriving economy and sustaining a democratic society.

The Body Image and Eating Disorder Tsunami

Posted July 8th, 2015 in Articles, Blogs by admin

Negative body image and eating disorders constitute a not-so-silent tsunami that is wreaking havoc in the lives of women and men today.

This article was prepared with the assistance of Samantha Skelly, a coach specializing in working with people with eating disorders.

Recent studies of these problems go beyond the usual finger-pointing at celebrities and the media.

A recent study by researchers Petya Eckler of the University of Strathclyde, Yusuf Kalyango Jr. of Ohio University, and Ellen Paasch of the University of Iowa found that more time on Facebook could lead to more negative feelings and more comparisons to the bodies of friends.

They surveyed 881 college women about their Facebook use, eating and exercise habits, and body image. From the responses, the researchers could predict how often women felt negatively about their own bodies after looking at someone else’s photos or posts, and how often women compared their own bodies to those of their friends.

The findings showed that more time spent on Facebook was associated with more negative feelings and more comparisons to the bodies of friends. They also found that for women who want to lose weight, more time on Facebook led to more attention being paid to physical appearance. This included attention to one’s body and clothing.

Previous studies have examined college or adolescent girls, comparing the effect of Facebook on users’ body image with that of non-users. This, however, is the first study to link time spent on Facebook to poor body image.

“Public health professionals who work in the area of eating disorders and their prevention now have clear evidence of how social media relates to college women’s body image and eating disorders. While time spent on Facebook had no relation to eating disorders, it did predict worse body image among participants,” said Eckler. “As experts in the field know, poor body image can gradually lead to developing an unhealthy relationship with food. The attention to physical attributes may be even more dangerous on social media than on traditional media because participants in social media are people we know. These comparisons are much more relevant and hit closer to home. Yet they may be just as unrealistic as the images we see on traditional media.”

Another study, by Jasmine Fardouly and colleagues, published in the Psychology of Women Quarterly, concluded that young women objectify themselves more when browsing Facebook and magazines than when using other media types.

“Our research shows that spending more time reading magazines and on Facebook is associated with greater self-objectification among young women and these relationships are influenced by women’s tendency to compare their appearance to others, particularly to peers on Facebook,” the researchers commented.

Surveying 150 female college students and staff aged 17 to 25, Fardouly and her colleagues also found the following connections between media type, appearance comparison, and self-objectification:

  • Magazines, though significantly related to self-objectification, are infrequently read by women.
  • On average, the women spent about two hours a day on Facebook, accounting for 40% of their daily internet use, and checked the site every few hours.
  • Facebook users compare their appearance most often to their own images, then to those of their peers, and rarely to images of family members and celebrities.

The researchers discussed possible reasons for this finding. For example, unlike TV and music videos, Facebook lets users compare pictures of themselves with those of their peers, or with past images of themselves. The researchers also note that self-comparisons may lead to greater self-objectification, as women literally look at themselves the way an observer would. They wrote, “Furthermore, self-comparisons to images of a previous self might engender a greater focus on specific body parts, also contributing to self-objectification.”

To help young women stop comparing themselves and promote wellness, the researchers recommend that young women post fewer images of themselves on Facebook and follow people on Facebook who post photos less frequently.

The researchers continued, “This was one of the first studies which shows that appearance comparisons partially account for the relationship between media usage and self-objectification. Young women report spending long periods of time on Facebook and this research highlights some of the potential negative influences that Facebook may have on how young women view their body.”

In another study linking Facebook to eating disorders, published in the International Journal of Eating Disorders, researcher Pamela K. Keel found that the importance many women attach to Facebook “likes” was linked to eating disorders. Facebook has become a global phenomenon and an active space for social comparison, and increased time on such sites is correlated with poorer body image in young women. In her study, 960 female college students were evaluated on the time they spent on social media sites, how important “likes” were to them, and whether or not they “untag” photos of themselves.

“Over 95% of college women in our study use Facebook, and those with Facebook accounts described typically spending 20 minutes on the site during each visit, amounting to over an hour on the site each day,” said Keel.

Women who spent more time on Facebook reported a higher incidence of appearance-focused behaviors and greater eating pathology. These women were more likely to attach significance to receiving comments and “likes” on status updates, to untag pictures of themselves, and to compare their photos with those of friends.

“In examining the immediate consequences of Facebook use, we found that 20 minutes of Facebook use contributed to maintenance of higher weight and shape concerns and anxiety compared to a control internet condition. This causal link is important because anxiety and body image concerns both increase risk for developing eating disorders,” Keel stated.

Although Facebook use contributes to the problem, it could also become a target for prevention programs, whose main objective is to encourage women to develop a better self-image and to use social media sites responsibly.

Keel concluded: “Facebook merges powerful peer influences with broader societal messages that focus on the importance of women’s appearance into a single platform that women carry with them throughout the day. As researchers and clinicians attempt to understand and address risk factors for eating disorders, greater attention is needed to the emerging role of social media in young people’s lives.”

Perfectionism is a key factor influencing body image and eating disorders. Marika Tiggemann and Tracey Wade of Flinders University published a study in the Journal of Eating Disorders that describes adaptive perfectionism as high standards driving a person toward an ideal body image, and maladaptive perfectionism as a preoccupation with mistakes and other people’s opinions. The findings indicate that both are involved in heightened concerns about body image, which in turn place people at risk of developing an eating disorder.

Over a thousand women representing a cross section of the population (aged 28-40) were involved in this study. They ranged from underweight to morbidly obese, with a BMI of 14 to 64, and overall, the further these women were away from a healthy BMI, the bigger the difference between their current and ideal body images.

While perfectionism is recognized as an important factor in eating disorders, the exact role of perfectionism in perceived body image has been difficult to pin down. The study found that women who desired the lowest BMI and the smallest body size tended to be more concerned about making mistakes, more worried about organization, and higher in self-doubt than everyone else.

The co-author of the study, Wade, explained, “While some perfectionism is normal and necessary, there becomes a point at which it becomes an unhelpful and vicious cycle. Knowing that perfectionism of any sort is a risk factor for eating disorders suggests we should tackle ‘all or nothing’ attitudes with clients, as well as helping them to become less invested in defining their self worth in terms of their ability to achieve high standards.”

In another study, Eric Stice and his colleagues from Oregon Research Institute have found that their obesity prevention program reduced the risk for onset of eating disorders by 61 percent and obesity by 55 percent in young women. These effects continued for as long as 3 years after the program ended. In their research on eating disorders, Oregon Research Institute (ORI) scientists help young women reduce the influence of the “thin ideal,” which is described as associating success and happiness with being thin.

These results are noteworthy because, to date, reducing the risk for future onset of eating disorders and obesity has been an unrealized goal: over 80 prevention programs have been evaluated, but no previous program has been found to significantly reduce the risk for onset of these serious health problems.

Stice notes that, “One reason these programs might be more effective is that they require youth to take a more healthy perspective, which leads them to internalize the more healthy attitudes. In addition, these programs have simple take-home messages, which may be easier to remember in the future than messages from more complex prevention programs.”

Funded by the National Institutes of Health (NIH), Stice has been studying eating disorders for 18 years. He has conducted this line of research at Stanford University and the University of Texas, and now continues at the Oregon Research Institute in Eugene, Oregon. He is presently funded by NIH to conduct two research studies to further test these programs with young women in Eugene/Springfield.

The obesity prevention program, called Healthy Weight, helps adolescents adopt a healthier lifestyle, wherein they gradually reduce intake of the least healthy portion of their diet and increase physical activity. This program simply teaches youth to balance their energy intake with their energy needs, and to do so on a permanent basis, rather than on the transient basis which is more typical of diets. College-age women in Eugene/Springfield are participating in this study.

The eating disorder prevention program, called the Body Project, consists of four one-hour weekly sessions in which participants critique the thin ideal espoused for women in our culture and learn how to challenge current and future pressures to be thin. The program has also produced reductions in other important outcomes such as body dissatisfaction and eating disorder symptoms. Stice has partnered with area high schools on this study and has trained high school counselors to facilitate the weekly sessions.

“It is our hope that other institutions and communities will adopt this program for delivery in their schools,” notes Stice. “If this program is delivered to enough youth, it should be possible to reduce the prevalence of these serious health problems.”

Given that eating disorders are among the most common problems faced by young women, and that obesity is presently blamed for 111,000 deaths per year in the U.S., it is vital to develop brief prevention programs for these pernicious conditions. At least seven other institutions have begun delivering these interventions in the U.S. and in other countries.

In working with senior leaders as an executive coach, I’ve worked with both male and female executives. In the process, I was fortunate to become acquainted with the work of Samantha Skelly, a coach who specializes in working with people with a wide variety of disordered eating and body image issues. In addition to explaining that these problems are not just women’s, and that men too struggle with them, she summarized the importance of her work as follows:

  • “Disordered eating and excessive dieting is not about the food; it’s about an emotional void. What we need to uncover is the root cause, to effectively repair the relationship we have with food;
  • The relationship you have with yourself is the same relationship you have with food. When starting a journey to overcoming disordered eating we need to start from within. Internal work is essential in overcoming disordered eating;
  • So often the assumed root causes of these issues are not to blame. The media, as well as social media, definitely have a negative influence on how we ‘should’ look; however, often it’s a belief that stemmed from as young as age six that perpetuated these behaviours;
  • It’s perceived that only women deal with these issues; however, in my work at the moment there is a clear 70-30 split. Men of all ages and backgrounds suffer with disordered eating and body image, to the same calibre of severity as women.”

Canada’s Reputation Is Becoming Tarnished

Posted July 8th, 2015 in Articles, Blogs by admin

What happens to a country’s image abroad, and to the self-image of the people within it, when that country shifts from promoting peaceful well-being to taking aggressive actions against the environment and other countries?

That question is one now facing Canada.

Traditionally, when people—both inside and outside the nation—think about the values of the Canadian people, words such as “friendly,” “non-violent,” “generous,” and “peaceful” usually are cited, and the country is noted for its positive social safety net, cultural diversity and tolerance. In many past studies and surveys, Canada has ranked among the top nations on social well-being indexes and lists of the best places to live.

That image may be changing due to the economic, political and military decisions of the country’s leaders.

Here are some examples of the significant changes that have taken place:

  • According to a review by Harvard Law School’s immigration and refugee clinic, Canada has become a more refugee-unfriendly place in the post-9/11 world: “Canada is systematically closing its borders to asylum seekers and failing in its refugee protection obligations under domestic and international law,” the group’s report states.
  • Canada’s federal government opposed former British PM Gordon Brown’s global tax  on international financial transactions;
  • Canada would not support a UN effort to recognize the human right to access sufficient water to sustain life;
  • Canada opposed the UN Declaration on the Rights of Indigenous Peoples;
  • Canada did not support the Rotterdam Convention’s effort to ban the toxin chrysotile asbestos;
  • Canadian federal and provincial governments have supported the extremely environmentally damaging tar sands oil projects; 
  • Canada has suggested the Kyoto Protocol  on climate change be scrapped at the UN climate conference in Bangkok;
  • Thousands of mines owned and operated by Canadian mining companies in Latin America, Africa and India are among the worst offenders in terms of environmental destruction and human rights abuses according to the Canadian Centre for the Study of Resource Conflict; 
  • Canada’s per-person greenhouse gas emissions are among the highest in the world;
  • Canada’s federal government has systematically eliminated the collection of scientific baseline data on human populations (the long-form census) and on environmental biodiversity and health.

Perhaps the biggest shift has occurred in the area of military action. Canada’s reputation in the past has been linked to the role of “peacekeeper” under the auspices of the United Nations. Former Prime Minister Lester B. Pearson proposed the first peacekeeping force, which moved the world back from war in the 1956 Suez Crisis and won Pearson the Nobel Peace Prize. From that time until the mid-1990s, Canada was the largest contributor of peacekeepers and the only country to have contributed to every UN mission. From Kashmir to the Congo, from Bosnia to Ethiopia, Canadian soldiers were at the forefront of world order, contributing to peace in war-torn lands. The Pearson Peacekeeping Centre was established in 1994 by the Government of Canada and became the flagship of the nation’s commitment to UN peacekeeping, providing world-class training to peacekeepers from Canada and around the globe.

The Canadian government is now closing the Pearson Centre, a reflection of its dwindling support for both the UN and a peacekeeping role. Canada once contributed 3,000 military personnel to peacekeeping, more than 10 per cent of all UN peacekeeping troops. Sixteen years later, its contribution is less than 0.1 per cent: Canada now ranks 53rd, between Paraguay and Slovakia, on the United Nations contributors’ list, with approximately 40 personnel serving on UN missions overseas.

Since the 1990s, successive Canadian governments, both Conservative and Liberal, have shunned traditional UN-mandated peacekeeping for U.S.-led war-fighting missions in Kosovo, Afghanistan and Libya. Those campaigns have eclipsed the UN as Ottawa’s favored military expeditionary effort.

While this was happening, NATO rose to prominence as an instrument of humanitarian intervention, providing a second distracting factor. Since the 1990s Canada has chosen to participate more with NATO, fighting alongside the U.S. and its allies in wars premised on humanitarian intervention, a concept that is fundamentally different from peacekeeping. Peacekeeping depends on the conflicted country’s consent and uses lightly armed troops to enforce peace agreements, a practice which statistics show decreases the likelihood of a return to violence. Humanitarian intervention, on the other hand, is based on military supremacy and the enforcement of peace through no-fly zones, precision airstrikes, and offensive counterinsurgency operations.

Yet is there a disconnect between how the people of Canada view its military role and the political views of their leaders? Consider the 2002 and 2004 Focus Canada polls, as well as the 2005 Ekos study Canadian Attitudes Toward the CF. These studies verify that a majority of Canadians indeed prefer a “traditional peacekeeping role” for Canada. In 2002, of the 2,021 adult Canadians polled, 52 per cent indicated that they preferred the “traditional peacekeeping role.” In 2004, the preference for traditional peacekeeping increased to 59 per cent. These figures closely match the 2005 Ekos study, which found 57 per cent preferred “traditional peacekeeping” versus 41 per cent for a “peacemaking” role.

Canada’s military role in Afghanistan, and now in Iraq and Syria, can hardly be called peacekeeping, given its cooperation with the U.S. in initiating aggressive combat operations.

Canada has now become a target of terrorists, and increasingly, terrorist groups and their leaders see little or no difference between Canada and the U.S.

Clearly, Canada’s more recent actions, both environmental and military, contradict its traditional image as a peace-loving country that places people’s physical and social well-being at the heart of its policies and actions. We’ve already seen how those changes have affected Canada’s image abroad, and they may continue to do so. What remains a question is how they will affect the self-image of Canadians.

Is Mind Wandering a Good or Bad Thing?

Posted April 11th, 2015 in Articles, Blogs by admin

Is mind wandering a good thing or a bad thing? Is it the same as being on “autopilot,” without being conscious of what you are doing? There seem to be several differing perspectives on these questions.

One of the original pieces of research on mind wandering, by psychologists Matthew A. Killingsworth and Daniel T. Gilbert of Harvard University, is described in the journal Science. Killingsworth and Gilbert concluded that people spend 46.9 percent of their waking hours thinking about something other than what they’re doing, and that this mind-wandering typically makes them unhappy. Summing up the research in the line “a human mind is a wandering mind, and a wandering mind is an unhappy mind,” Killingsworth and Gilbert argue: “The ability to think about what is not happening is a cognitive achievement that comes at an emotional cost.”

Unlike other animals, humans spend a lot of time thinking about what isn’t going on around them: contemplating events that happened in the past, might happen in the future, or may never happen at all. Indeed, mind-wandering appears to be the human brain’s default mode of operation.

To track this behavior, Killingsworth developed an iPhone web app that contacted 2,250 volunteers at random intervals to ask how happy they were, what they were currently doing, and whether they were thinking about their current activity or about something else that was pleasant, neutral, or unpleasant.

Subjects could choose from 22 general activities, such as walking, eating, shopping, and watching television. On average, respondents reported that their minds were wandering 46.9 percent of the time, and no less than 30 percent of the time during every activity except making love.

“Mind-wandering appears ubiquitous across all activities,” says Killingsworth. “This study shows that our mental lives are pervaded, to a remarkable degree, by the non-present.” Killingsworth and Gilbert found that people were happiest when making love, exercising, or engaging in conversation. They were least happy when resting, working, or using a home computer.

“Mind-wandering is an excellent predictor of people’s happiness,” Killingsworth says. “In fact, how often our minds leave the present and where they tend to go is a better predictor of our happiness than the activities in which we are engaged.”

The researchers estimated that only 4.6 percent of a person’s happiness in a given moment was attributable to the specific activity he or she was doing, whereas a person’s mind-wandering status accounted for about 10.8 percent of his or her happiness. Time-lag analyses conducted by the researchers suggested that their subjects’ mind-wandering was generally the cause, not the consequence, of their unhappiness.
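
To make the logic of those time-lag analyses concrete, here is a minimal sketch in Python. It is my own illustration of the general cross-lagged approach, run on simulated data, not the researchers’ code or data. The idea: if mind wandering at one sampling moment predicts mood at the next moment, while mood does not predict later mind wandering, the causal arrow more plausibly runs from wandering to unhappiness.

    # A minimal sketch of a cross-lagged analysis on simulated data.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 500
    wander = rng.integers(0, 2, n).astype(float)  # 1 = mind wandering at sample t
    happy = np.empty(n)
    happy[0] = 5.0
    for t in range(1, n):
        # Build in the causal story: wandering at t-1 lowers mood at t.
        happy[t] = 5.0 - 1.5 * wander[t - 1] + rng.normal(0.0, 1.0)

    # Correlate each series with the other series lagged one step ahead.
    r_wander_then_happy = np.corrcoef(wander[:-1], happy[1:])[0, 1]
    r_happy_then_wander = np.corrcoef(happy[:-1], wander[1:])[0, 1]
    print(f"wandering -> later mood:  r = {r_wander_then_happy:+.2f}")  # strongly negative
    print(f"mood -> later wandering:  r = {r_happy_then_wander:+.2f}")  # near zero

In data generated this way, only the first correlation is substantial, which is exactly the asymmetry such time-lag analyses look for.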

“Many philosophical and religious traditions teach that happiness is to be found by living in the moment, and practitioners are trained to resist mind wandering and to ‘be here now,'” Killingsworth and Gilbert argue. “These traditions suggest that a wandering mind is an unhappy mind.”

Michael J. Kane and Jennifer C. McVay, writing in Current Directions in Psychological Science, argue that while mind wandering might lead to creative insights, involuntary mind wandering can also take us away from the important activities and tasks at hand. In this article, Kane and McVay discuss the relationships among working memory, task-unrelated thoughts, and task performance. Using both laboratory-based and daily-life assessments, their research has shown that people with lower working memory capacity are more likely to mind wander, at least during demanding tasks. This propensity may partly explain why people with lower working memory capacity are also more likely to make errors. Kane and McVay argue that involuntary mind wandering can provide psychological scientists with a unique window into aspects of the mind’s mechanisms for cognitive control, including how, when, and for whom these mechanisms fail.

Researchers Daniel B. Levinson, Jonathan Smallwood, and Richard J. Davidson noted that our working memory acts as a sort of mental workspace that allows us to juggle multiple thoughts simultaneously, and asked what role it plays in mind wandering: does working memory inhibit or support off-task thinking? They decided to put this question to the test, publishing their results in Psychological Science.

The researchers asked volunteers to perform one of two simple tasks — either pressing a button in response to the appearance of a certain letter on a screen, or simply tapping in time with one’s breath — and compared people’s propensity to drift off.

“We intentionally use tasks that will never use all of their attention,” Smallwood explains, “and then we ask, how do people use their idle resources?”

Throughout the tasks, the researchers checked in periodically with the participants to ask if their minds were on task or wandering. At the end, they measured each participant’s working memory capacity, scored by their ability to remember a series of letters given to them interspersed with easy math questions.

In both tasks, there was a clear correlation. “People with higher working memory capacity reported more mind wandering during these simple tasks,” says Levinson, though their performance on the test was not compromised.

The result is the first positive correlation found between working memory and mind wandering, and it suggests that working memory may actually enable off-topic thoughts. “What this study seems to suggest is that, when circumstances for the task aren’t very difficult, people who have additional working memory resources deploy them to think about things other than what they’re doing,” Smallwood says.

Interestingly, when people were given a comparably simple task but filled with sensory distractors (such as lots of other similarly shaped letters), the link between working memory and mind wandering disappeared.

Working memory capacity has previously been correlated with general measures of intelligence, such as reading comprehension and IQ score. The current study underscores how important it is in everyday situations and offers a window into the ubiquitous — but not well-understood — realm of internally driven thoughts.

“Our results suggest that the sorts of planning that people do quite often in daily life — when they’re on the bus, when they’re cycling to work, when they’re in the shower — are probably supported by working memory,” says Smallwood. “Their brains are trying to allocate resources to the most pressing problems.”

In essence, working memory can help you stay focused, but if your mind starts to wander those resources get misdirected and you can lose track of your goal. Many people have had the experience of arriving at home with no recollection of the actual trip to get there, or of suddenly realizing that they’ve turned several pages in a book without comprehending any of the words. “It’s almost like your attention was so absorbed in the mind wandering that there wasn’t any left over to remember your goal to read,” Levinson says.

Where your mind wanders may be an indication of underlying priorities held in your working memory, whether conscious or not, Levinson says. But it doesn’t mean that people with high working memory capacity are doomed to a straying mind. The bottom line is that working memory is a resource, and it’s all about how you use it, he says: “If your priority is to keep attention on task, you can use working memory to do that, too.”

Drivers whose minds wander, especially when the wandering is intense, are significantly more likely to be responsible for a crash, threatening safety on the roads, warns a study by C. Galera and colleagues published in the British Medical Journal.

All drivers experience occasional drifting of their minds towards internal thoughts, a temporary “zoning out” that might dangerously distract them from the road. External distractions (such as from mobile phones) are known to be linked with crashes, but inattention arising from internal distractions (such as worries) is still poorly understood in the context of road safety.

A team of researchers from France therefore wanted to see if mind wandering would increase the risk of being responsible for a crash.

They interviewed 955 drivers who were injured in a motor vehicle crash and attended the emergency department at Bordeaux University Hospital between April 2010 and August 2011. All participants were 18 years or older.

Patients were asked to describe their thought content just before the crash. Researchers also assessed how disruptive or distracting the thought was. Mitigating factors that could reduce driver responsibility, such as road environment, traffic conditions, traffic-rule obedience and the difficulty of the driving task, were also taken into account.

Finally, blood alcohol level was tested, as was the driver’s emotional state just before the crash. The researchers classified 453 (47%) drivers as responsible for the crash and 502 (53%) as not responsible. Over half (52%) reported some mind wandering just before the crash, and its content was highly disrupting or distracting (defined as intense mind wandering) in 121 (13%).

Intense mind wandering was associated with greater responsibility for a crash — 17% (78 of 453 crashes in which the driver was thought to be responsible) compared with 9% (43 of 502 crashes in which the driver was not thought to be responsible). This association remained after adjusting for other confounding factors that could have affected the results.
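
As a back-of-the-envelope check of those unadjusted figures (my own arithmetic, not the authors’ adjusted analysis), the reported counts reproduce the percentages and yield an odds ratio of roughly 2.2:

    # Reproduce the unadjusted proportions and odds ratio from the reported counts.
    intense_responsible, total_responsible = 78, 453
    intense_not_responsible, total_not_responsible = 43, 502

    p1 = intense_responsible / total_responsible            # ~0.172 -> the 17%
    p2 = intense_not_responsible / total_not_responsible    # ~0.086 -> the 9%
    odds_ratio = (p1 / (1 - p1)) / (p2 / (1 - p2))

    print(f"responsible drivers:     {p1:.1%} intense mind wandering")
    print(f"non-responsible drivers: {p2:.1%} intense mind wandering")
    print(f"unadjusted odds ratio:   {odds_ratio:.2f}")     # about 2.2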

The authors conclude that the association between intense mind wandering and crashing “could stem from a risky decoupling of attention from online perception, making the driver prone to overlook hazards and to make more errors during driving.”

More recent research suggests that there are different kinds of mind-wandering, some of which are actually beneficial.

According to a new study published in the Proceedings of the National Academy of Sciences, a wandering mind can impart a distinct cognitive advantage.

Scientists at Bar-Ilan University are the first to demonstrate how an external stimulus of low-level electricity can literally change the way we think, producing a measurable uptick in the rate at which daydreams, or spontaneous, self-directed thoughts and associations, occur. Along the way, they made another surprising discovery: while daydreams offer a welcome “mental escape” from boring tasks, they also have a positive, simultaneous effect on task performance.

The new study was carried out in Bar-Ilan’s Cognitive Neuroscience Laboratory, supervised by Prof. Moshe Bar, part of the University’s Gonda (Goldschmied) Multidisciplinary Brain Research Center, which Bar also directs.

While a far cry from the diabolical manipulation of dream content envisioned in “Inception” — the science-fiction thriller starring Leonardo DiCaprio — the Bar-Ilan University study is the first to prove that a generic external stimulus, unrelated to sensory perception, triggers a specific type of cognitive activity.

In the experiment — designed and executed by Bar’s post-doctoral researcher Dr. Vadim Axelrod — participants were treated with transcranial direct current stimulation (tDCS), a non-invasive and painless procedure that uses low-level electricity to stimulate specific brain regions. During treatment, the participants were asked to track and respond to numerals flashed on a computer screen. They were also periodically asked to respond to an on-screen “thought probe” in which they reported — on a scale of one to four — the extent to which they were experiencing spontaneous thoughts unrelated to the numeric task they had been given.

For Bar, a long-time faculty member at Harvard Medical School who has authored several studies exploring the link between associative thinking, memory and predictive ability, the specific brain area targeted for stimulation in this study was anything but random.

“We focused tDCS stimulation on the frontal lobes because this brain region has been previously implicated in mind wandering, and also because it is a central locus of the executive control network that allows us to organize and plan for the future,” Bar explains, adding that he suspected there might be a connection between the two.

As a point of comparison and in separate experiments, the researchers used tDCS to stimulate the occipital cortex — the visual processing center in the back of the brain. They also conducted control studies where no tDCS was used. While the self-reported incidence of mind wandering was unchanged in the case of occipital and sham stimulation, it rose considerably when this stimulation was applied to the frontal lobes. “Our results go beyond what was achieved in earlier, fMRI-based studies,” Bar states. “They demonstrate that the frontal lobes play a causal role in the production of mind wandering behavior.”

In an unanticipated finding, the present study demonstrated how the increased mind wandering behavior produced by external stimulation not only does not harm subjects’ ability to succeed at an appointed task, it actually helps. Bar believes that this surprising result might stem from the convergence, within a single brain region, of both the “thought controlling” mechanisms of executive function and the “thought freeing” activity of spontaneous, self-directed daydreams.

“Over the last 15 or 20 years, scientists have shown that — unlike the localized neural activity associated with specific tasks — mind wandering involves the activation of a gigantic default network involving many parts of the brain,” Bar says. “This cross-brain involvement may be involved in behavioral outcomes such as creativity and mood, and may also contribute to the ability to stay successfully on-task while the mind goes off on its merry mental way.”

While it is commonly assumed that people have a finite cognitive capacity for paying attention, Bar says that the present study suggests that the truth may be more complicated. “Interestingly, while our study’s external stimulation increased the incidence of mind wandering, rather than reducing the subjects’ ability to complete the task, it caused task performance to become slightly improved. The external stimulation actually enhanced the subjects’ cognitive capacity.”

Bar says that, in the future, he would be interested in studying how external stimulation might affect other cognitive behaviors, such as the ability to focus or perform multiple tasks in parallel. And while any therapeutic application of this technique is speculative at best, he believes that it might someday help neuroscientists understand the behavior of people suffering from low or abnormal neural activity.

Benjamin Baird, Jonathan Smallwood, Michael D. Mrazek, Julia W. Y. Kam, Michael S. Franklin, and Jonathan W. Schooler published research in Psychological Science to test the theories of mind wandering.

They designed an experiment in which they asked participants to perform an Unusual Uses Task (UUT), listing as many unusual uses for an item as possible. The participants were then split into four groups: one group was asked to perform a demanding task, a second an undemanding task, a third rested for 12 minutes, and a fourth was given no break. All participants then performed the Unusual Uses Task again. Of the four groups, only the people who had performed the undemanding task improved their score on the second UUT. Participants in the undemanding task also reported more mind wandering during the task, which suggests that simple tasks that allow the mind to wander may increase creative problem solving.

So there seem to be two different perspectives on mind-wandering and its negative or positive effects. How this research intersects with the growing field of mindfulness, as a way of managing mind wandering and the “autopilot” mode, should prove most interesting.

Have We Lost the Need For Physical Touch?

Posted April 5th, 2015 in Articles, Blogs by admin

Has our high-tech, media-socialized world lost something critical to our species: non-sexual human physical touch? Hasn’t human physical contact set us apart from other animals and helped us develop complex language, culture, thinking and emotional expression?

Two hundred years ago, a creature looking somewhat human was sighted running through the forests of southern France. Once captured, scientists determined he was 11 years old and had run wild in the forests for much of his childhood. One of the fathers of psychiatry at that time, Philippe Pinel, observed the child, named “Victor,” and concluded, erroneously, that Victor was an idiot. A French physician attending Victor disagreed with Pinel, concluding that the child had merely been deprived of human physical touch, which had stunted his social and developmental capacities.

We know from child developmental research that the absence of physical bonding and healthy attachment between an adult and child may result in life-long emotional disturbances. James W. Prescott, an American developmental psychologist, proposed that the origins of violence in society were related to the lack of mother-child bonding. Harry Harlow completed extensive studies on the relationship between affection and development.

In Communist Romania, dictator Nicolae Ceausescu, in a pathological program to raise the birth rate through “science,” established numerous orphanages. When the world was able to see these orphans after his overthrow, observers were shocked by the severe underdevelopment of the children’s social skills and values. The common factor for all these orphans was a lack of human physical touch, particularly of the loving kind.

Sharon K. Farber, writing in Psychology Today, contends that “being touched and touching someone else are fundamental modes of human interaction, and increasingly, many people are seeking out their own professional touchers and body arts teachers—chiropractors, physical therapists, Gestalt therapists, Rolfers, the Alexander-technique and Feldenkrais people, massage therapists, martial arts and T’ai Chi Ch’uan instructors. And some even wait in physicians’ offices for a physical examination for ailments that have no organic cause—they wait to be touched.”

Dacher Keltner, the founding director of the Greater Good Science Center and a professor of psychology at the University of California, Berkeley, says that “in recent years, a wave of studies has documented some incredible emotional and physical health benefits that come from touch. This research is suggesting that touch is truly fundamental to human communication, bonding, and health.”

Keltner cites the work of neuroscientist Edmund Rolls, who found that physical touch activates the brain’s orbitofrontal cortex, which is linked to feelings of reward and compassion. Keltner contends that “studies show that touch signals safety and trust, it soothes. It activates the body’s vagus nerve, which is intimately involved with our compassion response, and a simple touch can trigger release of oxytocin, aka ‘the love hormone.’” Keltner also describes research showing the economic benefits of physical touch, citing his own recent study of NBA basketball teams, which concluded that teams whose players touch each other more win more games.

He adds that “basic warm touch calms cardiovascular stress.”

Research at the University of California’s School of Public Health found that eye contact and a pat on the back from the doctor may boost survival rates of patients with complex diseases.

Paul Zak, author of The Moral Molecule, conducted a “neuroeconomics” study, published in the journal Evolution and Human Behavior, in which he argues that hugs or handshakes are likely to cause the release of the neurochemical oxytocin and increase the chances that a person will treat you “like family,” even if you’ve just met. Zak argues, “We touch to initiate and sustain cooperation.”

French psychologist Nicolas Guéguen reports in the journal Social Psychology of Education that when teachers pat students in a friendly way, those students are three times as likely to speak up in class. Another recent study found that when librarians pat the hand of a student checking out a book, that student says he or she likes the library more—and is more likely to come back.

Touch can even be a therapeutic way to reach some of the most challenging children: Some research by Tiffany Field suggests that children with autism, widely believed to hate being touched, actually love being massaged by a parent or therapist.

According to research conducted at the University of North Carolina, women who receive more hugs from their partners have lower heart rates and blood pressure and higher levels of oxytocin. “Hugs strengthen the immune system…The gentle pressure on the sternum and the emotional charge this creates activates the Solar Plexus Chakra. This stimulates the thymus gland, which regulates and balances the body’s production of white blood cells, which keeps you healthy and disease free.”

Whether we get a friendly slap on the back, a sensual caress, or a loving kiss, interpersonal touch has a powerful impact on our emotions. In fact, our skin contains receptors that directly elicit emotional responses, through stimulation of erogenous zones or nerve endings that respond to pain (Auvray, Myin, & Spence, 2010).

Although psychologists have learned a great deal about the significance of touch, the scientific study of touch is still in its infancy. One important complexity yet to be addressed is that touch is inherently a multisensory experience. During interpersonal touch, we typically experience tactile stimulation, but also changes in warmth, along with changes in what we see, hear, and smell. Inputs from the other senses can nevertheless have independent effects. For instance, researchers Lawrence E. Williams and John A. Bargh found that merely being in a warm room or holding a warm drink can make people feel closer to others than when they are in a cold room or holding a cold drink.

What does all this have to do with today’s world and workplace? Two things: the growing prevalence of human interaction through digital media, particularly among young people, in place of personal physical contact; and the social and legal restrictions on physical contact in our schools and workplaces. Both may have unintended negative consequences.

Josh Ackerman, an MIT psychologist, claims that people understand their world through physical experiences, and that touch is the first of those senses to develop. He says that you can produce changes in people’s thoughts through different physical experiences. His study, published in Science magazine, is the latest in the growing field of research called “embodied cognition,” which supports the concept of a mind-body connection.

In an article in Wired magazine, Brandon Keim contends, based on this embodied cognition research, that studies show children “are better at math when using their hands while thinking,” and “actors recall lines better when moving.”

And when it comes to catching a woman’s interest, little beats a man’s winning smile and a touch on the arm. Studies have shown that a gentle brush of a woman’s arm can boost a man’s chances in love, and another study showed that two-thirds of women agreed to dance with a man who touched their arm a second or two before making the request.

In our desire for a politically correct and safe social environment, and for instant communication, have we lost sight of one of the most important aspects of human development and culture: physical touch?

Additional References:

Auvray, M., Myin, E., & Spence, C. (2010). The sensory-discriminative and affective-motivational aspects of pain. Neuroscience and Biobehavioral Reviews, 34, 214-223.

Paladino, M.P., Mazzurega, M., Pavani, F., & Schubert, T. (2010). Synchronous multisensory stimulation blurs self-other boundaries. Psychological Science, 21, 1202-1207.

Wilhelm, F. H., Kochar, A. S., Roth, W. T., & Gross, J. J. (2001). Social anxiety and response to touch: Incongruence between self-evaluative and physiological reactions. Biological Psychology, 58, 181-202.

Is America Addicted to War?

Posted March 25th, 2015 in Articles, Blogs by admin

War has become the new normal for the U.S. Militarization, the justification for violence, and the privatization of war are now firmly entrenched.

The U.S. is number one in the world in both war and domestic civilian gun deaths. It leads all countries in the military arms industry, which is worth approximately 2.7% of the world’s GDP; Russia and China rank 2nd and 3rd. The U.S. alone is responsible for 39% of all the military spending in the world; China is second with 9.5% and Russia third with 5.2%. The U.S. currently has (depending on the source of information) somewhere between 800 and 1,000 military bases in over 50 countries.

A study of history will show that since its founding in 1776, the U.S. has been at war for 214 of 235 calendar years, leaving only 21 calendar years in which it did not wage war, much of that stretch falling during the Depression. Starting with the 1991 Gulf War, the U.S. has had two decades of non-stop fighting: Somalia in 1992, Haiti in 1994, Bosnia in 1995, Serbia-Kosovo in 1999, Afghanistan starting in 2001, and Iraq, Libya and Syria since then. And that doesn’t count the drone warfare in Pakistan and Yemen.

The U.S. economy has become captured by the military and intelligence organizations. Vietnam cost the U.S. $450 billion. Iraq alone will end up costing some $6 trillion, according to Nobel Prize-winning economist Joseph Stiglitz. That’s twice the amount needed to make Social Security solvent forever.

In an article in The Atlantic, James Fallows describes how U.S. military expenditure has risen 10 times as fast as household consumption, while government non-military expenditure has fallen. This continues a longer trend. Since the beginning of the 21st century, U.S. GDP, in inflation-adjusted terms, has increased by 21 percent, U.S. government non-military spending by 11 percent, and U.S. personal consumption by 28 percent, while military expenditure increased by 52 percent.

Being at war, whether against other countries or “against terrorists,” has become big government business, much of it unseen by the public and by many people in government. A Washington Post article by Dana Priest and William M. Arkin describes how the top-secret world the government created in response to the terrorist attacks of Sept. 11, 2001, has become so large, so unwieldy and so secretive that no one knows how much money it costs, how many people it employs, how many programs exist within it or exactly how many agencies do the same work. They identified some 1,271 government organizations and 1,931 private companies that work on programs related to counterterrorism, homeland security and intelligence in about 10,000 locations across the United States.

Barbara Ehrenreich, an award-winning author and political activist, wrote the book Blood Rites, in which she confronts the mystery of the human attraction to violence: what draws our species to war and even makes us see it as a kind of sacred undertaking. She says, “War has dug itself into the economic system, where it offers a livelihood to millions, rather than to just a handful of craftsmen and professional soldiers. It has lodged in our souls as a kind of religion, a quick tonic for political malaise and bracing antidote to the moral torpor of consumerist, market-driven cultures.”

The use of war and torture by the U.S. has not only become an instrument of politics and foreign policy but is increasingly an accepted moral norm, even glamorized in media and movies. The movie American Sniper, poised to be one of the biggest box office movies of all time, is a case in point. American Sniper is not the first movie to capitalize on the insatiable hunger for stories of brave, self-sacrificing commandos, as witnessed by other recent blockbusters such as Lone Survivor and Zero Dark Thirty. Hollywood has found a successful formula in small numbers of unstoppable special forces, promoting the belief that special forces and drone warfare can do anything a conventional army can.

As many movie reviewers have commented, American Sniper doesn’t depict the truth. It’s not really the story of Chris Kyle, as becomes obvious when you read his autobiography. The movie implies that killing civilians, particularly children, is forbidden. According to many U.S. war veterans, no such ban exists in practice, and when such killings do occur, subsequent reports of the incidents are often altered accordingly.

The lie that Iraq was behind the 9/11 tragedy, still believed by many Americans, is perpetuated in the movie. And in the movie, every Iraqi, including women and children, is either an evil butcher, an insurgent or a terrorist. As Kyle has commented, “they are all savages.” In many ways, American Sniper lionizes America’s gun culture and its blind adoration of the military, and promotes the belief that the U.S. has to impose its values and rights on “lesser ethnic cultures.”

Kyle’s book is more disturbing than the movie. In the movie, Kyle is portrayed as a reluctant soldier, one forced by orders to do what is necessary. In his book, Kyle clearly shows that he loved killing and war; he even says he wishes he had killed more. In a 2012 interview with the BBC, Kyle said, “Every person I killed I strongly believe they were bad…When I do go face God there is going to be lots of things I will have to account for, but killing any of those people is not one of them.” Guardian writer Lindy West asks, “The real American Sniper was a hate-filled killer. Why are simplistic patriots treating him as a hero?”

In many ways the message of American Sniper is that Muslims are evil and should be killed. This belief feeds the argument that the civilized world is at war with Muslims who are all either terrorists or fighting a “jihad” against the Western world.

Chris Hedges, writing in TruthDig, cites his conversation with Reagan White House staffer and former Air Force officer Mikey Weinstein, who describes the growing Christian fundamentalism within the U.S. military. Weinstein notes the extreme right-wing Christian chauvinism, which calls for the creation of a theocratic “Christian” America, a sentiment that is particularly acute among elite units of the SEALs and Special Forces.

American Sniper is disturbing because it glorifies death and violence, blind obedience and loyalty and diminishes the capacity for empathy and compassion. The more the American public exalts movies like it, the more the moral compass of the nation will be dangerously off course.

Does Chris Kyle represent the typical combat soldier—one with little compunction about killing or remorse for doing it? Not according to research and the experience of most war veterans. Dave Grossman, a Special Forces colonel, an instructor at West Point and the author of the book On Killing, presents massive data to show that, far from being innate warriors just dying to kill people, the vast majority of men are extremely reluctant to kill other people. A large survey of U.S. combat veterans of World War II found that numerous combat soldiers were not firing at all or were firing away from the enemy; they did not want to kill anyone. As a result, the Army completely revamped its training to boost the firing rates of combat soldiers, and it boosted those rates again in the Korean War and especially in the Vietnam War. But Grossman says the result has been higher rates of post-traumatic stress disorder.

America’s moral compass is also being compromised by its disregard for international law. While just over six years ago the U.S. public ranked among the world’s most enthusiastic supporters of international law (falling just behind the Germans and the Chinese in global surveys), it now appears that vast majorities of Americans reject the applicability of international law to the actions of the U.S. government in the “global war on terror,” and to the use of illegal detainment and torture at the Guantanamo prison, despite widespread international condemnation of this policy.

Daphne Bramham, writing in the Vancouver Sun, describes how 80 million cluster bombs were dropped by the U.S. on Laos in an illegal war there: “Between 1964 and 1973 America dropped more bombs on Laos than had ever been dropped on a single country or continent. It did that without ever officially declaring war on Laos.” She goes on to describe how these bombs remain buried in the soil, and how nearly as many Laotians have died from bombs since 1974 as died during the nine years of the “secret war.” Cluster bombs were also used in South Sudan and Syria in 2014, she reports. More than 150 countries have passed a cluster bomb convention, but the U.S. joins Israel, Russia and China as non-signatories.

The U.S. has expanded the definition of war through the use of drones. Drones, as Grégoire Chamayou argues in his new book, A Theory of the Drone, are changing warfare and have the potential to alter the political arena of the countries that utilize them.

The United States now uses more than 150 weapons-carrying drones, which are deployed not only in Afghanistan but also in countries officially at peace with the U.S., such as Yemen, Somalia and Pakistan. In Pakistan, CIA drones carry out an average of one strike every four days. Although exact figures are difficult to establish, estimates of the number of deaths between 2004 and 2012 vary from 2,562 to 3,325.

The drone wars, however, are introducing a risk-free ethic of killing. Moreover, drones change politics within the drone states themselves. Because drones transform warfare into a ghostly “teleguided” act, orchestrated from a base in Nevada or Missouri, in which soldiers no longer risk their lives, the citizenry’s critical attitude toward war is profoundly transformed, altering, as it were, the political arena within drone states. In the future, politicians might not need to rally citizens at all: once armies deploy only drones and robots, there will be no need for the public even to know that a war is being waged.

Finally, there are some myths about the effectiveness and results of American wars, particularly in the Mid-East. Tom Engelhardt, author of The United States of Fear as well as a history of the Cold War, The End of Victory Culture, identifies these myths in a recent article. He concludes the following:

  • America’s war against the terrorists hasn’t succeeded: RAND report figures show 58% growth in jihadist groups between 2010 and 2013.
  • America’s wars haven’t solved foreign countries’ problems, from the time of Vietnam to Iraq and Afghanistan.
  • America’s wars in Iraq and Afghanistan and Libya and the CIA’s drone assassination campaign in Pakistan have actually destabilized those countries.
  • America is not winning its wars, if winning means achieving a positive goal as a result of the war. The last true war in which America was a victor in those terms was WWII.

Where will the increasing militarization of the U.S. end? How much of its economic resources will continue to be committed to military and intelligence purposes? What will happen to the once vaunted and highly respected American reputation as the world’s moral leader and defender of the rule of law? And what happens to the public conscience when movies continue to justify violence and killing perpetrated by “heroes”?

Leaders: We Love Humble Leaders But Idolize Narcissists

Posted March 25th, 2015 in Articles, Blogs by admin

The public in general, and even management experts, are hypocritical about what makes a good leader. On the one hand, we exalt and praise leaders who are basically nasty and abusive (called a****les by some) because they are financially successful; on the other hand, research shows that humble leaders whose focus is to serve others are equally successful and, more importantly, capture the hearts and loyalty of others. Which do we value more?

When we think of egotistical, even narcissistic and abusive leaders, the names Steven Jobs, Donald Trump and Larry Ellison come to mind. Not that their hubris doesn’t pay off, according to a research study completed by Charles A. O’Reilly III at Stanford’s business school. O’Reilly and his colleagues surveyed employees in 32 large, publicly traded tech companies. He contends that bosses who exhibit narcissistic traits such as dominance, self-confidence, a sense of entitlement, grandiosity and low empathy tend to make more money than their less self-centered counterparts, even if the lower-paid CEOs exhibit plenty of confidence. O’Reilly says of the narcissists, “they don’t really care what other people think and depending on the nature of the narcissist, they are impulsive and manipulative.” O’Reilly goes on to argue that the longer narcissistic leaders are at the helm, the higher their compensation grows in comparison with the rest of the leadership team; in some cases, narcissistic bosses simply fire anyone who dares to question or challenge them.

There is a dark downside to this appearance of success, however, O’Reilly contends. Company morale often declines, and employees leave the company. And while narcissistic or abusive leaders may bring in the bigger paychecks, O’Reilly says there is compelling evidence that they don’t perform any better than their lower-paid, less narcissistic counterparts. This argument is supported by Michael Maccoby in his book, The Productive Narcissist: The Promise and Peril of Visionary Leadership.

While Steve Jobs was a charismatic visionary and brilliant innovator, Walter Isaacson’s biography showed him to be rude, controlling and mean-spirited, never hesitating to humiliate Apple employees or to take credit for others’ work. Since his death, there has been a flood of articles, books and seminars extolling Jobs’s leadership style, many of which argue that it’s okay to be an “asshole” as long as you’re financially successful. In my article in The Financial Post, I make the point: “The concern I have, and one reflected by other leadership experts, is the faulty cause-and-effect, ‘ends justifies the means’ arguments that hold up Jobs as a leader to be emulated. It goes something like this: It doesn’t matter what kind of boss you are like (meaning abusive), as long as you get results (financial); and any methods to get there are okay, including abusing people.”

I’ve encountered many young men, aspiring to be leaders, espousing flawed thinking that goes something like this: “If Steve Jobs was a jerk, and he was one of the most successful leaders of one of the most successful companies in the world, then if I act like him, maybe I’ll be successful too.”

Robert Sutton was one of the first leadership experts to draw attention to the prevalence of abusive bosses and to how organizations should screen them out, as detailed in his book, The No Asshole Rule: Building a Civilized Workplace and Surviving One That Isn’t. He points out that tech firms, particularly those in Silicon Valley, are where abusive leaders thrive. His article on the subject in the Harvard Business Review received an overwhelming response of affirmation.

A University of Iowa study, “Perpetuating Abusive Supervision: Third-Party Reactions to Abuse in the Workplace,” found that “when a supervisor’s performance outcomes are high, abusive behavior tends to be overlooked when they evaluate that supervisor’s effectiveness.” In other words, while people might not want to be friends with an abusive, overbearing boss, they’ll tolerate the behavior as long as the boss is productive.

So it seems that abusive, narcissistic bosses are alive and doing well in the business world (and politics), and even exalted by the media. This is in sharp contrast to the research showing that humble bosses actually perform better and are better for the organization.

Peter Samuelson, a psychologist at Fuller Theological Seminary, along with psychologist Sam Hardy at Brigham Young University, published a study in the Journal of Positive Psychology describing the need for humble leaders. They recruited 350 participants and gave them an open-ended questionnaire about real-life problems. They found two clusters of traits that people used to explain humility: the first from the social realm (sincerity, honesty, unselfishness, thoughtfulness); the second from learning (curiosity, logic, awareness, open-mindedness).

Humble leaders are more effective and better liked, according to a study published in the Academy of Management Journal. “Leaders of all ranks view admitting mistakes, spotlighting follower strengths and modeling teachability as being at the core of humble leadership,” says Bradley Owens, assistant professor of organization and human resources at the University at Buffalo School of Management. “And they view these three behaviors as being powerful predictors of their own as well as the organization’s growth.”

Owens and co-author David Hekman, assistant professor of management at the Lubar School of Business, University of Wisconsin-Milwaukee, asked 16 CEOs, 20 mid-level leaders and 19 front-line leaders to describe in detail how humble leaders operate in the workplace and how a humble leader behaves differently than a non-humble leader.

Although the leaders were from vastly different organizations—military, manufacturing, health care, financial services, retailing and religious—they all agreed that the essence of leader humility involves modeling to followers how to grow.

“Growing and learning often involves failure and can be embarrassing,” says Owens. “But leaders who can overcome their fears and broadcast their feelings as they work through the messy internal growth process will be viewed more favorably by their followers. They also will legitimize their followers’ own growth journeys and will have higher-performing organizations.” The researchers found that such leaders model how to be effectively human rather than superhuman and legitimize “becoming” rather than “pretending.”

But some humble leaders were more effective than others, according to the study. Humble leaders who were young, nonwhite or female were reported as having to constantly prove their competence to followers, making their humble behaviors both more expected and less valued. However, humble leaders who were experienced white males were reported as reaping large benefits from humbly admitting mistakes, praising followers and trying to learn.

In contrast, female leaders often feel they are expected to show more humility than their male counterparts, but then they have their competence called into question when they do show humility.

“Our results suggest that female leaders often experience a ‘double bind,’” Owens says. “They are expected to be strong leaders and humble females at the same time.”

Owens and Hekman offer straightforward advice to leaders: you can’t fake humility. You either genuinely want to grow and develop, or you don’t, and followers pick up on this.

Leaders who want to grow signal to followers that learning, growth, mistakes, uncertainty and false starts are normal and expected in the workplace, and this produces followers and entire organizations that constantly keep growing and improving. A forthcoming follow-up study in Organization Science, using data from more than 700 employees and 218 leaders, confirmed that leader humility is associated with more learning-oriented teams, more engaged employees and lower voluntary employee turnover.

The more honesty and humility an employee has, the higher their job performance, as rated by their supervisor. That’s the finding of a Baylor University study published in the journal Personality and Individual Differences, which found that the honesty-humility personality trait was a unique predictor of job performance.

“Researchers already know that integrity can predict job performance and what we are saying here is that humility and honesty are also major components in that,” said Dr. Wade Rowatt, associate professor of psychology and neuroscience at Baylor, who helped lead the study. “This study shows that those who possess the combination of honesty and humility have better job performance. In fact, we found that humility and honesty not only correspond with job performance, but it predicted job performance above and beyond any of the other five personality traits like agreeableness and conscientiousness.”

The Baylor researchers, along with a business consultant, surveyed 269 employees at 25 companies across 20 states, all working in positions that provide health care for challenging clients. Supervisors then rated each employee’s job performance on 35 different job skills and described the kind of customer with whom the employee worked. The ratings were used both to inform higher management how employees were performing and to let the Baylor researchers examine which personality variables were associated with job performance ratings.

The Baylor researchers found that employees who self-reported more honesty and humility were scored significantly higher by their supervisors on job performance. The researchers defined honesty-humility as exhibiting high levels of fairness, greed-avoidance, sincerity and modesty.

“This study has implications for hiring personnel in that we suggest more attention should be paid to honesty and humility in applicants and employees, particularly those in care-giving roles,” said Megan Johnson, a Baylor doctoral candidate who conducted the study. “Honest and humble people could be a good fit for occupations and organizations that require special attention and care for products or clients. Narcissists, on the other hand, who generally lack humility and are exploitative and selfish, would probably be better at jobs that require self-promotion.”

Amy Y. Ou and her colleagues at Arizona State University published a study in Administrative Science Quarterly in which they suggested it would be interesting to look at the leadership traits associated with Confucianism: self-awareness, openness to feedback, and a focus on the greater good and others’ welfare, as opposed to dwelling on oneself. Ou, now an assistant professor at the National University of Singapore, thought that China would be a good place to gather data because of Confucianism’s influence there. She also had a network of corporate contacts, and she teamed up with Anne Tsui, a colleague at the business school who had her own connections in China.

Together with three other colleagues in the U.S. and China, the researchers wound up interviewing the CEOs of 63 private Chinese companies. They also gave surveys to 1,000 top- and mid-level managers who worked with the CEOs. The surveys and interviews aimed to determine how a humble leadership style affected not so much the bottom line as the top- and mid-level managers who worked under the CEOs. Did managers feel empowered by the CEOs’ humility? Did they feel as though they were invited into company decision-making, and did that lead to a higher level of activity and engagement?

The study’s conclusion: the more humble the CEO, the more positively top- and mid-level managers reacted. Top-level managers said they felt their jobs were more meaningful, they wanted to participate more in decision-making, they felt more confident about doing their work and they had a greater sense of autonomy. They were also more motivated to collaborate, to make decisions jointly and to share information. Likewise, middle managers felt more engaged and committed to their jobs when the top boss was more humble. “There is a negative stereotype that humble people are weak and indecisive,” says Angelo Kinicki, one of the co-authors of the report. “That’s just not the case.”

In an article in the Harvard Business Review entitled “Level 5 Leadership: The Triumph of Humility and Fierce Resolve,” leadership expert Jim Collins argues that Level 5 leaders, the best leaders, exhibit the following characteristics:

  • Demonstrates a compelling modesty, shunning public adulation; never boastful.
  • Acts with quiet, calm determination; relies principally on inspired standards, not inspiring charisma, to motivate.
  • Channels ambition into the company, not the self; sets up successors for even more greatness in the next generation.
  • Looks in the mirror, not out the window, to apportion responsibility for poor results, never blaming other people, external factors, or bad luck.
  • Looks out the window, not in the mirror, to apportion credit for the success of the company—to other people, external factors, and good luck.

Rob Nielsen, author of Leading with Humility, argues that while some narcissistic business leaders are treated like rock stars, leaders who are humble and admit mistakes outshine them all. There’s a difference between being a humble leader and being wishy-washy or overly solicitous of others’ opinions, says Arron Grow, associate program director of the School of Applied Leadership at the City University of Seattle and author of How to Not Suck as a Manager. He says being humble doesn’t mean being a chump, and he describes six ways in which leaders can be more effective by being more humble. Elizabeth Salib takes up this theme in her article in the Harvard Business Review, contending that the best leaders are humble leaders. She cites Google’s SVP of People Operations, Laszlo Bock, who says humility is one of the traits he looks for in new hires.

A recent Catalyst study backs this up, showing that humility is one of four critical leadership factors for creating an environment where employees from different demographic backgrounds feel included. In a survey of more than 1,500 workers from Australia, China, Germany, India, Mexico and the U.S., Catalyst found that when employees observed altruistic or selfless behavior in their managers (a style characterized by acts of humility, such as learning from criticism and admitting mistakes), they were more positive and committed to their work teams.

When are we going to stop idolizing business leaders, needing them to be larger than life in a way reminiscent of celebrities and movie stars, and start appreciating the value of humble leaders, accepting the research evidence that they will serve us better?