Image via Wikipedia
Story of Phineas Gage - HISTORIC PHOTOGRAPHY - Estimated Date 1855
A daguerreotype image believed to be of railway worker Phineas Gage holding a tamping iron that went through his head during an explosion on a worksite in 1848. Phineas P. Gage (July 9?, 1823 – May 21, 1860) was a railroad construction foreman now remembered for his incredible survival of an accident in which a large iron rod was driven completely through his head, destroying one or both of his brain’s frontal lobes, and for that injury’s reported effects on his personality and behavior—effects so profound that friends saw him as “no longer Gage.” Gage recovered from the accident and retained full possession of his reason, but his wife and other people close to him soon began to notice dramatic changes in his personality. Phineas Gage’s brain was not subjected to any medical examination at that time, but seven years later his body was exhumed so his skull could be studied. Today Gage’s skull is on permanent display at Harvard’s Countway Library of Medicine.
The Times ran an article Monday suggesting that what America will need in the future are more “cool nerds.” A playful tweak of the nerd stereotype, to be sure, in an effort to alter it. The people described in the piece were ones with hybrid careers, combining computing with other fields from medicine to Hollywood.
These are jobs that do not match the classic computer geek or nerd image — a heads-down programmer who is socially isolated. In the new hybrid careers, computing is a crucial ingredient and, economists say, such work will be the source of many new jobs of the future.
But David Anderegg, a professor of psychology at Bennington College, says that merely mentioning terms like nerd or geek serves to perpetuate the stereotype. The words are damaging, much like racial epithets, he says, and should be avoided.
Yet the meaning of words often evolves as the social context changes. I noted that in Manhattan’s elite high schools being called a “cool nerd” is a compliment — denoting someone with intellectual and academic chops, un-self-consciously so, and other interests as well.
Perhaps that’s true in a handful of ZIP codes around the country, Dr. Anderegg conceded. But in most of America, he said, nerds and geeks are people to avoid. The connotations are a bit different: “geek” suggests a person with special expertise, while “nerd” suggests social ineptness. And neither is cool.
And math, science and computer science, Dr. Anderegg said, are courses that young people too often associate with nerds and geeks. As a result, he added, “they sabotage themselves in these fields, and the nation’s work force is suffering.”
“The best way to combat this,” he said, “is put it to bed,” banishing “nerd” and “geek” to the linguistic dustbin.
Not easily done, though, as Dr. Anderegg doubtless appreciates. He is an expert on the subject, and the reason I called him for the piece was that I had noticed the praise for his book, “Nerds: Who They Are and Why We Need More of Them.”
With medical schools flooded with applications, three industrial-organizational psychologists have conducted a study to determine if giving personality tests to prospective students would enable admissions officers to better predict which applicants will be successful.
“Our findings show that personality factors do have a predictive value as to the success rate of admitted medical students. Considering personality of applicants can be quite helpful to medical school admissions programs,” said Deniz Ones, a professor in the department of psychology at the University of Minnesota, who conducted the study with colleagues Filip Lievens of the Department of Personnel Management and Work and Organizational Psychology at Ghent University in Belgium and Stephan Dilchert of the Department of Management at City University of New York’s Baruch College.
Their study followed more than 600 Belgian students through seven years of medical studies to determine what, if any, impact personality might have on their performance.
The students, in their first year, completed a personality inventory and their progress was monitored during the remaining six years of medical school. (American students’ medical school curriculum covers four years, the difference being that American students typically complete pre-medical undergraduate work before entering medical school while Belgian students’ seven years of study combines undergraduate and graduate level medical education in the same curriculum.)
Dilchert noted that though the study was conducted in Belgium, both personality factors and modern medical practices are similar around the world and thus personality should consistently relate to valued outcomes in medical education, including in the United States.
Results of their study are reported in the current issue of the Journal of Applied Psychology.
The researchers employed a commonly used test to measure several personality traits, including conscientiousness, agreeableness, extraversion, openness and emotional stability. Psychologists refer to these as the “Big Five” personality variables.
All medical schools in the United States require prospective students to take the Medical College Admissions Test (MCAT), which is a cognitive test measuring a person’s knowledge, skills and abilities.
“The MCAT is a very good test and can predict whether students have the ability to be successful in medical school; but it is a test that only measures cognitive skills, not personality,” said Ones.
“Our research suggests that by adding a personality assessment to medical school entrance requirements, predicting which students will be successful can be greatly improved,” she added.
In the first two years of medical school there is a strong emphasis on science courses, including gross anatomy, biochemistry, physiology and microbiology. However, the medical curriculum shifts from knowledge acquisition to include interpersonal interaction during the student’s clinical years.
Personality traits can reveal a lot about how students will perform during the differing demands and emphases of a student’s medical studies, said Dilchert.
For example, traits such as conscientiousness, self-discipline and competence were good predictors of learning success throughout the medical studies of the Belgian students. Success was measured by grade point averages each year of medical studies.
Conscientiousness was not only evident during successful students’ initial years of medical study, but took on greater significance in their clinical years, where interpersonal traits like honesty, dependability, attention to detail and vigilance were important, the study findings indicated.
Students being trained in a medical program need the emotional resources to cope with the general pressures of academic performance as well as the specific pressures associated with diagnosing and treating patients under the supervision of medical faculty.
“Students who scored well in persistence and conscientiousness experienced success in their studies,” Dilchert said.
The researchers found greater variance in the traits of extraversion and agreeableness, the traits associated with people who are more social, assertive, gregarious, talkative and inclined to help others.
“Students high in the extraversion trait had lower grade point averages in the early years, but were high in interpersonal performance in the later years,” said Dilchert.
The researchers speculated that extraverted students are likely to spend less time studying and more time on their social relationships during the first years of medical school, which could hinder their academic performance and result in lower grades.
According to figures from the Association of American Medical Colleges, there were 42,000 applicants in 2009 and 18,390 were enrolled, an admission rate of about 44 percent. Of that number, only a little more than 80 percent will graduate in four years, Ones said.
“With an 80 percent graduation rate, one can assume that those students learned and performed at a satisfactory level during their four years of study. It is likely that on-time graduation rates could be improved by using these tests in admissions decisions,” said Dilchert.
“In fact, given the validity of personality tests, an 80 percent success rate could be increased to 91 percent by adding a standardized personality test to current admission standards,” he added.
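As a back-of-the-envelope check on the figures quoted above, the arithmetic is easy to sketch. This is only an illustration: the 91 percent figure is the authors’ projection, not something derivable from the raw counts.

```python
# Quick arithmetic on the AAMC figures and the authors' projection quoted above.
applicants = 42_000
enrolled = 18_390

admission_rate = enrolled / applicants
print(f"Admission rate: {admission_rate:.1%}")  # ~43.8%, within the quoted 40-45% range

# On-time graduates per cohort under the current ~80% rate versus the 91%
# rate the authors say a validated personality test could support.
extra_grads = enrolled * (0.91 - 0.80)
print(f"Additional on-time graduates per cohort: {extra_grads:.0f}")
```

Roughly 2,000 additional on-time graduates per cohort, which is why a seemingly modest 11-point bump gets attention.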
“Personality traits are a reliable and valid method for predicting both the acquisition of knowledge and interpersonal performance,” said Ones. “Our research shows that personality should be considered in medical school admissions and that much is to be gained by supplementing the MCAT, which measures cognitive skills, with personality assessments. Both tests can add independent, useful information to the admissions process,” she added.
Ones said that industrial-organizational psychologists have been researching personality tests for many years and are positioned to bring their knowledge and experience to medical admissions processes. “They know how personality tests work and which measures are supported by scientific evidence. But more importantly, they also know how personality measurements can best be incorporated with other test data to create optimally useful admissions systems.”
The Society for Industrial and Organizational Psychology (SIOP) is an international group of more than 7,800 industrial-organizational (I-O) psychologists whose members study and apply scientific principles concerning workplace productivity, motivation, leadership and engagement. SIOP’s mission is to enhance human well-being and performance in organizational and work settings by promoting the science, practice and teaching of I-O psychology.
SIOP’s 25th annual conference will be April 8-10 at the Hilton Hotel in Atlanta, GA.
More than 4,000 members will attend, including many of the world’s top workplace scientists. There will be hundreds of peer-reviewed sessions spanning a wide variety of interesting topics related to current workplace issues.
Source: Society for Industrial and Organizational Psychology
Thoughts: This is a study conducted by some VERY well-respected people in the field (particularly Ones) and published in our top journal (JAP), which is why I highlighted it. Combine that with the large applied sample, the longitudinal design and the current interest in personality as a predictor of performance, and it’s easy to see why this is getting some serious attention.
I find it very surprising that the authors note the MCAT is limited because it predicts only cognitive skills, yet their own performance measure (GPA) captures only a limited part of what it means to be successful. Presumably you have to do more than earn a high GPA to do well after school and, as we all know, bedside manner is important. Having someone who can recite the textbook might be nice, but you need someone who can take care of patients, explain what’s happening in plain English, and work with others (a whole team of medical professionals), none of which is measured in class grades, especially in the early biology, physics and chemistry curriculum. But I’m sure there’s more to this study than meets the eye… after all, finding that conscientiousness predicts success is hardly all that surprising…
Picture Credits: Google
Don’t you think Phineas Gage’s case study (idiographic research) was awfully interesting? In the 19th century, while Phineas Gage was working on the railroad, there was an explosion that sent an iron rod through his head. After a short series of shocks, he just got up and brought himself to the hospital. Seriously, in all that pain? I would’ve rather died. Anyway, physically he recovered; mentally, his behaviour and personality took a turn: he became temperamental and irresponsible, and he couldn’t hold a job down for long. The connection between his frontal lobe and the rest of his brain was damaged and his personality changed forever. Eventually, he moved to South America and passed away there.
Perhaps no other time of year is as highly anticipated, and secretly dreaded, as that festive family time known collectively as “the holidays.” The clash of fake gaiety and togetherness around Thanksgiving time plus Christmas’s unrealistic expectations of “perfection” can lead to a train wreck of emotions.
We cope the best we can. Both poles of our Jekyll-Hyde personalities can be released. Sometimes we slip behind familiar masks. We might play comforting, non-confrontational roles, or perhaps hide out in the kitchen behind a tower of dirty dishes. For example, I noticed how in recent years around family gatherings I had become “the entertainer.” My job: make ‘em laugh.
To see if this holiday ailment afflicted more than just myself, recently I polled my friends and select family members. I had them write descriptions of their annual performances, each titled “The Holiday Role I Play.” (I’d also like to hear from you: what role do you play?).
Reported anonymously, here are some of the responses (edited for length) that I received:
• I am considered the queen of Christmas.
• When I go home for the holidays I am “The Good Sport.” No matter what game I am asked to play, song I am asked to sing, I never complain. There is time to get even later.
• I am “The Pretender,” entering into everything they’re doing and willingly going along. At some level, I know they know this.
• Characterize me as “The Bartender.” Everyone’s glass is full — which permits me to fill my own glass in the doing.
• At mom’s house I am “The Organizer.” Everything must run on schedule, all the dishes at the proper temperature, the gifts opened in descending order of seniority. My husband is “The Clean-up Guy.” When all the gals are sipping their Bailey’s, he is quietly at the sink washing and drying.
• I think I am “The Son Who Needs To Be Spoiled.” Whenever I come home for the holidays, my mom wants to spoil her “lost son” as much as possible.
• I play three roles. With the immediate family, I am “The Reminder of The Love Before.” Mom sees my father in my face and usually loses her mind. The second role I play is “The Project” — everyone is eager to see me 50 and relatively finished. Finally, I am “The Outsider.” My family is a bunch of heartening, Midwestern hicks, barely anyone finishing college, lots of alcoholics, teenage drug addicts and runaways who try to commit suicide. To have become the quiet one who got out of Fort Wayne, Ind., without babies or a husband, is always unsettling.
• I can tell you right off my role would be “The Moderator.” Choice therapeutic phrases such as “what I hear you saying is …” and “what I think she is trying to express is …” are commonly uttered by me. I try to avoid using language like “shame spiral” and “co-dependent.” (Note: variations on this theme were the most common roles cited — “The Referee,” “The Sounding Board,” “The Therapist,” “The Link Repairer,” “The Peacemaker.”)
• I play “The Honored Guest,” graciously bestowing my presence and allowing myself to be treated as such.
• I know the pitfalls of family gatherings (a dirge-like, morose collection of individuals, shoveling down holiday food to the strains of Johnny Mathis and searching for an appropriate escape) and do my best to avoid/dilute them.
• My role: “I Am My Sister’s Keeper.” We share thousands of tiny glances throughout one holiday evening that speak volumes in the moment, and signify volumes to be spoken much later. Separately, we can hardly win any battles, but together, on Christmas, we are an unstoppable army of two.
• I am the one trying to shed a little factual light on my family’s highly distorted, historically rewritten views. I used to be the family clown. I don’t think the two are that different — just components of the same role.
• In my house I take the role of “The Conversationalist.” Frequently this involves many different conversations, held in a constant blur of moving from living room to kitchen and back again, trying to not alight on the couch and be sucked into the brain numbing drone of TV. The talk goes a little like this: Cooking, a little politics and sports with Dad; sports with younger brother; current events and education with step-mom. Don’t alienate anyone, make sure you include all the guests, remember to include significant others. Above all else avoid the deadly seven-minute dead air. Silence isn’t golden. Perhaps we will find out how far we have traveled from each other over the year.
• As a child I was “The Anointed Christmas Infant,” responsible for displays of wonder. As a young adult my role shifted to being the one responsible for the continuation of our handed-down traditions of perfection — “Mid-Winter Monarch” and “Kitchen Queen” — she who secures the boundaries, mediates the squabbles and is provider of plenty. Now, in exile and older, I have become “The Contented Ghost of Christmas Past.”
• My son is unable to type so I will attempt to respond for him. His role is to experience and share pure unadulterated joy during the holidays. He jumps with excitement when putting out a plate of cookies, eight carrots and a glass of milk for Santa. He brings meaning to the holidays. Ask him this question in another five years and I am sure you’ll get an answer more like what you were expecting.
• I have no idea what my role is. I think maybe I’m the guy who makes screaming faces in the bathroom mirror and then comes out all smiley.
And you probably could add to these your own cast of characters you find yourself playing. Feel free to comment below and let us know what roles you slip into around the holidays.
Posted on: October 22, 2009 8:49 AM, by Jonah Lehrer
David Brooks has written yet another wonderful column on the mind. This time he explores the nagging gap between our intuitions about personality - we each express a particular set of character traits, which can be traced back to our early childhood - and the scientific facts, which suggest that the personality traits measured by tests like the Myers-Briggs are too vague to mean much of anything. Here’s Brooks:
In Homer’s poetry, every hero has a trait. Achilles is angry. Odysseus is cunning. And so was born one picture of character and conduct.
In this view, what you might call the philosopher’s view, each of us has certain ingrained character traits. An honest person will be honest most of the time. A compassionate person will be compassionate.
These traits, as they say, go all the way down. They shape who we are, what we choose to do and whom we befriend. Our job is to find out what traits of character we need to become virtuous.
The psychologists say this because a century’s worth of experiments suggests that people’s actual behavior is not driven by permanent traits that apply from one context to another. Students who are routinely dishonest at home are not routinely dishonest at school. People who are courageous at work can be cowardly at church. People who behave kindly on a sunny day may behave callously the next day when it is cloudy and they are feeling glum. Behavior does not exhibit what the psychologists call “cross-situational stability.”
The psychologists thus tend to gravitate toward a different view of conduct. In this view, people don’t have one permanent thing called character. We each have a multiplicity of tendencies inside, which are activated by this or that context.
One of the first (and fiercest) critics of the fixed trait model of personality - this idea that character is quantifiable, like an IQ score of the soul - is Walter Mischel. I wrote about his groundbreaking research in a recent New Yorker article:
In 1958, Mischel became an assistant professor in the Department of Social Relations at Harvard. One of his first tasks was to develop a survey course on “personality assessment,” but Mischel quickly concluded that, while prevailing theories held personality traits to be broadly consistent, the available data didn’t back up this assumption. Personality, at least as it was then conceived, couldn’t be reliably assessed at all. A few years later, he was hired as a consultant on a personality assessment initiated by the Peace Corps. Early Peace Corps volunteers had sparked several embarrassing international incidents—one mailed a postcard on which she expressed disgust at the sanitary habits of her host country—so the Kennedy Administration wanted a screening process to eliminate people unsuited for foreign assignments. Volunteers were tested for standard personality traits, and Mischel compared the results with ratings of how well the volunteers performed in the field. He found no correlation; the time-consuming tests predicted nothing. At this point, Mischel realized that the problem wasn’t the tests—it was their premise. Psychologists had spent decades searching for traits that exist independently of circumstance, but what if personality can’t be separated from context? “It went against the way we’d been thinking about personality since the four humors and the ancient Greeks,” he says.
One of Mischel’s classic studies documented the aggressive behavior of children in a variety of situations at a summer camp in New Hampshire. Most psychologists assumed that aggression was a stable trait, but Mischel found that children’s responses depended on the details of the interaction. The same child might consistently lash out when teased by a peer, but readily submit to adult punishment. Another might react badly to a warning from a counsellor, but play well with his bunkmates. Aggression was best assessed in terms of what Mischel called “if-then patterns.” If a certain child was teased by a peer, then he would be aggressive.
Mischel’s favorite metaphor for this model of personality, known as interactionism, concerns a car making a screeching noise. How does a mechanic solve the problem? He begins by trying to identify the specific conditions that trigger the noise. Is there a screech when the car is accelerating, or when it’s shifting gears, or turning at slow speeds? Unless the mechanic can give the screech a context, he’ll never find the broken part. Mischel wanted psychologists to think like mechanics, and look at people’s responses under particular conditions.
So if personality is so context-dependent, then why do we believe so fiercely in the constancy of character? Why does everyone know their Myers-Briggs score? The answer returns us to the biased brain, and a mental flaw known as the fundamental attribution error. It turns out that when we evaluate the behavior of others we naturally overemphasize the role of personality - we assume people are always aggressive or always dishonest or always sarcastic - and undervalue the role of context and the pervasive influence of situations. Nobody, it turns out, is always anything.
Thoughts: Some bloggers/writers/experts have been claiming that “lay people” can’t or won’t accurately write about psychology and other social sciences. These critics claim that the lay person tends to overgeneralize and make sweeping conclusions from small incidents and flawed scientific studies. Not surprising, given that we all want conclusions, mental shortcuts, rules of thumb and headlines… but I guess the experts just get called on it via the wonder of peer review. While I feel weird claiming I’m more qualified than this blogger, who has written two very well-selling books, I do think my formal education might surpass his at this point, and I have spoken to a number of well-known scientists about some things (WEIRD), so, as an almost-qualified psychologist…
I really appreciate the research here - the less frequently mentioned fundamental attribution error, and Mischel, who is well-known in the psychology world but unknown outside of it. The piece might not be perfect, and it may tend toward drawing big conclusions from a limited amount of evidence, but that’s what everyone not writing a book does… so, nice job here - think of it as the exception to what many critics are saying about the demise and damage of scientific blogging.
Can we pull the wool over our own eyes or do we see through our mind games?
In theory the one person we should never, ever, lie to is ourselves. Surely lying to ourselves is counter-productive? Like calmly and deliberately shooting yourself in the foot or taking a hot toasting fork and plunging it into your eye?
But look around and it’s not hard to spot the tell-tale symptoms of self-deception in other people. So perhaps we are also deceiving ourselves in ways we can’t clearly perceive? But is that really possible, and would we really believe the lies that we ‘told’ ourselves anyway? That’s what Quattrone & Tversky (1984) explored in a classic social psychology experiment published in the Journal of Personality and Social Psychology.
Lies, damn lies and psychologists
Any study of self-deception is going to involve a fair amount of bare-faced lying, and Quattrone & Tversky’s (1984) research was no different. They recruited 38 students who were told they were going to take part in a study about the “psychological and medical aspects of athletics”. Not true; in fact, the researchers were going to trick participants into thinking that how long they could submerge their arms in cold water was diagnostic of their health status, when really it showed just how ready people are to deceive themselves. This is how they did it.
The participants were first asked to plunge their arms into cold water for as long as they could. The water was pretty cold and people could only manage this for 30 or 40 seconds. Then participants were given some other tasks to do to make them think they really were involved in a study about athletics. They had a go on an exercise bike and were given a short lecture about life expectancy and how it related to the type of heart you have. They were told there were two types of heart:
- Type I heart: associated with poorer health, shorter life expectancy and heart disease.
- Type II heart: associated with better health, longer life expectancy and low risk of heart disease.
Half were told that people with Type II hearts (apparently the ‘better’ type) have increased tolerance to cold water after exercise, while the other half were told it decreased tolerance to cold water. Except of course this was all lies, made up only to convince participants that how long they could hold their arm under water was a measure of their health, with half thinking cold-tolerance was a good sign and half thinking it was a bad sign.
Now time for the test: participants had another go at putting their arms into the cold water for as long as they could. Comparing the average times before and after all the blatant lying (in the name of science, of course!), the experimental manipulation clearly had a strong effect. People who thought it was a sign of a healthy heart to hold their arms underwater for longer did just that, while those who believed the reverse all of a sudden couldn’t take the cold. That’s all well and good, but were these people really lying to themselves or just to the experimenters, and did they believe those lies?
Hook, line and sinker
After the arm-dunking, each participant was asked whether they had intentionally changed the amount of time they held their arms underwater. Of the 38 participants, 29 denied it and 9 confessed, though not directly: many of the 9 confessors claimed the water had changed temperature. It hadn’t, of course; this was just a way for people to justify their behaviour without directly facing their self-deception.
All the participants were then asked whether they believed they had a healthy heart. Of the 29 deniers, 60% believed they had the healthier type of heart; of the 9 confessors, only 20% did. What this suggests is that the deniers were more likely to be truly deceiving themselves, not just covering up their deception. They really did think the test was telling them they had a healthy heart. Meanwhile the confessors tried to tell a lie back to the experimenter (seems only fair!), but privately the majority acknowledged they were deceiving themselves.
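The gradation the study describes can be tabulated from the numbers quoted above. A quick sketch; the individual head-counts are reconstructed from the reported percentages, so they are approximate:

```python
# Reconstructing approximate counts from the percentages reported above.
deniers, confessors = 29, 9
assert deniers + confessors == 38  # all 38 participants accounted for

# "Believed they had the healthier heart": 60% of deniers, 20% of confessors.
healthy_deniers = round(0.60 * deniers)        # roughly 17 of the 29
healthy_confessors = round(0.20 * confessors)  # roughly 2 of the 9
print(healthy_deniers, healthy_confessors)
```

Seventeen of twenty-nine versus about two of nine: the people who denied manipulating their behaviour were precisely the ones most convinced the test had flattered them.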
This experiment is neat because it shows the different gradations of self-deception, all the way up to its purest form, in which people manage to trick themselves hook, line and sinker. At this level people think and act as though their incorrect belief is completely true, totally disregarding any incoming hints from reality.
So what this study suggests is that for many people self-deception is as easy as pie. Not only will many people happily lie to themselves if given a reason, but they will only look for evidence that confirms their comforting self-deception, and then totally believe in the lies they are telling themselves.
Explains a lot, don’t you think?
So recently I’ve seen these psychological breakdowns of Disney Princes and Princesses (above), and I feel like they give such cursory coverage of the real stories.
Snow White: The Prince fell in love with her when she was - for all appearances’ sake - just a servant. He loved her before he even saw her, when he heard her song. Remember how he climbed the wall to find her??
Aurora: She is so against being married off for politics that she wants to marry for love. Again, Philip falls in love with her before he sees her, and her being beautiful is just an added bonus. Also, if Philip only liked her for her looks, it wouldn’t be true love, so his kiss would not break the spell. And really, how many boys would be willing to take on a temperamental dragon? If he only wanted a beautiful girl, I’m sure it would have been much easier to give up on Aurora and find someone he didn’t have to work for.
Jasmine: Isn’t the entire point of this story that love can’t be arranged? Yes, her culture dictated a marriage to royalty, but the entire point is that she fell in love with someone because of who he was, not what he had, so of course Aladdin’s quick wit helped save her in the end. Also, her father really does want her to be happy, which is why he didn’t force her to marry one of the other unsuitable suitors.
Ariel: I don’t think she changed her appearance so much to be attractive to Eric, but so that she could physically be with Eric. She agreed to sacrifice her voice because he was more important to her than anything else, because of love.
Belle: Belle was not sexually attracted to the Beast. Did you not listen to the introduction?? The Beast’s curse said that someone had to love him for who he was, in spite of his appearance. Belle saw past his rough exterior and fell in love with who he was underneath. That is not blatant sexuality; it is all about seeing past who people pretend to be into who they actually are.
Cinderella: The prince wants to meet all the women in the kingdom, not just the rich or the beautiful, ALL OF THEM. Because he’s not looking for just a pretty face; he wants someone he can see himself spending his life with. The fact that Cinderella is beautiful is, again, an added bonus, but it was not his driving motivation to find her. She was mysterious, sweet, gentle, well mannered, exciting to be around, etc… all the things he was looking for in a bride.
Women’s Perfectionism and Workplace Ambition
I read a lot about women in the workplace. The “Marci Alboher, Working the New Economy” blog had a recent article debating whether women hold fewer titles in the workplace because they are less ambitious. One commenter argued that women are not less interested in success, but are less interested in being recognized by others for their success. This seemed plausible, but I also found it less than convincing.
On Wednesday, I got the following statistic from HBR (in The Daily Stat email):
Women Are More Likely to Be Perfectionists
A study of 288 employed American adults published in the Journal of Occupational and Organizational Psychology found that more women than men felt they did not meet their own high standards either at work or at home. 38% of women and 24% of men said their job performance did not meet their own standards. 30% of women felt they were failing to meet their home and family commitments at a high enough standard, compared to 17% of men.
Source: Journal of Occupational and Organizational Psychology via BBC News
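Those gaps are easier to compare as ratios. A quick sketch using only the percentages quoted in the stat above:

```python
# Women-to-men ratios for the perfectionism gaps reported in the study above.
work = {"women": 0.38, "men": 0.24}  # felt job performance fell short of own standards
home = {"women": 0.30, "men": 0.17}  # felt home/family commitments fell short

for label, d in (("at work", work), ("at home", home)):
    ratio = d["women"] / d["men"]
    print(f"{label}: women about {ratio:.1f}x as likely to feel they fall short")
```

In both settings the women surveyed were roughly one and a half to nearly two times as likely as the men to judge themselves as falling short, and the gap is slightly larger at home than at work.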
It made me think that women are less likely to want to be recognized or less likely to ask for a promotion mostly because they evaluate their own performance more harshly than their male colleagues do. This makes a lot more sense to me than most other explanations I have heard. Thanks HBR!