Resiliency: Bouncing Back After a Major Event

I don’t know how much exposure other fields have to the idea of resiliency, but in psychology, I’ve found there’s a common, reassuring sentiment that “people are resilient”.  That means people can bounce back (psychologically) from major stressors like divorce, unemployment, or the death of a loved one.  In general, years of research have supported the claim that most people are resilient; that is, they adjust and recover after one of these life-altering events.

Recently, two researchers at Arizona State University, Frank Infurna and Suniya Luthar, published a paper in which they reanalyzed data from an earlier study that had found that most people are resilient.  In their reanalysis, they concluded that resilience was in fact the least common, rather than the most common, response to divorce, widowhood, and unemployment.  According to them, the majority of people in their sample suffered continued decreases in well-being for months, and some for years, after their loss.

Why are their results so different?  The original authors did reply to Infurna and Luthar’s findings, writing that the two analyses were not actually based on the same data.  Regardless, these discrepancies highlight the importance of using care in selecting statistical techniques, because the results that follow can be drastically different depending on the statistical approach.  In the exchange above, authors on each side have argued that assumptions of statistical tests may have been violated, that model fit may not be ideal, that too much or too little variability was allowed in the models, and so on.  In the end, some of these calls are subjective and there may be no single right answer, but an openness to alternative perspectives and feedback from scientific peers is sure to help advance the field.
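To make that statistics point concrete, here is a toy sketch in Python (simulated data and an off-the-shelf mixture model, not the growth-model software either team actually used) showing how the share of people labeled “resilient” can shift depending on how many trajectory classes a model allows and how much within-class variability it permits:

```python
# Toy illustration only: simulated well-being trajectories, not real data.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# 500 people measured at 5 waves around a major life event:
# half recover quickly ("resilient"), half decline and stay lower.
resilient = rng.normal(loc=[7.0, 5.0, 6.5, 7.0, 7.0], scale=0.8, size=(250, 5))
declining = rng.normal(loc=[7.0, 4.0, 4.5, 5.0, 5.5], scale=1.2, size=(250, 5))
trajectories = np.vstack([resilient, declining])

# The proportion assigned to each trajectory class depends on analytic choices,
# such as the number of classes and how variability is constrained.
for n_classes in (2, 3, 4):
    for cov in ("tied", "full"):  # "tied" forces all classes to share one covariance
        gm = GaussianMixture(n_components=n_classes, covariance_type=cov,
                             random_state=0).fit(trajectories)
        shares = np.bincount(gm.predict(trajectories), minlength=n_classes) / len(trajectories)
        print(f"{n_classes} classes, {cov} covariance -> class shares: {np.round(shares, 2)}")
```

Same data plus several defensible modeling choices can give noticeably different answers to “how many people are resilient”, which is exactly the kind of disagreement described above.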

So, for now, although we know that many people are psychologically resilient in the face of tragedy, it is unclear exactly how common this response is.  Such is the nature of science!

Take home points:

  • How common resilience is after major life events remains unclear
  • Statistics powerfully influence results; use them wisely!

Bullying: What can we do? A look at KiVa, a school-wide bullying prevention effort.

Each day that I drive to campus, I pass a billboard that says “Be nice.” During most of these morning drives I am thinking about the day’s to-do list, but today I tried to be a bit more mindful, and I noticed and thought about this billboard. The message is simple, really. But is it effective? Are people being nicer because of this daily reminder? Am I? It got me thinking about other anti-bullying efforts and how effective they really are. Today, we will be reviewing an article titled Can a School-Wide Bullying Prevention Program Improve the Plight of Victims? Evidence for Risk x Intervention Effects by Dr. Juvonen and colleagues.

According to the National Center for Education Statistics and the Bureau of Justice Statistics, between 25% and 33% of students say they have been bullied at school. The most common types of bullying are verbal and social, and bullying happens most often during middle school. As such, the time for intervention just might be before that. The researchers in today’s article aimed to do just that. They developed the KiVa Program and designed a study to determine whether it influenced a few factors, including depression, self-esteem, and overall perceptions of the school. The details of the program are extensive and beyond the scope of this post, but a link is provided so you can read more about it.

The researchers recruited a sample of just over 7,000 students in 4th through 6th grade from Finnish elementary schools. Students in 39 of the schools received the KiVa intervention, while students in the other 38 schools served as a control group. The intervention took place over a 12-month period. They found that the intervention was very effective at creating an overall perception of a caring climate throughout the school, especially among the students who had been bullied prior to the intervention. The program was also effective in reducing depression and increasing self-esteem, but only among the most bullied students in the 6th grade.

So what does this all mean? Essentially, there is evidence suggesting that the KiVa program, and potentially other school-wide bullying prevention efforts, can be effective. According to the authors, the “results suggest that antibullying programs designed to improve the school ecology can alleviate the plight of the victimized…” They go on to note that “… harm reduction should be assessed by testing risk x intervention effects when evaluating effectiveness of such programs.” Ultimately, it will be interesting to see the future of these types of interventions and whether they can be effective across cultures.
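For readers curious what “testing risk x intervention effects” looks like in practice, here is a minimal, hypothetical sketch in Python (simulated data and made-up variable names, not the KiVa team’s actual analysis). The question is whether the program’s benefit is largest for the highest-risk, most victimized students, which shows up as an interaction term in a regression:

```python
# Hypothetical sketch: simulated data, not the KiVa study's actual dataset or model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "intervention": rng.integers(0, 2, n),           # 1 = program school, 0 = control school
    "baseline_victimization": rng.normal(0, 1, n),   # standardized "risk" score
})
# Simulate post-intervention depression so the program helps most at high baseline risk.
df["depression_post"] = (
    0.5 * df["baseline_victimization"]
    - 0.4 * df["intervention"] * df["baseline_victimization"]
    + rng.normal(0, 1, n)
)

# The '*' expands to both main effects plus their product; a reliable
# interaction coefficient is the "risk x intervention" effect.
model = smf.ols("depression_post ~ intervention * baseline_victimization", data=df).fit()
print(model.summary())
```

A negative interaction term here would mean the intervention reduces depression more for students who started out most victimized, which is the pattern the authors describe.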


Take Home Points:

  • A group of researchers designed a study to assess the effectiveness of a school-wide bullying prevention program
  • The intervention (the KiVa program) was effective in a few different ways.
    • First, it created a perception of a caring school climate among the students who had been bullied most
    • It reduced depression symptoms among 6th graders who had been bullied most
    • It increased self-esteem among 6th graders who had been bullied most.

Tips to apply these findings:

  • There aren’t really individual tips for application based on the results of this study, as it mainly provides evidence that school-wide bullying prevention programs can be effective.
  • However, if you can get involved in facilitating a similar program, at least you know there is evidence to suggest that it should be effective.
  • Also, like the billboard reminds me daily, be nice.

Citations:

  • National Center for Education Statistics and Bureau of Justice Statistics, School Crime Supplement, 2011.
  • Juvonen, J., Schacter, H. L., Sainio, M., & Salmivalli, C. (2016). Can a School-Wide Bullying Prevention Program Improve the Plight of Victims? Evidence for Risk × Intervention Effects. Journal of Consulting and Clinical Psychology.

Genetics of Schizophrenia

This blog wouldn’t be complete without touching on the ground-breaking and widely discussed findings of Aswin Sekar, his mentor Steven McCarroll, and colleagues, published just two weeks ago in Nature.  The authors tested the association between schizophrenia and hundreds of possible genes in a region of the genome that was already known to be involved in the development of schizophrenia.

Schizophrenia is a brain disease (like other psychological disorders) that usually begins in adolescence or early adulthood.  It is characterized by the persistent presence of at least two of the following symptoms:

  • Delusions (fixed beliefs that are not aligned with reality)
  • Hallucinations
  • Disorganized thinking (and speech)
  • Disorganized behavior (such as agitation or bizarre posture)
  • “Negative symptoms”, which refer to the absence of something normally present, such as emotional expression

One common misunderstanding is that schizophrenia involves symptoms that resemble a “split personality” or drastic changes in mood.  These symptoms are not criteria for schizophrenia, and are more characteristic of other psychological disorders.

The authors of the landmark study were able to narrow down genetic risk to a very specific gene: complement component 4 (C4).  This gene was already known for its role in the immune system, where it “tags” targets for removal.  Researchers had also repeatedly found that patients with schizophrenia have reduced grey matter and fewer synapses.  This study, specifically implicating C4 in the development of schizophrenia, opens the door to the possibility that C4 is responsible for pruning away synaptic connections between brain cells, just as it helps the immune system prune away intruders.

Of course, it will take more studies to confirm this theory.  But this new knowledge will propel many other studies to test new ideas and eventually develop treatments aimed at the actual cause of schizophrenia, rather than simply managing symptoms.

Take home points:

  • The C4 gene is involved in the development of schizophrenia
  • This gene is normally responsible for aiding the immune system
  • It is thought that perhaps schizophrenia develops by excessive pruning of synapses, as caused by the C4 gene (although this is not yet supported by direct evidence)

American Psychiatric Association. (2013). Diagnostic and statistical manual of mental disorders: DSM-5. Washington, D.C: American Psychiatric Association.

Sekar, A., Bialas, A. R., de Rivera, H., Davis, A., Hammond, T. R., Kamitaki, N., Tooley, K., Presumey, J., Baum, M., Van Doren, V., Genovese, G., Rose, S. A., Handsaker, R. E., Daly, M. J., Carroll, M. C., Stevens, B., & McCarroll, S. A. (2016). Schizophrenia risk from complex variation of complement component 4. Nature. doi:10.1038/nature16549

Kicking Cognitive Ageing! Does leg power protect women from cognitive ageing?

Today is January 27th. For the highly motivated and disciplined among us, that means that we are about 27 days into our New Year’s Resolutions. Unfortunately, for the rest of us, it means we are probably long past trying to maintain said resolutions. One of the most common New Year’s Resolutions is to exercise and lose weight (as per a survey published in 2015 in the Journal of Clinical Psychology by researchers at the University of Scranton). We all know the wonderful physical health benefits associated with exercise, but sometimes the equally beneficial influences on our minds are overlooked. Today we will be reviewing an article titled Kicking Back Cognitive Ageing: Leg Power Predicts Cognitive Ageing after Ten Years in Older Female Twins by Dr. Claire J. Steves and colleagues.

The main goal of this study was to determine how muscle fitness, as assessed by leg power, influenced cognitive decline over time. The methods were as follows. The researchers recruited 324 healthy female twins. The participants ranged from 43 to 73 years of age, with an average age of 55 years. The first thing the researchers did was have each participant complete a fancy neuropsychological assessment known as the Cambridge Neuropsychological Test Automated Battery (CANTAB). This allowed researchers to determine the participants’ baseline levels of cognitive functioning. The researchers then assessed participants’ leg power as well as a whole pile of other variables that might influence the study (heart disease, diabetes, diet, smoking habits, socioeconomic status, etc.). Lastly, a small sub-sample of monozygotic twins underwent an MRI to assess their brain structure. Then the cool part happened. The researchers waited ten years.

After ten long years, the researchers brought the participants back and reassessed them on these variables. What they found was pretty neat. First, they found a strong relationship between leg power and cognitive change: the more powerful the legs (which is indicative of more physical activity), the less cognitive decline occurred over the ten years. Secondly, the more powerful the legs (once again, more physical activity), the greater the total amount of grey matter in the brain! Pretty neat!

So what exactly does this mean? Well, put simply, leg power is predictive of both how you cognitively age and your overall global brain structure. This finding was present even when researchers controlled for all the various genetic and environmental variables shared by the sets of twins. So in other words, don’t skip leg day!!
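As a rough illustration of how twin data let researchers account for shared genes and upbringing, here is a hypothetical Python sketch (simulated numbers and invented variable names, not the study’s actual model): a mixed model with a random effect for each twin pair absorbs whatever the two twins have in common, so the leg-power estimate also reflects differences within families.

```python
# Hypothetical sketch with simulated data; not the published analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_pairs = 162                                      # 324 twins = 162 pairs
pair_id = np.repeat(np.arange(n_pairs), 2)
shared = np.repeat(rng.normal(0, 1, n_pairs), 2)   # genes/environment shared within a pair
leg_power = rng.normal(0, 1, n_pairs * 2)

df = pd.DataFrame({
    "pair": pair_id,
    "leg_power": leg_power,
    # Simulate 10-year cognitive change that is better with more leg power.
    "cognitive_change": 0.3 * leg_power + shared + rng.normal(0, 1, n_pairs * 2),
})

# A random intercept per twin pair soaks up the shared family-level factors.
model = smf.mixedlm("cognitive_change ~ leg_power", data=df, groups=df["pair"]).fit()
print(model.summary())
```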


Take Home Points:

  • After ten years of follow-up, muscle fitness (as assessed by leg power) predicted:
    • Cognitive aging as assessed by a neuropsychological assessment
    • Global brain structure as assessed by an MRI
  • In women, leg power seems to serve as a protective factor against cognitive aging and reductions in overall grey matter throughout the aging process.

Tips to apply these findings:

  • Don’t skip leg day!

Citation:

  • Steves, C. J., Mehta, M. M., Jackson, S. H., & Spector, T. D. (2015). Kicking Back Cognitive Ageing: Leg Power Predicts Cognitive Ageing after Ten Years in Older Female Twins. Gerontology, 62(2), 138-149.


Power Poses: Could They Help You Land That New Job?

You may have seen the TED talk about power poses, delivered by researcher Dr. Amy Cuddy of the Harvard Business School.  In it, Dr. Cuddy describes the benefits of putting oneself in a “high-power” pose, or an open, expansive pose which conveys dominance and confidence.  Dr. Cuddy bases this talk largely on her own research on the topic.  She describes noticing that students who naturally adopted a more high-power, or expansive, posture tended to participate in class more.  Other students, often women, seemed to shrink in their seats and not participate in class as much.  Dr. Cuddy and other researchers (Ranehill, Dreber, Johannesson, Leiberg, Sul, & Weber, 2015) agree on the obvious: standing in a high-power pose makes one feel more powerful than shrinking up and making oneself small.

What is more interesting to consider is the effect that feeling powerful could have on behavior.  For instance, in 2015, Cuddy and her colleagues published a study demonstrating that people who assumed a high-power pose for 2 minutes before a mock job interview were more likely to be “hired” and were judged to have performed better than those who had assumed a low-power stance before the interview.  An important detail is that these individuals were not instructed to use different posture during the interview, only for the 2 minutes beforehand.  When the researchers examined why the high-power posers were rated more positively and were more likely to be hired, they found that it was not what they said, but their presence, that was responsible for the benefits.  So perhaps power posing influences how you feel about yourself, and that shows in how you present yourself.

One of the most intriguing findings that Dr. Cuddy presents in her TED talk is that people who were randomly assigned to a high-power pose experienced an increase in testosterone (a hormone associated with dominance) and a decrease in cortisol (a stress hormone), while those assigned to the low-power pose showed the opposite pattern.  The idea that simply standing (or sitting!) a different way could change one’s hormone levels is intriguing.

As with all scientific findings, though, we have to make sure that these findings are replicable (able to be found again by other scientists).  Ranehill et al. (2015) ran a similar (but arguably better) experiment and were unable to replicate the finding that high-power poses influenced hormone levels at all.  In fact, their high-power pose group had lower testosterone levels than their low-power pose group.  In addition, Cuddy’s finding that high-power posers were more willing to take a risk (by gambling) was not replicated in Ranehill et al.’s study.

Take home points:

  • Power-posing (i.e., adopting an expansive, open posture) can make you feel more powerful
  • Power-posing may or may not affect your willingness to take risks or alter your hormone levels
  • Power-posing may influence your personal presence in situations such as job interviews, leading to a better chance that you’ll be rated positively and hired

How to apply these findings:

  • Keep your eyes peeled for more research to confirm or disconfirm these preliminary findings!
  • Do power poses in the meantime if they make you feel good 🙂

References:

Cuddy, A. C., Wilmuth, C. A., Yap, A. J., & Carney, D. R. (2015). Preparatory power posing affects nonverbal presence and job interview performance. Journal of Applied Psychology, 100(4), 1286-1295. doi:10.1037/a0038543

Ranehill, E., Dreber, A., Johannesson, M., Leiberg, S., Sul, S., & Weber, R. A. (2015). Assessing the robustness of power posing: No effect on hormones and risk tolerance in a large sample of men and women. Psychological Science, 26(5), 653-656. doi:10.1177/0956797614553946

Holiday Gifts: Are experiences or materials more rewarding?

“Every gift from a friend is a wish for your happiness.”

-Richard Bach

From both Brittany and me, happy holidays! Regardless of what one celebrates, the holiday season commonly consists of family, friends, food, gifts, trips, vacation, and so on. A common theme at my family’s annual holiday celebration is an examination of the pervasiveness of materialism and consumerism in the holiday season. That got me thinking about materialism, so I decided to take a look at the literature and see what I could learn. This week’s post reviews an article that aimed to determine whether happiness is increased more by life experiences or by obtaining material objects.

The study is titled The Unsung Benefits of Material Things: Material Purchases Provide More Frequent Momentary Happiness than Experiential Purchases (pretty heavy spoilers in the title, but please do read on) by Drs. Aaron C. Weidman and Elizabeth W. Dunn of the University of British Columbia. The study consisted of two experiments. In the first experiment, participants were asked to spend $20 on either a material or an experiential purchase. An example of a material purchase might be a book, video game, or knick-knack; an example of an experiential purchase might be going to the zoo, a movie, or mini golfing. Following the purchase, participants reported their momentary happiness regarding it in a daily diary for the next two weeks. In experiment two, participants reported either a material or an experiential gift that they had received for Christmas. They then reported their momentary happiness with that gift through experience sampling.

The results from both studies were consistent: both material and experiential gifts result in happiness, albeit in slightly different ways. After receiving or purchasing material gifts, participants experienced more frequent momentary happiness over time, whereas after receiving or purchasing experiential gifts, they experienced more intense momentary happiness on specific occasions. Both types of gifts influence happiness, just in different ways!

So what does this all mean? Well… gifts are good. Some individuals hold a firm belief that experiential gifts are inherently better than material gifts. This piece of research challenges that idea. Material gifts have their place in influencing happiness as well. Although the happiness experienced might not be as intense, it is felt more frequently over time. But just remember, more than anything, it’s the thought that counts… right?

Take Home Points:

  • Both material and experiential gifts result in happiness, although in different ways
  • Material gifts result in more frequent experiences of happiness over time
  • Experiential gifts result in more intense experiences of happiness on specific occasions

How to Apply These Findings:

  • Take the different influences that different types of gifts have into account when buying gifts for loved ones during the holidays!


Weidman, A. C., & Dunn, E. W. (2015). The Unsung Benefits of Material Things: Material Purchases Provide More Frequent Momentary Happiness Than Experiential Purchases. Social Psychological and Personality Science, 1948550615619761.

The Bystander Effect: What We Know Now

Most have heard of the murder of Kitty Genovese, the woman who was raped and stabbed multiple times outside of her New York City apartment in 1964. The story goes that, despite her cries (“Oh my God, he stabbed me! Help me!”) and multiple attacks by the perpetrator spanning over half an hour, no one called the police.  It has since been confirmed that a few neighbors tried to help: one called out “Let that girl alone!”, which prompted the attacker to leave momentarily, but he came back.  Another called the police after the second attack, and another neighbor came outside to check on Genovese, who was dying and too weak to get inside.  Genovese died on the way to the hospital.

This story prompted psychologists and non-experts alike to wonder: why had none of the other neighbors helped?  Had the police been called immediately, she might have survived.  It was confirmed that several others did witness parts of the attacks.  Many believe this is an example of the Bystander Effect: people’s tendency not to help when they perceive that there are other passive witnesses.  It’s a robust effect that has been shown in a variety of field and experimental situations: when passing a stranded driver on the road, when someone becomes injured or has an asthma attack, or even when someone just drops a bunch of pencils and needs help picking them up.  A previous review (Latané & Nida, 1981) found that people are more likely to help if the other bystanders are very young (i.e., too young to help) or the situation is very unambiguous.  Since these original studies, we have learned even more about this important social phenomenon.

A meta-analysis by Fischer and colleagues, published in Psychological Bulletin in 2011, examines some of the factors that change a person’s likelihood of helping.  Interestingly, they found that whether you know the other bystanders can change the likelihood of intervening: if the people around you are strangers, you’re less likely to intervene than if you are with friends.

In general, Fischer and colleagues concluded that people are more likely to intervene if the costs of not intervening are high (i.e., if the situation is dangerous), such as in the case of someone being physically attacked. Furthermore, if the situation requiring intervention is dangerous, the bystander effect is reversed: people become more likely to intervene when someone else is with them.  The authors conclude that this is because additional bystanders in a dangerous situation are perceived as sources of physical back-up.  Consistent with this idea, the bystander effect (i.e., inaction in the face of an emergency) increased when there were no male bystanders available.

Despite these encouraging results that people are more likely to intervene when danger is high, the overall likelihood of a bystander helping is around 70-75% when they are alone (and thus the only one available to help), and around 50% if one other person is present.  The more bystanders there are, the less likely any of them is to help (~20% for a group).

Take Home Message:

  • In general, the presence of others decreases the likelihood that you will intervene when someone else needs help
    • This effect is particularly drastic if you are amongst a group of strangers (and less so if you are with fewer people, or friends)
  • If the situation you witness appears clearly dangerous, you are more likely to help when the other bystanders seem capable of backing up a physical intervention (i.e., if they appear physically competent, or are men)

How to Apply These Findings:

  • Don’t expect “someone else” to help (because it’s likely everyone else is expecting the same!)
  • If it is too dangerous to personally intervene, at least call the police (don’t assume someone else has or will)
  • Know that ambiguous situations are especially likely to go un-intervened.  Consider, for example, a person slumped over on the sidewalk as bystanders walk by.  Would you make sure they’re OK?  Most people won’t, because it’s ambiguous: the person could just be sleeping.

Watch this video for a clear demonstration of the bystander effect:

Fischer, P., Krueger, J. I., Greitemeyer, T., Vogrincic, C., Kastenmüller, A., Frey, D., & … Kainbacher, M. (2011). The bystander-effect: A meta-analytic review on bystander intervention in dangerous and non-dangerous emergencies. Psychological Bulletin, 137(4), 517-537. doi:10.1037/a0023304

Latané, B., & Nida, S. (1981). Ten years of research on group size and helping. Psychological Bulletin, 89(2), 308-324. doi:10.1037/0033-2909.89.2.308

Can fictional characters influence our self-development?

We live in a world inundated with media. As a result, we are exposed to more fictional stories, and specific characters, than ever before. One fictional character that has always been special to me is Bruce Wayne, or, The Batman (imagine five-year-old me with a pillowcase tied around my neck, jumping off furniture pretending to be The Dark Knight). I’ve always thought of Batman as a mentor from whom I learned to do the right thing even if it is not easy, to work hard, to stand up for those who might not be able to stand up for themselves, and that I don’t need superpowers to do great things. Lots of people scoff when I tell them that, but the reality is that fictional characters can have lasting impacts on our self-development. Today, we will review an article by Randi Shedlosky-Shoemaker and colleagues (2014) titled Self-Expansion through Fictional Characters.

The authors of this study were primarily interested in how parasocial relationships with fictional characters might influence us as individuals. The concept of parasocial relationships is relatively new. Essentially, parasocial relationships are one-sided relationships: they occur when an individual puts time, energy, and thought into a relationship with someone they may have never even met (celebrities, athletes, etc.). The authors applied the concept of parasocial relationships to fictional characters. So what can parasocial relationships with fictional characters really do for us? Can they make a difference in who we are? Who we become? How we perceive ourselves? The answer: yes.

A sample of college students participated in this study. To begin, they were asked to read a brief vignette about a person competing in a race. Following that, they were asked to make a variety of self-report ratings related to likeability as well as self-relevance (including self-expansion and growth) in regard to the character competing in the race, a real-life friend, a classmate, their favorite TV character, and a non-favorite TV character. The authors found that, regardless of whether the relationship was with a fictional or a real person, its intensity influenced the level of self-expansion the participant reported experiencing because of that person. Close friends produced the greatest levels of self-expansion, followed by favorite TV characters, non-favorite TV characters, classmates, and then the character in the race.

So what exactly does this mean? The influence of real-world relationships on our development is undeniable. However, this study provides an early piece of evidence that our relationships with fictional characters might also influence our development. The intensity of that parasocial relationship, or how closely you feel you relate to and engage with that character, has the potential to impact your level of self-expansion. A few questions still remain due to the limitations of this study. One is: what are the long-term effects of parasocial relationships? This study looked only at the self-reported here and now. Another is: why are some individuals more likely than others to identify and engage with fictional characters? It will be interesting to find out as this area of literature continues to grow and develop.

Take Home Points:

  • Parasocial relationships (one-sided relationships with someone you have not met) with fictional characters result in more self-expansion than relationships with real-life individuals whom you merely know but are not close to.
  • Relationships with close friends still have an undeniably stronger influence on our self-expansion than parasocial relationships with fictional characters have.
  • This study has a few limitations and further research is needed to more completely understand the influence of parasocial relationships with fictional characters and why some people are more likely to be engaged in, and influenced by, them.

Reference:

Shedlosky-Shoemaker, R., Costabile, K. A., & Arkin, R. M. (2014). Self-expansion through fictional characters. Self and Identity, 13(5), 556-578.

Coffee Drinkers Have a Lower Risk of Dying

I recently performed a simple demonstration in my Research Methods class, which involved everyone answering a few questions and exploring possible correlations between these variables.  One of the associations we explored was number of hours worked per week and cups of coffee drunk per week.  As the students shared their numbers, most of them reported having less than 1 cup a day, but one student owned up to having a cup daily and reported “7”.  In response to the slight embarrassment I could see on her face, I announced that I drink 6 “cups” a day (the cups are actually 6 oz.), so I suppose that’s 42 a week.  As I looked out at the shocked facial expressions before me, I found myself scrambling to remember the evidence I had read before, research that backed me up: that regular coffee consumption is actually related to health benefits!  So, here is my summary of a meta-analysis (an analysis of a bunch of scientific studies at once to find overall patterns) by Youjin Je and Edward Giovannucci regarding the effects of coffee on overall risk of mortality (dying).

The authors decided to study the relationship between coffee and risk of death because previous research has already indicated that regular coffee consumption is linked to decreased risk of a myriad of chronic diseases, such as type II diabetes, heart disease, and some types of cancers.

The authors included only prospective studies, that is, studies that measured coffee consumption first and then tracked risk of dying over the following 7-28 years.  They compiled a massive dataset totaling over 900,000 participants.  Almost 130,000 of these participants died during the studies’ durations, allowing the researchers to analyze which factors were associated with increased risk of dying.  They found that habitual coffee drinkers (5-6 cups/day) had an overall 14% reduction in risk of death compared to people who drink little to no coffee.

If you’re familiar with research methods, you may be thinking, “I bet coffee drinkers are just different from non-coffee drinkers in other ways: maybe they exercise more, eat healthier, etc.”  Good thinking!  This is an important consideration when we see results like this.  All of the studies in this meta-analysis controlled for smoking and age, and almost all controlled for sex, BMI, and alcohol intake.  Many controlled for other health indicators, like blood pressure, chronic diseases, eating habits, vitamins and medications, and physical activity.  It appears that the reduced risk of death may actually be due to drinking coffee itself.

So how much coffee was beneficial?  The authors found that 2-4 cups per day yielded the same risk of dying as the group that drank 5-9 cups per day, so the reduced risk is about equal for moderate and heavy coffee drinkers (~14%).  Drinking around 1 cup per day was associated with an 8% drop in risk of dying.
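To put that 14% figure in perspective, here is a quick back-of-the-envelope calculation in Python. The relative reduction comes from the meta-analysis; the baseline risk below is purely hypothetical, chosen only to show how a relative reduction translates into absolute terms.

```python
# Hypothetical baseline risk; only the 14% relative reduction comes from the meta-analysis.
baseline_risk = 0.12            # assumed chance of dying over a study's follow-up period
relative_risk = 1 - 0.14        # habitual coffee drinkers: 14% relative reduction

coffee_risk = baseline_risk * relative_risk
absolute_reduction = baseline_risk - coffee_risk

print(f"Little/no coffee:   {baseline_risk:.1%} risk of death")
print(f"Habitual drinkers:  {coffee_risk:.1%} risk of death")
print(f"Absolute reduction: {absolute_reduction:.1%} (about 1.7 percentage points)")
```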

Take home points:

  • Habitually drinking coffee is associated with an average of a 14% reduced risk in dying and reduced risk for a number of chronic diseases
  • This estimate holds after controlling for smoking and age, and, in most studies, for other health-related factors like sex, BMI, alcohol intake, exercise, eating habits, and chronic diseases.
  • The reduction in risk of death is greatest in moderate-heavy coffee users, but light coffee drinkers also have some reduction.


Could the Movie ‘Selfless’ Really Happen? The Nature of Consciousness

Although the movie has not come out yet, I saw a trailer for it and felt intrigued by its proposition: that consciousness (the essence of “you”) could be transferred from one body to another.  To address this question, I went digging into what we know about consciousness.

Consciousness is one of the least understood phenomena in psychology.  It is the subject of one of the oldest debates, the mind/body problem, which tackles the question of whether our consciousness is a part of our body or separate from it.  Most people fall into one of two camps on this question: Dualists think that the mind and body are separate, and often liken the mind to a soul-like entity which continues to live after the body dies.  Monists, in contrast, believe that the mind and body are one entity; that is, the mind is just another part of the physical body.

Consciousness is believed to be a property of the mind.  If you’re a dualist, you probably believe that the experience of consciousness is separate from the physical body.  If you’re a monist, you probably believe that consciousness arises out of physically observable brain activity.

Martin Monti and his colleagues at UCLA looked at what happens in the brain when a person transitions from being conscious to unconscious, to try to figure out whether there is a specific spot in the brain where consciousness happens.  They analyzed 12 young patients’ brains using fMRI as the patients moved through various states of consciousness: awake, sedated, and waking up from sedation.

They found differences in brain activity when patients were conscious, but these differences were not localized to any one area.  Instead, the authors concluded that the main difference between the conscious and unconscious brain is that the conscious one transfers information much more efficiently.  The unconscious brains lacked organization and didn’t transfer information across regions very well.  You could liken it to the difference between walking directly across the yard to the other side in a straight line (the conscious brain), and going in little spurts to various spots all over the yard before finally arriving at the other side (the unconscious brain).
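The yard analogy can also be framed in network terms. Here is a loose, hypothetical Python sketch (toy graphs, not the authors’ actual fMRI analysis) using one common measure, global efficiency, to show that an integrated network moves information in fewer hops than a fragmented one:

```python
# Toy graphs only; a loose analogy for "efficient" vs. "disorganized" information flow.
import networkx as nx

# "Conscious-like": one integrated network whose regions are well connected.
integrated = nx.watts_strogatz_graph(n=20, k=4, p=0.3, seed=0)

# "Unconscious-like": the same number of nodes split into poorly connected fragments.
fragmented = nx.disjoint_union_all([nx.path_graph(5) for _ in range(4)])

print("Integrated network efficiency: ", round(nx.global_efficiency(integrated), 2))
print("Fragmented network efficiency:", round(nx.global_efficiency(fragmented), 2))
```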

This doesn’t exactly settle the question of whether the mind is a non-physical entity that influences the body (dualism) or simply arises out of organized brain activity (monism), but one post on ASAPScience elaborated on these findings.  They believe that one day, as we discover more and more about the basis of consciousness, we will be able to transfer consciousness to another brain, or perhaps “upload” our brains.  For now, though, we’re all going to have to be content with the brain we’ve got!

Take Home Points:

  • Consciousness appears to change brain activity to be more organized and efficient
    • This could be consistent with both monism and dualism
  • Some people think that, as we learn more about consciousness, we might actually be able to transfer it to another brain or hardware

Original research article:

Monti, M. M., Lutkenhoff, E. S., Rubinov, M., Boveroux, P., Vanhaudenhuyse, A., Gosseries, O., Bruno, M. A., Noirhomme, Q., Boly, M., & Laureys, S. (2013). Dynamic change of global and local information processing in propofol-induced loss and recovery of consciousness. PLoS Computational Biology, 9(10).