What if There Were Live Music at the Doctor’s Office?

A really interesting study published earlier this year put live music in a medical waiting room. The aim of the study was to learn more about the staff’s perceptions of the live music, but as you might expect, it had an effect on patients, too:

One of the unanticipated results of the music program was that patients often play the piano in the clinic waiting room. This code emerged frequently in the transcriptions and seemed to enhance staff abilities to initiate non-medical discussions with patients, potentially increasing rapport, trust, and therapeutic alliance.

That second sentence seems important. Allowing patients to play the piano made for a more natural way for staff members to start a conversation with them. I found the next sentence of particular interest:

Another unanticipated result of the music program was that patients often remained in the clinic after their appointments to enjoy the live music.

If you asked people the probability that they’d voluntarily remain at the doctor’s office after their appointment, I bet almost all of them would say there’s a 100% chance they’re leaving ASAP.

~

In thinking about this study, the one thing that I find *possibly* concerning is the novelty of the situation. That is, yes, having live music in the doctor’s office would be a new (and enjoyable) experience for many, but I wonder if patients (or staff) would habituate to there being a live musician in the waiting room. And as a result of this habituation, would the positive benefits ‘wear off’? Then again, given how infrequently most of us go to the doctor’s office in a calendar year, maybe instead of habituating to the experience, it becomes something we look forward to. Can you imagine your teenager saying to you, “Yes, I hurt my arm, we get to go to the doctor’s office!”

OK, that probably wouldn’t happen, but I really like this idea of having live musicians in the waiting room. The marriage between live music and waiting rooms seems obvious, especially given the therapeutic qualities often attributed to live music. This could also be a great opportunity for younger musicians who have a hard time cracking the lineup at some venues. Instead of playing on street corners or in the subway, they could share their music within the medical community.

Silverman, M., & Hallberg, J. (2015). Staff perceptions of live classical music in an urban medical clinic: A qualitative investigation. Musicae Scientiae, 19(2), 135-146. DOI: 10.1177/1029864915583375

Convert PDF to DOC with a Mac — for FREE!

I like to think of myself as relatively computer literate. When I was in elementary school, I taught myself HTML and created/designed my own website. I don’t know if I’ve linked to it on here, but it’s still functioning. Of course, I don’t remember the login or password for it, so there’s no way for me to edit it, but it’s really odd to think back to where (and when) I was when I created it.

Since my GeoCities days, the internet has changed quite a bit. I’ve created a few websites (mostly with WordPress, either through the free hosted version or through the version you download and host yourself), but I wouldn’t — by any stretch of the imagination — say that this is a strength of mine. My skills here are basic (though compared to the average person, one might say they’re a bit beyond basic).

As a tangent, this reminds me of something from my time as a psychology undergraduate. During the “capstone” course for that major, I remember the professor telling us that the department had majors take a test at the beginning and end of the program. They found something interesting: students who took the test at the end of the program reported that they knew less about psychology than when they started the degree. That is, one of the questions on the ‘pre-test’ asked students to rate their level of understanding of psychology on a Likert scale (one to ten), and the same question appeared on the ‘post-test.’ The department was finding that the average score on that question was lower on the post-test than on the pre-test. Why, you might ask?

Well, as students learned more about psychology, they realized just how vast a subject it is and, as a result, just how much they didn’t know. Food for thought.

Anyways, yes, technology.

Does the phrase “ALT+TAB” or “Command+TAB” mean anything to you? What about “CTRL+F” or “Command+F”?

I’m definitely part of the 10% of people who know about things like this, but I’m sure there are a whole host of things that computers and the internet can do that are unknown to me. On that note, I recently learned of something that my Mac can do that I had no idea it could do — convert a PDF to a DOC file.

All this time, I had been using various websites to do this for me, but as it turns out, with a simple process, my Mac will do it for me. Who knew! I wonder what else my Mac can do.
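For the curious, here’s a minimal sketch of one way to do it without any websites. I’ll hedge here: this is just one approach, leaning on Apple’s built-in PDFKit framework plus the textutil command-line tool that ships with macOS, and it isn’t necessarily the exact trick I stumbled on. The file paths are placeholders, and it only works if the PDF actually contains text (a scanned, image-only PDF won’t have anything to extract).

```swift
import Foundation
import PDFKit

// Placeholder paths for illustration; point these at your own files.
let inputURL = URL(fileURLWithPath: "/Users/me/Documents/letter.pdf")
let outputURL = URL(fileURLWithPath: "/Users/me/Documents/letter.txt")

// PDFKit reads the PDF's text layer; a scanned (image-only) PDF won't have one.
guard let pdf = PDFDocument(url: inputURL), let text = pdf.string else {
    fatalError("Couldn't open the PDF, or it has no extractable text.")
}

do {
    // Write the extracted text out as a plain .txt file.
    try text.write(to: outputURL, atomically: true, encoding: .utf8)
    print("Extracted \(pdf.pageCount) pages of text to \(outputURL.path)")
} catch {
    print("Couldn't write the text file: \(error)")
}

// From the Terminal, the built-in `textutil` tool can then turn the plain
// text into a Word-readable file:
//   textutil -convert docx /Users/me/Documents/letter.txt
```

Drop that into a file called, say, pdf2text.swift, run it with swift pdf2text.swift from the Terminal, and the textutil step at the end is what actually produces the Word file. You’ll lose the layout (it’s text extraction, not a faithful conversion), but for getting the words out of a PDF and into something editable, it does the job.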

Choice Architecture: Even in “Heads or Tails,” It Matters What’s Presented First

If you’re familiar with behavioural economics, then the results of this study will be right up your alley.

The researchers set out to determine whether there was a “first-toss Heads bias.” That is, when people are asked to call a coin flip and the choices are presented as “Heads or Tails,” is there a bias towards guessing “Heads” (because it was presented first)? In running their tests, they found something that surprised them [Emphasis Added]:

Because of stable linguistic conventions, we expected Heads to be a more popular first toss than Tails regardless of superficial task particulars, which are transient and probably not even long retained. We were wrong: Those very particulars carried the day. Once the response format or verbal instructions put Tails before Heads, a first-toss Tails bias ensued.

Even in something as simple as flipping a coin, where the script “Heads or Tails” is firmly ingrained in our heads, researchers discovered that simply switching the order of the choices changed the frequency with which people chose one option or the other. That’s rather incredible and possibly has implications for everything from policy to polling. However:

There is, of course, no reason to expect that, in normal binary choices, biases would be as large as those we found. In choosing whether to start a sequence of coin tosses with Heads or Tails, people ostensibly attach no importance to the choice and therefore supposedly do not monitor or control it. Since System 1 mental processes (that are intuitive and automatic) bring Heads to mind before Tails, and since there is no reason for System 2 processes (which are deliberative and thoughtful; see, e.g., Kahneman & Frederick, 2002) to interfere with whatever first comes to mind, many respondents start their mental sequence with Heads. However, in real-life questions people often have preferences, even strong ones, for one answer over another; the stronger the preference, the weaker the bias. A direct generalization from Miller and Krosnick (1998) suggests that in choices such as making a first-toss prediction, where there would seem to be no good intrinsic reason to guide the choice, order biases are likely to be more marked than in voting. At the magnitude of bias we found, marked indeed it was. Miller and Krosnick noted with respect to their much smaller bias that “the magnitude of name-order effects observed here suggests that they have probably done little to undermine the democratic process in contemporary America” (pp. 291–292). However, in some contexts, even small biases can sometimes matter, and in less important contexts, sheer bias magnitude may endow it with importance.

OK, so maybe these results don’t add too much to “government nudges,” but they can — at a minimum — give you a slight advantage (over the long haul) when deciding things by flipping coins with your friends. How?

Well, assuming that you’re the one doing the flipping, you can say to your friend, “Tails or Heads?” (or “Heads or Tails?”), and then be sure to start the flip with the opposite of whatever your friend calls facing up. A few years ago, Stanford math professor Persi Diaconis showed that the side facing up before the flip is slightly more likely (roughly 51% of the time) to be the side that lands facing up.
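To get a feel for the size of that edge, here’s a rough Monte Carlo sketch. The 51% same-side figure is the approximate estimate from Diaconis and his colleagues, the number of trials is arbitrary, and the setup assumes you always orient the coin against whatever your friend calls; none of this comes from the Bar-Hillel paper itself.

```swift
import Foundation

// Assumed parameter: a flipped coin lands on the side that started face-up
// roughly 51% of the time (Diaconis and colleagues' estimate).
let sameSideProbability = 0.51
let trials = 1_000_000
var friendWins = 0

for _ in 0..<trials {
    // The friend calls a side (heads = true); which side doesn't matter here.
    let friendCallsHeads = Bool.random()
    // You start the flip with the *opposite* of their call facing up.
    let startsHeadsUp = !friendCallsHeads
    // The coin lands on the starting side with probability ~0.51.
    let landsHeads = Double.random(in: 0..<1) < sameSideProbability
        ? startsHeadsUp
        : !startsHeadsUp
    if landsHeads == friendCallsHeads { friendWins += 1 }
}

let winRate = 100.0 * Double(friendWins) / Double(trials)
print(String(format: "Friend wins %.2f%% of the time", winRate)) // ≈ 49%
```

Over a million simulated flips, your friend ends up winning about 49% of the time, which is exactly the kind of ‘slight advantage over the long haul’ I’m talking about.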

Bar-Hillel, M., Peer, E., & Acquisti, A. (2014). “Heads or tails?” – A reachability bias in binary choice. Journal of Experimental Psychology: Learning, Memory, and Cognition, 40(6), 1656-1663. PMID: 24773285

Three Months Later and I’m Still Avoiding Dessert (and Sugar)

It’s been over three months since my post about cessation of dessert eating, so I thought I’d offer a bit of an update.

It was actually a lot easier than I thought it would be to stop eating sugar. I’m aware that this might be a result of my conviction on the matter and that some people can have a really hard time giving up sugar (because of habits, addiction, etc.). Since giving it up, I have hardly had any sugar (i.e., refined sugar), but there have been times when I’ve been out of the house and, in a pinch, reached for a chocolate bar instead of a piece of fruit. Each time I do that, though, I’m harshly reminded just how bad a chocolate bar is (for me): minutes after finishing it, my stomach feels terrible and I wish I hadn’t eaten it. That’s happened a couple of times, but like I said, each time it does, I’m reminded just how poor a choice it is for me to reach for a chocolate bar instead of a piece of fruit or nuts (or nothing!), even if I’m glad that I’m living in Canada, the land of Coffee Crisp.

There’s only been one time that I’ve eaten something sugary and haven’t regretted it — a piece of tiramisu at a wedding. Each bite was heavenly. That being said, I should clarify that this was one of only a couple of times that I’ve opted for dessert. I don’t remember the other time, but I remember that it made me feel more like I did after the chocolate bars, so maybe my palate has developed such that I can only enjoy dessert that’s of high quality? I realize that sounds a bit pretentious, but it’s not unheard of for one’s palate to change with age.

Even though I’ve dipped into the sugar pot here and there, it’s still my intention to avoid dessert (and sugar in general). These experiences since my “decision of cessation” have taught me just how much of an effect sugar can have on my body (including my brain).

~

A few other things I wanted to mention. Did you know that there’s sugar in bacon? I certainly did not, but when I began reading ingredient lists to check for sugar, I was shocked to find it in bacon. And it’s not just the ‘conventional’ bacon: I looked through a bunch of “organic” bacon packages and there was sugar in them, too. It was news to me.

Also along the lines of sugar in things we wouldn’t expect: potato chips. I used to eat Miss Vickie’s Jalapeño chips as a kid, and when I came across Kettle Brand Jalapeño chips, I would choose those instead. However, upon turning over the package, I was floored to find that sugar’s an ingredient. It’s not just Kettle Brand chips, either. I looked at a bunch of other “healthier” potato chip options and, surprisingly, sugar is in them, too.

~

Lastly, I wanted to close with something I said a few years ago, with regard to diet:

One could read and try thousands of diets from Alicia Silverstone’s to Suzanne Somers’ and still never find the perfect diet. I think that this is the case because there is no perfect diet for everyone.

 

Parenting “Truths” are Culturally-Based: Parenting Without Borders, Introduction

It’s been some time since my last series (almost a year and a half ago) and even longer since the last time I did a series about a book. I’ve definitely read a number of books since then, but one that I’ve read recently I wanted to explore a bit more in depth, so I thought I’d write a few posts about it as a series.

As you already know, I became a parent last year, and as many parents do when making this transition, I was interested in reading about this new stage of life. I’m aware that there are plenty of books on parenting out there, but I wasn’t interested in reading them all. Luckily for me, during one of my “Bringing Baby Home” classes, the teacher talked about this very thing. Given her experience teaching the course, recommendations from other parents, and her own personal experience, she suggested that the two best books we could read were:

  • The Wonder Weeks: How to Stimulate Your Baby’s Mental Development and Help Him Turn His 10 Predictable, Great, Fussy Phases into Magical Leaps Forward

  • Parenting Without Borders: Surprising Lessons Parents Around the World Can Teach Us

As the title of this post suggests, we’ll be exploring “Parenting Without Borders.” Part of that’s because the topics within the book are so juicy, and part of that’s because my little guy is already beyond the “10 leaps” from The Wonder Weeks. I will say, though, if you’re about to have a young one or you’ve just had one, The Wonder Weeks did wonders in helping me understand why my little guy might have been fussier at times. That’s really important, because it helps you, as a parent, better understand what your baby/toddler might be experiencing. There’s also a Wonder Weeks website.

So, what’s Parenting Without Borders, you ask?

Christine Gross-Loh exposes culturally determined norms we have about “good parenting,” and asks, Are there parenting strategies other countries are getting right that we are not?

The only word I take issue with there is “right,” but that gives you an idea of the kind of material we’re going to be exploring in this series. Let’s get right to the introduction of the book.

~

Right away, Gross-Loh paints a picture of her childhood. Born to two parents who immigrated to the US from South Korea, she was always given lots of space to do her own thing, but she knew that her parents worried about doing right by her. Her folks would meet with other Korean families to discuss schools, among other things.

Before her kids were born, there were things that Gross-Loh “knew”:

They would eat no junk food, watch no violent TV. If my children were raised peacefully, they would never show interest in weapons or war. I would be attentive to them and watchful of their feelings. I would be an accepting, protective parent to give them a secure base.

[…]

I’d been taught it was important to put our kids’ needs first, to give them lots of choice, to praise them to make them feel confident. My American friends and I sought out the right classes, toys, and books to foster our young children’s development, helped oversee their relationships and disagreements with other children, went to bat for them with their teachers and coaches, and guided what they did in their free time.

I think that’s probably a pretty accurate description of how many parents want to be.

But as my own children attended local Japanese schools and we spent time with Japanese families, I saw children raised in a very different way who were clearly thriving — just as much — and sometimes more — than our own. Moms in Japan were surprised by how uptight I was about allowing sweets and were startled by how I monitored what my kids were allowed to watch on TV and the way I tried to stay on top of their behavior. My Japanese friends, unlike me, left their children on their own to figure out their relationships with other kids. But despite how lax these Japanese moms seemed to me, I was constantly surprised by how mature and well-adjusted their children were, how capable, and how pleasant. These were kids being raised in ways that the American parents I knew might look at as simultaneously too permissive and too strict, yet they were clearly thriving.

Which leads us to the most important sentence from the introduction [Emphasis Added]:

It was during that time that I realized something that would change me completely: The parenting assumptions I’d held to be utterly and universally true were culturally based.

And this is why I’m writing a series on this book. There is so much value to be had in exploring the “truths” of different cultures, especially as it relates to shaping (or not shaping?) our young ones.

Understanding is Inherent to Empathy: On Paul Bloom and Empathy

I came across an article in The Atlantic recently that expressed the opinion that empathy might be overrated. The way the headline is written (“Empathy: Overrated?”) should already tell us that the answer is no (via Betteridge’s law of headlines). While I noticed my bias against the idea of empathy being overrated from the outset, I did my best to read the piece with an open mind, and I’m glad I did, because there are a few passages from the “con” side of empathy that I think are important to highlight:

The problem, as Bloom sees it, is that “because of its focusing properties, [empathy] can be innumerate, parochial, bigoted.” People are often more empathetic toward individuals who resemble themselves, a fact that can exacerbate already-existing social inequalities. And empathy can cause people to choose to embrace smaller goods at the expense of greater ones. “It’s because of the zooming effect of empathy that the whole world cares more about a little girl stuck in a well than they do about the possible deaths of millions and millions due to climate change,” Bloom said.

Empathy can also make people do evil. “Atrocities are typically motivated by stories of suffering victims—stories of white women assaulted by blacks, stories of German children attacked by Jewish pedophiles,” Bloom said. It also can lure countries into violent conflicts based on relatively small provocations, and researchers have shown that people who are more empathetic are more likely to want to impose harsh punishments on people. “The more empathy you have, the more violent you are—the more ready and willing you are to cause pain,” Bloom said.

Bloom raises some really good points here, but I don’t know if it’s fair to lay the blame for inaction on climate change at the feet of empathy. There’s been an extremely strong misinformation movement that I’d “blame” before I’d blame empathy.

The point about empathy exacerbating social inequalities is also a bit curious to me. While we may be more inclined to be empathetic towards people who look like us, that doesn’t preclude us from being empathetic towards people who don’t look like us. And to that end, wouldn’t being at least marginally more empathetic towards people who don’t look like us be better than not being empathetic towards them at all (if we’re to look at it from a cold, calculated, and objective standpoint)?

Lastly, and most importantly, I’m worried about the point that the more empathetic you are, the more likely you are to want to impose harsh punishments on people. I looked and looked, but couldn’t find the study that Bloom refers to in the New Yorker article from a few years ago, so I won’t attempt to critique the study’s methodology. But I will say this: isn’t campaigning for less empathy taking us a step back? If we’re looking at the progression of humans, I think it’s probably fair to say that empathy is something that we’ve developed along the way. It’s growth. It’s positive (in the sense that it’s an addition to our species, not in the sense of a value judgment). Wouldn’t it be better for us — as a species — to incorporate this new phenomenon of empathy as we continue to grow?

This idea reminds me of Ken Wilber and his work, in particular the idea that we start with x, move to y, and then find a way to integrate our understanding of x and y to reach a third stage, let’s call it xy. It seems to me that we’ve learned about this thing called empathy (stage x), and now we’re learning about how it can sometimes have a negative effect on us, resulting in a backlash or movement against empathy (stage y). So now, we’ve got to move to a place where we can integrate the two (stage xy).

~

Finally, I wanted to talk about one more thing that Bloom said:

At the end of the Aspen session, an audience member posed a scenario to the scientists: What if she was fired from her job, and her partner offered her a back rub and kind words but didn’t truly get why she was upset? Wouldn’t the comfort feel hollow, useless?

“What you’re really asking for is compassion plus understanding,” Bloom replied. “Suppose you feel humiliated. I don’t think it’s what you want or what you need for your partner to feel humiliated. You want your partner to understand your humiliation and respond with love and kindness. I think for your partner to feel humiliated would be the worst thing you want. Because now, you have to worry about your partner’s feelings.”

I like Paul Bloom and I’ve even written about him before, but I wonder if this is a misunderstanding of empathy. Or, maybe more accurately, the way the study defines empathy is different from the way that others may define it. The way I remember it, understanding is a component of empathy. I wrote a post about this a little while back and included a helpful short from the RSA.

Women and Words: Women Who Read Objectifying Words More Likely to Seek Cosmetic Surgery

I’ve tried to write about this article on a few occasions and had to stop because I simply felt terrible about the implications of the research. In short, as the headline of this post suggests, when women read words that are objectifying, they’re more likely to seek cosmetic surgery. I’ve written in the past about the importance of words and how they can have an effect on us, but this is one of the first times I’ve written about it with such awful implications. Here’s a bit more from the researchers:

Our results provide the first evidence that intentions to pursue cosmetic surgery stem (in part) from being in a state of self-objectification—a state where women are focused on how their bodies look in the eyes of others as opposed to what their bodies can do. Compared to the non-self-objectifying conditions, women primed to self-objectify reported more body shame and a greater intent to pursue cosmetic surgery.

You might consider this finding intuitive, but it’s really important when research like this is published, because we can then say with more conviction that the words we use can have a catastrophic effect on some people, in particular impressionable young women. I should clarify that I don’t mean for that to come across as paternalistic; the study focused on women (and didn’t look at whether this effect holds in men, too).

While the headline from the article is mostly “Bad News Bears,” there’s still a ray of hope to be found [Emphasis Added]:

In addition, we found that body shame was significantly lower among women primed with the non-self-objectifying physicality words compared to the neutral words. This finding suggests that exposure to text that emphasizes body functionality and competence without a focus on observable physical attributes may be protective against self-objectification and body shame.

As the researchers suggested, this should be subjected to further investigation. Regardless, these findings are very important for all of us who write for public consumption in any form, but probably more so for folks whose writing is read by young women. Before I end this post, I wanted to include a few more passages from the article that I think are important, along with some commentary [Emphasis Added]:

Our research has a number of implications for practitioners. First, knowledge of this link between self-objectification (stemming from a sexually objectifying environment) and intentions to have cosmetic surgery should be useful to practitioners who work with girls and women. In particular, it is necessary to move beyond the understanding that sexual objectification makes women feel bad per se to identify the potentially harmful actions against themselves that women might take in response to such encounters.

For those who are in any kind of counselling profession or role, this seems very important. Understanding the actions that a client/patient may take as a result of their state can be key to offering the right kind of counsel.

Second, community members who wish to advocate for girls and women—including activists, educators, counselors, and policymakers—must raise awareness of the harms of self-objectification more consistently, including the pressure to undergo risky elective surgery.

Raise awareness. That’s why, despite my difficulty in trying to complete this post, I persevered. ‘Persevered’ is probably too strong a word, but I felt it was important to write this, so that when you read it, you might consider changing your behaviour and, hopefully, educate those around you so that they may change their behaviour, too.

Third, more emphasis should be placed on expanding the self and identity of girls and women to provide other domains in which they can glean social rewards and secure esteem beyond a sexualized appearance.

Please, please, please, rent/buy Miss Representation and tell your friends about it. It’s one of the most succinct (and recent) documentaries exploring the issues with how women are portrayed in the media.

Fourth, it is necessary to provide girls and women with specific actions that can be taken in the face of sexual objectification that do not require modification of one’s body in order to arm them with a greater sense of control over these largely uncontrolled and uncontrollable situations.

This goes back to that first point about those in the helping professions — it’s so important to be able to offer a different avenue of action to someone who is seeking out something like cosmetic surgery as a result of self-objectification.

Fifth, to the extent that self-objectification might be a risk factor for repeated surgery and low satisfaction with surgical outcomes, engagement with cosmetic surgery professionals to at least think about the implications of these patterns is worthwhile.

Almost as a “last resort” kind of thing, as the researchers suggested, it would be important for folks who work in cosmetic surgery offices to be aware of this issue of self-objectification (triggered by the words their patients have read). While it may not be “good for business,” I would hope that folks who work in this industry would put counselling their potential clients on research like this first. I should clarify that I don’t mean to imply that anyone working in the cosmetic surgery industry is simply in it for the money; the field can be as noble as any other area of medicine (consider those in plastic surgery, the umbrella under which cosmetic surgery falls, who help burn victims).

Finally, it is critical that practitioners take up the challenge of changing the system of sexual objectification that perpetuates self-objectification and the concomitant consequences in the first place (Calogero & Tylka, in press). In light of the potential risks of undergoing any surgery and anesthesia, the pursuit of elective cosmetic surgery may represent another harmful micro-level consequence of self-objectification for women, which will require our attention on many fronts.

Calogero, R., Pina, A., & Sutton, R. (2013). Cutting words: Priming self-objectification increases women’s intention to pursue cosmetic surgery. Psychology of Women Quarterly, 38(2), 197-207. DOI: 10.1177/0361684313506881