Justifying indulgence on Thanksgiving

Louis CK’s take on Thanksgiving

Tomorrow is Thanksgiving! That time of year when your indulgence feels somewhat justified and the guilt over everything you ate is mitigated by the tradition of the holiday. If I just described you, don’t be alarmed. You are certainly not alone. Thanksgiving is a prime day for rationalizing indulgence among people who might otherwise feel guilty about blowing their diet. Recent research finds that people typically rely on six different explanations to justify eating (a lot of) unhealthy foods1, all of which are relevant to Thanksgiving dinner:

  1. Availability of unhealthy food: Unhealthy foods abound and are difficult to avoid. Those green beans just pale in comparison to the mashed potatoes.
  2. Intentions to compensate for the unhealthy eating in the near future: You make firm plans to exercise regularly for the next week and to limit your consumption of leftovers.
  3. Indulgence as an exception to the norm: Thanksgiving is just one day a year, after all. You never eat pumpkin pie or dessert, for that matter. And when was the last time you had your uncle’s stuffing?
  4. Feeling deserving of the unhealthy food: Related to all of the above and then some. You have been eating healthy foods consistently lately, and you just got a promotion at work. Plus, having to interact with some of your extended family makes you feel deserving of any prize.
  5. Curiosity-compelled indulgence: I don’t know exactly what this cookie is, but it looks and smells delicious. I have to try it!
  6. Irresistibility of the foods: Who can turn down sweet potato casserole? Everything smells fantastic!

Honestly, indulging occasionally shouldn’t have to be guilt-inducing. Thanksgiving is a special occasion involving atypical foods and eating companions who may live far away. The act of eating is social and pleasurable and should be enjoyed. However, if you’re someone for whom the holidays become a more lasting setback to your health goals, you are not doomed. Strategies exist for maintaining your health goals while still enjoying (yes!) your Thanksgiving meal. Check out this Slate article for tips from Brian Wansink, an expert food researcher, on how to manage your eating.

Did the researchers miss any justifications? What have you noticed at previous Thanksgiving meals?

Happy Thanksgiving, everyone!


1 Taylor, Webb, & Sheeran. (2013). ‘I deserve a treat!’: Justifications for indulgence undermine the translation of intentions into action. British Journal of Social Psychology.

Obesity and public health campaigns: Finding the Holy Grail

Obesity is a disease. It has been since June 2013, at least according to the American Medical Association (AMA). The AMA formally recognized obesity as a disease with the intention that such a classification would prompt additional funding for obesity research. However, a set of studies published in Psychological Science1 earlier this year suggests that designating obesity as a disease, without considering the psychological consequences, carries a mix of positive and negative implications for obese and average-weight individuals.

Across a set of three studies in which more than 50% of the 700+ participants were classified as overweight or obese according to the Body Mass Index (BMI), psychology researchers Crystal Hoyt, Jeni Burnette, and Lisa Auster-Gussman found that obese individuals reported significant decreases in weight concern and body dissatisfaction when they received the message that obesity is a disease. Average-weight individuals exposed to the same message demonstrated no such pattern. At first glance, this finding suggests that the “obesity as a disease” model is effective at increasing body satisfaction and, perhaps, decreasing internalized stigma.
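(For readers unfamiliar with BMI: it is simply weight in kilograms divided by the square of height in meters, bucketed into standard categories. Here’s a minimal sketch in Python using the standard CDC/WHO adult cutoffs; the function name and example numbers are mine, not the researchers’.)

```python
def bmi_category(weight_kg: float, height_m: float) -> str:
    """Classify adult weight status from BMI (kg / m^2) using standard CDC/WHO cutoffs."""
    bmi = weight_kg / height_m ** 2
    if bmi < 18.5:
        return "underweight"
    elif bmi < 25:
        return "normal weight"
    elif bmi < 30:
        return "overweight"
    return "obese"

# Example: 95 kg at 1.75 m gives a BMI of about 31.0 -> "obese"
print(bmi_category(95, 1.75))
```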

Choose Health LA County ad campaign: an example of a weight management-focused (incremental mindset) public health strategy

At a recent talk I attended, Dr. Burnette discussed these findings as well as related findings from some more recent studies about the relationship between the obesity as disease message and an entity mindset. An entity mindset is the belief that an ability or characteristic, such as intelligence or weight, is fixed and not malleable as a result of effort or behavior change2,3. Burnette suggested that believing obesity is a disease implies that weight is static, that it’s not people’s lack of willpower or behavior making them obese, but rather, their genes and physiologies. This notion seems to decrease anti-fat prejudice and blame placed on obese individuals4. This approach seems promising: reducing stigma and blame, as well as increasing the likelihood of research funding.

There’s a catch, of course. In addition to the decreased weight concern and body dissatisfaction, obese individuals who saw the disease message were also more likely to make hypothetical unhealthy food choices, unlike average-weight individuals or obese individuals exposed to the weight-management control message. The researchers suggest that these food choices may be a downstream consequence of the disease label and the entity mindset it may induce. That is, if obesity indicates a physiological malfunction, thus making weight control efforts ineffective, why bother trying? The very message that decreases blame seems to reduce motivation to manage weight, too.

The belief that weight loss efforts are ineffective for obese individuals is not completely implausible. Food researcher Traci Mann and fellow obesity researchers have found that long-term weight loss for obese folks is the exception, not the norm. Receiving the message that obesity is a disease and fixed may be affirming to an obese person who was previously told that weight loss attempts were a personal failing. Regardless, it could be argued that the positive impact of affirmation and acceptance is diminished if it’s accompanied by regular unhealthy food choices.

In contrast, obese participants who were shown the control message of standard weight-management strategies demonstrated a different pattern. Their concern for their weight did not decrease, nor did they subsequently choose higher-calorie, unhealthy foods. The implication, which the researchers mention, is that some level of mild body dissatisfaction may motivate people to eat healthier foods and to be more active. But these findings present something of a double-edged sword, as Dr. Burnette mentioned at her recent talk.

The weight-management (control) message may have induced an incremental mindset about weight, the alternative to an entity mindset. An incremental mindset affords people more agency by implying that weight can be altered, presumably through behavior change. Accompanying that empowerment, however, is a shift of blame away from an obese person’s genes and onto the person and their behavior. As Burnette said, promoting either mindset to obese and non-obese individuals alike can have negative effects.

So, what’s a public health professional to do, particularly when obesity has already been officially labeled a disease? It’s exactly that sort of question that Burnette and her collaborators would like to pursue in future research. Specifically, how should public health messages be structured to motivate and promote an incremental mindset for obese individuals without the body image costs and blame? No obvious or simple answers exist yet. Burnette says that the answer to that question would be “the Holy Grail.”

Anyone out there come across research that might answer this or have a suggestion? Post it in the comments!


1 Hoyt, C.L., Burnette, J.L., & Auster-Gussman, L. (2014). “Obesity is a disease”: Examining the self-regulatory impact of this public-health message. Psychological Science, 25, 997-1002.

2 Dweck, C.S., Chiu, C.Y., & Hong, Y.Y. (1995). Implicit theories and their role in judgments and reactions: A world from two perspectives. Psychological Inquiry, 6, 267-285.

3 Burnette, J.L., O’Boyle, E., VanEpps, E.M., Pollack, J.M., & Finkel, E.J. (2013). Mindsets matter: A meta-analytic review of implicit theories and self-regulation. Psychological Bulletin, 139, 655-701.

4 Monterosso, J., Royzman, E.B., & Schwartz, B. (2005). Explaining away responsibility: Effects of scientific explanation on perceived culpability. Ethics & Behavior, 15, 139-158.


Is academia in a post-sexist era?

Two senior faculty members at Cornell, Stephen J. Ceci and Wendy M. Williams, seem to think so. They wrote an entire paper and a recent op-ed in the New York Times on the topic. In the week since, the blogosphere and social media have done their part to destroy (read: point out) the inconsistencies and contradictions in the authors’ arguments, calling them out on their biased conclusions. I won’t personally add to these observations but will, instead, point you to some of the more compelling and insightful critiques of the laughable notion that academia* is no longer sexist:

1. Emily Willingham’s post: Academic science is sexist: We do have a problem here

2. Rebecca Schuman’s article in Slate: Don’t worry your pretty little heads

3. Red Ink’s post: Let me fix that for you, New York Times


What do y’all think?


* Academia, as a whole. Certain subjects and fields may have different experiences, although I doubt they vary greatly from the overall norm.

The Salem Witch Trials: Groupthink at its worst

In honor of Halloween and all things occult, I wanted to explore a historical event I was morbidly fascinated with as a child: the Salem Witch Trials. When I was younger, I couldn’t get my hands on enough novels and non-fiction books on the subject. I read The Crucible and watched the movie, and I was also fortunate enough to have relatives who lived in Salem, MA, so I got to visit most years around Halloween. While in Salem, I would walk through the cemeteries where the alleged witches had been laid to rest and read their tombstones, which actually listed the method used to kill the accused. I remember growing more engrossed in the event the more I learned about it—I couldn’t get past the swiftness of the accusations, the unfairness of the trials, the conformity, and the upturned power hierarchy of the Salem community. I didn’t necessarily think of it in those specific terms at the time, but in retrospect, my nascent social psychological wheels were turning.

What exactly happened during the Salem Witch Trials? What perpetuated the mass hysteria? Why did it take so long to stop?

Lithograph of Salem Witch Trials, 1892, by Joseph Baker

Actually, research on groupthink suggests that what happened in Salem Village* wasn’t all that unusual; terrible, yes, but surprising? Perhaps not. A few factors combined to allow for the perfect storm of the Salem Trials.

Groupthink1 is a way of thinking characterized by an excessive emphasis on group cohesion and solidarity. Often, group harmony is prioritized over making an accurate judgment, allowing important information to be ignored. Groupthink is most likely to occur when a group is highly cohesive, isolated, and stressed, has poor decision-making procedures, and is led by a forceful leader. Nearly all of these factors existed in Salem Village during the winter of 1692, the period leading up to and including the witch trials.

Highly cohesive group and group isolation. The Salem villagers were Puritans, tightly knit together by their religious beliefs, including fear of the Devil’s work. Because of their religious convictions, recent attacks by Native Americans, and tension with the wealthier Salem Town, the Salem villagers were distrustful of outsiders, leaving them to rely primarily on each other for support.

Forceful leader. Reverend Samuel Parris, the first ordained minister of Salem Village, ruled strictly and was known for his greedy nature. Editorial note: he doesn’t seem like the type of person who would allow people to speak their mind.

High stress. The winter of 1692 was a particularly harsh one, which strained Salem Village’s resources and increased its reliance on Salem Town. Adding to the strain were a number of people displaced by King William’s War, who landed in Salem Village, and a smallpox epidemic.

Poor decision-making. The trial process, a term I use loosely, allowed testimony about dreams and visions to be included, despite opposition from the respected minister Cotton Mather; likely, his voice just wasn’t loud enough yet to stop the momentum. Girls as young as four years old who were connected to accused older women, like Dorothy Good, daughter of Sarah Good, were questioned and thought to have confessed. These are just a few of the ways in which poor decision-making pervaded the process.

So, the groundwork was there. And when groupthink emerged, it did so violently with all of its accompanying symptoms:

Belief in the moral correctness of the group. Need I remind you that these were deeply religious people? They prayed every day and considered themselves to be the elect. In other words, they believed they had been predestined for heaven, chosen uniquely by the God they believed in. As K. David Goss put it in Daily Life during the Salem Witch Trials, their Puritan faith was all-encompassing. These religious beliefs contributed to a great deal of self-censorship and pressure to conform, particularly among women, who were expected to aspire to the ideal of the virtuous woman as described in the Bible (see Goss’s book for more). This pressure to conform and to suppress personal beliefs likely intensified once accusations were being made, lest an accusation be turned on anyone who dared to speak her mind.

Considering how ripe the conditions for groupthink were in stressed, isolated Salem Village, the mass hysteria and frenzy of the Salem Witch Trials weren’t completely unexpected, at least in hindsight. That the trials can be explained doesn’t detract from the horror, death, and upheaval that occurred. And community members of Salem did eventually put a stop to the madness, perhaps because the stress was unsustainable and damaged the group’s cohesiveness. The diminished cohesiveness may have opened the door for some powerful community members to feel comfortable enough to speak up. A public apology was eventually made in 1697 by Judge Sewall, who had overseen many of the trials, but it was too little, too late. Groupthink had left a permanent mark.

Groupthink can, and does, occur today, too. It can be avoided by having an impartial leader, being willing to seek outside opinions, creating subgroups to make decisions separately, and seeking anonymous opinions.2


*The place where the witch trials occurred was actually Salem Village, present-day Danvers, and was established several miles from Salem Town, now present-day Salem. See http://salem.lib.virginia.edu/Witch.html for more information.

1,2  Janis, I.L. (1972). Victims of groupthink: A psychological study of foreign policy decisions and fiascoes. Boston, MA: Houghton Mifflin.

Beating the Wedding Industrial Complex—You can, too!

September and October are becoming increasingly popular months for weddings, overtaking the more traditional wedding month of June, which makes now the perfect time to talk about the wedding industrial complex, or WIC for short. I first heard this term on APracticalWedding.com, where the editor-in-chief Meg Keene wrote a fantastic post on it, and it resonated with me immediately. You see, I’m getting married this weekend. I was in the thick of fighting off the WIC—now I’m almost through!—and it is difficult, despite my knowledge of social psychology.

For those of you who are married, and especially those who got married during the current social media era where social comparison is ever easier, you may already know what I’m talking about. For those of you who would like to get married eventually, consider yourself warned. The WIC is basically all those factors that interact to make the two people getting married feel pressured to have the “right” type of wedding. Note: “right” often translates to traditional and expensive.

The WIC preys on unsuspecting, well-intentioned folks who just want to have a nice wedding. They want to have fun, they want it to be organized and pretty, and they definitely do not want anyone to be hungry. Deciding on the specifics that correspond to each of those desires is more ambiguous. Enter the wedding industry: a $51-billion-a-year industry that seeks to convince you that you absolutely need a fully stocked open bar, and that your dress must be Vera Wang with an intricate bouquet to match. Anything less, and your wedding will be just that: less than.

Admittedly, I’m painting a harsh picture of the wedding industry, making it seem calculating and manipulative, all with the goal of getting you to spend as much money as possible to have The Perfect Wedding. If I’m being fair, the wedding industry is not the only one to blame here; after all, the ultimate goal of any business is to turn a profit. It’s also you. Yes, you, your expectations, societal influence, and more than a little bit of social psychology.

  One of the most egregious examples of injunctive wedding norms I’ve ever seen

Why do we feel the need to stress out over seemingly inconsequential details about the wedding (like whether the cummerbund color will clash with the table runners)* when, realistically, we know that a wedding is not about the color scheme? What it really boils down to is social norms: how people “should” behave versus how people actually behave. The former is an injunctive norm, and it implies there’s a right and wrong way to do things. The latter is a descriptive norm, and it simply describes what people are doing.

The injunctive norm tells us that weddings are supposed to have a sit-down dinner, that paper flowers are not okay, that not having a bridal party—gasp!—is totally crossing the line. I won’t even get into the prescriptive norms about how a bride should look. The descriptive norm incorporates all the wedding experiences of your friends, family, celebrities, and the wedding industry, as if to say, “See? This is how weddings are happening all across your world.” Descriptive norms don’t imply that an action or event is right or wrong, but the boundary between descriptive and injunctive seems to blur when wedding planning is involved.

Clearly, both types of norms contribute to the wedding industrial complex, because they draw in a person’s experiences and exposure to what the mainstream culture suggests is appropriate for a wedding.

So how can you escape the seemingly impenetrable wedding industrial complex? Well, you can use norms to your own advantage. Specifically, use descriptive norms in a way that promotes and helps your wedding experience. Make your wedding the norm. After all, your wedding should count as much as any of the others. Seek out additional sources of support (like apracticalwedding.com) that diminish any injunctive norms, because there really isn’t a right or wrong way to do weddings. Finally, even if you aren’t planning a wedding yourself right now, be supportive of anyone who is. Don’t contribute to the wedding industrial complex by imposing injunctive norms on anyone. You may have been part of the problem, but you can also be part of the solution!

Wish me luck this weekend.



*Some profanity in here. Potentially NSFW.

**The traditional wedding I’m drawing my norms from refers primarily to a middle- to upper-class wedding. Admittedly, weddings of all shapes and sizes exist, with their own accompanying pressures!

Check your privilege

Let’s talk about privilege. White privilege. Male privilege. Class privilege. Straight privilege. All of those categories of people’s identities that dictate how they experience the world, whether they like it or not, and of which there are many more. We often think about privilege from a sociological perspective: institutional racism/sexism/other -isms, and limitations and expectations imposed on people by mainstream society. But what if we thought about privilege through a psychological lens? Specifically, what if we considered the role of the individual both in perpetuating and recognizing unfair privilege in society?*

This summer I taught Social Psychology to undergraduates. To prepare for class, I did my best to incorporate current cultural examples of topics and processes to make them more relevant and relatable to my students, which got me thinking about the fundamental attribution error and its role in perpetuating privilege in all of its various forms.

The fundamental attribution error (FAE) is a topic covered early in social psychology courses. It’s covered early because it reveals a basic (fundamental, if I may) flaw in the way that people interpret the world. Briefly, the FAE refers to the tendency to assume that another person’s behavior is caused by something inherent about that person—their disposition, their personality, etc.—rather than taking the situational context into account. Consider this scenario: you see someone you know walking toward you on the sidewalk. You smile and wave, but this person walks right past you without any acknowledgment. “How rude!” you think. I thought he was a nice person, but maybe I was wrong, you might conclude. If your thinking follows that pattern, then you just committed the FAE. In all likelihood, the person you knew probably just didn’t see you. His behavior was a function of the situation—maybe he was in a hurry—and not an indicator of who he is as a person. This type of example is often how the FAE is taught, particularly because social psychology takes the individual as its unit of analysis.

But the FAE has significant implications for privilege if you zoom out and examine its influence at a societal level. It relates to privilege because the groups of people who are likely to be oppressed and systematically discriminated against as a function of white (or male, or any other type of) privilege are also likely to be victims of the FAE. One of the components of privilege is that you are not seen as an ambassador or a “token” of your group (see any of the links above). One man doesn’t speak for all men. One white person doesn’t speak for all white people. These examples seem obvious. Yet people who fall outside of these privileged groups often carry the burden of being viewed as the sole representative of their group. If one black person speaks, that person is more likely to be seen as representative of all black people, which is an incredibly unfair responsibility.

Combine this with people’s tendency to commit the FAE, and you can see the problem. If a black woman is treated unfairly at a checkout counter and she responds in frustration or anger, the unfortunate consequence is that observers are less likely to perceive and consider all the different factors in the situation. Rather than see this woman as someone who is reacting appropriately (or at the very least, justifiably) to her current experience, people are likely to do two things: 1) decide that this woman is an angry and impatient person, and 2) extend that judgment to all black women, which helps to explain the trope of the “angry black woman.” The FAE contributes to the first assumption, and white privilege is responsible for the second. Because not only does white privilege inordinately and unfairly favor white people, it does so at the expense of people of color. Not only do white people get boosted up and given the benefit of the doubt, but people of color get pushed down further. Replace white with male, straight, or class, and people of color with female/trans, gay, or poor, and you get a staggering number of different biased scenarios**.

It’s not all bad news, however. Combating the FAE is possible, although it may require some effort. Those who don’t commit the FAE may simply be more empathic people, but they also tend to be people who know that the FAE exists. They understand how it works and are aware of the shortcomings of human perception. People who know about the FAE can work to pay more attention to situational factors that may explain someone’s behavior without jumping to the conclusion that an act is because of a person’s disposition. For an excellent example of how to do this, check out this video:


As for privilege, recognizing its existence is one of the first steps to not being complicit in it. I’m talking to those of you who fit into some privileged group or another. Many of us do find ourselves in at least one group of privilege at some point in our lives—remember, I’m speaking to you from my perspective as a white middle-class person. No, we didn’t ask for this privilege. No, we don’t think it’s fair. No, it doesn’t matter that we think these things. Like the FAE, privilege is subtle, and recognizing it is just the first step. We must pay attention to how it plays into the situations of our daily lives, and then take steps to correct it. It’s a lot to take on, but the weight is very little compared to what non-privileged folks must bear every day. Doing so is the only acceptable option if we want individuals to contribute collectively to true racial and social justice. If you’re not sure how, or you’d like to learn more about privilege, start by reading any of the blogs below.

Resources:

Black Girl Dangerous

TimWise.org

Decolonizing Yoga

It’s Pronounced Metrosexual


*I recognize that the fact that I can choose when I want to think about privilege is, in fact, another element of my white privilege. Not everyone has that luxury.

** I chose to focus on white privilege for two reasons: 1) As a white person, I’ve benefitted from and experienced white privilege all my life, and 2) the heart-wrenching and infuriating race-related events of the last few months, particularly in Ferguson, Missouri.

Mental shortcuts and portion control


Standard cheesy nacho connectedness (Image courtesy of flickr user Eddie Welker)

Have you ever picked up a chip from a plate of nachos only to find that it was stuck to several others, creating one large nacho mass of cheesy goodness? Or maybe it happened with cookies that had been baked together. Regardless of the specific food, how many times have you looked at that larger-than-intended portion in your hand and shrugged while thinking, it’s still just one nacho (or cookie or whatever)? If so, you are not alone!

People are constantly inundated with a multitude of stimuli from their environments, particularly when making decisions about eating. To keep from becoming overwhelmed by the sheer number of decisions (how many cookies to eat, what kind to choose, when to eat them, and so on), people rely on heuristics, or mental shortcuts.

The unit bias heuristic is the tendency to perceive a single unit of food as the appropriate amount to eat, regardless of how big that unit is (1). In other words, eating a cookie, no matter how big that cookie is, feels acceptable and not guilt-inducing to most people, even if the cookie is actually the size of three.

Naturally, people vary in how frequently they rely on unit bias and also in the size of the typical unit used. For example, one large cookie or a full package of cookies can both be considered to be a single unit depending on the person or the circumstance. Unit bias doesn’t become particularly problematic to people’s health unless they are regularly consuming extra-large portions as one unit, such as a full bag of chips or an entire box of cereal.* In these cases, people may need external support, sometimes called segmentation cues (2), to provide indicators to stop eating. Segmentation cues are also often called “portion control.” For example, 100-calorie snack packs act as a cue to limit your intake of a particular food item.

To learn more about unit bias and segmentation cues, check out the papers below, or email us at Socialpsyq@gmail.com.


(1) Geier, A.B., Rozin, P., & Doros, G. (2006). Unit bias: A new heuristic that helps explain the effect of portion size on food intake. Psychological Science, 17, 521-525.

(2) Geier, A.B., Wansink, B., & Rozin, P. (2012). Red potato chips: Segmentation cues can substantially decrease food intake. Health Psychology. Advance online publication. doi: 10.1037/a0027221


* Of course, these portions don’t apply to everyone. If someone is a high performance athlete, for example, then their calorie intake will look very different from the average person.

Should kids avoid the cereal aisle for their health?


Image courtesy of Cornell Food and Brand Lab

Is it just me, or is the cereal aisle much more complicated and sinister than when we were kids? Every time I walk down that aisle, my frustration spikes. The choices, so many choices! Chocolate Krave. Cap’n Crunch. Chocolate Cheerios. Frosted Flakes. Even Rocky Mountain Chocolate Factory has a cereal now, featuring chocolate bits that you can eat for breakfast. That last part is meant to pull in the kids, and it works. Those kids will nag their parents to buy it, and the parents will eventually give in[i] because, oh, they’re frustrated, too. Maybe even more than I am.

I’m not frustrated by the mere existence of so many options, necessarily. Rather, it’s the quality of the options that is concerning. Are any of these cereals actually healthy enough that children should be consuming them regularly? Not usually. A recent study on cereal quality found that cereal brands marketed to children had 56% more sugar, 52% less fiber, and 50% more sodium than cereals marketed to adults.[ii] Most of these cereals also feature spokes-characters, like the silly rabbit from Trix, Cap’n Crunch, or Tony the Tiger, which are familiar to children and increase the appeal of the cereal brand. And let’s not forget that the rate of childhood obesity in this country is still holding strong at 17%.[iii] That’s nearly 13 million kids.

See what I mean about sinister? Now, brand marketing is not inherently negative, but when marketing of unhealthy foods is targeted toward children, then cereal companies like Kellogg and General Mills take a step into the danger zone. Sugary cereals are perhaps even more insidious than other snack foods because they are junk foods disguised as a friendly breakfast.

In-store marketing strategies take it one step further. Cereal companies pay top dollar to get an ideal shelf location that will appeal to children[iv]. In a recent study published in the journal Environment and Behavior, researchers found that cereal brands marketed to children were more likely to be at a child’s eye level and to contain spokes-characters whose gazes angled downward at approximately the height of an average child[v]. In contrast, cereals marketed primarily to adults featuring spokes-characters (think Wheaties) had level gazes. And this seemingly subtle shift in height and gaze is effective. People in the study reported a strong preference for the cereal that featured a spokes-character that made eye contact. By placing their cereals on the middle or bottom shelf, then, companies are ensuring that children will make eye contact with spokes-characters and feel connected and loyal to that brand.

This type of marketing exploits and manipulates children, and cereal companies should be held more accountable. In the past few years, and in recent months especially, there has been a serious push to create stricter regulations for companies that market primarily to children. Based on the findings of Musicus and colleagues, just one of many similar studies, these regulations can’t come soon enough. Obesity remains a pressing issue. People may be tired of hearing about it, and obesity rates may have stabilized in several states[vi], but that doesn’t mean the problem has gone away.

But even if we take obesity out of the equation, even if we recognize that not all children have the same risk factors for becoming obese, it doesn’t mean that kids should be regularly consuming unhealthy, sugary food. Parents want to protect their children in every way they can, and they’re stretched to their limits as it is. Cereal companies, and all other food companies for that matter, should be held to stricter regulations. Some marketing standards have been successful[vii], but more needs to be done. Regulations should stretch beyond nutrition to include specific marketing techniques, such as shelf placement and the use of spokes-characters. We shouldn’t make the cereal aisle another battleground where parents need to be on the front lines.

Posted by Jen

~~~~~~

If you’re interested in this topic and would like to learn more, check out the links below.

Center for Science in the Public Interest: Food Marketing Workgroup

Yale Rudd Center for Food Policy and Obesity

Healthy Eating Research

Salud Today

Eyes in the aisles: Why is Cap’n Crunch looking down at my child? (abstract)

[i], [vii] A Review of Food Marketing to Children and Adolescents — Follow-Up Report. See http://www.ftc.gov/reports/review-food-marketing-children-adolescents-follow-report.

[ii] Harris, J. L., & Graff, S. K. (2012). Protecting young people from junk food advertising: Implications of psychological research for First Amendment law. American Journal of Public Health, 102, 214-222.

[iii] http://www.cdc.gov/obesity/data/childhood.html

[iv] Wilkie, W.L., Desrochers, D.M., & Gundlach, G.T. (2002). Marketing research and public policy: The case of slotting fees. Journal of Public Policy and Marketing, 21, 275-288.

[v] Musicus, A., Tal, A., & Wansink, B. (2014). Eyes in the aisles: Why is Cap’n Crunch looking down at my child? Environment and Behavior, 0013916514528793.

[vi] http://www.latimes.com/science/sciencenow/la-sci-sn-american-obesity-crisis-stabilizing-20140904-story.html.

Can iPhones predict your happiness?


Old-school generation iPhone 4s (Image from flickr.com)

Well, it’s here. The iPhone 6. And suddenly, predictably, everyone with an iPhone 5s or lower feels inadequate. Sales of the iPhone 6 are predicted to outpace those of every iPhone before it. Someone I went to college with posted on Facebook yesterday, “The iPhone 6. A piece of s!%t compared to the iPhone 7.” He posted the same thing when the iPhone 5 came out a few years back. His post is funny precisely because it captures the technological version of “keeping up with the Joneses”: people are already looking ahead to the next cool device.

The iPhone is hardly the first product marketed to provoke this reaction in consumers. Devices in the tech world and the fashion world, in particular, seem to rely on this notion of being up to date and “en vogue,” if you will.

So, Apple is simply capitalizing on a pattern that seems to be part of the human condition and that other companies also capitalize on to, well, make capital. But what is it that makes people so eager to give up the devices or clothes they’re currently using or wearing, most likely something they were perfectly satisfied with, and clamor to get the newest and latest?

It could be something social psychologists call affective forecasting. Affective forecasting is the ability to predict how we will feel in the future. It’s essentially a forecast for our emotions. And we all know how reliable weather forecasts are once they’re more than a few days out. Not surprisingly, people are notoriously bad at predicting how they’ll feel. It’s true for positive and negative emotions. People generally overestimate how angry or upset and how excited or happy they’ll feel when X happens. Psychologists think that this miscalibration happens for a number of reasons (to learn more, check out Dan Gilbert’s and Tim Wilson’s pages), including one explanation that I’d like to focus on: a happiness baseline.

Everyone you know, yourself included, has a baseline happiness level or a set point [1,2]. Sure, it fluctuates occasionally, and there are certainly days when you’re happier than others, but most of the time our happiness level falls around our own particular set point. While this is good news for people who are feeling crappy, it doesn’t bode well for people who attempt to alter their happiness with products.

In other words, people who think that the latest iPhone will make them happier than they’ve ever been might be right…for a few days or weeks. After that, they’re likely to revert to their baseline and feel the same way they did with the iPhone 5 or their Android phone or even their BlackBerry. Well, maybe not the BlackBerry. RIP BlackBerry phones. But I digress. Most people fail to notice these patterns about themselves. They don’t learn from their mistakes, so even though purchasing the iPhone 5s only temporarily elevated their happiness over the iPhone 4, these same people will likely be just as eager to obtain their very own iPhone 6 for the very same reasons.
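To make the set-point idea concrete, here is a toy sketch in Python (purely illustrative: the baseline, boost, and half-life numbers are invented, and exponential decay is just one simple way to model hedonic adaptation, not a claim from the papers cited below):

```python
import math

def happiness(t_days: float, baseline: float = 6.0,
              boost: float = 2.0, half_life_days: float = 14.0) -> float:
    """Toy model: a new-gadget happiness bump decays exponentially
    back toward the person's set point. All parameters are invented."""
    decay = math.exp(-math.log(2) * t_days / half_life_days)
    return baseline + boost * decay

for t in (0, 7, 14, 30, 60):
    print(f"day {t:>2}: happiness ~ {happiness(t):.2f}")
# day  0: 8.00, day 14: 7.00, day 60: 6.10 -- nearly back at baseline
```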

Maybe it’s because people are eternal optimists. Maybe it’s because we’re victims of advertising. Or maybe we don’t know ourselves as much as we think we do. So before you upgrade to the newest and latest, check in with yourself and ask why. Remember, money can’t buy happiness.


Posted by Jen

~~~~~

[1] Diener, E., Suh, E.M., Lucas, R.E., & Smith, H.L. (1999). Subjective well-being: Three decades of progress. Psychological Bulletin, 125, 276-302.

[2] Lykken, D. & Tellegen, A. (1996). Happiness is a stochastic phenomenon. Psychological Science, 7, 186-189.