Psychology and delayed gratification

Back in 2009, The New Yorker published an interesting article on psychology and self-control. It describes an experiment in which children were challenged to delay gratification, and then considers what implications their success or failure at such tasks has for their lives. It also describes some of the mechanisms through which people are able to defer an immediate pleasure in favour of a larger one later:

At the time, psychologists assumed that children’s ability to wait depended on how badly they wanted the marshmallow. But it soon became obvious that every child craved the extra treat. What, then, determined self-control? Mischel’s conclusion, based on hundreds of hours of observation, was that the crucial skill was the “strategic allocation of attention.” Instead of getting obsessed with the marshmallow—the “hot stimulus”—the patient children distracted themselves by covering their eyes, pretending to play hide-and-seek underneath the desk, or singing songs from “Sesame Street.” Their desire wasn’t defeated—it was merely forgotten. “If you’re thinking about the marshmallow and how delicious it is, then you’re going to eat it,” Mischel says. “The key is to avoid thinking about it in the first place.”

In adults, this skill is often referred to as metacognition, or thinking about thinking, and it’s what allows people to outsmart their shortcomings. (When Odysseus had himself tied to the ship’s mast, he was using some of the skills of metacognition: knowing he wouldn’t be able to resist the Sirens’ song, he made it impossible to give in.) Mischel’s large data set from various studies allowed him to see that children with a more accurate understanding of the workings of self-control were better able to delay gratification. “What’s interesting about four-year-olds is that they’re just figuring out the rules of thinking,” Mischel says. “The kids who couldn’t delay would often have the rules backwards. They would think that the best way to resist the marshmallow is to stare right at it, to keep a close eye on the goal. But that’s a terrible idea. If you do that, you’re going to ring the bell before I leave the room.”

Perhaps the most useful thing about psychology is the way in which it allows us to learn about the limitations of our own minds. Once we recognize the many flaws in human reasoning, it becomes easier to avoid falling prey to them and to manage well in the world.

Photography and social roles

A number of my friends are fairly serious amateur photographers: people who have built up a repertoire of knowledge, acquired various sorts of gear, and begun displaying their photography publicly online. Photography is certainly an excellent pastime. It satisfies geeky cravings for toys to play with, while serving as a creative outlet. It also lets you document and share what is going on in your life with a group of friends who are increasingly likely to be far-flung, as we stay in touch with people from former schools and employers all over the world.

In addition to those appealing elements, photography has an interesting role within group dynamics. Everyone wants flattering photos of themselves, so being able to provide them makes you valuable to others. There is also competition among people who take photos, on the basis of quality of output, creativity, choice of subjects, and gear. Indeed, photo gear is an increasingly apt way of demonstrating wealth. While automobiles are probably the premier form of wealth expression in some social circles, they are poorly matched to a lifestyle in which people move around often and relatively rarely see their friends in person. Photography is useful, visible, and a way of demonstrating capability, access, and wealth.

[Aside] On a somewhat related note, OKCupid has some data on what makes an attractive photo: specifically, a non-flash shot taken with an SLR or Four Thirds system camera at f/1.2 or f/1.8. The average 30-year-old iPhone user has also had significantly more sexual partners than the average BlackBerry and (especially) Android user.

A sign you’re living badly

Paul Graham has written an interesting piece on addictiveness. He argues that people are vulnerable to getting addicted to all sorts of things, and that avoiding this requires you to behave in an abnormal way: “You can probably take it as a rule of thumb from now on that if people don’t think you’re weird, you’re living badly.”

This strikes me as an interesting and possibly true observation, and an extension of our prior discussion of the nature of addiction.

Single player and multiplayer

I have always preferred the single-player modes in games like Half-Life and Warcraft III to the multiplayer modes. The latter strike me as excessively hectic, with everybody racing to destroy their enemies and generating a lot of chaos in the process. Single-player games allow you to take your time and execute things perfectly, in a much more controlled way.

It has occurred to me that the two options might appeal to rather different sorts of people. Multiplayer fans may be the sort who are thrilled by immediate engagement and happy to come out on top, even when the process of doing so is risky and disorderly. If they lose 90% of their army but end up victorious, they are happy. Single-player modes may appeal to the sort of obsessive individual who wants to find a way to beat the enemy without losing a single unit or suffering a major setback. They are well suited to the risk-averse.

In life, it does seem that the kinds of skills required in multiplayer are generally of more use than those required in single player. While there are areas of life where methodically developing a plan and then implementing it is both possible and a good strategy, there seem to be many more where a capacity for improvisation and a willingness not to dwell on losses and failures are more valuable. Is there any way, I wonder, to turn a natural single-player fan into a more engaged multiplayer one?

Pickup artists

I find the phenomenon of ‘pickup artists’ somewhat disturbing.

Basically, these are individuals who exploit quirks of human psychology in order to get people to sleep with them easily. Human behaviour is predictable to such an extent that many tricks are effective against a sizable proportion of the population. For example, you can use a minor insult called a ‘neg’ to make a person feel like they have to prove themselves to you. A long piece on pickup artists in The Point Magazine describes how this is at the core of the technique: “the key to the method is, unquestionably, that the pickup artist ignore, tease, or even insult the targeted female, accustomed as she is to constant, beleaguering attention from men.” There is also the whole collection of cold reading tricks long employed by psychics and con artists to give the false sense that they have special insight into you.

If it were widely known that such tricks can be effective, their use would be less worrisome. When they are employed against unwitting subjects, however, they strike me as exploitative and potentially unethical. The article linked above contains a detailed discussion of the ethics and psychology of this unusual set of skills.

The DSM and defining mental illness

The Diagnostic and Statistical Manual of Mental Disorders (DSM) is published by the American Psychiatric Association and contains the most authoritative definitions of mental illnesses. The current version – the DSM-IV – was released in 1994. Now, work is ongoing on a fifth edition.

To me, it seems that ‘mental illness’ often describes a situation in which a person manifests a normal part of psychology to an excessive extent. For instance, it is perfectly normal, and probably even essential, for people to feel things like guilt, shame, and anxiety. Any of these felt to an extreme degree, whether that means unusually strongly or unusually weakly, could form the basis for a mental illness.

There is a danger, perhaps, in being too quick to say that someone is ill when they simply manifest a normal tendency to an unusual degree. Doing so might make them feel stigmatized and lead to unnecessary medical interventions. It also risks making people feel less responsible for their choices and actions, since these can be ascribed to a medical condition rather than to the free expression of their will. At the same time, increased awareness of mental illness is probably an important thing for society to develop. My sense is that most people do not have a great understanding of the character of mental illnesses, and that society is generally poorly set up to assist people suffering from them.

Tattoo motivations

I had never given much thought to why people in their teens and twenties are so often interested in getting piercings and tattoos. Recently, however, it occurred to me that one rather valid reason for doing so is to assert ownership of one’s own body, particularly in defiance of one’s parents. Having never felt as though I had less than complete ownership of my body, I had simply never considered that potential motivation.

Arguably, it is especially important for young women to take such a stance (though there are obviously many ways of doing so). That is because their bodies have much more commonly been treated as the property of others, or at least as under their control – whether the outside entity is the state, the family, or some religious structure.

Debit chic

Back in the heady days of high stock prices, rising house prices, and seemingly robust economic growth, having an elite credit card with a high credit limit was a status symbol. Now that the whole world has been forced to confront problematic levels of debt, it seems like having the ability to make impulse purchases on credit shouldn’t have much cachet anymore.

Indeed, perhaps the humble debit card should be today’s status symbol. By breaking it out and typing in your PIN, you demonstrate to everybody around you that you have the cash on hand to fund your purchase right now, rather than merely Visa’s or Mastercard’s confidence that you will be able to pay them back later.

Materialism and free will

I have written before about the apparent contradiction between free will and materialism (the idea that the universe consists exclusively of particles that obey physical laws). The problem is easy enough to state: if every particle in the universe behaves in a manner governed by a combination of random chance and predictable laws, how can a physical entity like the brain respond to stimuli in a way that is neither random nor determined?

Joshua Gold of the University of Pennsylvania and Michael Shadlen of the University of Washington recently summarized some experiments on monkeys that illuminate this issue. They found that they could use a computer to predict how monkeys would respond to visual stimuli, suggesting that such mental functions are automatic.

Of course, there is a big difference between parts of mental life like maintaining a steady heartbeat or visually tracking a moving object and those like making ethical decisions. That said, I remain unable to see what mechanism could separate the former from the latter, or square our intuitive belief in free will with what we know about the functioning of the universe. Even so, we have no reason to act as though free will does not exist. The reason is simple: if free will doesn’t exist, we have no influence over what we believe or how we act, while if it does exist we certainly want to behave appropriately. As such, if we do have any scope to choose, we should choose to believe in free will.
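One way to lay out this wager-style reasoning more explicitly is as a decision matrix, in the spirit of Pascal’s wager. This is only a sketch, and the outcome labels are my own illustrative glosses rather than anything from the studies mentioned above:

```latex
% Sketch of the wager over believing in free will.
% Outcome labels are illustrative assumptions, not from the source text.
\begin{array}{l|c|c}
                     & \text{Free will exists}         & \text{No free will} \\
\hline
\text{Believe in it} & \text{you exercise real agency} & \text{belief was determined; nothing lost} \\
\text{Reject it}     & \text{you forgo real agency}    & \text{belief was determined; nothing lost} \\
\end{array}
```

If any genuine choice is available at all, the first row does at least as well as the second in every case, which is the sense in which we should choose to believe.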

Age and openness to new ideas

I wonder whether there is a time in life by which our aesthetic and political preferences have been essentially locked in, after which we are no longer fully capable of integrating new ideas. It certainly seems plausible. It could also help to explain broader patterns of social change: as each generation rises to positions of influence, it brings with it the intuitive assumptions about politics and ethics absorbed in youth. Often, that means being willing to accept things that were outside the bounds of what was acceptable to the generation before, but which are less radical than what will be accepted by the generation after.

If true, this dynamic could also be a major reason why people dying is an important form of social progress. To take one example, as fewer and fewer surviving parents refuse to tolerate having their child in an inter-racial relationship, such relationships have become less taboo within society generally. I have also read about how scientific progress depends to some extent upon the death of highly respected individuals who have become overly wedded to their own ideas in old age, and who keep the mainstream from accepting what the latest research has shown to be true about the universe.

Obviously, not everybody has their preferences and instincts ossify at exactly the same time, or to the same extent. That said, if there is evidence that such a phenomenon exists generally, it could have political and sociological importance. For one thing, it would highlight the importance of the education system, and of the overall body of information available to youth, in determining what society will look like a few decades from now.

Do people think such a phenomenon is real? If so, what would the most important consequences be?