
Archive for the ‘Psychology’ Category

By Ruthanna Gordon

Watch the latest CSI, and you’ll see a world where evidence is clear-cut and scientific.  By the end of any episode, the viewer can feel secure that the guilty party has been caught, and will be tried and convicted.

In the real world, the situation is often far more complicated.  If investigators are very lucky, criminals leave behind sufficient DNA for testing, and either have that DNA already on file from previous convictions, or get caught so that a new comparison can be made.  (Even then, unless the case is very high-priority, test results come back in weeks rather than hours.)  Under these circumstances, determinations of guilt can be clear-cut and scientific.  Other types of evidence tend to be fuzzier.

Eyewitness testimony, for example, is far more unreliable than people like to admit.  Leading questions can change someone’s memory outright, or bias their descriptions.  One study showed people a video of a car accident:

How fast do you think the cars were going when they smashed into each other?

If you’re like most people watching the video, your answer to that question—presumably asked by the prosecution—will be 10-15 miles per hour faster than if the defense asked you about the speed of the cars when they “contacted” each other.  Other studies have used similar manipulations to change memory for the weapon used, the appearance of the perpetrator, and sometimes even the perpetrator’s recollection of their own actions.

The same biases can also affect those whose job it is to interpret evidence.  The whorls and lines of your fingertips really are unique—but fingerprints tend to be smudged.  Fingerprint experts lean towards finding matches between the prints collected at a crime scene and those they already have access to.  And their confidence levels for these judgments tend to be unrealistically high.

An understanding of these issues has begun to trickle into the courtroom.  Defense lawyers sometimes bring in memory researchers to warn juries about eyewitness fallibility.  Juries are also more likely to receive information about accuracy rates for tests, or how well one can really see a face at 200 feet in the fog.  And yet, it’s still common for people convicted on other evidence to be exonerated by DNA testing.  It’s not yet clear how to create a working justice system that responds appropriately to these findings—but we’ve certainly got a long way to go.


Post by Ruthanna Gordon

Perhaps the most notable thing about time is that we notice it.  This may seem unimpressive at first, but think about it.  We don’t notice most of what happens in the universe.  Infrared, ultraviolet, and microwave radiation pass our eyes without a blip.  Magnetic fields pulse and fade.  But time is something that we perceive—without even having a sense set aside for it.

The most studied area in the psychology of time is our perception of the past.  Human  memory is complex and surprisingly fickle.  If you’ve ever gotten into a He Said/She Said argument, you know that people recall events in ways that make them look good, even at the cost of accuracy.  And when we’re very stressed—for example, just after witnessing a crime—we unthinkingly fill in memory gaps with whatever other people claim to have seen.

Even with their flaws, our memories do a lot for us.  Memory helps you keep track of who and what you are.  It helps you keep track of other people’s personalities, so that you learn who and when to trust.  It lets you improve your handling of a situation the second, third, and three thousandth time you encounter it.

The psychology of the future seems like it should be a fuzzier thing.  When we plan and daydream, we imagine many different possibilities.  We consider different ways of asking for a raise tomorrow, or fantasize about remote chances years away.  But it turns out that memory and planning are, in many ways, really the same thing.

Over the last several years, psychologists have begun to examine future thinking in detail.  What they’ve learned is that it draws heavily on memory.  Want to know what the future will be like?  In many cases, it will be like the past.  When it’s not, past patterns may still be useful in guessing at the differences.  Planning your next party lights up the same brain areas that spark when you reminisce about your last one.  These abilities, more and more frequently, are grouped together as “mental time travel.”

In the mind, the past and the future intertwine more closely than anyone guessed.  People with amnesia often have trouble creating a detailed image of the future.  People who give vague answers (“I always do badly at math”) when asked about specific events (“Tell me about the last time you took a test”) are prone to equally vague fears about the future, and to depression.  And some skills—a vivid imagination, social modeling, and mental mapping—extend our reach beyond the present and allow it to flourish in both directions.


New research shows that people prone to depression have less specific memories.  What else might our memories tell us about our personalities and predispositions?


Post by Ruthanna Gordon

I’m wrapping up seven years of classroom teaching this week. I’m going full time into science communication and policy, so I’m not exactly getting away from education. Still, it’s the sort of thing that makes one thoughtful. And I find, when I think about my time in the classroom, that storytelling has been wound through it all.

My graduate advisor was a storyteller—her introductory psychology course had a real reputation. If the topic was problem solving, she’d have the students rolling in the aisles while she explained how she got a squirrel out of her house with a vacuum cleaner. If the topic was consciousness, we TAs would check the doors surreptitiously for visiting administrators while she talked about her own experiences with altered states. I picked up the habit myself. Most of the time, I tell funny stories: “How Professor Gordon Used Old Issues of X-Men to Improve Her Memory,” or “What Professor Gordon Did When the Cloth Ceiling of Her Junkheap Car Collapsed.” But some of the most powerful stories are more serious.

At the beginning of a course, a lot of students—engineers, architects, and physicists looking to pick up an easy credit—are dubious about the whole idea of psychology as a science. It’s not really like a standard lab science, after all. Human minds are fuzzy and hard to pin down, and no one really wants to think that free will is predictable. Or maybe it’s so predictable that no science is necessary. Isn’t psychological “research” just a matter of confirming things that everyone knows intuitively?

So I tell them a story.

Imagine that you’re a student, perennially broke and looking for a chance to earn a little extra cash. You see a sign posted one day, for an experiment on memory. They’re paying a few dollars, not anything spectacular but enough to buy lunch. It will take less than an hour, so why not?

You arrive at the lab at the appointed time. The place is very posh—obviously well-funded, with thick carpets and curtains, and lots of little rooms connected by intercoms and one-way mirrors. There’s another subject there as well, signed up for the same time slot. He’s an older fellow, a little bit balding and shaky, but you chat for a few minutes while you wait, and he seems sweet and likeable.

Finally the experimenter arrives. He passes around the consent forms, and lets you know that he’s studying the effect of punishment on memory. As in many psychology experiments, “punishment” consists of electric shock—a tradition that dates back to the early behaviorists. He’s a bit short on assistants right now, so one of you is going to be the learner, who tries to memorize pairs of words and gets punished for any mistakes. The other will be the teacher, who reads out words for memorization and administers the shock when necessary. The experimenter himself will be timing responses and taking notes. The two of you pick roles out of a hat, and you have to admit you’re a bit relieved when you draw the role of the teacher.

The experimenter leads the two of you into the learner’s room, where there’s a chair with electrode hook-ups. You get a sample shock yourself, at the lowest level of 15 volts. It feels a bit like touching a doorknob on a dry winter day. The older fellow gets hooked up, and you’re led into the next room, where there’s a list of words, an intercom, and the control panel for the electrode array. The panel consists of a series of switches for increasing levels of shock, ranging from the static you just experienced all the way up to 450 volts—that end is marked with a red Danger sign.

You’re ready to begin. The experimenter explains the rules: you’ll read the words in pairs, and the learner is to memorize which words go together. Every time he gets a pair wrong, you give him the next highest level of shock—presumably increasing his motivation to avoid future errors.

The first couple of pairs, the learner gets right. But he seems nervous, and he starts to make mistakes. The first couple of times aren’t too bad, but as the voltage increases he starts to grunt, then to cry out in pain. At 150 volts, he’s had enough—he tells you, over the intercom, that he wants out of the experiment. You glance back at the experimenter, busy scribbling notes.

“Keep going,” he says. “It’s important to finish the experiment.”

What do you do?

(At this point, you can picture the sea of hands in response—28 students who say they’d stop, and 2 who say they wouldn’t, just for the shock value.)

If you do keep going, the cries of pain get worse. The learner refuses to give answers; the experimenter tells you, again and again, that the study must be completed. Finally the learner screams and goes silent. The experimenter tells you to keep going…

Some of you may recognize the Milgram shock experiments by now. If so, you know that the “learner” is a confederate of the experimenter, and not receiving any actual shocks. The “teacher” is the true subject, and the question is how willing people are to harm others—simply because an authority tells them to. The answer turns out to be that they are very willing: 65% turned up the voltage all the way to the Danger sign in spite of believing that the learner was seriously injured, possibly dead.

This is not intuitive. Ask a room of people who don’t know the results, and they’ll guess that maybe 1% or fewer would obey. Not normal people. Only the rare psychopaths. Only the monsters.

Running the experiment, and learning this nonintuitive truth about humanity, is important. It helps us to be on guard against our inner monsters, and to remember that questioning authority is possible. It helps us to be better people. And that, I tell my students, is why we do science.


Post by Ruthanna Gordon

I have a confession to make: I am not the world’s most visual thinker.  I like language.  I like finding the exact right word, Mark Twain’s “lightning” rather than his “lightning bug.”  By default, I think in blocks of text.  I have learned, through long experience, that this annoys people.  With some effort, I now remember to put pictures into my presentations and blog posts.

This is a little odd, because when I’m reminded, I actually learn as much from a good visual aid as anyone else.  I pore over XKCD’s intricate and amusing charts for informational tidbits. And the examples from this TED talk made my jaw drop with the sudden rush of clarity.  But somehow, I can still forget to look for—or create—the right visualization if it isn’t dropped in my lap.

Psychology suggests that my flaw isn’t all that unusual. Most people find imagery incredibly helpful—and most default to representing information verbally.  Some time back, a group of researchers asked people to solve a problem: they had to determine the most efficient ordering of processes in a hypothetical scenario, given several constraints.  When people were explicitly told to draw a diagram, they came up with better solutions, more quickly.  Yet those who just got the problem rarely used this strategy spontaneously.

Images are easier to remember than verbal descriptions.  A picture is not only worth a thousand words, it takes considerably fewer mental resources.  Images require us to ground even the most abstract ideas in specific, concrete details.  Since most of our cognitive abilities have evolved to deal with a specific, concrete world, imagery invokes all our most effective and efficient mental abilities.

We have some instinct for this advantage with kids.  In elementary school, students who have trouble reading may be given playsets to help them follow along—a toy farmhouse with animals that can be moved to fit story descriptions, for example. Those beginning addition or subtraction are given colored blocks to play with, and shown how to push them together or pull them apart to illustrate equations.

These innovations help children overcome bumps in learning, and stay ahead long term.  Yet, as we get older, we tend to dismiss the need for such aids.  It feels a bit like counting on your fingers.  It shouldn’t, though.  Visualization is at the heart of the way we think.


Post by Minna Krejci

Let’s take a short journey through our senses.  Take a look at the image below:

Some of the green cell nuclei in this taste bud contain the KATP sugar sensor, indicated in red. One of several sugar sensors recently shown to be present in sweet-sensing taste cells, KATP may help regulate sensitivity to sweet taste under different nutritional conditions. (Photo by Karen Yee, courtesy of the Monell Chemical Senses Center)

Within this mad swirl of color lie important clues as to how we taste “sweetness.”

It’s long been known that a taste receptor called T1r2+T1r3 is largely responsible for the detection of sweet compounds by our taste buds.  This receptor is sensitive to sugars like glucose and sucrose, as well as artificial sweeteners such as saccharin and aspartame.  However, evidence has also shown that mice that are missing the T1r3 part of the receptor still respond to certain sugars, particularly glucose.

To find the missing piece of the puzzle, a research team at the Monell Chemical Senses Center in Philadelphia cleverly thought to look to other organs in the body that sense glucose, such as the intestine and the pancreas.  (For more information, see their paper published in the Proceedings of the National Academy of Sciences last week).

As it turns out, sugar sensors in these other organs are also present in our sweet-sensing taste buds, in the same cells that contain the T1r2+T1r3 receptor.  These taste sensors seem to have various roles in detecting sugars.  Have you ever noticed that sweets taste sweeter in the presence of salt?  (I did have the pleasure of trying some salted caramels from Fran’s Chocolates in Seattle a few months ago and they were quite delicious…)  This effect may be at least partially a result of a glucose sensor called SGLT1 (originally found in the intestine), which transports glucose into the cell when sodium is present.

Another sugar sensor, called KATP, monitors glucose levels in the pancreas and triggers insulin responses.  Also found in sweet-sensing taste cells, this protein may have a role in modulating the sweet-sensitivity of the cells depending on the local environment (i.e., if you just ate something sweet) or overall blood glucose levels.

“The taste system continues to amaze me at how smart it is and how it serves to integrate taste sensation with digestive processes,” says Robert F. Margolskee, senior author on the paper.

So what does all this mean?  One thing we can take away is that it’s not as easy to trick our body with artificial sweeteners as we may have thought.  The authors suggest that combining artificial sweeteners with a small amount of sugar may lead to enhanced sweet perception compared with artificial sweeteners or sugar alone.  Also, the relationship between taste and digestive processes may provide clues as to why we crave sweets under certain conditions – hopefully, that knowledge could be used to help prevent overconsumption of these kinds of foods.

It seems like scientists are working hard to solve the mysteries of why we often crave unhealthy foods, and how to combat the health problems that often result, such as obesity and type 2 diabetes.  In reading about the sweet sensors, I also happened to come across an article from about a year ago that reported that in addition to the five previously known tastes that we can detect (sweet, salty, sour, bitter, and umami), we also have a sixth “fat” taste.  Apparently, some people are more sensitive to this taste than others, and long-term consumption of fatty foods may lead to decreased sensitivity.  It’s a vicious cycle where those with a history of high fat intake may be more at risk for overeating these kinds of foods.

There’s obviously much more to our sense of taste than that “tongue map” we learned in school (which apparently isn’t correct anyway).

Can someone just find a way to make broccoli taste like jellybeans?  I’d be the healthiest and happiest girl in the world.  Until then, as White House pastry chef William Yosses puts it, “we are hard-wired to like things that are bad for us.”  Oh well, I guess I’d better work on my self-control…

Here are a few other interesting stories out there regarding some of our other senses – I know there’s more, so please share!

How the brain “sees” words: http://www.nytimes.com/2011/02/22/science/22obbrain.html

Eye pigments that also sense temperature: http://www.sciencenews.org/view/generic/id/70998/title/Light-sensor_pulls_perplexing_double_duty

