How A Tiny Bit Of Procrastination Can Help You Make Better Decisions

Decision-making is what you might call “practical science.” Findings on how we make decisions have direct applicability to life outside the psychology lab, and in recent years quite a lot has been said about these most commonplace, yet complicated, feats of mind. A new study adds to the discussion by suggesting that a wee bit of procrastination can make us better decision-makers.

Researchers from the Columbia University Medical Center wanted to know if they could improve decision accuracy by inserting just a smidgen more time between people’s observation of a problem and their decision on how to respond.

The research team conducted two experiments to test this hypothesis. First, they asked study participants to make a judgment about the direction of a cluster of rapidly moving dark dots on a computer monitor. As the dots (called the “target dots” in the study) traveled across the screen, participants had to determine if the overall movement was right or left. At the same time, another set of brighter colored dots (the “distractor dots”) emerged on the screen to obscure the movement of the first set. Participants were asked to make their decisions as quickly as possible.

When the first and second set of dots moved in generally the same direction, participants completed the task with near-perfect accuracy. When the second set of dots moved in a different direction than the first, the error rate significantly increased. Simple enough.

The second experiment was identical to the first, except this time participants were told to make their decisions when they heard a clicking sound. The researchers varied the clicks to be heard between 17 and 500 milliseconds after the participants began watching the dots – a timespan chosen to mimic real-life situations, such as driving, where events unfold so quickly that the passage of time is almost imperceptible.

The research team found that when participants’ decisions were delayed by about 120 milliseconds, their accuracy significantly improved.

"Manipulating how long the subject viewed the stimulus before responding allowed us to determine how quickly the brain is able to block out the distractors and focus on the target dots," said Jack Grinband, PhD, one of the study authors. "In this situation, it takes about 120 milliseconds to shift attention from one stimulus, the bright distractors, to the darker targets."

The researchers were careful to distinguish “delaying” from “prolonging” the decision process. There seems to be a sweet spot that allows the brain just enough time to filter out distractions and focus on the target. If there’s too little time, the brain tries to make a decision while it’s still sorting through the distractions. If there’s too much time, the process can be derailed by additional distractions.

If you’re wondering how anyone can actually do this with so little time to make a decision, the answer—suggested by this study—is practice. Just as the participants were cued by the clicks to make a decision, it would seem that we have to train ourselves to delay just long enough to filter distractions.

Said another way, doing nothing--for just a tiny amount of time--gives the brain an opportunity to process and execute. (In my book, Brain Changer, I refer to this as the "awareness wedge" because it's a consciously inserted "wedge" between the immediacy of whatever situation we're facing and our next action.)

This research also underscores just how dangerous it can be to add distractions to the mix—like using a phone while driving—when our brains already need a cushion of time to filter the normal array of distractions we experience all the time.

The study appears in the online journal PLoS One.

You can find David DiSalvo on Twitter @neuronarrative and at his website, The Daily Brain. His latest book is Brain Changer: How Harnessing Your Brain’s Power To Adapt Can Change Your Life.

Posted on April 6, 2014 .

The Good and Bad News About Your Sleep Debt

Sleep, science tells us, is a lot like a bank account with a minimum balance penalty. You can short the account a few days a month as long as you replenish it with fresh funds before the penalty kicks in. This understanding, known colloquially as “paying off your sleep debt,” has held sway over sleep research for the last few decades, and has served as a comfortable context for popular media to discuss sleep with weary-eyed readers and listeners.

The question is – just how scientifically valid is the sleep debt theory?

Recent research targeted this question by testing the theory across a few things that sleep, or the lack of it, is known to influence: attention, stress, daytime sleepiness, and low-grade inflammation. The first three are widely known for their linkage to sleep, while the last—inflammation—isn’t, but should be. Low-grade tissue inflammation has been increasingly linked to a range of health problems, with heart disease high on the list.

Study participants were first evaluated in a sleep lab for four nights of eight-hour sleep to establish a baseline. This provided the researchers with a measurement of normal attention, stress, sleepiness and inflammation levels to measure against.

The participants then endured six nights of six-hour sleep (a decent average for someone working a demanding job and managing an active family and social life). They were then allowed three nights of 10-hour catch-up sleep. Throughout the study, participants’ health and ability to perform a series of tasks were evaluated.

Sleep debt theory predicts that the negative effects from the first six nights of minimal sleep would be largely reversed by the last three nights of catch-up sleep – but that’s not exactly what happened.

The analysis showed that the six nights of sleep deprivation had a negative effect on attention, daytime sleepiness, and inflammation as measured by blood levels of interleukin-6 (IL-6), a biomarker for tissue inflammation throughout the body — all as predicted. It did not, however, have an effect on levels of the stress hormone cortisol—the biomarker used to measure stress in the study—which remained essentially the same as baseline levels.

After three nights of catch-up sleep, daytime sleepiness returned to baseline levels – score one for sleep debt theory. Levels of IL-6 also returned to baseline after catch-up – another score in the theory’s corner. Cortisol levels remained unchanged, but that’s not necessarily a plus for the theory (more on that in a moment).

Attention levels, which dropped significantly during the sleep-deprivation period, didn't return to baseline after the catch-up period. That’s an especially big strike against the theory since attention, perhaps more than any other measurement, directly affects performance. Combined with the many other draws on attention—like using a smart phone while trying to drive—minimal sleep isn’t just a hindrance, it’s dangerous, and this study tells us that sleeping heavily on the weekends won’t renew it.

Coming back to the stress hormone cortisol, the researchers point out that its level remaining relatively unchanged probably indicates that the participants were already sleep deprived before they started the study. Previous research has shown a strong connection between cortisol and sleep; the less sleep we get, the higher the level of the stress hormone circulating in our bodies, and that carries its own set of health dangers. This study doesn’t contradict that evidence, but also doesn’t tell us one way or the other if catch-up sleep decreases cortisol levels.

The takeaway from the study is that catch-up sleep helps us pay off some, but by no means all, of our sleep debt. And given the results on impaired attention, another takeaway is that it’s best to keep your sleep-deprived nights to a minimum. Just because you slept in Saturday and Sunday doesn’t mean you’ll be sharp Monday morning.


Posted on March 15, 2014 .

Balancing the Self-Control Seesaw

Imagine a seesaw in your brain. On one side is your desire system, the network of brain areas related to seeking pleasure and reward. On the other side is your self-control system, the network of brain areas that throw up red flags before you engage in risky behavior. The tough questions facing scientific explorers of behavior are what makes the seesaw too heavy on either side, and why is it so difficult to achieve balance?

A new study from University of Texas-Austin, Yale and UCLA researchers suggests that for many of us, the issue is not that we’re too heavy on desire, but rather that we’re too light on self-control.

Researchers asked study participants hooked up to a magnetic resonance imaging (MRI) scanner to play a video game designed to simulate risk-taking. The game is called Balloon Analogue Risk Task (BART), which past research has shown correlates well with self-reported risk-taking such as drug and alcohol use, smoking, gambling, driving without a seatbelt, stealing and engaging in unprotected sex.

The research team used specialized software to look for patterns of activity across the brain that preceded someone making a risky or safe decision while playing the game.

The software was then used to predict what other subjects would choose during the game based solely on their brain activity. The results: the software accurately predicted people's choices 71 percent of the time.

What this means is that there’s a predictable pattern of brain activity associated with choosing to take or not take risks.

"These patterns are reliable enough that not only can we predict what will happen in an additional test on the same person, but on people we haven't seen before," said Russ Poldrack, director of UT Austin's Imaging Research Center and professor of psychology and neuroscience.

The especially intriguing part of this study is that the researchers were able to “train” the software to identify specific brain regions associated with risk-taking. The results fell within what’s commonly known as the “executive control” regions of the brain that encompass things like mental focus, working memory and attention. The patterns identified by the software suggest a decrease in intensity across the executive control regions when someone opts for risk, or is simply thinking about doing something risky.

"We all have these desires, but whether we act on them is a function of control," says Sarah Helfinstein, a postdoctoral researcher at UT Austin and lead author of the study.

Coming back to the seesaw analogy, this research suggests that even if our desire system is level, our self-control system appears to slow down in the face of risk; less intensity on that side of the seesaw naturally elevates intensity on the other side.

And that’s under normal conditions. Add variables like peer pressure, sleep deprivation and drug and alcohol use to the equation--all of which further handicap self-control--and the imbalance can only become more pronounced.

That’s what the next phase of this research will focus on, says Helfinstein. "If we can figure out the factors in the world that influence the brain, we can draw conclusions about what actions are best at helping people resist risks.”

Ideally, we'd be able to balance the seesaw -- enabling consistently healthy discretion as to which risks are worth taking. While it's evident that too much exposure to risk is dangerous, it's equally true that too little exposure to risk leads to stagnation.

We are, after all, an adaptive species. If we're never challenged to adapt to new risks, we stop learning and developing, and eventually sink into boredom, which, ironically, sets us up to take even more radical risks. 

The study appears in the journal Proceedings of the National Academy of Sciences.


Posted on February 28, 2014 .

How to Squeeze Snake Oil From Deer Antlers and Make Millions

If I offered to sell you a liquid extract made from the velvety coating of deer antlers, claiming that it will catalyze muscle growth, slow aging, improve athletic performance and supercharge your libido – I’d expect you'd be a little skeptical. But what if I added that a huge percentage of professional athletes are using the stuff, paying top dollar ($100 or more an ounce), and swearing up and down that just a few mouth sprays a day deliver all the benefits as advertised? Would you be willing to give it a try?

Ever since former Baltimore Ravens star Ray Lewis was reported a few months ago to have used deer antler spray (an allegation he has denied), the market for the stuff has exploded. Some estimates say that close to half of all professional football and baseball players are using it and a hefty percentage of college players as well, to say nothing of the army of weightlifters and bodybuilders that have made the spray a daily part of their routines.

TV journalism bastion 60 Minutes recently ran a special sports segment about "Deer Antler Man" Mitch Ross, the product's highest profile salesman, and the tsunami of buyers for oral deer antler spray and its growing list of celebrity devotees. Without question, deer antler spray has captivated the attention of the sports world and is rapidly pushing into mainstream markets.

Let’s take a look at the science behind the claims and try to find out what’s really fueling the surge in sales for this peculiar product.

The velvety coating of deer antlers is a chemically interesting material. For centuries it’s been used in eastern traditions as a remedy for a range of maladies, and there’s an underlying rationale for why it theoretically could be useful for certain conditions. The velvet coating contains small amounts of insulin-like growth factor 1, or IGF-1, that has been studied for several decades as a clinically proven means to reverse growth disorders in humans. For example, in children born with Laron Syndrome—a disorder that causes insensitivity to growth hormone, resulting in dwarfism—treatment with IGF-1 has been shown to dramatically increase growth rates. IGF-1 appears to act as a chemical facilitator for the production of growth hormone from the pituitary gland, and in sufficient amounts even synthetically derived IGF-1 can help boost physical growth.

That’s the reason why IGF-1, in certain forms, has been banned by the Food and Drug Administration (FDA) and the World Anti-Doping Agency: it can produce outcomes similar to those of human growth hormone and anabolic steroids. The forms these agencies have banned, however, are high-dosage, ultra-purified liquids administered by injection.

Why can’t the FDA and anti-doping agencies ban IGF-1 outright? For the simple reason that the chemical, in trace amounts, is found in things we eat every day: red meat, eggs and dairy products. Every time you eat a juicy ribeye or have a few eggs over easy, you’re ingesting IGF-1.

In the tiny amounts of the substance found in these foods, we may experience a cumulative, positive effect on muscle repair over time, but you’ll never be able to drink enough whole milk in a sitting to experience the anabolic effects you’d get from a syringe full of concentrated and purified IGF-1.

As I mentioned, the velvety substance on growing deer antlers also contains trace amounts of IGF-1, and (along with oddities like powdered tiger bone) has been sold in China for centuries as a traditional cure for several ailments. In traditional Chinese medicine, the antler is divided into segments, each segment targeted to different problems.  The middle segment, for example, is sold as a cure for adult arthritis, while the upper section is sold as a solution for growth-related problems in children. The antler tip is considered the most valuable part and sells for top dollar.

The main source for the market explosion in deer antler spray is New Zealand, which produces 450 tons of deer velvet annually, compared to the relatively small amount produced by the US and Canada: about 20 tons annually. Deer can be killed outright for their antlers, but in New Zealand the more accepted procedure is to anesthetize the deer and remove the antlers at the base. The antlers are then shipped overseas to the growing market demanding them.

The reason why deer antler velvet is usually turned into an oral liquid spray instead of a pill (although it is also sold in pill form around the world) is that the trace proteins in the substance are rapidly broken down by the digestive system, so only a fraction of the already tiny amount actually makes it into the bloodstream. In spray form, IGF-1 can potentially penetrate mucosal membranes and enter the bloodstream intact more quickly. Purchasing the spray form can run anywhere from about $20 for a tiny bottle to $200 for two ounces. Standard doses are several sprays per day, so the monthly costs of using the product are exorbitant.

The question is: does deer antler spray deliver the benefits its sellers claim? These alleged benefits include accelerated muscle growth and muscle repair, tendon repair, enhanced stamina, slowing of the aging process, and increased libido – a virtual biological panacea of outcomes.

The consensus opinion from leading endocrinologists studying the substance, including Dr. Roberto Salvatori at the Johns Hopkins School of Medicine and Dr. Alan Rogol at the University of Virginia, is that the chances of it delivering on any of these benefits are slim to none. The reason is simply that there's far too little of the substance in even the purest forms of the spray to make any difference.

Think of it this way: If a steak contains roughly the same trace amount of IGF-1 as deer antler velvet, is there any evidence to suggest that eating steak can provide the same array of benefits claimed for deer antler spray? No, there’s not a shred of clinical evidence to support that claim.

And yet, thousands of people are paying close to $200 a bottle for the spray believing that it will deliver these benefits.  With such high-profile celebrity connections as Ray Lewis and golf superstar Vijay Singh, there’s little wonder why the craze has picked up momentum. But in light of scientific evidence, there’s no credible reason to pay $200 or any amount for a bottle of deer antler spray.

Aside from the lack of evidence supporting benefits, it’s unclear what the negative effects of using the product long-term may be. WebMD reports that the compounds in the spray may mimic estrogen in the body, which could contribute to a variety of cancers or worsen conditions such as uterine fibroids in women. Elevated estrogen levels in men can throw off hormonal balance and lead to a thickening waistline and a host of related metabolic problems.

The takeaway is this: deer antler spray is the latest high-priced snake oil captivating the market. Not only will it cost you a lot of money and not deliver promised benefits, but it could lead to negative health outcomes. Let the deer keep their antler velvet and keep your cash in your wallet.


Posted on February 21, 2014 .

The Era Of Genetically-Altered Humans Could Begin This Year

By the middle of 2014, the prospect of altering DNA to produce a genetically-modified human could move from science fiction to science reality.  At some point between now and July, the UK parliament is likely to vote on whether a new form of in vitro fertilization (IVF)—involving DNA from three parents—becomes legally available to couples. If it passes, the law would be the first to allow pre-birth human-DNA modification, and another door to the future will open.

The procedure involves replacing mitochondrial DNA (mtDNA) to avoid destructive cell mutations. Mitochondria are the power plants of human cells that convert energy from food into what our cells need to function, and they carry their own DNA apart from the nuclear DNA in our chromosomes where most of our genetic information is stored. Only the mother passes on mtDNA to the child, and it occasionally contains mutations that can lead to serious problems.

According to the journal Nature, an estimated 1 in 5,000-10,000 people carry mtDNA with mutations leading to blindness, diabetes, dementia, epilepsy and several other impairments (the equivalent of 1,000 – 4,000 children born each year in the U.S.). Some of the mutations lead to fatal diseases, like Leigh Syndrome, a rare neurological disorder that emerges in infancy and progressively destroys the ability to think and move.

By combining normal mitochondrial DNA from a donor with the nucleus from a prospective mother’s egg, the newborn is theoretically free from mutations that would eventually lead to one or more of these disorders. While never tried in humans (human cell research on mtDNA has so far been confined to the lab), researchers have successfully tested the procedure in rhesus monkeys.

Last March, the UK Human Fertilisation and Embryology Authority wrapped up a lengthy study of safety and ethical considerations and advised parliament to approve the procedure in humans. According to New Scientist magazine, parliament is likely to vote on the procedure by July of this year. If the procedure overcomes that hurdle, it will still take several months to pass into law, but the initial vote will allow researchers to begin recruiting couples for the first human mtDNA replacement trials.

The U.S. is not nearly as close to approving mtDNA replacement as the UK seems poised to do; the U.S. Food and Drug Administration will start reviewing the data in earnest in February.  Among the concerns on the table is whether the mtDNA donor mother could be considered a true “co-parent” of the child, and if so, can she claim parental rights?

Even though the donor would be contributing just 0.1 percent of the child’s total DNA (according to the New Scientist report), we don’t as yet have a DNA benchmark to judge the issue. Who is to say what percentage of a person’s DNA must come from another human to constitute biological parenthood?

Other scientists have raised concerns about the compatibility of donor mtDNA with the host nucleus and believe the push to legalize human trials is premature. By artificially separating mtDNA from the nucleus, these researchers argue, we may be short-circuiting levels of genetic communication that we're only beginning to fully understand.

These are but two of many issues that this procedure will surface in the coming months. One thing is certain: we’re rapidly moving into new and deeper waters, and chances are we're going to need a bigger boat.


Posted on February 9, 2014 .

Why Is Heroin Abuse Rising While Other Drug Abuse Is Falling?

Peter Shumlin, Democratic governor of Vermont, moved heroin addiction to the front burner of national news by devoting his entire State of the State address to his state’s dramatic increase in heroin abuse. Shumlin described the situation as an “epidemic,” with heroin abuse increasing 770 percent in Vermont since 2000.

Vermont is a microcosm of the nation. Across the U.S., heroin abuse among first-time users has increased by more than 70 percent in the last decade, from about 90,000 to 156,000 new users a year, according to the U.S. Substance Abuse and Mental Health Services Administration (SAMHSA).

At the same time, non-medical prescription opiate abuse has slowly decreased. According to the SAMHSA 2012 National Survey on Drug Use and Health, the number of new non-medical users of pain killers in 2012 was 1.9 million; in 2002 it was 2.2 million. [It bears repeating that these stats are for non-medical abuse of prescription pain killers, not abuse of drugs obtained with a legitimate prescription.]

In the same time-frame, abuse of methamphetamine also decreased. The number of new users of meth among persons aged 12 or older was 133,000 in 2012, compared to about 160,000 in 2002.

Cocaine abuse also fell, from over 1 million new users in 2002 to about 640,000 in 2012. Crack abuse fell from over 200,000 new users in 2002 to about 84,000 in 2012 (a number that’s held steady for the last three years).

The statistics suggest that heroin has taken up the slack from the fall-offs among other major drugs. (Only marijuana and hallucinogens like ecstasy have held steady or slightly increased among new users over the last decade; that’s not surprising, since they’re the drugs of choice among the youngest users, and since pot has been angling toward legalization for the last few years.)

Most surprising in this sea of stats is the drop in non-medical prescription opiate abuse overlapping with an increase in heroin abuse. The reason may come down to basic economics: illegally obtained prescription pain killers have become more expensive and harder to get, while the price and difficulty in obtaining heroin have decreased. An 80 mg OxyContin pill runs between $60 and $100 on the street. Heroin costs about $9 a dose. Even among heavy heroin abusers, a day’s worth of the drug is cheaper than a couple hits of Oxy.
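As a quick arithmetic check, the percent changes implied by the SAMHSA figures quoted above, and the street-cost comparison, can be worked out in a few lines. (The three-doses-a-day figure in the cost comparison is an illustrative assumption, not a number from the survey.)

```python
def pct_change(old, new):
    """Percent change from old to new."""
    return (new - old) / old * 100

# New users per year, 2002 vs. 2012, per the SAMHSA figures cited above.
print(f"heroin:  {pct_change(90_000, 156_000):+.0f}%")    # roughly +73%
print(f"meth:    {pct_change(160_000, 133_000):+.0f}%")   # roughly -17%
print(f"cocaine: {pct_change(1_000_000, 640_000):+.0f}%") # roughly -36%
print(f"crack:   {pct_change(200_000, 84_000):+.0f}%")    # roughly -58%

# Street economics: even three $9 heroin doses a day undercut
# a couple of 80 mg OxyContin pills at the $60 low end.
heroin_per_day = 3 * 9   # $27 (3 doses/day is an illustrative assumption)
oxy_pair = 2 * 60        # $120
print(heroin_per_day < oxy_pair)  # True
```

Heroin is the only figure moving upward, which is the article's point: the decline in pills is being offset, dose for dollar, by cheaper heroin.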

Laws cracking down on non-medical prescription pain killers have also played a role. The amount of drugs like Oxy hitting the streets has decreased, but the steady flow of heroin hasn’t hiccupped.  Many cities are reporting that previous non-medical abusers of prescription pain killers—who are often high income professionals—have turned to heroin as a cheaper, easier-to-buy alternative.

One conclusion that can be drawn from the stats is that prescription opiates are serving as a gateway drug for heroin, not so much by choice but by default. The market moves to fill holes in demand, and heroin is effectively filling fissures in demand opened by legal pressures and cost.

Another interesting stat is that among first-time drug users, the mean age of initiation for non-medical prescription pain killers and heroin is virtually identical: 22 to 23 years old. That would also support an argument that there’s a cross-over effect from drugs like Oxy to heroin (in contrast, the mean ages for first-time users of pot and ecstasy are 18 and 20, respectively).

Vermont’s heroin problem seems a harbinger of things to come in the more affluent parts of the country. According to the U.S. Census Bureau, Vermont’s median household income, home ownership rate, and percentage of people with graduate and professional degrees are all higher than the national averages, and Vermont’s percentage of those living at or below poverty level is significantly lower than the national average.

The bottom line: Vermont’s stratospheric heroin increase is happening where the money is, and the national drug abuse trends suggest that the same thing is happening across the country.


Posted on February 4, 2014 .

How Video Games Will Help Us Steal Back Our Focus

I’ve become a focus junkie. If I see something written in a legit publication about techniques or technologies to improve mental focus, I freebase it—mainly because the forces draining focus are unrelenting, and I’m convinced that the only way to regain balance is by indulging measures that are just as intense. (My working philosophy: extreme forces call for extreme adaptation, using the best tools and strategies science can afford us.)

Enter author and psychologist Daniel Goleman, popularizer of “Emotional Intelligence”, and author of a new book about the power of focus called, simply, “Focus”.  Goleman is one of my favorite writers in the psychology space because his work is a true example of what I call “science-help” – he’s all about the research. When you glean takeaway knowledge from a Goleman book, you can be sure it’s been tested and credible enough to earn his writer’s brand.

Because I’m also a midnight snacker of business nibblets, I came across Goleman’s latest article in the Harvard Business Review, “The Focused Leader: How effective executives direct their own—and their organization's—attention". The entire piece is well worth the magazine's $17 cover price (or at least buying a PDF reprint online), but I was especially intrigued by a sidebar in the article about a new species of video games designed to help regain our focus in a focus-fragmenting world.

Dave Eggers or Michael Chabon couldn’t come up with a better ironic twist than video games—engaging and entertaining video games, no less(!)—being used to sharpen attention. As Goleman discusses in HBR, neuroscientists at the University of Wisconsin-Madison have grabbed hold of this task like ticks on a deerhound and produced a video game slated for a 2014 release called, fittingly, “Tenacity”. Quoting Goleman:

“The game offers a leisurely journey through any of half a dozen scenes, from a barren desert to a fantasy staircase spiraling heavenward. At the beginner’s level you tap an iPad screen with one finger every time you exhale; the challenge is to tap two fingers with every fifth breath. As you move to higher levels, you’re presented with more distractions—a helicopter flies into view, a plane does a flip, a flock of birds suddenly scud by.”

The objective is the same as that of meditation—to draw attention back to a central point despite the number or intensity of distractions dive-bombing one’s focus.  Goleman adds, “When players are attuned to the rhythm of their breathing, they experience the strengthening of selective attention as a feeling of calm focus, as in meditation.”

University of Wisconsin-Madison researchers see this as just the beginning of a focus-enhancing revolution in digital tech.  Through an initiative called Games+Learning+Society (GLS), they are pioneering efforts that marry entertainment with enrichment, and building it all on a platform of solid science.

The team boasts members with serious science street cred, like neuroscientist Richard J. Davidson, founder of the Center for Investigating Healthy Minds, whose work on neuroplasticity (the brain’s ability to change at the neuronal level) could carry Promethean fire to the video game world.  Davidson is leading research to identify what’s happening in the brains of people who use games like Tenacity, with the hypothesis that the technology will help train our brains for enhanced focus, and—believe it or not—greater kindness.

“Modern neuroscientific research on neuroplasticity leads us to the inevitable conclusion that well-being, kindness and focused attention are best regarded as skills that can be enhanced through training,” says Davidson. “This study is uniquely positioned to determine if game playing can impact these brain circuits and lead to increases in mindfulness and kindness.”

Given the deluge of news about video games leading to violence, the idea that they could make us a bit nicer sounds, well, mighty nice. And the truth is that it's not even far-fetched: it's an outcome sitting at the crossroads of ancient wisdom traditions and focus-enhancing technology—as we learn to more consistently focus our attention, we experience a change in both awareness and attitude. As everyone from the Buddha to David Foster Wallace has observed, once our awareness is enhanced and broadened, we can get out of our heads and interact more conscientiously with others.

That's the pro-social goal that has the Wisconsin team fired up about the focus-enhancing power of digital tech.  According to Constance Steinkuehler, co-director of GLS and associate professor of education at UW-Madison: “We’re looking at pro-social skills, particularly being able to recognize human emotions and then respond to them in some productive fashion, which turns out to be harder than you might think.”

Armed with Davidson’s brain-imaging analysis, the team wants to know if playing the games they’ve designed will foster pro-social adaptation in our noggins.

“We look at pre- and post-test measures and see if there is a difference,” said Steinkuehler. “For example, in Tenacity, our mindfulness app, you might ask yourself ‘Is there a dosage effect? Can we see that more game play has a more positive effect on kids’ attention?’”

If that's borne out, then GLS's technology-harnessing work could be the perfect counterbalance to the dubious video-game legacy the news media is so fond of trumpeting: that gaming does little more than foster anti-social behavior, everything from bullying to serial violence.

At a less radical level, the UW-Madison team’s work may also provide an antidote to the insular effects of digital tech. If doses of Tenacity, or similar games, lead to heightened focus and social awareness, then spending time buried in your smartphone or tablet could have an upside beyond accruing more gold and elixir for your barbarian clan.

“There’s this tremendous amount of time and energy investment in games and media,” says Steinkuehler. “So part of what we’ve been trying to figure out is how do we take some of that time and make it beneficial for the people engaged in it? We have examples from television or film of documentaries, of art pieces, of indie films, of shows like Sesame Street, that actually have documented benefits for their viewers. So games are another media, why not use them?”

It’s this pragmatic view of technology, as opposed to the absolutist view that too often creeps into our mindspace, that will eventually win the day. As Sesame Street proved decades ago amidst the clamor of “TV is an anti-educational evil!” fear mongering, we can make use of technology to enrich minds. The difficulty in doing so arises from fighting against path-of-least-resistance thinking—human nature's chronic disease of default—that turns us into willing slaves of our time-chewing vices.

The work of the GLS team and others crafting new uses for digital tech reminds us that how technology ultimately affects us is an outcome we can, and should, influence. If we punt on that responsibility, we shouldn't be surprised at the bad news that invariably follows.  But if we see the responsibility as an opportunity, we'll be surprised at how much good can come from the ones and zeroes in our hands.

David DiSalvo's newest book, Brain Changer, is now available at Amazon, Barnes and Noble and other major booksellers.

Posted on January 26, 2014 .

Study Shows That Electrical Stimulation Can Boost The Brain's Brakes

Using harmless electrical stimulation, researchers have shown that they can boost self-control by amplifying the human brain’s “brakes.”

Researchers from The University of Texas Health Science Center at Houston (UTHealth) and the University of California, San Diego asked study participants to perform simple tasks in which they had to exert self-control to slow down their behavior. While doing so, the team used brain imaging to identify the areas of the participants’ prefrontal cortex (sometimes called the brain’s “command and control center”) associated with the behavior—allowing them to pinpoint the specific brain area that would need a boost to make each participant’s “braking” ability more effective.

They then placed electrodes on the surface of the participants’ brains, over the prefrontal cortex areas linked with the behavior. With an imperceptible, computer-controlled electrical charge, the researchers were able to enhance self-control at the exact moment the participants needed it.

"There is a circuit in the brain for inhibiting or braking responses," said Nitin Tandon, M.D., the study's senior author and associate professor in The Vivian L. Smith Department of Neurosurgery at the UTHealth Medical School. "We believe we are the first to show that we can enhance this braking system with brain stimulation."

To make sure that specifically stimulating the prefrontal cortex was really causing the effect, the researchers conducted a follow-up in which they placed the electrodes on other surface areas of the participants’ brains. Doing so had no effect.

That’s an important point, because it separates this study from past research that used electrical stimulation to disrupt general brain function.  In contrast, this study shows that particular parts of the prefrontal cortex form a self-control circuit that can be externally enhanced.

What also makes this study noteworthy is that it was double-blind: neither the researchers nor the participants knew when or where the electrical charges were being administered. That’s critical because it means the participants could not have intentionally slowed their behavior to exaggerate the effect. They were, in a very real sense, being externally controlled by the stimulation, albeit only briefly.

The study has a few caveats. First, all of the participants were volunteers suffering from epilepsy who agreed to be monitored for seizures by hospital staff during the experiment.  Second, there were only four participants—though all four experienced the self-control boosting effect.  Obviously, placing electrodes on the surface of the brain is an invasive procedure, hence the small number of participants.

If this research sounds a little scary to you, you can relax knowing that we're a long way from externally controlling peoples' behavior. The true value of this study is to demonstrate that the brain's self-control circuit can be amplified, at least under certain conditions.

Placing electrodes on peoples' brains isn't a practical solution, but eventually the same effect may be triggered with scalp electrodes and, down the road, with medication that targets the self-control circuit. That may one day be promising news for sufferers of behavioral disorders like Tourette’s Syndrome and OCD.

The study was published in The Journal of Neuroscience.


Posted on January 16, 2014 .

Eat More Of These Four Things For A Stronger, Healthier Brain

Remember these four letters: DDFM.  If it’s easier, think of them as call letters for a cheesy radio station, “Double D FM!” The letters stand for four nutrients critical to brain health that you probably aren’t getting enough of: Vitamin D, DHA, Folate and Magnesium.

Research suggests that our diets are increasingly low in all four, and our brains are suffering for it.

Vitamin D

Why it’s important:  I stumbled across the importance of vitamin D when a routine blood test revealed that my level was low and my doctor recommended that I begin taking three 2000 i.u. vitamin D3 supplements a day. I’d always thought being out in the sun was enough to keep vitamin D levels high, because the human body uses sunlight to manufacture the vitamin. But research shows we’re frequently low in this essential vitamin and that’s potentially dangerous. Low levels are associated with free radical damage to brain cells and accelerated cognitive decline.  In addition to boosting brain health, there’s also evidence suggesting that vitamin D aids in muscle strength and repair.


How to get more of it:  Eat oily fish like wild salmon* and eggs. You can also get a boost by eating cheese; if you go this route I recommend Swiss cheese because it also contains a high level of conjugated linoleic acid (CLA), which has shown promise in helping reduce abdominal fat. Another decent source is Greek yogurt, but avoid brands with excess sugar (I’m not recommending milk for that reason – it’s naturally high in sugar). If your vitamin D levels are especially low—and it’s best to determine that via a blood test—consider taking a vitamin D3 supplement at a level your doctor recommends.**

DHA

Why it’s important:  DHA (Docosahexaenoic acid) plays a vital role in keeping cell membranes flexible, resilient and healthy. Healthy cell membranes are less susceptible to oxidative stress, the damage caused by free radicals, which can lead to cell mutation and, ultimately, cancer. DHA also appears to help brain cells regulate their energy use and protects them from inflammation—a condition linked to an array of degenerative diseases including Alzheimer's. In addition, low levels of DHA have been linked to depression, memory loss, and even elevated hostility. Suffice to say, there's enough credible research out there on DHA now to support a strong statement that it's essential to brain health.

How to get more of it: Eat more oily fish like wild salmon and sardines, though if you eat canned sardines try to find brands that are not packed in cans containing BPA, a chemical linked to a host of toxic badness. If you don't mind the taste, kelp (aka seaweed) is another excellent source. You can also get ample DHA in Omega 3 fish oil supplements. Just make sure that you are buying a brand that is filtered to remove mercury and has a high level of DHA (the EPA and DHA levels will be listed in the ingredients; try to get a supplement with at least 200mg of DHA per capsule).**

Folate

Why it’s important: Folate, a water-soluble B vitamin, has long been established as critical to brain development in infants; pregnant women are strongly advised to take a folate supplement to fend off birth defects. But research has also shown that folate is important to brains of all ages, and deficiencies are correlated with cognitive decline particularly in the elderly. Studies have linked folate to improved memory function and mental processing speed—two things that typically take a hit as we age. There's also evidence indicating that folate deficiency contributes to psychiatric disorders such as depression.

How to get more of it:  Eat unsalted peanuts. The little legumes are folate powerhouses, and they’re also packed with heart-healthy monounsaturated fat. If crunching nuts isn’t your thing, try natural peanut butter. Just stay away from peanut butter with added sugar and salt – stick to the kind that’s all peanuts. Other good sources include asparagus, black-eyed peas, spinach, broccoli and egg yolks.

Magnesium

Why it’s important:  In the brain, magnesium acts as a buffer between neuron synapses, particularly at the NMDA receptor, which plays a role in several cognitive functions including learning and memory. Magnesium “sits” on the receptor without activating it, in effect protecting it from over-activation by other neurochemicals, especially the neurotransmitter glutamate. If there isn’t enough magnesium available to protect NMDA receptors, glutamate triggers them constantly, causing an “excitatory” response. That’s why you often see magnesium advertised as a calming nutrient: it blocks glutamate from too frequently activating the NMDA receptors in your brain. The most important thing to remember is that without magnesium, over-activation of NMDA receptors eventually becomes toxic to the brain, leading to progressively worse damage and steady cognitive decline.


How to get more of it:  Eat spinach, it's loaded with magnesium. Other sources include almonds and black beans. Just be sure to eat raw or roasted almonds that are unsalted and not coated in sugar (even though those taste so good). Peanuts are also a decent source of magnesium, which makes them a double-whammy snack because they're also high in folate as mentioned above.

If you decide to take a magnesium supplement, be sure to find a readily absorbable form such as magnesium citrate, and avoid the less absorbable (but widely sold) magnesium oxide.**

*In each case where I recommended eating more fish, you'll notice that I said "wild salmon," and that's because there's troubling evidence to suggest that farm-raised salmon are a significantly less healthy choice for the brain and the heart.

** Always check with your doctor before beginning any supplement regimen.


Posted on January 9, 2014 .

Two Incredible Speeches About Thinking To Begin 2014

I offer you two of the most influential commencement addresses ever given: David Foster Wallace's "This is Water," delivered at Kenyon College in 2005, and Steve Jobs's commencement speech at Stanford, also from 2005. As the New Year begins, I urge you to listen to both and spend some time thinking about the messages of these two remarkable thinkers, from different parts of the culture, who had important lessons to teach us about our own thinking.

Happy New Year. 


Posted on December 31, 2013 .

Why The Future Of Online Dating Relies On Ignoring You

According to a new study, Netflix and Amazon have much to teach online dating sites. Netflix doesn’t wait around for you to tell it what you want; its algorithm is busy deciphering your behavior to figure it out. Likewise, say researchers, dating sites need to start ignoring what people put in their online profiles and use stealthy algorithmic logic to figure out ideal matches – matches that online daters may never have pursued on their own.

Kang Zhao, assistant professor of management sciences in the University of Iowa Tippie College of Business, is leading a team that developed an algorithm for dating sites that uses a person's contact history to recommend partners with whom they may be more compatible, following the lead of the model Netflix uses to recommend movies users might like by tracking their viewing history.

The difference between this approach and one that relies on a user’s profile can be night and day. A user’s contact history may in fact run entirely counter to what he or she claims to be looking for in a mate – and users usually aren’t even aware of it.

Zhao's team used a substantial amount of data provided by a popular commercial online dating service: 475,000 initial contacts involving 47,000 users in two U.S. cities over 196 days. About 28,000 of the users were men and 19,000 were women, and men made 80 percent of the initial contacts. Only about 25 percent of those contacts were reciprocated.

Zhao's team sought to improve the reciprocation rate by developing a model that combines two factors to recommend contacts: a client's tastes, determined by the types of people the client has contacted; and attractiveness/unattractiveness, determined by how many of those contacts are returned and how many are not.

“Those combinations of taste and attractiveness,” Zhao says, “do a better job of predicting successful connections than relying on information that clients enter into their profile, because what people put in their profile may not always be what they're really interested in. They could be intentionally misleading, or may not know themselves well enough to know their own tastes in the opposite sex.”

Zhao gives the example of a man who says on his profile that he likes tall women, but who may in fact be approaching mostly short women – even as the dating website continues to recommend tall women.

"Your actions reflect your taste and attractiveness in a way that could be more accurate than what you include in your profile," Zhao says. The research team’s algorithm will eventually “learn” that while a man says he likes tall women, he keeps contacting short women, and will unilaterally change its dating recommendations to him without his notice, much in the same way that Netflix’s algorithm learns that you’re really a closet drama devotee even though you claim to love action and sci-fi.

"In our model, users with similar taste and (un)attractiveness will have higher similarity scores than those who only share common taste or attractiveness," Zhao says. "The model also considers the match of both taste and attractiveness when recommending dating partners. Those who match both a service user's taste and attractiveness are more likely to be recommended than those who may only ignite unilateral interests."
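The combination Zhao describes (a user's revealed taste plus a candidate's attractiveness) can be sketched in a few lines of Python. This is a hypothetical illustration, not the team's actual model: the feature vectors, the cosine similarity and the reply-rate weighting are all my own simplifying assumptions.

```python
from math import sqrt

def cosine(u, v):
    # Cosine similarity between two feature vectors.
    num = sum(a * b for a, b in zip(u, v))
    den = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

def taste_vector(contacted_profiles):
    # Revealed taste: the average feature vector of everyone the user
    # has actually contacted, regardless of what their profile claims.
    n = len(contacted_profiles)
    return [sum(feature) / n for feature in zip(*contacted_profiles)]

def reply_rate(replies, contacts_received):
    # Attractiveness proxy: how often a candidate reciprocates contacts.
    return replies / contacts_received if contacts_received else 0.0

def match_score(user_contacted, candidate_features,
                candidate_replies, candidate_received):
    # A candidate ranks highly only if they both resemble the people the
    # user already contacts AND tend to reply to contacts themselves.
    taste = cosine(taste_vector(user_contacted), candidate_features)
    return taste * reply_rate(candidate_replies, candidate_received)
```

Under this toy scoring, a candidate who looks like the people a user actually messages, and who replies to half the contacts she receives, outranks a candidate who merely matches the written profile but rarely reciprocates – the "unilateral interests" Zhao's model is designed to avoid.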

When the research team’s algorithm is applied, the 25 percent reciprocation rate described above improves to about 44 percent – roughly a 75 percent jump.

Zhao says that his team’s algorithm seems to work best for people who post multiple photos of themselves, and also for women who say they “want many kids,” though the reasons for that correlation aren't quite clear.

If you’re wondering how soon online dating services could start overruling your profile to find your best match, Zhao’s team has already been approached by two major services interested in using the algorithm.   And it’s not only online dating that will eventually change. Zhao adds that college admissions offices and job recruiters will also benefit from the algorithm.

The age of Ignore is upon us, though safe money says we’ll continue thinking we’ve “chosen” the outcomes anyway.

The research was published in Social Computing, Behavioral-Cultural Modeling and Prediction.


Posted on December 28, 2013 .

Neuroscience Explains Why the Grinch Stole Christmas

"You're a mean one, Mr. Grinch."

But why?

We all know Dr. Seuss's iconic tale of the green ogre who lives on a mountain, seething while the Whos in the village below celebrate Christmas. The happier they are, the angrier he gets, until finally he can't take it anymore and hatches a plan to steal away their joy.

Dr. Seuss was a brilliant intuitive psychologist, and I'd have loved to chat with him about the core of the Grinch's rage, but, alas, he left us too early. So I'm turning to another impressive thinker who has taught me a great deal about the neurobiology of emotion: Dr. John Cacioppo, a pioneer in the field of social neuroscience and co-author of the book Loneliness: Human Nature and the Need for Social Connection.

Cacioppo has conducted a wealth of research on the effects of loneliness on the human brain. We're not talking about physical loneliness (although that can be part of the equation); we're talking about a sense of loneliness that someone in the midst of thousands of people can feel. I once saw an interview with Jon Bon Jovi in which he described the feeling he gets after leaving the stage. Surrounded by tens of thousands of fans, you might think he'd have no reason to feel lonely—but when he goes back to his hotel room, those thousands of people screaming his name may as well not exist at all. He feels alone despite being the center of a publicity universe.

That's much closer to the sort of loneliness Cacioppo studies, and it's especially relevant in the age of social media, where someone might have 2,000 Facebook friends and yet feel like they're completely alone in the world.  

If Cacioppo could persuade the Grinch to step into his MRI, he'd likely observe results consistent with those of a 2009 brain imaging study he conducted to identify differences in the neural mechanisms of lonely and nonlonely people. Specifically, he wanted to know what's going on in the brains of individuals with an acute sense of "social isolation"—a key ingredient in loneliness that has nothing to do with being physically alone, and everything to do with feeling alone.

While in an MRI machine, subjects viewed a series of images, some with positive connotations, such as happy people doing fun things, and others with negative associations, such as scenes of human conflict.  As the two groups watched pleasant imagery, the area of the brain that recognizes rewards showed a significantly greater response in nonlonely people than in lonely people. Similarly, the visual cortex of lonely subjects responded much more strongly to unpleasant images of people than to unpleasant images of objects—suggesting that the attention of lonely people is especially drawn to human conflict. Nonlonely subjects showed no such difference.

In short, people with an acute sense of social isolation appear to have a reduced response to things that make most people happy, and a heightened response to human conflict.  This explains a lot about people who not only seem to wallow in unhappiness, but also seem obsessed with the emotional "drama" of others. Every office has a few people just like that.

The Grinch is easier to understand given these findings. He is physically isolated (except for his dog), but more importantly he's socially isolated. He feels no sense of connection to the citizens of Whoville, though they live just outside his mountain lair. Watching them surround themselves with happy things like ornaments and gifts and food ticks him off, so he determines to inject some strife into the festivities and revel in the fallout. 

Fortunately, the Grinch has an epiphany (a Gestalt moment) that makes him not only want to return everything to Whoville, but participate in the merriment as well. The real-life corollary would probably include a couple years of therapy, but Dr. Seuss makes the point well enough: there is redemption for those suffering from loneliness.  It requires genuine connection with others—not faces in a cheering crowd or numbers on a Facebook page. Those things might supplement real relationships, but they can't replace them. 

And that, it seems to me, is the heart of the holidays: they are ritualized reminders that none of us are islands, and that no matter how many people surround us, we're only at our best when we allow some of them to be part of us.  

Happy holidays.  


Posted on December 21, 2013 .

New Study Asks: What Kind Of Bored Are You?

Most of us think we already know what it means to be bored, and we’ll look for just about any diversion to avoid the feeling.  But according to recent research, boredom is not a one-size-fits-all problem — what triggers or alleviates one person’s boredom won’t necessarily hold sway for someone else.

According to researchers publishing in the journal Motivation and Emotion, there are four well-established types of boredom:

Indifferent boredom (characterized by feeling relaxed and indifferent – typical couch potato boredom);

Calibrating boredom (characterized by feeling uncertain but also receptive to change/distraction);

Searching boredom (characterized by feeling restless and actively searching for change/distraction); and

Reactant boredom (characterized by feeling reactive, i.e. someone bored out of her mind storming out of a movie theater to find something better to do).

The most recent study by the boredom-defining research team has now identified a fifth type—apathetic boredom—and it's the most troublesome of all. People exhibiting apathetic boredom are withdrawn, avoid social contact, and are most likely to suffer from depression. In fact, apathetic boredom could be considered a portal leading to depression.

The sort of remedy that would alleviate “searching boredom”—actively pursuing change—would not help someone with apathetic boredom, because change itself represents too much of a threat. Apathetic boredom feeds on itself, perpetuating over and over the same feelings that make it so difficult to escape. The uncertainty of change is just another reason to stay cloistered away.

Study co-author Dr. Thomas Goetz of the University of Konstanz and the research team conducted two real-time experiments over two weeks involving students from German universities and high schools. Participants were given personal digital assistants to record their activities, experiences and feelings throughout the day for the duration of the study. The results showed that not only do different people experience different types of boredom, but also that people don't typically switch-hit between flavors of boredom – any given person will tend to predominantly experience one type of boredom far more than others.

The most alarming finding of the study is that apathetic boredom was reported by almost 40 percent of the high school students, suggesting a link between apathetic boredom and rising numbers of depressed teens.

The obvious drawback of this research is that participants self-reported their feelings and experiences during the study period, and self-reporting is often unreliable.  On the plus side, the researchers ran the study for two full weeks instead of just a few days, and had far more data to analyze as a result.

The research was published in the journal Motivation and Emotion.


Posted on December 19, 2013 .

Why Cheating Is Like A Drug

Every so often a news story comes out about a celebrity caught shoplifting. The standard response is “Why?” The reason isn’t lack of money, and it’s certainly not that getting arrested is good for the celeb's career, so what would make an A-lister take the chance?

New research suggests that, for some people, stealing or cheating has much in common with doing a line of cocaine – it’s all about the buzz.  Psychologists call it the "Cheater’s High."

Researchers from the Foster School of Business at the University of Washington conducted three experiments to test the theory.  The first used a cash reward as the carrot for solving word puzzles. The researchers set up the experiment in such a way that participants had a chance to illicitly get a look at the correct answers, with the expectation that many of them would use the answers to cheat on the test. As predicted, more than 40% of the participants cheated. After the test, participants were asked to report on their emotions. Researchers found that the cheaters consistently reported a bigger boost in positive emotion (such as a sense of “self-satisfaction”) compared to those who didn’t cheat.

In a follow-up study, the research team removed the financial-reward factor (which by itself could spark positive emotions) and asked a different group of participants to solve a series of math problems on a computer.  Once again, the test was set up so that participants could—if they chose—get a peek at the answers. This time almost 70% of participants cheated, and once again they reported higher levels of positive emotion than the non-cheaters, despite not winning any money.

In the final study, the research team used Amazon’s Mechanical Turk survey site to recruit 205 people online and offered them a chance to win cash for solving word puzzles. The researchers sent a portion of the participants a message that they were on the “honor system” when reporting their answers because the researchers wouldn’t be able to tell if they were cheating (in truth, they actually could tell). The purpose of the message was to remove the possibility that cheaters weren’t aware that they were cheating, or that they might “play dumb” about having cheated. The message also implied that if the participants chose to cheat, they were in effect stealing the money.

The results in this case were even more significant: not only did the cheaters report more positive emotion than non-cheaters, but the cheaters who received the warning message reported even greater self-satisfaction than cheaters who didn’t get the message.

The research team’s takeaway from all three experiments is that the cheater’s high is sparked by the thrill of getting away with it. The final experiment showed this most clearly, because the plain truth that participants were knowingly cheating actually increased their “high.”

Since this study only focused on cheating and stealing, it's not clear that the same dynamic plays out in cases where someone directly harms another person, which would of course be hard to test for obvious reasons.

The research was published in the Journal of Personality and Social Psychology.


Posted on December 4, 2013 .

Study: Making Direct Eye Contact Is Not An Effective Way To Persuade

Few popular beliefs are as unshakable as, “If you want to influence someone, always make direct eye contact.” But new research suggests that this bit of sturdy pop lore is hardly gospel – in fact, in many circumstances a direct gaze may result in the exact opposite effect.

Researchers from Harvard, the University of British Columbia and the University of Freiburg used newly developed eye-tracking technology to test the claim in two experiments. In the first, they had study participants watch a speaker on video while their eye movements were tracked, and then asked how persuaded they were by the speaker. Researchers found that the more time participants spent looking into the speaker’s eyes, the less persuaded they were by the speaker's argument. The only time looking into the speaker’s eyes correlated with being influenced was when the participants already agreed with the speaker’s opinions.

So the first takeaway is: when a speaker voices an opinion contrary to the audience’s, eye contact has the exact opposite of the intended effect.

In a second experiment, some participants were told to look into the speaker’s eyes and others were told to watch the speaker’s mouth. Once again, participants who looked into the speaker's eyes were less receptive to his opposing arguments, and also said they were less inclined to interact with advocates of the speaker’s argument.

Which leaves us with another takeaway contrary to the popular belief: if your audience is already skeptical of your arguments, making eye contact will not only reinforce their skepticism, but also make them less likely to interact with others who share your views.

According to Julia Minson of the Harvard Kennedy School of Government, co-lead researcher of the studies, “The findings highlight the fact that eye contact can signal very different kinds of messages depending on the situation. While eye contact may be a sign of connection or trust in friendly situations, it's more likely to be associated with dominance or intimidation in adversarial situations.”

Her advice to everyone from parents to politicians: “It might be helpful to keep in mind that trying to maintain eye contact may backfire if you're trying to convince someone who has a different set of beliefs than you.”

In the next round of research, the team is going to investigate whether eye contact in certain situations correlates with patterns of brain activity associated with responding to a threat, and an increase in stress hormones and heart rate.

There’s a corollary to these findings throughout the animal world, one that anyone who deals with animals, from dogs to gorillas, already knows: looking directly into a potentially aggressive animal’s eyes is not a good idea. The gesture is taken as a threat and might draw an attack.

Quoting another of the researchers, Frances Chen, “Eye contact is so primal that we think it probably goes along with a whole suite of subconscious physiological changes.”

The study was published in the journal Psychological Science.


Posted on December 1, 2013 .

How Exercise Makes Your Brain Grow


Research into “neurogenesis”—the ability of certain brain areas to grow new brain cells—has recently taken an exciting turn. Not only has research discovered that we can foster new brain cell growth through exercise, but it may eventually be possible to “bottle” that benefit in prescription medication.

The hippocampus, a brain area closely linked to learning and memory, is especially receptive to new neuron growth in response to endurance exercise. Exactly how and why this happens wasn’t well understood until recently. Research has discovered that exercise stimulates the production of a protein called FNDC5 that is released into the bloodstream while we’re breaking a sweat. Over time, FNDC5 stimulates the production of another protein in the brain called Brain Derived Neurotrophic Factor (BDNF), which in turn stimulates the growth of new nerves and synapses – the connection points between nerves – and also supports the survival of existing brain cells.

What this boils down to in practice is that regular endurance exercise, like jogging, strengthens and grows your brain. In particular, your memory and ability to learn get a boost from hitting the pavement.  Along with the other well-established benefits of endurance exercise, such as improved heart health, this is a pretty good reason to get moving. If jogging isn’t your thing, there’s a multitude of other ways to trigger the endurance effect – even brisk walking on a regular basis yields brain benefits.

Now researchers from the Dana-Farber Cancer Institute at Harvard Medical School (HMS) have also discovered that it may be possible to capture these benefits in a pill.  The same protein that stimulates brain growth via exercise could potentially be bottled and given to patients experiencing cognitive decline, including those in the beginning stages of Alzheimer’s and Parkinson’s.

"What is exciting is that a natural substance can be given in the bloodstream that can mimic some of the effects of endurance exercise on the brain," said Bruce Spiegelman, PhD, of Dana-Farber and HMS and co-senior author of the research report with Michael E. Greenberg, PhD, chair of neurobiology at HMS.

In the new study, the research team artificially increased BDNF in the brains of mice by using a harmless virus to piggyback FNDC5 molecules through the bloodstream of the mice. After seven days, researchers found a significant increase in BDNF in the hippocampus of the mice's brains – the area crucial for memory and learning.

"Perhaps the most exciting result overall is that peripheral delivery of FNDC5 with adenoviral vectors (i.e. a virus) is sufficient to induce central expression of BDNF and other genes with potential neuroprotective functions or those involved in learning and memory," the authors said.

The research team cautions that since this is an animal study, it’s far too early to conclude that the same effect will work in humans, but the significant results of this study show promise for future research into delivering cognitive benefits to the human brain via a similar mechanism. A cognitive boost for sufferers of Alzheimer’s, Parkinson’s and other debilitating diseases in the form of a brain-growth pill may not be too far off.

More immediately, neurogenesis research has provided yet another great reason to get up, get out and get moving.

The research report was published in the journal Cell Metabolism.

You can find David DiSalvo on Twitter @neuronarrative.

Posted on November 14, 2013 .

How Neuroscience Could Make Your Resistance Futile


Comply. That’s an uneasy watchword at the very center of social cohesion. Without enough social norm compliance—such as the norm that stresses fairness in our dealings with others—humans aren’t great at getting along. The question is, what’s at the heart of our willingness to comply with social norms?  Are our brains pre-packaged with compliance wiring? Or do we bend to the dictates of fairness and equal treatment only because our laws press us into compliance? Or is it some of both?

Neuroscientists are quite interested in these questions, and they’ve even made some progress answering them. Studies using functional magnetic resonance imaging (fMRI) have identified brain areas that appear to be involved in our decisions about when and why we treat others fairly or unfairly. These studies have shown, for example, that a region in the right hemisphere of the brain called the right lateral prefrontal cortex (rLPFC) is activated when people comply with social norms (or "rules"), suggesting that the rLPFC is an important part of a neural network that could be considered our brain’s social-norm wiring. But as with all fMRI results, brain activity does not conclusively prove a causal relationship between a given brain area and a given behavior—the results can only suggest it.

A new study from researchers at the University of Zurich took all of this a big step forward by using a painless and harmless electrical charge to positively or negatively stimulate the rLPFC (a technique called “transcranial direct current stimulation”) while study participants took part in a computerized fairness game.

The game works like this: participants are given an amount of money and told to share it with a randomly assigned partner. In one game scenario, they are allowed to make the decision of how much money to give away without the threat of a penalty for being unfair.  In another scenario, they are told they can still make the decision, but their partner will be able to penalize them if they act unfairly.
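The mechanics of the game can be sketched in a few lines of code. This is a hypothetical simplification (the function name and the all-or-nothing penalty rule are my assumptions for illustration, not details from the study, which only says partners could penalize unfair offers), but it captures why the penalty condition changes the proposer's incentives:

```python
def play_round(endowment, share_pct, penalty_allowed):
    """One round of the fairness game: a proposer offers `share_pct`
    percent of `endowment` to a partner and keeps the rest.

    Assumed simplification: if penalties are allowed and the offer is
    unfair (under half), the partner rejects it and both get nothing.
    """
    offer = endowment * share_pct // 100
    if penalty_allowed and share_pct < 50:
        # Partner punishes the unfair split: both walk away empty-handed.
        return 0, 0
    return endowment - offer, offer

# Without a penalty, a selfish 15% offer keeps most of the money.
print(play_round(100, 15, penalty_allowed=False))  # (85, 15)

# With a penalty threat, the same stingy offer risks total loss --
# which is why offers in the study rose to the 40-50% range.
print(play_round(100, 15, penalty_allowed=True))   # (0, 0)
```

The deterministic rejection rule here stands in for what is really a partner's choice; the point is only that the threat, not fairness itself, changes the payoff structure.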

In the first phase of the study, participants played the fairness game without experiencing the electrical charge. The social norm of fairness dictates that people give away an equal or near-equal portion of the money, but without the threat of a penalty most participants gave away only 10-25% of their stash. With the threat of a penalty, the percentage increased to 40-50%.

Researchers then had the participants play the game again, but this time while experiencing a positive electrical charge designed to increase activity in the rLPFC.  Participants receiving the positive charge increased the amount of money they gave away by about 33%. When researchers switched to a negative charge (which decreased rLPFC activity), participants decreased the amount they gave away by about 22%.

But here’s the twist: these results only held true when a penalty was threatened. Without threat of a penalty, the positive and negative charges to the rLPFC actually had an opposite effect.  Researchers also checked to see if the electrical charges changed the participants’ expectation of how strong or weak the penalty would be, and found no change in threat expectation.

What this means is that stimulating the brain region didn’t make people fairer -- it made them more sensitive to threats of being punished if they didn’t act fairly.

The implications of this finding are potentially massive, and more than a little alarming. If we can biochemically alter activity in the rLPFC with a pill, just as these researchers did with an electrical current, then we’re looking toward a brave new pharmacological world that serves up a daily dose of compliance via threat sensitivity (assuming, of course, that there might be a market for such a drug). On a more positive note, the finding opens a door to treat people with damage to their rLPFC, who may be dangerously non-compliant with social norms.

However you choose to view the results, the research is significant because it bridges a chasm between seeing brain activity in relation to a behavior (in an fMRI brain scan) and changing behavior by manipulating brain activity. And while that's also a little frightening, it's a necessary step toward figuring out where ambiguous concepts like "social norm compliance" play out in the brain. This study is just a tiny taste of what's to come.

Posted on November 6, 2013 .

10 Things You Should Know About Goals

David DiSalvo's newest book, Brain Changer, is now available for pre-order on Amazon and Barnes and Noble. It will hit store shelves on November 12.

 

Setting and reaching goals is a mainstay topic in research across a range of disciplines, including psychology, neuroscience, marketing, and communications. Below is a survey of 10 recent findings about goals, chosen from these and other topic areas, that throw some light on the ups and downs of goal achievement.

1. Giving up a goal takes a psychological and physical toll.

First a word of caution – goal achievement is risky business. If setbacks start accumulating, and you begin doubting whether you can reach your goal, you’re on your way to what psychologists call an “action crisis.” This is the crucial point at which you experience an internal conflict about whether you should keep going or give up. Research has shown that experiencing an action crisis increases production of the stress hormone cortisol, which is your brain’s way of sounding a body-wide alarm in response to the internal conflict. The problem is, the extra cortisol doesn’t help your performance, and may contribute to giving up sooner. It also increases blood pressure, which takes a toll on your blood vessels.

2. Being more specific can help you reach your goal.

We like flexibility in our lives, but some recent research from consumer psychology suggests that being more specific and less flexible may be more effective for goal achievement. The premise is simple but not easily accepted: specific steps, accomplished in strict order, seem harder to do at first, but ultimately lead to greater goal achievement than an ambiguous plan. The problem is that more ambiguous, flexible plans seem much more appealing upfront.

3. Our brains may have an internal guidance system for reaching goals.

Research from neuroscience suggests that our brains use the neurotransmitter dopamine as an internal guidance system to reach goals. An animal study showed that the dopamine signal in the brain gets stronger as the goal gets closer. It’s sort of a “Marco Polo” effect that influences choices made to direct action toward a goal, and adjusts expectations about how close or far away the goal really is.

4. Your inner voice is a potent goal-achievement tool.

Reacting impulsively can thwart goal achievement, and research shows that your inner voice is an effective way to control impulses. A study suggests that simple things like telling yourself “Keep going, you can do it” while you’re exercising really does help keep you moving, and sidetracks the impulse to give up because the activity is getting harder.

5. Fist power could keep you from choking.

A study earlier this year showed that clenching your left (but not right) fist can prevent you from choking in high-pressure situations, such as those you might experience on your way to achieving a physical performance goal. The effect was studied across three experiments with athletes as test subjects, and the results were consistently significant. The researchers believe that left fist clenching primes the right hemisphere of the brain, aiding automatic skill performance (the opposite of conscious deliberation, which is thought to be controlled in the left hemisphere and actually contributes to choking).

6. Sharing your goals with friends improves your chances of reaching them.

More research from this year indicates that writing down your goals, sharing them with friends, and sending your friends regular updates about your progress can boost your chances of succeeding. The study showed that people who merely thought about their goals and how to reach them succeeded less than 50% of the time, while people who wrote their goals down and enlisted friends to help by sending regular progress reports succeeded closer to 75% of the time.

7. Overmotivation can undermine goal achievement.

Motivation is essential to goal achievement, but overmotivation can lead to exactly the opposite. When your brain is in a hyper state of arousal about wanting something, the neurotransmitter dopamine floods your brain’s reward circuits. Research shows that when this happens, your chances of failing increase no matter how hard you try. Mental focus and precision are deluged by the flood.  The trick seems to be to find the happy motivation balance that keeps you moving forward without tripping on your brain’s in-built foibles.
 

8. And so can fantasizing.

Even though it’s tempting, research suggests that fantasizing too much about your dream job or any other major goal can undermine success. It's all about expectations. Realistic thinking fosters more realistic expectations; fantasizing blows expectations out of proportion, obscuring vision of what must actually be done to reach a goal.

9. And so can overthinking.

Although an incredibly powerful organ, the brain can get in its own way (in many ways) – and, ironically, thinking too much is one of them.  A study indicated that there’s an interesting connection between memory and performance. Once the right skills for a given task are internalized (like the many parts of a perfect golf swing), thinking about them when trying to perform doesn’t help, it hurts.

10. Finally, try to stay optimistic.

While easier said than done, keeping an optimistic mindset appears to enable people to deal with stress more effectively -- a key to goal achievement. Looking on the bright side actually is good for you, and an effective way to help reach your goals.


Posted on October 26, 2013 .

The Big Stink About Anxiety: It Changes How Our Brains Process Odors

Anxiety causes a slew of unpleasant symptoms that all of us have experienced to greater or lesser degrees. Sweating, rapid heartbeat, churning stomach, and fear – these are just a few symptoms of an anxious mind. One lesser known symptom is that when we’re anxious, things don't smell quite right.

A new study explored this odd effect by focusing on the role of stress in rewiring the brain.  Two brain circuits that don’t typically “talk” to each other—one linked to our sense of smell and another linked to emotional processing—can become cross-wired when we experience stress-induced anxiety. The result is that stressful experiences transform normally neutral odors into bad ones.

Researchers first asked a group of subjects to rate several smells, all of them neutral, inoffensive odors. The subjects were then hooked up to a functional magnetic resonance imaging (fMRI) machine while they watched a series of disturbing images, like car crashes and graphic war scenes, accompanied by equally disturbing text messages.

After the fMRI, the subjects were exposed to the same set of smells and asked to rate them again. This time, the majority of subjects changed their rating of the smells from neutral to offensive.
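The before-and-after comparison amounts to a simple paired-rating analysis. A minimal sketch with invented numbers (the odor labels and rating values below are illustrative, not the study's data; ratings run from negative for offensive to positive for pleasant):

```python
# Same odors rated before and after anxiety induction (hypothetical values).
pre  = {"odor_a": 0.0, "odor_b": 0.1, "odor_c": -0.1}   # roughly neutral
post = {"odor_a": -0.6, "odor_b": -0.4, "odor_c": -0.7} # shifted negative

# Per-odor change in rating, then the average shift across odors.
shifts = {odor: post[odor] - pre[odor] for odor in pre}
mean_shift = sum(shifts.values()) / len(shifts)
print(mean_shift)  # negative => odors now rated as more unpleasant
```

A negative mean shift is the pattern the study reports: the same physical smells, rated lower once anxiety has been induced.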

"After anxiety induction, neutral smells become clearly negative," explains Wen Li, a professor of psychology at the University of Wisconsin-Madison Waisman Center, who led the study. "People experiencing an increase in anxiety show a decrease in the perceived pleasantness of odors. It becomes more negative as anxiety increases."

The fMRI brain scan—which allowed the researchers to watch what was happening in the subjects’ brains in real time—suggests that stress-induced anxiety from watching the disturbing images and reading the messages triggered a cross-wiring between the smell and emotion brain circuits.

"In typical odor processing, it is usually just the olfactory system that gets activated," says Li. "But when a person becomes anxious, the emotional system becomes part of the olfactory processing stream."

The researchers think that this effect accumulates over time.  The more anxiety we experience, the more the cross-wiring between these two brain circuits strengthens – resulting in more and more otherwise neutral smells turning into bad ones.  The vicious cycle triggered by this effect is that the smells themselves contribute to more anxiety.

According to Li, "We encounter anxiety and as a result we experience the world more negatively. The environment smells bad in the context of anxiety. It can become a vicious cycle, making one more susceptible to a clinical state of anxiety as the effects accumulate. It can potentially lead to a higher level of emotional disturbances with rising ambient sensory stress."

This isn’t the first study, by far, to examine the link between emotions and sense of smell. Journals are full of research explaining why, for example, we think of the holiday season when we smell pine cones, or remember family gatherings when we smell cookies baking.  But it is one of the first to explore the specific role of anxiety in causing a bridge between these brain circuits, and that understanding may help psychologists untangle the bundle of anxiety triggers in people diagnosed with anxiety disorders.

The study was published in the Journal of Neuroscience.


Posted on October 19, 2013 .