Archive for the ‘Health News’ Category

‘Dumb’ Neanderthals Likely Had a Smart Diet

01 October 2011 Time: 02:03 PM ET

Instead of Neanderthals being dim-witted hunters who only dined on big game, new findings suggest they had more balanced diets, with broad menus that may have included birds, fish and plants.

Neanderthals are currently our closest known extinct relatives, near enough to modern humans to interbreed, with Neanderthal DNA making up 1 percent to 4 percent of modern Eurasian genomes. A host of recent findings suggest they were not only close genetically, but may have shared many other traits with us, such as creating art.

Still, the term “Neanderthal” has long been synonymous with “stupid.”

“Since they went extinct, conventional wisdom says they were dumber than us,” said researcher Bruce Hardy, a paleoanthropologist at Kenyon College in Gambier, Ohio.

For instance, ample fossil evidence suggests Neanderthals hunted big game, deriving the vast majority of their diet from deer, mammoths and other large herbivores. While pursuing such prey undoubtedly took smarts, this fact led some researchers to suggest Neanderthals had little interest in, or even capacity for, dining on anything else. Although hints that Neanderthals supplemented their diet with birds, fish, shellfish and plants have popped up at certain sites, such finds are typically dismissed as unusual exceptions, Hardy said.

“It’s been said that Neanderthals weren’t capable of hunting birds — they moved too fast,” Hardy noted.

Now researchers find evidence that Neanderthals may indeed have dined on a broad menu of plant and animal foods at a cave in the Rhone Valley in France.

“We can now move away from this view of Neanderthals as dim-witted big game hunters,” Hardy told LiveScience.

The site, called Payre, was excavated by Marie-Hélène Moncel of the French Institute of Human Paleontology in Paris and her colleagues. Distinctly Neanderthal flint tools found there suggest it was used repeatedly by our extinct relatives between 125,000 and 250,000 years ago.

In addition to bones of deer, horses, cattle, rhinos and elephants, Hardy’s analysis of 182 stone artifacts found at the site turned up microscopic residues of fish scales, bird feathers and starchy plants. It remains uncertain exactly what those animals and plants were, although edible roots in the area included wild carrots and wild parsnips.

“It’s not surprising that they might have been able to exploit these resources, but it’s nice to have evidence,” Hardy said. “We’ve been reliant on an incomplete fossil record dominated by large animals, because those survive better over time. When we look more carefully and widely, we find that’s not the entire picture.”


Get Saturated: Four Reasons Saturated Fat is Healthy

Elizabeth Walling, Monday, January 04, 2010, NaturalNews.com

Today we are caught between two philosophies: one says saturated fat is killing us; the other says these fats are necessary for true vitality. There is a heated back-and-forth, a constant tug-of-war scenario, with society caught in the middle like a child caught between two parents in a nasty divorce.

But fortunately, we are adults who can take a step back from the madness and look at the facts – all of the facts – before coming to our own conclusions. The trouble is finding anything but propaganda regarding saturated fats. The corporate food industry and government agencies are quick to demonize these fats, but in traditional cultures saturated fat was revered and even coveted as a source of vital energy.

Mankind has survived for century upon century thanks to saturated fats, depending on these fats to nourish our bodies and support the health of future generations. Only in recent decades have we turned away from natural fats and toward manufactured vegetable oils. And only in recent decades have we seen an explosion of degenerative conditions such as heart disease, diabetes and cancer. Here are four important reasons saturated fat should have its place at the table:

#1 – Lower Your Risk for Heart Disease and Improve Your Cholesterol Profile

It’s the opposite of what the medical industry tells you, but it’s true: eating a diet rich in saturated fats protects your heart. Saturated fat reduces Lp(a), which is associated with increased risk for heart disease, and contributes to higher levels of HDL (good) cholesterol, which keeps your heart healthy.

#2 – Prevent Loss of Bone Density and Osteoporosis

We all know we need calcium for strong, healthy bones. We also need saturated fat to transport that calcium to our bones. This is why dairy products naturally contain both calcium and saturated fat. All those calcium supplements won’t do much good if saturated fat is lacking in our diet.

#3 – Strengthen Your Immunity and Prevent Illness

Saturated fats contain specialized fatty acids which are naturally antifungal, antimicrobial and antiviral. These important fatty acids include lauric acid, myristic acid and caprylic acid. A diet rich in these beneficial fats provides the body with what it needs to fight pathogenic substances.

#4 – Feed Your Brain, Your Nervous System and Your Hormones

Your brain is fat. And that’s a good thing. Your brain consists mainly of fat and cholesterol, and it needs saturated fat more than any other kind. Even the brain-friendly omega-3 fatty acids can’t be utilized without ample saturated fat. In addition, saturated fat facilitates nerve signals and hormone production. All of these systems rely on saturated fat to function, and to keep you healthy and ultimately alive.

It’s important to choose the right saturated fats, like real butter and virgin coconut oil. Avoid highly processed fats and especially hydrogenated oils, which have been proven to cause heart disease, cancer and a slew of other conditions.

The logic is hard to miss. We are told saturated fats cause heart disease, so we trade butter for vegetable oils: heart disease skyrockets. We are told saturated fats cause bone loss, so we drink low-fat milk: osteoporosis is widespread. We are told saturated fat isn’t good for our brains, so we stop eating traditional fats like coconut oil: depression, ADHD, dementia and autism are more prevalent than ever before. Are saturated fats really the villain here? In reality they are the victims of misinterpreted studies and a commercialized industry, neither of which has any interest in what our ancestors could have told them: saturated fat is the key to good health.

The color of controversy: Link between food dyes, childhood hyperactivity gets renewed attention

 

By Laura Beil

August 27th, 2011; Vol.180 #5 (p. 22)

 

When it comes to the safety of dyeing food, the one true shade is gray

 

Artificial colorings have been around for decades, and for just about as long, people have questioned whether tinted food is a good idea. In the 1800s, when merchants colored their products with outright poisons, critics had a pretty good case. Today’s safety questions, though, aren’t nearly so black and white — and neither are the answers.

 

Take the conclusions reached by a recent government inquiry: Depending on your point of view, an official food advisory panel either affirmed that food dyes were safe, questioned whether they were safe enough or offered a conclusion that somehow merged the two. It was a glass of cherry Kool-Aid half full or half empty.

 

About the only thing all sides agree on is that there would be no discussion if shoppers didn’t feast with their eyes. Left alone, margarine would be colorless, cola wouldn’t be dark, peas and pickles might not be so vibrantly green, and kids’ cereals would rarely end up with the neon hues of candy. But as the 1990s flop of Crystal Pepsi showed, consumers expect their food to look a certain way.

 

Some of the earliest attempts to dye food used substances such as chalk or copper — or lead, once a favorite for candy — that turned out to be clearly harmful. Most of the added colors in use today were originally extracted from coal tar but now are mostly derived from petroleum.

 

Overseeing the safety of artificial food color was one of the reasons the U.S. Food and Drug Administration was founded (with its current name, in 1930). And the issue of food dye safety has continued to attract government notice, sometimes in dramatic ways, such as the time investigators demanded to know why trick-or-treaters became ill in 1950 after eating Halloween candy dyed with orange No. 1.

 

The most recent government attention came in March, when an FDA advisory panel made up of scientists, consumers and industry representatives held a two-day hearing to try to determine whether food dyes cause hyperactivity in children. It is a debate that has gone on, in some incarnation, for more than 30 years. Though scientific attention has grown, the disagreement lingers, partly because the issue is complicated to study and partly because dyes, if harmful, probably affect only a subset of children who have some yet-undiscovered genetic sensitivity. Over the years, skeptics of any connection have seized on uncertainties and logistical flaws in the research that could have led to misleading results.

 

Still, many scientists say studies are strong enough to warrant some kind of government action. And some of them are now criticizing the FDA, saying that, in retrospect, questions about the hyperactivity-dye link were presented to the advisory panel in a way that meant inaction was almost a foregone conclusion.

 

“To me, the whole process was defective,” says Bernard Weiss, a psychologist in the Department of Environmental Medicine at the University of Rochester School of Medicine and Dentistry in New York who was invited to speak before the panel. The main question that committee members were assigned was whether “a causal relationship between consumption of certified color additives in food and hyperactivity in children in the general population has not been established” (a conclusion ultimately supported by 11 of 14 voting panel members).

 

Weiss calls that “a ridiculous question,” not only because of its tortured, negative wording, but also because even those concerned about food dyes acknowledge that the science has not shown a link to hyperactivity in all kids.

 

Untrue colors

 

Nine different artificial dyes are currently approved for use in the United States; many of these chemicals have been staples of the food industry for generations. While the FDA does not have data on consumption, it does keep track of how much dye of each type gets the OK for use in products; the amount per capita has increased fivefold since the 1950s. Dyes have never been without criticism — a “pure food” movement was well under way even by the late 1800s. But specific concern about hyperactivity and other neurological effects first arose in 1975, when Ben Feingold, former chief allergist at Kaiser Permanente Medical Center in San Francisco, hypothesized that food additives were contributing to hyperactivity. His book Why Your Child is Hyperactive drew largely on his own clinical observations.

 

In 1976, researchers published a study in the journal Pediatrics comparing a regular diet with a diet that eliminated artificial flavors and colors in 15 hyperactive children. After eating what has since become known as the “Kaiser Permanente elimination diet” or the “Feingold diet,” the children showed an improvement in symptoms such as difficulty paying attention.

 

Three decades of studies since then have accumulated evidence linking food dyes to an exacerbation of hyperactivity. But the controversy remains unsettled. Skeptics have a lot of ammunition, pointing out that findings often have been inconsistent and confusing. To set up a study of food dyes, researchers have to juggle a lot of variables at once — including how big a dose of dyes to give, which ones to give and the fine art of having parents and teachers document symptoms that aren’t easy to measure.

 

Other factors also complicate the research. Studies have used mixtures of dyes, making it difficult to tease out the possible effects of any individual color. Also, it may be that only an unknown subset of children are affected: In a scientific analysis, the children not affected might outnumber those who are, blunting the overall findings when data are lumped together.

 

Finally, evidence suggests that dyes may not be the lone culprit. Children who appear to be sensitive to dyes may also have neurological reactions to other ingredients, even naturally occurring components such as wheat and chocolate. In some studies, children were given the dyes in cookies; if the children react to wheat or milk as well, the “placebo” might not have been the placebo scientists thought.

 

In the end, the disagreement comes down to this: How much evidence is necessary to add product warnings about (or ban, as some consumer groups want) chemicals that offer no nutritional benefit and are consumed each day by millions of healthy children?

GOING NATURAL

Natural food dyes include betanin (derived from beetroot), compounds from the seeds of the achiote tree and curcumin (from turmeric).

 

Europe gets the blues

 

Food safety advocates believe a substantial suggestion of harm, even without proof, is enough to warrant action. So does the European Parliament, which in 2008 dictated that foods with certain dyes had to carry warnings that the chemicals “may have an adverse effect on activity and attention in children.” Neither the FDA nor American lawmakers have gone that far, saying that the levels of dye currently in foods are safe.

 

Most dyes have no set cap on the amount that can be used, just stipulations requiring manufacturers to use only enough to reach their desired color, and no more. “When the FDA established legal limits on dyes, they did not consider children,” says Laura Anderko, a researcher in public health at Georgetown University Medical Center in Washington, D.C. And it is not known, she says, what the lasting effects from constant exposure might be. “Kids, they have a long shelf life. If they are exposed at an early age — depending on those kinds of petrochemicals that are consumed — it could mean lifelong impacts,” she says.

 

The color industry says any link between food coloring and hyperactivity remains unproven. “We don’t see any strong compelling data at this point that there is a neurological effect,” says Sean Taylor, a chemist at Verto Solutions in Washington, D.C., and a representative of the International Association of Color Manufacturers. He notes that the dyes on the market today have been consumed in populations worldwide, without any apparent harm, for decades. In animal toxicity tests, Taylor says, most of the dyes in food are excreted, and the small amounts absorbed are broken down by the liver.

 

More than a dozen clinical studies have tried to investigate the relationship between food dyes and hyperactivity. In 2004, psychiatrists David Schab of Columbia University and Nhi-Ha Trinh of Harvard University published a meta-analysis of all 15 known double-blind placebo-controlled trials — those in which neither the researchers nor the participants knew who was getting the dyes. That study, in the Journal of Developmental & Behavioral Pediatrics, reported that the results “strongly suggest an association” between food dyes and hyperactivity, though the researchers included a long list of caveats.

 

Following the 2004 meta-analysis, the British Food Standards Agency (the equivalent of the U.S. FDA) commissioned large studies to further examine whether food dyes, along with a common food preservative, affected children’s behavior. Unlike most previous investigations, these new experiments included children from the general population who had no history of hyperactivity.

 

In those studies, researchers from the University of Southampton gave two groups of children (one of toddlers, one of school age) beverages containing either one of two mixes of food dyes plus the preservative sodium benzoate, or a placebo, and asked parents and educators to note any behavior changes. The older children also took a computerized test designed to measure attention.

 

The results, published in 2007 in the Lancet, “lend strong support for the case that food additives exacerbate hyperactive behaviors,” the researchers write. “Our results are consistent with those from previous studies and extend the findings to show significant effects in the general population.” The scientists recognized the potential political impact of their findings: “The implications of these results for the regulation of food additive use could be substantial.”

 

And in Europe, they were. While the European Food Safety Authority did not think the evidence was strong enough to prompt action, the European Parliament was convinced. Dyes are not banned outright, but warning labels alone have been enough to change the way many products are made. A strawberry sundae at McDonald’s in the United States gets a boost of crimson from red No. 40. In Great Britain, a McDonald’s strawberry sundae gets its red only from strawberries.

 

In 2008, the year warning labels took effect in Europe, the D.C.-based Center for Science in the Public Interest (the same food watchdogs known to denounce the nutritional wasteland of convenience foods and movie popcorn) petitioned the FDA to ban the dyes. A long list of scientists and researchers signed on to the center’s appeal. “Food manufacturers voluntarily could substitute safe natural colors or other ingredients (such as fruit or fruit juices) for dyes, but that’s unlikely to happen throughout the food supply without the level playing field provided by government regulation,” the document stated. “Accordingly, the Food and Drug Administration … should ban the use of dyes in all foods; until such action takes effect, the FDA should require a prominent warning notice on product labels.”

 

While no large trials have been published since 2007, the government took the Center for Science in the Public Interest petition seriously enough to hold the hearings in March, asking members of its Food Advisory Committee to decide whether the evidence establishes a link between food dyes and hyperactivity in children in the general population.

 

Even Michael Jacobson, executive director of the Center for Science in the Public Interest, says he would answer “no.” To him and others, it was not the right question to address. Better, he said, would have been to assess whether food dyes pose a danger to certain children, in the same way that allergens affect only susceptible people. Few products, no matter how dangerous, affect everyone in the population. “Even smoking does not affect everybody,” he says.

 

Metabolic black box

 

No one knows which children may be at risk, because the biology behind any potential neurological effect associated with hyperactivity isn’t clear. Taylor, the color industry chemist, says that animal studies find that the molecules do not easily get through intestinal cell walls, and most of the dye passes through the body without leaving the digestive system.

 

Laura Stevens, a nutrition researcher at Purdue University in Indiana, acknowledges that this is the case. “In animals, very little of it is absorbed,” she says. “It is excreted in the feces.” But that doesn’t necessarily negate the idea of any effects on the body, she says; effects could come through metabolites, or through indirect mechanisms.

 

As examples, she cites two studies by British researchers. In one, published in the Journal of Nutritional Medicine in 1990, the scientists investigated how the yellow dye tartrazine affected the zinc levels of 10 hyperactive boys, compared with 10 nonhyperactive peers. (Zinc is a mineral important for proper brain function.) The team found that zinc levels dropped in the blood and increased in the urine among the hyperactive kids after tartrazine consumption. Another study, published in the Journal of Nutritional and Environmental Medicine in 1997, found a similar drop in zinc levels, and an increase in hyperactivity, in some children consuming tartrazine.

 

Newer research suggests that dyes trigger the release of histamines, which are part of the body’s immune system. An experiment reported last year in the American Journal of Psychiatry suggested that differences in genes that control histamines might explain why some children are affected and others are not.

 

But studies are few. In truth, Stevens says, aside from extrapolations from animal studies, the metabolic fate of dyes in humans is a black box. She and her colleagues at Purdue are among those trying to look at food dye metabolism in humans. “If there’s any chance at all there’s a problem, this should be addressed,” she says.

 

Ultimately, the future of food dyes may not rest with scientists or government regulators, but with consumers, says Ron Wrolstad, an agricultural chemist at Oregon State University in Corvallis.

 

“A lot of times now, particularly with natural colorants, it will be a marketing decision rather than a regulatory ruling,” he says. The snack food giant Frito-Lay, for instance, has announced, and heavily publicized, a commitment to use fewer artificial dyes in its products. A company spokeswoman said in December that the move was in response to consumers wanting more snacks “made with real food ingredients.”

 

“My personal opinion is that the synthetics don’t cause you any harm, but I don’t think they do you any good,” Wrolstad says. While other researchers are looking for harmful effects of synthetic dyes, Wrolstad is looking for beneficial effects of natural, plant-derived colors. “A lot of these compounds have antioxidant properties,” he says.

 

But just as the idea of harm from synthetic colors isn’t universally accepted, neither is the suggestion of benefit from dyes extracted from plants. “I would feel a lot more comfortable if we had some data on those, too,” Weiss says.

 

In the meantime, dyes of all kinds will continue to dominate the grocery aisle unless shoppers demand otherwise. In the food business, the most influential color is green.

 

In the limelight

 

Though concern over a link to hyperactivity has prompted the latest attacks on food dyes, artificial colorings have been catching the public’s attention, for economic and health reasons alike, for more than a century.

 

1850s A Victorian-era domestic standby, Enquire Within Upon Everything, described how bread could be tested at home for the presence of alum, a metallic salt used to give the dietary staple a preferable, whiter color. As early as the Middle Ages, some bread makers were rumored to make very white bread on the cheap by adding chalk.

 

1890s One effort used by the dairy industry to prevent newly invented and relatively cheap margarine from undercutting the popularity of butter was the push for regulations that would tax or ban margarine with the yellow tint of butter. (Naturally, margarine is colorless.) Anticoloring laws were adopted in 30 states, and some legislatures went so far as to demand that margarine be dyed pink. Because of the restrictions, some margarine manufacturers sold yellow dye packets with their products, so consumers could color their own margarine at home.

 

1950s In 1950, children became ill after eating Halloween candy containing orange No. 1, which had been approved for use in food by the U.S. Food and Drug Administration. The reports led to a public outcry, and along with other concerns, led the FDA to re-evaluate the safety of food colorings. Several dyes were delisted, and the Color Additive Amendments of 1960 established the current regulatory protocol.

 

1970s In the 1970s, it was red No. 2’s turn to cause a stir. Russian studies had suggested that the dye caused rats to develop intestinal tumors and was toxic to the gonads and embryos. Though the tests were largely debunked, when combined with earlier studies showing breast tumors in female rats fed the dye, the findings were enough to lead to a public health scare. The FDA banned red No. 2, and many manufacturers removed red products regardless of whether they contained the dye. Mars didn’t bring back red M&Ms until the late ’80s.

 

1990s Natural food dyes have caused controversy too. The reddish cochineal extract and carmine came to the attention of the Center for Science in the Public Interest in 1998. The dyes, made from a type of female scale insect, had been used for hundreds of years, exempt from certification because they are natural. Recorded allergic reactions, as well as anecdotal reports of outrage among vegetarians and kosher-keeping Jewish people who were unknowingly consuming insect products, prompted demands for labeling. The FDA agreed to require manufacturers to list the dyes as ingredients on the product label, but consumers have to figure out for themselves that these ingredients come from insects.

 

Added color

 

The U.S. Food and Drug Administration currently certifies nine synthetically produced food dyes (three popular colorings are described below). Such dyes can transform colorless products, giving faded veggies a more vibrant hue and making children’s candies more fun.

 

Brilliant blue  Designated as blue No. 1 by the FDA, this dye is found in ice creams, ice pops, baked goods and a host of blue raspberry–flavored beverages. It shows up in ranch-flavored chips, prepared guacamole and mixed-berry applesauce. The dye was approved by the FDA in 1969.

 

Allura red Red No. 40 is found in strawberry-flavored drinks, ice creams and cream cheeses; some Nutri-Grain bars; licorice; and most other red sweets. It was approved by the FDA in 1971 and, in terms of consumption, is currently the most-used food dye.

 

Tartrazine  Yellow No. 5 is in products such as Mountain Dew, Peeps, Doritos and Cheez Doodles. It’s commonly found in relish, pickles, lemon-flavored seasonings and boxed macaroni and cheese. The dye was approved by the FDA in 1969.

Can cutting carbohydrates from your diet make you live longer?

By Jerome Burne

The Daily Mail

 

It’s an extraordinary claim. But scientists say you can extend your life AND stay fit throughout old age – just by a change of diet that switches on your youth gene.

 


 

For centuries man has dreamed of being immortal, fixated on tales of magic fountains that restore youth, the rejuvenating power of a vampire’s bite or asses’ milk.

 

More recently came claims that injections of monkey glands or hormone supplements would make us live longer.

 

But so far, what’s actually worked are medical advances such as vaccines and better living conditions. Over the past century these have boosted average life expectancy by far more than 50 per cent, from 50 to 88.
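
A quick arithmetic check of that figure, taking the article’s own numbers at face value:

$$\frac{88 - 50}{50} = 0.76 \approx 76\ \text{per cent},$$

which is indeed well over 50 per cent.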

 

The problem is that this longevity does not mean a healthier life. Indeed, thanks to chronic diseases such as diabetes and arthritis, we’re becoming like the Struldbruggs — the miserable characters in Gulliver’s Travels who were immortal, but still suffered from all the diseases of old age.

 

Gradually they lost their teeth, their hair, their sense of smell and taste. All their diseases got worse and their memory faded, so they had no idea who their friends and relations were. At funerals they wept because they couldn’t die.

 

But now a U.S. geneticist is thought to have discovered the secret to a long life, full of health and energy. And the answer might be as simple as cutting down on carbohydrates.

 

Professor Cynthia Kenyon, who many experts believe should win the Nobel Prize for her research into ageing, has discovered that the carbohydrates we eat — from bananas and potatoes to bread, pasta, biscuits and cakes — directly affect two key genes that govern youthfulness and longevity.

 

She made her remarkable breakthrough after studying roundworms, specifically C. elegans, a worm just a millimetre long that lives in soil in temperate climates all over the world.

 

By tweaking some of their genes she has been able to help these worms live up to six times longer than normal. ‘Not only that, but we also know how to make them stay healthy all that time as well,’ she told an audience at the Wellcome Collection in London earlier this month.

 

So, what do worms have to do with us?

 

A great deal, it seems. Professor Kenyon’s work has been successfully repeated in labs around the world — the genes she found controlling ageing in worms do the same thing in rats and mice, probably monkeys, and there are signs they are active in humans, too.

 

This work has revolutionised our understanding of ageing, explains Jeff Holly, professor of clinical sciences at Bristol University.

 

‘Ten years ago we thought ageing was probably the result of a slow decay, a sort of rusting,’ he says. ‘But Professor Kenyon has shown that it’s not about wear and tear, but instead it is controlled by genes. That opens the possibility of slowing it down with drugs.’

 

So how does a worm hold the key to human ageing?

 

At 18 days old the average roundworm is flabby, sluggish and wrinkled. Two days later it will probably be dead.

 


 

However, Professor Kenyon, based at the University of California, San Francisco, found that damping down the activity of just one of their genes had a dramatic effect.

 

‘Instead of dying at about 20 days, our first set of mutant worms carried on living to more than 40 days,’ she says.

 

‘And they weren’t sluggish and worn out — they behaved like youngsters. It was a real shock. In human terms it was the equivalent of talking to someone you thought was about 30 and finding they were actually 60.’

 

With more sophisticated genetic manipulation, she now has some worms that have lived for an astonishing 144 days. An increase of that proportion would allow humans to live to 450.
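
As a rough back-of-envelope check on that comparison (assuming the roughly 20-day normal worm lifespan mentioned above and a human baseline of about 62.5 years — both assumptions, not figures from the article):

$$\frac{144\ \text{days}}{20\ \text{days}} = 7.2, \qquad 7.2 \times 62.5\ \text{years} \approx 450\ \text{years}.$$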

 

Scientists already knew how to make laboratory animals live longer and healthier lives — you just cut back their calories to about three-quarters of their normal amount.

 

It’s not a practical solution for humans, because you feel cold and hungry all the time.

 

But what Professor Kenyon found out was why drastically reducing calories has such a remarkable effect.

 

She discovered that it changed the way two crucial genes behaved. It turned down the gene that controls insulin, which in turn switched on another gene, which acted like an elixir of life.

 

‘We jokingly called the first gene the Grim Reaper because when it’s switched on, the lifespan is fairly short,’ she explains.

 

The second ‘elixir’ gene seems to bring all the anti-ageing benefits — its proper name is DAF-16, but it was quickly nicknamed ‘Sweet Sixteen’ because it turned the worms into teenagers.

 

‘It sends out instructions to a whole range of repair and renovation genes,’ says Professor Kenyon.

 

‘Your supply of natural antioxidants goes up, damping down damaging free radicals.’

 

These are the compounds produced by our body and the environment, which are linked to a host of diseases from cancer to Alzheimer’s.

 

The Sweet Sixteen gene also ‘boosts compounds that make sure the skin and muscle-building proteins are working properly, the immune system becomes more active to fight infection and genes that are active in cancer get turned off,’ she adds.

 

Kenyon had stumbled on the genetic equivalent of Shangri-La, the fictional valley where people could live for years without really ageing.

 

Discovering the Grim Reaper gene has prompted the professor to dramatically alter her own diet, cutting right back on carbohydrates. That’s because carbs make your body produce more insulin (to mop up the extra blood sugar carbs produce); and more insulin means a more active Grim Reaper.

 

So the vital second gene, the ‘elixir’ one, won’t get turned on. To test this, last year she added a tiny amount of sugary glucose to the normal diet of some of her worms whose genes had been engineered so they were living much longer, healthier lives.

 

‘The effect was remarkable,’ she says. ‘The sugary glucose blocked the “youthful” genes and they lost most of the health gains.’

 

But was this just a special feature of the roundworm or did we all have it?

 

Following Kenyon’s lead, other researchers started looking for the Grim Reaper/Sweet Sixteen combination in other animals — and of course in humans. They found it.

 

One clue came from a small remote community of dwarfs living in southern Ecuador who are cancer-free. They are missing the part of the Grim Reaper gene that controls a hormone called insulin-like growth factor. The downside is that they grow to only 4ft tall, because the hormone is needed for growth.

 

But this missing bit of the Grim Reaper gene also means they don’t develop cancer and are less likely to suffer from heart disease or obesity.

 

Professor Jeff Holly, who specialises in insulin-like growth factor, confirms that it is linked to cancer of the prostate, breast and colon.

 

In fact, raised insulin levels, triggered by high carbohydrate consumption, could be what connects many of our big killers.

 

Research is at an early stage, but raised insulin triggers an increase in cholesterol production in the liver, makes the walls of blood vessels contract so blood pressure goes up, and stimulates the release of fats called triglycerides (linked to heart disease).

 

Professor Kenyon’s work is creating a wave of excitement among drug companies who’ve been researching molecules that will damp down the Grim Reaper and boost Sweet Sixteen, giving us the benefits of very low-calorie diets without the penalties. So far, none is very near being approved.

 

One way to reduce insulin levels is to exercise, which makes you more sensitive to the hormone, which in turn means you need less of it. Exercise also delivers another health benefit in a surprising way: it actually increases the level of damaging free radicals, which stimulates the body to produce more protective antioxidants.

 

So should we all be trying to cut back on carbs to reduce our insulin levels?

 

It is a suggestion that flies in the face of 30 years of health advice to have a lower fat intake and eat plenty of long-lasting complex carbohydrates to keep the body supplied with energy.

 

There is no denying the extraordinary breakthrough Kenyon’s work represents, and she ‘deserves the Nobel Prize for her findings about ageing’, says David Gems, deputy director of the Institute for Healthy Ageing at University College London.

 

However, he isn’t convinced we know enough for us all to start eating a low-carb diet.

 

‘The exact role of insulin in health and ageing is a promising and fascinating area,’ he says. ‘But I’m not sure the evidence for the benefit of cutting carbohydrates and keeping insulin levels down is strong enough yet.’

 

But Professor Kenyon herself doesn’t need convincing.

 

‘Carbohydrates, and especially refined ones like sugar, make you produce lots of extra insulin. I’ve been keeping my intake really low ever since I discovered this.

 

‘I’ve cut out all starch such as potatoes, noodles, rice, bread and pasta. Instead I have salads, but no sweet dressing, lots of olive oil and nuts, tons of green vegetables along with cheese, chicken and eggs.

 

‘I’ll have a hamburger without a bun and fish without batter or chips. I eat some fruit every day, but not too much and almost no processed food. I stay away from sweets, except 80 per cent chocolate.’

 

She is adamant it will be well worthwhile. ‘You could have two completely different careers if you could stay healthy to 90,’ she says. ‘How fascinating that would be.’

Acne, Mental Health, and Diet

2 August 2011

Emily Deans, M.D.

In 2008 some folks from a Beverly Hills skin clinic wrote up a short paper in Lipids in Health and Disease called Acne vulgaris, mental health and omega-3 fatty acids: a report of cases (free full text). The experiment itself was an open-label trial of a mineral/omega-3 supplement in five patients, so it is useful only as a way to get us thinking and to point toward further research. But the article notes a lot of interesting science tidbits on acne, omega-3s, and minerals, so it’s worth a peek.

Acne is a disease of civilization which, like depression, has increased over the last half-century, especially in women. As I discussed in my blog post Acne and Suicide, patients with acne are more likely to be depressed, angry, and suicidal. In fact, patients with acne struggle more with mental health issues than even patients with epilepsy or diabetes, according to a study comparing questionnaires between sufferers of acne and other general medical conditions.

Acne is accompanied by the overproduction of sebum, a waxy oil, in addition to inflammation, hormonal shifts, and infection. Inflammation is one of the earliest manifestations of the disease, particularly mediated by a leukotriene (a type of signaling molecule made from fat) called LTB4. This inflammatory chemical helps up-regulate sebum production, and you might be interested to know that the omega-6 fatty acid arachidonic acid is made into LTB4, while the omega-3 fatty acid EPA (from fish) inhibits the conversion of arachidonic acid to LTB4. I’ve reviewed how omega-3 and omega-6 fatty acids are important to mental health in my article Your Brain on Omega 3.

A study of 1000 teenagers in North Carolina showed a lower incidence of pustules, acne cysts, and oily skin in those teenagers consuming the most fish. Another study showed that patients with acne ate low amounts of seafood. In my own clinical experience, young adults with acne have experienced a reduction in severity when they begin to supplement with fish oil; it is not a complete cure and doesn’t seem to help everyone, but many see a very noticeable improvement. There is a prescription drug, zileuton, that inhibits LTB4 and improves acne, but it would seem a fish prescription might be more practical.

Patients with acne, being in a state of systemic inflammation, also seem to have lower serum amounts of several vitamins and minerals, specifically zinc, vitamins A and C, and selenium. Studies of all these supplements, administered topically, orally, or both, seemed to show some benefit. In addition, EGCG from green tea has been “suggested to be helpful in acne due to its well documented anti-inflammatory and antioxidant activity.”

Acne is also worse in people with poor control of blood glucose, and the supplement chromium is known to have some minor benefit in that area. There was one open-label trial of 400 mcg of chromium daily that seemed to help.

So in this tiny open-label experiment, five patients (three males and two females, aged 18-23) were given a daily supplement containing 1000 mg of EPA, 200 mg of EGCG, 15 mg of zinc gluconate, 200 mcg of selenium, and 200 mcg of chromium. They didn’t use any new topical treatments or change their diets in any way. The number of pimples and the amount of inflammation were scored with a standardized acne scale at the beginning and measured again at the end of two months. In addition, each patient took a before-and-after test measuring mental, emotional, and social well-being.

The results? Four of the five had improvement in the number of lesions, and all seemed to have a reduction in general skin inflammation. Sense of well-being improved 24 percent (with a range across the five patients of 20-27 percent). The authors thought this improvement might be due to the EPA, but since EPA seems less important in the brain than its sister fish oil, DHA, I’m prone to be skeptical. I wonder if the improvement might instead be due to the generalized reduction in inflammation.

All told, this little open-label trial doesn’t allow us to draw many conclusions. Without a control group and more data points, we can only tuck the information away as something to look at further. A healthy Paleolithic diet with organ meats and fish would provide the vitamins, minerals, and EPA (I’m not sure about the EGCG), especially in an active hunter-gatherer who would consume and burn more calories, and therefore take in more nutrients along the way.

Everyone benefits from an improvement in looks. One of the fastest ways to improve mental health, in my clinical experience, is to help someone successfully get into fat-burning mode and clear up the skin. But clinical experience and common scientific sense are one thing; real controlled trials are something else. Bring them on.

Increased muscle mass may lower risk of pre-diabetes

Study shows building muscle can lower person’s risk of insulin resistance

28 July 2011

Eureka Alert

A recent study accepted for publication in The Endocrine Society’s Journal of Clinical Endocrinology & Metabolism (JCEM) found that the greater an individual’s total muscle mass, the lower the person’s risk of having insulin resistance, the major precursor of type 2 diabetes.

With recent dramatic increases in obesity worldwide, the prevalence of diabetes, a major source of cardiovascular morbidity, is expected to accelerate. Insulin resistance, which can raise blood glucose levels above the normal range, is a major factor that contributes to the development of diabetes. Previous studies have shown that very low muscle mass is a risk factor for insulin resistance, but until now, no study has examined whether increasing muscle mass to average and above average levels, independent of obesity levels, would lead to improved blood glucose regulation.

“Our findings represent a departure from the usual focus of clinicians, and their patients, on just losing weight to improve metabolic health,” said the study’s senior author, Preethi Srikanthan, MD, of the University of California, Los Angeles (UCLA). “Instead, this research suggests a role for maintaining fitness and building muscle. This is a welcome message for many overweight patients who experience difficulty in achieving weight loss, as any effort to get moving and keep fit should be seen as laudable and contributing to metabolic change.”

In this study, researchers examined the association of skeletal muscle mass with insulin resistance and blood glucose metabolism disorders in a nationally representative sample of 13,644 individuals. Participants were older than 20 years, non-pregnant and weighed more than 35 kg. The study demonstrated that higher muscle mass (relative to body size) is associated with better insulin sensitivity and lower risk of pre- or overt diabetes.

“Our research shows that beyond monitoring changes in waist circumference or BMI, we should also be monitoring muscle mass,” Srikanthan concluded. “Further research is needed to determine the nature and duration of exercise interventions required to improve insulin sensitivity and glucose metabolism in at-risk individuals.”


Also working on the study was Arun Karlamangla, PhD, MD, of the David Geffen School of Medicine at UCLA.

The article, “Relative muscle mass is inversely associated with insulin resistance and pre-diabetes. Findings from The Third National Health and Nutrition Examination Survey,” appears in the September 2011 issue of JCEM.

Founded in 1916, The Endocrine Society is the world’s oldest, largest and most active organization devoted to research on hormones and the clinical practice of endocrinology. Today, The Endocrine Society’s membership consists of over 14,000 scientists, physicians, educators, nurses and students in more than 100 countries. Society members represent all basic, applied and clinical interests in endocrinology. The Endocrine Society is based in Chevy Chase, Maryland. To learn more about the Society and the field of endocrinology, visit our site at www.endo-society.org.

Omega-3 Fats Reduce Stress

29 July 2011

Pill Advised

Could omega-3 fats, the kind most often found in fish oil, help reduce stress?

A new study from Ohio State University sought to answer that question, by looking at how omega-3 fats could help decrease anxiety among university students.

Inflammation and Anxiety Reduced by Omega-3 Fats

The healthy young people in the study who consumed more fish oil showed a marked reduction both in inflammation and, surprisingly, in anxiety.

The findings suggest that if young participants can get such improvements from specific dietary supplements, then the elderly and people at high risk for certain diseases might benefit even more.


Omega-3 Study on Healthy Medical Students

The findings by a team of researchers at Ohio State University were just published in the journal Brain, Behavior and Immunity.

Omega-3 polyunsaturated fatty acids, such as eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA), have long been considered positive components of a healthy diet.

Earlier research suggested that these compounds might play a role in reducing the body’s level of cytokines, compounds that promote inflammation, and perhaps even in reducing depression.


Psychological stress has repeatedly been shown to increase cytokine production, so the researchers wondered if increasing omega-3 intake might mitigate that process and reduce inflammation.

Half of the medical students in the study received omega-3 supplements, while the other half were given placebo pills.

Omega-3 Fats from Fish Oil Used in Study

“The supplement was probably about four or five times the amount of fish oil you’d get from a daily serving of salmon, for example,” explained Martha Belury, professor of human nutrition and co-author of the study.

The psychological surveys clearly showed an important change in anxiety among the students: those receiving the omega-3 showed a 20 percent reduction in anxiety compared to the placebo group.

An analysis of the blood samples from the medical students showed similarly important results.

“We took measurements of the cytokines in the blood serum, as well as measured the productivity of cells that produced two important cytokines, interleukin-6 (IL-6) and tumor necrosis factor alpha (TNFa),” said Ron Glaser, professor of molecular virology, immunology & medical genetics and director of the Institute for Behavioral Medicine Research.


“We saw a 14 percent reduction in the amounts of IL-6 among the students receiving the omega-3.”  Since the cytokines foster inflammation, “anything we can do to reduce cytokines is a big plus in dealing with the overall health of people at risk for many diseases,” he said.

While inflammation is a natural immune response that helps the body heal, it also can play a harmful role in a host of diseases ranging from arthritis to heart disease to cancer.


Reference:

Brain, Behavior, and Immunity. Published Online July 19, 2011.

“Omega-3 Supplementation Lowers Inflammation and Anxiety in Medical Students: A Randomized Controlled Trial,” Janice K. Kiecolt-Glaser, Martha A. Belury, Rebecca Andridge, William B. Malarkey, and Ronald Glaser

Author Affiliations:

Institute for Behavioral Medicine Research, Ohio State University College of Medicine; Department of Psychiatry, Ohio State University College of Medicine; Department of Human Nutrition, Ohio State University; Division of Biostatistics, College of Public Health, Ohio State University; Department of Internal Medicine, Ohio State University College of Medicine; Department of Molecular Virology, Immunology, and Medical Genetics, Ohio State University College of Medicine

Source: Ohio State University Medical Center