Better access to quality cancer care may reduce rural and urban disparities

When rural and urban patients are enrolled in cancer clinical trials, the differences in their survival rates are significantly reduced, SWOG study results show.

The study results are published in JAMA Network Open by a team led by Joseph Unger, Ph.D., a SWOG biostatistician and health services researcher at Fred Hutchinson Cancer Research Center. It’s the first study to comprehensively compare survival outcomes in rural and urban cancer patients enrolled in clinical trials.

The results cast new light on decades of research that paints a stark picture of cancer disparities. About 19 percent of Americans live in rural areas, and studies have shown that, when faced with cancer, rural patients don’t live as long as urban patients. For example, statistics published by the federal Centers for Disease Control and Prevention in 2017 show a significant difference in cancer death rates between 2011 and 2015: 180 deaths per 100,000 people in rural areas compared with 158 per 100,000 in urban areas.

But the new analysis by SWOG, the international cancer clinical trials network funded by the National Cancer Institute (NCI), indicates that this difference in survival is not due to patients—but to the care they receive.

“These findings were a surprise, since we thought we might find the same disparities others had found,” Unger said. “But clinical trials are a key difference here. In trials, patients are uniformly assessed, treated, and followed under a strict, guideline-driven protocol. This suggests that giving people with cancer access to uniform treatment strategies could help resolve the disparities in outcomes that we see between rural and urban patients.”

Unger and SWOG member Dr. Banu Symington, an oncologist who practices at the Sweetwater Regional Cancer Center in rural Wyoming, received a grant from SWOG’s public charity, The Hope Foundation, to study cancer disparities by analyzing existing data from the group’s trials. The team had a big trove of data to mine. Founded in 1956, SWOG has run more than 1,400 cancer clinical trials enrolling nearly 215,000 patients.

Unger and his team identified 36,995 patients who enrolled in 44 SWOG phase II or III treatment trials between 1986 and 2012. Patients hailed from all 50 states, and had 17 different cancer types, including acute myeloid leukemia, sarcoma, lymphoma, myeloma, and brain, breast, colorectal, lung, ovarian, and prostate cancers. The team limited their analysis of survival to the first five years after trial enrollment to emphasize outcomes related to cancer and its treatment.

Using U.S. Department of Agriculture population classifications known as Rural-Urban Continuum Codes, the team categorized the patients as either rural or urban and analyzed their outcomes. Patient outcomes included overall survival, or how long patients lived; progression-free survival, or how long patients lived before their cancer returned; and cancer-specific survival, or how long the patients lived without dying of cancer. The team used a statistical model known as a multivariate Cox regression to analyze their data.

This method allows investigators to examine the relationship between survival and one or more predictor variables, such as a patient’s age or the stage of their cancer.
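
The press release doesn’t include the analysis code, but a minimal sketch of this kind of model, written here in Python with the lifelines library and simulated data standing in for the trial records, may make the method concrete. The column names and values below are hypothetical.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Simulated stand-in for trial records (the real study used 36,995 patients).
rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "rural": rng.integers(0, 2, n),          # 1 = rural per RUCC, 0 = urban
    "age":   rng.normal(62, 10, n).round(),
    "stage": rng.integers(1, 5, n),
})
# Follow-up capped at 60 months, mirroring the study's five-year window.
df["months"] = np.minimum(rng.exponential(40, n), 60).round(1)
df["died"] = (df["months"] < 60).astype(int)  # 0 = censored at five years

# Multivariate Cox proportional-hazards model: the coefficient on 'rural'
# becomes a hazard ratio after adjusting for the other predictors.
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="died")
cph.print_summary()  # a 'rural' hazard ratio near 1.0 means no survival gap
```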

No matter the variable, or the cancer type, results were clear. There was no meaningful difference in survival patterns between rural and urban patients for almost all of the 17 different cancer types. The only exception was patients with estrogen receptor-negative, progesterone receptor-negative breast cancer. Rural patients with this cancer didn’t live as long as their urban counterparts, a finding the team says could be attributed to a few factors, including timely access to follow-up chemotherapy after their first round of cancer treatment.

“If people diagnosed with cancer, regardless of where they live, receive similar care and have similar outcomes, then a reasonable inference is that the best way to improve outcomes for rural patients is to improve their access to quality care,” Unger said.

Unger noted that the NCI Community Oncology Research Program (NCORP), which funded his study, brings clinical trials into community hospitals and clinics, including in rural areas, and represents the kind of community-level outreach that can provide the quality cancer care that may be needed. In 2014, NCI officials broadened NCORP eligibility to include oncology practices that serve large rural populations. Currently, there are NCORP sites in 13 states in which the rural population exceeds 30 percent: Alaska, Arkansas, Iowa, Kentucky, Mississippi, Montana, North Carolina, North Dakota, South Carolina, South Dakota, Tennessee, Wisconsin, and Wyoming. The result is that tens of thousands of rural cancer patients can enroll in NCI clinical trials and be cared for at their local hospitals and clinics.


Men and women show surprising differences in seeing motion

Researchers reporting in the journal Current Biology on August 16 have found an unexpected difference between men and women. On average, their studies show, men pick up on visual motion significantly faster than women do.

Both men and women are good at reporting whether black and white bars on a screen are moving to the left or to the right, requiring only a tenth of a second, and often much less, to make the right call, the researchers found. But, in comparison to men, women regularly took about 25 to 75 percent longer.

The researchers say that the faster perception of motion by males may not necessarily reflect better visual processing. They note that similar performance enhancements in this same task have been observed in individuals diagnosed with autism spectrum disorder (ASD) or depression and in older individuals. The authors speculate that processes in the brain that down-regulate neural activity are disrupted in these conditions and may also be weaker in males.

“We were very surprised,” says Scott Murray at the University of Washington, Seattle. “There is very little evidence for sex differences in low-level visual processing, especially differences as large as those we found in our study.”

Murray and co-author Duje Tadin, University of Rochester, say that the finding was “entirely serendipitous.” They were using the visual motion task to study processing differences in individuals with ASD. ASD shows a large sex bias, with boys being about four times more likely to be diagnosed with the condition than girls. As a result, the researchers included sex as a factor in their analysis of control individuals in the study who didn’t have ASD. The sex difference in visual perception of motion became immediately apparent.

To confirm the findings, the researchers asked other investigators who had used the same task in their own experiments for additional data representing larger numbers of study participants. And those independent data showed the same pattern of sex difference.

Murray, Tadin, and colleagues report that the observed sex difference in visual perception can’t be explained by general differences in the speed of visual processing, overall visual discrimination abilities, or potential motor-related differences. The differences aren’t apparent in functional MRI images of the brain either.

Overall, they write, the results show how sex differences can manifest unexpectedly. They also highlight the importance of including sex as a factor in the design and analysis of perceptual and cognitive studies.

The researchers say the findings provide evidence that visual processing differs in males and females in ways that hadn’t been recognized. They also offer a new window into differences in the neural mechanisms that process visual information, Tadin says.

In further studies, the researchers hope to discover the underlying differences in the brain that may explain the discrepancy between men and women. So far, brain images of the key motion-processing areas haven’t offered up any clues, suggesting that the difference may originate in other portions of the brain or may be difficult to measure using current techniques. Ultimately, they say, this path of study might even yield new clues for understanding a vexing question: why ASD is more common in males.


New algorithm could improve diagnosis of rare diseases

Today, diagnosing rare genetic diseases requires a slow process of educated guesswork. Gill Bejerano, Ph.D., associate professor of developmental biology and of computer science at Stanford, is working to speed it up.

In a paper published July 12 in Genetics in Medicine, Bejerano and his colleagues describe an algorithm they’ve developed that automates the most labor-intensive part of genetic diagnosis: that of matching a patient’s genetic sequence and symptoms to a disease described in the scientific literature. Without computer help, this match-up process takes 20-40 hours per patient: The expert looks at a list of around 100 of the patient’s suspicious-looking mutations, makes an educated guess about which one might cause disease, checks the scientific literature, then moves on to the next one.

The algorithm developed by Bejerano’s team cuts the time needed by 90 percent.

“Clinicians’ time is expensive; computer time is cheap,” said Bejerano, who worked with experts in computer science and pediatrics to develop the new technique. “If I’m a busy clinician, before I even open a patient’s case, the computer needs to have done all it can to make my life easier.”

A Phrank approach

The algorithm’s name, Phrank—a mashup of “phenotype” and “rank”—hints at how it works: Phrank compares a patient’s symptoms and gene data to a knowledge base of medical literature, generating a ranked list of which rare genetic diseases are most likely to be responsible for the symptoms. The clinician has a logical starting point for making a diagnosis, which can be confirmed with one to four hours of effort per case instead of 20-40 hours.

The mathematical workings of Phrank aren’t tied to a specific database, a first for this type of algorithm. This makes it much more flexible to use.
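
As a rough illustration of the idea, the toy sketch below scores diseases by the information content of the phenotypes they share with a patient, so that rarer, more informative phenotypes count for more. The annotations are invented, and the real algorithm scores shared terms over the Human Phenotype Ontology’s graph structure rather than flat sets.

```python
import math
from collections import defaultdict

# Invented disease-to-phenotype annotations (illustration only).
disease_phenotypes = {
    "disease_A": {"seizures", "hypotonia", "developmental_delay"},
    "disease_B": {"seizures", "short_stature"},
    "disease_C": {"hypotonia", "cataract"},
}

# Information content: a phenotype seen in fewer diseases is more informative.
n = len(disease_phenotypes)
counts = defaultdict(int)
for terms in disease_phenotypes.values():
    for term in terms:
        counts[term] += 1
info = {term: -math.log2(c / n) for term, c in counts.items()}

def match_score(patient_terms, disease_terms):
    """Sum the information content of phenotypes present in both sets."""
    return sum(info[t] for t in patient_terms & disease_terms)

patient = {"seizures", "hypotonia"}
ranked = sorted(disease_phenotypes,
                key=lambda d: match_score(patient, disease_phenotypes[d]),
                reverse=True)
print(ranked)  # best-matching candidate disease first
```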

Phrank also dramatically outperforms earlier algorithms that have tried to do the same thing, according to the paper. Bejerano’s team validated Phrank on medical and genetic data from 169 patients, an important advance over earlier studies in the field. Prior studies had tested algorithms on made-up patients instead because real-patient data for this research is hard to come by.

“The problem is that this test [using synthetic patients] is just too easy,” Bejerano said. “Real patients don’t look exactly like a textbook description.” On data from real patients, one older algorithm ranked the patient’s true diagnosis 33rd, on average, on the list of potential diagnoses it generated; Phrank, on average, ranked the true diagnosis fourth.

Phrank also holds potential for helping doctors identify new genetic diseases, Bejerano said. For example, if a patient’s symptoms can’t be matched to any known human diseases, the algorithm could check for clues in a broader knowledge base. “You might get the result that mouse experiments cause phenotypes similar to your patient, that you may have found the first human patient that suffers from this disease,” Bejerano said.


Forget the bling: High status-signaling deters new friendships

When it comes to making new friends, status symbols actually repel potential friends rather than attract them, according to new research published in the journal Social Psychological and Personality Science.

“Often times we think that status symbols — whether a luxury car like a BMW, a brand name purse like Prada, or an expensive watch like Rolex — will make us look more socially attractive to others,” says Stephen Garcia (University of Michigan). “However, our research suggests that these status signals actually make us look less socially attractive, not more.”

The scientists conducted a series of six studies in which participants either presented themselves as potential friends or evaluated others as potential friends. Throughout the studies, people presenting themselves to a new group chose higher-status items, yet the people asked whom they would want as friends preferred those displaying lower-status or neutral symbols.

To control for the possibility that the quality of a luxury good itself might drive people’s reactions, the researchers conducted a study in which they asked participants which of two plain t-shirts they would wear to a picnic where they hoped to make new friends. One t-shirt had “Walmart” written on it in plain script, and the other had “Saks Fifth Avenue” written in the same plain script.

While the shirts were not luxury items, 76% of the participants who presented themselves as new friends chose to wear the “Saks Fifth Avenue” t-shirt, whereas 64% of the would-be friends chose the person wearing the “Walmart” t-shirt.

The results appear to be consistent across socioeconomic groups. The only difference is that what is considered high status depends on one’s socioeconomic status.

“At a societal level, we may be wasting billions of dollars on expensive status symbols that ultimately keep others from wanting to associate with us,” says Kimberlee Weaver Livnat (University of Haifa). “And to the extent that close friendships are important to well-being, we may be inadvertently hurting ourselves.”

One next step is to delve into the mechanism behind why presenters make this error, say the authors. Is it that people often fail to take the perspective of others who are evaluating them as potential friends? Or do they accurately understand the perspective of the potential friends but, for some reason, choose status symbols when presenting themselves anyway?

Does this mean that status symbols are always bad? “No, not necessarily” says Patricia Chen (National University of Singapore). “Our findings right now only apply to the formation of new friendships. Status symbols may very well be beneficial at other times and in other settings, such as when trying to establish new business contacts.” Their last study in the paper finds that signaling high status symbols can, in fact, be helpful in attracting potential contacts, although not remarkably more than neutral status symbols.


Eating a serving of button mushrooms a day could prevent diabetes


  • Eating mushrooms increases levels of the bacteria Prevotella in mice 
  • Prevotella produces the acids propionate and succinate
  • These play a role in the expression of genes that manage glucose production  
  • Researchers hope to repeat the experiment in obese mice, as well as humans
  • They claim any change to the diet has an impact on a person’s gut bacteria 

Eating a serving of button mushrooms a day could prevent type 2 diabetes, new research suggests.

Feeding mice a daily portion of white button mushrooms boosts their levels of a gut bacterium involved in glucose production, a study found.

According to the researchers, mushrooms act as prebiotics: indigestible food ingredients that ‘fertilise’ the growth of bugs in the digestive tract.

Study author Professor Margherita Cantorna, from Pennsylvania State University, said: ‘Managing glucose better has implications for diabetes, as well as other metabolic diseases.’ 

Type 1 diabetes occurs when glucose levels rise due to a lack of the hormone insulin, which controls glucose’s movement in and out of cells. In type 2 diabetes, insulin levels are insufficient or the body does not respond to the hormone.

Around 3.2 million people in the UK are diagnosed with diabetes, of whom 10 per cent have type 1. Type 1 diabetes affects approximately 1.25 million people in the US, while 9.4 per cent of the population have type 2.




How the research was carried out  

The researchers analysed two types of lean mice: one group had gut bacteria, while the other did not and was ‘germ free’.

Professor Cantorna explained: ‘You can compare the mice with the microbiota with the germ-free mice to get an idea of the contributions of the microbiota.’

All of the rodents were fed a daily serving of mushrooms, equivalent to around an 85g portion for humans. It is unclear how long the animals were fed the edible fungi.

‘Any change you make to the diet, changes the microbiota’ 

Results, published in the Journal of Functional Foods, suggest eating mushrooms increases levels of the bacteria Prevotella in mice.

Prevotella produces the acids propionate and succinate, which play a role in the expression of genes that manage glucose production. 

The researchers hope to repeat the experiment in obese mice, as well as humans.  

Mushrooms aside, Professor Cantorna added: ‘It’s pretty clear that almost any change you make to the diet, changes the microbiota.’

WHY IS IT IMPORTANT FOR DIABETES PATIENTS TO MEASURE THEIR GLUCOSE LEVELS?

Diabetes is a serious life-long condition that occurs when the amount of sugar in the blood is too high because the body can’t use it properly.

Patients have to regularly monitor their glucose levels to prevent them from developing any potentially fatal complications.

Type 1 diabetes patients are often recommended to test their blood sugar at least four times a day. For type 2 patients, doctors advise testing twice a day.

Blood glucose levels should be between 3.5 and 5.5mmol/L before meals, and less than 8mmol/L two hours after meals.



Hypoglycemia (when blood sugar drops below 4mmol/L) can, in severe cases, lead to patients falling into comas.

However, it can most often be treated by eating or drinking 15-20g of fast-acting carbohydrate, such as 200ml of Lucozade Energy Original.

Sufferers can tell they are experiencing a hypo when they suddenly feel tired, have difficulty concentrating or feel dizzy.

Type 1 diabetes patients are more likely to experience a hypo, because of the medications they take, including insulin.

Hyperglycemia (when blood sugar is above 11.0 mmol/L two hours after a meal) can also have life-threatening complications.

It happens when the body either has too little insulin, seen in type 1, or it can’t use its supply properly, most often in type 2.

In the short term, it can lead to conditions including ketoacidosis, in which ketones build up in the body.

If left untreated, hyperglycemia can lead to long-term complications, such as impotence and amputations of limbs.

Regular exercise can help to lower blood sugar levels over time, and following a healthy diet and proper meal planning can also avoid dangerous spikes.  
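
Taken together, the thresholds quoted in this box amount to a small set of rules. The sketch below encodes them in Python purely as an illustration; the function and its labels are ours, not a clinical tool.

```python
def classify_reading(mmol_per_l: float, two_hours_after_meal: bool) -> str:
    """Classify a blood glucose reading using the thresholds quoted above."""
    if mmol_per_l < 4.0:
        return "hypoglycemia"                 # below 4mmol/L
    if two_hours_after_meal:
        if mmol_per_l > 11.0:
            return "hyperglycemia"            # above 11.0mmol/L after a meal
        return "in range" if mmol_per_l < 8.0 else "above post-meal target"
    # Before meals the quoted target range is 3.5-5.5mmol/L.
    return "in range" if mmol_per_l <= 5.5 else "above pre-meal target"

print(classify_reading(3.2, False))  # hypoglycemia
print(classify_reading(9.5, True))   # above post-meal target
```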

Scientists create an insulin pill with ‘remarkable’ results 

This comes after research released last June suggested scientists had created an insulin pill that could signal the end of injections for diabetics.

Unlike previous failed attempts to make oral diabetes medications, the pill survives the acidic environment of the stomach to release insulin into the bloodstream, according to the scientists.

After rats were given the unnamed pill, their blood-glucose levels fell by 38 per cent in two hours and 45 per cent after 10 hours, compared to a 49 per cent decrease in 60 minutes among those given insulin injections, a study found.

The researchers believe the drug may overcome the pain and needle phobias some diabetics experience.

Dr Mark Prausnitz, from the Georgia Institute of Technology, who was not involved in the study, said: ‘This study shows remarkable results where insulin given by mouth works about as well as a conventional injection’.

It is unclear when the drug may be available and if it would benefit type 1 or 2 diabetes patients. 

 


FDA Approves 1st Generic EpiPen

THURSDAY, Aug. 16, 2018 — The first generic version of the EpiPen was approved by the U.S. Food and Drug Administration on Thursday, paving the way for more affordable versions of the lifesaving allergy emergency medication.

Though other injectors are available, this drug, made by Teva Pharmaceuticals USA, is the first the FDA has said is the equivalent of the EpiPen. It can be automatically substituted for EpiPen in pharmacies across the United States, the Washington Post reported.

With a new school season about to start, people have been reporting a shortage of EpiPens, the newspaper noted.

“Today’s approval of the first generic version of the most widely prescribed epinephrine auto-injector in the U.S. is part of our longstanding commitment to advance access to lower cost, safe and effective generic alternatives once patents and other exclusivities no longer prevent approval,” FDA Commissioner Dr. Scott Gottlieb said in an agency news release.

“This approval means patients living with severe allergies who require constant access to lifesaving epinephrine should have a lower-cost option, as well as another approved product to help protect against potential drug shortages,” Gottlieb added.

The price of the drug and its launch date were not yet available, but the company’s statement suggested it would not be in time for many parents who are scrambling to find EpiPens in their pharmacies now, the newspaper reported. Teva will market its generic epinephrine auto-injector in 0.3 milligram (mg) and 0.15 mg strengths.

EpiPen, made by Mylan, injects the hormone epinephrine into the thigh to reverse potentially fatal reactions to bee stings, peanuts and other allergens.

Although the key ingredient is cheap and the EpiPen was first approved in 1987, Mylan raised the price of the product from less than $100 for a pack of two injectors in 2007 to $608 for a pair now. In response to criticism over the price, Mylan introduced its own half-priced generic version in 2016, the Post reported.

Posted: August 2018


Key factor may be missing from models that predict disease outbreaks from climate change: Parasites that incubate at higher temperatures cause stronger infections in future hosts, creating a climate ‘echo effect’ across generations of pathogens

New research from Indiana University suggests that computer models used to predict the spread of epidemics from climate change — such as crop blights or disease outbreaks — may not take into account an important factor in predicting their severity.

A study recently published in the journal Ecology has found that pathogens that grow inside organisms at higher temperatures produce offspring that cause higher rates of infection compared to pathogens that grow inside organisms at lower temperatures. This suggests that climate can cause an “echo effect” in future pathogens, ultimately making them more infectious.

“It’s well known that environment can affect offspring across generations in plants and animals,” said Spencer Hall, a professor in the IU Bloomington College of Arts and Sciences’ Department of Biology, who is senior author on the study. “This study is one of the first to suggest that similar cross-generational effects occur in parasites and pathogens.”

The work was led by Marta Strecker Shocket, a Ph.D. student in Hall’s lab at the time of the study. Hall is also a member of the Environmental Resilience Institute at IU, part of the IU Prepared for Environmental Change Grand Challenge.

“If past environmental conditions impact the frequency or severity of future infections, then current climate models are not taking an important factor into consideration when predicting threats from climate change,” said Shocket, who is now a postdoctoral researcher at Stanford University. “This might include threats to animals, plants and people.”

The researchers’ analysis draws in part upon research in water fleas conducted at three freshwater lakes in southern Indiana where Hall’s lab has collected samples since 2009. Located on the site of a former coal mine in Greene-Sullivan State Forest in Linton, Indiana, the lakes are known locally as Lake Gambill, Lake Clear and Lake Scott.

Water fleas, also known as Daphnia, are small crustaceans that contribute to the health of lakes by feeding on algae. Without these organisms to control algae, a lake can quickly degrade into a turbid brew that resembles pea soup.

Water fleas are susceptible to infection by a fungal pathogen called Metschnikowia, which reproduces inside water fleas as needle-like spores that multiply until they kill the host and burst out of its body to infect the next generation. A single autumn can produce six to 10 generations of the spores, with up to 60 percent of water fleas infected at the epidemic’s peak.

For the study, Shocket conducted lab experiments that found that the fungus caused more infections when higher temperatures were observed in the previous generations of water fleas. She then conducted research in the field to compare the lab results against fresh lake samples collected in late autumn.

The analysis found that an increase of merely 6.5 degrees Fahrenheit caused the fungal spores to become two to five times more likely to infect a new host.
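
As a back-of-envelope reading of that result, a 2- to 5-fold rise over 6.5 degrees works out to roughly an 11 to 28 percent increase per degree Fahrenheit, if one assumes the effect compounds smoothly per degree (an assumption of ours, not the study’s):

```python
# Implied per-degree fold change if 6.5 F of warming multiplies
# infectivity by 2x to 5x, assuming smooth compounding per degree.
for total_fold in (2.0, 5.0):
    per_degree = total_fold ** (1 / 6.5)
    print(f"{total_fold}x over 6.5 F -> {per_degree:.2f}x per degree "
          f"({(per_degree - 1) * 100:.0f}% per F)")
```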

Shocket said additional research is needed to explore these effects in other pathogen systems, especially since they could have an impact on agriculture. This is because the effect of climate across generations in parasites is more likely to have an impact on cold-blooded host organisms, such as crop plants or the insects that eat them.

The study also features mathematical methods to predict the effect of temperature changes on spore infectiousness across generations. Hall said the principles behind these models could potentially enhance other simulations that draw upon many factors to predict disease outbreaks.

“The translation of observational data into computational models is important in the field of ecology since nature is so messy,” he added. “The refinement of the algorithms to predict risks from climate is a crucial step in our ability to prepare for environmental change.”

Other authors on the paper are IU Ph.D. student Jason Walsman; Andrew Sickbert, an IU undergraduate at the time of the study; and Alexander Strauss and Jessica Hite, IU Ph.D. students at the time of the study. Additional contributors are Meghan Duffy of the University of Michigan and Carla Cáceres of the University of Illinois.

The Prepared for Environmental Change Grand Challenge brings together a broad, bipartisan coalition of government, business, nonprofit and community leaders to help Indiana better prepare for the challenges that environmental change brings to our economy, health and livelihood. Announced in May 2017, it is the second initiative to be funded as part of the Indiana University Grand Challenges program.

This study was supported in part by the National Science Foundation and Environmental Protection Agency.


Opt-out organ donation register unlikely to increase number of donations

An opt-out organ donation register is unlikely to increase the number of donations, according to a new study from Queen Mary University of London.

The researchers say donors should actively choose to be on the register by opting in, both to ensure they genuinely want to donate their organs and to make families less likely to refuse the donation of their deceased relatives’ organs.

An opt-out system automatically registers everyone and presumes consent to donate; anyone who does not want to donate must remove themselves from the register. An opt-in system, by contrast, requires explicit consent to donate and so indicates genuine willingness.

However, most organ donation legislative systems, whether opt-in or opt-out, include a clause that allows the final decision to donate to be made by family members.

NHS Blood and Transplant reported in 2016 that more than 500 families vetoed organ donations since April 2010 despite being informed that their relative was on the opt-in NHS Organ Donation Register. This translated into an estimated 1,200 people missing out on potential life-saving transplants.

Plans to introduce an opt-out system in England by 2020 have recently been announced by the government, but the researchers suggest this will create ambiguity and will not reduce veto rates.

In three experiments, American and European participants from countries that have either a default opt-in or default opt-out system were presented with a fictional scenario and asked to take on the role of a third party to judge the likelihood that an individual’s ‘true wish’ was to actually donate their organs, given that they were registered to donate.

Overall, regardless of which country the participants came from, they perceived the donor’s underlying preference to donate as stronger under the default opt-in and mandated choice systems as compared to the default opt-out and mandatory donor systems.

The study was published in the Journal of Experimental Psychology: Applied.

Lead author Dr Magda Osman, from Queen Mary University of London, said: “We show it’s harder to judge the underlying wishes of the deceased if they were on an opt-out and mandatory donation register. Why? Because making a free choice indicates what your preference is. If you don’t actively choose and you are listed as a donor on the register, then it isn’t clear if you really wanted to donate your organs. This matters because if in the event of death your relatives have to decide what to do, they may veto the organ donation if they can’t tell for sure what your underlying wishes were.”

Dr Yiling Lin, also from Queen Mary University of London, added: “There are plans to launch an opt-out organ donation system in England, but what we show is that this system is unlikely to increase actual rates of organ donation or reduce veto rates, all it will do is increase the number of people on the organ donation register.”

In 2017/18 there were 6,044 people in the UK waiting for a transplant while 411 patients died while waiting on this list. Similarly, this year in the US there are 114,000 people on the waiting list to receive an organ and it is estimated that 20 people die each day while waiting on the list.

To address problems like these, behavioural interventions, such as nudges, have been used to provide practical solutions that are based on psychological and behavioural economic research.

An example of a nudge is an automatic default, such as the ones often used in organ donation legislative systems. The rationale behind an automatic default is that it can bridge the gap between a good intention and the effort needed to put that intention into practice.

Dr Osman said: “Our findings are important because they challenge the efforts of many nudge enthusiasts to promote the use of opt-out defaults in organ donation.”

She added: “To help increase actual rates of organ donation, we need more transplant coordinators working with families to help them understand the issues before being faced with a monumental and distressing decision.

“We also need to offer people a way to indicate explicitly what they wish to do. This should involve an expressed statement of intention if their wish is to donate, or an expressed statement of intention if there is an objection to donate. This reduces the ambiguity in trying to infer what one wanted to do when it comes to donating their organs.”


How people use, and lose, preexisting biases to make decisions

From love and politics to health and finances, humans can sometimes make decisions that appear irrational, or dictated by an existing bias or belief. But a new study from Columbia University neuroscientists uncovers a surprisingly rational feature of the human brain: A previously held bias can be set aside so that the brain can apply logical, mathematical reasoning to the decision at hand. These findings highlight the importance that the brain places on the accumulation of evidence during decision-making, as well as how prior knowledge is assessed and updated as the brain incorporates new evidence over time.

This research was reported today in Neuron.

“As we interact with the world every day, our brains constantly form opinions and beliefs about our surroundings,” said Michael Shadlen, MD, PhD, the study’s senior author and a principal investigator at Columbia’s Mortimer B. Zuckerman Mind Brain Behavior Institute. “Sometimes knowledge is gained through education, or through feedback we receive. But in many cases we learn, not from a teacher, but from the accumulation of our own experiences. This study showed us how our brains help us to do that.”

As an example, consider an oncologist who must determine the best course of treatment for a patient diagnosed with cancer. Based on the doctor’s prior knowledge and her previous experiences with cancer patients, she may already have an opinion about what treatment combination (i.e. surgery, radiation and/or chemotherapy) to recommend — even before she examines this new patient’s complete medical history.

But each new patient brings new information, or evidence, that must be weighed against the doctor’s prior knowledge and experiences. The central question the researchers asked was whether, and to what extent, that prior knowledge is modified when someone is presented with new or conflicting evidence.

To find out, the team asked human participants to watch a group of dots as they moved across a computer screen, like grains of sand blowing in the wind. Over a series of trials, participants judged whether each new group of dots tended to move to the left or right — a tough decision as the movement patterns were not always immediately clear.

As new groups of dots were shown again and again across several trials, the participants were also given a second task: to judge whether the computer program generating the dots appeared to have an underlying bias.

Without telling the participants, the researchers had indeed programmed a bias into the computer; the movement of the dots was not evenly distributed between rightward and leftward motion, but instead was skewed toward one direction over another.

“The bias varied randomly from one short block of trials to the next,” said Ariel Zylberberg, PhD, a postdoctoral fellow in the Shadlen lab at Columbia’s Zuckerman Institute and the paper’s first author. “By altering the strength and direction of the bias across different blocks of trials, we could study how people gradually learned the direction of the bias and then incorporated that knowledge into the decision-making process.”

The study, which was co-led by Zuckerman Institute Principal Investigator Daniel Wolpert, PhD, took two approaches to evaluating the learning of the bias. First, implicitly, by monitoring the influence of bias in the participant’s decisions and their confidence in those decisions. Second, explicitly, by asking people to report the most likely direction of movement in the block of trials. Both approaches demonstrated that the participants used sensory evidence to update their beliefs about directional bias of the dots, and they did so without being told whether their decisions were correct.

“Originally, we thought that people were going to show a confirmation bias, and interpret ambiguous evidence as favoring their preexisting beliefs,” said Dr. Zylberberg. “But instead we found the opposite: People were able to update their beliefs about the bias in a statistically optimal manner.”

The researchers argue that this occurred because the participants’ brains were considering two situations simultaneously: one in which the bias exists, and a second in which it does not.

“Even though their brains were gradually learning the existence of a legitimate bias, that bias would be set aside so as not to influence the person’s assessment of what was in front of their eyes when updating their belief about the bias,” said Dr. Wolpert, who is also professor of neuroscience at Columbia University Irving Medical Center (CUIMC). “In other words, the brain performed counterfactual reasoning by asking ‘What would my choice and confidence have been if there were no bias in the motion direction?’ Only after doing this did the brain update its estimate of the bias.”
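
One way to make that counterfactual step concrete is a toy Bayesian observer: on each trial it scores the evidence under both possible motion directions, and only then uses those scores to update its belief about the block’s bias. The model below is our illustrative sketch under simple Gaussian-evidence assumptions, not the paper’s actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss_pdf(x, mu, sigma=1.0):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

d = 1.0                                     # assumed motion strength
candidate_bias = np.array([0.2, 0.5, 0.8])  # hypothetical P(rightward) values
posterior = np.full(3, 1 / 3)               # flat prior over the candidates

true_bias = 0.8                             # the block's hidden bias
for _ in range(50):
    direction = 1 if rng.random() < true_bias else -1
    evidence = rng.normal(direction * d, 1.0)  # noisy evidence on this trial
    # Counterfactual step: evaluate the evidence under BOTH directions,
    # then weight by each candidate bias to get the trial likelihood.
    lik = (candidate_bias * gauss_pdf(evidence, d)
           + (1 - candidate_bias) * gauss_pdf(evidence, -d))
    posterior *= lik
    posterior /= posterior.sum()

# After enough trials the posterior should concentrate on the true bias.
print(dict(zip(candidate_bias, posterior.round(3))))
```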

The researchers were amazed at the brain’s ability to interchange these multiple, realistic representations with an almost Bayesian-like, mathematical quality.

“When we look hard under the hood, so to speak, we see that our brains are built pretty rationally,” said Dr. Shadlen, who is also professor of neuroscience at CUIMC and an investigator at the Howard Hughes Medical Institute. “Even though that is at odds with all the ways that we know ourselves to be irrational.”

Although not addressed in this study, irrationality, Dr. Shadlen hypothesizes, may arise when the stories we tell ourselves influence the decision-making process.

“We tend to navigate through particularly complex scenarios by telling stories, and perhaps this storytelling — when layered on top of the brain’s underlying rationality — plays a role in some of our more irrational decisions; whether that be what to eat for dinner, where to invest (or not invest) your money or which candidate to choose.”

This research was supported by the Howard Hughes Medical Institute, the National Eye Institute (R01 EY11378), the Human Frontier Science Program, the Wellcome Trust and the Royal Society.


Measles Outbreak Affects 21 States

Measles is a highly contagious but very preventable disease. In fact, after the introduction of the MMR (measles, mumps and rubella) vaccine in 1963, the number of annual cases in the United States dropped from 3 to 4 million to something so negligible the disease was declared eliminated in 2000, according to the Centers for Disease Control and Prevention. However, in recent years, measles has seen something of a resurgence in the U.S., and this year is no different. According to a CDC report released on Wednesday, there have been 107 confirmed cases of measles over a recent six-month period. Yep, 107 confirmed cases from Jan. 1 to July 14 alone.

The outbreak has affected individuals from 21 states — including Arkansas, California, Connecticut, Florida, Illinois, Indiana, Kansas, Louisiana, Maryland, Michigan, Missouri, Nevada, New Jersey, New York, North Carolina, Oklahoma, Oregon, Pennsylvania, Tennessee, Texas, Washington and the District of Columbia — and while both vaccinated and unvaccinated individuals have been affected, the CDC has confirmed that most of the 2018 cases have involved those who were unvaccinated.

The symptoms of measles generally appear about seven to 14 days after a person is infected and include red eyes, a cough, runny nose, a high fever and a red rash that spreads all over the body. While the severity of each case varies, 1 out of every 4 individuals diagnosed with measles will need to be hospitalized, and two to three cases per 1,000 will result in death.
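
Applying those published rates to this year’s 107 confirmed cases gives a rough sense of scale (expected values only, our arithmetic):

```python
cases = 107
hospitalized = cases * (1 / 4)                  # 1 in 4 cases hospitalized
deaths_low, deaths_high = cases * 2 / 1000, cases * 3 / 1000
print(f"~{hospitalized:.0f} hospitalizations expected")        # ~27
print(f"~{deaths_low:.2f}-{deaths_high:.2f} deaths expected")  # well under 1
```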

So, why is measles making a comeback? According to Popular Science, the rise in the number of measles cases is likely due to the declining number of children receiving MMR vaccines. As fewer and fewer parents choose to get their children vaccinated (for religious or other reasons), we begin losing the crucial herd immunity that protects us en masse from a disease. And that is a problem, not only for those who refuse the vaccine but for those who are too young to get the vaccine and/or physically cannot get the vaccine. What’s more, not even those who have been vaccinated against measles are entirely safe. While “the measles vaccine is pretty amazing,” says Popular Science journalist Sara Chodosh, it only prevents one from contracting the disease 97 percent of the time, making vaccination very, very important.

But it isn’t just the measles vaccine that matters: all childhood vaccinations are important because it’s so much easier to prevent a disease than to treat it after it occurs. So if you have a school-age child, be sure to get them vaccinated against diphtheria, tetanus, whooping cough (pertussis), polio, measles, mumps, rubella and chicken pox (varicella). Because the truth is in the science: Vaccinations save lives.
