Disparities in Cancer Mortality Among US Counties, 1980-2014

Meaning. For many cancers, there were distinct clusters of counties in different regions with especially high mortality.

From 1980 to 2014, … US counties. Validated small area estimation models were used to estimate county-level mortality rates for 29 cancers: …; Hodgkin lymphoma; non-Hodgkin lymphoma; multiple myeloma; leukemia; and all other cancers combined.

Exposure. County of residence.

Main Outcomes and Measures. Age-standardized cancer mortality rates by county, year, sex, and cancer type.

Results. A total of … cancer deaths were recorded in the United States between 1980 and 2014. Cancer mortality decreased by 2…% over this period. There were large differences in the mortality rate among counties throughout the period: in 1980, rates ranged from 1… (UI, 114.7-146.…) in Summit County, Colorado, to 3… (UI, 330.5-450.…) in North Slope Borough, Alaska, and in 2014, from … (UI, 63.2-79.…) in Summit County, Colorado, to 5… (UI, 464.9-545.…) in Union County, Florida. For many cancers, there were distinct clusters of counties with especially high mortality. The location of these clusters varied by type of cancer, and they were spread across different regions of the United States. Clusters of breast cancer were present in the southern belt and along the Mississippi River, while liver cancer was high along the Texas-Mexico border, and clusters of kidney cancer were observed in North and South Dakota and in counties in West Virginia, Ohio, Indiana, Louisiana, Oklahoma, Texas, Alaska, and Illinois.

Conclusions and Relevance. Cancer mortality declined overall in the United States between 1980 and 2014. Over this same period, there were important changes in trends, patterns, and differences in cancer mortality among US counties. These patterns may inform further research into improving prevention and treatment. Moreover, local information can be useful for health care clinicians seeking to understand community needs for care, and can aid in identifying cancer hot spots that need more investigation to understand root causes.

This research received institutional review board approval from the University of Washington. Informed consent was not required because the study used deidentified data and was retrospective. County-level information on levels of education, income, race/ethnicity, Native American reservations, and population density, derived from data provided by the Census Bureau and the NCHS, was also used. More detail on these data sources is provided in eTable 1 in the Supplement. Although the focus of this study is cancers, all causes of death in the GBD cause list were analyzed concurrently. This study used garbage-code redistribution methods developed for the GBD to reallocate deaths assigned garbage codes (cause-of-death codes that are implausible or insufficiently specific to serve as an underlying cause).
First, plausible target causes were identified for each garbage code or group of garbage codes. Second, deaths were reassigned to the specified target codes according to proportions derived in 1 of 4 ways: (1) from published literature or expert opinion; (2) from regression models; (3) according to the proportions initially observed among targets; and (4) for HIV/AIDS specifically, by comparison with years before HIV/AIDS became widespread. The model for each cause was specified in terms of D_{j,t,a}, the number of deaths in county j, year t, and age group a. The model for the mortality rate m_{j,t,a} contained 6 components, including an intercept. The model incorporated 7 covariates: the proportion of the adult population that graduated high school, the proportion of the population that is Hispanic, the proportion of the population that is black, the proportion of the population that is a race other than black or white, the proportion of a county that is contained within a state or federal Native American reservation, the median household income, and the population density. After raking (rescaling county-level estimates so that they sum to state and national totals), age-standardized mortality rates were calculated using the US 2… Census population as the standard, and years of life lost were calculated by multiplying the mortality rate by the population and by the life expectancy at the average age at death in the reference life table used in the GBD, then summing across all ages. When measuring changes over time, the change was considered statistically significant if the posterior probability of an increase (or decrease) was at least 9…. No explicit correction for multiple testing (ie, across multiple counties) was applied; however, modeling all counties simultaneously is expected to mitigate the risk of spuriously detecting changes due to multiple testing.
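The age-standardization and years-of-life-lost arithmetic described above can be sketched briefly: the age-standardized rate is a weighted average of age-specific rates using standard-population weights, and years of life lost multiply each age group's mortality rate by its population and by the reference life expectancy at death, then sum across ages. All numeric inputs below (age groups, rates, weights, life expectancies) are illustrative placeholders, not values from the study.

```python
import numpy as np

# Illustrative inputs for one county-year (NOT values from the study):
# age-specific mortality rates (deaths per person-year), population counts,
# and reference life expectancy at the average age at death in each group.
mort_rate = np.array([0.00002, 0.0001, 0.002, 0.012])
population = np.array([20_000, 45_000, 30_000, 15_000])
ref_life_exp = np.array([81.0, 52.0, 30.0, 12.0])

# Hypothetical standard-population weights (the study used the US census
# standard population); the weights must sum to 1.
std_weights = np.array([0.20, 0.40, 0.26, 0.14])

# Age-standardized mortality rate: weighted average of age-specific rates,
# expressed per 100,000 population.
asmr_per_100k = 100_000 * float(np.sum(mort_rate * std_weights))

# Years of life lost: rate x population x reference life expectancy,
# summed across all age groups, as described in the text.
yll = float(np.sum(mort_rate * population * ref_life_exp))
```

With these placeholder inputs the standardized rate works out to 224.4 deaths per 100,000 and 4,226.4 years of life lost; in the study, the raking step additionally rescales county estimates so they aggregate consistently to state and national totals.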
The study reports mortality rates for 29 cancers: lip and oral cavity; nasopharynx; other pharynx; esophageal; stomach; colon and rectum; liver; gallbladder and biliary; pancreatic; larynx; tracheal, bronchus, and lung; malignant skin melanoma; nonmelanoma skin cancer; breast; cervical; uterine; ovarian; prostate; testicular; kidney; bladder; brain and nervous system; thyroid; mesothelioma; Hodgkin lymphoma; non-Hodgkin lymphoma; multiple myeloma; leukemia; and all other cancers combined. Two types of risk assessment are possible within the comparative risk assessment framework: attributable burden and avoidable burden. Attributable burden is the reduction in current disease burden that would have been possible if past population exposure had shifted to an alternative or counterfactual distribution of risk exposure. Avoidable burden is the potential reduction in future disease burden that could be achieved by changing the current distribution of exposure to a counterfactual distribution of exposure. Four types of counterfactual exposure distributions have been identified. In GBD studies and in this study, the focus was on attributable burden using the theoretical minimum risk level, which is the level of risk exposure that minimizes risk at the population level, or the level of risk that captures the maximum attributable burden. The study reported the number of years of life lost in addition to deaths to account for the fact that many deaths from certain cancers occur at an older age. For example, prostate cancer was the fifth leading cause of death among cancers but the ninth leading cause of cancer years of life lost. Lung, colon, and breast cancer were the top 3 leading causes for all metrics. Lung, colon, and breast cancers also had the largest absolute difference in mortality between counties at the 9…th percentile and those at the other end of the distribution; lung cancer mortality rates were twice as high among counties at the 9…th percentile as among those at the opposite end.
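The attributable-burden idea can be illustrated with the standard discrete population-attributable-fraction (PAF) calculation: compare total risk under the observed exposure distribution with risk under the theoretical-minimum counterfactual, in which everyone sits in the lowest-risk category. The exposure categories, prevalences, and relative risks below are invented for illustration and are not from the study; the GBD's actual comparative risk assessment works with continuous exposure distributions and is considerably more involved.

```python
# Hypothetical exposure categories for a single risk factor (illustrative
# values only, not from the study or the GBD).
exposure_prev = [0.5, 0.3, 0.2]   # observed share of the population
counterfactual = [1.0, 0.0, 0.0]  # everyone at the theoretical minimum risk
relative_risk = [1.0, 2.0, 4.0]   # risk relative to the minimum category

# Total relative risk under each exposure distribution.
observed = sum(p * rr for p, rr in zip(exposure_prev, relative_risk))
minimum = sum(p * rr for p, rr in zip(counterfactual, relative_risk))

# Population-attributable fraction: share of current burden that would not
# have occurred had exposure followed the counterfactual distribution.
paf = (observed - minimum) / observed

# Attributable deaths = PAF x observed deaths (illustrative count).
attributable_deaths = paf * 1_000
```

Here a little under half of the burden (PAF of roughly 0.47) is attributed to the risk factor; avoidable burden would instead apply a counterfactual distribution to future exposure.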
Table 2 shows the 5-year relative survival for selected cancers from the Surveillance, Epidemiology, and End Results program (eTable 3 in the Supplement) and the population-attributable fraction from the GBD using the comparative risk assessment approach. Although cancer survival improved from 19… onward, some cancers (eg, Hodgkin lymphoma) had a 5-year survival rate of more than 8…%. The population-attributable fraction of risk factors was the highest for lung and cervical cancer and the lowest for ovarian cancer. Results for all cancers combined and for 10 specific cancers are presented in the Figures, with results for the remaining cancers presented in eFigures 1-23 in the Supplement. The 10 specific cancers highlighted below were chosen either because they have a high burden (eg, tracheal, bronchus, and lung cancer), because they are responsive to treatment (eg, testicular cancer), or because screening is an important component of the health system response (eg, breast cancer). For cancers that predominantly or exclusively affect males or females (eg, breast cancer, prostate cancer), results are reported for males or females only, while in all other cases results are presented for both sexes combined. Mortality rates by county for each cancer are available in an online visualization tool (Interactive). Cancer mortality (Figure 1) decreased from 1980 to 2014. In 1980, the lowest mortality rate was 1… (UI, 114.7-146.…) in Summit County, Colorado, while the highest was 3… (UI, 330.5-450.…) in North Slope Borough, Alaska; in 2014, the corresponding rates were … (UI, 63.2-79.…) in Summit County, Colorado, and 5… (UI, 464.9-545.…) in Union County, Florida (eTable 4 in the Supplement). In 2014, there were clusters of high mortality in several areas of the South (in Kentucky, West Virginia, Alabama, and along the Mississippi River) and in western Alaska. Moreover, there were some high rates in counties in North and South Dakota and Texas, while lower rates were present in Utah and Colorado.
There were statistically significant increases in cancer mortality between 19… and … in counties in Kentucky and scattered across regions of the South.

Tracheal, bronchus, and lung cancer mortality (Figure 2) declined by 2…% (UI, 17.9%-24.…%), from … (UI, 66.8-70.…) to … (UI, 52.7-55.…) deaths per 100,000 population. The West and Northeast experienced declines in the mortality rate, as did Florida, while increases were observed in the South, the Appalachian region, and the Midwest. The largest increase from 19… to … was in Owsley County, Kentucky (9…%; UI, 73.7%-130.…%), and the largest decline was in the Aleutians East Borough and Aleutians West Census Area, Alaska (6…%; UI, 50.3%-73.…%). High mortality rates in 2… clustered in Kentucky and West Virginia. National rates peaked in 19…; the highest national mortality rate for men was present in 19…. The largest percentage increase (1…%; UI, 136.4%-207.…%) occurred in Marlboro County, South Carolina (mortality rate of 6…). Mortality rates varied from 1… (UI, 8.6-12.8) in Summit County, Colorado, to 3… (UI, 300.5-375.…) in Union County, Florida, for males, and from 1… (UI, 8.3-13.8) in Summit County, Colorado, to 1… (UI, 101.6-142.…) in Owsley County, Kentucky, for females. Low rates were observed along the US border with Mexico and in Utah, Colorado, and parts of Arizona, New Mexico, and Idaho.

Mortality from colon and rectum cancer (Figure 3) declined by 3…% (UI, 32.9%-38.…%), from … (UI, 33.5-35.…) to … (UI, 21.5-22.…) deaths per 100,000 population. The highest rate of deaths per 100,000 population was in Union County, Florida (5…; UI, 52.0-65.…), and the lowest was in Summit County, Colorado (8…; UI, 7.0-9.3). There were clusters of high rates in 2… along the Mississippi River in Missouri, Mississippi, Arkansas, and Louisiana, and others in southern Alabama, Alaska, and along the border of West Virginia and Kentucky. Several counties in Nevada, North and South Dakota, and Montana also had high rates. Statistically significant declines in mortality rates from colon and rectum cancers between 19… and … were observed in counties including Howard County, Maryland (6…).

What's So Bad About Gluten?
Just after Labor Day, the Gluten and Allergen Free Expo stopped for a weekend at the Meadowlands Exposition Center. Each year, the event wends its way across the country like a travelling medicine show, billing itself as the largest display of gluten-free products in the United States. Banners hung from the rafters, with welcoming messages like "Plantain Flour Is the New Kale." Plantain flour contains no gluten, and neither did anything else at the exposition (including kale). There were gluten-free chips, gluten-free dips, gluten-free soups, and gluten-free stews; there were gluten-free breads, croutons, pretzels, and beer. There was gluten-free artisanal fusilli and penne from Italy, and gluten-free artisanal fusilli and penne from the United States. Dozens of companies had set up tables, offering samples of gluten-free cheese sticks, fish sticks, bread sticks, and soy sticks. One man passed out packets of bread crumbs, made by "master bakers," that were certified as gluten-free, G.M.O.-free, and kosher. There was even gluten-free dog food.

Gluten, one of the most heavily consumed proteins on earth, is created when two molecules, glutenin and gliadin, come into contact and form a bond. When bakers knead dough, that bond creates an elastic membrane, which is what gives bread its chewy texture and permits pizza chefs to toss and twirl the dough into the air. Gluten also traps carbon dioxide, which, as the dough ferments, adds volume to the loaf. Humans have been eating wheat, and the gluten in it, for at least ten thousand years. For people with celiac disease, about one per cent of the population, the briefest exposure to gluten can trigger an immune reaction powerful enough to severely damage the brushlike surfaces of the small intestine. People with celiac have to be alert around food at all times, learning to spot hidden hazards in common products, such as hydrolyzed vegetable protein and malt vinegar. Eating in restaurants requires particular vigilance.
Even reusing water in which wheat pasta has been cooked can be dangerous. Until about a decade ago, the other ninety-nine per cent of Americans rarely seemed to give gluten much thought. But, led by people like William Davis, a cardiologist whose book "Wheat Belly" created an empire founded on the conviction that gluten is a poison, the protein has become a culinary villain. Davis believes that even "healthy" whole grains are destructive, and he has blamed gluten for everything from arthritis and asthma to multiple sclerosis and schizophrenia. David Perlmutter, a neurologist and the author of another of the gluten-free movement's foundational texts, "Grain Brain: The Surprising Truth About Wheat, Carbs, and Sugar—Your Brain's Silent Killers," goes further still. Gluten sensitivity, he writes, "represents one of the greatest and most under-recognized health threats to humanity."

Nearly twenty million people contend that they regularly experience distress after eating products that contain gluten, and a third of American adults say that they are trying to eliminate it from their diets. One study that tracks American restaurant trends found that customers ordered more than two hundred million dishes last year that were gluten- or wheat-free. "I know that I'm intolerant because I gave it up and I felt better. That explanation is probably not scientific enough for you. But I know how I felt, how I feel, and what I did to make it change." She went on, "I'm a foodie. It's been five years since I had biscotti. And I just had one here, gluten-free. And it rocks."

Sales of gluten-free products will exceed fifteen billion dollars by 2…. The growing list of gluten-free options has been a gift for many children, who no longer have to go through life knowing that they will never eat pizza, cookies, or cake. As with organic food, which was at first sold almost exclusively by outlets with a local clientele, the market is controlled increasingly by corporations. Goya and ShopRite both had booths at the expo; so did Glutino, which was founded in 19…. "And that is what drove us, the idea of being that one-stop shop in gluten-free, the category leader, the category captain."

For many people, avoiding gluten has become a cultural as well as a dietary choice, and the exposition offered an entry ramp to a new kind of life. There was a travel agent who specialized in gluten-free vacations, and a woman who helps plan gluten-free wedding receptions. One vender passed out placards: "I am nut free," "I am shellfish free," "I am egg free," "I am wheat free." I also saw an advertisement for gluten-free communion wafers. The fear of gluten has become so pronounced that, a few weeks ago, the television show "South Park" devoted an episode to the issue. In it, South Park became the first entirely gluten-free town in the nation. Federal agents placed anyone suspected of having been "contaminated" in quarantine at a Papa John's surrounded by razor wire. Citizens were forced to strip their cupboards of offending foods, and an angry mob took a flamethrower to the wheat fields.

"No matter what kind of sickness has taken hold of you, let's blame gluten," April Peveteaux writes in her highly entertaining book "Gluten Is My Bitch." (Peveteaux maintains a blog with the same name.) "If you want or need to get gluten out of your diet, bravo! Kick that nasty gluten to the curb. Not sure if gluten-free is for you? Perhaps gluten simply causes you some discomfort, but you've never been diagnosed. Then eff that gluten!"

Wheat provides about twenty per cent of the world's calories and more nourishment than any other source of food. Last year's harvest, of seven hundred and eighteen million tons, amounted to roughly two hundred pounds for every person on earth. In the United States, wheat consumption appears to fluctuate according to nutritional trends. It rose steadily from the nineteen-seventies to about 2….
Since then, the number of people who say that wheat, barley, and rye make them sick has soared, though wheat consumption has fallen. Wheat is easy to grow, to store, and to ship. The chemical properties of flour and dough also make wheat versatile. Most people know that it is integral to bread, pasta, noodles, and cereal. But wheat has become a hidden ingredient in thousands of other products, including soups, sauces, gravies, dressings, spreads, and snack foods, and even processed meats and frozen vegetables. Nearly a third of the foods found in American supermarkets contain some component of wheat, usually gluten or starch, or both.

The most obvious question is also the most difficult to answer: How could gluten, present in a staple food that has sustained humanity for thousands of years, have suddenly become so threatening? There are many theories but no clear, scientifically satisfying answers. Some researchers argue that wheat genes have become toxic. Davis has said that bread today is nothing like the bread found on tables just fifty years ago: "What's changed is that wheat's adverse effects on human health have been amplified many-fold. You and I cannot, to any degree, obtain the forms of wheat that were grown fifty years ago, let alone one hundred, one thousand, or ten thousand years ago. We have to restrict other carbohydrates beyond wheat, but wheat still stands apart as the worst of the worst." Perlmutter is less restrained: "As many as forty percent of us can't properly process gluten, and the remaining sixty percent could be in harm's way."

Although dietary patterns have changed dramatically in the past century, our genes have not. The human body has not evolved to consume a modern Western diet, with meals full of sugary substances and refined, high-calorie carbohydrates.
Moreover, most of the wheat we eat today has been milled into white flour, which has plenty of gluten but few vitamins or nutrients, and can cause the sharp increases in blood sugar that often lead to diabetes and other chronic diseases. Donald Kasarda, a researcher at the U.S. Department of Agriculture, has studied wheat genetics for decades. In a recent study published in the Journal of Agricultural and Food Chemistry, he found no evidence that a change in wheat-breeding practices might have led to an increase in the incidence of celiac disease. Murray, a professor of medicine at the Mayo Clinic and the president of the North American Society for the Study of Celiac Disease, has also studied wheat genetics. He agrees with Kasarda. "And there is something more important to note. Wheat consumption is going down, not up. I don't think this is a problem that can be linked to the genetics of wheat."

But something strange is clearly going on. For reasons that remain largely unexplained, the incidence of celiac disease has increased more than fourfold in the past sixty years. Researchers initially attributed the growing number of cases to greater public awareness and better diagnoses. But neither can fully account for the leap since 19…. Murray and his colleagues at the Mayo Clinic discovered the increase almost by accident. Murray wanted to examine the long-term effects of undiagnosed celiac disease. To do that, he analyzed blood samples that had been taken from nine thousand Air Force recruits between 19… and …. The researchers looked for antibodies to an enzyme called transglutaminase; they are a reliable marker for celiac disease. Murray assumed that one per cent of the soldiers would test positive, matching the current celiac rate. Instead, the team found the antibodies in the blood of just two-tenths of one per cent of the soldiers. Then they compared the results with samples taken recently from demographically similar groups of twenty- and seventy-year-old men. In both groups, the biochemical markers were present in about one per cent of the samples. "That suggested that whatever has happened with celiac disease has happened since 19…," Murray said. The modern diet may be to blame.