Commentary

Dining Dangers


Marie-Eileen Onieal provides food for thought: Since gastrointestinal disorders and food allergies seem to be increasing in prevalence, does dining out now mean taking your life in your hands?

At a recent dinner with friends, the conversation between sitting down and ordering turned to how we have changed our diets over the past few years. Several of us no longer drink colas, others have decaffeinated themselves, and one or two have gone lactose-free. We chuckled at what a challenge it has become to invite people for dinner: the query “What can I bring?” has become “Are there any food restrictions?”

I find it odd how our bodies have become sensitive or intolerant to food and food additives as we mature. I also find it interesting that in response to those sensitivities, we have reverted to “purer” preparations, to the extent that we can find the unadulterated ingredients. That is definitely easier when you are the “Master Chef,” but when you rely on others—well, that is a very different scenario.

While regulation of food in the United States dates from early colonial times, it took until 1906 to get the Food and Drugs Act, also known as the Wiley Act, established. Harvey Washington Wiley, Chief Chemist of the Bureau of Chemistry in the Department of Agriculture, was the powerhouse behind this law. Wiley believed unsafe foods were a greater public health crisis than adulterated or misbranded drugs. Moreover, he opposed chemical additives to foods, which he viewed as unnecessary adulterants.1

Interestingly, Upton Sinclair’s The Jungle, an exposé of the revolting state of the meatpacking industry, is credited as the precipitating force behind this meat inspection and comprehensive food and drug law.1 The Wiley Act banned interstate commerce in adulterated and misbranded food and drugs, and further prohibited the addition of any ingredients that would substitute for the food, conceal damage, pose a health hazard, or constitute a filthy or decomposed substance. Prior to that, basic elements of food protection were absent.

Despite these inroads, however, concerns about food and drug safety continued, and in 1938, President Franklin D. Roosevelt signed the Food, Drug, and Cosmetic Act into law.2 This act corrected abuses in food packaging and quality, and it mandated legally enforceable food standards. The first food standards issued under the 1938 act were for canned tomato products; since the 1960s, about half of our food supply has been subject to a standard.

Almost 100 years after the establishment of the Wiley Act, we continue to be plagued by concerns about our food and its contents. To address these concerns, Congress passed the Food Allergy Labeling and Consumer Protection Act in 2004, which requires the labeling of any product containing a protein derived from any of the foods that, as a group, account for the vast majority of food allergies: peanuts, soybeans, cow’s milk, eggs, fish, crustacean shellfish, tree nuts, and wheat.3 This was an important move, as studies indicate that more than 11 million Americans have one or more food allergies, considered a component of chemical intolerance.

The term chemical intolerance (CI) is used to describe the loss of prior, natural tolerance to common foods and drugs that occurs in certain individuals.4 In population-based surveys, participants report a 2% to 13% prevalence of CI.5 Researchers have also found that patients with CI had an increased incidence of poorer functional states and a tendency toward increased use of the health care system, compared with persons without CI.4

Food additives are chemicals used to enhance the flavor, color, or shelf-life of food. Although they are now carefully regulated by federal authorities and various international organizations to ensure that foods are safe to eat and accurately labeled, in my opinion, food additives remain the most concealed and dangerous sources of CI. The CI recognized as a food allergy can be a potentially serious immune response to eating specific foods.

The prevalence of allergies to foods or food additives is on the rise: among children younger than 18 alone, reported food allergy increased 18% between 1997 and 2007.6 Often after dining out, those with CI or food allergies suffer for days with gastrointestinal, atopic, cardiovascular, or respiratory symptoms. Anyone who has ever tried to identify exactly what, why, or how they became ill after eating knows how frustrating and sickening that process is. They also know that as little as one taste of an offending substance can send them to bed for a day—or worse, to the emergency department (ED).

I never cease to be amazed at the carelessness of some food preparers. As you can imagine, I am outraged that people with food allergies or intolerance seem to be viewed as “picky eaters.” Yes, we are picky: We choose not to be ill after eating in your establishment! I was asked once if I “couldn’t just pick out” the allergen in my dish. My response was “Sure, right after you pick out where the LifeFlight helicopter can land after I eat this!”
