Wednesday, December 15, 2021

Colorful sweets may look tasty, but some researchers question whether synthetic dyes may pose health risks to your colon and rectum

 

Early-onset colorectal cancer rates have been increasing since the 1990s. kajakiki/E+ via Getty Images

The incidence of early-onset colorectal cancer, defined as cases diagnosed in people under age 50, has been rising globally since the early 1990s. Rates for colon and rectal cancers are expected to increase by 90% and 124%, respectively, by 2030.

One suspected reason behind this trend is increased global consumption of a Westernized diet that consists heavily of red and processed meats, added sugar and refined grains. Sixty percent of the Standard American Diet, also known as “SAD,” is made up of ultra-processed food such as industrial baked sweets, soft drinks and processed meat. SAD is associated with an increased risk of colorectal cancer.

One aspect of ultra-processed foods I’m concerned about is how colorful they are. This characteristic is on full display in many delicious foods and treats present during the year-end holidays. However, many of the colors that make up candy canes, sugar cookies and even cranberry sauce and roast ham are synthetic. And there’s some evidence that these artificial food dyes may trigger cancer-causing processes in the body.

While artificial food coloring may look pretty, potential health risks require further study. cmannphoto/E+ via Getty Images

As the director of the Center for Colon Cancer Research at the University of South Carolina, I have been studying the effects of these synthetic food dyes on colorectal cancer development. While research on the potential cancer risk of synthetic food dyes is only just starting, I believe that you may want to think twice before you reach for that colorful treat this holiday season.

What are synthetic food dyes?

The food industry uses synthetic dyes because they make food look better. The first food dyes were created from coal tar in the late 1800s. Today, they are often synthesized from a chemical derived from petroleum called naphthalene to make a final product called an azo dye.

Food manufacturers prefer synthetic dyes over natural dyes like beet extract because they are cheaper, brighter and last longer. While manufacturers have developed hundreds of synthetic food dyes over the past century, the majority of them are toxic. Only nine are approved for use in food under U.S. Food and Drug Administration policy, and even fewer pass European Union regulations.

Food manufacturers in the U.S. started using synthetic dyes to standardize the coloring of their products as a marketing strategy.

What drives colorectal cancer?

DNA damage is the primary driver of colorectal cancer. When DNA damage occurs on cancer driver genes, it can result in a mutation that tells the cell to divide uncontrollably and turn cancerous.

Another driver of colorectal cancer is inflammation. Inflammation occurs when the immune system sends out inflammatory cells to begin healing an injury or to capture disease-causing pathogens. When this inflammation persists over time, it can harm otherwise healthy cells by releasing molecules called free radicals that can damage DNA. Other molecules, called cytokines, can prolong inflammation and drive increased cell division and cancer development in the gut even when there isn’t an injury to heal.

Long-term poor dietary habits can lead to a simmering low-grade inflammation that doesn’t produce noticeable symptoms, even while inflammatory molecules continue to damage otherwise healthy cells.

Synthetic food dyes and cancer

Although none of the FDA-approved synthetic food colors are classified as carcinogens, currently available research points to potential health risks I and others find concerning.

For example, the bacteria in your gut can break down synthetic dyes into molecules that are known to cause cancer. More research is needed on how the microbiome interacts with synthetic food coloring and potential cancer risk.

Studies have shown that artificial food dyes can bind to the DNA and proteins inside cells. There is also some evidence that synthetic dyes can stimulate the body’s inflammatory machinery. Both of these mechanisms may pose a problem for colon and rectal health.

Synthetic food dyes have been found to damage DNA in rodents. This is supported by unpublished data from my research team showing that Allura Red, or Red 40, and Tartrazine, or Yellow 5, can cause DNA damage in colon cancer cells in vitro, meaning in a controlled lab environment, with increasing dosage and length of exposure. Our results will need to be replicated in animal and human models before we can say that these dyes directly cause DNA damage, however.

Finally, artificial food coloring may be of particular concern for children. It’s known that children are more vulnerable to environmental toxins because their bodies are still developing. I and others believe that this concern may extend to synthetic food dyes, especially considering their prevalence in children’s food. A 2016 study found that over 40% of food products marketed toward children in one major supermarket in North Carolina contained artificial food coloring. More research needs to be done to examine how repeated exposure to artificial food dyes may affect children.

Many foods marketed toward kids contain synthetic food coloring. FluxFactory/E+ via Getty Images

Lowering your risk of colorectal cancer

A few treats during the holidays won’t cause colorectal cancer. But a long-term diet of processed foods might. While more research is needed on the link between synthetic food dyes and cancer, there are evidence-based steps you can take now to reduce your risk of colorectal cancer.

One way is to get screened for colon cancer. Another is to increase your physical activity. Finally, you can eat a healthy diet with more whole grains and produce and less alcohol and red and processed meat. Though this means eating fewer of the colorful, ultra-processed foods that may be plentiful during the holidays, your gut will thank you in the long run.


Lorne J. Hofseth, Professor and Associate Dean for Research, College of Pharmacy, University of South Carolina

This article is republished from The Conversation under a Creative Commons license.

Are people lying more since the rise of social media and smartphones?

 

Some forms of technology seem to facilitate lying more than others. solitude72/iStock via Getty Images

Technology has given people more ways to connect, but has it also given them more opportunities to lie?

You might text your friend a white lie to get out of going to dinner, exaggerate your height on a dating profile to appear more attractive or invent an excuse to your boss over email to save face.

Social psychologists and communication scholars have long wondered not just who lies the most, but where people tend to lie the most – that is, in person or through some other communication medium.

A seminal 2004 study was among the first to investigate the connection between deception rates and technology. Since then, the ways we communicate have shifted – fewer phone calls and more social media messaging, for example – and I wanted to see how well earlier results held up.

The link between deception and technology

Back in 2004, communication researcher Jeff Hancock and his colleagues had 28 students report the number of social interactions they had via face-to-face communication, the phone, instant messaging and email over seven days. Students also reported the number of times they lied in each social interaction.

The results suggested people told the most lies per social interaction on the phone. The fewest were told via email.

The findings aligned with a framework Hancock called the “feature-based model.” According to this model, specific aspects of a technology – whether people can communicate back and forth seamlessly, whether the messages are fleeting and whether communicators are distant – predict where people tend to lie the most.

In Hancock’s study, the most lies per social interaction occurred via the technology with all of these features: the phone. The fewest occurred on email, where people couldn’t communicate synchronously and the messages were recorded.

The Hancock study, revisited

When Hancock conducted his study, only students at a few select universities could create a Facebook account. The iPhone was in its early stages of development, a highly confidential project nicknamed “Project Purple.”

What would his results look like nearly 20 years later?

In a new study, I recruited a larger group of participants and studied interactions across more forms of technology. A total of 250 people recorded their social interactions, and the number of those interactions that included a lie, over seven days across face-to-face communication, social media, the phone, texting, video chat and email.

As in Hancock’s study, people told the most lies per social interaction over media that were synchronous and recordless and when communicators were distant: over the phone or on video chat. They told the fewest lies per social interaction via email. Interestingly, though, the differences across the forms of communication were small. Differences among participants – how much people varied in their lying tendencies – were more predictive of deception rates than differences among media.

Despite changes in the way people communicate over the past two decades – along with ways the COVID-19 pandemic changed how people socialize – people seem to lie systematically and in alignment with the feature-based model.

There are several possible explanations for these results, though more work is needed to understand exactly why different media lead to different lying rates. It’s possible that certain media are better facilitators of deception than others. Some media – the phone, video chat – might make deception feel easier or less costly to a social relationship if caught.

Deception rates might also differ across technology because people use some forms of technology for certain social relationships. For example, people might only email their professional colleagues, while video chat might be a better fit for more personal relationships.

Technology misunderstood

To me, there are two key takeaways.

First, there are, overall, small differences in lying rates across media. An individual’s tendency to lie matters more than whether someone is emailing or talking on the phone.

Second, there’s a low rate of lying across the board. Most people are honest – a premise consistent with truth-default theory, which suggests most people report being honest most of the time and there are only a few prolific liars in a population.

Since 2004, social media have become a primary place for interacting with other people. Yet a common misperception persists that communicating online or via technology, as opposed to in person, leads to social interactions that are lower in quantity and quality.

People often believe that just because we use technology to interact, honesty is harder to come by and users aren’t well served.

Not only is this perception misguided, but it is also unsupported by empirical evidence. The belief that lying is rampant in the digital age just doesn’t match the data.

David Markowitz, Assistant Professor of Social Media Data Analytics, University of Oregon

This article is republished from The Conversation under a Creative Commons license.

How Christmas became an American holiday tradition, with a Santa Claus, gifts and a tree

 

The pagan tradition of celebrating the winter solstice with bonfires on Dec. 21 inspired the early Christian celebrations of Christmas. Gpointstudio/ Image Source via Getty Images

Each season, the celebration of Christmas has religious leaders and conservatives publicly complaining about the commercialization of the holiday and the growing lack of Christian sentiment. Many people seem to believe that there was once a way to celebrate the birth of Christ in a more spiritual way.

Such perceptions about Christmas celebrations have, however, little basis in history. As a scholar of transnational and global history, I have studied the emergence of Christmas celebrations in German towns around 1800 and the global spread of this holiday ritual.

While Europeans participated in church services and religious ceremonies to celebrate the birth of Jesus for centuries, they did not commemorate it as we do today. Christmas trees and gift-giving on Dec. 24 in Germany did not spread to other European Christian cultures until the end of the 18th century and did not come to North America until the 1830s.

Charles Haswell, an engineer and chronicler of everyday life in New York City, wrote in his “Reminiscences of an Octogenarian” that in the 1830s German families living in Brooklyn dressed up Christmas trees with lights and ornaments. Haswell was so curious about this novel custom that he went to Brooklyn on a very stormy and wet night just to see these Christmas trees through the windows of private homes.

The first Christmas trees in Germany

Only in the late 1790s did the new custom of putting up a Christmas tree decorated with wax candles and ornaments and exchanging gifts emerge in Germany. This new holiday practice was completely outside and independent of Christian religious practices.

The idea of putting wax candles on an evergreen was inspired by the pagan tradition of celebrating the winter solstice with bonfires on Dec. 21. These bonfires on the darkest day of the year were intended to recall the sun and show her the way home. The lit Christmas tree was essentially a domesticated version of these bonfires.

The English poet Samuel Taylor Coleridge gave the very first description of a decorated Christmas tree in a German household when he reported in 1799 about having seen such a tree in a private home in Ratzeburg in northwestern Germany. In 1816 German poet E.T.A. Hoffmann published his famous story “Nutcracker and Mouse King.” This story contains the very first literary record of a Christmas tree decorated with apples, sweets and lights.

From the outset, all family members, including children, were expected to participate in the gift-giving. Gifts were not brought by a mystical figure, but openly exchanged among family members – symbolizing the new middle-class culture of egalitarianism.

From German roots to American soil

American visitors to Germany in the first half of the 19th century realized the potential of this celebration for nation building. In 1835 Harvard professor George Ticknor was the first American to observe and participate in this type of Christmas celebration and to praise its usefulness for creating a national culture. That year, Ticknor and his 12-year-old daughter Anna joined the family of Count von Ungern-Sternberg in Dresden for a memorable Christmas celebration.

Other American visitors to Germany – such as Charles Loring Brace, who witnessed a Christmas celebration in Berlin nearly 20 years later – considered it a specific German festival with the potential to pull people together.

For both Ticknor and Brace, this holiday tradition provided the emotional glue that could bring families and members of a nation together. In 1843 Ticknor invited several prominent friends to join him in a Christmas celebration with a Christmas tree and gift-giving in his Boston home.

Ticknor’s holiday party was not the first Christmas celebration in the United States that featured a Christmas tree. German-American families had brought the custom with them and put up Christmas trees before. However, it was Ticknor’s social influence that secured the spread and social acceptance of the alien custom of putting up a Christmas tree and exchanging gifts in American society.

The introduction of Santa Claus

‘Santa Claus in Camp,’ from Harper’s Weekly, by artist Thomas Nast. Harris Brisbane Dick Fund, 1929

For most of the 19th century, the celebration of Christmas with Christmas trees and gift-giving remained a marginal phenomenon in American society. Most Americans remained skeptical about this new custom. Some felt that they had to choose between older English customs, such as hanging stockings for presents on the fireplace, and the Christmas tree as the proper place for gifts. It was also hard to find the necessary ingredients for this German custom: Christmas tree farms first had to be created, and ornaments needed to be produced.

The most significant steps toward integrating Christmas into popular American culture came in the context of the American Civil War. In January 1863 Harper’s Weekly published on its front page the image of Santa Claus visiting the Union Army in 1862. This image, which was produced by the German-American cartoonist Thomas Nast, represents the very first image of Santa Claus.

‘Santa Claus and His Works,’ from Harper’s Weekly, Dec. 25, 1866. Artist Thomas Nast, HarpWeek

In the following years, Nast developed the image of Santa Claus into the jolly old man with a big belly and long white beard as we know it today. In 1866 Nast produced “Santa Claus and His Works,” an elaborate drawing of Santa Claus’ tasks, from making gifts to recording children’s behavior. This sketch also introduced the idea that Santa Claus traveled in a sledge drawn by reindeer.

Declaring Christmas a federal holiday and putting up the first Christmas tree in the White House marked the final steps in making Christmas an American holiday. On June 28, 1870, Congress passed the law that turned Christmas Day, New Year’s Day, Independence Day, and Thanksgiving Day into holidays for federal employees.

And in December 1889 President Benjamin Harrison began the tradition of setting up a Christmas tree at the White House.

Christmas had finally become an American holiday tradition.

Thomas Adam, Associate Professor of International and Global Studies, University of Arkansas

This article is republished from The Conversation under a Creative Commons license.