Try it at home!

A critical analysis of the latest cellphone safety scare

In which we describe how we decided not to cover the newest cellphone-cancer study.

Don't abandon the Internet yet!

Last night, a fellow editor emailed me a link to yet another study purporting to show that cellphone use could be associated with cancer. This one was worth a closer look, however, because it reported an increase in a specific cancer: the same type that was elevated in a problematic US government study.

A quick glance at the study identified significant issues with its primary conclusion. Normally, at this point, the decision would be to skip coverage unless the study picked up unwarranted attention from the rest of the media. (See: Scott Kelly's DNA). But in this case, we thought we'd describe how we went about evaluating the paper, since it could help more people identify similar issues in the future.

Background checks

The first step in evaluating a scientific paper is to get ahold of a copy. Fortunately, this one has been placed online by the Environmental Health Trust, an organization that consistently promotes the idea that cellphones create health risks. The Trust's involvement shouldn't be seen as either a positive or a negative: it has promoted very low-quality material in the past, but it would undoubtedly promote higher-quality studies if those agreed with its stance.

The study itself has been accepted for publication, which means it has been through peer review. It will appear in a journal called Environmental Research, so the next question is the quality of that journal. Plenty of significant research is published in lower-profile journals, but the rise of online publishing has also spawned a host of predatory publishers that will print anything as long as the authors pay a publication fee.

Environmental Research isn't one of those; it's published by Elsevier, a large publishing house that has been handling science journals for decades. And this particular journal has been around since the 1960s. A journal's "impact factor" is an imperfect measurement of whether the articles it publishes end up influencing other research. Environmental Research's score is typical of a decent-quality journal that serves a specialized audience.

All of this suggests that this cellphone paper probably went through some decent peer review, so it shouldn't be dismissed out of hand.

Next in the evaluation process, we typically give the author list a quick look. Something that is a bit unusual here is that every author comes from the same institution, Italy's Istituto Ramazzini. An author list this large typically involves a collaboration among several research centers, but the single affiliation makes it relatively easy to check into the Ramazzini's background. It turns out the Institute is widely recognized for its cancer studies, which it has been doing for decades. There has been some controversy over some of the organization's conclusions, as well as arguments in Congress about whether US government funds should be going to a foreign institution. But in at least one case, outside experts called in by the US government found that at least some of the Ramazzini's work was scientifically valid.

Adding all of this up, this looks like a paper that should be taken seriously. So that's what we'll do.

Stats vs. numbers

Like other studies of its kind, this new research involves long-term exposure of rats to cellphone signals. As in the US government study, the exposures are unusually long (19 hours a day), but the doses are much lower, similar to what a person might actually experience. The study also uses a very large number of animals (nearly 2,500 in total), which should provide good statistical power. So far, so good.

But things start to go wrong in the abstract. There, the authors of the paper talk about three increases in the incidence of cancer in animals exposed to cellphone radiation. But two of these weren't statistically significant, meaning that, if exposure had no effect, a difference that large would still turn up by chance more than five percent of the time. If we're going to allow non-significant changes into the conclusions, then the data would just as easily support reporting that cellphones reduce the risk of cancer in some of the experimental groups.
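To make that threshold concrete, here's a minimal simulation sketch. The background tumor rate, group size, and excess below are made up for illustration rather than taken from the paper; the code simply asks how often chance alone produces an excess of a given size when exposure has no effect at all.

    import numpy as np

    # Hypothetical numbers chosen only to illustrate the idea of a
    # significance threshold; these are not the paper's data.
    rng = np.random.default_rng(0)
    baseline_rate = 0.01     # assumed background tumor rate
    group_size = 200         # assumed animals per group
    observed_excess = 2      # suppose the exposed group had 2 more tumors

    # Simulate many pairs of groups in a world where exposure does nothing.
    trials = 100_000
    control = rng.binomial(group_size, baseline_rate, size=trials)
    exposed = rng.binomial(group_size, baseline_rate, size=trials)
    chance = (exposed - control >= observed_excess).mean()

    print(f"Fraction of no-effect trials with an excess this large: {chance:.1%}")

If that fraction comes out above five percent, an excess of that size is the kind of thing chance produces routinely, which is what "not statistically significant" means in this context.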

That's bad. But there's still one significant increase in cancer in their data, so let's look more closely at that: "A statistically significant increase in the incidence of heart Schwannomas was observed in treated male rats at the highest dose." For this type of cancer, the control group of 817 rats developed four tumors. But critically, all of those tumors occurred in females; none appeared in males. This sex imbalance will necessarily exaggerate the apparent impact of any tumors that turn up in the male experimental populations.

And that's exactly what you see happening. In one female population, 2.2 percent of the animals developed this type of tumor, but that was not a statistically significant result. By contrast, in the male population with the significant difference, only 1.5 percent of the animals developed these tumors. A low-dose group of males had the same number of tumors, but that group was larger, so its result fell short of significance.

These numbers suggest that the one statistically significant effect seen in this study is driven by the unusually low tumor incidence in the male control group, rather than by a specific effect of cellphone radiation.
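To see how a tumor-free control group can push a modest increase over the line, here's a rough sketch using Fisher's exact test. The group sizes are hypothetical round numbers chosen to roughly match the percentages quoted above; they are not the paper's actual counts.

    from scipy import stats

    # Hypothetical group sizes, not the paper's actual counts.
    # Each row of the 2x2 table is [animals with tumors, animals without].

    # Males: assume 3 tumors in a 200-animal exposed group (1.5 percent)
    # versus 0 tumors in a 400-animal control group.
    _, p_male = stats.fisher_exact([[3, 197], [0, 400]])

    # Females: assume 5 tumors in a 225-animal exposed group (about 2.2 percent)
    # versus 4 tumors in a 400-animal control group (about 1 percent).
    _, p_female = stats.fisher_exact([[5, 220], [4, 396]])

    print(f"male comparison:   p = {p_male:.3f}")    # below 0.05 for these counts
    print(f"female comparison: p = {p_female:.3f}")  # well above 0.05

With these assumed numbers, the male comparison clears the conventional 0.05 cutoff only because its control group happened to have no tumors at all, while the female comparison, despite a larger relative increase, does not.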

As we mentioned above, the normal response to a study like this would be to simply ignore it unless it became widely discussed. But walking through the process we use to decide to ignore it should give you a sense of how we choose what to cover when it comes to scientific studies at Ars. And if you try this method at home, it can also help you figure out which results deserve your attention.
