Comment on “Do aluminum vaccine adjuvants contribute to the rising prevalence of autism?”

Posted Jul 04 2013 6:46pm

As a graduate student I once watched a speaker at a conference give a clearly bad talk. My advisor was next to me, and when the talk was opened for questions I asked him why he wasn't pointing out the major flaws in the study. "Why bother?" was the response. The study was so bad that one didn't need to comment.

That's how I felt when a paper came out two years ago alleging a link between autism prevalence and aluminum in vaccines. The paper, "Do aluminum vaccine adjuvants contribute to the rising prevalence of autism?", made a great effort to present itself as rigorous. It was a bit slicker than, say, your average Geier paper, but still bad.

The authors of the new study, Tomljenovic and Shaw, took U.S. special education data to create a time trend and data from various countries to compare by geography. The idea of comparing various countries to implicate vaccines isn't new. Generation Rescue tried it as well in an unpublished and unsigned "special report".

Here's what both groups do: take prevalence data from various countries and compare it to each country's vaccine schedule. In the case of Generation Rescue, they ignored the fact that the data are for vastly different birth cohorts (for example, some data are for kids born in the 1970s, and these are compared directly to data for kids born in the 1990s). In the case of the Tomljenovic and Shaw paper, they made minor modifications to their aluminum exposure data to account for the fact that kids born in different birth cohorts would have had dramatically different vaccine exposures. It was a weak attempt at best, and it assumes that the only cause of rising prevalence is the exposure they are studying.

In short, both are junk science.

When the recent study from Iceland came out, I was reminded of the Tomljenovic and Shaw paper. One of the countries they included in their study was Iceland. And, as I'll point out below, not only did they use really old data for Iceland, but they made another very sloppy error at the same time. With that stuck in my memory, it was time to write up this comment.

Here are the prevalence data by country from their paper:

[Figure: autism prevalence by country, as cited by Tomljenovic and Shaw]

Yes, they used a prevalence for Iceland of 12.4/10,000. Iceland, we now know, has a prevalence roughly ten times higher. That's the sort of mistake you get from using old prevalence numbers. But there's more.

Consider Sweden. Tomljenovic and Shaw quote a prevalence of 53.4/10,000, using this study as their citation: Brief report: "the autism epidemic". The registered prevalence of autism in a Swedish urban area. Here's the abstract from that study:

The objective of this study was to establish rates of diagnosed autism spectrum disorders (ASDs) in a circumscribed geographical region. The total population born in 1977-1994, living in Göteborg Sweden in 2001, was screened for ASD in registers of the Child Neuropsychiatry Clinic. The minimum registered rate of autistic disorder was 20.5 in 10,000. Other ASDs were 32.9 in 10,000, including 9.2 in 10,000 with Asperger syndrome. Males predominated. In the youngest group (7-12 years), 1.23% had a registered diagnosis of ASD. There was an increase in the rate of diagnosed registered ASD over time; the cause was not determined. The increase tended to level off in the younger age cohort, perhaps due to Asperger syndrome cases missed in screening.

They used the full cohort, birth years 1977 to 1994, giving an intermediate prevalence figure. In their study, they compare this to a U.S. study with a prevalence of 110/10,000 whose kids were born between 1990 and 2004. Not a great comparison of cohorts. What makes it clearly cherrypicking on the part of Tomljenovic and Shaw is that they could have used a Swedish prevalence for a cohort with birth years 1989 to 1994. Still not a perfect match, but closer. That cohort had a prevalence of 123/10,000.

To put it simply, had Tomljenovic and Shaw used the younger Swedish cohort, they would have had Swedish kids with a lower aluminum exposure (due both to the earlier birth years relative to the U.S. cohort and to differences between the U.S. and Swedish vaccine schedules) but a higher autism prevalence.
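To make the cohort mismatch concrete, here is a minimal sketch that lays the figures cited above side by side. The prevalence numbers and birth years are the ones quoted in this post; the script itself is only an illustration and is not from the paper.

```python
# Prevalence figures cited in this post, per 10,000, with birth years.
# The point: with the better-matched Swedish cohort, Sweden's prevalence
# exceeds the U.S. figure despite a lower aluminum exposure.
cited_figures = [
    ("Sweden, full cohort (the one Tomljenovic and Shaw used)", "1977-1994", 53.4),
    ("Sweden, younger cohort (available, but not used)", "1989-1994", 123.0),
    ("U.S. study they compared against", "1990-2004", 110.0),
]

for group, birth_years, prevalence in cited_figures:
    print(f"{group:58s} born {birth_years}: {prevalence:6.1f}/10,000")
```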

Cherrypicking. They ignored the data that clearly goes against their theory.

There is some very sloppy data analysis behind their value for Iceland. They quote a prevalence for Iceland of 12.4/10,000. Here's part of the abstract from that study.

This clinic-based study estimated the prevalence of autism in Iceland in two consecutive birth cohorts, subjects born in 1974-1983 and in 1984-1993. In the older cohort classification was based on the ICD-9 in 72% of cases while in the younger cohort 89% of cases were classified according to the ICD-10. Estimated prevalence rates for Infantile autism/Childhood autism were 3.8 per 10,000 in the older cohort and 8.6 per 10,000 in the younger cohort.

Do you see 12.4/10,000 in there? As near as I can tell, they added the prevalence values from the two cohorts (3.8 + 8.6 = 12.4), which is meaningless: a combined prevalence would be a population-weighted average of the two rates and would have to fall between 3.8 and 8.6. Maybe they arrived at 12.4 by some other method. It doesn't matter; the number is wrong. And, as we now know, it has been updated to 120/10,000.
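For what it's worth, here is a minimal sketch of how a combined prevalence for the two cohorts would actually be computed: as a population-weighted average of the two rates. The cohort sizes below are hypothetical placeholders for illustration, not figures from the Icelandic study.

```python
def pooled_prevalence(rates_per_10k, cohort_sizes):
    """Population-weighted prevalence per 10,000 across several cohorts."""
    cases = sum(rate / 10_000 * size for rate, size in zip(rates_per_10k, cohort_sizes))
    return cases / sum(cohort_sizes) * 10_000

# Rates reported for the 1974-1983 and 1984-1993 Icelandic birth cohorts.
rates = [3.8, 8.6]

# Hypothetical, roughly equal cohort sizes, for illustration only.
sizes = [40_000, 40_000]

print(pooled_prevalence(rates, sizes))  # 6.2, between 3.8 and 8.6, nowhere near 12.4
```

However you weight the two cohorts, the pooled figure lands somewhere between the two rates; simple addition never gives a valid combined prevalence.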

So, for their international comparison, both the Iceland and the Sweden prevalence numbers are wrong, one of them through clear cherrypicking of data. The conclusions they draw from those data are therefore wrong as well.

What about the other part of their study, using special education data to show that as aluminum exposure from vaccines increased, so did autism prevalence in the U.S.? First, special education data are very problematic: they don't really represent autism prevalence. Jim Laidler spelled this out in his paper in Pediatrics: US Department of Education data on "autism" are not reliable for tracking autism prevalence.

But let's ask a simple question. Tomljenovic and Shaw used one collection of data (variation by country) to show how geographic variation in autism prevalence supposedly correlates with aluminum exposure, but another set of data (U.S. special education data) to show time trends. Why? U.S. special education data include geographic variation. So, for that matter, do the CDC reports on autism. For example, in the 2012 report, U.S. prevalence varied from 48/10,000 in Alabama to 212/10,000 in Utah: roughly a factor of four difference, with little variation in vaccine uptake.

To put it another way, had Tomljenovic and Shaw used special education data or CDC reports for their geographic comparison, they would have had to report a completely different answer than they did.

Their study was funded by private foundations. Namely, “This work was supported by the Katlyn Fox and the Dwoskin Family Foundations.” Claire Dwoskin is a former board member of the self-named “National Vaccine Information Center”, which is heavily biased against vaccines. Ms. Dwoskin herself is heavily biased against vaccines, having stated that “Vaccines are a holocaust of poison on our children’s brains and immune systems.”

If a pharmaceutical company had funded a study this poor and this clearly biased toward the interests of the funder, there would be a very rightful outcry. Here we have a direct parallel: very poor research, clearly biased, and matching the interests of the funding agency. Yet those promoting the idea that vaccines cause autism welcomed and continue to promote this study.

Frankly, had I funded this work, it would have been the last time Tomljenovic and Shaw saw a dime from me. Not because of the answer, but because this effort so clearly cherrypicked results and produced such a clearly biased answer. Tomljenovic and Shaw, however, have continued to receive support from at least the Dwoskin Family Foundation.


By Matt Carey

