I have a highly sophisticated method for finding articles with bad science on the web: I get on Facebook and see what comes across my feed. It never fails: within about thirty minutes, I’ll find something that makes me stop and scratch my head, or occasionally, hold my head in my hands in psychic pain. Recently I came across a rather old article being circulated as new by anti-vaxxers. It was from the Daily Mail, and the breathless headline read,
‘Perhaps we now have the link between vaccination and autism’: Professor reveals aluminium in jabs may cause sufferers to have up to 10 times more of the metal in their brains than is safe
Quite the strong claim. Often, newspapers will make incredibly strong claims not actually contained in the research, so naturally I decided to read the article to find the study in question. I was immediately hit with something odd. The byline starts, “Professor Christopher Exley for the Hippocratic Post” and very close to the beginning of the article we have,
Study author Professor Chris Exley from Keele University, said: ‘Perhaps we now have the link between vaccination and autism spectrum disorder (ASD), the link being the inclusion of an aluminium adjuvant in the vaccine.’
This looks a lot more like a press release than journalism. I have nothing against press releases in principle. I could see myself happily issuing news of my work and spending lots of time explaining my findings. I just wouldn’t expect a journalistic outlet of any quality to let me essentially write their article on it. (The implication here being that the Daily Mail is not a journalistic outlet of any quality, in case that’s not clear.)
But the study is open access, so let’s all read it. There are a lot of ways to read a study; I start with the abstract, then immediately move to the conclusion, because I like to be very clear about what the paper is trying to prove.
We have made the first measurements of aluminium in brain tissue in ASD and we have shown that the brain aluminium content is extraordinarily high. We have identified aluminium in brain tissue as both extracellular and intracellular with the latter involving both neurones and non-neuronal cells. The presence of aluminium in inflammatory cells in the meninges, vasculature, grey and white matter is a standout observation and could implicate aluminium in the aetiology of ASD.
Breaking this down, Mold et al. are saying:
- They measured aluminum in the brain tissue of people diagnosed with ASD (Autism Spectrum Disorder).
- That the levels are “extraordinarily high.”
- That the aluminum is both inside of brain cells and outside of the cells in the brain matter.
- That the location of the aluminum in certain types of cells could suggest aluminum causes ASD.
- Aluminum in vaccines is a possible cause of ASD.
Technically, the fifth claim is not made in the paper itself but by the supervisor of the lab, who issued what I’m calling a press release. We’ll examine it anyway.
So how do they try to prove these claims? I’ll keep it simple, since this blog is written with non-scientists in mind. They use two techniques: Atomic Absorption Spectroscopy (AAS) and fluorescence microscopy. Searching the literature, AAS seems to have a long history going back to at least the eighties for analyzing metals in brain tissue. I’m not going to question its suitability here, since I’m not an expert in making tissue measurements. For all I know, it’s the gold standard for these kinds of measurements in the field. I have performed AAS before, just not to measure metals in tissue.
The basic principle of AAS is this: the electrons around an atom occupy very specific energy levels, and those levels differ from element to element. By hitting a sample with light of very specific wavelengths, atoms of the target element can be promoted to excited states. Atoms of other elements let the light pass through, because it’s not at the right energy to interact with them. By looking at how much light the sample absorbed (because some of the light was “used up” exciting the target atoms), we know how much of that element was in the sample. Because the wavelengths have to be so specific to the element you’re looking for, this method is usually very good for certain kinds of elemental analysis.
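To make the quantification step concrete: in practice, an AAS instrument is calibrated by measuring standards of known concentration, fitting a line (absorbance is roughly proportional to concentration at low concentrations), and reading unknowns off that fit. Here’s a minimal sketch of the idea; every number below is invented for illustration and has nothing to do with the paper’s actual calibration.

```python
import numpy as np

# Hypothetical calibration standards: known aluminum concentrations (µg/L)
# and the absorbance the instrument reported for each. Invented values.
std_conc = np.array([0.0, 10.0, 20.0, 40.0, 80.0])
std_abs = np.array([0.002, 0.051, 0.099, 0.201, 0.398])

# Fit absorbance = slope * concentration + intercept: absorbance is
# approximately linear in concentration over a well-chosen range.
slope, intercept = np.polyfit(std_conc, std_abs, 1)

def absorbance_to_conc(a):
    """Invert the calibration line to estimate a sample's concentration."""
    return (a - intercept) / slope

# A sample reading that falls inside the calibrated range:
print(round(absorbance_to_conc(0.150), 1))  # ~30 µg/L with these made-up standards
```

The quality of that fitted line (and whether the sample falls inside the range of the standards) is exactly the kind of information a reader needs to judge the reported numbers.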
After reading the conclusion, I jumped straight to the data. Here, we can strike claim two off the list right away: that the levels of aluminum are “extraordinarily high.” The levels can’t be high, low, or normal, because we don’t have a baseline. The problem is that the researchers have data for five brains from people who were diagnosed with ASD… and that’s it. We’re missing our control specimens. Experiments like this need controls: we need samples of brain tissue from people known not to have been diagnosed with ASD, so that we have something to compare our results to. In principle, it should not be hard to get an IRB (an ethical review board) to approve the use of brain tissue from people with no known history of ASD. If control specimens could not be obtained, the authors owe us an explanation of why, or of why controls aren’t needed. Often, if you don’t have the data or samples you need to run an experiment, you just have to wait to publish until you do. That’s not ideal for a scientist, but it’s the responsible thing to do. The authors say they based their ideas on a different paper that studied numerous brains, but they don’t do any kind of direct, age- or gender-matched comparison, and that study has many of the same methodological problems as this one.
Specifically, they have really wide standard deviations in their data. When you run a sample in AAS, you usually perform the measurement in triplicate, meaning three times per sample. You take the average of the three values and calculate the standard deviation. For those of you who aren’t that into statistics, standard deviation tells you how spread out the data is. For certain kinds of data, AAS data included, you want your standard deviations to be as small as possible. If you look at the paper (which again, is open access, so you can download it and see for yourself), the data is reported exactly this way: the mean of three replicates, plus a standard deviation. Yet the data is full of unexplained inconsistencies. For the same sample, the three readings are often very different, leading to high standard deviations. Some variation is expected, due to various imperfections in the instrumentation and conditions, but it shouldn’t vary that wildly.
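Here’s what that summary looks like in miniature, with invented replicate readings (not the paper’s data): a tight triplicate gives a small relative standard deviation, while a wildly inconsistent one produces exactly the kind of spread that should send you back to troubleshoot, not onward to publish. The 15% flag threshold is my own arbitrary choice for illustration.

```python
import statistics

# Three replicate readings per sample (invented values).
samples = {
    "tight":  [2.10, 2.05, 2.12],   # consistent triplicate
    "spread": [0.80, 2.90, 5.10],   # wildly inconsistent triplicate
}

for name, reps in samples.items():
    mean = statistics.mean(reps)
    sd = statistics.stdev(reps)            # sample standard deviation (n-1)
    rsd = 100 * sd / mean                  # relative SD, as a percentage
    flag = "REVIEW" if rsd > 15 else "ok"  # 15% cutoff is arbitrary here
    print(f"{name}: mean={mean:.2f} sd={sd:.2f} rsd={rsd:.0f}% {flag}")
```

With the “spread” triplicate, the standard deviation is most of the mean; reporting that mean as a measurement hides how little the three readings agree.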
This isn’t the kind of data you can draw any real conclusions from. If I were running that experiment, I would stop to figure out why my results were so inconsistent. Is the instrument malfunctioning? Are my samples too dilute, or not dilute enough (AAS is only accurate in certain concentration ranges)? Am I using the right fit model for my calibration data? Are my samples prepared properly? I have a lot of theories about why the data looks the way it does, ranging from pH issues making some readings artificially low (aluminum can drop out of solution at certain pHs), to matrix effects making some readings artificially high. But these would all be pure speculation on my part. I don’t have the information I need to speculate more intelligently, because the authors chose not to discuss the most troubling parts of their data. Even their supplementary information doesn’t include calibration data, which isn’t always included, but with results like these, it should be.
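One of those troubleshooting checks is simple enough to sketch: a calibration is only trustworthy between the lowest and highest standards actually measured, so a sample reading outside that window is an extrapolation, and the sample needs to be diluted (or concentrated) and rerun. The absorbance limits below are invented values for illustration, not anything from the paper.

```python
# Hypothetical calibration window: the absorbances of the lowest and
# highest standards that were actually measured (invented values).
CAL_ABS_MIN, CAL_ABS_MAX = 0.002, 0.398

def in_calibration_range(sample_abs):
    """A reading outside the standards' range is an extrapolation,
    not a measurement, and shouldn't be reported as-is."""
    return CAL_ABS_MIN <= sample_abs <= CAL_ABS_MAX

print(in_calibration_range(0.150))  # True: inside the window
print(in_calibration_range(0.750))  # False: above the top standard, dilute and rerun
```

Without the calibration data in the supplement, readers can’t perform even this basic sanity check on the reported numbers.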
This is a strike against claim 1, that they measured the amount of aluminum in the brains of people diagnosed with ASD. I have no idea what they measured with any certainty.
I don’t know much about fluorescence microscopy, so it’s very likely I could end up schooled here. They stained brain samples with a substance called lumogallion, which binds to aluminum and glows under a special microscope designed for this purpose. But when I looked up lumogallion, it turns out it binds to multiple metals, not just aluminum. So the test is neither quantitative (it can’t tell us how much aluminum is in the images) nor specific (it can’t tell us that what we’re looking at is absolutely aluminum and not something else); it’s qualitative (we can infer there is probably aluminum in the images). With better AAS data to buttress it, the microscope data would put them in a stronger position, but with the data they have, I don’t find the microscope data that convincing. However, I’ll be kind and say they proved claim 3, simply because I don’t know enough about this specific method to say for certain what the results really mean.
We’re left with the following conclusion: There was some aluminum in the brains of 5 people who were diagnosed with ASD.
That’s it. It’s impossible to build the house of “Aluminum causes ASD” on these foundations. Even if this study were well conducted, claim 5 would need a lot more proof. For instance, aluminum intake from food (and therefore, very likely, from expressed breast milk) dwarfs anything found in vaccines. Aluminum is quite abundant on this planet we call home, so even if there were a real causal relationship between aluminum and autism, the primary source of that aluminum would not be vaccination.
Then we get to the last thing I read in any study: conflicts of interest and funding. Some people say this should be the very first thing you read in a study, but I disagree. While funding can indicate a bias, a bias is not in itself a reason to dismiss good data or well-conducted research. In other words, a bias does not by itself make someone wrong, and a lack of bias does not by itself make someone right. The issue here is that the research does not appear to be well conducted. Still, we can glean some useful information from the identity of the funder: the Children’s Medical Safety Research Institute. Reading up on them, they hardly seem to be a benign source of funding. Apparently they will fund any research that “proves” vaccines are unsafe, but don’t seem to have high standards for how that research is conducted.
Meanwhile, this bad paper from 2017 is still being circulated as “news” a year later. People are using it as “proof” that vaccines cause autism. This is what’s really troubling about this paper: its reach exceeds its scholarly merit, in no small part due to the publicity the researchers sought for themselves. Scientists don’t just have an ethical obligation to other scientists; we have one to society as well. If you’re going to make strong claims linking autism to vaccination, you have an obligation to back them up with strong evidence. People will absolutely take even the most preliminary study on an issue like vaccination and act on it. This paper doesn’t bring us closer to the truth; in fact, it muddies it. Even as I write this, the Daily Mail article is probably still making the rounds, and most people aren’t going to download the study, much less dig into the data. This research will carry far more weight in the realm of public opinion than the data justifies. That’s the real shame of it.