As far as I can tell, the epistemological origin of the antivax movement is the human drive to find a reason for things that happen. There was never a scientific basis. They latched onto one study that suggested a link between vaccines and autism, but it's not like they had some sophisticated meta-analysis that led them to believe that study was more reliable than others. It just provided a narrative that gave them the answer they were seeking, while the alternative did not.
There could hypothetically have been a basis like the one you're talking about, but it seems evident from the movement itself that this didn't happen in our version of reality.
That's the point: that we have trained people to distrust science because most of the "results" people are exposed to are bad popular articles reporting on sketchy studies from Psychology and Medicine that tend to be "overthrown" every few years: it is difficult to go to people and say "no, seriously: vaccines work" when they can retort "that's what you said about red wine, and just today I read an article about how that was all bunk... in another few years everyone is going to be sorry they listened to you guys about these vaccine things you all love so much". :(
> That's the point: that we have trained people to distrust science
I am not sure science is an article of faith to be trusted or not. Science is a continuous process of reviewing evidence and making hypotheses based on that evidence. It is not a mathematical proof.
Sure, there are "bad popular articles" that describe causal relations enthusiastically. However, the pipeline of bullshit flows backwards too. There are plenty of research papers out there where the results may not be fudged, but the overenthusiastic language can make them look like magic to the layman's eye.
Science is one part process and one part people: you first have to trust the process to provide useful evidence, and you second have to trust the people to tell you what they did honestly. Psychology and Medicine are "problem fields" in both areas: the process itself runs into ethical challenges that result in very few real "experiments" (leading to lots of correlations with difficult hidden variables, nothing at all like "causal relations"), and the people involved in the reporting process are also much more likely to blow things out of proportion as these are areas of extremely broad interest.
The result is that when Geophysics says something important the general populace doesn't believe them (which is why "trust" is the right word to be using here), because their mental model of Science is built out of "scientists" making lots of firm contradictory statements, and especially many situations they remember where what was said just a few years ago was entirely "overthrown" by the new research of the week. If you actually listen to the people who refuse to believe in things like vaccines or climate change, they seem to just think scientists are silly people who think they know more than they actually do :(.
> Science is one part process and one part people: you first have to trust the process to provide useful evidence, and you second have to trust the people to tell you what they did honestly.
I spent a few years in a heavy math PhD program before dropping out. The only two things I learnt:
1. Be skeptical of all statistics out there that you cannot personally replicate.
2. See 1.
The pressure to publish is real. The pressure to do something for the sake of novelty is real. The process becomes shakier and shakier as we go down the chain from Pure Math to Engineering and so on.
If there are Numerical Analysis papers with over 100 citations with basic linear algebra errors (that was a fun fucking two weeks of head scratching), why on earth would I trust the process somewhere downstream written by people who took a few grad courses in statistics?
Note that I am not taking sides here. I have no interest in one or the other part of the vaccine lobby. I just find it weird when people use terms best left to religion like 'faith' when it comes to scientific research.
> If you actually listen to the people who refuse to believe in things like vaccines or climate change, they seem to just think scientists are silly people who think they know more than they actually do
I think the problem comes from misconceptions about the fundamental nature of science. Scientists are not Gods who are bringing you truth from some mystical truth well. It is a continuous process prone to errors that eventually evolves into getting us closer to understanding processes. If people take that as a sign that everything is junk instead of having a healthy skepticism and reasoning for themselves, they have themselves to blame. Can't argue with stupid.
> It is a continuous process prone to errors that eventually evolves into getting us closer to understanding processes. If people take that as a sign that everything is junk instead of having a healthy skepticism and reasoning for themselves, they have themselves to blame. Can't argue with stupid.
A couple points:
1) The average person probably lacks the necessary background (knowledge of statistics, methodological concerns, etc.) to independently evaluate scientific research. So blaming them for not going and evaluating the research themselves is a bit unreasonable.
2) You also have to realize that this is not an abstract question for new mothers. When you're injecting something into the precious bundle of joy they carried in their bodies for 9 months, asking them to trust the current conclusions of a "continuous process prone to errors" is not going to be the easiest thing. Accepting the scientific consensus, one which you yourself admit is no magical source of truth, is a leap of faith for these parents; they're committing their child to a medical procedure based on that consensus.
And they should take that leap of faith, in my opinion. But let's not pretend like faith doesn't enter into the equation just because it's science.
You make some fair points. I am not sure what the right answer is, then. I agree that the average lay person is probably not going to know how to reason on these matters. However, if we are building faith as an abstraction, how do we deal with the fact that the abstraction is leaky? There will be errors. How do we communicate that the errors don't necessarily mean that the system as a whole is broken, but that they are the nature of the system, whilst maintaining this abstraction of faith?
It's definitely complex and may never be completely solved. The biggest thing in my mind is learning how to talk to these people; if someone is trying to make sense of competing opinions with no good standard to judge them by, going up to them and saying "Fuck you idiot, clearly we're right and you're stupid for thinking otherwise" is not productive (granted, this is a straw-man, but some of the vitriol I've seen comes pretty close). You run the risk of creating a serious blowback effect, where your condescension drives them further away from accepting what you're trying to tell them.
It's like a Chinese finger trap: if you want to convince an anti-vaxer, you can't treat them like they're crazy, even if you have an impulse to judge. You have to be able to meet them halfway, and from that point get them to understand the safety profile of vaccines. You don't have to throw numbers in their face, but there has to be some way of speaking to them as a fellow human being: "I know you're scared and you don't know if this is safe, but we don't want your child being hurt either. We've done an incredible amount of testing to make sure vaccines are safe, and we firmly believe they will keep your child as healthy as possible, as well as improving the overall health of the community."
Of course, this is just my opinion. But I really think that empathy will make it much easier to mend these kinds of fissures in our beliefs.
> if you want to convince an anti-vaxer, you can't treat them like they're crazy, even if you have an impulse to judge.
> You run the risk of creating a serious blowback effect, where your condescension drives them further away from accepting what you're trying to tell them.
This is exactly where I am coming from too. In my few years in the programming community, one thing I have noticed is a large population of people who pursue science as a religion. (I mentally tag them the r/atheism crowd.) When you treat something as canon and get in people's faces about why they are stupid and wrong, you are not creating a productive discussion.
I never used the word "faith": not once. You kept using the word "faith": I insisted upon using the word "trust". These terms are related, but are quite different. I guess a really important question becomes: how can one ever remove "trust"? You seem to believe that it is possible to have something that doesn't at some level rely on trust: as far as I can tell that only works if every single person ignores everything everyone else ever says and instead learns everything from scratch via first-hand experimentation (and even then I would maintain that one must trust your equipment, though maybe one could build their own equipment as well). Even if every single person in the world is an expert in statistics: you still have to trust that the things you read and are "verifying" were reported accurately. :/
There is a distinction between trusting a mathematical process and trusting random user "saurik", for example. I don't know you from Adam; I wouldn't trust you to sell me lemonade. On the other hand, if you gave me a recipe for lemonade, I could have the people I trust replicate it for me. I might buy that recipe from you.
If you truly don't trust me in your example, you would be a fool to buy my recipe: you may have people you trust to replicate it, but it turns out that my recipe tastes horrible. You fundamentally cannot get away from having to trust the people whose first-hand accounts you are verifying.
> That's the point: that we have trained people to distrust science because most of the "results" people are exposed to are bad popular articles
IOW, our educational system has failed to train people to distrust -- or even to simply critically evaluate -- popular articles on science, leaving them vulnerable to sloppy misrepresentations of science by outlets that usually have neither the skill nor the interest to properly report on it.
Of course, but a lack of trust in the medical establishment is probably an enabling factor. If they trusted their doctors to make the right decisions for their children, they probably wouldn't be buying into Wakefield's study or quoting Jenny McCarthy. Clearly there is fear and a human desire for easy answers, but I think lack of trust is also a big part of it.