Then there’s Munder (2013), which is a meta-meta-analysis on whether meta-analyses of confounding by researcher allegiance effect were themselves meta-confounded by meta-researcher allegiance effect. He found that indeed, meta-researchers who believed in researcher allegiance effect were more likely to turn up positive results in their studies of researcher allegiance effect (p < .002).

Everything about it is a delight. The layers of meta-analysis. The English noun-phrase-constructing rules that permit the construction of a sentence in which the prefix "meta-" appears five times, variously modifying words which themselves are modifying other "meta-"-modified words.
I wonder if the same researcher bias/confounding exists in fields where the experiments are entirely done on computers. Can researchers' belief in the effectiveness of certain machine learning techniques affect their experiments? What about physics simulations? I don't see how, but of course I deeply believe in the inviolable sanctity of mathematics. This is an opinion founded in my acknowledged bias. Maybe coders would self-sabotage by writing bad code, so that experiments run slower? ... but in the end this wouldn't affect the actual outcome, just the agony and feasibility of running the experiment many times.
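Here's a toy sketch of that last point (a made-up example, nothing to do with any real study): two functions that compute the same error metric, one written straightforwardly and one "sabotaged" into quadratic time. The results come out bit-for-bit identical; only the wall-clock time suffers, which is exactly the agony-and-feasibility cost rather than a change in outcome.

```python
import time


def mean_squared_error(y_true, y_pred):
    """Straightforward one-pass implementation, O(n)."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)


def sabotaged_mean_squared_error(y_true, y_pred):
    """Same arithmetic, but needlessly recomputes the running sum from
    scratch at every step, O(n^2). The final answer is identical because
    the additions happen in the same order; only the runtime changes."""
    total = 0.0
    for i in range(len(y_true)):
        total = sum((y_true[j] - y_pred[j]) ** 2 for j in range(i + 1))
    return total / len(y_true)


if __name__ == "__main__":
    y_true = [float(i % 7) for i in range(2000)]
    y_pred = [float((i + 1) % 7) for i in range(2000)]

    start = time.perf_counter()
    fast = mean_squared_error(y_true, y_pred)
    fast_t = time.perf_counter() - start

    start = time.perf_counter()
    slow = sabotaged_mean_squared_error(y_true, y_pred)
    slow_t = time.perf_counter() - start

    print(f"honest code:    {fast:.6f} in {fast_t:.4f}s")
    print(f"sabotaged code: {slow:.6f} in {slow_t:.4f}s")
    print("identical results:", fast == slow)
```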
On a larger scale, I am supremely happy that scientists are using their scientific reasoning to criticize the very practice of science itself. In the same way that I frequently remind myself that the basis of the field studying privacy is "trust no one"*, it would be nice to have big science conferences where we all get together and just shake our heads at how unreliable the current practice of science is. Apparently. I mean, check out this conclusion:
But rather than speculate, I prefer to take it as a brute fact. Studies are going to be confounded by the allegiance of the researcher. When researchers who don’t believe something discover it, that’s when it’s worth looking into.

... which sounds convincing.
But.
You know what?
I'm skeptical.
This post's theme word is obverse, "the more conspicuous of two alternatives or cases or sides." The skeptic and his obverse performed a coordinated, randomized, double-blind study.
*Or, as I memorably put it during a job interview, "We've known for a long time that almost everything is impossible."