#378      33 min 48 sec
Show me the data: Sifting pseudoscience from the real thing

In a world with a bewildering mix of fact and fiction, and in which social and mainstream media only add to the confusion, how do we separate out false or dubious claims from the well-founded and evidence-based? Research and clinical psychologist Prof Scott Lilienfeld joins science host Dr Andi Horvath to help us distinguish pseudoscience from the real thing by exploring popular myths that too often make fools of us.

"We know lots and lots of people who have pursued medical treatments that have little or no evidentiary basis, no basis in data, and probably have died as a result of them." -- Prof Scott Lilienfeld




Prof Scott Lilienfeld

Scott O. Lilienfeld is Samuel Candler Dobbs Professor of Psychology at Emory University in Atlanta, Georgia (USA). He received his bachelor’s degree from Cornell University in 1982 and his Ph.D. in Psychology (Clinical) from the University of Minnesota in 1990. Dr. Lilienfeld is Editor of Clinical Psychological Science, Associate Editor of the Archives of Scientific Psychology, and President-Elect of the Society for a Science of Clinical Psychology. He is also on the editorial boards of the magazines Skeptical Inquirer, The Skeptic, and Scientific American Mind. Dr. Lilienfeld has published over 350 manuscripts on personality disorders, dissociative disorders, psychiatric classification, pseudoscience in psychology, and evidence-based practices in clinical psychology. His 2010 book, co-authored with several colleagues, 50 Great Myths of Popular Psychology, examines a host of widespread misunderstandings regarding human behavior. In 1998, Dr. Lilienfeld received the David Shakow Award for Outstanding Early Career Contributions to Clinical Psychology from APA Division 12, and in 2012 he received the James McKeen Cattell Award for Career Contributions to Applied Psychological Science from APS.

Credits

Host: Dr Andi Horvath
Producer: Eric van Bemmel
Audio Engineer: Gavin Nebauer
Voiceover: Louise Bennet
Series Creators: Kelvin Param, Eric van Bemmel



VOICEOVER

This is Up Close, the research talk show from the University of Melbourne, Australia.


ANDI HORVATH

Hi, I'm Doctor Andi Horvath, thanks for joining us. Today we get up close to the critical thinking skills we need to scrutinise the claims people make and whether what they're telling us has any probable basis in reality. Specifically, how do we tell science from pseudoscience?  Whilst we live in an information age, we could just as well call it the misinformation age. With endless stories in the press and social media on the latest claims from researchers, how do we separate the reliable or credible from the unsubstantiated or downright false?  It's not just the claims themselves that are the problem: every one of us, even scientists or students immersed in the world of research, is subject to a number of human biases in how we take in and accept the ideas of others.

Our guest on this episode is Scott Lilienfeld who is the Samuel Candler Dobbs Professor of Psychology at Emory University in Atlanta, Georgia. Professor Lilienfeld has published extensively in the areas of personality and psychopathy, but he also does a strong sideline in getting students and others to tell pseudoscience from the real thing. He's authored and co-authored countless articles and among his recent books is 50 Great Myths of Popular Psychology, which examines misunderstandings about human behaviour. Scott Lilienfeld is in Melbourne as a guest of the Melbourne School of Psychological Sciences. Scott, welcome.


SCOTT LILIENFELD

Thanks very much, it's a delight, pleasure to be here.


ANDI HORVATH

Scott, tell us just what is pseudoscience?


SCOTT LILIENFELD

Pseudoscience we think of as fake science, which is different, by the way, from false science, because scientists get things wrong all the time. If you wanted to have a program on the number of things I've gotten wrong in my field, that would be a very lengthy one. So all scientists make mistakes. To me, what makes pseudoscience distinctive, and also I think what makes it somewhat dangerous, is that it may look like the real thing, it may seem to be like science, but in fact it's not quite the same thing. It often lacks the kind of self-correction that we see in typical sciences. Now, science often is not as self-correcting as we'd like, it's often more stagnant than we would prefer, but eventually the scientific community does often self-correct. In pseudoscience we don't often see that kind of self-correction.

There are hundreds, probably thousands, of examples of pseudoscience out there in the popular media. To me a lot of it depends on the claims you make, but what we see in many western countries - but also other countries too - are claims about, for example, extrasensory perception, astrology and UFOs. Some of these claims, for example ESP, may eventually turn out to have some merit, we will see, but I think the problem is that many of these claims are made very strongly without sufficient evidence. I think that is what can be quite misleading to the general public.


ANDI HORVATH

Sure. Does that include things like advertising in the cosmetics industry and the claims that they make?


SCOTT LILIENFELD

It can, absolutely. One example that comes to mind - a bit different from cosmetics, but it relates to this - is companies that make claims about human pheromones: that men can wear perfumes, for example, that contain pheromones that are attractive to other people. There is little or no evidence that humans have pheromones in the way some other animals do. But again, that concept is out there, it's very appealing, and it sounds scientific. It's a very scientific-sounding word, so a lot of people in the general public understandably are drawn to it.


ANDI HORVATH

Sure. When you say there is no evidence, do you mean that the tests have been done and they've been shown not to work, or that tests haven't been done yet?


SCOTT LILIENFELD

Sometimes it's both. Sometimes you see evidence of absence, sometimes you see absence of evidence. When there's absence of evidence - that is, something has not been adequately tested - it's fine to make the claim, but one has to make it in a qualified, careful way. So if one, for example, has a medical procedure, a new pill that has not been sufficiently tested, then I think it's really important to let the general public know: look, this is experimental, there's not adequate data. If you want to take it you're taking it at your own risk. In contrast, if there's something that's been tested a lot in many controlled studies and it's been shown not to work, that I think is an even more grievous example because, again, proponents of these claims have to come clean and let the public know that these claims have not stood up to scientific scrutiny.


ANDI HORVATH

Is there something special about the language of pseudoscience?  Or in the way that pseudoscience uses authority?


SCOTT LILIENFELD

Sometimes there is. One common theme you see in pseudoscientific claims is a use of jargon, what we in our field sometimes call psychobabble. I think what's getting more and more popular is neurobabble. I find neuroscience fascinating, many of us do, but many people in my own field of psychology will often claim the mantle of neuroscience without actually doing the hard work. You often see, for example, claims about various psychotherapies that are activating neural synapses or energy fields or brain networks or what have you. It all kind of sounds scientific. In some cases it might be, but more often than not they're just throwing around terms that may be appealing to people who don't have adequate scientific training - and frankly, even to people like me.

Sometimes I hear those claims and even my attention is drawn to them because they're using scientific sounding terms but when you look more carefully at the claims oftentimes they just don't hold up.


ANDI HORVATH

Often these pseudoscience things have gurus, is that right?


SCOTT LILIENFELD

That's common. Admittedly we're all primates and we're all looking for the alpha male and alpha female. I think that's something that's true in politics, it's true in the business world, and particularly when we are desperate - and I think we've all been desperate about some things - we look for simple answers. It's very appealing to look for some powerful person who seems to have some kind of knowledge. One of my PhD mentors, Paul Meehl, referred to this as the guru omniscience fantasy, this idea that there's some guru out there who knows everything and can solve all the problems for us. But frankly, when things seem too good to be true, as Carl Sagan, the late astronomer, pointed out, they usually are.


ANDI HORVATH

How dangerous is pseudoscience?  Does it have consequences?  Like, can I be parted with my hard-earned cash?  Is it actually dangerous?


SCOTT LILIENFELD

It depends. Undoubtedly some pseudosciences can be innocuous. If someone wants to read an astrology column just for fun and entertainment - look, I've done that too. I still do that sometimes.


ANDI HORVATH
I still do it too.


SCOTT LILIENFELD

Right, that's okay - sometimes it's kind of cute and funny, so long as one realises it's just in fun. So sometimes these things are innocuous. However, in other cases, yes - you mentioned parting with hard-earned cash. There are lots of cases where people have parted with their life savings to talk to self-proclaimed psychics. When it comes to some complementary and alternative medical treatments that are not well supported, yes, those things can be very dangerous, very harmful. Look at Steve Jobs, for example, who passed away from cancer and who pursued a variety of medical treatments that were not well supported.

We know lots and lots of people who have pursued medical treatments that have little or no evidentiary basis, no basis in data, and probably have died as a result of them. Our first president in the US, George Washington, arguably died because of a procedure that did not work. People back then were using blood-letting, so he was drained of massive amounts of blood and died soon afterwards. We don't know that was the only cause, but it probably didn't help. That's probably an extreme example. But in fact we know nowadays lots and lots of people are being harmed by medical treatments, and also psychological treatments in my own field - psychotherapies that do not work.


ANDI HORVATH

Scott, you've been known to say that without scientific, evidence-based thinking, mysticism takes over. So there is a danger in the public not having critical thinking skills.


SCOTT LILIENFELD

Absolutely, and by the way I think it's fine to be mystical so long as that mysticism doesn't drift into scientific claims. So I am probably a little bit different than some of my sceptical friends. There's a bit of a battle in the sceptical community where some people really think that we should be taking on religion and mysticism. I respect that view, but my view's a bit different. I think religion is fine, I think it's important to have some deep respect for some of the world's great religions. But, as soon as religions begin making scientific claims, for example that the earth is only 6000 years old, or the Shroud of Turin is real or something, I think then they become fair game for scientific investigation. There, I think what you see, is that in some cases those claims have not been supported by the evidence as in the case of Creationism and its variants.


ANDI HORVATH

So is pseudoscience on a spectrum with science?  I mean, there's pseudoscience but there's also poorly done sloppy science, and on top of that there's fraudulent research where researchers deliberately fudge their numbers. Or are we talking broadly about the same thing?  How do you make distinctions here, and why?


SCOTT LILIENFELD

Great question, and it's one we struggle with a lot. My own take on this is that we are looking at a dimension, so I don't see a clear-cut bright line between science and pseudoscience. It can get fuzzy around the edges. Because, after all, many people who do pseudoscience often do good work and do good science and, as you point out, lots of scientists will engage in sloppy science and sometimes not report that they're doing so, or make claims that go way beyond the evidence. So there is fuzziness.

However, one point I make in my teaching is that one does not therefore want to use that fuzziness, that murkiness, as an excuse for saying there is no real difference. S.S. Stevens, an American psychologist, basically pointed out that one could argue there is no qualitative difference, no difference in kind, between day and night because, after all, we have dawn and dusk and they shade imperceptibly into each other. But that doesn't mean we can't tell the difference between day and night. I think it's the same way with science and pseudoscience. So when we talk about the clear-cut cases, usually it's not all that difficult to make that distinction.


ANDI HORVATH

Scott, let's talk more about this seductive nature of pseudoscience. Why do I want to believe?  You write about the times where you're driving your car and you seem to hit every red light when you're in a hurry and you sort of think the universe is against you, but it's just coincidence, right?  Obviously our human psychology is a little bit vulnerable to pseudoscience.


SCOTT LILIENFELD

Yeah, I'll give, if I can, two answers to that question. My first answer is that one reason we're drawn to astrology is motivational. We want to believe because the world is a scary place, it's scary for all of us, particularly now in a world where terrorism is on the rise and in some countries crime is on the rise. It's often hard to predict the future. It is often quite reassuring to think that there is some way of predicting the future, being able to look at the position of the stars, for example, and find out what's going to happen tomorrow.

The other set of factors is more cognitive, involving thinking - the way our brains work - and you mentioned the example of thinking all the traffic lights are against us; I've sometimes had that reaction. But more often than not what's happening is that our brains are perceiving patterns that are not there. Our brains are pattern-seeking organs; that's the way we survive.


ANDI HORVATH

So the human brain is prone to seeing patterns and this is important for survival of the species.


SCOTT LILIENFELD

Yes. If we're an early human and we're out on the savanna and we see a lion in a particular area, we want to create a pattern. So we want to remember: gee, the lion was there last time, I'd better not go back to that area. Now, sometimes what ends up happening as a result - and this is a key point - is we make false positive errors. That is, we remember a lion being there when it wasn't there, or we get the place wrong. But that's actually not a bad error to make; it's better to make those kinds of errors, because better safe than sorry. But if we don't commit that area to memory we might actually get eaten.

So our brains are prone to seeing patterns, to creating patterns, even when they're not there. That's an adaptive tendency, but by the same token it can also lead us astray. To me, what makes science such a beautiful tool is that it helps us to distinguish which patterns are real and which patterns are not. Because without science our brains are going to perceive all kinds of patterns out there. Some of them may actually be interesting and real, others may be false. That's what science helps us to do: sift through those patterns and find out which of those are a product of our imagination and which of them are actually out there, as opposed to up here in our heads.


ANDI HORVATH

Is this what you refer to as cognitive bias?


SCOTT LILIENFELD

That's one type of bias. Bias really is any kind of systematic error. We all make errors sometimes, but biases are systematic kinds of errors that we make in particular directions. The kind of thing you're talking about here, in terms of making false positive errors, is indeed a bias. Biases often stem from adaptive sources. One of the great paradoxes of psychology, I think, is that a lot of the mistakes we make, a lot of the most serious errors our brains are prone to, are born out of the same basic machinery that actually helps us to survive. That's why we need science as a way of protecting us and sorting through those different patterns.


ANDI HORVATH

I'm Andi Horvath and you're listening to Up Close. In this episode we're talking about pseudoscience and psychology with Professor of Psychology Scott Lilienfeld. Scott, part of your work is ensuring that students entering the psychological sciences gain a greater awareness of scientific and critical thinking. So what do you tell them about scientific thinking and methodology?


SCOTT LILIENFELD

The first thing I tell them is that we are all fallible and that we are all capable of being fooled and we're all fooled every day, and that none of us is perfect. That science is not a panacea, it's not perfect, but it is the best set of tools that the human race has yet developed to protect itself from error and to get a bit closer to the truth. Maybe one day - I'm doubtful, but I have to be a scientist myself - maybe one day the human race will come up with a better one. All we can say is right now it's our best hope for minimising and overcoming our propensity toward error. I think that's what students need to understand.


ANDI HORVATH

Right. So what are the features of scientific thinking?


SCOTT LILIENFELD

To me, scientific thinking really is a toolbox. One way we often portray science, in my view somewhat inaccurately - and I have to confess I did that too when I first started teaching - is as though it were some kind of uniform method. In fact there's not really a single scientific method; there's a scientific approach. For me, scientific thinking is largely a set of tools for overcoming what we often call confirmation bias: in psychology, confirmation bias being the tendency to seek out evidence consistent with what we believe, and to dismiss, ignore or reinterpret evidence that is not. I think if you look across virtually all, if not all, sciences, what you see is that scientific thinking tools are methods of helping to overcome confirmation bias.

So, for example, in my field of psychology, understanding that one often needs a control group or a comparison group to make a claim. So if you claim, for example, gee, I have this newfangled pill that cures obesity and my Aunt Sally took it and six months later she lost 20 pounds - well, that's nice to know, but was there a comparison group of people who did not take that pill, or took what we call a placebo or dummy pill as a comparison?  Because without that control there's simply no way to know whether your Aunt Sally lost the weight because of the pill. And - as another example of seeing patterns - people often confuse the claim that something happened after an event with the assertion that it happened because of the event. That's very different.


ANDI HORVATH

Right, so correlation and causation.


SCOTT LILIENFELD

Correlation and causation are very different, but because our brains tend to see patterns we often make the mistake of confusing a co-occurrence between events - A happened before B - with the conclusion that therefore A caused B.


ANDI HORVATH

So that often forces us to sort of bend over backwards, as you describe it, to prove ourselves wrong. So science has these features: double-blind trials, other people trying to replicate the work, peer review and, as you mentioned, control groups. So taking from that, then, let's get into applying critical thinking to pseudoscience. How is pseudoscience constructed in terms of the claims it makes?


SCOTT LILIENFELD

Pseudoscience, I think, is often constructed rather carefully. Oftentimes its proponents know how to capitalise on the mantle of science. They often use terminology that sounds scientific, they often rely on seemingly plausible anecdotes and testimonial evidence, which may seem very compelling to a lot of people. They may make claims that seem remarkable or extraordinary but that often are not matched with equally extraordinary or compelling evidence.


ANDI HORVATH

Right. So what happens when people challenge pseudoscientists - for want of a better word - with contrary evidence that says hey, I tried that and it didn't work?


SCOTT LILIENFELD

I think in fairness it probably depends on the person, but I think more often than not pseudoscientists probably act a bit like scientists in that way, which is that they may initially become a bit defensive, and in some cases what you'll see is that many proponents of pseudoscience actually just dig in their heels more and become even more persuaded their claims are true. To be a bit technical here, you often see a kind of over-reliance on what are sometimes called ad hoc hypotheses - after-the-fact hypotheses - to explain away negative findings. Like: okay, here's a psychotherapy I think works, here's an energy therapy I think is effective, even though there are no data that these techniques are effective and a study showed it didn't work.

A proponent of that technique might come back and say: yes, this study showed it didn't work, but you didn't do it quite the right way. The people who did it weren't trained in exactly the correct way. Or you didn't give it for the required duration - maybe you gave it for too short a duration, or maybe too long a duration - or maybe the people giving the technique were initially biased against the technique and their sceptical vibe somehow interfered with their energy fields, and on and on and on. Again, some of those claims may have some merit; most of them may not. The bigger problem is that ultimately, for something to be scientific, it has to be testable. It has to be capable of being proven wrong if it is wrong. That's a big problem with a lot of pseudosciences.

Oftentimes they patch on ad hoc hypothesis after ad hoc hypothesis, after ad hoc hypothesis, to the point where it becomes almost impossible to test, or what philosophers would call falsify, that claim. If a claim is not capable of being falsified, it's probably not going to be scientific.


ANDI HORVATH

So they kind of reinterpret and reinterpret, redefining it within their own parameters.


SCOTT LILIENFELD

That's right, and oftentimes they create escape hatches or loopholes after the fact. Again, I should point out that in science those kinds of ad hoc hypotheses, those kinds of escape hatches or loopholes, have their role. Sometimes in science we know that things don't come out because a study wasn't done correctly. The difference is that in effective sciences, in good sciences, what you see is that those ad hoc hypotheses then strengthen the theory and make it easier to test. So, as I tell people, it's fine to invoke an after-the-fact excuse or loophole in science, but you can't stop there. You've got to say: you know what?  You didn't give the treatment for quite the right length. Okay, then tell me how to give it, and hopefully it should work now.

But if it doesn't work and someone says well okay, you gave it the right length but the person wasn't trained well enough. Well, show me how the person should be trained. I then train the person, it doesn't come out. Well, after a point the proponent has to say, you know what?  Maybe, just maybe I was wrong or I have to revise my beliefs.


ANDI HORVATH

So how do you make pseudoscience a part of the curriculum?


SCOTT LILIENFELD

I think it should be integrated in just about every science course, but admittedly there are people who disagree with me on that. But I happen to think that teaching students about the false claims, erroneous claims, poorly supported claims, should be part and parcel of every course. So when you're teaching a course on astronomy, for example, I think you should teach what's well supported, what isn't well supported, what's in the borderland, what may be a bit fuzzy in terms of its scientific status. Because I think you really have to get students to distinguish between well supported and poorly supported claims.

Too often I think we end up just teaching students what there's support for and we presume that all the poorly supported beliefs, or all the false beliefs, are somehow all going to dissolve and go away. I would argue the educational psychology literature suggests that that's actually not going to work. I think there's actually quite a bit of literature suggesting that unless you address myths and misconceptions proactively they often will remain intact after courses are over.

But I'm a believer in a diffusion model. We should be teaching students not just what there's evidence for but what there's evidence against and also what is controversial and getting students to use and develop critical thinking skills that help them to distinguish among these various claims.


ANDI HORVATH

Yeah. So what evidence do you have that students equipped with the tools to scrutinise research claims, and with critical thinking, are actually better able to identify pseudoscience, or even dodgy science?


SCOTT LILIENFELD

Much less than we'd like. So here is a case where we as a field have kind of been talking out of both sides of our mouths, because we have been arguing for the importance of evidence, and yet when it comes to actually seeing whether curricula that teach students how to think scientifically actually work, there is not that much evidence. Things are beginning to look up now in that regard, though: there now is a modest but, I think, growing body of scientific evidence that teaching students about pseudoscience, incorporating that in the curricula, actually seems to improve performance on critical thinking tests. It helps them to be a bit better at distinguishing scientific and pseudoscientific claims.

But I have to say I think that evidence is still pretty preliminary and if it turns out that that approach does not work guess what?  We're going to have to rethink our educational approach as good scientists and maybe see if we can find more effective ways of getting students to be scientific thinkers.


ANDI HORVATH

I'm Andi Horvath and our guest today on Up Close is psychology researcher, Professor Scott Lilienfeld. We're talking about making sense of information claims and how to build critical faculties to separate science from pseudoscience. Scott, can we for a moment apply critical thinking to critical thinking?  Do cultural biases get in the way?


SCOTT LILIENFELD

I think cultural biases absolutely can play a role in science. I think it's really important for all scientists to be aware of their cultural biases, to see if they can compensate for them. In western cultures, for example, we are more likely to make certain kinds of psychological errors; we're more likely to make what psychologists call the fundamental attribution error, which is the tendency to put the locus of responsibility on individuals rather than on their situations. Especially as psychologists, I think we have to be careful about that, because one thing we've learned the hard way is that sometimes we tend to underestimate the power of the situation in influencing behaviour and put the blame excessively on individuals for their behaviour, and sometimes their misbehaviour.


ANDI HORVATH

When we're talking about pseudoscience and teaching pseudoscience, how do you skirt around religious and spiritual beliefs?  How do you deal with that dangerous terrain?


SCOTT LILIENFELD

It is a dangerous terrain. There are lots of interesting differences of opinion about that issue. My preference is to leave it alone because I happen to believe that metaphysical claims, those that cannot be tested, lie outside the domain of science. I know there's some interesting debate about that but I tend to think the claims about, for example, the existence of God, the existence of a soul, the afterlife, are ones that are probably impossible to test. I'm not sure how much science has to offer those kinds of claims. So I try to respect those claims.

However, when particular claims within those disciplines begin to tread into scientific territory, then again I think they do become fair game. In the US, for example, where I teach in the South, there are a number of people who are fundamentalist Christians, and some who believe, or may believe, in Creationism - particularly things like Young Earth Creationism, the idea that the earth is only 6000 years old. It is important to tackle those claims because they actually are making a testable claim, one that can actually be falsified and, I would argue, actually has been falsified.

I would also say, though, that it is important to deal with these issues respectfully. One does not have to be a psychological expert to realise that, guess what, people do not like being made to feel stupid. It sometimes surprises me that as sceptics we often need to be told that. So I think it's really important to address these things directly and firmly, but also to do so in a way that communicates respect and understanding.

I'll give you one little anecdote. A number of years ago when I was teaching my seminar I had a student from another country who confessed to me after a couple of lectures that she was a creationist. She was raised in a very creationist household and even though my class doesn't focus heavily on that topic I do touch on it. She came to me with some concerns and asked can I still do well in this course, holding these creationist beliefs?  I told her that yes, you can, you can still get an A in my class so long as you understand the other point of view and can explain it. If you end up not changing your point of view that's fine, I'm not here to convert you, but I am here to make sure that you understand what natural selection is and understand that point of view.

I count her as one of my proud successes. She actually ended up writing a paper in that course on intelligent design theory, which is a variant of Creationism, and at the end of the course she came up to me and said: I changed my mind.


ANDI HORVATH

Wow.


SCOTT LILIENFELD

Yeah, and she then became a pretty fervent believer in natural selection.


ANDI HORVATH

Sure. Scott, it's healthy to be a sceptic but is there a danger of becoming cynical about everything, that nothing's real?


SCOTT LILIENFELD

I think it's really important, and I'm glad you raised this, to distinguish scepticism from cynicism. Being a sceptic has kind of gotten a bad connotation - like, you're being so sceptical. But scepticism just means demanding evidence for claims and being open-minded. To me science is organised scepticism; that's really what we mean by science: show me the data, show me the evidence and I'll be persuaded. Cynics - and I do think sometimes sceptics, when they're not careful, will veer into that - will often dismiss claims before they've even heard the evidence. Look, from time to time I find myself falling into that trap, but it's something I try to watch out for. So yes, I think it's very, very important to be sceptical, to always insist on evidence, to say: yeah, that's interesting, I'll take a look at it, show me the data - but not to dismiss claims out of hand.

To me extreme cynicism is just as problematic and misguided as extreme open-mindedness. In both cases, in many ways, you're not looking at evidence. In one case you are dismissing almost anything before the evidence is in; in the other case you're accepting almost anything before the evidence is in. So finding that middle ground somewhere is really the goal.


ANDI HORVATH

Scott, placebo is a legitimate phenomenon, it has therapeutic benefits. So how does that fit into this discussion?


SCOTT LILIENFELD

The placebo effect, which I think is widely misunderstood, is in essence improvement due to the mere expectation of improvement. Too often it is regarded as a psychologically uninteresting artefact. Now, it can be an artefact; some psychologists have argued that we classify as artefact or error anything we don't want to measure. In some cases, when we're doing medical research on whether a pill works, or psychotherapy research on whether a psychological treatment works, the placebo effect can indeed be an artefact.

But it's also very real, because we know that for some problems, particularly subjective ones like depression, anxiety and pain, the placebo effect is quite genuine. There is a very active area of scientific research on how placebo effects arise and what their physiological processes and mechanisms are; undoubtedly there are physiological mechanisms underpinning placebos. So to me the placebo effect is not itself pseudoscience. Where it connects up indirectly with pseudoscience is that some medical treatments that are entirely bogus may seem to work, when in fact they're working largely on the basis of the placebo effect.

So if you talk to medical historians, and I'm not one, a lot of them will tell you that until the late 1800s or so the history of medicine was in fact largely the history of the placebo effect. Most treatments probably did nothing on their own; they just seemed to work because of the mere passage of time, placebo effects and other kinds of artefacts. So the placebo effect is real, but it's something we have to watch out for because it can fool us if we're not careful.


ANDI HORVATH

We've been talking about the sciences broadly, but is it fair to say that concerns about pseudoscience are primarily found in the social sciences, like psychology and economics, rather than the physical sciences?


SCOTT LILIENFELD

I wish I could say that were the case, but I don't think so. I don't know of any data suggesting, for example, that psychologists or social scientists are any more prone to these beliefs than our friends and colleagues in physics. In fact there is some data, at least one survey, showing that physicists are more likely than psychologists to believe in ESP and, I think, some other poorly supported claims. So I don't think so; I think we see these claims across the board, though the way they manifest themselves differs. Certainly we have pseudosciences in zoology; we have things like cryptozoology, the study of mysterious creatures like the Loch Ness Monster and Bigfoot and so on. Again, a small subset of these claimed creatures have turned out to be real, but most have not.

In chemistry we see homeopathy, for example, which claims to make use of chemical remedies but which we know is bogus and cannot work in that domain. So I think we see pseudosciences pretty much across the board.


ANDI HORVATH

Scott, tell us about the so-called replication crisis of recent years, in which scientists are finding that the results of many experiments are difficult or impossible to replicate. It seems psychology has been particularly hard hit, so how does this fit in with our discussion?


SCOTT LILIENFELD

I think the importance of replication cannot be overstated. Replication is not sexy, in psychology or in other fields, and I think science in general has greatly underestimated its importance, because replications are not nearly as exciting or thrilling as new findings. They don't make splashy news; newspapers are not going to say, gee, someone found for the fifth time that this medication works, that's not very exciting. But that's the engine of science; science cannot work without replication.

You're exactly right; we've discovered in recent years, in psychology but also in medicine, that a lot of our findings are not replicating. Now, a couple of points to be made here. We don't know that those findings are not real; we just know that they are more difficult to replicate than we thought. It could be that the original findings were right and the replications were wrong, or that the original findings were wrong and the replications are right. More interestingly, in some cases both sets of findings are probably right but influenced by contextual effects we have not yet identified, what scientists call moderators. That is, the findings might be true for some samples but not others, some races but not others, some ages but not others, some laboratory conditions but not others. That's also interesting, and no doubt it is the case for some findings.

Some people see this as a crisis; I see it as an opportunity. To me this is actually an exciting time to be a scientist, because for the first time in quite a while we're seeing science look at itself in the mirror, take a good hard look at its accepted practices and ask whether it can do better. In some ways, even though it's a challenging time, and a time when undoubtedly some people's egos are being bruised because their favourite findings, maybe ones from their own lab, have not replicated, we're beginning to see science being used to improve science. Scientists are using the very tools of science to make science better.

So I actually think there is a lot of soul searching going on, but I also think it's a time of great optimism because I think in the long run we're going to see that we're going to actually be able to better produce findings that can be trusted, that are robust, and that can be used to improve human health and welfare.


ANDI HORVATH

Scott, thank you so much for this discussion on critical thinking skills and the capacity to catch a thief.


SCOTT LILIENFELD

My pleasure, I enjoyed it very much.


ANDI HORVATH

We've been speaking with Scott Lilienfeld, the Samuel Candler Dobbs Professor of Psychology at Emory University in Atlanta, Georgia, about the need in society for critical thinking and for telling science from pseudoscience. You will find details of Scott's publications on the Up Close website, together with a full transcript of this and all our other programs. Up Close is a production of the University of Melbourne, Australia. This episode was recorded on 24 August 2016. The producer was Eric van Bemmel, audio engineering by Gavin Nebauer. I'm Dr Andi Horvath. Cheers.


VOICEOVER

You have been listening to Up Close. For more information visit upclose.unimelb.edu.au. You can also find us on Twitter and Facebook. Copyright 2016 the University of Melbourne.
