Media Faction
Tuesday, 5 October 2004
Clear Thinking and Muddled Thinking
Topic: Philosophy
Jamie Whyte describes himself as "Outraged of Highbury" - someone who endlessly sends furious letters to newspapers complaining about sloppy thinking, logical errors, fallacies and muddles. He does the same at parties - and even on trains. Fortunately he's a professional philosopher or he might have attracted the attention of the authorities long ago. Liz Else and Alun Anderson asked what gets him steamed up and what errors they could commit that would make him explode.



Jamie Whyte hates bad thinking (it's the subject of his latest book, Bad Thoughts).

Enjoy his dissections (and examine yourself for contamination):

The authority fallacy - trusting someone simply because they have acquired a measure of celebrity, which might include publishing a book.

The motive fallacy - someone's motives may reasonably make you suspicious that they have an incentive to mislead you, but their arguments are no better or worse than the evidence put forward to support them.

The mystery fallacy - it's a mystery, therefore I can think whatever I want.

"How widespread is this tendency to seek unnecessary explanations?"

JW: "It is well known that when gamblers go wrong they find an excuse, and as soon as things go right they immediately assign it to their own brilliance and insight rather than to some accidental reason."

How long have you been angry about bad thinking?

I've always been obsessed with truth. I did my PhD on truth. It has always driven me mad to see people saying things that are well known to be rubbish. And I've never understood how they can bear it. But at the same time I can see that it doesn't affect their lives materially so they can't understand why I get so hysterical.

But you are a philosopher who left academia to work in financial markets, of all things. Isn't that an area where truth is seriously lacking? Or should we trust your views on truth because you have published a book?

There are indeed some very suspicious things that go on in the financial markets. But you are already plunging straight into the authority fallacy, which these days appears in a variety of perverse forms. One of the worst is trusting someone simply because they have acquired a measure of celebrity, which might include publishing a book.

What do you mean?

First, there is nothing wrong with deferring to genuine expertise. You need to defer to some people because you simply can't do all the research on everything yourself. But to whom should you defer? The basis on which you defer to people should be that they are reliable, and by reliable I don't mean nice or good or that they have their hearts in the right places. I mean that if they say that P... it is very likely that P... Now what makes somebody reliable is the way that they acquire their beliefs - ultimately it all comes back to the correct methods for acquiring beliefs. So you should identify people who are doing it the right way and defer to them. In the end, it is just a division of labour.

But how does the authority thing work?

The really big mistake comes when you treat people as authority figures when they are not expert but simply well known. There is a terrible tendency to treat people as reliable sources of fact when in fact they are simply "important" people or people who happen to be in the news. It is doubly perverse when you consider who gets counted as "important". For example, the victims of train accidents appear on television as authorities on rail policy and celebrities endorse presidential campaigns as though they are expert on politics. It's sheer insanity.

That sounds like good news for scientists. With their emphasis on transparency and method, surely they'd be immune from the authority fallacy?

Not at all. Scientists are vulnerable to this kind of celebrity issue. Some scientists have a certain amount of star quality that gives their opinions more weight than they ought to have. Worse, scientists have a terrible tendency to pronounce on issues where they don't recognise that they are not expert.

Take the British Medical Association, which is always making policy recommendations. A recent example was that the government should tax the fat content of food. Why does the BMA think it knows anything about how we should live? It may know that if I live a particular way I'll become unhealthy, but why does it think that it can tell me that I should value my health more than my chosen way of life? What makes its members think that they are in any privileged position to answer questions like that?

Also, how do they know what the effects of a tax on fatty food would be? They're not specialists in how prices affect consumption, or in how the economy will be affected by a redistribution of spending from one part to another. They can't even anticipate the health effects of these things. They should shut up.

So should they express no opinion at all?

The BMA is one of these organisations that commits the authority fallacy. It seems to think the fact that it may be an authority on medical issues means that it is also an authority on the politics of medical issues. There is almost no connection. The BMA's output should be an input to the decision making of somebody else. I think you get a lot of this false authority in science. Let the BMA commission a report from somebody who knows what they're talking about. It has just got this blind assumption that health is everything. Health seems to me to be reasonably important, but we are all mortal and doctors often seem to forget that.

It is fair to say that the BMA represents doctors and as such it is just another pressure group acting for its own constituency, and its opinions will be no better than anyone else's.

Oh dear. Now, you are drifting into the motive fallacy. Too many people see truth as just a game between groups, as a kind of tribalism. That is not rational. Far too many people are not prepared to say: "I don't believe this and here's my argument why I don't." They don't feel they need to. Instead, they will say something like: "Economists are just part of the capitalist conspiracy so I don't have to listen to their arguments about free trade." Thus they dismiss all economists' views on the grounds that they are members of a particular group.

And scientists too...

Yes. These days, scientists are increasingly seen as part of various tribal groups, so when you read about their views the newspapers will go to great lengths to ask who they are working for, what their backgrounds are, what their political views are, and so on. Someone's motives may reasonably make you suspicious that that person has an incentive to mislead you, but their arguments are no better or worse than the evidence put forward to support them. So ultimately the question of whether something is true or false can't be settled by a question of motives. And just to dismiss somebody on the basis of "it pays them to say that" isn't a good argument. They might be right anyway.

In your book you are quite harsh on religion. Aren't people entitled to their faith?

This is one of my favourite errors. An interesting change has happened, at least in the west. It used to be that people would argue for a particular religious dogma or a clear religious doctrine. That is no longer what happens. The world is increasingly dividing into those who have "faith" and those who don't. It doesn't really matter what the faith is. That is why you now get "faith groups" coming together from all kinds of different religions. The weirdest manifestation of this new tendency is when people say: "I'm not a Christian but I believe in something." Then I say: "Of course, I believe in many things, like there is a chair there and a table. What are you talking about?" And they reply: "Well, you know, something more." But what "more"? What they mean is something more than we have any good reason to believe in.

That really seems to get to you!

What amazes me is that they like to set themselves up as having a slightly finer sensibility than you or me, but in fact they are completely intellectually irresponsible. They used to come up with very bad arguments for their faiths, but at least they felt that there was something they should provide. Now mere wilfulness has triumphed. This is what I describe as the egocentric approach to truth. You are no longer interested in reality, because to do that you have to be pretty rigorous, you have to have evidence or do some experimentation. Rather, beliefs are part of your wardrobe. You've got a style, and how dare anybody tell you that your style isn't right. Ideology is seen as simply a matter of taste, and just as it's not right to tell people that they've got bad taste, so it's not right to tell them that their opinions are false. I'm afraid that the cast of mind of most people is the opposite of scientific.

There's something close to that that you also hate. When people say "there is an awful lot we don't understand" and use that as an argument for believing in something...

The mystery fallacy: it's a mystery therefore I can think whatever I want.

Isn't there a reverse of that where scientists will ignore or deny the existence of anything they don't understand?

Yes. Scientists have a strange tendency to be insufficiently empirical sometimes. A good example is swing bowling in cricket. For a long time, scientists found it impossible to explain how bowlers could deliver a ball that swerved. So their first defence was to say that it's not really swinging but only appears to be. Sometimes in science theory pushes you ahead of observation - the theory will suggest some observations that you previously wouldn't have made. But sometimes there are things you can observe that you can't explain which should drive theory. The way you learn physics these days is so lacking in observation that it's got a lot of scientists out of the habit. Physicists don't take observation as seriously as they should and sometimes they get this arrogance that if the theory can't accommodate it then it isn't there, instead of letting the observation push them.

Talking of cricket, you also use cricketing scores to show up some of the ways people try to explain patterns that don't need an explanation at all...

An example I use in Bad Thoughts is the explanation of why zero is the most common score on which batsmen go out. Cricketers and sports commentators will tell you that the batsman has just come onto the field and is nervous, or doesn't really know the pitch, so is more likely to go out immediately. But that kind of explanation is not needed at all. It's just that the way cricket is scored means zero is the score on which batsmen face the most deliveries. Everyone starts at zero but then scores in increments of anything from zero to six, thereby skipping many possible scores. Since zero is the score on which batsmen face the most deliveries, it is entirely unsurprising that it is also the score on which they most often go out. People are looking for an explanation that's just not required.
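To see the point numerically, here is a minimal sketch (not from the interview) that simulates innings under a toy model: every delivery carries the same small chance of dismissal, and the runs off each ball are drawn from an assumed distribution. The per-ball probabilities below are illustrative guesses, not real cricket statistics.

# Toy model: why zero is the most common dismissal score.
# All probabilities here are assumptions for illustration only.
import random
from collections import Counter

P_OUT = 0.04                                 # assumed chance of dismissal on any delivery
RUNS = [0, 1, 2, 3, 4, 6]                    # possible runs off one ball
RUN_WEIGHTS = [0.55, 0.25, 0.08, 0.02, 0.08, 0.02]  # assumed frequencies

def simulate_innings():
    """Return the score at which the batsman is dismissed."""
    score = 0
    while True:
        if random.random() < P_OUT:
            return score                     # out on this delivery
        score += random.choices(RUNS, weights=RUN_WEIGHTS)[0]

dismissals = Counter(simulate_innings() for _ in range(100_000))
print(dismissals.most_common(5))             # zero comes out on top: every innings passes
                                             # through 0, and increments of 2-6 skip
                                             # many other scores entirely

No nerves or unfamiliar pitches are modelled at all, yet zero still dominates, which is exactly Whyte's point: the pattern needs no psychological explanation.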

How widespread is this tendency to seek unnecessary explanations?

It is well known that when gamblers go wrong they find an excuse, and as soon as things go right they immediately assign it to their own brilliance and insight rather than to some accidental reason. It's rather similar in the financial industry. Even the bosses buy into this kind of reasoning. They will say "of course I understand why that one went wrong" when they lose millions, and then when it goes well they will say "well done". Everybody systematically overestimates their skill in games of chance. From what research I have seen, financial trading is not much more than a game of chance. There are funds that simply track the market according to a set of simple rules, and others that are very actively managed. But the actively managed ones do not perform better on average. Some will do well in any year, but that's what you expect by chance.

But there are star fund managers who do well year after year. So is the advice I read to follow the best managers right?

You have fallen into the error of reading meaning into data where it is not required. Stock traders are quite young; normally you would be quite an old trader at 35. You are probably in the spotlight for four or five years, maximum. There are thousands of traders, so some of them are sure to have five good years in a row. It is purely random. There is nothing you would not expect if it were chance alone.
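As a rough sanity check on that arithmetic, here is a minimal sketch under loudly stated assumptions (none of the numbers come from the interview): give each of a few thousand traders an independent 50/50 chance of a good year, with no skill involved at all, and count how many produce five good years in a row.

# Toy model: "star" five-year streaks from pure chance.
# N_TRADERS, the 50/50 odds and TRIALS are illustrative assumptions.
import random

N_TRADERS = 2000          # assumed size of the trader population
N_YEARS = 5               # streak length in question
TRIALS = 1000             # Monte Carlo repetitions

def streak_count():
    """Number of traders whose five years are all 'good' by coin flip."""
    return sum(all(random.random() < 0.5 for _ in range(N_YEARS))
               for _ in range(N_TRADERS))

avg = sum(streak_count() for _ in range(TRIALS)) / TRIALS
print(f"average perfect {N_YEARS}-year streaks: {avg:.1f}")   # roughly 2000 / 2**5 = 62.5

With 2,000 coin-flipping traders you expect around sixty perfect five-year records from chance alone, which is why a run of good years is not, by itself, evidence of skill.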

So if I'm a logical person I had better put my money in the bank?

Well, the best investment depends mainly on your risk appetite. However, retail banks give a much better return to their shareholders than investment banks, partly because their staff are paid less. For investment bankers, the only trick is getting yourself a seat at the table. Once you are there the money is yours. It really isn't that skilful. But the people who work there will always believe their good days are down to their own amazing skills.

New Scientist


Posted by mediafaction at 11:19 AM EDT
Updated: Wednesday, 6 October 2004 3:03 PM EDT
