Four ways to define “rational”

In this week’s video, I explain what I mean by “rational,” and discuss some of the other meanings that people attribute to this often-ambiguous word.

11 Responses to Four ways to define “rational”

1. Bytor says:

I wonder if, when looking for what people mean by “a rational explanation,” it would be better to look at the colloquial meaning of “rational” rather than the philosophical meaning?

For example, one might ask for a rational explanation of why Marc Lépine killed 14 women and then committed suicide. “Because he was mentally unbalanced” is an explanation that most people would accept as true yet significantly incomplete. It seems to indicate a desire for the ultimate causes of why Lépine was so affected and chose to do this, rather than the proximate cause. After this happened, hypotheses ran rampant in the press: he was unbalanced because of abuse, because he was abandoned, or because of both.

In a classical mechanics sort of way, people want a “rational explanation” to be able to explain the trajectories of all the billiard balls on the table all the way back to the open break rather than the confusing inexactness of the quantum mechanics of our daily lives.

2. Barry says:

Enlightening! It might have been slightly less enlightening if you’d thought to mention another common meaning of rational: the quotient of two integers.

To give some credit to the definitions you don’t use, I think they can be formalized as follows:
3. holding well-defined goals which pertain primarily to external state
A person who considers him or herself to be “too rational” due to not taking enough joy in life probably means they want to make their objective function, well, less objective (Pun! Not contradiction! Move along…) by incorporating criteria like pleasure. Or they may have decided their intermediate goals are too directed, and since decision theory itself is intractable even for moderately complex toy problems, they have decided to take the entirely reasonable approach of something more like simulated annealing, and want to shake things up a bit.
4. compressing data
There is “no rational explanation” for data which is too uncorrelated to compress. There is an “explanation”: the data is what it is. The property of rationality being referred to, I believe, relates to the ability of the explanation to compress the data. This is a semantic point to be sure, but that’s what you get when you try to define words. 🙂
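The compression reading of “explanation” can be made concrete. A minimal sketch using Python’s standard zlib module (the data here is made up purely for illustration): structured data admits a much shorter description than itself, while uncorrelated data does not.

```python
import random
import zlib

random.seed(0)

# Structured data: a short rule ("repeat 'ab'") fully explains it.
structured = b"ab" * 500

# Uncorrelated data: no description much shorter than the data itself.
noisy = bytes(random.randrange(256) for _ in range(1000))

def ratio(data):
    """Compressed size as a fraction of original size."""
    return len(zlib.compress(data)) / len(data)

print(f"structured: {ratio(structured):.3f}")  # well below 1.0
print(f"noisy:      {ratio(noisy):.3f}")       # near (or slightly above) 1.0
```

In this reading, the structured stream has a “rational explanation” precisely because it compresses; the noisy stream has only the trivial explanation that it is what it is.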

4. Julia Galef says:

@David —
Regarding #4 — Interesting. I like the characterization of explanation as “compressing data,” though I’m not sure it’s widely applicable. I think an explanation often has to sacrifice compression for comprehensibility to humans. So for example, maybe you could explain why the housing bubble formed by talking about the movements of atoms. That might be a lower-level explanation (and more compressed, if I understand your use of the term properly) but not helpful to us humans in the sense of giving us a grasp of the phenomenon.

In order to explain the housing bubble in terms of molecular dynamics, you’d need to give the initial conditions pre-bubble, which would be a tremendous amount of data, thus not really being compressed at all. But you could easily counter-argue that specifying the initial conditions of the entire universe just before the Big Bang probably wouldn’t be many bits (Schmidhuber would certainly support this hypothesis), so, yes: the compression has to be such that decompression is efficiently achieved by a human brain. In particular, an instrumentalist definition of compression is appropriate: an explanation is a way of reducing incoming data to a neural pattern that is burdened with much less information than the data itself, yet retains similar predictive capability. If the decompression is too hard computationally (e.g. computing the evolution of the universe for 14 billion years), that compression isn’t actually going to give you any predictive capability.

5. Pingback: News Bits

6. Max says:

Dan Ariely gave an example of how irrationality can be a good thing. He asks if you had an opportunity to steal something without getting caught, would you do it? Apparently, his definition of rationality involves being a selfish parasite.
He also says it’s irrational to be willing to ruin an expensive suit to save a drowning baby, but not be willing to donate money to save a baby. Here rationality means being consistent.

His website has a couple of “Irrationality Quotient (IQ)” Quizzes.
http://danariely.com/apps-tools/

Sample question: “If you had to give yourself a daily injection that might dramatically decrease your chance of having a serious illness in 30 years, but would certainly have negative side effects now (headache, vomiting, shaking, fever, etc.), would you regularly take the injection?”

Most of these paradoxes can be resolved by adjusting the utility function. For example, we might assign utility to honor, valor, or to certain personal experiences; or we might assign utility to health according to any monotonic transform of a health metric (which, in and of itself, is hard to define objectively). Rationality, as I define it (and, if I’m not mistaken, as Julia defines it), does not encompass prescriptions of how to assign utility functions, but only of how, given a utility function, one should go about maximizing it.
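The monotonic-transform point above can be sketched in a few lines of Python (the health metric and option names are hypothetical, invented for illustration): any strictly increasing transform of the metric leaves the utility-maximizing choice unchanged, because only the ranking of options matters.

```python
import math

# Hypothetical health scores for three options (illustrative numbers only).
health = {"exercise": 9.0, "moderation": 6.5, "indulgence": 3.0}

def best(utility):
    """Return the option that maximizes the given utility function."""
    return max(health, key=lambda option: utility(health[option]))

# Three strictly increasing transforms of the same underlying metric.
transforms = [lambda h: h, lambda h: math.log(h), lambda h: h**3 - 2]

print({best(u) for u in transforms})  # prints {'exercise'}
```

Debates about *which* transform (or which metric) is the “right” one are debates about the utility function itself, which on this definition sit outside rationality proper.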

7. Jackie says:

How would a researcher ever measure rationality?

8. Thomas says:

I cannot so easily dismiss the word “rational” as being a meaningless addition in front of the word “explanation.” In my experience, when people insert words, they are trying to say something decidedly different from what the phrase would convey without the inserted word.

In my understanding, the word “explanation” carries no judgement as to accuracy. Said another way, “inaccurate explanation” is not an oxymoron. This is why people can ask, “what is *your* explanation?” If accuracy were a prerequisite to being an explanation, we’d ask, “what is *the* explanation?” The idea that there may be many competing explanations of varying accuracy is inherent in the word “explanation.”

“What’s your explanation for how the presents got under the tree?”
“Santa put them there.”
“How did he put them there?”
“He flew around on his sleigh and came in through the chimney.”
“How do you explain how his sleigh flies?”
“Flying reindeer.”
“How do reindeer fly?”
“Magic.”

We can debate whether these answers are accurate explanations, but they are nevertheless explanations.

The next step, then, would be to ask for an accurate explanation. “There must be some accurate explanation!” But, people do not usually say this either. Why?

In my understanding, the phrase “accurate explanation” does carry a judgement as to accuracy but doesn’t acknowledge that accuracy may be judged differently from different points of view. For example, consider the following conversation:

A: “Yesterday I was driving home through Nevada and saw these lights in the sky moving like nothing I’ve ever seen before. Suddenly it became very bright around my car and my car started shaking. Next thing I knew I woke up in my bed with these scars on my back, wrists, and legs.”
B: “Wow, that’s crazy, are you alright? What do you think happened?”
A: “I must have been abducted by aliens. At first, I thought it was just a dream that little beings with big heads were poking needles in my wrists and legs, but now it all makes sense.”

Is this explanation an accurate one? Certainly A thinks so. B may disagree, but unless B is so arrogant as to think he is the sole arbiter of Truth (or A and B are close enough friends it’s understood B is just joking around) you probably won’t hear the conversation continue like this:

B: “No, there must be some accurate explanation!”

The sentiment B wants to convey is that he cannot, in his world view, accept A’s explanation as accurate despite understanding that, in A’s world view, the explanation is very much accurate.

Enter the word “rational.”

In the sense of epistemic rationality (which is the sense I think applies to “rational” in “rational explanation”), the goal is to obtain accurate beliefs about the world. But, like the scientific process, it is understood that the means for achieving these accurate beliefs is to continually refine one’s beliefs. Newton had his equations. Einstein fixed them up to account for the speed of light. Researchers four days ago discovered neutrinos going faster than the speed of light. If the experiment can be repeated, then, in view of the new evidence, a new theory may follow in the quest for the accurate explanation. If it can’t be repeated, then an accurate explanation will be sought for why one set of measurements (erroneously?) showed neutrinos going faster than the speed of light when we have not seen that to be the case anywhere else.

A’s abduction theory is the neutrinos going faster than the speed of light. It may be true, but it’s not corroborated by anything in B’s world, yet.

It’ll take some time (possibly a *long* time) to work out, but if two people put together all the evidence in their world views and design tests to verify any discrepancies, together they should arrive at an explanation they can both agree is accurate as far as both their world views are concerned. This is the rational explanation.

This is the one people are asking for when they say “there must be some rational explanation.” They mean “there must be some explanation which we both can agree is accurate.”

9. Grognor says:

I have a serious problem with the notion that epistemic and instrumental rationality are ever actually at odds.

To take the given example, you don’t have to believe you are attractive to act in a way that causes favorable responses, any more than you have to believe you’re the King of Siam to put on a convincing visage of him in a play.

It’s not that deluding oneself into a false belief of higher attractiveness wouldn’t help; it probably would. It is just a very strange way of trying to accomplish a goal. It’s much healthier, and probably easier, in this example, to just pick up some social skills instead.

Nobody ever truly benefits from holding false beliefs. For example, there are atrocities going on in the world, lots and lots and lots of them. So many, of such horror, that it boggles the mind. There are two situations for people: either they can do something about this, or they can’t. If they can do something about it, it’s better for them to know, so they can make an informed decision about whether to do so. If they can’t do anything about it, I hold that it’s still better to know, but that’s because I value knowledge. In that case, though, it doesn’t affect a person who doesn’t value knowledge, but those people typically do not think of rationality as a good thing. They think “rationality” = Spock, which we all know is wrong.