Superstitions as Evolved Objects

It's common to mock superstitions to display one's Skeptic cred, but doing so is a mistake, because superstitions are Chesterton's fences.

It's bad luck to walk under a ladder. No, okay, it's not "bad luck", but ladders are dangerous.

It's bad luck to open an umbrella indoors. Or more accurately, it's inconsiderate, because open umbrellas are deceptively difficult to handle indoors and you're pretty likely to bump into someone or knock over a lamp. A fifteen-minute training course on umbrella safety and etiquette wouldn't be a waste of time.

Seven years of bad luck for breaking a mirror. I'm not really sure about this one, but who wants broken mirrors? And mirrors were historically difficult and expensive to make.

I've heard that in some Islamic countries it is traditional to touch food only with the right hand, and reserve the left hand for disgusting activities like wiping the ass. For years I thought this was some sort of unprincipled discrimination, but somehow I spontaneously saw the purpose one day. It makes perfect sense to use only one hand for eating in a world where you can't wash your hands right before every meal. The practice makes enough sense to me that I do it.

Cultures exist in a world of slow meticulous selection pressures. Like organisms, they must adapt to them to survive. This is the basic tenet of the study of cultural evolution. It's a small insight, to realize that superstitions are part of culture, and thus also subject to, and created by, those pressures.

My hope with this article is that it produces a Baader-Meinhof effect in you for a little while, so that when next you see someone call some habit a 'superstition' you can think about whether or not there's a pretty obvious purpose like in the above four examples. But there may be a purpose even if you can't think of one, for reasons articulated here and in much more depth and generality here.


the thought to which all thoughts eventually return, as if it were the bottom of the cognitive bowl

See also: The Monster

I wish my mind were larger. Certainly it is larger than most people's; during manic episodes it feels like I can fit entire other human minds inside my own, and maybe that is not entirely mere arrogance. But it's not enough, it will never be enough, life is not graded on a curve. I can't fit in all the concepts, can't work tirelessly and efficiently and self-improvingly.

How is so weak and imperfect a creature as individual man, isolated and wretched, shorn from community, ejected from egregore and unable to assimilate, supposed to make the world a better place? It would be hard enough if I were some competent ubermensch unsurpassed in self-control and diligence, but I am a languid slovenly slob barely able to brush my teeth or shower. Again and again I return to this thought. I can't escape. Multidimensional metasphex.


reason #21,066 to distrust all scientific studies

Originally posted November 21, 2016

 I see a lot of people claiming that jobs are good for people, because jobs give life more meaning. They point to studies "showing" that people are happier when employed than unemployed, even when "controlling" for just giving people money. Needless to say I haven't read any of these, because I am not in the habit of filling my brain with motivated bullshit.

Now first of all, really? Do you expect me to believe that someone who cleans toilets for a living is happier than they would be if they didn't have to do that? That's retarded.

But more importantly, this motivated reasoning is failing to take into account that not having a job gets you yelled at all the time. "Get a job!" people yell at mendicants. They really do that, all the time. And it's not just yelling. The whole American culture is infused with an ethos that being unemployed makes you worth less, that you're not earning your keep, that you don't deserve to exist. Unless you're a child, woman

I've spent years unemployed before, and I'll be unemployed again, probably soon. My quality of life is easily lower having to work all the time than not. Work sucks. This is the most obvious thing in the world. Which is one reason everyone wants to get counterintuitiveness points for pretending it doesn't. But the bigger reason, as far as I can tell the reason it's achieved memetic fixation, is that people who have jobs really hate them, and resent people who are able to get away without having them, and resolve the cognitive dissonance by telling themselves that jobs are actually good.



Originally posted October 24, 2016

 Look at these fucking buttons:

I have to hit these buttons many times per day, always with the same overworked left thumb. They are terrible buttons and I have a mild repetitive strain injury as a result.

I don't understand. This steering wheel is in a 2013 vehicle, but the problem of "making buttons that are nice to press, even thousands of times a day every day" was solved by video game manufacturers in the late 1970s. Do engineers who make vehicles think about ergonomics at all? Does anyone other than video game console manufacturers??

Look at this beautiful goddamn artifact:

Look at it. It actually looks like it was made by people who have hands, for people who have hands. Even my unusually large and hammy doom-fists can comfortably hold this, and play with it, for hours. The Nintendo Gamecube controller is the finest nonliving thing I have ever held. How I miss it.

I don't know anything about ergonomics in practice, but it's like architecture in how prevalent it is, affecting humans always and everywhere. The only book I've even heard of about it is The Design of Everyday Things, which popularized the useful concept of affordance. Hopefully I can at least listen to the audiobook some day.

I play video games on my prematurely aging laptop with a Logitech F310. It's no Gamecube controller, but I can still use it for any amount of time without any pain whatsoever. You could literally rip off a piece of the steering wheel and put the controller in, Megas XLR-style, and create a much nicer experience. If I had a lot more experience at DIY engineering, and I owned the truck I drive, I might just have tried something like that. Because it would be cool.

Well, bye



Originally posted October 23, 2016

You should probably read Scott Aaronson's post Umeshisms before reading this post.

Concentrate on the higher-order bits.

Back in the day, this sentence took me a long time to understand, and nobody would explain it when I asked, so this is what it means:

Look at the following number: 12,345,678.

All it means is something obvious, which is that the leading digit (the 1) is a lot more important than the trailing digit (the 8). If you want to make a big difference to what the number represents, you need to change the digits closer to the front than to the back.

(As an aside, you might consider something like ?___________________________________________12345678, in order to think a little more scope-sensitively about impact magnitudes. Try to make the question mark digit positive, if you want the number to increase.)
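The point can be made concrete with a few lines of Python (a sketch; the helper name is mine, not anything standard):

```python
# A digit at position p (counting from the right, starting at 0)
# contributes digit * 10**p to the number's value.
def change_digit(n, position, new_digit):
    """Return n with the digit at 10**position replaced by new_digit."""
    old_digit = (n // 10**position) % 10
    return n + (new_digit - old_digit) * 10**position

n = 12_345_678
# Bumping the highest-order digit from 1 to 2 adds ten million:
print(change_digit(n, 7, 2) - n)  # 10000000
# Bumping the lowest-order digit from 8 to 9 adds one:
print(change_digit(n, 0, 9) - n)  # 1
```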

I'm writing this for three reasons. First because I used to think umeshisms had achieved memetic fixation, but when I talk to people about them they've almost never heard of them. I googled the term just now and was sort of shocked to find it has only 29 results.

Second because when I tell people about umeshisms and give examples, statements like "If you've never been arrested, you're not doing enough interesting things," or "If you've never broken a bone, you're not doing enough dangerous physical activity," they don't seem to get it. When they try to come up with their own examples, they fail. There's something counterintuitive about the idea that a successful strategy includes a nonzero probability of unimportant bad things happening.

Pop quiz: Is "If you've never falsified an umeshism, you're not being badass enough," an umeshism? Answer at end of post.

Third because I want to explain the umeshism mindset, as distinct from umeshisms, the type of aphorism. I don't want this post to be a repository of umeshisms (though it'd be super cool if someone made one, like quotedb or the erstwhile limerickdb where people could submit and vote on them). Umeshism is so named like pragmatism, stoicism, or agnosticism, not in the philosophical school sense, but in the state of mind sense.

"Don't sweat the small stuff," is umeshism, but so is, "sweat the big stuff." Effective altruism is umeshism applied to doing material good. Rejection therapy is for getting your system 1 to be more umeshic in the context of asking for things.

The Hamming questions translate to, "Why aren't you being more umeshistic?" From Richard Hamming's You And Your Research:
Over on the other side of the dining hall was a chemistry table. I had worked with one of the fellows, Dave McCall; furthermore he was courting our secretary at the time. I went over and said, "Do you mind if I join you?" They can't say no, so I started eating with them for a while. And I started asking, "What are the important problems of your field?" And after a week or so, "What important problems are you working on?" And after some more time I came in one day and said, "If what you are doing is not important, and if you don't think it is going to lead to something important, why are you at Bell Labs working on it?" I wasn't welcomed after that; I had to find somebody else to eat with! That was in the spring.
Umeshism is prioritization. It's caring more about more important things than about less important things. It doesn't mean caring a lot, or a little, in full generality; it means caring in the right order. It means not spending all your time on social media if you have anything useful to do. But it also means not spending all your time setting up intricate systems to prevent you from wasting your time.

What distinguishes umeshism from just naively trying to be more efficient is deliberately letting avoidable bad things happen as part of the overall strategy, because the harm from those things is outweighed by the cost of preventing all of them.

What's the correct number, taking resource tradeoffs into account, of deaths by electrocution per year in the United States? I don't know what the specific number is, but it's not zero.
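A toy expected-cost model, with entirely made-up numbers, shows why the optimum is nonzero: you do the cheap safety fixes first, and stop once the next fix costs more than the harm it averts.

```python
# Toy model with made-up numbers: cheap safety fixes get done first,
# so the marginal cost of preventing one more death rises steeply.
HARM_PER_DEATH = 10_000_000  # dollars; an illustrative figure, not a real value

def marginal_prevention_cost(k):
    """Made-up cost of preventing the k-th additional death per year."""
    return 1_000 * 2**k

def optimal_deaths(baseline_deaths):
    """Prevent deaths only while prevention is cheaper than the harm."""
    deaths, k = baseline_deaths, 0
    while deaths > 0 and marginal_prevention_cost(k) < HARM_PER_DEATH:
        deaths -= 1
        k += 1
    return deaths

# Even starting from 1,000 deaths/year, the optimum is well above zero:
print(optimal_deaths(1_000))  # 986
```

Change the made-up cost curve and the optimum moves, but as long as prevention costs rise without bound, it never reaches zero.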

If you understand everything you read, you're reading too carefully. (Though I must say that if you never notice any misunderstandings, you're not reading nearly carefully enough.) If you don't get it yet, here are some good blog posts that explain it: Focus on the Higher-Order Bits and Why We Should Err in Both Directions

Answer to quiz: No


Originally published July 12, 2016

Lists are really, really appealing for some reason, perhaps because they are so simple and orderly and thus memorable.

Peter McIntyre wrote an article (listicle, is the pejorative) called 52 Concepts to Add to Your Cognitive Toolkit. Despite being written in pandersome newspeak it's really good; I endorse it. Most of those concepts are essential to thinking and if you don't know any you should familiarize yourself post-haste. I cannot emphasize this enough. Fluency in these concepts is by my account an adulthood developmental stage. No such listicle could ever be complete, and to my reckoning the most important omissions are:
Wow, a list. So shiny, so nice. Very compelling.

Some of these concepts are monster-topics that take weeks to understand. Others take less than an hour. Caveat emptor.

I am fascinated by the concept of a "cognitive toolkit" or "conceptual ontology" or "insight collection" or "conceptual vocabulary" or whatever you want to call it. It should probably be on a list of essential concepts! The fascinating thing is that it seems to comprise a list of concepts. It's as if the way you think is partially encoded in something as simple as a communicable list of ideas.

The 'vocabulary' metaphor for the conceptual ontology helped me realize something important. Earlier I had the minor insight that the set of words you can use is much smaller than the set of words you can recognize. The same is true for concepts. To understand other people's thinking, you only need to be able to recognize the chunked concepts involved, but to think, you have to be able to use these concepts, which requires practice. Pen-and-paper exercises and spaced repetition thereof might help. I nearly put spaced repetition in my above list, but worried it would start to become a list of all the concepts I know and utilize. Like, did you know spaced repetition helps all kinds of knowledge, not just declarative? God damn, son.

Richard Feynman attributed much of his research success to using a 'different box of tools'. It makes sense. Exploring in a different way than everyone who has come before is probably a prerequisite for finding new things. It seems to me that humans, even the brightest, mostly think the same thoughts, over and over, in the same ways, and have only a few tools and heuristics for thinking. Quoth Gian-Carlo Rota:
Every mathematician has only a few tricks.

A long time ago, an older and well known number theorist made some disparaging remarks on Paul Erdos's work. You admire Erdos's contributions to mathematics as much as I do, and I felt annoyed when the older mathematician stated, in flat and definitive terms, that all of Erdos's work could be "reduced" to a few tricks which Erdos repeatedly relied upon in his proofs. Actually, what the number theorist did not realize is that other mathematicians, even the very best, also rely on a few tricks that they use over and over. Take Hilbert. The second volume of Hilbert's collected papers contains all of Hilbert's papers in invariant theory. I have made a point of reading some of these papers with care. It was very sad to note how some of Hilbert's beautiful results have been completely forgotten.

But it was surprising to realize, on reading the proofs of Hilbert's striking and deep theorems in invariant theory, that Hilbert's proofs relied on a few tricks that he used over and over. Even Hilbert had only a few tricks!
The way I see it, the way humans have different incommunicable cognitive habits is in large part responsible for differences in quality of their intellectual work, and an important proximate cause of people's uniqueness. I often see deep links between my thinking and other people's thinking, always so deep that I can't articulate them. It makes me wonder whether I have some ultra-deep grognor-only intuitions that languish, unused, because I never see them in anyone else and thus they never get reinforced.

Anyway these deep thought-structures are definitely not simply lists of cognitive habits or heuristics. If you could condense Erdos's brilliance into such a thing, you would be as brilliant as him. Even so, lists seem, surprisingly, to compose a large part of people's cognitive faculties.

Conflation vs. Erroneous Splitting

Originally published July 11, 2016

Consider these four situations:
  • Conflating two things. Conflation is the mistake of thinking that two or more things are the same thing.
  • Incorrectly splitting one thing into two things. This is the mistake of thinking that one thing is two or more things.
  • Correctly identifying that two things are the same thing.
  • Correctly distinguishing two things that used to be thought of as one thing.
To illustrate this I came up with a nice little 2x2, which I had ctrlcreep draw a prettier version of:

                Mistake                Rectitude
  Merging       conflation             correct identification
  Splitting     erroneous splitting    correct distinction

using "rectitude" in lieu of a proper antonym for "mistake"

The point I want to press upon you is that the situations in the top row are easier or more likely than the situations in the bottom row, due to working memory constraints. An ontology with fewer objects in it is easier to understand, so it's relatively easy for humans to correctly identify that what they thought was two things is actually one thing, and correspondingly, to mistakenly conflate two things into one. Mutatis mutandis, it's hard for people to notice subtle distinctions. And likewise people have low propensity to mistakenly think that one thing is two things.

This is why I see distinction-mongering as such an essential conceptual activity; it goes against the natural inclination to do the opposite.

It goes back to the personality distinction between lumpers and splitters. Some people want wikipedia articles to include everything related to the subject; others want to individuate the various things into their own wikipedia articles. Ever since the list of subtle distinctions I co-wrote, I've become much more of a splitter, seeking distinctions everywhere and never finding them unfruitful. Perhaps this is some sort of boast, like wow guys, look at how many distinctions I can fit in my head. I nevertheless see it as the essential conceptual activity. As Sarah Constantin once said, "science" means "to split".

My Weirdest Ethical Belief

Originally published June 25, 2016

I think male homosexuality is good and female homosexuality is bad, mostly due to asymmetries in the sexual marketplace: the surplus of men. The upshot, if you agree with me, is that if you are a bisexual of either gender you should limit yourself to men. This is doubly good: it occupies a given man and frees up a theoretical woman. Additionally, putting this into practice should help control overpopulation (and be in accord with antinatalism, which I also hold), though this isn't why I hold the belief at all.

Cooperative Epistemology

Originally published May 27, 2016

 The mathematical theory of instrumental rationality comprises two interrelated disciplines: decision theory and game theory, i.e., individual and group optimization. These are fields for which we can say that both exist. By contrast, epistemic rationality is given its coherent mathematical treatment only concerning individual knowledge. The study of group rationality is scattered to the winds, a field so small it as yet has no name. It could be called interpersonal epistemic rationality, multi-agent epistemology, interactive epistemology, or, preferably, something more clever. I title this post cooperative epistemology because that is the ideal I want the field to be about.

It will prove helpful to distinguish between the formal theories of rationality, the theory of rationality in practice, the practice of rationality, and the practice of theorizing about rationality. We can further subdivide these chunks into epistemic and instrumental by prepending the respective adjective to every instance of 'rational'. I do this as a rule, but the reason I'm inflicting such onerous distinctions on you is that the theoretical study of the practice of group epistemic rationality, by philosophers and psychologists, gets plenty of attention and already has its names. What's comparatively neglected is the mathematical theory of interpersonal epistemology.

Thus social epistemology is a cool field I aim to pay attention to, but it is the "Robert Nozick investigating whether induction is justified" of group rationality; I am seeking more things like Aumann's Agreement Theorem.

Aumann's Agreement Theorem
is so famous and well-known in my circles that it needs no introduction here. Nevertheless I want to consolidate some of the last many years of discussion about it.
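For reference, my paraphrase of the standard statement (not taken from this post):

```latex
% Aumann (1976): agents with a common prior cannot "agree to disagree."
% If two agents share a common prior $P$, and their posterior
% probabilities of an event $A$ are common knowledge between them,
% then those posteriors must be equal:
\[
  P(A \mid \mathcal{I}_1) = q_1 \;\text{ and }\; P(A \mid \mathcal{I}_2) = q_2
  \;\text{ common knowledge} \;\Longrightarrow\; q_1 = q_2,
\]
% where $\mathcal{I}_i$ is agent $i$'s information partition.
```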

In the beginning, Robin Hanson and Tyler Cowen coauthored a paper about how disagreements are, therefore, either irrational or dishonest.

I had ambitious plans for a much longer post, but I don't feel like writing this one anymore, so I'm going to truncate it here and publish it. The main upshot was probably going to be something about how Wei Dai continues to be and have been the single best contemporary thinker.



Originally published April 3, 2016

I've been homeless and unemployed for a long time. So I started scavenging out of necessity. But I would keep doing it even if I were employed and dignified, to save money and prevent waste. It's smart and cool, like all forms of frugality.

I write this because it's not something that occurs to even the thriftiest of tightwads, and because even they reject the idea out of hand with some rationalization. "It's unhygienic," they say, but I've never gotten sick, even when all the food I ate was from trash cans. I'm on food stamps now. I can't spend that on anything other than food, so it doesn't cost me anything to scavenge less.

It wouldn't even have occurred to me if I weren't obsessed with not letting things go to waste. It started when I was hanging out alone in a dormitory's lounge, and some group walked in with a bunch of food, and then threw it away, with most of it untouched. "Americans. So wasteful," I thought. No one else was around, so I decided to unwaste it. I was newly homeless and not yet in the habit of scrounging, so I was hesitant for a bit. But it was great.

For several months, I fed myself by sneaking into that same building or the nearby one at 5am, and going to the lounge on every floor and picking out the good food. There was always enough that I could be picky about what refuse I consumed. There was good variety. Occasionally there was even alcohol. I stopped using those buildings after a few too many encounters with the cops. But that only happened when I was lazy and actually loitered in the building instead of looting the trash cans and splitting. It was hard not to, since the lounges were nice places with outlets, bathrooms, and free very fast wifi. Everything I needed!

After that I started using outdoor trash cans. This was a lot less fun. They were much farther apart, the amount of good stuff I could find was a lot less guaranteed, rain could fall and ruin stuff, and it was cold at night. Plus there was a much higher chance of being seen. Some part of me retained this irrational desire to appear dignified, so I only did this at night in the wee hours of the morning. I did it so much that I learned when janitors would be snooping around (annoyingly, essentially all the time), when students would be around, what days of the week were the best, and an efficient route that I could instantly modify depending on how much I wanted, how much I'd found already, which entrance to the campus I used, and so on, which ended somewhere that had an outlet and wifi. UCLA is nice enough to provide free wifi everywhere on campus.

When you're a veg*n[1] and homeless, and thus have no place to cook or store food, food stamps alone aren't enough to feed you. Scavenging was nice because it thus allowed me to eat enough without dipping into my real money, and without eating nothing but raw ramen, raw oatmeal, and bananas. I could splurge on canned soup and cans of cold refried beans.

My original idea for this post also included some guidelines for what discarded foods are good and what are bad, with the obligatory "your health is your responsibility, not mine, use your judgement" and tips for how to avoid being spotted, and other things I learned over years of scavenging, but whatever. I'm not going to convince anyone to try it, so that would be pointless.


[1] My dietary restrictions are to not economically incentivize suffering. So I can't purchase eggs, dairy, or meat from things that can feel pain. I still buy clam products, because I'm pretty sure clams can't feel anything. For game-theoretic reasons, I can't accept gifts of these things either. But there's no reason for me not to eat meat that's been discarded and was going to be discarded regardless of my decision to eat it. I don't trust "cage free" bullshit about how some animals are in less horrible conditions than others. It's all probably pretty horrible. Anyway it's annoying that there's no short word for this, since in my view it should be the most common dietary restriction. "Vegetarianism" is stupid, since it tends to produce more suffering by way of replacing meat (small numbers of animals in Hell-conditions) with eggs (large numbers of intelligent animals in Hell-conditions).