How to Actually Change Your Mind – Eliezer Yudkowsky

Rationality: rational beliefs should not be preferences; they should be the best estimate of the way things actually are. Why bother learning about rationality if you can't become fully rational? For the same reason you would double-check a test answer even though you can't be sure it will end up correct: 'if you don't seek perfection you will halt before taking your first step'. Rationality is a forward flow: it gathers evidence and outputs a conclusion. Rationalisation is a backward flow: from a conclusion to selected evidence. How can I construct an impeccable rational argument that supports a specific view? You can't. It must be done in a forward flow, without knowing what conclusion you will eventually reach; a logical argument must flow from the premises, not the conclusion. One normally wants lots of evidence, but that does not make it useless to try to get better at forming opinions when evidence is sparse, just as it is not sensible to ignore being good with money when you have less of it.

Correspondence bias: the tendency to draw inferences about a person's unique and enduring dispositions from behaviour that can be explained by the situation, e.g. thinking someone is an angry person when you see them kicking a vending machine, even though if you had ever done it yourself you would be unlikely to think it was because you are an angry person (we attribute our own actions to situations). Prior probability suggests that more people kick a vending machine because they missed a bus than because they are unnaturally angry. This links to the fundamental attribution error: the tendency to over-attribute others' behaviour to their dispositions.

How to argue: to honestly argue against an idea you should argue against its strongest advocates and its strongest arguments. Remember that it is better to argue physics than rationality: if something doesn't seem rational but can be clearly demonstrated experimentally, the experiment should trump. A good technical argument should aim to eliminate reliance on the personal authority of the speaker; arguments and credentials are different things, and arguments should matter more in cases of technical analysis. Beware if you see someone arguing that a policy is defensible rather than optimal, or that it has benefits compared with no action rather than the best benefit of any action. This is not to say that you will always be able to find the best action, just that 'better than nothing' is not a good justification for something. When trying to change someone's mind, the hope is that it takes less courage to visualise the uncomfortable state of affairs as a thought experiment than to consider how likely it is to be true, but that after doing the former the latter becomes easier.

Updating: if you have three arguments for something and someone presents one argument against, you may discount it because it is outnumbered 3:1; but as you keep disagreeing with more such statements, the evidence against builds up. Also, by rehearsing arguments you already knew you are double-counting evidence, since you use the same evidence over and over while counting each opposing argument only once. If you are hearing evidence from someone you know is biased, but what they say is true (they just happen to omit evidence that supports other views), does that mean you must keep updating your probabilities towards their view without being able to stop? It depends on how you think they screen information. For example, suppose a coin is biased with either a 2/3 chance of heads or a 1/3 chance of heads, and which it is is 50/50. If I tell you that out of ten flips, flips 4, 6 and 9 came up heads, but nothing about the others, what are the posterior odds that the coin is heads-biased? If they would have reported those particular flips regardless of the result, the posterior odds are 8:1 in favour of a heads bias; if they only ever report heads, the odds are 1:16, i.e. 16:1 against. Or consider the Monty Hall problem: if the host only offers to reveal a door when you initially picked the correct door, but you didn't know that, then swapping guarantees you lose. Nothing can ever be certain: if it could, then nothing would ever be allowed to change your mind again. Once you assign a probability of 1 you can never undo it; there is no updating from that point.
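A minimal sketch of the coin example above (plain Python with exact fractions; the variable names are mine, and the two reporting rules are the ones described in the text):

```python
from fractions import Fraction

# Two hypotheses about the coin, with prior odds 1:1.
p_heads_if_H_biased = Fraction(2, 3)
p_heads_if_T_biased = Fraction(1, 3)

# Reporting rule 1: flips 4, 6 and 9 were chosen in advance and would have
# been reported whatever they showed, so the data are simply "3 heads".
lr_preselected = (p_heads_if_H_biased / p_heads_if_T_biased) ** 3
print(lr_preselected)   # 8 -> posterior odds 8:1 in favour of a heads bias

# Reporting rule 2: only heads are ever mentioned, so the 7 unreported flips
# must have been tails; the data are "3 heads and 7 tails".
lr_heads_only = (p_heads_if_H_biased ** 3 * (1 - p_heads_if_H_biased) ** 7) / (
    p_heads_if_T_biased ** 3 * (1 - p_heads_if_T_biased) ** 7
)
print(lr_heads_only)    # 1/16 -> posterior odds 16:1 against a heads bias
```

The reported flips are identical in both cases; only the reporting rule changes, and that is what flips the direction of the update.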

Contamination: completely uninformative information, known to be false, can still influence decisions, and once an idea gets into your head it will prime compatible information; this becomes more extreme if you are cognitively busy. Descartes thought that one first comprehends what a proposition means, then considers it, and finally accepts or rejects it. Spinoza thought that we first passively accept a proposition in the course of comprehending it, and only afterwards actively disbelieve it if it is false. If Descartes is right, distracting subjects should interfere with accepting true statements AND with rejecting false ones. If Spinoza is right, distracting people should cause them to remember false statements as true BUT NOT make them remember true ones as false. It was found that distraction had no effect on identifying true propositions but did affect identifying false ones. In another study, people were given information about a criminal case, colour-coded by whether each statement was true or false. Some of the false statements said the crimes were worse than they actually were, and some said they weren't as bad. Cognitively busy subjects recommended 11.5 years with the bad lies and 5.8 years with the good lies; non-busy subjects gave 7 and 6 years respectively. This supports Spinoza and shows that we must be very careful about exposing ourselves to unreliable information, especially when busy.

Motivated actions: a motivated skeptic asks whether the evidence compels them to accept a conclusion ('do I have to accept this?'); a motivated credulist asks whether the evidence allows them to accept it ('is there any way I can accept this?'). When questioning your own beliefs you are much more likely to attack their strongest points than their real weak points, and after the first comforting reply you are likely to stop rather than question that reply as well. Taber and Lodge's 'Motivated Skepticism in the Evaluation of Political Beliefs' describes the confirmation of six predictions:

  1. prior attitude effect: supportive arguments are evaluated more favourably
  2. disconfirmation bias: more time is spent trying to refute contrary arguments than supportive ones
  3. confirmation bias: people seek out supportive arguments
  4. attitude polarisation: a balanced set of pros and cons exaggerates initial polarisation
  5. attitude strength effect: stronger attitudes are more prone to the biases above
  6. sophistication effect: knowledgeable subjects are more prone to these biases because they have more ammunition with which to counter-argue incompatible facts (having more knowledge can hurt you)

Motivated continuation can try to disguise itself as a virtue: who can argue against gathering more evidence? Motivated continuation is when the evidence is leaning in a direction you don't like, so you decide to pursue more evidence even when it is expensive. Motivated stopping is when you close the search as soon as you reach a comfortable conclusion, even though plenty of fast, cheap evidence is still available.

Biases with data presentation: 

  1. subjects thought a disease was more dangerous when it was described as killing 1,286 out of every 10,000 people than one described as 24.14% fatal (Yamagishi, 'When a 12.86% mortality is more dangerous than 24.14%'); see the sketch after this list
  2. one group was told about a measure that would save 150 lives; another was told about one that would save 98% of 150 lives. The second was judged more favourably
  3. most people preferred a gamble on drawing a red ball when there were 7 red balls out of 100 over one with 1 red ball out of 10, because there are MORE RED BALLS
  4. when deciding on gambles to take, students preferred a gamble with a very small loss (e.g. 5p) over one with no loss at all: £9 isn't all that much money, but 9:0.5 is an amazing win/loss ratio. Just make sure the subjects don't see the two gambles side by side
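A quick check of the arithmetic in items 1 and 3 (a trivial Python sketch using the numbers quoted above):

```python
# Item 1 (Yamagishi): the disease framed as "1,286 out of 10,000" is in fact
# the less deadly one, yet it was judged as more dangerous.
print(1286 / 10_000)       # 0.1286, i.e. 12.86% < 24.14%

# Item 3: 7 red balls out of 100 is a worse bet than 1 red ball out of 10,
# despite there being "more red balls".
print(7 / 100, 1 / 10)     # 0.07 < 0.1
```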

Beliefs: to improve our beliefs it is necessary to change them; otherwise how could they improve? Belief in belief is a dangerous thing: you might find it hard to believe something yet find it much easier to believe that you should believe it. This can leave you with protected beliefs that you are unwilling to change and that you merely think you should hold. One must also be wary of beliefs one would be reluctant to correct false praise of: for example, if you feel that contradicting someone who has made a flawed but nice-sounding claim about evolution would amount to supporting creationists, then your belief in (or understanding of) evolution may be rather fragile.

Ignoring reality: if you want to understand an enemy and you construct motives that make the enemy look bad, you will almost certainly be wrong about what actually goes on in their heads. Passive voice obscures reality: if a study says 'the patients were administered a drug', that hides the reality of college students being handed 20 pills and some instructions; and if you are told 'it is thought that …', thought by whom? You are aware of biases and fallacies, but they likely seem much less salient when you think about yourself. If we start with a truth and there is no new source of information, then it can only degrade in transmission from generation to generation, or at best stay the same; in practice, with the passing of time and new discoveries, it will degrade. The Torah, for example, loses knowledge every generation as contradictory evidence is found, while science gains knowledge, so no matter where they both start, science will surpass it. The power of science lies in admitting that we are sometimes wrong and learning from it; ignoring reality and never admitting you were wrong by no means means you have made fewer mistakes. A good hypothesis should explain some outcomes that could happen, but should not be able to explain every possible outcome one might see. In a situation where everyone is praised for saying how bad something is and no one dares urge restraint (as with 9/11), the reaction almost has to exceed the appropriate level: the USA spent billions of dollars and thousands of lives shooting off its own foot in response to 9/11, better than any terrorist could have dreamed. While on the subject of ignoring reality, consider the lottery. The book suggests an entertaining alternative that might increase the utility people get from playing: if most of the utility comes from getting to believe that you could be richer, then a definite draw date at which that belief is dispelled works against it, so he suggests a new lottery that would pay out every five years on average but at a random, unannounced time, so that you can savour the anticipation for longer.
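A toy sketch of that alternative lottery (hypothetical Python; the book only says 'every five years on average, at a random time', so the memoryless exponential waiting time here is my own assumption about how that could work):

```python
import random

MEAN_YEARS_BETWEEN_PAYOUTS = 5.0  # assumed target average from the text

def years_until_next_payout() -> float:
    # Memoryless waiting time: the draw is never "due", so ticket holders can
    # keep savouring the anticipation, while payouts still average one per
    # five years in the long run.
    return random.expovariate(1.0 / MEAN_YEARS_BETWEEN_PAYOUTS)

random.seed(0)
samples = [years_until_next_payout() for _ in range(100_000)]
print(sum(samples) / len(samples))  # ~5.0 years on average
```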

Anchoring/priming: some ways to reduce anchoring (you are still unlikely to get rid of it entirely): if the initial guess sounds implausible, try to throw it away completely; you can also try to think of an anchor in the opposite direction. One way to think about human cognition is that it largely consists of cache lookups of previous thoughts, primed by what we hear. You will often see people who aspire to critical thinking nonetheless repeating cached thoughts that were not invented by critical thinkers.

Conformity: Asch’s conforming experiment- 3/4 of subjects gave conforming view on length of line. conformity seems to strongly increase up to 3 people who state a view but doesn’t grow much after that but a single dissenter seems to shatter the effect even if they dissent to a wrong answer. the effect is also stronger when those conforming are of a similar group to you 

Similarities between volume and juries: when a subject rates a single sound on an unbounded scale with no fixed standard of comparison, nearly all the variance is due to the arbitrary choice of modulus (what value people assign to a baseline noise) rather than to the sound itself. This is a lot like jury deliberation on punitive damages: jury damage awards are better understood as a psychophysical measure of outrage, expressed on an unbounded scale with no standard modulus.

Calibration: Dale Griffin and Amos Tversky ('The Weighing of Evidence and the Determinants of Confidence') asked people facing a decision between jobs how confident they were in their predicted choice. The average confidence was 66%, but in reality 96% chose the option they had predicted. This means that once you can guess what your answer will be, in all probability you have already decided.

Cults: when cults suffer a major shock they seem to become stronger. Why? One argument is that, given the investment in the cult and the drive for consistency, people don't want to admit they were mistaken. Another is to ask who is most likely to leave first: probably the less fanatical members, leaving behind only the most fanatical.

Affect heuristic: you have a grandfather clock given to you by (a) your grandparents or (b) a distant relative. How much would you pay to insure it, if the policy pays out $100 should the clock be stolen? People in group (a) were willing to pay twice as much, even though the policy doesn't protect the clock; it only pays a fixed amount if it is stolen.

Problem solving: Maier's edict: do not propose solutions until the problem has been discussed as thoroughly as possible without suggesting any. To come up with alternatives, try closing your eyes and brainstorming wild and creative options, trying to think of a better alternative, for a set time measured by an actual clock.

People: when Christmas shopping, decide how much money you want to spend and then find the most worthless object that costs that amount; the cheaper the class of object, the more expensive a particular item within it will appear. Who was the most altruistic person of the 20th century? Gandhi may be a strong pick, but was he the most altruistic or just the most famous altruist? One must consider whether his fame was a protection or a liability: if it was a protection, his altruism score should be revised down, since he wasn't risking as much as, say, an anonymous member of the crowd. Francis Bacon was the only crackpot in history to claim a monumental level of benefit to humanity from an idea and turn out to be right (the scientific method). In modern Orthodox Judaism you're allowed to doubt; you're just not allowed to successfully doubt.

Uncertainty: a degree of uncertainty is not the same as a degree of truth. E.g. flip a coin but don't look at it: it may in fact be 100% heads, yet you remain completely uncertain. Just because multiple things are uncertain does not mean their uncertainty is the same. If you believe that odds of 1 in 2^7,000,000 still mean 'there is a chance', then all probabilities are likely to mean the same thing to you: 'uncertain'.

Crisis of faith: This relates to identifying and challenging particular beliefs. You might want to consider a crisis of faith if:

  1. you have held a belief for a long time 
  2. the belief is surrounded by a cloud of known arguments and refutations
  3. you have sunk costs in it (time, public declaration)
  4. the belief is emotionally important to you
  5. the belief has become part of your personality 
