There are hundreds of studies out there proving this and that about supposed human nature. One of the more interesting areas of study is cognitive biases. This research tries to illuminate some of the many reasons why people decide to do things contrary to the way a machine would.
Halo Effect
Has there ever been someone whose art you did not find very interesting, but then you became friends and started to appreciate their work? This is an example of the halo effect, a cognitive bias wherein your perception of one attribute of a person colors your perception of their other attributes. Physically attractive candidates are more likely to be hired, because that one good quality makes people more likely to trust them and to think more highly of their intelligence. This works in reverse, too: something judged to have one negative trait will be treated as though it has many. The most common application of this effect is in branding. After the iPod’s success, Apple computers started flying off the shelves; a good impression of a company leads to a more positive view of all of its products. After one Toyota model had a braking problem, sales of all their cars plummeted despite no evidence of problems elsewhere. To counteract this, Toyota splashed advertisements all over the globe, first apologizing for the problem and then claiming to have the most innovative and inspiring safety-technology development program in the world, one being used to help build more effective football helmets.
Confirmation Bias
This is the tendency of an individual to believe information that confirms their preconceptions, regardless of the truth. George Bush once said, “If we do not succeed, we run the risk of failure.” According to an email chain letter, John Kerry said it too. Tracing it back a decade, the quote was originally attributed to Dan Quayle; in fact, it was written by MAD magazine in 1991. Each attribution was false, but each was fully believed by those who already thought poorly of that candidate’s intelligence.
This bias can be seen in the effects of psychics, placebos, court juries, religion, horoscopes, shopping decisions, and critiques. If you believe one thing to be true, you will more readily accept evidence that supports it than evidence that refutes it. This is related to the halo effect, and it is one of the more dangerous biases because once you believe something it is very hard to change your mind. If you believe the leader of your country is looking out for you, it may take a revolution like the ones in Africa and the Middle East for you to notice your lack of civil liberties. If you are brought up believing in a god, you will likely find reasons to keep believing for the rest of your life, and you will be more susceptible to religiously induced discrimination and extremism. If you have a negative view of yourself, you will more easily believe negative criticism than positive feedback.
Dunning-Kruger Effect / Imposter Syndrome
It takes some confidence to call a thing finished. The Dunning-Kruger effect, named in 1999, suggests that the less informed someone is, the less capable they are of giving an intelligent opinion on their relative skill in any given trade; they tend to overestimate their positive qualities and underestimate their negative ones. This is known as illusory superiority.
Those who are more informed are more capable of finding faults, and they often suffer from illusory inferiority: the underestimation of positive qualities and the overestimation of negatives. This is related to, and often causes, Imposter Syndrome, coined in 1978, which describes an inability to internalize one’s accomplishments. Those affected by this phenomenon believe their success is attributable to chance, to good timing, or to having deceived others into believing they are more competent than they believe themselves to be. The subjective nature of art means that artists are especially susceptible to this affliction, as many accomplishments cannot be quantitatively measured.
"One of the painful things about our time is that those who feel certainty are stupid, and those with any imagination and understanding are filled with doubt and indecision."-- Bertrand Russell
Cognitive Dissonance
Cognitive Dissonance is an extensively studied social psychological phenomenon. It is the discomfort one feels when facing conflicting ideas. Have you ever been unsure what to do next on a project and decided to do something dramatic, like painting over it, cutting it in half, or burning it? The many possible options seem unlikely to help what you have deemed an inherently flawed work, so you radically change it. Once you do, you are unsure if it is any better, but it would be difficult or impossible to undo your decision, and so you decide you like it better this way, that the old version was hideous and embarrassing.
Cognitive Dissonance is especially visible in two parts of that process.
First, the work is unresolved and unsatisfying, and any changes you make now have to overcome the negative label you have given it. There is a dissonance between your desire to make something worth your time and your current negative opinion of the piece. This builds as you feel more and more certain that to continue would be a waste of your time, and to cut through that dissonance you feel you have three choices: stop and call it done, which leaves a lingering feeling of disappointment; throw it out, which makes the hours spent on it feel wasted; or radically change it, which is a toss of the dice. More often than not, people will run the odds and make the change.
At that point, you are faced with a dramatically different piece, and you are biased toward liking what you did in order to justify having done it. Maybe it was done without thinking through all the options, and maybe it is not quite what you hoped would happen, but one thing’s for sure: you can’t go back to what it was. And so, to justify the decision, your opinion of the way it looked before the change becomes more negative while the outcome is viewed more positively. This is called post-decision dissonance, and it prevents a fair evaluation of the work and of the decisions you have made.
The best way to compensate for this phenomenon is to notice the fourth option – to put it down, out of view, and come back to it another time. This is difficult to accomplish because the dissonance stays with you (like an unclosed parenthesis.
So if we accept that all we see, think, and feel is biased enough that we can't think our way out of it ourselves, then the only way to have clarity is to seek and truly accept feedback from people who don't agree with us, and who we don't trust, may not even like. They may not like us. They only have to be able to speak to us without utter contempt, and to say what they really think or feel. This is really hard.
Living with dissonance rather than ruining a work that is on the painful edge of becoming really good, even illuminating, gets easier only when we have faith that the process of our life will eventually make it sensible, not immediately but in the wake of other change. Then we can mail it to our future selves, and let it go. I've got a bunch of old writing and lyrics that I've been picking up again. Twenty years later I finally have some idea how to align some of them to flow without applying "lethal force". A couple others I've put away again - not yet.
This is not a road for cowards, only for people who want to get real and are willing to do what it takes to do it, to not be solitary but yet not always be among kindred souls.
Thanks for bringing this post back to my attention, it helped for me to reread it now.
I agree with you that we should seek feedback from people who disagree with us, though that runs the dangerous risk of reinforcing a negative feedback loop. It's the same as in any relationship - if I don't value myself, I will look for people to support that negative theory just to prove myself right, regardless of my knowledge of how miserable that will make me. In subjective fields of artistic merit, certainty is a valuable commodity bought through the exclusion of other potential truths.