Consensus and the science of persuasion (and . . . scientific consensus)


It is time to conclude the six-part report on Robert Cialdini’s concepts about persuasion and marketing.  The other five principles are:

  1. Reciprocity
  2. Scarcity
  3. Authority
  4. Consistency
  5. Liking

The final principle relates to consensus or social proof.  See this page on social-engineering.org.

In his book, “Influence: The Psychology of Persuasion”, Dr. Robert Cialdini states, “Social Proof – People will do things that they see other people are doing. For example, in one experiment, one or more confederates would look up into the sky; bystanders would then look up into the sky to see what they were seeing. At one point this experiment aborted, as so many people were looking up that they stopped traffic.”

The social engineering framework site continues:

Social proof is utilized in sales when high sales numbers are released, demonstrating to potential customers that the product is popular. A form of this is also taken advantage of when companies release shirts with logos or slogans printed on them; the wearer then gives an implicit endorsement.

Social proof is influenced not just by large groups but also by high-profile individuals. For instance, when a single celebrity becomes associated with a product, others want to be associated with the celebrity’s positive traits and will then use the same product. The actor Samuel L. Jackson, for example, is often perceived as being “cool” and “hip”. In publicity photos he often appears wearing Kangol hats[3]. An example can be found here[4], depicting Samuel L. Jackson wearing a Kangol hat. This has helped increase sales of the product and is often used in advertising.

This old Candid Camera clip shows social proof in action, in an elevator.

However, there is more to the story than meets the eye.  Recently, an academic study sought to figure out why, even though the scientific community has reached consensus that global warming is a real issue, public perceptions about the severity of the problem haven’t kept pace.

This National Science Foundation news release from 2010 is a bit lengthy, but I’ll reproduce it here because it makes some important points about how we perceive information and decide whom to believe:

Suppose a close friend who is trying to figure out the facts about climate change asks whether you think a scientist who has written a book on the topic is a knowledgeable and trustworthy expert. You see from the dust jacket that the author received a Ph.D. in a pertinent field from a major university, is on the faculty at another one, and is a member of the National Academy of Sciences. Would you advise your friend that the scientist seems like an “expert”?

If you are like most people, the answer is likely to be, “it depends.” What it depends on, a recent study found, is not whether the position that scientist takes is consistent with the one endorsed by a National Academy. Instead, it is likely to depend on whether the position the scientist takes is consistent with the one believed by most people who share your cultural values.

This was the finding of a recent study conducted by Yale University law professor Dan Kahan, University of Oklahoma political science professor Hank Jenkins-Smith and George Washington University law professor Donald Braman that sought to understand why members of the public are sharply and persistently divided on matters on which expert scientists largely agree.

“We know from previous research,” said Dan Kahan, “that people with individualistic values, who have a strong attachment to commerce and industry, tend to be skeptical of claimed environmental risks, while people with egalitarian values, who resent economic inequality, tend to believe that commerce and industry harms the environment.”

In the study, subjects with individualistic values were over 70 percentage points less likely than ones with egalitarian values to identify the scientist as an expert if he was depicted as describing climate change as an established risk. Likewise, egalitarian subjects were over 50 percentage points less likely than individualistic ones to see the scientist as an expert if he was described as believing evidence on climate change is unsettled.

Study results were similar when subjects were shown information and queried about other matters that acknowledge “scientific consensus.” Subjects were much more likely to see a scientist with elite credentials as an “expert” when he or she took a position that matched the subjects’ own cultural values on risks of nuclear waste disposal and laws permitting citizens to carry concealed guns in public.

“These are all matters,” Kahan said, “on which the National Academy of Sciences has issued ‘expert consensus’ reports.” Using the reports as a benchmark, Kahan explained that “no cultural group in our study was more likely than any other to be ‘getting it right’,” i.e., correctly identifying the scientific consensus on these issues. “They were all just as likely to report that ‘most’ scientists favor the position rejected by the National Academy of Sciences expert consensus report if the report reached a conclusion contrary to their own cultural predispositions.”

In a separate survey component, the study also found that the American public in general is culturally divided on what “scientific consensus” is on climate change, nuclear waste disposal, and concealed-handgun laws.

“The problem isn’t that one side ‘believes’ science and another side ‘distrusts’ it,” said Kahan referring to an alternate theory of why there is political conflict on matters that have been extensively researched by scientists.

He said the more likely reason for the disparity, as supported by the research results, “is that people tend to keep a biased score of what experts believe, counting a scientist as an ‘expert’ only when that scientist agrees with the position they find culturally congenial.”

Understanding this, the researchers then could draw some conclusions about why scientific consensus seems to fail to settle public policy debates when the subject is relevant to cultural positions.

“It is a mistake to think ‘scientific consensus,’ of its own force, will dispel cultural polarization on issues that admit scientific investigation,” said Kahan. “The same psychological dynamics that incline people to form a particular position on climate change, nuclear power and gun control also shape their perceptions of what ‘scientific consensus’ is.”

“The problem won’t be fixed by simply trying to increase trust in scientists or awareness of what scientists believe,” added Braman. “To make sure people form unbiased perceptions of what scientists are discovering, it is necessary to use communication strategies that reduce the likelihood that citizens of diverse values will find scientific findings threatening to their cultural commitments.”

So, it seems we believe what scientists say only when it matches what our social group and values indicate is correct.  I suppose that explains in part why some AEC marketers fail to change their practices even when scientific evidence suggests a different course of action would be wiser.
