Empirically effective methods of convincing argumentation
This study recently popped up on r/changemyview. It attempts to pinpoint the specific differences in arguments that make them more effective at changing a person's opinion. This has always been a subject of interest for me, and I'm guessing, others on the forum. While the forum's linear chronological design* is not one that fosters well-structured & effective group discussion (more like everyone yelling over each other in a room, amirite), I'm curious what methods, if any, you intentionally use to be convincing in general conversation and argumentation, ideally methods with non-anecdotal evidence of effectiveness. Please include links to abstracts/studies.
*As a side note, Metafilter has a wealth of surprisingly coherent and focused discussions despite its format, which I suppose speaks to the quality of its moderation shaping the demographic.
Comments
I look forward to reading this paper when I can.
I've long been following the Cultural Cognition Project, a body of research that concerns itself with public assimilation of empirical information. The publications are numerous, but the general theory points to groups of people assessing information along the lines of risk perception. Rather than assess the validity of the information, populations assess the risk factors of accepting that information; most of the perceived risks center around destabilizing certain central commonly-accepted tenets of social structure. Information that challenges those tenets is perceived as risky, and the group "defends" against it.
Trustworthiness of the messenger plays a vital role in communicating that information. An "other" will have an almost impossible time getting someone to understand information they perceive as posing a risk to some central social tenet - they're treated as an enemy. A trusted messenger can convince people far more readily - they're the Trojan Horse of reason.
So my actual long-term strategy involves becoming a "trusted messenger" and using gentle guidance and ego-stroking manipulation to convince people. It's slow work.
I am the Socrates. Rym is the Callicles. I attempt, but obviously fail, to deliver only empirical truths. I put no effort or thought into the delivery method, or how I can better convince the recipient. If they won't accept the truth, then either they are not worth my time, or I'm wrong and they're hopefully going to show me how wrong I am.
Rym is the opposite. He wins, and convinces many people to vote for him in the "who won the argument" vote.
At least at the end of the day, when I go home in defeat, I know that I'm not an evil lying jerk like some people.
I am Aristotle.
More to the point of the thread, I don't have a method I've worked out where I can convince people of anything on a reliable basis, although that is not from a lack of trying. I've reached a point where I just blame it on my face, the way I carry myself, or some other physical quality that I can't identify that just makes people disinclined to listen to what I have to say.
The key for me is making that fact work to my advantage. ;-)
Not that any of this means anything.
A more recent perspective on truth, though I haven't really read much of either of these writings.
There is no provable absolute truth. At the bottom of every philosophy there is some inspired assumption which the philosopher defends with reasons they have sought after the fact. There is no rational principle behind our world. The most noble goal in life is to create art. To live well is to be art. If life is a dream, "I will dream on!"
Otherwise, I basically try rational discourse and evidence. If it's clear that they are effectively immovable by this method, then I fall back immediately on heavy rhetoric.
Of course, there are two very different kinds of argumentation. In one case, I am attempting to spread truth. While I could well be convinced of a different truth in these cases, it isn't likely or expected. E.g., the efficacy of chiropractic.
In the other case, I am attempting to basically merge my notions with the other person's notions to come to some third way. I am pushing for what I generally hold to be correct, but am also synthesizing the other person's ideas into my own immediately. The goal is consensus or divergence: either defining a mutually agreeable position or identifying the core disagreement and focusing heavily on that. E.g., whether or not execution is a morally acceptable punishment for large scale war crimes.
I do have citations to back up the first part of this: my use of intractable people as examples for others.
There are tons of case studies on the rough phenomenon where a person believes something more strongly the more evidence they are presented with against it. This one is probably my favorite: a doomsday cult in the 50s, infiltrated by researchers and observed as the moment of their doom prophecy came and went without the destruction of the Earth. It is an excellent example of so-called "motivated reasoning," wherein people will reliably rely on increasingly ridiculous rationalizations for a motivated belief rather than admit it is wrong. They'll even incorporate direct evidence to the contrary into the core of their belief. They basically cannot be convinced by anything.
I use that as the basis of my interaction with them. But, only if there is an audience. As soon as I see someone falling into that pattern of rationalization, I push them HARD. I use rhetoric to fluster them a little, then hit them with rapid-fire evidence against them. I'll let them rebut two or three of those before throwing even more at them. It forces them to build a fairly complex rationalization framework in a hurry, and in front of witnesses (so that it cannot easily be retracted or walked back).
Inevitably, the rationalization will include some ridiculous elements, or an admission that it is rooted in something socially unacceptable, like open racism. If it doesn't immediately, I keep pushing until they're backed into a corner and it happens. Once that kernel of ridiculousness comes out, I attack it mercilessly.
They're never convinced. But the witnesses sometimes are.
I don't have a citation for this further result, but I believe there are a few factors at play.
1. The witness is not as motivated in the belief, and because they are not being attacked directly themselves, they do not fall into the reflexive trap of rationalizing it. The person with whom I argue in this case is basically a sacrificial lamb to convince the others. An argumentation voodoo doll.
2. The witness fears the social pressure of agreeing with the ridiculous rationalization (e.g., admitting that they're racist), and even if they are not fully convinced, they hide their belief for fear of social consequences. I suspect, but cannot prove, that people who do this are more pliable to eventual convincing, since they are not able to express the rationalizations openly and thus do not cling to them as powerfully.
So yeah... I have one specific set of case studies and related research saying that people will fall back on increasingly absurd rationalization when directly confronted with evidence. I use that knowledge to achieve that state, and I believe it is effective in convincing third parties.
Not surprised that I got Sartre/Camus, followed by Nietzsche. There is an amusing irony in the popularity of the quiz tangent.
Now I'm thinking about arguments I've had where I earnestly was trying to convince the other person, rather than giving up and TRYING to get a rise out of them.
How can I convince someone that all positive integers have a unique set of prime factors, if I first have to explain what a prime number is? Ain't no one got time for that.
"No, really, it's a theorem. It's 100% true forever. Not like we haven't discovered a counterexample yet, more like... literally impossible for it to be any other way."
Backfire'd.
Instead of a great leveling out of discourse, it appears that a minority will progress deeply into these subjects, while the majority will seek zero new information. The gulf between people who seek information and people who do not widens, and there is no discourse bridging them.
It is deeply frustrating to talk about anything with someone who both has a strong opinion on something and knows almost nothing about said thing. This is exacerbated by the knowledge that they could trivially learn these things with a few hours of baby's first googling.
I agree. I've recently come into contact with an individual in text chat who expects you to tell him everything, literally everything he asks of you, from the temperature in your city to the plot of a movie or the details of a historic event, and he seems not to understand that he can look these things up within seconds.
If you use such a disreputable popular source for your argument, I know you don't know enough about the subject to be worth debating.
If you cared more about the subject, you would do your own further research, and you'd find that either:
1. The popular source is correct, in which case you'd find other, more rigorous sources to cite instead,
Or
2. The popular source is incorrect, in which case you shouldn't bring it up at all.
The thing with the Zeitgeist movies is that some of the topics interest me, but they are mixed in with standard conspiracy theory stuff, which, even if partly correct, weighs down the ideas with unneeded baggage.
If you care enough about a belief or idea, you will research it enough to know what sources to cite. If you don't, forget even talking to me about it, let alone expecting me to respond.