all 9 comments

[–]zyxzevn[S] 2 insightful - 1 fun (0 children)

Also very important:

I tried to capture most of this with logical fallacies.
But because it is so common, I think it is good to look at it separately.

From the article:

These are the main behaviors to watch out for:

  1. Illusions of invulnerability lead members of the group to be overly optimistic and engage in risk-taking.

  2. Unquestioned beliefs lead members to ignore possible moral problems and ignore the consequences of individual and group actions.

  3. Rationalising prevents members from reconsidering their beliefs and causes them to ignore warning signs.

  4. Stereotyping leads members of the in-group to ignore or even demonise out-group members who may oppose or challenge the group’s ideas.

  5. Self-censorship causes people who might have doubts to hide their fears or misgivings.

  6. “Mindguards” act as self-appointed censors to hide problematic information from the group.

  7. Illusions of unanimity lead members to believe that everyone is in agreement and feels the same way.

  8. Direct pressure to conform is often placed on members who pose questions, and those who question the group are often seen as disloyal or traitorous.

  9. (addition) Tendency to swing from the status quo to the complete opposite.

  10. (addition) They become control freaks (and lack critical thinking).

[–]CompleteDoubterII 2 insightful - 1 fun (3 children)

Another thing I would recommend is posting your views on forums and asking people to debunk them or provide further evidence for them. You would probably find evidence for and against your view that you wouldn't have thought of on your own.

[–]zyxzevn[S] 2 insightful - 1 fun (2 children)

It will certainly work for small stuff.

But psychologically, it is very hard for people to change their views.
Especially when there is some fear or there are consequences involved.
There are some good examples of that given by Scott Adams on his podcasts.
In the latest one he explains how people do not accept the evidence in the court documents, but rather go protesting based on fake news.

So instead, I invite people to explain the situation in a way that the story contains almost no opinions: (1) no logical fallacies, (2) evidence-based, etc.

This can improve the understanding of the full situation and the different ways to view it. So a person can say: "There is good evidence for this, but I believe that idea."

It may sound contradictory, but the person can then:
(1) improve the evidence for his idea,
or (2) accept the idea with the best evidence,
or (3) keep his idea, but understand that others may not support it,
or (4) look at another idea.

And at no point will the person feel attacked when people add more evidence or reasons to support a different idea.

I hope this improves the communication between people who have completely opposing viewpoints.
They can even support each other in building a consistent and valid theory around their ideas.

[–]CompleteDoubterII 2 insightful - 1 fun (1 child)

I was talking about investigating the truth with no stake in where the evidence lies. I think people would be able to change their minds based on the evidence in that scenario. That being said, you are completely right in saying that it is hard for people to change their views when they have a stake in the truth. This is especially apparent in the case of social conformity. I will assume one is aware of the Asch Conformity Experiment. Gregory Berns ran a variation of the experiment in which he measured the participants' brain activity during a task (rotating 3D objects). His results showed that the occipital–parietal network was active when participants answered incorrectly, meaning social conformity overwrote their perception of reality. This is especially important since the results were physically verifiable. Imagine the implications for matters of opinion and claims that aren't physically verifiable (hat tip StormCloudsGathering).

Your idea of inviting people to explain the situation without any opinions may be a good way for people without strong stakes in what the truth is, but I doubt those with a strong investment in a particular claim being true would be able to do that.

[–]zyxzevn[S] 2 insightful - 1 fun (0 children)

Great points.
Maybe I should add a psychology & bias section to the list.

but I doubt those with strong investment in a particular claim being true would be able to do that

You are correct on that.

I chat with astronomers about clear problems in their field.
One huge problem is "magnetic reconnection", which completely breaks with all basic physics and with observations. But there are many others.

But instead of seeing the problem as it is, they just refer to someone else.
The "expert" becomes the "holy prophet".
They just copy the belief that someone else had.
Even if it clearly gets falsified in an experiment.
And bad maths and oversimplification become a way to prove impossible things.

The idea of inviting people to investigate theories according to logic and basic science
is to see the quality of evidence for certain theories.
Even if they prefer other theories.
It can give you a better understanding of how other people may think, and how well received your theory might be.

My experiences:

I learned a lot from Architects and Engineers for 9/11 Truth.
In the beginning I completely believed the idea that the towers came down due to the airplanes.
But with basic physics and basic logic they make a very convincing case that the towers came down by demolition.
And while I did not immediately believe them, I could accept the quality of their theory.

So when something happens, I want to invite people to become an investigator.
Like a crime investigator.
And look for evidence and use logic like a crime investigator would do.
In this I am copying the method that the Architects and Engineers use.

Logical fallacies:

And the first thing on my list are the logical fallacies.
That is because the first news and first opinions that you find will contain a huge amount of logical fallacies.
You need to remove the logical fallacies before you can use any of that information.

But the logical fallacies can also show you towards which opinions you are being pushed.
This can reveal the underlying agenda or narrative that the news pushes.
Or even expose some propaganda.
And if it is too revealing, it may even be reverse psychology.
(The news makes you resist more, and it is meant to work that way.)
In some cases there are planted or staged stories/events to make the narrative seem even more true.
Like the group of "Russian bots" that were run by an agency paid by the Democrats,
to make people think that they were targeted by Russia.

[–]zyxzevn[S] 1 insightful - 1 fun (1 child)

There is a lot to add about science, because there are also techniques to corrupt science, which are often used.

  1. Experts, who have big titles, but are actually doing the bidding of their sponsors. Or are just stating opinions instead of facts.

  2. Experts (2). When you are trained a lot with a hammer, everything becomes a nail.

  3. Experts (3). Experts in Astrology claim that Astrology is useful.

  4. Corporations use science publications as a way to advertise their products. And to hide defects. So they will write lots of reports that show how good a product is, and lots of reports that carefully avoid the defects or problems. For example, Monsanto designed lab experiments to last only a short time, so that the cancer would not show up. Often you can see that certain products, which would require only simple tests, still take a long time to test. That is because the tests are designed to avoid all the problems.

  5. Peer review. The peers are often connected to corporations. Or they are insiders who do not want the field to change. Due to their extreme bias, they are also not able to see things in a different way. And this stops any real change in the field.

  6. Ghost writers. Often corporations write an article and use the name of an expert scientist (often for money). So it seems that the article came from that scientist.

  7. Complaining about facts. When the facts are not in line with the corporation's interests, they let scientists write (or sign) articles claiming that the facts are bad science. Like the hot and cold cycles of the climate.

  8. Diversion to false theories. So instead of dealing with the facts of the cycles in the climate, the scientists complain about people not listening to them. They even create false theories to claim that the people looking at the facts are crazy. Or that the cycles are due to some hidden ocean cycle, which is easily disproven.

  9. Diversion to hyped solutions. Instead of dealing with the science, the scientists pretend it is settled. So they refuse to talk about the facts. Instead they talk about hyped solutions, that usually do not even work, if their fantasy was correct. So people have to pay energy-taxes, while special big corporations get a free pass.

  10. Diversion to futurism. Instead of dealing with the science of the real problem, we get solutions that we do not have the technology for. Like nuclear fusion. Or GMO-mosquitoes. Or a base on Mars. Or a vaccine/medicine that has never showed to work yet. These diversions make us forget about the real problem that we are facing now.

  11. Patents. Patents and other forms of intellectual property are always a problem. But they are presented as a solution, because they create a monopoly for the corporation. And monopolies lead to maximum possible profits and extortion.

I will update this regularly.

[–]zyxzevn[S] 1 insightful - 1 fun (0 children)

More on corrupt science:

  1. People write in popular magazines instead of writing actual science. This prevents real experts from countering their claims.

  2. Cherry-picking a single event or single detail, and making it seem extremely important. Instead, it should be weighed together with all the other evidence.

  3. Reverse faking: "If I can fake it, it must be fake." This usually omits some details that are harder to fake, or circumstances in which the fake is very unlikely. In the same line: "It must be a hoax."

  4. Not discussing/countering the null hypothesis. "What if there was no ......?" Without a null hypothesis, we are jumping to conclusions.

  5. It was "researched". And after investigation it turns out that no one was responsible for the research, or only a single person. Everyone just assumed that good research had been done, or that good science had been performed.

  6. Censorship of conflicting science. No science publication can pass peer review if it puts severe doubt on previously made conclusions.

  7. Preferable conclusions. Either due to politics, finance, prejudice, or even because "it looks nice", the scientists come to conclusions without any real scientific evidence. A lot in theoretical physics was accepted because it was mathematically beautiful.

  8. Maths as evidence. Or a simulation as evidence. If you make a mathematical model and the data fits the model, does that make the model undeniably correct? No, of course not. Even if it is very precise, it might be correct only for this tested system and for certain circumstances. Sometimes a model has removed the influence of other factors (like noise), but due to the errors in the model, the test seems perfect. In a simulation this can go even further, because a simulation has even more simplifications and systematic errors. Maths or simulations should not be considered as evidence without properly testing the alternatives and thoroughly investigating the limits of the model/simulation.
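Point 8 can be made concrete with a small sketch (a hypothetical illustration, not taken from any study mentioned in this thread): a very flexible model can fit pure random noise almost perfectly in-sample, yet it typically predicts fresh data from the same source worse than a trivial "null model" that just predicts the mean. A good fit alone is therefore weak evidence that a model is correct.

```python
# Hypothetical illustration: a flexible model fitting pure noise.
# A near-perfect in-sample fit says little about whether the model is true.
import numpy as np

rng = np.random.default_rng(0)

# "Data" with no real signal at all: independent Gaussian noise.
x_train = np.linspace(0.0, 1.0, 12)
y_train = rng.normal(size=12)
x_test = np.linspace(0.0, 1.0, 200)
y_test = rng.normal(size=200)

# A very flexible model: a degree-9 polynomial (10 free parameters).
coeffs = np.polyfit(x_train, y_train, deg=9)

def mse(pred, truth):
    """Mean squared error between predictions and observed values."""
    return float(np.mean((np.asarray(pred) - truth) ** 2))

train_err = mse(np.polyval(coeffs, x_train), y_train)  # small: the model chases the noise
test_err = mse(np.polyval(coeffs, x_test), y_test)     # larger: the fit does not generalize
null_err = mse(np.full_like(y_test, y_train.mean()), y_test)

print(f"train error: {train_err:.4f}")
print(f"test error:  {test_err:.4f}")
print(f"null model:  {null_err:.4f}")
```

This is the "testing the alternatives" step the comment asks for: comparing against a null model and checking out-of-sample error exposes a model that merely memorized its data.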

[–]zyxzevn[S] 1 insightful - 1 fun (0 children)

Astroturf and manipulation of media messages | Sharyl Attkisson

In this eye-opening talk, veteran investigative journalist Sharyl Attkisson shows how astroturf, or fake grassroots movements funded by political, corporate, or other special interests, very effectively manipulates and distorts media messages.

[–]zyxzevn[S] 1 insightful - 1 fun (0 children)

See also the information listed by
James Corbett - A message to new "Conspiracy Theorists"