
[–]GConly 3 insightful - 1 fun (5 children)

So, I have Asperger's, and I rock at puzzles and do a bunch daily. I'm also heavily into science, and I can tell you plenty of people with PhDs absolutely suck at logic and problem solving where complex data is involved.

I've had arguments with PhDs (online) that I won before I had a degree in the relevant subject, or without ever having one.

A lot of people have a "laser focus": they get bogged down in minor details and miss the obvious big facts that make their position untenable.

Prime example: back in 2010 I had an epic argument with a PhD about whether humans had Neanderthal ancestry, based on a set of bones. As I pointed out, non-Africans have a whole bunch of genes that are very ancient and don't come from Africa (TMRCA over 200k years in some cases). Those bones had Neanderthal traits; their prognathism etc. was down to Neanderthal ancestry, not to ancient Europeans looking like modern Africans.

And Neanderthals were so close to us genetically that we certainly could have produced young.

This guy just couldn't see there was a kind of "wall" blocking his "logic path." He'd decided at some point what the outcome was, and couldn't change from it.

I was proven right by a DNA study less than a month later.

I'm seeing the same thing in the covid outbreak. In the UK we've been told the IFR is about 1.5%, whereas the data actually shows it's about 0.25%. It's like people aren't seeing the info in context, or spotting what's a good and a bad source.
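For context, IFR (infection fatality rate) is just deaths divided by total infections, so the figure hinges entirely on which denominator you use. A minimal sketch with hypothetical numbers (not the actual UK figures) showing how the same death count can produce both estimates:

```python
# Infection fatality rate (IFR) = deaths / total infections.
# All numbers below are hypothetical, for illustration only.
deaths = 2_500

confirmed_cases = 170_000         # only infections that were actually tested
estimated_infections = 1_000_000  # e.g. a serology-based estimate incl. untested

# Dividing by confirmed cases alone inflates the rate,
# because most mild/asymptomatic infections are never counted.
ifr_from_cases = deaths / confirmed_cases          # roughly 1.5%
ifr_from_estimate = deaths / estimated_infections  # 0.25%

print(f"{ifr_from_cases:.2%} vs {ifr_from_estimate:.2%}")
```

The point is not these specific numbers but the mechanism: undercounting infections in the denominator makes the same number of deaths look several times deadlier.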

[–]magnora7 2 insightful - 1 fun (4 children)

He'd decided at some point what the outcome was, and couldn't change from it.

That's an interesting point. I think people do that a lot. They set anchor points of "Let's just assume this is true since it seems 99% likely to be true, and let's move on from there"

Then a whole house of cards gets built up around those mental anchor points. Then 10 years down the road if one of those anchor points is proven wrong, it will upend the entire house of cards. This is why if those anchor points are challenged, people often become hyper-emotional rather than hyper-logical.

[–]wendolynne 4 insightful - 1 fun (1 child)

A few years back I read a book by some sort of brain scientist. I forget the details, but their research showed that people start with the conclusion and then look for supporting evidence, while believing the opposite: that they arrived at the conclusion after weighing the evidence. This is down to the bio-mechanical structure of our brains; that's how the mechanism works, the neural pathways and such. Once a position is taken, people will posit and believe all sorts of unrealistic scenarios to support it, tolerating high levels of cognitive dissonance and resisting changing their minds as long as they can hold out. Sorry I don't have a link, it was an interesting book.

edit: I think this is the book: https://www.goodreads.com/book/show/522525.Mistakes_Were_Made_But_Not_by_Me_

[–]magnora7 1 insightful - 1 fun (0 children)

Yeah, I think as long as it's emotionally easier to hold on to an old belief, that belief will generally be held on to.

The emotion of the new idea has to overwhelm the emotion of defending the status quo. That's why people who are very egoistic often have to get into fights before they even consider changing their mind.

[–]GConly 2 insightful - 1 fun (1 child)

You'd think it was from trying to protect an academic reputation, but no. I've seen people go apeshit when presented with quite minor things they've got wrong. I think ego is a real issue.

I seem not to invest so much in a point of view that I can't do a 180.

I think academia (like real life) really suffers from poor logic. They also need training in stats, and in how to spot a bad vs. a good source of information.

But as far as overall problem solving and the public goes, most people don't seem to generate new conclusions internally from the information they receive.

Not sure how to explain it, but every time I find some piece of information, I want to see exactly how it fits in with the other stuff I know. That helps me spot incorrect data/bullshit fairly quickly, because it doesn't fit. It's like finding an apple in a bowl of cherries.

Most people just seem to take in things they learn and leave it at that. I think it's like being given the parts of a jigsaw puzzle and either putting them together or just leaving them jumbled in the box.

I'm also obsessive as hell. I'm sure that helps with problem solving.

If you're into Myers-Briggs, I'd say it's NT vs. SF.

Most people are SF.

[–]magnora7 1 insightful - 1 fun (0 children)

Yup, sounds about right to me. Most people are very bad at fitting the scraps of narrative they come across in the media into a larger understanding of the world that ties all the pieces of information together. And school and the media basically discourage this type of thinking as much as possible, too.