Thursday, April 15, 2021

What Searle and his critics AND Turing and others all get wrong — part one

Provoking those headline thoughts wasn't the intent of a recent piece by John Horgan at SciAm. Rather, he ultimately pivoted to what it means to really learn quantum physics.

Nonetheless, he DID provoke the headline thoughts in me.

Expanded from three tweets to him, here are the thoughts and light bulbs his piece sparked in me.

First, here's what both Turing AND Searle (AND many AI-friendly philosophers, IMO) miss. We and other conscious entities in general are EMOTERS, not just thinkers. There's no WAY emotions operate in a manner that could be run as a Chinese room experiment.

To riff on our mutual friend Massimo Pigliucci, as consciousness is in part definable as embodied cognition, we need to emphasize the “embodied.” We need to remember that stimuli that lead to emotional reactions come from external sensory perceptions as well as internal cogitation.

Related to that, as I have repeatedly said, our emotional depth and nuance, as well as second-, third- and fourth-level theories of mind, are what distinguish us from a cow chewing the cud.

To put it in terms of the Chinese room? The "system" may be answering questions, but it's not structured to have consciousness, because the system as a whole, not just the person inside, is passive. It's a plant, not an animal. (More on this in part two.)

Second, the Chinese room is not like the Turing test OR real-world meatspace. The parameters of a closed vs. open system and related issues are different, as is the specific intentionality involved. Per Daniel C. Dennett, when your intuition pumps reach beyond good analogous ideas, they'll pump out some funky stuff.

In this case, the intuition pumps miss the question of "is it like a plant?" or "is it like an animal?" Plants do interact with their environments and do respond to them. But they're not conscious in any animalian sense, and I still reject claims that they are.

Third, as for Dennett? His big miss on what Horgan raises is treating this as a black/white, on/off conscious/unconscious switch, rather than a "slider" with various degrees of consciousness between 0 and 100. For a guy who talked about "multiple drafts," it's a bad oversight. But not a shocking one.

Saturday, April 10, 2021

There is no "true self" but most people still claim there is

 I've said that both directly and indirectly more than once on this site. So has philosopher friend Massimo Pigliucci in various ways.

Now Vice has an in-depth look at this illusion.

Several good takeaways.

First is that people believe their ethics and morals are at the core of a true self.

Second, riffing on behavioral psychology: when people are asked to think about potential changes to their moral selves, they describe good changes as getting more in touch with that true self, while negative changes get chalked up to a "dark side" or something that is not part of their true self. Shades of loss aversion or similar! And, it's not just WEIRD subjects; Hindus and Buddhists from India and Tibet have the same stance.

Third, under the heading of "blood is thicker than water": if we have wingnut relatives acting like wingnuts, many of us say that's "not their true selves." (I don't have a problem saying it is.)

Fourth, outside of people we know, this flips, especially in America, with its punitive criminal justice system.

The story goes on to talk about things like psychological essentialism and moral certification. Give it a read.

Thursday, April 08, 2021

Martin Luther and witchcraft

It's well known, and well accepted outside of certain pockets of fundamentalist Lutheranism in the US, that Martin Luther had some connection, even if indirect, to the rise of Hitler-era German anti-Semitism.

Less known? His part in promulgating a deepening belief in the reality of witchcraft in Germany, where the Empire became the leading site in Europe (Scotland a close second) for witchcraft executions in the century or so after his death.

That Aeon story notes Luther's belief in the "drache," derived from the Greek and Latin "draco," but NOT the dragon of Revelation. Rather, in German and German-influenced lands to the east and northeast, it was a sort of household spirit. A "familiar," to use a witchcraft term that Aeon, surprisingly, does not.

This, of course, was a Luther who believed in an active devil, once infamously throwing an inkwell at Satan. (That's actually a legend, but that's not the point; the point is that it's a good illustration of his mindset.)

Aeon's piece looks at the economic incentive to make accusations of witchcraft.  (And is not alone. More below.)

But, it ignores some others.

The century after Luther of course culminated with the Thirty Years War.

And, the Counter-Reformation, given first real legs by the Council of Trent, picked up steam in the 20 or so years before 1618, and arguably was one indirect cause of the Thirty Years War.

Witchcraft charges were usually aimed at women, yes. But, in some cases, the women might have powerful husbands, sons or fathers. It was also a useful political weapon. As Aeon notes, an epicenter of witchcraft charges was in Bavaria. Few people, though, know the Reformation, especially its Lutheran branch, had a strong foothold there for a few decades, until determined Counter-Reformation activity spilled over from Hapsburg Austrian lands.

The economic dislocations of the war then intensified all aspects of the Counter-Reformation, including witchcraft charges.

That third link, the "not alone" above, is a Quartz piece that notes what Aeon does not: the economic nature of the charges was an angle leveraged in the name of religious warfare. It adds two other elements: first, that an increasingly unified definition of what constituted acts of witchcraft arose in Switzerland before it became independent of the Empire; second, that what we know as Germany led Europe in religious tussles from the time the Albigensians were snuffed out in France.

Thursday, April 01, 2021

Defining "life": a scientific and philosophical demarcation problem

What does it mean to be "alive"? Viruses in general make biologists struggle with that definition, and SARS-CoV-2, the virus behind COVID-19, has brought that issue back to life, especially since it has relatively few genes even for a virus.

In one of his latest "Matter" columns for the New York Times, Carl Zimmer weighs in heavily on the issue, not just for science but, to some degree, for philosophy of science, or per Massimo Pigliucci, more specifically, for philosophy of biology.

I think that, per his piece, and what I've read elsewhere, probably a slight majority of biologists would accept viruses as "alive," period ... maybe 55 percent? Another 15 percent might still say "nope." And, the remaining 30 percent would say something like "alive but..."

That said, the piece is worth a read otherwise, especially for the last one-third, talking about viruses and viral DNA entering the animal biome, and namely the human biome. Zimmer first notes that viruses as well as bacteria in our gut are important to our microbiome.

But, that's small potatoes.

The biggie is how, over aeons, viral DNA has entered animal DNA. For example, mammalian females' placenta development is dependent in part on old viral DNA inside mammalian genomes. In this and other cases, mammals long ago not only, for the most part, neutralized threats from viral DNA but managed to repurpose it.

Zimmer discusses this issue, and his new book, even more at Quanta, with an excerpt from it.

He cites Wittgenstein, and he's barking up the right tree if we mean linguistic philosophy in general as well as some specific Wittgensteinian ideas. The notion of "family resemblance," per Witty's comment about defining "games," is one important example.

Saturday, March 27, 2021

The spiritual equivalent of plagiarism, books world

Twice in the past three years or so I've read a book that, when I went to review it, I realized had at least partially "stolen" the title (you can't copyright titles) of another book, and had also "stolen" a fair chunk of that earlier book's main idea and concept.

In both cases, the earlier book had been written less than three years previously. In both cases, the author of the later book was probably in a good place to know about the previous book. In both cases, no credit was offered.

Since this is now twice, I decided to blog about it. (In both cases, I had already noted the issue in my reviews.)

The most recent one, earlier this month, was Adam Grant with "Think Again."

Think Again: The Power of Knowing What You Don't Know by Adam M. Grant
My rating: 3 of 5 stars

Solid 4-star book. Grant's right that we all could stand to admit more of what we don't know and, from that, rethink what we think we know. He's also right that getting group rethink going, or interpersonal rethink, without entering "politician mode" or "preacher mode," is good. Grant talks about how to move beyond all that.

He's even more right about the dangers of oversimplifying issues and avoiding caveats. This portion calls to mind Idries Shah reminding us that issues have more than two sides to them. It applies to group rethinking especially.

That said, since the subtitle of the book is, whether consciously or not, a riff on Socrates' claim to be the wisest man in Athens because he recognized his own ignorance, we can go "meta" on some of the book's ideas by going meta on Socrates himself, whose self-claims are themselves problematic.

Did Socrates really know more about what he didn't know, or admit more about this, than anybody else in Athens? Or is this just Platonic PR? Or maybe, per the title of a modern psychological syndrome, old Socrates had himself a bit of Dunning-Kruger syndrome.

And, that leads to an issue in Grant's book.

He doesn't talk about going meta on ourselves, about rethinking our rethinking (without being obsessive about it). Nor does he talk about the possible reality of D-K syndrome among people claiming they're great rethinkers.

Nor does he address the possibility of people claiming this as a new way to shut down conversation.

"I've already thought through, then REthought through, this issue, and I'm still fine with my position."

I think that Grant (and a few other people who have written longform articles in places like the Atlantic on this issue) also underestimates the difficulty of using various reasoning tools to reach people in today's world. People ultimately have to make themselves reachable, and without that, per Omar Khayyam, my moving vocal cords, having spoken, will have to just move on.

AND ... change!

Given that philosopher Walter Sinnott-Armstrong wrote a book with the same main title (Think Again: How to Reason and Argue), and seemingly the same ideas, judging by the editorial blurb, three years ago, I've dropped my rating to three stars.

Since Sinnott-Armstrong is a fairly prolific author himself, while I won't say Grant went as far as plagiarism, or even the moral equivalent of it, it does come off as sketchy. And, this is the second time in three years I've seen someone, whether intentionally or not, partially hijack the main title of a previously written book on largely the same topic, with largely the same angle. Also, Sinnott-Armstrong taught at Dartmouth while Grant was getting his BA at Harvard.

See, Adam? Rethinking. THAT's going meta.

Beyond that? Sinnott-Armstrong's book just sounds better. In fact, it sounds enough better that I asked for it on interlibrary loan. Unfortunately, his "Moral Skepticisms" is only available as an ebook. (But, it's used for less than $10 on Yellow Satan.)

Update: It IS better. But not THAT close to perfect. And, more important for this blog post, it is enough different from Grant's book that the spiritual-equivalent-of-plagiarism claim is more attenuated now. (I'm still not ready to dismiss it, though. Per Sinnott-Armstrong and my own previous study in informal logic, and what I know about Bayesian probability ideas, I'd cut my estimate by 2/3 from the original, but I wouldn't get rid of it.)
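To put a toy number on that (the figures here are hypothetical; I never assigned a precise credence to the original claim), cutting an estimate "by 2/3" just means scaling the old credence by 1/3:

\[ P_{\text{new}} = \tfrac{1}{3}\,P_{\text{old}}, \qquad \text{e.g., } 0.6 \mapsto 0.2 \]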

Beyond that, to come back to Grant's own book? Finding out he cowrote another book with Sheryl Sandberg, and that she blurbed another one? Blech.

The second book is 24/6 by Tiffany Shlain.

24/6: The Power of Unplugging One Day a Week by Tiffany Shlain
My rating: 3 of 5 stars

Good but not great. If others had ranked it a bit higher, I would have 3-starred it to counteract.

The idea of a digital Shabbos is nice. So is the bit of history Shlain offers at the start about other "weeks." That said, the first misstep, if not outright error, is right there.

She said nobody knows why the Roman 8-day week didn't stick. Sure we do. Christians took over the Jewish 7-day week, and things went from there when the Roman Empire Christianized. Indeed, Wiki notes that Constantine officially established the 7-day week, in AD 321.

The big misstep from my point of view? Why not discuss adults ditching smartphones and tablets entirely? Or never getting them in the first place?

I have a flip phone and never want anything else. I have an old Kindle Fire that I used to take on vacations instead of the laptop I had taken before, and that was for a full week's vacation, not just a Thanksgiving or something. (It no longer works, cuz it discharged all the way to zero and the clock won't reset.)

Especially if you do that, a hardcore digital Sabbath isn't as necessary. For instance, I live about 30 minutes from a decent-sized metro area. Every other Saturday, usually, I go there for special grocery shopping and other things. With just a dumbphone, I'm disconnected for half a Saturday right there.

Two other mistakes, or misinterpretations:

Green Bank, West Virginia, isn't Net-free because residents have some cozy desire for old-time life. It's home to the Green Bank Observatory, long a National Radio Astronomy Observatory site, and astronomers don't want WiFi or cellphones interfering with their work in general, or the hunt for extraterrestrial signals in particular. (Really.)

Steven Pinker, in "The Better Angels of Our Nature," did not at all prove or empirically demonstrate that human society is becoming more and more violence-free.

Lest I sound like I'm totally knocking her, she does get the general idea, and the need for it, right. If you are going to do a full-on digital Sabbath, she has detailed ideas on how to prepare for it. And she's right about getting out in nature and journaling.

The key is digital detachment.

I decided to take it down to 3 stars for one other reason. This isn't new for today's age. There are two other "24/6" books, yes, with that title. One is by a non-conservative evangelical Christian who calls for Sabbathing in general. The other is by the CEO of a tech company who (I presume) looks at digital detachment, and perhaps has even more insight into it from his business side.

Shlain might not have heard of the evangelical Xn's book, from 2012. But Aaron Edelheit wrote his just 18 months earlier, and he lives somewhere in Silicon Valley himself. And he's Jewish. (And a hardcore Zionist who hates Corbyn and conflates anti-Zionism and anti-Semitism, but that's another story.)

Without saying the book was plagiarized, I smell plagiarism of the concept.


Thursday, March 25, 2021

Young Hume vs Old Hume: The passions and more

Was there a difference between "young Hume" and "old Hume"? Namely, did he actually repudiate the Treatise, or close to that?

I side with Mossner and others among older Hume interpreters and say yes, above all on the issue of the passions, but also on the radicalness of his skepticism. Per the link, I've already talked about the latter, and why we can't call later Hume a Pyrrhonist.

Some trends in modern neuroscience side with young Hume.

GQ, in a good, and in-depth, interview with Lisa Feldman Barrett, touches on this. A key takeaway, from the first "half chapter" of her new book, is the hot new idea in neuroscience — that, contra "old Hume" and 90 percent of thinkers before 2000, the brain did not develop "for" either thinking or feeling or other things as much as it did for running a body budget.

I would disagree with stress being a "withdrawal" from the body budget. But the general idea that emotions, including deeper passions, though she doesn't call them that, control the body's budget is interesting. (I also think she's probably wrong on some of the details in her ideas of how a body's budgetary system works.)

So, the Hume of the Treatise could perhaps be called an "emotional Pyrrhonist," then? He was more right than wrong. And, in a sense, he anticipated some 19th-century and later trends in philosophy. But, of course, this risked him not being taken seriously.

But, it didn't have to be that way.

He had the Dialogues published posthumously. Why not turn back, the same way, to at least some ideas in the Treatise?

As he did not, we must take his rejection of it seriously. And we must affirm there was a rejection, contra Harris and many other Hume scholars of today's generation.

On the passions, if Hume had continued, even gone further, down the road of the Treatise, this essentially would have meant rejecting large chunks of Enlightenment ideas. And that wasn't happening. I think Hume recognized this. So, along with simply wanting to avoid the bad press, he wanted to distance himself from most of the ideas advanced there. Surely, as part of this, now that he had become "established," he recognized that letting the Treatise get its nose back under the tent would have meant the disintegration of the idea of "le bon David" among men of letters in general, and more specifically, among the likes of the French philosophes. Quelle horreur, some of them might have wondered if some of his early thought was akin to that of Rousseau!

That's why he told Beattie and other Scottish common sense philosophers it was unfair to bring up the Treatise and hold it against him, because he had written it anonymously knowing it would be controversial even before it was launched.

With this, I have finished my series of pieces on Hume, as influenced by James Harris' semi-new bio of Hume. I began with a "prequel" piece on refuting the charge of presentism as a way to try to pretend away Hume's racism (and Aristotle's and others' sexism). That piece links the whole set of pieces on Hume.

Thursday, March 18, 2021

Is David Hume just a bundle in my mind? Or just a petard hoisting?

 I'm of course talking about his bundle theory of impressions.

I've often called Hume the first modern(ish) psychologist. He was on more solid ground than a Freud or Jung, at least, and definitely for the limitations of his day.

But?

He was partially right about the self, but not totally, and not close to right on why. The bundle theory is weak. Hume flirts with the land of Berkeley, though he would never admit it. In reality, something like Dennett's subselves is probably a much better explainer of both how and why we don't have a unified, consistent self.

Beyond that, of course, Hume was operating from the typical "blank slate" paradigm of British empiricists. And, of course, this is wrong. Human minds evolved to, on average, have and develop certain conceptions and preconceptions. A baby's brain isn't blank, and while a current subself may just be whichever set of conceptions and preconceptions is in the saddle at the moment, all subselves have them.

Let's look at modern neuroscience. 

GQ, of all places, has a surprisingly good, and in-depth, interview with Lisa Feldman Barrett. A key takeaway, from the first "half chapter" of her new book, is the hot new idea in neuroscience — that, contra "old Hume" and 90 percent of thinkers before 2000, the brain did not develop "for" either thinking or feeling or other things as much as it did for running a body budget.

While I think it can be overblown, there is a generally good core there. That's why the brain has those conceptions and preconceptions — they save energy, and the brain itself is of course the hungriest part of the body.

Along with conceptions and preconceptions come predictions, which also save brain energy.

She noted that some of these come from social learning and that, to some degree, "predictions come from a world that curates you." To riff on Hume, thus, we're not only not a blank slate, we're also not in a blank classroom.

However, beyond modern neuroscience, there's a more elemental problem when we look at Hume as Hume on this issue, in his own words.

It's called "petard hoisting," and one of his most famous statements, known in general outline by a fair amount of the non-philosophical world, is a dandy on this.

“For my part, when I enter most intimately into what I call myself, I always stumble on some particular perception or other, of heat or cold, light or shade, love or hatred, pain or pleasure. I never can catch myself at any time without a perception, and never can observe any thing but the perception…. If any one, upon serious and unprejudic'd reflection thinks he has a different notion of himself, I must confess I can reason no longer with him. All I can allow him is, that he may be in the right as well as I, and that we are essentially different in this particular. He may, perhaps, perceive something simple and continu'd, which he calls himself; tho' I am certain there is no such principle in me.”

As they say on Twitter? #Boom! 

In essence, to riff on Gertrude Stein's bon mot about Oakland, Hume is saying there's no "I" at the core of "my/him-self."

BUT? He says that:

“For my part, when I enter most intimately into what I call myself, I always stumble on some particular perception or other …”

Who? Who's this "I," le bon David, if you're just a perception, or a bundle of them, and nothing more?

Now, defenders of Hume would probably say we should treat this the same way we do his thoughts on causation. Problem? He never really gives any indication of doing that himself in this case. Nooo ... since that was from the Treatise? He just repudiated it along with the rest.

Now, why?

Here, other than "atheism" (a denial of a permanent self being seen as equivalent to denying the soul), I think it was the other big charge leveled at Hume from the publication of the Treatise to the end of his life:

Pyrrhonism, in a word.

Being a skeptic, in general, was problematic enough in his day and age.

Being seen as a Pyrrhonic, not an Academic, Skeptic, within the two schools, was far worse.

In the past, I got into arguments with Dan Kaufman, a philosophy prof in Missouri, over what type of Skeptic (capital-S, for the schools) Hume was, or whether he even knew the difference.

Well, to the degree other men of letters and clergymen distinguished "Pyrrhonic" from "Academic" skepticism, they knew the difference, and certainly Hume did.

So, my idea that he was "confused or ignorant" on this?

Not at all.

Rather, this was another part of him repudiating the Treatise — he was repudiating any Pyrrhonic-type skepticism.

This may also be why, in his essays on four schools of philosophy in his "Essays Moral and Political," the essay "The Sceptic" doesn't discuss the Academic-Pyrrhonic difference.