4 Comments

I'm so glad you wrote this. It is reassuring. Not that I was worried about any robot taking my job; I am "no longer gainfully employed," that is, I have 'retired' to write a novel. (It is going well; and I challenge any fucking AI program out there to replicate what I have done.)

But you are correct, I think, to place your worry on the nefarious stuff that can emerge in the World of Misinformation. It will drive the social media police batty. (A possible positive outcome? People will so tire of trying to determine whether something is true or false, they will leave. And write books. Or go for walks. Or expand their garden.)

I do worry about music; as a father of a composer and musician I know that Spotify has already wrecked her cash flow. But this: My uncle was a world class pianist who gave his last solo concert when he was 99 to a packed hall. I asked him once what changes aging had brought (he was arthritis-free and compulsively careful about his hands) and he said, "What I have lost in dexterity, I have replaced with my heart."

And that's it, really. We have hearts. Robots do not. AI does not. Long live humans.


Exactly. It will definitely be a different world, and frustrating in many ways, I’m sure. But I don’t think artists and writers need to panic at this point. I could be wrong, but I’m not going to sweat it.


As a psychotherapist and psychoanalyst--a human among millions of other humans who believe that AI and other digital interfaces are no substitute for positive OR negative interpersonal relationships--I am heartened to read this. It is impossible, for example, for AI to replace the capacity to observe human (and animal) behavior that led John Bowlby to develop his theory of attachment, which Mary Ainsworth and Mary Main both expanded upon. Bowlby's work started in the 1950s and 60s, through his contact with European ethologists, and developed over three decades, with Ainsworth and Main, into the 80s. Peter Fonagy, in the late 90s and early 2000s, expanded upon their work to develop the concept of mentalization, which he described as more complex than empathy, i.e. "having one's mind in mind." In other words, the imagining of another person's experience--mentalization--is what gives rise to feelings of empathy. I think AI would be hard-pressed to truly mentalize the experience of a human being.

Psychoanalysts, who are supposedly trained--as am I--to think in these terms, still make interpersonal mistakes, which we are also supposed to be mindful of attempting to repair. At the end of the day, we are, in the words of psychoanalyst Harry Stack Sullivan (1892-1949), all "much more simply human than otherwise." And, despite the state of the world and the growth of unenlightened despotism that is visible all around us in its various iterations, often fed by digitally created falsehoods, I am thankful for the humanness, and humanity, that remains.


Thanks for your response. I agree: it will be very difficult for a computer to ever understand, much less truly empathize with, another entity, human or otherwise. Perhaps one day it will be possible, but that day is nowhere close, and without the ability to empathize, I don’t believe artificial intelligence could create art of any form that would truly equal what a human being can create.
