Discussion about this post

Jeff Ahrens:

I’m not a fan of slippery slope arguments, but this seems to be a continuation of the path we’ve been on with the impact social media has had on our collective psyches. Recent studies on the state of teen mental health are relevant here. Chatbots take this to the next level with the speed and volume of content they can generate. Rather than peers and anonymous users, we are potentially automating the risk of having our amygdalae hijacked and our self-worth detrimentally impacted.

A Thornton:

It's called the ELIZA effect: "the tendency to unconsciously assume computer behaviors are analogous to human behaviors; that is, anthropomorphisation." It has its very own Wikipedia page, quoted here because I'm lazy, and you'll find references on that page. The tl;dr version: because it is an example of anthropomorphisation, the ELIZA effect is innate human behavior.

Offsetting the ELIZA effect is the Uncanny Valley. This occurs when an object comes ever closer to human behavior without actually behaving as a human: eventually a person starts to experience unease and revulsion toward the object. Again, there is a Wikipedia page with references.

So if we can't avoid the ELIZA effect, is the answer to move chatbots ever closer to, yet always short of, human behavior until the whole endeavor collapses? Essentially, that is what happened with Siri and Alexa.
