The Road to AI We Can Trust

24 Seriously Embarrassing Hours for AI

All the recent goodwill and enthusiasm could evaporate fast

Gary Marcus
Jan 18

By my count, the following things have come to light in the last 24 hours or so.

  1. It turns out that Tesla staged its famous 2016 driverless-car demo, the one with the tagline “The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.”¹

    What they showed, it seems, was aspirational, not real: not a single, unedited run taken by a single car. Roughly $100 billion in investment went in, partly on the strength of that demo and the excitement it generated, and partly on Musk’s say-so; in the six years since, no car has achieved what Elon Musk promised would soon arrive, an intervention-free drive, without human assistance, from LA to NYC. (And highway miles are the easy part; nobody even pretends anymore to be close to doing that off the Interstate.) Nobody is treating this whole episode like Theranos (the official company legal line is that failure is not fraud), but I expect John Carreyrou might find the whole thing interesting.

  2. We also discovered from court testimony this week that some reasonably high-level employees working on driverless cars were apparently unaware of human factors engineering (an absolute necessity if humans are in the loop). This tweet (and several others by the same author, Mahmood Hikmet, in the last couple of days) blew me away, and not in a good way. Honestly, rereading it makes my stomach churn.

    Mahmood Hikmet (@MoodyHikmet), Jan 17, 2023:
    How do engineers show up at work every day and continue to work on a system which has killed multiple people? Looking into the deposition of Dhaval Shroff, Autopilot Engineer at Tesla, in relation to the Walter Huang fatality in 2018 you can get a sense of how they justify it.
  3. OpenAI turns out to have been using sweatshops behind the scenes. You might think that ChatGPT is just a regular old massive neural network that soaks up a massive amount of training data from the web, but you’d only be partly correct. There is in fact a massive, massively trained large model behind the scenes, but it’s accompanied by a massive amount of human labor, employed to filter out bad stuff. A bunch of that work was done by poorly paid workers in Kenya, paid less than $2/hour to evaluate (e.g.) graphic descriptions of sexual situations involving children and animals that I prefer not to describe in detail. Billy Perrigo’s exposé at Time is a must-read.

    Billy Perrigo (@billyperrigo), Jan 18, 2023:
    🚨Exclusive: OpenAI used outsourced Kenyan workers earning less than $2 per hour to make ChatGPT less toxic, my investigation found (Thread) time.com/6247678/openai…
    Linked article: “Exclusive: The $2 Per Hour Workers Who Made ChatGPT Safer” (time.com)
  4. Riley Goodside, one of the people who best know what large language models can and can’t do, put Claude, the latest large model, to the test; the focus of this model is on alignment. You can read his detailed comparison for yourself, but one of the things that popped out to me is that the system still quickly ends up in the land of hallucination that has so haunted ChatGPT.

  5. CNET became the first casualty of the recently fashionable tendency to put too much faith in ChatGPT. Without making a big deal of it, they started posting ChatGPT-written stories. Mistakes were made. A lot of them. Oops.

    Gary Marcus (@GaryMarcus), Jan 18, 2023:
    Wow! Who could possibly have seen this coming? “CNET Is Reviewing the Accuracy of All Its AI-Written Articles After Multiple Major Corrections” 🐵⌨️⌨️⌨️🙈 gizmodo.com/cnet-ai-chatgp…
    Linked article: “CNET Is Reviewing the Accuracy of All Its AI-Written Articles After Multiple Major Corrections” (gizmodo.com)
  6. The coup de grâce? The musician Nick Cave got a look at ChatGPT’s riffs on his music. If you can believe it, he was even more scathing than I am:

    Simon Kirby (@SimonKirby), Jan 17, 2023:
    “A grotesque mockery of what it is to be human”. Nick Cave on ChatGPT generated lyrics.
    Linked article: “‘This song sucks’: Nick Cave responds to ChatGPT song written in style of Nick Cave” (theguardian.com)

Fake demos, hallucination, arrogance and ignorance around basic human factors engineering, crappy songwriting, misleading news stories, and sweatshops.

None of this is a good look.

§

I wrote a little thread about the sweatshop stuff in particular:

Gary Marcus (@GaryMarcus), Jan 18, 2023:

Real AI wouldn’t need sweatshops. Real AI wouldn’t need vast amounts of data. 🪡 1/3

Real AI would learn with the efficiency of human children. Real AI would be able to reason from first principles, and generalize far more deeply. AI Sweatshops are the consequence of trying to substitute shallow data for deep understanding. 2/3

The choices we make, around what kind of AI we build, have consequences for society. We are pursuing the best path we currently know how to build. But is it the best path imaginable? Sometimes the slower road is the better road. How much human cost is too much? 3/3

Our choices here matter.

I hope we will make better choices.

Gary Marcus is a scientist, best-selling author, and entrepreneur. His most recent book, Rebooting AI, co-authored with Ernest Davis, Professor of Computer Science at New York University, is one of Forbes’s 7 Must Read Books in AI. You can also listen to him on The Ezra Klein Show.


¹ Apparently the NYT mentioned this before in a documentary on Tesla that I had meant to watch; for whatever reasons, it’s only really getting talked about now.

29 Comments
Tom Dietterich
Jan 18 · Liked by Gary Marcus

Regarding the data labeling "sweatshop": the headlines and lede in the story emphasize the pay level. But as you read the article, it becomes clear that the real issue is the horrifying nature of the work. Even if the workers were paid 100 times more, or if the work were done in the US, it would still be psychologically punishing. We need to find a better way to prevent toxic behavior in AI systems than creating large labeled data sets of horror.

Red Barchetta
Jan 20 · Liked by Gary Marcus

Here's something that drives me nuts about AI commentary: the idea that new music or art creation will be dominated by AI in the future. As for-profit enterprises, those fields are already in trouble due to technology like music streaming.

But here's an angle I see underdiscussed. Human beings are compelled to create not simply because we can sell it to other human beings. We are compelled to create because creating things gets at something essentially human in all of us. I used to play in bands into my twenties; now I'm nearly 40 and scarcely have the time to pick up an instrument thanks to work, wife, and kids. But even in a future where every popular song was shat out in seconds by some sort of "AI"-type substance, the joy of playing "Wild Thing" on a guitar amp cranked up to 11 or awkwardly belting out your favorite tune in the shower will never, ever go away. Nor will crudely painting a nature scene "just because" or scribbling down some poetry. Getting our emotions out via art is a human need. Being able to sell it to others and survive on it is a blessing, but it's not necessary for human-created art to survive.

There will always be a garage full of kids playing Blitzkrieg Bop or Smells Like Teen Spirit, somewhere. Because that's part of what it means to be human.
