People will let AI be their therapist but not their partner, their assistant but not their manager. Gillian Katz of Hannah Grey joins Phillip and Brian to unpack the firm's newest Cultural Vibrations journal and the qualitative study behind it: a read on how people are actually negotiating AI's role in their lives, domain by domain and role by role, drawing on everything from anthropology to sommelier frameworks to Goodhart's Law.
You Can Manage AI, but AI Can’t Manage You
Key Takeaways:
- People accept AI in almost every domain, but reject it in specific roles within each one.
- Naming a cultural signal may be what stops it from moving.
- Qualitative research captures the nuance that dashboard culture flattens.
- The next frontier isn't the technology, it's the governance around it.
Key Quotes:
- [00:11:04] "No one wants to be managed by a machine, but they're okay to sort of put control over one." — Gillian Katz
- [00:27:08] "It's exactly like the way you wish every person interacted. But if you did actually have that experience time and time again, you would be so frustrated." — Gillian Katz, on AI sycophancy
- [00:29:22] "We give people the benefit of the doubt, but we expect a hundred percent accuracy from AI." — Gillian Katz
- [00:40:56] "If you only use AI to go build your business, you're gonna lose the discernment that's required to actually use AI well in the first place." — Brian Lange
Have any questions or comments about the show? Let us know on futurecommerce.com, or reach out to us on Twitter, Facebook, Instagram, or LinkedIn. We love hearing from our listeners!
Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.