Emily Bender, a linguist who co-authored The AI Con with the sociologist Alex Hanna, reminded me that when we talk about AI, we need to be precise. Many tools that use AI — voice-to-text transcription tools, or tools that turn a set of texts into a study-aid podcast, for example — are not generating something new; they are taking a single individual's input and making it legible in a new format.
What Bender is most critical of is what she calls “synthetic media machines” — models like ChatGPT, DALL-E 3, and Midjourney that create composite imagery and writing, drawing on massive libraries of existing material to fulfill a prompt.
“These tools are designed to look like objective, all-knowing systems, and I think it’s important to get kids used to asking, ‘Who are the people who built this? Who said and wrote the original things that became the training data? Whose artwork was stolen by these companies to produce the training sets?’” said Bender.
For kids too young to connect with those questions, Bender suggests parents focus on the environmental impact. “Every time you use a chatbot, you’re helping to build the case for the company to develop the next model and build the next data center. Data centers have to be cooled with massive amounts of clean water, and clean water usually means drinking water.” Whose drinking water will be diverted?
Read more | THE CUT