Earlier this year, an episode in the US echoed my little experiment. With a burgeoning measles outbreak, children’s health has become a significant political battleground, and the Department of Health and Human Services, under the leadership of Robert F Kennedy Jr, has established the Make America Healthy Again commission, aimed at combating childhood chronic disease.
The commission’s report claimed to address the principal threats to children’s health: pesticides, prescription drugs and vaccines. Yet its most striking feature was a pattern of citation errors and unsubstantiated conclusions, which external researchers and journalists believed pointed to the use of ChatGPT in compiling it.
What made this more alarming was that the Maha report allegedly cited studies that did not exist. This is consistent with what we already know about AI, which has been found not only to produce false citations but also to “hallucinate”, that is, to invent nonexistent material.
The epidemiologist Katherine Keyes, who was listed in the Maha report as the first author of a study on anxiety in adolescents, said: “The paper cited is not a real paper that I or my colleagues were involved with.”