A generative AI hallucination occurs when a tool such as ChatGPT generates false information and presents it as true. Hallucinations can include citations to sources that do not exist, attribution of content to the wrong author or creator, nonsensical answers, and content that is racist, sexist, or otherwise discriminatory.
Users who rely too heavily on generative AI for factual information, without checking the validity of its output, risk perpetuating this false information.