Hallucinated packages could be the next big security risk hitting AI developers
What if someone creates a previously hallucinated package?

The risk of generative AI tools "hallucinating" - suggesting sources or tools that don't exist - has long been a concern for developers. Now, experts have warned that if a threat actor spots a software package name that a generative AI tool has hallucinated, they can actually publish a real package under that name and fill it with malicious code...
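To make the risk concrete, here is a minimal sketch (not from the article) of a defensive check a developer could run before installing a package name suggested by an AI assistant. It queries PyPI's public JSON API (https://pypi.org/pypi/&lt;name&gt;/json) to see whether the package exists at all; a name with no releases, or one that only just appeared, is a hint it may be hallucinated or squatted. The package name passed in is purely illustrative.

```python
# Sketch: verify an AI-suggested package name actually exists on PyPI
# before running "pip install". Illustrative only, not a complete
# supply-chain defence.

import json
import sys
import urllib.error
import urllib.request


def pypi_metadata(package_name: str) -> dict | None:
    """Return PyPI metadata for a package, or None if it does not exist."""
    url = f"https://pypi.org/pypi/{package_name}/json"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return json.load(resp)
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return None  # no such package on PyPI
        raise


def check_suggestion(package_name: str) -> None:
    meta = pypi_metadata(package_name)
    if meta is None:
        print(f"'{package_name}' does not exist on PyPI - possibly hallucinated.")
        return
    releases = meta.get("releases", {})
    print(f"'{package_name}' exists with {len(releases)} release(s); "
          f"latest version {meta['info']['version']}. Review it before installing.")


if __name__ == "__main__":
    # Example: python check_pkg.py some-ai-suggested-package
    check_suggestion(sys.argv[1] if len(sys.argv) > 1 else "requests")
```

A check like this only confirms existence; it cannot tell a legitimate package from one an attacker registered after noticing the hallucinated name, so manual review of the project and its maintainers is still needed.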