Hallucinated package names fuel 'slopsquatting'


The rise of AI-powered code generation tools is reshaping how developers write software - and introducing new risks to the software supply chain in the process.

AI coding assistants, like large language models in general, have a habit of hallucinating: they suggest code that depends on software packages that don't exist.

As we noted in March and September last year, security and academic researchers have found that AI code assistants invent package names. In a recent study, researchers found that about 5.2 percent of package suggestions from commercial models didn't exist, compared to 21.7 percent from open source models.

Running that code should result in an error when importing a non-existent package. But miscreants have realized that they can hijack the hallucination for their own benefit.
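To make the benign failure mode concrete, here is a minimal Python sketch (not from the article; the package name is made up for illustration) showing what happens when generated code imports a package that was never published or installed:

```python
# A hallucinated dependency normally fails loudly at import time.
# "totally_nonexistent_pkg_xyz" is a made-up name for illustration.
try:
    import totally_nonexistent_pkg_xyz
    hallucination_resolved = True   # would only happen if someone published/installed it
except ModuleNotFoundError:
    hallucination_resolved = False  # expected: the package does not exist
```

The danger arises precisely when that `ModuleNotFoundError` stops happening: if an attacker has registered the hallucinated name, the import quietly succeeds and their code runs instead.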

All that's required is to create a malicious software package under a hallucinated package name and then upload the bad package ...
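One hedged mitigation, sketched below and not taken from the article, is to screen AI-suggested dependencies against a vetted allowlist before installing anything; the allowlist contents and function name here are illustrative assumptions:

```python
# Illustrative defense sketch: check AI-suggested package names against a
# vetted allowlist so hallucinated (and potentially squatted) names are
# flagged for human review instead of being installed blindly.

VETTED_PACKAGES = {"requests", "numpy", "flask"}  # example allowlist, assumed

def screen_suggestions(suggested):
    """Split suggested package names into vetted and suspect lists."""
    vetted = [name for name in suggested if name.lower() in VETTED_PACKAGES]
    suspect = [name for name in suggested if name.lower() not in VETTED_PACKAGES]
    return vetted, suspect

# "reqeusts-helper" stands in for a plausible-looking hallucinated name.
ok, flagged = screen_suggestions(["requests", "reqeusts-helper", "numpy"])
```

In practice a real pipeline might also consult the registry's metadata (package age, download counts, maintainer history) rather than a static set, but the principle is the same: treat unrecognized names as suspect by default.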


Copyright of this story solely belongs to theregister.co.uk.