Faux ChatGPT, Claude API Packages Deliver JarkaStealer
Attackers are betting that the hype around generative AI (GenAI) is attracting less technical, less cautious developers who might be more inclined to download an open source Python package promising free chatbot access without vetting it or thinking twice.
November 22, 2024
Two Python packages claiming to integrate with popular chatbots actually transmit an infostealer to potentially thousands of victims.
Publishing open source packages with malware hidden inside is a popular way to infect application developers and, through them, the organizations they work for or serve. In this latest case, the targets were engineers eager to make the most of OpenAI's ChatGPT and Anthropic's Claude generative artificial intelligence (GenAI) platforms. The packages, which claimed to offer application programming interface (API) access to the chatbots, actually deliver an infostealer called "JarkaStealer."
"AI is very hot, but also, many of these services require you to pay," notes George Apostopoulos, founding engineer at Endor Labs. As a result, in malicious circles, there's an effort to attract people to free access, "and people that don't know better will fall for this."
Two Malicious "GenAI" Python Packages
About this time last year, someone created a profile with the username "Xeroline" on the Python Package Index (PyPI), the official third-party repository for open source Python packages. Three days later, the person published two custom packages to the site. The first, "gptplus," claimed to enable API access to OpenAI's GPT-4 Turbo large language model (LLM). The second, "claudeai-eng," offered the same for ChatGPT's popular competitor, Claude.
Neither package does what it claims, but each provides users with a half-baked substitute: a mechanism for interacting with the free demo version of ChatGPT. As Apostopoulos says, "At first sight, this attack is not unusual, but what makes it interesting is if you download it and you try to use it, it will kind of look like it works. They committed the extra effort to make it look legitimate."
Under the hood, meanwhile, the programs would drop a Java archive (JAR) file containing JarkaStealer.
JarkaStealer is a newly documented infostealer sold on the Russian-language Dark Web for just $20, with various modifications available for $3 to $10 apiece, though its source code is also freely available on GitHub. It's capable of all the basic stealer tasks one might expect: stealing data from the targeted system and the browsers running on it, taking screenshots, and grabbing session tokens from popular apps like Telegram, Discord, and Steam. Its efficacy at these tasks is debatable.
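Neither Kaspersky nor Endor Labs prescribes a specific workflow here, but one low-effort gut check before running pip install is to pull a package's metadata from PyPI's public JSON API and look at the basics: who published it, when, how many releases exist, and whether it links to a real project page. The Python sketch below is a minimal illustration of that idea; the package name it queries is only a placeholder.

# Minimal sketch: pull basic PyPI metadata for a package before installing it.
# Uses only the public PyPI JSON API (https://pypi.org/pypi/<name>/json);
# the package name queried below is a placeholder, not a recommendation.
import json
import urllib.request

def pypi_metadata(package: str) -> dict:
    # The JSON API returns "info" (author, project URLs) and per-release file data.
    url = f"https://pypi.org/pypi/{package}/json"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

def quick_vetting_report(package: str) -> None:
    data = pypi_metadata(package)
    info = data["info"]
    releases = data.get("releases", {})
    upload_times = [f["upload_time"] for files in releases.values() for f in files]
    print(f"Package:      {info['name']} {info['version']}")
    print(f"Author:       {info.get('author') or 'unknown'}")
    print(f"Project URLs: {info.get('project_urls') or 'none listed'}")
    print(f"Releases:     {len(releases)}")
    if upload_times:
        print(f"First upload: {min(upload_times)}")
        print(f"Last upload:  {max(upload_times)}")

if __name__ == "__main__":
    # A brand-new package with one release, no project URLs, and an unknown
    # author is worth a closer look before it goes anywhere near production.
    quick_vetting_report("requests")

None of these fields proves anything on its own, but a days-old package from an account with no history, which is exactly what gptplus and claudeai-eng were when Xeroline published them, at least warrants a second look.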
Gptplus & claudeai-eng's Year in the Sun
The two packages managed to survive on PyPI for a year, until researchers from Kaspersky recently spotted and reported them to the platform's moderators. They've since been taken offline but, in the interim, they were each downloaded more than 1,700 times, across Windows and Linux systems, in more than 30 countries, most often the United States.
Those download statistics may be slightly misleading, though, as data from the PyPI analytics site "ClickPy" shows that both — particularly gptplus — experienced a huge drop in downloads after their first day, hinting that Xeroline may have artificially inflated their popularity (claudeai-eng, to its credit, did experience steady growth during February and March).
"One of the things that [security professionals] recommend is that before you download it, you should see if the package is popular — if other people are using it. So it makes sense for the attackers to try to pump this number up with some tricks, to make it look like it's legit," Apostopoulos says.
He adds, "Of course, most average people won't even bother with this. They will just go for it, and install it."
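Download counts themselves are easy to check programmatically. As a rough illustration only, and with Apostopoulos's caveat in mind that those numbers can be pumped up, the sketch below pulls recent download statistics from the public pypistats.org API; the package name is again a placeholder, and the counts are one signal among many rather than proof of legitimacy.

# Minimal sketch: check a package's recent download counts before installing it.
# Uses the public pypistats.org JSON API; as the gptplus/claudeai-eng numbers
# show, raw counts can be inflated, so treat them as a hint, not evidence.
import json
import urllib.request

def recent_downloads(package: str) -> dict:
    # Returns last_day / last_week / last_month download counts for the package.
    url = f"https://pypistats.org/api/packages/{package}/recent"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)["data"]

if __name__ == "__main__":
    counts = recent_downloads("requests")  # placeholder package name
    print(f"Last day:   {counts['last_day']:,}")
    print(f"Last week:  {counts['last_week']:,}")
    print(f"Last month: {counts['last_month']:,}")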