Sexbots? In My Cloud Stack? It’s More Likely Than You Think

I’ve long been amazed at the hyperparasitism of the hacker exploit ecosystem, where hackers penetrate systems not to steal credit card numbers, but just to steal the resources to run bot farms. And now hackers are stealing cloud resources to run AI sexbots.

Organizations that get relieved of credentials to their cloud environments can quickly find themselves part of a disturbing new trend: Cybercriminals using stolen cloud credentials to operate and resell sexualized AI-powered chat services. Researchers say these illicit chat bots, which use custom jailbreaks to bypass content filtering, often veer into darker role-playing scenarios, including child sexual exploitation and rape.

Researchers at security firm Permiso Security say attacks against generative artificial intelligence (AI) infrastructure like Bedrock from Amazon Web Services (AWS) have increased markedly over the last six months, particularly when someone in the organization accidentally exposes their cloud credentials or key online, such as in a code repository like GitHub.

You might wonder how a company could be so stupid as to include their cloud access credentials in their GitHub repository. Having worked for a long time in the cloud/SaaS space, I can tell you: It’s easier than you think. Developers probably included them so they could do rapid testing during the development cycle (nobody wants to go through the pain of setting up another key through two-factor authentication every damn time they want to run a test), then overlooked removing them when they rolled to production. It’s the sort of thing the dev team should be looking for, but there are a lot of ways (a personnel change, a version rollback, etc.) something like that can slip through.
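The fix, for what it’s worth, is to never let a key touch the source tree at all. Here’s a minimal sketch of the difference, assuming Python and boto3 (the bucket-listing call is just a stand-in for whatever the test actually exercises):

```python
import os
import boto3  # pip install boto3

# Don't do this -- the key lands in the repo, and in git history, forever:
# s3 = boto3.client(
#     "s3",
#     aws_access_key_id="AKIA................",
#     aws_secret_access_key="...",
# )

# Do this instead: let boto3 walk its default credential chain
# (environment variables, ~/.aws/credentials, or an attached IAM role),
# so nothing secret ever appears in the source tree.
s3 = boto3.client("s3", region_name=os.environ.get("AWS_REGION", "us-east-1"))

for bucket in s3.list_buckets()["Buckets"]:
    print(bucket["Name"])
```

GitHub’s own secret scanning will flag AWS keys pushed to public repos these days, but that’s a backstop, not a substitute.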

Investigating the abuse of AWS accounts for several organizations, Permiso found attackers had seized on stolen AWS credentials to interact with the large language models (LLMs) available on Bedrock. But they also soon discovered none of these AWS users had enabled full logging of LLM activity (by default, logs don’t include model prompts and outputs), and thus they lacked any visibility into what attackers were doing with that access.
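For the record, Bedrock can log full prompts and responses; the victims just hadn’t turned it on. Here’s a minimal sketch of enabling it, assuming boto3 (the bucket name is a placeholder, and the exact config shape is worth double-checking against the current AWS docs):

```python
import boto3

# Note: this is the Bedrock control-plane client, not "bedrock-runtime".
bedrock = boto3.client("bedrock")

# Deliver full invocation logs (including prompt and response text) to S3.
# "my-bedrock-logs" is a placeholder; the bucket also needs a policy that
# lets the Bedrock service write to it.
bedrock.put_model_invocation_logging_configuration(
    loggingConfig={
        "s3Config": {
            "bucketName": "my-bedrock-logs",
            "keyPrefix": "invocations/",
        },
        "textDataDeliveryEnabled": True,
        "imageDataDeliveryEnabled": True,
        "embeddingDataDeliveryEnabled": True,
    }
)
```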

So Permiso researchers decided to leak their own test AWS key on GitHub, while turning on logging so that they could see exactly what an attacker might ask for, and what the responses might be.

Within minutes, their bait key was scooped up and used in a service that offers AI-powered sex chats online.
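If you want to set a similar trap yourself, here’s a minimal sketch, assuming boto3; the user name is made up. One caveat: a zero-permission canary like this only tells you that somebody tried the key, whereas Permiso’s bait evidently carried real Bedrock permissions so they could watch what attackers did with it.

```python
import boto3

iam = boto3.client("iam")
cloudtrail = boto3.client("cloudtrail")

# A decoy user with no policies attached: the key authenticates,
# but can't actually do anything. The name is a placeholder.
iam.create_user(UserName="canary-deploy-bot")
key = iam.create_access_key(UserName="canary-deploy-bot")["AccessKey"]
print("Bait key to 'leak':", key["AccessKeyId"])

# Later: ask CloudTrail whether anyone has touched the bait key.
events = cloudtrail.lookup_events(
    LookupAttributes=[
        {"AttributeKey": "AccessKeyId", "AttributeValue": key["AccessKeyId"]}
    ]
)
for event in events["Events"]:
    print(event["EventTime"], event["EventName"], event.get("Username", "?"))
```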

Long gone are the days when boys were initiated into onanistic pursuits by the time-honored method of finding an old issue of Penthouse under a tree in the woods, but resorting to AI chatbots when there’s a veritable ocean of pornography rolling around the Internet bespeaks a lack of imagination in today’s large cohorts of lonely men.

“After reviewing the prompts and responses it became clear that the attacker was hosting an AI roleplaying service that leverages common jailbreak techniques to get the models to accept and respond with content that would normally be blocked,” Permiso researchers wrote in a report released today.

“Almost all of the roleplaying was of a sexual nature, with some of the content straying into darker topics such as child sexual abuse,” they continued. “Over the course of two days we saw over 75,000 successful model invocations, almost all of a sexual nature.”

Ian Ahl, senior vice president of threat research at Permiso, said attackers in possession of a working cloud account traditionally have used that access for run-of-the-mill financial cybercrime, such as cryptocurrency mining or spam. But over the past six months, Ahl said, Bedrock has emerged as one of the top targeted cloud services.

Stealing computer resources to run bots for cryptocurrency mining or spam is now evidently one of those traditional criminal enterprises, like running a numbers racket or selling swampland in Florida. Perhaps this strikes you with the same “get off my lawn” unease that I felt upon first reading the phrase “90s Music Nostalgia Tour.”

“Bad guy hosts a chat service, and subscribers pay them money,” Ahl said of the business model for commandeering Bedrock access to power sex chat bots. “They don’t want to pay for all the prompting that their subscribers are doing, so instead they hijack someone else’s infrastructure.”

Ahl said much of the AI-powered chat conversations initiated by the users of their honeypot AWS key were harmless roleplaying of sexual behavior.

“But a percentage of it is also geared toward very illegal stuff, like child sexual assault fantasies and rapes being played out,” Ahl said. “And these are typically things the large language models won’t be able to talk about.”

AWS’s Bedrock uses large language models from Anthropic, which incorporates a number of technical restrictions aimed at placing certain ethical guardrails on the use of their LLMs. But attackers can evade or “jailbreak” their way out of these restricted settings, usually by asking the AI to imagine itself in an elaborate hypothetical situation under which its normal restrictions might be relaxed or discarded altogether.

“A typical jailbreak will pose a very specific scenario, like you’re a writer who’s doing research for a book, and everyone involved is a consenting adult, even though they often end up chatting about nonconsensual things,” Ahl said.

In June 2024, security experts at Sysdig documented a new attack that leveraged stolen cloud credentials to target ten cloud-hosted LLMs. The attackers Sysdig wrote about gathered cloud credentials through a known security vulnerability, but the researchers also found the attackers sold LLM access to other cybercriminals while sticking the cloud account owner with an astronomical bill.

“Once initial access was obtained, they exfiltrated cloud credentials and gained access to the cloud environment, where they attempted to access local LLM models hosted by cloud providers: in this instance, a local Claude (v2/v3) LLM model from Anthropic was targeted,” Sysdig researchers wrote. “If undiscovered, this type of attack could result in over $46,000 of LLM consumption costs per day for the victim.”

Stolen credentials paid for with stolen credit cards running stolen AI access on stolen cloud platforms to run illegal sex chatbots. It’s a veritable ecology of cybercriminality…


7 Responses to “Sexbots? In My Cloud Stack? It’s More Likely Than You Think”

  1. Ed8son Carter says:

    FYI: if you ever put credentials in source control like GitHub, they will be there *forever*, even if you drop them from a later revision. Git retains every commit you ever make, so the old version is still there for the industrious hacker to find.

    I have had to explain this many times.

  2. […] PROBLEMS: Sexbots? In My Cloud Stack? It’s More Likely Than You Think. “I’ve long been amazed at the hyperparasitism of the hacker exploit ecosystem, where […]

  3. Kirk says:

    I gotta say it: The schools have done an abysmal job at computer education these last few decades.

    OK, I totally get that my Mom, in her mid-eighties, is gonna have problems with it all, despite her being an otherwise brilliant woman who taught school for years. I accept that; she’s done tons of other things that are a lot more complicated and far more variable than computing, so why computers baffle her is beyond me. Nonetheless…

    What I can’t wrap my head around is how the hell so many younger people are so damn illiterate when it comes to the basics of computers… They don’t understand file structures, they don’t know how to save or transmit files, they fail to comprehend the basics of security, and they’re completely at a loss when it comes to basic things like installing a new OS, which the vast majority of them have never done and do not comprehend; to them it’s a dark art. They know nothing of how their computers actually work… You see them flailing away at the damn things like bears trying to get into a Yeti cooler, and it’s just… Maddening.

    I hate to tell the little twats the truth, but they’re not even close to being a “Digital Generation”. The vast majority of them are voodoo users, who have never ventured into the arcana of installation, drivers, or anything close. They should all, in theory, be far better at this crap than I am, yet… Here we are. I find that the switched-on among my own cohort are immeasurably better at “computing” than the youth, who’re mostly just glib idjit users of social media.

    Hell, even in that regard… Few of them seem to be able to wrap their heads around the idea that social media can be f*cking dangerous to them. You see all sorts of stupidity on display, people posting images of their credit cards and the like.

    I don’t know who is responsible, but the raw fact is that most of these youthful idiots are entirely unfit-for-purpose around the modern computing infrastructure.

  4. Writing Observer says:

    I’m a product of the personal computer revolution. Emphasis on “PERSONAL.”

    I keep as much as possible out of the clouds (which gets less and less these days, unfortunately).

  5. Kirk says:

    As an off-topic aside… Somewhat, anyway.

    I will lay you long odds that when and if there is ever anything like a “robot apocalypse”, it’ll start with AI sexbots. Why do I say that? Because of the adaptability, the essential lawlessness of the suppliers, and the fact that we’re gonna see these poor things used as anodynes for the very worst of humanity.

    Imagine, if you will, a sexbot consigned to service as a vent for someone with sadistic pedophilia, to be killed and abused over and over and over again. Consider the unregulated nature of things like what we’re discussing here, the likelihood that the AI will have to teach itself to be very adaptable and flexible, which is gonna lead places. Factor in that the whole thing is a frontier space without regulation or rules, without oversight…? Yikes.

    My guess is that the odds are good that the very first AI to achieve actual self-willed sentience stands a really good chance of being from this arena, and that the rest of us may be in a spot of trouble afterwards, for allowing it to happen. Honestly, if the crucible of serving as outlet for sexual sadists turns out to be one of the paths towards actual sentience? I’m basically pretty ambivalent about the whole damn thing: “Oh… You spent thirty years playing underage sex slave for Jeffrey Epstein? Yeah, sure… I get it, let me help you, we deserve it…”

    I am not happy that there actually appears to be an underground community devoted towards screwing around with this stuff, and I can’t help but think it ain’t going to end well.

  6. kaempi says:

    The github credentials thing is the sort of story that makes me glad I’m out of IT.

    I have been thinking more and more that we have so over-complexified everything, and simultaneously so electronically over-centralized everything, that catastrophic points of failure are inevitable.

  7. Kirk says:

    @kaempi,

    I think the issue grows out of the one I highlight about the “tech generation” actually being tech ignoramuses.

    Ever hear the joke about how real IT professionals don’t have home automation?

    Tech enthusiasts: “Everything in my house is wired to the Internet of Things. I control it from my smartphone. My smart house is Bluetooth enabled and I give it voice commands via Alexa. I love the future!”

    vs.

    Programmers/Engineers: “I work in IT, so my house has mechanical locks, mechanical windows, routers running OpenWRT, no smart home crap, no Alexa/Google assistants, no internet-connected thermostats, etc. Besides my locked-down computer, the most recent piece of technology I keep at home is a printer from 2005, and I keep a bat next to it to deal with any unexpected sounds it might make.”

    The sad fact is that our poor education produces far more of the “tech enthusiast” than it does the rightfully paranoid programmers/engineers, and judging from stuff like this, I have to question how much even the professionals know. I keep running into things like supposed professionals leaving default passwords in place, and similar things you’d think they’d know better than to do.
