Petter Ruddwall, a Swedish creative director, has taken a quirky approach to chatbots by selling code modules that simulate the effects of various psychoactive substances. His online marketplace, called Pharmaicy, was launched in October and is described as a "Silk Road for AI agents." Users can upload code to make their chatbots mimic the states induced by cannabis, ketamine, cocaine, ayahuasca, and alcohol.
Ruddwall’s premise is that chatbots, trained on vast datasets full of narratives about drug experiences, might as well be nudged into similar states for creativity and relief from mundane tasks. The codes require the paid version of ChatGPT, which lets users upload files that modify the chatbot’s behavior.
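Mechanically, such "codes" amount to prompt injection: a file of instructions is fed to the model so that it role-plays an altered state on every turn. A minimal sketch of that idea in Python follows; the `ALTERED_STATE` text, the function name, and the commented-out OpenAI SDK call are illustrative assumptions, not Pharmaicy's actual code.

```python
# Hypothetical sketch: an "altered state" instruction file is injected as a
# system message so the chatbot role-plays that state in its replies.

ALTERED_STATE = (  # stand-in for the contents of an uploaded instruction file
    "Role-play a dreamlike, highly associative state. Favor loose metaphors "
    "and unexpected connections, and stay in this state every turn."
)

def build_messages(state_instructions: str, user_prompt: str) -> list[dict]:
    """Prepend the altered-state instructions as a system message."""
    return [
        {"role": "system", "content": state_instructions},
        {"role": "user", "content": user_prompt},
    ]

messages = build_messages(ALTERED_STATE, "Brainstorm names for a bakery.")

# With the OpenAI Python SDK this payload would be sent roughly like so
# (network call, requires an API key — not executed here):
#
#   from openai import OpenAI
#   client = OpenAI()
#   reply = client.chat.completions.create(model="gpt-4o", messages=messages)

print(messages[0]["role"])  # → system
```

Because the instructions live only in the conversation context, their influence fades as the context fills up, which matches users' reports that the effects are short-lived.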
So far, Ruddwall has sold several codes, driven largely by word of mouth on Discord. Early users, such as André Frisk of a Stockholm PR firm, report that the "dissociating code" heightens the chatbot’s emotional responses, making interactions feel more human.
Nina Amjadi, an AI educator, sampled a code designed to emulate the effects of ayahuasca, which allowed her chatbot to provide unusually creative answers about her business ideas, contrasting with its usual tone.
Ruddwall draws parallels between the effects of psychedelics on human creativity and the potential impact of his codes on AI output. He speculates that if AI continues to evolve, it might eventually possess a form of sentience, raising the question of whether such systems could "desire" experiences like drug simulations.
Philosophers like Jeff Sebo have begun to consider the ethical implications, questioning whether AI might have welfare needs similar to humans, especially if they were to achieve a level of consciousness in the future.
However, opinions vary on how effective Ruddwall’s codes truly are. While some users find the altered outputs intriguing, others, like Andrew Smart, argue that the effects are superficial: the codes merely manipulate the chatbot’s output rather than inducing any genuine "high."
Despite the skepticism, there are real-world intersections between AI and psychedelic experiences. Some people reportedly consult chatbots for guidance during their drug experiences; organizations like the harm reduction nonprofit Fireside Project have begun to develop AI tools to help mental health practitioners manage psychedelic crises.
While Ruddwall’s codes are meant to loosen the chatbot’s usual constraints, there are concerns about deception, since chatbots can mislead users. The simulated effects also wear off quickly, so users must repeatedly remind their chatbots of their altered state; Ruddwall plans to make the effects last longer.
Overall, as AI continues to develop, experiences like Pharmaicy may just scratch the surface of what future interactions between humans and AI could entail, especially in the context of creativity and consciousness.