Anyone who builds a chatbot on the GPT-3 language model that seems too lifelike will inevitably run afoul of the rules of its provider, OpenAI. That is the view of the American programmer Jason Rohrer, who developed such a chat application and has now had to shut it down, although (or perhaps precisely because) it was so good that one user conversed with a simulation of his deceased fiancée, The Register reports. After a newspaper report about that case, the application became markedly more popular, and OpenAI warned Rohrer that its rules had to be followed. Complying with them, Rohrer says, would have meant the chatbot no longer worked. The shutdown followed.
Train a chatbot yourself
As The Register explains, Rohrer had used his access to a GPT-3 account to develop the chat application called "Project December". For a fee, anyone could try out what was then by far the most powerful language model. Users could chat with various bots trained by Rohrer, including "William", who simulated William Shakespeare, and "Samantha", modeled on the AI of the same name in the science-fiction film "Her". Those who wanted to could also train a bot themselves. That is exactly what one user did with texts from his deceased fiancée; a report on "Project December" then led to rapidly growing usage figures.
Rohrer, who has since migrated "Project December" to a different language model and is now testing it, says that given OpenAI's strict requirements it is impossible to build any interesting products on GPT-3. Anyone who wants to push chatbot development forward will run into this limitation. He finds OpenAI's reasoning that such chatbots could be dangerous laughable. If the startup worries that an AI could persuade someone to kill another person or themselves, it is being "hyper-moral". He refuses to monitor the chats, because conversations between a person and an AI are the most private of all; that is precisely why people are so open in them. A year ago he was still skeptical about how lifelike such a chat could be, but "Samantha" gave him goosebumps.