Live for about two months, ChatGPT has already won many hearts. While access to the chatbot has been free so far, OpenAI eventually intends to monetize the Artificial Intelligence (AI) system through a paid version.
It will be a proposal aimed at those who seek, among other things, faster answers.
ChatGPT can already be considered one of the most popular AI models today, and OpenAI clearly intends to improve it even further. For now, anyone who wants to test it can do so by signing up on the platform. Soon, however, that may change.
After all, OpenAI announced, through its official Discord channel, that a paid version of ChatGPT will arrive in the future. The objective is clear: to monetize a chatbot that has won over users and is already generating plenty of talk.
Our aim is to continue to improve and maintain the service, and monetization is one way we are considering to ensure its long-term viability.
Explained OpenAI on Discord.
What's good ends quickly! Or not, OpenAI?
The professional plan will have advantages over the version we know, such as faster and unlimited responses, as well as immediate availability of the service. Although the free version is open to any user, it is not always operational due to high demand; sometimes you really do have to wait to ask a question.
The paid professional version is still in an experimental phase and, although it is on the table, when it arrives (if it arrives), it should not replace the free one, but rather function as an alternative, more complete service.
For now, OpenAI is not monetizing ChatGPT: it offers the chatbot service for free and does not display any advertising. However, its development and maintenance appear to be costly. According to an estimate by Tom Goldstein, a researcher and professor in the Department of Computer Science at the University of Maryland, OpenAI spends around $100,000 a day, or $3 million a month, to keep ChatGPT operational.
I estimate the cost of running ChatGPT is $100K per day, or $3M per month. This is a back-of-the-envelope calculation. I assume nodes are always in use with a batch size of 1. In reality they probably batch during high volume, but have GPUs sitting fallow during low volume.
— Tom Goldstein (@tomgoldsteincs) December 6, 2022
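Goldstein's figure is, as he says, a back-of-the-envelope calculation. A sketch of that kind of estimate is shown below; the per-GPU hourly price and the number of GPUs are illustrative assumptions chosen so the totals land near his quoted numbers, not values from the article or the tweet.

```python
# Back-of-the-envelope sketch of a "cost to keep the service running" estimate.
# All inputs are illustrative assumptions, not figures from Goldstein or OpenAI.
HOURLY_COST_PER_GPU = 3.00  # assumed cloud price per GPU-hour, in USD
GPUS_IN_USE = 1400          # assumed number of GPUs kept running around the clock
HOURS_PER_DAY = 24
DAYS_PER_MONTH = 30

daily_cost = HOURLY_COST_PER_GPU * GPUS_IN_USE * HOURS_PER_DAY
monthly_cost = daily_cost * DAYS_PER_MONTH

print(f"~${daily_cost:,.0f} per day")      # ~$100,800 per day
print(f"~${monthly_cost:,.0f} per month")  # ~$3,024,000 per month
```

Under these assumptions the result comes out close to the quoted $100K/day and $3M/month; as Goldstein notes, real usage would be batched at peak hours and leave GPUs idle at off-peak, so the true figure could differ in either direction.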