ChatGPT Gets a Public API: The Neural Network Can Be Integrated Into Any Service
On March 1, OpenAI announced that third-party developers can now integrate ChatGPT into their applications and services via an API, at a price roughly ten times lower than that of the existing GPT-3.5 models. The company is also making Whisper, its AI-based automatic speech recognition system, available through the same API.
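For the speech recognition side, the call is a simple file upload. Below is a minimal sketch using the openai Python package as it shipped around the time of the announcement (v0.27); the file name "meeting.mp3" and the API key placeholder are illustrative assumptions.

```python
# Sketch: transcribing audio with the hosted Whisper API (openai v0.27-era interface).
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder, not a real key

with open("meeting.mp3", "rb") as audio_file:
    # "whisper-1" is the model identifier OpenAI exposed for the hosted Whisper service.
    transcript = openai.Audio.transcribe("whisper-1", audio_file)

print(transcript["text"])  # plain-text transcription of the audio
```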
According to OpenAI president Greg Brockman, usage will be priced at $0.002 per 1,000 tokens (roughly 750 words), and the API can also be embedded in "non-chat" applications.
The first customers include Snap, Quizlet, Instacart, and Shopify.
To reduce incorrect and fabricated responses, OpenAI has introduced the Chat Markup Language (ChatML). Text is passed to the ChatGPT API as a sequence of messages, each carrying metadata, which distinguishes it from standard ChatGPT, where input is raw text represented as a single series of tokens.
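In practice, developers see this message structure directly in the chat endpoint. The sketch below uses the openai Python package of that period (v0.27); the prompt contents are invented for illustration, and the library serializes the message list to ChatML on the developer's behalf.

```python
# Sketch: a chat-format request, where each message pairs text with role metadata.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder, not a real key

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        # Roles ("system", "user", "assistant") are the metadata that ChatML attaches,
        # instead of feeding the model one undifferentiated token stream.
        {"role": "system", "content": "You are a concise support assistant."},
        {"role": "user", "content": "How do I reset my password?"},
    ],
)

print(response["choices"][0]["message"]["content"])
```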
In addition, with the release of gpt-3.5-turbo, OpenAI will start updating ChatGPT more frequently.
This will give clients more control over the load on their allocated capacity. Clients will also be able to use gpt-3.5-turbo with a context window of up to 16,000 tokens, four times as many as the standard ChatGPT model processes. This makes it possible to paste whole pages of text into a single prompt and still get coherent responses from the model.
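A quick way to see how pages of text relate to the quoted limits and pricing is to count tokens before sending a request. The sketch below assumes the tiktoken package, the $0.002 per 1,000 tokens price quoted above, and the extended 16,000-token window; standard gpt-3.5-turbo was capped at about 4,096 tokens.

```python
# Sketch: estimating token count and request cost before calling the API.
import tiktoken

PRICE_PER_1K_TOKENS = 0.002   # USD, gpt-3.5-turbo price quoted at launch
CONTEXT_LIMIT = 16_000        # extended context window mentioned above

enc = tiktoken.encoding_for_model("gpt-3.5-turbo")

def estimate(text: str) -> None:
    n_tokens = len(enc.encode(text))
    cost = n_tokens / 1000 * PRICE_PER_1K_TOKENS
    verdict = "fits" if n_tokens <= CONTEXT_LIMIT else "exceeds"
    print(f"{n_tokens} tokens (~${cost:.4f}), {verdict} the {CONTEXT_LIMIT}-token window")

# A few pages of repeated text as a stand-in for a long prompt.
estimate("Whole pages of text can be pasted into a single prompt. " * 200)
```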