OpenAI’s ChatGPT to be available as an API, coming to Microsoft’s Azure: What it means

OpenAI’s ChatGPT chatbot will soon be available via an API (application programming interface), potentially allowing businesses to incorporate the chatbot into their own programs and applications. This could also open up a new revenue source for OpenAI if it starts charging for the API. In addition to the API, ChatGPT will be made available to Microsoft enterprise customers as part of the Azure OpenAI Service. The OpenAI service on Azure lets Microsoft’s enterprise customers apply OpenAI’s AI models to their business applications.

ChatGPT as an API: What does this mean?

In an announcement post on Twitter, OpenAI posted a link to a form for developers who are interested in accessing ChatGPT as an API. The form notes, “We have been blown away by the excitement around ChatGPT and the desire from the developer community to have an API available. If you are interested in a ChatGPT API, please fill out this form to stay up to date on our latest offerings.” Once the API is made available, other businesses that sign up could plug ChatGPT into their business applications. For example, delivery businesses could use ChatGPT to answer user queries using the API.
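At the time of writing, OpenAI has not published the ChatGPT API itself, so its exact endpoint and parameters are not yet known. As a rough illustration of what such an integration could look like, the sketch below calls OpenAI’s existing GPT-3.5 completions API through the openai Python library; the model name, prompt, and delivery-support scenario are illustrative assumptions, not details from OpenAI’s announcement.

```python
# Illustrative sketch only: the ChatGPT API has not been released yet, so this
# uses OpenAI's existing GPT-3.5 completions endpoint (text-davinci-003) as a
# stand-in for how a business might answer customer queries programmatically.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]  # API key issued by OpenAI

def answer_customer_query(question: str) -> str:
    """Send a customer question (e.g. from a delivery app) to the model."""
    response = openai.Completion.create(
        model="text-davinci-003",  # GPT-3.5-family model available today
        prompt=(
            "You are a helpful support assistant for a delivery service.\n"
            f"Customer: {question}\nAssistant:"
        ),
        max_tokens=150,
        temperature=0.2,  # keep answers fairly deterministic
    )
    return response["choices"][0]["text"].strip()

if __name__ == "__main__":
    print(answer_customer_query("Where is my order?"))
```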

ChatGPT is a conversational chatbot built on the company’s own GPT-3.5 large language model (LLM). It was made available to the public for testing in November 2022 and has gone viral since then, with many arguing that it will change how users search for information. Its popularity has also reportedly prompted a ‘code red’ at Google, which has chatbots of its own, though they are not publicly available.

What makes ChatGPT revolutionary right now is that it can seemingly answer many kinds of user queries, from SEO search terms to coding-related questions to long essays. However, experts have also cautioned that ChatGPT is not entirely accurate and that many of its answers, while appearing correct, are wrong.

Still, ChatGPT has become the talk of AI and tech in 2023. OpenAI CEO Sam Altman has also tweeted in the past about how many of the company’s launches would not be possible without Microsoft. He posted last year that “Microsoft, and particularly Azure, don’t get nearly enough credit for the stuff openai launches. they do an amazing amount of work to make it happen; we are deeply grateful for the partnership. they have built by far the best AI infra out there.”

But running these queries is also expensive, especially given that ChatGPT crossed one million users in just over a week. Altman admitted in a reply to a query from Elon Musk that the average cost was in “single digits cents per chat,” but added that OpenAI will have to “monetise it somehow at some point; the compute costs are eye-watering.” Some estimates put the cost at $3 million per month, a figure that has likely grown, given that it runs on the Azure cloud platform. The API could help offset some of these costs.
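Those per-chat and monthly figures are roughly consistent with a simple back-of-envelope calculation. The sketch below uses assumed inputs (about one million chats a day at a few cents each) purely to show how such an estimate is arrived at; these are not numbers confirmed by OpenAI.

```python
# Back-of-envelope estimate of ChatGPT's compute bill. All inputs are
# assumptions for illustration; OpenAI has not published exact figures.
cost_per_chat_usd = 0.09   # upper end of "single digit cents per chat"
chats_per_day = 1_000_000  # assumed: roughly one chat per user per day

daily_cost = cost_per_chat_usd * chats_per_day
monthly_cost = daily_cost * 30

print(f"~${daily_cost:,.0f} per day, ~${monthly_cost:,.0f} per month")
# ~$90,000 per day, ~$2,700,000 per month -- in the ballpark of the
# $3 million/month estimates cited above.
```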

ChatGPT coming to businesses with Azure

Meanwhile, Microsoft’s Azure cloud service will make OpenAI’s ChatGPT accessible to its enterprise customers as well. According to an official announcement, Microsoft’s enterprise customers who use the Azure cloud service will soon be able to access ChatGPT via the Azure OpenAI Service. Keep in mind that ChatGPT itself was trained on, and runs inference on, Azure’s AI infrastructure. Microsoft, an early investor in the startup, is also reportedly in talks to invest another $10 billion in OpenAI.

“ChatGPT is coming soon to the Azure OpenAI Service, which is now generally available, as we help customers apply the world’s most advanced AI models to their own business imperatives,” wrote Microsoft CEO Satya Nadella in a tweet.

In a blog post, Eric Boyd, Corporate Vice President of AI Platform at Microsoft, wrote that the Azure OpenAI Service is now generally available, which will enable more businesses to “apply for access to the most advanced AI models in the world—including GPT-3.5, Codex, and DALL•E 2—backed by the trusted enterprise-grade capabilities and AI-optimized infrastructure of Microsoft Azure, to create cutting-edge applications.” Azure also provides the core computing power behind OpenAI’s many AI models.

Microsoft first opened the Azure OpenAI Service back in November 2021. Boyd also cited customers already using the Azure OpenAI Service, including Moveworks, Al Jazeera Digital, and KPMG. The Azure OpenAI Service is also what Microsoft uses to power its own products, such as GitHub Copilot, which helps developers write better code, and Power BI, which uses GPT-3-powered natural language to automatically generate formulae and expressions. Microsoft Designer, which helps creators build content with natural language prompts, again relies on OpenAI’s models.

Enterprise customers will still have to apply for access to the Azure OpenAI Service. Once approved, they can log in to the Azure portal to create an Azure OpenAI Service resource and get started. Developers must describe their “intended use case or application before they are given access to the service,” adds the post. “In the event of a confirmed policy violation, we may ask the developer to take immediate action to prevent further abuse.”
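Once a resource and a model deployment exist, calling the service looks much like calling OpenAI directly, only pointed at the Azure endpoint. The sketch below is a minimal example using the openai Python library’s Azure mode; the resource name, deployment name, and API version are placeholders, and since ChatGPT itself is only “coming soon,” it uses a GPT-3.5 completions deployment instead.

```python
# Minimal sketch of calling the Azure OpenAI Service with the openai Python
# library. Resource name, deployment name and API version are placeholders;
# ChatGPT is not yet exposed here, so a GPT-3.5 completions deployment is used.
import os
import openai

openai.api_type = "azure"
openai.api_base = "https://<your-resource-name>.openai.azure.com/"  # Azure resource endpoint
openai.api_version = "2022-12-01"                                   # assumed API version
openai.api_key = os.environ["AZURE_OPENAI_KEY"]                     # key from the Azure portal

response = openai.Completion.create(
    engine="my-gpt35-deployment",  # name given to the model deployment in Azure
    prompt="Summarise this support ticket in one sentence: ...",
    max_tokens=100,
)
print(response["choices"][0]["text"].strip())
```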
