Microsoft has announced that it is adding ChatGPT to its Azure OpenAI Service, joining the suite of AI solutions that can be integrated through its cloud platform.
The generative AI chatbot offering, based on an advanced large language model (LLM), will start usage billing on March 13 at $0.002 per 1,000 tokens, though initially only for Microsoft-managed customers and partners who have been vetted and granted access to the service.
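At that rate, estimating a bill is simple arithmetic. The sketch below is a hypothetical helper (not part of any Azure SDK) that computes the charge for a given token count, assuming the quoted $0.002 per 1,000 tokens:

```python
# Rate assumed from the announcement: $0.002 per 1,000 tokens.
RATE_PER_1K_TOKENS = 0.002

def estimate_cost(tokens: int) -> float:
    """Return the estimated charge in USD for the given token count."""
    return tokens / 1000 * RATE_PER_1K_TOKENS

# Example: a month of traffic totaling 5 million tokens
print(f"${estimate_cost(5_000_000):.2f}")  # → $10.00
```

Note that billed tokens cover both the prompt and the generated response, so real usage adds up on both sides of a conversation.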
Availability was announced by Microsoft AI exec Eric Boyd, who said, “Since ChatGPT was introduced late last year, we’ve seen a variety of scenarios it can be used for, such as summarizing content, generating suggested email copy, and even helping with software programming questions. Now with ChatGPT in preview in Azure OpenAI Service, developers can integrate custom AI-powered experiences directly into their own applications, including enhancing existing bots to handle unexpected questions, recapping call center conversations to enable faster customer support resolutions, creating new ad copy with personalized offers, automating claims processing, and more.”
A Microsoft Azure blog post elaborates on those use cases, noting that developers can use the generative AI technology to enhance existing bots, summarize call center conversations for faster customer support resolutions, draft ad copy with personalized offers, automate claims processing, and more.
Microsoft says customers and partners can also create new intelligent apps and solutions using a no-code approach in Azure OpenAI Studio, which the company says provides an interface for customizing ChatGPT and configuring response behavior to align with an organization's needs.
Microsoft’s cloud service also features a number of other AI models from OpenAI, including GPT-3.5, Codex, and DALL-E. Microsoft combines tools like ChatGPT and DALL-E with Azure data handling, management, and scaling. The software maker uses Azure OpenAI to power GitHub Copilot, Power BI, Microsoft Teams Premium, Viva Sales, and Microsoft’s new Bing chatbot.
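For developers, integration happens over a REST API against a model deployment in their own Azure resource. The sketch below shows the general shape of a chat request; the resource name, deployment name, and API version strings are illustrative assumptions, not values from the announcement:

```python
import json

# Hypothetical values -- replace with your own Azure OpenAI resource details.
RESOURCE = "my-resource"            # assumed Azure resource name
DEPLOYMENT = "my-chatgpt"           # assumed model deployment name
API_VERSION = "2023-03-15-preview"  # assumed preview API version

# Endpoint for a chat completion against the deployment.
url = (
    f"https://{RESOURCE}.openai.azure.com/openai/deployments/"
    f"{DEPLOYMENT}/chat/completions?api-version={API_VERSION}"
)

# Chat-style request body: a list of role/content messages, so the model
# can be primed with instructions before the user's input.
body = {
    "messages": [
        {"role": "system", "content": "You are a customer-support assistant."},
        {"role": "user", "content": "Summarize this call transcript: ..."},
    ],
    "max_tokens": 256,
}

print(url)
print(json.dumps(body, indent=2))
```

Sending this body with an authenticated POST (for example via `requests`) returns the model's reply; the same messages format underpins the scenarios Boyd describes, from summarization to ad-copy generation.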