Microsoft has significantly enhanced Azure AI Search, formerly known as Azure Cognitive Search, making it a more cost-effective and powerful tool for developers building generative AI applications. The service now delivers more capacity per dollar spent, a boon for efficiency and scaling, and this cost improvement comes from major increases in vector index and storage limits.
Developers can now scale their applications to a “multi-billion vector index” within a single search service without sacrificing the quality, speed, or performance users have come to expect from Microsoft’s cloud services. The upgrade amounts to an eleven-fold increase in vector index size, a six-fold increase in total storage capacity, and a doubling of indexing and query throughput. These advances are crucial for keeping pace with the growing demands of sophisticated generative AI applications.
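For context, the vector indexes these limits apply to are defined through an index schema on the search service. The sketch below shows how a developer might define a vector-enabled index with the azure-search-documents Python SDK (version 11.4 or later is assumed); the endpoint, API key, index and field names, and embedding dimensionality are illustrative placeholders, not values from the announcement.

```python
from azure.core.credentials import AzureKeyCredential
from azure.search.documents.indexes import SearchIndexClient
from azure.search.documents.indexes.models import (
    SearchIndex,
    SearchField,
    SearchFieldDataType,
    SimpleField,
    SearchableField,
    VectorSearch,
    HnswAlgorithmConfiguration,
    VectorSearchProfile,
)

# Placeholder service endpoint and admin key -- substitute real values.
client = SearchIndexClient(
    endpoint="https://<service>.search.windows.net",
    credential=AzureKeyCredential("<admin-api-key>"),
)

# A minimal index: an id, the document text, and its embedding vector.
index = SearchIndex(
    name="docs-vector-index",
    fields=[
        SimpleField(name="id", type=SearchFieldDataType.String, key=True),
        SearchableField(name="content", type=SearchFieldDataType.String),
        SearchField(
            name="embedding",
            type=SearchFieldDataType.Collection(SearchFieldDataType.Single),
            searchable=True,
            vector_search_dimensions=1536,  # assumes a 1536-dimensional embedding model
            vector_search_profile_name="default-profile",
        ),
    ],
    # HNSW is the approximate-nearest-neighbor algorithm used for vector search.
    vector_search=VectorSearch(
        algorithms=[HnswAlgorithmConfiguration(name="hnsw-config")],
        profiles=[
            VectorSearchProfile(
                name="default-profile", algorithm_configuration_name="hnsw-config"
            )
        ],
    ),
)

client.create_or_update_index(index)
```

The higher per-service limits mean an index like this can hold far more vectors before a developer needs to shard across services or move to a larger pricing tier.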
Extended Capabilities and Market Access
Azure AI Search has broadened its reach, rolling out to additional regions across the U.S., the U.K., Europe, Asia Pacific, and the Americas. The expansion lets users in more markets tap into powerful AI applications, transforming how industries interact with AI. Microsoft has also integrated Azure AI Search with OpenAI’s language models, including ChatGPT and the GPT series, via the Assistants API, bringing sophisticated language AI into Azure for a large user base and developer community. ChatGPT alone reports 100 million weekly active users, which speaks to the popularity and potential of such collaborations. These updates reflect Microsoft’s commitment to adapting its AI offerings to user demand and industry trends, setting the stage for Azure AI Search to be used more widely in innovative applications.
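At a high level, this kind of integration follows the retrieval-augmented generation pattern: embed a user question, retrieve the closest documents from the search index, and pass them to a GPT model as grounding context. The sketch below is one hypothetical way to wire that up, assuming the index from the earlier example and the openai Python package; for brevity it uses the plain chat-completions endpoint rather than the Assistants API mentioned above, and the model names, keys, and field names are assumptions, not details from the announcement.

```python
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from azure.search.documents.models import VectorizedQuery
from openai import OpenAI

# Placeholder endpoints and keys -- substitute real service values.
search_client = SearchClient(
    endpoint="https://<service>.search.windows.net",
    index_name="docs-vector-index",
    credential=AzureKeyCredential("<search-api-key>"),
)
oai = OpenAI(api_key="<openai-api-key>")

question = "What did Microsoft change in Azure AI Search?"

# 1. Embed the question (the embedding model name is an assumption).
embedding = oai.embeddings.create(
    model="text-embedding-3-small", input=question
).data[0].embedding

# 2. Retrieve the closest documents from the vector index.
vector_query = VectorizedQuery(
    vector=embedding, k_nearest_neighbors=3, fields="embedding"
)
results = search_client.search(search_text=None, vector_queries=[vector_query])
context = "\n\n".join(doc["content"] for doc in results)

# 3. Ask a GPT model to answer using the retrieved passages as grounding.
answer = oai.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ],
)
print(answer.choices[0].message.content)
```

The same retrieve-then-generate flow underlies assistant-style applications; the announced capacity and throughput increases matter because the retrieval step has to stay fast even as the index grows into the billions of vectors.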