DataStax announces availability of vector search on Astra DB
DataStax, the real-time AI company, has announced the general availability of its vector search capability in Astra DB, the popular database-as-a-service (DBaaS) built on the open-source Apache Cassandra database. The company says the capability handles orders of magnitude more data at lower latency than other leading databases, a foundation for building game-changing generative AI applications.
A database that supports vector search can store data as "vector embeddings", numerical representations of text and other content that are essential to delivering generative AI applications like those built on GPT-4.
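To make the idea concrete, the sketch below stores a handful of rows with a vector-typed column and retrieves the closest matches to a query vector using CQL's approximate nearest neighbour (ANN) ordering. It is a minimal illustration only, assuming an existing Astra DB database with a keyspace named demo, a recent DataStax cassandra-driver with vector-type support, and an application token; the table, column names, and three-dimensional vectors are placeholders rather than details from the announcement.

```python
# Minimal sketch: storing and querying vector embeddings over CQL.
# Assumes an Astra DB database, a keyspace named "demo", and a recent
# cassandra-driver with vector support; names and vectors are placeholders.
from cassandra.cluster import Cluster
from cassandra.auth import PlainTextAuthProvider

cluster = Cluster(
    cloud={"secure_connect_bundle": "secure-connect-demo.zip"},  # bundle downloaded from Astra
    auth_provider=PlainTextAuthProvider("token", "AstraCS:..."),  # application token
)
session = cluster.connect("demo")

# A table with a vector column (3 dimensions here; real embedding models
# produce far larger vectors, e.g. 1536 dimensions).
session.execute("""
    CREATE TABLE IF NOT EXISTS documents (
        id int PRIMARY KEY,
        body text,
        embedding vector<float, 3>
    )
""")
session.execute("""
    CREATE CUSTOM INDEX IF NOT EXISTS documents_embedding_idx
    ON documents (embedding) USING 'StorageAttachedIndex'
""")

# Insert a few rows with their (placeholder) embeddings.
insert = session.prepare("INSERT INTO documents (id, body, embedding) VALUES (?, ?, ?)")
session.execute(insert, (1, "resident care plan", [0.1, 0.9, 0.0]))
session.execute(insert, (2, "billing summary", [0.8, 0.1, 0.1]))

# Approximate nearest-neighbour search: rank rows by similarity to a query vector.
rows = session.execute(
    "SELECT id, body FROM documents ORDER BY embedding ANN OF [0.2, 0.8, 0.0] LIMIT 2"
)
for row in rows:
    print(row.id, row.body)
```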
With new availability on Microsoft Azure and Amazon Web Services (AWS), adding to initial availability on Google Cloud, businesses can now use Astra DB as a vector database to power their AI initiatives on any cloud. Additionally, within the month, vector search will be available for customers running DataStax Enterprise, the on-premises, self-managed offering.
Customers using Astra DB for their AI initiatives benefit from the vector database's global scale and availability, as well as its support for the most stringent enterprise requirements for managing sensitive data, including PHI, PCI, and PII. The recent integration of Astra DB into the popular open-source framework LangChain should further accelerate the adoption of generative AI among customers.
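As a rough idea of what that LangChain integration can look like in practice, the sketch below indexes a few documents in an Astra DB-backed vector store and retrieves the passages most similar to a question. The class and parameter names (AstraDB, collection_name, api_endpoint, token) and the OpenAI embedding model are assumptions that vary across LangChain releases, not details taken from the announcement.

```python
# Minimal sketch of using Astra DB as a LangChain vector store.
# Class and parameter names are assumptions and differ between LangChain
# releases; an OpenAI API key and an Astra DB token/endpoint are assumed.
from langchain_community.vectorstores import AstraDB
from langchain_openai import OpenAIEmbeddings

vstore = AstraDB(
    embedding=OpenAIEmbeddings(),
    collection_name="support_articles",          # illustrative collection name
    api_endpoint="https://<db-id>-<region>.apps.astra.datastax.com",
    token="AstraCS:...",                         # Astra DB application token
)

# Index a few documents; the store computes and saves their embeddings.
vstore.add_texts([
    "How to reset a resident's medication schedule.",
    "Quarterly billing codes for assisted living facilities.",
])

# Retrieve the passages most similar to a natural-language question,
# ready to be passed to an LLM as grounding context.
for doc in vstore.similarity_search("medication schedule changes", k=2):
    print(doc.page_content)
```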
McKinsey estimates that generative AI could add between US$2.4 trillion and US$4.2 trillion in value to the global economy. Enterprises looking to participate in this ecosystem need a vector database that can power AI applications with their proprietary data, giving customers and stakeholders a dynamic and compelling user experience through generative AI.
"Every company is looking for how they can turn the promise and potential of generative AI into a sustainable business initiative. Databases that support vectors – the 'language' of large learning models – are crucial to making this happen," says Ed Anuff, chief product officer at DataStax.
"An enterprise will need trillions of vectors for generative AI so vector databases must deliver limitless horizontal scale. Astra DB is the only vector database on the market today that can support massive-scale AI projects, with enterprise-grade security, and on any cloud platform. And, it's built on the open source technology that's already been proven by AI leaders like Netflix and Uber," he continues.
"We are at the very early stages of identifying enterprise use-cases for generative AI but expect adoption to grow rapidly and assert that through 2025, one-quarter of organisations will deploy generative AI embedded in one or more software applications," says Matt Aslett, vice president and research director at Ventana Research.
"The ability to trust the output of generative AI models will be critical to adoption by enterprises. The addition of vector embeddings and vector search to existing data platforms enables organisations to augment generic models with enterprise information and data, reducing concerns about accuracy and trust."
SkyPoint uses Astra DB as a vector database on Microsoft Azure to help transform the senior living healthcare industry, which is currently burdened with operational costs of nearly 70%.
"Employing generative AI and columnar data lakehouse technology, SkyPoint AI ensures seamless access to resident health data and administrative insights. Envision it as a ChatGPT equivalent for senior living enterprise data, maintaining full HIPAA compliance, and significantly improving healthcare for the elderly," says Tisson Mathew, chief executive officer of SkyPoint Cloud.
"We have very tight SLAs for our chatbot and our algorithms require multiple round trip calls between the large language model and vector database. Initially, we were unable to meet our SLAs with our other vector stores, but then found we were able to meet our latency requirements using Astra DB."