DataStax delivers a production-ready, massively scalable, HIPAA-enabled, PCI-compliant vector database to fuel enterprise-wide generative AI applications
DataStax, the real-time AI company, has announced the general availability (GA) of its vector search capability in Astra DB – the popular database-as-a-service (DBaaS) built on the open source Apache Cassandra® database. The new capability handles orders of magnitude more data, at lower latency, than other leading databases, for building game-changing generative AI applications.
A database that supports vector search can store data as ‘vector embeddings’ – numerical representations of text, images, and other content – which is essential to delivering generative AI applications like those built on GPT-4. With new availability on Microsoft Azure and Amazon Web Services (AWS), adding to initial availability on Google Cloud, businesses can now use Astra DB as a vector database to power their AI initiatives on any major cloud, with best-in-class performance that leverages the speed and limitless scale of Cassandra. Additionally, vector search will be available within the month for customers running DataStax Enterprise, the on-premises, self-managed offering.
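To make the idea concrete, the sketch below shows one way embeddings might be stored and queried in Astra DB with the DataStax Python driver. It is an illustrative assumption rather than an official DataStax example: the keyspace, table, credentials, and 1536-dimension embedding size are placeholders, and the vector CQL syntax requires a driver version with vector support (3.28.0 or later).

```python
# Illustrative sketch only: store vector embeddings in Astra DB and run an
# approximate-nearest-neighbour (ANN) similarity query. Keyspace, table and
# credential values below are placeholders, not real resources.
from cassandra.cluster import Cluster
from cassandra.auth import PlainTextAuthProvider

cloud_config = {"secure_connect_bundle": "secure-connect-demo.zip"}  # placeholder bundle path
auth_provider = PlainTextAuthProvider("token", "AstraCS:...")        # placeholder Astra token
session = Cluster(cloud=cloud_config, auth_provider=auth_provider).connect("demo_keyspace")

# A table with a 1536-dimension vector column (a common embedding size) and an ANN index.
session.execute("""
    CREATE TABLE IF NOT EXISTS documents (
        id uuid PRIMARY KEY,
        body text,
        embedding vector<float, 1536>
    )
""")
session.execute("""
    CREATE CUSTOM INDEX IF NOT EXISTS documents_ann_idx ON documents (embedding)
    USING 'StorageAttachedIndex'
""")

# Vector search: return the stored documents whose embeddings are closest to a query embedding.
query_embedding = [0.1] * 1536  # in practice, produced by an embedding model (e.g. a 1536-dimension text encoder)
ann_query = session.prepare("SELECT body FROM documents ORDER BY embedding ANN OF ? LIMIT 5")
for row in session.execute(ann_query, [query_embedding]):
    print(row.body)
```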
Customers using Astra DB for their AI initiatives benefit from the vector database’s global scale and availability, as well as its support for the most stringent enterprise-level requirements for managing sensitive data, including PHI, PCI, and PII. The recent integration of Astra DB into the popular open source framework LangChain will continue to accelerate the adoption of generative AI for customers.
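That LangChain integration can be sketched along the following lines – a minimal, hedged example that assumes the Astra DB session from the previous snippet, an OpenAI API key in the environment, and a LangChain release from the time of this announcement (module paths have moved in newer versions):

```python
# Minimal sketch of using Astra DB as a LangChain vector store; the keyspace
# and table names are placeholders, and an OPENAI_API_KEY is assumed to be set.
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import Cassandra

vector_store = Cassandra(
    embedding=OpenAIEmbeddings(),   # any embedding model supported by LangChain
    session=session,                # Astra DB session from the driver (see previous sketch)
    keyspace="demo_keyspace",       # placeholder keyspace
    table_name="documents_lc",      # placeholder table managed by the integration
)

# Index proprietary text, then retrieve the passages most relevant to a user question,
# ready to be passed as context to a large language model.
vector_store.add_texts(["Plan A covers physiotherapy.", "Plan B excludes dental."])
matches = vector_store.similarity_search("Does Plan A cover physiotherapy?", k=3)
```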
McKinsey now estimates that generative AI could add between US$2.4 trillion and US$4.2 trillion (AU$3.68 trillion and AU$6.14 trillion) in value to the global economy. Enterprises looking to participate in this ecosystem require a vector database that can power AI applications with their own proprietary data, offering their customers and stakeholders a dynamic and compelling user experience built on generative AI.
“Every company is looking at how it can turn the promise and potential of generative AI into a sustainable business initiative. Databases that support vectors – the ‘language’ of large language models – are crucial to making this happen,” said Ed Anuff, Chief Product Officer, DataStax.
“An enterprise will need trillions of vectors for generative AI, so vector databases must deliver limitless horizontal scale. Astra DB is the only vector database on the market today that can support massive-scale AI projects with enterprise-grade security, on any cloud platform. And it’s built on the open source technology that’s already been proven by AI leaders like Netflix and Uber,” Anuff continued.
“We are at the very early stages of identifying enterprise use cases for generative AI, but we expect adoption to grow rapidly and assert that, through 2025, one-quarter of organisations will deploy generative AI embedded in one or more software applications,” said Matt Aslett, VP and Research Director, Ventana Research. “The ability to trust the output of generative AI models will be critical to adoption by enterprises. The addition of vector embeddings and vector search to existing data platforms enables organisations to augment generic models with enterprise information and data, reducing concerns about accuracy and trust.”
SkyPoint is using Astra DB as a vector database on Microsoft Azure to help transform the senior living healthcare industry, which is currently burdened with operational costs of nearly 70 per cent.
“Employing generative AI and columnar data lakehouse technology, SkyPoint AI ensures seamless access to resident health data and administrative insights. Envision it as a ChatGPT equivalent for senior living enterprise data, maintaining full HIPAA compliance, and significantly improving healthcare for the elderly,” said Tisson Mathew, Chief Executive Officer, SkyPoint Cloud, Inc. “We have very tight SLAs for our chatbot, and our algorithms require multiple round-trip calls between the large language model and vector database. Initially, we were unable to meet our SLAs with our other vector stores, but we found we could meet our latency requirements using Astra DB.”