Neo4j boosts cloud database performance with new functionality
Neo4j has announced significant new functionality that promises to enhance the performance of its cloud databases. The company states the enhancements can deliver analytical queries up to 100 times faster and enable real-time decision-making on mission-critical data.
Customers running Neo4j in the cloud or managing it themselves stand to benefit from the ability to handle both transactional and analytical processing within a single database, Neo4j states.
The company already counts 75% of the Fortune 100, along with more than 250,000 data scientists, developers and architects, among its customers. Notable users include Commonwealth Bank of Australia, Telstra, Standard Chartered Bank, DBS Bank, and other corporations and government agencies.
Sudhir Hasbe, Chief Product Officer at Neo4j, said: "Neo4j's integration of operational and analytical workloads within a single database is now enhanced by the power of parallel runtime and change data capture, empowering our customers with real-time insights, cost-efficient data management, and simplified architecture."
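To make the parallel runtime concrete: in recent Neo4j 5.x releases a read query can opt into it by prefixing the Cypher statement with "CYPHER runtime=parallel", which spreads the work across multiple CPU cores. The sketch below is a minimal illustration using the official Neo4j Python driver; the connection details and the Person/Company graph model are placeholder assumptions, not part of the announcement.

```python
# Minimal sketch of running an analytical query on the parallel runtime.
# Assumptions: a Neo4j 5.x instance at bolt://localhost:7687 with placeholder
# credentials and a hypothetical Person/Company graph.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

# The "CYPHER runtime=parallel" prefix asks the planner to execute this read
# query on the parallel runtime instead of the default single-threaded one.
analytical_query = """
CYPHER runtime=parallel
MATCH (p:Person)-[:WORKS_AT]->(c:Company)
RETURN c.name AS company, count(p) AS headcount
ORDER BY headcount DESC
LIMIT 10
"""

records, summary, _ = driver.execute_query(analytical_query, database_="neo4j")
for record in records:
    print(record["company"], record["headcount"])

driver.close()
```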
Customers that have already leveraged the enhancements include Dropbox, which uses Neo4j's Change Data Capture functionality to keep its various data sources synchronised - a crucial component for ensuring content can be accurately searched and located. Law enforcement agencies also use the new capabilities to respond swiftly to mission-critical events, helping them solve and prevent crimes more effectively.
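As an illustration of how Change Data Capture can be consumed, the hedged sketch below polls a Neo4j 5.x database for change events. It assumes CDC is enabled and that the db.cdc.current and db.cdc.query procedures are available; the credentials are placeholders and the downstream handling is reduced to a print statement.

```python
# Hedged sketch: poll Neo4j Change Data Capture for new events and forward them
# downstream. Assumes CDC is enabled on the database (Neo4j 5.x) and uses
# placeholder connection details.
import time
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def poll_changes(cursor):
    # db.cdc.query returns the changes recorded after the given change identifier.
    records, _, _ = driver.execute_query(
        "CALL db.cdc.query($cursor)", cursor=cursor, database_="neo4j"
    )
    for record in records:
        # In a real integration this is where each event would be pushed to the
        # system that has to stay in sync (a search index, a cache, etc.).
        print(record["event"])
    # Advance the cursor to the last change seen, if any.
    return records[-1]["id"] if records else cursor

# Start from the current position in the change stream, then poll periodically.
records, _, _ = driver.execute_query("CALL db.cdc.current()", database_="neo4j")
cursor = records[0]["id"]
while True:
    cursor = poll_changes(cursor)
    time.sleep(5)
```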
Recently, Neo4j also integrated native vector search into its core database capabilities, enabling more transparent, explainable and accurate results for generative AI applications. The company was recognised in the 2022 Gartner Magic Quadrant for Cloud Database Management Systems, the first appearance in the report by any native graph vendor.
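For readers curious what native vector search looks like in practice, here is a minimal, hedged sketch using the Neo4j Python driver: it creates a vector index and queries it with the db.index.vector.queryNodes procedure. The index name, node label, embedding dimensions and the hard-coded query vector are illustrative only; in a real application the embeddings would come from an embedding model.

```python
# Hypothetical sketch of Neo4j native vector search (recent 5.x syntax).
# Index name, label, property, dimensions and credentials are placeholders.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

# Create a vector index over a node property that holds embeddings.
driver.execute_query("""
CREATE VECTOR INDEX doc_embeddings IF NOT EXISTS
FOR (d:Document) ON (d.embedding)
OPTIONS {indexConfig: {
  `vector.dimensions`: 384,
  `vector.similarity_function`: 'cosine'
}}
""")

# Query the index for the 5 documents most similar to a query embedding.
query_embedding = [0.1] * 384  # placeholder; normally produced by an embedding model
records, _, _ = driver.execute_query(
    """
    CALL db.index.vector.queryNodes('doc_embeddings', 5, $embedding)
    YIELD node, score
    RETURN node.title AS title, score
    """,
    embedding=query_embedding,
)
for record in records:
    print(record["title"], record["score"])

driver.close()
```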
The company continues to advance its offerings both on its own and with partners. At the keynote of DockerCon, Docker's annual global developer conference, Docker, Inc., together with partners Neo4j, LangChain, and Ollama, announced a new GenAI Stack designed to help developers get a running start with generative AI applications in minutes.
Eliminating the need to hunt down, cobble together and configure technologies from different sources, the GenAI Stack is pre-configured, ready-to-code and secure, combining large language models (LLMs) from Ollama, vector and graph databases from Neo4j, and the LangChain framework. Docker also announced its first AI-powered product, Docker AI.
"Developers are excited by the possibilities of GenAI, but the rate of change, number of vendors, and wide variation in technology stacks makes it challenging to know where and how to start," said Docker CEO Scott Johnston. "Today's announcement eliminates this dilemma by enabling developers to get started quickly and safely using the Docker tools, content, and services they already know and love together with partner technologies on the cutting edge of GenAI app development."