Elastic expands Elasticsearch API support to Amazon Bedrock models
Elastic has announced that its Elasticsearch Open Inference API and Playground now support Amazon Bedrock-hosted models. The integration gives developers greater flexibility in choosing among the large language models (LLMs) available on Amazon Bedrock when building production-ready retrieval-augmented generation (RAG) applications.
Shay Banon, founder and chief technology officer at Elastic, highlighted the benefits of the integration. "Our latest integration with Amazon Bedrock continues our focus on making it easier for AWS developers to build next-generation search experiences," he said. "By leveraging Elasticsearch and Amazon Bedrock's extensive model library, developers can deliver transformative conversational search."
The new support extends Elasticsearch's capabilities for developers working with models hosted on Amazon Bedrock. They can now store and utilise embeddings, refine retrieval to ground answers in proprietary data, and more. Amazon Bedrock models are also available in the low-code Playground experience, giving developers more options when A/B testing LLMs.
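As a rough illustration of what the integration looks like in practice, the sketch below registers a Bedrock-backed completion endpoint through the Elasticsearch Inference API and then sends it a prompt. The cluster URL, credentials, model identifier, and service settings are placeholders, and the exact field names should be checked against Elastic's current Inference API documentation rather than taken as definitive.

```python
# Minimal sketch: creating an Amazon Bedrock-backed inference endpoint in
# Elasticsearch and querying it. Host, credentials, and settings are placeholders.
import requests

ES_URL = "https://localhost:9200"   # Elasticsearch endpoint (placeholder)
AUTH = ("elastic", "changeme")      # basic auth credentials (placeholder)

# Register an inference endpoint backed by an Amazon Bedrock-hosted model.
requests.put(
    f"{ES_URL}/_inference/completion/bedrock-chat",
    auth=AUTH,
    json={
        "service": "amazonbedrock",
        "service_settings": {
            "access_key": "<aws-access-key>",   # AWS credentials (placeholders)
            "secret_key": "<aws-secret-key>",
            "region": "us-east-1",
            "provider": "anthropic",            # Bedrock model provider
            "model": "<bedrock-model-id>",      # model identifier on Bedrock
        },
    },
    timeout=30,
)

# Send a prompt to the newly registered endpoint.
response = requests.post(
    f"{ES_URL}/_inference/completion/bedrock-chat",
    auth=AUTH,
    json={"input": "Summarise our latest support tickets about login failures."},
    timeout=60,
)
print(response.json())
```

A similar call with the `text_embedding` task type would create an endpoint for generating the embeddings mentioned above, which can then be stored and searched in Elasticsearch to ground RAG answers in proprietary data.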
Support for Amazon Bedrock is available immediately. Developers interested in the new features can find further details in the Inference API and Playground blog posts.