Middleware unveils LLM observability & Query Genie tools

Middleware has expanded its cloud observability platform with the introduction of Large Language Model (LLM) Observability and Query Genie.

Laduram Vishnoi, Founder and CEO of Middleware, remarked, "AI is transforming IT, and observability is no exception. It's speeding up incident response, automating tedious tasks, and making it easier for non-tech teams to access data—boosting efficiency and smarter decision-making across the board. Middleware aims to harness this power to drive innovation."

The newly introduced Query Genie lets users search and retrieve relevant information from infrastructure and log data using natural language queries. This removes the need for manual searching and complex query languages, enabling developers to make quicker, data-driven decisions.

Query Genie is also equipped with advanced observability features for infrastructure data, as well as an intuitive user interface and real-time data analysis capabilities, all while maintaining data privacy and confidentiality standards.
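As a purely illustrative contrast, and not a reflection of Middleware's actual query syntax or API, the sketch below shows the kind of plain-English question a user might put to Query Genie alongside the sort of hand-written query such tools aim to replace.

```python
# Illustrative only: a plain-English question next to a generic, hand-written
# log query of the kind natural-language search is meant to replace.
# Neither string reflects Middleware's actual query language or API.
natural_language_question = "Which services logged the most 5xx errors in the last hour?"

equivalent_manual_query = """
SELECT service_name, COUNT(*) AS error_count
FROM logs
WHERE status_code >= 500
  AND timestamp > NOW() - INTERVAL '1 hour'
GROUP BY service_name
ORDER BY error_count DESC;
"""

print(natural_language_question)
print(equivalent_manual_query)
```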

Middleware has also added LLM Observability to its AI observability tools, a move Vishnoi attributed to customer demand: "In response to overwhelming customer demand, we've expanded our AI observability capabilities with the introduction of LLM Observability. This enhancement allows customers to gain unparalleled insights into their AI systems, ensuring optimal performance and responsiveness."

The LLM Observability feature in Middleware's platform offers real-time monitoring, troubleshooting, and optimisation for applications powered by LLMs, enabling organisations to proactively address performance issues, identify biases, and improve decision-making. It includes comprehensive tracing and customisable metrics for in-depth insights into LLM performance.

To further assist with monitoring and troubleshooting, Middleware includes pre-built dashboards and integrates with established LLM providers and instrumentation frameworks such as Traceloop and OpenLIT.
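For a concrete picture, the sketch below shows how an LLM-backed application might be instrumented with OpenLIT, one of the frameworks named above. The OTLP endpoint, model name, and prompt are placeholder assumptions rather than Middleware-documented values; real settings would come from your own backend configuration.

```python
import openlit
from openai import OpenAI

# Point OpenLIT's OTLP exporter at your observability backend's ingest
# endpoint. The URL below is a placeholder, not a Middleware-specific value.
openlit.init(otlp_endpoint="http://localhost:4318")

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# OpenLIT auto-instruments this call, emitting traces and token/latency
# metrics over OTLP so a compatible observability backend can ingest them.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Summarise today's error logs."}],
)
print(response.choices[0].message.content)
```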

Tejas Kokje, Head of Engineering at Middleware, explained, "Middleware leverages AI and ML to dynamically analyze and transform telemetry data, reducing redundancy and optimizing costs through our advanced pipeline capabilities for logs, metrics, traces, and Real User Monitoring (RUM). With support for various LLM providers, vector databases, frameworks, and NVIDIA GPUs, Middleware empowers organizations to monitor model performance with granular metrics, optimize resource usage, and manage costs effectively, all while delivering real-time alerts that drive proactive decision-making. Ultimately, we strive to deliver observability powered by AI and designed for AI."
