LOGIQ.AI advances machine data management with DPaaS offering
LOGIQ.AI has launched LogFlow, an observability data pipeline as a service (DPaaS) for machine data management, designed to help enterprises put machine data to work by connecting it to SMEs on demand.
Greg O'Reilly, observability consultant at Visibility Platforms, says, "LogFlow enables our customers to take a whole new approach to observability data; one that helps regain control and unblock vendor or cost limitations.
"We're opening up discussions between ITOps and Security teams for the first time with a unified solution that keeps data secure, compliant, manageable, and readily available to those who need it on the front lines."
LOGIQ.AI CEO and co-founder Ranjan Parthasarathy says, "Enterprises have unfortunately been sold ‘block’ and ‘drop’ as intelligent features to counter back pressure and upstream unavailability in data pipelines.
"Block and drop is data loss in disguise. Imagine losing a vital signature in your log stream that points to impending ransomware starting to spread. Don't introduce new business risks by buying into block and drop."
According to the company, LogFlow eliminates block and drop by storing all streaming data in InstaStore, a storage innovation that enables object storage as primary storage.
In InstaStore, data is fully indexed and searchable in real-time. LogFlow also stores its indexes in InstaStore, giving a scalable platform with cleanly decoupled storage and compute.
LogFlow continues to ingest data even when upstream targets are down, and because all ingested data is indexed, it can replay precise slices of that data once targets recover.
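The buffer-and-replay pattern described here can be sketched generically. The following is an illustrative model only, not LogFlow's actual implementation; the class and method names are invented for the example:

```python
import time

class BufferedPipeline:
    """Illustrative buffer-and-replay pipeline: every event is persisted
    and indexed by timestamp, so a delivery failure never drops data."""

    def __init__(self, forward):
        self.forward = forward  # callable that pushes to the upstream target
        self.store = []         # stands in for durable, indexed object storage

    def ingest(self, event):
        # Persist first: ingestion never depends on target health.
        self.store.append((time.time(), event))
        try:
            self.forward(event)
        except ConnectionError:
            pass  # target is down; the event is safe in the store

    def replay(self, since, until):
        # Fine-grained replay: re-send only the indexed time slice.
        for ts, event in self.store:
            if since <= ts <= until:
                self.forward(event)
```

The key design point mirrored here is that persistence happens before delivery, so back pressure from an unavailable target cannot translate into data loss.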
LOGIQ.AI head of sales APAC and EMEA Jay Swamidass says, "InstaStore introduces a new paradigm for data agility that eliminates data loss and the need for storage tiering and data rehydration.
"Organisations can now unlock productivity, cost reduction, and compliance like never before."
In addition, LogFlow's native support for open standards makes it simpler to collect machine data from any source. LogFlow manages data flows with a flow-level routing table, much as network devices manage traffic flows.
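The routing-table analogy can be made concrete with a small sketch. The table format and names below are hypothetical, used only to illustrate the idea of mapping each data flow to downstream targets, the way a network routing table maps prefixes to next hops:

```python
# Hypothetical flow-level routing table: each machine-data flow is keyed
# by (source, stream) and mapped to one or more downstream targets.
routing_table = {
    ("web-01", "nginx.access"): ["siem", "archive"],
    ("web-01", "nginx.error"):  ["siem"],
    ("db-01",  "postgres.log"): ["archive"],
}

def route(source, stream, event, targets):
    """Deliver an event to every target listed for its flow."""
    for name in routing_table.get((source, stream), []):
        targets[name].append(event)

# One event fans out to every target registered for its flow.
targets = {"siem": [], "archive": []}
route("web-01", "nginx.access", {"status": 200}, targets)
```

Treating flows rather than individual events as the unit of routing is what lets a pipeline re-point, fan out, or throttle a whole stream with one table change.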
LOGIQ.AI co-founder Tito George says, "LogFlow filters unwanted data and detects security events in-flight. Users can route streams, control EPS [events per second] and run fine-grained data replays.
"InstaStore's indexing and columnar data layouts enable faster querying, unlike compressed archive formats such as gzip."
Open-source tools such as Fluent Bit and Logstash can already route data between various sources and target systems and allow routing raw archives to object stores.
The complex problems to solve are: controlling data volume and sprawl, preventing data loss, ensuring data reusability with fine-grained control, and ensuring business continuity during upstream failures.
LOGIQ.AI head of sales Americas Theodore Caroll says, "There's no technical reason to accept anything less than 100% data availability.
"Your data is your only true fortress in responding to threats and adverse business events. Businesses need a system like LogFlow that ensures full data replay is continuously and infinitely available."
LogFlow's built-in ‘Rule Packs’ have more than 2000 rules that filter, tag, extract and rewrite data for popular customer environments and workloads. They also allow security event detection and tagging.
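The four rule actions named here (filter, tag, extract, rewrite) can be illustrated with a minimal rule chain. The rule format below is invented for the sketch and is not LogFlow's Rule Pack syntax:

```python
import re

# Illustrative rules covering the four actions described: filter, tag,
# extract and rewrite. The structure and field names are assumptions.
RULES = [
    {"type": "filter",  "drop_if": re.compile(r"healthcheck")},
    {"type": "tag",     "match": re.compile(r"Failed password"),
                        "tag": "security:auth-failure"},
    {"type": "extract", "pattern": re.compile(r"user=(\w+)"), "field": "user"},
    {"type": "rewrite", "pattern": re.compile(r"\b\d{16}\b"),
                        "replacement": "<REDACTED>"},
]

def apply_rules(line):
    """Run a log line through the rule chain; return None if dropped."""
    record = {"message": line, "tags": [], "fields": {}}
    for rule in RULES:
        if rule["type"] == "filter" and rule["drop_if"].search(line):
            return None  # unwanted data is filtered out in-flight
        if rule["type"] == "tag" and rule["match"].search(record["message"]):
            record["tags"].append(rule["tag"])  # mark security events
        if rule["type"] == "extract":
            m = rule["pattern"].search(record["message"])
            if m:
                record["fields"][rule["field"]] = m.group(1)
        if rule["type"] == "rewrite":
            record["message"] = rule["pattern"].sub(
                rule["replacement"], record["message"])  # e.g. mask card numbers
    return record
```

Running noisy health checks through such a chain drops them entirely, while an SSH failure line comes out tagged, with its username extracted and any embedded card number masked.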
Overall, LOGIQ.AI's LogFlow brings complete control over observability data pipelines and delivers high-value, high-quality data to the teams that need it in real time, all the time, the company states. Organisations can fully control data collection, consolidation, retention, manipulation and upstream data flow.