
Logiq.ai Tackles Observability Problem With LogFlow

LogFlow addresses data risks associated with machine data pipelines.

Dark Reading Staff, Dark Reading

December 3, 2021


Data management and analytics startup Logiq.ai has launched LogFlow, an "observability data pipeline as a service," aimed at security teams.

A data pipeline refers to all the steps involved in moving data from on-premises systems to the cloud in an automated, reliable way: copying data, moving it across systems, and reformatting it or joining it with other data sources. However, the complexity of the technologies used to collect, store, aggregate, and visualize data has made it difficult to maintain data quality and integrity. Data downtime, the periods when data is partial, erroneous, missing, or otherwise inaccurate, is a major problem for organizations as data systems grow more complex. Observability refers to the monitoring, tracking, and triaging of incidents to prevent such downtime.
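To make that concrete, here is a minimal, hypothetical Python sketch of a single pipeline stage: it reformats raw log records and flags any with missing fields, the kind of gap that constitutes data downtime. The field names and record format are assumptions for illustration, not any vendor's schema.

```python
import json
from datetime import datetime, timezone

# Fields every record must carry; a hypothetical schema for illustration.
REQUIRED_FIELDS = {"timestamp", "host", "event"}

def reformat(raw_line: str) -> dict:
    """Parse a raw JSON log line and normalize its timestamp to UTC ISO-8601."""
    record = json.loads(raw_line)
    ts = record.get("timestamp")
    if isinstance(ts, (int, float)):
        record["timestamp"] = datetime.fromtimestamp(ts, tz=timezone.utc).isoformat()
    return record

def missing_fields(record: dict) -> list:
    """Return required fields this record lacks; non-empty means data downtime."""
    return sorted(REQUIRED_FIELDS - record.keys())

raw = '{"timestamp": 1638489600, "host": "web-01"}'  # no "event" field
rec = reformat(raw)
gaps = missing_fields(rec)
if gaps:
    print(f"data downtime: record missing {gaps}: {rec}")
```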

Consider a scenario where the security team is reviewing logs and security events, and data that could point to a ransomware attack is missing. Data downtime can impair the ability of IT security teams to detect attacks or identify gaps in controls.

LogFlow is designed to tackle the observability problem, optimize data volume, and improve data quality, Logiq says. Its log management and SIEM capabilities allow security event detection and tagging, and native support for open standards makes it easy to collect machine data from any source, the company says. With over 2,000 rules as part of built-in "Rule Packs," security teams can filter, tag, extract, and rewrite data for popular customer environments and workloads. LogFlow can also continue ingesting data and providing crucial forensics even when upstream systems are unavailable, the company says.
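Logiq.ai has not published the format of its Rule Packs, so the following is only a rough Python sketch of the general idea behind a filter-and-tag rule applied to incoming log lines; every name and pattern here is hypothetical.

```python
import re

# Hypothetical rules, not LogFlow's actual Rule Pack format: each rule
# either drops a matching line (filter) or tags it and extracts fields.
RULES = [
    {"pattern": re.compile(r"Failed password for (\S+)"), "tag": "auth.failure", "drop": False},
    {"pattern": re.compile(r"\bDEBUG\b"), "tag": None, "drop": True},  # cut noisy volume
]

def apply_rules(line: str):
    """Return a tagged record, or None if a rule filters the line out."""
    for rule in RULES:
        match = rule["pattern"].search(line)
        if match:
            if rule["drop"]:
                return None
            return {"raw": line, "tag": rule["tag"], "extracted": match.groups()}
    return {"raw": line, "tag": "untagged", "extracted": ()}

print(apply_rules("sshd[991]: Failed password for root from 10.0.0.5"))
print(apply_rules("app: DEBUG cache warmed"))
```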

A number of open source tools, such as Fluent Bit and Logstash, can route data between various sources and target systems, Logiq.ai says. Beyond routing, enterprises can use LogFlow to deal with other data pipeline challenges, such as controlling data volume and sprawl, preventing data loss, ensuring data reusability with fine-grained control, and ensuring business continuity during upstream failures, the company says.
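The routing such tools perform is usually declared in a config file; purely as an illustration of the concept, and not either tool's actual syntax, a toy tag-based router might look like this in Python:

```python
# Toy router illustrating the tag-matching idea behind tools like
# Fluent Bit and Logstash; the sinks and tags here are hypothetical.
def to_siem(record):
    print(f"-> SIEM: {record['raw']}")

def to_archive(record):
    print(f"-> cold storage: {record['raw']}")

ROUTES = [
    ("auth.", to_siem),   # security events go to the SIEM
    ("", to_archive),     # catch-all preserves data for later forensics
]

def route(record):
    for prefix, sink in ROUTES:
        if record["tag"].startswith(prefix):
            sink(record)
            return

route({"tag": "auth.failure", "raw": "Failed password for root"})
route({"tag": "app.info", "raw": "cache warmed"})
```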

Read more here.

