Web review

The Problem with Collecting, Processing, and Analyzing More Security Data

Author: Jon Oltsik

Source: CSO Online


Security teams collect a heck of a lot of data today. ESG research indicates that 38% of organizations collect, process, and analyze more than 10 terabytes of data as part of security operations each month. What types of data? The research indicates that the biggest data sources include firewall logs, log data from other types of security devices, log data from networking devices, data generated by AV tools, user activity logs, application logs, etc.

It’s also worth mentioning that the amount of security data collected continues to grow on an annual basis. In fact, 28% of organizations say they collect, process, and analyze substantially more data today than two years ago, while another 49% of organizations collect, process, and analyze somewhat more data today than two years ago.

Review of this article

Need for standards

The need for standards in cybersecurity is now critical. Maintaining multiple standards is expensive for a company and has many side effects. If two standards cannot communicate with each other, an attacker can slip between them and go undetected by the two independent standards, where a single standard would have caught the attack.
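
To make the point concrete, here is a minimal sketch (in Python, with invented log formats and field names) of what a shared standard buys you: once a firewall line and an application log speak the same schema, a correlation that neither source reveals alone becomes trivial.

```python
# A minimal sketch with invented formats: two sources normalized into one
# common schema so that cross-source correlation becomes possible.
import json

def from_firewall_csv(line: str) -> dict:
    ts, host, event = line.split(",")
    return {"time": ts, "host": host, "event": event}   # common schema

def from_app_json(line: str) -> dict:
    rec = json.loads(line)
    return {"time": rec["t"], "host": rec["ip"], "event": rec["msg"]}

events = [
    from_firewall_csv("10:00,10.0.0.5,port_scan"),
    from_app_json('{"t": "10:02", "ip": "10.0.0.5", "msg": "admin_login"}'),
]

# Each source alone looks benign; together they show one host scanning and
# then logging in as admin -- detectable only across the shared schema.
by_host: dict[str, list[str]] = {}
for e in events:
    by_host.setdefault(e["host"], []).append(e["event"])
print({h: evs for h, evs in by_host.items() if len(evs) > 1})
```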

Data transfer should be standardized. Every ingested record should carry the same level of reliability, and the communications should be protected against tampering. MyCrypNet is a good tool for that.
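
As a sketch of tamper protection on the transfer itself, one conventional approach is an HMAC over each record; the shared key and record layout below are assumptions for illustration, not MyCrypNet's actual protocol.

```python
# A minimal sketch of tamper-evident log transfer using an HMAC.
import hmac, hashlib, json

KEY = b"shared-secret"  # assumption: sender and collector share this key

def sign(record: dict) -> dict:
    payload = json.dumps(record, sort_keys=True).encode()
    return {"record": record,
            "mac": hmac.new(KEY, payload, hashlib.sha256).hexdigest()}

def verify(message: dict) -> bool:
    payload = json.dumps(message["record"], sort_keys=True).encode()
    expected = hmac.new(KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["mac"])

msg = sign({"source": "fw-1", "event": "deny", "port": 22})
assert verify(msg)                 # intact record passes
msg["record"]["port"] = 23
print(verify(msg))                 # tampered record fails -> False
```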

Data access and authentication should also be standardized. Today, managing permissions in a complex system is very hard and pushes people into bad habits. Data should be easy to access, in a central and secure way, for exactly the people who have permission to do so.
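
A minimal sketch of what "central and secure" access could look like: one authoritative role table consulted by every system, instead of per-system permission sprawl. The roles and dataset names are invented for the example.

```python
# A minimal sketch of centralized, role-based access to security data.
ROLES = {"analyst": {"firewall_logs", "app_logs"},
         "auditor": {"app_logs"}}

def can_read(role: str, dataset: str) -> bool:
    """Single authoritative check instead of per-system permission sprawl."""
    return dataset in ROLES.get(role, set())

print(can_read("analyst", "firewall_logs"))  # True
print(can_read("auditor", "firewall_logs"))  # False
```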

Centralization/Decentralization

Centralizing analytics is useful for detecting cascading threats that span multiple systems which may look independent. But it is slow, as all the data is handled together by a single process (human or AI).

Decentralized analytics performs far better, since each parallel process (human or AI) has less data to analyze. But those processes lack a global view and can miss threats spanning multiple systems.

A good strategy is to combine both centralization and decentralization: decentralization works well for day-to-day operations (involving AI), while centralization is better for decision making (by humans). The sketch below shows the hybrid shape.
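
In this hybrid, decentralized analyzers each scan only their own system's events in parallel, and a central correlator looks across their compact alerts for the cascades no single system can see. Log contents and the "suspicious" rule are invented for the example.

```python
# A minimal sketch of decentralized analysis feeding central correlation.
from concurrent.futures import ThreadPoolExecutor

SYSTEM_LOGS = {
    "web": ["login ok", "login fail", "login fail"],
    "db":  ["query ok", "bulk export"],
}
SUSPICIOUS = {"login fail", "bulk export"}

def local_analyzer(system: str, logs: list[str]) -> list[tuple[str, str]]:
    """Decentralized pass: each system inspects only its own events."""
    return [(system, e) for e in logs if e in SUSPICIOUS]

def central_correlator(alerts: list[tuple[str, str]]) -> bool:
    """Centralized pass: a cascade = suspicious events on several systems."""
    return len({system for system, _ in alerts}) > 1

with ThreadPoolExecutor() as pool:
    results = pool.map(lambda kv: local_analyzer(*kv), SYSTEM_LOGS.items())
alerts = [a for batch in results for a in batch]
print("cascading threat:", central_correlator(alerts))
```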

Automation and AI

Automation is an absolute necessity with this amount of data. A system should now be zero-touch deployed to production, without any human interaction.
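
One common shape for zero-touch deployment is a convergence loop over declared state: the desired versions live in data, and the system reconciles itself with no human in the loop. The function bodies below are illustrative stand-ins, not a real provisioning API.

```python
# A minimal sketch of a zero-touch rollout as a convergence loop.
DESIRED = {"collector": "2.1", "analyzer": "1.4"}
RUNNING = {"collector": "2.0", "analyzer": "1.4"}

def deploy(service: str, version: str) -> None:
    print(f"deploying {service} {version}")   # stand-in for real provisioning
    RUNNING[service] = version

def converge() -> None:
    """Idempotent: only acts where actual state differs from desired state."""
    for service, version in DESIRED.items():
        if RUNNING.get(service) != version:
            deploy(service, version)

converge()
assert RUNNING == DESIRED
```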

To analyze data from these deployed systems, AI helps. But we should be very careful, as the AI market is full of scams: some companies sell "AI" that is just a pile of hand-written rules trying to imitate it. If no neural network is involved, be very suspicious, since modern AI mainly rests on that foundation.
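
To illustrate the distinction, here is a toy contrast between a hand-written rule dressed up as "AI" and an actual (tiny) neural network learned from data; the features and labels are invented for the example.

```python
# Toy contrast: a hard-coded rule versus a small learned neural network.
import numpy as np
from sklearn.neural_network import MLPClassifier

def rule_based_detector(length: int) -> bool:
    """Hand-written 'AI': just a threshold, however it is marketed."""
    return length > 1000

# Invented training data: [request length, special-char count] -> malicious?
X = np.array([[120, 2], [90, 1], [1500, 40], [200, 3], [1800, 55], [80, 0]])
y = np.array([0, 0, 1, 0, 1, 0])

# A genuine (if tiny) neural network: the decision boundary is learned.
model = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                      random_state=0).fit(X, y)
print(rule_based_detector(1600), model.predict([[1600, 48]]))
```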

Automation and AI are needed to process this huge amount of data without the human factor and without bottlenecks. Humans should keep the noble tasks: making decisions, imagining and developing new tools, being ingenious. Day-to-day operations should run without human intervention.