© 2022 CyberNews - Latest tech news,
product reviews, and analyses.


Ariel Assaraf, Coralogix: “drawing information from a variety of logs can be instrumental in identifying risks early”


Constant development in the cybersecurity industry also means increasing levels of complexity, with handling and analyzing data in a secure way growing more difficult by the day.

Systems for monitoring diverse data entries and all types of incoming and outgoing traffic may have evolved alongside industries’ needs. But the amount of data in circulation only increases, making it all too convenient for bad actors to hide in the folds of information that escape the gaze of monitoring systems.

To discuss the challenges that businesses face when ensuring smooth and secure workflows in the cloud, Cybernews reached out to Ariel Assaraf, the CEO of Coralogix, a security information and event management service provider.

What has the journey been like since your launch in 2015?

The Coralogix story has been interesting from the very beginning. We quickly learned the market was looking for something genuinely new to assist in dealing with the rising challenges of modern software development and delivery.

We saw that the side effects of accessing storage for analytics – the latency it creates, the dependency on schema, and the cost it imposes on our customers – were becoming unbearable. After a lot of trial and error, we decided to rebuild the platform from the ground up back in 2017 – and we’ve grown tremendously since then, almost 100-fold.

At that point, we began thinking about how we could give our customers the real-time analytics and insights they need and the ability to track trends over time without losing any performance or coverage. That brings us up to the launch of Streama at the end of 2020, which is the foundational technology that removes the dependency on external storage, essentially decoupling all the storage schema issues and latency from the analytics.

Now we’re working to continually expand the capabilities and value that this in-stream data pipeline can give our customers.

Can you introduce us to your Coralogix platform? What are the main issues it helps solve?

There's a perception in the world of storage, and now in analytics too, that pricing works in tiers. If you want a great experience and insights, you pay a lot for a high tier. If you're willing to accept a less-than-great experience, you can spend less and put it in a lower tier.

Streama, our in-stream data analytics pipeline, analyzes data before indexing it. This gives users the ability to scale and still benefit from the power and granularity of stateful correlation for all of their data.

The advantages are enormous. It reduces cost, increases coverage, and solves performance issues. The analysis is done in real-time, with no delays; insights can be spotted the moment they occur so teams no longer have to choose which queries to run and the cost is much lower.
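The idea of analyzing data before indexing it can be sketched in a few lines. This is a generic illustration of in-stream analysis, not Coralogix's Streama implementation; the log format and alert threshold are assumptions:

```python
from collections import Counter

def analyze_in_stream(log_lines, alert_threshold=3):
    """Toy in-stream analysis: inspect each record as it arrives,
    keeping only a small rolling state instead of indexing everything."""
    error_counts = Counter()           # minimal state held in the stream
    alerts = []
    for line in log_lines:
        if "ERROR" in line:
            service = line.split()[0]  # assume "service level message" format
            error_counts[service] += 1
            if error_counts[service] == alert_threshold:
                alerts.append(f"{service}: {alert_threshold} errors seen in-stream")
    return alerts

logs = [
    "auth ERROR bad token",
    "auth ERROR bad token",
    "billing INFO invoice sent",
    "auth ERROR bad token",
]
print(analyze_in_stream(logs))  # ['auth: 3 errors seen in-stream']
```

Because the alert fires the moment the threshold is crossed, no store-then-query round trip is involved.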

You state that Coralogix is streaming the future of data. Would you like to share more about your vision?

What we mean by streaming the future of data is that we’re using this new stream processing approach for observability. Basically, we create a snapshot of the state of data, which we call the ‘state store,’ for the entire history and hold the system's state within the stream itself.

The most apparent benefit is that, for most of our data, we don't need more than those insights. For frequent searches and immediate results during troubleshooting, only a recent window of the data is needed.

At this point, we allow our customers to choose the use case they have and route the data accordingly. Our customers aren’t paying a flat rate for all of their data; they’re paying based on the data’s value.

So we know the importance of the data in real-time, and we allow you to route it based on your use case and pay a different amount per use case.
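As a rough illustration of use-case-based routing (the route names, tiers, and record fields below are hypothetical, not Coralogix's actual configuration):

```python
# Hypothetical routing rules: each declared use case maps to a destination
# with a different cost profile.
ROUTES = {
    "frequent_search": "hot_index",   # full indexing, highest cost
    "monitoring":      "in_stream",   # analyzed in the stream, no index
    "compliance":      "archive",     # cheap object storage
}

def route(record):
    """Send each record to a destination based on its declared use case."""
    return ROUTES.get(record.get("use_case"), "archive")  # default to cheapest

records = [
    {"msg": "login failed", "use_case": "monitoring"},
    {"msg": "audit trail",  "use_case": "compliance"},
]
print([route(r) for r in records])  # ['in_stream', 'archive']
```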

This is what we see as the future of data.

It seems like the pandemic put the global cybersecurity industry to the test. What would you consider the main takeaways?

The pandemic has led to the cybersecurity industry going into overdrive. A few of the reasons include a significantly higher online presence with much broader and deeper adoption of online services, accelerated adoption of cloud services (both infrastructure and services) from many providers, and a shift to hybrid/remote working.

These factors have significantly increased the complexity and the attack surface of environments that need to be monitored and secured. This has led to substantially higher cybersecurity risks from individuals, professional groups, and state actors, as we have seen from recent events. Unfortunately, we expect things to become even more challenging over the next few years, with the threat environment, vectors, and techniques evolving hourly.

Why do you think certain organizations are not even aware of the risks they are exposed to?

In recent years, systems have become so much more complex, and the complexity of cyber risks has increased alongside. We see four main challenges for organizations to protect against modern security threats.

First, attacks are far more sophisticated and no longer as transparent as they once were. Traditionally, monitoring our systems meant writing use cases in SIEM platforms, but how can you write a use case for an attack pattern you have never seen?

Second, we’re dealing with a substantial amount of tool sprawl and growing complexity in our systems. As companies move towards cloud-native approaches, they lose a lot of clarity regarding the connections between services they’re running. Many more endpoints need to be secured, and as tools are added, this needs to be continually monitored and addressed.

Third, many companies have expanded their suite of security tools or adopted multi-point security solutions. We’re talking about organizations running multiple security partners, consoles and dashboards, and alert types and functions. The overhead of managing these sprawling security solutions is high. And that brings us to the final and perhaps most painful challenge in the industry today.

There are simply not enough cybersecurity experts in the market. Companies face a massive shortage of manpower available to bring in-house to handle the complexity of today’s systems.

In your opinion, what are the biggest mistakes companies make when it comes to handling vast amounts of data?

From a cybersecurity perspective, we see companies make largely three key mistakes when dealing with vast amounts of data.

First, they ingest and analyze only a subset of the data, since collecting it comprehensively has typically been expensive. Second, they analyze this data reactively, because it must first be stored and indexed. Finally, they create multiple data silos based on data sources, tools, and the teams using the data, which inhibits correlation and prevents them from leveraging the data's full value.

Besides data observability solutions, what other security best practices do you think are a must for organizations nowadays?

With the exponential rise in cybercrimes in the last decade, cybersecurity for businesses is no longer an option — it’s a necessity. Your business needs to adopt a multi-layer defense system, and here are some best practices and power tools you should focus on.

First, have access protection in place. Designed to monitor outgoing and incoming network traffic, firewalls are the first layer of defense from unauthorized access to private networks. They are easy to implement, adopt, and configure based on security parameters set by the organization.

The second is addressing unsecured endpoints: devices ranging from laptops, mobile phones, and USB drives to printers and servers that connect to a company’s private network beyond the corporate firewall. Next-generation antivirus, VPNs, and, if your organization has a large remote workforce, Zero Trust Network Access (ZTNA) are essential here.

Third, log management is a fundamental security control, as it helps monitor and correlate security events across systems. Drawing information from a variety of logs (application, network, security, etc.) can be instrumental in identifying risks early, stopping bad actors, and quickly addressing vulnerabilities during breach response or event reconstruction.
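Correlating events across log sources can be as simple as joining on a shared key within a time window. A minimal sketch, with assumed field names and example events:

```python
from datetime import datetime, timedelta

# Illustrative events; the field names are assumptions for this sketch.
network_log  = [{"ip": "10.0.0.5", "time": datetime(2022, 5, 1, 12, 0), "event": "port_scan"}]
security_log = [{"ip": "10.0.0.5", "time": datetime(2022, 5, 1, 12, 3), "event": "failed_login"}]

def correlate(a, b, window=timedelta(minutes=5)):
    """Flag IPs that appear in two different logs within a short time window."""
    risks = []
    for ea in a:
        for eb in b:
            if ea["ip"] == eb["ip"] and abs(ea["time"] - eb["time"]) <= window:
                risks.append((ea["ip"], ea["event"], eb["event"]))
    return risks

print(correlate(network_log, security_log))
# [('10.0.0.5', 'port_scan', 'failed_login')]
```

Neither event is alarming on its own; together, within minutes, they suggest an attack in progress, which is the value of correlating across logs.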

Fourth, a top priority should be robust filtering systems to identify spam and phishing emails, embedded code, and fraudulent websites. Email gateways act as a firewall for all email communications by scanning and auto-archiving malicious content. They also protect against business data loss by monitoring outgoing emails and allowing centralized policy enforcement.

Finally, a company’s security measures are only as strong as the awareness of the employees who use them. In 2021, over 85% of data breaches were associated with some level of human error. You therefore need to build a strong culture of awareness around security threats like phishing and social engineering. All cybersecurity resources should be simplified, made mandatory, and regularly updated.

Talking about individual users, what personal security tools do you see trending in the next few years?

Rather than a tools-based discussion, I would use this opportunity to focus on the basics of cybersecurity that individual users must be aware of.

The first is that cybersecurity is a shared responsibility, so each of us is personally responsible, in large measure, for our own security. That means keeping abreast of best practices, some of which we will cover here. Let us start with enabling automatic software updates for every application, from the operating system to browser extensions. Updates not only add new features but also let software companies fix bugs and patch critical vulnerabilities that hackers exploit in their attacks.

Next, avoid sharing and clicking unknown links. Clicking through without checking a website’s legitimacy, or downloading an attachment without knowing the sender, may lead you to install malicious software. It is also recommended that you install a browser plugin that blocks the automatic downloading of harmful files.

In the cybersecurity world, sharing is not caring, so do not share your credentials with anyone. Enable multi-factor authentication, and if you use one based on a PIN code, do not share that either. Avoid easy-to-guess passwords: use long passwords with numbers and symbols, and never reuse the same password across multiple logins. Be aware of the social engineering methods used to steal credentials. Always install your apps from a legitimate software store, and if something seems off, double-check it. And finally, learn how to use free Wi-Fi safely.

Share with us, what’s next for Coralogix?

Coralogix started with log data analyzed in-stream and has evolved to also include metrics, tracing, and security data.

We plan to continue evolving Streama, our stateful streaming engine, and adding more apps built on top of it. Snowbit, our security product, is the first of many.

Down the road, we intend to open the Streama API for our customers to build their own streaming apps and even for new startups to build their offerings based on in-stream analysis for better performance, cost, and scale to their clients.


