ESG examined how Devo can help organizations overcome the multiple challenges of data silos, the skills gap, and the high cost of analyzing machine data for IT and security operations use cases.
Historically, IT and security teams have been tasked with managing, extracting, and delivering insights from operational data—the data generated by disparate machines, infrastructures, data centers, applications, end-users, and devices. This often-critical data provides insight into the health and performance of production applications and enables teams to detect and respond to security threats. Because business units silo different types of data and often adopt different collection and analysis tools depending on their goals, skill sets, data types, or required speed to insight, varying levels of fragmentation have appeared across organizations.
As companies attempt to integrate and analyze data for improved operational and security insights, growing levels of data management and integration complexity are not uncommon. According to ESG research, 66% of IT professionals said their organization’s IT environment is more or significantly more complex than it was two years ago, with 30% of organizations believing higher data volumes are responsible for the added IT complexity (see Figure 1).1
But IT complexity is not only caused by the need for more capacity to handle growing data volumes—it is also about the absence of an effective way to unite operational data. Organizations are prioritizing data silo consolidation, but progress has been slow due to the lack of commercial offerings that fit the bill. Organizations have been forced to spend most of their time on tooling and custom coding, as opposed to gaining value from their data. Based on ESG research, organizations are experiencing a problematic shortage of skilled personnel in essential areas across the entire data pipeline, spanning security, administration, management, architecture, data science, and governance.2 These shortages are especially concerning since modern organizations rely on various experts to operate at maximum effectiveness. These roles are becoming critical in nearly every enterprise, but organizations are either missing essential roles or asking existing personnel to fill gaps outside their expertise. This approach creates an environment riddled with roadblocks, friction, and delays on the path to operational insights.
The cost implications of a fragmented approach to operational analytics can also be quite extensive. The capital costs of multiple tools and supporting infrastructure silos, plus the operational costs of the time required to manage and integrate data, add up quickly. There is also an opportunity cost: without an efficient way to marry the various operational data sets across an organization with business context, businesses face delayed decisions due to untimely insight, or suboptimal decisions based on inaccurate or incomplete data.
What is needed to effectively address the challenges of deriving value from ever-growing real-time and historical operational data while tying in business data to yield the most accurate insights? IT decision makers should keep performance, scalability, accessibility, security, and cost in mind when selecting a data platform.
The Devo Data Operations Platform
The Devo Data Operations Platform is a cloud-native, full stack, multi-tenant distributed data analytics platform. Devo is designed to collect, integrate, and analyze operational and business data at petabyte scale across an organization. Devo is architected to empower globally distributed organizations to gain accurate and complete insights quickly across all machine data silos to ensure that they can rapidly respond to the real-time needs of the business. It is available as a SaaS offering, an on-premises solution, or a hybrid combination of the two.
Figure 2 illustrates the flexibility of the Devo platform, showing an example of supported data sources and the methods for securely sending data to Devo. It also illustrates the extensibility of Devo. In addition to its native UI, any tool can leverage the Devo REST APIs to both gain access to the data managed by Devo and the insights it is designed to deliver.
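To make the extensibility point concrete, the following is a minimal sketch of how an external tool might construct an authenticated query request against a REST endpoint. The endpoint URL, header names, payload fields, and query syntax here are illustrative assumptions for demonstration purposes, not Devo's documented API.

```python
import json
import urllib.request

# Hypothetical endpoint; a real integration would use the vendor's
# documented API URL and authentication scheme.
API_URL = "https://api.example-devo-instance.com/search/query"

def build_query_request(token: str, query: str, window: str = "-1h") -> urllib.request.Request:
    """Build (but do not send) an authenticated POST carrying a query payload."""
    payload = json.dumps({"query": query, "from": window, "mode": {"type": "json"}})
    return urllib.request.Request(
        API_URL,
        data=payload.encode("utf-8"),
        headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
        method="POST",
    )

req = build_query_request("demo-token", "from firewall.all.traffic select srcIp, dstIp")
```

The design point is that any tool able to issue an HTTP request can consume both the managed data and the derived insights, rather than being locked into the native UI.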
- Collect—Devo can ingest any type of machine-generated data and integrates with a number of transport methods. Once data is ingested, Devo classifies it without transforming or modifying it, making it available for analysis within milliseconds at predictable scale. Devo is designed to scale data ingest linearly per core: it can process and ingest up to 150,000 events per second (EPS) per core and scale on demand to meet performance requirements.
- Store—Devo has an optimized file structure based on time and data source, eliminating the need to maintain traditional indexes. All data is stored securely in its raw format, is always hot, and is compressed; Devo asserts that the platform achieves a 10:1 compression ratio. Devo also leverages proprietary micro-indexing technology: space-efficient, distributed indexes created asynchronously, after ingestion. Devo indicates that this approach reduces personnel and/or infrastructure requirements by at least 80%.
- Analyze—Users can look at both real-time and historical data. Through Devo’s intelligent query engine, organizations benefit from automation that recognizes whether answering a query requires raw or aggregated machine data. Through native machine learning (ML) capabilities for anomaly detection, machine-aided analysis of data enables organizations to spend less time on tooling and more time on deriving insights. The Devo platform can analyze up to one million events per second per core, giving it the ability to quickly deliver predictable insights, while simultaneously ingesting data.
- Visualize—Through an intuitive dashboard that is rich with customizable widgets and drag-and-drop functionality, everyone can derive operational insights. Since the solution does not require a custom query language, business users can be as empowered as power users, who can still leverage an industry-standard query language.
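The vendor-stated figures above lend themselves to back-of-envelope capacity planning. The sketch below works through that arithmetic under the assumption that the stated rates (150,000 EPS ingest per core, roughly 10:1 compression) hold linearly; the numbers plugged in are illustrative.

```python
import math

INGEST_EPS_PER_CORE = 150_000   # vendor-stated ingest rate per core
COMPRESSION_RATIO = 10          # vendor-asserted ~10:1 compression

def cores_for_ingest(target_eps: int) -> int:
    """Cores needed if ingest scales linearly per core, as stated."""
    return math.ceil(target_eps / INGEST_EPS_PER_CORE)

def storage_tb(daily_raw_tb: float, retention_days: int) -> float:
    """Compressed storage footprint across the full retention window."""
    return daily_raw_tb * retention_days / COMPRESSION_RATIO

print(cores_for_ingest(3_500_000))  # cores to sustain 3.5M EPS -> 24
print(storage_tb(14, 400))          # 14 TB/day over 400 days -> 560.0 TB
```

Sizing estimates like these are why linear per-core scaling and a predictable compression ratio matter: they let capacity be planned arithmetically rather than by trial and error.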
Devo Activeboards enable users to visualize data any way they prefer with drag-and-drop functionality, and a library of widget types is available for use—each configured simply by dragging a query onto the widget, which then displays the data instantly. Widgets can also pass values to one another, enabling truly dynamic views into data.
Next, we looked at Devo’s data search capability. We clicked on Finder at the top and selected firewall. Devo enables organizations to union data from across different vendors and device types into a single view. We were able to aggregate and visualize any of the common fields that the firewalls shared and view traffic data from all the different firewall source types being represented and streaming in real time.
It’s interesting to note that Devo parses data at query time, extracting the individual fields and displaying them in a virtual form, in real time. This capability enables Devo to adapt to changing data formats without needing to re-ingest or re-index data. Next, we performed data enrichment on the firewall data, creating a new column called threat based on the destination IP. Leveraging third-party threat intelligence feeds to enrich the data, we were able to determine whether any of these destination IP addresses were known to be malicious.
Next, we set a threshold so that results displayed only when a source IP touched an external malicious website at least 150 times.
Based on this query definition, we clicked the new alert button to create an alert that will automatically run these correlations and aggregations in the background continuously and—based on the criteria—alert on the streaming data in real time.
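The enrichment-plus-threshold logic described above can be sketched in plain code. This is not Devo's query language or engine; it is an illustrative stand-in showing the shape of the computation, with made-up IPs, field names, and threat list.

```python
from collections import Counter

# Hypothetical threat-intel set and threshold, mirroring the walkthrough above.
THREAT_INTEL = {"203.0.113.7", "198.51.100.9"}  # known-malicious destination IPs
THRESHOLD = 150

def malicious_hit_counts(events):
    """Count, per source IP, how many events touched a known-malicious destination."""
    hits = Counter()
    for event in events:
        if event["dst_ip"] in THREAT_INTEL:
            hits[event["src_ip"]] += 1
    return hits

def alert_sources(events, threshold=THRESHOLD):
    """Source IPs whose malicious-contact count crossed the alert threshold."""
    return {ip for ip, n in malicious_hit_counts(events).items() if n >= threshold}

events = (
    [{"src_ip": "10.0.0.5", "dst_ip": "203.0.113.7"}] * 200
    + [{"src_ip": "10.0.0.8", "dst_ip": "203.0.113.7"}] * 20
)
print(alert_sources(events))  # only 10.0.0.5 crosses the 150-hit threshold
```

The difference in the platform is that this aggregation runs continuously against the stream rather than over a static batch, so the alert fires in real time as the count crosses the threshold.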
Next, we clicked the query tree button and were able to view all the steps that went into building the query. This enabled us to traverse up and down the elements of the query, making real-time changes to this complex data set quickly and easily.
This capability enables an organization to visualize and explore their data with just a few clicks. We were able to graph how many times users from different countries and cities accessed a specific URL, for example, then dynamically pivot on any field, zoom in or out on the timeline, and make graphical correlations of data sets from disparate source types, all in real time. Devo logs every query that users craft, making it easy to revisit or share queries among users in the same domain. Figure 7 shows the Devo Alerts dashboard, where users can view alerts historically over time.
The timeline can be zoomed in or out to view detailed alerts and trends, filtering on any criteria. We were able to view all alerts in our Devo account based on the real-time application of correlation rules against data as it streamed into Devo. When alerts are triggered, there are several delivery methods that Devo supports—email, Slack, JIRA, or PagerDuty are a few of the examples. The policies for sending alerts are flexible, enabling different responses based on the frequency and severity of an alert, or any other custom criteria.
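A flexible delivery policy of the kind described can be modeled as a simple routing table. The channel names come from the integrations mentioned above (email, Slack, PagerDuty), but the policy structure and severity levels here are invented for illustration, not Devo's configuration format.

```python
# Hypothetical severity-to-channel routing policy.
ROUTING_POLICY = {
    "low": ["email"],
    "medium": ["email", "slack"],
    "high": ["slack", "pagerduty"],
}

def route_alert(severity: str, default=("email",)):
    """Return the delivery channels for an alert, falling back to a default."""
    return ROUTING_POLICY.get(severity, list(default))

print(route_alert("high"))     # ['slack', 'pagerduty']
print(route_alert("unknown"))  # ['email']
```

A real deployment would key routing on richer criteria (frequency, source, custom fields), but the principle is the same: the response to an alert is data-driven rather than hard-coded.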
Finally, we looked at ML-based timeseries anomaly detection capabilities within Devo. Figure 8 shows a single metric job, CPU utilization for an application server zoomed in to look at only the last hour. The dark blue line is the actual metric and the light blue band shows the upper and lower bounds, computed based on previous historical data. Red and orange points represent detected anomalies.
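The band-based detection shown in Figure 8 can be approximated with a classic rolling-statistics approach: compute expected bounds from recent history and flag points that fall outside them. Devo's actual model is not disclosed in this walkthrough, so the sketch below assumes a simple mean-plus-k-standard-deviations band over synthetic CPU data.

```python
from statistics import mean, stdev

def detect_anomalies(series, window=10, k=3.0):
    """Flag indices whose value falls outside mean +/- k*stdev of the prior window."""
    anomalies = []
    for i in range(window, len(series)):
        history = series[i - window:i]
        mu, sd = mean(history), stdev(history)
        lower, upper = mu - k * sd, mu + k * sd
        if not (lower <= series[i] <= upper):
            anomalies.append(i)
    return anomalies

# Synthetic CPU-utilization series with one spike.
cpu = [50.0] * 20
cpu[15] = 90.0
print(detect_anomalies(cpu))  # [15]
```

Production systems typically use more robust models (seasonal baselines, learned bounds), but the output is the same shape as the figure: an expected band plus a set of flagged points.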
Devo also correlates multiple metrics, so organizations can monitor an entire application stack with an overall health score for that stack. In this case, we looked at a total of nine individual metrics that all correspond to the single-metric view in Figure 8. This capability enables an organization to look at how anomalous each of the metrics is, apply weighting to each, and then derive an overall score for the whole application stack. Based on this, a user can configure an alert for when the overall score passes a threshold, as opposed to having to monitor individual components within the environment. This application is reasonably healthy, without anything too anomalous. The color scale runs from blue for least anomalous through dark blue, yellow, orange, and red as anomaly severity increases.
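The weighted roll-up just described amounts to a weighted average of per-metric anomaly scores. The sketch below illustrates that computation; the nine scores, the weights, and the alert threshold are invented example values, not figures from the tested environment.

```python
def stack_anomaly_score(scores, weights):
    """Weighted average of per-metric anomaly scores (0 = normal, 1 = highly anomalous)."""
    total = sum(weights)
    return sum(s * w for s, w in zip(scores, weights)) / total

# Nine hypothetical metrics for one application stack (CPU, memory, latency, ...),
# with heavier weights on the metrics the operator cares about most.
scores = [0.1, 0.0, 0.2, 0.05, 0.0, 0.1, 0.0, 0.15, 0.0]
weights = [3, 1, 2, 1, 1, 2, 1, 2, 1]

overall = stack_anomaly_score(scores, weights)
ALERT_THRESHOLD = 0.5
print(overall > ALERT_THRESHOLD)  # False: stack is healthy, no alert fires
```

Alerting on the aggregate score is the design win here: one threshold covers the whole stack, instead of nine independently tuned alerts.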
Customer Case Study
ESG audited data obtained from a Devo customer to validate the real-world benefits of Devo. This customer was a top-tier manufacturer/retailer with 300 users, leveraging 140 data sources and a data retention requirement of 400 days. Prior to Devo, the organization was leveraging a cloud SaaS data analytics platform ingesting 6TB per day and processing 350,000 EPS at 100% utilization of its 285-node cluster, which included ten search nodes and 275 indexing nodes. The previous solution was licensed by capacity, and at the last renewal, the team found that it would have to reduce the amount of data collected and cut data retention to 90 days due to licensing costs. In addition, the solution could neither support the required number of concurrent users nor query and alert in real time.
After implementing Devo’s SaaS offering, this customer realized several significant benefits. The business was able to keep data retention at 400 days and use all its data sources, and it is now ingesting 14TB per day with just ten Devo Data Nodes, a dramatic reduction in hardware and operational overhead. In fact, the system is capable of ingesting more than 40TB per day, giving the customer substantial headroom before any upgrade is needed. Devo is ingesting at a rate of 3.5 million EPS at 20% utilization of the ten-node cluster. Query times decreased by up to 98%, and time-to-alert is now measured in milliseconds rather than minutes to hours. All users can query concurrently—in real time.
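The reported before-and-after figures imply a striking per-node improvement, which is easy to verify arithmetically. The sketch below reproduces that calculation from the numbers in the case study; the implied capacity ceiling assumes utilization scales linearly, which is a simplification.

```python
# Figures reported in the case study above.
before_eps, before_nodes = 350_000, 285     # previous platform, 100% utilization
after_eps, after_nodes = 3_500_000, 10      # Devo, 20% utilization

# Per-node throughput improvement factor.
improvement = (after_eps * before_nodes) / (before_eps * after_nodes)
print(improvement)  # 285.0

# Implied ceiling of the ten-node cluster if 20% utilization scales linearly.
ceiling_eps = after_eps / 0.20
print(ceiling_eps)  # 17500000.0
```

In other words, each Devo node is sustaining roughly 285 times the event throughput of a node in the prior deployment, with most of the cluster's capacity still in reserve.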
Why This Matters
Two-thirds of organizations surveyed by ESG said that their IT environment is more or significantly more complex than it was two years ago. As a follow-up, we asked the same respondents what they believe is responsible for the additional complexity, and 30% asserted that it was higher data volumes.3
What is needed to effectively extract value from the ever-growing volumes of real-time and historical operational data while tying in business data to yield the most accurate insights is a platform designed for performance, scalability, accessibility, security, and cost efficiency.
Devo provides a cloud-native, full stack, multi-tenant, distributed data analytics platform that is engineered for the collection, integration, and analysis of operational machine and business data at petabyte scale across an organization.
ESG’s testing revealed that Devo’s integrated approach to operational analytics enables organizations to collect, store, analyze, and visualize massive amounts of data. We were able to monitor device traffic holistically across multiple vendors’ devices and search out risky user behaviors in real time.
Devo enabled rich visualization of the interrelated components of a complex application stack and ensured that we were able to leverage machine learning to effectively monitor that stack. Devo is architected to empower globally distributed organizations to gain accurate and complete insights across all machine and business data silos to ensure that they can rapidly respond to the real-time operational needs of the business.
The Bigger Truth
The complexities associated with finding value in operational data will not get easier. As terabytes become petabytes and real-time insight becomes the difference between success and failure, organizations must act to ensure they remain competitive in the market. They can achieve this by aligning data-driven initiatives with business goals.
Modern, data-driven organizations need a solution that unites disparate silos of operational and business data across an organization with a fast, scalable, easy-to-use, secure, and cost-effective data platform. Before initiating a proof of concept or evaluating vendors, businesses should ask themselves some specific questions: What is the business goal? What use case should we focus on first? What kind of data do we generate, and where does it live across our organization? What level of security is required to minimize risk and maximize accessibility? How could this impact our standing with regulatory compliance? How much does this cost now, and what will it cost five years from now?
ESG testing validated that the Devo Data Operations Platform provides performance, scalability, accessibility, security, and cost efficiency in a full stack, multi-tenant, distributed data analytics platform. Devo’s integrated approach to operational analytics enabled us to collect, store, analyze, and visualize massive amounts of disparate data from multiple sources, monitoring traffic across multiple vendors’ devices and uncovering risky user behaviors in real time. A customer confirmed ESG’s findings, reporting significant performance improvements while also realizing significant cost savings.
Organizations are in search of a comprehensive platform that enables the business to integrate disparate, ever-growing machine data silos, empower all personnel in an organization to contribute to operational excellence through corporate-wide data-driven initiatives, and ensure budgets remain in check from both a capital and an operational cost standpoint. By simplifying the machine data pipeline and making use of a platform architecture that properly aligns to the needs of each stage—collect, store, analyze, and visualize—Devo is enabling organizations to easily satisfy their requirements of gaining insight into their operational data, at predictable speed and scale, in a secure way, at a reasonable cost.
1. Source: ESG Master Survey Results, 2019 Technology Spending Intentions Survey, March 2019.↩
3. Source: ESG Master Survey Results, 2019 Technology Spending Intentions Survey, March 2019.↩