A common problem for large organizations using Azure Sentinel is the handling of data ingestion from applications. We show how our engineers have used Azure Event Hubs for a large environment of a global insurance company to control segregation of data, event filtering and volume control. First, we present an overall view of Event Hubs: what it is, and what it is used for. Second, you will learn from our engineers about the process and functionality of the platform. And lastly, we describe the benefits of using Azure Event Hubs and what value it brings to our clients. We aim to clarify the uses, processes, and benefits of Azure Event Hubs alongside Azure Sentinel. If you don’t know what Azure Sentinel is, you can refer to our previous blog articles presenting the Microsoft SIEM / SOAR platform.
What is Event Hubs and what is it used for?
Azure Event Hubs is a powerful and highly scalable messaging platform offered as a PaaS service in Microsoft Azure. It handles event ingestion and distributed stream processing, giving organizations a fully managed solution to receive, process and store large amounts of data with high throughput, without requiring their own servers.
Event Hubs acts as the “front door” of an event pipeline by sitting between event publishers and event consumers, thereby decoupling the producer of an event stream from its consumers. It connects to data and analytics services inside and outside Azure to support the rest of the data pipeline. Thanks to its scalability and hybrid deployment options, companies can easily respond to volume changes. Built-in monitoring and analytics features help track the efficiency of the service and weed out inaccurate or unsuitable data sets.
Here are a few scenarios where Azure Event Hubs can be used:
- Application logging
- Preparing streamed data for further analysis
- Outlier detection and elimination
- Bridging On-Prem and Cloud log sources
Event Hubs is one of several messaging systems in Azure that provide the key capability of multiple senders and receivers. Unlike other messaging systems, it can handle event streams with very high throughput because it splits them into a set of scalable partitions. Each partition can have one or more consumer groups as recipients of the messages.
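The partitioning idea can be illustrated with a small conceptual sketch (this is not the Azure SDK, and the hash shown is illustrative rather than the exact one Event Hubs uses internally): events carrying the same partition key always land in the same partition, which preserves per-key ordering, while each consumer group independently reads the full stream.

```python
# Conceptual sketch of Event Hubs-style partitioning (illustrative only).
import hashlib

NUM_PARTITIONS = 4  # a real event hub is created with a fixed partition count

def assign_partition(partition_key: str, num_partitions: int = NUM_PARTITIONS) -> int:
    """Stable hash of the partition key -> partition index."""
    digest = hashlib.sha256(partition_key.encode()).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# Distribute a small stream of events across the partitions.
partitions = {i: [] for i in range(NUM_PARTITIONS)}
for event_id, key in enumerate(["app-a", "app-b", "app-a", "app-c"]):
    partitions[assign_partition(key)].append((key, event_id))

# Events for "app-a" stay in one partition, in publish order.
p = assign_partition("app-a")
ordered = [e for e in partitions[p] if e[0] == "app-a"]
```

Because ordering is only guaranteed per partition, choosing a sensible partition key (for example, the source application) is the main design decision a publisher makes.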
Why is Event Hubs useful?
- The high throughput that Event Hubs can handle (on the order of a million events per second) is unparalleled.
- The scalable nature of Event Hubs allows companies to start with small data streams (megabytes per second) and grow to terabytes per second, without a change of platform.
- Data can be ingested from a virtually infinite number of sources simultaneously, allowing for streamlining of multiple log pipelines stemming from multiple applications.
- With Azure Policies you can assign Event Hubs to Azure subscriptions, thereby establishing a standard interface point for events between applications and services.
- Data filtering reduces cost: consumers of data can establish filter rules that reduce the effort and cost of processing data.
- Integration with other services allows for Event Hubs to bridge the gap between Azure and non-Azure platforms.
- Data transformation is available by using any real-time analytics provider or batching/storage adapters.
Example of Event Hubs feeding Azure Sentinel
The following example of Azure Event Hubs shows an actual implementation that we have deployed in a production environment:
Here we have four log sources feeding into the Sentinel platform. Three of them reside in Azure (Azure Diagnostic Settings, the Azure Activity Log and Azure Metrics) while one runs on-premises (Apache Kafka). The latter supplies event data from a logging platform based on the ELK stack (Elasticsearch, Logstash and Kibana).
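The on-premises bridge works because Event Hubs exposes a Kafka-compatible endpoint on port 9093, so an existing Kafka producer can publish to it with a configuration change rather than a code change. The sketch below shows the shape of such a configuration (as it would be passed to a Kafka client such as confluent-kafka); the namespace name and connection string are placeholders you would replace with your own.

```python
# Kafka producer configuration pointing at an Event Hubs Kafka endpoint.
# "<namespace>" and the connection string are placeholders, not real values.
producer_config = {
    "bootstrap.servers": "<namespace>.servicebus.windows.net:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    # With Event Hubs, the SASL username is literally "$ConnectionString"
    # and the password is the namespace connection string.
    "sasl.username": "$ConnectionString",
    "sasl.password": "Endpoint=sb://<namespace>.servicebus.windows.net/;...",
}
# e.g. confluent_kafka.Producer(producer_config).produce("<event-hub-name>", b"log line")
```

The Kafka topic name maps to the event hub name, so existing topic-per-application conventions carry over unchanged.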
Azure Event Hubs aggregates the messages from the different log sources. The Function app formats and filters the logs before posting them to the Log Analytics Workspace that is being used for Azure Sentinel.
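The posting step of such a Function app can be sketched in condensed form: the Log Analytics HTTP Data Collector API authenticates each POST with a SharedKey authorization header built from an HMAC-SHA256 signature over the request metadata. The workspace ID and key below are placeholders, and the surrounding Function bindings are omitted.

```python
# Condensed sketch of posting filtered events to Log Analytics
# via the HTTP Data Collector API. Workspace ID/key are placeholders.
import base64
import hashlib
import hmac
from datetime import datetime, timezone

def build_auth_header(workspace_id: str, shared_key_b64: str,
                      content_length: int, rfc1123_date: str) -> str:
    # String-to-sign format defined by the Data Collector API.
    string_to_sign = (f"POST\n{content_length}\napplication/json\n"
                      f"x-ms-date:{rfc1123_date}\n/api/logs")
    key = base64.b64decode(shared_key_b64)
    signature = base64.b64encode(
        hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    ).decode()
    return f"SharedKey {workspace_id}:{signature}"

body = b'[{"source": "kafka", "level": "ERROR", "msg": "db timeout"}]'
date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
auth = build_auth_header("<workspace-id>",
                         base64.b64encode(b"<workspace-key>").decode(),
                         len(body), date)
# This header, together with Log-Type and x-ms-date headers, accompanies a
# POST to https://<workspace-id>.ods.opinsights.azure.com/api/logs?api-version=2016-04-01
```

The `Log-Type` header chosen here determines the custom table name the events land in within the Log Analytics Workspace, which is how Sentinel queries ultimately find them.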
Summary of benefits
Using Event Hubs provided the following benefits to our client:
- A standardized event service interface between various application and service components
- Bridging hybrid cloud environments, with on-premises systems and multiple cloud providers all integrating into the same platform transparently
- A powerful integration for a high-throughput existing on-premises logging platform (ELK stack) with Azure Sentinel, using readily available Apache Kafka interfaces
- Segregation of responsibilities: application owners ensure that their systems publish their log events to Azure Event Hubs, without having to worry much about data volumes. Consumers can select which data they want to handle, supported by in-stream filtering and data transformation
- Simplification: no need to publish to multiple APIs or consumers; publish data once, for all to consume
- Governance: Azure Policies are used to deploy the Event Hub integration and to define standards for access control
- Monitoring and cost control: Event Hubs provides a detailed view of costs and various configurations for automatic resource scaling
As a consultancy company on the lookout for brand new tools and features to help defend companies from cyberattacks, Arco IT can help you leverage this platform’s features for your own needs. Watch out for our updates with more information from our experts who are using the Azure Event Hubs tool every day.