# Data Pipeline Guidance
Microsoft patterns & practices
This reference implementation is a work in progress. It demonstrates proven practices for the high-scale, high-volume ingestion of data in a typical event processing system.
The project makes heavy use of Microsoft Azure Event Hubs, a cloud-scale telemetry ingestion service. Familiarity with the general concepts underlying Event Hubs is very useful for understanding the source code in this reference implementation.
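For orientation, the following is a minimal sketch of publishing telemetry to Event Hubs using the Python `azure-eventhub` SDK. It is not taken from the reference implementation; the connection string, hub name, and payload shape are placeholders.

```python
from azure.eventhub import EventHubProducerClient, EventData

# Placeholder connection details -- substitute your own namespace and hub.
CONNECTION_STR = "<event-hubs-namespace-connection-string>"
EVENT_HUB_NAME = "<event-hub-name>"

producer = EventHubProducerClient.from_connection_string(
    conn_str=CONNECTION_STR, eventhub_name=EVENT_HUB_NAME
)

with producer:
    # Events are sent in batches to reduce per-message overhead.
    batch = producer.create_batch()
    batch.add(EventData('{"deviceId": "device-001", "temperature": 21.5}'))
    batch.add(EventData('{"deviceId": "device-002", "temperature": 19.8}'))
    producer.send_batch(batch)
```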
## Overview
The two primary concerns of this project are:

- Facilitating cold storage of data for later analytics, that is, translating the chatty stream of events into chunky blobs (see the first sketch after this list).
- Dispatching incoming events to specific handlers, that is, examining an incoming event and passing it along to an appropriate handler function. The emphasis of our dispatcher solution is on speed and overall throughput (see the second sketch after this list).
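The cold-storage idea can be illustrated with a small sketch: buffer individual events as they arrive and flush them to a single blob once a count threshold is reached. This is not the reference implementation's code; it is a Python sketch using the `azure-eventhub` and `azure-storage-blob` SDKs, and the connection strings, container name, and threshold are illustrative assumptions.

```python
import uuid
from azure.eventhub import EventHubConsumerClient
from azure.storage.blob import BlobServiceClient

# Placeholder connection details and tuning values.
EVENT_HUB_CONN_STR = "<event-hubs-connection-string>"
EVENT_HUB_NAME = "<event-hub-name>"
STORAGE_CONN_STR = "<storage-account-connection-string>"
CONTAINER_NAME = "coldstorage"
FLUSH_THRESHOLD = 500  # events per blob; tune for your workload

blob_service = BlobServiceClient.from_connection_string(STORAGE_CONN_STR)
container = blob_service.get_container_client(CONTAINER_NAME)

buffer = []

def flush_buffer():
    """Write the buffered events out as one 'chunky' blob."""
    if not buffer:
        return
    blob_name = f"events-{uuid.uuid4()}.jsonl"
    container.upload_blob(name=blob_name, data="\n".join(buffer))
    buffer.clear()

def on_event(partition_context, event):
    # Accumulate the 'chatty' stream in memory...
    buffer.append(event.body_as_str())
    # ...and flush once enough events have been collected.
    if len(buffer) >= FLUSH_THRESHOLD:
        flush_buffer()

consumer = EventHubConsumerClient.from_connection_string(
    conn_str=EVENT_HUB_CONN_STR,
    consumer_group="$Default",
    eventhub_name=EVENT_HUB_NAME,
)

with consumer:
    consumer.receive(on_event=on_event, starting_position="-1")
```

A production pipeline would also track progress (for example with a checkpoint store) so that already-archived events are not reprocessed after a restart; that concern is omitted here to keep the sketch focused on batching.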
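Similarly, the dispatching concern can be sketched as a routing table that maps an event's type to a handler function. The message shape, event types, and handler names below are hypothetical and not drawn from the project.

```python
import json

# Hypothetical handlers for specific event types.
def handle_engine_telemetry(payload: dict) -> None:
    print("engine telemetry:", payload)

def handle_location_update(payload: dict) -> None:
    print("location update:", payload)

def handle_unknown(payload: dict) -> None:
    print("unrecognized event:", payload)

# The routing table: event type -> handler function.
HANDLERS = {
    "engine-telemetry": handle_engine_telemetry,
    "location-update": handle_location_update,
}

def dispatch(raw_event: str) -> None:
    """Inspect an incoming event and pass it to the appropriate handler."""
    payload = json.loads(raw_event)
    handler = HANDLERS.get(payload.get("type"), handle_unknown)
    handler(payload)

# Example usage with a hypothetical message shape.
dispatch('{"type": "engine-telemetry", "deviceId": "device-001", "rpm": 3200}')
```

Because the emphasis is on speed and throughput, the lookup here is a constant-time dictionary access rather than a chain of conditionals; the actual dispatch mechanism in the reference implementation may differ.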