// Event Hub to Eventstream
The Fabric service Eventstream can read data from an Azure Event Hub to help collect data from IoT devices and other streaming services. But how do you configure the Event Hub and Eventstream to work together?
I’ll try to help with that in this blog post.
Start and configure the Event Hub
First of all we need an Event Hub in Azure. The process is quite easy: choose to create a new service and search for the Event Hubs service, just like in the screenshot below.
Select “Create” and then select “Event hubs” once again.
Now, configure your Event Hubs service as needed.
Remember that the name of the service must be globally unique, as it becomes part of the endpoint used to send data to the Event Hub over a web API.
The Throughput Units configuration is where the money is. This is where you configure the needed and available throughput for the service. To help make that decision, you can read this documentation from Microsoft.
In short one Throughput Unit can handle the following:
- Ingress: Up to 1 MB per second or 1000 events per second (whichever comes first).
- Egress: Up to 2 MB per second or 4096 events per second.
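If you want a rough feel for how those limits translate into a Throughput Unit count, a back-of-the-envelope calculation could look like this. This is my own sketch, not an official Azure formula (and the Auto-Inflate feature can scale TUs for you automatically):

```python
# Rough sketch (my own illustration, not an official Azure formula):
# estimate the number of Throughput Units needed for an expected ingress load.
import math

def estimate_throughput_units(ingress_mb_per_sec: float,
                              ingress_events_per_sec: float) -> int:
    """One TU allows up to 1 MB/s ingress or 1000 events/s, whichever comes first."""
    by_size = math.ceil(ingress_mb_per_sec / 1.0)      # limited by megabytes per second
    by_count = math.ceil(ingress_events_per_sec / 1000.0)  # limited by events per second
    return max(1, by_size, by_count)

# 0.5 MB/s is fine for 1 TU, but 1500 events/s pushes it to 2 TUs.
print(estimate_throughput_units(0.5, 1500))  # 2
```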
For this demo and run-through I will choose 1 Throughput Unit, as I’m only going to demo the service.
The next configuration windows in the service are up to you to decide. They contain networking, TLS and other settings to be altered if needed.
Configure an Event Hub
In order to have an Event Hub running, you need to take one more small step in the configuration.
The service name “Event Hubs” is plural because the namespace is only a container for several hubs. So we need to configure the hub that will send data to Eventstream in Microsoft Fabric.
To the left in the menu there is a section named “Event Hubs” - here you can create a new Event Hub.
Here you type in your preferred name for the hub (this does not need to be globally unique), choose the partition count and select the retention period.
I’ve selected 2 partitions and a 24-hour retention period. Without further discussing partition count and retention period, I’ll leave it this way for this demo.
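As a side note on partitions: events that carry the same partition key always land in the same partition, which preserves ordering per key. The exact hashing is internal to Event Hubs, but the principle can be sketched like this (illustrative only; the function and hash below are my own, not the real algorithm):

```python
# Illustrative sketch only: Event Hubs uses its own internal hashing,
# but the principle is that a partition key deterministically maps
# to one of the configured partitions.
import hashlib

PARTITION_COUNT = 2  # matches the 2 partitions chosen above

def assign_partition(partition_key: str, partition_count: int = PARTITION_COUNT) -> int:
    # Hash the key and fold it into the partition range.
    digest = hashlib.sha256(partition_key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % partition_count

# The same device key always maps to the same partition:
print(assign_partition("device-42") == assign_partition("device-42"))  # True
```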
Copy needed information
From the newly created Event Hub we need some information so the Eventstream in Fabric can read the events coming from the service.
We need the following:
- Under Shared Access Policies we need the policy name
- Also from the Shared Access Policies we need the Primary Key
- We also need to remember the Event Hubs namespace name (the one in plural)
- Lastly also the name of the single event hub you just created
If you can’t see any Shared Access Policies, you can create one with the plus sign at the top of the window.
My information from above screenshot is:
- Policy name: RootManageSharedAccessKey
- Primary key: Tm6NCsXd2BzN7lMfzBF7Laa3LH3FrghUq+AEhA8dw8U=
- Event hubs name: brianbonk
- Event hub name: kustodemo (not in the picture)
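If you ever need these same four values for an SDK or another tool instead of the Eventstream wizard, they combine into the standard Event Hubs connection string. A small sketch (the placeholder key is obviously not a real secret):

```python
def event_hub_connection_string(namespace: str, hub_name: str,
                                policy_name: str, primary_key: str) -> str:
    """Assemble the standard Event Hubs connection string from the four values above."""
    return (
        f"Endpoint=sb://{namespace}.servicebus.windows.net/;"
        f"SharedAccessKeyName={policy_name};"
        f"SharedAccessKey={primary_key};"
        f"EntityPath={hub_name}"
    )

# Using the demo values from this post (with a placeholder key):
print(event_hub_connection_string("brianbonk", "kustodemo",
                                  "RootManageSharedAccessKey", "<primary-key>"))
```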
Configure Eventstream
Now we are ready to go back to Fabric and start configuring our Eventstream.
First of all we need an Eventstream. This can be created from several places in Fabric - either from the bottom-left corner where you can choose Real-Time Analytics, or from the top “New” menu by selecting “Show all”.
I’m using the second option, and here the Eventstream can be found at the bottom along with the other Real-Time Analytics services like the KQL database and the KQL queryset.
Now type in a name for your Eventstream service - this must be without whitespace (for now).
Click OK and wait a few minutes. The Fabric service is now starting an Eventstream service for you and configuring the first things to make it work.
Create a new source
In the new window, create a new source by clicking “New source” and then selecting “Azure Event Hubs”.
Now give the connection a name, in this example I’m calling it “StreamFromEventHub” and in the Cloud Connection field select Create new connection.
In the newly opened window (it might take a while to appear), you now need to create yet another connection name and fill in the information we got from the Event Hub (policy name, Primary Key and the two hub names).
The names are not the same between Event Hub and the Eventstream, so for the ease of the information I’ve mapped the different names below:
| Event Hub name | Eventstream name |
|---|---|
| Namespace | Event Hub Namespace |
| Event hub | Event Hub |
| Policy name | Shared Access Key name |
| Primary key | Shared Access Key |
If you like, you can also select a different Privacy Level for this connection. This privacy level is the same as the one you find in Power BI connections.
Now type in the information from your configuration and select Create.
Now you are guided back to the first window in Fabric, where you can select your configured Event Hub from the dropdown. If you cannot see the newly created Event Hub in the dropdown, just hit the “refresh wheel” to the right of the field.
Also select your consumer group. For me this is $Default, as I have not configured any consumer groups in my Event Hub.
The Data Format option I leave as JSON - if you know your data format and it is different from JSON, just select it from the dropdown. Only Avro, JSON and CSV formats are supported for now.
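To give an idea of what a JSON event could look like on the wire, here is a minimal sample payload. The field names are my own invention, not a schema Eventstream requires - any valid JSON object works:

```python
import json
from datetime import datetime, timezone

# Hypothetical IoT reading - the field names are illustrative only.
event = {
    "deviceId": "sensor-001",
    "temperature": 21.7,
    "timestamp": datetime(2023, 9, 1, 12, 0, tzinfo=timezone.utc).isoformat(),
}

payload = json.dumps(event)   # this string would be the event body sent to the hub
parsed = json.loads(payload)  # Eventstream parses it back into fields
print(parsed["deviceId"], parsed["temperature"])  # sensor-001 21.7
```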
Finish the setup by clicking the Create button and you are ready to test it out.
Test the connection
When testing the connection in Eventstream, you have the possibility to create demo data directly from your Event Hub. This feature is in preview as this blog post is written.
With this option you can now create demo data from your Event Hub to be handled by your Eventstream service in Fabric.
I’m fond of using the Fraud call detection option in the dropdown and selecting 100 under Repeat send. This way I get a lot of data to test the connection.
Remember to select your created Event Hub from the top dropdown.
Hit Send and the Event Hub is sending out data.
Back in the Fabric universe, in the Eventstream you will see data begin to come in from the Event Hub. You can see data examples if you click Data Preview on your source connection.
Final thoughts
From here “the sky is the limit” - you can stream your data to a KQL database, a Lakehouse or to a custom app. All from the same single source of data.
The Eventstream service is also capable of handling a minor subset of data manipulation features, such as aggregations, filters etc. But more on that in another post about the destination features of Eventstream.
☕