I am using a service that outputs to an event hub.
We want to store this output so it can be read once a day by a batch job running on Apache Spark. Basically, we figured we would just dump all the messages into blobs as-is.
What is the easiest way to capture messages from an Event Hub into Blob storage?
Our first thought was a Stream Analytics job, but it requires parsing the raw messages (CSV / JSON / Avro), and our current format is none of those.
Update: We resolved this by changing the message format. Still, I would like to know whether there is any low-friction way to store raw messages in blobs. Did Event Hubs have a solution for this before Stream Analytics arrived?
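For what it's worth, the "just dump the messages into blobs" idea can be sketched without any parsing at all, by treating each message body as opaque bytes and batching them into time-partitioned blobs that a daily Spark job can glob. This is only a sketch: `upload_blob` here is a hypothetical stand-in for whatever Blob storage client you use, and the path scheme and length-prefixed record format are assumptions, not an Azure feature.

```python
from datetime import datetime, timezone

def blob_path(eventhub: str, partition_id: str, ts: datetime) -> str:
    """Time-partitioned blob name so a daily batch job can read one day's folder."""
    return (f"{eventhub}/{partition_id}/"
            f"{ts:%Y/%m/%d}/{ts:%Y%m%d%H%M%S%f}.bin")

class BlobBatcher:
    """Buffers opaque message bodies and flushes each batch as a single blob."""

    def __init__(self, eventhub: str, partition_id: str,
                 upload_blob, max_batch: int = 100):
        self.eventhub = eventhub
        self.partition_id = partition_id
        # Hypothetical callback: (blob_path, payload_bytes) -> None.
        self.upload_blob = upload_blob
        self.max_batch = max_batch
        self.buffer: list[bytes] = []

    def add(self, body: bytes) -> None:
        self.buffer.append(body)
        if len(self.buffer) >= self.max_batch:
            self.flush()

    def flush(self) -> None:
        if not self.buffer:
            return
        path = blob_path(self.eventhub, self.partition_id,
                         datetime.now(timezone.utc))
        # Length-prefix each record so arbitrary binary payloads can be
        # split apart again by the downstream Spark job. No format parsing.
        payload = b"".join(len(b).to_bytes(4, "big") + b
                           for b in self.buffer)
        self.upload_blob(path, payload)
        self.buffer.clear()
```

In a real consumer you would call `batcher.add(event_body)` from the Event Hub receive callback and `batcher.flush()` on shutdown or a timer; the batcher itself never inspects the message contents.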