One of the previous blogs covered some of the concepts behind how Azure Event Hubs supports multiple protocols for data exchange. In this blog, we will see it in action using an example: with the help of a sample app, you will see how to combine a real-time data ingestion component with a Serverless processing layer.
The sample application has the following components:

- A producer: a Go app that uses the Sarama Kafka client to send (simulated) orders to Azure Event Hubs over its Kafka endpoint
- A consumer: a Java Azure Function that is triggered by order events, enriches them, and persists the result
- Azure Cosmos DB, which stores the enriched orders
To follow along and deploy this solution to Azure, you are going to need a Microsoft Azure account. You can grab one for free if you don't have it already!
Let's go through the individual components of the application.
As always, the **[code is available on GitHub](https://github.com/abhirockzz/eventhubs-functions-cosmosdb-example).**
This is pretty straightforward: it is a Go app that uses the Sarama Kafka client to send (simulated) "orders" to Azure Event Hubs (Kafka topic). It is available as a Docker image for ease of use (details in the next section).
Here is the relevant code snippet:
```go
order := Order{OrderID: "order-1234", CustomerID: "customer-1234", Product: "product-1234"}

b, err := json.Marshal(order)

msg := &sarama.ProducerMessage{Topic: eventHubsTopic, Key: sarama.StringEncoder(oid), Value: sarama.ByteEncoder(b)}
producer.SendMessage(msg)
```
A lot of the details have been omitted from the above snippet - you can go through the full code in the [GitHub repo](https://github.com/abhirockzz/eventhubs-functions-cosmosdb-example). To summarize, an Order is created, converted (marshaled) into JSON (bytes), and sent to the Event Hubs Kafka endpoint.
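The part that is specific to Event Hubs is the Kafka connection configuration: the broker is the Event Hubs namespace on port 9093, using SASL_SSL with the PLAIN mechanism, `$ConnectionString` as the username, and the namespace connection string as the password. The sample producer does this with Sarama in Go; purely for reference, here is a minimal sketch of the same settings using the standard Java Kafka client. The topic name `orders`, the placeholder values, and the JSON field names are assumptions for illustration, not taken from the repo.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class OrdersProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Event Hubs exposes its Kafka endpoint on port 9093 of the namespace
        props.put("bootstrap.servers", "<NAMESPACE>.servicebus.windows.net:9093");
        // Authenticate with SASL PLAIN over TLS, using the namespace connection string
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"$ConnectionString\" "
                + "password=\"<EVENT HUBS CONNECTION STRING>\";");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // "orders" is a hypothetical topic (Event Hub) name used only in this sketch
            String orderJson = "{\"orderId\":\"order-1234\",\"customerId\":\"customer-1234\",\"product\":\"product-1234\"}";
            producer.send(new ProducerRecord<>("orders", "order-1234", orderJson));
        }
    }
}
```

The point is that any standard Kafka client works against Event Hubs once these connection properties are in place; everything else is plain Kafka producer code.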
The Serverless part is a Java Azure Function. It leverages the following capabilities:

- An Event Hubs Trigger to receive order events
- A Cosmos DB Output Binding to persist the enriched orders
The Trigger allows the Azure Functions logic to get invoked whenever an order event is sent to Azure Event Hubs. The Output Binding takes care of all the heavy lifting such as establishing the database connection, scaling, concurrency, etc., and all that's left for us to build is the business logic, which in this case has been kept pretty simple: on receiving the order data from Azure Event Hubs, the function enriches it with additional info (customer and product name in this case) and persists it in an Azure Cosmos DB container.
You can check the [OrderProcessor](https://github.com/abhirockzz/eventhubs-functions-cosmosdb-example/blob/master/consumer-azure-function/src/main/java/com/abhirockzz/OrderProcessor.java) code on GitHub, but here is the gist:
```java
@FunctionName("storeOrders")
public void storeOrders(
        @EventHubTrigger(name = "orders", eventHubName = "", connection = "EventHubConnectionString", cardinality = Cardinality.ONE)
        OrderEvent orderEvent,

        @CosmosDBOutput(name = "databaseOutput", databaseName = "AppStore", collectionName = "orders", connectionStringSetting = "CosmosDBConnectionString")
        OutputBinding<Order> output,

        final ExecutionContext context) {
    ....
    Order order = new Order(orderEvent.getOrderId(), Data.CUSTOMER_DATA.get(orderEvent.getCustomerId()), orderEvent.getCustomerId(), Data.PRODUCT_DATA.get(orderEvent.getProduct()));
    output.setValue(order);
    ....
}
```
The storeOrders method is annotated with [@FunctionName](https://docs.microsoft.com/java/api/com.microsoft.azure.functions.annotation.functionname?view=azure-java-stable&WT.mc_id=dzone-blog-abhishgu) and it receives data from Event Hubs in the form of an OrderEvent object. Thanks to the [@EventHubTrigger](https://docs.microsoft.com/java/api/com.microsoft.azure.functions.annotation.eventhubtrigger?view=azure-java-stable&WT.mc_id=dzone-blog-abhishgu) annotation, the platform takes care of converting the Event Hubs payload to a Java POJO (of the type OrderEvent) and routing it correctly. The connection = "EventHubConnectionString" part specifies that the Event Hubs connection string is available in the function configuration/settings entry named EventHubConnectionString.
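For that conversion to work, OrderEvent just needs to be a plain POJO whose properties line up with the JSON sent by the producer. The actual class lives in the repo; the following is only a minimal sketch, assuming the JSON field names are orderId, customerId, and product:

```java
// Minimal sketch of an OrderEvent POJO; field names are assumed, see the repo for the real class.
public class OrderEvent {

    private String orderId;
    private String customerId;
    private String product;

    // no-arg constructor so the Event Hubs JSON payload can be deserialized into the POJO
    public OrderEvent() {
    }

    public String getOrderId() { return orderId; }
    public void setOrderId(String orderId) { this.orderId = orderId; }

    public String getCustomerId() { return customerId; }
    public void setCustomerId(String customerId) { this.customerId = customerId; }

    public String getProduct() { return product; }
    public void setProduct(String product) { this.product = product; }
}
```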
The [@CosmosDBOutput](https://docs.microsoft.com/java/api/com.microsoft.azure.functions.annotation.cosmosdboutput?view=azure-java-stable&WT.mc_id=dzone-blog-abhishgu) annotation is used to persist data in Azure Cosmos DB. It contains the Cosmos DB database and container name, along with the connection string, which will be picked up from the CosmosDBConnectionString configuration parameter of the function. The POJO (Order in this case) is persisted to Cosmos DB with a single setValue method call on the [OutputBinding](https://docs.microsoft.com/java/api/com.microsoft.azure.functions.outputbinding?view=azure-java-stable&WT.mc_id=dzone-blog-abhishgu) object - the platform makes it really easy, but there is a lot going on behind the scenes!
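Going by the constructor call in the function above, the enriched Order document carries the order id plus the resolved customer and product names alongside the customer id. Here is a rough sketch of what such a POJO could look like; the real class in the repo may differ, and the id field name is an assumption (chosen because Cosmos DB uses id as the document identifier):

```java
// Rough sketch of the enriched Order POJO persisted to Cosmos DB (field names assumed).
public class Order {

    private String id;           // order id, assumed to double as the Cosmos DB document id
    private String customerName; // looked up from the customer id
    private String customerId;
    private String productName;  // looked up from the product id

    public Order(String id, String customerName, String customerId, String productName) {
        this.id = id;
        this.customerName = customerName;
        this.customerId = customerId;
        this.productName = productName;
    }

    public String getId() { return id; }
    public String getCustomerName() { return customerName; }
    public String getCustomerId() { return customerId; }
    public String getProductName() { return productName; }
}
```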
Let's switch gears and learn how to deploy the solution to Azure.