1. Overview

In this tutorial, we’ll explore Azure Java Functions. Azure Functions is a serverless compute service that can execute code in response to events originating from Azure services or custom applications. We can write event-driven code in programming languages like Java, Python, PowerShell, JavaScript, C#, and TypeScript.

This feature allows automation engineers to develop apps that handle events originating from various Azure services. Azure Functions provides a serverless hosting environment for running code without worrying about infrastructure management. Hence, it facilitates quick deployment, auto-scaling, and easy maintenance of these automation apps.

2. High-Level Use Cases

Let’s see a few examples where we can use Azure Functions:

[Image: Azure Function]

Azure Functions is an excellent serverless service for handling events or data originating from different Azure services or custom applications. Additionally, we can configure a function to run on a schedule to poll data from various sources and process it further. Custom applications can also send messages to function apps over HTTP.

Functions can read data from the event context, transform or enrich it, and then send it to target systems. The event could be a new file uploaded to a Blob Storage container, a new message in Queue Storage, or a new event on a Kafka topic. There are many more such scenarios involving other services.

Moreover, Azure’s Event Grid service can centrally manage events using a pub-sub architecture. Services can publish events to Event Grid, and subscriber applications can consume them. Function apps can consume and process events from Event Grid.

3. Key Concepts

Before writing code, we must understand a few concepts of the Azure Functions programming model, such as triggers and bindings.

When we deploy code to Azure Functions, the code must provide the mandatory information on how it will be invoked. Azure uses this information, called a trigger, to invoke the function. The Azure Functions Java library provides the framework to specify the trigger declaratively with the help of annotations.

Similarly, a function needs data relevant to the triggering source for further processing. Input bindings provide this information. Also, there can be scenarios where the function must send data to a target system; output bindings help achieve this. Unlike triggers, bindings are optional.

Let’s suppose that whenever a file is uploaded to Blob Storage, we have to insert its contents into a Cosmos DB database. A trigger lets us define the file upload event in Blob Storage, and we can also retrieve the file contents from it. Furthermore, with an output binding, we can provide the information about the target Cosmos DB into which we’ll insert the data. In a way, it also helps return data from a function.
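To make this concrete, here’s a minimal, hypothetical sketch of such a function; the function name, container path, database name, and connection settings are illustrative, not part of the examples that follow:

@FunctionName("copyBlobToCosmos")
public void run(
  @BlobTrigger(name = "file", path = "uploads/{name}", connection = "AZURE_STORAGE") byte[] file,
  @CosmosDBOutput(name = "doc", databaseName = "demo-db", containerName = "documents",
    connection = "COSMOS_DB") OutputBinding<String> doc,
  final ExecutionContext context) {
    // the trigger delivers the blob's bytes; the output binding persists the result
    doc.setValue(new String(file));
}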

These concepts will be clearer in the upcoming sections.

4. Prerequisites

To see Azure Functions in action, we’ll need an active Azure subscription to create cloud resources.

IDEs like Eclipse, Visual Studio Code, and IntelliJ offer extensions or plugins to help develop, debug, test, and deploy function apps to Azure using Maven-based tooling. They create scaffolds with all the necessary components of the framework to speed up development. For this tutorial, we’ll use IntelliJ with the Azure Toolkit plugin.

Although the toolkit plugin helps create the Java project with the necessary Maven dependency, it’s important to take a look at it:

<dependency>
    <groupId>com.microsoft.azure.functions</groupId>
    <artifactId>azure-functions-java-library</artifactId>
    <version>3.1.0</version>
</dependency>

The azure-functions-maven-plugin helps package and deploy the Azure function:

<plugin>
    <groupId>com.microsoft.azure</groupId>
    <artifactId>azure-functions-maven-plugin</artifactId>
    <version>1.24.0</version>
</plugin>

This plugin helps specify the deployment configurations such as Azure Function’s name, resource group, runtime environment, settings, etc.:

<configuration>
    <appName>${functionAppName}</appName>
    <resourceGroup>java-functions-group</resourceGroup>
    <appServicePlanName>java-functions-app-service-plan</appServicePlanName>
    <region>westus</region>
    <runtime>
        <os>windows</os>
        <javaVersion>17</javaVersion>
    </runtime>
    <appSettings>
        <property>
            <name>FUNCTIONS_EXTENSION_VERSION</name>
            <value>~4</value>
        </property>
        <property>
            <name>AZURE_STORAGE</name>
            <value>DefaultEndpointsProtocol=https;AccountName=functiondemosta;AccountKey=guymcrXX..XX;EndpointSuffix=core.windows.net</value>
        </property>
    </appSettings>
</configuration>
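Note that these appSettings apply to the deployed function app. When running locally, equivalent settings conventionally go into a local.settings.json file at the project root; here’s a minimal sketch with placeholder values:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "java",
    "AZURE_STORAGE": "<storage account connection string>",
    "COSMOS_DB": "<cosmos db connection string>"
  }
}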

The azure-functions plugin packages the whole application in a predefined standard folder structure:

[Image: deployment folder structure]

The function.json file created for each trigger endpoint defines the entry point and the bindings of the application:

{
  "scriptFile" : "../azure-functions-1.0.0-SNAPSHOT.jar",
  "entryPoint" : "com.baeldung.functions.BlobTriggerJava.run",
  "bindings" : [ {
    "type" : "blobTrigger",
    "direction" : "in",
    "name" : "content",
    "path" : "feeds/{name}.csv",
    "dataType" : "binary",
    "connection" : "AZURE_STORAGE"
  }, {
    "type" : "cosmosDB",
    "direction" : "out",
    "name" : "output",
    "databaseName" : "organization",
    "containerName" : "employee",
    "connection" : "COSMOS_DB"
  } ]
}

Finally, once we execute the Maven goals clean, compile, package, and azure-functions:deploy in IntelliJ, the plugin deploys the application to the Azure Function app:

[Image: azure deploy]

If the Function app doesn’t exist in the Azure subscription, the plugin creates it; otherwise, it updates the existing one with the latest configurations. Interestingly, along with the Function app, the azure-functions Maven plugin creates several other essential supporting cloud resources as well:

[Image: java-fn-group]

5. Key Components of Azure Function SDK

While developing the Azure functions in Java, we mostly deal with annotations to declare triggers and bindings. A few important triggers applied to functions are @HttpTrigger, @BlobTrigger, @CosmosDBTrigger, @EventGridTrigger, @TimerTrigger, etc.

The input binding annotations such as @BlobInput, @CosmosDBInput, @TableInput, etc. complement the triggers to help functions access the data from the event sources. Understandably, we can apply them to the input arguments of the function.

On the other hand, the output binding annotations such as @BlobOutput, @CosmosDBOutput, @TableOutput, etc. are applied to function arguments of type OutputBinding. Moreover, they help write the data received from the source into target systems like Blob Storage, Cosmos DB, Storage Tables, etc.

Additionally, we may use certain interfaces such as ExecutionContext, HttpRequestMessage, HttpResponseMessage.Builder, HttpResponseMessage, OutputBinding, etc. ExecutionContext is one of the arguments to the function, and it provides access to the runtime environment, such as the logger and the invocation ID. The other interfaces help receive the HTTP request payload in the function parameters, and build and return HTTP response messages.
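As a quick illustration of how these interfaces fit together, here’s a minimal, hypothetical HTTP-triggered function (the function name and response body are made up for this sketch):

@FunctionName("ping")
public HttpResponseMessage ping(
  @HttpTrigger(name = "req", methods = { HttpMethod.GET },
    authLevel = AuthorizationLevel.ANONYMOUS) HttpRequestMessage<Optional<String>> request,
  final ExecutionContext context) {
    // ExecutionContext exposes the runtime environment, e.g. the logger and invocation ID
    context.getLogger().info("Invocation ID: " + context.getInvocationId());

    // HttpResponseMessage.Builder forms the HTTP response returned to the caller
    return request.createResponseBuilder(HttpStatus.OK)
      .body("pong")
      .build();
}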

Let’s understand these components with the help of sample code in the next section.

6. Java Implementation

We’ll implement a few use cases using Azure Functions to learn about this framework. Later, we can apply the same concepts to the annotations not discussed in this article.

6.1. Move HTTP Request Payload to Storage Table

Let’s create a function that receives employee data in the HTTP request payload and then updates an employee Azure Storage Table:

@FunctionName("addEmployee")
@StorageAccount("AZURE_STORAGE")
public HttpResponseMessage run(@HttpTrigger(name = "req", methods = { HttpMethod.POST }, route = "employee/{partitionKey}/{rowKey}",
  authLevel = AuthorizationLevel.FUNCTION) HttpRequestMessage<Optional<Employee>> empRequest,
  @BindingName("partitionKey") String partitionKey,
  @BindingName("rowKey") String rowKey,
  @TableOutput(name = "data", tableName = "employee") OutputBinding<Employee> employeeOutputBinding,
  final ExecutionContext context) {
    context.getLogger().info("Received a http request: " + empRequest.getBody().toString());

    Employee employee = new Employee(empRequest.getBody().get().getName(),
      empRequest.getBody().get().getDepartment(),
      empRequest.getBody().get().getSex(),
      partitionKey, rowKey);
    employeeOutputBinding.setValue(employee);

    return empRequest.createResponseBuilder(HttpStatus.OK)
      .body("Employee Inserted")
      .build();
}

Client programs can trigger this function by submitting a POST HTTP request to the endpoint https://{Azure Function URL}/api/employee/{partitionKey}/{rowKey}?code={function access key}. The request body should contain the employee data in JSON format. The clients can send the partition and row key of the table record in the partitionKey and rowKey URI path variables. The @BindingName annotation binds the path variables to the partitionKey and rowKey method parameters.
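For example, assuming the Employee fields implied by the getter calls in the function above, a request body might look like this:

{
  "name": "John Doe",
  "department": "Sales",
  "sex": "M"
}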

In the method’s body, we create the Employee object from the HTTP request body and then set it on the output binding argument employeeOutputBinding. We provide the storage table information in the tableName attribute of the @TableOutput annotation applied to the method argument employeeOutputBinding. There’s no need to explicitly call any API to insert data into the employee table.
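The Employee type itself isn’t shown in the snippet; a minimal sketch of what it could look like, with field names assumed from the constructor call above (the table binding serializes the object’s properties into the table’s columns):

public class Employee {

    private String name;
    private String department;
    private String sex;
    private String partitionKey;
    private String rowKey;

    public Employee(String name, String department, String sex, String partitionKey, String rowKey) {
        this.name = name;
        this.department = department;
        this.sex = sex;
        this.partitionKey = partitionKey;
        this.rowKey = rowKey;
    }

    // getters and setters omitted for brevity
}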

The @StorageAccount annotation specifies the name of the application setting, AZURE_STORAGE, that holds the connection string of the employee table’s Storage Account. We can store it as a runtime environment variable in the application’s settings:

[Image: Azure appsettings]

Let’s now invoke the endpoint for inserting a record in the employee table:

[Image: postman]

Additionally, for troubleshooting purposes, we can check the function logs in the Azure portal:

[Image: appinsight]

Finally, the JSON data gets inserted into the employee table in the Azure Storage Account:

[Image: azure storage table]

6.2. Move Blob Data to Cosmos DB

In the previous use case, we explicitly invoked the HTTP endpoint. This time, let’s consider an example that triggers a function automatically when a file is uploaded to Blob Storage. Afterward, the function reads the data and inserts it into a Cosmos DB database:

@FunctionName("BlobTriggerJava")
@StorageAccount("AZURE_STORAGE")
public void run(
  @BlobTrigger(name = "content", path = "feeds/{name}.csv", dataType = "binary") byte[] content,
  @BindingName("name") String fileName,
  @CosmosDBOutput(name = "output",
    databaseName = "organization",
    connection = "COSMOS_DB",
    containerName = "employee") OutputBinding<List<Employee>> employeeOutputBinding,
  final ExecutionContext context) {
    context.getLogger().info("Java Blob trigger function processed a blob. Name: " + fileName + "\n  Size: " + content.length + " Bytes");
    employeeOutputBinding.setValue(getEmployeeList(content));
    context.getLogger().info("Processing finished");
}

The @BlobTrigger annotation’s path attribute helps specify the file in the Blob Storage. The function populates the argument employeeOutputBinding of type OutputBinding<List<Employee>> with the file’s content. We define the target Cosmos DB details in the @CosmosDBOutput annotation. The connection attribute’s value COSMOS_DB is an environment variable in the function’s application settings in Azure. It contains the target DB’s connection string.
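The getEmployeeList() helper isn’t shown above; a minimal sketch of what it might do, assuming each CSV line holds the same fields as the earlier example:

private List<Employee> getEmployeeList(byte[] content) {
    List<Employee> employees = new ArrayList<>();
    // each CSV line is assumed to hold: name, department, sex, partition key, row key
    for (String line : new String(content, StandardCharsets.UTF_8).split("\\R")) {
        String[] fields = line.split(",");
        if (fields.length >= 5) {
            employees.add(new Employee(fields[0].trim(), fields[1].trim(),
              fields[2].trim(), fields[3].trim(), fields[4].trim()));
        }
    }
    return employees;
}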

To demonstrate, we’ll upload an employee.csv file from the IntelliJ IDE to the Blob container under the databasesas Storage Account:

[Image: azure explorer intellij]

Finally, the function gets invoked and inserts the data into the Cosmos DB employee container:

[Image: cosmosdb]

7. Conclusion

In this article, we learned about the Azure Functions programming model in Java. The framework is well-designed and simple to understand. However, it’s important to know the basic troubleshooting steps for when a function fails to trigger or cannot update the target system. For this, we must learn the concepts of Azure Functions and complementary services such as Storage Account, Application Insights, etc.

As usual, the code used can be found over on GitHub.