This is the Dapr SDK for Java, including the following features:
- PubSub
- Service Invocation
- Binding
- State Store
- Actors
- JDK 11 or above - the published jars are compatible with Java 8.
- Java IDE installed:
- IntelliJ
- Eclipse
- Apache NetBeans
- Visual Studio Code
- Any other IDE for Java that you prefer.
- Install one of the following build tools for Java: Apache Maven (3.x) or Gradle.
- If needed, install the corresponding plugin for the build tool in your IDE.
- An existing Java Maven or Gradle project. You may also start a new project with the build tool of your choice.
For a Maven project, add the following to your pom.xml file:
<project>
  ...
  <dependencies>
    ...
    <!-- Dapr's core SDK with all features, except Actors. -->
    <dependency>
      <groupId>io.dapr</groupId>
      <artifactId>dapr-sdk</artifactId>
      <version>1.7.1</version>
    </dependency>
    <!-- Dapr's SDK for Actors (optional). -->
    <dependency>
      <groupId>io.dapr</groupId>
      <artifactId>dapr-sdk-actors</artifactId>
      <version>1.7.1</version>
    </dependency>
    <!-- Dapr's SDK integration with SpringBoot (optional). -->
    <dependency>
      <groupId>io.dapr</groupId>
      <artifactId>dapr-sdk-springboot</artifactId>
      <version>1.7.1</version>
    </dependency>
    ...
  </dependencies>
  ...
</project>
For a Gradle project, add the following to your build.gradle file:
dependencies {
  ...
  // Dapr's core SDK with all features, except Actors.
  compile('io.dapr:dapr-sdk:1.7.1')
  // Dapr's SDK for Actors (optional).
  compile('io.dapr:dapr-sdk-actors:1.7.1')
  // Dapr's SDK integration with SpringBoot (optional).
  compile('io.dapr:dapr-sdk-springboot:1.7.1')
}
Clone this repository including the submodules:
git clone https://github.com/dapr/java-sdk.git
Then build the project with Maven (Apache Maven version 3.x):
# make sure you are in the `java-sdk` directory.
mvn clean install
Try the following examples to learn more about Dapr's Java SDK:
- Invoking an HTTP service
- Invoking a gRPC service
- State management
- PubSub with subscriber over HTTP
- Binding with input over HTTP
- Actors
- Secrets management
- Distributed tracing with OpenTelemetry SDK
- Exception handling
- Unit testing
Please refer to our Javadoc website.
The Java SDK for Dapr is built using Project Reactor and provides an asynchronous API for Java. When a result is consumed synchronously, as in the examples referenced above, the block() method is used.
The code below does not make any API call; it simply returns the Mono publisher object. Nothing happens until the application subscribes to or blocks on the result:
Mono<Void> result = daprClient.publishEvent("mytopic", "my message");
To start execution and receive the result object synchronously (void or Void becomes an empty result), use block(). The code below shows how to execute the call and consume an empty response:
Mono<Void> result = daprClient.publishEvent("mytopic", "my message");
result.block();
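When the call returns a value, block() also gives you the deserialized result. Below is a minimal sketch, assuming a state store component named "statestore" and a previously saved key "mykey" (both names are placeholders):

// Assumes io.dapr.client.domain.State is imported and daprClient was built with DaprClientBuilder.
State<String> state = daprClient.getState("statestore", "mykey", String.class).block();
System.out.println("Value: " + state.getValue());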
This SDK provides basic serialization for request/response objects as well as for state objects. Applications should provide their own serialization for production scenarios.
- Implement the DaprObjectSerializer interface (a minimal sketch is shown after this list). See this class as an example.
- Use your serializer class in the following scenarios:
- When building a new instance of DaprClient:
DaprClient client = (new DaprClientBuilder())
    .withObjectSerializer(new MyObjectSerializer()) // for request/response objects.
    .withStateSerializer(new MyStateSerializer()) // for state objects.
    .build();
- When registering an Actor Type:
ActorRuntime.getInstance().registerActor(
    DemoActorImpl.class,
    new MyObjectSerializer(), // for request/response objects.
    new MyStateSerializer()); // for state objects.
- When building a new instance of ActorProxy to invoke an Actor instance, use the same serializer as when registering the Actor Type:
try (ActorClient actorClient = new ActorClient()) {
  DemoActor actor = (new ActorProxyBuilder(DemoActor.class, actorClient))
      .withObjectSerializer(new MyObjectSerializer()) // for request/response objects.
      .build(new ActorId("100"));
}
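For reference, here is a minimal sketch of a custom serializer backed by Jackson. The class name MyObjectSerializer is a placeholder and the JSON content type is an assumption; adapt it to your own wire format:

import java.io.IOException;

import com.fasterxml.jackson.databind.ObjectMapper;

import io.dapr.serializer.DaprObjectSerializer;
import io.dapr.utils.TypeRef;

// Sketch of a JSON serializer built on Jackson, implementing DaprObjectSerializer.
public class MyObjectSerializer implements DaprObjectSerializer {

  private final ObjectMapper mapper = new ObjectMapper();

  @Override
  public byte[] serialize(Object o) throws IOException {
    // Serialize any object to JSON bytes.
    return mapper.writeValueAsBytes(o);
  }

  @Override
  public <T> T deserialize(byte[] data, TypeRef<T> type) throws IOException {
    // Deserialize JSON bytes back into the requested type.
    return mapper.readValue(data, mapper.constructType(type.getType()));
  }

  @Override
  public String getContentType() {
    // Content type advertised for payloads produced by this serializer.
    return "application/json";
  }
}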
In IntelliJ Community Edition, consider debugging in IntelliJ.
In Visual Studio Code, consider debugging in Visual Studio Code.
If you need to debug your application, run the Dapr sidecar separately and then start the application from your IDE (IntelliJ, for example). For Linux and macOS:
dapr run --app-id testapp --app-port 3000 --dapr-http-port 3500 --dapr-grpc-port 5001
Note: confirm the correct port that the app will listen to and that the Dapr ports above are free, changing the ports if necessary.
When running your Java application from the IDE, make sure the following environment variables are set, so the Java SDK knows how to connect to Dapr's sidecar:
DAPR_HTTP_PORT=3500
DAPR_GRPC_PORT=5001
Now you can go to your IDE (like Eclipse, for example) and debug your Java application, using port 3500 to call Dapr while also listening on port 3000 to expose Dapr's callback endpoint.
Most exceptions thrown by the SDK are instances of DaprException. DaprException extends RuntimeException, making it compatible with Project Reactor. See the exception handling example for more details.
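As a rough sketch (not the repository's full example), a blocking call can be wrapped in a try/catch to inspect the failure; the app id "invalid-app-id" below is hypothetical and expected to fail:

// Assumes io.dapr.client.DaprClient, io.dapr.client.domain.HttpExtension and
// io.dapr.exceptions.DaprException are imported, and daprClient was built with DaprClientBuilder.
try {
  // Hypothetical app id that does not exist, so the invocation fails and throws DaprException.
  daprClient.invokeMethod("invalid-app-id", "say", "hello", HttpExtension.POST).block();
} catch (DaprException e) {
  System.out.println("Error code: " + e.getErrorCode());
  System.out.println("Error message: " + e.getMessage());
}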
Change the dapr.proto.baseurl property below in pom.xml to point to the URL for the desired commit hash in Git if you need to target a proto file that has not been merged into master yet.
Note: You may need to run mvn clean after changing this setting to remove any auto-generated files so that the new proto files get downloaded and compiled.
<project>
  ...
  <properties>
    ...
    <!-- change this .... -->
    <dapr.proto.baseurl>https://raw.githubusercontent.com/dapr/dapr/(current ref in pom.xml)/dapr/proto</dapr.proto.baseurl>
    <!-- to something like this: -->
    <dapr.proto.baseurl>https://raw.githubusercontent.com/dapr/dapr/1ac5d0e8590a7d6772c9957c236351ed992ccb19/dapr/proto</dapr.proto.baseurl>
    ...
  </properties>
  ...
</project>
Along with the pre-requisites for the SDK, the following are needed:
- Docker installed
- Bash shell
- In Windows, use WSL2
- In Linux and macOS, the default shells are enough
The code for the tests is present inside the sdk-tests project. This module alone can be imported as a separate project in IDEs. It depends on the JARs built by the other modules in the repo, like sdk, sdk-springboot, etc.
As a starting point for running the integration tests, first run mvn clean install from the root of the repo to build the JARs for the different modules, except the sdk-tests module.
During a normal CI build, Docker Compose is used to bring up services like MongoDB, HashiCorp Vault, Apache Zookeeper, Kafka, etc. Similarly, all of these services need to be running locally in order to run the integration tests, either individually or as a whole.
Run the following commands from the root of the repo to start all the docker containers that the tests depend on.
docker-compose -f ./sdk-tests/deploy/local-test-kafka.yml up -d
docker-compose -f ./sdk-tests/deploy/local-test-mongo.yml up -d
docker-compose -f ./sdk-tests/deploy/local-test-vault.yml up -d
To stop the containers and services, run the following commands.
docker-compose -f ./sdk-tests/deploy/local-test-kafka.yml down
docker-compose -f ./sdk-tests/deploy/local-test-mongo.yml down
docker-compose -f ./sdk-tests/deploy/local-test-vault.yml down
From the java-sdk repo root, change to the sdk-tests directory and run the following command.
## with current directory as /java-sdk/sdk-tests/
mvn clean install
The above command runs all the integration tests present in the sdk-tests project.
In IntelliJ, go to File > New > Project from Existing Sources... and import the sdk-tests project.
Once the project has been imported, the individual tests can be run from the IDE like any other unit tests.
If the sdk-tests project does not build correctly, try File > Invalidate Caches... and restart IntelliJ.
You should be able to set breakpoints and debug the tests directly from IntelliJ.