The API.AI Java SDK makes it easy to integrate speech recognition with the API.AI natural language processing API on Android devices. API.AI lets you use voice commands and integrate with dialog scenarios defined for a particular agent in API.AI.
Authentication is accomplished by setting the client access token when initializing an AIConfiguration object. The client access token specifies which agent will be used for natural language processing.
Note: The API.AI Java SDK only makes query requests, and cannot be used to manage entities and intents. Instead, use the API.AI user interface or REST API to create, retrieve, update, and delete entities and intents.
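For example, a minimal query request with the SDK might look like the following. This is a hedged sketch modeled on the SDK's text-client example; the class names (`AIConfiguration`, `AIDataService`, `AIRequest`, `AIResponse`) and the single-argument `AIConfiguration` constructor are assumed from the apiai-java-sdk, and the token is assumed to arrive as the first program argument:

```java
import ai.api.AIConfiguration;
import ai.api.AIDataService;
import ai.api.model.AIRequest;
import ai.api.model.AIResponse;

public class QuerySketch {
    public static void main(String[] args) throws Exception {
        // The client access token identifies the agent to query
        // (assumption: it is passed as the first program argument).
        AIConfiguration configuration = new AIConfiguration(args[0]);
        AIDataService dataService = new AIDataService(configuration);

        // Send a text query to the agent and print the agent's reply.
        AIRequest request = new AIRequest("hello");
        AIResponse response = dataService.request(request);
        System.out.println(response.getResult().getFulfillment().getSpeech());
    }
}
```

Because the SDK only issues query requests, everything else (creating the agent, its entities, and its intents) must already be set up through the API.AI console or REST API.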
This section contains a detailed tutorial on how to work with libai. The tutorial was written for IntelliJ IDEA.
- Create an API.AI agent with entities and intents, or use one that you've already created. See the API.AI documentation for instructions on how to do this.
- Open IntelliJ IDEA.
- From the start screen (or the File menu), choose Open....
- In the Open Project dialog, enter the path to the apiai-java-sdk directory, then expand the examples directory and choose one of the client examples. Then click OK.
- Open the Run menu and choose Edit Configurations.... In the Run/Debug Configurations dialog, enter your client access token in the Program arguments field.
- If there are no errors, you can use the IDEA input/output console to make a text request (text-client) and see the result.
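The console interaction in the last step can be sketched as a small read-query-print loop, similar to what the text-client example does. This is a hedged sketch, not the example's exact source; the SDK class names and the `args[0]` token convention are assumptions carried over from the steps above:

```java
import java.util.Scanner;

import ai.api.AIConfiguration;
import ai.api.AIDataService;
import ai.api.model.AIRequest;

public class TextClientSketch {
    public static void main(String[] args) throws Exception {
        // args[0] is the client access token supplied in Program arguments.
        AIDataService dataService = new AIDataService(new AIConfiguration(args[0]));

        // Read queries from the IDEA input console until an empty line,
        // sending each one to the agent and printing the reply.
        Scanner scanner = new Scanner(System.in);
        String line;
        while (!(line = scanner.nextLine()).isEmpty()) {
            String speech = dataService.request(new AIRequest(line))
                    .getResult().getFulfillment().getSpeech();
            System.out.println("Agent: " + speech);
        }
    }
}
```

Typing a query such as "hello" into the console should print the agent's fulfillment text, assuming the agent has a matching intent.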
If, for some reason, you want to work with the library source code, follow these steps.
- First, complete all the steps of the Quick Start above. Only then do the steps below.
- Open the File menu, choose Project Structure..., then choose Modules. Add a new module, libai. Then select the text-client (or voice-client) module. Remove the Maven: ai.api.libai:1.1.0 dependency and add a module dependency on libai. Then click Apply.
- Try to run it. If there are no errors, you can use the IDEA input/output console to make a text request (text-client) and see the result.