Data@Hand is a cross-platform smartphone app that facilitates visual data exploration by leveraging both speech and touch interactions. Data visualization is a common way for mobile health apps to let people explore their data on smartphones. However, due to smartphones’ limitations, such as small screen size and the lack of precise pointing input, such apps provide limited support for visual data exploration, often with over-simplified time navigation, even though time is a primary dimension of self-tracking data. Data@Hand leverages the synergy of speech and touch: speech-based interaction takes up little screen space, and natural language is flexible enough to cover the many ways of specifying dates and date ranges (e.g., “October 7th”, “Last Sunday”, “This month”). Currently, Data@Hand supports displaying Fitbit data (e.g., step count, heart rate, sleep, and weight) for navigation and temporal comparison tasks.
For more information about this project, please visit https://data-at-hand.github.io.
Data@Hand: Fostering Visual Exploration of Personal Data on Smartphones Leveraging Speech and Touch Interaction
[Best Paper Honorable Mention Award]
Young-Ho Kim, Bongshin Lee, Arjun Srinivasan, and Eun Kyoung Choe
ACM CHI 2021 (PDF)
Data@Hand is a stand-alone application that does not require a backend server. The app communicates with the Fitbit server and fetches the data locally on the device.
- Register an app on the Fitbit developer page: https://dev.fitbit.com/apps/new.
- Select Client for OAuth 2.0 Application Type.
- Use a URL similar to edu.umd.hcil.data-at-hand://oauth2/redirect for Callback URL. This URL will be used locally on your device.
- Data@Hand leverages Fitbit's Intraday API, for which you must explicitly request access from Fitbit: https://dev.fitbit.com/build/reference/web-api/intraday-requests/.
- In the credentials directory of the repository, copy fitbit.example.json and rename it to fitbit.json.
- Fill in the information accordingly. You can find it under Manage My Apps on the Fitbit developer page.
{
"client_id": "YOUR_FITBIT_ID", // <- OAuth 2.0 Client ID
"client_secret": "YOUR_FITBIT_SECRET", // <- Client Secret
"redirect_uri": "YOUR_REDIRECT_URI" // <- Callback URL
}
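These fields feed Fitbit's standard OAuth 2.0 authorization-code flow. As a minimal sketch (the helper and scope list below are hypothetical illustrations, not Data@Hand's actual code), the authorization URL an app would open is assembled from the config like this:

```typescript
// Sketch: building Fitbit's OAuth 2.0 authorization URL from the
// fields in fitbit.json. The interface mirrors the config file;
// buildAuthorizationUrl itself is a hypothetical helper.
interface FitbitCredential {
  client_id: string;
  client_secret: string; // used later in the token exchange, not here
  redirect_uri: string;
}

function buildAuthorizationUrl(cred: FitbitCredential, scopes: string[]): string {
  const params = new URLSearchParams({
    response_type: "code",
    client_id: cred.client_id,
    redirect_uri: cred.redirect_uri,
    scope: scopes.join(" "),
  });
  return `https://www.fitbit.com/oauth2/authorize?${params.toString()}`;
}
```

After the user authorizes the app in the browser, Fitbit redirects to the custom-scheme Callback URL registered above, which the app intercepts on the device to complete the token exchange.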
- Register a Microsoft Cognitive Services Speech-to-Text resource (a free tier is available): https://azure.microsoft.com/en-us/services/cognitive-services/speech-to-text/.
- In the credentials directory of the repository, copy microsoft_cognitive_service_speech.example.json and rename it to microsoft_cognitive_service_speech.json.
- Fill in the information accordingly. You need the subscription ID and the region of your Azure resource.
{
"subscriptionId": "YOUR_SUBSCRIPTION_ID",
"region": "YOUR_AZURE_REGION" // <- Depending on the region you set. e.g., "eastus"
}
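The region value determines which regional Azure endpoint the speech service uses. As a rough illustration (a hypothetical helper; the app itself talks to the service through the platform speech SDK rather than raw REST calls), the REST endpoint for short-form recognition is derived from the region like this:

```typescript
// Sketch: deriving the Azure Speech-to-Text REST endpoint from the
// region in microsoft_cognitive_service_speech.json. This helper is
// illustrative only and not part of Data@Hand's code.
interface SpeechCredential {
  subscriptionId: string;
  region: string; // e.g., "eastus"
}

function speechToTextEndpoint(cred: SpeechCredential): string {
  return `https://${cred.region}.stt.speech.microsoft.com` +
    "/speech/recognition/conversation/cognitiveservices/v1";
}
```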
- Create a Bugsnag project and get the API key: https://www.bugsnag.com/.
- In the credentials directory of the repository, copy bugsnag.example.json and rename it to bugsnag.json.
- Fill in the information accordingly.
{
"api_key": "YOUR_BUGSNAG_API_KEY"
}
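The API key is all Bugsnag needs at startup. A minimal sketch of the wiring, assuming the @bugsnag/react-native package (the relative import path to the JSON file is illustrative; adjust it to where the credentials directory sits relative to your entry file):

```typescript
// Sketch: initializing Bugsnag error reporting with the key from
// bugsnag.json. Hypothetical wiring, not the app's exact entry code.
import Bugsnag from "@bugsnag/react-native";
import bugsnagConfig from "./credentials/bugsnag.json";

Bugsnag.start({ apiKey: bugsnagConfig.api_key });
```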
Install Node.js on your system.
Install react-native CLI:
> npm install -g @react-native-community/cli
Install dependencies (in the repository directory where package.json exists):
> npm i
If you have not used CocoaPods before, install it once:
> sudo gem install cocoapods
Install iOS project dependencies:
> cd ios
> pod install
Run on iOS:
> react-native run-ios
Run on Android:
> react-native run-android
- Fitbit REST API (Setup required)
- Microsoft Cognitive Services Speech-to-Text (Optional; Android only)
- Bugsnag (Optional. Error reporting)
Young-Ho Kim (Website)
Postdoctoral Associate
University of Maryland, College Park
*Contact for code and implementation
Bongshin Lee (Website)
Sr. Principal Researcher
Microsoft Research
Arjun Srinivasan (Website)
Research Scientist
Tableau Research
*Arjun did this work while at Georgia Institute of Technology
Eun Kyoung Choe (Website)
Associate Professor
University of Maryland, College Park
This work was supported in part by National Science Foundation award #1753452 (CAREER: Advancing Personal Informatics through Semi-Automated and Collaborative Tracking).
MIT License
CC BY 4.0