React Native wrapper for the Android Activity Recognition API and iOS CMMotionActivity. It attempts to determine the user's current activity, such as driving, walking, running, or cycling. The possible detected activities for Android and for iOS are listed below.
```
npm i -S react-native-activity-recognition
```

or with Yarn:

```
yarn add react-native-activity-recognition
```

```
react-native link react-native-activity-recognition
```
IMPORTANT NOTE: You'll need to follow Step 4 of the manual-linking instructions for both iOS and Android.
Make alterations to the following files in your project:
- Add the following lines to `android/settings.gradle`:

```gradle
...
include ':react-native-activity-recognition'
project(':react-native-activity-recognition').projectDir = new File(rootProject.projectDir, '../node_modules/react-native-activity-recognition/android')
...
```
- Add the compile line to the dependencies in `android/app/build.gradle`:

```gradle
...
dependencies {
    ...
    compile project(':react-native-activity-recognition')
    ...
}
```
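Note that on Android Gradle plugin 3.x and later the `compile` configuration is deprecated in favor of `implementation`, so on newer projects the dependency line may need to read:

```gradle
dependencies {
    ...
    implementation project(':react-native-activity-recognition')
    ...
}
```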
- Add the import and link the package in `android/app/src/.../MainApplication.java`:

```java
import com.xebia.activityrecognition.RNActivityRecognitionPackage; // <--- add import

public class MainApplication extends Application implements ReactApplication {
  // ...
  @Override
  protected List<ReactPackage> getPackages() {
    return Arrays.<ReactPackage>asList(
        new MainReactPackage(),
        // ...
        new RNActivityRecognitionPackage() // <--- add package
    );
  }
  // ...
}
```
- Add the activityrecognition service in `android/app/src/main/AndroidManifest.xml`:

```xml
...
<application ...>
    ...
    <service android:name="com.xebia.activityrecognition.DetectionService"/>
    ...
</application>
...
```
- In Xcode's "Project navigator", right click on your project's Libraries folder ➜ Add Files to <...>
- Go to `node_modules` ➜ `react-native-activity-recognition` ➜ `ios` ➜ select `RNActivityRecognition.xcodeproj`
- Add `RNActivityRecognition.a` to Build Phases -> Link Binary With Libraries
- Add the `NSMotionUsageDescription` key to your `Info.plist` with a string describing why your app needs this permission
```js
import ActivityRecognition from 'react-native-activity-recognition'

...

// Subscribe to updates
this.unsubscribe = ActivityRecognition.subscribe(detectedActivities => {
  const mostProbableActivity = detectedActivities.sorted[0]
})

...

// Start activity detection
const detectionIntervalMillis = 1000
ActivityRecognition.start(detectionIntervalMillis)

...

// Stop activity detection and remove the listener
ActivityRecognition.stop()
this.unsubscribe()
```
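As a usage sketch in plain JavaScript (independent of the native module), you might act on an update only when the top activity clears a confidence threshold. The `mostProbable` helper and the threshold below are illustrative, not part of the library's API:

```javascript
// Illustrative helper (not part of the library): return the top activity
// from the `sorted` array, but only if its confidence clears a threshold.
function mostProbable(sorted, minConfidence = 50) {
  if (sorted.length === 0 || sorted[0].confidence < minConfidence) {
    return null // nothing we are reasonably sure about
  }
  return sorted[0].type
}

const sorted = [
  { type: 'STILL', confidence: 77 },
  { type: 'IN_VEHICLE', confidence: 15 },
  { type: 'ON_FOOT', confidence: 8 },
  { type: 'WALKING', confidence: 8 },
]

console.log(mostProbable(sorted))     // 'STILL'
console.log(mostProbable(sorted, 80)) // null (77 is below the threshold)
```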
`detectedActivities` is an object with a key for each detected activity, each of which has an integer percentage (0-100) indicating the likelihood that the user is performing this activity. For example:
```js
{
  ON_FOOT: 8,
  IN_VEHICLE: 15,
  WALKING: 8,
  STILL: 77
}
```
Additionally, the `detectedActivities.sorted` getter is provided, which returns an array of activities ordered by their confidence value:
```js
[
  { type: 'STILL', confidence: 77 },
  { type: 'IN_VEHICLE', confidence: 15 },
  { type: 'ON_FOOT', confidence: 8 },
  { type: 'WALKING', confidence: 8 },
]
```
Because the activities are sorted by confidence level, the first value will be the one with the highest probability. Note that ON_FOOT and WALKING are related but won't always have the same value. I have never seen WALKING with a higher confidence than ON_FOOT, but WALKING may come before ON_FOOT in the array if they have the same value.
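The `sorted` getter is presumably equivalent to sorting the raw key/value pairs by descending confidence. A minimal re-implementation in plain JavaScript (an illustration of the behavior described above, not the library's actual code) would be:

```javascript
// Illustrative re-implementation of the `sorted` getter: turn the raw
// { ACTIVITY: confidence } object into an array sorted by confidence,
// highest first. Ties keep the object's key order (Array.prototype.sort
// is stable in modern JS), which is why ON_FOOT can precede WALKING
// (or vice versa) when their confidences are equal.
function toSorted(detectedActivities) {
  return Object.keys(detectedActivities)
    .map(type => ({ type, confidence: detectedActivities[type] }))
    .sort((a, b) => b.confidence - a.confidence)
}

const raw = { ON_FOOT: 8, IN_VEHICLE: 15, WALKING: 8, STILL: 77 }
console.log(toSorted(raw))
// STILL (77) first, then IN_VEHICLE (15), then ON_FOOT and WALKING (8 each)
```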
The following activity types are supported:
- IN_VEHICLE
- ON_BICYCLE
- ON_FOOT
- RUNNING
- WALKING
- STILL
- TILTING
- UNKNOWN
`detectedActivities` is an object keyed by the detected activity, with a confidence value for that activity given by CMMotionActivityManager. For example:
```js
{
  WALKING: 2
}
```
The `detectedActivities.sorted` getter returns it in the form of an array:
```js
[
  { type: 'WALKING', confidence: 2 }
]
```
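On iOS the confidence comes from CMMotionActivity, whose CMMotionActivityConfidence enum has raw values 0 (low), 1 (medium), and 2 (high), so the `2` above means high confidence. A small helper to turn it into a label (illustrative, not part of the library) could be:

```javascript
// Map the iOS confidence value (CMMotionActivityConfidence raw value)
// to a human-readable label: 0 = low, 1 = medium, 2 = high.
function confidenceLabel(confidence) {
  return ['low', 'medium', 'high'][confidence] || 'unknown'
}

console.log(confidenceLabel(2)) // 'high'
```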
The following activity types are supported:
- RUNNING
- WALKING
- STATIONARY
- AUTOMOTIVE
- CYCLING
- UNKNOWN
The following projects were very helpful in developing this library: