Users expect their apps to be smart and to amaze them in some way. Learn with us how to make a React app more intelligent with Cognitive Services and Microsoft Azure. Together we will build a fun application that reads emotion information about the people in a photo and picks just the right smiley face for them. You will see React, Azure Functions, and Cognitive Services working together.
In the project directory, you can run:

npm start
Runs the app in development mode.
Open http://localhost:3000 to view it in the browser.
The page will reload when you make edits.
You will also see any lint errors in the console.

npm test
Launches the test runner in interactive watch mode.
See the Create React App documentation on running tests for more information.

npm run build
Builds the app for production to the build folder.
It correctly bundles React in production mode and optimizes the build for the best performance.
The build is minified and the filenames include the hashes.
Your app is ready to be deployed!
TypeScript: https://www.typescriptlang.org/
Clone this repo and run the scripts above.
- Read the photo URL from disk
- Call the service with the photo and log the emotion data
- Display the photo as an image in the stream
- Make sure the Composer component fires onPostCreated with the URL and the emotion data
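The Composer step above can be sketched as a small, framework-free helper. Everything here is an assumption drawn from the task description: the names PostData, EmotionData, and the exact fields onPostCreated receives are hypothetical, not a known API.

```typescript
// Hypothetical shapes for the workshop task; adjust to your real components.
interface EmotionData {
  faceRectangle: { top: number; left: number; width: number; height: number };
  scores: Record<string, number>;
}

interface PostData {
  url: string;
  emotions: EmotionData[];
}

// Bundles the photo URL with the emotion data and hands the result
// to the parent through the onPostCreated callback, as the task asks.
function createPost(
  url: string,
  emotions: EmotionData[],
  onPostCreated: (post: PostData) => void
): PostData {
  const post: PostData = { url, emotions };
  onPostCreated(post);
  return post;
}
```

In the React Composer component, onPostCreated would come in as a prop and createPost would be called once the service response arrives.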
- Add a type for the server response
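A possible typing for the server response is sketched below. It assumes the Azure Function returns the classic Cognitive Services Emotion API shape (an array of faces with faceRectangle and scores); verify against the actual EmotionDetector output before relying on it.

```typescript
// Assumed response shape, modeled on the classic Emotion API.
interface FaceRectangle {
  top: number;
  left: number;
  width: number;
  height: number;
}

interface EmotionScores {
  anger: number;
  contempt: number;
  disgust: number;
  fear: number;
  happiness: number;
  neutral: number;
  sadness: number;
  surprise: number;
}

interface DetectedFace {
  faceRectangle: FaceRectangle;
  scores: EmotionScores;
}

// One entry per face found in the photo.
type EmotionResponse = DetectedFace[];

// Small runtime guard so a malformed response fails early.
function parseEmotionResponse(json: unknown): EmotionResponse {
  if (!Array.isArray(json)) throw new Error("Expected an array of faces");
  return json as EmotionResponse;
}
```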
Service URL: https://faceprocessor.azurewebsites.net/api/EmotionDetector
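Calling the service could look like the sketch below. The request contract (a JSON POST body with a url field) is an assumption, not a documented API; check the workshop materials for the real contract.

```typescript
const SERVICE_URL =
  "https://faceprocessor.azurewebsites.net/api/EmotionDetector";

// Sends the photo URL to the EmotionDetector function and logs the
// emotion data, per the task. The body shape { url } is assumed.
async function detectEmotions(photoUrl: string): Promise<unknown> {
  const response = await fetch(SERVICE_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ url: photoUrl }),
  });
  if (!response.ok) {
    throw new Error(`Service returned ${response.status}`);
  }
  const data = await response.json();
  console.log("emotion data", data);
  return data;
}
```

The returned data can then be passed straight to onPostCreated together with the photo URL.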
- Add a canvas component to your Post component
- Calculate the scale of the image
- Draw the image on the canvas
- Use the emoticon from https://static-asm.secure.skypeassets.com/pes/v1/emoticons/smile/views/default_160 and draw it on the canvas using the faceRectangle data (remember to load the emoticon image first!)
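The canvas steps above can be sketched as follows. calcScale is a pure helper; drawPost assumes a browser canvas, the emoticon URL from the task, and the faceRectangle shape of the Emotion API. It is a sketch under those assumptions, not a finished implementation.

```typescript
const EMOTICON_URL =
  "https://static-asm.secure.skypeassets.com/pes/v1/emoticons/smile/views/default_160";

// Scale factor that fits the photo into the canvas width.
function calcScale(imageWidth: number, canvasWidth: number): number {
  return canvasWidth / imageWidth;
}

function drawPost(
  canvas: HTMLCanvasElement,
  photo: HTMLImageElement,
  faceRectangle: { top: number; left: number; width: number; height: number }
): void {
  const ctx = canvas.getContext("2d");
  if (!ctx) return;
  const scale = calcScale(photo.width, canvas.width);
  // Draw the photo scaled to the canvas.
  ctx.drawImage(photo, 0, 0, photo.width * scale, photo.height * scale);
  // Load the emoticon first, then draw it over the face rectangle,
  // applying the same scale as the photo.
  const emoticon = new Image();
  emoticon.onload = () => {
    ctx.drawImage(
      emoticon,
      faceRectangle.left * scale,
      faceRectangle.top * scale,
      faceRectangle.width * scale,
      faceRectangle.height * scale
    );
  };
  emoticon.src = EMOTICON_URL;
}
```

In the Post component, drawPost would run in an effect once the photo has loaded.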
- Suggested: sort the emotions by score and pick the top one
- Alternatively: get creative
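The suggested approach can be sketched as a one-liner over the scores object. The score keys follow the assumed Emotion API shape; mapping the winning emotion to a specific emoticon URL is left to you.

```typescript
// Returns the name of the strongest emotion, e.g. "happiness".
function topEmotion(scores: Record<string, number>): string {
  return Object.entries(scores).sort(([, a], [, b]) => b - a)[0][0];
}
```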