pongpatapee/Litter-Detection-A-EYE
ecomake-map

Using the camera module on a Raspberry Pi 4, we take a picture at a fixed interval and send it in a POST request to Microsoft Azure's Computer Vision API, which detects objects in the image. We use the returned description to decide whether litter is present; if it is, we store the image file along with its geographical coordinates in Firebase Storage. A React app (using react-google-map) then displays a marker for each image on a Google Map at the corresponding latitude and longitude.
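The sketch below shows how the Pi-side loop described above could look: capture an image, ask Azure Computer Vision for a description, and upload matches to Firebase Storage with coordinates attached as metadata. It is a minimal illustration, not the repo's code; the environment variable names, the service-account file, the keyword list used to decide "litter", the hard-coded GPS fix, and the 60-second interval are all assumptions.

```python
# Minimal sketch of the capture -> detect -> upload loop (assumptions noted above).
import os
import time
import requests
import firebase_admin
from firebase_admin import credentials, storage
from picamera import PiCamera

AZURE_ENDPOINT = os.environ["AZURE_CV_ENDPOINT"]  # e.g. https://<region>.api.cognitive.microsoft.com
AZURE_KEY = os.environ["AZURE_CV_KEY"]
LITTER_KEYWORDS = {"trash", "garbage", "litter", "bottle", "can", "plastic"}  # illustrative only

# Firebase Admin SDK: a service-account key file and bucket name are assumed.
cred = credentials.Certificate("serviceAccountKey.json")
firebase_admin.initialize_app(cred, {"storageBucket": os.environ["FIREBASE_BUCKET"]})
bucket = storage.bucket()

camera = PiCamera()

def describe_image(path):
    """POST the raw image bytes to Azure Computer Vision and return its description."""
    with open(path, "rb") as f:
        resp = requests.post(
            f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
            params={"visualFeatures": "Description"},
            headers={
                "Ocp-Apim-Subscription-Key": AZURE_KEY,
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
        )
    resp.raise_for_status()
    return resp.json()["description"]

def looks_like_litter(description):
    """Decide from the returned tags and caption whether litter is present."""
    words = set(description.get("tags", []))
    for caption in description.get("captions", []):
        words.update(caption["text"].lower().split())
    return bool(words & LITTER_KEYWORDS)

def upload(path, lat, lng):
    """Store the image in Firebase Storage with its coordinates as object metadata."""
    blob = bucket.blob(f"litter/{os.path.basename(path)}")
    blob.metadata = {"latitude": str(lat), "longitude": str(lng)}
    blob.upload_from_filename(path)

while True:
    image_path = f"/tmp/capture_{int(time.time())}.jpg"
    camera.capture(image_path)
    if looks_like_litter(describe_image(image_path)):
        upload(image_path, 47.6062, -122.3321)  # placeholder GPS fix
    time.sleep(60)  # capture interval (assumed)
```

The React app can then list the stored objects, read the latitude/longitude metadata, and render a marker for each image on the map.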

Image of demo

Ecomake 2020 3rd-place winning project, made by Dan Peerapatanapokin, Grayson Harralson, Kenneth Wong, and Ibrahim Saeed.


Languages

  • JavaScript 38.9%
  • Python 24.0%
  • CSS 23.1%
  • HTML 14.0%