This is the repository for the 2020 LGSI Internship program
Developing a device that addresses the discomfort of the visually impaired
-
Blind people are more sensitive than sighted people to small changes in the position of objects.
-
Therefore, we provide visually impaired people with a wearable device that becomes their eyes and ears, telling them the location of the objects they want to find.
-
We believe it is helpful for blind people and works effectively with LG Smart TV.
So we named the application we developed "Eye Helper".
-
When the user says what they want to find, the device recognizes the voice command and starts the program.
-
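The repository's actual voice pipeline is not shown here, so the step above can only be sketched. Assuming speech has already been transcribed to text by a speech-to-text engine, a hypothetical parser could extract the target object from the command; all names below (`parse_command`, `TRIGGER_WORDS`) are illustrative.

```python
from typing import Optional

# Hypothetical trigger phrases; the real device's vocabulary may differ.
TRIGGER_WORDS = ("find", "look for", "where is", "where are")

def parse_command(transcript: str) -> Optional[str]:
    """Return the object the user asked for, or None if no trigger matched."""
    text = transcript.lower().strip().rstrip("?.!")
    for trigger in TRIGGER_WORDS:
        if trigger in text:
            # Everything after the trigger phrase is treated as the object name.
            target = text.split(trigger, 1)[1].strip()
            # Drop leading articles/possessives such as "my" or "the".
            for filler in ("my ", "the ", "a ", "an "):
                if target.startswith(filler):
                    target = target[len(filler):]
            return target or None
    return None

print(parse_command("Find my remote control"))  # -> remote control
print(parse_command("Where is the cup?"))       # -> cup
print(parse_command("Hello there"))             # -> None
```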
The attached camera recognizes objects in real time.
-
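A minimal sketch of the per-frame detection step, assuming a detector (e.g. a pretrained on-device model) that returns `(label, confidence, bounding_box)` tuples for each frame. The model call itself is stubbed out; the threshold value and all names are assumptions, not the project's actual API.

```python
from typing import List, Optional, Tuple

# label, confidence score, bounding box as (x, y, width, height)
Detection = Tuple[str, float, Tuple[int, int, int, int]]

def find_target(detections: List[Detection], target: str,
                min_confidence: float = 0.5) -> Optional[Detection]:
    """Return the highest-confidence detection matching the target, if any."""
    matches = [d for d in detections
               if d[0] == target and d[1] >= min_confidence]
    return max(matches, key=lambda d: d[1]) if matches else None

# Example frame output from a hypothetical detector:
frame_detections = [
    ("cup", 0.91, (120, 80, 40, 60)),
    ("remote", 0.34, (300, 200, 50, 20)),  # below threshold, ignored
    ("cup", 0.62, (400, 90, 35, 55)),
]
print(find_target(frame_detections, "cup"))     # highest-confidence cup
print(find_target(frame_detections, "remote"))  # None (low confidence)
```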
When the device finds the object the user is looking for, it exits navigation and outputs an announcement message.
-
If the object is not found within 3 minutes, navigation ends automatically.
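The found/timeout behavior described above could be structured as a simple polling loop. This is a sketch under assumed names: the camera/detector and the announcement output are passed in as callables so the loop itself stays testable.

```python
import time

TIMEOUT_SECONDS = 180  # navigation ends automatically after 3 minutes

def navigate(get_frame_detections, target, announce,
             timeout=TIMEOUT_SECONDS, clock=time.monotonic):
    """Poll per-frame detections until `target` appears or the timeout expires.

    `get_frame_detections` yields (label, confidence) pairs for one frame;
    `announce` receives the spoken message (e.g. a text-to-speech callback).
    """
    deadline = clock() + timeout
    while clock() < deadline:
        for label, confidence in get_frame_detections():
            if label == target:
                announce(f"The {target} is in front of you.")
                return True  # object found: exit navigation
    announce(f"Could not find the {target}.")
    return False  # timed out

# Simulated run: the "camera" finds the cup on the second frame.
frames = iter([[("book", 0.7)], [("cup", 0.8)]])
messages = []
found = navigate(lambda: next(frames), "cup", messages.append)
print(found, messages)  # True ['The cup is in front of you.']
```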
This is the web application where object detection results are shown.
-
The web page displays a picture of the detected object.
-
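As an illustration of how detection results might be exposed to such a web page, here is a minimal stdlib-only HTTP endpoint that serves the latest result as JSON. The route (`/result`) and field names are assumptions, not the project's actual interface.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical "latest detection" state, updated by the detection loop.
latest_result = {"object": "cup", "found": True,
                 "message": "The cup is in front of you."}

class ResultHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/result":
            body = json.dumps(latest_result).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):  # silence per-request logging
        pass

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8000), ResultHandler).serve_forever()
```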
It tells you that the item you were looking for is in front of you.
We won 1st Prize at the 2020 LGSI Hackathon.