There are a lot of things we do in our lives that rely on our eyes, especially as programmers.
It came as a surprise to see a blind student in our classes. As much as companies and schools try to help people with disabilities, they have limited staff and resources.
When we learned about the City of Mississauga's Smart Cities initiative at PC Hacks, we thought of a way to make the project more accessible and inclusive.
We developed BlindSight, an assistive technology application that helps the visually impaired through live remote assistance on their mobile device.
Volunteers can help the visually impaired lead more independent daily lives. Routine tasks, like reading a medicine bottle label or navigating the commute to and from work, are much harder to accomplish with limited vision.
Core Features - object recognition, live video chat, gesture navigation, remote assistance
Swift 4 - iOS app
Core ML - object recognition (see the sketch after this list)
Twilio - video interface
Firebase - push notifications, server, database
Balsamiq - Wireframing for UI/UX
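To give a rough idea of how the object recognition piece fits together, here is a minimal Core ML + Vision sketch that classifies camera frames on-device. The MobileNetV2 model, the 0.6 confidence threshold, and the ObjectRecognizer/classify names are illustrative assumptions, not BlindSight's actual code.

```swift
import CoreML
import CoreVideo
import Vision

// Minimal sketch: classify each camera frame with a Core ML model via Vision.
// MobileNetV2 stands in for whatever .mlmodel the app actually bundles.
final class ObjectRecognizer {
    private lazy var request: VNCoreMLRequest? = {
        guard let model = try? VNCoreMLModel(for: MobileNetV2().model) else { return nil }
        let request = VNCoreMLRequest(model: model) { request, _ in
            guard let top = (request.results as? [VNClassificationObservation])?.first else { return }
            // Only surface a label when the model is reasonably confident,
            // so the user isn't read a stream of wrong guesses.
            if top.confidence > 0.6 {
                print("Detected \(top.identifier) (\(Int(top.confidence * 100))%)")
            }
        }
        request.imageCropAndScaleOption = .centerCrop
        return request
    }()

    // Called with each frame (CVPixelBuffer) coming off the AVCaptureSession.
    func classify(pixelBuffer: CVPixelBuffer) {
        guard let request = request else { return }
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .up)
        try? handler.perform([request])
    }
}
```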
Machine Learning - improving the accuracy of object recognition on the live camera feed
Server Connection - wiring the app up to Firebase for notifications and data
UI/UX Design - keeping the UI simple so a blind user's experience is as smooth as possible: limiting buttons to avoid accidental taps (the usual best practices don't apply to people with disabilities) and protecting privacy by disabling the front-facing camera (see the gesture sketch after this list)
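The gesture-first navigation this implies could look roughly like the sketch below: full-screen swipes instead of small buttons, with spoken announcements for non-visual feedback. The view controller, the requestHelp/cancelRequest methods, and the announcement wording are illustrative assumptions; UIAccessibility.post is the Swift 4.2+ spelling of the announcement API.

```swift
import UIKit

// Sketch of gesture-based navigation for the main screen:
// swipe right to request a volunteer, swipe left to cancel.
final class HomeViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // Full-screen swipe targets avoid the small buttons a low-vision user could miss.
        let swipeRight = UISwipeGestureRecognizer(target: self, action: #selector(requestHelp))
        swipeRight.direction = .right
        view.addGestureRecognizer(swipeRight)

        let swipeLeft = UISwipeGestureRecognizer(target: self, action: #selector(cancelRequest))
        swipeLeft.direction = .left
        view.addGestureRecognizer(swipeLeft)
    }

    @objc private func requestHelp() {
        // Spoken confirmation so the action is acknowledged without visuals.
        UIAccessibility.post(notification: .announcement, argument: "Calling a volunteer")
        // ... start the video session and notify volunteers ...
    }

    @objc private func cancelRequest() {
        UIAccessibility.post(notification: .announcement, argument: "Request cancelled")
    }
}
```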
Android App - to increase user base
Reward System - partnering with sponsors to reward helper milestones (hours volunteered, number of users helped, maintaining a high rating) and with schools to recognize community volunteering
Voice Commands - hands-free, user-friendly navigation of the app
Location Pairing - allowing a volunteer to select a specific user to help from a map