Mimicking the human eye was no easy task for Android developer Nikos. In this tech-for-good project we explore how object recognition can improve the lives of people living with visual impairments.
During my Innovation Time project, I spent a week developing an app that could mimic the human eye by recognising what's in front of a mobile device. I was inspired by our project with Vista, a charity for which I helped develop a vision-screening app that diagnosed vision problems in young children and their communities.
By Android developer Nikos Rapousis.
Limited accessibility for communities isn't something I can live with. That's why, to combat the challenges faced by people with visual disabilities and impairments, I wanted to explore how mobile devices and their cameras can be turned into a tool that mimics human actions.
Using object recognition technology, my app SIMI (pronounced See.Me) can not only recognise who and what is in front of you, but can also be controlled by voice, used as a source of information and act as a communications tool in case of emergencies.
Limited accessibility in mobile apps means that vulnerable communities often have to seek assistance for everyday tasks or are simply unable to do them. Providing support and care for people with sight loss is paramount so that they can overcome the barriers preventing them from doing all the things they’d like to do.
We wanted an app that could serve as a simple daily solution to problems faced by people with sight loss: making interactions easier, helping people see what's in front of them, even assisting them in getting from A to B. We set out to create a platform that would be easy to use and cater for all the daily needs of those living with impairments or disabilities.
Using the phone’s camera and computer vision, the app can detect and recognise what’s in front of it. From flower pots to bicycles and even people.
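In practice, a vision model returns many candidate detections per camera frame, and only the confident ones should be announced to the user. The sketch below is an illustrative assumption about how that filtering step might look, not SIMI's actual code; the `Detection` type and the threshold value are made up for the example.

```java
import java.util.ArrayList;
import java.util.List;

public class DetectionFilter {
    // A single labelled detection with a confidence score between 0.0 and 1.0.
    // (Hypothetical type, standing in for whatever the vision library returns.)
    public record Detection(String label, double confidence) {}

    // Keep only the labels of detections confident enough to announce aloud.
    public static List<String> labelsToAnnounce(List<Detection> detections,
                                                double threshold) {
        List<String> labels = new ArrayList<>();
        for (Detection d : detections) {
            if (d.confidence() >= threshold) {
                labels.add(d.label());
            }
        }
        return labels;
    }
}
```

Filtering on a confidence threshold keeps the app from reading out low-certainty guesses, which matters when the user cannot visually verify the result.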
Users can talk to the app to interact via voice and access all of SIMI's features with different commands, such as "call Mum and Dad" or "send them a picture of what I'm seeing".
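Once speech has been transcribed to text, a command like the ones above has to be mapped to an action. A minimal, rule-based sketch of that mapping is shown below; the action names and matching rules are illustrative assumptions, not SIMI's implementation.

```java
import java.util.Locale;

public class CommandParser {
    // Hypothetical set of actions the app could dispatch to.
    public enum Action { CALL, SEND_PICTURE, UNKNOWN }

    // Map a transcribed voice command to an action using simple keyword rules.
    public static Action parse(String transcript) {
        String t = transcript.toLowerCase(Locale.ROOT);
        if (t.startsWith("call ")) {
            return Action.CALL;
        }
        if (t.contains("send") && t.contains("picture")) {
            return Action.SEND_PICTURE;
        }
        return Action.UNKNOWN;
    }
}
```

A real assistant would likely use an intent-recognition service rather than keyword rules, but the shape of the problem, transcript in, action out, is the same.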
Contacts can be set up in the app to facilitate emergency calls. Whether it's the emergency services or Mum and Dad, users will always have the ability to contact someone important in worst-case scenarios.
The SIMI app can be opened and closed by simply shaking your mobile device. This optional setting, once turned on, makes interacting with the app as easy as possible.
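Shake detection on Android is typically done by comparing the accelerometer's magnitude against gravity. The snippet below sketches only that thresholding maths, under assumed values; wiring it to `SensorManager` and the actual threshold SIMI uses are outside what the article states.

```java
public class ShakeDetector {
    // Standard gravity in m/s^2; a deliberate shake registers well above this.
    private static final double GRAVITY = 9.81;

    // Returns true when the acceleration magnitude, expressed in g-forces,
    // exceeds the threshold (e.g. 2.5 g for a deliberate shake - an
    // illustrative value, not SIMI's actual setting).
    public static boolean isShake(double x, double y, double z,
                                  double thresholdG) {
        double gForce = Math.sqrt(x * x + y * y + z * z) / GRAVITY;
        return gForce > thresholdG;
    }
}
```

A phone at rest reads roughly 1 g on one axis, so a threshold comfortably above 1 avoids triggering the app while the device is simply being carried.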
In the future, we’d like to implement the ability for SIMI to read out words from an article or book. In order to make getting around easier, we’d want to implement accessibility details for various locations so that users can plan their trip without having to worry about accessibility when they get to their destination.
Last but not least, introducing the Lost & Found feature. Looking for something you've lost is already hard enough with full vision. Imagine what it's like for someone with a visual impairment! Giving SIMI the ability to search for an object in a room by asking "SIMI find the remote control" and scanning the room with your camera would make this task far easier and less daunting.
An interesting aspect of the tech is training models to recognise different objects: as long as you have a few hundred images to use as examples, pretty much anything can be recognised.
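Before training on a few hundred example images, the dataset is usually divided into a training set and a held-out validation set so the model can be evaluated on images it has never seen. A minimal sketch of that split, with an assumed 80/20 ratio, is below; it is illustrative, not SIMI's training pipeline.

```java
import java.util.ArrayList;
import java.util.List;

public class DatasetSplit {
    // Split labelled image paths into training and validation sets.
    // trainFraction is the share of images used for training (e.g. 0.8).
    public static List<List<String>> split(List<String> imagePaths,
                                           double trainFraction) {
        int cut = (int) Math.round(imagePaths.size() * trainFraction);
        List<String> train = new ArrayList<>(imagePaths.subList(0, cut));
        List<String> validation =
                new ArrayList<>(imagePaths.subList(cut, imagePaths.size()));
        return List.of(train, validation);
    }
}
```

In practice the list would be shuffled first so both sets cover similar lighting and angles; the split itself is the part that keeps the validation images honest.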
The sheer impact this technology can have has never been more apparent than in LUSH's Naked store in Milan, the world's first plastic-free cosmetics store. The store used an app we developed off the back of the SIMI project, harnessing the same object recognition technology.
The app replaced the need for packaging in the store: customers used mobile devices to scan products and get all the information they needed, including descriptions, ingredients, origins, health benefits and more. It's a huge impact that truly shows how the technology can be used for good. Have a challenge in mind? Object recognition could be the answer.