An object finder for the visually impaired
Vision loss or visual impairment profoundly affects a person's life. Visually impaired people lack the visual sense needed to perceive their surroundings precisely, and they often require assistance to carry out their daily activities. In this project, an assist...
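The abstract describes computing the direction of a detected object relative to the user's hand from their bounding boxes. A minimal sketch of that idea, assuming boxes are `(x_min, y_min, x_max, y_max)` tuples in image coordinates (this is illustrative, not the thesis code; the function names are hypothetical):

```python
# Sketch (not from the thesis): deriving a guidance direction for the
# user's hand from two bounding boxes in image coordinates, where the
# y-axis grows downward as is conventional for image data.

def box_center(box):
    """Return the (x, y) center point of a bounding box."""
    x_min, y_min, x_max, y_max = box
    return ((x_min + x_max) / 2, (y_min + y_max) / 2)

def relative_direction(hand_box, object_box):
    """Describe where the object lies relative to the hand."""
    hx, hy = box_center(hand_box)
    ox, oy = box_center(object_box)
    horizontal = "right" if ox > hx else "left"
    vertical = "down" if oy > hy else "up"  # image y-axis points down
    return f"move {horizontal} and {vertical}"

# Hand in the top-left corner, object toward the bottom-right:
print(relative_direction((0, 0, 50, 50), (100, 100, 150, 150)))
# → move right and down
```

In a real assistive application this textual direction would be fed to a speech synthesizer rather than printed.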
| Main Author: | Choo, Yong Quan |
|---|---|
| Format: | Final Year Project / Dissertation / Thesis |
| Published: | 2022 |
| Subjects: | T Technology (General) |
| Online Access: | http://eprints.utar.edu.my/4714/ http://eprints.utar.edu.my/4714/1/fyp_IA_2022_CYQ.pdf |
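The abstract also states that an overlap between the user's hand and an object triggers a pick-up notification. The steps can be sketched with a simple axis-aligned box intersection test (an assumption for illustration; the thesis may use a different overlap criterion, and `notify_if_reached` is a hypothetical helper):

```python
# Sketch (assumption, not the thesis implementation): trigger a
# notification once the hand's bounding box overlaps an object's box.
# Boxes are (x_min, y_min, x_max, y_max) tuples in image coordinates.

def boxes_overlap(a, b):
    """True if two axis-aligned bounding boxes share any area."""
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    return ax1 < bx2 and bx1 < ax2 and ay1 < by2 and by1 < ay2

def notify_if_reached(hand_box, object_box, label):
    """Return a notification message once the hand reaches the object."""
    if boxes_overlap(hand_box, object_box):
        return f"Your hand is on the {label}; you can pick it up."
    return None

print(notify_if_reached((40, 40, 80, 80), (60, 60, 120, 120), "cup"))
# → Your hand is on the cup; you can pick it up.
```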
| _version_ | 1848886222997422080 |
|---|---|
| author | Choo, Yong Quan |
| author_facet | Choo, Yong Quan |
| author_sort | Choo, Yong Quan |
| building | UTAR Institutional Repository |
| collection | Online Access |
| description | Vision loss or visual impairment profoundly affects a person's life. Visually impaired people lack the visual sense needed to perceive their surroundings precisely, and they often require assistance to carry out their daily activities. In this project, an assistive solution is proposed to help people with visual impairment identify and locate objects accurately enough to pick them up. The project aims to develop an object finder application that helps visually impaired people locate and identify common objects or daily essentials in an indoor environment. The system implements an object detection module built on a pre-trained YOLO model, which detects objects in a single pass of a convolutional neural network, enabling real-time detection in images and video. YOLO's high speed and accuracy give the system high responsiveness for its users. The system outputs detection results by labelling bounding boxes around the objects in the visual data, which allows it to compute, in the next stage, the direction of each object relative to the user's hand. To implement hand detection, a background frame is extracted from the video and used to perform background subtraction against subsequent frames. The system then computes the direction of the object relative to the user's hand. When the user's hand overlaps an object in the visual data, the system triggers a notification prompting the user to pick up the object. |
| first_indexed | 2025-11-15T19:35:04Z |
| format | Final Year Project / Dissertation / Thesis |
| id | utar-4714 |
| institution | Universiti Tunku Abdul Rahman |
| institution_category | Local University |
| last_indexed | 2025-11-15T19:35:04Z |
| publishDate | 2022 |
| recordtype | eprints |
| repository_type | Digital Repository |
| spelling | utar-47142023-01-11T09:19:57Z An object finder for the visually impaired Choo, Yong Quan T Technology (General) Vision loss or visual impairment profoundly affects a person's life. Visually impaired people lack the visual sense needed to perceive their surroundings precisely, and they often require assistance to carry out their daily activities. In this project, an assistive solution is proposed to help people with visual impairment identify and locate objects accurately enough to pick them up. The project aims to develop an object finder application that helps visually impaired people locate and identify common objects or daily essentials in an indoor environment. The system implements an object detection module built on a pre-trained YOLO model, which detects objects in a single pass of a convolutional neural network, enabling real-time detection in images and video. YOLO's high speed and accuracy give the system high responsiveness for its users. The system outputs detection results by labelling bounding boxes around the objects in the visual data, which allows it to compute, in the next stage, the direction of each object relative to the user's hand. To implement hand detection, a background frame is extracted from the video and used to perform background subtraction against subsequent frames. The system then computes the direction of the object relative to the user's hand. When the user's hand overlaps an object in the visual data, the system triggers a notification prompting the user to pick up the object. 2022-01 Final Year Project / Dissertation / Thesis NonPeerReviewed application/pdf http://eprints.utar.edu.my/4714/1/fyp_IA_2022_CYQ.pdf Choo, Yong Quan (2022) An object finder for the visually impaired. Final Year Project, UTAR. http://eprints.utar.edu.my/4714/ |
| spellingShingle | T Technology (General) Choo, Yong Quan An object finder for the visually impaired |
| title | An object finder for the visually impaired |
| title_full | An object finder for the visually impaired |
| title_fullStr | An object finder for the visually impaired |
| title_full_unstemmed | An object finder for the visually impaired |
| title_short | An object finder for the visually impaired |
| title_sort | object finder for the visually impaired |
| topic | T Technology (General) |
| url | http://eprints.utar.edu.my/4714/ http://eprints.utar.edu.my/4714/1/fyp_IA_2022_CYQ.pdf |
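The hand-detection step described in the abstract extracts a background frame and subtracts it from later frames. A minimal sketch of that technique on tiny grayscale frames represented as nested lists of 0-255 intensities (illustrative only; a real system would operate on camera frames, typically with a library such as OpenCV):

```python
# Sketch (illustrative, not the thesis implementation): per-pixel
# background subtraction. The absolute difference between a frame and
# the stored background, thresholded, yields a binary foreground mask
# marking regions where something (e.g. the user's hand) has entered.

def background_subtract(background, frame, threshold=30):
    """Return a binary mask: 1 where frame differs from background."""
    return [
        [1 if abs(f - b) > threshold else 0 for f, b in zip(frow, brow)]
        for frow, brow in zip(frame, background)
    ]

background = [[10, 10, 10],
              [10, 10, 10],
              [10, 10, 10]]
frame      = [[10, 10, 10],
              [10, 200, 200],  # a bright hand-like region enters
              [10, 200, 10]]

mask = background_subtract(background, frame)
print(mask)  # → [[0, 0, 0], [0, 1, 1], [0, 1, 0]]
```

The foreground mask's bounding region can then stand in for the hand box in the direction and overlap computations described in the abstract.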