The present invention relates to the field of indoor navigation, and more particularly to an intelligent indoor navigation system with a user interface adaptable for a user with one or multiple disabilities.
Aid systems for visually impaired or blind people are known in the prior art. People with multiple disabilities, including deafness, muteness, visual impairment, blindness, autism, cerebral palsy and other cognitive disorders, find it difficult to navigate independently in indoor public areas such as public transport, malls, hospitals and airports. The ability to comprehend and process information is greatly affected among people with cognitive disorders, whereas visual or auditory impairments lead to an inability to see or hear guidance for navigating within a closed environment. Studies of the social behavior of people with disabilities have found that they have a deep desire to be independent and move around on their own.
Aided indoor wayfinding encompasses all of the ways in which people orient themselves in physical space and navigate from place to place. When there is a well-designed and aided indoor wayfinding system, people are able to understand their environment, which provides users with a sense of control and reduces anxiety, fear and stress. Indoor wayfinding can be particularly challenging for some people with disabilities; for example, someone who is deaf or hard of hearing will rely on visual information but may not be able to hear someone providing directions.
Someone who is blind will not be able to see a directory, but if it is in a predictable location and presents information in a tactile format, they can read it, and they will also use sound and even smell to gather more information about their environment. It is therefore important to provide aided indoor wayfinding information in a variety of formats, such as visual, auditory and physical, to cater to the needs of those with multiple disabilities. Most people use several forms of information gathering to find their way to a destination, but this is especially important for people with disabilities.
For example, Canadian patent CA 2739555 provides a portable device similar to the white cane usually used by blind people, equipped with a GPS system so that the cane can help them navigate. However, users rely solely on the movement of the cane to detect obstacles and receive path guidance. This method is less accurate and is dependent on the cane's movement.
Another traditional navigation system is disclosed in U.S. Pat. No. 10,302,757 B2, which suggests an aid system and method for visually impaired users. Accordingly, a user is provided with a control device that identifies objects in order to determine a path, which is then sent to the user. This solution does offer guidance to blind people and people with visual impairment. However, users with other disabilities, such as deaf, mute and autistic users, as well as people with cerebral palsy and other cognitive disorders, cannot be guided properly by the aid system and method described in that prior art. Further, in this solution, the system and method for visually impaired users were limited to one indoor location.
However, none of the traditional methods disclose a navigation system that is customized based on the particular type of disability of a user, or a navigation system adapted for multiple disabilities.
The above information is presented as background information only to assist with an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present disclosure.
Therefore, it is an objective of the present invention to provide a navigation method and system for disabled persons that incorporates an Artificial Intelligence (AI) model with high accuracy, provides personalized real-time guidance and delivers an enhanced user experience.
The present invention involves an adaptive indoor navigation system for users with single or multiple disabilities comprising an adaptive user interface having a plurality of guidance modes, wherein a particular guidance mode is activated based on selection of a type of user interface by a user of the adaptive indoor navigation system, and wherein the adaptive indoor navigation system is customized based on a type of the disability.
In an embodiment of the present invention, the plurality of guidance modes comprises a text-based guidance mode, a voice-based guidance mode and an adaptive guidance mode that combines the text-based and voice-based guidance modes.
In another embodiment, the adaptive indoor navigation system functions based on Artificial Intelligence (AI) and Augmented Reality (AR) technologies.
In another embodiment, the user is provided guidance or navigation control through Global Positioning System (GPS) or a multiple-node indoor positioning system.
In another embodiment, an Indoor Web Facility Manager Application is used for real-time path planning.
In another embodiment, crowds and incidents are tracked for the real-time path planning.
In another embodiment, the Indoor Web Facility Manager application further comprises an Information Management module that triggers important alerts and notifications on a user's control device in case of emergency or urgent announcement.
In another embodiment, the system is operable online or offline.
In another embodiment, preplanning of a journey is conducted using pre-defined paths stored in a random access memory (RAM) of a device on which the indoor navigation system is running.
In another embodiment, real time sentiment and personality analysis is conducted for personalized AI guidance and user data collection.
In another embodiment, data training is performed based on user experiences and further includes sentiment analysis information.
In another embodiment, the system further comprises a plurality of connected information nodes positioned around an indoor space.
In another embodiment, the system provides personalized navigation with optimal path determination based on the type of disability of the user.
In another embodiment, the system is customized based on the type of disability of the user, and is capable of adapting to multiple disabilities.
In another embodiment, the system further comprises a control device and an optimal path navigation module, wherein the optimal path navigation module is configured to obtain destination information from the user, obtain a digital map based on the destination information, generate a path by applying an Artificial Intelligence (AI) model, present the path to the user in a format depending on the generated user interface, perform navigation for the user towards the destination by applying the AI model to assist with the navigation guidance, and obtain user feedback and input the user feedback into the AI model for future navigation, wherein the AI model is trained by a supervised machine learning system and an unsupervised machine learning system independently and simultaneously.
In another embodiment, the system further comprises a non-transitory computer readable storage medium for storing a program causing one or more processors to perform a method for Artificial Intelligence (AI)-assisted navigation in an indoor space for a user with single or multiple disabilities.
In another embodiment, pre-defined paths, maps and user information are stored within a database.
As another aspect of the present invention, a method for providing indoor navigation to a user with disabilities is disclosed, the method comprising obtaining user input data, generating and then activating a user interface type based on the user input data, determining destination information, generating a path by applying an Artificial Intelligence (AI) model, and presenting the path to the user in a format depending on the activated user interface type, wherein the indoor navigation is customized based on the type of disability of the user.
In an embodiment of the present invention, the user interface type is one of a graphic user interface, a voice user interface and an Adaptive User Interface that combines text-based and voice-based interfaces.
In another embodiment, the method for providing indoor navigation is interactive and capable of generating feedback for requesting real-time physical assistance for the user.
In an embodiment, the indoor navigation functions based on Artificial Intelligence (AI) and Augmented Reality (AR) technologies.
In an embodiment, the user is provided guidance or navigation control through a Global Positioning System (GPS) and/or multiple information nodes of an indoor positioning system.
In an embodiment, an Indoor Web Facility Manager Application is used for real-time path planning.
In an embodiment, crowds and incidents are tracked for the real-time path planning.
In an embodiment, the Indoor Web Facility Manager application further comprises an Information Management module that triggers important alerts and notifications on a user's control device in case of emergency or urgent announcement.
In an embodiment, the system is operable online or offline.
In an embodiment, preplanning of a journey is conducted using pre-defined paths stored in a random access memory (RAM) of a device on which the indoor navigation system is running.
In an embodiment, real-time sentiment and personality analysis is conducted for personalized AI guidance and user data collection.
In an embodiment, data training is performed based on user experiences and further includes sentiment analysis information.
In an embodiment, the system further comprises a plurality of connected information nodes positioned around an indoor space.
In an embodiment, the system provides personalized navigation with optimal path determination based on the type of disability of the user.
In an embodiment, the method further comprises the steps of obtaining destination information from the user, obtaining a digital map based on the destination information, generating a path by applying an Artificial Intelligence (AI) model, presenting the generated path to the user in a format depending on the generated user interface, performing navigation for the user towards the destination by applying the AI model to assist with the navigation guidance, and obtaining user feedback and inputting the user feedback into the AI model for future navigation, wherein the AI model is trained by a supervised machine learning system and an unsupervised machine learning system independently and simultaneously.
The subject matter that is regarded as the invention is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other aspects, features, and advantages of the invention are apparent from the following detailed description taken in conjunction with the accompanying drawings in which—
The aspects of an intelligent indoor navigation system with an adaptive user interface will be described in conjunction with the accompanying drawings.
The present invention relates to the field of an interactive and mobile aid application for people challenged with multiple disabilities such as deafness, muteness, visual impairment, blindness, autism, cerebral palsy and other cognitive disorders. Personalized navigation guidance for people with multiple disabilities, including cognitive disorders, visual impairment, blindness, auditory impairment and other physical disabilities, includes real-time positioning and indoor path planning when online or offline.
In a preferred embodiment of the present invention, the indoor navigation system includes an Adaptive User Interface, a Graphic User Interface and a Voice User Interface. Interactive Artificial Intelligence (AI) based navigation guidance is available for people with multiple disabilities to guide and navigate the user through multiple indoor locations. Another feature includes preplanning of a journey from home even before reaching the location.
In another embodiment, real-time access to transportation data (such as train schedules, bus schedules, events and weather) for accurate journey and path planning is included, along with sharing of the journey with contacts so that they can follow similar navigation guidance, connecting various indoor locations across different geographies using a unified communication path, and real-time sentiment and personality analysis for personalized AI guidance and user feedback collection within an indoor environment. Further, a web facility manager application tracks crowds for real-time path planning and immediate assistance, with a feedback mechanism to constantly enhance the user experience and personalize the user's journey.
In an embodiment of the present invention, people with multiple disabilities require an intelligent two-way interaction in which feedback is received and processed to provide personalized real-time guidance. The navigation system in accordance with the present invention supports orienting oneself, knowing one's destination, following the best route, recognizing one's destination and finding one's way back out. People with auditory, cognitive and/or visual impairments benefit from tactile information and require an interactive adaptive interface to facilitate wayfinding.
Based on the user's selection of VUI, GUI or AUI, guidance is activated to navigate the user via the most optimal path leading to the destination. Obstacles, both moving and non-moving, are detected and tracked to ensure the user's safety, as depicted in
Once the user input is finalized, the system calculates the optimal path from the selected starting point to the destination. This optimal path result differs per user, as it is directly connected with the utilities that customize the navigation. The utilities include the user's disability type, preferred language, biological data and enterprise data; all of this information is stored both in RAM and on a cloud server. Finally, all of the data is transferred to the user's control device when a nearby information node is detected. The information node triggers and pulls the user and navigation information.
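By way of a non-limiting illustration, the following Python sketch shows one possible way such customization utilities could be represented and handed to the control device when an information node is detected. All field names, identifiers and the in-memory dictionary standing in for the RAM/cloud storage are assumptions made for this example and are not part of the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    """Utilities used to customize navigation (illustrative fields only)."""
    user_id: str
    disability_types: list            # e.g. ["blindness"] or ["deaf", "cerebral_palsy"]
    preferred_language: str = "en"
    preferred_ui: str = "AUI"         # "VUI", "GUI" or "AUI"
    biological_data: dict = field(default_factory=dict)
    enterprise_data: dict = field(default_factory=dict)

def on_node_detected(node_id: str, profile: UserProfile, cloud_cache: dict) -> dict:
    """When a nearby information node is detected, pull the stored navigation
    context for this user (a plain dict stands in for the RAM/cloud storage)."""
    context = cloud_cache.get(profile.user_id, {})
    return {"node": node_id, "ui_mode": profile.preferred_ui, "context": context}

# Usage: a blind user who prefers voice guidance comes within range of node "N-12".
profile = UserProfile("u-001", ["blindness"], preferred_ui="VUI")
cache = {"u-001": {"destination": "Gate 7", "path_id": "P-102"}}
print(on_node_detected("N-12", profile, cache))
```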
In an embodiment, weightage is given to the paths based on obstacles found, tactile paving footpaths, defined paths and lifts (elevators) to calculate the most optimal path. For example, in
In the present disclosure, the control device can be a mobile device such as a smartphone, a tablet, an e-reader, a PDA or a portable music player.
Within the indoor space, a plurality of information nodes are arranged. In
A weightage is given respectively to each path among the various paths available between the user location and the required destination, based on a number of criteria comprising the presence or absence of obstacles detected within the path, the presence or absence of a tactile paving footpath within the path, the presence or absence of a defined path, and the presence or absence of a lift (elevator) within the path. The path weightage is taken into consideration to determine the most optimal path between the user location and the user destination as a function of the specific disability of the user. The optimal path for a user having a given type of disability (for example, blindness) may be different from the optimal path for a user having another type of disability (for example, mobility impairment).
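A minimal sketch of such disability-dependent path weighting is given below. The criteria weights, disability labels and candidate paths are invented values chosen only to show how the same set of paths can yield different optimal results for different disabilities.

```python
# Candidate paths are scored on the criteria named above; weights differ by
# disability, so the "optimal" path is user-specific. All weights are invented.
CRITERIA_WEIGHTS = {
    "obstacle_free":  {"blindness": 5.0, "mobility": 3.0},
    "tactile_paving": {"blindness": 4.0, "mobility": 0.5},
    "defined_path":   {"blindness": 2.0, "mobility": 2.0},
    "has_lift":       {"blindness": 1.0, "mobility": 5.0},
}

def path_score(path_features: dict, disability: str) -> float:
    """Sum the weights of every criterion that the candidate path satisfies."""
    return sum(CRITERIA_WEIGHTS[c][disability]
               for c, present in path_features.items() if present)

def optimal_path(candidates: dict, disability: str) -> str:
    """Return the candidate path with the highest disability-specific score."""
    return max(candidates, key=lambda p: path_score(candidates[p], disability))

candidates = {
    "path_102": {"obstacle_free": True, "tactile_paving": True,
                 "defined_path": True, "has_lift": False},
    "path_103": {"obstacle_free": True, "tactile_paving": False,
                 "defined_path": True, "has_lift": True},
}
print(optimal_path(candidates, "blindness"))   # path_102
print(optimal_path(candidates, "mobility"))    # path_103
```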
Reference numerals 102 and 103 illustrate two of the various paths that can be created for people with disabilities such as blindness, visual impairment and mobility impairment. Similar calculations are run for those with cognitive disabilities such as ASD, those with auditory disabilities, those with cerebral palsy and those with multiple combinations of disabilities. Reference numeral 104 represents the obstacles, both moving and non-moving, that are detected through the user's control device by scanning the surrounding indoor environment using real-time image processing. These obstacles are detected on a real-time basis to ensure users' safety while travelling.
An entire journey map is configured in the user's control device upon selection of a destination, and based on real-time incidents/scenarios and dynamic changes on the user's path, an optimal path is calculated on a real-time basis to provide guidance in the Voice User Interface (VUI), Graphic User Interface (GUI) or Adaptive User Interface (AUI) format based on user data and preference. This is achieved through the dynamic map generated via the Augmented Reality (AR) module, which relays information to the main algorithm that determines the final guidance output for the user.
The navigation system and method of the present disclosure, which will be described later, are assisted by utilizing an Artificial Intelligence (AI) model. The AI model is trained over time by the user to fully personalize the user experience through the control device based on the user's preferences and usage pattern. When tested with users in a real scenario, the navigation guidance resulted in the most accurate and optimal path planning even in the case of a changing map due to crowds or unplanned/sudden indoor floor plan modifications.
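The following illustrative sketch, with invented locations and a toy re-planning callback, indicates how such real-time recalculation and mode-dependent guidance could be structured; it is not the claimed implementation and all names are assumptions.

```python
def render_guidance(step: str, ui_mode: str) -> str:
    """Format one navigation instruction for the selected interface mode."""
    prefix = {"VUI": "[speak]", "GUI": "[display]", "AUI": "[speak+display]"}
    return f"{prefix.get(ui_mode, '[speak+display]')} {step}"

def navigate(initial_path, live_events, replan, ui_mode="AUI"):
    """Walk the planned path, replanning whenever a live event blocks a step,
    and emit each instruction in the user's chosen guidance mode."""
    path = list(initial_path)
    guidance = []
    while path:
        step = path.pop(0)
        blocked = next((e for e in live_events
                        if e["at"] == step and e.get("blocks_path")), None)
        if blocked:
            path = replan(step)                      # real-time recalculation
            guidance.append(render_guidance(f"rerouting around {step}", ui_mode))
            continue
        guidance.append(render_guidance(f"proceed to {step}", ui_mode))
    return guidance

# Usage: corridor C2 is blocked by a crowd, so the route is recalculated.
events = [{"at": "corridor C2", "blocks_path": True}]
reroute = lambda blocked: ["lift L1", "corridor C4", "Gate 7"]
print("\n".join(navigate(["corridor C2", "Gate 7"], events, reroute, "VUI")))
```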
Referring to
The user also has the ability to provide feedback about the experience of the completed journey. The experience can be stored on a cloud server, marked as a favorite and shared with a contact person of the user. The Artificial Intelligence model learns and trains with user data and provides a customized user experience at every instance of usage of the control device by the user. According to the embodiment of the present disclosure, real-time sentiment and personality analysis for personalized AI guidance and user feedback collection are achieved. To achieve this purpose, and based on the above-introduced algorithm and fundamental processes according to
One of the machine learning systems for training the AI model is a supervised machine learning system for sentiment analysis. For example, voice feedback from the user can be obtained and processed into a text file by utilizing a speech-to-text function. The text file of the user feedback is gathered as the input of the supervised machine learning system, and the output is a list of sentiment types, e.g. happy, worried, sad, etc. A learning algorithm is run on the gathered training set. The other machine learning system for training the AI model is an unsupervised machine learning system for optimal path enhancement. For example, the input is a new pixel map file obtained from the completed path of the user, and a training model runs through the user pixel map files. The neural network is not provided with desired outputs; the system itself must decide what features it will use to group the input data.
The above-introduced machine learning systems are independent and are performed simultaneously. By applying two independent machine learning systems simultaneously, the AI training is enriched, leading to a continuously improving user experience.
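Purely as an illustration, the sketch below pairs a small supervised sentiment classifier with an unsupervised grouping step. It uses the scikit-learn library, which the disclosure does not prescribe, and substitutes k-means clustering for the unsupervised grouping (the disclosure itself refers to a neural network); the feedback texts, labels and pixel-map data are invented.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# --- Supervised branch: sentiment of transcribed voice feedback ---
feedback_texts = ["the route was easy and I felt safe",
                  "I got lost near the lift and was worried",
                  "too crowded, very stressful today"]
sentiments = ["happy", "worried", "worried"]          # invented labels
sentiment_model = make_pipeline(TfidfVectorizer(), MultinomialNB())
sentiment_model.fit(feedback_texts, sentiments)
print(sentiment_model.predict(["I was worried about the crowd"]))

# --- Unsupervised branch: grouping completed-path "pixel maps" ---
# Each row is a flattened occupancy grid of one completed journey (invented data).
pixel_maps = np.array([[1, 1, 0, 0, 1, 0],
                       [1, 1, 0, 0, 1, 1],
                       [0, 0, 1, 1, 0, 0],
                       [0, 0, 1, 1, 0, 1]])
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pixel_maps)
print(clusters)   # no desired outputs are given; grouping is decided by the model
```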
In an embodiment, the user is reassured when following the correct path and is guided in case of deviation from the selected path. If the user is lost, a recalibrated path is suggested, but if the user is unable to follow the guidance, the indoor facility manager is notified to provide on-ground assistance through any of the preferred modes of communication (VUI, GUI or AUI). The user can also call for assistance at any given point in time. This alert for help is linked to the Web Indoor Facility Manager Application described in
The indoor navigation is tracked across multiple indoor locations on the user's path across multiple indoor geographies. In the previously disclosed U.S. Pat. No. 10,302,757 B2 by an Italian company, the aided guidance for the visually impaired was limited to one indoor location, whereas in the present invention the user is able to navigate successfully across multiple indoor locations spread across different geographies. Each indoor journey undertaken by the user is tracked and recorded for future reference. Alerts are sent in the preferred format, namely VUI (Voice User Interface), GUI (Graphic User Interface) or AUI (Adaptive User Interface), based on the user's selected method of communication. The UI can be changed by the user at any time based on preference or journey type.
User GPS and compass values are passed through the information framework in alignment with the triggers from the information node to present the user with the most accurate navigation information and guidance based on the user's real-time position and location. This information is delivered using Artificial Intelligence (AI) through the graphic, voice or adaptive user interface modes. As shown at 204, the user, upon arrival at the final destination, is alerted and guided to exit the indoor premises. The user feedback is collected at this stage to constantly enhance and personalize the navigation guidance.
Further, multiple locations, paths or maps can be saved as favorites by the user for future reference. The user can also share these maps or favorited locations with contacts in the user's phonebook, or manually input a contact. Users with similar disabilities can also view suggested maps, routes and navigation guidance for their reference. When the user shares the map with a phonebook contact, a notification is sent to the contact on their control device (305). In 306, the recipient can then access the notification to access and activate (start using) the map. The map can be stored with the recipient's profile information on the control device, and the user can delete this information if the user does not want the application to record this data. The user can also share the route, map and/or navigation guidance with other phonebook contacts, or manually input a contact. The Artificial Intelligence algorithm learns and trains with user data and provides a customized user experience at every instance of usage of the control device by the user.
In
Selection of training data is based on previously recorded instances, data sets, navigation paths, the sentiment of conversations, etc., to provide personalized AI (Artificial Intelligence) guidance to the user on a real-time basis. This guidance is a two-way communication that allows the user to have a conversation with the navigation platform for instructions, feedback and specific assistance (403). Decision and classification are conducted by implementing an optimal path based on a calculation approach that is effective in the case of a large number of training examples stored within the framework. These examples are classified by user preference types, journey types and disabilities to render optimal paths and calculations when the input is parsed and delivered via the Artificial Intelligence (AI), Augmented Reality (AR) and Global Positioning System (GPS) data (404).
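The snippet below illustrates, with invented records, how stored examples classified by preference type, journey type and disability could be matched to a new query in a simple nearest-neighbour fashion; the similarity measure and the example data are assumptions made only for this sketch.

```python
# Each stored example pairs a (preference, journey type, disability) profile
# with the path that previously worked well for that profile (invented records).
STORED_EXAMPLES = [
    ({"ui": "VUI", "journey": "airport", "disability": "blindness"}, "tactile_route_A"),
    ({"ui": "GUI", "journey": "airport", "disability": "deaf"},      "visual_route_B"),
    ({"ui": "AUI", "journey": "mall",    "disability": "mobility"},  "lift_route_C"),
]

def match_count(query: dict, example: dict) -> int:
    """Simple similarity: number of attributes the two profiles share."""
    return sum(query.get(k) == v for k, v in example.items())

def suggest_path(query: dict) -> str:
    """Return the path of the most similar stored example (1-nearest neighbour)."""
    _, best_path = max(STORED_EXAMPLES,
                       key=lambda ex: match_count(query, ex[0]))
    return best_path

print(suggest_path({"ui": "VUI", "journey": "airport", "disability": "blindness"}))
```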
The outcome is delivered in the desired User Interface (UI), namely VUI (Voice User Interface), GUI (Graphic User Interface) or AUI (Adaptive User Interface), based on user selection and preference. The user also has the ability to provide feedback about the experience. The experience can be stored (
Real-time heat maps are generated to track active users and communicate with them when necessary. This allows the Indoor Web Facility Manager to take important or critical actions to assist people with disabilities during their indoor navigation journey. Crowd management, by altering the navigation paths, can be performed on a real-time basis to ensure uninterrupted indoor navigation guidance (503). The Information Node sensor manager enables the Indoor Web Facility Manager to track the Information Node sensors located across the indoor premises. It tracks whether the Information Nodes are fully or partially functional and placed at the correct positions within the indoor floor plan (504).
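An illustrative sketch of such heat-map generation follows; the grid size, crowding threshold and user positions are invented values rather than parameters of the disclosure.

```python
from collections import Counter

def heat_map(user_positions, grid_size=5.0):
    """Bucket live user positions (x, y in metres) into grid cells and count
    how many users fall in each cell; dense cells indicate crowding."""
    return Counter((int(x // grid_size), int(y // grid_size))
                   for x, y in user_positions)

def crowded_cells(cells, threshold=3):
    """Cells whose occupancy reaches the threshold can trigger path re-routing."""
    return [cell for cell, count in cells.items() if count >= threshold]

positions = [(2, 3), (4, 1), (3, 4), (21, 18), (6, 7), (2, 2)]
cells = heat_map(positions)
print(cells)
print(crowded_cells(cells))   # e.g. [(0, 0)] when several users share one cell
```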
The Emergency Help function connects the user's control device to the Indoor Web Facility Manager's application view by relaying messages in case the user needs urgent assistance. It also allows the Indoor Web Facility Manager application to send messages to troubleshoot the given situation and to guide the user to the nearest safe point where assistance can be made available (505). Control device communication is linked to the Indoor Web Facility Manager's application for instant and real-time communication. The real-time communication is AI (Artificial Intelligence) driven to provide immediate and relevant voice, text or graphic-based assistance to the user until manual assistance is provided (if required). In case the AI (Artificial Intelligence) guidance is able to resolve and troubleshoot the issue, the Indoor Web Facility Manager is notified (506).
Connecting with the Augmented Reality (AR) session: once the Information Nodes are connected, they are linked to the Augmented Reality (AR) session to ensure real-time map simulation on the control device and to better scan the user's environment. In addition, the session also monitors moving and non-moving obstacles and alerts the user in case of a crowd that is likely to clog the user's path to the selected destination (603). Then, the map of the selected destination is downloaded from the cloud onto the user's control device, and the real-time data from the Information Nodes and the Augmented Reality (AR) session are combined to provide integrated, optimized, personalized and intelligent navigation guidance. The data is validated and referenced back against the Global Positioning System (GPS) and compass data from the user's control device to ensure the highest level of accuracy and error-free navigation guidance.
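The fragment below sketches, purely by way of example, how an information-node position fix might be blended with GPS data and how a compass heading could be validated against the expected heading; the blending weight and tolerance are illustrative assumptions, not values taken from the disclosure.

```python
def fuse_position(node_fix, gps_fix, node_weight=0.8):
    """Blend the coarse GPS fix with the nearby information-node fix, trusting
    the node more indoors (the weight is illustrative only)."""
    return tuple(node_weight * n + (1 - node_weight) * g
                 for n, g in zip(node_fix, gps_fix))

def validate_heading(compass_deg, expected_deg, tolerance=25.0):
    """Flag the guidance step as off-course if the user's compass heading
    deviates from the expected heading by more than the tolerance."""
    diff = abs((compass_deg - expected_deg + 180) % 360 - 180)
    return diff <= tolerance

print(fuse_position((12.4, 30.1), (13.0, 29.0)))
print(validate_heading(350.0, 10.0))   # True: only 20 degrees apart
```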
At every stage of the completion of the journey within the user's path, a notification is sent assuring the user that the path is correctly being followed until the user reaches the final destination based on the prior selection submitted by the user (611). Once the journey is successfully completed the user is prompted to select the next destination or terminate the navigation guidance (612).
In
Once the user input is finalized, the system calculates the optimal path from the selected starting point to the destination. (Refer
The indoor map matching process is carried out on the cloud platform based on the destination input by the user, and connects with the Information Node and the Augmented Reality (AR) session to trigger real-time guidance (707). Finally, all of the data is transferred to the user's control device when a nearby information node is detected. The information node triggers and pulls the user and navigation information (708).
Many changes, modifications, variations and other uses and applications of the subject invention will become apparent to those skilled in the art after considering this specification and the accompanying drawings, which disclose the preferred embodiments thereof. All such changes, modifications, variations and other uses and applications, which do not depart from the spirit and scope of the invention, are deemed to be covered by the invention, which is to be limited only by the claims which follow.
This patent application claims priority from U.S. Provisional Patent Application No. 62/884,282, filed Aug. 8, 2019, which is herein incorporated by reference in its entirety.