The invention is in the field of devices for training and assessing navigation performance without the use of sight, through auditory or other sensory cues, in blind users or under non-visual conditions in sighted users. To date, full-scale navigational training and assessment in the blind has typically been done in real physical space containing real objects. On the other hand, virtual navigation has usually been done through vision (either full vision, as in the sighted, or residual vision, such as in the visual navigation system of Turano et al., 2001, for training low vision subjects with retinitis pigmentosa).
Chebat, Maidenbaum and Amedi (2015, Navigation Using Sensory Substitution in Real and Virtual Mazes, PLoS ONE 10(6): e0126307, doi:10.1371/journal.pone.0126307) developed a handheld device, termed the EyeCane, that reads out the distance to real objects, but it requires moving and rearranging physical objects to define the layout of the navigational path. It is not, therefore, a virtual navigation system.
Seki & Ito (2003), among others, used acoustic signaling to generate a three-dimensional auditory (acoustic) virtual reality in the form of sound sources defined by auditory signals equivalent to those produced by sound sources located in 3D space relative to the navigator. None of these studies used sound signals encoding the parameter of distance from the navigator in directions defined by the orientation of a hand-held device, or did so in free space to an accuracy greater than provided by the satellite-guided GPS system.
U.S. Pat. No. 8,886,462 (Systems, methods, and software for providing wayfinding orientation and wayfinding data to blind travelers) uses input identifying a starting landmark and an ending landmark in a particular selected geographic region, then searches a database for the corresponding narrative wayfinding instructions and outputs them as text or audio to guide a blind pedestrian from the starting landmark to the ending landmark. Such narrative instructions, however, provide no feedback about whether the traveler is on or off the designated trajectory. The instructions could update based on the user's location as determined from GPS information, but GPS can be grossly inaccurate on city streets and is generally unreliable or unavailable for indoor wayfinding.
US20150212712A1 (Advanced navigation techniques for portable devices) describes a hand-held, cane-like wayfinding device controlled by a virtual reality system to provide haptic (tactile-kinesthetic and vibratory) feedback about the nature of virtual objects in a virtual environment surrounding the user. While the description includes the concept of coupling the haptic information with realistic virtual auditory information about the cane contacting virtual objects, the claims make no mention of auditory signals. This publication provides a comprehensive review of related intellectual property, but none of it provides direct auditory feedback for virtual environments in the form of a coded distance signal. Only the EyeCane of Chebat, Maidenbaum and Amedi (2015) provides an auditory coded distance signal, but that signal is derived from a real environment, not from a virtual environment based on local acquisition of the position of the hand-held device.
The invention generally comprises a navigation system for the blind that uses auditory or other non-visual information, such as vibration or force feedback, to generate a virtual environment in empty real space (hereinafter VEERS) for entirely non-visual navigation. The core concept is that the location of an individual person, or other animal or device, is tracked in real indoor or outdoor space by electronic signals from a virtual reality system. Information about the position and other descriptors, such as orientation, of the virtual objects is fed back to the individual by auditory or other non-visual sensory means, allowing assessment of their ability to non-visually navigate a space inhabited by objects. An auditory signaling code conveys the distance of objects in the virtual space relative to the user's current position, for example so that the user can avoid contact with the objects. The direction of the virtual objects is conveyed by the orientation of the hand-held transponder. To be useful to a walking pedestrian, the system must have an accuracy much greater than that provided by satellite-based GPS signals, especially for indoor training applications, where GPS errors can span many meters. The system of the invention readily overcomes this challenge and meets the accuracy required for a walking pedestrian.
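For concreteness, the following minimal sketch shows one way the core readout could be computed: given a tracked 2D position and transponder heading, the distance along the pointing direction to the nearest virtual wall segment is found by ray casting. The 2D wall-segment representation of the virtual layout and all names here are illustrative assumptions, not a specification of the actual implementation.

```python
import math

def ray_segment_distance(p, d, a, b):
    """Distance along a ray (origin p, unit direction d) to segment a-b,
    or None if the ray misses the segment."""
    (px, py), (dx, dy) = p, d
    (ax, ay), (bx, by) = a, b
    ex, ey = bx - ax, by - ay                      # segment direction
    denom = dx * ey - dy * ex
    if abs(denom) < 1e-12:                         # ray parallel to segment
        return None
    t = ((ax - px) * ey - (ay - py) * ex) / denom  # distance along the ray
    s = ((ax - px) * dy - (ay - py) * dx) / denom  # fraction along the segment
    return t if t >= 0.0 and 0.0 <= s <= 1.0 else None

def nearest_virtual_distance(p, heading_rad, walls, max_range=10.0):
    """Nearest virtual obstacle along the transponder's pointing direction.

    `walls` is a list of ((x1, y1), (x2, y2)) segments defining the virtual
    layout; max_range caps the readout (an assumed design choice)."""
    d = (math.cos(heading_rad), math.sin(heading_rad))
    hits = (ray_segment_distance(p, d, a, b) for a, b in walls)
    return min((h for h in hits if h is not None), default=max_range)

# Example: a virtual wall 2 m ahead of a user at the origin facing "north"
walls = [((-1.0, 2.0), (1.0, 2.0))]
print(nearest_virtual_distance((0.0, 0.0), math.pi / 2, walls))  # -> 2.0
```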
The effectiveness of the navigation system derives from the user's ability to encode a mental map of the route to be navigated. Armed with this mental map, users can employ the distance signal from the device to determine their location and orientation in space relative to the features of the map. The system is thus designed to train users in applying such mental-map transformations for effective navigation, based on their spatial memory of the layout of the navigable routes in the area they are navigating.
In a typical embodiment, an auditory signal emitted by a hand-held orientable transponder varies in some parameter(s), such as pitch and/or intensity, in proportion to the straight-line distance from the transponder along its line of orientation to the nearest virtual object, with far-to-near distance encoded by low-to-high pitch, for example. Thus, as long as the distance signal remains low or is absent, the user knows that they can proceed in that direction without impediment by a virtual object; if the pitch increases, an object is close and the user should search for an open path by reorienting the transponder in different directions. The transponder's location and orientation are provided by ultra-wideband (UWB) radio technology, with base stations receiving signals from the moving transponder so that its position and orientation in 3D space can be identified with sufficient accuracy to guide the user at each step.
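Below is a least-squares multilateration sketch of how the transponder position could be recovered from UWB range measurements to fixed base stations. The anchor layout, the use of numpy, and the linearization are assumptions for illustration; the text does not specify the solver.

```python
import numpy as np

def trilaterate(anchors, ranges):
    """Estimate a 3D position from measured distances to fixed UWB base
    stations. Subtracting the first anchor's range equation from the rest
    cancels the quadratic term, leaving a linear system A x = b solved by
    least squares; at least four non-coplanar anchors give a unique 3D fix."""
    anchors = np.asarray(anchors, dtype=float)    # shape (n, 3)
    ranges = np.asarray(ranges, dtype=float)      # shape (n,)
    a0, r0 = anchors[0], ranges[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Example: four ceiling-mounted anchors, transponder actually at (2, 3, 1)
anchors = [(0, 0, 3), (10, 0, 3), (0, 10, 3), (10, 10, 0)]
true_pos = np.array([2.0, 3.0, 1.0])
ranges = [np.linalg.norm(true_pos - np.array(a)) for a in anchors]
print(trilaterate(anchors, ranges))   # ~ [2. 3. 1.]
```

Orientation could be obtained analogously, e.g., from two UWB tags mounted at either end of the hand-held device; that detail is omitted from the sketch.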
The VEERS system is risk-free for indoor navigation because the user is navigating within an empty space populated with only virtual objects.
As no physical objects or partitions need to be built or reconfigured, moved or rearranged, VEERS allows high flexibility for setting up multiple virtual environments, varying their complexity, reconfiguring them for repeated testing, training, etc.
The VEERS system is particularly useful for training both blind and blindfolded sighted individuals in spatial orientation skills, such as planning and following navigation directions from a memorized spatial map, or making further decisions, such as recasting the planned route if an unexpected obstacle is encountered, together with core spatial cognition capabilities such as spatial memory and spatiomotor coordination.
With regard to virtual space, the VEERS system will provide an accurate positional and speed readout of the user navigating a virtual layout in an empty space. (Note that previous work along these lines does not have these benefits; the EyeCane, the handheld device of Chebat, Maidenbaum and Amedi (2015), reads out the distance (up to 5 m) to real objects, but it cannot read out distances to virtual objects, provides no readout of the user's position in space, and requires moving and rearranging physical objects to form the navigational paths being tested.)
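A minimal sketch of how the speed readout could be derived from the stream of timestamped position fixes follows; the window length and the path-length-over-time definition of speed are assumed smoothing choices, not taken from the text.

```python
from collections import deque
import math

class SpeedEstimator:
    """Walking-speed readout from timestamped 2D position fixes.

    Averages over a short window to smooth per-fix ranging noise; the
    window length is an assumption, tuned to the tracker's update rate."""

    def __init__(self, window=10):
        self.fixes = deque(maxlen=window)   # (t, x, y) samples

    def update(self, t, x, y):
        """Add a fix and return the smoothed speed in m/s (0.0 until
        at least two fixes are available)."""
        self.fixes.append((t, x, y))
        if len(self.fixes) < 2:
            return 0.0
        pts = list(self.fixes)
        path = sum(math.hypot(x2 - x1, y2 - y1)
                   for (_, x1, y1), (_, x2, y2) in zip(pts, pts[1:]))
        elapsed = pts[-1][0] - pts[0][0]
        return path / elapsed if elapsed > 0 else 0.0

# Example: fixes 0.5 s apart, each 0.6 m along the x-axis -> 1.2 m/s
est = SpeedEstimator()
for i in range(5):
    speed = est.update(0.5 * i, 0.6 * i, 0.0)
print(speed)  # -> 1.2
```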
With regard to real space, various other aspects can provide other advantages, such as using an optical distance meter in a real physical environment with the same form of distance coding, to provide a blind or visually impaired person with a flexible distance readout in every direction from their current location (according to the direction in which the handheld device is pointing). This approach can provide effective navigation for regions that are poorly served by smartphone-mediated GPS, such as indoor areas, outdoor city areas where the signal is distorted by tall buildings, or regions lacking cellphone service.
The directional distance signal to the virtual objects can be the pitch of a tone, with a low tone for distant objects, increasing in frequency (and urgency) as the virtual object or obstacle is approached. Other forms of obstacle readout could be utilized, or included together with the distance encoding, such as knocking sounds when a wall is encountered. The transponder could indicate the distance of obstacles in the direction of its orientation by a tactile readout, such as the intensity of its vibration, or by an auditory verbal readout of the distance to the object. Auditory parameters such as frequency, amplitude, pulse, and rhythm may be used to convey details about the virtual environment, and each parameter may have an established correspondence with a respective physical aspect, such as the size and relative motion of the virtual objects. The user scans the virtual transponder beam across the space to obtain a wide-angle sense of the arrangement of objects in the virtual scene. (Note that this system does not require recreating the ambient sounds of a real environment, which is the approach taken by prior auditory virtual reality systems, or overlaying auditory identifiers for specific target locations, which is the approach taken by prior augmented reality systems.)
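One hypothetical realization of this coding is sketched below: distance maps to tone frequency on a logarithmic (per-octave) scale, with a knock event at virtual contact. The specific range limits, frequencies, and contact threshold are assumptions; the text specifies only that pitch rises as an obstacle nears.

```python
import math

def distance_to_pitch(distance, d_near=0.3, d_far=8.0,
                      f_low=220.0, f_high=1760.0):
    """Map distance (m) to tone frequency (Hz): far -> low, near -> high.
    Logarithmic interpolation spreads the three octaves between f_low and
    f_high evenly over the sensed range (all constants are assumed)."""
    d = min(max(distance, d_near), d_far)
    k = (math.log(d_far) - math.log(d)) / (math.log(d_far) - math.log(d_near))
    return f_low * (f_high / f_low) ** k

def auditory_cue(distance, contact_threshold=0.25):
    """Select the auditory event: a knock at virtual contact, otherwise a
    tone whose pitch encodes distance (threshold value is an assumption)."""
    if distance <= contact_threshold:
        return ("knock", None)
    return ("tone", distance_to_pitch(distance))

print(auditory_cue(8.0))   # ('tone', 220.0)   - distant: low pitch
print(auditory_cue(0.5))   # ('tone', ~1273.5) - close: high pitch
print(auditory_cue(0.1))   # ('knock', None)   - virtual wall contact
```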
The system may encode aspects of the structure of the object, such as its height and width, and include that information in the non-visual (auditory, tactile, or verbal) readout. A further embodiment could encode the size of objects as a second variable, such as sound or vibrational intensity, or as a verbal specification of the width and height of the obstacle.
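A minimal sketch of such a second variable, mapping obstacle size linearly to signal amplitude, follows; the linear mapping and the 4 m reference span are illustrative choices, not specified in the text.

```python
def size_to_amplitude(width_m, height_m, a_min=0.2, a_max=1.0, span_m=4.0):
    """Encode obstacle size as a normalized signal amplitude (0..1 scale),
    so larger obstacles sound or vibrate more intensely. The linear mapping
    and the 4 m reference span are assumptions for illustration."""
    size = max(width_m, height_m)
    k = min(size / span_m, 1.0)
    return a_min + (a_max - a_min) * k

print(size_to_amplitude(0.8, 2.0))  # doorway-sized obstacle -> 0.6
```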
The basic training function of the invention is to allow the user to build up and verify a mental map of the auditory virtual reality space by navigating through it guided by the auditory distance cue. The accuracy of the mental map can be improved by practice in minimizing the errors and maximizing the speed of navigation through the virtual paths defined in the map. This enhanced capability of deploying mental mapping by blind users, or non-visually in sighted users, can be of general use when navigating real physical environments guided by the usual means of the long cane and general auditory cues. It may be appreciated that the training activities take place in an open space that is generally free of obstacles and obstructions, so that the training routines avoid collisions and impacts with objects and are generally safe.
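As one illustration of how such practice could be quantified (the text does not prescribe a metric), a training run could be scored by the mean deviation of the walked track from the planned route, alongside completion time; all names and the scoring formula below are assumptions.

```python
import math

def _point_segment_distance(p, a, b):
    """Distance from point p to the closest point on segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0.0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def mean_route_deviation(track, route):
    """Mean distance (m) of tracked positions from the planned polyline
    route; an illustrative assessment score, not one specified in the text."""
    return sum(
        min(_point_segment_distance(p, a, b) for a, b in zip(route, route[1:]))
        for p in track
    ) / len(track)

# Example: a run hugging a straight 10 m route with ~0.2 m wobble
route = [(0.0, 0.0), (10.0, 0.0)]
track = [(1.0, 0.2), (4.0, -0.2), (7.0, 0.1)]
print(round(mean_route_deviation(track, route), 3))  # -> 0.167
```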
The training procedure utilizing the VEERS system may include initial training on tactile maps through hand exploration and memorization of the navigational layout, such as the Likova Cognitive-Kinesthetic Training (see U.S. Pat. No. 10,722,150, issued Jul. 28, 2020), subsequently transferred to the full-scale VEERS navigational layout for extended training and assessment of non-visual navigational capabilities. As the VEERS system can serve both assessment and navigational training, it has the potential to become a useful tool in the practice of orientation and mobility (O&amp;M) instructors.
The auditory virtual reality can be further extended with coding of the signals by other modalities such as tactile vibration or verbal instructions.
This application claims the priority date benefit of U.S. Provisional Application 63/208,704, filed Jun. 9, 2021.