Step-based guidance system

Abstract
A device for providing spatial information to a user. The device includes a camera configured to detect image data. The device also includes an accelerometer configured to determine step data. The device also includes a processor connected to the camera and the accelerometer and configured to determine a distance travelled per step of the user based on the image data and the step data. The processor is also configured to determine a distance to a reference point based on the image data. The processor is also configured to determine a number of steps corresponding to the distance to the reference point based on the distance travelled per step of the user. The device also includes an output unit connected to the processor and configured to output the spatial information indicating the number of steps corresponding to the distance to the reference point.
Description
BACKGROUND

1. Field


The present disclosure relates to automatic and dynamic adjustment of information provided by a device, and more particularly to a system and a method for generating and dynamically adjusting spatial information for blind users.


2. Description of the Related Art


Navigation systems are capable of providing navigation instructions to a user based on a current location and a desired destination. Typically, these navigation systems are used in vehicles for providing driving directions. These navigation systems commonly utilize Global Positioning System (GPS) technology for estimating the current location of the vehicle.


More recently, portable navigation systems have been integrated into mobile devices, such as smartphones. Users can now use these portable navigation systems when riding a bicycle, walking, or otherwise proceeding along a route at a slower speed relative to a vehicle. These portable navigation systems, like their vehicle-based counterparts, use GPS technology for estimating a current location of the navigation system. Navigation systems may provide turning instructions as a user approaches a turn, and output the instructions in terms of standardized units of length, such as feet, meters, or yards.


However, individuals having certain disabilities, such as blindness, may not be able to accurately gauge distance in terms of standard measurements, such as feet, meters, or yards. In order for these individuals to gain the most benefit from a navigation system, the navigation system should output the instruction in a unit more intuitive to the user. Furthermore, navigation systems solely using GPS technology may not be as effective indoors, as the margin for error with GPS is too high to provide location information at a sufficiently accurate level. Therefore, navigation systems using solely GPS technology may not be usable or optimal for disabled users, particularly disabled users indoors.


Thus, there is a need for systems and methods for providing more intuitive, more accurate navigation and spatial information to users.


SUMMARY

What is described is a system for providing spatial information to a user. The system includes a camera configured to detect image data. The system also includes an accelerometer configured to determine step data. The system also includes a processor connected to the camera and the accelerometer. The processor is configured to determine a distance travelled per step of the user based on the image data and the step data. The processor is also configured to determine a distance to a reference point based on the image data. The processor is also configured to determine a number of steps corresponding to the distance to the reference point based on the distance travelled per step of the user. The system also includes an output unit connected to the processor. The output unit is configured to output the spatial information indicating the number of steps corresponding to the distance to the reference point.


Also described is a device for providing spatial information to a user. The device includes a camera configured to detect image data. The device includes an accelerometer configured to determine step data and a memory configured to store step distance data for establishing a baseline distance travelled per step. The device also includes a processor connected to the camera and the accelerometer. The processor is configured to determine a distance to a reference point based on the image data. The processor is also configured to determine a number of steps corresponding to the distance to the reference point based on the baseline distance travelled per step. The processor is also configured to determine a distance travelled per step of the user based on the image data and the step data. The processor is also configured to determine an updated distance to the reference point. The processor is also configured to determine an updated number of steps corresponding to the updated distance to the reference point based on the distance travelled per step of the user. The device also includes an output unit connected to the processor. The output unit is configured to output the spatial information indicating the number of steps corresponding to the distance to the reference point. The output unit is also configured to output the updated number of steps corresponding to the updated distance to the reference point.


Also described is a method for providing spatial information to a user. The method includes detecting, by a camera, image data and determining, by an accelerometer, step data. The method includes determining, by a processor, a distance travelled per step of the user based on the image data and the step data. The method includes determining, by the processor, a distance to a reference point based on the image data. The method also includes determining, by the processor, a number of steps corresponding to the distance to the reference point based on the distance travelled per step of the user. The method also includes outputting, by an output unit, the spatial information indicating the number of steps corresponding to the distance to the reference point.





BRIEF DESCRIPTION OF THE DRAWINGS

Other systems, methods, features, and advantages of the present invention will be or will become apparent to one of ordinary skill in the art upon examination of the following figures and detailed description. It is intended that all such additional systems, methods, features, and advantages be included within this description, be within the scope of the present invention, and be protected by the accompanying claims. Component parts shown in the drawings are not necessarily to scale, and may be exaggerated to better illustrate the important features of the present invention. In the drawings, like reference numerals designate like parts throughout the different views, wherein:



FIG. 1 illustrates an exemplary use of a system for providing spatial information to a user according to an embodiment of the present invention;



FIG. 2 illustrates an exemplary use of a system for providing spatial information to a user according to an embodiment of the present invention;



FIG. 3 is a block diagram of components of a system for providing spatial information to a user according to an embodiment of the present invention;



FIG. 4 illustrates a method for determining a number of steps corresponding to a distance to a reference point based on a distance travelled per step of a user according to an embodiment of the present invention;



FIG. 5 illustrates a method for dynamically adjusting a number of steps corresponding to a distance to a reference point based on a changing distance travelled per step of a user according to an embodiment of the present invention; and



FIG. 6 illustrates an exemplary database for storing a baseline step distance for various users and for various environments according to an embodiment of the present invention.





DETAILED DESCRIPTION

Disclosed herein are systems and methods for providing spatial information to a user. The systems and methods provide several benefits and advantages, such as providing a more intuitive indication of distance between a user and a reference point by using steps instead of traditional units of measurement for distance. These benefits and advantages are particularly significant for disabled individuals, who may have a harder time gauging a given distance. These benefits are achieved by outputting spatial information unique to the user in terms of the user's steps and distance travelled per step. Determining a number of steps to a reference point based on the user's distance travelled per step (e.g., pace, or stride) provides benefits and advantages such as the ability to output information to a user based on individual characteristics of the user. This is advantageous because different users walk with different stride lengths and speeds, so a number of steps determined for one user may not be accurate for another user. For example, a first user who is 5 feet 6 inches tall may have a stride length (e.g., distance travelled per step, or pace) of 2.3 feet and a second user who is 6 feet 10 inches tall may have a stride length of 2.8 feet. If a point of reference is 50 feet away, the number of steps for the first user is 22 steps and the number of steps for the second user is 18 steps. Providing the number of steps determined for the first user to the second user may result in a high level of inaccuracy based on the difference in stride lengths. As such, more personal and more accurate information is provided by using a number of steps tailored to the user.
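As a minimal illustration of the per-user computation described above (the rounding rule and function name are assumptions for this sketch, not taken from the disclosure), the number of steps follows from dividing the distance by the stride length and rounding up:

```python
import math

def steps_to_reference(distance_ft, stride_ft):
    """Convert a distance to a whole number of steps for a given stride length."""
    return math.ceil(distance_ft / stride_ft)

# Worked example from the paragraph above: a point of reference 50 feet away.
print(steps_to_reference(50.0, 2.3))  # 22 steps for the first user
print(steps_to_reference(50.0, 2.8))  # 18 steps for the second user
```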


Automatically adjusting or calibrating the spatial information based on the user's distance travelled per step provides additional benefits and advantages, such as allowing the number of steps to be travelled to change as characteristics of the user or the user's environment change. The systems and methods provide further benefits and advantages in that the information is tailored to various walking environments of the user, such as an incline, a decline, a crowded environment, or an empty environment, and is also tailored to the particular speed of the user, such as brisk walking, leisurely walking, jogging, or running, further allowing accurate information to be communicated to the user. For example, a user may be 50 feet away from a point of reference, walking leisurely, and the system provides spatial information indicating the point of reference is 22 steps away. However, after 10 steps, the user may encounter a situation causing the user to slow down, such as an inclined walkway or a crowd of people. The system may automatically adjust the number of steps and provide updated spatial information indicating the user is now 15 steps away from the point of reference, rather than the 12 steps that would remain if the original distance travelled per step were still used.


An exemplary system includes a camera capable of detecting image data corresponding to an environment of a user. The system further includes an accelerometer that is capable of determining step data of a user, such as when the user has taken a step. The system further includes a processor connected to the camera and the accelerometer. The processor is capable of determining a distance travelled per step of the user based on the image data and the step data. The processor is also capable of determining a distance to a reference point based on the image data. The processor is also capable of determining a number of steps corresponding to the distance to the reference point based on the pace of the user. The system further includes an output unit connected to the processor and configured to output the spatial information indicating the number of steps corresponding to the distance to the reference point.


With reference now to FIG. 1, an exemplary system for providing spatial information to a user is illustrated. In many embodiments, spatial information refers to information regarding an environment around the user, including but not limited to, an absolute location of objects and destinations (e.g., object is 30 feet away, location is 50 steps away), a relative location of objects and destinations (e.g., object is 10 feet farther away than another object), or states of the environment (e.g., crowded, empty, incline, decline).


In FIG. 1, a user 102 of a device 104 is walking along a path 106, such as a sidewalk. The device 104 is illustrated as a wearable device resembling a necklace, but other devices, such as a wearable smart watch or a smartphone, may be used. As the user 102 is walking, the device 104 determines a distance travelled by the user 102 per step of the user 102. The device 104 may determine the distance travelled per step by determining when the user 102 takes a step, and how far the user 102 has travelled between steps. In some embodiments, the device 104 determines when the user 102 has taken a step using an accelerometer, and the device 104 determines a distance the user 102 has travelled between steps using a camera and/or stereo cameras. In some embodiments, the device 104 uses image data from the camera and/or stereo cameras combined with location data from a map of the environment to determine the distance the user has travelled between steps.
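One way the stride computation might be organized is sketched below; `position_at` is a hypothetical stand-in for whatever camera- and map-based localization the device 104 performs, and is not a function named in the disclosure:

```python
def estimate_stride(step_timestamps, position_at):
    """Estimate distance travelled per step, in feet.

    step_timestamps: times at which the accelerometer reported a step.
    position_at: callable returning an (x, y) position in feet, derived from
        camera image data (and optionally map data) at a given time.
    """
    if len(step_timestamps) < 2:
        return None  # not enough steps to measure a displacement
    distances = []
    for t0, t1 in zip(step_timestamps, step_timestamps[1:]):
        x0, y0 = position_at(t0)
        x1, y1 = position_at(t1)
        distances.append(((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5)
    return sum(distances) / len(distances)
```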


The device 104 identifies a reference point 108. In some embodiments, the reference point 108 is dynamically chosen based on the surroundings of the user. In some embodiments, the system may recognize a location based on the image data detected by the camera and provide spatial information to the user associated with the location. For example, a user may frequently patronize a chain of coffee shops, and when the device 104 recognizes a location associated with the chain of coffee shops, the device 104 may provide spatial information to the user, such as “Coffee Shop X is 30 steps away to your right.”


In some embodiments, the reference point 108 is a destination being navigated to and is specified by the user. In some embodiments, the reference point 108 is a checkpoint, an intermediate location or a landmark relative to a destination being navigated to.


Once the device 104 identifies the reference point 108, the device 104 determines a distance 110 to the reference point 108. The device 104 may determine the distance 110 to the reference point 108 using one or more of an inertial measurement unit (IMU), a camera, stereo cameras offset by a stereo distance, and information associated with the environment, such as a map.
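For the stereo-camera case, one common way to recover the distance 110 is the standard pinhole-stereo relation between disparity and depth; this sketch is illustrative and is not a formula recited in the disclosure:

```python
def depth_from_disparity(focal_px, baseline_ft, disparity_px):
    """Standard stereo relation: depth = focal length x baseline / disparity.

    focal_px: focal length in pixels; baseline_ft: the stereo distance between
    the two cameras; disparity_px: pixel offset of the reference point between
    the left and right views.
    """
    if disparity_px <= 0:
        raise ValueError("reference point must appear in both views with positive disparity")
    return focal_px * baseline_ft / disparity_px

# e.g., 1000 px focal length, 0.3 ft stereo distance, 6 px disparity -> 50 ft
print(depth_from_disparity(1000.0, 0.3, 6.0))
```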


The device 104 determines a number of steps to the reference point 108 based on the determined distance travelled per step and the determined distance 110 to the reference point 108. The device 104 provides, to the user 102, an output 114 of the number of steps to the reference point 108. In some embodiments, the output 114 is an audio output. In some embodiments, the audio output is communicated through a speaker 318 on the device 104. In some embodiments, the audio output includes an identification of the reference point 108 and the number of steps to the reference point 108. For example, if the distance 110 to the reference point 108 is 100 feet, and the user travels 2.5 feet per step, the output 114 may be an audio output of “Store X approaching in 40 steps to your right.” In some embodiments, the output is a tactile output. In some embodiments, the device 104 includes a vibration unit 320 and may communicate information to the user 102 using varying lengths and combinations of vibration tones.


By providing the output in terms of number of steps, the device 104 provides more intuitive, more useful information to the user 102, especially if the user 102 is disabled or otherwise unable to accurately determine distances.


The user 102 may travel a distance 112 toward the reference point 108. Upon traveling this distance 112, the device 104 may provide an updated number of steps to the reference point 108. In some embodiments, the device 104 determines an updated number of steps by decrementing the number of steps communicated previously in the output 114 by a number of steps taken by the user in the distance 112 travelled toward the reference point 108. In some embodiments, the device 104 determines an updated number of steps by determining an updated distance travelled per step by the user 102 and determining an updated distance 118 to the reference point 108, and using those values to determine the updated number of steps to the reference point 108.
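The two update strategies in this paragraph can be stated compactly; the sketch below is illustrative only, and the example numbers simply continue the 50-foot scenario discussed earlier:

```python
import math

def decrement_update(previous_steps, steps_taken):
    """First strategy: subtract the steps already taken from the last announced count."""
    return max(previous_steps - steps_taken, 0)

def recompute_update(updated_distance_ft, updated_stride_ft):
    """Second strategy: recompute from an updated distance 118 and an updated stride."""
    return math.ceil(updated_distance_ft / updated_stride_ft)

print(decrement_update(22, 10))     # 12 steps remain under the original stride
print(recompute_update(27.0, 1.8))  # 15 steps remain after the user slows down
```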


Upon determining the updated number of steps to the reference point 108, the device 104 provides an updated output 116 to the user with the updated number of steps to the reference point 108. The updated output 116 may be provided in the same manner or a different manner as described with respect to the output 114. For example, the output 114 may be an audio output and when the user 102 is within a threshold distance of the reference point 108, the updated output 116 may be a vibration indicating the user 102 is a number of steps away from the reference point 108, such as five vibration pulses indicating the user 102 is five steps away. In another example, the output 114 and the updated output 116 may be communicated in the same manner, for purposes of consistency. Output preferences may be determined by the user 102 and may be changed by the user at any time.
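A sketch of the kind of output selection described above; the five-step threshold and the message wording are assumptions chosen to match the example, not requirements of the disclosure:

```python
def build_output(steps_remaining, reference_name, vibration_threshold=5):
    """Return an (output type, payload) pair for the output unit."""
    if steps_remaining <= vibration_threshold:
        # One vibration pulse per remaining step, e.g. five pulses for five steps.
        return ("vibration", [1] * steps_remaining)
    return ("audio", f"{reference_name} approaching in {steps_remaining} steps.")

print(build_output(40, "Store X"))  # ('audio', 'Store X approaching in 40 steps.')
print(build_output(5, "Store X"))   # ('vibration', [1, 1, 1, 1, 1])
```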


The device 104 may update the number of steps to the reference point 108 periodically. In some embodiments, the device 104 updates the number of steps to the reference point 108 every five minutes, every minute, every 30 seconds, etc. The device 104 may also update the number of steps to the reference point 108 when a step distance adjustment trigger is detected. In some embodiments, a step distance adjustment trigger is a change in speed that exceeds a threshold change in speed, as detected by an accelerometer 312 or an IMU 324. In some embodiments, a step distance adjustment trigger is a change in the environment, such as an inclined walking surface, a declined walking surface, or an increase in presence of other people or things around the user 102, as detected by a camera 308, stored map data, or an IMU 324.


The device 104 may determine the distance travelled by the user 102 per step based on one step, or may wait to collect a threshold number of samples of distance travelled per step before making a determination of the distance travelled by the user 102 per step. The device 104 may provide an initial estimate of the number of steps based on the step distance data stored in a memory 304. The step distance data may include a baseline distance travelled per step, for purposes of providing an initial estimate. In some embodiments, the stored step distance data is based on historical data associated with the user 102. In some embodiments, the stored step distance data is based on historical data of many users, and associated with the location. In some embodiments, the memory 304 is local to the device 104. In some embodiments, the memory is remotely located, such as on a cloud-based storage, and may be accessed by the device 104 remotely, such as via Wi-Fi or via a cellular radio network. The distance travelled per step may initially be provided by the user 102 or may initially be determined from attributes of the user 102, such as height, weight, and gender.
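When no measured samples exist yet, a baseline could be seeded from attributes of the user; the height-based proportionality factor below is a common rule of thumb used here only as an assumed illustration, not a value from the disclosure:

```python
def baseline_stride_from_height(height_ft, factor=0.42):
    """Rough baseline: stride length approximated as a fixed fraction of height."""
    return height_ft * factor

# A 5 ft 6 in user yields roughly a 2.3-foot baseline stride under this assumption.
print(round(baseline_stride_from_height(5.5), 2))  # 2.31
```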


In some embodiments, the device 104 may discard or ignore outliers. For example, if the user 102 stumbles, pauses, or takes small steps to negotiate an obstacle, such as litter on the ground, the distance travelled by the user per step may be skewed or inaccurate. The device 104 may also discard distance data associated with climbing up and down stairs, as the distance travelled per step while climbing up and down stairs may not be representative of the user's distance travelled per step for flat (or substantially flat) surfaces. The device 104 may detect climbing up or down stairs based on a combination of accelerometer data, camera image data, IMU data and/or map data associated with the environment of the user 102. In some embodiments, if stairs are between the user 102 and a reference point, the number of steps in the stairs may be presented to the user in the output. The number of steps in the stairs may be determined by the camera or by map data associated with the environment.
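Outlier handling of the kind described could be a simple robust filter over the per-step distances; the median-absolute-deviation cutoff below is an assumed choice, not the disclosed method:

```python
import statistics

def filter_step_distances(distances, k=3.0):
    """Discard per-step distances far from the median (stumbles, shuffles, stairs)."""
    if len(distances) < 3:
        return list(distances)
    med = statistics.median(distances)
    mad = statistics.median(abs(d - med) for d in distances)
    if mad == 0:
        return list(distances)
    return [d for d in distances if abs(d - med) <= k * mad]

print(filter_step_distances([2.4, 2.5, 2.6, 0.4, 2.5, 2.45]))  # drops the 0.4 ft stumble
```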


With reference now to FIG. 2, another exemplary system for providing spatial information to a user is illustrated. A user 202 of a device 104 is receiving navigation instructions from the device 104 in order to get to the destination 212. The user 202 is initially at location 205 and receives an output 206 from the device 104 regarding the number of steps to reach a reference point 207, whereupon the user 202 will turn 90 degrees to the user's right-hand side. The output 206 may also include information regarding a number of steps to a related point of reference 214 by determining a distance between the user's 202 initial location 205 and the related point of reference 214. In an example embodiment, a user 202 on the way to a book store 212 may be interested in a library 214 along the way. The device 104 may determine the point of reference 214 is a library based on the image data from a camera of the device 104, or from the location data associated with the user's 202 geographic location, and then may determine it is a related point of reference based on common attributes between the destination 212 and the point of reference 214. In some embodiments, a point of reference is a related point of reference if it is a location the user frequently identifies as a destination to be navigated to.


The device 104 may determine the number of steps from a location 205 to a reference point 207 by determining a distance travelled per step and determining a distance 216 to the reference point 207, as described herein. The device 104 may determine a distance per step value based on the user's distance travelled per step for the session or may determine the distance per step value based on a distance per step value stored in a memory 304.


Once the user 202 has reached the reference point 207, the device 104 may provide an output 208 that notifies the user 202 to perform the next step in the directions, such as turning 90 degrees to the right-hand side. Once the user 202 has turned 90 degrees to the right-hand side, the device 104 provides an output 210 indicating a number of steps corresponding to a new or remaining distance 218 to the destination 212. In some embodiments, the device 104 determines that the user has performed a turn using an IMU 324 or a gyroscope. The device 104 determines the number of steps corresponding to the distance 218 to the destination 212 based on the distance travelled per step of the user 202 and the distance 218 to the destination, each determined as described herein.


In one implementation and with reference to FIG. 3, a device 104 includes a processor 302, connected to a memory 304, a sensor array 306, an output unit 316, and a transceiver 322.


The processor 302 may be a computer processor such as an ARM processor, DSP processor, distributed processor, microprocessor, controller, or other processing device. The processor 302 may be located in the device 104, may be a remote processor or it may be a pairing of a local and a remote processor.


The memory 304 may be one or any combination of the following: a RAM or other volatile or nonvolatile memory, a non-transitory memory or a data storage device, such as a hard disk drive, a solid state disk drive, a hybrid disk drive or other appropriate data storage. The memory 304 may further store machine-readable instructions which may be loaded into or stored in the memory 304 and executed by the processor 302. As with the processor 302, the memory 304 may be positioned on the device 104, may be positioned remote from the device 104 or may be a pairing of a local and a remote memory. The memory 304 may also store step distance data and information associated with the environment, such as map data.


The sensor array 306 includes a camera 308, stereo cameras 310, an accelerometer 312, a sensor 314, a GPS unit 326, and an IMU 324. The stereo cameras 310 may be a stereo camera pair including two cameras offset by a stereo distance, and configured to detect image data to be used by the processor 302 for determining a distance to an object. The stereo cameras 310 may be used instead of or in conjunction with the camera 308 to detect image data. The sensor 314 may be one or more sensors which provide further information about the environment in conjunction with the rest of the sensor array 306 such as one or more of a temperature sensor, an air pressure sensor, a moisture or humidity sensor, a gas detector or other chemical sensor, a sound sensor, a pH sensor, a smoke detector, an altimeter, a depth gauge, a compass, a motion detector, a light sensor, or other sensor. The GPS unit 326 may be used to determine a geographical location. As is described herein, locations determined using the GPS unit 326 may not provide enough accuracy to be a basis for providing step numbers, but may be accurate enough to determine a location, such as a particular mall or a particular office building. The IMU 324 may include the accelerometer 312 or may be a separate device.


The output unit 316 includes a speaker 318 and a vibration unit 320. The speaker 318 may be one or more speakers or other devices capable of producing sounds and/or vibrations. The vibration unit 320 may be one or more vibration motors or actuators capable of providing haptic and tactile output.


The transceiver 322 can be a receiver and/or a transmitter configured to receive and transmit data from a remote data storage or other device. The transceiver 322 may include an antenna capable of transmitting and receiving wireless communications. For example, the antenna may be a Bluetooth or Wi-Fi antenna, a cellular radio antenna, a radio frequency identification (RFID) antenna or reader and/or a near field communication (NFC) unit.


With reference now to FIG. 4, a method 400 may be used by a device such as the device 104 for providing spatial information to a user.


The image data is detected by the camera 308 and/or the stereo cameras 310 of the device 104 (step 402). In some embodiments, the image data includes data regarding the environment of the device 104. The step data is detected by the accelerometer 312 of the device 104 (step 404). In some embodiments, the step data includes an indication of when the user took a step.
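Step detection from accelerometer samples is commonly framed as a peak-over-threshold problem; the threshold and minimum-spacing values below are assumptions for illustration only:

```python
def detect_steps(accel_magnitudes, sample_rate_hz, threshold=11.0, min_gap_s=0.3):
    """Return sample indices at which a step is counted.

    accel_magnitudes: acceleration magnitude (m/s^2) per sample. A step is
    counted when the magnitude rises above the threshold, with a minimum
    spacing between counted steps to avoid double counting a single footfall.
    """
    steps, last_index = [], None
    min_gap_samples = min_gap_s * sample_rate_hz
    for i, a in enumerate(accel_magnitudes):
        if a > threshold and (last_index is None or i - last_index >= min_gap_samples):
            steps.append(i)
            last_index = i
    return steps
```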


A distance travelled per step is determined by the processor 302 (step 406). The processor 302 determines the distance travelled per step based on the image data and the step data by comparing the image data when steps are taken to determine a change in distance travelled between steps. For example, the user may be walking through a shopping mall and the image data may include a series of images of stores and objects near the user, with each image taken when the user takes a step. The processor 302 compares consecutive images within the series of images to determine a change in distance travelled by the user. Since the images were taken when the user took a step, the determined change in distance between consecutive images provides a distance travelled per step. In some embodiments, map data associated with the location of the user that is stored in the memory 304 is also used to determine the distance travelled between steps by comparing the image data to the map data.


A distance to a reference point is determined by the processor 302 (step 408). The distance to the reference point is determined based on the image data. In some embodiments, the map data is also used to determine the distance to the reference point. A number of steps corresponding to the distance to the reference point is determined by the processor 302 (step 410). The number of steps corresponding to the distance to the reference point is determined using the distance travelled per step and the distance to the reference point.


The spatial information is output to the user, including the determined number of steps to the reference point (step 412). As described herein, the output of the spatial information indicating the number of steps may be an audio output provided by the speaker 318, or it may be a series of vibrations provided by the vibration unit 320.


With reference now to FIG. 5, a method 500 may be used by a processor, such as the processor 302 for providing updated spatial information to the user.


A baseline distance travelled per step is determined by the processor 302 (step 502). In some embodiments, the baseline distance travelled per step is determined based on historical step distance data associated with the user. In some embodiments, the baseline distance travelled per step is determined based on historical step distance data associated with the location of the user. For example, the distance travelled per step of all users in a particular mall may be aggregated and the mean or median may be calculated in order to determine a baseline distance travelled per step for a user at the particular mall.
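The aggregation described for the baseline could look like the following sketch, where the historical records are assumed to be plain per-step distances in feet:

```python
import statistics

def baseline_for_location(step_distances_ft, use_median=True):
    """Aggregate per-step distances of users observed at a location into a baseline."""
    if not step_distances_ft:
        raise ValueError("no historical step distance data for this location")
    return (statistics.median(step_distances_ft) if use_median
            else statistics.mean(step_distances_ft))

print(baseline_for_location([2.4, 2.7, 2.6, 2.9, 2.5]))  # 2.6 ft median for the mall
```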


A distance to the reference point is determined by the processor 302 (step 504). As described herein, a camera 308 and/or stereo cameras 310 may be used to determine the distance to the reference point. In some embodiments, the reference point is a place or an object recognized by the device 104 based on a comparison of stored images and image data detected by the camera 308 and/or the stereo cameras 310. In some embodiments, the reference point is a location identified by the user.


A number of steps corresponding to the distance to the reference point is determined by the processor 302 based on the determined baseline distance travelled per step and the determined distance to the reference point (step 506). The number of steps is output by the output unit 316 (step 508). As described herein, the output may be an audio output provided by a speaker 318 or a tactile output provided by a vibration unit 320.


The device 104 collects image data and step data using the camera 308 and the accelerometer 312, respectively (step 509). It is determined whether enough data has been collected to determine a distance travelled per step (step 510). In some embodiments, the processor 302 determines whether a threshold number of data points of distance travelled per step have been collected. In some embodiments, the threshold number is predetermined by the user or is a value associated with the device 104.


When enough data has been collected, a distance travelled per step is determined by the processor 302 (step 512). The distance travelled per step is determined based on the image data and the step data. A distance to the reference point is determined based on the image data (step 514). In some embodiments, the distance determined in step 504 is different than the distance determined in step 514, as the user may have moved closer to the reference point or farther away.


An updated number of steps to the reference point is determined by the processor 302 (step 516). The number of steps is determined based on the distance travelled per step determined in step 512 and the updated distance to the reference point determined in step 514. The output unit 316 provides an output including the updated number of steps to the reference point, determined in step 516 (step 518).


It is determined whether a step distance adjustment trigger is detected (step 520). In some embodiments, the step distance adjustment trigger is detected by the processor 302. In some embodiments, the processor 302 compares data received from one or more elements of the sensor array 306 with a list or table of step distance adjustment triggers. When there is a match, the processor 302 determines a step distance adjustment trigger is detected. For example, the accelerometer 312 may provide device acceleration data to the processor 302. The device acceleration data is compared to a list or table including an acceleration threshold, and when the device acceleration data exceeds the acceleration threshold, the step distance adjustment trigger is detected, as an increase in acceleration may indicate that the user of the device 104 has begun moving faster (e.g., walking at a faster rate, jogging, or running) and the distance between steps may have increased accordingly. Conversely, a deceleration may be detected, and when the deceleration exceeds a deceleration threshold, the step distance adjustment trigger is detected, as deceleration may indicate that the user of the device 104 has slowed down (e.g., going from running to jogging, running to walking, or from a brisk walk to a slow walk) and the distance between steps may have decreased accordingly.
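The threshold comparison in step 520 might reduce to something like the sketch below; the numeric thresholds are assumptions, not values from the disclosure:

```python
def step_distance_trigger(speed_change_ft_s, accel_threshold=1.5, decel_threshold=-1.5):
    """Return True when a change in travelling speed warrants recalibration.

    Positive changes (the user speeds up, e.g. begins jogging) are compared
    against accel_threshold; negative changes (the user slows down) against
    decel_threshold.
    """
    return speed_change_ft_s >= accel_threshold or speed_change_ft_s <= decel_threshold

print(step_distance_trigger(2.0))   # True: acceleration exceeds the threshold
print(step_distance_trigger(-0.5))  # False: minor slowdown, no recalibration needed
```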


When the step distance adjustment trigger is detected by the processor 302, the process proceeds to step 512, where the distance travelled per step is updated (step 512), the distance to the reference point is updated (step 514), the number of steps corresponding to the updated distance to the reference point is updated (step 516), and the output is provided indicating the updated number of steps (step 518).


When the step distance adjustment trigger is not detected, it is determined whether a time to update the number of steps is reached (step 522). In some embodiments, the number of steps to the reference point is periodically updated and the updated number of steps is provided to the user to keep the user apprised as to the user's progress toward the reference point. In some embodiments, the frequency by which the number of steps to the reference point is updated is determined by the user.


When the time to update the number of steps is reached, the process proceeds to step 512, where the distance travelled per step is updated (step 512), the distance to the reference point is updated (step 514), the number of steps corresponding to the updated distance to the reference point is updated (step 516), and the output is provided indicating the updated number of steps (step 518).


With reference now to FIG. 6, a database 600 associates step distances for a general user 602 and a particular user 604 with various locations 606 and environment categories 608. The database 600 may be stored in a memory of a device 104 and accessible by a processor using a method similar to the methods 400 and 500. The database 600 may also be stored in a remote location and accessible via the transceiver 322.


Each of the various locations 606 and environment categories 608 may be distinguished based on various characteristics. The locations 606 may correspond to particular geographical locations, such as a particular mall (Mall X), a particular subway station (Subway Station X), or a particular office building (Office Building X). The locations 606 may also correspond to general locations such as a supermarket. The locations 606 may be identified using geographic coordinates or may be identified based on image data detected by camera 308 and/or stereo cameras 310. For example, the image data may capture a name of a subway station or a series of store names in a mall, allowing the device 104 to identify a particular location. The locations 606 may also be identified using GPS data received by the GPS unit 326.


The environment categories 608 may correspond to conditions, such as whether the device is indoors or outdoors or whether the environment is crowded or empty. In some embodiments, the environment category 608 to apply is determined based on the image data from the camera 308. For example, the camera 308 and/or the stereo cameras 310 may detect the image data indicating the user is indoors or in a crowded environment. In some embodiments, the sensor data from the sensor 314 is used to determine the environment category to apply.


As described herein, the location 606 and the environment categories 608 may have associated step distance data for a general user 602 and a particular user 604. The data for the general user 602 may be aggregated and determined based on the step distance data for all users in the particular location or the environment category. In contrast, the data for the particular user 604 is based on the step distance data for the particular user only.


In some embodiments, the device 104 will determine whether stored step distance data is available for the particular user 604 for the current location and/or the current type of environment. When step distance data for the particular user 604 is available, the user's step distance data is used to establish a baseline distance travelled per step for the particular user 604. When step distance data for the particular user 604 in the current location and the current type of environment is unavailable, the general user's 602 step distance data is used to establish the baseline distance travelled per step for the user. In an example embodiment, when the device 104 detects that the particular user 604 is at the Mall X, and a baseline distance travelled per step is to be determined, 2.6 feet per step is used as the distance travelled per step, for purposes of determining the number of steps to reference points. When the device 104 detects that the particular user 604 is at Subway Station X, there is no stored step distance data for the particular user 604 at Subway Station X, so 2.7 feet per step is used, which corresponds to the stored step distance data for the general user 602.


In some embodiments, more than one of the locations 606 and environment categories 608 may apply, such as a crowded indoor mall. The corresponding stored step distance data for the applicable conditions may be averaged, or the median may be used. In an example embodiment, if the particular user 604 is at Mall X and it is crowded, the values of 2.6 and 1.9 may be averaged to determine a baseline distance travelled per step for the user.
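A sketch of the lookup and averaging behavior described for the database 600; the dictionary layout, helper name, and general-user values are illustrative assumptions that mirror the examples discussed above:

```python
import statistics

# Per-step distances in feet keyed by (category, user type); None means no stored value.
STEP_DB = {
    ("Mall X", "particular"): 2.6, ("Mall X", "general"): 2.7,
    ("Subway Station X", "particular"): None, ("Subway Station X", "general"): 2.7,
    ("Crowded", "particular"): 1.9, ("Crowded", "general"): 2.0,
}

def baseline_from_db(categories):
    """Average the applicable entries, falling back to the general user per category."""
    values = []
    for category in categories:
        value = STEP_DB.get((category, "particular")) or STEP_DB.get((category, "general"))
        if value is not None:
            values.append(value)
    return statistics.mean(values) if values else None

print(baseline_from_db(["Subway Station X"]))   # 2.7: falls back to the general user
print(baseline_from_db(["Mall X", "Crowded"]))  # 2.25: average of 2.6 and 1.9
```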


The step distance data may be updated based on determined distance travelled per step while the user is within a particular category. For example, if the database 600 is used, and the device 104 detects the particular user 604 is at Subway Station X and it is crowded, and the particular user 604 averages 2.5 feet per step while the user is at Subway Station X, then the values associated with the particular user 604 for Subway Station X and for a crowded environment may be added or modified. In the example database 600, there is no value for the particular user 604 at Subway Station X, so 2.5 is stored in the corresponding entry. For the particular user 604 in crowded environments, a value of 1.9 is currently stored, but it may be modified by the 2.5 feet per step the particular user 604 averaged in this session.
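Updating a stored entry with a new session average could be a simple blend; the weighting scheme below is an assumption, not the disclosed update rule:

```python
def update_stored_stride(stored, session_avg, weight=0.3):
    """Blend a session's average per-step distance into a stored value.

    If no value is stored yet (as for the particular user at Subway Station X),
    the session average is stored directly; otherwise the stored value is
    nudged toward the new measurement.
    """
    if stored is None:
        return session_avg
    return (1.0 - weight) * stored + weight * session_avg

print(update_stored_stride(None, 2.5))           # 2.5 stored for Subway Station X
print(round(update_stored_stride(1.9, 2.5), 2))  # crowded-environment entry nudged to 2.08
```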


In some embodiments, additional environment categories 608 and locations 606 may be added. For example, if the user goes to a location not listed in the locations 606 more than a threshold number of times, the particular location may be added to the list of locations 606.


In some embodiments, when there is no value associated with the particular user 604 for a location 606 or environment category 608, an average distance travelled per step of the particular user 604 for all locations and environments is used. In some embodiments, the average distance travelled per step of the particular user 604 for all locations and environments is averaged with the general user 602 data for the corresponding location 606 and/or environment category 608.


Exemplary embodiments of the methods/systems have been disclosed in an illustrative style. Accordingly, the terminology employed throughout should be read in a non-limiting manner. Although minor modifications to the teachings herein will occur to those well versed in the art, it shall be understood that what is intended to be circumscribed within the scope of the patent warranted hereon are all such embodiments that reasonably fall within the scope of the advancement to the art hereby contributed, and that that scope shall not be restricted, except in light of the appended claims and their equivalents.

Claims
  • 1. A system for providing, to a user, a navigation route to a reference point, the system comprising: a camera configured to detect image data;an accelerometer configured to determine step data;a processor connected to the camera and the accelerometer, the processor configured to: determine a distance travelled per step of the user based on the image data and the step data,determine a distance to the reference point based on initial image data,determine an initial number of steps corresponding to the distance to the reference point based on the distance travelled per step of the user,detect a change in travelling speed of the user,determine whether the change in travelling speed exceeds a threshold,determine an updated distance travelled per step of the user and an updated distance to the reference point when the change in travelling speed of the user exceeds the threshold, anddetermine an updated number of steps corresponding to the updated distance to the reference point based on the updated distance travelled per step of the user; andan output unit connected to the processor and configured to output spatial information indicating the initial number of steps corresponding to the distance to the reference point and the updated number of steps corresponding to the updated distance to the reference point.
  • 2. The system of claim 1, further comprising a memory configured to store step distance data for establishing a baseline distance travelled per step.
  • 3. The system of claim 2, wherein the step distance data is based on historical step distance data associated with the user.
  • 4. The system of claim 2, wherein the step distance data is based on historical step distance data associated with a location of the user.
  • 5. The system of claim 1, wherein the processor is further configured to update the number of steps corresponding to the distance to the reference point based on detecting a step distance adjustment trigger.
  • 6. The system of claim 1, wherein the processor is further configured to periodically determine, along the navigation route, an updated distance to the reference point based on updated image data.
  • 7. A device for providing spatial information to a user, comprising: a camera configured to detect image data;an accelerometer configured to determine step data;a memory configured to store step distance data for establishing a baseline distance travelled per step;a processor connected to the camera and the accelerometer, the processor configured to: determine an initial distance to a reference point based on only the image data;determine an initial number of steps corresponding to the initial distance to the reference point based on the baseline distance travelled per step;determine a current distance travelled per step of the user based on the image data and the step data;determine an updated distance to the reference point; anddetermine an updated number of steps corresponding to the updated distance to the reference point based on the current distance travelled per step of the user; andan output unit connected to the processor and configured to output the spatial information indicating the initial number of steps corresponding to the initial distance to the reference point and the updated number of steps corresponding to the updated distance to the reference point.
  • 8. The device of claim 7, wherein the processor is further configured to update the stored step distance data based on the determined current distance travelled per step of the user.
  • 9. The device of claim 7, wherein the current distance travelled per step of the user based on the image data and the step data is determined when a threshold amount of image data is detected and a threshold amount of step data is determined.
  • 10. The device of claim 7, wherein the baseline distance travelled per step is an average of step distance data associated with a location of the user and an environment of the user.
  • 11. The device of claim 7, wherein the step distance data is based on historical step distance data associated with the user.
  • 12. The device of claim 7, wherein the step distance data is based on historical step distance data associated with a location of the user.
  • 13. A method for providing spatial information to a user, comprising: detecting, by a camera, image data;determining, by an accelerometer, step data;determining, by a processor, an initial distance travelled per step of the user based on the image data and the step data;determining, by the processor, an initial distance to a reference point based on the image data;determining, by the processor, an initial number of steps corresponding to the initial distance to the reference point based on the initial distance travelled per step of the user;outputting, by an output unit, the number of steps corresponding to the initial distance to the reference point;detecting, by the accelerometer, an acceleration or deceleration of the user exceeding a threshold;determining, by the processor, an updated distance travelled per step of the user and an updated distance to the reference point;determining, by the processor, an updated number of steps corresponding to the updated distance to the reference point based on the updated distance travelled per step of the user; andoutputting, by the output unit, the updated number of steps corresponding to the updated distance to the reference point.
  • 14. The method of claim 13, further comprising updating the number of steps corresponding to the distance to the reference point periodically.
US Referenced Citations (408)
Number Name Date Kind
4520501 DuBrucq May 1985 A
4586827 Hirsch et al. May 1986 A
4786966 Hanson Nov 1988 A
5047952 Kramer Sep 1991 A
5097856 Chi-Sheng Mar 1992 A
5129716 Holakovsky et al. Jul 1992 A
5233520 Kretsch et al. Aug 1993 A
5265272 Kurcbart Nov 1993 A
5463428 Lipton et al. Oct 1995 A
5508699 Silverman Apr 1996 A
5539665 Lamming et al. Jul 1996 A
5543802 Villevieille Aug 1996 A
5544050 Abe Aug 1996 A
5568127 Bang Oct 1996 A
5636038 Lynt Jun 1997 A
5659764 Sakiyama Aug 1997 A
5701356 Stanford et al. Dec 1997 A
5733127 Mecum Mar 1998 A
5807111 Schrader Sep 1998 A
5872744 Taylor Feb 1999 A
5953693 Sakiyama Sep 1999 A
5956630 Mackey Sep 1999 A
5982286 Vanmoor Nov 1999 A
6009577 Day Jan 2000 A
6055048 Langevin et al. Apr 2000 A
6067112 Wellner et al. May 2000 A
6199010 Richton Mar 2001 B1
6229901 Mickelson et al. May 2001 B1
6230135 Ramsay May 2001 B1
6230349 Silver et al. May 2001 B1
6285757 Carroll et al. Sep 2001 B1
6307526 Mann Oct 2001 B1
6323807 Golding et al. Nov 2001 B1
6349001 Spitzer Feb 2002 B1
6466232 Newell Oct 2002 B1
6477239 Ohki Nov 2002 B1
6542623 Kahn Apr 2003 B1
6580999 Maruyama et al. Jun 2003 B2
6594370 Anderson Jul 2003 B1
6603863 Nagayoshi Aug 2003 B1
6619836 Silvant et al. Sep 2003 B1
6701296 Kramer Mar 2004 B1
6774788 Balfe Aug 2004 B1
6825875 Strub et al. Nov 2004 B1
6826477 Ladetto et al. Nov 2004 B2
6834373 Dieberger Dec 2004 B2
6839667 Reich Jan 2005 B2
6857775 Wilson Feb 2005 B1
6920229 Boesen Jul 2005 B2
D513997 Wilson Jan 2006 S
7027874 Sawan et al. Apr 2006 B1
D522300 Roberts Jun 2006 S
7069215 Bangalore Jun 2006 B1
7106220 Gourgey et al. Sep 2006 B2
7228275 Endo Jun 2007 B1
7299034 Kates Nov 2007 B2
7308314 Havey et al. Dec 2007 B2
7336226 Jung et al. Feb 2008 B2
7356473 Kates Apr 2008 B2
7413554 Kobayashi et al. Aug 2008 B2
7417592 Hsiao et al. Aug 2008 B1
7428429 Gantz et al. Sep 2008 B2
7463188 McBurney Dec 2008 B1
7496445 Mohsini Feb 2009 B2
7501958 Saltzstein et al. Mar 2009 B2
7525568 Raghunath Apr 2009 B2
7564469 Cohen Jul 2009 B2
7565295 Hernandez-Rebollar Jul 2009 B1
7598976 Sofer et al. Oct 2009 B2
7618260 Daniel et al. Nov 2009 B2
D609818 Tsang et al. Feb 2010 S
7656290 Fein et al. Feb 2010 B2
7659915 Kurzweil et al. Feb 2010 B2
7743996 Maciver Jun 2010 B2
D625427 Lee Oct 2010 S
7843351 Bourne Nov 2010 B2
7843488 Stapleton Nov 2010 B2
7848512 Eldracher Dec 2010 B2
7864991 Espenlaub et al. Jan 2011 B2
7938756 Rodetsky et al. May 2011 B2
7991576 Roumeliotis Aug 2011 B2
8005263 Fujimura Aug 2011 B2
8035519 Davis Oct 2011 B2
D649655 Petersen Nov 2011 S
8123660 Kruse et al. Feb 2012 B2
D656480 McManigal et al. Mar 2012 S
8138907 Barbeau et al. Mar 2012 B2
8150107 Kurzweil et al. Apr 2012 B2
8177705 Abolfathi May 2012 B2
8239032 Dewhurst Aug 2012 B2
8253760 Sako et al. Aug 2012 B2
8300862 Newton et al. Oct 2012 B2
8325263 Kato et al. Dec 2012 B2
D674501 Petersen Jan 2013 S
8359122 Koselka et al. Jan 2013 B2
8395968 Vartanian et al. Mar 2013 B2
8401785 Cho et al. Mar 2013 B2
8414246 Tobey Apr 2013 B2
8418705 Ota et al. Apr 2013 B2
8428643 Lin Apr 2013 B2
8483956 Zhang Jul 2013 B2
8494507 Tedesco et al. Jul 2013 B1
8494859 Said Jul 2013 B2
8538687 Plocher et al. Sep 2013 B2
8538688 Prehofer Sep 2013 B2
8571860 Strope Oct 2013 B2
8583282 Angle et al. Nov 2013 B2
8588464 Albertson et al. Nov 2013 B2
8588972 Fung Nov 2013 B2
8591412 Kovarik et al. Nov 2013 B2
8594935 Cioffi et al. Nov 2013 B2
8606316 Evanitsky Dec 2013 B2
8610879 Ben-Moshe et al. Dec 2013 B2
8630633 Tedesco et al. Jan 2014 B1
8676274 Li Mar 2014 B2
8676623 Gale et al. Mar 2014 B2
8694251 Janardhanan et al. Apr 2014 B2
8704902 Naick et al. Apr 2014 B2
8718672 Xie et al. May 2014 B2
8743145 Price Jun 2014 B1
8750898 Haney Jun 2014 B2
8768071 Tsuchinaga et al. Jul 2014 B2
8786680 Shiratori Jul 2014 B2
8797141 Best et al. Aug 2014 B2
8797386 Chou et al. Aug 2014 B2
8803699 Foshee et al. Aug 2014 B2
8805929 Erol et al. Aug 2014 B2
8812244 Angelides Aug 2014 B2
8814019 Dyster et al. Aug 2014 B2
8825398 Alexandre Sep 2014 B2
8836532 Fish, Jr. et al. Sep 2014 B2
8836580 Mendelson Sep 2014 B2
8836910 Cashin et al. Sep 2014 B2
8902303 Na'Aman et al. Dec 2014 B2
8909534 Heath Dec 2014 B1
D721673 Park et al. Jan 2015 S
8926330 Taghavi Jan 2015 B2
8930458 Lewis et al. Jan 2015 B2
8981682 Delson et al. Mar 2015 B2
8994498 Agrafioti et al. Mar 2015 B2
D727194 Wilson Apr 2015 S
9004330 White Apr 2015 B2
9025016 Wexler et al. May 2015 B2
9042596 Connor May 2015 B2
9053094 Yassa Jun 2015 B2
9076450 Sadek Jul 2015 B1
9081079 Chao et al. Jul 2015 B2
9081385 Ferguson Jul 2015 B1
D736741 Katz Aug 2015 S
9111545 Jadhav et al. Aug 2015 B2
D738238 Pede et al. Sep 2015 S
9137484 DiFrancesco et al. Sep 2015 B2
9137639 Garin et al. Sep 2015 B2
9140554 Jerauld Sep 2015 B2
9148191 Teng et al. Sep 2015 B2
9158378 Hirukawa Oct 2015 B2
D742535 Wu Nov 2015 S
D743933 Park et al. Nov 2015 S
9185489 Gerber et al. Nov 2015 B2
9190058 Klein Nov 2015 B2
9104806 Stivoric et al. Dec 2015 B2
9230430 Civelli et al. Jan 2016 B2
9232366 Charlier et al. Jan 2016 B1
9267801 Gupta et al. Feb 2016 B2
9269015 Boncyk Feb 2016 B2
9275376 Barraclough et al. Mar 2016 B2
9304588 Aldossary Apr 2016 B2
D756958 Lee et al. May 2016 S
D756959 Lee et al. May 2016 S
9335175 Zhang et al. May 2016 B2
9341014 Oshima et al. May 2016 B2
9355547 Stevens et al. May 2016 B2
20010023387 Rollo Sep 2001 A1
20020067282 Moskowitz et al. Jun 2002 A1
20020071277 Starner et al. Jun 2002 A1
20020075323 O'Dell Jun 2002 A1
20020173346 Wang Nov 2002 A1
20020178344 Bourguet Nov 2002 A1
20030026461 Arthur Hunter Feb 2003 A1
20030133008 Stephenson Jul 2003 A1
20030133085 Tretiakoff Jul 2003 A1
20030179133 Pepin Sep 2003 A1
20040056907 Sharma et al. Mar 2004 A1
20040232179 Chauhan Nov 2004 A1
20040267442 Fehr et al. Dec 2004 A1
20050020845 Suzuki et al. Jan 2005 A1
20050140544 Hamel Jun 2005 A1
20050221260 Kikuchi Oct 2005 A1
20050259035 Iwaki Nov 2005 A1
20050283752 Fruchter Dec 2005 A1
20060004512 Herbst Jan 2006 A1
20060028550 Palmer Feb 2006 A1
20060029256 Miyoshi Feb 2006 A1
20060129308 Kates Jun 2006 A1
20060171704 Bingle et al. Aug 2006 A1
20060177086 Rye et al. Aug 2006 A1
20060184318 Yoshimine Aug 2006 A1
20060292533 Selod Dec 2006 A1
20070001904 Mendelson Jan 2007 A1
20070052672 Ritter et al. Mar 2007 A1
20070173688 Kim Jul 2007 A1
20070182812 Ritchey Aug 2007 A1
20070202865 Moride Aug 2007 A1
20070230786 Foss Oct 2007 A1
20070296572 Fein Dec 2007 A1
20080024594 Ritchey Jan 2008 A1
20080068559 Howell Mar 2008 A1
20080120029 Zelek et al. May 2008 A1
20080144854 Abreu Jun 2008 A1
20080145822 Bucchieri Jun 2008 A1
20080174676 Squilla et al. Jul 2008 A1
20080198222 Gowda Aug 2008 A1
20080198324 Fuziak Aug 2008 A1
20080208455 Hartman Aug 2008 A1
20080251110 Pede Oct 2008 A1
20080260210 Kobeli Oct 2008 A1
20090012788 Gilbert Jan 2009 A1
20090040215 Afzulpurkar Feb 2009 A1
20090058611 Kawamura Mar 2009 A1
20090106016 Athsani et al. Apr 2009 A1
20090118652 Carlucci May 2009 A1
20090122161 Bolkhovitinov May 2009 A1
20090122648 Mountain et al. May 2009 A1
20090157302 Tashev et al. Jun 2009 A1
20090177437 Roumeliotis Jul 2009 A1
20090189974 Deering Jul 2009 A1
20090210596 Furuya Aug 2009 A1
20100041378 Aceves Feb 2010 A1
20100042322 Won Feb 2010 A1
20100080418 Ito Apr 2010 A1
20100109918 Liebermann May 2010 A1
20100110368 Chaum May 2010 A1
20100179452 Srinivasan Jul 2010 A1
20100182242 Fields et al. Jul 2010 A1
20100182450 Kumar Jul 2010 A1
20100198494 Chao Aug 2010 A1
20100199232 Mistry et al. Aug 2010 A1
20100241350 Cioffi et al. Sep 2010 A1
20100245585 Fisher et al. Sep 2010 A1
20100267276 Wu Oct 2010 A1
20100292917 Emam et al. Nov 2010 A1
20100298976 Sugihara et al. Nov 2010 A1
20100305845 Alexandre et al. Dec 2010 A1
20100308999 Chornenky Dec 2010 A1
20110066383 Jangle Mar 2011 A1
20110071830 Kim Mar 2011 A1
20110092249 Evanitsky Apr 2011 A1
20110124383 Garra et al. May 2011 A1
20110125735 Petrou May 2011 A1
20110181422 Tran Jul 2011 A1
20110187640 Jacobsen Aug 2011 A1
20110211760 Boncyk Sep 2011 A1
20110216006 Litschel Sep 2011 A1
20110221670 King, III et al. Sep 2011 A1
20110234584 Endo Sep 2011 A1
20110246064 Nicholson Oct 2011 A1
20110260681 Guccione Oct 2011 A1
20110307172 Jadhav et al. Dec 2011 A1
20120016578 Coppens Jan 2012 A1
20120053826 Slamka Mar 2012 A1
20120062357 Slamka Mar 2012 A1
20120069511 Azera Mar 2012 A1
20120075168 Osterhout et al. Mar 2012 A1
20120082962 Schmidt Apr 2012 A1
20120085377 Trout Apr 2012 A1
20120092161 West Apr 2012 A1
20120092460 Mahoney Apr 2012 A1
20120123784 Baker et al. May 2012 A1
20120136666 Corpier et al. May 2012 A1
20120143495 Dantu Jun 2012 A1
20120162423 Xiao et al. Jun 2012 A1
20120194552 Osterhout et al. Aug 2012 A1
20120206335 Osterhout et al. Aug 2012 A1
20120206607 Morioka Aug 2012 A1
20120207356 Murphy Aug 2012 A1
20120214418 Lee Aug 2012 A1
20120220234 Abreu Aug 2012 A1
20120232430 Boissy et al. Sep 2012 A1
20120249797 Haddick et al. Oct 2012 A1
20120252483 Farmer et al. Oct 2012 A1
20120316884 Rozaieski et al. Dec 2012 A1
20120323485 Mutoh Dec 2012 A1
20120327194 Shiratori Dec 2012 A1
20130002452 Lauren Jan 2013 A1
20130044005 Foshee et al. Feb 2013 A1
20130046541 Klein et al. Feb 2013 A1
20130066636 Singhal Mar 2013 A1
20130079061 Jadhav Mar 2013 A1
20130090133 D'Jesus Bencci Apr 2013 A1
20130115578 Shiina et al. May 2013 A1
20130115579 Taghavi May 2013 A1
20130116559 Levin May 2013 A1
20130127980 Haddick May 2013 A1
20130128051 Velipasalar et al. May 2013 A1
20130131985 Weiland et al. May 2013 A1
20130141576 Lord et al. Jun 2013 A1
20130144629 Johnston et al. Jun 2013 A1
20130155474 Roach et al. Jun 2013 A1
20130157230 Morgan Jun 2013 A1
20130184982 DeLuca Jul 2013 A1
20130201344 Sweet, III et al. Aug 2013 A1
20130204605 Illgner-Fehns Aug 2013 A1
20130211718 Yoo et al. Aug 2013 A1
20130218456 Zelek et al. Aug 2013 A1
20130202274 Chan Sep 2013 A1
20130228615 Gates et al. Sep 2013 A1
20130229669 Smits Sep 2013 A1
20130243250 France et al. Sep 2013 A1
20130245396 Berman et al. Sep 2013 A1
20130250078 Levy Sep 2013 A1
20130250233 Blum et al. Sep 2013 A1
20130253818 Sanders et al. Sep 2013 A1
20130265450 Barnes, Jr. Oct 2013 A1
20130271584 Wexler et al. Oct 2013 A1
20130290909 Gray Oct 2013 A1
20130307842 Grinberg et al. Nov 2013 A1
20130311179 Wagner Nov 2013 A1
20130328683 Sitbon et al. Dec 2013 A1
20130332452 Jarvis Dec 2013 A1
20140009561 Sutherland Jan 2014 A1
20140031081 Vossoughi Jan 2014 A1
20140031703 Rayner Jan 2014 A1
20140031977 Goldenberg et al. Jan 2014 A1
20140032596 Fish et al. Jan 2014 A1
20140037149 Zetune Feb 2014 A1
20140055353 Takahama Feb 2014 A1
20140071234 Millett Mar 2014 A1
20140081631 Zhu et al. Mar 2014 A1
20140085446 Hicks Mar 2014 A1
20140098018 Kim et al. Apr 2014 A1
20140100773 Cunningham et al. Apr 2014 A1
20140125700 Ramachandran May 2014 A1
20140132388 Alalawi May 2014 A1
20140133290 Yokoo May 2014 A1
20140160250 Pomerantz Jun 2014 A1
20140172361 Chiang Jun 2014 A1
20140184384 Zhu et al. Jul 2014 A1
20140184775 Drake Jul 2014 A1
20140204245 Wexler Jul 2014 A1
20140222023 Kim et al. Aug 2014 A1
20140228649 Rayner Aug 2014 A1
20140233859 Cho Aug 2014 A1
20140236932 Ikonomov Aug 2014 A1
20140249847 Soon-Shiong Sep 2014 A1
20140251396 Subhashrao et al. Sep 2014 A1
20140253702 Wexler Sep 2014 A1
20140278070 McGavran Sep 2014 A1
20140281943 Prilepov Sep 2014 A1
20140287382 Villar Cloquell Sep 2014 A1
20140309806 Ricci Oct 2014 A1
20140313040 Wright, Sr. Oct 2014 A1
20140335893 Ronen Nov 2014 A1
20140343846 Goldman et al. Nov 2014 A1
20140345956 Kojina Nov 2014 A1
20140347265 Aimone Nov 2014 A1
20140368412 Jacobsen Dec 2014 A1
20140369541 Miskin Dec 2014 A1
20140379251 Tolstedt Dec 2014 A1
20140379336 Bhatnager Dec 2014 A1
20150002808 Rizzo, III Jan 2015 A1
20150016035 Tussy Jan 2015 A1
20150058237 Bailey Feb 2015 A1
20150063661 Lee Mar 2015 A1
20150081884 Maguire Mar 2015 A1
20150099946 Sahin Apr 2015 A1
20150109107 Gomez et al. Apr 2015 A1
20150120186 Heikes Apr 2015 A1
20150135310 Lee May 2015 A1
20150141085 Nuovo et al. May 2015 A1
20150141873 Fei May 2015 A1
20150142891 Haque May 2015 A1
20150154643 Artman et al. Jun 2015 A1
20150125831 Chandrashekhar Nair et al. Jul 2015 A1
20150196101 Dayal et al. Jul 2015 A1
20150198454 Moore et al. Jul 2015 A1
20150198455 Chen Jul 2015 A1
20150199566 Moore et al. Jul 2015 A1
20150201181 Moore et al. Jul 2015 A1
20150211858 Jerauld Jul 2015 A1
20150219757 Boelter et al. Aug 2015 A1
20150223355 Fleck Aug 2015 A1
20150256977 Huang Sep 2015 A1
20150257555 Wong Sep 2015 A1
20150260474 Rublowsky Sep 2015 A1
20150262509 Labbe Sep 2015 A1
20150279172 Hyde Oct 2015 A1
20150324646 Kimia Nov 2015 A1
20150330787 Cioffi et al. Nov 2015 A1
20150336276 Song Nov 2015 A1
20150338917 Steiner et al. Nov 2015 A1
20150341591 Kelder et al. Nov 2015 A1
20150346496 Haddick et al. Dec 2015 A1
20150350845 Patel Dec 2015 A1
20150356345 Velozo Dec 2015 A1
20150356837 Pajestka Dec 2015 A1
20150364943 Vick Dec 2015 A1
20150367176 Bejestan Dec 2015 A1
20150375395 Kwon Dec 2015 A1
20160007158 Venkatraman Jan 2016 A1
20160028917 Wexler Jan 2016 A1
20160042228 Opalka Feb 2016 A1
20160078289 Michel et al. Mar 2016 A1
20160098138 Park Apr 2016 A1
20160156850 Werblin et al. Jun 2016 A1
20160166197 Venkatraman Jun 2016 A1
20160198319 Huang Jul 2016 A1
20160350514 Rajendran Dec 2016 A1
20170227574 Theytaz Aug 2017 A1
Foreign Referenced Citations (62)
Number Date Country
201260746 Jun 2009 CN
101527093 Sep 2009 CN
201440733 Apr 2010 CN
101803988 Aug 2010 CN
101647745 Jan 2011 CN
102316193 Jan 2012 CN
102631280 Aug 2012 CN
202547659 Nov 2012 CN
202722736 Feb 2013 CN
102323819 Jun 2013 CN
103445920 Dec 2013 CN
102011080056 Jan 2013 DE
102012000587 Jul 2013 DE
102012202614 Aug 2013 DE
1174049 Sep 2004 EP
1721237 Nov 2006 EP
2364855 Sep 2011 EP
2371339 Oct 2011 EP
2127033 Aug 2012 EP
2581856 Apr 2013 EP
2751775 Jul 2016 EP
2885251 Nov 2006 FR
2401752 Nov 2004 GB
10069539 Mar 1998 JP
2001304908 Oct 2001 JP
201012529 Jan 2010 JP
2010182193 Aug 2010 JP
4727352 Jul 2011 JP
2013169611 Sep 2013 JP
100405636 Nov 2003 KR
20080080688 Sep 2008 KR
20120020212 Mar 2012 KR
1250929 Apr 2013 KR
WO 1995004440 Feb 1995 WO
WO 9949656 Sep 1999 WO
WO 0010073 Feb 2000 WO
WO 0038393 Jun 2000 WO
WO 0179956 Oct 2001 WO
WO 2004076974 Sep 2004 WO
WO 2006028354 Mar 2006 WO
WO 2006045819 May 2006 WO
WO 2007031782 Mar 2007 WO
WO 2008008791 Jan 2008 WO
WO 2008015375 Feb 2008 WO
WO 2008035993 Mar 2008 WO
WO 2008096134 Aug 2008 WO
WO 2008127316 Oct 2008 WO
WO 2010062481 Jun 2010 WO
WO 2010109313 Sep 2010 WO
WO 2012040703 Mar 2012 WO
WO 2012163675 Dec 2012 WO
WO 2013045557 Apr 2013 WO
WO 2013054257 Apr 2013 WO
WO 2013067539 May 2013 WO
WO 2013147704 Oct 2013 WO
WO 2014104531 Jul 2014 WO
WO 2014138123 Sep 2014 WO
WO 2014172378 Oct 2014 WO
WO 2015065418 May 2015 WO
WO 2015092533 Jun 2015 WO
WO 2015108882 Jul 2015 WO
WO 2015127062 Aug 2015 WO
Non-Patent Literature Citations (94)
Entry
Zhang, Shanjun; Yoshino, Kazuyoshi; A Braille Recognition System by the Mobile Phone with Embedded Camera; 2007; IEEE.
Diallo, Amadou; “Apple iOS8: Top New Features”; Forbes Magazine; Sep. 18, 2014.
N. Kalar, T. Lawers, D. Dewey, T. Stepleton, M.B. Dias; Iterative Design of a Braille Writing Tutor to Combat Illiteracy; Aug. 30, 2007; IEEE.
Bharathi et al.; “Effective Navigation for Visually Impaired by Wearable Obstacle Avoidance System;” 2012 International Conference on Computing, Electronics and Electrical Technologies (ICCEET); pp. 956-958; 2012.
Pawar et al.; “Review Paper on Multitasking Stick for Guiding Safe Path for Visually Disable People;” IJPRET; vol. 3, No. 9; pp. 929-936; 2015.
Ram et al.; “The People Sensor: A Mobility Aid for the Visually Impaired;” 2012 16th International Symposium on Wearable Computers; pp. 166-167; 2012.
Singhal; “The Development of an Intelligent Aid for Blind and Old People;” Emerging Trends and Applications in Computer Science (ICETACS), 2013 1st International Conference; pp. 182-185; Sep. 13, 2013.
Aggarwal et al.; “All-in-One Companion for Visually Impaired;” International Journal of Computer Applications; vol. 79, No. 14; pp. 37-40; Oct. 2013.
“Light Detector” Every Ware Technologies; 2 pages; Jun. 18, 2016.
Arati et al. “Object Recognition in Mobile Phone Application for Visually Impaired Users;” IOSR Journal of Computer Engineering (IOSR-JCE); vol. 17, No. 1; pp. 30-33; Jan. 2015.
Yabu et al.; “Development of a Wearable Haptic Tactile Interface as an Aid for the Hearing and/or Visually Impaired;” NTUT Education of Disabilities; vol. 13; pp. 5-12; 2015.
Mau et al.; “BlindAid: An Electronic Travel Aid for the Blind;” The Robotics Institute Carnegie Mellon University; 27 pages; May 2008.
Shidujaman et al.; “Design and navigation Prospective for Wireless Power Transmission Robot;” IEEE; Jun. 2015.
Wu et al. “Fusing Multi-Modal Features for Gesture Recognition”, Proceedings of the 15th ACM on International Conference on Multimodal Interaction, Dec. 9, 2013, ACM, pp. 453-459.
Pitsikalis et al. “Multimodal Gesture Recognition via Multiple Hypotheses Rescoring”, Journal of Machine Learning Research, Feb. 2015, pp. 255-284.
Shen et al. “Walkie-Markie: Indoor Pathway Mapping Made Easy” 10th USENIX Symposium on Networked Systems Design and Implementation (NSDI'13); pp. 85-98, 2013.
Tu et al. “Crowdsourced Routing II D2.6” 34 pages; 2012.
De Choudhury et al. “Automatic Construction of Travel Itineraries Using Social Breadcrumbs” pp. 35-44; Jun. 2010.
The Nex Band; http://www.mightvcast.com/#faq; May 19, 2015; 4 pages.
Cardonha et al.; “A Crowdsourcing Platform for the Construction of Accessibility Maps”; W4A'13 Proceedings of the 10th International Cross-Disciplinary Conference on Web Accessibility; Article No. 26; 2013; 5 pages.
Bujacz et al.; “Remote Guidance for the Blind—A Proposed Teleassistance System and Navigation Trials”; Conference on Human System Interactions; May 25-27, 2008; 6 pages.
Rodriguez et al; “CrowdSight: Rapidly Prototyping Intelligent Visual Processing Apps”; AAAI Human Computation Workshop (HCOMP); 2011; 6 pages.
Chaudary et al.; “Alternative Navigation Assistance Aids for Visually Impaired Blind Persons”; Proceedings of Iceapvi; Feb. 12-14, 2015; 5 pp.
Garaj et al.; “A System for Remote Sighted Guidance of Visually Impaired Pedestrians”; the British Journal of Visual Impairment; vol. 21, No. 2, 2003; 9 pp.
Coughlan et al.; “Crosswatch: a System for Providing Guidance to Visually Impaired Travelers at Traffic Intersections”; Journal of Assistive Technologies 7.2; 2013; 17 pp.
Sudol et al.; “LookTel — a Comprehensive Platform for Computer-Aided Visual Assistance”; Computer Vision and Pattern Recognition Workshops (Cvprw), 2010 IEEE Computer Society Conference; Jun. 13-18, 2010; 8 pp.
Paladugu et al.; “GoingEasy® with Crowdsourcing in the Web 2.0 World for Visually Impaired Users: Design and User Study”; Arizona State University; 8 pp.
Kammoun et al.; “Towards a Geographic Information System Facilitating Navigation of Visually Impaired Users”; Springer Berlin Heidelberg; 2012; 8 pp.
Bigham et al.; “Viz Wiz: Nearly Real-Time Answers to Visual Questions”; Proceedings of the 23rd annual Acm symposium on User interface software and technology; 2010; 2 pp.
Guy et al; “CrossingGuard: Exploring Information Content in Navigation Aids for Visually Impaired Pedestrians” Proceedings of the Sigchi Conference on Human Factors in Computing Systems; May 5-10, 2012; 10 pp.
Zhang et al.; “A Multiple Sensor-Based Shoe-Mounted User Interface Designed for Navigation Systems for the Visually Impaired”; 5th Annual Icst Wireless Internet Conference (Wicon); Mar. 1-3, 2010; 9 pp.
Shoval et al.; “Navbelt and the Guidecane - Robotics-Based Obstacle Avoidance Systems for the Blind and Visually Impaired”; IEEE Robotics & Automation Magazine, vol. 10, Issue 1; Mar. 2003; 12 pp.
Dowling et al.; “Intelligent Image Processing Constraints for Blind Mobility Facilitated Through Artificial Vision”; 8th Australian and New Zealand Intelligent Information Systems Conference (Anziis); Dec. 10-12, 2003.
Heyes, Tony; “The Sonic Pathfinder: an Electronic Travel Aid for the Vision Impaired”; http://members.optuszoo.com.au/aheyew40/pa/pf blerf.html; Dec.
Lee et al.; “Adaptive Power Control of Obstacle Avoidance System Using Via Motion Context for Visually Impaired Person.” International Conference on Cloud Computing and Social Networking (Icccsn), Apr. 26-27, 2012 4 pp.
Wilson, Jeff, et al. “Swan: System for Wearable Audio Navigation”; 11th IEEE International Symposium on Wearable Computers; Oct. 11-13, 2007; 8 pp.
Borenstein et al.; “The GuideCane - a Computerized Travel Aid for the Active Guidance of Blind Pedestrians”; IEEE International Conference on Robotics and Automation; Apr. 21-27, 1997; 6 pp.
Bhatlawande et al.; “Way-finding Electronic Bracelet for Visually Impaired People”; IEEE Point-of-Care Healthcare Technologies (Pht), Jan. 16-18, 2013; 4 pp.
Blenkhorn et al.; “An Ultrasonic Mobility Device with Minimal Audio Feedback”; Center on Disabilities Technology and Persons with Disabilities Conference; Nov. 22, 1997; 5 pp.
Mann et al.; “Blind Navigation with a Wearable Range Camera and Vibrotactile Helmet”; 19th Acm International Conference on Multimedia; Nov. 28, 2011; 4 pp.
Shoval et al.; “The Navbelt — a Computerized Travel Aid for the Blind”; Resna Conference, Jun. 12-17, 1993; 6 pp.
Kumar et al.; “An Electronic Travel Aid for Navigation of Visually Impaired Persons”; Communications Systems and Networks (Comsnets), 2011 Third International Conference; Jan. 2011; 5 pp.
Pawar et al.; “Multitasking Stick for Indicating Safe Path to Visually Disable People”; Iosr Journal of Electronics and Communication Engineering (Iosr-Jece), vol. 10, Issue 3, Ver. Ii; May-Jun 2015; 5 pp.
Pagliarini et al.; “Robotic Art for Wearable”; Proceedings of Eurosiam: European Conference for the Applied Mathematics and Informatics 2010; 10 pp.
Greenberg et al.; “Finding Your Way: a Curriculum for Teaching and Using the Braillenote with Sendero Gps 2011”; California School for the Blind; 2011; 190 pp.
Helal et al.; “Drishti: an Integrated Navigation System for Visually Impaired and Disabled”; Fifth International Symposium on Wearable Computers; Oct. 8-9, 2001; 8 pp.
Parkes, Don; “Audio Tactile Systems for Designing and Learning Complex Environments as a Vision Impaired Person: Static and Dynamic Spatial Information Access”; EdTech-94 Proceedings; 1994; 8 pp.
Zeng et al.; “Audio-Haptic Browser for a Geographical Information System”; Icchp 2010, Part Ii, Lncs 6180; Jul. 14-16, 2010; 8 pp.
AlZuhair et al.; “Nfc Based Applications for Visually Impaired People — a Review”; IEEE International Conference on Multimedia and Expo Workshops (Icmew), Jul. 14, 2014; 7 pp.
Graf, Christian; “Verbally Annotated Tactile Maps — Challenges and Approaches”; Spatial Cognition Vii, vol. 6222; Aug. 15-19, 2010; 16 pp.
Hamid, Nazatul Naquiah Abd; “Facilitating Route Learning Using Interactive Audio-Tactile Maps for Blind and Visually Impaired People”; Chi 2013 Extended Abstracts; Apr. 27, 2013; 6 pp.
Ramya, et al.; “Voice Assisted Embedded Navigation System for the Visually Impaired”; International Journal of Computer Applications; vol. 64, No. 13, Feb. 2013; 7 pp.
Caperna et al.; “A Navigation and Object Location Device for the Blind”; Tech. rep. University of Maryland College Park; May 2009; 129 pp.
Burbey et al.; “Human Information Processing with the Personal Memex”; Ise 5604 Fall 2005; Dec. 6, 2005; 88 pp.
Ghiani, et al.; “Vibrotactile Feedback to Aid Blind Users of Mobile Guides”; Journal of Visual Languages and Computing 20; 2009; 13 pp.
Guerrero et al.; “An Indoor Navigation System for the Visually Impaired”; Sensors vol. 12, Issue 6; Jun. 13, 2012; 23 pp.
Nordin et al.; “Indoor Navigation and Localization for Visually Impaired People Using Weighted Topological Map”; Journal of Computer Science vol. 5, Issue 11; 2009; 7 pp.
Hesch et al.; “Design and Analysis of a Portable Indoor Localization Aid for the Visually Impaired”; International Journal of Robotics Research; vol. 29; Issue 11; Sep. 2010; 15 pgs.
Joseph et al.; “Visual Semantic Parameterization — to Enhance Blind User Perception for Indoor Navigation”; Multimedia and Expo Workshops (Icmew), 2013 IEEE International Conference; Jul. 15, 2013; 7 pp.
Katz et al; “Navig: Augmented Reality Guidance System for the Visually Impaired”; Virtual Reality (2012) vol. 16; 2012; 17 pp.
Rodriguez et al.; “Assisting the Visually Impaired: Obstacle Detection and Warning System by Acoustic Feedback”; Sensors 2012; vol. 12; 21 pp.
Treuillet; “Outdoor/Indoor Vision-Based Localization for Blind Pedestrian Navigation Assistance”; Wspc/Instruction File; May 23, 2010; 16 pp.
Ran et al.; “Drishti: an Integrated Indoor/Outdoor Blind Navigation System and Service”; Proceedings of the Second IEEE International Conference on Pervasive Computing and Communications (PerCom'04); 2004; 9 pp.
Wang, et al.; “Camera-Based Signage Detection and Recognition for Blind Persons”; 13th International Conference (Icchp) Part 2 Proceedings; Jul. 11-13, 2012; 9 pp.
Krishna et al.; “A Systematic Requirements Analysis and Development of an Assistive Device to Enhance the Social Interaction of People Who are Blind or Visually Impaired”; Workshop on Computer Vision Applications for the Visually Impaired; Marseille, France; 2008; 12 pp.
Lee et al.; “A Walking Guidance System for the Visually Impaired”; International Journal of Pattern Recognition and Artificial Intelligence; vol. 22; No. 6; 2008; 16 pp.
Ward et al.; “Visual Experiences in the Blind Induced by an Auditory Sensory Substitution Device”; Journal of Consciousness and Cognition; Oct. 2009; 30 pp.
Merino-Garcia, et al.; “A Head-Mounted Device for Recognizing Text in Natural Scenes”; Cbdar'11 Proceedings of the 4th International Conference on Camera-Based Document Analysis and Recognition; Sep. 22, 2011; 7 pp.
Yi, Chucai; “Assistive Text Reading from Complex Background for Blind Persons”; Cbdar'11 Proceedings of the 4th International Conference on Camera-Based Document Analysis and Recognition; Sep. 22, 2011; 7 pp.
Yang, et al.; “Towards Automatic Sign Translation”; the Interactive Systems Lab, Carnegie Mellon University; 2001; 5 pp.
Meijer, Dr. Peter B.L.; “Mobile Ocr, Face and Object Recognition for the Blind”; the vOICe; www.seeingwithsound.com/ocr.htm; Apr. 18, 2014; 7 pp.
Omron; Optical Character Recognition Sensor User's Manual; 2012; 450 pp.
Park, Sungwoo; “Voice Stick”; www.yankodesign.com/2008/08/21/voice-stick; Aug. 21, 2008; 4 pp.
Rentschler et al.; “Intelligent Walkers for the Elderly: Performance and Safety Testing of Va-Pamaid Robotic Walker”; Department of Veterans Affairs Journal of Rehabilitation Research and Development; vol. 40, No. 5; Sep./Oct. 2013; 9 pages.
Science Daily; “Intelligent Walker Designed to Assist the Elderly and People Undergoing Medical Rehabilitation”; http://www.sciencedaily.com/releases/2008/11/081107072015.htm; Jul. 22, 2014; 4 pages.
Glover et al.; “A Robotically-Augmented Walker for Older Adults”; Carnegie Mellon University, School of Computer Science; Aug. 1, 2003; 13 pp.
OrCam; www.orcam.com; Jul. 22, 2014; 3 pp.
Eccles, Lisa; “Smart Walker Detects Obstacles”; Electronic Design; http://electronicdesign.com/electromechanical/smart-walker-detects-obstacles; Aug. 20, 2001; 2 pp.
Graf, Birgit; “An Adaptive Guidance System for Robotic Walking Aids”; Journal of Computing and Information Technology - Cit 17; 2009; 12 pp.
Frizera et al.; “The Smart Walkers as Geriatric Assistive Device. The Simbiosis Purpose”; Gerontechnology, vol. 7, No. 2; Jan. 30, 2008; 6 pp.
Rodriguez-Losada et al.; “Guido, the Robotic Smart Walker for the Frail Visually Impaired”; IEEE International Conference on Robotics and Automation (Icra); Apr. 18-22, 2005; 15 pp.
Kayama et al.; “Outdoor Environment Recognition and Semi-Autonomous Mobile Vehicle for Supporting Mobility of the Elderly and Disabled People”; National Institute of Information and Communications Technology, vol. 54, No. 3; Aug. 2007; 11 pp.
Kalra et al.; “A Braille Writing Tutor to Combat Illiteracy in Developing Communities”; Carnegie Mellon University Research Showcase, Robotics Institute; 2007; 10 pp.
Blaze Engineering; “Visually Impaired Resource Guide: Assistive Technology for Students who use Braille”; Braille 'n Speak Manual; http://www.blaize.com; Nov. 17, 2014; 5 pages.
AppleVis; An Introduction to Braille Screen Input on iOS 8; http://www.applevis.com/guides/braille-ios/introduction-braille-screen-input-ios-8; Nov. 16, 2014; 7 pages.
Dias et al.; “Enhancing an Automated Braille Writing Tutor”; IEEE/Rsj International Conference on Intelligent Robots and Systems; Oct. 11-15 2009; 7 pp.
D'Andrea, Frances Mary; “More than a Perkins Brailler: a Review of the Mountbatten Brailler, Part 1”; Afb AccessWorld Magazine; vol. 6, No. 1, Jan. 2005; 9 pp.
Trinh et al.; “Phoneme-based Predictive Text Entry Interface”; Proceedings of the 16th International Acm Sigaccess Conference on Computers & Accessibility; Oct. 2014; 2 pgs.
Merri et al.; “The Instruments for a Blind Teacher of English: the challenge of the board”; European Journal of Psychology of Education, vol. 20, No. 4 (Dec. 2005), 15 pp.
Kirinic et al.; “Computers in Education of Children with Intellectual and Related Developmental Disorders”; International Journal of Emerging Technologies in Learning, vol. 5, 2010, 5 pp.
Campos et al.; “Design and Evaluation of a Spoken-Feedback Keyboard”; Department of Information Systems and Computer Science, Inesc-Id/Ist/Universidade Tecnica de Lisboa, Jul. 2004; 6 pp.
Ebay; Matin (Made in Korea) Neoprene Canon Dslr Camera Curved Neck Strap #6782; http://www.ebay.com/itm/Matin-Made-in-Korea-Neoprene-Canon-Dslr-Camera-Curved-.Neck-Strap-67824281608526018?hash=item41912d18c2:g:---pMAAOSwe-FU6zDa; 4 pp.
Newegg; Motorola S10-Hd Bluetooth Stereo Headphone w/ Comfortable Sweat Proof Design; http://www.newegg.com/Product/Product.aspx?Item=9SIAONW2G39901&Tpk=9sia0nw2g39901;.
Newegg; Motorola Behind the Neck Stereo Bluetooth Headphone Black/Red Bulk (S9) - Oem; http://www.newegg.com/Product/Product.aspx?Item=N82E16875982212&Tpk=n82e16875982212;.
Related Publications (1)
Number Date Country
20170261334 A1 Sep 2017 US