Wearable computing device having a curved back to reduce pressure on vertebrae

Information

  • Patent Grant
  • Patent Number
    10,561,519
  • Date Filed
    Wednesday, July 20, 2016
  • Date Issued
    Tuesday, February 18, 2020
Abstract
A wearable computing device includes a first side portion and a second side portion that partially extend across a shoulder and rest on a front of a user. The device also includes a neck portion connected to the first side portion and the second side portion. The neck portion includes an outer edge and an inner edge. The neck portion is curved from the first side portion to the second side portion to extend around a portion of a circumference of the neck of the user. The neck portion is also curved from the outer edge to the inner edge to follow a curvature of a spine of the user. The wearable computing device also includes an input device and a mobile processor designed to determine output data based on input data. The wearable computing device also includes an output device designed to output the output data.
Description
BACKGROUND
1. Field

The present disclosure relates to a wearable computing device to be worn around a user's neck that includes a curved back for reducing an amount of pressure applied to vertebrae of the user by the device.


2. Description of the Related Art

As computing power becomes faster and electronic devices become smaller, technology is being implemented in increasingly smaller packages. Technology is now at a point in which advanced computing functions can be implemented in devices sufficiently small to be worn by users as accessories. Wearable computing devices, or wearable smart devices, can perform functions for a user without requiring physical manipulation of the device by the user. Examples of wearable computing devices include eyeglasses, watches, and necklaces.


Wearable computing devices perform various functions for users. For example, some wearable computing devices can function as extensions of a mobile phone of the user. Other wearable computing devices perform functions that require a relatively large amount of computation, such as providing social and environmental awareness.


Design of wearable computing devices should take into consideration various factors based on characteristics of the device. In particular, wearable computing devices that can perform computation-heavy social and environmental awareness features may have a greater mass than wearable computing devices that perform less computation-heavy features. If this mass is not well-distributed on a user, it may result in discomfort experienced by the user. Similarly, processors of wearable computing devices that can perform computation-heavy social and environmental awareness features may generate more heat than processors of wearable computing devices that perform less computation-heavy features. If this heat is not well-distributed into the atmosphere, it may result in additional discomfort experienced by the user.


Thus, there is a need for devices and systems for increasing comfort of wearable computing devices that perform computation-heavy social and environmental awareness functions.


SUMMARY

What is described is a wearable computing device designed to be worn around a neck of a user. The wearable computing device includes a first side portion and a second side portion each designed to at least partially extend across a shoulder of the user and to rest on a front of the user. The wearable computing device also includes a neck portion defining a cavity and having a first end connected to the first side portion and a second end connected to the second side portion. The neck portion also includes an outer edge and an inner edge that is positioned nearer the neck of the user than the outer edge when the wearable computing device is worn. The neck portion is curved from the first end to the second end in order to extend around a portion of a circumference of the neck of the user. The neck portion is also curved from the outer edge to the inner edge at a center portion between the first end and the second end in order to follow a curvature of a spine of the user. The wearable computing device also includes an input device designed to detect input data. The wearable computing device also includes a mobile processor positioned in the cavity, coupled to the input device, and designed to determine output data based on the input data. The wearable computing device also includes an output device coupled to the mobile processor and designed to output the output data.


Also described is a wearable computing device designed to be worn around a neck of a user. The wearable computing device includes a first side portion and a second side portion each designed to at least partially extend across a shoulder of the user and to rest on a front of the user. The wearable computing device also includes a neck portion defining a cavity and having a first end connected to the first side portion and a second end connected to the second side portion. The neck portion also includes an outer edge and an inner edge that is positioned nearer the neck of the user than the outer edge when the wearable computing device is worn. The neck portion is curved from the first end to the second end to extend around a portion of a circumference of the neck of the user. The neck portion is also curved from the outer edge to the inner edge at a center portion between the first end and the second end to follow a curvature of a spine of the user. The wearable computing device also includes a camera designed to detect image data. The wearable computing device also includes a mobile processor positioned in the cavity, coupled to the camera, and designed to recognize objects based on the image data and to determine navigation instructions based on the image data. The wearable computing device also includes a speaker coupled to the mobile processor and designed to output data corresponding to the recognized objects or the determined navigation instructions.


Also described is a wearable computing device designed to be worn around a neck of a user. The wearable computing device includes a first side portion and a second side portion each having a rigid portion that rests on a front of the user when the wearable computing device is worn. Each of the first side portion and the second side portion also has a flexible portion that at least partially extends across a shoulder of the user. The wearable computing device also includes a neck portion that defines a cavity and has a first end connected to the flexible portion of the first side portion and a second end connected to the flexible portion of the second side portion. The neck portion also includes a top edge and a bottom edge designed to contact a back of the user at a lower location than the top edge. The neck portion is curved from the top edge to the bottom edge to follow a curvature of a spine of the user. The wearable computing device also includes a camera designed to detect image data. The wearable computing device also includes a mobile processor positioned in the cavity, coupled to the camera, and designed to recognize objects based on the image data and determine navigation instructions based on the image data. The wearable computing device also includes a speaker coupled to the mobile processor and designed to output data corresponding to the recognized objects or the determined navigation instructions.





BRIEF DESCRIPTION OF THE DRAWINGS

Other systems, methods, features, and advantages of the present invention will be or will become apparent to one of ordinary skill in the art upon examination of the following figures and detailed description. It is intended that all such additional systems, methods, features, and advantages be included within this description, be within the scope of the present invention, and be protected by the accompanying claims. Component parts shown in the drawings are not necessarily to scale, and may be exaggerated to better illustrate the important features of the present invention. In the drawings, like reference numerals designate like parts throughout the different views, wherein:



FIG. 1 is a perspective view of a wearable computing device designed to be worn around a neck of a user and that includes two side portions and a neck portion that has features for increasing comfort according to an embodiment of the present invention;



FIG. 2A illustrates an enlarged view of the neck portion of FIG. 1 showing a curvature from an inner edge of the neck portion to an outer edge of the neck portion that is designed to follow a curvature of a spine according to an embodiment of the present invention;



FIG. 2B illustrates an enlarged view of a neck portion of a wearable computing device that has padding on a contact surface of both sides of the neck portion and does not include padding at a center location such that a spine is positioned between the padding to reduce pressure on the spine according to an embodiment of the present invention;



FIG. 3 illustrates a view of the wearable computing device of FIG. 1 as worn by a user according to an embodiment of the present invention;



FIG. 4 illustrates another view of the wearable computing device of FIG. 1 as worn by a user and shows a spine of the user to illustrate how the curvature illustrated in FIG. 2B follows a curvature of the spine according to an embodiment of the present invention; and



FIG. 5 is an exploded view of the wearable computing device of FIG. 1 illustrating various features for distributing heat away from a mobile processor according to an embodiment of the present invention.





DETAILED DESCRIPTION

Described herein are wearable computing devices that may be worn around a neck of a user. The wearable computing devices may be relatively heavy and may have a neck portion that rests on a neck or back of the user. The present invention includes a design of the neck portion that provides increased comfort to users. In particular, the neck portion has been designed such that the weight of the wearable computing devices is not applied to any particular vertebra of the user and is relatively evenly distributed about the user's body.


The neck portion has also been designed to increase comfort by reducing an amount of heat experienced by the user. In particular, the neck portion includes various heat distribution devices that receive heat from the electronic components. The heat distribution devices are connected to one another and are designed such that heat transfers from the electronic components through the heat distribution devices and into the atmosphere.


The wearable computing devices provide several benefits and advantages such as increased comfort to users of the wearable computing device. This allows the user to wear the wearable computing device for longer periods of time. Increased comfort is provided in at least two different ways: by providing an even weight distribution and by distributing heat away from the neck of the user. Distributing heat away from the neck of the user provides additional benefits and advantages such as reducing the likelihood of electronic components overheating, which in turn reduces the likelihood of damage to the electronic components.


Turning to FIG. 1, a wearable computing device 100 is shown. The wearable computing device 100 is designed to be worn around a neck of a user. In that regard, the wearable computing device 100 includes a neck portion 102 designed to rest on a back of a neck of the user and to extend around at least a portion of a circumference of the neck of the user. The wearable computing device 100 also includes a first side portion 104 and a second side portion 106. The side portions 104, 106 are designed to extend across a user's shoulder and to rest on a front of a user, such as on a chest of the user.


The first side portion 104 includes a first flexible portion 108 and a first rigid portion 110. The second side portion 106 includes a second flexible portion 112 and a second rigid portion 114. The first flexible portion 108 is positioned between the neck portion 102 and the first rigid portion 110. The first flexible portion 108 may be coupled to a first end 116 of the neck portion 102, and the second flexible portion 112 may be coupled to a second end 118 of the neck portion 102. In that regard, the first flexible portion 108 may extend across the shoulder of the user and may be malleable or flexible such that it may follow the contours of the shoulder of the user. The first rigid portion 110 may rest on a portion of the front of the user, such as on the chest of the user.


The neck portion 102 may include a top edge or inner edge 120 and a bottom edge or outer edge 122. The inner edge 120 may correspond to an edge that is nearer a center of the wearable computing device 100 than the outer edge 122. When the wearable computing device 100 is worn by a user, as shown in FIG. 3, the inner edge 120 may be nearer to a head of the user than the outer edge 122.


The neck portion 102 may also include a contact surface 124 and an exposed surface 126. The exposed surface 126 may be on an opposite side of the neck portion 102 from the contact surface 124. When the wearable computing device 100 is worn by a user, as shown in FIG. 3, the contact surface 124 may be in contact with a neck or a back of the user and the exposed surface 126 may be exposed to the environment. The contact surface 124 and/or the exposed surface 126 may refer to the neck portion 102 and/or to the entire wearable computing device 100.


The wearable computing device 100 may include multiple features for providing situational awareness to a user. For example, the wearable computing device 100 may provide assistance to a blind user by providing information to the blind user regarding objects in the environment, providing navigation instructions to the blind user, or the like.


The wearable computing device 100 may include one or more input devices for receiving input. The input devices may be used to receive user input, may detect data corresponding to the environment of the user, may receive a communication signal, or the like. For example, the wearable computing device 100 may include one or more buttons 128 for receiving user input. In some embodiments, a user may select a mode of operation of the wearable computing device 100 via the one or more buttons 128.


The wearable computing device 100 may also include one or more cameras 130, such as a single camera, a stereo pair of cameras, a wide-angle camera, or the like. The camera 130 may detect image data corresponding to the environment of the user.


The wearable computing device 100 may also include one or more output devices for providing output data to the user. The output devices may provide audio feedback, haptic feedback, visual feedback, or the like to the user. For example, the wearable computing device 100 may include a first output unit 132A and a second output unit 132B. The first output unit 132A and the second output unit 132B may each provide audio and haptic output. In that regard, the first output unit 132A and the second output unit 132B may together provide stereo feedback to the user. For example, the first output unit 132A and the second output unit 132B may each output audio data providing an identification of an object in the environment. As another example, the first output unit 132A and the second output unit 132B may provide navigation instructions via audio feedback and/or via stereo haptic feedback.


The wearable computing device 100 may include a mobile processor 134 and a memory 136. In some embodiments, the neck portion 102 defines a cavity in which the mobile processor 134 and/or the memory 136 are positioned. The memory 136 may include any non-transitory memory for storing data, including instructions to be executed by the mobile processor 134. The mobile processor 134 may receive input data from the buttons 128 and/or the camera 130. The mobile processor 134 may then determine output data based on the input data and cause the first output unit 132A and the second output unit 132B to output the output data.


The wearable computing device 100 may operate in four modes: explorer mode, scan mode, find mode, and capture mode. Each of the buttons 128 may correspond to one mode. For example, one button may correspond to the explorer mode and another button may correspond to the scan mode.
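The button-to-mode mapping and the input-to-output flow described above can be sketched in a few lines of Python. This is a hypothetical illustration only; the patent specifies no software interface, and every name below is invented:

```python
# Hypothetical sketch of the button-to-mode mapping and the
# input -> mobile processor -> output flow; no names come from the patent.

class WearableComputingDevice:
    def __init__(self, modes):
        self.modes = modes  # maps a button id to a mode handler
        self.mode = None

    def press_button(self, button_id):
        # Each of the buttons 128 corresponds to one mode of operation.
        self.mode = self.modes[button_id]

    def process(self, input_data):
        # The mobile processor 134 determines output data from input data.
        return self.mode(input_data) if self.mode else None

def describe_objects(image_objects):
    # A trivial scan-like handler that names detected objects.
    return ", ".join(name for name, _bearing in image_objects)

device = WearableComputingDevice({0: describe_objects})
device.press_button(0)
print(device.process([("door", 10), ("chair", -30)]))  # door, chair
```

An actual device would dispatch camera frames rather than tuples, but the control flow, where a button selects a handler and the processor turns input data into output data, is the same.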


While in the explorer mode, the wearable computing device 100 provides data to the user associated with the surroundings of the user. In some embodiments, the wearable computing device 100 may describe data detected by the camera 130. The data may include predefined data, such as hazard data, whether a friend of the user is passing by, whether a user's favorite restaurant is detected, etc.


While in the scan mode, the wearable computing device 100 may describe everything that is in the field of view of the camera 130. For example, the wearable computing device 100 may describe everything in the field of view, such as by telling the user that object X is 50 degrees to the left, object Y is at the user's eleven o'clock, objects Z and W are directly ahead, or the like.
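As a rough illustration of the clock-face phrasing mentioned above, an object bearing (in degrees, with zero straight ahead and negative values to the left) can be mapped to an hour on a clock. The 30-degrees-per-hour mapping is an assumption; the patent only gives example phrasings:

```python
# Assumed mapping: 0 degrees = twelve o'clock, each clock hour = 30 degrees.
# This is illustrative only; the patent does not define the conversion.

CLOCK_WORDS = ["twelve", "one", "two", "three", "four", "five", "six",
               "seven", "eight", "nine", "ten", "eleven"]

def clock_position(bearing_deg):
    # Round to the nearest hour, then wrap into the 12-hour clock face.
    hour_index = round(bearing_deg / 30) % 12
    return f"{CLOCK_WORDS[hour_index]} o'clock"

print(clock_position(0))    # twelve o'clock
print(clock_position(-30))  # eleven o'clock
print(clock_position(90))   # three o'clock
```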


While in the find mode, the wearable computing device 100 can navigate the user to a desired object, place, person, or the like. The user can provide data about the desired object, place, person, or the like, such as by speaking the name or address of the object, place, person, or the like. The wearable computing device 100 can then determine the location of the object, place, person, or the like and provide navigation directions to the user.


The capture mode may allow the wearable computing device 100 to store its current location in the memory 136 so that it can guide the user back to the same location at a later time. The capture mode may include two instructions: capture and return. Capture stores the location information (and possibly any obstacles that may arise during a return trip to the position), while return causes the wearable computing device 100 to provide navigation instructions to the user for a return to the location. In various embodiments, a single press of the capture button may indicate the capture instruction and a double click may indicate the return instruction.
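The single-press/double-click behavior of the capture mode could be modeled as follows. This is a sketch with invented names; the patent does not describe an implementation:

```python
# Illustrative model of the capture mode: one click stores the current
# location, a double click requests navigation back to it.
# All names are hypothetical; none come from the patent.

class CaptureMode:
    def __init__(self):
        self.saved_location = None

    def on_press(self, click_count, current_location):
        if click_count == 1:  # capture instruction
            self.saved_location = current_location
            return f"captured {current_location}"
        if click_count == 2:  # return instruction
            if self.saved_location is None:
                return "no location captured"
            return f"navigate to {self.saved_location}"

mode = CaptureMode()
print(mode.on_press(1, "entrance"))  # captured entrance
print(mode.on_press(2, "hallway"))   # navigate to entrance
```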


The wearable computing device 100 may be worn for a relatively long period of time. In that regard, it is desirable for the wearable computing device 100 to be comfortable when worn by a user. It has been shown that comfort of a necklace is increased when pressure on one or more vertebrae is decreased. Thus, the neck portion 102 of the wearable computing device 100 includes features for more evenly distributing the weight of the wearable computing device 100 on the user and for decreasing pressure applied to any one or more vertebrae by the wearable computing device 100.


One such feature is that the neck portion 102 curves from the first end 116 to the second end 118 to extend around at least a portion of a neck of the user. The neck portion 102 includes a longitudinal axis 138 that may be substantially perpendicular to a longitudinal axis of the first side portion 104 and the second side portion 106. The neck portion 102 may be curved relative to the longitudinal axis 138 in order to connect with the first side portion 104 and the second side portion 106 while maintaining a curvature that allows it to extend around the neck.


The neck portion 102 also includes a width 140 extending from the inner edge 120 to the outer edge 122. Thus, the contact surface 124 may be in contact with the user along the width 140 of the neck portion 102. At least a portion of the contact surface 124 may be bowed outward (i.e., bowed towards the exposed surface 126), such that a concave cavity is defined by the contact surface 124. This bowing of the contact surface 124 results in a curvature that follows a curvature of a spine of the user. For example, the curvature of the contact surface 124 may resemble the curvature from a cervical portion of the spine to a thoracic portion of the spine.


Turning to FIG. 2A, a close-up view of the neck portion 102 illustrates the curvature of the neck portion 102. In particular, FIG. 2A illustrates the curvature of the neck portion 102 from the first end 116 to the second end 118. This curvature may result in the neck portion 102 having a substantially “U” shape from the first side portion 104 to the second side portion 106.


The neck portion 102 may have a center portion 200 positioned between the first end 116 and the second end 118 and extending along the width 140. As shown, the curvature of the neck portion 102 from the inner edge 120 to the outer edge 122 may occur along the width 140 at the center portion 200 of the neck portion 102. In that regard, when the wearable computing device 100 is worn, the contact surface 124 along the center portion 200 may rest flush with the user's spine. This curvature reduces an amount of force applied by the neck portion 102 to any one or more vertebrae of the user, thus increasing comfort of the wearable computing device 100.


The neck portion 102 may also include a padding 202 that defines the contact surface 124. The padding 202 may be coupled to a casing of the neck portion 102 and may further distribute the weight of the wearable computing device 100. The padding 202 may include material such as silicone, foam, rubber, or any other material capable of providing cushioning or padding.


Turning to FIG. 2B, a neck portion 252 of another wearable computing device 250 may include different features than the neck portion 102 of FIG. 2A. The neck portion 252 has a first end 254, a second end 256, and a curvature between the first end 254 and the second end 256. The neck portion 252 may also have a curvature from an inner edge to an outer edge. The neck portion 252 may include padding having different features than the padding 202 of FIG. 2A. For example, the neck portion 252 may include a first padding 264 and a second padding 266.


The first padding 264 may span from the first end 254 to a first location 260 positioned away from a halfway point 258 of the neck portion 252. The second padding 266 may span from the second end 256 to a second location 262 positioned away from the halfway point 258 of the neck portion 252.


No padding may exist between the first location 260 and the second location 262. When the wearable computing device 250 is worn, the first padding 264 and the second padding 266 may contact the user's neck, back, and/or shoulders. However, because no padding exists between the first location 260 and the second location 262, the neck portion 252 may not contact the spine of the user or may make minimal contact with the spine of the user. Thus, use of the first padding 264 and the second padding 266 may reduce pressure applied to the user's spine by the neck portion 252 even more so than the design of the neck portion 102 of FIG. 2A.


Turning now to FIG. 3, the wearable computing device 100 is shown as worn by a user 300. As shown, the neck portion 102 at least partially rests on a neck 304 and/or a back 306 of the user 300. The inner edge 120 of the neck portion 102 is positioned higher on the back 306 of the user 300 than the outer edge 122. Thus, the inner edge 120 is positioned nearer to a head 310 of the user 300 than the outer edge 122. In some embodiments, the inner edge 120 may be substantially parallel to the shoulder 302 of the user 300. Stated differently, the inner edge 120 may be positioned at substantially the same height as the user's shoulder 302.



FIG. 3 illustrates how the curvature of the neck portion 102 from the first end 116 to the second end 118 resembles a curvature of the neck 304 of the user 300. This allows the neck portion 102 to extend from a first side of the neck 304 to a second side of the neck 304. The first side portion 104 extends from the first end 116 of the neck portion 102 over the shoulder 302 and rests on a front 308 of the user 300.


Referring now to FIG. 4, a cross-sectional view of the wearable computing device 100 as worn on the user 300 is shown. A spine 400 of the user is shown to illustrate how the curvature of the neck portion 102 resembles the curvature of the spine 400. The spine 400 includes a cervical portion 402, a thoracic portion 404, and a lumbar portion 406. The spine 400 has a curvature between the cervical portion 402 and the thoracic portion 404. As shown, the contact surface 124 of the neck portion 102 has a curvature that resembles the curvature of the spine 400 between the cervical portion 402 and the thoracic portion 404. Thus, the curvature of the contact surface 124 reduces an amount of pressure applied to any vertebrae of the spine 400 by the wearable computing device 100 by more evenly distributing contact with the user 300.


As shown, the first flexible portion 108 extends across the shoulder 302 towards the front 308 of the user 300. In some embodiments, the first flexible portion 108 may extend along a portion of the front 308 of the user 300. The first rigid portion 110 may rest on the front 308 of the user 300. In that regard, it may be desirable for the first rigid portion 110 to have a relatively flat contact surface such that it may rest on a flat portion of the front 308 of the user 300.


Turning now to FIG. 5, an exploded view of the neck portion 102 illustrates various features of the neck portion 102 for dissipating heat. The neck portion 102 includes a housing 500 having an inner housing 500A and an outer housing 500B. The housing 500 may include metal, plastic, or another rigid material that provides a structure for the components of the neck portion 102. Stated differently, the housing 500 may define a cavity in which components of the neck portion 102 are positioned.


The inner housing 500A may define the contact surface 124 and the outer housing 500B may define the exposed surface 126. In some embodiments, additional padding may be coupled to the contact surface 124 of the inner housing 500A, thus creating a new contact surface that includes the padding.


A printed circuit board (PCB) mount 502 may be positioned within the housing 500. In some embodiments, the PCB mount 502 may be coupled to the inner housing 500A. The PCB mount 502 may include metal, plastic, or another rigid material on which a PCB may be mounted.


A motherboard 504 may include the mobile processor 134 and the memory 136 positioned on and electrically coupled via a PCB 506. The motherboard 504 may be mounted on the PCB mount 502. For example, the motherboard 504 may be coupled to the PCB mount 502 via a snap-fit connection, a press-fit connection, fasteners, or the like.


Because the mobile processor 134 may perform computation-heavy social and environmental awareness functions, it may generate a relatively large amount of heat during operation. It is desirable to dissipate this heat away from the neck portion 102 in order to increase comfort of the user. Thus, the neck portion 102 may include various features for dissipating the heat generated by the mobile processor 134.


The neck portion 102 may include a thermal pad 508 that is coupled to the mobile processor 134. The thermal pad 508 may include a material having a relatively low thermal resistance that is capable of transferring heat. The thermal pad 508 may partially or fully contact a surface of the mobile processor 134.


A pipe 510 may be coupled to the thermal pad 508 and may receive heat from the mobile processor 134 via the thermal pad 508. The pipe 510 may include a metal, such as copper. In that regard, the pipe 510 may have a relatively low thermal resistance and be capable of transferring heat.


A heat spreader 512 may be coupled to the pipe 510 via thermal paste (not shown). The thermal paste may include any spreadable material capable of conducting heat. The heat spreader 512 may include any material capable of conducting heat. For example, the heat spreader 512 may include a metal such as aluminum, copper, or the like. The heat spreader 512 may receive heat from the mobile processor 134 via the thermal pad 508, the pipe 510, and the thermal paste.


The heat spreader 512 may have a relatively large surface area. In that regard, heat received by the heat spreader 512 may be dissipated, or spread, into the atmosphere and/or to the outer housing 500B from various surfaces of the heat spreader 512. Because the heat spreader 512 has a relatively large surface area, heat may be distributed over a relatively large area. This reduces the likelihood of any single location of the neck portion 102 having a relatively high temperature.
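The heat path described above, from the mobile processor through the thermal pad, pipe, and thermal paste to the heat spreader and atmosphere, behaves approximately like thermal resistances in series, so the steady-state temperature rise is the dissipated power times the total resistance. The numbers below are assumed purely for illustration; the patent gives no values:

```python
# Back-of-the-envelope series thermal-resistance model of the heat
# path in FIG. 5. Every numeric value is an assumption for illustration.

def processor_temp_rise(power_w, resistances_k_per_w):
    # Steady state: delta-T = P x (sum of series thermal resistances).
    return power_w * sum(resistances_k_per_w)

# Assumed: 3 W processor; pad 0.5 K/W, pipe 0.2 K/W, paste 0.3 K/W,
# spreader-to-atmosphere 4.0 K/W.
rise = processor_temp_rise(3.0, [0.5, 0.2, 0.3, 4.0])
print(rise)  # 15.0 (kelvin above ambient)
```

A larger spreader surface area lowers the spreader-to-atmosphere term, which is the effect the paragraph above attributes to the heat spreader 512.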


Exemplary embodiments of the methods/systems have been disclosed in an illustrative style. Accordingly, the terminology employed throughout should be read in a non-limiting manner. Although minor modifications to the teachings herein will occur to those well versed in the art, it shall be understood that what is intended to be circumscribed within the scope of the patent warranted hereon are all such embodiments that reasonably fall within the scope of the advancement to the art hereby contributed, and that that scope shall not be restricted, except in light of the appended claims and their equivalents.

Claims
  • 1. A wearable computing device, comprising: a first side portion and a second side portion each configured to at least partially extend across a shoulder of a user and to rest on a front of the user; a neck portion having an inner housing and an outer housing configured to be coupled together and to define a cavity, the neck portion having a first end connected to the first side portion, a second end connected to the second side portion, a center portion that is in between the first end and the second end and that has a width that extends from a neck of the user to a thoracic portion of a spine of the user, an outer edge, an inner edge that is positioned nearer the neck of the user than the outer edge when the wearable computing device is worn, a contact surface that extends the width of the center portion and is configured to contact and rest flush with the thoracic portion of the spine of the user when the wearable computing device is worn, and an exposed surface defined by the outer housing and oriented opposite the contact surface, a curvature of the neck portion extending from the inner edge to the outer edge along the width of the center portion; an input device configured to detect input data; a mobile processor positioned in the cavity, coupled to the input device, and configured to determine output data based on the input data; an output device coupled to the mobile processor and configured to output the output data; a thermal pad positioned in the cavity, in contact with the mobile processor such that the mobile processor is located between the thermal pad and the contact surface, and configured to transfer heat from the mobile processor; and a heat spreader located in the cavity and configured to distribute the heat from the mobile processor via the thermal pad.
  • 2. The wearable computing device of claim 1 further comprising a padding coupled to the neck portion and configured to contact the user when the wearable computing device is worn.
  • 3. The wearable computing device of claim 2 wherein the padding includes a first padding that extends from a first location away from a halfway point along the width of the neck portion to the first end of the neck portion and a second padding that extends from a second location away from the halfway point to the second end of the neck portion such that the spine of the user is positioned between the first padding and the second padding when the wearable computing device is worn.
  • 4. The wearable computing device of claim 1 wherein the inner edge of the neck portion is positioned at a height that is substantially equal to a height of shoulders of the user when the wearable computing device is worn.
  • 5. The wearable computing device of claim 1 wherein the mobile processor is further configured to operate in a first mode in which it generates navigation instructions to a desired location and in a second mode in which it recognizes objects in an environment.
  • 6. The wearable computing device of claim 1 wherein the output device includes a first output unit and a second output unit that each include a speaker and a vibration unit for providing at least one of stereo audio output or stereo haptic output.
  • 7. The wearable computing device of claim 1 wherein the curvature from the inner edge of the neck portion to the outer edge of the neck portion resembles a curvature of the spine of the user from a cervical portion of the spine to the thoracic portion of the spine.
  • 8. The wearable computing device of claim 1 wherein each of the first side portion and the second side portion includes a rigid portion configured to rest on the front of the user when the wearable computing device is worn, and a flexible portion positioned between the rigid portion and the neck portion and configured to at least partially extend across the shoulder of the user.
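Claims 1 and 5 together describe an input-process-output pipeline in which the mobile processor determines output data from input data and can operate in a navigation mode or an object-recognition mode. The dispatch can be illustrated with a minimal sketch; the names `Mode` and `determine_output`, and the placeholder string results, are hypothetical and do not appear in the claims:

```python
from enum import Enum, auto

class Mode(Enum):
    # Claim 5 recites two operating modes for the mobile processor.
    NAVIGATION = auto()   # generate navigation instructions to a desired location
    RECOGNITION = auto()  # recognize objects in the environment

def determine_output(mode, input_data):
    """Determine output data from input data (claim 1), per the active mode.

    The string results stand in for real routing and vision algorithms.
    """
    if mode is Mode.NAVIGATION:
        return f"navigate to {input_data['destination']}"
    if mode is Mode.RECOGNITION:
        return "objects: " + ", ".join(input_data["scene"])
    raise ValueError(f"unknown mode: {mode}")

print(determine_output(Mode.RECOGNITION, {"scene": ["door", "chair"]}))
# objects: door, chair
```

The output of `determine_output` would then be passed to the output device of claim 1 (or the speaker of claims 9 and 14) for delivery to the user.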
  • 9. A wearable computing device, comprising: a first side portion and a second side portion each configured to at least partially extend across a shoulder of a user and to rest on a front of the user; a neck portion having an inner housing and an outer housing configured to be coupled together and to define a cavity, the neck portion having a first end connected to the first side portion, a second end connected to the second side portion, a center portion that is in between the first end and the second end and that has a width that extends from a neck of the user to a thoracic portion of a spine of the user, an outer edge, an inner edge that is positioned nearer the neck of the user than the outer edge when the wearable computing device is worn, a contact surface that extends the width of the center portion and is configured to contact and rest flush with the thoracic portion of the spine of the user when the wearable computing device is worn, and an exposed surface defined by the outer housing and oriented opposite the contact surface, a curvature of the neck portion extending from the inner edge to the outer edge along the width of the center portion; a padding that is coupled to the neck portion and spans the entire contact surface to distribute weight on the user; a camera configured to detect image data; a mobile processor positioned in the cavity, coupled to the camera, and configured to recognize objects based on the image data and determine navigation instructions based on the image data; a speaker coupled to the mobile processor and configured to output data corresponding to the recognized objects or the determined navigation instructions; a thermal pad positioned in the cavity, in contact with the mobile processor such that the mobile processor is located between the thermal pad and the contact surface, and configured to transfer heat from the mobile processor; and a heat spreader located in the cavity and configured to distribute the heat from the mobile processor via the thermal pad.
  • 10. The wearable computing device of claim 9 wherein the inner edge of the neck portion is positioned at a height that is substantially equal to a height of shoulders of the user when the wearable computing device is worn.
  • 11. The wearable computing device of claim 9 wherein the speaker includes a first output unit and a second output unit that each include a combined speaker and vibration unit for providing at least one of stereo audio output or stereo haptic output.
  • 12. The wearable computing device of claim 9 wherein the curvature from the inner edge of the neck portion to the outer edge of the neck portion resembles a curvature of the spine of the user from a cervical portion of the spine to the thoracic portion of the spine.
  • 13. The wearable computing device of claim 9 wherein each of the first side portion and the second side portion includes a rigid portion configured to rest on the front of the user when the wearable computing device is worn, and a flexible portion positioned between the rigid portion and the neck portion and configured to at least partially extend across the shoulder of the user.
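The stereo haptic output recited in claims 6 and 11 permits directional cues: a turn can be signaled by vibrating only the output unit on the matching side. A hypothetical routing sketch follows; the cue names and the "pulse"/"off" patterns are illustrative and not part of the claims:

```python
def route_haptic_cue(direction):
    """Drive the left and right vibration units (claims 6 and 11) for a turn cue.

    Returns the vibration pattern sent to each output unit; "pulse" and
    "off" are illustrative pattern names, not recited in the patent.
    """
    patterns = {
        "left": ("pulse", "off"),        # vibrate only the left unit
        "right": ("off", "pulse"),       # vibrate only the right unit
        "straight": ("pulse", "pulse"),  # both units: continue ahead
    }
    left, right = patterns[direction]
    return {"left_unit": left, "right_unit": right}

print(route_haptic_cue("left"))
# {'left_unit': 'pulse', 'right_unit': 'off'}
```

The same left/right routing applies to stereo audio, since each output unit in claims 6 and 11 combines a speaker with a vibration unit.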
  • 14. A wearable computing device designed to be worn around a neck of a user, comprising: a first side portion and a second side portion each having a rigid portion configured to rest on a front of the user when the wearable computing device is worn, and a flexible portion configured to at least partially extend across a shoulder of the user; a neck portion having an inner housing and an outer housing configured to be coupled together and to define a cavity, the neck portion having a first end connected to the flexible portion of the first side portion, a second end connected to the flexible portion of the second side portion, a top edge, a bottom edge configured to contact a back of the user at a lower location than the top edge, a contact surface defined by the inner housing and configured to contact the user, and an exposed surface defined by the outer housing and oriented opposite the contact surface, the contact surface being bowed towards the exposed surface along a width of the neck portion between the top edge and the bottom edge to resemble a curvature of a spine of the user; a padding that is coupled to the neck portion and spans the entire contact surface to distribute weight on the user; a camera configured to detect image data; a mobile processor positioned in the cavity, coupled to the camera, and configured to recognize objects based on the image data and determine navigation instructions based on the image data; a speaker coupled to the mobile processor and configured to output data corresponding to the recognized objects or the determined navigation instructions; a thermal pad positioned in the cavity, in contact with the mobile processor such that the mobile processor is located between the thermal pad and the contact surface, and configured to transfer heat from the mobile processor; and a heat spreader located in the cavity and configured to distribute the heat from the mobile processor via the thermal pad.
  • 15. The wearable computing device of claim 14 wherein the top edge of the neck portion is positioned at a height that is substantially equal to a height of shoulders of the user when the wearable computing device is worn.
  • 16. The wearable computing device of claim 14 wherein a curve from the top edge of the neck portion to the bottom edge of the neck portion resembles the curvature of the spine of the user from a cervical portion of the spine to a thoracic portion of the spine.
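Claims 1, 9, and 14 each recite a thermal path from the mobile processor through a thermal pad to a heat spreader, which keeps heat away from the contact surface against the user's back. A first-order steady-state series-resistance model illustrates how processor temperature depends on that path; all numeric values below are illustrative and none come from the patent:

```python
def processor_temperature(power_w, ambient_c, r_pad, r_spreader):
    """First-order series thermal model of the recited path:
    processor -> thermal pad -> heat spreader -> ambient.

    Thermal resistances are in degrees C per watt; the model assumes
    all processor heat flows through the pad and spreader in series.
    """
    return ambient_c + power_w * (r_pad + r_spreader)

# Illustrative only: a 3 W processor, 25 C ambient, 1.5 C/W pad, 4.0 C/W spreader.
print(processor_temperature(3.0, 25.0, 1.5, 4.0))
# 41.5
```

Under these assumed values the processor sits about 16.5 degrees C above ambient; lowering either resistance (a thicker pad or larger spreader) reduces that rise proportionally.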
Non-Patent Literature Citations (94)
Entry
Zhang, Shanjun; Yoshino, Kazuyoshi; A Braille Recognition System by the Mobile Phone with Embedded Camera; 2007; IEEE.
Diallo, Amadou; Sep. 18, 2014; Apple iOS8: Top New Features, Forbes Magazine.
N. Kalar, T. Lawers, D. Dewey, T. Stepleton, M.B. Dias; Iterative Design of a Braille Writing Tutor to Combat Illiteracy; Aug. 30, 2007; IEEE.
AIZuhair et al.; “NFC Based Applications for Visually Impaired People—A Review”; IEEE International Conference on Multimedia and Expo Workshops (ICMEW), Jul. 14, 2014; 7 pages.
“Light Detector” EveryWare Technologies; 2 pages; Jun. 18, 2016.
Aggarwal et al.; “All-in-One Companion for Visually Impaired;” International Journal of Computer Applications; vol. 79, No. 14; pp. 37-40; Oct. 2013.
AppleVis; An Introduction to Braille Screen Input on iOS 8; http://www.applevis.com/guides/braille-ios/introduction-braille-screen-input-ios-8, Nov. 16, 2014; 7 pages.
Arati et al. “Object Recognition in Mobile Phone Application for Visually Impaired Users;” IOSR Journal of Computer Engineering (IOSR-JCE); vol. 17, Impaired No. 1; pp. 30-33; Jan. 2015.
Bharathi et al.; “Effective Navigation for Visually Impaired by Wearable Obstacle Avoidance System;” 2012 International Conference on Computing, Electronics and Electrical Technologies (ICCEET); pp. 956-958; 2012.
Bhatlawande et al.; “Way-finding Electronic Bracelet for Visually Impaired People”; IEEE Point-of-Care Healthcare Technologies (PHT), Jan. 16-18, 2013; 4 pages.
Bigham et al.; “VizWiz: Nearly Real-Time Answers to Visual Questions” Proceedings of the 23nd annual ACM symposium on User interface software and technology; 2010; 2 pages.
Blaze Engineering; “Visually Impaired Resource Guide: Assistive Technology for Students who use Braille”; Braille 'n Speak Manual; http://www.blaize.com; Nov. 17, 2014; 5 pages.
Blenkhorn et al.; “An Ultrasonic Mobility Device with Minimal Audio Feedback”; Center on Disabilities Technology and Persons with Disabilities Conference; Nov. 22, 1997; 5 pages.
Borenstein et al.; “The GuideCane—A Computerized Travel Aid for the Active Guidance of Blind Pedestrians”; IEEE International Conference on Robotics and Automation; Apr. 21-27, 1997; 6 pages.
Bujacz et al.; “Remote Guidance for the Blind—A Proposed Teleassistance System and Navigation Trials”; Conference on Human System Interactions; May 25-27, 2008; 6 pages.
Burbey et al.; “Human Information Processing with the Personal Memex”; ISE 5604 Fall 2005; Dec. 6, 2005; 88 pages.
Campos et al.; “Design and Evaluation of a Spoken-Feedback Keyboard”; Department of Information Systems and Computer Science, INESC-ID/IST/Universidade Tecnica de Lisboa, Jul. 2004; 6 pages.
Caperna et al.; “A Navigation and Object Location Device for the Blind”; Tech. rep. University of Maryland College Park; May 2009; 129 pages.
Cardonha et al.; “A Crowdsourcing Platform for the Construction of Accessibility Maps”; W4A'13 Proceedings of the 10th International Cross-Disciplinary Conference on Web Accessibility; Article No. 26; 2013; 5 pages.
Chaudary et al.; “Alternative Navigation Assistance Aids for Visually Impaired Blind Persons”; Proceedings of ICEAPVI; Feb. 12-14, 2015; 5 pages.
Coughlan et al.; “Crosswatch: A System for Providing Guidance to Visually Impaired Travelers at Traffic Intersections”; Journal of Assistive Technologies 7.2; 2013; 17 pages.
D'Andrea, Frances Mary; “More than a Perkins Brailler: A Review of the Mountbatten Brailler, Part 1”; AFB AccessWorld Magazine; vol. 6, No. 1, Jan. 2005; 9 pages.
Dias et al.; “Enhancing an Automated Braille Writing Tutor”; IEEE/RSJ International Conference on Intelligent Robots and Systems; Oct. 11-15, 2009; 7 pages.
Dowling et al.; “Intelligent Image Processing Constraints for Blind Mobility Facilitated Through Artificial Vision”; 8th Australian and NewZealand Intelligent Information Systems Conference (ANZIIS); Dec. 10-12, 2003; 7 pages.
Ebay; MATIN (Made in Korea) Neoprene Canon DSLR Camera Curved Neck Strap #6782; http://www.ebay.com/itm/MATIN-Made-in-Korea-Neoprene-Canon-DSLR-Camera-Curved-Neck-Strap-6782-/281608526018?hash=item41912d18c2:g:˜pMAAOSwe-FU6zDa ; 4 pages.
Eccles, Lisa; “Smart Walker Detects Obstacles”; Electronic Design; http://electronicdesign.com/electromechanical/smart-walker-detects-obstaeles; Aug. 20, 2001; 2 pages.
Frizera et al.; “The Smart Walkers as Geriatric Assistive Device. The SIMBIOSIS Purpose”; Gerontechnology, vol. 7, No. 2; Jan. 30, 2008; 6 pages.
Garaj et al.; “A System for Remote Sighted Guidance of Visually Impaired Pedestrians”; The British Journal of Visual. Impairment; vol. 21, No. 2, 2003; 9 pages.
Ghiani, et al.; “Vibrotactile Feedback to Aid Blind Users of Mobile Guides”; Journal of Visual Languages and Computing 20; 2009; 13 pages.
Glover et al.; “A Robotically-Augmented Walker for Older Adults”; Carnegie Mellon University, School of Computer Science; Aug. 1, 2003; 13 pages.
Graf, Christian; “Verbally Annotated Tactile Maps—Challenges and Approaches”; Spatial Cognition VII, vol. 6222; Aug. 15-19, 2010; 16 pages.
Graft, Birgit; “An Adaptive Guidance System for Robotic Walking Aids”; Journal of Computing and Information Technology—CIT 17; 2009; 12 pages.
Greenberg et al.; “Finding Your Way: A Curriculum for Teaching and Using the Braillenote with Sendero GPS 2011”; California School for the Blind; 2011; 190 pages.
Guerrero et al.; “An Indoor Navigation System for the Visually Impaired”; Sensors vol. 12, Issue 6; Jun. 13, 2012; 23 pages.
Guy et al; “CrossingGuard: Exploring Information Content in Navigation Aids for Visually Impaired Pedestrians” Proceedings of the SIGCHI Conference on Human Factors in Computing Systems; May 5-10, 2012; 10 pages.
Hamid, Nazatul Naquiah Abd; “Facilitating Route Learning Using Interactive Audio-Tactile Maps for Blind and Visually Impaired People”; CHI 2013 Extended Abstracts; Apr. 27, 2013; 6 pages.
Helal et al.; “Drishti: An Integrated Navigation System for Visually Impaired and Disabled”; Fifth International Symposium on Wearable Computers; Oct. 8-9, 2001; 8 pages.
Hesch et al.; “Design and Analysis of a Portable Indoor Localization Aid for the Visually Impaired”; International Journal of Robotics Research; vol. 29; Issue 11; Sep. 2010; 15 pgs.
Heyes, Tony; “The Sonic Pathfinder an Electronic Travel Aid for the Vision Impaired”; http://members.optuszoo.com.au/aheyew40/pa/pf_blerf.html; Dec. 11, 2014; 7 pages.
Joseph et al.; “Visual Semantic Parameterization—To Enhance Blind User Perception for Indoor Navigation”; Multimedia and Expo Workshops (ICMEW), 2013 IEEE International Conference; Jul. 15, 2013; 7 pages.
Kalra et al.; “A Braille Writing Tutor to Combat Illiteracy in Developing Communities”; Carnegie Mellon University Research Showcase, Robotics Institute; 2007; 10 pages.
Kammoun et al.; “Towards a Geographic Information System Facilitating Navigation of Visually Impaired Users”; Springer Berlin Heidelberg; 2012; 8 pages.
Katz et al; “NAVIG: Augmented Reality Guidance System for the Visually Impaired”; Virtual Reality (2012) vol. 16; 2012; 17 pages.
Kayama et al.; “Outdoor Environment Recognition and Semi-Autonomous Mobile Vehicle for Supporting Mobility of the Elderly and Disabled People”; National Institute of Information and Communications Technology, vol. 54, No. 3; Aug. 2007; 11 pages.
Kirinic et al.; “Computers in Education of Children with Intellectual and Related Developmental Disorders”; International Journal of Emerging Technologies in Learning, vol. 5, 2010, 5 pages.
Krishna et al.; “A Systematic Requirements Analysis and Development of an Assistive Device to Enhance the Social Interaction of People Who are Blind or Visually Impaired”; Workshop on Computer Vision Applications for the Visually Impaired; Marseille, France; 2008; 12 pages.
Kumar et al.; “An Electronic Travel Aid for Navigation of Visually Impaired Persons”; Communications Systems and Networks (COMSNETS), 2011 Third International Conference; Jan. 2011; 5 pages.
Lee et al.; “Adaptive Power Control of Obstacle Avoidance System Using via Motion Context for Visually Impaired Person.” International Conference on Cloud Computing and Social Networking (ICCCSN), Apr. 26-27, 2012 4 pages.
Lee et al.; “A Walking Guidance System for the Visually Impaired”; International Journal of Pattern Recognition and Artificial Intelligence; vol. 22; No. 6; 2008; 16 pages.
Mann et al.; “Blind Navigation with a Wearable Range Camera and Vibrotactile Helmet”; 19th ACM International Conference on Multimedia; Nov. 28, 2011; 4 pages.
Mau et al.; “BlindAid: An Electronic Travel Aid for the Bling;” The Robotics Institute Carnegie Mellon University; 27 pages; May 2008.
Meijer, Dr. Peter B.L.; “Mobile OCR, Face and Object Recognition for the Blind”; The vOICe, www.seeingwithsound.com/ocr.htm; Apr. 18, 2014; 7 pages.
Merino-Garcia, et al.; “A Head-Mounted Device for Recognizing Text in Natural Sciences”; CBDAR'11 Proceedings of the 4th International Conference on Camera-Based Document Analysis and Recognition; Sep. 22, 2011; 7 pages.
Merri et al.; “The Instruments for a Blind Teacher of English: The challenge of the board”; European Journal of Psychology of Education, vol. 20, No. 4 (Dec. 2005), 15 pages.
NEWEGG; Motorola Behind the Neck Stereo Bluetooth Headphone Black/Red Bulk (S9)—OEM; http://www.newegg.com/Product/Product.aspx?Item=N82E16875982212&Tpk=n82e16875982212 3 pages.
NEWEGG; Motorola S10-HD Bluetooth Stereo Headphone w/ Comfortable Sweat Proof Design; http://www.newegg.com/Product/Product.aspx?Item=9SIA0NW2G39901&Tpk=9sia0nw2g39901; 4 pages.
Nordin et al.; “Indoor Navigation and Localization for Visually Impaired People Using Weighted Topological Map”; Journal of Computer Science vol. 5, Issue 11; 2009; 7 pages.
OMRON; Optical Character Recognition Sensor User's Manual; 2012; 450 pages.
OrCam; www.orcam.com; Jul. 22, 2014; 3 pages.
Pagliarini et al.; “Robotic Art for Wearable”; Proceedings of EUROSIAM: European Conference for the Applied Mathematics and Informatics 2010; 10 pages.
Paladugu et al.; “GoingEasy® with Crowdsourcing in the Web 2.0 World for Visually Impaired Users: Design and User Study”; Arizona State University; 8 pages.
Park, Sungwoo; “Voice Stick”; www.yankodesign.com/2008/08/21/voice-stick; Aug. 21, 2008; 4 pages.
Parkes, Don; “Audio Tactile Systems for Designing and Learning Complex Environments as a Vision Impaired Person: Static and Dynamic Spatial Information Access”; EdTech-94 Proceedings; 1994; 8 pages.
Pawar et al.; “Multitasking Stick for Indicating Safe Path to Visually Disable People”; IOSR Journal of Electronics and Communication Engineering (IOSR-JECE), vol. 10, Issue 3, Ver. II; May-Jun. 2015; 5 pages.
Pawar et al.; “Review Paper on Multitasking Stick for Guiding Safe Path for Visually Disable People;” IJPRET; vol. 3, No. 9; pp. 929-936; 2015.
Ram et al.; “The People Sensor: A Mobility Aid for the Visually Impaired;” 2012 16th International Symposium on Wearable Computers; pp. 166-167; 2012.
Ramya, et al.; “Voice Assisted Embedded Navigation System for the Visually Impaired”; International Journal of Computer Applications; vol. 64, No. 13, Feb. 2013; 7 pages.
Ran et al.; “Drishti: An Integrated Indoor/Outdoor Blind Navigation System and Service”; Proceeding PERCOM '04 Proceedings of the Second IEEE International Conference on Pervasive Computing and Communications (PerCom'04); 2004; 9 pages.
Rentschler et al.; “Intelligent Walkers for the Elderly: Performance and Safety Testing of VA-PAMAID Robotic Walker”; Department of Veterans Affairs Journal of Rehabilitation Research and Development; vol. 40, No. 5; Sep./Oct. 2003; 9 pages.
Rodríguez et al.; “Assisting the Visually Impaired: Obstacle Detection and Warning System by Acoustic Feedback”; Sensors 2012; vol. 12; 21 pages.
Rodriguez et al.; “CrowdSight: Rapidly Prototyping Intelligent Visual Processing Apps”; AAAI Human Computation Workshop (HCOMP); 2011; 6 pages.
Rodriguez-Losada et al.; “Guido, The Robotic Smart Walker for the Frail Visually Impaired”; IEEE International Conference on Robotics and Automation (ICRA); Apr. 18-22, 2005; 15 pages.
Science Daily; “Intelligent Walker Designed to Assist the Elderly and People Undergoing Medical Rehabilitation”; http://www.sciencedaily.com/releases/2008/11/081107072015.htm; Jul. 22, 2014; 4 pages.
Shoval et al.; “Navbelt and the Guidecane—Robotics-Based Obstacle-Avoidance Systems for the Blind and Visually Impaired”; IEEE Robotics & Automation Magazine, vol. 10, Issue 1; Mar. 2003; 12 pages.
Shoval et al.; “The Navbelt—A Computerized Travel Aid for the Blind”; RESNA Conference, Jun. 12-17, 1993; 6 pages.
Singhal; “The Development of an Intelligent Aid for Blind and Old People;” Emerging Trends and Applications in Computer Science (ICETACS), 2013 1st International Conference; pp. 182-185; Sep. 13, 2013.
Sudol et al.; “LookTel—A Comprehensive Platform for Computer-Aided Visual Assistance”; Computer Vision and Pattern Recognition Workshops (CVPRW), 2010 IEEE Computer Society Conference; Jun. 13-18, 2010; 8 pages.
The Nex Band; http://www.mightycast.com/#faq; May 19, 2015; 4 pages.
Treuillet; “Outdoor/Indoor Vision-Based Localization for Blind Pedestrian Navigation Assistance”; WSPC/Instruction File; May 23, 2010; 16 pages.
Trinh et al.; “Phoneme-based Predictive Text Entry Interface”; Proceedings of the 16th International ACM SIGACCESS Conference on Computers & Accessibility; Oct. 2014; 2 pages.
Wang, et al.; “Camera-Based Signage Detection and Recognition for Blind Persons”; 13th International Conference (ICCHP) Part 2 Proceedings; Jul. 11-13, 2012; 9 pages.
Ward et al.; “Visual Experiences in the Blind Induced by an Auditory Sensory Substitution Device”; Journal of Consciousness and Cognition; Oct. 2009; 30 pages.
Wilson et al.; “SWAN: System for Wearable Audio Navigation”; 11th IEEE International Symposium on Wearable Computers; Oct. 11-13, 2007; 8 pages.
Yabu et al.; “Development of a Wearable Haptic Tactile Interface as an Aid for the Hearing and/or Visually Impaired;” NTUT Education of Disabilities; vol. 13; pp. 5-12; 2015.
Yang, et al.; “Towards Automatic Sign Translation”; The Interactive Systems Lab, Carnegie Mellon University; 2001; 5 pages.
Yi, Chucai; “Assistive Text Reading from Complex Background for Blind Persons”; CBDAR'11 Proceedings of the 4th International Conference on Camera-Based Document Analysis and Recognition; Sep. 22, 2011; 7 pages.
Zeng et al.; “Audio-Haptic Browser for a Geographical Information System”; ICCHP 2010, Part II, LNCS 6180; Jul. 14-16, 2010; 8 pages.
Zhang et al.; “A Multiple Sensor-Based Shoe-Mounted User Interface Designed for Navigation Systems for the Visually Impaired”; 5th Annual ICST Wireless Internet Conference (WICON); Mar. 1-3, 2010; 9 pages.
Shidujaman et al.; “Design and Navigation Prospective for Wireless Power Transmission Robot”; IEEE; Jun. 2015.
Wu et al.; “Fusing Multi-Modal Features for Gesture Recognition”; Proceedings of the 15th ACM International Conference on Multimodal Interaction; Dec. 9, 2013; pp. 453-459.
Pitsikalis et al.; “Multimodal Gesture Recognition via Multiple Hypotheses Rescoring”; Journal of Machine Learning Research; Feb. 2015; pp. 255-284.
Shen et al.; “Walkie-Markie: Indoor Pathway Mapping Made Easy”; 10th USENIX Symposium on Networked Systems Design and Implementation (NSDI'13); 2013; pp. 85-98.
Tu et al.; “Crowdsourced Routing II D2.6”; 2012; 34 pages.
De Choudhury et al.; “Automatic Construction of Travel Itineraries Using Social Breadcrumbs”; Jun. 2010; pp. 35-44.
Related Publications (1): US 20180021161 A1; Jan. 2018; US.