The disclosed technology relates generally to eyewear, and more particularly some embodiments relate to display eyewear.
In general, one aspect disclosed features an electronic device, comprising: a structure configured to be worn on the head of a user; a camera movably coupled to the structure and arranged to capture images in a field of view of the user; and a display panel fixedly coupled to the structure and arranged to display, to the user, the images captured by the camera.
Embodiments of the electronic device may include one or more of the following features. Some embodiments comprise a gimbal configured to move the camera to tilt horizontally and vertically relative to a resting line of sight of the user. In some embodiments, the gimbal is configured to move the camera such that an angular position of the camera has a magnitude of up to 90° in any semi-meridian relative to a normal from a front of the structure. Some embodiments comprise a motor configured to control the gimbal. Some embodiments comprise a controller configured to control the motor based on a control signal, wherein the control signal is provided by at least one of: a tilt sensor configured to sense a tilt of the structure, wherein the control signal is based on the tilt, and a distance sensor configured to sense a distance to an object, wherein the control signal is based on the distance. In some embodiments, the structure comprises at least one of: a frame configured to be worn on the head of a user, the frame including a nose rest configured to rest on the nose of the user, and an ear rest configured to rest on an ear of the user; and a frame front. In some embodiments, the display panel is occluded or transparent. In some embodiments, the display panel is disposed within a resting line of sight of the user. In some embodiments, a field of view displayed by the display panel is smaller than a field of view captured by the camera; and the controller is further configured to shift a portion of the captured images from outside the field of view of the display to within the field of view of the display. In some embodiments, an angle of the shift exceeds 10°. In some embodiments, an angle of the shift is a fixed angle. Some embodiments comprise a user input device, wherein an angle of the shift is determined by inputs received by the user input device. In some embodiments, the controller is further configured to shift a portion of the captured images prior to displaying the captured images.
In some embodiments, the structure further comprises: a lens disposed in a line of sight of an eye of the user. In some embodiments, the structure comprises: a pair of eyeglasses.
In general, one aspect disclosed features a method for an electronic device configured to be worn on the head of a user, the method comprising: capturing images in a field of view of the user using a camera movably coupled to the electronic device; displaying the images to the user on a display panel fixedly coupled to the electronic device; controlling a direction of the camera based on at least one of: a tilt of the electronic device, and a distance from the camera to an object in the field of view of the user.
Embodiments of the method may include one or more of the following features. In some embodiments, the camera is disposed above a resting line of sight of the user, and wherein controlling the camera comprises: causing a line of sight of the camera to intersect a resting line of sight of the user at the object based on the distance to the object. In some embodiments, the camera is disposed above a resting line of sight of the user, and wherein controlling the camera comprises: causing an angle between a line of sight of the camera and a resting line of sight of the user to increase with increasing tilt.
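The distance-based control described above reduces to simple trigonometry. As a minimal sketch (the mounting height and distances are hypothetical values chosen for illustration, not taken from the disclosure), a camera mounted a height h above the resting line of sight must tilt downward by atan(h/d) for its line of sight to intersect the user's resting line of sight at an object a distance d away:

```python
import math

def camera_tilt_deg(camera_height_m: float, object_distance_m: float) -> float:
    """Downward tilt (degrees) for a camera mounted camera_height_m above the
    user's resting line of sight so that the camera's line of sight intersects
    the resting line of sight at an object object_distance_m away."""
    return math.degrees(math.atan2(camera_height_m, object_distance_m))

# Illustrative: a camera 3 cm above the eye line, object 1 m away
tilt = camera_tilt_deg(0.03, 1.0)
```

Consistent with the tilt-based variant, the required angle grows as the object moves closer and shrinks toward zero for distant objects.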
In general, one aspect disclosed features an electronic device, comprising: a structure configured to be worn on the head of a user; a camera fixedly coupled to the structure and arranged to capture images in a field of view of the user; and a display panel fixedly coupled to the structure above a resting line of sight of the user and arranged to display, to the user, the images captured by the camera.
In general, one aspect disclosed features a method for an electronic device configured to be worn on the head of a user, the method comprising: capturing images in front of the user using a camera of the electronic device; electronically shifting a portion of the captured images; and displaying the shifted images to the user on a display panel of the electronic device.
Embodiments of the method may include one or more of the following features. Some embodiments comprise electronically shifting the portion of the captured images based on a vergence distance between an eye of the user and an object in the captured images.
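A vergence-based shift of this kind can be sketched as a parallax correction. Assuming (hypothetically) a camera displaced from the eye by a fixed offset and a display with a known pixels-per-degree scale — neither value is specified by the disclosure — the pixel shift follows from the angular parallax at the vergence distance:

```python
import math

def parallax_shift_px(offset_m: float, vergence_distance_m: float,
                      px_per_degree: float) -> int:
    """Pixel shift that re-centers the camera image for an eye converged at
    vergence_distance_m, given a camera displaced offset_m from the eye.
    The shift grows as the vergence distance decreases."""
    parallax_deg = math.degrees(math.atan2(offset_m, vergence_distance_m))
    return round(parallax_deg * px_per_degree)

# Illustrative: 3 cm camera-eye offset, 40 cm vergence distance, 20 px/deg
shift = parallax_shift_px(0.03, 0.4, 20)
```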
The present disclosure, in accordance with one or more various embodiments, is described in detail with reference to the following figures. The figures are provided for purposes of illustration only and merely depict typical or example embodiments.
The figures are not exhaustive and do not limit the present disclosure to the precise form disclosed.
Display eyewear is used in a variety of applications. Conventional display eyewear generally includes an outward-facing camera for some applications. However, conventional display eyewear implementations suffer from a number of shortcomings.
In some implementations, the camera can be used as a feed source for the display in the eyewear that is viewed by the user, either contemporaneously with the image capture by the camera or at a later time. It is common practice to embed and fix the camera in the display eyewear such that the camera angle is fixed relative to the angle of the eyewear frame or headset as it rests on the face. In some implementations, the entire front of the frame or headset is hinged at side attachments or earpieces so that the camera angle can be manually adjusted. In such implementations, the display angle, made by the line of sight and the vertical plane of the display, changes simultaneously with an angular change of the front of the display frame or headset. This simultaneous change with a fixed camera angle can result in a loss of the ability to view the display when attempting to point the camera at an object of regard, or in an inability to point the camera at the object of regard while viewing the display.
Furthermore, the anthropometric diversity of human head shapes and eye positions results in a range of display positions relative to the user's resting line of sight, that is, the line of sight with a comfortable head tilt and the eyes at rest. The tilt of the display eyewear front varies with the resting position of the display eyewear on the user's nose bridge and with the height and position of the user's ears. This distribution of camera and display positions results in variable performance when the camera is fixed in the display eyewear relative to the position of the display and relative to the line of sight.
Embodiments of the disclosed technology address these and other shortcomings of conventional implementations. In some embodiments, performance is enhanced by providing the ability to modulate the camera angle relative to the eyewear front angle and/or the display angle relative to the user's line of sight.
Some embodiments may provide virtual reality image enhancement for the visually impaired. In such embodiments, the user may sight the object of regard with the camera while simultaneously viewing an enhanced image in the display eyewear. The object of regard may be a target positioned a long distance from the user. The display may be mounted above the line of sight and accessed by an angular tip down of the head. In these embodiments, the camera may be angled upward about the same number of degrees that the head is tipped down to see the content in the display.
In some embodiments, the camera may be movable, and may be controlled according to a tilt of the user's head, a distance to an object, and the like. For example, a user may desire to view an object or task content at a great distance, where the object is positioned near the user's resting line of sight. In this example, the camera angle may be declined by a small number of degrees to allow the user a comfortable head position to view the distant object with the camera. In another example, the user may desire to view an object or task content at an intermediate distance, where the object is positioned somewhat below the user's resting line of sight. In this example, the camera angle may be declined by a greater number of degrees to allow the user a comfortable head position to view the intermediate-distance object with the camera. The same user may wish to view a hand-held object at an even lower angle relative to the resting line of sight. In this example, the camera angle may be declined by an even greater angle for comfortable viewing.
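The far / intermediate / hand-held progression above can be sketched as a monotone schedule mapping object distance to camera declination. The breakpoints and angles below are hypothetical placeholders, chosen only to illustrate that nearer objects receive a steeper downward camera angle:

```python
def camera_declination_deg(object_distance_m: float) -> float:
    """Illustrative schedule: nearer objects get a steeper downward camera
    angle, mirroring the distant / intermediate / hand-held progression.
    All breakpoints and angles are hypothetical."""
    if object_distance_m > 3.0:      # distant viewing, near the resting line of sight
        return 2.0
    if object_distance_m > 0.7:      # intermediate distance, somewhat below
        return 10.0
    return 25.0                      # hand-held task, well below
```

In a real device this mapping would likely be continuous and calibrated per user, driven by the distance sensor described above.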
In some embodiments, the display eyewear may include a wide-angle imaging camera and electronic means of shifting the image in the display, for example when the camera image does not align with the center of the display. In such embodiments, the image shift allows for a fixed camera position, shifting a region of pixels to intersect the line of sight on the display. The image shift allows for a resting head position and resting line of sight when viewing the display content, and freedom from use of a hinged frame front or an adjustable camera angle. Such image shifting is also useful for bringing into view on the display scene content that would otherwise not be visible in the presence of field defects such as hemianopsia, in which half of the visual field is blind.
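One way to realize such a shift is to crop a display-sized window from the wider camera frame, offsetting the window by the desired shift angle. The sketch below is an assumption-laden illustration (frame size, display size, and field of view are hypothetical; square pixels are assumed), not the disclosed implementation:

```python
def shifted_window(frame_w: int, frame_h: int,
                   disp_w: int, disp_h: int,
                   camera_fov_deg: float,
                   shift_deg: float) -> tuple:
    """Return (left, top, right, bottom) of a display-sized crop taken from a
    wider camera frame, displaced vertically by shift_deg. A positive shift
    moves the window upward in the frame, bringing higher scene content
    into the display's field of view."""
    px_per_deg = frame_w / camera_fov_deg        # angular scale, assuming square pixels
    shift_px = round(shift_deg * px_per_deg)
    left = (frame_w - disp_w) // 2
    top = (frame_h - disp_h) // 2 - shift_px
    top = max(0, min(top, frame_h - disp_h))     # clamp the window to the frame
    return (left, top, left + disp_w, top + disp_h)

# Illustrative: 1920x1080 frame with a 100-degree FOV, 640x480 display, 10-degree shift
window = shifted_window(1920, 1080, 640, 480, 100.0, 10.0)
```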
The tip-down display eyewear 100 may include one or more outward-facing cameras 106. In the embodiment of
The tip-down display eyewear 100 may include one or more tip-down micro-display panels 108. In the embodiment of
In some embodiments, the tip-down display eyewear 100 may include a controller 112. The controller 112 may receive images captured by the camera 106, and may provide these images to the tip-down micro-display panels 108. The controller 112 may process the captured images prior to providing the processed images to the tip-down micro-display panels 108. For example, the controller 112 may magnify, brighten, or otherwise enhance the images. In embodiments having a movable camera 106, the controller 112 may control the movement of the movable camera 106. Some embodiments may comprise one or more sensors such as tilt sensors, distance sensors, and the like. In such embodiments, the controller 112 may control the movement of the movable camera 106 in accordance with signals received from the sensors, for example as described in detail below. The controller 112 may be located within one of the temples 104, and/or within some other portion of the tip-down display eyewear 100, or may be tethered to the tip-down display eyewear 100 by a wired or wireless connection.
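A single pass of such a controller can be sketched as: read the sensors, steer the camera, then enhance and display the captured frame. This is a hedged illustration only — the function names, the brightness-gain enhancement, and the sensor dictionary are all hypothetical stand-ins, not elements of the disclosure:

```python
def enhance(frame, gain=1.5):
    """Simple brightness gain as a stand-in for the controller's enhancement
    step (magnify/brighten). frame is a list of pixel-value rows (0-255)."""
    return [[min(255, round(p * gain)) for p in row] for row in frame]

def controller_step(capture, display, sensors, set_camera_tilt):
    """One pass of a hypothetical controller loop: steer the movable camera
    from the tilt sensor, then enhance and display the captured frame."""
    set_camera_tilt(sensors()["tilt_deg"])
    display(enhance(capture()))
```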
It should be appreciated that the embodiments of
The user may employ the tip-down display eyewear 100 by simply tilting the user's head downward. In the example of
The immersive display eyewear 300 may include one or more outward-facing cameras 306. In the embodiment of
The immersive display eyewear 300 may include one or more immersive micro-display panels 308. In the embodiment of
In some embodiments, the immersive display eyewear 300 may include a controller 312. The controller 312 may receive images captured by the camera 306, and may provide these images to the immersive micro-display panels 308. The controller 312 may process the captured images prior to providing the processed images to the immersive micro-display panels 308. For example, the controller 312 may magnify, brighten, or otherwise enhance the images. The controller 312 may control the movement of the movable camera 306. Some embodiments may comprise one or more sensors 314 such as tilt sensors, distance sensors, and the like. In such embodiments, the controller 312 may control the movement of the movable camera 306 in accordance with signals received from the sensors 314, for example as described in detail below. The controller 312 may be located within one of the temples 304, and/or within some other portion of the immersive display eyewear 300.
It should be appreciated that the embodiments of
Referring to
Referring again to
Referring again to
Referring again to
Referring to
Referring to
Referring to
Some embodiments perform electronic image shifting. In some embodiments, electronic image shifting is employed to shift images that may be only partially displayed so that they are fully displayed. In some embodiments, electronic image shifting is employed to shift images from one portion of the display to another, for example to assist users having hemianopsia or similar conditions. In any of these embodiments, the camera may feature a wide-angle lens to provide a larger camera field of view.
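For the hemianopsia case, the shift amounts to translating each pixel row so that content from the blind hemifield lands in the sighted hemifield. The sketch below (plain lists of pixel values, black fill for vacated pixels — representation details are assumptions) illustrates the idea:

```python
def shift_for_hemianopsia(frame, shift_px, blind_side="left"):
    """Translate each pixel row horizontally by shift_px so content from the
    blind hemifield moves into the sighted hemifield; vacated pixels are
    filled with black (0). frame is a list of equal-length pixel rows."""
    if shift_px <= 0:
        return [row[:] for row in frame]
    out = []
    for row in frame:
        if blind_side == "left":
            # left-field content moves rightward into the sighted field
            out.append([0] * shift_px + row[:-shift_px])
        else:
            # right-field content moves leftward into the sighted field
            out.append(row[shift_px:] + [0] * shift_px)
    return out
```

With a wide-angle camera, the same operation can instead select a differently positioned crop window, so no captured content is discarded at the frame edge.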
Referring again to
Referring again to
Referring to
Referring to
Referring to
Referring to
The computer system 1100 also includes a main memory 1106, such as a random access memory (RAM), cache and/or other dynamic storage devices, coupled to bus 1102 for storing information and instructions to be executed by processor 1104. Main memory 1106 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 1104. Such instructions, when stored in storage media accessible to processor 1104, render computer system 1100 into a special-purpose machine that is customized to perform the operations specified in the instructions.
The computer system 1100 further includes a read only memory (ROM) 1108 or other static storage device coupled to bus 1102 for storing static information and instructions for processor 1104. A storage device 1110, such as a magnetic disk, optical disk, or USB thumb drive (Flash drive), etc., is provided and coupled to bus 1102 for storing information and instructions.
The computer system 1100 may be coupled via bus 1102 to a display 1112, such as a liquid crystal display (LCD) (or touch screen), for displaying information to a computer user. An input device 1111, including alphanumeric and other keys, is coupled to bus 1102 for communicating information and command selections to processor 1104. Another type of user input device is cursor control 1111, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 1104 and for controlling cursor movement on display 1112. In some embodiments, the same direction information and command selections as cursor control may be implemented via receiving touches on a touch screen without a cursor.
The computing system 1100 may include a user interface module to implement a GUI that may be stored in a mass storage device as executable software codes that are executed by the computing device(s). This and other modules may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.
In general, the words “component,” “engine,” “system,” “database,” “data store,” and the like, as used herein, can refer to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, Java, C or C++. A software component may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, or Python. It will be appreciated that software components may be callable from other components or from themselves, and/or may be invoked in response to detected events or interrupts. Software components configured for execution on computing devices may be provided on a computer readable medium, such as a compact disc, digital video disc, flash drive, magnetic disc, or any other tangible medium, or as a digital download (and may be originally stored in a compressed or installable format that requires installation, decompression or decryption prior to execution). Such software code may be stored, partially or fully, on a memory device of the executing computing device, for execution by the computing device. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware components may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors.
The computer system 1100 may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which in combination with the computer system causes or programs computer system 1100 to be a special-purpose machine. According to one embodiment, the techniques herein are performed by computer system 1100 in response to processor(s) 1104 executing one or more sequences of one or more instructions contained in main memory 1106. Such instructions may be read into main memory 1106 from another storage medium, such as storage device 1110. Execution of the sequences of instructions contained in main memory 1106 causes processor(s) 1104 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.
The term “non-transitory media,” and similar terms, as used herein refers to any media that store data and/or instructions that cause a machine to operate in a specific fashion. Such non-transitory media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical or magnetic disks, such as storage device 1110. Volatile media includes dynamic memory, such as main memory 1106. Common forms of non-transitory media include, for example, a floppy disk, a flexible disk, hard disk, solid state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, any other memory chip or cartridge, and networked versions of the same.
Non-transitory media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between non-transitory media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 1102. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
The computer system 1100 also includes a communication interface 1118 coupled to bus 1102. Communication interface 1118 provides a two-way data communication coupling to one or more network links that are connected to one or more local networks. For example, communication interface 1118 may be an integrated services digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 1118 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN (or a WAN component to communicate with a WAN). Wireless links may also be implemented. In any such implementation, communication interface 1118 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
A network link typically provides data communication through one or more networks to other data devices. For example, a network link may provide a connection through local network to a host computer or to data equipment operated by an Internet Service Provider (ISP). The ISP in turn provides data communication services through the world wide packet data communication network now commonly referred to as the “Internet.” Local network and Internet both use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on network link and through communication interface 1118, which carry the digital data to and from computer system 1100, are example forms of transmission media.
The computer system 1100 can send messages and receive data, including program code, through the network(s), network link and communication interface 1118. In the Internet example, a server might transmit a requested code for an application program through the Internet, the ISP, the local network and the communication interface 1118.
The received code may be executed by processor 1104 as it is received, and/or stored in storage device 1110, or other non-volatile storage for later execution.
Each of the processes, methods, and algorithms described in the preceding sections may be embodied in, and fully or partially automated by, code components executed by one or more computer systems or computer processors comprising computer hardware. The one or more computer systems or computer processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). The processes and algorithms may be implemented partially or wholly in application-specific circuitry. The various features and processes described above may be used independently of one another, or may be combined in various ways. Different combinations and sub-combinations are intended to fall within the scope of this disclosure, and certain method or process blocks may be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate, or may be performed in parallel, or in some other manner. Blocks or states may be added to or removed from the disclosed example embodiments. The performance of certain of the operations or processes may be distributed among computer systems or computer processors, not only residing within a single machine, but deployed across a number of machines.
As used herein, a circuit might be implemented utilizing any form of hardware, or a combination of hardware and software. For example, one or more processors, controllers, ASICs, PLAs, PALs, CPLDs, FPGAs, logical components, software routines or other mechanisms might be implemented to make up a circuit. In implementation, the various circuits described herein might be implemented as discrete circuits or the functions and features described can be shared in part or in total among one or more circuits. Even though various features or elements of functionality may be individually described or claimed as separate circuits, these features and functionality can be shared among one or more common circuits, and such description shall not require or imply that separate circuits are required to implement such features or functionality. Where a circuit is implemented in whole or in part using software, such software can be implemented to operate with a computing or processing system capable of carrying out the functionality described with respect thereto, such as computer system 1100.
As used herein, the term “or” may be construed in either an inclusive or exclusive sense. Moreover, the description of resources, operations, or structures in the singular shall not be read to exclude the plural. Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps.
Terms and phrases used in this document, and variations thereof, unless otherwise expressly stated, should be construed as open ended as opposed to limiting. Adjectives such as “conventional,” “traditional,” “normal,” “standard,” “known,” and terms of similar meaning should not be construed as limiting the item described to a given time period or to an item available as of a given time, but instead should be read to encompass conventional, traditional, normal, or standard technologies that may be available or known now or at any time in the future. The presence of broadening words and phrases such as “one or more,” “at least,” “but not limited to” or other like phrases in some instances shall not be read to mean that the narrower case is intended or required in instances where such broadening phrases may be absent.
The present application is a divisional of U.S. patent application Ser. No. 16/915,985, filed Jun. 29, 2020, entitled “DISPLAY EYEWEAR WITH ADJUSTABLE CAMERA DIRECTION,” the disclosure of which is incorporated by reference herein in its entirety.
 | Number | Date | Country
---|---|---|---
Parent | 16915985 | Jun 2020 | US
Child | 17504226 | | US