Example embodiments of the present invention relate generally to user interface technology and, more particularly, relate to methods and apparatuses for facilitating interaction, via a user interface, with a display, such as a near-eye display or an augmented reality display.
The modern communications era has brought about a tremendous expansion of wireline and wireless networks. Wireless and mobile networking technologies have addressed related consumer demands, while providing more flexibility and immediacy of information transfer. Concurrent with the expansion of networking technologies, an expansion in computing power has resulted in development of affordable computing devices capable of taking advantage of services made possible by modern networking technologies. This expansion in computing power has led to a reduction in the size of computing devices and given rise to a new generation of mobile devices that are capable of functionality that only a few years ago required processing power that could be provided only by the most advanced desktop computers. Consequently, mobile computing devices having a small form factor have become ubiquitous and are used to access network applications and services.
In addition, display devices, such as projectors, monitors, or augmented reality glasses, may provide an enhanced view by incorporating computer-generated information with a view of the real world. Such display devices may further be remote wireless display devices such that the remote display device provides an enhanced view by incorporating computer-generated information with a view of the real world. In particular, augmented reality devices, such as augmented reality glasses, may provide for overlaying virtual graphics over a view of the physical world. As such, methods of navigation and transmission of other information through augmented reality devices may provide for richer and deeper interaction with the surrounding environment. The usefulness of augmented reality devices relies upon supplementing the view of the real world with meaningful and timely virtual graphics.
Methods, apparatuses, and computer program products are herein provided for facilitating interaction via a remote user interface with a display, such as an augmented reality display, e.g., augmented reality glasses, an augmented reality near-eye display and/or the like, that may be either physically collocated with or remote from the remote user interface. In one example embodiment, two or more users may interact in real time, with one user providing input via a remote user interface that defines one or more icons or other indications that are displayed upon an augmented reality display of the other user, thereby providing for a more detailed and informative interaction between the users.
In one example embodiment, a method may include receiving an image of a view of an augmented reality device. The method may also include causing the image to be displayed. Further, the method may include receiving an input indicating a respective portion of the image. In addition, the method may comprise determining, by a processor, a location of the input within the image, and causing information regarding the location of the input to be provided to the augmented reality device such that an indication may be imposed upon the view provided by the augmented reality device at the location of the input.
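By way of illustration only, the following Python sketch shows one possible realization of this method on the remote user interface side: a single frame is received from the augmented reality device, displayed, an input indicating a respective portion of the image is captured as a click, and the normalized location of the input is returned to the device. The transport, port number and message layout are assumptions made for the sketch and are not required by the embodiments described herein.

    import json
    import socket

    import cv2  # OpenCV is used here only as a convenient display/input toolkit
    import numpy as np

    def handle_remote_session(host="0.0.0.0", port=5600):
        """Receive one image, display it, and send back the location of a click."""
        with socket.create_server((host, port)) as server:
            conn, _ = server.accept()
            with conn:
                # Length-prefixed JPEG frame sent by the augmented reality device.
                size = int.from_bytes(conn.recv(4), "big")
                buf = b""
                while len(buf) < size:
                    buf += conn.recv(size - len(buf))
                image = cv2.imdecode(np.frombuffer(buf, np.uint8), cv2.IMREAD_COLOR)

                location = {}

                def on_click(event, x, y, flags, param):
                    if event == cv2.EVENT_LBUTTONDOWN:
                        # Normalize the location so the device can map it back
                        # into its own view regardless of display resolution.
                        h, w = image.shape[:2]
                        location.update({"x": x / w, "y": y / h})

                cv2.namedWindow("remote view")
                cv2.setMouseCallback("remote view", on_click)
                while not location:
                    cv2.imshow("remote view", image)
                    if cv2.waitKey(30) == 27:  # Esc aborts without sending
                        return
                conn.sendall(json.dumps(location).encode())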
According to one example embodiment, the method may further include receiving the image in real time so that the image that is caused to be displayed is also displayed by the augmented reality device. In another embodiment, the method may also include receiving a video recording, and causing the video recording to be displayed. According to another embodiment, the method may also include receiving the input to identify a respective feature within an image of the video recording and continuing to identify the respective feature as the image changes. The method may also include employing feature recognition to identify the respective feature within the video recording. In one embodiment, the method may include receiving an input that moves across the image so as to indicate both a location and a direction.
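As one hedged illustration of the feature-tracking embodiment above, the sketch below uses OpenCV's CSRT tracker to continue identifying a user-selected feature as the images of a video recording change; the embodiments do not prescribe a particular tracking algorithm, so both the tracker choice and the selection mechanism are assumptions made for illustration.

    import cv2

    def track_selected_feature(video_path):
        """Follow a user-selected feature through a video recording."""
        capture = cv2.VideoCapture(video_path)
        ok, frame = capture.read()
        if not ok:
            return
        # The user drags a box around the feature of interest in the first frame.
        box = cv2.selectROI("select feature", frame)
        tracker = cv2.TrackerCSRT_create()  # name varies across OpenCV builds
        tracker.init(frame, box)
        while True:
            ok, frame = capture.read()
            if not ok:
                break
            found, box = tracker.update(frame)
            if found:
                x, y, w, h = (int(v) for v in box)
                # The box marks where an icon or other indication would be imposed.
                cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cv2.imshow("tracking", frame)
            if cv2.waitKey(30) == 27:  # Esc stops playback
                break
        capture.release()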
In another example embodiment, an apparatus may comprise at least one processor and at least one memory storing computer program code, wherein the at least one memory and stored computer program code are configured, with the at least one processor, to cause the apparatus to at least receive an image of a view of an augmented reality device, cause the image to be displayed, and receive an input indicating a respective portion of the image. According to one embodiment, the at least one memory and stored computer program code are further configured, with the at least one processor, to cause the apparatus to at least determine a location of the input within the image, and cause information regarding the location of the input to be provided to the augmented reality device such that an indication may be imposed upon the view provided by the augmented reality device at the location of the input.
In another example embodiment, a computer program product is provided. The computer program product of the example embodiment may include at least one non-transitory computer-readable storage medium having computer-readable program instructions stored therein. The computer-readable program instructions may comprise program instructions configured to cause an apparatus to perform a method comprising receiving an image of a view of an augmented reality device. The method may also include causing the image to be displayed. Further, the method may include receiving an input indicating a respective portion of the image. In one embodiment, the method may also include determining, by a processor, a location of the input within the image, and causing information regarding the location of the input to be provided to the augmented reality device such that an indication may be imposed upon the view provided by the augmented reality device at the location of the input.
In another example embodiment, an apparatus may include means for receiving an image of a view of an augmented reality device. The apparatus may also include means for causing the image to be displayed. Further, the apparatus may include means for receiving an input indicating a respective portion of the image. In addition, the apparatus may comprise means for determining a location of the input within the image, and means for causing information regarding the location of the input to be provided to the augmented reality device such that an indication may be imposed upon the view provided by the augmented reality device at the location of the input.
According to another example embodiment, a method may include causing an image of a field of view of an augmented reality device to be captured. Further, the method may include causing the image to be provided to a remote user interface. In addition, the method may include receiving information indicative of an input to the remote user interface corresponding to a respective portion of the image. The method may also include causing an indicator to be provided upon the view provided by the augmented reality device based upon the information from the remote user interface.
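A complementary sketch of this device-side method, under the same assumed transport and message layout as the earlier remote-interface sketch, might capture a frame of the field of view, provide it to the remote user interface, receive the reported input location, and cause an indicator to be rendered there. None of the specific calls below are mandated by the embodiments; they merely stand in for the device's capture and display components.

    import json
    import socket

    import cv2

    def share_view_and_mark(remote_host, port=5600):
        """Capture a frame, send it to the remote interface, and mark the reply."""
        camera = cv2.VideoCapture(0)  # stand-in for the device's capture module
        ok, frame = camera.read()
        camera.release()
        if not ok:
            return
        ok, jpeg = cv2.imencode(".jpg", frame)
        with socket.create_connection((remote_host, port)) as conn:
            conn.sendall(len(jpeg).to_bytes(4, "big") + jpeg.tobytes())
            reply = json.loads(conn.recv(4096).decode())
        # Map the normalized input location back into this view and mark it.
        h, w = frame.shape[:2]
        point = (int(reply["x"] * w), int(reply["y"] * h))
        cv2.drawMarker(frame, point, (0, 0, 255), cv2.MARKER_CROSS, 40, 3)
        cv2.imshow("augmented view", frame)
        cv2.waitKey(0)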
In another example embodiment, an apparatus may comprise at least one processor and at least one memory storing computer program code, wherein the at least one memory and stored computer program code are configured, with the at least one processor, to cause the apparatus to at least cause an image of a field of view to be captured, cause the image to be provided to a remote user interface, and receive information indicative of an input to the remote user interface corresponding to a respective portion of the image. The at least one memory and stored computer program code are further configured, with the at least one processor, to cause the apparatus to at least cause an indicator to be provided upon the view provided by the apparatus based upon the information from the remote user interface.
In another example embodiment, a computer program product is provided that may include at least one non-transitory computer-readable storage medium having computer-readable program instructions stored therein. The computer-readable program instructions may comprise program instructions configured to cause an apparatus to perform a method comprising causing an image of a field of view of an augmented reality device to be captured. Further, the method may include causing the image to be provided to a remote user interface. In addition, the method may include receiving information indicative of an input to the remote user interface corresponding to a respective portion of the image. The method may also include causing an indicator to be provided upon the view provided by the augmented reality device based upon the information from the remote user interface. In another example embodiment, the method may also include providing an indication of a location, object, person and/or the like that a user is viewing in a field of view of an augmented reality device, such as by providing a gesture, pointing, focusing the user's gaze or other similar techniques for specifying a location, object, person and/or the like within the scene or field of view.
The above summary is provided merely for purposes of summarizing some example embodiments of the invention so as to provide a basic understanding of some aspects of the invention. Accordingly, it will be appreciated that the above described example embodiments are merely examples and should not be construed to narrow the scope or spirit of the invention in any way. It will be appreciated that the scope of the invention encompasses many potential embodiments, some of which will be further described below, in addition to those here summarized.
Having thus described embodiments of the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
Some embodiments of the present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all embodiments of the invention are shown. Indeed, the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like reference numerals refer to like elements throughout.
As used herein, the terms “data,” “content,” “information” and similar terms may be used interchangeably to refer to data capable of being transmitted, received, displayed and/or stored in accordance with various example embodiments. Thus, use of any such terms should not be taken to limit the spirit and scope of the disclosure.
The term “computer-readable medium” as used herein refers to any medium configured to participate in providing information to a processor, including instructions for execution. Such a medium may take many forms, including, but not limited to a non-transitory computer-readable storage medium (e.g., non-volatile media, volatile media), and transmission media. Transmission media include, for example, coaxial cables, copper wire, fiber optic cables, and carrier waves that travel through space without wires or cables, such as acoustic waves and electromagnetic waves, including radio, optical and infrared waves. Signals include man-made transient variations in amplitude, frequency, phase, polarization or other physical properties transmitted through the transmission media. Examples of non-transitory computer-readable media include a magnetic computer readable medium (e.g., a floppy disk, hard disk, magnetic tape, any other magnetic medium), an optical computer readable medium (e.g., a compact disc read only memory (CD-ROM), a digital versatile disc (DVD), a Blu-Ray disc, or the like), a random access memory (RAM), a programmable read only memory (PROM), an erasable programmable read only memory (EPROM), a FLASH-EPROM, or any other non-transitory medium from which a computer can read. The term computer-readable storage medium is used herein to refer to any computer-readable medium except transmission media. However, it will be appreciated that where embodiments are described to use a computer-readable storage medium, other types of computer-readable mediums may be substituted for or used in addition to the computer-readable storage medium in alternative embodiments.
Additionally, as used herein, the term ‘circuitry’ refers to (a) hardware-only circuit implementations (e.g., implementations in analog circuitry and/or digital circuitry); (b) combinations of circuits and computer program product(s) comprising software and/or firmware instructions stored on one or more computer readable memories that work together to cause an apparatus to perform one or more functions described herein; and (c) circuits, such as, for example, a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation even if the software or firmware is not physically present. This definition of ‘circuitry’ applies to all uses of this term herein, including in any claims. As a further example, as used herein, the term ‘circuitry’ also includes an implementation comprising one or more processors and/or portion(s) thereof and accompanying software and/or firmware. As another example, the term ‘circuitry’ as used herein also includes, for example, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, other network device, and/or other computing device.
Some embodiments of the present invention may relate to a provision of a mechanism by which an augmented reality device, such as augmented reality glasses, is enhanced by the display of icons or other indications that are provided by another user via a user interface that may be remote from the augmented reality device. In order to increase the relevancy of the input provided via the user interface, an image may be provided by the augmented reality device to, and displayed by, the remote user interface. As such, the input provided via the remote user interface may be based upon the same image, field of view, or combinations thereof as that presented by the augmented reality device such that the input and, in turn, the icons or other indications that are created based upon the input and presented upon the augmented reality device may be particularly pertinent. In order to further illustrate a relationship between the augmented reality device 2 and a remote user interface 3, reference is made to
As shown, the mobile terminal 10 may include an antenna 12 (or multiple antennas 12) in communication with a transmitter 14 and a receiver 16. The mobile terminal 10 may also include a processor 20 configured to provide signals to and receive signals from the transmitter and receiver, respectively. The processor 20 may, for example, be embodied as various means including circuitry, one or more microprocessors with accompanying digital signal processor(s), one or more processor(s) without an accompanying digital signal processor, one or more coprocessors, one or more multi-core processors, one or more controllers, processing circuitry, one or more computers, various other processing elements including integrated circuits such as, for example, an ASIC or FPGA, or some combination thereof. Accordingly, although illustrated in
Some Narrow-band Advanced Mobile Phone System (NAMPS), as well as Total Access Communication System (TACS), mobile terminals may also benefit from embodiments of this invention, as should dual or higher mode phones (e.g., digital/analog or TDMA/CDMA/analog phones). Additionally, the mobile terminal 10 may be capable of operating according to Wi-Fi or Worldwide Interoperability for Microwave Access (WiMAX) protocols.
It is understood that the processor 20 may comprise circuitry for implementing audio/video and logic functions of the mobile terminal 10. For example, the processor 20 may comprise a digital signal processor device, a microprocessor device, an analog-to-digital converter, a digital-to-analog converter, and/or the like. Control and signal processing functions of the mobile terminal may be allocated between these devices according to their respective capabilities. The processor may additionally comprise an internal voice coder (VC) 20a, an internal data modem (DM) 20b, and/or the like. Further, the processor may comprise functionality to operate one or more software programs, which may be stored in memory. For example, the processor 20 may be capable of operating a connectivity program, such as a web browser. The connectivity program may allow the mobile terminal 10 to transmit and receive web content, such as location-based content, according to a protocol, such as Wireless Application Protocol (WAP), hypertext transfer protocol (HTTP), and/or the like. The mobile terminal 10 may be capable of using a Transmission Control Protocol/Internet Protocol (TCP/IP) to transmit and receive web content across the internet or other networks.
The mobile terminal 10 may also comprise a user interface including, for example, an earphone or speaker 24, a ringer 22, a microphone 26, a display 28, a user input interface, and/or the like, which may be operationally coupled to the processor 20. In this regard, the processor 20 may comprise user interface circuitry configured to control at least some functions of one or more elements of the user interface, such as, for example, the speaker 24, the ringer 22, the microphone 26, the display 28, and/or the like. The processor 20 and/or user interface circuitry comprising the processor 20 may be configured to control one or more functions of one or more elements of the user interface through computer program instructions (e.g., software and/or firmware) stored on a memory accessible to the processor 20 (e.g., volatile memory 40, non-volatile memory 42, and/or the like). Although not shown, the mobile terminal may comprise a battery for powering various circuits related to the mobile terminal, for example, a circuit to provide mechanical vibration as a detectable output. The display 28 of the mobile terminal may be of any type appropriate for the electronic device in question with some examples including a plasma display panel (PDP), a liquid crystal display (LCD), a light-emitting diode (LED), an organic light-emitting diode display (OLED), a projector, a holographic display or the like. The display 28 may, for example, comprise a three-dimensional touch display. The user input interface may comprise devices allowing the mobile terminal to receive data, such as a keypad 30, a touch display (e.g., some example embodiments wherein the display 28 is configured as a touch display), a joystick (not shown), a motion sensor 31 and/or other input device. In embodiments including a keypad, the keypad may comprise numeric (0-9) and related keys (#, *), and/or other keys for operating the mobile terminal.
The mobile terminal 10 may comprise memory, such as a subscriber identity module (SIM) 38, a removable user identity module (R-UIM), and/or the like, which may store information elements related to a mobile subscriber. In addition to the SIM, the mobile terminal may comprise other removable and/or fixed memory. The mobile terminal 10 may include volatile memory 40 and/or non-volatile memory 42. For example, volatile memory 40 may include Random Access Memory (RAM) including dynamic and/or static RAM, on-chip or off-chip cache memory, and/or the like. Non-volatile memory 42, which may be embedded and/or removable, may include, for example, read-only memory, flash memory, magnetic storage devices (e.g., hard disks, floppy disk drives, magnetic tape, etc.), optical disc drives and/or media, non-volatile random access memory (NVRAM), and/or the like. Like volatile memory 40, non-volatile memory 42 may include a cache area for temporary storage of data. The memories may store one or more software programs, instructions, pieces of information, data, and/or the like which may be used by the mobile terminal for performing functions of the mobile terminal. For example, the memories may comprise an identifier, such as an international mobile equipment identification (IMEI) code, capable of uniquely identifying the mobile terminal 10.
In some example embodiments, one or more of the elements or components of the remote user interface 3 may be embodied as a chip or chip set. In other words, certain elements or components may comprise one or more physical packages (e.g., chips) including materials, components and/or wires on a structural assembly (e.g., a baseboard). The structural assembly may provide physical strength, conservation of size, and/or limitation of electrical interaction for component circuitry included thereon. In the embodiment of
The apparatus 102 may be embodied as various different types of augmented reality devices including augmented reality glasses and near-eye displays. Regardless of the type of augmented reality device 2 in which the apparatus 102 is incorporated, the apparatus 102 of
The processor 110 may, for example, be embodied as various means including one or more microprocessors with accompanying digital signal processor(s), one or more processor(s) without an accompanying digital signal processor, one or more coprocessors, one or more multi-core processors, one or more controllers, processing circuitry, one or more computers, various other processing elements including integrated circuits such as, for example, an ASIC (application specific integrated circuit) or FPGA (field programmable gate array), one or more other types of hardware processors, or some combination thereof. Accordingly, although illustrated in
The memory 112 may comprise, for example, volatile memory, non-volatile memory, or some combination thereof. In this regard, the memory 112 may comprise a non-transitory computer-readable storage medium. Although illustrated in
As shown in
The media item capturing module 116 can include all hardware, such as a lens or other optical component(s), and software necessary for creating a digital image file from a captured image. The media item capturing module 116 may also provide various media item capturing functionality, such as image zooming functionality, that is, the ability to magnify or de-magnify an image prior to or subsequent to capturing an image.
Alternatively or additionally, the media item capturing module 116 may include only the hardware needed to view an image, while a memory device, such as the memory 112 of the apparatus 102 stores instructions for execution by the processor 110 in the form of software necessary to create a digital image file from a captured image. In an example embodiment, the media item capturing module 116 may further include a processor or co-processor which assists the processor 110 in processing image data and an encoder and/or decoder for compressing and/or decompressing image data. The encoder and/or decoder may encode and/or decode according to, for example, a joint photographic experts group (JPEG) standard or other format.
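For instance, a minimal sketch of such JPEG compression and decompression, with an illustrative (not disclosed) quality setting, might look as follows:

    import cv2
    import numpy as np

    def compress_frame(frame, quality=80):
        """Encode a captured frame as JPEG bytes at the given quality."""
        ok, encoded = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, quality])
        if not ok:
            raise ValueError("frame could not be JPEG-encoded")
        return encoded.tobytes()

    def decompress_frame(data):
        """Decode JPEG bytes back into a displayable image array."""
        return cv2.imdecode(np.frombuffer(data, np.uint8), cv2.IMREAD_COLOR)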
The communication interface 114 may be embodied as any device or means embodied in circuitry, hardware, a computer program product comprising computer readable program instructions stored on a computer readable medium (e.g., the memory 112) and executed by a processing device (e.g., the processor 110), or a combination thereof that is configured to receive and/or transmit data from/to another computing device. In some example embodiments, the communication interface 114 is at least partially embodied as or otherwise controlled by the processor 110. In this regard, the communication interface 114 may be in communication with the processor 110, such as via a bus. The communication interface 114 may include, for example, an antenna, a transmitter, a receiver, a transceiver and/or supporting hardware or software for enabling communications with one or more remote computing devices, such as the remote user interface 3, e.g., mobile terminal 10. The communication interface 114 may be configured to receive and/or transmit data using any protocol that may be used for communications between computing devices. In this regard, the communication interface 114 may be configured to receive and/or transmit data using any protocol that may be used for transmission of data over a wireless network, wireline network, some combination thereof, or the like by which the apparatus 102 and one or more computing devices may be in communication. As an example, the communication interface 114 may be configured to transmit an image that has been captured by the media item capturing module 116 over the network 1 to the remote user interface 3, such as in real time or near real time, and to receive information from the remote user interface regarding an icon or other indication to be presented upon the augmented reality display 118, such as to overlay the image that has been captured and/or overlay an image to the field of view of an augmented reality device, such as augmented reality glasses. The communication interface 114 may additionally be in communication with the memory 112, the media item capturing module 116 and the augmented reality display 118, such as via a bus.
In some example embodiments, the apparatus 102 comprises an augmented reality display 118. The augmented reality display 118 may comprise any type of display, near-eye display, glasses and/or the like capable of displaying at least a virtual graphic overlay on the physical world. The augmented reality display 118 may also be configured to capture an image or a video of a forward field of view when a user engages the augmented reality display, such as with the assistance of the media item capturing module 116. Further, the augmented reality display may be configured to capture an extended field of view by sweeping a media item capturing module 116, such as a video camera and/or the like, over an area of visual interest and compositing frames from the sweep sequence in registration by methods well known in the art of computer vision. The resulting static image covers an area of visual interest larger than that captured continuously by the media item capturing module, and may support display and interaction, including remote guidance, such as from a remote user interface. As such, an augmented reality device 2 of
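The compositing-in-registration step described above is, in essence, image stitching. As a hedged sketch, one could sample frames from such a sweep and combine them with OpenCV's high-level stitcher; the sampling stride below is an assumption made for illustration, not a disclosed parameter.

    import cv2

    def composite_sweep(video_path, stride=10):
        """Build a static composite from a sweep over an area of visual interest."""
        capture = cv2.VideoCapture(video_path)
        frames, index = [], 0
        while True:
            ok, frame = capture.read()
            if not ok:
                break
            if index % stride == 0:  # sample the sweep rather than every frame
                frames.append(frame)
            index += 1
        capture.release()
        stitcher = cv2.Stitcher.create(cv2.Stitcher_PANORAMA)
        status, panorama = stitcher.stitch(frames)
        return panorama if status == cv2.Stitcher_OK else None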
According to one embodiment, the augmented reality device 2 may be configured to display an image of the field of view of the augmented reality device 2 along with an icon or other indication representative of an input to the remote user interface 3 with the icon or other indication being overlaid, for example, upon the image of the field of view. In one embodiment, a first user may wear an augmented reality device 2, such as augmented reality glasses, augmented reality near-eye displays and/or the like, while a second user interacts with a remote user interface 3. In another embodiment, a first user may engage an augmented reality device 2, and a plurality of users may interact with a plurality of remote user interfaces. Further still, a plurality of users may engage a plurality of augmented reality devices and interact with at least one user, who may be interacting with a remote user interface. The one or more users interacting with the remote user interface may provide separate inputs to separate remote user interfaces, share a cursor displayed on separate remote user interfaces representing a single input and/or the like. As previously mentioned and as shown at 150 in
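One possible way to overlay an icon upon the displayed image of the field of view, sketched here under the assumption of a small RGBA icon image that fits entirely within the view, is straightforward alpha blending:

    import numpy as np

    def overlay_icon(view, icon, x, y):
        """Alpha-blend an RGBA icon onto a BGR view with its top-left at (x, y)."""
        h, w = icon.shape[:2]
        region = view[y:y + h, x:x + w]
        alpha = icon[:, :, 3:4] / 255.0  # per-pixel opacity of the icon
        region[:] = (alpha * icon[:, :, :3] + (1 - alpha) * region).astype(np.uint8)
        return view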
Although a single icon is shown in the embodiment of
In one embodiment of the present invention, as shown in
Alternatively, the second user may provide input at the desired location in one of the images and may indicate that the same location is to be tracked in the other images. Based upon image recognition, feature detection or the like, the processor 110 of the remote user interface 3 may identify the same location, such as the same building, person or the like, in the other images such that an icon or other indication may be imposed upon the same building, person or the like in the series of images, the scene and/or field of view displayed by the augmented reality device 2. In another embodiment, local orientation tracking and/or the like may allow an icon or other indication to remain in a correct location relative to a user viewing the augmented reality device. Further, in another embodiment, the icon or other indication may be tied to the same building, person or the like in the series of images, the scene and/or field of view displayed by the augmented reality device, such that the icon or other indication is displayed only when the associated building, person or the like is present within the series of images, the scene and/or field of view.
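As a hedged sketch of this image-recognition alternative, local descriptors extracted from the marked region could be matched into each new image to decide whether, and where, the icon should appear; ORB is one commonly used detector, and the match-count threshold below is an assumption for illustration.

    import cv2
    import numpy as np

    def locate_feature(reference_patch, new_image, min_matches=10):
        """Return the feature's location in new_image, or None if it is absent."""
        orb = cv2.ORB_create()
        kp1, desc1 = orb.detectAndCompute(reference_patch, None)
        kp2, desc2 = orb.detectAndCompute(new_image, None)
        if desc1 is None or desc2 is None:
            return None  # feature absent, so the icon is simply not displayed
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(desc1, desc2), key=lambda m: m.distance)
        if len(matches) < min_matches:
            return None  # too few matches to treat the feature as present
        # Average the best matched keypoint positions as the icon's new location.
        points = np.float32([kp2[m.trainIdx].pt for m in matches[:min_matches]])
        return tuple(points.mean(axis=0))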
Referring now to
Referring now to
Accordingly, blocks of the flowcharts support combinations of means for performing the specified functions. It will also be understood that one or more blocks of the flowcharts, and combinations of blocks in the flowcharts, may be implemented by special purpose hardware-based computer systems which perform the specified functions, or combinations of special purpose hardware and computer program product(s).
Although described above in conjunction with an embodiment having a single augmented reality device 2 and a single user interface 3, a system in accordance with another embodiment of the present invention may include two or more augmented reality devices and/or two or more user interfaces. As such, a single user interface 3 may provide inputs that define icons or other indications to be presented upon the display of two or more augmented reality devices. Additionally or alternatively, a single augmented reality device may receive inputs from two or more user interfaces and may augment the image of its surroundings with icons or other indications defined by the multiple inputs. In one embodiment, the first and second users may each have an augmented reality device 2 and a user interface 3 so that each user can see not only his or her own surroundings via the augmented reality display, but also an image from the augmented reality display of the other user. Additionally, each user of this embodiment can provide input via the user interface to define icons or other indications for display by the augmented reality device of the other user.
The above described functions may be carried out in many ways. For example, any suitable means for carrying out each of the functions described above may be employed to carry out embodiments of the invention. In one embodiment, the means for performing operations 200-206 of
Referring now to
The above described functions may be carried out in many ways. For example, any suitable means for carrying out each of the functions described above may be employed to carry out embodiments of the invention. In one embodiment, the means for performing operations 222-228 of
Many modifications and other embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these inventions pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the embodiments of the invention are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the invention. Moreover, although the foregoing descriptions and the associated drawings describe example embodiments in the context of certain example combinations of elements and/or functions, it should be appreciated that different combinations of elements and/or functions may be provided by alternative embodiments without departing from the scope of the invention. In this regard, for example, different combinations of elements and/or functions than those explicitly described above are also contemplated within the scope of the invention. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.