The ability to provide efficient and intuitive interaction between computer systems and users thereof is essential for delivering an engaging and enjoyable user experience. Graphical user interfaces (GUIs) are commonly used for facilitating interaction between an operating user and the computing system. Today, most computer systems employ icon-based GUIs that utilize icons and menus for assisting a user in navigating and launching content and applications on the computing system.
Meanwhile, the popularity of mobile computing devices, coupled with advancements in imaging technology—particularly the inclusion of cameras within such devices—has given rise to a heightened interest in augmented reality (AR). In general, AR refers to overlaying graphical information onto a live video feed of a real-world environment so as to ‘augment’ the image one would ordinarily see. Through the combination of augmented reality and a graphical user interface, even more meaningful interactions are made available to the operating user.
The features and advantages of the present disclosure as well as additional features and advantages thereof will be more clearly understood hereinafter as a result of a detailed description of implementations when taken in conjunction with the following drawings in which:
The following discussion is directed to various examples. Although one or more of these examples may be discussed in detail, the implementations disclosed should not be interpreted, or otherwise used, as limiting the scope of the disclosure, including the claims. In addition, one skilled in the art will understand that the following description has broad application, and the discussion of any implementations is meant only to be an example of one implementation, and not intended to intimate that the scope of the disclosure, including the claims, is limited to that implementation. Furthermore, as used herein, the designators “A”, “B” and “N” particularly with respect to the reference numerals in the drawings, indicate that a number of the particular feature so designated can be included with examples of the present disclosure. The designators can represent the same or different numbers of the particular features.
The figures herein follow a numbering convention in which the first digit or digits correspond to the drawing figure number and the remaining digits identify an element or component in the drawing. Similar elements or components between different figures may be identified by the use of similar digits. For example, 143 may reference element “43” in FIG. 1.
Ordinarily, a user interface contains a multitude of information, which is presented via a traditional menu structure. Information is layered in a linear fashion according to a typical workflow. Whether designed for on-board printer control panels or third-party displays (e.g., smartphone or tablet), user interface software is designed to be compatible with a wide range of products and their varied capabilities. Consequently, a large amount of contextually irrelevant data and interaction options are presented. Additionally, current methods for remotely interacting with peripheral products are deficient in that the interaction is tied to the peripheral product only metaphorically, by an abstract identifier such as a pictorial representation or product identifier, for example.
Today, interaction with a peripheral device (e.g., a printer) requires one to perform tasks either through the on-product display menu, driver software, or another application. For the latter two options, the peripheral device must be searched for, identified as a compatible device, added to the list of trusted devices, and then interacted with via options presented in a traditional menu system. This plethora of steps and interactions is time-consuming and oftentimes frustrating (e.g., device not found) for the operating user. Augmented reality allows for a more efficient and tangible interaction between a remote device and a physical peripheral object.
Implementations of the present disclosure utilize an augmented reality environment to automatically recognize a physical object with which a user desires to interact, while also providing contextually relevant information and interaction options to the user. In one example, an optical sensor is activated on a mobile device, and a communicable object is automatically connected with the mobile device upon being detected by the optical sensor. Moreover, a designated action, such as a print or scan operation, is executed on the peripheral device upon receiving input associated with a graphical representation of the peripheral device on the user interface of the mobile device. Accordingly, augmented reality offers a remarkable opportunity to simplify user interaction and to make virtual interaction with a physical object more tangible and logical.
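By way of illustration only, the following Python sketch models this detect-connect-act flow. Every class, method, and field name is hypothetical; the disclosure specifies behavior, not an API.

```python
# Hypothetical sketch of the flow: activate sensor, detect, auto-connect, act.
def on_app_launch(camera, detector, radio):
    """Activate the optical sensor and auto-connect to any detected peripheral."""
    camera.activate()
    frame = camera.capture_frame()
    peripheral = detector.find_peripheral(frame)  # e.g., via an identifier marker
    if peripheral is not None:
        radio.connect(peripheral.address)         # no manual search or pairing step
    return peripheral

def on_touch_of_representation(radio, peripheral, document):
    """Touch input on the device's graphical representation triggers the action."""
    radio.send(peripheral.address, {"action": "print", "payload": document})
```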
Referring now in more detail to the drawings in which like numerals identify corresponding parts throughout the views,
Processor 105 may be at least one central processing unit (CPU), at least one semiconductor-based microprocessor, at least one graphics processing unit (GPU), other hardware devices suitable for retrieval and execution of instructions stored in machine-readable storage medium 114, or combinations thereof. For example, the processor 105 may include multiple cores on a chip, multiple cores across multiple chips, multiple cores across multiple devices (e.g., if the computing device 101 includes multiple node devices), or combinations thereof. Processor 105 may fetch, decode, and execute instructions to implement the approaches described herein. As an alternative or in addition to retrieving and executing instructions, processor 105 may include at least one integrated circuit (IC), other control logic, other electronic circuits, or combinations thereof that include a number of electronic components for performing the functionality described herein.
The wireless module 107 can be used to transmit and receive data to and from other devices. For example, the wireless module 107 may be used to send document data to be printed via the printer device 120, or receive scanned document data from the printer device 120 via the communication interface 123. The wireless module 107 may be configured for short-wavelength radio transmission such as Bluetooth wireless communication. The wireless module 107 may include, for example, a transmitter that may convert electronic signals to radio frequency (RF) signals and/or a receiver that may convert RF signals to electronic signals. Alternatively, the wireless module 107 may include a transceiver to perform functions of both the transmitter and receiver. The wireless module 107 may further include or connect to an antenna assembly to transmit and receive the RF signals over the air. The wireless module 107 may communicate with a network, such as a wireless network, a cellular network, a local area network, a wide area network, a telephone network, an intranet/Internet, or a combination thereof.
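The disclosure names transports (e.g., Bluetooth, local network) but no wire protocol for this exchange. As a stand-in only, the sketch below pushes document bytes over a plain TCP socket to the conventional raw-print port 9100; the host address and file are placeholders.

```python
# Hypothetical stand-in for the wireless module's send path (document data
# to the printer device), using a raw TCP socket on the standard port 9100.
import socket

def send_print_job(host: str, port: int, document: bytes) -> None:
    """Transmit raw document bytes to the peripheral over the network."""
    with socket.create_connection((host, port), timeout=10) as conn:
        conn.sendall(document)

# Usage (address and file are placeholders):
# send_print_job("192.168.1.42", 9100, open("report.pdf", "rb").read())
```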
Display unit 115 represents an electronic visual and touch-sensitive display configured to display images and includes a graphical touch user interface 116 for enabling touch-based input interaction between an operating user and the mobile computing device 101. According to one implementation, the user interface 116 may serve as the display of the system 100. The user interface 116 can include hardware components and software components. Additionally, the user interface 116 may refer to the graphical, textual and auditory information a computer program may present to the user, and the control sequences (e.g., touch input) the user may employ to control the program. In one example system, the user interface 116 may present various pages that represent applications available to the user. The user interface 116 may facilitate interactions between the user and computer systems by inviting and responding to user input and translating tasks and results to a language or image that the user can understand. In one implementation, the user interface 116 is configured to display interactive screens and video images for facilitating user interaction with the computing device 101 and an augmented reality environment.
Meanwhile, image sensor 110 represents an optical image capturing device such as a digital video camera. As used herein, the image sensor 110 is configured to capture images/video of a physical environment within a field of view for displaying to the operating user via the display 115. Furthermore, the object detection module 112 is configured to detect relevant peripheral objects or devices within the field of view of the image sensor 110 for establishing an automatic connection between the mobile computing device 101 and the relevant peripheral device (e.g., printer device 120).
Furthermore, an augmented reality (AR) application 106 can be installed on and executed by the computing device 101. As used herein, application 106 represents executable instructions or software that causes a computing device to perform useful tasks. For example, the AR application 106 may include instructions that, upon the application being opened and launched by a user, cause the processor to activate the image sensor 110 and search (via the object detection module 112) for peripheral objects (e.g., printer 120) with which to automatically pair the mobile computing device 101.
Machine-readable storage medium 114 may be any electronic, magnetic, optical, or other physical storage device that contains or stores executable instructions. Thus, machine-readable storage medium 114 may be, for example, Random Access Memory (RAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage drive, a Compact Disc Read Only Memory (CD-ROM), and the like. As such, the machine-readable storage medium can be non-transitory. As described in detail herein, machine-readable storage medium 114 may be encoded with a series of executable instructions for providing augmented reality for the computing device 101. Still further, storage medium 114 may include software 116 executable by processor 105 that, when executed, causes the processor 105 to perform some or all of the functionality described herein. For example, the augmented reality application 106 may be implemented as executable software within the storage medium 114.
Printer device 120 represents a physical peripheral device and includes a communication interface 123 for establishing wireless communication with the mobile computing device 101 as described above (e.g., over a local wireless network). In one example, the printer device 120 may be a commercial laser printer, consumer inkjet printer, multi-function printer (MFP), all-in-one (AIO) printer, or any print device capable of producing a representation of an electronic document on physical media (i.e., document 125) such as paper or transparency film. The printer device 120 further includes an identifier marker 124 affixed thereon that allows for object detection via a computer vision algorithm (associated with the object detection module 112) that determines the orientation and scale of the object (to which the marker is affixed) in relation to the user or camera 110, as will be described in further detail below.
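The disclosure does not identify a particular computer vision algorithm. One plausible realization of marker-based detection and pose recovery is sketched below using OpenCV's ArUco module; the 4.7+ API, dictionary choice, marker size, and camera intrinsics are all assumptions, not part of the disclosure.

```python
# A sketch of fiducial-marker pose recovery (orientation and scale relative
# to the camera), assuming OpenCV >= 4.7 and a calibrated camera.
import cv2
import numpy as np

MARKER_SIDE_M = 0.05  # assumed physical marker size: 5 cm

def marker_pose(frame, camera_matrix, dist_coeffs):
    """Return (rvec, tvec) of the first detected marker, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
    corners, ids, _rejected = detector.detectMarkers(gray)
    if ids is None:
        return None
    # Marker corners in the marker's own frame, centered at the origin
    # (top-left, top-right, bottom-right, bottom-left).
    h = MARKER_SIDE_M / 2
    obj_pts = np.array([[-h, h, 0], [h, h, 0], [h, -h, 0], [-h, -h, 0]],
                       dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(obj_pts, corners[0][0], camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else None
```

Here rvec encodes the object's orientation and tvec its position (and hence apparent scale) relative to the camera, which is the information the identifier marker 124 is described as providing.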
As shown here, the tablet device 201 connects with the printer device so as to cause printing of physical media 225 corresponding with the electronic document 225′ displayed on the user interface 216 of the tablet device 201. In another example, a user may send a digital video or photo document from the tablet device 201 to a connected television monitor for displaying on the larger display of the monitor. In yet another example, the user may start a file transfer with a connected personal computer by dragging documents on the user interface of the mobile device onto a graphical representation of the personal computer in the augmented reality application.
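Such drag-based interaction reduces, at bottom, to a hit test between the drop point and the on-screen bounds of the detected device's representation. A minimal sketch, with all names illustrative:

```python
# Hypothetical hit test: was the document icon dropped on the device's
# graphical representation? The bounding box would be derived from the
# detection step; coordinates are screen pixels.
def dropped_on_device(drop_xy, device_bbox):
    x, y = drop_xy
    x0, y0, x1, y1 = device_bbox
    return x0 <= x <= x1 and y0 <= y <= y1

# if dropped_on_device(touch_up_xy, printer_bbox):
#     send_print_job(printer_host, 9100, document_bytes)  # from the earlier sketch
```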
Moreover, augmented reality may be used as an alternative to traditional user interface menus and is technically advantageous in that the augmented information presented may be more contextually relevant. As shown here, the augmented image 335′ includes relevant data 326 associated with the physical printer device 320. For example, the relevant data 326 may include the current print queue status, paper count and type, image quality and similar information relevant to the physical printer 320.
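One way such contextual data might be rendered, assuming an OpenCV-style frame and a screen anchor derived from the detected marker (the status fields are placeholders):

```python
# Sketch: overlay contextually relevant device data next to the detected
# printer in the live camera frame.
import cv2

def annotate(frame, anchor_xy, status: dict):
    x, y = anchor_xy
    lines = [f"queue: {status.get('queue', '?')} jobs",
             f"paper: {status.get('paper', '?')}"]
    for i, text in enumerate(lines):
        cv2.putText(frame, text, (x, y + 22 * i),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    return frame
```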
Referring now to
Implementations of the present disclosure provide augmented reality device interfacing. Moreover, many advantages are afforded by the system and method of device interfacing according to implementations of the present disclosure. For instance, the augmented reality interfacing method serves to simplify interaction through contextual menu options while also presenting relevant information and interaction options in a more user-friendly and tangible manner. The present implementations are able to leverage the larger displays and processor capabilities found in tablet computing devices, thus reducing reliance upon on-product displays and lowering production costs of such devices. Furthermore, examples described herein encourage printing from portable devices (e.g., local file system and online/cloud storage) rather than immobile desktop computers, while also attracting and improving print relevance for a younger demographic of users.
Furthermore, while the disclosure has been described with respect to particular examples, one skilled in the art will recognize that numerous modifications are possible. For instance, although examples described herein depict a tablet device as the mobile computing device, the disclosure is not limited thereto. For example, the mobile computing device may be a smartphone, netbook, e-reader, cell phone, or any other portable electronic device having a display and user interface.
Not all components, features, structures, characteristics, etc. described and illustrated herein need be included in a particular example or implementation. If the specification states a component, feature, structure, or characteristic “may”, “might”, “can” or “could” be included, for example, that particular component, feature, structure, or characteristic is not required to be included. If the specification or claim refers to “a” or “an” element, that does not mean there is only one of the element. If the specification or claims refer to “an additional” element, that does not preclude there being more than one of the additional element.
It is to be noted that, although some examples have been described in reference to particular implementations, other implementations are possible according to some examples. Additionally, the arrangement or order of elements or other features illustrated in the drawings or described herein need not be arranged in the particular way illustrated and described. Many other arrangements are possible according to some examples.
The techniques are not restricted to the particular details listed herein. Indeed, those skilled in the art having the benefit of this disclosure will appreciate that many other variations from the foregoing description and drawings may be made within the scope of the present techniques. Accordingly, it is the following claims including any amendments thereto that define the scope of the techniques.