This application relates generally to assisted user device maintenance. The application relates more particularly to assisted end user maintenance of multifunction peripherals using a personal imaging device and augmented reality.
Document processing devices include printers, copiers, scanners and e-mail gateways. More recently, devices employing two or more of these functions are found in office environments. These devices are referred to as multifunction peripherals (MFPs) or multifunction devices (MFDs). As used herein, MFPs are understood to comprise printers, alone or in combination with other of the afore-noted functions. It is further understood that any suitable document processing device can be used.
MFPs are powerful, but complex devices which require regular maintenance. Device maintenance includes restocking of consumables, such as paper, toner or ink. Device maintenance further includes addressing simple issues, such as freeing paper jams. More complex device issues may require attention from a professional service technician.
Various embodiments will become better understood with regard to the following description, appended claims and accompanying drawings wherein:
The systems and methods disclosed herein are described in detail by way of examples and with reference to the figures. It will be appreciated that modifications to disclosed and described examples, arrangements, configurations, components, elements, apparatuses, devices, methods, systems, etc. can suitably be made and may be desired for a specific application. In this disclosure, any identification of specific techniques, arrangements, etc. is either related to a specific example presented or is merely a general description of such a technique, arrangement, etc. Identifications of specific details or examples are not intended to be, and should not be, construed as mandatory or limiting unless specifically designated as such.
When MFPs experience maintenance issues, it is advantageous when they can be addressed directly by end users. Issues treatable by end users result in minimized device downtime, increased worker productivity and decreased worker frustration. There is a varying degree of difficulty within the category of user-addressable maintenance issues. A new user will require assistance or training in handling even the simpler device issues. Certain device issues, such as problems that surface rarely or problems that require following a precise instruction sequence, can be challenging to any user. If end user device maintenance is not done properly, it can exacerbate the problem and possibly damage the device, perhaps to the point where a service call must be placed for a professional technician.
It is advantageous when more issues are addressable by end users over a wide range of device sophistication. User manuals may be complex, difficult to navigate or not readily available.
Example embodiments disclosed herein provide for assisting users in performing device maintenance by use of a personal or portable imaging device, such as a smartphone, smart glasses, tablet computer or the like.
Embodiments detailed further below include an immersive instructional tool whereby the system presents an augmented reality animation in a step-by-step manner to be replicated or mimicked by an end user. This, used in conjunction with machine learning, provides a system that learns and improves recommended solutions over time.
The system suitably provides “perceived reality,” whereby an application ascertains visual and audio anomalies of the system, as read from a feed obtained through a mobile device or wearable, to be used in conjunction with known error heuristics about the system to suggest a hardware fix for the target device. As a result, the system can help the user troubleshoot hardware problems and prevent calls to service technicians.
In an example embodiment, an augmented reality device maintenance system includes a memory storing user instruction data corresponding to user actions to change a state of a multifunction peripheral. The instruction data is stored associatively with state data corresponding to a known state of a multifunction peripheral. A processor stores digital image data from a captured image of the multifunction peripheral in the memory. A display generator generates an image corresponding to received image data on an associated display. The system also receives current device state data corresponding to a current operational state of the MFP and retrieves, from the memory, user instruction data associated with state data that corresponds to received current device state data. The display generator superimposes instructions corresponding to retrieved user instruction data on the generated image.
In other example embodiments disclosed herein, the system employs an imaging device, such as a smartphone, tablet or smart glasses, that feeds audio and visual patterns to the system, which determines whether the device state is indirectly or directly contributing to the reported problem, in addition to simply fixing the reported problem.
In other example embodiments disclosed herein, augmented reality (AR) technology is used in conjunction with a digital imaging device, such as a smartphone, tablet or smart glasses, to overlay digital information onto the physical world.
In other example embodiments, augmented reality, or AR, provides an immersive instructional tool whereby the system presents an augmented reality animation in a step-by-step manner to be replicated or mimicked by the end user. This will help users save money and time with do-it-yourself (DIY) tasks, such as fixing a washing machine, or office tasks, such as fixing a paper jam.
Example embodiments provide a system based on a conversation between the user and system where the task is broken down into steps, and each step is demonstrated on top of the real-world system using AR animation. The user then mimics the task on the real-world device. The hardware detects the device state, such as machine codes, device appearance, device sounds, or the like, triggered in response to user interaction, and that state is communicated to the application to return a success, failure or not-attempted status. A successful step is suitably met with confirmation and presentation of a subsequent step. A failed step is suitably met with presentation of the previous step and an additional hint. A not-attempted status is suitably met with a prompt.
Provision of an AR immersive instructional tool based on a demonstration and mimic feedback interaction can replace instructional videos for an unsophisticated or casual user and reduce office service costs in the workplace.
In accordance with the subject application,
Also illustrated in
To accomplish the foregoing, the user's device suitably communicates with an intelligent device, such as a networked server on which MFP images are stored, including images from different external perspectives. For example, see augmented reality device maintenance system 400 of
Turning now to
Processor 202 is also in data communication with a storage interface 208 for reading or writing data with storage 216, suitably comprised of a hard disk, optical disk, solid-state disk, cloud-based storage, or any other suitable data storage as will be appreciated by one of ordinary skill in the art.
Processor 202 is also in data communication with a network interface 210 which provides an interface to a network interface controller (NIC) 214, which in turn provides a data path to any suitable wired or physical network connection 220, or to a wireless data connection via a wireless network interface, such as WiFi 218. Example wireless connections include cellular, Wi-Fi, wireless universal serial bus (wireless USB), satellite, and the like. Example wired interfaces include Ethernet, USB, IEEE 1394 (FireWire), Lightning, telephone line, or the like. Processor 202 is also in data communication with a hardware monitor 221, suitably amassing state data from subassemblies, sensors, digital thermometers, or the like, and suitably including digital state data including device codes, such as device error codes. Processor 202 can also be in data communication with a document processor interface 222, a BLUETOOTH interface 226 and an NFC interface 228, directly as shown via data path 212.
Processor 202 can also be in data communication with any suitable user input/output (I/O) interface (not shown) which provides data communication with user peripherals, such as displays, keyboards, mice, track balls, touch screens, or the like. As referenced above, hardware monitor 221 suitably provides device event or state data, working in concert with suitable monitoring systems. By way of further example, monitoring systems may include page counters, sensor output, such as consumable level sensors, temperature sensors, power quality sensors, device error sensors, door open sensors, and the like. Data is suitably stored in one or more device logs, suitably as device state data, in storage 216.
Document processor interface 222 is suitable for data communication with MFP functional units 250. In the illustrated example, these units include a copy engine, suitably comprised of copy hardware 240, a scan engine, suitably comprised of scan hardware 242, a print engine, suitably comprised of print hardware 244 and a fax engine, suitably comprised of fax hardware 246. These subsystems together comprise MFP functional hardware 250. It will be understood that functional units are suitably comprised of intelligent units, including any suitable hardware or software platform.
Turning now to
Processor 304 is also in data communication with a storage interface 306 for reading or writing to a data storage system 308, suitably comprised of a hard disk, optical disk, solid-state disk, or any other suitable data storage as will be appreciated by one of ordinary skill in the art.
Processor 304 is also in data communication with a network interface controller (NIC) 330, which provides a data path to any suitable network or device connection, such as a suitable wireless data connection via wireless network interface 338. A suitable data connection to an MFP or server is via a data network, such as a local area network (LAN), a wide area network (WAN), which may comprise the Internet, or any suitable combination thereof. A digital data connection is also suitably made directly with an MFP or server, such as via BLUETOOTH, optical data transfer, Wi-Fi direct, or the like.
Processor 304 is also in data communication with a user input/output (I/O) interface 340 which provides data communication with user peripherals, such as touch screen display 344 via display generator 346, as well as keyboards, mice, track balls, or the like. Peripherals can also be connected via the data bus 314, such as camera 336 and microphone 334. It will be understood that functional units are suitably comprised of intelligent units, including any suitable hardware or software platform.
Display 614 includes an augmented display of the MFP, along with a superimposed image 616 instructing the user where to position their hand to remove the tray. Next, display 620 includes a superimposed image 624 showing a tray grasp. Next, display 628 includes a superimposed image 632 demonstrating tray removal. Next, display 636 illustrates simultaneous overlays comprising an instructional text box 644 and a directional arrow 648 for tray opening.
Next, display 652 illustrates overlay 656 directing the user to remove paper from the paper tray. Display 660 then provides instructional text overlay 664 and graphical overlay 668 to instruct the user on how to grasp paper in the tray. Next, display 672 instructs the user on removal of paper from the tray with overlay 676. Finally, display 680 illustrates paper 684 removed from the paper tray.
Next, captured digital audio and/or video is sent to an augmented reality system at block 812, and pattern mapping with captured data and stored image or sound data is completed at block 816. Mapped data provides for recognition of visual or audio patterns at block 820. An associated augmented reality image is generated at block 824 and displayed at block 828 on an associated display. In this example embodiment, the system suitably diagnoses a device's condition from a captured image and/or captured audio.
Next, any additional action steps that may be desired are relayed to the user at block 1036, and a determination is made whether additional steps are required at block 1040. If so, the process returns to block 1036. If not, final confirmation and any fixes are applied at block 1044, and the process ends at block 1048.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the spirit and scope of the inventions.