Enterprise workflows, such as those related to inventory management, are often optimized to reduce the time required by users to perform various tasks. Such tasks may include, for example, scanning inventory with a device such as a mobile computer equipped with an optical scanner, prior to shipping the inventory to another location. When a workflow includes alternative paths rather than a single sequential path, the user may be required to stop the current flow of operation, i.e. scanning and moving items, and then manipulate the device to locate the appropriate menu or field in an interface provided by the device, in order to activate the appropriate alternative path.
The above scenario may result in an undesirably long interruption to the optimized scanning and shipping workflow. Accordingly, there is a need for a method for controlling an alternative user interface in such devices.
The accompanying figures, where like reference numerals refer to identical or functionally similar elements throughout the separate views, together with the detailed description below, are incorporated in and form part of the specification, and serve to further illustrate embodiments of concepts that include the claimed invention, and explain various principles and advantages of those embodiments.
Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of embodiments of the present invention.
The apparatus and method components have been represented where appropriate by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
A method for controlling an alternative motion-based user interface in a device comprising a projector is provided. The method comprises receiving, at the device, an input comprising an instruction to switch from a previous user interface to the alternative user interface. The device then generates an image containing a plurality of selectable elements and a pointing element, and projects the image on a surface with the projector. The device detects a movement of the device, and updates the image to stabilize the plurality of selectable elements against the detected movement and to track the detected movement with the pointing element. The device projects the updated image on the surface, and selects one of the selectable elements based on a motion vector of the pointing element relative to the selectable elements. The device performs an action corresponding to the selected one of the selectable elements.
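By way of illustration only, the following Python sketch outlines one possible realization of the control loop summarized above. Every identifier in the sketch (read_motion, project, match_rule, perform_action, command_released, the example element names and coordinate values) is a hypothetical placeholder introduced for illustration rather than a feature of any embodiment.

```python
from typing import Callable, Optional

Vec = tuple[float, float]


def alternative_ui_loop(
    read_motion: Callable[[], Vec],                  # per-frame device displacement (dx, dy)
    project: Callable[[dict], None],                 # output the image via the projector/display
    match_rule: Callable[[dict, list[Vec]], Optional[str]],  # rule check: element name or None
    perform_action: Callable[[str], None],
    command_released: Callable[[], bool],
) -> None:
    # Positions in "stabilized" coordinates: the selectable elements stay put while
    # the pointing element moves with the device.
    elements: dict[str, Vec] = {"load": (40.0, 50.0), "unload": (120.0, 50.0)}
    pointer: Vec = (80.0, 50.0)
    track: list[Vec] = [pointer]

    project({"elements": elements, "pointer": pointer})
    while not command_released():
        dx, dy = read_motion()
        pointer = (pointer[0] + dx, pointer[1] + dy)  # pointer tracks the detected movement
        track.append(pointer)
        # In the projected image this is equivalent to relocating the elements by
        # (-dx, -dy) while the pointing element keeps its place in the image.
        project({"elements": elements, "pointer": pointer})

        selected = match_rule(elements, track)        # compare the motion vector to the rules
        if selected is not None:
            perform_action(selected)
            return
```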
The device 100 comprises a processor 105, an image output apparatus 110 comprising at least one of a display 115 and a projector 120, a motion sensor 125, an input apparatus 130, an optical scanner 135, optionally a camera 140, and a memory 145. The device 100 may also comprise a communications interface 147. The components of the device 100 are connected by one or more communication buses 148 (e.g. Universal Serial Bus (USB), Peripheral Component Interconnect (PCI) and the like) that carry data between the components. While
The image output apparatus 110 operates to present images to a user of the device 100 under the control of the processor 105. For example, the images may be received by the image output apparatus 110 from the processor 105 over the communication bus 148. The image output apparatus 110 thus comprises any integrated circuits (ICs), buses and the like necessary to convey data from the processor 105 to the display 115, the projector 120, or both the display 115 and the projector 120. The image output apparatus 110 comprises at least one of the display 115 and the projector 120, and may also comprise both the display 115 and the projector 120.
The display 115 may be realized as an electronic display configured to graphically display information and/or content under the control of the processor 105. Depending on the implementation of the embodiment, the display may be realized as a liquid crystal display (LCD), a touch-sensitive display, a cathode ray tube (CRT), a light emitting diode (LED) display, an organic light emitting diode (OLED) display, a plasma display, a projection display, or another suitable electronic display. The projector 120 may be realized as a digital light processing (DLP) apparatus, employing one or more digital micromirror devices (DMDs) to project an image received from the processor 105 onto a surface external to the device 100. The projector 120 may also be implemented as a laser-based projection device, or a liquid crystal on silicon (LCoS) device.
The motion sensor 125 may be realized as an accelerometer, a gyroscope, or a combination of accelerometers and gyroscopes. The motion sensor 125 is connected to the processor 105 via the communication bus 148 and operates to provide the processor 105 with data describing the movement of the device 100. For example, the data provided to the processor 105 may include a description of a velocity and direction of movement of the device 100, an angle of inclination of the device 100, an acceleration of the device 100, and the like.
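Purely as an illustration of the kind of data the motion sensor 125 might supply, the following sketch defines a hypothetical motion sample and a crude integration of acceleration into displacement; the field names, units and integration scheme are assumptions, not features of the embodiments.

```python
from dataclasses import dataclass


@dataclass
class MotionSample:
    """One hypothetical reading from a motion sensor such as the motion sensor 125."""
    timestamp_s: float
    acceleration: tuple[float, float, float]   # m/s^2 along the device x, y, z axes
    angular_rate: tuple[float, float, float]   # rad/s about the device x, y, z axes
    inclination_deg: float                     # angle of inclination of the device


def displacement(velocity: tuple[float, float, float],
                 sample: MotionSample, dt: float) -> tuple[float, float, float]:
    """Crude single-step integration of acceleration into displacement over dt seconds;
    a real device would use a far more robust sensor-fusion scheme."""
    vx, vy, vz = velocity
    ax, ay, az = sample.acceleration
    return (vx * dt + 0.5 * ax * dt * dt,
            vy * dt + 0.5 * ay * dt * dt,
            vz * dt + 0.5 * az * dt * dt)
```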
The input apparatus 130 operates to receive input from the user of the device 100 and provide that input to the processor 105 via the communication bus 148. The input apparatus 130 may be realized as one or more of a keypad, a touch screen integrated with the display 115 or provided as a separate apparatus from the display 115, a pressure sensor, a microphone, one or more buttons such as a trigger-style button, and the like.
The optical scanner 135 operates to emit light, such as laser light, onto a surface carrying a graphical indicator, such as a barcode, to be decoded. The emission of light by the optical scanner 135 may be initiated, in one embodiment, by actuation of the input apparatus 130 by the user of the device 100. The optical scanner 135 captures the reflected or emitted portion of that light and decodes, from that captured portion, various types of data from the graphical indicator. The optical scanner 135 operates to provide the decoded data to the processor 105 for further processing.
The camera 140 captures light reflecting from or emitted by objects and surfaces in the vicinity of the device 100, and transmits image data to the processor 105 representing the captured light. The camera 140 thus includes a lens assembly for focusing incoming light, as well as a sensor. The sensor may be realized as an analog sensor, or as a digital sensor such as a charge-coupled device (CCD) sensor or a complementary metal-oxide-semiconductor (CMOS) sensor. In some embodiments, the optical scanner 135 may be a function of the camera 140, rather than a separate hardware component as illustrated in
The memory 145 may be an IC memory chip containing any form of RAM (random-access memory), a CD-RW (compact disk with read write), a hard disk drive, a DVD-RW (digital versatile disc with read write), a flash memory card, external subscriber identity module (SIM) card or any other non-transitory medium for storing digital information. The memory 145 comprises applications 150, and motion vector rules 155. The applications 150 include various software and/or firmware programs necessary for the operation of the device 100 as well as software and/or firmware programs (e.g. inventory management applications for operating the optical scanner 135 to scan and record boxes, pallets and the like) that address specific requirements of the user of the device 100. In accordance with the embodiments, the device 100 initiates an alternative user interface control process, based on movement of the device 100 and on images presented by the image output apparatus 110, to permit the user of the device to provide input to the applications 150.
The communications interface 147 enables the device 100 to communicate with other devices over local connections (e.g. Bluetooth, local area network) or via a wide area network such as the Internet, mobile networks or a combination thereof. The communications interface 147 may therefore include one or more transmitters, one or more receivers (e.g. an antenna) and any encoding or decoding components for communicating with other devices. Data received at the communications interface may be provided to the processor 105 for further processing via the bus 148, and the processor 105 may provide data to the communications interface 147 over the bus 148 for transmission to another device. Examples of data received at the communications interface 147 and provided to the processor 105 include input data in addition to, or instead of, the data provided by the input apparatus 130; and image data instead of, or in addition to, images retrieved from the memory 145. Data stored in the memory 145 may also be updated via transmissions received at the communications interface 147.
In one embodiment, the device 100 may be used in an inventory management workflow as a scanning device. For example, the user of the device 100 may repeatedly operate the input apparatus 130 to cause the optical scanner 135 to scan and decode a series of barcodes affixed to various objects (e.g. boxes of inventory). The processor 105 may execute the applications 150 to store the decoded barcode data in the memory 145. The inventory management workflow may also branch, or may comprise multiple workflows, such that more than one course of action is available to the user at some points in time. For example, there may be a workflow or branch for scanning items to be loaded for delivery, and another workflow or branch for unloading items from a delivery; it may be necessary or at least desirable to switch between these two workflows or branches when unloading and then loading a delivery vehicle. In another example, at times, it may be necessary for the user of the device 100 to suspend the scanning process and branch to an exception workflow to enter exception data or take some other action. For example, exception data may be required when an actual number of objects does not match the number required to fulfill an order. In this example, the user may be a warehouse worker scanning items and loading them for delivery. If there are not enough items to fulfill a given order, the user may branch to an exception workflow in which the scanning process is interrupted and the user enters the actual quantity of items available (in contrast to the quantity specified by the order) in an exception quantity field. As another example, exception data may be required when the same object was erroneously scanned more than once. In a further example of a branch to an exception workflow, the user of the device 100 may be responsible for loading a vehicle for deliveries by retrieving items, scanning the items and placing them in the vehicle. When the user retrieves an item and determines that that item is intended for a different vehicle, the user may branch to an exception workflow for transmitting an indication from the device 100 informing other system components of the misplaced item. When the above exceptions or other branches have been completed, the device 100 may return to the previous functionality, such as scanning and recording inventory.
In one embodiment, the entry of the device 100 into one of a plurality of alternative workflows or branches of a workflow may be realized by a predefined command provided to the input apparatus 130 by the user. For example, when the input apparatus 130 includes a trigger button for operating the optical scanner 135 as described above, the predefined command may be a long press (e.g. longer than a predefined period of time stored in the memory 145). In other embodiments, the command may be a movement of the device 100 by the user, an audible command captured by the device 100, and the like. In some embodiments, the command may be generated automatically by the device 100 itself upon detection of a predefined condition, such as a location, time of day, duplicate scan, and the like.
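A minimal sketch of how a long press might be distinguished from an ordinary trigger press is shown below; the threshold value, the polling interval and the callable is_trigger_pressed are assumptions introduced for illustration only.

```python
import time
from typing import Callable

LONG_PRESS_S = 0.8  # illustrative; an actual threshold would be stored in the memory 145


def classify_press(is_trigger_pressed: Callable[[], bool]) -> str:
    """Distinguish a short trigger press (ordinary scan) from a long press (the
    predefined command to switch to the alternative user interface)."""
    # Wait for the trigger to go down.
    while not is_trigger_pressed():
        time.sleep(0.01)
    pressed_at = time.monotonic()
    # Classify while the trigger is held.
    while is_trigger_pressed():
        if time.monotonic() - pressed_at >= LONG_PRESS_S:
            return "alternative_ui"
        time.sleep(0.01)
    return "scan"
```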
In one embodiment, the arrival of the device 100 at a point in a workflow where multiple possible branches or other workflows are available may initiate the alternative user interface control process. During the alternative user interface control process, the device 100 generates an image containing selectable elements and a pointing element. The selectable elements may be menu options retrieved from the applications 150, and the pointing element may be a cursor, a crosshair, or the like. Having generated the image, the device 100 outputs the image through the image output apparatus 110. In one embodiment, in which the image output apparatus 110 includes the projector 120, outputting the image comprises projecting the image on a surface in the vicinity of the device 100 using the projector 120. In some embodiments, in which the image output apparatus 110 includes the display 115, outputting the image comprises presenting the image on the display 115. In some of the embodiments in which the image is presented on the display 115, the device 100 may also capture a video feed of an object using the camera 140 and present the image on the display 115 as an overlay on the video feed. Additionally, in some embodiments, outputting the image may comprise a combination of projection using the projector 120 and presentation using the display 115, with or without the video feed. In other embodiments, the pointing element may be transparent so as to be invisible in the projected or displayed image. In still other embodiments, the pointing element may be virtual in that a position of the pointing element is stored by the device 100, but the pointing element is omitted from the image itself.
Having outputted the image, the device 100 detects movement of the device 100 with the motion sensor 125 and updates the image to stabilize the selectable elements against the detected movement and to track the detected movement with the pointing element. Stabilization of the selectable elements against the detected movement by the device 100 comprises manipulating, by the device 100, the selectable elements in the updated image to reduce their apparent motion from the perspective of the user. In other words, stabilization includes reducing or eliminating the effect of the detected movement of the device 100 on the position of the selectable elements in space. Tracking the movement of the device 100 with the pointing element, on the other hand, comprises manipulating, by the device 100, the pointing element in the updated image to follow the movement of the device 100 in space. The updated image may therefore include, for example, the menu options discussed above, relocated within the image to counter the movement of the device 100, while the cursor or crosshair discussed above appears to have moved within the image in relation to the menu options.
The device 100, after generating the updated image, outputs the updated image using the image output apparatus 110. Thus, the updated image replaces the initial image on the image output apparatus 110. The operations performed by the device 100 to update the image and output the updated image depend on the nature of the image output apparatus 110. In one embodiment, in which the image output apparatus 110 comprises the projector 120, updating the image comprises relocating the selectable elements in the updated image to counter the movement of the device 100 (e.g. by rotation, translation and the like, in a direction opposite to a direction of the movement of the device 100). Updating the image also comprises maintaining the location of the pointing element in the updated image. In some embodiments, the image output apparatus comprises the display 115, and updating the image comprises relocating the selectable elements and maintaining the location of the pointing element as discussed above.
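One possible way to relocate the selectable elements to counter a detected device movement, while leaving the pointing element in place, is sketched below; the inverse-rotation-plus-inverse-translation formulation and the parameter names are assumptions chosen for illustration and are not prescribed by the embodiments.

```python
import math

Point = tuple[float, float]


def stabilize_elements(elements: dict[str, Point],
                       d_translate: Point,
                       d_rotate_rad: float,
                       pivot: Point) -> dict[str, Point]:
    """Relocate the selectable elements to counter a detected device movement:
    apply the inverse rotation about a pivot (e.g. the image centre) followed by the
    inverse translation. The pointing element is intentionally left untouched, so it
    appears to move with the device relative to the stabilized elements."""
    cos_a, sin_a = math.cos(-d_rotate_rad), math.sin(-d_rotate_rad)
    px, py = pivot
    out = {}
    for name, (x, y) in elements.items():
        # inverse rotation about the pivot
        rx = cos_a * (x - px) - sin_a * (y - py) + px
        ry = sin_a * (x - px) + cos_a * (y - py) + py
        # inverse translation
        out[name] = (rx - d_translate[0], ry - d_translate[1])
    return out
```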
In some embodiments, the image output apparatus 110 includes the projector 120, and the projector 120 includes a movable lens. For example, the projector 120 may be coupled to a motor within a housing of the device 100, or the lens only may be coupled to a motor, or the lens may be a liquid lens whose shape changes with varying electrical current applied to the lens. In such embodiments, updating the image may comprise maintaining the location of the selectable elements in the updated image (although the selectable elements may still be skewed or zoomed, for example), and relocating the pointing element in the image to track the movement of the device 100. In such embodiments, outputting the updated image comprises repositioning the lens to counter the movement of the device 100 and projecting the updated image. As a result, the selectable elements (whose locations in the updated image were maintained) remain in place on the projection surface, and the pointing element (whose location in the updated image was altered to track the movement of the device 100) moves along the projection surface to track the movement of the device 100. In other embodiments, stabilization of the selectable elements may be achieved by a combination of relocating the selectable elements in the image and repositioning the projector lens, thus extending the range of motion of the device 100 over which the selectable elements may be stabilized.
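For the movable-lens variant, the division of labour can be sketched as follows: the selectable elements keep their image positions, the pointing element is relocated to track the device, and a lens offset countering the device motion is computed. The function and parameter names are hypothetical.

```python
Point = tuple[float, float]


def update_for_movable_lens(pointer: Point, device_motion: Point) -> tuple[Point, Point]:
    """Illustrative update for a projector with a movable (or liquid) lens: the
    selectable elements are left where they are in the image, the pointing element is
    moved to track the device, and the lens is repositioned opposite to the device
    motion so the projected elements stay put on the surface.
    Returns (new_pointer_position, lens_offset)."""
    new_pointer = (pointer[0] + device_motion[0], pointer[1] + device_motion[1])
    lens_offset = (-device_motion[0], -device_motion[1])
    return new_pointer, lens_offset
```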
Following the output of the updated image, or simultaneously with outputting the updated image, the device 100 determines a motion vector of the pointing element relative to the known positions of the selectable elements. The motion vector describes the motion of the pointing element, across successive updated images, relative to the selectable elements. In some embodiments, the motion vector comprises a direction and velocity of movement of the pointing element relative to the selectable elements. In some embodiments, the motion vector comprises, for a series of updated images, distances between the pointing element and each selectable element (for example, the distances between the pointing element and every edge of each selectable element).
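As one illustration, the motion vector of the pointing element relative to a selectable element might be computed from the two most recent updated images as a direction and a speed, as in the following sketch; the names and units are assumptions.

```python
import math

Point = tuple[float, float]


def relative_motion_vector(pointer_track: list[Point],
                           element_track: list[Point],
                           frame_dt: float) -> tuple[float, float]:
    """Direction (radians) and speed (units per second) of the pointing element relative
    to one selectable element, computed from the two most recent updated images.
    A hypothetical illustration; the embodiments only require some description of the
    relative motion."""
    rel = [(px - ex, py - ey)
           for (px, py), (ex, ey) in zip(pointer_track, element_track)]
    (x0, y0), (x1, y1) = rel[-2], rel[-1]
    dx, dy = x1 - x0, y1 - y0
    return math.atan2(dy, dx), math.hypot(dx, dy) / frame_dt
```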
The device 100 compares the motion vector to the rules 155. The rules 155 define conditions that, when met by the motion vector, cause the selection of one of the selectable elements. In some embodiments, the rules specify predefined motion vectors (e.g. angles and velocities) that correspond to specific selectable elements. In some embodiments, the rules specify threshold conditions that do not correspond to specific selectable elements. For example, one such condition may specify that when the motion vector indicates that the pointing element has traversed two opposing edges of a selectable element in a certain time frame, that selectable element is selected (regardless of which selectable element it is).
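The "traversed two opposing edges" threshold condition mentioned above could be evaluated along the lines of the following sketch, in which the pointing element's track is expressed relative to a selectable element's bounding box; the rectangle representation and the frame-count parameter are illustrative assumptions rather than features of the rules 155.

```python
Point = tuple[float, float]
Rect = tuple[float, float, float, float]  # left, top, right, bottom


def crossed_opposing_edges(rel_track: list[Point], box: Rect, max_frames: int) -> bool:
    """Illustrative threshold rule: True when, within the last `max_frames` updated
    images, the pointing element's track (relative to a selectable element's bounding
    box) extends from beyond one vertical edge to beyond the opposite vertical edge
    while passing at the element's height."""
    left, top, right, bottom = box
    recent = rel_track[-max_frames:]
    xs = [x for x, y in recent if top <= y <= bottom]  # samples level with the element
    if not xs:
        return False
    return min(xs) < left and max(xs) > right
```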
When the motion vector does not match any of the rules 155, the device 100 continues to detect further movement of the device 100, update the image outputted by the image output apparatus 110, and repeat the determination of a motion vector and the comparison of the motion vector to the rules 155. In some embodiments, during such repetition, the device 100 may monitor the input apparatus 130 to determine whether the predefined command mentioned earlier (e.g. a long press of a trigger button) has been released (i.e. is no longer being applied) or whether a timeout period has elapsed without movement of the pointing element relative to the selectable elements. When the command has been released, the generation and updating of images ceases and the device 100 returns to the operations being performed before initiating the alternative user interface control process.
When, on the other hand, the motion vector does match one of the rules 155, the device 100 selects one of the selectable elements based on the motion vector. In some embodiments, the matched rule itself may specify a particular selectable element. In other embodiments, the rules 155 do not specify particular selectable elements, and the selection is based on which motion vector data resulted in the match. For example, if the motion vector includes position data indicating that the pointing element traversed two opposing edges of a specific selectable element, and the rules 155 contain a rule defining “traverse two opposing edges of an element” as a selection event, then the device 100 selects whichever one of the selectable elements was traversed by the pointing element.
In response to the selection of a selectable element, the device 100 performs an action corresponding to the selected one of the selectable elements. For example, the action may be to present a menu with the image output apparatus 110, to launch another one of the applications 150, or the like. As discussed in the examples of workflow branches above, the action may be to present an exception quantity field and a number pad or keypad allowing the user of the device 100 to enter a number or other data into the field. Following the performance of the action corresponding to the selected one of the selectable elements, and optionally the performance of additional actions, the device 100 may return to the previous user interface, such as the user interface employed to scan items and record the scanned data in the memory 145. The return to a previous user interface, however, may be omitted in some embodiments. For example, in the inventory management examples discussed above, the entry of exception data or other workflow branching by the user may be deferred until all other items have been scanned, in which case it may not be necessary to return to the previous user interface for scanning items.
At block 210, the device 100 generates an image containing a plurality of selectable elements and a pointing element. The selectable elements and the pointing element may be contained within the data defining the applications 150. For example, in some embodiments, the application being executed at the time of the input received at block 205 may contain various menus comprising selectable elements (e.g. menu options) and the device 100 may select the elements of one of those menus at block 210. In some embodiments, the device 100 may execute a different one of the applications 150 to retrieve the selectable elements and pointing element from the memory 145, or may receive the selectable elements and pointing element via the communications interface 147. In some embodiments, the selectable elements and the pointing element may be selected from a number of possible sets of selectable elements and pointing elements. The selection may be based on, for example, the current context of use of the device 100 (e.g. whether the device 100 is being used to scan inventory for shipping or receiving), the time of day, a location of the device 100, and the like. The image generated at block 210 contains the selectable elements and the pointing element in predetermined default positions, for example centered in the image.
Referring back to
Returning to
At block 225 of
The device 100 may also, to track the detected movement with the pointing element 320, maintain the original location of the pointing element in the image. As shown in
At block 230 of
At block 232 of
At block 235, the device 100 operates to compare the motion vector to the rules 155 stored in the memory 145 and determine whether the motion vector matches any of the rules 155. As mentioned earlier, the rules 155 define various conditions to be met by the motion vector for causing some further action to be taken by the device 100. In some embodiments, the rules specify predefined motion vectors (e.g. angles and velocities) that correspond to specific selectable elements. In other embodiments, the rules specify threshold conditions, such as a requirement that the motion vector traverse two opposing edges of a selectable element in a certain period of time. Another example of such a condition is a requirement that the motion vector remain within a certain distance of the center of a selectable element for a period of time. In some embodiments, where a long press of a trigger button causes the initiation of the alternative user interface control process, an example condition may be a requirement that the motion vector end within the boundaries of a selectable element when the long press is released.
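The dwell-style condition described above (remaining within a certain distance of an element's center for a period of time) could be checked as in the following sketch; the radius and frame-count parameters, and the use of element-relative coordinates, are assumptions introduced for illustration.

```python
import math

Point = tuple[float, float]


def dwelled_near_centre(rel_track: list[Point], radius: float, min_frames: int) -> bool:
    """Illustrative dwell rule for block 235: True when the pointing element has stayed
    within `radius` of a selectable element's centre (the origin of the relative
    coordinates) for the last `min_frames` updated images."""
    if len(rel_track) < min_frames:
        return False
    return all(math.hypot(x, y) <= radius for x, y in rel_track[-min_frames:])
```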
When the determination at block 235 is negative (that is, when the motion vector does not match any of the rules 155) the performance of the method 200 proceeds, optionally, to block 240 and then returns to block 220 to continue monitoring for device movement. At block 240, the device 100 determines whether the input (optionally) received at block 205 has been released, or whether a timeout period has elapsed without movement of the pointing element relative to the selectable elements. In some embodiments, the device 100 may alternatively determine at block 240 whether the device 100 has moved to such an extent that the object originally in the field of view of the projector 120 or the camera 140 is no longer in that field of view. Thus, for example, the determination at block 240 may be whether the trigger button depressed at block 205 has been released. When the input has been released, the performance of method 200 ends and the device 100 returns to the function being performed before method 200 began. Otherwise, the device 100 proceeds to repeat blocks 220, 225, 230 and 235.
When the determination at block 235 is affirmative (that is, when the motion vector does match one of the rules 155), the device 100 proceeds to block 245, shown in
At block 250, in response to the selection of a selectable element at block 245, the device 100 performs an action corresponding to the selected one of the selectable elements. For example, the action may be to present a menu with the image output apparatus 110, to launch another one of the applications 150, or the like. The device 100 may also conduct a final update to the image prior to performing the action, as shown in
Accordingly, various embodiments described above may be advantageously implemented on devices to provide and control alternative motion-based user interfaces. Embodiments of the present disclosure describe several ways in which the device may present an alternative user interface in order to provide ready access to certain applications or functions of the device. The embodiments described above allow alternative workflows to be handled with minimal additional input by the user, avoiding the need to back out of applications or navigate through various levels of menus. The embodiments also allow the device to accept input in a manner that closely mimics the function previously being performed by the device (e.g. scanning inventory), further reducing the time lost to the switch between alternative workflows. In addition, the embodiments allow for control of the device using relatively small gestures by a user, and provide a control mechanism that may be applied with little or no modification to a variety of different menus of selectable elements.
In the foregoing specification, specific embodiments have been described. However, one of ordinary skill in the art appreciates that various modifications and changes may be made without departing from the scope of the invention as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of present teachings.
The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as critical, required, or essential features or elements of any or all the claims. The invention is defined solely by the appended claims including any amendments made during the pendency of this application and all equivalents of those claims as issued.
Moreover in this document, relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms “comprises,” “comprising,” “has”, “having,” “includes”, “including,” “contains”, “containing” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises, has, includes, contains a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “comprises . . . a”, “has . . . a”, “includes . . . a”, “contains . . . a” does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises, has, includes, contains the element. The terms “a” and “an” are defined as one or more unless explicitly stated otherwise herein. The terms “substantially”, “essentially”, “approximately”, “about” or any other version thereof, are defined as being close to as understood by one of ordinary skill in the art, and in one non-limiting embodiment the term is defined to be within 10%, in another embodiment within 5%, in another embodiment within 1% and in another embodiment within 0.5%. The term “coupled” as used herein is defined as connected, although not necessarily directly and not necessarily mechanically. A device or structure that is “configured” in a certain way is configured in at least that way, but may also be configured in ways that are not listed.
It will be appreciated that some embodiments may be comprised of one or more generic or specialized processors (or “processing devices”) such as microprocessors, digital signal processors, customized processors and field programmable gate arrays (FPGAs) and unique stored program instructions (including both software and firmware) that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of the method and/or apparatus described herein. Alternatively, some or all functions could be implemented by a state machine that has no stored program instructions, or in one or more application specific integrated circuits (ASICs), in which each function or some combinations of certain of the functions are implemented as custom logic. Of course, a combination of the two approaches could be used.
Moreover, an embodiment may be implemented as a computer-readable storage medium having computer readable code stored thereon for programming a computer (e.g., comprising a processor) to perform a method as described and claimed herein. Examples of such computer-readable storage mediums include, but are not limited to, a hard disk, a CD-ROM, an optical storage device, a magnetic storage device, a ROM (Read Only Memory), a PROM (Programmable Read Only Memory), an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory) and a Flash memory. Further, it is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein will be readily capable of generating such software instructions and programs and ICs with minimal experimentation.
The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various embodiments for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.