The present invention relates to the design of user interfaces for electronic computing devices. In particular, the invention relates to system(s) and method(s) for automatically adapting the user interface of a device to enhance usability in response to the manner and conditions of operation.
Advancements in technology have generated a variety of interactive computer devices used for numerous functions and employing a wide array of applications. Further, a single such device can be employed to effect numerous types of functionality as well as provide multiple applications. Many portable cellular phones act as communication devices, word processing devices, and media players. In order to facilitate and control functionality of a device, the device is typically provided with a user interface. Generally, the user interface is designed to enhance usability of the device. Usability is the degree to which the design of a particular user interface takes into account the human psychology and physiology of the user, and makes the process of using the device effective, efficient and satisfying.
Several user interfaces have been established in order to accommodate the variety of functions and applications available for interactive computer devices while accounting for usability. Devices manipulated through physical buttons, a form of human-machine interface (HMI), are often designed with the arrangement of the buttons accommodating the intended physical manner of operation. Devices comprising display screens for facilitation of interaction often utilize a graphical user interface (GUI). GUIs generally offer graphical icons and visual indicators, as opposed to text-based interfaces, typed command labels, or text navigation, to fully represent the information and actions available to a user. Interaction with the device is usually performed through direct manipulation of the graphical elements. In order to effectuate usability of a GUI, the visual and interactive elements are designed to enhance efficiency and ease of use for the underlying logical design of a stored program. Devices employing a GUI may further design the user interface to account for the manner of physical operation. Many devices employ GUIs on a device with touchscreen interaction, wherein the graphical elements are manipulated by touching the element displayed on the screen in order to facilitate interaction. In order to enhance usability, a device employing a touchscreen GUI may provide a user interface wherein the input elements are aligned along the right side of the device to accommodate operation with the right hand.
However, given the variety of functions and applications available on a device, the manner in which a user operates the device can vary depending on the specific function being exploited and the application in use. For example, several portable electronic devices can be held and operated with one hand or two hands, or set down and operated with a stylus. The manner in which a user chooses to operate the device is often dictated by the function being exploited, such as making a phone call when used as a communication device, or typing on a keypad when used as a word processing device. Likewise, when employing a single functional aspect of the device, such as in the form of a media player, the particular application of the media player can influence the manner of operation. Furthermore, the manner of operation of a device can vary depending on extrinsic factors such as the conditions under which a user operates the device.
Although a user interface may be designed to enhance usability for a specific manner of operation, the user interface elements responsible for control or input interaction remain constant (as in devices with an HMI) or are dictated by the applications as programmed for the device (as in a device with a GUI). Therefore, when a user changes the manner in which he operates the device, the user is forced to accommodate the design of the user interface. The accommodation often entails altering the physical manner in which the user interacts with the device to one that is less efficient or appealing. For example, a user may have to reach across a screen to touch upon a command, thus obstructing the view of an onlooker. As a result, the usability of the device decreases when the manner of operation changes.
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview and is intended to neither identify key or critical elements nor delineate the scope of such embodiments. Its purpose is to present some concepts of the described embodiments in a simplified form as a prelude to the more detailed description that is presented later.
Disclosed herein are system(s) and method(s) for automatically adapting the user interface of a computer-operated device in response to the manner in which a user physically operates the device and the conditions surrounding operation, in order to optimize usability of the device. The system(s) and method(s) pertain to devices using either a graphical user interface (GUI), wherein the user interacts with the device via a touchscreen medium, or a human-machine interface (HMI) comprising buttons (e.g., physical or virtualized) on a computerized device. For example, the device can be a handheld mobile device such as a tablet personal computer, a game controller, or a large interactive display board. The system is particularly useful in devices operated in a variety of manners, such that as the user modifies his manner of operation, the device adapts itself to accommodate the new manner of operation. Examples of different manners of operation include holding a device in one hand versus two, using a stylus, or controlling the function of a computer through the bottom left corner of a large touchscreen display.
When the user interface of the device is a GUI provided on a touchscreen-enabled device, the system adapts the interactive elements, such as input widgets including control panels, volume icons, call buttons, etc., such that the arrangement of the interactive elements enhances usability. The arrangement of the non-interactive elements can also adapt to offset the interactive elements while enhancing the size and arrangement of the elements in accordance with utility and aesthetic appeal. For example, when a device is held in the right hand, the interactive elements can align on the right side while the non-interactive visual elements can comprise the center of the display. Similarly, when the interface is an HMI, the underlying functionality of the buttons can change in response to the manner in which the user operates the device. In another aspect of the invention, the design of the user interface can further account for extrinsic conditions such as the orientation of the device or environmental conditions including temperature, light, pressure, sound, etc.
In order to determine the manner and the conditions of operation for a specific instance of use, the system provides a variety of sensors on or integrated within the device. The sensors can detect and provide information defining the physical location, identity, and orientation of an object touching or surrounding the device. The sensors can also determine the orientation of the device, and environmental conditions acting upon the device. In order to interpret the sensed information, a database is provided which stores information defining the variety of sensor information capable of being generated by the system and a defined group of physical and environmental parameters. The database further includes user interface designs and/or user interface elements. Upon generation of sensor signals, the system correlates the sensed information with the corresponding physical and/or environmental parameters associated with the sensed information. In turn, the system generates a user interface that enhances usability in light of the physical and/or environmental parameters.
To the accomplishment of the foregoing and related ends, one or more embodiments comprise the features hereinafter fully described and particularly pointed out in the claims. The following description and the annexed drawings set forth in detail certain illustrative aspects and are indicative of but a few of the various ways in which the principles of the embodiments may be employed. Other advantages and novel features will become apparent from the following detailed description when considered in conjunction with the drawings and the disclosed embodiments are intended to include all such aspects and their equivalents.
Various embodiments are now described with reference to the drawings. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of one or more embodiments. It may be evident, however, that the various embodiments may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate describing these embodiments.
As used in this application, the terms “component”, “module”, “system”, and the like are intended to refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution. For example, a component may be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a server and the server can be a component. One or more components may reside within a process and/or thread of execution and a component may be localized on one computer and/or distributed between two or more computers. The word “exemplary” is used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs.
Furthermore, the one or more embodiments may be implemented as a method, apparatus, or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computer to implement the disclosed embodiments. The term “article of manufacture” (or alternatively, “computer program product”) as used herein is intended to encompass a computer program accessible from any computer-readable device, carrier, or media. For example, computer readable media can include but are not limited to magnetic storage devices (e.g., hard disk, floppy disk, magnetic strips . . . ), optical disks (e.g., compact disk (CD), digital versatile disk (DVD) . . . ), smart cards, and flash memory devices (e.g., card, stick). Additionally it should be appreciated that a carrier wave can be employed to carry computer-readable electronic data such as those used in transmitting and receiving electronic mail or in accessing a network such as the Internet or a local area network (LAN). Of course, those skilled in the art will recognize many modifications may be made to this configuration without departing from the scope of the disclosed embodiments.
Various aspects can incorporate inference schemes and/or techniques in connection with transitioning interface schemes. As used herein, the term “inference” refers generally to the process of reasoning about or inferring states of the system, environment, and/or user from a set of observations as captured via events and/or data. Inference can be employed to identify a specific context or action, or can generate a probability distribution over states, for example. The inference can be probabilistic—that is, the computation of a probability distribution over states of interest based on a consideration of data and events, or decision theoretic, building upon probabilistic inference, and considering display actions of highest expected utility, in the context of uncertainty in user goals and intentions. Inference can also refer to techniques employed for composing higher-level events from a set of events and/or data. Such inference results in the construction of new events or actions from a set of observed events and/or stored event data, whether or not the events are correlated in close temporal proximity, and whether the events and data come from one or several event and data sources.
It is to be appreciated that various technologies such as voice recognition, inference, gaze recognition, advanced quality of service guarantee mechanisms, etc. can be employed to allow transitioning of interfaces. Moreover, various embodiments described herein can employ principles of artificial intelligence (AI) to facilitate automatically performing various aspects (e.g., transitioning interfaces, communication session, analyzing resources, extrinsic information, user state, and preferences, risk assessment) as described herein. An AI component can optionally include an inference component that can further enhance automated aspects of the AI component, utilizing in part inference-based schemes to facilitate inferring intended actions to be performed at a given time and state. The AI-based aspects can be effected via any suitable machine-learning based technique and/or statistical-based techniques and/or probabilistic-based techniques. For example, the use of expert systems, fuzzy logic, support vector machines (SVMs), Hidden Markov Models (HMMs), greedy search algorithms, rule-based systems, Bayesian models (e.g., Bayesian networks), neural networks, other non-linear training techniques, data fusion, utility-based analytical systems, etc. is contemplated and is intended to fall within the scope of the hereto appended claims.
Various embodiments will be presented in terms of systems that may include a number of components, modules, and the like. It is to be understood and appreciated that the various systems may include additional components, modules, etc. and/or may not include all of the components, modules, etc. discussed in connection with the figures. A combination of these approaches may also be used.
In one embodiment, the system is executable on a device using a graphical UI (GUI) displayed on an electronic device in which the user interacts with the device via touch (herein referred to as a touchscreen device). The graphical UI comprises icons and visual indicators presented on a display screen, such as an LCD, that are representative of information available to the user. The user can further interact with the device through direct manipulation of the graphical elements (herein referred to as widgets) on the display. In devices using a GUI, the system enables the visual composition and the temporal behavior of the graphical UI to adapt in response to the manner in which a user physically operates the device.
In an alternative embodiment, the system can be implemented on a device utilizing a human-machine interface (HMI). In this embodiment, the HMI comprises buttons that command input functions of the device. Further, the HMI comprising buttons is independent of, and only indirectly linked to, the underlying applications controlling functionality of the device. Therefore, the buttons can be neutral with respect to a particular function. In turn, the buttons have the capability of dynamically taking on a variety of input commands over time. In such an embodiment, the functionality of a versatile set of buttons can adapt depending on the manner of physical operation of the device. It is to be appreciated that GUIs and HMIs can be employed concurrently, or as a hybrid type of interface.
The system is particularly beneficial in a device that is operable in a variety of physical arrangements with relation to the position of the device and the manner in which the user physically operates the device. The device may be any portable electronic device operable with one hand, two hands, or via a stylus, such as a mobile phone (or smartphone), a personal digital assistant (PDA), a tablet personal computer (PC), a portable media player, or a handheld gaming device. In another embodiment, the device can be any touchscreen computing device, for example, devices employing point-of-sale software, automated teller machines (ATMs), airline self-ticketing and check-in devices, information kiosks in a public space, or a global positioning system (GPS) device mounted in an automobile or airplane. In another embodiment, the device can be an electronic controller for use in video gaming. In another embodiment, the device can be a versatile handheld weapon operable in a variety of hand positions. Finally, although the shapes of the devices named above are known, the device can be any three-dimensional or two-dimensional shape. It should be appreciated that the listing of possible executable devices above is not exhaustive and technological advancement will introduce additional devices where the subject system will be applicable.
Referring back to
The sensor component 101 represents one or more sensors. The sensors can be attached to or integrated within a device 104. A device can comprise one or more sensors or be completely enveloped by sensors. The sensors can be capacitive, resistive, pressure-sensing, positional, inductive, thermal, optical, or laser sensors, or any combination of the above. The sensor component 101 can further comprise accelerometers. The accelerometers can provide gesture recognition and facilitate movement between different UI(s) as the points of contact on the device change. The accelerometers can further detect the orientation of the device 104. Similarly, additional positional sensors can be applied, such as gyroscopic sensors, or acoustic or infrared sensors. Furthermore, the sensor component can contain an environmental sensor system including conventional light, image, thermal, electromagnetic, vibratory, atmospheric pressure, or acoustic sensors. It is to be appreciated that a thin film or the like of sensors can be incorporated as a skin of the device or portion thereof to facilitate detecting user intended use.
The sensor component 101 is generally depicted in
The interface database 102 can contain information pertaining to sensor code recognition, physical contact parameters, environmental parameters, interface elements and interface design settings, and user identification. The database serves as a look-up table for mapping a UI in response to sensed information by way of correlating processed sensor signals with a physical contact parameter or environmental parameter. A physical contact parameter defines the relationship between a sensor activation code and the physical position, type, and configuration of an object contacting or surrounding the device (e.g., a human hand, stylus, table, or other object). For example, when the sensor component receives signals indicating contact with the device, the physical contact parameter will indicate the exact position of the object touching the device. Further, in addition to the location of the object touching the device, the physical contact parameter can identify the activation code responsive to the touch with the type of object generating the touch. For example, the object can be a left or right hand, a finger, a stylus, a table, etc. The physical contact parameters can also account for additional contact points pertaining to a specific device, such as a holder or stand specifically designed for the device. A device employed with thermal sensors can further distinguish between human body parts and inanimate objects.
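By way of illustration and not limitation, the following Python sketch shows one hypothetical way the look-up role of the interface database 102 could be realized, mapping a processed sensor code first to a physical contact parameter and then to a stored UI design. The table contents, names, and layout values are illustrative assumptions only.

```python
# Hypothetical look-up tables standing in for the interface database 102.
SENSOR_CODE_TO_PARAMETER = {
    "LEFT": "left_hand_use",
    "RIGHT": "right_hand_use",
    "LEFT+RIGHT": "two_hand_use",
    "BACK": "stylus_use",
}

PARAMETER_TO_UI_DESIGN = {
    "left_hand_use":  {"widgets": "left_edge",      "content": "center"},
    "right_hand_use": {"widgets": "right_edge",     "content": "center"},
    "two_hand_use":   {"widgets": "bottom_corners", "content": "center"},
    "stylus_use":     {"widgets": "near_stylus",    "content": "full_screen"},
    "default":        {"widgets": "none",           "content": "full_screen"},
}

def select_ui(sensor_code):
    """Correlate a processed sensor code with a stored UI design (look-up table behavior)."""
    parameter = SENSOR_CODE_TO_PARAMETER.get(sensor_code, "default")
    return PARAMETER_TO_UI_DESIGN[parameter]

print(select_ui("RIGHT"))  # -> interactive widgets aligned along the right edge
```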
In another aspect of the invention, the physical contact parameters can define the anatomical configuration of the object contacting the device. In this aspect of the invention, the physical contact parameter draws a relationship between the contact point(s) and the type of object contacting the device. When the object is an inanimate object, the identity of the object as either an interfacing object, such as a stylus, or a support object, such as a table, can dictate the manner in which the object is used. When used as an interfacing object, the physicality of the object and the manner in which a user handles the object can be factored into the physical contact parameter. When the object is a human body part, the anatomy and physiology of the part will further be taken into account. For example, when defining a physical contact parameter, the physiology of a human hand limits the distance at which interactive elements can be spaced on a UI. Further, the manner of operation of the device with respect to the applications and function of the device can be a factor in determining the physical contact parameter. The manner of operation can include how a user positions his hand or hands with relation to the shape and operation of the device in order to use the applications of the device. For example, when an application of the device requires input through a keypad, a detection of five contact points can equate to the manner in which a user positions the five fingers of a right hand for use of a keypad.
The sensor component 101 can further comprise thermal or optical sensors that detect, in addition to the physical contact points, the spatial location of an object surrounding a device. (Although an object may not be contacting a device per se, the information pertaining to the spatial location and configuration of the surrounding object will be classified as a physical contact parameter for purposes of explanation.) This aspect of the invention provides another means by which to establish the precise anatomical configuration of the object interfacing with the device. This aspect of the invention can be combined with the physical parameters establishing the position of an object contacting the device. For example, a physical contact parameter can include the elevation and configuration of a hand over a device when two fingers are touching the device. Therefore, the sensor component 101 can detect the manner in which a user interfaces with a device with greater accuracy.
Similarly, in another aspect of the invention, the physical contact parameters can be representative of the spatial orientation of an object surrounding a device that is not touching the device. For example, the spatial sensors can detect where an object is hovering over a device. Thus, in addition to the anatomical configuration of the interfacing object, the physical contact parameters can encompass the distance of an object from the device and the particular angle or orientation of an object around the device. In this embodiment, an invisible three-dimensional grid can exist in the space surrounding a device in order to transcribe a code accounting for the spatial position of the object around the device.
Considering the variety of factors which can define a physical contact parameter, it should be appreciated that a large number of parameters are encompassed by the system 100 and embodied within the interface database. Furthermore, in another aspect of the invention, a device may operate through manipulation by more than one user. For example, consider a gaming device with a GUI where several users place their hands on the GUI in order to interact with the device and perform the game. In this embodiment of the invention, the physical contact parameters will increase in number in order to account for differentiation between the several users and the manner of operation by each individual user.
As mentioned above, the physical contact parameters correlate to a specific activation sensor code. The number of physical contact parameters will depend on the number of related sensor codes a specific embodiment of the system establishes. Likewise, the number of sensor codes will depend on the number and type of sensors employed. For example, consider a three-dimensional, rectangular-prism-shaped device with two capacitive sensors along the edges, assigned left and right respectively. The device further has a third sensor located on the back plane of the device. The device is designed to be grasped in one hand (the left or right hand), in two hands, or to lie on its back whereby the user interacts with the device utilizing a stylus. In this example, the language used to define a sensor code could be as simple as a 1 for left sensor activation, a 2 for right sensor activation, a 1-2 for left and right sensor activation, a 3 for back sensor activation, and a 4 and 5 for top and bottom sensor activation, respectively. The physical contact parameters of the device in response to activation of the sensors will be as follows: 1=left hand use, 2=right hand use, 1-2=two hand use, and 3=stylus use. The mechanism described above provides the basic philosophy behind establishment of a sensor code for a physical contact parameter. An alternative mechanism of relating an activation sensor code with a physical contact parameter will be later described with reference to
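By way of illustration and not limitation, the rectangular-prism example above could be encoded as in the following Python sketch, in which face activations are serialized into an activation code and resolved to a physical contact parameter. The function and table names are hypothetical.

```python
# Serialize which faces of the device report contact into a simple activation code.
def activation_code(left=False, right=False, back=False):
    active = []
    if left:
        active.append("1")
    if right:
        active.append("2")
    if back:
        active.append("3")
    return "-".join(active)

# Resolve the activation code to a physical contact parameter, as described above.
PHYSICAL_CONTACT_PARAMETERS = {
    "1": "left hand use",
    "2": "right hand use",
    "1-2": "two hand use",
    "3": "stylus use (device lying on its back)",
}

code = activation_code(right=True)
print(code, "->", PHYSICAL_CONTACT_PARAMETERS.get(code, "unrecognized"))  # 2 -> right hand use
```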
A more complex array of physical contact parameters is provided in the interface database 102 in another embodiment of the system 100 when implemented in a three-dimensional device that is completely enveloped by sensors. The sensors can further determine each point of contact on the device. For example, each point of contact can correlate to a specific code, such as a number on a three-dimensional quadrant plane. Depending upon the type of contact, a series of numbers/codes can be activated in order to create a code or number sequence. This code/number sequence is an example of a sensor code that is sent by the sensor component 101 to the adaptation component 103. The number of sensor codes will depend on the total combinations and permutations of the different contact points on a device, each defined by a number or code. Therefore, given the size of the device, the number of code/number sequences can range from one to N code/number sequences, where N is an integer. In turn, each code/number sequence or sensor code will correlate to a defined physical contact parameter. It should be appreciated that the upper limit of the code/number sequences and the respectively assigned physical contact parameters can be limited or of an extremely high order of magnitude. As mentioned above, a more detailed description of the manner in which a three-dimensional device enveloped by sensors establishes a sensor code correlating to a specific physical contact parameter will be further described with reference to
In addition to physical contact parameters, the interface database can contain additional environmental parameters in order to correlate sensor signals related to environmental factors with a specific UI. In this embodiment, the sensor component can process environmental sensor signals in order to output a sensor code. Alternatively, the information relating to the environmental sensors can be added to the information pertaining to physical contact and spatial orientation sensed information in order to generate one sensor code that is sent to the interface database. The environmental parameters can also account for signals indicating device orientation derived from accelerometers. The environmental sensors can account for extrinsic factors such as atmospheric pressure, atmospheric temperature, sound, ambient light, time, etc. The environmental parameters can dictate factors such as increased resolution of a display, or limit the complexity of a UI in order to account for decreased physical mobility of the interacting user. The environmental parameters can be integrated into the mix of elements factored into the determination of the appropriate UI.
In addition to the variety of physical contact parameters and environmental parameters, the UI database 102 defines the variety of UI elements and interface designs pertaining to a specific device or program executable by the device. In a GUI, the UI elements consist of widgets, which are visually displayed elements enabling interaction with the device, and non-interactive elements. The widgets allow for interactions appropriate to the kind of data they hold. Widgets can include small interactive elements such as buttons, toolbars, scroll menus, windows, icons, keypads, etc. Larger widgets can include windows which provide a frame or container for the main presentation content such as a web page, email message, word processing document, or drawing. Larger windows are primarily the output of functions executed through user manipulation of smaller widgets. However, larger windows can also facilitate interaction. For example, a menu displaying a variety of options for the user can comprise a larger window with multiple smaller icons, each representative of a particular executable program that the user may access. In an exemplary embodiment of the invention, the system employs a touchscreen device with a GUI. In a touchscreen device, the user may touch upon a smaller icon to open a new window. The new window may further comprise additional small icons for interaction with the device. The user further interacts with the device through direct manipulation of the widgets on the display. In addition to the elements of a UI that enable direct interaction for controlling a device, additional elements of the UI exist for display purposes only, for example, a video, a picture, or a displayed message. The non-interactive elements in combination with the user input elements, or widgets, are organized in order to create a UI that enhances usability of the device.
The design of a UI affects the amount of effort the user must expend to provide input for the system and to interpret the output of the system, and how much effort it takes to learn how to do this. Usability is the degree to which the design of a particular UI takes into account the human psychology and physiology of the users, and makes the process of using the system effective, efficient, and satisfying. Usability is mainly a characteristic of the UI. The UI of the devices employing the system 100 further accounts for the functionality of the device and the applications employed on the device. Therefore, the UI generated by the system accounts for how a device 104 is used with respect to efficiency, effectiveness, and satisfaction, while taking into account the requirements from its context of use. One example of a UI provided by the system 100 takes into account the following factors in order to enhance usability of a device: the physical placement of a user's hand on the device, how the user uses his hand in order to interact with the device, the particular application of the device, and the environmental conditions of operation.
In one embodiment, the UI elements are pre-arranged in order to provide a UI that optimizes usability in response to a physical parameter. Therefore, a number of UIs can be stored in the interface database 102. Each of the stored interfaces is specifically designed with regard to a physical contact parameter or series of parameters. As with the physical contact parameters, the number of UIs stored in the interface database can vary from one to a high order of magnitude. In one embodiment, a different UI can exist for each physical contact parameter. In another embodiment, several different physical contact parameters can correlate to the same UI. In another embodiment, the system can create a custom UI from UI elements in response to a specific sensor signal and corresponding physical contact or environmental parameter. In this embodiment, the UI database is further employed with information pertaining to usability. As will be described infra, the system has a custom interface generation component 903 (
In another aspect of the system 100, a specific physical parameter can correlate to a specific subset of interfaces. The subset of interfaces can be directed for implementation by a primary physical parameter. For example, a user can place his hand on a device in a specific manner that is analogous to providing the device with a unique identification code or password. The primary physical parameter related to the code in turn directs the interface database to pull from a designated subset of UIs. Therefore, the interface database 102 can hold information correlating a specific physical parameter with a subset of interfaces.
This embodiment can further be exploited as a user recognition or identification mechanism where several different users utilize a specific device. In this aspect of the system 100, a user may touch a specific device in a certain way in order to signal the identification of the user. In turn, the device is signaled to operate in a specific adaptation mode wherein a certain subset of UIs correlating to the user is employed by the adaptation system 100. The user identification mechanism described above can also be utilized as a security measure similar to biometric identification of a user. Rather than recognition of a user's fingerprint as in biometric identification, the device can recognize a specific touch sequence. In addition to signaling an interface subset for the user, the system can cause the device to either grant access for the user or prevent the user from interacting with the device by freezing the functionality of the UI. Therefore, the UI database can further comprise user identification information.
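By way of illustration and not limitation, the following Python sketch suggests how a touch sequence could serve as a user identification code that selects a per-user subset of interfaces or, failing a match, freezes the UI. The touch sequences, user names, and interface identifiers are hypothetical.

```python
# Hypothetical registered touch sequences and per-user interface subsets.
USER_TOUCH_CODES = {
    ("1", "3", "1"): "user_a",
    ("2", "2", "3"): "user_b",
}

USER_INTERFACE_SUBSETS = {
    "user_a": ["ui_right_hand", "ui_two_hand"],
    "user_b": ["ui_stylus", "ui_left_hand"],
}

def grant_access(touch_sequence):
    """Return the interface subset for a recognized user, or freeze the UI otherwise."""
    user = USER_TOUCH_CODES.get(tuple(touch_sequence))
    if user is None:
        return {"access": False, "interfaces": []}  # freeze UI functionality
    return {"access": True, "user": user, "interfaces": USER_INTERFACE_SUBSETS[user]}

print(grant_access(["1", "3", "1"]))  # recognized user -> designated interface subset
print(grant_access(["9", "9", "9"]))  # unrecognized -> access denied, UI frozen
```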
The system will now be explained with regard to implementation in a handheld mobile device. Several handheld mobile devices exist which are operated by the user with one hand (left or right), two hands, or no hands through manipulation with a stylus. These devices include but are not limited to cell phones, smartphones, PDAs, mobile media players, handheld gaming devices, remote controllers, or advanced technology weapons. In one aspect of the system, the UI adapts to the manner in which the device is held. For example, when a user grips a handheld device with two hands as opposed to one, the UI can change to a design where the input widgets are located along the bottom center of the device for manipulation by the left and right thumbs respectively.
In another aspect of the invention, the handheld device may require interaction through a stylus or a keypad such as a virtual keyboard. For example, the system 100 can provide for the following sequence of events. The device can be placed on a table. When the table is the only physical contact with the device, the UI can provide only non-interactive or visual elements. This UI could be considered a default interface. As the user approaches the device with a stylus, the sensor component of the system, given that it has positional and capacitive sensor capability, can process the sensed position of the stylus. In response to the corresponding physical contact parameter, the system 100 can then implement a UI that places interactive widgets in the appropriate vicinity of the display for interaction between the stylus and the device. Similarly, the appearance of a virtual keyboard can be a response to a physical contact parameter signaled by a sensor code designating a device that is laid on a table. Alternatively, the UI can change to provide a keyboard underneath the user's hands when he places his hands upon the device in a composition configuration. In another aspect of the invention, the appearance of a virtual keyboard could be a response to hands hovering over a device in the composition form.
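By way of illustration and not limitation, the sequence of events described above could be sketched in Python as follows; the contact labels, coordinates, and widget names are illustrative assumptions.

```python
# Compose a UI in response to the sensed contacts and, optionally, a sensed stylus position.
def compose_ui(contacts, stylus_position=None):
    if contacts == ["table"] and stylus_position is None:
        # Only a support surface is in contact: default, display-only interface.
        return {"mode": "default", "widgets": []}
    if stylus_position is not None:
        x, y = stylus_position
        # Place an interactive widget cluster in the vicinity of the approaching stylus.
        return {"mode": "stylus", "widgets": [{"type": "toolbar", "anchor": (x, y)}]}
    # Hands placed in a composition configuration: surface a virtual keyboard.
    return {"mode": "composition", "widgets": [{"type": "virtual_keyboard", "anchor": "under_hands"}]}

print(compose_ui(["table"]))                              # display-only default UI
print(compose_ui(["table"], stylus_position=(120, 45)))   # widgets appear near the stylus
print(compose_ui(["table", "left_hand", "right_hand"]))   # keyboard beneath the user's hands
```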
In addition, the non-interactive visual elements of the device can be designed to offset the input commands in a manner that enhances the visibility of the elements. For example, the non-interactive elements can be displayed in a manner that correlates to the manner of operation and the corresponding program of use. A user can operate a device with the right hand, and the input widgets can be arranged on the right side of the device in a specific configuration to enhance the control aspect of usability. Further, the specific program in use will dictate the remainder of the interface design with respect to the assigned physical contact parameter. The non-interactive elements can be designed with relation to the size and aesthetic appearance of the elements in light of the specific application employed, the utility of the elements with respect to the application, or user preferences.
Additional aspects of the system 100 are brought forth through description of implementation on a larger device. In this example, the device is a large tablet PC that is used as a presentation apparatus by a salesman for displaying products to a potential customer. The tablet PC further uses a touchscreen GUI. The device is manually operated with one hand, two hands, or a stylus. However, in this example, the salesman operates the device with his right hand only while the screen of the device is positioned in front of the customer, to the left of the salesman. When the salesman holds the device with his right hand, the UI automatically adapts for improved usability as a presentation model that anticipates the type of use and the physical position of the salesman and the customer. The particular UI is thus adapted for complete control of the device by the user with his thumb. In turn, the user does not need to let go of the device to change hand positions or reach across the screen and interrupt the view of the customer. For instance, in accordance with this example, the UI can locate a main scroll bar for scrolling through a series of products in the top right corner of the display while the pictured products appear in the center of the display. The salesman can then scroll through products using his thumb. The UI can also be designed with a miniature version of the larger display screen at the top right corner just above the scroll bar. Therefore, rather than letting go of the device or reaching across the display screen with his left hand in order to touch upon a product appearing in the center of the screen, the salesman can simply reach above the scroll bar in order to select the desired product. The miniature display is strategically positioned above the scroll bar in order to account for the ease with which the salesman can reach up rather than down, while offsetting the area of the display covered by the salesman's right palm.
In addition to mobile type handheld devices, implementation of the system 100 in larger stationary devices can further bring to light additional aspects of the system. For example, consider a device comprising a large touchscreen with a GUI. One aspect of the system can recognize the position and orientation of a user's hand wherever the user places his hand on the device or wherever the hand is hovering over the device. The system can additionally distinguish between the left and right hands of the user, two hands, use with a stylus, or multiple hands originating from multiple users. As the user moves his hand over or upon the device, a control panel can continuously move beneath the user's hands in order to follow the user's hand movements. In turn, the entire UI, comprising the interactive widgets and non-interactive visual components, can continuously adapt.
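By way of illustration and not limitation, the following behavior might be sketched as shown below; the class name, panel size, and coordinates are hypothetical.

```python
# A control panel that is re-anchored beneath each newly sensed hand position.
class FollowingControlPanel:
    def __init__(self, panel_size=(200, 150)):
        self.panel_size = panel_size
        self.anchor = (0, 0)

    def update(self, hand_position):
        """Re-anchor the control panel beneath the most recently sensed hand position."""
        self.anchor = hand_position
        return {"panel_at": self.anchor, "panel_size": self.panel_size}

panel = FollowingControlPanel()
for sensed in [(300, 400), (620, 380), (900, 200)]:  # successive sensed hand positions
    print(panel.update(sensed))                      # the panel follows the hand across the display
```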
The embodiments of the system described above referenced a device utilizing a GUI. However, the aspects of the system 100 described above can further be applied to a device using an HMI. For example, a device such as a remote control can comprise several non-denominational buttons. Depending on how the user holds the controller, the buttons can be assigned their respective functionality. Therefore, in essence, no matter where or how a user holds the device, the position of the user's index finger can always be the position of the “start” or “play” button. Similarly, the controller could be a mobile remote control or be the control panel of a larger non-mobile device.
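By way of illustration and not limitation, the button reassignment described above could be sketched as follows; the finger names, button indices, and command labels are hypothetical.

```python
# Assign commands to neutral (non-denominational) buttons according to the sensed grip,
# so that the button under the index finger is always "start/play" regardless of how
# the controller is held.
def assign_button_functions(grip):
    """grip maps finger names to the physical button indices they rest upon."""
    return {
        grip["index"]: "start/play",
        grip["middle"]: "stop",
        grip["thumb"]: "select",
    }

# Two different grips on the same controller yield the same logical layout.
print(assign_button_functions({"index": 4, "middle": 5, "thumb": 1}))
print(assign_button_functions({"index": 7, "middle": 6, "thumb": 2}))
```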
Furthermore, in addition to the adaptation of the UI with regard to the physical contact parameters, the UI can further account for environmental parameters. (As noted above, the term environmental parameters as used to describe the system 100 encompasses parameters developed through accelerometers in addition to parameters such as temperature, ambient light, time, sound, atmospheric pressure, etc.) The UI of a handheld device can adapt depending on the amount of light, the altitude, or the temperature. For example, in a situation where temperatures indicate a fire is present, a communication device employing the system can adapt the UI such that a single large emergency interactive element is displayed on the GUI. Likewise, when the interface is an HMI, all of the buttons on the device could have the same functionality, that is, dialing 911. In another aspect, the system can sense an increase or decrease in environmental sound. In response, a device employing a GUI can adapt to provide a volume control element in an easily accessible position on the interface in relation to the interfacing object. Additionally, the interface can adapt according to the orientation of the device. It should be appreciated that a variety of interface adaptations in response to environmental parameters are within the scope of the invention.
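By way of illustration and not limitation, the environmental adaptations described above could be sketched as follows; the threshold values and widget names are assumptions made only for demonstration.

```python
# Adapt a base UI in light of sensed environmental parameters.
def adapt_to_environment(temperature_c, sound_db, base_ui):
    if temperature_c > 60.0:  # assumed fire-indicating temperature threshold
        # Collapse the GUI to a single large emergency interactive element.
        return {"widgets": [{"type": "emergency_call", "size": "full_screen"}]}
    ui = {"widgets": list(base_ui["widgets"])}
    if sound_db > 85.0:       # assumed loud-environment threshold
        # Surface a volume control in an easily accessible position.
        ui["widgets"].insert(0, {"type": "volume_control", "position": "near_hand"})
    return ui

base = {"widgets": [{"type": "media_controls", "position": "right_edge"}]}
print(adapt_to_environment(temperature_c=25.0, sound_db=95.0, base_ui=base))
print(adapt_to_environment(temperature_c=75.0, sound_db=40.0, base_ui=base))
```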
The number and degree of the aspects of the system 100 described above can further be controlled by the user. For instance, the entire adaptability of the system can be controlled. A user can elect to use the system within a certain device up to the point where the user would prefer the device no longer change interfaces in response to the manner of operation. Thus, the system can provide “modes” of adaptability. One mode would turn the system off completely, while another mode can allow for limited adaptability. The number of modes in which the system 100 can operate is unlimited. This aspect of the system can be appreciated with regard to the example provided above wherein the UI continuously adapts as the user moves his hand over the UI of a device.
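By way of illustration and not limitation, the adaptability modes described above might be sketched as follows; the mode names and change-magnitude scale are hypothetical.

```python
# Decide whether to adapt the UI given the selected adaptability mode and the
# magnitude of the sensed change in the manner of operation.
def should_adapt(mode, change_magnitude):
    if mode == "off":
        return False                   # adaptation disabled entirely
    if mode == "limited":
        return change_magnitude > 0.5  # adapt only on substantial changes (assumed 0-1 scale)
    return True                        # "full" mode: adapt on every sensed change

for mode in ("off", "limited", "full"):
    print(mode, should_adapt(mode, 0.2), should_adapt(mode, 0.8))
```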
The sensor code generated by the sensor signal processing component 302 defines the sensor signals relating to both physical contact parameters and environmental parameters. The mechanism by which the sensor signal processing component establishes a code relating to physical contact parameters will be described in detail with reference to
The grid at 501 is used to establish the points of actual physical contact on the device as will be exemplified with reference to
The grid at 502 comprises the same properties as the grid at 501; however, the grid at 502 is not limited by the dimensions of the device but extends to the area around the device capable of being reached by the sensors employed. The grid at 502 further captures the spatial location and configuration of an object surrounding the device. The grid is defined by axes x′, y′, and z′, in order to differentiate between sensor signals representative of physical touch and those representative of spatial location and configuration. Each of the axes x′, y′, and z′ is numbered as described with reference to the 501 grid; however, extension of the axes is also provided in the negative direction. The depiction at 504 shows how the grid 502 is related to a device (represented by the rectangular prism). The apex of the 502 grid is provided at the center point of the device regardless of shape.
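By way of illustration and not limitation, the following Python sketch suggests how contact points located on such a grid could be transcribed into a single code/number sequence of the kind described above; the grid resolution, serialization format, and sample coordinates are illustrative assumptions.

```python
# Snap sensed contact locations to integer grid coordinates and serialize the
# sorted sequence into one sensor code that can be correlated with a physical
# contact parameter.
def encode_contact_points(points):
    """points: iterable of (x, y, z) locations where contact is sensed on the device."""
    snapped = sorted({(round(x), round(y), round(z)) for x, y, z in points})
    return ";".join("{},{},{}".format(x, y, z) for x, y, z in snapped)

# Illustrative values for five fingertips gripping the right edge of a device.
code = encode_contact_points([(9.1, 2.0, 0.2), (9.0, 3.1, 0.0), (9.2, 4.0, 0.1),
                              (9.0, 5.2, 0.0), (8.9, 6.1, 0.3)])
print(code)  # this code/number sequence is what is correlated with a physical contact parameter
```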
According to one embodiment of the system wherein only capacitive sensors are present on the device, the corresponding sensor signals will be representative of only the points of physical contact of the interfacing object with the device. Referring to
In addition, the interface correlation component can contain a memory recall component 803. The memory recall component 803 stores information pertaining to readily used interface designs for efficient production. Likewise, given multiple applicable UIs, a user can have the option of requesting a second, third . . . etc. interface option following disfavor of each previously generated option. The memory recall component 803 stores the most frequently selected interface pertaining to a specific parameter and causes that interface option to be selected first the next time the same or a related parameter is received. In another aspect of the invention, based on an incoming physical contact parameter or environmental parameter, the memory recall component can predict the upcoming physical movements by the user on or around the device based on past sequences of received parameters. Therefore, the memory recall component 803 can prepare the next interface that is likely to be implemented by the system for more efficient production. Furthermore, in another aspect of the invention, where multiple users use a particular device, a subset of interfaces for each user can reside in the interface database. Upon receipt of a primary physical contact parameter serving as the user identification code, the memory recall component 803 can be signaled to direct the interface correlation component 801 to select from the subset of interfaces assigned to that user.
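By way of illustration and not limitation, the memory recall behavior described above might be sketched as follows; the class and parameter names are hypothetical.

```python
from collections import Counter, defaultdict

# Track, per parameter, which interface the user most often accepts, and which
# parameter tends to follow which, so the likely next interface can be prepared.
class MemoryRecall:
    def __init__(self):
        self.selection_counts = defaultdict(Counter)  # parameter -> interface -> count
        self.transitions = defaultdict(Counter)       # parameter -> next parameter -> count
        self.last_parameter = None

    def record_selection(self, parameter, interface):
        self.selection_counts[parameter][interface] += 1

    def preferred_interface(self, parameter):
        counts = self.selection_counts[parameter]
        return counts.most_common(1)[0][0] if counts else None

    def observe(self, parameter):
        if self.last_parameter is not None:
            self.transitions[self.last_parameter][parameter] += 1
        self.last_parameter = parameter

    def predicted_next(self, parameter):
        following = self.transitions[parameter]
        return following.most_common(1)[0][0] if following else None

recall = MemoryRecall()
recall.record_selection("right_hand_use", "ui_thumb_scroll")
recall.record_selection("right_hand_use", "ui_thumb_scroll")
recall.record_selection("right_hand_use", "ui_full_grid")
print(recall.preferred_interface("right_hand_use"))  # offered first next time -> "ui_thumb_scroll"

for p in ("two_hand_use", "right_hand_use", "two_hand_use", "right_hand_use"):
    recall.observe(p)
print(recall.predicted_next("two_hand_use"))          # likely next parameter -> "right_hand_use"
```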
The interface correlation component can further comprise an inference engine 804 which can employ artificial intelligence (AI) or other suitable machine learning & reasoning (MLR) logic which facilitates automating one or more features in accordance with the subject innovation. The inference engine 804 can interact with the memory recall component 803, which can provide decision logic in place of, or in addition to, the inference engine 804. The subject innovation (e.g., in connection with drawing inferences from visual representations and attributes) can employ various AI- or MLR-based schemes for carrying out various aspects thereof. For example, a process for determining an appropriate or suitable conclusion to be drawn from a visual representation can be facilitated via an automatic classifier system and process.
A classifier is a function that maps an input attribute vector, x=(x1, x2, x3, x4, . . . , xn), to a confidence that the input belongs to a class, that is, f(x)=confidence(class). Such classification can employ a probabilistic and/or statistical-based analysis (e.g., factoring into the analysis utilities and costs) to prognose or infer an action that a user desires to be automatically performed.
A support vector machine (SVM) is an example of a classifier that can be employed. The SVM operates by finding a hypersurface in the space of possible inputs, which hypersurface attempts to split the triggering criteria from the non-triggering events. Intuitively, this makes the classification correct for testing data that is near, but not identical to, training data. Other directed and undirected model classification approaches that can be employed include, e.g., naive Bayes, Bayesian networks, decision trees, neural networks, fuzzy logic models, and probabilistic classification models providing different patterns of independence. Classification as used herein is also inclusive of statistical regression that is utilized to develop models of priority.
As will be readily appreciated from the subject specification, the subject innovation can employ classifiers that are explicitly trained (e.g., via generic training data) as well as implicitly trained (e.g., via observing user behavior, receiving extrinsic information). For example, SVMs are configured via a learning or training phase within a classifier constructor and feature selection module. Thus, the classifier(s) can be used to automatically learn and perform a number of functions, including but not limited to determining according to predetermined criteria what conclusion(s) (or inferences) to draw based upon a combination of data parameters and/or characteristics.
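By way of illustration and not limitation, the following Python sketch trains a small SVM to map sensor-derived feature vectors to grip classes; the synthetic feature values, class labels, and use of the scikit-learn library are assumptions made only for demonstration.

```python
from sklearn.svm import SVC

# Synthetic training vectors: [left_edge, right_edge, back, top] activation levels.
X = [
    [0.9, 0.0, 0.1, 0.0],  # left-hand grip
    [0.8, 0.1, 0.2, 0.0],  # left-hand grip
    [0.0, 0.9, 0.1, 0.0],  # right-hand grip
    [0.1, 0.8, 0.2, 0.0],  # right-hand grip
    [0.7, 0.7, 0.1, 0.0],  # two-hand grip
    [0.6, 0.8, 0.2, 0.0],  # two-hand grip
    [0.0, 0.0, 0.9, 0.0],  # device on its back (stylus use)
    [0.1, 0.0, 0.8, 0.1],  # device on its back (stylus use)
]
y = ["left", "left", "right", "right", "two_hand", "two_hand", "stylus", "stylus"]

classifier = SVC().fit(X, y)                          # explicit training on generic data
print(classifier.predict([[0.05, 0.85, 0.15, 0.0]]))  # inferred grip class, e.g. ['right']
```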
The interface generation component 802 is detailed in
The CIGC is responsible for generating custom interfaces from the interface elements gathered in the interface correlation component 801 in response to a physical contact parameter or environmental parameter. The interface elements include all interactive elements or input widgets and all non-interactive elements such as visual widgets. The CIGC designs a custom interface with the various elements in consideration of rules governing usability held in the interface database or a rule base. In another embodiment, the CIGC can create a custom interface influenced by the memory recall component 803 and/or the inference engine 804, either in addition to, or in the alternative to, utilizing rules. In yet another embodiment of the system 100 as depicted in
In accordance with the various methods of generating a UI, an implementation scheme (e.g., rule) can be applied to define and/or implement a set of criteria by which conclusions are drawn. It will be appreciated that the rule-based implementation can automatically and/or dynamically define conclusions to be drawn from a specific set of information or attributes. In response thereto, the rule-based implementation can make determinations by employing a predefined and/or programmed rule(s) based upon most any desired criteria. It is to be understood that rules can be preprogrammed by a user or alternatively, can be built by the system on behalf of the user. Additionally, the system adaptation component 103 can ‘learn’ or ‘be trained’ by actions of a user or group of users.
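By way of illustration and not limitation, a rule-based generation of a custom interface, of the kind performed by the CIGC described above, might be sketched as follows; the rule set, element names, and region labels are illustrative assumptions standing in for the usability rules held in the interface database.

```python
# Hypothetical usability rules keyed by physical contact parameter.
USABILITY_RULES = {
    "right_hand_use": {"interactive_region": "right_edge", "max_reach_mm": 70},
    "left_hand_use":  {"interactive_region": "left_edge",  "max_reach_mm": 70},
    "two_hand_use":   {"interactive_region": "bottom_corners", "max_reach_mm": 60},
}

def generate_custom_interface(parameter, interactive_elements, visual_elements):
    """Place gathered interface elements according to the rule for the incoming parameter."""
    rule = USABILITY_RULES.get(parameter, {"interactive_region": "bottom_edge", "max_reach_mm": 80})
    return {
        "interactive": [{"element": e,
                         "region": rule["interactive_region"],
                         "within_reach_mm": rule["max_reach_mm"]} for e in interactive_elements],
        "visual": [{"element": e, "region": "center"} for e in visual_elements],
    }

ui = generate_custom_interface("right_hand_use",
                               ["scroll_bar", "select_button"],
                               ["product_image", "price_label"])
print(ui)  # interactive widgets on the right edge within thumb reach; visual elements centered
```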
Referring back to the drawings,
Referring now to
Referring now to
Generally, program modules include routines, programs, components, data structures, etc., that perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that the inventive methods can be practiced with other computer system configurations, including single-processor or multiprocessor computer systems, minicomputers, mainframe computers, as well as personal computers, hand-held computing devices, microprocessor-based or programmable consumer electronics, and the like, each of which can be operatively coupled to one or more associated devices.
The illustrated aspects of the innovation may also be practiced in distributed computing environments where certain tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules can be located in both local and remote memory storage devices.
A computer typically includes a variety of computer-readable media. Computer-readable media can be any available media that can be accessed by the computer and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer-readable media can comprise computer storage media and communication media. Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disk (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computer.
Communication media typically embodies computer-readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer-readable media.
With reference again to
The system bus 1308 can be any of several types of bus structure that may further interconnect to a memory bus (with or without a memory controller), a peripheral bus, and a local bus using any of a variety of commercially available bus architectures. The system memory 1306 includes read-only memory (ROM) 1310 and random access memory (RAM) 1312. A basic input/output system (BIOS) is stored in a non-volatile memory 1310 such as ROM, EPROM, EEPROM, which BIOS contains the basic routines that help to transfer information between elements within the computer 1302, such as during start-up. The RAM 1312 can also include a high-speed RAM such as static RAM for caching data.
The computer 1302 further includes an internal hard disk drive (HDD) 1314 (e.g., EIDE, SATA), which internal hard disk drive 1314 may also be configured for external use in a suitable chassis (not shown), a magnetic floppy disk drive (FDD) 1316 (e.g., to read from or write to a removable diskette 1318), and an optical disk drive 1320 (e.g., to read a CD-ROM disk 1322, or to read from or write to other high-capacity optical media such as a DVD). The hard disk drive 1314, magnetic disk drive 1316 and optical disk drive 1320 can be connected to the system bus 1308 by a hard disk drive interface 1324, a magnetic disk drive interface 1326 and an optical drive interface 1328, respectively. The interface 1324 for external drive implementations includes at least one or both of Universal Serial Bus (USB) and IEEE 1394 interface technologies. Other external drive connection technologies are within contemplation of the subject innovation.
The drives and their associated computer-readable media provide nonvolatile storage of data, data structures, computer-executable instructions, and so forth. For the computer 1302, the drives and media accommodate the storage of any data in a suitable digital format. Although the description of computer-readable media above refers to a HDD, a removable magnetic diskette, and a removable optical media such as a CD or DVD, it should be appreciated by those skilled in the art that other types of media which are readable by a computer, such as zip drives, magnetic cassettes, flash memory cards, cartridges, and the like, may also be used in the exemplary operating environment, and further, that any such media may contain computer-executable instructions for performing the methods of the innovation.
A number of program modules can be stored in the drives and RAM 1312, including an operating system 1330, one or more application programs 1332, other program modules 1334 and program data 1336. All or portions of the operating system, applications, modules, and/or data can also be cached in the RAM 1312. It is appreciated that the innovation can be implemented with various commercially available operating systems or combinations of operating systems.
A user can enter commands and information into the computer 1302 through one or more wired/wireless input devices, e.g., a keyboard 1338 and a pointing device, such as a mouse 1340. Other input devices (not shown) may include a microphone, an IR remote control, a joystick, a game pad, a stylus pen, a touch screen, or the like. These and other input devices are often connected to the processing unit 1304 through an input device interface 1342 that is coupled to the system bus 1308, but can be connected by other interfaces, such as a parallel port, an IEEE 1394 serial port, a game port, a USB port, an IR interface, etc.
A monitor 1344 or other type of display device is also connected to the system bus 1308 via an interface, such as a video adapter 1346. In addition to the monitor 1344, a computer typically includes other peripheral output devices (not shown), such as speakers, printers, etc.
The computer 1302 may operate in a networked environment using logical connections via wired and/or wireless communications to one or more remote computers, such as a remote computer(s) 1348. The remote computer(s) 1348 can be a workstation, a server computer, a router, a personal computer, portable computer, microprocessor-based entertainment appliance, a peer device or other common network node, and typically includes many or all of the elements described relative to the computer 1302, although, for purposes of brevity, only a memory/storage device 1350 is illustrated. The logical connections depicted include wired/wireless connectivity to a local area network (LAN) 1352 and/or larger networks, e.g., a wide area network (WAN) 1354. Such LAN and WAN networking environments are commonplace in offices and companies, and facilitate enterprise-wide computer networks, such as intranets, all of which may connect to a global communications network, e.g., the Internet.
When used in a LAN networking environment, the computer 1302 is connected to the local network 1352 through a wired and/or wireless communication network interface or adapter 1356. The adapter 1356 may facilitate wired or wireless communication to the LAN 1352, which may also include a wireless access point disposed thereon for communicating with the wireless adapter 1356.
When used in a WAN networking environment, the computer 1302 can include a modem 1358, or is connected to a communications server on the WAN 1354, or has other means for establishing communications over the WAN 1354, such as by way of the Internet. The modem 1358, which can be internal or external and a wired or wireless device, is connected to the system bus 1308 via the input device interface 1342. In a networked environment, program modules depicted relative to the computer 1302, or portions thereof, can be stored in the remote memory/storage device 1350. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers can be used.
The computer 1302 is operable to communicate with any wireless devices or entities operatively disposed in wireless communication, e.g., a printer, scanner, desktop and/or portable computer, portable data assistant, communications satellite, any piece of equipment or location associated with a wirelessly detectable tag (e.g., a kiosk, news stand, restroom), and telephone. This includes at least Wi-Fi and Bluetooth™ wireless technologies. Thus, the communication can be a predefined structure as with a conventional network or simply an ad hoc communication between at least two devices.
Wi-Fi, or Wireless Fidelity, allows connection to the Internet from a couch at home, a bed in a hotel room, or a conference room at work, without wires. Wi-Fi is a wireless technology similar to that used in a cell phone that enables such devices, e.g., computers, to send and receive data indoors and out, anywhere within the range of a base station. Wi-Fi networks use radio technologies called IEEE 802.11 (a, b, g, etc.) to provide secure, reliable, fast wireless connectivity. A Wi-Fi network can be used to connect computers to each other, to the Internet, and to wired networks (which use IEEE 802.3 or Ethernet). Wi-Fi networks operate in the unlicensed 2.4 and 5 GHz radio bands, at an 11 Mbps (802.11b) or 54 Mbps (802.11a) data rate, for example, or with products that contain both bands (dual band), so the networks can provide real-world performance similar to the basic 10BaseT wired Ethernet networks used in many offices.
Referring now to FIG. 14, there is illustrated a schematic block diagram of an exemplary portable hand-held device 1400 in which the innovation can be employed. The device 1400 includes a processor 1402 that controls and processes the onboard operations and functions of the device.
A memory 1404 connected to the processor 1402 serves to store program code executed by the processor 1402, and serves as a storage means for storing information such as user credentials, receipt transaction information, and the like. The memory 1404 can be a nonvolatile memory suitably adapted to store at least a complete set of the information that is displayed. Thus, the memory 1404 can include a RAM or flash memory for high-speed access by the processor 1402 and/or a mass storage memory, e.g., a micro drive capable of storing gigabytes of data that comprises text, images, audio, and video content. According to one aspect, the memory 1404 has sufficient storage capacity to store multiple sets of information, and the processor 1402 could include a program for alternating or cycling between various sets of display information.
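By way of illustration only, the following Python sketch shows one way such cycling between stored sets of display information might be carried out; the Display class, the dwell time, and the example information sets are assumptions introduced for this sketch and are not part of the device described herein.

import itertools
import time


class Display:
    """Hypothetical stand-in for the display driver system 1408 (assumption)."""

    def render(self, info_set: dict) -> None:
        print("Showing:", info_set["title"])


def cycle_display(display: Display, info_sets: list, dwell_seconds: float = 1.0, rounds: int = 2) -> None:
    """Alternate among the stored sets of display information for a fixed number of rounds."""
    for info_set in itertools.islice(itertools.cycle(info_sets), rounds * len(info_sets)):
        display.render(info_set)
        time.sleep(dwell_seconds)  # dwell time per set; assumed parameter


if __name__ == "__main__":
    sets = [{"title": "Customer information"}, {"title": "Transaction receipt"}]
    cycle_display(Display(), sets)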
A display 1406 is coupled to the processor 1402 via a display driver system 1408. The display 1406 can be a color liquid crystal display (LCD), plasma display, or the like. In this example, the display 1406 is a ¼ VGA display with sixteen levels of gray scale. The display 1406 functions to present data, graphics, or other information content. For example, the display 1406 can display a set of customer information, which is displayed to the operator and can be transmitted over a system backbone (not shown). Additionally, the display 1406 can display a variety of functions that control the execution of the device 1400. The display 1406 is capable of displaying both alphanumeric and graphical characters.
Power is provided to the processor 1402 and other components forming the hand-held device 1400 by an onboard power system 1410 (e.g., a battery pack). In the event that the power system 1410 fails or becomes disconnected from the device 1400, a supplemental power source 1412 can be employed to provide power to the processor 1402 and to charge the onboard power system 1410. Upon detection of an anticipated power failure, the processor 1402 of the device 1400 induces a sleep mode to reduce the current draw.
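By way of illustration only, the following Python sketch shows one way such sleep-mode behavior might be triggered; the voltage threshold and the battery and sleep hooks are hypothetical assumptions, not elements disclosed herein.

LOW_VOLTAGE_THRESHOLD = 3.3  # volts; assumed threshold indicating an anticipated power failure


def read_battery_voltage() -> float:
    """Hypothetical hook into the onboard power system 1410 (assumption)."""
    return 3.2  # placeholder reading


def enter_sleep_mode() -> None:
    """Hypothetical hook that reduces current draw (e.g., gates clocks, dims the display)."""
    print("Entering sleep mode to reduce current draw")


def check_power_and_sleep() -> None:
    # Induce sleep mode when an anticipated power failure is detected.
    if read_battery_voltage() < LOW_VOLTAGE_THRESHOLD:
        enter_sleep_mode()


if __name__ == "__main__":
    check_power_and_sleep()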
The device 1400 includes a communication subsystem 1414 that includes a data communication port 1416, which is employed to interface the processor 1402 with a remote computer. The port 1416 can include at least one of Universal Serial Bus (USB) and IEEE 1394 serial communications capabilities. Other technologies can also be included, for example, infrared communication utilizing an infrared data port.
The device 1400 can also include a radio frequency (RF) transceiver section 1418 in operative communication with the processor 1402. The RF section 1418 includes an RF receiver 1420, which receives RF signals from a remote device via an antenna 1422 and demodulates the signal to obtain digital information modulated therein. The RF section 1418 also includes an RF transmitter 1424 for transmitting information to a remote device, for example, in response to manual user input via a user input device 1426 (e.g., a keypad) or automatically in response to the completion of a transaction or other predetermined and programmed criteria. The transceiver section 1418 facilitates communication with a transponder system, for example, either passive or active, that is in use with product or item RF tags. The processor 1402 signals (or pulses) the remote transponder system via the transceiver 1418, and detects the return signal in order to read the contents of the tag memory. In one implementation, the RF section 1418 further facilitates telephone communications using the device 1400. In furtherance thereof, an audio I/O section 1428 is provided as controlled by the processor 1402 to process voice input from a microphone (or similar audio input device) and audio output signals (from a speaker or similar audio output device).
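By way of illustration only, the following Python sketch shows one way the pulse-and-read interaction with a tag transponder might be expressed; the Transceiver class, its timing, and the returned data are hypothetical placeholders rather than the actual behavior of the RF section 1418.

import time
from typing import Optional


class Transceiver:
    """Hypothetical stand-in for the RF transceiver section 1418 (assumption)."""

    def pulse(self) -> None:
        # Signal (pulse) the remote transponder via the antenna.
        print("Interrogation pulse sent")

    def receive(self, timeout_s: float) -> Optional[bytes]:
        # Placeholder for demodulating the return signal from the tag.
        time.sleep(min(timeout_s, 0.01))
        return b"TAG-MEMORY-CONTENTS"


def read_tag(transceiver: Transceiver, timeout_s: float = 0.05) -> Optional[bytes]:
    """Pulse the transponder and return the contents read from the tag memory."""
    transceiver.pulse()
    return transceiver.receive(timeout_s)


if __name__ == "__main__":
    print(read_tag(Transceiver()))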
In another implementation, the device 1400 can provide voice recognition capabilities such that when the device 1400 is used simply as a voice recorder, the processor 1402 can facilitate high-speed conversion of the voice signals into text content for local editing and review, and/or later download to a remote system, such as a computer word processor. Similarly, the converted voice signals can be used to control the device 1400 instead of using manual entry via the keypad 1426.
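By way of illustration only, the following Python sketch shows one way converted voice phrases might be mapped to device commands in place of manual keypad entry; the recognize() hook and the command names are hypothetical assumptions introduced for the example.

COMMANDS = {
    "record": lambda: print("Start voice recording"),
    "stop": lambda: print("Stop voice recording"),
    "send": lambda: print("Download converted text to a remote system"),
}


def recognize(audio_frame: bytes) -> str:
    """Hypothetical voice-to-text hook; returns the recognized phrase."""
    return "record"  # placeholder result


def handle_voice_command(audio_frame: bytes) -> None:
    # Map the recognized phrase to a device command, if one is defined.
    phrase = recognize(audio_frame).strip().lower()
    action = COMMANDS.get(phrase)
    if action is not None:
        action()
    else:
        print("Unrecognized command:", phrase)


if __name__ == "__main__":
    handle_voice_command(b"")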
Onboard peripheral devices, such as a printer 1430, signature pad 1432, and a magnetic strip reader 1434 can also be provided within the housing of the device 1400 or accommodated externally through one or more of the external port interfaces 1416.
The device 1400 can also include an image capture system 1436 such that the user can record images and/or short movies for storage by the device 1400 and presentation by the display 1406. Additionally, a dataform reading system 1438 is included for scanning dataforms. It is to be appreciated that these imaging systems (1436 and 1438) can be a single system capable of performing both functions.
Referring now to FIG. 15, there is illustrated a schematic block diagram of an exemplary computing environment 1500 that can be employed in connection with the subject innovation. The system 1500 includes one or more client(s) 1502, which can be hardware and/or software (e.g., threads, processes, computing devices).
The system 1500 also includes one or more server(s) 1504. The server(s) 1504 can also be hardware and/or software (e.g., threads, processes, computing devices). The servers 1504 can house threads to perform transformations by employing the innovation, for example. One possible communication between a client 1502 and a server 1504 can be in the form of a data packet adapted to be transmitted between two or more computer processes. The data packet may include a cookie and/or associated contextual information, for example. The system 1500 includes a communication framework 1506 (e.g., a global communication network such as the Internet) that can be employed to facilitate communications between the client(s) 1502 and the server(s) 1504.
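By way of illustration only, the following Python sketch shows one form such a data packet exchange might take; the JSON framing, the field names, and the use of a socket pair in place of the communication framework 1506 are assumptions made for the example.

import json
import socket

# A socket pair stands in for the communication framework 1506 so the sketch
# is self-contained; a deployed system would use a network connection.
client_sock, server_sock = socket.socketpair()

# A data packet carrying a cookie and associated contextual information (assumed values).
packet = {
    "cookie": "session-abc123",
    "context": {"locale": "en-US", "view": "receipt"},
}
client_sock.sendall(json.dumps(packet).encode("utf-8"))

received = json.loads(server_sock.recv(4096).decode("utf-8"))
print("Server received cookie:", received["cookie"])

client_sock.close()
server_sock.close()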
Communications can be facilitated via a wired (including optical fiber) and/or wireless technology. The client(s) 1502 are operatively connected to one or more client data store(s) 1508 that can be employed to store information local to the client(s) 1502 (e.g., cookie(s) and/or associated contextual information). Similarly, the server(s) 1504 are operatively connected to one or more server data store(s) 1515 that can be employed to store information local to the servers 1504.
What has been described above includes examples of the innovation. It is, of course, not possible to describe every conceivable combination of components or methodologies for purposes of describing the subject innovation, but one of ordinary skill in the art may recognize that many further combinations and permutations of the innovation are possible. Accordingly, the innovation is intended to embrace all such alterations, modifications and variations that fall within the spirit and scope of the appended claims. Furthermore, to the extent that the term “includes” is used in either the detailed description or the claims, such term is intended to be inclusive in a manner similar to the term “comprising” as “comprising” is interpreted when employed as a transitional word in a claim.