This application is based on and hereby claims priority to German Application No. 10 2012 224 394.1 filed on Dec. 27, 2012, the contents of which are hereby incorporated by reference.
The present invention relates to the technical field of customizing a presentation of information on an interaction device.
The current related art generally requires manual changing between different types of presentation for control and output. Time-controlled mechanisms which, like screensavers, independently change to an information display after a defined period of time without interaction are also typical. The next interaction (mouse movement, touching the screen, etc.) is a manual step that prompts a change back to the control mode.
Simple motion detectors which activate a system when a person is detected in the environment are also known. However, a distinction is not made in this case between information output and control; the system is only activated, generally put into an output mode in this case. A manual step would again be required in this case for a conceivable transition to a control mode.
However, the presentation is not adapted to the user with regard to whether he can actually control the device from his current position or whether the information presented in the output mode can be meaningfully grasped.
Systems which activate or deactivate a display in response to approach are likewise known. Proximity sensors in mobile telephones are the best-known example of this. They switch off the display (and the associated touch-sensitive surface) when the telephone is held close to the ear. This is intended to avoid a control element being inadvertently activated as a result of contact with the body when held to the ear. However, this is a purely binary function (on/off) in the immediate vicinity and cannot be expanded to other situations. In both states, the user is close to the device and is therefore theoretically able to control the latter.
User interfaces (human/machine interface, HMI) are generally optimized for their typical use. If inputs are primarily intended to be possible, corresponding control elements are presented. If, however, the display of information is primarily desired, scarcely any or no control elements are present and the information comes to the fore.
Therefore, one potential object is flexibly customizing a user interface to its use.
According to a first aspect of the inventors' proposal, an interaction device comprises a user interface, a proximity sensor, a logic module and software. The user interface comprises an output device. The software can be executed on the logic module. The software is designed to evaluate data from the proximity sensor and to control the user interface. The proximity sensor is also designed to detect when a user approaches in the visual range of the proximity sensor. The software is designed to use the detected approach to customize a presentation of information on the output device and to refine the presentation of information as the distance between the user and the proximity sensor decreases.
According to another aspect, the inventors propose a method for customizing a presentation of information on an interaction device. In this case, a distance between a user and the interaction device is detected by the interaction device. A presentation of information on the interaction device is then automatically customized by the interaction device using the detected distance. In this case, the presentation of information is automatically refined as the distance of the user decreases.
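The refinement described in this aspect can be sketched, purely by way of illustration, as a function mapping the detected distance to a degree of refinement; the linear mapping and the maximum range used here are assumptions, not part of the proposal, and any monotonic mapping would serve equally well.

```python
# Illustrative sketch only; the linear mapping and max_range_m are assumptions.

def abstraction_level(distance_m, max_range_m=10.0):
    """Map the detected user distance to a degree of abstraction in percent.

    100% corresponds to the coarsest presentation, 0% to the most refined
    one, so the presentation is automatically refined as the distance
    of the user decreases.
    """
    if distance_m >= max_range_m:
        return 100.0
    if distance_m <= 0.0:
        return 0.0
    return 100.0 * distance_m / max_range_m
```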
These and other objects and advantages of the present invention will become more apparent and more readily appreciated from the following description of the preferred embodiments, taken in conjunction with the accompanying drawings of which:
Reference will now be made in detail to the preferred embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout.
Within the scope of this application, the term “proximity/distance sensor” is also used synonymously for the term “proximity sensor”.
According to one preferred embodiment, the detection of approach of the user 1 comprises the determination of a distance between the interaction device 10 and the user 1.
According to another preferred embodiment, the software is designed to receive inputs via an input device. The output device is a touch-sensitive display 54a in the exemplary embodiment illustrated in
In this case, the input device 54a and the output device are advantageously integrated with one another, either in a combined device (for instance a touch panel, a touch-sensitive screen) or by being in the local vicinity of one another, for example physical switching elements in the form of knobs, switches, etc., beside or around the output device.
The proximity/distance sensor continuously detects objects in its detection range. Different technologies can be used for this purpose, for instance:
Depending on the sensor, sensor values of different quality may be recorded. A further difficulty is distinguishing between persons and items. However, objects which remain motionless for a relatively long time may be classified as items and “dismissed”. According to another preferred embodiment, the software 27 is designed to classify an object which remains motionless for a relatively long time as an item and therefore not to interpret this object as a user 1. However, the sensors need not primarily detect movement, but rather, to a certain degree, the distance between the sensor and the user/object.
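The classification just described, dismissing an object that remains motionless for a relatively long time, can be sketched as follows; the class, the stillness threshold and the movement epsilon are illustrative assumptions only.

```python
# Hedged sketch: classify a tracked object as an "item" (not a user 1)
# once it has remained essentially motionless longer than a threshold.
# All names and thresholds below are illustrative assumptions.

STILLNESS_THRESHOLD_S = 30.0   # seconds without movement before reclassifying
MOVEMENT_EPSILON_M = 0.05      # distance change below this counts as "motionless"

class TrackedObject:
    def __init__(self, distance_m, timestamp_s):
        self.last_distance_m = distance_m
        self.last_movement_s = timestamp_s

    def update(self, distance_m, timestamp_s):
        # Register a movement only if the distance changed noticeably.
        if abs(distance_m - self.last_distance_m) > MOVEMENT_EPSILON_M:
            self.last_movement_s = timestamp_s
        self.last_distance_m = distance_m

    def is_item(self, timestamp_s):
        """True once the object has been motionless long enough to be dismissed."""
        return timestamp_s - self.last_movement_s > STILLNESS_THRESHOLD_S
```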
In order to improve the sensor data obtained, different sensors can also be combined with one another.
According to one preferred embodiment, the customization of the presentation of information comprises customization of a display abstraction. When a user moves into the visual range of the proximity/distance sensor 5, the output device 54a is first of all activated, for example woken from the power-saving mode. In this case, the interaction device 10 begins with a coarse information mode. As the distance between the interaction device 10 and the user decreases, the presentation of information is refined, either in arbitrary discrete stages or in an infinitely variable manner. The abstraction of the output presentation therefore declines as the distance decreases.
In the direct vicinity of the interaction device 10—it is likely in this case that the user 1 could now actually interact with the device—the interaction device 10 changes to the control mode. The outputs are now optimized for the user to interact with the interaction device 10. This comprises, for example, the selection, manipulation and changing of control elements. The customization of the presentation of information therefore comprises customization of an interaction mode.
Preferred embodiments therefore solve the problem of how the user interface can be flexibly customized to its use by automatically changing between control and different output presentations.
This is based on the fact that a user 1 can directly control the interaction device 10 only in the immediate vicinity of the latter. At a certain distance, it is only possible to view the user interface 14. The display of control elements is therefore unnecessary and takes up space. In this case, it is desirable to shift the focus more toward the display of information. In addition, it is desirable to reduce the abundance of information with increasing distance since the human eye can no longer completely resolve the presented information with increasing distance. If the user 1 is even completely outside the visual range of the device, the latter can also save energy and can deactivate the user interface 14.
This therefore results in the following three modes:
(1) offline/power-saving mode—the output device (or else the associated overall device) is deactivated or is in a power-saving mode;
(2) information mode (with abstraction levels)—the output device solely presents information;
(3) control mode—the output device shows elements for assisting with input.
In this case, the information mode may have different abstraction levels, alternating in steps or flowing, depending on the distance between the user and the device.
The logic module 12 therefore processes the sensor values in such a manner that the corresponding mode is determined and, within the information mode, the degree of abstraction is set (for example as a percentage, where 100% corresponds to the coarsest presentation).
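The processing performed by the logic module 12 could be sketched, under stated assumptions, as a single function that maps the sensor distance to one of the three modes and, within the information mode, to the degree of abstraction in percent; the two thresholds are hypothetical values, not part of the proposal.

```python
# Sketch of how the logic module 12 might derive the mode and the degree
# of abstraction from the sensed distance. Thresholds and mode names are
# illustrative assumptions.

OFFLINE = "offline/power-saving"
INFORMATION = "information"
CONTROL = "control"

VISUAL_RANGE_M = 10.0   # beyond this: offline/power-saving mode
REACH_M = 0.6           # within arm's reach: control mode

def select_mode(distance_m):
    """Return (mode, abstraction_percent); abstraction applies only in
    the information mode, where 100% is the coarsest presentation."""
    if distance_m is None or distance_m > VISUAL_RANGE_M:
        return OFFLINE, None
    if distance_m <= REACH_M:
        return CONTROL, None
    span = VISUAL_RANGE_M - REACH_M
    abstraction = 100.0 * (distance_m - REACH_M) / span
    return INFORMATION, round(abstraction, 1)
```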
The base device 40 comprises a socket 44 for controlling the display 54a, a socket 49 for controlling further elements on other attachments which can be plugged in, such as for mechanical switches, and a housing 42 in which a communication device is accommodated. The communication device comprises the logic module 12, a radio unit and possible further components. The base device 40 also comprises a bus terminal 43 to which a connection cable 93 for a building control bus system 63 can be connected. The base device 40 also comprises a further terminal 46 to which a further connection cable 96 for a data network can be connected.
The interaction device 10 itself can preferably be installed in the flush-mounted box 90. The user 1 sees only the touch display 54a on the attachment 50a. The interaction device and, in particular, its user interface can be changed by plugging in another attachment, for example one of the attachments 50, 50c described in
In this case, the control mode may also comprise only the change between different items of information or programs to be presented.
Example: only a few centimeters in front of the device, for instance when a finger approaches, other items of information or programs are displayed at the side on a touchscreen (they effectively project somewhat into the image); the corresponding information or program is then shifted to the center by “swiping” the screen. When the finger is removed, the elements at the side are cleared again. This is illustrated in
Optionally, the direction of the user in relation to the combined output/input device can also be taken into account. If the user is standing in front of the interaction device, for example, and moves his hand at the right-hand edge of the interaction device, control elements can be presented primarily on the right when changing from the pure information mode to the control mode (possibly useful only on touch-sensitive screens).
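This optional direction handling could be sketched as below; the normalized hand coordinate reported by the sensor, and the one-third split of the screen, are hypothetical assumptions made only for illustration.

```python
# Illustrative sketch: when changing from the pure information mode to the
# control mode, place control elements near the edge where the user's hand
# was detected. hand_x_norm is a hypothetical sensor input.

def control_element_side(hand_x_norm):
    """hand_x_norm: horizontal hand position, 0.0 (left edge) to 1.0 (right edge)."""
    if hand_x_norm < 1.0 / 3.0:
        return "left"
    if hand_x_norm > 2.0 / 3.0:
        return "right"
    return "center"
```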
Preferred embodiments include the advantageous combination of the following functions:
The advantages of this solution are:
The proposed controller can be installed in various devices which are provided for input/output. These may be expansions of conventional desktop or tablet computers, but also information carousel systems, information terminals, HMI interfaces for production devices, etc.
In particular, a form for display-assisted interaction devices in building control is also conceivable, as shown in
According to preferred embodiments, an interaction device 10 in the building changes between different display abstractions and interaction modes depending on the distance of the user. Example of a heating and air-conditioning system controller installed in a room as a flush-mounted device: if there is no user in the room, the device is off. If a user is 10 m away, the entire display appears only in one color, for example blue for “too cold”, red for “too warm” and green for “target temperature reached”. The closer the user comes to the device, the more information appears: the current temperature, the target temperature, the ventilation mode. If the user comes within reach of the device, operating elements (for example arrows) for adjusting the target temperature and the ventilation mode appear. A proximity sensor is used in this case.
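The heating and air-conditioning example above can be sketched as follows; the distance thresholds, the temperature tolerance band and the returned structure are illustrative assumptions only.

```python
# Sketch of the heating/air-conditioning controller example: the displayed
# content depends on the user's distance. Thresholds, colors and the
# tolerance band are illustrative assumptions.

def thermostat_display(distance_m, current_c, target_c, tolerance_c=0.5):
    if distance_m is None:           # no user in the room: device is off
        return {"state": "off"}
    if distance_m > 5.0:             # far away: the entire display is one color
        if current_c < target_c - tolerance_c:
            color = "blue"           # too cold
        elif current_c > target_c + tolerance_c:
            color = "red"            # too warm
        else:
            color = "green"          # target temperature reached
        return {"state": "color", "color": color}
    if distance_m > 0.6:             # closer: temperatures and ventilation mode
        return {"state": "info", "current_c": current_c, "target_c": target_c}
    # within reach: operating elements (e.g. arrows) appear
    return {"state": "control", "elements": ["target_up", "target_down", "vent_mode"]}
```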
The proposals are preferably used in relatively complex building control interaction devices. Further possible uses are in vending machines (for example for tickets), information kiosks (at railway stations or airports) or in billboards.
The invention has been described in detail with particular reference to preferred embodiments thereof and examples, but it will be understood that variations and modifications can be effected within the spirit and scope of the invention covered by the claims which may include the phrase “at least one of A, B and C” as an alternative expression that means one or more of A, B and C may be used, contrary to the holding in Superguide v. DIRECTV, 69 USPQ2d 1865 (Fed. Cir. 2004).
Number | Date | Country | Kind
---|---|---|---
10 2012 224 394.1 | Dec 2012 | DE | national