With the “digital revolution” in full swing, the penetration of large displays into the retail and commercial space continues to increase. Digital advertising displays in a retail space and flight information displays in airports are just two examples of non-interactive large displays providing data to a viewer. An evolutionary step in the growth of large displays, the large interactive display, or LID, provides the viewer with a unique ability to interact with the device, to select the data displayed on the device, or to access desired content. One potential stumbling block to widespread public acceptance is the perception that private or confidential data displayed on the device, for example email, images, and video, may be compromised by other individuals proximate to or using the large interactive device.
Advantages of one or more disclosed embodiments may become apparent upon reading the following detailed description and upon reference to the drawings in which:
A large interactive device (LID) can support multiple contemporaneous user sessions. As LIDs appear more frequently in public settings, maintaining the privacy of personal or confidential data presents a significant issue. To increase public acceptance and use of the LID, a user should have the ability with a simple gesture to adjust one or more display parameters of any personal or confidential data they may be accessing to prevent disclosure of the information to another LID user or passers-by.
A method for displaying data on a large interactive device is provided. The method can include detecting a cupped hands gesture including a first cupped hand and a second cupped hand proximate a LID display surface. The method can further include altering a first display parameter of a display object proximate the cupped hands gesture in response to the detection of the cupped hands gesture.
A system for displaying data on a large interactive device is also provided. The system can include a processor, at least one display device, and at least one sensor coupled to the processor. The at least one sensor can detect the presence of a cupped hands gesture proximate the display device. The system can further include logic, which when executed by the processor alters a first display parameter of a display object proximate the cupped hands gesture responsive to detection of the cupped hands gesture by the at least one sensor.
Another method for displaying data on a large interactive device is also provided. The method can include generating a display object for a user and displaying the display object on a LID proximate the user. The method can further include detecting a cupped hands gesture, comprising a first cupped hand and a second cupped hand, proximate a display surface of the LID by the user. The method can include altering a display parameter of the display object proximate the cupped hands gesture responsive to detecting the cupped hands gesture, the display parameter including at least one of: display object intensity, display object size, and display object sharpness.
As used herein, the term “display object” can refer to any media, streaming media, visual communication, or the like provided by a processor to a user via the display device. Display objects can include images, video, or any combination of images and video. Display objects can also include images containing text data in whole or in part.
As used herein, the term “display parameter” can refer to any criterion, value, or level affecting the visual display of data on the display device. Example display parameters include, but are not limited to: color, brightness, gamma, contrast, sharpness, size, location, or combinations thereof.
As a hypothetical exercise, envision a user accessing private or confidential information via a public LID. The user, sensing others around, desires to make the private or confidential information less accessible to those nearby. The method 100 can include the user making a cupped hands gesture proximate the LID display surface at 110. The cupped hands gesture can include a first cupped hand and a second cupped hand, for example a user cupping both a right (first) and a left (second) hand.
The cupped hands gesture can indicate a desire for limiting visibility of others, therefore making the gesture proximate the LID display surface may be considered an “intuitive” gesture indicating the user's desire to exclude others from viewing the display object. Since touch based and proximity sensors are available, the cupped hands gesture can be either a “contact” type gesture on touch screen devices, or a “proximity” type gesture on devices capable of sensing objects proximate the display surface.
In response to detecting the cupped hands gesture at 110, the method can include altering a first display parameter of a display object proximate the cupped hands gesture at 120. The first display parameter can include, without limitation, at least one of the intensity of the display object, the size of the display object, and the sharpness of the display object. Altering the first display parameter can obscure or render unintelligible the display object proximate the user's cupped hands gesture. For example, in some embodiments, after the cupped hands gesture is detected by the LID, the intensity of the display object can be altered to make the display object indistinguishable or invisible when viewed against the LID display surface.
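As a minimal sketch of this step, the snippet below models a display object and zeroes its intensity when the gesture is reported. The `DisplayObject` class and handler name are hypothetical illustrations, not part of any actual LID software.

```python
# Hypothetical sketch: when a cupped hands gesture is detected proximate
# a display object, alter its first display parameter (intensity) so the
# object becomes indistinguishable against the display surface.
# DisplayObject and on_cupped_hands_gesture are illustrative names only.

class DisplayObject:
    def __init__(self, intensity=1.0, size=1.0, sharpness=1.0):
        self.intensity = intensity  # 1.0 = fully visible, 0.0 = invisible
        self.size = size            # relative scale factor
        self.sharpness = sharpness  # 1.0 = fully sharp

def on_cupped_hands_gesture(obj, dim_to=0.0):
    """Respond to the detected gesture by dimming the object."""
    obj.intensity = dim_to
    return obj

email_view = DisplayObject()
on_cupped_hands_gesture(email_view)
print(email_view.intensity)  # 0.0
```

The other display parameters are left untouched here; the sections that follow describe how subsequent gestures can alter them in turn.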
In response to detecting the translation of the first cupped hand toward the second cupped hand at 210, the method can include altering a second display parameter of the display object proximate the cupped hands gesture at 220. The second display parameter can include, without limitation, at least one of the intensity of the display object, the size of the display object, and the sharpness of the display object. Altering the second display parameter can further obscure or render unintelligible the display object proximate the user's cupped hands gesture.
For example, in some embodiments, after the cupped hands gesture is detected by the LID, the intensity of the display object can be altered to make the display object indistinguishable or invisible when viewed against the LID display surface. After the translation of the first cupped hand towards the second cupped hand is detected, the size of the display object can be reduced commensurate with that translation. In so doing, both the intensity (the first display parameter) and the size (the second display parameter) can be altered in response to the detection of the cupped hands gesture and the translation of the first cupped hand towards the second cupped hand, respectively.
For example, in some embodiments, after the cupped hands gesture is detected by the LID, the intensity of the display object can be altered, rendering the display object indistinguishable or invisible when viewed against the LID display surface. After the translation of the first cupped hand towards the second cupped hand is detected, the size of the display object can be reduced. The sharpness of the display object can then be reduced when the user rotates one of their cupped hands to a palm down gesture. In so doing, the intensity (the first display parameter), the size (the second display parameter), and the sharpness (the third display parameter) can all be altered in response to the detection of the cupped hands gesture, the translation of the first cupped hand towards the second cupped hand, and the rotation of the first or second cupped hand to a palm down position, respectively.
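The three-stage sequence above can be sketched as a simple event handler, with each successive gesture event altering a further display parameter. The event names and the 50% factor are assumptions for illustration, not values from the text.

```python
# Illustrative three-stage privacy sequence (event names are assumed):
# each successive gesture event alters a further display parameter.

def apply_privacy_event(params, event, factor=0.5):
    """params: dict holding 'intensity', 'size', 'sharpness' in [0, 1]."""
    if event == "cupped_hands":           # first parameter: intensity
        params["intensity"] = 0.0
    elif event == "hands_translate_in":   # second parameter: size
        params["size"] *= factor          # shrink with the translation
    elif event == "palm_down_rotation":   # third parameter: sharpness
        params["sharpness"] *= factor
    return params

state = {"intensity": 1.0, "size": 1.0, "sharpness": 1.0}
for ev in ("cupped_hands", "hands_translate_in", "palm_down_rotation"):
    apply_privacy_event(state, ev)
print(state)  # {'intensity': 0.0, 'size': 0.5, 'sharpness': 0.5}
```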
For clarity and ease of discussion,
The system 400 as depicted in
The at least one display device 410 can include any number of independent displays arranged in a regular or irregular array to provide a single large interactive device. The independent display or displays forming the at least one display device can use any existing or future display technology including, but not limited to, cathode ray tube (CRT) technology, light emitting diode (LED) technology, gas plasma technology, organic LED (OLED) technology, liquid crystal display (LCD) technology, three dimensional display technology, or any combination thereof.
In some embodiments, the at least one display device 410 can include a plurality of display devices, with each of the plurality of display devices disposed proximate at least one other of the plurality of display devices. For example, the at least one display device can include nine (i.e., a plurality) 42″ LCD displays arranged in a 3×3 array.
The processor 420 can include one or more devices configured to execute a machine executable instruction set. In some embodiments, a single processor 420 can be used to generate graphical data for display on the LID and execute one or more machine executable instruction sets. In other embodiments, the processor 420 can include a plurality of processors, each having one or more dedicated functions, for example a central processing unit (CPU) executing a machine executable instruction set and a graphical processing unit (GPU) providing at least a portion of the graphical data displayed on the LID. The processor 420 can have any number of data inputs and any number of data outputs. For example, in at least some embodiments, the processor 420 can have a data input to accept sensor data generated by the at least one sensor 430. The processor 420 can be disposed at least partially within the at least one display device 410. In some embodiments, the processor 420 can be disposed within a housing external to the at least one display device 410.
The at least one sensor 430 can include any number of systems, devices, or any combination of systems and devices configured to detect the presence of one or more gestures proximate the display device 410. In some embodiments, the at least one sensor can include a touch sensitive surface, for example a capacitive touch sensor, built into the display device 410. In some embodiments, the at least one sensor can include one or more acoustic or electromagnetic based touch detection technologies housed within a bezel at least partially surrounding the display device 410.
The at least one sensor 430 can detect the presence of the cupped hands gesture 440 proximate the LID display surface as a plurality of continuous touches formed using the edge of the pinky finger and the palm. The at least one sensor 430 can detect the translation gesture 441 of one or both cupped hands on the display surface of the display device 410. The at least one sensor 430 can detect the palm down cupped hand gesture 442 on the display surface of the display device 410. The output generated by the at least one sensor 430 can be used to provide an input to the processor 420.
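One way to recognize the contact pattern described above is to treat an edge-on cupped hand as a chain of contiguous touch centroids (pinky edge plus palm). The classifier below is a hypothetical sketch; the threshold values and function name are assumptions, not drawn from any actual sensor API.

```python
# Hypothetical classifier for the cupped hand contact pattern: a cupped
# hand resting edge-on registers as a run of contiguous touch centroids.
# Thresholds (max_gap, min_points) are illustrative placeholders.

import math

def is_cupped_hand(touches, max_gap=1.5, min_points=5):
    """touches: list of (x, y) contact centroids from one touch cluster.
    Returns True when the points form a single contiguous chain, as the
    edge-of-hand contact described above would."""
    if len(touches) < min_points:
        return False
    pts = sorted(touches, key=lambda p: p[1])  # order along the hand edge
    return all(math.dist(a, b) <= max_gap for a, b in zip(pts, pts[1:]))

print(is_cupped_hand([(0.0, i * 1.0) for i in range(6)]))  # True
print(is_cupped_hand([(0.0, 0.0), (0.0, 10.0)]))           # False
```

A production implementation would also need to cluster raw touches and track the two hands over time to detect the translation gesture 441 and the palm down gesture 442.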
The processor 420 can execute a machine executable instruction set or logic 450 that alters a first display parameter of a display object 460 proximate the cupped hands gesture 440 responsive to detection of the cupped hands gesture by the at least one sensor 430.
Referring next to
Referring next to
Referring finally to
A display object can be generated at 510. The display object can include any one way (e.g., email) or two-way (e.g., video conference) communication between the user and the display device 410.
The display object can be displayed on the LID, proximate a user at 520. A cupped hands gesture can be detected on the display surface of the display device at 530. In at least some embodiments, the cupped hands gesture can be detected at 530 using a remote sensor disposed in a bezel surrounding the display device. In other embodiments the cupped hands gesture can be detected at 530 using a sensor integrated into the display surface, for example a capacitive touch sensor overlaying the display surface.
A display parameter associated with the display object can be altered responsive to the detection of the cupped hands gesture at 540. The altered display parameter can include at least one of: the display object intensity, the display object size, and the display object sharpness. Altering the display parameter at 540 can include, in some embodiments, adjusting the one or more display parameters to a predetermined value, for example, reducing the intensity, size, or sharpness of the display object by 50% in order to reduce the visibility or legibility of the display object.
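The "predetermined value" adjustment described above can be sketched as a single helper that halves each selected parameter. The function and dictionary layout are illustrative assumptions; only the parameter names and the 50% figure come from the text.

```python
# Sketch of the predetermined-value adjustment: reduce each selected
# display parameter by a fixed 50% when the gesture is detected.

def reduce_for_privacy(params, keys=("intensity", "size", "sharpness"),
                       fraction=0.5):
    """Scale the chosen display parameters by `fraction` (0.5 = halve)."""
    for k in keys:
        params[k] *= fraction
    return params

obj = reduce_for_privacy({"intensity": 1.0, "size": 0.8, "sharpness": 1.0})
print(obj)  # {'intensity': 0.5, 'size': 0.4, 'sharpness': 0.5}
```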
Diminishing the display parameter in response to the detection of the translation of the user's hands towards each other at 620 can include further diminishing the intensity of the display object, further diminishing the size of the display object, or further diminishing the sharpness of the display object. Such reductions can further reduce the visibility or legibility of the display object, thereby increasing the privacy of the display object.
Increasing the display parameter in response to the detection of the translation of the user's hands away from each other at 720 can include increasing the intensity of the display object, increasing the size of the display object, or increasing the sharpness of the display object. Such increases can enhance the visibility or legibility of the display object, for example when the user desires to resume interaction with the display object.
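The diminish and increase steps at 620 and 720 are symmetric, so both can be sketched as one adjustment driven by the change in separation between the user's hands. The gain constant and the normalized separation units are assumptions for illustration.

```python
# Illustrative symmetric adjustment: hands translating together diminish
# a display parameter, hands translating apart restore it, clamped to
# [0, 1]. delta_separation is the change in hand separation, expressed
# here as a fraction of the display width (an assumed convention).

def adjust_for_translation(value, delta_separation, gain=0.25):
    """Negative delta_separation = hands moving toward each other."""
    return min(1.0, max(0.0, value + gain * delta_separation))

v = 0.5
v = adjust_for_translation(v, -1.0)  # hands together: dims to 0.25
v = adjust_for_translation(v, +4.0)  # hands apart: clamps at 1.0
print(v)  # 1.0
```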
Certain embodiments and features may have been described using a set of numerical upper limits and a set of numerical lower limits. It should be appreciated that ranges from any lower limit to any upper limit are contemplated unless otherwise indicated. Certain lower limits, upper limits and ranges appear in one or more claims below. All numerical values are “about” or “approximately” the indicated value, and take into account experimental error and variations that would be expected by a person having ordinary skill in the art.
While the foregoing is directed to embodiments of the present invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow.
Filing Document | Filing Date | Country | Kind | 371(c) Date
---|---|---|---|---
PCT/US2011/033694 | 4/22/2011 | WO | 00 | 8/27/2013