The present invention relates to a method of controlling an information board using human-initiated gestures, to a gesture-capturing device, and to a computer program performing such a method. Specifically, the invention relates to facilitating user input by means of a gesture recognition system.
An information board is a form of communication used to present various content items to the user. Content includes any form of data that can be perceived by the human senses. The present invention relates to an information board of the type comprising a display that accepts input from a gesture-capturing device. The information board may be used to browse websites, read news, print documents, draw shapes, play music, or perform any other functionality made available to the user. The input device, an RGBD camera or a camera capable of image segmentation, captures human gestures, and software running on the system translates the user's hand, body, and motion gestures into meaningful operations and controls in real time, which serve as the means of navigating the information board. These gesture-based controls create an interactive user interface in which the user can navigate, select, and operate the various features included on the information board.
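By way of illustration only, the following is a minimal sketch of such a capture-and-translate loop. The DepthCamera class, the motion thresholds, and the command names are assumptions made for the example and are not part of the disclosed system; a real implementation would read hand positions from the actual RGBD driver.

```python
import random
import time
from dataclasses import dataclass
from typing import Optional

@dataclass
class HandSample:
    x: float   # horizontal position, metres relative to sensor centre
    y: float   # vertical position
    z: float   # distance from the sensor (depth)

class DepthCamera:
    """Stand-in for an RGBD / segmentation-aware camera driver."""
    def read_hand(self) -> HandSample:
        # A real driver would segment the user and return the tracked
        # hand position; here one is fabricated for demonstration.
        return HandSample(random.uniform(-0.5, 0.5),
                          random.uniform(-0.5, 0.5),
                          random.uniform(0.8, 2.0))

def classify(prev: HandSample, curr: HandSample) -> Optional[str]:
    """Translate frame-to-frame hand motion into a board command."""
    if prev.z - curr.z > 0.15:        # hand moved sharply toward the screen
        return "PUSH"
    if curr.x - prev.x > 0.20:
        return "RIGHT"
    if prev.x - curr.x > 0.20:
        return "LEFT"
    if curr.y - prev.y > 0.20:
        return "UP"
    return None

if __name__ == "__main__":
    camera = DepthCamera()
    prev = camera.read_hand()
    for _ in range(10):               # a real system would loop indefinitely
        time.sleep(0.03)              # roughly 30 frames per second
        curr = camera.read_hand()
        command = classify(prev, curr)
        if command:
            print("dispatching command:", command)
        prev = curr
```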
Navigation and selection features include the selection of content, sending and receiving messages, asking questions, and the use of all other information board features. The use of human gestures to navigate the information board eliminates the need for control devices such as a touch screen, keyboard, mouse, remote control, or other physical control devices. By replacing markers, keyboards, and mouse pointer controls, this new interactive information board dramatically improves the user's experience and makes information easier to navigate, discover, and control.
An objective of the invention is to overcome at least some of the drawbacks of touch-controlled and other human-contact-controlled information board designs. Known information boards suffer from the disadvantage of being difficult to navigate and understand, and they require human contact with a physical device for navigation. This may contribute to the exchange of bacteria, viruses, and dirt in the public environments where information boards are normally displayed. Known information boards also deliver only unchanging information in order to eliminate the need for a remote or other control device that could be stolen, lost, or damaged.
Other traditional interactive information boards and “white boards” use digital pens as input devices, with digital ink replacing traditional “white board” markers. Digital pens are often lost or broken and can be difficult to use. In these types of devices, projectors are often used to display a computer's video output on the white board interface, which acts as a large touch screen. Proper lighting is needed, as well as a touchable surface. Interactive white boards also typically require users to train the software before any interactive functionality can be used. The proposed interactive information board does not suffer from any of these requirements, as no special surface, lighting, digital pen, or touch-related equipment is needed.
It is the principal object of the present invention to provide an interactive information board that can interact with the user in a safe and understandable way without exposing the user to potentially harmful elements. The present invention can be used in such places as hospitals, nurseries, day-care centers, bus and airport terminals, schools, and any public place where there is a high volume of public use and where traditional physically controlled devices requiring human contact might become contaminated, dirty, broken, or lost.
Having thus described the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
The system provides a gesture-based interface used to navigate and control the display of information and content on an information board shown on a screen. The operator of the system navigates and manipulates the elements of the information board by issuing gestural commands using the operator's hands, arms, and other body elements. The user's head, feet, arms, legs, or entire body may be used to provide the navigation and control. A system of cameras detects the position, orientation, and movement of the user's hands and body elements and translates that information into executable commands used to navigate and operate the information board.
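As one possible, purely illustrative realization of this translation step, the tracked hand position can be projected from sensor coordinates onto the display so that the board can both draw an on-screen cursor and determine which element a gesture applies to. The interaction-box dimensions and function names below are assumptions for the sketch, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Screen:
    width_px: int
    height_px: int

def hand_to_screen(x_m: float, y_m: float, screen: Screen,
                   box_width_m: float = 1.0, box_height_m: float = 0.6):
    """Map a hand position (metres, sensor-centred) to pixel coordinates.

    The "interaction box" is a rectangle in front of the sensor within
    which hand motion is scaled to cover the full display.
    """
    # Normalise to the 0..1 range, clamping hands that leave the box.
    nx = min(max((x_m + box_width_m / 2) / box_width_m, 0.0), 1.0)
    ny = min(max((y_m + box_height_m / 2) / box_height_m, 0.0), 1.0)
    # Screen y grows downward, so invert the vertical axis.
    return (int(nx * (screen.width_px - 1)),
            int((1.0 - ny) * (screen.height_px - 1)))

if __name__ == "__main__":
    screen = Screen(1920, 1080)
    # A hand slightly right of centre and a little above sensor level.
    print(hand_to_screen(0.2, 0.1, screen))   # (1343, 359)
```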
The following is a description of one cycle of a typical session with the interactive information board (a simplified code sketch of this cycle is given after the list):
1.) The interactive information board starts by default in “sleep mode,” with a lock icon displayed at the bottom-left corner of the main window along with Text/Video helpers that guide the user to trigger a push gesture to start the system.
2.) To enter controlling mode or start a new controlling session, the user simply pushes a hand toward the large display (TV, projector, or monitor screen); the hand must be within the device's sensing range. The user may also consult the HELP guide to become more familiar with the interactive information board's usage features.
3.) Once a push gesture is detected, the application will be unlocked and will jump to the “Menu Page.” The detected hand or body movement (a red point) will also be displayed on the screen. A mode indicator will show the user which mode is currently active. If a user is willing to interact with the information board but the user's hand is not currently within the partitioned distance (see
4.) On the “Root Page,” a few 3D-like selectable object icons will be positioned in a circle; one icon is displayed at a larger size, indicating that it is currently selected, while the other icons are displayed at a smaller size, indicating that they are not currently selected. To switch icons on the menu, the user may trigger a left or right gesture. After the desired icon is selected, the user may trigger a push gesture to view the contents under the selected icon; the main menu will then disappear and the corresponding contents will be displayed. Throughout the entire process the user may rely on the Text/Video Helpers for guidance.
5.) Some sample functions provided within the contents include a “Left/Right gesture” for switching slides, a “Grab gesture” for printing forms, and an “Up gesture” for going back to the main menu.
6.) To exit from active mode into sleeping mode, the user simply walks out of the device's working range.
7.) The entire process is repeated once a user is detected again.
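For illustration only, the session cycle described above can be summarized as a small state machine. The state names, gesture labels, and menu icons below mirror the description but are a hedged sketch rather than the actual implementation.

```python
from enum import Enum, auto

class Mode(Enum):
    SLEEP = auto()    # lock icon shown, waiting for a push gesture
    MENU = auto()     # circular menu of selectable icons
    CONTENT = auto()  # contents of the selected icon are displayed

class InformationBoard:
    MENU_ICONS = ["News", "Documents", "Drawing", "Music"]  # illustrative

    def __init__(self):
        self.mode = Mode.SLEEP
        self.selected = 0

    def on_user_lost(self):
        """User walked out of the device's working range."""
        self.mode = Mode.SLEEP

    def on_gesture(self, gesture: str):
        if self.mode is Mode.SLEEP:
            if gesture == "PUSH":              # unlock and show the menu page
                self.mode = Mode.MENU
        elif self.mode is Mode.MENU:
            if gesture in ("LEFT", "RIGHT"):   # rotate the icon circle
                step = -1 if gesture == "LEFT" else 1
                self.selected = (self.selected + step) % len(self.MENU_ICONS)
            elif gesture == "PUSH":            # open the selected icon's contents
                self.mode = Mode.CONTENT
        elif self.mode is Mode.CONTENT:
            if gesture in ("LEFT", "RIGHT"):
                print("switching slide")
            elif gesture == "GRAB":
                print("printing form")
            elif gesture == "UP":              # back to the main menu
                self.mode = Mode.MENU

if __name__ == "__main__":
    board = InformationBoard()
    for g in ["PUSH", "RIGHT", "PUSH", "GRAB", "UP"]:
        board.on_gesture(g)
        print(g, "->", board.mode.name,
              "selected:", board.MENU_ICONS[board.selected])
    board.on_user_lost()
    print("user left ->", board.mode.name)
```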
Related U.S. Application Data: Provisional Application No. 61/526,220, filed August 2011 (US).