The present application claims priority under 35 U.S.C. §119 of Japanese Application No. 2010-257779 filed on Nov. 18, 2010, the disclosure of which is expressly incorporated by reference herein in its entirety.
1. Field of the Invention
The present invention relates to a screen operation system allowing a user to operate a screen displayed on an image display apparatus by an information processing apparatus.
2. Description of Related Art
A screen operation system operated through what is commonly referred to as a Graphical User Interface (GUI) is widely used, in which a screen is displayed on an image display apparatus by predetermined programs in an information processing apparatus, and a user uses an input device, such as a mouse, to operate an operation target, such as an icon, on the screen and thereby provides predetermined instructions to programs executed in the information processing apparatus.
Using a projector as an image display apparatus provides a large screen, which is suitable for a conference with a large audience. In order to operate the screen, however, an input device is required which is connected to an information processing apparatus that controls the projector. In the case where a plurality of users operate the screen, it is inconvenient for the users to take turns to operate the input device of the information processing apparatus.
A known technology related to such a screen operation system using a projector screen allows a user to move a pointing object, such as a fingertip, in front of the projected screen for screen operation (refer to Related Art 1).
The conventional technology mentioned above captures a projected screen using a camera installed in the projector, analyzes a captured image of a pointing object, such as a user's hand, that overlaps the screen, and detects operation of the pointing object. It is necessary to move the pointing object so as to overlap an operation target, such as an icon, on the screen. There is thus a problem in that a person operating the screen must stand in front of it, which makes screen operation inconvenient.
In view of the circumstances above, an object of the present invention is to provide a screen operation system configured to allow easy screen operation.
An advantage of the present invention provides a screen operation system obtaining operation information associated with an operation performed by a user with a pointing object relative to a screen of an image display apparatus controlled by an information processing apparatus and causing the information processing apparatus to execute processing associated with the operation information. The screen operation system includes a mobile information apparatus including a camera capturing an image of the screen of the image display apparatus; a display displaying the image captured by the camera; and a communicator communicating information with the information processing apparatus. The operation information is obtained based on captured image information obtained from an image, captured by the camera, in which the pointing object overlaps a predetermined position on the screen of the image display apparatus.
According to the present invention, a user uses the mobile information apparatus that he carries to capture the screen of the image display apparatus and to operate the screen. Thus, the user can operate the screen as long as he is in a place where he can see the screen of the image display apparatus. Accordingly, the user can operate the screen without being in front of the screen.
The present invention is further described in the detailed description which follows, in reference to the noted plurality of drawings by way of non-limiting examples of exemplary embodiments of the present invention, in which like reference numerals represent similar parts throughout the several views of the drawings, and wherein:
The particulars shown herein are by way of example and for purposes of illustrative discussion of the embodiments of the present invention only and are presented in the cause of providing what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the present invention. In this regard, no attempt is made to show structural details of the present invention in more detail than is necessary for the fundamental understanding of the present invention, the description taken with the drawings making apparent to those skilled in the art how the forms of the present invention may be embodied in practice.
The embodiments of the present invention are explained below with reference to the drawings.
Examples of the mobile information apparatus 5 include mobile telephone terminals (including Personal Handy-phone System (PHS) terminals), smartphones, and PDAs. The mobile information apparatus 5 has a camera that captures the projected screen 4 of the projector 2. A user captures an image of the projected screen 4 of the projector 2 using the mobile information apparatus 5. While viewing a screen of a display 8 on which the captured image is displayed, the user moves a pointing object 6, such as the user's hand or fingertip or a pointer, onto a predetermined position where an operation target, such as an icon, is located on the projected screen 4 of the projector 2. Thereby, the user operates the projected screen 4 of the projector 2.
The mobile information apparatus 5 captures a capture area 7 along with the pointing object 6, such as a finger, the capture area 7 being provided in a field angle of the camera within the projected screen 4 of the projector 2. The captured image includes the pointing object 6, such as a finger, that overlaps a predetermined position in the capture area 7. Based on the captured image information, operation information is obtained pertaining to the operation performed by the user with the pointing object 6 relative to the projected screen 4 of the projector 2.
The mobile information apparatus 5 and the information processing apparatus 1 can communicate with each other through a wireless communication medium, such as a wireless LAN. The mobile information apparatus 5 and the information processing apparatus 1 share a processing load of obtaining the operation information from the captured image information, and the mobile information apparatus 5 transmits predetermined information to the information processing apparatus 1 on a real-time basis.
Based on the captured image information output from the input section 12 and camera shake information output from the camera shake sensor 13, the moving body tracking processor 14 detects a relative movement of a captured object and the camera 11.
Based on the information obtained in the moving body tracking processor 14, the image analyzer 15 identifies the screen 3 and then an area of the projected screen 4 on the screen 3. The projection area is identified based on an indicator image displayed in a predetermined position on the projected screen 4. The indicator image is a distinctive image within the projected screen 4, such as, for example, an image of a start button displayed at the lower left of the projected screen 4. It is possible to use an image of a marker displayed on the projected screen particularly for identifying the projection area.
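The embodiment does not specify how the indicator image is located within the captured frame. A minimal sketch of one possibility, exhaustive sum-of-squared-differences template matching on a grayscale frame, might look like this (the function name and data layout are illustrative assumptions):

```python
def find_indicator(frame, template):
    """Locate a known indicator image inside a captured grayscale frame.

    frame and template are 2-D lists of pixel intensities. Returns the
    (row, col) of the top-left corner of the best match, i.e. the window
    minimizing the sum of squared differences against the template.
    """
    fh, fw = len(frame), len(frame[0])
    th, tw = len(template), len(template[0])
    best_ssd, best_pos = None, None
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            ssd = sum((frame[r + i][c + j] - template[i][j]) ** 2
                      for i in range(th) for j in range(tw))
            if best_ssd is None or ssd < best_ssd:
                best_ssd, best_pos = ssd, (r, c)
    return best_pos
```

A production implementation would more likely use a normalized correlation measure robust to projector brightness, but the principle of anchoring the projection area to a known distinctive sub-image is the same.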
Based on the captured image information output from the input section 12 and the information obtained in the moving body tracking processor 14, the pointing object detector 16 detects, by movement recognition, a portion where a movement is different from the entire captured image, and then determines, by shape recognition, whether the portion is a pointing object. The pointing object is recognized herein from characteristics of its shape (e.g., shape of a pen, a pointer, a hand, a finger, or a nail).
Based on the information obtained in the pointing object detector 16, the operation mode analyzer (operation mode determinator) 17 determines an operation mode associated with the movement of the pointing object 6. Examples of the operation mode include tapping (patting with a finger), flicking (lightly sweeping with a finger), pinch-in/pinch-out (moving two fingers toward or away from each other), and other gestures. For example, a user can tap to select (equivalent to clicking or double-clicking of a mouse), flick to scroll the screen or turn pages, and pinch-in/pinch-out to zoom-out/zoom-in on the screen.
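The operation modes above could be distinguished, for example, from the start and end points of each finger's track. The following is only an illustrative sketch; the threshold value and function name are assumptions, not part of the embodiment:

```python
import math

def classify_gesture(tracks, tap_threshold=10):
    """Classify a gesture from finger tracks.

    tracks is a list with one ((x0, y0), (x1, y1)) start/end pair per
    finger. Two fingers moving toward each other is a pinch-in, apart a
    pinch-out; a single finger is a tap if it barely moves, else a flick.
    """
    if len(tracks) == 2:
        (a0, a1), (b0, b1) = tracks
        start_gap = math.dist(a0, b0)   # finger separation at start
        end_gap = math.dist(a1, b1)     # finger separation at end
        return "pinch-in" if end_gap < start_gap else "pinch-out"
    (p0, p1), = tracks
    return "tap" if math.dist(p0, p1) < tap_threshold else "flick"
```

A real analyzer would also use timing (a flick is quick) and velocity, but the start/end geometry alone already separates the four modes named above.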
Based on the information obtained in the image analyzer 15 and the operation mode analyzer 17, the coordinate calculator (first operation position obtainer) 18 obtains a relative position of the pointing object 6 on the capture area 7. A coordinate of a pointed position indicated by the pointing object 6 (position of a fingertip in the case where a hand is identified as the pointing object 6) is calculated herein.
The display 8 is controlled by a display controller 20, to which the captured image information captured by the camera 11 is input through the input section 12. The captured image is then displayed on the display 8.
With the pointing object 6, such as a finger, appearing in the area captured by the camera 11, the pointing object detector 16 identifies the pointing object (ST104). Then, the operation mode analyzer 17 determines an operation mode, such as tapping or flicking, and the coordinate calculator 18 obtains an operation position, specifically a relative position of the pointing object 6 on the capture area 7 (ST105). The communicator 19 transmits, to the information processing apparatus 1, information pertaining to the captured projected screen, the operation mode, and the operation position obtained in the steps above (ST106).
As shown in
The display controller 25 controls display operation of the projector 2 and outputs screen information being displayed by the projector 2 to the image coordinate analyzer 22.
Based on the captured image information received in the communicator 21 and the displayed screen information output from the display controller 25, the image coordinate analyzer (captured position obtainer) 22 obtains an absolute position of the capture area 7 relative to the entire projected screen 4 of the projector 2. In this process, the capture area 7 relative to the entire projected screen 4 is obtained through matching and detailed coordinates are calculated for the identified capture area 7.
Based on the information of the pointing object 6 received in the communicator 21, specifically the information of the relative position of the pointing object 6 on the capture area 7, the operation coordinate analyzer (second operation position obtainer) 23 obtains an absolute position of the pointing object 6 relative to the entire projected screen 4 of the projector 2. Based on the information of the position of the pointing object 6, an operation target (selection menu or icon) on the projected screen 4 is identified. In addition, based on the information of the operation mode, such as tapping or flicking, received in the communicator 21, information on operation details (operation information) is output, the information on operation details indicating what kind of operation was performed on the projected screen 4 by the pointing object.
Based on the information on operation details (operation information) obtained in the operation coordinate analyzer 23, the operation processor 24 executes processing associated with the operation details.
A variety of necessary processes are divided and assigned to the mobile information apparatus 5 and the information processing apparatus 1. It is also possible to perform the processes in either the mobile information apparatus 5 or the information processing apparatus 1. For example, the operation mode analyzer 17 of the mobile information apparatus 5 determines the operation mode, such as tapping or flicking. Instead, the information processing apparatus 1 may determine the operation mode.
In order to reduce a communication load on the mobile information apparatus 5 and to reduce a calculation load on the information processing apparatus 1 in the case where a plurality of users perform screen operations using the mobile information apparatuses 5, it is desirable that the mobile information apparatus 5 be configured to perform as many necessary processes as possible within the processing capacity of the mobile information apparatus 5.
The touch screen display 32 of the mobile information apparatus 31 displays an image captured by the camera and detects a touch operation by a pointing object 6, such as a fingertip, on the screen. While capturing the projected screen 4 of the projector 2 with the camera, the user moves the pointing object 6 on the touch screen display 32 on which the captured image is displayed and thereby operates the projected screen 4 of the projector 2.
The mobile information apparatus 31 captures a capture area 7 provided in a field angle of the camera within the projected screen 4 of the projector 2. Based on captured image information obtained therefrom and operation position information obtained from the touch operation of the pointing object 6 on the touch screen display 32 on which the captured area 7 is displayed, operation information is obtained pertaining to the user's operation performed with the pointing object 6 relative to the projected screen 4 of the projector 2.
In the mobile information apparatus 31, the touch screen display 32 is controlled by a display controller 33, to which the captured image information captured by the camera 11 is input through an input section 12. The captured image is then displayed on the touch screen display 32. Furthermore, the display controller 33 detects a touch operation performed by the pointing object 6, such as a fingertip, on the touch screen display 32 and outputs information of a touch position.
The touch position information is input to an operation mode analyzer 17, which determines an operation mode, such as tapping or flicking, associated with the movement of the pointing object 6 based on the touch position information. Furthermore, the touch position information is input to the coordinate calculator 18 through the operation mode analyzer 17. The coordinate calculator 18 obtains a relative position of the pointing object 6 on the capture area 7 based on the touch position information.
The information processing apparatus 1 is the same as that in the first embodiment and performs the same processing.
As described above, in the screen operation systems according to the first and second embodiments of the present invention, a user uses the mobile information apparatuses 5 and 31, respectively, that he carries to capture the projected screen 4 of the projector 2 and to operate the screen. Thus, the user can operate the screen as long as he is in a place where he can see the projected screen 4 of the projector 2. Accordingly, the user can operate the screen while seated in his own chair, without moving in front of the screen 3.
In addition, screen operation is not restricted by physical conditions. For example, even in the case where the pointing object cannot reach an operation target, such as an icon, on the screen because of the very large size of the screen 3, the systems allow the user to operate the screen, thus providing a high level of convenience. In the case where a plurality of users operate the screen, they use the mobile information apparatuses 5 and 31 that they carry for screen operation, eliminating the inconvenience of taking turns to operate an input device of the information processing apparatus 1 and allowing simple screen operation. Furthermore, the mobile information apparatuses 5 and 31 may be widely used mobile telephone terminals each equipped with a camera. It is thus unnecessary to prepare exclusive devices for a number of users to operate the screen, reducing the installation cost.
In particular, a relative position of the pointing object 6 on the capture area 7 in the projected screen 4 of the projector 2 is obtained and the position of the capture area 7 relative to the entire projected screen 4 of the projector 2 is obtained. Then, an absolute position of the pointing object 6 is obtained relative to the entire projected screen 4 of the projector 2. Thus, only a portion of the projected screen 4 of the projector 2 needs to be captured by the mobile information apparatuses 5 and 31 for screen operation, thus facilitating screen operation.
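The two-step composition described above can be sketched as a simple coordinate transform. Here the capture area is assumed to be given as a (left, top, width, height) rectangle in full-screen pixels and the pointing object's relative position as normalized (u, v) coordinates within the capture area; these conventions are illustrative, not specified by the embodiments:

```python
def to_absolute(capture_area, rel_pos):
    """Compose the capture area's absolute placement on the projected
    screen with the pointing object's relative position inside it.

    capture_area: (left, top, width, height) in full-screen pixels
    rel_pos:      (u, v) with u, v in [0, 1] relative to the capture area
    Returns the absolute (x, y) of the pointing object on the full screen.
    """
    left, top, width, height = capture_area
    u, v = rel_pos
    return (left + u * width, top + v * height)
```

This is why only a portion of the projected screen needs to be in the camera's field of view: the relative coordinate from the mobile apparatus and the capture area's absolute placement from the matching step together determine the absolute operation position.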
The operation mode is determined which is associated with the movement of the pointing object 6, such as tapping, flicking, or pinch-in/pinch-out. Assigning processing to each operation mode, such as selection, scrolling of the screen, page turning, and zooming in or out of the screen, allows a variety of instructions with the movement of the pointing object 6, thus facilitating screen operation.
A projector is used as the image display apparatus in the first and second embodiments. The image display apparatus of the present invention, however, is not limited to a projector, and may be an image display apparatus that uses a plasma display panel or an LCD panel.
In the case where a user captures the projected screen 4 on the screen 3 from an angle using the camera 11 of the mobile information apparatus 5, the capture area 7 having a rectangular shape on the screen 3 is displayed in a distorted quadrangular shape, as shown in
As shown in
If the corrected captured image is displayed as-is on the mobile information apparatus 5, it appears small on the screen of the display 8, as shown in
Since the calculation load of the image correction is large, the image correction is performed in the information processing apparatus 1. The image correction, however, may be performed in a mobile information apparatus 5 having high processing performance.
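The embodiments do not detail the correction math. A full perspective rectification would fit a projective homography to the four corners of the distorted quadrilateral; as a compact approximate sketch, a bilinear warp from the output rectangle back onto the quadrilateral can illustrate the idea (corner ordering and function name are assumptions):

```python
def rectify(src, quad, out_w, out_h):
    """Approximately rectify a distorted quadrilateral region of src into
    an out_w x out_h rectangle using an inverse bilinear warp.

    src:  2-D list of pixel values (the captured image)
    quad: corner points [(x, y) top-left, top-right,
                         bottom-right, bottom-left] of the capture area
    """
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = quad
    out = []
    for j in range(out_h):
        v = j / (out_h - 1) if out_h > 1 else 0.0
        row = []
        for i in range(out_w):
            u = i / (out_w - 1) if out_w > 1 else 0.0
            # Bilinear blend of the four corners gives the source point
            # corresponding to output position (u, v).
            x = ((1 - u) * (1 - v) * x0 + u * (1 - v) * x1
                 + u * v * x2 + (1 - u) * v * x3)
            y = ((1 - u) * (1 - v) * y0 + u * (1 - v) * y1
                 + u * v * y2 + (1 - u) * v * y3)
            row.append(src[round(y)][round(x)])  # nearest-neighbor sample
        out.append(row)
    return out
```

A true projective transform additionally models foreshortening within the quadrilateral, which is why the heavier homography computation is better placed in the information processing apparatus 1, as noted above.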
A drive bay or a housing space in which a peripheral, such as an optical disk apparatus, is replaceably housed is provided on a rear side of a keyboard 46 of a case 45 of the portable information processing apparatus 42. A case 47 of the image display apparatus 41 is attached to the drive bay such that the optical engine unit 43 and the control unit 44 are retractably provided in the case 47. For use, in a state where the optical engine unit 43 and the control unit 44 are pulled out, the optical engine unit 43 is rotated to adjust a projection angle of laser light from the optical engine unit 43 for appropriate display of the projected screen 4 on the screen 3.
The image display apparatus 41, which is installed in the portable information processing apparatus 42, can be readily used in a conference with a relatively small number of people. Furthermore, the projected screen 4 can be displayed substantially larger than a display 48 of the portable information processing apparatus 42, thus allowing a user to view the projected screen 4 while being seated in his own seat. In the case where the image display apparatus 41 is used in combination with the above-described screen operation system of the present invention, users do not have to take turns to operate the portable information processing apparatus 42. They can instead use the mobile information apparatuses 5 and 31 that they carry at their seats to operate the screen of the image display apparatus 41, thus providing a high level of convenience.
In the screen operation system, the information processing apparatus 51 at Point A is connected with a relay apparatus 54 at Point B via a network. In this regard, any conventional wired or wireless network can be utilized. Display signals are transmitted from the information processing apparatus 51 to the relay apparatus 54, which controls the image display apparatus 53 to display the screen. The mobile information apparatus 5 is the mobile information apparatus shown in the first embodiment and thus the screen can be operated in the same manner as in the first embodiment.
The information processing apparatus 51 may have the same configuration as the information processing apparatus 1 shown in the first embodiment. Communication with the mobile information apparatus 5 is performed via the network and the relay apparatus 54. The relay apparatus 54 and the mobile information apparatus 5 can communicate with each other via a wireless communication medium, such as a wireless LAN.
The mobile information apparatus 5 shown in the first embodiment is used in this example. However, the mobile information apparatus 31 shown in the second embodiment may also be applied to the screen operation system.
The screen operation system of the present invention allows easy screen operation. It is useful as a screen operation system in which a user operates a screen displayed on an image display apparatus by an information processing apparatus.
It is noted that the foregoing examples have been provided merely for the purpose of explanation and are in no way to be construed as limiting of the present invention. While the present invention has been described with reference to exemplary embodiments, it is understood that the words which have been used herein are words of description and illustration, rather than words of limitation. Changes may be made, within the purview of the appended claims, as presently stated and as amended, without departing from the scope and spirit of the present invention in its aspects. Although the present invention has been described herein with reference to particular structures, materials and embodiments, the present invention is not intended to be limited to the particulars disclosed herein; rather, the present invention extends to all functionally equivalent structures, methods and uses, such as are within the scope of the appended claims.
The present invention is not limited to the above described embodiments, and various variations and modifications may be possible without departing from the scope of the present invention.
Number | Date | Country | Kind
---|---|---|---
2010-257779 | Nov 2010 | JP | national