The present application claims priority to and incorporates by reference the entire contents of Japanese Patent Application No. 2014-137201 filed in Japan on Jul. 2, 2014.
1. Field of the Invention
The present invention relates to a projector device, an interactive system, and an interactive control method.
2. Description of the Related Art
In recent years, interactive systems have become known that detect an operation on a screen projected by a projector device (an image projecting device) and enable operation of a device such as the projector device or a personal computer device. In such an interactive system, when a user operates a projection surface with an electronic pen, a pen, his/her finger, or the like, a cursor, a pointer, or the like on the projection surface is moved and displayed in accordance with the operation. Furthermore, when the user touches a desired position on the projection surface, the device is controlled, for example, to display an operation menu or to perform an operation corresponding to the touched position, such as switching of a display image.
Japanese Patent Application Laid-open No. 2012-185798 discloses an information processing system capable of accurately determining the timing at which a pointer comes into contact with a projection surface. This information processing system includes a coordinates detecting device that detects coordinates on the projection surface, an information processing device that processes the detected coordinates, and a contact detecting device that detects contact of the pointer with the projection surface.
The coordinates detecting device gives notification of the approach of the pointer to the projection surface and detects the coordinates of the approaching position. When the contact detecting device has been notified, through the information processing device, of the approach of the pointer to the projection surface, the contact detecting device detects whether the pointer comes into contact with the projection surface within a given length of time. When it has detected contact of the pointer with the projection surface, the contact detecting device issues an event indicating the contact of the pointer. This makes it possible to accurately determine the timing of contact of the pointer with the projection surface.
However, conventional interactive systems, including the technique disclosed in Japanese Patent Application Laid-open No. 2012-185798, have difficulty identifying multiple pointers at once. Consequently, when a projection surface is operated with multiple pointers, accurate interactive operation is impeded.
In view of the above, there is a need to provide a projector device, an interactive system, and an interactive control method capable of more accurate interactive operation in accordance with operations of multiple pointers.
It is an object of the present invention to at least partially solve the problems in the conventional technology.
A projector device includes: a projecting unit that projects a projection image onto a projection surface; a receiving unit that receives information transmitted from a pointer operating the projection surface, the information including unique identification information of the pointer and first operation direction information indicating an operation direction of the pointer; a photographing unit that photographs the projection surface; a generating unit that generates second operation direction information indicating an operation direction of the pointer operating the projection surface from multiple photographed images taken by the photographing unit; an associating unit that associates the second operation direction information that matches the first operation direction information with the unique identification information; an operation-information generating unit that generates operation information including the associated unique identification information and second operation direction information; and a transmitting unit that transmits the generated operation information to a controlled device.
An interactive system includes: a projection-image generating unit that generates a projection image corresponding to operation information; a projecting unit that projects the generated projection image onto a projection surface; a receiving unit that receives information transmitted from a pointer operating the projection surface, the information including unique identification information of the pointer and first operation direction information indicating an operation direction of the pointer; a photographing unit that photographs the projection surface; a generating unit that generates second operation direction information indicating an operation direction of the pointer operating the projection surface from multiple photographed images taken by the photographing unit; an associating unit that associates the second operation direction information that matches the first operation direction information with the unique identification information; an operation-information generating unit that generates operation information including the associated unique identification information and second operation direction information; and a transmitting unit that transmits the generated operation information to the projection-image generating unit.
An interactive control method includes: generating, by a projection-image generating unit, a projection image corresponding to operation information; projecting, by a projecting unit, the generated projection image onto a projection surface; receiving, by a receiving unit, information transmitted from a pointer operating the projection surface of the projection image, the information including unique identification information of the pointer and first operation direction information indicating an operation direction of the pointer; photographing, by a photographing unit, the projection surface; generating, by a generating unit, second operation direction information indicating an operation direction of the pointer operating the projection surface from multiple photographed images taken; associating, by an associating unit, the second operation direction information that matches the first operation direction information with the unique identification information; generating, by an operation-information generating unit, operation information including the associated unique identification information and second operation direction information; and transmitting, by a transmitting unit, the generated operation information to the projection-image generating unit.
The above and other objects, features, advantages and technical and industrial significance of this invention will be better understood by reading the following detailed description of presently preferred embodiments of the invention, when considered in connection with the accompanying drawings.
Exemplary embodiments of an interactive system according to the present invention will be explained in detail below with reference to accompanying drawings.
Outline
First, as the interactive system, for example, a “light-reflection type” interactive system or a “light-block type” interactive system can be adopted. In the “light-reflection type” interactive system, an infrared device is installed above a projection surface onto which an image is projected by a projector device or the like, and the whole projection surface is photographed by a photographing unit (a camera unit). The infrared device projects infrared light onto the whole projection surface. When a user operates the projection surface with a pointer, such as a pen or his/her finger, the infrared light is reflected at the operated point. An operation detecting unit detects the position pointed to by the pointer, the form of the user operation, and the like by using an image, photographed by the photographing unit, of the infrared light reflected on the projection surface.
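For illustration only, the detection of operation positions from the photographed infrared image might proceed as in the following minimal Python sketch, which assumes the OpenCV library and a grayscale infrared camera frame; the function name, the threshold value, and the use of OpenCV are assumptions for illustration and are not part of the embodiments.

    import cv2
    import numpy as np

    def detect_pointer_positions(ir_frame: np.ndarray, threshold: int = 200):
        """Return the centroids of bright reflection blobs in a grayscale IR frame."""
        _, binary = cv2.threshold(ir_frame, threshold, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        positions = []
        for contour in contours:
            m = cv2.moments(contour)
            if m["m00"] > 0:  # skip degenerate blobs with zero area
                positions.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
        return positions

Each detected centroid corresponds to one candidate pointer position on the projection surface.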
On the other hand, in the “light-block type” interactive system, a retroreflective sheet is installed on the outer frame of the projection surface. Furthermore, illumination units that emit light across the whole projection surface are installed, for example, at the corners of the projection surface, and light receiving units receive the light projected across the projection surface. The position pointed to by a pointer (a blocker), such as a pen or the user's finger, which blocks the light when operated within the projection surface, the form of the user operation, and the like are then detected from the angles between the pointer and the light receiving units.
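For illustration, the pointed position in such a light-block system can be recovered by triangulation. The following sketch assumes two angle-measuring light receiving units at the top corners of the projection surface, separated by a known width, with each angle measured between the top edge and the sight line to the pointer; all names and conventions here are hypothetical.

    import math

    def triangulate(alpha: float, beta: float, width: float):
        """Intersect the sight lines from sensors at A(0, 0) and B(width, 0).

        alpha and beta are the angles (radians, 0 < angle < pi/2) between the
        top edge and the sight line from A and B, respectively; y grows
        downward into the projection surface.
        """
        ta, tb = math.tan(alpha), math.tan(beta)
        x = width * tb / (ta + tb)  # from y = x*tan(alpha) = (width - x)*tan(beta)
        y = x * ta
        return x, y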
Both the “light-reflection type” and “light-block type” interactive systems detect the timing at which a pointer, such as a pen, is irradiated with light as the timing at which the pointer has come into contact with the projection surface. However, it often happens that the timing at which a pointer is irradiated with light differs from the timing at which the pointer actually comes into contact with the projection surface. If a user does not keep the pointer well away from the projection surface, the pointer is continuously irradiated with light and continuously recognized; as a result, the interactive operation may become inaccurate.
To prevent this disadvantage, a switch unit is installed at the tip of a pen used as a pointer, and the switch is designed to conduct electric current when the tip of the pen comes into contact with the projection surface, so that the contact state of the pen tip with the projection surface is notified wirelessly. Accordingly, it is possible to detect an operation position and to treat the period of time in which the notification indicating contact between the projection surface and the pen tip is received as the period of time in which an operation is performed with the pen in contact with the projection surface, thereby making it possible to perform more accurate interactive control.
Incidentally, this configuration also makes it possible to distinguish a pen from a finger. That is, if an interactive system is notified of the contact state of a pointer with the projection surface when the position of the pointer is detected, the interactive system can identify the pointer as a pen; if not notified, it can identify the pointer as a finger (a pointer other than a pen).
Here, in the case where the projection surface is operated with multiple pointers, either type of interactive system has difficulty identifying the pointers, and therefore has difficulty performing accurate interactive control. For example, in the “light-reflection type” interactive system, when an operation is performed by bringing a first pen into contact with the projection surface, the operation position of the first pen is detected by the photographing unit, and the first pen gives notification of its own contact state. At this time, if the first pen also gives notification of its unique identification information, the system side can recognize that the point at the detected operation position has been operated with the first pen. Even in the case where the same projection surface is operated with multiple pens, if the pens are brought into contact with the projection surface in turn, the pens can be identified from the association between detected positions and notification information, making it possible to perform more accurate interactive control.
However, in the case where an operation is performed by bringing a first pen and a second pen into contact with the projection surface at the same time, the respective operation positions of the first and second pens are detected by the photographing unit, and both pens give notification of their own contact states. In this case, it is not possible to specify which pen each detected operation position belongs to, and this may constitute an obstacle to accurate interactive control.
In interactive systems according to the embodiments, a pen, which is an example of a pointer, notifies the system side of its unique identifier and of a direction (an angle) detected by the pen itself. The system side then performs interactive control by comparing the direction in the notification from the pen with a direction calculated from the position coordinates detected on the system side, and by associating the unique identifier of the pen with the operation position. Accordingly, even when multiple pens come into contact with the projection surface at about the same time, it is possible to identify the pens and perform more accurate interactive control.
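This association can be pictured with a short sketch. The following is a hypothetical illustration, not the claimed implementation: each direction reported by a pen is matched against the directions computed from camera-detected positions, and the pen identifier is bound to the position whose direction agrees within a tolerance.

    def angle_diff(a: float, b: float) -> float:
        """Smallest absolute difference between two angles in degrees."""
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)

    def associate_pens(reported: dict, tracks: dict, tolerance_deg: float = 15.0):
        """reported: pen_id -> direction the pen measured about itself.
        tracks: track_id -> direction computed from photographed coordinates.
        Returns pen_id -> track_id for pairs agreeing within the tolerance."""
        mapping, used = {}, set()
        for pen_id, pen_angle in reported.items():
            candidates = [t for t in tracks if t not in used]
            if not candidates:
                break
            best = min(candidates, key=lambda t: angle_diff(pen_angle, tracks[t]))
            if angle_diff(pen_angle, tracks[best]) <= tolerance_deg:
                mapping[pen_id] = best
                used.add(best)
        return mapping

For example, associate_pens({"pen-A": 90.0, "pen-B": 10.0}, {1: 12.0, 2: 88.0}) maps "pen-A" to track 2 and "pen-B" to track 1, identifying both pens even though they touched the projection surface at the same time.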
The laser module 6 projects infrared light parallel to a projection surface 4 at a small distance from the projection surface 4. The photographing unit (a camera unit) 5 photographs the infrared light projected onto the projection surface 4. Therefore, the infrared light reflected by the pen 2 (or a finger), which is a pointer, is photographed by the photographing unit 5 (a light-reflection system). When the switch installed at the tip of the pen 2 is pushed, the pen 2 wirelessly transmits unique identification information (a pen identifier) and angle information to the projector device 1. The projector device 1 detects the operation position of the pen 2 on the projection surface 4 by using the photographed image taken by the photographing unit 5 and the pen identifier and angle information received from the pen 2, converts the detected operation position into operation information, and transmits the operation information to the personal computer device 3. Incidentally, the interactive system according to the first embodiment is an application example of a light-reflection type interactive system; however, other types of interactive systems, such as a light-block type, may be employed as well.
Hardware Configuration of Projector Device
The CPU 11 executes an “interactive program for the projector device 1” for controlling the operation of the projector device 1. The RAM 12 constitutes a work area of the CPU 11 and the like. The ROM 13 stores therein the interactive program executed by the CPU 11 and data required for the execution of the interactive program. The USB I/F port 14 is a connection port of a USB cable. The optical unit 15 is a part that produces an image, and is a DMD™ or a color wheel if the projector device 1 is a DLP™ projector. DLP is an abbreviation of “Digital Light Processing”. DMD is an abbreviation of “Digital Micromirror Device”.
The power supply unit 16 supplies electric power to the projector device 1. The power-supply control unit 17 controls the power supply from the power supply unit 16. The fan unit 18 cools the main body of the projector device 1. The lamp unit 19 generates light as a light source. The lamp control unit 20 is a ballast for controlling the lamp unit 19. The image-signal receiving unit 22 is an image signal port for each type of image input, such as D-subminiature (D-sub), HDMI™, or video. The image-signal processing unit 21 receives input of an image signal from an image signal port, and processes the image signal. The operation receiving unit 23 is, for example, an operation key, and receives a user operation. The wireless unit 24 performs, for example, infrared wireless communication or wireless communication that meets the Bluetooth™ communication standard, thereby receiving information from the pen 2. The photographing unit 5 is a camera device, and photographs the projection surface 4 as described above.
Hardware Configuration of Pen
Functional Blocks of Projector Device and Pen
Incidentally, in this example, it will be assumed that the units 41 to 50 of the projector device 1 and the units 51 to 53 of the pen 2 are realized by software; however, all or some of the functions may be realized by hardware.
Furthermore, the respective interactive programs for the projector device 1 and the pen 2 may each be provided by being recorded, in an installable or executable file format, on a computer-readable recording medium such as a CD-ROM or a flexible disk (FD). Moreover, each interactive program may be provided by being recorded on a computer-readable recording medium such as a CD-R, a DVD, a Blu-ray Disc™, or a semiconductor memory. DVD is an abbreviation of “Digital Versatile Disk”. Furthermore, each interactive program may be provided by being installed via a network such as the Internet. Moreover, the interactive programs may each be built into a ROM or the like in a device in advance.
The functional blocks of the projector device 1 and the pen 2 are explained below.
The image-input receiving unit 41 receives an image input via HDMI™, VGA, or a network. VGA is an abbreviation of “Video Graphics Array”. The image editing unit 42 performs a process of editing an image to be projected, such as changing the magnification. The projecting unit 43 projects an image signal to the outside. The position-coordinates calculating unit 44 calculates the position coordinates of a pen on a projection image from a photographed image of the projection surface 4. The pen-information communication unit 45 communicates with the pen 2, and transmits and receives information to/from the pen 2. The PC-operation-coordinates-information converting unit 46 converts the calculated position coordinates and a pen identifier into operation coordinates information for the personal computer device 3, and transmits the operation coordinates information to the personal computer device 3 via the operation-information communication unit 49.
The pen-direction calculating unit 47 calculates the operation direction (an angle) of the pen 2 from the continuity of detected position coordinates. The pen identifying unit 48 identifies a pen 2 by comparing the operation direction derived from the acceleration information received from the pen 2 with the operation direction of the pen 2 calculated by the projector device 1. The operation-information communication unit 49 communicates operation information with the personal computer device 3. The photographing control unit 50 controls photographing by the photographing unit 5, which photographs the projection surface 4.
The pen-contact detecting unit 51 of the pen 2 detects whether the tip of the pen 2 is in contact with the projection surface 4. The pen-direction detecting unit 52 detects an operation direction of the pen 2. The pen-information communication unit 53 transmits and receives information to/from the projector device 1 through communication with the projector device 1.
Inter-Device Operation
Subsequently, the operation among the projector device 1, the pen 2, and the personal computer device 3 is explained.
That is, when the pen 2 has come into contact with the projection surface 4, reflected light is detected in the projector device 1 in a photographed image from the photographing unit 5. At Step S1, the projector device 1 detects the operation position of the pen 2 by using the photographed image from the photographing unit 5. Furthermore, at Step S2, the projector device 1 calculates position coordinates corresponding to the operation position of the pen 2.
Next, at Step S3, the pen 2 detects whether the pen tip has come into contact with the projection surface 4, and generates contact detection information. Furthermore, at Step S4, the pen 2 detects the acceleration of the pen 2 as it is moved, and generates acceleration information. Then, at Step S5, the pen 2 transmits notification information, which includes the contact detection information, the acceleration information, and a pen identifier uniquely assigned to the pen 2, to the projector device 1. While the pen 2 is in contact with the projection surface 4, the projector device 1 and the pen 2 repeatedly perform the above-described processes at Steps S1 to S5 at fixed intervals.
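The notification information transmitted at Step S5 might be modeled as a small record such as the following sketch; the field names and the choice of a two-axis acceleration sample are assumptions, since the embodiments do not specify a wire format.

    from dataclasses import dataclass
    from typing import Tuple

    @dataclass
    class PenNotification:
        pen_id: str                  # unique identifier assigned to the pen 2
        in_contact: bool             # contact detection information from the tip switch
        accel: Tuple[float, float]   # acceleration information (in-plane components)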
Next, at Step S6, the projector device 1 calculates an operation direction of the pen 2 by using the position coordinates calculated at Step S2. For example, the inclination angle of the movement of the pen 2 when the projection surface 4 is viewed two-dimensionally can be used as the operation direction of the pen 2. Furthermore, at Step S6, the projector device 1 calculates an operation direction of the pen 2 by using the acceleration information in the notification from the pen 2 at Step S5. Then, at Step S6, the projector device 1 compares the two calculated operation directions of the pen 2.
If the operation direction calculated from the acceleration information in the notification from the pen 2 matches the operation direction calculated from the position coordinates of the pen 2, the two operation directions belong to the same pen 2. Therefore, even when the projection surface 4 is operated with multiple pens 2 at the same time, the position coordinates of each pen 2 detected at Step S2 can be associated with the notification information (contact detection information, acceleration information, and a pen identifier) in the notification from that pen 2.
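The two direction estimates compared at Step S6 might be computed as in the sketch below. Deriving a movement direction from a single acceleration sample is a deliberate simplification for illustration; an actual device would presumably integrate or filter the accelerometer output over time.

    import math

    def direction_from_track(points) -> float:
        """Direction (degrees) from the continuity of position coordinates:
        the angle of the displacement from the oldest to the newest sample."""
        (x0, y0), (x1, y1) = points[0], points[-1]
        return math.degrees(math.atan2(y1 - y0, x1 - x0)) % 360.0

    def direction_from_accel(ax: float, ay: float) -> float:
        """Direction (degrees) estimated from one in-plane acceleration sample."""
        return math.degrees(math.atan2(ay, ax)) % 360.0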
Next, at Step S7, the projector device 1 generates operation information for the personal computer device 3 from the detected position coordinates and the identifier of the pen 2. Then, at Step S8, the projector device 1 transmits the generated operation information to the personal computer device 3. At Step S9, the personal computer device 3 performs, for example, control of moving the display of a cursor, a scroll bar, a given object, or the like projected through the projector device 1 in accordance with the operation information received from the projector device 1. Furthermore, the personal computer device 3 performs, for example, control of switching the display screen, control of starting a specified application, or the like in accordance with the received operation information. Accordingly, even when the projection surface 4 is operated with multiple pens 2 at the same time, the operations of the pens 2 can be identified, making it possible to perform more accurate interactive control.
Subsequently, the flow of the interactive operation of the projector device 1 is shown in a flowchart.
If the pointer currently operating the projection surface 4 is the pen 2, the notification of notification information is made by wireless communication. That is, when contact of the pen tip with the projection surface 4 has been detected by the pen-contact detecting unit 51 of the pen 2, the pen 2 transmits notification information to the projector device 1, and the pen identifying unit 48 of the projector device 1 waits a given length of time for this notification information.
At Step S19, when the projector device 1 has not received notification information from the pen 2 within the waiting time, the pen identifying unit 48 determines that the pointer is not the pen 2 but a pointer, such as a finger, that does not have a function of transmitting notification information. Then, at Step S18, the PC-operation-coordinates-information converting unit 46 generates operation information for the personal computer device 3 from an identifier of the pointer being the finger and the calculated position coordinates of the finger, and transmits the generated operation information to the personal computer device 3, and the processing shown in the flowchart ends.
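The pen-or-finger decision around Steps S14, S15, and S19 amounts to waiting a bounded time for a wireless notification, as in the following hypothetical sketch; the queue-based interface and the timeout value are illustrative assumptions.

    import queue

    def classify_pointer(notifications: "queue.Queue", wait_s: float = 0.05):
        """Wait up to wait_s for a notification; classify the pointer."""
        try:
            note = notifications.get(timeout=wait_s)  # notification arrived in time
            return "pen", note                        # Step S15: a pen 2 is in use
        except queue.Empty:
            return "finger", None                     # Step S19: no notification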
On the other hand, when the projector device 1 has received notification information from the pen 2 while the pen identifying unit 48 was waiting for the given length of time, the processing proceeds to Step S15, where the pen identifying unit 48 determines that the pen 2 is currently used as a pointer. At Step S16, the pen-direction calculating unit 47 calculates a direction (for example, an angle) of the moving pen 2 from the position coordinates of multiple points, each calculated by using a past or present photographed image.
Next, at Step S17, the pen identifying unit 48 compares the operation direction of the pen 2 calculated from the photographed images with the operation direction of the pen 2 in the notification from the pen 2. When the two match, the pen identifying unit 48 associates the pen identifier of the pen 2 included in the notification information with the position coordinates calculated from the photographed image. At Step S18, the PC-operation-coordinates-information converting unit 46 generates operation information for the personal computer device 3 from the pen identifier associated with the position information calculated from the photographed image. Then, the operation-information communication unit 49 transmits the generated operation information to the personal computer device 3, and the processing shown in the flowchart ends.
As is obvious from the above explanation, in the interactive system according to the first embodiment, the photographing unit 5 photographs the projection surface 4 onto which an image from the personal computer device 3 is being projected through the projecting unit 43 of the projector device 1. Then, the position-coordinates calculating unit 44 analyzes the photographed image, thereby calculating position coordinates of a pointer operating the projection surface 4.
The pen-contact detecting unit 51 of the pen 2 detects whether the pen tip has come in contact with the projection surface 4, and generates contact detection information. Furthermore, the pen 2 generates acceleration information depending on the operation state. Then, the pen 2 transmits notification information, which includes the contact detection information, the acceleration information, and unique identification information uniquely assigned to the pen 2, to the projector device 1.
The pen identifying unit 48 of the projector device 1 compares the operation direction of the pen 2 calculated from the photographed image with the operation direction (the acceleration information) of the pen 2 in the notification from the pen 2. When the two match, the pen identifying unit 48 associates the unique identification information (a pen identifier) of the pen 2 included in the notification information with the position coordinates calculated from the photographed image.
The PC-operation-coordinates-information converting unit 46 generates operation information for the personal computer device 3 from the pen identifier associated with the position information calculated from the photographed image. The operation-information communication unit 49 transmits the generated operation information to the personal computer device 3. The personal computer device 3 performs, for example, control of moving the display of a cursor, a scroll bar, a given object, or the like projected through the projector device 1 in accordance with the received operation information. Furthermore, the personal computer device 3 performs, for example, control of switching the display screen, control of starting a specified application, or the like in accordance with the received operation information. Accordingly, even when the projection surface 4 is operated with multiple pens 2 at the same time, the operations of the pens 2 can be identified, making it possible to perform more accurate interactive control.
Subsequently, an interactive system according to a second embodiment is explained. When the projection surface 4 is operated with one pen 2, an operation direction calculated from a photographed image is the operation direction of the pen 2. Therefore, when the projection surface 4 is operated with one pen 2, there is no need for the projector device 1 to perform the process of associating an operation direction calculated from a photographed image with unique identification information of the pen 2.
From this point of view, in the second embodiment, the above-mentioned associating process is performed when the projection surface 4 is operated with multiple pens 2. Incidentally, the second embodiment described below differs from the above-described first embodiment in this point only. Therefore, only a different part from the first embodiment is explained below to omit redundant description.
A flowchart of the interactive operation according to the second embodiment adds a determination at Step S20 of whether the projection surface 4 is being operated with multiple pens 2. When the pen identifying unit 48 has determined that multiple pens 2 are operating the projection surface 4 (YES at Step S20), the associating process at Steps S16 and S17 explained in the first embodiment is performed.
On the other hand, when the pen identifying unit 48 has determined that a single pen 2 is operating the projection surface 4 (NO at Step S20), the processing proceeds to Step S18. That is, when the number of pens 2 operating the projection surface 4 is one, the operation direction calculated from a photographed image is necessarily the operation direction of that pen 2, so there is no need to perform the process of associating the operation direction with the unique identification information of the pen 2. The processing therefore skips the associating process at Steps S16 and S17 and proceeds to Step S18. Then, the position coordinates of the pen 2 calculated from the photographed image are transmitted as operation information to the personal computer device 3, and interactive control is performed in accordance with the operation of the one pen 2.
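The single-pen shortcut of Step S20 might look like the following sketch, in which notes holds notification records as in the earlier sketch and associate stands for the direction-matching routine of Steps S16 and S17; all names are hypothetical.

    def handle_detection(positions, notes, associate):
        """Second-embodiment shortcut: skip direction matching for a single pen."""
        if len(notes) == 1 and len(positions) == 1:    # NO at Step S20
            return {notes[0].pen_id: positions[0]}     # associate directly
        return associate(positions, notes)             # Steps S16-S17 otherwise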
In the interactive system according to the second embodiment, when the number of pens 2 operating the projection surface 4 is one, the process of associating an operation direction with a pen can be eliminated. Therefore, it is possible to reduce the computational load on the projector device 1 while achieving the same effects as the first embodiment described above.
Subsequently, an interactive system according to a third embodiment is explained. In the interactive system according to the third embodiment, the projector device 1 identifies each pointer operating the projection surface 4 and enables interactive control. That is, in the third embodiment, the projector device 1 distinguishes a pen 2 from a pointer, such as a finger, other than the pen 2, and enables interactive control. Incidentally, the third embodiment described below differs from the above-described embodiments in this point only. Therefore, only the parts different from the first and second embodiments are explained below to omit redundant description.
A flowchart of the interactive operation according to the third embodiment is explained below. At Step S14 in the flowchart, when notification information has been received from the pen 2, the pen identifying unit 48 determines, at Step S21, whether the number of pens 2 detected from the number of pieces of notification information in the notification is equal to the number of pointers detected in a photographed image.
When the pen identifying unit 48 has determined that the number of pens 2 detected from the number of pieces of notification information in the notification is equal to the number of pointers detected in a photographed image (YES at Step S21), the processing proceeds to Step S15, and at Step S15, interactive control is performed in accordance with the operation(s) of one or more pens 2 as explained in the first and second embodiments.
On the other hand, when the pen identifying unit 48 has determined that the number of pens 2 detected from the number of pieces of notification information in the notification is different from the number of pointers detected in a photographed image (NO at Step S21), the processing proceeds to Step S22. At Steps S22 and S23, the pen identifying unit 48 recognizes that the pointers detected in the photographed image include both pen(s) 2 having the function of transmitting notification information and pointer(s), such as a finger, not having that function.
At Step S24, the pen-direction calculating unit 47 calculates the operation direction of each pointer from the position coordinates of the pointer in multiple photographed images taken by the photographing unit 5. At Step S25, the pen identifying unit 48 detects, out of the operation directions calculated by the pen-direction calculating unit 47 (the calculated operation directions), a calculated operation direction that matches an operation direction included in the notification information received from the pen 2 (the operation direction in the notification). The pen identifying unit 48 then associates the unique identification information of the pen 2 included in the notification information with that calculated operation direction. Furthermore, at Step S26, the pen identifying unit 48 recognizes that, out of the calculated operation directions, a calculated operation direction matching no operation direction in the notifications from the pens 2 is the operation direction of a pointer, such as a finger, that does not have the function of transmitting notification information. In other words, the pen identifying unit 48 recognizes a calculated operation direction with which no unique identification information is associated as the operation direction of a pointer such as a finger.
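Steps S25 and S26 might be sketched as follows: each calculated direction that matches a notified direction is bound to the notifying pen's identifier, and any leftover direction is attributed to a finger. The helper function and the tolerance value are illustrative assumptions.

    def split_pens_and_fingers(computed: dict, reported: dict, tol: float = 15.0):
        """computed: track_id -> direction calculated from photographed images.
        reported: pen_id -> direction in the notification from the pen.
        Returns (pen_id -> track_id, list of finger track_ids)."""
        def angle_diff(a, b):  # smallest absolute angle difference in degrees
            d = abs(a - b) % 360.0
            return min(d, 360.0 - d)

        matched, used = {}, set()
        for pen_id, ang in reported.items():            # Step S25: bind identifiers
            for track_id, cang in computed.items():
                if track_id not in used and angle_diff(ang, cang) <= tol:
                    matched[pen_id] = track_id
                    used.add(track_id)
                    break
        fingers = [t for t in computed if t not in used]  # Step S26: leftovers
        return matched, fingers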
At Step S18, the calculated operation direction associated with the unique identification information of the pen 2 and the calculated operation direction recognized as that of a pointer, such as a finger, not having the function of transmitting notification information (that is, the calculated operation direction with which no unique identification information is associated) are converted into operation information for the personal computer device 3 by the PC-operation-coordinates-information converting unit 46. The operation information is then transmitted to the personal computer device 3 by the operation-information communication unit 49 and used for the interactive control.
Even when pens 2 having the function of transmitting notification information and pointers, such as a finger, not having that function are present at the same time, the interactive system according to the third embodiment can accurately distinguish and detect the respective operations of the pointers, and can achieve the same effects as the embodiments described above.
According to an embodiment, it is possible to perform more accurate interactive control in accordance with operations of multiple pointers.
Although the invention has been described with respect to specific embodiments for a complete and clear disclosure, the appended claims are not to be thus limited but are to be construed as embodying all modifications and alternative constructions that may occur to one skilled in the art that fairly fall within the basic teaching herein set forth.