1. Field of the Invention
The present invention relates to a coordinate detection system, an information processing apparatus, a method of detecting a coordinate, and a program.
2. Description of the Related Art
An exemplary coordinate detection system which detects a coordinate pointed to by a pointer such as an electronic pen and displays a handwritten character or the like is a coordinate detection system of an optical type (for example, Patent Documents 1 and 2).
Patent Document 1: Japanese Laid-Open Patent Publication No. 2005-173684
Patent Document 2: Japanese Patent No. 5122948
It is a general object of at least one embodiment of the present invention to provide a coordinate detection system that substantially obviates one or more problems caused by the limitations and disadvantages of the related art.
One aspect of the embodiments of the present invention may be to provide a coordinate detection system for detecting a coordinate pointed to by a pointer with which a pointing operation is performed on a board face, the system including a first calculation unit that extracts a plurality of areas, respectively indicative of a plurality of annular members provided in the pointer, from a picked-up image picked up by an image pick-up unit arranged at a predetermined position on the board face and calculates center positions of the extracted areas; and a second calculation unit that calculates a position of a tip portion of the pointer on the board face based on the center positions of the areas calculated by the first calculation unit and a position of a principal point of the image pick-up unit.
Additional objects and advantages of the embodiments will be set forth in part in the description which follows, and in part will be clear from the description, or may be learned by practice of the invention. Objects and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the appended claims.
In a case of the coordinate detection system of the optical type, light from the pointer is picked up by multiple image pick-up units arranged at different positions, and the coordinate of the tip portion of the pointer is calculated using a triangulation method.
Therefore, it is desirable that the light emission portion (or reflective portion) of the pointer be formed in an annular shape in the peripheral direction of the pointer so that the light from the pointer can be reliably picked up by the multiple image pick-up units.
However, in a case where the light emission portion (or the reflective portion) of the pointer is formed in the annular shape in its peripheral direction, the light from the pointer is drawn as a light emission area occupying a certain extent on the picked-up image picked up by the image pick-up unit.
In order to accurately calculate the coordinate of the tip portion of the pointer, it is necessary to accurately extract the pixel corresponding to the center axis of the pointer from the light emission area. However, the center position of the light emission area does not necessarily coincide with the center axis of the pointer. Therefore, an error is unavoidable in a case where the coordinate of the tip portion of the pointer is calculated based on the center position of the light emission area.
In view of the above problems, an object of the present invention is to accurately calculate the coordinate of the tip portion of the pointer.
A description is given below, with reference to the accompanying drawings, of embodiments of the present invention.
Reference symbols typically designate as follows:
100: coordinate detection system;
101: coordinate input apparatus;
102: computer (information processing apparatus);
103a-103d: image pick-up unit;
104a-104d: peripheral light emission unit;
110: pointer;
120: terminal apparatus;
310: grip portion;
311: light emission circuit;
320: tip portion;
321, 322: light emission portion;
323: vertex;
331, 332: center point;
501a: CMOS image sensor;
530: plane;
540a, 540b: intersecting line;
701a: lens;
1200: picked-up image; and
1210, 1220: light emission area.
First, described is the system structure of the coordinate detection system of an embodiment.
Referring to
The coordinate input apparatus 101 displays an image generated by the terminal apparatus 120 and also displays content handwritten using the pointer 110 by a pointing operation on an input face, which is the board face of the coordinate input apparatus 101.
The computer (the information processing apparatus) 102 controls the coordinate input apparatus 101 so that an image sent by the terminal apparatus 120 is displayed on the coordinate input apparatus 101. Referring to
Further, the computer 102 analyzes, in real time, a pointing input (a contact position between the input face and the tip portion of the pointer 110) on the input face of the coordinate input apparatus 101 based on the picked-up images picked up by the image pick-up units 103a to 103d. Further, the computer 102 forms a line by connecting the time-series coordinates and controls the coordinate input apparatus 101 so that the handwritten content is displayed on the coordinate input apparatus 101.
Referring to
As such, even though the coordinate input apparatus 101 does not have a touch panel function, the user can perform various inputs by causing the tip portion of the pointer 110 to touch the coordinate input apparatus 101.
The image pick-up units 103a to 103d are provided so as to shoot the entire input face of the coordinate input apparatus 101 and are arranged at predetermined positions (both end positions in the embodiment) on the input face of the coordinate input apparatus 101. Within the embodiment, the image pick-up units 103a and 103b shoot an upper half of the input face of the coordinate input apparatus 101, and the image pick-up units 103c and 103d shoot a lower half of the input face of the coordinate input apparatus 101. The coordinate of the contact position of the tip portion of the pointer 110 is calculated based on the picked-up images obtained by shooting the pointer 110 using the image pick-up units.
The peripheral light emission units 104a to 104d are arranged in the periphery of the coordinate input apparatus 101 and irradiate the input face of the coordinate input apparatus 101. The peripheral light emission units 104a to 104d may be detachably attached to the coordinate input apparatus 101.
Next, a hardware structure of the coordinate detection system 100 is described.
Referring to
The CPU 201 runs the coordinate detection program 220 and controls the entire operation of the coordinate detection system 100. The ROM 202 stores an initial program loader (IPL) and the like, mainly programs run by the CPU 201 at a time of starting up the computer 102. The RAM 203 functions as a work area used by the CPU 201 when executing, for example, the coordinate detection program 220.
The SSD 204 is a non-volatile memory in which the coordinate detection program 220 and various data are stored. The network controller 205 performs processes based on a communication protocol when the computer 102 communicates with a server through a network (not illustrated). This network includes a local area network (LAN), a wide area network (WAN) formed by connecting multiple LANs such as the Internet, or the like.
The external memory controller 206 reads out data from the detachable external memory 230. The external memory 230 includes a universal serial bus (USB) memory, an SD card, and so on.
The four image pick-up units 103a to 103d are connected to the sensor controller 207, which controls shooting by these four image pick-up units 103a to 103d.
The GPU 208 is a processor exclusively used for drawing, which calculates the pixel values of the pixels of an image displayed on the coordinate input apparatus 101. The coordinate input apparatus controller 211 outputs an image formed by the GPU 208 to the coordinate input apparatus 101.
The capture device 209 takes in (captures) an image displayed on a display unit 121 by the terminal apparatus 120.
In a case of the coordinate detection system 100 of the embodiment, the computer 102 need not communicate with the pointer 110, but may have a communication function for communicating with the pointer 110. In this case, the computer 102 includes a pointer controller 210 for communicating with the pointer 110. With this, the computer 102 can receive a control signal from the pointer 110.
The coordinate detection program 220 may be distributed while stored in the external memory 230 or may be downloaded from a server (not illustrated) through the network controller 205. At this time, the coordinate detection program 220 may be in a compressed state or in an executable format.
Next, the structure of the pointer 110 is described.
The grip portion 310 has a cylindrical shape so as to be easily gripped by the user. A light emission circuit 311 is provided inside the grip portion 310. However, the cylindrical shape is an example, and another shape may be used. The tip portion 320 has a conic shape and is provided with annular light emission portions 321 and 322 (the conic shape is an example and may be another shape). The light emission portions 321 and 322 are controlled to be turned ON/OFF by the light emission circuit 311 and emit light having a predetermined light quantity.
Hereinafter, the part of the tip portion 320 directly contacting the input face of the coordinate input apparatus 101 is referred to as a “vertex”. Within the embodiment, the vertex 323 of the tip portion 320 is positioned on a straight line connecting a center point 331 of a circle of a cross-section (see the right side of
Said differently, the light emission portions 321 and 322 are arranged such that their cross-sections are substantially orthogonal to the center axis and the centers of these cross-sections substantially coincide with the center axis.
Next described is a procedure of calculating the two-dimensional coordinate of the vertex 323 of the pointer 110 on the input face of the coordinate input apparatus 101 (that is, the two-dimensional coordinate of the contact position between the input face and the vertex 323 of the pointer 110).
In a case where the pointer 110 (see
Specifically, a picked-up image is obtained by shooting with the image pick-up units 103a to 103d (step 1).
Subsequently, the two-dimensional coordinates of the center points of the light emission portions 321 and 322 on the picked-up image are calculated based on the obtained picked-up image (step 2).
Subsequently, elements on the CMOS image sensor corresponding to the two-dimensional coordinates of the center points 331 and 332 on the picked-up image are specified (step 3). Further, the three-dimensional coordinates of the specified elements in the three-dimensional coordinate space are calculated (step 3).
Referring to
Subsequently, a plane including the center points 331 and 332 of the light emission portions 321 and 322 and a principal point of the image pick-up unit 103a in the three-dimensional coordinate space is calculated (step 4). Because the center points 331 and 332 of the light emission portions 321 and 322 specify the center axis of the pointer 110, the plane calculated in step 4 includes the center axis of the pointer 110 and the principal point of the image pick-up unit 103a (hereinafter, this plane is referred to as a “center axis plane”). Here, the principal point is the intersection point between the optical axis and the principal surface of a single thin lens that replaces the optical system of the image pick-up unit 103a. (In a case where the optical system of the image pick-up unit 103a is formed of a single lens, the center of the single lens may be used as the principal point.)
As illustrated in
Subsequently, the intersecting line between the center axis plane calculated in step 4 and the input face of the coordinate input apparatus 101 is calculated (step 5).
Referring to
Subsequently, angles (turn angles) formed between the intersecting lines 540a and 540b calculated in step 5 and a reference direction on the input face of the coordinate input apparatus 101 are calculated, and the two-dimensional coordinate of the vertex 323 of the pointer 110 on the input face is calculated using the calculated turn angles (step 6).
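Although the embodiment does not specify an implementation, steps 4 to 6 may be sketched in code as follows. The sketch assumes that the three-dimensional coordinates of the two elements (the points 501 and 502) and of the principal point (the point 510a) are already known and that the input face lies in the plane z = 0; the function name and the coordinate conventions are illustrative assumptions only.

```python
import numpy as np

def turn_angle(p1, p2, principal_point):
    """Sketch of steps 4 to 6: form the center axis plane through the two
    sensor elements and the principal point, intersect it with the input
    face (assumed to be the plane z = 0), and return the turn angle of the
    intersecting line from the X-axis (the reference direction)."""
    p1, p2, pp = (np.asarray(v, dtype=float) for v in (p1, p2, principal_point))
    n = np.cross(p2 - p1, pp - p1)               # step 4: normal of the center axis plane
    d = np.cross(n, np.array([0.0, 0.0, 1.0]))   # step 5: direction of the intersecting line
    return np.arctan2(d[1], d[0])                # step 6: turn angle (sign resolved by the known geometry)
```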
Referring to
Here, the upper left end of the coordinate input apparatus 101 is determined as the origin, the X-axis is determined in a lateral direction, and the Y-axis is determined in a lengthwise direction. Further, a turn angle of the point 600 from the X-axis (the reference direction) viewed from the image pick-up unit 103a is designated by α, and a turn angle of the point 600 from the X-axis (the reference direction) viewed from the image pick-up unit 103b is designated by β. Further, the width of the coordinate input apparatus 101 along the X-axis is designated by L.
On these premises, the Y coordinate of the point 600 is expressed using the X coordinate as follows:
Y=X tan α [Formula 1]
Y=(L−X)tan β [Formula 2]
Then, Y is eliminated from Formulas 1 and 2, and the result is rearranged for X.
X=L tan β/(tan α+tan β) [Formula 3]
Further, Formula 3 is substituted into Formula 1 as follows:
Y=L tan α×tan β/(tan α+tan β) [Formula 4]
Said differently, the X coordinate and the Y coordinate can be calculated by calculating the turn angles α and β of the intersecting lines 540a and 540b from the X-axis (the reference direction) based on the picked-up images shot by the image pick-up units 103a and 103b and substituting the turn angles into Formulas 3 and 4. The above-described process is based on the picked-up images shot by the image pick-up units 103a and 103b. A similar process is applicable to the picked-up images shot by the image pick-up units 103c and 103d, and therefore its explanation is omitted.
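Expressed in code, the triangulation of Formulas 3 and 4 may look like the following minimal sketch; it assumes the turn angles are given in radians, and the function name and parameters are illustrative.

```python
import math

def triangulate(alpha: float, beta: float, width: float) -> tuple[float, float]:
    """Two-dimensional coordinate of the vertex from the turn angles.

    alpha: turn angle seen from the image pick-up unit at the origin [rad].
    beta:  turn angle seen from the image pick-up unit at (width, 0) [rad].
    width: width L of the coordinate input apparatus along the X-axis.
    """
    denom = math.tan(alpha) + math.tan(beta)
    x = width * math.tan(beta) / denom                    # Formula 3
    y = width * math.tan(alpha) * math.tan(beta) / denom  # Formula 4
    return x, y
```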
As described above, in a case of the pointer 110 provided with the annular light emission portions 321 and 322, if the two-dimensional coordinates of the center points 331 and 332 of the light emission portions 321 and 322 on the picked-up images are calculated, the two-dimensional coordinates of the vertex 323 on the input face of the coordinate input apparatus 101 can be calculated.
Conversely, in a case where an error is included in the two-dimensional coordinates on the picked-up images corresponding to the center points 331 and 332 of the light emission portions 321 and 322, the coordinate of the vertex 323 on the input face of the coordinate input apparatus 101 also includes an error.
Here, the procedure explained using
Said differently, in a case of the image pick-up unit 103a including the lens distortion, a relationship between the center points 331 and 332 of the light emission portions 321 and 322 and the elements (the points 501 and 502) on the CMOS image sensor 500a is different from the relationship illustrated in
The inventor of the present invention has focused on this difference and has conceived a structure for eliminating the influence of the lens distortion. Hereinafter, the influence of the lens distortion in the image pick-up unit 103a and the structure for eliminating it are explained.
Before explaining the influence of the lens distortion, an internal structure of the image pick-up unit 103a is explained.
A light ray 714 having an incident angle θ enters the image pick-up unit 103a, passes through a principal point (a point 510a) and is received by an element 712 on the CMOS image sensor 500a. At this time, a distance between a center element 711 and the element 712 on the CMOS image sensor 500a becomes fθ where the focal length of the lens 701a is f.
The incident angle θ is measured from an optical axis 713, which passes through the principal point (the point 510a, being the center of the lens 701a) and the center element 711 of the CMOS image sensor 500a and is perpendicular to the CMOS image sensor 500a.
Next described is how the coordinate of the element corresponding to the center point on the CMOS image sensor differs depending on whether or not the lens distortion exists.
Referring to
Referring to
As described above, depending on whether or not there is the lens distortion, the coordinate of the element corresponding to the center point 331 on the CMOS image sensor 500a shifts by f tan θ−fθ.
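As a purely numeric illustration of this shift, the following sketch evaluates f tan θ−fθ for several incident angles; the normalized focal length f = 1.0 is an assumed value, not one taken from the embodiment.

```python
import math

def distortion_shift(f: float, theta: float) -> float:
    """Difference between the pinhole projection f*tan(theta) and the
    f-theta projection f*theta for an incident angle theta [rad]."""
    return f * math.tan(theta) - f * theta

for deg in (5, 15, 30):
    theta = math.radians(deg)
    print(f"{deg} deg: shift = {distortion_shift(1.0, theta):.4f}")
```

Because tan θ−θ grows approximately as θ³/3, the shift is negligible near the optical axis but becomes significant toward the edge of the field of view.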
Described next is a method of calculating the coordinate of the element 931 on the CMOS image sensor 500a while eliminating the influence of the lens distortion.
Because the center point 331 is inside the pointer 110, it cannot be detected directly on the CMOS image sensor 500a. Therefore, edge points of the light emission portion 321 are detected, and the coordinate of the element corresponding to the center point 331 is calculated using the coordinates of the elements corresponding to the edge points on the CMOS image sensor 500a.
Here, a distance between the element 1021R and the center element 711 is f(θ−Δθ). Further, a distance between the element 1021L and the center element 711 is f(θ+Δθ).
Here, Δθ is an angle formed between a straight line connecting the edge point 321R or 321L of the light emission portion 321 to the principal point (the point 510a) and a straight line connecting the center point 331 of the light emission portion 321 and the principal point (the point 510a).
As known from
Therefore, within this embodiment, the coordinate of the center position of the light emission area indicative of the light emission portion 321 drawn on the picked-up image is first calculated, and the calculated coordinate of the center position is then converted to the coordinate that would be obtained if no lens distortion were present. Specifically, a correction is performed by applying a lens distortion correction function to the calculated coordinate of the center position, and the element corresponding to the corrected coordinate of the center position is specified on the CMOS image sensor 500a.
With the above structure, it is possible to specify the elements corresponding to the center points 331 and 332 of the light emission portions 321 and 322 on the CMOS image sensor 500a while eliminating the influence of the lens distortion.
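One possible realization of such a lens distortion correction function, under the f-theta projection model described above, converts the radial distance r = fθ of a measured coordinate into the distance f tan(r/f) that a distortion-free lens would produce. The image center and the focal length in pixel units are illustrative parameters of this sketch, assumed to be known from calibration:

```python
import math

def correct_center(x: float, y: float, cx: float, cy: float, f: float):
    """Map a center-position pixel (x, y) on an f-theta image to the pixel
    a distortion-free (pinhole) lens would have produced.

    (cx, cy): pixel on the optical axis (image center).
    f: focal length in pixel units.
    """
    dx, dy = x - cx, y - cy
    r = math.hypot(dx, dy)           # radial distance from the image center, r = f * theta
    if r == 0.0:
        return x, y                  # on the optical axis: no distortion
    scale = f * math.tan(r / f) / r  # correction factor f*tan(theta) / (f*theta)
    return cx + dx * scale, cy + dy * scale
```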
Described next is a functional structure of the coordinate detection program 220. The coordinate detection program 220 has a functional structure illustrated in
Calculating the two-dimensional coordinate corresponding to the center point of the light emission portion on the picked-up image using the relationship illustrated in
Correcting the two-dimensional coordinate corresponding to the center point of the light emission portion on the picked-up image in consideration of the error caused by the lens distortion illustrated in
Calculating the two-dimensional coordinate of the vertex 323 of the pointer 110 on the input face in conformity with the procedure illustrated in
The process part 1100a includes a picked-up image capture part 1101a, a light emission area extraction part 1102a, center position calculation parts 1103a and 1104a, center position correction parts 1105a and 1106a, a plane calculation part 1107a, and a turn angle calculation part 1108a.
The picked-up image capture part 1101a acquires the picked-up images shot by the image pick-up unit 103a at predetermined time intervals. The light emission area extraction part 1102a extracts the light emission areas indicative of the light emission portions 321 and 322 drawn on the acquired picked-up image.
The center position calculation part 1103a calculates the two-dimensional coordinate corresponding to the center point 331 of the light emission portion 321 on the picked-up image 1200 based on the extracted light emission area. The center position calculation part 1104a calculates the two-dimensional coordinate corresponding to the center point 332 of the light emission portion 322 on the picked-up image 1200 based on the extracted light emission area.
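As noted later in connection with the embodiments, the center position calculation parts may calculate the barycentric positions of the light emission areas. A minimal sketch of such a barycenter (luminance-weighted centroid) calculation follows; the threshold value separating the light emission area from the background is an illustrative assumption:

```python
import numpy as np

def emission_center(image: np.ndarray, threshold: int = 200):
    """Luminance-weighted barycenter of a light emission area.

    image: grayscale picked-up image (2-D array); threshold is an
    illustrative value, not one taken from the embodiment."""
    mask = image >= threshold
    if not mask.any():
        return None                           # no light emission area found
    ys, xs = np.nonzero(mask)
    w = image[ys, xs].astype(float)           # luminance values used as weights
    return (float((xs * w).sum() / w.sum()),  # x of the center position
            float((ys * w).sum() / w.sum()))  # y of the center position
```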
The center position correction part 1105a corrects the coordinate of the pixel 1310 corresponding to the center point 331 on the picked-up image 1200 using the lens distortion correction function so as to calculate the coordinate corresponding to the center point 331 in a case where no lens distortion is assumed. The center position correction part 1106a corrects the coordinate of the pixel 1320 corresponding to the center point 332 on the picked-up image 1200 using the lens distortion correction function so as to calculate the coordinate corresponding to the center point 332 in a case where no lens distortion is assumed.
The plane calculation part 1107a specifies the coordinates of the elements (the points 501 and 502) on the CMOS image sensor 500a based on the coordinates of the pixels 1312 and 1322 corrected using the lens distortion correction function. The plane calculation part 1107a then calculates the center axis plane (the plane 530) including the coordinates of the elements (the points 501 and 502) of the CMOS image sensor 500a in the three-dimensional coordinate space and the coordinate of the principal point (the point 510a) of the image pick-up unit 103a in the three-dimensional coordinate space.
The turn angle calculation part 1108a calculates the intersecting line 540a between the calculated center axis plane and the input face of the coordinate input apparatus 101, and further calculates the turn angle α of the vertex 323 of the pointer 110 from the reference direction.
The tip coordinate calculation part 1109 calculates the coordinate of the point 600 indicative of the position of the vertex 323 on the input face based on the turn angle α calculated by the turn angle calculation part 1108a and the turn angle β calculated by the turn angle calculation part 1108b.
As described above, the coordinate detection system is structured as follows:
The two annular light emission portions are provided to the tip portion of the pointer and arranged in the longitudinal direction of the tip portion;
The coordinates of the center points of the two light emission portions on the corresponding picked-up images are calculated, and the two-dimensional coordinate of the vertex of the pointer on the input face of the coordinate input apparatus is calculated using the calculated coordinates of the two center points; and
The two-dimensional coordinates of the center points of the two light emission portions on the picked-up image are corrected using the lens distortion correction function in calculating the two-dimensional coordinate of the vertex of the pointer on the input face of the coordinate input apparatus.
With this, an error of the two-dimensional coordinate of the vertex of the pointer caused by a lens distortion can be eliminated.
As a result, it is possible to accurately calculate the coordinate of the tip portion of the pointer in the coordinate detection system.
Within the first embodiment, the intersecting lines between the center axis planes respectively calculated by the plane calculation parts 1107a and 1107b and the input face are calculated in calculating the two-dimensional coordinate of the vertex 323 of the pointer 110 on the input face.
However, the present invention is not limited to this, and an intersecting line between the center axis plane calculated by the plane calculation part 1107a and the center axis plane calculated by the plane calculation part 1107b may be calculated. The second embodiment is described below.
Differences of the functional structure of the coordinate detection program 220 illustrated in
The plane intersecting line calculation part 1401 calculates the intersecting line between the center axis plane calculated by the plane calculation part 1107a and the center axis plane calculated by the plane calculation part 1107b. The center axis planes respectively calculated by the plane calculation parts 1107a and 1107b are planes including both the center points 331 and 332 of the light emission portions 321 and 322. Therefore, the intersecting line between the center axis planes equals the center axis of the pointer 110.
The tip coordinate calculation part 1402 calculates the intersection point between the intersecting line calculated by the plane intersecting line calculation part 1401 and the input face of the coordinate input apparatus 101, and calculates the two-dimensional coordinate of the vertex 323 of the pointer 110 on the input face of the coordinate input apparatus 101.
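As a sketch of this second embodiment, the center axis and its intersection with the input face can be computed as follows, assuming each center axis plane is given by a normal vector and a point it contains, and assuming the input face lies in the plane z = 0; all names are illustrative:

```python
import numpy as np

def vertex_from_planes(n1, p1, n2, p2):
    """Intersect the two center axis planes to obtain the center axis of
    the pointer, then intersect that line with the input face (z = 0).

    n1, p1: normal of and a point on the first center axis plane.
    n2, p2: normal of and a point on the second center axis plane."""
    n1, p1, n2, p2 = (np.asarray(v, dtype=float) for v in (n1, p1, n2, p2))
    d = np.cross(n1, n2)                  # direction of the center axis
    # One point on the center axis: solve n1.x = n1.p1, n2.x = n2.p2, d.x = 0.
    A = np.vstack([n1, n2, d])
    b = np.array([n1 @ p1, n2 @ p2, 0.0])
    x0 = np.linalg.solve(A, b)
    t = -x0[2] / d[2]                     # parameter where the axis meets z = 0
    vertex = x0 + t * d
    return float(vertex[0]), float(vertex[1])
```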
As such, even in a case where the intersecting line between the center axis plane calculated by the plane calculation part 1107a and the center axis plane calculated by the plane calculation part 1107b is calculated, the two-dimensional coordinate of the vertex 323 of the pointer 110 on the input face can be accurately calculated.
Within the above embodiments, although the center position calculation parts 1103a and 1104a calculate the barycentric positions of the light emission areas 1210 and 1220 in order to calculate the coordinates corresponding to the center points 331 and 332 on the picked-up image, the present invention is not limited to this.
For example, the coordinates corresponding to the center points 331 and 332 on the picked-up image may be calculated based on the shapes of boundaries of the light emission areas 1210 and 1220.
Further, within the above embodiments, although the picked-up image capture parts 1101a and 1101b are structured to acquire all pixels included in the picked-up image, the present invention is not limited to this. For example, only pixels included in an area to a predetermined height from the input face of the coordinate input apparatus 101 may be acquired. Said differently, an area of interest (AOI) or a region of interest (ROI) is set, and only pixels included in the AOI or the ROI may be acquired.
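For illustration, such an AOI/ROI acquisition can be as simple as slicing the pixel rows near the input face; the assumption that the input face maps to the bottom rows of the picked-up image, and the row count, are illustrative:

```python
import numpy as np

def acquire_roi(image: np.ndarray, height: int = 16) -> np.ndarray:
    """Keep only the pixel rows within a predetermined height of the input
    face, assumed here to map to the bottom `height` rows of the image."""
    return image[-height:, :]
```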
In the above embodiments, although conditions for starting execution of the coordinate detection program 220 or 1420 are not referred to, the coordinate detection program 220 or 1420 may be started based on a predetermined instruction by the user, for example.
This predetermined instruction may be a detection of a predetermined action by the user. For example, a sensor which can detect a touch of the vertex 323 onto the input face of the coordinate input apparatus 101 may be provided in the pointer 110, and the coordinate detection program 220 or 1420 may be executed in a case where the touch is detected by the sensor.
Within the above embodiments, although the two-dimensional coordinate of the vertex 323 of the pointer 110 on the input face is calculated regardless of a slant of the pointer 110 relative to the input face, the present invention is not limited to this. For example, it may be structured such that the two-dimensional coordinate of the vertex 323 of the pointer 110 on the input face is not calculated in a case where the slant of the pointer 110 relative to the input face exceeds a predetermined threshold value when an instruction is input onto the input face by the pointer 110.
Within the embodiments, although the annular light emission portion is provided in the tip portion of the pointer, the present invention is not limited to this. Any annular member that can be identified in the picked-up image may be provided instead of the light emission portion. For example, a paint (e.g., a fluorescent paint) having a predetermined color may be coated on an annular member, or the annular member may be made of a predetermined material (a reflective material).
Within the above embodiments, although the light emission portion of the pointer emits a light having a predetermined light quantity, the present invention is not limited to this. For example, a modulation circuit may be provided inside the pointer 110 so as to emit a modulated light.
Within the above embodiments, the coordinate detection system 100 including the coordinate input apparatus 101, the computer (the information processing apparatus) 102, the image pick-up units 103a to 103d, and the peripheral light emission units 104a to 104d is formed as a single apparatus.
However, the present invention is not limited to this. For example, any one or some of the coordinate input apparatus 101, the computer (the information processing apparatus) 102, the image pick-up units 103a to 103d, and the peripheral light emission units 104a to 104d may be separate.
All examples and conditional language recited herein are intended for pedagogical purposes to aid the reader in understanding the principles of the invention and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority or inferiority of the invention. Although a coordinate detection system has been described in detail, it should be understood that various changes, substitutions, and alterations could be made thereto without departing from the spirit and scope of the invention.
This application is based upon and claims the benefit of priority of the prior Japanese Patent Application No. 2014-033680, filed on Feb. 25, 2014, the entire contents of which are incorporated herein by reference.