This application is based upon and claims the benefit of priority of the prior Japanese Patent Application No. 2015-068318, filed on Mar. 30, 2015, the entire contents of which are incorporated herein by reference.
The embodiment disclosed herein is related to a calibration in a display system.
Calibration is performed in a case where a display system combining a display device and a plane scanning sensor is used. In the calibration, a parameter for performing coordinate transformation between a display coordinate system in the display device and a detection coordinate system in the plane scanning sensor is calculated for example on the basis of a matching sample associating the position of a given point in a display surface with a position detected as a result of touching the point.
However, the plane scanning sensor may be affected by noise and thus detect a position other than the position touched by a user. The display system may not operate correctly when calibration is performed on the basis of such an erroneous detection.
According to an aspect of the embodiment, a calibration method performed by a computer includes: making a display device display a figure indicating a touch route; receiving a plurality of detection results from a detection device, each of the plurality of detection results including one or more touch positions, at each detection timing, on a surface on which the figure is displayed; extracting one or more loci based on the plurality of detection results; and, when the extracted one or more loci include a plurality of loci, executing a calibration based on a corresponding locus coinciding with or approximate to the figure, the corresponding locus being included in the plurality of loci.
The object and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the claims.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention, as claimed.
In one aspect, an object of the present embodiment is to perform calibration more correctly for a display system that detects a touch position.
The detecting device 105 in the present example is a plane scanning sensor. The detecting device 105 is set on the left side of the display surface 201. However, the detecting device 105 may be set on the right side, lower side, or upper side of the display surface 201. As illustrated in
However, when the detection range is too wide, an object outside the display surface is detected, which causes noise. Accordingly, the occurrence of noise is suppressed by setting an effective detection range appropriately.
In the present embodiment, a figure that guides a slide operation by the touch of a finger is displayed on the display surface 201, and calibration and the setting of the effective detection range are performed on the basis of a locus touched by a finger of a user. Incidentally, in the following, this figure may be referred to simply as a figure.
The setting unit 301 sets the detection range. The identifying unit 303 identifies the figure to be displayed. The display processing unit 305 displays the figure on the display device 103. The detecting unit 307 detects the position of an object. The first extracting unit 309 extracts a locus from detection data. The selecting unit 311 selects a locus coinciding with or approximate to the figure. The second extracting unit 313 extracts a matching sample that associates a display position in the display coordinate system with a detected position in the detection coordinate system. The calculating unit 315 calculates a parameter used for coordinate transformation.
The figure data storage unit 316 stores figure data. The detection data storage unit 317 stores detection data. The locus data storage unit 319 stores locus data. The sample storage unit 321 stores sample data. The parameter storage unit 322 stores the parameter used for coordinate transformation. The range data storage unit 323 stores range data.
The setting unit 301, the identifying unit 303, the display processing unit 305, the detecting unit 307, the first extracting unit 309, the selecting unit 311, the second extracting unit 313, and the calculating unit 315 described above are implemented by using hardware resources (for example
The figure data storage unit 316, the detection data storage unit 317, the locus data storage unit 319, the sample storage unit 321, the parameter storage unit 322, and the range data storage unit 323 described above are implemented by using hardware resources (for example
Description will next be made of processing in the control device 101.
The description returns to
When the detecting unit 307 determines that the detection is not to be ended, the detecting unit 307 returns to S601 to repeat the above-described processing. When the detecting unit 307 determines that the detection is to be ended, on the other hand, the detecting unit 307 ends the detection processing to proceed to processing of S409 illustrated in
A locus 801 is a locus touched by a finger of a user along the
A locus 803 is a locus of a detected position resulting from a noise. The locus 803 is produced by a noise of a type that involves movement of a detected position. The locus 803 from a starting point (Ub21, Vb21) is drawn by an intermediate point (Ub22, Vb22), an intermediate point (Ub23, Vb23), and an endpoint (Ub24, Vb24) sequentially detected in time series. Incidentally, the starting point (Ub21, Vb21) of the locus 803 is detected simultaneously with an intermediate point (Ua21, Va21) of the locus 801. The intermediate point (Ub22, Vb22) of the locus 803 is detected simultaneously with an intermediate point (Ua22, Va22) of the locus 801. The intermediate point (Ub23, Vb23) of the locus 803 is detected simultaneously with an intermediate point (Ua23, Va23) of the locus 801. The endpoint (Ub24, Vb24) of the locus 803 is detected simultaneously with an intermediate point (Ua24, Va24) of the locus 801. Incidentally, at a point in time that an intermediate point (Ua20, Va20) of the locus 801 is detected, the noise causing the locus 803 is not produced. In addition, at a point in time that an intermediate point (Ua25, Va25) of the locus 801 is detected, the noise causing the locus 803 has disappeared.
A locus 805 is also a locus of a detected position resulting from a noise. The locus 805 is produced by a noise of a type that does not involve movement of a detected position. Hence, the locus 805 is drawn by a fixed point (Uc31, Vc31). Incidentally, at a point in time that an intermediate point (Ua30, Va30) of the locus 801 is detected, the noise causing the locus 805 is not produced. Suppose that the fixed point (Uc31, Vc31) of the locus 805 is thereafter detected together when intermediate points (Ua31, Va31) to (Ua34, Va34) of the locus 801 are detected. Suppose that the noise causing the locus 805 has disappeared at a point in time that an intermediate point (Ua35, Va35) of the locus 801 is detected.
The description returns to
The first extraction processing in S409 will be described in detail.
The first extracting unit 309 identifies one starting point among the extracted starting points (S903). The first extracting unit 309 provides a new locus table (S905). The first extracting unit 309 first adds a record of the position of the starting point to the locus table (S907). At this point in time, a processing object is the position of the starting point. The first extracting unit 309 determines whether or not a detection result at a time next to a time of detection of the processing object includes a position coinciding with or adjacent to the position of the processing object (S909). When determining that the detection result includes a position coinciding with or adjacent to the position of the processing object, the first extracting unit 309 adds the record of the coinciding or adjacent position to the locus table (S911). Then, the first extracting unit 309 sets the coinciding or adjacent position as a next processing object (S912), and returns to S909 to repeat the above-described processing. Thus, a connected series of positions is extracted. Incidentally, together with each position, the first extracting unit 309 sets a time in the new record.
Then, when determining in S909 that a detection result at a time next to a time of detection of the processing object does not include a position coinciding with or adjacent to the position of the processing object, the generation of the locus table for the starting point identified in S903 is ended.
The first extracting unit 309 determines whether or not there is an unprocessed starting point (S913). When there is an unprocessed starting point, the first extracting unit 309 returns to S903 to repeat the above-described processing. When there is no unprocessed starting point, the first extraction processing is ended. The processing then proceeds to processing of S411 illustrated in
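The first extraction processing of S903 to S913 can be sketched, for illustration only, as follows. The function names, the adjacency tolerance, and the frame-based data layout are assumptions for this sketch, not part of the specification: each detection result is modeled as a list of (u, v) positions per detection timing, a starting point is a position with no coinciding or adjacent position at the preceding timing, and a locus is extended by chasing coinciding or adjacent positions at successive timings.

```python
# Illustrative sketch of locus extraction (S903-S913); names and the
# adjacency tolerance are assumptions, not from the specification.

def adjacent(p, q, tol=2.0):
    """True when two detected positions coincide or are adjacent."""
    return abs(p[0] - q[0]) <= tol and abs(p[1] - q[1]) <= tol

def extract_loci(frames, tol=2.0):
    """frames: list of lists of (u, v) positions, one list per timing."""
    loci = []
    for t, frame in enumerate(frames):
        for pos in frame:
            # A starting point has no adjacent position at the previous timing.
            if t > 0 and any(adjacent(pos, q, tol) for q in frames[t - 1]):
                continue
            locus = [(t, pos)]          # new "locus table" with time and position
            cur = pos
            # Follow coinciding or adjacent positions at later timings (S909-S912).
            for nt in range(t + 1, len(frames)):
                nxt = next((q for q in frames[nt] if adjacent(cur, q, tol)), None)
                if nxt is None:
                    break               # generation of this locus table ends
                locus.append((nt, nxt))
                cur = nxt
            loci.append(locus)
    return loci
```

With this layout, a finger slide and a simultaneous stationary noise point yield two separate loci, matching the behavior described for the loci 801, 803, and 805.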
The description returns to
The selecting unit 311 determines whether or not there is an unprocessed locus table (S1305). When determining that there is an unprocessed locus table, the selecting unit 311 returns to the processing illustrated in S1301 to repeat the above-described processing. When determining that there is no unprocessed locus table, on the other hand, the selecting unit 311 selects a locus table of a locus having a high degree of similarity (S1307). In the present example, the degree of similarity corresponding to the locus of the locus table in
The description returns to
A feature point in the present example is a starting point, a corner, or an endpoint. In a first record in the present example, the starting point (X1, Y1) of the
The description returns to
The second extracting unit 313 identifies a next point in the locus (S1405). The second extracting unit 313 determines whether or not the identified point is a corner (S1407). The second extracting unit 313 for example determines that the point in question is a corner when an angle formed by an average of traveling directions at a few points preceding the point in question and an average of traveling directions at a few points succeeding the point in question is larger than a given value. The traveling directions correspond to directions of movement vectors. In the example of the locus 801 illustrated in
The second extracting unit 313 determines whether or not the point identified in S1405 is the endpoint of the locus (S1411). When determining that the identified point is not the endpoint of the locus, the second extracting unit 313 returns to the processing illustrated in S1405 to repeat the above-described processing.
When determining that the identified point is the endpoint of the locus, on the other hand, the second extracting unit 313 adds the matching sample associating the position of the endpoint of the locus with the position of the endpoint of the figure to the sample data (S1413). The second extracting unit 313 sets “endpoint” as a kind of feature point. Then, the second extraction processing is ended, and the processing proceeds to processing of S415 illustrated in
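The corner test of S1407 can be sketched as follows. The window size, the angle threshold, and the function names are assumptions; the specification only states that a corner is declared when the angle between the average traveling direction over a few preceding points and the average traveling direction over a few succeeding points exceeds a given value.

```python
# Illustrative corner test (S1407): compare average travel directions
# before and after a candidate point. Window size w and the 45-degree
# threshold are assumed values, not from the specification.
import math

def is_corner(locus, i, w=3, threshold_deg=45.0):
    """locus: list of (u, v) points; test whether point i is a corner."""
    if i < w or i + w >= len(locus):
        return False                      # not enough context on one side
    def avg_dir(a, b):
        # Direction of the summed movement vectors over points a..b.
        du = sum(locus[k + 1][0] - locus[k][0] for k in range(a, b))
        dv = sum(locus[k + 1][1] - locus[k][1] for k in range(a, b))
        return math.atan2(dv, du)
    diff = abs(avg_dir(i, i + w) - avg_dir(i - w, i))
    diff = min(diff, 2 * math.pi - diff)  # wrap angle into [0, pi]
    return math.degrees(diff) > threshold_deg
```

Averaging over a window rather than using single movement vectors makes the test robust to small jitter in the detected positions, which is presumably why averaged traveling directions are used.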
The description returns to
The calculated parameter is stored in the parameter storage unit 322. When the display system is thereafter used, the position of the detection coordinate system is converted into the position of the display coordinate system by using the calculated parameter. Alternatively, the position of the display coordinate system may be converted into the position of the detection coordinate system by using the calculated parameter.
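As one illustration of the parameter calculation and its later use, the coordinate transformation can be modeled as an affine map from the detection coordinate system (u, v) to the display coordinate system (x, y), fitted to the matching samples by least squares. The affine model and all names below are assumptions of this sketch; the specification does not fix the transform model.

```python
# Illustrative least-squares affine calibration from matching samples;
# the transform model (affine) is an assumption of this sketch.
import numpy as np

def fit_affine(samples):
    """samples: list of ((u, v), (x, y)) matching pairs.
    Returns two parameter vectors px, py with x = px.(u, v, 1), y = py.(u, v, 1)."""
    A = np.array([[u, v, 1.0] for (u, v), _ in samples])
    bx = np.array([x for _, (x, _) in samples])
    by = np.array([y for _, (_, y) in samples])
    px, *_ = np.linalg.lstsq(A, bx, rcond=None)
    py, *_ = np.linalg.lstsq(A, by, rcond=None)
    return px, py

def to_display(p, px, py):
    """Convert a detected position to the display coordinate system."""
    vec = np.array([p[0], p[1], 1.0])
    return float(px @ vec), float(py @ vec)
```

Three non-collinear matching samples suffice to determine the six affine parameters; additional samples overdetermine the system and the least-squares fit averages out detection noise.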
The setting unit 301 sets a proper detection range (S417). For example, the setting unit 301 identifies a rectangle based on the locus, and stores the positions of the four corners of the rectangle in the range data storage unit 323 as a parameter representing the detection range. Incidentally, the detection range set at this time may also be referred to as an effective range. The rectangle based on the locus is, for example, a rectangle circumscribed about the locus or a rectangle inscribed in the locus. In addition, the detection range may be set with a margin region added around the rectangle based on the locus.
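Under the circumscribed-rectangle reading of S417, the effective range can be sketched as the axis-aligned bounding box of the selected locus, optionally widened by a margin. The function names and the margin handling are assumptions for illustration.

```python
# Illustrative effective-range setting (S417): bounding box of the
# selected locus plus an optional margin. Names are assumptions.

def detection_range(locus, margin=0.0):
    """locus: list of (u, v) points; returns the four corners of the range."""
    us = [u for u, _ in locus]
    vs = [v for _, v in locus]
    u0, u1 = min(us) - margin, max(us) + margin
    v0, v1 = min(vs) - margin, max(vs) + margin
    return [(u0, v0), (u1, v0), (u1, v1), (u0, v1)]

def in_range(p, corners):
    """True when a detected position lies inside the effective range."""
    (u0, v0), _, (u1, v1), _ = corners
    return u0 <= p[0] <= u1 and v0 <= p[1] <= v1
```

Detected positions failing `in_range` would then be discarded as arising outside the display surface, which suppresses the noise described earlier.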
The accuracy of calibration is expected to be increased when a figure thus having three feature points not aligned on one straight line is used. Coordinate transformation may not be performed correctly when a parameter is calculated on the basis of a linear matching pattern.
In addition, more matching samples may be extracted by detecting touch positions on a plurality of figures.
Points other than starting points, corners, or endpoints may be used as feature points. For example, an intermediate point on a line (for example a center of a side) may be used as a feature point.
Incidentally, in an environment using such a display system, various kinds of cooperative operations are expected to be performed between a portable terminal device and the display system. For example, the portable terminal device and the display system may perform various kinds of cooperative operations in response to a cooperation request from the portable terminal device to the display system. The display system may detect the portable terminal device of a person entering a room in which the display device 103 is installed, and automatically distribute a particular program to the portable terminal device. The portable terminal device may automatically start the received particular program. The portable terminal device and the display system may synchronize data or processes. Processing in the display system may be controlled by operation on the portable terminal device, or processing in the portable terminal device may be controlled by operation on the display system.
According to the present embodiment, it is possible to perform calibration more correctly for the display system that detects a position touched by a user. This is because a locus based on noise is removed as a locus not approximate to the figure. That is, the effect of noise is removed.
In addition, the work of setting a detection range in the detecting device 105 can be omitted.
Samples are weighted according to kinds of feature points. Thus, the relation between the display coordinate system of the display device 103 and the detection coordinate system of the detecting device 105 can be reflected in the parameter more correctly according to the strength of the features.
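The weighting by kind of feature point can be sketched as a weighted least-squares variant of the affine fit. The specific weight values and names below are assumptions; the specification only states that samples are weighted according to the kinds of feature points (for example, starting points, corners, and endpoints versus other points).

```python
# Illustrative weighted least-squares calibration; the weight values per
# kind of feature point are assumptions, not from the specification.
import numpy as np

WEIGHTS = {"start": 2.0, "corner": 2.0, "endpoint": 2.0, "intermediate": 1.0}

def fit_affine_weighted(samples):
    """samples: list of (kind, (u, v), (x, y)) matching samples.
    Weighted least squares: scale each row by sqrt(weight)."""
    w = np.sqrt([WEIGHTS.get(k, 1.0) for k, _, _ in samples])
    A = np.array([[u, v, 1.0] for _, (u, v), _ in samples]) * w[:, None]
    bx = np.array([x for *_, (x, _) in samples]) * w
    by = np.array([y for *_, (_, y) in samples]) * w
    px, *_ = np.linalg.lstsq(A, bx, rcond=None)
    py, *_ = np.linalg.lstsq(A, by, rcond=None)
    return px, py
```

Scaling each row of the system by the square root of its weight makes the solver minimize the weighted sum of squared residuals, so samples at strong feature points pull the parameter more than intermediate points do.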
One embodiment of the present technology has been described above. However, the present technology is not limited to this. For example, the above-described functional block configuration may not coincide with a program module configuration.
In addition, the configuration of each storage area described above is an example, and is not limited to the configuration as described above. Further, also in a processing flow, the order of pieces of processing may be changed or a plurality of pieces of processing may be performed in parallel with each other unless a processing result is changed.
The control device 101 described above is a computer device in which as illustrated in
The embodiment of the present technology described above is summarized as follows.
A calibration method according to one of the embodiments includes: making a display device display a figure indicating a touch route; receiving a plurality of detection results from a detection device, each of the plurality of detection results including one or more touch positions, at each detection timing, on a surface on which the figure is displayed; extracting one or more loci based on the plurality of detection results; and, when the extracted one or more loci include a plurality of loci, executing a calibration based on a corresponding locus coinciding with or approximate to the figure, the corresponding locus being included in the plurality of loci.
Thus, calibration can be performed more correctly for a display system that detects the position touched by a user.
The calibration method may further include setting a detection range in the detecting device on the basis of the corresponding locus.
Thus, the work of setting the detection range in the detecting device can be omitted.
Further, the calculating may weight a sample according to a kind of the feature point.
Thus, the relation between the display coordinate system of the display device and the detection coordinate system of the detecting device can be reflected in the parameter more correctly according to a geometric characteristic of the feature point.
Incidentally, a program for making a computer perform processing based on the method can be created. The program may be stored on a computer readable storage medium or a storage device such as, for example, a flexible disk, a compact disc read-only memory (CD-ROM), a magneto-optical disk, a semiconductor memory, or a hard disk. Incidentally, an intermediate processing result is generally stored temporarily in a storage device such as a main memory.
All examples and conditional language recited herein are intended for pedagogical purposes to aid the reader in understanding the invention and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority and inferiority of the invention. Although the embodiment of the present invention has been described in detail, it should be understood that the various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.
Number | Date | Country | Kind |
---|---|---|---|
2015-068318 | Mar 2015 | JP | national |