1. Field of the Invention
The present invention relates to a data processing apparatus in which a plurality of operators can simultaneously operate a single operation screen including a plurality of operation items, and a data processing system, a control method for the data processing apparatus, and a storage medium.
2. Description of the Related Art
In recent years, camera scanners have been known as an apparatus that reads image data of a document. The camera scanner captures an image of a document placed on a platen, and then processes and stores image data of the document captured by the camera (see Japanese Patent Application Laid-Open No. 2006-115334).
Furthermore, an image processing system has been known in which a projector projects, onto the platen, image data captured by the camera of such a camera scanner together with an operation button. In this image processing system, an operation such as printing of the captured image data can be performed by detecting a user operation on the projected screen.
However, the image processing system described above does not anticipate cases in which a plurality of operators simultaneously operates the operation screen. Such a case includes an insurance contract procedure in which a camera captures an image of a contract document placed on the platen, and a projector projects the resultant image data, together with a check box used for confirming that the content has been checked, onto the platen. In such a case, an insurance company employee and a client can simultaneously operate the operation screen. However, the check box should be checkable only after the client has agreed to the explanation given by the employee, and should not be freely checkable by the employee.
The present invention is directed to a technique capable of limiting, for each operation item, an operator who can operate the item in a system in which a plurality of operators can simultaneously operate a single operation screen including a plurality of the operation items.
The present invention is directed to a data processing apparatus in which an operation screen including a plurality of operation items can be simultaneously operated by a first operator and a second operator. The apparatus includes a projection unit configured to project the operation screen onto a predetermined area, a determination unit configured to determine whether an operator who has performed an operation on the projected operation screen is the first operator or the second operator, and a control unit configured to perform control so as to validate operations on both a first operation item and a second operation item in the operation screen when the determination unit determines that the operator is the first operator, and to validate an operation on the first operation item but invalidate an operation on the second operation item when the determination unit determines that the operator is the second operator.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Exemplary embodiments for implementing the present invention are described below with reference to the drawings.
As illustrated in the attached drawings, the camera scanner 101 includes a controller unit 201, a camera unit 202, a platen 204, a projector 207, and a range image sensor 208.
The camera unit 202 may capture an image with a fixed resolution, but is preferably capable of capturing an image with a high resolution and a low resolution.
The camera scanner 101 may further include a liquid crystal display (LCD) touch panel 330 and a speaker 340 (not illustrated).
$$[X_c, Y_c, Z_c]^T = [R_c \mid t_c]\,[X, Y, Z, 1]^T \quad (1)$$
In formula (1), $R_c$ and $t_c$ are external parameters obtained from the orientation (rotation) and the position (translation) of the camera with respect to the orthogonal coordinate system; $R_c$ is referred to as a 3×3 rotation matrix and $t_c$ as a translation vector. Conversely, a three-dimensional point defined in the camera coordinate system can be converted into a three-dimensional point in the orthogonal coordinate system with the following formula (2):
$$[X, Y, Z]^T = [R_c^{-1} \mid -R_c^{-1} t_c]\,[X_c, Y_c, Z_c, 1]^T \quad (2)$$
The two-dimensional camera image plane is obtained by the camera unit 202 converting three-dimensional information in a three-dimensional space into two-dimensional information. More specifically, a three-dimensional point $P_c = [X_c, Y_c, Z_c]$ in the camera coordinate system is converted by perspective projection into two-dimensional coordinates $p_c = [x_p, y_p]$ on the camera image plane with the following formula (3):
$$\lambda\,[x_p, y_p, 1]^T = A\,[X_c, Y_c, Z_c]^T \quad (3)$$
In formula (3), $A$ is referred to as the camera internal parameter, a 3×3 matrix determined by the focal length, the image center, and the like.
With formulae (1) and (3) described above, a three-dimensional point group expressed in the orthogonal coordinate system can be converted into coordinates in the camera coordinate system and then into coordinates on the camera image plane. It is assumed that the internal parameter of each hardware device and the position and orientation (external parameters) of each device with respect to the orthogonal coordinate system have been calibrated in advance with a known calibration method. The term "three-dimensional point group" hereinafter represents three-dimensional data in the orthogonal coordinate system unless otherwise specified.
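For illustration, formulae (1) to (3) can be collected into a short sketch. The following Python fragment is a minimal sketch only: the calibration values $R_c$, $t_c$, and $A$ are placeholders standing in for the values obtained by the advance calibration described above.

```python
import numpy as np

# Placeholder calibration values for illustration; real values come from
# the advance calibration of the camera unit 202.
R_c = np.eye(3)                              # 3x3 rotation matrix (external parameter)
t_c = np.array([[0.0], [0.0], [500.0]])      # translation vector (external parameter)
A = np.array([[800.0,   0.0, 320.0],         # internal parameter: focal length
              [  0.0, 800.0, 240.0],         # and image center, in pixels
              [  0.0,   0.0,   1.0]])

def orthogonal_to_camera(p):
    """Formula (1): orthogonal coordinate system -> camera coordinate system."""
    p = np.asarray(p, dtype=float).reshape(3, 1)
    return (np.hstack([R_c, t_c]) @ np.vstack([p, [[1.0]]])).ravel()

def camera_to_orthogonal(p_c):
    """Formula (2): camera coordinate system -> orthogonal coordinate system."""
    p_c = np.asarray(p_c, dtype=float).reshape(3, 1)
    R_inv = np.linalg.inv(R_c)
    return (np.hstack([R_inv, -R_inv @ t_c]) @ np.vstack([p_c, [[1.0]]])).ravel()

def camera_to_image_plane(p_c):
    """Formula (3): perspective projection onto the camera image plane."""
    x = A @ np.asarray(p_c, dtype=float).reshape(3, 1)
    return (x[:2] / x[2]).ravel()            # divide out the scale factor lambda

point = [100.0, 50.0, 0.0]                   # a point on the platen (orthogonal coordinates)
p_cam = orthogonal_to_camera(point)
print(camera_to_image_plane(p_cam))          # pixel coordinates [480., 320.]
print(camera_to_orthogonal(p_cam))           # round trip back to [100., 50., 0.]
```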
As illustrated in the drawings, the controller unit 201 includes a central processing unit (CPU) 302, a random access memory (RAM) 303, a read only memory (ROM) 304, a hard disk drive (HDD) 305, a network interface (I/F) 306, an image processor 307, and a camera I/F 308.
The CPU 302 controls an operation of the entire controller unit 201. The RAM 303 is a volatile memory. The ROM 304 is a nonvolatile memory and stores a boot program for the CPU 302. The HDD 305 has a larger capacity than the RAM 303, and stores a control program, for the camera scanner 101, executed by the controller unit 201. When the CPU 302 executes a program stored in the ROM 304 and the HDD 305, a functional configuration of the camera scanner 101 and processing (information processing) in flowcharts described below are implemented.
The CPU 302 executes the boot program stored in the ROM 304, when the camera scanner 101 is turned ON or the like to be started. The boot program is used for the CPU 302 to read out the control program stored in the HDD 305, and load the control program onto the RAM 303. After executing the boot program, the CPU 302 executes the control program loaded on the RAM 303, and thus performs the control. The RAM 303 further stores data used in the operation based on the control program. Such data is written to and read from the RAM 303 by the CPU 302. The HDD 305 may further store various settings required for the operation based on the control program and image data generated by a camera input. Such settings and data are written to and read from the HDD 305 by the CPU 302. The CPU 302 communicates with other apparatuses on the network 104 through the network I/F 306.
The image processor 307 reads out and processes the image data stored in the RAM 303, and writes the resultant image data to the RAM 303. The image processor 307 executes image processing such as rotation, magnification, and color conversion.
The camera I/F 308 is connected to the camera unit 202 and the range image sensor 208. In response to an instruction from the CPU 302, the camera I/F 308 acquires image data from the camera unit 202 and range image data from the range image sensor 208, and writes them to the RAM 303. The camera I/F 308 also transmits control commands from the CPU 302 to the camera unit 202 and the range image sensor 208, thereby configuring both devices.
The controller unit 201 may further include at least one of a display controller 309, a serial I/F 310, an audio controller 311, and a universal serial bus (USB) controller 312.
The display controller 309 is connected to the projector 207 and the LCD touch panel 330, and controls the displaying of image data according to instructions from the CPU 302.
The serial I/F 310 inputs and outputs a serial signal. The serial I/F 310 is connected to the turntable 209, and transmits instructions for starting and ending rotation and for setting a rotation angle from the CPU 302 to the turntable 209. The serial I/F 310 is connected to the LCD touch panel 330. When the LCD touch panel 330 is pressed, the CPU 302 acquires coordinates of the pressed portion through the serial I/F 310.
The audio controller 311 is connected to the speaker 340, and converts audio data into an analog audio signal and outputs audio sound through the speaker 340, under an instruction from the CPU 302.
The USB controller 312 controls an external USB device according to an instruction from the CPU 302. The USB controller 312 is connected to an external memory 350 such as a USB memory and a secure digital (SD) card, and writes and reads data to and from the external memory 350.
A main control unit 402, mainly in charge of the control, controls other modules in the functional configuration 401.
The image acquisition unit 407 is a module that performs image input processing, and includes a camera image acquisition unit 408 and a range image acquisition unit 409. The camera image acquisition unit 408 acquires image data output from the camera unit 202 through the camera I/F 308, and stores the image data in the RAM 303 (captured image acquisition processing). The range image acquisition unit 409 acquires the range image data output from the range image sensor 208 through the camera I/F 308, and stores the range image data in the RAM 303 (range image acquisition processing). The processing executed by the range image acquisition unit 409 is described below in detail.
A recognition processing unit 410 is a module that detects and recognizes a movement of an object on the platen 204, from the image data acquired by the camera image acquisition unit 408 and the range image acquisition unit 409. The recognition processing unit 410 includes a gesture recognition unit 411 and a gesture operator identification unit 412. The gesture recognition unit 411 sequentially acquires images on the platen 204 from the image acquisition unit 407. Upon detecting a gesture such as touching, the gesture recognition unit 411 notifies the main control unit 402 of the detected gesture. The gesture operator identification unit 412 identifies the operator who has performed the gesture detected by the gesture recognition unit 411, and notifies the main control unit 402 of the identified operator. The processing executed by the gesture recognition unit 411 and the gesture operator identification unit 412 is described in detail below.
An image processing unit 413 provides a function with which the image processor 307 analyzes the images acquired from the camera unit 202 and the range image sensor 208. The gesture recognition unit 411 and the gesture operator identification unit 412 also execute processing using a function of the image processing unit 413.
A user interface unit 403 receives a request from the main control unit 402 and generates graphical user interface (GUI) parts such as messages and buttons. The user interface unit 403 requests a display unit 406 to display the generated GUI parts, and the display unit 406 displays them on the projector 207 or the LCD touch panel 330 through the display controller 309. The projector 207 is directed toward the platen 204, and thus can project the GUI parts onto the platen 204; the platen 204 therefore includes a projection area onto which the image is projected by the projector 207. The user interface unit 403 receives a gesture operation, such as touching, recognized by the gesture recognition unit 411, or an input operation from the LCD touch panel 330 through the serial I/F 310, together with the coordinates related to the received operation. The user interface unit 403 determines the operation content (such as a pressed button) based on the association between the content displayed on the operation screen and the operated coordinates, and notifies the main control unit 402 of the operation content, whereby the operation made by the operator is received.
A network communication unit 404 performs communications based on TCP/IP with other apparatuses on the network 104 through the network I/F 306.
A data management unit 405 stores various data, such as operation data generated by the CPU 302 executing the control program, in a predetermined area on the HDD 305, and manages the data.
(Description about Range Image Sensor and Range Image Acquisition Unit)
The range image sensor 208 is a range image sensor based on an infrared pattern projection method, and includes an infrared pattern projection unit 361, an infrared camera 362, and an RGB camera 363, as illustrated in the drawings.
Processing executed by the range image acquisition unit 409 is described below with reference to a flowchart.
When the processing starts, in step S501, the range image acquisition unit 409 projects an infrared three-dimensional shape measurement pattern 522 onto a target object 521 by using the infrared pattern projection unit 361.
In step S502, the range image acquisition unit 409 acquires an RGB camera image 523 as an image of the target object 521 captured by the RGB camera 363, and an infrared camera image 524 as an image, captured by the infrared camera 362, of the three-dimensional shape measurement pattern 522 projected in step S501. The infrared camera 362 and the RGB camera 363 are installed at different positions, and thus the RGB camera image 523 and the infrared camera image 524 differ from each other in imaging area.
In step S503, the range image acquisition unit 409 converts the coordinate system of the infrared camera 362 into the coordinate system of the RGB camera 363, so that the coordinate systems match between the infrared camera image 524 and the RGB camera image 523. It is assumed that the relative positions and the internal parameters of the infrared camera 362 and the RGB camera 363 have been given by the calibration processing executed in advance.
In step S504, the range image acquisition unit 409 extracts the corresponding points between the three-dimensional shape measurement pattern 522 and the infrared camera image 524 that has been subjected to the coordinate conversion in step S503.
In step S505, the range image acquisition unit 409 calculates a distance from the infrared camera 362, based on triangulation with a straight line, connecting the infrared pattern projection unit 361 and the infrared camera 362, serving as a base line 525. The range image acquisition unit 409 calculates the distance between the target object 521 and the infrared camera 362 at a position corresponding to the pixel successfully associated in step S504, and stores the distance as a pixel value for the pixel. On the other hand, the range image acquisition unit 409 stores an invalid value for a pixel having failed to be associated as a portion in which measurement of the distance has failed. The range image acquisition unit 409 performs the processing described above on all the pixels in the infrared camera image 524 that have been subjected to the coordinate conversion in step S503, and thus generates a range image of which each pixel is provided with the distance value (distance information).
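As a rough sketch of the triangulation in step S505, the fragment below derives a distance from one pattern correspondence, assuming an idealized rectified geometry in which the infrared pattern projection unit 361 and the infrared camera 362 share a horizontal base line 525 and a common focal length in pixels; the actual computation in the embodiment may differ.

```python
def depth_from_correspondence(x_proj, x_cam, focal_px, baseline_mm):
    """Triangulate the distance for one corresponding point (step S505).

    Assumes a rectified projector-camera pair: the pattern element emitted
    at column x_proj is observed at column x_cam, and the horizontal shift
    (disparity) is inversely proportional to the distance.
    """
    disparity = x_proj - x_cam
    if disparity <= 0:
        return None                 # association failed: store an invalid value
    return focal_px * baseline_mm / disparity

# A pattern element projected at column 400 and observed at column 380,
# with an 80 mm base line and an 800-pixel focal length (both assumed):
print(depth_from_correspondence(400, 380, 800.0, 80.0))   # 3200.0 mm
```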
In step S506, the range image acquisition unit 409 stores the RGB values of the RGB camera image 523 for each pixel of the range image, whereby a range image in which each pixel has four values (R, G, B, and distance) is formed. The range image thus acquired is based on the range image sensor coordinate system defined for the RGB camera 363 of the range image sensor 208.
Then, in step S507, the range image acquisition unit 409 converts the distance information, obtained in the range image sensor coordinate system as described above, into a three-dimensional point group in the orthogonal coordinate system.
The range image sensor 208 may employ systems other than the infrared pattern projection system employed in the present exemplary embodiment described above. For example, a stereo system in which stereographic three-dimensional viewing is achieved with two RGB cameras or a Time of Flight (TOF) system in which a distance is measured by detecting a flight time of a laser beam may be employed.
(Description about Gesture Recognition Unit)
The processing executed by the gesture recognition unit 411 is described in detail below with reference to a flowchart.
When the processing starts, in step S601, the gesture recognition unit 411 executes initialization processing. In the initialization processing, the gesture recognition unit 411 acquires one frame of the range image from the range image acquisition unit 409 and calculates the plane parameters of the plane including the platen 204.
In step S602, the gesture recognition unit 411 acquires the three-dimensional point group of an object on the platen 204 through steps S621 and S622.
In step S621, the gesture recognition unit 411 acquires one frame of each of the range image and the three-dimensional point group from the range image acquisition unit 409.
In step S622, the gesture recognition unit 411 uses the plane parameters of the platen 204 to remove a point group on the plane including the platen 204 from the acquired three-dimensional point group.
In step S603, the gesture recognition unit 411 executes processing of detecting a hand shape and a fingertip of the user from the acquired three-dimensional point group, through steps S631 to S634.
In step S631, the gesture recognition unit 411 extracts a skin-colored three-dimensional point group 701, corresponding to the hand, from the three-dimensional point group acquired in step S602.
In step S632, the gesture recognition unit 411 generates a two-dimensional image 702 by projecting the extracted three-dimensional point group of the hand onto the plane including the platen 204, and detects the outer shape of the hand from the image. At this time, the correspondence relationship between each point in the two-dimensional image and each point in the three-dimensional point group is stored.
In step S633, the gesture recognition unit 411 calculates the curvature of the outer shape at each of the points defining the detected outer shape of the hand, and detects, as a fingertip, a point at which the calculated curvature is smaller than a predetermined value.
In step S634, the gesture recognition unit 411 calculates the number of detected fingertips and the coordinates of each fingertip. As described above, the correspondence relationship between points in the two-dimensional image projected onto the platen 204 and points in the three-dimensional point group representing the hand is stored, so the gesture recognition unit 411 can acquire the three-dimensional coordinates of each fingertip. The image from which the fingertip is detected is not limited to the image of the three-dimensional point group projected onto the two-dimensional image as in the method described above. For example, the hand area may be extracted by background subtraction on the range image, or from a skin color area in the RGB image, and the fingertip in the hand area may be detected by a method similar to that described above (such as calculation of the curvature of the outer shape). In this case, the detected fingertip coordinates are two-dimensional coordinates on a two-dimensional image such as the RGB image or the range image, so the gesture recognition unit 411 needs to convert them into three-dimensional coordinates in the orthogonal coordinate system by using the distance information of the range image at those coordinates. At that time, the center of the circle of curvature used for detecting the fingertip, rather than a point on the outer shape, may be used as the fingertip point.
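The fingertip detection in steps S632 to S634 might be sketched as follows. The fragment thresholds the radius of the circle of curvature fitted through each contour point and its neighbors (a small radius corresponds to a sharp, fingertip-like protrusion); the step size and threshold are illustrative assumptions, and suppression of adjacent duplicate candidates is omitted for brevity.

```python
import numpy as np

def detect_fingertips(contour, step=8, radius_thresh=12.0):
    """Find fingertip candidates on a closed hand contour (N x 2 array).

    For each contour point, fit the circle of curvature through the point
    and its two neighbors `step` samples away (cf. step S633); a small
    circumradius marks a sharp protrusion, i.e. a fingertip candidate.
    """
    n = len(contour)
    tips = []
    for i in range(n):
        p0 = contour[(i - step) % n]
        p1 = contour[i]
        p2 = contour[(i + step) % n]
        a = np.linalg.norm(p1 - p0)
        b = np.linalg.norm(p2 - p1)
        c = np.linalg.norm(p2 - p0)
        # twice the signed triangle area, computed explicitly for 2-D points
        cross = (p1[0] - p0[0]) * (p2[1] - p0[1]) - (p1[1] - p0[1]) * (p2[0] - p0[0])
        area = 0.5 * abs(cross)
        if area < 1e-6:
            continue                        # locally straight: not a fingertip
        radius = a * b * c / (4.0 * area)   # radius of the circle of curvature
        if radius < radius_thresh:
            tips.append(i)                  # index of a fingertip candidate
    return tips
```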
In step S604, the gesture recognition unit 411 executes gesture determination processing, based on the detected hand shape and fingertip, through steps S641 to S645.
In step S641, the gesture recognition unit 411 determines whether the number of fingertips detected in step S603 is one. When the gesture recognition unit 411 determines that the number of fingertips is not one (No in step S641), the processing proceeds to step S646. In step S646, the gesture recognition unit 411 determines that no gesture has been performed. On the other hand, when the gesture recognition unit 411 determines that the number of fingertips is one (Yes in step S641), the processing proceeds to step S642. In step S642, the gesture recognition unit 411 calculates the distance between the detected fingertip and the plane including the platen 204.
In step S643, the gesture recognition unit 411 determines whether the distance calculated in step S642 is equal to or smaller than a predetermined value. When the distance is equal to or smaller than the predetermined value (Yes in step S643), the processing proceeds to step S644. In step S644, the gesture recognition unit 411 determines that a touch gesture of touching the platen 204 with the fingertip has been performed. When the distance calculated in step S642 is not equal to or smaller than the predetermined value (No in step S643), the processing proceeds to step S645. In step S645, the gesture recognition unit 411 determines that a gesture of moving the fingertip (gesture with the fingertip positioned above the platen 204 without touching) has been performed.
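The determination in steps S641 to S645 reduces to a small decision function, sketched below with an assumed touch threshold; the plane parameters are those calculated in the initialization of step S601.

```python
def classify_gesture(fingertips, plane, touch_thresh_mm=10.0):
    """Steps S641 to S645: classify the gesture from the detected fingertips.

    `plane` holds (a, b, c, d) for the platen plane ax + by + cz + d = 0
    with (a, b, c) normalized; `fingertips` is a list of 3-D points.
    """
    if len(fingertips) != 1:
        return "none"                               # step S646: no gesture
    x, y, z = fingertips[0]
    a, b, c, d = plane
    distance = abs(a * x + b * y + c * z + d)       # step S642
    if distance <= touch_thresh_mm:                 # step S643
        return "touch"                              # step S644: touch gesture
    return "move"                                   # step S645: fingertip above the platen
```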
In step S605, the gesture recognition unit 411 notifies the main control unit 402 of the determined gesture, and then the processing returns to step S602 and the gesture recognition processing is repeated.
The gesture recognition unit 411 can recognize the gesture performed by the user based on the range image, through the processing described above.
(Description about Used States)
Used states of the camera scanner 101 are described below.
The present invention is expected to be applied to a case where a plurality of operators simultaneously operates the camera scanner 101. For example, the present invention can be applied to various cases such as a procedure for a contract and the like, a presentation for a product and the like, meetings, and education.
Operators of the camera scanner 101 are classified into a person (main operator) who operates the camera scanner 101 while giving an explanation, and a person (sub operator) who operates the camera scanner 101 while listening to the explanation.
In the procedure for a contract and the like, a presenter (main operator) and a client (sub operator) perform a procedure including checking of a contract content, inputting of necessary items, checking of input content, and approval. The necessary items can be input by either the presenter or the client as long as the client checks the input content in the end. On the other hand, only the client who has agreed to the presentation of the presenter can confirm that the input content is checked. Thus, the presenter is not allowed to confirm that the content is checked on behalf of the client.
In an education case, a teacher (main operator) and a student (sub operator) perform a procedure including question setting, answering, correcting, explaining a suggested answer, and the like. The answers made by the student can be corrected by the teacher but not by the student. Meanwhile, the student can input the answers for the questions, and the teacher can input the suggested answers. It is an object of the present invention to implement an appropriate processing flow in the cases where the camera scanner 101 is operated by a plurality of operators, by appropriately giving authority to each operator.
In the case described below, the operators (the main operator 801 and the sub operator 802) face each other across the platen 204 and simultaneously operate the camera scanner 101.
(Description about Main Control Unit)
Processing executed by the main control unit 402 is described in detail below with reference to a flowchart.
When the processing starts, in step S901, the main control unit 402 projects an operation screen, including a plurality of operation items, onto the platen 204 by using the projector 207.
In step S902, the main control unit 402 determines whether a gesture detection notification has been input from the gesture recognition unit 411. Although the detected gesture in the description below is a touching operation on the platen 204, other gesture operations may be detected similarly. In the case described here, a touching operation on the name input field 1002 is detected.
In step S903, the main control unit 402 identifies the operation item selected by the gesture; in this case, the name input field 1002 is identified as the selected item.
In step S904, the main control unit 402 identifies the gesture operator who has performed the gesture detected in step S902; in this case, the main operator 801 is identified as the gesture operator.
Now, the gesture operator identifying processing in step S904 is described.
In step S1101, the gesture operator identification unit 412 extracts, from the range image, a hand area including the hand that has performed the gesture.
Then, in step S1102, the gesture operator identification unit 412 executes thinning processing on the hand area to generate a center line 1121 of the hand area.
In step S1103, the gesture operator identification unit 412 executes vector approximation processing on the center line to generate a vector 1131.
In step S1104, the gesture operator identification unit 412 generates, from the approximated vector, a frame model of the hand area including a finger area 1141, a hand area 1142, a forearm area 1143, and an upper arm area 1144.
Finally, in step S1105, the gesture operator identification unit 412 estimates the direction in which the gesture operation has been performed based on the frame model acquired in step S1104, and identifies the gesture operator based on the positional relationship between the operators set in advance. Thus, the main control unit 402 can determine that the touch gesture has been performed by the operator 802 with the operator's right hand.
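In code, the direction estimation of step S1105 might look like the sketch below, assuming the forearm segment of the frame model has been projected onto the platen plane and the direction toward each operator's seat has been registered in advance; the names and geometry are illustrative.

```python
import numpy as np

def identify_operator(forearm_vector, seats):
    """Step S1105 (sketch): pick the operator whose seat the arm extends from.

    `forearm_vector` points from the elbow toward the fingertip, projected
    onto the platen plane; `seats` maps each operator to the direction from
    the platen center toward that operator's seat. The arm reaches in from
    the operator's side, so the seat direction most opposite the forearm
    vector is selected.
    """
    v = forearm_vector / np.linalg.norm(forearm_vector)
    return min(seats, key=lambda name: float(np.dot(v, seats[name] / np.linalg.norm(seats[name]))))

seats = {"main_operator_801": np.array([0.0, -1.0]),   # near side of the platen
         "sub_operator_802":  np.array([0.0,  1.0])}   # far side, facing the main operator
print(identify_operator(np.array([0.2, 1.0]), seats))  # arm reaching in from the near side
```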
The method of identifying the gesture operator is not limited thereto, and the operator may be identified through other methods.
Referring back to the flowchart, in step S905, the main control unit 402 determines whether the gesture operator identified in step S904 is authorized to operate the operation item identified in step S903, based on an authority management table described below.
The method of determining whether the operator is authorized is not limited thereto. For example, each of the UI parts 1001 to 1004, corresponding to the respective operation items, may be provided with information indicating whether the part is authorized to be operated. Thus, whether the gesture operator is authorized to operate the corresponding item may be determined based on the information provided to the operated UI part and the information about the gesture operator.
For example, UI screens in which each operation item is provided with such authority information may be displayed.
The authority management table 1201 stores, for each operation item in the operation screen, information indicating whether the main operator 801 and the sub operator 802 are each authorized to operate the item.
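A minimal sketch of such a table and of the check in step S905 follows; apart from the name input field 1002, the item names and the table contents are illustrative assumptions rather than the embodiment's actual values.

```python
# Sketch of the authority management table 1201 (contents illustrative).
AUTHORITY_TABLE = {
    #  operation item       main operator   sub operator
    "name_input_1002":     {"main": True,  "sub": True},   # either may input
    "check_box":           {"main": False, "sub": True},   # only the client confirms
    "print_button":        {"main": True,  "sub": False},
}

def is_authorized(item, operator):
    """Step S905: look up whether the gesture operator may operate the item."""
    entry = AUTHORITY_TABLE.get(item)
    return entry is not None and entry.get(operator, False)

print(is_authorized("check_box", "main"))   # False: the operation is invalidated
print(is_authorized("check_box", "sub"))    # True: the operation is validated
```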
Referring back to the flowchart, when the main control unit 402 determines that the operator is authorized (Yes in step S905), the processing proceeds to step S906, and the main control unit 402 executes the processing corresponding to the operated item. When the operator is not authorized (No in step S905), the operation is invalidated and the processing proceeds to step S908.
The name input field 1002 can be operated by both the main operator 801 and the sub operator 802. Therefore, the touching operation described above is validated, and the processing corresponding to the name input field 1002 is executed.
In step S908, the main control unit 402 determines whether the system is terminated. The processing from step S901 to step S908 is repeated until the main control unit 402 determines that the system is terminated. The main control unit 402 determines that the system is terminated when an end button projected and displayed on the operation screen is pressed or when a power button (not illustrated) on the main body of the camera scanner 101 is pressed (Yes in step S908).
As described above, according to the present exemplary embodiment, when the gesture operation is detected, the gesture operator is identified and whether execution is permitted is determined for each operation item on the operation screen. Thus, in the data processing system in which a plurality of operators can simultaneously operate a single operation screen, a displayed item that can be operated by one operator only can be prevented from being freely operated by the other operator.
The first exemplary embodiment assumes a case where the operators constantly stay within a range in which they can operate the camera scanner 101. However, in the contract procedure and the like, the presenter might temporarily leave the presenter's seat. In such a case, it is not desirable that some of the operation items that can be operated by the client be operated while the presenter is away. Thus, in a second exemplary embodiment, a method is described in which, when one operator has moved away from a position where the camera scanner 101 can be operated, the authority given to the other operator is changed.
Processing executed in the camera scanner 101 according to the second exemplary embodiment is described below.
After identifying the operation item in step S903 and identifying the gesture operator in step S904, in step S1401, the main control unit 402 checks the away state of each operator. More specifically, the away state is detected when a corresponding one of human presence sensors 1501 and 1502, attached to the camera scanner 101, no longer detects the operator in a person detection area 1503 or 1504.
Then, in step S905, based on the operation item identified in step S903 and the away state checked in step S1401, the main control unit 402 determines whether the operator is authorized to operate the operation item. When the main control unit 402 determines that the operator is authorized (Yes in step S905), the processing proceeds to step S906. On the other hand, when the main control unit 402 determines that the operator is not authorized (No in step S905), the processing proceeds to step S908. Whether the operator is authorized is determined based on an authority management table 1601 in which the authority of each operator is defined for each operation item and for each away state.
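Table 1601 might extend the check with the away state roughly as follows; the contents are illustrative.

```python
def is_authorized_1601(item, operator, main_away):
    """Second embodiment (sketch of table 1601): the sub operator loses
    authority over certain items while the main operator is away from the
    person detection area. Contents are illustrative.
    """
    table = {
        "name_input_1002": {"main": True,  "sub": True},
        "check_box":       {"main": False, "sub": not main_away},  # needs the presenter present
    }
    return table.get(item, {}).get(operator, False)

print(is_authorized_1601("check_box", "sub", main_away=False))  # True
print(is_authorized_1601("check_box", "sub", main_away=True))   # False
```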
As described above, according to the present exemplary embodiment, when one operator is away from the position where the camera scanner 101 can be operated, the authority of the other operator can be limited. As a result, when one operator is away, the operation can be prevented from being freely performed by the other operator.
In the first and the second exemplary embodiments described above, when a certain operation item is operated by an authorized operator, processing corresponding to the operated item is immediately executed regardless of whether the item has been operated by the main operator or the sub operator. In this configuration, when a user such as a sub operator who is not used to the operation uses the camera scanner 101, an unintentional action might be erroneously recognized as a gesture operation, and an erroneous operation might be performed accordingly. Thus, in a third exemplary embodiment, a method for preventing such an erroneous operation, by the main operator controlling a timing at which the sub operator can operate the camera scanner 101, will be described.
Processing executed in the camera scanner 101 according to the third exemplary embodiment is described below.
In step S1701, the main control unit 402 determines whether the gesture operator identified in step S904 is the main operator 801. When the gesture operator is the main operator 801 (Yes in step S1701), the processing proceeds to step S905, and the authority checking processing is executed. On the other hand, when the gesture operator is the sub operator 802 (No in step S1701), the processing proceeds to step S1702.
In step S1702, the main control unit 402 checks whether an instruction permitting the sub operator 802 to perform an operation has been input. For example, the main control unit 402 may determine that the instruction has been input while a hand of the main operator 801 is placed on a predetermined position on the platen 204.
Then, in step S905, the main control unit 402 determines whether the identified operator is authorized, based on an authority management table 1901 in which the authority of the sub operator 802 is defined according to whether the permission instruction has been input.
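Condensed into code, the flow of steps S1701, S1702, and S905 might read as follows; the table contents are again illustrative.

```python
def is_operation_accepted(operator, item, permission_given):
    """Third embodiment (sketch of steps S1701, S1702 and table 1901):
    a gesture by the sub operator is examined for authority only while
    the main operator's permission instruction is being input.
    """
    if operator == "sub" and not permission_given:
        return False                                      # step S1702: gesture ignored
    table = {"check_box": {"main": False, "sub": True}}   # contents illustrative
    return table.get(item, {}).get(operator, False)       # step S905

print(is_operation_accepted("sub", "check_box", permission_given=True))   # True
print(is_operation_accepted("sub", "check_box", permission_given=False))  # False
```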
As described above, the sub operator 802 can press the check button only while the main operator 801 is giving the permission instruction.
As described above, according to the present exemplary embodiment, the main operator 801 can restrict an operation performed by the sub operator 802, and thus an erroneous operation can be prevented from being performed due to an unintentional operation performed by the sub operator 802.
In the exemplary embodiments described above, higher user operability can be achieved in a data processing apparatus such as the camera scanner 101, with which a plurality of operators can simultaneously operate a single operation screen including a plurality of operation items. More specifically, an operator who has performed a gesture operation on a projected operation screen is identified based on a range image. Then, whether the operation is permitted is controlled for each of the operation items in the operation screen in accordance with the identified operator. Thus, a display item that can be operated by one operator of a plurality of operators can be prevented from being freely operated by the other operator.
The exemplary embodiments are described above. However, the present invention is not limited to the particular exemplary embodiments, and can be modified and changed in various ways without departing from the spirit of the present invention described in claims.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2014-167828, filed Aug. 20, 2014, which is hereby incorporated by reference herein in its entirety.
Number | Date | Country | Kind
--- | --- | --- | ---
2014-167828 | Aug. 20, 2014 | JP | national