The present invention relates to a cooperative control device, a cooperative control method, and a cooperative control program. Specifically, the present invention relates to a cooperative control device, a cooperative control method, and a cooperative control program that are capable of, in two orientation means used for receiving optical signals from a facing node, causing an orientation angle of a first orientation means to reach a target angle of the first orientation means and cancel an orientation angle error of a second orientation means by using an integration result of sequentially integrating an orientation angle error of the first orientation means at each calculation cycle, the orientation angle error of the first orientation means being obtained by applying kinematics to the orientation angle error of the second orientation means.
A related technology described in NPL 1 “A Cooperative Control Algorithm of the On Board Tracking Control System for Free-Space Optical Communications” (Motoaki Shimizu et al., the 49th Aircraft Symposium) includes two orientation means, namely, a first orientation means (e.g., a gimbal), and a second orientation means (e.g., a fine pointing mechanism (FPM)) provided on the first orientation means for fine pointing, as orientation means that enable tracking and receiving of optical signals from a facing node when conducting free-space optical communications with the facing node. Here, a relation represented by the following kinematic and geometric transformation expression presented as expression (1) is established between an orientation angle error α of the second orientation means, an orientation angle θg of the first orientation means, and a target angle θ′g of the first orientation means.
[Mathematical Expression 1]
θ′g = sin⁻¹(√(1 − tan²α)·sin θg − tan α·cos θg)  (1)
Note that expression (1) is presented for only one axis for simplification, where:
α is an orientation angle error of the second orientation means,
θg is an orientation angle of the first orientation means obtained from position information on a node of the device and a facing node (e.g., position information measured by a global positioning system (GPS)), and
θ′g is a target angle of the first orientation means obtained by adding the orientation angle θg of the first orientation means to a value obtained by kinematically transforming the orientation angle error α of the second orientation means to an orientation angle error of the first orientation means.
NPL 1 discloses that the orientation angle error α of the second orientation means can be canceled by converting the orientation angle error α of the second orientation means to the orientation angle error (θ′g−θg) of the first orientation means using expression (1), and controlling the orientation angle of the first orientation means to change from θg to the target angle θ′g. Expression (1) is calculated at each predetermined calculation cycle of a central processing unit (CPU).
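The single-axis conversion of expression (1) can be sketched numerically as follows. This is an illustrative sketch only, assuming angles in radians; the function and variable names are not from NPL 1.

```python
import math

def target_angle_npl1(alpha, theta_g):
    """Expression (1): kinematically convert the orientation angle error
    alpha of the second orientation means into a target angle theta_g'
    of the first orientation means (single axis, angles in radians)."""
    return math.asin(math.sqrt(1.0 - math.tan(alpha) ** 2) * math.sin(theta_g)
                     - math.tan(alpha) * math.cos(theta_g))

theta_g = math.radians(30.0)   # gimbal angle obtained from position information
alpha = math.radians(0.5)      # orientation angle error of the second orientation means
theta_g_prime = target_angle_npl1(alpha, theta_g)
gimbal_error = theta_g_prime - theta_g   # error (theta_g' - theta_g) for the gimbal to absorb
```

With α = 0 the expression reduces to θ′g = θg, so no correction of the first orientation means is requested.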
NPL 1: Motoaki Shimizu et al. (NEC Corporation), “A Cooperative Control Algorithm of the On Board Tracking Control System for Free-Space Optical Communications”, the 49th Aircraft Symposium, October 2011.
The technique described in NPL 1, however, starts canceling the orientation angle error α of the second orientation means as illustrated by the long-dashed line 12 in
α = θ′g − θg
is assumed to simplify explanation.
In view of the above-described problem, it is an object of the present invention to provide a cooperative control device, a cooperative control method, and a cooperative control program that are capable of, in two orientation means used for receiving optical signals, causing an orientation angle of a first orientation means to surely reach a target angle of the first orientation means and reliably cancel an orientation angle error of a second orientation means.
In order to solve the above-described problem, a cooperative control device, a cooperative control method, and a cooperative control program according to the present invention mainly employ characteristic configurations as described below.
(1) A cooperative control device according to the present invention cooperatively controls two orientation means of a first orientation means and a second orientation means included as orientation means used for receiving optical signals from a facing node. The cooperative control device includes at least:
a position information acquisition means for acquiring position information on a node of the device and the facing node, or both of the position information and attitude information on the node and the facing node;
an orientation angle generation means for calculating a position-information-based target angle of the first orientation means using the position information on the node and the facing node acquired by the position information acquisition means, or both of the position information and the attitude information on the node and the facing node; and
a conversion means for converting the position-information-based target angle of the first orientation means calculated by the orientation angle generation means to a kinematics-used target angle of the first orientation means by applying kinematics to an orientation angle error of the second orientation means, the kinematics-used target angle of the first orientation means being used for removing the orientation angle error of the second orientation means.
The cooperative control device further includes:
an integration means for integrating a target error angle of the first orientation means obtained by subtracting the position-information-based target angle of the first orientation means from the kinematics-used target angle of the first orientation means converted by the conversion means; and
an addition means for adding the target error angle of the first orientation means integrated by the integration means to the position-information-based target angle of the first orientation means, to output a target angle of the first orientation means for cooperative control, the target angle for cooperative control being capable of removing the orientation angle error of the second orientation means.
(2) A cooperative control method according to the present invention cooperatively controls two orientation means of a first orientation means and a second orientation means included as orientation means used for receiving optical signals from a facing node. The cooperative control method includes at least:
a step for acquiring position information on a node of the device and the facing node, or both of the position information and attitude information on the node and the facing node;
a step for generating an orientation angle by calculating a position-information-based target angle of the first orientation means using the position information on the node and the facing node acquired by the step for acquiring, or both of the position information and the attitude information on the node and the facing node;
a step for converting the position-information-based target angle of the first orientation means calculated by the step for generating to a kinematics-used target angle by applying kinematics to an orientation angle error of the second orientation means, the kinematics-used target angle being used for removing the orientation angle error of the second orientation means.
The cooperative control method further includes:
a step for integrating a target error angle of the first orientation means obtained by subtracting the position-information-based target angle of the first orientation means from the kinematics-used target angle of the first orientation means converted by the step for converting; and a step for adding the target error angle of the first orientation means integrated by the step for integrating to the position-information-based target angle of the first orientation means, to output a target angle of the first orientation means for cooperative control, the target angle being capable of removing the orientation angle error of the second orientation means.
(3) A cooperative control program according to the present invention can cause a computer to implement at least the cooperative control method described in (2).
The cooperative control device, the cooperative control method, and the cooperative control program of the present invention provide the following advantageous effects.
In two orientation means of a first orientation means and a second orientation means used for receiving optical signals from a facing node, an orientation angle error of the first orientation means is obtained by applying kinematics to an orientation angle error of the second orientation means. The obtained orientation angle error of the first orientation means is integrated as a target error angle, and then a result of the integration is used to calculate a target angle of the first orientation means. This operation makes it possible to ensure that the orientation angle of the first orientation means reaches the target angle of the first orientation means, and that the orientation angle error of the second orientation means is canceled.
A preferred exemplary embodiment of a cooperative control device, a cooperative control method, and a cooperative control program according to the present invention will now be described with reference to the accompanied drawings. While the following describes the cooperative control device and the cooperative control method according to the present invention, it is obvious that the cooperative control method may be implemented as a cooperative control program that is capable of causing a computer to carry out the cooperative control method, or may be recorded in a computer-readable recording medium.
(Feature of Present Invention)
An outline of features of the present invention is described before describing the exemplary embodiment of the present invention. The present invention includes two orientation means, namely, a first orientation means (e.g., a gimbal), and a second orientation means (e.g., an FPM) provided on the first orientation means for fine pointing, as orientation means that enable tracking and receiving of optical signals from a facing node when conducting free-space optical communications with the facing node, as in NPL 1. The present invention, however, is different from NPL 1 in that the main feature of the present invention is that an orientation angle of the first orientation means can reach a target angle of the first orientation means by: obtaining an orientation angle error of the first orientation means by applying kinematics to an orientation angle error of the second orientation means, integrating the obtained orientation angle error of the first orientation means, and calculating the target angle of the first orientation means using a result of the integration.
More specifically, the present invention employs the following scheme. First, a position-information-based target angle of the first orientation means is calculated using position information on a node of the device and a facing node. The calculated position-information-based target angle of the first orientation means is converted, by applying kinematics, to a kinematics-used target angle of the first orientation means to remove an orientation angle error of the second orientation means.
Subsequently, the position-information-based target angle of the first orientation means is subtracted from the converted kinematics-used target angle of the first orientation means to obtain a target error angle of the first orientation means. The target error angle of the first orientation means is sequentially integrated at each calculation cycle.
A result of integrating the target error angle at each calculation cycle is added with the position-information-based target angle of the first orientation means to output a target angle of the first orientation means for cooperative control, the target angle being capable of removing the orientation angle error of the second orientation means. This operation is the main feature of the present invention.
The above operation can provide an advantageous effect that ensures a cancellation of the orientation angle error of the second orientation means.
(Example Configuration of Exemplary Embodiment)
The following describes an example configuration of an exemplary embodiment of the cooperative control device according to the present invention with reference to the block diagram of
As illustrated in the node 1 side in
The second orientation means 4 rotates a mirror thereof about two axes, the mirror receiving light for optical communication generated by the laser generator 13 at the facing node 2 side. The first orientation means 3 further rotates the second orientation means 4 about two axes. The first orientation means angle sensor 5 detects a rotation angle of the first orientation means 3, and is configured using a resolver, for example. Here, the rotation angle of the first orientation means 3 detected by the first orientation means angle sensor 5 is input to the first controller 11 as a current angle of the first orientation means 3. The second orientation means angle sensor 6 directly detects a tilt (rotation angle) of the mirror of the second orientation means 4, and is provided on the second orientation means 4.
The device node position information acquisition unit 8 acquires position information on the node 1 (e.g., position information acquired by a GPS). The facing node position information acquisition unit 14 transmits, to the facing node 2, the position information on the node 1 acquired by the device node position information acquisition unit 8 at the node 1 side as facing-node position information. In the same manner, the facing node position information acquisition unit 14 at the facing node 2 side transmits the position information on the facing node 2 to the node 1 as facing-node position information. The position-information-based orientation angle generator 15 generates a position-information-based target angle of the first orientation means 3 using the facing-node position information acquired from the facing node position information acquisition unit 14 at the facing node 2 side and device node position information acquired by the device node position information acquisition unit 8.
The first controller 11 performs phase compensation using: the position-information-based target angle of the first orientation means 3 output from the position-information-based orientation angle generator 15 through the adder 18, or a kinematics-used target angle of the first orientation means 3 obtained by the adder 18 adding the position-information-based target angle of the first orientation means 3 and a result of the integration by the integrator 17; and an angle (defined as a current angle) detected by the first orientation means angle sensor 5. The first controller 11 then generates a control signal for the first orientation means 3 and outputs the control signal to the first orientation means 3 via the first driver 12 to orient the first orientation means 3 at the target angle. Consequently, light for optical communication emitted by the laser generator 13 at the facing node 2 side is incident on and reflected by the mirror of the second orientation means 4 and is directed to the light-receiving sensor 7 (e.g., a quadrisected light-receiving element).
The second controller 9 sets a target angle of the second orientation means 4 to zero degrees, generates a current angle that is an orientation angle error of the second orientation means 4 by selecting either an output of the light-receiving sensor 7 or an output of the second orientation means angle sensor 6, and performs phase compensation using the target angle and the current angle of the second orientation means 4. The second controller 9 then generates a control signal for the second orientation means 4 and outputs the control signal to the second orientation means 4 via the second driver 10 to orient the second orientation means 4 at the target angle (in this case, zero degrees).
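The selection logic of the second controller 9 can be sketched for one axis as follows; the helper name and signature are illustrative, not part of the embodiment.

```python
def second_controller_error(light_sensor_angle, fpm_sensor_angle, use_light_sensor):
    """Sketch of the second controller 9 for one axis: the target angle of
    the second orientation means is fixed at zero degrees, and the current
    angle (its orientation angle error) is taken from either the
    light-receiving sensor 7 or the second orientation means angle sensor 6."""
    target_angle = 0.0
    current_angle = light_sensor_angle if use_light_sensor else fpm_sensor_angle
    # The difference is fed to the phase compensation that generates the
    # control signal output through the second driver 10.
    return target_angle - current_angle
```

Either sensor path yields the same structure: the controller always drives the current angle of the second orientation means toward zero degrees.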
The converter 16 converts the position-information-based target angle of the first orientation means 3, generated by the position-information-based orientation angle generator 15, to the kinematics-used target angle of the first orientation means 3 by applying kinematics to the orientation angle error of the second orientation means 4. This conversion is performed to remove the orientation angle error of the second orientation means 4, based on the tilt (rotation angle) of the mirror (that is, the orientation angle error of the second orientation means 4) detected by the second orientation means angle sensor 6 and the rotation angle of the first orientation means 3 detected by the first orientation means angle sensor 5.
The integrator 17 sequentially integrates, at each predetermined calculation cycle, a target error angle obtained by subtracting the position-information-based target angle of the first orientation means 3 generated by the position-information-based orientation angle generator 15 from the kinematics-used target angle of the first orientation means 3 converted by the converter 16. Consequently, the adder 18 adds a result of the integration by the integrator 17 to the position-information-based target angle of the first orientation means 3 output from the position-information-based orientation angle generator 15, thereby outputting, to the first controller 11, a result of the addition as a target angle for the first controller 11, that is, as a target angle of the first orientation means 3 for cooperative control, the target angle being capable of removing the orientation angle error of the second orientation means 4.
When rotation angles of the two respective axes of the first orientation means 3 that are output from the first orientation means angle sensor 5 are defined as an azimuth (AZ) axis Ψ and an elevation (EL) axis θ, the target angles Ψg and θg of the first orientation means 3 generated by the position-information-based orientation angle generator 15 are given by the following expressions (2) and (3). The target angles Ψg and θg of the first orientation means 3 given by expressions (2) and (3) are referred to as “position-information-based target angles” in the present exemplary embodiment.
The variables a, b, and c are, as described in NPL 1, defined by a target vector from the device node position to the facing node position, represented by expressions (4) and (5).
[Mathematical Expression 3]
[a b c]ᵀ  (4)
l_t = √(a² + b² + c²)  (5)
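The target-vector length of expression (5) can be checked with a short sketch (illustrative names; a, b, and c are the components of the target vector in expression (4)):

```python
import math

def target_vector_norm(a, b, c):
    """Expression (5): length l_t of the target vector [a b c]^T pointing
    from the device node position to the facing node position."""
    return math.sqrt(a * a + b * b + c * c)

l_t = target_vector_norm(3.0, 4.0, 12.0)   # a 3-4-12 right-triangle example
```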
If the orientation angle of the first orientation means 3 is correctly oriented at the target angles Ψg and θg represented by expressions (2) and (3), that is, the “position-information-based target angles”, it is regarded that the second controller 9 controls both axes of the second orientation means 4 at zero degrees using outputs from the light-receiving sensor 7. At the same time, it is regarded that outputs from the second orientation means angle sensor 6 are zero degrees for both axes.
Subsequently, it is assumed that one of the facing node 2 and the node 1 moves, an optical axis of a light beam emitted by the laser generator 13 in the facing node 2 shifts to tilt the mirror of the second orientation means 4, and the rotation angles of the two axes output from the second orientation means angle sensor 6 change, that is, the X axis changes from zero degrees to α degrees (≠0) and the Y axis changes from zero degrees to β degrees (≠0). In this case, the target angles Ψ′g and θ′g of the first orientation means 3 that can cancel these rotation angles (that is, the orientation angle errors) of the two axes of the second orientation means 4 through the kinematic relation between the first orientation means 3 and the second orientation means 4 are given by the following expressions (6) and (7), as described in NPL 1. The target angles Ψ′g and θ′g of the first orientation means 3 given by expressions (6) and (7) are referred to as “kinematics-used target angles” in the present exemplary embodiment.
As described above, in order to solve the problem of a conventional technique that an orientation angle of the first orientation means 3 cannot reach a target angle and an orientation angle error α of the second orientation means 4 cannot be canceled, the present exemplary embodiment includes: the integrator 17 that integrates target error angles obtained by subtracting the “position-information-based target angles Ψg and θg” of the first orientation means 3 given by expressions (2) and (3) from the “kinematics-used target angles Ψ′g and θ′g” of the first orientation means 3 given by expressions (6) and (7), and the adder 18 that adds a result of the integration by the integrator 17 to the “position-information-based target angles Ψg and θg” of the first orientation means 3.
The following describes a procedure of integration by the integrator 17 and a procedure of addition by the adder 18. As represented by the following expressions (8) and (9), subtraction processing of “kinematics-used target angles Ψ′g and θ′g”—“position-information-based target angles Ψg and θg” is performed at each calculation cycle and a result of the subtraction is integrated by the integrator 17 as a target error angle at each calculation cycle.
[Mathematical Expression 5]
Δψ′gsum = Σ(ψ′g − ψg)  (8)
Δθ′gsum = Σ(θ′g − θg)  (9)
As represented by the following expressions (10) and (11), the target angles Ψ′a and θ′a of the first orientation means 3 for cooperative control are obtained by the adder 18 adding integration results ΔΨ′gsum and Δθ′gsum given by expressions (8) and (9) to the “position-information-based target angles Ψg and θg”.
[Mathematical Expression 6]
ψ′a = Δψ′gsum + ψg  (10)
θ′a = Δθ′gsum + θg  (11)
When the power is turned on, the integration results ΔΨ′gsum and Δθ′gsum from the integrator 17 are initialized as represented by the following expression (12).
[Mathematical Expression 7]
Δψ′gsum = Δθ′gsum = 0  (12)
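The integrator 17 and the adder 18 of expressions (8) to (12) can be sketched for one AZ/EL axis pair as follows; the class and variable names are illustrative, not taken from the embodiment.

```python
class CooperativeTargetIntegrator:
    """Sketch of the integrator 17 and adder 18 for one AZ/EL axis pair.

    At each calculation cycle the target error angle (kinematics-used
    target minus position-information-based target) is accumulated per
    expressions (8) and (9), and the sums are added back onto the
    position-information-based target angles per expressions (10) and (11).
    """

    def __init__(self):
        # Expression (12): integration results start at zero at power-on.
        self.d_psi_sum = 0.0
        self.d_theta_sum = 0.0

    def update(self, psi_g, theta_g, psi_g_kin, theta_g_kin):
        # Expressions (8) and (9): integrate the target error angles.
        self.d_psi_sum += psi_g_kin - psi_g
        self.d_theta_sum += theta_g_kin - theta_g
        # Expressions (10) and (11): target angles for cooperative control.
        psi_a = self.d_psi_sum + psi_g
        theta_a = self.d_theta_sum + theta_g
        return psi_a, theta_a
```

On the first cycle after cooperative control is turned on, a target error angle of α yields θ′a = θg + α; each further disturbance of α raises the sum by another α, giving θg + 2α, θg + 3α, and so on, which is the behavior of expressions (14), (16), and (17) below.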
The “kinematics-used target angles Ψ′g and θ′g” of the first orientation means 3 given by expressions (6) and (7) are calculated and output by the converter 16 illustrated in
(Explanation of Operation in Exemplary Embodiment)
The following describes example operation of the cooperative control device 100 illustrated in
In the calculation flowchart illustrated in
Next, an instruction of cooperative control between coarse control and fine control is input as a command input by a user, and whether the cooperative control is turned on is checked (step S4). If the cooperative control is not turned on (No at step S4), the calculation at step S5 is skipped and the process proceeds to step S6. If the cooperative control is turned on (Yes at step S4), the calculation at step S5 is performed.
When the process proceeds to step S5, calculations of the above expressions (6) and (7) are performed to calculate the “kinematics-used target angles Ψ′g and θ′g”, and then calculations of the above expressions (8) and (9) are performed to integrate, at each calculation cycle, the target error angle obtained by the subtraction processing of the “kinematics-used target angles Ψ′g and θ′g”—the “position-information-based target angles Ψg and θg”. Subsequently, calculations of expressions (10) and (11) are performed to add the integration results given by expressions (8) and (9) to the “position-information-based target angles Ψg and θg” to calculate the target angles Ψ′a and θ′a of the first orientation means 3 for cooperative control (step S5).
Next, calculation on time, represented by the following expression (13), is performed to update a calculation cycle to the next cycle. After the calculation cycle is updated (step S6), the process returns to step S3.
[Mathematical Expression 8]
t = t + Δt  (13)
where Δt is a time interval at which a predetermined calculation cycle is given.
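The loop of steps S3 to S6 can be sketched as follows, with the per-step calculations passed in as hypothetical callbacks (step_s3 and step_s5 are placeholder names, not names from the flowchart):

```python
def run_cycles(n_cycles, dt, cooperative_on, step_s3, step_s5):
    """Sketch of the calculation flow: step S3 runs every cycle, step S5
    (expressions (6)-(11)) runs only while cooperative control is on, and
    step S6 advances time per expression (13)."""
    t = 0.0
    for _ in range(n_cycles):
        step_s3(t)           # step S3: position-information-based target angles
        if cooperative_on:   # step S4: cooperative-control command check
            step_s5(t)       # step S5: kinematics conversion, integration, addition
        t = t + dt           # step S6: t = t + Δt, expression (13)
    return t
```

When cooperative control is off, step S5 is skipped and only the position-information-based target angles drive the first orientation means.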
Although not illustrated clearly in the calculation flowchart in
The following further describes, with reference to the explanatory diagram of
In the explanatory diagram of
At and before a time t0, which is during an initial state, it is assumed that cooperative control is turned off, the orientation angle (that is, the orientation angle error) of the second orientation means 4 is α, and the orientation angle of the first orientation means 3 is θg represented by the above expression (3).
At the time t0, cooperative control is turned on in response to an instruction from a user. In the cycle 1, one integration operation using the above expressions (7), (9), and (11) is performed, and the target angle θ′a of the first orientation means 3 for cooperative control is set to an angle given by the following expression (14) as indicated by the dashed-dotted line 14 in
[Mathematical Expression 9]
θ′a = θg + α  (14)
To simplify explanation, the orientation angle α of the second orientation means 4 is assumed to be given by the following expression (15). Note that θ′a is the target angle for cooperative control after the first integration following the turn-on of cooperative control.
[Mathematical Expression 10]
α = θ′a − θg  (15)
A calculation cycle applied in the explanatory diagram of
At the same time, at the time t1 when a cycle 2 starts, one integration operation using the above expressions (7), (9), and (11) is performed again. At the time t1, however, the orientation angle of the second orientation means 4 is ‘0’ as indicated by the long-dashed line 12 in
Furthermore, at the time t1, the orientation angle of the first orientation means 3 catches up with the target angle θ′a of the first orientation means 3 for cooperative control as indicated by the solid line 11 in
Here, it is assumed that, at a time t2 when the cycle 2 ends and a cycle 3 starts, one of the facing node 2 and the node 1 moves and an optical axis of a light beam emitted by the laser generator 13 in the facing node 2 shifts. By the shift, an error angle occurs at the light-receiving sensor 7 and the orientation angle of the second orientation means 4 changes from ‘0’ to α. At the same time, at the time t2 when the cycle 3 starts, one integration operation using expressions (7), (9), and (11) is performed again. This operation changes the target angle θ′a of the first orientation means 3 for cooperative control by adding, to the angle calculated by expression (14) in the cycle 1, the orientation angle α of the second orientation means 4 changed at the time t2. The changed target angle θ′a of the first orientation means 3 for cooperative control is given by the following expression (16) as indicated by the dashed-dotted line 14 in
[Mathematical Expression 11]
θ′a = θg + 2α  (16)
Consequently, in the period of the cycle 3, the orientation angle of the first orientation means 3 moves toward the target angle θ′a given by expression (16). The orientation angle of the first orientation means 3 for cooperative control thus reaches the target angle θ′a of the first orientation means 3 for cooperative control given by expression (16) at the time t3 when the cycle 3 of the calculation cycle ends, as indicated by the solid line 11 in
At the same time, at the time t3 when a cycle 4 starts, one integration operation using the above expressions (7), (9), and (11) is performed again. At the time t3, however, the orientation angle of the second orientation means 4 is ‘0’ as indicated by the long-dashed line 12 in
Furthermore, at the time t3, the orientation angle of the first orientation means 3 catches up with the target angle θ′a of the first orientation means 3 for cooperative control as indicated by the solid line 11 in
Here, it is assumed that, at a time t4 when the cycle 4 ends and a cycle 5 starts, one of the facing node 2 and the node 1 moves and an optical axis of a light beam emitted by the laser generator 13 in the facing node 2 shifts. By the shift, an error angle occurs at the light-receiving sensor 7 and the orientation angle of the second orientation means 4 changes from ‘0’ to α. At the same time, at the time t4 when the cycle 5 starts, one integration operation using expressions (7), (9), and (11) is performed again. This operation changes the target angle θ′a of the first orientation means 3 for cooperative control by adding, to the angle calculated by expression (16) in the cycle 3, the orientation angle α of the second orientation means 4 changed at the time t4. The changed target angle θ′a of the first orientation means 3 for cooperative control is given by the following expression (17) as indicated by the dashed-dotted line 14 in
[Mathematical Expression 12]
θ′a = θg + 3α  (17)
Consequently, in the period of the cycle 5, the orientation angle of the first orientation means 3 moves toward the target angle θ′a given by expression (17). The orientation angle of the first orientation means 3 for cooperative control thus reaches the target angle θ′a of the first orientation means 3 for cooperative control given by expression (17) at the time t5 when the cycle 5 of the calculation cycle ends, as indicated by the solid line 11 in
At the same time, at the time t5 when a cycle 6 starts, one integration operation using the above expressions (7), (9), and (11) is performed again. At the time t5, however, the orientation angle of the second orientation means 4 is ‘0’ as indicated by the long-dashed line 12 in
Furthermore, at the time t5, the orientation angle of the first orientation means 3 catches up with the target angle θ′a of the first orientation means 3 for cooperative control as indicated by the solid line 11 in
As described above, this configuration sequentially changes calculation expressions according to the respective calculation cycles, that is, in the order of expression (14) at the cycle 1, expression (16) at the cycle 3, and expression (17) at the cycle 5 for a target angle θ′a of the first orientation means 3 for cooperative control, by sequentially integrating, at each calculation cycle, the orientation angle of the first orientation means 3 obtained based on position information with an orientation angle error of the first orientation means 3 obtained by applying kinematics to an orientation angle error of the second orientation means 4. This configuration enables the orientation angle of the first orientation means 3 to reach the target angle θ′a of the first orientation means 3 for cooperative control, and the orientation angle error of the second orientation means 4 to be canceled to ‘0’.
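The staircase behavior described above can be reproduced with a small numerical sketch (illustrative values; disturbances of α are injected at the start of cycles 3 and 5, matching the times t2 and t4):

```python
theta_g = 1.0   # position-information-based target angle (illustrative)
alpha = 0.2     # orientation angle error injected by each disturbance

d_theta_sum = 0.0
targets = []
for cycle in range(1, 7):
    # The error alpha is present at the start of cycle 1 (initial state)
    # and reappears at the start of cycles 3 and 5; it has already been
    # canceled at the start of cycles 2, 4, and 6.
    fpm_error = alpha if cycle in (1, 3, 5) else 0.0
    d_theta_sum += fpm_error              # expression (9): integrate the error
    targets.append(d_theta_sum + theta_g)  # expression (11): cooperative target

# targets steps through theta_g + alpha, theta_g + 2*alpha, theta_g + 3*alpha,
# i.e. expressions (14), (16), and (17) at cycles 1, 3, and 5.
```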
In other words, a cooperative control method in a conventional technique has a problem, as indicated in
The following further describes, with reference to
Note that, as in the explanatory diagram of
Subsequently, at the time t1 when the calculation cycle 1 ends, as in the description of
At the same time, at the time t1 when the cycle 2 starts, one integration operation using the above expressions (7), (9), and (11) is performed again. At the time t1, however, as in the description of
Furthermore, at the time t1, as in the description of
As described above, the case of
At and before the time t0′, which is during an initial state, as in the cases of
At the time t0′, cooperative control is turned on in response to an instruction from a user. In the cycle 1′, as in the cases of
At the time t1′ when the calculation cycle 1′ ends, because the calculation cycle is set to a half (½) of the calculation cycle in
In this configuration, at the time t1′ when the cycle 1′ of the calculation cycle ends, the orientation angle α of the second orientation means 4 is not canceled, that is, remains at (α/2) without being reduced to ‘0’ as indicated by the long-dashed line 12 in
At the same time, at the time t1′ when the cycle 2′ starts, one integration operation using the above expressions (7), (9), and (11) is performed again. At the time t1′, however, the orientation angle of the second orientation means 4 remains at (α/2) as indicated by the long-dashed line 12 in
At the time t2′ when the cycle 2′ of the calculation cycle ends, because the calculation cycle is set to a half (½) of the calculation cycle in
In this configuration, at the time t2′ when the cycle 2′ of the calculation cycle ends, the orientation angle (α/2) of the second orientation means 4 is canceled to ‘0’ as indicated by the long-dashed line 12 in
At the same time, at the time t2′ when a cycle 3′ starts, one integration operation using the above expressions (7), (9), and (11) is performed again. At the time t2′, however, the orientation angle of the second orientation means 4 is ‘0’ as indicated by the long-dashed line 12 in
At this point, at the time t2′, the orientation angle of the first orientation means 3 has not caught up with the target angle θ′a of the first orientation means 3 for cooperative control as indicated by the solid line 11 in
In this configuration, the orientation angle of the second orientation means 4 is canceled to ‘0’ at the time t2′ when the cycle 2′ of the calculation cycle ends as indicated by the long-dashed line 12 in
At the same time, at the time t3′ when a cycle 4′ starts, one integration operation using the above expressions (7), (9), and (11) is performed again. At the time t3′, however, the orientation angle of the second orientation means 4 has undershot ‘0’ and reaches −(α/2) as indicated by the long-dashed line 12 in
At the time t4′ when the cycle 4′ of the calculation cycle ends, the orientation angle of the first orientation means 3 reaches the target angle θ′a of the first orientation means 3 for cooperative control given by expression (20), as indicated by the solid line 11 in
At the same time, at the time t4′ when a cycle 5′ starts, one integration operation using the above expressions (7), (9), and (11) is performed again. At the time t4′, however, the orientation angle of the second orientation means 4 is ‘0’ as indicated by the long-dashed line 12 in
Furthermore, at the time t4′, the orientation angle of the first orientation means 3 catches up with the target angle θ′a of the first orientation means 3 for cooperative control as indicated by the solid line 11 in
In the case of
As described above, when a calculation cycle is set to a half (½) of the calculation cycle in
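Under small-angle assumptions, the half-cycle walkthrough above can be reproduced numerically. In this hypothetical one-axis sketch (names are illustrative, not from the specification), the first orientation means is rate-limited to α/2 per half-cycle while the integrator runs at every half-cycle; the residual error of the second orientation means then follows the sequence α, α/2, 0, −α/2, 0 described for t0′ through t4′, including the undershoot.

```python
def simulate_half_cycle(alpha=1.0, steps=6):
    # Hypothetical one-axis, small-angle model of the walkthrough above.
    integral = 0.0   # state of the integrator 17
    pos = 0.0        # orientation angle of the first orientation means 3
    errors = []      # residual error of the second orientation means 4
    for _ in range(steps):
        err = alpha - pos            # second-means error observed now
        errors.append(err)
        integral += err              # integration at every half-cycle
        target = integral            # commanded target angle
        # the actuator covers at most alpha/2 per half-cycle
        move = max(-alpha / 2, min(alpha / 2, target - pos))
        pos += move
    return errors

# simulate_half_cycle() -> [1.0, 0.5, 0.0, -0.5, 0.0, 0.0]
```

Because the integrator accumulates the residual (α/2) before the actuator has finished executing the previous command, the target overshoots and the error swings to −(α/2) before settling, which is the slow convergence described above.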
A solution to such a slow convergence is to stop the integration operation performed by the integrator 17 at each calculation cycle until the orientation angle of the first orientation means 3 reaches the target angle θ′a of the first orientation means 3 for cooperative control. In other words, stopping the integration prevents the target angle θ′a of the first orientation means 3 from being overshot, thereby resolving the slow convergence.
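The fix can also be modeled numerically. In this hypothetical one-axis, small-angle sketch (all names and the tolerance are illustrative, not from the specification), the first orientation means is rate-limited to α/2 per half-cycle, and the integrator accumulates only once the first orientation means has reached its previously commanded target; the undershoot to −(α/2) then disappears and the residual error settles at ‘0’ without oscillation.

```python
def simulate_gated(alpha=1.0, steps=6, tol=1e-9):
    # Hypothetical model of the fix: the integration is stopped until the
    # first orientation means has reached the previously commanded target.
    integral, pos = 0.0, 0.0
    target = None    # previously commanded target angle (None initially)
    errors = []      # residual error of the second orientation means 4
    for _ in range(steps):
        err = alpha - pos
        errors.append(err)
        # integrate only once the previous target has been reached
        if target is None or abs(pos - target) <= tol:
            integral += err
        target = integral
        # the actuator covers at most alpha/2 per half-cycle
        move = max(-alpha / 2, min(alpha / 2, target - pos))
        pos += move
    return errors

# simulate_gated() -> [1.0, 0.5, 0.0, 0.0, 0.0, 0.0]
```

Compared with integrating at every half-cycle, gating the integrator removes the −(α/2) undershoot, so the error converges to ‘0’ two half-cycles earlier in this model.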
While the above exemplary embodiment describes a case in which the position-information-based orientation angle generator 15 of the cooperative control device 100 illustrated in
While the description of the above exemplary embodiment has been made on the assumption that the orientation means includes two rotation axes, the present invention is not limited to this exemplary embodiment. In other words, the orientation means may include three or more axes. In this case, completely the same cooperative control as that of the above-described exemplary embodiment may be performed by controlling the converter 16 in the cooperative control device 100 illustrated in
While the description of the above exemplary embodiment has been made on the assumption that the orientation means includes two orientation means of the first orientation means 3 and the second orientation means 4, the present invention is not limited to this exemplary embodiment. In other words, the orientation means may include three or more orientation means. In this case, completely the same cooperative control as that of the above-described exemplary embodiment may be performed by controlling the converter 16 in the cooperative control device 100 illustrated in
(Explanation of Advantageous Effects of Exemplary Embodiment)
As described above, the present embodiment provides the following advantageous effects.
In the two orientation means used for receiving optical signals from the facing node 2, namely the first orientation means 3 and the second orientation means 4, an orientation angle error of the first orientation means 3 is obtained by the converter 16 by applying kinematics to the orientation angle error α of the second orientation means 4; the obtained orientation angle error of the first orientation means 3 is integrated by the integrator 17, at each calculation cycle, as a target error angle; and the integration result is then used by the adder 18 to calculate a target angle θ′a of the first orientation means 3 for cooperative control. This operation makes it possible to ensure that the orientation angle of the first orientation means 3 reaches the target angle θ′a of the first orientation means 3 for cooperative control, and that the orientation angle error α of the second orientation means 4 is canceled.
Furthermore, the orientation angle error can be canceled more finely, for example, by using not only position information on the node 1 and the facing node 2 but also attitude information on the node 1 and the facing node 2 when generating the position-information-based target angle θg of the first orientation means 3, by using two axes or three or more axes as the rotation axes of the first orientation means 3 and the second orientation means 4, and by using two orientation means, that is, the first orientation means 3 and the second orientation means 4, or three or more orientation means as the orientation means.
While certain configurations of a preferred exemplary embodiment according to the present invention have been described above, the exemplary embodiment has been presented by way of example only, and is not intended to limit the scope of the invention. It will be understood by those of ordinary skill in the art that various changes and modifications according to specific use may be made therein without departing from the spirit and scope of the present invention.
This application is based upon and claims the benefit of priority from Japanese patent application No. 2012-163539, filed on Jul. 24, 2012, the disclosure of which is incorporated herein in its entirety by reference.
1 node (of the device)
2 facing node
3 first orientation means
4 second orientation means
5 first orientation means angle sensor
6 second orientation means angle sensor
7 light-receiving sensor
8 device node position information acquisition unit
9 second controller
10 second driver
11 first controller
12 first driver
13 laser generator
14 facing node position information acquisition unit
15 position-information-based orientation angle generator
16 converter
17 integrator
18 adder
100 cooperative control device
11 solid line
12 long-dashed line
13 short-dashed line
14 dashed-dotted line
Number | Date | Country | Kind
---|---|---|---
2012-163539 | Jul 2012 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2013/001974 | Mar 22, 2013 | WO | 00