The present disclosure relates to an inspection apparatus, an inspection method, and a program, and relates, in particular, to an inspection apparatus, an inspection method, and a program that allow for more accurate inspection results to be acquired.
An inspection apparatus is hitherto known that inspects vegetation, such as the state and activity of a plant raised at a certain location (refer, for example, to PTL 1).
JP 2003-9664 A
However, it has been difficult to acquire accurate inspection results with such an inspection apparatus.
The present disclosure has been devised in light of the foregoing, and it is an object of the disclosure to allow for acquisition of more accurate inspection results.
An inspection apparatus of an aspect of the present disclosure includes a calculation process section. The calculation process section calculates an inspection value for inspecting an inspection target by using sensing data having conflicting inspection-related-angle-dependent components such that the inspection-related-angle-dependent components included in sensing data acquired by sensing the inspection target are reduced.
An inspection method of an aspect of the present disclosure includes calculating an inspection value for inspecting an inspection target by using sensing data having conflicting inspection-related-angle-dependent components such that the inspection-related-angle-dependent components included in sensing data acquired by sensing the inspection target are reduced.
A program of an aspect of the present disclosure causes a computer to function as a calculation process section. The calculation process section calculates an inspection value for inspecting an inspection target by using sensing data having conflicting inspection-related-angle-dependent components such that the inspection-related-angle-dependent components included in sensing data acquired by sensing the inspection target are reduced.
In an aspect of the present disclosure, an inspection value is calculated for inspecting an inspection target by using sensing data having conflicting inspection-related-angle-dependent components such that the inspection-related-angle-dependent components included in sensing data acquired by sensing the inspection target are reduced.
According to an aspect of the present disclosure, it is possible to acquire more accurate inspection results.
A detailed description will be given below of an embodiment of a vegetation inspection system to which the present technology is applied with reference to drawings.
A description will be given of a configuration example of an embodiment of a vegetation inspection system to which the present technology is applied with reference to
As illustrated in
The imaging apparatuses 12-1 and 12-2 are fastened in accordance with given arrangement conditions with respect to the field 14 and communicate with the vegetation inspection apparatus 13 via a wired or wireless communication network. Then, each of the imaging apparatuses 12-1 and 12-2 takes an image of the field 14 under control of the vegetation inspection apparatus 13 and sends the image of the field 14 acquired as a result thereof to the vegetation inspection apparatus 13.
For example, the imaging apparatuses 12-1 and 12-2 are fastened under arrangement conditions in which they are located at the same distance from the center of the field 14 on a straight line that passes through the center of the field 14, at the same height above the field 14, and at the same elevation angle toward the center of the field 14. That is, as illustrated in
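As a minimal sketch of these arrangement conditions (the coordinate convention and the helper `symmetric_camera_positions` are illustrative assumptions, not part of the disclosure), the two symmetric camera poses can be derived as follows:

```python
import math

def symmetric_camera_positions(center, azimuth_deg, distance, height):
    """Place two cameras on a line through the field center, at equal
    distance from the center and equal height, facing the center.

    center: (x, y) of the field center on the ground plane
    azimuth_deg: direction of the line through the center
    distance: horizontal distance from the center to each camera
    height: camera height above the field
    """
    ax = math.radians(azimuth_deg)
    dx, dy = distance * math.cos(ax), distance * math.sin(ax)
    cam1 = (center[0] + dx, center[1] + dy, height)
    cam2 = (center[0] - dx, center[1] - dy, height)
    # Both cameras share the same elevation angle toward the center.
    elevation = math.degrees(math.atan2(height, distance))
    return cam1, cam2, elevation

cam1, cam2, elev = symmetric_camera_positions((0.0, 0.0), 45.0, 30.0, 10.0)
print(cam1, cam2, elev)
```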
The vegetation inspection apparatus 13 controls timings when the imaging apparatuses 12-1 and 12-2 take images of the field 14. Then, the vegetation inspection apparatus 13 finds a vegetation index for inspecting vegetation of a plant raised in the field 14 on the basis of images of the field 14 taken by the imaging apparatuses 12-1 and 12-2. It should be noted that the detailed configuration of the vegetation inspection apparatus 13 will be described later with reference to
The vegetation inspection system 11 configured as described above can find a vegetation index free from (or with reduced) angle-dependent effects arising from the direction in which the plant raised in the field 14 grows, the directions in which the imaging apparatuses 12-1 and 12-2 take images of the field 14, and the direction in which sunlight is shone on the field 14.
For example, in a case where a lawn is raised in the field 14, the direction in which the lawn grows (hereinafter referred to as the grain) varies significantly depending on how the lawn is mowed. It is therefore difficult to constantly inspect vegetation under the same conditions. For this reason, a description will be given below assuming that the vegetation inspection system 11 inspects a lawn in the field 14, where a lawn such as can be found in a soccer stadium is raised. Of course, various other plants, such as rice and sugar cane, can also be subject to inspection by the vegetation inspection system 11.
For example, hatching directions of the field 14 in
A description will be given of the grain angle in the field 14, the angle of the imaging direction by the imaging apparatuses 12, and the angle of the shining direction of light shone on the field 14 with reference to
As illustrated in A of
Also, as illustrated in B of
Similarly, as illustrated in C of
It should be noted that in a case where the shining direction points vertically downward, the shining direction angle is L(0,0).
Then, when the grain angle is P(0,0), the imaging direction angle is C(0,0), and the shining direction angle is L(0,0), inspection can be conducted in a manner free from effects dependent upon these angles as illustrated in
However, it is difficult to conduct inspection such that the grain angle is P(0,0), the imaging direction angle is C(0,0), and the shining direction angle is L(0,0) all at the same time. For this reason, the vegetation inspection system 11 is capable of eliminating effects dependent upon these angles by using inspection conditions as illustrated with reference to
<Inspection Conditions that Eliminate Angle-Dependent Effects>
A description will be given of a first inspection condition for eliminating angle-dependent effects with reference to
As illustrated in
By taking the arithmetic mean of the two images taken under the inspection condition described above, the difference in shining direction angle is cancelled (the shining direction angle becomes equivalent to L(0,0)), making it possible to eliminate the effect dependent upon the shining direction angle.
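Why the arithmetic mean cancels the difference can be made explicit with a simple first-order model (this model is an illustrative assumption, not part of the disclosure): split the sensed value into an angle-independent part S_0 and an azimuth-dependent part g that reverses sign for opposite azimuths. The same argument applies to the other inspection conditions described below.

```latex
% Assumed first-order model: S(L(\theta,\varphi)) = S_0 + g(\theta),
% with g(\theta + 180^\circ) = -g(\theta) for conflicting components.
\frac{S\bigl(L(\theta,\varphi)\bigr) + S\bigl(L(\theta+180^\circ,\varphi)\bigr)}{2}
  = \frac{\bigl(S_0 + g(\theta)\bigr) + \bigl(S_0 - g(\theta)\bigr)}{2}
  = S_0
```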
A description will be given of a second inspection condition for eliminating angle-dependent effects with reference to
As illustrated in
By taking the arithmetic mean of the two images taken under the second inspection condition described above, the difference in imaging direction angle is cancelled (the imaging direction angle becomes equivalent to C(0,0)), making it possible to eliminate effects dependent upon the imaging direction angle.
A description will be given of a third inspection condition for eliminating angle-dependent effects with reference to
As illustrated in
By taking the arithmetic mean of the two images taken under the third inspection condition described above, the differences in imaging direction angle and shining direction angle are cancelled (the imaging direction angle becomes equivalent to C(0,0) and the shining direction angle becomes equivalent to L(0,0)), making it possible to eliminate effects dependent upon the imaging direction angle and the shining direction angle.
A description will be given of a fourth inspection condition for eliminating angle-dependent effects with reference to
As illustrated in
By taking the arithmetic mean of the two images taken under the fourth inspection condition described above, the difference in grain angle is cancelled (the grain angle becomes equivalent to P(0,0)), making it possible to eliminate the effect dependent upon the grain angle. As will be described later, for example, in a case where two areas whose grain angle components conflict with each other are included in a single image, it is possible to reduce the effect dependent upon the grain angle by taking the arithmetic mean of the sensing values of the two areas as the sensing data.
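As a rough sketch of this single-image case (the array layout, the dummy data, and the masks are assumptions for illustration; in practice the masks would come from registered grain information), the grain-dependent component can be reduced by averaging the mean sensing values of two opposite-grain areas:

```python
import numpy as np

# Dummy per-pixel NDVI values for a single image of the field.
rng = np.random.default_rng(0)
ndvi = rng.uniform(0.2, 0.8, size=(100, 100))

# Boolean masks for two areas whose grain azimuth angles are opposite to
# each other (assumed to be known in advance, e.g., from the mowing pattern).
mask_a = np.zeros_like(ndvi, dtype=bool)
mask_a[:, :50] = True
mask_b = ~mask_a

# Averaging the two area means reduces the grain-angle-dependent component,
# since the conflicting components cancel each other out.
reduced_ndvi = 0.5 * (ndvi[mask_a].mean() + ndvi[mask_b].mean())
print(reduced_ndvi)
```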
A description will be given of a fifth inspection condition for eliminating angle-dependent effects with reference to
As illustrated in
By taking the arithmetic mean of the two images taken under the fifth inspection condition described above, the differences in grain angle and imaging direction angle are cancelled (the grain angle becomes equivalent to P(0,0) and the imaging direction angle becomes equivalent to C(0,0)), making it possible to eliminate the effects dependent upon the grain angle and imaging direction angle.
A description will be given of a sixth inspection condition for eliminating angle-dependent effects with reference to
As illustrated in
By taking the arithmetic mean of the two images taken under the sixth inspection condition described above, the differences in grain angle, imaging direction angle, and shining direction angle are cancelled (the grain angle becomes equivalent to P(0,0), the imaging direction angle becomes equivalent to C(0,0), and the shining direction angle becomes equivalent to L(0,0)), making it possible to eliminate the effects dependent upon the grain angle, imaging direction angle, and shining direction angle.
Incidentally, in
That is, in a case where the weather is cloudy with no sunlight shining directly on the field 14, there is no need to consider the effect dependent upon the shining direction angle (the shining direction angle can be regarded as equivalent to L(0,0)). In the seventh to tenth inspection conditions for eliminating angle-dependent effects described below, imaging is performed in cloudy weather.
A description will be given of a seventh inspection condition for eliminating angle-dependent effects with reference to
As illustrated in
A description will be given of an eighth inspection condition for eliminating angle-dependent effects with reference to
As illustrated in
By taking the arithmetic mean of the two images taken under the eighth inspection condition described above, the difference in imaging direction angle is cancelled (the imaging direction angle becomes equivalent to C(0,0)), making it possible to eliminate the effect dependent upon the imaging direction angle.
A description will be given of a ninth inspection condition for eliminating angle-dependent effects with reference to
As illustrated in
By taking the arithmetic mean of the two images taken under the ninth inspection condition described above, the difference in grain angle is cancelled (the grain angle becomes equivalent to P(0,0)), making it possible to eliminate the effect dependent upon the grain angle.
A description will be given of a tenth inspection condition for eliminating angle-dependent effects with reference to
As illustrated in
By taking the arithmetic mean of the two images taken under the tenth inspection condition described above, the differences in grain angle and imaging direction angle are cancelled (the grain angle becomes equivalent to P(0,0) and the imaging direction angle becomes equivalent to C(0,0)), making it possible to eliminate the effects dependent upon the grain angle and imaging direction angle.
As illustrated in
The communication section 21 communicates with the imaging apparatuses 12-1 and 12-2. For example, the communication section 21 receives image data making up images taken with the imaging apparatuses 12-1 and 12-2 (e.g., Raw data made up of red (R), green (G), blue (B), and infrared (IR) pixel values) and supplies the image data to the data server 22. Also, when supplied from the imaging control section 24 with an imaging command instructing the imaging apparatuses 12-1 and 12-2 to perform imaging, the communication section 21 sends the imaging command to the imaging apparatuses 12-1 and 12-2.
The data server 22 accumulates image data supplied from the communication section 21 and supplies the image data to the calculation process section 25 in response to a request from the calculation process section 25. The data server 22 also acquires, via the weather information acquisition section 23, weather information at the time of image taking by the imaging apparatuses 12-1 and 12-2 and stores the weather information in association with the corresponding image data.
The weather information acquisition section 23 acquires, for example, weather information observed by an observational instrument (not illustrated) installed in the field 14 or weather information in the neighborhood of the field 14 delivered via an external network such as the Internet and supplies such weather information to the imaging control section 24 as required. The weather information acquisition section 23 also supplies, to the data server 22, weather information at the time of taking of image data accumulated in the data server 22.
The imaging control section 24 sends an imaging command instructing the imaging apparatuses 12-1 and 12-2 to perform imaging to the imaging apparatuses 12-1 and 12-2 via the communication section 21 in accordance with time information measured by a built-in timer (e.g., data including date, hours, minutes, and seconds) or weather information supplied from the weather information acquisition section 23. As a result, the imaging control section 24 controls timings when the imaging apparatuses 12-1 and 12-2 image the field 14.
For example, the imaging control section 24 sends an imaging command when the angle of shining direction of light shone on the field 14 reaches L(0,0) as described above with reference to
The calculation process section 25 reads image data accumulated in the data server 22 and calculates a vegetation index of the field 14 free from angle-dependent effects using a set of images taken with the imaging apparatuses 12-1 and 12-2. That is, the calculation process section 25 is configured to include, as illustrated, a vegetation index calculation section 31, a lens distortion correction section 32, an addition process section 33, a cancellation process section 34, and an integration process section 35.
When image data of a set of images taken with the imaging apparatuses 12-1 and 12-2 is accumulated in the data server 22, the vegetation index calculation section 31 reads these pieces of image data from the data server 22 and calculates a vegetation index for use as an inspection value for inspecting vegetation.
For example, in a case where the vegetation inspection system 11 uses a normalized difference vegetation index (NDVI), an index indicating the distribution and activity of vegetation, the normalized difference vegetation index NDVI can be calculated by computing the following Formula (1). As illustrated in Formula (1), the normalized difference vegetation index NDVI is found by using a pixel value R representing the red component in the visible range and a pixel value IR representing the component in the near-infrared range.
[Formula 1]
NDVI = (IR - R) / (IR + R)   (1)
Therefore, the vegetation index calculation section 31 finds the normalized difference vegetation index NDVI for each of the pixels making up each of the images by using the pixel value R and the pixel value IR of the image data acquired by imaging of the field 14 by the respective imaging apparatuses 12-1 and 12-2. Then, the vegetation index calculation section 31 generates two NDVI images on the basis of the pixel values of the two images taken with the imaging apparatuses 12-1 and 12-2.
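A minimal sketch of this per-pixel computation with NumPy (the channel layout and the small epsilon guarding against division by zero are assumptions; the disclosure does not address zero denominators):

```python
import numpy as np

def ndvi_image(r: np.ndarray, ir: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Compute Formula (1), NDVI = (IR - R) / (IR + R), for each pixel."""
    r = r.astype(np.float64)
    ir = ir.astype(np.float64)
    return (ir - r) / (ir + r + eps)

# Example with dummy 8-bit sensor data standing in for one imaging apparatus.
rng = np.random.default_rng(1)
r1, ir1 = rng.integers(0, 256, size=(2, 480, 640))
ndvi1 = ndvi_image(r1, ir1)
print(ndvi1.shape, float(ndvi1.min()), float(ndvi1.max()))
```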
The lens distortion correction section 32 corrects lens distortion of the two NDVI images generated by the vegetation index calculation section 31.
That is, as illustrated in
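One way such a correction could be implemented (a sketch assuming OpenCV; the intrinsic matrix and distortion coefficients are placeholders that would in practice come from a prior calibration of each imaging apparatus 12):

```python
import cv2
import numpy as np

# Placeholder calibration parameters (focal lengths, principal point, and
# radial/tangential distortion coefficients k1, k2, p1, p2, k3).
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.array([-0.2, 0.05, 0.0, 0.0, 0.0])

# Undistort a single-channel NDVI image (dummy data here).
ndvi = np.random.default_rng(2).uniform(-1.0, 1.0, (480, 640)).astype(np.float32)
ndvi_corrected = cv2.undistort(ndvi, camera_matrix, dist_coeffs)
print(ndvi_corrected.shape)
```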
The addition process section 33 performs a process of taking the arithmetic mean of the two NDVI images whose lens distortion has been corrected by the lens distortion correction section 32, generating a single NDVI image with cancelled difference in imaging direction angle.
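A sketch of this addition process (assuming the two NDVI images have already been distortion-corrected and registered pixel-to-pixel):

```python
import numpy as np

def cancel_imaging_direction(ndvi_a: np.ndarray, ndvi_b: np.ndarray) -> np.ndarray:
    """Arithmetic mean of two registered NDVI images taken from the imaging
    direction angles C(c,d) and C(-c,d); the conflicting imaging-direction
    components cancel, leaving an image equivalent to C(0,0)."""
    assert ndvi_a.shape == ndvi_b.shape, "images must be registered"
    return 0.5 * (ndvi_a + ndvi_b)
```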
In the cancellation process section 34, information about the grain azimuth angle of each of the areas distinguished by the grain of the lawn raised in the field 14 (i.e., which area has what kind of grain) is registered in advance. Then, the cancellation process section 34 cancels the difference in grain angle in the NDVI image generated by the addition process section 33 by adding the pixel values of areas of the two-dimensional sensing data whose grain azimuth angles are opposite to each other.
The integration process section 35 takes the mean of the area data added up by the cancellation process section 34, dividing the data into the areas distinguished by grain and integrating it into a normalized difference vegetation index NDVI in which the difference in grain angle has been cancelled.
The vegetation inspection apparatus 13 configured as described above makes it possible to acquire vegetation information (NDVI image) free from the effects dependent upon the grain angle, imaging direction angle, and shining direction angle.
A description will be given of a process performed by the calculation process section 25, for example, under the tenth inspection condition illustrated in
In the field 14, for example, a lawn is raised such that four areas with different grain angles P(a,b) are arranged alternately. That is, an area with the grain angle P(a1,b1), an area with the grain angle P(a2,b2), an area with the grain angle P(a3,b3), and an area with the grain angle P(a4,b4) are arranged two down and two across, and this pattern of four areas is repeated over the entire field 14.
Then, when the weather is cloudy, the field 14 is imaged from the imaging direction at the angle C(c,d) by the imaging apparatus 12-1, and the field 14 is imaged from the imaging direction at the angle C(−c,d) by the imaging apparatus 12-2.
At this time, lens distortion is present in an NDVI image P1 generated by the vegetation index calculation section 31 from the image taken with the imaging apparatus 12-1 in accordance with the imaging direction angle C(c,d) by the imaging apparatus 12-1. Similarly, lens distortion is present in an NDVI image P2 generated by the vegetation index calculation section 31 from the image taken with the imaging apparatus 12-2 in accordance with the imaging direction angle C(−c,d) by the imaging apparatus 12-2.
Therefore, the lens distortion correction section 32 generates an NDVI image P3 with corrected lens distortion of the NDVI image P1 and generates an NDVI image P4 with corrected lens distortion of the NDVI image P2. Here, each of rectangles illustrated in a grid pattern in the NDVI images P3 and P4 represents each of the areas distinguished in accordance with the grain angle P(a,b), and each of arrows illustrated in each of the rectangles indicates the grain angle P(a,b).
At this time, the normalized difference vegetation index NDVI of each of the pixels making up the NDVI image P3 includes a component dependent upon the grain angle P(a,b) and a component dependent upon the imaging direction angle C(c,d). Similarly, the normalized difference vegetation index NDVI of each of the pixels making up the NDVI image P4 includes a component dependent upon the grain angle P(a,b) and a component dependent upon the imaging direction angle C(−c,d).
Then, the addition process section 33 performs a process of taking the arithmetic mean of the NDVI images P3 and P4, generating an NDVI image P5 in which the component dependent upon the imaging direction angle C(c,d) and the component dependent upon the imaging direction angle C(−c,d) have cancelled each other out. That is, the NDVI image P5 becomes equivalent to the imaging direction angle C(0,0). It should be noted that the shining direction angle is L(0,0) because imaging is performed in cloudy weather.
Next, the cancellation process section 34 performs a process of cancelling the component dependent upon the grain angle P(a,b) included in the NDVI image P5.
For example, the azimuth angles θ of the grain angle P(a1,b1) and the grain angle P(a3,b3) are opposite to each other, and the azimuth angles θ of the grain angle P(a2,b2) and the grain angle P(a4,b4) are opposite to each other. Therefore, the cancellation process section 34 can cancel the component dependent upon the grain angle P(a,b) by adding the normalized difference vegetation indices NDVI of the areas with the grain angles P(a1,b1), P(a2,b2), P(a3,b3), and P(a4,b4).
At this time, the cancellation process section 34 adds the normalized difference vegetation indices NDVI such that the groups of four areas subject to addition overlap one another, as illustrated on the right of the NDVI image P5.
As a result, all the components dependent upon angles are cancelled from the normalized difference vegetation index NDVI, and the normalized difference vegetation index NDVI equivalent to the grain angle P(0,0), the imaging direction angle C(0,0), and the shining direction angle L(0,0) is acquired.
Thereafter, the integration process section 35 performs a process of dividing the data into area-by-area information for each grain angle P(a,b), taking the mean of the data of the four areas, and integrating the data, outputting an NDVI image P6 made up of the normalized difference vegetation index NDVI of the entire field 14 free from angle dependence. That is, as illustrated on the right side of the NDVI image P6, assuming a certain area to be the center, the integration process section 35 averages the mean of the four areas having that area at the bottom right, the mean of the four areas having that area at the bottom left, the mean of the four areas having that area at the top right, and the mean of the four areas having that area at the top left, and uses the result as the normalized difference vegetation index NDVI of that area. The integration process section 35 performs such averaging for all areas.
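A sketch of the cancellation and integration steps combined (assuming the per-area NDVI means have already been extracted into a regular grid in which every 2x2 block of areas covers the four grain angles P(a1,b1) to P(a4,b4); the function name and grid layout are illustrative):

```python
import numpy as np

def reduce_grain_dependence(area_means: np.ndarray) -> np.ndarray:
    """Cancel the grain-angle component from a grid of per-area NDVI means.

    Step 1 (cancellation): average every overlapping 2x2 window of areas,
    so the four mutually conflicting grain components cancel out.
    Step 2 (integration): assign to each area the mean of the (up to four)
    window values that contain it.
    """
    h, w = area_means.shape
    # Means of all overlapping 2x2 windows, shape (h-1, w-1).
    win = 0.25 * (area_means[:-1, :-1] + area_means[:-1, 1:]
                  + area_means[1:, :-1] + area_means[1:, 1:])
    out = np.zeros((h, w))
    cnt = np.zeros((h, w))
    for dy in (0, 1):
        for dx in (0, 1):
            # Each window contributes to the four areas it covers.
            out[dy:dy + h - 1, dx:dx + w - 1] += win
            cnt[dy:dy + h - 1, dx:dx + w - 1] += 1
    return out / cnt

grid = np.arange(16, dtype=float).reshape(4, 4)  # dummy 4x4 grid of area means
print(reduce_grain_dependence(grid))
```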
This allows the integration process section 35 to output the NDVI image P6 made up of the normalized difference vegetation index NDVI of the entire field 14 free from angle dependence. It should be noted that the calculation process section 25 may find a vegetation index other than the normalized difference vegetation index NDVI (e.g., ratio vegetation index (RVI) or green NDVI (GNDVI)) by using a pixel value R representing a red component in the visible range, a pixel value G representing a green component in the visible range, a pixel value B representing a blue component in the visible range, and a pixel value IR representing a component in the near-infrared range.
In step S11, the weather information acquisition section 23 acquires weather information in the field 14 and supplies the information to the imaging control section 24.
In step S12, the imaging control section 24 decides whether or not it is time for the imaging apparatuses 12-1 and 12-2 to image the field 14. For example, when, on the basis of the weather information supplied from the weather information acquisition section 23 in step S11, the weather is cloudy to such an extent that the field 14 remains unaffected by the shining direction angle L(θ,φ) of sunlight, the imaging control section 24 decides that it is time to perform imaging. Alternatively, the imaging control section 24 may decide whether or not it is time to perform imaging on the basis of time information as described above.
In a case where the imaging control section 24 decides in step S12 that it is not time for the imaging apparatuses 12-1 and 12-2 to take images of the field 14, the process returns to step S11 to repeat the same processes from here onward. On the other hand, in a case where the imaging control section 24 decides in step S12 that it is time for the imaging apparatuses 12-1 and 12-2 to take images of the field 14, the process proceeds to step S13.
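A minimal sketch of the decision in step S12 (the weather-information fields, the thresholds, and the helper `is_time_to_image` are assumptions for illustration; the disclosure does not specify how cloudiness is quantified):

```python
from dataclasses import dataclass

@dataclass
class WeatherInfo:
    cloud_cover: float        # 0.0 (clear) to 1.0 (overcast); assumed field
    direct_irradiance: float  # direct sunlight in W/m^2; assumed field

def is_time_to_image(weather: WeatherInfo,
                     cloud_threshold: float = 0.9,
                     irradiance_threshold: float = 50.0) -> bool:
    """Decide whether the weather is cloudy enough that the shining direction
    angle can be regarded as equivalent to L(0,0)."""
    return (weather.cloud_cover >= cloud_threshold
            and weather.direct_irradiance <= irradiance_threshold)

print(is_time_to_image(WeatherInfo(cloud_cover=0.95, direct_irradiance=20.0)))
```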
In step S13, the imaging control section 24 sends, via the communication section 21, an imaging command instructing the imaging apparatuses 12-1 and 12-2 to perform imaging, and the imaging apparatuses 12-1 and 12-2 take images of the field 14 in accordance with the imaging command. Then, each of the imaging apparatuses 12-1 and 12-2 sends the taken image of the field 14, and the communication section 21 acquires the image data of these images and accumulates the data in the data server 22.
In step S14, the vegetation index calculation section 31 reads the image data of the two images to be processed from the data server 22 and computes the above Formula (1), finding the normalized difference vegetation index NDVI for each pixel making up each image and generating two NDVI images.
In step S15, the lens distortion correction section 32 performs a process of correcting lens distortion on each of the two NDVI images generated by the vegetation index calculation section 31 in step S14, generating two NDVI images with corrected lens distortion.
In step S16, the addition process section 33 performs a process of taking the arithmetic mean of the two NDVI images whose lens distortion has been corrected by the lens distortion correction section 32 in step S15, generating a single NDVI image with cancelled component dependent upon the imaging direction angle.
In step S17, the cancellation process section 34 adds the areas whose grain angles are opposite to each other in the NDVI image generated by the addition process section 33 in step S16, cancelling the component dependent upon the grain angle.
In step S18, the integration process section 35 divides the NDVI image whose component dependent upon the grain angle has been cancelled by the cancellation process section 34 in step S17 into grain areas and integrates the image.
Then, after the process in step S18, the process returns to step S11 to repeat similar processes from here onward.
As described above, the vegetation inspection system 11 makes it possible to acquire an NDVI image made up of the normalized difference vegetation index NDVI of the entire field 14 by cancelling the component dependent upon the grain angle and cancelling the component dependent upon the imaging direction angle. Therefore, it is possible to inspect vegetation of the lawn raised in the field 14 by eliminating the effects dependent upon the grain angle, imaging direction angle, and shining direction angle using the vegetation inspection system 11.
It should be noted that, in order to cancel the effects dependent upon the grain angle, imaging direction angle, and shining direction angle in the present embodiment, an arithmetic mean process is performed, for ease of comprehension, by using images whose corresponding positive and negative angle components completely conflict with each other. For the imaging direction, for example, an arithmetic mean process is performed by using images from two directions at the imaging direction angles C(c,d) and C(-c,d). In practice, however, even if images with completely conflicting positive and negative angle components are not used, the angle-dependent effects can still be reduced as long as the components conflict with each other within a given reference value such as 5% or 10%. Further, the grain angle, imaging direction angle, and shining direction angle are merely examples of angles related to inspection of the field 14, and other angles may also be used.
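A small sketch of such a tolerance check (interpreting "conflict within 5% or 10%" as the deviation of two azimuth angles from exact opposition, which is one possible reading, not the only one):

```python
def roughly_conflicting(azimuth_a: float, azimuth_b: float,
                        tolerance_deg: float = 18.0) -> bool:
    """True if two azimuth angles (in degrees) are opposite to each other
    within the given tolerance (18 degrees is 10% of 180 degrees)."""
    diff = abs((azimuth_a - azimuth_b) % 360.0)
    diff = min(diff, 360.0 - diff)  # fold into the range [0, 180]
    return abs(diff - 180.0) <= tolerance_deg

print(roughly_conflicting(30.0, 215.0))  # True: 185 degrees apart
```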
It should be noted that although a configuration for taking two images with the two imaging apparatuses 12-1 and 12-2 has been described in the present embodiment, the normalized difference vegetation index NDVI may, for example, be calculated from two images taken successively by moving a single imaging apparatus 12. For example, imaging may be performed at an arbitrary position by mounting the imaging apparatus 12 on an unmanned aerial vehicle (UAV). Further, two or more imaging apparatuses 12 may be used; for example, the normalized difference vegetation index NDVI may be calculated from four images taken from the east-west and north-south directions using four imaging apparatuses 12.
Further, it is not necessary to provide the lens distortion correction section 32 if images free from lens distortion can be taken. It should be noted that in addition to a configuration in which the imaging apparatuses 12 and the vegetation inspection apparatus 13 are connected via a communication network, images may be supplied via a recording medium, for example, with the imaging apparatuses 12 and the vegetation inspection apparatus 13 put offline. In this case, there is no need to provide the data server 22. Also, the imaging apparatuses 12 and the vegetation inspection apparatus 13 may be configured as an integral apparatus.
Further, for example, the imaging apparatuses 12 can find a vegetation index and send the index to the vegetation inspection apparatus 13, and the vegetation inspection apparatus 13 can perform a process of eliminating angle-dependent effects. Also, the process of eliminating angle-dependent effects, for example, is not limited to the vertical direction as illustrated by the grain angle P(0,0), the imaging direction angle C(0,0), and the shining direction angle L(0,0) in
Further, the term "system" in the present specification refers to the apparatus as a whole made up of a plurality of apparatuses.
It should be noted that the respective processes described with reference to the above flowchart need not necessarily be performed chronologically in accordance with the sequence described as a flowchart and include those that are performed in parallel or individually (e.g., parallel processes or object-based processes). Further, the program may be processed by a single central processing unit (CPU) or by a plurality of CPUs in a distributed manner.
Further, the above series of processes may be performed by hardware or software. In a case where the series of processes are performed by software, the program making up the software is installed from a program recording medium recording the program to a computer incorporated in dedicated hardware or, for example, to a general-purpose personal computer or other computer capable of performing various functions by installing various programs.
In the computer, a CPU 101, a read only memory (ROM) 102, and a random access memory (RAM) 103 are connected to each other by a bus 104.
An input/output (I/O) interface 105 is further connected to the bus 104. An input section 106, an output section 107, a storage section 108, a communication section 109, and a drive 110 are connected to the I/O interface 105. The input section 106 includes a keyboard, a mouse, a microphone, and so on. The output section 107 includes a display, a speaker, and so on. The storage section 108 includes a hard disk, a non-volatile memory, and so on. The communication section 109 includes a network interface and so on. The drive 110 records information to or reads information from a removable recording medium 111 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
In the computer configured as described above, the CPU 101 performs the above series of processes, for example, by loading the program stored in the storage section 108 into the RAM 103 via the I/O interface 105 and bus 104 for execution.
The program executed by the computer (CPU 101) is provided, for example, in a manner stored on the removable medium 111, which is a packaged medium such as a magnetic disk (including a flexible disk), an optical disk (e.g., a compact disc read-only memory (CD-ROM) or a digital versatile disc (DVD)), a magneto-optical disk, or a semiconductor memory. Alternatively, the program is provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
Then, the program can be installed to the storage section 108 via the I/O interface 105 as the removable medium 111 is inserted into the drive 110. Alternatively, the program can be received by the communication section 109 via a wired or wireless transmission medium and installed to the storage section 108. In addition to the above, the program can be installed, in advance, to the ROM 102 or storage section 108.
It should be noted that the present technology can also have the following configurations:
(1) An inspection apparatus including:
a calculation process section configured to calculate an inspection value for inspecting an inspection target by using sensing data having conflicting inspection-related-angle-dependent components such that the inspection-related-angle-dependent components included in sensing data acquired by sensing the inspection target are reduced.
(2) The inspection apparatus of feature (1), in which
the calculation process section reduces the angle-dependent components included in the sensing data in accordance with a growing direction of a plant, the inspection target, by performing a calculation process using at least two areas where azimuth angles of the growing direction of the plant, the inspection target, are opposite to each other.
(3) The inspection apparatus of feature (1) or (2), in which
the sensing data is image data including sensing values of respective wavelength ranges acquired on the basis of a sensor that has detection devices, arranged two-dimensionally, for detecting light in different wavelength ranges for the respective wavelength ranges.
(4) The inspection apparatus of feature (3), in which
the sensing data includes sensing values in respective red, green, blue, and near-infrared wavelength ranges.
(5) The inspection apparatus of any one of features (1) to (4), in which
the inspection target is a plant, and the inspection value is a normalized difference vegetation index.
(6) The inspection apparatus of any one of features (1) to (5), in which
the calculation process section reduces the angle-dependent component included in the sensing data in accordance with a sensing direction in which the inspection target is sensed by performing a calculation process using two pieces of the sensing data sensed such that the azimuth angles of the sensing directions at the time of sensing are opposite to each other.
(7) The inspection apparatus of any one of features (1) to (6), in which
the calculation process section reduces the angle-dependent component included in the sensing data in accordance with a shining direction of light shone on the inspection target by performing a calculation process using two pieces of the sensing data sensed such that the azimuth angles of the shining direction of light shone on the inspection target at the time of sensing are opposite to each other.
(8) The inspection apparatus of any one of features (1) to (7), further including:
a sensing control section configured to control a sensing apparatus for sensing the inspection target.
(9) The inspection apparatus of feature (8), further including:
a weather information acquisition section configured to acquire weather information indicating the weather of a field where a plant, the inspection target, is raised, in which
the sensing control section causes the sensing apparatus to sense the plant when the weather in the field is cloudy.
(10) The inspection apparatus of feature (8) or (9), in which
the sensing control section causes the sensing apparatus to sense the inspection target at a timing that is in accordance with the sensing direction of sensing the inspection target and the shining direction of light shone on the inspection target.
(11) The inspection apparatus of feature (1), in which
the sensing data is image data including sensing values of respective wavelength ranges acquired on the basis of a sensor that has detection devices, arranged two-dimensionally, for detecting light in different wavelength ranges for the respective wavelength ranges.
(12) The inspection apparatus of feature (11), in which
the sensing data includes sensing values in respective red, green, blue, and near-infrared wavelength ranges.
(13) The inspection apparatus of feature (12), in which
the inspection target is a plant, and the inspection value is a normalized difference vegetation index.
(14) An inspection method including:
calculating an inspection value for inspecting an inspection target by using sensing data having conflicting inspection-related-angle-dependent components such that the inspection-related-angle-dependent components included in sensing data acquired by sensing the inspection target are reduced.
(15) A program causing a computer to function as:
a calculation process section configured to calculate an inspection value for inspecting an inspection target by using sensing data having conflicting inspection-related-angle-dependent components such that the inspection-related-angle-dependent components included in sensing data acquired by sensing the inspection target are reduced.
It should be noted that the present embodiment is not limited to that described above and can be modified in various ways without departing from the gist of the present technology.
Priority application: 2015-089013, filed April 2015, Japan (national).
Filing document: PCT/JP2016/061521, filed April 8, 2016 (WO).