During some military operations, one or more soldiers locate targets to be fired upon by indirect fire systems or air support, for example, and transmit a location for the target to a fire control center or an integrated tactical network. The fire control center or integrated tactical network then deploys a strike on the target using the target location. Target locators are used by military personnel to determine the coordinates of a target.
In a first embodiment, a target locator system comprises at least two target-locator cameras and a wireless communication system to communicatively couple the at least two target-locator cameras. Each target-locator camera includes a target sight, a range finder, a location sensor, and at least one elevation angle sensor. The target sight has an axis and sights a target. The range finder, which is aligned with the target sight, determines a distance to the target. The location sensor determines a location associated with the range finder. The at least one elevation angle sensor determines an elevation angle of the axis of the target sight when the target sight is sighting the target.
In accordance with common practice, the various described features are not drawn to scale but are drawn to emphasize features relevant to the present invention. Reference characters denote like elements throughout figures and text.
In the following detailed description, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration specific illustrative embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is to be understood that other embodiments may be utilized and that logical, mechanical and electrical changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense.
The method by which the target locator systems operate to determine the exact location of the target is described with reference to method 400 of
Each of the target-locator cameras 100 and 200 includes a respective range finder 160 and 260. The range finder 160 is typically aligned with a target sight 162. The target sight 162 has an axis 161. The range finder 160 generates information indicative of a distance d1 between a first target-locator camera, such as target-locator camera 100, and the target 300 when the target sight 162 is aimed at the target 300. The range finder 260 is aligned with a target sight 262. The target sight 262 has an axis 261. The range finder 260 generates information indicative of a distance d2 between the second target-locator camera, such as target-locator camera 200, and the target 300 when the target sight 262 is aimed at the target 300. When the target sights 162 and 262 are aimed at the target 300, the respective axes 161 and 261 are parallel to the line of sight between the target 300 and the respective range finders 160 and 260. In one implementation of this embodiment, the axes 161 and 261 are optical axes and the target sights 162 and 262 visually sight the target 300 for users of the target-locator cameras 100 and 200. In another implementation of this embodiment, the target sights 162 and/or 262 are optical target sights. In yet another implementation of this embodiment, the target sights 162 and/or 262 are infrared target sights. In yet another implementation of this embodiment, the target sights 162 and/or 262 are external to the respective range finders 160 and/or 260, but are proximally located with and aligned to the respective range finders 160 and/or 260.
Each target-locator camera 100 and 200 includes a respective location sensor 110 and 210 to determine a location associated with the respective range finder 160 and 260. In one implementation of this embodiment, the location sensor is a global positioning system receiver and the location is a geographic location. The global positioning system for military applications is typically accurate to within 1 or 2 meters in both latitude and longitude. Each target-locator camera 100 and 200 includes at least one respective elevation angle sensor 140 and 240 to determine an elevation angle of a respective axis 161 or 261 of the target sight 162 or 262 when sighting the target 300.
Each target-locator camera 100 and 200 includes at least one respective azimuth sensor 130 and 230 to determine an azimuth of the respective axis 161 or 261 when sighting the target 300. The azimuth sensors do not need to be highly accurate, since the readings from the two azimuth sensors 130 and 230 are used only to determine on which side of the line d3 the target is located. The elevation angle and the information indicative of distance, such as distance d1 and distance d2, provide two possible locations of the target 300 that are symmetric about the line d3, as described below with reference to
As shown in
The control center (CC) 401 includes a control-center receiver 470 and a processor 450. As shown in
As shown in
In the first target-locator camera 100, the location sensor 110, the azimuth sensor 130, the elevation angle sensor 140, and the range finder 160 are communicatively coupled to the target-locator-camera transmitter 180. In the second target-locator camera 200, the location sensor 210, the azimuth sensor 230, the elevation angle sensor 240, the range finder 260, the storage medium 225, the target-locator-camera receiver 270, and the target-locator-camera transmitter 280 are communicatively coupled to the processor 250.
As will be described in detail below with reference to
The processor 250 calculates an exact location of the target 300 based on the information indicative of the location of the first target-locator camera 100, the distance d1, the first angular information, the location of the second target-locator camera 200, the distance d2, and the second angular information. In one implementation of this embodiment, the processor 250 triangulates the exact location of the target 300 based on the information indicative of the location of the first target-locator camera 100, the distance d1, the first angular information, the location of the second target-locator camera 200, the distance d2, and the second angular information.
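The patent does not spell out the calculation itself. One standard way to realize it, sketched below under stated assumptions, is to project each slant range onto the ground plane and intersect the two resulting circles centered at the camera positions; the two intersections are the symmetric candidates about the line d3, and the azimuth test selects between them. The function name `triangulate`, the flat local east/north frame, and the boolean side flag are hypothetical, not from the source.

```python
import math

def triangulate(p1, p2, g1, g2, target_on_left):
    """Intersect two circles of ground-range radii g1 and g2 centered at
    camera positions p1 and p2 (local east/north coordinates, meters).
    target_on_left selects which of the two symmetric intersections to
    return, as decided by the azimuth comparison described in the text."""
    ex, ey = p2[0] - p1[0], p2[1] - p1[1]
    d3 = math.hypot(ex, ey)                  # baseline distance between cameras
    ex, ey = ex / d3, ey / d3                # unit vector along the baseline
    a = (g1**2 - g2**2 + d3**2) / (2 * d3)   # along-baseline offset from p1
    h = math.sqrt(g1**2 - a**2)              # perpendicular offset (two signs)
    sign = 1.0 if target_on_left else -1.0   # left-of-baseline normal is (-ey, ex)
    return (p1[0] + a * ex - sign * h * ey,
            p1[1] + a * ey + sign * h * ex)
```

With cameras at (0, 0) and (100, 0) and equal ground ranges of about 70.7 m, the two candidates are (50, 50) and (50, -50); the flag picks one.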
The first target-locator camera 101 includes the range finder 160, the location sensor 110, at least one elevation angle sensor 140, at least one azimuth sensor 130, the target-locator-camera transmitter 180 (also referred to herein as a first target-locator-camera transmitter 180), a target-locator-camera receiver 170 (also referred to herein as a first target-locator-camera receiver 170), the processor 150, and software 120 embedded in the storage medium 125. The location sensor 110, the azimuth sensor 130, the elevation angle sensor 140, the range finder 160, the storage medium 125, the first target-locator-camera receiver 170, and the first target-locator-camera transmitter 180 are communicatively coupled to the processor 150. In one implementation, the processor 150 comprises a microprocessor or microcontroller. In another implementation, the processor 150 comprises processor support chips and/or system support chips such as ASICs and FPGAs. In yet another implementation of this embodiment, the processor 150 is a programmable processor.
The second target-locator camera 202 includes the range finder 260, the location sensor 210, at least one elevation angle sensor 240, at least one azimuth sensor 230, the target-locator-camera transmitter 280 (also referred to herein as a second target-locator-camera transmitter 280), the target-locator-camera receiver 270 (also referred to herein as the second target-locator-camera receiver 270), the processor 250, and software 220 embedded in the storage medium 225. The location sensor 210, the azimuth sensor 230, the elevation angle sensor 240, the range finder 260, the storage medium 225, the target-locator-camera receiver 270, and the target-locator-camera transmitter 280 are communicatively coupled to the processor 250.
The first target-locator-camera transmitter 180 is communicatively coupled to the second target-locator-camera receiver 270 via a wireless communication link represented generally by the numeral 191. The second target-locator-camera transmitter 280 is communicatively coupled to the first target-locator-camera receiver 170 via the wireless communication link 191. In one implementation of this embodiment, the first target-locator-camera receiver 170 and the first target-locator-camera transmitter 180 comprise a first transceiver. In another implementation of this embodiment, the second target-locator-camera receiver 270 and the second target-locator-camera transmitter 280 comprise a second transceiver.
As shown in
The first target-locator camera 101 and the second target-locator camera 202 exchange the information generated at the respective azimuth sensors 130 and 230, elevation angle sensors 140 and 240, location sensors 110 and 210, and range finders 160 and 260. The processors 150 and 250 each calculate an exact location of the target 300 based on the information indicative of the location of the other at least one target-locator camera, information indicative of the distance from the other at least one target-locator camera to the target, and information indicative of the angular information of the other at least one target-locator camera, information indicative of the location of the target-locator camera, information indicative of the distance from the target-locator camera to the target, and information indicative of the angular information of the target-locator camera. Then the target-locator-camera transmitters 180 and 280 in each of the at least two target-locator cameras 101 and 202 transmit information indicative of the exact location of the target 300 from the respective target-locator-camera transmitters 180 and 280 via the respective wireless communication links 193 and 194 to the control-center receiver 470.
As shown in
In this embodiment of the target locator system 12, the processors 150 and 250 each send the information generated by the respective azimuth sensors 130 and 230, the elevation angle sensors 140 and 240, the location sensors 110 and 210 and the range finders 160 and 260 to the control center receiver 470 in the control center 401. The processor 450 in the control center 401 receives the information from the control center receiver 470 and determines the exact location of the target 300 based on the information received from each of the target-locator cameras 103 and 203.
Embodiments of the method of operating the target locator systems 10-12 of
The azimuth angle is the bearing, relative to true north, of a point on the horizon 460 directly beneath an observed object. The horizon is defined as a large imaginary circle centered on the observer, equidistant from the zenith (the point straight overhead) and the nadir (the point exactly opposite the zenith). As seen from above the horizon 460, compass bearings are measured clockwise in degrees from north. Azimuth angles θ can thus range from 0 degrees (north) through 90 (east), 180 (south), 270 (west), and up to 360 (north again).
The azimuth angles are used to indicate whether the target 300 is north or south of the line d3. If θ1 is greater than θ3+180°, the target 300 is north of the line d3. If θ1 is less than θ3+180°, the target 300 is south of the line d3. If instead the line d3 is parallel to the north-south axis, then a θ1 greater than θ3+180° places the target 300 west of the line d3, and likewise a θ1 less than θ3+180° places the target 300 east of the line d3.
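The side test above can be sketched directly; this is a minimal illustration of the stated comparison, with the wrap-around of θ3+180° past 360° handled by modular arithmetic as an assumption the text does not address, and the function name `target_side` hypothetical.

```python
def target_side(theta1_deg, theta3_deg):
    """Apply the side test: compare the first camera's azimuth theta1 to
    the baseline azimuth theta3 plus 180 degrees, with the difference
    wrapped into [0, 360).  Returns which side of the line d3 the target
    lies on ('north' maps to 'west', and 'south' to 'east', when the
    baseline runs north-south, per the text)."""
    diff = (theta1_deg - (theta3_deg + 180.0)) % 360.0
    return "north" if 0.0 < diff < 180.0 else "south"
```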
In the exemplary case shown in
At block 402, the first target-locator camera sights a target. A user of the first target-locator camera aims the target sight in the range finder of the first target-locator camera at the target in order to sight the target. In one implementation of this embodiment, the user of the first target-locator camera aims a target sight in an optical range finder at the target and focuses the target sight on the target. At block 404, the first target-locator camera determines a first elevation angle and a first azimuth of the first target-locator camera. As defined herein, the “first elevation angle and a first azimuth of the first target-locator camera” is the “first angular information” as described above with reference to
The elevation angle sensor 140 (
The azimuth sensor 130 in the first target-locator camera 100 senses the azimuth angle θ1 (also referred to herein as first azimuth angle θ1) of the axis 161 of the first target-locator camera 100 when the range finder 160 (
The first elevation angle φ1 and the first azimuth angle θ1 together comprise the first angular information of the axis 161 of the first target-locator camera 100. In one implementation of this embodiment, the azimuth sensor is a magnetometer. In another implementation of this embodiment, there is no azimuth sensor in the first target-locator camera 100 so that the azimuth is not sensed.
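A range-finder reading together with this angular information fixes the target's displacement from the camera. As an illustrative sketch only, assuming a flat local east/north/up frame and a hypothetical helper name `sight_vector`, the slant range resolves into a ground range of d·cos φ and a height of d·sin φ, with the azimuth distributing the ground range between east and north:

```python
import math

def sight_vector(distance_m, elevation_deg, azimuth_deg):
    """Resolve a slant-range reading and the camera's angular information
    into a local east/north/up displacement from camera to target."""
    g = distance_m * math.cos(math.radians(elevation_deg))       # ground range
    return (g * math.sin(math.radians(azimuth_deg)),             # east
            g * math.cos(math.radians(azimuth_deg)),             # north
            distance_m * math.sin(math.radians(elevation_deg)))  # up
```

For example, a 100 m range at zero elevation and 90° azimuth is 100 m due east; the same range at 90° elevation is 100 m straight up.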
At block 406, the first target-locator camera determines a first distance from the first target-locator camera to the target. The first distance is determined by the range finder in the first target-locator camera.
At block 408, the first target-locator camera determines a location of the first target-locator camera. In one implementation of this embodiment, the processor in the first target-locator camera receives information indicative of the location of the first target-locator camera from a location sensor in the first target-locator camera.
At block 410, the second target-locator camera sights the target. A user of the second target-locator camera aims the target sight in the range finder of the second target-locator camera at the target in order to sight the target. In one implementation of this embodiment, the user of the second target-locator camera aims a target sight in an optical range finder at the target and focuses the target sight on the target. The target sighted by the second target-locator camera is the same target sighted by the first target-locator camera at block 402. If the target is moving, the sighting of the target by the first target-locator camera and the second target-locator camera is done at almost exactly the same time. In one implementation of this embodiment, if the target is a stationary object, such as a building, the sighting of the target by the first target-locator camera and the second target-locator camera can be done at different times.
At block 412, the second target-locator camera determines a second elevation angle and a second azimuth of the second target-locator camera. As defined herein, the “second elevation angle and a second azimuth of the second target-locator camera” is the “second angular information” as described above with reference to
The elevation angle sensor 240 (
The azimuth sensor 230 in the second target-locator camera 200 senses the azimuth angle θ2 (also referred to herein as the second azimuth angle θ2) of the axis 261 of the second target-locator camera 200 when the range finder 260 (
At block 414, the second target-locator camera determines a second distance from the second target-locator camera to the target. The second distance is determined by the range finder in the second target-locator camera.
At block 416, the second target-locator camera determines a location of the second target-locator camera. In one implementation of this embodiment, the processor in the second target-locator camera receives information indicative of the location of the second target-locator camera from a location sensor in the second target-locator camera.
At block 418, a processor in at least one of the first target-locator camera, the second target-locator camera, and/or the control center determines the third distance from the first target-locator camera to the second target-locator camera. The processor calculates the third distance based on the information indicative of the location of the first target-locator camera and the location of the second target-locator camera.
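When the location sensors report geographic coordinates, the third distance can be computed from the two latitude/longitude pairs; a minimal sketch using the standard haversine great-circle formula (the function name and the spherical-Earth assumption are illustrative, not from the source):

```python
import math

def baseline_distance_m(lat1, lon1, lat2, lon2):
    """Haversine great-circle distance, in meters, between the two
    camera locations; adequate for the short baselines involved here."""
    r = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))
```

One degree of latitude comes out to roughly 111.2 km; two cameras a few hundred meters apart yield a correspondingly short baseline d3.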
In one implementation of this embodiment of block 418, the transmitter in the first target-locator camera wirelessly transmits the first angular information, the first distance, and the first location of the first target-locator camera to the receiver in the second target-locator camera and the processor in the second target-locator camera uses the information indicative of the location of the first target-locator camera and the information indicative of the location of the second target-locator camera to determine the third distance.
In another implementation of this embodiment of block 418, the transmitter in the second target-locator camera wirelessly transmits the second angular information, the second distance, and the second location of the second target-locator camera to the receiver in the first target-locator camera and the processor in the first target-locator camera uses the information indicative of the location of the second target-locator camera and the information indicative of the location of the first target-locator camera to determine the third distance.
In yet another implementation of this embodiment of block 418, the transmitter in the first target-locator camera wirelessly transmits the first angular information, the first distance, and the first location of the first target-locator camera to the receiver in the control center while the transmitter in the second target-locator camera sends the second angular information, the second distance, and the second location of the second target-locator camera to the receiver in the control center. In this case, the processor in the control center uses the information indicative of the location of the second target-locator camera and the information indicative of the location of the first target-locator camera to determine the third distance. The third distance is a baseline distance of a triangle formed by the first distance, the second distance and the third distance.
At block 420, a processor in at least one of the first target-locator camera, the second target-locator camera, and/or the control center calculates an exact location of the target based on the third distance, the first elevation angle, the second elevation angle, the first distance, and the second distance. In one implementation of this embodiment of block 420, the transmitter in the first target-locator camera sends the first angular information, the first distance d1, and the first location of the first target-locator camera to the receiver in the second target-locator camera and the processor in the second target-locator camera calculates an exact location of the target based on the information indicative of the location of the first target-locator camera, the distance from the first target-locator camera to the target, the angular information of the first target-locator camera, the location of the second target-locator camera, the distance from the second target-locator camera to the target, and the angular information of the second target-locator camera to determine the exact location of the target.
In another implementation of this embodiment of block 420, the transmitter in the second target-locator camera sends the second angular information, the second distance, and the second location of the second target-locator camera to the receiver in the first target-locator camera and the processor in the first target-locator camera calculates an exact location of the target based on the information indicative of the location of the first target-locator camera, the distance from the first target-locator camera to the target, the angular information of the first target-locator camera, the location of the second target-locator camera, the distance from the second target-locator camera to the target, and the angular information of the second target-locator camera to determine the exact location of the target.
In yet another implementation of this embodiment of block 420, the transmitter in the first target-locator camera sends the first angular information, the first distance, and the first location of the first target-locator camera to the receiver in the control center while the transmitter in the second target-locator camera sends the second angular information, the second distance, and the second location of the second target-locator camera to the receiver in the control center. In this case, the processor in the control center calculates an exact location of the target based on the information indicative of the location of the first target-locator camera, the distance from the first target-locator camera to the target, the angular information of the first target-locator camera, the location of the second target-locator camera, the distance from the second target-locator camera to the target, and the angular information of the second target-locator camera to determine the exact location of the target.
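Whichever processor performs block 420, each camera's measurements alone already imply a stand-alone fix (camera position plus line-of-sight displacement), and the two fixes should agree. The sketch below is not the patent's calculation, only a crude cross-check that averages the two independent fixes in a flat local east/north frame; all names are hypothetical.

```python
import math

def target_estimate(camera_en, distance_m, elevation_deg, azimuth_deg):
    """One camera's stand-alone fix: camera position plus the ground-plane
    displacement implied by its range and angular information."""
    g = distance_m * math.cos(math.radians(elevation_deg))  # ground range
    return (camera_en[0] + g * math.sin(math.radians(azimuth_deg)),
            camera_en[1] + g * math.cos(math.radians(azimuth_deg)))

def fuse(est1, est2):
    """Average the two independent fixes; a simple consistency check on
    the measurements feeding the exact-location calculation."""
    return ((est1[0] + est2[0]) / 2, (est1[1] + est2[1]) / 2)
```

Cameras at (0, 0) and (200, 0), each ranging 100 m to a target between them, produce matching fixes at (100, 0), and any disagreement between the two fixes indicates sensor error.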
In yet another implementation of this embodiment of method 400, the target locator system 10 (
In yet another implementation of this embodiment of method 400, the target locator system 11 (
In yet another implementation of this embodiment of method 400, the target locator system 12 (
The methods and techniques described here may be implemented in digital electronic circuitry, or with a programmable processor (for example, a special-purpose processor or a general-purpose processor such as a computer), firmware, software, or combinations of them. Apparatus embodying these techniques may include appropriate input and output devices, a programmable processor, and a storage medium tangibly embodying program instructions for execution by the programmable processor. A process embodying these techniques may be performed by a programmable processor executing a program of instructions to perform desired functions by operating on input data and generating appropriate output. The techniques may advantageously be implemented in one or more programs that are executable on a programmable system including at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a data storage system, at least one input device, and at least one output device. Generally, a processor will receive instructions and data from a read-only memory and/or a random access memory.
Storage devices suitable for tangibly embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and DVD disks. Any of the foregoing may be supplemented by, or incorporated in, specially-designed application-specific integrated circuits (ASICs).
Although specific embodiments have been illustrated and described herein, it will be appreciated by those of ordinary skill in the art that any arrangement, which is calculated to achieve the same purpose, may be substituted for the specific embodiment shown. This application is intended to cover any adaptations or variations of the present invention. Therefore, it is manifestly intended that this invention be limited only by the claims and the equivalents thereof.
This application is related to U.S. patent application Ser. No. 11/268,938 (Attorney Docket No. H0008552.71550) having a title of “PASSIVE-OPTICAL LOCATOR” (also referred to here as the “Ser. No. 11/268,938 application”) filed on Nov. 8, 2005. This application is also related to U.S. patent application Ser. No. 11/482,354 (Attorney Docket No. H0011688.72862) having a title of “PASSIVE-OPTICAL LOCATOR” (also referred to here as the “H0011688.72862 Application”), and to U.S. patent application Ser. No. 11/482,468 (Attorney Docket No. H0011689.72861) having a title of “PASSIVE-OPTICAL LOCATOR” (also referred to here as the “H0011689.72861 Application”), both of which were filed on Jul. 7, 2006.