This disclosure generally relates to autonomous and semi-autonomous motorized weapons systems. More specifically, the present disclosure relates to hardware- and software-based techniques for efficient operation of motorized weapons systems, via improvements in target identification and selection, autonomous actuation of motor and targeting systems, dynamic tracking, and trajectory measurement and assessment.
Within the context of motorized weapons systems, the concept of a “kill chain” refers to the sequence of actions performed between the first detection of potential targets and the elimination of those targets. The sequence of actions within a kill chain generally may include the following: (1) Find—identifying and locating a target, (2) Fix or Track—determining the accurate location of the target, (3) Target—time-critical targeting, including predicting where the target may pop up, (4) Engage—firing on the target, and (5) Assess—determining whether or not the target has been hit and/or eliminated.
Conventional weapon systems may include various components for achieving the above steps of a kill chain, including cameras and sensors to identify targets, display screens and controls (e.g., joysticks) to allow an operator to identify targets and aim the weapon, and a variety of weapons that may be fired at the target. Such systems may include “fully autonomous” weapons systems, which are capable of targeting and firing without any intervention by a human operator, “semi-autonomous” weapons systems, which may use automated software target tracking tools but still rely on a human operator for target selection and firing commands, “supervised autonomous” weapons systems, which may be granted permission to react to threats autonomously, and/or manual weapon systems that are operated entirely by the human operator.
Typically, conventional weapons systems rely on an “operator centric” approach to perform the actions in the kill chain sequence. Such systems often prioritize the interface and environment provided to the human operator. First, the human operator may be put in a safe environment, and the operator's eyesight may be improved using broad spectrum and high-resolution options. The weapon may be stabilized from motion and vibration, to allow the operator to find and track the target via a joystick and cursor or similar interface. After these steps, image recognition software may be used to attempt to recognize the target that has been selected and tracked by the operator, and trajectory adjustments may be applied. Such systems and processes may result in a number of technical problems and inefficiencies, including difficulties in targeting and tracking when the operator is in a moving vehicle, difficulties in selecting and identifying targets, inefficiencies in selecting follow-on targets, and reliance on operator-based assessment and correction of weapon targeting and firing.
Techniques described herein relate to hardware- and software-based solutions for operating motorized weapons systems, including target identification and selection techniques, autonomous actuation of motor and targeting systems, dynamic tracking, and trajectory measurement and assessment techniques. Certain embodiments described herein correspond to semi-autonomous motorized weapon systems, which may include various combinations of hardware such as weapons capable of firing munitions, two-axis and/or three-axis mounts configured to support and position the weapons, motors coupled to the mounts and configured to move the mounts to specified positions to control the direction in which the weapons are aimed, and/or operator interface components such as operator controls and a target display device. In some embodiments, such a semi-autonomous motorized weapon system may be implemented with various hardware-based and software-based components configured to determine target points associated with targets at remote locations, and to determine one or more areas having boundaries surrounding the target points, such boundary areas being determined based on the likelihood of the weapon hitting the target when aimed at the boundary, in comparison to predetermined likelihood thresholds. Such embodiments may be further configured to engage the motor of the motorized weapon system, with instructions to move the mount from an initial position to a target position at which the weapon is aimed at the target point, and, during engagement of the motor, to periodically determine, during the movement of the mount toward the target position, whether the weapon is aimed at a position within the boundary area surrounding the target point. Upon determining, during the movement of the mount toward the target position, that the weapon is not aimed at a position within the area surrounding the target point, the semi-autonomous motorized weapon system may disable a manual firing mechanism of the weapon system to prevent firing of the weapon by an operator, whereas upon determining, during the movement of the mount toward the target position, that the weapon is aimed at a position within the area surrounding the target point, the semi-autonomous motorized weapon system may enable (or re-enable) the manual firing mechanism to allow firing of the weapon. Finally, the semi-autonomous motorized weapon system may be configured to receive and execute firing commands from operators, via the manual firing mechanism, thereby firing the weapon at times when the manual firing mechanism is enabled.
Additional techniques described herein include weapon-agnostic motorized weapon systems, including weapon-agnostic targeting/firing systems that may support various different types or models of weapons, as well as implementation of operation-specific rules of engagement that may be received and enforced by the weapon-agnostic targeting and firing systems. Further techniques described herein include minimum confidence thresholds for target selection and/or prioritization via semi-autonomous weapons systems, which may be separate determinations from target identification confidence and/or target verification confidence. Still further techniques described herein may include sensor-based real-time projectile firing assessment and automatic correction of targeting algorithms based on accuracy evaluations.
The various techniques described herein further include combinations of autonomous target selection, prioritization, and re-selection by targeting/firing systems within semi-autonomous motorized weapon systems, dynamic target tracking of both primary and secondary targets including target movement predictions and weapon/projectile characteristics, autonomous motor actuation to automatically orient the weapon toward the primary target before receiving any operator input, simplified user interfaces and operator controls, and enabling/disabling of the firing mechanism depending on the projected point of impact of the weapon, thereby providing increased system efficiency, increased rate of firing, improved weapon system accuracy, and reduced operator error, along with the other technical advantages described herein.
In the appended figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label by a dash and a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of various embodiments of the present invention. It will be apparent, however, to one skilled in the art that embodiments of the present invention may be practiced without some of these specific details. In other instances, well-known structures and devices are shown in block diagram form.
The ensuing description provides exemplary embodiments only, and is not intended to limit the scope, applicability, or configuration of the disclosure. Rather, the ensuing description of the exemplary embodiments will provide those skilled in the art with an enabling description for implementing an exemplary embodiment. It should be understood that various changes may be made in the function and arrangement of elements without departing from the spirit and scope of the invention as set forth in the appended claims.
Specific details are given in the following description to provide a thorough understanding of the embodiments. However, it will be understood by one of ordinary skill in the art that the embodiments may be practiced without these specific details. For example, circuits, systems, networks, processes, and other components may be shown as components in block diagram form in order not to obscure the embodiments in unnecessary detail. In other instances, well-known circuits, processes, algorithms, structures, and techniques may be shown without unnecessary detail in order to avoid obscuring the embodiments.
Also, it is noted that individual embodiments may be described as a process which is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged. A process is terminated when its operations are completed, but could have additional steps not included in a figure. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination can correspond to a return of the function to the calling function or the main function.
The term “computer-readable medium” includes, but is not limited to, non-transitory media such as portable or fixed storage devices, optical storage devices, and various other mediums capable of storing, containing or carrying instruction(s) and/or data. A code segment or computer-executable instructions may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a class, or any combination of instructions, data structures, or program statements. A code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
Furthermore, embodiments may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware or microcode, the program code or code segments to perform the necessary tasks may be stored in a computer-readable medium. A processor(s) may perform the necessary tasks.
Various techniques (e.g., methods, systems, computing devices, non-transitory computer-readable storage memory storing a plurality of instructions executable by one or more processors, etc.) are described herein for hardware- and software-based solutions for operating motorized weapons systems, including target identification and selection techniques, autonomous actuation of motor and targeting systems, dynamic tracking, and trajectory measurement and assessment techniques. Certain embodiments described herein correspond to semi-autonomous motorized weapon systems, which may include various combinations of hardware such as weapons capable of firing munitions, two-axis and/or three-axis mounts configured to support and position the weapons, motors coupled to the mounts and configured to move the mounts to specified positions to control the direction in which the weapons are aimed, and/or operator interface components such as operator controls and a target display device. In some embodiments, such a semi-autonomous motorized weapon system may be implemented with various hardware-based and software-based components configured to determine target points associated with targets at remote locations, and to determine one or more areas having boundaries surrounding the target points, such boundary areas being determined based on the likelihood of the weapon hitting the target when aimed at the boundary, in comparison to predetermined likelihood thresholds. Such embodiments may be further configured to engage the motor of the motorized weapon system, with instructions to move the mount from an initial position to a target position at which the weapon is aimed at the target point, and, during engagement of the motor, to periodically determine, during the movement of the mount toward the target position, whether the weapon is aimed at a position within the boundary area surrounding the target point. Upon determining, during the movement of the mount toward the target position, that the weapon is not aimed at a position within the area surrounding the target point, the semi-autonomous motorized weapon system may disable a manual firing mechanism of the weapon system to prevent firing of the weapon by an operator, whereas upon determining, during the movement of the mount toward the target position, that the weapon is aimed at a position within the area surrounding the target point, the semi-autonomous motorized weapon system may enable (or re-enable) the manual firing mechanism to allow firing of the weapon. Finally, the semi-autonomous motorized weapon system may be configured to receive and execute firing commands from operators, via the manual firing mechanism, thereby firing the weapon at times when the manual firing mechanism is enabled.
Additional techniques described herein include weapon-agnostic motorized weapon systems, including weapon-agnostic targeting/firing systems that may support various different types or models of weapons, as well as implementation of operation-specific rules of engagement that may be received and enforced by the weapon-agnostic targeting and firing systems. Further techniques described herein include minimum confidence thresholds for target selection and/or prioritization via semi-autonomous weapons systems, which may be separate determinations from target identification confidence and/or target verification confidence. Still further techniques described herein may include sensor-based real-time projectile firing assessment and automatic correction of targeting algorithms based on accuracy evaluations.
The various techniques described herein further include combinations of autonomous target selection, prioritization, and re-selection by targeting/firing systems within semi-autonomous motorized weapon systems, dynamic target tracking of both primary and secondary targets including target movement predictions and weapon/projectile characteristics, autonomous motor actuation to automatically orient the weapon toward the primary target before receiving any operator input, simplified user interfaces and operator controls for operating the semi-autonomous motorized weapon systems, and enabling/disabling of the firing mechanism depending on the projected point of impact of the weapon, thereby providing increased system efficiency, increased rate of firing, improved weapon system accuracy, and reduced operator error, along with the other technical advantages described herein.
With reference now to FIG. 1, an example motorized weapon system 100 is shown, including a weapon 110 supported by a mount 120 and a camera/sensor unit 125.
In some embodiments, weapon system 100 may be a remotely operated weapon station (ROWS), including stabilization and auto-targeting technology. The targeting system of weapon system 100 may be configured to perform rapid target selection and acquisition, and to provide increased hit probabilities. Weapon system 100 may be compatible with many different types of weapon 110 and different corresponding types of ammunition, and as discussed below, the operation of the targeting system and other components of the weapon system 100 may depend on knowledge of which type of weapon 110 and ammunition is currently in use. As discussed in more detail below, weapon system 100 may be fully integrated, with auto-targeting capabilities and/or remote operation. Weapon system 100 also may be capable of being mounted to various different types of platforms, including tripods, buildings, ground vehicles (e.g., trucks, tanks, cars, jeeps), all-terrain vehicles (ATVs), utility task vehicles (UTVs), boats, fixed-wing aircraft, helicopters, and drones. As described in further detail below, various embodiments of weapon systems 100 may include capabilities for automatic target detection, selection, and re-selection, active stabilization, automatic ballistic solutions, target tagging, and/or continuous target tracking.
As noted above, weapon 110 may be any type of gun, armament, or ordnance, including without limitation, off-the-shelf firearms, large caliber rifles, machine guns, autocannons, grenade launchers, rockets, and/or directed energy weapons such as lasers, high-power microwave emitters, and other directed-energy devices. The weapon 110 may be attached to the weapon system 100 using a 2-axis or 3-axis mechanical gimbal mount 120, capable of controlling azimuth and yaw, elevation and pitch, and possibly cant and roll. A closed loop servomotor within the weapon system 100 may be configured to drive the gimbal to an identified target. A firing mechanism within the weapon system may be configured to fire the weapon 110, either electronically or by manually pulling the trigger, in response to a firing command from a human operator and/or additional firing instructions received from a targeting/firing component of the weapon system 100.
Camera/sensor unit 125 may include an array of various different sensors configured to collect data at the weapon system 100, and transmit the sensor/image data back to the internal software systems of the weapon system 100 (e.g., targeting system/component, firing control, ballistics engine) and/or to a display device for outputting to an operator. Cameras/sensors within the sensor unit 125 may include, for example, cameras sensitive in various spectrums such as visible and infrared (IR), for day and night visibility, as well as rangefinders (e.g., LIDAR, RADAR, ultrasonic, etc.) to determine distance to target. Additional sensors within the sensor unit 125 may include rate gyros (e.g., MEMS or fiber optic gyros), which may be used to stabilize the weapon 110 within the mount 120. Magnetometers and accelerometers also may be included within the weapon system 100, and may be used for canceling gyro drift. Accelerometers also may be used to detect and respond to vehicle accelerations (i.e., when the weapon system 100 is mounted on a vehicle), and vibrations caused by vehicle movement and/or terrain and weather. Sensors 125 also may include wind speed sensors, including hot-wire, laser/LIDAR, sonic and other types of anemometers. Additionally, as described below, a global positioning system (GPS) receiver or other positioning devices may be included within the sensor unit 125, in order to determine the weapon location, heading, and velocity to compute firing solutions, and for use in situations where external target coordinates are provided. It should also be understood that for each of the cameras and/or sensors described above and elsewhere herein, the cameras/sensors may be housed within the sensor unit 125, positioned elsewhere in the weapon system 100, installed on a structure or vehicle on which the weapon system 100 is mounted, or installed at a separate remote location and configured to transmit wireless sensor data back to the weapon system 100.
Referring now to FIG. 2, a diagram is shown illustrating the components of an example motorized weapon system 200, including a targeting/firing system 210, a weapon 225 supported by a mount 230 and driven by a motor 235, a camera/sensor unit 240, operator controls 245, and a display device 250.
As indicated by the arrows shown in the diagram of weapon system 200, the targeting/firing system 210 may be configured to drive the motor 235 to a particular target point, and to initiate firing of the weapon 225. The camera/sensor unit 240 may collect image and sensor data, and transmit that data back to the targeting/firing system 210 for use in target detection, selection, and tracking functionality. In some cases, image and sensor data may be transmitted directly from the sensor unit 240 to the display 250 for rendering/use in an operator user interface. The targeting/firing system 210 also may transmit various targeting data to the display device 250 for presentation to the operator, and may receive from the operator firing commands and/or other control commands via the operator controls 245.
In some embodiments, all components of a weapon system 200 may be co-located and installed together as a single integrated system. For instance, weapon systems 200 may include turrets or platform-mounted guns which include the weapon/motor 225-235, camera/sensor unit 240, and targeting/firing system 210, as well as the operator controls 245 and display 250. However, in other embodiments, some or all of the components of a weapon system 200 may be non-integrated and located remotely from the others. For example, in some cases the weapon/motor 225-235 and a subset of the sensors/cameras 240 may be located near the potential targets, while the targeting/firing system 210 and operator interface components 245-250 may be at a distant remote location. Certain sensors 240 may be located at or near the weapon 225 (e.g., to measure distance to target, current location, weapon movement and vibration, wind and weather conditions, etc.), while other sensors 240 may be positioned at or near the target and/or at other angles to the target, and still other sensors or cameras 240 may be remotely located (e.g., drone-based cameras, satellite imagery, etc.). In embodiments in which certain components of a weapon system 200 are located remotely from others, each of the components may include network transceivers and interfaces configured for secure network communication, including components for data encryption and transmission over public or private computer networks, satellite transmission systems, and/or secure short-range wireless communications, etc.
The targeting/firing system 210 may receive input data from various data sources, and analyze the data to identify, select, and prioritize targets, actuate the motor 235, dynamically track targets, generate firing solutions, and control firing of the weapon 225. In order to perform these functions, the targeting/firing system 210 may receive data from one or more cameras/sensor units 240, including a GPS unit 211. The sensor data may include images of targets and potential targets, distance/range data, heat or infrared data, audio data, vehicle or weapon location data, vehicle or weapon movement and vibration data, wind and weather condition data, and any other sensor data described herein. Additionally, one or more data stores may store system configuration and operation data, including a rules data store 213 and a profiles data store 214. The rules data store 213 may include, for example, target identification rules, target selection/priority rules, firing rules, and other rules of engagement, each of which may depend on the particular operation, the current location of the weapon system 200, the individual operator, etc. The profiles data store 214 may include, for example, individual user profiles with user preferences and parameters, weapon profiles, and/or ballistic profiles that may include specifications for individual weapon types and ammunition types that may be used to calculate maximum ranges and targeting solutions. Additionally, one or more communication modules 212 within the targeting/firing system 210 may be used to receive commands and other data from the current operator and/or from a separate command center. As discussed below, commands received from a command center or other higher-level authority may be used to control the target selection and rules of engagement for particular operations. Communication modules 212 also may be used to receive or retrieve sensor data from remote sensor systems, including satellite data, image data from remote cameras, target GPS data, weather data, etc. The targeting/firing system 210 may include various components (e.g., targeting component 220) configured to receive and analyze the various data to perform targeting functions, including subcomponents for target detection 221, target selection 222, target tracking 223, and firing control 215, among others.
The operator controls 245 and display screen 250 may correspond to the input/output interface between the human operator and the weapon system 200. As noted above, certain weapons systems 200 may be fully autonomous, or may operate in a supervised autonomous mode, in which case the operator controls 245 and display screen 250 need not be present. Additionally, the operator controls 245 and display screen 250 may be remotely located in some embodiments, allowing the operators to control the weapon system 200 from a separate location that may be a few feet away or across the globe. The display device 250 may receive and output various user interface views to the operator, including views described below for identifying and highlighting targets, obscuring non-targets, rendering target points, weapon trajectories, and confidence ranges, and providing various additional sensor readings to the operator. The operator controls 245 may allow the operator to identify, select, and mark targets, and to fire the weapon 225. As shown in this example, the operator controls 245 may include a fire button 246 (to fire the weapon 225), and a “next target” button 247 to instruct the targeting component 220 to re-select the next priority target. In certain embodiments, the operator controls might include only these two buttons, and need not include a joystick for aiming, tracking, etc.
Referring briefly to FIG. 3, an example of the operator controls 245 and display screen 250 is shown.
Referring now to FIG. 4, a flow diagram is shown illustrating an example process by which a motorized weapon system 200 may perform a kill chain sequence, including target identification, selection, prioritization, tracking, and firing.
In step 401, the components of the motorized weapon system 200 may identify and verify one or more targets, using sensor units 240 and/or additional data sources. In some embodiments, the identification and/or verification of targets may be performed fully autonomously by the system 200. For example, image data from cameras and sensor data from other sensors 240 (e.g., range to target data, heat data, audio, etc.) may be used to identify one or more targets within the range and proximity of the weapon system 200. In some cases, data from additional sources may be used as well, including imagery or sensor data from remote sensor or imaging systems (e.g., other weapons systems 200, fixed cameras, drones, satellites, etc.). For example, if sensor unit 240 does not include a rangefinder and/or if exact range to target data is not available, the targeting/firing system 210 may be configured to calculate approximate range data using passive ranging techniques. For example, heights of known objects (or presumed heights) may be used to calculate the distance of those objects from the weapon system 200. Additional sources of target data also may be received via communication modules 212, which may include the GPS coordinates of targets, or bearing to targets, received from a command center. Such image data and other sensor data received from additional data sources may be used by the targeting/firing system 210 to triangulate or confirm a target's location, or verify the identity of a target, etc.
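As an illustration of the passive ranging technique described above, the following sketch estimates range from the apparent size of an object of known (or presumed) height under a pinhole-camera assumption. The function name and parameter values are hypothetical examples provided for explanation, not components of the disclosed system.

```python
def estimate_range_m(known_height_m: float, pixel_height: float,
                     focal_length_px: float) -> float:
    """Approximate range to an object of known (or presumed) height.

    Pinhole-camera model: apparent size scales inversely with distance,
    so range = H * f / h, where H is the true height, f is the focal
    length in pixels, and h is the measured height in pixels.
    """
    if pixel_height <= 0:
        raise ValueError("object not resolved in image")
    return known_height_m * focal_length_px / pixel_height

# Example: a presumed 2.5 m-tall vehicle imaged at 50 px by a camera
# with a 4000 px focal length would be roughly 200 m away.
print(estimate_range_m(2.5, 50.0, 4000.0))  # 200.0
```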
As used herein, target identification and target verification refer to related but separate techniques. Target identification (or target detection) refers to the analysis of camera images, sensor data, etc., to detect objects and identify the detected objects as potential targets for the weapon system 200 (e.g., vehicles, structures, weapons, individuals, etc.), rather than generally non-target objects such as rocks, trees, hills, shadows, and the like. Target verification (or target confirmation) refers to additional analyses of the same images/sensor data, and/or additional sources of images/sensor data, to determine whether or not the identified potential target should be selected for targeting by the weapon system 200. Target verification techniques may be based on the configuration of the system and priorities of the particular mission, etc. For example, target verification techniques for vehicles may include identifying the size of a vehicle target (e.g., based on image analysis, target range, heat signatures from engines, etc.), the vehicle type (e.g., based on image analysis, and comparisons to a database 214 of target/non-target images), the presence of weapons on a target or proximate to a target, etc. For example, the size, shape, color, movement, audio and heat signatures of a vehicle may be analyzed to determine if that vehicle is a drone, helicopter, aircraft, boat, tank, truck, jeep, or car, whether the target is a military or civilian vehicle, the number of individuals and/or weapons on the vehicle, and the like, all of which may be used by a rules database 213 to determine whether the vehicle is a target or a non-target. Target verification also may include identifying particular insignia on targets, and for human targets, facial recognition and/or biometric recognition to confirm the identity of the target.
In some cases, both target identification and target verification in step 401 may be performed fully autonomously by the weapon system 200, using the techniques described above. In other cases, target identification and/or verification may include semi-autonomous or manual steps. For example, the rules of engagement for particular operations may require that each target be visually confirmed by a human operator. Such visual confirmation may be performed by the operator, as described in steps 406-407 below. Additionally or alternatively, the visual confirmation may be received from a different user, such as a commanding officer at a remote command center or other authorized user. In such cases, the weapon system 200 may be configured to transmit imagery and other sensor data to one or more remote locations, and then to receive the instructions identifying the potential target as a selected target or a non-target, from the remote authorized user/command center via a communication module 212. These remote visual confirmation techniques may be entirely transparent with respect to the operator of the weapon system 200 in some cases; that is, if a target is not selected/confirmed by a remote authorized user, then that target might not ever be rendered or selected via the operator display device and/or might not be selectable by the operator during steps 406-407.
As noted above, both target identification and target selection in step 401 may be based on sets of rules received from a rules database 213 or other sources. Target selection rules may be based on target type (e.g., types of vehicles, individuals (if any), structures, etc.), target size, target distance, the presence and types of weapons on a target, the uniform/insignia on a target, and the like. Additional rules may relate to the probability that the target has been accurately identified (e.g., level of confidence of facial recognition, vehicle type identification, insignia recognition, etc.), the probability that the weapon system 200 will be able to hit the selected target (e.g., based on target distance, target movement, weapon and ammunition type, wind and weather conditions, etc.), and/or the presence of potential collateral damage that may occur if the target is fired upon (e.g., based on detection of friendly and non-targets in the proximity of the identified target). Different sets of rules may be applied for different operators, different weapons 225 and ammunition types, different times, and/or different physical locations for the engagement. For instance, while one set of target identification, selection, and prioritization rules may be selected and applied by the targeting/firing system 210 for an engagement with a particular operator, at a particular date and time, using a particular weapon/ammunition type, in a particular country/region of the engagement, having particular lighting or weather conditions, and so on, an entirely different set of target identification, selection, and prioritization rules may be selected and applied by the targeting/firing system 210 if one or more of these variables (e.g., operator, time, weapon or ammunition type, engagement location or environmental conditions, etc.) changes.
In step 402, for scenarios in which multiple targets have been identified and selected in step 401, the targeting/firing system 210 of the motorized weapon system 200 may be configured to prioritize the multiple targets, thereby determining a firing order. As with the techniques for target identification and selection described above, target prioritization techniques similarly may be based on imagery and sensor data, as well as sets of operational rules that may apply to operators, weapons, locations, etc. Examples of target prioritization rules may include, without limitation, rules that prioritize vehicles over human targets, certain types of vehicles over other types of vehicles, armored vehicles over non-armored vehicles, armed targets over non-armed targets, uniformed/insignia targets over non-uniformed/non-insignia targets, close targets over far targets, advancing targets over stationary or retreating targets, higher confidence targets (i.e., higher probability of the weapon being able to hit the target) over lower confidence targets, targets firing weapons over targets not firing weapons, and/or any combination of these criteria. In some examples, the targeting/firing system 210 may evaluate the current target distance and trajectory of all advancing and armed targets (e.g., missiles, drones, ground vehicles, and individuals, etc.), in order to prioritize the targets in the order in which they would first reach the current position (or future position) of the weapon system 200. These target prioritization rules also may include rules determining how particular types of targets may be targeted. For example, such rules may include the desired point of impact for a particular target type (e.g., the engine of a boat, the center of mass of an individual, etc.).
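By way of illustration only, the following sketch ranks tracked targets by a naive time-to-arrival estimate (range divided by closing speed), one simplified instance of the "order in which targets would first reach the weapon system" rule described above. The Track structure and scoring function are assumptions made for this example; a deployed system would combine many additional criteria from the rules data store 213.

```python
import math
from dataclasses import dataclass

@dataclass
class Track:
    target_id: str
    range_m: float        # current distance from the weapon system
    closing_mps: float    # speed toward the weapon (<= 0 means not advancing)

def time_to_arrival_s(t: Track) -> float:
    """Estimated time for an advancing target to reach the weapon position.

    Non-advancing targets receive an infinite score so they sort last.
    """
    return t.range_m / t.closing_mps if t.closing_mps > 0 else math.inf

def prioritize(tracks: list[Track]) -> list[Track]:
    # Soonest-arriving targets first.
    return sorted(tracks, key=time_to_arrival_s)

tracks = [Track("A", 1200.0, 10.0), Track("B", 400.0, 2.0), Track("C", 900.0, -5.0)]
print([t.target_id for t in prioritize(tracks)])  # ['A', 'B', 'C']
```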
Additionally, different sets of rules or algorithms may be applied for prioritizing targets, depending on the current operator, current location, current date/time, and/or based on predefined operation-specific rules of engagement. Further, rules or algorithms for prioritization may be based on or adjusted in view of current conditions, such as the current amount of ammunition of the weapon system 200 (e.g., lower ammunition circumstances may cause prioritization of the most valuable/important targets first), the current wind or weather conditions (e.g., in which closer and/or higher confidence targets may be prioritized), or based on nearby friendly or non-hostile targets (e.g., in which closer and/or higher confidence targets may be prioritized). Additionally, certain prioritizing algorithms may adjust the priorities of a set of targets to reduce and/or minimize the lag time between successive firings of the weapon, for instance, by prioritizing a set of nearby targets successively in the priority rank order, in order to reduce the firing latency time required to drive the weapon 225 through the sequence of targets.
In various embodiments, operators may be permitted to switch on-the-fly between different rules or algorithms for target selection and prioritization. Such switching capabilities may be based on the rank and/or authorization level of the operator, and in some cases may require that a request for approval be transmitted from the weapons system 200 to a higher-level user at a remote command center.
Referring briefly to FIG. 5, an example user interface 500 is shown, in which a selected primary target 501 and a secondary target 502 may be marked for the operator, and crosshairs 505 may represent the current aiming point of the weapon 225.
Finally, example user interface 500 includes two operator controls: a fire button 510 to allow the user to fire the weapon 225, and a next button 515 to allow the user to select the next target in the priority list. In this example, fire button 510 is shaded, indicating that the weapon 225 cannot currently be fired. As described below in more detail, this may represent a feature in which the operator's firing control mechanism 246 is disabled whenever the weapon 225 is not currently aimed at a selected target. However, it will be noted that the next button 515 is enabled in this example, indicating that the next mechanism 247, which allows the operator to change the primary target 501 to the next highest priority target 502 in the priority list, may be enabled even when the crosshairs 505 are not yet positioned on the primary target 501.
The kill chain sequence may continue by performing the functionality of steps 403-410 in a continuous loop for each of the targets selected in step 401, and in the priority order of the target prioritization performed in step 402. Therefore, the first iteration of steps 403-410 may be performed for the highest priority target, the second iteration of steps 403-410 may be performed for the second highest priority target, and so on.
In step 404, for the current highest priority target in the prioritization list, the targeting/firing system 210 may perform a dynamic tracking technique to determine a firing solution for that target. A firing solution refers to a precise firing position for the weapon (e.g., an azimuth/horizontal angle and altitude/elevation angle) and a precise firing time calculated by the targeting/firing system 210 to hit the primary target. For stationary targets, target tracking need not be performed, and the firing solution may be computed based on a number of factors, including the target distance and target bearing from the weapon 225, the muzzle velocity of the weapon 225, the aerodynamic drag of the projectile/ammunition to be fired, the wind and weather conditions, and gravity (any one of which may vary based on the current conditions).
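For intuition only, the following sketch computes the elevation angle for a level, stationary target using the textbook vacuum-trajectory relation, ignoring drag, wind, and the other factors listed above. It is a simplified stand-in for the ballistic computations the disclosure attributes to the targeting/firing system 210, and the example constants are illustrative assumptions.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def elevation_angle_rad(range_m: float, muzzle_velocity_mps: float) -> float:
    """Low-arc elevation angle to hit a level, stationary target.

    Uses the vacuum-trajectory relation R = v^2 * sin(2*theta) / g,
    solved for theta. A real firing solution must also account for drag,
    wind, target elevation, and projectile-specific ballistic profiles.
    """
    x = G * range_m / muzzle_velocity_mps**2
    if x > 1.0:
        raise ValueError("target beyond maximum vacuum range")
    return 0.5 * math.asin(x)

# Example: an 800 m target with an 850 m/s muzzle velocity requires an
# elevation of about 0.31 degrees in this simplified model.
print(math.degrees(elevation_angle_rad(800.0, 850.0)))
```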
When the target is moving and/or anticipated to be moving, dynamic target tracking may be required to generate a firing solution, introducing additional variables which may increase the complexity and uncertainty of the firing solution calculation. Initially, dynamic target tracking may involve calculating the anticipated direction and velocity of the target. In some embodiments, the targeting/firing system 210 may assume that the primary target will continue along its current course with the same velocity and direction. If the target is currently moving along a curved path, and/or is currently accelerating or decelerating, then the targeting/firing system 210 may assume the same curved path and/or the same acceleration/deceleration pattern, and may extrapolate out based on those variables. Further, in some embodiments, the targeting/firing system 210 may anticipate future changes in course or speed, based on factors such as upcoming obstructions in the target's path, curves in roads, previous flight patterns, etc.
In addition to dynamically tracking the target in order to anticipate the future position of the target, the determination of a firing solution for a moving target also may take into account the anticipated time to drive the motor 235 so that the weapon is positioned at the correct firing point, and the anticipated amount of time between the firing command and when the projectile/ammunition will reach the target. The time to drive the motor 235 may be calculated based on the distance the gun is to be driven, the speed of the motor and/or the weight of the weapon 225. The amount of time between receiving a firing command and when the projectile/ammunition will reach the target may be based on the muzzle velocity of the weapon 225, the aerodynamic drag of the projectile/ammunition to be fired, the wind and weather conditions, etc. Additionally, in some cases, an anticipated delay for operator reaction time (e.g., 0.5 seconds, 1 second) also may be included in the firing solution calculation.
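Combining these timing terms, a simple fixed-point iteration can estimate the lead point for a constant-velocity target: aim where the target will be once the motor drive time, any operator reaction allowance, and the projectile time of flight have elapsed. The sketch below is an assumed simplification (constant target velocity, constant average projectile speed, positions measured relative to the weapon); it is offered as an illustration, not as the disclosed system's actual solver.

```python
def lead_point(target_pos, target_vel, avg_projectile_mps,
               slew_s=0.3, reaction_s=0.5, iterations=8):
    """Iteratively estimate where to aim for a constant-velocity target.

    Total delay = motor slew time + operator reaction allowance + time
    of flight; the time of flight itself depends on the (moving) aim
    point, hence the fixed-point iteration. Positions are (x, y) in
    meters relative to the weapon.
    """
    px, py = target_pos
    vx, vy = target_vel
    tof = 0.0
    for _ in range(iterations):
        delay = slew_s + reaction_s + tof
        aim_x, aim_y = px + vx * delay, py + vy * delay
        tof = (aim_x**2 + aim_y**2) ** 0.5 / avg_projectile_mps
    return aim_x, aim_y

# Example: target 600 m north, moving 15 m/s east, with a ~700 m/s
# average projectile speed; the aim point leads the target by ~25 m.
print(lead_point((0.0, 600.0), (15.0, 0.0), 700.0))
```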
Referring briefly to FIG. 6, another example user interface 600 is shown, in which the display has zoomed in on a moving primary target 601, and in which crosshairs 605 may represent the current aiming point of the weapon 225, point 606 may represent the current position of the target, and point 607 may represent the firing solution target point determined via the dynamic tracking techniques described above.
Further, example interface 600 also includes three operator controls: a fire button 610, a next button 615, and a safe button 620. As discussed above, the fire button 610 allows the operator to fire the weapon 225, but in some cases might be enabled only after the weapon 225 has reached the firing solution point 607. The next button 615 allows the operator not to fire the weapon 225 at the primary target 601, but instead to re-select the next highest priority target in the priority list. In this example, the primary target 601 may be moved to the back of the priority list or elsewhere in the priority list, based on the operator's selection of the next control 615. Finally, the safe button 620 allows the operator to mark the currently selected primary target 601 as a friendly or non-target object, thereby removing it from the set of selected targets determined in step 401 and the priority list of step 402. Thus, after an operator has marked a target using the safe mechanism 620, it may not be selected again by the targeting/firing system 210, at least during the current engagement by the current weapon system 200. In some embodiments, the configuration settings of the targeting/firing system 210 may determine that a target marked as safe by an operator during one engagement might thereafter be excluded from target selection/prioritization in future engagements. Additionally or alternatively, weapon system 200 may transmit data identifying any targets marked as safe to other weapons systems 200 in the same general location, so that those other weapons systems 200 may automatically remove the target marked as safe from their target selection/prioritization lists as well.
Although step 404 was described above as performed for only a single target (i.e., the current highest priority target), in some embodiments, the targeting/firing system 210 may continuously perform dynamic tracking for all targets selected/prioritized in steps 401-402. In such cases, by performing dynamic tracking on the selected secondary target(s), before the completion of the firing sequence 403-410 for the primary target, the targeting/firing system 210 may more quickly and efficiently determine the firing solution for the next primary target as soon as the firing sequence 403-410 is completed for the first primary target. Additionally, while dynamically tracking a plurality of secondary target(s), the targeting/firing system 210 may potentially re-order the prioritization sequence determined in step 402, for example, based on movement of the secondary targets and/or based on newly received data about one or more of the secondary targets (e.g., improved verification information, additional threat information, etc.).
In step 405, the targeting/firing system 210 may engage the motor 235 to drive the orientation of the weapon 225 toward the firing solution determined for the primary target in step 404. Thus, referring again to FIG. 6, engaging the motor 235 may drive the crosshairs 605, representing the current aiming point of the weapon 225, toward the firing solution target point 607.
In step 406, the targeting/firing system 210 may generate and transmit a user interface to be rendered for the operator via one or more display devices 250. As discussed above, the human operator may be located at the weapon system 200 or remote to the weapon system 200, in which case the user interface may be transmitted via the communication module 212 over one or more secure computer networks, wireless networks, satellite networks, etc. In various embodiments, the user interface provided in step 406 may correspond to user interfaces 500 and/or 600 discussed above, although several variations may be implemented in different embodiments. For instance, as noted above, the primary target 501 may be marked by a particular scheme that is different from the secondary targets and from non-targets. In some cases, the user interface may automatically zoom in on the primary target (as in screen 600) to allow the operator the best possible visual of the target. Additionally or alternatively, secondary targets and/or non-targets may be blocked out, hidden, or otherwise obscured to prevent confusion or distraction of the operator. Further, in different embodiments, each of the various different target points discussed above (e.g., crosshairs 605 representing the current weapon aiming point, the current target position point 606, and/or the firing solution target point 607) may or may not be rendered within the user interface, and/or may be shown in different colors, using different graphics and icons, etc. Finally, the user interface generated and rendered in step 406 may include additional components such as side menus, overlays, and the like, to convey any relevant sensor information about the target or the firing environment. Examples of such sensor information that may be included in the operator user interface may include the target type, the verified target name/identifier (if known) and the confidence level of the verification, distance to target, current wind and weather conditions, current status of weapon 225 and ammunition supply, number of other secondary targets, etc.
In step 407, the targeting/firing system 210 may receive engagement instructions from the operator, via operator controls 245. As illustrated in FIGS. 5 and 6, the engagement instructions may include a fire command, a next command, and/or a safe command, input via the corresponding operator controls 245.
Additionally, as noted above during the discussion of the dynamic target tracking, there may be a time delay between steps 406 and 407, for target analysis, evaluation, and decision-making by the operator. During this time delay, the dynamic tracking may continue for the primary target as well as the secondary targets selected by the targeting/firing system 210. Thus, while the operator deliberates on whether or not to fire on a target between steps 406 and 407, for moving targets and/or other circumstances (e.g., a detected change in the wind), the firing solution may be updated during this time delay and the motor 235 may be continuously engaged so that the weapon 225 is continuously aimed at the most recent firing solution target point. Additionally, for excessive delays or deliberations between steps 406 and 407, the target identification, selection, and prioritization techniques discussed above in steps 401 and 402 may be updated, automatically and entirely transparently to the operator, to re-select and re-prioritize the targets based on new imagery, sensor data, and other relevant data received during the time delay between steps 406-407.
After receiving the firing/engagement instructions from the operator in step 407, the targeting/firing system 210 may perform the received instructions in steps 408-410. In this example, similar to the controls shown in FIG. 6, the received instructions may correspond to a fire command (step 408), a next command (step 409), or a safe command (step 410). The fire command (step 408) is an operator instruction to fire the weapon 225 at the primary target, which the targeting/firing system 210 may execute via the firing mechanism of the weapon system 200.
The next command (step 409) is an operator instruction not to fire the weapon 225 at the target, but to retain the target within the set of selected targets/target priority list, and then to re-select the next highest priority target in the priority list. In various examples, a next command in step 409 may cause the target to be placed at the back of the priority list of selected targets, or may cause the target to be placed immediately after the next highest priority target in the priority list. Finally, a safe command (step 410) is an operator instruction to mark the target as a friendly or non-target object, thereby removing it from the set of selected targets and target priority list. Thus, after step 410, the target may not be selected again by the targeting/firing system 210, during at least the current engagement by the current weapon system 200. As noted above, in some embodiments, a target marked as safe during step 410 during an engagement at one weapon system 200 also might be excluded from target selection in future engagements of the weapon system 200, and/or during current and future engagements at different weapons systems 200.
Thus, the various techniques discussed above with reference to FIGS. 4-6 may provide increased system efficiency, increased rate of firing, improved weapon system accuracy, and reduced operator error, along with the other technical advantages described herein.
As mentioned above, certain aspects of the present disclosure relate to techniques for disabling and re-enabling an operator firing control (e.g., 246), during the period of time when the motor 235 of a motorized weapon system 200 is engaged and the weapon 225 is being positioned and oriented toward a determined target point for firing. The process of engaging the motor 235 of the weapon system 200 to position the weapon 225 to fire on a particular target point may take anywhere from a fraction of a second to several seconds, depending on factors such as the motor size and speed, gun size and weight, angular distance to be traveled, etc. During the time period when the motor 235 is engaged in positioning the weapon 225, the projected point of impact of a projectile fired from the weapon 225 may become closer and closer to the target point, and similarly, the likelihood of hitting the target may increase continuously until a maximum likelihood is reached when the projected point of impact of the weapon 225 (e.g., marked by crosshairs 505, 605, etc.) is directly on the determined firing solution target point. Because many unknown variables may exist during the weapon firing process (e.g., exact target distance and bearing, exact muzzle velocity and aerodynamic drag of the projectile, future target movement, exact wind and air pressure conditions, exact weapon vibration, and so on), the probability of hitting the target might never be 100%. However, when the likelihood of hitting the target is determined to be sufficiently high, e.g., above a predetermined likelihood threshold, then the targeting/firing system 210 may be configured to enable firing of the weapon 225 (and/or automatically fire the weapon 225).
Accordingly, in some embodiments, the targeting/firing system 210 may be configured to determine if/when the predetermined likelihood threshold for hitting the target is reached during the time period when the motor 235 is engaged in positioning the weapon 225, but before the crosshairs 505 are directly on the target (i.e., before the projected point of impact of the weapon 225 is directly on the determined firing solution target point). In such embodiments, the targeting/firing system 210 may be configured to disable the operator firing mechanism 246 when the current likelihood of hitting the target is below the predetermined likelihood threshold, based on the position/orientation of the weapon 225 and other factors. The operator firing mechanism 246 then may be re-enabled in response to the targeting/firing system 210 determining that the current likelihood of hitting the target is above the predetermined likelihood threshold. These aspects are described below in more detail with reference to FIGS. 7 and 8.
Referring now to FIG. 7, a flow diagram is shown illustrating an example process of disabling and enabling a manual firing mechanism of a motorized weapon system 200, based on determinations of whether the projected point of impact of the weapon 225 falls within a boundary area surrounding a determined target point.
In step 701, a motorized weapon system 200 has identified and selected a particular target, and determines a firing solution and/or target point for the selected target. Thus, step 701 may be similar or identical to step 404 discussed above. As noted above, one or both of the target and the weapon system 200 may potentially be moving during this process. When both the target and the weapon 225 are stationary, target tracking need not be performed, and the firing solution target point may be computed based on factors including the target distance, target bearing from the weapon 225, muzzle velocity of the weapon 225, aerodynamic drag of the projectile/ammunition to be fired, the wind and weather conditions, and gravity (any one of which may vary based on the current conditions). However, when one or both of the selected target and the weapon 225 are moving and/or are anticipated to be moving, dynamic target tracking may be required to generate a firing solution, and additional variables may increase the complexity and uncertainty of the firing solution calculation. For example, dynamic target tracking may be used to determine the current velocity and direction of travel of both the weapon system 200 and the target, and that data may be used to calculate the anticipated velocity and direction of travel of both in the near future. In some cases, the targeting/firing system 210 may assume that both the weapon system 200 and the target may continue along their current course with the same velocity and direction, and if either is currently moving along a curved path and/or is currently accelerating/decelerating, then the targeting/firing system 210 may assume the same curved path and/or the same acceleration/deceleration in the near future. As noted above, when performing dynamic tracking on a moving target, the determination of a firing solution (e.g., predicted future coordinates at a future firing time) also may take into account the anticipated time to engage the motor 235 to position and orient the weapon at the correct firing point, as well as the anticipated time lag for the fired projectile to reach the target. Additionally, in some cases, the targeting/firing system 210 may build in an anticipated delay for operator reaction time (e.g., 0.5 seconds, 1 second) which may be included in the firing solution calculations for moving targets.
In step 702, the targeting/firing system 210 of the motorized weapon system 200 may determine a boundary area surrounding the target point determined in step 701. In some examples, the boundary area may be referred to as a “confidence lock” boundary, because as discussed below, the firing mechanism may be disabled when the projected point of impact of the weapon is outside of this area. From the perspective of the weapon system 200, the boundary area may be a circle or other two-dimensional closed shape surrounding the target point. A simple example of a circular boundary area 807 is shown in FIG. 8, discussed below.
In some embodiments, the boundary area may be circular, as shown in FIG. 8.
The size of the boundary area determined in step 702 may be based on any combination of factors that may introduce uncertainty into the point of impact calculation of the weapon 225 with respect to the target. For instance, the size of the boundary area (e.g., in terms of angular degrees or coordinates) may be based on one or more of the target size, the distance between the weapon 225 and the target, the general accuracy and precision data for the weapon type 225 and ammunition type, and other factors such as wind, the vibration level of the weapon 225 during movement by the motor, and current movement of the weapon system 200 and/or the target. In scenarios where there is a high degree of confidence in the predictive accuracy of the weapon's crosshairs, the boundary area may be relatively small. In contrast, for scenarios of greater uncertainty in the relevant variables, where the confidence level in the predictive accuracy of the weapon's crosshairs is lower, the boundary area may be relatively large.
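One plausible way to size such a boundary, consistent with the factors listed above, is to combine independent angular error sources in quadrature and retain only aim points from which a k-sigma miss would still land on the target. The formulation and numbers below are assumptions made for illustration, not the specific computation used by the disclosed system.

```python
import math

def boundary_radius_mrad(target_radius_m: float, range_m: float,
                         sigma_sources_mrad: list[float],
                         k_sigma: float = 2.0) -> float:
    """Angular radius of a 'confidence lock' boundary around a target point.

    Combines independent 1-sigma angular error sources (weapon dispersion,
    wind uncertainty, vibration, tracking error, ...) in quadrature, then
    keeps only aim points from which a k-sigma miss still lands on target.
    Returns 0 when no aim point meets the threshold (no lock possible).
    """
    target_mrad = 1000.0 * target_radius_m / range_m  # small-angle approximation
    sigma_total = math.sqrt(sum(s * s for s in sigma_sources_mrad))
    return max(0.0, target_mrad - k_sigma * sigma_total)

# Example: a 1.5 m-radius target at 500 m subtends ~3 mrad; with a combined
# 1-sigma error of ~0.71 mrad and k = 2, the lock boundary is ~1.59 mrad.
print(boundary_radius_mrad(1.5, 500.0, [0.5, 0.4, 0.3]))
```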
In step 703, the targeting/firing system 210 engages the motor 235 to position and orient the weapon 225 toward the target point identified in step 701. Thus, step 703 may be similar or identical to step 405, discussed above. For example, referring back to FIG. 6, engaging the motor 235 may drive the crosshairs 605 from their initial position toward the firing solution target point 607.
In step 704, at a particular point in time when the motor 235 is engaged and the weapon 225 is moving, the targeting/firing system 210 may compute the projected point of impact if a projectile were fired from the weapon 225 at that time. The projected point of impact corresponds to the calculation of the crosshairs (e.g., 505 and 605) discussed above and shown in FIGS. 5 and 6.
In step 705, the targeting/firing system 210 may compare the projected point of impact computed in step 704 to the “confidence lock” boundary area defined in step 702. This may be a straightforward comparison of angular coordinates from the perspective of the weapon 225. If the current point of impact of the weapon 225 is projected to fall outside of the defined boundary area (705:No), then in step 706 the targeting/firing system 210 may disable the operator firing mechanism 246, thereby preventing the weapon 225 from being fired. However, if the current point of impact of the weapon 225 is projected to fall within the defined boundary area (705:Yes), then in step 707 the targeting/firing system 210 may enable (or re-enable) the operator firing mechanism 246, thereby allowing the operator to fire the weapon 225.
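In code, the comparison of step 705 and the toggling of steps 706-707 can reduce to a containment test in angular coordinates. The sketch below assumes a circular boundary and uses a hypothetical stand-in class for the operator firing mechanism; it illustrates the enable/disable logic only, not an actual fire-control interface.

```python
class FiringMechanism:
    """Hypothetical stand-in for an operator firing control (e.g., 246)."""
    def __init__(self):
        self.enabled = False
    def enable(self):
        self.enabled = True
    def disable(self):
        self.enabled = False

def within_boundary(aim_az_mrad: float, aim_el_mrad: float,
                    target_az_mrad: float, target_el_mrad: float,
                    boundary_radius_mrad: float) -> bool:
    """Is the projected point of impact inside the circular lock boundary?"""
    d_az = aim_az_mrad - target_az_mrad
    d_el = aim_el_mrad - target_el_mrad
    return (d_az * d_az + d_el * d_el) ** 0.5 <= boundary_radius_mrad

def update_fire_control(firing_mechanism, aim, target, radius_mrad):
    # Step 705: compare the projected impact point to the boundary area,
    # then step 706 (disable) or step 707 (enable) accordingly.
    if within_boundary(*aim, *target, radius_mrad):
        firing_mechanism.enable()
    else:
        firing_mechanism.disable()

fm = FiringMechanism()
update_fire_control(fm, aim=(1.0, 0.5), target=(0.0, 0.0), radius_mrad=1.6)
print(fm.enabled)  # True: aim is ~1.12 mrad from the target point
```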
In some embodiments, after the operator firing mechanism 246 has been re-enabled in step 707, and the operator fires on the target, the targeting/firing system 210 may be configured to perform a rapid post-firing command movement of the weapon 225 in order to further improve shot confidence. For instance, after the operator pushes the enabled firing mechanism 246, rather than immediately firing the weapon 225, the targeting/firing system 210 in some cases may engage the motor 235 for a short amount of time (e.g., 50 ms, 100 ms, 200 ms, etc.), in response to a determination that the corresponding small weapon movement may significantly increase shot confidence. These short post-firing command movements may be performed in the case of moving targets and/or moving weapon systems 200, in the event of a sudden change in the trajectory of the target, to correct for a lag in operator reaction time, and/or as part of a firing burst to increase hit probability.
Referring briefly to FIG. 8, as the motor 235 moves the weapon 225 toward the target position, the crosshairs 805 representing the projected point of impact approach the target point 806 on the target 801. As further shown in FIG. 8, steps 704 through 707 may be repeated periodically or continuously during this movement, so that the operator firing mechanism 246 is enabled whenever the crosshairs 805 fall within the confidence lock boundary area 807, and disabled whenever they fall outside of it.
As mentioned above, these steps may be performed periodically or continuously even when the motor 235 is not moving and the crosshairs 805 are fixed on the target point 806. In these scenarios, a new event such as a change in movement of the target 801 or the weapon system 200, an object obscuring the target 801, and/or new sensor readings (e.g., a change in wind conditions) may temporarily cause the probability of the weapon 225 hitting the target to drop below the predetermined likelihood threshold and out of the confidence lock boundary area 807, requiring a minor adjustment via the motor 235 or other corrective action by the weapon system 200.
Using similar techniques to those discussed above in reference to FIG. 7, the targeting/firing system 210 also may monitor and refine its confidence determinations over the course of an operation.
In some embodiments, over the course of a particular operation (or multiple operations at or near the same location), the firing/targeting system 210 may continuously assess and evaluate its targeting accuracy, which may result in the system 210 increasing or decreasing the confidence levels it had previously computed for one or more selected targets. As an example, if a first target is initially determined to be too small and too far away to have a sufficiently high confidence level for firing on the target, the firing/targeting system 210 may instead select a number of closer targets and may fire on those targets. Then, by analyzing the firing trajectories and the accuracy of hitting the closer targets, the firing/targeting system 210 may be better able to evaluate the range, lighting, wind conditions, and the like, so that the confidence level for hitting the first target may be increased based on the accuracy feedback from the closer targets.
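As one illustration of such accuracy feedback, the Python sketch below blends a newly measured miss distance, gathered from shots at closer targets, into a prior error estimate. The simple weighted-average model and all names are assumptions for illustration, not a disclosed algorithm.

def recalibrated_error_deg(observations, prior_error_deg, weight=0.5):
    # observations: list of (projected_deg, observed_deg) angular impact
    # pairs gathered from shots fired at the closer targets.
    if not observations:
        return prior_error_deg
    measured = sum(abs(p - o) for p, o in observations) / len(observations)
    # Blend the measured miss distance with the prior error model; a smaller
    # updated error estimate raises the confidence level computed for more
    # distant targets, such as the first target in the example above.
    return weight * measured + (1.0 - weight) * prior_error_deg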
As demonstrated in the above examples, a motorized weapon system 200 may be weapon-agnostic, in that a weapon system 200 may support many different types or models of weapons 225, including various firearms, large caliber rifles, machine guns, autocannons, grenade launchers, rockets, and/or directed energy weapons such as lasers, high-power microwave emitters, and the like. Further, the targeting/firing system 210 may store weapon profiles in data store 214 and/or weapon-specific rules in data store 213 that allow the weapon system 200 to perform the techniques discussed herein in a similar or identical manner regardless of the current weapon type. In some embodiments, the targeting/firing system 210, sensor units 240, and the operator interface 245-250 may function identically regardless of the type of motor 235, mount 230, and weapon 225 integrated into the system 200. Because systems 200 having different types of weapons 225, mounts 230, and/or motors 235 may perform differently in some respects (e.g., the time required to re-position and re-orient the weapon 225, the maximum range of the weapon, and the type, size, and speed of the projectiles fired), the targeting/firing system 210 may be configured to initially determine these weapon-specific data factors and adjust the techniques described herein to provide a uniform operator experience.
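For illustration, a weapon profile record such as might be held in data store 214 could take the following form in Python. The field names and the accessor are hypothetical examples of the weapon-specific factors described above, not disclosed data formats.

from dataclasses import dataclass

@dataclass
class WeaponProfile:
    weapon_type: str             # e.g., "large_caliber_rifle"
    max_range_m: float           # maximum effective range of the weapon
    muzzle_velocity_mps: float   # used in point-of-impact projection
    dispersion_deg: float        # baseline accuracy of weapon + ammunition
    slew_rate_deg_per_s: float   # motor/mount repositioning speed

def load_profile(data_store, weapon_id) -> WeaponProfile:
    # The targeting/firing system reads the profile once and parameterizes
    # target selection, prioritization, and boundary computation with it,
    # keeping the operator interface identical across weapon types.
    return data_store.get_weapon_profile(weapon_id)  # hypothetical accessor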
For instance, the targeting/firing system 210 of a first weapon system 200 may automatically select targets based on the firing range of the weapon 225 installed on that system 200, whereas a different system 200 might select more or fewer targets based on its having a weapon 225 with a different range. In another example, a first weapon system 200 may prioritize a set of selected targets taking into account the speed of the motor 235 on that system 200, whereas a different system 200 might prioritize the same set of targets differently as a result of having a different motor speed. As yet another example, different sensor units 240 having different numbers, types, and/or qualities of cameras and other sensors may result in different sets of inputs being provided to the targeting/firing systems 210. As a result, a first weapon system 200 may have sufficient data to select and verify a target with high confidence, while a second weapon system 200 with different cameras/sensors 240 would not select that target because it could not verify the target with a sufficient confidence level. In all of these examples, the different behaviors of the weapon systems 200 resulting from different weapons 225, mounts 230, motors 235, and/or sensor units 240 may be entirely transparent to the operator. In some cases, operators of weapon systems 200 need not ever know what weapon 225 they are firing, and the entire operator interface may function identically regardless of the particular weapon, motor, mount, or sensor unit. These similarities may apply to the operator interface with respect to the kill chain sequence described above.
Additional techniques applicable to the above examples include the implementation of operation-specific rules of engagement that may be retrieved/received and enforced by the targeting/firing system 210. As discussed above, specific rules of engagement and/or operational parameters for the motorized weapon system may include different requirements or parameters for target identification and selection, different minimum confidence thresholds for firing the weapon 225, different target prioritization algorithms, and so on. In some embodiments, the motorized weapon system 200 may be configured to receive a set of operation-specific rules of engagement from a remote command center via a secure communication channel, and to store and apply those operation-specific rules during the appropriate operation. As noted above, specific rules of engagement and/or sets of operational parameters may be associated with specific operators, operator ranks, and/or engagement locations (e.g., country, region, etc.). In some embodiments, operators having sufficient rank and/or authorization levels may be permitted to manually override certain rules of engagement and/or operational parameters of the weapon system 200, and to apply the operator's own preferred rules/parameters in their place. Additionally or alternatively, such overrides may require outside approval, in which case, upon receiving a rule/parameter override request from the operator, the weapon system may be configured to transmit a secure request for override approval to a remote command center.
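The rules-of-engagement retrieval and override flow described above might be structured as in the following Python sketch. All message formats, method names, and the rank-based override check are illustrative assumptions, not a disclosed protocol.

def apply_rules_of_engagement(weapon_system, command_center):
    # Retrieve operation-specific rules over a secure channel (hypothetical
    # interfaces), then store them for enforcement during the operation.
    rules = command_center.fetch_rules(weapon_system.operation_id)
    weapon_system.store_rules(rules)

def request_override(weapon_system, operator, proposed_rules, command_center):
    # Operators of sufficient rank may override locally; otherwise the
    # system transmits a secure approval request to the command center.
    if operator.rank >= weapon_system.rules.min_override_rank:
        weapon_system.store_rules(proposed_rules)
        return True
    approved = command_center.request_override_approval(operator, proposed_rules)
    if approved:
        weapon_system.store_rules(proposed_rules)
    return approved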
In several examples above, the target points for selected targets, including stationary and moving targets, are computed based on a desired point-of-impact location on the target (e.g., an engine of a boat or vehicle, the center of mass of an individual, etc.). However, in some embodiments, the targeting/firing system 210 may be configured with warning shot capabilities, in which the desired point-of-impact location is not on the target. For instance, the rules of engagement enforced by the targeting/firing system 210 for a particular operation may dictate that only warning shots are to be fired at a particular selected target. Alternatively, such rules may dictate that at least one initial warning shot is to be fired at a selected target before an attempt is made to hit the target. In some cases, the operator controls 245 also may include a warning shot mode that can be activated by the operator, independent of the rules of engagement of the operation, to allow the operator to independently fire one or more warning shots at any selected target.
When the targeting/firing system 210 is configured to operate in a warning shot mode, the firing solution may be adjusted to ensure that the projectiles fired by the weapon 225 will miss the target. In some embodiments, the targeting/firing system 210 may determine the preferred location of a desired warning shot based on the type and size of the target (e.g., the number and position of warning shots for human targets may be different than for vehicle targets), the orientation and/or the direction of movement of the target (e.g., it may be desirable to fire a warning shot directly in front of the target), and so on.
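As a minimal illustration, a warning-shot aim point placed ahead of a moving target might be computed as in the following Python sketch. The lead time, miss margin, and coordinate conventions are assumptions for illustration only.

import math

def warning_shot_point(target_x, target_y, heading_rad, speed_mps,
                       lead_s=1.5, miss_margin_m=5.0):
    # Place the aim point in front of the target along its course, far
    # enough ahead (lead distance plus a fixed margin) that the projectile
    # lands ahead of the target rather than hitting it.
    offset_m = speed_mps * lead_s + miss_margin_m
    return (target_x + offset_m * math.cos(heading_rad),
            target_y + offset_m * math.sin(heading_rad))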
Implementation of the techniques, blocks, steps and means described above may be done in various ways. For example, these techniques, blocks, steps and means may be implemented in hardware, software, or a combination thereof. For a hardware implementation, the processing units may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described above, and/or a combination thereof.
Furthermore, embodiments may be implemented by hardware, software, scripting languages, firmware, middleware, microcode, hardware description languages, and/or any combination thereof. When implemented in software, firmware, middleware, scripting language, and/or microcode, the program code or code segments to perform the necessary tasks may be stored in a machine readable medium such as a storage medium. A code segment or machine-executable instruction may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a script, a class, or any combination of instructions, data structures, and/or program statements. A code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, and/or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
For a firmware and/or software implementation, the methodologies may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. Any machine-readable medium tangibly embodying instructions may be used in implementing the methodologies described herein. For example, software codes may be stored in a memory. Memory may be implemented within the processor or external to the processor. As used herein the term “memory” refers to any type of long term, short term, volatile, nonvolatile, or other storage medium and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
Moreover, as disclosed herein, the term “storage medium” may represent one or more memories for storing data, including read only memory (ROM), random access memory (RAM), magnetic RAM, core memory, magnetic disk storage mediums, optical storage mediums, flash memory devices, and/or other machine readable mediums for storing information. The term “machine-readable medium” includes, but is not limited to, portable or fixed storage devices, optical storage devices, and/or various other storage mediums capable of containing or carrying instruction(s) and/or data.
A computer system as illustrated in FIG. 9 may be used to implement one or more of the computerized components described above, such as the targeting/firing system 210 of the motorized weapon system 200.
The computer system 900 is shown comprising hardware elements that can be electrically coupled via a bus 905 (or may otherwise be in communication, as appropriate). The hardware elements may include one or more processors 910, including without limitation one or more general-purpose processors and/or one or more special-purpose processors (such as digital signal processing chips, graphics acceleration processors, video decoders, and/or the like); one or more input devices 915, which can include without limitation a mouse, a keyboard, remote control, and/or the like; and one or more output devices 920, which can include without limitation a display device, a printer, and/or the like.
The computer system 900 may further include (and/or be in communication with) one or more non-transitory storage devices 925, which can comprise, without limitation, local and/or network accessible storage, and/or can include, without limitation, a disk drive, a drive array, an optical storage device, a solid-state storage device, such as a random access memory (“RAM”), and/or a read-only memory (“ROM”), which can be programmable, flash-updateable and/or the like. Such storage devices may be configured to implement any appropriate data stores, including without limitation, various file systems, database structures, and/or the like.
The computer system 900 might also include a communications subsystem 930, which can include without limitation a modem, a network card (wireless or wired), an infrared communication device, a wireless communication device, and/or a chipset (such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, cellular communication device, etc.), and/or the like. The communications subsystem 930 may permit data to be exchanged with a network (such as the network described below, to name one example), other computer systems, and/or any other devices described herein. In many embodiments, the computer system 900 will further comprise a working memory 935, which can include a RAM or ROM device, as described above.
The computer system 900 also can comprise software elements, shown as being currently located within the working memory 935, including an operating system 940, device drivers, executable libraries, and/or other code, such as one or more application programs 945, which may comprise computer programs provided by various embodiments, and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein. Merely by way of example, one or more procedures described with respect to the method(s) discussed above might be implemented as code and/or instructions executable by a computer (and/or a processor within a computer); in an aspect, then, such code and/or instructions can be used to configure and/or adapt a general purpose computer (or other device) to perform one or more operations in accordance with the described methods.
A set of these instructions and/or code might be stored on a non-transitory computer-readable storage medium, such as the non-transitory storage device(s) 925 described above. In some cases, the storage medium might be incorporated within a computer system, such as computer system 900. In other embodiments, the storage medium might be separate from a computer system (e.g., a removable medium, such as a compact disc), and/or provided in an installation package, such that the storage medium can be used to program, configure, and/or adapt a general purpose computer with the instructions/code stored thereon. These instructions might take the form of executable code, which is executable by the computer system 900, and/or might take the form of source and/or installable code, which, upon compilation and/or installation on the computer system 900 (e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc.), then takes the form of executable code.
It will be apparent to those skilled in the art that substantial variations may be made in accordance with specific requirements. For example, customized hardware might also be used, and/or particular elements might be implemented in hardware, software (including portable software, such as applets, etc.), or both. Further, connection to other computing devices such as network input/output devices may be employed.
As mentioned above, in one aspect, some embodiments may employ a computer system (such as the computer system 900) to perform methods in accordance with various embodiments of the invention. According to a set of embodiments, some or all of the procedures of such methods are performed by the computer system 900 in response to processor 910 executing one or more sequences of one or more instructions (which might be incorporated into the operating system 940 and/or other code, such as an application program 945) contained in the working memory 935. Such instructions may be read into the working memory 935 from another computer-readable medium, such as one or more of the non-transitory storage device(s) 925. Merely by way of example, execution of the sequences of instructions contained in the working memory 935 might cause the processor(s) 910 to perform one or more procedures of the methods described herein.
The terms “machine-readable medium,” “computer-readable storage medium” and “computer-readable medium,” as used herein, refer to any medium that participates in providing data that causes a machine to operate in a specific fashion. These mediums may be non-transitory. In an embodiment implemented using the computer system 900, various computer-readable media might be involved in providing instructions/code to processor(s) 910 for execution and/or might be used to store and/or carry such instructions/code. In many implementations, a computer-readable medium is a physical and/or tangible storage medium. Such a medium may take the form of non-volatile media or volatile media. Non-volatile media include, for example, optical and/or magnetic disks, such as the non-transitory storage device(s) 925. Volatile media include, without limitation, dynamic memory, such as the working memory 935.
Common forms of physical and/or tangible computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, any other physical medium with patterns of marks, a RAM, a PROM, EPROM, a FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read instructions and/or code.
Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to the processor(s) 910 for execution. Merely by way of example, the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer. A remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by the computer system 900.
The communications subsystem 930 (and/or components thereof) generally will receive signals, and the bus 905 then might carry the signals (and/or the data, instructions, etc. carried by the signals) to the working memory 935, from which the processor(s) 910 retrieves and executes the instructions. The instructions received by the working memory 935 may optionally be stored on a non-transitory storage device 925 either before or after execution by the processor(s) 910.
It should further be understood that the components of computer system 900 can be distributed across a network. For example, some processing may be performed in one location using a first processor while other processing may be performed by another processor remote from the first processor. Other components of computer system 900 may be similarly distributed. As such, computer system 900 may be interpreted as a distributed computing system that performs processing in multiple locations. In some instances, computer system 900 may be interpreted as a single computing device, such as a distinct laptop, desktop computer, or the like, depending on the context.
The methods, systems, and devices discussed above are examples. Various configurations may omit, substitute, or add various procedures or components as appropriate. For instance, in alternative configurations, the methods may be performed in an order different from that described, and/or various stages may be added, omitted, and/or combined. Also, features described with respect to certain configurations may be combined in various other configurations. Different aspects and elements of the configurations may be combined in a similar manner. Also, technology evolves and, thus, many of the elements are examples and do not limit the scope of the disclosure or claims.
Specific details are given in the description to provide a thorough understanding of example configurations (including implementations). However, configurations may be practiced without these specific details. For example, well-known circuits, processes, algorithms, structures, and techniques have been shown without unnecessary detail in order to avoid obscuring the configurations. This description provides example configurations only, and does not limit the scope, applicability, or configurations of the claims. Rather, the preceding description of the configurations will provide those skilled in the art with an enabling description for implementing described techniques. Various changes may be made in the function and arrangement of elements without departing from the spirit or scope of the disclosure.
Also, configurations may be described as a process which is depicted as a flow diagram or block diagram. Although each may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may have additional steps not included in the figure. Furthermore, examples of the methods may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware, or microcode, the program code or code segments to perform the necessary tasks may be stored in a non-transitory computer-readable medium such as a storage medium. Processors may perform the described tasks.
Having described several example configurations, various modifications, alternative constructions, and equivalents may be used without departing from the spirit of the disclosure. For example, the above elements may be components of a larger system, wherein other rules may take precedence over or otherwise modify the application of the invention. Also, a number of steps may be undertaken before, during, or after the above elements are considered.
The present application is a non-provisional of, and claims priority to, U.S. Provisional Patent Application No. 62/581,280, filed Nov. 3, 2017, entitled “SEMI-AUTONOMOUS TARGETING OF REMOTELY OPERATED WEAPONS.” The entire contents of provisional application No. 62/581,280 are incorporated herein by reference for all purposes.