This disclosure generally relates to shot peening, and specifically the detection of an offset distance and angle of the nozzle of a shot peening unit to a target.
Shot peening is a surface enhancement process that imparts a shallow layer of compressive residual stress into the surface of a metal component by impacting metallic, ceramic, or glass peening particles, otherwise referred to as shot or media, at a high velocity. The intensity of peening depends on the kinetic energy of the shot, which is a function of its mass and velocity. This velocity varies with various machine parameters, including the offset distance from the peening nozzle of the shot peening unit to the impingement target, that is, a target object or workpiece to which shot media is applied. Throughout a peening process, the nozzle of the shot peening unit may move across an impingement target, or an impingement target may move relative to the nozzle, such that the offset distance and angle of a specific area of the impingement target to which shot media is applied may vary relative to immediately adjacent areas as the surface geometry of the impingement target changes. Keeping this distance constant requires extensive control mechanisms, often utilizing robotic devices. Due to the complexity of these systems, the offset distance oftentimes can only be confirmed between cycles.
The peening coverage must also be maintained with great accuracy throughout a peening process in order to comply with industry standards governing peening coverage, such as SAE J2277. Designing a peening process to ensure this compliance is time-consuming and requires complex equipment and a skilled operator. Many processes are completed with irregularly shaped targets that possess holes, edges, and steep faces. These features further complicate the design of the peening process, as the offset distance and angle from the nozzle to a particular region on the impingement target may change as the nozzle moves over the impingement target, or as the impingement target moves relative to the nozzle. In peening operations, it is therefore beneficial to know the offset distance and angle from the nozzle of the shot peening unit to the target region or area of the impingement target to which shot media is directed.
Reference will now be made in detail to various embodiments, examples of which are illustrated in the accompanying drawings. While this disclosure includes certain examples, it will be understood that the disclosure is not intended to limit the claims to these examples. On the contrary, the disclosure is intended to cover alternatives, modifications, and equivalents, which may be included within the spirit and scope of the claims. Furthermore, in this detailed description, numerous specific details are set forth in order to provide a thorough understanding. However, one of ordinary skill in the art will appreciate that the subject matter of the present disclosure may be practiced without these specific details. In other instances, well-known methods, procedures, and components have not been described in detail so as not to unnecessarily obscure aspects of the present disclosure.
The instant disclosure includes a nozzle attachment that houses a proximity detection system for use in shot peening processes. This detection system may provide, to a user or controller, real-time distances (an “offset distance” or “distance”) and incident angle relationships (an “angle”) between the peening nozzle and a target surface. These data allow a peening process to be designed with more precision, ensuring proper and consistent peening coverage without requiring a peening operation to be shut down while external measurement equipment is used to validate the nozzle offset distance and angle from the impingement target.
Referring now to
Still in the example shown in
LIDAR sensors may be advantageous over other methods or instruments used to obtain similar measurements. For example, optical sensors, such as cameras, must be focused or adjusted not only for a target area but also for a target range of distance or depth in order to achieve accurate measurements. Optical sensors and cameras may also become obscured by dust particles or other obstructions, which may reduce the sensor's accuracy or prevent the optical sensor from functioning entirely. LIDAR, however, emits pulses that may not require focusing for depth and may pass through dust particles or other obstructions, and may thus be advantageous in shot peening applications where dust and other debris may be present.
The sensor 103 communicates with a controller 106 via a digital signal processor 104 which, in this instance, utilizes Ethernet communication, USB communication, a digital signal processing program, a PCB control system, or any other suitable communication protocol to establish a connection between the sensor 103 and the controller 106. The controller 106 communicates with a display 105 to display one or more data points generated by the sensor 103.
The sensor 103, attached to the nozzle attachment 101, defines a detection zone 206, as shown in
The array of points 209 may be selected automatically by the sensor 103, automatically by the controller 106, and/or by a user utilizing the display 105 to be a grid of points spaced in a planar direction (having a vertical and horizontal spacing) that is substantially normal to the direction of the pulsed laser light emitted by the sensor 103. In an example, the array of points 209 may be selected automatically by the sensor 103, automatically by the controller 106, and/or by a user utilizing the display 105 to be an 8×8 grid of points, or array of points 209 representing the inspection zone 208. In an example, the array of points 209 may be centered about the detection zone 206 of the sensor 103, which may be centered about the peening zone 207.
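The grid selection described above can be sketched as follows. This is a minimal illustration only: the zone dimensions and spacing are assumed values for the example, not values taken from the disclosure; only the 8×8 grid size is stated above.

```python
# Sketch: build an 8x8 grid of sample points centered on an inspection zone,
# spaced in a plane substantially normal to the emitted laser direction.
# The 70 mm zone size below is an illustrative assumption.

def make_point_grid(center_x, center_y, width, height, rows=8, cols=8):
    """Return (x, y) sample points covering the inspection zone,
    listed row by row from the lower-left corner."""
    dx = width / (cols - 1)   # horizontal spacing between points
    dy = height / (rows - 1)  # vertical spacing between points
    x0 = center_x - width / 2
    y0 = center_y - height / 2
    return [(x0 + c * dx, y0 + r * dy) for r in range(rows) for c in range(cols)]

grid = make_point_grid(0.0, 0.0, 70.0, 70.0)  # 8x8 = 64 points
```

Because the grid is generated about a supplied center, the same routine serves whether the array is centered on the detection zone 206 or on the peening zone 207.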
As shown in
In one example, the controller 106 selects two points C and F substantially in the center of the inspection zone 208, as shown in
In one example, to determine if the two points lie on a detected edge, the controller 106 determines the distance between the sensor 103 and a first point, and then determines the distance between the sensor 103 and a second point. The controller 106 calculates the change in distance between the first and second points, or the delta, and compares that value against a threshold value to determine if an edge is present. The system 100 (and, particularly, the controller 106) employs an algorithm that identifies edges based on a comparison of the distances from the sensor 103 to adjacent points within the inspection zone 208. In response to the difference in distances being greater than a threshold value, the system 100 determines that an edge is present between the two adjacent points.
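The edge test described above reduces to a comparison of two distance readings against a threshold. A minimal sketch follows; the 5 mm threshold and the sample readings are illustrative assumptions, not values from the disclosure:

```python
# Sketch of the edge test: compare the sensor-to-point distances for two
# adjacent points; a change (delta) larger than a threshold implies an edge.
# The default threshold below is an illustrative assumption.

def edge_present(dist_first, dist_second, threshold=5.0):
    """Return True if the change in measured distance between two
    adjacent points exceeds the edge-detection threshold."""
    delta = abs(dist_second - dist_first)
    return delta > threshold

# Adjacent points measured at 100 mm and 120 mm from the sensor:
print(edge_present(100.0, 120.0))  # a 20 mm jump exceeds the 5 mm threshold
```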
In an example, if the system 100 determines, by an algorithm, that an edge is present between the two adjacent points, the system 100, and in particular the sensor 103, may be refocused to focus the detection zone 206, and thus the inspection zone 208, on a different area of the peening zone 207, such that the edge is not present in, or is on the outermost part of, the detection zone 206 and/or the inspection zone 208. In an example, if the system 100 determines, by an algorithm, that an edge is present between the two adjacent points, the system 100, and particularly the controller 106, directs the sensor 103 to refocus the detection zone 206, and thus the inspection zone 208, on a different area of the peening zone 207. In an example, if the system 100 determines, by an algorithm, that an edge is present between the two adjacent points, the system 100, and particularly the controller 106, directs the algorithm to move the inspection zone 208 within the detection zone 206, but still within an area of the peening zone 207.
In an example, if the system 100 determines, by an algorithm, that an edge is present between the two adjacent points, the system 100, and in particular the shot peening nozzle 200, may be repositioned, or the controller 106 may direct the nozzle 200 to be repositioned, such that the sensor 103, the detection zone 206, the inspection zone 208, and the peening zone 207 are all repositioned to a different area on the target 309. In an example, if the system 100 determines, by an algorithm, that an edge is present between the two adjacent points, the target 309 may be repositioned within the peening zone 207.
In some examples, similar edge detection concepts are utilized in order to detect the presence of an object, or impingement target 309, in the inspection zone 208. In one example, rather than reporting a scalar value for the distance between the sensor 103 and the point of detection, the controller 106 compares the distance to a threshold value and translates the distance into a binary variable (e.g., within the threshold or beyond the threshold). By translating each distance determination into a binary variable and then plotting the variables across the inspection zone 208, the controller 106 maps a relative shape of the object.
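The binary mapping described above can be sketched as follows. The grid of readings and the 200 mm threshold are illustrative assumptions; the technique itself (thresholding each reading to a 1/0 occupancy value) is as described in the preceding paragraph:

```python
# Sketch of the presence-detection scheme: each measured distance is
# translated into a binary in/out value, and the resulting map outlines
# the relative shape of the object. All numeric values are illustrative.

def occupancy_map(distances, threshold):
    """Map each distance reading to 1 (within threshold: object present)
    or 0 (beyond threshold: background)."""
    return [[1 if d <= threshold else 0 for d in row] for row in distances]

# Two rows of readings: the nearer (~150 mm) readings outline the object.
readings = [
    [210.0, 150.0, 148.0, 205.0],
    [208.0, 149.0, 151.0, 207.0],
]
print(occupancy_map(readings, threshold=200.0))  # [[0, 1, 1, 0], [0, 1, 1, 0]]
```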
As shown in
Referring now to
The controller 106 determines length HC according to:
Finally, the controller 106 determines angle δ, which defines an angle from C to F relative to the X-axis, according to:
Referring now to
The controller 106 determines length HF according to:
Finally, the controller 106 determines angle δ, which defines an angle from C to F relative to the X-axis, according to:
Method 3 utilizes the same initial steps as Method 2, but utilizes different formulas for determining the lengths CH and HF. In particular, the controller 106 determines length CH according to:
The controller 106 determines length HF according to:
The controller 106 then utilizes the same formula as Method 2 for angle δ.
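The specific formulas of Methods 1 through 3 are given above in the disclosure. As a rough illustration only, under the simplifying assumption that points C and F are measured at a known lateral spacing along the X-axis, an angle of the line from C to F relative to the X-axis could be recovered from the two distance readings; the function below is such a sketch, not a reproduction of the disclosed formulas:

```python
import math

# Illustrative sketch only: assumes points C and F lie a known lateral
# spacing apart along the X-axis, so the inclination of the line CF
# follows from the difference of the two measured offset distances.

def surface_angle_deg(dist_c, dist_f, lateral_spacing):
    """Angle (degrees) of the line from C to F relative to the X-axis,
    under the stated simplifying geometry."""
    return math.degrees(math.atan2(dist_f - dist_c, lateral_spacing))

# A 10 mm rise over a 10 mm lateral spacing gives a 45 degree incline:
print(round(surface_angle_deg(100.0, 110.0, 10.0), 1))  # 45.0
```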
In one example, the controller 106 utilizes Method 1 in response to determining that points C and F are substantially in a relative center 210 of the inspection zone 208, or if points C and F are on the same relative side of a central axis 212 (shown as line ABE in
Shown in
If the resultant distance does exceed a threshold value, the controller goes to step 810B, in which the controller selects an additional data point at a predetermined additional distance away from the central axis. The controller repeats steps 806-810B until the resultant distance does not exceed the threshold value. When the resultant distance does not exceed the threshold, the controller at step 810A measures the distance from the last data point taken to the central axis. At step 812A, if the distance is below a first threshold value, the controller goes to step 814A and calculates the target distance and angle according to a first stored algorithm. If at step 812A the distance is above the first threshold value, the controller goes to step 812B to determine whether the distance is below or above a second threshold value. If the distance is below the second threshold value, the controller goes to step 814B and calculates the target distance and angle according to a second stored algorithm, but if the distance is above the second threshold value, the controller goes to step 814C and calculates the target distance and angle according to a third stored algorithm.
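The branching at steps 812A-814C can be sketched as a simple selection routine. The threshold values below are illustrative placeholders, and the returned integers stand in for the three stored algorithms, which are not reproduced here:

```python
# Sketch of the method-selection branch (steps 812A-814C): choose one of
# three stored algorithms based on the distance from the last data point
# to the central axis. Thresholds and return values are placeholders.

def select_algorithm(distance_to_axis, first_threshold, second_threshold):
    """Return which stored algorithm (1, 2, or 3) applies for a given
    distance from the last data point to the central axis."""
    if distance_to_axis < first_threshold:
        return 1  # step 814A: first stored algorithm
    if distance_to_axis < second_threshold:
        return 2  # step 814B: second stored algorithm
    return 3      # step 814C: third stored algorithm

print(select_algorithm(2.0, 5.0, 15.0))   # 1
print(select_algorithm(8.0, 5.0, 15.0))   # 2
print(select_algorithm(20.0, 5.0, 15.0))  # 3
```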
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
This application is a non-provisional conversion of U.S. Provisional Patent Application No. 63/509,796, entitled “DISTANCE AND ANGLE DETECTION FOR SHOT PEENING NOZZLES,” filed Jun. 23, 2023, the contents of which are incorporated herein by reference in their entirety and for all purposes.
Number | Date | Country
---|---|---
63509796 | Jun 2023 | US