The improvements generally relate to laser processing systems and more particularly to laser processing systems which involve imaging.
Conventional techniques for laser-processing a surface exist. In one conventional technique, spatial coordinates of the surface to be laser-processed are first determined using an optical 3D imaging system. The optical 3D imaging system has a laser line projector and a camera which are spaced apart from one another, have different viewpoints, and are referenced to one another. In a second, subsequent step, the so-determined spatial coordinates of the surface to be laser-processed can be communicated to a laser processing system, which can be operated to laser-process the surface based on these spatial coordinates. In practice, the optical 3D imaging system and the laser processing system have respective light beams and respective reference systems for spatial coordinates, which are made to correspond to one another based on calibration.
Although conventional techniques for laser-processing a surface have been satisfactory to a certain degree, there remains room for improvement.
It was found that the spatial coordinates of the surface to be processed could be obtained by imaging a spot formed on the surface by a laser processing beam which is displaced with respect to the surface.
More specifically, by knowing the relative position and orientation of the laser processing system and of the camera, the spatial coordinates of the surface can be determined, e.g., by triangulation, based on the spot formed on the surface by the laser processing beam as imaged by the camera. As such, features of the imaged spot, at any point in time, can vary based on the position, orientation and/or shape of the surface to be processed. For instance, in embodiments where the laser processing beam is converging, the imaged spot may not necessarily correspond to a focal point of the laser processing beam. Accordingly, in such embodiments, the imaged spot may have a dimension different from that of the focal point, and this dimension, once imaged, can help in determining the spatial coordinates of the surface so illuminated by the laser processing beam. Examples of such features include a center position, a specific shape, a dimension (e.g., a diameter), an orientation, and the like.
Accordingly, the spatial coordinates of the surface can be determined, within a given tolerance, by imaging a first pass of the spot formed on the surface by the laser processing beam. The tolerance may be affected by the features of the imaged spot. For instance, the tolerance can be limited if the center position of the spot can be determined. Alternatively, since the dimension of the spot is indicative of the distance between the focal point and the surface, the absolute value of that distance can be measured based on the dimension of the spot, and the measured distance can be used to move the focal point of the laser processing beam onto the surface, or to within a certain limited distance therefrom, once the direction of the displacement to be applied to the focal point has been determined.
As can be understood, the laser processing beam can process the surface only when the focal point is within a predetermined distance from the surface and a sufficient intensity is reached. Using a laser processing beam having a moveable focal point allows the focal point to be moved onto the surface, based on the previously determined spatial coordinates of the surface, so as to laser-process the surface.
In accordance with one aspect, there is provided a method for laser-processing a surface, the method comprising: directing, from a first viewpoint, a laser processing beam towards said surface including providing a focal point of said laser processing beam at a focal point position, resulting in illuminating said surface with a spot, while imaging said spot on said surface from a second viewpoint different from the first viewpoint; determining spatial coordinates of said surface based on calibration data and a feature of said imaged spot; and laser-processing said surface based on said previously determined spatial coordinates of said surface.
In accordance with another aspect, there is provided a laser processing system comprising: a frame; a laser processing subsystem mounted to said frame and having a first viewpoint relative to a surface, the laser processing subsystem being adapted to direct a laser processing beam towards said surface and to provide a focal point of said laser processing beam at a focal point position, resulting in illuminating said surface with a spot; a camera mounted to said frame and having a second viewpoint different from said first viewpoint, the camera being adapted to, simultaneously with said illuminating, image said spot on said surface and to generate an image of said spot; a computer communicatively coupled to said laser processing subsystem and to said camera, said computer having a memory system having stored thereon instructions executable by a processor to: determine spatial coordinates of said surface based on calibration data and a feature of said imaged spot in said image; and instruct the laser processing subsystem to laser-process said surface based on said previously determined spatial coordinates of said surface.
In accordance with another aspect, there is provided a method for determining spatial coordinates of a surface, the method comprising: directing, from a first viewpoint, a laser processing beam towards said surface including providing a focal point of said laser processing beam at a focal point position, resulting in illuminating said surface with a spot, while imaging said spot on said surface from a second viewpoint different from the first viewpoint; and determining spatial coordinates of said surface based on calibration data and a feature of said imaged spot.
It will be understood that the expression “computer” as used herein is not to be interpreted in a limiting manner. It is rather used in a broad sense to generally refer to the combination of some form of one or more processing units and some form of memory system accessible by the processing unit(s). Similarly, the expression “controller” as used herein is not to be interpreted in a limiting manner but rather in a general sense of a device, or of a system having more than one device, performing the function(s) of controlling one or more devices, such as an electronic device, for instance.
It will be understood that the various functions of a computer or of a controller can be performed by hardware or by a combination of both hardware and software. For example, hardware can include logic gates included as part of a silicon chip of the processor. Software can be in the form of data such as computer-readable instructions stored in the memory system. With respect to a computer, a controller, a processing unit, or a processor chip, the expression “configured to” relates to the presence of hardware or a combination of hardware and software which is operable to perform the associated functions.
Many further features and combinations thereof concerning the present improvements will appear to those skilled in the art following a reading of the instant disclosure.
In the figures,
As depicted, the laser processing system 10 has a frame 12, as well as a laser processing subsystem 14 and a camera 16, both mounted to the frame 12.
The laser processing subsystem 14 and the camera 16 both have their own, respective, and different viewpoints relative to the surface S. Accordingly, the laser processing subsystem 14 has a first viewpoint, i.e., a known position and orientation in the X, Y, Z coordinate system, and the camera 16 has a different, second viewpoint, i.e., a known position and orientation in the X, Y, Z coordinate system.
As depicted, the laser processing subsystem 14 is adapted to direct a laser processing beam 18 towards the surface S, and to provide a focal point 20 of the laser processing beam 18 at a focal point position (Xfp, Yfp, Zfp) in the X, Y, Z coordinate system, which understandably results in the illumination of the surface S with a spot.
While the surface S is illuminated with the spot, the camera 16 is adapted to image the spot on the surface S and to generate an image of the spot, which can be referred to as “the imaged spot.” The camera can produce images of the surface S to be processed. In some embodiments, these images can have their own coordinate system X′ and Y′, and can be registered in the X, Y, Z coordinate system after their acquisition.
As shown, a computer 22 is communicatively coupled to the laser processing subsystem 14 and to the camera 16. In this example, the computer 22 is mounted to the frame 12 and is wiredly coupled to the laser processing subsystem 14 and to the camera 16. However, in some other embodiments, the computer 22 can be remote from the laser processing subsystem 14, and be wirelessly coupled thereto via wireless communication links such as Wi-Fi, Bluetooth, cellular data link and the like.
As can be understood, the computer 22 has a memory system 24 on which are stored instructions executable by processor(s) 26 to determine spatial coordinates of the surface S based on calibration data and on a feature (i.e., one or more features) of the imaged spot, and to instruct the laser processing subsystem 14 to laser process the surface S on the basis of the previously determined spatial coordinates of the surface S.
The calibration data allows the spatial coordinates of the surface S to be determined, based on the first viewpoint of the laser processing subsystem 14 and on the second viewpoint of the camera 16, as a function of the feature(s) of the imaged spot.
Non-limiting examples of such calibration data are described in the following paragraphs for explanatory purposes.
Referring to
As can be seen, in one specific embodiment, the spatial coordinates of the surface SA, SB or SC can be determined based on a feature provided in the form of a center position of the imaged spot in the X, Y, Z coordinate system. Indeed, in this example, for a given orientation α of the laser processing beam 18, the calibration data can be indicative of the spatial coordinates X and Z as a function of the coordinates X′ and Y′ (in pixels) of the center of the spot in the image generated by the camera 16 for an angle of incidence β. For instance, Table 1 shows an example of calibration data, provided in the form of a lookup table.
In the case of the surface SA, the imaged spot can be determined to be incident on the camera 16 at an angle βA and its center is localized at position X′A and Y′A in the image obtained by the camera 16. Accordingly, it can be determined that the spatial coordinates of the surface SA are XA, Y, ZA based on the above calibration data. Similarly, in the case of the surface SB, the imaged spot can be determined to be incident at an angle βB on the camera 16 and its center is positioned at X′B and Y′B in the image. Accordingly, it can be determined that the spatial coordinates of the surface SB are XB, Y, ZB.
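For illustration only, the following minimal sketch shows how calibration data of the kind shown in Table 1 could be queried in software; the sample values, the names, and the choice of linear interpolation between tabulated entries are assumptions made for this sketch, not content of Table 1.

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator

# Hypothetical calibration samples for one beam orientation: each row of
# calib_px is a spot center (X', Y') in image pixels, and the matching row
# of calib_xz is the surface position (X, Z) in millimetres.
calib_px = np.array([[100.0, 50.0], [100.0, 150.0], [300.0, 50.0], [300.0, 150.0]])
calib_xz = np.array([[10.0, 200.0], [10.0, 210.0], [30.0, 200.0], [30.0, 210.0]])

# Linear interpolation between the tabulated entries of the lookup table.
lookup = LinearNDInterpolator(calib_px, calib_xz)

def surface_coordinates(x_prime, y_prime):
    """Return the surface coordinates (X, Z) for a spot whose center is
    localized at (x_prime, y_prime) in the image."""
    x, z = lookup(x_prime, y_prime)
    return float(x), float(z)

# A spot centered at (180, 90) px maps to an interpolated (X, Z) position.
print(surface_coordinates(180.0, 90.0))
```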
Calibration data similar to those presented in Table 1 can be provided for other combinations of viewpoints of the laser processing subsystem 14 and of the camera 16. Accordingly, the right calibration data can be selected based on the first viewpoint of the laser processing subsystem 14 and on the second viewpoint of the camera 16 prior to actually determining the spatial coordinates of the surface SA, SB or SC. As can be understood, the viewpoint of the laser processing subsystem 14 corresponds to a laser emission angle of the laser processing subsystem 14.
As can be seen, in another specific embodiment, the spatial coordinates of the surface SA, SB or SC can be determined based on a feature provided in the form of a dimension D of the imaged spot in the X, Y, Z coordinate system. More specifically, the dimension D of the imaged spot corresponds to a diameter of the imaged spot in this example.
Indeed, in this specific example, for a given orientation α of the laser processing beam 18 and for a given convergence/divergence D(r) of the laser processing beam 18, where r is an axial position along an optical axis 28 of the laser processing beam 18, the calibration data can be indicative of the spatial coordinates X and Z as a function of the dimension of the spot in the image generated by the camera 16.
As can be understood, due to the nature of converging beams, the laser processing beam 18 converges towards the focal point 20, but after the focal point 20 is reached, the laser processing beam 18 diverges. Accordingly, the spatial coordinates of the surface SA, SB or SC can be determined on the basis of the dimension of the imaged spot. More specifically, as shown in
In some embodiments, the angle β of the spot in the image generated by the camera 16 can be used to determine on which side of the focal point 20 the surface SA, SB or SC actually lies.
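As a rough illustration of this determination, the sketch below assumes a simple linear beam-cone model and an assumed reference angle for the focal point; the model D(r) = D0 + 2·tan(θ)·|r|, the parameter values, and the comparison against beta_focus are all assumptions made for illustration, not measured properties of the laser processing beam 18.

```python
import math

def axial_offset_mm(d_spot_mm, d_focus_mm=0.1, half_angle_rad=0.05):
    """Absolute distance |r| between the surface and the focal point,
    recovered from the measured spot diameter under the assumed linear
    cone model D(r) = d_focus + 2 * tan(half_angle) * |r|."""
    return max(d_spot_mm - d_focus_mm, 0.0) / (2.0 * math.tan(half_angle_rad))

def signed_offset_mm(d_spot_mm, beta_rad, beta_focus_rad):
    """Signed offset of the surface along the optical axis: the angle beta
    at which the spot reaches the camera is compared with an (assumed,
    calibrated) angle beta_focus of the focal point itself to decide on
    which side of the focal point the surface lies."""
    r = axial_offset_mm(d_spot_mm)
    return r if beta_rad > beta_focus_rad else -r

# A 0.6 mm spot seen at a larger angle than the focal point: the surface
# is judged to lie about 5 mm beyond the focal point.
print(signed_offset_mm(0.6, beta_rad=0.35, beta_focus_rad=0.30))
```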
As can be understood from the example above, the spatial coordinates of the surface SA, SB or SC can be determined based on a feature of the imaged spot. Examples of such features include a center position, a specific shape, a dimension (e.g., diameter), an orientation, and/or any suitable combination thereof.
In another embodiment, the images produced by the camera 16 can be pre-processed to correct the distortions induced by the viewpoint of the camera 16 upon the surface S. In such an embodiment, the coordinate system X′ and Y′ of the image can be corrected to correspond to the coordinate system (X, Z) of the laser through a relationship such as: one pixel of the camera 16 corresponding to one millimeter of the laser spot at the surface S in the X direction. In another specific embodiment, a mathematical relationship can be established to link the direction Y′ of the image to the Z direction of the laser processing subsystem 14, which also corresponds to the position Z of the surface S, in place of a calibration data set such as those described with reference to Table 1 and Table 2.
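A minimal sketch of such relationships is given below; the scale factors and intercept are invented placeholders for illustration and would in practice come from calibration.

```python
# Assumed, illustrative calibration constants (not actual values):
MM_PER_PX_X = 1.0   # one pixel along X' corresponds to one millimetre in X
Z0_MM = 200.0       # depth Z of the surface when the spot sits at Y' = 0
MM_PER_PX_Z = 0.25  # slope of the assumed linear relation linking Y' to Z

def image_to_laser(x_prime_px, y_prime_px):
    """Map a distortion-corrected image position (X', Y') to the laser
    coordinate system (X, Z) through the assumed linear relationships."""
    x_mm = x_prime_px * MM_PER_PX_X
    z_mm = Z0_MM + y_prime_px * MM_PER_PX_Z
    return x_mm, z_mm

# A spot imaged at (120, 40) px maps to X = 120 mm, Z = 210 mm here.
print(image_to_laser(120.0, 40.0))
```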
Moreover, in this example, the frame 12 has one or more handles 30 protruding from, or recessed into, the frame 12 for handling purposes. More specifically, the laser processing system 10 is portable and can be handled by a user in a manner which allows the user to manually position the laser processing system 10 proximate to a surface S in order to laser-process it as desired.
In some embodiments, a display can be mounted to the frame 12, facing the user, for displaying user instructions. An example of such user instructions includes instructing the user to move the laser processing system 10 closer to or farther from the surface S when it is determined that the laser processing system 10 is too far from or too close to the surface S.
In this example, the laser processing beam 18 has a wavelength in an infrared region of the electromagnetic spectrum, e.g., 1064 nm, and so the camera 16 is configured to image illumination in that infrared region of the electromagnetic spectrum. Accordingly, the camera 16 used in this example is sensitive to, but not necessarily limited to, the infrared region of the electromagnetic spectrum.
As can be understood, the camera 16 can be a 2D camera in some embodiments, whereas the camera 16 can be a 3D camera in some other embodiments. Characteristics of the camera 16 such as sensor size, quantum efficiency, number of frames per second, aperture size, focal length of the lens of the camera and any other characteristics may have an impact on the calibration data and on the system performance, and can be tailored to suit specific industry needs. It is intended that, although the illustrated embodiment has only one camera 16, other embodiments of the laser processing system 10 can have a plurality of cameras each having different, respective viewpoints with respect to the surface S to image. In these embodiments, an image of the spot moving on the surface S may stem from images acquired by one or more of the cameras.
In one specific embodiment, the laser processing subsystem 14 comprises a fiber-based laser source. In some other embodiments, the laser processing subsystem 14 can include a solid-state laser source or any other type of laser source.
Moreover, the laser processing subsystem 14 can include a 3-axis scanner, composed of two rotating mirrors and a moving lens, to direct the light toward the surface to be laser-processed. In another embodiment, the laser processing subsystem 14 can include a 2-axis scanner, composed of one rotating mirror and a moving lens, to direct the light toward the surface to be laser-processed. In still another embodiment, the scanner could be based on reflective optical parts, such as flat mirrors, converging mirrors and diverging mirrors, in any combination.
In still further embodiments, the laser processing subsystem 14 can have a scanning head with a fixed focal length. In these embodiments, the scanning head can be moved closer to or farther from the surface S so that the focal point 20 of the laser processing beam 18 moves accordingly. In alternate embodiments, the surface S can be moved relative to the laser processing subsystem 14 in order to move the focal point 20 of the laser processing beam 18 relative to the surface S.
Depending on the embodiment, parameters of the laser processing beam 18 can be modified over time. For instance, in some embodiments, the laser processing subsystem 14 can be configured to modify a width, an optical power, a repetition frequency, a scanning speed, and any other suitable parameter, during a single pass of the laser processing beam 18 on the surface S to laser-process.
In alternate embodiments, the imaged moving spot 32′ shown in
In this specific example, the image 36 is in greyscale. Pixels of the image 36 are considered to be part of the imaged moving spot 32′ when they have an intensity greater than a predetermined threshold (e.g., an intensity greater than 50, when the greyscale spans from 0 to 255).
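A minimal sketch of this segmentation, assuming an 8-bit greyscale image held in a NumPy array, could read as follows; the synthetic image and the threshold value of 50 simply mirror the example above.

```python
import numpy as np

SPOT_THRESHOLD = 50  # greyscale threshold from the example (0..255 range)

def spot_mask(image_grey):
    """Binary mask of the imaged moving spot: True where the pixel
    intensity exceeds the predetermined threshold."""
    return image_grey > SPOT_THRESHOLD

# Synthetic 8-bit image with one bright streak standing in for the spot.
image = np.zeros((120, 160), dtype=np.uint8)
image[55:61, 20:140] = 200

mask = spot_mask(image)
print(int(mask.sum()), "pixels belong to the imaged moving spot")
```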
Due to the known position and orientation of the laser processing subsystem 14 and of the camera 16, the spatial coordinates of the surface S can be determined based on a center path 38 of the imaged moving spot 32′ and/or on a thickness t of the imaged moving spot 32′, the thickness t being measured perpendicularly to the center path 38. For instance, in this embodiment, it can be determined that the thickness t of the imaged moving spot 32′, throughout its length, exceeds a thickness threshold tthres, i.e., t>tthres. For instance, a properly focused beam could have a thickness of 3 pixels in the image, measured at the half-maximum of the greyscale intensity, such that tthres can be set to 4 pixels. Accordingly, the first focal point path P1 can be determined to be too far from the surface S, and can be brought closer in a subsequent, second pass of the focal point 20.
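For illustration, the sketch below derives a center path and a per-column thickness from such a binary mask and applies the 4-pixel threshold of the example; counting lit pixels per column is an assumed simplification of the half-maximum measurement described above.

```python
import numpy as np

T_THRES_PX = 4  # thickness threshold from the example above

# Synthetic mask of an imaged moving spot (in practice, the output of the
# thresholding step described previously): a horizontal streak 6 px thick.
mask = np.zeros((120, 160), dtype=bool)
mask[55:61, 20:140] = True

def center_path_and_thickness(mask):
    """For each image column crossed by the spot, return the column index,
    the centroid row (the center path 38) and the count of lit pixels
    (a simplified stand-in for the thickness t across the path)."""
    cols = np.flatnonzero(mask.any(axis=0))
    rows_per_col = [np.flatnonzero(mask[:, c]) for c in cols]
    center = np.array([r.mean() for r in rows_per_col])
    thickness = np.array([r.size for r in rows_per_col])
    return cols, center, thickness

cols, center, t = center_path_and_thickness(mask)
if np.all(t > T_THRES_PX):
    print("spot too thick along its whole length: focal point path too far")
```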
As can be understood, in the example described with reference to
It is noted that, in some embodiments, initial or current spatial coordinates of the surface S can be updated after, and even during, each pass of the laser processing beam 18.
For instance, with reference to the embodiments shown in
In this embodiment, the step of determining the current spatial coordinates of the surface S can be independent of, and simultaneous with, the step of laser-processing the surface S. For instance, all the while the surface S is being illuminated by the laser processing beam 18, and imaged by the camera 16 so as to then determine the current spatial coordinates of the surface S, the laser processing subsystem 14 can be in the process of laser-processing the surface S based on previous, independent spatial coordinates of the surface.
In embodiments where the camera 16 acquires multiple images of the moving spot during a single pass of the laser processing beam 18 on the surface S, the laser processing system 10 can be configured to determine partial spatial coordinates of portions of the surface S as the laser processing beam 18 is gradually passed over the surface S. In these embodiments, the current spatial coordinates of portions of the surface S can be updated in real time so that they reflect the partial spatial coordinates of the portions of the surface S where the laser processing beam has just passed. As such, only a portion of the current spatial coordinates may be updated.
It is envisaged that newly updated spatial coordinates of the surface S can be stored on a memory system which the laser processing subsystem 14 may directly or indirectly access in order to retrieve the latest spatial coordinates of the surface S. It is noted that a first computation step, in which the computer 22 updates the spatial coordinates of the surface S, can be independent from a second computation step, in which the computer 22 determines the spatial coordinates of the subsequent pass of the laser processing beam 18. In such embodiments, both the first and second computation steps can have access to the current spatial coordinates stored on the memory system as desired. The two computation steps can be performed at different frequencies.
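A minimal threading sketch of this two-rate arrangement is shown below; the coordinate store, the loop rates, and the stand-in measurement and planning functions are all assumptions made for illustration, not the actual architecture of the computer 22.

```python
import threading
import time

class CoordinateStore:
    """Thread-safe holder for the current spatial coordinates of the
    surface, shared by both computation steps."""
    def __init__(self):
        self._lock = threading.Lock()
        self._coords = {}

    def update(self, partial):
        # The first computation step writes (possibly partial) updates.
        with self._lock:
            self._coords.update(partial)

    def snapshot(self):
        # The second computation step reads a consistent copy.
        with self._lock:
            return dict(self._coords)

def measure_surface():
    """Stand-in for deriving (partial) coordinates from the latest image."""
    return {(0, 0): 200.0}

def plan_next_pass(coords):
    """Stand-in for computing the focal point path of the next pass."""
    return list(coords.items())

store = CoordinateStore()
stop = threading.Event()

def update_loop(hz=30.0):   # first computation step, e.g. 30 Hz
    while not stop.is_set():
        store.update(measure_surface())
        time.sleep(1.0 / hz)

def planning_loop(hz=5.0):  # second computation step, e.g. 5 Hz
    while not stop.is_set():
        plan_next_pass(store.snapshot())
        time.sleep(1.0 / hz)

threads = [threading.Thread(target=update_loop), threading.Thread(target=planning_loop)]
for th in threads:
    th.start()
time.sleep(0.5)  # let both loops run briefly
stop.set()
for th in threads:
    th.join()
```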
As can be understood, during said imaging, the focal point 20 can have an intensity I exceeding a laser processing threshold Ithres. Depending on the embodiment, the laser processing threshold Ithres can be a laser-cleaning threshold, a laser-marking threshold or a laser-cutting threshold. Accordingly, during said imaging, the laser processing beam 18 can be used to laser-process, e.g., laser clean, laser mark or laser cut, the surface S during the successive passes of the focal point 20. Of course, the wavelength of the laser processing beam 18 can be chosen based on the desired type of laser-processing and on the material to be laser-processed. For instance, it is known that laser cleaning of stainless steel or aluminum can be done with a laser beam having a center wavelength of 1064 nm, whereas plastics, composites and organic materials are more easily processed with a laser beam having a wavelength around 10.64 microns.
In one mode of operation, the focal point 20 of the laser processing beam 18 has an intensity I exceeding the laser processing threshold Ithres during said imaging, so that it can be determined that the surface S is satisfactorily laser-processed based on the features of the imaged spot. For instance, from image 36 of
In alternate embodiments, the third focal point path P3 could have been determined as consisting only of the middle portion 48, as the second pass of the focal point 20 along the spaced-apart portions 42 and 44 of the second focal point path P2 would already have yielded satisfactory laser-processing.
In one other mode of operation, the focal point 20 of the laser processing beam 18 has an intensity I below the laser processing threshold Ithres during said imaging, so that the spatial coordinates of the surface S can be determined without necessarily laser-processing the surface S during said imaging. Then, the intensity I of the focal point 20 of the laser processing beam 18 can be increased above the laser processing threshold Ithres to actually laser-process the surface S based on the previously determined spatial coordinates of the surface S, for instance in only one pass of the focal point 20.
In some embodiments, the laser processing system 10 can be configured to control the type of laser-processing that is performed by the laser processing beam 18 depending on previously determined spatial coordinates of the surface S. More specifically, the intensity of the focal point 20 can be increased above the laser processing threshold Ithres only when the focal point 20 is directed at predetermined spatial regions of the surface S. For instance, the intensity of the focal point 20 can be increased above the laser processing threshold Ithres when it is determined that the surface S is at a given depth, or within a predetermined depth range. In some alternate embodiments, the intensity of the focal point 20 can be decreased below the laser processing threshold Ithres upon determining that the surface S lies within a given non-laser-processing zone.
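By way of illustration only, such gating could be expressed as below; the depth range and power levels are invented placeholders, and the mapping from commanded power to the intensity at the focal point is assumed.

```python
def commanded_power_w(z_surface_mm,
                      z_min_mm=190.0, z_max_mm=210.0,
                      p_process_w=100.0, p_idle_w=1.0):
    """Return the laser power to command: processing power (taking the
    focal point intensity above Ithres) only when the surface depth lies
    within the assumed processing range, idle power otherwise."""
    if z_min_mm <= z_surface_mm <= z_max_mm:
        return p_process_w  # intensity above the laser processing threshold
    return p_idle_w         # below the threshold: imaging without processing

print(commanded_power_w(205.0))  # inside the depth range: 100.0 W
print(commanded_power_w(240.0))  # outside the depth range: 1.0 W
```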
As can be understood, the examples described above and illustrated are intended to be exemplary only. For instance, the frame can be fixed relative to the ground in some other embodiments. Alternately, the frame can include a first frame to which is mounted the laser processing subsystem and a second frame to which is mounted the camera, where the first frame and the second frame are made integral to one another. In another example, the camera and the laser processing subsystem could be mounted on independent frames that are mechanically referenced to one another by means of position sensors, referenced actuators or any other means. As can be understood, the expression “calibration data” is meant to be construed broadly so as to encompass data stored in the form of a table, an array, or even in the form of mathematical relations.

The laser processing system can have any type of suitable configuration allowing triangulation of the surface. For instance, the laser processing system can have a standard configuration, in which the laser processing beam is perpendicular to the surface and the camera images the surface from an oblique perspective; a reverse configuration, in which the laser processing beam is oblique relative to the surface and the camera images the surface from a perpendicular perspective; a specular configuration, in which the laser processing beam and the field of view of the camera are oblique relative to the surface and point in a same direction; and a look-away configuration, in which the laser processing beam and the field of view of the camera are oblique relative to the surface and point in opposite directions. However, any other suitable configuration can be used.

In some embodiments, the first pass of the laser processing beam can be performed based on initial coordinates of the surface to be laser-processed. However, in some embodiments, particularly in embodiments where initial coordinates of the surface are unknown, the first pass of the laser processing beam on the surface S can be performed based on default spatial coordinates, which may correspond to a default focal point path of the focal point of the laser processing beam. For instance, in some embodiments, the default focal point path can be set to extend in a plane spaced apart by a predetermined spacing (e.g., 30 cm) from the frame of the laser processing system. The scope is indicated by the appended claims.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/CA2019/050169 | 2/8/2019 | WO | 00
Number | Date | Country
---|---|---
62628389 | Feb 2018 | US