The present invention relates to a press brake, an image output device, and an image output method.
Patent Literature 1 discloses that images of a work region of a press brake are captured by a plurality of image capturing devices, and the captured images captured by the plurality of image capturing devices are displayed on display means.
Patent Literature 1: U.S. Pat. Application Publication No. 2019/0176201
However, since the structure of the press brake is complicated, various other objects are present in the captured image captured by the image capturing device in addition to a gaze object that a user desires to gaze at. In addition, the components of the press brake are similar to one another in color. Therefore, when the user checks the display means, there is an inconvenience that it is difficult to gaze at the gaze object within the captured image in an efficient manner.
One aspect of the present invention is a press brake including a press brake main body, an image capturing device, a distance measuring device, and an image processing device. The press brake main body is provided with an upper table configured to hold an upper tool and a lower table configured to hold a lower tool, and is configured to carry out a bending process on a plate-shaped workpiece when the upper table moves up and down relative to the lower table. The image capturing device captures an image of a work region in which the upper table and the lower table carry out the bending process in the press brake main body, and outputs a captured image. The distance measuring device detects a distance to an object present in the captured image and generates distance data in which the object and the distance are associated with each other. The image processing device generates, based on the distance data, a gaze image obtained by cutting out, from the captured image, a gaze object that is an object to be gazed at within the work region, and outputs the gaze image to a target device usable by a user.
According to the one aspect of the present invention, it is possible to recognize the distance to the object present in the captured image by generating the distance data. Since the distance data is data in which the object and the distance are associated with each other, by referring to the distance data, the image processing device can separate the gaze object from the other objects among the objects whose images are captured by the image capturing device. This makes it possible for the image processing device to generate the gaze image obtained by cutting out the gaze object from the captured image. Then, when the image processing device outputs the gaze image to the target device, the user can use the gaze image via the target device. Since the gaze image is an image obtained by cutting out the gaze object, the gaze object can be visually recognized more easily as compared with the case in which the captured image is visually recognized as it is.
According to the one aspect of the present invention, it is possible to visually recognize the gaze object in the captured image in an efficient manner.
A press brake according to the present embodiment will be described with reference to the drawings.
Hereinafter, the configuration of the press brake will be explained in detail. The press brake includes the press brake main body 10, an NC device 40, the camera 50, the distance measuring device 55, the image processing device 60, and a display device 70.
The configuration of the press brake main body 10 will be described. “FF”, “FR”, “L”, “R”, “U”, and “D” shown in the drawings refer to a forward direction, a backward direction, a left direction, a right direction, an upward direction, and a downward direction, respectively.
The press brake main body 10 is a working machine that carries out the bending process on the plate-shaped workpiece (sheet metal) W with a pair of tools. The press brake main body 10 is provided with the lower table 22 and the upper table 26.
The lower table 22 is provided at the lower part of the main body frame 16 and extends in the lateral direction. The lower table 22 holds a die 14 that is a lower tool. A lower tool holder 24 is attached to the upper end side of the lower table 22, and the die 14 is mounted on the lower tool holder 24.
The upper table 26 is provided at the upper part of the main body frame 16 and extends in the lateral direction. The upper table 26 is provided above the lower table 22 so as to face the lower table 22. The upper table 26 holds a punch 12 that is an upper tool. An upper tool holder 28 is attached to the lower end side of the upper table 26, and the punch 12 is mounted on the upper tool holder 28.
The upper table 26 is configured to move up and down with respect to the lower table 22 when a pair of hydraulic cylinders 30 provided on the left and right are driven up and down, respectively. The individual hydraulic cylinders 30 are driven up and down when an actuator mainly composed of a pump and a motor is operated. The vertical position of the upper table 26 is detected by a position detection sensor such as an unillustrated linear encoder. Position information detected by the position detection sensor is supplied to the NC device 40.
The press brake main body 10 may have a configuration in which the lower table 22 is moved up and down in lieu of the configuration in which the upper table 26 is moved up and down. In other words, the upper table 26 may be configured to move up and down relative to the lower table 22.
An unillustrated table cover that covers the upper table 26 is fixedly attached to the main body frame 16. Even when the upper table 26 moves up and down, the table cover does not move up and down and maintains a stationary state.
In the press brake main body 10, the workpiece W is placed on, for example, the die 14. When the upper table 26 is lowered, the workpiece W is sandwiched between the punch 12 and the die 14 to be bent.
A foot switch 36 on which an operator M carries out a stepping operation is installed in front of the lower table 22. When the operator M carries out the stepping operation, the foot switch 36 outputs an activation signal. The activation signal is a signal for starting a lowering operation of the upper table 26.
Behind the lower table 22, a back gauge 38 for positioning the workpiece W in the front-rear direction with respect to the die 14 is provided. The back gauge 38 includes an abutting member 39 against which the end face of the workpiece W can be abutted. The abutting member 39 protrudes forward from the back gauge 38. The position of the abutting member 39 in the front-rear direction is adjustable.
In the press brake main body 10, a three-dimensional space, which includes the lower table 22 and the surroundings thereof, and the upper table 26 and the surroundings thereof, corresponds to the work region in which the lower table 22 and the upper table 26 carry out the bending process. A gaze region GR, which is to be gazed at within the work region, is defined in the work region.
The gaze region GR is an approximately cubic three-dimensional space that extends in the lateral direction, the front-rear direction, and the vertical direction.
For example, a range in the lateral direction in the gaze region GR is set to include the die 14 mounted on the lower tool holder 24 and the punch 12 mounted on the upper tool holder 28. Further, the vertical range in the gaze region GR is set to include the upper end side of the lower table 22 and the lower end side of the upper table 26. The range in the vertical direction is based on the state when the upper table 26 is in the most raised position (a fully open position).
Further, the range in the front-rear direction in the gaze region GR is set such that predetermined distances are ensured at the front and at the back centering on the lower table 22 and the upper table 26, respectively. The predetermined distance is determined in consideration of the length of the workpiece W in the front-rear direction, the distance from the lower table 22 to the back gauge 38, and the like.
The gaze region GR set in this manner includes the gaze object that is an object to be gazed at within the work region. The gaze object is, for example, the punch 12, the die 14, the back gauge 38, the workpiece W placed on the press brake main body 10, and a hand and an arm of the operator M.
The gaze region GR may be set to a fixed range and position regardless of the size of the workpiece W, the layout of the punch 12 and the die 14, and the position of the abutting member 39. Alternatively, the range and position of the gaze region GR may be set variably in accordance with the layout of the punch 12 and the die 14 and the position of the abutting member 39.
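As a minimal sketch of how such a region definition might be held in software (the type names, units, and the workpiece-dependent update below are assumptions for illustration, not part of the embodiment):

```python
from dataclasses import dataclass

@dataclass
class Range:
    lo: float  # lower bound in machine coordinates [mm]
    hi: float  # upper bound in machine coordinates [mm]

@dataclass
class GazeRegion:
    lateral: Range     # set to span the punch 12 and the die 14
    vertical: Range    # upper end side of the lower table 22 to the lower end
                       # side of the upper table 26 in the fully open position
    front_rear: Range  # predetermined margins in front of and behind the tables

def widen_for_workpiece(region: GazeRegion, workpiece_depth_mm: float) -> GazeRegion:
    """Variable setting: widen the front-rear range for a deep workpiece."""
    fr = Range(region.front_rear.lo - workpiece_depth_mm / 2,
               region.front_rear.hi + workpiece_depth_mm / 2)
    return GazeRegion(region.lateral, region.vertical, fr)
```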
The NC (Numerical Control) device 40 is a control device that controls the press brake main body 10. The NC device 40 drives the pair of hydraulic cylinders 30 up and down to control the vertical movement of the upper table 26. The NC device 40 controls the vertical position of the upper table 26 based on the position information detected by the position detection sensor.
The camera 50 is an image capturing device that captures an image of the work region centering on the gaze region GR and outputs the captured image ID. The camera 50 is attached to the table cover of the upper table 26 and is arranged behind the upper table 26. The camera 50 captures an image of the gaze region GR and a surrounding region thereof from above the gaze region GR. The camera 50 attached to the table cover does not move up and down even when the upper table 26 moves up and down, and maintains the same position.
When the operator M works, the operator M stands in front of the press brake main body 10 so as to face the press brake main body 10. Since the line of sight of the operator M is obstructed by the upper table 26, the punch 12, the lower table 22, the die 14, and the like, visibility in the work region behind the upper table 26 and the lower table 22 is reduced. In the work region behind the upper table 26 and the lower table 22, there are back sides of the punch 12 and the die 14, a rear region of the workpiece W abutted against the back gauge 38, and the like. By capturing the image of the work region from above and behind the upper table 26 with the camera 50, the work region at the rear in which the visibility of the operator M is low can be covered with the imaging range of the camera 50.
Here, in the situation in which the camera 50 captures the image of the work region, it is sufficient that the gaze region GR is included in the image capturing range, that is, the angle of view of the camera 50. Therefore, when observed from the camera 50, a part of the gaze region GR may be obstructed by a structure of the press brake main body 10 such as the upper table 26, the upper tool holder 28, the punch 12, the lower table 22, the lower tool holder 24, or the die 14.
The camera 50 includes an image capturing element such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor). A wide-angle lens or a fish-eye lens may be attached to the camera 50 so as to be able to capture a wide range. The camera 50 captures the image of the work region in response to a control signal from the image processing device 60, and acquires the captured image ID. The camera 50 outputs the acquired captured image ID to the image processing device 60.
In the same manner as the camera 50, the distance measuring device 55 acquires the distance data DD in response to the control signal from the image processing device 60, and outputs the acquired distance data DD to the image processing device 60.
The resolution (the number of pixels) of the distance data DD generated by the distance measuring device 55 is the same as the resolution (the number of pixels) of the captured image ID output from the camera 50. However, the resolution (the number of pixels) of the distance data DD may be different from the resolution (the number of pixels) of the captured image ID.
The image processing device 60 outputs the control signals to the camera 50 and the distance measuring device 55 at a predetermined cycle, so as to periodically acquire the captured image ID from the camera 50 and periodically acquire the distance data DD from the distance measuring device 55. The control signals output from the image processing device 60 to the camera 50 and the distance measuring device 55 are synchronized with each other. Therefore, the image capturing timing by the camera 50 and the distance measuring timing by the distance measuring device 55 are synchronized with each other.
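A minimal sketch of this synchronized, periodic acquisition (the device interfaces `trigger()`, the callback, and the cycle length are hypothetical placeholders, not part of the embodiment):

```python
import time

CYCLE_S = 0.1  # assumed acquisition cycle; the actual period is implementation-specific

def acquisition_loop(camera, range_finder, process):
    """Issue synchronized control signals and hand both results to image processing."""
    while True:
        t0 = time.monotonic()
        # One control signal per cycle to each device, issued back to back so that
        # the image capturing timing and the distance measuring timing coincide.
        captured_image = camera.trigger()        # captured image ID
        distance_data = range_finder.trigger()   # distance data DD
        process(captured_image, distance_data)
        # Sleep off the remainder of the cycle before the next pair of signals.
        time.sleep(max(0.0, CYCLE_S - (time.monotonic() - t0)))
```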
The image processing device 60 is composed of a microcomputer that is mainly composed of a CPU (Central Processing Unit), memories such as a ROM (Read Only Memory) and a RAM (Random Access Memory), and an I/O (Input/Output) interface. The CPU of the image processing device 60 reads, from the ROM or the like, various programs and data in accordance with the processing contents, expands them in the RAM, and executes the expanded various programs. Thereby, the microcomputer functions as a plurality of information processing circuits provided to the image processing device 60. In the present embodiment, an example of realizing the plurality of information processing circuits provided to the image processing device 60 by way of software is shown, but dedicated hardware for executing the respective information processing circuits may be prepared.
The image processing device 60 includes a storage unit 61, an object identification unit 62, an image cutout unit 63, and an image output unit 64 as the plurality of information processing circuits.
The storage unit 61 stores definition information that defines the range of the gaze region GR. The definition information defines the range of the gaze region GR in accordance with the distance from the distance measuring device 55. For example, the definition information includes a distance distribution of the upper end plane Fa1 of the gaze region GR and a distance distribution of the lower end plane Fa2 of the gaze region GR, each acquired in advance as distances from the distance measuring device 55.
Note that methods other than actually measuring the distances to reference plates respectively arranged on the upper end plane Fa1 and the lower end plane Fa2 may also be used to acquire the distance distribution of the upper end plane Fa1 and the distance distribution of the lower end plane Fa2. For example, these distance distributions may be obtained by way of geometric calculations in consideration of the positional relationship between the gaze region GR and the distance measuring device 55.
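As one way to picture such a geometric calculation, the sketch below computes a per-pixel distance distribution to a horizontal plane under a simple pinhole model with the rangefinder looking straight down at the plane. The function name, the intrinsics (focal length, principal point), and the mounting heights are all assumptions for illustration, not values from the embodiment:

```python
import numpy as np

def plane_distance_map(height_to_plane_mm: float, width_px: int, height_px: int,
                       focal_px: float, cx: float, cy: float) -> np.ndarray:
    """Per-pixel distance from the rangefinder to a horizontal plane.

    height_to_plane_mm is the vertical offset from the sensor to the plane;
    the optical axis is assumed to point straight down.
    """
    u = np.arange(width_px) - cx
    v = (np.arange(height_px) - cy)[:, None]
    # Viewing-ray length per unit of vertical drop: sqrt(f^2 + u^2 + v^2) / f,
    # so off-axis pixels see the plane at a longer distance than the center pixel.
    ray_scale = np.sqrt(focal_px**2 + u**2 + v**2) / focal_px
    return height_to_plane_mm * ray_scale  # shape (height_px, width_px)

# Distance distributions for the upper end plane Fa1 and the lower end plane Fa2
# (heights and intrinsics are illustrative values only).
fa1 = plane_distance_map(800.0, 640, 480, focal_px=500.0, cx=320.0, cy=240.0)
fa2 = plane_distance_map(1300.0, 640, 480, focal_px=500.0, cx=320.0, cy=240.0)
```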
The object identification unit 62 identifies the gaze object based on the definition information and the distance data DD that is output from the distance measuring device 55. The object identification unit 62 identifies, as the gaze object, an object whose distance Dij falls within the gaze region GR, from among the objects present in the captured image ID.
The image cutout unit 63 generates the gaze image CD obtained by cutting out the gaze object from the captured image ID. The gaze image CD generated by the image cutout unit 63 is output to the image output unit 64.
The image output unit 64 outputs the gaze image CD to the display device 70. Further, the image output unit 64 may output the gaze image CD to the image storage device 80. The image storage device 80 can store a predetermined volume of gaze images CD. A user including the operator M can use the gaze image CD stored in the image storage device 80 via a computer.
The display device 70 includes a display panel 71 for displaying information. The display device 70 is arranged at a position visible to the operator M who operates the foot switch 36. The gaze image CD output from the image output unit 64 is displayed on the display panel 71 of the display device 70. Note that the display device 70 may switch between displaying the gaze image CD output from the image output unit 64 and displaying the captured image ID, in accordance with an operation by the operator M. Further, on the display panel 71 of the display device 70, information output from the NC device 40 or the like can be displayed in addition to the gaze image CD.
In step S10, the image cutout unit 63 outputs the control signal to the camera 50 and acquires the captured image ID from the camera 50.
In step S11, the object identification unit 62 outputs the control signal to the distance measuring device 55 and acquires the distance data DD from the distance measuring device 55.
For convenience of description, the processing of step S10 and the processing of step S11 are separated, but it is desirable to execute the processing of step S10 and the processing of step S11 such that the capturing timing of the captured image ID and the distance measuring timing for the object are synchronized with each other.
The distance data DD acquired from the distance measuring device 55 is subjected to a correction process that aligns its coordinates with those of the captured image ID. The correction process is carried out not on a specific pixel alone but on all of the pixels constituting the distance data DD. As a result, the coordinates of the distance data DD and the coordinates of the captured image ID are matched for the same object. Therefore, the distance data DD is data in which the distance is associated with each pixel PCij corresponding to the captured image ID; in other words, it is equivalent to data in which the object reflected in the pixel PCij and the distance are associated with each other. The correction process of the distance data DD does not have to be carried out by the distance measuring device 55, and may be executed by the object identification unit 62 that has acquired the distance data DD.
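A minimal sketch of one part of such a correction process, assuming the only remaining mismatch between the two devices is resolution (the function name is hypothetical, and a real correction would also compensate for parallax between the camera 50 and the distance measuring device 55 and for lens distortion):

```python
import numpy as np

def align_distance_to_image(distance_dd: np.ndarray, image_hw: tuple) -> np.ndarray:
    """Resample the distance data DD so that pixel (i, j) of the result refers
    to the same object as pixel (i, j) of the captured image ID.

    Nearest-neighbor resampling only, for illustration.
    """
    h_img, w_img = image_hw
    h_dd, w_dd = distance_dd.shape
    rows = np.arange(h_img) * h_dd // h_img
    cols = np.arange(w_img) * w_dd // w_img
    return distance_dd[rows[:, None], cols]  # shape (h_img, w_img)
```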
In step S12, the object identification unit 62 extracts the target pixel PCij to be processed from among the pixels PCij that constitute the distance data DD. When the processing of step S12 is executed for the first time, the object identification unit 62 extracts, as the target pixel PCij, the pixel PCij that is set in advance such as the pixel PCij located at the upper left in the distance data DD. On the other hand, when the processing of step S12 is executed for the second time onwards, the target pixel PCij is extracted according to processing coordinate information updated in step S17 that will be described later.
In step S13, the object identification unit 62 refers to the definition information stored in the storage unit 61, and identifies the range of the gaze region GR corresponding to the coordinates of the target pixel PCij.
In step S14, the object identification unit 62 determines whether or not the distance Dij of the target pixel PCij is within the gaze region GR. Specifically, the object identification unit 62 determines whether or not the distance Dij of the target pixel PCij is equal to or greater than the distance to the upper end plane Fa1 of the gaze region GR and equal to or smaller than the distance to the lower end plane Fa2 of the gaze region GR. When the distance of the target pixel PCij is within the gaze region GR, an affirmative determination is made in step S14 and the process proceeds to step S15. On the other hand, when the distance of the target pixel PCij is outside the gaze region GR, a negative determination is made in step S14 to skip the process of step S15, and the process proceeds to step S16.
In step S15, the object identification unit 62 identifies the target pixel PCij as a pixel of the gaze object. The object identification unit 62 outputs the coordinate information of the target pixel PCij to the image cutout unit 63 as the coordinate information of the gaze object.
In step S16, the object identification unit 62 determines whether or not the processing is completed. If not all the pixels PCij constituting the distance data DD are extracted as the target pixels PCij, the object identification unit 62 determines that the processing is not completed. In this case, since a negative determination is made in step S16, the process proceeds to step S17. On the other hand, if all of the pixels PCij constituting the distance data DD are extracted as the target pixels PCij, the object identification unit 62 determines that the processing is completed. In this case, since an affirmative determination is made in step S16, the process proceeds to step S18.
In step S17, the object identification unit 62 updates the processing coordinate information. The processing coordinate information is information that identifies the pixel PCij to be extracted as the target pixel PCij in the processing of step S12. The object identification unit 62 updates the processing coordinate information so that a pixel PCij that has not yet been extracted from the distance data DD becomes the next target pixel PCij. The process then returns to step S12.
In step S18, the image cutout unit 63 generates the gaze image CD by cutting out, from the captured image ID, the pixels identified as the gaze object based on the coordinate information output from the object identification unit 62.

In step S19, the image output unit 64 outputs the gaze image CD to the display device 70. The display device 70 to which the gaze image CD is input displays the gaze image CD on the display panel 71 thereof.
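Putting steps S10 through S19 together, one processing cycle might look like the minimal sketch below. The device objects, the vectorized mask in place of the explicit S12 to S17 pixel loop, and the blank-to-zero cutout representation are assumptions for illustration, not the embodiment's implementation:

```python
import numpy as np

def gaze_image_cycle(camera, range_finder, fa1, fa2, display):
    """One pass of the flow from step S10 to step S19."""
    captured_id = camera.trigger()        # S10: acquire the captured image ID
    distance_dd = range_finder.trigger()  # S11: acquire the distance data DD

    # S12-S17: examine every pixel PCij of the distance data DD; a pixel belongs
    # to the gaze object when Fa1 <= Dij <= Fa2 (steps S13-S15). The vectorized
    # comparison visits all pixels, which is what the S16/S17 loop achieves.
    gaze_mask = (distance_dd >= fa1) & (distance_dd <= fa2)

    # S18: cut the gaze object out of the captured image. Pixels outside the
    # gaze region are blanked here, though any cutout representation would do.
    gaze_cd = np.where(gaze_mask[..., None], captured_id, 0)

    display.show(gaze_cd)                 # S19: output to the display device 70
    return gaze_cd
```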
As described above, the press brake according to the present embodiment includes the image processing device 60 that generates the gaze image CD obtained by cutting out the gaze object, which is the object to be gazed at within the work region, from the captured image ID based on the distance data DD, and outputs the gaze image CD to the target device usable by the user.
According to the present configuration, by generating the distance data DD, it is possible to recognize the distance to the object present in the captured image ID. Since the distance data DD is the data in which the object and the distance are associated with each other, by referring to the distance data DD, the image processing device 60 can separate the gaze object that exists in the gaze region GR and the object that exists outside the gaze region GR, from among the objects captured with the camera 50. This makes it possible for the image processing device 60 to generate the gaze image CD obtained by cutting out the gaze object from the captured image ID. Then, when the image processing device 60 outputs the gaze image CD to the target device, the user can use the gaze image CD via the target device. Since the gaze image CD is the image obtained by cutting out the gaze object, the gaze object can be visually recognized more easily as compared with the case in which the captured image ID is visually recognized as it is. As a result, when the image is observed, the gaze object can be visually recognized in an efficient manner.
In the present embodiment, the target device is the display device 70 that is visually recognized by the operator M of the press brake main body 10. According to the present configuration, by using the display device 70, the operator M can confirm the gaze image CD in real time. Then, when the image is observed, the gaze object can be visually recognized in an efficient manner.
Alternatively, the target device may be the image storage device 80 usable, via the computer, by the user such as the operator M. According to the present configuration, by using the image storage device 80 via the computer, the user can confirm the gaze image CD at a necessary timing. Then, when the image is observed, the gaze object can be visually recognized in an efficient manner.
In the present embodiment, the gaze region GR is the three-dimensional space that is set between the lower end side of the upper table 26 and the upper end side of the lower table 22.
According to the present configuration, the object involved in the bending process is included in the gaze region GR. This makes it possible for the image processing device 60 to appropriately generate the gaze image CD in which the object involved in the bending process is cut out. Since the gaze image CD is the image obtained by cutting out the gaze object, the gaze object can be visually recognized more easily as compared with the case in which the captured image ID is visually recognized as it is. As a result, when the image is observed, the gaze object can be visually recognized in an efficient manner.
In the present embodiment, the gaze region GR is set to include the punch 12, the die 14, the back gauge 38 against which the workpiece W is abutted, and the workpiece W.
According to the present configuration, the image processing device 60 can appropriately generate the gaze image CD obtained by cutting out the gaze objects such as the punch 12, the die 14, the back gauge 38, and the workpiece W. Since the gaze image CD is the image obtained by cutting out the gaze object, the gaze object can be visually recognized more easily as compared with the case in which the captured image ID is visually recognized as it is. As a result, when the image is observed, the gaze object can be visually recognized in an efficient manner.
The gaze region GR may be set to include the hand and the arm of the operator M that hold the workpiece W. According to the present configuration, the operator M can easily grasp the positional relationship between the body of the operator M himself/herself and the workpiece W by recognizing the hand and arm displayed on the display device 70. This makes it possible to improve workability.
The distance data DD is the data in which the distance is associated with each of the plurality of pixels PCij corresponding to the captured image ID. According to the present configuration, in the distance data DD, the distance is associated with each of the pixels PCij. The image processing device 60 can cut out the gaze object pixel by pixel by cutting out the captured image ID based on the distance data DD. This makes it possible for the image processing device 60 to cut out the gaze image CD corresponding to the gaze object by a simple process.
Note that in the present embodiment, the distance data DD is the data in which the distance is associated with each of the pixels PCij. However, the distance data DD may instead be data in which the distance is associated with each pixel block composed of a plurality of adjacent pixels PCij, rather than with each single pixel PCij. Further, the distance data DD may be data in which the distance is associated with each pixel block grouped for each object recognized by an image processing technique. In addition, the distance data DD does not have to correspond to all of the pixels PAij constituting the captured image ID. The distance data DD may be data in which the distance is associated only with specific pixels PAij selected from all of the pixels PAij constituting the captured image ID.
In the present embodiment, the image processing device 60 identifies the gaze object among the objects present in the captured image ID based on the definition information and the distance data DD. According to the present configuration, the image processing device 60 can recognize, by referring to the definition information, the range of the gaze region GR by way of the distance from the distance measuring device 55. Therefore, the image processing device 60 can identify the gaze object by comparing the distance to the object indicated by the distance data DD with the range indicated by the definition information. This makes it possible to appropriately identify the gaze object.
In the present embodiment, the definition information includes the distance distribution of the upper end plane Fa1 of the gaze region GR and the distance distribution of the lower end plane Fa2 of the gaze region GR. According to the present configuration, the image processing device 60 can recognize, as the gaze object, an object that has a distance equal to or greater than the distance recognized from the distance distribution of the upper end plane Fa1 and equal to or smaller than the distance recognized from the distance distribution of the lower end plane Fa2. This makes it possible for the image processing device 60 to appropriately identify the gaze object.
The first camera 50 and the first distance measuring device 55 are arranged behind the upper table 26. For this reason, in the gaze region GR, objects located in front of the lower table 22 and the upper table 26 may be blocked by the structures such as the upper table 26, the punch 12, the lower table 22, and the die 14 and may not be captured by the first camera 50 and the first distance measuring device 55. Further, depending on the angles of view of the first camera 50 and the first distance measuring device 55, it may not be possible to cover the entire area of the gaze region GR. Therefore, the second camera 51 and the second distance measuring device 56 are arranged in front of the upper table 26 to cover the entire area of the gaze region GR.
In a similar manner, second definition information that defines the range of the gaze region GR in accordance with the distance from the second distance measuring device 56 is stored in the storage unit 61, in addition to the first definition information corresponding to the first distance measuring device 55.
In the press brake having such a configuration, the object identification unit 62 of the image processing device 60 identifies the gaze object existing in the gaze region GR based on the first and second definition information and the distance data DD of the first distance measuring device 55 and the second distance measuring device 56. The image cutout unit 63 of the image processing device 60 generates the gaze image CD obtained by cutting out the gaze object from the captured image ID.
One example of a method for identifying the gaze object and generating the gaze image CD is as follows. That is, the process using the first camera 50 and the first distance measuring device 55 and the process using the second camera 51 and the second distance measuring device 56 are carried out individually, and the gaze images CD cut out by the respective processes are combined at the end.
Specifically, the object identification unit 62 identifies the gaze object based on the first definition information and the distance data DD of the first distance measuring device 55. Then, the image cutout unit 63 generates the gaze image CD obtained by cutting out the gaze object, which is identified by way of the first distance measuring device 55, from the captured image ID of the first camera 50. In the same manner, the object identification unit 62 identifies the gaze object based on the second definition information and the distance data DD of the second distance measuring device 56. Then, the image cutout unit 63 generates the gaze image CD obtained by cutting out the gaze object, which is identified by way of the second distance measuring device 56, from the captured image ID of the second camera 51. Finally, the image cutout unit 63 combines the gaze image CD cut out from the captured image ID of the first camera 50 and the gaze image CD cut out from the captured image ID of the second camera 51 such that the two images overlap with each other in the regions in which the image capturing ranges of the first camera 50 and the second camera 51 overlap. As a result, the image cutout unit 63 generates the combined gaze image CD.
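A sketch of the final combining step, assuming both gaze images CD have already been registered to a common image frame with zeroed pixels wherever nothing was cut out (the registration itself, and the function name, are assumptions outside this sketch):

```python
import numpy as np

def combine_gaze_images(cd_rear: np.ndarray, cd_front: np.ndarray) -> np.ndarray:
    """Combine the gaze image cut out via the first camera 50 / first distance
    measuring device 55 (rear) with the one cut out via the second camera 51 /
    second distance measuring device 56 (front).

    In the overlap of the two image capturing ranges both images show the same
    object, so either source may be kept; here the rear image takes priority
    wherever it has a non-blank pixel.
    """
    rear_has_pixel = cd_rear.any(axis=-1, keepdims=True)
    return np.where(rear_has_pixel, cd_rear, cd_front)
```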
According to the present configuration, by using the plurality of cameras 50 and 51 and the plurality of distance measuring devices 55 and 56 in combination, it is possible to efficiently cover the entire area of the gaze region GR.
A configuration is considered in which the distance measuring device 55 is attached to the table cover 32 of the upper table 26. In this configuration, the distance measuring device 55 does not move up and down even when the upper table 26 moves up and down.
When the upper table 26 moves up and down, the region to be gazed at substantially changes. This is because the gaze region GR only needs to include the gaze objects such as the punch 12, the die 14, the back gauge 38, the workpiece W, and the hand of the operator M. Therefore, the object identification unit 62 corrects the definition information so as to correct the range of the gaze region GR.
When the distance between the distance measuring device 55 and the lower end side of the upper table 26 is defined as “D1”, the distance distribution of the upper end plane Fa1 held by the storage unit 61 is set based on the distance D1 obtained when the upper table 26 is in the fully open position. Therefore, the object identification unit 62 corrects the distance distribution of the upper end plane Fa1, which is read from the storage unit 61, in accordance with the vertical movement of the upper table 26. Specifically, when the upper table 26 moves up and down, that is, when the upper end plane Fa1 moves up and down, the object identification unit 62 corrects the distance distribution of the upper end plane Fa1 by the amount by which the distance D1 changes in accordance with the amount of movement of the upper table 26.
Next, a configuration is considered in which the distance measuring device 55 is attached to the upper table 26. In this configuration, the distance measuring device 55 moves up and down together with the upper table 26.
When the distance measuring device 55 moves up and down, the region to be gazed at substantially changes. This is because the gaze region GR only needs to include the gaze objects such as the punch 12, the die 14, the back gauge 38, the workpiece W, and the hand of the operator M. Therefore, the object identification unit 62 corrects the definition information so as to correct the range of the gaze region GR.
When the distance between the distance measuring device 55 and the upper end side of the lower table 22 is defined as “D2”, the distance distribution of the lower end plane Fa2 held by the storage unit 61 is set based on the distance D2 obtained when the upper table 26 is in the fully open position. Therefore, the object identification unit 62 corrects the distance distribution of the lower end plane Fa2, which is read from the storage unit 61, in accordance with the vertical movement of the distance measuring device 55. Specifically, when the distance measuring device 55 moves up and down, that is, when the lower end plane Fa2 effectively moves relative to the device, the object identification unit 62 corrects the distance distribution of the lower end plane Fa2 by the amount by which the distance D2 changes in accordance with the amount of movement of the distance measuring device 55.
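In both configurations, the correction amounts to shifting a stored distance distribution by the movement amount reported by the position detection sensor. A minimal sketch, with the sign conventions assumed from the descriptions above:

```python
import numpy as np

def corrected_fa1(fa1_fully_open: np.ndarray, table_drop_mm: float) -> np.ndarray:
    """Cover-mounted rangefinder: the upper end plane Fa1 follows the lower end
    side of the upper table 26, so lowering the table by table_drop_mm moves Fa1
    further from the stationary rangefinder by the same amount (D1 grows)."""
    return fa1_fully_open + table_drop_mm

def corrected_fa2(fa2_fully_open: np.ndarray, table_drop_mm: float) -> np.ndarray:
    """Table-mounted rangefinder: the rangefinder descends with the upper table
    26, so the stationary lower end plane Fa2 gets closer (D2 shrinks)."""
    return fa2_fully_open - table_drop_mm
```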
In this manner, in the present embodiment, the image processing device 60 can correct the distance distribution of the lower end plane Fa2 in accordance with the vertical movement of the upper table 26. According to the present configuration, the image processing device 60 corrects the distance distribution of the lower end plane Fa2 in accordance with the vertical movement of the distance measuring device 55 that is interlocked with the upper table 26. This makes it possible to optimize the range of the gaze region GR in accordance with the vertical movement of the distance measuring device 55. As a result, only the necessary gaze object can be cut out as the gaze image.
Further, in the present embodiment, the image processing device 60 can correct the distance distribution of the upper end plane Fa1 in accordance with the vertical movement of the upper table 26. According to the present configuration, the image processing device 60 corrects the distance distribution of the upper end plane Fa1 in accordance with the vertical movement of the upper table 26. This makes it possible to optimize the range of the gaze region GR in accordance with the vertical movement of the upper table 26. As a result, only the necessary gaze object can be cut out as the gaze image.
For example, the camera 50 and the distance measuring device 55 may be integrated into a single sensor 52. The sensor 52 includes an image capturing element 52a and a beam projector 52b. The sensor 52 further includes an image output unit 52c that receives a luminance signal indicating an intensity of a reflected beam received by the image capturing element 52a, and outputs, based on the luminance signal, the captured image ID whose luminance indicates the intensity of the reflected beam. Further, the sensor 52 measures, for each pixel, a delay time of the beam receiving timing with respect to the beam projecting timing based on the luminance signal, and generates the distance data DD indicating the distance to the object.
According to the present configuration, since the camera 50 and the distance measuring device 55 can be integrated, the device can be simplified.
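For reference, the per-pixel delay-to-distance conversion such an integrated sensor performs follows from the round trip of the projected beam; a minimal sketch, assuming the measured delay is available in seconds:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def delay_to_distance_mm(delay_s: float) -> float:
    """Distance to the object from the measured round-trip delay of the beam:
    the beam travels out and back, so the one-way distance is c * t / 2."""
    return SPEED_OF_LIGHT_M_S * delay_s / 2.0 * 1000.0

# Example: a delay of about 6.67 ns corresponds to roughly 1 m (1000 mm).
print(delay_to_distance_mm(6.67e-9))
```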
As described above, the embodiments of the present invention have been explained; however, the statements and drawings that form a part of the present disclosure should not be understood to limit the present invention. From the present disclosure, various alternative embodiments, examples, and operational techniques will become apparent to those skilled in the art.
For example, the distance measuring device is not limited to the configuration using the image capturing element, and may have a configuration in which a two-dimensional distance distribution is generated by using a laser radar, an ultrasonic sensor, or the like. Further, the press brake main body 10 is configured such that the operator M places the workpiece W, but the workpiece W may be placed by a transfer robot.
Further, not only the press brake described above, but also an image output device and an image output method that output the gaze image based on the captured image and the distance data constitute a part of the present invention.
The disclosure of the present application is related to the subject matter described in Japanese Patent Application No. 2020-157478 filed on Sep. 18, 2020, and all the disclosure contents thereof are incorporated herein by reference.