The present disclosure relates to a sewing machine.
A sewing machine is known that can detect, from a captured image of a sewing object captured during sewing, a marker that is placed on the sewing object by a user. On the basis of the detected marker, the known sewing machine changes a sewing condition, such as stopping the sewing, at a position identified on the basis of the marker.
In the known sewing machine, due to the influence of a feed efficiency of the sewing object, an inclination of the cloth during the sewing, and the like, the position of the marker detected by the sewing machine sometimes differs from the actual position of the marker. In such a case, the sewing machine cannot change the sewing condition, such as stopping the sewing, at the position instructed by the marker. In other words, the sewing condition is changed at a position that differs from the position of the marker placed by the user.
Embodiments of the broad principles derived herein provide a sewing machine capable of notifying a user of a position of a marker, on a sewing object, detected by the sewing machine.
Embodiments provide a sewing machine that includes a bed portion, a conveyance portion, a sewing portion, an image capture portion, a projector, a processor, and a memory. The conveyance portion includes a feed dog. The conveyance portion is configured to convey a sewing object placed on the bed portion in a conveyance direction, using the feed dog. The sewing portion includes a needle bar. The sewing portion is configured to form stitches in the sewing object conveyed by the conveyance portion, by causing a sewing needle mounted on the needle bar to move up and down. The image capture portion is configured to perform image capture of an image capture range including below the needle bar. The projector is configured to project a projection image toward the bed portion. The processor is configured to control the conveyance portion, the sewing portion, the image capture portion, and the projector. The memory is configured to store marker information and computer-readable instructions that, when executed by the processor, instruct the processor to perform processes. The processes include causing the image capture portion to perform the image capture at a predetermined timing, during a conveyance period in which the conveyance portion is being driven, identifying the marker in a captured image obtained by the image capture, using the marker information stored in the memory, identifying a projection position corresponding to the identified marker, when the marker is identified, and causing the projector to project the projection image indicating the identified projection position, while following a movement of the marker on the sewing object being conveyed.
Embodiments will be described below in detail with reference to the accompanying drawings.
Hereinafter, an embodiment of the present disclosure will be explained with reference to the drawings. The drawings are used to explain technological features that the present disclosure can utilize. The configuration of the device described below, and the like, are merely explanatory examples, and do not limit the present disclosure to only that configuration.
A physical configuration of a sewing machine 1 will be explained with reference to the drawings.
The pillar portion 3 is provided internally with a processor 80 (refer to the drawings).
A cover 42 that can open and close is provided on an upper portion of the arm portion 4. A thread housing portion 45 is provided below the cover 42. A thread spool 20, around which an upper thread is wound, is housed in the thread housing portion 45. During sewing, the upper thread wound around the thread spool 20 is supplied from the thread spool 20 to the sewing needle 52 mounted on the needle bar 51, via a predetermined path provided in the head portion 5. The drive shaft 34, which extends in the left-right direction, is provided inside the arm portion 4. The drive shaft 34 is driven to rotate by the sewing machine motor 33. Various switches, including a start/stop switch 29, are provided on a lower portion on the left of the front surface of the arm portion 4. The start/stop switch 29 starts or stops operation of the sewing machine 1. In other words, the start/stop switch 29 is used to input a command to start sewing or stop the sewing.
The image sensor 57 is, for example, a known area sensor in which a plurality of imaging elements aligned in a main scanning direction are arranged in a plurality of rows in a sub-scanning direction. A known complementary metal oxide semiconductor (CMOS) is used as the imaging element, for example. In the present embodiment, the main scanning direction and the sub-scanning direction respectively correspond to the left-right direction and the front-rear direction of the sewing machine 1. The image sensor 57 is configured to capture an image of an image capture range RC (refer to the drawings).
The projector 58 is configured to project an image onto a predetermined range (a projection range RP) on the bed portion 2.
An electrical configuration of the sewing machine 1 will be explained with reference to the drawings.
The CPU 81 performs main control of the sewing machine 1, and executes various arithmetic operations and processing relating to sewing, image capture, and image projection, in accordance with various programs stored in the ROM 82. Although not shown in the drawings, the ROM 82 is provided with a plurality of storage areas including a program storage area. The various programs used to operate the sewing machine 1, such as a program for the main processing to be described later, are stored in the program storage area. Calculation results of the arithmetic processing performed by the CPU 81 can be stored in the RAM 83. The flash memory 84 includes a storage area 87 that stores marker information, which will be described later. The flash memory 84 also stores various parameters used by the sewing machine 1 to perform various types of processing. For example, the parameters include variables that associate a world coordinate system, an image coordinate system of the image sensor 57, and a projection coordinate system of the projector 58 with each other. The world coordinate system is a coordinate system that indicates the whole of a space, and is not influenced by the center of gravity or the like of a particular object.
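The disclosure does not specify the form of these stored variables. The following is a minimal sketch, in Python, of one way the three coordinate systems could be associated, assuming planar homographies between the image plane, the world X-Y plane, and the projector panel; the matrix values and function names are hypothetical placeholders, not part of the disclosure.

```python
import numpy as np

# Placeholder 3x3 homographies; real values would come from calibration.
H_IMG2WORLD = np.eye(3)   # image coordinate system -> world X-Y plane (assumed)
H_WORLD2PROJ = np.eye(3)  # world X-Y plane -> projection coordinate system (assumed)

def _apply(h: np.ndarray, x: float, y: float) -> tuple:
    # Apply a homography to a 2D point in homogeneous coordinates.
    p = h @ np.array([x, y, 1.0])
    return (p[0] / p[2], p[1] / p[2])

def image_to_world(u: float, v: float) -> tuple:
    # Convert a pixel position in the captured image to world coordinates.
    return _apply(H_IMG2WORLD, u, v)

def world_to_projector(x: float, y: float) -> tuple:
    # Convert a world position to a pixel position on the liquid crystal panel 59.
    return _apply(H_WORLD2PROJ, x, y)
```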
The drive circuit 91 is connected to the sewing machine motor 33, and is configured to drive the sewing machine motor 33 in accordance with a control signal from the CPU 81. The drive circuit 92 is connected to the feed amount adjustment motor 22, and is configured to drive the feed amount adjustment motor 22 in accordance with a control signal from the CPU 81. The drive circuit 93 is configured to drive the LCD 31 in accordance with a control signal from the CPU 81, and causes an image, an operation screen, and the like to be displayed on the LCD 31. The drive circuit 94 is configured to drive the liquid crystal panel 59 of the projector 58 in accordance with a control signal from the CPU 81, and causes the projection image to be displayed on the liquid crystal panel 59.
The light source 56 of the projector 58, a drive shaft angle sensor 35, the touch panel 32, the start/stop switch 29, and the image sensor 57 are further connected to the input/output I/F 85. The light source 56 illuminates in accordance with a control signal from the CPU 81, and projects the projection image displayed on the liquid crystal panel 59 onto the sewing object being conveyed on the bed portion 2. The drive shaft angle sensor 35 can detect a rotation speed and a rotation position of the sewing machine motor 33. The touch panel 32 can output, to the CPU 81, coordinate data indicating an input position of an operation using a finger or a dedicated touch pen. On the basis of the coordinate data acquired from the touch panel 32, the CPU 81 detects an item selected on the operation screen displayed on the LCD 31, and performs corresponding processing. The start/stop switch 29 can receive an input of an operation relating to the sewing machine 1 separately from the touch panel 32, and can perform output to the CPU 81. When the CPU 81 receives the input of the operation of the start/stop switch 29, the CPU 81 outputs a control signal to start or stop the sewing operation. The image sensor 57 can output, to the CPU 81, data of a captured image captured by the imaging elements.
A marker 68 will be explained with reference to the drawings.
Main processing of the present embodiment will be explained with reference to the drawings.
The processor 80 determines whether the sewing object is being conveyed (step S5). The sewing object is being conveyed when the feed dog 24 is higher than the needle plate 11, more specifically, when the leading end of the sewing needle 52 is higher than the needle plate 11. On the basis of an output signal of the drive shaft angle sensor 35, the processor 80 of the present embodiment determines that the sewing object is being conveyed when an angle of the drive shaft 34 is in a predetermined range. When the sewing object is being conveyed (yes at step S5), the processor 80 returns the processing to step S5. When the sewing object is not being conveyed (no at step S5), the processor 80 controls the image sensor 57, causes the image capture range RC to be captured, and acquires the captured image (step S6). By the processing at step S5 and step S6, the image of the sewing object is captured by the image sensor 57 while the sewing object is not being conveyed, during a conveyance period in which the conveyance portion 21 is being driven. The conveyance period includes a first period and a second period. The first period is a period in which the upper end of the feed dog 24 is located above the upper end of the needle plate 11 and the feed dog 24 moves from the front to the rear. The second period is a period in which the upper end of the feed dog 24 is located below the upper end of the needle plate 11 and the feed dog 24 moves from the rear to the front. The processor 80 causes the image sensor 57 to capture the image of the sewing object at an image capture timing at which the sewing object is not being conveyed, during the conveyance period. In other words, the processor 80 causes the image sensor 57 to capture the image of the sewing object at the image capture timing during the second period of the conveyance period.
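As an illustration only, the timing decision at steps S5 and S6 could look like the following sketch. It assumes the drive shaft angle sensor 35 reports an angle in degrees, and the angle window standing for the second period is a hypothetical value, not one taken from the disclosure.

```python
# Hypothetical angle window for the second period (feed dog below the needle
# plate, sewing object at rest); the real range is device-specific.
SECOND_PERIOD_DEG = (190.0, 350.0)

def is_conveying(shaft_angle_deg: float) -> bool:
    # True while the angle of the drive shaft 34 is outside the second period.
    lo, hi = SECOND_PERIOD_DEG
    return not (lo <= shaft_angle_deg <= hi)

def capture_when_idle(shaft_angle_deg: float, capture):
    # Step S5: skip while conveying; step S6: capture once the object is at rest.
    if is_conveying(shaft_angle_deg):
        return None
    return capture()
```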
The processor 80 uses the marker information acquired by the processing at step S2 and identifies the marker 68 from inside the detection range in the captured image (step S7). A known method may be used as appropriate as a method for identifying the marker 68. In the present embodiment, the sewing machine 1 stores a procedure for identifying the marker 68, as the marker information. The processor 80 identifies the marker 68 in accordance with the procedure indicated by the marker information. More specifically, the processor 80 identifies the marker 68 from inside the detection range using the following procedure, for example. For example, by performing Hough transformation processing, which is known technology, on the image of the detection range, the processor 80 extracts a circumferential line (the graphic 63), and sequentially extracts the graphics 64 to 66 that are inside the extracted circumferential line. The processor 80 identifies the marker 68 on the basis of relative positions of the graphics 63 to 66.
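A minimal sketch of this identification procedure is shown below, using OpenCV's Hough circle transform as a stand-in for the Hough transformation processing described above. The radii, thresholds, and the inner-graphics check are assumptions for illustration, not values from the disclosure.

```python
import cv2
import numpy as np

def identify_marker(detection_range: np.ndarray):
    # Extract the circumferential line (graphic 63) with a Hough circle transform.
    gray = cv2.cvtColor(detection_range, cv2.COLOR_BGR2GRAY)
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.5, minDist=50,
                               param1=100, param2=40, minRadius=20, maxRadius=60)
    if circles is None:
        return None
    for cx, cy, r in np.round(circles[0]).astype(int):
        patch = gray[max(cy - r, 0):cy + r, max(cx - r, 0):cx + r]
        # Check the graphics 64 to 66 inside the circumferential line by their
        # relative positions (stand-in heuristic; the actual procedure is part
        # of the marker information).
        if patch.size and has_inner_graphics(patch):
            return (float(cx), float(cy))  # marker center in image coordinates
    return None

def has_inner_graphics(patch: np.ndarray) -> bool:
    # Hypothetical placeholder for the relative-position check of graphics 64-66.
    return patch.mean() < 128
```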
The processor 80 determines whether the marker 68 has been identified from the captured image by the processing at step S7 (step S8). When the marker 68 has not been identified (no at step S8), the processor 80 determines whether the flag is set to ON (step S12). When the flag is not set to ON (no at step S12), the processor 80 sets a second range R2 as the detection range (step S14).
The processor 80 calculates a distance between the marker position PM calculated at step S10 and the needle drop position PN (step S11). The coordinates, in the world coordinate system, of the needle drop position PN are stored in advance in the flash memory 84. At step S11, the processor 80 of the present embodiment calculates the distance between the marker position PM and the needle drop position PN in the conveyance direction. The processor 80 then determines whether the distance calculated at step S11 is equal to or less than the first distance D1 (step S15). The first distance D1 is a distance that is set in advance, taking into account an operation by which the user removes the marking pin 60 from the sewing object. When the calculated distance is not equal to or less than the first distance D1 (no at step S15), the processor 80 determines whether the distance calculated at step S11 is equal to or less than the second distance D2 (step S16). The second distance D2 is longer than the first distance D1, and is shorter than both a distance from the needle drop position PN to the front end of the second range R2 and a distance from the needle drop position PN to the front end of the projection range RP. In other words, a distance between the needle drop position PN and a marker position PM identified from the second range R2 is longer than the second distance D2.
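For illustration, the distance calculation at step S11 might be sketched as follows, assuming world coordinates in millimetres with the Y axis taken along the conveyance direction; the axis convention and the stored coordinates of PN are assumptions.

```python
# Needle drop position PN in world coordinates, stored in advance (values assumed).
NEEDLE_DROP_PN = (0.0, 0.0)

def distance_to_needle_drop(marker_pos_pm: tuple) -> float:
    # Step S11: distance between PM and PN in the conveyance direction only.
    return abs(marker_pos_pm[1] - NEEDLE_DROP_PN[1])
```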
When the distance calculated at step S11 is not equal to or less than the second distance D2 (no at step S16), the processor 80 identifies a projection position corresponding to the identified marker 68 (step S17). The processor 80 identifies the projection position on a predetermined plane corresponding to the identified marker 68. The predetermined plane is an X-Y plane of the world coordinate system, and a Z coordinate of the predetermined plane has a predetermined value. The predetermined value of the present embodiment is a value corresponding to the thickness of the sewing object, for example. In other words, the predetermined plane of the present embodiment corresponds to the upper surface of the sewing object. The predetermined plane may correspond to the upper surface of the needle plate 11. The processor 80 identifies, as the projection position, a position closer to the needle bar 51, by a predetermined amount, than the marker position that is the position on the sewing object of the marker 68 identified by the processing at step S7. More specifically, the predetermined amount is set to be larger as the set sewing speed becomes faster, and the processor 80 identifies, as the projection position, the position that is closer to the needle bar 51 than the marker position by the set predetermined amount. A relationship between the sewing speed and the predetermined amount is stored in advance in the flash memory 84. The projection position is identified in this way in order to take into account the conveyance amount of the sewing object in the period from the image capture to when the projection image is projected. With respect to the relationship between the sewing speed and the predetermined amount, when the sewing speed is the first speed, the predetermined amount is a first amount, and when the sewing speed is a second speed, the predetermined amount is a second amount, for example. The first amount is, for example, a distance between the graphic 64 and the graphic 66 in the front-rear direction.
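The shift at step S17 might be sketched as follows. The speed-to-amount table stands in for the relationship stored in the flash memory 84, and the concrete millimetre values are placeholders.

```python
# Hypothetical mapping from sewing speed to the predetermined amount (mm).
OFFSET_BY_SPEED_MM = {"first_speed": 2.0, "second_speed": 1.0}

def identify_projection_position(marker_pos: tuple, sewing_speed: str) -> tuple:
    # Shift the marker position toward the needle bar 51 by the predetermined
    # amount; here the needle bar lies in the -Y direction (assumed axes).
    x, y = marker_pos
    return (x, y - OFFSET_BY_SPEED_MM[sewing_speed])
```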
The processor 80 determines whether the projection position PP identified by the processing at step S17 is inside the projection range RP (step S20). When the projection position PP is inside the projection range RP (yes at step S20), the processor 80 causes the projector 58 to project, as the projection image, a line segment SL that passes through the projection position PP and extends in the direction orthogonal to the conveyance direction, on a virtual line VL that extends to the upstream side in the conveyance direction from the needle drop position PN (step S22).
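Steps S20 and S22 might be sketched as follows, reusing the hypothetical world_to_projector() mapping from the earlier sketch. The panel resolution and the bounds of the projection range RP are placeholders.

```python
import cv2
import numpy as np

PANEL_W, PANEL_H = 854, 480  # hypothetical resolution of liquid crystal panel 59
RP_X, RP_Y = (-15.0, 15.0), (-5.0, 40.0)  # hypothetical projection range RP (mm)

def render_line_segment(pp: tuple, half_len_mm: float = 10.0):
    # Step S20: project only when the projection position PP is inside RP.
    x, y = pp
    if not (RP_X[0] <= x <= RP_X[1] and RP_Y[0] <= y <= RP_Y[1]):
        return None
    # Step S22: draw the line segment SL, orthogonal to the conveyance (Y) axis.
    frame = np.zeros((PANEL_H, PANEL_W), dtype=np.uint8)
    u0, v0 = world_to_projector(x - half_len_mm, y)
    u1, v1 = world_to_projector(x + half_len_mm, y)
    cv2.line(frame, (int(u0), int(v0)), (int(u1), int(v1)), 255, thickness=2)
    return frame  # displayed on the panel via the drive circuit 94
```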
At step S15 and step S16 that are repeatedly performed, when the distance calculated at step S11 is not equal to or less than the first distance D1 (no at step S15), and when the calculated distance is equal to or less than the second distance D2 (yes at step S16), the processor 80 sets, as the sewing speed, the second speed that is slower than the first speed (step S18). The processor 80 controls the drive circuit 91, and performs the sewing at the speed set at step S18. The processor 80 sets, as the predetermined amount, the second amount that is smaller than the predetermined amount set by the processing at step S17, and identifies the projection position that is closer to the needle bar 51 than the marker position by the set predetermined amount (step S19). The processor 80 then performs the processing at step S20 described above.
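The branch at steps S15, S16, and S18 can be summarized by a small decision function, sketched here with placeholder thresholds for the first distance D1 and the second distance D2.

```python
D1_MM, D2_MM = 5.0, 20.0  # hypothetical values of the first and second distances

def decide_action(distance_mm: float) -> str:
    if distance_mm <= D1_MM:
        return "stop"          # yes at step S15 -> stop the sewing (step S24)
    if distance_mm <= D2_MM:
        return "second_speed"  # yes at step S16 -> slow down (steps S18, S19)
    return "first_speed"       # no at step S16 -> steps S17, S20
```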
The sewing machine 1 of the above-described embodiment can project the projection image indicating the projection position corresponding to the marker 68 detected on the basis of the captured image, while following a movement of the marker 68 on the sewing object. Using the projection image, the sewing machine 1 can notify the user of a recognition result, by the sewing machine 1, of the position of the marker 68 on the sewing object. The user can ascertain the relationship between the position of the actual marker 68 and the position of the marker 68 detected by the sewing machine 1, by referring to the projection image during sewing.
The processor 80 causes the image sensor 57 to perform the image capture a plurality of times during the conveyance period, and identifies the projection position on the basis of the captured image every time the image capture is performed. More specifically, during the conveyance period, each time the leading end of the sewing needle 52, which is caused to reciprocate up and down by the rotation of the drive shaft 34, is lower than the needle plate 11, the processor 80 causes the image sensor 57 to capture an image of the sewing object. In other words, during the conveyance period, the processor 80 causes the image sensor 57 to perform the image capture a plurality of times at mutually different timings. By causing the projector 58 to project the projection image indicating the projection position identified each time, the processor 80 projects the projection image while following the movement of the marker 68 on the sewing object. In comparison to a case in which the projection image is projected on the basis of the marker 68 detected from a single captured image, the sewing machine 1 can accurately identify the projection position while taking into account an influence of the feed efficiency of the sewing object, an inclination of the sewing object with respect to the upper surface of the bed portion 2 during the sewing, and the like.
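Tying the earlier sketches together, the follow-the-marker loop implied by this description could look like the following. The machine interface is hypothetical and stands in for the processor's control of the sensors and drive circuits.

```python
def follow_marker(machine):
    # Repeat during the conveyance period: capture at rest, re-identify, re-project.
    while machine.is_sewing():
        if is_conveying(machine.shaft_angle_deg()):
            continue                       # wait for the second period
        image = machine.capture()
        found = identify_marker(machine.detection_range(image))
        if found is None:
            continue
        pm = image_to_world(*found)        # marker position in world coordinates
        pp = identify_projection_position(pm, machine.sewing_speed())
        frame = render_line_segment(pp)
        if frame is not None:
            machine.project(frame)         # follows the marker's movement
```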
The processor 80 identifies, as the projection position, the position closer to the needle bar 51, by the predetermined amount, than the marker position that is the position on the sewing object of the marker 68 identified on the basis of the captured image (step S17, step S19). The sewing machine 1 can identify the projection position while taking into account the fact that the sewing object is being conveyed, during the period from when the captured image is generated to when the projection image is projected on the basis of the captured image.
The processor 80 sets the predetermined amount to be larger as the acquired sewing speed becomes faster, and identifies, as the projection position, the position that is closer to the needle bar 51 (further to the rear) than the marker position by the set predetermined amount (step S17, step S19). Thus, the sewing machine 1 can identify the projection position while taking into account the fact that the sewing object is conveyed by an amount corresponding to the sewing speed during the period from when the captured image is generated to when the projection image is projected on the basis of the captured image.
The processor 80 causes the image sensor 57 to perform the image capture while the sewing object is not being conveyed, during the conveyance period (no at step S5; step S6). Thus, the sewing machine 1 can acquire the clear captured image, in comparison to a case in which the image capture is performed during a period in which the sewing object is being conveyed.
When the marker 68 is identified from the captured image, the processor 80 calculates the distance in the conveyance direction from the marker position to the position below the needle bar 51 (the needle drop position) (step S11), and when the sewing object has been conveyed by the calculated distance from the marker position (step S27), the processor 80 stops the sewing by the conveyance portion 21 and the sewing portion 30 (step S28). The sewing machine 1 can thus stop the sewing on the basis of the detection result of the marker 68. During the sewing, by referring to the projection image, the user can verify, before the sewing is stopped, whether the sewing will be stopped at the position instructed by the marker 68, using the relationship between the position of the actual marker 68 and the position of the marker 68 detected by the sewing machine 1. Thus, in comparison to related art, the sewing machine 1 can reduce a possibility that the sewing is stopped at a position not instructed by the user using the marker 68.
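Steps S27 and S28 might be sketched as follows, assuming the conveyance amount per stitch equals the set feed amount; the machine interface is again hypothetical.

```python
def stop_at_marker(machine, target_distance_mm: float):
    # Accumulate the conveyance amount stitch by stitch (step S27), then stop
    # the conveyance portion 21 and the sewing portion 30 (step S28).
    conveyed = 0.0
    while conveyed < target_distance_mm:
        machine.sew_one_stitch()
        conveyed += machine.feed_amount_mm()
    machine.stop_sewing()
```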
When the distance calculated by the processing at step S11 becomes equal to or less than the first distance D1 (yes at step S15), the processor 80 stops the sewing by the conveyance portion 21 and the sewing portion 30 (step S24). By stopping the sewing, the sewing machine 1 can prompt the user to remove the marker 68 from the sewing object. The sewing machine 1 can thus suppress defects that would arise if the sewing were continued with the marker 68 still in place on the sewing object.
When the distance calculated by the processing at step S11 becomes equal to or less than the second distance D2, which is longer than the first distance D1, the processor 80 causes the conveyance speed of the sewing object by the conveyance portion 21 to be slower than the current speed (step S18). By reducing the conveyance speed while the distance is between the second distance D2 and the first distance D1, the sewing machine 1 can calculate the distance at step S11 more accurately. In comparison to a case in which the conveyance speed is not reduced, the user can also more easily ascertain the relationship between the position of the actual marker 68 and the position of the marker 68 detected by the sewing machine 1. Further, the sewing machine 1 can shorten the sewing time in comparison to a case in which the sewing object is conveyed at the reduced speed from the start of the sewing.
At the projection position on the sewing object, the processor 80 projects, as the projection image, the line segment SL extending in the direction orthogonal to the conveyance direction (step S22). Thus, the sewing machine 1 can notify the user of the marker position detected by the sewing machine 1, using the line segment SL that extends in the direction orthogonal to the conveyance direction.
The processor 80 projects the line segment SL as the projection image onto the virtual line VL that extends to the upstream side in the conveyance direction from the needle drop position PN that is below the needle bar 51 on the sewing object (step S22). Thus, when the stitches are formed in the straight line on the sewing object, the stitches are formed along the virtual line VL that extends to the upstream side in the conveyance direction from the needle drop position PN that is below the needle bar 51. The sewing machine 1 can project the line segment SL including the sewing stop position detected by the sewing machine 1 on the basis of the marker 68. Using the projected line segment SL, the user can verify the sewing stop position before the sewing is stopped.
The processor 80 causes the projection image to be projected onto a portion of the sewing object other than the marker 68. Thus, the sewing machine 1 can project the projection image at a position that is not on the marker 68. In comparison to a case in which the projection image is projected onto the marker 68, the user can more easily compare the actual position of the marker 68 with the position of the marker 68 detected by the sewing machine 1. Also in comparison to a case in which the projection image is projected onto the marker 68, the sewing machine 1 can more easily identify the marker 68 from the captured image when the image of the marker 68 is captured the plurality of times.
The processor 80 sets part of the captured image as the detection range, and identifies the marker 68 in the detection range using the marker information stored in the flash memory 84. In comparison to identifying the marker 68 from the entire range of the captured image, the sewing machine 1 can identify the marker 68 from the captured image in a shorter time.
The processor 80 sets, as the detection range at the start of the sewing, the first range R1 that is a part of the captured image (step S4), and when the marker 68 is identified from the detection range of a single captured image (yes at step S8), the processor 80 sets the fourth range R4 as the detection range of the captured image that is subsequently captured (step S23). The fourth range R4 includes the projection position identified from that captured image, and is smaller than the first range R1. In comparison to a case in which the marker 68 continues to be identified from the first range R1 of the captured image even after the marker 68 has been detected, the sewing machine 1 can efficiently identify the marker 68 from the captured image in a shorter time.
The processor 80 sets, as the detection range at the start of the sewing, the first range R1 that is part of the captured image (step S4), and when the marker 68 cannot be identified from the first range R1 (no at step S8; no at step S12), the processor 80 sets the second range R2 as the detection range (step S14). The second range R2, which is on the upstream side in the conveyance direction of the captured image, is smaller than the first range R1. The sewing machine 1 conveys the sewing object from the upstream side to the downstream side in the conveyance direction. When the marker 68 has not been identified even once after the start of the sewing, it is assumed that the marker 68 is positioned further to the upstream side in the conveyance direction than the detection range. In comparison to a case in which the processing continues with the first range R1 left set as the detection range, the sewing machine 1 can improve a speed of the processing to identify the marker 68.
After the marker 68 has been identified on the basis of the detection range in a single captured image (yes at step S8; step S10), when the marker 68 cannot be identified from the detection range in the captured image captured by the subsequent image capture (no at step S8; yes at step S12), the processor 80 sets the third range R3 as the detection range (step S13). The third range R3 includes the marker position identified on the basis of the earlier captured image, and a range further to the upstream side in the conveyance direction than the detection range used for the subsequent captured image. Thus, after identifying the marker 68 on the sewing object from the detection range, when the sewing machine 1 cannot identify the marker 68 from the detection range of the captured image obtained by the subsequent image capture, the sewing machine 1 can increase a possibility of identifying the marker 68 from the detection range of the captured image obtained by the further subsequent image capture.
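The detection range transitions at steps S4, S13, S14, and S23 amount to a small state machine, sketched below. The ranges R1 to R4 are treated as opaque labels because their concrete bounds are figure-specific.

```python
def next_detection_range(marker_found: bool, found_before: bool) -> str:
    # marker_found: yes/no at step S8; found_before: the flag tested at step S12.
    if marker_found:
        return "R4"  # step S23: small range around the identified projection position
    if found_before:
        return "R3"  # step S13: last marker position plus the upstream side
    return "R2"      # step S14: upstream part of the captured image
```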
The sewing machine of the present disclosure is not limited to the above described embodiment, and various changes may be made without departing from the spirit and scope of the present disclosure. For example, the following modifications may be added as appropriate.
(A) The configuration of the sewing machine 1 may be changed as appropriate. The sewing machine 1 may be an industrial sewing machine or a multi-needle sewing machine. A type, a mounting position, and the like of each of the image capture portion and the projector may be changed as appropriate. A positional relationship between the image capture range of the image capture portion and the projection range of the projector may be changed as appropriate. The pattern of the marker may be changed as appropriate. The marker may be disposed on a surface of a seal or the like, or may be a mark (a cross, for example) drawn on the sewing object by the user. A configuration may be adopted in which a plurality of types of markers corresponding to different sewing conditions can be detected.
(B) The program including the instructions to cause the main processing to be performed may be stored in a storage device of the sewing machine 1 until the processor 80 executes the program.
(C) The respective steps of the main processing performed by the sewing machine 1 are not limited to the example in which they are performed by the processor 80, and a part or all of the steps may be performed by another electronic device (an ASIC, for example). The configuration of the processor 80 may be changed as appropriate. The respective steps of the main processing may be performed through distributed processing by a plurality of electronic devices (a plurality of CPUs, for example). The respective steps of the main processing can be changed in order, omitted or added, as necessary. An aspect in which an operating system (OS) or the like operating on the sewing machine 1 performs a part or all of the main processing on the basis of a command from the processor 80 is also included in the scope of the present disclosure. For example, the following modifications from (C-1) to (C-5) may be added to the main processing, as appropriate.
(C-1) It is sufficient that the marker information be information that can identify the marker from the captured image. For example, the marker information may be image data of the marker, and the processor 80 may identify the marker from the captured image by comparing the image data of the marker and the captured image.
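A minimal sketch of that modification is shown below, using OpenCV's normalized template matching to compare stored marker image data against the captured image; the threshold and the grayscale-input assumption are placeholders.

```python
import cv2

def identify_marker_by_template(captured_gray, marker_template_gray,
                                threshold: float = 0.8):
    # Slide the stored marker image over the captured image, keep the best match.
    result = cv2.matchTemplate(captured_gray, marker_template_gray,
                               cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val < threshold:
        return None
    h, w = marker_template_gray.shape[:2]
    return (max_loc[0] + w / 2.0, max_loc[1] + h / 2.0)  # marker center (image coords)
```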
(C-2) The processing that causes the projector to project the projection image indicating the identified projection position, while following the movement of the marker on the sewing object during the sewing, may be changed as appropriate. After once identifying the marker position from the captured image, the processor 80 may, without performing further image capture, identify the projection position while following the movement of the marker on the basis of the identified marker position and the drive amount of the conveyance portion 21 (the product of the feed amount and the number of stitches, for example), and may project the projection image indicating the identified projection position. The processor 80 may repeat, a plurality of times, processing in which the marker position is identified from the captured image and the projection position is then identified, while following the movement of the marker, on the basis of the identified marker position, the drive amount of the conveyance portion 21, and the like. When the conveyance amount by the conveyance portion 21 during the period from the image capture by the image capture portion to the projection by the projector can be ignored, the processor 80 may project the marker position as the projection position. The predetermined amount used when calculating the projection position may be a constant that does not depend on the sewing speed, or may be set by the user. The processor 80 may cause the projection image to be projected onto the marker, for example, when the image capture is performed only once, as described above. When the processor 80 causes the image capture portion to perform the image capture the plurality of times, the processor 80 may extract, from the captured image, each of the graphic indicating the projection position and the marker, may compare the extraction results, and may correct the projection position of the graphic. When the extraction results differ from each other by equal to or greater than a predetermined amount, the processor 80 may reduce the sewing speed, or may perform an error notification using the display portion, a voice output portion, or the like.
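The first modification above, following the marker without further image capture, might be sketched as follows, using the drive amount of the conveyance portion 21 for dead reckoning; the axis convention matches the earlier sketches and is an assumption.

```python
def dead_reckon_projection(marker_pos: tuple, feed_amount_mm: float,
                           stitches_since_capture: int) -> tuple:
    # The marker approaches the needle bar (here the -Y direction, an assumed
    # convention) by the product of the feed amount and the number of stitches.
    x, y = marker_pos
    return (x, y - feed_amount_mm * stitches_since_capture)
```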
(C-3) The graphic included in the projection image as the graphic indicating the position of the marker detected by the sewing machine may be changed as appropriate. When the graphic is the line segment, an extending range of the line segment may be changed as appropriate.
(C-4) The processor 80 may cause the image capture portion to perform the image capture during a period in which the sewing object is being conveyed, during the conveyance period. A method of identifying a period, during the conveyance period, in which the sewing object is not being conveyed may be changed as appropriate. The sewing condition instructed by the marker may be changed as appropriate, and the processing performed by the processor 80 may be changed as appropriate in accordance with the sewing condition instructed by the marker. For example, when the marker is a marker instructing that the sewing speed be reduced by a predetermined amount, the processor 80 may calculate a distance, in the conveyance direction, from the marker position identified from the captured image to the position below the needle bar 51 (the needle drop position), and may reduce the sewing speed by the predetermined amount when the sewing object has been conveyed by the calculated distance. In the main processing, the processor 80 may keep the sewing speed constant, may omit the processing at step S16, step S18, and step S19, for example, and may perform the processing at step S17 subsequent to the processing at step S15. When the marker is a marker instructing the stopping of the sewing, at step S15, the processor 80 may determine whether the distance calculated at step S11 is zero, and may stop the sewing when the distance is zero (step S28). The processing from step S24 to step S27 may be omitted as appropriate. The first distance D1 and the second distance D2 may be changed as appropriate. The processor 80 may change the sewing speed in three or more stages in accordance with the distance calculated at step S11.
(C-5) The processor 80 need not necessarily change the detection range after setting the first range R1, which is part of the captured image, as the detection range at the start of the sewing. The processor 80 may set the second range R2 as the detection range at the start of the sewing. After the marker has been identified on the basis of the detection range in the single captured image (yes at step S8, step S10), when the processor 80 cannot identify the marker from the detection range in the captured image by the subsequent image capture (no at step S8, yes at step S12), the processor 80 may set the first range R1 as the detection range (step S13). After setting the third range R3 as the detection range by the processing at step S13, the processor 80 need not necessarily perform the new image capture, and may perform processing to identify the marker 68 from the third range R3 in the captured image in which the marker 68 has not been identified.
The apparatus and methods described above with reference to the various embodiments are merely examples. It goes without saying that they are not confined to the depicted embodiments. While various features have been described in conjunction with the examples outlined above, various alternatives, modifications, variations, and/or improvements of those features and/or examples may be possible. Accordingly, the examples, as set forth above, are intended to be illustrative. Various changes may be made without departing from the broad spirit and scope of the underlying principles.
This application is a continuation-in-part of International Application No. PCT/JP2018/019608 filed on May 22, 2018, which claims priority to Japanese Application JP2017-240575 filed on Dec. 15, 2017, the entire contents of both of which are hereby incorporated by reference.