This application claims priority to Japanese Patent Application No. 2014-108973 filed May 27, 2014, the content of which is hereby incorporated herein by reference.
The present disclosure relates to a sewing machine that is capable of sewing an embroidery pattern.
A sewing machine is known that can edit embroidery data of an embroidery pattern. For example, in a known sewing machine, if an editing parameter is set that is to be used to edit the embroidery data of the embroidery pattern, the sewing machine associates the editing parameter with the embroidery pattern and stores the associated data in a memory. A user may operate the sewing machine to select the embroidery pattern and the editing parameter that are stored in the memory. The sewing machine sews the selected embroidery pattern using the selected editing parameter.
The number of embroidery patterns and editing parameters stored in the memory may become large. In that case, in the above-described sewing machine, it may be difficult for the user to identify which embroidery pattern was sewn in the past and which editing parameter was used with it.
Embodiments of the broad principles derived herein provide a sewing machine that allows easy identification of an embroidery pattern and an editing parameter.
Embodiments provide a sewing machine that includes a sewing portion, an image capturing portion, a first memory, a processor, and a second memory. The sewing portion is configured to sew an embroidery pattern on a sewing workpiece. The image capturing portion is configured to capture an image. The first memory is configured to store embroidery pattern data, editing parameters, and first feature information. The embroidery pattern data includes information for sewing respective types of embroidery patterns. The editing parameters are parameters used to edit the embroidery pattern data corresponding to the respective types of embroidery patterns. The first feature information is information that indicates features of the respective types of embroidery patterns. The second memory is configured to store computer-readable instructions. The computer-readable instructions, when executed by the processor, cause the sewing machine to perform processes that include causing the image capturing portion to capture an image including the embroidery pattern sewn on the sewing workpiece, extracting second feature information from a captured image, the second feature information being information that indicates a feature of the sewn embroidery pattern, and the captured image being the image captured by the image capturing portion, identifying the sewn embroidery pattern, based on the first feature information stored in the first memory and the extracted second feature information, identifying an editing parameter corresponding to the identified embroidery pattern, from among the editing parameters stored in the first memory, and causing the sewing portion to sew the identified embroidery pattern using the identified editing parameter.
Embodiments will be described below in detail with reference to the accompanying drawings.
Embodiments will be explained with reference to the drawings. First, a physical configuration of a sewing machine 1 will be explained.
As shown in
A needle plate 21 (refer to
The sewing machine 1 further includes an embroidery frame movement mechanism (hereinafter referred to as a movement mechanism) 40. The movement mechanism 40 may be mounted on and removed from the bed 11 of the sewing machine 1.
The main body portion 41 internally includes an X axis movement mechanism (not shown in the drawings) and an X axis motor 83 (refer to
The display 15 is provided on the front surface of the pillar 12. An image including various items, such as a command, an illustration, a setting value, a message, etc., may be displayed on the display 15. A touch panel 26, which can detect a pressed position, is provided on the front surface side of the display 15. When the user performs a pressing operation on the touch panel 26 using the user's finger or a stylus pen (not shown in the drawings), the pressed position may be detected by the touch panel 26. A CPU 61 (refer to
A cover 16 is provided on an upper portion of the arm 13 such that the cover 16 may open and close. In
As shown in
An image sensor 35 is provided inside the head 14. The image sensor 35 is, for example, a known complementary metal oxide semiconductor (CMOS) image sensor. The image sensor 35 may capture an image of a specified area and may output image data of the captured image. The output image data may be stored in a specified storage area of a RAM 63 (refer to
A sewing operation of the sewing machine 1 will be briefly explained. When the embroidery pattern is sewn, the needle bar up-and-down movement mechanism 34 and the shuttle mechanism (not shown in the drawings) may be driven in synchronization with the movement of the embroidery frame 50, which is moved in the left-right direction (the X axis direction) and the front-rear direction (the Y axis direction) by the movement mechanism 40. In this manner, the embroidery pattern may be sewn on the sewing workpiece 3 held by the embroidery frame 50, by the sewing needle 7 mounted on the needle bar 6. When a normal practical pattern, which is not an embroidery pattern, is sewn, the sewing may be performed while the sewing workpiece 3 is fed by the feed dog (not shown in the drawings), in a state in which the movement mechanism 40 is removed from the bed 11.
The electrical configuration of the sewing machine 1 will be explained with reference to the drawings.
The CPU 61 performs overall control of the sewing machine 1 and executes various arithmetic calculations and processing relating to sewing, in accordance with various programs stored in the ROM 62. The ROM 62 stores the various programs to operate the sewing machine 1. The programs stored in the ROM 62 include, for example, a program that causes the sewing machine 1 to perform pattern sewing processing, which will be explained below.
The RAM 63 includes a storage area to store calculation results etc. of arithmetic processing by the CPU 61 as necessary. The flash memory 64 stores the various parameters and the like that are used for the sewing machine 1 to perform the various processing. The flash memory 64 also stores an associated data table 90 (refer to
The sewing machine motor 81 is connected to the drive circuit 71. The drive circuit 71 may drive the sewing machine motor 81 in accordance with a control signal from the CPU 61. The needle bar up-and-down movement mechanism 34 may be driven via the drive shaft (not shown in the drawings) of the sewing machine 1 in accordance with the driving of the sewing machine motor 81, and the needle bar 6 may be thus moved up and down. The X axis motor 83 is connected to the drive circuit 72. The Y axis motor 84 is connected to the drive circuit 73. The drive circuits 72 and 73 may drive the X axis motor 83 and the Y axis motor 84, respectively, in accordance with a control signal from the CPU 61. The embroidery frame 50 may be moved in the left-right direction (the X axis direction) and in the front-rear direction (the Y axis direction) in accordance with the driving of the X axis motor 83 and the Y axis motor 84, by a movement amount that corresponds to the control signal. The drive circuit 74 may cause an image to be displayed on the display 15 by driving the display 15 in accordance with a control signal from the CPU 61.
The associated data table 90 will be explained with reference to the drawings. The associated data table 90 stores, for each of the plurality of types of embroidery patterns, an embroidery pattern type, embroidery pattern data, a local feature quantity set, a histogram, an average angle value, and an average size value in association with one another.
The embroidery pattern type is data indicating each of various types of embroidery pattern shapes, such as the letter K, the letter L, a flower, a car, and the like. A plurality of embroidery pattern types are stored in the associated data table 90.
The embroidery pattern data is data that includes information to sew each of the plurality of types of embroidery patterns. Specifically, the embroidery pattern data includes a sewing order, coordinate data, and first thread color information. The coordinate data represents (relative) coordinates, in the embroidery coordinate system, of needle drop points to be used to sew the embroidery pattern. The needle drop point is a point at which the sewing needle 7, from vertically above the needle hole (not shown in the drawings), may pierce the sewing workpiece 3, when the needle bar 6 is moved downward from an upward position. By moving the embroidery frame 50 in the X axis direction and the Y axis direction based on the coordinate data and driving the needle bar 6, the sewing machine 1 sews the embroidery pattern. The first thread color information is information indicating a color of the upper thread to be used to sew the embroidery pattern.
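For illustration only, the embroidery pattern data described above might be organized as in the following Python sketch. The class and field names are hypothetical and are not taken from the embodiment, but they show one way to hold the sewing order (as the list order of the needle drop points), the coordinate data, and the first thread color information together.

    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class EmbroideryPatternData:
        """Hypothetical container for the embroidery pattern data of one pattern type."""
        pattern_type: str                               # e.g. "letter K", "flower"
        needle_drop_points: List[Tuple[float, float]]   # relative coordinates in the embroidery coordinate system;
                                                        # the list order represents the sewing order
        thread_color: Tuple[int, int, int]              # first thread color information (RGB of the upper thread)

    # Minimal example: three needle drop points sewn with a black upper thread.
    letter_k = EmbroideryPatternData(
        pattern_type="letter K",
        needle_drop_points=[(0.0, 0.0), (1.5, 2.0), (3.0, 0.0)],
        thread_color=(0, 0, 0),
    )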
The local feature quantity set is a set of a plurality of local feature quantities in the embroidery pattern. The local feature quantity is a known parameter indicating a feature. For example, a local feature quantity is disclosed in “Gradient-Based Feature Extraction SIFT and HOG, Hironobu Fujiyoshi, Information Processing Society of Japan, Research Report CVIM 160, pp. 211 to 224, September 2007” (hereinafter referred to as Reference Literature 1), the relevant portions of which are incorporated by reference.
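As a rough illustration of this kind of local feature quantity, the following sketch uses OpenCV's SIFT implementation (one example of the gradient-based features discussed in Reference Literature 1) to obtain local feature points, the sizes of their feature areas, their reference gradient directions, and 128-dimensional descriptors. It is not the sewing machine's implementation; the synthetic input image is merely a placeholder for a captured image of an embroidery pattern.

    import cv2
    import numpy as np

    # Synthetic stand-in for an image of an embroidery pattern; in practice this would
    # be an image captured by the image sensor.
    rng = np.random.default_rng(0)
    image = (rng.random((200, 200)) * 255).astype(np.uint8)

    # SIFT detects local feature points and describes each with a 128-dimensional vector.
    sift = cv2.SIFT_create()
    keypoints, descriptors = sift.detectAndCompute(image, None)

    for kp in keypoints[:5]:
        # kp.pt    : coordinates of the local feature point
        # kp.size  : diameter of the feature area around the point
        # kp.angle : reference gradient direction in degrees
        print(kp.pt, kp.size, kp.angle)

    if descriptors is not None:
        print(descriptors.shape)    # (number of local feature points, 128)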
The histogram is generated based on the local feature quantities. A method of generating the histogram is disclosed in Reference Literature 1, for example, and is briefly explained here. The CPU 61 extracts local feature points and feature areas from a reference image in which the embroidery pattern is captured, and calculates the local feature quantities. The CPU 61 carries out vector quantization on each local feature quantity. A vector-quantized local feature quantity is called a visual word. The histogram is generated from the visual words obtained from a single reference image. An example of the histogram is a first histogram 121 shown in the drawings.
The average angle value is an average value of angles of luminance gradient directions of a plurality of local feature points. The average size value is an average value of sizes of feature areas (to be explained below).
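The histogram described above is, in effect, a bag-of-visual-words representation of an embroidery pattern. The following sketch, which assumes SIFT-like 128-dimensional descriptors and an arbitrarily chosen vocabulary size of 50, learns visual words by k-means and builds a normalized histogram for one reference image; the random descriptors are placeholders standing in for the local feature quantity set of a registered pattern.

    import numpy as np
    from scipy.cluster.vq import kmeans2, vq

    def build_histogram(descriptors, vocabulary):
        """Quantize descriptors to their nearest visual word and return a normalized histogram."""
        words, _ = vq(descriptors, vocabulary)
        counts, _ = np.histogram(words, bins=np.arange(len(vocabulary) + 1))
        return counts / max(counts.sum(), 1)

    # Placeholder descriptors standing in for the local feature quantities of a reference image.
    rng = np.random.default_rng(0)
    reference_descriptors = rng.random((500, 128))

    # Learn a small vocabulary of visual words by k-means (the vocabulary size is an assumption).
    vocabulary, _ = kmeans2(reference_descriptors, 50, minit="points")

    first_histogram = build_histogram(reference_descriptors, vocabulary)
    print(first_histogram.shape, round(first_histogram.sum(), 3))    # (50,) 1.0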
The parameter data table 91 will be explained with reference to the drawings.
Registration of data in the parameter data table 91 will be explained. When a user sews an embroidery pattern, the user may perform a panel operation, for example, to select a desired embroidery pattern from among the plurality of embroidery patterns stored in the flash memory 64. The user may specify the sewing position, the size, and the rotation angle of the embroidery pattern with respect to the sewing workpiece 3 held by the embroidery frame 50. Further, the user may specify a stitch type to be used for sewing the embroidery pattern. The CPU 61 adjusts the embroidery pattern data based on the editing parameters, which include the specified sewing position, size, and rotation angle, and performs the sewing using the specified stitch type. The CPU 61 associates the embroidery pattern type with the editing parameters and registers the associated data in the parameter data table 91. In this way, the types of the sewn embroidery patterns and their editing parameters are sequentially accumulated in the parameter data table 91. An embroidery pattern 100 of the letter K is used as a specific example in the following explanation.
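To picture how such editing parameters could act on the coordinate data, the sketch below applies a size factor, a rotation angle, and a sewing position to a set of needle drop points. The conventions used (counterclockwise degrees, scaling about the pattern origin, row-vector coordinates) are assumptions for illustration, not the embodiment's actual adjustment procedure.

    import numpy as np

    def apply_editing_parameters(points, position, scale, angle_deg):
        """Scale, rotate, then translate needle drop points (illustrative conventions only).

        points    : (N, 2) array of relative coordinates in the embroidery coordinate system
        position  : (x, y) sewing position on the sewing workpiece
        scale     : size factor (1.0 keeps the original size)
        angle_deg : rotation angle in degrees, counterclockwise
        """
        theta = np.radians(angle_deg)
        rotation = np.array([[np.cos(theta), -np.sin(theta)],
                             [np.sin(theta),  np.cos(theta)]])
        return (points * scale) @ rotation.T + np.asarray(position)

    original = np.array([[0.0, 0.0], [1.5, 2.0], [3.0, 0.0]])
    edited = apply_editing_parameters(original, position=(10.0, 25.0), scale=1.2, angle_deg=30.0)
    print(edited)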
The pattern sewing processing will be explained with reference to the drawings.
When the CPU 61 detects a command to start the processing, the CPU 61 reads out, from the program storage area of the ROM 62, the program for performing the pattern sewing processing.
As shown in
The CPU 61 controls the image sensor 35 to capture, as a captured image 110, an image that includes the embroidery pattern 100 sewn on the sewing workpiece 3.
The CPU 61 extracts local feature points 131 and feature areas 132 from the captured image 110 (S4).
The local feature points 131 include a local feature point 131A and a local feature point 131B. The feature areas 132 include a feature area 132A, which has the local feature point 131A as its center, and a feature area 132B, which has the local feature point 131B as its center. In
Next, the CPU 61 performs processing to describe (calculate) the local feature quantities (S5). At S5, the CPU 61 first calculates a luminance gradient magnitude and a luminance gradient direction for each pixel inside the feature area 132 centered on a single local feature point 131. From the calculated luminance gradient magnitudes and luminance gradient directions, the CPU 61 generates a weighted histogram that is divided into 36 directions, for example. From the generated 36-direction histogram, the CPU 61 assigns the direction having the peak value as a reference gradient direction of the local feature point 131. Next, the CPU 61 performs normalization of orientation (the rotation direction). Specifically, the CPU 61 rotates the feature area 132 surrounding the local feature point 131 so that it is aligned with the reference gradient direction of that local feature point 131. By performing the orientation normalization in this manner, the CPU 61 can obtain a local feature quantity that is rotation invariant.
Next, the CPU 61 uses a Gaussian window to perform weighting such that a greater weight is assigned in the vicinity of the center of the feature area 132. The size of the Gaussian window is determined by the smoothing scale of the DoG image from which the local feature point 131 is extracted. Therefore, when the size of the embroidery pattern 100 in the captured image 110 is doubled, for example, the scale is also doubled, and the local feature quantity is obtained from the correspondingly enlarged area, that is, from the same relative area of the pattern. In this way, it is possible to obtain a local feature quantity that is invariant to changes of scale.
Next, the CPU 61 divides the feature area 132 into 16 (4×4) areas and generates an 8-direction histogram for each of the divided areas. As a result, the CPU 61 can describe the local feature quantity as a 128-dimensional vector that is invariant to changes of scale. By performing the above-described processing on all of the local feature points 131, the CPU 61 calculates the local feature quantities for all of the local feature points 131.
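The descriptor construction at S5 can be pictured with the following much simplified sketch: gradient orientation histograms with 8 direction bins are computed over a 4×4 grid of cells and concatenated into a 128-dimensional vector. The Gaussian weighting, the orientation normalization, and the interpolation of a full SIFT implementation are deliberately omitted, so this is a conceptual illustration rather than the embodiment's calculation.

    import numpy as np

    def simple_descriptor(patch):
        """128-dimensional gradient orientation descriptor of a square patch (simplified)."""
        gy, gx = np.gradient(patch.astype(np.float64))
        magnitude = np.hypot(gx, gy)
        orientation = np.mod(np.arctan2(gy, gx), 2 * np.pi)     # gradient direction in [0, 2*pi)

        cells, bins = 4, 8                 # 4 x 4 spatial cells, 8 direction bins -> 128 dimensions
        step = patch.shape[0] // cells
        descriptor = []
        for i in range(cells):
            for j in range(cells):
                mag = magnitude[i * step:(i + 1) * step, j * step:(j + 1) * step]
                ori = orientation[i * step:(i + 1) * step, j * step:(j + 1) * step]
                hist, _ = np.histogram(ori, bins=bins, range=(0, 2 * np.pi), weights=mag)
                descriptor.extend(hist)
        descriptor = np.asarray(descriptor)
        norm = np.linalg.norm(descriptor)
        return descriptor / norm if norm > 0 else descriptor

    patch = np.random.default_rng(1).random((16, 16))    # placeholder for one feature area
    print(simple_descriptor(patch).shape)                # (128,)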
Of the local features extracted at S4, the CPU 61 identifies, as the sewing features, the local features whose feature areas 132 are smaller than a specified size (S6). The CPU 61 determines whether the number of the sewing features identified at S6 is larger than a specified number N1 (S7). The specified number N1 is 30, for example. When the number of the sewing features is not larger than the specified number N1 (no at S7), the CPU 61 performs error processing (S24). The error processing is processing to notify the user that the embroidery pattern 100 cannot be recognized. In the error processing, for example, a message stating "The embroidery pattern cannot be recognized" may be displayed on the display 15. The CPU 61 then ends the pattern sewing processing.
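The sewing features are presumably local features arising from the texture of individual stitches, which would have relatively small feature areas, so the selection at S6 amounts to a simple size filter over the detected keypoints. In the sketch below the threshold value is an assumption; only the comparison with the specified number N1 follows the description above.

    import cv2

    # Placeholder keypoints; in practice these are the local features extracted at S4
    # (cv2.KeyPoint stores the feature-area diameter in .size).
    keypoints = [cv2.KeyPoint(10.0, 10.0, 5.0), cv2.KeyPoint(40.0, 22.0, 30.0)]

    SIZE_THRESHOLD = 12.0   # assumed size below which a local feature is treated as a sewing feature
    N1 = 30                 # specified number N1 from the description above

    sewing_features = [kp for kp in keypoints if kp.size < SIZE_THRESHOLD]

    if len(sewing_features) <= N1:
        # Error processing (S24): notify that the embroidery pattern cannot be recognized.
        print("The embroidery pattern cannot be recognized")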
When the number of the sewing features is larger than the specified number N1 (yes at S7), the CPU 61 identifies the stitch type of the embroidery pattern 100 that is sewn on the sewing workpiece 3 (S8). More specifically, of the local feature quantities calculated at S5, the CPU 61 extracts the local feature quantities of the sewing features identified at S6. The CPU 61 compares the extracted local feature quantities of the sewing features with reference stitch feature quantities stored in the flash memory 64 and identifies the reference stitch feature quantity that most closely approximates the extracted local feature quantities. The CPU 61 identifies the stitch type corresponding to the identified reference stitch feature quantity. In the case of the specific example, satin stitch is identified as the stitch type.
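One plausible reading of the stitch type identification at S8, sketched below under stated assumptions, is a nearest-neighbor comparison: the mean descriptor of the sewing features is compared with one stored reference feature quantity per stitch type, and the closest one is chosen. The representation of the reference stitch feature quantities as mean descriptors, the stitch type names, and the use of Euclidean distance are all assumptions.

    import numpy as np

    # Assumed representation: one reference stitch feature quantity (a mean 128-dimensional
    # descriptor) per stitch type, with placeholder values.
    reference_stitch_features = {
        "satin stitch":  np.full(128, 0.09),
        "tatami stitch": np.full(128, 0.02),
    }

    def identify_stitch_type(sewing_descriptors):
        """Return the stitch type whose reference quantity is closest to the mean sewing-feature descriptor."""
        mean_descriptor = np.mean(sewing_descriptors, axis=0)
        distances = {name: np.linalg.norm(mean_descriptor - ref)
                     for name, ref in reference_stitch_features.items()}
        return min(distances, key=distances.get)

    sewing_descriptors = np.full((40, 128), 0.08)      # placeholder sewing-feature descriptors
    print(identify_stitch_type(sewing_descriptors))    # "satin stitch" for this placeholder data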
The CPU 61 identifies an embroidery area based on the sewing features (S9). Although only some of the sewing features are illustrated in
The CPU 61 identifies the pattern features inside the embroidery area identified at S9 (S10). In this way, it is possible to exclude the local features (not illustrated in the drawings) that are located outside the embroidery area.
Next, the CPU 61 determines whether the number of the pattern features identified at S10 is larger than a specified number N2 (S11). The specified number N2 is 30, for example. When the number of the pattern features is not larger than the specified number N2 (no at S11), the CPU 61 performs the error processing (S24). The CPU 61 then ends the pattern sewing processing.
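A simple way to approximate S9 and S10, assumed here purely for illustration, is to take the bounding rectangle of the sewing feature points as the embroidery area and to keep only the local feature points inside it as pattern features.

    import numpy as np

    # Coordinates of sewing feature points and of all extracted local feature points (placeholders).
    sewing_points = np.array([[12.0, 20.0], [80.0, 25.0], [78.0, 90.0], [15.0, 88.0]])
    all_points    = np.array([[50.0, 50.0], [5.0, 5.0], [70.0, 60.0], [95.0, 95.0]])

    # Embroidery area approximated as the bounding rectangle of the sewing features (assumption).
    (x_min, y_min), (x_max, y_max) = sewing_points.min(axis=0), sewing_points.max(axis=0)

    inside = ((all_points[:, 0] >= x_min) & (all_points[:, 0] <= x_max) &
              (all_points[:, 1] >= y_min) & (all_points[:, 1] <= y_max))
    pattern_features = all_points[inside]

    N2 = 30   # specified number N2 from the description above
    if len(pattern_features) <= N2:
        # Error processing (S24): the embroidery pattern cannot be recognized.
        print("The embroidery pattern cannot be recognized")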
When the number of the pattern features is larger than the specified number N2 (yes at S11), the CPU 61 generates a histogram, using the local feature quantities of the pattern features identified at S10, from among the local feature quantities calculated at S5 (S12). A method of generating the histogram is substantially the same as that described above. A histogram 122, which is an example of the generated histogram, is shown in the drawings.
The CPU 61 compares the histogram 122 with the first histograms registered in the associated data table 90, and calculates a degree of similarity for each first histogram. The CPU 61 identifies the first histogram 121 having the highest degree of similarity (S13), and determines whether a similar histogram has been identified (S14). When the similar histogram is not identified (no at S14), the CPU 61 performs the error processing (S24) and then ends the pattern sewing processing.
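One possible similarity measure for this comparison, assumed here rather than taken from the embodiment, is histogram intersection between the normalized histogram 122 and each registered first histogram; the registered histogram with the highest score would then be the candidate identified at S13, and the check at S14 would require that score to exceed some threshold.

    import numpy as np

    def histogram_intersection(h1, h2):
        """Similarity of two normalized histograms: 1.0 for identical histograms, 0.0 for disjoint ones."""
        return float(np.minimum(h1, h2).sum())

    histogram_122 = np.array([0.1, 0.4, 0.3, 0.2])               # histogram of the sewn pattern (placeholder)
    registered = {                                               # registered first histograms (placeholders)
        "letter K": np.array([0.1, 0.5, 0.2, 0.2]),
        "flower":   np.array([0.6, 0.1, 0.1, 0.2]),
    }

    scores = {name: histogram_intersection(histogram_122, h) for name, h in registered.items()}
    best = max(scores, key=scores.get)

    SIMILARITY_THRESHOLD = 0.7    # assumed threshold for "a similar histogram is identified" (S14)
    if scores[best] > SIMILARITY_THRESHOLD:
        print("Candidate pattern:", best)                        # proceed to the matching processing (S15)
    else:
        print("The embroidery pattern cannot be recognized")     # error processing (S24)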
When the similar histogram is identified (yes at S14), the CPU 61 performs matching processing (S15). In the matching processing, from among the local feature quantity sets registered in the associated data table 90, the CPU 61 extracts a first feature quantity set that corresponds to the first histogram 121 identified at S13. Of the local feature quantities calculated at S5, the CPU 61 performs matching of the local feature quantities of the pattern features identified at S10 and the local feature quantities included in the first feature quantity set. The CPU 61 identifies the local features that have been successfully matched.
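A common way to perform this kind of descriptor matching, shown below as an assumed approach rather than the embodiment's exact method, is brute-force nearest-neighbor matching with Lowe's ratio test between the pattern-feature descriptors from the captured image and the first feature quantity set registered for the candidate pattern.

    import cv2
    import numpy as np

    # Placeholder descriptors: query descriptors stand in for the pattern features of the captured
    # image, reference descriptors for the first feature quantity set of the candidate pattern.
    rng = np.random.default_rng(2)
    descriptors_query = rng.random((60, 128)).astype(np.float32)
    descriptors_ref   = rng.random((80, 128)).astype(np.float32)

    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn_matches = matcher.knnMatch(descriptors_query, descriptors_ref, k=2)

    # Lowe's ratio test: keep a match only if it is clearly better than the second-best candidate.
    good_matches = [m for m, n in knn_matches if m.distance < 0.75 * n.distance]

    N3 = 30   # specified number N3 from the description above
    if len(good_matches) > N3:
        print("Pattern identified from", len(good_matches), "matched local features")
    else:
        print("The embroidery pattern cannot be recognized")    # error processing (S24)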
The CPU 61 determines whether the number of the local features that have been successfully matched at S15 is larger than a specified number N3 (S16). The specified number N3 is 30, for example. In the specific example, the number of local features that have been successfully matched may be larger than the specified number N3. Therefore, the embroidery pattern sewn on the sewing workpiece 3 may be identified as the embroidery pattern 100 of the letter K. When the number of local features that have been successfully matched is not larger than the specified number N3 (no at S16), the CPU 61 performs the error processing (S24). The CPU 61 then ends the pattern sewing processing.
When the number of the local features that have been successfully matched is larger than the specified number N3 (yes at S16), the CPU 61 uses the local features that have been successfully matched to identify editing parameters (S17). More specifically, the CPU 61 calculates a gravity center of the coordinates of the local feature points 131 of the local features that have been successfully matched, in the captured image 110. Based on a movement amount by which the embroidery frame 50 has been moved at S2 and the coordinates of the gravity center in the captured image 110, the CPU 61 calculates a sewing position of the embroidery pattern 100 on the sewing workpiece 3. Further, the CPU 61 calculates an average value of angles of the luminance gradient directions of the local features that have been successfully matched. The luminance gradient directions are indicated by arrows 133 in the drawings.
Based on a difference between the calculated average value of the angles of the luminance gradient directions and the average angle value registered in the associated data table 90, the CPU 61 identifies a rotation angle of the sewn embroidery pattern 100. Similarly, based on a comparison between an average value of the sizes of the feature areas 132 of the matched local features and the average size value registered in the associated data table 90, the CPU 61 identifies a size of the sewn embroidery pattern 100. The identified sewing position, size, and rotation angle are the first editing parameters.
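To make S17 concrete, the sketch below derives the three first editing parameters from the matched local features in the manner suggested above: the sewing position from the center of gravity of the matched feature points offset by the frame movement, the rotation angle from the difference between the measured and registered average angles, and the size from the ratio of the measured to the registered average feature-area size. The use of a simple ratio for the size and the way the frame movement is added to the position are assumptions.

    import numpy as np

    # Matched local features from the captured image 110 (placeholder values).
    matched_points = np.array([[120.0, 80.0], [150.0, 110.0], [135.0, 95.0]])   # feature point coordinates
    matched_angles = np.array([40.0, 35.0, 45.0])    # luminance gradient directions in degrees
    matched_sizes  = np.array([22.0, 20.0, 24.0])    # feature-area sizes

    # Values registered in the associated data table 90 (placeholders).
    registered_average_angle = 10.0
    registered_average_size  = 11.0

    frame_offset = np.array([30.0, 15.0])    # movement amount of the embroidery frame at S2 (assumed)

    sewing_position = matched_points.mean(axis=0) + frame_offset        # center of gravity plus frame movement
    rotation_angle  = matched_angles.mean() - registered_average_angle  # difference of average angles
    size_factor     = matched_sizes.mean() / registered_average_size    # ratio of average sizes (assumption)

    print(sewing_position, rotation_angle, size_factor)    # [165. 110.] 30.0 2.0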
The CPU 61 stores, in the RAM 63, the first editing parameters identified at S17 and the stitch type identified at S8 (S18). In this way, the first editing parameters and the stitch type are set in the sewing machine 1. The CPU 61 extracts second thread color information from the captured image 110 (S19). The second thread color information is information indicating a color of the upper thread of the sewn embroidery pattern 100. The CPU 61 compares the first thread color information included in the embroidery pattern data of the embroidery pattern 100 stored in the flash memory 64 with the second thread color information extracted at S19 (S20). The CPU 61 performs notification of the comparison result at S20 (S21). When, as a result of the comparison at S20, the first thread color information and the second thread color information are different, for example, a message stating "The upper thread color is different from the sewing data" may be displayed on the display 15.
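The thread color check at S19 to S21 can be pictured as comparing a representative color sampled from the sewn stitches in the captured image with the first thread color information stored in the embroidery pattern data. The sampling, the RGB representation, and the tolerance below are assumptions.

    import numpy as np

    # Second thread color information: average RGB of pixels sampled from the sewn stitches in the
    # captured image. First thread color information: the color stored in the embroidery pattern data.
    # Both values are placeholders.
    second_thread_color = np.array([200.0, 30.0, 40.0])
    first_thread_color  = np.array([60.0, 60.0, 200.0])

    TOLERANCE = 40.0    # assumed allowable color difference
    if np.linalg.norm(second_thread_color - first_thread_color) > TOLERANCE:
        # Notification of the comparison result (S21).
        print("The upper thread color is different from the sewing data")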
When the notification of the comparison result is performed at S21, the user may easily recognize a difference between the color of the upper thread indicated by the thread color information in the embroidery pattern data and the color of the upper thread sewn on the sewing workpiece 3. Thus, the user may change the color of the upper thread to match the color of the sewn upper thread, for example.
The CPU 61 is on stand-by until the CPU 61 detects a command to perform sewing (no at S22). The user may dispose a new sewing workpiece 3, on which the embroidery pattern has not been sewn, on the embroidery frame 50. The user may input the command to the sewing machine 1 to perform the sewing by operating the start/stop switch 29. When the CPU 61 detects the command to perform the sewing (yes at S22), the CPU 61 performs the sewing (S23). The CPU 61 uses the first editing parameters identified at S17 and sews the embroidery pattern 100 identified by the processing from S13 to S16. At this time, the sewing is performed using the stitch type identified at S8. In this way, an embroidery pattern that is the same as the embroidery pattern 100 may be sewn on the new sewing workpiece 3.
As described above, in the present embodiment, it is possible to automatically identify the embroidery pattern 100 and the first editing parameters, from the captured image 110 of the sewing workpiece 3 on which the embroidery pattern 100 has been sewn. Thus, the embroidery pattern 100 and the first editing parameters can be identified more easily than in a case in which the user operates the sewing machine 1 to identify the embroidery pattern 100 sewn in the past and the first editing parameters.
In the present embodiment, the embroidery pattern 100 can be identified based on the degree of similarity between the first histogram 121 and the histogram 122.
In addition to the embroidery pattern 100 and the first editing parameters, the CPU 61 can identify the stitch type (the satin stitch, for example) (S8). Therefore, the embroidery pattern, the editing parameters, and the stitch type can be more easily identified in comparison to the case in which the user operates the sewing machine 1 to identify the embroidery pattern 100 sewn in the past, the first editing parameters, and the stitch type.
Various changes may be made to the above-described embodiment. In the above-described embodiment, the stitch type is identified at S8, but the stitch type need not necessarily be identified. In this case, the processing at S6 and S7 need not necessarily be performed. Further, in this case, the identification of the embroidery area at S9 may be performed using another method. For example, the embroidery area need not be identified based on the sewing features and the embroidery area may be identified based on the pattern features.
In the above-described embodiment, the embroidery pattern is identified using the histogram, but the histogram need not necessarily be used. For example, the CPU 61 may perform matching of the local feature quantities of the pattern features and the local feature quantity sets of the associated data table 90. Then, by identifying the similar local feature quantity set, the CPU 61 may identify the embroidery pattern.
It is sufficient if the embroidery pattern is identified based on information indicating the feature of the embroidery pattern, and the information indicating the feature of the embroidery pattern need not necessarily be the histogram and the local feature quantities. For example, the information indicating the feature of the embroidery pattern may be information about a shape of the embroidery pattern in the captured image. In this case, the CPU 61 may extract the information about the shape of the embroidery pattern 100 from the captured image 110. Then, the CPU 61 may compare the extracted information about the shape of the embroidery pattern 100 with information about a shape of an embroidery pattern stored in the associated data table 90, and thus calculate a degree of similarity and identify the embroidery pattern. In this case, the degree of similarity may be a parameter based on the information about the shape of the embroidery pattern. It is sufficient if the editing parameters include at least one of the sewing position, the size, and the rotation angle of the embroidery pattern. In this case, at least one of the sewing position, the size, and the rotation angle can be easily identified.
The apparatus and methods described above with reference to the various embodiments are merely examples. It goes without saying that they are not confined to the depicted embodiments. While various features have been described in conjunction with the examples outlined above, various alternatives, modifications, variations, and/or improvements of those features and/or examples may be possible. Accordingly, the examples, as set forth above, are intended to be illustrative. Various changes may be made without departing from the broad spirit and scope of the underlying principles.