This application claims priority to Japanese Patent Application No. 2009-52022, filed Mar. 5, 2009, the disclosure of which is incorporated herein by reference in its entirety.
The present invention relates to an embroidery data generating apparatus and a computer-readable medium that stores an embroidery data generating program that generate embroidery data that are used by a sewing machine to produce a variety of images in embroidered form.
An embroidery data generating apparatus is known that generates embroidery data used in a sewing machine based on a video image or the like. With this type of embroidery data generating apparatus, a user specifies from video images, for example, an image that the user likes. A contour of an area to be embroidered is extracted from the image selected by the user, and an image of the contour is converted into embroidery data.
With the above-described example of the embroidery data generating apparatus, from among video images that are formed of a plurality of contiguous frames, a still image that is a single frame is converted to embroidery data. As a result, in some cases, a sewing machine that performs sewing based on the embroidery data cannot sew an embroidery pattern conveying a dynamic impression that brings to life features of the moving images.
Various exemplary embodiments of the general principles herein provide an embroidery data generating apparatus and a computer-readable medium storing an embroidery data generating program that are capable of generating embroidery data that causes a sewing machine to sew an embroidery pattern conveying a dynamic impression that brings to life features of moving images.
The exemplary embodiments provide an embroidery data generating apparatus that includes: a data acquisition device that acquires, from among a plurality of image data, a single image data as a target data, and at least one image data as a reference data, the reference data being different from the target data; a feature point extraction device that extracts, respectively, a first feature point representing a shape of a first target image contained in the target data, and a second feature point representing a shape of a second target image contained in the reference data; a displacement vector identification device that, based on the first feature point and the second feature point extracted by the feature point extraction device, identifies a displacement vector that is obtained when one of the first target image and the second target image is displaced to a position of the other target image; a parameter setting device that, based on the displacement vector identified by the displacement vector identification device, sets embroidery parameters that are referred to when generating embroidery data used to perform embroidery by a sewing machine; and an embroidery data generating device that refers to the embroidery parameters set by the parameter setting device and generates the embroidery data that causes the sewing machine to embroider the first target image.
The exemplary embodiments also provide a computer-readable medium storing an embroidery data generating program. The embroidery data generating program includes instructions that cause a computer to perform the steps of: acquiring, from among a plurality of image data, a single image data as a target data, and at least one image data as a reference data, the reference data being different from the target data; extracting, respectively, a first feature point representing a shape of a first target image contained in the target data, and a second feature point representing a shape of a second target image contained in the reference data; identifying, based on the first feature point and the second feature point, a displacement vector that is obtained when one of the first target image and the second target image is displaced to a position of the other target image; setting, based on the displacement vector, embroidery parameters that are referred to when generating embroidery data used to perform embroidery by a sewing machine; and referring to the embroidery parameters and generating the embroidery data that causes the sewing machine to embroider the first target image.
The exemplary embodiments also provide an embroidery data generating apparatus that includes: a frame acquisition device that respectively acquires from moving image data, of a plurality of contiguous frames included in the moving image data, a freely specified target frame, and a reference frame that includes at least one of a frame that is played back before the target frame and a frame that is played back after the target frame in a playback order of the moving image data; a feature point extraction device that extracts, respectively, a first feature point representing a shape of a first target image contained in the target frame, and a second feature point representing a shape of a second target image contained in the reference frame; a displacement vector identification device that, in accordance with the playback order of the target frame and the reference frame in the moving image data and based on the first feature point and the second feature point extracted by the feature point extraction device, identifies a displacement vector that is obtained when one of the first target image and the second target image is displaced to a position of the other target image; a parameter setting device that, based on the displacement vector identified by the displacement vector identification device, sets embroidery parameters that are referred to when generating embroidery data used to perform embroidery by a sewing machine; and an embroidery data generating device that refers to the embroidery parameters set by the parameter setting device and generates the embroidery data that causes the sewing machine to embroider the first target image.
Exemplary embodiments of the present disclosure will be described below in detail with reference to the accompanying drawings in which:
Hereinafter, exemplary embodiments of the present disclosure will be explained with reference to the appended drawings.
An embroidery data generating apparatus 1 will be explained with reference to
Next, an electrical configuration of the embroidery data generating apparatus 1 will be explained. As shown in
A hard disk drive (HDD) 15, the mouse 22, a video controller 16, a key controller 17, a CD-ROM drive 18, a memory card connector 23, and the image scanner 25 are connected to the I/O interface 14. The display 24 is connected to the video controller 16. The keyboard 21 is connected to the key controller 17. An embroidery data generating program, which is a program that controls the embroidery data generating apparatus 1, is stored on a CD-ROM 114 that is inserted into the CD-ROM drive 18. When the embroidery data generating program is installed, it is copied from the CD-ROM 114 onto the hard disk drive 15 and is stored in a program storage area 156. The memory card connector 23 can read information from and write information into a memory card 115. The image scanner 25 reads, as image data, text, a design or the like that is printed on a paper medium.
The hard disk drive 15 is provided with at least a moving image data storage area 151, a frame storage area 152, a reference value table storage area 153, a parameter setting table storage area 154, an embroidery data storage area 155, the program storage area 156 and an other information storage area 157.
Moving images that are formed of a plurality of contiguous frames (hereinafter referred to as moving image data) are stored in the moving image data storage area 151. Each of the frames that form the moving image data constitutes a single image data. In the present embodiment, the moving image data stored in the moving image data storage area 151 are, for example, acquired via a network not shown in the figures. The moving image data stored in the moving image data storage area 151 may also be read from a recording medium (the CD-ROM 114 and the memory card 115, for example).
From among the plurality of contiguous frames that form the moving image data, a target frame and a reference frame are stored in the frame storage area 152. The target frame is a frame that is selected by a user from among the frames that form the moving image data. The reference frame is a frame that either precedes or follows the target frame in the playback order of the moving image data. In the present embodiment, the reference frame stored in the frame storage area 152 is a frame that, when the moving image data are played back, is played back a predetermined period of time (one second, for example) after the target frame.
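The description does not prescribe how the target frame and the reference frame are read out of the moving image data. The following is a minimal sketch of one possible approach, assuming OpenCV is available; the function name, file path parameter and one-second offset are illustrative only and not part of the described apparatus.

```python
# A minimal sketch of acquiring a target frame and a reference frame that is
# played back a predetermined period of time (one second here) after it.
import cv2

def acquire_frames(video_path, target_time_sec, offset_sec=1.0):
    """Return (target_frame, reference_frame) read from the moving image data."""
    cap = cv2.VideoCapture(video_path)
    cap.set(cv2.CAP_PROP_POS_MSEC, target_time_sec * 1000.0)
    ok_target, target_frame = cap.read()                      # target frame
    cap.set(cv2.CAP_PROP_POS_MSEC, (target_time_sec + offset_sec) * 1000.0)
    ok_reference, reference_frame = cap.read()                 # reference frame
    cap.release()
    if not (ok_target and ok_reference):
        raise ValueError("could not read the requested frames")
    return target_frame, reference_frame
```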
A stitch definition table 200 (refer to
A configuration of the sewing machine 3 will be explained with reference to
A memory card slot 37 in which the memory card 115 can be mounted is provided on a side face of a pillar 36 of the sewing machine 3. When the memory card 115, in which the embroidery data are stored, is mounted in the memory card slot 37, the embroidery data are supplied to the sewing machine 3. The control unit (not shown in the drawings) of the sewing machine 3 automatically performs the embroidery operation described above based on the embroidery data that are supplied from the memory card 115.
The stitch definition table 200 and the parameter setting table 300 stored in the hard disk drive 15 will next be explained with reference to
As shown in
Hereinafter, each parameter will be explained in more detail while referring to a pattern that is sewn as a square shape using the fill stitches defined by the stitch definition table 200 (refer to
The angle 201 is a parameter that sets a stitching direction of the fill stitches (note that a range of the angle 201 is from “0” to “359”). In the present embodiment, the reference value parameter of the angle 201 is “zero” (refer to
The thread density 202 is a parameter that sets an interval between adjacent lines, each line being reversed at both ends of the embroidery pattern (namely, a line is a group of stitches that continue until a reversal of direction occurs). The number of lines per millimeter (threads/mm) is set as the thread density 202 (note here that a range of the thread density 202 is from "1.0" to "7.0"). In the present embodiment, the reference value parameter of the thread density 202 is "4.5 threads/mm" (refer to
The stitch pitch 203 is a parameter that sets an interval between the needle drop points used to sew a single stitch. A length of an interval between two consecutive needle drop points (in millimeters) is set as the stitch pitch 203 (note here that a range of the stitch pitch 203 is from “1.0” to “10.0”). In the present embodiment, the reference value parameter of the stitch pitch 203 is “4.0” (refer to
The deviation 204 is a parameter that sets to what degree needle drop points of adjacent lines will be displaced in a stitch direction. A ratio (%) is set as the deviation 204, taking the stitch length of the adjacent line as reference (note here that a range of the deviation 204 is from “0” to “99”). In the present embodiment, the reference value parameter of the deviation 204 is “30%” (refer to
The feather setting 205 is a parameter that is used to fade out the contour of side areas in which each line of the embroidery pattern is reversed (this technique is known as "feathering"). An execution setting 206, a side 207 and a fadeout length 208 are set as sub-parameters of the feather setting 205. As the execution setting 206, a setting is made to determine whether or not feathering will be performed on the embroidery pattern ("ON/OFF"). As the side 207, the side area of the embroidery pattern in which feathering is performed is set with respect to the stitching direction of the stitches (namely, the angle 201). More specifically, when feathering is performed on a starting point side of the stitching direction (specifically, the left side area of the embroidery pattern), "starting point side" is set as the side 207. When feathering is performed on an endpoint side of the stitching direction (specifically, the right side area of the embroidery pattern), "endpoint side" is set as the side 207. When feathering is performed on both the starting point and the endpoint sides of the stitching direction (specifically, on the left and right side areas of the embroidery pattern), "both sides" is set as the side 207. As the fadeout length 208, a stitch length (mm) is set as the length over which feathering is performed, namely the length of a stitch (measured from the point at which the direction is reversed) positioned in the side area in which feathering is performed (note here that a range of the fadeout length 208 is from "0.1" to "100.0").
In the present embodiment, as a reference value parameter of the feather setting 205, a setting is made that indicates that feathering will not be performed (namely, the execution setting 206 is “OFF”) (refer to
Even when the execution setting 206 is "ON" on the stitch definition table 200, there are cases in which neither manual setting by the user nor automatic setting (to be described later) is performed for the side 207 and the fadeout length 208. In this case, in the present embodiment, as reference value parameters when feathering is performed, the side 207 is automatically set as the "starting point side" and the fadeout length 208 is automatically set as "1.0 mm". In other words, when the execution setting 206 is "ON", as long as neither manual setting by the user nor automatic setting (to be described later) is performed, a standard type of feathering is performed on the embroidery pattern, as shown by the sewing mode 58 (refer to
As shown in
The above parameters of the parameter setting table 300 can be set manually by the user. In other words, of the parameters included in the stitch definition table 200, the user can specify, on the parameter setting table 300, the parameters that will be the target of automatic setting (to be described later).
The parameter setting table 300 in
When all of the parameters of the parameter setting table 300 are set as “No automatic setting”, the fill stitches defined by the stitch definition table 200 are not changed for as long as the settings are not manually changed by the user. On the other hand, when there are parameters on the parameter setting table 300 that are set to “Automatic setting”, the fill stitches defined by the stitch definition table 200 are changed in accordance with those parameters. This process will be described in more detail later.
Next, a processing procedure by which the embroidery data generating apparatus 1 according to the present embodiment generates embroidery data from moving image data will be explained with reference to
As shown in
More specifically, in an example shown in
Feature points are then extracted from the image data input at step S1 (step S3). At step S3, image analysis is performed on the target frame and the reference frame, respectively, and feature points are extracted that represent the shape of the design included in each of the frames. More specifically, by performing image analysis on the frame 100B that is the target frame, feature points P (shown as hollow points) that represent the shape of the arrow design 110B are extracted, as shown in
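The embodiment does not fix a particular image analysis method for step S3. A minimal sketch of one possibility, assuming OpenCV 4 and Otsu thresholding, is shown below; the contour selection and the approximation tolerance are assumptions, since the description only requires some set of points that represents the shape of the design.

```python
# A minimal sketch of extracting feature points that represent the shape of
# the largest design in a frame (OpenCV 4.x findContours signature assumed).
import cv2
import numpy as np

def extract_feature_points(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    design = max(contours, key=cv2.contourArea)        # largest design in the frame
    epsilon = 0.01 * cv2.arcLength(design, True)       # approximation tolerance (assumed)
    approx = cv2.approxPolyDP(design, epsilon, True)
    return approx.reshape(-1, 2).astype(np.float64)    # feature points as (x, y) pairs
```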
Based on the feature points extracted at step S3, displacement information for the image data input at step S1 is acquired (step S5). At step S5, a barycentric position of the feature points extracted at step S3 is calculated for the target frame and the reference frame, respectively. The barycentric position of the feature points is calculated, for example, as the coordinate position obtained by taking the mean of the coordinate positions of the feature points. A direction and a distance from the barycentric position of the target frame to the barycentric position of the reference frame (namely, a displacement vector) are calculated as displacement information of the image data.
More specifically, the mean value coordinate position of the feature points P of the arrow design 110B (refer to
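A minimal sketch of step S5 under these definitions follows: the barycenter of each feature point set is the mean of its coordinates, and the displacement vector is the vector from the barycenter of the target frame to that of the reference frame. Any scaling from pixel coordinates to millimeters is omitted and would be an additional assumption.

```python
# A minimal sketch of step S5: barycenters and the displacement vector R.
import numpy as np

def displacement_vector(target_points, reference_points):
    barycenter_target = np.mean(target_points, axis=0)        # barycenter of points P
    barycenter_reference = np.mean(reference_points, axis=0)   # barycenter of points P'
    r = barycenter_reference - barycenter_target               # displacement vector R
    length = float(np.linalg.norm(r))                          # distance component of R
    return r, length
```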
Based on the displacement vector acquired at step S5, the parameters that define the fill stitches are set (step S7). At step S7, the parameters of the stitch definition table 200 corresponding to the parameters that are set as “Automatic setting” on the parameter setting table 300 are automatically set based on the displacement vector acquired at step S5. This process will be explained in more detail later.
Based on the image data input at step S1, embroidery data are generated to be used in performing embroidery by the sewing machine 3 (step S9). At step S9, in a similar way to generation of embroidery data from image data in known art, the design included in the target frame (in the present embodiment, the arrow design 110B included in the frame 100B) is converted to embroidery data. More specifically, the embroidery data are generated that cause the sewing machine 3 to sew the embroidery pattern of the design that is included in the target frame, with the fill stitches defined by the stitch definition table 200. The embroidery data generated at step S9 are saved to the embroidery data storage area 155.
The embroidery data that are generated at step S9 are, for example, displayed on the display 24. The user confirms the embroidery data displayed on the display 24 and specifies whether to end or to redo the embroidery data generation process. When the user specifies to end the embroidery data generation process (yes at step S11), the main process is ended. When the user specifies to redo the embroidery data generation process (no at step S11), the process returns to step S7. Before specifying to redo the embroidery data generation process, by changing the settings on the stitch definition table 200 and the parameter setting table 300, the user can acquire embroidery data of a sewing mode that differs from the one previously acquired (step S7 and step S9).
After ending the main process (refer to
The relationship between the parameters set at step S7 and the embroidery data generated at step S9 will be explained with reference to
In the example of the parameter setting table 300 shown in
In an example of the parameter setting table 300 shown in
When “Automatic setting” is set for the angle 301, the orientation of the displacement vector acquired at step S5 is set as the angle 201 in the stitch definition table 200. For example, when the orientation of the above-described displacement vector R (refer to
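A minimal sketch of this automatic setting of the angle 201 is given below. Negating the y component assumes image coordinates that grow downward, which the description does not state; the result is kept within the documented range of 0 to 359 degrees.

```python
# A minimal sketch of setting the angle 201 from the orientation of the
# displacement vector R (given as a 2-element array of x and y components).
import numpy as np

def set_angle(r):
    angle = np.degrees(np.arctan2(-r[1], r[0]))   # orientation of R, counter-clockwise
    return int(round(float(angle))) % 360          # angle 201, within 0..359
```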
When “Automatic setting” is set for the thread density 302, the thread density 202 in the stitch definition table 200 is set to a value that corresponds to the length of the displacement vector acquired at step S5. More specifically, when the displacement vector R is longer than the reference displacement vector length 306, a value that is smaller than the reference value parameter (“4.5 threads/mm” in the present embodiment) is set as the thread density 202. When the displacement vector R is shorter than the reference displacement vector length 306, a value that is greater than the reference value parameter is set as the thread density 202. The value that is set as the thread density 202 is a value obtained by dividing the reference value parameter by a ratio of the length of the displacement vector R to the reference displacement vector length 306. For example, when the length of the displacement vector R is “112.5 mm”, as it is longer than the reference displacement vector length 306 of “50 mm” (refer to
When the stitch pitch 303 is set as “Automatic setting”, the stitch pitch 203 on the stitch definition table 200 is set to a value that corresponds to the length of the displacement vector acquired at step S5. More specifically, when the displacement vector R is longer than the reference displacement vector length 306, a value that is larger than the reference value parameter (“4.0 mm” in the present embodiment) is set as the stitch pitch 203. When the displacement vector R is shorter than the reference displacement vector length 306, a value that is smaller than the reference value parameter is set as the stitch pitch 203. The value that is set as the stitch pitch 203 is a value obtained by multiplying the reference value parameter by a ratio of the length of the displacement vector R to the reference displacement vector length 306. For example, when the length of the displacement vector R is “75 mm”, as this is longer than the reference displacement vector length 306 of “50 mm” (refer to
When the feather side 304 is set to “Automatic setting”, the starting point side of the displacement vector acquired at step S5 is set as the side 207 in the stitch definition table 200 (refer to
When the feather fadeout length 305 is set to “Automatic setting”, the fadeout length 208 in the stitch definition table 200 is set to a value that corresponds to the length of the displacement vector acquired at step S5. More specifically, when the displacement vector R is longer than the reference displacement vector length 306, a value that is larger than the reference value parameter at the time of performing feathering (“1.0 mm” in the present embodiment) is set as the fadeout length 208. When the displacement vector R is shorter than the reference displacement vector length 306, a value that is smaller than the reference value parameter is set as the fadeout length 208. The value that is set as the fadeout length 208 is a value obtained by multiplying the reference value parameter by a ratio of the length of the displacement vector R to the reference displacement vector length 306. For example, when the length of the displacement vector R is “150 mm”, as this is longer than the reference displacement vector length 306 of “50 mm” (refer to
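The three length-dependent settings above follow the same ratio rule: the reference value parameter is divided by the ratio of the displacement vector length to the reference displacement vector length 306 for the thread density 202, and multiplied by that ratio for the stitch pitch 203 and the fadeout length 208. A minimal sketch is given below, assuming the reference value parameters of the present embodiment; the clamping to the documented ranges is an added assumption, not stated in the description.

```python
# A minimal sketch of the ratio-based automatic settings, assuming a
# reference displacement vector length 306 of 50 mm.
import numpy as np

REFERENCE_LENGTH_MM = 50.0            # reference displacement vector length 306

def set_ratio_parameters(r_length_mm,
                         density_ref=4.5,    # thread density 202, threads/mm
                         pitch_ref=4.0,      # stitch pitch 203, mm
                         fadeout_ref=1.0):   # fadeout length 208, mm
    ratio = r_length_mm / REFERENCE_LENGTH_MM
    thread_density = float(np.clip(density_ref / ratio, 1.0, 7.0))    # longer vector, sparser lines
    stitch_pitch = float(np.clip(pitch_ref * ratio, 1.0, 10.0))       # longer vector, longer stitches
    fadeout_length = float(np.clip(fadeout_ref * ratio, 0.1, 100.0))  # longer vector, longer fadeout
    return thread_density, stitch_pitch, fadeout_length

# With the lengths used in the examples above: 112.5 mm gives 4.5 / 2.25 = 2.0
# threads/mm, 75 mm gives 4.0 * 1.5 = 6.0 mm, and 150 mm gives 1.0 * 3.0 = 3.0 mm.
```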
When the feather side 304 is set to “Automatic setting”, even if the execution setting 206 on the stitch definition table 200 is “OFF”, feathering is performed on the embroidery pattern. In other words, even when the execution setting 206 is set to “OFF”, if the feather side 304 is set to “Automatic setting”, the execution setting 206 is set to “ON” at step S7, and the side 207 is set based on the above-described displacement vector.
When the parameter setting table 300 shown in
As described above, in the embroidery data generating apparatus 1 according to the present embodiment, a freely specified target frame (the frame 100B) and a reference frame (the frame 100C) that follows the target frame are acquired from among a plurality of contiguous frames that form moving image data. Based on the feature points P of the arrow design 110B that is included in the frame 100B and the feature points P′ of the arrow design 110C that is included in the frame 100C, the displacement vector R obtained when the arrow design 110B is displaced is acquired. While referring to the parameters of the stitch definition table 200 that are set based on the displacement vector R, the embroidery data generating apparatus 1 generates the embroidery data to enable the sewing machine 3 to sew the arrow design 110B in embroidered form. In this way, the embroidery data generating apparatus 1 can generate embroidery data that causes the sewing machine 3 to sew a dynamic embroidery pattern that brings to life the features of a moving image. Additionally, the sewing machine 3 that performs the sewing operation based on this embroidery data can represent the displacement of the arrow design 110B using the embroidery pattern, in accordance with the parameters set on the stitch definition table 200 (namely, the angle of the stitches, the stitch pitch, the thread density and feathering when the sewing machine 3 performs embroidering).
Note that the embroidery data generating apparatus 1 according to the present disclosure is not limited to the above-described exemplary embodiment, and various modifications can of course be made. In the above-described exemplary embodiment, examples are described in which a plurality of parameters (stitch direction, stitch length, stitch density, feather settings) are referred to when the embroidery data are generated, but the embroidery data may be generated as long as at least one parameter is referred to.
In the above-described exemplary embodiment, when setting the parameters at step S7, the stitch definition table 200 is set based on the parameter setting table 300 and the displacement vector R, but a new set value table that is different from the stitch definition table 200 may be used to perform parameter settings. At step S9, the embroidery data may be generated based on the new set value table. In this case, none of the set values in the stitch definition table 200 are changed, and thus the stitch definition table 200 is maintained in the default state. When the embroidery data are next generated (no at step S11), the embroidery data may be generated at step S9 based on the stitch definition table 200 that is in the default state. In this way, the user can be saved the time and effort of resetting the stitch definition table 200 to the default state.
In the above-described exemplary embodiment, by the image input performed at step S1, a frame that is played back a predetermined period of time (1 second, for example) after a target frame when moving image data are played back is acquired as a reference frame. However, the reference frame is not limited to this example. The reference frame may be a frame that is played back a predetermined number of frames (5 frames, for example) after the target frame, when the moving image data are played back.
In the above-described exemplary embodiment, by the image input performed at step S1, a frame that follows the target frame is acquired as the reference frame, but the reference frame is not limited to this example. The reference frame may be a frame that is played back a predetermined period of time (1 second, for example) or a predetermined number of frames (5 frames, for example) before the target frame, when the moving image data are played back.
For example, in the example shown in
In addition, by the image input performed at step S1, a frame preceding the target frame and a frame following the target frame may both be acquired as the reference frames. For example, in the example shown in
In the above-described exemplary embodiment, the image input is performed at step S1, and the target frame and the reference frame are acquired from the moving image data. However, the method of acquiring the target frame and the reference frame is not limited to this example. For example, two image data selected and input by the user may be acquired as the target frame and the reference frame. The two image data may be, for example, image data acquired via a network that is not shown in the drawings, or image data read by the image scanner 25. It is, however, preferable that the images of the two image data have a common design that will be sewn as the embroidery pattern, and that the movement of the design has continuity. It is also preferable that the user is caused to input information indicating to which of the target frame and the reference frame the two image data respectively correspond (for example, the user is caused to select, from the two image data, a target image from which the embroidery data are to be generated). In this way, even when the frames are not acquired from moving image data, the embroidery data generating apparatus 1 can generate embroidery data that causes the sewing machine 3 to sew a dynamic embroidery pattern from a plurality of image data.
In the above-described exemplary embodiment, the embroidery data generating apparatus 1 is a personal computer, but the embroidery data generating apparatus 1 is not limited to this example. By storing the embroidery data generating program in the sewing machine 3, the sewing machine 3 may function as the embroidery data generating apparatus 1 and generate the embroidery data.
The apparatus and methods described above with reference to the various embodiments are merely examples. It goes without saying that they are not confined to the depicted embodiments. While various features have been described in conjunction with the examples outlined above, various alternatives, modifications, variations, and/or improvements of those features and/or examples may be possible. Accordingly, the examples, as set forth above, are intended to be illustrative. Various changes may be made without departing from the broad spirit and scope of the underlying principles.