This application is based upon and claims the benefit of priority under 35 USC 119 of Japanese Patent Application No. 2018-012964 filed on Jan. 29, 2018, the entire disclosure of which, including the description, claims, drawings, and abstract, is incorporated herein by reference in its entirety.
This application relates generally to an application device, an ink application method, and a non-transitory recording medium.
Printing devices that print a print-target image on a print medium in accordance with movement of the device on the print medium are known.
For example, Patent Literature (Unexamined Japanese Patent Application Kokai Publication No. H10-35034) discloses a manually-moved printing device that the user scans by hand on a recording medium to print on the recording medium. Specifically, as the printing device disclosed in the Patent Literature is manually scanned on a recording medium by the user, the device ejects ink from its print head onto the recording medium in accordance with the amount of movement of the device. Furthermore, when the printing device disclosed in the Patent Literature is scanned in the direction opposite to the ordinary direction, the device decorates the printed characters, for example by making them bold or underlining them.
The printing device disclosed in the Patent Literature decorates only characters that are printed by the device itself. On the other hand, there is a demand for applying ink based on the luminance distribution of characters and the like that preexist on an application target, rather than only characters and the like printed by the device itself.
The present disclosure advantageously provides an application device, an ink application method, and a non-transitory recording medium that make it possible to apply ink based on a captured image of an application surface of an application target.
According to an embodiment of the present disclosure, the following is provided.
The application device according to the present disclosure is an application device, comprising:
a sensor that detects movement of the application device on an application target;
a camera that obtains a captured image that is an image of a surface of the application target and is captured during the movement of the application device;
a head that applies ink; and
a processor,
wherein the processor
specifies an ink application area based on the captured image of the surface of the application target that is captured by the camera in accordance with the movement that is detected by the sensor, and
controls the head to apply ink to the ink application area in accordance with the movement.
The ink application method according to the present disclosure is a method executed by an application device that comprises a head for applying ink, the method including:
detecting movement of the application device on an application target;
obtaining a captured image that is an image of a surface of the application target and is captured during the movement of the application device;
specifying an ink application area based on the captured image of the surface of the application target that is captured in accordance with the movement that is detected, and
controlling the head to apply ink to the ink application area in accordance with the movement.
The non-transitory computer-readable recording medium according to the present disclosure is a recording medium on which a program is recorded, the program allowing a computer of an application device that comprises a head for applying ink to execute the processing of:
detecting movement of the application device on an application target;
obtaining a captured image that is an image of a surface of the application target and is captured during the movement of the application device;
specifying an ink application area based on the captured image of the surface of the application target that is captured in accordance with the movement that is detected, and
controlling the head to apply ink to the ink application area in accordance with the movement.
A more complete understanding of this application can be obtained when the following detailed description is considered in conjunction with the accompanying drawings.
An embodiment of the present disclosure will be described below with reference to the drawings. Here, the same or corresponding parts are referred to by the same reference numbers.
The application target 30 is, for example, print paper, a label, cardboard, or the like. The material of the application target 30 is not restricted to paper and may be, for example, a film, chemical fiber, resin, metal, or the like; any material to which ink can adhere may be used. The surface of the application target 30 to which ink is applied is not necessarily planar and may be curved, namely a surface that is more or less bulged or hollowed. Ink is an application material (paint) applied to the application target 30 to print the print-target image. Here, ink is not necessarily liquid and may be solid or gelled. Moreover, ink may be dye ink, pigment ink, or the like and may be formed of any material as long as it is applicable.
The print-target image is formed on the application target 30 by applying ink while the user holds the application device 10 by hand and slidingly moves it on the application target 30 in a prescribed moving direction, as shown in the drawings.
Here, the moving direction of the application device 10 is defined as the +Y direction, and the direction that is orthogonal to the moving direction on the application surface is defined as the X direction.
As shown in the drawings, the application device 10 comprises a processor 11, a storage 12, a user interface 13, a power supply 14, a communicator 15, a movement detector 16, an imager 17, an image processor 18, and an applicator 19.
The processor 11 comprises a central processing unit (CPU), a read only memory (ROM), and a random access memory (RAM). The CPU is, for example, a microprocessor or the like that executes various kinds of processing and arithmetic operations. In the processor 11, the CPU is connected to the parts of the application device 10 via a system bus and functions as control means that controls the entire application device 10 while reading control programs stored in the ROM and using the RAM as work memory. Moreover, the processor 11 comprises a clock, such as a real time clock (RTC), that measures time.
The storage 12 is a nonvolatile memory such as a flash memory and a hard disc. The storage 12 stores programs and data that are used by the processor 11 to execute various kinds of processing. For example, the storage 12 saves display and print data such as characters, symbols, and emoji, and tables in which various print settings are stated. Moreover, the storage 12 stores data that are generated or acquired as a result of the processor 11 executing various kinds of processing.
The user interface 13 comprises an input receiver, such as input keys, buttons, switches, a touch pad, or a touch panel, and a display, such as a liquid crystal panel or light emitting diodes (LEDs). The user interface 13 receives various kinds of operation orders from the user via the input receiver and transmits the received operation orders to the processor 11. Moreover, the user interface 13 acquires various kinds of information from the processor 11 and displays images that indicate the acquired information on the display.
The power supply 14 comprises a battery and a voltage detector, and generates and supplies to the parts of the application device 10 the power necessary for their operation.
The communicator 15 comprises an interface for the application device 10 to communicate with an external device. The external device is, for example, a terminal device such as a personal computer, a tablet terminal, and a smartphone. The communicator 15 communicates with the external device via, for example, USB (universal serial bus), a local area network (LAN) such as wireless fidelity (Wi-Fi), Bluetooth (registered trademark), or the like. The communicator 15 acquires various kinds of data including print data from the external device via such wired or wireless communication under the control of the processor 11.
The movement detector 16 is provided in the lower part of the application device 10 and detects movement of the application device 10 while the application device 10 moves on the application target 30. Specifically, the movement detector 16 comprises a light emitter such as an LED that emits light toward the surface of the application target 30, and an optical sensor that reads light emitted by the light emitter and reflected on the surface of the application target 30. The movement detector 16 reads light emitted by the LED with the optical sensor and detects the amount of movement and the moving direction of the application device 10 based on change in the read light. The movement detector 16 functions as a sensor.
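As a rough illustration of how such sensor readings can be turned into a position, the following sketch accumulates per-sample displacements into running coordinates. This is a minimal sketch, assuming a hypothetical read_delta driver call that stands in for the optical sensor of the movement detector 16, and assuming displacements are reported in millimeters.

    class MovementDetector:
        """Accumulates displacement samples from an optical sensor."""

        def __init__(self):
            self.x = 0.0  # accumulated movement in the X direction, in mm
            self.y = 0.0  # accumulated movement in the Y direction, in mm

        def poll(self, read_delta):
            """Add one (dx, dy) sample reported by the optical sensor."""
            dx, dy = read_delta()  # hypothetical driver call
            self.x += dx
            self.y += dy
            return self.x, self.y  # position relative to the scanning start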
The imager 17 is a so-called camera and comprises a lens that collects light emitted by an object, an imaging element such as a charge coupled device (CCD) and a complementary metal oxide semiconductor (CMOS) that receives the collected light and acquires an image of the object, and an analog/digital (A/D) converter that converts data indicating a captured image sent by the imaging element as electric signals to digital data. While the application device 10 moves on the application target 30, the imager 17 captures images of the surface of the application target 30 and supplies to the processor 11 the captured images that are obtained through the imaging.
The image processor 18 comprises an image processing processor, such as a digital signal processor (DSP) or a graphics processing unit (GPU), and a buffer memory that temporarily saves images to be processed, and it processes captured images obtained by the imager 17 under the control of the processor 11. For example, the image processor 18 executes recognition processing such as edge recognition, character recognition, and object recognition on the captured images using a known image recognition technique.
The applicator (ink head) 19 is an application mechanism (print mechanism) that executes printing by applying ink to the surface of the application target 30. The applicator 19 applies ink to the surface of the application target 30 in an inkjet system, in which ink filled in an ink tank is atomized and sprayed directly onto the application target 30. The applicator 19 functions as a head.
As an example, the applicator 19 ejects ink in a thermal system. Specifically, in the applicator 19, multiple nozzles are arrayed in the main scan direction (the X direction) and the sub scan direction (the Y direction). Ink within the multiple nozzles is heated by a heater to create bubbles, and the created bubbles cause the ink to be ejected vertically downward toward the application target 30 from each of the multiple nozzles. On this principle, the applicator 19 applies ink to the surface of the application target 30.
Moreover, the imager 17 and the applicator 19 are arranged side by side in the moving direction of the application device 10 such that the lens of the imager 17 and the nozzles of the applicator 19 are separated by a distance L, with the lens positioned on the leading side in the moving direction.
With the above arrangement of the imager 17 and the applicator 19, the imager 17 captures an image of an area on the application target 30 to which ink is to be applied before the applicator 19 applies ink to that area. The applicator 19 reaches the position of which the imager 17 captured an image after the application device 10 has moved over the distance L from the imaging.
For applying ink to the ink application area 31, the user scans the application device 10 on the ink application area 31 from one end to the other, as shown in the drawings.
The applicator 19 applies ink to the ink application area 31 based on the captured images of the ink application area 31 that are captured by the imager 17. Specifically, the applicator 19 applies ink to the specified ink application area 31 in a pattern based on the luminance distribution in the ink application area 31 of which the images are captured by the imager 17, in accordance with the movement of the application device 10 detected by the movement detector 16.
Here, the luminance distribution in the ink application area 31 means the positional distribution of color shades in the ink application area 31. For example, when at least one character is depicted in the ink application area 31, the character portion is relatively dark and the background portion around it is relatively light, and the luminance distribution represents the arrangement of these light and dark portions.
In more detail, the applicator 19 applies ink to the background portion in the ink application area 31 to print a background image.
The imaging controller 110 controls imaging by the imager 17. Specifically, the imaging controller 110 makes the imager 17 capture an image with prescribed imaging timing while the application device 10 is scanned on the application target 30 by the user. A time for the imager 17 to capture an image comes each time a prescribed time has elapsed while the application device 10 is scanned on the application target 30. The prescribed time is preset, for example, to a value from several milliseconds to several hundred milliseconds or so. The imaging controller 110 is realized by the processor 11 cooperating with the imager 17.
The imager 17 repeatedly captures an image each time a prescribed time has elapsed under the control of the imaging controller 110 while the application device 10 moves on the application target 30. As a result, the imager 17 sequentially captures, while the application device 10 moves on the application target 30, images of multiple areas that are each a portion of the ink application area 31. In other words, the imaging range over which the imager 17 can capture an image at a time is limited to a range of the width W in the moving direction of the application device 10. Therefore, the imager 17 captures images of the ink application area 31 in partial areas instead of capturing an image of the entire ink application area 31 at a time.
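A minimal sketch of this time-triggered imaging follows, assuming hypothetical camera.capture() and detector.position() calls; the 50 ms interval is only an example within the range stated above.

    import time

    def capture_strips(camera, detector, stop_requested, interval_s=0.05):
        """Capture a width-W strip each time the prescribed time elapses."""
        strips = []
        next_shot = time.monotonic()
        while not stop_requested():
            now = time.monotonic()
            if now >= next_shot:  # a prescribed time has elapsed
                image = camera.capture()          # strip of width W
                position = detector.position()    # movement since imaging start
                strips.append((position, image))  # keep position information
                next_shot = now + interval_s
        return strips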
Specifically, at a first imaging time, at which the elapsed time since the imaging start time is 0, the imager 17 captures an image of a first area 32a of the width W at one end of the ink application area 31.
After capturing an image of the first area 32a, the application device 10 moves on the ink application area 31 from one end to the other. For example, provided that the moving speed of the application device 10 is represented by V, at a second imaging time when a time T1 has elapsed since the imaging start, the lens of the imager 17 has moved over a distance (V×T1) from the imaging start position, and the imager 17 captures an image of a second area 32b at that position.
Similarly, at a third imaging time when a further prescribed time has elapsed, the imager 17 captures an image of a third area 32c.
As described above, while the application device 10 moves on the ink application area 31, the imager 17 captures an image each time a prescribed imaging time comes, thereby capturing images of the multiple areas 32a, 32b, 32c, . . . of the width W within the ink application area 31 in sequence. The captured images captured by the imager 17 are associated with position information of the areas 32 of which they are captured and are then stored in the storage 12. The position information is expressed based on the amount of movement of the application device 10 since the imaging start time, as detected by the movement detector 16.
Returning to the description of the functional configuration of the application device 10, the image data generator 120 is described next.
The image data generator 120 determines the application pattern based on the captured images of the ink application area 31 that are captured by the imager 17. Specifically, as the imager 17 captures an image of any of the multiple areas 32a, 32b, 32c, . . . , the image data generator 120 determines the application pattern based on the luminance distribution in the area 32 of which an image is captured, and generates image data.
Specifically, the image data generator 120 calculates the brightness at each position within the image for each of the captured images 40a, 40b, 40c, . . . . Then, the image data generator 120 identifies a first area where the brightness is higher than a threshold (namely, a relatively light portion) and a second area where the brightness is lower than the threshold (namely, a relatively dark portion) in each of the captured images 40a, 40b, 40c, . . . . Consequently, the image data generator 120 identifies the first area, where the brightness is higher than the threshold, as the background portion and the second area, where the brightness is lower than the threshold, as the character portion.
Identifying the character portion and the background portion as described above, the image data generator 120 determines the application pattern based on the identification results. For example, for applying ink of a desired color to the background portion other than the character portion “ABC” in the ink application area 31, the image data generator 120 determines the application pattern to apply ink of a desired color to the background portion and apply no ink of any color to the character portion in each of the captured images 40a, 40b, 40c, . . . .
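The following sketch shows this thresholding step under the assumption that each captured image 40 arrives as a grayscale numpy array with values from 0 to 255; the threshold value of 128 is illustrative, not taken from the text.

    import numpy as np

    def application_pattern(strip, threshold=128):
        """Split a captured strip into background and character portions."""
        first_area = strip > threshold   # relatively light: background portion
        second_area = ~first_area        # relatively dark: character portion
        # For the pattern described above, ink of the desired color goes on
        # the background and no ink goes on the characters, so the background
        # mask is itself the application pattern (True = apply ink).
        return first_area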
Determining the application pattern as described above, the image data generator 120 generates an image that indicates the determined application pattern (an application image). Specifically, the image data generator 120 generates application images 41a, 41b, 41c, . . . that correspond to the captured images 40a, 40b, 40c, . . . , respectively.
Here, in each of the application images 41a, 41b, 41c, . . . , ink is to be applied to the background portion, whereas the character portion is left blank.
As described above, each time the captured image 40 is obtained by the imager 17, the image data generator 120 generates, based on the captured image 40, the application image 41 that indicates a pattern of ink to apply to the area of which the captured image 40 is captured. Generating the application image 41, the image data generator 120 generates nozzle data 42 based on the generated application image 41. The nozzle data 42 are data for applying ink to the ink application area 31 from the nozzles of the applicator (ink head) 19 in the application pattern determined by the image data generator 120.
For example, the nozzle data 42 state, for each position in the moving direction of the application device 10, which of the nozzles of the applicator 19 are to eject ink, in accordance with the application images 41a, 41b, 41c, . . . .
Generating a new application image 41, the image data generator 120 generates nozzle data 42 with the newly generated application image 41. Then, the image data generator 120 repeatedly updates the existing nozzle data 42 with the newly generated application image 41 each time the application images 41a, 41b, 41c, . . . are generated in sequence. The image data generator 120 is realized by the processor 11 cooperating with the image processor 18. The image data generator 120 functions as image data generation means.
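One plausible encoding of the nozzle data 42 is sketched below: each column of the application image 41 is packed into a bitmask with one bit per nozzle. The row-per-nozzle layout and the bitmask format are assumptions for illustration, not the device's actual data format.

    def to_nozzle_data(application_image):
        """Convert a boolean application image into per-column nozzle masks.

        application_image: boolean numpy array (e.g. the output of
        application_pattern above), one row per nozzle (X direction) and
        one column per position along the moving direction (Y direction).
        """
        data = []
        for column in application_image.T:  # one column per movement step
            mask = 0
            for bit, fire in enumerate(column):
                if fire:
                    mask |= 1 << bit  # set the bit of each nozzle to fire
            data.append(mask)
        return data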
The application controller 130 controls application of ink by the applicator 19. Specifically, as movement of the application device 10 is detected by the movement detector 16, the application controller 130 outputs the content of the nozzle data 42 that are generated by the image data generator 120 to the applicator 19 in time with the detected movement. Then, the application controller 130 energizes the corresponding dots of the applicator 19 to eject ink from its nozzles. As a result, printing is executed. The application controller 130 is realized by the processor 11 cooperating with the applicator 19. The application controller 130 functions as application control means.
Under the control of the application controller 130, the applicator 19 applies ink to the ink application area 31 in the pattern based on the captured images 40 of the ink application area 31 captured by the imager 17, in accordance with the movement of the application device 10 detected by the movement detector 16. In more detail, after the imager 17 starts capturing images of the ink application area 31, each time the movement of the application device 10 brings the applicator 19 to an area of the ink application area 31 of which an image has been captured by the imager 17, the applicator 19 applies ink to that area in accordance with the detected movement.
Specifically, at the time the imager 17 starts capturing images at one end of the ink application area 31, the nozzles of the applicator 19, which trail the lens of the imager 17 by the distance L, are still positioned off the ink application area 31, and no ink is applied yet.
Subsequently, when the position of the applicator 19 reaches the ink application area 31, the applicator 19 starts applying ink to the ink application area 31. Specifically, the applicator 19 starts applying ink to the ink application area 31 when the movement detector 16 detects movement over the distance L, which separates the position where the applicator 19 is provided from the position where the imager 17 is provided, from the position where the imager 17 started capturing images of the ink application area 31.
For example, when the nozzles of the applicator 19 reach the first area 32a, the applicator 19 applies ink to the first area 32a according to the nozzle data 42 generated from the captured image 40a of the first area 32a.
Furthermore, as the application device 10 moves further, the applicator 19 applies ink to the second area 32b, the third area 32c, and so on in sequence, each according to the nozzle data 42 generated from the corresponding captured image 40.
As described above, when the applicator 19 reaches each of the multiple areas 32a, 32b, 32c, . . . of which images are captured by the imager 17 through the movement of the application device 10, the applicator 19 applies ink to the reached area 32 in the pattern based on the luminance distribution in that area 32. For each of the multiple areas 32a, 32b, 32c, . . . , the image data generator 120 executes the processing of generating the application image 41 from the captured image 40 of the area 32 and the processing of generating the nozzle data 42 from the generated application image 41 before the applicator 19 reaches the area 32, namely while the application device 10 moves over the distance L. The applicator 19 applies ink to the ink application area 31 according to the nozzle data 42 generated by the image data generator 120, in accordance with the amount of movement of the application device 10 on the ink application area 31.
Finally, when the applicator 19 has passed over the other end of the ink application area 31, ink has been applied to the entire ink application area 31 and the application is complete.
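The distance-L lag between the lens and the nozzles can be handled with a simple queue, as in the sketch below; the numeric values, the column pitch, and the eject callback are illustrative assumptions, not values from the text.

    from collections import deque

    L_MM = 30.0     # lens-to-nozzle distance L, in mm (illustrative)
    PITCH_MM = 0.1  # spacing between nozzle-data columns, in mm (illustrative)

    pending = deque()  # (imaged position, nozzle mask) pairs, oldest first

    def queue_columns(imaged_at, nozzle_data):
        """Queue each nozzle-data column at the position where it was imaged."""
        for k, mask in enumerate(nozzle_data):
            pending.append((imaged_at + k * PITCH_MM, mask))

    def on_movement(lens_position, eject):
        """Fire every queued column that the trailing nozzles have reached."""
        # A column imaged when the lens was at position p lies under the
        # nozzles once the lens has advanced to p + L.
        while pending and lens_position >= pending[0][0] + L_MM:
            _, mask = pending.popleft()
            eject(mask)  # hypothetical driver call into the applicator 19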
The process flow of the applying processing executed by the application device 10 that comprises the above configuration will now be described with reference to the flowchart.
When the user desires to apply ink to a desired ink application area 31 on the application target 30, the user operates the user interface 13 of the application device 10 to press down the print start button, and places the application device 10 on the ink application area 31 with the position where the lens of the imager 17 is provided in alignment with the end of the ink application area 31. Then, the user scans the application device 10 in the direction from the position where the nozzles of the applicator 19 are provided toward the position where the lens of the imager 17 is provided, namely in the +Y direction, while keeping the underside of the application device 10 in contact with the application target 30. In this state, the applying processing shown in the flowchart is executed.
As the applying processing starts, the processor 11 detects movement of the application device 10 (Step S1). Specifically, as scanning of the application device 10 on the application target 30 starts, the processor 11 detects the amount of movement and the moving direction of the application device 10 on the application target 30 through the movement detector 16.
Having detected movement of the application device 10, the processor 11 first determines whether a time to capture an image has come (Step S2). Specifically, a time to capture an image comes each time a prescribed time has elapsed while the application device 10 moves on the application target 30. Therefore, the processor 11 determines that a time to capture an image has come each time the prescribed time has elapsed since the application device 10 started moving on the application target 30.
If a time to capture an image has come (Step S2; YES), the processor 11 functions as the imaging controller 110 to execute imaging (Step S21). Specifically, the processor 11 controls the imager 17 to capture an image of an area 32 of the width W, which is the range over which the imager 17 can capture an image within the ink application area 31. On the other hand, if a time to capture an image has not come (Step S2; NO), the processor 11 skips the imaging of Step S21.
Secondly, the processor 11 determines whether there is a new captured image 40 (Step S3). Specifically, the processor 11 determines whether a captured image 40 for which no application image 41 has been generated is newly obtained by capturing an image of any area 32 within the ink application area 31 in the Step S2.
If there is a new captured image 40 (Step S3; YES), the processor 11 functions as the image data generator 120 to generate an application image 41 based on the new captured image 40 (Step S31). For example, when one of the captured images 40a, 40b, 40c, . . . is newly obtained, the processor 11 generates the corresponding one of the application images 41a, 41b, 41c, . . . .
Generating the application image 41, the processor 11 further functions as the image data generator 120 to generate nozzle data 42 based on the generated application image 41 (Step S32). Specifically, the processor 11 generates or updates the nozzle data 42 from the newly generated application image 41, as described above.
On the other hand, if it is determined in Step S3 that there is no new captured image 40 (Step S3; NO), the processor 11 skips the processing of generating the application image 41 in Step S31 and the processing of generating the nozzle data 42 in Step S32.
Thirdly, the processor 11 determines whether movement of the application device 10 in the ink application area 31 on the application target 30 is detected by the movement detector 16 (Step S4). If no such movement is detected (Step S4; NO), the processor 11 skips the applying of ink in Step S41.
On the other hand, if movement of the application device 10 in the ink application area 31 is detected (Step S4; YES), the processor 11 functions as the application controller 130 to apply ink in accordance with the movement of the application device 10 (Step S41). Specifically, the processor 11 applies ink to the ink application area 31 in the application pattern according to the nozzle data 42 that are generated in the Step S32 each time movement of the application device 10 on the ink application area 31 is detected by the movement detector 16.
For example, when the applicator 19 is situated in the first area 32a, the processor 11 ejects ink from the nozzles according to the portion of the nozzle data 42 that corresponds to the first area 32a.
Subsequently, the processor 11 determines whether the applying processing on the ink application area 31 is complete (Step S5). Specifically, for example, when the user operates the user interface 13 to press down the end button, the processor 11 determines that the applying processing is complete. Alternatively, the processor 11 may determine that the applying processing is complete when the application device 10 is separated from the application surface of the application target 30.
If the applying processing is not complete (Step S5; NO), the processor 11 returns the processing to Step S1. Then, the processor 11 executes the processing of capturing an image each time a time to capture an image comes, executes the processing of generating the application image 41 and the nozzle data 42 each time a new captured image 40 is obtained, and executes the processing of applying ink each time movement of the application device 10 in the ink application area 31 is detected. The processor 11 repeats the above processing until application of ink to the ink application area 31 is complete. Finally, if the application of ink is complete (Step S5; YES), the applying processing ends.
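Expressed as code, the loop of Steps S1 to S5 might look like the following sketch, which reuses the application_pattern and to_nozzle_data sketches above; the detector, camera, and head objects are hypothetical stand-ins for the movement detector 16, the imager 17, and the applicator 19.

    def applying_processing(detector, camera, head, finished):
        """Sketch of the applying-processing loop (Steps S1 to S5)."""
        nozzle_data = []
        while not finished():                            # Step S5
            movement = detector.detect()                 # Step S1
            if camera.time_to_capture():                 # Step S2; YES
                strip = camera.capture()                 # Step S21
                if strip is not None:                    # Step S3; YES
                    image = application_pattern(strip)   # Step S31
                    nozzle_data = to_nozzle_data(image)  # Step S32
            if movement.in_application_area:             # Step S4; YES
                head.apply(nozzle_data, movement)        # Step S41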
As described above, the application device 10 according to this embodiment captures images of the ink application area 31 on the application target 30 that is an application target, and applies ink to the ink application area 31 in the pattern based on the luminance distribution in the ink application area 31 of which the images are captured in accordance with movement of its own device on the application target 30. As a result, the application device 10 according to this embodiment can apply ink based on the luminance distribution of characters and the like that preexist on the application target 30, which are not necessarily characters and the like that are printed by its own device.
Particularly, the application device 10 according to this embodiment comprises, in a single device, the capability of executing the processing of capturing images of the ink application area 31, the processing of generating the application image 41 and the nozzle data 42 based on the captured image 40, and the processing of applying ink. Then, the application device 10 according to this embodiment implements the processing of these three steps while the application device 10 moves on the ink application area 31 in one direction. Consequently, it is possible with a simple operation of the user holding and scanning the application device 10 on the ink application area 31 to apply ink with precise alignment with the positions of characters that are preprinted on the application target 30.
An embodiment of the present disclosure has been described above. However, the above embodiment is given by way of example, and the applicable range of the present disclosure is not confined thereto. In other words, the embodiment of the present disclosure can be applied in various manners, and every such embodiment is included in the scope of the present disclosure.
For example, the above embodiment is described using a case in which the ink application area 31 is an area in which the characters "ABC" are preprinted. However, in the present disclosure, the ink application area 31 is not restricted to an area in which characters are printed. For example, an area in which a symbol, a figure, or the like other than characters is preprinted, or an area in which a pattern, a graphic, or the like is predepicted, may be selected as the ink application area 31. Moreover, the ink application area 31 may be an area in which nothing is printed or depicted. Alternatively, an area in which a smear, a stain, or the like is present on the application target 30 may be selected as the ink application area 31. As just stated, any area on the application target 30 may be set as the ink application area 31 as long as image data that indicate the luminance distribution in the area are obtainable through imaging by the imager 17.
Moreover, the above embodiment is described using a case in which the applicator 19 applies ink to the first area in the ink application area 31 where the brightness is higher than the threshold to print a background image on the background portion in the ink application area 31. However, in the present disclosure, the applicator 19 is not restricted to printing a background image and may apply ink to a portion of characters or the like in the ink application area 31 to change the color or the density of the characters or the like that preexists in the ink application area 31.
Specifically, the image data generator 120 determines an application pattern to apply ink of a desired color at a desired density to the second area in the ink application area 31 where the brightness is lower than the threshold. For example, for darkening characters or the like preprinted in the ink application area 31, the image data generator 120 determines an application pattern to apply, to the second area of the ink application area 31 that has a brightness lower than the threshold, ink whose brightness is still lower than that brightness. On the other hand, for lightening characters or the like preprinted in the ink application area 31, or for making a stain, a smear, or the like that preexists in the ink application area 31 less visible, the image data generator 120 determines an application pattern to apply, to that second area, ink whose brightness is higher than that brightness. Further, when making the second area inconspicuous, it is preferable to apply ink whose brightness is higher than the brightness of the second area such that the brightness of the second area approaches the brightness of the first area in proximity to the second area. Then, the applicator 19 applies ink in the pattern determined by the image data generator 120.
Alternatively, the applicator 19 may enhance the outlines of characters by applying ink to the peripheral portions of characters in the ink application area 31. In such a case, the image data generator 120 determines an application pattern to apply ink of a desired color at a desired density to the border portion between the first area, where the brightness is higher than the threshold, and the second area, where the brightness is lower than the threshold, in the ink application area 31. Then, the applicator 19 applies ink in the pattern determined by the image data generator 120.
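A minimal sketch of such a concealment pattern follows; it approximates "the first area in proximity" by the mean brightness of all first-area pixels in the strip, which is a simplifying assumption, and again assumes grayscale numpy input.

    import numpy as np

    def concealment_pattern(strip, threshold=128):
        """Choose an ink brightness that pulls dark pixels toward the background."""
        second_area = strip < threshold  # dark characters, stain, or smear
        light = strip[~second_area]
        first_mean = light.mean() if light.size else 255.0  # fallback if all dark
        ink_brightness = np.full(strip.shape, np.nan)       # NaN = apply no ink
        # Ink brighter than the second area moves its brightness toward that
        # of the neighboring first area.
        ink_brightness[second_area] = first_mean
        return second_area, ink_brightness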
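The border portion can be located, for example, as the set of pixels whose 4-neighborhood mixes light and dark pixels, as the following sketch shows; the threshold is again an illustrative assumption.

    import numpy as np

    def border_mask(strip, threshold=128):
        """Mark pixels on the border between the first and second areas."""
        dark = strip < threshold
        border = np.zeros_like(dark)
        border[:-1, :] |= dark[:-1, :] != dark[1:, :]  # vertical neighbor differs
        border[1:, :] |= dark[1:, :] != dark[:-1, :]
        border[:, :-1] |= dark[:, :-1] != dark[:, 1:]  # horizontal neighbor differs
        border[:, 1:] |= dark[:, 1:] != dark[:, :-1]
        return border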
In the above embodiment, the application device 10 comprises the function of the image data generator 120 that generates the application images 41a, 41b, 41c, . . . and the nozzle data 42 based on the captured images 40a, 40b, 40c, . . . of the ink application area 31 that are captured by the imager 17. However, it may be possible in the present disclosure that the application device 10 does not comprise the function of the image data generator 120 and an external device of the application device 10 comprises the function of the image data generator 120. The external device is an information processing device such as a personal computer, a smartphone, and a tablet terminal connected to the application device 10 via wireless or wired communication or a server connected to the application device 10 via a wide area network such as the Internet.
When the application device 10 does not comprise the function of the image data generator 120, the application device 10 transmits data of the captured images 40a, 40b, 40c, . . . of the ink application area 31 that are captured by the imager 17 to the external device via the communicator 15. The external device generates, with the function of the image data generator 120 described in the above embodiment, the application images 41a, 41b, 41c, . . . and the nozzle data 42 based on the captured images 40a, 40b, 40c, . . . that are received from the application device 10 and transmits the generated nozzle data 42 to the application device 10. The application device 10 receives the nozzle data 42 from the external device via the communicator 15 and applies ink to the ink application area 31 according to the received nozzle data 42. Alternatively, the processing of generating the nozzle data 42 from the application images 41a, 41b, 41c, . . . may be executed by the application device 10, not by the external device. In such a case, the application device 10 receives the application images 41a, 41b, 41c, . . . from the external device, generates the nozzle data 42 from the received application images 41a, 41b, 41c, . . . , and applies ink to the ink application area 31 according to the generated nozzle data 42. As just stated, as the external device comprises at least part of the function of the image data generator 120, it is possible to reduce the amount of processing executed on the application device 10 and therefore simplify the configuration of the application device 10.
In the above embodiment, the imager 17 repeatedly captures images of multiple areas 32 within the ink application area 31 each time a prescribed time has elapsed while the application device 10 moves on the application target 30. However, in the present disclosure, the imager 17 may repeatedly capture images of multiple areas 32 within the ink application area 31 each time movement over a prescribed distance is detected by the movement detector 16 while the application device 10 moves on the application target 30. In other words, the timing of the imager 17 capturing images may be prescribed by the amount of movement of the application device 10 on the application target 30 instead of by the elapse of time.
In such a case, the prescribed distance may be a distance that corresponds to the width W in the moving direction of the application device 10 on the application target 30 of an area of which an image the imager 17 can capture. In other words, the imager 17 may capture images of multiple areas 32 within the ink application area 31 each time movement over a distance that corresponds to the width W is detected by the movement detector 16 while the application device 10 moves on the application target 30. As just stated, as the imager 17 captures an image each time the application device 10 moves over a range over which the imager 17 can capture an image, it is possible to prevent the multiple areas 32 to capture images from overlapping with each other. Therefore, it is possible to efficiently acquire the captured images 40 within the ink application area 31 and reduce the amount of processing of the image data generator 120.
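Sketched in code, the distance-triggered variant simply compares the detected displacement against the position of the last capture; the value of W and the capture call are illustrative assumptions.

    W_MM = 5.0  # imaging width W in the moving direction, in mm (illustrative)

    def maybe_capture(displacement, last_capture_pos, capture, strips):
        """Capture a new strip each time the device advances by W."""
        if displacement - last_capture_pos >= W_MM:
            strips.append((displacement, capture()))  # strips abut, no overlap
            return displacement  # new capture position
        return last_capture_pos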
In the above embodiment, the application device 10 captures images of the ink application area 31 and applies ink to the ink application area 31 while moving on the application target 30 in a prescribed direction, specifically in the direction from the position where the applicator 19 is provided to the position where the imager 17 is provided in the application device 10 (the +Y direction). However, in the present disclosure, the application device 10 may execute the processing described in the above embodiment while moving on the application target 30 in a direction other than the prescribed direction. In other words, it may be possible to apply ink to an area at any position on the application target 30 in the pattern based on the luminance distribution in the area while the user scans the application device 10 in any direction on the XY plane.
Specifically, the movement detector 16 detects the amount of movement and the moving direction of the application device 10 while the application device 10 moves on the application target 30 in any direction. The imager 17 captures an image of an area on the application target 30 each time a prescribed time to capture an image comes while the application device 10 moves on the application target 30 in any direction. The captured image captured by the imager 17 is stored in the storage 12 in association with position information of the area of which the image is captured. Here, the position information is represented by two-dimensional coordinates on the XY plane and is expressed based on the amount of movement and the moving direction of the application device 10 since the imaging start, which are detected by the movement detector 16. When the position of the nozzles of the applicator 19 reaches an area of which an image has been captured by the imager 17, the applicator 19 applies ink to the area in the pattern based on the luminance distribution in the area. As just stated, as long as the applicator 19 can apply ink when the position of the applicator 19 reaches an area on the application target 30 of which an image has been captured by the imager 17, the application device 10 may be allowed to move on the application target 30 in any direction, not necessarily in the +Y direction.
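A minimal sketch of this free-direction variant, with hypothetical detector and camera calls, tracks two-dimensional coordinates and tags each captured image with the position at which it was captured:

    def track_and_capture(detector, camera, store):
        """Tag captured images with 2-D positions for any scanning direction."""
        x = y = 0.0
        while detector.is_scanning():
            dx, dy = detector.read_delta()  # amount and direction of movement
            x, y = x + dx, y + dy
            if camera.time_to_capture():
                # Ink is applied later, when the nozzles reach (x, y).
                store[(x, y)] = camera.capture()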
In the above embodiment, the applicator 19 ejects ink in a thermal system. However, in the present disclosure, the applicator 19 may eject ink in another system, not necessarily in a thermal system. For example, the applicator 19 may eject ink in a piezoelectric system using a piezoelectric element to print a print-target image on the application target 30. Moreover, the applicator 19 may apply ink to the application target 30 in another system such as a heat transfer system, not necessarily in an inkjet system. Moreover, the shape of the application device 10 is not necessarily a quadrangular prism shape as shown in the drawings and may be any shape.
In the above embodiment, with the CPU executing programs that are stored in the ROM, the processor 11 functions as the parts of the imaging controller 110, the image data generator 120, and the application controller 130. However, it may be possible in the present disclosure that instead of the CPU, the processor 11 comprises, for example, dedicated hardware such as an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), and various kinds of control circuits and the dedicated hardware functions as the parts of the imaging controller 110, the image data generator 120, and the application controller 130. In such a case, the functions of the parts may each be realized by a separate piece of hardware or the functions of the parts may collectively be realized by a single piece of hardware. Moreover, it may be possible that among the functions of the parts, some are realized by dedicated hardware and others are realized by software or firmware.
Here, needless to say, an application device that preliminarily comprises the configuration for realizing the functions according to the present disclosure can be provided. Additionally, it is possible to make an existing information processing device or the like function as the application device according to the present disclosure by applying a program. In other words, an existing information processing device or the like can be made to function as the application device according to the present disclosure by applying a program that realizes the functional configurations of the application device 10 exemplified in the above embodiment in such a manner that the CPU or the like that controls the existing information processing device can execute the program. Moreover, the ink application method according to the present disclosure can be implemented using the application device.
Moreover, such programs are applied by any method. The programs can be saved and applied, for example, on a non-transitory computer-readable recording medium such as a flexible disc, a compact disc (CD)-ROM, a digital versatile disc (DVD)-ROM, and a memory card. Furthermore, the programs can be superimposed on carrier waves and applied via a communication medium such as the Internet. For example, the programs may be posted and distributed on a bulletin board system (BBS) on a communication network. Then, the programs may be activated and executed in the same manner as other application programs under the control of an operating system (OS) to execute the above-described processing.
The foregoing describes some example embodiments for explanatory purposes. Although the foregoing discussion has presented specific embodiments, persons skilled in the art will recognize that changes may be made in form and detail without departing from the broader spirit and scope of the invention. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense. This detailed description, therefore, is not to be taken in a limiting sense, and the scope of the invention is defined only by the included claims, along with the full range of equivalents to which such claims are entitled.
Foreign Application Priority Data

    Number | Date | Country | Kind
    2018-012964 | Jan. 2018 | JP | national

References Cited: U.S. Patent Documents

    Number | Name | Date | Kind
    20020154186 | Matsumoto | Oct. 2002 | A1

Foreign Patent Documents

    Number | Date | Country
    1755405 | Apr. 2006 | CN
    106311524 | Jan. 2017 | CN
    H10-035034 | Feb. 1998 | JP
    2018-012964 | Jan. 2018 | JP

Other Publications

    First Office Action dated Apr. 15, 2020 received in Chinese Patent Application No. CN 201910086290.9, together with an English language translation.

Publication Data

    Number | Date | Country
    20190232650 A1 | Aug. 2019 | US