The present invention relates to an information processing apparatus, a method, and a non-transitory computer-readable storage medium storing a program.
There is conventionally known a technique of preparing a template that stores information such as the shapes and arrangement of the images, characters, and graphics constituting a poster, and automatically arranging images, characters, and graphics in accordance with the template, thereby generating a poster. Japanese Patent Laid-Open No. 2017-59123 proposes generating a poster by selecting templates in ascending order of the difference between the impression evaluation value of a template and the impression evaluation value of an image.
In Japanese Patent Laid-Open No. 2017-59123, a template in which the difference between the impression evaluation value of the template and the impression evaluation value of an image is small is selected, but generating a commercial material that gives an impression intended by a user, including a color change at the time of printing, is not taken into consideration at all.
The present invention provides an information processing apparatus capable of printing a commercial material that appropriately gives an impression intended by a user, a method, and a non-transitory computer-readable storage medium storing a program.
The present invention in one aspect provides an information processing apparatus comprising: at least one processor; and at least one memory coupled to the at least one processor and having stored thereon instructions that, when executed by the at least one processor, cause the apparatus to act as: a first acceptance unit configured to accept contents of a commercial material; a second acceptance unit configured to accept a designation of an impression the commercial material gives to a user; a third acceptance unit configured to accept a condition of the commercial material; an acquisition unit configured to acquire information of a color gamut reproducible by a printing unit that prints the commercial material; a generation unit configured to generate a plurality of commercial material data based on the contents accepted by the first acceptance unit, the designation of the impression accepted by the second acceptance unit, and the condition of the commercial material accepted by the third acceptance unit; and a display unit configured to display a commercial material image based on the plurality of commercial material data generated by the generation unit, wherein the commercial material image is an image represented by data specified, based on the designation of the impression accepted by the second acceptance unit, from a plurality of data generated from the plurality of commercial material data based on the information of the color gamut acquired by the acquisition unit.
According to the present invention, it is possible to print a commercial material that appropriately gives an impression intended by a user.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Hereinafter, embodiments will be described in detail with reference to the attached drawings. Note that the following embodiments are not intended to limit the scope of the claimed invention. Multiple features are described in the embodiments, but no limitation is made to an invention that requires all such features, and multiple such features may be combined as appropriate. Furthermore, in the attached drawings, the same reference numerals are given to the same or similar configurations, and redundant description thereof is omitted.
Terms used in this embodiment are defined in advance, as follows.
“Color reproduction region” is also called a color reproduction range, a color gamut, or a gamut. Generally, “color reproduction region” indicates the range of colors that can be reproduced in an arbitrary color space. In addition, a gamut volume is an index representing the extent of this color reproduction range. The gamut volume is a three-dimensional volume in an arbitrary color space. Chromaticity points forming the color reproduction range are sometimes discrete. For example, a specific color reproduction range is represented by 729 points on CIE-L*a*b*, and points between them are obtained by using a well-known interpolating operation such as tetrahedral interpolation or cubic interpolation. In this case, as the corresponding gamut volume, it is possible to use a volume obtained by calculating the volumes on CIE-L*a*b* of tetrahedrons or cubes forming the color reproduction range and accumulating the calculated volumes, in accordance with the interpolating operation method. The color reproduction region and the color gamut in this embodiment are not limited to a specific color space. In this embodiment, however, a color reproduction region in the CIE-L*a*b* space will be explained as an example. Furthermore, the numerical value of a color reproduction region in this embodiment indicates a volume obtained by accumulation in the CIE-L*a*b* space on the premise of tetrahedral interpolation.
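The accumulation of tetrahedron volumes described above can be sketched as follows. This is an illustrative sketch only: the function names and the tetrahedral decomposition passed as input are assumptions for explanation, not part of the embodiment.

```python
import numpy as np

def tetrahedron_volume(p0, p1, p2, p3):
    # Volume of one tetrahedron in the CIE-L*a*b* space:
    # |det([p1 - p0, p2 - p0, p3 - p0])| / 6
    m = np.stack([p1 - p0, p2 - p0, p3 - p0])
    return abs(np.linalg.det(m)) / 6.0

def gamut_volume(lab_points, tetrahedra):
    # lab_points: (N, 3) array of L*a*b* chromaticity points (e.g. N = 729).
    # tetrahedra: iterable of 4-tuples of point indices from the
    # tetrahedral decomposition of the color reproduction range.
    # The gamut volume is the accumulated volume of all tetrahedra.
    return sum(tetrahedron_volume(*(lab_points[i] for i in t))
               for t in tetrahedra)
```

For a single tetrahedron with vertices at the origin and the three unit points, the accumulated volume is 1/6, which matches the determinant formula.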
“Gamut mapping” is processing of performing conversion between different color gamuts, and is, for example, mapping of an input color gamut to an output color gamut of a device such as a printer. General examples are the Perceptual, Saturation, and Colorimetric rendering intents of the ICC profile. The mapping processing may be implemented by, for example, conversion by a three-dimensional lookup table (3DLUT). Furthermore, the mapping processing may be performed after conversion of a color space into a standard color space. For example, if an input color space is sRGB, conversion into the CIE-L*a*b* color space is performed, and then the mapping processing to an output color gamut is performed in the CIE-L*a*b* color space. The mapping processing may be conversion by a 3DLUT, or may be performed using a conversion formula. Conversion between the input color space and the output color space may be performed simultaneously. For example, the input color space may be the sRGB color space, and conversion into RGB values or CMYK values unique to a printer may be performed at the time of output.
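A minimal sketch of one possible mapping behavior, a colorimetric-style clipping that preserves lightness and hue and compresses only chroma, might look as follows. The `max_chroma` boundary function here is a hypothetical stand-in for an actual description of the output gamut boundary, not a defined API.

```python
import math

def clip_to_gamut(L, a, b, max_chroma):
    # Colorimetric-style clipping sketch in CIE-L*a*b*:
    # preserve lightness L and hue, and reduce chroma only when the
    # color falls outside the output gamut boundary.
    # max_chroma(L, hue) is assumed to return the boundary chroma
    # for the given lightness and hue angle.
    chroma = math.hypot(a, b)
    limit = max_chroma(L, math.atan2(b, a))
    if chroma <= limit:
        return (L, a, b)            # already inside the gamut
    scale = limit / chroma          # compress toward the neutral axis
    return (L, a * scale, b * scale)
```

For instance, with a boundary of chroma 50, an input (L, a, b) = (50, 60, 80) with chroma 100 would be scaled onto the boundary as (50, 30, 40).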
In this embodiment, a method of operating an application for poster generation in a poster generation apparatus and automatically generating a poster will be described as an example. Note that in the following explanation, “image” includes a still image and a frame image extracted from a moving image, unless it is specifically stated otherwise.
The CPU (central processing unit/processor) 101 generally controls the poster generation apparatus 100 and, for example, reads out a program stored in the ROM 102 to the RAM 103 and executes it, thereby implementing an operation according to this embodiment.
The HDD (hard disk) 104 is a storage medium (storage unit) configured to store databases that hold image files and processing results of image analysis and the like, and skeletons used by a poster creation application. The skeleton will be described later. The display 105 is a display unit that displays, to a user, a user interface (UI) according to this embodiment and an electronic poster as a layout result of image data (to be also referred to as “images” hereinafter). The keyboard 106 and the pointing device 107 accept instruction operations from the user. The display 105 may have a touch sensor function.
The keyboard 106 is used by the user to, for example, input the number of double-page spreads of a poster to be created on the user interface (UI) displayed on the display 105. The pointing device 107 is used by the user to, for example, click a button on the UI displayed on the display 105.
The data communication unit 108 performs communication with an external apparatus via a wired or wireless network. The data communication unit 108, for example, transmits data laid out by an automatic layout function to a printer or a server capable of communicating with the poster generation apparatus 100. The GPU 109 is a graphics processing unit and can perform operations efficiently by processing large amounts of data in parallel. The GPU 109 is used when, for example, performing training a plurality of times using a learning model such as a deep learning model. A data bus 110 connects the blocks shown in
The poster creation application according to this embodiment is stored in the HDD 104. The poster creation application is activated by the user executing an operation of clicking or double-clicking, using the pointing device 107, the icon of the application displayed on the display 105.
Next, a printing apparatus 120 includes an image processing accelerator 121, a data communication unit 122, a CPU 123, a RAM 124, a storage medium 125, a printhead controller 126, and a printhead 127.
The CPU 123 generally controls the printing apparatus 120 by reading out a program stored in the storage medium 125 to the RAM 124 serving as a work area and executing it. The image processing accelerator 121 is hardware capable of executing image processing at a higher speed than the CPU 123. The image processing accelerator 121 is activated by the CPU 123 writing parameters and data necessary for image processing to a predetermined address of the RAM 124. After the parameters and data are loaded, the image processing accelerator 121 executes image processing for the data. Note that the image processing accelerator 121 need not be an essential element, and similar processing may be executed by the CPU 123. The parameters may be stored in the storage medium 125 or may be stored in a storage (not shown) such as a flash memory or an HDD.
Image processing performed by the CPU 123 or the image processing accelerator 121 will be described here. Image processing is, for example, processing of generating data indicating a dot formation position of ink in each scanning by the printhead 127 based on acquired print data. The CPU 123 or the image processing accelerator 121 performs color conversion processing and quantization processing of acquired print data.
The color conversion processing is processing of performing color separation into the ink concentrations to be used in the printing apparatus 120. For example, the acquired print data contains image data indicating an image. In a case where the image data is data indicating an image in a color space coordinate system such as sRGB used as the expression colors of a monitor, data indicating an image by the color coordinates (R, G, B) of sRGB is converted into ink data (CMYK) to be handled by the printing apparatus 120. The color conversion method is implemented by, for example, matrix operation processing or processing using a 3DLUT or 4DLUT.
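The color separation described above can be illustrated by a naive RGB-to-CMYK conversion such as the following sketch. An actual printer would use a measured profile, matrix operation, or LUT rather than this closed-form formula, so this is an explanatory assumption only.

```python
def rgb_to_cmyk(r, g, b):
    # Naive color separation sketch (not a real printer profile):
    # normalize the 8-bit sRGB values, take the common gray component
    # as the black (K) ink amount, and express the remainder as
    # cyan (C), magenta (M), and yellow (Y) ink amounts.
    rn, gn, bn = r / 255.0, g / 255.0, b / 255.0
    k = 1.0 - max(rn, gn, bn)
    if k >= 1.0:                      # pure black: K ink only
        return (0, 0, 0, 255)
    c = (1.0 - rn - k) / (1.0 - k)
    m = (1.0 - gn - k) / (1.0 - k)
    y = (1.0 - bn - k) / (1.0 - k)
    # Return 8-bit color signals corresponding to ink application amounts.
    return tuple(round(v * 255) for v in (c, m, y, k))
```

Under this sketch, white maps to no ink, black maps to K ink only, and pure red maps to full magenta and yellow.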
In this embodiment, as an example, the printing apparatus 120 uses inks of black (K), cyan (C), magenta (M), and yellow (Y) for printing. Therefore, image data of RGB signals is converted into image data formed by 8-bit color signals of K, C, M, and Y. The color signal of each color corresponds to the application amount of each ink. Furthermore, the ink colors are the four colors of K, C, M, and Y as an example. However, to improve image quality, it is also possible to use other ink colors such as fluorescent (F) ink, or low-concentration inks such as light cyan (Lc), light magenta (Lm), and gray (Gy). In this case, color signals corresponding to those inks are generated.
After the color conversion processing, quantization processing is performed for the ink data. This quantization processing is processing of decreasing the number of tone levels of the ink data. In this embodiment, quantization is performed by using a dither matrix in which thresholds to be compared with the values of the ink data are arrayed in individual pixels. After the quantization processing, binary data indicating whether to form a dot in each dot formation position is finally generated.
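The threshold comparison described above can be sketched as follows, assuming a simple Bayer-type dither matrix in place of the actual matrix used by the apparatus; the matrix contents and tiling behavior here are illustrative assumptions.

```python
import numpy as np

def dither_binarize(ink_plane, matrix):
    # ink_plane: 2-D array of 8-bit ink values for one color (e.g. C).
    # matrix: dither matrix of thresholds arrayed in individual pixels;
    # it is tiled over the image, and a dot is formed wherever the ink
    # value exceeds the local threshold, yielding binary dot data.
    h, w = ink_plane.shape
    mh, mw = matrix.shape
    tiled = np.tile(matrix, (h // mh + 1, w // mw + 1))[:h, :w]
    return (ink_plane > tiled).astype(np.uint8)

# Example: a 4x4 Bayer matrix scaled to 8-bit thresholds.
bayer4 = (np.array([[ 0,  8,  2, 10],
                    [12,  4, 14,  6],
                    [ 3, 11,  1,  9],
                    [15,  7, 13,  5]]) + 0.5) * 16
```

A uniform mid-tone plane then produces a regular dot pattern in which roughly half of the pixel positions form dots.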
After the image processing is performed, the printhead controller 126 transfers the binary data to the printhead 127. At the same time, the CPU 123 performs printing control via the printhead controller 126 so as to operate a carriage motor (not shown) for operating the printhead 127, and to operate a conveyance motor for conveying a print medium. The printhead 127 scans the print medium and also discharges ink droplets onto the print medium, thereby forming an image.
The image processing apparatus 100 and the printing apparatus 120 are connected to each other via the communication line 110. In this embodiment, a Local Area Network (LAN) will be explained as an example of the communication line 110. However, the connection may also be obtained by using, for example, a USB hub, a wireless communication network using a wireless access point, or a Wi-Fi Direct communication function.
A description will be provided below by assuming that the printhead 127 has nozzle arrays for four color inks of cyan (C), magenta (M), yellow (Y), and black (K).
When the poster creation application is installed in the poster generation apparatus 100, an activation icon is displayed on the top screen (desktop) of an operating system (OS) operating on the poster generation apparatus 100. If the user operates (for example, double-clicks) the activation icon displayed on the display 105 using the pointing device 107, the program of the poster creation application stored in the HDD 104 is loaded into the RAM 103 and executed by the CPU 101. The poster creation application is thus activated.
Program modules corresponding to the constituent elements shown in
In accordance with a UI operation using the pointing device 107, the poster creation condition designation unit 201 designates poster creation conditions in the poster generation unit 210. In this embodiment, as the poster creation conditions, the size, the creation count, and the application purpose category of a poster are designated. As the size of the poster, the actual values of a width and a height may be designated, or a paper size such as A1 or A2 may be designated. The application purpose category is a category indicating for what kind of application purpose the poster is to be used, and examples are restaurant, school event, and sale.
By a UI operation using the keyboard 106, the text designation unit 202 designates character information to be arranged on the poster. Character information to be arranged on the poster indicates a character string indicating, for example, a title, a date/time, a location, or the like. Also, the text designation unit 202 links each character information with the title, the date/time, or the location such that the type of information can be discriminated, and then outputs the character information to the skeleton acquisition unit 213 and the layout unit 217.
The image designation unit 203 designates one or a plurality of image data that are stored in the HDD 104 and are to be arranged on the poster. The image data may be designated, for example, based on the structure of a file system including image data, such as a device and a directory, or may be designated based on additional information such as an image capturing date/time for identifying an image or attribute information. The image designation unit 203 outputs the file path of the designated image to the image acquisition unit 211.
The target impression designation unit 204 designates the target impression of the poster to be created. The target impression is an impression the poster to be created is requested to finally evoke. In this embodiment, an intensity representing the degree of an impression to be imparted to a word representing the impression is designated by a UI operation using the pointing device 107. Information indicating the target impression designated by the target impression designation unit 204 is shared by the skeleton selection unit 214, the coloring pattern selection unit 215, the font selection unit 216, and the poster selection unit 219. Details of the impression will be described later.
The gamut mapping designation unit 207 designates one or a plurality of gamut mapping methods to be used by the gamut mapping unit 221. In this embodiment, which one of gamut mapping methods such as Perceptual, Saturation, and Colorimetric is to be used is designated by a UI operation using the pointing device 107.
The configuration of the poster generation unit 210 will be described next in detail. The image acquisition unit 211 acquires one or a plurality of image data designated by the image designation unit 203 from the HDD 104. The image acquisition unit 211 outputs the acquired image data to the image analysis unit 212. In addition, the image acquisition unit 211 outputs the number of acquired images to the skeleton acquisition unit 213. Examples of an image stored in the HDD 104 are a still image and a frame image cut out from a moving image. The still image and the frame image are acquired from an image capturing device such as a digital camera or a smart device. The image capturing device may be provided in the poster generation apparatus 100, or may be provided in an external apparatus. Note that if the image capturing device is an external apparatus, the image is acquired via the data communication unit 108. Also, as another example, the still image may be an illustration image created by image editing software or a CG image created by Computer Graphics (CG) generation software. The still image and the cutout image may be images acquired from a network or a server via the data communication unit 108. An example of the image acquired from the network or the server is a social networking service image (to be referred to as an “SNS image” hereinafter). Also, the program executed by the CPU 101 analyzes, for each image, data added to the image and determines the storage source. For example, as for an SNS image, the acquisition destination may be managed in an application by acquiring the image from the SNS via the application. Note that the image is not limited to the above-described images, and may be another type of image.
The image analysis unit 212 executes image data analysis processing using a method to be described later for the image data acquired from the image acquisition unit 211, thereby acquiring information indicating an image feature amount to be described later. More specifically, for example, the image analysis unit 212 executes object recognition processing to be described later and acquires information indicating the image feature amount of image data. Also, the image analysis unit 212 links the acquired information indicating the image feature amount with the image data and outputs it to the layout unit 217.
The skeleton acquisition unit 213 acquires, from the HDD 104, one or a plurality of skeletons that match the conditions designated by the poster creation condition designation unit 201, the text designation unit 202, and the image acquisition unit 211. In this embodiment, a skeleton is information indicating the arrangement of character strings, images, and graphics to be arranged on a poster. In this embodiment, the skeleton is one of commercial material constituent elements that constitute a commercial material together with a coloring pattern and a font.
The skeleton may be stored in the HDD 104 using, for example, a CSV format, or may be stored using a DB format such as SQL. The skeleton acquisition unit 213 outputs the one or the plurality of skeletons acquired from the HDD 104 to the skeleton selection unit 214.
Among the skeletons acquired from the skeleton acquisition unit 213, the skeleton selection unit 214 selects one or a plurality of skeletons matching the target impression designated by the target impression designation unit 204, and outputs these to the layout unit 217. Since the arrangement on the entire poster is decided by the skeleton, variations of the poster after generation can be increased by preparing various types of skeletons in advance.
The coloring pattern selection unit 215 acquires, from the HDD 104, one or a plurality of coloring patterns matching the target impression designated by the target impression designation unit 204, and outputs these to the layout unit 217. The coloring pattern is a combination of colors to be used in a poster.
The font selection unit 216 acquires, from the HDD 104, one or a plurality of font patterns matching the target impression designated by the target impression designation unit 204, and outputs these to the layout unit 217. The font pattern is a combination of at least one of the font of a title, the font of a subtitle, and the font of a text.
The layout unit 217 lays out various kinds of data on each of the one or the plurality of skeletons acquired from the skeleton selection unit 214, thereby generating a number of poster data equal to or larger than the designated creation count. The layout unit 217 arranges, on each skeleton, the text acquired from the text designation unit 202 and the image data acquired from the image analysis unit 212, applies the coloring pattern acquired from the coloring pattern selection unit 215, and applies the font pattern acquired from the font selection unit 216. The layout unit 217 outputs the plurality of generated poster data to the gamut mapping unit 221.
The device color gamut acquisition unit 220 acquires a color gamut (device color gamut) reproducible by a printer from printer information acquired from the printer designation unit 206. The device color gamut is defined by, for example, 729 points on CIE-L*a*b* and stored in the storage medium 125 of the printing apparatus 120. The image processing apparatus 100 acquires the device color gamut held by the printing apparatus 120 via the data communication unit 108, and stores it in the RAM 103. Note that the device color gamut acquisition method is not limited to this. For example, the device color gamut corresponding to the printer may be stored on the HDD 104 or a server (not shown) in advance, and the device color gamut acquisition unit 220 may acquire the device color gamut via the HDD 104 or the data communication unit 108. Instead of designating the printer, the printer designation unit 206 may designate an ICC profile. Since the ICC profile defines a color space that the device can express, the device color gamut acquisition unit 220 may acquire the device color gamut by analyzing the ICC profile. The acquired device color gamut is output to the gamut mapping unit 221.
The gamut mapping unit 221 acquires the gamut mapping method to be used from the gamut mapping designation unit 207, and acquires the device color gamut from the device color gamut acquisition unit 220. For the plurality of poster data acquired from the layout unit 217, the gamut mapping unit 221 performs gamut mapping processing using the designated one or plurality of gamut mapping methods. The gamut mapping unit 221 outputs the poster data after the gamut mapping to the impression estimation unit 218.
For the plurality of poster data acquired from the gamut mapping unit 221, the impression estimation unit 218 estimates the impression of each poster and links the estimated impression with each poster. The impression estimation unit 218 outputs the plurality of poster data each linked with the estimated impression to the poster selection unit 219.
The poster selection unit 219 compares the target impression designated by the target impression designation unit 204 with the estimated impression of each of the plurality of poster data linked with the estimated impression acquired from the impression estimation unit 218, and selects poster data linked with the estimated impression closest to the target impression. The selection result is stored in the HDD 104. The poster selection unit 219 outputs the selected poster data to the poster display unit 205.
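The comparison of the target impression with each estimated impression can be sketched, for example, as a Euclidean distance over the impression factors. The data layout, factor names, and the restriction to enabled factors are illustrative assumptions for explanation.

```python
import math

def select_poster(posters, target, enabled):
    # posters: list of (poster_data, estimated_impression) pairs, where each
    # impression is a dict of factor name -> value on the common -2..+2 scale.
    # target: the target impression designated by the user.
    # enabled: the factors whose setting is enabled; disabled factors
    # are excluded from the distance calculation.
    def distance(estimated):
        return math.sqrt(sum((estimated[f] - target[f]) ** 2
                             for f in enabled))
    # Select the poster whose estimated impression is closest to the target.
    return min(posters, key=lambda p: distance(p[1]))
```

For example, with a target of a high premium nature and a somewhat low dynamism, a poster whose estimated impression matches those values exactly is selected over one estimated as neutral.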
The poster display unit 205 outputs a poster image to be displayed on the display 105 in accordance with the poster data acquired from the poster selection unit 219. The poster image is, for example, bitmap data. The poster display unit 205 displays the poster image on the display 105. Note that a function of, after the generation result is displayed on the poster display unit 205, editing the arrangement, colors, shapes, and the like of the images, texts, and graphics by an additional operation of the user to change the design to a design demanded by the user may be imparted to the poster creation application. The printing unit 208 prints the poster data stored in the HDD 104 by the printer designated by the printer designation unit 206.
A title box 502, a subtitle box 503, and a text box 504 each accept a designation of character information to be arranged on a poster. Note that in this embodiment, three types of character information are accepted as an example, but the present invention is not limited to this. For example, character information of a location, a date/time, or the like may additionally be accepted. In addition, not all designations need be done, and some boxes may be blank.
An image designation region 505 is a region in which an image to be arranged on the poster is displayed. An image 506 indicates the thumbnail of a designated image. An image addition button 507 is a button used to add an image to be arranged on the poster. If the user presses the image addition button 507, the image designation unit 203 displays a dialog screen used to select a file stored in the HDD 104, and accepts image file selection by the user. The thumbnail of the selected image is then added to the image designation region 505.
Impression sliders (impression slider bars or impression setting sliders) 508 to 511 are objects that set the factors of target impressions of the poster to be created. For example, the slider 508 is a slider that sets the factor of a target impression concerning a premium nature. If the slider 508 is moved to the right side, the premium nature is set high. If the slider 508 is moved to the left side, a target impression is set such that the poster has a low premium nature (gives a cheap impression). Also, when the factors of target impressions set by the sliders are combined, a target impression is set on which not only the factor of target impression set by one slider but also the factors of target impressions set by other sliders are reflected. For example, if a user operation is performed on the screen of the poster creation application to set the impression slider 508 to the right side of the center of the slider and set the impression slider 511 to the left side of the center of the slider, a poster giving a refined impression with a high premium nature and a low profoundness is generated. In addition, for example, if the impression slider 508 is set to the right side of the center of the slider, and the impression slider 511 is set to the right side of the center of the slider, a poster giving a gorgeous impression in which both the premium nature and the profoundness are high is generated. As described above, when the factors of target impressions indicated by the plurality of impression sliders are combined, even if a factor of a common target impression “high premium nature” is set, target impressions of different directions, that is, a “refined” target impression and a “gorgeous” target impression can be set. That is, the target impression can be formed and decided by a plurality of factors representing impressions but may be decided by one factor representing an impression. 
In this embodiment, defining a state in which a slider is set at the leftmost position as “−2” and a state in which a slider is set at the rightmost position as “+2”, the value is corrected to an integer value of −2 to +2. As for these numerical values indicating impressions, “−2” indicates “low”, “−1” indicates “somewhat low”, “0” indicates “neither”, “+1” indicates “somewhat high”, and “+2” indicates “high”. Note that the purpose of correcting the value to −2 to +2 is to facilitate a distance calculation to be described later by making the scale match that of the estimated impression. However, the present invention is not limited to this, and normalization may be done using values of 0 to 1.
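The correction described above can be sketched as follows, assuming a hypothetical raw slider range of 0 to 100 (the raw range is an assumption; the embodiment only defines the corrected −2 to +2 scale).

```python
def slider_to_level(raw, raw_min=0, raw_max=100):
    # Correct a raw slider position (hypothetical 0..100 range) to one of
    # five integer levels from -2 (leftmost, "low") to +2 (rightmost, "high").
    ratio = (raw - raw_min) / (raw_max - raw_min)
    return round(ratio * 4) - 2
```

The leftmost position thus maps to −2, the center to 0, and the rightmost to +2, matching the scale of the estimated impression.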
A radio button 512 is a button for enabling or disabling the setting of each target impression. The user can enable or disable the setting of each target impression by pressing the radio button 512 to set it on or off. For example, if off is selected by the radio button 512, that impression is excluded from impression control. For example, if the user wants to create a restrained poster with a low dynamism but has no particular designations concerning the other impressions, a poster specialized to the low dynamism can be generated by turning off the radio buttons 512 other than that for the dynamism. Note that
A size list box 513 is a list box that sets the size of the poster to be created. By a click operation of the user using the pointing device 107, a list of creatable poster sizes is displayed, and a poster size can be selected. A creation count box 514 can set the number of candidates of the poster to be created. A category list box 515 can set the application purpose category of the poster to be created. A printer designation box 516 can set a printer that prints the created poster. Note that although a printer is designated in this embodiment, a print mode may further be designated. In this case, even if the color gamut that the printer can express changes depending on the print mode (the printing method or the paper type), an appropriate device color gamut can be acquired.
A gamut mapping designation region 517 is formed by a plurality of checkboxes used to decide the type of gamut mapping method to be used. By a click operation of the user using the pointing device 107, the checkbox of the gamut mapping method to be used can be enabled or disabled.
A reset button 518 is a button used to reset the setting information on the application activation screen 501. If the user presses an OK button 519, the poster creation condition designation unit 201, the text designation unit 202, the image designation unit 203, and the target impression designation unit 204 output the contents set on the application activation screen 501 to the poster generation unit 210. At this time, the poster creation condition designation unit 201 acquires the size of the poster to be created from the size list box 513, the number of posters to be created from the creation count box 514, and the application purpose category of the poster to be created from the category list box 515.
The text designation unit 202 acquires character information to be arranged on the poster from the title box 502, the subtitle box 503, and the text box 504. The image designation unit 203 acquires the path of an image file to be arranged on the poster from the image designation region 505. The target impression designation unit 204 acquires the target impression of the poster to be created from the impression sliders 508 to 511 and the radio buttons 512. Note that the poster creation condition designation unit 201, the text designation unit 202, the image designation unit 203, and the target impression designation unit 204 may process the values set on the application activation screen 501. For example, the text designation unit 202 may remove an unnecessary blank character at the top or end of the input character information. Also, the target impression designation unit 204 may correct the values of the target impressions designated by the impression sliders 508 to 511.
A poster image 602 is a poster image output by the poster display unit 205. Since the poster generation unit 210 generates a number of posters equal to or greater than the creation count designated by the poster creation condition designation unit 201, the generated posters are displayed in a list as the poster images 602 on the poster preview screen 601. For example, if the user clicks a poster using the pointing device 107, the poster is selected.
An edit button 603 is used to edit the poster in the selected state via a UI (not shown) that provides an editing function. A print button 604 is used to print the poster in the selected state by the printing apparatus 120.
A method of quantifying the impression of a poster, which is preprocessing for the impression estimation processing to be described later in S913 of
In the processing of quantifying the impression of a poster, an impression that a person has concerning various posters is quantified. At the same time, the correspondence relationship between a poster image and the impression of the poster is derived. This makes it possible to estimate the impression of the poster from the generated poster image. If the impression can be estimated, the impression of the poster can be controlled by correcting the poster image, or a poster image having a certain target impression can be searched for. Note that the poster impression quantification processing is executed by, for example, operating, in the poster generation apparatus, an impression learning application configured to learn the impression of a poster image in advance before poster generation processing.
In S701, the CPU 101 executes acquisition of subjective evaluation of the impression of a poster.
In S702, the CPU 101 executes factor analysis of the acquired subjective evaluation result. If the subjective evaluation result is used directly, the number of dimensions equals the number of adjective pairs, and control becomes complex. It is therefore preferable to reduce the dimensions to a manageable number by an analysis method such as principal component analysis or factor analysis. In this embodiment, a description will be made assuming that the dimensions are reduced to four factors by factor analysis. This number naturally changes depending on the selection of adjective pairs in the subjective evaluation and the factor analysis method. Also, the output of the factor analysis is standardized. That is, each factor is scaled such that its average is 0 and its variance is 1 over the posters used for the analysis. Hence, −2, −1, 0, +1, and +2 of an impression designated by the target impression designation unit 204 can directly be made to correspond to −2σ, −1σ, the average value, +1σ, and +2σ of each factor, respectively, and calculation of the distance between a target impression and an estimated impression, described later, is facilitated. Note that in this embodiment, a premium nature, a familiarity, a dynamism, and a profoundness shown in
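The standardization of the factor scores described above can be sketched as follows. This is a minimal illustration in Python; the function name and data are hypothetical and not part of the embodiment:

```python
import numpy as np

def standardize_factor_scores(scores):
    """Scale each factor so that its average is 0 and its
    variance is 1 over the posters used for the analysis."""
    scores = np.asarray(scores, dtype=float)
    return (scores - scores.mean(axis=0)) / scores.std(axis=0)

# A target impression of -2..+2 designated on a slider then
# corresponds directly to -2 sigma..+2 sigma of a factor.
```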
In S703, the CPU 101 associates a poster image with an impression. Quantification is possible for posters that have undergone the subjective evaluation described above, but the impression of a poster to be created from now on must be estimated without subjective evaluation. The association between a poster image and an impression can be implemented by learning a model that estimates an impression from a poster image using, for example, a deep learning method using a Convolutional Neural Network (CNN) or a machine learning method using a decision tree. In this embodiment, supervised deep learning using a CNN is performed with a poster image as the input and the four factors as the outputs. That is, a deep learning model is created by learning from poster images that have undergone subjective evaluation, with the corresponding impressions as the correct answers, and an unknown poster image is input to the learned model, thereby estimating its impression.
Note that the deep learning model created above is stored in, for example, the HDD 104, and the impression estimation unit 218 deploys the deep learning model stored in the HDD 104 onto the RAM 103 and executes it. The impression estimation unit 218 forms an image from poster data acquired from the layout unit 217, and estimates the impression of the poster by operating, by the CPU 101 or the GPU 109, the deep learning model deployed on the RAM 103. Note that in this embodiment, the deep learning method is used, but the present invention is not limited to this. For example, if a machine learning method such as a decision tree is used, a feature amount such as a luminance average value or an edge amount of a poster image may be extracted by image analysis, and a machine learning model that estimates an impression based on the feature amount may be created.
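The feature-amount alternative mentioned above (a luminance average value and an edge amount feeding a machine learning model such as a decision tree) can be sketched as follows; the feature definitions are illustrative assumptions, not the embodiment's actual analysis:

```python
import numpy as np

def extract_features(rgb):
    """Image features for a feature-based impression model:
    mean luminance and an edge amount (mean absolute
    gradient of the luminance channel)."""
    rgb = np.asarray(rgb, dtype=float)
    # Rec.601 luma used as a stand-in for luminance
    y = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    edge = np.abs(np.diff(y, axis=1)).mean() + np.abs(np.diff(y, axis=0)).mean()
    return np.array([y.mean(), edge])
```

Such a feature vector would then be fed to, for example, a decision-tree regressor trained against the subjectively evaluated impressions.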
The flowchart shown in
In S901, the poster creation application displays the application activation screen 501 on the display 105. The user inputs each setting via the UI screen of the application activation screen 501 using the keyboard 106 or the pointing device 107.
In S902, the poster creation condition designation unit 201, the text designation unit 202, the image designation unit 203, and the target impression designation unit 204 acquire corresponding settings from the application activation screen 501.
In S903, the skeleton selection unit 214, the coloring pattern selection unit 215, and the font selection unit 216 decide the numbers of skeletons, coloring patterns, and fonts, respectively, to be selected in accordance with the creation count designated by the poster creation condition designation unit 201. In this embodiment, the layout unit 217 generates as many poster data as the number of skeletons × the number of coloring patterns × the number of fonts by a method to be described later. The numbers of skeletons, coloring patterns, and fonts to be selected are decided such that the number of posters generated at this time meets or exceeds the creation count. In this embodiment, for example, the number of skeletons, the number of coloring patterns, and the number of fonts are decided in accordance with
For example, if the creation count is 6, the selection count is 3. The number of poster data generated by the layout unit 217 is then 3×3×3=27, and the poster selection unit 219 selects six poster data among these. Thus, the poster selection unit 219 can select posters whose overall impressions better match the target impression, from poster data generated in a number equal to or greater than the creation count.
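The relationship between the creation count and the selection count can be illustrated by the following sketch, which picks the smallest count whose cube covers the creation count; note that the embodiment's table may deliberately choose a larger count (for example, 3 for a creation count of 6) in order to oversample:

```python
def decide_selection_count(creation_count):
    """Smallest n such that n**3 (skeletons x coloring
    patterns x fonts) covers the requested creation count."""
    n = 1
    while n ** 3 < creation_count:
        n += 1
    return n
```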
In S904, the image acquisition unit 211 acquires image data. More specifically, the image acquisition unit 211 reads out an image file in the HDD 104, which is designated by the image designation unit 203, to the RAM 103.
In S905, the image analysis unit 212 executes analysis processing on the image data acquired in S904 and acquires information indicating feature amounts. Examples of the information indicating feature amounts are meta-information stored in the image and image feature amounts that can be acquired by analyzing the image. These pieces of information are used in the object recognition processing that is the analysis processing. Note that in this embodiment, object recognition processing is executed as the analysis processing, but the present invention is not limited to this, and other analysis processing may be executed. Furthermore, the process of S905 may be omitted. Details of the processing performed by the image analysis unit 212 in S905 will be described below.
The image analysis unit 212 executes object recognition processing for the image acquired in S904. A known method can be used for the object recognition processing. In this embodiment, objects are recognized by a discriminator created by deep learning. The discriminator outputs, as a value from 0 to 1, a likelihood indicating whether each pixel of the image belongs to an object, and an object whose likelihood exceeds a threshold is recognized as existing in the image. By recognizing objects in the image, the image analysis unit 212 acquires the type of each object, such as a face, a pet such as a dog or a cat, a flower, a food, a building, an ornament, or a landmark, and its position.
In S906, the skeleton acquisition unit 213 acquires a skeleton matching various kinds of set conditions. In this embodiment, one skeleton is described in one file and stored in the HDD 104. The skeleton acquisition unit 213 sequentially reads out skeleton files from the HDD 104 to the RAM 103, leaves skeletons matching the conditions on the RAM 103, and erases skeletons not matching the conditions from the RAM 103.
Here,
In S921, concerning a skeleton loaded into the RAM 103, the skeleton acquisition unit 213 determines whether the poster size 513 designated by the poster creation condition designation unit 201 matches the size of the skeleton. Note that matching of sizes is confirmed here, but matching of aspect ratios may suffice. In this case, the skeleton acquisition unit 213 enlarges or reduces the coordinate system of the loaded skeleton, thereby acquiring a skeleton matching the poster size designated by the poster creation condition designation unit 201.
In S922, the skeleton acquisition unit 213 determines whether the application purpose category 515 designated by the poster creation condition designation unit 201 matches the category of the skeleton. For a skeleton to be used only for a specific application purpose, the application purpose category is described in the skeleton file, and the skeleton is not acquired unless that application purpose category is selected. This prevents the skeleton from being used in other application purpose categories in a case where the design of the skeleton is specialized for a specific application purpose, for example, where a graphic draws a pattern reminiscent of a school or a pattern of sports equipment. Note that if no application purpose category is set on the application activation screen 501, S922 is skipped.
In S923, the skeleton acquisition unit 213 determines whether the number of image objects in the loaded skeleton matches the number of images acquired by the image acquisition unit 211.
In S924, the skeleton acquisition unit 213 determines whether the character objects in the loaded skeleton match the character information designated by the text designation unit 202. More specifically, the skeleton acquisition unit 213 determines whether each type of character information designated by the text designation unit 202 exists in the skeleton. For example, assume that character strings are designated in the title box 502 and the text box 504 on the application activation screen 501, and the subtitle box 503 is left blank. In this case, all character objects in the skeleton are searched; if both a character object for which "title" is set as the type of character information in the metadata and a character object for which "text" is set are found, the character objects are determined to match, and otherwise they are determined not to match.
As described above, the skeleton acquisition unit 213 holds, on the RAM 103, the skeletons for which the skeleton size, the application purpose category, the number of image objects, and the types of character objects all match the set conditions. Note that in this embodiment, the skeleton acquisition unit 213 determines all skeleton files on the HDD 104, but the present invention is not limited to this. For example, the poster creation application may hold, on the HDD 104, a database that associates the file path of each skeleton file with the search conditions (the skeleton size, the number of image objects, and the types of character objects). In this case, the skeleton acquisition unit 213 loads only the matching skeleton files found by searching the database from the HDD 104 to the RAM 103, thereby acquiring the skeleton files at a high speed. The explanation will return to
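The acquisition conditions of S921 to S924 can be summarized as a single predicate, sketched below; the dictionary field names are illustrative, and the aspect-ratio relaxation of S921 is omitted for brevity:

```python
def skeleton_matches(skeleton, size, category, num_images, text_types):
    """True if the skeleton satisfies the conditions of
    S921-S924 (dictionary field names are illustrative)."""
    if skeleton["size"] != size:                           # S921
        return False
    # S922: a category-restricted skeleton passes only when its
    # category is selected; an unrestricted skeleton always passes.
    restricted = skeleton.get("category")
    if restricted is not None and restricted != category:
        return False
    if skeleton["num_image_objects"] != num_images:        # S923
        return False
    if set(skeleton["text_types"]) != set(text_types):     # S924
        return False
    return True
```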
In S907, the skeleton selection unit 214 selects a skeleton matching the target impression designated by the target impression designation unit 204 among the skeletons acquired in S906. Here,
Here, as for a method of setting N, a fixed value may be set, or the value may be changed depending on the conditions designated by the poster creation condition designation unit 201. For example, if a creation count of 6 is set in the creation count box 514 on the application activation screen 501, the poster generation unit 210 generates six posters. The layout unit 217 to be described later generates posters by combining skeletons, coloring patterns, and fonts selected by the skeleton selection unit 214, the coloring pattern selection unit 215, and the font selection unit 216. For this reason, since 2×2×2=8 posters can be generated by selecting, for example, two skeletons, two coloring patterns, and two fonts, the condition that the creation count is 6 can be satisfied. In this way, the number N of skeletons to be selected may be decided in accordance with the conditions designated by the poster creation condition designation unit 201.
Also, the range of each impression in the skeleton impression table shown in
Note that the skeleton impression table is created in advance and stored in the HDD 104 by, for example, fixing the coloring patterns, the fonts, and the images and character data to be arranged on the skeletons, generating poster images based on the skeletons, and estimating the impressions thereof. That is, the impressions of poster images that use the same character colors and the same images but differ in their arrangements are estimated, thereby forming a table showing the relative characteristics between the skeletons. At this time, processing of canceling the impression contributed by the used coloring pattern or image is preferably performed by standardizing the whole set of estimated impressions or by averaging the impressions of a plurality of poster images generated from one skeleton using a plurality of coloring patterns or images. This makes it possible to form a table of the influences of arrangements on impressions, in which, for example, the impression of a skeleton including only a small image is determined not by the image but by elements such as graphics and characters, and a high dynamism is obtained when the arrangement of an image or characters is tilted.
In S908, the coloring pattern selection unit 215 selects a coloring pattern matching the target impression designated by the target impression designation unit 204. By the same method as in S907, the coloring pattern selection unit 215 refers to an impression table corresponding to coloring patterns and selects a coloring pattern in accordance with the target impression.
In S909, the font selection unit 216 selects a combination of fonts matching the target impression designated by the target impression designation unit 204. By the same method as in S907, the font selection unit 216 refers to an impression table corresponding to fonts and selects a font in accordance with the target impression.
In S910, the layout unit 217 sets the character information, the images, the coloring patterns, and the fonts on the skeletons selected by the skeleton selection unit 214, and generates posters.
S910 and processing of the layout unit 217 will be described next in detail with reference to
First, S910 will be described in detail with reference to
In S1301, the layout unit 217 lists all combinations of the skeletons acquired from the skeleton selection unit 214, the coloring patterns acquired from the coloring pattern selection unit 215, and the fonts acquired from the font selection unit 216. The layout unit 217 performs the following layout processing sequentially for the combinations, thereby generating poster data. For example, if the number of skeletons acquired from the skeleton selection unit 214 is 3, the number of coloring patterns acquired from the coloring pattern selection unit 215 is 2, and the number of fonts acquired from the font selection unit 216 is 2, the layout unit 217 generates 3×2×2=12 poster data. Next, in S1301, the layout unit 217 selects one of the listed combinations and executes the processes of S1302 to S1307.
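The enumeration in S1301 is a Cartesian product of the three element lists, which can be sketched as follows (the element names are placeholders):

```python
import itertools

skeletons = ["s1", "s2", "s3"]   # from the skeleton selection unit
colorings = ["c1", "c2"]         # from the coloring pattern selection unit
fonts = ["f1", "f2"]             # from the font selection unit

# One poster is laid out per (skeleton, coloring pattern, font) tuple.
combinations = list(itertools.product(skeletons, colorings, fonts))
```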
In S1302, the coloring assignment unit 1201 assigns a coloring pattern acquired from the coloring pattern selection unit 215 to a skeleton acquired from the skeleton selection unit 214.
In S1303, the image arranging unit 1202 arranges the image data acquired from the image analysis unit 212 on the skeleton data acquired from the coloring assignment unit 1201, based on the accompanying analysis information. In this embodiment, the image arranging unit 1202 assigns image data 1401 to the image object 1504 in the skeleton. Also, if the aspect ratio of the image object 1504 and that of the image data 1401 are different, the image arranging unit 1202 performs trimming such that the aspect ratio of the image data 1401 matches that of the image object 1504. More specifically, trimming is performed based on the position of the object obtained by the image analysis unit 212 analyzing the image data 1401, such that the object region lost by the trimming is minimized. Note that the trimming method is not limited to this, and another trimming method may be used, for example, trimming the center of the image, or devising the composition such that face positions form a triangular composition. The image arranging unit 1202 outputs the skeleton data that has undergone the image arrangement to the image correction unit 1203.
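A simple variant of the trimming, cropping the center of the image to the aspect ratio of the image object, can be sketched as follows; the embodiment instead positions the crop so that the recognized object region lost to trimming is minimized:

```python
def center_trim(img_w, img_h, obj_w, obj_h):
    """Centered crop (x, y, w, h) of an img_w x img_h image
    matching the aspect ratio of an obj_w x obj_h image object."""
    target = obj_w / obj_h
    if img_w / img_h > target:      # image too wide: cut the sides
        new_w = round(img_h * target)
        return ((img_w - new_w) // 2, 0, new_w, img_h)
    new_h = round(img_w / target)   # image too tall: cut top/bottom
    return (0, (img_h - new_h) // 2, img_w, new_h)
```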
In S1304, the image correction unit 1203 acquires the skeleton data that has undergone the image arrangement from the image arranging unit 1202, and performs correction for the image arranged on the skeleton. In this embodiment, if the resolution of the image is insufficient, up-sampling processing using super-resolution processing is performed. First, the image correction unit 1203 determines whether the image arranged on the skeleton satisfies a predetermined resolution. For example, assume that an image having a size of 1,600 px×1,200 px is assigned to a region having a size of 200 mm×150 mm on the skeleton. In this case, the print resolution of the image can be calculated by
Next, upon determining that the print resolution of the image is less than a threshold, the image correction unit 1203 raises the resolution by super-resolution processing. On the other hand, upon determining that the print resolution of the image is equal to or more than the threshold, and the resolution is sufficient, image correction is not particularly performed. In this embodiment, if the print resolution of the image is less than 300 dpi, super-resolution processing is performed.
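The resolution check can be sketched as follows; for the example above, 1,600 px printed over 200 mm yields 203.2 dpi, which is below the 300 dpi threshold, so super-resolution would be applied:

```python
MM_PER_INCH = 25.4

def print_resolution_dpi(pixels, size_mm):
    """Resolution when `pixels` pixels are printed over `size_mm`."""
    return pixels / (size_mm / MM_PER_INCH)

def needs_super_resolution(px_w, px_h, mm_w, mm_h, threshold_dpi=300):
    """True if either axis prints below the threshold."""
    return (print_resolution_dpi(px_w, mm_w) < threshold_dpi
            or print_resolution_dpi(px_h, mm_h) < threshold_dpi)
```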
In S1305, the font setting unit 1204 sets the font acquired from the font selection unit 216 for the skeleton data that has undergone the image correction, which is acquired from the image correction unit 1203.
In S1306, the text arranging unit 1205 arranges the text designated by the text designation unit 202 on the skeleton data that has undergone the font setting, which is acquired from the font setting unit 1204. In this embodiment, each text shown in
In S1307, the text decoration unit 1206 adds a decoration to each character object in the skeleton that has undergone the text arrangement, which is acquired from the text arranging unit 1205. In this embodiment, if the color difference between the title characters and the background region thereof is equal to or less than a threshold, processing of adding an outline to the title characters is performed. This improves the readability of the title. The text decoration unit 1206 outputs the decorated skeleton data, that is, poster data that has undergone all layout processes to the impression estimation unit 218.
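The outline decision can be sketched as follows; the plain RGB Euclidean distance and the threshold value are stand-ins for whatever color-difference formula and threshold the embodiment actually uses:

```python
import math

def needs_outline(title_rgb, background_rgb, threshold=100.0):
    """Add an outline when the color difference between the
    title characters and their background region is at or
    below a threshold, to keep the title readable."""
    return math.dist(title_rgb, background_rgb) <= threshold

# Near-white text on a near-white background needs an outline.
```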
In S1308, the layout unit 217 determines whether all poster data are generated. Upon determining that poster data are generated using all combinations of skeletons, coloring patterns, and fonts, the layout unit 217 ends the layout processing, and the process advances to S913. Upon determining that not all poster data are generated, the process returns to S1301, and poster data is generated using a combination not yet used for generation.
S910 has been described above. The explanation will return to
In S911, the device color gamut acquisition unit 220 acquires a device color gamut linked with the printer designated by the printer designation unit 206, and outputs it to the gamut mapping unit 221.
In S912, the gamut mapping unit 221 performs gamut mapping on the image data obtained by rendering each poster data acquired from the layout unit 217, such that all colors fit within the device color gamut acquired from the device color gamut acquisition unit 220.
In S2402, the gamut mapping unit 221 executes gamut mapping for the image data rendered in S2401.
In S2403, the gamut mapping unit 221 determines whether all gamut mapping methods designated by the gamut mapping designation unit 207 are executed for the image data rendered in S2401. Upon determining that all gamut mapping methods are executed, the gamut mapping unit 221 advances to S2404, and otherwise, returns to S2402.
In S2404, the gamut mapping unit 221 determines whether the processes of S2401 to S2403 are executed for all poster data generated in S910. If all processes are executed, the processing is ended. Otherwise, the process returns to S2401.
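Producing one mapped variant of each rendered image per designated mapping method (S2402 to S2403) can be sketched as follows; the two toy mapping functions are simplistic stand-ins for real gamut mapping algorithms:

```python
import numpy as np

def clip_to_gamut(rgb, hi=1.0):
    """Colorimetric-style stand-in: clip out-of-range channels."""
    return np.clip(np.asarray(rgb, dtype=float), 0.0, hi)

def compress_to_gamut(rgb, hi=1.0):
    """Perceptual-style stand-in: scale each pixel so its
    largest channel fits, preserving channel ratios."""
    rgb = np.asarray(rgb, dtype=float)
    peak = rgb.max(axis=-1, keepdims=True)
    scale = np.where(peak > hi, hi / np.maximum(peak, 1e-12), 1.0)
    return rgb * scale

# One mapped variant per designated method (S2402-S2403).
MAPPING_METHODS = {"colorimetric": clip_to_gamut, "perceptual": compress_to_gamut}

def map_with_all_methods(image):
    return {name: fn(image) for name, fn in MAPPING_METHODS.items()}
```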
Referring back to
In S914, the poster selection unit 219 selects the poster image data to be output to the display 105 (presented to the user), based on the poster image data acquired from the impression estimation unit 218 and the estimated impressions linked with the poster image data. In this embodiment, the poster selection unit 219 selects posters for which the value of the distance between the target impression and the estimated impression of the poster is equal to or less than a predetermined threshold.
Note that in this embodiment, a Euclidean distance is used as the distance. The smaller the Euclidean distance is, the closer the target impression and the estimated impression are. The measure calculated by the poster selection unit 219 is not limited to the Euclidean distance; any measure that can be calculated between vectors, such as a Manhattan distance or a cosine similarity, can be used.
Also, if the number of selected posters is less than the creation count designated by the poster creation condition designation unit 201, the poster selection unit 219 selects as many posters as the shortage in ascending order of the distance between the target impression and the estimated impression of the poster. Note that in this embodiment the poster selection unit 219 selects as many posters as the shortage, but the present invention is not limited to this. For example, if the number of posters selected by the poster selection unit 219 is less than the creation count, a message indicating the shortage may be displayed on the preview screen 601. Alternatively, the poster selection unit 219 may select as many posters as the shortage and then display them on the preview screen 601 such that posters for which the distance between the target impression and the estimated impression is equal to or less than the threshold can be discriminated from posters for which the distance is more than the threshold. Also, for example, if the number of selected posters is not sufficient, the process may return to S903 to increase the numbers of skeletons, coloring patterns, and fonts to be selected.
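The selection of S914, including the shortage handling described above, can be sketched as follows; the threshold value is illustrative:

```python
import numpy as np

def select_posters(estimated, target, creation_count, threshold=1.0):
    """Indices of posters whose impression distance to the target
    is at or below the threshold; any shortage relative to the
    creation count is filled in ascending order of distance."""
    dists = np.linalg.norm(
        np.asarray(estimated, dtype=float) - np.asarray(target, dtype=float),
        axis=1)
    order = np.argsort(dists)
    within = [int(i) for i in order if dists[i] <= threshold]
    if len(within) >= creation_count:
        return within
    beyond = [int(i) for i in order if dists[i] > threshold]
    return within + beyond[:creation_count - len(within)]
```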
In S915, the poster display unit 205 renders the poster data selected by the poster selection unit 219 and outputs the poster image to the display 105. That is, the preview screen 601 shown in
In S916, the printing unit 208 prints the poster image data acquired from the poster display unit 205 via the printing apparatus 120. Note that the poster image data to be printed in this embodiment has undergone gamut mapping in S912 and therefore has colors within the device color gamut. For this reason, the printing apparatus 120 need not perform gamut mapping again, and if printing is performed in the Colorimetric mode, the colors intended at the time of poster generation can directly be printed. Note that although one printing apparatus is used in this embodiment, if there are a plurality of printing apparatuses, a printing apparatus designated in the printer designation box 516 can be used for printing.
The poster generation processing procedure of generating a poster according to the designation of an impression by the user has been described above.
As described above, according to this embodiment, it is possible to generate a poster that expresses an impression required by the user. More specifically, in this embodiment, the elements constituting a poster, such as a skeleton, a coloring pattern, and a font, are combined based on the target impression, thereby generating a plurality of variations of poster candidates according to the target impression. Furthermore, after one or a plurality of gamut mapping processes are applied to the poster candidates, the impressions of the entire posters are estimated, and a poster close to the target impression is selected from the poster candidates. It is therefore possible to generate a poster in which not only the impression of each single element but also the impression as a whole complies with the user intention after printing.
When image data is printed, the colors in the digital data and the colors after printing may not match, depending on the model of the printer, the type of print medium, and the gamut mapping method. For this reason, even if a poster image with a certain impression is created as image data, the colors change upon printing, resulting in a different impression. In this embodiment, since impression estimation and poster image selection are performed on poster image data that has undergone gamut mapping, a poster image close to the target impression can be generated in accordance with the printer and the medium. More specifically, for example, assume that in this embodiment, as the target impression on the application activation screen 501, the premium nature is designated as −1, the familiarity is designated as +1, and the dynamism and the profoundness are set to off. At this time, for example, the poster image 602 on the preview screen 601 is generated with an estimated impression close to the target impression, such as a premium nature of −1.2, a familiarity of +0.9, a dynamism of +0.2, and a profoundness of −1.3. Furthermore, since the poster image 602 is formed within a color gamut whose colors can be reproduced in printing, the same impression can be maintained even after printing.
In the first embodiment, as the objects used to perform the operation of setting the target impression, the impression sliders 508 to 511 of the application activation screen 501 are used. However, the target impression setting method is not limited to this.
Also, in this embodiment, poster data has been described as an example of the commercial material data. However, the operation according to this embodiment can be applied not only to a poster but also to other types of commercial material; for example, a banner or the like may be used instead of a poster. In this case, in place of the poster preview shown in
The second embodiment will be described below concerning differences from the first embodiment. In the first embodiment, processing of selecting a skeleton, a coloring pattern, and fonts that are the constituent elements of a poster based on a target impression and generating a poster has been described. In the second embodiment, a combination generation unit searches for a combination of constituent elements of a poster, which makes the impression of the whole poster close to the target impression, based on a genetic algorithm. This makes it possible to more flexibly select constituent elements of a poster, which are optimum for the target impression, without pre-calculation such as a skeleton impression table, a coloring pattern impression table, and a font impression table.
The combination generation unit 1701 acquires one or a plurality of skeletons from a skeleton acquisition unit 213, poster data and a poster estimated impression from an impression estimation unit 218, and a target impression from a target impression designation unit 204. The combination generation unit 1701 also acquires lists of coloring patterns and fonts from an HDD 104.
Furthermore, the combination generation unit 1701 acquires a list of gamut mapping methods from the HDD 104. The combination generation unit 1701 generates combinations of the poster constituent elements (the skeletons, the coloring patterns, and the fonts) and the gamut mapping methods used for poster generation. The combination generation unit 1701 outputs the generated combinations of poster constituent elements to a layout unit 217. Also, the combination generation unit 1701 outputs the generated gamut mapping methods to a gamut mapping unit 221.
A poster selection unit 1702 selects, from the poster data acquired from the impression estimation unit 218, a poster for which the value of the distance between the estimated impression of the poster and the target impression designated by the target impression designation unit 204 is equal to or less than a threshold, and stores it in a RAM 103. In addition, the poster selection unit 1702 determines whether the number of selected and stored posters has reached a creation count designated in a creation count box 514.
In the description of S1801 after S906, the operation at the time of the first execution and the operation from the second loop onward will be described separately. When executing S1801 for the first time, the combination generation unit 1701 acquires the tables of skeletons, coloring patterns, and fonts to be used for poster generation.
After that, the combination generation unit 1701 executes the processes of S910 to S1802 for all the generated combinations.
Next, from the second loop of S1801, the combination generation unit 1701 calculates the value of the distance between the target impression and the poster estimated impression acquired from the impression estimation unit 218 and links it with the combination table.
This makes it possible to efficiently search for a combination based on the value of the distance between the target impression and the estimated impression. Note that in this embodiment, 100 combinations are generated, but the present invention is not limited to this. Also, tournament selection and uniform crossover are used; however, the present invention is not limited to this, and, for example, another method such as ranking selection, roulette selection, or single-point crossover may be used. In addition, mutation may be incorporated to eliminate the risk of falling into a local optimal solution. A skeleton (arrangement), a coloring pattern, and a font are used as the constituent elements of the poster to be searched for, but other constituent elements may be used. For example, a plurality of patterns to be inserted into the background of the poster may be prepared, and whether each pattern should be used may be decided by the search. When the constituent elements to be searched for are increased, more variations of posters can be generated, and the range of impression expression can be widened. As the gamut mapping methods, all methods stored in the HDD 104 are used. However, like the gamut mapping designation region 517 shown in
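The search described above can be sketched as a minimal genetic algorithm with tournament selection and uniform crossover; the population size, generation count, and fitness function below are illustrative assumptions (in the embodiment, the fitness would be the distance between the estimated impression of the laid-out, gamut-mapped poster and the target impression):

```python
import random

def genetic_search(elements, fitness, pop_size=100, generations=20, seed=0):
    """Search combinations of poster constituent elements (and a
    gamut mapping method) for a small fitness value, using
    tournament selection and uniform crossover."""
    rng = random.Random(seed)
    keys = list(elements)

    def tournament(pop):
        a, b = rng.sample(pop, 2)
        return a if fitness(a) <= fitness(b) else b

    def crossover(p1, p2):
        # uniform crossover: each gene taken from a random parent
        return {k: (p1 if rng.random() < 0.5 else p2)[k] for k in keys}

    pop = [{k: rng.choice(elements[k]) for k in keys} for _ in range(pop_size)]
    for _ in range(generations):
        pop = [crossover(tournament(pop), tournament(pop))
               for _ in range(pop_size)]
    return min(pop, key=fitness)
```

Mutation is omitted here; as noted above, it could be added to reduce the risk of converging to a local optimum.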
In S1802 after S913, the poster selection unit 1702 calculates the value of the distance between the poster estimated impression and the target impression, like S1801, and creates the same table as in
In S1803, the poster selection unit 1702 determines whether the number of poster images stored in the RAM 103 in S1802 has reached the creation count designated in the creation count box 514. Upon determining that the number of poster images has reached the creation count, the poster selection unit 1702 ends the poster generation processing. Upon determining that the number of poster images has not reached the creation count, the poster selection unit 1702 returns to S1801. That is, the above-described process of S1801 in the second loop is executed, and the processes of S1801 to S1802 are repetitively executed until the number of poster images stored in the RAM 103, for which the value of the distance to the target impression is equal to or less than the threshold, reaches the designated creation count. Note that if the number of stored poster images for which the value of the distance to the target impression is equal to or less than the threshold is equal to or more than the designated creation count, the poster selection unit 1702 may compare the values of the distances of the stored poster images, and finally store only the poster images having a smaller value in the RAM 103. A poster image judged to have a larger value based on the comparison result may be deleted from the RAM 103.
Note that in this embodiment, a combination of poster constituent elements is searched for by the genetic algorithm. However, the search method is not limited to this, and another search method such as neighborhood search or tabu search may be used.
As described above, according to this embodiment, a combination of constituent elements used in a poster is searched for, thereby generating a poster for which the impression of the entire generated poster is close to the target impression. This is particularly effective when generating a poster in accordance with an image or character information input by the user. For example, consider a case where although an image has a dynamic impression, a poster with a restrained impression should be generated as a whole. In this embodiment, it is possible to evaluate the impression of the whole poster and search for a combination of a skeleton, a coloring pattern, and fonts that makes the impression close to the target impression. Hence, to suppress the impression of an image, the constituent elements of the poster can be controlled in accordance with the image by, for example, using a skeleton with a small image area or using more restrained fonts and colors. Also, in this embodiment, even if the user does not understand the characteristics of gamut mapping methods, a gamut mapping method for obtaining a suitable impression can automatically be selected by the search. According to this embodiment, it is possible to flexibly find a combination of constituent elements and a gamut mapping method optimum for the impression of a whole poster and create many variations of the poster close to the target impression.
The third embodiment will be described below concerning differences from the first and second embodiments. In the first and second embodiments, an example in which a poster is generated by controlling the constituent elements of the poster based on a target impression has been described. In the third embodiment, an example in which a template formed by combining a skeleton, a coloring pattern, and a font is prepared in advance, and a layout unit generates a poster only by setting images and character information will be described. This makes it possible to generate a poster matching a target impression by simpler processing.
The template acquisition unit 2101 acquires, from an HDD 104, a template group that matches conditions designated by a poster creation condition designation unit 201, a text designation unit 202, and an image acquisition unit 211. In this embodiment, the template indicates a skeleton in which a coloring and fonts are set in advance. The template acquisition unit 2101 outputs the acquired template group to a layout unit 2102.
The layout unit 2102 lays out an image acquired from the image acquisition unit 211 and a text acquired from the text designation unit 202 on each template acquired from the template acquisition unit 2101, thereby generating poster data. The layout unit 2102 outputs the generated poster data group to an impression estimation unit 218.
In S2201 after S905, the template acquisition unit 2101 acquires, from the HDD 104, one or a plurality of templates matching the conditions designated by the poster creation condition designation unit 201, the text designation unit 202, and the image acquisition unit 211. In this embodiment, the template indicates a skeleton in which a coloring and fonts are set in advance. Also, one template is described in one file and stored in the HDD 104.
Like the skeleton acquisition unit 213, the template acquisition unit 2101 sequentially reads out the template files from the HDD 104 to a RAM 103, leaves templates matching the set conditions on the RAM 103, and erases templates not matching the conditions from the RAM 103. The template acquisition unit 2101 then outputs the one or plurality of acquired templates to the layout unit 2102.
In S2202, the layout unit 2102 lays out an image acquired from the image acquisition unit 211 and a text acquired from the text designation unit 202 on each template acquired from the template acquisition unit 2101, thereby generating poster data. The layout unit 2102 outputs the one or plurality of generated poster data to the impression estimation unit 218. Note that since the setting of the image is the same as that of the image arranging unit 1202, and setting of the character information is the same as that of the text arranging unit 1205, a description thereof will be omitted.
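The template-based flow of S2201 and S2202 can be sketched as follows. The `Template` fields and the matching conditions are illustrative assumptions, not the actual template file format:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Template:
    """A skeleton in which a coloring and fonts are set in advance.
    The fields below are assumed for illustration."""
    size: str          # poster size condition, e.g. "A4"
    image_slots: int   # number of image frames in the skeleton
    coloring: List[str]  # preset coloring pattern
    fonts: List[str]     # preset fonts

def acquire_templates(all_templates, size, num_images):
    """Read templates one by one and keep only those matching the
    designated conditions (S2201); non-matching templates are discarded."""
    return [t for t in all_templates
            if t.size == size and t.image_slots == num_images]

def layout(template, images, texts):
    """Set the images and character information on a template to
    generate one poster data (S2202)."""
    return {"template": template, "images": images, "texts": texts}

def generate_poster_data(all_templates, images, texts, size):
    matched = acquire_templates(all_templates, size, len(images))
    return [layout(t, images, texts) for t in matched]
```

The generated poster data group would then be passed to the impression estimation unit, as in the earlier embodiments.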
As described above, according to this embodiment, the templates in which various colorings and fonts are set in advance are prepared, thereby generating a poster close to the target impression by simple processing.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2023-163583, filed Sep. 26, 2023, which is hereby incorporated by reference herein in its entirety.