For purposes of printing a document on a digital printing press, a document description file (a portable document format (PDF) file, for example) may be processed to generate raster image data for the press. The raster image data represents raster images for pages of the document. A raster image is a bit map, which defines a grid of pixels or pixel cells of a document page and defines colors or continuous tones for the pixels/pixel cells.
Variable Data Printing (VDP) refers to a form of digital printing in which some variable objects, such as text, graphics and images, may change from one printing of a document to the next; and other, reoccurring, or static, objects of the document do not change. As examples, VDP may be used for purposes of printing brochures, advertisements, announcements, and so forth, with information (mailing addresses, for example) that changes among the copies that are produced by the printing press. VDP may present challenges for print shops and their content creators due to the changing content.
In general, to print a VDP document, a print server may analyze a document file that describes the VDP document, such as a portable document format (PDF) file, to identify the static and variable objects of the document. Once identified, the static objects may be processed to derive the corresponding raster image data, and the raster image data for the static objects may then be reused until no longer needed. Through the reuse of the raster image data for static objects, computing intensive operations that may otherwise be used to produce raster images for the reoccurring static objects may be reduced.
The print server may include a raster image processor that generates the raster image data for the pages of a document to be printed. One way for the raster image processor to generate the raster image data for a given document page is for the processor to allocate a region of memory for the entire page and use the memory region as a canvas. For this technique, for each object of the document page, the raster image processor may write data to the region of memory to effectively form the object on the canvas and blend the object with any other objects that partially or wholly share the same space on the canvas.
Processing a given document page in this manner to generate raster data may, however, be relatively inefficient, as such operations may consume a significant amount of memory and may be associated with relatively intensive computing operations. In accordance with example implementations that are described herein, a raster image processor, or recomposition engine, generates raster image data for a document page one pixel cell row, or line, at a time; communicates the raster image data to a digital printing press; and then repeats the process until raster data for all of the pixel lines has been communicated to the digital printing press. Generating raster image data in this manner reduces the load on memory and computing resources, as further described herein.
In the context of this application, a “pixel cell,” or “cell,” is associated with an atomic spatial unit of an image, such as a raster image of a document page. For example, the raster image of the document page may be viewed as being formed from a rectangular grid of pixel cells. As an example, in accordance with some implementations, a “cell” may be a single pixel that is associated with a particular color value. As another example, in accordance with some implementations, a “cell” may be a collection, or group, of spatially adjacent pixels, and the pixels of the cell may be associated with the same color. For example, the cell may be a block of 4×4 pixels that is associated with a particular color (i.e., the color is homogeneous for the cell). In accordance with further example implementations, the pixel cell may be a block of pixels, which is associated with a continuous tone, such as, for example, an Indigo compressed format (ICF) continuous tone, or “contone.” In the following description, a pixel cell has an associated “value,” and the “value” may be a color, a contone, or another property for the pixel cell.
In accordance with example implementations that are described herein, the image of an object may be partitioned into pixel cells called “object source cells,” or “source cells.” The raster image of a document page may be partitioned into pixel cells called “target cells.” Moreover, a document page may be associated with “cell lines,” which may be viewed as the raster image of the document page being horizontally partitioned into rows. In accordance with example implementations, a cell line extends across the width of the raster image, and the cell line has a height of one cell. As such, in accordance with example implementations, the number of cell lines is equal to the height of the raster image in pixels divided by the pixel height of the pixel cell, and the number of cells per cell line is equal to the width of the raster image in pixels divided by the pixel width of the pixel cell.
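The cell-line arithmetic described above can be illustrated with a short sketch. The function name and the page and cell dimensions below are illustrative assumptions, not values taken from any particular implementation:

```python
# Illustrative sketch of the cell-line partitioning described above.
# Page and cell dimensions here are hypothetical examples.

def cell_grid_dimensions(page_width_px, page_height_px,
                         cell_width_px, cell_height_px):
    """Return (cells_per_line, number_of_cell_lines) for a raster image."""
    cells_per_line = page_width_px // cell_width_px
    number_of_cell_lines = page_height_px // cell_height_px
    return cells_per_line, number_of_cell_lines

# Example: a page 6000 pixels wide and 8000 pixels tall, with 4x4-pixel cells.
print(cell_grid_dimensions(6000, 8000, 4, 4))  # -> (1500, 2000)
```

With a cell size of a single pixel, the grid degenerates to one cell per pixel and one cell line per pixel row, matching the single-pixel case described above.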
In general, the objects (text or graphics, which are defined by a PDF file, for example) that are part of a given document page may be associated with different layers. The “layer” associated with an object refers to a plane in which the object lies and which is parallel to the document page. The layer number, or order, represents a depth dimension, or order, of the layer, and in accordance with example implementations, the layer number increases with distance from the plane in which the background of the document page lies.
Objects of a document page may partially or entirely intersect, or overlap; and whether or not object portions that are overlapped are visible in the raster image of the document page depends on the degrees of opaqueness of the overlapping pixel cells. For example, for a given document page, a pixel cell A of object A that is associated with layer number 3 may overlap a pixel cell B of object B that is associated with layer number 2. For this example, object B is located behind object A, and the pixel cell B may or may not be visible, depending on the degree of opaqueness of the pixel cell A. In this manner, in accordance with example implementations, a given pixel cell may be opaque, nearly opaque, nearly transparent or transparent. In accordance with example implementations, an opaque or nearly opaque pixel cell means that the cell blocks enough light to prevent the viewing of a pixel cell that is disposed at the same position and associated with a lower order layer. Moreover, in accordance with example implementations, a transparent pixel cell means that an underlying pixel cell is fully viewable; and a nearly transparent pixel cell means that values (contones or colors, depending on the particular implementation) for the cell and an underlying cell are combined, or blended. The process of determining a pixel cell value for overlapping, or intersecting, pixel cells is called “blending” herein.
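The opacity rules above can be sketched for two overlapping cells. The opacity categories and the component-wise averaging blend below are assumptions for illustration only; the description does not fix exact thresholds or a blend formula:

```python
# Hypothetical sketch of the per-cell opacity rules described above.
# The four categories and the averaging blend are illustrative stand-ins.

OPAQUE, NEARLY_OPAQUE, NEARLY_TRANSPARENT, TRANSPARENT = range(4)

def blocks_lower_layers(opacity):
    """An opaque or nearly opaque cell hides any cell behind it."""
    return opacity in (OPAQUE, NEARLY_OPAQUE)

def blend(upper_value, upper_opacity, lower_value):
    """Combine an upper cell with the cell directly behind it."""
    if blocks_lower_layers(upper_opacity):
        return upper_value            # the lower cell is not visible
    if upper_opacity == TRANSPARENT:
        return lower_value            # the lower cell is fully viewable
    # Nearly transparent: combine the two values (a simple average here,
    # purely as an illustrative stand-in for a real contone blend).
    return tuple((u + l) // 2 for u, l in zip(upper_value, lower_value))

print(blend((200, 0, 0), NEARLY_TRANSPARENT, (0, 100, 0)))  # -> (100, 50, 0)
```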
In accordance with example implementations, the recomposition engine processes a document description file for purposes of generating a cell line table, which identifies, for each cell line associated with the document page, which objects are associated with the cell line. In other words, for each cell line, the cell line table identifies objects that are partially or fully contained in the cell line and the positions of the contained objects.
The recomposition engine, in accordance with example implementations, constructs an object intersection table from the cell line table for purposes of identifying intersections of objects (if any) for each cell line. Using the object intersection table, the recomposition engine may then process the cell lines (called “target cell lines” herein) one at a time and communicate raster image data to the digital printing press in corresponding units of data. In this manner, in accordance with example implementations, the recomposition engine may, for a given target cell line, determine whether objects overlap, or intersect, in the given target cell line, and based on a result of this determination, perform a blending of the intersecting source object cells (if any) for purposes of generating the raster image data for the target cell line. Moreover, as described herein, in accordance with example implementations, the recomposition engine may use the object intersection table to, for a given target cell line, optimize the generation of raster image data for the target cell line to accommodate the cases in which one or no objects are contained in the cell line or the case in which multiple objects exist for the cell line but do not intersect.
As a more specific example,
In accordance with example implementations, the raster image data 130 represents a single target cell line of a document page. As described herein, the recomposition engine 114 constructs a cell line table 118 based on the page description data 116. The cell line table 118 identifies, per cell line, which objects are partially or entirely contained in the cell line. Based on the cell line table 118, the recomposition engine 114 generates an object intersection table 120, which, per cell line, identifies the positions of any object(s) contained in the cell line and whether objects overlap in the cell line. Based on the object intersection table 120, the recomposition engine 114 may then generate the raster data 130 for each target cell line, as described herein.
Among the other features of the system 100, in accordance with some implementations, the print server 110 may include one or multiple processors 140 (one or multiple central processing units (CPUs), one or multiple processing cores, and so forth) and a memory 144. In general, the memory 144 is a non-transitory memory that may store data representing machine executable instructions (or software), which are executed by one or multiple processors 140 for purposes of performing techniques that are described herein. For example, in accordance with some implementations, the memory 144 may store machine executable instructions that, when executed by the processor(s) 140, may cause the processor(s) 140 to perform functions of the recomposition engine 114 as described herein. The memory 144 may further store data representing initial, intermediate and final versions of the raster image data 130, as well as other data, in accordance with example implementations.
In accordance with some implementations, the memory 144 may be formed from semiconductor storage devices, memristors, phase change memory devices, non-volatile memory devices, volatile memory devices, a combination of one or more of the foregoing memory storage technologies and so forth.
In accordance with some implementations, the recomposition engine 114 may be partially or wholly based in software (i.e., formed by one or more of the processors 140, executing machine executable instructions). However, in accordance with further example implementations, the recomposition engine 114 may be formed partially or in whole from one or multiple hardware circuits such as one or multiple field programmable gate arrays (FPGAs) or application specific integrated circuits (ASICs).
As depicted in
The corresponding cell line table 118 may contain rows 220, where each row 220 describes the object or objects that may be contained in a cell line that is associated with the row 220. In this manner, in accordance with example implementations, the cell line table 118 includes a column 208, which identifies the particular target page 204. The cell line table 118 includes a column 210 identifying the cell lines associated with the target page 204. The number of cell lines may be equal to or less than the number of vertical pixels on the page, depending on the cell size. For example, if the page 204 is 8000 pixels in height and the compression ratio is 4:1 (i.e., four associated pixels per cell), then there are 2000 cell lines and 2000 corresponding rows 220 of the table 118. Each cell line, in turn, may contain, or hold, zero, one or more objects. For example, the row 220-1 corresponds to example cell line number one, and example row 220-4 contains information for cell line number 4559.
In accordance with example implementations, the table 118 further includes a column 212 identifying an object count for the number of implicated objects in the target cell line. For the example page 204, the object 205 does not coincide with any other object. Therefore, for the example cell line table 118, the object 205 is represented in the table 118 (as Object ID=1) with corresponding rows 220 having an object count of “1.” To the contrary, the objects 206 and 207 (represented by Object ID=2 and Object ID=3, respectively) overlap, and correspondingly, rows 220 of the cell line table 118 have an object count of “2” where the objects 206 and 207 overlap. As more specific examples, row 220-1 of the cell line table 118 contains an object count of “0,” representing that no objects are contained in the first cell line of the page 204. As another example, row 220-3 of the table 118, which is associated with cell line number 12, contains the objects 206 and 207 and has an object count of “2.”
In accordance with example implementations, the cell line table 118 includes an object identification and cell line column 214, which contains an identifier for each object that is associated with the cell line and the particular cell line number of the object. For example, row 220-3 of the cell line table 118, which corresponds to cell line number 12, has the following entries: “2(3),” which identifies object number “2” (the object 206), and the “(3)” represents that the cell line number 12 contains row number three of the object 206; and the column 214 entry for the row 220-3 further represents that cell line number 12 contains object number 3 (i.e., the object 207) and contains row number two of that object.
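The construction of such a table can be sketched as follows. The function name, the tuple layout of each row, and the example object geometry are illustrative assumptions; real entries would come from parsing the page description data:

```python
# Minimal sketch of building the cell line table described above.

def build_cell_line_table(num_cell_lines, objects):
    """objects: list of (object_id, first_cell_line, height_in_cell_lines).
    Returns one row per cell line: (object_count, ["id(object_row)", ...])."""
    table = []
    for line in range(1, num_cell_lines + 1):
        entries = []
        for obj_id, first_line, height in objects:
            if first_line <= line < first_line + height:
                # Row of the object contained in this cell line (1-based),
                # mirroring the "2(3)" entry style described above.
                entries.append(f"{obj_id}({line - first_line + 1})")
        table.append((len(entries), entries))
    return table

# Hypothetical example: object 2 spans cell lines 10-14, object 3 spans 11-13.
table = build_cell_line_table(15, [(2, 10, 5), (3, 11, 3)])
print(table[11])  # row for cell line 12 -> (2, ['2(3)', '3(2)'])
```

For cell line 12, the sketch reproduces the “2(3)” and “3(2)” entries of the example row 220-3: row three of object 2 and row two of object 3, with an object count of two.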
In accordance with example implementations, the recomposition engine 114 (
The recomposition engine 114 performs the reverse z-order blending by beginning with the uppermost layer 302 and stopping when an opaque or nearly opaque source cell is encountered. For example, for the target cell 310, the reverse z-order blending views the cells along a reverse z direction 330. In this direction, the blending first encounters a nearly transparent source cell 332 that is associated with the uppermost layer 302. Because the source cell 332 is neither nearly opaque nor opaque, the processing continues along the direction 330, and as shown, source cells 334 and 336, which are associated with the next two layers 303 and 304, are transparent. Therefore, processing along the direction 330 continues to the lowest layer source cell 338, which, for this example, is opaque or nearly opaque. Accordingly, the recomposition engine 114 assigns the value of the source cell 338 to the target cell 310.
The value for the target cell 318 is derived by processing in a reverse z-order direction, as indicated at reference numeral 350. As shown, source cells 352 and 354, which are associated with the uppermost 302 and next uppermost 303 layers, are transparent. However, a source cell 356 of the next layer 304 is opaque or nearly opaque. Therefore, the reverse z-order processing stops at the second layer 304, as the value of the source cell 356 sets the value for the target cell 318. It is noted that the reverse z-order processing ends at the layer 304, as due to the opacity of the source cell 356, the values of any source cells below the cell 356 do not contribute to or affect the value of the target cell 318. In a similar manner, for purposes of determining the color value for the target cell 320, the recomposition engine 114 proceeds in a reverse z-order direction, as indicated by reference numeral 370. The processing ends at the layer 303, as the corresponding source cell 374 is opaque or nearly opaque, thereby providing the value for the target cell 320.
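The reverse z-order traversal described above can be sketched as a single loop over the source cells, uppermost layer first. The opacity labels and the averaging blend for nearly transparent cells are illustrative assumptions; the traversal order and early stop at an opaque or nearly opaque cell follow the description:

```python
# Sketch of reverse z-order blending: walk the source cells from the
# uppermost layer downward and stop at the first blocking cell.

def _mix(a, b):
    # Stand-in blend: average the two values component-wise.
    return tuple((x + y) // 2 for x, y in zip(a, b))

def reverse_z_blend(cells_top_down, background):
    """cells_top_down: list of (value, opacity), opacity one of 'opaque',
    'nearly_opaque', 'nearly_transparent', 'transparent'."""
    value = None
    for cell_value, opacity in cells_top_down:
        if opacity in ('opaque', 'nearly_opaque'):
            # Blocking cell: lower layers cannot contribute, so stop here.
            return cell_value if value is None else _mix(value, cell_value)
        if opacity == 'nearly_transparent':
            # Accumulate a partial value to blend with whatever lies below.
            value = cell_value if value is None else _mix(value, cell_value)
        # Transparent cells are skipped entirely.
    # No blocking cell encountered: finish against the page background.
    return background if value is None else _mix(value, background)

# The target cell 318 example: two transparent cells, then an opaque cell,
# whose value sets the target cell.
cells = [((0, 0, 0), 'transparent'),
         ((0, 0, 0), 'transparent'),
         ((50, 120, 200), 'opaque')]
print(reverse_z_blend(cells, (255, 255, 255)))  # -> (50, 120, 200)
```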
In accordance with example implementations, each row 460 of the object intersection table 120 identifies the object intersection(s), if any, for an associated cell line. In accordance with example implementations, the object intersection table 120 includes a column 450 that contains a cell line identifier (1, 2, 3, and so forth) identifying the cell line for the associated row 460. Moreover, the object intersection table 120 includes a column 452 that identifies information pertaining to the objects that are contained in the associated cell line.
For example, row 460-2 contains information pertaining to cell line number “15,” which is highlighted and assigned reference numeral 430 on the page 410. For the row 460-2, the column 452 contains three entries: an entry for each object of the cell line. Each entry, in turn, describes an identifier for the object, the cell line on which the object begins, the horizontal cell offset for the object, and the horizontal length of the object. For example, for the first entry in column 452 for the row 460-2, the entry is “1:11:5:12,” which means object number 1 (i.e., object 420) begins on cell line number “11,” begins on cell “5” of the cell line, and has a length of “12” contiguous cells.
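Decoding such an entry can be sketched directly from the “id:first_line:offset:length” layout described above. The `ObjectSpan` type and the `spans_intersect` helper are hypothetical names introduced for illustration:

```python
# Sketch of parsing an object entry from the intersection table.

from typing import NamedTuple

class ObjectSpan(NamedTuple):
    object_id: int
    first_cell_line: int   # cell line on which the object begins
    cell_offset: int       # horizontal offset, in cells
    length: int            # horizontal extent, in contiguous cells

def parse_entry(entry):
    """Parse an 'id:first_line:offset:length' entry such as '1:11:5:12'."""
    return ObjectSpan(*(int(field) for field in entry.split(':')))

def spans_intersect(a, b):
    """True if two spans on the same cell line overlap horizontally."""
    return (a.cell_offset < b.cell_offset + b.length and
            b.cell_offset < a.cell_offset + a.length)

span = parse_entry('1:11:5:12')
print(span)
# -> ObjectSpan(object_id=1, first_cell_line=11, cell_offset=5, length=12)
```

A per-cell-line overlap check like `spans_intersect` is one way the engine could decide, from the table alone, whether blending is needed for a cell line.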
As also depicted in
Referring to
If one or more source objects are implicated for the target cell line (per decision block 512), the recomposition engine 114 determines (decision block 530) whether a single source object is implicated for the target cell line; and if so, the recomposition engine 114 sets the raster image data equal to the encoded source cell data, pursuant to block 534, and communicates the raster image data to the printing press, pursuant to block 520.
In accordance with example implementations, if two or more source objects are implicated for the target cell line, then the recomposition engine 114 initializes a transparent target cell line, pursuant to block 538. If the recomposition engine 114 then determines (decision block 542) that an intersection, or overlap, between source objects occurs for the cell line, the recomposition engine 114 decodes (block 546) the source cell line data and blends (block 550) the source cell line data into the target cell line. The recomposition engine 114 then sets the raster image data equal to the encoded blended source cell data, pursuant to block 554, to form the raster image data that is communicated to the digital printing press 160. Otherwise, if the multiple objects do not overlap for the target cell line (as determined in decision block 542), the recomposition engine 114 combines (block 558) the source cell lines without decoding to form the raster image data.
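The dispatch over the three cases (no or one object, multiple intersecting objects, multiple non-intersecting objects) can be sketched as follows. A cell line is modeled as a plain list with `None` marking a transparent cell; this trivial representation, the fixed width, and the upper-cell-wins blend are stand-ins for the engine's real encoded formats and opacity rules:

```python
# Runnable sketch of the per-target-cell-line dispatch described above.

WIDTH = 8  # cells per target cell line (illustrative)

def blend_cells(upper, lower):
    # Upper cell wins when present; a real engine would apply the
    # opacity/contone blending rules instead of this simplification.
    return upper if upper is not None else lower

def render_target_cell_line(source_lines):
    """source_lines: source cell lines implicated for this target cell
    line, uppermost layer first; None marks a transparent cell."""
    if not source_lines:                       # no objects implicated
        return [None] * WIDTH                  # transparent target line
    if len(source_lines) == 1:                 # single object:
        return list(source_lines[0])           # reuse its data as-is
    if any(sum(c is not None for c in col) > 1
           for col in zip(*source_lines)):     # do objects intersect?
        target = [None] * WIDTH                # initialize transparent line
        for line in source_lines:              # decode and blend top-down
            target = [blend_cells(t, c) for t, c in zip(target, line)]
        return target
    # Non-intersecting objects: each cell comes from at most one object,
    # so the lines combine cell-by-cell without any blending.
    return [next((c for c in col if c is not None), None)
            for col in zip(*source_lines)]

a = [None, 'A', 'A', 'A', None, None, None, None]
b = [None, None, 'B', 'B', 'B', None, None, None]
print(render_target_cell_line([a, b]))
# -> [None, 'A', 'A', 'A', 'B', None, None, None]
```

In the example, objects A and B overlap in cells two and three, so the lines are blended with the upper object winning; had they not overlapped, the final branch would have merged them directly.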
The lowest layer for the sources (layer number 1 for this example) contains three object source cells 640, 644 and 648, which are nearly transparent. The recomposition engine 114 correspondingly produces another intermediate target cell line 660, and for the target cell line 660, the source cell 640 modifies the previous target cell line 630. Due to the near transparency of the source cells 644 and 648, these cells do not modify the cell line 630. Lastly, in accordance with example implementations, the recomposition engine 114 may create the target cell line in its final state by blending a background color with the cells of the target cell line 660. In this blending of the background color, the transparent cells are filled with the background color, and the cell 640, being nearly transparent, is blended with the background color. The remaining cells are not blended with the background color, as these cells are opaque or nearly opaque. The recomposition engine 114 may then communicate raster image data representing the target cell line, in its final state, to the digital printing press 160.
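The final background-blend step described above can be sketched per cell: transparent cells are filled with the background color, nearly transparent cells are blended with it, and opaque or nearly opaque cells pass through unchanged. The cell representation and the averaging blend are illustrative assumptions:

```python
# Sketch of blending the background color into a target cell line.

def apply_background(target_line, background):
    """target_line: list of (value, opacity) cells; opacity is one of
    'transparent', 'nearly_transparent', 'nearly_opaque', 'opaque'."""
    out = []
    for value, opacity in target_line:
        if opacity == 'transparent':
            out.append(background)              # fill with background color
        elif opacity == 'nearly_transparent':
            out.append(tuple((v + b) // 2       # blend with background
                             for v, b in zip(value, background)))
        else:
            out.append(value)                   # opaque: left unchanged
    return out

line = [((0, 0, 0), 'transparent'),
        ((100, 100, 100), 'nearly_transparent'),
        ((10, 20, 30), 'opaque')]
print(apply_background(line, (200, 200, 200)))
# -> [(200, 200, 200), (150, 150, 150), (10, 20, 30)]
```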
Referring to
More specifically, in accordance with example implementations, a technique 800 that is depicted in
Referring to
While the disclosure has been described with respect to a limited number of implementations, those skilled in the art, having the benefit of this disclosure, will appreciate numerous modifications and variations therefrom. It is intended that the appended claims cover all such modifications and variations.
Filing Document | Filing Date | Country | Kind |
---|---|---|---
PCT/US2017/039959 | 6/29/2017 | WO | 00 |