product incorporates color, differences may occur between the color viewed on the screen during the design process and the color rendered in the product produced by the printer, due to the differing color spaces and/or gamuts of the two devices. As such, the published or otherwise finished product may appear different from how it appeared during the design process.
The following presents a simplified summary of one or more embodiments of the present disclosure in order to provide a basic understanding of such embodiments. This summary is not an extensive overview of all contemplated embodiments, and is intended neither to identify key or critical elements of all embodiments nor to delineate the scope of any or all embodiments.
The present disclosure, in one or more embodiments, relates to a method for determining whether an image is out of gamut for a device. The method may include the steps of receiving a source image in a first color space, the first color space corresponding to a first device, converting the image to a second color space, the second color space corresponding to a second device, converting the image back to the first color space to produce a converted image, comparing the source image and the converted image, and calculating an out-of-gamut score for the converted image. In some embodiments, the source image and converted image may each comprise a plurality of pixels. An out-of-gamut score may be the root mean square variance between the plurality of pixels in the source image and the plurality of pixels in the converted image. In some embodiments, calculating an out-of-gamut score may include plotting the plurality of pixels of the source image and the plurality of pixels in the converted image to determine a pixel-wise distance. In some embodiments, the first color space may be defined as an RGB color space, and the second color space may be defined as a CMYK color space. In some embodiments, an out-of-gamut score may be compared to a threshold value, and in some embodiments, the method may include notifying a user if the out-of-gamut score exceeds the threshold. Furthermore, in some embodiments, an out-of-gamut score may be calculated for an individual color channel.
The present disclosure, in one or more embodiments, further relates to a method for determining whether an image is out of gamut for a device. The method may include importing an image to a first color space to produce a source image, converting at least a portion of the image to a second color space, converting the at least a portion of the image back to the first color space to produce a converted image, comparing the source image and the converted image, and, based on the comparison, calculating an out-of-gamut score for the converted image.
While multiple embodiments are disclosed, still other embodiments of the present disclosure will become apparent to those skilled in the art from the following detailed description, which shows and describes illustrative embodiments of the invention. As will be realized, the various embodiments of the present disclosure are capable of modifications in various obvious aspects, all without departing from the spirit and scope of the present disclosure. Accordingly, the drawings and detailed description are to be regarded as illustrative in nature and not restrictive.
While the specification concludes with claims particularly pointing out and distinctly claiming the subject matter that is regarded as forming the various embodiments of the present disclosure, it is believed that the invention will be better understood from the following description taken in conjunction with the accompanying Figures, in which:
The present disclosure, in one or more embodiments, relates to methods for determining whether an image is out of gamut for a color space. Particularly, the present disclosure relates to methods for comparing a color or image in one color space to an approximation of how that color or image will appear in a different color space. In this way, methods of the present disclosure allow a user to determine whether a color or image, as it may appear on one device operating in one color space, will translate accurately or as expected to a different device operating in a different color space. For example, methods of the present disclosure may allow a user to determine whether a particular color or image, as it appears on a computer screen, will appear the same or similar when printed on a printing device.
Referring now to
As shown in
The image may be received from a source, such as the Internet, a scanner, a camera, another device, or a remote or local data storage space, for example. The image may be received from a digital scanner, for example, wherein a physical image may be digitally scanned and sent to the computing device as a digital file. The image may be downloaded from the Internet or from a remote or cloud storage space such as Dropbox, Google Drive, Box.net, or Jostens Replay It photo sharing platform in some embodiments. The image may be uploaded from a device or storage space, such as a thumb drive, digital camera, smartphone, disk, or other device or storage space. In some embodiments, the image may already be digitally stored at the computing device, such as on an internal or external hard drive, for example. Multiple images may be received from a single source or from more than one source in some embodiments.
In addition to or alternative to receiving an image, in some embodiments, an image may be selected (115). That is, an image may be selected for conversion. The image may be located locally or remotely. For example, a user may log into a remote work station and select an image stored remotely. The selection may occur via user input or may occur automatically or partially automatically. In some embodiments, a user may select an image to be converted, and send it over a wired or wireless network to be converted, for example.
In some embodiments, the received or selected image may be imported to a first color space (120). The received or selected image may be in a source format when received or selected, such as a camera color space for example. The received or selected image may be converted to a first color space, such as a web-viewable color space. The first color space may be an RGB, sRGB, Adobe RGB, CMYK, and/or LAB color space, for example. Other color spaces may include U.S. Sheetfed Coated, U.S. Sheetfed Uncoated, U.S. Web Coated, and/or U.S. Web Uncoated. A variety of other color spaces are contemplated as well. The first color space may include the use of International Color Consortium (ICC) color spaces or profiles in some embodiments. The first color space may additionally or alternatively include other color spaces or profiles in some embodiments. In other embodiments, the received or selected image may be already represented in the first color space, without the need to import the image to the first color space.
In some embodiments, a user may make changes to the received or selected image. Changes may be made while the image is in its initial source format, in the first color space, and/or in a different color space or format. Changes may include, for example, editing the image coloring, applying a photo filter, editing the saturation of the image, or editing other image properties. A user may crop or resize the image, or may add text or a border. A user may additionally or alternatively incorporate the image with other images and/or design elements. For example, the image may be incorporated into a page having other images, text, graphics, borders, and/or other design elements. Such a page may have areas or design elements with particular color selections. The image, group of images, page, group of pages, or other component, as represented in the first color space, may be considered a pre-conversion image (I).
With continued reference to
With continued reference to
I′=RGB(CMYK(I))
HSV(I′)=HSV(RGB(CMYK(I)))
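The round-trip relationship I′=RGB(CMYK(I)) can be sketched per pixel. A real workflow would use ICC profile transforms in graphics software; the naive device formulas below are exactly reversible, so a hypothetical "ink limit" clamp stands in for the gamut clipping a true profile conversion would perform. The function names and the `max_ink` value are illustrative assumptions, not part of the disclosure.

```python
def rgb_to_cmyk(r, g, b):
    """Naive RGB -> CMYK conversion (0-255 RGB in, 0.0-1.0 CMYK out)."""
    if (r, g, b) == (0, 0, 0):
        return 0.0, 0.0, 0.0, 1.0
    k = 1 - max(r, g, b) / 255
    c = (1 - r / 255 - k) / (1 - k)
    m = (1 - g / 255 - k) / (1 - k)
    y = (1 - b / 255 - k) / (1 - k)
    return c, m, y, k

def cmyk_to_rgb(c, m, y, k):
    """Naive CMYK -> RGB conversion back to 0-255 values."""
    r = round(255 * (1 - c) * (1 - k))
    g = round(255 * (1 - m) * (1 - k))
    b = round(255 * (1 - y) * (1 - k))
    return r, g, b

def round_trip(pixel, max_ink=2.4):
    """I' = RGB(CMYK(I)) for one pixel.

    The naive formulas above are reversible, so an illustrative total
    "ink limit" clamp mimics the gamut clipping a real ICC transform
    performs; max_ink here is a made-up number, not a real press limit.
    """
    c, m, y, k = rgb_to_cmyk(*pixel)
    total = c + m + y + k
    if total > max_ink:  # scale all channels down to the ink limit
        scale = max_ink / total
        c, m, y, k = c * scale, m * scale, y * scale, k * scale
    return cmyk_to_rgb(c, m, y, k)
```

Colors inside the simulated gamut survive the round trip unchanged, while dark saturated colors (high total ink) come back shifted, which is exactly the variance the comparison step measures.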
Referring back to
In some embodiments, a score may be determined or calculated to quantify the color variance between the pre-conversion image (I) and post-conversion image (I′) (160). The score may be determined or calculated based on the comparison step 140. In some embodiments, the out-of-gamut score may be determined by comparing the results of step 140 to a scale or control image, for example. In other embodiments, the out-of-gamut score may be determined using one or more calculations. For example, a root mean square variance (γ) between corresponding pixels (p) or color coordinates of pixels in images (I) and (I′) may be calculated. The calculation may be represented as follows:
γ=√(Σp(Ip−I′p)²)
As such, the root mean square variance (γ) may be calculated by summing the color space distance between corresponding pixels of images (I) and (I′). That is, for example, in an RGB color space having R, G, and B axes in a 3-D space, where pixel (Ip) of image (I) is located at one point in the color space having R, G, and B coordinates, and the corresponding pixel (I′p) of image (I′) is located at another point in the color space having R, G, and B coordinates, the distance (Ip−I′p) between the two points may be calculated. In some cases, the distance between corresponding pixels may be zero. The distances between all of the corresponding pixels, or between at least a portion of the corresponding pixels, may be squared and summed. The square root of the summed squares may provide the root mean square variance (γ), as depicted in the above equation.
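The summation described above can be sketched directly, treating each image as a list of pixel coordinate tuples. The function name and data layout are illustrative assumptions:

```python
import math

def rms_variance(image_a, image_b):
    """Root mean square variance between corresponding pixels.

    Each image is a list of pixel coordinate tuples (e.g. (R, G, B)
    or (H, S, V)). Per the description above, the squared color-space
    distances between corresponding pixels are summed and the square
    root of the sum is taken.
    """
    if len(image_a) != len(image_b):
        raise ValueError("images must have the same number of pixels")
    total = 0.0
    for pa, pb in zip(image_a, image_b):
        total += sum((ca - cb) ** 2 for ca, cb in zip(pa, pb))
    return math.sqrt(total)
```

Because the per-pixel term is an ordinary Euclidean distance squared, the same function works unchanged for RGB or HSV coordinates, as the following paragraph notes.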
Additionally or alternatively, a root mean square variance (γ) may be calculated in a different color space, such as an HSV color space. For example, the pixel-wise distance between images HSV(I) and HSV(I′) may be calculated. That is, for example, in an HSV color space having H, S, and V axes in a 3-D space, where pixel (Ip) of image HSV(I) is located at one point in the color space having H, S, and V coordinates, and the corresponding pixel (I′p) of image HSV(I′) is located at another point in the color space having H, S, and V coordinates, the distance (Ip−I′p) between the two points may be calculated. It may be appreciated that other coordinates and 3-D spaces may be used to calculate a pixel-wise distance. For example, the distance may be calculated for pixels in a different 3-D space, such as a conical space having hue (H), saturation (S), and brightness (B) axes.
The root mean square variance (γ) may be an out-of-gamut score for an image. Alternatively, in some embodiments, the root mean square variance may be further altered or compared to a chart or to one or more control values to determine an out-of-gamut score. For example, root mean square values falling in a particular range may receive one score, while root mean square values outside of the range may receive a different score. Additionally or alternatively, other scores may be calculated to quantify the variance between the two images. In some embodiments, for example, an out-of-gamut score may be calculated as or calculated using the mean of the pixel-wise distances between images (I) and (I′). That is, the average distance between corresponding pixels (Ip) and (I′p) may be calculated. In other embodiments, the median or mode of the pixel-wise distances between the images may be used, for example. In still other embodiments, the root mean variance may be calculated as or used to calculate an out-of-gamut score. Other calculations for determining an out-of-gamut score are contemplated as well.
In some embodiments, an out-of-gamut score, such as a root mean square variance or other calculation, may be calculated for one or more individual color channels. For example, in an RGB color space having R, G, and B axes, and where each pixel of an image has a coordinate along each of the three axes, an out-of-gamut score may be calculated or determined for the red channel (R), the green channel (G), and/or the blue channel (B). To calculate the root mean square variance (γr) between images (I) and (I′) for the red channel, for example, the distance between the (R) coordinates for corresponding pixels between images (I) and (I′) may be calculated, squared, and summed. The square root of the resulting number may provide a root mean square variance of the red channel. Similarly, in an HSV color space having H, S, and V axes, and where each pixel of an image has a coordinate along each of the three axes, an out-of-gamut score may be calculated or determined for the hue (H), the saturation (S), and/or the value or luminance channel (V). Other calculations or methods for determining an out-of-gamut score for a particular color channel may be used in other embodiments. Scores for particular color channels may allow a user to determine which color(s) of image (I) may not be represented accurately or as expected in the second color space. Individual color channel scores may provide an indication of particular regions or areas of image (I′) that may be out of gamut or may visually appear different. In some embodiments individual color channel scores may be weighted differently to determine an overall out-of-gamut score for the image. For example, in an RGB color space, the green (G) channel may be weighted more heavily than the red (R) or blue (B) channels, as the human eye may be more sensitive to the color. Similarly, in an HSV color space, the hue (H) channel may be weighted more heavily than the saturation (S) or luminance (V) channels. 
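The per-channel scoring and weighting described above can be sketched as follows. The weights are illustrative assumptions only, loosely following the suggestion that the green or hue channel may be weighted more heavily; real weights would be chosen empirically:

```python
import math

def channel_rms(image_a, image_b, channel):
    """RMS variance over a single channel (e.g. 0 = R, 1 = G, 2 = B)."""
    total = sum((pa[channel] - pb[channel]) ** 2
                for pa, pb in zip(image_a, image_b))
    return math.sqrt(total)

def weighted_score(image_a, image_b, weights=(0.3, 0.4, 0.3)):
    """Combine per-channel scores into one overall out-of-gamut score.

    The default weights are hypothetical: the middle (green) channel
    is weighted slightly more heavily, since the text suggests the
    human eye may be more sensitive to that color.
    """
    return sum(w * channel_rms(image_a, image_b, ch)
               for ch, w in enumerate(weights))
```

A channel whose score dominates the weighted sum points at the color most likely to render inaccurately in the second color space.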
In some embodiments, other scores may additionally or alternatively be calculated to quantify the differences between the pre- and post-conversion images.
In some embodiments, an out-of-gamut score may be calculated for a particular color of the pre-conversion image (I). For example, a user may select a particular color in image (I) by using a tool, such as a digital eyedropper tool, to select the particular color in one or more locations where it appears in the image (I). The user may select one or more pixels or, for example, may outline a region of the image (I) having the particular color. The selected color may generally be any color found in image (I). An out-of-gamut score may be calculated for the particular color by, for example, calculating the root mean square variance of the pixel-wise distance for pixels having the particular color found in image (I) and corresponding pixels of image (I′).
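The particular-color score described above can be sketched by masking the computation to pixels matching the selected color. The tolerance parameter, which loosens the match as an eyedropper selection might, is an illustrative assumption:

```python
import math

def color_score(image_a, image_b, target, tolerance=0):
    """Out-of-gamut score restricted to pixels of a selected color.

    Only pixels of image_a within `tolerance` (per channel) of the
    `target` color -- roughly what a digital eyedropper selection
    might return -- contribute, along with their corresponding
    pixels in image_b.
    """
    total = 0.0
    for pa, pb in zip(image_a, image_b):
        if all(abs(c - t) <= tolerance for c, t in zip(pa, target)):
            total += sum((ca - cb) ** 2 for ca, cb in zip(pa, pb))
    return math.sqrt(total)
```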
An out-of-gamut score may be calculated for an image automatically, or may be calculated after some user input. In some embodiments, an out-of-gamut score may be calculated upon an image being received in the first color space. The conversion and calculation may be performed during, concurrent with, or subsequent to an upload or download process in some embodiments. In other embodiments, an out-of-gamut score may be calculated at a user's initiation or request, for example. An out-of-gamut score calculation may be performed using graphics software in some embodiments, such as Aurigma Graphics Mill, Adobe Photoshop, and/or Python Imaging Library, for example. In other embodiments, other software products, applications, plugins, and/or other tools may be used to calculate an out-of-gamut score. In some embodiments, an out-of-gamut score and/or a copy of the pre-conversion image (I) may be stored for later analysis. For example, an out-of-gamut score may be calculated during an upload or download process and stored, for example in a database. In some embodiments, a meta analysis may then be completed for a collection of stored out-of-gamut scores, for example to determine a percentage of individual images that may be out of gamut for an entire yearbook.
With continued reference to
With continued reference to
Method 100 may be initiated automatically or by some user input or activation in some embodiments. In some embodiments, the method 100 may be performed concurrent with, during, or subsequent to an image being received at the first device and/or in the first color space. Where multiple images are received concurrently or subsequent to one another, such as in a batch download or upload for example, the method 100 may be performed for each image. The method 100 may be performed during the batch download or upload of the images in some embodiments. Where multiple images are received, the notification step (180) may be performed for each image, or in some embodiments, a user may receive a single notification for the plurality of images. For example, a notification may be issued to inform a user that one or more images of the plurality has an out-of-gamut score exceeding or falling below a particular threshold. In some embodiments, the user may then take further action to address the images to determine which image(s) is out of gamut. In other embodiments, a user may receive a single notice with information regarding which of the plurality of images is out of gamut. Other notification procedures for a batch download or upload may be contemplated as well.
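The batch threshold-and-notify behavior described above can be sketched as a single pass over the batch's scores. The function name, the score mapping, and the print-based notification are illustrative assumptions; the threshold value itself would be chosen per workflow:

```python
def check_batch(scores, threshold):
    """Flag images in a batch whose out-of-gamut score exceeds a threshold.

    `scores` maps an image name to its previously calculated
    out-of-gamut score. Returns the flagged names so a single
    notification can cover the whole batch, as described above,
    with the user then able to inspect the individual images.
    """
    flagged = [name for name, score in scores.items() if score > threshold]
    if flagged:
        print(f"{len(flagged)} image(s) may be out of gamut: "
              f"{', '.join(flagged)}")
    return flagged
```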
It may be appreciated that the methods of the present disclosure may be used to provide an estimation or approximation of how a particular color may appear in a color space. For example, method 100 may be used where a particular color is received or selected in the first color space. The color may be converted to the second color space, and converted back to the first color space. The converted color may then be compared back to the originally received or selected color. An out-of-gamut score may be calculated for the comparison in some embodiments. The out-of-gamut score may be compared to a threshold, and a user may be notified of the result. The method may be used to convert and compare a particular color or multiple colors, such as a color palette, for example.
It may be further appreciated that the methods of the present disclosure may be used to provide an estimation or approximation of how a particular portion of an image may appear in a color space. For example, a user may select a particular color found in an image in a first color space, such that the particular color may be converted to the second color space, and converted back to the first color space. The converted color may then be compared back to the color originally selected for conversion in the first color space. An out-of-gamut score may be calculated for the particular color and, in some embodiments, compared to a threshold. In some embodiments, a particular region or area of an image in the first color space may be selected for conversion and comparison, such that an out-of-gamut score may be calculated for the particular region or area.
Methods of the present disclosure may be used in yearbook design and/or production in some embodiments. For example, a yearbook may be designed, at least in part, using a device operating in a color space such as RGB. However, the yearbook may ultimately be printed or produced, at least in part, using a printer operating in a different color space such as CMYK. Methods of the present disclosure may assist a user in determining which photos, designs, or colors to use in a yearbook design, or whether and how to edit such photos, designs, or colors. For example,
In some embodiments, the computer 410 and printer 430 may operate in different color spaces. For example, while the computer 410 may operate in an RGB color space, the printer 430 may operate in a CMYK color space. Where the computer 410 and printer 430 operate in different color spaces, a particular color may appear different when viewed at the computer than when rendered by the printer. This may lead to unexpected results or dissatisfaction when a particular image, design, or color appears with different coloring than expected when the yearbook is printed. Thus, methods of the present disclosure may operate to notify a user at either or both the computer 410 and the printer 430 of potential color variance by way of an out-of-gamut score.
In some embodiments, an out-of-gamut score may be calculated in conjunction with a particular design software. For example, in some embodiments, an out-of-gamut score may be calculated when a new image is placed into an Adobe InDesign, Adobe Photoshop, Quark, Corel, or other design software workspace. A plugin or other preconfigured setting or tool may provide access to out-of-gamut score calculation within a particular design software space. In some embodiments, the out-of-gamut score calculation may be provided over a web-based system or a cloud-based system, such as the web-based and cloud-based systems described in U.S. Provisional Patent Application No. 62/139,261, entitled Yearbook Publishing System and filed Mar. 27, 2015. In a web-based system, the software such as a graphics software, application, plugin, or other tool used to calculate the out-of-gamut score may be provided to a user over a network such as the Internet. In a cloud-based system, the software such as a graphics software, application, plugin, or other tool may be available to a user in a hosted design space, such as that described in U.S. Provisional Patent Application No. 62/139,261. With either a web-based system or a cloud-based system, a user may generally obtain the benefit of the out-of-gamut score calculation without the need for acquiring local software or hardware.
It may be appreciated that the methods of the present disclosure may provide for an out-of-gamut determination of an image on a device without the need to actually or physically render the image on the device. That is, the color space in which the device operates may be applied to a digital image, without sending the image to the device or rendering the image at the device. For example, where the device is a printer operating in a CMYK color space, a digital image may be converted from, for example, an RGB color space, to the CMYK color space of the printer, and back to the RGB color space for comparison and score calculation. In this way, it may be determined digitally whether the image is likely to be out of gamut when it is rendered at the printer, without actually sending the image to the printer and without printing the image.
For purposes of this disclosure, any system described herein may include any instrumentality or aggregate of instrumentalities operable to compute, calculate, determine, classify, process, transmit, receive, retrieve, originate, switch, store, display, communicate, manifest, detect, record, reproduce, handle, or utilize any form of information, intelligence, or data for business, scientific, control, or other purposes. For example, a system or any portion thereof may be a personal computer (e.g., desktop or laptop), tablet computer, mobile device (e.g., personal digital assistant (PDA) or smart phone), server (e.g., blade server or rack server), a network storage device, or any other suitable device or combination of devices and may vary in size, shape, performance, functionality, and price. A system may include random access memory (RAM), one or more processing resources such as a central processing unit (CPU) or hardware or software control logic, ROM, and/or other types of nonvolatile memory. Additional components of a system may include one or more disk drives or one or more mass storage devices, one or more network ports for communicating with external devices as well as various input and output (I/O) devices, such as a keyboard, a mouse, touchscreen and/or a video display. Mass storage devices may include, but are not limited to, a hard disk drive, floppy disk drive, CD-ROM drive, smart drive, flash drive, or other types of non-volatile data storage, a plurality of storage devices, or any combination of storage devices. A system may include what is referred to as a user interface, which may generally include a display, mouse or other cursor control device, keyboard, button, touchpad, touch screen, microphone, camera, video recorder, speaker, LED, light, joystick, switch, buzzer, bell, and/or other user input/output device for communicating with one or more users or for entering information into the system. 
Output devices may include any type of device for presenting information to a user, including but not limited to, a computer monitor, flat-screen display, or other visual display, a printer, and/or speakers or any other device for providing information in audio form, such as a telephone, a plurality of output devices, or any combination of output devices. A system may also include one or more buses operable to transmit communications between the various hardware components.
One or more programs or applications, such as a web browser, and/or other applications may be stored in one or more of the system data storage devices. Programs or applications may be loaded in part or in whole into a main memory or processor during execution by the processor. One or more processors may execute applications or programs to run systems or methods of the present disclosure, or portions thereof, stored as executable programs or program code in the memory, or received from the Internet or other network. Any commercial or freeware web browser or other application capable of retrieving content from a network and displaying pages or screens may be used. In some embodiments, a customized application may be used to access, display, and update information.
Hardware and software components of the present disclosure, as discussed herein, may be integral portions of a single computer or server or may be connected parts of a computer network. The hardware and software components may be located within a single location or, in other embodiments, portions of the hardware and software components may be divided among a plurality of locations and connected directly or through a global computer information network, such as the Internet.
As will be appreciated by one of skill in the art, the various embodiments of the present disclosure may be embodied as a method (including, for example, a computer-implemented process, a business process, and/or any other process), apparatus (including, for example, a system, machine, device, computer program product, and/or the like), or a combination of the foregoing. Accordingly, embodiments of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, middleware, microcode, hardware description languages, etc.), or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present disclosure may take the form of a computer program product on a computer-readable medium or computer-readable storage medium, having computer-executable program code embodied in the medium, that define processes or methods described herein. A processor or processors may perform the necessary tasks defined by the computer-executable program code. Computer-executable program code for carrying out operations of embodiments of the present disclosure may be written in an object oriented, scripted or unscripted programming language such as Java, Perl, PHP, Visual Basic, Smalltalk, C++, or the like. However, the computer program code for carrying out operations of embodiments of the present disclosure may also be written in conventional procedural programming languages, such as the C programming language or similar programming languages. A code segment may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, an object, a software package, a class, or any combination of instructions, data structures, or program statements. A code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, or memory contents. Information, arguments, parameters, data, etc. 
may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
In the context of this document, a computer readable medium may be any medium that can contain, store, communicate, or transport the program for use by or in connection with the systems disclosed herein. The computer-executable program code may be transmitted using any appropriate medium, including but not limited to the Internet, optical fiber cable, radio frequency (RF) signals or other wireless signals, or other mediums. The computer readable medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device. More specific examples of suitable computer readable media include, but are not limited to, an electrical connection having one or more wires or a tangible storage medium such as a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a compact disc read-only memory (CD-ROM), or other optical or magnetic storage device. Computer-readable media include, but are not to be confused with, computer-readable storage media, which are intended to cover all physical, non-transitory, or similar embodiments of computer-readable media.
Various embodiments of the present disclosure may be described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products. It is understood that each block of the flowchart illustrations and/or block diagrams, and/or combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-executable program code portions. These computer-executable program code portions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a particular machine, such that the code portions, which execute via the processor of the computer or other programmable data processing apparatus, create mechanisms for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. Alternatively, computer program implemented steps or acts may be combined with operator or human implemented steps or acts in order to carry out an embodiment of the invention.
Additionally, although a flowchart may illustrate a method as a sequential process, many of the operations in the flowcharts illustrated herein can be performed in parallel or concurrently. In addition, the order of the method steps illustrated in a flowchart may be rearranged for some embodiments. Similarly, a method illustrated in a flow chart could have additional steps not included therein or fewer steps than those shown. A method step may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc.
As used herein, the terms “substantially” or “generally” refer to the complete or nearly complete extent or degree of an action, characteristic, property, state, structure, item, or result. For example, an object that is “substantially” or “generally” enclosed would mean that the object is either completely enclosed or nearly completely enclosed. The exact allowable degree of deviation from absolute completeness may in some cases depend on the specific context. However, generally speaking, the nearness of completion will be so as to have generally the same overall result as if absolute and total completion were obtained. The use of “substantially” or “generally” is equally applicable when used in a negative connotation to refer to the complete or near complete lack of an action, characteristic, property, state, structure, item, or result. For example, an element, combination, embodiment, or composition that is “substantially free of” or “generally free of” an ingredient or element may still actually contain such item as long as there is generally no measurable effect thereof.
In the foregoing description, various embodiments of the present disclosure have been presented for the purpose of illustration and description. They are not intended to be exhaustive or to limit the invention to the precise form disclosed. Obvious modifications or variations are possible in light of the above teachings. The various embodiments were chosen and described to provide the best illustration of the principles of the disclosure and their practical application, and to enable one of ordinary skill in the art to utilize the various embodiments with various modifications as are suited to the particular use contemplated. All such modifications and variations are within the scope of the present disclosure as determined by the appended claims when interpreted in accordance with the breadth to which they are fairly, legally, and equitably entitled.
Number | Date | Country
---|---|---
62262054 | Dec 2015 | US