The present inventions are directed to digital cameras, such as those for capturing still or moving pictures, and more particularly, to digital cameras that compress image data.
Although some currently available digital video cameras include high resolution image sensors, and thus output high resolution video, the image processing and compression techniques used on board such cameras may be too lossy, eliminating too much raw image data to be acceptable in the high end portions of the market noted above. An aspect of at least one of the embodiments disclosed herein includes the realization that the video quality demanded by the higher end portions of the markets noted above, such as the major motion picture market, can be satisfied by cameras that can capture, compress, and store raw or substantially raw video data at cinema-quality resolutions and frame rates, such as at a resolution of at least about 2k or at least about 4k, and at a frame rate of at least about 23 frames per second. Examples of compressed raw data systems and methods are described in U.S. Pat. No. 8,174,560, which is incorporated by reference in its entirety herein.
Another aspect of various embodiments of the present disclosure includes the realization that because the human eye is more sensitive to green wavelengths than any other color, green image data based modification of image data output from an image sensor can be used to enhance compressibility of the data, yet provide a high quality video image. One such technique can include subtracting the magnitude of green light detected from the magnitudes of red and/or blue light detected prior to compressing the data. For instance, as discussed further herein, red and/or blue image data in a mosaiced (e.g., Bayer pattern) image data set can be modified based on green data in the mosaiced image data set. This can convert the red and/or blue image data into a more compressible form.
A further aspect of various embodiments of the present disclosure includes the realization that a first portion of the green image data may be used to modify a second portion of the green image data to improve compression. For example, mosaiced, raw image data (e.g., Bayer pattern image data or image data filtered using another type of color filter array [CFA]) may be composed of two green channels in addition to a red and a blue channel. As described above, green channel data may be subtracted from each of the blue and red channels to improve compressibility of the image data with little or no visual loss. According to various embodiments, this improved compressibility is possible, at least in part, because the color and/or intensity of the red and blue channels are correlated with the color and/or intensity of green channels. Accordingly, subtracting green channel data from red and/or blue channel data according to the techniques described herein may de-correlate a portion of the color and/or intensity data, improving compressibility.
According to some implementations, green image data may be modified based on other green image data, e.g., in order to improve compressibility. For instance, for Bayer pattern data, the first green channel can be used to predict the second green channel: data of the first green channel may be subtracted from data of the second green channel, and the difference, or residual, can be encoded, improving compressibility of the image data with little or no visual loss. Subtracting first green channel data from second green channel data may improve compressibility because the first and second green channels may be spatially correlated with one another. Accordingly, subtracting the first green channel data from the second green channel data may at least partially decorrelate the green image data, further improving compressibility. Moreover, green image data inherently contains more of the image detail than the red and blue planes. Embodiments described herein at least partly evolved from the realization that, using a carefully designed algorithm such as any of those described herein, one green channel can be encoded using the other to improve compression while still preserving an acceptable level of image detail, achieving cinema-quality compressed raw image data. According to certain implementations, this modification of the green image data can be done in conjunction with any of the red/blue data modification techniques in order to further improve compressibility of the image data. In other implementations, the green data modification is done instead of red/blue data modification.
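As a minimal illustrative sketch (assuming hypothetical sample values, and a simple pixel-wise difference in place of the neighborhood-average prediction described below), the residual between two green channels clusters near zero and is therefore cheap to encode:

```python
import numpy as np

# Hypothetical sample values for two interleaved green channels; in real
# Bayer data the channels are spatially offset, but their values track one
# another closely in smooth image regions.
green1 = np.array([200, 202, 198, 201], dtype=np.int32)
green2 = np.array([201, 203, 197, 200], dtype=np.int32)

# Predict Green2 from Green1 and encode only the residual. The residual
# values cluster tightly around zero, so an entropy coder spends far fewer
# bits on them than on the raw Green2 values.
residual = green2 - green1
print(residual)  # [ 1  1 -1 -1]
```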
Further, similar to the description above, the process of green image data subtraction from blue, red, and/or other green image data, can be reversed following application of lossy compression algorithms (e.g., at compression ratios of at least 3, 4, 5, 6, 7, 8, 9, 10, 11, or 12 to 1, or higher), depending on the embodiment. Moreover, the resulting system and method incorporating such a technique can provide visually lossless video image data with enhanced compressibility of such video image data.
According to an embodiment, a method of compressing mosaiced color image data is disclosed comprising: accessing mosaiced color image data acquired by one or more image sensors of a video camera, the mosaiced color image data comprising a plurality of picture element values for each of a plurality of spatially interleaved color channels, the spatially interleaved color channels comprising a first green color channel, a second green color channel, a red color channel, and a blue color channel; transforming the second green color channel at least partly by, for each respective picture element of a plurality of picture elements of the second green color channel, modifying an initial value corresponding to the respective picture element using a calculated value derived from values of a plurality of picture elements of the first green color channel that are in spatial proximity to the respective picture element; compressing the transformed second green color channel; and storing the transformed, compressed second green color channel in at least one memory device of the video camera along with compressed versions of the first green color channel, the red color channel, and the blue color channel.
According to an aspect, said transforming comprises subtracting the calculated value from the initial value.
According to another aspect, the calculated value comprises an average of the values of the plurality of picture elements of the first green color channel that are in spatial proximity to the respective picture element.
According to yet another aspect, the plurality of picture elements of the first green color channel in spatial proximity to the respective picture element comprise at least two picture elements which are diagonally adjacent to the respective picture element.
According to another aspect, the at least two picture elements include four picture elements of the first green color channel which are diagonally adjacent to the respective picture element.
According to yet another aspect, the plurality of picture elements of the first green color channel which are in spatial proximity to the respective picture element include at least two picture elements, further wherein the respective picture element is positioned between the at least two picture elements.
According to another aspect, the at least two picture elements include two picture elements which are diagonally opposite one another with respect to the respective picture element.
According to yet another aspect, the at least two picture elements include a first pair of picture elements which are diagonally opposite one another with respect to the respective picture element and a second pair of picture elements which are diagonally opposite one another with respect to the respective picture element.
According to another aspect, the at least two picture elements are diagonally adjacent to the respective picture element.
According to yet another aspect, the plurality of picture elements of the first green color channel which are in spatial proximity to the respective picture element include at least three picture elements.
According to another aspect, the color image data is mosaiced according to a Bayer pattern.
According to yet another aspect, said transforming results in a spatial decorrelation of the first green color channel from the second green color channel.
According to another aspect, the method further comprises compressing the first green color channel and storing the compressed first green color channel in the at least one memory device.
According to yet another aspect, the method further comprises: transforming the red color channel by subtracting from respective picture element values of the red color channel a calculated value derived from picture element values of one or more of the first green color channel and the second green color channel which are in spatial proximity to the respective picture element values of the red color channel; compressing the transformed red color channel; transforming the blue color channel by subtracting from respective picture element values of the blue color channel a calculated value derived from picture element values of one or more of the first green color channel and the second green color channel which are in spatial proximity to the respective picture element values of the blue color channel; compressing the transformed blue color channel; and storing the transformed, compressed red and blue color channels in the at least one memory device.
According to another aspect, said transforming results in a spatial decorrelation of one or more of the first and second green color channels from the red and blue color channels.
According to another embodiment, a video camera is disclosed comprising: at least one memory device; one or more image sensors configured to convert light incident on the image sensor into color image data, the color image data mosaiced according to a pattern and comprising a plurality of picture element values for each of a plurality of spatially interleaved color channels, the spatially interleaved color channels comprising a first green color channel, a second green color channel, a red color channel, and a blue color channel; and an image processing module configured to: transform the second green color channel at least partly by, for each respective picture element of a plurality of picture elements of the second green color channel, modifying an initial value corresponding to the respective picture element using a calculated value derived from values of a plurality of picture elements of the first green color channel that are in spatial proximity to the respective picture element; compress the transformed second green color channel; and store the transformed, compressed second green color channel in the memory device.
According to yet another embodiment, an apparatus for processing mosaiced color image data is disclosed comprising: at least one memory device; one or more processors; and an image processing module executing in the one or more processors and configured to: access color image data from the memory device, the color image data comprising a plurality of picture element values for each of a plurality of spatially interleaved color channels, the spatially interleaved color channels comprising a first green color channel, a second green color channel, a red color channel, and a blue color channel; and transform the second green color channel at least partly by, for each respective picture element of a plurality of picture elements of the second green color channel, modifying an initial value corresponding to the respective picture element using a calculated value derived from values of a plurality of picture elements of the first green color channel that are in spatial proximity to the respective picture element; and compress the transformed second green color channel.
According to another embodiment, a method of decoding color image data is disclosed comprising: accessing encoded color image data for a second green color channel of a plurality of color channels of the color image data, wherein the encoded color image data was encoded at least partly by: transforming the second green color channel at least partly by, for each respective picture element of a plurality of picture elements of the second green color channel, modifying an initial value corresponding to the respective picture element using a calculated value derived from values of a plurality of picture elements of a first green color channel of the plurality of color channels, the plurality of picture elements of the first green channel in spatial proximity to the respective picture element; and compressing the transformed second green color channel; and decoding the accessed color image data for the second green color channel.
According to an aspect, the decoding comprises substantially reversing the transform operation and performing a decompression operation.
According to another aspect, substantially reversing the transform operation is performed after performing the decompression operation.
According to yet another aspect, substantially reversing the transform operation is performed prior to performing the decompression operation.
With continued reference to
The optics hardware 16 can be in the form of a lens system having at least one lens configured to focus an incoming image onto the image sensor 18. The optics hardware 16, optionally, can be in the form of a multi-lens system providing variable zoom, aperture, and focus. Additionally, the optics hardware 16 can be in the form of a lens socket supported by the housing 12 and configured to receive a plurality of different types of lens systems. For example, but without limitation, the optics hardware 16 can include a socket configured to receive various sizes of lens systems including a 50-100 millimeter (F2.8) zoom lens, an 18-50 millimeter (F2.8) zoom lens, a 300 millimeter (F2.8) lens, a 15 millimeter (F2.8) lens, a 25 millimeter (F1.9) lens, a 35 millimeter (F1.9) lens, a 50 millimeter (F1.9) lens, an 85 millimeter (F1.9) lens, and/or any other lens. As noted above, the optics hardware 16 can be configured such that, regardless of which lens is attached thereto, images can be focused upon a light-sensitive surface of the image sensor 18.
The image sensor 18 can be any type of video sensing device, including, for example, but without limitation, CCD, CMOS, vertically-stacked CMOS devices such as the Foveon® sensor, or a multi-sensor array using a prism to divide light between the sensors. In some embodiments, the image sensor 18 can include a CMOS device having about 12 million photocells. However, other size sensors can also be used. In some configurations, camera 10 can be configured to record and/or output video (e.g., compressed raw video) at “2 k” (e.g., 2048×1152 pixels), “4 k” (e.g., 4,096×2,540 pixels), “4.5 k” horizontal resolution, “5 k” horizontal resolution (e.g., 5120×2700 pixels), “6 k” horizontal resolution (e.g., 6144×3160), or greater resolutions. In some embodiments, the camera can be configured to record compressed raw image data having a horizontal resolution between any of the above-recited resolutions. In further embodiments, the resolution is between at least one of the aforementioned values (or some value between the aforementioned values) and about 6.5 k, 7 k, 8 k, 9 k, or 10 k, or some value therebetween. As used herein, in terms expressed in the format of xk (such as 2 k and 4 k noted above), the “x” quantity refers to the approximate horizontal resolution. As such, “4 k” resolution corresponds to about 4000 or more horizontal pixels and “2 k” corresponds to about 2000 or more pixels. Using currently commercially available hardware, the sensor can be as small as about 0.5 inches (8 mm), but it can be about 1.0 inches, or larger. Additionally, the image sensor 18 can be configured to provide variable resolution by selectively outputting only a predetermined portion of the sensor 18. For example, the sensor 18 and/or the image processing module can be configured to allow a user to identify the resolution of the image data output.
The camera 10 can also be configured to downsample and subsequently process the output of the sensor 18 to yield video output at 2K, 1080p, 720p, or any other resolution. For example, the image data from the sensor 18 can be “windowed”, thereby reducing the size of the output image and allowing for higher readout speeds. Additionally, the camera 10 can be configured to upsample the output of the sensor 18 to yield video output at higher resolutions.
With reference to
With continued reference to
As noted above, however, the Bayer pattern data illustrated in
In some embodiments, the camera 10 can be configured to delete or omit some of the green image data. For example, in some embodiments, the image processing module 20 can be configured to delete ½ of the green image data so that the total amount of green image data is the same as the amounts of blue and red image data. For example,
In some alternatives, the camera 10 can be configured to delete ½ of the green image data after the red and blue image data has been transformed based on the green image data. This optional technique is described below following the description of the subtraction of green image data values from the other color image data.
Optionally, the image processing module 20 can be configured to selectively delete green image data. For example, the image processing module 20 can include a deletion analysis module (not shown) configured to selectively determine which green image data to delete. For example, such a deletion module can be configured to determine if deleting a pattern of rows from the green image data would result in aliasing artifacts, such as Moiré lines, or other visually perceptible artifacts. The deletion module can be further configured to choose a pattern of green image data to delete that would present less risk of creating such artifacts. For example, the deletion module can be configured to choose a green image data deletion pattern of alternating vertical columns if it determines that the image captured by the image sensor 18 includes an image feature characterized by a plurality of parallel horizontal lines. This deletion pattern can reduce or eliminate artifacts, such as Moiré lines, that might have resulted from a deletion pattern of alternating lines of image data parallel to the horizontal lines detected in the image.
However, this is merely one exemplary, non-limiting example of the types of image features and deletion patterns that can be used by the deletion module. The deletion module can also be configured to detect other image features and to use other image data deletion patterns, such as, for example, but without limitation, deletion of alternating rows, alternating diagonal lines, or other patterns. Additionally, the deletion module can be configured to delete portions of the other image data, such as the red and blue image data, or other image data depending on the type of sensor used.
Additionally, the camera 10 can be configured to insert a data field into the image data indicating what image data has been deleted. For example, but without limitation, the camera 10 can be configured to insert a data field into the beginning of any video clip stored into the storage device 24, indicating what data has been deleted in each of the “frames” of the video clip. In some embodiments, the camera can be configured to insert a data field into each frame captured by the sensor 18, indicating what image data has been deleted. For example, in some embodiments, where the image processing module 20 is configured to delete ½ of the green image data in one deletion pattern, the data field can be as small as a single bit data field, indicating whether or not image data has been deleted. Since the image processing module 20 is configured to delete data in only one pattern, a single bit is sufficient to indicate what data has been deleted.
In some embodiments, as noted above, the image processing module 20 can be configured to selectively delete image data in more than one pattern. Thus, the image data deletion field can be larger, including a sufficient number of values to provide an indication of which of the plurality of different image data deletion patterns was used. This data field can be used by downstream components and/or processes to determine to which spatial positions the remaining image data corresponds.
In some embodiments, the image processing module can be configured to retain all of the raw green image data, e.g., the data shown in
As noted above, in known Bayer pattern filters, there are twice as many green elements as red elements and blue elements. In other words, the red elements comprise 25% of the total Bayer pattern array, the blue elements comprise 25% of the Bayer pattern array, and the green elements comprise 50% of the elements of the Bayer pattern array. Thus, in some embodiments, where all of the green image data is retained, the image processing module 20 can include a second green data image processing module 38. As such, the first green data image processing module 36 can process half of the green elements and the second green image data processing module 38 can process the remaining green elements. However, the present inventions can be used in conjunction with other types of patterns, such as for example, but without limitation, CMY and RGBW.
Additionally, in some embodiments, the image processing module 20 can include other modules and/or can be configured to perform other processes, such as, for example, but without limitation, gamma correction processes, noise filtering processes, etc.
Additionally, in some embodiments, the image processing module 20 can be configured to subtract a value of a green element from a value of a blue element and/or red element. As such, in some embodiments, when certain colors are detected by the image sensor 18, the corresponding red or blue element can be reduced to zero. For example, in many photographs, there can be large areas of black, white, or gray, or a color shifted from gray toward red or blue. Thus, if the corresponding pixels of the image sensor 18 have sensed an area of gray, the magnitudes of the green, red, and blue would be about equal. Thus, if the green value is subtracted from the red and blue values, the red and blue values will drop to zero or near zero. Thus, in a subsequent compression process, more zeros will be generated for pixels that sense a black, white, or gray area, and the resulting data will be more compressible. Additionally, the subtraction of green from one or both of the other colors can make the resulting image data more compressible for other reasons.
Such a technique can help achieve a higher effective compression ratio and yet remain visually lossless due to its relationship to the entropy of the original image data. For example, the entropy of an image is related to the amount of randomness in the image. The subtraction of image data of one color, for example, from image data of the other colors can reduce the randomness, and thus reduce the entropy of the image data of those colors, thereby allowing the data to be compressed at higher compression ratios with less loss. Typically, an image is not a collection of random color values. Rather, there is often a certain degree of correlation between surrounding picture elements. Thus, such a subtraction technique can use the correlation of picture elements to achieve better compression. The amount of compression will depend, at least in part, on the entropy of the original information in the image.
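The effect can be illustrated with a short sketch (hypothetical 8-bit values assumed for illustration):

```python
import numpy as np

# Hypothetical 8-bit samples from a gray image region, where the red,
# green, and blue magnitudes are approximately equal.
red   = np.array([118, 120, 119, 121], dtype=np.int32)
green = np.array([119, 120, 120, 120], dtype=np.int32)

# Subtracting green collapses the red samples toward zero, reducing the
# randomness (entropy) the downstream compressor must encode.
red_residual = red - green
print(red_residual)  # [-1  0 -1  1]
```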
In some embodiments, the magnitude subtracted from a red or blue pixel can be the magnitude of the value output from a green pixel adjacent to the subject red or blue pixel. Further, in some embodiments, the green magnitude subtracted from the red or blue elements can be derived from an average of the surrounding green elements. Such techniques are described in greater detail below. However, other techniques can also be used.
Optionally, the image processing module 20 can also be configured to selectively subtract green image data from the other colors. For example, the image processing module 20 can be configured to determine if subtracting green image data from a portion of the image data of either of the other colors would provide better compressibility or not. In this mode, the image processing module 20 can be configured to insert flags into the image data indicating what portions of the image data have been modified (e.g., by green image data subtraction) and which portions have not been so modified. With such flags, a downstream demosaicing/reconstruction component can selectively add green image values back into the image data of the other colors, based on the status of such data flags.
Optionally, the image processing module 20 can also include a further data reduction module (not shown) configured to round values of the red and blue data. For example, if, after the subtraction of green magnitudes, the red or blue data is near zero (e.g., within one or two on an 8-bit scale ranging from 0-255, or within higher magnitudes for a higher resolution system), the data can be rounded to zero. For example, the sensor 18 can be a 12-bit sensor outputting red, blue, and green data on a scale of 0-4095. Any rounding or filtering of the data performed by the rounding module can be adjusted to achieve the desired effect. For example, rounding can be performed to a lesser extent if it is desired to have lossless output and to a greater extent if some loss or lossy output is acceptable. Some rounding can be performed and still result in a visually lossless output. For example, on an 8-bit scale, red or blue data having an absolute value of up to 2 or 3 can be rounded to 0 and still provide a visually lossless output. Additionally, on a 12-bit scale, red or blue data having an absolute value of up to 10 to 20 can be rounded to 0 and still provide visually lossless output.
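A minimal sketch of such a rounding step (the function name and threshold parameter are assumptions for illustration):

```python
import numpy as np

def round_near_zero(residuals: np.ndarray, threshold: int) -> np.ndarray:
    """Round green-subtracted red/blue values within +/-threshold to zero.

    Illustrative sketch of the data reduction module described above; per
    the text, a threshold of 2-3 may remain visually lossless on an 8-bit
    scale, and 10-20 on a 12-bit scale.
    """
    out = residuals.copy()
    out[np.abs(out) <= threshold] = 0
    return out

residuals = np.array([-3, 1, 0, 14, -2, 250], dtype=np.int32)
print(round_near_zero(residuals, threshold=2))  # [-3  0  0 14  0 250]
```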
Additionally, the magnitudes of values that can be rounded to zero, or rounded to other values, and still provide a visually lossless output depend on the configuration of the system, including the optics hardware 16, the image sensor 18, the resolution of the image sensor, the color resolution (bit depth) of the image sensor 18, the types of filtering, anti-aliasing techniques or other techniques performed by the image processing module 20, the compression techniques performed by the compression module 22, and/or other parameters or characteristics of the camera 10.
As noted above, in some embodiments, the camera 10 can be configured to delete ½ of the green image data after the red and blue image data has been transformed based on the green image data. For example, but without limitation, the image processing module 20 can be configured to delete ½ of the green image data after the average of the magnitudes of the surrounding green data values has been subtracted from the red and blue data values. This reduction in the green data can reduce throughput requirements on the associated hardware. Additionally, the remaining green image data can be used to reconstruct the red and blue image data, as described in greater detail below with reference to
As noted above, the camera 10 can also include a compression module 22. The compression module 22 can be in the form of a separate chip or it can be implemented with software and another processor. For example, the compression module 22 can be in the form of a commercially available compression chip that performs a compression technique in accordance with the JPEG 2000 standard, or other compression techniques. In some embodiments, the image processing module 20 and/or the compression module 22 are implemented in a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), combinations of the same or the like.
The compression module 22 can be configured to perform any type of compression process on the data from the image processing module 20. In some embodiments, the compression module 22 performs a compression technique that takes advantage of the techniques performed by the image processing module 20. For example, as noted above, the image processing module 20 can be configured to reduce the magnitude of the values of the red and blue data by subtracting the magnitudes of green image data, thereby resulting in a greater number of zero values, as well as other effects. Additionally, the image processing module 20 can perform a manipulation of raw data that uses the entropy of the image data. Thus, the compression technique performed by the compression module 22 can be of a type that benefits from the presence of larger strings of zeros to reduce the size of the compressed data output therefrom.
Further, the compression module 22 can be configured to compress the image data from the image processing module 20 to result in a visually lossless output. For example, the compression module can be configured to apply any known compression technique, such as, but without limitation, JPEG 2000, MotionJPEG, any DCT-based codec, any codec designed for compressing RGB image data, H.264, MPEG4, Huffman, or other techniques.
Depending on the type of compression technique used, the various parameters of the compression technique can be set to provide a visually lossless output. For example, many of the compression techniques noted above can be adjusted to different compression rates, wherein when decompressed, the resulting image is better quality for lower compression rates and lower quality for higher compression rates. Thus, the compression module can be configured to compress the image data in a way that provides a visually lossless output, or can be configured to allow a user to adjust various parameters to obtain a visually lossless output. For example, the compression module 22 can be configured to compress the image data at a compression ratio of about 6:1, 7:1, 8:1 or greater. In some embodiments, the compression module 22 can be configured to compress the image data to a ratio of 12:1 or higher.
Additionally, the compression module 22 can be configured to allow a user to adjust the compression ratio achieved by the compression module 22. For example, the camera 10 can include a user interface that allows a user to input commands that cause the compression module 22 to change the compression ratio. Thus, in some embodiments, the camera 10 can provide for variable compression.
As used herein, the term “visually lossless” is intended to include output that, when compared side by side with original (never compressed) image data on the same display device, one of ordinary skill in the art would not be able to determine which image is the original with a reasonable degree of accuracy, based only on a visual inspection of the images.
With continued reference to
In some embodiments, the storage device 24 can be mounted on an exterior of the housing 12. Further, in some embodiments, the storage device 24 can be connected to the other components of the system 14 through standard communication ports, including, for example, but without limitation, IEEE 1394, USB 2.0, IDE, SATA, etc. Further, in some embodiments, the storage device 24 can comprise a plurality of hard drives operating under a RAID protocol. However, any type of storage device can be used.
With continued reference to
The display 30 can be any type of monitoring device. For example, but without limitation, the display 30 can be a four-inch LCD panel supported by the housing 12. For example, in some embodiments, the display 30 can be connected to an infinitely adjustable mount configured to allow the display 30 to be adjusted to any position relative to the housing 12 so that a user can view the display 30 at any angle relative to the housing 12. In some embodiments, the display 30 can be connected to the monitor module through any type of video cables such as, for example, an RGB or YCC format video cable.
Optionally, the playback module 28 can be configured to receive data from the storage device 24, decompress and demosaic the image data, and then output the image data to the display 30. In some embodiments, the monitor module 26 and the playback module 28 can be connected to the display through an intermediary display controller (not shown). As such, the display 30 can be connected with a single connector to the display controller. The display controller can be configured to transfer data from either the monitor module 26 or the playback module 28 to the display 30.
With continued reference to
For example, but without limitation, with reference to
In
As noted above, known Bayer pattern filters often include twice as many green elements as blue and red elements. In the pattern of
Thus, in the operation block 52, the red, blue, and green image data output from the image sensor 18 can be received by the image processing module 20 and organized into separate color data components, such as those illustrated in
After the operation block 52, the flowchart 50 can move on to operation block 54. In the operation block 54, the image data can be further processed. For example, optionally, any one or all of the resulting data (e.g., green 1, green 2, the blue image data from
For example, the image data can be pre-emphasized or processed in other ways. In some embodiments, the image data can be processed to be more (mathematically) non-linear. Some compression algorithms benefit from performing such a linearization on the picture elements prior to compression. However, other techniques can also be used. For example, the image data can be processed with a linear curve, which provides essentially no emphasis.
For instance, the image data may represent linear light sensor data, and the pre-emphasis curve can be designed to preserve detail in darker regions upon application of the compression algorithm. For instance, the pre-emphasis function can be designed to emphasize darker image data values in comparison to brighter image data values, e.g., by applying a log curve or other appropriate function that weights darker image data values higher than brighter image data values. In some cases, the pre-emphasis curve may cause some reduction in precision in highlights or other relatively brighter image regions while preserving detail in shadows or other darker image regions. In some embodiments, the operation block 54 can process the image data using a curve defined by the function y = x^0.5. In some embodiments, this curve can be used where the image data was, for example but without limitation, floating point data in the normalized 0-1 range. In other embodiments, for example, where the image data is 12-bit data, the image can be processed with the curve y = (x/4095)^0.5. Additionally, the image data can be processed with other curves, such as y = (x + c)^g, where 0.01 < g < 1 and c is an offset, which can be 0 in some embodiments. Additionally, log curves can also be used, for example, curves of the form y = A*log(B*x + C), where A, B, and C are constants chosen to provide the desired results. The pre-emphasis curve according to certain embodiments does not reduce the bit depth of the image data. Additionally, the above curves and processes can be modified to provide more linear areas in the vicinity of black, similar to the techniques utilized in the well-known Rec709 gamma curve. In applying these processes to the image data, the same processes can be applied to all of the image data, or different processes can be applied to the different colors of image data. However, these are merely exemplary curves that can be used to process the image data; other curves or transforms can also be used. Additionally, these processing techniques can be applied using mathematical functions such as those noted above, or with Look Up Tables (LUTs). Additionally, different processes, techniques, or transforms can be used for different types of image data, different ISO settings used during recording of the image data, temperature (which can affect noise levels), etc.
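As an illustrative sketch of the LUT-based approach applied to the 12-bit curve above (scaling the output back to a 12-bit code range is an assumption made for illustration):

```python
import numpy as np

# Sketch of the 12-bit pre-emphasis curve y = (x/4095)^0.5, applied via a
# Look Up Table (LUT) rather than per-pixel math.
lut = np.round(((np.arange(4096) / 4095.0) ** 0.5) * 4095).astype(np.uint16)

linear_sensor_data = np.array([0, 16, 64, 1024, 4095], dtype=np.uint16)
emphasized = lut[linear_sensor_data]
print(emphasized)  # [   0  256  512 2048 4095] -- dark codes gain range
```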
After the operation block 54, the flowchart 50 can move to an operation block 56. In the operation block 56, the red and blue picture elements can be transformed. For example, as noted above, green image data can be subtracted from each of the blue and red image data components. In some embodiments, a red or blue image data value can be transformed by subtracting a green image data value of at least one of the green picture elements adjacent to the red or blue picture element. In some embodiments, an average value of the data values of a plurality of adjacent green picture elements can be subtracted from the red or blue image data value. For example, but without limitation, average values of 2, 3, 4, or more green image data values can be calculated and subtracted from red or blue picture elements in the vicinity of the green picture elements.
For example, but without limitation, with reference to
R′m,n = Rm,n − (Gm,n−1 + Gm+1,n + Gm,n+1 + Gm−1,n)/4   (1)
Similarly, the blue elements can be transformed by subtracting the average of the green elements surrounding the blue target element as follows:
B′m+1,n+1 = Bm+1,n+1 − (Gm+1,n + Gm+2,n+1 + Gm+1,n+2 + Gm,n+1)/4   (2)
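A sketch of equations (1) and (2) applied across a mosaic might look as follows (assuming red elements at even row/even column and blue at odd row/odd column, i.e., red on the first line, with border elements left untouched for brevity):

```python
import numpy as np

def subtract_green_average(bayer: np.ndarray) -> np.ndarray:
    """Apply equations (1) and (2) to interior red and blue elements."""
    src = bayer.astype(np.int32)
    out = src.copy()
    rows, cols = src.shape
    for m in range(1, rows - 1):
        for n in range(1, cols - 1):
            # In a Bayer mosaic, red/blue sites have matching row/column
            # parity and are surrounded by four green neighbors.
            if (m % 2) == (n % 2):
                green_avg = (src[m, n - 1] + src[m, n + 1] +
                             src[m - 1, n] + src[m + 1, n]) // 4
                out[m, n] = src[m, n] - green_avg
    return out
```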
With continued reference to
In various embodiments, the flowchart 50 of
In some embodiments, a denoising stage may occur before compression in operation block 58. Removing noise from data prior to compression can be advantageous because it can greatly improve the effectiveness of the compression process. In some embodiments, noise removal can be done as part of the compression process in operation block 58.
In various embodiments, a denoising stage can occur at numerous points in the image data transformation process. For example, denoising can be applied after operation block 52 to raw image data from an image sensor prior to transformation; or to Bayer pattern (or other mosaiced) data after the transformation in operation block 56. In some embodiments, denoising can be applied before or after the pre-emphasis of data that occurs in operation block 54. Of note, denoising data before pre-emphasis can be advantageous because denoising can operate more effectively on perceptually linear data. In addition, in exemplary embodiments, green image data can be denoised before operation block 56 to minimize noise during the transformation process of red and blue picture elements in operation block 56.
As shown in
Additionally, as described above with respect to
By processing the image data in the manner described above with reference to
For example, with reference to
With continued reference to
In the operation block 64, a process performed in operation block 56 (
In the operation block 66, the green picture elements can be demosaiced. For example, as noted above, all the values from the data components Green 1 and/or Green 2 (
With continued reference to
In the operation block 70, the red and blue image data can be demosaiced. For example, firstly, the blue image data of
The operation block 70 can also include a demosaicing process of the red image data. For example, the red image data from
After the operation block 70, the flowchart can move on to an operation block 72. In the operation block 72, the demosaiced red and blue image data can be reconstructed from the demosaiced green image data.
In some embodiments, each of the red and blue image data elements can be reconstructed by adding in the green value from the co-sited green image element (the green image element in the same column “m” and row “n” position). For example, after demosaicing, the blue image data includes a blue element value DBm−2,n−2. Because the original Bayer pattern of
In some embodiments, optionally, the blue and/or red image data can first be reconstructed before demosaicing. For example, the transformed blue image data B′m−1,n−1 can be first reconstructed by adding the average value of the surrounding green elements. This would result in obtaining or recalculating the original blue image data Bm−1,n−1. This process can be performed on all of the blue image data. Subsequently, the blue image data can be further demosaiced by any known demosaicing technique. The red image data can also be processed in the same or similar manners.
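A sketch of this reconstruct-before-demosaic option, under the same layout assumptions as the forward-transform sketch above:

```python
import numpy as np

def add_green_average(transformed: np.ndarray) -> np.ndarray:
    """Invert equations (1) and (2) prior to demosaicing (sketch).

    Because the green elements were stored unmodified, the identical
    four-neighbor average can be recomputed and added back, recovering
    the original interior red and blue values exactly.
    """
    src = transformed.astype(np.int32)
    out = src.copy()
    rows, cols = src.shape
    for m in range(1, rows - 1):
        for n in range(1, cols - 1):
            if (m % 2) == (n % 2):  # red or blue site (assumed layout)
                green_avg = (src[m, n - 1] + src[m, n + 1] +
                             src[m - 1, n] + src[m + 1, n]) // 4
                out[m, n] = src[m, n] + green_avg
    return out
```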
As shown in
In operation block 70′, the image data can be demosaiced. In the description set forth above with reference to operation blocks 66 and 70, the green, red, and blue image data can be demosaiced in two steps. However, in the present flow chart 60′, the demosaicing of all three colors of image data is represented in a single step, although the same demosaicing techniques described above can be used for this demosaicing process. After the operation block 70′, the flow chart can move on to operation block 72, in which the red and blue image data can be reconstructed, and operation block 64, in which an inverse look-up table can be applied.
After the image data has been decompressed and processed according to either of the flow charts 60 or 60′, or any other suitable process, the image data can be further processed as demosaiced image data.
By demosaicing the green image data before reconstructing the red and blue image data, certain further advantages can be achieved. For example, as noted above, the human eye is more sensitive to green light. Demosaicing and processing the green image data optimize the green image values, to which the human eye is more sensitive. Thus, the subsequent reconstruction of the red and blue image data benefits from the processing performed on the green image data.
Additionally, Bayer patterns have twice as many green elements as red and blue elements. Thus, in embodiments where all of the green data is retained, there is twice as much image data for the green elements as compared to either the red or blue image data elements. Thus, demosaicing techniques, filters, and other image processing techniques applied to the green image data result in a better demosaiced, sharpened, or otherwise filtered image. Using these demosaiced values to reconstruct and demosaic the red and blue image data transfers the benefits associated with the higher resolution of the original green data to the processing, reconstruction, and demosaicing of the red and blue elements. As such, the resulting image is further enhanced.
Further, as described above, green channel data may be subtracted from each of the blue and red channels to improve compressibility of the image data with little or no visual loss. According to various embodiments, this advantageous improved compressibility is possible, at least in part, because the color and/or intensity of the red and blue channels are correlated with the color and/or intensity of green channels. Accordingly, subtracting green channel data from red and/or blue channel data may de-correlate a portion of the color and/or intensity data, improving compressibility.
Green Average Subtraction
Referring again to
In some cases, the calculated green value is spatially co-located or spatially correlated with the target pixel. For instance, values for at least two pixels opposing one another with respect to the target pixel may be averaged or combined using some other appropriate type of algorithm to generate the calculated green value. For instance, at least two pixels may include pixels diagonally opposing one another with respect to the target pixel, pixels above and below the target pixel, pixels to the left and right of the target pixel, or a combination thereof.
In various implementations, Green pixels located at a further distance from the target pixel may be used to generate the calculated green value (e.g., an average green value). The calculated green value may be used to perform GAS (Green Average Subtraction), in which the average green value calculated with respect to a target pixel is subtracted from the target pixel. For example, in GAS, average green values may be subtracted from pixels in the Red/Blue data path and/or one of the Green data paths.
For example, the image processing module 20 may compute a 4-pixel average over a kernel of pixels (e.g., defect-corrected and noise-reduced pixels) to obtain an average Green value relative to any particular pixel (e.g., a target pixel). Neighboring Green pixels (or a kernel of Green pixels) to be averaged may be selected based on whether an even row or an odd row is being processed, as well as whether a Red, Blue, or Green pixel is the target pixel. For example, referring again to
Referring to
In an embodiment, when the target pixel is Green 2 pixel Gm−1,n (labeled 180), the four neighboring Green 1 pixels may include Gm−2,n−1, Gm,n−1, Gm,n+1, and Gm−2,n+1 (labeled 181, 182, 183, and 184, respectively). In this example, it may be seen that, for Bayer pattern image data, when the target pixel is a Green pixel, the four closest neighboring pixels may be used as a kernel and may include Green pixels of the other channel (for example, Green 1 channel vs. Green 2 channel) immediately diagonally adjacent (for example, immediately adjacent, diagonally adjacent, spatially adjacent) to the target pixel, for example, the pixels diagonally opposite the Green target pixel. In one embodiment, only two of the diagonally adjacent Green pixels are used as a kernel (e.g., pixel 181 and pixel 183, or pixel 184 and pixel 182). In both cases (i.e., where two diagonally opposing adjacent pixels are used or where all four diagonally adjacent pixels are used), there is some degree of spatial colocation and spatial correlation between the pixels used in the calculation and the target pixel, yielding better results, e.g., results suitable for cinema applications. This can reduce or eliminate the occurrence of edge exaggeration on color boundaries or other undesirable artifacts, which may occur where only a single Green pixel is used in the calculation, or where only pixels on one side of the Green target pixel are employed in the calculation, such as an embodiment where only pixel 181 and pixel 184 are used, or only pixels 182 and 183 are employed. In some alternative embodiments, only a single Green pixel is used in the calculation, or only pixels from one side of the target pixel are used.
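As an illustrative sketch of these kernel choices (the array layout and helper names are assumptions for illustration):

```python
import numpy as np

def green1_diagonal_average(bayer: np.ndarray, row: int, col: int) -> int:
    """Average the four Green 1 pixels diagonally adjacent to a Green 2
    target pixel (pixels 181-184 in the example above); interior targets
    only."""
    diagonals = ((-1, -1), (-1, 1), (1, -1), (1, 1))
    total = sum(int(bayer[row + dr, col + dc]) for dr, dc in diagonals)
    return total // 4

def green1_two_pixel_average(bayer: np.ndarray, row: int, col: int) -> int:
    """Variant using only two diagonally opposing neighbors (e.g., pixels
    181 and 183), which still keeps the kernel spatially co-located around
    the target."""
    return (int(bayer[row - 1, col - 1]) + int(bayer[row + 1, col + 1])) // 2
```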
In various embodiments, and as mentioned above, an average green value (or green value calculated according to some other appropriate algorithm) may be computed based on a kernel of Green pixels located a further distance from the target pixel. Referring again to
Similarly, in various embodiments an average green value with respect to a Red and/or Blue target pixel may be computed based on neighboring Green pixels located a further distance from the target pixel (rather than, or in addition to, the Green pixels immediately adjacent to the target pixel). For example, referring again to
In various embodiments, once neighboring Green pixels are determined for any particular target pixel, an average green value may be determined for that target pixel as described above with reference to
As mentioned above, in various embodiments, green data modification (e.g., GAS) may be performed on both Red/Blue data and on Green2 (or Green 1) data.
Referring to
At block 176, Red, Blue, and/or Green2 pixels may be transformed as described above. Specifically, in an embodiment, respective computed average green values may be subtracted from Red, Blue, and/or Green2 pixel values. In various embodiments, Red/Blue or Green2 pixels may be transformed first. For example, in an embodiment, Red/Blue pixels are first transformed via GAS based on Green1 and Green2 pixel values, according to any of the methods described above; then Green2 pixel values may be transformed via GAS based on Green1 pixel values. In another embodiment, Green1 pixel values may be used to transform Red/Blue and Green2 pixel values in any order and/or simultaneously. In an embodiment, Green1 pixel values, or any combination of Green1 and Green2 pixel values (for example, demosaiced green pixel values), may be used to transform Red/Blue pixels. At block 178, the Red, Blue, and/or Green channels of data, GAS having been implemented, may be compressed, as described above. Although the image data has been transformed (e.g., by the subtraction of green image data), the transformation is reversible. Moreover, the compressed image data according to certain embodiments is compressed raw image data; for example, the compressed raw data remains mosaiced. Moreover, in various implementations, the compressed raw data can be decompressed and then gamma processed, color corrected, tonally processed, and/or demosaiced using any process the user desires.
In an embodiment, GAS may be performed with respect to Green1 data/pixels, rather than with respect to Green2 data/pixels (as described above). In an embodiment, the camera 10 may support GAS performed with respect to either Green1 data and/or Green2 data. In various embodiments, other blocks may be included in flowchart 170. For example, as indicated, flowchart 170 may include a data block in which the image data is pre-emphasized (for example, similar to block 54 of
As described above, GAS with respect to a green channel, in addition to Red/Blue channels, may further improve compressibility of the image data with little or no visual loss. In various embodiments, this further improved compressibility is advantageously possible, at least in part, because the first green channel can be used to predict a second green channel and the first and second green channels may be spatially correlated with one another. Accordingly, in an embodiment, subtracting the first green channel data from the second green channel data may at least partially spatially de-correlate the green image data (further reducing entropy and improving compressibility) and the difference, or residual, may be encoded.
GAS at Bayer Borders
In various embodiments, GAS may be performed for pixels that lie at various edges of mosaiced image data, such as image data that is mosaiced according to a Bayer pattern color filter array.
Mirroring of pixel values for calculations of average green values at the edges of the Bayer pattern may similarly, in various embodiments, be implemented for Red/Blue pixels and for pixels in a first row, last row, first column, and/or last column. For example, referring to Table 1 below:
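A sketch of one possible mirroring scheme (a simple reflection about the frame edge, assumed here for illustration in place of the specific Table 1 mapping):

```python
def mirror_index(i: int, size: int) -> int:
    """Reflect an out-of-frame neighbor index back into the frame."""
    if i < 0:
        return -i
    if i >= size:
        return 2 * size - i - 2
    return i

# A diagonal green neighbor of a first-row target would fall at row -1;
# mirroring substitutes row 1, so the average still uses in-frame pixels.
print(mirror_index(-1, 1080))    # 1
print(mirror_index(1080, 1080))  # 1078
```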
Green Average Subtraction may be applied on the Red/Blue data path and/or Green2 data after a Pre-Emphasis function is applied to the image data in some embodiments (e.g., after block 54 of
GAS Calculation
In an embodiment, image data may be captured, processed, and stored with 16-bit precision. For example, image sensor 18 and image processing module 20 (shown in
RedBlue_GAS[15:0] = (RedBlue[15:0] − GreenAverage[15:0] + 2^16) / 2; and
Green2_GAS[15:0] = (Green2[15:0] − Green1Average[15:0] + 2^16) / 2
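These fixed-point equations transcribe directly into code (a sketch; Python integers stand in for 16-bit hardware registers):

```python
def gas_16bit(sample: int, green_average: int) -> int:
    """Sketch of the 16-bit GAS equations above.

    The raw difference lies in [-65535, 65535]; adding 2^16 and halving
    maps it into the unsigned 16-bit range [0, 65535], at the cost of the
    one-half-bit precision loss noted in the text.
    """
    return (sample - green_average + 2**16) // 2

print(gas_16bit(40000, 41000))  # 32268
print(gas_16bit(41000, 40000))  # 33268
```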
In an embodiment, division by 2 in the RedBlue_GAS and Green2_GAS operations may bring the GAS data into the same dynamic range as the unmodified Green data (for example, Green1 data). The division operation may be implemented as a logical shift by one bit and, in particular implementations, a one-half-bit loss of precision may be incurred during this step. In an embodiment, a one-half-bit loss of precision may advantageously enable faster processing of image data and reduced bandwidth requirements. However, in some embodiments the GAS calculation may be implemented such that there is no loss of precision. For example, an extra buffering bit may be added in the processing pipeline to represent the RedBlue_GAS and Green2_GAS values, and/or the RedBlue_GAS and Green2_GAS values may not be divided by 2.
In some embodiments, the RedBlue_GAS value and/or the Green2_GAS value may be calculated using one of the techniques described above, e.g., the techniques described with respect to block 56 of
In various embodiments, inverse green data modification, which, in the case of Green Average Subtraction, is referred to herein as De-GAS, may be used to reverse the effect of the green data modification transform that was performed upstream in the Sensor Data Path.
At block 202, a decompression algorithm may be applied to the compressed and GASed image data (for example, similar to block 62 of flowchart 60). At block 204, average green values may be calculated based on, for example, Green1 and/or Green2 image data. For example, in an embodiment in which GAS has not been applied to the Green1 image data, Green1 pixels neighboring the Green2 pixels may be determined and averages may be calculated, as described above. At block 206, Green2 pixels may be reconstructed based on the calculated Green1 average values. For example, the Green1 average values may be added back into the Green2 pixel data. At block 208, Red/Blue pixels may be reconstructed based on the calculated Green1 average values and/or calculated Green1 and Green2 average values. At block 209, the reconstructed image data may be filtered and/or denoised (as described above).
In various embodiments, additional blocks may be included in flowchart 200. For example, Green1 and/or Green1 and Green2 image data may be demosaiced, and/or Red/Blue image data may be demosaiced prior to reconstruction of the Red, Blue, and/or Green image data. In another example, filtering and/or denoising may be applied to the image data at any point in the flowchart 200 (for example, as is shown in flowcharts of
An example subsystem for reversing the GAS process is shown in
In an embodiment, the arithmetic equations for De-GAS are as follows:
RedBlue_DeGAS[16:0] = RedBlue_GAS[16:0] * 2 + GreenAverage[16:0] − 2^16; and
Green2_DeGAS[16:0] = Green2_GAS[16:0] * 2 + Green1Average[16:0] − 2^16
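A direct transcription (sketch, continuing the forward example above):

```python
def degas_16bit(gas_value: int, green_average: int) -> int:
    """Sketch of the De-GAS equations above.

    Doubling undoes the forward division and the 2^16 offset is removed;
    reconstruction is exact up to the least significant bit discarded by
    the forward shift.
    """
    return gas_value * 2 + green_average - 2**16

print(degas_16bit(32268, 41000))  # 40000 -- recovers the forward sample
```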
In various embodiments, the RAW path in the Record/Monitor Pre-Process may work on GAS data while a separate RGB path (not shown) may utilize non-GAS data for processing.
Sensor Flip
In some embodiments, the default Bayer pattern is when red pixels occupy the first line of the frame. According to certain embodiments, the term sensor flip denotes when the Bayer pattern is such that blue pixels occupy the first line of the frame. When this happens, the algorithms for GAS and De-GAS may change. For example, for a given red or blue pixel, the locations of the four neighboring green pixels may differ between the two Bayer patterns, as described above.
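A sketch of how a sensor-flip flag might select the color phases (the layout conventions and function name are assumptions for illustration):

```python
def bayer_phase(sensor_flip: bool) -> dict:
    """Return the assumed (row, column) parity of each color site.

    With no flip, red occupies the first line at even/even sites; with
    sensor flip, blue does, so the GAS and De-GAS neighbor-offset tables
    for red and blue are exchanged rather than reprogrammed.
    """
    if not sensor_flip:
        return {"red": (0, 0), "green1": (0, 1), "green2": (1, 0), "blue": (1, 1)}
    return {"blue": (0, 0), "green1": (0, 1), "green2": (1, 0), "red": (1, 1)}

print(bayer_phase(False)["red"])  # (0, 0) -- red on the first line
print(bayer_phase(True)["red"])   # (1, 1) -- flipped pattern
```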
According to various embodiments, additional software may not be needed to implement a sensor flip mode. The sensor flip may be determined automatically from the Bayer register programming of the demosaic block. As such, no additional programming may be necessary.
Additional Embodiments
According to various embodiments, Green Average Subtraction, as implemented on the Red/Blue and Green channels described above, advantageously enables significant improvements in compressibility of the resulting image data with little or no loss of image information. For example, implementing GAS, as described above, on raw image data may enable processing, compression, and/or storage of the raw image data in a lossy manner (e.g., at compression ratios of at least 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, or 18 to 1, or higher) with little or no visual loss, such that the image data is visually lossless or substantially visually lossless upon decompression and playback. In an embodiment, green data modification (e.g., GAS) applied to a green channel, in addition to green data modification on the red and blue data, may enable greater than a doubling of compression efficiency (as compared to lossless compression efficiency without GAS on green image channels).
Thus, in accordance with an embodiment, a video camera can comprise a portable housing, and a lens assembly supported by the housing and configured to focus light. A light sensitive device can be configured to convert the focused light into raw image data with a resolution of at least 2 k (or at least about 4 k depending on the embodiment), at a frame rate of at least about twenty-three frames per second. The camera can also include a memory device and an image processing system configured to compress and store in the memory device the compressed raw image data using lossy compression (e.g., at a compression ratio of at least 2:1, 3:1, 4:1, or 5:1) and remain substantially visually lossless, and at a rate of at least about 23 frames per second.
In accordance with yet another embodiment, a video camera can comprise a portable housing having at least one handle configured to allow a user to manipulate the orientation of the housing with respect to at least one degree of movement during a video recording operation of the camera. A lens assembly can comprise at least one lens supported by the housing and configured to focus light at a plane disposed inside the housing. A light sensitive device can be configured to convert the focused light into raw image data with a horizontal resolution of at least 2k (or, in some embodiments, at least 4k) and at a frame rate of at least about twenty-three frames per second. A memory device can also be configured to store video image data. An image processing system can be configured to compress the raw image data at a compression ratio of at least six to one while remaining substantially visually lossless, and to store the compressed raw image data in the memory device at a rate of at least about 23 frames per second.
Depending on the embodiment, certain acts, events, or functions of any of the algorithms, methods, or processes described herein can be performed in a different sequence, can be added, merged, or left out all together (e.g., not all described acts or events are necessary for the practice of the algorithms). Moreover, in certain embodiments, acts or events can be performed concurrently, e.g., through multi-threaded processing, interrupt processing, or multiple processors or processor cores or on other parallel architectures, rather than sequentially.
Conditional language used herein, such as, among others, “can,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or states. Thus, such conditional language is not generally intended to imply that features, elements and/or states are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or states are included or are to be performed in any particular embodiment. The terms “comprising,” “including,” “having,” and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. Also, the term “or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list. In addition, the articles “a” and “an” are to be construed to mean “one or more” or “at least one” unless specified otherwise.
Conjunctive language such as the phrase “at least one of X, Y and Z,” unless specifically stated otherwise, is otherwise understood within the context as used in general to convey that an item, term, etc. may be either X, Y or Z. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y and at least one of Z to each be present.
While the above detailed description has shown, described, and pointed out novel features as applied to various embodiments, it will be understood that various omissions, substitutions, and changes in the form and details of the devices or algorithms illustrated can be made without departing from the spirit of the disclosure. Thus, nothing in the foregoing description is intended to imply that any particular feature, characteristic, step, module, or block is necessary or indispensable. As will be recognized, the processes described herein can be embodied within a form that does not provide all of the features and benefits set forth herein, as some features can be used or practiced separately from others. The scope of protection is defined by the appended claims rather than by the foregoing description.
This application is a continuation of U.S. patent application Ser. No. 15/656,958, filed Jul. 21, 2017, which application is a continuation of U.S. patent application Ser. No. 15/173,232, filed Jun. 3, 2016, which application is a continuation of U.S. patent application Ser. No. 14/180,168, filed Feb. 13, 2014, which application claims priority benefit under 35 U.S.C. § 119(e) from U.S. Provisional Application Nos. 61/764,821, filed Feb. 14, 2013, and 61/778,325, filed Mar. 12, 2013. The disclosures of each of the foregoing applications are hereby incorporated by reference herein in their entireties.