This application claims the benefit of priority from Japanese Patent Application No. 2023-131792 filed on Aug. 14, 2023, the entire contents of which are incorporated herein by reference.
The present invention relates to an imaging system.
Teleconferencing systems via networks are widely used. In such a teleconferencing system, images of participants taken by web cameras installed or embedded above monitors, for example, are generally displayed on a screen of each terminal. In such a configuration, the line of sight of a participant looking at the monitor does not match the line of sight of that participant as displayed on the screen, which may cause deterioration in the quality of communication. As a conventional technique related to this issue, a display device with a camera in which an imaging module (camera) is built into a display element (display unit), a communication device, and a communication system have been disclosed (for example, Japanese Patent Application Laid-open Publication No. 2005-176151).
In the above conventional technique, the display drive of the display element and the imaging drive of the imaging module are performed alternately. This may cause flickering in the displayed and captured images, which may deteriorate both the display quality and the imaging quality.
An imaging system according to an embodiment of the present disclosure includes a display device that has a display panel, and an imaging device that is disposed such that the display panel is interposed between an object to be imaged and the imaging device, and takes an image of the object to be imaged, the image being transmitted through the display panel. The imaging device acquires light exposure data of a color different from a display color on the display device to generate imaging data of the object to be imaged.
The following describes embodiments of the present disclosure in detail with reference to the accompanying drawings. The present disclosure is not limited by what is described in the following embodiments. Components to be described below include those easily conceivable by those skilled in the art or those substantially identical thereto. In addition, the components to be described below can be combined as appropriate. What is disclosed herein is merely an example, and the present disclosure naturally encompasses appropriate modifications easily conceivable by those skilled in the art while maintaining the gist of the present disclosure. To further clarify the description, the drawings may schematically illustrate, for example, widths, thicknesses, and shapes of various parts as compared with actual aspects thereof. However, they are merely examples, and interpretation of the present disclosure is not limited thereto. The same component as that described with reference to an already mentioned drawing is denoted by the same reference numeral through the present specification and the drawings, and detailed description thereof may not be repeated where appropriate.
In the present disclosure, the display device 100 is a liquid crystal display device that performs display output by what is called a field sequential color (FSC) method, in which the pixels are controlled such that a single pixel transmits light of each of a plurality of colors at different timings.
The display panel P includes a display region 7, a signal output circuit 8, a scan circuit 9, a VCOM drive circuit 10, a timing controller 13, and a power source circuit 14. Hereafter, the surface of the display panel P on which the display region 7 is present is referred to as the display surface, and the other surface is referred to as the back surface. A lateral side of the display device 100 refers to a side in a direction that intersects (e.g., orthogonally) the direction in which the display surface and the back surface of the display device 100 face each other.
A plurality of pixels Pix are arranged in a matrix having a row-column configuration in the display region 7. The pixel Pix includes a switching element 1 and two electrodes.
The display panel P has two substrates facing each other and a liquid crystal 3 sealed between the two substrates. Hereafter, one of the two substrates is referred to as a first substrate 30 and the other as a second substrate 20.
The first substrate 30 includes a light transmissive glass substrate 35, pixel electrodes 2 stacked on the second substrate 20 side of the glass substrate 35, and an insulating layer 55 stacked on the second substrate 20 side to cover the pixel electrodes 2. The pixel electrode 2 is provided individually for each pixel Pix. The second substrate 20 includes a light transmissive glass substrate 21, the common electrode 6 stacked on the first substrate 30 side of the glass substrate 21, and an insulating layer 56 stacked on the first substrate 30 side to cover the common electrode 6. The common electrode 6 has a plate or film shape that is shared by the pixels Pix.
The liquid crystal 3 in a first embodiment is a polymer dispersed liquid crystal. Specifically, the liquid crystal 3 includes a bulk material 51 and fine particles 52. The fine particles 52 change their orientation in the bulk material 51 depending on the potential difference between the pixel electrode 2 and the common electrode 6. The potential of the pixel electrode 2 is individually controlled for each pixel Pix, resulting in the scattering state of the liquid crystal 3 being controlled for each pixel Pix.
The following describes a principle of controlling the potentials of the pixel electrode 2 and the common electrode 6.
The switching element 1 uses a semiconductor such as a thin film transistor (TFT), for example. One of a source and a drain of the switching element 1 is coupled to one (the pixel electrode 2) of the two electrodes. The other of the source and the drain is coupled to a signal line 4. A gate of the switching element 1 is coupled to a scan line 5. Under the control of the scan circuit 9, the scan line 5 provides the potential that opens and closes the path between the source and the drain of the switching element 1.
In the present disclosure, the direction in which the scan line 5 extends is the X direction while the direction in which the scan lines 5 are aligned is the Y direction.
The common electrode 6 is coupled to the VCOM drive circuit 10. The VCOM drive circuit 10 applies a common potential to the common electrode 6.
While the scan circuit 9 supplies a potential that serves as a drive signal to the scan line 5 to turn on the switching element 1, the signal output circuit 8 supplies a pixel signal to the signal line 4. This charges the storage capacitance formed between the pixel electrode 2 and the common electrode 6, as well as the liquid crystal 3 (the fine particles 52), which is a capacitive load. As a result, a voltage depending on the pixel signal is applied between the pixel electrode 2 and the common electrode 6.
After the switching element 1 is turned off, the applied voltage between the pixel electrode 2 and the common electrode 6 is held by the storage capacitance and the liquid crystal 3 (the fine particles 52), which is the capacitive load. The degree of scattering of the liquid crystal 3 (the fine particles 52) is controlled depending on the applied voltage between the pixel electrode 2 and the common electrode 6 for each pixel Pix. The liquid crystal 3 may be a polymer dispersed liquid crystal having characteristics that the degree of scattering increases as the applied voltage between the pixel electrode 2 and the common electrode 6 for each pixel Pix increases. Alternatively, the liquid crystal 3 may be a polymer dispersed liquid crystal having characteristics that the degree of scattering increases as the applied voltage between the pixel electrode 2 and the common electrode 6 for each pixel Pix decreases.
The display device 100 includes a light source 11 and a light source drive circuit 12. The light source 11 has a first light source 11R, a second light source 11G, and a third light source 11B.
The first light source 11R, the second light source 11G, and the third light source 11B each emit light under the control of the light source drive circuit 12. The first light source 11R, the second light source 11G, and the third light source 11B are light sources using light emitting devices such as light emitting diodes (LEDs), for example, but are not limited thereto. The first light source 11R, the second light source 11G, and the third light source 11B may be light sources that can control light emission timings.
The light source drive circuit 12 controls the light emission timing of the first light source 11R, the second light source 11G, and the third light source 11B under the control of the timing controller 13. In the present disclosure, the light emission color (a first color) of the first light source 11R is red (R), the light emission color (a second color) of the second light source 11G is green (G), and the light emission color (a third color) of the third light source 11B is blue (B).
When light is emitted from the light source 11, the display region 7 is illuminated, from the lateral side in the Y direction, with light of the first, the second, and the third colors. Each pixel Pix transmits or scatters the light emitted from the lateral side in the Y direction. The degree of scattering for each pixel Pix depends on the state of the liquid crystal 3, which is controlled according to the pixel signal for that pixel Pix.
The timing controller 13 is a circuit that controls the operation timings of the signal output circuit 8, the scan circuit 9, the VCOM drive circuit 10, and the light source drive circuit 12. In the present disclosure, the timing controller 13 operates on the basis of signals input thereto via the image processing circuit 70.
The image processing circuit 70 outputs signals based on display image data I to the signal output circuit 8 and the timing controller 13. The data indicating the RGB gradation value assigned to one pixel Pix among the pixels Pix provided in the display region 7 is defined as pixel data. The display image data I input to the image processing circuit 70 to output the display image is a set of a plurality of pieces of pixel data for each pixel Pix in the display region 7. The image processing circuit 70 may be provided on one of the substrates included in the display panel P, may be mounted on a flexible printed circuit board on which wiring lines extending from the display panel P and the like are provided, or may be provided outside the display panel P.
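For illustration only, the relationship between the pixel data and the display image data I, and the per-color planes written in the FSC subframes, can be sketched as follows (a minimal Python sketch; the resolution, the 8-bit gradation depth, and all names are assumptions, not part of the embodiment):

```python
import numpy as np

# Display image data I for one frame: one RGB gradation triplet per pixel Pix.
# The 8-bit gradation depth and this resolution are assumptions for illustration.
HEIGHT, WIDTH = 1080, 1920
display_image_data = np.zeros((HEIGHT, WIDTH, 3), dtype=np.uint8)

# Pixel data for the pixel Pix at row y, column x: its RGB gradation values.
y, x = 100, 200
pixel_data = display_image_data[y, x]  # e.g., array([R, G, B])

# In the FSC method, each subframe writes only one color plane to the panel:
red_plane = display_image_data[..., 0]    # written in the first subframe
green_plane = display_image_data[..., 1]  # written in the second subframe
blue_plane = display_image_data[..., 2]   # written in the third subframe
```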
One frame period, in which the display image data I (N) of an N-th frame is displayed, is temporally divided into a first subframe period RF, a second subframe period GF, and a third subframe period BF.
In a vertical scan period GateScan in the first subframe period RF, the pixel data depending on the output gradation value of each pixel Pix corresponding to the first color (red (R)) of the display image data I (N) is written. As a result, the voltage depending on the pixel data for each pixel Pix is applied to the pixel electrode 2, and the scattering state of the liquid crystal 3 for each pixel Pix is controlled according to the applied voltage to the pixel electrode 2.
The first light source 11R emits light in a light emission period RON following the vertical scan period GateScan. In the light emission period RON, light in the first color (red (R)) depending on the pixel data for each pixel Pix written in the previous vertical scan period GateScan is scattered and displayed.
In the vertical scan period GateScan in the second subframe period GF, the pixel data depending on the output gradation value of each pixel Pix corresponding to the second color (green (G)) of the display image data I (N) is written. As a result, the voltage depending on the pixel data for each pixel Pix is applied to the pixel electrode 2, and the scattering state of the liquid crystal 3 for each pixel Pix is controlled according to the applied voltage to the pixel electrode 2.
The second light source 11G emits light in a light emission period GON following the vertical scan period GateScan. In the light emission period GON, light in the second color (green (G)) depending on the pixel data for each pixel Pix written in the previous vertical scan period GateScan is scattered and displayed.
In the vertical scan period GateScan in the third subframe period BF, the pixel data depending on the output gradation value of each pixel Pix corresponding to the third color (blue (B)) of the display image data I (N) is written. As a result, the voltage depending on the pixel data for each pixel Pix is applied to the pixel electrode 2, and the scattering state of the liquid crystal 3 for each pixel Pix is controlled according to the applied voltage to the pixel electrode 2.
The third light source 11B emits light in a light emission period BON following the vertical scan period GateScan. In the light emission period BON, light in the third color (blue (B)) depending on the pixel data for each pixel Pix written in the previous vertical scan period GateScan is scattered and displayed.
In the display device 100 based on the FSC method, an image in which the first (red (R)), the second (green (G)), and the third (blue (B)) colors, i.e., three colors, are combined (mixed) is perceived due to an afterimage phenomenon caused by the limited temporal resolution of human eyes. The display device 100 based on the FSC method does not require a color filter for each pixel Pix, making it possible to increase the light transmittance in the display region 7.
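The subframe drive sequence described above can be summarized in the following sketch, in which hypothetical driver hooks write_plane and set_light_source stand in for the scan circuit 9 and signal output circuit 8, and for the light source drive circuit 12, respectively; the equal split between the scan period and the light emission period is also an assumption:

```python
import time

SUBFRAME_COLORS = ("R", "G", "B")   # first, second, and third colors
FRAME_PERIOD = 1.0 / 60.0           # 60 FPS display, as in the embodiments
SUBFRAME_PERIOD = FRAME_PERIOD / 3.0
SCAN_FRACTION = 0.5                 # assumed GateScan / light emission split

def write_plane(color):             # stub: vertical scan period GateScan
    pass

def set_light_source(color, on):    # stub: light emission period RON/GON/BON
    pass

def drive_one_frame():
    """Drive one frame period: three subframes, each a scan followed by emission."""
    for color in SUBFRAME_COLORS:
        write_plane(color)                            # write pixel data (GateScan)
        time.sleep(SUBFRAME_PERIOD * SCAN_FRACTION)
        set_light_source(color, True)                 # emit the subframe color
        time.sleep(SUBFRAME_PERIOD * (1.0 - SCAN_FRACTION))
        set_light_source(color, False)
```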
The imaging system 200 includes the display device 100 and an imaging device 300. The imaging device 300 includes an image sensor. The image sensor includes color filters, each of which selectively transmits one of the first (red (R)), the second (green (G)), and the third (blue (B)) colors. The color filters are arranged in an array over the imaging pixels of the image sensor.
The imaging device 300 is disposed on the back surface side of the display panel P such that the display panel P is interposed between the object to be imaged PA and the imaging device 300.
Light from the object to be imaged PA passes through the display panel P and forms an image on the image sensor of the imaging device 300, and is thereby captured as image data. Specifically, the imaging device 300 captures the image data of the object to be imaged PA at 90 FPS, for example. In the following description, the image data acquired by the imaging device 300 is also referred to as the “imaging data”.
The image data captured by the imaging device 300 is distributed as video data via a network 400 to information terminals 500 of participants in the teleconferencing system, for example. Examples of the information terminals 500 include desktop and notebook personal computers. The imaging system 200 according to the embodiment may include displays, web cameras, and the like that are included in the information terminals 500, for example.
In the imaging system 200 according to the embodiment, light from the display image data I displayed on the display device 100 may affect the imaging data acquired by the imaging device 300. The following describes a method of acquiring the imaging data in the imaging system 200 according to the embodiment.
In the illustrated example, the frame period F1 is temporally divided into a first subframe period RF1, a second subframe period GF1, and a third subframe period BF1.
The frame period F2 is temporally divided into a first subframe period RF2, a second subframe period GF2, and a third subframe period BF2.
The frame period F3 is temporally divided into a first subframe period RF3, a second subframe period GF3, and a third subframe period BF3.
The frame period F4 is temporally divided into a first subframe period RF4, a second subframe period GF4, and a third subframe period BF4.
Synchronization of the image display timing in the display device 100 with the acquisition timing (a light exposure timing) of the imaging data in the imaging device 300 is controlled by the display device 100, for example. In this case, a synchronization signal output from the display device 100 is input to the imaging device 300.
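This synchronization can be illustrated by the following sketch, in which the display device notifies the imaging device of each subframe boundary; the callback interface and every name here are assumptions for illustration, not the embodiment's interface:

```python
from typing import Callable

class DisplayTimingController:
    """Stand-in for the display side that emits a subframe synchronization signal."""

    def __init__(self):
        self._subscribers: list[Callable[[int, str], None]] = []

    def on_subframe(self, callback: Callable[[int, str], None]) -> None:
        self._subscribers.append(callback)

    def start_subframe(self, frame_no: int, display_color: str) -> None:
        # Called at each subframe boundary: the synchronization signal.
        for notify in self._subscribers:
            notify(frame_no, display_color)

def camera_sync_handler(frame_no: int, display_color: str) -> None:
    # The imaging device exposes only colors other than the display color.
    expose = [c for c in ("R", "G", "B") if c != display_color]
    print(f"frame {frame_no}: display={display_color}, expose={expose}")

controller = DisplayTimingController()
controller.on_subframe(camera_sync_handler)
controller.start_subframe(1, "R")  # -> frame 1: display=R, expose=['G', 'B']
```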
In the illustrated example, the imaging device 300 acquires light exposure data B1 of the third color (blue (B)) in the first subframe period RF1 in which the first color (red (R)) is displayed, and in the second subframe period GF1 in which the second color (green (G)) is displayed in the frame period F1 on the display device 100. The acquired light exposure data B1 is different from the display colors (the first color (red (R)) and the second color (green (G))) in the two subframe periods.
The imaging device 300 then acquires light exposure data G1 of the second color (green (G)) in the third subframe period BF1 in which the third color (blue (B)) is displayed in the frame period F1, and in the first subframe period RF2 in which the first color (red (R)) is displayed in the frame period F2 on the display device 100. The acquired light exposure data G1 is different from the display colors (the third color (blue (B)) and the first color (red (R))) in the two subframe periods.
The imaging device 300 then acquires light exposure data R1 of the first color (red (R)) in the second subframe period GF2 in which the second color (green (G)) is displayed, and in the third subframe period BF2 in which the third color (blue (B)) is displayed in the frame period F2 on the display device 100. The acquired light exposure data R1 is different from the display colors (the second color (green (G)) and the third color (blue (B))) in the two subframe periods.
The imaging device 300 combines the light exposure data R1 of the first color (red (R)), the light exposure data G1 of the second color (green (G)), and the light exposure data B1 of the third color (blue (B)) to generate imaging data RGB1 (=R1, G1, B1).
The imaging device 300 then acquires light exposure data B2 of the third color (blue (B)) in the first subframe period RF3 in which the first color (red (R)) is displayed, and in the second subframe period GF3 in which the second color (green (G)) is displayed in the frame period F3 on the display device 100. The acquired light exposure data B2 is different from the display colors (the first color (red (R)) and the second color (green (G))) in the two subframe periods.
The imaging device 300 combines the light exposure data R1 of the first color (red (R)), the light exposure data G1 of the second color (green (G)), and the light exposure data B2 of the third color (blue (B)) to generate imaging data RGB2 (=R1, G1, B2).
The imaging device 300 then acquires light exposure data G2 of the second color (green (G)) in the third subframe period BF3 in which the third color (blue (B)) is displayed in the frame period F3, and the first subframe period RF4 in which the first color (red (R)) is displayed in the frame period F4 on the display device 100. The acquired light exposure data G2 is different from the display colors (the third color (blue (B)) and the first color (red (R))) in the two subframe periods.
The imaging device 300 combines the light exposure data R1 of the first color (red (R)), the light exposure data G2 of the second color (green (G)), and the light exposure data B2 of the third color (blue (B)) to generate imaging data RGB3 (=R1, G2, B2).
The imaging device 300 then acquires light exposure data R2 of the first color (red (R)) in the second subframe period GF4 in which the second color (green (G)) is displayed, and in the third subframe period BF4 in which the third color (blue (B)) is displayed in the frame period F4 on the display device 100. The acquired light exposure data R2 is different from the display colors (the second color (green (G)) and the third color (blue (B))) in the two subframe periods.
The imaging device 300 combines the light exposure data R2 of the first color (red (R)), the light exposure data G2 of the second color (green (G)), and the light exposure data B2 of the third color (blue (B)) to generate imaging data RGB4 (=R2, G2, B2).
Thereafter, in the same manner, the imaging device 300 generates imaging data RGB5, RGB6, and so on. As a result, when the display device 100 displays the display image data I at 60 FPS, the imaging device 300 can capture the imaging data of the object to be imaged PA at 90 FPS.
As described above, in the acquisition timing of the imaging data according to the first embodiment, the imaging device 300 captures the light exposure data of the color different from the display colors on the display device 100 in the two subframe periods. The imaging device 300 combines pieces of light exposure data acquired in the multiple subframe periods to generate the imaging data of the object to be imaged PA. This prevents light of the display image data I displayed on the display device 100 from affecting the imaging data acquired by the imaging device 300, thus making it possible to display high quality remote images.
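The acquisition schedule of the first embodiment can be illustrated by the following sketch: each color is exposed over the two consecutive subframes in which that color is not displayed, and a new set of imaging data is emitted whenever a channel completes. The names and the simulation scaffolding are assumptions for illustration:

```python
from itertools import cycle

def simulate(num_subframes=12):
    """Each exposure spans two consecutive subframes whose display color differs."""
    display = cycle("RGB")   # display color per subframe (60 FPS x 3 subframes)
    exposure = cycle("BGR")  # B over (RF, GF), G over (BF, RF), R over (GF, BF)
    latest = {}              # most recently completed exposure per channel
    outputs = []
    channel, remaining = next(exposure), 2
    for i in range(num_subframes):
        display_color = next(display)
        assert channel != display_color  # exposure color differs from display color
        remaining -= 1
        if remaining == 0:               # exposure completes at the end of this subframe
            latest[channel] = f"{channel}@subframe{i}"
            if len(latest) == 3:         # R, G, and B all available: new imaging data
                outputs.append(dict(latest))
            channel, remaining = next(exposure), 2
    return outputs  # steady state: one output per two subframes (90 FPS at 60 FPS display)

print(len(simulate(12)))  # -> 4 outputs over four display frames (RGB1 to RGB4)
```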
In the illustrated example, the frame period F1 is temporally divided into the first subframe period RF1, the second subframe period GF1, and the third subframe period BF1.
The frame period F2 is temporally divided into the first subframe period RF2, the second subframe period GF2, and the third subframe period BF2.
In the illustrated example, the imaging device 300 acquires the light exposure data G1 of the second color (green (G)) and the light exposure data B1 of the third color (blue (B)) in the first subframe period RF1 in the frame period F1. The second and the third colors are different from the display color (the first color (red (R))) on the display device 100 in the first subframe period RF1 in the frame period F1.
The imaging device 300 then acquires the light exposure data B2 of the third color (blue (B)), and the light exposure data R1 of the first color (red (R)) in the second subframe period GF1 in the frame period F1. The third and the first colors are different from the display color (the second color (green (G))) on the display device 100 in the second subframe period GF1 in the frame period F1.
The imaging device 300 combines the light exposure data R1 of the first color (red (R)), the light exposure data G1 of the second color (green (G)), the light exposure data B1/2 corresponding to the third color (blue (B)), and the light exposure data B2/2 corresponding to the third color (blue (B)) to generate the imaging data RGB1 (=R1, G1, B1/2+B2/2). The light exposure data B1/2 represents the half value of the light exposure data B1 while the light exposure data B2/2 represents the half value of the light exposure data B2. The imaging device 300 may combine the light exposure data R1 of the first color (red (R)), the light exposure data G1 of the second color (green (G)), and the light exposure data B1 of the third color (blue (B)) to generate the imaging data RGB1 (=R1, G1, B1). The imaging device 300 may combine the light exposure data R1 of the first color (red (R)), the light exposure data G1 of the second color (green (G)), and the light exposure data B2 of the third color (blue (B)) to generate the imaging data RGB1 (=R1, G1, B2).
The imaging device 300 then acquires the light exposure data R2 of the first color (red (R)) and the light exposure data G2 of the second color (green (G)) in the third subframe period BF1 in the frame period F1. The first and the second colors are different from the display color (the third color (blue (B))) on the display device 100 in the third subframe period BF1 in the frame period F1.
The imaging device 300 combines the light exposure data R1/2 corresponding to the first color (red (R)), the light exposure data R2/2 corresponding to the first color (red (R)), the light exposure data G2 of the second color (green (G)), and the light exposure data B2 of the third color (blue (B)) to generate the imaging data RGB2 (=R1/2+R2/2, G2, B2). The light exposure data R1/2 represents the half value of the light exposure data R1 while the light exposure data R2/2 represents the half value of the light exposure data R2. The imaging device 300 may combine the light exposure data R1 of the first color (red (R)), the light exposure data G2 of the second color (green (G)), and the light exposure data B2 of the third color (blue (B)) to generate the imaging data RGB2 (=R1, G2, B2). The imaging device 300 may combine the light exposure data R2 of the first color (red (R)), the light exposure data G2 of the second color (green (G)), and the light exposure data B2 of the third color (blue (B)) to generate the imaging data RGB2 (=R2, G2, B2).
The imaging device 300 then acquires the light exposure data G3 of the second color (green (G)) and the light exposure data B3 of the third color (blue (B)) in the first subframe period RF2 in the frame period F2. The second and the third colors are different from the display color (the first color (red (R))) on the display device 100 in the first subframe period RF2 in the frame period F2.
The imaging device 300 combines the light exposure data R2 of the first color (red (R)), the light exposure data G2/2 corresponding to the second color (green (G)), the light exposure data G3/2 corresponding to the second color (green (G)), and the light exposure data B3 of the third color (blue (B)) to generate the imaging data RGB3 (=R2, G2/2+G3/2, B3). The light exposure data G2/2 represents the half value of the light exposure data G2 while the light exposure data G3/2 represents the half value of the light exposure data G3. The imaging device 300 may combine the light exposure data R2 of the first color (red (R)), the light exposure data G2 of the second color (green (G)), and the light exposure data B3 of the third color (blue (B)) to generate the imaging data RGB3 (=R2, G2, B3). The imaging device 300 may combine the light exposure data R2 of the first color (red (R)), the light exposure data G3 of the second color (green (G)), and the light exposure data B3 of the third color (blue (B)) to generate the imaging data RGB3 (=R2, G3, B3).
The imaging device 300 then acquires the light exposure data B4 of the third color (blue (B)), and the light exposure data R3 of the first color (red (R)) in the second subframe period GF2 in the frame period F2. The third and the first colors are different from the display color (the second color (green (G))) on the display device 100 in the second subframe period GF2 in the frame period F2.
The imaging device 300 combines the light exposure data R3 of the first color (red (R)), the light exposure data G3 of the second color (green (G)), the light exposure data B3/2 corresponding to the third color (blue (B)), and the light exposure data B4/2 corresponding to the third color (blue (B)) to generate the imaging data RGB4 (=R3, G3, B3/2+B4/2). The light exposure data B3/2 represents the half value of the light exposure data B3 while the light exposure data B4/2 represents the half value of the light exposure data B4. The imaging device 300 may combine the light exposure data R3 of the first color (red (R)), the light exposure data G3 of the second color (green (G)), and the light exposure data B3 of the third color (blue (B)) to generate the imaging data RGB4 (=R3, G3, B3). The imaging device 300 may combine the light exposure data R3 of the first color (red (R)), the light exposure data G3 of the second color (green (G)), and the light exposure data B4 of the third color (blue (B)) to generate the imaging data RGB4 (=R3, G3, B4).
The imaging device 300 then acquires the light exposure data R4 of the first color (red (R)) and the light exposure data G4 of the second color (green (G)) in the third subframe period BF2 in the frame period F2. The first and the second colors are different from the display color (the third color (blue (B))) on the display device 100 in the third subframe period BF2 in the frame period F2.
The imaging device 300 combines the light exposure data R3/2 corresponding to the first color (red (R)), the light exposure data R4/2 corresponding to the first color (red (R)), the light exposure data G4 of the second color (green (G)), and the light exposure data B4 of the third color (blue (B)) to generate the imaging data RGB5 (=R3/2+R4/2, G4, B4). The light exposure data R3/2 represents the half value of the light exposure data R3 while the light exposure data R4/2 represents the half value of the light exposure data R4. The imaging device 300 may combine the light exposure data R3 of the first color (red (R)), the light exposure data G4 of the second color (green (G)), and the light exposure data B4 of the third color (blue (B)) to generate the imaging data RGB5 (=R3, G4, B4). The imaging device 300 may combine the light exposure data R4 of the first color (red (R)), the light exposure data G4 of the second color (green (G)), and the light exposure data B4 of the third color (blue (B)) to generate the imaging data RGB5 (=R4, G4, B4).
Thereafter, in the same manner, the imaging device 300 generates imaging data RGB6, RGB7, and so on. As a result, when the display device 100 displays the display image data I at 60 FPS, the imaging device 300 can capture the imaging data of the object to be imaged PA at 180 FPS.
As described above, in the acquisition timing of the imaging data according to the second embodiment, the imaging device 300 captures the pieces of light exposure data of the colors different from the display color on the display device 100 in each subframe period. The pieces of data acquired in the multiple subframe periods are then combined to generate the imaging data of the object to be imaged PA. This prevents light of the display image data I displayed on the display device 100 from affecting the imaging data acquired by the imaging device 300, thus making it possible to display high quality remote images.
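The acquisition schedule of the second embodiment can be illustrated by the following sketch: in every subframe the two non-display colors are exposed, an RGB output is assembled per subframe, and a channel captured in both the current and the previous subframe is combined from half of each exposure (e.g., B1/2 + B2/2). The names and scaffolding are assumptions for illustration:

```python
from itertools import cycle

OTHER = {"R": ("G", "B"), "G": ("B", "R"), "B": ("R", "G")}  # non-display colors

def simulate(num_subframes=6):
    """Expose the two non-display colors per subframe; output once per subframe."""
    display = cycle("RGB")
    counters = {"R": 0, "G": 0, "B": 0}
    prev = {}       # exposures acquired in the previous subframe
    outputs = []
    for _ in range(num_subframes):
        current = {}
        for c in OTHER[next(display)]:   # expose the two non-display colors
            counters[c] += 1
            current[c] = f"{c}{counters[c]}"
        out = {}
        for c in "RGB":
            if c in current and c in prev:
                out[c] = f"{prev[c]}/2+{current[c]}/2"  # straddles two subframes
            else:
                out[c] = current.get(c, prev.get(c))
        if all(out.values()):            # needs a little history before the first output
            outputs.append(out)
        prev = current
    return outputs  # one output per subframe (180 FPS at a 60 FPS display)

print(simulate(6)[0])  # -> {'R': 'R1', 'G': 'G1', 'B': 'B1/2+B2/2'}, i.e., RGB1
```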
When the imaging device 300 has the image sensor having the color filters consisting of the complementary color filters, the color filters selectively transmit cyan (C) (a fourth color), which is the complementary color of red (R) (the first color), magenta (M) (a fifth color), which is the complementary color of green (G) (the second color), and yellow (Y) (a sixth color), which is the complementary color of blue (B) (the third color).
The imaging device 300 having this configuration captures the light exposure data of a color different from the display colors on the display device 100 in the two subframe periods in the same manner as the first embodiment. The imaging device 300 combines pieces of light exposure data acquired in the multiple subframe periods to generate the imaging data of the object to be imaged PA. This prevents light of the display image data I displayed on the display device 100 from affecting the imaging data acquired by the imaging device 300, thus making it possible to display high quality remote images.
The imaging device 300 captures the light exposure data of colors different from the display color on the display device 100 in each subframe period in the same manner as the second embodiment. The imaging device 300 combines pieces of light exposure data acquired in the multiple subframe periods to generate the imaging data of the object to be imaged PA. This prevents light of the display image data I displayed on the display device 100 from affecting the imaging data acquired by the imaging device 300, thus making it possible to display high quality remote images.
Alternatively, when the imaging device 300 has the image sensor having the color filters consisting of the complementary color filters, the imaging device 300 may extract the light exposure data of the first (red (R)), the second (green (G)), and the third (blue (B)) colors from the light exposure data of the complementary colors.
The imaging device 300 having this configuration captures the light exposure data of a color different from the display colors on the display device 100 in the two subframe periods in the same manner as the first embodiment.
Specifically, the imaging device 300 acquires the light exposure data of cyan (C) (the fourth color), which is the complementary color of red (R) (the first color), in the first subframe period in which red (R) (the first color) is displayed in the first frame. The imaging device 300 acquires the light exposure data of magenta (M) (the fifth color), which is the complementary color of green (G) (the second color), in the second subframe period in which green (G) (the second color) is displayed in the first frame, for example. Then, the imaging device 300 extracts the light exposure data of blue (B) (the third color) from each of the light exposure data of cyan (C) (the fourth color) acquired in the first subframe period in the first frame and the light exposure data of magenta (M) (the fifth color) acquired in the second subframe period in the first frame.
The imaging device 300 acquires the light exposure data of yellow (Y) (the sixth color), which is the complementary color of blue (B) (the third color), in the third subframe period in which blue (B) (the third color) is displayed in the first frame, and acquires the light exposure data of cyan (C) (the fourth color), which is the complementary color of red (R) (the first color), in the first subframe period in which red (R) (the first color) is displayed in the second frame, for example. Then, the imaging device 300 extracts the light exposure data of green (G) (the second color) from each of the light exposure data of yellow (Y) (the sixth color) acquired in the third subframe period in the first frame and the light exposure data of cyan (C) (the fourth color) acquired in the first subframe period in the second frame.
The imaging device 300 acquires the light exposure data of magenta (M) (the fifth color), which is the complementary color of green (G) (the second color), in the second subframe period in which green (G) (the second color) is displayed in the second frame, and acquires the light exposure data of yellow (Y) (the sixth color), which is the complementary color of blue (B) (the third color), in the third subframe period in which blue (B) (the third color) is displayed in the second frame, for example. Then, the imaging device 300 extracts the light exposure data of red (R) (the first color) from each of the light exposure data of magenta (M) (the fifth color) acquired in the second subframe period in the second frame and the light exposure data of yellow (Y) (the sixth color) acquired in the third subframe period in the second frame.
The pieces of light exposure data acquired in the multiple subframe periods are then combined to generate the imaging data of the object to be imaged PA. This prevents light of the display image data I displayed on the display device 100 from affecting the imaging data acquired by the imaging device 300, thus making it possible to display high quality remote images.
The imaging device 300 captures the light exposure data of colors different from the display color on the display device 100 in each subframe period in the same manner as the second embodiment.
Specifically, the imaging device 300 acquires the light exposure data of cyan (C) (the fourth color), which is the complementary color of red (R) (the first color), in the first subframe period in which red (R) (the first color) is displayed in the first frame, and extracts the light exposure data of green (G) (the second color) and the light exposure data of blue (B) (the third color) from the acquired light exposure data of cyan (C) (the fourth color), for example.
The imaging device 300 acquires the light exposure data of magenta (M) (the fifth color), which is the complementary color of green (G) (the second color), in the second subframe period in which green (G) (the second color) is displayed in the first frame, and extracts the light exposure data of blue (B) (the third color) and the light exposure data of red (R) (the first color) from the acquired light exposure data of magenta (M) (the fifth color), for example.
The imaging device 300 acquires the light exposure data of yellow (Y) (the sixth color), which is the complementary color of blue (B) (the third color), in the third subframe period in which blue (B) (the third color) is displayed in the first frame, and extracts the light exposure data of red (R) (the first color) and the light exposure data of green (G) (the second color) from the acquired light exposure data of yellow (Y) (the sixth color), for example.
The imaging device 300 combines pieces of light exposure data acquired in the multiple subframe periods to generate the imaging data of the object to be imaged PA. This prevents light of the display image data I displayed on the display device 100 from affecting the imaging data acquired by the imaging device 300, thus making it possible to display high quality remote images.
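The extraction of the first, the second, and the third colors from complementary-color exposures can be illustrated under the ideal assumption that the cyan, magenta, and yellow exposures correspond to G+B, R+B, and R+G, respectively. The embodiments do not specify the extraction arithmetic, so this sketch is one possible realization, not the method itself:

```python
import numpy as np

def cmy_to_rgb(c: np.ndarray, m: np.ndarray, y: np.ndarray):
    """Recover R, G, B exposure data from ideal C = G+B, M = R+B, Y = R+G exposures."""
    r = (m + y - c) / 2.0
    g = (c + y - m) / 2.0
    b = (c + m - y) / 2.0
    return r, g, b

# Example: a scene patch with R=10, G=20, B=30 seen through ideal C/M/Y filters.
c = np.array([50.0])  # G + B
m = np.array([40.0])  # R + B
y = np.array([30.0])  # R + G
print(cmy_to_rgb(c, m, y))  # -> (array([10.]), array([20.]), array([30.]))
```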
The preferred embodiments of the present disclosure are described above. The present disclosure is not limited to such embodiments. The contents disclosed in the embodiments are only examples and various modifications can be made without departing from the purpose of the present disclosure. For example, appropriate modifications made within the scope that does not depart from the purpose of the present disclosure naturally belong to the technical scope of the invention.