See Application Data Sheet.
Not applicable.
This invention relates to an information processing device, an information processing system and an information processing method.
Display devices equipped with so-called uniformity correction, which corrects the luminance displayed on the screen so that it is as uniform as possible over the entire screen surface, are used in the medical and printing industries, for example. Uniformity correction corrects the red, green and blue (RGB) signals of each pixel making up an input image: the displayed signals are obtained by multiplying the input data by predetermined uniformity correction factors.
An image processing device has been proposed that, when multiplying the input data by the uniformity correction factors, shifts the positions of the pixels integrating the R, G and B signals by several units to several tens of units, thereby preventing the generation of a grid-like luminance distribution (Patent Reference 1).
Patent Reference 1: Publication No. 2016-46751
The image processing device described in Patent Reference 1 adjusts the luminance of the screen only in relative terms. Under these conditions, the display cannot be based on an absolute luminance value.
A principal but not limiting objective of the invention is to provide an information processing device or the like that can perform an image display based on an absolute value of luminance.
The information processing device comprises a first acquisition unit that acquires luminance correction information relating the luminance of a display unit, as measured from a predetermined measurement position, to the luminance information contained in the input signal; a second acquisition unit that acquires the image data to be displayed by the display unit; a luminance correction unit that corrects the image data acquired by the second acquisition unit on the basis of the luminance correction information acquired by the first acquisition unit; and an output unit that transmits the image data corrected by the luminance correction unit to the display unit.
The invention provides an information processing device or the like capable of performing an image display based on an absolute value of luminance.
Configuration 1
The information processing system 10 consists of an information processing device 20 (see
Real luminance image data, including luminance information corresponding to real luminance, is input into the information processing device 20. Here, “real luminance” refers to a physical quantity whose value is uniquely determined from the spectral radiance through an inherent spectral sensitivity curve, such as luminance or a tristimulus component, or to the spectral radiance itself. Displaying in “real luminance” means reproducing this absolute physical quantity and displaying it as such.
The real luminance image data are, for example, those of a real image taken by a high-resolution two-dimensional color luminance meter 36. To obtain real luminance image data, a luminance calibration must be carried out beforehand to enable shooting in real luminance. The data can also come from a real image taken by a digital camera, from a synthetic image created by a simulation tool based on physical theory, or from a spectral image taken by a hyperspectral camera or other such device.
The real luminance image data can, for example, represent each pixel of the image to be displayed using the X, Y and Z tristimulus values of the CIE color system (CIE: International Commission on Illumination). Each pixel of the image to be displayed can alternatively take a value from the CIELAB color space (CIE L*a*b*), the CIERGB color space (CIE Red, Green, Blue), or the CIELMS color space (CIE Long, Medium, Short), etc., that is, a physical quantity that integrates a spectral sensitivity curve and whose value is uniquely determined from the spectral radiance. The physical quantity is not limited to three dimensions; it can be one-, two-, four- or higher-dimensional. Real luminance image data can also be image data that includes the spectral radiance of each pixel.
Real luminance image data can also be image or video data in a common format such as JPEG (Joint Photographic Experts Group) or PNG (Portable Network Graphics), associated with reference information or the like that maps RGB gradation values to the luminance levels appropriate to that data.
Real luminance image data can also be image data or video data in a common format, such as JPEG or PNG, combined with data that maps the recorded RGB gradation values to luminance levels on the basis of the gamma value and color sensitivity information of the shooting equipment.
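As an illustration only, such a mapping from recorded RGB gradation values to luminance can be sketched as follows; the gamma value, peak luminance and Rec. 709 luma coefficients are assumptions made for the example, not values given by the invention.

```python
def gradation_to_luminance(r, g, b, gamma=2.2, peak_cd_m2=100.0):
    """Estimate the real luminance (cd/m^2) encoded by 8-bit RGB
    gradation values, given the shooting equipment's gamma value and
    an assumed peak luminance (both hypothetical figures)."""
    # Linearize each channel with the recorded gamma value.
    lin = [(v / 255.0) ** gamma for v in (r, g, b)]
    # Combine channels with (assumed) Rec. 709 luma coefficients.
    relative = 0.2126 * lin[0] + 0.7152 * lin[1] + 0.0722 * lin[2]
    return relative * peak_cd_m2

print(gradation_to_luminance(255, 255, 255))  # peak white
print(gradation_to_luminance(128, 128, 128))  # mid gray
```

In practice the coefficients and gamma would be taken from the reference information or the shooting equipment's sensitivity data described above.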
The real luminance image data is luminance-corrected according to the luminance correction information described below. The image data after luminance correction is input to the projectors 31. Projectors 31 project the image onto screen 33 based on the input image data. In this configuration, projectors 31 project the input image with its left and right sides inverted; such a projection mode is called rear projection.
The image is projected onto screen 33 in rear-projection mode. The present configuration describes an example in which the image is viewed from a position approximately opposite projector 31. In general, a rear-projected image has high luminance in the central part and lower luminance at the periphery, depending on the light distribution characteristics of projector 31 and the orientation of screen 33. For example, when the observation position moves to the right, the area of high luminance appears to move to the right as well.
The luminance correction information corrects the luminance distribution and the absolute luminance value, which vary depending on the observation position. In the following explanation, the processing up to the creation of the luminance correction information is referred to as the “preparation phase”, and the position from which the luminance is measured in order to create the luminance correction information is called the measurement position.
Once the preparation phase is complete, the information processing system 10 of this configuration enters the operational use phase. In the operational use phase, real luminance image data, corrected with the luminance correction information corresponding to a measurement position, is input to projectors 31 and projected onto screen 33. From the measurement position, a real luminance image faithful to the real luminance image data can thus be observed. By placing the test camera 15 at the measurement position, real luminance images can be captured with that camera.
Testing a camera 15 with the system described above makes it possible, for example, to evaluate the effects, on the images created by that camera, of lens glare and ghost images caused by the headlights of an oncoming vehicle, or of the brightness variation before and after a tunnel. Since several camera models 15 can easily be evaluated under identical conditions, useful information can be obtained, for example, for selecting on-board camera models.
After a luminance measurement is completed, the gray level of the image projected by projectors 31 onto screen 33 is changed and the luminance is measured again. In this way, the luminance at each point of the image is measured for several different gray levels projected by projectors 31.
In the present configuration, a high-resolution two-dimensional color luminance meter is used as luminance meter 36. A two-dimensional luminance meter can also be used as luminance meter 36. The luminance at each point of the image can also be measured by a single-point luminance meter that mechanically scans screen 33.
The information processing device 20 consists of a central processing unit (CPU) 21, a main memory 22, an auxiliary storage device 23, a communication unit 24, an output interface 25, an input interface 26 and a computer bus. The information processing device 20 in this configuration is, for example, a conventional personal computer or tablet.
The CPU 21, in the present configuration, is a computing control unit that executes the program. The CPU 21 can be a single processor, several processors, or a multi-core processor. Instead of, or in addition to, one or more CPUs or multi-core CPUs, FPGAs (field-programmable gate arrays), CPLDs (complex programmable logic devices), ASICs (application-specific integrated circuits) or GPUs (graphics processing units) can also be used. The CPU 21 can be connected via a computer bus to the hardware units making up the information processing device 20.
Main memory 22 is a storage device such as SRAM (static random access memory), DRAM (dynamic random access memory) or flash memory. Main memory 22 holds the information needed during the processing performed by the information processing device 20 and temporarily stores the program executed by that device.
Auxiliary storage device 23 can be a memory such as SRAM, flash memory, a hard disk, or magnetic tape. Auxiliary storage device 23 stores the program to be executed by CPU 21, a luminance measurement database 51, a luminance correction database 52, and various data necessary for program execution.
The luminance measurement database 51 and the luminance correction database 52 can be stored in a different storage device connected to the information processing device 20 via a network. The details of each database will be described below. The communication unit 24 is an interface for communication with a network.
The output interface 25 is an interface for outputting the image data to be displayed by the display device 30. The input interface 26 is an interface allowing the acquisition of the results of the luminance measurements by the luminance meter 36. Input interface 26 can also be an interface for reading data measured in advance by the luminance meter 36 using a portable storage medium such as an SD (secure digital) memory card.
The display device 30 is equipped with a screen 33 and projectors 31. Screen 33 is for rear projection. Screen 33 is only an example of a display unit that can be used in this configuration.
The display device 30 can include a front projector 31 with a screen 33 suitable for front projection. The display device 30 can also be a liquid crystal display panel, an organic electroluminescence (OLED) display panel, or any other type of display panel.
In the operational use phase, instead of the luminance meter 36, a camera 15 under test can be placed as shown in
The vertical axis in
In the position field, a position on screen 33 is recorded by its X and Y coordinates. In the present configuration, the X and Y coordinates correspond to the measurement pixel positions of the two-dimensional color luminance meter 36. The input level 10 field records the real luminance measured at each position on the screen when input level 10 is transmitted by the output interface 25 to the projectors 31; at this input level, the projectors 31 display the entire screen surface as dark gray. The unit of luminance is the candela per square meter.
Similarly, the input level 20 field records the luminance values measured at each point on the screen when input level 20 is transmitted by output interface 25 to the projectors 31. The input level 255 field records the real luminance values at each point on the screen when level 255 is input to the projectors via output interface 25.
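A minimal sketch of how the luminance measurement database 51 described above might be organized; the layout, positions and measured values are invented for illustration only.

```python
# One record per measured screen position (X, Y coordinates);
# one field per projected input level, holding the measured
# real luminance in candelas per square meter.
luminance_db_51 = {
    (0, 0): {10: 1.2, 20: 4.8, 255: 480.0},
    (0, 1): {10: 1.1, 20: 4.5, 255: 455.0},
}

# Luminance measured at position (0, 0) for input level 255.
print(luminance_db_51[(0, 0)][255])
```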
The luminance correction database 52 has a position field and display luminance value fields. There can be any number of display luminance value fields, depending on the luminance values to be displayed, e.g. 100, 200, 5000 or 10000.
In the position field, a position on screen 33 is recorded by its X and Y coordinates. The display luminance 100 field records the input level value transmitted to the projector from output interface 25 when the displayed luminance, as measured by a luminance meter placed at the measurement position, is 100 candelas per square meter.
Similarly, the display luminance 200 field records the input level values transmitted via output interface 25 to the projector when the displayed luminance, measured by a luminance meter 36 placed at the measurement position, is 200 candelas per square meter. The display luminance 5000 field records the input level values transmitted to the projectors 31 via the output interface for a displayed luminance of 5000 candelas per square meter, measured by a luminance meter 36 placed at the measurement position.
In
If a luminance value corresponding to the displayed luminance value field in
The CPU 21 determines an input level value (step S501). The interval between input level values can be set arbitrarily, for example every ten levels. The luminance distribution evaluation image is displayed by display device 30 (step S502). Specifically, the CPU 21 transmits to the projectors 31, via the output interface 25, the data of a luminance distribution evaluation image whose entire surface is at the level determined in step S501. The projectors project the image onto the screen according to the acquired image data, and the luminance distribution evaluation image is displayed on the screen. The evaluation image can also be, for example, an image in which different levels are arranged in a checkerboard pattern.
The CPU 21 obtains measurements of the luminance distribution via the luminance meter 36 and the input interface 26 (step S503). The CPU 21 stores (step S504) the measured values in the field corresponding to the value of the input levels determined in step S501 as a record corresponding to each coordinate position of the luminance measurement database 51.
The CPU 21 determines whether or not the measurements for the predetermined input level values have been completed (step S505). If it is determined that they have not been completed (NO in step S505), CPU 21 returns to step S501. If it is determined that they are complete (YES in step S505), CPU 21 starts the subroutine for calculating the correction values (step S506). The correction value calculation subroutine creates the luminance correction database 52 from the luminance measurement database 51. Its processing flow is described below.
CPU 21 interpolates the luminance correction database 52 to match the resolution of the input data to be supplied to projectors 31 (step S507). Specifically, CPU 21 adds records to the luminance correction database 52 so that the number of records matches the number of pixels displayed by the projectors. CPU 21 fills the input level value fields of the added records using an arbitrary interpolation technique, and corrects the data in the position field to match the positions of the projector pixels. CPU 21 then terminates the process.
The CPU 21 obtains from the luminance measurement database 51 a measurement result showing the relationship between input level value and luminance value for one record of database 51, i.e. one position on the screen (step S512).
CPU 21 calculates the input level values corresponding to the luminance values of each display luminance value field in the luminance correction database 52 (step S513). The CPU can calculate the input level value for a given display luminance value, for example, by linear interpolation of the data acquired in step S512. Alternatively, a function relating the input level value to the display luminance value can be fitted by the method of least squares or a similar method, and the input level value for a given display luminance value calculated from that function.
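The linear-interpolation option mentioned above can be sketched as follows for one screen position; the sample measurement pairs are invented for the example.

```python
def input_level_for_luminance(measurements, target_cd_m2):
    """Given (input_level, measured_luminance) pairs for one screen
    position, return the input level yielding the target luminance,
    by linear interpolation between the two surrounding measurements
    (one option for step S513)."""
    pts = sorted(measurements)  # sort by input level
    for (l0, y0), (l1, y1) in zip(pts, pts[1:]):
        if y0 <= target_cd_m2 <= y1:
            t = (target_cd_m2 - y0) / (y1 - y0)
            return l0 + t * (l1 - l0)
    raise ValueError("target luminance outside measured range")

# Invented measurements at one position: level 10 -> 1.2 cd/m^2, etc.
samples = [(10, 1.2), (20, 4.8), (128, 110.0), (255, 480.0)]
print(input_level_for_luminance(samples, 100.0))
```

A least-squares fit of a gamma-like response curve, the other option the text names, would replace the piecewise-linear lookup with an inverted fitted function.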
CPU 21 stores the input level values for each display luminance value calculated in step S513 in the record of the luminance correction database 52 corresponding to the position obtained in step S512 (step S514).
The CPU 21 determines whether or not it has completed processing all records of the luminance measurement database 51 (step S515). If it is determined that processing is not complete (NO in step S515), CPU 21 returns to step S512. If it is determined that processing is complete (YES in step S515), CPU 21 terminates the process.
The CPU 21 acquires the luminance value of a pixel of the image acquired in step S521 (step S522). The CPU 21 extracts from the luminance correction database 52 the record corresponding to the position of the pixel acquired in step S522, then acquires the input level value from the field corresponding to the luminance value acquired in step S522 (step S523). Through step S523, the CPU 21 performs the function assigned to the first acquisition unit of the present configuration.
If the luminance correction database 52 does not have a field corresponding to the luminance value obtained in step S522, CPU 21 calculates the input level value by interpolation.
CPU 21 records the input level values obtained in step S523 in relation to the pixel positions obtained in step S522 (step S524). In step S524, CPU 21 performs the function assigned to the luminance correction unit of this configuration. CPU 21 determines whether or not the processing of all pixels of the original image data has been completed (step S525). If it is determined that processing is not complete (NO in step S525), CPU 21 returns to step S522.
When processing is determined to be complete (YES in step S525), the CPU 21 transmits to projector 31, via output interface 25, the image data formed by the input level values of each pixel recorded in step S524 (step S526). Through step S526, CPU 21 performs the function assigned to the output unit of the present configuration. Projector 31 projects the image onto screen 33 based on the input image data. CPU 21 then terminates the process.
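Steps S522 to S525 can be sketched as a per-pixel lookup; the database layout, helper name and values are assumptions made for the example.

```python
def correct_image(image, correction_db_52):
    """Replace each pixel's target real luminance with the projector
    input level recorded for that screen position (steps S522-S525).
    The corrected data is what step S526 would transmit to the projector."""
    corrected = {}
    for (x, y), luminance in image.items():
        record = correction_db_52[(x, y)]      # step S523: fetch the record
        corrected[(x, y)] = record[luminance]  # luminance -> input level
    return corrected

# Invented correction records and a two-pixel target image.
db = {(0, 0): {100: 118, 200: 161}, (0, 1): {100: 121, 200: 165}}
img = {(0, 0): 100, (0, 1): 200}
print(correct_image(img, db))
```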
According to the procedure described above, screen 33 displays an image in real luminance when viewed from the measurement position.
According to the present configuration, an information processing device 20 or the like can be realized that is capable of display based on an absolute luminance value.
As an application example, by placing a test camera 15 at the measurement position and aiming it at screen 33, the test camera 15 can be evaluated using real luminance images.
Using the information processing system 10 as described in the present configuration, it is possible to evaluate, on the images taken by the test camera 15, the effects of lens glare and ghost images caused, for example, by the headlights of oncoming motor vehicles, or changes in brightness at the entrance and exit of tunnels.
The image displayed in real luminance can be dynamic, for example a video. By switching the image to be projected from projector 31 to screen 33 at a predetermined frame rate, a video can be displayed on screen 33 in real luminance. This can make it possible, for example, to check the operation of the autonomous driving system on the basis of images captured by an on-board camera. It is also possible to carry out driving simulations and other applications using the images displayed in real luminance.
Configuration 2
The present configuration concerns an information processing device 20 that creates luminance correction information for a plurality of measurement positions and displays images corrected according to the measurement position closest to the position where the test camera 15 or its equivalent is installed. The description of the parts common with configuration 1 is omitted.
In this configuration, the process corresponding to the preparation step as described in
CPU 21 calculates the distance between the position acquired in step S531 and each of the multiple measurement positions for which luminance correction information was previously created (step S532). CPU 21 then selects a measurement position for luminance correction (step S533). Subsequent processing uses the luminance correction database 52 corresponding to the selected measurement position.
In step S533, the measurement position closest to the position acquired in step S531 can be selected. In step S533, several measurement positions close to the position acquired in step S531 can also be selected and the measurement value at the position acquired in step S531 can be estimated by interpolating the data.
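Steps S532 and S533 can be sketched as a nearest-neighbour selection over the measurement positions; the Euclidean distance metric and the coordinates are assumptions made for the example.

```python
import math

def nearest_measurement_position(camera_pos, measurement_positions):
    """Select the measurement position closest to the test camera
    (steps S532-S533); ties resolve to the first candidate."""
    return min(measurement_positions,
               key=lambda p: math.dist(camera_pos, p))

# Invented 3-D coordinates for three measurement positions.
positions = [(0.0, 0.0, 2.0), (1.0, 0.0, 2.0), (0.0, 1.0, 2.0)]
print(nearest_measurement_position((0.9, 0.1, 2.0), positions))
```

The interpolation variant mentioned in the text would instead pick several nearby positions and blend their correction data by distance.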
CPU 21 obtains the original image data from the auxiliary storage device 23 or from another server or similar device connected via a network (step S521). Since the subsequent processing is the same as the processing performed by the configuration 1 program described in
According to the present configuration, an information processing system 10 can be realized that performs luminance correction by selecting the closest of a plurality of measurement positions. As an application example, the information processing system 10 can display a real luminance image even when the position of the test camera 15 is changed.
In step S502 of the program described using
Configuration 3
The present configuration concerns an information processing system 10 that superimposes an image projected on a screen 33 from a plurality of projectors 31. The descriptions of the common parts with configuration 1 will be omitted.
In this configuration, the preparation phase consists of two steps: a deformation acquisition step and a luminance distribution acquisition step.
The information processing system 10 in the deformation acquisition phase is equipped with an information processing device 20, a display device 30 and a luminance meter 36.
The information processing device 20 is equipped with a central processing unit 21, a main memory 22, an auxiliary storage device 23, a communication unit 24, an output interface 25, an input interface 26, a control display 27 and a bus. The control display 27 is, for example, a liquid crystal display device provided in the information processing device 20. The information processing device 20 in this configuration may be a personal computer, a general-purpose tablet or another equivalent information processing device.
The display device 30 comprises a screen 33 and a plurality of projectors 31, such as a first projector 311, a second projector 312, and so on. In the following description, the individual projectors 31 will be referred to generically as projector 31 when they do not need to be distinguished. The arrangement of projectors 31 will be described below.
A camera 37 is connected to the input interface 26. Camera 37 is placed in front of screen 33, opposite and facing projector 31. Camera 37 can also be placed on the same side as the first projector 311 and the other projectors, provided it does not block their projection paths. Camera 37 is a high-resolution digital camera.
In the present configuration, a total of six projectors 31 are used, in two rows of three projectors from left to right, i.e. three columns of two projectors from top to bottom. The projectors 31 at both horizontal ends are arranged in a fan shape so that the optical axis of each end projector 31 is oriented towards the optical axis of the projector 31 in the middle position.
A group of several projectors 31 can be housed in a single enclosure and thus supplied in an integrated form that appears to be a single projector. When projectors are supplied as a single integrated projector in this way, all or part of the projector group 31 can share optical components such as projection lenses, relay optics or spatial light modulators. All or part of the projector group 31 can also share a single optical path, as well as power supply circuits, command and control circuits, and so on.
As shown in
Even if the projection area of each of the projectors 31 is adjusted to correspond as much as possible to a common projection area by adjusting the installation position and lens shift of these projectors 31, the projection area of these projectors will differ, as shown in
CPU 21 operates the projectors 31 one by one and acquires the projection area of each of them via camera 37. The CPU 21 displays the projection areas of the projectors 31 superimposed on the control display 27, as shown in
The CPU 21 can automatically determine the operational range by calculating a rectangle with a predetermined aspect ratio that is included in the projection area of each of the projectors 31. In the following description, the coordinates in the operational range will be used to indicate the position on the screen 33.
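As a simplified sketch of determining the operational range, assuming each projection area has first been reduced to an axis-aligned rectangle (a simplification of the actual distorted areas; coordinates are invented):

```python
def operational_range(projection_rects):
    """Intersect the projection rectangles (x0, y0, x1, y1) of all
    projectors to obtain the range covered by every projector."""
    x0 = max(r[0] for r in projection_rects)
    y0 = max(r[1] for r in projection_rects)
    x1 = min(r[2] for r in projection_rects)
    y1 = min(r[3] for r in projection_rects)
    if x0 >= x1 or y0 >= y1:
        raise ValueError("no common projection area")
    return (x0, y0, x1, y1)

# Invented projection areas of three projectors.
rects = [(0, 0, 100, 60), (5, 2, 104, 62), (3, 1, 102, 59)]
print(operational_range(rects))
```

Fitting a rectangle with a predetermined aspect ratio inside this intersection, as the text describes, would be an additional step.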
The operational range can be defined as the projection range common to any number of projectors 31, for example, three or more projectors.
The CPU 21 transmits to each of the projectors 31 image data transformed from the original image so as to project a predetermined image over the operational range. Projectors 31 project the input images onto screen 33, as shown in
As in configuration 1, by inputting image data with corrected luminance distribution to each of the projectors 31, a real luminance image can be displayed on screen 33, including at high luminance levels that a single projector 31 could not reproduce on screen 33.
CPU 21 starts a subroutine to obtain the luminance distribution (step S552). The luminance distribution acquisition subroutine is a subroutine that measures the luminance distribution and creates a luminance correction database 52 as described in
The image used to acquire the deformation can be any image, such as a so-called checkerboard image in which white and black squares are arranged alternately. In the following explanation, this will be the example of the case where an all-white image is used as an image for deformation acquisition.
The CPU 21 acquires the projection area of the white image via camera 37 and stores it in the auxiliary storage device 23 (step S563). CPU 21 then determines whether or not processing for all projectors 31 is complete (step S564). If it is determined that processing is not complete (NO in step S564), CPU 21 returns to step S561.
If it is determined that processing is complete (YES in step S564), the CPU 21 determines the operational range described in
The CPU 21 obtains the projection range recorded in step S563 for a projector 31. CPU 21 corrects the projected image on screen 33 by distorting the original image as described in
CPU 21 determines whether processing of all projectors 31 is complete or not (step S568). If it is considered not completed (NO in step S568), CPU 21 returns to step S566. If it is considered complete (YES in step S568), CPU 21 terminates processing.
CPU 21 determines an input level value (step S571). The interval between input level values can be set arbitrarily, e.g. every ten levels. CPU 21 creates a luminance distribution evaluation image based on the shape correction information stored in the auxiliary storage device 23 (step S572). Specifically, CPU 21 creates the image data needed to project the input level value determined in step S571 over the operational range described using
CPU 21 determines whether processing for all projectors 31 is complete or not (step S573). If it is determined that it is not completed (NO in step S573), CPU 21 returns to step S572.
If processing is determined to be complete (YES in step S573), the CPU 21 displays the luminance distribution evaluation image (step S574). Specifically, CPU 21 transmits the data of the luminance distribution evaluation image created in step S572 to projectors 31 via output interface 25. Projectors 31 project the image onto screen 33 based on the input image data. The images projected by the individual projectors 31 are superimposed over the operational range described using
CPU 21 acquires the measured values of the luminance distribution from the luminance meter 36 via input interface 26 (step S575). CPU 21 stores the measured values in the fields corresponding to the input level values determined in step S571, for each coordinate position in the luminance measurement database 51 (step S576).
The relationship between the input level value and the luminance on screen 33 is the same for any position on screen 33. Therefore, by displaying a single luminance distribution evaluation image on screen 33 and measuring the luminance, the relationship between the input level value of each projector 31 and the luminance on screen 33 can be obtained to create the luminance measurement database 51. By using the input level values of each projector 31 together with the luminance data on screen 33, a real luminance display can be performed with high accuracy.
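One simple way to drive the superimposed projectors, sketched under the assumption that each projector contributes an even share of the target luminance and has a fixed maximum output (both invented figures, not given by the text):

```python
def per_projector_luminance(target_cd_m2, n_projectors, max_single=500.0):
    """Split a target real luminance evenly across superimposed
    projectors, after checking the target is achievable at all
    (even split and 500 cd/m^2 ceiling are assumptions)."""
    if target_cd_m2 > n_projectors * max_single:
        raise ValueError("target exceeds combined projector capacity")
    return target_cd_m2 / n_projectors

# Six superimposed projectors, as in the present configuration.
print(per_projector_luminance(2400.0, 6))
```

Each projector's input level for its share would then be looked up in its luminance correction data.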
CPU 21 determines whether or not the measurements for the predetermined input level values have been performed (step S577). If it is determined that the measurements are not complete (NO in step S577), CPU 21 returns to step S571. If it is determined that they are complete (YES in step S577), CPU 21 starts the subroutine for calculating the correction values (step S578). This subroutine is the same as described in
CPU 21 acquires the luminance value of a pixel in the image acquired in step S581 (step S582). For the pixel from which the luminance is acquired, CPU 21 calculates the position in the operational range described in
CPU 21 stores the input level values obtained in step S584 in relation to the positions calculated in step S583 (step S585). CPU 21 determines whether processing for all pixels of the original image data has been completed or not (step S586). If it is determined that processing is not complete (NO in step S586), the CPU returns to step S582.
If processing is determined to be complete (YES in step S586), CPU 21 obtains the shape correction information corresponding to a projector 31 from the auxiliary storage device 23 (step S591). Through step S591, CPU 21 performs the function assigned to the third acquisition unit of this configuration.
CPU 21 transforms the image data formed by the input level values for each pixel recorded in step S585 according to the shape correction information (step S592). In this step S592, CPU 21 performs the function assigned to the shape correction unit of the current configuration.
CPU 21 transmits the image data from step S592 to projectors 31 via output interface 25 (step S593). Projectors 31 project an image on screen 33 based on the image input data.
CPU 21 determines whether or not processing for all projectors 31 is complete (step S594). If it is determined that processing is not complete (NO in step S594), CPU 21 returns to step S591. If it is determined that processing is complete (YES in step S594), CPU 21 terminates processing.
In the present configuration, because the image is projected onto the entire operational range by several projectors 31 in superimposition, information processing device 20 can provide a display in real luminance even at high luminance levels that a single projector 31 could achieve over only a portion of the range.
In the subroutine for obtaining the luminance distribution described in
By using the minimum required number of projectors 31, it is possible to realize an information processing device 20 that displays low-luminance images accurately in real luminance. This reduces power consumption and extends the service life of projectors 31.
For displaying the same image, all projectors 31 can be used for areas that include a high-luminance portion, while one or a few projectors 31 can be used for the other parts of the image. Since no superimposed projection is performed on the low-luminance parts, an information processing system 10 that displays a high-resolution image can be provided.
Configuration 4
The present configuration concerns an information processing system 10 that uses auxiliary projectors 32 that project an image over only part of the operational range. The description of the common parts of configuration 3 is omitted.
In this configuration, two auxiliary projectors 32 are arranged in a truncated fan shape on either side of the six projectors 31, which are themselves arranged in the same way as in configuration 3.
The six projectors 31, from the first projector 311 to the sixth projector 316, are capable of projecting an image onto an area that includes the operational range.
The first and second auxiliary projectors 321 and 322, located on the right side, project the image onto the right half of the truncated operational range. As shown by the dashed lines in
Similarly, the third and fourth auxiliary projectors 323 and 324, located on the left side, project images on the left half of the truncated operational range. As shown by the dotted lines in
In the present configuration, Information Processing System 10 can become a processing system capable of displaying high luminance as real luminance even near the edges of the operational range.
In the present configuration, Information Processing System 10 can become a processing system capable of displaying a high luminance image in real luminance over a very large area.
The number of auxiliary projectors 32 may be three or fewer, or five or more. Auxiliary projectors 32 can be placed at any location. The size of the projection area of auxiliary projectors 32 can differ from the size of the projection area of projectors 31.
Configuration 5
The present configuration concerns an information processing system 10 with several screens 33. The description of the common parts of configuration 3 is omitted.
Six projectors 31 are located behind each of the screens 33. The optical axis of each projector 31 is arranged so that it is oriented towards the measuring position.
Thus, a horizontally long panoramic image with real luminance is projected onto the three screens 33 in succession from a total of eighteen projectors 31.
In the present configuration, it is possible to provide an information processing system 10 capable of evaluating a camera 15 under test over a wide angle. Because the optical axis of each projector 31 is oriented towards the measurement position, information processing system 10 can display a high-luminance image in real luminance.
The display may be composed of four or more screens 33. Screens 33 can also be connected vertically.
Screen 33 can be curved. This makes it possible to build an information processing system 10 that is less affected by the angles at the joints between screens 33.
Configuration 6
The present configuration concerns an information processing system 10 in which a human user visually observes an image in real luminance. The description of the common parts of configuration 3 is omitted.
A real luminance image is displayed on screen 33. The user can, for example, evaluate the visibility of the dashboard when it is hit by the headlights of an oncoming vehicle, by the low morning sun or by the setting sun. The user can also evaluate the visibility of a head-up display (HUD) system, which projects various information onto the windshield 17.
In this configuration, information processing system 10 can perform a real luminance display that serves as the visuals of a driving simulator, allowing the user, for example, to experience phenomena such as glare caused by the headlights of an oncoming vehicle.
Configuration 7
The information processing system 10 includes a display device 30 and an information processing device 20. The display device 30 has a display unit 33 that displays an image. The information processing device 20 has a first acquisition unit 61, a second acquisition unit 62, a luminance correction unit 63, and an output transmission unit 64.
The first acquisition unit 61 acquires luminance correction information for correcting the luminance, measured from a predetermined measurement position, of the display unit 33 displaying an image according to an input signal, so that it matches the luminance information contained in the input signal. The second acquisition unit 62 acquires an image to be displayed on the display unit 33. The luminance correction unit 63 corrects the image acquired by the second acquisition unit 62 based on the luminance correction information acquired by the first acquisition unit 61. The output transmission unit 64 transmits the image corrected by the luminance correction unit 63 to the display unit 33.
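The cooperation of the units described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the claimed implementation: the luminance correction information is modeled as a mapping from input level to corrected input level, and the display unit as a list that receives transmitted images.

```python
# Hypothetical sketch of units 61 to 64; data shapes are illustrative assumptions.
def process_image(image, correction_info, display):
    # luminance correction unit 63: correct each pixel of the acquired image
    corrected = [[correction_info.get(v, v) for v in row] for row in image]
    display.append(corrected)  # output transmission unit 64: send to display unit
    return corrected

# usage: image acquired by the second acquisition unit, correction info by the first
display = []
process_image([[100, 50]], {100: 90}, display)
```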
Configuration 8
The present configuration concerns a form of realizing the information processing system 10 by operating a general-purpose computer 90 with a program 97.
The information processing system 10 of the present configuration includes a computer 90, a display device 30 and a luminance meter 36.
Computer 90 includes a CPU 21, a main storage device 22, an auxiliary storage device 23, a communication unit 24, an output interface 25, an input interface 26, a reading unit 28 and a bus. Computer 90 can be a general-purpose personal computer, a tablet or another information device.
Program 97 is recorded on a portable storage medium 96. CPU 21 reads program 97 via reading unit 28 and stores it in auxiliary storage device 23. CPU 21 can also read program 97 stored in a semiconductor memory 98, such as a flash memory, mounted in computer 90. In addition, CPU 21 can download program 97 from another server or the like (not shown) connected through communication unit 24 to a network (not shown), and store it in auxiliary storage device 23.
Program 97 is installed as a control program of computer 90 and loaded into main storage device 22 for execution. This allows computer 90 to function as the information processing device 20 described above.
Configuration 9
The present configuration is a form in which the coordinates of an image to be projected from projectors 31, the coordinates in the operational range described using
All numbers of pixels described using
The projector number field records the number given to each projector 31 in sequential order. The projector coordinate field records each coordinate of the image to be projected from each of the projectors 31 as described in
As shown in
The distribution field records the luminance distribution between the projectors 31. In
The values of the distribution field are determined so that they sum to 1 for each position in the operational range. If high-luminance and low-luminance projectors 31 are mixed, the characteristics of each projector 31 can be used effectively by increasing the distribution field values of the high-luminance projectors 31.
The values of the distribution field can be defined to be proportional to the maximum luminance that each projector 31 can provide at each position in the operational range. This definition reduces the number of luminance distribution measurements and makes it possible to realize an information processing system 10 that can display real luminance with a small number of operations. In the following description of the present configuration, the case where the luminance distribution is recorded in the distribution field is used as an example.
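The proportional definition above can be sketched numerically. This is an illustrative calculation, not the recorded database format: for one operational-range position, each projector's distribution value is its maximum luminance divided by the total, so the values sum to 1.

```python
# Sketch of defining distribution field values proportional to each
# projector 31's maximum luminance at one position (illustrative).
def distribution_fields(max_luminances):
    total = sum(max_luminances)
    return [m / total for m in max_luminances]

# e.g. two projectors with maximum luminances 300 and 100 at a position
shares = distribution_fields([300.0, 100.0])
```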
The operational range coordinate field records the coordinates of the operational range as described in
For example, if the aspect ratio of the operational range is different from the aspect ratio of the source image, the source image is not projected to the edge of the operational range. In this case, a “-” symbol is stored in the original image coordinate field corresponding to the coordinates of the operational range that are not projected.
CPU 21 selects one of the projectors 31 for processing (a step omitted from the flowchart) and sets the initial value of the projector coordinates to (0, 0) (step S602). CPU 21 searches the first conversion database with the projector coordinates as a key and obtains the operational range coordinate field of the extracted record (step S603). CPU 21 determines whether or not the projector coordinates are within the operational range (step S604); the coordinates are outside the operational range when the symbol “-” is recorded in the operational range coordinate field obtained in step S603.
If the coordinates are determined to be within the operational range (YES in step S604), CPU 21 calculates the coordinates of the original image corresponding to the coordinates of the operational range (step S605). Specifically, CPU 21 searches the second conversion database using several coordinates close to the operational range coordinates obtained in step S603 as keys, extracts the records, and interpolates the original image coordinates from the extracted records. The interpolation can be performed by any method, such as nearest-neighbor, bilinear or bicubic interpolation.
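The bilinear variant of the interpolation in step S605 can be sketched as follows. This is a generic illustration of the named technique, not the disclosed code: the four extracted records are assumed to form a unit cell, and (fx, fy) are the fractional offsets of the query point within that cell.

```python
# Generic bilinear interpolation of 2-D coordinates (sketch of one option
# for step S605); the unit-cell layout of the records is an assumption.
def bilinear(p00, p10, p01, p11, fx, fy):
    def lerp(a, b, t):
        return a + (b - a) * t
    x = lerp(lerp(p00[0], p10[0], fx), lerp(p01[0], p11[0], fx), fy)
    y = lerp(lerp(p00[1], p10[1], fx), lerp(p01[1], p11[1], fy if False else fx), fy)
    return (x, y)

# usage: the center of a 10x10 cell maps to its middle
center = bilinear((0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0), 0.5, 0.5)
```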
CPU 21 determines whether the calculated coordinates of the source image are within the limits of the source image (step S606). For example, if the symbol “-” is recorded in the original coordinate field of the record extracted by the search in the second conversion database and the interpolation cannot be performed successfully, CPU 21 determines that the coordinates are outside the boundaries of the original image.
If the coordinates are judged to be within the range of the original image (YES in step S606), CPU 21 obtains the luminance of the pixel based on the original image data obtained in step S601 (step S607). For example, the luminance of the pixel can be the luminance of the point of the original image closest to the coordinates calculated in step S605. Alternatively, pixels close to the coordinates calculated in step S605 can be extracted from the original image data and interpolated by any interpolation technique to calculate the luminance.
CPU 21 calculates the luminance allocated to the projector 31 by multiplying the luminance calculated in step S607 by the distribution recorded in the distribution field of the record extracted from the first conversion database in step S603 (step S608).
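Step S608 amounts to the following sketch. This is a minimal illustration: each projector 31 receives its share of the pixel luminance according to the distribution field, and because the distribution sums to 1, the shares add back up to the original luminance.

```python
# Sketch of step S608: split a pixel's luminance among projectors 31
# according to the distribution field values (which sum to 1).
def allocate_luminance(pixel_luminance, distribution):
    return [pixel_luminance * d for d in distribution]

# usage: a 200-cd/m2 pixel split 0.75 / 0.25 between two projectors
shares = allocate_luminance(200.0, [0.75, 0.25])
```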
If the coordinates are determined to be outside the operational range (NO in step S604) or outside the original image (NO in step S606), CPU 21 sets the pixel to black, i.e. a luminance of zero (step S609).
After completion of step S608 or step S609, CPU 21 obtains the input level values corresponding to the pixel luminance (step S610). Here, CPU 21 performs an interpolation using the luminance correction database 52 described in
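The lookup in step S610 can be sketched as an interpolated inverse search. This is an illustrative assumption about the data shape, not the disclosed schema: luminance correction database 52 is modeled as a list of (luminance, input level) pairs sorted by luminance, and the input level for a target luminance is found by linear interpolation between the two bracketing entries.

```python
import bisect

# Sketch of step S610; the (luminance, input level) table shape is an
# assumption standing in for luminance correction database 52.
def level_for_luminance(target, table):
    luminances = [lum for lum, _ in table]
    i = bisect.bisect_left(luminances, target)
    if i == 0:
        return table[0][1]   # clamp below the measured range
    if i >= len(table):
        return table[-1][1]  # clamp above the measured range
    (l0, v0), (l1, v1) = table[i - 1], table[i]
    t = (target - l0) / (l1 - l0)
    return v0 + (v1 - v0) * t

# usage: halfway between two measured points
level = level_for_luminance(50.0, [(0.0, 0), (100.0, 255)])
```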
CPU 21 records the input level values obtained in step S610 against the projector coordinates (step S611). CPU 21 then determines whether or not processing of all projector coordinates is complete (step S612). If it is determined that processing is not complete (NO in step S612), CPU 21 selects the next projector coordinates to be processed and returns to step S603.
If it is determined that the processing of all projector coordinates is complete (YES in step S612), CPU 21 determines whether or not the processing of all projectors 31 is complete (step S614). If it is judged that not all projectors 31 have been processed (NO in step S614), CPU 21 selects the next projector 31 to be processed and returns to step S602.
When it is determined that all projectors 31 have been processed (YES in step S614), CPU 21 transmits the images to all projectors 31 (step S616). The image is projected from each of the projectors 31 onto screen 33. The result is a real luminance display that projects an image onto screen 33 with a luminance true to the original image data. CPU 21 then terminates processing.
First Variant
Second Variant
In this configuration, by combining the first and second conversion databases, it is possible to obtain various types of projections, such as those described in
The technical characteristics (constituent requirements) described in each example can be combined with each other and, when combined, can form new technical characteristics.
The examples presented here are indicative in all respects and should not be considered restrictive. The scope of the invention is indicated by the claims rather than by the above description, and is intended to include all modifications within the meaning and scope equivalent to the claims.
10 Information processing system
15 Camera to be tested
16 Driving simulator
17 Windshield
18 Seat
19 Steering wheel
20 Information processing device
21 Central Processing Unit
22 Main storage device
23 Auxiliary storage device
24 Communication unit
25 Output interface
26 Input interface
27 Control screen
28 Reading unit
30 Display device
31 Projectors
311 First projector
312 Second projector
313 Third projector
314 Fourth projector
315 Fifth projector
316 Sixth projector
321 First auxiliary projector
322 Second auxiliary projector
323 Third auxiliary projector
324 Fourth auxiliary projector
33 Screen (display unit)
331 First screen
332 Second screen
333 Third screen
36 Luminance meter (two-dimensional color luminance meter)
37 Camera
51 Database of luminance measurements
52 Luminance correction database
61 First acquisition unit
62 Second acquisition unit
63 Luminance correction unit
64 Output transmission unit
96 Portable storage media
97 Computer program
98 Semiconductor memory
Number | Date | Country | Kind |
---|---|---|---|
2017-166119 | Aug 2017 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2018/032246 | 8/30/2018 | WO | 00 |