This specification relates to the technical field of electronics, and in particular, to a compensating display screen, an under-screen optical system, and an electronic device.
Currently, photographing and displaying are essential functions of many electronic devices, and both a front camera and a display are disposed on the front side of an electronic device to meet various needs such as taking selfies, displaying content, and touch interaction.
As people impose higher aesthetic requirements on mobile phones, full-screen electronic devices, such as full-screen mobile phones, have gradually become a new trend of mobile phone innovation, because a full-screen mobile phone has a very high screen-to-body ratio, is easy to operate, and has a highly aesthetic appearance. Currently, the development of full-screen electronic devices is challenged by the conflict between the front camera and the display screen. The existence of the front camera makes it difficult for the display screen to cover the entire front side of the mobile phone to achieve a high screen-to-body ratio.
A full screen can be achieved by disposing an optical module on the rear side of the display screen. The display screen is located on the front side for displaying an image. A light beam received or emitted by the optical module passes through the display screen. However, because the display screen includes a plurality of pixel units arranged periodically in the transverse direction and the longitudinal direction, and the plurality of pixel units constitute a periodic pixel diffraction structure, the display screen produces a diffraction effect on the incident light beam, eventually resulting in deterioration of the projection or imaging quality of the optical module disposed on the rear side of the display screen.
To solve the technical problem in existing technologies that a light beam passing through the display screen is diffracted, which affects the projection or imaging quality, this specification provides a compensating display screen including a transparent display screen and a compensating element. The transparent display screen includes a plurality of periodically arranged pixel units for display. The compensating element is configured to compensate for a diffraction effect of the transparent display screen, so that when a preset light beam is incident on the compensating display screen, the same preset light beam is emitted from the compensating display screen.
In some embodiments, the compensating element includes a diffractive optical element or a spatial light modulator.
In an embodiment, the diffractive optical element includes at least two diffractive optical sub-elements.
In some embodiments, the compensating element is designed by performing the following steps: obtaining diffraction performance of the transparent display screen; performing an inverse diffraction calculation on the preset light beam based on the diffraction performance to obtain a complex amplitude spatial distribution of an incident light beam that enters the transparent display screen; and calculating a diffraction pattern of the compensating element based on the complex amplitude spatial distribution and the preset light beam.
This specification further provides an optical system, including a transparent display screen that has a plurality of periodically arranged pixel units for display, an optical module, and a compensating element. The optical module is configured to receive a light beam from the transparent display screen or emit a light beam outward through the transparent display screen. The compensating element is disposed between the transparent display screen and the optical module, and is configured to compensate for a diffraction effect of the transparent display screen, so that when the light beam passes through the transparent display screen and the compensating element, the same light beam is emitted therefrom.
In some embodiments, the compensating element includes a diffractive optical element.
In an embodiment, the diffractive optical element includes at least two diffractive optical sub-elements.
In some embodiments, the diffractive optical element is integrated in the transparent display screen.
In some embodiments, the compensating element includes a spatial light modulator.
This specification further provides an electronic device including the optical system described in the above embodiments. Compared with existing technologies, the compensating display screen provided in this specification has the following advantages. A compensating element that compensates for the diffraction effect of a transparent display screen is provided to offset the diffraction caused when a light beam passes through the periodic pixel diffraction structure of the transparent display screen, so that when a preset light beam is incident on the compensating display screen, the same preset light beam is emitted from the compensating display screen, thereby avoiding a diffraction impact of the transparent display screen on the incident light beam and improving the quality and effect of imaging or projection.
To describe the technical solutions in the embodiments of this specification more clearly, the following briefly describes the accompanying drawings required for describing the embodiments. Apparently, the accompanying drawings in the following description show merely some embodiments of this specification, and a person of ordinary skill in the art may still derive other drawings from the accompanying drawings without creative efforts.
To make the objectives, technical solutions, and advantages of this specification clearer and more comprehensible, the following further describes this specification in detail with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein merely explain this specification but do not limit this specification.
It should be noted that when a component is described as being “fixed on” or “disposed on” another component, the component may be directly or indirectly on the other component. When a component is described as being “connected to” another component, the component may be directly or indirectly connected to the other component. Orientation or position relationships indicated by terms such as “top,” “bottom,” “left,” and “right” are based on the orientation or position relationships shown in the accompanying drawings, and are used only for ease of description, rather than indicating or implying that the mentioned apparatus or element needs to have a particular orientation or needs to be constructed and operated in a particular orientation. Therefore, such terms should not be construed as a limitation on this specification. The terms “first” and “second” are merely for description, and are not an indication or implication of relative importance or an implicit indication of a quantity of technical features. “A plurality of” means two or more, unless otherwise definitely limited.
To describe the technical solutions of this specification, the following describes the technical solutions in detail with reference to the specific accompanying drawings and embodiments.
A plurality of sensors may be disposed on the top, in other parts, or scattered in different parts of the electronic device. In some embodiments, the sensors may also be disposed on the rear side of the electronic device.
The sensors are configured to send or receive external information of the electronic device, such as illumination information and sound information. For example, the camera 102 may be a visible light camera (color camera or grayscale camera), and is configured to collect an image of an external object. The speaker is configured to convert an electrical signal into a sound signal and send the sound signal out. The ambient light sensor is configured to obtain intensity information of external ambient light. The proximity sensor is configured to detect whether an external object is approaching the electronic device. In addition, the emitting module 101 and the light receiving module 103 may form a depth camera module configured to collect depth image information of the external object. Understandably, the sensors are not limited to sensors of the foregoing types, and different types of sensors may be disposed in the electronic device as actually required. For example, in an embodiment, the sensors further include a floodlighting module or the like.
The processor 206 is configured to perform overall control of the entire electronic device. The processor 206 may be a single processor or may include a plurality of processing units, for example, processing units having different functions. In some embodiments, the processor 206 may also be an integrated system on chip (SOC), which includes a central processing unit, an on-chip memory, a controller, and a communication interface. In some embodiments, the processor 206 may be an application processor, such as a mobile application processor, and is mainly responsible for implementing functions of the electronic device other than communication, such as text processing and image processing.
The display screen 106 is configured to display an image under control of the processor 206 to render applications and the like to a user. In addition, the display screen 106 may also include a touch control function. In this case, the display also serves as a human-machine interaction interface for receiving user input.
The microphone 202 is configured to receive voice information, and may be configured to implement voice interaction with the user.
The radio frequency and baseband processor 203 is responsible for the communication function of the electronic device, for example, receiving and translating signals such as a voice signal or a text signal to implement information exchange between remote users.
The interface 204 is configured to enable connections between the electronic device and the outside to further implement functions such as data transmission and power transmission. The interface 204 is controlled by a communication interface in the processor 206. The interface 204 may include a USB interface and a WiFi interface.
The memory 205 is configured to store data such as application data, system data, and temporary code and data stored by the processor 206 during execution. The memory 205 may include one or more memories, and may be a random access memory (RAM), a flash memory, or any other form of memory that can be used to store data. Understandably, the memory may be a part of the electronic device, or may exist independently of the electronic device, for example, as a cloud memory, in which case the stored data is exchanged with the electronic device through the interface 204 or the like. An application, such as a face recognition application, is generally stored in a non-volatile readable storage medium. When executing the application, the processor calls the corresponding program from the storage medium for execution.
The ambient light/proximity sensor 201 may be an integrated single sensor, or may be a stand-alone ambient light sensor and a stand-alone proximity sensor. The ambient light sensor is configured to obtain illumination information of the current environment in which the electronic device is located. In an embodiment, based on the illumination information, screen brightness can be automatically adjusted to provide a display brightness more comfortable for human eyes. The proximity sensor can detect whether an object is approaching the electronic device, and some functions can be implemented based on this detection. For example, when the user's face is close enough during answering of a phone call, the touch control function of the screen is turned off to prevent accidental touch. In some embodiments, the proximity sensor can also quickly determine an approximate distance between the face and the electronic device.
The battery 207 is configured to provide electricity. The speaker 209 is configured to implement voice output.
The MEMS sensor 208 is configured to obtain current status information of the electronic device, such as location, direction, acceleration, and gravity. Therefore, the MEMS sensor 208 may include sensors such as an accelerometer, a gravimeter, and a gyroscope. In an embodiment, the MEMS sensor 208 may be configured to activate some face recognition applications. For example, when the user picks up the electronic device, the MEMS sensor 208 can detect this change and transmit it to the processor 206, and the processor 206 then calls the face recognition application in the memory for execution.
The camera 102 is configured to collect an image. In some applications, for example, when a selfie application is executed, the processor controls the camera 102 to collect an image and transmits the image to the display for displaying. In some embodiments, for example, in an unlocking program based on face recognition, when the unlocking program is activated, the camera collects an image, the processor processes the image, for example, by performing face detection and recognition, and the processor performs an unlocking task according to a recognition result. The camera 102 may be a single camera or a plurality of cameras. In some embodiments, the camera 102 may include an RGB camera and a grayscale camera that are configured to collect visible light information, and may also include an infrared camera and an ultraviolet camera that are configured to collect invisible light information. In some embodiments, the camera 102 may include a light field camera, a wide-angle camera, and a telephoto camera.
The camera 102 may be disposed at any position of the electronic device, for example, located on the top or at the bottom of a front surface (that is the same surface on which the display is located), a rear surface, or other positions. In the embodiment shown in
The depth camera 210 includes the light emitting module 101 and the light receiving module 103, which are respectively responsible for signal emission and reception of the depth camera. The depth camera may also include a depth calculation processor configured to process the received signal to obtain depth image information. The depth calculation processor may be a special-purpose processor such as an ASIC chip, or may be the processor 206 in the electronic device. For example, for a structured light depth camera, the light emitting module 101 and the light receiving module 103 may be an infrared laser speckle pattern projector and a corresponding infrared camera, respectively. The infrared laser speckle pattern projector is configured to emit a preset speckle pattern of a specific wavelength onto a surface of a spatial object. The preset speckle pattern is reflected by the surface of the object and imaged in the infrared camera. In this way, the infrared camera can obtain an infrared speckle image modulated by the object. Further, the infrared speckle image is processed by the depth calculation processor to generate a corresponding depth image. Generally, a light source in the projector may be a near-infrared light source with a wavelength such as 850 nm or 940 nm. Types of light sources may include an edge-emitting laser, a vertical-cavity surface-emitting laser, or a corresponding light source array. Speckles in the preset speckle pattern are randomly distributed to achieve sub-region irrelevance along a specific direction or a plurality of directions. That is, any sub-region selected along a specific direction meets a relatively high uniqueness requirement. Optionally, in some embodiments, the light emitting module 101 may also include light sources such as an LED or a laser that can emit visible light, ultraviolet light, or light of other wavelengths, and may be configured to emit structured light patterns such as stripes, speckles, and two-dimensional codes.
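For readers unfamiliar with how the uniqueness of speckle sub-regions translates into depth, the following sketch illustrates the final triangulation step of a typical structured light pipeline: once block matching finds, for each pixel, the disparity between the captured speckle image and a pre-recorded reference image, depth follows by triangulation. This is an illustrative example, not a procedure taken from this specification; the function name and the parameter values (baseline, focal length) are hypothetical.

```python
# Illustrative sketch (not part of this specification): depth recovery by
# triangulation in a structured light depth camera. Depth follows from
# z = b * f / d, where b is the projector-to-camera baseline, f is the
# focal length in pixels, and d is the per-pixel disparity in pixels.

import numpy as np

def depth_from_disparity(disparity: np.ndarray,
                         baseline_m: float = 0.05,
                         focal_px: float = 580.0) -> np.ndarray:
    """Convert a disparity map (pixels) into a depth map (meters)."""
    depth = np.full_like(disparity, np.inf, dtype=np.float64)
    valid = disparity > 0  # zero or negative disparity carries no depth
    depth[valid] = baseline_m * focal_px / disparity[valid]
    return depth

# Example: a disparity of 5.8 pixels maps to 0.05 * 580 / 5.8 = 5 m.
```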
In some embodiments, the depth camera may also be a time-of-flight-based (TOF-based) depth camera, a binocular depth camera, or the like. For the TOF-based depth camera, the light emitting module is configured to emit pulsed beams or modulated (for example, amplitude-modulated) continuous wave beams to the outside. After the light receiving module receives a beam reflected by an object, the circuit of the processor calculates the time interval between the beam emission and the beam reception to further calculate depth information of the object. For the binocular depth camera, one type is an active binocular depth camera including one light emitting module and two light receiving modules. The light emitting module projects a structured light image onto the object, the two light receiving modules obtain two structured light images, and the processor directly uses the two structured light images to calculate a depth image. Another type is a passive binocular depth camera, in which the light emitting module may be omitted: the two light receiving modules collect two images, from which the processor directly calculates a depth image. Subsequently, the conception of this specification is described by using a structured light depth camera as an example. Understandably, the corresponding invention content can also be applied to other types of depth cameras.
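The TOF relationship described above can be made concrete with a short sketch showing the two standard conversions from a measured interval to depth, one for pulsed beams and one for amplitude-modulated continuous waves. This is illustrative only; the function names and the modulation frequency are hypothetical, not values from this specification.

```python
# Illustrative sketch (not part of this specification): converting the
# interval between beam emission and beam reception into a depth value.

import math

C = 299_792_458.0  # speed of light in m/s

def depth_from_pulse(round_trip_s: float) -> float:
    """Direct (pulsed) TOF: the beam travels out and back, so halve it."""
    return C * round_trip_s / 2.0

def depth_from_phase(phase_rad: float, f_mod_hz: float = 20e6) -> float:
    """Indirect TOF with an amplitude-modulated continuous wave: the phase
    shift of the returned modulation envelope encodes the round trip."""
    return C * phase_rad / (4.0 * math.pi * f_mod_hz)

# Example: a 20 ns round trip gives 299792458 * 20e-9 / 2, about 3 m.
```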
Referring to
However, disposing the sensor (the following takes optical modules such as a light emitting module, a light receiving module, and a camera as examples) on the rear side of the display screen may face some problems, for example, how to hide the sensor to provide the user with a perfect full-screen experience, and how to overcome the impact on light emission and light reception (including impact on beam amplitude and phase) caused by the display screen. The optical module is configured to receive or emit light beams of a specified wavelength or wavelength range. In this specification, optical modules are categorized into visible light optical modules and invisible light optical modules. A visible light optical module is configured to emit or receive a visible light beam. An invisible light optical module is configured to emit or receive an invisible light beam. To facilitate understanding, an infrared optical module will be used as an example of the invisible light optical module for description. Understandably, invisible light such as ultraviolet light and X-rays is also applicable to this specification.
The light receiving module 33 is configured to receive a light beam 34 from the display screen 31. The light receiving module 33 includes an image sensor 331, a filter element 332, and a lens 333. The image sensor 331 may be a charge coupled device (CCD), a complementary metal-oxide-semiconductor (CMOS) sensor, or the like. The filter element 332 may be a Bayer filter, an infrared filter, or the like. The light receiving module may also include other forms of structures such as a light field camera and a photodiode. The lens 333 may be a single lens, a lens group, or a lens array.
In the embodiments shown in
In an embodiment, the filter unit is an optical switch, such as a liquid crystal shutter (a liquid crystal spatial light modulator). When powered off, the filter unit is in a non-transparent state that does not allow light to pass through. When powered on, the filter unit is in a transparent state that allows light to pass through. Therefore, with the optical switch disposed between the optical module and the display screen, the optical module can be hidden by configuring the optical switch. When the optical module is in operation, the optical switch is set to the transparent state to allow light to pass through. When the optical module is not in operation, the optical switch is set to the non-transparent state to block passage of light. In this way, a user on one side of the display screen cannot see the optical module disposed on the opposite side of the display screen. The optical switch may also be made of other types of materials, such as an electrochromic material or a thermochromic material. Alternatively, a specific optical structure may be used to control whether to allow light to pass.
In an embodiment, the filter unit is a unidirectional fluoroscopy film. The unidirectional fluoroscopy film maximally allows external light to enter the optical module through the display screen while preventing internal light from passing through the display screen. That is, a surface of the unidirectional fluoroscopy film facing the optical module has a transmittance of visible light lower than a reflectance thereof (for example, the transmittance is 5% to 30%, and the reflectance is 90% to 95%), but a surface of the unidirectional fluoroscopy film facing the display screen has a transmittance of visible light higher than a reflectance thereof (for example, the transmittance is 60% to 95%, and the reflectance is 5% to 30%). It needs to be noted that if the optical module on one side of the display screen is a visible light receiving module (such as a color camera), the imaging quality will be affected to some extent by the limited transmittance. If the optical module on one side of the display screen receives or emits an invisible light beam (such as an infrared beam), that is, when the optical module includes an infrared receiving module (such as an infrared camera) or an infrared emitting module, the corresponding unidirectional fluoroscopy film needs to be configured to have a relatively high transmittance of infrared light. That is, for the infrared receiving module, the surface of the unidirectional fluoroscopy film facing the display screen needs to have a relatively high transmittance of infrared light, generally higher than a reflectance thereof. For the infrared emitting module, the surface of the unidirectional fluoroscopy film facing the infrared emitting module needs to have a relatively high transmittance of infrared light, generally higher than a reflectance thereof; for example, the transmittance is 80% to 95% to ensure the quality of imaging or projection. In practice, a unidirectional fluoroscopy film with appropriate performance is selected for each type of optical module.
In an embodiment, the filter unit is a filter configured to block visible light and allow passage of only light beams that fall in a specific invisible light wavelength range. For example, in a case that the optical module on one side of the display screen is an infrared receiving module (such as an infrared camera), an infrared emitting module, or the like, the use of the infrared filter enables the infrared receiving module and the infrared emitting module to collect an infrared image, emit an infrared beam, and prevent passage of visible light, thereby achieving the purpose of hiding the optical module behind the display screen.
In an embodiment, the filter unit is a special filter. The special filter has a low transmittance of visible light but a high transmittance of light in a specific invisible light wavelength range, such as near-infrared light. In an example, the transmittance of visible light is 10% to 50%, and the transmittance of near-infrared light is 60% to 99%. Outside ambient light passes through the special filter and then irradiates the optical module. After the ambient light is reflected by the optical module, little of the reflected ambient light passes through the special filter again, thereby achieving the purpose of hiding the optical module behind the display screen. In addition, because this filter has a very high transmittance of near-infrared light, its impact on the infrared emitting module and the infrared receiving module is rather limited.
Understandably, the foregoing types of filter units are not a limitation on this specification. Any filter unit that can implement similar functions may be applied to this specification.
Understandably, the filter unit may be an independent optical device, or may be combined with the optical module or the display screen. For example, when the filter unit is in the form of a film, the filter unit may be provided by coating the film on the surface of the display screen or the optical device.
As shown in
In an embodiment, the first filter unit 62 and the third filter unit 64 are infrared filters, and the second filter unit 63 is an optical switch or a unidirectional fluoroscopy film.
In an embodiment, the first filter unit 62 is a first unidirectional fluoroscopy film, the third filter unit 64 is a third unidirectional fluoroscopy film, and the second filter unit 63 is an optical switch or a second unidirectional fluoroscopy film. For visible light, surfaces of the first, second, and third unidirectional fluoroscopy films facing the optical module have a transmittance lower than a reflectance thereof, but surfaces facing the display screen 61 have a transmittance higher than a reflectance thereof. For infrared light, a surface of the first unidirectional fluoroscopy film facing the display screen 61 has a transmittance higher than a reflectance thereof, and a surface of the third unidirectional fluoroscopy film facing the light emitting module has a transmittance higher than a reflectance thereof.
In an embodiment, the first filter unit 62 and the third filter unit 64 are optical switches, and the second filter unit 63 is an optical switch or a unidirectional fluoroscopy film.
Understandably, the solutions of this specification are not limited to the above embodiments. Any reasonable combinations of filter units are applicable herein.
The above embodiments provide solutions for hiding the optical module on the rear side of the display screen.
The display screen generally includes a plurality of pixel units arranged periodically in the transverse direction and the longitudinal direction. The plurality of pixel units constitute a periodic pixel diffraction structure. Therefore, the display screen produces a diffraction effect on the incident light beam, eventually resulting in deterioration of the quality of projection or imaging.
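The scale of this diffraction effect can be estimated by treating the periodic pixel units as a two-dimensional grating and applying the standard grating equation. The following formalization is illustrative only; the pixel pitch and wavelength are hypothetical example values, not taken from this specification.

```latex
% Illustrative only: a periodic pixel structure of pitch p acts as a
% grating, so diffraction orders appear at angles given by
\[ p \sin\theta_m = m\lambda, \qquad m = 0, \pm 1, \pm 2, \ldots \]
% For a hypothetical pixel pitch p = 50 um and a 940 nm infrared beam,
% the first-order diffraction angle is
\[ \theta_1 = \arcsin\!\left(\frac{0.94}{50}\right) \approx 1.08^\circ . \]
```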
In conventional technologies, a light emitting module that includes a light source, a lens, and a DOE is configured to project a patterned light beam, such as a structured light patterned light beam (for example, a speckle pattern, a stripe pattern, or a two-dimensional pattern), a floodlight beam, a single-spot light beam, or a modulated TOF light beam. When the patterned light beam emitted by the light emitting module is projected outward through the display screen 81, it is diffracted by the periodic pixel structure inside the display screen 81. That is, if the light emitting module of conventional technologies is directly disposed on one side of the display screen, the patterned light beam emitted by the light emitting module is diffracted again by the display screen. In this case, the display screen 81 acts as a second diffractive optical element (a second DOE). The second diffraction may degrade the patterned light beam, for example, by reducing contrast and increasing noise, and may even make the projected beam deviate completely from the intended pattern. This poses a huge challenge for placing the optical module behind the screen.
In this embodiment, the first DOE 84 will no longer project a preset patterned light beam (such as a preset speckle patterned light beam), but diffraction effects of the first DOE 84 and the display screen (that is, the second DOE) 81 are comprehensively considered in a design stage to achieve the following purposes: the first DOE 84 receives an incident light beam from a light source and then projects a first diffracted light beam 85, and the first diffracted light beam 85 is diffracted again by the second DOE 81 to project a patterned light beam 86.
In an embodiment, designing the first DOE 84 generally includes the following steps.
A first step is to obtain the diffraction performance of the display screen, that is, the second DOE. The diffraction performance may be described by, for example, a complex amplitude transmittance function. A possible detection method is to let a plane wave beam enter the display screen from a single angle or a plurality of angles, collect the emergent light intensity distribution by using a receiving screen, and derive the diffraction performance of the second DOE from the light intensity distribution. Because only intensity is measured directly, recovering the complex amplitude generally also involves a phase retrieval step.
A second step is to perform an inverse diffraction calculation on the patterned light beam 86 based on the diffraction performance of the display screen, that is, the second DOE, so as to obtain the complex amplitude spatial distribution of the first diffracted light beam 85.
A last step is to calculate a diffraction pattern of the first DOE based on the complex amplitude spatial distribution of the first diffracted light beam 85 and the distribution of the light beam before it enters the first DOE 84 through the lens 83.
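The following sketch illustrates one possible realization of these three steps, assuming thin-element models for both the first DOE and the display screen, free-space angular spectrum propagation between planes, and a phase-only first DOE. The function names, the regularization constant, and the geometry parameters are all assumptions made for illustration, not the design procedure claimed by this specification.

```python
# A minimal sketch, assuming: t_screen is the measured complex amplitude
# transmittance of the second DOE (step 1), the target pattern 86 is
# sampled at an observation plane a distance z_obs from the screen, and
# the gap between the first DOE and the screen is z_gap.

import numpy as np

def angular_spectrum(field, wavelength, pitch, z):
    """Propagate a sampled complex field by distance z (z < 0 = inverse);
    evanescent spectral components are discarded."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=pitch)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    h = np.where(arg > 0, np.exp(1j * kz * z), 0.0)
    return np.fft.ifft2(np.fft.fft2(field) * h)

def design_first_doe(target, t_screen, incident,
                     wavelength, pitch, z_obs, z_gap):
    # Step 2: inverse diffraction. Back-propagate the desired pattern 86
    # to the screen, divide out the screen transmittance (regularized to
    # avoid near-zero samples), then back-propagate across the gap to
    # obtain the first diffracted beam 85 at the DOE plane.
    at_screen = angular_spectrum(target, wavelength, pitch, -z_obs)
    beam_85 = at_screen * np.conj(t_screen) / (np.abs(t_screen)**2 + 1e-6)
    at_doe = angular_spectrum(beam_85, wavelength, pitch, -z_gap)
    # Step 3: a phase-only first DOE that maps the incident beam (after
    # the lens) onto beam 85 is the phase difference of the two fields.
    return np.angle(at_doe) - np.angle(incident)
```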
Understandably, this design process is only an example, and any other reasonable design schemes are applicable to this specification.
The first DOE 84 is not limited to a single DOE, but may include a plurality of sub-DOEs, which are not necessarily formed on different optical devices. For example, two sub-DOEs may be formed on two surfaces of the same transparent optical device, respectively.
Alternatively, the first DOE 84 and the second DOE 81 are not necessarily discrete devices. For example, the first DOE 84 may be formed on the rear side of the display screen, that is, the second DOE 81, to improve the overall integration degree. The display screen 81 usually includes a plurality of layers having different functions. Therefore, to further improve the integration degree, the first DOE may also be integrated into a layer in the display screen 81, or one or more sub-DOE layers of the first DOE 84 may be integrated into a layer of the display screen 81.
In this embodiment, the compensating element 95 and the second DOE 91 form a new compensating display screen 98. In the compensating display screen 98, the compensating element 95 is designed to compensate for the diffraction effect of the display screen, thereby offsetting the impact of the second DOE 91 on the patterned light beam projected by the light emitting module. That is, after a plane wave beam enters the compensating display screen, an isophase plane of the emergent light that exits from the compensating display screen is still perpendicular to the wave vector direction of the incident light. In this way, the patterned light beam that exits from the first DOE 94 enters the compensating display screen and is then projected into space as a patterned light beam 97. Understandably, it is difficult for the compensating element 95 to fully eliminate the diffraction effect of the second DOE 91. Therefore, it is difficult to ensure that the patterned light beam 97 is exactly the same as the patterned light beam 96; a slight difference between the two patterned light beams is allowed, for example, a slight difference in the spatial distribution of intensity.
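This plane-wave-preserving behavior can be stated compactly. The following is an illustrative formalization under a thin-element model, not a formula from this specification.

```latex
% Illustrative only: model the second DOE 91 and the compensating element
% 95 as adjacent thin elements with complex amplitude transmittances
% t_s(x, y) and t_c(x, y). The compensating display screen 98 then passes
% a beam through unchanged, up to a constant amplitude A and phase, when
\[ t_c(x, y)\, t_s(x, y) \approx A\, e^{i\varphi_0}. \]
% Under this condition, a plane wave entering the stack exits with its
% isophase planes still perpendicular to the incident wave vector.
```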
The compensating element 95 may be configured as any optical element that can change an amplitude and/or phase of the light beam, such as a DOE or a spatial light modulator (SLM). When the compensating element 95 is a spatial light modulator, the compensating element 95 may be a liquid crystal spatial light modulator and may include a plurality of pixels. Each pixel may modulate the amplitude and/or phase of the incident light by changing properties of the pixel (such as a refractive index or a grayscale).
Compared with the embodiment shown in
In an embodiment, designing the compensating element 95 generally includes the following steps.
A first step is to obtain the diffraction performance of the display screen, that is, the second DOE 91. The diffraction performance may be described by, for example, a complex amplitude transmittance function. A possible detection method is to let a plane wave beam enter the display screen from a single angle or a plurality of angles, collect the emergent light intensity distribution by using a receiving screen, and derive the diffraction performance of the second DOE 91 from the light intensity distribution.
A second step is to perform an inverse diffraction calculation on the emergent light beam 97 based on the diffraction performance of the display screen, that is, the second DOE 91, to obtain the complex amplitude spatial distribution of the incident light beam that enters the second DOE 91.
A last step is to calculate a diffraction pattern of the compensating element 95 based on the complex amplitude spatial distribution of the incident light beam that enters the second DOE 91 and the light beam distribution of the incident light beam 96 that enters the compensating element 95.
In the above steps, the incident light beam 96 that enters the compensating element 95 has almost the same spatial distribution as that of the emergent light beam 97, and may be a plane wave light beam or a patterned light beam.
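Given that the incident light beam 96 and the emergent light beam 97 have almost the same spatial distribution, the design essentially reduces to inverting the screen's transmittance. The following is a minimal sketch under that assumption; the function name and the regularization constant are hypothetical, and this is not the specification's claimed procedure.

```python
# A minimal sketch, assuming: the compensating element 95 sits directly
# against the second DOE 91, both act as thin elements, and t_screen is
# the measured complex amplitude transmittance of the screen (step 1).

import numpy as np

def design_compensator(t_screen: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Regularized reciprocal of the screen's complex amplitude
    transmittance, so that t_comp * t_screen is approximately a real
    constant at every sample point."""
    return np.conj(t_screen) / (np.abs(t_screen) ** 2 + eps)

# A phase-only compensating element (e.g., a DOE or a phase SLM) can
# realize only the phase part: phase_comp = -np.angle(t_screen).
```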
Understandably, this design process is only an example, and any other reasonable design schemes are applicable to this specification.
In a case that the compensating element 95 is a DOE (denoted as a third DOE 95), the first DOE 94 and the third DOE 95 are each not limited to a single DOE, but may include a plurality of sub-DOEs that are superimposed on each other. The plurality of sub-DOEs are not necessarily formed on different optical devices. For example, two sub-DOEs may be formed on two surfaces of the same transparent optical device, respectively. Alternatively, the first DOE 94 and the third DOE 95 are not necessarily discrete devices: two DOEs may be formed on two corresponding surfaces of the same transparent optical device to serve as the first DOE 94 and the third DOE 95, respectively. Similarly, the third DOE 95 and the second DOE 91 are not necessarily discrete devices. For example, the third DOE 95 may be formed on one side of the display screen (the second DOE 91) to improve the overall integration degree. Because the display screen usually includes a plurality of layers having different functions, the third DOE 95 may also be integrated into one of the layers of the display screen to further increase the integration degree.
Relative locations of the first DOE 94, the third DOE 95, and the second DOE 91 are not limited to the embodiment shown in
The first DOE 104 is not necessarily a single DOE, but may include a plurality of sub-DOEs. The plurality of sub-DOEs are not necessarily formed on different optical devices. For example, two sub-DOEs may be formed on two surfaces of the same transparent optical device, respectively. Alternatively, the first DOE 104 and the second DOE 101 are not necessarily discrete devices. For example, the first DOE 104 may be formed on one side of the display screen (the second DOE 101) to improve the overall integration degree. The display screen 101 usually includes a plurality of layers having different functions. Therefore, to further improve the integration degree, the first DOE 104 may also be integrated into a layer in the display screen 101, or one or more sub-DOE layers of the first DOE 104 may be integrated into a layer of the display screen 101.
In the process described in the embodiments shown in
Understandably, when the light receiving module and the light emitting module are combined with the display screen to form an under-screen depth camera, a structural form of the light receiving module and the display screen as well as a structural form of the light emitting module and the display screen may be arbitrarily configured according to actual requirements, and are not limited to the embodiment shown in
Referring to
In each of the foregoing embodiments, disposing an optical module behind the display screen requires that light can pass through the display screen, that is, the display screen is a transparent display screen. However, a transparent display screen is more costly than a conventional non-transparent display screen. To solve this problem, this specification provides a spliced display screen solution based on the foregoing embodiments.
In an embodiment, the second display screen unit 126b is a non-transparent display screen, such as a common LCD display screen or a common LED display screen. The two display screen units are spliced to form a whole display screen 126.
In an embodiment, the first display screen unit 126a and the second display screen unit 126b are the same type of display screens. For example, both are OLED display screens, but an aperture ratio of the first display screen unit 126a is higher than that of the second display screen unit 126b, which makes it easier for light to pass through. Understandably, in this case, the entire display screen 126 is not necessarily formed by splicing, but the same display screen may have two regions, and aperture ratios of the two regions are controlled during design and manufacturing. Alternatively, besides the aperture ratio, other types of settings may be controlled, for example, a resolution difference between the two regions. A resolution of the first display screen unit 126a is lower than that of the second display screen unit 126b. Alternatively, the two regions are made of materials of different transparency. Overall transparency of the material of the first display screen unit 126a is higher than that of the material of the second display screen unit 126b, thereby ultimately making the transparency of the first display screen unit 126a higher than that of the second display screen unit 126b.
In an embodiment, the display screen 126 includes more than two display screen units. For example, one first display screen unit 126a is configured for each sensor. Shapes of the first display screen unit 126a and the second display screen unit 126b are not limited to the form shown in
In an embodiment, the first display screen unit 126a and the second display screen unit 126b are controlled independently. When the sensor behind the first display screen unit 126a operates, the first display screen unit 126a is in an off state, and the second display screen unit 126b can still display content normally.
It may be understood that to make the sensor behind the first display screen unit 126a meet the requirement of sending or receiving a signal, all the solutions of the foregoing embodiments may be applied to this spliced screen solution.
The foregoing descriptions are merely exemplary embodiments of this specification, but do not limit this specification. Any modification, equivalent replacement, and improvement made without departing from the spirit and principle of this specification shall fall within the protection scope of this specification.
The application is a continuation application of International Patent Application No. PCT/CN2019/092163, filed with the China National Intellectual Property Administration (CNIPA) on Jun. 21, 2019, and entitled “COMPENSATING DISPLAY SCREEN, UNDER-SCREEN OPTICAL SYSTEM AND ELECTRONIC DEVICE”, which is based on and claims priority to and benefit of Chinese Patent Application No. 201811082123.9, filed with the CNIPA on Sep. 17, 2018. The entire contents of all of the above-identified applications are incorporated herein by reference.