This application claims the benefit of priority to European Patent Application EP19202643.3 titled “User Controlled Imaging Device” and filed on Oct. 11, 2019, which is incorporated herein by reference in its entirety.
The application discloses a user-controlled imaging device. The imaging device comprises an imaging unit. Further, the application discloses a mobile end device. Furthermore, the application discloses a method of steering an imaging unit.
Many mobile devices, as well as many other devices in the industrial, medical and automotive domains, are equipped with multiple image sensors. For instance, mobile devices like smartphones or tablets are equipped with multiple front and rear camera modules. In addition to a conventional front camera sensor with a red, green, blue (RGB) Bayer colour filter array, an infrared (IR) light source and IR-sensitive sensors are also added to the front side of the mobile devices. This set of hardware units feeds software applications that support biometric verification such as iris and face detection or recognition.
The rear side of such a mobile device may be equipped with one or more camera devices and a lighting module in order to enrich photos taken with the mobile device. In a low-light environment, a lighting module such as a camera flash is enabled to illuminate the scene. Some lighting modules can be designed in the form of statically or dynamically steerable light beams, as disclosed in WO 2017 080 875 A1, which relates to an adaptive light source, and in U.S. Pat. No. 9,992,396 B1, which concerns focusing a lighting module. However, it is still inconvenient for the user to control the direction of the light beam of the light source, especially when hands-free control of the device is required.
Therefore, improved image quality for mobile devices and more convenient handling of the imaging process are desired.
An advantage of embodiments is achieved by, for example, a user-controlled imaging device, by a mobile end device, or by a method of steering an imaging unit.
The user-controlled imaging device can include an imaging unit. The imaging unit can be used for taking an image of a part of the environment with the user-controlled imaging device. Further, the user-controlled imaging device can include a user facing imaging unit, which is used for depicting the face of the user. This user facing imaging unit may, for example, be arranged on the opposite side of the user-controlled imaging device from the imaging unit, such that simultaneous surveillance of the face of the user and imaging of a portion of the environment are enabled. However, it is also possible that the user facing imaging unit is spatially separated from the imaging unit such that there is no interference between the monitoring of the user and the imaging of the environment of the user-controlled imaging device. Moreover, the user-controlled imaging device exhibits a pupil detection and tracking unit for computing the user's viewing direction. The pupil detection and tracking are performed based on the image data of the user facing imaging unit. The user-controlled imaging device also includes a control unit for controlling the imaging unit corresponding to the determined direction of the user's view. Controlling the imaging unit can mean that a directional function of the imaging unit is controlled based on the direction of the user's view. As described later in detail, the directional function may comprise controlling the direction of a focus or of a light beam. The algorithm responsible for tracking the movement of the head and pupil can take several consecutive frames from the user facing imaging unit into account in order to compute, for example, the direction of the light beam of the light source of the imaging unit. A statistical technique (e.g., a running average), which evaluates the content of the consecutive frames in time and space, is incorporated into the algorithm to obtain a more robust and stable response of the steered directional function, for example of the light beam steering.
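The temporal smoothing just described can be illustrated with a short sketch. The following Python code shows a minimal running average over per-frame gaze direction estimates; the function `estimate_gaze_direction`, the actuator call `steer_directional_function` and the window length are assumptions made for the example, not part of the disclosed device.

```python
from collections import deque

import numpy as np


class GazeSmoother:
    """Running average over the gaze directions of the last N frames.

    Averaging consecutive estimates suppresses per-frame jitter so that
    the steered directional function (e.g. a light beam) responds in a
    more stable way.
    """

    def __init__(self, window: int = 5):
        self._history = deque(maxlen=window)  # most recent gaze vectors

    def update(self, gaze_vector) -> np.ndarray:
        """Add a per-frame gaze estimate and return the smoothed direction."""
        self._history.append(np.asarray(gaze_vector, dtype=float))
        mean = np.mean(np.stack(self._history), axis=0)
        norm = np.linalg.norm(mean)
        return mean / norm if norm > 0 else mean


# Hypothetical usage: estimate_gaze_direction(frame) would be provided by
# the pupil detection and tracking unit.
# smoother = GazeSmoother(window=5)
# for frame in user_facing_camera_frames():
#     smoothed = smoother.update(estimate_gaze_direction(frame))
#     steer_directional_function(smoothed)
```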
Hence, with such a user-controlled imaging device, a hands-free control of the imaging process of a scene to be imaged can be realized, which enables a more target-oriented and automated capture of a selected portion to be imaged. A further advantage is the subjective highlighting of a scene, wherein the importance of objects is determined based on the user's attention (e.g., the viewing direction of the eyes). Embodiments can also be advantageous for applications such as surveillance and remote observation, for example using drones.
An advantage of embodiments is that the imaging unit can be controlled without the use of the user's hands. Further, the scene can be highlighted based on the user's attention to important objects. Hence, the quality of important image areas is improved.
The mobile end device can include the user-controlled imaging device. The mobile end device may comprise, for example, a smartphone, a mobile phone, a tablet computer, a notebook or a subnotebook. The mobile end device shares the advantages of the user-controlled imaging device.
According to the method of controlling an imaging unit, an image of the imaging unit is provided and the face of the user is monitored by an additional user facing imaging unit. Based on images from the user facing imaging unit, a pupil detection is performed. Further, the direction of the user's view is determined based on the pupil detection, and the imaging unit is controlled corresponding to the determined direction of the user's view, wherein the user facing imaging unit is spatially separated from the imaging unit such that there is no interference between the monitoring of the user and the imaging of the environment. Controlling the imaging unit may comprise orientating a light beam or positioning a focus of a camera for emphasizing a portion of a scene to be imaged.
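Purely for illustration, the method steps can be arranged in a control loop such as the following Python sketch. All interfaces used here (`user_facing_camera`, `detect_pupil`, `gaze_from_pupil`, `imaging_unit`) are hypothetical placeholders for whatever camera and actuator APIs a concrete device provides.

```python
def control_imaging_unit(user_facing_camera, imaging_unit,
                         detect_pupil, gaze_from_pupil,
                         mode: str = "light_beam") -> None:
    """Steer the imaging unit from the user's viewing direction.

    One loop iteration performs the method steps: monitor the user,
    detect the pupil, determine the viewing direction and control the
    imaging unit accordingly.
    """
    for frame in user_facing_camera.frames():        # monitor the user's face
        pupil = detect_pupil(frame)                   # pupil detection
        if pupil is None:
            continue                                  # no eye visible, keep last setting
        direction = gaze_from_pupil(pupil)            # determine viewing direction
        if mode == "light_beam":
            imaging_unit.steer_light_beam(direction)  # orient the light beam
        else:
            imaging_unit.set_focus_region(direction)  # position the focus
```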
The method of controlling an imaging unit shares the advantages of the user-controlled imaging device.
The claims and the following description disclose particularly advantageous embodiments and features of embodiments. Features of the embodiments may be combined as appropriate. Features described in the context of one claim category can apply equally to another claim category.
Further, the imaging unit of the user-controlled imaging device may include a light source with a steerable light beam. The light source has the function of illuminating a portion of the environment that is to be depicted or that is intended to be highlighted in the imaged portion of the environment. In this variant, controlling the imaging unit comprises orientating the light beam corresponding to the determined direction of the user's view.
Hence, a hands-free control of a light beam for lighting a scene to be imaged is realized, which enables a more target-oriented and automated lighting of a selected portion to be imaged. A further advantage is the subjective highlighting of a scene, wherein the importance of objects is determined based on the user's attention (e.g., the viewing direction of the user's eyes). For example, snapshots in low lighting conditions in mobile, medical and industrial applications, which require hands-free control of light beam steering, can benefit from embodiments. Controlling the illumination of a scene is also advantageous for applications such as surveillance and remote observation, for example using drones.
An advantage of the mentioned variant is that the direction of the light beam can be controlled without the use of the user's hands. Further, the scene can be highlighted based on the user's attention to important objects. Hence, the quality of important image areas is improved.
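Purely for illustration of this variant, a determined viewing direction could be translated into a beam orientation by converting the gaze vector into pan and tilt angles for the steerable light source. The Python sketch below assumes a simple geometric convention (x to the right, y up, z towards the scene); the convention and the angle conversion are assumptions for the example, not the disclosed implementation.

```python
import math


def gaze_to_pan_tilt(gaze_vector):
    """Convert a 3-D gaze vector (x, y, z) into pan/tilt angles in degrees.

    Assumed convention: x points right, y points up and z points away from
    the device into the scene. Pan rotates about the vertical axis, tilt
    about the horizontal axis.
    """
    x, y, z = gaze_vector
    pan = math.degrees(math.atan2(x, z))
    tilt = math.degrees(math.atan2(y, math.hypot(x, z)))
    return pan, tilt


# Example: a gaze slightly to the right of and above the optical axis.
print(gaze_to_pan_tilt((0.2, 0.1, 1.0)))  # roughly (11.3, 5.6) degrees
```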
According to an aspect of the user-controlled imaging device, the user facing imaging unit comprises a colour camera module. A colour camera module may be appropriate for depicting details of the eye and for creating image data that are useful for pupil detection and localisation, which can in turn be used for eye tracking.
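As a minimal sketch of such pupil detection and localisation, the following Python code, assuming OpenCV, locates the pupil as the darkest roughly circular region of an eye image. The threshold value and the largest-contour heuristic are assumptions made only for illustration; they are not prescribed by the embodiments.

```python
import cv2
import numpy as np


def locate_pupil(eye_image_bgr: np.ndarray):
    """Return the (x, y) centre and radius of the darkest round blob.

    The pupil is usually the darkest region of the eye, so an inverse
    threshold followed by a largest-contour search often suffices as a
    first approximation.
    """
    gray = cv2.cvtColor(eye_image_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)
    # Keep only very dark pixels (threshold value is an assumption).
    _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    # OpenCV 4.x returns (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    (x, y), radius = cv2.minEnclosingCircle(largest)
    return (int(x), int(y)), int(radius)
```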
The user facing imaging unit may comprise an IR sensitive camera module. An IR sensitive camera is also fully functional in a dark environment. Further, the user facing imaging unit may additionally comprise an IR light source. IR light may be used to obtain a better contrast for eye tracking and has the advantage of being invisible, so that it does not disturb the user's view of the screen of the mobile device.
In an embodiment, the user facing imaging unit comprises a front camera and the imaging unit comprises a rear camera. In this context, the front side is the side facing the user and the rear side faces the environment.
Advantageously, the image sections of the different cameras do not overlap and do not interfere with each other.
Further, in an alternative variant, controlling the imaging unit comprises positioning the focus of the rear camera corresponding to the determined direction of the user's view. In this variant, a portion of the scene to be imaged can be automatically focused based on the direction of the user's view. Hence, important portions of the images can be emphasized not only by light but also by image sharpness.
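As an illustrative sketch of this variant, the determined viewing direction could be mapped to an autofocus region of interest in the rear camera frame as follows. The field-of-view values and the camera interface `set_focus_roi` are assumptions made for the example, not part of the disclosed device.

```python
def gaze_to_focus_roi(pan_deg: float, tilt_deg: float,
                      hfov_deg: float = 70.0, vfov_deg: float = 55.0,
                      roi_size: float = 0.2):
    """Map pan/tilt gaze angles to a normalised focus ROI (x, y, w, h).

    The gaze angles are assumed to be expressed in the rear camera's
    coordinate frame; hfov/vfov are the camera's fields of view.
    Coordinates are normalised to [0, 1] with the origin at the top left.
    """
    cx = 0.5 + pan_deg / hfov_deg   # horizontal image position
    cy = 0.5 - tilt_deg / vfov_deg  # vertical image position (y grows downwards)
    # Clamp so the ROI stays inside the image.
    cx = min(max(cx, roi_size / 2), 1 - roi_size / 2)
    cy = min(max(cy, roi_size / 2), 1 - roi_size / 2)
    return (cx - roi_size / 2, cy - roi_size / 2, roi_size, roi_size)


# Hypothetical usage with a camera object exposing a focus-ROI setter:
# rear_camera.set_focus_roi(*gaze_to_focus_roi(pan, tilt))
```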
Furthermore, in a particular variation of the user-controlled imaging device the light source comprises a flash. The flash enables highlighting of a scene in a dark environment.
Other advantages and features of the present disclosure will become apparent from the following detailed descriptions considered in conjunction with the accompanying drawings. It is to be understood, however, that the drawings are designed solely for the purposes of illustration and not as a definition of the limits of the embodiments.
In the drawings, like numbers refer to like components throughout. Components in the diagrams are not necessarily drawn to scale.
Although aspects have been disclosed in the form of preferred embodiments and variations thereon, it will be understood that numerous additional modifications and variations could be made thereto without departing from the scope of the aspects. For example, the “front side” and the “rear side” of the imaging device may be arranged apart from each other. In other words, the first module and the second module are not mandatorily placed together.
For the sake of clarity, it is to be understood that the use of “a” or “an” throughout this application does not exclude a plurality, and “comprising” does not exclude other steps or elements.