The present disclosure relates to an apparatus and method for image analysis and, more specifically, to an apparatus and method for image analysis in which an absolute distance for a partial region of an image is determined by an auto focusing function, and the absolute distance is applied to an output of a depth map so as to determine absolute distances for the remaining regions.
A depth map represents information about a distance between an observation point and an object. Since the depth map includes distance information between the observation point and each surface of the object, it is possible to determine a three-dimensional image of the object using the depth map.
On the other hand, since the depth map includes only relative distance information between the observation point and the object, an absolute distance of the object included in the depth map may not be determined from the depth map alone.
Therefore, there is a need for a technique that can determine the absolute distance of an object included in the depth map.
Provided are an apparatus and method for image analysis and, more specifically, an apparatus and method for image analysis in which an absolute distance for a partial region of an image is determined by an auto focusing function, and the absolute distance is applied to an output of a depth map so as to determine absolute distances for the remaining regions.
Aspects of the present disclosure are not limited to the above-mentioned aspects. Additional aspects will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the presented embodiments.
According to an aspect of the disclosure, an apparatus for image analysis may include: at least one memory storing instructions, and at least one processor configured to execute the instructions, in which, by executing the instructions, the at least one processor is configured to control: an image sensor to generate a captured image; a depth map generator to generate a depth map corresponding to the captured image; and a map analyzer to analyze an absolute distance for a first region of the depth map based on an absolute distance of a selected sub-region of the first region of the depth map, where the at least one processor is configured to control the map analyzer to determine the absolute distance for the selected sub-region based on an automatic focus function of the image sensor.
The image sensor may include a monocular camera.
The depth map may include relative distance information of objects included in the captured image.
The at least one processor may be further configured to control: an absolute distance finder to divide the depth map into a plurality of divided regions, and determine an absolute distance according to the automatic focus function of the image sensor for at least two reference regions selected from the divided regions; and a distance converter to convert a relative distance for the first region of the depth map into an absolute distance according to a ratio of the relative distance and the absolute distance of each of the at least two reference regions.
The at least one processor may be further configured to control the absolute distance finder to select a sub-region having a relative distance within a preset range, from within the first region of the depth map, as a reference region.
The at least one processor may be further configured to control the absolute distance finder to exclude a region having a minimum relative distance or a maximum relative distance among the divided regions from the selection of the reference region.
The at least one processor may be further configured to control, based on a variation of relative distances that constitute the divided regions exceeding a preset threshold, the absolute distance finder to exclude a corresponding divided region from the at least two reference regions.
The at least one processor may be further configured to control the absolute distance finder to determine a number of reference regions based on a distribution of relative distances constituting the depth map.
The at least one processor may be further configured to control the absolute distance finder to increase the number of reference regions based on an increase in the variation of the relative distances constituting the depth map.
The at least one processor may be further configured to control the absolute distance finder to: generate a depth map array based on a size of a plurality of relative distances corresponding to the divided regions, respectively, divide the depth map array into a plurality of depth map groups according to the size of the relative distance, select a reference map group as a selection reference for the reference region among the plurality of depth map groups, extract a relative distance from the reference map group, and select a divided region corresponding to the relative distance extracted from the reference map group as the reference region.
The at least one processor may be further configured to control the absolute distance finder to extract a minimum relative distance, a maximum relative distance, or a middle relative distance from the relative distances of the reference map group.
The at least one processor may be further configured to control the absolute distance finder to extract a smallest relative distance and a largest relative distance, excluding a minimum relative distance and a maximum relative distance, among the relative distances of the reference map group.
The at least one processor may be further configured to control the absolute distance finder to determine a number of relative distances extracted from the reference map group based on variation of the relative distances constituting the reference map group.
The at least one processor may be further configured to control the absolute distance finder to: perform an automatic focus operation on the capturing region of the captured image corresponding to the reference region, and determine the absolute distance of the reference region based on a position of a focus lens of the image sensor corresponding to a maximum sharpness of an auto focus image from the performed automatic focus operation.
The at least one processor may be further configured to control the absolute distance finder to: perform an automatic focus operation on an enlarged capturing region corresponding to the reference region and a peripheral region of the reference region of the captured image, and determine the absolute distance of the reference region based on a position of a focus lens of the image sensor corresponding to a maximum sharpness of an auto focus image from the performed automatic focus operation.
The at least one processor may be further configured to control the absolute distance finder to: perform a first automatic focus operation on an enlarged capturing region corresponding to the reference region and a peripheral region of the reference region of the captured image, perform a second automatic focus operation on a reference focus region corresponding to the reference region of an auto focus image from the performed first automatic focus operation, and determine an absolute distance of the reference region based on a position of a focus lens of the image sensor corresponding to a maximum sharpness of a first auto focus image from the performed first automatic focus operation and the sharpness of a second auto focus image from the performed second automatic focus operation.
The at least one processor may be further configured to control the depth map generator to convert a resolution of the captured image to correspond to a preset resolution of a depth map, and generate the depth map based on the converted captured image.
According to an aspect of the disclosure, a method for image analysis may include: generating a captured image; generating a depth map for the captured image; determining an absolute distance based on an automatic focus function of an image sensor that generates the captured image for a selected sub-region of a first region of the depth map; and determining an absolute distance for the first region of the depth map based on the determined absolute distance.
The determining the absolute distance may include: dividing the depth map into a plurality of divided regions, and determining an absolute distance based on the automatic focus function of the image sensor for at least two reference regions selected from the divided regions; and converting a relative distance for the first region of the depth map into an absolute distance according to a ratio of the relative distance and the absolute distance of each of the at least two reference regions.
The determining the absolute distance may include selecting a region having a relative distance included within a preset range, from within the first region of the depth map, as a reference region.
According to an aspect of the disclosure, a non-transitory recording medium storing a computer program, which, when executed, may cause at least one processor to execute a method including: generating a captured image; generating a depth map for the captured image; determining an absolute distance based on an automatic focus function of an image sensor that generates the captured image for a selected sub-region of a first region of the depth map; and determining an absolute distance for the first region of the depth map based on the determined absolute distance.
The details of other embodiments are included in the detailed description and drawings.
The above and other aspects, features, and advantages of certain embodiments of the present disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. Advantages and features of the present disclosure, and a method for achieving the advantages and features, will become apparent with reference to the embodiments described below in detail in conjunction with the accompanying drawings. However, the present disclosure is not limited to the embodiments disclosed below, and may be implemented in a variety of different forms. The presented embodiments are provided so that those skilled in the art can fully appreciate the scope of the present disclosure. However, the embodiments described herein are example embodiments, and thus, the disclosure is not limited thereto. The same reference numbers indicate the same components throughout the specification.
Unless defined otherwise, all terms (including technical and scientific terms) used in the present specification have the same meaning as meanings commonly understood by those skilled in the art to which the present disclosure pertains. In addition, terms defined in generally used dictionaries are not ideally or excessively interpreted unless specifically defined clearly.
As used herein, each of the expressions “A or B,” “at least one of A and B,” “at least one of A or B,” “A, B, or C,” “at least one of A, B, and C,” and “at least one of A, B, or C,” may include one or all possible combinations of the items listed together with a corresponding expression among the expressions.
It will be understood that the terms “first”, “second”, or the like, may be used to distinguish one component from another, and should not be construed to limit the corresponding component in other aspects (e.g., importance or order).
It will be understood that the terms “includes,” “comprises,” “has,” “having,” “including,” “comprising,” and the like when used in this specification, specify the presence of stated features, figures, steps, operations, components, members, or combinations thereof, but do not preclude the presence or addition of one or more other features, figures, steps, operations, components, members, or combinations thereof.
As used herein, the term “configured to” may be interchangeably used with the terms “suitable for,” “having the capacity to,” “designed to,” “adapted to,” “made to,” or “capable of” depending on circumstances. The term “configured to” does not necessarily mean “specifically designed in hardware to.” Rather, the term “configured to” may mean that a device can perform an operation together with another device or parts. For example, a ‘device configured (or set) to perform A, B, and C’ may be a dedicated device to perform the corresponding operation or may mean a general-purpose device capable of various operations including the corresponding operation. Additionally, as used herein, a device ‘configured to perform A, B, and C’ encompasses both a device which directly performs A, B, and C and a device which indirectly performs A, B, and C through a different device.
Referring to
The image sensor 110 may generate a captured image. For example, the image sensor 110 may include a camera equipped with one or more image sensors such as a complementary metal-oxide-semiconductor (CMOS) or a charge-coupled device (CCD).
In the present disclosure, the image sensor 110 may include a monocular camera. The captured image generated by the image sensor 110 may represent an image captured at one capturing point.
In addition, the image sensor 110 may be equipped with a lens capable of performing an automatic focus. As described below, the image sensor 110 may perform an automatic focus operation under the control of the map analyzer 150.
In addition, the image sensor 110 may be capable of adjusting its capturing direction. For example, the image sensor 110 may include pan tilt equipment. The pan tilt equipment may adjust the capturing direction of the image sensor 110 according to an input control signal.
The storage 120 may temporarily or permanently store the captured image generated by the image sensor 110. In addition, the storage 120 may store a depth map generated by the depth map generator 140 and may also store an analysis result generated by the map analyzer 150. The storage 120 may include a plurality of memory modules. Alternatively or additionally, the storage 120 may be implemented as an external memory device, a hard disk, an optical disk, and/or cloud storage, but is not limited thereto, and may be connected to the controller 130 by wire or wirelessly.
The depth map generator 140 may generate a depth map for the captured image generated by the image sensor 110. For example, the depth map generator 140 may generate the depth map of the captured image generated by the image sensor 110, which is the monocular camera, using a deep learning-based model.
The depth map may include relative distance information of objects included in the captured image. The depth map may include the relative distance information between the objects included in the image, as well as relative distance information across surfaces that constitute each of the objects. For example, the depth map may include relative distance information for all pixels included in the captured image. Based on the relative distance information of the objects, not only the arrangement relationship between the objects but also a three-dimensional shape of each object may be estimated.
According to some embodiments of the present disclosure, the depth map generator 140 may generate a depth map from an artificial intelligence model and then convert the depth map by reflecting characteristics of a focus lens equipped in the image sensor 110. An absolute distance between the object captured by the image sensor 110 and the image sensor 110 may be determined by referring to a position of the focus lens. In this case, the depth map generator 140 may convert the depth map so that the amount of change between the position of the focus lens and the absolute distance is reflected.
In addition, the depth map generator 140 may generate a depth map from at least one artificial intelligence model selected from a plurality of different artificial intelligence models. The plurality of artificial intelligence models may generate optimal depth maps in different situations. For example, a first artificial intelligence model may generate an optimal depth map for an image during a daytime scene, and a second artificial intelligence model may generate an optimal depth map for an image during a nighttime scene. Alternatively, a third artificial intelligence model may generate an optimal depth map for an image including people or animals, and a fourth artificial intelligence model may generate an optimal depth map for an image including natural environments. The depth map generator 140 may select at least one of the plurality of artificial intelligence models by referring to capturing conditions of the image sensor 110 and generate a depth map from the selected artificial intelligence model.
According to an embodiment, the AI model may include a plurality of artificial neural network layers. An artificial neural network may include, for example, a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), a deep Q-network, or a combination of two or more thereof, but is not limited thereto. The AI model may alternatively or additionally include a software structure in addition to the hardware structure.
The map analyzer 150 may analyze the depth map and determine an absolute distance for an entire region of the depth map. The map analyzer 150 may determine the absolute distance for the entire region of the depth map based on an absolute distance determined for a selected region among the entire region of the depth map. Here, the map analyzer 150 may determine the absolute distance for the selected region from the automatic focus function of the image sensor 110. The detailed configuration and function of the map analyzer 150 will be described later with reference to
The output 160 may output an analysis result of the map analyzer 150. A user may check the absolute distance for the entire region of the captured image by referring to the analysis result of the map analyzer 150.
The controller 130 may perform overall control of the image sensor 110, the storage 120, the depth map generator 140, the map analyzer 150, and the output 160. The controller 130 may be physically implemented by analog and/or digital circuits including one or more of a logic gate, an integrated circuit, a microprocessor, a microcontroller, a memory circuit, a passive electronic component, an active electronic component, an optical component, and the like, and may implement or execute software and/or firmware to perform the functions or operations described herein. The controller 130 may include at least one processor such as a central processing unit (CPU), a graphics processing unit (GPU), an accelerated processing unit (APU), a many integrated core (MIC), a field-programmable gate array (FPGA), a digital signal processor (DSP), a neural processing unit (NPU), a hardware accelerator, a machine learning accelerator, or the like. Through at least one processor, the controller 130 may be configured to execute instructions according to at least one program code and code of an operating system stored in at least one memory.
Referring to
The absolute distance finder 151 may divide the depth map into a plurality of regions and determine an absolute distance of at least two reference regions selected from the divided regions. The determination of the absolute distance of the reference region may be performed using the auto focus function of the image sensor 110.
Before calculating the absolute distance of the reference region, the absolute distance finder 151 may determine a relative distance for each divided region. Hereinafter, the relative distance for the divided region is referred to as a region relative distance.
The region relative distance may be determined as an average value or median value of the relative distances included in the divided region. Here, the average value may represent a value obtained by dividing the sum of all relative distances included in the divided region by the number of relative distances, and the median value may represent a value corresponding to the middle of a distribution of relative distances included in the divided region. Alternatively, the median value may represent a relative distance of a center pixel of the divided region. However, the region relative distance is not limited to the average or median value, and various other values representing the divided region may be determined as the region relative distance. Hereinafter, the case in which the region relative distance is determined as the average value or median value of the divided region will be described as an example.
As described above, the depth map may include the relative distance information for each pixel. For example, a value corresponding to a relative distance (hereinafter, referred to as a relative distance value) may be assigned to all pixels included in the depth map. The relative distance value may be set to increase or decrease as the distance between a specific point of the object corresponding to the pixel and the capturing point decreases. Hereinafter, the case in which the relative distance value increases as the distance between the specific point of the object and the capturing point decreases will be described as an example.
Each of the divided regions selected by the absolute distance finder 151 may include a plurality of pixels, and a relative distance value may be set for each pixel. The absolute distance finder 151 may determine an average value or median value of the relative distance values of all pixels included in each divided region as the region relative distance of the corresponding divided region.
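The division of the depth map and the computation of a region relative distance per divided region can be sketched as follows. This is an illustrative example only: it assumes the depth map is a 2D array of relative distance values divided into an evenly sized grid, and the function name is hypothetical.

```python
import numpy as np

def region_relative_distances(depth_map, rows, cols, use_median=False):
    """Divide a depth map into a rows x cols grid and compute a
    representative relative distance (mean or median) per divided region."""
    h, w = depth_map.shape
    rh, rw = h // rows, w // cols
    out = np.empty((rows, cols))
    for i in range(rows):
        for j in range(cols):
            region = depth_map[i * rh:(i + 1) * rh, j * rw:(j + 1) * rw]
            out[i, j] = np.median(region) if use_median else region.mean()
    return out

# Example: a 4x4 depth map of 8-bit relative distance values, 2x2 grid
dm = np.array([[10, 20, 200, 210],
               [30, 40, 220, 230],
               [50, 60, 100, 100],
               [70, 80, 100, 100]], dtype=float)
rrd = region_relative_distances(dm, 2, 2)
```

Each entry of `rrd` is the region relative distance of one divided region, e.g. the top-left region averages to 25.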
The absolute distance finder 151 may select at least two reference regions. The reference region may be used to convert the relative distance for the entire region of the depth map into the absolute distance. To convert the distance for the entire region of the depth map, the absolute distance finder 151 may calculate the absolute distance of the reference region from the auto focus function of the image sensor 110.
The image sensor 110 may be equipped with a movable focus lens configured to automatically adjust the focus. The sharpness of a specific region of the captured image may be increased by moving the focus lens, and a correlation may be established between the position of the focus lens that maximizes the sharpness and the distance to the object corresponding to that region. With such a correlation, the absolute distance between the capturing point and the corresponding object may be determined from the position of the focus lens.
The absolute distance finder 151 may determine the absolute distance to the object included in the reference region by performing the automatic focus operation of the image sensor 110 for the reference region and referring to the position of the focus lens at this time.
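The correlation between focus lens position and absolute distance can be sketched as interpolation over a calibration table. The table values below are entirely hypothetical; in practice they depend on the characteristics of the specific focus lens equipped in the image sensor.

```python
import numpy as np

# Hypothetical calibration table: focus-lens positions (in lens steps) at
# which objects at known absolute distances (in metres) appear sharpest.
LENS_POSITIONS = np.array([0.0, 100.0, 250.0, 400.0, 500.0])
ABS_DISTANCES = np.array([0.5, 1.0, 2.5, 6.0, 12.0])

def absolute_distance_from_focus(lens_position):
    """Estimate the absolute distance of a focused object by interpolating
    the calibration table at the lens position that maximised sharpness
    during the automatic focus operation."""
    return float(np.interp(lens_position, LENS_POSITIONS, ABS_DISTANCES))
```

With this sketch, a lens position of 175 steps (halfway between the 100-step and 250-step calibration points) maps to roughly 1.75 m.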
The distance converter 152 may convert the relative distance for the entire region of the depth map into the absolute distance according to the ratio of the relative distance and the absolute distance of each of at least two reference regions. When the relative and absolute distances for two or more reference regions are confirmed, the ratio between the relative and absolute distances for the entire depth map may be determined. The distance converter 152 may determine the absolute distance of each region by applying the relative distance of the entire region constituting the depth map to the ratio between the relative distance and the absolute distance.
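The two-reference-region conversion described above can be sketched as a linear mapping between relative and absolute distance. The reference values are illustrative, and follow the earlier convention that a larger relative distance value corresponds to a closer object.

```python
def make_converter(rel1, abs1, rel2, abs2):
    """Build a linear relative-to-absolute distance converter from two
    reference regions whose relative and absolute distances are known."""
    slope = (abs2 - abs1) / (rel2 - rel1)
    return lambda rel: abs1 + slope * (rel - rel1)

# Hypothetical reference regions: relative value 50 corresponds to a far
# object at 6.0 m, relative value 150 to a near object at 2.0 m.
convert = make_converter(50.0, 6.0, 150.0, 2.0)
```

Any other region's relative distance can then be passed through `convert`; for example, a relative distance of 100 yields 4.0 m under these assumed reference points.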
Referring to
The captured image 200 may include at least one object 210. The object 210 may include a person, an animal, a vehicle, or a building. The captured image 200, which is generated by the image sensor 110, may be a still image or one of a plurality of scenes included in a moving image.
Referring to
The depth map 300 may include relative distance information of an object 310. The relative distance information may represent a value in which a distance between each point included in the depth map 300 and the capturing point is relatively determined.
In the present disclosure, a resolution of the depth map 300 that may be processed by the map analyzer 150 may be different from a resolution of the captured image 200. In this case, the depth map generator 140 may convert the resolution of the captured image 200 to correspond to a preset resolution of the depth map 300, and may generate the depth map 300 based on converted captured image.
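The resolution conversion described above can be sketched with simple nearest-neighbour sampling. This is a minimal illustration; an actual implementation might use any resampling method, and the function name is hypothetical.

```python
import numpy as np

def resize_nearest(image, out_h, out_w):
    """Convert an image to a preset resolution using nearest-neighbour
    sampling so that its pixel grid matches the depth map's resolution."""
    h, w = image.shape[:2]
    row_idx = np.arange(out_h) * h // out_h
    col_idx = np.arange(out_w) * w // out_w
    return image[row_idx[:, None], col_idx]

# Example: downsample a 4x4 image to the depth map's 2x2 resolution
img = np.arange(16).reshape(4, 4)
small = resize_nearest(img, 2, 2)
```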
Referring to
The horizontal and vertical sizes of each of the plurality of divided regions 400 may be the same. The size of the divided regions 400 may be set to a size at which auto focus by the image sensor 110 can be performed reliably.
The absolute distance finder 151 may determine a region relative distance for each of the divided regions 400. The region relative distance may be determined as the average value or median value of the relative distance values of all pixels included in each divided region 400.
Referring to
The absolute distance finder 151 may select a region having a relative distance included in a preset range, from among the entire region of the depth map 300, as the reference region 410. For example, the absolute distance finder 151 may exclude a divided region 400 having the largest or smallest relative distance among the entire divided regions 400 from selection of the reference region 410.
The absolute distance finder 151 may select the reference region 410 among the divided regions 400 having a region relative distance included between an upper relative distance value and a lower relative distance value. Here, the upper relative distance value may represent a relative distance value lower than the highest region relative distance in the distribution of the overall region relative distances by a preset value, and the lower relative distance value may represent a relative distance value higher than the lowest region relative distance in that distribution by a preset value.
The absolute distance finder 151 may exclude a region having a minimum or maximum relative distance value among the entire divided regions 400 from the selection of the reference region 410. For example, when the relative distance value is set with a resolution of 8 bits, the minimum relative distance value may be 0 and the maximum relative distance value may be 255. The minimum or maximum relative distance value may be understood as noise or a saturation value rather than a normal relative distance value. Therefore, when the region relative distance of a divided region 400 is determined by reflecting the minimum relative distance value or the maximum relative distance value, the reliability of the region relative distance may be reduced. As regions having the minimum relative distance value or the maximum relative distance value are excluded from the selection of the reference region 410, a reference region 410 having a highly reliable region relative distance may be selected.
The absolute distance finder 151 may select a reference region 410 by referring to the distribution of relative distance values constituting the divided regions 400. For example, when the variance or standard deviation of the relative distance values constituting a divided region 400 exceeds a preset threshold, the absolute distance finder 151 may exclude the corresponding divided region 400 from selection of the reference region 410. When the variance or standard deviation of the relative distance values constituting a divided region 400 exceeds the threshold, it may be understood that a plurality of objects at various distances are included in the corresponding divided region 400. In this case, it may not be desirable to select the corresponding divided region 400 as the reference region 410 because the absolute distance of the objects included in the corresponding divided region 400 may not be clear. Accordingly, the absolute distance finder 151 may select the reference region 410 from the divided regions 400 composed of relative distance values whose variance or standard deviation is equal to or less than the threshold.
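The two exclusion rules above (saturated relative distance values, and spread of relative distances beyond a threshold) can be sketched together in a single candidate check. The threshold value and function name are hypothetical.

```python
import numpy as np

def is_reference_candidate(region, std_threshold=30.0, lo=0, hi=255):
    """Return True if a divided region is suitable as a reference region:
    it contains no saturated (minimum/maximum) relative distance values,
    and its relative distances do not vary beyond the threshold."""
    if (region == lo).any() or (region == hi).any():
        return False  # noise or saturation: region relative distance unreliable
    # High spread suggests multiple objects at different distances.
    return region.std() <= std_threshold

# Example regions (8-bit relative distance values)
flat = np.full((4, 4), 120.0)           # uniform depth: good candidate
saturated = flat.copy()
saturated[0, 0] = 255                   # contains a saturated value
mixed = np.tile([10.0, 240.0], (4, 2))  # two very different depths
```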
As described above, the image sensor 110 may include pan tilt equipment. The pan tilt equipment may adjust the capturing direction of the image sensor 110 within a preset driving range. The absolute distance finder 151 may select a region included in the driving range of the pan tilt equipment among the entire divided regions 400 as the reference region 410. When the absolute distance finder 151 selects a specific region among the entire divided regions 400 as the reference region 410, the pan tilt equipment may be driven so that the image sensor 110 faces a direction corresponding to that region, and the image sensor 110 may zoom in on the region during the automatic focus operation.
In some examples of the present disclosure, the reference region 410 may be used to determine the absolute distance of another divided region 400. Therefore, when the relative distance of the selected reference region 410 is not appropriate, the reliability of the absolute distance of another divided region 400 may be reduced. As the region having the relative distance included in the preset range is selected as the reference region 410, the reliability of the absolute distance of another divided region 400 determined based on such a selection may be improved.
Referring to
The distance coordinate plane may be formed by a horizontal axis representing an absolute distance and a vertical axis representing a relative distance. The reference points P1 and P2 represent coordinate points on the distance coordinate plane corresponding to the reference region 410.
The absolute distance finder 151 may first check a maximum region relative distance HIGH and a minimum region relative distance LOW among the entire region relative distances. In addition, the absolute distance finder 151 may exclude a divided region 400 having the maximum region relative distance HIGH or having the minimum region relative distance LOW among the entire divided regions 400 from the selection of the reference region 410. Alternatively, the absolute distance finder 151 may select the reference region 410 from the divided regions 400 having a region relative distance included between an upper relative distance value TH_HIGH and a lower relative distance value TH_LOW. The upper relative distance value TH_HIGH may be determined as a value smaller than the maximum region relative distance HIGH by a preset interval, and the lower relative distance value TH_LOW may be determined as a value greater than the minimum region relative distance LOW by a preset interval.
In addition, the absolute distance finder 151 may select a reference region 410 so that a relative distance between different reference regions 410 is formed to be a preset interval or more.
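The selection between the upper and lower relative distance values, combined with the minimum-spacing constraint above, can be sketched as follows. The margin and gap values, and the function name, are hypothetical.

```python
import numpy as np

def select_reference_regions(region_dists, margin=10.0, min_gap=20.0):
    """Select reference regions whose region relative distances lie between
    TH_LOW and TH_HIGH (a preset margin inside the minimum and maximum
    region relative distances), keeping a minimum gap between the relative
    distances of selected regions.
    region_dists: dict mapping region index -> region relative distance."""
    dists = np.array(list(region_dists.values()))
    th_high = dists.max() - margin  # below the maximum HIGH by a margin
    th_low = dists.min() + margin   # above the minimum LOW by a margin
    selected = []
    # Walk candidates in order of relative distance, enforcing the spacing.
    for idx, d in sorted(region_dists.items(), key=lambda kv: kv[1]):
        if th_low <= d <= th_high and all(
                abs(d - region_dists[s]) >= min_gap for s in selected):
            selected.append(idx)
    return selected

# Hypothetical region relative distances for five divided regions
selected = select_reference_regions({0: 5.0, 1: 40.0, 2: 55.0,
                                     3: 90.0, 4: 250.0})
```

Here regions 0 and 4 fall outside the TH_LOW/TH_HIGH band, and region 2 is dropped for being too close in relative distance to region 1.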
When the reference region 410 is selected, the absolute distance finder 151 may cause the image sensor 110 to perform auto focus on the reference region 410. When the auto focus of the image sensor 110 is completed, the absolute distance finder 151 may calculate an absolute distance of the object corresponding to the reference region 410 by referring to the position of the focus lens.
When the relative distance and absolute distance to the reference region 410 are confirmed, the reference points P1 and P2 corresponding to the reference region 410 may be displayed on the distance coordinate plane.
Referring to
The reference graph G1 may be generated by connecting a plurality of reference points P1 and P2.
The reference graph G1 may reflect the relationship between the relative distance and the absolute distance of each reference region 410, and the distance converter 152 may determine the absolute distance of another divided region 400 based on the reference graph G1. Referring to
Through such a process, the distance converter 152 may determine the absolute distances of all divided regions 400 included in the depth map 300.
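With two reference points, the reference graph G1 is a straight line, and the conversion above amounts to reading the absolute distance off that line for a given relative distance. The sketch below illustrates this under the axis convention stated earlier (horizontal axis: absolute distance, vertical axis: relative distance); all numeric values are hypothetical.

```python
# Sketch: converting a relative distance to an absolute distance using a
# straight reference graph G1 through two reference points P1 and P2.
# Each point is (absolute_distance, relative_distance); values are hypothetical.

def to_absolute(rel, p1, p2):
    (abs1, rel1), (abs2, rel2) = p1, p2
    # Solve for the absolute distance whose relative distance on the
    # line through P1 and P2 equals `rel` (extrapolates beyond P1/P2).
    slope = (rel2 - rel1) / (abs2 - abs1)
    return abs1 + (rel - rel1) / slope

p1 = (100.0, 80.0)   # absolute 100 cm <-> relative 80
p2 = (300.0, 20.0)   # absolute 300 cm <-> relative 20
print(to_absolute(50.0, p1, p2))  # divided region with relative distance 50
```

A relative distance of 50, halfway between the two reference relative distances, maps to an absolute distance halfway between the two reference absolute distances (about 200 cm here).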
Referring to
When there are three reference regions 410, three reference points P1, P2, and P3 may be displayed on the distance coordinate plane, and a reference graph G2 generated based on the three reference points P1, P2, and P3 may be formed as a curved line rather than a straight line.
As the relative distance and absolute distance of the first reference region are A1 and B1, respectively, the first reference point P1 may be displayed on the distance coordinate plane based on the relative distance and absolute distance of the first reference region. As the relative distance and absolute distance of the second reference region are A2 and B2, respectively, a second reference point P2 may be displayed on the distance coordinate plane based on the relative distance and absolute distance of the second reference region. As the relative distance and absolute distance of the third reference region are A3 and B3, respectively, a third reference point P3 may be displayed on the distance coordinate plane based on the relative distance and absolute distance of the third reference region.
As the number of reference regions 410 increases, the relationship between the relative distance and the absolute distance of more reference regions 410 may be reflected in generating the reference graph G2. When the distance conversion is performed using the reference graph G2 generated based on more reference regions 410, the reliability of the distance conversion may be improved.
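One way to realize a curved graph G2 through three reference points is the unique quadratic passing through them; this choice is an assumption for illustration, since the text only states that G2 is curved rather than straight. The point values below are hypothetical.

```python
# Sketch: a curved reference graph G2 through three reference points,
# realized here as the quadratic through the points (an assumption).
# Each point is (relative_distance, absolute_distance); values hypothetical.

def curved_graph(points):
    (a1, b1), (a2, b2), (a3, b3) = points

    def g2(rel):
        # Lagrange form of the unique quadratic through the three points.
        return (b1 * (rel - a2) * (rel - a3) / ((a1 - a2) * (a1 - a3))
                + b2 * (rel - a1) * (rel - a3) / ((a2 - a1) * (a2 - a3))
                + b3 * (rel - a1) * (rel - a2) / ((a3 - a1) * (a3 - a2)))
    return g2

# (A1, B1), (A2, B2), (A3, B3) from three hypothetical reference regions.
g2 = curved_graph([(80.0, 100.0), (50.0, 180.0), (20.0, 400.0)])
print(g2(50.0))  # reproduces B2 exactly at relative distance A2
```

Unlike the two-point straight line, this curve can reflect that equal relative-distance differences correspond to larger absolute-distance differences at long range.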
The absolute distance finder 151 may determine the number of reference regions 410 based on the distribution of relative distance values constituting the depth map 300. For example, the absolute distance finder 151 may increase the number of reference regions 410 as the variance or standard deviation of the relative distance values constituting the depth map 300 increases. When the variance or standard deviation of the relative distance values constituting the depth map 300 is large, a plurality of objects at various distances may be included in the captured image 200. In this case, an absolute distance that reflects the differences between the various relative distances may be calculated through a reference graph generated using a larger number of reference regions 410.
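One possible heuristic realizing the rule above is sketched below: the number of reference regions grows with the standard deviation of the depth-map values. The base count, step size, and cap are hypothetical parameters, not values from the disclosure.

```python
# Sketch of one possible heuristic (an assumption, not specified in the
# text): use more reference regions when the spread of the depth map's
# relative distance values is larger.
import statistics

def num_reference_regions(relative_distances, base=2, step=15.0, cap=6):
    spread = statistics.pstdev(relative_distances)  # population std. dev.
    return min(cap, base + int(spread // step))

flat_scene = [50, 52, 51, 49, 50, 51]       # objects at similar distances
deep_scene = [5, 40, 80, 120, 160, 200]     # objects spread across distances
print(num_reference_regions(flat_scene), num_reference_regions(deep_scene))
```

A nearly flat scene gets the minimum number of reference regions, while a scene with widely spread relative distances gets the maximum.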
Referring to
The absolute distance finder 151 may generate a depth map array 600 in which a plurality of relative distances for the plurality of divided regions 400 are listed in order of size.
The absolute distance finder 151 may divide the depth map array 600 into a plurality of depth map groups 610, 620, 630, and 640 according to the size of the relative distance. For example, the absolute distance finder 151 may divide the depth map array 600 into four depth map groups 610 to 640 as illustrated in
The smaller the relative distance, the farther the corresponding object may be from the observation point, and the larger the relative distance, the closer the corresponding object may be. A relative distance corresponding to an object at a long distance (hereinafter, referred to as a relative long distance) and a relative distance corresponding to an object at a short distance (hereinafter, referred to as a relative short distance) may provide different references for determining the absolute distance. For example, even if a first relative distance difference between different relative long distances and a second relative distance difference between different relative short distances are the same, the difference in absolute distance corresponding to the first relative distance difference may be larger than the difference in absolute distance corresponding to the second relative distance difference. Accordingly, among the plurality of depth map groups 610 to 640 included in the depth map array 600, a depth map group corresponding to the short distance may include a larger number of relative distances than a depth map group corresponding to the long distance. For example, the size of the depth map groups 610 to 640 may be determined so that the difference in absolute distance corresponding to the difference between the maximum relative distance and the minimum relative distance of each of the depth map groups 610 to 640 is identical or similar for each of the depth map groups 610 to 640.
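The grouping rule above can be sketched by spacing group boundaries equally in absolute distance rather than in relative distance. The inverse mapping `absolute = K / relative` is purely an assumption used to illustrate why short-distance (large relative value) groups end up holding more entries; the scale K and distance values are hypothetical.

```python
# Sketch: dividing a sorted depth map array into groups so that each group
# covers a similar span of *absolute* distance. The mapping abs = K / rel
# is an assumed stand-in for the true relative-to-absolute relationship.

K = 10000.0  # hypothetical scale: absolute = K / relative

def split_into_groups(rel_sorted, n_groups=4):
    abs_near = K / rel_sorted[-1]          # nearest object (largest rel)
    abs_far = K / rel_sorted[0]            # farthest object (smallest rel)
    span = (abs_far - abs_near) / n_groups
    # Group boundaries equally spaced in absolute distance.
    bounds = [abs_near + span * i for i in range(1, n_groups)]
    groups = [[] for _ in range(n_groups)]
    for rel in rel_sorted:
        a = K / rel
        idx = sum(a > b for b in bounds)   # which absolute band it falls in
        groups[idx].append(rel)
    return groups

rel_sorted = sorted([20, 25, 40, 50, 80, 100, 125, 160, 200, 250])
for g in split_into_groups(rel_sorted):
    print(g)  # groups[0] = nearest band, groups[-1] = farthest band
```

With this assumed mapping, the nearest band absorbs most of the relative distances while the farthest band holds only a few, matching the observation that equal absolute-distance spans cover many more relative values at short range.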
The absolute distance finder 151 may select a depth map group that serves as a selection reference for the reference region 410 among the plurality of depth map groups 610 to 640. For example, the absolute distance finder 151 may select depth map groups that are not adjacent to each other among the plurality of depth map groups 610 to 640 as a selection reference for the reference region 410. For example, the absolute distance finder 151 may select the first depth map group 610 and the third depth map group 630, or select the first depth map group 610 and the fourth depth map group 640, or select the second depth map group 620 and the fourth depth map group 640. Alternatively, the absolute distance finder 151 may select all of the plurality of depth map groups 610 to 640 as the selection reference for the reference region 410. The depth map group that serves as the selection reference for the reference region 410 is referred to as a reference map group.
When the selection of the reference map group is completed, the absolute distance finder 151 may extract a relative distance from the reference map group and select a divided region 400 corresponding to the extracted relative distance as the reference region 410.
The absolute distance finder 151 may extract a minimum relative distance, a maximum relative distance, or a middle relative distance among the relative distances of the reference map group, and select a divided region 400 corresponding to the extracted relative distance as the reference region 410. Alternatively, the absolute distance finder 151 may extract the smallest and largest relative distances excluding the minimum relative distance and the maximum relative distance (i.e., the second-smallest and second-largest relative distances) among the relative distances of the reference map groups 610 to 640, and select a divided region 400 corresponding to the extracted relative distance as the reference region 410.
The absolute distance finder 151 may determine the number of relative distances extracted from the reference map groups based on the variance or standard deviation of the relative distances constituting the reference map groups 610 to 640. For example, the absolute distance finder 151 may extract a larger number of relative distances from the reference map groups 610 to 640 as the variance or standard deviation of the relative distances constituting the reference map groups increases. The number of reference regions 410 may be determined depending on the number of extracted relative distances.
Referring to
Under the control of the absolute distance finder 151, the image sensor 110 may perform auto focus on the reference capturing region 201a corresponding to the reference region 410. The image sensor 110 may perform auto focus on a scene corresponding to the reference capturing region 201a.
The image sensor 110 may perform the auto focus by moving the focus lens. The absolute distance finder 151 may calculate an absolute distance of the reference region 410 based on the position of the focus lens of the image sensor 110 that maximizes the sharpness of the auto focus image 202 formed by performing the auto focus.
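One common way such a lens position maps to an object distance is through the thin-lens equation 1/f = 1/u + 1/v, where u is the object distance and v is the lens-to-sensor distance at best focus. Using this relation here is an assumption for illustration (the disclosure does not specify the conversion), and the focal length and lens position below are hypothetical.

```python
# Sketch: recovering the absolute object distance from the focus-lens
# position via the thin-lens equation 1/f = 1/u + 1/v (an assumption;
# the disclosure does not specify the lens model). Values are hypothetical.

def object_distance(f_mm, v_mm):
    # u = f*v / (v - f); valid only when v > f (focused real image).
    return f_mm * v_mm / (v_mm - f_mm)

f = 50.0   # hypothetical focal length (mm)
v = 52.0   # lens-to-sensor distance that maximizes sharpness (mm)
print(object_distance(f, v))  # → 1300.0 mm, i.e., 1.3 m
```

In practice the sensor would report the focus-motor step rather than v directly, so a calibration from motor step to v would also be needed.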
Referring to
The enlarged capturing region 201 may include a reference capturing region 201a and a peripheral capturing region 201b. The reference capturing region 201a may represent a region corresponding to the reference region 410, and the peripheral capturing region 201b may represent a region corresponding to the peripheral region of the reference region 410.
Under the control of the absolute distance finder 151, the image sensor 110 may perform auto focus on the enlarged capturing region 201. The image sensor 110 may perform auto focus on a scene corresponding to the enlarged capturing region 201.
The image sensor 110 may sequentially perform the auto focus on the two enlarged capturing regions 201. Accordingly, auto focus images 203 for the two enlarged capturing regions 201 may be sequentially generated.
The image sensor 110 may perform the auto focus by moving the focus lens. The absolute distance finder 151 may calculate an absolute distance of the reference region 410 based on the position of the focus lens of the image sensor 110 that maximizes the sharpness of the auto focus image 203 formed by performing the auto focus. Since the auto focus is performed on a scene corresponding to the enlarged capturing region 201, which is larger than the reference capturing region 201a, the absolute distance may be determined at a faster speed. In addition, since information on the scene corresponding to not only the reference capturing region 201a but also the peripheral capturing region 201b adjacent thereto is reflected, the reliability of the calculated absolute distance may be improved.
Referring to
The absolute distance finder 151 may perform a first auto focus on an enlarged capturing region 201 corresponding to the reference region 410 and the peripheral region of the reference region 410 of the captured image 200.
The enlarged capturing region 201 may include a reference capturing region 201a and a peripheral capturing region 201b. The reference capturing region 201a may represent a region corresponding to the reference region 410, and the peripheral capturing region 201b may represent a region corresponding to the peripheral region of the reference region 410.
Under the control of the absolute distance finder 151, the image sensor 110 may perform the first auto focus on the enlarged capturing region 201. The image sensor 110 may perform the first auto focus on a scene corresponding to the enlarged capturing region 201.
After the first auto focus is completed, the absolute distance finder 151 may perform a second auto focus on the reference focus region 203a corresponding to the reference region 410 among the auto focus images 203 by the first auto focus.
Under the control of the absolute distance finder 151, the image sensor 110 may perform the second auto focus on the reference focus region 203a. The image sensor 110 may perform the second auto focus on a scene corresponding to the reference focus region 203a.
The image sensor 110 may perform the second auto focus after performing the first auto focus. This process may be repeated for each enlarged capturing region 201.
The image sensor 110 may perform the auto focus by moving the focus lens. The absolute distance finder 151 may calculate an absolute distance of the reference region 410 based on the positions of the focus lens of the image sensor 110 that maximize the sharpness of the first auto focus image 203 formed by performing the first auto focus and the sharpness of the second auto focus image 204 formed by performing the second auto focus. Since the absolute distance is determined by comprehensively considering the position of the focus lens for the enlarged capturing region 201 and the position of the focus lens for the reference focus region 203a, the reliability and accuracy of the absolute distance determination may be improved.
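One way the two auto focus passes could be considered together is sketched below: summing the sharpness curves measured at each candidate lens position and taking the position of the combined maximum. The summation is an assumption (the text only states both sharpness measurements are considered), and all numeric values are hypothetical.

```python
# Sketch: choosing the focus-lens position from the two auto focus passes.
# Summing the two sharpness curves is an assumption; the disclosure only
# states that both passes are considered comprehensively.

def best_lens_position(positions, sharp_first, sharp_second):
    combined = [a + b for a, b in zip(sharp_first, sharp_second)]
    return positions[combined.index(max(combined))]

positions = [50.5, 51.0, 51.5, 52.0, 52.5]   # hypothetical lens steps (mm)
sharp_first = [0.2, 0.5, 0.9, 0.7, 0.3]       # pass 1: enlarged capturing region
sharp_second = [0.1, 0.4, 0.8, 0.9, 0.5]      # pass 2: reference focus region
print(best_lens_position(positions, sharp_first, sharp_second))
```

The resulting lens position would then be converted to an absolute distance as in the earlier single-pass case.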
Referring to
A depth map generator 140 may generate a depth map 300 for the captured image 200 (S520), and an absolute distance finder 151 of a map analyzer 150 may calculate an absolute distance for a selected region (reference region) among the entire region of the depth map 300 based on an auto focus function of the image sensor 110 (S530).
In addition, a distance converter 152 of the map analyzer 150 may determine an absolute distance for the entire region of the depth map 300 based on the absolute distance determined by the absolute distance finder 151. The map analyzer 150 may generate a reference graph by connecting at least two reference points on a distance coordinate plane, and may determine the absolute distance of each of the other divided regions 400 by applying its relative distance to the reference graph (S540).
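The overall flow just described can be summarized end to end under simplifying assumptions: the depth map is a flat list of region relative distances, the auto focus measurement is simulated by a hypothetical ground-truth relation, and the reference graph is the straight line through two reference points. None of the numeric values or helper names are from the disclosure.

```python
# End-to-end sketch of the analysis flow (S520-S540) under simplifying
# assumptions; values and the auto focus stand-in are hypothetical.

def autofocus_absolute(rel):
    # Stand-in for the image sensor's auto focus measurement (S530).
    return 10000.0 / rel

def analyze(depth_map):
    rels = sorted(set(depth_map))
    ref_rels = [rels[1], rels[-2]]    # skip the extreme relative distances
    refs = [(r, autofocus_absolute(r)) for r in ref_rels]
    (r1, a1), (r2, a2) = refs
    # Reference graph G1: straight line through the two reference points.
    slope = (a2 - a1) / (r2 - r1)
    # S540: convert every region's relative distance via the graph.
    return [a1 + slope * (r - r1) for r in depth_map]

depth_map = [40, 50, 60, 80, 100]     # relative distances of divided regions
print(analyze(depth_map))             # absolute distances for all regions
```

Only two regions trigger an actual auto focus measurement; every other region's absolute distance follows from the reference graph, which is the point of the method.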
At least one of the components, elements, modules or units represented by a block as illustrated in
According to the apparatus and method for image analysis of the present disclosure as described above, since the absolute distance for a region of the image is determined by the auto focusing function and that absolute distance is applied to the output of the depth map to determine the absolute distances for the remaining regions, the absolute distances of all objects included in the depth map may be determined.
The effects of the present disclosure are not limited to the effects mentioned above, and other effects not mentioned will be clearly understood by those skilled in the art from the description of the claims.
Although the embodiments of the present disclosure have been described with reference to the accompanying drawings, those of ordinary skill in the art to which the present disclosure pertains will understand that the present disclosure may be embodied in other forms without changing the technical spirit or essential features thereof. Therefore, it should be understood that the embodiments described above are illustrative in all aspects and not restrictive.
Number | Date | Country | Kind |
---|---|---|---|
10-2022-0055392 | May 2022 | KR | national |
10-2022-0118616 | Sep 2022 | KR | national |
This application is a continuation of International Application No. PCT/KR2023/006027, filed on May 3, 2023, in the Korean Intellectual Property Receiving Office, which is based on and claims priority to Korean Patent Applications No. 10-2022-0055392, filed on May 4, 2022, and No. 10-2022-0118616 filed on Sep. 20, 2022, in the Korean Intellectual Property Office, the disclosures of which are incorporated by reference herein in their entireties.
Number | Date | Country | |
---|---|---|---|
Parent | PCT/KR2023/006027 | May 2023 | WO |
Child | 18934348 | US |