The present disclosure relates to an information processing apparatus, an information processing method, and a storage medium.
Japanese Patent No. 5235718 discusses a technique that performs image analysis on a captured image to extract a feature amount of the image, and detects, based on a change of the feature amount, an action (e.g., a camera tampering attempt) obstructing image capturing.
The present disclosure is directed to a technique capable of detecting a state where image capturing is obstructed depending on a situation.
According to an aspect of the present disclosure, an information processing apparatus determining whether image capturing by an image capturing apparatus is obstructed, includes a dividing unit configured to divide an input image captured by the image capturing apparatus into a plurality of blocks, a processing determination unit configured to determine whether to perform first detection processing using a reference image corresponding to the image capturing apparatus or second detection processing using a feature amount of the input image, on each of the blocks, and an obstruction determination unit configured to determine whether the image capturing by the image capturing apparatus is obstructed, based on a detection result of each of the blocks by the first detection processing or the second detection processing.
Further features of the present disclosure will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
An exemplary embodiment of the present disclosure is described in detail below with reference to the accompanying drawings. In the present specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and repetitive descriptions of the components are omitted.
An example of a configuration of a system according to an exemplary embodiment of the present disclosure is described with reference to
A type of the network A103 is not particularly limited as long as the network A103 can connect each of the image capturing apparatuses A101-1 to A101-3 with the management apparatus A105. Specific examples of the network A103 include the Internet, a local area network (LAN), a wide area network (WAN), and a public line (e.g., telephone line or mobile communication line). Further, other examples of the network A103 include a dedicated line, an asynchronous transfer mode (ATM) line, a frame relay line, a cable television line, and a data broadcasting wireless communication line. Further, the network A103 may be a wireless network or a wired network. In addition, the network A103 may include a plurality of different types of networks. As a specific example, communication between each of the image capturing apparatuses A101-1 to A101-3 and the management apparatus A105 may be relayed by a communication apparatus. In this case, different types of networks may be applied to the communication between the communication apparatus and each of the image capturing apparatuses A101-1 to A101-3, and the communication between the communication apparatus and the management apparatus A105.
Each of the image capturing apparatuses A101-1 to A101-3 has a detection function to detect an action (e.g., camera tampering attempts) that shields at least a part of a viewing angle to obstruct the image capturing. In the example illustrated in
The management apparatus A105 is an information processing apparatus that is used for monitoring operation based on images corresponding to image capturing results of the respective image capturing apparatuses A101-1 to A101-3. The management apparatus A105 has functions of, for example, presentation of the image corresponding to the image capturing result of each image capturing apparatus A101, control of the above-described detection function of each image capturing apparatus A101, and reception of notification (e.g., alert) from each image capturing apparatus A101. The management apparatus A105 can be realized by, for example, a personal computer (PC).
The management apparatus A105 includes, for example, a main body performing various kinds of calculations, an output device (e.g., display) presenting information to the user, and an input device (e.g., keyboard and pointing device) receiving an instruction from the user. The management apparatus A105 may receive, from the user, an instruction about setting of each image capturing apparatus A101 through a user interface such as a web browser, and may update setting of the target image capturing apparatus A101 based on the instruction. Further, the management apparatus A105 may receive the image (e.g., moving image or still image) corresponding to the image capturing result from each image capturing apparatus A101, and may present the image to the user through the output device or record the image. Furthermore, the management apparatus A105 may receive notification of an alert and the like from each image capturing apparatus A101, and present information corresponding to the notification to the user through the output device. The various kinds of functions described above may be implemented by, for example, applications installed in the management apparatus A105.
An example of a hardware configuration of an information processing apparatus 100 adoptable as parts relating to execution of the various kinds of calculations of the image capturing apparatus A101 and as the management apparatus A105 is described with reference to
The information processing apparatus 100 includes a central processing unit (CPU) 101, a read only memory (ROM) 102, and a random access memory (RAM) 103. The information processing apparatus 100 further includes an auxiliary storage device 104 and a communication interface (I/F) 107. The information processing apparatus 100 may include at least any of an output device 105 and an input device 106. The CPU 101, the ROM 102, the RAM 103, the auxiliary storage device 104, the output device 105, the input device 106, and the communication I/F 107 are connected to one another through a bus 108.
The CPU 101 controls various kinds of operation of the information processing apparatus 100. For example, the CPU 101 may control operation of the entire information processing apparatus 100. The ROM 102 stores control programs, a boot program, and other programs executable by the CPU 101. The RAM 103 is a main storage memory of the CPU 101, and is used as a work area or a temporary storage area for loading various kinds of programs.
The auxiliary storage device 104 stores various kinds of data and various kinds of programs. The auxiliary storage device 104 is implemented by a storage device temporarily or persistently storing various kinds of data, such as a nonvolatile memory represented by a hard disk drive (HDD) and a solid state drive (SSD).
The output device 105 is a device outputting various kinds of information, and is used for presentation of the various kinds of information to the user. For example, the output device 105 is implemented by a display device such as a display. In this case, the output device 105 presents the information to the user by displaying various kinds of display information. As another example, the output device 105 may be implemented by a sound output device outputting sound such as voice and electronic sound. In this case, the output device 105 presents the information to the user by outputting sound such as voice and electronic sound. The device adopted as the output device 105 may be appropriately changed depending on a medium used for presentation of information to the user.
The input device 106 is used to receive various kinds of instructions from the user. The input device 106 can be implemented by, for example, a mouse, a keyboard, and a touch panel. Further, as another example, the input device 106 may include a sound collection device such as a microphone, and may collect voice uttered by the user. In this case, when various kinds of analysis processing such as acoustic analysis and natural language processing is performed on the collected voice, contents represented by the voice are recognized as the instruction from the user. Further, a device adopted as the input device 106 may be appropriately changed depending on a method of recognizing the instruction from the user. In addition, a plurality of types of devices may be adopted as the input device 106.
The communication I/F 107 is used for communication with an external apparatus through the network. A device adopted as the communication I/F 107 may be appropriately changed depending on a type of a communication path and an adopted communication system.
When the CPU 101 loads programs stored in the ROM 102 or the auxiliary storage device 104 to the RAM 103 and executes the programs, functional configurations illustrated in
An example of a functional configuration of the image capturing apparatus A101 according to the present exemplary embodiment is described with reference to
The image capturing unit A201 guides light of an object incident through an optical system such as a lens, to an image capturing device, photoelectrically converts the light into an electric signal by the image capturing device, and generates image data based on the electric signal.
The compression unit A202 applies encoding processing, compression processing, and other processing on the image data output from the image capturing unit A201, to reduce a data amount of the image data.
The format conversion unit A203 converts the image data, the data amount of which has been reduced by compression, into other image data of a predetermined format. As a specific example, the format conversion unit A203 may convert the target image data into image data of a format more suitable for transmission through the network.
The format conversion unit A203 outputs the format-converted image data to a predetermined output destination. As a specific example, the format conversion unit A203 may output the format-converted image data to the communication unit A204 to transmit the image data to the other apparatus (e.g., management apparatus A105) through the network.
The communication unit A204 transmits and receives information and data to and from the other apparatus through a predetermined network. For example, the communication unit A204 receives information corresponding to an instruction about various kinds of settings received by the management apparatus A105 from the user. In addition, the communication unit A204 transmits an image corresponding to the image capturing result of the image capturing unit A201 and notifies the management apparatus A105 of various kinds of notification information (e.g., alert information).
The block dividing unit A205 divides the image of the image data output from the image capturing unit A201 (i.e., image corresponding to image capturing result of image capturing unit A201) into a plurality of blocks. As a specific example, the block dividing unit A205 may divide the image corresponding to the image capturing result of the image capturing unit A201, into a plurality of blocks each having a rectangular shape.
For example,
Note that the example illustrated in
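The division performed by the block dividing unit A205 can be sketched as follows. This is a minimal illustration, assuming a rectangular grid whose dimensions divide the frame evenly; the function name, frame size, and grid size are hypothetical and not specified in the present disclosure.

```python
def divide_into_blocks(width, height, cols, rows):
    """Divide a width x height frame into a cols x rows grid of
    rectangular blocks, each returned as an (x, y, w, h) tuple."""
    block_w, block_h = width // cols, height // rows
    blocks = []
    for r in range(rows):
        for c in range(cols):
            blocks.append((c * block_w, r * block_h, block_w, block_h))
    return blocks

grid = divide_into_blocks(640, 480, 8, 6)
# 48 blocks, each 80 x 80 pixels
```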
The detection processing switching unit A206 selectively switches, based on a predetermined condition, whether to apply processing by the first detection unit A207 described below or processing by the second detection unit A208 described below to each of the plurality of blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A201. As a specific example, the detection processing switching unit A206 may acquire, from the setting reception unit A211 described below, the information corresponding to the instruction received by the management apparatus A105 from the user, and may determine processing to be applied to each of the blocks based on the information.
The first detection unit A207 detects occurrence of a state where a partial area corresponding to an input image (e.g., partial image corresponding to each of blocks) in the viewing angle of the image capturing unit A201 is shielded, based on a difference between the input image and a reference image.
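The background-difference detection by the first detection unit A207 can be sketched as follows. This is an assumption-laden illustration: the per-pixel difference threshold and the changed-pixel ratio threshold are hypothetical parameters, and the blocks are represented as flat lists of grayscale values.

```python
def detect_shielded_by_background_diff(block_pixels, reference_pixels,
                                       diff_threshold=40, ratio_threshold=0.7):
    """Flag the block as shielded when a large fraction of its pixels
    differ strongly from the corresponding reference (background) image.
    Both thresholds are illustrative values, not from the disclosure."""
    changed = sum(1 for p, q in zip(block_pixels, reference_pixels)
                  if abs(p - q) > diff_threshold)
    return changed / len(block_pixels) >= ratio_threshold
```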
The second detection unit A208 detects occurrence of the state where the partial area corresponding to the input image (e.g., partial image corresponding to each of blocks) in the viewing angle of the image capturing unit A201 is shielded, based on a feature amount representing a predetermined image feature extracted from the input image. As a specific example, the second detection unit A208 may extract edge power as the above-described feature amount by applying a Sobel filter to the input image. In this case, the second detection unit A208 may detect occurrence of the state where the partial area corresponding to the input image in the viewing angle of the image capturing unit A201 is shielded, based on uniformity of the input image corresponding to the extracted edge power.
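The edge-power extraction by the second detection unit A208 can be sketched as below, using the 3x3 Sobel kernels mentioned above. The mean-gradient formulation and the interpretation that low edge power indicates a uniform (possibly shielded) block are assumptions for illustration; the disclosure only states that uniformity corresponding to the extracted edge power is used.

```python
def edge_power(gray, w, h):
    """Mean Sobel gradient magnitude of a grayscale block given as a
    row-major list of length w * h. A low value indicates a uniform
    block, which may suggest the view is covered by an obstruction."""
    gx_k = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
    gy_k = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]
    total, count = 0.0, 0
    for y in range(1, h - 1):          # skip the border pixels
        for x in range(1, w - 1):
            gx = gy = 0
            for j in range(3):
                for i in range(3):
                    p = gray[(y + j - 1) * w + (x + i - 1)]
                    gx += gx_k[j][i] * p
                    gy += gy_k[j][i] * p
            total += (gx * gx + gy * gy) ** 0.5
            count += 1
    return total / count if count else 0.0
```

A perfectly uniform block yields zero edge power, while a block containing a step edge yields a positive value.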
In the following description, the state where the partial area corresponding to the input image (e.g., partial image corresponding to each of blocks) in the viewing angle of the image capturing unit A201 is shielded is also referred to as a “shielded state”, for convenience.
The obstruction determination unit A209 determines whether the image capturing by the image capturing unit A201 is obstructed, based on a detection result of the shielded state of each of the blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A201, detected by the first detection unit A207 or the second detection unit A208. As a specific example, the obstruction determination unit A209 may determine whether the image capturing by the image capturing unit A201 is obstructed, based on a ratio of the blocks detected as being shielded to the plurality of blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A201.
The notification unit A210 notifies a predetermined notification destination (e.g., management apparatus A105 illustrated in
The setting reception unit A211 receives, from the management apparatus A105, an instruction about various kinds of settings received by the management apparatus A105 from the user, and controls various kinds of settings for operation of the image capturing apparatus A101 in response to the instruction. As a specific example, the setting reception unit A211 may control the detection processing switching unit A206 to switch the shielded state detection processing to be applied to each of the blocks, in response to the instruction from the user received from the management apparatus A105.
Further, the setting reception unit A211 may transmit, to the management apparatus A105, information for presenting to the user a user interface (UI) (e.g., a setting screen) for receiving instructions about control of the various kinds of settings, thereby causing the management apparatus A105 to present the UI. Further, the setting reception unit A211 may control the various kinds of settings for operation of the image capturing apparatus A101 (e.g., setting about switching condition of detection processing switching unit A206), in response to the instruction received by the management apparatus A105 from the user through the above-described UI.
An example of processing by the image capturing apparatus A101 according to the present exemplary embodiment is described with reference to
In step S101, the block dividing unit A205 divides the image corresponding to the image capturing result of the image capturing unit A201, into a predetermined number of blocks.
In step S102, the detection processing switching unit A206 determines whether to apply the processing by the first detection unit A207 or the processing by the second detection unit A208 to each of the blocks, based on the user instruction notified from the setting reception unit A211. The processing in step S102 is separately described in detail below.
In step S103, the image capturing apparatus A101 determines whether processing in steps S104 to S106 described below has been performed on all of the plurality of blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A201. In a case where the image capturing apparatus A101 determines in step S103 that the processing in steps S104 to S106 has not been performed on all of the plurality of blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A201 (NO in step S103), the processing proceeds to step S104.
In step S104, the detection processing switching unit A206 confirms whether application of the processing by the first detection unit A207 (shielded state detection processing based on background difference) to the target block is determined in step S102.
In step S104, in a case where the detection processing switching unit A206 confirms that the processing by the first detection unit A207 (shielded state detection processing based on background difference) is applied to the target block (YES in step S104), the processing proceeds to step S106. In step S106, the detection processing switching unit A206 requests the first detection unit A207 to perform the processing on the target block. The first detection unit A207 detects the shielded state of a partial area corresponding to the target block in the viewing angle of the image capturing unit A201 by using a background difference based on comparison between a partial image corresponding to the target block and a reference image.
On the other hand, in step S104, in a case where the detection processing switching unit A206 confirms that the processing by the first detection unit A207 (shielded state detection processing based on background difference) is not applied to the target block (NO in step S104), the processing proceeds to step S105. In step S105, the detection processing switching unit A206 requests the second detection unit A208 to perform the processing on the target block. The second detection unit A208 detects the shielded state of the partial area corresponding to the target block in the viewing angle of the image capturing unit A201 by using edge power extracted from the partial image corresponding to the target block.
The image capturing apparatus A101 performs the processing in steps S104 to S106 on all of the blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A201, in the above-described manner.
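The per-block loop in steps S103 to S106 can be sketched as follows. The callable-based interface is an assumption for illustration; `first_detect` and `second_detect` stand in for the first detection unit A207 (background difference) and the second detection unit A208 (edge power), respectively.

```python
def run_detection(blocks, use_first, first_detect, second_detect):
    """Steps S103 to S106: for each block, dispatch to the first
    detection unit (step S106) or the second detection unit (step S105),
    as determined per block in step S102, and collect the per-block
    shielded-state results."""
    results = []
    for i, block in enumerate(blocks):
        if use_first[i]:
            results.append(first_detect(block))   # step S106
        else:
            results.append(second_detect(block))  # step S105
    return results
```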
In a case where the image capturing apparatus A101 determines in step S103 that the processing in steps S104 to S106 has been already performed on all of the plurality of blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A201 (YES in step S103), the processing proceeds to step S107.
In step S107, the obstruction determination unit A209 determines whether the image capturing by the image capturing unit A201 is obstructed based on the number of blocks detected as being shielded among all of the blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A201. More specifically, the obstruction determination unit A209 calculates a ratio of the blocks detected as being shielded to all of the blocks, and compares the ratio with a threshold. In a case where the calculated ratio exceeds the threshold, the obstruction determination unit A209 determines that the image capturing by the image capturing unit A201 is obstructed.
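The determination in step S107 can be sketched as follows. The threshold value of 0.5 is a hypothetical default; the disclosure does not specify a concrete threshold.

```python
def is_capture_obstructed(shielded_flags, ratio_threshold=0.5):
    """Step S107: declare that image capturing is obstructed when the
    ratio of blocks detected as shielded exceeds the threshold."""
    ratio = sum(shielded_flags) / len(shielded_flags)
    return ratio > ratio_threshold
```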
In step S108, the obstruction determination unit A209 confirms whether it is determined in step S107 that the image capturing by the image capturing unit A201 is obstructed.
In step S108, in a case where the obstruction determination unit A209 confirms that the image capturing by the image capturing unit A201 is obstructed (YES in step S108), the processing proceeds to step S109. In step S109, the notification unit A210 notifies the management apparatus A105 of detection of the state where the image capturing by the image capturing unit A201 is obstructed.
On the other hand, in step S108, in a case where the obstruction determination unit A209 confirms that the image capturing by the image capturing unit A201 is not obstructed (NO in step S108), the series of processing illustrated in
Next, an example of the processing by the detection processing switching unit A206 to determine whether to apply the processing by the first detection unit A207 or the processing by the second detection unit A208 to the target block, illustrated in step S102 of
In step S201, the setting reception unit A211 transmits, to the management apparatus A105 through the communication unit A204, a screen that presents the detection result of the shielded state of each of the blocks based on the current setting of each block, and causes the management apparatus A105 to present the screen.
A start button A410 is a button for receiving an instruction to start setting about the shielded state detection, from the user. An end button A411 is a button for receiving an instruction to end the setting about the shielded state detection, from the user. A close button A412 is a button for receiving an instruction to close the setting screen, from the user. Radio buttons A413 and A414 are interfaces for receiving selection of a method to detect the shielded state of each of the blocks, from the user. In a case where the radio button A413 is selected, the shielded state detection processing based on the edge power by the second detection unit A208 is applied to the target block. In a case where the radio button A414 is selected, the shielded state detection processing based on the background difference by the first detection unit A207 is applied to the target block.
As illustrated in
In step S202, the management apparatus A105 determines whether an instruction to complete all setting processing has been received from the user. As a specific example, in a case where the close button A412 is pressed, the management apparatus A105 may recognize that the instruction to complete all setting processing has been received from the user.
In a case where the management apparatus A105 determines in step S202 that the instruction to complete all setting processing has been received from the user (YES in step S202), the series of processing illustrated in
On the other hand, in a case where the management apparatus A105 determines in step S202 that the instruction to complete all setting processing has not been received from the user (NO in step S202), the processing proceeds to step S203. In step S203, the management apparatus A105 determines whether an instruction to start setting about the shielded state detection has been received from the user. As a specific example, in a case where the start button A410 is pressed, the management apparatus A105 may recognize that the instruction to start the setting about the shielded state detection has been received from the user.
In a case where the management apparatus A105 determines in step S203 that the instruction to start the setting about the shielded state detection has not been received (NO in step S203), the processing proceeds to step S201. In this case, the series of processing from step S201 illustrated in
On the other hand, in a case where the management apparatus A105 determines in step S203 that the instruction to start the setting about the shielded state detection has been received (YES in step S203), the processing proceeds to step S204. In step S204, the management apparatus A105 determines whether, out of the method based on the edge power and the method based on the background difference, the method based on the edge power has been selected as the method to detect the shielded state. As a specific example, in a case where the radio button A413 associated with the method based on the edge power is designated out of the radio buttons A413 and A414, the management apparatus A105 may recognize that the method based on the edge power has been selected.
In a case where the management apparatus A105 determines in step S204 that the method based on the edge power has been selected as the method to detect the shielded state (e.g., in a case where the radio button A413 is designated) (YES in step S204), the processing proceeds to step S205. In step S205, the management apparatus A105 performs setting processing relating to the shielded state detection by the method based on the edge power. The processing is separately described in detail below with reference to
On the other hand, in a case where the management apparatus A105 determines in step S204 that the method based on the edge power has not been selected as the method to detect the shielded state (e.g., in a case where the radio button A414 is designated) (NO in step S204), the processing proceeds to step S206. In step S206, the management apparatus A105 performs setting processing relating to the shielded state detection by the method based on the background difference. The processing is separately described in detail below with reference to
The management apparatus A105 performs the series of processing illustrated in
Next, an example of the setting processing relating to the shielded state detection by the method based on the edge power, described as the processing in step S205 of
In step S301, the setting reception unit A211 transmits a screen that presents a detection result of the shielded state of each of the blocks based on the edge power, to the management apparatus A105 through the communication unit A204, and causes the management apparatus A105 to present the screen.
It is found from the screen illustrated in
In the screen illustrated in
In step S302, the management apparatus A105 determines whether the instruction to designate a block has been received from the user through the above-described screen.
In a case where the management apparatus A105 determines in step S302 that the instruction to designate a block has been received from the user (YES in step S302), the processing proceeds to step S303. In step S303, the management apparatus A105 requests the image capturing apparatus A101 that has captured the image displayed on the screen, to switch execution and inexecution of the shielded state detection processing based on the edge power on the designated block. The setting reception unit A211 of the image capturing apparatus A101 instructs the detection processing switching unit A206 to switch execution and inexecution of the shielded state detection processing based on the edge power on the block designated by the user, in response to the request from the management apparatus A105. The detection processing switching unit A206 switches execution and inexecution of the shielded state detection processing based on the edge power on the target block, in response to the instruction from the setting reception unit A211.
In the present exemplary embodiment, in a case where the shielded state detection processing based on the edge power on the target block is switched to inexecution, the shielded state detection processing based on the background difference is performed on the target block.
For example, a screen illustrated in
As can be seen from comparison between the screen illustrated in
On the other hand, in a case where the management apparatus A105 determines in step S302 that the instruction to designate the block has not been received from the user (NO in step S302), the processing proceeds to step S304. In this case, the processing in step S303 is not performed.
In step S304, the management apparatus A105 determines whether an instruction to end setting about the shielded state detection has been received from the user. As a specific example, in a case where the end button A411 is pressed, the management apparatus A105 may recognize that the instruction to end the setting about the shielded state detection has been received from the user.
In a case where the management apparatus A105 determines in step S304 that the instruction to end the setting about the shielded state detection has been received from the user (YES in step S304), the series of processing illustrated in
For example, a screen illustrated in
On the other hand, in a case where the management apparatus A105 determines in step S304 that the instruction to end the setting about the shielded state detection has not been received from the user (NO in step S304), the processing proceeds to step S301.
The management apparatus A105 performs the series of processing illustrated in
Next, an example of the setting processing relating to the shielded state detection by the method based on the background difference, described as the processing in step S206 of
For example, a screen illustrated in
On the other hand, a screen illustrated in
In step S305, the setting reception unit A211 transmits a screen that presents a detection result of the shielded state of each of the blocks based on the background difference, to the management apparatus A105 through the communication unit A204, and causes the management apparatus A105 to present the screen.
In
In other words, it is found from the screen illustrated in
In the screen illustrated in
In step S306, the management apparatus A105 determines whether an instruction to designate a block has been received from the user through the above-described screen.
In a case where the management apparatus A105 determines in step S306 that the instruction to designate a block has been received from the user (YES in step S306), the processing proceeds to step S307. In step S307, the management apparatus A105 requests the image capturing apparatus A101 that has captured the image displayed on the screen, to switch execution and inexecution of the shielded state detection processing based on the background difference on the designated block. The setting reception unit A211 of the image capturing apparatus A101 instructs the detection processing switching unit A206 to switch execution and inexecution of the shielded state detection processing based on the background difference on the block designated by the user, in response to the request from the management apparatus A105. The detection processing switching unit A206 switches execution and inexecution of the shielded state detection processing based on the background difference on the target block, in response to the instruction from the setting reception unit A211.
In the present exemplary embodiment, in a case where the shielded state detection processing based on the background difference on the target block is switched to inexecution, the shielded state detection processing based on the edge power is performed on the target block.
For example, a screen illustrated in
As can be seen from the comparison between the screen illustrated in
On the other hand, in a case where the management apparatus A105 determines in step S306 that the instruction to designate the block has not been received from the user (NO in step S306), the processing proceeds to step S308. In this case, the processing in step S307 is not performed.
In step S308, the management apparatus A105 determines whether an instruction to end the setting about the shielded state detection has been received from the user. As a specific example, in the case where the end button A411 is pressed, the management apparatus A105 may recognize that the instruction to end the setting about the shielded state detection has been received from the user.
In a case where the management apparatus A105 determines in step S308 that the instruction to end the setting about the shielded state detection has been received from the user (YES in step S308), the series of processing illustrated in
For example, a screen illustrated in
On the other hand, in a case where the management apparatus A105 determines in step S308 that the instruction to end the setting about the shielded state detection has not been received from the user (NO in step S308), the processing returns to step S305.
The management apparatus A105 performs the series of processing illustrated in
Applying the above-described processing makes it possible to selectively switch the processing applied to the determination of whether each of the blocks is shielded, between the processing based on the feature amount (e.g., edge power) and the processing based on the background difference, depending on the situation at the time. Such a mechanism makes it possible to improve detection accuracy of the state where the image capturing by the image capturing unit A201 is obstructed, depending on the situation at the time (e.g., scene to be monitored).
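The two per-block decisions referred to above can be sketched as follows. This is an illustrative sketch only: the use of the mean absolute difference from the reference image for the background difference, the mean gradient magnitude for the edge power, and the threshold values are all assumptions, not specifics from the disclosure.

```python
import numpy as np


def shielded_by_background_difference(block, reference_block, threshold=30.0):
    """Treat a block as possibly shielded when its mean absolute difference
    from the corresponding block of the reference image is large."""
    diff = np.mean(np.abs(block.astype(float) - reference_block.astype(float)))
    return bool(diff > threshold)


def shielded_by_edge_power(block, threshold=5.0):
    """Treat a block as possibly shielded when its edge power (here, the
    mean gradient magnitude) is small, e.g., covered by a flat obstruction."""
    gy, gx = np.gradient(block.astype(float))
    edge_power = np.mean(np.hypot(gx, gy))
    return bool(edge_power < threshold)
```

In this sketch, the first detection unit would evaluate `shielded_by_background_difference` and the second detection unit `shielded_by_edge_power`, on each block for which the respective processing is set to execution.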
Subsequently, a modification of the present exemplary embodiment is described. In the above-described exemplary embodiment, the processing to be applied to each of the blocks is manually set by the user operation. In contrast, in the present modification, an example of a mechanism is described in which the image capturing apparatus A101 automatically sets the processing to be applied to each of the blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A201, by using a detection result of the shielded state of each of the blocks.
First, an example of a functional configuration of the image capturing apparatus A101 according to the present modification is described with reference to
The detection processing determination unit A212 receives feedback of the detection result of the shielded state of the block based on the background difference by the first detection unit A207 and feedback of the detection result of the shielded state of the block based on the feature amount (e.g., edge power) by the second detection unit A208. The detection processing determination unit A212 determines whether to apply the detection processing based on the background difference or the detection processing based on the feature amount, to the target block, based on the feedback (i.e., detection result described above).
For example,
The detection processing determination unit A212 determines the detection processing to be applied to each of the blocks based on whether the background difference acquired for the block exceeds a threshold and whether the feature amount extracted from the block exceeds a threshold. More specifically, the detection processing determination unit A212 basically determines the processing based on the edge power as the applied processing, and in a case where the edge power is smaller than its threshold and the background difference is smaller than its threshold, the detection processing determination unit A212 determines the detection processing based on the background difference as the applied processing. Further, the detection processing determination unit A212 controls the detection processing switching unit A206 to switch the processing applied to the target block, based on the determination result of the detection processing to be applied to the target block.
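The selection rule described above can be sketched as a short function. The function and parameter names are illustrative assumptions; only the rule itself (edge power by default, background difference when both values fall below their thresholds) comes from the description above.

```python
def select_detection_processing(edge_power, background_difference,
                                edge_threshold, difference_threshold):
    """Pick the detection processing for one block.

    Edge-power detection is the default. When the edge power is small (the
    block is flat, so edge power alone is unreliable) and the background
    difference is also small (the block matches the reference image), the
    background-difference detection is selected instead.
    """
    if (edge_power < edge_threshold
            and background_difference < difference_threshold):
        return "background_difference"
    return "edge_power"
```

The detection processing determination unit would feed this result back to the switching unit for each block.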
Applying the above-described control makes it possible to automatically and selectively switch the processing applied to the determination of whether each of the blocks is shielded, between the processing based on the feature amount (e.g., edge power) and the processing based on the background difference, depending on the situation at the time. Such a mechanism makes it possible to improve the detection accuracy of the state where the image capturing by the image capturing unit A201 is obstructed, depending on the situation at the time (e.g., scene to be monitored).
The present disclosure can be realized by supplying programs implementing one or more functions of the above-described exemplary embodiment to a system or an apparatus through a network or a recording medium, and causing one or more processors of a computer in the system or the apparatus to read out and execute the programs. Further, the present disclosure can be realized by a circuit (e.g., application specific integrated circuit (ASIC)) implementing one or more functions of the above-described exemplary embodiment.
Further, the configurations described with reference to
As a specific example, the components A205 to A211 relating to detection of the state where the image capturing by the image capturing unit A201 is obstructed may be provided outside the image capturing apparatus A101. In this case, an apparatus including the components A205 to A211 relating to detection of the state where the image capturing by the image capturing unit A201 is obstructed corresponds to an example of the “information processing apparatus” according to the present exemplary embodiment.
Further, as another example, among the components of the image capturing apparatus A101, a load of the processing by at least some of the components may be distributed to a plurality of apparatuses.
Embodiment(s) of the present disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present disclosure has been described with reference to exemplary embodiments, the scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2020-086982, filed May 18, 2020, which is hereby incorporated by reference herein in its entirety.