This application is based upon and claims priority from Chinese Patent Application No. 201611105497.9, filed on Dec. 5, 2016, the disclosure of which is expressly incorporated herein by reference in its entirety.
The present disclosure generally relates to imaging systems, and more specifically to apparatus and method for controlling a camera array.
With the proliferation of virtual reality (VR) technology and the enthusiasm for VR experiences, there is a high demand for using multiple cameras to shoot photos or video footage. However, many conventional VR camera systems use only a limited number of cameras, e.g., 2-4 cameras, and thus have limited field-of-view coverage. For example, these systems cannot produce 360-degree panoramic images.
Moreover, to produce high-quality VR photos and/or videos, the operations of the multiple cameras need to be precisely synchronized. As such, camera operators often have to spend tremendous time and effort manually calibrating and setting the operation parameters used by each individual camera. This is cumbersome. As the number of cameras increases, this task may quickly become unmanageable by one person. Although some sophisticated VR image processing programs have been developed to combine (e.g., stitch) the images captured by different cameras, and thus allow the cameras to work in an unsynchronized manner, the software-based image processing is time consuming and demands substantial computing resources, while the generated VR content still suffers from image quality loss due to the cameras not being synchronized.
Furthermore, cameras used in conventional VR camera systems often work independently and have no effective means to communicate with other cameras or report their operation statuses to a camera operator. For example, each camera in a VR camera system may cover a different part of the scene to be imaged. Thus, the content recorded by each camera is indispensable for producing the VR content. If one of the cameras fails and stops working in the middle of a recording, and the failure goes unnoticed by the camera operator, then the photos/videos recorded by the other cameras of the system after the occurrence of the failure are no longer useful for VR generation. This problem worsens when the VR camera system uses a large number of cameras, as it is difficult for a camera operator to attend to multiple cameras at the same time.
Thus, it is desirable to develop a controller that performs centralized control and/or monitoring of the multiple cameras, in order to coordinate the operations of different cameras and free camera operators from the daunting tasks of camera calibration and/or image processing. The disclosed systems and methods address one or more of the demands listed above.
Consistent with one embodiment of the present disclosure, a camera system is provided. The system may include a plurality of cameras and a controller coupled to the cameras. The controller may be configured to activate the cameras; determine at least one of hardware or software conditions of the cameras; when the conditions of the cameras are normal, instruct the cameras to initiate an operation; after the operation is initiated, monitor operation status of the cameras; and when an abnormal operation status is detected, instruct the cameras to stop the operation.
Consistent with another embodiment of the present disclosure, a method for controlling a plurality of cameras is provided. The method may include activating the cameras; determining at least one of hardware or software conditions of the cameras; when the conditions of the cameras are normal, instructing the cameras to initiate an operation; after the operation is initiated, monitoring operation status of the cameras; and when one or more of the cameras have abnormal operation status, instructing the cameras to stop the operation.
Consistent with yet another embodiment of the present disclosure, a non-transitory computer-readable storage medium storing instructions for controlling a plurality of cameras is provided. The instructions cause a processor to perform operations including activating the cameras; determining at least one of hardware or software conditions of the cameras; when the conditions of the cameras are normal, instructing the cameras to initiate an operation; after the operation is initiated, monitoring operation status of the cameras; and when one or more of the cameras have abnormal operation statuses, instructing the cameras to stop the operation.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
Reference will now be made in detail to the disclosed embodiments, examples of which are illustrated in the accompanying drawings. Wherever convenient, the same reference numbers will be used throughout the drawings to refer to the same or like parts.
Features and characteristics of the present disclosure, as well as methods of operation and functions of related elements of structure and the combination of parts and economies of manufacture, may become more apparent upon consideration of the following description with reference to the accompanying drawings, all of which form a part of this specification. It is to be understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the invention. As used in the specification and in the claims, the singular form of “a”, “an”, and “the” include plural referents unless the context clearly dictates otherwise.
Camera rig 20 may be a structure used for mounting camera array 30. Camera rig 20 may be built to form a specially designed camera path. Industry standard trussing and grip gear can be used in conjunction with various custom rigging solutions to allow substantial flexibility with positioning, height, and camera movement. Camera rig 20 may include complex structures that include multiple circles and curves with various diameters, straight tracks, incline/decline angles, overhead rigging, etc. Camera rig 20 may also be as simple as a single straight or curved track. For example, in 360-degree panorama photography, camera rig 20 may form a 360-degree circle for aligning camera array 30.
As shown in
Side wall 21 may include multiple slots for receiving camera modules 32. Each slot may include a Universal Serial Bus (USB) port (not shown in
In some embodiments, camera array 30 may also include one or more camera modules 34 located on a surface 22 of camera rig 20. Camera modules 34 may be pointed up and configured to capture a scene above camera rig 20. As such, in the following description, camera module 34 may also be referred to as “top view camera module 34.” Similarly, camera array 30 may include one or more camera modules located on a bottom surface (not shown) of camera rig 20 and configured to capture a scene below camera rig 20.
In some embodiments, two or more of camera modules 32 and 34 may be positioned to have a sufficient field-of-view overlap so that certain parts of a scene can be seen by more than one camera module. As described above, such overlap is suitable for creating the VR effects. Moreover, capturing an object by more than one camera module may be beneficial for correcting exposure or color deficiencies in the images captured by camera array 30. Other benefits include disparity/depth calculations, stereoscopic reconstruction, and the potential to perform multi-camera high-dynamic range (HDR) imaging using an alternating mosaic pattern of under- and over-exposure across camera array 30.
In some embodiments, each camera module 32 or 34 may include one or more memory cards for storing raw image data captured by the respective camera module 32 or 34. Example memory cards include, but are not limited to, a secure digital (SD) memory card, a secure digital high capacity (SDHC) memory card, a secure digital extra capacity (SDXC) memory card, and a compact flash (CF) memory card, etc. For example, top surface 22 may include a card slot 35 for receiving a memory card used by top view camera module 34.
Camera rig 20 may be constructed from a heat dissipating material that draws heat from camera array 30 for dissipation in the atmosphere. As shown in
Camera system 10 may also include a user interface 40 located on top surface 22. User interface 40 may be configured to present certain information regarding the statuses of camera array 30 to a camera operator. User interface 40 may also be configured to receive user input for controlling certain functions of camera array 30. For example, user interface 40 may include a display panel for outputting images, videos, and/or other types of visual information to the operator. The display panel may include a liquid crystal display (LCD), a light-emitting diode (LED) display, a plasma display, or any other type of display. The display panel may have one or more associated or embedded speakers (not shown) for broadcasting audio messages. As another example, user interface 40 may include various input devices, such as a knob, a dial, a keyboard, and/or a touch screen for the operator to set the operation parameters of camera array 30 and/or enter other settings for camera system 10.
In the disclosed embodiments, camera system 10 may also include one or more hot buttons for triggering various functions of camera system 10. These hot buttons may be located at various places on the surface of camera rig 20. As shown in
Camera module 210 may be an image capturing device that includes any of optical devices, lenses, charge coupled devices (CCD), complementary metal-oxide-semiconductor (CMOS) detector arrays and driving circuitry, and other arrangements of optical components, electronic components, and control circuitry used in transmitting and receiving light of various wavelengths. For example, camera module 210 may be an action camera, a digital camera, a web camera, or a digital single-lens reflex (DSLR) camera. Camera module 210 may also be embedded in another device, such as a smartphone, a computer, a personal digital assistant (PDA), a monitoring device, etc.
Camera module 210 may be configured to capture one or more images in a variety of ways. For example, camera module 210 may be configured to capture images initiated by a user, by programming, by a hardware setting, or by a combination thereof. In some embodiments, when camera module 210 is configured to capture images by software or hardware programming or by a hardware setting, image capture can be performed at one or more predetermined conditions. For example, multiple camera modules 210 may be controlled by controller 220 to capture images simultaneously or in an ordered fashion. Alternatively, or additionally, a set of predetermined conditions, for example, the sensing of a moving object, can trigger camera module 210 to capture images. In some embodiments, capturing images may include placing camera module 210 in a mode or setting capable of capturing one or more images. As used herein, an “image” can refer to, in part or in whole, a static or dynamic visual representation including, but not limited to, a photo, a picture, a graphic, a video, a hologram, a virtual reality image, an augmented reality image, other visual representations, or combinations thereof.
Camera module 210 may include various features suitable for VR creation. In one embodiment, camera module 210 may use a 16MP (megapixel) light sensor capable of capturing high-resolution (e.g., 4608×3456) photos with enhanced color and contrast. Camera module 210 may also have a wide field of view, such as a 155-degree viewing angle. Camera module 210 may further be configured to record videos with various resolutions and frame rates, such as 1296p at 30 fps, and 1080p at 30 fps or 60 fps.
Controller 220 may be configured to control and monitor the operations of camera modules 210.
I/O interface 222 may be configured for two-way communication between controller 220 and various components of camera modules 210 and other devices, such as user interface 240. For example, I/O interface 222 may receive certain signals transmitted from one or more camera modules 210 (e.g., signals indicating the current temperature of a camera module 210) and relay the signals to processing unit 224 for further processing. As another example, I/O interface 222 may receive instructions generated by processing unit 224 (e.g., an instruction commanding one or more camera modules 210 to initiate video recording) and transmit the instructions to camera module 210 for execution.
Processing unit 224 may be implemented with one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, micro-controllers, microprocessors, or other electronic components. Processing unit 224 may execute computer instructions (program code) and perform functions in accordance with techniques described herein. Computer instructions include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform particular functions described herein.
Each of storage unit 226 and memory module 228 may include one or more memories configured to store the instructions and data used for controlling and monitoring camera modules 210. The memories may be implemented using any type of volatile or non-volatile memory devices, or a combination thereof, such as a static random access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, or a magnetic or optical disk.
Storage unit 226 and/or memory module 228 may be configured to store the computer instructions and data that may be used by processing unit 224 to perform functions consistent with the present disclosure. For example, storage unit 226 and/or memory module 228 may store commonly used operation parameters used by camera modules 210 for creating VR content.
In some embodiments, controller 220 may communicate with camera modules 210 via control board 230. Control board 230 may include a bus for facilitating communications between camera modules 210 and controller 220. The bus may correspond to one or more protocols or standards. For example, the bus may include one or more of the following: a USB port (e.g., USB 2.0, 3.0, or Type-C); a High-Definition Multimedia Interface (HDMI) port; a Lightning connector; or any other hardware bus that is similar to or derivative of those described above.
To instruct camera modules 210 to perform certain operations or prescribe certain settings for camera modules 210, controller 220 may send the instructions and/or settings to control board 230, which then distributes the instructions and/or settings to camera modules 210. Similarly, control board 230 may receive from camera modules 210 streams of signals describing the real-time operation statuses of camera modules 210. Control board 230 may then aggregate the signals and transmit the aggregated signals to controller 220. In some embodiments, control board 230 may include a memory card or other non-transitory memory where the data exchanged between camera modules 210 and controller 220 may be temporarily cached.
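By way of illustration only, the following is a minimal Python sketch of how a control board might fan instructions out to the camera modules and aggregate their status reports; the class names, method names, and settings shown are assumptions made for the example and are not part of the disclosure.

```python
# Hypothetical sketch: fan-out of instructions and aggregation of status
# reports by a control board. All names and fields here are illustrative.
from dataclasses import dataclass, field


@dataclass
class CameraModule:
    serial: str
    settings: dict = field(default_factory=dict)

    def apply(self, instruction: dict) -> None:
        # A real module would program its hardware; here we just record settings.
        self.settings.update(instruction)

    def report_status(self) -> dict:
        return {"serial": self.serial, "settings": dict(self.settings)}


class ControlBoard:
    def __init__(self, modules):
        self.modules = list(modules)

    def distribute(self, instruction: dict) -> None:
        # Send the same instruction to every attached module.
        for module in self.modules:
            module.apply(instruction)

    def aggregate(self) -> list:
        # Collect the per-module status reports for the controller.
        return [module.report_status() for module in self.modules]


if __name__ == "__main__":
    board = ControlBoard(CameraModule(f"cam{i:02d}") for i in range(1, 5))
    board.distribute({"mode": "video", "fps": 30})
    for status in board.aggregate():
        print(status)
```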
In some embodiments, control board 230 may be configured to communicate with a mobile device 250 via a network 260. Network 260 may be any type of wired or wireless network that may allow transmitting and receiving data. For example, network 260 may be a wired network, a local wireless network (e.g., Bluetooth®, Wi-Fi, near field communications (NFC), etc.), a cellular network, the Internet, or the like, or a combination thereof. Other known communication methods which provide a medium for transmitting data are also contemplated.
In some embodiments, control board 230 may include a port such as a USB, SD, RJ45, or similar port for wired communication with mobile device 250. In some embodiments, control board 230 may include a wireless transceiver for exchanging data with mobile device 250 using one or more wireless communication methods, including IEEE 802.11, IEEE 802.16, BLUETOOTH® or another suitable wireless communication method.
Consistent with the disclosed embodiments, various components of system 200 may collaborate to control and/or monitor the operations of camera modules 210. For example, in some situations, top view camera modules 34 may not be needed for creating the VR content. Accordingly, controller 220 may control the activation/deactivation of top view camera modules 34 independently from side view camera modules 32. When the camera operator only uses side view camera modules 32 to record image data, controller 220 may send a deactivation signal to top view camera modules 34 to turn off top view camera modules 34.
For another example, when camera modules 210 are recording image data, controller 220 may periodically check the operation statuses of camera modules 210 (e.g., temperature, remaining free storage space of the memory card, etc.) according to a predetermined schedule. If controller 220 detects that a camera module 210 experiences a system failure, controller 220 may instruct the failed camera module 210 to generate a warning message, such as a beeping sound, so as to alert the camera operator. Based on the operator's preference, controller 220 may adjust the volume level of the beeping sound. For example, when the operator works in a large studio, controller 220 may turn up the warning sound, so that the operator can hear the beeping sound even if the operator is far away from a failed camera module 210. Conversely, when the operator works in a small room, controller 220 may turn down the volume of the warning sound to a level comfortable for the operator.
For another example, controller 220 may be configured to perform certain operations on multiple camera modules 210 at once. In one example, controller 220 may reset the operation parameters of all camera modules 210 to the default settings. Controller 220 may also format the memory cards in all camera modules 210.
For yet another example, as described in more detail below, controller 220 may establish a wireless connection (e.g., Wi-Fi or Bluetooth® connection) with mobile device 250 via control board 230. This way, mobile device 250 may be used by the camera operator to control and/or monitor the operations of camera modules 210.
In step 420, controller 220 may check the hardware and/or software conditions of camera modules 210. After system 200 is activated, controller 220 may immediately determine whether the hardware and/or software conditions of camera modules 210 are suitable for recording images for creating the VR effects.
The hardware and/or software conditions may include any condition that affects the proper operation of camera modules 210 and/or the synchronization among camera modules 210. For example, the conditions to be checked by controller 220 may include but are not limited to: whether a camera module 210 is activated; whether communication between the camera module 210 and controller 220 is established properly; whether the camera module 210 experiences a system crash; whether the camera module 210 is installed with a memory card; whether the memory card can be read and/or written properly; the remaining free storage space of the memory card; the temperature of the camera module 210; the version of the firmware used by the camera module 210; whether all the camera modules 210 currently use the same set of operation parameters (e.g., same ISO, same shutter speed, etc.); the state of charge (SoC), state of health (SoH), and/or remaining time of the battery used by each camera module 210; and/or the serial number of each camera module 210.
By knowing the above conditions for each camera module 210, the camera operator may determine whether system 200 is ready for recording image data and, more particularly, shooting photos and/or videos for creating VR content. For example, if the memory card of a camera module 210 is full or almost full, controller 220 may alert the operator to replace the memory card or format the memory card. As another example, if camera modules 210 use different versions of firmware, controller 220 may alert the operator to update the firmware in all the camera modules 210 to the same version, so as to ensure that the image data recorded by different camera modules 210 has consistent quality and format. As yet another example, if the battery remaining time of one or more camera modules 210 is below a predetermined level, controller 220 may conclude such battery remaining time is too short for performing any imaging operation and alert the operator to the same.
Moreover, for VR creation, controller 220 may also check whether the values of the same operation parameter used by different camera modules 210 are the same or follow a predetermined relationship. For example, it may be desirable for all the camera modules 210 to use the same setting for the white balance, so as to generate VR content with consistent color quality. As another example, the ISOs used by camera modules 210 located at different positions, such as side view camera modules 32 and top view camera modules 34, may be desired to follow a fixed ratio, e.g., 1600 for side view camera modules 32 and 800 for top view camera modules 34. This is because top view camera modules 34 often face a brighter environment than side view camera modules 32.
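For illustration only, a minimal Python sketch of such a parameter-relationship check might look as follows; the parameter names and the fixed 2:1 ISO ratio are assumptions chosen to mirror the example above, not values prescribed by the disclosure.

```python
# Hypothetical sketch: verify that all modules share one white-balance setting
# and that side-view ISO equals top-view ISO times a fixed ratio (assumed 2:1).
def parameters_consistent(side_params, top_params, iso_ratio=2.0):
    """side_params / top_params: lists of settings dicts, one per camera module."""
    all_params = side_params + top_params

    # All modules should use the same white balance for consistent color.
    white_balances = {p["white_balance"] for p in all_params}
    if len(white_balances) != 1:
        return False

    # Side-view ISO should equal top-view ISO multiplied by the fixed ratio.
    side_isos = {p["iso"] for p in side_params}
    top_isos = {p["iso"] for p in top_params}
    if len(side_isos) != 1 or len(top_isos) != 1:
        return False
    return side_isos.pop() == iso_ratio * top_isos.pop()


# Example: ISO 1600 on side-view modules, ISO 800 on top-view modules.
side = [{"white_balance": "daylight", "iso": 1600}] * 20
top = [{"white_balance": "daylight", "iso": 800}] * 2
print(parameters_consistent(side, top))  # True
```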
To determine the above hardware and/or software conditions, controller 220 may send an inquiry to each camera module 210, which then runs a self-check process in response to the inquiry and reports the checking results to controller 220. When the hardware and/or software conditions of a camera module 210 are determined to be normal, controller 220 may conclude that the camera module 210 is ready for recording image data and may instruct user interface 240 to indicate the same. Conversely, when at least one of the hardware or software conditions of the camera module 210 is determined to be abnormal, controller 220 may conclude that the camera module 210 is not suitable for recording image data and may instruct user interface 240 to display the same.
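The inquiry/self-check flow could be sketched, purely as an illustration, as follows; the report field names and the thresholds used are assumptions for the example rather than values taken from the disclosure.

```python
# Hypothetical sketch: each module returns a condition report, and the
# controller decides whether the whole array is ready for recording.
MIN_FREE_STORAGE_MB = 512   # assumed threshold
MAX_TEMPERATURE_C = 70      # assumed threshold


def self_check(report: dict) -> list:
    """Return a list of human-readable problems found in one module's report."""
    problems = []
    if not report.get("memory_card_present", False):
        problems.append("no memory card")
    if report.get("free_storage_mb", 0) < MIN_FREE_STORAGE_MB:
        problems.append("memory card almost full")
    if report.get("temperature_c", 0) > MAX_TEMPERATURE_C:
        problems.append("temperature too high")
    if report.get("crashed", False):
        problems.append("system crash")
    return problems


def array_ready(reports: dict) -> bool:
    """reports maps a camera name to its self-check report."""
    ready = True
    for name, report in reports.items():
        problems = self_check(report)
        if problems:
            ready = False
            print(f"{name}: " + ", ".join(problems))
    return ready


reports = {
    "Camera 01": {"memory_card_present": True, "free_storage_mb": 4096,
                  "temperature_c": 35, "crashed": False},
    "Camera 02": {"memory_card_present": False, "free_storage_mb": 0,
                  "temperature_c": 35, "crashed": False},
}
print("ready:", array_ready(reports))
```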
Still referring to
After seeing the condition causing the failure, the camera operator may perform various operations to fix the failure. For example, if the "no memory card" condition is indeed because a memory card is missing from Camera 02, the operator may insert a memory card into Camera 02. As another example, if Camera 02 already has a memory card and the "no memory card" condition is due to Camera 02's failure to recognize the memory card, the operator may perform operations such as unplugging and replugging the memory card, restarting system 200, restarting Camera 02 only, etc.
In some embodiments, user interface 240 may also display possible ways for the operator to fix the problems of a failed camera module 210. For example, referring to the “no memory card” condition, user interface 240 may display a list of possible solutions, including: “unplug and then replug the memory card,” “restart system 200,” “disconnect and then reconnect the power line,” etc.
User interface 240 may also provide an option for the camera operator to recheck a failed camera module 210. For example, still referring to
Referring back to
In the disclosed embodiments, exemplary operation parameters used in the photo recording mode may include:
Exemplary operation parameters used in the video recording mode may include:
Exemplary operation parameters used in the time-lapse video recording mode may include:
In the disclosed embodiments, controller 220 may instruct user interface 240 to switch among the photo recording mode, the video recording mode, and the time-lapse video recording mode, and allow the operator to select the desired operation parameters for each imaging mode. For example,
If the operator wants to enter and/or change the values of the operation parameters, the operator may press a setting button 242 and instruct controller 220 to display, via user interface 240, a page for setting the operation parameters.
After the operator selects a desired imaging mode and operation parameters used in the imaging mode, controller 220 may synchronize the selected imaging mode and operation parameters to some or all the camera modules 210. For example, referring to
The term “synchronizing” in this disclosure, when used in conjunction with “imaging mode” and/or “operation parameters,” refers to applying, by controller 220, the user-defined imaging mode and/or operation parameters to user-selected camera modules 210. In the disclosed embodiments, according to the specific use scenario, controller 220 may instruct different groups of camera modules 210 to use different imaging modes and/or different values for the operation parameters. Certainly, in some use cases, controller 220 may also apply the same imaging mode and/or the same operation parameters to all the camera modules 210. For example, referring to
Referring back to
As described above, the creation of high-quality VR content requires the user-defined imaging mode and operation parameters to be properly synchronized among camera modules 210. That is, the imaging mode and operation parameters used by different camera modules 210 should be precisely set as specified by the camera operator. As such, each time the operator changes the settings for camera modules 210, controller 220 may need to verify whether the new settings have been successfully synchronized among camera modules 210.
For example, to shoot a dynamic scene (e.g., a scene with moving objects or changing light intensity), it is critical to precisely synchronize the image capturing at the frame level or even sub-frame level among all the camera modules 210. Therefore, the operator may desire all the camera modules 210 to use the same shutter speed. Accordingly, controller 220 may verify whether the same shutter speed has been successfully synchronized among all the camera modules 210. If the mode/parameter synchronization is unsuccessful, controller 220 may display an alert/error message on user interface 240, indicating the mode/parameter synchronization has failed and prompting the operator to perform step 430 again. As another example, the operator may set the ISO used by side view camera modules 32 to be 1600 and the ISO used by top view camera modules 34 to be 800. After instructing side view camera modules 32 and top view camera modules 34 to set these ISO values, controller 220 may further check whether the ISOs have been set as instructed. If the ISOs are set correctly, controller 220 may proceed to step 450. Otherwise, controller 220 may return to step 430.
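A minimal sketch of this "synchronize, then verify" step, in Python and for illustration only, is shown below; the representation of a module as a plain settings dictionary and the setting names used are assumptions of the example.

```python
# Hypothetical sketch: push the desired settings to each selected module,
# read them back, and flag any module whose settings do not match.
def synchronize_and_verify(modules: dict, desired: dict) -> bool:
    """modules maps a camera name to its current settings dict."""
    for settings in modules.values():
        settings.update(desired)          # push the new settings to each module

    # Read back and compare against the requested values.
    failed = [name for name, settings in modules.items()
              if any(settings.get(k) != v for k, v in desired.items())]
    if failed:
        print("Synchronization failed for:", ", ".join(failed))
        return False
    return True


side_modules = {f"side {i:02d}": {} for i in range(1, 21)}
print(synchronize_and_verify(side_modules, {"shutter": "1/120", "iso": 1600}))
```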
Still referring to
Referring back to
In some embodiments, controller 220 may be configured to monitor a group of pre-selected key performance indicators (KPI) of camera modules 210. The KPIs are parameters capable of measuring the camera operation status. The KPIs may include but are not limited to: the operation parameters (e.g., shutter speed, clock signals, etc.) of camera modules 210; the operation temperature of camera modules 210; the SoC, SoH, and remaining time of the batteries used by camera modules 210; the remaining free storage space in the memory cards; the health of the memory cards (e.g., the temperature of a memory card, the writing speed and/or reading speed of a memory card, etc.); whether any camera module 210 has experienced a system crash; and whether any camera module 210 accidentally stopped the image recording.
For example, controller 220 may continuously monitor whether the imaging mode and/or operation parameters originally set in step 430 are maintained during the entire course of the image recording. If it is determined that one or more camera modules 210 unexpectedly change the originally set imaging mode and/or operation parameters, controller 220 may conclude an error has occurred.
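Purely as an illustration, such a periodic monitoring loop might be sketched in Python as follows; the KPI names, polling interval, and the FakeModule stand-in are assumptions for the example and not part of the disclosed system.

```python
# Hypothetical sketch: poll each module's KPIs and flag modules whose
# settings silently drifted from the values set in step 430.
import time
from dataclasses import dataclass, field


@dataclass
class FakeModule:
    name: str
    kpis: dict = field(default_factory=dict)

    def report_kpis(self) -> dict:
        return dict(self.kpis)


def monitor(modules, expected: dict, poll_seconds: float = 0.1, rounds: int = 3):
    for _ in range(rounds):                 # a real loop runs until recording stops
        for module in modules:
            kpis = module.report_kpis()
            drifted = {k: kpis.get(k) for k, v in expected.items()
                       if kpis.get(k) != v}
            if drifted:
                print(f"{module.name}: settings changed unexpectedly: {drifted}")
        time.sleep(poll_seconds)


mods = [FakeModule("Camera 01", {"shutter": "1/120"}),
        FakeModule("Camera 02", {"shutter": "1/60"})]
monitor(mods, {"shutter": "1/120"})
```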
As another example, each camera module 210 may include one or more temperature sensors for measuring the camera module 210's operation temperature and reporting the operation temperature to controller 220.
As another example, each camera module 210 may include a battery monitoring system for monitoring the SoC and/or SoH of the battery. The term "state of charge," as used in the present disclosure, refers to the remaining charge in the battery as compared to the amount of charge when the battery is fully charged. Therefore, the SoC may be expressed as a percentage of the fully charged state. In the disclosed embodiments, the battery monitoring system may monitor the output voltage of the battery, voltages of individual cells in the battery, current in and/or out of the battery, etc., to determine the SoC. The term "state of health," as used in the present disclosure, refers to one or more of a capacity of the battery, an internal resistance of the battery, self-discharge characteristics of the battery, and/or a cell temperature of the battery. Based on the SoC, the SoH, and/or the current load of the battery, the battery monitoring system may determine the battery remaining time. Alternatively, the battery monitoring system may send the detected SoC and/or SoH to controller 220 for determining the battery remaining time. For example, controller 220 may use a battery model or run a simulation to compute the remaining time based on the current SoC, the current SoH, the current tasks performed by camera modules 210, and the power specifications of camera modules 210.
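One simple way such an estimate could be computed is sketched below; the formula and the numeric figures are illustrative assumptions only and do not reflect any particular battery model used by the disclosed system.

```python
# Hypothetical sketch: estimate battery remaining time from SoC, SoH,
# rated capacity, and the current power draw of a camera module.
def remaining_minutes(soc: float, soh: float, rated_capacity_wh: float,
                      load_w: float) -> float:
    """soc and soh are fractions in [0, 1]; load_w is the current power draw in watts."""
    usable_wh = rated_capacity_wh * soh * soc   # energy left in the aged battery
    return 60.0 * usable_wh / load_w


# Example: 10 Wh rated battery at 90% health, 40% charged, drawing 5 W.
print(round(remaining_minutes(soc=0.4, soh=0.9,
                              rated_capacity_wh=10.0, load_w=5.0)))  # ~43 minutes
```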
In some embodiments, the operator may select and/or define, via user interface 240, the KPIs to be monitored by controller 220. For example, if camera modules 210 are connected to an external power source and do not rely on their own batteries, controller 220 may be relieved from the burden of monitoring the battery remaining time. In contrast, if camera modules 210 are powered by the batteries, the operator may set or add the battery remaining time as a KPI to be monitored by controller 220.
In step 470, when an abnormal operation status is detected, controller 220 may instruct camera modules 210 to stop the image recording and/or generate an alert message indicating the abnormal operation status.
In particular, when one or more of the monitored KPIs exceed predefined normal ranges, controller 220 may conclude an abnormal operation status has occurred. For example, camera modules 210 are designed to operate in a certain temperature range. When the operation temperature of a camera module 210 exceeds or drops below a predetermined temperature, controller 220 may conclude that the camera module 210 needs to be turned off immediately in order to avoid damage to the camera module 210.
For another example, when controller 220 detects that the remaining free storage space in a camera module 210's memory card has dropped below a predetermined percentage, e.g., 5%, of the total storage space, controller 220 may conclude that the memory card needs to be replaced with another memory card with more empty storage space.
Similarly, when detecting that the health of a camera module 210's memory card has deteriorated, controller 220 may conclude the camera module 210 has an abnormal operation status. Moreover, when it is detected that a camera module 210 accidentally stopped recording or had a system crash, controller 220 may conclude the operation status of the camera module 210 is abnormal.
For yet another example, when camera modules 210 are performing synchronized imaging, even if both the frame capturing and encoding in different camera modules 210 are initiated simultaneously, the synchronization may still be destroyed because of subsequent clock drifts in some or all of the camera modules 210. Consistent with the disclosed embodiments, the clock drifts may be avoided by using a common clock signal generated by a single crystal oscillator to drive all the camera modules 210. However, if in some embodiments each camera module 210 relies on its own crystal oscillator for providing the clock signal, the clock signal in one camera module 210 may gradually drift apart or desynchronize from the clock signal in another camera module 210, due to the frequency error inherent to each crystal oscillator and the discrepancy in accuracy across different crystal oscillators. Thus, in these embodiments, when detecting that the clock drift between any two camera modules 210 exceeds a predetermined level, controller 220 may conclude the imaging synchronization is lost and stop the image recording. Controller 220 may further remind the operator to restart the image recording, or automatically restart the image recording, such that the synchronization may be restored.
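As an illustration only, a drift check of this kind might be sketched in Python as follows; the drift threshold and the use of one module's clock as the reference are assumptions of the example, not parameters specified by the disclosure.

```python
# Hypothetical sketch: compare each module's clock reading against a reference
# module and declare synchronization lost once the worst offset exceeds a limit.
MAX_DRIFT_MS = 8.0   # assumed limit, roughly a quarter frame at 30 fps


def drift_exceeded(clock_readings_ms: dict) -> bool:
    """clock_readings_ms maps a camera name to its reported clock value (ms)."""
    values = list(clock_readings_ms.values())
    reference = values[0]                       # first module acts as the reference
    worst = max(abs(t - reference) for t in values)
    return worst > MAX_DRIFT_MS


clocks = {"Camera 01": 1000.0, "Camera 02": 1003.5, "Camera 03": 1012.0}
if drift_exceeded(clocks):
    print("Synchronization lost: stop recording and restart to resynchronize.")
```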
In some embodiments, when an abnormal operation status is detected, controller 220 may instruct all the camera modules 210 to stop recording image data. For example, many VR applications require all camera modules 210 to record the image data simultaneously. Thus, there is no need to keep the other camera modules 210 working when a camera module 210 has failed.
Additionally or alternatively, controller 220 may generate an alert message when an abnormal operation status is detected. In one embodiment, controller 220 may instruct the camera module 210 detected with the abnormal operation status to generate a warning sound, so as to alert the camera operator to the abnormality. Certainly, controller 220 may also be configured to broadcast the warning sound itself or flash a signal light. In another embodiment, controller 220 may instruct user interface 240 to indicate the detected abnormal operation status.
In some embodiments, controller 220 may decide whether to stop the image recording or generate an alert message based on the values of the detected KPIs. For example, when the battery remaining time of one or more camera modules 210 is below a first threshold, e.g., 20 minutes remaining or 10% of the fully charged state, controller 220 may alert the camera operator to stop the image recording. However, if this situation goes unattended and keeps worsening, such as when the battery remaining time drops below a second threshold, e.g., 5 minutes remaining or 5% of the fully charged state, controller 220 may automatically stop the image recording, to prevent an unintended shutdown from causing damage to the already-recorded image data, e.g., corrupting the saved image files. As such, the first threshold may be referred to as an "alert threshold," while the second threshold may be referred to as a "shutdown threshold."
As described above, controller 220 may monitor multiple KPIs simultaneously, and each KPI may have its respective alert threshold and shutdown threshold. By monitoring various KPIs and comparing the KPIs with their respective thresholds, controller 220 may form a comprehensive understanding of the operation statuses of camera modules 210, and quickly and accurately react to the exact situation happening in camera modules 210. For example, controller 220 may alert the operator to stop the image recording only when the remaining free storage space of a memory card is below an alert percentage of the total storage space, the operation temperature of a camera module 210 exceeds an alert temperature level, or the remaining time of a battery is below an alert time. Moreover, controller 220 may automatically stop the image recording only when the remaining free storage space of a memory card is below a shutdown percentage, the operation temperature of a camera module 210 exceeds a shutdown temperature level, or the remaining time of a battery is below a shutdown time.
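For illustration only, the two-threshold escalation could be expressed as in the following Python sketch; the KPI names and threshold numbers are assumptions chosen to mirror the examples above, not values prescribed by the disclosure.

```python
# Hypothetical sketch: each KPI has an alert threshold and a shutdown threshold,
# and the controller escalates from "alert" to "shutdown" as the KPI worsens.
THRESHOLDS = {
    # kpi name: (alert threshold, shutdown threshold, direction of badness)
    "battery_minutes": (20, 5, "below"),
    "free_storage_pct": (10, 5, "below"),
    "temperature_c": (65, 75, "above"),
}


def evaluate(kpis: dict) -> str:
    """Return 'ok', 'alert', or 'shutdown' for one module's KPI readings."""
    action = "ok"
    for name, value in kpis.items():
        alert, shutdown, direction = THRESHOLDS[name]
        worse = (lambda v, t: v < t) if direction == "below" else (lambda v, t: v > t)
        if worse(value, shutdown):
            return "shutdown"
        if worse(value, alert):
            action = "alert"
    return action


print(evaluate({"battery_minutes": 18, "free_storage_pct": 40, "temperature_c": 50}))  # alert
print(evaluate({"battery_minutes": 4, "free_storage_pct": 40, "temperature_c": 50}))   # shutdown
```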
Referring back to
For example, the camera operator may use hot button 45 to start or stop the recording of image data. If system 200 works under the photo recording mode, every time the operator presses hot button 45, controller 220 may instruct camera modules 210 to take a photo. If system 200 works under the video recording mode or the time-lapse video recording mode, when the operator presses hot button 45 while camera modules 210 are recording, controller 220 may instruct camera modules 210 to stop recording; and when the operator presses hot button 45 when camera modules 210 are not recording, controller 220 may instruct camera modules 210 to start recording again.
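A minimal sketch of how a single hot-button press might be interpreted depending on the current imaging mode and recording state is given below, for illustration only; the Recorder class and mode names are assumptions of the example.

```python
# Hypothetical sketch: interpret a hot-button press according to imaging mode
# and current recording state.
class Recorder:
    def __init__(self, mode: str):
        self.mode = mode          # "photo", "video", or "timelapse"
        self.recording = False

    def hot_button_pressed(self):
        if self.mode == "photo":
            print("take one photo on all modules")
        elif self.recording:
            self.recording = False
            print("stop recording on all modules")
        else:
            self.recording = True
            print("start recording on all modules")


r = Recorder("video")
r.hot_button_pressed()   # start recording
r.hot_button_pressed()   # stop recording
```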
As described above, some or all of the functions of controller 220 may be performed by mobile device 250. In some embodiments, mobile device 250 may be implemented as a remote control which pairs with controller 220 via Bluetooth® connection.
After remote control 251 successfully pairs with controller 220, the operator may use remote control 251 to control certain operations of camera modules 210. For example, when the operator presses first mode selector 253, remote control 251 may transmit a signal to controller 220 and cause controller 220 to instruct camera modules 210 to switch into the photo recording mode. Moreover, when the operator presses second mode selector 254, remote control 251 may similarly cause camera modules 210 to switch into the video recording mode. In one embodiment, when the operator presses both first mode selector 253 and second mode selector 254 simultaneously, remote control 251 may cause camera modules 210 to switch into the time-lapse video recording mode. After the imaging mode is selected, the operator may then press start button 252 to cause camera modules 210 to take photos, or to start or stop video recording.
In some embodiments, mobile device 250 may also be implemented as a remote control which pairs with controller 220 via a Wi-Fi network. For example, mobile device 250 may be a mobile phone with computing ability (e.g., a smart phone), a tablet computer, a laptop computer, a remote controller, a personal digital assistant (PDA), or a wearable device (e.g., a smart watch, a smart wrist band, Google Glass™, etc.).
Referring to
Moreover, user interface 257 may also display the hardware/software conditions of camera modules 210 before the image recording starts and the operation statuses of camera modules 210 after the image recording starts, in a way similar to user interface 240 (
In some embodiments, mobile phone 256 may also provide a preview of the scene before the image recording is started. For example, mobile phone 256 may send a request to controller 220 for previewing the scenes to be shot by one or more selected camera modules 210. Upon receiving the request, controller 220 may instruct the selected camera modules 210 to activate their Wi-Fi functions. Controller 220 may then serve as a Wi-Fi access point and connect the selected camera modules 210 to mobile phone 256. This way, the selected camera modules 210 may stream the image data to mobile phone 256 for image preview.
As is evident from the above detailed description, the disclosed camera-array control system provides a solution to centralize the controlling, checking, and monitoring of a camera array. In particular, the centralized control system makes it possible to synchronize the user settings for a large number of camera modules and the operations (e.g., initiating the recording of image data) performed by the camera modules, so as to improve the efficiency and accuracy of controlling the camera array. Also, the centralized control system facilitates the communications between the camera operator and the camera array, as well as communications within the camera array. These features are beneficial for creating high-quality VR content, which requires multiple camera modules to work in a coordinated manner. Moreover, the disclosed system uses a modular design and does not limit the number of camera modules to be controlled, and thus can be easily expanded. In addition, part or all of the control functions performed by the controller can be performed by various mobile devices, which enable remote control of the camera array and further improve the user experience.
Other embodiments of the present disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the present disclosure. This application is intended to cover any variations, uses, or adaptations of the present disclosure following the general principles thereof and including such departures from the present disclosure as come within known or customary practice in the art. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
It will be appreciated that the present invention is not limited to the exact constructions that are described above and illustrated in the accompanying drawings, and that various modifications and changes can be made without departing from the scope thereof. It is intended that the scope of the invention should only be limited by the appended claims.