An example embodiment relates generally to image processing, including the capture and delivery of panoramic images and videos, and more particularly to combining images captured by multiple cameras into a panoramic image, such as a 360° panorama.
Panoramic views, including 360° images and videos, are generated for a variety of purposes. For example, panoramic views may be utilized in conjunction with various immersive media applications, such as virtual reality systems. In such a virtual reality system, a viewer, such as a person using a head mounted display, may be presented with a panoramic view that presents content across a wider field of view than that offered by conventional video viewing systems. As such, and particularly in contexts where a 360° view is presented, the viewer may be more fully immersed in the scene represented by the panoramic view.
A panoramic view, such as a 360° image or video, may be captured using a plurality of cameras, such as planar sensors with fisheye lenses, where each camera has a different nodal point. Consequently, individual images captured by the various cameras typically contain parallax differentials and other differences that pose difficulties when attempting to combine multiple images into a single, 360° image or video.
The combination of individual images to form the panoramic image may be both processing intensive and time intensive. For example, currently available commercial solutions, such as the Google Jump system, rely on computational photography techniques that involve extensive processor resources, offline processing, or both to combine multiple images captured by a plurality of cameras into a single 360° image. As such, the availability of the resulting panoramic view is delayed by the requisite processing of the images, such that the panoramic image cannot be viewed in real time. Moreover, systems that require extensive image processing and the associated hardware necessary to perform such image processing to combine and blend the images captured by a plurality of cameras are of limited utility in many situations.
A method, apparatus and computer program product are therefore provided in accordance with an example embodiment in order to automatically place the seam between combined images used to generate a panoramic view, such as for utilization in conjunction with a virtual reality system, in a computationally efficient manner. In this regard, the method, apparatus and computer program product of an example embodiment provide for the selection of seam locations and scale factors used when combining adjacent images with overlapping portions in a more timely manner and with less intensive processing than at least some conventional systems. Thus, the resulting panoramic view may be more readily available and may be more widely utilized, such as by viewers of virtual reality systems, particularly in contexts where real time and/or near real time rendering of panoramic video views is desired.
In an example embodiment, a method is provided that includes receiving an image frame captured by a first camera and an image frame captured by a second camera, wherein the first camera and the second camera have different fields of view, wherein the different fields of view have a mutually overlapping portion. The method of this example embodiment also includes combining the image frame captured by the first camera and the image frame captured by the second camera into a composite image. The method of this example embodiment also includes determining a seam location and a scale factor for the image frame captured by the first camera and the image frame captured by the second camera. The method of this example embodiment also includes applying the seam location and the scale factor to a plurality of subsequent image frames captured by the first camera and a plurality of subsequent image frames captured by the second camera.
In some implementations of the method of an example embodiment, determining a seam location and a scale factor includes generating a plurality of scaled image frames by applying a plurality of scale factors to the image frame captured by at least one of the first camera or the second camera; computing an error measurement associated with the seam location for each scaled image frame from amongst the plurality of scaled image frames; and identifying the scale factor based upon the computed error measurement.
In some implementations of the method of an example embodiment, determining a seam location and a scale factor includes generating a plurality of scaled image frames by applying a plurality of scale factors to the image frame captured by at least one of the first camera or the second camera; dividing the mutually overlapping portion of the fields of view of the image frame captured by the first camera and the image frame captured by the second camera into a plurality of overlapping sections; computing an error measurement associated with each overlapping section of each scaled image frame from amongst the plurality of scaled image frames; and identifying the section and the scale factor based upon the computed error measurement.
In some implementations of the method of an example embodiment, the method also includes receiving a control signal associated with a trigger for determining a second seam location and a second scale factor for a second image frame captured by the first camera and a second image frame captured by the second camera and, in response to receiving the control signal, determining the second seam location and the second scale factor for the second image frame captured by the first camera and the second image frame captured by the second camera; and applying the second seam location and the second scale factor to a second plurality of subsequent image frames captured by the first camera and a second plurality of subsequent image frames captured by the second camera.
In some implementations of the method of an example embodiment, the method also includes detecting a set of data associated with motion of an image element, wherein the image element is located within a predetermined distance of the seam location. In some such implementations, the method also includes, in response to detecting the set of data associated with motion of the image element, determining a direction associated with the motion. In some such implementations, the method further includes shifting the seam location in a direction opposite the direction associated with the motion.
In another example embodiment, an apparatus is provided that includes at least one processor and at least one memory that includes computer program code with the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to at least receive an image frame captured by a first camera and an image frame captured by a second camera, wherein the first camera and the second camera have different fields of view, wherein the different fields of view have a mutually overlapping portion; combine the image frame captured by the first camera and the image frame captured by the second camera into a composite image; determine a seam location and a scale factor for the image frame captured by the first camera and the image frame captured by the second camera; and apply the seam location and the scale factor to a plurality of subsequent image frames captured by the first camera and a plurality of subsequent image frames captured by the second camera.
In some implementations of the apparatus of an example embodiment, the at least one memory and the computer program code are configured to, with the processor, cause the apparatus to determine a seam location and a scale factor by generating a plurality of scaled image frames by applying a plurality of scale factors to the image frame captured by at least one of the first camera or the second camera; computing an error measurement associated with the seam location for each scaled image frame from amongst the plurality of scaled image frames; and identifying the scale factor based upon the computed error measurement.
In some implementations of the apparatus of an example embodiment, the at least one memory and the computer program code are configured to, with the processor, cause the apparatus to determine a seam location and a scale factor by generating a plurality of scaled image frames by applying a plurality of scale factors to the image frame captured by at least one of the first camera or the second camera; dividing the mutually overlapping portion of the fields of view of the image frame captured by the first camera and the image frame captured by the second camera into a plurality of overlapping sections; computing an error measurement associated with each overlapping section of each scaled image frame from amongst the plurality of scaled image frames; and identifying the section and the scale factor based upon the computed error measurement.
In some implementations of the apparatus of an example embodiment, the at least one memory and the computer program code are configured to, with the processor, cause the apparatus to receive a control signal associated with a trigger for determining a second seam location and a second scale factor for a second image frame captured by the first camera and a second image frame captured by the second camera and, in response to receiving the control signal, determine the second seam location and the second scale factor for the second image frame captured by the first camera and the second image frame captured by the second camera; and apply the second seam location and the second scale factor to a second plurality of subsequent image frames captured by the first camera and a second plurality of subsequent image frames captured by the second camera.
In some implementations of the apparatus of an example embodiment, the at least one memory and the computer program code are configured to, with the processor, cause the apparatus to detect a set of data associated with motion of an image element, wherein the image element is located within a predetermined distance of the seam location. In some such implementations, the at least one memory and the computer program code are configured to, with the processor, cause the apparatus to, in response to detecting the set of data associated with motion of the image element, determine a direction associated with the motion. In some such implementations, the at least one memory and the computer program code are configured to, with the processor, cause the apparatus to shift the seam location in a direction opposite the direction associated with the motion.
In a further example embodiment, a computer program product is provided that includes at least one non-transitory computer-readable storage medium having computer-executable program code instructions stored therein with the computer-executable program code instructions including program code instructions configured to receive an image frame captured by a first camera and an image frame captured by a second camera, wherein the first camera and the second camera have different fields of view, wherein the different fields of view have a mutually overlapping portion; combine the image frame captured by the first camera and the image frame captured by the second camera into a composite image; determine a seam location and a scale factor for the image frame captured by the first camera and the image frame captured by the second camera; and apply the seam location and the scale factor to a plurality of subsequent image frames captured by the first camera and a plurality of subsequent image frames captured by the second camera.
In an implementation of the computer-executable program code instructions of an example embodiment, the program code instructions configured to determine a seam location and a scale factor include program code instructions configured to generate a plurality of scaled image frames by applying a plurality of scale factors to the image frame captured by at least one of the first camera or the second camera; compute an error measurement associated with the seam location for each scaled image frame from amongst the plurality of scaled image frames; and identify the scale factor based upon the computed error measurement.
In another implementation of the computer-executable program code instructions of an example embodiment, the program code instructions configured to determine a seam location and a scale factor comprise program code instructions configured to generate a plurality of scaled image frames by applying a plurality of scale factors to the image frame captured by at least one of the first camera or the second camera; divide the mutually overlapping portion of the fields of view of the image frame captured by the first camera and the image frame captured by the second camera into a plurality of overlapping sections; compute an error measurement associated with each overlapping section of each scaled image frame from amongst the plurality of scaled image frames; and identify the section and the scale factor based upon the computed error measurement.
In an implementation of the computer-executable program code instructions of an example embodiment, the computer-executable program code instructions further comprise program code instructions configured to receive a control signal associated with a trigger for determining a second seam location and a second scale factor for a second image frame captured by the first camera and a second image frame captured by the second camera and, in response to receiving the control signal, determine the second seam location and the second scale factor for the second image frame captured by the first camera and the second image frame captured by the second camera; and apply the second seam location and the second scale factor to a second plurality of subsequent image frames captured by the first camera and a second plurality of subsequent image frames captured by the second camera.
In an implementation of the computer-executable program code instructions of an example embodiment, the computer-executable program code instructions further comprise program code instructions configured to detect a set of data associated with motion of an image element, wherein the image element is located within a predetermined distance of the seam location. In some such implementations, the computer-executable program code instructions further comprise program code instructions configured to, in response to detecting the set of data associated with motion of the image element, determine a direction associated with the motion; and shift the seam location in a direction opposite the direction associated with the motion.
In yet another example embodiment, an apparatus is provided that includes means for receiving an image frame captured by a first camera and an image frame captured by a second camera, wherein the first camera and the second camera have different fields of view, wherein the different fields of view have a mutually overlapping portion; combining the image frame captured by the first camera and the image frame captured by the second camera into a composite image; determining a seam location and a scale factor for the image frame captured by the first camera and the image frame captured by the second camera; and applying the seam location and the scale factor to a plurality of subsequent image frames captured by the first camera and a plurality of subsequent image frames captured by the second camera.
In an implementation of the apparatus of an example embodiment, the apparatus further includes means for generating a plurality of scaled image frames by applying a plurality of scale factors to the image frame captured by at least one of the first camera or the second camera; computing an error measurement associated with the seam location for each scaled image frame from amongst the plurality of scaled image frames; and identifying the scale factor based upon the computed error measurement.
In another implementation of the apparatus of an example embodiment, the apparatus further includes means for generating a plurality of scaled image frames by applying a plurality of scale factors to the image frame captured by at least one of the first camera or the second camera; dividing the mutually overlapping portion of the fields of view of the image frame captured by the first camera and the image frame captured by the second camera into a plurality of overlapping sections; computing an error measurement associated with each overlapping section of each scaled image frame from amongst the plurality of scaled image frames; and identifying the section and the scale factor based upon the computed error measurement.
In an implementation of the apparatus of an example embodiment, the apparatus further includes means for receiving a control signal associated with a trigger for determining a second seam location and a second scale factor for a second image frame captured by the first camera and a second image frame captured by the second camera and, in response to receiving the control signal, determining the second seam location and the second scale factor for the second image frame captured by the first camera and the second image frame captured by the second camera; and applying the second seam location and the second scale factor to a second plurality of subsequent image frames captured by the first camera and a second plurality of subsequent image frames captured by the second camera.
In an implementation of the apparatus of an example embodiment, the apparatus further includes means for detecting a set of data associated with motion of an image element, wherein the image element is located within a predetermined distance of the seam location. In some such implementations, the apparatus further includes means for, in response to detecting the set of data associated with motion of the image element, determining a direction associated with the motion. In some such implementations, the apparatus further includes means for shifting the seam location in a direction opposite the direction associated with the motion.
Having thus described certain example embodiments of the present disclosure in general terms, reference will hereinafter be made to the accompanying drawings, which are not necessarily drawn to scale.
Some embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all, embodiments of the invention are shown. Indeed, various embodiments of the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like reference numerals refer to like elements throughout. As used herein, the terms “data,” “content,” “information,” and similar terms may be used interchangeably to refer to data capable of being transmitted, received and/or stored in accordance with embodiments of the present invention. Thus, use of any such terms should not be taken to limit the spirit and scope of embodiments of the present invention.
Additionally, as used herein, the term ‘circuitry’ refers to (a) hardware-only circuit implementations (e.g., implementations in analog circuitry and/or digital circuitry); (b) combinations of circuits and computer program product(s) comprising software and/or firmware instructions stored on one or more computer readable memories that work together to cause an apparatus to perform one or more functions described herein; and (c) circuits, such as, for example, a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation even if the software or firmware is not physically present. This definition of ‘circuitry’ applies to all uses of this term herein, including in any claims. As a further example, as used herein, the term ‘circuitry’ also includes an implementation comprising one or more processors and/or portion(s) thereof and accompanying software and/or firmware. As another example, the term ‘circuitry’ as used herein also includes, for example, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, other network device, and/or other computing device.
As defined herein, a “computer-readable storage medium,” which refers to a non-transitory physical storage medium (e.g., volatile or non-volatile memory device), can be differentiated from a “computer-readable transmission medium,” which refers to an electromagnetic signal.
A method, apparatus and computer program product are provided in accordance with an example embodiment in order to efficiently generate a panoramic view, such as for use in conjunction with virtual reality or other applications. In this regard, a panoramic view is generated by combining images captured by a plurality of cameras arranged in an array, such that portions of images captured by adjacent cameras within the array overlap with each other and may be stitched or otherwise combined together. Through the application of several techniques, the coordination between two adjacent images can be improved, resulting in an enhanced viewing experience for a viewer. Moreover, the panoramic view may be generated in an efficient manner, both in terms of the processing resources consumed during the generation of the panoramic view and the time required to generate the panoramic view. As a result, the panoramic view may, in some instances, be generated in real time or near real time relative to the capture of the images that at least partially comprise the panoramic view.
In some example implementations, the coordination of two adjacent images is achieved by searching a two-dimensional space of possible options to identify a configuration of seam location, scale factor, or both, that results in a minimum error for a given frame. In some contexts, including but not limited to contexts where the location of a seam between two adjacent images is fixed, a variety of convergence depths between two adjacent cameras are evaluated by applying a plurality of scale factors to an image and calculating an error associated with each scale factor. The scale factor associated with the minimum error is then selected, and applied to subsequent frames captured by the particular adjacent cameras. In some contexts, including but not limited to contexts where the location of a seam between two adjacent images is not fixed, the overlapping area of images captured by the adjacent cameras is divided into a series of columns or other sections, and an error associated with each column is calculated. The column or other section with the minimum error is then selected as the seam location, and applied to subsequent frames captured by the particular adjacent cameras. In some contexts, including but not limited to contexts where motion is detected in content near a seam location, the seam location can be moved on a per-frame basis in response to the motion. Regardless of the context in which the seam location and/or scale factor is selected and applied, the selection and application of a seam location and/or scale factor may be performed and/or re-performed in response to a manual and/or automatic trigger.
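By way of a concrete illustration of the error being minimized in this search — the helper names, the nearest-neighbour rescaling, and the sum-of-absolute-differences (SAD) metric below are assumptions made for the sketch rather than details prescribed by this disclosure — the computation may proceed along the following lines in Python:

```python
import numpy as np

def scale_frame(frame: np.ndarray, factor: float) -> np.ndarray:
    """Rescale a frame about its centre using nearest-neighbour sampling,
    keeping the output the same size as the input (illustrative only)."""
    h, w = frame.shape[:2]
    ys = np.clip(((np.arange(h) - h / 2.0) / factor + h / 2.0).astype(int), 0, h - 1)
    xs = np.clip(((np.arange(w) - w / 2.0) / factor + w / 2.0).astype(int), 0, w - 1)
    return frame[np.ix_(ys, xs)]

def sad(a: np.ndarray, b: np.ndarray) -> float:
    """Sum of absolute differences between two equally sized image regions."""
    return float(np.abs(a.astype(np.float64) - b.astype(np.float64)).sum())
```

A candidate configuration of seam location and scale factor can then be scored by rescaling one frame's overlap region and evaluating sad() over the pixels adjoining the candidate seam, with the lowest-scoring configuration selected; the later sketches in this description reuse these two helpers.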
Some example implementations contemplate the use of devices suitable for capturing images used in virtual reality and other immersive content environments, such as Nokia's OZO system, where multiple cameras are placed in an array such that each camera is aimed in a particular direction to capture a particular field of view. Particularly in contexts involving live stitching, it is necessary to stitch the images received from each camera in real time or near real time. In such a scenario, solutions for real time or near real time stitching involve the use of camera calibration data. The camera calibration data can be used to determine the general placement of each camera and to generate a transformation matrix that can be used to stitch multiple images together to form a panoramic view, such as a 360° image. Camera calibration is typically performed with respect to scene content at an effectively infinite distance. As a result, objects located relatively near the camera(s) will be subject to parallax effects, which compound the difficulty associated with stitching the images together. While such stitching may be accomplished using time-intensive and processor-resource-intensive techniques, such techniques are incompatible with the timing requirements associated with live stitching and/or other practical considerations associated with the camera array and its processing capabilities. In contrast, the techniques disclosed herein are viable in live stitching contexts, particularly in resource-constrained situations. Moreover, implementations of the techniques disclosed herein have significantly reduced visible stitching errors over a wider range of input than conventional techniques, while achieving real time or near real time performance at typical resolutions on reasonable hardware.
The panoramic view that is generated in accordance with an example embodiment of the present invention is based upon images captured by at least two cameras. In the embodiment depicted in
As shown in
While the embodiment depicted in
Based upon the images captured by the cameras 10, a panoramic view is generated. In this regard, the panoramic view may be generated by an apparatus 20 as depicted in
Regardless of the manner in which the apparatus 20 is embodied, the apparatus of an example embodiment is configured to include or otherwise be in communication with a processor 22 and a memory device 24 and optionally the user interface 26 and/or a communication interface 28. In some embodiments, the processor (and/or co-processors or any other processing circuitry assisting or otherwise associated with the processor) may be in communication with the memory device via a bus for passing information among components of the apparatus. The memory device may be non-transitory and may include, for example, one or more volatile and/or non-volatile memories. In other words, for example, the memory device may be an electronic storage device (e.g., a computer readable storage medium) comprising gates configured to store data (e.g., bits) that may be retrievable by a machine (e.g., a computing device like the processor). The memory device may be configured to store information, data, content, applications, instructions, or the like for enabling the apparatus to carry out various functions in accordance with an example embodiment of the present invention. For example, the memory device could be configured to buffer input data for processing by the processor. Additionally or alternatively, the memory device could be configured to store instructions for execution by the processor.
As described above, the apparatus 20 may be embodied by a computing device. However, in some embodiments, the apparatus may be embodied as a chip or chip set. In other words, the apparatus may comprise one or more physical packages (e.g., chips) including materials, components and/or wires on a structural assembly (e.g., a baseboard). The structural assembly may provide physical strength, conservation of size, and/or limitation of electrical interaction for component circuitry included thereon. The apparatus may therefore, in some cases, be configured to implement an embodiment of the present invention on a single chip or as a single “system on a chip.” As such, in some cases, a chip or chipset may constitute means for performing one or more operations for providing the functionalities described herein.
The processor 22 may be embodied in a number of different ways. For example, the processor may be embodied as one or more of various hardware processing means such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing element with or without an accompanying DSP, or various other processing circuitry including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like. As such, in some embodiments, the processor may include one or more processing cores configured to perform independently. A multi-core processor may enable multiprocessing within a single physical package. Additionally or alternatively, the processor may include one or more processors configured in tandem via the bus to enable independent execution of instructions, pipelining and/or multithreading.
In an example embodiment, the processor 22 may be configured to execute instructions stored in the memory device 24 or otherwise accessible to the processor. Alternatively or additionally, the processor may be configured to execute hard coded functionality. As such, whether configured by hardware or software methods, or by a combination thereof, the processor may represent an entity (e.g., physically embodied in circuitry) capable of performing operations according to an embodiment of the present invention while configured accordingly. Thus, for example, when the processor is embodied as an ASIC, FPGA or the like, the processor may be specifically configured hardware for conducting the operations described herein. Alternatively, as another example, when the processor is embodied as an executor of software instructions, the instructions may specifically configure the processor to perform the algorithms and/or operations described herein when the instructions are executed. However, in some cases, the processor may be a processor of a specific device (e.g., a pass-through display or a mobile terminal) configured to employ an embodiment of the present invention by further configuration of the processor by instructions for performing the algorithms and/or operations described herein. The processor may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor.
In some embodiments, the apparatus 20 may optionally include a user interface 26 that may, in turn, be in communication with the processor 22 to provide output to the user and, in some embodiments, to receive an indication of a user input. As such, the user interface may include a display and, in some embodiments, may also include a keyboard, a mouse, a joystick, a touch screen, touch areas, soft keys, a microphone, a speaker, or other input/output mechanisms. Alternatively or additionally, the processor may comprise user interface circuitry configured to control at least some functions of one or more user interface elements such as a display and, in some embodiments, a speaker, ringer, microphone and/or the like. The processor and/or user interface circuitry comprising the processor may be configured to control one or more functions of one or more user interface elements through computer program instructions (e.g., software and/or firmware) stored on a memory accessible to the processor (e.g., memory device 24, and/or the like).
The apparatus 20 may optionally also include the communication interface 28. The communication interface may be any means such as a device or circuitry embodied in either hardware or a combination of hardware and software that is configured to receive and/or transmit data from/to a network and/or any other device or module in communication with the apparatus. In this regard, the communication interface may include, for example, an antenna (or multiple antennas) and supporting hardware and/or software for enabling communications with a wireless communication network. Additionally or alternatively, the communication interface may include the circuitry for interacting with the antenna(s) to cause transmission of signals via the antenna(s) or to handle receipt of signals received via the antenna(s). In some environments, the communication interface may alternatively or also support wired communication. As such, for example, the communication interface may include a communication modem and/or other hardware/software for supporting communication via cable, digital subscriber line (DSL), universal serial bus (USB) or other mechanisms.
Referring now to
The apparatus includes means, such as the processor 22, the memory 24, the communication interface 28 or the like, for receiving an image frame captured by a first camera and an image frame captured by a second camera, wherein the first camera and the second camera have different fields of view, wherein the different fields of view have a mutually overlapping portion. For example, and with reference to block 32 of
The apparatus also includes means, such as the processor 22, the memory 24, the communication interface 28 or the like, for combining the image frame captured by the first camera and the image frame captured by the second camera into a composite image. For example, and with reference to block 34 of
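As a minimal sketch of such a combination — assuming rectified frames whose overlap occupies the trailing columns of one frame and the leading columns of its neighbour, an assumption made for illustration rather than drawn from this disclosure — two frames can be joined by cutting at a chosen column within the overlap:

```python
def composite_at_seam(left: np.ndarray, right: np.ndarray,
                      overlap: int, seam_x: int) -> np.ndarray:
    """Join two frames whose last/first `overlap` columns depict the same
    scene content, cutting at column `seam_x` (0..overlap) of the overlap."""
    left_part = left[:, :left.shape[1] - overlap + seam_x]
    right_part = right[:, seam_x:]
    return np.concatenate([left_part, right_part], axis=1)
```

The output width is the sum of the two frame widths minus the overlap, so the composite covers the union of the two fields of view exactly once.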
When the neighboring images are selected and combined, such as in example implementations of block 34, differences between the images in their mutually overlapping portions may be visible to a viewer and/or otherwise undesirable. Consequently, establishing and positioning a seam between the two images that minimizes such differences is desirable, and can improve the experience of a viewer, particularly a viewer who is seeking an immersive experience associated with a virtual reality viewing system. In some contexts, the seam established between two neighboring images will be fixed in a predetermined position with respect to the images for all such neighboring frames. However, in other contexts, the location of the seam will not be fixed in a particular location for all neighboring frames, and can be set, such as by apparatus 20, on a frame-by-frame basis or in accordance with any other protocol. Regardless of whether the seam is in a fixed location or not, the seam itself may take any of a number of configurations. For example, a seam may be configured as a vertical line. In other examples, the seam may take the form of an arc or any other shape, including but not limited to an optimized shape.
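For a non-vertical seam, the same cut can be expressed as a per-row offset. The following sketch (the per-row representation is an illustrative assumption) generalizes composite_at_seam above to an arbitrary seam path such as an arc:

```python
def composite_along_path(left: np.ndarray, right: np.ndarray,
                         overlap: int, seam_path) -> np.ndarray:
    """Cut along a non-vertical seam: seam_path[y] gives, for each row y,
    the x-offset within the overlap at which to switch between images."""
    out = composite_at_seam(left, right, overlap, 0)
    base = left.shape[1] - overlap
    for y in range(left.shape[0]):
        out[y, :base + seam_path[y]] = left[y, :base + seam_path[y]]
    return out
```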
The apparatus 20 also includes means, such as the processor 22, the memory 24, the communication interface 28 or the like, for determining a seam location and a scale factor for the image frame captured by the first camera and the image frame captured by the second camera. For example, and with reference to
In some example implementations, the apparatus 20 also includes means, such as the processor 22, memory 24, the communication interface 28, or the like, for generating a plurality of scaled image frames by applying a plurality of scale factors to the image frame captured by at least one of the first camera or the second camera. For example, and with reference to block 38 of
While camera 52 and camera 54 are arranged such that there is an overlapping portion of their respective fields of view and the images captured by camera 52 and camera 54 have mutually overlapping portions, the orientation and/or configuration of camera 52 and/or camera 54 may be such that the appearance of image elements common to images captured by camera 52 and camera 54 may be subject to parallax, differences in size, and other visibly perceptible differences. As shown in
With reference again to block 38 in
The apparatus 20 also includes means, such as the processor 22, memory 24, the communication interface 28, or the like, for identifying the scale factor based upon the computed error measurement. For example, and with reference to blocks 38 and 40 of
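Continuing the sketch begun above, a fixed-seam implementation of this determination might evaluate a small set of candidate scale factors and retain the one with the lowest error near the seam (the candidate set and the eight-pixel band in the usage example are arbitrary assumptions):

```python
def select_scale(ref_overlap: np.ndarray, mov_overlap: np.ndarray, seam_slice,
                 candidates=(0.90, 0.95, 1.00, 1.05, 1.10)):
    """Try each candidate scale factor on one frame's overlap region and keep
    the factor whose pixels best match the other frame near the fixed seam."""
    errors = {s: sad(ref_overlap[:, seam_slice],
                     scale_frame(mov_overlap, s)[:, seam_slice])
              for s in candidates}
    return min(errors, key=errors.get)
```

Under these assumptions, select_scale(left_ov, right_ov, slice(60, 68)) would return the factor minimizing the SAD within an eight-pixel band around a seam fixed at columns 60-67 of the overlap.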
As shown in block 48, the apparatus 20 also includes means, such as the processor 22, memory 24, the communication interface 28, or the like, for determining whether there are any additional cameras in a camera array for which a scale factor has not been calculated. As shown in
The apparatus 20 includes means, such as the processor 22, memory 24, the communication interface 28, or the like, for applying the seam location and the scale factor to a plurality of subsequent image frames captured by the first camera and a plurality of subsequent image frames captured by the second camera. For example, and with reference to block 50 of
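Once determined, the seam location and scale factor are inexpensive to apply: subsequent frames are simply rescaled and cut with the cached values rather than re-searched. A hypothetical per-frame loop, reusing the helpers sketched above:

```python
def stitch_stream(frame_pairs, overlap: int, seam_x: int, scale: float):
    """Apply a previously determined seam location and scale factor to a
    stream of (left, right) frame pairs without re-running the search."""
    for left, right in frame_pairs:
        yield composite_at_seam(left, scale_frame(right, scale),
                                overlap, seam_x)
```

This is what makes the approach attractive for live stitching: the search cost is amortized over many frames, and the per-frame work reduces to a rescale and a concatenation.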
In contexts and/or example implementations where the seam location is not fixed, process 30 in
Unlike example implementations that arise in contexts where the seam location between two images is fixed in advance, the computation of error associated with each scaled image need not be tied to a single region associated with the seam. The apparatus 20 also includes means, such as the processor 22, memory 24, the communication interface 28, or the like, for dividing the mutually overlapping portion of the fields of view of the image frame captured by the first camera and the image frame captured by the second camera into a plurality of overlapping sections. The apparatus 20 also includes means, such as the processor 22, memory 24, the communication interface 28, or the like, for computing an error measurement associated with each overlapping section of each scaled image frame from amongst the plurality of scaled image frames. For example, and as shown in block 44 of
The apparatus 20 also includes means, such as the processor 22, memory 24, the communication interface 28, or the like, for identifying the section and the scale factor based upon the computed error measurement. For example, and as shown in block 46 of
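A joint selection of section and scale factor might then be sketched as a nested search over the two variables, again reusing the helpers above (the column count and candidate scales are illustrative assumptions):

```python
def select_column_and_scale(left_ov: np.ndarray, right_ov: np.ndarray,
                            num_columns: int = 8,
                            candidates=(0.90, 0.95, 1.00, 1.05, 1.10)):
    """Divide the overlap into vertical columns and jointly pick the
    (seam column, scale factor) pair with the minimum matching error."""
    h, w = left_ov.shape[:2]
    edges = np.linspace(0, w, num_columns + 1).astype(int)
    best, best_err = None, float("inf")
    for s in candidates:
        scaled = scale_frame(right_ov, s)
        for c in range(num_columns):
            err = sad(left_ov[:, edges[c]:edges[c + 1]],
                      scaled[:, edges[c]:edges[c + 1]])
            if err < best_err:
                best, best_err = (int(edges[c]), s), err
    return best  # (seam x-offset within the overlap, scale factor)
```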
Referring now to
The apparatus also includes means, such as the processor 22, the memory 24, the communication interface 28 or the like, for, in response to receiving the control signal, determining the second seam location and the second scale factor for the second image frame captured by the first camera and the second image frame captured by the second camera. For example, and as depicted in block 64 of
The apparatus 20 also includes means, such as the processor 22, the memory 24, the communication interface 28 or the like, for applying the second seam location and the second scale factor to a second plurality of subsequent image frames captured by the first camera and a second plurality of subsequent image frames captured by the second camera. For example, as shown in
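One way such trigger handling could be organized — a sketch only; the class, its state, and the triggered flag are our own assumptions rather than structures named by this disclosure — is to cache the current seam and scale for a camera pair and recompute them only when a control signal arrives:

```python
class SeamState:
    """Cache the current seam location and scale factor for a camera pair,
    recomputing them only when a (manual or automatic) trigger fires."""
    def __init__(self, overlap: int):
        self.overlap = overlap
        self.seam_x, self.scale = None, None

    def on_frame(self, left: np.ndarray, right: np.ndarray,
                 triggered: bool) -> np.ndarray:
        if triggered or self.seam_x is None:
            left_ov = left[:, -self.overlap:]
            right_ov = right[:, :self.overlap]
            self.seam_x, self.scale = select_column_and_scale(left_ov, right_ov)
        return composite_at_seam(left, scale_frame(right, self.scale),
                                 self.overlap, self.seam_x)
```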
Implementations of process 60 may be particularly advantageous in situations where the position of a camera within an array and/or an entire camera array changes, such that a previously calculated seam location and/or scale factor may cease to be optimal or acceptable. Moreover, in situations where the trigger may be generated by a viewer of a 360° video stream, the recalculation of the seam location and/or scale factor may be particularly beneficial where the viewer is focused on content at or near the seam location, such that the recalculation and/or relocation of the seam may improve the viewer's experience.
Referring now to
The apparatus 20 may also include means, such as the processor 22, the memory 24, and the communication interface 28, or the like, for, in response to detecting the set of data associated with motion of the image element, determining a direction associated with the motion and, in some instances, shifting the seam location in a direction opposite the direction associated with the motion. For example, and with reference to block 78 of
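A hypothetical realization of this motion-aware behaviour, continuing the sketch above (the threshold, window, and step sizes are arbitrary tuning assumptions, and the direction estimate is deliberately crude):

```python
MOTION_THRESHOLD = 1e4  # assumed tuning constant, not from this disclosure

def shift_seam_away_from_motion(seam_x: int, prev_ov: np.ndarray,
                                curr_ov: np.ndarray, overlap_w: int,
                                window: int = 8, step: int = 4) -> int:
    """If frame-to-frame change is detected near the seam, estimate the
    horizontal direction of motion and move the seam the opposite way."""
    lo = max(seam_x - window, 0)
    hi = min(seam_x + window, overlap_w)
    if sad(prev_ov[:, lo:hi], curr_ov[:, lo:hi]) < MOTION_THRESHOLD:
        return seam_x  # nothing moving near the seam
    # Does the current band better match the previous band shifted left
    # or shifted right by one pixel?
    left_err = sad(prev_ov[:, lo + 1:hi], curr_ov[:, lo:hi - 1])
    right_err = sad(prev_ov[:, lo:hi - 1], curr_ov[:, lo + 1:hi])
    moving_right = right_err < left_err
    new_x = seam_x - step if moving_right else seam_x + step
    return max(0, min(new_x, overlap_w))
```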
As shown in
As described above,
Accordingly, blocks of the flowcharts support combinations of means for performing the specified functions and combinations of operations for performing the specified functions. It will also be understood that one or more blocks of the flowcharts, and combinations of blocks in the flowcharts, can be implemented by special purpose hardware-based computer systems which perform the specified functions, or combinations of special purpose hardware and computer instructions.
In some embodiments, certain ones of the operations above may be modified or further amplified. Furthermore, in some embodiments, additional optional operations may be included. Modifications, additions, or amplifications to the operations above may be performed in any order and in any combination.
Many modifications and other embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these inventions pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the inventions are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Moreover, although the foregoing descriptions and the associated drawings describe example embodiments in the context of certain example combinations of elements and/or functions, it should be appreciated that different combinations of elements and/or functions may be provided by alternative embodiments without departing from the scope of the appended claims. In this regard, for example, different combinations of elements and/or functions than those explicitly described above are also contemplated as may be set forth in some of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.
The present application claims the benefit of U.S. Provisional Patent Application Ser. No. 62/356,355 which was filed on Jun. 29, 2016 and titled METHODS AND APPARATUS FOR AUTOMATED PLACEMENT OF A SEAM IN A PANORAMIC IMAGE DERIVED FROM MULTIPLE CAMERAS, the entire content of which is incorporated by reference herein for all purposes.