The present disclosure relates to imaging systems and methods that include a multi-camera array. In particular, the disclosure relates to systems and methods that enable low-profile imaging systems and mobile devices while maintaining or improving image quality.
Many mobile devices, such as mobile phones and tablet computing devices, include cameras that may be operated by a user to capture still and/or video images. Because the mobile devices are typically designed to be relatively small, it can be important to design the cameras or imaging systems to be as thin as possible in order to maintain a low-profile mobile device. Folded optic image sensor arrays (“array cameras”) allow for the creation of low-profile image capture devices without shortening the focal length or decreasing the resolution of the image across the sensor array's field of view. By redirecting light toward each sensor in the array using a primary and secondary surface, and by positioning the lens assemblies used to focus the incoming light between the primary and secondary surfaces, the sensor array may be positioned on a flat substrate perpendicular to the lens assemblies. The longer focal length makes it possible to implement features such as optical zoom and to incorporate more complicated optics that require more space than commonly afforded by the traditional mobile camera, such as adding more optical elements.
Some array cameras employ a central mirror or prism with multiple facets to split incoming light comprising the target image into multiple portions for capture by the sensors in the array, wherein each facet directs a portion of the light from the target image toward a sensor in the array. Each portion of the split light may be passed through a lens assembly and reflected off of a surface positioned directly above or below a sensor, such that each sensor captures a portion of the image. The sensor fields of view can overlap to assist in stitching together the captured portions into a complete image.
The folded optic sensor arrays and image capture techniques described herein allow for the creation of low-profile image capture devices without shortening the focal length or decreasing the resolution of the image across the sensor array's field of view, wherein the captured images may be free of parallax and tilt artifacts. A challenge of existing array cameras is the quality degradation due to parallax and/or tilt between different views of the same object as seen from different cameras of the array. Parallax prevents seamless stitching of the images captured by each camera into a final image completely free of artifacts. Camera views can partially overlap (for example, by approximately 20%). Depending on depth (for example, the distance from lens to object), the image from one camera can be shifted relative to the image from another camera. The resulting parallax and tilt can cause “double image” ghosting in the image area corresponding to the overlapping fields of view when the images are stitched or fused together. Even if the array is structured such that there is no overlap in sensor fields of view, parallax results in discontinuous features in the image, such as lines and edges, when such features cross over the borders between sensor fields of view.
The above-described problems, among others, are addressed in some embodiments by the array cameras free (or substantially free) of parallax and tilt artifacts as described herein. Some of the embodiments may employ a central mirror or prism, for example with multiple surfaces or facets, to split incoming light comprising the target image into multiple portions for capture by the sensors in the array. The mirror surfaces and surrounding cameras can be configured to avoid causing parallax and tilt artifacts in a captured image. For example, the planes formed by the mirror surfaces or prism facets may all intersect at a common point, referred to as the apex, which may be along the vertical axis of symmetry of the array in some embodiments. The cameras can be positioned so that the optical axis of each camera is aligned with or intersects with the apex. The optical axis of a camera can intersect with both a center of projection of its lens assembly and the apex. Accordingly, the synthetic aperture (the sum of all views of the cameras in the array) can have a virtual optical axis passing through the apex. In addition, each camera can be positioned such that the angle formed between the camera optical axis and the virtual optical axis is twice the angle formed between the corresponding mirror surface and the virtual optical axis. However, these angles do not have to be the same for all cameras in the array. Accordingly, in some embodiments the apex may not be along the vertical axis of symmetry of the array. Further, the distance between the apex and the center of projection (located within the lens corresponding to a sensor) can be the same for all the cameras in the array. Accordingly, the views of the cameras in the array can seamlessly merge into a single image free of parallax and tilt artifacts.
Each portion of the split light may be passed through a lens assembly and reflected off of an optional additional reflective surface positioned directly above or below a sensor, such that each sensor captures a portion of the image. In some circumstances, each sensor in the array may capture a portion of the image which overlaps slightly with the portions captured by neighboring sensors in the array, and these portions may be assembled into the target image, for example by linear blending or other image stitching techniques. In some examples, each sensor can be positioned off-center from the optical axis of its lens assembly in order to capture a wider field of view.
One aspect relates to an imaging system comprising a reflecting component including a plurality of primary light redirecting surfaces, the reflecting component comprising an apex at a location of an intersection of planes formed by each of the plurality of primary light redirecting surfaces; and a plurality of cameras, each of the plurality of cameras having an optical axis, the plurality of cameras arranged to each receive light redirected from one of the primary light redirecting surfaces of the reflecting component and such that the optical axis of each of the plurality of cameras is aligned to intersect with the apex of the reflecting component.
Another aspect relates to a method of manufacturing a folded optic array camera substantially free of parallax and tilt artifacts, the method comprising providing a reflecting component including a plurality of primary light redirecting surfaces, the reflecting component comprising an apex at a location of an intersection of planes formed by each of the plurality of primary light redirecting surfaces; and, for each camera of a plurality of cameras positioned around the reflecting component, positioning a lens assembly to receive a portion of light representing a target image scene from one of the plurality of primary light redirecting surfaces, the lens assembly having an optical axis, and positioning the lens assembly such that the optical axis is aligned to intersect with the apex.
Another aspect relates to an image capture apparatus comprising means for splitting light representing a target image scene into a plurality of portions and redirecting each of the plurality of portions in a different direction; means for focusing each of the plurality of portions of light; and means for capturing each of the plurality of portions of light after being focused; the means for splitting light, means for focusing, and means for capturing positioned according to a predetermined spatial relationship in order to reduce or eliminate parallax and tilt artifacts between images generated based on the plurality of portions of light.
Another aspect relates to a method of forming an array camera substantially free of parallax and tilt artifacts, the method comprising, for each camera of a plurality of cameras positioned in an array having a vertical axis of symmetry: selecting a first location for an image sensor; selecting a second location for a primary light directing surface such that a plane formed by the primary light directing surface intersects with an apex point, the second location selected such that the primary light directing surface directs a portion of light representing a target image scene toward the image sensor; and selecting a third location for a center of projection of a lens assembly positioned between the sensor and the primary light directing surface, the third location selected such that the optical axis of the camera intersects with the apex point; said method performed programmatically by one or more computing devices.
The disclosed aspects will hereinafter be described in conjunction with the appended drawings and appendices, provided to illustrate and not to limit the disclosed aspects, wherein like designations denote like elements.
I. Introduction
Implementations disclosed herein provide systems, methods and apparatus for generating images substantially free of parallax and tilt artifacts using an array camera with folded optics. Aspects of the present invention relate to an array camera exhibiting little or no parallax artifacts in the captured images. For example, the planes of the central mirror pyramid or prism of the array camera can intersect at a common point (referred to as an “apex”). The apex can serve as a point of intersection for the optical axes of the cameras in the array, as well as a point of intersection with the virtual optical axis. Each camera in the array “sees” a portion of the image scene using a corresponding facet of the central mirror prism, and accordingly each individual camera/mirror pair represents only a sub-aperture of the total array camera. The complete array camera has a synthetic aperture generated based on the sum of all individual aperture rays, that is, based on stitching together the images generated by the sub-apertures. Each camera can include a sensor and a lens assembly, the lens assembly having a center of projection located along the camera optical axis, and may optionally include a secondary light redirecting surface between the sensor and lens assembly. The sensor may be positioned off-center from the optical axis to capture more light from the image scene.
In the following description, specific details are given to provide a thorough understanding of the examples. However, the examples may be practiced without these specific details.
II. Overview of Folded Optic Array Cameras
The sensors 105, 125 may be mounted on the substrate 150.
In some embodiments in which the receiving sensors are each an array of a plurality of sensors, the central reflective surface may be made of multiple reflective surfaces angled relative to one another in order to send a different portion of the target image scene toward each of the sensors. Each sensor in the array may have a substantially different field of view, and in some embodiments the fields of view may overlap. Certain embodiments of the central reflective surface may have complicated non-planar surfaces to increase the degrees of freedom when designing the lens system. Further, although the central surface is discussed as being a reflective surface, in other embodiments the central surface may be refractive. For example, the central surface may be a prism configured with a plurality of facets, where each facet directs a portion of the light comprising the scene toward one of the sensors.
After being reflected off the central reflective surface 120, the light may propagate through the lens assemblies 115, 130.
In some embodiments, each lens assembly may comprise one or more lenses and an actuator for moving the lens among a plurality of different lens positions through a housing. The actuator may be a voice coil motor (VCM), micro-electronic mechanical system (MEMS), or a shape memory alloy (SMA). The lens assembly may further comprise a lens driver for controlling the actuator.
Traditional auto focus techniques may be implemented by changing the focal length between the lens 115, 130 and corresponding sensor 105, 125 of each camera. In some embodiments, this may be accomplished by moving a lens barrel. Other embodiments may adjust the focus by moving the central mirror up or down or by adjusting the angle of the mirror relative to the lens assembly. Certain embodiments may adjust the focus by moving the side mirrors over each sensor. Such embodiments may allow the assembly to adjust the focus of each sensor individually. Further, it is possible for some embodiments to change the focus of the entire assembly at once, for example by placing a lens like a liquid lens over the entire assembly. In certain implementations, computational photography may be used to change the focal point of the camera array.
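As a concrete illustration of how such a focus mechanism might be driven — a minimal, hypothetical Python sketch, not a method specified by this disclosure — contrast-based autofocus sweeps the actuator through candidate lens positions and keeps the one that maximizes a sharpness metric. The `set_lens_position` and `capture_frame` hooks below stand in for the lens driver and sensor readout and are assumptions:

```python
import cv2

def sharpness(gray):
    # Variance of the Laplacian: a common contrast/focus metric.
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def contrast_autofocus(set_lens_position, capture_frame, positions):
    # set_lens_position and capture_frame are hypothetical hooks into the
    # actuator (e.g., a VCM driven through the lens driver) and the sensor.
    best_pos, best_score = None, float("-inf")
    for pos in positions:
        set_lens_position(pos)
        gray = cv2.cvtColor(capture_frame(), cv2.COLOR_BGR2GRAY)
        score = sharpness(gray)
        if score > best_score:
            best_pos, best_score = pos, score
    set_lens_position(best_pos)  # return to the sharpest position found
    return best_pos
```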
Each sensor's field of view 140, 145 may be steered into the object space by the surface of the central mirror 120 associated with that sensor. Mechanical methods may be employed to tilt the mirrors and/or move the prisms in the array so that the field of view of each camera can be steered to different locations on the object field. This may be used, for example, to implement a high dynamic range camera, to increase the resolution of the camera system, or to implement a plenoptic camera system. Each sensor's (or each 3×1 array's) field of view may be projected into the object space, and each sensor may capture a partial image comprising a portion of the target scene according to that sensor's field of view. In some embodiments, the fields of view 140, 145 for the opposing sensor arrays 105, 125 may overlap by a certain amount 150. To reduce the overlap 150 and form a single image, a stitching process as described below may be used to combine the images from the two opposing sensor arrays 105, 125. Certain embodiments of the stitching process may employ the overlap 150 for identifying common features in stitching the partial images together. After stitching the overlapping images together, the stitched image may be cropped to a desired aspect ratio, for example 4:3 or 1:1, to form the final image.
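As a minimal sketch of the linear blending such a stitching step could apply across the overlap 150 — assuming the two partial images have already been aligned, and using NumPy purely for illustration:

```python
import numpy as np

def blend_overlap(left, right, overlap):
    # Linearly blend two aligned partial images (H x W x C) whose last and
    # first `overlap` columns view the same region of the scene.
    ramp = np.linspace(0.0, 1.0, overlap)[None, :, None]  # 0 -> 1 across the seam
    seam = (1.0 - ramp) * left[:, -overlap:] + ramp * right[:, :overlap]
    out = np.concatenate([left[:, :-overlap], seam, right[:, overlap:]], axis=1)
    return out.astype(left.dtype)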
Each sensor in the array may have a substantially different field of view, and in some embodiments the fields of view may overlap. As described in more detail below, the spatial relationships between the various primary light redirecting surfaces 122, 124, lens assemblies 115, 130, and sensors 105, 125 can be predetermined to reduce or eliminate parallax and tilt artifacts occurring between the different fields of view.
Some configurations of such array cameras 100A, 100B can suffer from parallax and tilt artifacts based on the relative positioning of the sensors and light redirecting surfaces, presenting challenges with respect to quality degradation due to parallax and tilt between different views of the same object as seen from different cameras of the array. Parallax and tilt prevent seamless stitching of the images captured by each camera into a final image completely free of artifacts. Depending on depth (e.g., distance from lens to object), the image from one camera can be shifted in position and angle relative to an overlapping image from another camera. The resulting parallax and tilt can cause “double image” ghosting in the image area corresponding to the overlapping fields of view when the images are stitched or fused together. Even if the array is structured such that there is no overlap in sensor fields of view, parallax results in discontinuous features in the image, such as lines and edges, when such features cross over the borders between sensor fields of view.
As used herein, the term “camera” refers to an image sensor, lens system, and a number of corresponding light redirecting surfaces, for example the primary light redirecting surface 124, lens assembly 130, secondary light redirecting surface 135, and sensor 125.
Device 200 may be a cell phone, digital camera, tablet computer, personal digital assistant, or the like. There are many portable computing devices in which a reduced thickness imaging system such as is described herein would provide advantages. Device 200 may also be a stationary computing device or any device in which a thin imaging system would be advantageous. A plurality of applications may be available to the user on device 200. These applications may include traditional photographic and video applications, high dynamic range imaging, panoramic photo and video, or stereoscopic imaging such as 3D images or 3D video.
The image capture device 200 includes the cameras 215a-n for capturing external images. The cameras 215a-n may each comprise a sensor, lens assembly, and a primary and secondary reflective or refractive surface for redirecting a portion of a target image to each sensor, as discussed above.
The image processor 220 may be configured to perform various processing operations on received image data comprising N portions of the target image in order to output a high quality stitched image, as will be described in more detail below. Image processor 220 may be a general purpose processing unit or a processor specially designed for imaging applications. Examples of image processing operations include cropping, scaling (e.g., to a different resolution), image stitching, image format conversion, color interpolation, color processing, image filtering (e.g., spatial image filtering), lens artifact or defect correction, etc. Image processor 220 may, in some embodiments, comprise a plurality of processors. Certain embodiments may have a processor dedicated to each image sensor. Image processor 220 may be one or more dedicated image signal processors (ISPs) or a software implementation of a processor.
As shown, the image processor 220 is connected to a memory 230 and a working memory 205. In the illustrated embodiment, the memory 230 stores capture control module 235, image stitching module 240, and operating system 245. These modules include instructions that configure the image processor 220 or device processor 250 to perform various image processing and device management tasks. Working memory 205 may be used by image processor 220 to store a working set of processor instructions contained in the modules of memory 230. Alternatively, working memory 205 may also be used by image processor 220 to store dynamic data created during the operation of device 200.
As mentioned above, the image processor 220 is configured by several modules stored in the memories. The capture control module 235 may include instructions that configure the image processor 220 to adjust the focus position of cameras 215a-n. Capture control module 235 may further include instructions that control the overall image capture functions of the device 200. For example, capture control module 235 may include instructions that call subroutines to configure the image processor 220 to capture raw image data of a target image scene using the cameras 215a-n. Capture control module 235 may then call the image stitching module 240 to perform a stitching technique on the N partial images captured by the cameras 215a-n and output a stitched and cropped target image to the image processor 220. Capture control module 235 may also call the image stitching module 240 to perform a stitching operation on raw image data in order to output a preview image of a scene to be captured, and to update the preview image at certain time intervals or when the scene in the raw image data changes.
Image stitching module 240 may comprise instructions that configure the image processor 220 to perform stitching and cropping techniques on captured image data. For example, each of the N sensors 215a-n may capture a partial image comprising a portion of the target image according to each sensor's field of view. The fields of view may share areas of overlap, as described above and below. In order to output a single target image, image stitching module 240 may configure the image processor 220 to combine the N partial images to produce a high-resolution target image. Target image generation may occur through known image stitching techniques. Examples of image stitching can be found in U.S. patent application Ser. No. 11/623,050, which is hereby incorporated by reference in its entirety.
For instance, image stitching module 240 may include instructions to compare the areas of overlap along the edges of the N partial images for matching features in order to determine rotation and alignment of the N partial images relative to one another. Due to rotation of partial images and/or the shape of the field of view of each sensor, the combined image may form an irregular shape. Therefore, after aligning and combining the N partial images, the image stitching module 240 may call subroutines which configure image processor 220 to crop the combined image to a desired shape and aspect ratio, for example a 4:3 rectangle or 1:1 square. The cropped image may be sent to the device processor 250 for display on the display 225 or for saving in the storage 210.
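For illustration only, a registration step of this kind might look like the following sketch; the disclosure does not mandate any particular technique, and ORB features with a RANSAC-estimated homography (via OpenCV) are simply one common choice:

```python
import cv2
import numpy as np

def align_pair(base, moving):
    # Estimate the homography mapping `moving` onto `base` from ORB features
    # matched in the shared overlap region (both images 8-bit grayscale).
    orb = cv2.ORB_create(1000)
    k1, d1 = orb.detectAndCompute(base, None)
    k2, d2 = orb.detectAndCompute(moving, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d2, d1), key=lambda m: m.distance)[:100]
    src = np.float32([k2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return cv2.warpPerspective(moving, H, (base.shape[1], base.shape[0]))
```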
Operating system module 245 configures the image processor 220 to manage the working memory 205 and the processing resources of device 200. For example, operating system module 245 may include device drivers to manage hardware resources such as the cameras 215a-n. Therefore, in some embodiments, instructions contained in the image processing modules discussed above may not interact with these hardware resources directly, but instead interact through standard subroutines or APIs located in operating system component 245. Instructions within operating system 245 may then interact directly with these hardware components. Operating system module 245 may further configure the image processor 220 to share information with device processor 250.
Device processor 250 may be configured to control the display 225 to display the captured image, or a preview of the captured image, to a user. The display 225 may be external to the imaging device 200 or may be part of the imaging device 200. The display 225 may also be configured to provide a view finder displaying a preview image for a user prior to capturing an image, or may be configured to display a captured image stored in memory or recently captured by the user. The display 225 may comprise an LCD or LED screen, and may implement touch sensitive technologies.
Device processor 250 may write data to storage module 210, for example data representing captured images. While storage module 210 is represented graphically as a traditional disk device, those with skill in the art would understand that the storage module 210 may be configured as any storage media device. For example, the storage module 210 may include a disk drive, such as a floppy disk drive, hard disk drive, optical disk drive or magneto-optical disk drive, or a solid state memory such as a FLASH memory, RAM, ROM, and/or EEPROM. The storage module 210 can also include multiple memory units, and any one of the memory units may be configured to be within the image capture device 200, or may be external to the image capture device 200. For example, the storage module 210 may include a ROM memory containing system program instructions stored within the image capture device 200. The storage module 210 may also include memory cards or high speed memories configured to store captured images which may be removable from the camera.
III. Overview of Folded Optic Array Cameras Free from Parallax and Tilt Artifacts
Each camera 310A, 310B looks at the apex A of the central mirror prism 350, the optical axis 315A, 315B of each camera 310A, 310B passing through the apex A. The lens centers of the lenses 312A, 312B associated with each of the cameras 310A, 310B are at the same distance from the apex, and each camera 310A, 310B sees half the field of view 340 of the virtual camera 320. The angle of the optical axis 315A, 315B of each camera 310A, 310B relative to the vertical axis 325 can be double the angle of a plane formed by its corresponding mirror 330, 335 relative to the vertical axis 325. In the illustrated embodiment, the vertical axis 325 denotes the vertical axis of symmetry of the array 300 and is also the virtual optical axis (e.g., the optical axis of the virtual camera 320 represented by virtual sensor 321 and virtual lens 322).
As illustrated, the planes formed by the mirror surfaces 330, 335 intersect at a common point, referred to as the apex and labeled as A in the figures, along the virtual optical axis 325 of the array. The cameras 310A, 310B can be positioned so that the optical axis 315A, 315B of each camera intersects with the apex A. In addition, each camera 310A, 310B can be positioned such that the angle (labeled as angle 2α) formed between the camera's optical axis 315A, 315B and the virtual optical axis 325 is twice the angle (labeled as angle α) formed between the corresponding mirror surface 330, 335 and the virtual optical axis 325. However, these angles do not have to be the same for all cameras in the array. The distance D between the apex A and the center of projection 313B (located within the lens 312B corresponding to a sensor 311B) can be the same or essentially the same for all the cameras in the array. All cameras 310A, 310B of the array virtually merge into (that is, serve as) one single virtual camera 320 looking upward along the virtual optical axis 325 of the array 300. In this way each individual camera/lens/mirror combination represents only a sub-aperture of the total array 300. The virtual camera 320 has a synthetic aperture made of the sum of all individual aperture rays.
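This geometry can be checked numerically. Each camera/mirror pair behaves like a virtual camera whose center of projection is the mirror image of the real center of projection across the mirror plane; with the apex at the origin, the optical axis at angle 2α to the virtual axis, and a common distance D, every reflected center lands at the same point on the virtual optical axis. The following 2-D sketch (illustrative only, not part of the disclosure) verifies this:

```python
import numpy as np

def reflect_across_mirror(p, alpha):
    # Reflect point p across a line through the origin (the apex A) that
    # makes angle alpha with the vertical (virtual optical) axis.
    d = np.array([np.sin(alpha), -np.cos(alpha)])  # unit vector along the mirror plane
    return 2.0 * np.dot(p, d) * d - p

D = 10.0  # common apex-to-center-of-projection distance (units arbitrary)
for alpha_deg in (30.0, 35.0, 40.0):  # per-camera mirror angles need not be equal
    a = np.radians(alpha_deg)
    cop = D * np.array([np.sin(2 * a), -np.cos(2 * a)])  # COP at angle 2*alpha from the axis
    print(alpha_deg, np.round(reflect_across_mirror(cop, a), 9))
# Every virtual center of projection lands at (0, -D): the cameras share a
# single virtual center even when their mirror angles differ.
```

Because the reflected centers coincide for any α, the merge into a single virtual camera 320 holds even when, as noted above, the angles are not the same for all cameras.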
The field of view (FOV) of the virtual camera 320 can be expressed in terms of the half-angle β and the mirror angle α:

FOV = 2β

β = 90° − α

FOV = 180° − 2α
Beyond the angle β, the light rays that the virtual camera 320 “sees” can be obstructed by the physical structure of the real camera 310B. In some embodiments of array cameras, the FOV may be smaller.
Additionally, the array camera is desirably thin (e.g., 4 mm or less in height) in some embodiments, which constrains the angle α to less than 45° and to more than a certain value. Other practical requirements may make α>30°. In various embodiments, the focal length and angle α do not have to be the same for all cameras.
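As a worked example within these constraints, α = 40° gives β = 50° and FOV = 180° − 80° = 100°; over the stated range 30° < α < 45°, the FOV of the virtual camera correspondingly ranges from 120° down to 90°.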
The central mirror pyramid 520 can include four reflective facets or mirrored surfaces 525A, 525B, 525C, 525D, each directing light toward one of the four cameras 510A, 510B, 510C, 510D and together forming the apex A of the pyramid. Each camera 510A, 510B, 510C, 510D can include an image sensor and lens assembly, and in some implementations a secondary light redirecting surface, represented in the figures as the two joined rectangular boxes. Each camera 510A, 510B, 510C, 510D can have an optical axis 515A, 515B, 515C, 515D passing through the apex A of the mirror pyramid, and can see, using a corresponding one of the reflective facets 525A, 525B, 525C, 525D, a portion of the total field of view of the virtual camera. The spatial relationships between each of the cameras 510A, 510B, 510C, 510D, the apex A, and the corresponding one of the reflective facets 525A, 525B, 525C, 525D can be defined as above in order to reduce or eliminate parallax and tilt artifacts. Although commonly described herein as a mirror pyramid, in some embodiments the reflective facets may form a reflecting component having a different shape, for example by being positioned a distance apart and therefore not forming a unified structure, or by being positioned together to form a reflecting component having a flat upper surface rather than a pointed apex. As such, the apex A may not be a physical point but rather a point in space representing an intersection of the planes of the facets.
Other array camera designs with optical axis alignment for parallax reduction are also possible, for example an eight camera design using four mirrors or reflective facets in addition to a central mirror pyramid (or other shape/structure of central reflective component). Other numbers of cameras and corresponding mirrored surfaces are possible.
IV. Overview of Example Captured Images
When all images are in focus, the in-focus objects should preferably lie at one distance, for example in one plane; proper alignment then removes any parallax. When objects lie at different depths, however, they cannot all be in focus and all aligned at the same time: even if one object is aligned in the overlapping views, another object at a different depth may not be.
In some examples, in-focus objects will be aligned properly; however, there can be misalignment between images of out-of-focus objects in the same scene. Extensive simulation in Zemax has shown that with such objects there is visible parallax between overlapping views from different mirrors. However, the mixture of two such images is the true defocused view of the object from a larger aperture: views from individual cameras show partial (incomplete) aperture imaging, and the result of mixing such individual views is full aperture imaging. The array camera thus generates a synthetic aperture image from all of the partial views.
In one experiment with the four-camera array, with the in-focus object at 50 mm and the cameras focused at 40 mm, slight parallax was visible in the overlapping regions between individual views. However, linear blending of the views still produced a final image that was clean and free of ghosting.
Although real and imperfect mirrors may cause darkening in captured images, such darkening will be constant across all images captured by the array using the imperfect mirror. Accordingly, in some embodiments post-capture processing techniques can be used to correct for the known darkening of the image due to the mirror, for example by multiplying the captured image by a mask of the known darkened regions. The result would appear as if it was captured by an ideal mirror with sharp edges and a sharp apex. In other embodiments, manufacturing constraints can be placed on mirror construction to avoid darkening artifacts, for example requiring mirror edges to be precise and sharp to better than 0.25 mm.
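A minimal sketch of that correction, assuming a per-camera correction mask measured once during calibration (the calibration procedure itself is not specified here):

```python
import numpy as np

def correct_mirror_darkening(image, mask):
    # `mask` holds per-pixel correction factors (> 1 in regions the imperfect
    # mirror darkens, 1 elsewhere), e.g. the reciprocal of a flat-field
    # measurement; 8-bit image data is assumed for the final clip.
    out = image.astype(np.float32) * mask
    return np.clip(out, 0, 255).astype(image.dtype)
```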
V. Overview of Example Image Capture Process
The process 1000 then moves to block 1010, in which at least one reflective surface is mounted proximate to and in a predefined spatial relationship with a corresponding one of the plurality of imaging sensors. For example, this block could comprise mounting a central mirror pyramid in the middle of a surrounding array of two, four, or eight sensors, wherein the central mirror pyramid comprises a surface associated with each sensor in the array. As described above, the predefined spatial relationship can provide for an array camera that is substantially free of parallax and tilt artifacts, for example by specifying that all planes formed by the facets or mirrored surfaces of the central mirror pyramid intersect at a common point (the apex), that each mirror is positioned at an angle α relative to a vertical axis of the array passing through the apex (though different mirrors may be mounted at different angles or all at the same angle in various embodiments), that the optical axis of the camera corresponding to each mirror is positioned at an angle 2α relative to the vertical axis of the array passing through the apex, and that the center of projection of the lens assembly associated with each sensor is positioned the same distance D from the apex as each of the other centers of projection. In some embodiments, blocks 1005 and 1010 of process 1000 can be implemented as a method of manufacturing an array camera that is substantially free of parallax and tilt artifacts.
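A hypothetical sketch of how the predefined spatial relationship could be checked programmatically — the data layout and function name are assumptions, not part of this disclosure — tests the three stated conditions directly:

```python
import numpy as np

def validate_design(cameras, apex, tol=1e-6):
    # cameras: list of dicts with 'cop' (center of projection, 3-vector),
    # 'axis' (unit optical-axis direction), and 'mirror' = (point, unit normal).
    D = None
    for cam in cameras:
        point, normal = cam['mirror']
        # 1. The plane formed by the mirror facet must contain the apex.
        assert abs(np.dot(apex - point, normal)) < tol
        # 2. The camera optical axis must pass through the apex.
        to_apex = apex - cam['cop']
        assert np.linalg.norm(np.cross(to_apex, cam['axis'])) < tol * np.linalg.norm(to_apex)
        # 3. All centers of projection lie at a common distance D from the apex.
        dist = np.linalg.norm(to_apex)
        assert D is None or abs(dist - D) < tol
        D = dist
    return True
```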
The process 1000 then transitions to block 1015, in which light comprising a target image of a scene is reflected off of the at least one reflective surface toward the imaging sensors of an array camera manufactured by blocks 1005 and 1010. For example, a portion of the light may be reflected off of each of a plurality of surfaces toward each of the plurality of sensors. This may further comprise passing the light through a lens assembly associated with each sensor, and may also include reflecting the light off of a second surface onto a sensor. Block 1015 may further comprise focusing the light using the lens assembly or through movement of any of the reflective surfaces.
The process 1000 may then move to block 1020, in which the sensors capture a plurality of images of the target image scene. For example, each sensor may capture an image of a portion of the scene corresponding to that sensor's field of view. Due to the predetermined spatial relationship used in constructing the array camera, the fields of view may exhibit little or no parallax and tilt artifacts. Together, the fields of view of the plurality of sensors cover at least the target image in the object space.
The process 1000 then may transition to block 1025, in which an image stitching method is performed to generate a single image from the plurality of images. In some embodiments, the image stitching module 240 described above may be configured to perform this block.
Next, the process 1000 transitions to block 1030 in which the stitched image is cropped to a specified aspect ratio, for example 4:3 or 1:1. Finally, the process ends after storing the cropped image at block 1035. For example, the image may be stored as a full resolution final image in storage 210.
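The crop of block 1030 can be as simple as a center crop to the largest window with the target aspect ratio; a brief sketch:

```python
def crop_to_aspect(image, num=4, den=3):
    # Center-crop an H x W (x C) array to the largest num:den window that fits.
    h, w = image.shape[:2]
    target_w = min(w, (h * num) // den)
    target_h = min(h, (target_w * den) // num)
    y0, x0 = (h - target_h) // 2, (w - target_w) // 2
    return image[y0:y0 + target_h, x0:x0 + target_w]
```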
VI. Implementing Systems and Terminology
Implementations disclosed herein provide systems, methods and apparatus for multiple aperture array cameras free from parallax and tilt artifacts. One skilled in the art will recognize that these embodiments may be implemented in hardware, software, firmware, or any combination thereof.
In some embodiments, the circuits, processes, and systems discussed above may be utilized in a wireless communication device. The wireless communication device may be a kind of electronic device used to wirelessly communicate with other electronic devices. Examples of wireless communication devices include cellular telephones, smart phones, Personal Digital Assistants (PDAs), e-readers, gaming systems, music players, netbooks, wireless modems, laptop computers, tablet devices, etc.
The wireless communication device may include one or more image sensors, two or more image signal processors, and a memory including instructions or modules for carrying out the processes discussed above. The device may also have data, a processor that loads instructions and/or data from memory, one or more communication interfaces, one or more input devices, one or more output devices such as a display device, and a power source/interface. The wireless communication device may additionally include a transmitter and a receiver, jointly referred to as a transceiver. The transceiver may be coupled to one or more antennas for transmitting and/or receiving wireless signals.
The wireless communication device may wirelessly connect to another electronic device (e.g., base station). A wireless communication device may alternatively be referred to as a mobile device, a mobile station, a subscriber station, a user equipment (UE), a remote station, an access terminal, a mobile terminal, a terminal, a user terminal, a subscriber unit, etc. Examples of wireless communication devices include laptop or desktop computers, cellular phones, smart phones, wireless modems, e-readers, tablet devices, gaming systems, etc. Wireless communication devices may operate in accordance with one or more industry standards such as the 3rd Generation Partnership Project (3GPP). Thus, the general term “wireless communication device” may include wireless communication devices described with varying nomenclatures according to industry standards (e.g., access terminal, user equipment (UE), remote terminal, etc.).
The functions described herein may be stored as one or more instructions on a processor-readable or computer-readable medium. The term “computer-readable medium” refers to any available medium that can be accessed by a computer or processor. By way of example, and not limitation, such a medium may comprise RAM, ROM, EEPROM, flash memory, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray® disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. It should be noted that a computer-readable medium may be tangible and non-transitory. The term “computer-program product” refers to a computing device or processor in combination with code or instructions (e.g., a “program”) that may be executed, processed or computed by the computing device or processor. As used herein, the term “code” may refer to software, instructions, code or data that is/are executable by a computing device or processor.
Software or instructions may also be transmitted over a transmission medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of transmission medium.
The methods disclosed herein comprise one or more steps or actions for achieving the described method. The method steps and/or actions may be interchanged with one another without departing from the scope of the claims. In other words, unless a specific order of steps or actions is required for proper operation of the method that is being described, the order and/or use of specific steps and/or actions may be modified without departing from the scope of the claims.
It should be noted that the terms “couple,” “coupling,” “coupled” or other variations of the word couple as used herein may indicate either an indirect connection or a direct connection. For example, if a first component is “coupled” to a second component, the first component may be either indirectly connected to the second component or directly connected to the second component. As used herein, the term “plurality” denotes two or more. For example, a plurality of components indicates two or more components.
The term “determining” encompasses a wide variety of actions and, therefore, “determining” can include calculating, computing, processing, deriving, investigating, looking up (e.g., looking up in a table, a database or another data structure), ascertaining and the like. Also, “determining” can include receiving (e.g., receiving information), accessing (e.g., accessing data in a memory) and the like. Also, “determining” can include resolving, selecting, choosing, establishing and the like.
The phrase “based on” does not mean “based only on,” unless expressly specified otherwise. In other words, the phrase “based on” describes both “based only on” and “based at least on.”
In the foregoing description, specific details are given to provide a thorough understanding of the examples. However, it will be understood by one of ordinary skill in the art that the examples may be practiced without these specific details. For example, electrical components/devices may be shown in block diagrams in order not to obscure the examples in unnecessary detail. In other instances, such components, other structures and techniques may be shown in detail to further explain the examples.
Headings are included herein for reference and to aid in locating various sections. These headings are not intended to limit the scope of the concepts described with respect thereto. Such concepts may have applicability throughout the entire specification.
It is also noted that the examples may be described as a process, which is depicted as a flowchart, a flow diagram, a finite state diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel, or concurrently, and the process can be repeated. In addition, the order of the operations may be re-arranged. A process is terminated when its operations are completed. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a software function, its termination corresponds to a return of the function to the calling function or the main function.
The previous description of the disclosed implementations is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these implementations will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other implementations without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the implementations shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The present application is a continuation of U.S. patent application Ser. No. 14/571,149, filed on Dec. 15, 2014, entitled “MULTI-CAMERA SYSTEM USING FOLDED OPTICS FREE FROM PARALLAX AND TILT ARTIFACTS,” which claims the benefit under 35 U.S.C. §119(e) of U.S. Provisional Patent Application No. 62/015,319, filed on Jun. 20, 2014, entitled “MULTI-CAMERA SYSTEM USING FOLDED OPTICS FREE FROM PARALLAX AND TILT ARTIFACTS,” the contents of which are hereby incorporated by reference herein.