This disclosure relates to a camera array and, more specifically, to methods for capturing images using the camera array.
Digital cameras are increasingly used in outdoor and sports environments. Using a camera to capture outdoor and sports environments, however, can be difficult if the camera is bulky or cannot capture the desired field of view. A user's experience with a camera can be diminished by camera bulkiness and limited camera functionality.
The disclosed embodiments have other advantages and features which will be more readily apparent from the following detailed description of the invention and the appended claims, when taken in conjunction with the accompanying drawings, in which:
The figures and the following description relate to preferred embodiments by way of illustration only. It should be noted that from the following discussion, alternative embodiments of the structures and methods disclosed herein will be readily recognized as viable alternatives that may be employed without departing from the principles of what is claimed.
Reference will now be made in detail to several embodiments, examples of which are illustrated in the accompanying figures. It is noted that wherever practicable similar or like reference numbers may be used in the figures and may indicate similar or like functionality. The figures depict embodiments of the disclosed system (or method) for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles described herein.
A camera array configuration includes a plurality of cameras, each camera having a distinctive field of view. For example, the camera array can include a 2×1 camera array, a 2×2 camera array, or any other suitable arrangement of cameras. Each camera can have a camera housing structured to at least partially enclose the camera. Alternatively, the camera array can include a camera housing structured to enclose the plurality of cameras. Each camera can include a camera body having a camera lens structured on a front surface of the camera body, various indicators on the front surface of the camera body (such as LEDs, displays, and the like), various input mechanisms (such as buttons, switches, and touch-screen mechanisms), and electronics (e.g., imaging electronics, power electronics, etc.) internal to the camera body for capturing images via the camera lens and/or performing other functions. In another embodiment, the camera array includes some or all of the various indicators, various input mechanisms, and electronics and includes the plurality of cameras. A camera housing can include a lens window structured on the front surface of the camera housing and configured to substantially align with the camera lenses of the plurality of cameras, and one or more indicator windows structured on the front surface of the camera housing and configured to substantially align with the camera indicators.
Even though cameras 105A and 105B have a partial common field of view, any object in the common field of view may not be aligned in images captured by the left camera 105A and the right camera 105B. Thus, as seen in the left image 110A, a part of the sphere appears to the left of the cube and, in the right image 110B, a part of the sphere appears to the right of the cube. This offset 120 of the cube and sphere is due to parallax error. The parallax error is inherent to the conventional camera environment 100 because the distance between the cameras causes the position of objects in the common field of view to differ relative to each camera. Parallax error can be further exacerbated based on a position of a read window 130 of an image sensor window 140 of an image sensor within each camera. For example, the read window 130 is not necessarily centered on the image sensor window 140 and, therefore, can result in a greater effective distance between the cameras 105A and 105B. The position of the read window 130 within an image sensor window 140 is an issue that can arise during the manufacturing of the image sensor and can be adjusted for various purposes, as further described below.
The angled image formed by taping the images 110C and 110D together is different from the flat image formed by taping the images 110A and 110B together.
Unlike the environment 100 described above, the cameras 105C and 105D are oriented such that their fields of view are angled toward one another.
It should be noted that the orientation of cameras 105C and 105D is such that the vectors normal to each camera lens of cameras 105C and 105D intersect within the partial common field of view. In contrast, while the cameras 105A and 105B have a partial common field of view, vectors normal to each camera lens of cameras 105A and 105B do not intersect within the partial common field of view. In embodiments with a 2×2 camera array, vectors normal to each camera lens in the array can intersect within a field of view common to all cameras. For example, for any two cameras, vectors normal to the camera lenses of the two cameras can intersect within a field of view common to all four cameras.
The camera 105 illustrated in the figures is representative of each of the cameras 105A-D in the camera array 300 described below.
The camera array 300 can be adapted to be at least partially enclosed by a protective camera housing (not illustrated).
Portions of the housing and/or array may include exposed areas to allow a user to manipulate buttons that are associated with the camera array 300 functionality. Alternatively, such areas may be covered with a pliable material to allow the user to manipulate the buttons through the housing. For example, in one embodiment the top face of the housing includes an outer shutter button structured so that a shutter button of the camera array 300 is substantially aligned with the outer shutter button when the camera array 300 is secured within the housing. The shutter button of the camera array 300 is operationally coupled to the outer shutter button so that pressing the outer shutter button allows the user to operate the camera shutter button.
In one embodiment, the front face of the housing includes one or more lens windows structured so that the lenses of the cameras in the camera array 300 are substantially aligned with the lens windows when the camera array 300 is secured within the housing. The lens windows can be adapted for use with a conventional lens, a wide angle lens, a flat lens, or any other specialized camera lens. In this embodiment, the lens window includes a waterproof seal so as to maintain the waterproof aspect of the housing.
In one embodiment, the housing and/or array includes one or more securing structures for securing the housing and/or array to one of a variety of mounting devices. For example, various mounts include a clip-style mount or a different type of mounting structure via a different type of coupling mechanism.
In one embodiment, the housing includes an indicator window structured so that one or more camera array indicators are substantially aligned with the indicator window when the camera array 300 is secured within the housing. The indicator window can be any shape or size, and can be made of the same material as the remainder of the housing, or can be made of any other material, for instance a transparent or translucent material and/or a non-reflective material.
The housing can include a first housing portion and a second housing portion, according to one example embodiment. The second housing portion detachably couples with the first housing portion opposite the front face of the first housing portion. The first housing portion and second housing portion are collectively structured to enclose a camera array 300 within the cavity formed when the second housing portion is secured to the first housing portion in a closed position.
The camera array 300 is configured to capture images and video, and to store captured images and video for subsequent display or playback. The camera array 300 is adapted to fit within a housing, such as the housing discussed above or any other suitable housing. Each camera 105A-D in the array 300 can be an interchangeable camera module. As illustrated, the camera array 300 includes a plurality of lenses configured to receive light incident upon the lenses and to direct received light onto image sensors internal to the cameras.
The camera array 300 can include various indicators, including LED lights and an LED display. The camera array 300 can also include buttons configured to allow a user of the camera array 300 to interact with the camera array 300, to turn the camera array 300 on, and to otherwise configure the operating mode of the camera array 300. The camera array 300 can also include a microphone configured to receive and record audio signals in conjunction with recording video. The camera array 300 can include an I/O interface. The I/O interface can be enclosed by a protective door and/or include any type or number of I/O ports or mechanisms, such as USB ports, HDMI ports, memory card slots, and the like.
The camera array 300 can also include a door that covers a removable battery and battery interface. The camera array 300 can also include an expansion pack interface configured to receive a removable expansion pack, such as a display module, an extra battery module, a wireless module, and the like. Removable expansion packs, when coupled to the camera array 300, provide additional functionality to the camera array 300 via the expansion pack interface.
A single-body camera frame 405B can be used to improve proximity of the center of lenses of adjacent camera frames. The single-body camera frame 405B (also referred to herein as a “lens stack”) includes the plurality of lenses 410 and the image sensor 415 but does not include the auto-focus coils 420. In addition, the single-body camera frame 405B is a single mold that holds the components 410 and 415 within the frame 405B in a fixed position, set by the mold, and minimizes excess space within the frame 405B. The cross-section of the frame 405B is trapezoidal in shape, allowing adjacent cameras to be configured such that the proximity between the centers of the lenses of the cameras is reduced. In some embodiments, for lenses that are approximately 1 mm in diameter, the distance between the centers of the lenses of adjacent cameras is 1 mm or less.
Camera Array Block Diagrams
The image sensor 510 is a device capable of electronically capturing light incident on the image sensor 510. In one embodiment, CMOS sensors are used, including transistors, photodiodes, amplifiers, analog-to-digital converters, and power supplies. In one embodiment, the image sensor 510 has rolling shutter functionality, and can capture light incident upon different portions of the image sensor at slightly different times. Alternatively, the image sensor 510 can be a CCD sensor configured to capture the portions of the image at substantially the same time. In one embodiment, the image sensor 510 has an adjustable read window 130. An adjustable read window 130 can modify the portions of an image sensor that are exposed to light and read to capture an image, or can modify the portions of an image sensor completely exposed to light that are read out to capture an image. By adjusting the read window 130, the camera 500A can modify when a portion of an image is captured relative to when image capture begins. For example, by shifting the read window 130 in a rolling shutter direction, the image sensor captures portions of the image in the read window 130 earlier than if the read window 130 was not shifted in the rolling shutter direction. Additionally, adjusting the read window 130 can be used to address inherent tolerance issues with the camera 500A and adjust the convergence point of the camera array, as further described below.
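For illustration only, the following Python sketch (with hypothetical sensor dimensions and row time) models how shifting a read window within a rolling-shutter image sensor changes when a given sensor row is read out relative to the start of image capture:

```python
# Illustrative model of a rolling-shutter image sensor with an adjustable
# read window. All values below are hypothetical.

SENSOR_ROWS = 3000        # total pixel rows on the image sensor window
READ_WINDOW_ROWS = 2000   # rows actually read out to form an image
ROW_TIME_US = 10.0        # time to read one row ("row time", trow), in microseconds

def row_capture_time_us(sensor_row, window_top_row):
    """Return when a sensor row is read, relative to the start of readout.

    sensor_row:     absolute row index on the image sensor (0 = top)
    window_top_row: first sensor row included in the read window
    """
    if not (window_top_row <= sensor_row < window_top_row + READ_WINDOW_ROWS):
        raise ValueError("sensor row falls outside the read window")
    # Rows are read sequentially from the top of the read window, so shifting
    # the window in the rolling shutter direction changes when the same
    # sensor row (and the scene content imaged onto it) is captured.
    return (sensor_row - window_top_row) * ROW_TIME_US

# The same sensor row is captured 1 ms earlier when the window shifts by 100 rows.
print(row_capture_time_us(sensor_row=1500, window_top_row=500))  # 10000.0
print(row_capture_time_us(sensor_row=1500, window_top_row=600))  # 9000.0
```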
The processor 520 is one or more hardware devices (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), and the like) that execute computer-readable instructions stored in the memory 525. The processor 520 controls other components of the camera based on the instructions that are executed. For example, the processor 520 may send electronic control signals to the image sensor 510 or use the synchronization interface 505 to send data to cameras 500B, 500C, and 500D.
The memory 525 is a non-transitory storage medium that can be read by the processor 520. The memory 525 may contain volatile memory (e.g., random access memory (RAM)), non-volatile memory (e.g., a flash memory, hard disk, and the like), or a combination thereof. The memory 525 may store image data captured by the image sensor 510 and computer-readable instructions to be executed by the processor 520.
The sensor controller 515 controls operation of the image sensor 510 and other functions of the camera 500A. The sensor controller 515 can include physical and/or electronic input devices, such as exterior buttons to start recording video and/or capture a still image, a touchscreen with tap-to-focus capabilities, and a dial/button combination for navigating a menu hierarchy of the camera 500A. In addition, the sensor controller 515 may include remote user input devices, such as remote controls that wirelessly communicate with the cameras 500A-D. The image sensor 510 may function independently of the sensor controller 515. For example, a slave camera in a master-slave pairing can receive a signal from the master camera to capture an image through the synchronization interface 505.
The synchronization interface 505 sends data to and receives data from the cameras 500A, 500B, 500C, and 500D, or an external computing system. In particular, the synchronization interface 505 may send commands to or receive commands from the cameras 500A, 500B, 500C, and 500D for simultaneously capturing an image and/or calibrating synchronization with the cameras 500A, 500B, 500C, and 500D (e.g., sending or receiving a synchronization pulse).
The image store 605 is configured to store a plurality of images synchronously captured by each of a plurality of cameras, such as the cameras 500A-D described above.
The capture controller 615 controls image capture by the image sensor 510. In one embodiment, the capture controller 615 applies a calibration correction to synchronize image capture with one or more additional cameras, for instance based on synchronization or calibration data stored in the synchronization store 610. The calibration correction may include a read window shift by a determined number of pixels, as determined by the pixel shift determination module 620. The calibration correction may include, alternatively or additionally, a time lag for one of the cameras in the array to delay relative to the other cameras of the array before beginning image capture, as determined by the time lag determination module 625.
The pixel shift determination module 620 identifies a pixel shift between an image captured by a first camera 500A, an image captured by a second camera 500B, an image captured by a third camera 500C, and an image captured by a fourth camera 500D. This pixel shift indicates spatial misalignment between the image sensors of the cameras 500. In one embodiment, the pixel shift determination module 620 determines a pixel shift in a rolling shutter direction due to a misalignment between the image sensors along the rolling shutter direction. The capture controller 615 can use the determined pixel shift to correct the misalignment between the image sensors.
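One way such a pixel shift could be estimated, shown here only as an illustrative sketch using a generic cross-correlation of row-intensity profiles (not necessarily the approach of the pixel shift determination module 620):

```python
import numpy as np

def estimate_row_shift(overlap_a, overlap_b, max_shift=32):
    """Estimate the vertical (rolling shutter direction) pixel shift between two
    overlapping grayscale image regions of identical shape, in rows."""
    # Collapse each overlap region to a 1-D profile of mean row intensity.
    profile_a = overlap_a.mean(axis=1)
    profile_b = overlap_b.mean(axis=1)
    profile_a = profile_a - profile_a.mean()
    profile_b = profile_b - profile_b.mean()

    # Try candidate shifts and keep the one with the highest correlation.
    best_shift, best_score = 0, -np.inf
    for shift in range(-max_shift, max_shift + 1):
        score = float(np.dot(profile_a, np.roll(profile_b, shift)))
        if score > best_score:
            best_shift, best_score = shift, score
    return best_shift

# Example: a synthetic 8-row misalignment between two overlap regions is recovered.
rng = np.random.default_rng(0)
region_a = rng.random((200, 50))
region_b = np.roll(region_a, 8, axis=0)
print(estimate_row_shift(region_a, region_b))  # -8 (sign depends on convention)
```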
The time lag determination module 625 determines a time lag between the capture of an image row by a first camera 500A, a corresponding image row of a second camera 500B, a corresponding image row of a third camera 500C, and a corresponding image row of a fourth camera 500D. The time lag determination module 625 can determine a time lag based on a pixel shift received from the pixel shift determination module 620. Using the determined time lag, tlag, the capture controller 615 synchronizes the plurality of cameras by delaying image capture of a first of the plurality by the time lag relative to a second, third and fourth of the plurality. In one embodiment, an image sensor has an associated row time, trow, which represents an elapsed time between exposing a first pixel row and a second, subsequent pixel row. If images taken by a plurality of cameras are determined to have a pixel shift of n pixels, then the time lag tlag required to correct the pixel shift can be determined using the following equation:
tlag = trow × n
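A minimal worked example of this relationship, using a hypothetical 10-microsecond row time:

```python
def time_lag_us(row_time_us, pixel_shift_rows):
    """Time lag needed to compensate a pixel shift of n rows: tlag = trow * n."""
    return row_time_us * pixel_shift_rows

# With a hypothetical 10 us row time and a measured 12-row pixel shift, the
# lagging camera's image capture is delayed by 120 us.
print(time_lag_us(10.0, 12))  # 120.0
```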
In one embodiment, calibrating image capture between cameras in a plurality of cameras involves synchronously capturing images with the plurality of cameras, determining a pixel shift between the captured images, and applying a determined correction iteratively until the determined pixel shift is less than a pre-determined pixel shift threshold. The calibration process may be initiated when cameras are powered on or paired, when the cameras are manufactured, when the camera array is assembled, or in response to a manual initiation of the calibration process by a user of the camera array. A master camera system can initiate the calibration process after an amount of time elapses that is greater than or equal to a pre-determined threshold since a previous calibration. In an embodiment with additional cameras, additional calibrations can be performed among other cameras in response to the calibration of the master camera.
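The iterative calibration described above might be sketched as follows; capture_synchronized, measure_pixel_shift, and apply_time_lag are hypothetical helper callables standing in for the camera-array operations, and the threshold and iteration limit are arbitrary:

```python
PIXEL_SHIFT_THRESHOLD = 1   # stop once the residual shift is within one row
MAX_ITERATIONS = 10         # guard against a calibration that never converges
ROW_TIME_US = 10.0          # hypothetical sensor row time, in microseconds

def calibrate_pair(capture_synchronized, measure_pixel_shift, apply_time_lag):
    """Iteratively measure the pixel shift between two cameras and apply a
    compensating time lag until the shift falls below the threshold.

    The three arguments are callables supplied by the camera-array software:
      capture_synchronized() -> (image_a, image_b)
      measure_pixel_shift(image_a, image_b) -> signed shift in rows
      apply_time_lag(lag_us) -> None  (delays the lagging camera's capture)
    """
    for _ in range(MAX_ITERATIONS):
        image_a, image_b = capture_synchronized()
        shift = measure_pixel_shift(image_a, image_b)
        if abs(shift) < PIXEL_SHIFT_THRESHOLD:
            return True                      # calibrated
        apply_time_lag(ROW_TIME_US * shift)  # tlag = trow * n
    return False                             # did not converge
```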
The image capture module 630 processes captured images. In an alternative embodiment not described further herein, the captured images are processed outside of the synchronization interface 505, for instance by a system external to the camera array. The image capture module 630 includes a tolerance correction module 632, a convergence adjustment module 634, and an image processing module 636. Alternative embodiments may have one or more additional, omitted, or alternative modules configured to perform similar functionality.
The tolerance correction module 632 shifts read windows of the plurality of cameras 500 to correct for tolerances in each camera in the plurality. For example, each camera can have slight differences in read window location due to manufacturing discrepancies and product tolerances of the cameras. In addition, image sensors of the plurality of cameras can have varying read window locations. These variations from camera to camera can be due to discrepancies between locations and orientations of the image sensor within each camera. For example, if the image sensor is a complementary metal-oxide-semiconductor (CMOS) sensor, the location of the read window for each CMOS sensor can shift due to the sensitivity of pixel sensors used in each CMOS sensor. Shifted read window locations in CMOS sensors of cameras in the camera array can shift the fields of view of the cameras, as shown for the fields of view of the cameras in the camera array 700 described below.
The convergence adjustment module 634 can dynamically adjust a read window in an image sensor based on image data captured by the camera array. For example, if the camera array is capturing an image of an object in a foreground, the convergence point of the plurality of cameras in the camera array can be adjusted closer to the camera array and, if capturing an object in a background, the convergence point of the plurality of cameras in the camera array can be adjusted farther away from the camera array. In another example, the convergence point of the camera array is only adjusted by the convergence adjustment module 634 if the depth of the object in the field of view of the camera array exceeds a threshold distance. The convergence adjustment can be done manually by a user of the camera array or automatically by the convergence adjustment module 634 based on image processing algorithms measuring a depth of an object in the field of view of the camera array. The process of adjusting convergence of the plurality of cameras 500 is further described below.
The image processing module 636 adjusts images captured by the camera array to compensate for the angled fields of view of the cameras of the camera array. The images are adjusted using, for example, warps, transformations, crops, or any other suitable image enhancement, restoration, and/or compression techniques. One or more of the images can be adjusted individually, or all of the images can be adjusted substantially synchronously or sequentially. For example, the images can be adjusted prior to and during the taping of the images to produce a simulated flat image, as further described below.
Read Window Adjustment
In the illustrated example, the images 710A and 710B display the corner of the cube at (x,y) coordinates (3,4) and (12,6), respectively, where the read window has a height of 12 pixels and a width of 20 pixels and the cube has a width of 3 pixels. Thus, the cube has different heights within the read windows, and has different locations along the x-axis within the read windows. The read window of the image sensor of each camera in the plurality of cameras 705A and 705B can be adjusted to display the center of the cube at the same height along the y-axis and at the same distance from the respective edges along the x-axis. Images 710C and 710D illustrate adjusted read windows, showing the corner of the cube displayed at (4,5) and (13,5), respectively (the same height and distance from the image edge). Detection of the cube can be an automatic process based on an object detection algorithm or assisted by a user of the camera array 700 and the user's input.
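Using the coordinates from this example, a small sketch of the read window adjustment arithmetic (the sign convention is an assumption made for illustration):

```python
def read_window_shift(corner_xy, target_xy):
    """Return the (dx, dy) by which a read window is shifted so a tracked
    feature moves from corner_xy to target_xy in read-window coordinates.

    Convention assumed here: shifting the read window by (dx, dy) moves the
    feature by (-dx, -dy) within the window, so the shift is corner - target.
    """
    (cx, cy), (tx, ty) = corner_xy, target_xy
    return (cx - tx, cy - ty)

# Mirroring the example above: the cube's corner appears at (3, 4) and (12, 6)
# before adjustment, and at (4, 5) and (13, 5) afterward.
print(read_window_shift((3, 4), (4, 5)))    # (-1, -1)
print(read_window_shift((12, 6), (13, 5)))  # (-1, 1)
```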
To align the read windows 130 within a multiple camera array, an object of interest can be identified. For instance, if a multiple camera array is used to capture a set of images (one per camera) of a garden, a garden flower within the images can be selected. A set of correlation coefficients representative of an amount of decorrelation within the overlapping portions of the captured set of images is determined. The set of correlation coefficients can be weighted such that correlation coefficients associated with the identified object of interest are weighted more heavily than other correlation coefficients. In some embodiments, correlation coefficients associated with a center region representative of an overlap between each of the images in the set of images are weighted more heavily. In some embodiments, the further away from the object of interest or the center region a correlation coefficient is, the less it is weighted.
The read windows 130 (such as read windows A, B, C, and D of the respective image sensors) can then be shifted based on the weighted correlation coefficients so as to reduce the decorrelation between the overlapping portions of the captured images.
This process can be repeated a number of times, for instance, until the level of decorrelation represented by the captured images falls below a threshold.
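A rough sketch of the weighting described above, assuming the decorrelation measurements are available as a two-dimensional grid of per-block correlation coefficients and the block containing the object of interest is known (the names and the exponential falloff are illustrative assumptions):

```python
import numpy as np

def weighted_decorrelation(coefficients, object_block, falloff=0.2):
    """Combine per-block correlation coefficients into a single score, weighting
    blocks near the object of interest more heavily than distant blocks.

    coefficients: 2-D array of correlation coefficients over the overlap region
                  (1.0 = fully correlated, lower values = more decorrelation)
    object_block: (row, col) of the block containing the object of interest
    falloff:      how quickly block weights decay with distance from that block
    """
    rows, cols = np.indices(coefficients.shape)
    distance = np.hypot(rows - object_block[0], cols - object_block[1])
    weights = np.exp(-falloff * distance)   # heavier weight near the object
    weights /= weights.sum()
    return float((weights * (1.0 - coefficients)).sum())

# Candidate read window shifts can then be scored with this value, keeping the
# shift that produces the lowest weighted decorrelation.
coeffs = np.array([[0.9, 0.8, 0.7, 0.9],
                   [0.9, 0.4, 0.6, 0.9],
                   [0.9, 0.8, 0.7, 0.9]])
print(weighted_decorrelation(coeffs, object_block=(1, 1)))
```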
When capturing the shared view 115, the distance between the centers of the lenses of the cameras 805A and 805B affects the convergence point 820 of the lenses of the cameras 805A and 805B. This distance between the centers of the lenses can be increased and decreased by shifting the read windows 130 of the image sensor windows 140 within the cameras 805A and 805B, as illustrated below.
For example, as the distance 825 increases, as seen in distance 825B, the location of the convergence point 820 changes accordingly.
In various embodiments, the distance between the read windows can increase or decrease from shifting one or both of the read windows of the image sensors of the cameras 805A and 805B. The shift of the read windows can be done automatically based on object detection algorithms or by a manual input from a user of the camera array 800. For example, a user can input a setting or a mode (e.g., landscape, portrait, macro, sports, night, movie) and, based on the input, the read windows are shifted to better capture an image for that mode.
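As a sketch of this mode-driven behavior, the mapping below from capture mode to read window shift is purely hypothetical and only illustrates the flow from user input to read window adjustment:

```python
# Hypothetical mapping from a user-selected capture mode to a horizontal read
# window shift (in pixels). Negative values pull the convergence point toward
# the array (near subjects); positive values push it farther away.
MODE_READ_WINDOW_SHIFT_PX = {
    "macro":     -40,
    "portrait":  -15,
    "sports":     10,
    "landscape":  25,
}

def read_window_shift_for_mode(mode, default_px=0):
    """Look up the read window shift to apply for a given capture mode."""
    return MODE_READ_WINDOW_SHIFT_PX.get(mode, default_px)

print(read_window_shift_for_mode("macro"))      # -40
print(read_window_shift_for_mode("landscape"))  # 25
```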
Image Processing & Taping
The images captured by the camera array can vary in distortion and warp based on the camera in the camera array or the position of the camera in the camera array (e.g., roll, pitch, yaw, etc.). Thus, as seen in image 910A, if the camera is a fish eye camera, the captured image 910A has a fish eye distortion. In addition, the portions of the shared field of view of image 910A are angled at a different orientation than adjacent images 910B and 910C, as each image was captured at different angled fields of view. Since the images 910A, 910B, 910C, and 910D are of a shared view and each image shares a portion of a shared field of view 920 with an adjacent image, common objects, such as objects 930, are visible in the portions of the shared field of view 920. For example, the common object 930AB between 910A and 910B is in the portion of the shared field of view 920AB, the common object 930AC between 910A and 910C is in the portion of the shared field of view 920AC, the common object 930BD between 910B and 910D is in the portion of the shared field of view 920BD, and the common object 930CD between 910C and 910D is in the portion of the shared field of view 920CD. Thus, each image has a first portion representative of an overlapping field of view with a corresponding portion of a horizontally adjacent image and a second portion representative of an overlapping field of view with a corresponding portion of a vertically adjacent image. In the example shown, each common object 930 is warped differently in each adjacent image due to the fish eye distortion.
In general, the distortions corrected are the distortions caused by the alignment step described above.
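A minimal sketch of the taping idea using OpenCV's generic feature-matching and homography utilities; this is a standard stitching recipe offered for illustration, not necessarily the disclosed pipeline, and the fisheye-specific warp correction is omitted:

```python
import cv2
import numpy as np

def tape_pair(image_left, image_right):
    """Warp image_right into image_left's frame using features matched in the
    shared field of view, then paste ("tape") the two onto one canvas."""
    def to_gray(img):
        return cv2.cvtColor(img, cv2.COLOR_BGR2GRAY) if img.ndim == 3 else img

    orb = cv2.ORB_create(nfeatures=2000)
    kp_l, des_l = orb.detectAndCompute(to_gray(image_left), None)
    kp_r, des_r = orb.detectAndCompute(to_gray(image_right), None)

    # Match descriptors; at least four good matches are assumed to exist in
    # the overlapping portion of the shared field of view.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_r, des_l), key=lambda m: m.distance)[:200]
    src = np.float32([kp_r[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_l[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # Homography mapping the right image's overlap onto the left image's.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    # Render the warped right image onto a wider canvas, then overlay the left.
    h, w = image_left.shape[:2]
    canvas = cv2.warpPerspective(image_right, H, (w * 2, h))
    canvas[0:h, 0:w] = image_left
    return canvas
```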
To help reduce or eliminate parallax error, a single lens can be used for multiple cameras in a multiple camera array. In some embodiments, the common lens is a ball lens.
The optical paths of light through the ball lens 1002 and upon each lens module 1000 intersect within the ball lens 1002. Accordingly, the distance D between the centers of the lenses is reduced to zero, effectively eliminating parallax error. It should be noted that in some embodiments, a ball lens 1002 introduces distortion to images captured using a ball lens 1002; in such embodiments, an additional warp function can be applied to each image captured using the ball lens 1002 prior to stitching the images together to reduce the effects of the distortion introduced by the ball lens 1002.
In one embodiment of the multiple camera array, cone lenses 1414 are used as the forward-most lenses in the lens stacks 1410.
It should be noted that the multiple camera arrays described herein can be configured to couple to and act as a remote control for any device configured to wirelessly communicate with the multiple camera array device. For instance, the multiple camera array can be configured to act as a remote control for another camera, a music/video playback device, a storage device, a wireless communication device, a telemetry monitoring device, a calendar control or display device, a slideshow control device, or any other wireless device.
Additional Configuration Considerations
Throughout this specification, some embodiments have used the expression “coupled” along with its derivatives. The term “coupled” as used herein is not necessarily limited to two or more elements being in direct physical or electrical contact. Rather, the term “coupled” may also encompass two or more elements that are not in direct contact with each other, but yet still co-operate or interact with each other, or are structured to provide a thermal conduction path between the elements.
Likewise, as used herein, the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
In addition, the terms “a” or “an” are employed to describe elements and components of the embodiments herein. This is done merely for convenience and to give a general sense of the invention. This description should be read to include one or at least one, and the singular also includes the plural unless it is obvious that it is meant otherwise.
Finally, as used herein any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
Upon reading this disclosure, those of skill in the art will appreciate still additional alternative structural and functional designs as disclosed from the principles herein. Thus, while particular embodiments and applications have been illustrated and described, it is to be understood that the disclosed embodiments are not limited to the precise construction and components disclosed herein. Various modifications, changes and variations, which will be apparent to those skilled in the art, may be made in the arrangement, operation and details of the method and apparatus disclosed herein without departing from the spirit and scope defined in the appended claims.
This application is a continuation of U.S. application Ser. No. 14/921,410, filed Oct. 23, 2015, which is a continuation of U.S. application Ser. No. 14/308,501, filed Jun. 18, 2014, now U.S. Pat. No. 9,196,039, which claims the benefit of U.S. Provisional Application No. 61/973,788, filed Apr. 1, 2014, U.S. Provisional Application No. 61/979,386, filed Apr. 14, 2014, and U.S. Provisional Application No. 61/985,256, filed Apr. 28, 2014, all of which are incorporated by reference herein in their entirety.