The present invention relates to an image capturing apparatus and an image capturing method, in particular, to those that capture images using a high speed image sensor.
In the past, high speed video cameras that can capture images at high speeds have been known. For example, a video camera has accomplished high speed image capturing by reducing the size of each image processed at the high speed rate to ¼ of the standard image size and placing these four images in an image of the regular rate (see Patent Document “Japanese Patent Application Laid-Open Publication No. HEI 8-88833”). Another video camera has accomplished high speed image capturing using a circuit structure that processes data received from a sensor in parallel so as to increase the amount processed per unit time (see Patent Document “Japanese Patent Application Laid-Open Publication No. HEI 8-251492”).
However, the high speed image capturing described in Patent Document “Japanese Patent Application Laid-Open Publication No. HEI 8-88833” or Patent Document “Japanese Patent Application Laid-Open Publication No. HEI 8-251492” was aimed at temporarily storing a captured image in a storage device such as a VTR or a semiconductor memory, reproducing the captured image in slow motion, and analyzing a very high speed motion, and the device itself had a complicated structure and was expensive. Thus, from the viewpoints of portability and power consumption, it was difficult to apply high speed image capturing systems as described in these patent documents to portable image capturing devices that have become widespread as home-use devices, namely so-called camcorders (product names of devices in which a video camera and a recorder are integrated in one unit), digital cameras, and so forth.
With reference to
The image sensor 101 can select a high speed image capturing mode in which the image sensor 101 reads a signal at a first screen rate (also referred to as the frame rate) of 60 fps (fields/second) or more based on the NTSC specifications, or a regular image capturing mode in which the image sensor 101 reads a signal at the regular second screen rate. The screen rate in the high speed image capturing mode is 240 fps, which is four times the regular rate. The image sensor 101 is equipped with a CDS (Correlated Double Sampling) circuit and an A/D converter, and outputs captured image data.
The pre-processing circuit 102 performs an optically correcting process such as a shading correction for captured image data that are output from the image sensor 101 and outputs a digital image signal. The camera signal processing circuit 103 performs a camera signal process such as a white balance adjustment process for the captured image data that are received from the pre-processing circuit 102.
The conversion processing section 104 performs a display decimation and a size adjustment to convert an image signal received from the camera signal processing circuit 103 into an image signal having a screen rate and a screen size suitable for display on the display section 112. The display decimation is performed only when an image signal received from the camera signal processing circuit 103 is output to the display processing circuit 108. The display decimation reduces the number of fields per unit time of the image signal captured by the image capturing apparatus 100 in the high speed image capturing mode to the number of fields per unit time defined in the display standard of the display device (60 fps in this case).
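As a rough illustration of this display decimation (a sketch only; the 240 fps capture rate and 60 fps display rate come from the description above, while the field labels and list representation are assumptions for the example), keeping one field out of every four is enough to match the display standard:

```python
# Minimal sketch of display decimation: keep 1 field out of every N so that a
# 240 fps capture stream is reduced to the 60 fps expected by the display.
def decimate_for_display(fields, capture_fps=240, display_fps=60):
    """Return only the fields needed to satisfy the display rate."""
    step = capture_fps // display_fps           # 240 / 60 -> keep every 4th field
    return fields[::step]

captured = [f"field_{i}" for i in range(16)]    # 16 fields captured at 240 fps
shown = decimate_for_display(captured)          # 4 fields remain for the 60 fps display
print(shown)                                    # ['field_0', 'field_4', 'field_8', 'field_12']
```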
The compression and decompression circuit 105 performs a compression-encoding process for captured image data received from the conversion processing section 104 according to a still image encoding system, for example, JPEG (Joint Photographic Experts Group) or the like. In addition, the compression and decompression circuit 105 performs a decompression-decoding process for encoded data of a still image supplied from the memory control circuit 106. The memory control circuit 106 controls writing and reading image data to and from the memory 107. The memory 107 is a FIFO (First In First Out) type buffer memory that temporarily stores image data received from the memory control circuit 106; for example, an SDRAM (Synchronous Dynamic Random Access Memory) or the like is used for the memory 107.
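The first-in, first-out behavior of the memory 107 can be pictured with the short sketch below; the encode_still helper is only a placeholder standing in for the still image compression-encoding of the compression and decompression circuit 105, not its actual interface.

```python
from collections import deque

def encode_still(frame):
    # Placeholder for the still image compression-encoding (e.g. JPEG).
    return f"encoded({frame})"

fifo = deque()                                  # memory 107 behaves first in, first out

# Write side: frames arriving at the high speed rate are buffered in order.
for frame in ["f0", "f1", "f2", "f3"]:
    fifo.append(encode_still(frame))

# Read side: encoded data are later drained in the same order, e.g. at 60 fps.
while fifo:
    print(fifo.popleft())                       # encoded(f0), encoded(f1), ...
```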
The display processing circuit 108 generates an image signal to be displayed on the display section 112 from an image signal received from the conversion processing section 104 or the compression and decompression circuit 109, supplies the signal to the display section 112, and causes it to display an image. The display section 112 is composed, for example, of an LCD (Liquid Crystal Display) and displays a camera-through image that is being captured or a reproduced image of data that have been recorded in the recording device 111.
The compression and decompression circuit 109 performs a compression-encoding process according to a moving image encoding system, for example, MPEG (Moving Picture Experts Group) or the like for image data received from the conversion processing section 104. In addition, the compression and decompression circuit 109 performs a decompression-decoding process for encoded data of a moving image supplied from the recording device 111 and outputs the resultant data to the display processing circuit 108. The display section 112 displays a moving image received from the display processing circuit 108.
The control section 113 is a microcomputer composed, for example, of a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and so forth and totally controls each section of the image capturing apparatus by executing programs stored in the ROM and so forth.
In the image capturing apparatus shown in
When receiving a record request of a high speed captured image from the control section 113 according to the user's operation, the conversion processing section 104 sends an image signal of 240 fps to the compression and decompression circuit 105. If necessary, the conversion processing section 104 reduces the size of the image signal received from the camera signal processing circuit 103 and sends the image signal to the compression and decompression circuit 105.
The compression and decompression circuit 105 compression-encodes the image signal received from the conversion processing section 104 according to the JPEG format. The memory control circuit 106 temporarily stores encoded data received from the compression and decompression circuit 105 into the memory 107. In such a manner, image data for a predetermined period are stored in the memory 107.
When receiving a read request for encoded data stored in the memory 107 from the control section 113, the memory control circuit 106 reads the encoded data stored in the memory 107 at 60 fps and sends the encoded data to the compression and decompression circuit 105. The compression and decompression circuit 105 decompression-decodes the encoded data received from the memory control circuit 106 and sends the decoded data to the conversion processing section 104. When receiving a record request for the recording device 111 from the control section 113, the conversion processing section 104 sends the image signal received from the compression and decompression circuit 105 to the compression and decompression circuit 109. The compression and decompression circuit 109 compresses the image signal received from the conversion processing section 104 according to the MPEG format and stores the compression-encoded signal to the recording device 111 through the recording device control circuit 110. The conversion processing section 104 adjusts the size of the image signal of 60 fps received from the compression and decompression circuit 105, sends the resultant image signal to the display processing circuit 108, and causes the display section 112 to display a reproduced image.
In the foregoing proposed image capturing apparatus shown in
Therefore, an object of the present invention is to provide an image capturing apparatus and an image capturing method that do not need to cause a camera signal processing circuit to perform a process at a high speed screen rate and that are easy to operate, are produced at low cost, and thereby have excellent portability.
To solve the foregoing problems, the present invention is an image capturing apparatus, comprising:
The present invention is an image capturing apparatus, comprising:
The present invention is an image capturing apparatus, comprising:
The present invention is an image capturing method, comprising:
The present invention is an image capturing method, comprising:
The present invention is an image capturing method, comprising:
In the image capturing apparatus according to the present invention, although the screen rate of the image capturing device is high, the camera signal processing circuit always needs to satisfy only a screen rate according to the display performance of the display section. Thus, it is not necessary to perform high frequency driving and parallel processing only for high speed image capturing, and thereby the power consumption and the circuit scale can be reduced.
Next, with reference to accompanying drawings, a first embodiment of the present invention will be described. As shown in
The image sensor 101 converts incident light of an object captured through an optical system (including a lens, an infrared suppression filter, an optical low-pass filter, and so forth) into an electric signal according to photoelectric conversion. As the image sensor 101, for example, a CMOS (Complementary Metal Oxide Semiconductor) type image capturing device is used. In the CMOS type image capturing device, photo diodes, line-column selection MOS transistors, signal wires, and so forth are two-dimensionally arranged, together with a vertical scanning circuit, a horizontal scanning circuit, a noise reduction circuit, a timing generation circuit, and so forth. As the image sensor 101, a CCD (Charge Coupled Device) that can capture images at high speeds may be used.
The image sensor 101 can be switched between a high speed image capturing mode in which a signal is read at a first screen rate (also referred to as the frame rate) higher than the regular screen rate (60 fps (fields/second), which is based on the specifications of the NTSC system) and a regular image capturing mode in which a signal is read at a second screen rate that is the regular screen rate. The screen rate of the high speed image capturing mode is assumed to be 240 fps, which is four times the regular rate. The image sensor 101 is internally equipped with a CDS (Correlated Double Sampling) circuit, an A/D converter, and so forth and outputs a digital captured image signal corresponding to the pixel matrix of the image sensor 101.
The image sensor 101 uses three image capturing devices that output captured image signals, for example, of the three primary colors, and reads one output line out of every four output lines of each image capturing device to accomplish a screen rate of 240 fps, which is four times the regular screen rate (60 fps). Assuming that the number of pixels of one frame at the regular screen rate is, for example, 6.4 million pixels, the number of pixels in the high speed image capturing mode is 1.6 million pixels.
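The arithmetic behind these figures can be checked directly; the sketch below uses only the 6.4 million pixel frame size and the one-in-four line readout stated above:

```python
regular_fps = 60
high_speed_fps = 240
pixels_per_frame_regular = 6_400_000

line_decimation = high_speed_fps // regular_fps              # read 1 of every 4 lines
pixels_per_frame_high = pixels_per_frame_regular // line_decimation
print(pixels_per_frame_high)                                 # 1,600,000 pixels

# The total pixel rate delivered by the sensor is the same in both modes.
assert pixels_per_frame_regular * regular_fps == pixels_per_frame_high * high_speed_fps
```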
The color filter array shown in
On the other hand, at the high screen rate, horizontal scanning lines are decimated and read at a rate of one out of every four horizontal scanning lines. To deal with the high screen rate, as shown in
The image capturing apparatus 200 includes a conversion processing section 201, a pre-processing circuit 202, a camera signal processing circuit 203, a display processing circuit 208, a recording device control circuit 210, a recording device 111, a display section 112, and a control section 213.
The conversion processing section 201 performs signal shunting and display decimation for a digital image signal received from the image sensor 101. The display decimation is performed only when a signal is output to the display processing circuit 208. The display decimation reduces the number of fields per unit time of the image signal captured in the high speed image capturing mode of the image capturing apparatus 200 to the number of fields per unit time defined in the display standard (in this case, 60 fps).
The pre-processing circuit 202 performs an optically correcting process such as a shading correction for a digital image signal that is output from the image sensor 101 and outputs a resultant digital image signal. The camera signal processing circuit 203 performs a camera signal process such as a white balance adjustment process (also referred to as a development process, an image creation process, or the like) for the image signal received from the pre-processing circuit 202. An output signal of the camera signal processing circuit 203 is supplied to the display processing circuit 208.
The display processing circuit 208 generates an image signal to be displayed on the display section 112 from the image signal received from the camera signal processing circuit 203 and supplies the resultant signal to the display section 112 to cause it to display an image. The display section 112 is composed, for example, of an LCD (Liquid Crystal Display) and displays a camera-through image that is being captured, a reproduced image of data recorded on the recording device 111, and so forth. The display section 112 may be disposed outside the image capturing apparatus 200 and it may be provided with an interface for an external output instead of the display section 112.
The recording device control circuit 210 connected to the conversion processing section 201 controls writing and reading image data to and from the recording device 111. Data stored in the recording device 111 are captured image data that have not been processed by the foregoing pre-processing circuit 202 and camera signal processing circuit 203 and are referred to as raw data in this specification.
As the recording device 111, a magnetic tape, a semiconductor memory such as a flash memory, a hard disk, or the like can be used. As the recording device 111, a non-attachable/detachable type is basically used. However, the recording device 111 may be attachable/detachable such that raw data can be retrieved to the outside. When raw data are retrieved to the outside, raw data that have been processed in the pre-processing circuit 202 are preferably retrieved. The camera signal process is performed, for example, according to software of an external personal computer.
The control section 213 is a microcomputer composed, for example, of a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and so forth and totally controls each section of the image capturing apparatus by executing programs stored in the ROM and so forth.
An output image signal of the size adjustment section 222 of the conversion processing section 201 is supplied to a shading correction circuit 231 of the pre-processing circuit 202. The shading correction circuit 231 corrects the brightness in the peripheral area of the screen so that it does not become dark. An output signal of the shading correction circuit 231 is supplied to the camera signal processing circuit 203.
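As a hedged sketch of what such a shading correction can look like (the quadratic radial gain model and its strength are assumptions for illustration; the description only states that the peripheral brightness is corrected):

```python
import numpy as np

def shading_correct(image, strength=0.3):
    """Apply a radial gain to a single-channel image so the periphery is brightened (sketch)."""
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    # Normalised squared distance from the image centre (0 at centre, 1 at the corners).
    r2 = ((yy - cy) ** 2 + (xx - cx) ** 2) / (cy ** 2 + cx ** 2)
    gain = 1.0 + strength * r2                  # larger gain towards the periphery
    return np.clip(image * gain, 0, 255).astype(image.dtype)

frame = np.full((480, 640), 128, dtype=np.uint8)
corrected = shading_correct(frame)
```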
The camera signal processing circuit 203 is composed, for example, of a simultaneously forming circuit 241, a white balance correction section 242, an aperture correction section 243, a gamma correction section 244, and a YC generation section 245 arranged in the order from the input side. However, the structure of the camera signal processing circuit 203 is not limited to that shown in
The simultaneously forming circuit 241 interpolates missing pixels of each color component. The simultaneously forming circuit 241 outputs three primary color signals (R, G, B) in parallel. Output signals of the simultaneously forming circuit 241 are supplied to the white balance correction section 242. The white balance correction section 242 corrects color imbalance caused by the color temperature environment of the object and by differences in the sensitivities of the color filters of the sensor.
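A minimal sketch of a white balance correction follows; the "grey world" way of estimating the per-channel gains is an assumption used only for illustration, since the specification does not state how the gains are derived.

```python
import numpy as np

def white_balance(rgb):
    """Scale each colour channel so a neutral object yields equal R, G and B (sketch)."""
    means = rgb.reshape(-1, 3).mean(axis=0)     # average level of each channel
    gains = means.mean() / means                # push every channel towards the common mean
    return np.clip(rgb * gains, 0, 255).astype(rgb.dtype)

frame = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
balanced = white_balance(frame)
```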
An output of the white balance correction section 242 is supplied to the aperture correction section 243. The aperture correction section 243 performs a contour correction that extracts portions where the signal changes sharply and emphasizes those portions. An output signal of the aperture correction section 243 is supplied to the gamma correction section 244.
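A simple way to picture the contour (aperture) correction is an unsharp-mask style operation: a high-pass detail signal is extracted and added back. The 3-tap horizontal kernel and the gain below are assumptions for the sketch, not values from the specification.

```python
import numpy as np

def aperture_correct(luma, gain=0.5):
    """Emphasise portions where the signal changes sharply (sketch only)."""
    padded = np.pad(luma.astype(np.float32), 1, mode="edge")
    # Horizontal second difference as a crude detector of large signal changes.
    detail = 2 * padded[1:-1, 1:-1] - padded[1:-1, :-2] - padded[1:-1, 2:]
    return np.clip(luma + gain * detail, 0, 255).astype(luma.dtype)

frame = np.tile(np.linspace(0, 255, 640, dtype=np.uint8), (480, 1))
sharpened = aperture_correct(frame)
```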
The gamma correction section 244 corrects input and output characteristics such that the gradation is correctly reproduced when a captured image signal is output to the display section 112. An output signal of the gamma correction section 244 is supplied to the YC generation section 245.
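A hedged sketch of such a gamma correction follows; the exponent 1/2.2 is a common display-oriented value used here as an assumption, not a figure taken from the specification.

```python
import numpy as np

def gamma_correct(channel, gamma=2.2):
    """Remap 8-bit code values through a power law so gradation displays correctly (sketch)."""
    normalised = channel.astype(np.float32) / 255.0
    return (255.0 * normalised ** (1.0 / gamma)).astype(np.uint8)

ramp = np.arange(256, dtype=np.uint8)
corrected = gamma_correct(ramp)
```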
The YC generation section 245 generates a luminance signal (Y) and a color difference signal (C). The luminance signal is generated by combining the gamma-corrected RGB signals at a predetermined composition ratio. Likewise, the color difference signal is generated by combining the gamma-corrected RGB signals at a predetermined composition ratio. The generated luminance signal and color difference signal are supplied to the display section 112 through the display processing circuit 208.
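As one familiar example of a "predetermined composition ratio" (the specification does not fix the values), the ITU-R BT.601 weights can be used to form the luminance and colour difference signals from the gamma-corrected RGB:

```python
import numpy as np

def rgb_to_ycc(rgb):
    """Form Y and colour difference signals as weighted sums of R, G and B (BT.601 example)."""
    r, g, b = (rgb[..., i].astype(np.float32) for i in range(3))
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 0.564 * (b - y) + 128.0                # blue colour difference, offset to mid-scale
    cr = 0.713 * (r - y) + 128.0                # red colour difference, offset to mid-scale
    return np.stack([y, cb, cr], axis=-1)

frame = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
ycc = rgb_to_ycc(frame)
```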
In the image capturing apparatus shown in
The decimation section 221 of the conversion processing section 201 decimates the raw data to ¼ so that raw data of 60 fps are obtained. The raw data that have been decimated and size-converted are supplied to the camera signal processing circuit 203 through the pre-processing circuit 202. A signal for which the camera signal process has been performed in the camera signal processing circuit 203 is supplied to the display section 112 through the display processing circuit 208, and an image that is being captured is displayed on the display section 112.
In the image capturing apparatus shown in
An output signal of the pre-processing circuit 202 is supplied to the display section 112 through the camera signal processing circuit 203 and the display processing circuit 208, and a reproduced image is displayed by the display section 112. For example, when only the screen rate has been changed, the reproduced image becomes a slow motion reproduced image, the time axis of which is expanded to four times that of the recording state. Instead, images may be captured while changing the image capturing conditions (exposure conditions and so forth), and when they are reproduced, the four types of captured images (either still images or moving images) may be compared. Instead, a signal that is read from the recording device 111 may be decimated so as to obtain a frame-by-frame reproduction image.
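The factor of four follows directly from the two rates; the one-second capture duration below is an arbitrary example:

```python
capture_fps, playback_fps = 240, 60
capture_seconds = 1.0
frames = capture_fps * capture_seconds          # frames recorded during the burst
playback_seconds = frames / playback_fps        # time needed to show them at 60 fps
print(playback_seconds / capture_seconds)       # 4.0 -> the time axis is expanded four times
```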
Next, with reference to
As shown in
Next, with reference to
As shown in
Next, with reference to
In the image capturing apparatus 500, raw data received from the conversion processing section 201 are supplied to the memory 502 through a memory control circuit 501. The memory control circuit 501 controls writing and reading image data to and from the memory 502. The memory 502 is a FIFO (First In First Out) type memory that temporarily stores image data received from the memory control circuit 501; for example, an SDRAM (Synchronous Dynamic Random Access Memory) or the like is used. The memory 502 performs buffering corresponding to the throughputs of the pre-processing circuit 202 and the camera signal processing circuit 203.
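A rough sizing sketch for this buffering role is given below; the burst length, the 1.6 million pixel frame size carried over from the earlier readout description, and the 2 bytes per raw pixel are all assumptions used only to show the arithmetic.

```python
capture_fps = 240                     # raw data arrive at the high speed rate
drain_fps = 60                        # downstream circuits process at the regular rate
burst_seconds = 1.0                   # length of a high speed burst (example)
pixels_per_frame = 1_600_000
bytes_per_pixel = 2                   # e.g. raw data stored at 16 bits per pixel

# Worst case: nothing is drained until the burst ends, so the whole burst must fit.
whole_burst = capture_fps * burst_seconds * pixels_per_frame * bytes_per_pixel
print(whole_burst / 1e6, "MB")        # 768.0 MB

# If the circuits drain concurrently at the regular rate, the backlog grows more slowly.
backlog = (capture_fps - drain_fps) * burst_seconds * pixels_per_frame * bytes_per_pixel
print(backlog / 1e6, "MB")            # 576.0 MB
```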
An output signal of the camera signal processing circuit 203 is supplied to a terminal r of switch SW4. A terminal p of the switch SW4 and a terminal p of a switch SW3 are connected in common. The switch SW4 is connected to a recording device control circuit 504. Connected to the recording device control circuit 504 is a recording device 505.
The recording device control circuit 504 controls writing and reading image data to and from the recording device 505 through the switch SW4. Data stored in the recording device 505 are a luminance signal and a color difference signal processed by the pre-processing circuit 202 and the camera signal processing circuit 203. The recording device 505 may be a magnetic tape, a semiconductor memory such as a flash memory, a recordable optical disc, a hard disk, or the like. The recording device 505 is basically an attachable/detachable type. Instead, the recording device 505 may be a non-attachable/detachable type, and recorded data may be output to the outside through a communication interface.
In a recording pause period in the high speed image capturing mode, for example, in a recording standby state where the hand is released from the record button, raw data are read from the memory 502 and processed by the pre-processing circuit 202 and the camera signal processing circuit 203, and the output signal of the camera signal processing circuit 203 is recorded on the recording device 505 through the terminal r of the switch SW4 and the recording device control circuit 504. The raw data are read from the memory 502 at the regular screen rate of 60 fps or a lower rate.
As shown in
The present invention is not limited to the foregoing embodiments. Instead, various modifications of the embodiments can be performed based on the spirit of the present invention. For example, data stored in the recording device 111 of the image capturing apparatus 400 (
In addition, the present invention can be applied to devices having an image capturing function, such as a mobile phone and a PDA (Personal Digital Assistant), as well as to a camcorder and a digital still camera. In addition, the present invention can be applied to a processing device and a recording device for a captured image signal of a small camera for a television phone or a game software application connected to a personal computer or the like.
Number | Date | Country | Kind |
---|---|---|---|
2006-293836 | Oct 2006 | JP | national |
The present application claims the benefit under 35 U.S.C. § 120 as a continuation application of U.S. application Ser. No. 16/391,771, filed on Apr. 23, 2019, now U.S. Pat. No. 10,708,563, which claims the benefit under 35 U.S.C. § 120 as a continuation application of U.S. application Ser. No. 15/855,060, filed on Dec. 27, 2017, now U.S. Pat. No. 10,313,648, which claims the benefit under 35 U.S.C. § 120 as a continuation application of U.S. application Ser. No. 15/581,208, filed Apr. 28, 2017, now U.S. Pat. No. 9,866,811, which claims the benefit under 35 U.S.C. § 120 as a continuation application of U.S. application Ser. No. 15/001,442, filed Jan. 20, 2016, now U.S. Pat. No. 9,661,291, which claims the benefit under 35 U.S.C. § 120 as a continuation application of U.S. application Ser. No. 14/662,291, filed Mar. 19, 2015, which claims the benefit under 35 U.S.C. § 120 as a continuation application of U.S. application Ser. No. 14/462,615, filed Aug. 19, 2014, now U.S. Pat. No. 9,025,929, which claims the benefit under 35 U.S.C. § 120 as a continuation application of U.S. application Ser. No. 12/447,480, filed Apr. 28, 2009, now U.S. Pat. No. 8,849,090, issued Sep. 30, 2014, which claims the benefit under 35 U.S.C. § 371 as a U.S. National Stage Entry of International Application No. PCT/JP2007/070389, filed in the Japanese Patent Office as a Receiving Office on Oct. 12, 2007, which claims priority to Japanese Patent Application Number 2006-293836, filed in the Japanese Patent Office on Oct. 30, 2006, each of which applications is hereby incorporated by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
4496995 | Colles et al. | Jan 1985 | A |
5196938 | Blessinger | Mar 1993 | A |
5442718 | Kobayashi et al. | Aug 1995 | A |
5557424 | Panizza | Sep 1996 | A |
5568192 | Hannah | Oct 1996 | A |
5625412 | Aciu et al. | Apr 1997 | A |
5751350 | Tanaka | May 1998 | A |
5786851 | Kondo et al. | Jul 1998 | A |
5856845 | Murata et al. | Jan 1999 | A |
6453117 | Ito et al. | Sep 2002 | B1 |
7058280 | Suzuki | Jun 2006 | B2 |
7310118 | Kamimura | Dec 2007 | B2 |
7557833 | Okawa | Jul 2009 | B2 |
7593037 | Matsumoto | Sep 2009 | B2 |
RE43462 | Washino et al. | Jun 2012 | E |
8849090 | Kosakai et al. | Sep 2014 | B2 |
9025929 | Kosakai et al. | May 2015 | B2 |
9538153 | Kosakai et al. | Jan 2017 | B1 |
9661291 | Kosakai | May 2017 | B2 |
9866811 | Kosakai | Jan 2018 | B2 |
10313648 | Kosakai | Jun 2019 | B2 |
10708563 | Kosakai | Jul 2020 | B2 |
20030011747 | Lenz | Jan 2003 | A1 |
20030210338 | Matsuoka et al. | Nov 2003 | A1 |
20040136689 | Oka | Jul 2004 | A1 |
20040151471 | Ogikubo | Aug 2004 | A1 |
20040151479 | Ogikubo | Aug 2004 | A1 |
20040164936 | Lim | Aug 2004 | A1 |
20040204849 | Shipley et al. | Oct 2004 | A1 |
20050036055 | Nakasuji et al. | Feb 2005 | A1 |
20050068424 | Kaneko et al. | Mar 2005 | A1 |
20050088550 | Mitsunaga | Apr 2005 | A1 |
20050104978 | Shinotsuka | May 2005 | A1 |
20050158025 | Shinkai | Jul 2005 | A1 |
20050163492 | Ueda et al. | Jul 2005 | A1 |
20050243180 | Yokonuma | Nov 2005 | A1 |
20060013507 | Kaneko et al. | Jan 2006 | A1 |
20060061666 | Kaneko et al. | Mar 2006 | A1 |
20060147187 | Takemoto et al. | Jul 2006 | A1 |
20060232688 | Suzuki et al. | Oct 2006 | A1 |
20070046785 | Matsumoto et al. | Mar 2007 | A1 |
20070230940 | Takita | Oct 2007 | A1 |
20090189994 | Shimonaka | Jul 2009 | A1 |
20100061707 | Kosakai et al. | Mar 2010 | A1 |
20140355948 | Kosakai et al. | Dec 2014 | A1 |
20150201126 | Kosakai et al. | Jul 2015 | A1 |
20160142694 | Kosakai et al. | May 2016 | A1 |
20170019650 | Kosakai et al. | Jan 2017 | A1 |
20170230629 | Kosakai et al. | Aug 2017 | A1 |
20180124369 | Kosakai et al. | May 2018 | A1 |
20190281268 | Kosakai et al. | Sep 2019 | A1 |
Number | Date | Country |
---|---|---|
1499116 | Jan 2005 | EP |
2 240 446 | Jul 1991 | GB |
07-095507 | Apr 1995 | JP |
07-298112 | Nov 1995 | JP |
08-088833 | Apr 1996 | JP |
08-251492 | Sep 1996 | JP |
09-224221 | Aug 1997 | JP |
2000-050205 | Feb 2000 | JP |
2001-045427 | Feb 2001 | JP |
2001-103356 | Apr 2001 | JP |
2001-112012 | Apr 2001 | JP |
2001-036848 | Sep 2001 | JP |
2002-320203 | Oct 2002 | JP |
2004-120384 | Apr 2004 | JP |
2004-221955 | Aug 2004 | JP |
2004-242267 | Aug 2004 | JP |
2005-006198 | Jan 2005 | JP |
2005-039708 | Feb 2005 | JP |
2005-039709 | Feb 2005 | JP |
2005-295423 | Oct 2005 | JP |
2006-094145 | Apr 2006 | JP |
2006-121479 | May 2006 | JP |
2006-157149 | Jun 2006 | JP |
2006-157152 | Jun 2006 | JP |
2006-180315 | Jul 2006 | JP |
2006-295851 | Oct 2006 | JP |
2007-511992 | May 2007 | JP |
WO 97030548 | Aug 1997 | WO |
WO 0221828 | Mar 2002 | WO |
WO 2006067909 | Jun 2006 | WO |
Entry |
---|
International Search Report and Written Opinion and English translation thereof dated Dec. 11, 2007 in connection with International Application No. PCT/JP2007/070389. |
International Preliminary Report on Patentability and English translation thereof dated May 5, 2009 in connection with International Application No. PCT/JP2007/070389. |
Japanese Office Action and English translation thereof dated Jul. 13, 2010 in connection with Japanese Application No. 2006-0293836. |
Supplemental European Search Report dated Mar. 4, 2011 in connection with European Application No. 07830123.1. |
European Office Action dated Mar. 10, 2011 in connection with European Application No. 07830123.1. |
European Office Action dated Nov. 7, 2011 in connection with European Application No. 07830123.1. |
European Office Action dated Nov. 3, 2011 in connection with European Application No. 11181543.7. |
European Office Action dated Jul. 3, 2012 in connection with European Application No. 11181543.7. |
European Office Action dated Jan. 16, 2013 in connection with European Application No. 07830123.1. |
European Office Action dated Jan. 16, 2013 in connection with European Application No. 11181543.7. |
European Office Action dated Jul. 1, 2013 in connection with European Application No. 11181543.7. |
European Office Action dated Oct. 16, 2013 in connection with European Application No. 11181543.7. |
Setarehdan et al., Automatic Cardiac LV Boundary Detection and Tracking Using Hybrid Fuzzy Temporal and Fuzzy Multiscale Edge Detection. IEEE Transactions on Biomedical Engineering. 1999;46(11):1364-1378. |
U.S. Appl. No. 12/447,480, filed Apr. 28, 2009, Kosakai et al. |
U.S. Appl. No. 14/462,615, filed Aug. 19, 2014, Kosakai et al. |
U.S. Appl. No. 14/662,291, filed Mar. 19, 2015, Kosakai et al. |
U.S. Appl. No. 15/001,442, filed Jan. 20, 2016, Kosakai et al. |
U.S. Appl. No. 15/266,287, filed Sep. 15, 2016, Kosakai et al. |
U.S. Appl. No. 15/581,208, filed Apr. 28, 2017, Kosakai et al. |
U.S. Appl. No. 15/855,060, filed Dec. 27, 2017, Kosakai et al. |
U.S. Appl. No. 16/391,771, filed Apr. 23, 2019, Kosakai et al. |
Number | Date | Country | |
---|---|---|---|
20200296342 A1 | Sep 2020 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 16391771 | Apr 2019 | US |
Child | 16888224 | US | |
Parent | 15855060 | Dec 2017 | US |
Child | 16391771 | US | |
Parent | 15581208 | Apr 2017 | US |
Child | 15855060 | US | |
Parent | 15001442 | Jan 2016 | US |
Child | 15581208 | US | |
Parent | 14662291 | Mar 2015 | US |
Child | 15001442 | US | |
Parent | 14462615 | Aug 2014 | US |
Child | 14662291 | US | |
Parent | 12447480 | US | |
Child | 14462615 | US |