Mobile computing device with improved image preview functionality

Information

  • Patent Grant
  • Patent Number
    8,988,578
  • Date Filed
    Friday, February 3, 2012
  • Date Issued
    Tuesday, March 24, 2015
Abstract
A mobile computing device can comprise a microprocessor, a display, at least one motion sensor, and an imaging device including a two-dimensional image sensor and an imaging lens configured to focus an image of a target object on the image sensor. The mobile computing device can be configured to periodically display a preview image frame of the target object. The mobile computing device can be further configured to compensate for a movement of the imaging device relative to the target object during a time period elapsed between taking and displaying the preview image frame, by transforming the preview image frame based on the device movement detected by the motion sensor.
Description
FIELD OF THE INVENTION

This invention relates generally to mobile computing devices equipped with digital cameras, and, more specifically, to mobile computing devices equipped with digital cameras providing image preview functionality.


BACKGROUND OF THE INVENTION

Mobile computing devices equipped with digital imaging devices are widely used for many imaging applications. A common type of digital imaging device includes a lens configured to focus an image of the target object onto a two-dimensional image sensor, which is often provided by a complementary metal-oxide semiconductor (CMOS) image sensor that converts light signals into electric signals. A preview image can be displayed on the screen of the mobile computing device to facilitate aiming of the imaging device.


SUMMARY OF THE INVENTION

In one embodiment, there is provided a mobile computing device comprising a microprocessor, a display, at least one motion sensor, and an imaging device including a two-dimensional image sensor and an imaging lens configured to focus an image of a target object on the image sensor. The mobile computing device can be configured to periodically display a preview image frame of the target object. The mobile computing device can be further configured to compensate for a movement of the imaging device relative to the target object during a time period elapsed between taking and displaying the preview image frame, by transforming the preview image frame based on the device movement detected by the motion sensor.


In a further aspect, at least one motion sensor can be provided by an accelerometer.


In a further aspect, at least one motion sensor can be provided by a gyroscope.


In a further aspect, the imaging device can be configured to measure the distance between the imaging lens and the target object.


In a further aspect, the transformation of the preview image frame can comprise a pixel shift.


In a further aspect, the transformation of the preview image frame can comprise rotation of the preview image.


In a further aspect, the transformation of the preview image frame can comprise scaling of the preview image.


In a further aspect, the mobile computing device can be further configured to introduce a colored polygon to the preview image frame to minimize a perceived visual disturbance caused by the mobile computing device movement.


In a further aspect, the polygon can be colored using a color similar to a color of an area of the preview image frame closest to the polygon.





BRIEF DESCRIPTION OF THE DRAWINGS

The features described herein can be better understood with reference to the drawings described below. The drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the invention. In the drawings, like numerals are used to indicate like parts throughout the various views.



FIGS. 1a-1c schematically illustrate one embodiment of a mobile computing device;



FIG. 2 schematically illustrates a component diagram of one embodiment of a mobile computing device;



FIG. 3 schematically illustrates one embodiment of a method of transforming a preview image frame;



FIGS. 4-5 schematically illustrate transforming a preview image frame to compensate for the imaging device movement using the above described method.





DETAILED DESCRIPTION OF THE INVENTION

In one embodiment, there is provided a mobile computing device comprising an imaging device having a two-dimensional image sensor and an imaging lens configured to focus an image of a target object on the image sensor. A “mobile computing device” herein shall refer to a portable programmable device for data processing, including a central processing unit (CPU), a memory, and at least one communication interface. A mobile computing device can be provided, e.g., by a personal digital assistant (PDA), a portable data terminal (PDT), or a smart phone.


The mobile computing device can further comprise a display that can be employed, inter alia, for periodically displaying a preview image of the target object. In a further aspect, the frequency of preview frames can be high enough to provide a “quasi real-time” preview, thus facilitating aiming the mobile computing device in the direction of the target object.


However, a noticeable time lag between capturing and displaying a frame can result from the time needed to capture each preview frame and to perform post-processing of the preview frame. As the lag grows, it becomes increasingly difficult for the user to accurately aim the mobile computing device in the direction of the target object, which in turn increases the time needed to capture an image, thus increasing the power consumption of the mobile computing device and reducing the useful battery life.


In one embodiment, the mobile computing device can be configured to compensate for the device movement relative to the target object during the time period elapsed between taking and displaying a preview image frame. The mobile computing device can be configured to transform the preview image frame based on the device movement detected by a motion sensor (e.g., an accelerometer and/or a gyroscope). In a further aspect, the preview image can be shifted, rotated, and/or scaled in order to compensate for the device movement.


Preview image frame transformation to compensate for device movement can be considered the opposite of image stabilization: rather than holding the image steady, the transformation intentionally moves the image as the imaging device moves, so that the preview display shows where the imaging device is pointing at the time the user sees the preview image, rather than where the imaging device was pointing at the time the image was captured.
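
A minimal sketch of the compensation loop implied above follows. The camera, motion_sensor, and display objects and the transform callable are hypothetical stand-ins for platform-specific APIs, not the patented implementation:

```python
# Each frame is transformed by the device motion accumulated between its
# capture and its display, so the preview tracks where the camera points now.
def preview_loop(camera, motion_sensor, display, transform):
    while True:
        frame, t_captured = camera.capture()   # sensor readout + post-processing
        # Net device motion accumulated since capture: ((Tx, Ty, Tz), (Rx, Ry, Rz)).
        translation, rotation = motion_sensor.net_motion(since=t_captured)
        display.show(transform(frame, translation, rotation))
```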


One embodiment of mobile computing device 100 is shown in FIGS. 1a (front panel view), 1b (side panel view), and 1c (bottom panel view). Mobile computing device 100 can comprise housing 52 within which other components of mobile computing device 100 can be disposed. LCD screen display with touch screen sensor 54 can be disposed on the front panel 56. Also disposed on front panel 56 can be LED-based indicators 58 and 62, and keyboard 64 including functional and navigation keys 68, 72. Imaging window 74 can be disposed on the top panel of housing 52. Disposed on the side panel (best viewed in FIG. 1b) can be infra-red communication port 76, access door to a secure digital (SD) memory interface 78, audio jack 80, and hand strap 82. Disposed on the bottom panel (best viewed in FIG. 1c) can be multi-pin mechanical connector 84 and hand strap clip 86.


The mobile computing device 100 can be used, for example, for bar code reading and decoding in point-of-sale (POS) and other applications. A skilled artisan would appreciate the fact that other uses of mobile computing device 100 are within the scope of this disclosure.



FIG. 2 schematically illustrates a component diagram of one embodiment of mobile computing device 100. The imaging device 98 can comprise a multiple pixel image sensor assembly 107. The image sensor assembly 107 can include an image sensor 102 comprising a multiple pixel image sensor array 104 having pixels arranged in rows and columns of pixels, column circuitry 106, and row circuitry 108. Associated with the image sensor 102 can be amplifier circuitry 110, and an A/D converter 112 which can convert image information in the form of analog signals read out of multiple pixel image sensor 104 into image information in the form of digital signals. Image sensor 102 can also have an associated timing and control circuit 114 for use in controlling, e.g., the exposure period of image sensor 102, and/or gain applied to the amplifier 110. The noted circuit components 102, 110, 112, and 114 can be packaged into a common image sensor integrated circuit 116.


In the course of operation of the image sensor assembly 107, image signals can be read out of image sensor 102, converted and stored into a system memory such as RAM 120. A memory 122 of image sensor assembly 107 can include RAM 120, a nonvolatile memory such as EPROM 124, and a storage memory device 126 such as may be provided by a flash memory or a hard drive memory. In one embodiment, image sensor assembly 107 can include microprocessor 118 which can be adapted to read out image data stored in memory 122 and subject such image data to various image processing algorithms. Image sensor assembly 107 can include a direct memory access unit (DMA) 128 for routing image information read out from image sensor 102 that has been subject to conversion to RAM 120. In another embodiment, image sensor assembly 107 can employ a system bus providing a bus arbitration mechanism (e.g., a PCI bus), thus eliminating the need for a central DMA controller. A skilled artisan would appreciate that other embodiments of the system bus architecture and/or direct memory access components providing for efficient data transfer between the image sensor 102 and RAM 120 are within the scope of this disclosure.


In a further aspect, the image sensor assembly 107 can include an imaging lens assembly 130 for focusing an image of the decodable indicia 30 onto image sensor 102. Imaging light rays can be transmitted about an optical axis 132. Lens assembly 130 can be controlled with use of lens assembly control circuit 144. Lens assembly control circuit 144 can send signals to lens assembly 130, e.g., for changing a focal length and/or a best focus distance of lens assembly 130.


Image sensor assembly 107 can include various interface circuits for coupling several of the peripheral devices to the system address/data bus (system bus) 158, for communication with microprocessor 118, which is also coupled to system bus 158. Image sensor assembly 107 can include interface circuit 160 for coupling the image sensor timing and control circuit 114 to system bus 158, interface circuit 162 for coupling the lens assembly control circuit 144 to system bus 158, interface circuit 164 for coupling the illumination assembly control circuit 146 to system bus 158, interface circuit 166 for coupling the display 150 to system bus 158, interface circuit 168 for coupling keyboard 152, pointing device 154, and trigger 156 to system bus 158, and interface circuit 170 for coupling the filter module control circuit 148 to system bus 158.


In a further aspect, image sensor assembly 107 can include one or more I/O interfaces 172, 174 for providing communication with external devices (e.g., a POS cash register computer, a retail store server, an inventory facility server, a local area network base station, or a cellular base station). I/O interfaces 172, 174 can be interfaces of any combination of known computer interfaces, e.g., Ethernet (IEEE 802.3), USB, IEEE 802.11, Bluetooth, CDMA, and GSM.


In another aspect, mobile computing device 100 can include at least one motion sensor. In one embodiment, the motion sensor can be provided, e.g., by an accelerometer configured to detect the g-force, and can be employed to detect changes in the spatial orientation of the imaging device 98 and/or coordinate acceleration of the imaging device 98. In one embodiment, the motion sensor can be provided, e.g., by a gyroscope configured to detect changes in the spatial orientation of the imaging device 98. In one embodiment, mobile computing device 100 can include both an accelerometer and a gyroscope. In another embodiment, mobile computing device 100 can include only an accelerometer.
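
For illustration, one plausible way to estimate the net motion terms used later (Tx, Ty, Tz and Rx, Ry, Rz) from these sensors is to integrate gyroscope rates once for rotation and gravity-compensated accelerometer readings twice for translation. The sample layout, units, and names below are assumptions, not a platform API or the patented method:

```python
from dataclasses import dataclass

@dataclass
class MotionSample:
    dt: float     # seconds since the previous sample
    gyro: tuple   # angular rate about x, y, z (rad/s), from the gyroscope
    accel: tuple  # acceleration along x, y, z, gravity removed (m/s^2)

def net_motion(samples):
    """Euler-integrate samples covering the capture-to-display interval."""
    rotation = [0.0, 0.0, 0.0]      # net Rx, Ry, Rz (rad)
    velocity = [0.0, 0.0, 0.0]
    translation = [0.0, 0.0, 0.0]   # net Tx, Ty, Tz (m)
    for s in samples:
        for i in range(3):
            rotation[i] += s.gyro[i] * s.dt
            translation[i] += velocity[i] * s.dt + 0.5 * s.accel[i] * s.dt ** 2
            velocity[i] += s.accel[i] * s.dt
    return tuple(translation), tuple(rotation)
```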


In another aspect, mobile computing device 100 can be capable of follow focus and of measuring the distance between the lens and the target object.


As noted herein supra, mobile computing device 100 can be configured to compensate for the device movement relative to the target object during the time period elapsed between taking and displaying a preview image frame. Mobile computing device 100 can be configured to transform the preview image frame based on the device movement detected by a motion sensor (e.g., an accelerometer and/or a gyroscope). In a further aspect, the preview image can be shifted, rotated, and/or scaled in order to compensate for the device movement.


One embodiment of a method of transforming a preview image frame is described with reference to FIG. 3.


For illustrative purposes, the image sensor pixels can be assumed to have the same vertical and horizontal pitch, and thus to be effectively square. In a right-handed coordinate system where the z axis positive direction points "out of the page", and where clockwise rotation when viewed along the increasing direction of an axis is positive, the image sensor array 104 can be assumed to be a rectangular array of pixels Px pixels wide and Py pixels high. The image sensor can be assumed to be located at the origin, with its x and y pixel directions aligned with the x and y axes respectively, and with the lens 130 pointing in the negative z direction (i.e., "into the page"). The angle of view of the image sensor's lens assembly in the xz and yz planes can be denoted as ax and ay respectively.


The distance between the lens 130 and the target object can be denoted as d. In a further aspect, if this distance cannot actually be measured, a typical value can be assumed according to the use case. The assumed value can be selected such that it is most likely greater than or equal to the actual distance, as it is better to under-compensate than to over-compensate.
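
As a toy illustration of that selection rule, assumed distances can be tabulated per use case and deliberately biased toward larger values so compensation errs on the small side. The table entries and names below are invented for illustration, not from the patent:

```python
# Hypothetical per-use-case defaults for d (meters), chosen on the high side
# so that an unmeasured distance leads to under- rather than over-compensation.
ASSUMED_DISTANCE_M = {
    "barcode_scan": 0.5,
    "document_capture": 0.8,
    "general_preview": 2.0,
}

def effective_distance(measured_m, use_case="general_preview"):
    return measured_m if measured_m is not None else ASSUMED_DISTANCE_M[use_case]
```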


The net translation along x, y and z axes of the image sensor between the time the image was captured by the sensor and when it is displayed can be denoted as Tx, Ty and Tz respectively. The net rotation about x, y and z axes of the image sensor between the time the image was captured by the sensor and when it is displayed can be denoted as Rx, Ry and Rz respectively.


Thus, the compensation due to translation along the x axis can be represented as a pixel shift in the x direction of:

Cx = Px·Tx / (2d·tan(ax/2)) pixels.


The compensation due to translation along the y axis can be represented as a pixel shift in the y direction of:

Cy = Py·Ty / (2d·tan(ay/2)) pixels.


The compensation due to translation along the z axis can be represented as an image size adjustment of linear magnitude:

d/(d − Tz)


The compensation due to rotation about the x axis can be approximated by a pixel shift in the y direction of magnitude:

Py·Rx/ax


The compensation due to rotation about the y axis can be approximated by a pixel shift in the x direction of magnitude:

Px·Ry/ay


The compensation due to rotation about the z axis can be represented as a corresponding image rotation of:

Rz


In one embodiment, image transformation to compensate for the camera movement can include a pixel shift, an image rotation, and an image scaling. However, because the impact of z axis translation and rotation is likely to be much smaller than that of the other two axes, it will in many cases be possible to ignore the Tz and Rz terms. Thus, in one embodiment, image transformation to compensate for the camera movement can include only pixel shift operations.
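
Under that simplification, applying the compensation reduces to a two-dimensional pixel shift. The NumPy sketch below is one assumed realization (the patent names no implementation); vacated pixels receive a fill value and can be repainted by the polygon step described further below:

```python
import numpy as np

def shift_frame(frame, dx, dy, fill=0):
    """Shift an HxW (or HxWxC) frame by integer pixel amounts dx, dy.

    The image moves in the same x and y directions as the device translation,
    per the method above; exposed pixels are set to `fill`.
    """
    out = np.full_like(frame, fill)
    h, w = frame.shape[:2]
    out[max(0, dy):min(h, h + dy), max(0, dx):min(w, w + dx)] = \
        frame[max(0, -dy):min(h, h - dy), max(0, -dx):min(w, w - dx)]
    return out
```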


Thus, mobile computing device 100 can be configured to transform the preview image frame by applying the following pixel shift values. The total pixel shift in the x direction can be calculated as follows:

Px·(Tx/(2d·tan(ax/2)) − Ry/ay),


and the total pixel shift in the y direction can be calculated as follows:

Py·(Ty/(2d·tan(ay/2)) + Rx/ax).
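
These two expressions translate directly into code. The function below is a plain transcription under the stated assumptions (angles ax, ay, Rx, Ry in radians; d, Tx, Ty in the same length unit); the function name is ours, not from the patent:

```python
import math

def preview_pixel_shift(Px, Py, Tx, Ty, Rx, Ry, ax, ay, d):
    """Total preview pixel shift per the two formulas above."""
    shift_x = Px * (Tx / (2 * d * math.tan(ax / 2)) - Ry / ay)
    shift_y = Py * (Ty / (2 * d * math.tan(ay / 2)) + Rx / ax)
    return shift_x, shift_y

# Example: a 1280x800 sensor with a 60°x40° field of view, drifting 2 mm
# sideways at a 0.5 m working distance, yields a shift of about 4.4 pixels.
dx, dy = preview_pixel_shift(1280, 800, 0.002, 0.0, 0.0, 0.0,
                             math.radians(60), math.radians(40), 0.5)
```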



FIGS. 4-5 schematically illustrate transforming a preview image frame to compensate for the imaging device movement using the above described method. As shown in FIG. 4, the y coordinate of an arbitrarily chosen element 401 of the target object 402 at the time of taking an image of the target object 402 can be denoted as PCY0. Due to movement of the mobile computing device 100 between the time of taking the image and the time of displaying the image, the y coordinate of the same element 401 of the target object 402 in the displayed frame can be adjusted using the above described method:

PCYF = PCY0 + Py·(Ty/(2d·tan(ay/2)) + Rx/ax)


Assuming that at the time of taking the image, the target object was centered in the image frame, the dashed line 410 schematically shows the position of the target object 402 within the uncompensated image frame. Further assuming that between the time of taking the image and the time of displaying the image mobile computing device 100 was rotated to the right, the preview image frame should be shifted to the left to compensate for the imaging device movement, and hence the solid line 412 schematically shows the position of the target object 402 within the resulting (compensated) image frame.


As shown in FIG. 5, the field of view 510 of imaging device 98 has moved to the right between the time 522 of taking the image and the time 524 of displaying the image. While a prior art mobile computing device would not compensate for the device movement, and hence would display uncompensated preview image frame 532, mobile computing device 100 can apply the above described compensation method and hence can display a compensated preview image frame 534.


In a further aspect, the mobile computing device can be further configured to introduce a colored polygon 538 to the preview image frame 534 to minimize a perceived visual disturbance caused by the mobile computing device movement, as shown in FIG. 5. The polygon can be colored using a color similar to a color of an area of the preview image frame closest to the polygon.
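
One simple realization of this coloring, continuing the NumPy sketch above, is to paint the strip exposed by a horizontal shift with the mean color of the adjacent image column. Treating "a color similar to the closest area" as the edge-column mean is our assumption:

```python
import numpy as np

def fill_exposed_strip(shifted, dx):
    """Repaint the strip exposed by a horizontal shift of dx pixels."""
    if dx > 0:    # strip exposed on the left; nearest image column is dx
        shifted[:, :dx] = shifted[:, dx].mean(axis=0)
    elif dx < 0:  # strip exposed on the right; nearest image column is dx - 1
        shifted[:, dx:] = shifted[:, dx - 1].mean(axis=0)
    return shifted
```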


A sample of systems and methods that are described herein follows:


A1. A mobile computing device comprising:


a microprocessor;


an imaging device including a two-dimensional image sensor communicatively coupled to said microprocessor and an imaging lens configured to focus an image of a target object on said two-dimensional image sensor;


a display communicatively coupled to said microprocessor;


at least one motion sensor communicatively coupled to said microprocessor;


wherein said mobile computing device is configured to periodically display a preview image frame of said target object on said display;


wherein said mobile computing device is further configured to compensate for a movement of said imaging device relative to said target object during a time period elapsed between taking and displaying said preview image frame, by transforming said preview image frame based on device movement detected by said at least one motion sensor.


A2. The mobile computing device of claim A1, wherein said at least one motion sensor is provided by an accelerometer.


A3. The mobile computing device of claim A1, wherein said at least one motion sensor is provided by a gyroscope.


A4. The mobile computing device of claim A1, wherein said imaging device is configured to measure distance between said imaging lens and said target object.


A5. The mobile computing device of claim A1, wherein said transformation of said preview image frame comprises a pixel shift.


A6. The mobile computing device of claim A1, wherein said transformation of said preview image frame comprises a rotation of said preview image.


A7. The mobile computing device of claim A1, wherein said transformation of said preview image frame comprises a scaling of said preview image.


A8. The mobile computing device of claim A1, further configured to introduce a colored polygon to said preview image frame to minimize a perceived visual disturbance caused by a motion of said mobile computing device.


A9. The mobile computing device of claim A1, further configured to introduce a polygon to said preview image frame;


wherein said polygon is colored using a color similar to a color of an area of said preview image frame closest to said polygon.


A10. The mobile computing device of claim A1, wherein said target object is provided by optical decodable indicia; and


wherein said microprocessor is configured to output decoded message data corresponding to said decodable indicia.


While the present invention has been described with reference to a number of specific embodiments, it will be understood that the true spirit and scope of the invention should be determined only with respect to claims that can be supported by the present specification. Further, while in numerous cases herein systems, apparatuses, and methods are described as having a certain number of elements, it will be understood that such systems, apparatuses, and methods can be practiced with fewer than the mentioned number of elements. Also, while a number of particular embodiments have been described, it will be understood that features and aspects that have been described with reference to each particular embodiment can be used with each remaining particularly described embodiment.

Claims
  • 1. A mobile computing device comprising: a microprocessor;an imaging device including a two-dimensional image sensor communicatively coupled to said microprocessor and an imaging lens configured to focus an image of a target object on said two-dimensional image sensor;a display communicatively coupled to said microprocessor;at least one motion sensor communicatively coupled to said microprocessor;wherein said mobile computing device is configured to periodically display a preview image frame of said target object on said display;wherein said mobile computing device is further configured to compensate for a movement of said imaging device relative to said target object during a time period elapsed between taking and displaying said preview image frame, by transforming said preview image frame after taking of said preview image frame based on device movement detected by said at least one motion sensor; andwherein said transformation of said preview image frame comprises: a pixel shift in a same x direction as an imaging device translation along an x axis; anda pixel shift in a same y direction as an imaging device translation along a y axis.
  • 2. The mobile computing device of claim 1, wherein said at least one motion sensor is provided by an accelerometer.
  • 3. The mobile computing device of claim 1, wherein said at least one motion sensor is provided by a gyroscope.
  • 4. The mobile computing device of claim 1, wherein said imaging device is configured to measure distance between said imaging lens and said target object.
  • 5. The mobile computing device of claim 1, wherein said transformation of said preview image frame comprises a rotation of said preview image.
  • 6. The mobile computing device of claim 1, wherein said transformation of said preview image frame comprises a scaling of said preview image.
  • 7. The mobile computing device of claim 1, further configured to introduce a colored polygon to said preview image frame to minimize a perceived visual disturbance caused by a movement of said mobile computing device.
  • 8. The mobile computing device of claim 1, further configured to introduce a polygon to said preview image frame; wherein said polygon is colored using a color similar to a color of an area of said preview image frame closest to said polygon.
  • 9. The mobile computing device of claim 1, wherein said target object is provided by optical decodable indicia; and wherein said microprocessor is configured to output decoded message data corresponding to said decodable indicia.
  • 10. A mobile device comprising: an imaging device for capturing image frames;a display for displaying images;a motion sensor for sensing motion of the mobile device; anda microprocessor communicatively coupled to the imaging device, the display, and the motion sensor, the microprocessor being configured for: capturing image frames with the imaging device;displaying a preview image frame on the display corresponding to a captured image frame;sensing motion of the mobile device with the motion sensor;in response to the sensing of movement of the mobile device during a time period between (i) capturing the captured image frame corresponding to the displayed preview image frame and (ii) displaying the preview image frame, shifting pixels of the preview image frame in the same direction of the sensed motion of the mobile device;wherein shifting pixels of the preview image frame in the same direction of the sensed motion of the mobile device, comprises: a pixel shift in a same x direction as a mobile device translation along an x axis; anda pixel shift in a same y direction as a mobile device translation along a y axis.
  • 11. The mobile device of claim 10, wherein the motion sensor comprises an accelerometer.
  • 12. The mobile device of claim 10, wherein the motion sensor comprises a gyroscope.
  • 13. The mobile device of claim 10, wherein the imaging device comprises a two-dimensional image sensor.
  • 14. The mobile device of claim 10, wherein the imaging device comprises: a two-dimensional image sensor; andan imaging lens for focusing light on the two-dimensional image sensor.
  • 15. The mobile device of claim 10, wherein the microprocessor is configured for, in response to the sensing of movement of the mobile device during a time period between (i) capturing the captured image frame corresponding to the displayed preview image frame and (ii) displaying the preview image frame, rotating the preview image frame.
  • 16. The mobile device of claim 10, wherein the microprocessor is configured for, in response to the sensing of movement of the mobile device during a time period between (i) capturing the captured image frame corresponding to the displayed preview image frame and (ii) displaying the preview image frame, scaling the preview image frame.
  • 17. A mobile device comprising: an imaging device for capturing image frames;a display for displaying images;a motion sensor for sensing motion of the mobile device; anda microprocessor communicatively coupled to the imaging device, the display, and the motion sensor, the microprocessor being configured for: capturing image frames with the imaging device;displaying a preview image frame on the display corresponding to a captured image frame;sensing motion of the mobile device with the motion sensor;in response to the sensing of movement of the mobile device during a time period between (i) capturing the captured image frame corresponding to the displayed preview image frame and (ii) displaying the preview image frame, shifting pixels of the preview image frame in the same direction of the sensed motion of the mobile device; andattempting to decode decodable indicia in a captured image frame;wherein shifting pixels of the preview image frame in the same direction of the sensed motion of the mobile device, comprises: a pixel shift in a same x direction as a mobile device translation along an x axis; anda pixel shift in a same y direction as a mobile device translation along a y axis.
  • 18. The mobile device of claim 17, wherein the motion sensor comprises an accelerometer.
  • 19. The mobile device of claim 17, wherein the microprocessor is configured for, in response to the sensing of movement of the mobile device during a time period between (i) capturing the captured image frame corresponding to the displayed preview image frame and (ii) displaying the preview image frame, rotating the preview image frame.
  • 20. The mobile device of claim 17, wherein the microprocessor is configured for in response to the sensing of movement of the mobile device during a time period between (i) capturing the captured image frame corresponding to the displayed preview image frame and (ii) displaying the preview image frame, scaling the preview image frame.
Related Publications (1)
Number Date Country
20130201377 A1 Aug 2013 US