Rapid synchronized lighting and shuttering

Information

  • Patent Grant
  • Patent Number: 9,544,504
  • Date Filed: Friday, October 23, 2015
  • Date Issued: Tuesday, January 10, 2017
Abstract
This document describes various apparatuses and techniques for rapid synchronized lighting and shuttering. These apparatuses and techniques are capable of capturing two images, where one of the images integrates multiple exposures during which an image area is illuminated by a light source and another of the images integrates multiple exposures during which the image area is not illuminated by the light source. The image area can be illuminated by rapidly flashing the image area with the light source and synchronizing a shutter to permit one image sensor to capture an image when the image area is illuminated and another image sensor to capture the image area when the image area is not illuminated. These two images can be captured concurrently or nearly concurrently, thereby reducing or eliminating motion artifacts. Further, these apparatuses and techniques may do so with slow and relatively low-cost cameras and relatively low computational costs.
Description
BACKGROUND

Current imaging techniques have failed to adequately address undesired ambient light. Some partial but inadequate solutions have been devised, including drowning out ambient light with a very bright light. This partial solution, however, often requires a brighter illuminant than is practical, uses more power than is desired, or creates undesirably high heat. It also fails to handle very bright ambient light, such as outdoor scenes on a sunny day or gestures made over a bright computer screen. Other partial but inadequate solutions include spectral approaches in which an object is illuminated with narrow-frequency light and the image is then band-pass filtered. This approach can fail when the narrow-frequency lighting device drifts out of the narrow-frequency band.


Another partial but inadequate solution involves capturing an image with a light on, then another image with the light off, and then subtracting the ambient background light to provide an image having only the provided light. This solution, however, fails to address the ambient light changing or the image changing, such as when an object in the imaging area is moving. These problems can be addressed somewhat through complex, resource-intensive processing of the images and a fast camera, though the processing is computationally expensive and fast cameras are costly and often large and heavy as well. Further, even with this processing and a fast camera, motion artifacts cannot be completely addressed.


SUMMARY

This document describes various apparatuses and techniques for rapid synchronized lighting and shuttering. These apparatuses and techniques are capable of capturing two images, where one of the images integrates multiple exposures during which an image area is illuminated by a light source and another of the images integrates multiple exposures during which the image area is not illuminated by the light source. The image area can be illuminated by rapidly flashing the image area with the light source and synchronizing a shutter to permit one image sensor to capture an image when the image area is illuminated and another image sensor to capture the image area when the image area is not illuminated. These two images can be captured concurrently or nearly concurrently, thereby reducing or eliminating motion artifacts. Further, these apparatuses and techniques may do so with slow and relatively low-cost cameras and relatively low computational costs.


This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.





BRIEF DESCRIPTION OF THE DRAWINGS

The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit of a reference number identifies the figure in which the reference number first appears. The use of the same reference number in different instances in the description and the figures may indicate similar or identical items.



FIG. 1 illustrates an example environment in which the techniques may be implemented and the apparatus may be embodied.



FIG. 2 illustrates an example computing device in which the controller of FIG. 1 may be embodied.



FIG. 3 is a flow diagram depicting example methods for rapid synchronized lighting and shuttering, including optional image comparison and use of an optional additional flashing light source.



FIG. 4 illustrates the environment of FIG. 1 along with some elements in greater detail as well as a flash graph showing times at which an image area is and is not illuminated.



FIG. 5 illustrates the environment of FIG. 1 along with additional elements for capturing one or more additional images.



FIG. 6 illustrates an example multi-light-source flash graph showing different times at which multiple light sources illuminate the image area.



FIG. 7 is a flow diagram depicting example methods for rapid synchronized lighting and shuttering effective to enable creation of a net image not illuminated by ambient light.



FIG. 8 illustrates an example device in which techniques for rapid synchronized lighting and shuttering can be implemented.





DETAILED DESCRIPTION
Overview

This document describes various apparatuses and techniques for rapid synchronized lighting and shuttering. Various embodiments of these techniques and apparatuses enable subtraction of ambient or other light with little or no motion artifacts, using relatively slow image sensors, and/or with relatively low usage of computational resources.


In some embodiments, for example, a camera system includes two slow frame-rate image sensors capable of capturing an image 60 or fewer times per second, a light source that rapidly flashes an object, and shutters synchronized with the flashes. With the shutters and the two sensors synchronized, one of the image sensors receives light from the image area when the image area is illuminated by the light source, and the other image sensor receives light from the image area when the image area is not illuminated by the light source. The slow frame rate often allows the image sensors to be small and inexpensive without negatively affecting the techniques' ability to subtract ambient light, as noted in part above. With these two images, a net image can be calculated that subtracts out ambient or other sources of light.
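
The subtraction itself is straightforward. Below is a minimal sketch of the net-image calculation, assuming two 8-bit grayscale images that integrate the same number of exposures over the same total time; the NumPy usage and array names are our own illustration, not the patent's implementation.

```python
import numpy as np

def net_image(lit: np.ndarray, unlit: np.ndarray) -> np.ndarray:
    """Subtract the ambient-only image from the flash-plus-ambient image.

    Both images are assumed to integrate the same number of exposures over
    the same total time, so the ambient contribution is (nearly) identical
    in each and cancels, leaving only light from the flashed source.
    """
    lit = lit.astype(np.int32)      # widen to avoid unsigned underflow
    unlit = unlit.astype(np.int32)
    net = np.clip(lit - unlit, 0, 255)
    return net.astype(np.uint8)

# Example with synthetic 8-bit images:
ambient = np.full((480, 640), 90, dtype=np.uint8)            # ambient light only
flash_component = np.zeros((480, 640), dtype=np.uint8)
flash_component[200:280, 300:340] = 120                       # object lit by the flash
lit = np.clip(ambient.astype(np.int32) + flash_component, 0, 255).astype(np.uint8)

assert np.array_equal(net_image(lit, ambient), flash_component)
```

Because the two images integrate interleaved exposures over nearly the same period, the ambient contribution is nearly identical in both and largely cancels, even as the scene moves.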


The described apparatuses and techniques, however, enable relatively slow frame-rate image sensors, in conjunction with synchronized light sources and shutters, to gain a resiliency to motion artifacts. This resiliency can equal that of fast frame-rate image sensors operating at the shutter rate used by the slow frame-rate image sensors following the described techniques. By so doing, the apparatuses and techniques may be less costly and require less computing and video bandwidth for similar or even superior resiliency to motion artifacts.


Example Environment



FIG. 1 is an illustration of an example environment 100 in which rapid synchronized lighting and shuttering can be implemented. Environment 100 includes an image area 102 having an object 104 (a person's head), a light source 106, ambient light 108, image sensors 110, a shutter system 112, and a controller 114.


Image area 102 is an area of which an image will be captured by image sensors 110, such as a moving hand, person, or object, as well as unmoving parts of the area. For example, image area 102 may be actors in a studio performing a comedy, a person's hand performing a gesture over a lighted computer screen, a person moving his arms to control a game, or a wildlife scene of a butterfly landing on a flower.


Light source 106 is capable of illuminating image area 102 with rapidly flashing light. This rapid flashing can be random, regular, patterned, periodic, or coded in various manners, and is capable of illuminating image area 102 at a flash rate at least twice as fast as an image-capture rate of image sensors 110. Further, light source 106 illuminates image area 102 for some fraction of the time in which image sensors 110 are exposed, such as ¼, ⅓, ½, ⅔, ¾, and so forth, depending on various factors set forth elsewhere herein. Light source 106 can include light-emitting diodes (LEDs), laser diodes, and other types of lighting elements. Light source 106 can provide various frequencies of light, such as infrared, narrow-band, white, and so forth.
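
As a quick numerical check of these rate relationships, the sketch below (a hypothetical helper, not part of the patent) verifies that a chosen flash rate is at least twice the sensors' image-capture rate and reports what fraction of each flash period is illuminated.

```python
def check_flash_timing(capture_rate_hz: float, flash_rate_hz: float,
                       flash_duration_s: float) -> float:
    """Return the fraction of each flash period that is illuminated,
    after verifying the flash rate is at least twice the capture rate."""
    if flash_rate_hz < 2 * capture_rate_hz:
        raise ValueError("flash rate must be at least twice the image-capture rate")
    period_s = 1.0 / flash_rate_hz
    if not 0 < flash_duration_s <= period_s:
        raise ValueError("flash must fit within one flash period")
    return flash_duration_s / period_s

# A 60 frames-per-second sensor pair with a 500 Hz flash lit for 1 ms:
print(check_flash_timing(60, 500, 0.001))   # 0.5 -> half of each flash period illuminated
```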


Ambient light 108 can be any type of light, such as light within an office building (e.g., fluorescent, incandescent, and/or natural light through glass, each alone or in combination), lighting in a television studio, light from a computer screen, or light from the sun or moon (direct, reflected, and/or refracted), to name just a few.


Image sensors 110, marked as first image sensor 110-1 and second image sensor 110-2 in FIG. 1, can be of various types and image-capturing rates. Thus, they can be relatively slow and inexpensive digital image sensors (e.g., those having a frame rate of 60 frames per second or slower), fast and expensive digital image sensors (those having a frame rate greater than 60 frames per second, such as 100, 240, 1000, or 2000), analog sensors, and so forth. Image sensors 110 may have a slow image-capture rate that is half, or slower than half, the rate of flashes of light from light source 106, though this is not required. Each may also include or exclude its own shutter, whether a rolling shutter or a global shutter, which can be synchronized with the other image sensor. This shutter, when included with the image sensor, is not a shutter of shutter system 112.


Shutter system 112 is capable of shuttering as fast as, and being synchronized with (or vice versa), the flash rate and pattern of light source 106 and/or other light sources set forth elsewhere herein. Shutter system 112 is capable of preventing one of image sensors 110 from being exposed to light from light source 106 while exposing at least one other image sensor 110 to that light.


In embodiment 100 of FIG. 1, shutter system 112 includes a beam splitter 116, one or more polarizers (not shown), and one or more ferro-electric polarization retarders 118. Beam splitter 116 can be a silvered mirror, a dual brightness enhancement film (DBEF) sheet, or other device capable of splitting or directing light. Ferro-electric polarization retarders 118 act to prohibit one of image sensors 110 from being exposed to polarized light from image area 102 by cancelling that light, while passing polarized light from image area 102 to another of image sensors 110. Thus, light from light source 106 is flashed, illuminating image area 102, such as a person's head (object 104); the light reflects off of image area 102, is received and split by beam splitter 116, is polarized by one or more polarizers, and then is canceled or passed by ferro-electric polarization retarders 118. Here ferro-electric polarization retarders 118 (marked “118-1” and “118-2”) are rotated 90 degrees effective to pass or cancel received, polarized light. Note, however, that other embodiments of shutter system 112 may be used, such as fast mechanical shutters.


Generally, controller 114 is capable of controlling and synchronizing shutters and lighting of image area 102. In some embodiments, controller 114 is capable of flashing light source 106 and shuttering shutters of shuttering system 112 effective to enable image sensor 110-1 to capture a first image integrating multiple exposures during which light source 106 is flashed and to enable image sensor 110-2 to capture a second image integrating multiple exposures during which light source 106 is not flashed. Thus, controller 114 can cause shutters of shuttering system 112 to expose or pass light synchronously with illuminations of image area 102 by light source 106.
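
A controller of this kind reduces to a simple lockstep loop. The sketch below is only a schematic illustration of that synchronization; `set_light`, `set_retarder_1`, and `set_retarder_2` are hypothetical stand-ins for whatever hardware interface an actual controller 114 would expose.

```python
import time

def run_cycle(set_light, set_retarder_1, set_retarder_2,
              slot_s: float = 0.001, slots: int = 10) -> None:
    """Alternate flash-on and flash-off slots, routing light to image sensor
    110-1 during flashes and to image sensor 110-2 between flashes."""
    for slot in range(slots):
        flash_on = (slot % 2 == 0)            # even slots: light source 106 flashes
        set_light(flash_on)                   # flash or darken the image area
        set_retarder_1(passing=flash_on)      # sensor 110-1 integrates flashed slots
        set_retarder_2(passing=not flash_on)  # sensor 110-2 integrates unflashed slots
        time.sleep(slot_s)                    # hold this state for one slot
    set_light(False)                          # leave the light off between frames
```

One call to run_cycle then corresponds to one frame period of the two sensors, after which each sensor reads out its integrated image.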


Further, in some embodiments controller 114 is capable of coding or patterning the flashing of light source 106 to address particular kinds of ambient light. Consider a case where image area 102 is indoors and exposed to fluorescent ambient light. Fluorescent lights and some other types of light are not steady, though they may appear so. Instead, some types of lights flicker, such as at twice the supply frequency; fluorescent light sources may “flicker” at 50 Hz to 120 Hz, for example, which many people cannot see. Controller 114 determines the pattern of the fluorescent's illumination with various sensors (not shown), and thereby determines the rate and regularity at which the ambient light flickers. Controller 114 may then code the flashing of light source 106 to this pattern of light flickering. This is but one way in which the apparatuses and techniques may avoid interference from ambient light.
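
One simple way to code the flashes to a measured flicker pattern is to lock them to a fixed phase of the flicker cycle, so that every flashed exposure sees the ambient light at the same point in its cycle. The following is only a sketch of that idea under our own assumptions; the patent does not prescribe a specific coding.

```python
def flash_times_locked_to_flicker(flicker_hz: float, n_flashes: int,
                                  phase_offset_s: float = 0.0) -> list:
    """Start times (in seconds) for flashes locked to one phase of the
    measured ambient flicker, one flash per flicker cycle."""
    period_s = 1.0 / flicker_hz
    return [phase_offset_s + k * period_s for k in range(n_flashes)]

# Ambient flicker measured at 120 Hz (twice a 60 Hz supply); schedule five
# flashes, each starting a quarter of the way into a flicker cycle.
print(flash_times_locked_to_flicker(120.0, 5, phase_offset_s=0.25 / 120.0))
```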


These example embodiments are not intended to limit application of controller 114. Ways in which controller 114 acts and interacts, including with elements described in FIG. 1, are set forth in additional and greater detail below.



FIG. 2 illustrates a computing device 202 in which controller 114 may be embodied. Computing device 202 is illustrated with various non-limiting example devices: smartphone 202-1, laptop 202-2, television 202-3, desktop 202-4, and tablet 202-5. Computing device 202 includes processor(s) 204 and computer-readable media 206, which includes memory media 208 and storage media 210. Applications and/or an operating system (not shown) embodied as computer-readable instructions on computer-readable media 206 can be executed by processor(s) 204 to provide some or all of the functionalities described herein. Computer-readable media 206 also includes controller 114, image post-processing module 212, and three-dimensional (3D) module 214.


Computing device 202 also includes, or is in communication with, image sensors 110, input/output (I/O) ports 216, and network interface(s) 218. Image sensors 110 capture images as noted herein, and may be separate or integral with computing device 202. In some cases image sensors 110 can be used to capture gestures made near a lighted computer display, such as display 220 of tablet 202-5 or in proximity to a computing device, such as desktop 202-4 for larger gestures (like arm and body movements).


Captured images are received by computing device 202 from image sensors 110 via the one or more I/O ports 216. I/O ports 216 enable interaction generally between controller 114 and light source 106, shuttering system 112, and image sensors 110. I/O ports 216 can include a variety of ports, such as, by way of example and not limitation, high-definition multimedia interface (HDMI) ports, digital visual interface (DVI) ports, display ports, fiber-optic or light-based ports, audio ports (e.g., analog, optical, or digital), USB ports, serial advanced technology attachment (SATA) ports, peripheral component interconnect (PCI) express based ports or card slots, serial ports, parallel ports, or other legacy ports.


Computing device 202 may also include network interface(s) 218 for communicating data over wired, wireless, or optical networks. Data communicated over such networks may include control data from controller 114, timing, sequences, coding, and the like to or from light source 106 and shuttering system 112. By way of example and not limitation, network interface 218 may communicate data over a local-area-network (LAN), a wireless local-area-network (WLAN), a personal-area-network (PAN), a wide-area-network (WAN), an intranet, the Internet, a peer-to-peer network, point-to-point network, a mesh network, and the like.


Example Methods



FIG. 3 is a flow diagram depicting example methods 300 for rapid synchronized lighting and shuttering.


Block 302 rapidly flashes an object with a first light source, the object illuminated by a second light source in addition to the first light source. As noted in part above, this second light source can be ambient light. This second light source, however, may also or instead include another flashing light source. Use of other flashing light sources is set forth in greater detail elsewhere herein.


Block 304 synchronizes shuttering to expose and not expose different image sensors to the rapid flashing, the synchronized shuttering exposing a first image sensor during the flashes and a second image sensor not during the flashes. By way of example, consider a case where controller 114 synchronizes mechanical shutters to open one shutter prior to or when each flash begins and close it when or after each flash ceases (this first mechanical shutter is optically interposed between the object and the first image sensor). Controller 114 also synchronizes another mechanical shutter to open when or after each flash ceases and close prior to or when each flash begins (this second mechanical shutter is optically interposed between the object and the second image sensor).


By way of another example, consider FIG. 4, which illustrates some of the elements of FIG. 1 but in greater detail. Controller 114 (not shown) causes light source 106 to flash image area 102 and object 104 at times 2, 4, 6, 8, and 10 (shown at flash graph 402, each time representing one millisecond). No flashes from light source 106 are present at times 1, 3, 5, 7, and 9, though image area 102 is illuminated by ambient light 108 at these times. Controller 114 controls shutter system 112 by rotating ferro-electric polarization retarders 118-1, 118-2 effective to pass or block polarized light to image sensor 110-1, 110-2, respectively, at each of times 1 through 10. Note that the light that exposes image sensors 110 alternates based on when image area 102 is illuminated, shown at flash exposure light 404 and non-flash exposure light 406.
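
Written out as data, the timing of flash graph 402 is simply an alternating schedule; the small sketch below restates the figure (the field names are ours, for illustration only).

```python
# Times 1-10 of flash graph 402, one millisecond each: light source 106
# flashes at the even times, while ambient light 108 is present throughout.
plan = []
for t in range(1, 11):
    flashed = (t % 2 == 0)
    plan.append({
        "time_ms": t,
        "flash_106": flashed,
        "retarder_118_1": "pass" if flashed else "block",   # feeds image sensor 110-1
        "retarder_118_2": "block" if flashed else "pass",   # feeds image sensor 110-2
    })

for slot in plan:
    print(slot)
```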


Block 306 captures, at the first image sensor, a first image of the object, the first image integrating multiple exposures during which the object is illuminated by the first light source during multiple, respective flashes and during which the object is also illuminated by the second light source.


Returning to the ongoing example of FIG. 4, image sensor 110-1 is exposed, for a single image, at times 2, 4, 6, 8, and 10, while image sensor 110-2 is exposed, also for a single image, at times 1, 3, 5, 7, and 9. As noted above, each of the captured images is exposed multiple times (here five), though as few as two or many more than five exposures can be made for each image. This is, or can be, a function of the image-capture rate of the image sensors. Thus, in those cases where an image sensor having a fast image-capture rate is practical, the number of exposures per image may be lower. For example, a fast camera with an image-capture rate of 240 images per second, combined with a shutter system and light source that a controller synchronizes at 960 flashes and 960 shutterings per second, may capture images having four exposures each (960/240 = 4).
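
This relationship, exposures per image equal to the flash rate divided by the image-capture rate, is easy to compute directly; the helper below is our own illustration of the arithmetic, not part of the patent.

```python
def exposures_per_image(flash_rate_hz: float, capture_rate_hz: float) -> int:
    """Number of flashed (and, equally, unflashed) exposures integrated into
    each captured image, assuming flashes are spread evenly across a frame."""
    if capture_rate_hz <= 0 or flash_rate_hz < 2 * capture_rate_hz:
        raise ValueError("flash rate should be at least twice the image-capture rate")
    return int(flash_rate_hz // capture_rate_hz)

print(exposures_per_image(960, 240))  # 4, matching the 960/240 example above
print(exposures_per_image(300, 60))   # 5 exposures per image for a 60 fps sensor and 300 Hz flash
```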


Block 308 captures, at the second image sensor, a second image of the object, the second image integrating multiple exposures during which the object is not illuminated by the first light source but is illuminated by the second light source. For the ongoing example, image sensor 110-2 is exposed five times for a single image captured, the five times when object 104 is not illuminated by light source 106 but is illuminated by ambient light 108.


Block 310 compares the first image and the second image effective to create a net image, the net image showing the object illuminated by the first light source but not illuminated by the second light source. This first light source flashes image area 102 but the second light source is not necessarily ambient light 108. Instead, other flashing light sources may be those excluded or removed to provide a net image. Furthermore, block 310 is not necessarily performed by controller 114. In some cases, controller 114 of FIGS. 1 and 2 provides the first image and the second image in a format usable by post-processing module 212 and/or 3D module 214 to determine the net image.


As noted in part above, the first and second images captured during methods 300 can be captured concurrently or nearly concurrently. Image sensor 110-1 and image sensor 110-2 can both begin capturing images at the start of time 1 of graph 402 of FIG. 4. In such a case, image sensor 110-1, while available to receive light, does not actually receive it, as its ferro-electric polarization retarder 118-1 will not pass light from beam splitter 116 at time 1 (when the image area is not flashed). Still, both image sensors 110-1 and 110-2 are attempting to capture an image. Both image sensors 110-1 and 110-2 may send the captured image at the end of time 10 of graph 402 and then start again with a new cycle of flashes. As shown in FIG. 4, the exposures are interleaved during the same time period, times 1 to 10 of graph 402 (a total of 10 milliseconds).


Additionally and optionally, block 312 may synchronize shuttering of other image sensors to an additional flashing light source to provide one or more additional images. Capture of additional images can be interleaved with capture of images one and two.


By way of example, consider FIG. 5, which illustrates the elements of FIG. 1 along with additional elements for capturing one or more additional images. Each additional image can be used to determine placement of objects, among other uses. Placement of objects can aid in 3D applications. In this example, a light source 502 flashes at the flash rate of light source 106 (or faster) and shuttering system 112 of FIG. 5 shutters at the rate of light source 502 using an additional shutter 504 (here also a ferro-electric polarization retarder). Controller 114 (not shown) is capable of synchronizing flashing of light source 502 and shuttering of additional shutter 504 effective to enable image sensor 110-1 and image sensor 110-2, or image sensor 110-3 and a fourth image sensor paired with image sensor 110-3 (not shown), to capture images integrating multiple exposures during which light source 502 is flashed.


While not shown, other image sensors may be used, such as one to capture another image when object 104 is exposed to ambient but no other light sources, or others to capture images when object 104 is flashed with still further flashing light sources. Note that light source 502 illuminates image area 102 from a different direction than light source 106 effective to provide a shadow of object 104. This difference in direction changes shadows of object 104, which can aid in 3D applications as noted herein, such as when object 104 is moving and thus its location is tracked through this movement based on the shadows. As noted in part above, additional image sensors may be used for 3D applications, though these additional image sensors are not necessarily required, as images can be captured by image sensors 110-1 and 110-2 that are illuminated by additional light sources and thus have different shadowing.


In more detail, in FIG. 5 assume that controller 114 causes light source 106 to flash image area 102 and object 104 at times 1, 4, 7, and 10, causes light source 502 to flash at times 2, 5, 8, and 11, and ceases flashing at times 3, 6, 9, and 12 (image area 102 may still be illuminated by ambient light 108). This timing is shown at multi-light-source flash graph 602 in FIG. 6. This is but one example of ways in which controller 114 may code flashing of light sources so that they do not interfere with each other.
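
The interleaving in graph 602 is a repeating three-slot pattern, which the short sketch below reproduces as data (our own illustrative encoding of the figure).

```python
# Times 1-12 of multi-light-source flash graph 602: light source 106 flashes
# at times 1, 4, 7, 10; light source 502 at times 2, 5, 8, 11; no flash at
# times 3, 6, 9, 12 (ambient light 108 only).
pattern = ("light source 106", "light source 502", "no flash (ambient only)")
schedule = {t: pattern[(t - 1) % 3] for t in range(1, 13)}

for t, state in schedule.items():
    print(t, state)
```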


This example pattern is not required, however. Controller 114 may instead flash light source 106 at times 1, 5, and 9, another light source at times 3 and 7, and use (or avoid exposure during) times 2, 4, 6, and 8 based on a determined flicker of a fluorescent ambient light at times 2, 4, 6, and 8, for example. Other ways of coding one or more light sources are described elsewhere herein.


Light passes to beam splitter 116 and then to ferro-electric polarization retarder 118-1 (associated with image sensor 110-1) and a second beam splitter 506. From second beam splitter 506, light passes to ferro-electric polarization retarder 118-2 (associated with image sensor 110-2) and additional shutter 504, after which second-flash-exposure light 508 may pass to image sensor 110-3. Note that the light that exposes image sensors 110 alternates by threes based on when image area 102 is illuminated, shown at flash exposure light 404, non-flash exposure light 406, and second-flash-exposure light 508.



FIG. 7 is a flow diagram depicting example methods 700 for rapid synchronized lighting and shuttering effective to enable creation of a net image not illuminated by ambient light. Methods 700 may be performed by controller 114 of FIG. 1, whether operating through software, hardware, or a combination thereof.


Block 702 rapidly flashes an object with a first light source, the object illuminated by ambient light in addition to the first light source. This rapid flashing is at a multiple of the image sensors' frame rate, such as four or six times that frame rate. Thus, assuming the image sensors' frame rate is eight frames per second and a multiple of four, controller 114 flashes the object with light source 106 thirty-two times per second.


Block 704 synchronizes shuttering of image sensors to the rapid flashing, the synchronized shuttering exposing a first image sensor during the flashes and a second image sensor not during the flashes.


Block 706 captures, at the first image sensor, a first image of the object, the first image integrating two or more exposures during which the object is illuminated by the first light source during multiple, respective flashes and during which the object is also illuminated by the ambient light.


Block 708 captures, at the second image sensor, a second image of the object, the second image integrating two or more exposures during which the object is not illuminated by the first light source but is illuminated by the ambient light.


Block 710 provides the first image and the second image effective to enable creation of a net image, the net image showing the object illuminated by the first light source but not illuminated by the ambient light.


Any of the methods set forth herein may provide images to a third party to create a net image or may create the net image internally, such as by post-processing module 212 of FIG. 2. Further, these methods may be used not only for large moving objects in natural light, such as persons walking outside, but also for small objects in other types of light. Thus, the methods may be used to flash a hand, finger, or stylus. The net image of this moving hand, finger, or stylus may be effective to enable determination of a gesture performed by the hand, the finger, or the stylus, such as over a computer screen or in front of a television. This determination can be made by computing device 202, with which a person is interacting through the gesture, thereby enabling the television, tablet, or smart phone, for example, to determine gestures made even when illuminated by ambient light from a bright computer screen or television.
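
As a rough illustration of how a net image might feed gesture determination, the sketch below locates the flash-lit region in each net image and classifies a horizontal swipe from how that region moves. The thresholding, centroid tracking, and the specific classification are our own assumptions for illustration only; the patent does not specify a gesture-recognition method.

```python
import numpy as np

def bright_centroid(net: np.ndarray, threshold: int = 40):
    """Centroid (row, col) of pixels the flash illuminated strongly, or None."""
    ys, xs = np.nonzero(net > threshold)
    if xs.size == 0:
        return None
    return float(ys.mean()), float(xs.mean())

def swipe_direction(centroids) -> str:
    """Classify a horizontal swipe from a sequence of centroids."""
    xs = [c[1] for c in centroids if c is not None]
    if len(xs) < 2:
        return "none"
    return "right" if xs[-1] > xs[0] else "left"

# Three synthetic net images with a bright blob drifting to the right:
frames = [np.zeros((120, 160), dtype=np.uint8) for _ in range(3)]
for i, f in enumerate(frames):
    f[50:70, 40 + 30 * i: 60 + 30 * i] = 200
print(swipe_direction([bright_centroid(f) for f in frames]))   # "right"
```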


Note that various blocks of methods 300 and/or 700 may be repeated effective to continually provide images from which ambient light may be removed and/or locations may be determined (whether for 3D applications, gesture recognition, or otherwise), among other applications.


The preceding discussion describes methods in which the techniques for rapid synchronized lighting and shuttering may be performed. These methods are shown as sets of blocks that specify operations performed but are not necessarily limited to the order shown for performing the operations by the respective blocks.


Aspects of these methods may be implemented in hardware (e.g., fixed logic circuitry), firmware, a System-on-Chip (SoC), software, manual processing, or any combination thereof. A software implementation represents program code that performs specified tasks when executed by a computer processor, such as software, applications, routines, programs, objects, components, data structures, procedures, modules, functions, and the like. The program code can be stored in one or more computer-readable memory devices, both local and/or remote to a computer processor. The methods may also be practiced in a distributed computing environment by multiple computing devices.


Example Device



FIG. 8 illustrates various components of example device 800 that can be implemented as any type of client, server, and/or display device as described with reference to the previous FIGS. 1-7 to implement techniques for rapid synchronized lighting and shuttering. In embodiments, device 800 can be implemented as one or a combination of a wired and/or wireless device, as a form of flat panel display, television, television client device (e.g., television set-top box, digital video recorder (DVR), etc.), consumer device, computer device, server device, portable computer device, viewer device, communication device, video processing and/or rendering device, appliance device, gaming device, electronic device, and/or as another type of device. Device 800 may also be associated with a viewer (e.g., a person or user) and/or an entity that operates the device such that a device describes logical devices that include viewers, software, firmware, and/or a combination of devices.


Device 800 includes communication devices 802 that enable wired and/or wireless communication of device data 804 (e.g., received data, data that is being received, data scheduled for broadcast, data packets of the data, etc.). The device data 804 or other device content can include configuration settings of the device, media content stored on the device, and/or information associated with a viewer of the device. Media content stored on device 800 can include any type of audio, video, and/or image data. Device 800 includes one or more data inputs 806 via which any type of data, media content, and/or inputs can be received, such as viewer-selectable inputs, position changes of a viewer, messages, music, television media content, recorded video content, and any other type of audio, video, and/or image data received from any content and/or data source.


Device 800 also includes communication interfaces 808, which can be implemented as any one or more of a serial and/or parallel interface, a wireless interface, any type of network interface, a modem, and as any other type of communication interface. The communication interfaces 808 provide a connection and/or communication links between device 800 and a communication network by which other electronic, computing, and communication devices communicate data with device 800.


Device 800 includes one or more processors 810 (e.g., any of microprocessors, controllers, and the like), which process various computer-executable instructions to control the operation of device 800 and to enable techniques for rapid synchronized lighting and shuttering. Alternatively or in addition, device 800 can be implemented with any one or combination of hardware, firmware, or fixed logic circuitry that is implemented in connection with processing and control circuits which are generally identified at 812. Although not shown, device 800 can include a system bus or data transfer system that couples the various components within the device. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.


Device 800 also includes computer-readable storage media 814, such as one or more memory devices that enable persistent and/or non-transitory data storage (i.e., in contrast to mere signal transmission), examples of which include random access memory (RAM), non-volatile memory (e.g., any one or more of a read-only memory (ROM), non-volatile RAM (NVRAM), flash memory, EPROM, EEPROM, etc.), and a disk storage device. A disk storage device may be implemented as any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewriteable compact disc (CD), any type of a digital versatile disc (DVD), and the like. Device 800 can also include a mass storage media device 816.


Computer-readable storage media 814 provides data storage mechanisms to store the device data 804, as well as various device applications 818 and any other types of information and/or data related to operational aspects of device 800. For example, an operating system 820 can be maintained as a computer application with the computer-readable storage media 814 and executed on processors 810. The device applications 818 may include a device manager, such as any form of a control application, software application, signal-processing and control module, code that is native to a particular device, a hardware abstraction layer for a particular device, and so on. The device applications 818 also include any system components or modules to implement these described techniques. In this example, the device applications 818 can include controller 114.


Furthermore, device 800 may include or be capable of communication with display 220, image sensors 110, light sources 106, and shuttering system 112.


CONCLUSION

This document describes various apparatuses and techniques for rapid synchronized lighting and shuttering. This rapid synchronized lighting and shuttering permits images without ambient or other undesired light to be created with little or no motion artifacts. Further, these apparatuses and techniques may do so with slow and relatively low-cost cameras and relatively low computational costs. Although the invention has been described in language specific to structural features and/or methodological acts, it is to be understood that the invention defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed invention.

Claims
  • 1. A system comprising: a controller configured to: synchronize flashing of a first light source and a second light source and shuttering of a shutter effective to capture a first image integrating multiple exposures during which the first light source is flashed and a second image integrating multiple exposures during which the second light source is flashed; and provide the first image and the second image, the first image and the second image are processed to determine a position or location of an object captured in the first image and the second image.
  • 2. A system as recited in claim 1, wherein the first light source illuminates the object from a different direction than the second light source.
  • 3. A system as recited in claim 2, wherein the first image and the second image are processed to determine a position or location of an object captured in the first image and the second image by comparing a first shadow of the object as illuminated by the first light source in the first image and a second shadow of the object as illuminated by the second light source in the second image.
  • 4. A system as recited in claim 3, wherein the object is moving and its location is tracked through the movement based on shadows of the object as illuminated by the first light source and the second light source.
  • 5. A system as recited in claim 3, wherein the object is a finger, hand, arm, body, or stylus performing a gesture and the position or location of the object is processed for three-dimensional gesture recognition.
  • 6. A system as recited in claim 1, further configured to: capture a third image through the synchronized shuttering in which the object is not illuminated by the first light source or the second light source; and provide the third image with the first image and the second image.
  • 7. A system as recited in claim 1, wherein the first light source and the second light source are LEDs or laser diodes and a flash rate of the LEDs or laser diodes is at least 60 Hz.
  • 8. A system as recited in claim 7, wherein the flash rate of the LEDs or laser diodes is at least twice as fast as an image-capture rate at which the first and second images are captured.
  • 9. A system as recited in claim 8, wherein the controller is further configured to synchronize the shuttering of the shutter at a shuttering rate at least as fast as the flash rate.
  • 10. A system as recited in claim 1, wherein the shutter includes a beam splitter, a polarizer, and a ferro-electric polarization retarder and the controller is further configured to control the beam splitter, the polarizer, and the ferro-electric polarization retarder as part of synchronizing the flashing of the first light source and the second light source and the shuttering of the shutter.
  • 11. A system as recited in claim 1, wherein the controller is configured to synchronize flashing of the first light source and the second light source and shuttering of the shutter effective to capture of the first image by a first image sensor and capture of the second image by a second image sensor.
  • 12. A method comprising: rapidly flashing an object with a first light source and a second light source; synchronizing shuttering of a first shutter to the rapid flashing of the first light source effective to enable capture of a first image integrating multiple exposures; synchronizing shuttering of a second shutter to the rapid flashing of the second light source effective to enable capture of a second image integrating multiple exposures; comparing the first image and the second image effective to determine a position or location of an object captured in the first image and the second image.
  • 13. A method as described in claim 12, wherein the first light source illuminates the object from a different angle than the second light source.
  • 14. A method as described in claim 13, wherein comparing the first image and the second image effective to determine a position or location of the object compares a first shadow of the object as illuminated by the first light source in the first image and a second shadow of the object as illuminated by the second light source in the second image.
  • 15. A method as described in claim 14, wherein the object is moving and its location is tracked through the movement based on shadows of the object as illuminated by the first light source and the second light source.
  • 16. A method as described in claim 12, wherein the synchronizing shuttering of the first shutter synchronizes the shuttering of the first shutter at a rate at least as fast as a rate of the rapid flashing of the first light source and the synchronizing shuttering of the second shutter synchronizes the shuttering of the second shutter at a rate at least as fast as a rate of the rapid flashing of the second light source.
  • 17. A computing device comprising: a first light source configured to flash at a first flash rate; a second light source configured to flash at a second flash rate; a first mechanical shutter configured to shutter at least as fast as the first flash rate; a second mechanical shutter configured to shutter at least as fast as the second flash rate; a controller configured to: synchronize flashing of the first light source and shuttering of the first shutter effective to capture a first image integrating multiple exposures during which the first light source is flashed and synchronizing flashing of the second light source and shuttering of the second shutter effective to enable capture of a second image integrating multiple exposures during which the second light source is flashed; and provide the first image and the second image, the first image and the second image are processed to determine a position or location of an object captured in the first image and the second image.
  • 18. A computing device as described in claim 17, wherein the first light source illuminates the object from a different direction than the second light source.
  • 19. A computing device as described in claim 18, wherein the first image and the second image are processed to determine a position or location of the object by comparing a first shadow of the object as illuminated by the first light source in the first image and a second shadow of the object as illuminated by the second light source in the second image.
  • 20. A computing device as described in claim 19, wherein the object is moving and its location is tracked through the movement based on shadows of the object as illuminated by the first light source and the second light source.
PRIORITY APPLICATION

This application is a divisional of U.S. patent application Ser. No. 14/325,247, filed on Jul. 7, 2014, which is a continuation of, and claims priority to, U.S. patent application Ser. No. 13/667,408, filed Nov. 2, 2012, and entitled “Rapid Synchronized Lighting and Shuttering,” the entire disclosures of which are hereby incorporated by reference.

“Corrected Notice of Allowance”, U.S. Appl. No. 13/939,002, May 5, 2014, 2 pages.
“Final Office Action”, U.S. Appl. No. 13/471,001, Jul. 25, 2013, 20 pages.
“Final Office Action”, U.S. Appl. No. 13/471,054, Oct. 23, 2014, 17 pages.
“Final Office Action”, U.S. Appl. No. 13/471,139, Sep. 16, 2013, 13 pages.
“Final Office Action”, U.S. Appl. No. 13/471,336, Aug. 28, 2013, 18 pages.
“Final Office Action”, U.S. Appl. No. 13/564,520, Jan. 15, 2014, 7 pages.
“Final Office Action”, U.S. Appl. No. 13/651,195, Apr. 18, 2013, 13 pages.
“Final Office Action”, U.S. Appl. No. 13/651,232, May 21, 2013, 21 pages.
“Final Office Action”, U.S. Appl. No. 13/651,287, May 3, 2013, 16 pages.
“Final Office Action”, U.S. Appl. No. 13/651,976, Jul. 25, 2013, 21 pages.
“Final Office Action”, U.S. Appl. No. 13/653,321, Aug. 2, 2013, 17 pages.
“Final Office Action”, U.S. Appl. No. 13/653,682, Jun. 11, 2014, 11 pages.
“Final Office Action”, U.S. Appl. No. 13/653,682, Oct. 18, 2013, 16 pages.
“Final Office Action”, U.S. Appl. No. 13/656,055, Oct. 23, 2013, 14 pages.
“Final Office Action”, U.S. Appl. No. 13/780,228, Mar. 28, 2014, 13 pages.
“Final Office Action”, U.S. Appl. No. 13/938,930, Nov. 8, 2013, 10 pages.
“Final Office Action”, U.S. Appl. No. 13/939,002, Nov. 8, 2013, 7 pages.
“Final Office Action”, U.S. Appl. No. 13/939,032, Dec. 20, 2013, 5 pages.
“Final Office Action”, U.S. Appl. No. 14/063,912, Apr. 29, 2014, 10 pages.
“Final Office Action”, U.S. Appl. No. 14/199,924, May 6, 2014, 5 pages.
“Final Office Action”, U.S. Appl. No. 14/325,247, Apr. 16, 2015, 21 pages.
“FingerWorks Installation and Operation Guide for the TouchStream ST and TouchStream LP”, FingerWorks, Inc. Retrieved from <http://ec1.images-amazon.com/media/i3d/01/A/man-migrate/MANUAL000049862.pdf>, 2002, 14 pages.
“First One Handed Fabric Keyboard with Bluetooth Wireless Technology”, Retrieved from: <http://press.xtvworld.com/article3817.html> on May 8, 2012, Jan. 6, 2005, 2 pages.
“Force and Position Sensing Resistors: An Emerging Technology”, Interlink Electronics, Available at <http://staff.science.uva.nl/~vlaander/docu/FSR/An_Exploring_Technology.pdf>, Feb. 1990, pp. 1-6.
“Foreign Office Action”, CN Application No. 201320097066.8, Oct. 24, 2013, 5 Pages.
“Foreign Office Action”, CN Application No. 201320097079.5, Jul. 28, 2014, 4 pages.
“Foreign Office Action”, CN Application No. 201320097079.5, Sep. 26, 2013, 4 pages.
“Foreign Office Action”, CN Application No. 201320328022.1, Feb. 17, 2014, 4 Pages.
“Foreign Office Action”, CN Application No. 201320328022.1, Oct. 18, 2013, 3 Pages.
“Frogpad Introduces Wearable Fabric Keyboard with Bluetooth Technology”, Retrieved from: <http://www.geekzone.co.nz/content.asp?contentid=3898> on May 7, 2012, Jan. 7, 2005, 3 pages.
“Incipio LG G-Slate Premium Kickstand Case—Black Nylon”, Retrieved from: <http://www.amazon.com/Incipio-G-Slate-Premium-Kickstand-Case/dp/B004ZKP916> on May 8, 2012, 2012, 4 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028484, Jun. 24, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/053683, Nov. 28, 2013, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028948, Jun. 21, 2013, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/029461, Jun. 21, 2013, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/040968, Sep. 5, 2013, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/067912, Feb. 13, 2014, 12 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/042550, Sep. 24, 2013, 14 pages.
“Membrane Keyboards & Membrane Keypads”, Retrieved from: <http://www.pannam.com/> on May 9, 2012, Mar. 4, 2009, 2 pages.
“Motion Sensors”, Android Developers—retrieved from <http://developer.android.com/guide/topics/sensors/sensors_motion.html> on May 25, 2012, 2012, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/599,635, Feb. 25, 2014, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/468,918, Dec. 26, 2013, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,001, Feb. 19, 2013, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,030, May 15, 2014, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,054, Mar. 13, 2015, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,054, Jun. 3, 2014, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,139, Mar. 21, 2013, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,186, Feb. 27, 2014, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,202, Feb. 11, 2013, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,237, Mar. 24, 2014, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,336, Jan. 18, 2013, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,336, May 7, 2014, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,376, Apr. 2, 2014, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,405, Feb. 20, 2014, 37 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/492,232, Apr. 30, 2014, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/527,263, Apr. 3, 2014, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/527,263, Jul. 19, 2013, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/563,435, Jun. 14, 2013, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/564,520, Feb. 14, 2014, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/564,520, Jun. 19, 2013, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/565,124, Jun. 17, 2013, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/593,066, Jan. 2, 2015, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,195, Jan. 2, 2013, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,232, Jan. 17, 2013, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,232, Dec. 5, 2013, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,272, Feb. 12, 2013, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,287, Jan. 29, 2013, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,304, Mar. 22, 2013, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,327, Mar. 22, 2013, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,726, Apr. 15, 2013, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,871, Mar. 18, 2013, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,871, Jul. 1, 2013, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,976, Feb. 22, 2013, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,321, Feb. 1, 2013, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,682, Feb. 7, 2013, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,682, Feb. 26, 2014, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,682, Jun. 3, 2013, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,055, Mar. 12, 2014, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,055, Apr. 23, 2013, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/780,228, Oct. 30, 2013, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/938,930, Aug. 29, 2013, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/939,002, Aug. 28, 2013, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/939,002, Dec. 20, 2013, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/939,032, Aug. 29, 2013, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/063,912, Jan. 2, 2014, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/199,924, Apr. 10, 2014, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/200,595, Apr. 11, 2014, 4 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/325,247, Nov. 17, 2014, 15 pages.
“Notice of Allowance”, U.S. Appl. No. 13/470,633, Mar. 22, 2013, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,139, Mar. 17, 2014, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,202, May 28, 2013, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,237, May 12, 2014, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 13/563,435, Nov. 12, 2013, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/565,124, Dec. 24, 2013, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,195, Jul. 8, 2013, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,232, Apr. 25, 2014, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,272, May 2, 2013, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,287, May 2, 2014, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,304, Jul. 1, 2013, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,327, Jun. 11, 2013, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,726, May 31, 2013, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,871, Oct. 2, 2013, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/653,321, Dec. 18, 2013, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/667,408, Mar. 13, 2014, 11 pages.
“Notice of Allowance”, U.S. Appl. No. 13/938,930, Feb. 20, 2014, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/939,002, Mar. 3, 2014, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/939,032, Apr. 3, 2014, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 14/018,286, May 23, 2014, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 14/199,924, Jun. 10, 2014, 4 pages.
“Notice to Grant”, CN Application No. 201320097089.9, Sep. 29, 2013, 2 Pages.
“Notice to Grant”, CN Application No. 201320097124.7, Oct. 8, 2013, 2 pages.
“Position Sensors”, Android Developers—retrieved from <http://developer.android.com/guide/topics/sensors/sensors_position.html> on May 25, 2012, 5 pages.
“Restriction Requirement”, U.S. Appl. No. 13/468,918, Nov. 29, 2013, 6 pages.
“Restriction Requirement”, U.S. Appl. No. 13/471,139, Jan. 17, 2013, 7 pages.
“Restriction Requirement”, U.S. Appl. No. 13/593,066, Oct. 8, 2014, 8 pages.
“Restriction Requirement”, U.S. Appl. No. 13/595,700, May 28, 2014, 6 pages.
“Restriction Requirement”, U.S. Appl. No. 13/651,304, Jan. 18, 2013, 7 pages.
“Restriction Requirement”, U.S. Appl. No. 13/651,726, Feb. 22, 2013, 6 pages.
“Restriction Requirement”, U.S. Appl. No. 13/651,871, Feb. 7, 2013, 6 pages.
“Restriction Requirement”, U.S. Appl. No. 14/325,247, Oct. 6, 2014, 6 pages.
“SolRx™ E-Series Multidirectional Phototherapy Expandable™ 2-Bulb Full Body Panel System”, Retrieved from: <http://www.solarcsystems.com/us_multidirectional_uv_light_therapy_1_intro.html> on Jul. 25, 2012, 2011, 4 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/653,321, Mar. 28, 2014, 4 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 14/018,286, Jun. 11, 2014, 5 pages.
“The Microsoft Surface Tablets Comes With Impressive Design and Specs”, Retrieved from <http://microsofttabletreview.com/the-microsoft-surface-tablets-comes-with-impressive-design-and-specs> on Jan. 30, 2013, Jun. 2012, 2 pages.
“Tilt Shift Lenses: Perspective Control”, retrieved from http://www.cambridgeincolour.com/tutorials/tilt-shift-lenses1.htm, Mar. 28, 2008, 11 Pages.
“Virtualization Getting Started Guide”, Red Hat Enterprise Linux 6, Edition 0.2—retrieved from <http://docs.redhat.com/docs/en-US/Red_Hat_Enterprise_Linux/6/html-single/Virtualization_Getting_Started_Guide/index.html> on Jun. 13, 2012, 24 pages.
“Welcome to Windows 7”, Retrieved from: <http://www.microsoft.com/en-us/download/confirmation.aspx?id=4984> on Aug. 1, 2013, Sep. 16, 2009, 3 pages.
“What is Active Alignment?”, http://www.kasalis.com/active_alignment.html, retrieved on Nov. 22, 2012, Nov. 22, 2012, 2 Pages.
Block,“DeviceOrientation Event Specification”, W3C, Editor's Draft, retrieved from <https://developer.palm.com/content/api/dev-guide/pdk/accessing-device-sensors.html> on May 25, 2012, Jul. 12, 2011, 14 pages.
Brown,“Microsoft Shows Off Pressure-Sensitive Keyboard”, retrieved from <http://news.cnet.com/8301-17938_105-10304792-1.html> on May 7, 2012, Aug. 6, 2009, 2 pages.
Butler,“SideSight: Multi-“touch” Interaction around Small Devices”, In the proceedings of the 21st annual ACM symposium on User interface software and technology, retrieved from <http://research.microsoft.com/pubs/132534/sidesight_cry3.pdf> on May 29, 2012, Oct. 19, 2008, 4 pages.
Crider,“Sony Slate Concept Tablet “Grows” a Kickstand”, Retrieved from: <http://androidcommunity.com/sony-slate-concept-tablet-grows-a-kickstand-20120116/> on May 4, 2012, Jan. 16, 2012, 9 pages.
Dietz,“A Practical Pressure Sensitive Computer Keyboard”, In Proceedings of UIST 2009, Oct. 2009, 4 pages.
Glatt,“Channel and Key Pressure (Aftertouch).”, Retrieved from: <http://home.roadrunner.com/~jgglatt/tutr/touch.htm> on Jun. 11, 2012, 2012, 2 pages.
Hanlon,“ElekTex Smart Fabric Keyboard Goes Wireless”, Retrieved from: <http://www.gizmag.com/go/5048/> on May 7, 2012, Jan. 15, 2006, 5 pages.
Jacobs,“2D/3D Switchable Displays”, In the proceedings of Sharp Technical Journal (4), Available at <https://cgi.sharp.co.jp/corporate/rd/journal-85/pdf/85-04.pdf>, Apr. 2003, pp. 15-18.
Kaur,“Vincent Liew's redesigned laptop satisfies ergonomic needs”, Retrieved from: <http://www.designbuzz.com/entry/vincent-liew-s-redesigned-laptop-satisfies-ergonomic-needs/> on Jul. 27, 2012, Jun. 21, 2010, 4 pages.
Khuntontong,“Fabrication of Molded Interconnection Devices by Ultrasonic Hot Embossing on Thin Polymer Films”, IEEE Transactions on Electronics Packaging Manufacturing, vol. 32, No. 3, Jul. 2009, pp. 152-156.
Linderholm,“Logitech Shows Cloth Keyboard for PDAs”, Retrieved from: <http://www.pcworld.com/article/89084/logitech_shows_cloth_keyboard_for_pdas.html> on May 7, 2012, Mar. 15, 2002, 5 pages.
McLellan,“Eleksen Wireless Fabric Keyboard: a first look”, Retrieved from: <http://www.zdnetasia.com/eleksen-wireless-fabric-keyboard-a-first-look-40278954.htm> on May 7, 2012, Jul. 17, 2006, 9 pages.
Morookian,“Ambient-Light-Canceling Camera Using Subtraction of Frames”, NASA Tech Briefs, Retrieved from <http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20110016693_2011017808.pdf>, May 2004, 2 pages.
Post,“E-Broidery: Design and Fabrication of Textile-Based Computing”, IBM Systems Journal, vol. 39, Issue 3 & 4, Jul. 2000, pp. 840-860.
Prospero,“Samsung Outs Series 5 Hybrid PC Tablet”, Retrieved from: <http://blog.laptopmag.com/samsung-outs-series-5-hybrid-pc-tablet-running-windows-8> on Oct. 31, 2013, Jun. 4, 2012, 7 pages.
Purcher,“Apple is Paving the Way for a New 3D GUI for IOS Devices”, Retrieved from: <http://www.patentlyapple.com/patently-apple/2012/01/apple-is-paving-the-way-for-a-new-3d-gui-for-ios-devices.html> on Jun. 4, 2012, Jan. 12, 2012, 15 pages.
Takamatsu,“Flexible Fabric Keyboard with Conductive Polymer-Coated Fibers”, In Proceedings of Sensors 2011, Oct. 28, 2011, 4 pages.
Zhang,“Model-Based Development of Dynamically Adaptive Software”, In Proceedings of ICSE 2006, Available at <http://www.irisa.fr/lande/lande/icse-proceedings/icse/p371.pdf>, May 20, 2006, pp. 371-380.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/471,054, Nov. 19, 2015, 2 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,054, Sep. 25, 2015, 7 pages.
“Foreign Office Action”, CN Application No. 201310067603.9, Oct. 17, 2016, 6 pages.
“Restriction Requirement”, U.S. Appl. No. 14/792,154, Nov. 10, 2016, 7 pages.
Related Publications (1)
Number Date Country
20160044225 A1 Feb 2016 US
Divisions (2)
Number Date Country
Parent 14325247 Jul 2014 US
Child 14921569 US
Parent 13667408 Nov 2012 US
Child 14325247 US