Object-detecting backlight unit

Information

  • Patent Grant
  • Patent Number
    9,256,089
  • Date Filed
    Friday, June 15, 2012
  • Date Issued
    Tuesday, February 9, 2016
Abstract
This document describes techniques and apparatuses for implementing an object-detecting backlight unit for a display device. An object-detecting backlight unit includes two or more light sources configured to provide light to a display to form an image, and a light sensor configured to receive reflected light when an object is near the display and determine that the reflected light originated from a region of the display. The reflected light is caused by light from the image reflecting off of the object back towards the display. The backlight unit is configured to detect a position of the object based on the region of the display from which the reflected light originated.
Description
BACKGROUND

Display devices, such as televisions, laptop computers, tablet computers, and smart phones, may use a modulating display panel, such as a liquid crystal display, in combination with a backlight to display images to users. Increasingly, users want to use display devices that are interactive, such as devices equipped with touchscreen surfaces or cameras that capture user gestures. However, the region near, or just in front of, the display is not covered by touchscreen devices or cameras. For example, typical touchscreen devices capture data only when the user physically touches, or is very close to touching, the display. Cameras, on the other hand, typically do not have a field of view that is wide enough to capture objects or user gestures close to the display. In addition, hardware costs may prohibit manufacturers from equipping some display devices, such as televisions, with a touchscreen or a camera.


SUMMARY

This document describes techniques and apparatuses for implementing an object-detecting backlight unit for a display device. An object-detecting backlight unit includes two or more light sources configured to provide light to a display to form an image, and a light sensor configured to receive reflected light when an object is near the display and determine that the reflected light originated from a region of the display. The reflected light is caused by light from the image reflecting off of the object back towards the display. The backlight unit is configured to detect a position of the object based on the region of the display from which the reflected light originated.


This summary is provided to introduce simplified concepts that are further described below in the Detailed Description. This summary is not intended to identify essential features of the claimed subject matter, nor is it intended for use in determining the scope of the claimed subject matter.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of techniques and apparatuses for implementing an object-detecting backlight unit are described with reference to the following drawings. The same numbers are used throughout the drawings to reference like features and components:



FIG. 1 illustrates an example environment in which an object-detecting backlight unit can be implemented.



FIG. 2 illustrates a detailed example of an object-detecting backlight unit and a display.



FIG. 3 illustrates another detailed example of an object-detecting backlight unit and a display.



FIG. 4 illustrates an example method for detecting a position of an object near a display using an object-detecting backlight unit.



FIG. 5 illustrates an example device in which techniques for an object-detecting backlight unit can be implemented.





DETAILED DESCRIPTION

Overview


As described above, users increasingly want to use display devices that are interactive, such as devices with touchscreen surfaces or cameras that capture user gestures. However, the region just in front of the display is not covered by touchscreen devices or cameras. For example, typical touchscreen devices capture data only when the user physically touches, or is very close to touching, the display. Cameras, on the other hand, typically do not have a field of view that is wide enough to capture objects or gestures close to the display.


In a liquid crystal display (LCD), as commonly used in personal computers and televisions, the light that passes through the LCD to form an image on the screen for the viewer is provided by a backlight unit. This document describes an object-detecting backlight unit that enhances the performance of backlight units for display devices. As described in more detail below, the object-detecting backlight unit is able to detect a position of an object near the display, as well as user gestures, such as swipes or wipes, near the display. In some embodiments, the object-detecting backlight unit is able to perform these tasks without adding any active components to an existing backlight unit. Thus, unlike touchscreen displays, which often require manufacturers to make the entire screen a touchscreen and add a digitizer to the device, manufacturers can add the object-detecting backlight unit to conventional display devices without incurring additional hardware costs. In addition, the object-detecting backlight unit has low power and low processing overheads as compared with, for example, driving a camera and processing its output.


This document describes techniques and apparatuses for implementing an object-detecting backlight unit for a display device. An object-detecting backlight unit includes two or more light sources configured to provide light to a display to form an image, and a light sensor configured to receive reflected light when an object is near the display and determine that the reflected light originated from a region of the display. The reflected light is caused by light from the image reflecting off of the object back towards the display. The backlight unit is configured to detect a position of the object based on the region of the display from which the reflected light originated.


Example Environment



FIG. 1 is an illustration of an example environment 100 in which an object-detecting backlight unit can be implemented. Environment 100 includes a display device 102, which is illustrated, by way of example and not limitation, as one of a smart phone 104, a laptop computer 106, a television device 108, a desktop computer 110, or a tablet computer 112.


Display device 102 includes processor(s) 114 and computer-readable media 116, which includes memory media 118 and storage media 120. Applications and/or an operating system (not shown) embodied as computer-readable instructions on computer-readable media 116 can be executed by processor(s) 114 to provide some or all of the functionalities described herein. Computer-readable media also includes controller 122. How controller 122 is implemented and used varies, and is described in further detail below.


Display device 102 also includes a backlight unit 124, which includes multiple light sources 126 and a light sensor 128. Light sources 126 are configured to inject light through a display 130 to form an image for viewing, such as a two-dimensional image, a three-dimensional image, or a multi-view image. In various embodiments, display 130 may be configured as a high resolution, flat-panel electronic display, such as a high-resolution liquid crystal display (LCD). An LCD is an electronically modulated optical device composed of liquid crystal display pixels positioned in front of a backlight unit to produce images.


Light sources 126 may include, by way of example and not limitation, light-emitting diodes (LEDs), cold cathode fluorescent lamps (CCFLs), or any other type of light source configured for use in a display device. The number of light sources 126 may vary from two to four light sources for small display devices such as mobile phones, to 100 or more light sources for large display devices such as computer monitors or televisions. The output of backlight unit 124 can be controlled either by DC current control or by pulse-width modulation of light sources 126. Light sources 126 can be arranged electrically in combinations of series and parallel based on the power supply availability of display device 102.
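By way of example and not limitation, the following Python sketch illustrates the pulse-width-modulation option mentioned above: a binary on/off drive whose duty cycle sets the average light output. The PWM rate, simulation sampling rate, and duty cycle are arbitrary illustrative values, not parameters from this document.

```python
import numpy as np

def pwm_waveform(duty_cycle: float, pwm_hz: float, fs: float, duration_s: float) -> np.ndarray:
    """Binary on/off drive signal; its mean value sets the perceived brightness."""
    t = np.arange(0.0, duration_s, 1.0 / fs)
    phase = (t * pwm_hz) % 1.0                # position within each PWM period (0..1)
    return (phase < duty_cycle).astype(float)

# 30% brightness at a 1 kHz PWM rate, simulated at 100 kHz.
drive = pwm_waveform(0.3, 1_000.0, 100_000.0, 0.01)
assert abs(drive.mean() - 0.3) < 0.01         # average output tracks the duty cycle
```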


Light sensor 128 is configured to detect light, such as light that originates from one of light sources 126, passes through display 130 to form an image for viewing, and is reflected back towards display 130 by an object near the display. In an embodiment, light sensor 128 is an ambient light detector that enables the brightness of display 130 to be controlled in proportion to external lighting conditions. An ambient light detector can be implemented as a single or dual silicon photodiode with subsequent signal conditioning.


As described in more detail below, backlight unit 124 is configured to detect a position of an object near the display without the use of a touchscreen or a camera. As described herein, an object is “near” the display if the object is positioned in front of the display, or if the object is positioned close enough to the display to be able to reflect light from the display back towards the display. In some embodiments, display device 102 may be configured with one or both of a touchscreen and a camera, and backlight unit 124 is configured to detect a position of objects near the display that are not physically touching the display and are too close to the display to be captured by the camera. In other embodiments, however, display device 102 may not be configured with either a touchscreen or a camera.



FIG. 2 illustrates a detailed example 200 of backlight unit 124 and display 130. In this example, display 130 is oriented horizontally. Alternately, however, display 130 may be oriented vertically, such as a screen of a typical television device. In this example, backlight unit 124 is configured with multiple light sources, two of which are identified as a first light source 202 and a second light source 204, positioned along the base of display 130. It is to be noted, however, that backlight unit 124 may be implemented with 100 or more light sources. Backlight unit 124 may use a variety of different configurations of light sources, such as a row of light sources positioned along the top of display 130, rows of light sources positioned along the top and the base of display 130, or rows of light sources positioned on the left and right sides of display 130.


In this example, first light source 202 is associated with a first region 206 on the right side of display 130, and second light source 204 is associated with a second region 208 on the left side of display 130. As described herein, a “region” can refer to any area on display 130, such as a right region of the display, a middle region of the display, a left region of the display, a top region of the display, or a bottom region of the display, to name just a few. A light source is considered to be “associated” with a particular region of display 130 if the light source projects light principally from that particular region of the display and/or if the light source is positioned in that particular region of the display.


In some embodiments, for example, backlight unit 124 is configured such that light from each light source emerges from the display in a way that approximately preserves the spatial distribution of the light sources. In other words, light from the left-most light sources is principally projected from the left side of the display, and light from the right-most light sources is principally projected from the right side of the display. In other embodiments, however, the light projected by each light source may be scrambled and diffused inside backlight unit 124 so as to make display 130 uniform in brightness. Such scrambling and diffusing may cause light projected by one or more light sources on the left side of the display to be projected on the right side of the display, or vice versa. Despite the scrambling and diffusing, however, a majority of the light projected on the left side of the display will be from the light sources on the left side of the display, and a majority of the light projected on the right side of the display will be from the light sources on the right side of the display.


Continuing with example 200, display 130 receives light from first light source 202 and light from second light source 204. Display 130 then forms an image for viewing by projecting light 210 from first light source 202 out of first region 206 of display 130 and projecting light 212 from second light source 204 out of second region 208 of display 130.



FIG. 3 illustrates another detailed example 300 of backlight unit 124 and display 130. In this example, light 210 is reflected back towards display 130 as reflected light 302 when light 210 contacts an object 304, which in this case is a user's hand. Light sensor 128 receives reflected light 302, and backlight unit 124 determines whether reflected light 302 originated from first region 206 or second region 208 of display 130. It is to be noted that in conventional backlight systems, the light sensor cannot discriminate between light that originated from different light sources or from different regions of the display.


In accordance with various embodiments, to detect the position of objects near the display, the outputs of at least two light sources of the backlight unit, each associated with a different region of the display, are modulated with different modulation functions. For instance, the outputs of at least two light sources can be modulated with sine waves of different frequencies. In this example, first light source 202 is modulated at a first frequency and second light source 204 is modulated at a second frequency. The frequencies can be invisible to the human eye (e.g., faster than 60 Hz). Further, the first frequency and the second frequency may be separated widely enough in frequency to permit integration of the output with a time constant that is short compared with the likely speed of human interaction events, such as a user's hand moving from one side of the display to the other to form a page-turn gesture. Light sensor 128 is configured to identify the light source or the region of the display from which the reflected light originated by demodulating the reflected light and comparing a frequency of the reflected light to the frequencies associated with the various light sources and regions of the display.
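By way of illustration only, the Python sketch below simulates this frequency-division scheme for a single light sensor: each region's light source is modulated by a sine wave, and the region is identified by which modulation frequency dominates the spectrum of the reflected light. The 10 kHz sampling rate and the 200 Hz and 320 Hz modulation frequencies are hypothetical values chosen for the example.

```python
import numpy as np

FS = 10_000.0                                  # sensor sampling rate, Hz (assumed)
REGION_FREQS = {"first region": 200.0,         # first light source 202 / first region 206
                "second region": 320.0}        # second light source 204 / second region 208

def modulated_drive(freq_hz: float, duration_s: float) -> np.ndarray:
    """Sine-modulated drive for one light source (offset keeps the output non-negative)."""
    t = np.arange(0.0, duration_s, 1.0 / FS)
    return 0.5 + 0.5 * np.sin(2.0 * np.pi * freq_hz * t)

def identify_region(sensor_signal: np.ndarray) -> str:
    """Return the region whose modulation frequency dominates the reflected light."""
    spectrum = np.abs(np.fft.rfft(sensor_signal - sensor_signal.mean()))
    freqs = np.fft.rfftfreq(len(sensor_signal), d=1.0 / FS)

    def power_at(f: float) -> float:
        return float(spectrum[np.argmin(np.abs(freqs - f))])

    return max(REGION_FREQS, key=lambda region: power_at(REGION_FREQS[region]))

# A hand near the first region reflects mostly the 200 Hz source, plus noise.
rng = np.random.default_rng(0)
reflected = 0.8 * modulated_drive(200.0, 0.1) + 0.05 * rng.standard_normal(1000)
print(identify_region(reflected))              # -> "first region"
```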


In an embodiment, backlight unit 124 may use an analog modulation scheme to modulate the light sources. Such modulation schemes are well known, and are not discussed in detail herein. In another embodiment, such as for cases where a large number of light sources are to be modulated, backlight unit 124 may use a digital modulation scheme in which each light source is driven by a binary function that is a member of an orthogonal set of functions, such as the Walsh functions.
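As a minimal sketch of such a digital scheme, the snippet below derives one orthogonal binary drive sequence per light source from the rows of a Hadamard matrix (whose rows are the Walsh functions, up to ordering). The source count and the 0/1 on-off mapping are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import hadamard

def walsh_drives(num_sources: int) -> np.ndarray:
    """Return one orthogonal binary drive sequence (0/1 on-off chips) per light source."""
    n = 1
    while n < num_sources + 1:                 # Hadamard order must be a power of two
        n *= 2
    H = hadamard(n)                            # rows are mutually orthogonal +/-1 sequences
    return (H[1 : num_sources + 1] + 1) // 2   # drop the all-ones row; map -1/+1 to 0/1

drives = walsh_drives(3)                       # three sources -> three 4-chip sequences
pm = 2 * drives - 1                            # back to +/-1 form to check orthogonality
assert np.all(pm @ pm.T == drives.shape[1] * np.eye(3))
```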


In this example, because there are two described light sources, determining the light source from which the reflected light originated is the same as determining the region of the display from which the reflected light originated. However, in cases where backlight unit 124 includes more than two light sources, multiple light sources associated with a particular region of the display may be modulated with the same frequency. Light sensor 128, therefore, may be unable to determine the particular light source from which the reflected light originated, but will be able to determine a region of the display from which the light originated by virtue of the fact that all of the light sources within this region are modulated at the same frequency.


Continuing with the example above, backlight unit 124 determines the origin of reflected light 302 by demodulating reflected light 302 and comparing its frequency to both the first frequency associated with first light source 202 and first region 206 of display 130, and the second frequency associated with second light source 204 and second region 208 of display 130. In an embodiment, a photocurrent from light sensor 128 is amplified and band-pass filtered so as to provide two independent signal channels centered on the first frequency and the second frequency of the light sources. The signal channels are then rectified and integrated over a time period that is long compared with the periods of the first frequency and the second frequency, but short compared with human interaction timescales. This enables quick and low-computational-cost comparison of the frequency of the reflected light to the first frequency and the second frequency. In another embodiment, reflected light 302 can be demodulated using a digital demodulation scheme, such as the Hadamard transform, with data clocked at the same rate as the basis frequency of the Walsh functions used to modulate the light. It is to be appreciated, however, that other digital demodulation schemes can be used to demodulate reflected light 302.
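A rough numerical sketch of both demodulation paths described above follows: an analog-style path that band-pass filters, rectifies, and integrates one signal channel per modulation frequency, and a digital path that correlates chip-rate samples against one source's Walsh sequence. The sampling rate, center frequencies, filter order, and bandwidth are all assumed for illustration, and the analysis window is assumed long enough (a few hundred samples) for the filters to settle.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 10_000.0   # sensor sampling rate, Hz (same assumption as the earlier sketch)

def channel_energy(x: np.ndarray, center_hz: float, half_bw_hz: float = 20.0) -> float:
    """One signal channel: band-pass around one source's modulation frequency,
    then rectify and integrate (here, average) over the analysis window."""
    b, a = butter(4, [center_hz - half_bw_hz, center_hz + half_bw_hz],
                  btype="bandpass", fs=FS)
    return float(np.abs(filtfilt(b, a, x)).mean())

def locate(sensor_signal: np.ndarray, f_first: float = 200.0,
           f_second: float = 320.0) -> str:
    """Compare the two channel energies to decide which region reflected the light."""
    if channel_energy(sensor_signal, f_first) > channel_energy(sensor_signal, f_second):
        return "first region"
    return "second region"

def walsh_demod(samples: np.ndarray, chips: np.ndarray) -> float:
    """Digital alternative: correlate chip-rate sensor samples against one source's
    Walsh sequence; a large magnitude means that source's light is in the reflection."""
    pm = 2 * chips - 1                                    # 0/1 chips -> +/-1
    frames = len(samples) // len(pm)
    x = samples[: frames * len(pm)].reshape(frames, len(pm))
    return float(np.abs((x - x.mean()) @ pm).mean())
```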


In another embodiment, backlight unit 124 determines the origin of reflected light 302 by demodulating reflected light 302 to determine an amplitude of the modulation function of reflected light 302. Then, backlight unit 124 compares the amplitude of the modulation function of reflected light 302 with an amplitude of the modulation function associated with first light source 202 and first region 206 of display 130, and with an amplitude of the modulation function associated with second light source 204 and second region 208 of display 130.


Continuing with example 300, after determining the region of display 130 from which reflected light 302 originated, backlight unit 124 can detect a position of object 304 relative to display 130. In this example, backlight unit 124 determines that object 304 is positioned in space relative to first region 206 of display 130 based on the determination that reflected light 302 originated from first region 206 of display 130. Thus, backlight unit 124 is able to detect a position of an object near display 130 without using a touchscreen or a camera.


Backlight unit 124 is further configured to determine a movement of an object near the display, such as movement from the right to the left side of display 130, from the left to the right side of display 130, from the top to the bottom of display 130, or from the bottom to the top of display 130. The ability to determine a movement of an object enables a user to perform various gestures to initiate corresponding events on display device 102. For instance, a user reading an electronic book rendered by display 130 may be able to move his hand from the right side to the left side of display 130 to initiate a page-turn gesture.


In FIG. 3, for example, object 304 moves from the right side to the left side of display 130. When this occurs, light 212 is reflected back towards display 130 as reflected light 306 when light 212 contacts object 304. Light sensor 128 receives reflected light 306, and backlight unit 124 determines whether reflected light 306 originated from first region 206 of display 130 or from second region 208 of display 130. Based on this determination, backlight unit 124 can detect an additional position of object 304 relative to display 130. In this example, backlight unit 124 can determine that object 304 is positioned in space relative to second region 208 of display 130 based on the determination that reflected light 306 originated from second region 208 of display 130.


Backlight unit 124 then determines a movement of the object based on the change in the position of the object. In this example, backlight unit 124 determines that object 304 moved from a position in space relative to first region 206 on the right side of display 130 to a position in space relative to second region 208 on the left side of display 130. Backlight unit 124 communicates the movement of object 304 to controller 122, which processes the movement to form a gesture. Controller 122 can then communicate the gesture to an operating system of display device 102 to initiate a variety of different events based on the gesture. For example, movement of the user's hand from the right side of display 130 to the left side of display 130 may be identified as a page-turn gesture for an electronic book, a volume-control gesture for an audio application, a channel-change gesture for a television application, or a play-video gesture for a DVD application, to name just a few.
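By way of example and not limitation, a controller along the lines of controller 122 might map region transitions to gestures with a simple lookup. The table below is purely hypothetical and mirrors the examples given in this paragraph.

```python
from typing import Optional

# Hypothetical transition-to-gesture table mirroring the examples in the text.
TRANSITION_GESTURES = {
    ("right", "left"): "page-turn",
    ("left", "right"): "page-turn-back",
    ("bottom", "top"): "volume-up",
    ("top", "bottom"): "volume-down",
}

def gesture_from_positions(prev_region: str, curr_region: str) -> Optional[str]:
    """Form a gesture from a change between two detected object positions."""
    return TRANSITION_GESTURES.get((prev_region, curr_region))

print(gesture_from_positions("right", "left"))   # -> "page-turn"
```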


Backlight unit 124 may also be configured to identify movement of an object towards or away from display 130. In one embodiment, backlight unit 124 identifies movement of an object towards or away from display 130 based on a change in the relative strength of the amplitude of the reflected light. Continuing with the example above, if an amplitude of reflected light 302 is relatively stronger than an amplitude of reflected light 306, then display device 102 determines that object 304 is moving away from display 130. Alternately, if the amplitude of reflected light 302 is relatively weaker than the amplitude of reflected light 306, display device 102 determines that object 304 is moving closer to display 130.
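A sketch of this amplitude comparison might look as follows; the noise tolerance is a hypothetical parameter, not a value from this document.

```python
def approach_direction(amp_earlier: float, amp_later: float,
                       tolerance: float = 0.05) -> str:
    """Infer radial motion from two reflected-light amplitude measurements.

    A closer object returns more of the modulated light to the sensor, so a
    weakening amplitude is read as the object moving away. The `tolerance`
    value is hypothetical and absorbs measurement noise."""
    ratio = amp_later / max(amp_earlier, 1e-12)   # guard against divide-by-zero
    if ratio < 1.0 - tolerance:
        return "moving away from the display"
    if ratio > 1.0 + tolerance:
        return "moving towards the display"
    return "no significant radial movement"

print(approach_direction(0.8, 0.4))   # -> "moving away from the display"
```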


Backlight unit 124 may also be configured to identify a speed of the movement of an object. Controller 122 may process the movement and the speed of the movement to form different gestures. For example, a rapid movement of the user's hand from the right side of display 130 to the left side of display 130 may be identified as a chapter-change gesture, instead of just a page-turn gesture, for an electronic book. As another example, a rapid movement of the user's hand towards display 130 may be identified as a gesture to suppress information on display 130 for reasons of confidentiality. The speed of such rapid movements may be faster than the frame rate of a typical camera can capture; in some embodiments, however, backlight unit 124 is configured to respond to these movements within 10 milliseconds.
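Folding speed into gesture classification could be as simple as the sketch below; the 1.5 m/s threshold and the example distances are assumptions for illustration.

```python
def classify_swipe(distance_m: float, duration_s: float,
                   fast_mps: float = 1.5) -> str:
    """Map the speed of a right-to-left swipe onto a gesture; a real device
    would tune the (hypothetical) speed threshold."""
    speed_mps = distance_m / duration_s
    return "chapter-change" if speed_mps > fast_mps else "page-turn"

print(classify_swipe(0.30, 0.10))   # 3 m/s across a 30 cm display -> "chapter-change"
```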


In one embodiment, backlight unit 124 is configured with at least a third light source 308 that is associated with a middle region of display 130 and is modulated at a third frequency. Third light source 308 enables backlight unit 124 to distinguish a variety of different gestures. For example, third light source 308 allows backlight unit 124 to distinguish between a single object, such as a user's hand, moving from one side of the display to the other, and two objects, such as each of the user's hands, being positioned on either side of the display. In FIG. 3, for example, when object 304 moves from first region 206 on the right side of display 130 to second region 208 on the left side of display 130, additional reflected light (not pictured) from third light source 308 is reflected back towards display 130 and received by light sensor 128 before light sensor 128 receives reflected light 306 associated with second region 208 of display 130. This additional reflected light from third light source 308 indicates that object 304 crossed over the middle of display 130. Backlight unit 124, therefore, can determine that object 304 moved from first region 206 to second region 208 of display 130.


Alternately, if object 304 is positioned near first region 206 on the right side of display 130, and an additional object is positioned near second region 208 on the left side of display 130, light sensor 128 receives reflected light 302 associated with first region 206 as well as reflected light 306 associated with second region 208 of display 130. However, because light sensor 128 does not receive reflected light corresponding to third light source 308 in the middle of display 130, backlight unit 124 can determine that an object did not move from one side of the display to the other, and therefore determine that two objects are near the display.
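The following sketch captures this disambiguation logic; the region names and the time-ordered input format are assumptions made for illustration.

```python
from typing import List

def classify_detection(regions_seen: List[str]) -> str:
    """Distinguish one object crossing the display from two stationary objects.

    `regions_seen` is the time-ordered list of regions that reflected light was
    attributed to. A crossing object passes over the middle region; two objects
    resting on opposite sides never produce a middle-region reflection."""
    if "right" in regions_seen and "left" in regions_seen:
        if "middle" in regions_seen:
            return "one object moved across the display"
        return "two objects, one near each side of the display"
    return "single object near one region, or no object"

print(classify_detection(["right", "middle", "left"]))   # crossing movement
print(classify_detection(["right", "left"]))             # two hands
```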


Example Method



FIG. 4 is a flow diagram depicting an example method 400 for detecting a position of an object near a display using an object-detecting backlight unit. Block 402 receives reflected light when an object is near a display. For example, light sensor 128 (FIG. 3) receives reflected light 302 when object 304 is near display 130. The reflected light is caused by light from an image being rendered by the display reflecting off of the object, such as reflected light 302 reflecting off of object 304.


Block 404 determines that the reflected light originated from a region of the display based on a frequency of the reflected light being equal to a frequency associated with the region of the display. For example, backlight unit 124 determines that reflected light 302 originated from first region 206 of display 130 based on a frequency of reflected light 302 being equal to a frequency associated with first region 206 of display 130.


Block 406 detects a position of the object as being positioned in space relative to the region of the display. For example, backlight unit 124 detects a position of object 304 as being positioned in space relative to first region 206 of display 130 based on reflected light 302 originating from first region 206 of display 130.


Block 408 receives additional reflected light when the object is near the display. For example, light sensor 128 receives reflected light 306 when object 304 is near display 130. In this example, the additional reflected light is caused by object 304 moving from first region 206 on the right side of display 130 to second region 208 on the left side of display 130.


Block 410 determines that the additional reflected light originated from an additional region of the display based on a frequency of the additional reflected light being equal to an additional frequency associated with the additional region of the display. For example, backlight unit 124 determines that reflected light 306 originated from second region 208 of display 130 based on a frequency of reflected light 306 being equal to a frequency associated with second region 208 of display 130.


Block 412 detects an additional position of the object as being positioned in space relative to the additional region of the display. For example, backlight unit 124 detects an additional position of object 304 as being positioned in space relative to second region 208 of display 130 based on reflected light 306 originating from second region 208 of display 130.


Block 414 determines a movement of the object based on a change between the position and the additional position of the object. For example, backlight unit 124 determines a movement of object 304 based on a change between object 304 being positioned on the right side of the display and then the left side of the display. In some embodiments, backlight unit 124 can then communicate the movement of object 304 to controller 122, which processes the movement to form a gesture. Controller 122 can then communicate the gesture to an operating system of display device 102 to initiate a variety of different events. For example, movement of the user's hand from the right side of display 130 to the left side of display 130 may be identified as a page-turn gesture for an electronic book, a volume-control gesture for an audio application, a channel-change gesture for a television application, or a play-video gesture for a DVD application.
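Tying the blocks of method 400 together, a minimal sketch might process successive sensor windows and report any change in the detected region as a movement. The region-locating step is injected as a callable (for instance, the hypothetical `locate` function from the demodulation sketch above).

```python
from typing import Callable, Iterable, List, Tuple
import numpy as np

def track_object(windows: Iterable[np.ndarray],
                 locate_fn: Callable[[np.ndarray], str]) -> Tuple[List[str], List[Tuple[str, str]]]:
    """Blocks 402-412 applied to successive sensor windows, then block 414:
    a change in the detected region between windows is reported as a movement."""
    positions = [locate_fn(w) for w in windows]
    movements = [(a, b) for a, b in zip(positions, positions[1:]) if a != b]
    return positions, movements

# For a right-to-left swipe, `movements` would contain
# [("first region", "second region")].
```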


Example Device



FIG. 5 illustrates various components of example device 500 that can be implemented as any type of client, server, and/or display device as described with reference to the previous FIGS. 1-4 to implement techniques enabling an object-detecting backlight unit. In embodiments, device 500 can be implemented as one or a combination of a wired and/or wireless device, as a form of flat panel display, television, television client device (e.g., television set-top box, digital video recorder (DVR), etc.), consumer device, computer device, server device, portable computer device, user device, communication device, video processing and/or rendering device, appliance device, gaming device, electronic device, and/or as another type of device. Device 500 may also be associated with a viewer (e.g., a person or user) and/or an entity that operates the device such that a device describes logical devices that include users, software, firmware, and/or a combination of devices.


Device 500 includes communication devices 502 that enable wired and/or wireless communication of device data 504 (e.g., received data, data that is being received, data scheduled for broadcast, data packets of the data, etc.). The device data 504 or other device content can include configuration settings of the device, media content stored on the device, and/or information associated with a user of the device. Media content stored on device 500 can include any type of audio, video, and/or image data. Device 500 includes one or more data inputs 506 via which any type of data, media content, and/or inputs can be received, such as user-selectable inputs, messages, music, television media content, recorded video content, and any other type of audio, video, and/or image data received from any content and/or data source.


Device 500 also includes communication interfaces 508, which can be implemented as any one or more of a serial and/or parallel interface, a wireless interface, any type of network interface, a modem, and as any other type of communication interface. The communication interfaces 508 provide a connection and/or communication links between device 500 and a communication network by which other electronic, computing, and communication devices communicate data with device 500.


Device 500 includes one or more processors 510 (e.g., any of microprocessors, controllers, and the like), which process various computer-executable instructions to control the operation of device 500 and to enable techniques for implementing an object-detecting backlight unit. Alternatively or in addition, device 500 can be implemented with any one or combination of hardware, firmware, a system-on-chip (SoC), or fixed logic circuitry that is implemented in connection with processing and control circuits which are generally identified at 512. Although not shown, device 500 can include a system bus or data transfer system that couples the various components within the device. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.


Device 500 also includes computer-readable storage media 514, such as one or more memory devices that enable persistent and/or non-transitory data storage (i.e., in contrast to mere signal transmission), examples of which include random access memory (RAM), non-volatile memory (e.g., any one or more of a read-only memory (ROM), non-volatile RAM (NVRAM), flash memory, EPROM, EEPROM, etc.), and a disk storage device. A disk storage device may be implemented as any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewriteable compact disc (CD), any type of a digital versatile disc (DVD), and the like. Device 500 can also include a mass storage media device 516.


Computer-readable storage media 514 provides data storage mechanisms to store the device data 504, as well as various device applications 518 and any other types of information and/or data related to operational aspects of device 500. For example, an operating system 520 can be maintained as a computer application with the computer-readable storage media 514 and executed on processors 510. The device applications 518 may include a device manager, such as any form of a control application, software application, signal-processing and control module, code that is native to a particular device, a hardware abstraction layer for a particular device, and so on.


The device applications 518 also include any system components or modules to implement techniques using or enabling an object-detecting backlight unit. In this example, the device applications 518 can include controller 122 for controlling and/or receiving data from an object-detecting backlight unit.


CONCLUSION

This document describes various apparatuses and techniques for implementing an object-detecting backlight unit. Although the invention has been described in language specific to structural features and/or methodological acts, it is to be understood that the invention defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed invention.

Claims
  • 1. A display device comprising: a backlight unit comprising at least a light sensor, a first light source modulated at a first frequency and associated with a first region of a display, a second light source modulated at a second frequency and associated with a second region of the display, and a third light source modulated at a third frequency and associated with a third region of the display; the display configured to receive light from the first light source, the second light source, and the third light source and to form an image for viewing by projecting the light from the first light source out of the first region of the display, projecting the light from the second light source out of the second region of the display, and projecting the light from the third light source out of the third region of the display; and the light sensor of the backlight unit configured to receive reflected light from the image formed by the display when an object is near the display and determine whether the reflected light originated from the first region, the second region, or the third region of the display by demodulating the reflected light and comparing a frequency of the reflected light to the first frequency, the second frequency, and the third frequency, the backlight unit configured to detect a position of the object relative to the display based on whether the reflected light originated from the first region, the second region, or the third region of the display, the third region of the display comprising a middle region of the display.
  • 2. A display device as described in claim 1, wherein the backlight unit is further configured to: detect the position of the object as being positioned in space relative to the first region of the display if the reflected light originated from the first region of the display; and detect the position of the object as being positioned in space relative to the second region of the display if the reflected light originated from the second region of the display.
  • 3. A display device as described in claim 1, wherein the backlight unit is further configured to determine a movement of the object based on a change between the position of the object and an additional position of the object.
  • 4. A display device as described in claim 3, wherein the backlight unit is further configured to communicate the movement of the object to a controller to initiate the controller processing the movement to form a gesture.
  • 5. A display device as described in claim 4, wherein the gesture comprises one of a page-turn gesture, a volume-control gesture, or a channel-change gesture.
  • 6. A display device as described in claim 1, wherein the display comprises a liquid crystal display (LCD).
  • 7. A display device as described in claim 1, wherein the light sensor comprises an ambient light detector.
  • 8. A method comprising: receiving reflected light when an object is near a display, the reflected light comprising light from an image being rendered by the display reflecting off of the object; determining whether the reflected light originated from a first region of the display, a second region of the display, or a third region of the display based on a frequency of the reflected light being equal to a first frequency associated with the first region of the display, a second frequency associated with the second region of the display, or a third frequency associated with the third region of the display; and detecting a position of the object as being positioned in space relative to the first region of the display responsive to determining that the frequency is equal to the first frequency, the first region of the display, the second region of the display, and the third region of the display each corresponding to respective areas on the display, at least one of the first, second, or third regions of the display comprising a middle region of the display.
  • 9. A method as described in claim 8, wherein determining whether the reflected light originated from the first region of the display, the second region of the display, or the third region of the display further comprises: demodulating the reflected light to determine the frequency of the reflected light; and comparing the frequency of the reflected light to the first frequency associated with the first region of the display, the second frequency associated with the second region of the display, and the third frequency associated with the third region of the display.
  • 10. A method as described in claim 8, further comprising: receiving additional reflected light when the object is near the display; determining whether the additional reflected light originated from the first region of the display or the second region of the display based on an additional frequency of the additional reflected light being equal to the first frequency associated with the first region of the display or the second frequency associated with the second region of the display; detecting an additional position of the object as being positioned in space relative to the second region of the display responsive to determining that the additional frequency is equal to the second frequency; and determining a movement of the object based on a change between the position and the additional position of the object.
  • 11. A method as described in claim 10, further comprising processing the movement of the object to form a gesture, and communicating the gesture to initiate an event.
  • 12. A method as described in claim 8, wherein determining the movement of the object further comprises: determining that the object is moving away from the display if an amplitude of the reflected light is relatively stronger than an additional amplitude of the additional reflected light; and determining that the object is moving towards the display if the amplitude of the reflected light is relatively weaker than the additional amplitude of the additional reflected light.
  • 13. A method as described in claim 8, wherein the first region of the display comprises one of a left region of the display, a right region of the display, a top region of the display, or a bottom region of the display, and wherein the second region of the display comprises a different one of the left region of the display, the right region of the display, the top region of the display, or the bottom region of the display.
  • 14. A backlight unit comprising: three or more light sources configured to provide light to a display to form an image; a light sensor configured to receive reflected light when an object is near the display and determine that the reflected light originated from a first region of the display, a second region of the display, or a third region of the display based on a comparison of a frequency of the reflected light to a first frequency associated with a first region of the display, a second frequency associated with the second region of the display, and a third frequency associated with a third region of the display, the reflected light comprising light from the image reflecting off of the object; and the backlight unit configured to detect a position of the object based on the region of the display from which the reflected light originated, the first region of the display, the second region of the display, and the third region of the display each corresponding to respective areas on the display, at least one of the first, second, or third regions of the display comprising a middle region of the display.
  • 15. A backlight unit as described in claim 14, wherein the light sensor is configured to determine that the reflected light originated from the region of the display by: demodulating the reflected light to determine the frequency of the reflected light; comparing the frequency of the reflected light to the first frequency associated with the first region of the display and to the second frequency associated with the second region of the display; and determining that the reflected light originated from the first region of the display based on the frequency of the reflected light being equal to the first frequency associated with the first region of the display and determining that the reflected light originated from the second region of the display based on the frequency of the reflected light being equal to the second frequency associated with the second region of the display.
  • 16. A backlight unit as described in claim 14, wherein the three or more light sources comprise three or more light-emitting diodes (LEDs).
  • 17. A backlight unit as described in claim 14, wherein the display comprises a liquid crystal display (LCD).
  • 18. A backlight unit as described in claim 14, wherein the light sensor comprises an ambient light detector.
  • 19. A display device as described in claim 1, wherein the first frequency and the second frequency are invisible to the human eye.
  • 20. A display device as described in claim 1, wherein the first region of the display comprises one of a left region, a right region, a top region, or a bottom region of the display, and wherein the second region of the display comprises a different one of the left region, the right region, the top region, or the bottom region of the display.
US Referenced Citations (436)
Number Name Date Kind
4046975 Seeger, Jr. Sep 1977 A
4065649 Carter et al. Dec 1977 A
4243861 Strandwitz Jan 1981 A
4302648 Sado et al. Nov 1981 A
4317013 Larson Feb 1982 A
4365130 Christensen Dec 1982 A
4492829 Rodrique Jan 1985 A
4527021 Morikawa et al. Jul 1985 A
4559426 Van Zeeland et al. Dec 1985 A
4588187 Dell May 1986 A
4607147 Ono et al. Aug 1986 A
4651133 Ganesan et al. Mar 1987 A
4735495 Henkes Apr 1988 A
5220521 Kikinis Jun 1993 A
5283559 Kalendra et al. Feb 1994 A
5319455 Hoarty et al. Jun 1994 A
5331443 Stanisci Jul 1994 A
5548477 Kumar et al. Aug 1996 A
5558577 Kato Sep 1996 A
5618232 Martin Apr 1997 A
5681220 Bertram et al. Oct 1997 A
5745376 Barker et al. Apr 1998 A
5748114 Koehn May 1998 A
5781406 Hunte Jul 1998 A
5806955 Parkyn, Jr. et al. Sep 1998 A
5807175 Davis et al. Sep 1998 A
5808713 Broer et al. Sep 1998 A
5818361 Acevedo Oct 1998 A
5828770 Leis et al. Oct 1998 A
5838403 Jannson et al. Nov 1998 A
5874697 Selker et al. Feb 1999 A
5921652 Parker et al. Jul 1999 A
5926170 Oba Jul 1999 A
5967637 Ishikawa et al. Oct 1999 A
5971635 Wise Oct 1999 A
6002389 Kasser Dec 1999 A
6005209 Burleson et al. Dec 1999 A
6012714 Worley et al. Jan 2000 A
6040823 Seffernick et al. Mar 2000 A
6044717 Biegelsen et al. Apr 2000 A
6061644 Leis May 2000 A
6072551 Jannson et al. Jun 2000 A
6112797 Colson et al. Sep 2000 A
6124906 Kawada et al. Sep 2000 A
6129444 Tognoni Oct 2000 A
6172807 Akamatsu Jan 2001 B1
6178443 Lin Jan 2001 B1
6215590 Okano Apr 2001 B1
6254105 Rinde et al. Jul 2001 B1
6256447 Laine Jul 2001 B1
6279060 Luke et al. Aug 2001 B1
6329617 Burgess Dec 2001 B1
6344791 Armstrong Feb 2002 B1
6351273 Lemelson et al. Feb 2002 B1
6380497 Hashimoto et al. Apr 2002 B1
6411266 Maguire, Jr. Jun 2002 B1
6437682 Vance Aug 2002 B1
6511378 Bhatt et al. Jan 2003 B1
6529179 Hashimoto et al. Mar 2003 B1
6532147 Christ, Jr. Mar 2003 B1
6543949 Ritchey et al. Apr 2003 B1
6565439 Shinohara et al. May 2003 B2
6597347 Yasutake Jul 2003 B1
6600121 Olodort et al. Jul 2003 B1
6603408 Gaba Aug 2003 B1
6617536 Kawaguchi Sep 2003 B2
6648485 Colgan et al. Nov 2003 B1
6685369 Lien Feb 2004 B2
6704864 Philyaw Mar 2004 B1
6721019 Kono et al. Apr 2004 B2
6725318 Sherman et al. Apr 2004 B1
6774888 Genduso Aug 2004 B1
6776546 Kraus et al. Aug 2004 B2
6784869 Clark et al. Aug 2004 B1
6813143 Makela Nov 2004 B2
6819316 Schulz et al. Nov 2004 B2
6856506 Doherty et al. Feb 2005 B2
6861961 Sandbach et al. Mar 2005 B2
6867828 Taira et al. Mar 2005 B2
6870671 Travis Mar 2005 B2
6895164 Saccomanno May 2005 B2
6898315 Guha May 2005 B2
6914197 Doherty et al. Jul 2005 B2
6950950 Sawyers et al. Sep 2005 B2
6970957 Oshins et al. Nov 2005 B1
6976799 Kim et al. Dec 2005 B2
6980177 Struyk Dec 2005 B2
6981792 Nagakubo et al. Jan 2006 B2
7006080 Gettemy Feb 2006 B2
7051149 Wang et al. May 2006 B2
7073933 Gotoh et al. Jul 2006 B2
7083295 Hanna Aug 2006 B1
7091436 Serban Aug 2006 B2
7104679 Shin et al. Sep 2006 B2
7106222 Ward et al. Sep 2006 B2
7123292 Seeger et al. Oct 2006 B1
7151635 Bidnyk et al. Dec 2006 B2
7153017 Yamashita et al. Dec 2006 B2
7194662 Do et al. Mar 2007 B2
7213991 Chapman et al. May 2007 B2
7224830 Nefian et al. May 2007 B2
7260221 Atsmon Aug 2007 B1
7260823 Schlack et al. Aug 2007 B2
7277087 Hill et al. Oct 2007 B2
7364343 Keuper et al. Apr 2008 B2
7370342 Ismail et al. May 2008 B2
7374312 Feng et al. May 2008 B2
7375885 Ijzerman et al. May 2008 B2
7384178 Sumida et al. Jun 2008 B2
7400377 Evans et al. Jul 2008 B2
7400817 Lee et al. Jul 2008 B2
7410286 Travis Aug 2008 B2
7431489 Yeo et al. Oct 2008 B2
7447934 Dasari et al. Nov 2008 B2
7469386 Bear et al. Dec 2008 B2
7499037 Lube Mar 2009 B2
7502803 Culter et al. Mar 2009 B2
7503684 Ueno et al. Mar 2009 B2
7528374 Smitt et al. May 2009 B2
7542052 Solomon et al. Jun 2009 B2
7545429 Travis Jun 2009 B2
7558594 Wilson Jul 2009 B2
7559834 York Jul 2009 B1
7572045 Hoelen et al. Aug 2009 B2
RE40891 Yasutake Sep 2009 E
7631327 Dempski et al. Dec 2009 B2
7636921 Louie Dec 2009 B2
7639876 Clary et al. Dec 2009 B2
7656392 Bolender Feb 2010 B2
7660047 Travis et al. Feb 2010 B1
7675598 Hong Mar 2010 B2
7728923 Kim et al. Jun 2010 B2
7733326 Adiseshan Jun 2010 B1
7773076 Pittel et al. Aug 2010 B2
7773121 Huntsberger et al. Aug 2010 B1
7774155 Sato et al. Aug 2010 B2
7777972 Chen et al. Aug 2010 B1
7782341 Kothandaraman Aug 2010 B2
7782342 Koh Aug 2010 B2
7813715 McKillop et al. Oct 2010 B2
7815358 Inditsky Oct 2010 B2
7844985 Hendricks et al. Nov 2010 B2
7884807 Hovden et al. Feb 2011 B2
D636397 Green Apr 2011 S
7928964 Kolmykov-Zotov et al. Apr 2011 B2
7936501 Smith et al. May 2011 B2
7945717 Rivalsi May 2011 B2
7957082 Mi et al. Jun 2011 B2
7965268 Gass et al. Jun 2011 B2
7970246 Travis et al. Jun 2011 B2
7973771 Geaghan Jul 2011 B2
7978281 Vergith et al. Jul 2011 B2
7991257 Coleman Aug 2011 B1
8007158 Woo et al. Aug 2011 B2
8018579 Krah Sep 2011 B1
8053688 Conzola et al. Nov 2011 B2
8065624 Morin et al. Nov 2011 B2
8069356 Rathi et al. Nov 2011 B2
8098233 Hotelling et al. Jan 2012 B2
8115499 Osoinach et al. Feb 2012 B2
8130203 Westerman Mar 2012 B2
8149272 Evans et al. Apr 2012 B2
8154524 Wilson et al. Apr 2012 B2
D659139 Gengler May 2012 S
8169421 Wright et al. May 2012 B2
8189973 Travis et al. May 2012 B2
8229509 Paek et al. Jul 2012 B2
8229522 Kim et al. Jul 2012 B2
8251563 Papakonstantinou et al. Aug 2012 B2
8310508 Hekstra et al. Nov 2012 B2
8325416 Lesage et al. Dec 2012 B2
8354806 Travis et al. Jan 2013 B2
8362975 Uehara Jan 2013 B2
8466954 Ko et al. Jun 2013 B2
8467133 Miller Jun 2013 B2
8548608 Perek et al. Oct 2013 B2
8565560 Popovich et al. Oct 2013 B2
8614666 Whitman et al. Dec 2013 B2
8903517 Perek et al. Dec 2014 B2
8947353 Boulanger et al. Feb 2015 B2
9201185 Large Dec 2015 B2
20020008854 Travis et al. Jan 2002 A1
20020134828 Sandbach et al. Sep 2002 A1
20020163510 Williams et al. Nov 2002 A1
20030137821 Gotoh et al. Jul 2003 A1
20030197687 Shetter Oct 2003 A1
20040258924 Berger et al. Dec 2004 A1
20040268000 Barker et al. Dec 2004 A1
20050055498 Beckert et al. Mar 2005 A1
20050057515 Bathiche Mar 2005 A1
20050059489 Kim Mar 2005 A1
20050062715 Tsuji et al. Mar 2005 A1
20050146512 Hill et al. Jul 2005 A1
20050264653 Starkweather et al. Dec 2005 A1
20050264988 Nicolosi Dec 2005 A1
20050285703 Wheeler et al. Dec 2005 A1
20060010400 Dehlin et al. Jan 2006 A1
20060012767 Komatsuda et al. Jan 2006 A1
20060028476 Sobel Feb 2006 A1
20060028838 Imade Feb 2006 A1
20060083004 Cok Apr 2006 A1
20060085658 Allen et al. Apr 2006 A1
20060102914 Smits et al. May 2006 A1
20060125799 Hillis et al. Jun 2006 A1
20060132423 Travis Jun 2006 A1
20060146573 Iwauchi et al. Jul 2006 A1
20060154725 Glaser et al. Jul 2006 A1
20060156415 Rubinstein et al. Jul 2006 A1
20060181514 Newman Aug 2006 A1
20060187216 Trent, Jr. et al. Aug 2006 A1
20060195522 Miyazaki Aug 2006 A1
20060215244 Yosha et al. Sep 2006 A1
20060262185 Cha et al. Nov 2006 A1
20060287982 Sheldon et al. Dec 2006 A1
20070019181 Sinclair et al. Jan 2007 A1
20070046625 Yee Mar 2007 A1
20070047221 Park Mar 2007 A1
20070062089 Homer et al. Mar 2007 A1
20070072474 Beasley et al. Mar 2007 A1
20070076434 Uehara et al. Apr 2007 A1
20070080813 Melvin Apr 2007 A1
20070091638 Ijzerman et al. Apr 2007 A1
20070122027 Kunita et al. May 2007 A1
20070182663 Biech Aug 2007 A1
20070182722 Hotelling et al. Aug 2007 A1
20070188478 Silverstein et al. Aug 2007 A1
20070201246 Yeo et al. Aug 2007 A1
20070217224 Kao et al. Sep 2007 A1
20070222766 Bolender Sep 2007 A1
20070234420 Novotney et al. Oct 2007 A1
20070236408 Yamaguchi et al. Oct 2007 A1
20070236475 Wherry Oct 2007 A1
20070247432 Oakley Oct 2007 A1
20070260892 Paul et al. Nov 2007 A1
20070274094 Schultz et al. Nov 2007 A1
20070274095 Destain Nov 2007 A1
20070274099 Tai et al. Nov 2007 A1
20070283179 Burnett et al. Dec 2007 A1
20080001924 de los Reyes et al. Jan 2008 A1
20080005423 Jacobs et al. Jan 2008 A1
20080013809 Zhu et al. Jan 2008 A1
20080019150 Park et al. Jan 2008 A1
20080037284 Rudisill Feb 2008 A1
20080104437 Lee May 2008 A1
20080122803 Izadi et al. May 2008 A1
20080150913 Bell et al. Jun 2008 A1
20080151478 Chern Jun 2008 A1
20080158185 Westerman Jul 2008 A1
20080211787 Nakao et al. Sep 2008 A1
20080219025 Spitzer et al. Sep 2008 A1
20080238884 Harish Oct 2008 A1
20080253822 Matias Oct 2008 A1
20080309636 Feng et al. Dec 2008 A1
20080316002 Brunet et al. Dec 2008 A1
20080316768 Travis Dec 2008 A1
20080320190 Lydon et al. Dec 2008 A1
20090009476 Daley, III Jan 2009 A1
20090040426 Mather et al. Feb 2009 A1
20090073957 Newland et al. Mar 2009 A1
20090131134 Baerlocher et al. May 2009 A1
20090135318 Tateuchi et al. May 2009 A1
20090140985 Liu Jun 2009 A1
20090146992 Fukunaga et al. Jun 2009 A1
20090152748 Wang et al. Jun 2009 A1
20090158221 Nielsen et al. Jun 2009 A1
20090161385 Parker et al. Jun 2009 A1
20090167728 Geaghan et al. Jul 2009 A1
20090195497 Fitzgerald et al. Aug 2009 A1
20090231275 Odgers Sep 2009 A1
20090239586 Boeve et al. Sep 2009 A1
20090244832 Behar et al. Oct 2009 A1
20090251008 Sugaya Oct 2009 A1
20090262492 Whitchurch et al. Oct 2009 A1
20090265670 Kim et al. Oct 2009 A1
20090276734 Taylor et al. Nov 2009 A1
20090285491 Ravenscroft et al. Nov 2009 A1
20090303204 Nasiri et al. Dec 2009 A1
20090316072 Okumura et al. Dec 2009 A1
20090320244 Lin Dec 2009 A1
20090321490 Groene et al. Dec 2009 A1
20100001963 Doray et al. Jan 2010 A1
20100013738 Covannon et al. Jan 2010 A1
20100026656 Hotelling et al. Feb 2010 A1
20100038821 Jenkins et al. Feb 2010 A1
20100045609 Do et al. Feb 2010 A1
20100045633 Gettemy Feb 2010 A1
20100051356 Stern et al. Mar 2010 A1
20100051432 Lin et al. Mar 2010 A1
20100053534 Hsieh et al. Mar 2010 A1
20100077237 Sawyers Mar 2010 A1
20100079861 Powell Apr 2010 A1
20100081377 Chatterjee et al. Apr 2010 A1
20100083108 Rider et al. Apr 2010 A1
20100085321 Pundsack Apr 2010 A1
20100103112 Yoo et al. Apr 2010 A1
20100117993 Kent May 2010 A1
20100123686 Klinghult et al. May 2010 A1
20100135036 Matsuba et al. Jun 2010 A1
20100149111 Olien Jun 2010 A1
20100149134 Westerman et al. Jun 2010 A1
20100156798 Archer Jun 2010 A1
20100156913 Ortega et al. Jun 2010 A1
20100161522 Tirpak et al. Jun 2010 A1
20100164857 Liu et al. Jul 2010 A1
20100171891 Kaji et al. Jul 2010 A1
20100174421 Tsai et al. Jul 2010 A1
20100180063 Ananny et al. Jul 2010 A1
20100188299 Rinehart et al. Jul 2010 A1
20100206614 Park et al. Aug 2010 A1
20100214214 Corson et al. Aug 2010 A1
20100214257 Wussler et al. Aug 2010 A1
20100222110 Kim et al. Sep 2010 A1
20100231498 Large et al. Sep 2010 A1
20100231510 Sampsell et al. Sep 2010 A1
20100231556 Mines et al. Sep 2010 A1
20100238138 Goertz et al. Sep 2010 A1
20100245289 Svajda Sep 2010 A1
20100250988 Okuda et al. Sep 2010 A1
20100274932 Kose Oct 2010 A1
20100279768 Huang et al. Nov 2010 A1
20100289457 Onnerud et al. Nov 2010 A1
20100295812 Burns et al. Nov 2010 A1
20100299642 Merrell et al. Nov 2010 A1
20100302378 Marks et al. Dec 2010 A1
20100304793 Kim Dec 2010 A1
20100306538 Thomas et al. Dec 2010 A1
20100308778 Yamazaki et al. Dec 2010 A1
20100308844 Day et al. Dec 2010 A1
20100315348 Jellicoe et al. Dec 2010 A1
20100321339 Kimmel Dec 2010 A1
20100321482 Cleveland Dec 2010 A1
20100322479 Cleveland Dec 2010 A1
20100325155 Skinner et al. Dec 2010 A1
20100331059 Apgar et al. Dec 2010 A1
20110012873 Prest et al. Jan 2011 A1
20110019123 Prest et al. Jan 2011 A1
20110031287 Le Gette et al. Feb 2011 A1
20110037721 Cranfill et al. Feb 2011 A1
20110043142 Travis Feb 2011 A1
20110043990 Mickey et al. Feb 2011 A1
20110044582 Travis et al. Feb 2011 A1
20110060926 Brooks et al. Mar 2011 A1
20110069148 Jones et al. Mar 2011 A1
20110074688 Hull et al. Mar 2011 A1
20110102326 Casparian et al. May 2011 A1
20110102356 Kemppinen et al. May 2011 A1
20110115747 Powell et al. May 2011 A1
20110134032 Chiu et al. Jun 2011 A1
20110134112 Koh et al. Jun 2011 A1
20110163955 Nasiri et al. Jul 2011 A1
20110164370 McClure et al. Jul 2011 A1
20110167181 Minoo et al. Jul 2011 A1
20110167287 Walsh et al. Jul 2011 A1
20110167391 Momeyer et al. Jul 2011 A1
20110167992 Eventoff et al. Jul 2011 A1
20110179864 Raasch et al. Jul 2011 A1
20110184646 Wong et al. Jul 2011 A1
20110193787 Morishige et al. Aug 2011 A1
20110193938 Oderwald et al. Aug 2011 A1
20110199389 Lu et al. Aug 2011 A1
20110202878 Park et al. Aug 2011 A1
20110205372 Miramontes Aug 2011 A1
20110216266 Travis Sep 2011 A1
20110227913 Hyndman Sep 2011 A1
20110242138 Tribble Oct 2011 A1
20110242298 Bathiche et al. Oct 2011 A1
20110248152 Svajda et al. Oct 2011 A1
20110248920 Larsen Oct 2011 A1
20110261083 Wilson Oct 2011 A1
20110262001 Bi et al. Oct 2011 A1
20110273475 Herz et al. Nov 2011 A1
20110290686 Huang Dec 2011 A1
20110295697 Boston et al. Dec 2011 A1
20110297566 Gallagher et al. Dec 2011 A1
20110304577 Brown Dec 2011 A1
20110316807 Corrion Dec 2011 A1
20120007821 Zaliva Jan 2012 A1
20120011462 Westerman et al. Jan 2012 A1
20120019165 Igaki et al. Jan 2012 A1
20120020112 Fisher et al. Jan 2012 A1
20120023459 Westerman Jan 2012 A1
20120024682 Huang et al. Feb 2012 A1
20120044179 Hudson Feb 2012 A1
20120047368 Chinn et al. Feb 2012 A1
20120050975 Garelli et al. Mar 2012 A1
20120075249 Hoch Mar 2012 A1
20120081316 Sirpal et al. Apr 2012 A1
20120092279 Martin Apr 2012 A1
20120094257 Pillischer et al. Apr 2012 A1
20120099749 Rubin et al. Apr 2012 A1
20120117409 Lee et al. May 2012 A1
20120127118 Nolting et al. May 2012 A1
20120127126 Mattice et al. May 2012 A1
20120127573 Robinson et al. May 2012 A1
20120140396 Zeliff et al. Jun 2012 A1
20120145525 Ishikawa Jun 2012 A1
20120162693 Ito Jun 2012 A1
20120182242 Lindahl et al. Jul 2012 A1
20120188791 Voloschenko et al. Jul 2012 A1
20120194448 Rothkopf Aug 2012 A1
20120195063 Kim et al. Aug 2012 A1
20120200802 Large Aug 2012 A1
20120206937 Travis et al. Aug 2012 A1
20120224073 Miyahara Sep 2012 A1
20120229634 Laett et al. Sep 2012 A1
20120246377 Bhesania Sep 2012 A1
20120256829 Dodge Oct 2012 A1
20120256959 Ye et al. Oct 2012 A1
20120274811 Bakin Nov 2012 A1
20120300275 Vilardell et al. Nov 2012 A1
20130021289 Chen et al. Jan 2013 A1
20130046397 Fadell et al. Feb 2013 A1
20130063873 Wodrich et al. Mar 2013 A1
20130076617 Csaszar et al. Mar 2013 A1
20130100082 Bakin et al. Apr 2013 A1
20130106766 Yilmaz et al. May 2013 A1
20130120466 Chen et al. May 2013 A1
20130127980 Haddick et al. May 2013 A1
20130154959 Lindsay et al. Jun 2013 A1
20130155723 Coleman Jun 2013 A1
20130172906 Olson et al. Jul 2013 A1
20130182246 Tanase Jul 2013 A1
20130207937 Lutian Aug 2013 A1
20130212483 Brakensiek et al. Aug 2013 A1
20130222272 Martin, Jr. Aug 2013 A1
20130222274 Mori et al. Aug 2013 A1
20130222323 McKenzie Aug 2013 A1
20130229335 Whitman Sep 2013 A1
20130232280 Perek Sep 2013 A1
20130308339 Woodgate et al. Nov 2013 A1
20130328761 Boulanger Dec 2013 A1
20140012401 Perek Jan 2014 A1
20140043275 Whitman Feb 2014 A1
20140372914 Byrd et al. Dec 2014 A1
20140379942 Perek et al. Dec 2014 A1
20150005953 Fadell et al. Jan 2015 A1
Foreign Referenced Citations (57)
Number Date Country
1515937 Jul 2004 CN
1650202 Aug 2005 CN
1700072 Nov 2005 CN
1787605 Jun 2006 CN
1920642 Feb 2007 CN
101038401 Sep 2007 CN
101366001 Feb 2009 CN
101473167 Jul 2009 CN
101512403 Aug 2009 CN
101644979 Feb 2010 CN
101688991 Mar 2010 CN
101889225 Nov 2010 CN
101893785 Nov 2010 CN
2353978 Aug 2011 EP
2410116 Jul 2005 GB
2428101 Jan 2007 GB
H07218865 Aug 1995 JP
H0980354 Mar 1997 JP
H09178949 Jul 1997 JP
H10234057 Sep 1998 JP
10326124 Dec 1998 JP
2000106021 Apr 2000 JP
2002100226 Apr 2002 JP
2002162912 Jun 2002 JP
2003215349 Jul 2003 JP
2004171948 Jun 2004 JP
2005077437 Mar 2005 JP
2005156932 May 2005 JP
2005331565 Dec 2005 JP
2006004877 Jan 2006 JP
2006278251 Oct 2006 JP
2006294361 Oct 2006 JP
2006310269 Nov 2006 JP
2007184286 Jul 2007 JP
2007273288 Oct 2007 JP
2008066152 Mar 2008 JP
2008286874 Jul 2008 JP
2008529251 Jul 2008 JP
2009059583 Mar 2009 JP
2010151951 Jul 2010 JP
20010039013 May 2001 KR
20080009490 Jan 2008 KR
20080055051 Jun 2008 KR
WO-0128309 Apr 2001 WO
WO-0172037 Sep 2001 WO
WO-03048635 Jun 2003 WO
WO-03083530 Sep 2003 WO
WO-2005059874 Jun 2005 WO
WO-2006044818 Apr 2006 WO
WO-2006082444 Aug 2006 WO
WO-2007094304 Aug 2007 WO
WO-2007123202 Nov 2007 WO
WO-2008013146 Jan 2008 WO
WO-2008038016 Apr 2008 WO
WO-2012174364 Dec 2012 WO
WO-2013033274 Mar 2013 WO
WO-2013163347 Oct 2013 WO
Non-Patent Literature Citations (153)
Entry
“Developing Next-Generation Human Interfaces using Capacitive and Infrared Proximity Sensing”, Retrieved at <<http://www.silabs.com/pages/DownloadDoc.aspx?FILEURL=support%20documents/technicaldocs/capacitive%20and%20proximity%20sensing_wp.pdf&src=SearchResults>>, Retrieved Date: Jan. 3, 2012, 10 pages.
“Optical Sensors in Smart Mobile Devices”, Retrieved at <<http://www.onsemi.jp/pub_link/Collateral/TND415-D.PDF>>, Nov. 2010, 13 pages.
“Directional Backlighting for Display Panels”, U.S. Appl. No. 13/021,448, filed Feb. 4, 2011, 38 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,195, (Jul. 8, 2013), 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/021,448, (Aug. 16, 2013), 25 pages.
“Accessing Device Sensors”, retrieved from <https://developer.palm.com/content/api/dev-guide/pdk/accessing-device-sensors.html> on May 25, 2012, 4 pages.
“ACPI Docking for Windows Operating Systems”, Retrieved from: <http://www.scritube.com/limba/engleza/software/ACPI-Docking-for-Windows-Opera331824193.php> on Jul. 6, 2012, 10 pages.
“Cholesteric Liquid Crystal”, Retrieved from: <http://en.wikipedia.org/wiki/Cholesteric_liquid_crystal> on Aug. 6, 2012, (Jun. 10, 2012), 2 pages.
“Cirago Slim Case®—Protective case with built-in kickstand for your iPhone 5®”, Retrieved from <http://cirago.com/wordpress/wp-content/uploads/2012/10/ipc1500brochure1.pdf> on Jan. 29, 2013, 1 page.
“DR2PA”, retrieved from <http://www.architainment.co.uk/wp-content/uploads/2012/08/DR2PA-AU-US-size-Data-Sheet-Rev-H_LOGO.pdf> on Sep. 17, 2012, 4 pages.
“Final Office Action”, U.S. Appl. No. 13/651,195, (Apr. 18, 2013), 13 pages.
“First One Handed Fabric Keyboard with Bluetooth Wireless Technology”, Retrieved from: <http://press.xtvworld.com/article3817.html> on May 8, 2012, (Jan. 6, 2005), 2 pages.
“Force and Position Sensing Resistors: An Emerging Technology”, Interlink Electronics, Available at <http://staff.science.uva.nl/˜vlaander/docu/FSR/An_Exploring_Technology.pdf>, (Feb. 1990), pp. 1-6.
“Frogpad Introduces Wearable Fabric Keyboard with Bluetooth Technology”, Retrieved from: <http://www.geekzone.co.nz/content.asp?contentid=3898> on May 7, 2012, (Jan. 7, 2005), 3 pages.
“How to Use the iPad's Onscreen Keyboard”, Retrieved from <http://www.dummies.com/how-to/content/how-to-use-the-ipads-onscreen-keyboard.html> on Aug. 28, 2012, 3 pages.
“i-Interactor electronic pen”, Retrieved from: <http://www.alibaba.com/product-gs/331004878/i_Interactor_electronic_pen.html> on Jun. 19, 2012, 5 pages.
“Incipio LG G-Slate Premium Kickstand Case—Black Nylon”, Retrieved from: <http://www.amazon.com/Incipio-G-Slate-Premium-Kickstand-Case/dp/B004ZKP916> on May 8, 2012, 4 pages.
“Membrane Keyboards & Membrane Keypads”, Retrieved from: <http://www.pannam.com/> on May 9, 2012, (Mar. 4, 2009), 2 pages.
“Motion Sensors”, Android Developers, retrieved from <http://developer.android.com/guide/topics/sensors/sensors_motion.html> on May 25, 2012, 7 pages.
“MPC Fly Music Production Controller”, AKAI Professional, Retrieved from: <http://www.akaiprompc.com/mpc-fly> on Jul. 9, 2012, 4 pages.
“NI Releases New Maschine & Maschine Mikro”, Retrieved from <http://www.djbooth.net/index/dj-equipment/entry/ni-releases-new-maschine-mikro/> on Sep. 17, 2012, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,001, (Feb. 19, 2013), 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,139, (Mar. 21, 2013), 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,202, (Feb. 11, 2013), 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,336, (Jan. 18, 2013), 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,195, (Jan. 2, 2013), 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,232, (Jan. 17, 2013), 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,272, (Feb. 12, 2013), 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,287, (Jan. 29, 2013), 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,304, (Mar. 22, 2013), 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,327, (Mar. 22, 2013), 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,871, (Mar. 18, 2013), 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,976, (Feb. 22, 2013), 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,321, (Feb. 1, 2013), 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,682, (Feb. 7, 2013), 11 pages.
“Notice of Allowance”, U.S. Appl. No. 13/470,633, (Mar. 22, 2013), 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,202, (May 28, 2013), 7 pages.
“On-Screen Keyboard for Windows 7, Vista, XP with Touchscreen”, Retrieved from <www.comfort-software.com/on-screen-keyboard.html> on Aug. 28, 2012, (Feb. 2, 2011), 3 pages.
“Position Sensors”, Android Developers, retrieved from <http://developer.android.com/guide/topics/sensors/sensors_position.html> on May 25, 2012, 5 pages.
“Reflex LCD Writing Tablets”, retrieved from <http://www.kentdisplays.com/products/lcdwritingtablets.html> on Jun. 27, 2012, 3 pages.
“Restriction Requirement”, U.S. Appl. No. 13/471,139, (Jan. 17, 2013), 7 pages.
“Restriction Requirement”, U.S. Appl. No. 13/651,304, (Jan. 18, 2013), 7 pages.
“Restriction Requirement”, U.S. Appl. No. 13/651,726, (Feb. 22, 2013), 6 pages.
“Restriction Requirement”, U.S. Appl. No. 13/651,871, (Feb. 7, 2013), 6 pages.
“SMART Board™ Interactive Display Frame Pencil Pack”, Available at <http://downloads01.smarttech.com/media/sitecore/en/support/product/sbfpd/400series(interactivedisplayframes)/guides/smartboardinteractivedisplayframepencilpackv12mar09.pdf>, (2009), 2 pages.
“SolRx™ E-Series Multidirectional Phototherapy Expandable™ 2-Bulb Full Body Panel System”, Retrieved from: <http://www.solarcsystems.com/us_multidirectional_uv_light_therapy_1_intro.html> on Jul. 25, 2012, (2011), 4 pages.
“The Microsoft Surface Tablets Comes With Impressive Design and Specs”, Retrieved from <http://microsofttabletreview.com/the-microsoft-surface-tablets-comes-with-impressive-design-and-specs> on Jan. 30, 2013, (Jun. 2012), 2 pages.
“Tilt Shift Lenses: Perspective Control”, retrieved from <http://www.cambridgeincolour.com/tutorials/tilt-shift-lenses1.htm>, (Mar. 28, 2008), 11 pages.
“Virtualization Getting Started Guide”, Red Hat Enterprise Linux 6, Edition 0.2, retrieved from <http://docs.redhat.com/docs/en-US/Red_Hat_Enterprise_Linux/6/html-single/Virtualization_Getting_Started_Guide/index.html> on Jun. 13, 2012, 24 pages.
“What is Active Alignment?”, http://www.kasalis.com/active_alignment.html, retrieved on Nov. 22, 2012, 2 pages.
Block, Steve et al., “DeviceOrientation Event Specification”, W3C, Editor's Draft, retrieved from <https://developer.palm.com/content/api/dev-guide/pdk/accessing-device-sensors.html> on May 25, 2012, (Jul. 12, 2011), 14 pages.
Brown, Rich “Microsoft Shows Off Pressure-Sensitive Keyboard”, retrieved from <http://news.cnet.com/8301-17938_105-10304792-1.html> on May 7, 2012, (Aug. 6, 2009), 2 pages.
Butler, Alex et al., “SideSight: Multi-“touch” Interaction around Small Devices”, In the proceedings of the 21st annual ACM symposium on User interface software and technology, retrieved from <http://research.microsoft.com/pubs/132534/sidesight_crv3.pdf> on May 29, 2012, (Oct. 19, 2008), 4 pages.
Crider, Michael “Sony Slate Concept Tablet “Grows” a Kickstand”, Retrieved from: <http://androidcommunity.com/sony-slate-concept-tablet-grows-a-kickstand-20120116/> on May 4, 2012, (Jan. 16, 2012), 9 pages.
Das, Apurba et al., “Study of Heat Transfer through Multilayer Clothing Assemblies: A Theoretical Prediction”, Retrieved from <http://www.autexrj.com/cms/zalaczone_pliki/5_013_11.pdf>, (Jun. 2011), 7 pages.
Dietz, Paul H., et al., “A Practical Pressure Sensitive Computer Keyboard”, In Proceedings of UIST 2009, (Oct. 2009), 4 pages.
Gaver, William W., et al., “A Virtual Window on Media Space”, retrieved from <http://www.gold.ac.uk/media/15gaver-smets-overbeeke.MediaSpaceWindow.chi95.pdf> on Jun. 1, 2012, (May 7, 1995), 9 pages.
Glatt, Jeff “Channel and Key Pressure (Aftertouch)”, Retrieved from: <http://home.roadrunner.com/˜jgglatt/tutr/touch.htm> on Jun. 11, 2012, 2 pages.
Hanlon, Mike “ElekTex Smart Fabric Keyboard Goes Wireless”, Retrieved from: <http://www.gizmag.com/go/5048/> on May 7, 2012, (Jan. 15, 2006), 5 pages.
Harada, Susumu et al., “VoiceDraw: A Hands-Free Voice-Driven Drawing Application for People With Motor Impairments”, In Proceedings of Ninth International ACM SIGACCESS Conference on Computers and Accessibility, retrieved from <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.113.7211&rep=rep1&type=pdf> on Jun. 1, 2012, (Oct. 15, 2007), 8 pages.
Iwase, Eiji “Multistep Sequential Batch Assembly of Three-Dimensional Ferromagnetic Microstructures with Elastic Hinges”, Retrieved at <<http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1549861>>, Journal of Microelectromechanical Systems, (Dec. 2005), 7 pages.
Kaufmann, Benoit et al., “Hand Posture Recognition Using Real-time Artificial Evolution”, EvoApplications'09, retrieved from <http://evelyne.lutton.free.fr/Papers/KaufmannEvolASP2010.pdf> on Jan. 5, 2012, (Apr. 3, 2010), 10 pages.
Kaur, Sukhmani “Vincent Liew's redesigned laptop satisfies ergonomic needs”, Retrieved from: <http://www.designbuzz.com/entry/vincent-liew-s-redesigned-laptop-satisfies-ergonomic-needs/> on Jul. 27, 2012, (Jun. 21, 2010), 4 pages.
Khuntontong, Puttachat et al., “Fabrication of Molded Interconnection Devices by Ultrasonic Hot Embossing on Thin Polymer Films”, IEEE Transactions on Electronics Packaging Manufacturing, vol. 32, No. 3, (Jul. 2009), pp. 152-156.
Linderholm, Owen “Logitech Shows Cloth Keyboard for PDAs”, Retrieved from: <http://www.pcworld.com/article/89084/logitech_shows_cloth_keyboard_for_pdas.html> on May 7, 2012, (Mar. 15, 2002), 5 pages.
Manresa-Yee, Cristina et al., “Experiences Using a Hands-Free Interface”, In Proceedings of the 10th International ACM SIGACCESS Conference on Computers and Accessibility, retrieved from <http://dmi.uib.es/˜cmanresay/Research/%5BMan08%5DAssets08.pdf> on Jun. 1, 2012, (Oct. 13, 2008), pp. 261-262.
McLellan, Charles “Eleksen Wireless Fabric Keyboard: a first look”, Retrieved from: <http://www.zdnetasia.com/eleksen-wireless-fabric-keyboard-a-first-look-40278954.htm> on May 7, 2012, (Jul. 17, 2006), 9 pages.
Nakanishi, Hideyuki et al., “Movable Cameras Enhance Social Telepresence in Media Spaces”, In Proceedings of the 27th International Conference on Human Factors in Computing Systems, retrieved from <http://smg.ams.eng.osaka-u.ac.jp/˜nakanishi/hnp_2009_chi.pdf> on Jun. 1, 2012, (Apr. 6, 2009), 10 pages.
Piltch, Avram “ASUS Eee Pad Slider SL101 Review”, Retrieved from <http://www.laptopmag.com/review/tablets/asus-eee-pad-slider-sl101.aspx>, (Sep. 22, 2011), 5 pages.
Post, E.R. et al., “E-Broidery: Design and Fabrication of Textile-Based Computing”, IBM Systems Journal, vol. 39, Issue 3 & 4, (Jul. 2000), pp. 840-860.
Purcher, Jack “Apple is Paving the Way for a New 3D GUI for IOS Devices”, Retrieved from: <http://www.patentlyapple.com/patently-apple/2012/01/apple-is-paving-the-way-for-a-new-3d-gui-for-ios-devices.html> on Jun. 4, 2012, (Jan. 12, 2012), 15 pages.
Qin, Yongqiang et al., “pPen: Enabling Authenticated Pen and Touch Interaction on Tabletop Surfaces”, In Proceedings of ITS 2010, Available at <http://www.dfki.de/its2010/papers/pdf/po172.pdf>, (Nov. 2010), pp. 283-284.
Reilink, Rob et al., “Endoscopic Camera Control by Head Movements for Thoracic Surgery”, In Proceedings of 3rd IEEE RAS & EMBS International Conference of Biomedical Robotics and Biomechatronics, retrieved from <http://doc.utwente.nl/74929/1/biorob_online.pdf> on Jun. 1, 2012, (Sep. 26, 2010), pp. 510-515.
Sumimoto, Mark “Touch & Write: Surface Computing With Touch and Pen Input”, Retrieved from: <http://www.gottabemobile.com/2009/08/07/touch-write-surface-computing-with-touch-and-pen-input/> on Jun. 19, 2012, (Aug. 7, 2009), 4 pages.
Sundstedt, Veronica “Gazing at Games: Using Eye Tracking to Control Virtual Characters”, In ACM SIGGRAPH 2010 Courses, retrieved from <http://www.tobii.com/Global/Analysis/Training/EyeTrackAwards/veronica_sundstedt.pdf> on Jun. 1, 2012, (Jul. 28, 2010), 85 pages.
Takamatsu, Seiichi et al., “Flexible Fabric Keyboard with Conductive Polymer-Coated Fibers”, In Proceedings of Sensors 2011, (Oct. 28, 2011), 4 pages.
Valli, Alessandro “Notes on Natural Interaction”, retrieved from <http://www.idemployee.id.tue.nl/g.w.m.rauterberg/lecturenotes/valli-2004.pdf> on Jan. 5, 2012, (Sep. 2005), 80 pages.
Valliath, G T., “Design of Hologram for Brightness Enhancement in Color LCDs”, Retrieved from <http://www.loreti.it/Download/PDF/LCD/44_05.pdf> on Sep. 17, 2012, 5 pages.
Vaucelle, Cati “Scopemate, A Robotic Microscope!”, Architectradure, retrieved from <http://architectradure.blogspot.com/2011/10/at-uist-this-monday-scopemate-robotic.html> on Jun. 6, 2012, (Oct. 17, 2011), 2 pages.
Williams, Jim “A Fourth Generation of LCD Backlight Technology”, Retrieved from <http://cds.linear.com/docs/Application%20Note/an65f.pdf>, (Nov. 1995), 124 pages.
Xu, Zhang et al., “Hand Gesture Recognition and Virtual Game Control Based on 3D Accelerometer and EMG Sensors”, IUI'09, Feb. 8-11, 2009, retrieved from <http://sclab.yonsei.ac.kr/courses/10TPR/10TPR.files/Hand%20Gesture%20Recognition%20and%20Virtual%20Game%20Control%20based%20on%203d%20accelerometer%20and%20EMG%20sensors.pdf> on Jan. 5, 2012, (Feb. 8, 2009), 5 pages.
Xu, Zhi-Gang et al., “Vision-based Detection of Dynamic Gesture”, ICTM'09, Dec. 5-6, 2009, retrieved from <http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5412956> on Jan. 5, 2012, (Dec. 5, 2009), pp. 223-226.
Zhang, et al., “Model-Based Development of Dynamically Adaptive Software”, In Proceedings of ICSE 2006, Available at <http://www.irisa.fr/lande/lande/icse-proceedings/icse/p371.pdf>, (May 20, 2006), pp. 371-380.
Zhu, Dingyun et al., “Keyboard before Head Tracking Depresses User Success in Remote Camera Control”, In Proceedings of 12th IFIP TC 13 International Conference on Human-Computer Interaction, Part II, retrieved from <http://csiro.academia.edu/Departments/CSIRO_ICT_Centre/Papers?page=5> on Jun. 1, 2012, (Aug. 24, 2009), 14 pages.
“Optics for Displays: Waveguide-based Wedge Creates Collimated Display Backlight”, OptoIQ, retrieved from <http://www.optoiq.com/index/photonics-technologies-applications/lfw-display/lfw-article-display.articles.laser-focus-world.volume-46.issue-1.world-news.optics-for_displays.html> on Nov. 2, 2010, (Jan. 1, 2010), 3 pages.
Travis, Adrian et al., “Collimated Light from a Waveguide for a Display Backlight”, Optics Express, 19714, vol. 17, No. 22, retrieved from <http://download.microsoft.com/download/D/2/E/D2E425F8-CF3C-4C71-A4A2-70F9D4081007/OpticsExpressbacklightpaper.pdf> on Oct. 15, 2009, 6 pages.
Travis, Adrian et al., “The Design of Backlights for View-Sequential 3D”, retrieved from <http://download.microsoft.com/download/D/2/E/D2E425F8-CF3C-4C71-A4A2-70F9D4081007/Backlightforviewsequentialautostereo.docx> on Nov. 1, 2010, 4 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/021,448, (Dec. 13, 2012), 9 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/043961, Oct. 17, 2013, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/371,725, Nov. 7, 2013, 19 pages.
“International Search Report”, Application No. PCT/US2010/045676, Apr. 28, 2011, 2 Pages.
“International Search Report”, Application No. PCT/US2010/046129, Mar. 2, 2011, 3 Pages.
“What is the PD-Net Project About?”, retrieved from <http://pd-net.org/about/> on Mar. 10, 2011, 3 pages.
“Real-Time Television Content Platform”, retrieved from <http://www.accenture.com/us-en/pages/insight-real-time-television-platform.aspx> on Mar. 10, 2011, May 28, 2002, 3 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/055679, Nov. 18, 2013, 8 pages.
Kim et al., “A Controllable Viewing Angle LCD with an Optically Isotropic Liquid Crystal”, Journal of Physics D: Applied Physics, vol. 43, No. 14, Mar. 23, 2010, 7 pages.
Lee, “Flat-panel Backlight for View-sequential 3D Display”, Optoelectronics, IEE Proceedings, vol. 151, No. 6, IET, Dec. 2004, 4 pages.
Travis, et al., “Flat Projection for 3-D”, In Proceedings of the IEEE, vol. 94, Issue 3, Available at <http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1605201>, Mar. 13, 2006, pp. 539-549.
Travis, et al., “P-127: Linearity in Flat Panel Wedge Projection”, SID 03 Digest, retrieved from <http://www2.eng.cam.ac.uk/˜arlt1/Linearity%20in%20flat%20panel%20wedge%20projection.pdf>, May 12, 2005, pp. 716-719.
Yagi, “The Concept of “AdapTV””, Series: The Challenge of “AdapTV”, Broadcast Technology, No. 28, 2006, pp. 16-17.
“Final Office Action”, U.S. Appl. No. 13/021,448, Jan. 16, 2014, 33 Pages.
“Final Office Action”, U.S. Appl. No. 13/371,725, Apr. 2, 2014, 22 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/494,651, Feb. 4, 2014, 15 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/494,651, Oct. 24, 2014, 2 pages.
“EP Search Report”, EP Application No. 09812072.8, Apr. 5, 2012, 6 Pages.
“Final Office Action”, U.S. Appl. No. 13/494,651, Jun. 11, 2014, 19 pages.
“Foreign Office Action”, CN Application No. 200980134848, May 13, 2013, 7 Pages.
“Foreign Office Action”, CN Application No. 200980134848, May 31, 2012, 7 Pages.
“Foreign Office Action”, CN Application No. 200980134848, Dec. 4, 2013, 8 Pages.
“Foreign Office Action”, CN Application No. 200980134848, Dec. 19, 2012, 8 Pages.
“Foreign Office Action”, CN Application No. 201080037117.7, Jul. 1, 2014, 9 Pages.
“Foreign Office Action”, CN Application No. 201210023945.6, Jun. 25, 2014, 6 Pages.
“Foreign Office Action”, JP Application No. 2011-526118, Aug. 16, 2013, 8 Pages.
“Foreign Office Action”, JP Application No. 2012-525632, May 2, 2014, 10 Pages.
“Foreign Office Action”, JP Application No. 2012-525722, Apr. 22, 2014, 15 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2009/055250, Mar. 2, 2014, 10 Pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028488, Jun. 24, 2014, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/021,448, Jul. 22, 2014, 35 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/371,725, Nov. 3, 2014, 27 pages.
“Notice of Allowance”, U.S. Appl. No. 13/494,651, Oct. 2, 2014, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 14/018,286, May 23, 2014, 8 pages.
“Search Report”, EP Application No. 09812072.8, Apr. 17, 2013, 5 Pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 14/018,286, Jun. 11, 2014, 5 pages.
Boual, et al., “Wedge Displays as Cameras”, Retrieved From: http://www.camfpd.com/72-3.pdf, SID Symposium Digest of Technical Papers, vol. 37, Issue 1, pp. 1999-2002, Jun. 2006, 4 Pages.
Chen, et al., “Design of a Novel Hybrid Light Guide Plate for Viewing Angle Switchable Backlight Module”, Institute of Photonic Systems, National Chiao Tung University, Tainan, Taiwan, Jul. 1, 2013, 4 Pages.
Chou, et al., “Imaging and Chromatic Behavior Analysis of a Wedge-Plate Display”, Retrieved From: http://www.di.nctu.edu.tw/2006TDC/papers/Flexible/06-012.doc, SID Symposium Digest of Technical Papers, vol. 37, Issue 1, pp. 1031-1034, Jun. 2006, 4 Pages.
Ishida, et al., “A Novel Ultra Thin Backlight System without Optical Sheets Using a Newly Developed Multi-Layered Light-guide”, SID 10 Digest, Jul. 5, 2012, 4 Pages.
Nishizawa, et al., “Investigation of Novel Diffuser Films for 2D Light-Distribution Control”, Tohoku University, Aramaki Aoba, Aoba-ku, Sendai 980-8579, Japan; LINTEC Corporation, 23-23 Honcho, Itabashi-ku, Tokyo 173-0001, Japan, Dec. 2011, 4 Pages.
Phillips, et al., “Links Between Holography and Lithography”, Fifth International Symposium on Display Holography, 206., Feb. 17, 1995, 9 Pages.
Powell, “High-Efficiency Projection Screen”, U.S. Appl. No. 14/243,501, Apr. 2, 2014, 26 Pages.
Travis, “P-60: LCD Smear Elimination by Scanning Ray Angle into a Light Guide”, Retrieved From: http://www2.eng.cam.ac.uk/˜arlt1/P_60.pdf, SID Symposium Digest of Technical Papers, vol. 35, Issue 1, pp. 474-477, May 2004, 4 Pages.
Travis, et al., “Optical Design of a Flat Panel Projection Wedge Display”, 9th International Display Workshops, paper FMC6-3, Dec. 4-6, 2002, Hiroshima, Japan., Dec. 2002, 4 Pages.
“Final Office Action”, U.S. Appl. No. 13/371,725, Mar. 3, 2015, 30 pages.
“Foreign Office Action”, CN Application No. 201080037117.7, Aug. 20, 2013, 10 pages.
“Foreign Office Action”, CN Application No. 201210023945.6, Dec. 3, 2013, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/059,280, Mar. 3, 2015, 18 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/494,651, Dec. 29, 2014, 2 pages.
“Final Office Action”, U.S. Appl. No. 13/021,448, Jan. 2, 2015, 19 pages.
“First Examination Report”, NZ Application No. 628690, Nov. 27, 2014, 2 pages.
“Advisory Action”, U.S. Appl. No. 14/059,280, Sep. 25, 2015, 7 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/021,448, Aug. 17, 2015, 2 pages.
“Extended European Search Report”, EP Application No. 12800433.0, Oct. 28, 2014, 10 pages.
“Extended European Search Report”, EP Application No. 13859406.4, Sep. 8, 2015, 6 pages.
“Final Office Action”, U.S. Appl. No. 14/059,280, Jul. 22, 2015, 25 pages.
“Foreign Office Action”, CN Application No. 201280029520.4, Jun. 30, 2015, 11 pages.
“Foreign Office Action”, JP Application No. 2012-525722, Aug. 13, 2014, 17 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2014/066248, Mar. 12, 2015, 10 pages.
“Notice of Allowance”, U.S. Appl. No. 13/021,448, Jul. 30, 2015, 11 pages.
“Restriction Requirement”, U.S. Appl. No. 13/598,898, Jul. 17, 2015, 6 pages.
“Foreign Office Action”, CN Application No. 201310067592.4, Oct. 23, 2015, 12 Pages.
“Foreign Office Action”, CN Application No. 201310067622.1, Oct. 27, 2015, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/598,898, Oct. 23, 2015, 18 pages.
“Notice of Allowance”, U.S. Appl. No. 14/059,280, Nov. 23, 2015, 9 pages.
Related Publications (1)
Number Date Country
20130335387 A1 Dec 2013 US