Smart electronic display for restaurants

Information

  • Patent Grant
  • Patent Number
    10,922,736
  • Date Filed
    Tuesday, May 3, 2016
  • Date Issued
    Tuesday, February 16, 2021
Abstract
Exemplary embodiments herein provide a wireless transmitter/receiver which receives a unique identifier from a smart device and finds associated order history stored on an electronic storage device. A display controlling assembly may generate individualized image data based on the associated order history and transmit the individualized image data to the electronic menu board for display. In some embodiments, menu information is sent directly to the smart device.
Description
TECHNICAL FIELD

Embodiments generally relate to electronic displays used for point-of-sale and quick-service drive-through applications.


BACKGROUND OF THE ART

Electronic displays are now being used as menu boards in restaurants, both in dine-in settings and in drive-through quick-service restaurants.


SUMMARY OF THE EXEMPLARY EMBODIMENTS

Exemplary embodiments provide a system and method for providing communication between a smart device and an electronic display menu board. A Bluetooth low energy transmitter/receiver is preferably used to determine if a compatible smart device is within close proximity to the display. If so, the system can perform a number of operations, including checking to see whether this smart device has been in close proximity before and, if so, what was purchased. The system can also transmit the menu data to the smart device so that the user can review the menu and place an order through the smart device.


The foregoing and other features and advantages of the present invention will be apparent from the following more detailed description of the particular embodiments, as illustrated in the accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS

A better understanding of an exemplary embodiment will be obtained from a reading of the following detailed description and the accompanying drawings wherein identical reference characters refer to identical parts and in which:



FIG. 1 is a simplified block diagram of one type of electronic menu board.



FIG. 2 is a schematic illustration of a user approaching one type of electronic menu board on foot.



FIG. 3 is a schematic illustration of a user approaching a second type of electronic menu board while operating an automobile.



FIG. 4 is a logic flowchart showing one embodiment for operating the displays described herein.



FIG. 5 is a logic flowchart showing a second embodiment for operating the displays described herein.



FIG. 6 is a logic flowchart showing a third embodiment for operating the displays described herein.



FIG. 7 is a logic flowchart showing a fourth embodiment for operating the displays described herein.





DETAILED DESCRIPTION

The invention is described more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the exemplary embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. In the drawings, the size and relative sizes of layers and regions may be exaggerated for clarity.


The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.


Embodiments of the invention are described herein with reference to illustrations that are schematic illustrations of idealized embodiments (and intermediate structures) of the invention. As such, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, embodiments of the invention should not be construed as limited to the particular shapes of regions illustrated herein but are to include deviations in shapes that result, for example, from manufacturing.


Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.



FIG. 1 provides a block diagram for various electronic components which may be used within an exemplary electronic display assembly. One or more power modules 21 may be placed in electrical connection with a backplane 22, which could be provided as a printed circuit board that facilitates electrical communication and/or power delivery between a number of components in the display assembly. A display controlling assembly 20 may also be in electrical connection with the backplane 22. The display controlling assembly 20 preferably includes a number of different components, including but not limited to a video player, an electronic storage device, and a microprocessor which is programmed to perform any of the logic that is described within this application as well as other logic common to this technology but not explicitly described herein. It should also be noted that any storage of data for any of the embodiments described herein can occur at any of: (1) the electronic storage device on the display controlling assembly 20, (2) a remote server 42 which can be accessed through the second data interface connection 33, or (3) the local storage device on the display controlling assembly 20 with a periodic backup stored on the remote server 42.
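
By way of illustration only, the sketch below shows one way the three storage options described above could be handled in software; the OrderHistoryStore class, the remote_client object, and its upload method are hypothetical and not part of the disclosure.

```python
# Hypothetical sketch of the storage options described above: order history may be
# kept on the local electronic storage device, on a remote server reached through
# the second data interface connection 33, or locally with periodic remote backups.
import json
import time
from pathlib import Path


class OrderHistoryStore:
    def __init__(self, local_path, remote_client=None, backup_interval_s=3600):
        self.local_path = Path(local_path)          # local electronic storage device
        self.remote_client = remote_client          # optional remote server 42 (assumed API)
        self.backup_interval_s = backup_interval_s
        self._last_backup = 0.0
        self._records = self._load()

    def _load(self):
        if self.local_path.exists():
            return json.loads(self.local_path.read_text())
        return {}

    def save_order(self, device_id, order):
        # Store the order in association with the smart device's unique identifier.
        self._records.setdefault(device_id, []).append(order)
        self.local_path.write_text(json.dumps(self._records))
        # Option (3): periodically mirror the local data to the remote server.
        if self.remote_client and time.time() - self._last_backup > self.backup_interval_s:
            self.remote_client.upload(self._records)
            self._last_backup = time.time()

    def orders_for(self, device_id):
        return self._records.get(device_id, [])
```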


This figure also shows a backlight 23, LCD assembly 24, and a front transparent display panel 25. The backlight 23 may be a CCFL or light emitting diode (LED) backlight. It should be noted that although the setup for an LCD is shown, embodiments can be practiced with any electronic image-producing assembly. Thus any other flat panel display could be used, including but not limited to plasma, light-emitting polymers, and organic light emitting diode (OLED) displays. A fan assembly 26 is shown for optionally cooling displays which may reach elevated temperatures. One or more temperature sensors 27 may be used to monitor the temperature of the display assembly, and selectively engage fan assembly 26 when cooling is needed. An ambient light sensor 28 is preferably positioned to measure the amount of ambient light that is contacting the front display panel 25.
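
As an illustration of the fan control described above (selectively engaging fan assembly 26 based on readings from temperature sensors 27), the following minimal sketch assumes hypothetical read_temps_c and set_fan driver callables and an arbitrary hysteresis band.

```python
# Hypothetical control loop for selectively engaging fan assembly 26 when the
# temperature sensors 27 report elevated temperatures. The read_temps_c and
# set_fan callables stand in for hardware drivers not described in the patent.
import time


def fan_control_loop(read_temps_c, set_fan, on_at_c=50.0, off_at_c=45.0, period_s=5.0):
    """Switch the fan on above on_at_c and off below off_at_c (hysteresis)."""
    fan_on = False
    while True:
        hottest = max(read_temps_c())            # one or more temperature sensors
        if not fan_on and hottest >= on_at_c:
            set_fan(True)
            fan_on = True
        elif fan_on and hottest <= off_at_c:
            set_fan(False)
            fan_on = False
        time.sleep(period_s)
```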


A variety of different electrical inputs/outputs are also shown, and all or only a select few of the inputs/outputs may be present in any given embodiment. The AC power input 30 delivers the incoming power to the backplane 22. A video signal input 31 can receive video signals from a plurality of different sources. In a preferred embodiment, the video signal input 31 would be an HDMI input. Two data interface connections 32 and 33 are also shown. The first data interface connection 32 is preferably a Bluetooth low energy transmitter/receiver. In an exemplary embodiment, the data interface connection 32 is provided as an iBeacon transmitter/receiver. The second data interface connection 33 may be a network connection such as an Ethernet port, wireless network connection, or a satellite network connection. The second data interface connection 33 preferably allows the display assembly to communicate with the internet, and may also permit a remote user to communicate with the display assembly. The second data interface connection 33 can also provide the video data through a network source, and can be utilized to transmit display settings, error messages, and various other forms of data to a website for access and control by the user. Optional audio connections 34 may also be provided for connection to internal or external speaker assemblies.
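
Where the first data interface connection 32 is provided as an iBeacon transmitter/receiver, its advertising payload could be assembled along the lines of the sketch below, which follows the commonly documented iBeacon manufacturer-data layout; the example UUID, major/minor values, and measured power are arbitrary assumptions.

```python
# Sketch of building the manufacturer-specific advertising payload used by an
# iBeacon-style transmitter (data interface connection 32). The byte layout shown
# is the commonly documented iBeacon format; exact framing is handled by the
# Bluetooth stack and is outside the scope of this sketch.
import struct
import uuid


def ibeacon_payload(proximity_uuid: str, major: int, minor: int, measured_power_dbm: int) -> bytes:
    prefix = bytes([0x4C, 0x00,   # Apple company identifier (little-endian)
                    0x02, 0x15])  # iBeacon type and remaining length (21 bytes)
    body = uuid.UUID(proximity_uuid).bytes + struct.pack(">HHb", major, minor, measured_power_dbm)
    return prefix + body


# Example: one beacon per menu board, with arbitrary UUID and major/minor values
# identifying (for instance) the store and the drive-through lane.
payload = ibeacon_payload("e2c56db5-dffb-48d2-b060-d0f5a71096e0", major=1, minor=2,
                          measured_power_dbm=-59)
assert len(payload) == 25
```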


A backlight sensor 29 is preferably placed within the backlight cavity to measure the amount of luminance being generated within the backlight cavity. Additionally, a display luminance sensor 40 is preferably positioned in front of the display 24 in order to measure the amount of luminance exiting the display 24. A camera 41 may be positioned to record the area surrounding the display and is also preferably placed in electrical connection with the backplane 22.


The Bluetooth low energy transmitter/receiver 32 allows communication with smart phone devices which may be within relatively close proximity of the electronic display. Generally speaking, the Bluetooth low energy transmitter/receiver 32 sends out a signal to notify smart phone devices in the area of the presence of the transmitter/receiver 32 and can both push data to these devices as well as pull data from them. A number of functions using this communication are described further below.
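
One possible way to decide whether a smart device is within relatively close proximity is to estimate distance from the received signal strength using the standard log-distance path-loss model, as in the sketch below; the threshold, measured power, and path-loss exponent are assumptions rather than values taken from the disclosure.

```python
# Sketch of one way the system could decide whether a smart device is in "close
# proximity" from Bluetooth low energy signal strength, using the standard
# log-distance path-loss model. All constants here are assumptions.


def estimated_distance_m(rssi_dbm: float, measured_power_dbm: float = -59.0,
                         path_loss_exponent: float = 2.0) -> float:
    """Approximate distance from RSSI: d = 10 ** ((P_measured - RSSI) / (10 * n))."""
    return 10 ** ((measured_power_dbm - rssi_dbm) / (10.0 * path_loss_exponent))


def in_close_proximity(rssi_dbm: float, threshold_m: float = 3.0) -> bool:
    return estimated_distance_m(rssi_dbm) <= threshold_m


# Example: a reading of -65 dBm corresponds to roughly 2 m with these constants.
print(in_close_proximity(-65.0))  # True
```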



FIG. 2 is a schematic illustration of a user 14 approaching one type of electronic menu board on foot. In this embodiment, three separate displays 10, 11, and 12 are placed in a 1×3 array and contained within a housing that places a portion of the bezel 13 in between each display. Each display 10, 11, and 12 may contain each of the components shown above in FIG. 1 or only a portion of the shown components. Once the user 14 enters a certain proximity to the displays 10, 11, and 12, the transmitter/receiver 32 of one or more of the displays may begin communicating with the smart phone device 15 of the user 14.



FIG. 3 is a schematic illustration of a user approaching a second type of electronic menu board while operating an automobile 16. Here, a single monolithic display 110 is contained within a housing. This assembly lacks the bezels that would otherwise need to be placed between adjacent displays. The display 110 can be driven in separately controllable areas 100a-100f, where in this case each area is being driven to show a different image. This assembly requires only a single collection of the components shown in FIG. 1. Once the automobile 16 enters a certain proximity to the display 110, the transmitter/receiver 32 of the display may begin communicating with the smart phone device 15 within the automobile 16. It should be noted that a user can approach a monolithic display such as this on foot (similar to what is shown in FIG. 2). Likewise, the array of displays shown in FIG. 2 can be approached by a user operating an automobile 16, as shown in FIG. 3.
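
As an illustration of driving the monolithic display 110 in separately controllable areas, the sketch below composites a different image into each area of a single framebuffer; the 2x3 grid, resolution, and test colors are assumptions for the example only.

```python
# Sketch of driving a single monolithic display 110 in separately controllable
# areas (100a-100f in FIG. 3): each area is filled from its own image so that
# different content can be shown per area. A 2x3 grid of equal areas is assumed.
import numpy as np


def composite_areas(width, height, area_images, rows=2, cols=3):
    """Paste one RGB image per area into a single full-screen framebuffer."""
    frame = np.zeros((height, width, 3), dtype=np.uint8)
    area_h, area_w = height // rows, width // cols
    for idx, img in enumerate(area_images):
        r, c = divmod(idx, cols)
        cropped = img[:area_h, :area_w]          # naive crop; real code would scale
        frame[r * area_h:r * area_h + cropped.shape[0],
              c * area_w:c * area_w + cropped.shape[1]] = cropped
    return frame


# Example: six solid-color test images, one per controllable area.
colors = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 0), (0, 255, 255), (255, 0, 255)]
areas = [np.full((540, 640, 3), c, dtype=np.uint8) for c in colors]
frame = composite_areas(1920, 1080, areas)
```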



FIG. 4 is a logic flowchart showing one embodiment for operating the displays described herein. Initially, the transmitter/receiver 32 transmits outgoing signals as well as receives incoming signals. The system then determines if a smart device is within close proximity to the display. If not, the system returns to transmit outgoing signals and receive incoming signals.


If a smart device is in close proximity to the display, the system preferably begins to search through stored identifying information (which can be stored electronically on the electronic storage device on the display controlling assembly 20 or remotely on a server that can be accessed through the second data interface connection 33) to determine if the smart device matches any of the stored identifying information. If not, the system preferably stores identifying information for the smart device along with any products ordered, which are stored in association with that identifying information.


If the smart device matches any of the stored identifying information, the system preferably accesses the electronic storage device to retrieve previous product orders placed by this smart device. Ideally, the system would then generate a message for the smart device offering to sell the user a previously purchased product. This message is transmitted to the smart device. Alternatively, the system could display the previous order on the display, prompting the user on whether they would like to place the same order. The system would then receive incoming signals and determine if the smart device indicates an acceptance of the offer by the user (if using the embodiment where the offer is sent to the smart device). If so, the system would place the order for the accepted offer and optionally display the order confirmation to the user on the display. If not, the system simply returns to transmitting outgoing signals and receiving incoming signals until another smart device is detected.
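
A minimal sketch of the FIG. 4 flow is given below; the transceiver, store, and display objects (and their methods) are hypothetical stand-ins rather than interfaces defined by the disclosure.

```python
# Sketch of the FIG. 4 flow: detect a smart device in proximity, look up its stored
# identifier, offer a previously purchased item, and place the order on acceptance.
# The transceiver, store, and display objects are hypothetical stand-ins.
import time


def fig4_loop(transceiver, store, display, poll_s=1.0):
    while True:
        transceiver.transmit_outgoing()
        device_id = transceiver.nearby_device_id()   # None if nothing in proximity
        if device_id is None:
            time.sleep(poll_s)
            continue

        history = store.orders_for(device_id)
        if not history:
            # New device: remember its identifier and whatever it ends up ordering.
            order = transceiver.wait_for_order(device_id)
            if order:
                store.save_order(device_id, order)
            continue

        # Returning device: offer the most recent previous purchase.
        offer = {"item": history[-1], "message": "Would you like your usual order?"}
        transceiver.send(device_id, offer)           # or display.show(offer)
        if transceiver.wait_for_acceptance(device_id):
            store.save_order(device_id, offer["item"])
            display.show_confirmation(offer["item"])
```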



FIG. 5 is a logic flowchart showing a second embodiment for operating the displays described herein. Similar to the initial start of the method shown in FIG. 4, the transmitter/receiver 32 transmits outgoing signals as well as receives incoming signals. The system then determines if a smart device is within close proximity to the display. If not, the system returns to transmit outgoing signals and receive incoming signals.


In this embodiment, if a smart device is detected in close proximity to the display, the system will transmit menu data to the smart device. The menu data can be stored electronically on the electronic storage device on the display controlling assembly 20. The transmitter/receiver 32 would then receive any incoming signals and determine if the smart device has placed an order for a menu selection. If not, the system returns to transmitting outgoing signals and receiving incoming signals. If so, the system places the order for the menu selection and optionally displays a confirmation of the order to the user through the display.
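
The FIG. 5 flow could be sketched as follows, again using hypothetical transceiver and display stand-ins.

```python
# Sketch of the FIG. 5 flow: once a smart device is detected nearby, push the menu
# data stored on the display controlling assembly 20 to the device and then wait
# for an order selection. Object interfaces are hypothetical stand-ins.
def fig5_step(transceiver, menu_data, display):
    device_id = transceiver.nearby_device_id()
    if device_id is None:
        return False                              # keep transmitting/receiving
    transceiver.send(device_id, {"menu": menu_data})
    order = transceiver.wait_for_order(device_id)
    if order:
        display.show_confirmation(order)          # optional on-screen confirmation
        return True
    return False
```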



FIG. 6 is a logic flowchart showing a third embodiment for operating the displays described herein. This embodiment is similar to the logic shown above in FIG. 4, but with a couple of notable differences. First, in this embodiment, if the system does not detect a smart device in close proximity, a message is displayed which prompts the viewer to turn on their smart device's Bluetooth functionality. The second difference between this embodiment and that of FIG. 4 is that the time elapsed during the taking and/or filling of the order can be stored. This permits a later statistical analysis of the times for taking an order and filling an order, in order to analyze and improve the performance of the system.
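
A minimal sketch of the elapsed-time recording described for FIG. 6 follows; the OrderTimer class and the take_order/fill_order callables are illustrative assumptions.

```python
# Sketch of the timing data mentioned for FIG. 6: store the elapsed time for taking
# and for filling each order so performance can be analyzed later.
import statistics
import time


class OrderTimer:
    """Records the elapsed time for taking and for filling each order."""

    def __init__(self):
        self.records = []                      # list of (take_seconds, fill_seconds)

    def time_order(self, take_order, fill_order):
        t0 = time.monotonic()
        order = take_order()                   # e.g. wait for the smart device's selection
        t1 = time.monotonic()
        fill_order(order)                      # e.g. wait for the order to be filled
        t2 = time.monotonic()
        self.records.append((t1 - t0, t2 - t1))
        return order

    def summary(self):
        if not self.records:
            return {}
        takes, fills = zip(*self.records)
        return {"mean_take_s": statistics.mean(takes),
                "mean_fill_s": statistics.mean(fills)}
```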



FIG. 7 is a logic flowchart showing a fourth embodiment for operating the displays described herein. This embodiment would preferably utilize the camera 41 described above. In this embodiment, the display would normally be driven with the backlight at a reduced power level in order to save power (and reduce wear on some of the electronics and fans). As the system receives data from the camera 41, the backlight power can be increased once a vehicle or viewer is determined to be in close proximity to the display. At that point, the system may check to see if the viewer's sex can be identified. Optionally, the system can also check to see if the viewer's age can be identified. If the viewer's age and sex can both be identified, then an offer is displayed or transmitted to the viewer which is tailored to the viewer's age and sex. If the viewer's age can be determined but not their sex, then an offer is displayed or transmitted to the viewer which is tailored to the viewer's age. If the viewer's sex can be determined but not their age, then an offer is displayed or transmitted to the viewer which is tailored to the viewer's sex. If neither the viewer's sex nor age can be determined, then the system may display or transmit the normal menu offerings. As used in this embodiment, the term ‘transmit’ means the electronic transmission of an offer to the user's smart device.
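
The FIG. 7 behavior could be sketched as shown below; the camera, backlight, and offers objects, the detection dictionary fields, and the power levels are all illustrative assumptions.

```python
# Sketch of the FIG. 7 behavior: keep the backlight at a reduced power level until
# the camera 41 reports a viewer or vehicle nearby, then raise the backlight and
# choose an offer tailored to whatever demographic data (age and/or sex) the
# recognition software can determine. All thresholds and offer tables are assumptions.
def fig7_step(camera, backlight, offers, normal_menu, idle_level=0.3, active_level=1.0):
    detection = camera.detect()                   # e.g. {"present": True, "age": 34, "sex": "F"}
    if not detection.get("present"):
        backlight.set_power(idle_level)           # save power and reduce wear
        return normal_menu

    backlight.set_power(active_level)
    age, sex = detection.get("age"), detection.get("sex")
    if age is not None and sex is not None:
        return offers.for_age_and_sex(age, sex)
    if age is not None:
        return offers.for_age(age)
    if sex is not None:
        return offers.for_sex(sex)
    return normal_menu
```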


Camera recognition software having the functionality described herein is commercially available from KeyLemon in Switzerland (www.keylemon.com) as well as FaceFirst in Westlake Village, Calif. (www.facefirst.com).


Having shown and described a preferred embodiment of the invention, those skilled in the art will realize that many variations and modifications may be made to effect the described invention and still be within the scope of the claimed invention. Additionally, many of the elements indicated above may be altered or replaced by different elements which will provide the same result and fall within the spirit of the claimed invention. It is the intention, therefore, to limit the invention only as indicated by the scope of the claims.

Claims
  • 1. A system for efficiently displaying an individualized offer for a particular menu item to a user of an identified smart device, said system comprising: an electronic menu board configured to display the individualized offer for the particular menu item to the user along with other menu items, wherein the electronic menu board comprises a monolithic display comprising separately controllable areas configured to be individually driven; a wireless transmitter/receiver in electronic communication with the electronic menu board, wherein the wireless transmitter/receiver is configured to transmit an outgoing signal within a transmission area and receive an incoming signal identifying the identified smart device within the transmission area; an electronic storage device configured to store a unique identifier for the identified smart device and order history data associated with the unique identifier, wherein the electronic storage device is in electronic communication with the wireless transmitter/receiver; a display controlling assembly electrically connected to the electronic menu board, the wireless transmitter/receiver, and the electronic storage device, wherein said display controlling assembly is configured to generate the individualized offer based on the order history data associated with the unique identifier for the identified smart device, wherein said display controlling assembly is further configured to transmit the individualized offer to the electronic menu board for display at a first one of the separately controllable areas and the identified smart device for display, wherein the particular menu item is associated with one or more menu items in the order history data associated with the unique identifier for the identified smart device, and wherein said display controlling assembly is further configured to display the other menu items in at least one other of said separately controllable areas; a second wireless transmitter/receiver electrically connected to the display controlling assembly; and a camera positioned to record a second area, said camera configured to transmit a recorded image to the display controlling assembly, wherein the display controlling assembly is further configured to drive a backlight of the electronic menu board at a baseline power level and increase the power level supplied to the backlight following receipt of the recorded image from the camera indicating the user or a vehicle associated with the user within the second area; wherein the display controlling assembly is further configured to transmit an order for the particular menu item to the second wireless transmitter/receiver upon receipt of one or more signals from the identified smart device indicating acceptance of the individualized offer.
  • 2. The system of claim 1 wherein: said electronic storage device is located remote from the electronic menu board.
  • 3. The system of claim 1 wherein: the display controlling assembly comprises facial recognition software and is configured to process the recorded image and generate viewer data for the user wherein said viewer data is also used to generate the individualized image offer.
  • 4. The system of claim 3 wherein: the viewer data comprises the user's age.
  • 5. The system of claim 3 wherein: the viewer data comprises the user's sex.
  • 6. The system of claim 1 wherein: the display controlling assembly comprises a video player, an electronic storage device, and a microprocessor.
  • 7. The system of claim 1 further comprising: a power module; and a backplane in electrical communication with the power module, electronic menu board, wireless transmitter/receiver, electronic storage device, and display controlling assembly.
  • 8. A system for displaying individualized offers for particular menu items to users of smart devices, said system comprising: a housing; a monolithic electronic menu board located within the housing and comprising separately controllable areas; a wireless transmitter/receiver located on or within the housing and configured to periodically transmit outgoing signals and periodically receive corresponding incoming signals identifying smart devices within a transmission area and order selections by users of the smart devices; an electronic storage device electrically connected to the wireless transmitter/receiver and configured to store a unique identifier for each of the smart devices identified by the wireless transmitter/receiver along with order history data associated with each of the unique identifiers; and a display controlling assembly electrically connected to the electronic storage device and the monolithic electronic menu board, wherein said display controlling assembly is configured to receive the associated order history data, generate the individualized offers based on the associated order history data, transmit the individualized offers to the monolithic electronic menu board for display at a first one of the separately controllable areas and the identified smart devices for display, wherein each of the particular menu items comprise menu items listed in the order history data associated with the respective unique identifier, and receive signals from one or more of the smart devices indicating acceptance of the respective individualized offers, wherein said display controlling assembly is further configured to display other menu items at other ones of said separately controllable areas of said monolithic electronic menu board; a second wireless transmitter/receiver electrically connected to the display controlling assembly and configured to receive signals from the display controlling assembly indicating acceptance of the individualized offers; and a camera positioned to record an area in proximity to the electronic menu board, wherein said camera is configured to transmit recorded images to the display controlling assembly, and wherein the display controlling assembly is further configured to drive a backlight of the electronic menu board at a baseline power level and increase the power level supplied to the backlight following receipt of the recorded images at the camera indicating one or more users or vehicles within the area.
  • 9. The system of claim 8 further comprising: an elapsed time recorder in communication with the display controlling assembly and the electronic storage device, wherein said elapsed time recorder is configured to measure the elapsed time between receiving the order selection and filling the order and transmit the elapsed time to the electronic storage device.
  • 10. The system of claim 8 wherein: the display controlling assembly comprises facial recognition software configured to process the recorded image and generate viewer data wherein said viewer data is also used to generate the individualized offer.
  • 11. A method for displaying an individualized offer to a user of an identified smart device, said method comprising the steps of: driving a backlight of an electronic menu board at a first power level; transmitting an outgoing wireless signal from a wireless transmitter/receiver located at the electronic menu board within a transmission area; receiving an incoming wireless signal at the wireless transmitter/receiver from the identified smart device within the transmission area of the outgoing wireless signal; receiving images from a camera indicating the user or a vehicle associated with the user within a detection area; driving the backlight of the electronic menu board at a second power level that is greater than the first power level following the receipt of the images from the camera indicating detection of at least one user or the vehicle within the detection area; analyzing the incoming wireless signal from the wireless transmitter/receiver to identify a unique identifier for the identified smart devices within the transmission area; querying an electronic storage device to retrieve order history data associated with the unique identifier; generating the individualized offer, wherein said individualized offer comprises at least one menu item listed in the retrieved order history data; displaying the individualized offer at a first one of a plurality of separately controllable areas of the electronic menu board and the identified smart device, wherein the electronic menu board comprises a monolithic display; displaying menu items at other ones of said plurality of separately controllable areas; receiving an acceptance of the individualized offer from the smart device by way of the wireless transmitter/receiver; and transmitting the acceptance to a second wireless transmitter/receiver to fulfil the accepted individualized offer.
  • 12. The method of claim 11 further comprising the steps of: recording, by way of an elapsed time recorder, the elapsed time between receiving the order selection and filling the order; and electronically transmitting the elapsed time from the elapsed time recorder to the electronic storage device for recordation.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of U.S. Provisional Application No. 62/162,420, filed May 15, 2015, which is herein incorporated by reference in its entirety.

Related Publications (1)
Number Date Country
20160335705 A1 Nov 2016 US
Provisional Applications (1)
Number Date Country
62162420 May 2015 US