The present application is related to a light sensor, and more specifically to methods and systems that incorporate the light sensor beneath a dual-mode display.
Present day cameras and electronic displays, when integrated into the same device, occupy separate regions of the device. A region of the device associated with the camera does not function as a display, while a region of the device functioning as the electronic display does not function as a camera.
Technology
The technology disclosed here integrates a light sensor with a dual-mode display, thus increasing the size of the dual-mode display. The light sensor can be a camera, an ambient sensor, a proximity sensor, etc. The light sensor is placed beneath the dual-mode display and can detect incoming light while the dual-mode display is displaying a display image. The dual-mode display and the light sensor can operate at the same time. For example, a camera placed beneath the dual-mode display can record an image of the environment, while at the same time the dual-mode display is showing the display image. Further, multiple light sensors can be placed at various locations beneath the dual-mode display.
The selectively addressable region 125 can be the same size as the light sensor 120, can be smaller than the light sensor 120, or can be larger than the light sensor 120. The selectively addressable region 125 is placed above the light sensor 120. The selectively addressable region 125 can operate in a transparent mode when the dual-mode display is opaque, and can operate in an opaque mode when the dual-mode display is transparent. When the selectively addressable region 125 is in the transparent mode, the selectively addressable region 125 allows light to enter and exit through the selectively addressable region 125. For example, when the light sensor 120 is a proximity sensor, the selectively addressable region 125 turns transparent and allows a beam of light, e.g., an infrared beam of light, from the proximity sensor to exit through the selectively addressable region 125 and to enter back through the selectively addressable region 125 to the proximity sensor.
The dual-mode display 110 becomes opaque when displaying opaque portions of a display image. The dual-mode display 110 becomes substantially transparent when not displaying the display image. When the dual-mode display 110 is displaying transparent portions of the display image, the dual-mode display 110 can be substantially transparent, can be opaque, or can assume a degree of transparency corresponding to the degree of transparency of the display image. The dual-mode display 110 can be optically transparent to visible light, infrared light, etc.
The light sensor 120 is placed beneath a selectively addressable region 125 of the dual-mode display 110. The light sensor 120 can be a camera, an ambient sensor, a proximity sensor, etc. The light sensor 120 and the optional additional light sensors 130, 140 can activate and deactivate. When the light sensor 120 and the optional additional light sensors 130, 140 activate, they can detect a property of incoming light, such as frequency, intensity, and/or time of flight of the incoming light. For example, when the light sensor 120 is a camera and the camera is active, the camera records an image, i.e., the camera detects the frequency and intensity of incoming light. When the light sensor 120 is an ambient sensor and the ambient sensor is active, the ambient sensor measures the intensity of ambient light. When the light sensor 120 is a proximity sensor, the proximity sensor emits a light beam, such as an infrared light beam, and measures the time of flight of the emitted light beam. From the time of flight of the emitted light beam, the proximity sensor determines the distance to an object.
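The time-of-flight computation reduces to halving the round-trip distance traveled at the speed of light. A minimal sketch of that conversion follows; the function name and the 2 ns example are illustrative assumptions, not taken from the specification.

```python
# Convert a proximity sensor's measured round-trip time of flight into
# a distance. The emitted beam travels to the object and back, so the
# one-way distance is half the round trip.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_time_of_flight(round_trip_seconds: float) -> float:
    """Distance from the sensor to the object, in meters."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# Example: a 2 ns round trip corresponds to roughly 0.3 m.
print(distance_from_time_of_flight(2e-9))  # ~0.29979
```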
As seen in
During operation, a voltage is applied across the OLED layer 210 such that the anode layer 220 is positive with respect to the cathode layer 200. A current of electrons flows through the device from the cathode layer 200 to the anode layer 220. The current of electrons flowing through the OLED layer 210 excites the organic materials in the OLED layer to emit radiation at frequencies in the visible light spectrum.
The optional TFT layer 230 can be used in active matrix organic light emitting diode (AMOLED) displays. The optional TFT layer 230 is not required in passive matrix organic light emitting diode (PMOLED) displays. The activation of each individual OLED, whether in AMOLED or PMOLED displays, can be done in various ways, such as using row and column addressing, or individually addressing each OLED, as described herein.
The substrate 240 can be substantially transparent, reflective, opaque, etc. When the substrate 240 is substantially transparent, an opaque layer 270, such as an opaque graphite layer, can be placed beneath the substrate 240 and the light sensor 250. When the substrate 240 is not substantially transparent, the substrate 240 is modified so that a region of the substrate 260 placed above the light sensor 250 is substantially transparent. Further, the region of the substrate 260 can be part of the light sensor 250. For example, the region of the substrate 260 can be a lens focusing light onto the light sensor 250 underneath, such as a camera, an ambient sensor, a proximity sensor, etc.
Region 330 corresponds to the selectively addressable region situated above a light sensor. Suppose the electrodes 340, 350, 360, 370 are straight lines (not pictured) that address the region 330 in addition to the whole dual-mode display. In that case, turning off the electrodes 340, 350, 360, 370 not only turns off the region 330 corresponding to the light sensor, but also turns off whole vertical and horizontal strips of the display corresponding to the electrodes 340, 350, 360, 370. To turn off the light emitting elements only in the region 330, the display controller needs to be able to address the light emitting elements in the region 330 separately from the light emitting elements associated with the rest of the dual-mode display.
In one embodiment, to create separate addressing for the region 330, two sets of separately addressable electrodes are created. A first set of electrodes is responsible for activating a majority of the dual-mode display, while a second set of electrodes is responsible for activating the region 330 corresponding to the selectively addressable region placed above the light sensor. The first set of electrodes includes horizontal and vertical electrodes 340, 350, 360, 370 and is responsible for activating the dual-mode display except for the selectively addressable region. The horizontal and vertical electrodes 340, 350, 360, 370 corresponding to the region 330 are routed around the region 330, as shown in
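A display controller built around the two-electrode-set scheme could expose the region independently of the rest of the panel. The following sketch is a hypothetical illustration of that control flow; the class and method names are assumptions, not the specification's interface.

```python
# Hypothetical sketch of the two-electrode-set scheme: one set drives the
# dual-mode display except the selectively addressable region, and a
# second set drives only the region above the light sensor.
class ElectrodeSet:
    def __init__(self, name: str):
        self.name = name
        self.active = False

    def activate(self) -> None:
        self.active = True

    def deactivate(self) -> None:
        self.active = False

class DualModeDisplayController:
    def __init__(self):
        self.main_set = ElectrodeSet("display-except-region-330")
        self.region_set = ElectrodeSet("region-330")

    def show_display_image(self) -> None:
        self.main_set.activate()
        self.region_set.activate()

    def expose_light_sensor(self) -> None:
        # Only the region above the light sensor goes dark; the first
        # electrode set keeps the rest of the display image visible.
        self.region_set.deactivate()

controller = DualModeDisplayController()
controller.show_display_image()
controller.expose_light_sensor()
print(controller.main_set.active, controller.region_set.active)  # True False
```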
In another embodiment, each light emitting element can be individually addressable. For example, if there are m×n light emitting elements, there are m×n unique addresses corresponding to the light emitting elements. In such a case, to uniquely address the region 330 corresponding to the selectively addressable region placed above the light sensor, a memory associated with the display controller stores the unique addresses of the light emitting elements associated with the region 330. The processor queries the memory for the unique addresses corresponding to the light emitting elements associated with the region 330. If there are multiple regions 330 corresponding to multiple light sensors, the memory stores the ID of each light sensor and the unique addresses of the light emitting elements corresponding to that light sensor. The processor queries the memory for the unique addresses of the light emitting elements associated with the ID of the light sensor that the processor is about to activate.
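In code, that per-sensor lookup can be as simple as a table keyed by sensor ID. The sketch below assumes hypothetical sensor IDs and (row, column) addresses purely for illustration.

```python
# Hypothetical memory contents: each light sensor ID maps to the unique
# (row, column) addresses of the light emitting elements in its
# selectively addressable region.
REGION_MEMORY: dict[str, list[tuple[int, int]]] = {
    "camera_1": [(10, 12), (10, 13), (11, 12), (11, 13)],
    "proximity_1": [(40, 2), (40, 3)],
}

def elements_for_sensor(sensor_id: str) -> list[tuple[int, int]]:
    """Unique addresses the display controller must deactivate before
    the processor activates the given light sensor."""
    return REGION_MEMORY[sensor_id]

print(elements_for_sensor("camera_1"))
```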
When the light sensor is a camera, the light sensor can record an image of the user's head. When the user's head is recorded by the plurality of light sensors 410, the processor, using triangulation, can determine a position of the user's head, based on a plurality of images recorded by the plurality of light sensors 410. The position of the user's head can include a three-dimensional position of the user's head, rotation of the user's head, the user's visual focus point 460, and/or the user's gaze direction 480.
Similarly, when the light sensor is a proximity sensor, the light sensor can measure a distance from the light sensor to the user's head. When the distance to the user's head is measured by the plurality of light sensors 410, using triangulation, the processor can determine the position of the user's head, based on a plurality of measurements recorded by the plurality of light sensors 410. The position of the user's head can include a three-dimensional position of the user's head, rotation of the user's head, the user's visual focus point 460, and/or the user's gaze direction 480.
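As a concrete, simplified instance of the triangulation step, two sensors at known positions in the display plane that each report a distance to the head constrain the head to the intersection of two spheres; reduced to two dimensions, the intersection is direct to compute. This is an illustrative reduction, not the specification's algorithm.

```python
# 2-D triangulation sketch: two proximity sensors on the display's
# x-axis measure distances d1 and d2 to the user's head; intersecting
# the two circles recovers the head position.
import math

def triangulate(x1: float, d1: float, x2: float, d2: float):
    """Head position (x, y) assuming both sensors sit at y = 0."""
    # Subtracting (x - x1)^2 + y^2 = d1^2 from (x - x2)^2 + y^2 = d2^2
    # eliminates y and solves for x directly.
    x = (d1**2 - d2**2 + x2**2 - x1**2) / (2 * (x2 - x1))
    y = math.sqrt(max(d1**2 - (x - x1)**2, 0.0))
    return x, y

# Example: sensors 10 cm apart, measuring 30 cm and 32 cm to the head.
print(triangulate(0.0, 0.30, 0.10, 0.32))
```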
For example, the processor determines the user's visual focus point 460 on the dual-mode display. Based on the user's visual focus point 460, the processor activates the camera 470 closest to the user. The camera 470 records an image of the user. The image so recorded captures the user's gaze. Recording of the user's gaze, by the camera 470 closest to the user, can be useful in teleconferencing applications to create an appearance that the user is always looking at the conversationalist on the other end of the teleconferencing meeting.
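Selecting the camera closest to the visual focus point is a nearest-neighbor choice over the cameras' positions on the display. A minimal sketch, with camera names and normalized display coordinates assumed for illustration:

```python
# Pick the camera whose position on the display is nearest to the
# user's visual focus point. Coordinates are normalized to [0, 1].
import math

CAMERAS = {
    "top_left": (0.1, 0.9),
    "top_right": (0.9, 0.9),
    "center": (0.5, 0.5),
}

def closest_camera(focus_point: tuple[float, float]) -> str:
    return min(CAMERAS, key=lambda name: math.dist(CAMERAS[name], focus_point))

print(closest_camera((0.45, 0.55)))  # -> "center"
```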
For example, in
Similarly, in
Configuring the dual-mode display includes providing a substantially transparent cathode layer, providing a substantially transparent light emitting layer comprising organic light emitting diodes, providing a substantially transparent anode layer, and providing a substrate. The substrate can be substantially transparent, opaque, reflective, etc. When the substrate is opaque and/or reflective, a region of the substrate is made transparent such that a light sensor placed beneath the substrate can record light coming through the transparent region of the substrate. The region of the substrate above the light sensor can be a lens focusing incoming light to the light sensor. Additionally or alternatively, the region of the substrate can be part of the light sensor.
In addition to the above-mentioned layers, an optional thin film transistor (TFT) layer can be provided to cause the substantially transparent light emitting layer to emit light. When the substrate is substantially transparent, an opaque graphite layer can be placed beneath the substrate and the light sensor.
In step 610, a light sensor is placed beneath a selectively addressable region. The light sensor activates and deactivates. When the light sensor activates, the light sensor detects a property of incoming light, such as frequency, intensity, or time of flight. The light sensor can be a camera, an ambient sensor, a proximity sensor, etc.
Placing the light sensor includes establishing a correspondence between the light sensor and the selectively addressable region, as described herein. For example, establishing the correspondence can include: configuring a first set of electrodes to activate and to deactivate the dual-mode display except the selectively addressable region; and configuring a second set of electrodes to activate and to deactivate the selectively addressable region.
In another example, establishing the correspondence can include: storing in a memory an identification associated with a plurality of light emitting elements corresponding to the selectively addressable region; and sending the identification associated with the plurality of light emitting elements to a display controller. When there is a plurality of light sensors corresponding to a plurality of selectively addressable regions, the memory stores an identification (ID) of each light sensor and an ID of the selectively addressable region associated with that light sensor. The processor can send the ID of the selectively addressable region to the memory and request the identification of the plurality of light emitting elements associated with the sent selectively addressable region ID.
In step 620, the selectively addressable region is configured to not display the display image when the light sensor activates, such that the inactive selectively addressable region tends to be unnoticeable to a user. In one embodiment, the size of the light sensor is sufficiently small, such as less than 3 mm, so that the missing selectively addressable region tends to be unnoticeable to a user. Additionally or alternatively, the display controller can deactivate the selectively addressable region for a sufficiently short amount of time, so that the user does not have time to perceive the missing selectively addressable region. For example, the sufficiently short amount of time can be less than 1/30 of a second.
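The timing constraint in step 620 can be made explicit in software: blank the region, read the sensor, and restore the region, keeping the blanking interval under the roughly 1/30-second perceptibility threshold given above. The controller callbacks in this sketch are stand-ins, not a real driver API.

```python
# Blank the selectively addressable region only as long as the sensor
# read takes, keeping the interval below ~1/30 s so the user does not
# perceive the missing region.
import time

MAX_BLANK_SECONDS = 1.0 / 30.0

def read_sensor_with_blanking(deactivate_region, activate_region, read_sensor):
    deactivate_region()
    start = time.monotonic()
    sample = read_sensor()
    elapsed = time.monotonic() - start
    activate_region()
    if elapsed >= MAX_BLANK_SECONDS:
        raise RuntimeError("blanking interval may have been perceptible")
    return sample

# Example with trivial stand-ins for the controller and sensor:
print(read_sensor_with_blanking(lambda: None, lambda: None, lambda: 42))
```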
In addition to the steps described above, a plurality of cameras can be placed beneath a plurality of portions of the dual-mode display. The plurality of cameras can record a plurality of images. A processor can be configured to track the user's head, based on the plurality of images recorded by the plurality of cameras. Based on the head tracking, the processor can be configured to display a different display image on the dual-mode display. The plurality of cameras can be placed anywhere on the dual-mode display. For example, the plurality of cameras can be evenly distributed, placed in corners of the display, placed in the center of the display, etc. Instead of the plurality of cameras, or in addition to the plurality of cameras, one or more proximity sensors can be used to track the user's head.
In addition to head tracking, the processor can be configured to track the user's eyes, based on a plurality of images recorded by the plurality of cameras. Based on the eye tracking, a camera in the plurality of cameras can be activated, where the camera is closest to the user's focus point on the substantially transparent display layer. Activating the camera includes recording, by the camera, an image of the user. The image of the user captures the user's gaze, and can be used in teleconferencing applications to create an appearance that the user is always facing the conversationalist on the other end of a video call.
In addition to the steps described above, the plurality of cameras disposed beneath the dual-mode display can record a plurality of images. Based on the plurality of images recorded by the plurality of cameras, the processor determines a position of the user's head. Responsive to the position of the user's head, the processor displays a different display image on the dual-mode display.
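One simple way to "display a different display image" responsive to head position is a parallax-style shift of the rendered image opposite the head's displacement. The gain and pixel-density constants in this sketch are assumptions chosen for illustration, not values from the specification.

```python
# Map the tracked head displacement (meters from the display's center
# axis) to a pixel offset applied to the rendered display image, so the
# image appears to shift with the user's viewpoint.
GAIN = 0.05               # assumed strength of the parallax effect
PIXELS_PER_METER = 4000   # assumed display density scaling

def image_offset_for_head(head_x: float, head_y: float) -> tuple[float, float]:
    return (-head_x * GAIN * PIXELS_PER_METER,
            -head_y * GAIN * PIXELS_PER_METER)

# Head 2 cm right and 1 cm below center -> shift image left and up.
print(image_offset_for_head(0.02, -0.01))  # (-4.0, 2.0)
```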
In addition to determining the position of the user's head, the processor can determine the user's visual focus point on the dual-mode display, based on the plurality of images recorded by the plurality of cameras. Based on the user's visual focus point, the processor can activate a camera in the plurality of cameras closest to the user's visual focus point. Activating the camera includes recording, by the camera, an image of the user. The camera closest to the user's visual focus point is best positioned to capture the user's gaze, creating an appearance that the user is looking at the conversationalist on the other end of a video call.
The processor can also perform calibration of the camera placed beneath the dual-mode display. When the dual-mode display is not displaying the display image, the camera records a calibration image. When the dual-mode display is displaying the display image, the camera records an image. Based on the calibration image, the processor modifies the recorded image to obtain a final image. For example, the processor can subtract the calibration image from the recorded image to obtain the final image.
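A minimal sketch of that calibration, using NumPy and assuming 8-bit grayscale frames (the clipping and dtype choices are illustrative, not from the specification):

```python
# Subtract the calibration image, recorded while the display is blank,
# from the image recorded while the display is active, removing light
# the display itself contributes.
import numpy as np

def calibrate(recorded: np.ndarray, calibration: np.ndarray) -> np.ndarray:
    """Final image = recorded image minus calibration image, clipped to
    the valid 8-bit pixel range."""
    final = recorded.astype(np.int16) - calibration.astype(np.int16)
    return np.clip(final, 0, 255).astype(np.uint8)

# Example with tiny 2x2 grayscale frames:
recorded = np.array([[120, 130], [140, 150]], dtype=np.uint8)
calibration = np.array([[10, 10], [20, 20]], dtype=np.uint8)
print(calibrate(recorded, calibration))
```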
Computer
In the example of
This disclosure contemplates the computer system 800 taking any suitable physical form. As example and not by way of limitation, computer system 800 may be an embedded computer system, a system-on-chip (SOC), a single-board computer system (SBC) (such as, for example, a computer-on-module (COM) or system-on-module (SOM)), a desktop computer system, a laptop or notebook computer system, an interactive kiosk, a mainframe, a mesh of computer systems, a mobile telephone, a personal digital assistant (PDA), a server, or a combination of two or more of these. Where appropriate, computer system 800 may include one or more computer systems 800; be unitary or distributed; span multiple locations; span multiple machines; or reside in a cloud, which may include one or more cloud components in one or more networks. Where appropriate, one or more computer systems 800 may perform without substantial spatial or temporal limitation one or more steps of one or more methods described or illustrated herein. As an example and not by way of limitation, one or more computer systems 800 may perform in real time or in batch mode one or more steps of one or more methods described or illustrated herein. One or more computer systems 800 may perform at different times or at different locations one or more steps of one or more methods described or illustrated herein, where appropriate.
The processor may be, for example, a conventional microprocessor such as an Intel Pentium microprocessor or a Motorola PowerPC microprocessor. One of skill in the relevant art will recognize that the terms “machine-readable (storage) medium” or “computer-readable (storage) medium” include any type of device that is accessible by the processor.
The memory is coupled to the processor by, for example, a bus. The memory can include, by way of example but not limitation, random access memory (RAM), such as dynamic RAM (DRAM) and static RAM (SRAM). The memory can be local, remote, or distributed.
The bus also couples the processor to the non-volatile memory and drive unit. The non-volatile memory is often a magnetic floppy or hard disk, a magnetic-optical disk, an optical disk, a read-only memory (ROM), such as a CD-ROM, EPROM, or EEPROM, a magnetic or optical card, or another form of storage for large amounts of data. Some of this data is often written, by a direct memory access process, into memory during execution of software in the computer 800. The non-volatile storage can be local, remote, or distributed. The non-volatile memory is optional because systems can be created with all applicable data available in memory. A typical computer system will usually include at least a processor, memory, and a device (e.g., a bus) coupling the memory to the processor.
Software is typically stored in the non-volatile memory and/or the drive unit. Indeed, storing an entire large program in memory may not even be possible. Nevertheless, it should be understood that for software to run, if necessary, it is moved to a computer-readable location appropriate for processing, and, for illustrative purposes, that location is referred to as the memory in this paper. Even when software is moved to the memory for execution, the processor will typically make use of hardware registers to store values associated with the software, and a local cache that, ideally, serves to speed up execution. As used herein, a software program is assumed to be stored at any known or convenient location (from non-volatile storage to hardware registers) when the software program is referred to as “implemented in a computer-readable medium.” A processor is considered to be “configured to execute a program” when at least one value associated with the program is stored in a register readable by the processor.
The bus also couples the processor to the network interface device. The interface can include one or more of a modem or network interface. It will be appreciated that a modem or network interface can be considered to be part of the computer system 800. The interface can include an analog modem, an ISDN modem, a cable modem, a token ring interface, a satellite transmission interface (e.g., “direct PC”), or other interfaces for coupling a computer system to other computer systems. The interface can include one or more input and/or output devices. The I/O devices can include, by way of example but not limitation, a keyboard, a mouse or other pointing device, disk drives, printers, a scanner, and other input and/or output devices, including a display device. The display device can include, by way of example but not limitation, a cathode ray tube (CRT), a liquid crystal display (LCD), or some other applicable known or convenient display device. For simplicity, it is assumed that controllers of any devices not depicted in the example of
In operation, the computer system 800 can be controlled by operating system software that includes a file management system, such as a disk operating system. One example of operating system software with associated file management system software is the family of operating systems known as Windows® from Microsoft Corporation of Redmond, Wash., and their associated file management systems. Another example of operating system software with its associated file management system software is the Linux™ operating system and its associated file management system. The file management system is typically stored in the non-volatile memory and/or drive unit and causes the processor to execute the various acts required by the operating system to input and output data and to store data in the memory, including storing files on the non-volatile memory and/or drive unit.
Some portions of the detailed description may be presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of operations leading to a desired result. The operations are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as “processing” or “computing” or “calculating” or “determining” or “displaying” or “generating” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the methods of some embodiments. The required structure for a variety of these systems will appear from the description below. In addition, the techniques are not described with reference to any particular programming language, and various embodiments may thus be implemented using a variety of programming languages.
In alternative embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a client-server network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
The machine may be a server computer, a client computer, a personal computer (PC), a tablet PC, a laptop computer, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, an iPhone, a Blackberry, a processor, a telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
While the machine-readable medium or machine-readable storage medium is shown in an exemplary embodiment to be a single medium, the terms “machine-readable medium” and “machine-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The terms “machine-readable medium” and “machine-readable storage medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that causes the machine to perform any one or more of the methodologies or modules of the presently disclosed technique and innovation.
In general, the routines executed to implement the embodiments of the disclosure may be implemented as part of an operating system or a specific application, component, program, object, module or sequence of instructions referred to as “computer programs.” The computer programs typically comprise one or more instructions set at various times in various memory and storage devices in a computer that, when read and executed by one or more processing units or processors in a computer, cause the computer to perform operations to execute elements involving the various aspects of the disclosure.
Moreover, while embodiments have been described in the context of fully functioning computers and computer systems, those skilled in the art will appreciate that the various embodiments are capable of being distributed as a program product in a variety of forms, and that the disclosure applies equally regardless of the particular type of machine or computer-readable media used to actually effect the distribution.
Further examples of machine-readable storage media, machine-readable media, or computer-readable (storage) media include but are not limited to recordable type media such as volatile and non-volatile memory devices, floppy and other removable disks, hard disk drives, optical disks (e.g., Compact Disk Read-Only Memory (CD-ROMs), Digital Versatile Disks (DVDs), etc.), among others, and transmission type media such as digital and analog communication links.
In some circumstances, operation of a memory device, such as a change in state from a binary one to a binary zero or vice-versa, for example, may comprise a transformation, such as a physical transformation. With particular types of memory devices, such a physical transformation may comprise a physical transformation of an article to a different state or thing. For example, but without limitation, for some types of memory devices, a change in state may involve an accumulation and storage of charge or a release of stored charge. Likewise, in other memory devices, a change of state may comprise a physical change or transformation in magnetic orientation or a physical change or transformation in molecular structure, such as from crystalline to amorphous or vice versa. The foregoing is not intended to be an exhaustive list of the ways in which a change in state from a binary one to a binary zero or vice-versa in a memory device may comprise a transformation, such as a physical transformation. Rather, the foregoing is intended as illustrative examples.
A storage medium typically may be non-transitory or comprise a non-transitory device. In this context, a non-transitory storage medium may include a device that is tangible, meaning that the device has a concrete physical form, although the device may change its physical state. Thus, for example, non-transitory refers to a device remaining tangible despite this change in state.
Remarks
The language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this Detailed Description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of various embodiments is intended to be illustrative, but not limiting, of the scope of the embodiments, which is set forth in the following claims.
This application is a continuation of U.S. patent application Ser. No. 15/336,380, filed on Oct. 27, 2016, which claims priority to U.S. Provisional Patent Application No. 62/249,130, filed on Oct. 30, 2015, U.S. Provisional Application No. 62/300,631, filed on Feb. 26, 2016, U.S. Provisional Application No. 62/319,099, filed on Apr. 6, 2016, and U.S. Provisional Application No. 62/373,910, filed on Aug. 11, 2016, all of which are incorporated by reference herein in their entirety.