The present disclosure relates generally to the field of human-machine interfaces in vehicles, and more particularly, to touch-sensitive, multi-function interfaces that utilize haptic feedback responses to reduce driver distraction.
Conventional control systems in vehicles typically present operators with a combination of mechanical, single-function controls such as switches, buttons, levers, knobs, dials, etc. The operator interacts with these control systems by manipulating the controls to execute various control functions. As the number of controllable features increases, switch panels can easily become cluttered with numerous switches, buttons, levers, knobs, and dials.
In an effort to reduce the amount of clutter in control panels, while keeping up with consumer demand for greater switching functionality, some control systems have implemented tactile feedback responses to notify a user that a switch is activated. Yet these tactile feedback responses merely simulate the depression of a binary mechanical switch, and this simulation does not inform the user which multifunction switch has been activated. Haptic feedback responses currently lack adequate familiarity for users in a multifunction switching environment.
According to one aspect, the present disclosure may be directed to a method for setting threshold values based on an amount and selecting a haptic feedback response based, at least in part, on the amount. For example, a method may include determining a touch value based on a touch applied to a touch-sensitive sensor. The method may additionally comprise determining an amount that the touch value exceeds a first threshold value. The method may additionally comprise setting a second threshold value based, at least in part, on the amount. The method may additionally comprise selecting a haptic feedback response based, at least in part, on the amount. The method may additionally comprise generating an output signal that causes a haptic actuator to provide the haptic feedback response.
According to one aspect, the present disclosure may be directed to a method for setting threshold values based on an elapsed time amount and selecting a haptic feedback response based, at least in part, on the elapsed time amount. For example, the method may comprise determining a touch value based on a touch applied to a touch-sensitive sensor. The method may additionally comprise determining an elapsed time amount that the touch value exceeds a first threshold value. The method may additionally comprise setting a second threshold value based, at least in part, on the elapsed time amount. The method may additionally comprise selecting a haptic feedback response based, at least in part, on the elapsed time amount. The method may additionally include generating an output signal that causes a haptic actuator to provide the haptic feedback response.
In accordance with another aspect, the present disclosure may be directed to an electronic device for setting threshold values based on an amount and selecting a haptic feedback response that is based on the amount. The electronic device may include a touch-sensitive sensor. The electronic device may additionally include memory. The electronic device may include a processing unit that may be in communication with the memory and the touch-sensitive sensor. The processing unit of the electronic device may be configured to determine a touch value based on a touch applied to the touch-sensitive sensor. Additionally, the processing unit of the electronic device may be configured to determine an amount that the touch value exceeds a first threshold value. The processing unit may be further configured to set a second threshold value based, at least in part, on the amount. The processing unit of the electronic device may be further configured to select a haptic feedback response based, at least in part, on the amount, and may be further configured to generate an output signal indicative of the haptic feedback response.
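For illustration only, the sequence of steps recited above might be sketched as follows. Python is used purely as an assumed notation; every function name, constant, and scaling choice here is hypothetical and not part of the disclosure.

```python
# Hypothetical sketch of the disclosed method. The first threshold,
# the rule for deriving the second threshold from the amount, and the
# response names are illustrative assumptions only.

FIRST_THRESHOLD = 5.0  # assumed force units (e.g., newtons)

def process_touch(touch_value: float, first_threshold: float = FIRST_THRESHOLD):
    """Return (second_threshold, haptic_response), or None if the touch
    value never exceeds the first threshold value."""
    if touch_value <= first_threshold:
        return None  # touch did not register as a "push"

    # Amount by which the touch value exceeds the first threshold value.
    amount = touch_value - first_threshold

    # Set the second threshold value based, at least in part, on the amount:
    # here, halfway between the first threshold and the touch value.
    second_threshold = first_threshold + 0.5 * amount

    # Select a haptic feedback response based, at least in part, on the amount.
    haptic_response = "strong_pulse" if amount > 2.0 else "soft_pulse"
    return second_threshold, haptic_response
```

In this sketch, `process_touch(9.0)` would both raise the second threshold and select the stronger response, while a touch value at or below the first threshold produces no output signal at all.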
Like reference symbols in the various drawings indicate like elements.
Implementations of the present disclosure now will be described more fully hereinafter. Indeed, these implementations can be embodied in many different forms and should not be construed as limited to the implementations set forth herein; rather, these implementations are provided so that this disclosure will satisfy applicable legal requirements. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art. Methods and systems similar or equivalent to those described herein can be used in the practice or testing of the present disclosure. As used in the specification and in the appended claims, the singular forms “a”, “an”, and “the” include plural referents unless the context clearly dictates otherwise. The term “comprising” and variations thereof as used herein are used synonymously with the term “including” and variations thereof, and both are open, non-limiting terms.
Coupling a touch-sensitive sensor 206 to the steering grip 102 of a steering apparatus 100 provides a driver with a human-machine interface that can be configured to detect a touch provided by a user, determine if a switch function should be activated, and then provide the user with a haptic feedback response.
A touch-sensitive sensor 206 can be any sensor configured to change at least one electrical property in response to a touch applied to the sensor 206. A touch, also known as a touch event, can be, for example, a physical contact that occurs when a driver in a vehicle uses their hand (gloved or ungloved) to apply a force to the touch-sensitive sensor 206. A touch-sensitive sensor 206 can be any suitable tactile sensor, including a mechanical sensor, a resistive sensor, a capacitive sensor, a magnetic sensor, an optical fiber sensor, a piezoelectric sensor, a silicon sensor, and/or a temperature sensor. The touch-sensitive sensor 206 can include an array of touch-sensing units, wherein each touch-sensing unit includes conductors, electrodes, and a touch-sensitive surface. In accordance with the present disclosure, the touch-sensitive surface can embody any touch-sensitive deformable member that can be induced to vibrate by a touch-sensitive system described in detail below.
An actuator 218 can include or embody any suitable device that can provide a user with a haptic feedback response. Suitable actuators 218 can include an electric motor, a pneumatic actuator, a hydraulic piston, a relay, a comb drive, a piezoelectric actuator, a thermal bimorph, a digital micromirror, an electroactive polymer and a speaker actuator. A speaker actuator can include for example, a conical surface operatively coupled to a deformable member, where the conical surface is configured to induce the deformable member to vibrate because of a sound wave transmitted by the conical surface. A deformable member can, for example, embody a touch-sensitive surface of a touch-sensitive interface 206. Additionally, a deformable member can be any suitable surface or plate that can deform to provide a user with a haptic feedback response.
Haptic feedback responses can include any of a number of stimuli that can be perceived through touch or other non-visual sensory means, such as, for example, mechanical vibrations, changes in surface features (e.g., temperature) or textures, changes in surface tension, electric pulses, or any other types of stimuli that can be perceived through non-visual, touch-based senses. Some examples of haptic feedback responses in accordance with the present disclosure are discussed in more detail in the discussion of
As stated above, a touch-sensitive system 200 can include a variety of computer readable media such as system memory (volatile and non-volatile) 204, removable storage 208, and/or non-removable storage 210. Further examples of computer readable media can include RAM, ROM, electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computing device.
A touch-sensitive system 200 can include a processing unit 202 that can include or embody any suitable device for processing, moving, and/or manipulating data associated with touch-sensitive system 200. The processing unit 202 can be, for example, a standard programmable processor communicatively coupled to memory 204, a touch-sensitive sensor 206, and an actuator 218. A processing unit 202 can be configured to perform arithmetic and logic operations necessary for the operation of the touch-sensitive system 200 as described in detail below.
A processing unit 202 can be configured to determine touch values, wherein the touch values can be used by the processing unit 202 to perform arithmetic and logical functions necessary for the operation of touch-sensitive system 200. Touch values can be based on the touch applied to a touch-sensitive sensor 206. For example, a touch value can embody a value corresponding to a characteristic or attribute of a touch. A characteristic of a touch can be, for example, a location, a force, and/or a time associated with a touch applied to a touch-sensitive sensor 206.
The location of a touch can be, for example, an area where a touch makes contact with touch-sensitive sensor 206 or a single point where a touch makes contact with the sensor 206. The location of an applied touch can include x and y components. For instance, the location can be a position in either one dimension (e.g., the X- or Y-direction) or two dimensions (e.g., the X- and Y-directions). A location can be determined, for example, by measuring the voltage at the electrodes of the touch-sensitive sensor 206 as a correlation exists between the change in voltage and the resistance of the electrodes.
The magnitude of force of a touch can be, for example, the average magnitude across the area where the touch makes contact with a touch-sensitive sensor 206 or the magnitude at a single point where the touch makes contact with the sensor 206. The magnitude of the force can be determined, for example, by measuring the change in resistance of the touch-sensitive sensor 206 as a correlation exists between the magnitude of a force applied and the resistance of the touch-sensitive sensor 206.
The time associated with the touch applied to a touch-sensitive sensor 206 can be, for example, a timestamp corresponding to the start of a touch, a timestamp corresponding to the end of the touch, or the average of the timestamps corresponding to the start and the end of the touch. The time associated with the touch applied to a touch-sensitive sensor 206 can be determined, for example, by using a system clock.
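The touch characteristics described above (location, force, and time) can be gathered into a single record. The following is a minimal sketch; the field names and sample values are assumptions for illustration, not the disclosure's data format.

```python
from dataclasses import dataclass

# Illustrative container for the touch characteristics named above.
# All field names are hypothetical.

@dataclass
class TouchValue:
    x: float          # location, X-direction
    y: float          # location, Y-direction
    force: float      # magnitude of the applied force
    timestamp: float  # system-clock time associated with the touch

# Touch values captured at the start and end of a single touch event.
start = TouchValue(x=10.0, y=4.0, force=3.2, timestamp=0.00)
end = TouchValue(x=13.0, y=8.0, force=2.9, timestamp=0.25)

# Average of the timestamps at the start and end of the touch.
mid_time = (start.timestamp + end.timestamp) / 2
```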
A processing unit 202 can be configured to determine amounts, wherein an amount can be the outcome of a calculation that uses two or more touch values. Amounts can be used in arithmetic and logic operations necessary for the operation of the touch-sensitive system 200. An amount can be, for example, an elapsed time amount, a traversed length amount, an absolute distance amount, or any other suitable amount.
An elapsed time amount can be based on, for example, two discrete time touch values. For instance, an elapsed time amount can be the difference between the timestamp corresponding to the start of a touch and the timestamp corresponding to the end of a touch. In another example, an elapsed time amount can be the difference between any two timestamps corresponding to a touch.
A traversed length amount can be based on, for example, two discrete location touch values. For instance, a traversed length amount can be the calculated length of the path taken from one discrete location to another discrete location. Similarly, an absolute distance amount can be based on the same two discrete location touch values and can be the calculated linear distance between the two locations.
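The three amounts described above can be sketched as simple calculations. This is an illustrative example only; approximating the traversed length by summing distances between sampled points is an assumption, as the disclosure does not specify how the path is measured.

```python
import math

def elapsed_time_amount(t_start: float, t_end: float) -> float:
    """Difference between two discrete time touch values."""
    return t_end - t_start

def traversed_length_amount(path: list) -> float:
    """Length of the path taken between two discrete locations,
    approximated by summing distances between sampled (x, y) points."""
    return sum(math.dist(a, b) for a, b in zip(path, path[1:]))

def absolute_distance_amount(p0: tuple, p1: tuple) -> float:
    """Straight-line distance between the same two discrete locations."""
    return math.dist(p0, p1)

# A touch that changes direction: the traversed length (5 + 4 = 9)
# exceeds the absolute distance between the endpoints.
path = [(0.0, 0.0), (3.0, 4.0), (3.0, 8.0)]
```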
A processing unit 202 can be further configured to provide a control message for use in controlling various system features. For example, a control message can be used in an automotive environment to control a variety of automotive control functions. In the automotive environment, control messages can be used to control media systems (audio, visual, communication, etc.), driving systems (e.g., cruise control), climate control systems, and body systems (locks, windows, mirrors, etc.). In one example, a control message can be used specifically to increase or decrease the volume of a media system. A table of control messages may be stored, for example, in the system memory 204 and/or in the database 216, as shown in
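A stored table of control messages of the kind described above might be sketched as a simple lookup. The system names, gesture names, and message strings below are hypothetical placeholders, not contents of the disclosure's table.

```python
# Hypothetical control-message table of the sort that might be stored
# in system memory 204 or database 216. All keys and values are
# illustrative assumptions.

CONTROL_MESSAGES = {
    ("media", "swipe_up"): "VOLUME_UP",
    ("media", "swipe_down"): "VOLUME_DOWN",
    ("cruise_control", "tap"): "CRUISE_TOGGLE",
    ("windows", "press_hold"): "WINDOW_DOWN",
}

def control_message(system, gesture):
    """Look up the control message for a (system, gesture) pair,
    returning None when no entry exists."""
    return CONTROL_MESSAGES.get((system, gesture))
```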
Additionally, some of the arithmetic and logic operations performed by the processing unit 202 require the use of threshold values. A threshold value can include or embody any touch value or amount as defined above. A user can predetermine a threshold value, and the processing unit 202 can be further configured to adjust threshold values as detailed below in the description of
Referring to
In accordance with the present disclosure, a first threshold force value 301 can include or embody a predetermined force threshold that, when exceeded by the determined force touch value, is designed to emulate a “push” event that is analogous to a user's pressing of a mechanical button. A second threshold force value 302 can include or embody a predetermined force threshold that, when the determined force touch value falls below it, is designed to trigger emulation of a “release” event that is analogous to a user's release of a mechanical button. The first and second threshold force values 301, 302 (representing the “push” and “release” events, respectively) can be defined as different values. Such embodiments may be useful to ensure that a user's interaction with the switch is not prematurely terminated because the user was unable (e.g., because of fatigue) to maintain a high force value over a prolonged period of time.
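The fatigue-tolerant behavior described above is a hysteresis: the “push” threshold 301 is set higher than the “release” threshold 302. A minimal sketch follows; the threshold values and the force trace are illustrative assumptions.

```python
# Sketch of push/release hysteresis. A press is registered only when
# force exceeds the higher "push" threshold 301, and is held until
# force drops below the lower "release" threshold 302, so mild fatigue
# after pressing does not cause a premature release.

PUSH_THRESHOLD = 6.0     # first threshold force value 301 (assumed units)
RELEASE_THRESHOLD = 3.0  # second threshold force value 302

def update_switch_state(pressed: bool, force: float) -> bool:
    """Return the new pressed/released state for one force sample."""
    if not pressed:
        return force > PUSH_THRESHOLD   # emulate a "push" event
    return force >= RELEASE_THRESHOLD   # hold until force falls below 302

# Force rises past the push threshold, then slowly fatigues.
forces = [2.0, 7.0, 5.0, 4.0, 2.5]
states = []
pressed = False
for f in forces:
    pressed = update_switch_state(pressed, f)
    states.append(pressed)
# states: [False, True, True, True, False]
```

Note that the samples at 5.0 and 4.0 sit below the push threshold yet keep the switch pressed; with a single shared threshold they would have released it.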
After touch values are determined, the exemplary process can include determining an amount that the touch value exceeds a first threshold value (Step 1220). In one example, amounts can be determined using a processing unit 202. As described above, an amount can be, for example, an elapsed time amount, a traversed length amount, an absolute distance amount, or any other suitable amount.
Importantly, however, it is contemplated that although the following steps make reference to an elapsed time amount, any other suitable amount in accordance with the present disclosure can be substituted.
Referring back to flowchart 1200 as shown in
In accordance with the present disclosure, setting a second threshold value based at least in part on the amount (Step 1240) can further include determining that the amount exceeds a first threshold amount and then setting the second threshold value based, at least in part, on the determined amount. A first threshold amount can be a predetermined minimum that an amount must exceed before significantly adjusting the second threshold value. Similar to an amount, a first threshold amount can embody a first threshold time amount, a first threshold absolute distance amount, a first threshold traversed length amount, or any other suitable threshold amount. Requiring an amount to exceed a first threshold amount before the processing unit 202 adjusts the second threshold value ensures that the interaction by a user was intended to activate a switch.
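The gating described above can be sketched as follows. The constants and the proportional adjustment rule are hypothetical; the disclosure only requires that the adjustment occur after the amount exceeds the first threshold amount.

```python
# Sketch of Step 1240: the second threshold value is adjusted only when
# the amount exceeds a first threshold amount, filtering out brief or
# unintended touches. All constants are illustrative assumptions.

FIRST_THRESHOLD_AMOUNT = 0.10   # e.g., minimum elapsed time in seconds
DEFAULT_SECOND_THRESHOLD = 5.0  # assumed default "release" threshold

def set_second_threshold(amount: float,
                         first_threshold_amount: float = FIRST_THRESHOLD_AMOUNT) -> float:
    """Adjust the second threshold value only for a deliberate interaction."""
    if amount <= first_threshold_amount:
        return DEFAULT_SECOND_THRESHOLD  # too brief: leave threshold unchanged
    # Deliberate touch: lower the release threshold in proportion to the
    # amount, down to an assumed floor of 1.0.
    return max(1.0, DEFAULT_SECOND_THRESHOLD - 10.0 * amount)
```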
Referring back to flowchart 1200 as shown in
Similarly,
Referring back to flowchart 1200 as shown in
A second threshold amount 1108 can be the same as or different from the first threshold amount 1106, as each provides a threshold for a different processing unit 202 function. As stated above, the first threshold amount 1106 can be used by a processing unit 202 to decide how a second force threshold value 1102 can be established, while the second threshold amount 1108 can be used by a processing unit 202 to decide if an output signal indicative of a haptic feedback response 1110 should be generated.
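The two roles described above can be sketched side by side. The elapsed-time values chosen for the two threshold amounts are assumptions for illustration only.

```python
# Sketch of the two distinct gates described above: the first threshold
# amount 1106 gates adjustment of the second force threshold value 1102,
# while the second threshold amount 1108 gates generation of the output
# signal indicative of a haptic feedback response 1110.

FIRST_THRESHOLD_AMOUNT = 0.10   # seconds (assumed): gate for adjusting 1102
SECOND_THRESHOLD_AMOUNT = 0.50  # seconds (assumed): gate for generating 1110

def decide(elapsed: float):
    """Return (adjust_second_force_threshold, generate_haptic_output)
    for a given elapsed time amount."""
    return (elapsed > FIRST_THRESHOLD_AMOUNT,
            elapsed > SECOND_THRESHOLD_AMOUNT)
```

A brief touch triggers neither action; an intermediate one adjusts the force threshold without haptic output; a sustained one does both.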
It should be understood that the various techniques described herein can be implemented in connection with hardware or software or, where appropriate, with a combination of both. Additionally, aspects of the presently disclosed subject matter may be implemented in a computing environment in or across a plurality of processing units 202 or other computing devices. Thus, the processes and apparatus of the presently disclosed subject matter, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium where, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the presently disclosed subject matter.
Computer-executable instructions, such as program modules, being executed by a computer can be used. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Distributed computing environments can be used where tasks are performed by remote processing devices that are linked through a communications network or other data transmission medium. In a distributed computing environment, program modules and other data can be located in both local and remote computer storage media including memory storage devices.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
This application claims the benefit of U.S. Provisional Application No. 61/888,322, filed Oct. 8, 2013, and U.S. Provisional Application No. 61/891,231, filed Oct. 15, 2013, each of which is incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
4484026 | Thornburg | Nov 1984 | A |
4540979 | Gerger et al. | Sep 1985 | A |
4801771 | Mizuguchi et al. | Jan 1989 | A |
4929934 | Ueda et al. | May 1990 | A |
5398962 | Kropp | Mar 1995 | A |
5408873 | Schmidt et al. | Apr 1995 | A |
5423569 | Reighard et al. | Jun 1995 | A |
5453941 | Yoshikawa | Sep 1995 | A |
5463258 | Filion et al. | Oct 1995 | A |
5539259 | Filion et al. | Jul 1996 | A |
5793297 | Takeuchi et al. | Aug 1998 | A |
5855144 | Parada | Jan 1999 | A |
5871063 | Young | Feb 1999 | A |
5914658 | Arakawa | Jun 1999 | A |
5943044 | Martinelli | Aug 1999 | A |
5965952 | Podoloff et al. | Oct 1999 | A |
6067077 | Martin | May 2000 | A |
6333736 | Sandbach | Dec 2001 | B1 |
6378384 | Atkinson et al. | Apr 2002 | B1 |
6429846 | Rosenberg et al. | Aug 2002 | B2 |
6501463 | Dahley et al. | Dec 2002 | B1 |
6636197 | Goldberg et al. | Oct 2003 | B1 |
6809462 | Pelrine et al. | Oct 2004 | B2 |
6906700 | Armstrong | Jun 2005 | B1 |
6933920 | Lacroix et al. | Aug 2005 | B2 |
7126583 | Breed | Oct 2006 | B1 |
7136051 | Hein et al. | Nov 2006 | B2 |
7258026 | Papakostas et al. | Aug 2007 | B2 |
7649278 | Yoshida et al. | Jan 2010 | B2 |
8203454 | Knight et al. | Jun 2012 | B2 |
8214105 | Daly et al. | Jul 2012 | B2 |
8222799 | Polyakov et al. | Jul 2012 | B2 |
8237324 | Pei et al. | Aug 2012 | B2 |
8269731 | Molne | Sep 2012 | B2 |
8633916 | Bernstein | Jan 2014 | B2 |
8698764 | Karakotsios et al. | Apr 2014 | B1 |
9244562 | Rosenberg | Jan 2016 | B1 |
9337832 | Buttolo | May 2016 | B2 |
9690380 | Monkhouse et al. | Jun 2017 | B2 |
9864507 | Cheng | Jan 2018 | B2 |
20020054060 | Schena | May 2002 | A1 |
20030043014 | Nakazawa et al. | Mar 2003 | A1 |
20030076968 | Rast | Apr 2003 | A1 |
20030083131 | Armstrong | May 2003 | A1 |
20030206162 | Roberts | Nov 2003 | A1 |
20040021643 | Hoshino | Feb 2004 | A1 |
20040195031 | Nagasaka | Oct 2004 | A1 |
20050021190 | Worrell | Jan 2005 | A1 |
20050052426 | Hagermoser et al. | Mar 2005 | A1 |
20050063757 | Sugimura | Mar 2005 | A1 |
20050067889 | Chernoff | Mar 2005 | A1 |
20050110769 | DaCosta et al. | May 2005 | A1 |
20050156892 | Grant | Jul 2005 | A1 |
20050273218 | Breed et al. | Dec 2005 | A1 |
20060025897 | Shostak et al. | Feb 2006 | A1 |
20060054479 | Iisaka | Mar 2006 | A1 |
20060076855 | Eriksen et al. | Apr 2006 | A1 |
20060109256 | Grant | May 2006 | A1 |
20060113880 | Pei et al. | Jun 2006 | A1 |
20060177212 | Lamborghini et al. | Aug 2006 | A1 |
20060248478 | Liau | Nov 2006 | A1 |
20060262103 | Hu | Nov 2006 | A1 |
20060284839 | Breed | Dec 2006 | A1 |
20070062753 | Yoshida et al. | Mar 2007 | A1 |
20070097073 | Takashima | May 2007 | A1 |
20070100523 | Trachte | May 2007 | A1 |
20070129046 | Soh | Jun 2007 | A1 |
20070287494 | You | Dec 2007 | A1 |
20080012837 | Marriott et al. | Jan 2008 | A1 |
20080062145 | Shahoian | Mar 2008 | A1 |
20080079604 | Madonna et al. | Apr 2008 | A1 |
20080150911 | Harrison | Jun 2008 | A1 |
20080202912 | Boddie et al. | Aug 2008 | A1 |
20080264183 | Graham et al. | Oct 2008 | A1 |
20080289887 | Flint et al. | Nov 2008 | A1 |
20090001855 | Lipton | Jan 2009 | A1 |
20090020343 | Rothkopf | Jan 2009 | A1 |
20090125811 | Bethurum | May 2009 | A1 |
20090140994 | Tanaka et al. | Jun 2009 | A1 |
20090140996 | Takashima et al. | Jun 2009 | A1 |
20090151447 | Jin et al. | Jun 2009 | A1 |
20090153340 | Pinder | Jun 2009 | A1 |
20090160529 | Lamborghini | Jun 2009 | A1 |
20090189749 | Salada | Jul 2009 | A1 |
20090228791 | Kim et al. | Sep 2009 | A1 |
20090237374 | Li | Sep 2009 | A1 |
20090241378 | Ellis | Oct 2009 | A1 |
20100001974 | Su et al. | Jan 2010 | A1 |
20100045612 | Molne | Feb 2010 | A1 |
20100053087 | Dai | Mar 2010 | A1 |
20100066512 | Rank | Mar 2010 | A1 |
20100141606 | Bae | Jun 2010 | A1 |
20100168998 | Matsunaga | Jul 2010 | A1 |
20100200375 | Han et al. | Aug 2010 | A1 |
20100226075 | Jahge | Sep 2010 | A1 |
20100236911 | Wild et al. | Sep 2010 | A1 |
20100250066 | Eckstein et al. | Sep 2010 | A1 |
20100250071 | Pala et al. | Sep 2010 | A1 |
20100268426 | Pathak | Oct 2010 | A1 |
20100302177 | Kim et al. | Dec 2010 | A1 |
20100315267 | Chung et al. | Dec 2010 | A1 |
20100321335 | Seong-Taek et al. | Dec 2010 | A1 |
20100328112 | Liu | Dec 2010 | A1 |
20110037721 | Cranfill et al. | Feb 2011 | A1 |
20110046788 | Daly et al. | Feb 2011 | A1 |
20110054359 | Sazonov et al. | Mar 2011 | A1 |
20110069021 | Hill | Mar 2011 | A1 |
20110109552 | Yasutake | May 2011 | A1 |
20110141052 | Bernstein et al. | Jun 2011 | A1 |
20110148608 | Grant | Jun 2011 | A1 |
20110175844 | Berggren | Jul 2011 | A1 |
20110205081 | Chen | Aug 2011 | A1 |
20110210926 | Pasquero et al. | Sep 2011 | A1 |
20110216015 | Edwards | Sep 2011 | A1 |
20110227872 | Huska | Sep 2011 | A1 |
20110241850 | Bosch et al. | Oct 2011 | A1 |
20110245992 | Stahlin et al. | Oct 2011 | A1 |
20110248728 | Maruyama | Oct 2011 | A1 |
20110255023 | Doyle et al. | Oct 2011 | A1 |
20110260983 | Pertuit et al. | Oct 2011 | A1 |
20110267181 | Kildal | Nov 2011 | A1 |
20110279380 | Weber | Nov 2011 | A1 |
20110290038 | Hoshino et al. | Dec 2011 | A1 |
20120013573 | Liu et al. | Jan 2012 | A1 |
20120038468 | Provancher | Feb 2012 | A1 |
20120039494 | Ellis | Feb 2012 | A1 |
20120105367 | Son et al. | May 2012 | A1 |
20120126959 | Zarrabi et al. | May 2012 | A1 |
20120127115 | Gannon | May 2012 | A1 |
20120169663 | Kim et al. | Jul 2012 | A1 |
20120223900 | Jiyama | Sep 2012 | A1 |
20120267221 | Gohng et al. | Oct 2012 | A1 |
20120267222 | Gohng et al. | Oct 2012 | A1 |
20120296528 | Wellhoefer et al. | Nov 2012 | A1 |
20120299856 | Hasui | Nov 2012 | A1 |
20130016053 | Jung et al. | Jan 2013 | A1 |
20130063380 | Wang et al. | Mar 2013 | A1 |
20130063389 | Moore | Mar 2013 | A1 |
20130093679 | Dickinson et al. | Apr 2013 | A1 |
20130096849 | Campbell et al. | Apr 2013 | A1 |
20130106691 | Rank | May 2013 | A1 |
20130113715 | Grant et al. | May 2013 | A1 |
20130113717 | Van Eerd et al. | May 2013 | A1 |
20130122857 | Karaogu et al. | May 2013 | A1 |
20130128587 | Lisseman et al. | May 2013 | A1 |
20130141396 | Lynn et al. | Jun 2013 | A1 |
20130147284 | Chun | Jun 2013 | A1 |
20130154938 | Arthur et al. | Jun 2013 | A1 |
20130181931 | Kenta | Jul 2013 | A1 |
20130218488 | Grandemange | Aug 2013 | A1 |
20130222287 | Bae et al. | Aug 2013 | A1 |
20130222310 | Birnbaum et al. | Aug 2013 | A1 |
20130228023 | Drasnin et al. | Sep 2013 | A1 |
20130250213 | Tomomasa | Sep 2013 | A1 |
20130250502 | Tossavainen | Sep 2013 | A1 |
20130250613 | Kamada | Sep 2013 | A1 |
20130257776 | Tissot | Oct 2013 | A1 |
20130265273 | Marsden | Oct 2013 | A1 |
20130307788 | Rao et al. | Nov 2013 | A1 |
20130342337 | Kiefer et al. | Dec 2013 | A1 |
20140071060 | Santos-Gomez | Mar 2014 | A1 |
20140092025 | Pala et al. | Apr 2014 | A1 |
20140114624 | Buchanan et al. | Apr 2014 | A1 |
20140191973 | Zellers | Jul 2014 | A1 |
20140267076 | Birnbaum et al. | Sep 2014 | A1 |
20140267113 | Lisseman et al. | Sep 2014 | A1 |
20140267114 | Lisseman et al. | Sep 2014 | A1 |
20140347176 | Modarres et al. | Nov 2014 | A1 |
20150009164 | Shinozaki | Jan 2015 | A1 |
20150009168 | Olien et al. | Jan 2015 | A1 |
20150046825 | Li | Feb 2015 | A1 |
20150097794 | Lisseman | Apr 2015 | A1 |
20150116205 | Westerman | Apr 2015 | A1 |
20150212571 | Kitada | Jul 2015 | A1 |
20150309576 | Tissot | Oct 2015 | A1 |
20160109949 | Park | Apr 2016 | A1 |
20160216764 | Morrell | Jul 2016 | A1 |
20170075424 | Bernstein | Mar 2017 | A1 |
Number | Date | Country |
---|---|---|
1607850 | Dec 2005 | EP |
06-037056 | May 1994 | JP |
2000-71809 | Mar 2000 | JP |
2005-175815 | Jun 2005 | JP |
2006-150865 | Jun 2006 | JP |
2008-123429 | May 2008 | JP |
2008-181709 | Aug 2008 | JP |
2008-299866 | Dec 2008 | JP |
2011-3188 | Jan 2011 | JP |
2012-73785 | Apr 2012 | JP |
2012-150833 | Aug 2012 | JP |
2012-155628 | Aug 2012 | JP |
2012176640 | Sep 2012 | JP |
2013-513865 | Apr 2013 | JP |
2013-182528 | Sep 2013 | JP |
1020060047110 | May 2006 | KR |
1020100129424 | Dec 2010 | KR |
2001088935 | Aug 2008 | WO |
2011008292 | Jan 2011 | WO |
2012052635 | Apr 2012 | WO |
2013082293 | Jun 2013 | WO |
2014194192 | Dec 2014 | WO |
2015054354 | Apr 2015 | WO |
2015054362 | Apr 2015 | WO |
2015054364 | Apr 2015 | WO |
2015054369 | Apr 2015 | WO |
2015054373 | Apr 2015 | WO |
Entry |
---|
International Search Report and Written Opinion issued in related International Application No. PCT/US2014/059652 dated Dec. 22, 2014. |
International Search Report and Written Opinion issued in related International Application No. PCT/US2014/059673 dated Jan. 9, 2015. |
International Search Report and Written Opinion issued in related International Application No. PCT/2014/059669 dated Jan. 23, 2015. |
International Search Report and Written Opinion issued in related International Application No. PCT/US2014/059657 dated Feb. 16, 2015. |
International Search Report and Written Opinion issued in related International Application No. PCT/US2014/059639 dated Feb. 24, 2015. |
International Search Report and Written Opinion issued in related International Application No. PCT/US2014/040224 dated Sep. 24, 2014. |
Office Action dated Sep. 30, 2015 in U.S. Appl. No. 14/509,493, filed Oct. 8, 2014. |
Co-pending U.S. Appl. No. 14/509,598, filed Oct. 8, 2014, and its file history. |
Co-pending U.S. Appl. No. 14/509,493, filed Oct. 8, 2014, and its file history. |
Office Action dated Jun. 16, 2016, received in connection with U.S. Appl. No. 14/509,493. |
Co-pending U.S. Appl. No. 14/509,462, filed Oct. 8, 2014, and its file history. |
Office Action dated Jun. 14, 2016, received in connection with U.S. Appl. No. 14/509,462. |
Co-pending U.S. Appl. No. 14/509,560, filed Oct. 8, 2014, and its file history. |
Co-pending U.S. Appl. No. 14/509,535, filed Oct. 8, 2014, and its file history. |
Office Action dated Feb. 11, 2016, received in connection with U.S. Appl. No. 14/509,535. |
Co-pending U.S. Appl. No. 14/291,845, filed May 30, 2014, and its file history. |
Office Action dated Feb. 24, 2016, received in connection with U.S. Appl. No. 14/291,845. |
Office Action dated Sep. 24, 2015, received in connection with U.S. Appl. No. 14/291,845. |
International Preliminary Report on Patentability and Written Opinion, dated Apr. 12, 2016, received in connection with International Patent Application No. PCT/US2014/059639. |
International Preliminary Report on Patentability and Written Opinion, dated Apr. 12, 2016, received in connection with International Patent Application No. PCT/US2014/059652. |
International Preliminary Report on Patentability and Written Opinion, dated Apr. 12, 2016, received in connection with International Patent Application No. PCT/US2014/059657. |
International Preliminary Report on Patentability and Written Opinion, dated Apr. 12, 2016, received in connection with International Patent Application No. PCT/US2014/059669. |
International Preliminary Report on Patentability and Written Opinion, dated Apr. 12, 2016, received in connection with International Patent Application No. PCT/US2014/059673. |
International Preliminary Report on Patentability and Written Opinion, dated Dec. 10, 2015, received in connection with International Patent Application No. PCT/US2014/040224. |
Office action issued in co-pending U.S. Appl. No. 14/509,462, dated Jun. 9, 2017. |
Office Action issued in co-pending U.S. Appl. No. 14/509,462, dated Nov. 24, 2017. |
Office Action issued in co-pending U.S. Appl. No. 15/230,786, dated Aug. 24, 2017. |
Office Action issued in co-pending U.S. Appl. No. 14/291,845, dated Aug. 24, 2017. |
Notice of Allowance issued in co-pending U.S. Appl. No. 14/509,493, dated Oct. 10, 2017. |
Office Action received in connection with JP Patent Application No. 2011-075258. (English Translation attached) dated Nov. 4, 2014. |
Office Action in U.S. Appl. No. 13/076,226, now U.S. Pat. No. 9,007,190 dated Apr. 14, 2015, dated Mar. 11, 2013. |
Office Action in U.S. Appl. No. 13/076,226, now U.S. Pat. No. 9,007,190 dated Apr. 14, 2015, dated Feb. 13, 2014. |
Office Action in U.S. Appl. No. 13/076,226, now U.S. Pat. No. 9,007,190 dated Apr. 14, 2015, dated Sep. 11, 2014. |
Office Action issued in U.S. Appl. No. 14/509,560, dated Feb. 10, 2017. |
Office Action issued in U.S. Appl. No. 14/291,845, dated Feb. 3, 2017. |
Office Action issued in U.S. Appl. No. 14/509,598, dated Jan. 6, 2017. |
Office Action issued in U.S. Appl. No. 13/863,363, dated Nov. 10, 2015. |
Office Action issued in U.S. Appl. No. 14/211,475, dated Dec. 17, 2015. |
Office Action issued in U.S. Appl. No. 14/211,665, dated Dec. 15, 2015. |
Office Action issued in U.S. Appl. No. 14/509,462, dated Dec. 28, 2016. |
Office Action issued in U.S. Appl. No. 14/509,493, dated Dec. 28, 2016. |
International Preliminary Report on Patentability for International Patent Application No. PCT/US2013/030417, dated Oct. 23, 2014. |
International Search Report and Written Opinion for International Patent Application No. PCT/US2013/030417, dated Jun. 21, 2013. |
Office Action in U.S. Appl. No. 15/230,786 dated Feb. 7, 2017. |
Notice of Allowance issued in co-pending U.S. Appl. No. 14/509,462, dated Feb. 22, 2018. |
Office Action dated Feb. 26, 2018, received in connection with Chinese Application No. 201480030786. (English Translation attached). |
Notice of Allowance issued in U.S. Appl. No. 14/291,845, dated Apr. 26, 2018. |
Office Action issued in Chinese Application No. 201480055487.1, dated Apr. 27, 2018. |
Office Action issued in U.S. Appl. No. 14/509,598, dated May 17, 2018. |
Supplemental Notice of Allowance issued in U.S. Appl. No. 14/509,462, dated May 29, 2018. |
Office Action issued for Japanese Application No. 2016-517039, dated Jun. 26, 2018. |
Office Action issued for U.S. Appl. No. 15/867,226, dated Jun. 29, 2018. |
Office Action issued for Japanese Application No. 2016-515524, dated Jul. 24, 2018. |
Number | Date | Country | |
---|---|---|---|
20150097791 A1 | Apr 2015 | US |
Number | Date | Country | |
---|---|---|---|
61891231 | Oct 2013 | US | |
61888322 | Oct 2013 | US |