The present invention relates to the field of user interfaces for portable electronic devices, and more particularly to a system, method and apparatus for sensing and interpreting finger gestures and movements to control and provide input to electronic devices.
Portable electronic devices have become increasingly popular. Examples of these devices include wireless or cellular telephones, personal digital assistants (PDAs), pagers and audio or music delivery devices. Some devices have become increasingly small such that they are now deemed “pocketable” and/or “wearable.”
A portable electronic device typically has a user interface for operative control. Most if not all conventional user interfaces for such portable electronic devices employ physical buttons, stylus, or voice control. In some devices, a large number of operations or functions are possible with the user interface.
One major shortcoming of these prior art user interfaces is that the user must physically retrieve the portable electronic device and position it appropriately for physical contact, for example, by utilizing a stylus to provide commands upon a touch-sensitive screen of a PDA or by manually depressing function buttons on a portable media player. In addition, as the size of a device becomes smaller, such an interface becomes increasingly awkward from an ergonomic standpoint. Voice-controlled systems may alleviate some of these problems; however, their major shortcoming is that the user must speak aloud, such that other nearby people may hear. Many voice-controlled systems are also extremely sensitive to environmental noise and interference.
Accordingly, it would be desirable to have a system, method and apparatus for improving the shortcomings of prior art electronic device control systems.
The present invention is a system, method and apparatus for controlling and providing data, signals and commands to electronic devices such as wireless phones, Personal Digital Assistants (PDAs), music players/recorders, media players/recorders, computers such as laptops or other portable computers, public telephones and other devices. As described herein, the inventive systems, methods and apparatus involve the use of bioacoustic or contact sensing technology, adaptive training methods and wireless technology for the control of such electronic devices.
In one embodiment, the present invention is a method for controlling an electronic device which includes: receiving one or more bioacoustic signals, each signal related to one or more hand gestures; determining the identity of the one or more hand gestures based on a positive correlation between the received signals and predetermined hand gesture data; and selectively issuing one or more commands associated with the identified hand gesture for activating one or more functions of the electronic device.
In another embodiment, the present invention is a wrist-adaptable wireless apparatus for invoking functions of a portable wireless device, including: a processor coupled to at least one piezo-electric contact microphone which receives sensor signal data; a storage facility for storing a plurality of gesture patterns, wherein the processor is operative to compare the sensor signal data with the plurality of gesture patterns, to detect a substantial match between the sensor signal data and one of the plurality of gesture patterns, and to select one of a plurality of user input commands associated with the match, wherein the plurality of user input commands correspond to a plurality of functions of the portable wireless device; and a wireless transmitter coupled to the processor and operative to wirelessly transmit the selected user input command to the portable wireless device.
In yet another embodiment, the present invention is a wireless control system including: a bioacoustic sensor component; a digital processor coupled to the sensor component; a storage component for storing gesture pattern data indicative of a plurality of gestures, each gesture corresponding to a unique one of a plurality of electronic device commands, wherein the processor is operative to compare acoustic sensor signals with the gesture pattern data and to select the one of the electronic device commands corresponding to a gesture that correlates with the acoustic sensor signals; and a wireless transmitter and antenna coupled to the processor and operative to transmit the selected electronic device command.
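By way of illustration only, the gesture-identification step common to these embodiments can be pictured as template matching against the stored gesture pattern data. The Python sketch below is an assumption-laden illustration, not the claimed implementation: the normalized cross-correlation measure, the 0.8 threshold and the placeholder templates are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Predetermined hand gesture data: one stored bioacoustic template per gesture.
# Random placeholders are used here; in practice templates would come from training.
GESTURE_TEMPLATES = {
    "index_tap": rng.standard_normal(256),
    "middle_tap": rng.standard_normal(256),
    "finger_rub": rng.standard_normal(256),
}

# Each identified gesture is associated with one command of the controlled device.
GESTURE_COMMANDS = {"index_tap": "SELECT", "middle_tap": "STOP", "finger_rub": "SCROLL"}

MATCH_THRESHOLD = 0.8  # assumed correlation level counted as a "substantial match"


def normalized_peak_correlation(signal, template):
    """Peak of the normalized cross-correlation between a sensed signal and a template."""
    s = (signal - signal.mean()) / (signal.std() + 1e-12)
    t = (template - template.mean()) / (template.std() + 1e-12)
    corr = np.correlate(s, t, mode="valid") / len(t)
    return float(corr.max())


def identify_and_issue(sensor_signal):
    """Return the command for the best-matching gesture, or None when nothing correlates."""
    scores = {name: normalized_peak_correlation(sensor_signal, tpl)
              for name, tpl in GESTURE_TEMPLATES.items()}
    best = max(scores, key=scores.get)
    if scores[best] >= MATCH_THRESHOLD:
        return GESTURE_COMMANDS[best]  # selectively issue the associated command
    return None                        # no positive correlation: issue nothing


# Example: a sensed window containing the "index_tap" pattern plus noise.
sensed = np.concatenate([rng.standard_normal(64),
                         GESTURE_TEMPLATES["index_tap"] + 0.1 * rng.standard_normal(256),
                         rng.standard_normal(64)])
print(identify_and_issue(sensed))  # -> "SELECT"
```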
Referring to
In one embodiment, apparatus 20 includes a band 40 having bioacoustic sensor material formed therein or attached thereto. Band 40 has a signal processing component 50 attached thereto, which may include components such as a signal amplifier, a digital processor, a memory, a broadcast component, an encryption module and an antenna, as discussed in more detail later herein. The bioacoustic sensor material may be one or more piezo-electric contact materials or films (also referred to herein as microphones). Preferably, band 40 is sized and configured to fit around a human arm 30. More preferably, band 40 is sized and configured to fit around a distal end 34 of human arm 30, proximate to a human hand 38. In one embodiment, band 40 may be constructed of fabric, elastic, links, or any other structure capable of incorporating the bioacoustic sensor material, such as material incorporating one or more piezo-electric contact microphones therein. In a preferred embodiment, band 40 has an outer surface and an inner surface, wherein signal processing component 50 is attached or affixed to the outer surface and the bioacoustic sensor material is formed on or attached to the inner surface of band 40. In this configuration, the bioacoustic material is positioned to receive bioacoustic signals from the user. In the present invention, the bioacoustic or piezo-electric material is optimized for sensing vibration in human skin over the ulna bone at the wrist. The internal sound is conducted by the bones of the hand and wrist to the ulna below the wristband sensor; airborne sound does not register in the wristband.
As discussed in more detail later herein, signal processing component 50 may be configured in a wristwatch or wristwatch like configuration and incorporate one or more of a signal amplifier, digital processor, broadcast facility, memory and other components which are operative to receive, process and provide bioacoustic signals.
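The description does not prescribe how signal processing component 50 segments the incoming signal into candidate gestures. Purely as a hedged illustration, a simple frame-energy gate over the digitized contact-microphone output could mark likely gesture events; the frame size and threshold below are assumptions.

```python
import numpy as np

SAMPLE_FRAME = 256       # samples per analysis frame (assumed)
ENERGY_THRESHOLD = 0.02  # assumed RMS level separating fingertip events from rest


def detect_gesture_segments(samples):
    """Return (start, end) sample ranges of frames whose RMS energy exceeds the threshold.

    `samples` holds the amplified, digitized output of the piezo-electric contact
    microphone; because the film responds to skin-conducted vibration rather than
    airborne sound, a simple energy gate can isolate candidate gesture events.
    """
    segments = []
    for i in range(len(samples) // SAMPLE_FRAME):
        frame = samples[i * SAMPLE_FRAME:(i + 1) * SAMPLE_FRAME]
        rms = np.sqrt(np.mean(np.square(frame)))
        if rms > ENERGY_THRESHOLD:
            segments.append((i * SAMPLE_FRAME, (i + 1) * SAMPLE_FRAME))
    return segments
```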
Referring to
In the present invention it is contemplated that one or more of the various apparatus components or elements, such as digital processor 230, wireless broadcast device 240, antenna 250, memory 260 and other components such as an encryption module, may be remotely located from bioacoustic sensor 210. For example, such components or elements may be integrated into a container placed at a location other than the user's arm, such as in a belt configuration or other remote configuration.
In another embodiment, the present invention may be configured with one or more piezo-electric contact microphones, signal amplifier, digital processor, a small-field wireless broadcast device and antenna embedded in a finger ring plus a wristwatch amplifier/repeater, not shown. In this embodiment, the wristwatch contains a larger battery than the ring and rebroadcasts gesture commands to the user's wearable devices once received from the ring. The wristwatch can also receive and display small amounts of information such as sports scores, stock quotes, weather and appointments.
In such an embodiment, the bone-conducted sound of index and middle finger contacts with the thumb is sensed at the finger ring. Once sensed, these signals are narrowcast from the sensing ring to the wristwatch, which encrypts and broadcasts commands to worn cell phones, handheld computers or any nearby digital devices equipped with an appropriate receiver. The broadcast signals can be decrypted only by authorized digital devices.
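The encryption performed at the wristwatch repeater is not tied to any particular cipher in this description. The sketch below merely illustrates the idea with a symmetric pre-shared key; the use of the Fernet cipher from the Python cryptography package is an assumption for illustration only.

```python
from cryptography.fernet import Fernet

# Key provisioned to the wristwatch and to every authorized digital device.
shared_key = Fernet.generate_key()
watch_cipher = Fernet(shared_key)


def rebroadcast(gesture_command: str) -> bytes:
    """Encrypt a narrowcast gesture command before the watch broadcasts it widely."""
    return watch_cipher.encrypt(gesture_command.encode())


def authorized_receive(token: bytes, key: bytes) -> str:
    """Only a device holding the shared key can recover the command."""
    return Fernet(key).decrypt(token).decode()


token = rebroadcast("PLAY")
print(authorized_receive(token, shared_key))  # -> "PLAY"
```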
In yet another embodiment of the invention, a fingernail-mounted (or thumbnail-mounted) or ring-mounted touch-edge and touch-surface device emits coded audio tones into the finger (or thumb); the tones are picked up by a wrist unit controller and relayed forward to the controlled wearable device. In still another embodiment, a narrow-casting infra-red remote control embedded in a watch configuration is used to control devices in any environment. In another embodiment, coded audio is emitted from the wrist through the hand to grasped objects, such as door knobs, for unlocking and locking doors. In still another embodiment, coded audio is received from objects grasped by the hand, and the audio signals are relayed from the wrist to the wearable device.
Referring to
The invention includes a user-specific training module for machine learning of gesture classification from the finger gesture audio patterns. During the device training session, users are asked by the system to perform hand gestures repeatedly, such as “touch index finger to thumb”, “touch middle finger to thumb”, or “snap your fingers”. At the same time, the learning component learns the mapping from signal to gesture for the given individual user. Training and adaptation for gesture classification may be performed using a discriminative algorithm. The learning algorithm first maps the high-dimensional recordings into a compact internal representation. It then uses a machine learning technique called boosting to find a set of discriminative features. Finally, these features are combined into a single highly accurate yet compact gesture classifier. For example, in one embodiment, a state machine or hidden Markov model (HMM) may be used to classify quantized voltages into gesture classes to control the desired devices.
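As a minimal sketch of such a training session, assuming FFT band energies as the compact internal representation and scikit-learn's AdaBoost as the boosting stage (neither library nor feature choice is mandated by this description):

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier


def compact_features(recording, n_bands=16):
    """Map a high-dimensional recording to a small vector of spectral band energies."""
    spectrum = np.abs(np.fft.rfft(recording))
    bands = np.array_split(spectrum, n_bands)
    return np.array([band.mean() for band in bands])


def train_gesture_classifier(recordings, labels):
    """Fit a boosted classifier from repeated, prompted gestures.

    recordings: list of 1-D arrays captured while the user repeats each prompt;
    labels: gesture names such as "index_tap" or "snap".
    """
    X = np.vstack([compact_features(r) for r in recordings])
    clf = AdaBoostClassifier(n_estimators=50)  # boosting selects discriminative features
    clf.fit(X, labels)
    return clf

# During use, a newly sensed signal would be classified into a gesture class, e.g.:
#   gesture = clf.predict(compact_features(signal).reshape(1, -1))[0]
```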
The present invention is designed to listen for or otherwise sense (via wrist, forearm, fingernail or ring-mounted sensors) naturally occurring fingertip or hand gestures. Exemplary detectable gestures include finger-to-thumb taps and double-taps, rubbing gestures, thumb slides between fingers, and finger snaps.
In one exemplary embodiment, a finger/thumb tap means select, a finger/thumb double-tap means operate, and a money gesture (rub) means scroll. In the present invention, sounds made by fingertip gestures, by worn rings or fingernail-mounted devices, or by grasped objects such as doorknobs, light switches or wall-mounted name-plates may also be sensed by the bioacoustic sensors or microphones.
In another exemplary embodiment, a user wearing a wireless phone and earpiece or headset might listen to voicemails or music using VCR-like “forward”, “play”, “stop”, and “rewind” commands mapped to the tapping of thumb to index finger for “play”, thumb to middle finger for “stop”, thumb sliding from middle finger to index finger for “forward” and thumb sliding from index finger to middle finger for “rewind”. In a public area, the user can make these gestures silently and in a visually concealed and private manner without unholstering any controlled digital device.
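A small dispatch table is enough to realize this exemplary mapping on the device side; the player method names below are assumptions for illustration, not part of the described system.

```python
# Mapping of the example gestures to the VCR-like playback commands.
PLAYBACK_COMMANDS = {
    "thumb_index_tap": "PLAY",
    "thumb_middle_tap": "STOP",
    "slide_middle_to_index": "FORWARD",
    "slide_index_to_middle": "REWIND",
}


def handle_gesture(gesture: str, player) -> None:
    """Translate an identified gesture into a call on the controlled media player."""
    command = PLAYBACK_COMMANDS.get(gesture)
    if command == "PLAY":
        player.play()
    elif command == "STOP":
        player.stop()
    elif command == "FORWARD":
        player.forward()
    elif command == "REWIND":
        player.rewind()
    # unrecognized gestures are simply ignored
```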
Referring to
One or more readout panels may also be provided as shown in
One of the many advantages of the present invention is that users can control digital devices near them without speaking or physically manipulating the devices themselves. Users control nearby digital devices merely by making simple finger gestures; the accompanying audio signals are sensed, learned, interpreted, encrypted and broadcast to the devices. Furthermore, while the present system, method and apparatus are ideally suited to provide able-bodied users with more convenient, intuitive and efficient ways to control electronic devices, they would also greatly benefit people with special needs, such as people with severe speech articulation problems or other conditions that make conventional user interface controls difficult or even impossible to use. It is contemplated that a wearable communication device of the present invention could significantly improve the quality of life of such people.
While the invention has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, preferred embodiments of the invention as set forth herein are intended to be illustrative, not limiting. Various changes may be made without departing from the spirit and scope of the invention.
This application is a continuation of prior U.S. patent application Ser. No. 09/898,108, filed Jul. 3, 2001, now U.S. Pat. No. 7,148,879, which claimed priority to U.S. Provisional Application No. 60/216,207, filed Jul. 6, 2000, and U.S. Provisional Application No. 60/265,212, filed Jan. 31, 2001, which are hereby incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
3629521 | Puharich et al. | Dec 1971 | A |
4421119 | Pratt | Dec 1983 | A |
4720607 | de Moncuit | Jan 1988 | A |
4799498 | Collier | Jan 1989 | A |
5073950 | Colbert et al. | Dec 1991 | A |
5319747 | Gerrissen et al. | Jun 1994 | A |
5327506 | Stites, III | Jul 1994 | A |
5368044 | Cain et al. | Nov 1994 | A |
5615681 | Ohtomo | Apr 1997 | A |
5766208 | Mcewan | Jun 1998 | A |
5810731 | Sarvazyan et al. | Sep 1998 | A |
5836876 | Dimarogonas | Nov 1998 | A |
6115482 | Sears et al. | Sep 2000 | A |
6135951 | Richardson et al. | Oct 2000 | A |
6151208 | Bartlett | Nov 2000 | A |
6234975 | Mcleod et al. | May 2001 | B1 |
6380923 | Fukumoto | Apr 2002 | B1 |
6396930 | Vaudrey et al. | May 2002 | B1 |
6409684 | Wilk | Jun 2002 | B1 |
6507662 | Brooks | Jan 2003 | B1 |
6589287 | Lundborg | Jul 2003 | B2 |
6631197 | Taenzer | Oct 2003 | B1 |
6754472 | Williams et al. | Jun 2004 | B1 |
6783501 | Takahashi et al. | Aug 2004 | B2 |
6844660 | Scott | Jan 2005 | B2 |
6898299 | Brooks | May 2005 | B1 |
7010139 | Smeehuyzen | Mar 2006 | B1 |
7123752 | Kato et al. | Oct 2006 | B2 |
7148879 | Amento et al. | Dec 2006 | B2 |
7198607 | Jamsen | Apr 2007 | B2 |
7206423 | Feng et al. | Apr 2007 | B1 |
7370208 | Levin et al. | May 2008 | B2 |
7405725 | Mohri et al. | Jul 2008 | B2 |
7536557 | Murakami et al. | May 2009 | B2 |
7539533 | Tran | May 2009 | B2 |
7615018 | Nelson et al. | Nov 2009 | B2 |
7625315 | Hickman | Dec 2009 | B2 |
7648471 | Hobson | Jan 2010 | B2 |
7671351 | Setlak et al. | Mar 2010 | B2 |
7708697 | Wilkinson et al. | May 2010 | B2 |
7760918 | Bezvershenko et al. | Jul 2010 | B2 |
7778848 | Reeves | Aug 2010 | B1 |
7796771 | Calhoun et al. | Sep 2010 | B2 |
7878075 | Johansson et al. | Feb 2011 | B2 |
7914468 | Shalon et al. | Mar 2011 | B2 |
8023669 | Segev et al. | Sep 2011 | B2 |
8023676 | Abolfathi et al. | Sep 2011 | B2 |
8031046 | Franza et al. | Oct 2011 | B2 |
8098129 | Falck et al. | Jan 2012 | B2 |
8196470 | Gross et al. | Jun 2012 | B2 |
8200289 | Joo et al. | Jun 2012 | B2 |
8253693 | Buil et al. | Aug 2012 | B2 |
8270637 | Abolfathi | Sep 2012 | B2 |
8270638 | Abolfathi et al. | Sep 2012 | B2 |
8312660 | Fujisaki | Nov 2012 | B1 |
8348936 | Trembly et al. | Jan 2013 | B2 |
8421634 | Tan et al. | Apr 2013 | B2 |
8467742 | Hachisuka et al. | Jun 2013 | B2 |
8482488 | Jannard | Jul 2013 | B2 |
8491446 | Hinds et al. | Jul 2013 | B2 |
8500271 | Howell et al. | Aug 2013 | B2 |
8521239 | Hosoi et al. | Aug 2013 | B2 |
8540631 | Penner et al. | Sep 2013 | B2 |
8542095 | Kamei | Sep 2013 | B2 |
8594568 | Falck | Nov 2013 | B2 |
8750852 | Forutanpour et al. | Jun 2014 | B2 |
8922427 | Dehnie et al. | Dec 2014 | B2 |
20010013546 | Ross | Aug 2001 | A1 |
20010051776 | Lenhardt | Dec 2001 | A1 |
20030048915 | Bank | Mar 2003 | A1 |
20030066882 | Ross | Apr 2003 | A1 |
20060018488 | Viala et al. | Jan 2006 | A1 |
20070012507 | Lyon | Jan 2007 | A1 |
20080223925 | Saito et al. | Sep 2008 | A1 |
20080260211 | Bennett et al. | Oct 2008 | A1 |
20090149722 | Abolfathi et al. | Jun 2009 | A1 |
20090234262 | Reid, Jr. et al. | Sep 2009 | A1 |
20090287485 | Glebe | Nov 2009 | A1 |
20090289958 | Kim et al. | Nov 2009 | A1 |
20100016741 | Mix et al. | Jan 2010 | A1 |
20100066664 | Son et al. | Mar 2010 | A1 |
20100137107 | Jamsa et al. | Jun 2010 | A1 |
20100162177 | Eves et al. | Jun 2010 | A1 |
20100168572 | Sliwa et al. | Jul 2010 | A1 |
20100286571 | Allum et al. | Nov 2010 | A1 |
20100316235 | Park et al. | Dec 2010 | A1 |
20110125063 | Shalon et al. | May 2011 | A1 |
20110134030 | Cho | Jun 2011 | A1 |
20110135106 | Yehuday et al. | Jun 2011 | A1 |
20110137649 | Rasmussen et al. | Jun 2011 | A1 |
20110152637 | Kateraas et al. | Jun 2011 | A1 |
20110227856 | Corroy et al. | Sep 2011 | A1 |
20110245669 | Zhang | Oct 2011 | A1 |
20110255702 | Jensen | Oct 2011 | A1 |
20110269601 | Nelson et al. | Nov 2011 | A1 |
20110282662 | Aonuma et al. | Nov 2011 | A1 |
20120010478 | Kinnunen et al. | Jan 2012 | A1 |
20120011990 | Mann | Jan 2012 | A1 |
20120058859 | Elsom-Cook et al. | Mar 2012 | A1 |
20120065506 | Smith | Mar 2012 | A1 |
20120212441 | Christiansson et al. | Aug 2012 | A1 |
20120280900 | Wang et al. | Nov 2012 | A1 |
20120290832 | Rodriguez et al. | Nov 2012 | A1 |
20130034238 | Abolfathi | Feb 2013 | A1 |
20130041235 | Rogers et al. | Feb 2013 | A1 |
20130119133 | Michael et al. | May 2013 | A1 |
20130120458 | Celebisoy et al. | May 2013 | A1 |
20130135223 | Shai | May 2013 | A1 |
20130142363 | Amento et al. | Jun 2013 | A1 |
20130171599 | Bleich et al. | Jul 2013 | A1 |
20130173926 | Morese et al. | Jul 2013 | A1 |
20130215060 | Nakamura | Aug 2013 | A1 |
20130225915 | Redfield et al. | Aug 2013 | A1 |
20130225940 | Fujita et al. | Aug 2013 | A1 |
20130278396 | Kimmel | Oct 2013 | A1 |
20130288655 | Forutanpour et al. | Oct 2013 | A1 |
20140009262 | Robertson et al. | Jan 2014 | A1 |
20140028604 | Morinaga et al. | Jan 2014 | A1 |
20140035884 | Oh et al. | Feb 2014 | A1 |
20140097608 | Buzhardt et al. | Apr 2014 | A1 |
20140099991 | Cheng et al. | Apr 2014 | A1 |
20140168135 | Saukko et al. | Jun 2014 | A1 |
20140174174 | Uehara et al. | Jun 2014 | A1 |
20140188561 | Tenbrock et al. | Jul 2014 | A1 |
20140210791 | Hanauer et al. | Jul 2014 | A1 |
20140240124 | Bychkov | Aug 2014 | A1 |
20150084011 | Park et al. | Mar 2015 | A1 |
20150199950 | Heiman | Jul 2015 | A1 |
Number | Date | Country |
---|---|---|
2003257031 | Feb 2004 | AU |
2007200415 | Aug 2007 | AU |
1207883 | Jul 1986 | CA |
0712114 | May 1996 | EP |
0921753 | Jun 1999 | EP |
1436804 | Feb 2004 | EP |
2312997 | Apr 2011 | EP |
2643981 | May 2012 | EP |
2483677 | Aug 2012 | EP |
2226931 | Jul 1990 | GB |
02249017 | Oct 1990 | JP |
04-317638 | Nov 1992 | JP |
2003058190 | Feb 2003 | JP |
2005142729 | Jun 2005 | JP |
2010210730 | Sep 2010 | JP |
20100056688 | Oct 1990 | KR |
200946887 | Aug 1997 | TW |
WO 8201329 | Apr 1982 | WO |
WO 9601585 | Jan 1996 | WO |
WO 03033882 | Apr 2003 | WO |
WO 2006094372 | Sep 2006 | WO |
WO 2010045158 | Apr 2010 | WO |
WO 2012168534 | Dec 2012 | WO |
Entry |
---|
U.S. Office Action dated Feb. 13, 2013 in U.S. Appl. No. 13/309,124. |
U.S. Office Action dated Sep. 24, 2013 in U.S. Appl. No. 13/309,124. |
U.S. Office Action dated Jan. 29, 2014 in U.S. Appl. No. 13/309,124. |
Zhong et al., “OsteoConduct: Wireless Body-Area Communication based on Bone Conduction,” Proceeding of the ICST 2nd International Conference on Body Area Networks, BodyNets 2007. |
Travis et al., “Hambone: A bio-acoustic gesture interface,” 2007 11th IEEE International Symposium on Wearable Computers, 2007. |
Scanlon, Michael V. Acoustic sensor for health status monitoring. Army Research Lab Aberdeen Proving Ground MD, 1998. |
Lopez, Guillaume, Masaki Shuzo, and Ichiro Yamada. “New healthcare society supported by wearable sensors and information mapping-based services.” International Journal of Networking and Virtual Organisations 9.3 (2011): 233-247. |
Scanlon, Michael V. “Acoustic sensors in the helmet detect voice and physiology.” AeroSense 2003. International Society for Optics and Photonics, 2003. |
Amento et al., “The Sound of One Hand: A Wrist-Mounted Bio-Acoustic Fingertip Gesture Interface,” Short Talk: Its All About Sound, CHI 2002. |
“Kinect Gestures,” retrieved from http://support.xbox.com/en-US/xbox-360/kinect/body-controller on Oct. 24, 2013. |
Mark Billinghurst, “Chapter 14: Gesture Based Interaction,” Haptic Input, Aug. 24, 2011. |
Kompis, Martin, and Rudolf Haeusler, “Electromagnetic interference of bone-anchored hearing aids by cellular phones revisited,” Acta oto-laryngologica 122.5, 2002, 510-512. |
Chris Harrison, Desney Tan, Dan Morris, “Skinput: Appropriating the Skin as an Interactive Canvas,” Communications of the ACM 54.8, 2011, 111-118. |
T. Scott Saponas, et al., “Enabling always-available input with muscle-computer interfaces,” Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology, ACM, 2009. |
Jao Henrique Donker, “The Body as a communication medium,” 2009. |
Sang-Yoon Chang, et al., “Body Area Network Security: Robust Key Establishment Using Human Body Channel,” retrieved from https://www.usenix.org/system/files/conference/healthsec12/healthsec12-final15.pdf on Oct. 16, 2013. |
Vidya Bharrgavi, et al., “Security Solution for Data Integrity in Wireless BioSensor Networks,” Distributed Computing Systems Workshops, 2007, ICDCSW'07, 27th International Conference, IEEE, 2007. |
Daniel Halperin, et al., “Pacemakers and Implantable Cardiac Defibrillators: Software Radio Attacks and Zero-Power Defenses,” Security and Privacy, SP 2008, IEEE Symposium, IEEE, 2008. |
Carmen C. Y. Poon, et al., “A Novel Biometrics Method to Secure Wireless Body Area Sensor Networks for Telemedicine and M-Health,” Communications Magazine, IEEE 44.4, 2006, 73-81. |
Zicheng Liu, et al., “Direct Filtering for Air-and Bone-Conductive Microphones,” Multimedia Signal Processing, 2004 IEEE 6th Workshop, IEEE, 2004. |
Mujibiya, Adiyan, et al. “The sound of touch: on-body touch and gesture sensing based on transdermal ultrasound propagation.” Proceedings of the 2013 ACM international conference on Interactive tabletops and surfaces. ACM, 2013. |
Harrison, Chris, Robert Xiao, and Scott Hudson. “Acoustic barcodes: passive, durable and inexpensive notched identification tags.” Proceedings of the 25th annual ACM symposium on User interface software and technology. ACM, 2012. |
Yoo, Jerald, Namjun Cho, and Hoi-Jun Yoo. “Analysis of body sensor network using human body as the channel.” Proceedings of the ICST 3rd international conference on Body area networks. ICST (Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering), 2008. |
Ni, Tao, and Patrick Baudisch. “Disappearing mobile devices.” Proceedings of the 22nd annual ACM symposium on User interface software and technology. ACM, 2009. |
Hinckley, Ken, and Hyunyoung Song, “Sensor synaesthesia: touch in motion, and motion in touch.” Proceedings of the Sigchi Conference on Human Factors in Computing Systems. ACM, 2011. |
Hinge, Dhanashree, and S. D. Sawarkar. “Mobile to Mobile data transfer through Human Area Network.” IJRCCT 2.11 (2013): 1181-1184. |
Park, Duck Gun, et al. “TAP: touch-and-play.” Proceedings of the SIGCHI conference on Human Factors in computing systems. ACM, 2006. |
Ruiz, J. Agud, and Shigeru Shimamoto. “A study on the transmission characteristics of the human body towards broadband intra-body communications.” Consumer Electronics, 2005. (ISCE 2005). Proceedings of the Ninth International Symposium on. IEEE, 2005. |
Nagai, Ryoji, et al. “Near-Field Coupling Communication Technology for Human-Area Networking.” Proc. Conf. on Information and Communication Technologies and Applications (ICTA2011), International Institute of Informatics and Systems (IIIS). 2012. |
Maruf, Md Hasan. “An Input Amplifier for Body-Channel Communication.” (2013). |
Rekimoto, Jun. “Gesturewrist and gesturepad: Unobtrusive wearable interaction devices.” Wearable Computers, 2001. Proceedings. Fifth International Symposium on. IEEE, 2001. |
Lipkova, Jolana, and Jaroslav Cechak. “Transmission of Information Using the Human Body,” http://www.iiis.org/cds2010/cd2010imc/ccct—2010/paperspdf/ta303gi.pdf, CCCT 2010. |
U.S. Office Action dated Dec. 17, 2015 in U.S. Appl. No. 14/065,663. |
U.S. Office Action dated Nov. 19, 2015 in U.S. Appl. No. 14/083,499. |
U.S. Office Action dated Nov. 19, 2015 in U.S. Appl. No. 14/090,668. |
U.S. Office Action dated Jan. 11, 2016 in U.S. Appl. No. 14/514,658. |
U.S. Office Action dated Feb. 25, 2016 in U.S. Appl. No. 14/072,126. |
U.S. Notice of Allowance dated Apr. 4, 2016 in U.S. Appl. No. 14/083,499. |
U.S. Notice of Allowance dated Mar. 21, 2016 in U.S. Appl. No. 14/090,668. |
U.S. Office Action dated Mar. 16, 2016 in U.S. Appl. No. 14/482,087. |
U.S. Office Action dated Mar. 10, 2016 in U.S. Appl. No. 14/482,091. |
Office Action mailed Jul. 7, 2016 in U.S. Appl. No. 14/072,126. |
Notice of Allowance mailed Jul. 12, 2016 in U.S. Appl. No. 14/482,091. |
Number | Date | Country | |
---|---|---|---|
60216207 | Jul 2000 | US | |
60265212 | Jan 2001 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 09898108 | Jul 2001 | US |
Child | 11586142 | US |