Computer gaming has become increasingly realistic, with high-resolution graphics, three-dimensional rendering, sophisticated interface devices, and internet multi-player games. Additionally, systems have been developed that allow the user to hold a structure that interfaces with a system, such as a joystick or remote, where the interface detects a player's motion. The system then incorporates the player's motion into a wireless application. Current systems require the user to hold the interface. In some situations the interface can be broken, or the interface can break objects when the user inadvertently releases it. Additionally, current interface systems are only capable of detecting arm movements. Finally, current systems are not capable of detecting other parameters from the user in order to incorporate that user information into the gaming system. Systems capable of doing so would enhance the user experience.
Provided herein is an interface for providing information from a user to a control unit or data processing system comprising at least one wearable patch in communication with the control unit, wherein the patch is adaptable to detect object data and transmit the object data to the control unit. The patch can be adaptable to communicate with the control unit through at least one of narrowband and ultrawideband frequencies. The patch can comprise at least one sensor. Furthermore, the sensor can comprise at least one multipower radio. A multipower radio can function as both a narrowband radio and an ultrawideband radio. The patch can be used to detect object data, wherein the object data comprises at least one detectable parameter. The detectable parameter can comprise one or more of temperature, motion, heart rate, ECG, EEG, blood pressure, and hydration. In some embodiments, the patch can be further adaptable to provide feedback to the user. The feedback can be selected from one or more of on-screen instruction, shock, heat, or vibration. In some embodiments, the patch can be a disposable patch. The patch can be a flexible patch. The patch can be further adaptable to be positioned on an inanimate object. The patch can be further adaptable to be positioned on an animate object. In some embodiments of the patch, the object data detected is motion.
Another embodiment of the invention described herein comprises a parameter-determining patch for detecting object data from an object, the patch comprising at least one wearable data-obtaining sensor adaptable to obtain data from an object; and at least one transmitter for transmitting object data. The transmitter can be adaptable to transmit data using at least one of narrowband and ultrawideband frequencies. The transmitter can comprise at least one multimode radio. The transmitter can also receive information from an external source. In some embodiments, the patch can be adaptable to be in communication with a control unit. Additionally, the patch can comprise at least one receiver adaptable to receive data from an external unit. In some embodiments, the patch can be adaptable to stimulate the object with a stimulus. The stimulus can be selected from at least one of on-screen instruction, shock, heat, or vibration. The object data can be selected from at least one of motion, hydration, heart rate, ECG, EEG, blood pressure, and temperature.
Further provided herein are systems for incorporating information from an object comprising a control unit providing an output associated with an object, and at least one wearable patch in communication with the control unit. The wearable patch can be adaptable to be positioned on the object and further adaptable to detect at least one parameter from the object. Further, the control unit can be adaptable to adjust the output associated with the object in response to the parameter. The object can be an animate object. Alternatively, the object can be an inanimate object. In some embodiments, the parameter detected is movement. The movement can comprise at least one of displacement, velocity, acceleration, or any combination thereof. In some embodiments, the parameter can be a physiological parameter. The physiological parameter can be selected from at least one of temperature, hydration, heart rate, ECG, EEG, blood pressure, or any combination thereof. In some embodiments, the wearable patch can be adaptable to provide feedback to the object. The feedback can be physical feedback including, but not limited to, at least one of vibration, electric shock, or change in temperature. Furthermore, the data processing system can be adaptable to provide feedback in response to the detected parameter. The feedback can be selected from at least one of audio feedback or visual feedback. Additionally, the system can further comprise at least one currently available data processing interface device. Currently available data processing interface devices include, but are not limited to, joysticks or remotes. The patch can comprise at least one sensor. The sensor can comprise at least one multimode radio.
Additionally provided herein are methods for interacting with a virtual environment of a control unit comprising positioning at least one wearable patch comprising at least one sensor on an object from which information is desired; acquiring information from the object using the at least one patch; incorporating the object information acquired into the virtual environment; and adjusting the virtual environment in response to the information from the object. The sensor can comprise at least one multipower radio. In some embodiments, the sensor is disposable. The sensor can be flexible. In some embodiments of the method, the positioning step can comprise positioning more than one sensor on the user. The object can be an animate object. Alternatively, the object can be an inanimate object. The sensor can be adaptable to acquire physiological data from an animate object. The physiological data can be selected from at least one of heart rate, ECG, EEG, blood pressure, hydration, speed, temperature, or any combination thereof. In some embodiments, the method can further comprise the step of providing feedback to the object through the patch. The feedback can be a stimulus applied to the user. Additionally, the method can further comprise the step of providing feedback to the object through the virtual environment. The feedback can be audio feedback or visual feedback. In some embodiments, the method further provides for the step of recording the object information. The object information can be recorded and stored and then used later to evaluate the progress of the user. Additionally, the method can comprise the step of recording the object information and then manipulating the recorded information virtually. In some embodiments of the method, the system is a gaming system. The method can further provide the use of a system that is adaptable to be adjusted in real time. 
Additionally, the method can further comprise the step of communicating the object information incorporated into the virtual environment to a second computer system accessing the same virtual environment.
All publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference.
The novel features of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings of which:
Provided herein is an invention comprising at least one lightweight, flexible, and wearable patch that is capable of detecting at least one parameter from an object. The patch can comprise at least one sensor. The patch can further be capable of transmitting the parameter or parameters detected to a control unit, or data processor, which can then incorporate the parameter into a program. The program can be visually represented on a display, where the changes to the program are also visually represented on the display. The patch can further allow the data processing system to faithfully reproduce a computer representation of the user, which adds a new dimension of realism to the program. The users of such systems will have to deal with their own physical constraints. For example purposes only, in a fighting game, the strength of the punches or the ability of a user to run fast can be determined by detecting the user's heart rate and other physical factors. As another example, electrocardiogram (ECG) sensors can be included with the patch and can be used to provide feedback to other players in a team situation. The ability to see other players' heart rates can make it more difficult for players to bluff while playing cards.
The invention provides that a sensor patch can be placed on the surface of a user or object. In some embodiments, the sensor can be placed on an animate object, such as a human, or even an animal. Alternatively, the patch can be placed on an inanimate object such as, for example purposes only, a piece of sporting equipment. The sensor can measure a parameter from the user or object. In the case of an animate object, the sensor can detect a physiological parameter including, but not limited to, heart rate, hydration, blood pressure, ECG, electroencephalogram (EEG), and temperature. In some embodiments, the sensor can detect the movement of the user's body or movement of at least part of the user's body. The sensor can also be used to detect the spatial relationship between the user and an object or between multiple users. In some embodiments, the sensor can detect a single parameter. In some embodiments, the sensor can be used to detect multiple parameters. The sensor can also be placed on an inanimate object. For example, if the user is playing a tennis video game, the sensor can be placed on the user's own tennis racquet. The movement of the racquet can be detected by the patch. The patch can then send information regarding the movement to the system.
Once information is detected by the sensor, the sensor can transmit this data to a data processor or control unit.
In some embodiments, the data processor or control unit can be connected to an output display, as illustrated in
The patch can also serve as a user-interface. The patch can be used as an interface or input to a control unit, thereby allowing the user to manipulate the control unit. The control unit can then be used to manipulate an external object in response to the user's input.
Additionally, the invention described herein can be used to provide feedback to the user. As shown in
In some embodiments, the user can obtain feedback instantaneously. In some embodiments, the data from the user can be detected by the patches while the user is in motion and then the patches can store the information collected. The information can then be downloaded to a mobile device for real-time use. Alternatively, the information can be stored by the patch and then downloaded at some point in time for delayed feedback.
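For illustration only, the store-and-forward behavior described above can be sketched as a small buffer; the class name, capacity, and sample values below are assumptions, not details from the specification:

```python
from collections import deque

class PatchBuffer:
    """Hypothetical on-patch store-and-forward buffer.

    Samples are collected while the wearer is in motion; they can then be
    downloaded to a mobile device immediately (real-time use) or at a
    later point in time (delayed feedback)."""

    def __init__(self, capacity=1024):
        # A bounded deque drops the oldest samples once full, mimicking
        # a patch's limited on-board memory.
        self._samples = deque(maxlen=capacity)

    def record(self, timestamp, value):
        self._samples.append((timestamp, value))

    def download(self):
        # Drain all stored samples in arrival order.
        drained = list(self._samples)
        self._samples.clear()
        return drained

buf = PatchBuffer(capacity=4)
for t, temp in enumerate([98.6, 98.7, 98.9, 99.1, 99.4]):
    buf.record(t, temp)
data = buf.download()  # only the 4 most recent samples survive
```

Whether a real patch would overwrite its oldest samples or stop recording when full is a design choice; overwriting is assumed here for simplicity.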
Provided herein is a wearable patch that can be interfaced with a data processing system. At least one patch can be used with a system. In some embodiments, the invention can provide for the use of multiple patches to be used with the system. The patch can comprise at least one multipower radio. The multipower radio can be capable of transmitting data from the patch to a data processing system using either narrowband or ultrawideband frequencies.
The patches can be used to detect various parameters that can be used with a data processing system or any other suitable control system. The patches can comprise sensors including, but not limited to, accelerometers, temperature sensors, ECG sensors, EEG sensors, impedance sensors, moisture sensors, or any combination thereof. The sensors can detect parameters from the user or from an object. The detectable parameters include, but are not limited to, temperature, motion, heart rate, ECG data, EEG data, blood pressure, hydration, or any combination thereof. In some embodiments, the patches can be used as part of a feedback system. The ability of the patches to detect user limitations allows the limitations of the players to be included in the processing system. This can enhance the user's interaction with the system thereby providing a more realistic experience. In some embodiments, the patches can further comprise transducers including, but not limited to, vibrational transducers, electrical transducers, thermal transducers, or any combination thereof.
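As a minimal sketch of such a feedback system, detected parameters can be mapped to transducer actions; the parameter names, thresholds, and transducer labels below are illustrative assumptions only:

```python
# Each rule: (parameter name, trigger predicate, transducer to activate).
# Thresholds are illustrative, not values from the specification.
FEEDBACK_RULES = [
    ("heart_rate", lambda v: v > 180, "vibration"),  # possible over-exertion
    ("temperature", lambda v: v > 39.0, "vibration"),
    ("hydration", lambda v: v < 0.4, "thermal"),
]

def select_feedback(readings):
    """Return the transducer actions triggered by a dict of sensor readings."""
    actions = []
    for parameter, triggered, transducer in FEEDBACK_RULES:
        if parameter in readings and triggered(readings[parameter]):
            actions.append(transducer)
    return actions
```

A rule table like this keeps the sensing and feedback sides decoupled, so new sensors or transducers can be added without changing the dispatch logic.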
The relationship between the patch position and the ten piece model is shown in
Further provided herein are systems for incorporating information from an object comprising a control unit providing an output associated with an object, and at least one wearable patch in communication with the control unit. The wearable patch can be adaptable to be positioned on the object and further adaptable to detect at least one parameter from the object. Further, the control unit can be adaptable to adjust the output associated with the object in response to the parameter. The object can be an animate object. Alternatively, the object can be an inanimate object. In some embodiments, the parameter detected is movement. The movement can comprise at least one of displacement, speed, or velocity, or any combination thereof. In some embodiments, the parameter can be a physiological parameter. The physiological parameter can be selected from at least one of temperature, hydration, heart rate, ECG, EEG, blood pressure, or any combination thereof. In some embodiments, the wearable patch can be adaptable to provide feedback to the object. The feedback can be physical feedback including, but not limited to, at least one of vibration, electric shock, or change in temperature. Furthermore, the data processing system can be adaptable to provide feedback in response to the detected parameter. The feedback can be selected from at least one of audio feedback or visual feedback. Additionally, the system can further comprise at least one currently available data processing interface device. Currently available data processing interface devices include, but are not limited to, joysticks or remotes. The patch can comprise at least one sensor. The sensor can comprise at least one multipower radio.
Additionally provided herein are methods for interacting with a virtual environment of a control unit comprising positioning at least one wearable patch comprising at least one sensor on an object from which information is desired; acquiring information from the object using the at least one patch; incorporating the object information acquired into the virtual environment; and adjusting the virtual environment in response to the information from the object. The sensor can comprise at least one multipower radio. In some embodiments, the sensor is disposable. The sensor can be flexible. In some embodiments of the method, the positioning step can comprise positioning more than one sensor on the user. The object can be an animate object. Alternatively, the object can be an inanimate object. The sensor can be adaptable to acquire physiological data from an animate object. The physiological data can be selected from at least one of heart rate, ECG, EEG, blood pressure, hydration, speed, temperature, or any combination thereof. In some embodiments, the method can further comprise the step of providing feedback to the object through the patch. The feedback can be a stimulus applied to the user. Additionally, the method can further comprise the step of providing feedback to the object through the virtual environment. The feedback can be audio feedback or visual feedback. In some embodiments, the method further provides for the step of recording the object information. The object information can be recorded and stored and then used later to evaluate the progress of the user. Additionally, the method can comprise the step of recording the object information and then manipulating the recorded information virtually. In some embodiments of the method, the system is a gaming system. The method can further provide the use of a system that is adaptable to be adjusted in real time. 
Additionally, the method can further comprise the step of communicating the object information incorporated into the virtual environment to a second computer system accessing the same virtual environment.
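The steps of the method above — positioning patches, acquiring information, incorporating it into the virtual environment, adjusting the environment, and communicating with a second system — can be sketched as a single pass; the data shapes below are assumptions for illustration:

```python
def run_session(patches, virtual_env, peers=()):
    """One illustrative pass of the method: acquire object data from each
    patch, incorporate it into the virtual environment, adjust the
    environment in response, and share the result with peer systems
    accessing the same virtual environment."""
    for patch in patches:
        reading = patch()              # acquire information from the object
        virtual_env.update(reading)    # incorporate it into the environment
    virtual_env["adjusted"] = True     # adjust in response to the data
    for peer in peers:
        peer.update(virtual_env)       # communicate with a second system
    return virtual_env

# Patches are modeled here as callables returning parameter readings.
peer_env = {}
env = run_session(
    patches=[lambda: {"heart_rate": 72}, lambda: {"motion": "jump"}],
    virtual_env={},
    peers=[peer_env],
)
```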
The invention described herein can be used with an interactive exercise video. The user can attach patches to his or her body at various locations, such as on the arms, legs, and torso. The user can then follow an on-screen coach while performing various exercises such as yoga, Pilates, or other stretching moves. Alternatively, the user can perform various aerobic exercises or weight lifting routines. The feedback from the patches can then be used to assess how well the user is performing the requested tasks. A visual display can show the user what corrective measures need to be taken, if any. For example, if a user is not performing a stretch properly, the system can indicate this to the user. A computer coach can provide real-time feedback to the user through visual or audio cues. Additionally, physiological parameters, such as heart rate, can be detected to determine whether the user is over-exerting himself or herself, and the system can adjust dynamically to compensate for the user's ability. The system can also adjust to maximize the desired result for the user, such as weight loss or aerobic strength.
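For illustration, the corrective-measure and over-exertion logic described above might look like the following; the joint names, the 10-degree tolerance, and the heart-rate threshold are assumptions, not values from the specification:

```python
def pose_corrections(reference, measured, tolerance=10.0):
    """Corrective cues: joints whose measured angle (degrees) deviates from
    the on-screen coach's reference pose by more than `tolerance`."""
    deviations = {joint: reference[joint] - measured.get(joint, 0.0)
                  for joint in reference}
    return {joint: d for joint, d in deviations.items() if abs(d) > tolerance}

def adjust_intensity(level, heart_rate, max_hr=170):
    """Dynamically lower the workout intensity when the detected heart
    rate suggests over-exertion."""
    return max(1, level - 1) if heart_rate > max_hr else level
```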
The invention described herein can be used in conjunction with a completely tetherless player who wears virtual reality goggles to immerse himself or herself into a game. The video game player can now move around, and the feedback of the player's position and movement is closely tracked. The tracked information can be used to update the scenery in the virtual environment. The transducers can be used to reinforce the visual feedback. For instance, invisible objects can be illuminated in virtual reality, and if the player touches one of these invisible objects with his or her body, a transducer signal can be used to provide feedback (such as vibration or shock). The user can run, jump, kick, or do anything possible within the confines of his or her environment, and these movements and motions can be accurately tracked and recreated for the game. The game system can be set up so that other users of the game can see a virtual version of the player and the player's movements, even though the users are located at different locations.
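The invisible-object feedback described above reduces to a proximity test; the coordinates, radius, and the choice of vibration as the signal are illustrative assumptions:

```python
def collision_feedback(player_pos, invisible_objects, radius=0.5):
    """Return a transducer signal when the tracked player position comes
    within `radius` of an invisible virtual object, else None."""
    for obj in invisible_objects:
        # Euclidean distance between the player and the virtual object.
        dist = sum((p - o) ** 2 for p, o in zip(player_pos, obj)) ** 0.5
        if dist < radius:
            return "vibration"
    return None
```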
The invention described herein can be used to create animated sequences for motion pictures or for the gaming industry. Typically, simple light markers (white dots) are used to pick up signals in a video sequence. The locations of these light markers are used to move a computer-animated character in sequence. The current approach requires wearing a special uniform with marker tags. Using the invention described herein, the patches can be used to record the location and position of the patches automatically. The software can be used to track the motion of the actor for use in movies or games. More complex scenes can be rendered in three dimensions, as normal action sequences involving multiple characters are captured in their natural setting.
The invention described herein can be used with medical devices. The patches of the invention can be used to assist remote diagnosis of motor impairments. The patches can be placed on the area surrounding a joint that has limited movement. The motion can then be tracked and the range of motion used to determine the extent of an injury. Furthermore, the patches can be used for training and recovery exercises to evaluate the progression of recovery.
The patches can also be used to track the fine movement of the arms, wrist joints, and fingers, to aid in surgical operations. A surgeon can wear patches instead of instrumented gloves. The surgeon can then grip medical equipment directly and manipulate objects. The motion of the patches can be recorded and used to manipulate a robotic surgical arm. In cases where internal surgery needs to be performed, a model of the surgical area can be created in virtual reality. As the surgeon manipulates objects and performs surgery in a virtual reality environment, a robotic surgeon can perform surgery on an actual patient by recreating the movements of the surgeon. Additionally, the patches can be used to train a surgeon in a virtual reality environment. The surgeon can practice operating on a virtual reality patient. The system can provide feedback to the surgeon as he or she performs the surgical procedure. This can provide the surgeon with training and an impartial method for evaluating a surgeon's skills.
The patches can be used with systems for use in training athletes. For example, a golfer can use a system to improve his or her golfing technique. A patch can be placed at various positions on the golfer. A patch can also be placed on the club that the golfer actually plays with. The golfer can then use the system to evaluate his or her swing, which takes into consideration the actual club the golfer plays with. The system can then provide instruction to the user on how to improve his or her swing based on the virtual performance of the golfer as measured by the system.
The invention described herein can be used with gaming systems. The patches can be used with games where the user plays the game from a stationary position. The user's arm movements or leg movements need to be tracked in order to provide feedback in games involving fighting, dancing, or playing sports. The patches, worn at unobtrusive positions on the body, can be used to track movement. This is unlike current systems, where motion detectors, such as accelerometers, are incorporated into joysticks or game interface devices. The wearable patches are light-weight, thin, and low-powered. The patches can be worn at multiple locations. The game can detect the number of patches being used. The user can then undergo a calibration sequence with the system in order for the system to learn the location of the patches and the player's mobility. The calibration sequence can consist of a series of motions, such as moving the limbs around, jumping, or performing any other suitable task for calibrating the system.
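One way the calibration sequence might be realized is sketched below; the patch identifiers, motion names, and scalar position readings are assumptions for illustration:

```python
def calibrate(patch_samples, motions=("raise_arms", "jump")):
    """Illustrative calibration pass: for each detected patch, record the
    extremes observed while the player performs each requested motion,
    yielding a per-patch mobility profile.

    `patch_samples` maps patch id -> motion name -> position readings."""
    profile = {}
    for patch_id, per_motion in patch_samples.items():
        ranges = {}
        for motion in motions:
            readings = per_motion.get(motion, [])
            if readings:
                # Range of motion observed for this patch during this task.
                ranges[motion] = (min(readings), max(readings))
        profile[patch_id] = ranges
    return profile

profile = calibrate({
    "left_wrist": {"raise_arms": [0.1, 0.9, 1.4], "jump": [0.0, 0.3]},
})
```

The number of patches falls out of the profile itself (`len(profile)`), matching the system's ability to detect how many patches are being used.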
The patch can function to serve as a user-interface between a user and an object. The patch allows the user to manipulate a data processor or control unit so that the control unit can produce the effects of the user's manipulation. Such a user-interface system can be used to control an object. For example purposes only, a patch can be placed on a motor-impaired person's head. The patch can comprise at least one EEG sensor. The EEG sensor can detect electrical activity from the user's brain, and this information can be sent to a control unit in communication with a wheelchair. The user can think of directions in which he or she wishes to travel. The patch can pick up these directions and the chair can then be controlled by the control unit to move in the desired directions.
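A highly simplified decoder for the wheelchair example might select the command whose EEG channel shows the strongest activity; the channel names and the one-channel-per-direction mapping are illustrative assumptions, not a claim about real EEG decoding:

```python
def eeg_to_command(channel_power):
    """Map the strongest EEG channel reading to a wheelchair command;
    default to 'stop' when no known channel dominates."""
    commands = {"left_ch": "turn_left",
                "right_ch": "turn_right",
                "forward_ch": "forward"}
    strongest = max(channel_power, key=channel_power.get)
    return commands.get(strongest, "stop")
```

Defaulting to "stop" on unrecognized input is a deliberately conservative choice for a mobility device.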
While preferred embodiments of the present invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions will now occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.
This application claims the benefit of U.S. Provisional Application No. 60/956,806, filed Aug. 20, 2007, which application is incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
4313443 | Frosch et al. | Feb 1982 | A |
4784162 | Ricks et al. | Nov 1988 | A |
5124128 | Hildenbrand et al. | Jun 1992 | A |
5231990 | Gauglitz | Aug 1993 | A |
5511553 | Segalozitz | Apr 1996 | A |
5717848 | Watanabe et al. | Feb 1998 | A |
5720770 | Nappholz et al. | Feb 1998 | A |
5913727 | Ahdoot | Jun 1999 | A |
5957854 | Besson et al. | Sep 1999 | A |
D439981 | Kasabach et al. | Apr 2001 | S |
6230970 | Walsh et al. | May 2001 | B1 |
6275143 | Stobbe | Aug 2001 | B1 |
6278499 | Darbee et al. | Aug 2001 | B1 |
6295461 | Palmer et al. | Sep 2001 | B1 |
D451604 | Kasabach et al. | Dec 2001 | S |
6336900 | Alleckson et al. | Jan 2002 | B1 |
D460971 | Sica et al. | Jul 2002 | S |
6436058 | Krahner et al. | Aug 2002 | B1 |
6454708 | Ferguson et al. | Sep 2002 | B1 |
6463039 | Ricci et al. | Oct 2002 | B1 |
6494829 | New et al. | Dec 2002 | B1 |
6527711 | Stivoric et al. | Mar 2003 | B1 |
6595929 | Stivoric et al. | Jul 2003 | B2 |
6605038 | Teller et al. | Aug 2003 | B1 |
6677852 | Landt | Jan 2004 | B1 |
6694180 | Boesen | Feb 2004 | B1 |
6731962 | Katarow et al. | May 2004 | B1 |
6885191 | Gleman | Apr 2005 | B1 |
6893396 | Schulze et al. | May 2005 | B2 |
6909420 | Nicolas et al. | Jun 2005 | B1 |
7020508 | Stivoric et al. | Mar 2006 | B2 |
7103578 | Beck et al. | Sep 2006 | B2 |
7125382 | Zhou et al. | Oct 2006 | B2 |
7206630 | Tarler | Apr 2007 | B1 |
7270633 | Goscha et al. | Sep 2007 | B1 |
7294105 | Islam | Nov 2007 | B1 |
7376234 | Gardiner | May 2008 | B1 |
7382247 | Welch et al. | Jun 2008 | B2 |
7571369 | Wang et al. | Aug 2009 | B2 |
7602301 | Stirling et al. | Oct 2009 | B1 |
7603255 | Case et al. | Oct 2009 | B2 |
7733224 | Tran | Jun 2010 | B2 |
7969307 | Peeters | Jun 2011 | B2 |
8611319 | Magar et al. | Dec 2013 | B2 |
8926509 | Magar et al. | Jan 2015 | B2 |
20010003163 | Bungert et al. | Jun 2001 | A1 |
20010047127 | New et al. | Nov 2001 | A1 |
20020065828 | Goodspeed | May 2002 | A1 |
20030004403 | Drinan et al. | Jan 2003 | A1 |
20030139903 | Zweig et al. | Jul 2003 | A1 |
20030219035 | Schmidt | Nov 2003 | A1 |
20030236103 | Tamaki et al. | Dec 2003 | A1 |
20040013097 | Massa | Jan 2004 | A1 |
20040077975 | Zimmerman | Apr 2004 | A1 |
20040199056 | Husemann et al. | Oct 2004 | A1 |
20040236192 | Necola Shehada et al. | Nov 2004 | A1 |
20050035852 | Paulsen | Feb 2005 | A1 |
20050090718 | Dodds | Apr 2005 | A1 |
20050101841 | Kaylor et al. | May 2005 | A9 |
20050113167 | Buchner et al. | May 2005 | A1 |
20050119533 | Sparks et al. | Jun 2005 | A1 |
20050197680 | Delmain et al. | Sep 2005 | A1 |
20050206518 | Welch et al. | Sep 2005 | A1 |
20050282633 | Nicolas et al. | Dec 2005 | A1 |
20060004303 | Weidenhaupt et al. | Jan 2006 | A1 |
20060025657 | Rosenfeld et al. | Feb 2006 | A1 |
20060031102 | Teller et al. | Feb 2006 | A1 |
20060103534 | Arms et al. | May 2006 | A1 |
20060122473 | Kill et al. | Jun 2006 | A1 |
20060122474 | Teller et al. | Jun 2006 | A1 |
20060173259 | Flaherty et al. | Aug 2006 | A1 |
20060264767 | Shennib | Nov 2006 | A1 |
20070027388 | Chou | Feb 2007 | A1 |
20070081505 | Roberts | Apr 2007 | A1 |
20070087780 | Nassimi | Apr 2007 | A1 |
20070100219 | Sweitzer et al. | May 2007 | A1 |
20070135866 | Baker et al. | Jun 2007 | A1 |
20070208233 | Kovacs | Sep 2007 | A1 |
20070208262 | Kovacs | Sep 2007 | A1 |
20070232234 | Inzerillo et al. | Oct 2007 | A1 |
20070244383 | Talbot et al. | Oct 2007 | A1 |
20070279217 | Venkatraman et al. | Dec 2007 | A1 |
20070282218 | Yarden | Dec 2007 | A1 |
20080001735 | Tran | Jan 2008 | A1 |
20080054880 | Miyauchi et al. | Mar 2008 | A1 |
20080065877 | Son et al. | Mar 2008 | A1 |
20080119707 | Stafford | May 2008 | A1 |
20080139894 | Szydlo-Moore et al. | Jun 2008 | A1 |
20080252596 | Bell et al. | Oct 2008 | A1 |
20090037670 | Rofougaran | Feb 2009 | A1 |
20090054737 | Magar et al. | Feb 2009 | A1 |
20090316618 | Fielding et al. | Dec 2009 | A1 |
20100013607 | Sabo et al. | Jan 2010 | A1 |
20100049006 | Magar et al. | Feb 2010 | A1 |
20100160746 | Venkatraman et al. | Jun 2010 | A1 |
20100316043 | Doi et al. | Dec 2010 | A1 |
20110019595 | Magar et al. | Jan 2011 | A1 |
20110019824 | Sattiraju et al. | Jan 2011 | A1 |
20120256492 | Song et al. | Oct 2012 | A1 |
20140091947 | Magar et al. | Apr 2014 | A1 |
Number | Date | Country |
---|---|---|
1070479 | Jan 2001 | EP |
1292218 | Apr 2006 | EP |
2420628 | May 2006 | GB |
2006055530 | Mar 2006 | JP |
10-2004-0032451 | Apr 2004 | KR |
10-2004-0074056 | Aug 2004 | KR |
2005-0072558 | Jul 2005 | KR |
1020050116274 | Dec 2006 | KR |
10-2007-0048168 | May 2007 | KR |
WO 8902682 | Mar 1989 | WO |
WO 8904093 | May 1989 | WO |
WO 8904578 | May 1989 | WO |
WO 9810617 | Mar 1998 | WO |
WO 0225773 | Mar 2002 | WO |
WO 02064032 | Aug 2002 | WO |
WO 02064032 | Feb 2003 | WO |
WO 03015005 | Feb 2003 | WO |
WO 03015838 | Feb 2003 | WO |
WO 03015005 | Dec 2003 | WO |
WO 2004002301 | Jan 2004 | WO |
WO 03015838 | Apr 2004 | WO |
WO 2004002301 | Apr 2004 | WO |
WO 03015838 | May 2004 | WO |
WO 2004084720 | Oct 2004 | WO |
WO 2004084720 | Mar 2005 | WO |
WO 2005029242 | Mar 2005 | WO |
WO 2005029242 | Jun 2005 | WO |
WO 2006094513 | Sep 2006 | WO |
WO 2006094513 | Apr 2007 | WO |
WO 2008035151 | Mar 2008 | WO |
WO 2008097316 | Aug 2008 | WO |
WO 2008035151 | Dec 2008 | WO |
Entry |
---|
Berrou, et al. Near Shannon limit error-correcting coding and decoding: Turbo-codes. 1. IEEE Int. Conf. Commun., vol. 2, Geneva, Switzerland, May 1993, p. 1064-1070. |
International Search Report dated Nov. 19, 2007 for PCT application No. 2007/062772. |
Vucetic, et al. Turbo Codes: Principles and Applications. The Kluwer International Series in Engineering and Computer Science). Kluwer Academic Publishers, 2000. (Table of Contents pages only) (8 pages). |
UK combined search and examination report dated Sep. 12, 2011 for Application No. GB0815326.4. |
International Search Report and written opinion dated Mar. 19, 2009 for PCT application No. 2008/073739. |
International search report and written opinion dated Nov. 19, 2007 for PCT application No. 2007/062772. |
International search report and written opinion dated Jan. 22, 2009 for PCT application No. 2008/080716. |
International search report and written opinion dated Feb. 24, 2009 for PCT application No. 2008/073591. |
International search report and written opinion dated Apr. 24, 2009 for PCT application No. 2008/081010. |
Montemont, et al. Experimental comparison of discrete and CMOS charge sensitive preamplifiers for CZT radiation detectors IEEE Transactions on Nuclear Science. 2002; 50(4):936-941. |
European search report dated Apr. 5, 2012 for EP Application No. 08841472.7. |
UK combined search and examination report dated Jun. 26, 2012 for Application No. GB 1210339.6. |
UK combined search and examination report dated Jun. 27, 2012 for Application No. GB 1210351.1. |
Office action dated Feb. 13, 2013 for U.S. Appl. No. 12/739,519. |
Office action dated Mar. 29, 2012 for U.S. Appl. No. 12/739,549. |
Office action dated Apr. 3, 2012 for U.S. Appl. No. 12/739,519. |
Office action dated May 2, 2011 for U.S. Appl. No. 12/134,151. |
Office action dated Aug. 7, 2009 for U.S. Appl. No. 11/756,161. |
Office action dated Oct. 5, 2012 for U.S. Appl. No. 12/739,549. |
Office action dated Dec. 19, 2011 for U.S. Appl. No. 12/134,151. |
Office action dated Jul. 9, 2013 for U.S. Appl. No. 12/096,195. |
Office action dated Apr. 4, 2013 for U.S. Appl. No. 12/702,127. |
U.S. Appl. No. 14/099,842, filed Dec. 6, 2013, Magar et al. |
U.S. Appl. No. 14/537,736, filed Nov. 10, 2014, Magar et al. |
Office action dated Nov. 6, 2014 for U.S. Appl. No. 14/099,842. |
Notice of allowance dated Dec. 3, 2014 for U.S. Appl. No. 12/134,151. |
Office action dated Feb. 27, 2014 for U.S. Appl. No. 12/134,151. |
Notice of allowance dated Oct. 2, 2014 for U.S. Appl. No. 12/134,151. |
European search report and search opinion dated Apr. 16, 2014 for EP Application No. 07757453.1. |
Office action dated May 22, 2014 for U.S. Appl. No. 12/702,127. |
Office action dated Jun. 19, 2014 for U.S. Appl. No. 12/096,195. |
Office action dated Jul. 8, 2014 for U.S. Appl. No. 12/739,549. |
Office action dated Mar. 5, 2015 for U.S. Appl. No. 12/739,549. |
Number | Date | Country | |
---|---|---|---|
20090051544 A1 | Feb 2009 | US |
Number | Date | Country | |
---|---|---|---|
60956806 | Aug 2007 | US |