Electronics embedded in garments are becoming increasingly common. Such electronics often need connectivity to external devices for power and/or data transmission. It can be difficult, however, to integrate bulky electronic components (e.g., batteries, microprocessors, wireless units, and sensors) into wearable garments, such as a shirt, coat, or pair of pants. Furthermore, connecting such electronic components to a garment may cause issues with durability, since garments are often washed.
This document describes an interactive object with multiple electronics modules. An interactive object (e.g., a garment) includes a grid or array of conductive thread woven into the interactive object, and an internal electronics module coupled to the grid of conductive thread. The internal electronics module includes a first subset of electronic components, such as sensing circuitry configured to detect touch-input to the grid of conductive thread. An external electronics module that includes a second subset of electronic components (e.g., a microprocessor, power source, or network interface) is removably coupled to the interactive object via a communication interface. The communication interface enables communication between the internal electronics module and the external electronics module when the external electronics module is coupled to the interactive object.
This summary is provided to introduce simplified concepts concerning an interactive object with multiple electronics modules, which is further described below in the Detailed Description. This summary is not intended to identify essential features of the claimed subject matter, nor is it intended for use in determining the scope of the claimed subject matter.
Embodiments of an interactive object with multiple electronics modules are described with reference to the following drawings. The same numbers are used throughout the drawings to reference like features and components:
Overview
Electronics embedded in garments and other flexible objects (e.g., blankets, handbags, and hats) are becoming increasingly common. Such electronics often need connectivity to external devices for power and/or data transmission. It can be difficult, however, to integrate bulky electronic components (e.g., batteries, microprocessors, wireless units, and sensors) into wearable garments, such as a shirt, coat, or pair of pants. Furthermore, connecting such electronic components to a garment may cause issues with durability, since garments are often washed. Some electronic components, such as sensing circuitry, are nevertheless better suited to being positioned within the garment.
An interactive object with multiple electronics modules is described. An interactive object (e.g., a garment) includes at least an internal electronics module containing a first subset of electronic components for the interactive object, and an external electronics module containing a second subset of electronic components for the interactive object. As described herein, the internal electronics module may be physically and permanently coupled to the interactive object, whereas the external electronics module may be removably coupled to the interactive object. Thus, instead of integrating all of the electronics within the interactive object, at least some of the electronics are placed in the external electronics module.
In one or more implementations, the interactive object includes an interactive textile with conductive threads woven into the textile to form a flexible touch pad. The internal electronics module contains sensing circuitry that is directly coupled to the conductive threads to enable the detection of touch-input to the interactive textile. The external electronics module contains electronic components that are needed to process and communicate the touch-input data, such as a microprocessor, a power source, a network interface, and so forth.
The interactive object further includes a communication interface configured to enable communication between the internal electronics module and the external electronics module. In some implementations, the communication interface may be implemented as a connector that connects the electronic components in the external electronics module to the electronic components in the internal electronics module to enable the transfer of power and data between the modules. The connector may include a connector plug and a connector receptacle. For example, the connector plug may be implemented at the external electronics module and is configured to connect to the connector receptacle, which may be implemented at the interactive object.
Thus, while the electronic components are separated into multiple different modules, the communication interface enables the system to function as a single unit. For example, the power source contained within the external electronics module may transfer power, via the communication interface, to the sensing circuitry of the internal electronics module to enable the sensing circuitry to detect touch-input to the conductive thread. When touch-input is detected by the sensing circuitry of the internal electronics module, data representative of the touch-input may be communicated, via the communication interface, to the microprocessor contained within the external electronics module. The microprocessor may then analyze the touch-input data to generate one or more control signals, which may then be communicated to a remote computing device (e.g., a smart phone) via the network interface to cause the computing device to initiate a particular functionality.
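By way of illustration only, the following Python sketch models the flow just described: the internal module senses touch, touch data crosses the communication interface to the external module, and the external module generates a control signal for a remote device. The class and method names are hypothetical and are not defined by this document.

```python
# Hypothetical sketch of the internal/external module data flow described above.
# All names are illustrative; the document does not define a software API.

class InternalElectronicsModule:
    """Holds the sensing circuitry coupled to the conductive-thread grid."""
    def detect_touch(self, capacitance_deltas):
        # Report any grid crossing whose capacitance changed beyond a threshold.
        return [(x, y) for (x, y), delta in capacitance_deltas.items() if abs(delta) > 0.5]

class ExternalElectronicsModule:
    """Holds the microprocessor, power source, and network interface."""
    def process_touch_data(self, touch_points):
        # The microprocessor turns raw touch points into a control signal.
        if not touch_points:
            return None
        return {"command": "toggle", "points": touch_points}

    def send_to_remote_device(self, control_signal):
        # Stand-in for the network interface (e.g., Bluetooth or Wi-Fi) transfer.
        print("Sending to remote computing device:", control_signal)

# Communication interface: power and data pass between the two modules.
internal = InternalElectronicsModule()
external = ExternalElectronicsModule()
deltas = {(1, 2): 0.8, (3, 4): 0.1}           # simulated capacitance changes
touches = internal.detect_touch(deltas)        # internal module senses touch
signal = external.process_touch_data(touches)  # external module analyzes it
if signal:
    external.send_to_remote_device(signal)     # and forwards a control signal
```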
Separating the electronics of the interactive object into multiple different modules provides a variety of different benefits. For example, the system design enables interoperability and customization because the external electronics module can be detached from the interactive object and then attached to a different interactive object to carry over some of the functions and properties, such as user-specific settings. Additionally, by separating the garment-embedded electronics from the external electronics module, users, designers, and companies can design external electronics modules with form factors, mechanical and material properties, and surface finishes that are specific to the application or the user. For example, a leather jacket might have a leather external electronics module in the form of a strap that matches a certain jacket style, or one with a flexible form factor that would be hard to achieve inside a garment.
Furthermore, separating the electronics enables broken parts to be easily replaced or serviced without the need to access the entire interactive object. For example, the external electronics module can be shipped to a repair service, or a new external electronics module can be purchased without the need to purchase a new interactive object. In addition, separating the electronic components into internal and external modules ensures that parts such as batteries are not exposed to the washing cycles that a typical garment goes through. For example, the external electronics module, which may include the battery, can easily be removed from the interactive object before washing the interactive object. Furthermore, by separating the parts, the manufacturing challenges are significantly simplified, and certification processes (such as FCC certification for RF transmission units) can be handled for just the part in question, thereby reducing the complexity.
In environment 100, interactive objects 104 include “flexible” objects, such as a shirt 104-1, a hat 104-2, and a handbag 104-3. It is to be noted, however, that interactive textile 102 may be integrated within any type of flexible object made from fabric or a similar flexible material, such as garments or articles of clothing, blankets, shower curtains, towels, sheets, bed spreads, or fabric casings of furniture, to name just a few. Interactive textile 102 may be integrated within flexible objects 104 in a variety of different ways, including weaving, sewing, gluing, and so forth.
In this example, objects 104 further include “hard” objects, such as a plastic cup 104-4 and a hard smart phone casing 104-5. It is to be noted, however, that hard objects 104 may include any type of “hard” or “rigid” object made from non-flexible or semi-flexible materials, such as plastic, metal, aluminum, and so on. For example, hard objects 104 may also include plastic chairs, water bottles, plastic balls, or car parts, to name just a few. Interactive textile 102 may be integrated within hard objects 104 using a variety of different manufacturing processes. In one or more implementations, injection molding is used to integrate interactive textiles 102 into hard objects 104.
Interactive textile 102 enables a user to control object 104 that the interactive textile 102 is integrated with, or to control a variety of other computing devices 106 via a network 108. Computing devices 106 are illustrated with various non-limiting example devices: server 106-1, smart phone 106-2, laptop 106-3, computing spectacles 106-4, television 106-5, camera 106-6, tablet 106-7, desktop 106-8, and smart watch 106-9, though other devices may also be used, such as home automation and control systems, sound or entertainment systems, home appliances, security systems, netbooks, and e-readers. Note that computing device 106 can be wearable (e.g., computing spectacles and smart watches), non-wearable but mobile (e.g., laptops and tablets), or relatively immobile (e.g., desktops and servers).
Network 108 includes one or more of many types of wireless or partly wireless communication networks, such as a local-area-network (LAN), a wireless local-area-network (WLAN), a personal-area-network (PAN), a wide-area-network (WAN), an intranet, the Internet, a peer-to-peer network, point-to-point network, a mesh network, and so forth.
Interactive textile 102 can interact with computing devices 106 by transmitting touch data through network 108. Computing device 106 uses the touch data to control computing device 106 or applications at computing device 106. As an example, consider that interactive textile 102 integrated at shirt 104-1 may be configured to control the user's smart phone 106-2 in the user's pocket, television 106-5 in the user's home, smart watch 106-9 on the user's wrist, or various other appliances in the user's house, such as thermostats, lights, music, and so forth. For example, the user may be able to swipe up or down on interactive textile 102 integrated within the user's shirt 104-1 to cause the volume on television 106-5 to go up or down, to cause the temperature controlled by a thermostat in the user's house to increase or decrease, or to turn on and off lights in the user's house. Note that any type of touch, tap, swipe, hold, or stroke gesture may be recognized by interactive textile 102.
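The following short Python sketch illustrates, purely by way of example, how recognized gestures might be mapped to the device commands described above (e.g., a swipe up or down adjusting television volume). The gesture names, device names, and mapping structure are assumptions, not part of this document.

```python
# Hypothetical mapping from recognized gestures to device commands, in the
# spirit of the swipe-to-control examples above. Names are illustrative only.

GESTURE_ACTIONS = {
    ("swipe_up", "television"):   lambda: "volume +1",
    ("swipe_down", "television"): lambda: "volume -1",
    ("swipe_up", "thermostat"):   lambda: "temperature +1",
    ("swipe_down", "thermostat"): lambda: "temperature -1",
    ("double_tap", "lights"):     lambda: "toggle lights",
}

def dispatch(gesture, target_device):
    # Look up the (gesture, device) pair and run the associated action.
    action = GESTURE_ACTIONS.get((gesture, target_device))
    return action() if action else "unrecognized gesture/device pair"

print(dispatch("swipe_up", "television"))   # volume +1
print(dispatch("double_tap", "lights"))     # toggle lights
```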
In more detail, consider
Interactive textile 102 is configured to sense multi-touch-input from a user when one or more fingers of the user's hand touch interactive textile 102. Interactive textile 102 may also be configured to sense full-hand touch-input from a user, such as when an entire hand of the user touches or swipes interactive textile 102. To enable the detection of touch-input, interactive textile 102 includes conductive threads 202, which are woven into interactive textile 102 (e.g., in a grid or array pattern). Notably, the conductive threads 202 do not alter the flexibility of interactive textile 102, which enables interactive textile 102 to be easily integrated within interactive objects 104.
Interactive object 104 includes an internal electronics module 204 that is embedded within interactive object 104 and is directly coupled to conductive threads 202. Internal electronics module 204 can be communicatively coupled to an external electronics module 206 via a communication interface 208. Internal electronics module 204 contains a first subset of electronic components for the interactive object 104, and external electronics module 206 contains a second, different, subset of electronics components for the interactive object 104. As described herein, the internal electronics module 204 may be physically and permanently embedded within interactive object 104, whereas the external electronics module 206 may be removably coupled to interactive object 104.
In system 200, the electronic components contained within the internal electronics module 204 include sensing circuitry 210 that is coupled to conductive thread 202 that is woven into interactive textile 102. For example, wires from the conductive threads 202 may be connected to sensing circuitry 210 using a flexible PCB, crimping, gluing with conductive glue, soldering, and so forth. Sensing circuitry 210 is configured to detect the location of the touch-input on conductive thread 202, as well as motion of the touch-input. For example, when an object, such as a user's finger, touches conductive thread 202, the position of the touch can be determined by sensing circuitry 210 by detecting a change in capacitance on the grid or array of conductive thread 202. The touch-input may then be used to generate touch data usable to control computing device 106. For example, the touch-input can be used to determine various gestures, such as single-finger touches (e.g., touches, taps, and holds), multi-finger touches (e.g., two-finger touches, two-finger taps, two-finger holds, and pinches), single-finger and multi-finger swipes (e.g., swipe up, swipe down, swipe left, swipe right), and full-hand interactions (e.g., touching the textile with a user's entire hand, covering the textile with the user's entire hand, pressing the textile with the user's entire hand, palm touches, and rolling, twisting, or rotating the user's hand while touching the textile).
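As a purely illustrative sketch, the Python snippet below groups raw touch points from the conductive-thread grid into the broad gesture families listed above (single-finger, multi-finger, and full-hand). The function name, grid size, and thresholds are assumptions introduced only for this example.

```python
# Illustrative sketch (not from the document) of classifying raw touch points
# from the conductive-thread grid into the broad gesture families listed above.

def classify_touch(points, grid_size=(8, 8), full_hand_fraction=0.5):
    """points: list of (x, y) grid crossings whose capacitance changed."""
    total_cells = grid_size[0] * grid_size[1]
    if not points:
        return "no touch"
    if len(points) >= full_hand_fraction * total_cells:
        return "full-hand interaction"      # e.g., covering the textile
    if len(points) == 1:
        return "single-finger touch"
    return "multi-finger touch"             # pinches, two-finger taps, etc.

print(classify_touch([(2, 3)]))                                       # single-finger touch
print(classify_touch([(2, 3), (5, 6)]))                               # multi-finger touch
print(classify_touch([(x, y) for x in range(8) for y in range(8)]))   # full-hand interaction
```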
Communication interface 208 enables the transfer of power and data (e.g., the touch-input detected by sensing circuitry 210) between the internal electronics module 204 and the external electronics module 206. In some implementations, communication interface 208 may be implemented as a connector that includes a connector plug and a connector receptacle. The connector plug may be implemented at the external electronics module 206 and is configured to connect to the connector receptacle, which may be implemented at the interactive object 104. Example connectors are discussed in more detail below with regards to
In system 200, the external electronics module 206 includes a microprocessor 212, power source 214, and network interface 216. Power source 214 may be coupled, via communication interface 208, to sensing circuitry 210 to provide power to sensing circuitry 210 to enable the detection of touch-input, and may be implemented as a small battery. When touch-input is detected by sensing circuitry 210 of the internal electronics module 204, data representative of the touch-input may be communicated, via communication interface 208, to microprocessor 212 of the external electronics module 206. Microprocessor 212 may then analyze the touch-input data to generate one or more control signals, which may then be communicated to computing device 106 (e.g., a smart phone) via the network interface 216 to cause the computing device 106 to initiate a particular functionality. Generally, network interfaces 216 are configured to communicate data, such as touch data, over wired, wireless, or optical networks to computing devices 106. By way of example and not limitation, network interfaces 216 may communicate data over a local-area-network (LAN), a wireless local-area-network (WLAN), a personal-area-network (PAN) (e.g., Bluetooth™), a wide-area-network (WAN), an intranet, the Internet, a peer-to-peer network, point-to-point network, a mesh network, and the like (e.g., through network 108 of
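The sketch below shows, under assumed field names and encoding, one way touch data might be packaged before being handed to a network interface for transmission to a remote computing device. It is an illustration only; the document does not specify a payload format.

```python
# Hypothetical example of packaging touch data for transmission over the
# network interface to a remote computing device. Field names are assumptions.

import json
import time

def package_touch_data(touch_points, device_id="jacket-01"):
    """Serialize detected touch points into a compact payload (e.g., for BLE or Wi-Fi)."""
    payload = {
        "device": device_id,
        "timestamp": time.time(),
        "touches": [{"x": x, "y": y} for (x, y) in touch_points],
    }
    return json.dumps(payload).encode("utf-8")

packet = package_touch_data([(1, 2), (3, 4)])
print(packet)  # bytes ready to hand to a Bluetooth or Wi-Fi stack
```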
While internal electronics module 204 and external electronics module 206 are illustrated and described as including specific electronic components, it is to be appreciated that these modules may be configured in a variety of different ways. For example, in some cases, electronic components described as being contained within internal electronics module 204 may be at least partially implemented at the external electronics module 206, and vice versa. Furthermore, internal electronics module 204 and external electronics module 206 may include electronic components other than those illustrated in
At 304, a zoomed-in view of conductive thread 202 is illustrated. Conductive thread 202 includes a conductive wire 306 that is twisted, braided, or wrapped with a flexible thread 308. Twisting conductive wire 306 with flexible thread 308 causes conductive thread 202 to be flexible and stretchy, which enables conductive thread 202 to be easily woven with non-conductive threads 302 to form interactive textile 102.
In one or more implementations, conductive wire 306 is a thin copper wire. It is to be noted, however, that conductive wire 306 may also be implemented using other materials, such as silver, gold, or other materials coated with a conductive polymer. Flexible thread 308 may be implemented as any type of flexible thread or fiber, such as cotton, wool, silk, nylon, polyester, and so forth.
Interactive textile 102 can be formed cheaply and efficiently using any conventional weaving process (e.g., jacquard weaving or 3D-weaving), which involves interlacing a set of longer threads (called the warp) with a set of crossing threads (called the weft). Weaving may be implemented on a frame or machine known as a loom, of which there are a number of types. Thus, a loom can weave non-conductive threads 302 with conductive threads 202 to create interactive textile 102.
In example 300, conductive thread 202 is woven into interactive textile 102 to form a grid that includes a first set of substantially parallel conductive threads 202 and a second set of substantially parallel conductive threads 202 that crosses the first set of conductive threads to form the grid. In this example, the first set of conductive threads 202 are oriented horizontally and the second set of conductive threads 202 are oriented vertically, such that the first set of conductive threads 202 are positioned substantially orthogonal to the second set of conductive threads 202. It is to be appreciated, however, that conductive threads 202 may be oriented such that crossing conductive threads 202 are not orthogonal to each other. For example, in some cases crossing conductive threads 202 may form a diamond-shaped grid. While conductive threads 202 are illustrated as being spaced out from each other in
In example 300, sensing circuitry 210 is shown as being integrated within object 104, and is directly connected to conductive threads 202. During operation, sensing circuitry 210 can determine positions of touch-input on the grid of conductive thread 202 using self-capacitance sensing or projective capacitive sensing.
For example, when configured as a self-capacitance sensor, sensing circuitry 210 charges crossing conductive threads 202 (e.g., horizontal and vertical conductive threads) by applying a control signal (e.g., a sine signal) to each conductive thread 202. When an object, such as the user's finger, touches the grid of conductive thread 202, the conductive threads 202 that are touched are grounded, which changes the capacitance (e.g., increases or decreases the capacitance) on the touched conductive threads 202.
Sensing circuitry 210 uses the change in capacitance to identify the presence of the object. To do so, sensing circuitry 210 detects a position of the touch-input by detecting which horizontal conductive thread 202 is touched, and which vertical conductive thread 202 is touched by detecting changes in capacitance of each respective conductive thread 202. Sensing circuitry 210 uses the intersection of the crossing conductive threads 202 that are touched to determine the position of the touch-input on the grid of conductive threads 202. For example, sensing circuitry 210 can determine touch data by determining the position of each touch as X,Y coordinates on the grid of conductive thread 202.
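A minimal sketch of this self-capacitance position logic follows, assuming simulated baseline and measured capacitance values; the function names and threshold are hypothetical. It finds which horizontal and vertical threads changed and reports their intersections as X,Y coordinates.

```python
# Assumed sketch of self-capacitance position detection: find which horizontal
# and vertical threads changed capacitance, then intersect them as X,Y points.

def touched_indices(baseline, measured, threshold=0.5):
    """Return indices of threads whose capacitance changed beyond threshold."""
    return [i for i, (b, m) in enumerate(zip(baseline, measured)) if abs(m - b) > threshold]

def self_capacitance_positions(h_base, h_meas, v_base, v_meas):
    rows = touched_indices(h_base, h_meas)     # touched horizontal threads (Y)
    cols = touched_indices(v_base, v_meas)     # touched vertical threads (X)
    # Every row/column pairing is a candidate touch position.
    return [(x, y) for x in cols for y in rows]

h_base, v_base = [1.0] * 4, [1.0] * 4
h_meas = [1.0, 1.8, 1.0, 1.0]                  # thread Y=1 touched
v_meas = [1.0, 1.0, 1.9, 1.0]                  # thread X=2 touched
print(self_capacitance_positions(h_base, h_meas, v_base, v_meas))  # [(2, 1)]
```

Note that with two simultaneous touches this pairing step yields four candidate intersections, which is exactly the "ghosting" behavior described next.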
When implemented as a self-capacitance sensor, "ghosting" may occur when multi-touch-input is received. Consider, for example, that a user touches the grid of conductive thread 202 with two fingers. When this occurs, sensing circuitry 210 determines X and Y coordinates for each of the two touches. However, sensing circuitry 210 may be unable to determine how to match each X coordinate to its corresponding Y coordinate. For example, if a first touch has the coordinates X1,Y1 and a second touch has the coordinates X4,Y4, sensing circuitry 210 may also detect "ghost" coordinates X1,Y4 and X4,Y1.
In one or more implementations, sensing circuitry 210 is configured to detect "areas" of touch-input corresponding to two or more touch-input points on the grid of conductive thread 202. Conductive threads 202 may be woven closely together such that when an object touches the grid of conductive thread 202, the capacitance will be changed for multiple horizontal conductive threads 202 and/or multiple vertical conductive threads 202. For example, a single touch with a single finger may generate the coordinates X1,Y1 and X2,Y1. Thus, sensing circuitry 210 may be configured to detect touch-input only if the capacitance is changed for multiple horizontal conductive threads 202 and/or multiple vertical conductive threads 202. Note that this removes the effect of ghosting because sensing circuitry 210 will not detect touch-input if two single-point touches are detected which are spaced apart.
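Purely as an illustration of this area-based approach, the sketch below accepts a touch only when several adjacent threads change together, which filters out isolated "ghost" intersections. The grouping logic and the minimum-thread parameter are assumptions made for this example.

```python
# Illustrative sketch of the area-based detection described above: a touch is
# accepted only when adjacent threads change capacitance together, which
# filters out isolated "ghost" intersections. Assumed logic, not the patent's.

def contiguous_groups(indices):
    """Group sorted thread indices into runs of adjacent indices."""
    groups, current = [], []
    for i in sorted(indices):
        if current and i != current[-1] + 1:
            groups.append(current)
            current = []
        current.append(i)
    if current:
        groups.append(current)
    return groups

def area_touches(touched_rows, touched_cols, min_threads=2):
    row_groups = contiguous_groups(touched_rows)
    col_groups = contiguous_groups(touched_cols)
    touches = []
    for cols in col_groups:
        for rows in row_groups:
            # Accept only areas spanning multiple adjacent threads on some axis.
            if len(cols) >= min_threads or len(rows) >= min_threads:
                touches.append((cols, rows))
    return touches

# A finger spanning threads X1-X2 on row Y1 is reported as one touch area;
# two isolated single-thread changes far apart produce no touch.
print(area_touches(touched_rows=[1], touched_cols=[1, 2]))     # [([1, 2], [1])]
print(area_touches(touched_rows=[1, 4], touched_cols=[1, 4]))  # [] -> rejected
```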
Alternately, when implemented as a projective capacitance sensor, sensing circuitry 210 charges a single set of conductive threads 202 (e.g., horizontal conductive threads 202) by applying a control signal (e.g., a sine signal) to the single set of conductive threads 202. Then, sensing circuitry 210 senses changes in capacitance in the other set of conductive threads 202 (e.g., vertical conductive threads 202).
In this implementation, vertical conductive threads 202 are not charged and thus act as a virtual ground. However, when horizontal conductive threads 202 are charged, the horizontal conductive threads capacitively couple to vertical conductive threads 202. Thus, when an object, such as the user's finger, touches the grid of conductive thread 202, the capacitance changes on the vertical conductive threads (e.g., increases or decreases). Sensing circuitry 210 uses the change in capacitance on vertical conductive threads 202 to identify the presence of the object. To do so, sensing circuitry 210 detects a position of the touch-input by scanning vertical conductive threads 202 to detect changes in capacitance. Sensing circuitry 210 determines the position of the touch-input as the intersection point between the vertical conductive thread 202 with the changed capacitance, and the horizontal conductive thread 202 on which the control signal was transmitted. For example, sensing circuitry 210 can determine touch data by determining the position of each touch as X,Y coordinates on the grid of conductive thread 202.
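An assumed sketch of the projective-capacitance scan follows: one horizontal thread is driven at a time while every vertical thread is read, and the intersections whose coupling changed are reported directly as X,Y positions. The matrix values, function name, and threshold are simulated for illustration.

```python
# Assumed sketch of projective-capacitance scanning: drive one horizontal
# thread at a time, read every vertical thread, and report the intersections
# where the sensed coupling changed. Matrix values are simulated.

def scan_projective(mutual_baseline, mutual_measured, threshold=0.3):
    """mutual_*[row][col]: coupling between horizontal row and vertical column."""
    touches = []
    for row, (base_row, meas_row) in enumerate(zip(mutual_baseline, mutual_measured)):
        # The control signal is applied to this horizontal thread only...
        for col, (b, m) in enumerate(zip(base_row, meas_row)):
            # ...while changes are sensed on each vertical thread.
            if abs(m - b) > threshold:
                touches.append((col, row))   # (X, Y) intersection
    return touches

baseline = [[1.0] * 3 for _ in range(3)]
measured = [[1.0, 1.0, 1.0],
            [1.0, 0.4, 1.0],     # finger at X=1, Y=1 reduces coupling
            [1.0, 1.0, 1.0]]
print(scan_projective(baseline, measured))   # [(1, 1)]
```

Because each intersection is sensed individually in this scheme, two simultaneous touches resolve unambiguously, unlike the self-capacitance case described earlier.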
Whether implemented as a self-capacitance sensor or a projective capacitance sensor, the conductive thread 202 and sensing circuitry 210 are configured to communicate the touch data that is representative of the detected touch-input to external electronics module 206, which is removably coupled to interactive object 104 via communication interface 208. The microprocessor 212 may then cause communication of the touch data, via network interface 216, to computing device 106 to enable the device to determine gestures based on the touch data, which can be used to control object 104, computing device 106, or applications implemented at computing device 106.
The computing device 106 can be implemented to recognize a variety of different types of gestures, such as touches, taps, swipes, holds, and covers made to interactive textile 102. To recognize the various different types of gestures, the computing device can be configured to determine a duration of the touch, swipe, or hold (e.g., one second or two seconds), a number of the touches, swipes, or holds (e.g., a single tap, a double tap, or a triple tap), a number of fingers of the touch, swipe, or hold (e.g., a one-finger touch or swipe, a two-finger touch or swipe, or a three-finger touch or swipe), a frequency of the touch, and a dynamic direction of a touch or swipe (e.g., up, down, left, right). With regards to holds, the computing device 106 can also determine an area of the grid of conductive thread 202 that is being held (e.g., top, bottom, left, right, or top and bottom). Thus, the computing device 106 can recognize a variety of different types of holds, such as a cover, a cover and hold, a five finger hold, a five finger cover and hold, a three finger pinch and hold, and so forth.
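As a simple illustration of deriving the duration, finger-count, and direction attributes listed above from a stream of touch data, consider the following sketch. The frame representation, thresholds, and gesture labels are assumptions introduced only for this example.

```python
# Hypothetical sketch of gesture recognition from a sequence of touch frames,
# deriving the duration, finger count, and direction attributes listed above.

def recognize(frames, frame_period=0.02, tap_max_s=0.3, swipe_min_cells=2):
    """frames: list of per-sample lists of (x, y) touch points over time."""
    active = [f for f in frames if f]
    if not active:
        return "none"
    duration = len(active) * frame_period
    fingers = max(len(f) for f in active)
    dx = active[-1][0][0] - active[0][0][0]
    dy = active[-1][0][1] - active[0][0][1]
    if abs(dx) >= swipe_min_cells or abs(dy) >= swipe_min_cells:
        direction = ("right" if dx > 0 else "left") if abs(dx) > abs(dy) else \
                    ("down" if dy > 0 else "up")
        return f"{fingers}-finger swipe {direction}"
    if duration <= tap_max_s:
        return f"{fingers}-finger tap"
    return f"{fingers}-finger hold ({duration:.2f}s)"

tap = [[(2, 2)]] * 5
swipe = [[(1, 4)], [(1, 3)], [(1, 2)], [(1, 1)]]
print(recognize(tap))     # 1-finger tap
print(recognize(swipe))   # 1-finger swipe up
```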
In one or more implementations, communication interface 208 is implemented as a connector that is configured to connect external electronics module 206 to internal electronics module 204 of interactive object 104. Consider, for example,
As described above, interactive object 104 includes an internal electronics module 204 which includes various types of electronics, such as sensing circuitry 210, sensors (e.g., capacitive touch sensors woven into the garment, microphones, or accelerometers), output devices (e.g., LEDs, speakers, or micro-displays), electrical circuitry, and so forth.
External electronics module 206 includes various electronics that are configured to connect and/or interface with the electronics of internal electronics module 204. Generally, the electronics contained within external electronics module 206 are different than those contained within internal electronics module 204, and may include electronics such as microprocessor 212, power source 214 (e.g., a battery), network interface 216 (e.g., Bluetooth or WiFi), sensors (e.g., accelerometers, heart rate monitors, or pedometers), output devices (e.g., speakers, LEDs), and so forth.
In this example, external electronics module 206 is implemented as a strap that contains the various electronics. The strap, for example, can be formed from a material such as rubber, nylon, or any other type of fabric. Notably, however, external electronics module 206 may take any type of form. For example, rather than being a strap, external electronics module 206 could resemble a circular or square piece of material (e.g., rubber or nylon).
Connector 402 includes a connector plug 404 and a connector receptacle 406. In this example, connector plug 404 is positioned on external electronics module 206 and is configured to attach to connector receptacle 406, which is positioned on interactive object 104, to form an electronic connection between external electronics module 206 and interactive object 104. For example, in
In various implementations, connector plug 404 may resemble a snap or button, and is configured to connect or attach to connector receptacle 406 via a magnetic or mechanical coupling. For example, in some implementations magnets on connector plug 404 and connector receptacle 406 cause a magnetic connection to form between connector plug 404 and connector receptacle 406. Alternately, a mechanical connection between these two components may cause the components to form a mechanical coupling, such as by “snapping” together.
Connector 402 may be implemented in a variety of different ways. In one or more implementations, connector plug 404 includes an anisotropic conducting polymer which is configured to connect to circular pads of a printed circuit board (PCB) implemented at connector receptacle 406. In another implementation, connector plug 404 may include compliant polyurethane polymers to provide compliance to metal pads implemented at connector receptacle 406 to enable an electromagnetic connection. In another implementation, connector plug 404 and connector receptacle 406 may each include magnetically coupled coils which can be aligned to provide power and data transmission.
At 502, a top side of connector plug 404 is shown. In this case, the top side of connector plug 404 resembles a round, button-like structure. Notably the top side of connector plug 404 may be implemented with various different shapes (e.g., square or triangular). Further, in some cases the top side of connector plug 404 may resemble something other than a button or snap.
In this example, the top side of connector plug 404 includes tiny holes that enable light from light sources (e.g., LEDs) to shine through. Of course, other types of input or output units could also be positioned here, such as a microphone or a speaker.
At 504, a bottom side of connector plug 404 is shown. The bottom side of connector plug 404 includes an anisotropic conducting polymer 506 to enable electrical connections between the electronics of interactive object 104 and the electronics of external electronics module 206.
In more detail, consider
In this example, connector plug 404 of connector 402 includes a button cap 602, a printed circuit board (PCB) 604, anisotropic conducting polymer 606, a magnet 608, and a casing 610.
Button cap 602 resembles a typical button, and may be made from a variety of different materials, such as plastic, metal, and so forth. In this example, button cap 602 includes holes which enable light from LEDs to shine through.
PCB 604 is configured to electrically connect electronics of interactive object 104 to anisotropic conducting polymer 606. A top layer of PCB 604 may include the LEDs that shine through the holes in button cap 602. A bottom layer of PCB 604 includes contacts which electrically connect to anisotropic conducting polymer 606 positioned beneath PCB 604.
Anisotropic conducting polymer 606 includes a strip of anisotropic material that is configured to form a connection with connector receptacle 406. The anisotropic material may be any type of anisotropic conducting material.
Magnet 608 is configured to enable a magnetic connection to connector receptacle 406. The magnetic connection enables connector plug 404 to attach to connector receptacle 406 without the need to apply force to connect, which reduces the chance of the connection wearing down over time. Alternately, in one or more implementations, connector plug 404 may be implemented without magnet 608. For example, connector plug 404 could be implemented as physical or mechanical snap that snaps to connector receptacle 406. Casing 610 is configured to hold the components of connector plug 404, and can be implemented from a variety of different materials such as plastic, metal, and so forth.
In this example, connector receptacle 406 includes a receptacle PCB 612 which includes circular pads which are configured to connect to anisotropic conducting polymer 606. The bottom layer of receptacle PCB 612 includes connections to the electronics of interactive object 104.
Connector receptacle 406 may also include a metallic component 614 which is configured to generate a magnetic force with magnet 608 of connector plug 404 to form the magnetic connection between connector plug 404 and connector receptacle 406. Metallic component 614 may be implemented as any type of metal or alloy, or as another magnet, that can generate a magnetic force with magnet 608. Connector receptacle 406 may also include other components, such as a housing, a washer, and so forth.
Notably, anisotropic conducting polymer 606 has various properties that make it well-suited for a connector, including rotational tolerance, mechanical compliance, multi-pin electrical and power transmission, and being waterproof.
For instance, when connector plug 404 attaches to connector receptacle 406, an electrical connection is formed between anisotropic conducting polymer 606 and receptacle PCB 612. The anisotropic conducting polymer 606 provides rotational tolerance because the strip of anisotropic material can be rotated 360 degrees and maintain the same connection to the circular pads of receptacle PCB 612. This is beneficial because when wearing a garment, the strap of external electronics module 206 will naturally move around. Thus, the rotational tolerance enables the connector to be rotated without losing the connection between connector plug 404 and connector receptacle 406. Furthermore, the anisotropic conducting polymer 606 is elastomeric, which causes the strip of material to shrink and conform under mechanical force.
Anisotropic conducting polymer 606 provides multi-pin electrical transmission and power transfer simultaneously. For example, the anisotropic material causes conduction to occur in just one direction, which means that the conductive paths can operate completely independently, without interfering with each other. This enables multiple conducting channels, which makes it easy to isolate multiple data lines or power lines from each other using anisotropic conducting polymer 606 and the circular structure of receptacle PCB 612.
Additionally, anisotropic conducting polymer 606 is waterproof which prevents connector 402 from being damaged by water, such as when being worn in the rain or when being washed.
Connector 402 may be implemented in a variety of different ways. In one or more implementations, instead of using anisotropic conducting polymer 606, connector plug 404 may include compliant polyurethane polymers to provide compliance to metal pads implemented at connector receptacle 406 to enable an electromagnetic connection. In another implementation, connector plug 404 and connector receptacle 406 may each include magnetically coupled coils which can be aligned to provide power and data transmission between interactive object 104 and external electronics module 206.
Computing system 700 includes communication devices 702 that enable wired and/or wireless communication of device data 704 (e.g., received data, data that is being received, data scheduled for broadcast, data packets of the data, etc.). Device data 704 or other device content can include configuration settings of the device, media content stored on the device, and/or information associated with a user of the device. Media content stored on computing system 700 can include any type of audio, video, and/or image data. Computing system 700 includes one or more data inputs 706 via which any type of data, media content, and/or inputs can be received, such as human utterances, user-selectable inputs (explicit or implicit), messages, music, television media content, recorded video content, and any other type of audio, video, and/or image data received from any content and/or data source.
Computing system 700 also includes communication interfaces 708, which can be implemented as any one or more of a serial and/or parallel interface, a wireless interface, any type of network interface, a modem, and as any other type of communication interface. Communication interfaces 708 provide a connection and/or communication links between computing system 700 and a communication network by which other electronic, computing, and communication devices communicate data with computing system 700.
Computing system 700 includes one or more processors 710 (e.g., any of microprocessors, controllers, and the like), which process various computer-executable instructions to control the operation of computing system 700 and to enable techniques for, or in which can be embodied, interactive textiles. Alternatively or in addition, computing system 700 can be implemented with any one or combination of hardware, firmware, or fixed logic circuitry that is implemented in connection with processing and control circuits which are generally identified at 712. Although not shown, computing system 700 can include a system bus or data transfer system that couples the various components within the device. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.
Computing system 700 also includes computer-readable media 714, such as one or more memory devices that enable persistent and/or non-transitory data storage (i.e., in contrast to mere signal transmission), examples of which include random access memory (RAM), non-volatile memory (e.g., any one or more of a read-only memory (ROM), flash memory, EPROM, EEPROM, etc.), and a disk storage device. A disk storage device may be implemented as any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewriteable compact disc (CD), any type of a digital versatile disc (DVD), and the like. Computing system 700 can also include a mass storage media device 716.
Computer-readable media 714 provides data storage mechanisms to store device data 704, as well as various device applications 718 and any other types of information and/or data related to operational aspects of computing system 700. For example, an operating system 720 can be maintained as a computer application with computer-readable media 714 and executed on processors 710. Device applications 718 may include a device manager, such as any form of a control application, software application, signal-processing and control module, code that is native to a particular device, a hardware abstraction layer for a particular device, and so on. Device applications 718 also include any system components, engines, or managers to implement an interactive object with multiple electronics modules.
Although embodiments of techniques using, and objects including, an interactive object with multiple electronics modules have been described in language specific to features and/or methods, it is to be understood that the subject of the appended claims is not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as example implementations of an interactive object with multiple electronics modules.
This application claims priority to U.S. Provisional Patent Application Ser. No. 62/337,081 filed on May 16, 2016, the disclosure of which is incorporated by reference herein in its entirety.
“Non-Final Office Action”, U.S. Appl. No. 14/666,155, dated Aug. 24, 2016, 9 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/681,625, dated Aug. 12, 2016, 9 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/930,220, dated Sep. 14, 2016, 15 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/312,486, dated Oct. 7, 2016, 15 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/504,061, dated Sep. 12, 2016, 7 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/582,896, dated Nov. 7, 2016, 5 pages. |
“Philips Vital Signs Camera”, Retrieved From: <http://www.vitalsignscamera.com/> Apr. 15, 2015, Jul. 17, 2013, 2 pages. |
“Pre-Interview Communication”, U.S. Appl. No. 14/513,875, dated Oct. 21, 2016, 3 pages. |
“Restriction Requirement”, U.S. Appl. No. 14/666,155, dated Jul. 22, 2016, 5 pages. |
“The Instant Blood Pressure app estimates blood pressure with your smartphone and our algorithm”, Retrieved at: http://www.instantbloodpressure.com/ on Jun. 23, 2016, 6 pages. |
Arbabian,“A 94GHz mm-Wave to Baseband Pulsed-Radar for Imaging and Gesture Recognition”, 2012 IEEE, 2012 Symposium on VLSI Circuits Digest of Technical Papers, 2012, 2 pages. |
Balakrishnan,“Detecting Pulse from Head Motions in Video”, In Proceedings: CVPR '13 Proceedings of the 2013 IEEE Conference on Computer Vision and Pattern Recognition Available at: <http://people.csail.mit.edu/mrub/vidmag/papers/Balakrishnan_Detecting_Pulse_from_2013_CVPR_paper.pdf>, Jun. 23, 2013, 8 pages. |
Couderc,“Detection of Atrial Fibrillation using Contactless Facial Video Monitoring”, In Proceedings: Heart Rhythm Society, vol. 12, Issue 1 Available at: <http://www.heartrhythmjournal.com/article/S1547-5271(14)00924-2/pdf>, Jan. 2015, 7 pages. |
Espina,“Wireless Body Sensor Network for Continuous Cuff-less Blood Pressure Monitoring”, International Summer School on Medical Devices and Biosensors, 2006, Sep. 2006, 5 pages. |
Godana,“Human Movement Characterization in Indoor Environment using GNU Radio Based Radar”, Retrieved at: http://repository.tudelft.nl/islandora/object/uuid:414e1868-dd00-4113-9989-4c213f1f7094?collection=education, Nov. 30, 2009, 100 pages. |
He,“A Continuous, Wearable, and Wireless Heart Monitor Using Head Ballistocardiogram (BCG) and Head Electrocardiogram (ECG) with a Nanowatt ECG Heartbeat Detection Circuit”, In Proceedings: Thesis, Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology Available at: <http://dspace.mit.edu/handle/1721.1/79221>, Feb. 2013, 137 pages. |
Holleis,“Evaluating Capacitive Touch Input on Clothes”, Proceedings of the 10th International Conference on Human Computer Interaction, Jan. 1, 2008, 10 pages. |
Nakajima,“Development of Real-Time Image Sequence Analysis for Evaluating Posture Change and Respiratory Rate of a Subject in Bed”, In Proceedings: Physiological Measurement, vol. 22, No. 3 Retrieved From: <http://iopscience.iop.org/0967-3334/22/3/401/pdf/0967-3334_22_3_401.pdf> Feb. 27, 2015, Aug. 2001, 8 pages. |
Patel,“Applications of Electrically Conductive Yarns in Technical Textiles”, International Conference on Power System Technology (POWECON), Oct. 30, 2012, 6 pages. |
Poh,“A Medical Mirror for Non-contact Health Monitoring”, In Proceedings: ACM SIGGRAPH Emerging Technologies Available at: <http://affect.media.mit.edu/pdfs/11.Poh-etal-SIGGRAPH.pdf>, 2011, 1 page. |
Poh,“Non-contact, Automated Cardiac Pulse Measurements Using Video Imaging and Blind Source Separation.”, In Proceedings: Optics Express, vol. 18, No. 10 Available at: <http://www.opticsinfobase.org/view_article.cfm?gotourl=http%3A%2F%2Fwww%2Eopticsinfobase%2Eorg%2FDirectPDFAccess%2F77B04D55%2DBC95%2D6937%2D5BAC49A426378C02%5F199381%2Foe%2D18%2D10%2D10762%2EP, May 7, 2010, 13 pages. |
Pu,“Gesture Recognition Using Wireless Signals”, Oct. 2014, pp. 15-18. |
Pu,“Whole-Home Gesture Recognition Using Wireless Signals”, MobiCom '13 Proceedings of the 19th annual international conference on Mobile computing & networking, Aug. 27, 2013, 12 pages. |
Wang,“Exploiting Spatial Redundancy of Image Sensor for Motion Robust rPPG”, In Proceedings: IEEE Transactions on Biomedical Engineering, vol. 62, Issue 2, Jan. 19, 2015, 11 pages. |
Wang,“Micro-Doppler Signatures for Intelligent Human Gait Recognition Using a UWB Impulse Radar”, 2011 IEEE International Symposium on Antennas and Propagation (APSURSI), Jul. 3, 2011, pp. 2103-2106. |
Wijesiriwardana,“Capacitive Fibre-Meshed Transducer for Touch & Proximity Sensing Applications”, IEEE Sensors Journal, IEEE Service Center, Oct. 1, 2005, 5 pages. |
Zhadobov,“Millimeter-wave Interactions with the Human Body: State of Knowledge and Recent Advances”, International Journal of Microwave and Wireless Technologies, Mar. 1, 2011, 11 pages. |
Zhang,“Study of the Structural Design and Capacitance Characteristics of Fabric Sensor”, Advanced Materials Research (vols. 194-196), Feb. 21, 2011, 8 pages. |
“Final Office Action”, U.S. Appl. No. 15/142,619, dated Feb. 8, 2018, 15 pages. |
“Final Office Action”, U.S. Appl. No. 15/093,533, dated Mar. 21, 2018, 19 pages. |
“First Action Interview Office Action”, U.S. Appl. No. 15/286,152, dated Mar. 1, 2018, 5 pages. |
“Foreign Office Action”, Chinese Application No. 201721290290.3, dated Mar. 9, 2018, 2 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/267,181, dated Feb. 8, 2018, 29 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/874,955, dated Feb. 8, 2018, 7 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/959,730, dated Feb. 22, 2018, 8 pages. |
“Pre-Interview Communication”, U.S. Appl. No. 15/166,198, dated Mar. 8, 2018, 8 pages. |
“Pre-Interview First Office Action”, U.S. Appl. No. 15/286,152, dated Feb. 8, 2018, 4 pages. |
“Advisory Action”, U.S. Appl. No. 14/504,139, dated Aug. 28, 2017, 3 pages. |
“Final Office Action”, U.S. Appl. No. 14/959,901, dated Aug. 25, 2017, 19 pages. |
“Final Office Action”, U.S. Appl. No. 15/403,066, dated Oct. 5, 2017, 31 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/093,533, dated Aug. 24, 2017, 18 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/142,619, dated Aug. 25, 2017, 16 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/959,799, dated Sep. 8, 2017, 16 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/398,147, dated Sep. 8, 2017, 7 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/518,863, dated Sep. 29, 2017, 20 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/142,689, dated Oct. 4, 2017, 18 pages. |
“Pre-Interview Office Action”, U.S. Appl. No. 14/862,409, dated Sep. 15, 2017, 16 pages. |
“Written Opinion”, PCT Application No. PCT/US2017/032733, dated Jul. 26, 2017, 5 pages. |
“Combined Search and Examination Report”, GB Application No. 1620892.8, dated Apr. 6, 2017, 5 pages. |
“Corrected Notice of Allowance”, U.S. Appl. No. 14/930,220, dated Mar. 20, 2017, 2 pages. |
“Corrected Notice of Allowance”, U.S. Appl. No. 14/930,220, dated May 11, 2017, 2 pages. |
“Final Office Action”, U.S. Appl. No. 14/518,863, dated May 5, 2017, 18 pages. |
“First Action Interview Office Action”, U.S. Appl. No. 14/959,901, dated Apr. 14, 2017, 3 pages. |
“International Preliminary Report on Patentability”, Application No. PCT/US2015/050903, dated Apr. 13, 2017, 12 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2016/060399, dated Jan. 30, 2017, 11 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/504,038, dated Mar. 22, 2017, 33 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/403,066, dated May 4, 2017, 31 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/494,863, dated May 30, 2017, 7 pages. |
“Pre-Interview Communication”, U.S. Appl. No. 15/343,067, dated Apr. 19, 2017, 3 pages. |
“Textile Wire Brochure”, Retrieved at: http://www.textile-wire.ch/en/home.html, Aug. 7, 2004, 17 pages. |
Stoppa,“Wearable Electronics and Smart Textiles: A Critical Review”, In Proceedings of Sensors, vol. 14, Issue 7, Jul. 7, 2014, pp. 11957-11992. |
“Final Office Action”, U.S. Appl. No. 14/959,799, dated Jan. 4, 2018, 17 pages. |
“Final Office Action”, U.S. Appl. No. 14/959,730, dated Nov. 22, 2017, 16 pages. |
“International Search Report and Written Opinion”, PCT Application No. PCT/US2017/047691, dated Nov. 16, 2017, 13 pages. |
“International Search Report and Written Opinion”, PCT Application No. PCT/US2017/051663, dated Nov. 29, 2017, 16 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/504,121, dated Jan. 2, 2018, 19 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/959,901, dated Jan. 8, 2018, 21 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/504,139, dated Oct. 18, 2017, 12 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/595,649, dated Oct. 31, 2017, 16 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/862,409, dated Dec. 14, 2017, 17 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/403,066, dated Jan. 8, 2018, 18 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/874,955, dated Oct. 20, 2017, 7 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/398,147, dated Nov. 15, 2017, 8 pages. |
“Notice of Publication”, U.S. Appl. No. 15/703,511, dated Jan. 4, 2018, 1 page. |
Bondade, et al., “A linear-assisted DC-DC hybrid power converter for envelope tracking RF power amplifiers”, 2014 IEEE Energy Conversion Congress and Exposition (ECCE), IEEE, Sep. 14, 2014, pp. 5769-5773, XP032680873, DOI: 10.1109/ECCE.2014.6954193, Sep. 14, 2014, 5 pages. |
Fan, et al., “Wireless Hand Gesture Recognition Based on Continuous-Wave Doppler Radar Sensors”, IEEE Transactions on Microwave Theory and Techniques, Plenum, USA, vol. 64, No. 11, Nov. 1, 2016 (Nov. 1, 2016), pp. 4012-4020, XP011633246, ISSN: 0018-9480, DOI: 10.1109/TMTT.2016.2610427, Nov. 1, 2016, 9 pages. |
Lien, et al., “Soli: Ubiquitous Gesture Sensing with Millimeter Wave Radar”, ACM Transactions on Graphics (TOG), ACM, US, vol. 35, No. 4, Jul. 11, 2016 (Jul. 11, 2016), pp. 1-19, XP058275791, ISSN: 0730-0301, DOI: 10.1145/2897824.2925953, Jul. 11, 2016, 19 pages. |
Martinez-Garcia, et al., “Four-quadrant linear-assisted DC/DC voltage regulator”, Analog Integrated Circuits and Signal Processing, Springer New York LLC, US, vol. 88, No. 1, Apr. 23, 2016 (Apr. 23, 2016), pp. 151-160, XP035898949, ISSN: 0925-1030, DOI: 10.1007/S10470-016-0747-8, Apr. 23, 2016, 10 pages. |
Skolnik, “CW and Frequency-Modulated Radar”, In: “Introduction to Radar Systems”, Jan. 1, 1981 (Jan. 1, 1981), McGraw Hill, XP055047545, ISBN: 978-0-07-057909-5, pp. 68-100, p. 95-p. 97, Jan. 1, 1981, 18 pages. |
Zheng, et al., “Doppler Bio-Signal Detection Based Time-Domain Hand Gesture Recognition”, 2013 IEEE MTT-S International Microwave Workshop Series on RF and Wireless Technologies for Biomedical and Healthcare Applications (IMWS-BIO), IEEE, Dec. 9, 2013 (Dec. 9, 2013), p. 3, XP032574214, DOI: 10.1109/IMWS-BIO.2013.6756200, Dec. 9, 2013, 3 Pages. |
“Corrected Notice of Allowance”, U.S. Appl. No. 14/312,486, dated Jan. 23, 2017, 4 pages. |
“Corrected Notice of Allowance”, U.S. Appl. No. 14/582,896, dated Feb. 6, 2017, 2 pages. |
“Corrected Notice of Allowance”, U.S. Appl. No. 14/582,896, dated Feb. 23, 2017, 2 pages. |
“International Preliminary Report on Patentability”, Application No. PCT/US2015/043963, dated Feb. 16, 2017, 12 pages. |
“International Preliminary Report on Patentability”, Application No. PCT/US2015/030388, dated Dec. 15, 2016, 12 pages. |
“International Preliminary Report on Patentability”, Application No. PCT/US2015/043949, dated Feb. 16, 2017, 13 pages. |
“International Preliminary Report on Patentability”, Application No. PCT/US2015/044774, dated Mar. 2, 2017, 8 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2016/062082, dated Feb. 23, 2017, 12 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2016/055671, dated Dec. 1, 2016, 14 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/504,121, dated Jan. 9, 2017, 13 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/504,139, dated Jan. 27, 2017, 10 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/513,875, dated Feb. 21, 2017, 9 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/874,955, dated Feb. 27, 2017, 8 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/959,799, dated Jan. 27, 2017, 10 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/398,147, dated Mar. 9, 2017, 10 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/930,220, dated Feb. 2, 2017, 8 pages. |
“Pre-Interview Communication”, U.S. Appl. No. 14/494,863, dated Jan. 27, 2017, 5 pages. |
“Pre-Interview Communication”, U.S. Appl. No. 14/959,730, dated Feb. 15, 2017, 3 pages. |
“Pre-Interview Communication”, U.S. Appl. No. 14/959,901, dated Feb. 10, 2017, 3 pages. |
“Final Office Action”, U.S. Appl. No. 14/518,863, dated Apr. 5, 2018, 21 pages. |
“Final Office Action”, U.S. Appl. No. 14/504,139, dated May 1, 2018, 14 pages. |
“Final Office Action”, U.S. Appl. No. 15/595,649, dated May 23, 2018, 13 pages. |
“First Action Interview Office Action”, U.S. Appl. No. 15/166,198, dated Apr. 25, 2018, 8 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/287,253, dated Apr. 5, 2018, 17 pages. |
“Written Opinion”, PCT Application No. PCT/US2017/032733, dated Jul. 24, 2017, 5 pages. |
“Final Office Action”, U.S. Appl. No. 15/142,689, dated Jun. 1, 2018, 16 pages. |
“Final Office Action”, U.S. Appl. No. 14/874,955, dated Jun. 11, 2018, 9 pages. |
“Final Office Action”, U.S. Appl. No. 14/959,901, dated Jun. 15, 2018, 21 pages. |
“Final Office Action”, U.S. Appl. No. 15/286,152, dated Jun. 26, 2018, 25 pages. |
“Final Office Action”, U.S. Appl. No. 15/267,181, dated Jun. 7, 2018, 31 pages. |
“Final Office Action”, U.S. Appl. No. 14/504,121, dated Jun. 9, 2018, 23 pages. |
“Foreign Office Action”, Chinese Application No. 201721290290.3, dated Jun. 6, 2018, 3 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/586,174, dated Jun. 18, 2018, 7 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/286,512, dated Jul. 19, 2018, 15 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/142,829, dated Aug. 16, 2018, 15 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/862,409, dated Jun. 6, 2018, 7 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/142,619, dated Aug. 13, 2018, 9 pages. |
“Pre-Interview Communication”, U.S. Appl. No. 15/287,359, dated Jul. 24, 2018, 2 pages. |
“Restriction Requirement”, U.S. Appl. No. 15/286,537, dated Aug. 27, 2018, 8 pages. |
“Final Office Action”, U.S. Appl. No. 15/166,198, dated Sep. 27, 2018, 33 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/287,253, dated Sep. 7, 2018, 20 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/959,901, dated Oct. 11, 2018, 22 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/287,308, dated Oct. 15, 2018, 18 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/286,152, dated Oct. 19, 2018, 27 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/286,837, dated Oct. 26, 2018, 10 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/504,139, dated Oct. 5, 2018, 16 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/142,689, dated Oct. 30, 2018, 9 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/874,955, dated Oct. 4, 2018, 8 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/595,649, dated Sep. 14, 2018, 8 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/586,174, dated Sep. 24, 2018, 5 pages. |
“Pre-Interview Communication”, U.S. Appl. No. 15/286,495, dated Sep. 10, 2018, 4 pages. |
“Written Opinion”, PCT Application No. PCT/US2017/051663, dated Oct. 12, 2018, 8 pages. |
Gürbüz et al., “Detection and Identification of Human Targets in Radar Data”, Proc. SPIE 6567, Signal Processing, Sensor Fusion, and Target Recognition XVI, 656701, May 7, 2007, 12 pages. |
Number | Date | Country | |
---|---|---|---|
20170329425 A1 | Nov 2017 | US |
Number | Date | Country | |
---|---|---|---|
62337081 | May 2016 | US |