Capacitive multi-touch surfaces can detect the positions of one or more fingers on the surface, but cannot uniquely identify objects placed on the surface. Optical multi-touch tables, which use a camera/projector system or sensor-in-pixel technology, can identify objects equipped with a visual marker as well as sense multi-touch user input. However, such tables are large, have rigid form-factor limitations (because of the optical arrangement) and consume considerable power.
The following presents a simplified summary of the disclosure in order to provide a basic understanding to the reader. This summary is not intended to identify key features or essential features of the claimed subject matter nor is it intended to be used to limit the scope of the claimed subject matter. Its sole purpose is to present a selection of concepts disclosed herein in a simplified form as a prelude to the more detailed description that is presented later.
A multi-modal sensing surface comprises two overlaid arrays: a capacitive sensing electrode array and an array of RF antennas. A first sensing module is coupled to the capacitive sensing electrode array and is configured to detect both an increase and a decrease of capacitance between electrodes in the array. A second sensing module is coupled to the array of RF antennas and is configured to selectively tune and detune one or more of the RF antennas in the array of RF antennas.
Many of the attendant features will be more readily appreciated as the same becomes better understood by reference to the following detailed description considered in connection with the accompanying drawings.
The present description will be better understood from the following detailed description read in light of the accompanying drawings, wherein:
Like reference numerals are used to designate like parts in the accompanying drawings.
The detailed description provided below in connection with the appended drawings is intended as a description of the present examples and is not intended to represent the only forms in which the present example may be constructed or utilized. The description sets forth the functions of the example and the sequence of steps for constructing and operating the example. However, the same or equivalent functions and sequences may be accomplished by different examples.
As described above, the existing surface devices which can detect multi-touch user input and also identify objects placed on the surface (by way of markers on the bottoms of the objects) use optical techniques to locate and identify the objects. Consequently, these surface devices are bulky and consume significant power when operating. The multi-touch user input detection may also use optical techniques (e.g. using FTIR or imaging of the surface) or may use capacitive sensing (in a similar manner to conventional smartphones and tablets).
Unlike capacitive sensing surfaces, NFC and RFID readers can identify objects via parasitically powered tags which, when activated, transmit the identifier (ID) of the tag (which may be a unique ID); however, they do not provide information about the location of the object being identified. Furthermore, if capacitive sensing and NFC are used in close proximity to each other, they can interfere with each other.
The embodiments described below are not limited to implementations which solve any or all of the disadvantages of known sensing surfaces.
Described herein is a multi-modal sensing surface which can both detect multi-touch user input and also locate one or more objects on the surface. Where an object comprises a short-range wireless tag (e.g. an NFC or near-field RFID tag) the multi-modal sensing surface can both locate and identify the object. The sensing surface may operate as an input device for a computing device and may be a separate peripheral device or may be integrated into the computing device itself.
The multi-modal sensing surface described herein comprises a capacitive sensing electrode array and an array of RF antennas with one array being overlaid on top of the other array (e.g. the array of RF antennas may be underneath the capacitive sensing electrode array, i.e. on the opposite side of the capacitive sensing electrode array from a surface that a user touches). A first sensing module is coupled to the capacitive sensing electrode array and is configured to detect both a decrease and an increase in the capacitance between electrodes in the array. A second sensing module is coupled to the array of RF antennas and is configured to selectively tune and detune the RF antennas in the array, where, when tuned, these antennas are tuned to the same frequency as the wireless tags in the objects (e.g. 13.56 MHz for NFC) such that the second sensing module can activate a proximate wireless tag and receive data from the tag (e.g. a unique ID of the tag). The location and identity information (where known) are then provided as an input to software running on a computing device.
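The division of labour between the two sensing modules can be summarised in a short sketch. The following Python fragment is a minimal, hypothetical model of one scan pass; the names used (CapacitanceEvent, read_tag and so on) are illustrative only and do not correspond to a real hardware interface:

```python
from dataclasses import dataclass
from typing import Callable, Optional

NFC_FREQUENCY_HZ = 13.56e6  # frequency to which the antennas are tuned (NFC)

@dataclass
class CapacitanceEvent:
    location: tuple  # (x, y) position on the surface
    delta: float     # change in mutual capacitance between electrodes

def scan_pass(events: list, read_tag: Callable) -> list:
    """One pass of the multi-modal scan: classify each capacitance change
    and, for conductive objects, attempt to read a wireless tag at that
    location. `read_tag` stands in for the second sensing module and returns
    a tag ID, or None if the object carries no tag."""
    inputs = []
    for event in events:
        if event.delta < 0:
            # Decrease in mutual capacitance: a finger touch.
            inputs.append(("touch", event.location, None))
        else:
            # Increase in mutual capacitance: a conductive object.
            tag_id: Optional[str] = read_tag(event.location)
            inputs.append(("object", event.location, tag_id))
    return inputs
```

The resulting tuples correspond to the location and identity information that is provided as an input to software.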
As shown in
The first part 108 of the sensing surface 100 is a multi-layer structure comprising one array overlaid over the other array as shown in more detail in
In various examples the two arrays 202, 208 may be substantially the same size so that the arrays overlap completely. In other examples, however, the two arrays may not be the same size (e.g. the capacitive sensing electrode array 202 may be larger than the array of RF antennas or vice versa) and/or the arrays may be partially offset from each other so that they do not overlap completely and such that there are portions of the sensing surface which are multi-modal (i.e. where the two arrays overlap) and there are portions of the sensing surface which are not (i.e. where there is only one of the two arrays 202, 208).
The capacitive sensing electrode array 202 comprises a first set of electrodes 204 in a first layer 205 and a second set of electrodes 206 in a second layer 207. In the example shown in
The array of RF antennas 208 comprises a plurality of loop antennas and the example in
In the example shown in
The two arrays 202, 208 are separated by a distance (e.g. by an insulating layer also not shown in
As shown in
Three further examples 402-408 of RF loop antennas are shown in
The loop antennas within each of the two sets 210, 211 may be equally spaced (where this spacing, s, between antennas is not necessarily the same as the width, w, of an antenna) or unequally spaced (and as described above, in some examples the antenna array 208 may only comprise a single set of antennas). Unequal spacing may, for example, be used to achieve variable resolution at various points on the sensing surface (e.g. to provide a sensing surface with lower resolution towards the edges and higher resolution in the middle) and this may, for example, enable the same number of antennas to be used for a larger sensing surface and for a smaller sensing surface.
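Purely as an illustration of unequal spacing, the sketch below lays antenna centres out on an arcsine distribution, one of many possible ways of packing antennas more densely in the middle of the surface than at the edges while keeping the antenna count fixed; the function and its parameters are assumptions made for this example only:

```python
import math

def antenna_centres(n: int, width_mm: float) -> list:
    """Place n antenna centres (n >= 2) across a surface of width_mm, with
    narrower gaps in the middle (higher resolution) and wider gaps towards
    the edges (lower resolution)."""
    return [width_mm * (0.5 + math.asin(2 * i / (n - 1) - 1) / math.pi)
            for i in range(n)]

# With 5 antennas over 200 mm the centres fall at 0, 66.7, 100, 133.3 and
# 200 mm, so the middle gaps are half the width of the edge gaps.
centres = antenna_centres(5, 200.0)
gaps = [b - a for a, b in zip(centres, centres[1:])]
assert gaps[1] < gaps[0] and gaps[2] < gaps[3]
```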
In an example, the loop antennas may be spaced so as to provide good coverage of the whole surface and to alleviate the effects of any nulls 502 in the signal response of a single antenna 504. This can be described with reference to
Although a matrix of RF antennas (as shown in
In the example sensing surface 100 shown in
The second part 110 of the sensing surface 100 comprises the active electronics and this can be described with reference to
As described above, the first sensing module 602 (which may comprise a microprocessor control unit, MCU) is coupled to the capacitive sensing electrode array 202 and is configured to detect both a decrease and an increase in the capacitance between electrodes in the array. A decrease of mutual capacitance between electrodes (i.e. between one or more electrodes in the first set of electrodes 204 and one or more electrodes in the second set of electrodes 206) is used to detect a user's fingers in the same way as conventional multi-touch sensing. Unlike conventional multi-touch sensing, however, the first sensing module 602 can also detect an increase in the capacitance between electrodes in the array. An increase in mutual capacitance between electrodes (i.e. between one or more electrodes in the first set of electrodes 204 and one or more electrodes in the second set of electrodes 206) is used to detect the position, and in various examples also the shape, of a conductive object, such as a wireless tag (e.g. an NFC or RFID tag) in a non-conductive housing or another object formed from a conductive material (without a tag). Unlike a user's finger, such an object has no connection to ground and instead it capacitively couples adjacent electrodes (consequently, the object does not need to have a high electrical conductivity and can instead be made from, or include, any conductive material).
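As a concrete illustration, the position (and a rough indication of the shape) of such an object might be estimated from the region of increased mutual capacitance as sketched below; the assumption that the first sensing module exposes a grid of capacitance deltas is made purely for this example:

```python
from typing import Optional

def object_centroid(deltas: list, threshold: float) -> Optional[tuple]:
    """Estimate an object's position as the centroid of all electrode
    intersections whose mutual capacitance *increased* by more than
    `threshold`. Cells with decreased capacitance (fingers) are ignored;
    the set of qualifying cells also gives a rough outline of the object."""
    cells = [(r, c) for r, row in enumerate(deltas)
             for c, d in enumerate(row) if d > threshold]
    if not cells:
        return None
    n = len(cells)
    return (sum(r for r, _ in cells) / n, sum(c for _, c in cells) / n)

# A 2 x 2 patch of raised capacitance centred between rows 1-2, columns 2-3.
grid = [[0, 0, 0, 0, 0],
        [0, 0, 5, 5, 0],
        [0, 0, 5, 5, 0],
        [0, 0, 0, 0, 0]]
assert object_centroid(grid, 1.0) == (1.5, 2.5)
```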
The second sensing module 604 is coupled to the array of RF antennas 208 and is configured to selectively tune and detune the RF antennas in the array. For example, the second sensing module 604 may deactivate all but a selected one or more RF antennas and then power the selected RF antennas such that they can activate and read any proximate wireless tags (where the reading of tags using a selected antenna may be performed in the same way as in a conventional NFC or RFID reader). Where more than one RF antenna is tuned and powered at the same time, these antennas are selected to be sufficiently far apart that no powered RF antenna is affected by any of the other powered RF antennas. The deactivation of an RF antenna may be implemented in many different ways, for example by shorting the two halves of the loop via a transistor or by making the tuning capacitors (which would otherwise tune the antenna to the right frequency) open-circuit (using a transistor). This selective tuning and detuning of the RF antennas stops the antennas from coupling with each other (e.g. such that power is not coupled into another antenna, which might then activate tags proximate to that other antenna rather than to the original, powered antenna). The second sensing module 604 may be further configured to connect all the RF antennas to ground when the first sensing module 602 is operating. This prevents the capacitive sensors from sensing activity on the non-touch side of the sensing mat (e.g. legs under the table) and provides the capacitive return path to ground (completing the circuit from the user's finger, through the sensing electrodes, to ground and back through the user's body).
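The tuning, detuning and grounding states described above can be captured in a small sketch. The AntennaState enumeration and the antenna_states function below are illustrative names only; in the device itself the switching is performed by hardware (e.g. transistors), not by software of this form:

```python
from enum import Enum

class AntennaState(Enum):
    TUNED = 1     # resonant at the tag frequency and powered for reading
    DETUNED = 2   # e.g. loop shorted, or tuning capacitor made open-circuit
    GROUNDED = 3  # connected to ground while capacitive sensing runs

def antenna_states(selected: set, total: int, capacitive_phase: bool) -> list:
    """Return the state of every antenna for one phase of the scan. During
    the capacitive phase all antennas are grounded (providing the return
    path to ground); otherwise only the selected antennas are tuned and all
    others are detuned so power is not coupled into neighbouring loops."""
    if capacitive_phase:
        return [AntennaState.GROUNDED] * total
    return [AntennaState.TUNED if i in selected else AntennaState.DETUNED
            for i in range(total)]

# Reading via antenna 3 of 6: every other antenna is detuned.
assert antenna_states({3}, 6, False).count(AntennaState.TUNED) == 1
```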
Depending upon the implementation of the sensing surface 100, the second part may also comprise a communication interface 606 arranged to communicate with a separate computing device 102 using a wired or wireless technology. In examples where the power source 605 comprises an input connection for an external power source (e.g. a USB socket) and the communication interface 606 uses a wired protocol (e.g. USB), the communication interface 606 and power source 605 may be integrated. In various examples, the communication interface 606 may, in addition or instead, be arranged to communicate with an object 106 (e.g. following identification of the object by the second sensing module 604).
In various examples, the sensing surface 100 may be integrated with a computing device such that the second part 110 further comprises the component parts of the computing device, such as a processor 608, memory 610, display interface 612, etc. In other examples, the sensing surface 100 may be integrated within a peripheral for a computing device, e.g. within a keyboard 700 as shown in
In various examples, the sensing surface 100 may be arranged to detect gestures above the surface of the first part 108 as well as fingers or conductive objects in contact with the surface (using the two arrays and the two sensing modules as described above). The second part 110 may therefore additionally comprise a gesture recognition module 614 coupled to the capacitive sensing electrode array 202 (or this functionality may be incorporated within the first sensing module 602).
The functionality of one or both of the sensing modules 602, 604 and/or the gesture recognition module 614 described herein may be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), Graphics Processing Units (GPUs).
In examples where the sensing surface 100 is integrated with a computing device such that the second part 110 further comprises the component parts of the computing device, such as a processor 608, memory 610, input/output interface 612, etc., the processor 608 may be a microprocessor, controller or any other suitable type of processor for processing computer executable instructions to control the operation of the device in order to implement functionality of the computing device (e.g. to run an operating system and application software).
The operating system and application software may be provided using any computer-readable media that is accessible by the sensing surface 100. Computer-readable media may include, for example, computer storage media such as memory 610 and communications media. Computer storage media, such as memory 610, includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information for access by a computing device. In contrast, communication media may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave, or other transport mechanism. As defined herein, computer storage media does not include communication media. Therefore, a computer storage medium should not be interpreted to be a propagating signal per se. Propagated signals may be present in computer storage media, but propagated signals per se are not examples of computer storage media. Although the computer storage media (memory 610) is shown within the sensing surface 100, it will be appreciated that the storage may be distributed or located remotely and accessed via a network or other communication link (e.g. using communication interface 606).
The sensing surface 100 may also comprise an input/output interface 612 arranged to output display information to a display device which may be separate from or integral to the sensing surface 100. The display information may provide a graphical user interface. The input/output interface 612 may also be arranged to receive and process input from one or more devices, such as a user input device (e.g. a mouse, keyboard, camera, microphone or other sensor). In some examples the user input device may detect voice input, user gestures or other user actions and may provide a natural user interface (NUI). The input/output interface 612 may comprise NUI technology which enables a user to interact with the computing-based device in a natural manner, free from artificial constraints imposed by input devices such as mice, keyboards, remote controls and the like. Examples of NUI technology that may be provided include but are not limited to those relying on voice and/or speech recognition, touch and/or stylus recognition (touch sensitive displays), gesture recognition both on screen and adjacent to the screen, air gestures, head and eye tracking, voice and speech, vision, touch, gestures, and machine intelligence. Other examples of NUI technology that may be used include intention and goal understanding systems, motion gesture detection systems using depth cameras (such as stereoscopic camera systems, infrared camera systems, RGB camera systems and combinations of these), motion gesture detection using accelerometers/gyroscopes, facial recognition, 3D displays, head, eye and gaze tracking, immersive augmented reality and virtual reality systems and technologies for sensing brain activity using electric field sensing electrodes (EEG and related methods).
If the first sensing module detects an increase in capacitance at a location on the sensing surface (in block 804), the location is used to identify one of the RF antennas (block 808, by the second sensing module 604) and then all other RF antennas are deactivated (block 810, by the second sensing module 604). The identified RF antenna (which has not been deactivated in block 810) is then used to read any proximate wireless tags (block 812, by the second sensing module 604).
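A minimal sketch of the antenna identification (block 808) is given below, assuming equally spaced loop antennas on a regular pitch; an array with unequal spacing would use a lookup table instead. The function name and pitch value are illustrative:

```python
def identify_antenna(x_mm: float, pitch_mm: float, count: int) -> int:
    """Block 808: map a capacitive sensing location to the index of the
    nearest RF antenna, assuming `count` loop antennas on `pitch_mm`
    centres, with antenna 0 centred at x = 0."""
    index = round(x_mm / pitch_mm)
    return max(0, min(count - 1, index))  # clamp to the array bounds

# With 8 antennas on 40 mm centres, an object at x = 95 mm maps to antenna 2.
assert identify_antenna(95.0, 40.0, 8) == 2
```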
The reading of a proximate wireless tag (in block 812) comprises activating the tag and then reading data transmitted by the activated tag. The tag is activated by the RF power coupled to it from the antenna and if the tag is a passive tag, this coupled RF power also provides sufficient power to enable the tag to transmit the data (which comprises an ID for the tag). In various examples, the power which is coupled from the RF antenna to the tag may also power other functionality within the object, such as a flashing LED within the object.
In some examples, the location which is identified (in block 804, by the first sensing module 602) may be between two RF antennas in the same set (e.g. set 210 or set 211 in
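One plausible handling of this case, sketched below, is to identify both candidate antennas and read them one after the other (consistent with the constraint above that simultaneously powered antennas must be far apart); the quarter-pitch threshold is an assumption made for illustration:

```python
def antennas_to_read(x_mm: float, pitch_mm: float, count: int) -> list:
    """Return the antenna indices to read for a detected location, assuming
    antennas on regular `pitch_mm` centres. If the location lies within a
    quarter of a pitch of the midpoint between two adjacent antennas, both
    are returned so they can be read sequentially."""
    nearest = max(0, min(count - 1, round(x_mm / pitch_mm)))
    offset = x_mm / pitch_mm - nearest  # distance from nearest, in pitches
    if abs(offset) > 0.25:
        neighbour = nearest + (1 if offset > 0 else -1)
        if 0 <= neighbour < count:
            return [nearest, neighbour]
    return [nearest]

# x = 65 mm is near the midpoint of antennas 1 and 2 (centres 40 and 80 mm).
assert antennas_to_read(65.0, 40.0, 8) == [2, 1]
```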
Having located and identified an object with a wireless tag on the sensing surface using the method described above, the method (i.e. blocks 802-812) may be repeated to track any movement of the identified object. Alternatively, the tracking of an object may be performed based on signal strength (block 814, i.e. based on the strength of the signal received from the wireless tag) without repeatedly reading (i.e. decoding) the data transmitted by the tag. This may be less susceptible to noise than using only the capacitive sensing to track location (in block 804), because the capacitive sensing may detect both the object (which results in an increase in capacitance between electrodes in the array 202) and a user's hand holding and moving the object (which results in a decrease in capacitance between electrodes in the array 202). Furthermore, whether or not an object is being touched or picked up by a user may be detected in this way and provided as additional input data to software (in block 816).
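A simple one-dimensional form of this signal-strength tracking (block 814) is sketched below as a signal-strength-weighted centroid over nearby antennas; a practical implementation would calibrate against the antennas' measured field patterns rather than assume a linear relationship:

```python
def track_by_signal_strength(rssi_by_antenna: dict, pitch_mm: float) -> float:
    """Estimate a tag's position from the strength of its signal as seen by
    nearby antennas (on regular `pitch_mm` centres), without decoding the
    tag's data on every update."""
    total = sum(rssi_by_antenna.values())
    if total == 0:
        raise ValueError("no signal received from the tag")
    return sum(i * pitch_mm * s for i, s in rssi_by_antenna.items()) / total

# Equal signal strength on antennas 2 and 3 places the tag midway between
# their centres (80 mm and 120 mm).
assert track_by_signal_strength({2: 1.0, 3: 1.0}, 40.0) == 100.0
```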
The location data and object identifier (as read from the wireless tag) which are determined (in blocks 804 and 808-812) are then provided as an input to software (block 816, e.g. where the software may be running on a processor 608 in the sensing surface 100 or in a separate computing device). If the object which caused the increase in capacitance (as detected in block 804) does not include a wireless tag, no object ID will be read by the second sensing module 604 (in block 812) in which case, only location information will be provided as an input to software (in block 816).
If a change in capacitance is detected at more than one location (in block 804), the subsequent blocks in the method of
If there are one or more locations where an increase in capacitance is detected (in addition to zero or more locations where a decrease in capacitance is detected), then if the locations are close together but do not correspond to the same RF antenna (e.g. locations 902 and 904 in
If instead the two detected locations (from block 804) are far apart (e.g. locations 902 and 906 in
If instead the two detected locations (from block 804) correspond to the same RF antenna (e.g. locations 902 and 908 in
Locations may, for example, be determined to be ‘close together’ for this purpose if they correspond to adjacent RF antennas (e.g. as for locations 902 and 904 in
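Taken together, the handling of multiple detected locations amounts to a scheduling policy over the identified antennas: antennas that are far apart may be powered in parallel (per the constraint stated earlier that simultaneously powered antennas must not affect one another), adjacent antennas are read in separate rounds, and locations over the same antenna collapse to a single read. The sketch below shows one such policy, treating 'close together' as adjacency of antenna indices; it is illustrative rather than the only possible scheme:

```python
def schedule_reads(antennas: list) -> list:
    """Group identified antennas into read rounds: antennas far apart share
    a round (read in parallel); adjacent antennas are deferred to later
    rounds so that no powered antenna operates next to another powered
    antenna. Duplicate indices (several objects over one antenna) collapse
    to a single read, relying on the reader's anti-collision support."""
    pending = sorted(set(antennas))
    rounds = []
    while pending:
        this_round, remaining = [], []
        for a in pending:
            # 'close together' is modelled here as adjacent antenna indices
            if all(abs(a - b) > 1 for b in this_round):
                this_round.append(a)
            else:
                remaining.append(a)
        rounds.append(this_round)
        pending = remaining
    return rounds

# Antennas 2 and 3 are adjacent, so they are read in separate rounds, while
# antenna 7 is far enough from antenna 2 to be read in parallel with it.
assert schedule_reads([2, 3, 7, 7]) == [[2, 7], [3]]
```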
In addition to detecting the location of an object on the sensing surface (in block 804, using the first sensing module 602 and the capacitive sensing electrode array 202) and the identity of the object, if it contains a wireless tag (in block 812, using the second sensing module 604 and the array of RF antennas 208), the orientation of an object may also be determined. The orientation may be determined using the first sensing module 602 (as part of block 804, e.g. where the object is shaped such that its orientation can be determined from the shape of the region with increased capacitance) and/or the second sensing module 604 (as part of block 812 or 814, e.g. where the object comprises two or more wireless tags which are physically spaced apart, or where an antenna in the wireless tag in the object is shaped and hence directional, for example by using a dipole rather than a coil). Determining the orientation using the first sensing module 602 is likely to be a lower-power solution than using the second sensing module 604. Where the orientation of an object is determined (in any of blocks 804, 812 and 814) this may also be provided as an input to software (in block 816).
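For the two-tag case mentioned above, the orientation follows directly from the two sensed tag positions, as the sketch below shows; the coordinates and units are illustrative:

```python
import math

def orientation_degrees(tag_a: tuple, tag_b: tuple) -> float:
    """Estimate an object's orientation from the sensed positions of two
    physically spaced wireless tags embedded in it, as the angle of the
    line from tag A to tag B measured anticlockwise from the surface's
    x axis."""
    dx = tag_b[0] - tag_a[0]
    dy = tag_b[1] - tag_a[1]
    return math.degrees(math.atan2(dy, dx))

# Tags aligned along the y axis: the object is rotated by 90 degrees.
assert abs(orientation_degrees((0.0, 0.0), (0.0, 25.0)) - 90.0) < 1e-9
```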
Two example form factors of the sensing surface are shown in
The sensing surface may have any size, e.g. it may be small (e.g. less than 100 cm², as in the example in
Although
The sensing surface described above provides a portable sensing area which can detect both multi-touch inputs (e.g. a user's fingers) and objects placed on the surface and, if those objects include a wireless tag, the surface can also identify the objects. The use of the combination of capacitive sensing and RF sensing provides a sensing device which has a lower power consumption than a purely RF solution, and hence a longer battery life where the sensing device is battery powered. The location and identification of objects can also be performed more quickly than with a purely RF solution.
Although the present examples are described and illustrated herein as being implemented in a system as shown in
A first further example provides a multi-modal sensing surface comprising: two overlaid arrays, the two arrays comprising a capacitive sensing electrode array and an array of RF antennas; a first sensing module coupled to the capacitive sensing electrode array and arranged to detect both an increase and a decrease of capacitance between electrodes in the array; and a second sensing module coupled to the array of RF antennas and arranged to selectively tune and detune one or more of the RF antennas in the array of RF antennas.
In the first further example, the second sensing module may be further arranged to receive data transmitted by one or more wireless tags proximate to a tuned RF antenna and via that tuned RF antenna.
In the first further example, the second sensing module may be arranged to selectively tune one or more of the RF antennas in the array of RF antennas to a frequency corresponding to a wireless tag.
In the first further example, the second sensing module may be arranged to selectively detune one or more of the RF antennas in the array of RF antennas by deactivating the antenna.
In the first further example, the array of RF antennas may comprise a first set of RF antennas at a first orientation and a second set of RF antennas at a second orientation. The first set of RF antennas may be perpendicular to the second set of RF antennas and may lie in a plane parallel to a plane comprising the second set of RF antennas.
In the first further example, the sensing surface may comprise a touch surface and the array of RF antennas may be on an opposite side of the capacitive sensing array from the touch surface.
In the first further example, the second sensing module may be further arranged to connect the array of RF antennas to ground whilst the first sensing module is detecting capacitance changes between the electrodes in the capacitive sensing electrode array.
In the first further example, each RF antenna may have a pre-defined signal response and the antennas in the array of RF antennas may be spaced such that a null in the signal response of one RF antenna does not substantially align with a null in the signal response of an adjacent RF antenna.
In the first further example, the two overlaid arrays may be formed in or on a flexible substrate. At least one of the two overlaid arrays may be woven into a fabric substrate.
The first further example may further comprise a communication interface arranged to communicate data to a separate computing device, the data comprising locations of any touch events and objects detected by the first sensing module and identities of any objects determined by the second sensing module.
In the first further example, the first and second sensing modules may be located in a detachable electronics module and may be coupled to the arrays via one or more connectors.
A second further example provides a computing device comprising the multi-modal sensing surface according to the first further example.
A third further example provides a method of detecting and locating touch events and objects using a multi-modal sensing surface, the method comprising: detecting, in a first sensing module in the multi-modal sensing surface, changes in capacitance between electrodes in a capacitive sensing electrode array in the multi-modal sensing surface; in response to detecting, in the first sensing module, a decrease in capacitance between the electrodes at a first location, providing location data identifying the first location as an input to a computer program; in response to detecting, in the first sensing module, an increase in capacitance between the electrodes at a second location: identifying, based on the second location, an RF antenna in an array of RF antennas in the multi-modal sensing surface; detuning, in a second sensing module in the multi-modal sensing surface, one or more adjacent RF antennas in the array of RF antennas; and reading, by the second sensing module and via the identified RF antenna, data from any proximate wireless tags.
The method of the third further example may further comprise: prior to detecting changes in capacitance, connecting the array of RF antennas to ground.
The method of the third further example may further comprise: in response to detecting, in the first sensing module, an increase in capacitance between the electrodes at a second location: providing location data identifying the second location and any data read from any proximate wireless tags as an input to a computer program.
The method of the third further example may further comprise: in response to detecting, in the first sensing module, an increase in capacitance between the electrodes at a second location: tracking motion of an object initially at the second location on the multi-modal sensing surface. The motion may be tracked by repeatedly analyzing strengths of signals received by the identified RF antenna from any proximate wireless tags.
A fourth further example provides a user input device comprising a multi-modal sensing surface, the multi-modal sensing surface comprising a sensing mat and an electronics module and wherein the sensing mat comprises two overlaid arrays, the two arrays comprising a capacitive sensing electrode array and an array of RF antennas and the electronics module comprises a first sensing module coupled to the capacitive sensing electrode array and arranged to detect both an increase and a decrease of capacitance between electrodes in the array and a second sensing module coupled to the array of RF antennas and arranged to selectively tune and detune one or more of the RF antennas in the array of RF antennas.
In the fourth further example, the second sensing module may be further arranged to receive data transmitted by one or more wireless tags proximate to a tuned RF antenna and via that tuned RF antenna.
In the fourth further example, the second sensing module may be arranged to selectively tune one or more of the RF antennas in the array of RF antennas to a frequency corresponding to a wireless tag.
In the fourth further example, the second sensing module may be arranged to selectively detune one or more of the RF antennas in the array of RF antennas by deactivating the antenna.
In the fourth further example, the array of RF antennas may comprise a first set of RF antennas at a first orientation and a second set of RF antennas at a second orientation. The first set of RF antennas may be perpendicular to the second set of RF antennas and may lie in a plane parallel to a plane comprising the second set of RF antennas.
In the fourth further example, the sensing surface may comprise a touch surface and the array of RF antennas may be on an opposite side of the capacitive sensing array from the touch surface.
In the fourth further example, the second sensing module may be further arranged to connect the array of RF antennas to ground whilst the first sensing module is detecting capacitance changes between the electrodes in the capacitive sensing electrode array.
In the fourth further example, each RF antenna may have a pre-defined signal response and the antennas in the array of RF antennas may be spaced such that a null in the signal response of one RF antenna does not substantially align with a null in the signal response of an adjacent RF antenna.
In the fourth further example, the two overlaid arrays may be formed in or on a flexible substrate. At least one of the two overlaid arrays may be woven into a fabric substrate.
The fourth further example may further comprise a communication interface arranged to communicate data to a separate computing device, the data comprising locations of any touch events and objects detected by the first sensing module and identities of any objects determined by the second sensing module.
In the fourth further example, the first and second sensing modules may be located in a detachable electronics module and may be coupled to the arrays via one or more connectors.
In the first and/or fourth further example, the first and/or second sensing module may be at least partially implemented using hardware logic selected from any one or more of: a field-programmable gate array, a program-specific integrated circuit, a program-specific standard product, a system-on-a-chip, a complex programmable logic device.
The term ‘computer’ or ‘computing-based device’ is used herein to refer to any device with processing capability such that it can execute instructions. Those skilled in the art will realize that such processing capabilities are incorporated into many different devices and therefore the terms ‘computer’ and ‘computing-based device’ each include PCs, servers, mobile telephones (including smart phones), tablet computers, set-top boxes, media players, games consoles, personal digital assistants and many other devices.
The methods described herein may be performed by software in machine readable form on a tangible storage medium, e.g. in the form of a computer program comprising computer program code means adapted to perform all the steps of any of the methods described herein when the program is run on a computer and where the computer program may be embodied on a computer readable medium. Examples of tangible storage media include computer storage devices comprising computer-readable media such as disks, thumb drives, memory etc. and do not include propagated signals. Propagated signals may be present in tangible storage media, but propagated signals per se are not examples of tangible storage media. The software can be suitable for execution on a parallel processor or a serial processor such that the method steps may be carried out in any suitable order, or simultaneously.
This acknowledges that software can be a valuable, separately tradable commodity. It is intended to encompass software, which runs on or controls “dumb” or standard hardware, to carry out the desired functions. It is also intended to encompass software which “describes” or defines the configuration of hardware, such as HDL (hardware description language) software, as is used for designing silicon chips, or for configuring universal programmable chips, to carry out desired functions.
Those skilled in the art will realize that storage devices utilized to store program instructions can be distributed across a network. For example, a remote computer may store an example of the process described as software. A local or terminal computer may access the remote computer and download a part or all of the software to run the program. Alternatively, the local computer may download pieces of the software as needed, or execute some software instructions at the local terminal and some at the remote computer (or computer network). Those skilled in the art will also realize that by utilizing conventional techniques known to those skilled in the art that all, or a portion of the software instructions may be carried out by a dedicated circuit, such as a DSP, programmable logic array, or the like.
Any range or device value given herein may be extended or altered without losing the effect sought, as will be apparent to the skilled person.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
It will be understood that the benefits and advantages described above may relate to one embodiment or may relate to several embodiments. The embodiments are not limited to those that solve any or all of the stated problems or those that have any or all of the stated benefits and advantages. It will further be understood that reference to ‘an’ item refers to one or more of those items.
The steps of the methods described herein may be carried out in any suitable order, or simultaneously where appropriate. Additionally, individual blocks may be deleted from any of the methods without departing from the spirit and scope of the subject matter described herein. Aspects of any of the examples described above may be combined with aspects of any of the other examples described to form further examples without losing the effect sought.
The term ‘comprising’ is used herein to mean including the method blocks or elements identified, but that such blocks or elements do not comprise an exclusive list and a method or apparatus may contain additional blocks or elements.
It will be understood that the above description is given by way of example only and that various modifications may be made by those skilled in the art. The above specification, examples and data provide a complete description of the structure and use of exemplary embodiments. Although various embodiments have been described above with a certain degree of particularity, or with reference to one or more individual embodiments, those skilled in the art could make numerous alterations to the disclosed embodiments without departing from the spirit or scope of this specification.