The present invention relates to a method for selecting commands, suitable for implementation in a three-dimensional human-machine interface. It also relates to a device implementing the method.
The field of the invention is more particularly but non-limitatively that of contactless human-machine interfaces.
Touch interfaces, or touch screens, are now widely used for controlling appliances as varied as computers, mobile phones, etc.
Generally, they comprise a display screen and sensors making it possible to determine the point(s) of contact between the surface of the screen and one or more command objects, such as fingers or a stylus.
These touch interfaces frequently use capacitive measurement technologies for detecting the position of the command objects. The screen can be for example covered with a mesh of capacitive electrodes, and the position of the object is detected based on its interactions, in the form of capacitive couplings, with the electrodes.
Touch interfaces also comprise a software element making it possible to interpret the user commands. The display changes according to the position of the command object(s) detected, allowing the user to monitor his actions visually and to select commands.
Gesture interfaces, or 3D interfaces, are also known, in which a third dimension is added with the possibility of detecting objects remotely before they touch the surface of the screen. These interfaces are equipped with sensors making it possible to measure the position in space, with respect to the interface, of one or more command objects.
Capacitive measurement technologies are also well adapted to producing this type of interface.
Document FR 2 844 349 by Rozière is known for example, which discloses a capacitive proximity detector comprising a plurality of independent electrodes, making it possible to measure the capacitance and the distance between the electrodes and an object in proximity up to distances of several tens, even hundreds, of millimeters. The electrodes can be made transparent by using, for example, indium tin oxide (ITO), and deposited on the display screen.
These interfaces equipped with space measurement capabilities open up new possibilities for interaction between the user and the machine, and make it possible to envisage new human-machine interface (HMI) control modes in which distance or depth data would be fully exploited in order to “browse” through the interface software.
Document US 2008/0307360 from Chaudhri et al. is known, which discloses human-machine interface (HMI) software with a three-dimensional component. However, the three-dimensional aspect is limited to a representation within an environment in which computer objects (icons, files, etc.) have a three-dimensional appearance. The interface control mode, in particular for selecting objects, remains strictly two-dimensional, as it is based on the detection of events such as the movement of a mouse cursor in the plane of the interface.
The purpose of the present invention is to propose a method for the selection of commands (or computer objects) in a human-machine interface (HMI) equipped with three-dimensional measurement capabilities, which makes full use of the three-dimensional aspect of the measurements in the interface software.
This objective is attained with a method for selecting commands, implementing a command interface, a display and at least one sensor suitable for detecting at least one command object, comprising the steps of:
The method according to the invention can comprise moreover the steps of:
The display mode of a symbol can comprise differentiated graphical representations of this symbol making it possible to show a state such as emphasis for making a selection, the selection itself, the execution of a command, a movement, a rotation, a change, etc. The display mode can correspond for example to highlighting, graphical differentiation with respect to other displayed symbols by means of a change of colour or size, or redisplaying the symbol in a different manner and shifted so as to be visible for example beyond a command object.
The method according to the invention can comprise moreover a step of using at least one of the following data sets: distance data, distance and position data, in order to determine the displayed symbol(s).
Determining the displayed symbol can comprise selecting the symbols shown on the display, and therefore the commands and/or groups of commands accessible on the interface, according to distance and/or position data.
The distance and position data can comprise:
According to embodiments, the method according to the invention can implement at least one of the following measurement types:
These measurements can make it possible in particular to obtain distance and/or position data.
The measured capacitive interactions can comprise in particular:
The variations in measured light intensities can be generated for example by the interruption of light beams by command objects, or shade effects due to the presence of command objects.
The method according to the invention can comprise moreover the steps of:
This inclusion can be defined in a logical manner, such as for example in a tree structure of hierarchical commands, or a stack of commands or sets of commands.
The second symbols can be displayed at positions that are substantially different to that of the first symbol on the display, for example so as not to mask the first symbol.
The method according to the invention can comprise moreover the steps of:
The second symbol can be displayed at a position substantially identical to that of the first symbol on the display, such as for example to illustrate a depthwise movement in a stack of symbols from which elements will be removed as the movement of the command object progresses.
The method according to the invention can comprise moreover a step of selecting a command comprising a step of verifying at least one selection condition based on a data set from: distance data, distance and position data.
It can comprise moreover a step of verifying at least one selection condition from the following selection conditions:
The selection of a command can take place when the command objects are in proximity to or converging towards a position defined for this command.
The method according to the invention can comprise moreover a step of executing a (previously selected) command of one of the following types: executing a computer program, executing an application, displaying the content of a folder stored on computer storage media, displaying an image, playing a sound, playing multimedia content, or any other command.
The method according to the invention can comprise moreover a step of executing a command for moving a symbol, the latter comprising:
The method according to the invention can comprise moreover a step of verifying at least one validation condition from the following validation conditions:
According to embodiments, the method according to the invention can comprise moreover the steps of:
According to another aspect, a device is proposed for selecting commands, comprising:
The display can be a display screen, or any other display means, for example in relief (3D display).
The command interface, the sensors and the display can be arranged in any manner, such as for example:
The device according to the invention can comprise moreover:
According to embodiments, the device according to the invention can comprise moreover sensors of at least one of the following types:
The optical sensors can comprise for example light barriers with light sources emitting light beams and photodetectors arranged so as to be illuminated by these light beams when they are not interrupted by command objects. They can also comprise photodetectors sensitive to lighting variations such as the effects of shade or reflection due to the presence of command objects, for example integrated in a screen based on TFT or OLED technology.
According to embodiments, the device according to the invention can moreover comprise all types of sensors suitable for producing distance and/or position data. It can in particular comprise ultrasound acoustic sensors, arranged for example so as to allow location of the command objects by echo and triangulation measurements.
According to yet another aspect, a device is proposed of one of the following types: computer, telephone, smartphone, tablet, display screen, terminal, characterized in that it comprises a device for selecting commands implementing the method according to the invention.
Other advantages and features of the invention will become apparent on reading the detailed description of implementations and embodiments, which are in no way limitative, and from the following attached drawings:
An embodiment of the invention will be described implementing a human-machine interface (HMI) which comprises capacitive sensors. Of course, this embodiment is an example that in no way limits the implementation of the invention. Such an interface is for example well suited to producing a human-machine interface (HMI) for a host system such as a mobile phone, a smartphone, a tablet or a computer.
With reference to
The sensors 6 provide data relating to the distance 4 along the axis z between the object 3 and the detection surface of the interface 1, and data relating to the position 5 in the plane (X, Y) of a projection along the axis z of the object 3 on the command interface 1. They are also able to detect a contact between the command object 3 and the detection surface of the interface 1.
The data relating to the distance 4 and to the position 5 comprise equivalent measurements of distance 4 and of position 5. These measurements, not necessarily expressed in units of length, are translations of measurements of capacitances or of variations of capacitances. In particular, the physical characteristics of the command object 3 can affect the measured capacitances and therefore their translation into equivalent distances and/or positions.
The data relating to the distance 4 and to the position 5 can moreover comprise trajectories, defined as time sequences of distances 4 and/or positions 5, and derived values such as speeds and accelerations.
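By way of illustration, the translation from a capacitance reading to an equivalent distance, and the derivation of speeds along a trajectory, can be sketched as follows. This is a minimal sketch assuming a simple inverse-proportionality model; the text only states that distances are translations of capacitance measurements, without fixing the model, and all names here are illustrative.

```python
# Hedged sketch: capacitance-to-equivalent-distance translation and
# finite-difference speeds. The model d = k / C is an assumption.

def equivalent_distance(capacitance: float, k: float = 1.0) -> float:
    """Equivalent distance (arbitrary units); grows as capacitance falls."""
    return k / capacitance

def speeds(distances: list[float], dt: float) -> list[float]:
    """Finite-difference speeds along a time sequence of distances."""
    return [(d1 - d0) / dt for d0, d1 in zip(distances, distances[1:])]
```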
The sensors 6 comprise capacitive electrodes based on indium tin oxide (ITO). Depending on the application, these electrodes can vary in number and arrangement,
The capacitive electrodes of the sensors 6 are linked to an electronic measurement unit 7 which makes it possible to calculate the distance 4 and the position 5. There are several possible embodiments for the capacitive sensors 6 and the electronic measurement unit 7.
Advantageously, the sensors 6 and the electronic measurement unit 7 are produced according to a method described in document FR 2 844 349 by Rozière. They comprise a plurality of independent electrodes 6 distributed over the surface area of the interface 1. These electrodes 6 are linked to an electronic measurement unit 7 that uses “floating” detection, in other words detection referenced to a floating electrical potential. A guard electrode, also at the floating reference potential, is placed along the rear face of the measurement electrodes 6, between them and the display screen 2, so as to eliminate any parasitic capacitance. All the electrodes are at the same potential and there is thus no capacitive coupling between the electrodes capable of degrading the capacitance measurement. This electronic detection unit 7, and the methods of implementing it that can be used within the framework of the present invention, are also described in detail in document FR 2 756 048 by Rozière, to which reference is made herein.
Scanning makes it possible to sequentially measure the capacitance, and therefore the distance, between the electrodes 6 and the command object 3. The electrodes 6 which are not “polled” are also kept at the same potential as the guard electrode, again in order to eliminate parasitic capacitances.
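A minimal sketch of such a scan loop is given below, with all hardware access behind hypothetical callbacks; nothing here is taken from the document beyond the poll-one-electrode-while-guarding-the-others principle.

```python
# Illustrative scan: poll each electrode in turn while holding all the
# others at the guard potential to suppress parasitic couplings.
# hold_at_guard() and measure_capacitance() are hypothetical callbacks.

def scan_electrodes(electrodes, hold_at_guard, measure_capacitance):
    readings = {}
    for active in electrodes:
        for other in electrodes:
            if other is not active:
                hold_at_guard(other)   # same potential as the guard plane
        readings[active] = measure_capacitance(active)
    return readings
```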
Whether it is a computer, a mobile phone, a tablet or any other system, the host system also comprises computer calculation means 8. These calculation means 8 usually comprise a microprocessor (central processing unit (CPU)) combined with components such as random access memory (RAM) and mass storage means (hard disk, flash memory, etc.), and allow one or more computer programs or software applications to be executed.
A part of this software, also called interface software, is dedicated to HMI management tasks. This interface software contributes to carrying out the steps of the method according to the invention, which comprise:
The human-machine interface software (HMI software) corresponds to what the user sees on the display 2. He interacts with this HMI software by using one or more of the command objects 3 such as his fingers, a stylus, etc.
Conventionally, the HMI software comprises a representation in graphical, symbolic form, of the host system and/or of the possible actions:
Without restriction, all the actions that a user can carry out via the command object 3 and the HMI software will be called “commands”.
These commands are represented graphically on the display 2 by symbols such as icons with which the user can interact by means of the command object 3.
An important challenge in the design of HMIs is the organization and structuring of the representation of the commands so that the user can easily find and select them, by moving or “browsing” through the interface.
The commands can be organised according to hierarchical structures of a three-dimensional nature, which represent sets of commands and among which are distinguished in particular:
The HMIs of the prior art are based essentially on two-dimensional browsing, which takes into account only the position 5 of the command object 3 for selecting commands, whether this involves the mouse cursor (hover or click), a physical contact between an object 3 and the detection surface of the interface 1 (tapping), or even hovering over the detection surface of the interface 1. Thus, even browsing through structures of a three-dimensional nature is in fact reduced to a set of actions in the same plane. It is necessary, for example, to tap on an icon to open a folder and display its content or show stacked commands, i.e. to access a different hierarchical (or topological) level.
Advantageously, the method according to the invention makes it possible to browse in a truly three-dimensional manner through an HMI by using the distance measurements 4. It makes it possible in particular to access the different hierarchical (or topological) layers of a set of commands arranged according to a three-dimensional type structure by varying the distance 4 between the command object 3 and the detection surface of the interface 1. This “access” is shown on the display 2 by displaying the symbols (or icons) representing a command or a set of commands of the selected hierarchical (or topological) level as a function of the distance 4.
Browsing is called three-dimensional inasmuch as the distance data 4 make it possible to browse the hierarchical or topological levels of a structure of commands and/or command groups, the levels of which can be shown on the display 2 by one or a plurality of symbols.
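As a hedged sketch of this rule, the displayed hierarchical level can be derived from the distance measurement alone, with thresholds H1 > H2 > H3 as in the figures; the numeric values below are arbitrary, not taken from the document.

```python
# Distance-to-level mapping: the closer the command object, the deeper
# the displayed hierarchical (or topological) level. Values are
# illustrative equivalent-distance units.

THRESHOLDS = [30.0, 20.0, 10.0]   # H1 > H2 > H3

def hierarchy_level(distance: float) -> int:
    """0 = too far (nothing shown), 1 = first level, 2 = second level..."""
    level = 0
    for h in THRESHOLDS:
        if distance < h:
            level += 1
    return level
```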
It is moreover possible to browse through a particular hierarchical command structure among a plurality of such command structures shown on the display, without affecting the others, by using the position measurement 5. In fact, it can be provided that only the command structure whose graphical symbol is hovered over by the command object 3 (i.e., the one for which the position 5 is close to or over its graphical symbol) “sees” its display changed as a function of the hierarchical level corresponding to the distance 4.
Once displayed, a command can be selected by selecting its representative symbol on the HMI. This selection can be carried out in particular with a view to its execution, or to move the symbol representing it on the display 2 (in which case the command in question comprises the movement of the symbol on the display).
Selecting a command comprises the verification of at least one selection condition, or, in other words, the selection of a command is validated when one or more selection conditions (or time sequences of selection conditions) are satisfied. Various selection conditions can be implemented, including within a single HMI.
Different selection conditions can be implemented in order to allow the execution of different commands optionally attached to or represented by the same symbol on the display 2. These commands can, for example, relate to executing an application represented by an icon, and to moving this icon.
Among the selection conditions applicable within the framework of the invention, the following are distinguished:
These selection conditions, based on the detection of a minimum distance 4 or of a distance 4 below a threshold, can be used without creating ambiguity with respect to command selection tasks, because a command does not have a lower hierarchical or topological level (at least in the application in which this command is selected). In order to further limit the risks of ambiguity, it can be arranged to display the corresponding symbol so that it does not cover the symbols corresponding to command groups of (at least) the same hierarchical structure, and to use the position measurement 5 in order to determine the selected command.
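For illustration, using the position measurement 5 to determine the selected command can be as simple as a hit test of the projected position against the bounds of the displayed symbols; the Symbol record below is a hypothetical stand-in for the HMI's own bookkeeping.

```python
# Hypothetical hit test: find which displayed symbol, if any, the
# projected position (px, py) of the command object falls on.
from dataclasses import dataclass

@dataclass
class Symbol:
    command: str
    x: float
    y: float
    w: float
    h: float

def hovered_symbol(symbols: list[Symbol], px: float, py: float):
    for s in symbols:
        if s.x <= px <= s.x + s.w and s.y <= py <= s.y + s.h:
            return s
    return None
```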
These selection conditions can be implemented by adding a duration condition (a predetermined minimum duration) in order to limit the risks of false commands.
Selection conditions can also be implemented based on trajectories, such that:
A condition of this type corresponds to a virtual “click” performed without contact. As previously, the position measurement 5 is used to determine the selected command.
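One plausible reading of such a contactless “click” is a dip below a click threshold followed by a return above it within a bounded time; the exact trajectory conditions are given in a list not reproduced in this text, so the rule below is an assumption.

```python
# Hedged virtual-click detector: the distance dips below a threshold
# and comes back above it within max_duration, with no contact needed.

def is_virtual_click(samples, click_threshold, max_duration):
    """samples: chronological list of (time, distance) pairs."""
    t_down = None
    for t, d in samples:
        if t_down is None and d < click_threshold:
            t_down = t                              # dipped below
        elif t_down is not None and d >= click_threshold:
            return (t - t_down) <= max_duration     # came back up in time
    return False
```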
Finally, the selection conditions can be implemented based on the trajectories of several command objects, such that:
A selection condition can also be used as a deselection or validation condition, in particular for “releasing” an object when the command relates to a manipulation or a movement of a symbol on the display 2.
Specific validation or deselection conditions can be provided, such as for example a diverging, opening movement of several command objects 3. Thus, moving a symbol on the display 2 by using two fingers as command objects 3 can be obtained by a sequence of pinching (selection), movement (the selected symbol follows the fingers), and opening the fingers (deselection).
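This pinch / move / open sequence can be sketched as a small state machine over the separation of the two command objects; the thresholds and hysteresis below are illustrative assumptions, not values from the document.

```python
# Illustrative drag controller: pinching selects, the symbol follows
# the fingers' midpoint, opening the fingers deselects (validates).

PINCH_ON, PINCH_OFF = 20.0, 40.0   # finger separations, with hysteresis

class DragController:
    def __init__(self):
        self.dragging = False
        self.symbol_pos = None

    def update(self, p1, p2):
        """p1, p2: (x, y) positions 5 of the two command objects 3."""
        sep = ((p1[0] - p2[0]) ** 2 + (p1[1] - p2[1]) ** 2) ** 0.5
        if not self.dragging and sep < PINCH_ON:
            self.dragging = True                 # pinch: selection
        elif self.dragging and sep > PINCH_OFF:
            self.dragging = False                # open fingers: deselection
        if self.dragging:
            self.symbol_pos = ((p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2)
        return self.symbol_pos
```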
In order to assist the user with browsing, the position 5 and the distance 4 of the command object 3 can be shown on the display screen 2 by means of a circular pattern centered on the position 5 and having a diameter dependent on the distance 4, or any other pattern.
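A minimal sketch of this feedback is given below, drawn with tkinter purely for illustration; the circular pattern follows the description above, while the scale factor is an assumption.

```python
# Circle centred on the position 5 with a diameter that grows with the
# distance 4, so the user sees both where and how high the finger is.
import tkinter as tk

def draw_hover_cursor(canvas: tk.Canvas, x: float, y: float,
                      distance: float, scale: float = 1.5) -> None:
    r = max(2.0, distance * scale) / 2          # radius tracks distance
    canvas.delete("hover_cursor")               # clear the previous pattern
    canvas.create_oval(x - r, y - r, x + r, y + r,
                       outline="gray", tags="hover_cursor")
```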
An implementation of the method according to the invention for browsing a tree-type structure of command folders will be described.
This tree structure comprises command groups or folders represented by the symbol or the icon 10, and commands represented by the symbol or the icon 11 on the display screen 2.
With reference to
With reference to
The sets of commands D21, D22, D23 and the command C24 which are included in D12 are also defined, belonging to a second hierarchical level accessible when the command object 3 is at distances 4 comprised between H2 and H3.
The arrows 12 show the trajectory in space of the finger 3 over the distances 4 (H1, H2, H3) and positions 5 (P1, . . . P4) corresponding to the example below.
With reference to
Firstly, as shown in
The user then lowers his finger 3 to a distance 4 comprised between H1 and H2. When his finger hovers over the position P2, the set of commands D12 is made prominent. To this end, the corresponding symbol or icon is for example highlighted, graphically differentiated from the others by means of a change of colour or size, or re-displayed differently and shifted so as to be visible beyond the finger 3. This situation is shown in
By lowering his finger 3 above the position P2 to a distance 4 comprised between H2 and H3, the user accesses the content of D12. The corresponding sets of commands D21, D22, D23 and the command C24 are displayed, according to
The user can then move his finger 3 to P4 to highlight the command C24 as shown in
Display of a new hierarchical level can replace that of the previous level in order to maintain good legibility, for example on a small screen 2. It is also possible to display the content of a lower hierarchical level in proximity to the symbol of the selected command group of a higher hierarchical level.
According to an embodiment, the symbol representing a command group (or its icon) can comprise a representation of the symbols of the elements or commands that it includes (for example, a reduced-size representation of their icons), and the icons of the content can be displayed so that the user has the impression of zooming in on the content when accessing the hierarchical level of this content.
An implementation of the method according to the invention for browsing command structures of the stack type (stacks of commands) will now be described.
With reference to
Firstly, as shown in
The user then lowers his finger 3 to a distance 4 comprised between distances H1 and H2 as shown in
Then two variants are possible.
According to a first variant shown in
Then, by lowering his finger 3 above the position P1 to a distance 4 comprised between H3 and H4, he displays the second command C2 of the stack, and so on.
The arrows 22 in
This variant is well suited for example to displaying images, in which case the symbol is the image and the command simply its display.
According to a second variant shown in
If the user continues to lower his finger 3 above the position P1 to a distance 4 comprised between H3 and H4, he displays at P2 the second command C2 of the stack, and so on.
In this variant, the user can highlight a displayed command for the purpose of selecting it by moving his finger 3 to position P2. This situation is shown in
The arrows 23 in
As stated previously, the stack 21 can comprise commands 11 and/or sets of commands 10. Once a command 11 or a set of commands 10 is highlighted, it is possible to select it or browse through its tree structure in the same way as previously described in relation to
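The stack-browsing rule common to both variants can be sketched as a mapping from distance bands to stack indices; the band boundaries H1 > H2 > H3 > H4 below are illustrative, not values from the document.

```python
# Each successive distance band exposes the next command of the stack:
# H1-H2 merely highlights the stack, H2-H3 shows C1, H3-H4 shows C2...

BANDS = [40.0, 30.0, 20.0, 10.0]   # H1 > H2 > H3 > H4 (illustrative)

def stack_command(distance: float):
    """None above H2 (stack only highlighted between H1 and H2);
    otherwise 0 for C1, 1 for C2, and so on."""
    for i in range(1, len(BANDS) - 1):
        if BANDS[i] >= distance > BANDS[i + 1]:
            return i - 1
    return None
```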
The thresholds on the distance 4 can be managed as follows, it being understood that several methods of managing these thresholds can be implemented according to the command structures in question and/or the choices that the user can make via a configuration menu:
With reference to
A particular command symbol 11 can represent several command possibilities (execution of an application for example), or only a movement command (for example if the symbol represents part of a set displayed on the display screen 2).
In the case of a plurality of possible commands, it is sufficient to define different selection conditions for each of them in order to avoid any ambiguity,
Firstly, the user moves two (or more) fingers 3 closer to the surface of the interface 1, until reaching a distance at which the sensors 6 become able to “distinguish” the fingers. When the fingers 3 are detected, and if their positions 5 correspond substantially to that of the symbol 11 on the display 2, the symbol 11 is made prominent (for example highlighted). Depending on the devices, it can be necessary for the fingers 3 to contact the surface of the interface 1.
Alternatively, the user can also browse a structure or command stack as explained previously to reach the step in which the symbol 11 is made prominent.
Then, the user selects the command for moving the command symbol 11 by performing a pinching movement 30 or moving the fingers 3 closer together as shown in
The symbol 11 can be moved by moving the fingers 3, following their position.
Validation of the command, and therefore of the positioning of the symbol 11 at an arrival position, is performed by moving the fingers further apart, as shown in
It is also possible, as shown in
If the distance 4 of the fingers 3 increases beyond a certain limit during the movement, it can be provided, depending on the application, that the symbol 11 freezes, changes its appearance, disappears or returns to its starting position. An increase in the distance 4 beyond a certain limit can also be used as a deselection condition for the movement command without validating it, with the symbol 11 returning to its starting position.
This method for controlling movement commands can allow for example a play mode for board games such as chess, draughts, etc. to be implemented.
According to variant embodiments:
Of course, the invention is not limited to the examples which have just been described and numerous adjustments can be made to these examples without exceeding the scope of the invention.
Number | Date | Country | Kind |
---|---|---|---|
1150726 | Jan 2011 | FR | national |
This application is a continuation of U.S. patent application Ser. No. 13/982,791 (now U.S. Pat. No. 10,303,266 and published as U.S. Publication No. 2013-0307776), filed Jul. 31, 2013, which is a U.S. National Stage application under 35 U.S.C. § 371 of International Application No. PCT/FR2012/050183, filed Jan. 30, 2012, which claims the priority benefit of French Patent Application No. 1150726, filed Jan. 31, 2011, the entire disclosures of which are incorporated herein by reference for all purposes.
Number | Name | Date | Kind |
---|---|---|---|
5103085 | Zimmerman | Apr 1992 | A |
5270818 | Ottenstein | Dec 1993 | A |
5345550 | Bloomfield | Sep 1994 | A |
5347295 | Agulnick et al. | Sep 1994 | A |
5363051 | Jenstrom et al. | Nov 1994 | A |
5406305 | Shimomura et al. | Apr 1995 | A |
5483261 | Yasutake | Jan 1996 | A |
5488204 | Mead et al. | Jan 1996 | A |
5528266 | Arbeitman et al. | Jun 1996 | A |
5684294 | Kouhi | Nov 1997 | A |
5825352 | Bisset et al. | Oct 1998 | A |
5835079 | Shieh | Nov 1998 | A |
5844506 | Binstead | Dec 1998 | A |
5880411 | Gillespie et al. | Mar 1999 | A |
5952992 | Helms | Sep 1999 | A |
5956291 | Nehemiah et al. | Sep 1999 | A |
6073036 | Heikkinen et al. | Jun 2000 | A |
6105419 | Michels et al. | Aug 2000 | A |
6188391 | Seely et al. | Feb 2001 | B1 |
6253218 | Aoki et al. | Jun 2001 | B1 |
6308144 | Bronfeld et al. | Oct 2001 | B1 |
6310610 | Beaton et al. | Oct 2001 | B1 |
6313853 | Lamontagne et al. | Nov 2001 | B1 |
6323846 | Westerman et al. | Nov 2001 | B1 |
6373612 | Hoffman et al. | Apr 2002 | B1 |
6414674 | Kamper et al. | Jul 2002 | B1 |
6480188 | Horsley | Nov 2002 | B1 |
6520013 | Wehrenberg | Feb 2003 | B1 |
6583676 | Krah et al. | Jun 2003 | B2 |
6601012 | Horvitz et al. | Jul 2003 | B1 |
6661920 | Skinner | Dec 2003 | B1 |
6664744 | Dietz | Dec 2003 | B2 |
6680677 | Tiphane | Jan 2004 | B1 |
6690275 | Long et al. | Feb 2004 | B2 |
6690387 | Zimmerman et al. | Feb 2004 | B2 |
6812466 | O'Connor et al. | Nov 2004 | B2 |
6822635 | Shahoian et al. | Nov 2004 | B2 |
6847354 | Vranish | Jan 2005 | B2 |
6903730 | Mathews et al. | Jun 2005 | B2 |
6920619 | Milekic | Jul 2005 | B1 |
6938221 | Nguyen | Aug 2005 | B2 |
6947571 | Rhoads et al. | Sep 2005 | B1 |
6956564 | Williams | Oct 2005 | B1 |
6961912 | Aoki et al. | Nov 2005 | B2 |
7015894 | Morohoshi | Mar 2006 | B2 |
7016705 | Bahl et al. | Mar 2006 | B2 |
7019622 | Orr et al. | Mar 2006 | B2 |
7058902 | Iwema et al. | Jun 2006 | B2 |
7151528 | Taylor et al. | Dec 2006 | B2 |
7171221 | Amin et al. | Jan 2007 | B1 |
7184064 | Zimmerman et al. | Feb 2007 | B2 |
7319454 | Thacker et al. | Jan 2008 | B2 |
7417650 | Horvitz | Aug 2008 | B1 |
7522065 | Falcon | Apr 2009 | B2 |
RE40867 | Binstead | Aug 2009 | E |
7570064 | Roziere | Aug 2009 | B2 |
7593552 | Higaki et al. | Sep 2009 | B2 |
7633076 | Huppi et al. | Dec 2009 | B2 |
7653883 | Hotelling et al. | Jan 2010 | B2 |
7663607 | Hotelling et al. | Feb 2010 | B2 |
7663620 | Robertson et al. | Feb 2010 | B2 |
7715790 | Kennedy | May 2010 | B1 |
7743348 | Robbins et al. | Jun 2010 | B2 |
8149002 | Ossart et al. | Apr 2012 | B2 |
8159213 | Roziere | Apr 2012 | B2 |
8381135 | Hotelling et al. | Feb 2013 | B2 |
8479122 | Hotelling et al. | Jul 2013 | B2 |
8612856 | Hotelling et al. | Dec 2013 | B2 |
8770033 | Roziere | Jul 2014 | B2 |
8917256 | Roziere | Dec 2014 | B2 |
8935625 | Lago | Jan 2015 | B2 |
9035903 | Binstead | May 2015 | B2 |
10042418 | Hotelling et al. | Aug 2018 | B2 |
10067632 | Shinde et al. | Sep 2018 | B2 |
10303266 | Roziere | May 2019 | B2 |
20010015718 | Hinckley et al. | Aug 2001 | A1 |
20010031633 | Tuomela et al. | Oct 2001 | A1 |
20010035858 | Blumberg | Nov 2001 | A1 |
20020036618 | Wakai | Mar 2002 | A1 |
20020057260 | Mathews et al. | May 2002 | A1 |
20020140633 | Rafii et al. | Oct 2002 | A1 |
20020167488 | Hinckley et al. | Nov 2002 | A1 |
20030001899 | Partanen et al. | Jan 2003 | A1 |
20030016253 | Aoki et al. | Jan 2003 | A1 |
20030076363 | Murphy | Apr 2003 | A1 |
20030095096 | Robbin et al. | May 2003 | A1 |
20030132922 | Philipp | Jul 2003 | A1 |
20030160808 | Foote et al. | Aug 2003 | A1 |
20030179201 | Thacker | Sep 2003 | A1 |
20040021647 | Iwema et al. | Feb 2004 | A1 |
20040095318 | Morrison et al. | May 2004 | A1 |
20040135818 | Thomson et al. | Jul 2004 | A1 |
20040145601 | Brielmann et al. | Jul 2004 | A1 |
20040150668 | Myers et al. | Aug 2004 | A1 |
20040150669 | Sabiers et al. | Aug 2004 | A1 |
20040224638 | Fadell et al. | Nov 2004 | A1 |
20040233153 | Robinson | Nov 2004 | A1 |
20040245438 | Payne et al. | Dec 2004 | A1 |
20050015731 | Mak et al. | Jan 2005 | A1 |
20050057524 | Hill et al. | Mar 2005 | A1 |
20050134578 | Chambers et al. | Jun 2005 | A1 |
20050190142 | Ferguson | Sep 2005 | A1 |
20050219223 | Kotzin et al. | Oct 2005 | A1 |
20050219228 | Alameh et al. | Oct 2005 | A1 |
20050219394 | Du et al. | Oct 2005 | A1 |
20050221791 | Angelhag | Oct 2005 | A1 |
20050223308 | Gunn et al. | Oct 2005 | A1 |
20060001650 | Robbins et al. | Jan 2006 | A1 |
20060010400 | Dehlin et al. | Jan 2006 | A1 |
20060012577 | Kyrola | Jan 2006 | A1 |
20060017692 | Wehrenberg et al. | Jan 2006 | A1 |
20060026521 | Hotelling et al. | Feb 2006 | A1 |
20060026535 | Hotelling et al. | Feb 2006 | A1 |
20060097733 | Roziere | May 2006 | A1 |
20060117108 | Salisbury et al. | Jun 2006 | A1 |
20060146012 | Arneson et al. | Jul 2006 | A1 |
20060161846 | Marco | Jul 2006 | A1 |
20060161870 | Hotelling et al. | Jul 2006 | A1 |
20060161871 | Hotelling et al. | Jul 2006 | A1 |
20060164241 | Makela et al. | Jul 2006 | A1 |
20060197753 | Hotelling | Sep 2006 | A1 |
20060244735 | Wilson | Nov 2006 | A1 |
20060279548 | Geaghan | Dec 2006 | A1 |
20060290921 | Hotelling et al. | Dec 2006 | A1 |
20070075965 | Huppi et al. | Apr 2007 | A1 |
20070085157 | Fadell et al. | Apr 2007 | A1 |
20070099574 | Wang | May 2007 | A1 |
20070216659 | Amineh | Sep 2007 | A1 |
20070277123 | Shin et al. | Nov 2007 | A1 |
20070288860 | Ording et al. | Dec 2007 | A1 |
20070294639 | Van et al. | Dec 2007 | A1 |
20080006762 | Fadell et al. | Jan 2008 | A1 |
20080090617 | Sutardja | Apr 2008 | A1 |
20080113618 | De leon et al. | May 2008 | A1 |
20080165141 | Christie | Jul 2008 | A1 |
20080211779 | Pryor | Sep 2008 | A1 |
20080278450 | Lashina | Nov 2008 | A1 |
20080284261 | Andrieux et al. | Nov 2008 | A1 |
20080307345 | Hart et al. | Dec 2008 | A1 |
20080307360 | Chaudhri et al. | Dec 2008 | A1 |
20080309632 | Westerman et al. | Dec 2008 | A1 |
20090058829 | Kim et al. | Mar 2009 | A1 |
20090128498 | Hollemans et al. | May 2009 | A1 |
20090237371 | Kim | Sep 2009 | A1 |
20090247233 | Kim | Oct 2009 | A1 |
20090265670 | Kim | Oct 2009 | A1 |
20090289914 | Cho | Nov 2009 | A1 |
20090295715 | Seo et al. | Dec 2009 | A1 |
20090315858 | Sato et al. | Dec 2009 | A1 |
20090327969 | Estrada | Dec 2009 | A1 |
20100052700 | Yano et al. | Mar 2010 | A1 |
20100060599 | Kwak et al. | Mar 2010 | A1 |
20100095206 | Kim | Apr 2010 | A1 |
20100123667 | Kim et al. | May 2010 | A1 |
20100211919 | Brown et al. | Aug 2010 | A1 |
20100265204 | Tsuda | Oct 2010 | A1 |
20100283743 | Coddington | Nov 2010 | A1 |
20100289740 | Kim et al. | Nov 2010 | A1 |
20110018811 | Miernik | Jan 2011 | A1 |
20110041096 | Larco et al. | Feb 2011 | A1 |
20110057956 | Ranford | Mar 2011 | A1 |
20110128244 | Cho et al. | Jun 2011 | A1 |
20110164063 | Shimotani et al. | Jul 2011 | A1 |
20110169783 | Wang et al. | Jul 2011 | A1 |
20110179368 | King et al. | Jul 2011 | A1 |
20110221693 | Miyazaki | Sep 2011 | A1 |
20110221776 | Shimotani et al. | Sep 2011 | A1 |
20110248963 | Lawrence et al. | Oct 2011 | A1 |
20110296351 | Ewing et al. | Dec 2011 | A1 |
20120026113 | Kasahara et al. | Feb 2012 | A1 |
20120044662 | Kim et al. | Feb 2012 | A1 |
20120187965 | Roziere | Jul 2012 | A1 |
20120188200 | Roziere | Jul 2012 | A1 |
20120249443 | Anderson | Oct 2012 | A1 |
20120270533 | You | Oct 2012 | A1 |
20130135247 | Na et al. | May 2013 | A1 |
20130154982 | Hotelling et al. | Jun 2013 | A1 |
20130307776 | Roziere | Nov 2013 | A1 |
20140062875 | Rafey et al. | Mar 2014 | A1 |
20140074426 | Hotelling et al. | Mar 2014 | A1 |
20140132335 | Rauhala et al. | May 2014 | A1 |
20150035792 | Roziere et al. | Feb 2015 | A1 |
20160004348 | Roziere | Jan 2016 | A1 |
20160179247 | Blondin | Jun 2016 | A1 |
20180341324 | Hotelling et al. | Nov 2018 | A1 |
20190121470 | Roziere | Apr 2019 | A1 |
Number | Date | Country |
---|---|---|
1243096 | Oct 1988 | CA |
201266371 | Jul 2009 | CN |
101526881 | Sep 2009 | CN |
101547253 | Sep 2009 | CN |
101681218 | Mar 2010 | CN |
101727236 | Jun 2010 | CN |
101952792 | Jan 2011 | CN |
102037436 | Apr 2011 | CN |
10042300 | Mar 2002 | DE |
10059906 | Jun 2002 | DE |
10251296 | May 2004 | DE |
0462759 | Dec 1991 | EP |
0464908 | Jan 1992 | EP |
0288692 | Jul 1993 | EP |
0664504 | Jul 1995 | EP |
0992969 | Apr 2000 | EP |
1014295 | Jan 2002 | EP |
1185058 | Mar 2002 | EP |
1335430 | Aug 2003 | EP |
1355223 | Oct 2003 | EP |
1452988 | Sep 2004 | EP |
1507132 | Feb 2005 | EP |
1507196 | Feb 2005 | EP |
1569079 | Aug 2005 | EP |
1696414 | Aug 2006 | EP |
2104024 | Sep 2009 | EP |
2105844 | Sep 2009 | EP |
2109030 | Oct 2009 | EP |
2166463 | Mar 2010 | EP |
2267791 | Dec 2010 | EP |
2426581 | Mar 2012 | EP |
2634680 | Sep 2013 | EP |
2634687 | Sep 2013 | EP |
2778859 | Sep 2014 | EP |
2756048 | May 1998 | FR |
2844349 | Mar 2004 | FR |
2330670 | Apr 1999 | GB |
2418808 | Apr 2006 | GB |
63-167923 | Jul 1988 | JP |
6-161661 | Jun 1994 | JP |
8-263699 | Oct 1996 | JP |
2000-163031 | Jun 2000 | JP |
2002-342033 | Nov 2002 | JP |
10-2011-0029681 | Mar 2011 | KR |
1997018547 | May 1997 | WO |
1997023738 | Jul 1997 | WO |
1998014863 | Apr 1998 | WO |
1999028813 | Jun 1999 | WO |
2000038042 | Jun 2000 | WO |
2004093045 | Oct 2004 | WO |
2006003590 | Jan 2006 | WO |
2006023569 | Mar 2006 | WO |
2006026012 | Mar 2006 | WO |
2006003590 | May 2006 | WO |
2009028892 | Mar 2009 | WO |
2015007781 | Jan 2015 | WO |
Entry |
---|
“Capacitive Position Sensing”, Available online at: <http://www.synaptics.com/technology/cps.cfm>, Accessed on Aug. 5, 2005. |
Final Office Action received for U.S. Appl. No. 16/222,838, dated Jan. 28, 2020, 37 pages. |
Final Office Action received for U.S. Appl. No. 13/982,791, dated Jun. 7, 2018, 6 pages. |
International Search Report received for PCT Patent Application No. PCT/EP2015/052876, dated Jun. 16, 2015, 2 pages. |
International Search Report received for PCT Patent Application No. PCT/FR2012/050183, dated Apr. 16, 2012, 3 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/767,238, dated Sep. 15, 2017, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/222,838, dated Aug. 15, 2019, 25 pages. |
Notice of Allowance received for U.S. Appl. No. 14/767,238, dated Aug. 8, 2018, 8 pages. |
Search Report received for French Patent Application No. 1150726, dated Sep. 2, 2011, 2 pages. |
Search Report received for French Patent Application No. 1351275, dated Oct. 18, 2013, 2 pages. |
Search Report received for French Patent Application No. 1451080, dated Jul. 22, 2014, 2 pages. |
“4-Wire Resistive Touchscreens”, Available online at: <http://www.touchscreens.com/intro-touchtypes-4-resistive.html>, Accessed on Aug. 5, 2005. |
“5-Wire Resistive Touchscreens”, Available online at: <http://www.touchscreens.com/intro-touchtypes-resistive.html>, Accessed on Aug. 5, 2005. |
“A Brief Overview of Gesture Recognition”, Available online at: <http://www.dai.ed.ac.uk/Cvonline/LOCA_COPIES/COHEN/gesture_overview.html>, Accessed on Apr. 20, 2004. |
Advisory Action received for U.S. Appl. No. 13/982,791, dated Oct. 17, 2018, 2 pages. |
Agilent Technologies Inc., “Agilent unveils optical proximity sensor for mobile appliances”, Available online at: <http://www.embeddedstar.com/press/content/2004/8/embedded16015.html>, Accessed on Aug. 31, 2004, 2 pages. |
Ahmad, Subatai, “A Usable Real-Time 3D Hand Tracker”, Proceedings of the 28th Asilomar Conference on Signals, Systems and Computers—Part 2 (of 2), vol. 2, Oct. 1994, 5 pages. |
“Ai Squared Products—XoomText Magnifier”, Available Online at: <http://www/aisquared.com/Products/zoomtexturemag/index.cfm>, Accessed on Oct. 26, 2005. |
Bier et al., “Toolglass and Magic Lenses: The See-Through Interface”, In James Kijiya, editor, Computer Graphics (SIGGRAPH '93 Proceedings), vol. 27, Aug. 1993, pp. 73-80. |
“Capacitive Touchscreens”, Available online at: <http://www.touchscreens.com/intro-touchtypes-capacitive.html>, Accessed on Aug. 5, 2005. |
Chen et al., “Flowfield and Beyond: Applying Pressure-Sensitive Multi-Point Touchpad Interaction”, Multimedia and Expo, ICME '03, Proceedings, Jul. 9, 2003, pp. I-49-I52. |
cnet news.com, “Reinventing the Scroll Wheel”, Photo 1, Available online at: <http://news.com.com/2300-1041_3-6107951-1.html?tag=ne.gall.pg>, Accessed on Aug. 22, 2006, 2 pages. |
cnet news.com, “Reinventing the Scroll Wheel”, Photo 2, Available online at: <http://news.com.com/2300-1041_3-6107951-2.html?tag=ne.gall.pg>, Accessed on Aug. 22, 2006, 2 pages. |
cnet news.com, “Reinventing the Scroll Wheel”, Photo 3, Available online at: <http://news.com.com/2300-1041_3-6107951-3.html?tag=ne.gall.pg>, Accessed on Aug. 22, 2006, 2 pages. |
cnet news.com, “Reinventing the Scroll Wheel”, Photo 4, Available online at: <http://news.com.com/2300-1041_3-6107951-4.html?tag=ne.gall.pg>, Accessed on Aug. 22, 2006, 2 pages. |
cnet news.com, “Reinventing the Scroll Wheel”, Photo 5, Available online at: <http://news.com.com/2300-1041_3-6107951-5.html?tag=ne.gall.pg>, Accessed on Aug. 22, 2006, 2 pages. |
cnet news.com, “Reinventing the Scroll Wheel”, Photo 6, Available online at: <http://news.com.com/2300-1041_3-6107951-6.html?tag=ne.gall.pg>, Accessed on Aug. 22, 2006, 2 pages. |
cnet news.com, “Reinventing the Scroll Wheel”, Photo 7, Available online at: <http://news.com.com/2300-1041_3-6107951-7.html?tag=ne.gall.pg>, Accessed on Aug. 22, 2006, 2 pages. |
cnet news.com, “Reinventing the Scroll Wheel”, Photo 8, Available online at: <http://news.com.com/2300-1041_3-6107951-8.html?tag=ne.gall.pg>, Accessed on Aug. 22, 2006, 2 pages. |
Comparing Touch Technologies, Available online at: <http://www.touchscreens.com/intro-touchtypes.html>, Accessed on Oct. 10, 2004. |
Douglas et al., “The Ergonomics of Computer Pointing Devices”, 1997. |
EVB Elektronik, “TSOP6238 IR Receiver Modules for Infrared Remote Control Systems”, Jan. 2004, 1 page. |
Final Office Action received for U.S. Appl. No. 13/982,791, dated Jan. 20, 2017, 14 pages. |
Final Office Action received for U.S. Appl. No. 13/982,791, dated Oct. 15, 2015, 17 pages. |
Final Office Action received for U.S. Appl. No. 14/767,238, dated Mar. 9, 2018, 23 pages. |
Final Office Action received for U.S. Appl. No. 14/767,238, dated Sep. 15, 2017, 16 pages. |
Final Office Action received for U.S. Appl. No. 14/767,238, dated Mar. 17, 2017, 16 pages. |
“FingerWorks—Gesture Guide—Application Switching”, Available online at: <http://www.fingerworks.com/gesture_guide_apps.html>, Accessed on Aug. 27, 2004, 1 page. |
“FingerWorks—Gesture Guide—Editing”, Available online at: <http://www.fingerworks.com/gesure_guide_editing.html>, Accessed on Aug. 27, 2004, 1 page. |
“FingerWorks—Gesture Guide—File Operations”, Available online at: <http://www.fingerworks.com/gesture_guide_files.html>, Accessed on Aug. 27, 2004, 1 page. |
“FingerWorks—Gesture Guide—Text Manipulation”, Available online at: <http://www.fingerworks.com/gesture_guide_text_manip.html>, Accessed on Aug. 27, 2004, 2 pages. |
“FingerWorks—Gesture Guide—Tips and Tricks”, Available online at: <http://www.fingerworks.com/gesture_guide_tips.html>, Accessed on Aug. 27, 2004, 1 page. |
“FingerWorks—Gesture Guide—Web”, Available online at: <http://www.fingerworks.com/gesture_guide_web.html>, Accessed on Aug. 27, 2004, 1 page. |
“FingerWorks—Guide to Hand Gestures for USB Touchpads”, Available online at: <http://www.fingerworks.com/igesture_userguide.html>, Accessed on Aug. 27, 2004, 1 page. |
“FingerWorks—iGesture—Technical Details”, Available online at: <http://www.fingerworks.com/igesture_tech.html>, Accessed on Aug. 27, 2004, 1 page. |
“FingerWorks—The Only Touchpads with Ergonomic Full-Hand Resting and Relaxation!”, Available online at: <http://www.fingerworks.com/resting.html>, 2001, 1 page. |
“FingerWorks—Tips for Typing on the Mini”, Available online at: <http://www.fingerworks.com/mini_typing.html>, Accessed on Aug. 27, 2004, 2 pages. |
Fisher et al., “Repetitive Motion Disorders: The Design of Optimal Rate-Rest Profiles”, Human Factors, vol. 35, No. 2, Jun. 1993, pp. 283-304. |
Fukumoto et al., “ActiveClick: Tactile Feedback for Touch Panels”, In CHI 2001 Summary, 2001, pp. 121-122. |
Fukumoto et al., “Body Coupled Fingering: Wireless Wearable Keyboard”, CHI 97, Mar. 1997, pp. 147-154. |
“Gesture Recognition”, Available online at: <http://www.fingerworks.com/gesture_recognition.html>, downloaded on Aug. 30, 2005, 2 pages. |
“GlidePoint”, Available online at: <http://www.cirque.com/technology/technology_gp.html>, Accessed on Aug. 5, 2005. |
Hardy, Ian, “Fingerworks”, BBC World On Line, Mar. 7, 2002. |
Hillier et al., “Introduction to Operations Research”, 1986. |
Hinckley et al., “Touch-Sensing Input Devices”, CHI '99 Proceedings, May 1999, pp. 223-230. |
“How Do Touchscreen Monitors Know Where You're Touching?”, Available online at: <http://electronics.howstuffworks.com/question716.html>, Jul. 7, 2008, 2 pages. |
“How Does a Touchscreen Work?”, Available online at: <http://www.touchscreens.com/intro-anatomy.html>, Accessed on Aug. 5, 2005. |
“iGesture Pad—the MultiFinger USB TouchPad with Whole-Hand Gestures”, Available online at: <http://www.fingerworks.com/igesture.html>, Accessed on Aug. 27, 2004, 2 pages. |
“iGesture Products for Everyone (learn in minutes) Product Overview”, Available online at: <FingerWorks.com>, Accessed on Aug. 30, 2005. |
“Infrared Touchscreens”, Available online at: <http://www.touchscreens.com/intro-touchtypes-infrared.html>, Accessed on Aug. 5, 2005. |
International Search Report received for PCT Patent Application No. PCT/EP2014/052533, dated May 9, 2014, 3 pages. |
Jacob et al., “Integrality and Separability of Input Devices”, ACM Transactions on Computer-Human Interaction, vol. 1, Mar. 1994, pp. 3-26. |
Kahney, L., “Pocket PCs Masquerade as IPods”, Available online at: <http://www.wired.com/gadgets/mac/news/2004/03/62543>, Mar. 8, 2004, 2 pages. |
Kennedy, Peter J. et al., Unpublished U.S. Appl. No. 10/805,144, filed Mar. 19, 2004, titled “Methods and Apparatuses for Configuration Automation”, 59 pages. |
Kionx, “KXP84 Series Summary Data Sheet”, Oct. 21, 2005, 4 pages. |
Lee et al., “A Multi-Touch Three Dimensional Touch-Sensitive Tablet”, CHI'85 Proceedings, Apr. 1985, pp. 21-25. |
Lee, S., “A Fast Multiple-Touch-Sensitive Input Device”, A Thesis Submitted in Conformity with the Requirements for the Degree of Master of Applied Science in the Department of Electrical Engineering, University of Toronto, Oct. 1984, 115 pages. |
“Lunar Screen Magnifier and Lunar Plus Enhanced Screen Magnifier”, Available online at: <www.dolphincomputeraccess.com/products/lunar.htm>, Accessed on Oct. 25, 2005. |
Matsushita et al., “HoloWall: Designing a Finger, Hand, Body and Object Sensitive Wall”, In Proceedings of UIST '97, Oct. 1997. |
“Mouse Emulation”, FingerWorks, Available online at: <http://www.fingerworks.com/gesture_guide_mouse.html>, Accessed on Aug. 30, 2005. |
“Mouse Gestures in Opera”, Available online at: <http://www.opera.com/products/desktop/mouse/index.dml>, Accessed on Aug. 30, 2005. |
“Mouse Gestures”, Optimoz, May 21, 2004. |
“MultiTouch Overview”, FingerWorks, Available online at: <http://www.fingerworks.com/multoverview.html>, Accessed on Aug. 30, 2005. |
“Near Field Imaging Touchscreens”, Available online at: <http://www.touchscreens.com/intro-touchtypes-nfi.html>, Accessed on Aug. 5, 2005. |
Non-Final Office Action received for U.S. Appl. No. 13/982,791, dated Apr. 22, 2016, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 13/982,791, dated Mar. 25, 2015, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 13/982,791, dated Oct. 6, 2017, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/767,238, dated Sep. 28, 2016, 12 pages. |
Notice of Allowance received for U.S. Appl. No. 13/982,791, dated Jan. 11, 2019, 21 pages. |
“PenTouch Capacitive Touchscreens”, Available online at: <http://www.touchscreens.com/intro-touchtypes-pentouch.html>, Accessed on Aug. 5, 2005. |
Quantum Research Group, “QT510/QWheel Touch Slider IC”, 2004-2005, 14 pages. |
Quek, “Unencumbered Gestural Interaction”, IEEE Multimedia, vol. 3, 1996, pp. 36-47. |
Radwin, “Activation Force and Travel Effects on Overexertion in Repetitive Key Tapping”, Human Factors, vol. 39, No. 1, Mar. 1997, pp. 130-140. |
Rekimoto et al., “ToolStone: Effective Use of the Physical Manipulation Vocabularies of Input Devices”, In Proc. of UIST 2000, 2000. |
Roos, Gina, “Agilent's New Proximity Sensor Beats the Fumble-Fingered Competition Hands Down . . . Literally”, eeProductCenter, Available online at: <http://www.eeproductcenter.com/showArticle.jhtml?article ID_46200544>, Sep. 1, 2004, 3 pages. |
Rubine, Dean H., “The Automatic Recognition of Gestures”, CMU-CS-91-202, Submitted in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Computer Science at Carnegie Mellon University, Dec. 1991, 285 pages. |
Rubine, Dean, “Combining Gestures and Direct Manipulation”, CHI'92, May 3-7, 1992, pp. 659-660. |
Rubine et al., “Programmable Finger-Tracking Instrument Controllers”, Computer Music Journal, vol. 14, No. 1, 1990, pp. 26-41. |
Rutledge et al., “Force-To-Motion Functions For Pointing”, Human-Computer Interaction—INTERACT, 1990. |
Search Report received for Chinese Patent Application No. 201280007164.6, dated Apr. 20, 2017, 4 pages. |
Smith et al., “Relating Distortion to Performance in Distortion Oriented Displays”, Proceedings of Sixth Australian Conference on Computer-Human Interaction, Nov. 1996, pp. 6-11. |
Sun Microsystems, “The Star7 PDA Prototype”, Available Online at: <http://www.youtube.com/watch?v=Ahg80BYixLO>, 1992, 7 pages. |
“Surface Acoustic Wave Touchscreens”, Available online at: <http://www.touchscreens.com/intro-touchtypes-saw.html>, Accessed on Aug. 5, 2005. |
“Symbol Commander”, Available online at: <http://www.sensiva.com/symbolcomander/>, Accessed on Aug. 30, 2005. |
Synaptics, “Transparent Capacitive Position Sensing”, Available online at: <http://www.synaptics.com/technology/tcps.cfm>, 2005, 2 pages. |
Texas Instruments, “TSC2003/I2C Touch Screen Controller”, Data Sheet SBAS 162, Oct. 2001, 20 pages. |
The Gadgeteer, “Apple iPod (30GB)”, Available online at: <http://thegadgeteer.com/review/apple_ipod_30gb_review>, Jun. 6, 2003, 19 pages. |
“Touch Technologies: Touch is Everywhere”, Available online at: <http://www.3m.com/3MTouchSystems/downloads/PDFs/TouchTechOV.pdf>, Accessed on Aug. 30, 2005. |
“Touchscreen Technology Choices”, Available online at: <http://www.elotouch.com/products/detech2.asp>, Accessed on Aug. 5, 2005. |
Universal Remote Control Inc., “All Complete Control Remotes Now Use Narrow Band RF”, Available online at: <http://www.universalremote.com/corporate/press_release.php?press=13>, 2008. |
Universal Remote Control Inc., “Operating System with the Aurora MX-950”, MX-950 Owners Manual, 2005. |
Universal Remote Control Inc., “MX-950 (The Aurora)”, Available online at: <www.unversalremote.com>, 2005. |
“Visual Disabilities”, Available Online at: <http://depts.stcc.edu/ods/ACCESS/bpvisual.htm>, Accessed on Oct. 25, 2005. |
“Wacom Components—Technology”, Available online at: <http://www.wacom-components.com/english/tech.asp>, Accessed on Oct. 10, 2004. |
“Watershed Algorithm”, Available online at: <http://rsb.info.nih.gov/ij/plugins/watershed.html>, Accessed on Aug. 5, 2005. |
Wellner, Pierre, “The Digital Desk Calculators: Tangible Manipulation on a Desk Top Display”, In ACM UIST '91 Proceedings, Nov. 11-13, 1991, pp. 27-34. |
Westerman, Wayne, “Hand Tracking, Finger Identification, and Chordic Manipulation on a Multi-Touch Surface”, A Dissertation Submitted to the Faculty of the University of Delaware in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Electrical Engineering, 1999, 363 pages. |
Williams, Jim, “Applications for a Switched-Capacitor Instrumentation Building Block”, Linear Technology Application Note 3, Jul. 1985, pp. 1-16. |
Yamada et al., “A Switched-Capacitor Interface for Capacitive Pressure Sensors”, IEEE Transactions on Instrumentation and Measurement, vol. 41, No. 1, Feb. 1992, pp. 81-86. |
Yeh et al., “Switched Capacitor Interface Circuit for Capacitive Transducers”, IEEE, 1985. |
Zhai et al., “Dual Stream Input for Pointing and Scrolling”, Proceedings of CHI '97 Extended Abstracts, 1997. |
Zimmerman et al., “Applying Electric Field Sensing to Human-Computer Interfaces”, In CHI '85 Proceedings, 1995, pp. 280-287. |
Search Report received for Chinese Patent Application No. 201810965827.4, dated Dec. 31, 2020, 5 pages (2 page of English Translation and 3 page of Official Copy). |
Non-Final Office Action received for U.S. Appl. No. 16/222,838, dated Jan. 26, 2021, 38 pages. |
Search Report received for Chinese Patent Application No. 201810965827.4, dated Jul. 30, 2021, 2 pages (Official Copy Only). See attached Communication 37 CFR § 1.98(a) (3). |
Number | Date | Country | |
---|---|---|---|
20190361539 A1 | Nov 2019 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13982791 | US | |
Child | 16398321 | US |