The present disclosure relates generally to improved techniques for establishing useful hand gestures within ultrasonic fields to activate features.
There are situations in which receiving haptic feedback before touching a surface would be beneficial. These include when vision of the display is restricted, such as while driving, and when the user does not want to touch the device, such as when their hands are dirty. Providing feedback above the surface would also allow for an additional information channel alongside the visual one.
A mid-air haptic feedback system creates tactile sensations in the air. One way to create mid-air haptic feedback is using ultrasound. A phased array of ultrasonic transducers is used to exert an acoustic radiation force on a target. This continuous distribution of sound energy, which will be referred to herein as an “acoustic field”, is useful for a range of applications, including haptic feedback.
It is known to control an acoustic field by defining one or more control points in a space within which the acoustic field may exist. Each control point is assigned an amplitude value equating to a desired amplitude of the acoustic field at that point; a physical set of transducers is then controlled to create an acoustic field exhibiting the desired amplitude at each of the control points.
Tactile sensations on human skin can be created by using a phased array of ultrasound transducers to exert an acoustic radiation force on a target in mid-air. Ultrasound waves are transmitted by the transducers, with the phase emitted by each transducer adjusted such that the waves arrive concurrently at the target point, maximizing the acoustic radiation force exerted there.
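As a minimal illustrative sketch (Python with NumPy; the array geometry and focal point are hypothetical, and this is one conventional way to compute focusing phases, not necessarily the method of any claim), the per-transducer phases that make the emitted waves arrive concurrently at a control point can be computed from the path lengths:

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 C
FREQUENCY = 40_000.0     # Hz; 40 kHz is typical for airborne ultrasonic arrays
WAVENUMBER = 2.0 * np.pi * FREQUENCY / SPEED_OF_SOUND

def focusing_phases(transducer_positions, control_point):
    """Phase offset for each transducer so that all emitted waves arrive
    in phase at the control point, maximizing the radiation force there."""
    distances = np.linalg.norm(transducer_positions - control_point, axis=1)
    # Advance each emitter's phase by the propagation delay it must overcome.
    return (WAVENUMBER * distances) % (2.0 * np.pi)

# Hypothetical 16 x 16 array with 10.5 mm pitch, focused 20 cm above its center.
xs = (np.arange(16) - 7.5) * 0.0105
grid = np.array([(x, y, 0.0) for x in xs for y in xs])
phases = focusing_phases(grid, np.array([0.0, 0.0, 0.2]))
```

Driving each transducer with its computed phase and a common amplitude produces constructive interference, and hence maximal radiation force, at the control point; multiple control points additionally require solving for per-transducer amplitudes, which is beyond this sketch.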
As a result, mid-air gesture interaction in the automotive domain has been explored to counteract some drawbacks associated with increasingly ubiquitous touchscreen interfaces (e.g., added complexity and increased eyes-off-the-road time), yet it remains a relatively fledgling modality. Naturally, gestures have their own limitations, namely potential cultural semantic nuances, the learning associated with more intricate gestures, and a lack of a sense of agency. Focused ultrasound can be used to create haptic sensations without requiring physical contact, which creates the opportunity to bind tangible sensations to mid-air gestures; haptic sensations can thus provide confirmatory cues, increasing the user's feeling of control over their gesture. The advent of this technology also provides an alternative to auditory feedback, as the latter could plausibly displease vehicle occupants by interrupting conversations and audible media. Furthermore, mid-air haptic icons could allow more expressivity than previous haptic technologies as a result of information encoded via novel spatial and temporal interplay. This could provide higher resolution in relaying feature semantics and therefore reduce the onus on having intuitive gestures, learning the meaning of haptic signals, or visually attending to a touchscreen for information transfer. From a user experience standpoint, the refinement of mid-air haptics for an automotive gesture interface could also improve user engagement and aesthetic appeal, as seen in applications in public digital signage.
Preliminary investigations have compared mid-air haptic gesture (MAHG) interfaces with industry-trending touchscreen solutions, and have notionally combined existing gesture sets with stock haptic sensations. These studies demonstrated clear benefits associated with MAHG interfaces, such as reduced eyes-off-the-road times and increased task performance. However, beyond their scope was the use of semiotic and psychophysical principles to ground original, holistic interaction designs by mapping gestures and haptics to specific tasks, as well as by optimizing haptic intensity, transience, pattern, and location on the hand. All of these factors are likely to play a key role in augmenting the opportunities that mid-air haptics present; therefore, the aim of this ongoing program of research is to build and evaluate an exemplar set of robust, function-associated haptic gestures, based on human-centered design, to support an in-vehicle infotainment system (IVIS).
Current automotive mid-air gesture interfaces provide only visual and audible feedback as confirmation of a successfully executed interaction with an IVIS. Visual feedback risks adding to driver distraction from the roadway, which is the third most common cause of car crashes; reducing such distraction is one of the primary problems that gesture interfaces aim to solve. Audible feedback is a reasonable alternative; however, it can be intrusive to the driving experience by interrupting music, radio, conversations, or even sleeping occupants.
Using an arbitrary mid-air haptic sensation that actuates onto the driver's hand after they have performed an appropriate gesture or hand pose can let the driver know their input has been acknowledged by the system, which improves the driver's sense of agency (control) in the interaction. However, without this information being relayed through visual or auditory feedback, the driver still cannot be certain that the system has correctly distinguished the performed gesture from others in a gesture set. The premise behind this invention is therefore to transfer this information through the haptic channel using function-associated, user-centered mid-air haptic icons, or "Ultrahapticons". In this way, the driver can recognize the sensation for the intended infotainment feature and be confident they have made the right selection without removing their eyes from the road. Conversely, if they recognize an Ultrahapticon that does not represent the feature they intended to select, they know there has been a false positive in the system and can perform their gesture again without affecting the system status. Additionally, Ultrahapticons hold potential for improving the learning curve of the gesture set, a known barrier to user acceptance of the interaction. Unfamiliarity with a novel interaction paradigm has also been linked to detrimental lane-keeping ability.
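A minimal sketch of this confirmation loop follows (Python; the gesture names, icon names, and callback interfaces are hypothetical, and the gesture recognizer and haptic renderer are assumed to exist elsewhere):

```python
# Hypothetical gesture-to-feature and feature-to-icon mappings; the
# recognizer, haptic renderer, and confirmation signal are assumed interfaces.
GESTURE_TO_FEATURE = {
    "pinch_rotate": "cabin_temperature",
    "two_finger_swipe": "audio_volume",
}
FEATURE_TO_ICON = {
    "cabin_temperature": "thermometer",
    "audio_volume": "sound_waves",
}

def handle_gesture(gesture, render_icon, driver_proceeds):
    """Render the feature's Ultrahapticon as confirmation; commit the
    selection only if the driver, recognizing the icon, proceeds. If the
    icon is wrong (a false positive), the driver simply re-performs the
    gesture and the system status is unaffected."""
    feature = GESTURE_TO_FEATURE.get(gesture)
    if feature is None:
        return None                        # nothing recognized: no change
    render_icon(FEATURE_TO_ICON[feature])  # haptic confirmation on the hand
    return feature if driver_proceeds() else None

# Example: the driver recognizes the icon and continues the interaction.
selected = handle_gesture("pinch_rotate", render_icon=print,
                          driver_proceeds=lambda: True)
```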
There have been a few research efforts to explore the augmentation of gestures with mid-air haptic feedback in the automotive sector, as well as studies investigating the human ability to detect and recognize mid-air haptic sensations. Orestis Georgiou, Valerio Biscione, Adam Harwood, Daniel Griffiths, Marcello Giordano, Ben Long, and Tom Carter. 2017. Haptic In-vehicle gesture controls. 233-238. DOI: https://doi.org/10.1145/3131726.3132045 (Georgiou) demonstrated a MAHG interface in a 2017 conference paper. Their interface gave an example of how mid-air haptic sensations can be assigned to match the behavior of the gestural actions. This prototype demonstrated a proof of concept that mid-air haptic sensations can be mapped onto a driver's gesture in real time; however, the sensations were intended to represent the gestures and not the features themselves.
David R. Large, Kyle Harrington, Gary Burnett, and Orestis Georgiou. 2019. Feel the noise: Mid-air ultrasound haptics as a novel human-vehicle interaction paradigm. Appl. Ergon. (2019). DOI: https://doi.org/10.1016/j.apergo.2019.102909 (Large) conducted a study in which mid-air haptic buttons were seemingly suspended above the gear shifter in a driving simulator. The driver could "tap" these buttons with their palm to select and then move their palm sideways to adjust; during the adjustment interaction the driver felt haptic clicks on the palm representing increments. The study aimed to evaluate the human factors issues arising when gestures are augmented with haptics, in terms of driver safety and user experience, and found genuine advantages over in-vehicle touchscreen and non-haptic gesture input methods. This study provides evidence for the utility of mid-air haptics from an automotive human factors standpoint; however, the interface was function-agnostic and the tasks were purely arbitrary.
Gözel Shakeri, John H. Williamson, and Stephen Brewster. 2018. May the force be with you: Ultrasound haptic feedback for mid-air gesture interaction in cars. Proc. 10th Int. ACM Conf. Automot. User Interfaces Interact. Veh. Appl. (AutomotiveUI 2018), Jul. 2018, 1-10. DOI: https://doi.org/10.1145/3239060.3239081 (Shakeri) ran an experiment to establish the effects of combining three forms of feedback: peripheral visual, auditory, and mid-air haptic. The sensations they used followed similar logic to the Georgiou prototype in that they mimicked the behavior and nature of the gesture rather than the feature itself; for example, a gesture performed with a clockwise circular motion to increase volume was paired with a haptic sensation that traced a clockwise circular motion. In this study the participants were not always able to feel the sensations or distinguish between them, which caused them to ignore the mid-air haptics entirely in some cases.
Davide Rocchesso, Francesco Saverio Cannizzaro, Giovanni Capizzi, and Francesco Landolina. 2019. Accessing and selecting menu items by in-air touch. (2019), 1-9. DOI: https://doi.org/10.1145/3351995.3352053 (Rocchesso) demonstrated a methodology for using arbitrary shapes as icons for different items in a multi-level menu. They showed that certain shapes are easily confused with one another and should not be included together in a menu design. The icons used in this paper were arbitrary, with no direct application to a specific use, as this was not the intention of the research.
Ultrahapticons are a set of tangible and recognizable mid-air haptic icons derived from research-study participants' metaphorical associations with car infotainment features, and are actuated upon detecting movement of a human hand to operate an automotive function. In line with semiotic theory (the study of signs), data from the study were analyzed to identify key characteristics that, when realized in mid-air haptic form, would enable a user to "feel" the feature they are interacting with. Their use is not limited to an automotive context; they can be applied to any application that exhibits the same feature functionality, e.g., home entertainment systems, laptop UIs, digital communication, and Extended Reality (XR).
The accompanying figures, where like reference numerals refer to identical or functionally similar elements throughout the separate views, together with the detailed description below, are incorporated in and form part of the specification, and serve to further illustrate embodiments of concepts that include the claimed invention and to explain various principles and advantages of those embodiments.
Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of embodiments of the present invention.
The apparatus and method components have been represented where appropriate by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the disclosure with details that will be readily apparent.
Various parts of the hand are susceptible to haptic effects in different ways.
In addition, Ultrahapticons may be classified by where on the hand the effect is felt, as enumerated below; a sketch encoding this classification follows the list.
A. Ultrahapticons for Telephone Calls
This is a “single finger only” Ultrahapticon.
This is a “palm only” Ultrahapticon.
This is a “palm only” Ultrahapticon.
B. Ultrahapticons for Audio
This is a “two finger only” Ultrahapticon.
This is a “two finger only” Ultrahapticon.
C. Ultrahapticons for Cabin Temperature
This is a “two finger only” Ultrahapticon.
This is a “thumb only” Ultrahapticon.
This is a “palm and two finger” Ultrahapticon.
D. Ultrahapticons for Seat Temperature
This is a “thumb, palm and two finger” Ultrahapticon.
This is a “palm only” Ultrahapticon.
E. Ultrahapticons for Fan Speed
This is a “palm only” Ultrahapticon.
F. Ultrahapticons for Navigation
This is a “palm only” Ultrahapticon.
This is a “palm only” Ultrahapticon.
This is a “palm only” Ultrahapticon.
G. Ultrahapticons for Home Screen
This is a “palm only” Ultrahapticon.
This is a “palm only” Ultrahapticon.
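By way of illustration only, the following minimal sketch (Python; the enum and feature keys are labeling conventions introduced here, not part of the disclosure) encodes the classification enumerated above. The icons themselves are identified later in this description.

```python
from enum import Enum, auto

class HandRegion(Enum):
    """Where on the hand an Ultrahapticon's effect is felt."""
    SINGLE_FINGER = auto()
    TWO_FINGERS = auto()
    THUMB = auto()
    PALM = auto()
    PALM_AND_TWO_FINGERS = auto()
    THUMB_PALM_AND_TWO_FINGERS = auto()

R = HandRegion  # shorthand

# One entry per Ultrahapticon, per IVIS feature, restating sections A-G above.
ULTRAHAPTICON_REGIONS = {
    "telephone_calls":   [R.SINGLE_FINGER, R.PALM, R.PALM],
    "audio":             [R.TWO_FINGERS, R.TWO_FINGERS],
    "cabin_temperature": [R.TWO_FINGERS, R.THUMB, R.PALM_AND_TWO_FINGERS],
    "seat_temperature":  [R.THUMB_PALM_AND_TWO_FINGERS, R.PALM],
    "fan_speed":         [R.PALM],
    "navigation":        [R.PALM, R.PALM, R.PALM],
    "home":              [R.PALM, R.PALM],
}
```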
A. Feature and Participant Selection
It was evident in the related work that the non-driving-related task (NDRT) and user interface (UI) design are principal to the success of MAHGs. Results from Large indicated that MAHGs appear superior to a touchscreen interface and to mid-air gestures without haptics, with regard to visual demand and task performance, for tasks that involve incrementally adjusting a setting. However, the results indicate that the touchscreen is more appropriate for selection from a non-goal-oriented 4×4 button grid when compared with MAHGs in a car-following simulation paradigm. Arguably, the button selection condition might yield different results if the UI reflected the smaller button proportions inherent in some contemporary production vehicles. Hence, an expert user experience appraisal was conducted in a Tesla Model X to identify ecologically valid features and interactions in the touchscreen interface that could benefit from actuation using MAHGs. These consisted of discrete selection and continuous adjustment interactions for fan speed, cabin and seat temperature, navigation, and audio volume; discrete shortcut interactions for telephone calls and the landing page (home); and responses to telephone call notifications.
A participatory design study was subsequently conducted with a sample of seventeen participants (male n=11, female n=6; age range 19-65 years, mean 30 years) comprising members of the Nottingham Electric Vehicle Owners Club, staff and students at the University of Nottingham, and non-technical employees at Ultraleap Ltd. Understanding cultural difference was considered important; therefore, multiple nationalities were recruited for the sample (UK [n=7], Mexico [n=3], Malaysia [n=3], Hungary [n=1], Spain [n=2], and India [n=1]).
B. Participatory Design Study Procedure
The procedure encompasses an amalgamation of learnings from the related literature. Six of the participants were involved in individual face-to-face sessions in which they experienced the mid-air haptic technology (the exposed group); eleven took part remotely as a result of Covid-19 lockdown measures (the non-exposed group), and the technology was therefore comprehensively demonstrated to them via remote video call.
C. Cognitive Mapping
Following a practice word-association task, participants verbalized the mental models they associated with each infotainment feature. Specifically, participants were asked, "For the words [infotainment feature], what would you associate tactually (i.e., physical sensations), visually (i.e., mental images or objects), and auditorily (i.e., sounds)?" Participants were encouraged to consider the words themselves and not the features in context, based on findings that structuring the questioning in this way led to less bias yet still yielded concrete metaphors.
D. Mental Model Visualization
The next stage involved asking the participants to sketch the visual, auditory, and tactual elements they had previously mentioned. In consideration of differing sketching abilities, the participants were encouraged to follow a "think-aloud" protocol as they sketched; this would enable the investigator to review video footage to understand the participants' thought processes if a sketch was not sufficiently communicative. The investigator demonstrated with an arbitrary example of a radio metaphor illustrated as a retro "boombox" radio, and then directed the participants to render a conceptual sketch for each feature. The exposed group were then shown examples of sensations via the Ultraleap STRATOS Explore array, while the non-exposed group had the technology and types of sensations thoroughly described to them with the aid of a graphical visualizer. The participants were then informed of twelve tunable mid-air haptic parameters that could be manipulated to create different sensations. Using this information, the participants highlighted the elements of each sketch they thought most embodied the metaphor (e.g., the antenna on the radio example); they were then encouraged by the investigator to describe how they would use the parameters, along with a nominal open-palm gesture, to encapsulate these characteristics as their personal mid-air haptic icons, or "Ultrahapticons".
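The twelve parameters are not enumerated here. As a minimal sketch, a subset of such tunable parameters, drawn from those named elsewhere in this description (intensity, transience, pattern, hand location, and spatial/temporal dimensions), might be represented as follows (the dataclass and field names are illustrative assumptions):

```python
from dataclasses import dataclass

@dataclass
class HapticIconParameters:
    """Illustrative subset of tunable mid-air haptic parameters; field
    names and units are assumptions, not the twelve parameters used."""
    intensity: float          # relative focal-point strength, 0..1
    duration_s: float         # transience: how long the sensation persists
    pattern: str              # spatial pattern traced, e.g. "circle", "line"
    hand_location: str        # e.g. "palm", "two_fingers", "thumb"
    draw_frequency_hz: float  # temporal: how fast the pattern is retraced
```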
E. Ultrahapticon Refinement
The next step guided the participants to extend their designs to include how specific dimensions of the sensation would adapt to reflect a user-manipulated change in the feature setting (real-time interactional feedback). This time the participants were asked to consider that the feature would be adjusted using a more realistic "index finger and thumb pinch" hand pose and to specify which axis this hand pose should move along to influence the function. This gesture was selected based on current design guidelines for automotive gesture interfaces produced by the array manufacturer, Ultraleap.
F. Semiotic Decomposition
The Ultrahapticon study elicited 119 sketched designs in total, which were analyzed for their semiotic components to determine the most intuitive designs for each feature (referent). First, the participants' mental model sketches were classified into distinct prevailing styles (proposals). Although not specifically instructed to, many of the participants proposed multiple styles for a single referent. To account for this, the proposals were analyzed for "Max Consensus" (MC: the percentage of participants eliciting the most popular proposal) and "Consensus-Distinct Ratio" (CDR: a measure of the spread of participants across distinct proposals; the closer the value is to 1, the smaller the spread and the greater the agreement among participants). Singular incidences of proposals were eliminated, resulting in a shortlist of 23 Ultrahapticon styles for the 7 features.
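For illustration, these measures can be computed as in the following sketch (Python; the proposal data and function names are hypothetical, and CDR is implemented per its common gesture-elicitation definition, the share of distinct proposals reaching a two-participant consensus threshold, which is consistent with eliminating singular proposals as described above):

```python
from collections import Counter

def max_consensus(proposals):
    """MC: fraction of participants whose proposal for a referent
    matches the most popular proposal."""
    counts = Counter(proposals)
    return max(counts.values()) / len(proposals)

def consensus_distinct_ratio(proposals, threshold=2):
    """CDR: share of distinct proposals reaching consensus (put forward
    by at least `threshold` participants); values near 1 indicate that
    few stray proposals dilute agreement."""
    counts = Counter(proposals)
    consensus = sum(1 for c in counts.values() if c >= threshold)
    return consensus / len(counts)

# Hypothetical proposals for one referent, one per participant.
volume_proposals = ["dial", "dial", "slider", "dial", "slider", "wave"]
print(max_consensus(volume_proposals))             # 3 of 6 participants -> 0.5
print(consensus_distinct_ratio(volume_proposals))  # 2 of 3 distinct styles -> ~0.67
```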
The next level of analysis involved breaking down the styles into exemplar-level semiotic components: their feature "constructs" and "intents". Constructs are physical characteristics of a feature that collectively comprise the holistic mental model (e.g., the rails of a rocking chair); an intent is a symbolic construct used by the designer to express meaning or behavior (e.g., blurred lines indicating movement of the rocking chair).
From the 23 styles, 88 distinct semiotic features were derived; these were analyzed further for consensus, which indicated that 65 were commonly occurring (appearing at least twice). Thirty-two of these were adapted by the participants into their Ultrahapticon designs.
These Ultrahapticons were subsequently decomposed into their value-level components to understand the participants' expectations of the real-time interactional feedback. This included understanding any consensus regarding construct rendering; which spatial, temporal, and spatiotemporal parameters were used to signify feature intents; the location of the sensations on the hand; the axial direction of the pinch gesture; and the dynamic adaptation of the sensation to reflect the feature adjustment.
Limited consensus was observed among the participants during the technical "value-level" part of the study, and some of the concepts were not feasible. Therefore, some results were adapted based on literature heuristics, and where high disagreement occurred, all variations were considered for that icon style. Additionally, frequently recurring constructs and intents from popular styles that were not selected for haptification by participants were reimagined as sensations by the investigator, on the basis that these may have been discarded purely because of the participants' partial understanding of feasibility. This was exacerbated by the language barrier in some cases, which was the only cultural difference observed in the study. To refine the resultant 30 user-centered Ultrahapticons, a remote workshop was conducted with four mid-air haptic experts. The Ultrahapticon design process was described to the attendees, after which they were asked to rate each concept on a five-point Likert scale pertaining to feature appropriateness, expected salience, naturalism, instant recognizability, perspicuity, and technical feasibility. They then provided expert consultation on how to adapt the designs to improve the aforementioned aspects. The data from the workshop were used to hone the concepts, and the result was a shortlist of 17 sensations for 7 features.
For IVIS feature Telephone Calls 1720, the chosen Ultrahapticons were “Rotary Dial” 1722, “Coiled Wire” 1724, and “Bouncing Headset”.
For IVIS feature Audio 1760, the chosen Ultrahapticons were “Sound Waves” 1762 and “Bass Speaker” 1764.
For IVIS feature Cabin Temperature 1730, the chosen Ultrahapticons were “Fire” 1732, “Ice” 1734, and “Thermometer” 1736.
For IVIS feature Seat Temperature 1770, the chosen Ultrahapticons were “Heating Elements” 1772 and “Seat Profile” 1774.
For IVIS feature Navigation 1740, the chosen Ultrahapticons were “Compass” 1742 (both “Compass In” and “Compass Out”), “Road Systems” 1744, “T-Junction” 1746, and “Waypoint Blip” 1748.
For IVIS feature Fan Speed 1750, the chosen Ultrahapticon was “Propeller” 1752.
For IVIS feature Home 1710, the chosen Ultrahapticons were “Sofa Cushion” 1714 and “Triangular Roof” 1712.
G. Conclusions and Future Work
The next phase in this research may determine the most articulate icons from the shortlist; the icons will be prototyped along with synchronous hand poses based on established psychophysical principles and then evaluated in a salience study. An initial objective will be to understand the "articulatory directness" of the icons, that is, the strength of the link between feature and metaphor and whether the icon rendering reflects the intended symbolism. This includes identifying the optimal way of manipulating haptic spatiotemporal dimensions to reflect the dynamic interaction with a specific feature.
The next study may also focus on perceptual optimization of the icon set by testing the icons' salience under simulated workload similar to that of a driving task. This will determine whether certain icons are masked through cognitive load and whether some sensations are confused with others within the set. When eventually tested in a driving simulator, this design process will improve the likelihood of validating distraction and task time measures associated with this MAHG concept without perception confounds.
The various features of the foregoing embodiments may be selected and combined to produce numerous variations of improved haptic-based systems.
In the foregoing specification, specific embodiments have been described. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the invention as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of present teachings.
The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as critical, required, or essential features or elements of any or all the claims. The invention is defined solely by the appended claims, including any amendments made during the pendency of this application and all equivalents of those claims as issued.
Moreover, in this document, relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms “comprises,” “comprising,” “has,” “having,” “includes,” “including,” “contains,” “containing,” or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises, has, includes, or contains a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “comprises . . . a”, “has . . . a”, “includes . . . a”, or “contains . . . a” does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises, has, includes, or contains the element. The terms “a” and “an” are defined as one or more unless explicitly stated otherwise herein. The terms “substantially”, “essentially”, “approximately”, “about”, or any other version thereof are defined as being close to as understood by one of ordinary skill in the art. The term “coupled” as used herein is defined as connected, although not necessarily directly and not necessarily mechanically. A device or structure that is “configured” in a certain way is configured in at least that way, but may also be configured in ways that are not listed.
The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various embodiments for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.
This application claims the benefit of the following application, which is incorporated by reference in its entirety: Ser. No. 63/079,708, filed Sep. 17, 2020.
Entry |
---|
Young et al. “Designing Mid-Air Haptic Gesture Controlled User Interfaces for Cars”, Proceedings of the ACM On Human-Computer Interaction, ACM PUB2Z7, New York, NY, USA, vol. 4, No. EICS, Jun. 18, 2020 (2020-96-18), pp. 1-23. |
Almusawi et al., “A new artificial neural network approach in solving inverse kinematics of robotic arm (denso vp6242).” Computational intelligence and neuroscience 2016 (2016). (Year: 2016). |
Azad et al., Deep domain adaptation under deep label scarcity. arXiv preprint arXiv: 1809.08097 (2018) (Year: 2018). |
Beranek, L., & Mellow, T. (2019). Acoustics: Sound Fields, Transducers and Vibration. Academic Press. |
Boureau et al.,“A theoretical analysis of feature pooling in visual recognition.” In Proceedings of the 27th international conference on machine learning (ICML-10), pp. 111-118. 2010. (Year: 2010). |
Bybi, A., Grondel, S., Mzerd, A., Granger, C., Garoum, M., & Assaad, J. (2019). Investigation of cross-coupling in piezoelectric transducer arrays and correction. International Journal of Engineering and Technology Innovation, 9(4), 287. |
Certon, D., Felix, N., Hue, P. T. H., Patat, F., & Lethiecq, M. (Oct. 1999). Evaluation of laser probe performances for measuring cross-coupling in 1-3 piezocomposite arrays. In 1999 IEEE Ultrasonics Symposium. Proceedings. International Symposium (Cat. No. 99CH37027) (vol. 2, pp. 1091-1094). |
Certon, D., Felix, N., Lacaze, E., Teston, F., & Patat, F. (2001). Investigation of cross-coupling in 1-3 piezocomposite arrays. ieee transactions on ultrasonics, ferroelectrics, and frequency control, 48(1), 85-92. |
Chang Suk Lee et al., An electrically switchable visible to infra-red dual frequency cholesteric liquid crystal light shutter. J. Mater. Chem. C, 2018, 6, 4243 (7 pages). |
Der et al., Inverse kinematics for reduced deformable models. ACM Transactions on graphics (TOG) 25, No. 3 (2006): 1174-1179. (Year: 2006). |
DeSilets, C. S. (1978). Transducer arrays suitable for acoustic imaging (No. GL-2833). Stanford Univ CA Edward L Ginzton Lab of Physics. |
Duka, “Neural network based inverse kinematics solution for trajectory tracking of a robotic arm.” Procedia Technology 12 (2014) 20-27. (Year: 2014). |
Henneberg, J., Gerlach, A., Storck, H., Cebulla, H., & Marburg, S. (2018). Reducing mechanical cross-coupling in phased array transducers using stop band material as backing. Journal of Sound and Vibration, 424, 352-364. |
https://radiopaedia.org/articles/physical-principles-of-ultrasound-1?lang=GB (Accessed May 29, 2022). |
Office Action (Non-Final Rejection) dated May 25, 2022 for U.S. Appl. No. 16/843,281 (pp. 1-28). |
Office Action (Non-Final Rejection) dated Jun. 9, 2022 for U.S. Appl. No. 17/080,840 (pp. 1-9). |
Office Action (Non-Final Rejection) dated Jun. 27, 2022 for U.S. Appl. No. 16/198,959 (pp. 1-17). |
Office Action (Non-Final Rejection) dated Jun. 27, 2022 for U.S. Appl. No. 16/734,479 (pp. 1-13). |
Oikonomidis et al., “Efficient model-based 3D tracking of hand articulations using Kinect.” In BmVC, vol. 1, No. 2, p. 3. 2011. (Year: 2011). |
Patricio Rodrigues, E., Francisco de Oliveira, T., Yassunori Matuda, M., & Buiochi, F. (Sep. 2019). Design and Construction of a 2-D Phased Array Ultrasonic Transducer for Coupling in Water. In INTER-NOISE and NOISE-CON Congress and Conference Proceedings (vol. 259, No. 4, pp. 5720-5731). Institute of Noise Control Engineering. |
Seo et al., “Improved numerical inverse kinematics for human pose estimation,” Opt. Eng. 50(3 037001 (Mar. 1, 2011) https:// doi.org/10.1117/1.3549255 (Year: 2011). |
Walter, S., Nieweglowski, K., Rebenklau, L., Wolter, K. J., Lamek, B., Schubert, F., . . . & Meyendorf, N. (May 2008). Manufacturing and electrical interconnection of piezoelectric 1-3 composite materials for phased array ultrasonic transducers. In 2008 31st International Spring Seminar on Electronics Technology (pp. 255-260). |
Wang et al., Few-shot adaptive faster r-cnn. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 7173-7182. 2019. (Year: 2019). |
Al-Mashhadany, “Inverse Kinematics Problem (IKP) of 6-DOF Manipulator By Locally Recurrent Neural Networks (LRNNs),” Management and Service Science (MASS), International Conference on Management and Service Science., IEEE, Aug. 24, 2010, 5 pages. (Year: 2010). |
Guez, “Solution to the inverse kinematic problem in robotics by neural networks.” In Proceedings of the 2nd International Conference on Neural Networks, 1988. San Diego, California. (Year: 1988) 8 pages. |
Invitation to Pay Additional Fees for PCT/GB2022/051821 (dated Oct. 20, 2022), 15 pages. |
Mahboob, “Artificial neural networks for learning inverse kinematics of humanoid robot arms.” MS Thesis, 2015. (Year: 2015) 95 pages. |
Office Action (Ex Parte Quayle Action) dated Jan. 6, 2023 for U.S. Appl. No. 17/195,795 (pp. 1-6). |
Office Action (Final Rejection) dated Jan. 9, 2023 for U.S. Appl. No. 16/144,474 (pp. 1-16). |
Office Action (Final Rejection) dated Nov. 18, 2022 for U.S. Appl. No. 16/228,767 (pp. 1-27). |
Office Action (Final Rejection) dated Nov. 18, 2022 for U.S. Appl. No. 17/068,831 (pp. 1-9). |
Office Action (Final Rejection) dated Dec. 8, 2022 for U.S. Appl. No. 16/229,091 (pp. 1-9). |
Office Action (Final Rejection) dated Dec. 15, 2022 for U.S. Appl. No. 16/843,281 (pp. 1-25). |
Office Action (Non-Final Rejection) dated Oct. 17, 2022 for U.S. Appl. No. 17/807,730 (pp. 1-8). |
Office Action (Non-Final Rejection) dated Nov. 9, 2022 for U.S. Appl. No. 17/454,823 (pp. 1-16). |
Office Action (Non-Final Rejection) dated Nov. 16, 2022 for U.S. Appl. No. 17/134,505 (pp. 1-7). |
Office Action (Non-Final Rejection) dated Nov. 16, 2022 for U.S. Appl. No. 17/692,852 (pp. 1-4). |
Office Action (Non-Final Rejection) dated Dec. 6, 2022 for U.S. Appl. No. 17/409,783 (pp. 1-7). |
Office Action (Non-Final Rejection) dated Dec. 22, 2022 for U.S. Appl. No. 17/457,663 (pp. 1-20). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Oct. 31, 2022 for U.S. Appl. No. 17/068,834 (pp. 1-2). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Oct. 31, 2022 for U.S. Appl. No. 17/176,899 (pp. 1-2). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Nov. 1, 2022 for U.S. Appl. No. 16/404,660 (pp. 1-5). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Nov. 2, 2022 for U.S. Appl. No. 16/734,479 (pp. 1-2). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Nov. 10, 2022 for U.S. Appl. No. 16/198,959 (pp. 1-2). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Nov. 16, 2022 for U.S. Appl. No. 16/404,660 (pp. 1-2). |
“Welcome to Project Soli” video, https://atap.google.com/#project-soli Accessed Nov. 30, 2018, 2 pages. |
A. B. Vallbo, Receptive field characteristics of tactile units with myelinated afferents in hairy skin of human subjects, Journal of Physiology (1995), 483.3, pp. 783-795. |
A. Sand, Head-Mounted Display with Mid-Air Tactile Feedback, Proceedings of the 21st ACM Symposium on Virtual Reality Software and Technology, Nov. 13-15, 2015 (8 pages). |
Alexander, J. et al. (2011), Adding Haptic Feedback to Mobile TV (6 pages). |
Amanda Zimmerman, The gentle touch receptors of mammalian skin, Science, Nov. 21, 2014, vol. 346 Issue 6212, p. 950. |
Aoki et al., Sound location of stero reproduction with parametric loudspeakers, Applied Acoustics 73 (2012) 1289-1295 (7 pages). |
Ashish Shrivastava et al., Learning from Simulated and Unsupervised Images through Adversarial Training, Jul. 19, 2017, pp. 1-16. |
Bajard et al., Bkm: A New Hardware Algorithm for Complex Elementary Functions, 8092 IEEE Transactions on Computers 43 (1994) (9 pages). |
Bajard et al., Evaluation of Complex Elementary Functions / A New Version of BKM, SPIE Conference on Advanced Signal Processing, Jul. 1999 (8 pages). |
Benjamin Long et al., “Rendering volumetric haptic shapes in mid-air using ultrasound”, Acm Transactions On Graphics (TPG), ACM, US, (20141119), vol. 33, No. 6, ISSN 0730-0301, pp. 1-10. |
Bortoff et al., Pseudolinearization of the Acrobot using Spline Functions, IEEE Proceedings of the 31st Conference on Decision and Control, Sep. 10, 1992 (6 pages). |
Bożena Smagowska & Małgorzata Pawlaczyk-Łuszczyńska (2013) Effects of Ultrasonic Noise on the Human Body—A Bibliographic Review, International Journal of Occupational Safety and Ergonomics, 19:2, 195-202. |
Brian Kappus and Ben Long, Spatiotemporal Modulation for Mid-Air Haptic Feedback from an Ultrasonic Phased Array, ICSV25, Hiroshima, Jul. 8-12, 2018, 6 pages. |
Canada Application 2,909,804 Office Action dated Oct. 18, 2019, 4 pages. |
Casper et al., Realtime Control of Multiple-focus Phased Array Heating Patterns Based on Noninvasive Ultrasound Thermography, IEEE Trans Biomed Eng. Jan. 2012; 59(1): 95-105. |
Christoper M. Bishop, Pattern Recognition and Machine Learning, 2006, pp. 1-758. |
Colgan, A., “How Does the Leap Motion Controller Work?” Leap Motion, Aug. 9, 2014, 10 pages. |
Corrected Notice of Allowability dated Aug. 9, 2021 for U.S. Appl. No. 15/396,851 (pp. 1-6). |
Corrected Notice of Allowability dated Jan. 14, 2021 for U.S. Appl. No. 15/897,804 (pp. 1-2). |
Corrected Notice of Allowability dated Jun. 21, 2019 for U.S. Appl. No. 15/966,213 (2 pages). |
Corrected Notice of Allowability dated Oct. 31, 2019 for U.S. Appl. No. 15/623,516 (pp. 1-2). |
Damn Geeky, “Virtual projection keyboard technology with haptic feedback on palm of your hand,” May 30, 2013, 4 pages. |
David Joseph Tan et al., Fits like a Glove: Rapid and Reliable Hand Shape Personalization, 2016 IEEE Conference on Computer Vision and Pattern Recognition, pp. 5610-5619. |
Definition of “Interferometry”according to Wikipedia, 25 pages., Retrieved Nov. 2018. |
Definition of “Multilateration” according to Wikipedia, 7 pages., Retrieved Nov. 2018. |
Definition of “Trilateration”according to Wikipedia, 2 pages., Retrieved Nov. 2018. |
Diederik P. Kingma et al., Adam: A Method for Stochastic Optimization, Jan. 30, 2017, pp. 1-15. |
E. Bok, Metasurface for Water-to-Air Sound Transmission, Physical Review Letters 120, 044302 (2018) (6 pages). |
E.S. Ebbini et al. (1991), A spherical-section ultrasound phased array applicator for deep localized hyperthermia, Biomedical Engineering, IEEE Transactions on (vol. 38 Issue: 7), pp. 634-643. |
EPO Office Action for EP16708440.9 dated Sep. 12, 2018 (7 pages). |
EPSRC Grant summary EP/J004448/1 (2011) (1 page). |
Eric Tzeng et al., Adversarial Discriminative Domain Adaptation, Feb. 17, 2017, pp. 1-10. |
European Office Action for Application No. EP16750992.6, dated Oct. 2, 2019, 3 pages. |
Ex Parte Quayle Action dated Dec. 28, 2018 for U.S. Appl. No. 15/966,213 (pp. 1-7). |
Extended European Search Report for Application No. EP19169929.7, dated Aug. 6, 2019, 7 pages. |
Freeman et al., Tactile Feedback for Above-Device Gesture Interfaces: Adding Touch to Touchless Interactions ICMI'14, Nov. 12-16, 2014, Istanbul, Turkey (8 pages). |
Gavrilov L R et al.(2000) “A theoretical assessment of the relative performance of spherical phased arrays for ultrasound surgery” Ultrasonics, Ferroelectrics, and Frequency Control, IEEE Transactions on (vol. 47, Issue: 1), pp. 125-139. |
Gavrilov, L.R. (2008) “The Possibility of Generating Focal Regions of Complex Configurations in Application to the Problems of Stimulation of Human Receptor Structures by Focused Ultrasound” Acoustical Physics, vol. 54, No. 2, pp. 269-278. |
Georgiou et al., Haptic In-Vehicle Gesture Controls, Adjunct Proceedings of the 9th International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '17), Sep. 24-27, 2017 (6 pages). |
GitHub—danfis/libccd: Library for collision detection between two convex shapes, Mar. 26, 2020, pp. 1-6. |
GitHub—IntelRealSense/hand_tracking_samples: researc codebase for depth-based hand pose estimation using dynamics based tracking and CNNs, Mar. 26, 2020, 3 pages. |
Gokturk, et al., “A Time-of-Flight Depth Sensor-System Description, Issues and Solutions,” Published in: 2004 Conference on Computer Vision and Pattern Recognition Workshop, Date of Conference: Jun. 27-Jul. 2, 2004, 9 pages. |
Hasegawa, K. and Shinoda, H. (2013) “Aerial Display of Vibrotactile Sensation with High Spatial-Temporal Resolution using Large Aperture Airbourne Ultrasound Phased Array”, University of Tokyo (6 pages). |
Henrik Bruus, Acoustofluidics 2: Perturbation theory and ultrasound resonance modes, Lab Chip, 2012, 12, 20-28. |
Hilleges et al. Interactions in the air: adding further depth to interactive tabletops, UIST '09: Proceedings of the 22nd annual ACM symposium on User interface software and technologyOct. 2009 pp. 139-148. |
Hoshi et al., Tactile Presentation by Airborne Ultrasonic Oscillator Array, Proceedings of Robotics and Mechatronics Lecture 2009, Japan Society of Mechanical Engineers; May 24, 2009 (5 pages). |
Hoshi T et al., “Noncontact Tactile Display Based on Radiation Pressure of Airborne Ultrasound”, IEEE Transactions On Haptics, IEEE, USA, (Jul. 1, 2010), vol. 3, No. 3, ISSN 1939-1412, pp. 155-165. |
Hoshi, T., Development of Aerial-Input and Aerial-Tactile-Feedback System, IEEE World Haptics Conference 2011, p. 569-573. |
Hoshi, T., Handwriting Transmission System Using Noncontact Tactile Display, IEEE Haptics Symposium 2012 pp. 399-401. |
Hoshi, T., Non-contact Tactile Sensation Synthesized by Ultrasound Transducers, Third Joint Euro haptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems 2009 (5 pages). |
Hoshi, T., Touchable Holography, SIGGRAPH 2009, New Orleans, Louisiana, Aug. 3-7, 2009. (1 page). |
Hua J, Qin H., Haptics-based dynamic implicit solid modeling, IEEE Trans Vis Comput Graph. Sep.-Oct. 2004;10(5):574-86. |
Hyunjae Gil, Whiskers: Exploring the Use of Ultrasonic Haptic Cues on the Face, CHI 2018, Apr. 21- 26, 2018, Montréal, QC, Canada. |
Iddan, et al., “3D Imaging in the Studio (And Elsewhwere . . . ” Apr. 2001, 3DV systems Ltd., Yokneam, Isreal, www.3dvsystems.com.il, 9 pages. |
Imaginary Phone: Learning Imaginary Interfaces By Transferring Spatial Memory From a Familiar Device Sean Gustafson, Christian Holz and Patrick Baudisch. UIST 2011. (10 pages). |
India Morrison, The skin as a social organ, Exp Brain Res (2010) 204:305-314. |
International Preliminary Report on Patentability and Written Opinion issued in corresponding PCT/US2017/035009, dated Dec. 4, 2018, 8 pages. |
International Preliminary Report on Patentability for Application No. PCT/EP2017/069569 dated Feb. 5, 2019, 11 pages. |
International Search Report and Written Opinion for Application No. PCT/GB2018/053738, dated Apr. 11, 2019, 14 pages. |
International Search Report and Written Opinion for Application No. PCT/GB2018/053739, dated Jun. 4, 2019, 16 pages. |
International Search Report and Written Opinion for Application No. PCT/GB2019/050969, dated Jun. 13, 2019, 15 pages. |
International Search Report and Written Opinion for Application No. PCT/GB2019/051223, dated Aug. 8, 2019, 15 pages. |
International Search Report and Written Opinion for Application No. PCT/GB2019/052510, dated Jan. 14, 2020, 25 pages. |
ISR & WO for PCT/GB2020/052545 (dated Jan. 27, 2021) 14 pages. |
ISR and WO for PCT/GB2020/050013 (dated Jul. 13, 2020) (20 pages). |
ISR and WO for PCT/GB2020/050926 (dated Jun. 2, 2020) (16 pages). |
ISR and WO for PCT/GB2020/052544 (dated Dec. 18, 2020) (14 pages). |
ISR for PCT/GB2020/052546 (dated Feb. 23, 2021) (14 pages). |
ISR for PCT/GB2020/053373 (dated Mar. 26, 2021) (16 pages). |
Iwamoto et al. (2008), Non-contact Method for Producing Tactile Sensation Using Airborne Ultrasound, EuroHaptics, pp. 504-513. |
Iwamoto et al., Airborne Ultrasound Tactile Display: Supplement, The University of Tokyo 2008 (2 pages). |
Iwamoto T et al., “Two-dimensional Scanning Tactile Display using Ultrasound Radiation Pressure”, Haptic Interfaces for Virtual Environment and Teleoperator Systems, 2006 14th Symposium on, Alexandria, VA, USA, Mar. 25-26, 2006, Piscataway, NJ, USA, IEEE, ISBN 978-1-4244-0226-7, pp. 57-61. |
Jager et al., “Air-Coupled 40-kHz Ultrasonic 2D-Phased Array Based on a 3D-Printed Waveguide Structure”, 2017 IEEE, 4 pages. |
Japanese Office Action (with English language translation) for Application No. 2017-514569, dated Mar. 31, 2019, 10 pages. |
Jonas Chatel-Goldman, Touch increases autonomic coupling between romantic partners, Frontiers in Behavioral Neuroscience Mar. 2014, vol. 8, Article 95. |
Jonathan Taylor et al., Articulated Distance Fields for Ultra-Fast Tracking of Hands Interacting, ACM Transactions on Graphics, vol. 36, No. 4, Article 244, Publication Date: Nov. 2017, pp. 1-12. |
Jonathan Taylor et al., Efficient and Precise Interactive Hand Tracking Through Joint, Continuous Optimization of Pose and Correspondences, SIGGRAPH '16 Technical Paper, Jul. 24-28, 2016, Anaheim, CA, ISBN: 978-1-4503-4279-87/16/07, pp. 1-12. |
Jonathan Tompson et al., Real-Time Continuous Pose Recovery of Human Hands Using Convolutional Networks, ACM Trans. Graph. 33, 5, Article 169, Aug. 2014, pp. 1-10. |
K. Jia, Dynamic properties of micro-particles in ultrasonic transportation using phase-controlled standing waves, J. Applied Physics 116, n. 16 (2014) (12 pages). |
Kai Tsumoto, Presentation of Tactile Pleasantness Using Airborne Ultrasound, 2021 IEEE World Haptics Conference (WHC) Jul. 6-9, 2021. Montreal, Canada. |
Kaiming He et al., Deep Residual Learning for Image Recognition, http://image-net.org/challenges/LSVRC/2015/ and http://mscoco.org/dataset/#detections-challenge2015, Dec. 10, 2015, pp. 1-12. |
Kamakura, T. and Aoki, K. (2006) “A Highly Directional Audio System using a Parametric Array in Air” WESPAC IX 2006 (8 pages). |
Keisuke Hasegawa, Electronically steerable ultrasound-driven long narrow air stream, Applied Physics Letters 111, 064104 (2017). |
Keisuke Hasegawa, Midair Ultrasound Fragrance Rendering, IEEE Transactions On Visualization and Computer Graphics, vol. 24, No. 4, Apr. 2018 1477. |
Keisuke Hasegawa, Curved acceleration path of ultrasound-driven air flow, J. Appl. Phys. 125, 054902 (2019). |
Kolb, et al., “Time-of-Flight Cameras in Computer Graphics,” Computer Graphics forum, vol. 29 (2010), No. 1, pp. 141-159. |
Konstantinos Bousmalis et al., Domain Separation Networks, 29th Conference on Neural Information Processing Systems (NIPS 2016), Barcelona, Spain. Aug. 22, 2016, pp. 1-15. |
Krim, et al., “Two Decades of Array Signal Processing Research—The Parametric Approach”, IEEE Signal Processing Magazine, Jul. 1996, pp. 67-94. |
Lang, Robert, “3D Time-of-Flight Distance Measurement with Custom Solid-State Image Sensors in CMOS/CCD-Technology”, A dissertation submitted to Department of EE and CS at Univ. of Siegen, dated Jun. 28, 2000, 223 pages. |
Large et al., Feel the noise: Mid-air ultrasound haptics as a novel human-vehicle interaction paradigm, Applied Ergonomics (2019) (10 pages). |
Li, Larry, “Time-of-Flight Camera—An Introduction,” Texas Instruments, Technical White Paper, SLOA190B—Jan. 2014 Revised May 2014, 10 pages. |
Light, E.D., Progress in Two Dimensional Arrays for Real Time Volumetric Imaging, 1998 (17 pages). |
Line S. Loken, Coding of pleasant touch by unmyelinated afferents in humans, Nature Neuroscience, vol. 12, No. 5, May 2009, p. 547. |
M. Barmatz et al, “Acoustic radiation potential on a sphere in plane, cylindrical, and spherical standing wave fields”, The Journal of the Acoustical Society of America, New York, NY, US, (Mar. 1, 1985), vol. 77, No. 3, pp. 928-945, XP055389249. |
M. Toda, New Type of Matching Layer for Air-Coupled Ultrasonic Transducers, IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, vol. 49, No. 7, Jul. 2002 (8 pages). |
Mahdi Rad et al., Feature Mapping for Learning Fast and Accurate 3D Pose Inference from Synthetic Images, Mar. 26, 2018, pp. 1-14. |
Marco A B Andrade et al., “Matrix method for acoustic levitation simulation”, IEEE Transactions On Ultrasonics, Ferroelectrics and Frequency Control, IEEE, US, (Aug. 1, 2011), vol. 58, No. 8, ISSN 0885-3010, pp. 1674-1683. |
Mariana von Mohr, The soothing function of touch: affective touch reduces feelings of social exclusion, Scientific Reports, 7: 13516, Oct. 18, 2017. |
Marin, About LibHand, LibHand—A Hand Articulation Library, www.libhand.org/index.html, Mar. 26, 2020, pp. 1-2; www.libhand.org/download.html, 1 page; www.libhand.org/examples.html, pp. 1-2. |
Markus Oberweger et al., DeepPrior++: Improving Fast and Accurate 3D Hand Pose Estimation, Aug. 28, 2017, pp. 1-10. |
Markus Oberweger et al., Hands Deep in Deep Learning for Hand Pose Estimation, Dec. 2, 2016, pp. 1-10. |
Marshall, M., Carter, T., Alexander, J., & Subramanian, S. (2012). Ultratangibles: creating movable tangible objects on interactive tables. In Proceedings of the 2012 ACM annual conference on Human Factors in Computing Systems, (pp. 2185-2188). |
Marzo et al., Holographic acoustic elements for manipulation of levitated objects, Nature Communications Doi: 10.1038/ncomms9661 (2015) (7 pages). |
Meijster, A., et al., “A General Algorithm for Computing Distance Transforms in Linear Time,” Mathematical Morphology and its Applications to Image and Signal Processing, 2002, pp. 331-340. |
Mingzhu Lu et al. (2006) Design and experiment of 256-element ultrasound phased array for noninvasive focused ultrasound surgery, Ultrasonics, vol. 44, Supplement, Dec. 22, 2006, pp. e325-e330. |
Mitsuru Nakajima, Remotely Displaying Cooling Sensation via Ultrasound-Driven Air Flow, Haptics Symposium 2018, San Francisco, USA, p. 340. |
Mohamed Yacine Tsalamlal, Affective Communication through Air Jet Stimulation: Evidence from Event-Related Potentials, International Journal of Human-Computer Interaction 2018. |
Mueller, GANerated Hands for Real-Time 3D Hand Tracking from Monocular RGB, Eye In-Painting with Exemplar Generative Adversarial Networks, pp. 49-59 (Jun. 1, 2018). |
Nina Gaissert, Christian Wallraven, and Heinrich H. Bulthoff, “Visual and Haptic Perceptual Spaces Show High Similarity in Humans”, published to Journal of Vision in 2010, available at http://www.journalofvision.org/content/10/11/2 and retrieved on Apr. 22, 2020 (Year: 2010), 20 pages. |
Notice of Allowance dated Apr. 20, 2021 for U.S. Appl. No. 16/563,608 (pp. 1-5). |
Notice of Allowance dated Apr. 22, 2020 for U.S. Appl. No. 15/671,107 (pp. 1-5). |
Notice of Allowance dated Dec. 19, 2018 for U.S. Appl. No. 15/665,629 (pp. 1-9). |
Notice of Allowance dated Dec. 21, 2018 for U.S. Appl. No. 15/983,864 (pp. 1-7). |
Notice of Allowance dated Feb. 10, 2020, for U.S. Appl. No. 16/160,862 (pp. 1-9). |
Notice of Allowance dated Feb. 7, 2019 for U.S. Appl. No. 15/851,214 (pp. 1-7). |
Notice of Allowance dated Jul. 22, 2021 for U.S. Appl. No. 16/600,500 (pp. 1-9). |
Notice of Allowance dated Jul. 31, 2019 for U.S. Appl. No. 15/851,214 (pp. 1-9). |
Notice of Allowance dated Jul. 31, 2019 for U.S. Appl. No. 16/296,127 (pp. 1-9). |
Notice of Allowance dated Jun. 10, 2021 for U.S. Appl. No. 17/092,333 (pp. 1-9). |
Notice of Allowance dated Jun. 17, 2020 for U.S. Appl. No. 15/210,661 (pp. 1-9). |
Notice of Allowance dated Jun. 25, 2021 for U.S. Appl. No. 15/396,851 (pp. 1-10). |
Notice of Allowance dated May 30, 2019 for U.S. Appl. No. 15/966,213 (pp. 1-9). |
Notice of Allowance dated Oct. 1, 2020 for U.S. Appl. No. 15/897,804 (pp. 1-9). |
Notice of Allowance dated Oct. 16, 2020 for U.S. Appl. No. 16/159,695 (pp. 1-7). |
Notice of Allowance dated Oct. 30, 2020 for U.S. Appl. No. 15/839,184 (pp. 1-9). |
Notice of Allowance dated Oct. 6, 2020 for U.S. Appl. No. 16/699,629 (pp. 1-8). |
Notice of Allowance dated Sep. 30, 2020 for U.S. Appl. No. 16/401,148 (pp. 1-10). |
Notice of Allowance in U.S. Appl. No. 15/210,661 dated Jun. 17, 2020 (22 pages). |
Obrist et al., Emotions Mediated Through Mid-Air Haptics, CHI 2015, Apr. 18-23, 2015, Seoul, Republic of Korea. (10 pages). |
Obrist et al., Talking about Tactile Experiences, CHI 2013, Apr. 27-May 2, 2013 (10 pages). |
Office Action dated Apr. 8, 2020, for U.S. Appl. No. 16/198,959 (pp. 1-17). |
Office Action dated Apr. 16, 2020 for U.S. Appl. No. 15/839,184 (pp. 1-8). |
Office Action dated Apr. 17, 2020 for U.S. Appl. No. 16/401,148 (pp. 1-15). |
Office Action dated Apr. 18, 2019 for U.S. Appl. No. 16/296,127 (pp. 1-6). |
Office Action dated Apr. 28, 2020 for U.S. Appl. No. 15/396,851 (pp. 1-12). |
Office Action dated Apr. 29, 2020 for U.S. Appl. No. 16/374,301 (pp. 1-18). |
Office Action dated Apr. 4, 2019 for U.S. Appl. No. 15/897,804 (pp. 1-10). |
Office Action dated Aug. 10, 2021 for U.S. Appl. No. 16/564,016 (pp. 1-14). |
Office Action dated Aug. 19, 2021 for U.S. Appl. No. 17/170,841 (pp. 1-9). |
Office Action dated Aug. 22, 2019 for U.S. Appl. No. 16/160,862 (pp. 1-5). |
Office Action dated Aug. 9, 2021 for U.S. Appl. No. 17/068,825 (pp. 1-9). |
Office Action dated Dec. 11, 2019 for U.S. Appl. No. 15/959,266 (pp. 1-15). |
Office Action dated Dec. 7, 2020 for U.S. Appl. No. 16/563,608 (pp. 1-8). |
Office Action dated Feb. 20, 2019 for U.S. Appl. No. 15/623,516 (pp. 1-8). |
Office Action dated Feb. 25, 2020 for U.S. Appl. No. 15/960,113 (pp. 1-7). |
Office Action dated Feb. 7, 2020 for U.S. Appl. No. 16/159,695 (pp. 1-8). |
Office Action dated Jan. 10, 2020 for U.S. Appl. No. 16/228,767 (pp. 1-6). |
Office Action dated Jan. 29, 2020 for U.S. Appl. No. 16/198,959 (pp. 1-6). |
Office Action dated Jul. 10, 2019 for U.S. Appl. No. 15/210,661 (pp. 1-12). |
Office Action dated Jul. 26, 2019 for U.S. Appl. No. 16/159,695 (pp. 1-8). |
Office Action dated Jul. 9, 2020 for U.S. Appl. No. 16/228,760 (pp. 1-17). |
Office Action dated Jun. 19, 2020 for U.S. Appl. No. 16/699,629 (pp. 1-12). |
Office Action dated Jun. 25, 2020 for U.S. Appl. No. 16/228,767 (pp. 1-27). |
Office Action dated Jun. 25, 2021 for U.S. Appl. No. 16/899,720 (pp. 1-5). |
Office Action dated Mar. 11, 2021 for U.S. Appl. No. 16/228,767 (pp. 1-23). |
Office Action dated Mar. 20, 2020 for U.S. Appl. No. 15/210,661 (pp. 1-10). |
Office Action dated Mar. 31, 2021 for U.S. Appl. No. 16/228,760 (pp. 1-21). |
Office Action dated May 13, 2021 for U.S. Appl. No. 16/600,500 (pp. 1-9). |
Office Action dated May 14, 2021 for U.S. Appl. No. 16/198,959 (pp. 1-6). |
Office Action dated May 16, 2019 for U.S. Appl. No. 15/396,851 (pp. 1-7). |
Office Action dated May 18, 2020 for U.S. Appl. No. 15/960,113 (pp. 1-21). |
Office Action dated Oct. 17, 2019 for U.S. Appl. No. 15/897,804 (pp. 1-10). |
Office Action dated Oct. 31, 2019 for U.S. Appl. No. 15/671,107 (pp. 1-6). |
Office Action dated Oct. 7, 2019 for U.S. Appl. No. 15/396,851 (pp. 1-9). |
Office Action dated Sep. 16, 2021 for U.S. Appl. No. 16/600,496 (pp. 1-8). |
Office Action dated Sep. 18, 2020 for U.S. Appl. No. 15/396,851 (pp. 1-14). |
Office Action dated Sep. 21, 2020 for U.S. Appl. No. 16/198,959 (pp. 1-17). |
Office Action dated Sep. 24, 2021 for U.S. Appl. No. 17/080,840 (pp. 1-9). |
OGRECave/ogre—GitHub: ogre/Samples/Media/materials at 7de80a7483f20b50f2b10d7ac6de9d9c6c87d364, Mar. 26, 2020, 1 page. |
Optimal regularisation for acoustic source reconstruction by inverse methods, Y. Kim, P.A. Nelson, Institute of Sound and Vibration Research, University of Southampton, Southampton SO17 1BJ, UK, received Feb. 25, 2003; 25 pages. |
Oscar Martínez-Graullera et al., “2D array design based on Fermat spiral for ultrasound imaging”, Ultrasonics, (Feb. 1, 2010), vol. 50, No. 2, ISSN 0041-624X, pp. 280-289, XP055210119. |
Partial International Search Report for Application No. PCT/GB2018/053735, dated Apr. 12, 2019, 14 pages. |
Partial ISR for Application No. PCT/GB2020/050013 dated May 19, 2020 (16 pages). |
PCT Partial International Search Report for Application No. PCT/GB2018/053404 dated Feb. 25, 2019, 13 pages. |
Péter Tamás Kovács et al., “Tangible Holographic 3D Objects with Virtual Touch”, Interactive Tabletops & Surfaces, ACM, 2 Penn Plaza, Suite 701 New York NY 10121-0701 USA, (Nov. 15, 2015), ISBN 978-1-4503-3899-8, pp. 319-324. |
Phys.org, Touchable Hologram Becomes Reality, Aug. 6, 2009, by Lisa Zyga (2 pages). |
Pompei, F.J. (2002), “Sound from Ultrasound: The Parametric Array as an Audible Sound Source”, Massachusetts Institute of Technology (132 pages). |
Rocchesso et al., Accessing and Selecting Menu Items by In-Air Touch, ACM CHItaly'19, Sep. 23-25, 2019, Padova, Italy (9 pages). |
Rochelle Ackerley, Human C-Tactile Afferents Are Tuned to the Temperature of a Skin-Stroking Caress, J. Neurosci., Feb. 19, 2014, 34(8):2879-2883. |
Ryoko Takahashi, Tactile Stimulation by Repetitive Lateral Movement of Midair Ultrasound Focus, Journal of Latex Class Files, vol. 14, No. 8, Aug. 2015. |
Schmidt, Ralph, “Multiple Emitter Location and Signal Parameter Estimation”, IEEE Transactions on Antennas and Propagation, vol. AP-34, No. 3, Mar. 1986, pp. 276-280. |
Sean Gustafson et al., “Imaginary Phone”, Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology: Oct. 16-19, 2011, Santa Barbara, CA, USA, ACM, New York, NY, Oct. 16, 2011, pp. 283-292, XP058006125, DOI: 10.1145/2047196.2047233, ISBN: 978-1-4503-0716-1. |
Search report and Written Opinion of ISA for PCT/GB2015/050417 dated Jul. 8, 2016 (20 pages). |
Search report and Written Opinion of ISA for PCT/GB2015/050421 dated Jul. 8, 2016 (15 pages). |
Search report and Written Opinion of ISA for PCT/GB2017/050012 dated Jun. 8, 2017. (18 pages). |
Search Report by EPO for EP 17748466 dated Jan. 13, 2021 (16 pages). |
Search Report for GB1308274.8 dated Nov. 11, 2013. (2 pages). |
Search Report for GB1415923.0 dated Mar. 11, 2015. (1 page). |
Search Report for PCT/GB/2017/053729 dated Mar. 15, 2018 (16 pages). |
Search Report for PCT/GB/2017/053880 dated Mar. 21, 2018. (13 pages). |
Search report for PCT/GB2014/051319 dated Dec. 8, 2014 (4 pages). |
Search report for PCT/GB2015/052507 dated Mar. 11, 2020 (19 pages). |
Search report for PCT/GB2015/052578 dated Oct. 26, 2015 (12 pages). |
Search report for PCT/GB2015/052916 dated Feb. 26, 2020 (18 pages). |
Search Report for PCT/GB2017/052332 dated Oct. 10, 2017 (12 pages). |
Search report for PCT/GB2018/051061 dated Sep. 26, 2018 (17 pages). |
Search report for PCT/US2018/028966 dated Jul. 13, 2018 (43 pages). |
Sergey Ioffe et al., Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift, Mar. 2, 2015, pp. 1-11. |
Seungryul, Pushing the Envelope for RGB-based Dense 3D Hand Pose Estimation via Neural Rendering, arXiv:1904.04196v2 [cs.CV] Apr. 9, 2019 (5 pages). |
Shakeri, G., Williamson, J. H. and Brewster, S. (2018) May the Force Be with You: Ultrasound Haptic Feedback for Mid-Air Gesture Interaction in Cars. In: 10th International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI 2018) (11 pages). |
Shanxin Yuan et al., BigHand2.2M Benchmark: Hand Pose Dataset and State of the Art Analysis, Dec. 9, 2017, pp. 1-9. |
Shome Subhra Das, Detection of Self Intersection in Synthetic Hand Pose Generators, 2017 Fifteenth IAPR International Conference on Machine Vision Applications (MVA), Nagoya University, Nagoya, Japan, May 8-12, 2017, pp. 354-357. |
Sixth Sense webpage, http://www.pranavmistry.com/projects/sixthsense/ Accessed Nov. 30, 2018, 7 pages. |
Stan Melax et al., Dynamics Based 3D Skeletal Hand Tracking, May 22, 2017, pp. 1-8. |
Stanley J. Bolanowski, Hairy Skin: Psychophysical Channels and Their Physiological Substrates, Somatosensory and Motor Research, vol. 11, No. 3, 1994, pp. 279-290. |
Stefan G. Lechner, Hairy Sensation, PHYSIOLOGY 28: 142-150, 2013. |
Steve Guest et al., “Audiotactile interactions in roughness perception”, Exp. Brain Res (2002) 146:161-171, DOI 10.1007/s00221-002-1164-z, Received: Feb. 9, 2002/Accepted: May 16, 2002/ Published online: Jul. 26, 2002, Springer-Verlag 2002, (11 pages). |
Supplemental Notice of Allowability dated Jul. 28, 2021 for U.S. Appl. No. 16/563,608 (pp. 1-2). |
Supplemental Notice of Allowability dated Jul. 28, 2021 for U.S. Appl. No. 17/092,333 (pp. 1-2). |
Sylvia Gebhardt, Ultrasonic Transducer Arrays for Particle Manipulation (date unknown) (2 pages). |
Takaaki Kamigaki, Noncontact Thermal and Vibrotactile Display Using Focused Airborne Ultrasound, EuroHaptics 2020, LNCS 12272, pp. 271-278, 2020. |
Takahashi, Dean: “Ultrahaptics shows off sense of touch in virtual reality”, Dec. 10, 2016, XP055556416, Retrieved from the Internet: URL: https://venturebeat.com/2016/12/10/ultrahaptics-shows-off-sense-of-touch-in-virtual-reality/ [retrieved on Feb. 13, 2019] 4 pages. |
Takahashi, M. et al., Large Aperture Airborne Ultrasound Tactile Display Using Distributed Array Units, SICE Annual Conference 2010, pp. 359-362. |
Takayuki et al., “Noncontact Tactile Display Based on Radiation Pressure of Airborne Ultrasound” IEEE Transactions on Haptics vol. 3, No. 3, p. 165 (2010). |
Teixeira, et al., “A brief introduction to Microsoft's Kinect Sensor,” Kinect, 26 pages, retrieved Nov. 2018. |
Toby Sharp et al., Accurate, Robust, and Flexible Real-time Hand Tracking, CHI '15, Apr. 18-23, 2015, Seoul, Republic of Korea, ACM 978-1-4503-3145-6/15/04, pp. 1-10. |
Tom Carter et al, “UltraHaptics: Multi-Point Mid-Air Haptic Feedback for Touch Surfaces”, Proceedings of the 26th Annual ACM Symposium On User Interface Software and Technology, UIST '13, New York, New York, USA, (Jan. 1, 2013), ISBN 978-1-45-032268-3, pp. 505-514. |
Tom Nelligan and Dan Kass, Intro to Ultrasonic Phased Array (date unknown) (8 pages). |
Tomoo Kamakura, Acoustic streaming induced in focused Gaussian beams, J. Acoust. Soc. Am. 97 (5), Pt. 1, May 1995 p. 2740. |
Uta Sailer, How Sensory and Affective Attributes Describe Touch Targeting C-Tactile Fibers, Experimental Psychology (2020), 67(4), 224-236. |
Vincent Lepetit et al., Model Based Augmentation and Testing of an Annotated Hand Pose Dataset, ResearchGate, https://www.researchgate.net/publication/307910344, Sep. 2016, 13 pages. |
Wang et al., Device-Free Gesture Tracking Using Acoustic Signals, ACM MobiCom '16, pp. 82-94 (13 pages). |
Wilson et al., Perception of Ultrasonic Haptic Feedback on the Hand: Localisation and Apparent Motion, CHI 2014, Apr. 26-May 1, 2014, Toronto, Ontario, Canada. (10 pages). |
Wooh et al., “Optimum beam steering of linear phased arrays,” Wave Motion 29 (1999) pp. 245-265, 21 pages. |
Xin Cheng et al., “Computation of the acoustic radiation force on a sphere based on the 3-D FDTD method”, Piezoelectricity, Acoustic Waves and Device Applications (SPAWDA), 2010 Symposium On, IEEE, (Dec. 10, 2010), ISBN 978-1-4244-9822-2, pp. 236-239. |
Xu Hongyi et al, “6-DoF Haptic Rendering Using Continuous Collision Detection between Points and Signed Distance Fields”, IEEE Transactions On Haptics, IEEE, USA, vol. 10, No. 2, ISSN 1939-1412, (Sep. 27, 2016), pp. 151-161, (Jun. 16, 2017). |
Yang Ling et al., “Phase-coded approach for controllable generation of acoustical vortices”, Journal of Applied Physics, American Institute of Physics, US, vol. 113, No. 15, ISSN 0021-8979, (Apr. 21, 2013), pp. 154904-154904. |
Yarin Gal et al., Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning, Oct. 4, 2016, pp. 1-12, Proceedings of the 33rd International Conference on Machine Learning, New York, NY, USA, 2016, JMLR: W&CP vol. 48. |
Yaroslav Ganin et al., Domain-Adversarial Training of Neural Networks, Journal of Machine Learning Research 17 (2016) 1-35, submitted May 2015; published Apr. 2016. |
Yaroslav Ganin et al., Unsupervised Domain Adaptation by Backpropagation, Skolkovo Institute of Science and Technology (Skoltech), Moscow Region, Russia, Proceedings of the 32nd International Conference on Machine Learning, Lille, France, 2015, JMLR: W&CP vol. 37, copyright 2015 by the author(s), 11 pages. |
Yoshino, K. and Shinoda, H. (2013), “Visio Acoustic Screen for Contactless Touch Interface with Tactile Sensation”, University of Tokyo (5 pages). |
Zeng, Wenjun, “Microsoft Kinect Sensor and Its Effect,” IEEE Multimedia, Apr.-Jun. 2012, 7 pages. |
ISR & WO for PCT/GB2022/051388 (dated Aug. 30, 2022) (15 pages). |
Office Action (Final Rejection) dated Sep. 16, 2022 for U.S. Appl. No. 16/404,660 (pp. 1-6). |
Office Action (Non-Final Rejection) dated Aug. 29, 2022 for U.S. Appl. No. 16/995,819 (pp. 1-6). |
Office Action (Non-Final Rejection) dated Sep. 21, 2022 for U.S. Appl. No. 17/721,315 (pp. 1-10). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Aug. 24, 2022 for U.S. Appl. No. 16/198,959 (pp. 1-6). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Aug. 31, 2022 for U.S. Appl. No. 16/198,959 (pp. 1-2). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Sep. 7, 2022 for U.S. Appl. No. 17/068,834 (pp. 1-8). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Sep. 8, 2022 for U.S. Appl. No. 17/176,899 (pp. 1-8). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Sep. 12, 2022 for U.S. Appl. No. 16/734,479 (pp. 1-7). |
EPO Examination Search Report 17 702 910.5 (dated Jun. 23, 2021). |
Office Action dated Oct. 29, 2021 for U.S. Appl. No. 16/198,959 (pp. 1-7). |
Notice of Allowance dated Nov. 5, 2021 for U.S. Appl. No. 16/899,720 (pp. 1-9). |
Corrected Notice of Allowability dated Nov. 24, 2021 for U.S. Appl. No. 16/600,500 (pp. 1-5). |
International Search Report and Written Opinion for App. No. PCT/GB2021/051590, dated Nov. 11, 2021, 20 pages. |
Anonymous: “How does Ultrahaptics technology work?—Ultrahaptics Developer Information”, Jul. 31, 2018 (Jul. 31, 2018), XP055839320, Retrieved from the Internet: URL:https://developer.ultrahaptics.com/knowledgebase/haptics-overview/ [retrieved on Sep. 8, 2021]. |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Dec. 14, 2021 for U.S. Appl. No. 17/170,841 (pp. 1-8). |
Office Action (Non-Final Rejection) dated Dec. 20, 2021 for U.S. Appl. No. 17/195,795 (pp. 1-7). |
EPO Application 18 725 358.8 Examination Report dated Sep. 22, 2021. |
EPO 21186570.4 Extended Search Report dated Oct. 29, 2021. |
Cappellari et al., “Identifying Electromyography Sensor Placement using Dense Neural Networks.” In DATA, pp. 130-141. 2018. (Year: 2018). |
ISR and WO for PCT/GB2023/050001 (dated May 24, 2023) (20 pages). |
Montenegro et al., “Neural Network as an Alternative to the Jacobian for Iterative Solution to Inverse Kinematics,” 2018 Latin American Robotic Symposium, 2018 Brazilian Symposium on Robotics (SBR) and 2018 Workshop on Robotics in Education (WRE) João Pessoa, Brazil, 2018, pp. 333-338 (Year: 2018). |
Nuttall, A. (Feb. 1981). Some windows with very good sidelobe behavior. IEEE Transactions on Acoustics, Speech, and Signal Processing. 8 pages. |
Office Action (Ex Parte Quayle Action) dated Jul. 20, 2023 for U.S. Appl. No. 16/843,281 (pp. 1-15). |
Office Action (Ex Parte Quayle Action) dated Sep. 18, 2023 for U.S. Appl. No. 18/066,267 (pp. 1-6). |
Office Action (Final Rejection) dated Jul. 25, 2023 for U.S. Appl. No. 17/454,823 (pp. 1-17). |
Office Action (Final Rejection) dated Aug. 30, 2023 for U.S. Appl. No. 16/564,016 (pp. 1-15). |
Office Action (Non-Final Rejection) dated Sep. 7, 2023 for U.S. Appl. No. 16/144,474 (pp. 1-16). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Jun. 16, 2023 for U.S. Appl. No. 17/354,636 (pp. 1-7). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Jul. 20, 2023 for U.S. Appl. No. 17/692,852 (pp. 1-8). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Aug. 2, 2023 for U.S. Appl. No. 16/843,281 (pp. 1-5). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Aug. 8, 2023 for U.S. Appl. No. 17/645,305 (pp. 1-8). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Sep. 11, 2023 for U.S. Appl. No. 18/065,603 (pp. 1-11). |
Oyama et al., “Inverse kinematics learning for robotic arms with fewer degrees of freedom by modular neural network systems,” 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, Edmonton, Alta., 2005, pp. 1791-1798, doi: 10.1109/IROS.2005.1545084. (Year: 2005). |
Papoulis, A. (1977). Signal Analysis. McGraw-Hill, pp. 92-93. |
Prabhu, K. M. (2013). Window Functions and Their Applications in Signal Processing, CRC Press, pp. 87-127. |
Aksel Sveier et al., Pose Estimation with Dual Quaternions and Iterative Closest Point, 2018 Annual American Control Conference (ACC) (8 pages). |
JP Office Action for JP 2020-534355 (dated Dec. 6, 2022) (8 pages). |
Ken Wada, Ring Buffer Basics (2013) 6 pages. |
Notice of Allowance dated Feb. 23, 2023 for U.S. Appl. No. 18/060,556 (pp. 1-10). |
Office Action (Final Rejection) dated Mar. 21, 2023 for U.S. Appl. No. 16/995,819 (pp. 1-7). |
Office Action (Non-Final Rejection) dated Mar. 1, 2023 for U.S. Appl. No. 16/564,016 (pp. 1-10). |
Office Action (Non-Final Rejection) dated Mar. 22, 2023 for U.S. Appl. No. 17/354,636 (pp. 1-5). |
Office Action (Non-Final Rejection) dated Apr. 19, 2023 for U.S. Appl. No. 18/066,267 (pp. 1-11). |
Office Action (Non-Final Rejection) dated Apr. 27, 2023 for U.S. Appl. No. 16/229,091 (pp. 1-5). |
Office Action (Non-Final Rejection) dated May 8, 2023 for U.S. Appl. No. 18/065,603 (pp. 1-17). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Mar. 8, 2023 for U.S. Appl. No. 17/721,315 (pp. 1-8). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Mar. 15, 2023 for U.S. Appl. No. 17/134,505 (pp. 1-5). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Mar. 24, 2023 for U.S. Appl. No. 17/080,840 (pp. 1-8). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Apr. 4, 2023 for U.S. Appl. No. 17/409,783 (pp. 1-5). |
Office Action (Notice of Allowance and Fees Due (PTOL-85) dated Apr. 6, 2023 for U.S. Appl. No. 17/807,730 (pp. 1-7). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Apr. 28, 2023 for U.S. Appl. No. 17/195,795 (pp. 1-7). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated May 12, 2023 for U.S. Appl. No. 16/229,091 (pp. 1-8). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated May 24, 2023 for U.S. Appl. No. 16/229,091 (pp. 1-2). |
Office Action dated Feb. 9, 2023 for U.S. Appl. No. 18/060,556 (pp. 1-5). |
Office Action dated Mar. 3, 2023 for U.S. Appl. No. 18/060,525 (pp. 1-12). |
Office Action dated Apr. 19, 2023 for U.S. Appl. No. 18/066,267 (pp. 1-11). |
Partial ISR for PCT/GB2023/050001 (dated Mar. 31, 2023) 13 pages. |
Rakkolainen et al., A Survey of Mid-Air Ultrasound Haptics and Its Applications (IEEE Transactions on Haptics), vol. 14, No. 1, 2021, 18 pages. |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Jan. 18, 2022 for U.S. Appl. No. 16/899,720 (pp. 1-2). |
Office Action (Non-Final Rejection) dated Jan. 24, 2022 for U.S. Appl. No. 16/228,767 (pp. 1-22). |
Office Action (Non-Final Rejection) dated Jan. 21, 2022 for U.S. Appl. No. 17/068,834 (pp. 1-12). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Feb. 11, 2022 for U.S. Appl. No. 16/228,760 (pp. 1-8). |
ISR and WO for PCT/GB2020/052829 (dated Feb. 10, 2021) (15 pages). |
EPO Examination Report 17 748 466.4 (dated Jan. 12, 2021) (16 pages). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Feb. 28, 2022 for U.S. Appl. No. 17/068,825 (pp. 1-7). |
Mohamed Yacine Tsalamlal, Non-Intrusive Haptic Interfaces: State-of-the-Art Survey, HAID 2013, LNCS 7989, pp. 1-9, 2013. |
EPO Communication for Application 18 811 906.9 (dated Nov. 29, 2021) (15 pages). |
ISR and WO for PCT/GB2021/052415 (dated Dec. 22, 2021) (16 pages). |
Gareth Young et al., Designing Mid-Air Haptic Gesture Controlled User Interfaces for Cars, PACM on Human-Computer Interaction, Jun. 2020 (24 pages). |
Office Action (Non-Final Rejection) dated Mar. 4, 2022 for U.S. Appl. No. 16/404,660 (pp. 1-5). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Mar. 7, 2022 for U.S. Appl. No. 16/600,496 (pp. 1-5). |
Communication Pursuant to Article 94(3) EPC for EP 19723179.8 (dated Feb. 15, 2022). |
EPO ISR and WO for PCT/GB2022/050204 (dated Apr. 7, 2022) (15 pages). |
IN 202047026493 Office Action dated Mar. 8, 2022. |
ISR & WO for PCT/GB2021/052946. |
Office Action (Final Rejection) dated Mar. 14, 2022 for U.S. Appl. No. 16/564,016 (pp. 1-12). |
Office Action (Non-Final Rejection) dated Mar. 15, 2022 for U.S. Appl. No. 16/144,474 (pp. 1-13). |
Office Action (Non-Final Rejection) dated Apr. 1, 2022 for U.S. Appl. No. 16/229,091 (pp. 1-10). |
Office Action (Non-Final Rejection) dated May 2, 2022 for U.S. Appl. No. 17/068,831 (pp. 1-10). |
Number | Date | Country
20220083142 A1 | Mar 2022 | US

Number | Date | Country
63079708 | Sep 2020 | US