Embodiments described herein relate to methods and apparatus for configuring a plurality of virtual buttons on a device. For example, each of the plurality of virtual buttons may be activated or deactivated based on a mode of operation of the device.
Physical or mechanical buttons, which are sometimes called hardware or traditional buttons, typically operate through hardware components that physically move upon being pressed and which typically operate like a switch. These buttons are known in the art and are used in a variety of applications. For example, mechanical buttons may be found on the sides of current smart phones to control volume and other settings. Those buttons are readily identifiable both visually and by touch—typically they are set apart from the surrounding device surface and protrude slightly so a user can easily feel their location and engage them with a press. Softkeys and different types of virtual buttons, which are not directly engaged via moving parts of a physical button, are also known in the art. For instance, a touch-sensitive area may act as a virtual button on a smart phone display, and a familiar example of this is the implementation of a virtual keyboard on the touch screen of a smart phone. Advantageously, virtual buttons may be programmed or changed easily through the use of software. Advantageously, virtual buttons may reduce or eliminate hardware related problems experienced by physical buttons, such as mechanical failure after prolonged use, difficulty in waterproofing, and other issues known in the art. In addition to virtual buttons integrated with a touch screen, other types of virtual buttons may reside on a housing or non-screen surface of a device. For example, instead of a traditional button on a surface such as a dashboard, one may use a variety of force sensors, inductive sensors, or similar technology to create a touch-responsive area that acts as a virtual button and does not visually appear to be a traditional physical button.
US2013/0275058 discloses a handheld portable electronic device, having a display screen and a framing structure, wherein the framing structure includes a strain gauge, for detecting strain within the framing structure. Thus, gestures, such as squeezing the sides of the device, can be recognized, and can be used as inputs to control an application running on the device.
According to some embodiments, there is provided a method for configuring a plurality of virtual buttons on a device. The method comprises receiving a first signal indicating a mode of operation of the device; outputting a first control signal operable to activate or deactivate each of the plurality of virtual buttons on the device respectively based on the mode of operation of the device; and responsive to receiving a second signal indicating that one of the plurality of virtual buttons that is activated has been engaged by a user of the device, outputting a second control signal operable to initiate a first haptic or audible response corresponding to the one of the plurality of virtual buttons.
According to some embodiments there is provided a control circuit for controlling a plurality of virtual buttons on a device. The control circuit is configured to receive a first signal indicating a mode of operation of the device; output a first control signal to activate or deactivate each of the plurality of virtual buttons on the device respectively based on the mode of operation of the device; and responsive to receiving a second signal indicating that one of the plurality of virtual buttons that is activated has been enabled by a user of the device, output a second control signal operable to initiate output of a first haptic or audible response corresponding to the one of the plurality of virtual buttons.
According to some embodiments there is provided an integrated circuit comprising a control circuit for controlling a plurality of virtual buttons on a device. The control circuit is configured to: receive a first signal indicating a mode of operation of the device; output a first control signal to activate or deactivate each of the plurality of virtual buttons on the device respectively based on the mode of operation of the device; and responsive to receiving a second signal indicating that one of the plurality of virtual buttons that is activated has been enabled by a user of the device, output a second control signal operable to initiate output of a first haptic or audible response corresponding to the one of the plurality of virtual buttons.
According to some embodiments there is provided a device comprising a control circuit for controlling a plurality of virtual buttons on the device. The control circuit is configured to receive a first signal indicating a mode of operation of the device; output a first control signal to activate or deactivate each of the plurality of virtual buttons on the device respectively based on the mode of operation of the device; and responsive to receiving a second signal indicating that one of the plurality of virtual buttons that is activated has been enabled by a user of the device, output a second control signal operable to initiate output of a haptic or audible response corresponding to the one of the plurality of virtual buttons.
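By way of non-limiting illustration, the control flow summarized above may be sketched in Python as follows; the class and method names (VirtualButton, VirtualButtonConfigurator, on_mode_signal, on_engagement_signal) are illustrative assumptions and not part of any claimed embodiment.

```python
from dataclasses import dataclass

@dataclass
class VirtualButton:
    button_id: str
    active: bool = False  # set or cleared by the first control signal

class VirtualButtonConfigurator:
    """Sketch of the summarized method: configure each virtual button
    according to the mode of operation, and respond only when an
    activated button is engaged."""

    def __init__(self, buttons, mode_map):
        # mode_map: mode of operation -> ids of the buttons to activate
        self.buttons = {b.button_id: b for b in buttons}
        self.mode_map = mode_map

    def on_mode_signal(self, mode):
        """First signal: activate or deactivate each button for the mode."""
        active_ids = self.mode_map.get(mode, set())
        for button in self.buttons.values():
            # First control signal: per-button activate/deactivate.
            button.active = button.button_id in active_ids

    def on_engagement_signal(self, button_id):
        """Second signal: a button has been engaged by the user."""
        button = self.buttons.get(button_id)
        if button is not None and button.active:
            # Second control signal: initiate a haptic or audible response.
            return f"response:{button_id}"
        return None  # deactivated buttons produce no response

# Example: a gaming mode that activates two side buttons only.
buttons = [VirtualButton("201a"), VirtualButton("201f"), VirtualButton("202a")]
cfg = VirtualButtonConfigurator(buttons, {"gaming": {"201a", "201f"}})
cfg.on_mode_signal("gaming")
assert cfg.on_engagement_signal("201a") is not None  # activated: responds
assert cfg.on_engagement_signal("202a") is None      # deactivated: ignored
```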
Traditionally, the placement of virtual buttons on a device has been on the “front” of the device (for example, on a touch screen), or on a limited number of surfaces specifically designed for the user interface, with the remainder of the device being free from virtual buttons. By positioning the virtual buttons on a limited number of surfaces on a device, the device (which may be portable) may be easily held by a user without inadvertently engaging the virtual buttons.
Currently, devices such as smart phones have a single front surface that might, for example, be dominated by a touch screen. Other virtual buttons, such as buttons operating through the use of inductive sensors, may be placed below the touch screen to add further operability to the touch screen surface.
However, while mechanical buttons may be placed elsewhere on the device, the device is usually otherwise free from the presence of virtual buttons.
Should a user want to press a virtual button or indeed the touch screen while using the device, their hands are likely to cover at least a portion of the touch screen surface, which is also acting as a display screen, for example as illustrated in the accompanying drawings.
This configuration may be undesirable in some modes of operation of the device as it may obscure information being displayed by the touch display screen.
For a better understanding of the embodiments, and to show how they may be put into effect, reference will now be made, by way of example only, to the accompanying drawings.
The description below sets forth example embodiments according to this disclosure. Further example embodiments and implementations will be apparent to those having ordinary skill in the art. Further, those having ordinary skill in the art will recognize that various equivalent techniques may be applied in lieu of, or in conjunction with, the embodiments discussed below, and all such equivalents should be deemed as being encompassed by the present disclosure.
The steps of any methods disclosed herein do not have to be performed in the exact order disclosed, unless a step is explicitly described as following or preceding another step and/or where it is implicit that a step must follow or precede another step. Any feature of any of the embodiments disclosed herein may be applied to any other embodiment, wherever appropriate. Likewise, any advantage of any of the embodiments may apply to any other embodiments, and vice versa. Other objectives, features and advantages of the enclosed embodiments will be apparent from the following description.
Some of the embodiments contemplated herein will now be described more fully with reference to the accompanying drawings. Other embodiments, however, are contained within the scope of the subject matter disclosed herein, and the disclosed subject matter should not be construed as limited to only the embodiments set forth herein; rather, these embodiments are provided by way of example to convey the scope of the subject matter to those of ordinary skill in the art.
The device 200 comprises a touch screen display 203. The device 200 also comprises a plurality of virtual buttons 201a to 201l which, in this example, are situated along the sides of the device 200. For example, the plurality of virtual buttons 201a to 201l may be located on a non-display surface of the device 200. The non-display surface may be metallic, or may be formed of other materials known in the art suitable to form a housing or other structure associated with the device 200.
In this example, the device 200 comprises a smart phone, and the plurality of virtual buttons 201a to 201l are located on the sides 204 and 205 of the device 200.
In this example, the back surface 206 of the device 200 comprises a plurality of virtual buttons 202a to 202d.
It will be appreciated that any surface of the device may comprise a virtual button. In particular, any non-display surface of the device may comprise a virtual button.
The virtual buttons may operate through the use of force sensors, inductive sensors, capacitive sensors, a resistive deflection sensing array, or any other mechanism configured to define a touch-sensitive area, or any combination of such mechanisms. It will be apparent that any type of virtual button used to replace a mechanical button may be used in the embodiments of this disclosure.
The plurality of virtual buttons may be visually hidden in or on the device. For example, when looking at the back or side surfaces of the device 200, a user may not be able to see the presence of the plurality of virtual buttons, and the location of the plurality of virtual buttons may not be otherwise visually or tactilely indicated. In different embodiments, the presence of the virtual buttons may be hidden only when the device is powered off. In the off state, one or more of the virtual buttons may be flush with, and otherwise visually and tactilely indistinguishable from, the rest of the device's casing or surrounding area. Upon powering on, or in specific modes of operation, however, one or more virtual buttons may be made detectable through the use of lighting, sound, or haptics (as described more fully below), or other techniques that will be apparent to those having ordinary skill in the art. Advantageously, the use of virtual buttons indistinguishable from other portions of a device in use may create a cleaner look for the device and may effectively create a highly configurable device without cluttering its physical appearance or feel, as will be more apparent with the benefit of this disclosure.
Device 300 comprises control circuitry 301. The control circuitry 301 may be configured to communicate with an Applications Processor 302 and/or an orientation module 303. Applications Processor 302 may determine, or help determine, a mode of operation of the device as described more fully below. Orientation module 303 may be hardware and/or software based and in one embodiment may comprise an accelerometer, inertia sensor, or other sensor configured to signal a device's orientation, such as portrait or landscape mode. In other embodiments, the orientation module 303 may act more generally as a context-aware module. For example, it may determine characteristics such as whether a device is placed flat on a table, held within a purse or pocket (via a light-based sensor), being transported at high speed, etc. The control circuitry 301 may also be configured to communicate with a virtual button controller 304.
The virtual button controller 304 may be configured to control the virtual buttons 305. The virtual buttons 305 may be positioned around the device or on other surfaces, for example, as illustrated in the accompanying drawings.
The control circuitry 301 may be configured to perform a method as illustrated in the accompanying drawings and described in the following steps.
In step 401, the control circuitry is configured to receive a first signal indicating a mode of operation of the device. For example, the first signal may comprise a signal S1 received from the applications processor 302. The first signal S1 may indicate a particular application that is running on the device, for example a phone application, a gaming application, a camera application, an email application, a music playback application, a video playback application etc. In addition, the first signal S1 may indicate a mode of operation in the form of a status of an application that is running on the device. For example, when an email application is running, the first signal S1 may indicate whether the device display is showing a list of emails, from which the user may select one email to read, or whether the device display is showing a box in which the user may compose a new email.
The first signal may additionally or alternatively comprise a signal S2 received from the orientation module 303. The signal S2 may indicate the orientation of the device 300 and may originate from, for example, a gyroscope and/or an accelerometer, which may provide a corresponding signal to the orientation module 303.
In some examples, the mode of operation may additionally or alternatively comprise a user defined characteristic. For example, the mode of operation may indicate whether the user is left-handed or right-handed, or any other setting or characteristic that is distinguishable from a different setting or characteristic during operation.
Therefore, in different embodiments, the mode of operation of the device may comprise an application running on the device, a characteristic or parameter within or associated with such application, an orientation of the device, a user defined characteristic, or other modes associated with the device or its user interface or its software as would be understood by those having ordinary skill in the art. For example, a mode of operation may be defined not only with respect to which app is being run on a device, but also with respect to one or more particular actions being taken within an individual app.
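As a minimal sketch, a mode of operation combining these factors might be represented as follows; the field names (application, app_state, orientation, handedness) are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class ModeOfOperation:
    application: str                  # e.g. "email", "camera", "game"
    app_state: Optional[str] = None   # action within the app, e.g. "compose"
    orientation: str = "portrait"     # reported by the orientation module
    handedness: Optional[str] = None  # a user defined characteristic

# The same app in two different states yields two distinct modes,
# each of which may activate a different combination of buttons:
email_list = ModeOfOperation("email", app_state="list_view")
email_compose = ModeOfOperation("email", app_state="compose")
assert email_list != email_compose
```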
In step 402, the control circuitry is configured to output a first control signal C1 to activate or deactivate each of the plurality of virtual buttons on the device based on the mode of operation of the device. If a virtual button is activated, then responsive to sensing a touch or press from a user, the virtual button may transmit a signal to the virtual button controller. If a virtual button is deactivated, then, for example, either the virtual button may be placed in a state of operation in which it does not sense the touch or press of a user, or the signals transmitted to the virtual button controller may be completely or partially ignored as a result of the deactivated state of the virtual button.
For example, when a virtual button is deactivated, if the deactivated virtual button is touched or pressed by a user, then there may be no operation performed by the device in response.
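The second deactivation approach described above, in which signals from deactivated buttons are ignored rather than the buttons ceasing to sense, might be sketched as follows; the ButtonEventFilter name is hypothetical.

```python
class ButtonEventFilter:
    """Controller-side deactivation: signals from deactivated buttons are
    completely ignored, so pressing them causes no operation."""

    def __init__(self):
        self.activated = set()

    def apply_first_control_signal(self, active_button_ids):
        # C1 places each button in an activated or deactivated state.
        self.activated = set(active_button_ids)

    def on_button_event(self, button_id):
        if button_id not in self.activated:
            return None  # deactivated: no operation performed in response
        return ("engaged", button_id)
```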
By activating and deactivating different combinations of the plurality of virtual buttons depending on the mode of operation of the device, the inadvertent engagement of virtual buttons which would otherwise not be used in that mode of operation may be avoided. Additionally, combinations of virtual buttons may be activated or deactivated together to create unique user interfaces. For example, two or more adjoining or nearby virtual buttons may be activated together, that is, within a predetermined time of each other, and those activated buttons may act in concert to implement a different user interface feature, such as a virtual scroll bar as discussed more fully below. In such an embodiment, a user may swipe across the group of virtual buttons that, together, act as one larger button, to scroll content on a screen or to provide some other functionality.
The first control signal C1 may be output to the virtual button controller 304, which may control an operation state of the plurality of virtual buttons. The virtual button controller 304 may, for example, place the plurality of virtual buttons in an activated or deactivated state in response to the control signal C1, or may ignore signals received from virtual buttons which are deactivated by the control signal C1.
It will be appreciated that in some modes of operation, all of the plurality of virtual buttons may be deactivated. For example, when the device is not in use (for example when no applications are actively running on the device), it may be desirable for all of the virtual buttons to be deactivated. Determining the context of the device in this regard may be facilitated by the orientation module 303 of the device 300.
In step 403, the control circuitry 301 is configured to, responsive to receiving a second signal SL indicating that one of the plurality of virtual buttons that is activated has been enabled by a user of the device, output a second control signal C2 operable to initiate output of a first haptic or audible or visual response corresponding to the one of the plurality of virtual buttons. In this embodiment, enabling a button refers to the button being pressed or otherwise touched or activated to initiate a responsive action associated with the device or software running on the device. In a typical embodiment, enabling a virtual button would be akin to pressing a mechanical button, the response to which would be to initiate an action. In other embodiments, however, enabling a virtual button may depend on the degree or the type of press or contact from a user. For example, enabling a button may depend on the amount of pressure applied: a “hard” press may create a different response than a “soft” press, and many different pressure levels may be detected, each of which may be correlated to a different response. In other embodiments, virtual buttons may be enabled via one or more touch gestures (for example, swipe, pinch, tap, two-finger tap, etc.) that may be configured for different responses. More generally, embodiments of this disclosure contemplate any such enablement or activation or similar effect as would be apparent to those having ordinary skill in the art.
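A minimal sketch of pressure-dependent enablement, assuming two illustrative thresholds (the names and values are hypothetical; a real sensor would report device-specific units):

```python
SOFT_TOUCH_THRESHOLD = 0.2  # hypothetical normalized force levels
HARD_PRESS_THRESHOLD = 0.6

def classify_engagement(pressure):
    """Map a sensed pressure level to a mode of engagement: a light
    touch may merely locate a button, while a firm press initiates
    the button's action."""
    if pressure >= HARD_PRESS_THRESHOLD:
        return "second_mode"  # "hard" press: trigger the button's action
    if pressure >= SOFT_TOUCH_THRESHOLD:
        return "first_mode"   # "soft" touch: location-indicating response
    return None               # below sensing level: not engaged
```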
In some examples, a first haptic response may be provided that is associated with one or more of the plurality of virtual buttons. According to some embodiments, a haptic response may be any tactile sensation generated in or on the device, and preferably one designed to create an impression that the virtual button itself generated such sensation. In one embodiment, a haptic response may be generated by one or more vibrating motors or linear resonant actuators (LRAs) or similar devices. As is known in the art, haptic responses may be tailored to simulate mechanical button clicks and a host of other sensations including various surface textures, and all such techniques may be used in the embodiments of this disclosure. In other examples, a first haptic response may be provided by a separate haptic device 306. In some examples, a first audible response may be provided by a speaker 307. In some examples, a first haptic response and first audible response may be provided together. In other embodiments, a visual response may be provided by a visual indicator 309. For example, the visual response may be fashioned by using LED or other lighting at or near the location of a virtual button to, for example, signal that such button has been enabled or that it has been enabled in a certain manner. A virtual button may be enabled by a user when the user in some way touches or presses the location of the virtual button on the surface of the device.
The first haptic, audible, and/or visual response may be configured to indicate a location of the one of the plurality of virtual buttons that has been activated.
In step 403, when the user locates the activated virtual button, the user may be notified that they have found the appropriate location on the surface by the first haptic response.
In some examples however, the location of some or all virtual buttons may additionally or alternatively be notified to a user by an indentation or pattern on the surface of the device, or by some other physical trait of the device, that may be felt by the user at the location of the virtual button.
In some examples, virtual buttons may be provided in conjunction with a touch screen. For example, the side surfaces 204 and 205 of device 200 may comprise a touch screen 308, as may the back surface. As a user moves their finger along the touch screen to the location of the one of the plurality of virtual buttons, the touch screen 308 may be configured to transmit the second signal SL to the control circuitry indicating that one of the plurality of virtual buttons that is activated has been enabled by a user of the device. In this example, the virtual button may be enabled by the user placing their finger in the correct position on the surface of the device.
In some examples, one or more of the plurality of virtual buttons may be sensitive to different types of engagement by the user. For example, a virtual button may be able to distinguish between the user touching the location of the virtual button and the user applying pressure to the location of the virtual button.
In this example, the activated virtual button may therefore be capable of more than one mode of engagement.
Therefore, in step 403, the second signal SL may indicate that one of the plurality of virtual buttons that is activated has been engaged by a user of the device in a first mode of engagement. For example, the one of the plurality of virtual buttons is engaged in the first mode of engagement when the user touches the location of the one of the plurality of virtual buttons without applying any additional pressure, or with a level of pressure below a particular threshold.
As discussed previously, the first haptic response may then indicate a location of the one of the plurality of virtual buttons to the user.
In some examples, the control circuitry may be configured to, responsive to receiving a third signal Sp indicating that the one of the plurality of buttons has been engaged by a user of the device in a second mode of engagement, output a third control signal C3 operable to initiate a second haptic or audible or visual response corresponding to the one of the plurality of virtual buttons. In some examples, a second haptic response may be provided by the one of the plurality of virtual buttons. In other examples, the second haptic response may be provided by a separate haptic device 306. In some examples, a second audible response may be provided by a speaker 307. In some examples, a second visual response may be provided by a visual indicator 309. In some examples, two or more of a second haptic response, a second audible response, and a second visual response may be provided together.
The one of the plurality of virtual buttons may be engaged in the second mode of engagement when the user applies pressure to the location of the one of the plurality of virtual buttons. For example, the one of the plurality of virtual buttons may be engaged in the second mode of engagement when the user applies a level of pressure above a particular threshold to the location of the one of the plurality of virtual buttons.
The second haptic or audible response may be different to the first haptic or audible response, and in some examples, may simulate a click associated with a traditional or mechanical button press.
For example, when a user places the device 200 in a gaming mode, virtual buttons along the top side edge 204 and on the back of the device 200 may be activated, while the remaining virtual buttons may be deactivated.
A user holding the device 200 and looking at the touch screen 203 may then locate the activated virtual buttons by running their fingers along the top side edge 204 and back of the device 200. When a finger of the user touches the location of one of the activated virtual buttons (either sensed by the activated virtual button itself or by the touch screen 308), the first haptic or audible response may alert the user that they have located an activated virtual button.
In some examples, the first haptic or audible response may be provided when the user initially touches the location of the activated virtual button, and the first haptic response may not be repeated if the user continues to touch the location of the activated virtual button.
Once the user has located a virtual button that they wish to use, the user may “press” the virtual button to generate a response within the gaming mode (for example, button 201a may cause an avatar within the game to jump, whereas button 201f may cause the avatar within the game to crouch).
When the user presses the virtual button, for example, when the user applies a level of pressure above a particular threshold to the location of the virtual button, the user may be alerted that they have pressed the virtual button by the second haptic or audible response.
In some examples, a virtual button may be configured with the particular threshold and may output a signal when it detects a pressure above the particular threshold to indicate that it has been pressed. In some examples, a touch screen (e.g., the touch screen 308) may be utilized to adjust the particular threshold. For example, the touch screen may detect that the user's finger is close to the virtual button, but not exactly on the location of the virtual button. In this case, the virtual button may still detect pressure from the user, and the virtual button may be configured to lower the particular threshold so that the user does not have to apply additional pressure to trigger a press. The virtual button may be configured by the virtual button controller 304. The virtual button controller 304 may receive an indication of the location of the user's finger from the touch screen 308, and may provide appropriate control of the particular threshold for the virtual button in response.
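One possible sketch of this threshold adjustment, assuming the touch screen reports finger coordinates in millimetres and using a purely illustrative scaling policy:

```python
BASE_THRESHOLD = 0.6    # hypothetical normalized press threshold
NEAR_MISS_RADIUS = 8.0  # mm: finger close to, but not on, the button

def adjusted_threshold(finger_xy, button_xy):
    """Lower the press threshold when the touch screen reports the finger
    near, but not exactly on, the virtual button, since an off-centre
    press registers less force at the button's sensor."""
    dx = finger_xy[0] - button_xy[0]
    dy = finger_xy[1] - button_xy[1]
    distance = (dx * dx + dy * dy) ** 0.5
    if 0.0 < distance <= NEAR_MISS_RADIUS:
        # Scale the threshold down with the offset (illustrative policy).
        return BASE_THRESHOLD * (1.0 - 0.5 * distance / NEAR_MISS_RADIUS)
    return BASE_THRESHOLD
```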
It will be appreciated that different types of user interaction may trigger the different modes of engagement of a virtual button. For example, a virtual button may be engaged in a first mode of engagement by a user tapping the location, by a user touching the location of the virtual button for an elongated period of time, by a user dragging their touch across the location of the virtual button, or any other touch pattern that is distinguishable by the virtual button.
It will also be appreciated that for different modes of operation, different combinations of virtual buttons may be activated, and that in different modes of operation, the same virtual button may provide different operations.
In this example, the first signal may indicate to the control circuitry 301 that an email app has been opened, and that the device is being held in portrait mode, for example as illustrated in the accompanying drawings.
In response to this first signal, the control circuitry may activate virtual buttons 501 to 504 if the first signal indicates that the user is left-handed, and may activate virtual buttons 505 to 508 if the first signal indicates that the user is right-handed. Any other virtual buttons may be deactivated (for example virtual buttons 509 and 510).
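Expressed as a sketch, the activation choice in this email example might be a simple mapping from the handedness characteristic to the set of button identifiers to activate (the mapping and function name are hypothetical):

```python
# Hypothetical activation mapping for the email example on device 500.
EMAIL_PORTRAIT_ACTIVATION = {
    "left_handed":  {"501", "502", "503", "504"},
    "right_handed": {"505", "506", "507", "508"},
}

def buttons_to_activate(handedness):
    # Buttons outside the returned set (e.g. 509 and 510) are deactivated.
    return EMAIL_PORTRAIT_ACTIVATION.get(handedness, set())
```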
Consider an example where the user is left-handed.
In this example, the grouping of virtual buttons act in concert, and the user may engage more than one of the virtual buttons within a predetermined time, for example by swiping a finger across the virtual buttons, to cause the email app to scroll up and down through a list of emails. In particular, the user may hold the device 500 in their left hand and may use their thumb to swipe up and down over the four virtual buttons 501 to 504 in turn.
The virtual buttons 501 to 504 may therefore be effectively implemented as a virtual scroll wheel, and, as the user engages each of the virtual buttons, a haptic or audible response may be provided that simulates a mechanical scroll wheel clicking as it turns.
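A minimal sketch of such a virtual scroll wheel, which treats engagements of adjacent buttons within a predetermined time of each other as scroll steps (the window length, class name, and direction mapping are assumptions):

```python
import time

class VirtualScrollWheel:
    """Interprets successive engagements of ordered buttons (for example
    501 to 504) as scroll steps; each step may also trigger a click-like
    haptic simulating a mechanical scroll wheel detent."""

    def __init__(self, ordered_button_ids, window_s=0.3):
        self.position = {b: i for i, b in enumerate(ordered_button_ids)}
        self.window_s = window_s  # predetermined time between engagements
        self.last = None          # (button index, timestamp) of last event

    def on_engagement(self, button_id, now=None):
        now = time.monotonic() if now is None else now
        idx = self.position[button_id]
        direction = None
        if self.last is not None and (now - self.last[1]) <= self.window_s:
            if idx > self.last[0]:
                direction = "scroll_down"  # direction mapping is assumed
            elif idx < self.last[0]:
                direction = "scroll_up"
        self.last = (idx, now)
        return direction  # caller may emit a per-step click haptic

wheel = VirtualScrollWheel(["501", "502", "503", "504"])
wheel.on_engagement("501", now=0.00)
assert wheel.on_engagement("502", now=0.10) == "scroll_down"
```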
It will be appreciated that a single virtual button may be utilized instead of the plurality of virtual buttons to similarly implement a virtual scroll wheel.
In this example, the first signal may indicate to the control circuitry 301 that a camera app has been opened and that the device 500 is being held in portrait mode, for example as illustrated in the accompanying drawings.
In response to this first signal, the control circuitry may activate virtual buttons 501 to 504 and 509. The virtual buttons 505 to 508 may be deactivated.
In this example, the same virtual buttons 501 to 504 that were used to scroll up and down through a list of emails in the email mode of operation may be used to cause the camera to zoom in and out. These virtual buttons may be engaged, and a haptic or audible response may be provided, as previously described with reference to the email mode of operation.
In addition, in this camera mode, the virtual button 509 is activated. This virtual button may provide two modes of engagement. A first mode of engagement may be used to locate the virtual button, and the second mode of engagement may be used to cause an operation in the application on the device. In some examples, a touch screen may be formed on the surface over the virtual button 509, and may be used to locate the virtual button. For example, the virtual button 509 may be located by the user touching the location of the virtual button 509. As described above, the first haptic or audible response may then alert the user that they have located the virtual button 509.
It will be appreciated that the addition of the virtual buttons away from the display screen 510 of the device avoids the need to use the display screen as a touch screen, thereby avoiding obscuring information displayed by the display screen 510.
Embodiments may be implemented in an electronic, portable and/or battery powered host device such as a smartphone, an audio player, a mobile or cellular phone, or a handset. Embodiments may be implemented on one or more integrated circuits provided within such a host device.
It should be understood—especially by those having ordinary skill in the art with the benefit of this disclosure—that the various operations described herein, particularly in connection with the figures, may be implemented by other circuitry or other hardware components. The order in which each operation of a given method is performed may be changed, and various elements of the systems illustrated herein may be added, reordered, combined, omitted, modified, etc. It is intended that this disclosure embrace all such modifications and changes and, accordingly, the above description should be regarded in an illustrative rather than a restrictive sense.
Similarly, although this disclosure makes reference to specific embodiments, certain modifications and changes can be made to those embodiments without departing from the scope and coverage of this disclosure. Moreover, any benefits, advantages, or solutions to problems that are described herein with regard to specific embodiments are not intended to be construed as a critical, required, or essential feature or element.
Further embodiments likewise, with the benefit of this disclosure, will be apparent to those having ordinary skill in the art, and such embodiments should be deemed as being encompassed herein.
The skilled person will recognise that some aspects of the above-described apparatus and methods, for example the method as performed by the control circuitry, may be embodied as processor control code, for example on a non-volatile carrier medium such as a disk, CD- or DVD-ROM, programmed memory such as read only memory (Firmware), or on a data carrier such as an optical or electrical signal carrier. For many applications embodiments of the invention will be implemented on a DSP (Digital Signal Processor), ASIC (Application Specific Integrated Circuit) or FPGA (Field Programmable Gate Array). Thus the code may comprise conventional program code or microcode or, for example code for setting up or controlling an ASIC or FPGA. The code may also comprise code for dynamically configuring re-configurable apparatus such as re-programmable logic gate arrays. Similarly the code may comprise code for a hardware description language such as Verilog™ or VHDL (Very high speed integrated circuit Hardware Description Language). As the skilled person will appreciate, the code may be distributed between a plurality of coupled components in communication with one another. Where appropriate, the embodiments may also be implemented using code running on a field-(re)programmable analogue array or similar device in order to configure analogue hardware.
Note that as used herein the term module shall be used to refer to a functional unit or block which may be implemented at least partly by dedicated hardware components such as custom defined circuitry and/or at least partly be implemented by one or more software processors or appropriate code running on a suitable general purpose processor or the like. A module may itself comprise other modules or functional units. A module may be provided by multiple components or sub-modules which need not be co-located and could be provided on different integrated circuits and/or running on different processors.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design many alternative embodiments without departing from the scope of the appended claims or embodiments. The word “comprising” does not exclude the presence of elements or steps other than those listed in a claim or embodiment, “a” or “an” does not exclude a plurality, and a single feature or other unit may fulfil the functions of several units recited in the claims or embodiments. Any reference numerals or labels in the claims or embodiments shall not be construed so as to limit their scope.
Although the present disclosure and certain representative advantages have been described in detail, it should be understood that various changes, substitutions, and alterations can be made herein without departing from the spirit and scope of the disclosure as defined by the appended claims or embodiments. Moreover, the scope of the present disclosure is not intended to be limited to the particular embodiments of the process, machine, manufacture, compositions of matter, means, methods, or steps, presently existing or later to be developed that perform substantially the same function or achieve substantially the same result as the corresponding embodiments herein may be utilized. Accordingly, the appended claims or embodiments are intended to include within their scope such processes, machines, manufacture, compositions of matter, means, methods, or steps.
As used herein, when two or more elements are referred to as “coupled” to one another, such term indicates that such two or more elements are in electronic communication or mechanical communication, as applicable, whether connected indirectly or directly, with or without intervening elements.
This disclosure encompasses all changes, substitutions, variations, alterations, and modifications to the example embodiments herein that a person having ordinary skill in the art would comprehend. Similarly, where appropriate, the appended claims encompass all changes, substitutions, variations, alterations, and modifications to the example embodiments herein that a person having ordinary skill in the art would comprehend. Moreover, reference in the appended claims to an apparatus or system or a component of an apparatus or system being adapted to, arranged to, capable of, configured to, enabled to, operable to, or operative to perform a particular function encompasses that apparatus, system, or component, whether or not it or that particular function is activated, turned on, or unlocked, as long as that apparatus, system, or component is so adapted, arranged, capable, configured, enabled, operable, or operative. Accordingly, modifications, additions, or omissions may be made to the systems, apparatuses, and methods described herein without departing from the scope of the disclosure. For example, the components of the systems and apparatuses may be integrated or separated. Moreover, the operations of the systems and apparatuses disclosed herein may be performed by more, fewer, or other components and the methods described may include more, fewer, or other steps. Additionally, steps may be performed in any suitable order. As used in this document, “each” refers to each member of a set or each member of a subset of a set.
Although exemplary embodiments are illustrated in the figures and described below, the principles of the present disclosure may be implemented using any number of techniques, whether currently known or not. The present disclosure should in no way be limited to the exemplary implementations and techniques illustrated in the drawings and described above.
Unless otherwise specifically noted, articles depicted in the drawings are not necessarily drawn to scale.
All examples and conditional language recited herein are intended for pedagogical objects to aid the reader in understanding the disclosure and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions. Although embodiments of the present disclosure have been described in detail, it should be understood that various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the disclosure.
Although specific advantages have been enumerated above, various embodiments may include some, none, or all of the enumerated advantages. Additionally, other technical advantages may become readily apparent to one of ordinary skill in the art after review of the foregoing figures and description.
To aid the Patent Office and any readers of any patent issued on this application in interpreting the claims appended hereto, applicants wish to note that they do not intend any of the appended claims or claim elements to invoke 35 U.S.C. § 112(f) unless the words “means for” or “step for” are explicitly used in the particular claim.
Number | Name | Date | Kind |
---|---|---|---|
3686927 | Scharton | Aug 1972 | A |
4902136 | Mueller et al. | Feb 1990 | A |
5374896 | Sato et al. | Dec 1994 | A |
5684722 | Thorner et al. | Nov 1997 | A |
5748578 | Schell | May 1998 | A |
5857986 | Moriyasu | Jan 1999 | A |
6050393 | Murai et al. | Apr 2000 | A |
6278790 | Davis et al. | Aug 2001 | B1 |
6294891 | McConnell et al. | Sep 2001 | B1 |
6332029 | Azima et al. | Dec 2001 | B1 |
6388520 | Wada et al. | May 2002 | B2 |
6567478 | Oishi et al. | May 2003 | B2 |
6580796 | Kuroki | Jun 2003 | B1 |
6683437 | Tierling | Jan 2004 | B2 |
6703550 | Chu | Mar 2004 | B2 |
6762745 | Braun et al. | Jul 2004 | B1 |
6768779 | Nielsen | Jul 2004 | B1 |
6784740 | Tabatabaei | Aug 2004 | B1 |
6906697 | Rosenberg | Jun 2005 | B2 |
6995747 | Casebolt et al. | Feb 2006 | B2 |
7042286 | Meade et al. | May 2006 | B2 |
7154470 | Tierling | Dec 2006 | B2 |
7277678 | Rozenblit et al. | Oct 2007 | B2 |
7333604 | Zernovizky et al. | Feb 2008 | B2 |
7392066 | Hapamas | Jun 2008 | B2 |
7623114 | Rank | Nov 2009 | B2 |
7639232 | Grant et al. | Dec 2009 | B2 |
7777566 | Drogi et al. | Aug 2010 | B1 |
7791588 | Tierling et al. | Sep 2010 | B2 |
7825838 | Srinivas et al. | Nov 2010 | B1 |
7979146 | Ullrich et al. | Jul 2011 | B2 |
3068025 | Devenyi et al. | Nov 2011 | A1 |
8098234 | Lacroix et al. | Jan 2012 | B2 |
8102364 | Tierling | Jan 2012 | B2 |
8325144 | Tierling et al. | Dec 2012 | B1 |
8427286 | Grant et al. | Apr 2013 | B2 |
8441444 | Moore et al. | May 2013 | B2 |
8466778 | Hwang et al. | Jun 2013 | B2 |
8480240 | Kashiyama | Jul 2013 | B2 |
8572293 | Cruz-Hernandez et al. | Oct 2013 | B2 |
8572296 | Shasha et al. | Oct 2013 | B2 |
8593269 | Grant et al. | Nov 2013 | B2 |
8648659 | Oh et al. | Feb 2014 | B2 |
8648829 | Shahoian | Feb 2014 | B2 |
8659208 | Rose et al. | Feb 2014 | B1 |
8754757 | Ullrich et al. | Jun 2014 | B1 |
8754758 | Ullrich et al. | Jun 2014 | B1 |
8947216 | Da Costa et al. | Feb 2015 | B2 |
8981915 | Birnbaum et al. | Mar 2015 | B2 |
8994518 | Gregorio et al. | Mar 2015 | B2 |
9019087 | Bakircioglu et al. | Apr 2015 | B2 |
9030428 | Fleming | May 2015 | B2 |
9063570 | Weddle et al. | Jun 2015 | B2 |
9070856 | Rose et al. | Jun 2015 | B1 |
9083821 | Hughes | Jul 2015 | B2 |
9092059 | Bhatia | Jul 2015 | B2 |
9117347 | Matthews | Aug 2015 | B2 |
9128523 | Buuck et al. | Sep 2015 | B2 |
9164587 | Da Costa et al. | Oct 2015 | B2 |
9196135 | Shah et al. | Nov 2015 | B2 |
9248840 | Truong | Feb 2016 | B2 |
9326066 | Klippel | Apr 2016 | B2 |
9329721 | Buuck et al. | May 2016 | B1 |
9354704 | Lacroix et al. | May 2016 | B2 |
9368005 | Cruz-Hernandez et al. | Jun 2016 | B2 |
9489047 | Jiang et al. | Nov 2016 | B2 |
9495013 | Underkoffler et al. | Nov 2016 | B2 |
9507423 | Gandhi et al. | Nov 2016 | B2 |
9513709 | Gregorio et al. | Dec 2016 | B2 |
9520036 | Buuck | Dec 2016 | B1 |
9588586 | Rihn | Mar 2017 | B2 |
9640047 | Choi et al. | May 2017 | B2 |
9652041 | Jiang et al. | May 2017 | B2 |
9696859 | Heller et al. | Jul 2017 | B1 |
9697450 | Lee | Jul 2017 | B1 |
9715300 | Sinclair et al. | Jul 2017 | B2 |
9740381 | Chaudhri et al. | Aug 2017 | B1 |
9842476 | Rihn et al. | Dec 2017 | B2 |
9864567 | Seo | Jan 2018 | B2 |
9881467 | Levesque | Jan 2018 | B2 |
9886829 | Levesque | Feb 2018 | B2 |
9946348 | Ullrich et al. | Apr 2018 | B2 |
9947186 | Macours | Apr 2018 | B2 |
9959744 | Koskan et al. | May 2018 | B2 |
9965092 | Smith | May 2018 | B2 |
10032550 | Zhang et al. | Jul 2018 | B1 |
10039080 | Miller et al. | Jul 2018 | B2 |
10055950 | Saboune et al. | Aug 2018 | B2 |
10074246 | Da Costa et al. | Sep 2018 | B2 |
10102722 | Levesque et al. | Oct 2018 | B2 |
10110152 | Hajati | Oct 2018 | B1 |
10171008 | Nishitani et al. | Jan 2019 | B2 |
10175763 | Shah | Jan 2019 | B2 |
10191579 | Forlines et al. | Jan 2019 | B2 |
10264348 | Harris et al. | Apr 2019 | B1 |
10402031 | Vandermeijden et al. | Sep 2019 | B2 |
10564727 | Billington et al. | Feb 2020 | B2 |
10620704 | Rand et al. | Apr 2020 | B2 |
10667051 | Stahl | May 2020 | B2 |
10726638 | Mondello et al. | Jul 2020 | B2 |
10735956 | Bae et al. | Aug 2020 | B2 |
10782785 | Hu et al. | Sep 2020 | B2 |
10795443 | Hu et al. | Oct 2020 | B2 |
10820100 | Stahl et al. | Oct 2020 | B2 |
10828672 | Stahl et al. | Nov 2020 | B2 |
10832537 | Doy et al. | Nov 2020 | B2 |
10848886 | Rand | Nov 2020 | B2 |
10860202 | Sepehr et al. | Dec 2020 | B2 |
10969871 | Rand et al. | Apr 2021 | B2 |
10976825 | Das et al. | Apr 2021 | B2 |
11069206 | Rao et al. | Jul 2021 | B2 |
11079874 | Lapointe et al. | Aug 2021 | B2 |
11139767 | Janko et al. | Oct 2021 | B2 |
11150733 | Das et al. | Oct 2021 | B2 |
11259121 | Lindemann et al. | Feb 2022 | B2 |
11263877 | Marchais et al. | Mar 2022 | B2 |
11460526 | Foo et al. | Oct 2022 | B1 |
20010043714 | Asada et al. | Nov 2001 | A1 |
20020018578 | Burton | Feb 2002 | A1 |
20020085647 | Oishi et al. | Jul 2002 | A1 |
20030068053 | Chu | Apr 2003 | A1 |
20030214485 | Roberts | Nov 2003 | A1 |
20050031140 | Browning | Feb 2005 | A1 |
20050134562 | Grant et al. | Jun 2005 | A1 |
20050195919 | Cova | Sep 2005 | A1 |
20060028095 | Maruyama et al. | Feb 2006 | A1 |
20060197753 | Hotelling | Sep 2006 | A1 |
20070013337 | Liu et al. | Jan 2007 | A1 |
20070024254 | Radecker et al. | Feb 2007 | A1 |
20070241816 | Okazaki et al. | Oct 2007 | A1 |
20080077367 | Odajima | Mar 2008 | A1 |
20080226109 | Yamakata et al. | Sep 2008 | A1 |
20080240458 | Goldstein et al. | Oct 2008 | A1 |
20080293453 | Atlas et al. | Nov 2008 | A1 |
20080316181 | Nurmi | Dec 2008 | A1 |
20090020343 | Rothkopf et al. | Jan 2009 | A1 |
20090079690 | Watson et al. | Mar 2009 | A1 |
20090088220 | Persson | Apr 2009 | A1 |
20090096632 | Ullrich et al. | Apr 2009 | A1 |
20090102805 | Meijer et al. | Apr 2009 | A1 |
20090128306 | Luden et al. | May 2009 | A1 |
20090153499 | Kim et al. | Jun 2009 | A1 |
20090189867 | Krah et al. | Jul 2009 | A1 |
20090278819 | Goldenberg et al. | Nov 2009 | A1 |
20090313542 | Cruz-Hernandez et al. | Dec 2009 | A1 |
20100013761 | Birnbaum et al. | Jan 2010 | A1 |
20100080331 | Garudadri et al. | Apr 2010 | A1 |
20100085317 | Park | Apr 2010 | A1 |
20100141408 | Doy et al. | Jun 2010 | A1 |
20100141606 | Bae et al. | Jun 2010 | A1 |
20100260371 | Afshar | Oct 2010 | A1 |
20100261526 | Anderson et al. | Oct 2010 | A1 |
20110056763 | Tanase et al. | Mar 2011 | A1 |
20110075835 | Hill | Mar 2011 | A1 |
20110077055 | Pakula et al. | Mar 2011 | A1 |
20110141052 | Bernstein et al. | Jun 2011 | A1 |
20110161537 | Chang | Jun 2011 | A1 |
20110163985 | Bae et al. | Jul 2011 | A1 |
20110167391 | Momeyer et al. | Jul 2011 | A1 |
20120011436 | Jinkinson | Jan 2012 | A1 |
20120105367 | Son | May 2012 | A1 |
20120112894 | Yang et al. | May 2012 | A1 |
20120206246 | Cruz-Hernandez et al. | Aug 2012 | A1 |
20120206247 | Bhatia et al. | Aug 2012 | A1 |
20120229264 | Company Bosch et al. | Sep 2012 | A1 |
20120249462 | Flanagan et al. | Oct 2012 | A1 |
20120253698 | Cokonaj | Oct 2012 | A1 |
20120306631 | Hughes | Dec 2012 | A1 |
20130016855 | Lee et al. | Jan 2013 | A1 |
20130027359 | Schevin et al. | Jan 2013 | A1 |
20130038792 | Quigley et al. | Feb 2013 | A1 |
20130096849 | Campbell et al. | Apr 2013 | A1 |
20130141382 | Simmons et al. | Jun 2013 | A1 |
20130275058 | Awad | Oct 2013 | A1 |
20130289994 | Newman et al. | Oct 2013 | A1 |
20130307786 | Heubel | Nov 2013 | A1 |
20140035736 | Weddle et al. | Feb 2014 | A1 |
20140056461 | Afshar | Feb 2014 | A1 |
20140064516 | Cruz-Hernandez et al. | Mar 2014 | A1 |
20140079248 | Short et al. | Mar 2014 | A1 |
20140085064 | Crawley et al. | Mar 2014 | A1 |
20140118125 | Bhatia | May 2014 | A1 |
20140118126 | Garg et al. | May 2014 | A1 |
20140119244 | Steer et al. | May 2014 | A1 |
20140125467 | Da Costa et al. | May 2014 | A1 |
20140139327 | Bau et al. | May 2014 | A1 |
20140176415 | Buuck et al. | Jun 2014 | A1 |
20140205260 | Lacroix et al. | Jul 2014 | A1 |
20140222377 | Bitan et al. | Aug 2014 | A1 |
20140226068 | Lacroix et al. | Aug 2014 | A1 |
20140253303 | Levesque | Sep 2014 | A1 |
20140292501 | Lim et al. | Oct 2014 | A1 |
20140300454 | Lacroix et al. | Oct 2014 | A1 |
20140340209 | Lacroix et al. | Nov 2014 | A1 |
20140347176 | Modarres et al. | Nov 2014 | A1 |
20150049882 | Chiu et al. | Feb 2015 | A1 |
20150061846 | Yliaho | Mar 2015 | A1 |
20150070149 | Cruz-Hernandez et al. | Mar 2015 | A1 |
20150070151 | Cruz-Hernandez et al. | Mar 2015 | A1 |
20150070260 | Saboune et al. | Mar 2015 | A1 |
20150077324 | Birnbaum et al. | Mar 2015 | A1 |
20150084752 | Heubel et al. | Mar 2015 | A1 |
20150116205 | Westerman et al. | Apr 2015 | A1 |
20150130767 | Myers | May 2015 | A1 |
20150208189 | Tsai | Jul 2015 | A1 |
20150216762 | Oohashi et al. | Aug 2015 | A1 |
20150234464 | Yliaho | Aug 2015 | A1 |
20150264455 | Granoto et al. | Sep 2015 | A1 |
20150268768 | Woodhull | Sep 2015 | A1 |
20150324116 | Marsden et al. | Nov 2015 | A1 |
20150325116 | Umminger, III | Nov 2015 | A1 |
20150339898 | Saboune et al. | Nov 2015 | A1 |
20150341714 | Ahn et al. | Nov 2015 | A1 |
20150356981 | Johnson et al. | Dec 2015 | A1 |
20160004311 | Yliaho | Jan 2016 | A1 |
20160007095 | Lacroix | Jan 2016 | A1 |
20160063826 | Morrell et al. | Mar 2016 | A1 |
20160070392 | Wang et al. | Mar 2016 | A1 |
20160074278 | Muench et al. | Mar 2016 | A1 |
20160097662 | Chang et al. | Apr 2016 | A1 |
20160132118 | Park et al. | May 2016 | A1 |
20160162031 | Westerman et al. | Jun 2016 | A1 |
20160179203 | Modarres et al. | Jun 2016 | A1 |
20160187987 | Ulrich et al. | Jun 2016 | A1 |
20160239089 | Taninaka et al. | Aug 2016 | A1 |
20160246378 | Sampanes et al. | Aug 2016 | A1 |
20160277821 | Kunimoto | Sep 2016 | A1 |
20160291731 | Liu | Oct 2016 | A1 |
20160328065 | Johnson | Nov 2016 | A1 |
20160358605 | Ganong, III et al. | Dec 2016 | A1 |
20170031495 | Smith | Feb 2017 | A1 |
20170052593 | Jiang et al. | Feb 2017 | A1 |
20170078804 | Guo et al. | Mar 2017 | A1 |
20170083096 | Rihn et al. | Mar 2017 | A1 |
20170090572 | Holenarsipur et al. | Mar 2017 | A1 |
20170090573 | Hajati et al. | Mar 2017 | A1 |
20170153760 | Chawda et al. | Jun 2017 | A1 |
20170168574 | Zhang | Jun 2017 | A1 |
20170168773 | Keller et al. | Jun 2017 | A1 |
20170169674 | Macours | Jun 2017 | A1 |
20170180863 | Biggs et al. | Jun 2017 | A1 |
20170220197 | Matsumoto et al. | Aug 2017 | A1 |
20170256145 | Macours et al. | Sep 2017 | A1 |
20170357440 | Tse | Dec 2017 | A1 |
20180021811 | Kutej et al. | Jan 2018 | A1 |
20180033946 | Kemppinen et al. | Feb 2018 | A1 |
20180059733 | Gault et al. | Mar 2018 | A1 |
20180059793 | Hajati | Mar 2018 | A1 |
20180067557 | Robert et al. | Mar 2018 | A1 |
20180074637 | Rosenberg et al. | Mar 2018 | A1 |
20180082673 | Tzanetos | Mar 2018 | A1 |
20180084362 | Zhang et al. | Mar 2018 | A1 |
20180095596 | Turgeman | Apr 2018 | A1 |
20180151036 | Cha et al. | May 2018 | A1 |
20180158289 | Vasilev et al. | Jun 2018 | A1 |
20180159452 | Eke et al. | Jun 2018 | A1 |
20180159457 | Eke | Jun 2018 | A1 |
20180159545 | Eke et al. | Jun 2018 | A1 |
20180160227 | Lawrence et al. | Jun 2018 | A1 |
20180165925 | Israr et al. | Jun 2018 | A1 |
20180178114 | Mizuta et al. | Jun 2018 | A1 |
20180182212 | Li et al. | Jun 2018 | A1 |
20180183372 | Li et al. | Jun 2018 | A1 |
20180194369 | Lisseman et al. | Jul 2018 | A1 |
20180196567 | Klein et al. | Jul 2018 | A1 |
20180224963 | Lee | Aug 2018 | A1 |
20180227063 | Heubel et al. | Aug 2018 | A1 |
20180237033 | Hakeem et al. | Aug 2018 | A1 |
20180206282 | Singh | Sep 2018 | A1 |
20180253123 | Levesque et al. | Sep 2018 | A1 |
20180255411 | Lin et al. | Sep 2018 | A1 |
20180267897 | Jeong | Sep 2018 | A1 |
20180294757 | Feng et al. | Oct 2018 | A1 |
20180301060 | Israr et al. | Oct 2018 | A1 |
20180304310 | Long et al. | Oct 2018 | A1 |
20180321056 | Yoo et al. | Nov 2018 | A1 |
20180321748 | Rao et al. | Nov 2018 | A1 |
20180323725 | Cox et al. | Nov 2018 | A1 |
20180329172 | Tabuchi | Nov 2018 | A1 |
20180335848 | Moussette et al. | Nov 2018 | A1 |
20180367897 | Bjork et al. | Dec 2018 | A1 |
20190020760 | DeBates | Jan 2019 | A1 |
20190035235 | Da Costa et al. | Jan 2019 | A1 |
20190227628 | Rand et al. | Jan 2019 | A1 |
20190044651 | Nakada | Feb 2019 | A1 |
20190051229 | Ozguner et al. | Feb 2019 | A1 |
20190064925 | Kim et al. | Feb 2019 | A1 |
20190069088 | Seiler | Feb 2019 | A1 |
20190073078 | Sheng et al. | Mar 2019 | A1 |
20190102031 | Shutzberg et al. | Apr 2019 | A1 |
20190103829 | Vasudevan et al. | Apr 2019 | A1 |
20190138098 | Shah | May 2019 | A1 |
20190163234 | Kim et al. | May 2019 | A1 |
20190196596 | Yokoyama et al. | Jun 2019 | A1 |
20190206396 | Chen | Jul 2019 | A1 |
20190215349 | Adams et al. | Jul 2019 | A1 |
20190220095 | Ogita et al. | Jul 2019 | A1 |
20190228619 | Yokoyama et al. | Jul 2019 | A1 |
20190114496 | Lesso | Aug 2019 | A1 |
20190235629 | Hu et al. | Aug 2019 | A1 |
20190294247 | Hu et al. | Sep 2019 | A1 |
20190296674 | Janko et al. | Sep 2019 | A1 |
20190297418 | Stahl | Sep 2019 | A1 |
20190305851 | Vegas-Olmos et al. | Oct 2019 | A1 |
20190311590 | Doy et al. | Oct 2019 | A1 |
20190341903 | Kim | Nov 2019 | A1 |
20190384393 | Cruz-Hernandez et al. | Dec 2019 | A1 |
20190384898 | Chen et al. | Dec 2019 | A1 |
20200117506 | Chan | Apr 2020 | A1 |
20200139403 | Palit | May 2020 | A1 |
20200150767 | Karimi Eskandary et al. | May 2020 | A1 |
20200218352 | Macours et al. | Jul 2020 | A1 |
20200231085 | Kunii et al. | Jul 2020 | A1 |
20200313529 | Lindemann et al. | Oct 2020 | A1 |
20200313654 | Marchais et al. | Oct 2020 | A1 |
20200314969 | Marchais et al. | Oct 2020 | A1 |
20200401292 | Lorenz et al. | Dec 2020 | A1 |
20200403546 | Janko et al. | Dec 2020 | A1 |
20210108975 | Peso Parada et al. | Apr 2021 | A1 |
20210125469 | Alderson | Apr 2021 | A1 |
20210153562 | Fishwick et al. | May 2021 | A1 |
20210157436 | Peso Parada et al. | May 2021 | A1 |
20210174777 | Marchais et al. | Jun 2021 | A1 |
20210175869 | Taipale | Jun 2021 | A1 |
20210200316 | Das et al. | Jul 2021 | A1 |
20210325967 | Khenkin et al. | Oct 2021 | A1 |
20210328535 | Khenkin et al. | Oct 2021 | A1 |
20210365118 | Rajapurkar et al. | Nov 2021 | A1 |
20220026989 | Rao et al. | Jan 2022 | A1 |
20220328752 | Lesso et al. | Oct 2022 | A1 |
20220404398 | Reynaga et al. | Dec 2022 | A1 |
20220408181 | Hendrix et al. | Dec 2022 | A1 |
Number | Date | Country |
---|---|---|
2002347829 | Apr 2003 | AU |
103165328 | Jun 2013 | CN |
107835968 | Mar 2018 | CN |
210628147 | May 2020 | CN |
114237414 | Mar 2022 | CN |
0784844 | Jun 2005 | EP |
2306269 | Apr 2011 | EP |
2363785 | Sep 2011 | EP |
2487780 | Aug 2012 | EP |
2600225 | Jun 2013 | EP |
2846218 | Mar 2015 | EP |
2846229 | Mar 2015 | EP |
2846329 | Mar 2015 | EP |
2988528 | Feb 2016 | EP |
3125508 | Feb 2017 | EP |
3379382 | Sep 2018 | EP |
201747044027 | Aug 2018 | IN |
H02130433 | May 1990 | JP |
08149006 | Jun 1996 | JP |
H10184782 | Jul 1998 | JP |
6026751 | Nov 2016 | JP |
6250985 | Dec 2017 | JP |
6321351 | May 2018 | JP |
20120126446 | Nov 2012 | KR |
2013104919 | Jul 2013 | WO |
2013186845 | Dec 2013 | WO |
2014018086 | Jan 2014 | WO |
2014094283 | Jun 2014 | WO |
2016105496 | Jun 2016 | WO |
2016164193 | Oct 2016 | WO |
2017113651 | Jul 2017 | WO |
2018053159 | Mar 2018 | WO |
2018067613 | Apr 2018 | WO |
2018125347 | Jul 2018 | WO |
2020055405 | Mar 2020 | WO |
Entry |
---|
First Examination Opinion Notice, State Intellectual Property Office of the People's Republic of China, Application No. 201880037435.X, dated Dec. 31, 2020. |
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2020/051035, dated Jul. 10, 2020. |
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2020/050823, dated Jun. 30, 2020. |
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/US2020/024864, dated Jul. 6, 2020. |
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2020/051037, dated Jul. 9, 2020. |
Communication Relating to the Results of the Partial International Search, and Provisional Opinion Accompanying the Partial Search Result, of the International Searching Authority, International Application No. PCT/GB2020/050822, dated Jul. 9, 2020. |
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2019/052991, dated Mar. 17, 2020, received by Applicant Mar. 19, 2020. |
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2019/023342, dated Jun. 9, 2020. |
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2020/050822, dated Aug. 31, 2020. |
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2020/051438, dated Sep. 28, 2020. |
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2019/050964, dated Sep. 3, 2019. |
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2019/050770, dated Jul. 5, 2019. |
Communication Relating to the Results of the Partial International Search, and Provisional Opinion Accompanying the Partial Search Result, of the International Searching Authority, International Application No. PCT/US2018/031329, dated Jul. 20, 2018. |
Combined Search and Examination Report, UKIPO, Application No. GB1720424.9, dated Jun. 5, 2018. |
Invitation to Pay Additional Fees, Partial International Search Report and Provisional Opinion of the International Searching Authority, International Application No. PCT/US2020/052537, dated Jan. 14, 2021. |
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/US2020/056610, dated Jan. 21, 2021. |
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2020/052537, dated Mar. 9, 2021. |
Steinbach et al., Haptic Data Compression and Communication, IEEE Signal Processing Magazine, Jan. 2011. |
Pezen et al., Syntacts Open-Source Software and Hardware for Audio-Controlled Haptics, IEEE Transactions on Haptics, vol. 14, No. 1, Jan.-Mar. 2021. |
Danieau et al., Enhancing Audiovisual Experience with Haptic Feedback: A Survey on HAV, IEEE Transactions on Haptics, vol. 6, No. 2, Apr.-Jun. 2013. |
Danieau et al., Toward Haptic Cinematography: Enhancing Movie Experiences with Camera-Based Haptic Effects, IEEE Computer Society, IEEE MultiMedia, Apr.-Jun. 2014. |
Jaijongrak et al., A Haptic and Auditory Assistive User Interface: Helping the Blinds on their Computer Operations, 2011 IEEE International Conference on Rehabilitation Robotics, Rehab Week Zurich, ETH Zurich Science City, Switzerland, Jun. 29-Jul. 1, 2011. |
Lim et al., An Audio-Haptic Feedbacks for Enhancing User Experience in Mobile Devices, 2013 IEEE International Conference on Consumer Electronics (ICCE). |
Weddle et al., How Does Audio-Haptic Enhancement Influence Emotional Response to Mobile Media, 2013 Fifth International Workshop on Quality of Multimedia Experience (QoMEX), QMEX 2013. |
Final Notice of Preliminary Rejection, Korean Patent Office, Application No. 10-2019-7036236, dated Nov. 29, 2021. |
Examination Report under Section 18(3), United Kingdom Intellectual Property Office, Application No. GB2018050.1, dated Dec. 22, 2021. |
Second Office Action, National Intellectual Property Administration, PRC, Application No. 2019800208570, dated Jan. 19, 2022. |
Examination Report under Section 18(3), UKIPO, Application No. GB2018051.9, dated Nov. 5, 2021. |
Office Action of the Intellectual Property Office, ROC (Taiwan) Patent Application No. 107115475, dated Apr. 30, 2021. |
First Office Action, China National Intellectual Property Administration, Patent Application No. 2019800208570, dated Jun. 3, 2021. |
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/US2021/021908, dated Jun. 9, 2021. |
Notice of Preliminary Rejection, Korean Intellectual Property Office, Application No. 10-2019-7036236, dated Jun. 29, 2021. |
Combined Search and Examination Report, United Kingdom Intellectual Property Office, Application No. GB2018051.9, dated Jun. 30, 2021. |
Communication pursuant to Rule 164(2)(b) and Article 94(3) EPC, European Patent Office, Application No. 18727512.8, dated Jul. 8, 2021. |
Gottfried Behler: “Measuring the Loudspeaker's Impedance during Operation for the Derivation of the Voice Coil Temperature”, AES Convention Preprint, Feb. 25, 1995 (Feb. 25, 1995), Paris. |
First Office Action, China National Intellectual Property Administration, Patent Application No. 2019800211287, dated Jul. 5, 2021. |
Examination Report under Section 18(3), United Kingdom Intellectual Property Office, Application No. GB2106247.6, dated Mar. 31, 2022. |
Combined Search and Examination Report under Sections 17 and 18(3), UKIPO, Application No. GB2210174.5, dated Aug. 1, 2022. |
Examination Report under Sections 17 and 18(3), UKIPO, Application No. GB2112207.2, dated Aug. 18, 2022. |
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/US2022/030541, dated Sep. 1, 2022. |
Vanderborght, B. et al., Variable impedance actuators: A review; Robotics and Autonomous Systems 61, Aug. 6, 2013, pp. 1601-1614. |
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/US2022/033190, dated Sep. 8, 2022. |
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/US2022/033230, dated Sep. 15, 2022. |
Examination Report under Sections 17 and 18(3), UKIPO, Application No. GB2115048.7, dated Aug. 24, 2022. |
Communication pursuant to Article 94(3) EPC, European Patent Application No. 18727512.8, dated Sep. 26, 2022. |
Examination Report under Section 18(3), UKIPO, Application No. GB2112207.2, dated Nov. 7, 2022. |
Examination Report, Intellectual Property India, Application No. 202117019138, dated Jan. 4, 2023. |
Examination Report under Section 18(3), UKIPO, Application No. GB2113228.7, dated Feb. 10, 2023. |
Examination Report under Section 18(3), UKIPO, Application No. GB2113154.5, dated Feb. 17, 2023. |
Number | Date | Country | |
---|---|---|---|
20200401292 A1 | Dec 2020 | US |
Number | Date | Country | |
---|---|---|---|
62864649 | Jun 2019 | US |