The described embodiments relate generally to electronic devices. More particularly, the present embodiments relate to an electronic device that includes a haptic engine used to detect an input action associated with an input device and to provide haptic feedback based on the detected input action.
Portable electronic devices have become increasingly popular, and the features and functionality provided by portable electronic devices continue to expand to meet the needs and expectations of many consumers. For example, some portable electronic devices include features such as touch sensors, a display, various input devices, speakers, and microphones. In some cases, the electronic device may take on a small form factor. In such cases, it can be challenging to fit all of the components needed to provide the various functionalities into the limited space of the electronic device.
Embodiments disclosed herein provide an electronic device that is configured to provide haptic feedback to a user based on an input action associated with an input device. A haptic engine is configured to detect an input action associated with the input device (e.g., a translational input) and produce a haptic output based on the detected input action. The haptic output may be perceived by a user as haptic feedback. The haptic feedback can indicate to the user that the user input has been received by the electronic device.
In some embodiments, an input device includes a haptic engine operably connected to or mechanically coupled to an input surface of the input device and to a processing device. The haptic engine may include an electromagnetic actuator, such as a linear actuator, that detects a user input or input action associated with the input surface and provides a haptic output based on the detected input action. The electromagnetic actuator includes a magnet assembly and a coil assembly adjacent to the magnet assembly. For example, the coil assembly can at least partially surround the magnet assembly. The haptic engine detects the input action based on a first movement between the magnet assembly and the coil assembly with respect to each other, the first movement inducing an input device signal in the coil assembly. The processing device is configured to receive or respond to the input device signal and to responsively cause a haptic output signal to be transmitted to the haptic engine. The haptic output signal produces a second movement between the magnet assembly and the coil assembly with respect to each other to produce a haptic output that is applied or transferred to the input surface.
In some embodiments, the input device is an input button in an electronic watch (e.g., a smart watch). The electronic watch also includes a display and a processing device operably connected to the display and to the electromagnetic actuator. An input action (e.g., a translational input action) received by the input button causes the processing device to receive or respond to an input device signal from the electromagnetic actuator and to cause a haptic output signal to be transmitted to the electromagnetic actuator. The display is configured to display a user interface screen associated with an application program, and the input action also causes a change in the user interface screen displayed on the display. For example, an icon may be selected and a different user interface screen displayed based on the selected icon, or the digits of the time displayed on the display may be changed based on the input action.
In some embodiments, an electronic device includes an input device configured to receive a user input, a haptic device operably connected to the input device, and a processing device operably connected to the haptic device. The processing device is configured to receive or respond to an input device signal from the haptic device based on the user input. In response to the input device signal, the processing device is configured to cause a haptic output signal to be transmitted to the haptic device. The haptic output signal causes the haptic device to produce a haptic output.
In some embodiments, an electronic device includes an input device configured to receive a user input and a haptic engine operably connected to the input device. The haptic engine is configured to detect the user input and produce a haptic output based on the detected user input. The haptic engine is further configured to operate in a first mode in which the haptic engine engages the input device, and in a second mode in which the haptic engine is not engaged with the input device.
In some embodiments, an electronic watch includes an electromagnetic actuator operably connected to an input button, and a processing device operably connected to the electromagnetic actuator. The electromagnetic actuator includes a magnet assembly and a coil assembly adjacent the magnet assembly. The electromagnetic actuator is configured to detect an input action (e.g., a translational input action) provided to the input button based on a first movement between the magnet assembly and the coil assembly. The first movement induces an input device signal in the coil assembly. The processing device is configured to receive or respond to the input device signal and to cause a haptic output signal to be transmitted to the electromagnetic actuator to cause a second movement between the magnet assembly and the coil assembly to produce a haptic output. The haptic output may be applied to the input button and/or to another region or surface of the electronic device.
In some embodiments, an electronic watch includes a linear actuator operably connected to a crown and a processing device operably connected to the linear actuator. The linear actuator includes a magnet assembly movably coupled to a shaft and a coil assembly adjacent the magnet assembly. The linear actuator is configured to detect an input action (e.g., a translational input action) provided to the crown based on a first movement between the magnet assembly and the coil assembly, the first movement inducing an input device signal. The processing device is configured to receive or respond to the input device signal and to cause a haptic output signal to be transmitted to the linear actuator. The haptic output signal causes a second movement between the magnet assembly and the coil assembly to produce a haptic output. The haptic output may be applied to the crown and/or to another region or surface of the electronic device.
The disclosure will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements, and in which:
The cross-hatching in the figures is provided to distinguish the elements or components from one another. The cross-hatching is not intended to indicate a type of material or materials or the nature of the material(s).
Reference will now be made in detail to representative embodiments illustrated in the accompanying drawings. It should be understood that the following descriptions are not intended to limit the embodiments to one preferred embodiment. To the contrary, it is intended to cover alternatives, modifications, and equivalents as can be included within the spirit and scope of the described embodiments as defined by the appended claims.
The following disclosure relates to an electronic device that is configured to provide haptic feedback to a user. In general, a haptic device may be configured to produce a mechanical movement or vibration that may be transmitted through the enclosure and/or an input device of the electronic device. In some cases, the movement or vibration is transmitted to the skin of the user and perceived as a stimulus or haptic feedback by the user. In some embodiments, the haptic feedback may be coupled to an input action on an input device. One example of an input action is the pressing of an input button. The haptic feedback can indicate to a user that the input action has been received or registered by the input device and/or the electronic device.
In a particular embodiment, the electronic device includes an input device and a haptic engine operably connected to the input device. A haptic engine may include an electromechanical assembly that is capable of producing a change in momentum using a moving mass that results in a haptic output. In the embodiments described herein, the haptic engine is configured to function as both an input sensor and a haptic device. In particular, the input sensor may be integrated within the haptic device in that the same electromechanical components produce and receive signals for both the haptic device and the input sensor. For example, when the haptic device is an electromagnetic actuator, an input action (e.g., a button press) can cause the magnet assembly and the coil assembly to move with respect to each other. This movement induces a current (or "input device signal") in the coil assembly. The input device signal indicates that an input action associated with the input device has occurred. A processing device may be responsive to the input device signal, and may cause a haptic output signal to be transmitted to the coil assembly. The haptic output signal may cause the haptic device to produce a haptic output.
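The signal flow just described can be summarized in a brief control-loop sketch. The sketch below is illustrative only; the function names, threshold value, and drive parameters are assumptions introduced here and are not elements of the described embodiments.

    # Illustrative sketch: the same coil senses motion (induced voltage) and
    # produces the haptic output (driven current). All names and values are
    # hypothetical.
    INPUT_THRESHOLD_V = 0.05  # assumed minimum induced voltage treated as an input action

    def haptic_engine_loop(read_coil_voltage, drive_coil):
        while True:
            induced = read_coil_voltage()           # "input device signal"
            if abs(induced) > INPUT_THRESHOLD_V:    # input action detected
                # "haptic output signal": a brief drive pulse moves the magnet
                # assembly relative to the coil to produce the haptic output
                drive_coil(amplitude=1.0, duration_ms=10)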
As used herein, the term “input action” may be construed broadly to include any type of user input associated with an input device, or with one or more components in an input device. Example input actions include, but are not limited to, a translational input, touch input, force input, motion input, acceleration input, pressure input, velocity input, rotational input, and combinations thereof. In a non-limiting example, an input device can be an input button in an electronic device, and one input action associated with the input button is a button press or translational input. The button press may cause the input button, or components within the input button, to translate or move in the same direction as the direction of the button press (e.g., horizontal direction).
Additionally or alternatively, an input action can include a force input where an amount of force, or varying amounts of force, is applied to an input device. In such embodiments, a processing device operably connected to the haptic engine is configured to detect the applied force on the input device. Additionally or alternatively, the processing device can be configured to detect motion or a rotation of the input device (or of a component in the input device). In a non-limiting example, the input device may be a crown of an electronic watch (e.g., a smart watch) that a user can rotate to provide one or more inputs to the smart watch.
In general, a haptic engine may produce one or more types of haptic output, such as vibrations, an applied force, movement, and combinations thereof. The haptic output may be transmitted through the enclosure and/or an input device of the electronic device and detected by a user. In some cases, the movement, force, and/or vibration is transmitted to the skin of the user and perceived as a stimulus or haptic feedback by the user. In one non-limiting example, a user can press an input button and the haptic engine can apply a force to the input button in a direction opposite the direction of the button press. The applied force may be perceived by the user as a “tap” or “knock” that indicates the input device and/or the electronic device registered the button press. Alternatively, the haptic engine may move a mass in one direction or in opposing directions in response to the button press. The movement may be perceived by the user as a vibration that indicates the input device and/or the electronic device registered the button press.
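As an illustration of the two feedback styles described above, the sketch below generates two hypothetical drive waveforms: a constant-polarity pulse that could be perceived as a single tap, and an alternating waveform that could be perceived as a vibration. The function names, sample counts, and cycle count are assumptions introduced for illustration only.

    import math

    def tap_waveform(n_samples=100):
        # constant-polarity pulse: force applied in one direction
        # (e.g., opposite the direction of the button press)
        return [1.0] * n_samples

    def vibration_waveform(n_samples=100, cycles=5):
        # alternating polarity: the moving mass oscillates in opposing directions
        return [math.sin(2 * math.pi * cycles * i / n_samples) for i in range(n_samples)]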
In some embodiments, the position of the haptic engine can be adjusted so that a haptic output can be applied to different regions of an electronic device. For example, a haptic engine may be positioned at a first position to apply a haptic output directly to an input device (e.g., an exterior surface of an input button). The haptic engine may also be positioned at a second position to apply a haptic output to a different region of the electronic device (e.g., an exterior surface of an enclosure). The position of the haptic engine can be adjusted using any suitable method. For example, in one embodiment an electromagnet or switch may position the haptic device.
Directional terminology, such as “top”, “bottom”, “front”, “back”, “leading”, “trailing”, etc., is used with reference to the orientation of the Figure(s) being described. Because components of embodiments described herein can be positioned in a number of different orientations, the directional terminology is used for purposes of illustration only and is in no way limiting. When used in conjunction with the components of an input device and of an electronic device, the directional terminology is intended to be construed broadly, and therefore should not be interpreted to preclude the presence of one or more intervening components or other intervening features or elements.
These and other embodiments are discussed below with reference to
The electronic device 100 includes input devices 106, 108. In some embodiments, one or both of the input devices 106, 108 may be configured as input/output devices. The term “input device” is intended to be construed broadly to include both input and input/output devices. An input device may include an input component, such as a button, knob, dial, crown, and the like. Although shown on a side of the electronic device 100, the input devices 106, 108 can be positioned substantially anywhere on the electronic device 100.
As will be described in more detail later, the electronic device 100 includes at least one haptic engine (see e.g.,
In the illustrated embodiment, the input device 106 is a crown and the input device 108 is an input button. Input devices in other embodiments are not limited to these configurations. For example, an input device may be a rocker switch, a portion of the enclosure 102, one or more keys in a keyboard, a slide switch, a virtual icon or image on a display, or any other suitable input device.
The input device 106 (e.g., crown) is configured to receive translational and rotational input actions. For example, the input device 106 may include a shaft that extends into the electronic device 100. Pressing the input device 106 can cause the shaft, or components coupled to the shaft, to move or translate a given distance. Additionally or alternatively, the shaft may rotate when a user rotates the input device 106. The amount of shaft rotation can be detected and measured by an optical encoder positioned adjacent to the shaft. The amount of shaft rotation may be used as an input to the electronic device 100 and/or to an application program running on the electronic device 100.
One or more functions can be performed when the input device 106 is rotated and/or pressed. For example, if the display 104 of the electronic device 100 is displaying a time keeping application, the input device 106 may be rotated in either direction to change or adjust the position of the hands or the digits that are displayed for the time keeping application. Additionally or alternatively, the input device 106 may be rotated to move a cursor or other type of selection mechanism from a first displayed location to a second displayed location in order to select an icon or move the selection mechanism between various icons that are presented on the display 104. Additionally or alternatively, the input device 106 may be pressed to perform various functions, such as changing the image on a display, waking the electronic device 100 from a sleep state, and/or to select or activate an application. In some embodiments, the input device 106 can be rotated or pressed to disable an application or function. For example, the input device 106 may be pressed to disable an alert produced by an application on the electronic device 100 or received by the electronic device 100.
In some embodiments, the input device 108 (e.g., an input component or input button) can be configured to be pressed to cause various functions to be performed and/or disabled. The input device 108 may include a shaft that extends into the electronic device 100. Pressing the input device 108 can cause the shaft, or components coupled to the shaft, to move or translate a given distance. For example, a single press can activate an application and/or display a particular image or screen on the display. Additionally or alternatively, a single press may disable or delay an alert. A multiple press (e.g., a double press or double click) can activate an application and a component within the electronic device 100. For example, a double click may access an application that uses a wireless communication network to transmit data associated with the application (e.g., an electronic payment application). Additionally or alternatively, a press-hold may operate to turn on and turn off the electronic device 100 or to place the electronic device 100 in a power saving mode (e.g., a mode where minimal functions and applications operate and other applications and functions are disabled).
In some embodiments, pressing both of the input devices 106, 108 in various combinations can cause one or more functions to be performed. For example, pressing the input device 106 and then immediately pressing the input device 108 can cause an action to be performed on the electronic device 100. Additionally or alternatively, simultaneous press-holds on the input devices 106, 108 can cause another action to be performed on the electronic device 100.
The electronic device 100 further includes an enclosure 102 that forms an outer surface or partial outer surface for the internal components of the electronic device 100. The enclosure 102 defines openings and/or apertures that receive and/or support a display 104 and the input devices 106, 108. The enclosure 102 can be formed of one or more components operably connected together, such as a front piece and a back piece. Alternatively, the enclosure 102 can be formed of a single piece operably connected to the display 104. In the illustrated embodiment, the enclosure 102 is formed into a substantially rectangular shape, although this configuration is not required. For example, certain embodiments may include a substantially circular enclosure 102.
The display 104 can provide a visual output for the electronic device 100 and/or function to receive user inputs to the electronic device 100. For example, the display 104 may incorporate an input device configured to receive touch input, force input, temperature input, and the like. The display 104 may be substantially any size and may be positioned substantially anywhere on the electronic device 100. The display 104 can be implemented with any suitable display, including, but not limited to, a multi-touch sensing touchscreen device that uses a liquid crystal display (LCD) element, a light emitting diode (LED) element, an organic light-emitting display (OLED) element, or an organic electroluminescence (OEL) element.
For example, in one embodiment the haptic engine 200 is configured to apply a haptic output to a bottom surface of the enclosure 102 (e.g., the momentum of the haptic output can be transferred to the bottom surface). When a user is wearing the electronic device 100 on his or her wrist, the haptic output may be detected by the user as haptic feedback because the bottom surface of the electronic device 100 is in contact with the wrist. In other embodiments, the haptic output may be applied or transferred to a side of the electronic device 100, a top surface of the electronic device 100, multiple surfaces of the electronic device 100, and combinations thereof.
Additionally or alternatively, the haptic engine 200 may be configured to produce a haptic output that is applied or transferred to the input device 108. The haptic engine 200 may be mechanically or structurally coupled to the input surface 210 (of the input device 108) to receive movement from and/or transmit movement to the input surface 210. By mechanically coupling the haptic engine 200 to the input surface 210, movement of the input surface 210 results in movement of one or more components of the haptic engine 200. In one non-limiting example, when a user applies a force to the input surface 210 of the input device 108 (e.g., presses the input surface 210), the haptic engine 200 can detect such input action and produce a first signal (“input device signal”) that is received by the processing device 202. Based on the input device signal, the processing device 202 can cause a second signal (“haptic output signal”) to be transmitted to the haptic engine 200 that causes the haptic engine 200 to produce a haptic output (e.g., a vibration or an applied force). A user may then detect the haptic output as haptic feedback when the user's finger is in contact with the input surface 210.
In some embodiments, the haptic engine 200 can be configured to operate in two or more modes. For example, in a first mode, the haptic engine 200 may be positioned at a first position to apply a haptic output directly to the interior surface of an input device (e.g., input device 108). In a second mode, the haptic engine 200 can be positioned at a second position to produce a haptic output within the electronic device 100 and/or to apply a haptic output to one or more non-input-device surfaces or regions of the electronic device 100.
Additionally or alternatively, a second haptic device 204 may be operably connected to the processing device 202. In such embodiments, the haptic engine 200 can produce a haptic output for the input device 108 while the second haptic device 204 may produce a haptic output for one or more different surfaces (non-input-device surfaces) or regions of the electronic device 100.
Any suitable type of haptic device can be used in the haptic engine 200 and/or the second haptic device 204. Example haptic devices include, but are not limited to, actuators, vibrators, and other types of motors. As described earlier, a haptic device and haptic engine may produce one or more types of haptic output, such as movement, vibrations, transfer of momentum, and other actions that may produce a perceptible or tactile output.
In some embodiments, an input sensor and a haptic device are separate components within the electronic device 100. In such embodiments, the input sensor can detect or sense an input action using any suitable sensing technology, such as capacitive, piezoelectric, piezoresistive, electromagnetic, ultrasonic, and magnetic sensing technologies. For example, in one embodiment a capacitive input sensor can be used to detect the presence of a user's finger on the input device. Additionally or alternatively, a capacitive sensor may be used to detect a user applying a force on the input device. For example, when the input device is an input button, the input sensor can detect the presence of a user's finger on the button and/or the user pressing the input button.
Example embodiments of a haptic engine will now be discussed.
In the example of
In the illustrated embodiment, the haptic engine 200 includes a magnet assembly 300 coupled to and/or movably positioned about a shaft 302. The magnet assembly 300 can include one or more magnets. In the illustrated embodiment, the magnet assembly 300 includes two magnets 300a, 300b of opposite polarities. The magnets 300a, 300b can be made of any suitable ferromagnetic material, such as neodymium. The shaft 302 may be formed from one or more components that are fixed with respect to each other or may be separated to allow for decoupling of the haptic engine 200 from other elements of the device. The shaft 302 can be made of a non-ferrous material such as tungsten, titanium, stainless steel, or the like.
A coil assembly 304 at least partially surrounds the magnet assembly 300 and/or the shaft 302. The coil assembly 304 includes one or more coils. Each coil can be formed with a winding of a conductive material, such as a metal. In one embodiment, the width of the coil assembly 304 can be less than or substantially equal to the width of the magnet assembly 300. In other embodiments, the width of the coil assembly 304 may be greater than the width of the magnet assembly 300.
In some embodiments, a frame 306 can be positioned at least partially around the coil assembly 304, the magnet assembly 300, and/or the shaft 302 to increase the momentum of the linear actuator. The frame 306 can be made of any suitable material. In one embodiment, the frame 306 is made of a metal, such as tungsten.
The coil assembly 304 and the magnet assembly 300 are positioned such that a first air gap separates the coil assembly 304 from the magnet assembly 300. Similarly, the coil assembly 304 and the frame 306 are positioned such that a second air gap separates the coil assembly 304 from the frame 306. In the illustrated embodiment, the first and second air gaps are located on opposing sides of the coil assembly 304.
In some embodiments, the frame 306 can be disengaged from the input device 108. The shaft 302 extends through a bearing 308 and a collar 310 which support the frame 306. The collar 310 allows the shaft 302 to pass the frame 306 in only one direction. For example, the collar 310 may permit the shaft 302 to move only in a direction away from the input device 108.
The coil assembly 304 may be energized by transmitting a current along a length of a wire that forms a coil in the coil assembly 304. A direction of the current along the wire of the coil determines a direction of a magnetic field that emanates from the coil assembly 304. The opposing polarities of the magnets 300a, 300b generate a radial magnetic field that interacts with the magnetic field of the coil assembly 304. The Lorentz force resulting from the interaction of the magnetic fields causes the frame 306 and the magnet assembly 300 to move in a first direction aligned with the axis of the shaft 302. Reversing the current flow through the coil assembly 304 reverses the Lorentz force. As a result, the magnetic field or force on the magnet assembly 300 is also reversed and the frame 306 and the magnet assembly 300 move in an opposing second direction. Thus, the frame 306 and the magnet assembly 300 can move in one direction or in an oscillating manner depending on the direction of the current flow through the coil assembly 304. In some embodiments, the frame 306 includes one or more magnets 312 that assist in moving the frame 306 and produce increased momentum when a current passes through the coil assembly 304.
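In a simplified model of this interaction, the axial force on the moving assembly follows the Lorentz force law for a conductor in a magnetic field. For a winding of total effective length ℓ carrying current i(t) in a radial field of magnitude B (symbols introduced here for illustration only, not part of the described embodiments), the force may be expressed as

    F(t) = B \, \ell \, i(t)

Because the force is proportional to the signed current, reversing the direction of i(t) reverses F(t), which is why the frame 306 and the magnet assembly 300 can be driven in a single direction or in an oscillating manner.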
When a user provides an input action to the input button 108 (e.g., a button press), the shaft 302, the magnet assembly 300, and the frame 306 can move a given distance into the electronic device 100. This movement induces a current (“input device signal”) in the coil assembly 304. A processing device (e.g., processing device 202 in
A housing 314 may be attached to the enclosure 102 and positioned at least partially around the frame 306, the magnet assembly 300, the coil assembly 304, and the shaft 302. In the illustrated embodiment, the shaft 302 extends through the housing 314 with a contact area 316 attached to the interior surface of the input device 108. The momentum of a haptic output can be transferred to the input device 108 using the contact area 316.
A bracket 322 can at least partially surround the housing 314 and attach to an interior surface of the enclosure 102 using one or more fasteners 324. The bracket 322 fixes the housing 314 to the enclosure 102. Any suitable fastening technique may be used, such as screws, welding, and/or an adhesive. In some embodiments, the shaft 302 can extend into and/or pass through an opening in the bracket 322. This allows the shaft 302 to move in or through the opening when a force is applied to the input device 108.
In the example embodiment, the coil assembly 304 is fixed to the housing 314. The frame 306 and the magnet assembly 300 move with respect to the coil assembly 304. In such embodiments, the coil assembly 304 may not contact any portion of the frame 306 even when the frame 306 and the magnet assembly 300 are maximally displaced within the housing 314 (e.g., to one end of the shaft 302). It should be appreciated that in other embodiments the coil assembly 304 may move instead of, or in addition to, the frame 306 and the magnet assembly 300. However, it may be easier to provide the interconnections for the coil assembly 304 when the coil assembly 304 is fixed to the housing 314. For example, the coil assembly 304 can be electrically connected to a power source using a flexible circuit or other signal line.
A compliant element 318 can be positioned on each side of the frame 306 to bias the frame 306 towards the center region of the travel. The compliant elements 318 provide a return force or local biasing to the frame 306. The compliant elements 318 may be any suitable compliant elements, such as leaf springs, beehive springs, and the like.
In some embodiments, the haptic engine 200 can function as a force sensor. Using the known characteristics of the input device signal and the linear actuator, such as the mass of the magnet assembly 300 and the spring coefficients of the compliant elements 318, the acceleration of the movement of the input device 108 can be correlated to an amount of force.
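One hedged way to express this correlation is a single-degree-of-freedom model of the moving mass (introduced here for illustration; the symbols and the form of the model are assumptions). Treating the magnet assembly 300 as a mass m suspended on the compliant elements 318 with combined stiffness k and damping c, and noting that the signal induced in the coil assembly 304 is approximately proportional to the relative velocity of the magnet assembly and the coil, an estimate of the applied force is

    F_{\text{applied}}(t) \approx m\,\ddot{x}(t) + c\,\dot{x}(t) + k\,x(t)

where x(t) is the displacement of the input device 108 inferred from the input device signal.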
A compliant structure 320 can be positioned between the input device 108 and the enclosure 102 to allow travel between the input device 108 and the enclosure 102 and to return the input device 108 to a resting position. In one embodiment, the compliant structure 320 is positioned around an interior perimeter of the input device 108. In other embodiments, one or more discrete compliant structures 320 may be positioned around an interior perimeter of the input device 108.
As discussed earlier, in some embodiments at least some of the components of the haptic engine 200 are shared and form both an input sensor and a haptic device. In the illustrated embodiment, the magnet assembly 300 and the coil assembly 304 can be used as an input sensor. When a user performs an input action on the input device 108 (e.g., by pressing), the shaft 302, the magnet assembly 300, and the frame 306 can move inward, which in turn induces a current (“input device signal”) in the coil assembly 304. A processing device (not shown) operably connected to the coil assembly 304 may receive or be responsive to the input device signal and cause a haptic output signal to be transmitted to the coil assembly 304. The haptic output signal is transmitted along the length of a wire in a coil in the coil assembly 304, which in turn produces a magnetic field that causes the frame 306 and the magnets 300a, 300b to move and produce a haptic output (an applied force, movement, and/or vibration). The movement, vibration, and/or applied force may be perceived by a user as haptic feedback. Thus, the input action is sensed through the movement of the frame 306 and the magnets 300a, 300b with respect to the coil assembly 304, and a haptic output is produced by the movement of the frame 306 and the magnets 300a, 300b with respect to the coil assembly 304.
Additionally or alternatively, a discrete input sensor can be included in the electronic device. For example, in one embodiment, the compliant structure 320 can be formed as a force sensing layer configured to detect an amount of force applied to the input device 108. In one example, the force sensing layer can include two electrode layers separated by a dielectric or compliant material (e.g., air, foam, silicone, and the like). Each electrode layer can include one or more electrodes that are aligned in at least one direction to produce one or more capacitors. When a force is applied to the input device 108 (e.g., when a user presses the input device 108), the distance between the electrodes in at least one capacitor changes, which changes a capacitance of the capacitor. A processing device (not shown) can receive or be responsive to a signal from each capacitor representing the capacitance of that capacitor. The processing device may be configured to correlate the signal(s) to an amount of force that was applied to the input device 108.
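Under a simplified parallel-plate assumption (introduced here for illustration only), each capacitor in the force sensing layer has capacitance C = εA/d, where A is the electrode overlap area, d the electrode separation, and ε the permittivity of the compliant material. A force that compresses the layer by Δd from a rest separation d₀ changes the capacitance by approximately

    \Delta C \approx \varepsilon A \left( \frac{1}{d_0 - \Delta d} - \frac{1}{d_0} \right)

and, if the compliant material behaves approximately as a spring of stiffness k, the applied force may be estimated as F ≈ k·Δd.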
The force sensing layer provides for a range of force input values that can be used to control a variety of functions. For example, a user can press the input device with a first force to perform a scrolling action at a first speed and press the input device with a second force to perform a scrolling action at a different second speed (e.g., a faster speed).
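A minimal sketch of such a force-to-function mapping is shown below; the thresholds and scroll speeds are hypothetical values chosen only to illustrate the idea of multiple force ranges.

    def scroll_speed(force_newtons):
        # Hypothetical force ranges mapped to scrolling speeds (items per step)
        if force_newtons < 0.5:      # below the first threshold: no scrolling
            return 0
        elif force_newtons < 2.0:    # "first force": scroll at a first speed
            return 1
        else:                        # "second force": scroll at a faster second speed
            return 4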
In some embodiments, a different type of input sensor can be used. The input sensor can be configured to detect any suitable characteristic or property. For example, the input sensor may be an image sensor, a light or optical sensor, a proximity sensor, a magnet, a biometric sensor, a touch sensor, an accelerometer, and so on. In an example embodiment, the input sensor can include one or more strain gauges, a tactile or reed switch, or a capacitive touch sensor. For example, the capacitive touch sensor may include a first electrode disposed within the enclosure 102 adjacent the input device 108 and a second electrode attached to or embedded in the input device 108.
With respect to the example of
As shown in
In the example of
Additionally, in some embodiments the haptic engine 200 provides a haptic output based on a rotational input action. The optical encoder 402 may produce an input device signal when the input device 106 is rotated. The processing unit may receive or be responsive to the input device signal and, in turn, cause a haptic output signal to be transmitted to the haptic engine 200. The haptic output signal may cause the haptic engine 200 to produce a haptic output that can be perceived by a user as haptic feedback indicating the rotational input action has been received by the electronic device.
It should be noted that the positions of the haptic engine 200 and the optical encoder 402 shown in
As discussed earlier, a haptic engine can be configured to operate in two or more modes. For example, in a first mode, the haptic engine may be positioned at a first position to apply a haptic output directly to the interior surface of an input device (e.g., input device 108). In a second mode, the haptic engine can be positioned at a second position to produce a haptic output within the electronic device and/or to apply a haptic output to one or more non-input-device surfaces or regions of the electronic device.
Other embodiments can use different types of biasing mechanisms. For example, one or more magnets and electromagnets can be included in, or attached to, the contact area 602 and the input device 108, respectively. Alternatively, one or more electromagnets may be included in, or attached to, the input device 108. The electromagnet(s) are used to produce a magnetic field that attracts or repels the magnet assembly 300. The electromagnet(s) can be activated to move the magnet assembly 300 to a given position. The one or more electromagnets are deactivated when the magnet assembly 300 is at the given position. A current applied to the coil assembly 304 can then be used to move the magnet assembly 300 and produce a haptic output. In some embodiments, the attachment mechanism may be a mechanical switch that is configured to position the shaft 600 in at least two different positions. For example, the switch may adjust the position of a movable arm that is attached to the shaft 600.
When the shaft 600 is positioned a given distance from the interior surface of the input device 108 (
When a haptic output is to be applied directly to the input device 108, the biasing mechanism adjusts the position of the shaft 600 so that the contact area 602 contacts the interior surface of the input device 108 (see
In this example, the shaft 302 includes a contact area 316 that is attached to the input device 108 and a collar 310 that is disengagably coupled to the frame 306. When a haptic output is to be produced within the electronic device, but not applied or transferred directly to the input device 108, a current is passed through the coil assembly 700 to produce a magnetic field that causes the magnet assembly 300 to move in a direction away from the input device 108. The magnet assembly 300 can be positioned at a first location within the housing 314 (
After the magnet assembly 300 is situated at the first position, another current is passed through the coil assembly 700 to produce a magnetic field that causes the magnet assembly 300 and the frame 306 to move in one direction or in an oscillating manner (in two opposing directions) to produce the haptic output. For example, the magnet assembly 300 can move a length L1 along the shaft 302 when moving in an oscillating manner to produce the haptic output within the enclosure 102.
When a haptic output is to be applied directly to the input device 108, a current is passed through the coil assembly 700 to produce a magnetic field that causes the magnet assembly 300 to move in a direction toward the input device 108. The magnet assembly may be positioned at a second location within the housing 314 (
A coil assembly 304 is formed with a coil 800 that encircles the magnets 300a, 300b. As described earlier, in one embodiment the magnets 300a, 300b move in a direction aligned with the shaft 302 when a haptic output signal is transmitted through the coil 800. The coil 800 can be stationary or move with respect to the magnets 300a, 300b. Additionally, a width of the coil 800 can be greater than, less than, or substantially the same as the width of the magnet assembly 300.
At block 900, a determination is made as to whether a haptic output is to be produced. In particular, the device may determine whether a haptic output is to be produced in response to a user input (received at an input device) or in response to another event. Other events include, for example, notifications, alerts, alarms, and other events that may be signaled to a user. If the determination is negative, the method returns to block 900.
In response to a positive determination at block 900, the process continues at block 902. At block 902, a determination is made as to whether the haptic output is to be applied directly to an input device. If the haptic output will not be applied directly to the input device, the method passes to block 904 where the device produces the haptic output. The device may determine that the haptic output is not to be applied directly to the input device because the haptic output is associated with one or more non-user input events including, for example, a notification, alert, or alarm. In some cases, the haptic output of block 904 corresponds to a second mode in which a general haptic output is delivered to an exterior surface of the electronic device. The exterior surface may include, but is not limited to, an exterior surface of the input device.
If the haptic output will be applied directly to the input device (and/or the momentum of the haptic output transferred directly to the input device), the process continues at block 906. At block 906, the device makes a determination as to whether a haptic engine or haptic device should be adjusted. For example, the mode of the haptic engine or the haptic device may be changed so that a shaft or other element of the haptic engine engages the input device. Additionally or alternatively, one or more characteristics of a haptic output signal that is received by the haptic engine or the haptic device can be adjusted. For example, a frequency, amplitude, and/or a phase of the haptic output signal may be changed. Adjusting one or more of the characteristics of the haptic output signal can adjust the magnitude and/or the type of haptic output. If the haptic engine or haptic device will not be adjusted, the method passes to block 904 where the haptic output is produced.
If the haptic engine or haptic device will be adjusted, the process continues at block 908 where the haptic engine is adjusted and the haptic output is produced. The haptic output of block 908 may correspond to a first mode in which a localized haptic output is delivered to the input device. For example, the localized haptic output may be concentrated or focused on an exterior surface of the input device and used to provide haptic feedback or acknowledgement of a user action received using the input device.
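The decision flow of blocks 900 through 908 can be sketched as follows; the object and method names are assumptions introduced here for illustration and do not correspond to any particular implementation.

    def handle_haptic_event(event, engine):
        # Block 900: is a haptic output to be produced at all?
        if not event.haptic_output_requested:
            return
        # Block 902: is the output to be applied directly to the input device?
        if not event.apply_to_input_device:
            engine.produce_output(event.waveform)   # block 904: general output (second mode)
            return
        # Block 906: does the haptic engine need to be adjusted first?
        if engine.needs_adjustment(event):
            engine.adjust(event)                    # e.g., engage the input device; change
                                                    # frequency, amplitude, and/or phase
            engine.produce_output(event.waveform)   # block 908: localized output (first mode)
        else:
            engine.produce_output(event.waveform)   # block 904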
In some embodiments, the processing unit(s) 1000 and the processing device 202 (
The memory 1002 can store electronic data that can be used by the electronic device 100 and instructions and/or program code that is executed by the processing unit(s) 1000. For example, a memory can store electrical data or content such as, for example, audio and video files, documents and applications, device settings and user preferences, timing and control signals or data for the haptic engine 1014 (or one or more components included therein), data structures or databases, and so on. The memory 1002 can be configured as any type of memory. By way of example only, the memory can be implemented as random access memory, read-only memory, Flash memory, removable memory, or other types of storage elements, or combinations of such devices. The one or more I/O devices 1004 can transmit and/or receive data to and from a user or another electronic device. The I/O device(s) 1004 can include a touch sensing input surface such as a track pad, one or more buttons, one or more microphones or speakers, one or more ports such as a microphone port, and/or a keyboard.
The electronic device 100 may also include one or more sensors 1006 positioned substantially anywhere on the electronic device 100. The sensor(s) 1006 may be configured to sense substantially any type of characteristic, such as, but not limited to, images, pressure, light, touch, force, biometric data, temperature, position, motion, and so on. For example, the sensor(s) 1006 may be an image sensor, a temperature sensor, a light or optical sensor, an atmospheric pressure sensor, a proximity sensor, a humidity sensor, a magnet, a gyroscope, a biometric sensor, an accelerometer, and so on.
The power source 1008 can be implemented with one or more devices capable of providing energy to the electronic device 100. For example, the power source 1008 can be one or more batteries or rechargeable batteries. Additionally or alternatively, the power source 1008 may be a connection cable that connects the electronic device to another power source, such as a wall outlet or another electronic device.
The network communication interface 1010 can facilitate transmission of data to or from other electronic devices. For example, a network communication interface 1010 can transmit electronic signals via a wireless and/or wired network connection. Examples of wireless and wired network connections include, but are not limited to, cellular, Wi-Fi, Bluetooth, infrared, and Ethernet.
The input device 1012 can be any suitable input device that is configured to provide haptic feedback to a user in response to an input action. For example, the input device 1012 can be an input button, a crown, a section of the enclosure, and/or a display.
The haptic engine 1014 can be implemented as any suitable device configured to provide force feedback, vibratory feedback, tactile sensations, and the like. For example, in one embodiment, the haptic engine 1014 can be implemented as an electromagnetic actuator (e.g., linear actuator) configured to provide a punctuated haptic feedback, such as a tap or a knock. Additionally or alternatively, the electromagnetic actuator may be configured to translate in two directions to provide a vibratory haptic feedback.
It should be noted that
The foregoing description, for purposes of explanation, used specific nomenclature to provide a thorough understanding of the described embodiments. However, it will be apparent to one skilled in the art that the specific details are not required in order to practice the described embodiments. Thus, the foregoing descriptions of the specific embodiments described herein are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. It will be apparent to one of ordinary skill in the art that many modifications and variations are possible in view of the above teachings.
This application is a continuation of U.S. Nonprovisional patent application Ser. No. 16/800,723, filed Feb. 25, 2020, which is a continuation of U.S. Nonprovisional patent application Ser. No. 15/366,674, filed Dec. 1, 2016, now U.S. Pat. No. 10,585,480, which is a nonprovisional patent application of and claims the benefit of U.S. Provisional Patent Application No. 62/334,036, filed May 10, 2016, the disclosures of which are hereby incorporated by reference herein in their entirety.
Number | Name | Date | Kind |
---|---|---|---|
5196745 | Trumper et al. | Mar 1993 | A |
5293161 | MacDonald et al. | Mar 1994 | A |
5424756 | Ho et al. | Jun 1995 | A |
5434549 | Hirabayashi et al. | Jul 1995 | A |
5436622 | Gutman et al. | Jul 1995 | A |
5668423 | You et al. | Sep 1997 | A |
5842967 | Kroll | Jan 1998 | A |
5739759 | Nakazawa et al. | Apr 1998 | A |
6084319 | Kamata et al. | Jul 2000 | A |
6342880 | Rosenberg et al. | Jan 2002 | B2 |
6373465 | Jolly et al. | Apr 2002 | B2 |
6388789 | Bernstein | May 2002 | B1 |
6438393 | Surronen | Aug 2002 | B1 |
6445093 | Binnard | Sep 2002 | B1 |
6493612 | Bisset et al. | Dec 2002 | B1 |
6554191 | Yoneya | Apr 2003 | B2 |
6693622 | Shahoian et al. | Feb 2004 | B1 |
6777895 | Shimoda et al. | Aug 2004 | B2 |
6822635 | Shahoian | Nov 2004 | B2 |
6864877 | Braun et al. | Mar 2005 | B2 |
6952203 | Banerjee et al. | Oct 2005 | B2 |
6988414 | Ruhrig et al. | Jan 2006 | B2 |
7068168 | Girshovich et al. | Jun 2006 | B2 |
7080271 | Kardach et al. | Jul 2006 | B2 |
7126254 | Nanataki et al. | Oct 2006 | B2 |
7130664 | Williams | Oct 2006 | B1 |
7196688 | Shena et al. | Mar 2007 | B2 |
7202851 | Cunningham et al. | Apr 2007 | B2 |
7234379 | Claesson et al. | Jun 2007 | B2 |
7253350 | Noro et al. | Aug 2007 | B2 |
7276907 | Kitagawa et al. | Oct 2007 | B2 |
7321180 | Takeuchi et al. | Jan 2008 | B2 |
7323959 | Naka et al. | Jan 2008 | B2 |
7336006 | Watanabe et al. | Feb 2008 | B2 |
7339572 | Schena | Mar 2008 | B2 |
7355305 | Nakamura et al. | Apr 2008 | B2 |
7360446 | Dai et al. | Apr 2008 | B2 |
7370289 | Ebert et al. | May 2008 | B1 |
7385874 | Vuilleumier | Jun 2008 | B2 |
7392066 | Hapamas | Jun 2008 | B2 |
7423631 | Shahoian et al. | Sep 2008 | B2 |
7508382 | Denoue et al. | Mar 2009 | B2 |
7570254 | Suzuki et al. | Aug 2009 | B2 |
7576477 | Koizumi | Aug 2009 | B2 |
7656388 | Schena et al. | Feb 2010 | B2 |
7667371 | Sadler et al. | Feb 2010 | B2 |
7667691 | Boss et al. | Feb 2010 | B2 |
7675414 | Ray | Mar 2010 | B2 |
7710397 | Krah et al. | May 2010 | B2 |
7710399 | Bruneau et al. | May 2010 | B2 |
7741938 | Kramlich | Jun 2010 | B2 |
7755605 | Daniel et al. | Jul 2010 | B2 |
7798982 | Zets et al. | Sep 2010 | B2 |
7825903 | Anastas et al. | Nov 2010 | B2 |
7855657 | Doemens et al. | Dec 2010 | B2 |
7890863 | Grant et al. | Feb 2011 | B2 |
7893922 | Klinghult et al. | Feb 2011 | B2 |
7904210 | Pfau et al. | Mar 2011 | B2 |
7911328 | Luden et al. | Mar 2011 | B2 |
7919945 | Houston et al. | Apr 2011 | B2 |
7952261 | Lipton et al. | May 2011 | B2 |
7952566 | Poupyrev et al. | May 2011 | B2 |
7956770 | Klinghult et al. | Jun 2011 | B2 |
7976230 | Ryynanen et al. | Jul 2011 | B2 |
8002089 | Jasso et al. | Aug 2011 | B2 |
8020266 | Ulm et al. | Sep 2011 | B2 |
8040224 | Hwang | Oct 2011 | B2 |
8053688 | Conzola et al. | Nov 2011 | B2 |
8063892 | Shahoian | Nov 2011 | B2 |
8072418 | Crawford et al. | Dec 2011 | B2 |
8081156 | Ruettiger | Dec 2011 | B2 |
8125453 | Shahoian et al. | Feb 2012 | B2 |
8154537 | Olien et al. | Apr 2012 | B2 |
8174495 | Takashima et al. | May 2012 | B2 |
8174512 | Ramstein et al. | May 2012 | B2 |
8188989 | Levin | May 2012 | B2 |
8169402 | Shahoian et al. | Jun 2012 | B2 |
8217892 | Meadors | Jul 2012 | B2 |
8217910 | Stallings et al. | Jul 2012 | B2 |
8232494 | Purcocks | Jul 2012 | B2 |
8248386 | Harrison | Aug 2012 | B2 |
8253686 | Kyung | Aug 2012 | B2 |
8262480 | Cohen et al. | Sep 2012 | B2 |
8264465 | Grant et al. | Sep 2012 | B2 |
8265292 | Leichter | Sep 2012 | B2 |
8265308 | Gitzinger et al. | Sep 2012 | B2 |
8344834 | Niiyama | Jan 2013 | B2 |
8345025 | Seibert et al. | Jan 2013 | B2 |
8351104 | Zaifrani et al. | Jan 2013 | B2 |
8378797 | Pance et al. | Feb 2013 | B2 |
8378965 | Gregorio et al. | Feb 2013 | B2 |
8384316 | Houston et al. | Feb 2013 | B2 |
8390218 | Houston et al. | Mar 2013 | B2 |
8390572 | Marsden et al. | Mar 2013 | B2 |
8390594 | Modarres et al. | Mar 2013 | B2 |
8400027 | Dong et al. | Mar 2013 | B2 |
8405618 | Colgate et al. | Mar 2013 | B2 |
8421609 | Kim et al. | Apr 2013 | B2 |
8432365 | Kim et al. | Apr 2013 | B2 |
8469806 | Grant et al. | Jun 2013 | B2 |
8471690 | Hennig et al. | Jun 2013 | B2 |
8493177 | Flaherty et al. | Jul 2013 | B2 |
8493189 | Suzuki | Jul 2013 | B2 |
8562489 | Burton | Oct 2013 | B2 |
8576171 | Grant | Nov 2013 | B2 |
8598750 | Park | Dec 2013 | B2 |
8598972 | Cho et al. | Dec 2013 | B2 |
8604670 | Mahameed et al. | Dec 2013 | B2 |
8605141 | Dialameh et al. | Dec 2013 | B2 |
8614431 | Huppi et al. | Dec 2013 | B2 |
8619031 | Hayward | Dec 2013 | B2 |
8624448 | Kaiser et al. | Jan 2014 | B2 |
8628173 | Stephens et al. | Jan 2014 | B2 |
8633916 | Bernstein et al. | Jan 2014 | B2 |
8639485 | Connacher et al. | Jan 2014 | B2 |
8643480 | Maier et al. | Feb 2014 | B2 |
8648829 | Shahoian et al. | Feb 2014 | B2 |
8653785 | Collopy | Feb 2014 | B2 |
8654524 | Pance et al. | Feb 2014 | B2 |
8681130 | Adhikari | Mar 2014 | B2 |
8686952 | Burrough et al. | Apr 2014 | B2 |
8717151 | Forutanpour et al. | May 2014 | B2 |
8730182 | Modarres et al. | May 2014 | B2 |
8749495 | Grant et al. | Jun 2014 | B2 |
8754759 | Fadell et al. | Jun 2014 | B2 |
8760037 | Eshed et al. | Jun 2014 | B2 |
8773247 | Ullrich | Jul 2014 | B2 |
8780074 | Castillo et al. | Jul 2014 | B2 |
8797153 | Vanhelle et al. | Aug 2014 | B2 |
8797295 | Bernstein et al. | Aug 2014 | B2 |
8803670 | Steckel et al. | Aug 2014 | B2 |
8834390 | Couvillon | Sep 2014 | B2 |
8836502 | Culbert et al. | Sep 2014 | B2 |
8836643 | Jolliff et al. | Sep 2014 | B2 |
8867757 | Ooi | Oct 2014 | B1 |
8872448 | Boldyrev et al. | Oct 2014 | B2 |
8878401 | Lee | Nov 2014 | B2 |
8890824 | Guard | Nov 2014 | B2 |
8907661 | Maier et al. | Dec 2014 | B2 |
8976139 | Koga et al. | Mar 2015 | B2 |
8976141 | Myers et al. | Mar 2015 | B2 |
8977376 | Lin et al. | Mar 2015 | B1 |
8981682 | Delson et al. | Mar 2015 | B2 |
8987951 | Park | Mar 2015 | B2 |
9008730 | Kim et al. | Apr 2015 | B2 |
9024738 | Van Schyndel et al. | May 2015 | B2 |
9046947 | Takeda | Jun 2015 | B2 |
9049339 | Muench | Jun 2015 | B2 |
9052785 | Horie | Jun 2015 | B2 |
9054605 | Jung et al. | Jun 2015 | B2 |
9058077 | Lazaridis et al. | Jun 2015 | B2 |
9086727 | Tidemand et al. | Jul 2015 | B2 |
9092056 | Myers et al. | Jul 2015 | B2 |
9094762 | Wong et al. | Jul 2015 | B2 |
9104285 | Colgate et al. | Aug 2015 | B2 |
9116570 | Lee et al. | Aug 2015 | B2 |
9122330 | Bau et al. | Sep 2015 | B2 |
9134796 | Lemmons et al. | Sep 2015 | B2 |
9172669 | Swink et al. | Oct 2015 | B2 |
9182837 | Day | Nov 2015 | B2 |
9218727 | Rothkopf et al. | Dec 2015 | B2 |
9245704 | Maharjan et al. | Jan 2016 | B2 |
9256287 | Shinozaki et al. | Feb 2016 | B2 |
9274601 | Faubert et al. | Mar 2016 | B2 |
9280205 | Rosenberg et al. | Mar 2016 | B2 |
9285905 | Buuck et al. | Mar 2016 | B1 |
9286907 | Yang et al. | Mar 2016 | B2 |
9304587 | Wright et al. | Apr 2016 | B2 |
9319150 | Peeler et al. | Apr 2016 | B2 |
9348414 | Kagayama | May 2016 | B2 |
9348473 | Ando | May 2016 | B2 |
9361018 | Pasquero et al. | Jun 2016 | B2 |
9396629 | Weber et al. | Jul 2016 | B1 |
9430042 | Levin | Aug 2016 | B2 |
9436280 | Tartz et al. | Sep 2016 | B2 |
9442570 | Slonneger | Sep 2016 | B2 |
9448631 | Winter et al. | Sep 2016 | B2 |
9448713 | Cruz-Hernandez et al. | Sep 2016 | B2 |
9449476 | Lynn et al. | Sep 2016 | B2 |
9459734 | Day | Oct 2016 | B2 |
9466783 | Olien et al. | Oct 2016 | B2 |
9489049 | Li | Nov 2016 | B2 |
9496777 | Jung | Nov 2016 | B2 |
9501149 | Burnbaum et al. | Nov 2016 | B2 |
9513704 | Heubel et al. | Dec 2016 | B2 |
9519346 | Lacroix et al. | Dec 2016 | B2 |
9535500 | Pasquero et al. | Jan 2017 | B2 |
9539164 | Sanders et al. | Jan 2017 | B2 |
9542028 | Filiz et al. | Jan 2017 | B2 |
9557830 | Grant | Jan 2017 | B2 |
9557857 | Schediwy | Jan 2017 | B2 |
9563274 | Senanayake | Feb 2017 | B2 |
9564029 | Morrell et al. | Feb 2017 | B2 |
9594429 | Bard et al. | Mar 2017 | B2 |
9600037 | Pance et al. | Mar 2017 | B2 |
9600071 | Rothkopf | Mar 2017 | B2 |
9607491 | Mortimer et al. | Mar 2017 | B1 |
9627163 | Ely et al. | Apr 2017 | B2 |
9632583 | Virtanen et al. | Apr 2017 | B2 |
9639158 | Levesque | May 2017 | B2 |
9666040 | Flaherty et al. | May 2017 | B2 |
9707593 | Berte | Jul 2017 | B2 |
9710061 | Pance et al. | Jul 2017 | B2 |
9727238 | Peh et al. | Aug 2017 | B2 |
9733704 | Cruz-Hernandez et al. | Aug 2017 | B2 |
9762236 | Chen et al. | Sep 2017 | B2 |
9823828 | Zambetti et al. | Nov 2017 | B2 |
9829981 | Ji | Nov 2017 | B1 |
9830782 | Morrell et al. | Nov 2017 | B2 |
9857872 | Terlizzi et al. | Jan 2018 | B2 |
9870053 | Modarres et al. | Jan 2018 | B2 |
9874980 | Brunet et al. | Jan 2018 | B2 |
9875625 | Khoshkava et al. | Jan 2018 | B2 |
9886057 | Bushnell et al. | Feb 2018 | B2 |
9886090 | Silvanto et al. | Feb 2018 | B2 |
9902186 | Whiteman et al. | Feb 2018 | B2 |
9904393 | Frey et al. | Feb 2018 | B2 |
9878239 | Heubel et al. | Mar 2018 | B2 |
9921649 | Grant et al. | Mar 2018 | B2 |
9927887 | Bulea | Mar 2018 | B2 |
9927902 | Burr et al. | Mar 2018 | B2 |
9928950 | Lubinski et al. | Mar 2018 | B2 |
9940013 | Choi et al. | Apr 2018 | B2 |
9971407 | Holenarsipur et al. | May 2018 | B2 |
9977499 | Westerman et al. | May 2018 | B2 |
9990040 | Levesque | Jun 2018 | B2 |
9996199 | Park et al. | Jun 2018 | B2 |
10025399 | Kim et al. | Jul 2018 | B2 |
10032550 | Zhang et al. | Jul 2018 | B1 |
10037660 | Khoshkava et al. | Jul 2018 | B2 |
10061385 | Churikov et al. | Aug 2018 | B2 |
10069392 | Degner et al. | Sep 2018 | B2 |
10078483 | Finnan et al. | Sep 2018 | B2 |
10082873 | Zhang | Sep 2018 | B2 |
10108265 | Harley et al. | Oct 2018 | B2 |
10110986 | Min | Oct 2018 | B1 |
10120446 | Pance et al. | Nov 2018 | B2 |
10120478 | Filiz et al. | Nov 2018 | B2 |
10120484 | Endo et al. | Nov 2018 | B2 |
10122184 | Smadi et al. | Nov 2018 | B2 |
10133351 | Weber et al. | Nov 2018 | B2 |
10139976 | Iuchi et al. | Nov 2018 | B2 |
10146336 | Lee et al. | Dec 2018 | B2 |
10152131 | Grant et al. | Dec 2018 | B2 |
10152182 | Haran et al. | Dec 2018 | B2 |
10203762 | Bradski et al. | Feb 2019 | B2 |
10209821 | Roberts-Hoffman et al. | Feb 2019 | B2 |
10232714 | Wachinger | Mar 2019 | B2 |
10235034 | Jitkoff et al. | Mar 2019 | B2 |
10235849 | Levesque | Mar 2019 | B1 |
10248221 | Pance et al. | Apr 2019 | B2 |
10254840 | Weinraub | Apr 2019 | B2 |
10261585 | Bard et al. | Apr 2019 | B2 |
10275075 | Hwang et al. | Apr 2019 | B2 |
10282014 | Butler et al. | May 2019 | B2 |
10284935 | Miyoshi | May 2019 | B2 |
10289199 | Hoellwarth | May 2019 | B2 |
10346117 | Sylvan et al. | Jul 2019 | B2 |
10372214 | Gleeson et al. | Aug 2019 | B1 |
10373381 | Nuernberger et al. | Aug 2019 | B2 |
10382866 | Min | Aug 2019 | B2 |
10390139 | Biggs | Aug 2019 | B2 |
10394326 | Ono et al. | Aug 2019 | B2 |
10397686 | Forstner et al. | Aug 2019 | B2 |
10430077 | Lee | Oct 2019 | B2 |
10437359 | Wang et al. | Oct 2019 | B1 |
10459226 | Leppanen et al. | Oct 2019 | B2 |
10531191 | Macours | Jan 2020 | B2 |
10556252 | Tsang et al. | Feb 2020 | B2 |
10564721 | Cruz-Hernandez et al. | Feb 2020 | B2 |
10585480 | Bushnell et al. | Mar 2020 | B1 |
10591993 | Lehmann et al. | Mar 2020 | B2 |
10599223 | Amin-Shahidi et al. | Mar 2020 | B1 |
10622538 | Zhang et al. | Apr 2020 | B2 |
10649529 | Nekimken et al. | May 2020 | B1 |
10685626 | Kim et al. | Jun 2020 | B2 |
10691211 | Amin-Shahidi et al. | Jun 2020 | B2 |
10768738 | Zhang et al. | Sep 2020 | B1 |
10768747 | Wang et al. | Sep 2020 | B2 |
10775889 | Lehmann et al. | Sep 2020 | B1 |
10809830 | Kim et al. | Oct 2020 | B2 |
10845220 | Song et al. | Nov 2020 | B2 |
10845878 | Zhao et al. | Nov 2020 | B1 |
10890978 | Bushnell et al. | Jan 2021 | B2 |
10936071 | Pandya et al. | Mar 2021 | B2 |
10942571 | Hendren et al. | Mar 2021 | B2 |
10996007 | Fenner et al. | Mar 2021 | B2 |
11024135 | Ostdiek et al. | Jun 2021 | B1 |
11054932 | Xu et al. | Jul 2021 | B2 |
11188151 | Bushnell et al. | Nov 2021 | B2 |
20030117132 | Klinghult | Jun 2003 | A1 |
20050036603 | Hughes | Feb 2005 | A1 |
20050191604 | Allen | Sep 2005 | A1 |
20050230594 | Sato et al. | Oct 2005 | A1 |
20060017691 | Cruz-Hernandez et al. | Jan 2006 | A1 |
20060209037 | Wang et al. | Sep 2006 | A1 |
20060223547 | Chin et al. | Oct 2006 | A1 |
20060252463 | Liao | Nov 2006 | A1 |
20070106457 | Rosenberg | May 2007 | A1 |
20070152974 | Kim et al. | Jul 2007 | A1 |
20080062145 | Shahoian | Mar 2008 | A1 |
20080062624 | Regen | Mar 2008 | A1 |
20080084384 | Gregorio et al. | Apr 2008 | A1 |
20080111791 | Nikittin | May 2008 | A1 |
20090085879 | Dai et al. | Apr 2009 | A1 |
20090115734 | Fredriksson et al. | May 2009 | A1 |
20090166098 | Sunder | Jul 2009 | A1 |
20090167702 | Nurmi | Jul 2009 | A1 |
20090174672 | Schmidt | Jul 2009 | A1 |
20090207129 | Ullrich et al. | Aug 2009 | A1 |
20090225046 | Kim et al. | Sep 2009 | A1 |
20090243404 | Kim et al. | Oct 2009 | A1 |
20090267892 | Faubert | Oct 2009 | A1 |
20100116629 | Borissov et al. | May 2010 | A1 |
20100225600 | Dai et al. | Sep 2010 | A1 |
20100313425 | Hawes | Dec 2010 | A1 |
20100328229 | Weber et al. | Dec 2010 | A1 |
20110115754 | Cruz-Hernandez | May 2011 | A1 |
20110128239 | Polyakov et al. | Jun 2011 | A1 |
20110132114 | Siotis | Jun 2011 | A1 |
20110169347 | Miyamoto et al. | Jul 2011 | A1 |
20110205038 | Drouin et al. | Aug 2011 | A1 |
20110261021 | Modarres et al. | Oct 2011 | A1 |
20110267181 | Kildal | Nov 2011 | A1 |
20110267294 | Kildal | Nov 2011 | A1 |
20120038469 | Dehmoubed et al. | Feb 2012 | A1 |
20120038471 | Kim et al. | Feb 2012 | A1 |
20120056825 | Ramsay et al. | Mar 2012 | A1 |
20120062491 | Coni et al. | Mar 2012 | A1 |
20120113008 | Makinen et al. | May 2012 | A1 |
20120235942 | Shahoian | Sep 2012 | A1 |
20120249474 | Pratt et al. | Oct 2012 | A1 |
20120327006 | Israr et al. | Dec 2012 | A1 |
20130016042 | Makinen et al. | Jan 2013 | A1 |
20130021296 | Min et al. | Jan 2013 | A1 |
20130043670 | Holmes | Feb 2013 | A1 |
20130044049 | Biggs et al. | Feb 2013 | A1 |
20130076635 | Lin | Mar 2013 | A1 |
20130154996 | Trend et al. | Jun 2013 | A1 |
20130207793 | Weaber et al. | Aug 2013 | A1 |
20140118419 | Wu et al. | May 2014 | A1 |
20140125470 | Rosenberg | May 2014 | A1 |
20140168175 | Mercea et al. | Jun 2014 | A1 |
20150084909 | Worfolk et al. | Mar 2015 | A1 |
20150126070 | Candelore | May 2015 | A1 |
20150182113 | Utter, II | Jul 2015 | A1 |
20150185842 | Picciotto et al. | Jul 2015 | A1 |
20150186609 | Utter, II | Jul 2015 | A1 |
20150234493 | Parivar et al. | Aug 2015 | A1 |
20150293592 | Cheong et al. | Oct 2015 | A1 |
20160063826 | Morrell et al. | Mar 2016 | A1 |
20160098107 | Morrell et al. | Apr 2016 | A1 |
20160171767 | Anderson | Jun 2016 | A1 |
20160253019 | Geaghan | Sep 2016 | A1 |
20160293829 | Maharjan et al. | Oct 2016 | A1 |
20160327911 | Eim et al. | Nov 2016 | A1 |
20160328930 | Weber et al. | Nov 2016 | A1 |
20160334901 | Rihn | Nov 2016 | A1 |
20160379776 | Oakley | Dec 2016 | A1 |
20170024010 | Weinraub | Jan 2017 | A1 |
20170083096 | Rihn | Mar 2017 | A1 |
20170090655 | Zhang et al. | Mar 2017 | A1 |
20170173458 | Billington | Jun 2017 | A1 |
20170249024 | Jackson et al. | Aug 2017 | A1 |
20170336273 | Elangovan et al. | Nov 2017 | A1 |
20170357325 | Yang et al. | Dec 2017 | A1 |
20170364158 | Wen et al. | Dec 2017 | A1 |
20180005496 | Dogiamis et al. | Jan 2018 | A1 |
20180015362 | Terahata | Jan 2018 | A1 |
20180029078 | Park et al. | Feb 2018 | A1 |
20180284894 | Raut et al. | Oct 2018 | A1 |
20180321841 | Lapp | Nov 2018 | A1 |
20180335883 | Choi et al. | Nov 2018 | A1 |
20190073079 | Xu et al. | Mar 2019 | A1 |
20190278232 | Ely et al. | Sep 2019 | A1 |
20190310724 | Yazdandoost et al. | Oct 2019 | A1 |
20200251648 | Fukumoto | Aug 2020 | A1 |
20210176548 | Fenner et al. | Jun 2021 | A1 |
20210325993 | Xu et al. | Oct 2021 | A1 |
20210398403 | Ostdiek et al. | Dec 2021 | A1 |
Number | Date | Country |
---|---|---|
1846179 | Oct 2006 | CN |
101036105 | Sep 2007 | CN |
201044066 | Apr 2008 | CN |
101409164 | Apr 2009 | CN |
101436099 | May 2009 | CN |
101663104 | Mar 2010 | CN |
101872257 | Oct 2010 | CN |
201897778 | Jul 2011 | CN |
201945951 | Aug 2011 | CN |
102349039 | Feb 2012 | CN |
102448555 | May 2012 | CN |
203405773 | Jan 2014 | CN |
203630729 | Jun 2014 | CN |
104679233 | Jun 2015 | CN |
105144052 | Dec 2015 | CN |
106133650 | Nov 2016 | CN |
106354203 | Jan 2017 | CN |
206339935 | Jul 2017 | CN |
107305452 | Oct 2017 | CN |
207115337 | Mar 2018 | CN |
214030 | Mar 1983 | DE |
1686776 | Aug 2006 | EP |
2743798 | Jun 2014 | EP |
3098690 | Nov 2016 | EP |
2004129120 | Apr 2004 | JP |
2004236202 | Aug 2004 | JP |
2010537279 | Dec 2010 | JP |
2010540320 | Dec 2010 | JP |
2012048378 | Mar 2012 | JP |
20050033909 | Apr 2005 | KR |
101016208 | Feb 2011 | KR |
20130137124 | Dec 2013 | KR |
20160017070 | Feb 2016 | KR |
20160131275 | Nov 2016 | KR |
20170107570 | Sep 2017 | KR |
2010035805 | Oct 2010 | TW |
201430623 | Aug 2014 | TW |
WO 2002073587 | Sep 2002 | WO |
WO 2006091494 | Aug 2006 | WO |
WO 2007049253 | May 2007 | WO |
WO 2007114631 | Oct 2007 | WO |
WO 2009038862 | Mar 2009 | WO |
WO 2009156145 | Dec 2009 | WO |
WO 2010129221 | Nov 2010 | WO |
WO 2010129892 | Nov 2010 | WO |
WO 2012173818 | Dec 2012 | WO |
WO 2013169303 | Nov 2013 | WO |
WO 2014066516 | May 2014 | WO |
WO 2014200766 | Dec 2014 | WO |
WO 2016091944 | Jun 2016 | WO |
WO 2016144563 | Sep 2016 | WO |
WO 2019003254 | Jan 2019 | WO |
Entry |
---|
Author Unknown, “3D Printed Mini Haptic Actuator,” Autodesk, Inc., 16 pages, 2016. |
Hasser et al., "Preliminary Evaluation of a Shape-Memory Alloy Tactile Feedback Display," Advances in Robotics, Mechatronics, and Haptic Interfaces, ASME, DSC-Vol. 49, pp. 73-80, 1993. |
Hill et al., "Real-time Estimation of Human Impedance for Haptic Interfaces," Stanford Telerobotics Laboratory, Department of Mechanical Engineering, Stanford University, 6 pages, at least as early as Sep. 30, 2009. |
Lee et al., "Haptic Pen: Tactile Feedback Stylus for Touch Screens," Mitsubishi Electric Research Laboratories, http://www.merl.com, 6 pages, Oct. 2004. |
Stein et al., “A process chain for integrating piezoelectric transducers into aluminum die castings to generate smart lightweight structures,” Results in Physics 7, pp. 2534-2539, 2017. |
“Lofelt at Smart Haptics 2017,” Auto-generated transcript from YouTube video clip, uploaded on Jun. 12, 2018 by user “Lofelt,” Retrieved from Internet: <https://www.youtube.com/watch?v=3w7LTQkS430>, 3 pages. |
“Tutorial: Haptic Feedback Using Music and Audio—Precision Microdrives,” Retrieved from Internet Nov. 13, 2019: https://www.precisionmicrodrives.com/haptic-feedback/tutorial-haptic-feedback-using-music-and-audio/, 9 pages. |
“Feel what you hear: haptic feedback as an accompaniment to mobile music playback,” Retrieved from Internet Nov. 13, 2019: https://dl.acm.org/citation.cfm?id=2019336, 2 pages. |
“Auto Haptic Widget for Android,” Retrieved from Internet Nov. 13, 2019, https://apkpure.com/auto-haptic-widget/com.immersion.android.autohaptic, 3 pages. |
D-BOX Home, Retrieved from Internet Nov. 12, 2019: https://web.archive.org/web/20180922193345/https://www.d-box.com/en, 4 pages. |
Prior Publication Data
Number | Date | Country
---|---|---
20210157411 A1 | May 2021 | US
Provisional Application
Number | Date | Country
---|---|---
62334036 | May 2016 | US
Related U.S. Application Data
Relation | Number | Date | Country
---|---|---|---
Parent | 16800723 | Feb 2020 | US
Child | 17145115 | | US
Parent | 15366674 | Dec 2016 | US
Child | 16800723 | | US