Situationally-aware alerts

Information

  • Patent Grant
  • Patent Number
    10,039,080
  • Date Filed
    Tuesday, August 30, 2016
  • Date Issued
    Tuesday, July 31, 2018
Abstract
An electronic device that provides situationally-aware alerts determines to provide an alert output (such as haptic, audio, visual, and so on) via an output device, determines a movement pattern based on one or more signals from one or more sensors indicating information relating at least to movement of the electronic device, and adjusts the alert output to account for the movement pattern. In some implementations, the electronic device may adjust the alert output by delaying the alert output. In other implementations, the electronic device may adjust the alert output by altering the alert output to be discernible despite the movement pattern based on a cadence of the movement pattern. In still other implementations, the electronic device may determine to provide the alert output in response to receiving an incoming communication and may adjust the alert output differently based on a priority associated with the incoming communication.
Description
FIELD

The described embodiments relate generally to alerts. More particularly, the present embodiments relate to adjusting alerts based on a user's situation.


BACKGROUND

Many electronic devices provide various notifications, alerts, or other output to users. Such notifications may be visual, audio, haptic, and so on. For example, a smart phone that receives a communication such as a call or text or email message may indicate such on a screen, play a tone or other audio, and/or vibrate.


In general, notifications may be configured to be salient, or noticeable, to a user without being overly disturbing to others. For example, a smart phone may present a visual indicator on a display screen as well as play a tone for an incoming call. The tone may help the user notice the incoming call if the user is not currently looking at the display, but may disturb others if the user is in a meeting or another scenario where audio is overly noticeable.


SUMMARY

The present disclosure relates to electronic devices that provide situationally-aware alerts. An electronic device determines to provide alert output (such as a vibration or other haptic output, audio output, visual output, and so on) via an output device, determines a movement pattern based on one or more signals from one or more sensors indicating information relating at least to movement of the electronic device, and adjusts the alert output to account for the movement pattern. In some implementations, the electronic device may adjust the alert output by delaying the alert output. In other implementations, the electronic device may adjust the alert output by altering the alert output to be discernible despite the movement pattern based on a cadence of the movement pattern. In still other implementations, the electronic device may determine to provide the alert output in response to receiving an incoming communication and may prioritize incoming communications by adjusting the alert output differently based on an associated priority.


In various embodiments, an electronic device that provides situationally-aware alerts includes a haptic output device, a sensor operable to produce a signal indicating information relating to movement of the electronic device, and a processing unit connected to the sensor and the haptic output device. The processing unit is configured to determine to provide a haptic output via the haptic output device, determine a movement pattern based on the signal, and adjust the haptic output to account for the movement pattern by delaying the haptic output.


In some examples, the movement pattern indicates changes in elevation and the processing unit delays the haptic output until changes in elevation cease. In various implementations of such examples, the sensor includes a pressure sensor, the processing unit is configured to determine that the movement pattern indicates the changes in elevation based on the pressure sensor, and the processing unit is configured to delay the haptic output until the processing unit determines based on the pressure sensor that the changes in elevation have ceased.


In various examples, the processing unit is configured to determine a first period based on the movement pattern where the electronic device will be less proximate to a user (such as where the user is running and the electronic device is in the user's pocket and moves in the pocket further from the user and closer to the user in the pocket at different portions of the user's stride), determine a second period based on the movement pattern where the electronic device will be more proximate to the user, and delay the haptic output from the first period to the second period. In other examples, the processing unit delays the haptic output for a first period when the movement pattern indicates a first type of movement and delays the haptic output for a second period when the movement pattern indicates a second type of movement.


In numerous examples, the signal includes information indicating a heart rate of a user is elevated and the processing unit delays the haptic output until the heart rate of the user reduces. In various examples, the processing unit estimates a time when the haptic output will be salient despite the movement and delays the haptic output until the time.


In some embodiments, an electronic device that provides situationally-aware alerts includes a haptic output device, a sensor operable to produce a signal indicating information relating to movement of the electronic device, and a processing unit connected to the sensor and the haptic output device. The processing unit is configured to determine to provide a haptic output via the haptic output device, determine a movement pattern based on the signal, and adjust the haptic output to account for the movement pattern by altering the haptic output to be discernible despite the movement pattern based on a cadence of the movement pattern.


In various examples, the processing unit is configured to adjust a pattern of the haptic output to be mismatched with the cadence. In numerous examples, the processing unit is configured to alter the haptic output by time shifting the haptic output to a pause in the cadence.


In some examples, the processing unit is configured to determine to provide the haptic output in response to receiving an incoming communication, adjust the haptic output in a first manner when the incoming communication is associated with a first priority, and adjust the haptic output in a second manner when the incoming communication is associated with a second priority. In various examples, the processing unit is configured to alter the haptic output in a first manner when the movement pattern indicates a first type of movement and in a second manner when the movement pattern indicates a second type of movement. In numerous examples, the processing unit is configured to prompt for an acknowledgement of the adjusted haptic output, determine the acknowledgement has not been received, and provide additional haptic output until the acknowledgement is received.


In numerous embodiments, an electronic device that provides situationally-aware alerts includes a non-transitory storage medium storing instructions; a haptic output device; a sensor operable to produce a signal indicating information about a situation of a user of the electronic device; a communication component operable to receive an incoming communication associated with a priority; and a processing unit connected to the sensor, the communication component, the haptic output device, and the non-transitory storage medium. The processing unit is configured to execute the instructions to determine to provide a haptic output via the haptic output device in response to receiving the incoming communication; determine a movement pattern based on the signal; and adjust the haptic output to account for the movement pattern by delaying the haptic output when the incoming communication is associated with a first priority and by altering the haptic output to be discernible despite the movement pattern based on a cadence of the movement pattern when the incoming communication is associated with a second priority.


In various examples, the electronic device that provides situationally-aware alerts further includes an output device other than the haptic output device, wherein the processing unit is configured to provide an output via the output device in addition to the haptic output. In some implementations of such examples, the output is at least one of visual output or audio output.


In numerous examples, the processing unit is configured to communicate with an additional electronic device and the processing unit signals the additional electronic device to produce output in addition to the haptic output. In various examples, the processing unit is configured to communicate with an additional electronic device and the processing unit evaluates the situation of the user by receiving data indicating a status of the additional electronic device that affects the situation of the user.


In some examples, the first and second priorities are based on at least one of a source of the incoming communication, a priority indicator included in the incoming communication, or a type of the incoming communication. In various examples, the first and second priorities are user assigned.





BRIEF DESCRIPTION OF THE DRAWINGS

The disclosure will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements.



FIG. 1 depicts an example system for providing situationally-aware alert output.



FIG. 2 depicts a block diagram illustrating sample components of the system of FIG. 1 and sample functional relationships among those components.



FIG. 3 is a flow chart illustrating a first example method for providing situationally-aware alert output. This first example method may be performed by the example system of FIGS. 1-2.



FIG. 4 is a flow chart illustrating a second example method for providing situationally-aware alert output. This second example method may be performed by the example system of FIGS. 1-2.



FIG. 5 is a flow chart illustrating a third example method for providing situationally-aware alert output. This third example method may be performed by the example system of FIGS. 1-2.



FIG. 6 is a flow chart illustrating a fourth example method for providing situationally-aware alert output. This fourth example method may be performed by the example system of FIGS. 1-2.



FIG. 7 is a flow chart illustrating a fifth example method for providing situationally-aware alert output. This fifth example method may be performed by the example system of FIGS. 1-2.





DETAILED DESCRIPTION

Reference will now be made in detail to representative embodiments illustrated in the accompanying drawings. It should be understood that the following descriptions are not intended to limit the embodiments to one preferred embodiment. To the contrary, it is intended to cover alternatives, modifications, and equivalents as can be included within the spirit and scope of the described embodiments as defined by the appended claims.


The description that follows includes sample systems, apparatuses, methods, and computer program products that embody various elements of the present disclosure. However, it should be understood that the described disclosure may be practiced in a variety of forms in addition to those described herein.


Notifications and other output provided by an electronic device may be thwarted if they are not salient to a user. The situation a user is in (e.g., activities the user is performing, activities going on around the user, a location where the user is, and so on) may affect the salience of a notification. For example, movement of a user may decrease the salience of a vibration or other notification-related output. By way of another example, a user may be located in a highly distracting environment (high noise level and so on) and/or engaged in other activities that decrease the salience of such a vibration. In yet another example, a user's cognitive state may affect perceived salience. When the user is engaged in a highly demanding cognitive task, when the user's attention is focused away from the electronic device, and so on, the user's absorbed cognitive state may reduce the perceived salience of a vibration or other notification-related output.


Larger actuators or other output components may be used, and/or larger amounts of power may be provided to actuators or other output components, in order to increase the salience of vibrations despite a user's situation. However, these sorts of solutions may still not ensure that a user notices a notification or other output and may not be feasible given space, power, and/or other electronic device constraints.


Further, a user's situation may make a vibration or other notification-related output too noticeable. Sound related to a vibration provided by an electronic device may be salient to people other than the user in a meeting or other situation where sound is particularly noticeable. This may be exacerbated if the electronic device is on a surface, such as a table, that may amplify the vibration. In such a situation, it may be desirable to decrease the salience of the vibration such that it is still noticeable by the user but not others, or to prevent the notification from being annoyingly strong to the user. The efforts discussed above to ensure salience, such as larger actuators or other output components and/or larger amounts of power, may further exacerbate these issues when increased salience is not necessary.


The following disclosure relates to an electronic device that adjusts alert output based on a user's situation in order to increase salience of the alert output when the user's situation merits increased salience. The alert output may be vibrations or other haptic output, visual output, audio output, and so on. Adjusting the alert output may include delaying the alert output, altering one or more parameters of the alert output (such as amplitude of a vibration, frequency of a vibration, and so on), and so on. The electronic device may determine to provide an alert output, evaluate the user's situation based on information from one or more sensors, and increase salience by adjusting the alert output based on the user's situation.


In some embodiments, the alert output may be haptic output and increasing salience may include providing output via an output device other than and/or in addition to the haptic output. For example, the electronic device may provide an audio or visual output instead of and/or in addition to the haptic output if the electronic device evaluates the user's situation to affect salience of the haptic output too adversely.


In various embodiments, increasing salience may include signaling another electronic device to provide the alert output and/or other output rather than and/or in addition to the electronic device. Similarly, the sensor data the electronic device uses to evaluate the user's situation may be received by the electronic device from other electronic devices with which the electronic device communicates.


In a particular embodiment, the electronic device may evaluate data from one or more sensors to determine that the user is moving. The electronic device may evaluate the data to determine a movement pattern and adjust the alert output to account for the movement pattern. In some implementations, the electronic device may adjust the alert output by delaying the alert output based on the movement pattern, such as delaying until the user is no longer moving or the user's activity level declines, delaying to when the electronic device will be more proximate to the user than another time, delaying different time periods based on different types of movement, delaying until a time the electronic device estimates the alert output will be salient despite the movement, and so on. In other implementations, the electronic device may adjust the alert output by altering the alert output to be discernible despite the movement pattern based on a cadence of the movement pattern, such as by mismatching the alert output with a cadence of the movement pattern, altering the alert output in different manners based on different types of movement, and so on.


In still other implementations, the electronic device may adjust the alert output to account for the movement pattern by delaying the alert output in some situations and altering the alert output to be discernible despite the movement pattern based on a cadence of the movement pattern in other situations. For example, the electronic device may utilize priorities to prioritize some alerts over others. An alert output may be associated with a priority such as an urgency priority. The electronic device may delay the alert output if the priority is a first priority and may alter the alert output if the priority is a second priority.


By way of example, the alert output may be provided in response to receiving an incoming communication. In such an example, the electronic device may include a list of contacts organized into different priorities such as very important (VIP) contacts and non-VIP contacts. The electronic device may adjust the alert output in a first way if the source of the incoming communication is a VIP contact and in a second way if the source of the incoming communication is a non-VIP contact. In other implementations of such an example, the priority may otherwise be associated with a source of the communication, a priority indicator included in the incoming communication, a type of the incoming communication, and so on.
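

As a non-limiting illustration of the priority assignment just described, the following minimal Python sketch maps an incoming communication to a first or second priority based on its source, a carried priority indicator, or its type. The contact list, the field names, and the two-level scheme are assumptions chosen for illustration, not details taken from the embodiments.

    from dataclasses import dataclass

    FIRST_PRIORITY, SECOND_PRIORITY = 1, 2  # two levels, per "first"/"second" priority

    @dataclass
    class IncomingCommunication:
        source: str                # e.g., an email address or phone number (hypothetical field)
        urgent_flag: bool = False  # a priority indicator carried in the communication
        kind: str = "text"         # type of communication: "email", "text", "call", ...

    # A user-assigned list of very important (VIP) contacts (hypothetical values).
    VIP_CONTACTS = {"alice@example.com", "+1-555-0100"}

    def priority_of(comm: IncomingCommunication) -> int:
        """Derive a priority from the source, a priority indicator, or the type."""
        if comm.source in VIP_CONTACTS or comm.urgent_flag:
            return FIRST_PRIORITY
        if comm.kind == "call":    # example of type-based prioritization
            return FIRST_PRIORITY
        return SECOND_PRIORITY

    print(priority_of(IncomingCommunication(source="alice@example.com")))     # 1
    print(priority_of(IncomingCommunication(source="stranger@example.com")))  # 2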


In various embodiments, the electronic device may increase salience of the alert output by prompting for an acknowledgement of the alert output. If the acknowledgement is not received, such as after a period of time after providing a prompt, the alert output may be provided again. In some implementations, the alert output may be provided repeatedly until acknowledged.
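

The acknowledgement loop described above might be sketched as follows. This is a schematic Python rendering in which the output and acknowledgement callbacks, the retry interval, and the attempt cap are all illustrative assumptions (a cap is added so the loop cannot run forever).

    import time

    def alert_until_acknowledged(provide_output, acknowledged,
                                 retry_interval_s=30.0, max_attempts=10):
        """Provide an alert, prompt for acknowledgement, and repeat the
        alert at intervals until the acknowledgement is received."""
        for _ in range(max_attempts):
            provide_output()              # e.g., drive the haptic actuator
            time.sleep(retry_interval_s)  # give the user time to respond
            if acknowledged():            # poll for the acknowledgement
                return True
        return False                      # never acknowledged; caller decides what next

    # Demonstration with stub callbacks: acknowledged on the second attempt.
    attempts = []
    ok = alert_until_acknowledged(
        provide_output=lambda: attempts.append("buzz"),
        acknowledged=lambda: len(attempts) >= 2,
        retry_interval_s=0.01)
    print(ok, attempts)  # True ['buzz', 'buzz']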


These and other embodiments are discussed below with reference to FIGS. 1-7. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these Figures is for explanatory purposes only and should not be construed as limiting.



FIG. 1 depicts an example system 100 for providing situationally-aware alert output. The system 100 includes an electronic device 101 that provides situationally-aware alerts. The electronic device 101 may determine (such as in response to receiving one or more incoming communications) to provide alert output (such as vibrations or other haptic output, visual output, audio output, and so on), evaluate a user's 104 situation based on information from one or more sensors, and increase salience by adjusting the alert output based on the user's 104 situation.


Many different aspects of the user's 104 situation may affect salience of the alert output. As such, the electronic device 101 may analyze a variety of different data in evaluating a variety of different aspects of the user's 104 situation. Such aspects may involve ambient noise levels, ambient light levels, the cognitive state of the user 104, motion of the user 104, health data of the user 104, whether or not the user 104 is climbing stairs, whether or not the user 104 is driving, and so on. Such aspects may also involve activities the user is performing on other electronic devices with which the electronic device 101 may communicate, such as a first other electronic device 103 and a second other electronic device 102 (such as typing on a keyboard 105 of the first other electronic device 103, playing music on the second other electronic device 102, and so on). The electronic device 101 may receive signals from one or more different sensors indicating data the electronic device 101 may use in evaluating the user's 104 situation.


In various implementations, such sensors may be components of the electronic device 101. However, such sensors may also be components of one or more other electronic devices with which the electronic device 101 may communicate such as the first other electronic device 103 and the second other electronic device 102.


The electronic device 101 may evaluate data from one or more sensors to determine that the user is moving. The electronic device 101 may evaluate the data to determine a movement pattern and adjust the alert output (such as by delaying the alert output, altering one or more parameters of the alert output, and so on) to account for the movement pattern. In some implementations, the electronic device 101 may delay the alert output based on the movement pattern. In other implementations, the electronic device may alter the alert output to be discernible despite the movement pattern based on a cadence of the movement pattern. In still other implementations, the electronic device 101 may adjust the alert output to account for the movement pattern by delaying the alert output in some situations and altering the alert output to be discernible despite the movement pattern based on a cadence of the movement pattern in other situations.


For example, incoming communications received by the electronic device 101 may be prioritized with respect to other incoming communications. In various situations, incoming communications from some senders may be prioritized over other incoming communications from other senders, incoming communications associated with some applications may be prioritized over incoming communications associated with other applications, incoming communications having certain content may be prioritized over incoming communications having other content, and so on.


By way of example, the electronic device 101 may determine to provide an alert output in response to receiving an incoming communication that is associated with a priority according to a source of the incoming communication. The electronic device 101 may delay the alert output if the priority is a first priority and may alter the alert output and/or provide the alert output if the priority is a second priority. Although this example is described using first and second priorities, it is understood that this is an illustration. In various examples, priority may vary continuously and handling of corresponding alerts may also vary continuously.


In various implementations, the electronic device 101 may include different profiles for providing situationally-aware alert output in different situations. For example, the electronic device 101 may be configured for the user 104 to increase salience differently when the user 104 is working, at home during waking hours, at home during sleeping hours, driving, and so on. For each situation, the different profiles may specify how salience of alert outputs is to be determined, when to increase salience, how to increase salience, and so on. Such profiles may be specified by the user 104, configured by default for the user 104, and so on.
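

One way such per-situation profiles might be represented is sketched below in Python; the profile fields and values are purely illustrative assumptions about what a profile could specify, not a format described by the embodiments.

    from dataclasses import dataclass

    @dataclass
    class AlertProfile:
        name: str            # the situation the profile applies to
        haptic_scale: float  # multiplier applied to vibration amplitude
        allow_audio: bool    # whether audio may accompany or replace haptics
        max_delay_s: float   # longest acceptable deferral of an alert

    # Hypothetical defaults; a user could edit these, per the description above.
    PROFILES = {
        "working":  AlertProfile("working",  haptic_scale=0.8, allow_audio=False, max_delay_s=120.0),
        "home_day": AlertProfile("home_day", haptic_scale=1.0, allow_audio=True,  max_delay_s=30.0),
        "sleeping": AlertProfile("sleeping", haptic_scale=0.3, allow_audio=False, max_delay_s=600.0),
        "driving":  AlertProfile("driving",  haptic_scale=0.0, allow_audio=True,  max_delay_s=0.0),
    }

    print(PROFILES["working"].haptic_scale)  # 0.8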


Although the electronic device 101 is described above as providing the alert output, it is understood that this is an example. In some implementations, the electronic device 101 may signal one or more of the first other electronic device 103 and the second other electronic device 102 based on evaluation of the user's 104 situation to provide alert output and/or other output (such as visual output, audio output, and so on) instead of and/or in addition to the electronic device 101 providing the alert output.


Further, although the electronic device 101 is illustrated as a smart phone, the first other electronic device 103 is illustrated as a laptop computing device, and the second other electronic device 102 is illustrated as a wearable device, it is understood that these are examples. In various implementations, the electronic device 101, the first other electronic device 103, and the second other electronic device 102 may be a variety of different electronic and/or other devices without departing from the scope of the present disclosure.



FIG. 2 depicts a block diagram illustrating sample components of the system 100 of FIG. 1 and sample functional relationships among those components. The electronic device 101 may include one or more processing units 210, one or more sensors 211, one or more haptic output devices 212, one or more non-transitory storage media 213, one or more communication components 214, and so on.


The processing unit 210 may execute instructions stored in the non-transitory storage media 213 to perform a variety of different functions. For example, the processing unit 210 may execute such instructions to receive one or more signals from the one or more sensors 211, communicate with the first other electronic device 103 and/or the second other electronic device 102 via the communication component 214, provide haptic output via the haptic output device 212, and so on. The processing unit 210 may also execute the instructions to perform various methods of providing situationally-aware haptic output. Such methods may involve determining to provide a haptic output, evaluating a user's situation based on information from the one or more sensors 211, and increasing salience by adjusting the haptic output based on the user's situation.


The haptic output devices 212 may be one or more actuators or other vibration producing components. The non-transitory storage media 213 may take the form of, but are not limited to, a magnetic storage medium; optical storage medium; magneto-optical storage medium; read only memory; random access memory; erasable programmable memory; flash memory; and so on. The communication components 214 may be one or more cellular antennas, WiFi antennas, Bluetooth antennas, and so on.


The one or more sensors 211 may be one or more of a variety of different sensors. Such sensors may include, but are not limited to, one or more accelerometers, gyroscopes, global positioning system (GPS) or other navigation system components, communication components (which may provide movement data by tracking WiFi network handoffs, cellular handoffs, and/or other events of various communication networks, with or without other associated information such as GPS data associated with network components), compasses, magnetometers, hall effect sensors, barometric or other pressure sensors, cameras, microphones, image sensors, inertial sensors, health sensors (such as photoplethysmogram sensors that may be used to determine a heart rate of the user and/or other information regarding the body of the user), touch pressure sensors, sensors that monitor a user's cognitive state (such as one or more heart rate sensors, eye movement sensors, galvanic skin response sensors, sensors that monitor use and activity on one or more other devices, and so on), combinations thereof, and so on.


Similarly, the first other electronic device 103 may include one or more processing units 220, one or more sensors 221, one or more haptic output devices 222, one or more non-transitory storage media 223, one or more communication components 224, and so on. Likewise, the second other electronic device 102 may include one or more processing units 215, one or more sensors 216, one or more haptic output devices 217, one or more non-transitory storage media 218, and one or more communication components 219.


Although FIG. 2 is illustrated and described above as including a haptic output device 212 and providing situationally aware haptic output, it is understood that this is an example. In various implementations, other kinds of situationally aware alert output may be provided. Such alert output may include audio output, video output, and so on.



FIG. 3 is a flow chart illustrating a first example method 300 for providing situationally-aware alert output. This first example method 300 may be performed by the example system 100 of FIGS. 1-2.


The flow begins at block 310 where an electronic device operates. The flow then proceeds to block 320 where the electronic device determines whether or not to provide an alert output (such as a vibration or other haptic output, audio output, visual output, and so on). The electronic device may determine to provide an alert output in response to receiving an incoming communication (such as an email, a text message, a social media communication, a telephone call, and so on), in response to triggering of a reminder such as a calendar or other schedule reminder, based on the status of a resource such as a battery power level falling below a threshold level or a change in a connection to a communication network, based on a status change of an executing application such as the completion of a download, and/or any other event for which the electronic device determines to provide a notification or other output to a user. If so, the flow proceeds to block 330. Otherwise, the flow returns to block 310 where the electronic device continues to operate.


At block 330, the electronic device evaluates the user's situation before proceeding to block 340. The electronic device may evaluate data regarding a variety of different aspects of the user's situation from a variety of different sensors included in the electronic device and/or other electronic devices with which the electronic device communicates.


For example, the electronic device may determine an ambient noise level of the user's situation using one or more microphones. By way of another example, the electronic device may determine an illumination level of the user's situation using one or more ambient light sensors or other light detectors.


By way of still another example, the electronic device may analyze data to determine a movement pattern of the user or other movement information using data from one or more accelerometers, gyroscopes, GPS or other navigation system components, communication components (such as by tracking WiFi network handoffs, cellular handoffs, and/or other events of various communication networks with or without other associated information such as GPS data associated with network components), compasses, magnetometers, hall effect sensors, barometric or other pressure sensors, cameras, microphones, image sensors, inertial sensors, health sensors (such as photoplethysmogram sensors that may be used to determine a heart rate of the user and/or other information regarding the body of the user), touch pressure sensors, combinations thereof, and so on. The electronic device may determine a variety of information about the user's movement as part of determining the movement pattern, such as a movement speed, a movement cadence, whether the user is changing elevation, an exertion level of the user, a type of the movement (e.g., jogging, running, walking, climbing stairs, bicycling, driving, riding in a car, and so on), and/or a variety of other different information regarding the pattern of the user's movement.
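

As a rough sketch of how a movement cadence and type might be derived from such sensor data, the Python example below counts threshold crossings of accelerometer magnitude; the threshold, the classification boundaries, and the synthetic signal are illustrative assumptions rather than values from the embodiments.

    import math

    def cadence_hz(samples, sample_rate_hz, threshold=1.2):
        """Estimate cadence from 3-axis accelerometer samples (in g) by
        counting upward crossings of the acceleration-magnitude threshold."""
        mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
        crossings = sum(1 for a, b in zip(mags, mags[1:]) if a < threshold <= b)
        return crossings / (len(samples) / sample_rate_hz)

    def movement_type(cadence):
        """Very coarse type classification from cadence alone."""
        if cadence < 0.5:
            return "stationary"
        return "walking" if cadence < 2.2 else "running"

    # Synthetic 2 Hz stride signal sampled at 50 Hz for 5 seconds.
    rate = 50
    samples = [(0.0, 0.0, 1.0 + 0.5 * math.sin(2 * math.pi * 2.0 * i / rate))
               for i in range(rate * 5)]
    c = cadence_hz(samples, rate)
    print(round(c, 1), movement_type(c))  # 2.0 walking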


By way of still another example, the electronic device may receive a communication from another electronic device indicating that a user of the electronic device is involved in a distracting activity using that other electronic device that may impact salience of the alert output. For example, the other electronic device may be playing audio or video, the user may be typing on a keyboard and/or otherwise entering input on an input device of the other electronic device, and so on. The electronic device may determine a distraction level of the user's situation based on one or more communications from the other electronic device regarding such distracting activities.


At block 340, the electronic device determines whether or not to increase salience of the alert output based on the user's situation (such as by adjusting the alert output, which may include delaying the alert output, altering one or more parameters of the alert output, and so on). The electronic device may determine by evaluating the user's situation that the alert output will be salient as is and that the salience of the alert output should not be increased. Alternatively, the electronic device may determine by evaluating the user's situation that the alert output may not be salient as is (such as where the user's situation is too loud, too distracting, and so on) and that the salience of the alert output should be increased. If so, the flow proceeds to block 360. Otherwise, the flow proceeds to block 350.


At block 350, after the electronic device determines not to increase the salience of the alert output, the electronic device provides the alert output. The flow then returns to block 310 where the electronic device continues to operate.


At block 360, after the electronic device determines to increase the salience of the alert output, the electronic device adjusts the alert output based on the user's situation by delaying or altering the alert output. Such adjustment may include altering the time at which the alert output is provided (such as by delaying a period of time), altering one or more parameters of the alert output (such as providing a different waveform to an actuator, altering an amplitude of a waveform provided to an actuator, altering a phase of a waveform provided to an actuator, increasing power provided to an actuator, and so on), providing other output (such as visual, audio, and so on) instead of and/or in addition to the alert output, providing other alert output (which may have similar or different output characteristics than the alert output) via an electronic device with which the electronic device communicates instead of and/or in addition to alert output the electronic device may provide, and so on.
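

As one concrete, and hypothetical, instance of altering a parameter of the alert output, the sketch below scales a vibration waveform's amplitude and clips it to an assumed actuator range; the waveform and gain values are illustrative.

    def scale_waveform(waveform, gain, limit=1.0):
        """Scale a vibration waveform's amplitude, clipping each sample
        to the actuator's allowed range [-limit, limit]."""
        return [max(-limit, min(limit, s * gain)) for s in waveform]

    base = [0.0, 0.4, 0.8, 0.4, 0.0, -0.4, -0.8, -0.4]  # one vibration cycle
    print(scale_waveform(base, gain=1.5))  # stronger, for increased salience
    print(scale_waveform(base, gain=0.5))  # weaker, for decreased salience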


The flow then proceeds to block 370 where the electronic device provides the adjusted alert output based on the user's situation. The flow then returns to block 310 where the electronic device continues to operate.


For example, the alert output may be a haptic output. The electronic device may analyze data from one or more microphones to determine that the user is in a high noise environment. Such a high noise environment may reduce the possibility that the user will notice the haptic output. In response, the electronic device may increase a vibration amplitude included in the haptic output to increase the salience of the haptic output in the high noise environment. Additionally or alternatively, the electronic device may provide a different type of alert output such as a visual alert (e.g., flash a light emitting diode and so on).


By way of another example, the electronic device may analyze data from accelerometers, motion sensors, communication components, and/or other sensors and determine that the user is driving. The user may not notice haptic output while driving. However, the user's vehicle may be communicably connected to the electronic device and may be capable of providing vibrations or other haptic output via the steering wheel or other portion of the user's vehicle that the user touches while driving. As such, the electronic device may signal the user's vehicle to provide haptic output via the steering wheel or other portion instead of and/or in addition to the electronic device providing the haptic output. Additionally or alternatively, the electronic device may provide and/or signal the user's vehicle to provide another type of alert, such as audio through one or more speakers, visual indications through a display, and so on.


Although the example method 300 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.


For example, the example method 300 is illustrated and described as determining whether or not to increase salience of the alert output based on the user's situation. However, in some implementations, the electronic device may determine whether or not to decrease salience of the alert output based on the user's situation.


By way of example, the alert output may be a haptic output and the electronic device may analyze data from GPS or other navigation sensors and/or other sensors and so on to determine that the user is in a quiet environment such as a meeting or a movie theater. Such a quiet environment may allow the user to notice the haptic output, but may cause the haptic output to be undesirably noticeable to others. In response, the electronic device may decrease a vibration amplitude included in the haptic output to decrease the salience of the haptic output in the quiet environment so that the adjusted haptic output will still be noticeable to the user but will not be undesirably noticeable to others or noticeable to others at all.


Alternatively, rather than altering the haptic output, the electronic device may delay the haptic output. For example, in situations where the electronic device determines that the user is in a movie theater, the electronic device may delay the haptic output until an ambient light sensor detects increased light. This may correspond to a movie being finished, the user leaving the movie theater, and/or other situations where the haptic output may no longer be undesirably noticeable to others.


By way of another example, the electronic device may analyze data from motion sensors and/or other sensors and determine that the electronic device is on a surface that amplifies haptic output, such as a hard-surfaced table top (such as by analyzing that the electronic device is subject to very little motion, among other conditions). Amplification of the haptic output may not make the haptic output less salient to the user, but may make the haptic output unpleasant or undesirably noticeable to others. As such, the electronic device may modify vibration included in the haptic output to modify how the haptic output will be amplified so that the adjusted haptic output will still be noticeable to the user but will not be unpleasant and/or undesirably noticeable to others.


In various examples, the electronic device may increase and/or decrease salience of an output based on how the user's cognitive state affects the user's situation. For example, the electronic device may determine that the user is engaged in a highly demanding cognitive task, that the user's attention is focused away from the electronic device, and so on. Based on that determination, the electronic device may determine to increase the salience of the output (e.g., escalate). Alternatively or additionally, based on the determination, the electronic device may determine to decrease the salience of the output (e.g., de-escalate) or delay the output to avoid distracting the user when the user is involved in a demanding task or has their attention elsewhere.



FIG. 4 is a flow chart illustrating a second example method 400 for providing situationally-aware alert output. This second example method 400 may be performed by the example system 100 of FIGS. 1-2.


The flow begins at block 410 where an electronic device operates. The flow then proceeds to block 420 where the electronic device determines whether or not to provide an alert output (such as a vibration or other haptic output, audio output, visual output, and so on). If so, the flow proceeds to block 430. Otherwise, the flow returns to block 410 where the electronic device continues to operate.


At block 430, the electronic device evaluates ambient noise in the user's situation using data from one or more microphones and/or other sensors. The flow then proceeds to block 440.


At block 440, the electronic device determines whether or not to alter the alert output because of the ambient noise in the user's situation (though in various implementations the electronic device may delay the alert output, such as until the ambient noise changes, rather than alter the alert output). The electronic device may determine to alter the alert output if the ambient noise in the user's situation exceeds a first threshold. If not, the flow proceeds to block 450 where the electronic device provides the alert output before the flow returns to block 410 and the electronic device continues to operate. Otherwise, the flow proceeds to block 460.


At block 460, after the electronic device determines to alter the alert output because of the ambient noise in the user's situation, the electronic device increases the alert output. The flow then proceeds to block 470 where the electronic device provides the increased alert output.


The flow then proceeds to block 480 where the electronic device determines whether or not to provide other output. Such other output may be haptic output, visual output provided via a visual output device, audio output provided via an audio output device, output provided by another electronic device with which the electronic device communicates, and/or any other output. The electronic device may determine to provide the other output if the ambient noise in the user's environment exceeds both the first and a second threshold. If not, the flow returns to block 410 and the electronic device continues to operate. Otherwise, the flow proceeds to block 490.
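

A condensed Python rendering of this two-threshold decision follows; the decibel values are illustrative assumptions, since the embodiments do not specify particular thresholds.

    FIRST_THRESHOLD_DB = 60.0   # above this, increase the alert output
    SECOND_THRESHOLD_DB = 80.0  # above this as well, also provide other output

    def plan_for_ambient_noise(ambient_db):
        """Return (increase_alert, provide_other_output) per blocks 440-490."""
        increase = ambient_db > FIRST_THRESHOLD_DB
        other = ambient_db > SECOND_THRESHOLD_DB  # i.e., exceeds both thresholds
        return increase, other

    for db in (45.0, 70.0, 95.0):
        print(db, plan_for_ambient_noise(db))
    # 45.0 (False, False), 70.0 (True, False), 95.0 (True, True)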


At block 490, after the electronic device determines to provide other output, the electronic device provides the other output. The flow then returns to block 410 and the electronic device continues to operate. Additionally and/or alternatively, the other output may be adjusted based on the user's situation in addition to and/or instead of adjusting the alert output.


Although the example method 400 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.


For example, the example method 400 is illustrated and described as determining whether or not to increase the alert output based on ambient noise in the user's situation. However, in some implementations, the electronic device may determine whether or not to decrease the alert output based on ambient noise in the user's situation.


By way of example, the electronic device may analyze ambient noise in the user's situation and determine that the alert output may be too noticeable because the user's situation has below a threshold amount of ambient noise. In response, the electronic device may decrease the alert output to make it better suited to the user's situation while still allowing it to remain salient.


Further, the example method 400 is illustrated and described as determining to alter the alert output and/or provide other output based on comparison of ambient noise to first and second thresholds. However, it is understood that this is an illustration. In various examples, provided alerts may be varied continuously in response to a continuous scale of ambient noise.



FIG. 5 is a flow chart illustrating a third example method 500 for providing situationally-aware alert output. This third example method 500 may be performed by the example system 100 of FIGS. 1-2.


The flow begins at block 510 where an electronic device operates. The flow then proceeds to block 520 where the electronic device determines whether or not to provide an alert output (such as a vibration or other haptic output, audio output, visual output, and so on). If so, the flow proceeds to block 530. Otherwise, the flow returns to block 510 where the electronic device continues to operate.


At block 530, the electronic device determines whether or not the user is moving. The electronic device may utilize signals from one or more accelerometers, gyroscopes, inertial sensors, communication components, barometric or other pressure sensors, altimeters, magnetometers, and/or other sensors to determine whether or not the user is moving. If not, the flow proceeds to block 540 where the electronic device provides the alert output before the flow returns to block 510 and the electronic device continues to operate. Otherwise, the flow proceeds to block 550.


At block 550, the electronic device determines a movement pattern of the user using the signals from the one or more sensors. The electronic device may determine a variety of different data about the user's movement pattern. The movement pattern may include a cadence of the user's movement pattern, a heart rate or other health data of the user related to the movement pattern, whether or not the user is changing elevation (such as ascending and/or descending, the rate of change, and so on), a speed of the user's movement pattern, and/or any other such information about the pattern of the user's movement.


The flow then proceeds to block 560 where the electronic device adjusts the alert output based on the user's movement pattern by delaying or altering the alert output. In some implementations, adjusting the alert output may include delaying the alert output. The alert output may be delayed until the movement stops or the electronic device estimates the movement will stop, until a user who has been determined (such as using a pressure sensor) to be changing elevation (such as walking up stairs or a ramp, riding an escalator or an elevator, and so on) ceases changing elevation or the electronic device estimates the user will stop changing elevation, until the electronic device estimates the alert output will be salient despite the movement, until a user's heart rate or other health data of the user related to the movement reduces or otherwise changes, for a specific time interval (such as thirty seconds), and so on.
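

A generic delay-until-condition helper captures most of these variants. The Python sketch below defers an alert until a predicate holds (here, a stubbed "elevation change has ceased" check on barometric pressure), with an assumed timeout so the alert cannot be deferred indefinitely; the pressure values and tolerance are illustrative.

    import itertools, time

    def delay_until(condition, poll_s=1.0, timeout_s=60.0):
        """Defer an alert until condition() holds (movement stopped, elevation
        change ceased, heart rate reduced, ...) or the timeout elapses."""
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            if condition():
                return True
            time.sleep(poll_s)
        return False

    # Stubbed pressure readings: descending stairs, then level ground.
    readings = itertools.chain([1000.0, 1000.4, 1000.8], itertools.repeat(1001.0))
    last = [next(readings)]

    def elevation_stable(eps=0.05):
        cur = next(readings)
        stable = abs(cur - last[0]) < eps  # barometric pressure has settled
        last[0] = cur
        return stable

    print(delay_until(elevation_stable, poll_s=0.01, timeout_s=1.0))  # True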


In implementations where adjusting the alert output includes delaying the alert output by a period of time, the electronic device may delay for different periods of time based on a variety of factors. For example, the electronic device may determine based on the movement pattern that the electronic device will be less proximate to a user after a first period of time (such as five seconds) and more proximate to the user after a second period of time (such as ten seconds), such as where the electronic device is located in the user's pocket and thus moves within the pocket closer to and further from the user as part of the movement pattern. In such an example, the electronic device may delay the alert output by the second period of time.
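

Treating the stride as periodic, the delay to the more proximate period can be computed from phases within the stride, as in the short sketch below; the phase values are illustrative assumptions.

    def delay_to_proximate_phase(stride_period_s, current_phase, proximate_phase):
        """Compute how long to delay so the alert fires at the phase of the
        stride where the device is closest to the user (phases in [0, 1))."""
        phase_ahead = (proximate_phase - current_phase) % 1.0
        return phase_ahead * stride_period_s

    # Device at phase 0.2 of a 1.0 s stride, closest to the user at phase 0.7:
    print(delay_to_proximate_phase(1.0, current_phase=0.2, proximate_phase=0.7))  # 0.5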


By way of a second example, the electronic device may determine a type of motion based on the movement pattern, such as running motion, walking motion, stair climbing motion, dancing motion, driving motion, and so on. The processing unit may delay the alert output for different periods based on the type of motion. In some examples, the processing unit may delay the alert output by a first period (such as twenty seconds) when the movement pattern indicates a first type of motion (such as walking motion) and by a second period (such as forty seconds) when the movement pattern indicates a second type of motion (such as running motion).


In various implementations, the electronic device may estimate a time when the alert output will be salient despite the movement, such as where the movement pattern indicates the movement will pause. In such an implementation, the electronic device may delay until that time.


In other implementations, adjusting the alert output may include altering the alert output to be discernible despite the movement pattern based on a cadence of the movement pattern. In such implementations, the electronic device may determine a cadence of the movement pattern and alter the alert output based thereupon. A cadence of a movement pattern may involve the rhythm of body parts, such as legs, involved in the motion, the rate at which they move, and so on.


For example, the electronic device may alter a pattern of the alert output (such as the waveform of haptic output) to be mismatched with the cadence. As the altered alert output is mismatched to the cadence of the movement pattern, the altered alert output may be more salient despite the movement.


By way of another example, the cadence of the movement pattern may involve pauses in motion. The electronic device may alter the alert output by time shifting the alert output to such a pause in the cadence.
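

The two alterations just described, mismatching the output pattern with the cadence and time-shifting the output to a pause, might look like the following in schematic Python; the candidate pulse rates, the harmonic tolerance, and the pause intervals are illustrative assumptions.

    def mismatched_pulse_hz(cadence_hz, candidates=(3.0, 4.0, 5.0, 7.0)):
        """Pick a haptic pulse rate that is not near the movement cadence or
        one of its harmonics, so the vibration stands out from the motion."""
        def near_harmonic(rate):
            ratio = rate / cadence_hz
            return abs(ratio - round(ratio)) < 0.15
        for rate in candidates:
            if not near_harmonic(rate):
                return rate
        return candidates[-1] + cadence_hz / 2  # fall back between harmonics

    def shift_to_pause(alert_time_s, pauses):
        """Time-shift an alert to the start of the next pause in the cadence;
        pauses is a list of (start_s, end_s) intervals."""
        upcoming = [start for start, _ in pauses if start >= alert_time_s]
        return min(upcoming) if upcoming else alert_time_s

    print(mismatched_pulse_hz(2.0))                       # 3.0 (1.5x the cadence)
    print(shift_to_pause(1.0, [(0.4, 0.6), (2.3, 2.9)]))  # 2.3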


In numerous examples, the processing unit may alter the alert output in different manners based on the type of motion. In some examples, the processing unit may alter the alert output in a first manner when the movement pattern indicates a first type of motion (such as driving motion) and in a second manner when the movement pattern indicates a second type of motion (such as flying motion). Although these examples are described as altering alert output in a first manner for a first type of motion and in a second manner for a second type of motion, it is understood that this is an illustration. In various examples, alert output may be continuously varied based on a continuous scale of motion.


In various examples, the alert output may be provided in response to an incoming communication such as an email, text message, phone call, and so on. The incoming communication may have an associated priority. Such a priority may be based on a source of the incoming communication (such as a first priority for communications from very important (VIP) contacts compared to a second priority for other contacts), a priority indicator included in the incoming communication (such as an urgent flag indicating a first priority or a normal flag indicating a second priority), or a type of the communication (such as a first priority for email communications and a second priority for text message communications). The priority may be user assigned. The electronic device may adjust the alert output differently based on the associated priority.


For example, the electronic device may delay the alert output if the associated priority is a first priority and alter the alert output based on a cadence of the movement if the associated priority is a second priority. By way of another example, the electronic device may delay the alert output a first period if the associated priority is a first priority and delay the alert output a second period if the associated priority is a second priority. By way of still another example, the electronic device may alter the alert output based on a cadence of the movement in a first manner if the associated priority is a first priority and alter the alert output based on a cadence of the movement in a second manner if the associated priority is a second priority. Although this example is described as delaying a first period for a first priority and a second period for a second priority, it is understood that this is an illustration. In various examples, alert output may be delayed on a continuous scale for a continuous priority scale.
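

The first of these examples reduces to a small dispatch on priority, sketched below; the twenty-second delay and the 1.5x-cadence alteration are illustrative assumptions standing in for whatever concrete delay and alteration an implementation chooses.

    def plan_adjustment(priority, cadence_hz):
        """First priority -> delay the alert; second priority -> alter it
        against the movement cadence instead."""
        if priority == 1:
            return ("delay", 20.0)  # defer the alert twenty seconds
        # Pulse at 1.5x the cadence so the pattern never lines up with the
        # user's stride for long (a simple mismatch heuristic).
        return ("alter", cadence_hz * 1.5)

    print(plan_adjustment(1, 2.0))  # ('delay', 20.0)
    print(plan_adjustment(2, 2.0))  # ('alter', 3.0)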


After the electronic device adjusts the alert output based on the user's movement pattern, the flow proceeds to block 570 where the electronic device provides the adjusted alert output. The flow then proceeds to block 580.


At block 580, the electronic device determines whether or not the adjusted alert output has been acknowledged. The electronic device may prompt for acknowledgement when the adjusted alert output is provided so that the electronic device can ensure that the provided output was salient to the user. If so, the flow may return to block 510 where the electronic device continues to operate.


Otherwise, the flow may return to block 570 where the adjusted alert output is again provided. The electronic device may continue providing the adjusted alert output periodically, at intervals, and/or otherwise repeatedly until the provided output is acknowledged.


Although the example method 500 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.


For example, the example method 500 is illustrated and described as the electronic device altering the alert output if the electronic device is moving. However, in some implementations, the electronic device may determine that the electronic device is moving and further determine whether or not the motion will affect salience of the alert output. In such an example, the electronic device may alter the alert output if the motion will affect salience and not alter the alert output if the motion will not affect salience.


By way of another example, the example method 500 is illustrated and described as the electronic device providing the altered alert output if the electronic device is moving. However, in some implementations, the electronic device may determine that another electronic device with which it communicates is not moving or is moving in a way that will not affect salience. In such implementations, the electronic device may adjust the alert output by signaling the other electronic device to provide the alert output. For example, a user's smart phone may be moving significantly while the user is jogging, but the user's wearable device may not be; the smart phone may then signal the wearable device to provide the alert output.



FIG. 6 is a flow chart illustrating a fourth example method 600 for providing situationally-aware alert output. This fourth example method 600 may be performed by the example system 100 of FIGS. 1-2.


The flow begins at block 610 where an electronic device operates. The flow then proceeds to block 620 where the electronic device determines whether or not an incoming communication is received. If so, the flow proceeds to block 630. Otherwise, the flow returns to block 610 where the electronic device continues to operate.


At block 630, the electronic device determines whether or not the user is moving. If not, the flow proceeds to block 660 where the electronic device provides alert output (such as a vibration or other haptic output, audio output, visual output, and so on) before the flow returns to block 610 and the electronic device continues to operate. Otherwise, the flow proceeds to block 640.


At block 640, the electronic device determines a movement pattern of the user using the signals from the one or more sensors. Next, the flow proceeds to block 650 where the electronic device determines whether the movement pattern is a first type of movement pattern (such as walking) or a second type of movement pattern (such as running).


If the movement pattern is the first type of movement pattern, the flow proceeds to block 670 where the electronic device delays the alert output. The flow then proceeds after the delay to block 660 where the electronic device provides the alert output.


If the movement pattern is the second type of movement pattern, the flow proceeds to block 680 where the electronic device alters the alert output to be discernible despite the movement based on a cadence of the movement pattern. Next, the flow proceeds to block 690 where the electronic device provides the altered alert output. The flow then returns to block 610 where the electronic device continues to operate.
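

Condensing blocks 620 through 690, the method reduces to the small dispatch sketched below; the movement-type labels, the delay period, and the alteration rule are illustrative assumptions.

    def method_600(communication_received, moving, movement_type, cadence_hz):
        """Schematic rendering of FIG. 6: no communication -> nothing; not
        moving -> provide as-is; first movement type (walking) -> delay;
        second type (running) -> alter to be discernible against the cadence."""
        if not communication_received:
            return ("none",)
        if not moving:
            return ("provide",)
        if movement_type == "walking":
            return ("delay_then_provide", 20.0)       # block 670, then 660
        return ("provide_altered", cadence_hz * 1.5)  # blocks 680 and 690

    print(method_600(True, True, "walking", 1.8))  # ('delay_then_provide', 20.0)
    print(method_600(True, True, "running", 2.8))  # ('provide_altered', 4.2)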


Although the example method 600 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.


For example, the example method 600 is illustrated and described as adjusting the alert output to account for the movement pattern in a first way when the movement pattern is a first type of movement and in a second way when the movement pattern is a second type of movement. However, it is understood that this is an example. In various implementations, the electronic device may adjust the alert output in a variety of different ways based on a variety of different types of movement and/or based on other factors without departing from the scope of the present disclosure. Alternatively, the alert output may be provided to the user without adjustment.


Further, the example method 600 is illustrated and described as handling the alert output differently based on first or second types of movement patterns. However, it is understood that this is an illustration. In various examples, alert output may be varied continuously in response to a continuous scale of movement patterns.
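
A minimal sketch of such continuous variation, assuming a normalized movement-energy input and an illustrative linear mapping with a salience floor:

```swift
// Scale alert gain smoothly with movement energy (0 = still, 1 = vigorous).
// The 0.6 floor and the linear mapping are illustrative assumptions.
func alertGain(forMovementEnergy energy: Double) -> Double {
    let clamped = max(0.0, min(1.0, energy))
    return 0.6 + 0.4 * clamped   // never drops below a salient floor
}
```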



FIG. 7 is a flow chart illustrating a fifth example method 700 for providing situationally-aware alert output. This fifth example method 700 may be performed by the example system 100 of FIGS. 1-2.


The flow begins at block 710 where an electronic device operates. The flow then proceeds to block 720 where the electronic device determines whether or not an incoming notification or communication is received. If so, the flow proceeds to block 730. Otherwise, the flow returns to block 710 where the electronic device continues to operate.


At block 730, the electronic device determines whether or not the user is moving. If not, the flow proceeds to block 760 where the electronic device provides alert output (such as a vibration or other haptic output, audio output, visual output, and so on) before the flow returns to block 710 and the electronic device continues to operate. Otherwise, the flow proceeds to block 740.


At block 740, the electronic device determines a movement pattern of the user using the signals from the one or more sensors. Next, the flow proceeds to block 750 where the electronic device determines whether the incoming notification or communication is associated with a first priority or a second priority.


If the incoming notification or communication is associated with a first priority, the flow proceeds to block 770 where the electronic device delays the alert output. After the delay, the flow proceeds to block 760 where the electronic device provides the alert output.


If the incoming notification or communication is associated with a second priority, the flow proceeds to block 780 where the electronic device alters the alert output to be discernible despite the movement based on a cadence of the movement pattern. Next, the flow proceeds to block 790 where the electronic device provides the altered alert output. The flow then returns to block 710 where the electronic device continues to operate.
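
A minimal sketch of the dispatch at blocks 750-790 follows, assuming for illustration that the first priority maps to a deferrable alert and the second priority to an immediately altered one, as in the flow above; the `Priority` type and the handler closures are hypothetical:

```swift
enum Priority { case first, second }

// Dispatch on the priority associated with the incoming notification or communication.
func handleIncoming(priority: Priority,
                    userMoving: Bool,
                    provide: () -> Void,
                    delayThenProvide: () -> Void,
                    provideAltered: () -> Void) {
    guard userMoving else {
        provide()                      // block 760: user is still; output unchanged
        return
    }
    switch priority {
    case .first:
        delayThenProvide()             // blocks 770 -> 760: defer, then provide
    case .second:
        provideAltered()               // blocks 780 -> 790: alter for discernibility
    }
}
```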


Although the example method 700 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.


For example, the example method 700 is illustrated and described as adjusting the alert output in a first way when the associated priority is a first priority and a second way when the associated priority is a second priority. However, it is understood that this is an example. In various implementations, the electronic device may adjust the alert output in a variety of different ways based on a variety of different associated priorities and/or based on other factors without departing from the scope of the present disclosure. By way of illustration, alert output may be continuously adjusted based on an associated continuous priority scale. Alternatively, the alert output may be provided to the user without adjustment.


Although the example methods 300-700 are illustrated and described separately, various operations described in the context of one or more of the example methods 300-700 may be used in one or more of the other example methods 300-700. For example, in some implementations, the example method 700 may include the operation of providing other output described at 490 of the example method 400. By way of another example, in various implementations, the example method 700 may include the operation of determining whether or not alert output was acknowledged described at 580 of the example method 500.


Although the above describes adjusting, delaying, and/or otherwise handling alert output for individual alerts, it is understood that these are examples. In various implementations, output for alerts may be batched in various ways. For example, alerts associated with received high-priority communications may be individually output, whereas alerts associated with received low-priority communications may be delayed and a single alert output for a group of the low-priority communications. In some implementations of such an example, one or more rules may be applied (such as a user-specified rule, a default rule, and so on) specifying how such batching is handled. By way of illustration, a rule may specify that a batch notification is provided no more than once per hour, and alerts corresponding to received low-priority communications may be batched according to this rule. This batching may reduce the possibility of over-frequent alerts. Users may learn or train themselves to ignore over-frequent alerts; thus, reducing the number of alerts may increase the salience of those that are provided.
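
A minimal sketch of such rule-based batching, assuming a hypothetical `AlertBatcher` and the once-per-hour rule from the illustration above:

```swift
import Foundation

// High-priority alerts are output individually; low-priority alerts accumulate
// until at least an hour has passed since the last batch.
final class AlertBatcher {
    private var pending: [String] = []
    private var lastBatch: Date = .distantPast
    private let minInterval: TimeInterval = 3600   // at most one batch per hour

    func receive(_ alert: String, highPriority: Bool,
                 now: Date = Date(), emit: (String) -> Void) {
        if highPriority {
            emit(alert)                            // individual, immediate output
            return
        }
        pending.append(alert)                      // hold low-priority alerts
        if now.timeIntervalSince(lastBatch) >= minInterval {
            emit("\(pending.count) low-priority notifications")  // single grouped alert
            pending.removeAll()
            lastBatch = now
        }
    }
}
```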


As described above and illustrated in the accompanying figures, the present disclosure relates to an electronic device that provides situationally-aware alerts, adjusting alert output based on a user's situation in order to increase the salience of the alert output when the situation merits it. The electronic device may determine to provide an alert output, evaluate the user's situation based on information from one or more sensors, and increase salience by adjusting the alert output based on that situation.


In the present disclosure, the methods disclosed may be implemented as sets of instructions or software readable by a device. Further, it is understood that the specific order or hierarchy of steps in the methods disclosed are examples of sample approaches. In other embodiments, the specific order or hierarchy of steps in the method can be rearranged while remaining within the disclosed subject matter. The accompanying method claims present elements of the various steps in a sample order, and are not necessarily meant to be limited to the specific order or hierarchy presented.


The described disclosure may be provided as a computer program product, or software, that may include a non-transitory machine-readable medium having stored thereon instructions, which may be used to program a computer system (or other electronic devices) to perform a process according to the present disclosure. A non-transitory machine-readable medium includes any mechanism for storing information in a form (e.g., software, processing application) readable by a machine (e.g., a computer). The non-transitory machine-readable medium may take the form of, but is not limited to, a magnetic storage medium (e.g., floppy diskette, video cassette, and so on); optical storage medium (e.g., CD-ROM); magneto-optical storage medium; read only memory (ROM); random access memory (RAM); erasable programmable memory (e.g., EPROM and EEPROM); flash memory; and so on.


The foregoing description, for purposes of explanation, used specific nomenclature to provide a thorough understanding of the described embodiments. However, it will be apparent to one skilled in the art that the specific details are not required in order to practice the described embodiments. Thus, the foregoing descriptions of the specific embodiments described herein are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. It will be apparent to one of ordinary skill in the art that many modifications and variations are possible in view of the above teachings.

Claims
  • 1. An electronic device that provides situationally-aware alerts, comprising: a haptic output device; a sensor operable to produce a signal indicating information relating to movement of the electronic device; and a processing unit connected to the sensor and the haptic output device that is configured to: determine to provide a haptic output via the haptic output device; determine a movement pattern using the signal; adjust the haptic output to account for the movement pattern by altering the haptic output to be discernible despite the movement pattern based on a cadence of the movement pattern; determine to provide the haptic output in response to receiving an incoming communication; adjust the haptic output in a first manner when the incoming communication is associated with a first priority; and adjust the haptic output in a second manner when the incoming communication is associated with a second priority.
  • 2. The electronic device of claim 1, wherein the processing unit is configured to adjust a pattern of the haptic output to be mismatched with the cadence.
  • 3. The electronic device of claim 1, wherein the processing unit is configured to alter the haptic output by time shifting the haptic output to a pause in the cadence.
  • 4. The electronic device of claim 1, wherein the processing unit is configured to escalate the adjusted haptic output if a response from a user to the adjusted haptic output is not received.
  • 5. The electronic device of claim 1, wherein the processing unit is configured to alter the haptic output: in a first manner when the movement pattern indicates a first type of movement; and in a second manner when the movement pattern indicates a second type of movement.
  • 6. The electronic device of claim 1, wherein the processing unit is configured to: prompt for an acknowledgement of the adjusted haptic output; determine the acknowledgement has not been received; and provide additional haptic output until the acknowledgement is received.
  • 7. The electronic device of claim 1, further comprising: determining the adjusted haptic output was not salient; and escalating the adjusted haptic output.
  • 8. An electronic device that provides situationally-aware alerts, comprising: a non-transitory storage medium storing instructions; a haptic output device; a sensor operable to produce a signal indicating information about a situation of a user of the electronic device; a communication component operable to receive an incoming communication associated with a priority; and a processing unit connected to the sensor, the communication component, the haptic output device, and the non-transitory storage medium that is configured to execute the instructions to: determine to provide a haptic output via the haptic output device in response to receiving the incoming communication; determine a movement pattern using the signal; and adjust the haptic output to account for the movement pattern: by delaying the haptic output when the incoming communication is associated with a first priority; and by altering the haptic output to be discernible despite the movement pattern based on a cadence of the movement pattern when the incoming communication is associated with a second priority.
  • 9. The electronic device of claim 8, further including an output device other than the haptic output device, wherein the processing unit is configured to provide an output via the output device in addition to the haptic output.
  • 10. The electronic device of claim 9, wherein the output is at least one of visual output or audio output.
  • 11. The electronic device of claim 8, wherein the processing unit is configured to communicate with an additional electronic device and the processing unit signals the additional electronic device to produce output in addition to the haptic output.
  • 12. The electronic device of claim 8, wherein the processing unit is configured to communicate with an additional electronic device and the processing unit evaluates the situation of the user by receiving data indicating a status of the additional electronic device that affects the situation of the user.
  • 13. The electronic device of claim 8, wherein the first and second priorities are based on at least one of: a source of the incoming communication; a priority indicator included in the incoming communication; or a type of the incoming communication.
  • 14. The electronic device of claim 8, wherein the first and second priorities are user assigned.
  • 15. The electronic device of claim 8, further comprising: providing the adjusted haptic output; determining a user did not respond to the adjusted haptic output; and providing an additional haptic output that is stronger than the adjusted haptic output.
  • 16. An electronic device that provides situationally-aware alerts, comprising: a haptic output device; a sensor operable to produce a signal indicating information relating to movement of the electronic device; and a processing unit connected to the sensor and the haptic output device that is configured to: determine to provide a haptic output via the haptic output device; determine a movement pattern using the signal; adjust the haptic output to account for the movement pattern by altering the haptic output to be discernible despite the movement pattern at least based on a cadence of the movement pattern; determine that a user's attention is focused away from the electronic device when the adjusted haptic output is provided; and escalate the adjusted haptic output in response to the determination that the user's attention is focused away from the electronic device.
  • 17. The electronic device of claim 16, wherein: the movement pattern indicates changes in elevation; and the processing unit adjusts the haptic output by delaying the haptic output until the changes in elevation cease.
  • 18. The electronic device of claim 16, wherein the processing unit is configured to: determine that the user's attention is focused away from the electronic device when the adjusted haptic output is provided by determining the user is operating an additional electronic device; and escalate the adjusted haptic output in response to determining that the user is operating the additional electronic device.
  • 19. The electronic device of claim 16, wherein: the signal includes information indicating a heart rate of a user is elevated; and the processing unit adjusts the haptic output by delaying the haptic output until the heart rate of the user decreases.
  • 20. The electronic device of claim 16, wherein the processing unit is configured to: determine that the user's attention is focused away from the electronic device when the adjusted haptic output is provided by determining that a response from the user to the adjusted haptic output is not received; and escalate the adjusted haptic output in response to determining that the response from the user is not received.
CROSS-REFERENCE TO RELATED APPLICATION

This application claims the benefit under 35 U.S.C. § 119(e) of U.S. Provisional Patent Application No. 62/303,964, filed on Mar. 4, 2016, and entitled “Situationally-Aware Alerts,” the contents of which are incorporated by reference as if fully disclosed herein.

Provisional Applications (1)
Number Date Country
62/303,964 Mar. 4, 2016 US