Situationally-aware alerts

Information

  • Patent Grant
  • Patent Number
    10,609,677
  • Date Filed
    Monday, January 28, 2019
  • Date Issued
    Tuesday, March 31, 2020
Abstract
An electronic device that provides situationally-aware alerts determines to provide an alert output (such as haptic, audio, visual, and so on) via an output device, determines a movement pattern based on one or more signals from one or more sensors indicating information relating at least to movement of the electronic device, and adjusts the alert output to account for the movement pattern. In some implementations, the electronic device may adjust the alert output by delaying the alert output. In other implementations, the electronic device may adjust the alert output by altering the alert output to be discernible despite the movement pattern based on a cadence of the movement pattern. In still other implementations, the electronic device may determine to provide the alert output in response to receiving an incoming communication and may adjust the alert output differently based on a priority associated with the incoming communication.
Description
FIELD

The described embodiments relate generally to alerts. More particularly, the present embodiments relate to adjusting alerts based on a user's situation.


BACKGROUND

Many electronic devices provide various notifications, alerts, or other output to users. Such notifications may be visual, audio, haptic, and so on. For example, a smart phone that receives a communication such as a call or text or email message may indicate such on a screen, play a tone or other audio, and/or vibrate.


In general, notifications may be configured to be salient, or noticeable, to a user without being overly disturbing to others. For example, a smart phone may present a visual indicator on a display screen as well as playing a tone for an incoming call. The tone may assist the user in noticing the incoming call if the user is not currently looking at the display, but may be disturbing to others if the user is in the context of a meeting or other scenario where audio is overly noticeable.


SUMMARY

The present disclosure relates to electronic devices that provide situationally-aware alerts. An electronic device determines to provide alert output (such as a vibration or other haptic output, audio output, visual output, and so on) via an output device, determines a movement pattern based on one or more signals from one or more sensors indicating information relating at least to movement of the electronic device, and adjusts the alert output to account for the movement pattern. In some implementations, the electronic device may adjust the alert output by delaying the alert output. In other implementations, the electronic device may adjust the alert output by altering the alert output to be discernible despite the movement pattern based on a cadence of the movement pattern. In still other implementations, the electronic device may determine to provide the alert output in response to receiving an incoming communication and may prioritize incoming communications by adjusting the alert output differently based on an associated priority.


In various embodiments, an electronic device that provides situationally-aware alerts includes a haptic output device, a sensor operable to produce a signal indicating information relating to movement of the electronic device, and a processing unit connected to the sensor and the haptic output device. The processing unit is configured to determine to provide a haptic output via the haptic output device, determine a movement pattern based on the signal, and adjust the haptic output to account for the movement pattern by delaying the haptic output.


In some examples, the movement pattern indicates changes in elevation and the processing unit delays the haptic output until changes in elevation cease. In various implementations of such examples, the sensor includes a pressure sensor, the processing unit is configured to determine that the movement pattern indicates the changes in elevation based on the pressure sensor, and the processing unit is configured to delay the haptic output until the processing unit determines based on the pressure sensor that the changes in elevation have ceased.
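For illustration only, the following Python sketch shows one way such a pressure-based delay might work. The sensor interface (a `read_pressure_hpa` callable), the stability threshold, the sampling interval, and the timeout fallback are assumptions added here for the example; they are not specified by the disclosure.

```python
import time
from collections import deque

# Assumed values: roughly 1 m of elevation change corresponds to ~0.12 hPa.
PRESSURE_STABLE_HPA = 0.12
WINDOW_SECONDS = 5

def elevation_changing(samples) -> bool:
    """Treat the movement pattern as indicating changes in elevation when
    barometric pressure drifts more than the threshold across the window."""
    return max(samples) - min(samples) > PRESSURE_STABLE_HPA

def deliver_when_elevation_stable(read_pressure_hpa, fire_haptic,
                                  poll_interval_s=1.0, timeout_s=60.0):
    """Delay the haptic output until changes in elevation cease, falling
    back to immediate delivery after a timeout."""
    samples = deque(maxlen=int(WINDOW_SECONDS / poll_interval_s))
    waited = 0.0
    while waited < timeout_s:
        samples.append(read_pressure_hpa())
        if len(samples) == samples.maxlen and not elevation_changing(samples):
            break  # pressure stable: the user has stopped changing elevation
        time.sleep(poll_interval_s)
        waited += poll_interval_s
    fire_haptic()
```

The timeout keeps the alert from being deferred indefinitely if, for example, the user is on a long stair climb.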


In various examples, the processing unit is configured to determine a first period based on the movement pattern where the electronic device will be less proximate to a user (such as where the user is running and the electronic device, carried in the user's pocket, moves further from and closer to the user at different portions of the user's stride), determine a second period based on the movement pattern where the electronic device will be more proximate to the user, and delay the haptic output from the first period to the second period. In other examples, the processing unit delays the haptic output for a first period when the movement pattern indicates a first type of movement and delays the haptic output for a second period when the movement pattern indicates a second type of movement.


In numerous examples, the signal includes information indicating a heart rate of a user is elevated and the processing unit delays the haptic output until the heart rate of the user reduces. In various examples, the processing unit estimates a time when the haptic output will be salient despite the movement and delays the haptic output until the time.


In some embodiments, an electronic device that provides situationally-aware alerts includes a haptic output device, a sensor operable to produce a signal indicating information relating to movement of the electronic device, and a processing unit connected to the sensor and the haptic output device. The processing unit is configured to determine to provide a haptic output via the haptic output device, determine a movement pattern based on the signal, and adjust the haptic output to account for the movement pattern by altering the haptic output to be discernible despite the movement pattern based on a cadence of the movement pattern.


In various examples, the processing unit is configured to adjust a pattern of the haptic output to be mismatched with the cadence. In numerous examples, the processing unit is configured to alter the haptic output by time shifting the haptic output to a pause in the cadence.


In some examples, the processing unit is configured to determine to provide the haptic output in response to receiving an incoming communication, adjust the haptic output in a first manner when the incoming communication is associated with a first priority, and adjust the haptic output in a second manner when the incoming communication is associated with a second priority. In various examples, the processing unit is configured to alter the haptic output in a first manner when the movement pattern indicates a first type of movement and in a second manner when the movement pattern indicates a second type of movement. In numerous examples, the processing unit is configured to prompt for an acknowledgement of the adjusted haptic output, determine the acknowledgement has not been received, and provide additional haptic output until the acknowledgement is received.


In numerous embodiments, an electronic device that provides situationally-aware alerts includes a non-transitory storage medium storing instructions; a haptic output device; a sensor operable to produce a signal indicating information about a situation of a user of the electronic device; a communication component operable to receive an incoming communication associated with a priority; and a processing unit connected to the sensor, the communication component, the haptic output device, and the non-transitory storage medium. The processing unit is configured to execute the instructions to determine to provide a haptic output via the haptic output device in response to receiving the incoming communication; determine a movement pattern based on the signal; and adjust the haptic output to account for the movement pattern by delaying the haptic output when the incoming communication is associated with a first priority and by altering the haptic output to be discernible despite the movement pattern based on a cadence of the movement pattern when the incoming communication is associated with a second priority.
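As a minimal sketch of this combined embodiment, assuming a simple pulse-train model of haptic output, the dispatch might look like the following; the `HapticPulse` type, the thirty-second delay, and the mismatch tolerance are illustrative assumptions, not values from the disclosure.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Priority(Enum):
    FIRST = auto()    # e.g., a non-VIP sender: a delay is acceptable
    SECOND = auto()   # e.g., a VIP sender: deliver now, reshaped to stay discernible

@dataclass
class HapticPulse:
    interval_s: float   # time between pulses in the pulse train
    amplitude: float    # normalized drive amplitude, 0.0 to 1.0

def adjust_alert(priority: Priority, cadence_hz: float,
                 pulse: HapticPulse) -> tuple[float, HapticPulse]:
    """Return (delay_seconds, pulse): delay a first-priority alert, or alter a
    second-priority alert so its pulse train does not line up with the steps."""
    if priority is Priority.FIRST:
        return 30.0, pulse  # assumed delay; could instead wait for movement to end
    step_period_s = 1.0 / cadence_hz
    if abs(pulse.interval_s - step_period_s) < 0.1 * step_period_s:
        # Pulses nearly coincide with the stride; offset them by half a step
        # so the output is mismatched with the cadence.
        pulse = HapticPulse(pulse.interval_s + step_period_s / 2, pulse.amplitude)
    return 0.0, pulse
```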


In various examples, the electronic device that provides situationally-aware alerts further includes an output device other than the haptic output device wherein the processing unit is configured to provide an output via the output device in addition to the haptic output. In some implementations of such examples, the output is at least one of visual output or audio output.


In numerous examples, the processing unit is configured to communicate with an additional electronic device and the processing unit signals the additional electronic device to produce output in addition to the haptic output. In various examples, the processing unit is configured to communicate with an additional electronic device and the processing unit evaluates the situation of the user by receiving data indicating a status of the additional electronic device that affects the situation of the user.


In some examples, the first and second priorities are based on at least one of a source of the incoming communication, a priority indicator included in the incoming communication, or a type of the incoming communication. In various examples, the first and second priorities are user assigned.





BRIEF DESCRIPTION OF THE DRAWINGS

The disclosure will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements.



FIG. 1 depicts an example system for providing situationally-aware alert output.



FIG. 2 depicts a block diagram illustrating sample components of the system of FIG. 1 and sample functional relationships among those components.



FIG. 3 is a flow chart illustrating a first example method for providing situationally-aware alert output. This first example method may be performed by the example system of FIGS. 1-2.



FIG. 4 is a flow chart illustrating a second example method for providing situationally-aware alert output. This second example method may be performed by the example system of FIGS. 1-2.



FIG. 5 is a flow chart illustrating a third example method for providing situationally-aware alert output. This third example method may be performed by the example system of FIGS. 1-2.



FIG. 6 is a flow chart illustrating a fourth example method for providing situationally-aware alert output. This fourth example method may be performed by the example system of FIGS. 1-2.



FIG. 7 is a flow chart illustrating a fifth example method for providing situationally-aware alert output. This fifth example method may be performed by the example system of FIGS. 1-2.





DETAILED DESCRIPTION

Reference will now be made in detail to representative embodiments illustrated in the accompanying drawings. It should be understood that the following descriptions are not intended to limit the embodiments to one preferred embodiment. To the contrary, it is intended to cover alternatives, modifications, and equivalents as can be included within the spirit and scope of the described embodiments as defined by the appended claims.


The description that follows includes sample systems, apparatuses, methods, and computer program products that embody various elements of the present disclosure. However, it should be understood that the described disclosure may be practiced in a variety of forms in addition to those described herein.


Notifications and other output provided by an electronic device may be thwarted if they are not salient to a user. The situation a user is in (e.g., activities the user is performing, activities going on around the user, a location where the user is, and so on) may affect the salience of a notification. For example, movement of a user may decrease the salience of a vibration or other notification related output. By way of another example, a user may be located in a highly distracting environment (high noise level and so on) and/or engaged in other activities that decrease the salience of such a vibration. In yet another example, a user's cognitive state may affect perceived salience. When the user is engaged in a highly demanding cognitive task, when the user's attention is focused away from the electronic device, and so on, the user's absorbed cognitive state may reduce the perceived salience of a vibration or other notification related output.


Larger actuators or other output components may be used, and/or larger amounts of power may be provided to actuators or other output components, in order to increase the salience of vibrations despite a user's situation. However, these sorts of solutions may still not ensure that a user notices a notification or other output and may not be feasible given space, power, and/or other electronic device constraints.


Further, the situation a user is in may make a vibration or other notification related output too noticeable. Sound related to a vibration provided by an electronic device may be salient to people other than the user in a meeting or other situation where sound is particularly noticeable. This may be exacerbated if the electronic device is on a surface such as a table that may amplify the vibration. In such a situation, it may be desirable to decrease the salience of the vibration such that it is still noticeable by the user but not others, or to prevent the notification from being annoyingly strong to the user. Efforts such as the larger actuators or other output components and/or larger amounts of power discussed above to ensure salience may further exacerbate these issues when increased salience is not necessary.


The following disclosure relates to an electronic device that adjusts alert output based on a user's situation in order to increase salience of the alert output when the user's situation merits increased salience. The alert output may be vibrations or other haptic output, visual output, audio output, and so on. Adjusting the alert output may include delaying the alert output, altering one or more parameters of the alert output (such as amplitude of a vibration, frequency of a vibration, and so on), and so on. The electronic device may determine to provide an alert output, evaluate the user's situation based on information from one or more sensors, and increase salience by adjusting the alert output based on the user's situation.


In some embodiments, the alert output may be haptic output and increasing salience may include providing output via an output device other than and/or in addition to the haptic output. For example, the electronic device may provide an audio or visual output instead of and/or in addition to the haptic output if the electronic device evaluates the user's situation to affect salience of the haptic output too adversely.


In various embodiments, increasing salience may include signaling another electronic device to provide the alert output and/or other output rather than and/or in addition to the electronic device. Similarly, the sensor data the electronic device uses to evaluate the user's situation may be received by the electronic device from other electronic devices with which the electronic device communicates.


In a particular embodiment, the electronic device may evaluate data from one or more sensors to determine that the user is moving. The electronic device may evaluate the data to determine a movement pattern and adjust the alert output to account for the movement pattern. In some implementations, the electronic device may adjust the alert output by delaying the alert output based on the movement pattern, such as delaying until the user is no longer moving or the user's activity level declines, delaying to when the electronic device will be more proximate to the user than another time, delaying different time periods based on different types of movement, delaying until a time the electronic device estimates the alert output will be salient despite the movement, and so on. In other implementations, the electronic device may adjust the alert output by altering the alert output to be discernible despite the movement pattern based on a cadence of the movement pattern, such as by mismatching the alert output with a cadence of the movement pattern, altering the alert output in different manners based on different types of movement, and so on.


In still other implementations, the electronic device may adjust the alert output to account for the movement pattern by delaying the alert output in some situations and altering the alert output to be discernible despite the movement pattern based on a cadence of the movement pattern in other situations. For example, the electronic device may utilize priorities to prioritize some alerts over others. An alert output may be associated with a priority such as an urgency priority. The electronic device may delay the alert output if the priority is a first priority and may alter the alert output if the priority is a second priority.


By way of example, the alert output may be provided in response to receiving an incoming communication. In such an example, the electronic device may include a list of contacts organized into different priorities such as very important (VIP) contacts and non-VIP contacts. The electronic device may adjust the alert output in a first way if the source of the incoming communication is a VIP contact and in a second way if the source of the incoming communication is a non-VIP contact. In other implementations of such an example, the priority may otherwise be associated with a source of the communication, a priority indicator included in the incoming communication, a type of the incoming communication, and so on.


In various embodiments, the electronic device may increase salience of the alert output by prompting for an acknowledgement of the alert output. If the acknowledgement is not received, such as after a period of time after providing a prompt, the alert output may be provided again. In some implementations, the alert output may be provided repeatedly until acknowledged.


These and other embodiments are discussed below with reference to FIGS. 1-7. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these Figures is for explanatory purposes only and should not be construed as limiting.



FIG. 1 depicts an example system 100 for providing situationally-aware alert output. The system 100 includes an electronic device 101 that provides situationally-aware alerts. The electronic device 101 may determine (such as in response to receiving one or more incoming communications) to provide alert output (such as vibrations or other haptic output, visual output, audio output, and so on), evaluate a user's 104 situation based on information from one or more sensors, and increase salience by adjusting the alert output based on the user's 104 situation.


Many different aspects of the user's 104 situation may affect salience of the alert output. As such, the electronic device 101 may analyze a variety of different data in evaluating a variety of different aspects of the user's 104 situation. Such aspects may involve ambient noise levels, ambient light levels, the cognitive state of the user 104, motion of the user 104, health data of the user 104, whether or not the user 104 is climbing stairs, whether or not the user 104 is driving, and so on. Such aspects may also involve activities the user is performing on other electronic devices with which the electronic device 101 may communicate, such as a first other electronic device 103 and a second other electronic device 102 (such as typing on a keyboard 105 of the first other electronic device 103, playing music on the second other electronic device 102, and so on). The electronic device 101 may receive signals from one or more different sensors indicating data the electronic device 101 may use in evaluating the user's 104 situation.


In various implementations, such sensors may be components of the electronic device 101. However, such sensors may also be components of one or more other electronic devices with which the electronic device 101 may communicate such as the first other electronic device 103 and the second other electronic device 102.


The electronic device 101 may evaluate data from one or more sensors to determine that the user is moving. The electronic device 101 may evaluate the data to determine a movement pattern and adjust the alert output (such as by delaying the alert output, altering one or more parameters of the alert output, and so on) to account for the movement pattern. In some implementations, the electronic device 101 may delay the alert output based on the movement pattern. In other implementations, the electronic device may alter the alert output to be discernible despite the movement pattern based on a cadence of the movement pattern. In still other implementations, the electronic device 101 may adjust the alert output to account for the movement pattern by delaying the alert output in some situations and altering the alert output to be discernible despite the movement pattern based on a cadence of the movement pattern in other situations.


For example, incoming communications received by the electronic device 101 may be prioritized with respect to other incoming communications. In various situations, incoming communications from some senders may be prioritized over other incoming communications from other senders, incoming communications associated with some applications may be prioritized over incoming communications associated with other applications, incoming communications having certain content may be prioritized over incoming communications having other content, and so on.


By way of example, the electronic device 101 may determine to provide an alert output in response to receiving an incoming communication that is associated with a priority according to a source of the incoming communication. The electronic device 101 may delay the alert output if the priority is a first priority and may alter the alert output and/or provide the alert output without delay if the priority is a second priority. Although this example is described using first and second priorities, it is understood that this is an illustration. In various examples, priority may vary continuously and handling of corresponding alerts may also vary continuously.


In various implementations, the electronic device 101 may include different profiles for providing situationally-aware alert output in different situations. For example, the electronic device 101 may be configured for the user 104 to increase salience differently when the user 104 is working, at home during waking hours, at home during sleeping hours, driving, and so on. For each situation, the different profiles may specify how salience of alert outputs is to be determined, when to increase salience, how to increase salience, and so on. Such profiles may be specified by the user 104, configured by default for the user 104, and so on.
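A hypothetical encoding of such profiles might look like the following sketch; the profile names and policy fields are assumptions chosen for illustration, not values from the disclosure.

```python
# Hypothetical per-situation profiles illustrating how "when and how to
# increase salience" might be encoded; entries could be user-specified
# or configured by default.
PROFILES = {
    "working":       {"boost_haptics": True,  "allow_audio": False, "may_delay": True},
    "home_waking":   {"boost_haptics": False, "allow_audio": True,  "may_delay": False},
    "home_sleeping": {"boost_haptics": False, "allow_audio": False, "may_delay": True},
    "driving":       {"boost_haptics": False, "allow_audio": True,  "may_delay": False},
}

def salience_policy(situation: str) -> dict:
    """Fall back to a conservative default when the situation is unrecognized."""
    return PROFILES.get(
        situation,
        {"boost_haptics": True, "allow_audio": True, "may_delay": True},
    )
```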


Although the electronic device 101 is described above as providing the alert output, it is understood that this is an example. In some implementations, the electronic device 101 may signal one or more of the first other electronic device 103 and the second other electronic device 102, based on evaluation of the user's 104 situation, to provide alert output and/or other output (such as visual output, audio output, and so on) instead of and/or in addition to the electronic device 101 providing the alert output.


Further, although the electronic device 101 is illustrated as a smart phone, the first other electronic device 103 is illustrated as a laptop computing device, and the second other electronic device 102 is illustrated as a wearable device, it is understood that these are examples. In various implementations, the electronic device 101, the first other electronic device 103, and the second other electronic device 102 may be a variety of different electronic and/or other devices without departing from the scope of the present disclosure.



FIG. 2 depicts a block diagram illustrating sample components of the system 100 of FIG. 1 and sample functional relationships among those components. The electronic device 101 may include one or more processing units 210, one or more sensors 211, one or more haptic output devices 212, one or more non-transitory storage media 213, one or more communication components 214, and so on.


The processing unit 210 may execute instructions stored in the non-transitory storage media 213 to perform a variety of different functions. For example, the processing unit 210 may execute such instructions to receive one or more signals from the one or more sensors 211, communicate with the first other electronic device 103 and/or the second other electronic device 102 via the communication component 214, provide haptic output via the haptic output device 212, and so on. The processing unit 210 may also execute the instructions to perform various methods of providing situationally aware haptic output. Such methods may involve determining to provide a haptic output, evaluating a user's situation based on information from the one or more sensors 211, and increasing salience by adjusting the haptic output based on the user's situation.


The haptic output devices 212 may be one or more actuators or other vibration producing components. The non-transitory storage media 213 may take the form of, but are not limited to, a magnetic storage medium; optical storage medium; magneto-optical storage medium; read only memory; random access memory; erasable programmable memory; flash memory; and so on. The communication components 214 may be one or more cellular antennas, WiFi antennas, Bluetooth antennas, and so on.


The one or more sensors 211 may be one or more of a variety of different sensors. Such sensors may include, but are not limited to, one or more accelerometers, gyroscopes, global positioning system (GPS) or other navigation system components, communication components (such as by tracking WiFi network handoffs, cellular handoffs, and/or other events of various communication networks, with or without other associated information such as GPS data associated with network components), compasses, magnetometers, hall effect sensors, barometric or other pressure sensors, cameras, microphones, image sensors, inertial sensors, health sensors (such as photoplethysmogram sensors that may be used to determine a heart rate of the user and/or other information regarding the body of the user), touch pressure sensors, sensors that monitor a user's cognitive state (such as one or more heart rate sensors, eye movement sensors, galvanic skin response sensors, sensors that monitor use and activity on one or more other devices, and so on), combinations thereof, and so on.


Similarly, the first other electronic device 103 may include one or more processing units 220, one or more sensors 221, one or more haptic output devices 222, one or more non-transitory storage media 223, one or more communication components 224, and so on. Likewise, the second other electronic device 102 may include one or more processing units 215, one or more sensors 216, one or more haptic output devices 217, one or more non-transitory storage media 218, and one or more communication components 219.


Although FIG. 2 is illustrated and described above as including a haptic output device 212 and providing situationally aware haptic output, it is understood that this is an example. In various implementations, other kinds of situationally aware alert output may be provided. Such alert output may include audio output, video output, and so on.



FIG. 3 is a flow chart illustrating a first example method 300 for providing situationally-aware alert output. This first example method 300 may be performed by the example system 100 of FIGS. 1-2.


The flow begins at block 310 where an electronic device operates. The flow then proceeds to block 320 where the electronic device determines whether or not to provide an alert output (such as a vibration or other haptic output, audio output, visual output, and so on). The electronic device may determine to provide an alert output in response to receiving an incoming communication (such as an email, a text message, a social media communication, a telephone call, and so on), in response to triggering of a reminder such as a calendar or other schedule reminder, based on the status of a resource such as a battery power level falling below a threshold level or a change in a connection to a communication network, based on a status change of an executing application such as the completion of a download, and/or any other event for which the electronic device determines to provide a notification or other output to a user. If so, the flow proceeds to block 330. Otherwise, the flow returns to block 310 where the electronic device continues to operate.


At block 330, the electronic device evaluates the user's situation before proceeding to block 340. The electronic device may evaluate data regarding a variety of different aspects of the user's situation from a variety of different sensors included in the electronic device and/or other electronic devices with which the electronic device communicates.


For example, the electronic device may determine an ambient noise level of the user's situation using one or more microphones. By way of another example, the electronic device may determine an illumination level of the user's situation using one or more ambient light sensors or other light detectors.


By way of still another example, the electronic device may analyze data to determine a movement pattern of the user or other movement information using data from one or more accelerometers, gyroscopes, GPS or other navigation system components, communication components (such as by tracking WiFi network handoffs, cellular handoffs, and/or other events of various communication networks, with or without other associated information such as GPS data associated with network components), compasses, magnetometers, hall effect sensors, barometric or other pressure sensors, cameras, microphones, image sensors, inertial sensors, health sensors (such as photoplethysmogram sensors that may be used to determine a heart rate of the user and/or other information regarding the body of the user), touch pressure sensors, combinations thereof, and so on. The electronic device may determine a variety of information about the user's movement as part of determining the movement pattern, such as a movement speed, a movement cadence, whether the user is changing elevation, an exertion level of the user, a type of the movement (e.g., jogging, running, walking, climbing stairs, bicycling, driving, riding in a car, and so on), and/or a variety of other different information regarding the pattern of the user's movement.
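As one hedged example of how a movement cadence might be extracted from such sensor data, the following sketch estimates a step rate from the autocorrelation of the accelerometer magnitude signal; the frequency bounds and the peak-prominence threshold are assumptions, not values from the disclosure.

```python
import numpy as np

def estimate_cadence_hz(accel_magnitude: np.ndarray, sample_rate_hz: float,
                        min_hz: float = 0.5, max_hz: float = 4.0):
    """Estimate movement cadence (steps per second) from the autocorrelation
    of the accelerometer magnitude. Returns None when no periodicity stands
    out, which a caller may treat as non-rhythmic movement."""
    x = accel_magnitude - accel_magnitude.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]  # lags 0..N-1
    lo = int(sample_rate_hz / max_hz)                  # shortest plausible step period
    hi = min(int(sample_rate_hz / min_hz), len(ac) - 1)
    if hi <= lo or ac[0] == 0:
        return None
    peak = lo + int(np.argmax(ac[lo:hi]))
    if ac[peak] < 0.3 * ac[0]:                         # assumed prominence threshold
        return None
    return sample_rate_hz / peak
```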


By way of still another example, the electronic device may receive a communication from another electronic device with which it is associated indicating that the user is involved in a distracting activity on that other electronic device that may impact salience of the alert output. For example, the other electronic device may be playing audio or video, the user may be typing on a keyboard and/or otherwise entering input on an input device of the other electronic device, and so on. The electronic device may determine a distraction level of the user's situation based on one or more communications from the other electronic device regarding such distracting activities.


At block 340, the electronic device determines whether or not to increase salience of the alert output based on the user's situation (such as by adjusting the alert output, which may include delaying the alert output, altering one or more parameters of the alert output, and so on). The electronic device may determine by evaluating the user's situation that the alert output will be salient as is and that the salience of the alert output should not be increased. Alternatively, the electronic device may determine by evaluating the user's situation that the alert output may not be salient as is (such as where the user's situation is too loud, too distracting, and so on) and that the salience of the alert output should be increased. If so, the flow proceeds to block 360. Otherwise, the flow proceeds to block 350.


At block 350, after the electronic device determines not to increase the salience of the alert output, the electronic device provides the alert output. The flow then returns to block 310 where the electronic device continues to operate.


At block 360, after the electronic device determines to increase the salience of the alert output, the electronic device adjusts the alert output based on the user's situation by delaying or altering the alert output. Such adjustment may include altering the time at which the alert output is provided (such as by delaying a period of time), altering one or more parameters of the alert output (such as providing a different waveform to an actuator, altering an amplitude of a waveform provided to an actuator, altering a phase of a waveform provided to an actuator, increasing power provided to an actuator, and so on), providing other output (such as visual, audio, and so on) instead of and/or in addition to the alert output, providing other alert output (which may have similar or different output characteristics than the alert output) via an electronic device with which the electronic device communicates instead of and/or in addition to alert output the electronic device may provide, and so on.
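For instance, a parameter adjustment of the kind described above might scale the amplitude of an actuator drive waveform with ambient noise. The sketch below assumes a sinusoidal drive signal; the 60 dB reference point and the 5%-per-dB slope are illustrative assumptions.

```python
import math

def adjusted_waveform(base_amplitude: float, duration_s: float,
                      ambient_noise_db: float, carrier_hz: float = 175.0,
                      sample_rate_hz: float = 1000.0) -> list[float]:
    """Build an actuator drive waveform whose amplitude grows with ambient
    noise -- one of the parameter adjustments described above."""
    gain = 1.0 + max(0.0, ambient_noise_db - 60.0) * 0.05
    amp = min(1.0, base_amplitude * gain)  # clamp to the actuator's range
    n = int(duration_s * sample_rate_hz)
    return [amp * math.sin(2 * math.pi * carrier_hz * i / sample_rate_hz)
            for i in range(n)]
```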


The flow then proceeds to block 370 where the electronic device provides the adjusted alert output based on the user's situation. The flow then returns to block 310 where the electronic device continues to operate.


For example, the alert output may be a haptic output. The electronic device may analyze data from one or more microphones to determine that the user is in a high noise environment. Such a high noise environment may reduce the possibility that the user will notice the haptic output. In response, the electronic device may increase a vibration amplitude included in the haptic output to increase the salience of the haptic output in the high noise environment. Additionally or alternatively, the electronic device may provide a different type of alert output such as a visual alert (e.g., flash a light emitting diode and so on).


By way of another example, the electronic device may analyze data from accelerometers, motion sensors, communication components, and/or other sensors and determine that the user is driving. The user may not notice haptic output while driving. However, the user's vehicle may be communicably connected to the electronic device and may be capable of providing vibrations or other haptic output via the steering wheel or other portion of the user's vehicle that the user touches while driving. As such, the electronic device may signal the user's vehicle to provide haptic output via the steering wheel or other portion instead of and/or in addition to the electronic device providing the haptic output. Additionally or alternatively, the electronic device may provide and/or signal the user's vehicle to provide another type of alert, such as audio through one or more speakers, visual indications through a display, and so on.


Although the example method 300 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.


For example, the example method 300 is illustrated and described as determining whether or not to increase salience of the alert output based on the user's situation. However, in some implementations, the electronic device may determine whether or not to decrease salience of the alert output based on the user's situation.


By way of example, the alert output may be a haptic output and the electronic device may analyze data from GPS or other navigation sensors and/or other sensors and so on to determine that the user is in a quiet environment such as a meeting or a movie theater. Such a quiet environment may allow the user to notice the haptic output, but may cause the haptic output to be undesirably noticeable to others. In response, the electronic device may decrease a vibration amplitude included in the haptic output to decrease the salience of the haptic output in the quiet environment so that the adjusted haptic output will still be noticeable to the user but will not be undesirably noticeable to others, or will not be noticeable to others at all.


Alternatively, rather than altering the haptic output, the electronic device may delay the haptic output. For example, in situations where the electronic device determines that the user is in a movie theater, the electronic device may delay the haptic output until an ambient light sensor detects increased light. This may correspond to a movie being finished, the user leaving the movie theater, and/or other situations where the haptic output may no longer be undesirably noticeable to others.


By way of another example, the electronic device may analyze data from motion sensors and/or other sensors and determine that the electronic device is on a surface that amplifies haptic output such as a hard surface table top (such as by analyzing that the electronic device is subject to very little motion, among other conditions). Amplification of the haptic output may not make the haptic output less salient to the user, but may make the haptic output unpleasant or undesirably noticeable to others. As such, the electronic device may modify the vibration included in the haptic output to change how the haptic output will be amplified so that the adjusted haptic output will still be noticeable to the user but will not be unpleasant and/or undesirably noticeable to others.
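A minimal sketch of this surface heuristic, assuming a window of accelerometer magnitude samples; the variance threshold and the amplitude reduction factor are illustrative assumptions.

```python
import statistics

def likely_on_hard_surface(accel_magnitudes: list[float],
                           variance_threshold: float = 1e-3) -> bool:
    """Very little motion (near-zero variance in accelerometer magnitude,
    among other conditions a device might check) suggests the device is
    resting on a surface that can amplify haptic output."""
    return (len(accel_magnitudes) > 1 and
            statistics.pvariance(accel_magnitudes) < variance_threshold)

def surface_adjusted_amplitude(amplitude: float, on_surface: bool) -> float:
    # Back the vibration off (assumed factor) so the amplified buzz stays
    # noticeable to the user without being unpleasant or noticeable to others.
    return amplitude * 0.4 if on_surface else amplitude
```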


In various examples, the electronic device may increase and/or decrease salience of an output based on how the user's cognitive state affects the user's situation. For example, the electronic device may determine the user is engaged in a highly demanding cognitive task, that the user's attention is focused away from the electronic device, and so on. Based on that determination, the electronic device may determine to increase salience (e.g., escalate) of the output. Alternatively or additionally, based on the determination, the electronic device may determine to decrease salience (e.g., de-escalate) or delay the output to avoid distracting the user when the user is involved in a demanding task or the user's attention is elsewhere.



FIG. 4 is a flow chart illustrating a second example method 400 for providing situationally-aware alert output. This second example method 400 may be performed by the example system 100 of FIGS. 1-2.


The flow begins at block 410 where an electronic device operates. The flow then proceeds to block 420 where the electronic device determines whether or not to provide an alert output (such as a vibration or other haptic output, audio output, visual output, and so on). If so, the flow proceeds to block 430. Otherwise, the flow returns to block 410 where the electronic device continues to operate.


At block 430, the electronic device evaluates ambient noise in the user's situation using data from one or more microphones and/or other sensors. The flow then proceeds to block 440.


At block 440, the electronic device determines whether or not to alter the alert output because of the ambient noise in the user's situation (though in various implementations the electronic device may delay the alert output, such as until the ambient noise changes, rather than alter the alert output). The electronic device may determine to alter the alert output if the ambient noise in the user's situation exceeds a first threshold. If not, the flow proceeds to block 450 where the electronic device provides the alert output before the flow returns to block 410 and the electronic device continues to operate. Otherwise, the flow proceeds to block 460.


At block 460, after the electronic device determines to alter the alert output because of the ambient noise in the user's situation, the electronic device increases the alert output (such as by increasing its amplitude). The flow then proceeds to block 470 where the electronic device provides the increased alert output.


The flow then proceeds to block 480 where the electronic device determines whether or not to provide other output. Such other output may be haptic output, visual output provided via a visual output device, audio output provided via an audio output device, output provided by another electronic device with which the electronic device communicates, and/or any other output. The electronic device may determine to provide the other output if the ambient noise in the user's environment exceeds both the first threshold and a second threshold. If not, the flow returns to block 410 and the electronic device continues to operate. Otherwise, the flow proceeds to block 490.
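The two-threshold logic of blocks 440 through 490 might be sketched as follows; the threshold values, the 1.5x boost, and the choice of a visual fallback are illustrative assumptions rather than values from the disclosure.

```python
def method_400_adjustment(ambient_noise_db: float, base_amplitude: float,
                          first_threshold_db: float = 65.0,
                          second_threshold_db: float = 80.0):
    """One pass through blocks 440-490: boost the alert when noise exceeds
    the first threshold, and add another output modality when it also
    exceeds the second."""
    amplitude = base_amplitude
    other_output = None
    if ambient_noise_db > first_threshold_db:        # block 440 -> block 460
        amplitude = min(1.0, base_amplitude * 1.5)   # increased alert output
    if ambient_noise_db > second_threshold_db:       # block 480 -> block 490
        other_output = "visual"                      # e.g., also flash an LED
    return amplitude, other_output
```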


At block 490, after the electronic device determines to provide other output, the electronic device provides the other output. The flow then returns to block 410 and the electronic device continues to operate. Additionally and/or alternatively, the other output may be adjusted based on the user's situation in addition to and/or instead of adjusting the alert output.


Although the example method 400 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.


For example, the example method 400 is illustrated and described as determining whether or not to increase the alert output based on ambient noise in the user's situation. However, in some implementations, the electronic device may determine whether or not to decrease the alert output based on ambient noise in the user's situation.


By way of example, the electronic device may analyze ambient noise in the user's situation and determine that the alert output may be too noticeable based on the user's situation having below a threshold amount of ambient noise. In response, the electronic device may decrease the alert output to make the alert output more suited for the user's situation while still allowing the alert output to remain salient.


Further, the example method 400 is illustrated and described as determining to alter the alert output and/or provide other output based on comparison of ambient noise to first and second thresholds. However, it is understood that this is an illustration. In various examples, provided alerts may be varied continuously in response to a continuous scale of ambient noise.



FIG. 5 is a flow chart illustrating a third example method 500 for providing situationally-aware alert output. This third example method 500 may be performed by the example system 100 of FIGS. 1-2.


The flow begins at block 510 where an electronic device operates. The flow then proceeds to block 520 where the electronic device determines whether or not to provide an alert output (such as a vibration or other haptic output, audio output, visual output, and so on). If so, the flow proceeds to block 530. Otherwise, the flow returns to block 510 where the electronic device continues to operate.


At block 530, the electronic device determines whether or not the user is moving. The electronic device may utilize signals from one or more accelerometers, gyroscopes, inertial sensors, communication components, barometric or other pressure sensors, altimeters, magnetometers, and/or other sensors to determine whether or not the user is moving. If not, the flow proceeds to block 540 where the electronic device provides the alert output before the flow returns to block 510 and the electronic device continues to operate. Otherwise, the flow proceeds to block 550.


At block 550, the electronic device determines a movement pattern of the user using the signals from the one or more sensors. The electronic device may determine a variety of different data about the user's movement pattern. The movement pattern may include a cadence of the user's movement pattern, a heart rate or other health data of the user related to the movement pattern, whether or not the user is changing elevation (such as ascending and/or descending, the rate of change, and so on), a speed of the user's movement pattern, and/or any other such information about the pattern of the user's movement.


The flow then proceeds to block 560 where the electronic device adjusts the alert output based on the user's movement pattern by delaying or altering the alert output. In some implementations, adjusting the alert output may include delaying the alert output. The alert output may be delayed until the movement stops or the electronic device estimates the movement will stop, until a user who has been determined (such as using a pressure sensor) to be changing elevation (such as walking up stairs or a ramp, riding an escalator or an elevator, and so on) ceases changing elevation or the electronic device estimates the user will stop changing elevation, until the electronic device estimates the alert output will be salient despite the movement, until a user's heart rate or other health data of the user related to the movement reduces or otherwise changes, for a specific time interval (such as thirty seconds), and so on.


In implementations where adjusting the alert output includes delaying the alert output by a period of time, the electronic device may delay for different periods of time based on a variety of factors. For example, the electronic device may determine based on the movement pattern that the electronic device will be less proximate to a user after a first period of time (such as five seconds) and more proximate to the user after a second period of time (such as ten seconds), such as where the electronic device is located in the user's pocket and thus moving within the pocket closer to and further from the user as part of the movement pattern. In such an example, the electronic device may delay the alert output by the second period of time.
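Assuming the movement pattern yields a stride period and an estimate of the stride phase at which the device sits closest to the user (the `near_phase` fraction below is an assumption), the second-period delay might be computed as:

```python
def proximity_delay_s(now_s: float, stride_period_s: float,
                      near_phase: float = 0.5) -> float:
    """Delay from 'now' until the next point in the stride at which the
    device is expected to be most proximate to the user. In the example
    above, this is what selects the ten-second period over the five-second
    one."""
    phase = (now_s % stride_period_s) / stride_period_s
    return ((near_phase - phase) % 1.0) * stride_period_s
```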


By way of a second example, the electronic device may determine a type of motion based on the movement pattern, such as running motion, walking motion, stair climbing motion, dancing motion, driving motion, and so on. The processing unit may delay the alert output different periods based on the type of motion. In some examples, the processing unit may delay the alert output by a first period (such as twenty seconds) when the movement pattern indicates a first type of motion (such as walking motion) and by a second period (such as forty seconds) when the movement pattern indicates a second type of motion (such as running motion).


In various implementations, the electronic device may estimate a time when the alert output will be salient despite the movement, such as where the movement pattern indicates the movement will pause. In such an implementation, the electronic device may delay until that time.


In other implementations, adjusting the alert output may include altering the alert output to be discernible despite the movement pattern based on a cadence of the movement pattern. In such implementations, the electronic device may determine a cadence of the movement pattern and alter the alert output based thereupon. A cadence of a movement pattern may involve the rhythm of body parts such as legs involved in the motion, the rate at which they move, and so on.


For example, the electronic device may alter a pattern of the alert output (such as the waveform of haptic output) to be mismatched with the cadence. As the altered alert output is mismatched to the cadence of the movement pattern, the altered alert output may be more salient despite the movement.


By way of another example, the cadence of the movement pattern may involve pauses in motion. The electronic device may alter the alert output by time shifting the alert output to such a pause in the cadence.
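A sketch of such time shifting, assuming the movement-pattern analysis supplies predicted pause windows as (start, end) times; the interface is an assumption made for illustration.

```python
def shift_into_pause(alert_time_s: float, alert_duration_s: float,
                     predicted_pauses: list[tuple[float, float]]) -> float:
    """Time-shift the alert into the next predicted pause in the cadence
    that is long enough to contain it; otherwise leave the timing alone."""
    for start_s, end_s in predicted_pauses:
        if start_s >= alert_time_s and (end_s - start_s) >= alert_duration_s:
            return start_s
    return alert_time_s
```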


In numerous examples, the processing unit may alter the alert output in different manners based on the type of motion. In some examples, the processing unit may alter the alert output in a first manner when the movement pattern indicates a first type of motion (such as driving motion) and in a second manner when the movement pattern indicates a second type of motion (such as flying motion). Although these examples are described as altering alert output in a first manner for a first type of motion and in a second manner for a second type of motion, it is understood that this is an illustration. In various examples, alert output may be continuously varied based on a continuous scale of motion.


In various examples, the alert output may be provided in response to an incoming communication such as an email, text message, phone call, and so on. The incoming communication may have an associated priority. Such a priority may be based on a source of the incoming communication (such as a first priority for communications from very important person or VIP contacts compared to a second priority for other contacts), a priority indicator included in the incoming communication (such as an urgent priority flag included in the communication indicating a first priority or a normal priority flag indicating a second priority), or a type of the communication (such as a first priority for email communications and a second priority for text message communications). The priority may be user assigned. The electronic device may adjust the alert output differently based on the associated priority.


For example, the electronic device may delay the alert output if the associated priority is a first priority and alter the alert output based on a cadence of the movement if the associated priority is a second priority. By way of another example, the electronic device may delay the alert output a first period if the associated priority is a first priority and delay the alert output a second period if the associated priority is a second priority. By way of still another example, the electronic device may alter the alert output based on a cadence of the movement in a first manner if the associated priority is a first priority and alter the alert output based on a cadence of the movement in a second manner if the associated priority is a second priority. Although this example is described as delaying a first period for a first priority and a second period for a second priority, it is understood that this is an illustration. In various examples, alert output may be delayed on a continuous scale for a continuous priority scale.


After the electronic device adjusts the alert output based on the user's movement pattern, the flow proceeds to block 570 where the electronic device provides the adjusted alert output. The flow then proceeds to block 580.


At block 580, the electronic device determines whether or not the adjusted alert output has been acknowledged. The electronic device may prompt for acknowledgement when the adjusted alert output is provided so that the electronic device can ensure that the provided output was salient to the user. If so, the flow may return to block 510 where the electronic device continues to operate.


Otherwise, the flow may return to block 570 where the adjusted alert output is again provided. The electronic device may continue providing the adjusted alert output periodically, at intervals, and/or otherwise repeatedly until the provided output is acknowledged.
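The acknowledgement loop of blocks 570 and 580 might be sketched as follows; the retry interval and attempt cap are assumptions added here (the disclosure describes repeating until acknowledgement), and `fire_alert` and `acknowledged` stand in for device-specific calls.

```python
import time

def alert_until_acknowledged(fire_alert, acknowledged,
                             interval_s: float = 15.0,
                             max_attempts: int = 8) -> bool:
    """Blocks 570-580: re-provide the adjusted alert output at intervals
    until the user acknowledges it."""
    for _ in range(max_attempts):
        fire_alert()          # provide (or re-provide) the adjusted alert
        time.sleep(interval_s)
        if acknowledged():    # prompt/result check, device-specific
            return True
    return False
```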


Although the example method 500 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.


For example, the example method 500 is illustrated and described as the electronic device altering the alert output if the electronic device is moving. However, in some implementations, the electronic device may determine that the electronic device is moving and further determine whether or not the motion will affect salience of the alert output. In such an example, the electronic device may alter the alert output if the motion will affect salience and not alter the alert output if the motion will not affect salience.


By way of another example, the example method 500 is illustrated and described as the electronic device providing the altered alert output if the electronic device is moving. However, in some implementations, the electronic device may determine that another electronic device with which it communicates is not moving or is moving in a way that will not affect salience. In such implementations, the electronic device may adjust the alert output by signaling the other electronic device to provide the alert output. For example, a user's smart phone may be moving significantly while the user is jogging but the user's wearable device may not be, and the smart phone may signal the wearable device to provide the alert output.
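Assuming each candidate device reports a recent motion score (0 meaning still), a scale invented here for illustration, the handoff choice might reduce to picking the stillest device:

```python
def choose_alert_device(motion_by_device: dict) -> str:
    """Pick the device whose recent motion will least mask the alert;
    e.g., a jogging user's phone hands the alert to their wearable."""
    return min(motion_by_device, key=motion_by_device.get)

# e.g. choose_alert_device({"smart_phone": 0.9, "wearable": 0.1}) -> "wearable"
```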



FIG. 6 is a flow chart illustrating a fourth example method 600 for providing situationally-aware alert output. This fourth example method 600 may be performed by the example system 100 of FIGS. 1-2.


The flow begins at block 610 where an electronic device operates. The flow then proceeds to block 620 where the electronic device determines whether or not an incoming communication is received. If so, the flow proceeds to block 630. Otherwise, the flow returns to block 610 where the electronic device continues to operate.


At block 630, the electronic device determines whether or not the user is moving. If not, the flow proceeds to block 660 where the electronic device provides alert output (such as a vibration or other haptic output, audio output, visual output, and so on) before the flow returns to block 610 and the electronic device continues to operate. Otherwise, the flow proceeds to block 640.


At block 640, the electronic device determines a movement pattern of the user using the signals from the one or more sensors. Next, the flow proceeds to block 650 where the electronic device determines whether the movement pattern is a first type of movement pattern (such as walking) or a second type of movement pattern (such as running).


If the movement pattern is the first type of movement pattern, the flow proceeds to block 670 where the electronic device delays the alert output. The flow then proceeds after the delay to block 660 where the electronic device provides the alert output.


If the movement pattern is the second type of movement pattern, the flow proceeds to block 680 where the electronic device alters the alert output to be discernible despite the movement based on a cadence of the movement pattern. Next, the flow proceeds to block 690 where the electronic device provides the altered alert output. The flow then returns to block 610 where the electronic device continues to operate.


Although the example method 600 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.


For example, the example method 600 is illustrated and described as adjusting the alert output to account for the movement pattern in a first way when the movement pattern is a first type of movement and a second way when the movement pattern is a second type of movement. However, it is understood that this is an example. In various implementations, the electronic device may adjust the alert output in a variety of different ways based on a variety of different types of movement and/or based on other factors without departing from the scope of the present disclosure. Alternatively, the alert output may be provided to the user without adjustment.


Further, the example method 600 is illustrated and described as handling the alert output differently based on first or second types of movement patterns. However, it is understood that this is an illustration. In various examples, alert output may be varied continuously in response to a continuous scale of movement patterns.
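

By way of illustration only, such continuous variation might replace the discrete branch with a single mapping from movement energy to output intensity. The floor value and the linear mapping in this Swift sketch are assumptions introduced here.

```swift
// Illustrative sketch: map normalized movement energy directly to output
// intensity instead of branching on discrete pattern types.
func outputIntensity(movementEnergy: Double) -> Double {
    let clamped = max(0.0, min(1.0, movementEnergy))
    let floorIntensity = 0.4  // weakest output still worth providing
    return floorIntensity + (1.0 - floorIntensity) * clamped
}
```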



FIG. 7 is a flow chart illustrating a fifth example method 700 for providing situationally-aware alert output. This fifth example method 700 may be performed by the example system 100 of FIGS. 1-2.


The flow begins at block 710 where an electronic device operates. The flow then proceeds to block 720 where the electronic device determines whether or not an incoming notification or communication is received. If so, the flow proceeds to block 730. Otherwise, the flow returns to block 710 where the electronic device continues to operate.


At block 730, the electronic device determines whether or not the user is moving. If not, the flow proceeds to block 760 where the electronic device provides alert output (such as a vibration or other haptic output, audio output, visual output, and so on) before the flow returns to block 710 and the electronic device continues to operate. Otherwise, the flow proceeds to block 740.


At block 740, the electronic device determines a movement pattern of the user using the signals from the one or more sensors. Next, the flow proceeds to block 750 where the electronic device determines whether the incoming notification or communication is associated with a first priority or a second priority.


If the incoming notification or communication is associated with the first priority, the flow proceeds to block 770 where the electronic device delays the alert output. After the delay, the flow proceeds to block 760 where the electronic device provides the alert output.


If the incoming notification or communication is associated with the second priority, the flow proceeds to block 780 where the electronic device alters the alert output to be discernible despite the movement based on a cadence of the movement pattern. Next, the flow proceeds to block 790 where the electronic device provides the altered alert output. The flow then returns to block 710 where the electronic device continues to operate.
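

By way of illustration only, blocks 710 through 790 could be sketched in Swift as follows, reusing the AlertAction enumeration from the earlier sketch. The five-second delay and the semantics of the two priorities are assumptions introduced here.

```swift
// Illustrative sketch of the block 710-790 flow.
enum Priority {
    case first
    case second
}

func action(for priority: Priority,
            userMoving: Bool,
            cadenceHz: Double) -> AlertAction {
    guard userMoving else { return .provideImmediately }       // block 760
    switch priority {
    case .first:
        return .provideAfterDelay(seconds: 5.0)                // 770 -> 760
    case .second:
        return .provideAltered(cadenceOffset: 0.5 / cadenceHz) // 780 -> 790
    }
}
```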


Although the example method 700 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.


For example, the example method 700 is illustrated and described as adjusting the alert output in a first way when the associated priority is a first priority and a second way when the associated priority is a second priority. However, it is understood that this is an example. In various implementations, the electronic device may adjust the alert output in a variety of different ways based on a variety of different associated priorities and/or based on other factors without departing from the scope of the present disclosure. By way of illustration, alert output may be continuously adjusted based on an associated continuous priority scale. Alternatively, the alert output may be provided to the user without adjustment.
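

By way of illustration only, a continuous priority scale might map directly to a delay duration, as in the following Swift sketch; the linear mapping and the 30-second ceiling are assumptions introduced here.

```swift
// Illustrative sketch: map a normalized priority score to a delay duration.
func alertDelay(forPriorityScore score: Double) -> Double {
    let clamped = max(0.0, min(1.0, score))
    return (1.0 - clamped) * 30.0  // highest priority alerts immediately
}
```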


Although the example methods 300-700 are illustrated and described separately, various operations described in the context of one or more of the example methods 300-700 may be used in one or more of the other example methods 300-700. For example, in some implementations, the example method 700 may include the operation of providing other output described at 490 of the example method 400. By way of another example, in various implementations, the example method 700 may include the operation of determining whether or not alert output was acknowledged described at block 580 of the example method 500.


Although the above describes adjusting alert output for individual alerts, delaying alert output for individual alerts, and/or otherwise handling alert output for individual alerts, it is understood that these are examples. In various implementations, output for alerts may be batched in various ways. For example, alerts associated with received high-priority communications may be individually output, whereas those associated with received low-priority communications may be delayed before a single alert is output corresponding to a group of the low-priority communications. In some implementations of such an example, one or more rules may be applied (such as a user-specified rule, a default rule, and so on) specifying how such batching is handled. By way of illustration, a rule may specify that a batch notification is provided no more than once per hour, and alerts corresponding to received low-priority communications may be batched according to this rule. This batching may reduce the possibility of over-frequent alerts. Users may learn or train themselves to ignore over-frequent alerts; thus, reducing the number of alerts may increase the salience of those that are provided.
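

By way of illustration only, such a batching rule could be expressed as in the following Swift sketch. The AlertBatcher type and its return values are assumptions introduced here; only the one-hour interval is taken from the rule described above.

```swift
import Foundation

// Illustrative sketch: high-priority alerts pass through individually;
// low-priority alerts accumulate until the rule permits another batch.
final class AlertBatcher {
    private let minimumInterval: TimeInterval = 3600  // one batch per hour
    private var pendingLowPriority = 0
    private var lastBatchDate = Date.distantPast

    /// Returns a description of the output to emit now, or nil to stay quiet.
    func receive(isHighPriority: Bool, now: Date = Date()) -> String? {
        if isHighPriority { return "individual alert" }
        pendingLowPriority += 1
        guard now.timeIntervalSince(lastBatchDate) >= minimumInterval else {
            return nil  // hold until the rule allows another batch
        }
        defer { pendingLowPriority = 0; lastBatchDate = now }
        return "batched alert covering \(pendingLowPriority) communications"
    }
}
```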


As described above and illustrated in the accompanying figures, the present disclosure relates to an electronic device that provides situationally-aware alerts, adjusting alert output based on a user's situation in order to increase the salience of the alert output when the situation merits it. The electronic device may determine to provide an alert output, evaluate the user's situation based on information from one or more sensors, and increase salience by adjusting the alert output accordingly.


In the present disclosure, the methods disclosed may be implemented as sets of instructions or software readable by a device. Further, it is understood that the specific order or hierarchy of steps in the methods disclosed is an example of a sample approach. In other embodiments, the specific order or hierarchy of steps in the methods can be rearranged while remaining within the disclosed subject matter. The accompanying method claims present elements of the various steps in a sample order, and are not necessarily meant to be limited to the specific order or hierarchy presented.


The described disclosure may be provided as a computer program product, or software, that may include a non-transitory machine-readable medium having stored thereon instructions, which may be used to program a computer system (or other electronic devices) to perform a process according to the present disclosure. A non-transitory machine-readable medium includes any mechanism for storing information in a form (e.g., software, processing application) readable by a machine (e.g., a computer). The non-transitory machine-readable medium may take the form of, but is not limited to, a magnetic storage medium (e.g., floppy diskette, video cassette, and so on); optical storage medium (e.g., CD-ROM); magneto-optical storage medium; read only memory (ROM); random access memory (RAM); erasable programmable memory (e.g., EPROM and EEPROM); flash memory; and so on.


The foregoing description, for purposes of explanation, used specific nomenclature to provide a thorough understanding of the described embodiments. However, it will be apparent to one skilled in the art that the specific details are not required in order to practice the described embodiments. Thus, the foregoing descriptions of the specific embodiments described herein are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. It will be apparent to one of ordinary skill in the art that many modifications and variations are possible in view of the above teachings.

Claims
  • 1. An electronic device that provides situationally-aware alerts, comprising: a haptic output device; a sensor operable to produce a signal indicating information relating to movement of the electronic device; and a processing unit connected to the sensor and the haptic output device that: determines to provide a haptic output via the haptic output device; determines a movement pattern using the signal; determines that a pattern of the haptic output matches the movement pattern; and provides the haptic output after a delay to make the pattern of the haptic output mismatch the movement pattern.
  • 2. The electronic device of claim 1, wherein: the movement pattern indicates changes in elevation; and the processing unit delays the haptic output until after the changes in elevation.
  • 3. The electronic device of claim 2, wherein: the sensor includes a pressure sensor; the processing unit identifies the changes in elevation using the pressure sensor; and the processing unit delays the haptic output until the processing unit determines, using the pressure sensor, that the changes in elevation have ceased.
  • 4. The electronic device of claim 1, wherein the processing unit: determines a first period, using the movement pattern, where the electronic device will be less proximate to a user; determines a second period, using the movement pattern, where the electronic device will be more proximate to the user; and delays the haptic output from the first period to the second period.
  • 5. The electronic device of claim 1, wherein the processing unit delays the haptic output: for a first period when the movement pattern indicates a first type of movement; and for a second period when the movement pattern indicates a second type of movement.
  • 6. The electronic device of claim 1, wherein: the signal includes information indicating a heart rate of a user is elevated; and the processing unit delays the haptic output until the heart rate of the user reduces.
  • 7. The electronic device of claim 1, wherein the processing unit: estimates a time when the haptic output will be salient despite the movement pattern; and delays the haptic output until the time.
  • 8. The electronic device of claim 1, wherein: the movement pattern has a cadence; and the processing unit uses a delay duration that causes the haptic output to mismatch the cadence.
  • 9. The electronic device of claim 1, wherein the processing unit uses a delay duration that causes the haptic output to occur after the movement.
  • 10. The electronic device of claim 1, wherein the processing unit uses a delay duration that causes the haptic output to occur during the movement.
  • 11. The electronic device of claim 1, wherein the processing unit delays the haptic output by providing a different waveform.
  • 12. The electronic device of claim 1, wherein the processing unit delays the haptic output by phase shifting a waveform.
  • 13. The electronic device of claim 1, wherein the processing unit monitors the signal over a period of time in order to determine the movement pattern.
  • 14. The electronic device of claim 1, wherein the movement pattern corresponds to the movement of the electronic device.
  • 15. The electronic device of claim 1, wherein: the processing unit provides the haptic output in response to an incoming message; the haptic output is a first haptic output when the incoming message is associated with a first priority; and the haptic output is a second haptic output when the incoming message is associated with a second priority.
  • 16. The electronic device of claim 1, wherein: the haptic output is a first haptic output; and the processing unit provides a second haptic output upon determining that an acknowledgement to the first haptic output has not been received.
  • 17. An electronic device that provides situationally-aware alerts, comprising: a haptic output device; a sensor operable to produce a signal indicating information relating to movement of the electronic device; and a processing unit connected to the sensor and the haptic output device that: determines to provide a haptic output via the haptic output device; determines a movement pattern using the signal; determines that a pattern of the haptic output matches the movement pattern; and provides the haptic output after a delay determined using the movement pattern to make the pattern of the haptic output mismatch the movement pattern.
  • 18. The electronic device of claim 17, wherein: the haptic output has the pattern; and the processing unit uses a delay duration that adjusts the pattern to be less similar to the movement pattern.
  • 19. An electronic device that provides situationally-aware alerts, comprising: a haptic output device; a sensor operable to produce a signal indicating information relating to movement of the electronic device; and a processing unit connected to the sensor and the haptic output device that: determines to provide a haptic output via the haptic output device; determines the movement using the signal; determines that a pattern of the haptic output matches a pattern of the movement; and provides the haptic output after a delay to make the pattern of the haptic output mismatch the pattern of the movement.
  • 20. The electronic device of claim 19, wherein the processing unit provides the haptic output during at least a portion of the movement.
CROSS-REFERENCE TO RELATED APPLICATION

This application is a divisional of U.S. patent application Ser. No. 16/015,367, filed Jun. 22, 2018, entitled “Situationally-Aware Alerts,” which is a continuation of U.S. patent application Ser. No. 15/641,192, filed Jul. 3, 2017, entitled “Situationally-Aware Alerts,” now abandoned, which is a continuation of U.S. patent application Ser. No. 15/251,459, filed Aug. 30, 2016, entitled “Situationally-Aware Alerts,” now U.S. Pat. No. 10,039,080, which claims the benefit under 35 U.S.C. § 119(e) of U.S. Provisional Patent Application No. 62/303,964, filed on Mar. 4, 2016, and entitled “Situationally-Aware Alerts,” the contents of which are incorporated by reference as if fully disclosed herein.
