The present disclosure is directed to providing haptic output for an electronic device. Specifically, the present disclosure is directed to providing various types of haptic output based on a determined orientation, position, and/or operating environment of the electronic device.
Electronic devices are commonplace in today's society. These electronic devices include cell phones, tablet computers, personal digital assistants and the like. Some of these electronic devices include an ability to notify a user of a particular item of interest, such as, for example, an incoming phone call, or may otherwise attempt to gain the user's attention through the use of various alert notifications. These alert notifications may include vibrations from motors, sounds from speakers in the form of ringtones, and the like.
It is with respect to these and other general considerations that embodiments of the present disclosure have been made. Also, although relatively specific problems have been discussed, it should be understood that the embodiments disclosed herein should not be limited to solving the specific problems identified in the background.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description section. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Embodiments of the present disclosure provide a system and method for providing a type of haptic output for an electronic device. In certain embodiments, the type of haptic output is based on a determined orientation, position, and/or operating environment of the electronic device. Specifically, the electronic device may receive input from one or more sensors associated with the electronic device. Once the input from the one or more sensors is received, an orientation, position, and/or operating environment of the electronic device is determined. Based on the determined orientation, position, and/or operating environment of the electronic device, a type of haptic output is selected and provided.
In another embodiment, a method and system are provided for adjusting a type of haptic output provided by an electronic device. In such embodiments, input is received from one or more sensors associated with the electronic device. An orientation of the electronic device is determined using the input from the one or more sensors. The input from the sensors may be received simultaneously, substantially simultaneously, or in a sequential manner. Further, input may be received from a first sensor and the electronic device may then request additional readings from a second sensor. Once the sensor readings from the one or more sensors are received, the electronic device may request additional input. The additional input may include information from an application or other program being executed on the electronic device or from an additional sensor. Using this information, the electronic device is configured to determine its orientation, its position with respect to a user or another computing device, and/or its current operating environment. Once this information is determined, a type of haptic output is selected and provided by the electronic device.
Various embodiments are described more fully below with reference to the accompanying drawings, which form a part hereof, and which show specific exemplary embodiments. However, embodiments may be implemented in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the embodiments to those skilled in the art. The following detailed description is, therefore, not to be taken in a limiting sense.
As discussed above, embodiments of the present disclosure provide a system and method for providing a type of haptic output for an electronic device. The type of haptic output is based on a determined orientation, position, and/or operating environment of the electronic device. Specifically, the electronic device may receive input from one or more sensors associated with an electronic device. The input from the one or more sensors is analyzed to determine a current orientation, position and/or operating environment of the electronic device. Once the orientation, position and/or operating environment of the electronic device is determined, a type of haptic output is selected and provided by the electronic device. As will be discussed below, the type of haptic output selected is a type that will maximize salience of the haptic output given the current orientation, position and/or operating environment of the electronic device.
For example, the various types of haptic output may include: (i) a direction of the haptic output (e.g., haptic output provided along an x axis, a y axis, a z axis, or a combination thereof), (ii) a duration of the haptic output, (iii) a frequency or waveform of a vibration or pulse of the haptic output, (iv) an amount of power consumed by providing the haptic output, (v) a pattern of the haptic output (e.g., a series of short pulses, a series of different vibrations, or combinations thereof), and the like.
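By way of a non-limiting illustration, these enumerated parameters could be grouped into a single structure. The following Python sketch shows one hypothetical data model; the disclosure does not prescribe any particular representation, and all names and default values here are assumptions:

```python
from dataclasses import dataclass
from enum import Enum


class Axis(Enum):
    X = "x"
    Y = "y"
    Z = "z"


@dataclass
class HapticOutput:
    """One possible grouping of the output parameters enumerated above."""
    axes: tuple = (Axis.Z,)        # (i) direction(s) of the output
    duration_ms: int = 500         # (ii) duration of the output
    frequency_hz: float = 175.0    # (iii) frequency of the vibration or pulse
    power_mw: float = 50.0         # (iv) power budget for providing the output
    pattern_ms: tuple = (500,)     # (v) pattern, e.g. (100, 100, 100) for short pulses
```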
In addition to providing haptic output, the electronic device may also be configured to output auditory and/or visual notifications based on the received input from the one or more sensors. Specifically, the input from the one or more sensors is analyzed to determine a current orientation, position and/or operating environment of the electronic device. Once the orientation, position and/or operating environment of the electronic device is determined, a type of visual and/or auditory output is selected and provided by the electronic device. In further embodiments, haptic output may be combined with auditory notifications and/or visual notifications.
For example, the various types of auditory output may include various tones, rings, and other sounds. Likewise, the various types of visual output may include flashing lights, flashing the screen, outputting text, a graphic, or a picture on a display of the electronic device, and the like.
As shown in
The wearable electronic device 100 may also include a band or a strap 120 that is used to connect or secure the wearable electronic device 100 to a user. In other embodiments, the wearable electronic device 100 may include a lanyard or necklace. In still further examples, the wearable electronic device 100 may be secured to or within another part of a user's body. In these and other embodiments, the strap, band, lanyard, or other securing mechanism may include one or more electronic components or sensors in wireless or wired communication with an accessory. For example, the band 120 secured to the wearable electronic device 100 may include one or more sensors, an auxiliary battery, a camera, or any other suitable electronic component.
As shown in
Although not shown in
Further, the wearable electronic device 100 and the electronic device 130 may include other components not shown or described above. For example, the wearable electronic device 100 may include a keyboard or other input mechanism. Additionally, the wearable electronic device 100 may include one or more components that enable the wearable electronic device 100 to connect to the internet and/or access one or more remote databases or storage devices. The wearable electronic device 100 may also enable communication over wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media. Such communication channels may enable the wearable electronic device 100 to remotely connect and communicate with one or more additional devices such as, for example, a laptop computer, tablet computer, mobile telephone, personal digital assistant, portable music player, speakers and/or headphones and the like.
The wearable electronic device 100 and the electronic device 130 may also provide haptic output to notify a user of each device of a particular condition of the device. The haptic output may also be used to notify the user of an incoming text message, phone call, email message or other such communication. The haptic output may also notify a user of upcoming appointments and other notifications from various applications on each of the devices.
In certain embodiments, each of the devices 100 and 130 may be configured to output different types of haptic, audio and/or visual output based on various conditions associated with the device. For example, a first haptic output may be provided by the wearable electronic device 100 in a first setting while a second, different haptic output may be provided by the wearable electronic device 100 in a second setting. In another example, a first haptic output and first visual notification may be provided in a first setting and a second haptic output, a second visual notification and a first auditory notification may be provided in a second setting. Depending on the setting in which the wearable electronic device 100 or the electronic device 130 is operating, the haptic output may be a vibration, a series of vibrations, a pulse, a series of pulses, or various combinations thereof. Further, each vibration or pulse may be caused by a single haptic actuator or multiple haptic actuators.
In addition to the vibration or the pulse provided by a haptic actuator, and as discussed above, each of the devices described herein may also output visual notifications and/or auditory notifications. The visual and auditory notifications may be output by a light source or the display and by speakers, respectively. For example, the visual and auditory notifications, when output, may include flashing lights and tones having a variety of patterns, volumes, intensities or brightnesses, and so on. Further, the visual and auditory notifications may be output separately from the vibrations or pulses output by the haptic actuator. The visual and auditory notifications may also be output simultaneously or substantially simultaneously with the vibrations or pulses output by the haptic actuator. In yet another embodiment, the visual and auditory notifications may be output sequentially with respect to the vibrations or pulses output by the haptic actuator. Although the wearable electronic device 100 and the electronic device 130 may provide haptic output to gain the attention of the user, embodiments described herein are directed to determining an optimal method of vibrating or pulsing the haptic actuator. As needed or as determined, each of the devices 100 and 130 may also provide auditory or visual notifications. For example, embodiments described herein vary the type of haptic output provided by an electronic device based on the sensed operating environment of each of the devices 100 and 130, the orientation of each of the devices 100 and 130, and/or the position of each of the devices 100 and 130 with respect to a user.
Accordingly, embodiments of the present disclosure allow the wearable electronic device 100 and the electronic device 130 to autonomously observe or sense current operating conditions associated with each device 100 and 130, their respective orientations, and their positions with respect to a user. Based on these factors, each of the wearable electronic device 100 and the electronic device 130 can adjust the type of haptic output that is to be provided when a notification is to be provided to a user.
For example, the electronic device 130 may determine its current operating environment, such as, for example, whether the electronic device 130 is indoors, outdoors, contained in a purse or bag, sitting on a table, in a pocket, or attached to an arm of the user, by obtaining sensor readings from one or more sensors.
Based on the sensor readings from the one or more sensors, the electronic device 130 may select and/or optimize the type (e.g., the direction, duration, pattern, timing, etc.) of the haptic output provided by one or more haptic actuators. Specifically, the one or more haptic actuators may be configured to output vibrations or pulses in a variety of patterns, for a variety of durations, in a variety of directions (e.g., along different axes), and combinations thereof. In embodiments where the electronic device 130 includes multiple haptic actuators, a first haptic actuator may be used to output vibrations or pulses in a first direction and in a first pattern while a second haptic actuator may be used to output vibrations in a second direction and for a second duration. When multiple haptic actuators are present in the electronic device 130, the haptic actuators may output vibrations or pulses separately, simultaneously or substantially simultaneously.
For example, the electronic device 130 may utilize readings from various sensors to determine that the electronic device 130 is face down on a table. As such, a first haptic actuator may be activated to output a first type of vibration or pulse at a first determined frequency, duration and/or in a first direction or axis. In another example, the readings from one or more sensors may indicate that the wearable electronic device 100 is being worn on a wrist or arm of the user. As such, a second haptic actuator may be activated to output a second type of vibration or pulse at a second frequency, duration and/or in a second direction or axis. In these embodiments, the readings from the one or more sensors may be utilized to cause the haptic actuator to vibrate or pulse in such a way as to optimize salience depending on the determined environment, orientation and position in which the wearable electronic device 100 is operating.
In some embodiments, the devices 100 and 130 may include an accelerometer that detects an orientation of the devices 100 and 130 and/or whether the devices 100 and 130 are moving (e.g., being carried by a user) or stationary (e.g., lying on a table). The sensor may also be a gyroscope, a magnetometer, or another sensor that detects distance and/or depth. In yet another example, the sensor may be a light sensor that detects an amount of ambient light in order to determine whether the devices 100 and 130 are in a bag, in a pocket, covered by an article of clothing, or the like. The sensor may also be a temperature sensor that monitors a rise in heat caused by a user holding the electronic device 130 in the user's hand or wearing the wearable electronic device 100.
As discussed above, each of the devices 100 and 130 may include multiple sensors. Accordingly, each of the sensors may obtain sensor readings separately, simultaneously or substantially simultaneously. In another embodiment, one sensor may be a primary sensor that provides initial readings to a processor of the electronic devices. If additional readings are required to determine the current operating environment of the devices 100 and 130, the processor of each device may request additional readings from the other sensors.
For example, if the primary sensor is an accelerometer, the processor of the electronic device 130 may determine, based on the data provided by the accelerometer, that the electronic device 130 is stationary. Based on the readings from that single sensor, a type of haptic output may be selected and provided by the electronic device 130. However, the processor may also request that one or more additional sensors provide sensed data so as to verify or more correctly identify the current operating environment, orientation or position of the electronic device 130. Using this data, the electronic device may maximize salience of the haptic output.
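A minimal sketch of this two-stage sensing follows; the sensor callables, threshold values, and state labels are all hypothetical and not part of the disclosure:

```python
def determine_state(read_accel, read_light):
    """Use the accelerometer as the primary sensor and poll a secondary
    sensor only when the first reading alone is ambiguous.

    read_accel() -> acceleration magnitude in g
    read_light() -> ambient light level in lux
    """
    accel_g = read_accel()
    if abs(accel_g - 1.0) > 0.05:    # deviation from 1 g suggests motion
        return "moving"
    # Stationary: request a second reading to refine the estimate, e.g. to
    # distinguish "on a table" from "in a pocket or bag".
    return "stationary_covered" if read_light() < 10 else "stationary_visible"


# Stubbed sensors: at rest and in the dark -> "stationary_covered".
print(determine_state(lambda: 1.0, lambda: 2.0))
```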
Once the one or more sensors have taken their respective readings, the sensor readings may be analyzed together or separately in order to determine an operating environment, orientation and/or position of the device. More specifically, the readings from the one or more sensors may be used to determine the most effective way to obtain the attention of a user of one of the devices 100 or 130 based on the sensed operating environment, orientation and/or position of the devices 100 and 130.
Continuing the example from above, readings from an accelerometer may be analyzed in conjunction with readings from an ambient light sensor to determine that the electronic device 130 is in a pocket of a user. Further, the data from the accelerometer may indicate that the user is currently walking or running. Using this data, the processor of the electronic device 130 may instruct one or more haptic actuators to output a vibration or pulse along a first axis, in a first pattern and for a first duration.
In another example, the data from the accelerometer and the ambient light sensor may indicate that the electronic device 130 is in a pocket of the user but the user is sitting down or standing in place. In this example, the processor of the electronic device 130 may instruct the one or more haptic actuators to output a vibration or pulse along the first axis, but in a second pattern or for a second duration.
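The two pocket scenarios might be distinguished and mapped to output types as in the following sketch, in which the thresholds, axis choices, and patterns are illustrative assumptions:

```python
def pick_pocket_output(accel_variance: float, lux: float) -> dict:
    """Combine accelerometer and ambient-light readings as in the
    examples above. All thresholds and patterns are hypothetical."""
    in_pocket = lux < 10              # little light reaches the sensor
    walking = accel_variance > 0.2    # motion energy suggests walking/running
    if in_pocket and walking:
        # First pattern/duration: stronger, longer output while the user moves.
        return {"axis": "x", "pattern_ms": (300, 100, 300, 100, 300)}
    if in_pocket:
        # Second pattern/duration: the user is sitting or standing still.
        return {"axis": "x", "pattern_ms": (150, 100, 150)}
    return {"axis": "z", "pattern_ms": (500,)}


print(pick_pocket_output(accel_variance=0.35, lux=3.0))
```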
As discussed above, the wearable electronic device 100 and the electronic device 130 may include a single sensor or multiple sensors. The sensors may be used alone or in combination with other sensors. For example, if one sensor is an accelerometer, a gyroscope may be used in conjunction with the accelerometer to determine an orientation of the devices 100 and 130.
Continuing with the example above, if the wearable electronic device 100 is worn on the arm of a user, a sensor may be configured to determine whether the arm of the user is stationary or moving. Further, the sensor may be configured to determine how the arm of the user is positioned. Specifically, input from the one or more sensors can indicate that the user has his arms folded or otherwise held horizontally with respect to the user. The sensors may also determine whether the user has his arms in a vertical position with respect to the user such as, for example, if the user's arms are pointed down or swinging while walking.
Once the position of the wearable electronic device 100 or the electronic device 130 has been determined, the processor may instruct the haptic actuator to output a vibration or pulse that maximizes salience. For example, the processor may determine, based on the determined orientation, operating environment and position of the wearable electronic device 100 or the electronic device 130, that the haptic output should be provided in a first pattern at a first frequency, in a first direction, and for a first determined amount of time. In embodiments, the processor may also determine, based on the received sensor readings, that a display and/or a light source may be visible, or in a line of sight of the user of each of the devices 100 and 130. In such embodiments, the processor may also cause the display and/or the light source to flash or otherwise give a visual indication of the notification along with, or in place of, the haptic output.
In yet another embodiment, the sensor may be a light sensor. In such embodiments, the light sensor may be configured to determine an amount of ambient light (or other source of light) that the devices 100 and 130 are exposed to. For example, the light sensor may be able to determine whether the wearable electronic device 100 is covered or partially covered, such as, for example, by an article of clothing, or whether the electronic device is placed in a pocket or in a bag. If the wearable electronic device 100 is covered or partially covered, the haptic actuator may output a vibration or pulse of a second type, having a second duration and in a second direction.
In still yet another embodiment, the sensor may be an image sensor that is part of a camera of one of the electronic devices 100 and 130. In such embodiments, the image sensor may be configured to determine whether the devices 100 and 130 are moving or stationary and may also be able to determine the position of the user with respect to the electronic device (e.g., whether the user is wearing the wearable electronic device 100). For example, the image sensor may be able to determine an orientation of the wearable electronic device 100 by analyzing its surroundings over time and making a determination of its orientation based on the collected data.
In other embodiments, data received from the camera may be used to provide additional information regarding the operating environment of the devices 100 and 130. For example, if the electronic device 130 is lying face down on a surface and one sensor (e.g., a light sensor) is located on the face of the electronic device 130 while the image sensor is located on the opposite side of the electronic device 130, the light sensor may indicate that it is receiving substantially no light while the camera indicates that it is receiving light. Accordingly, a determination may be made that the electronic device 130 is face down on the surface.
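A minimal sketch of this inference, assuming hypothetical lux thresholds:

```python
def infer_face_down(front_lux: float, rear_lux: float) -> bool:
    """Infer a face-down orientation from two opposed light readings.
    The 5 lux and 50 lux thresholds are illustrative assumptions."""
    # A dark front (display-side) sensor while the rear camera sees light
    # suggests the device is lying face down on a surface.
    return front_lux < 5 and rear_lux > 50


print(infer_face_down(front_lux=1.0, rear_lux=120.0))  # True
```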
In accordance with the embodiments and examples described herein, once the orientation and/or position of the devices 100 and 130 have been determined, haptic output may be provided by the device. The type of haptic output includes the frequency, direction, pattern, and duration of the haptic output, such as described above.
The wearable electronic device 100 or the electronic device 130 may also include a microphone or other such input device. In certain embodiments, the microphone may also be used to determine an orientation or position of the devices 100 and 130 with respect to a user. For example, the microphone could be used to detect the rubbing of an article of clothing against the wearable electronic device 100 to determine whether the wearable electronic device 100 is in a bag, a pocket or covered by the article of clothing.
In another embodiment, the haptic actuator of the devices 100 and 130 may emit a small or quick pulse or vibration when it is determined that the device is stationary. The microphone may then help determine whether the device is placed on a hard or soft surface depending on the amount of noise caused by the emitted pulse or vibration. Based on this reading, the type of haptic output may be selected or adjusted. For example, if it is determined that the electronic device 130 is on a hard flat surface, the haptic output that is provided may be a series of short pulses instead of a long vibratory output.
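This probe-and-listen approach might be sketched as follows, with hypothetical callables and an assumed 60 dB threshold:

```python
def select_for_surface(emit_test_pulse, measure_noise_db) -> dict:
    """Probe the resting surface with a short pulse and listen to the
    result. The callables and the threshold are hypothetical."""
    emit_test_pulse(duration_ms=30)    # small, quick probe vibration
    if measure_noise_db() > 60:
        # A loud rattle suggests a hard surface: use a series of short
        # pulses rather than one long vibration, as in the example above.
        return {"pattern_ms": (80, 120, 80, 120, 80)}
    return {"pattern_ms": (700,)}


# Stubbed example: a quiet response implies a soft surface.
print(select_for_surface(lambda duration_ms: None, lambda: 35.0))
```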
In addition to the above, the microphone may be configured to detect and recognize a voice of a user, including determining a direction from which the voice originates. Based on this information, an orientation and/or position of the devices 100 and 130 may be determined. Once the orientation and position of the devices 100 and 130 is determined, the haptic actuator may output various types of haptic output such as described above.
Although not shown, each of the electronic devices 100 and 130 may also include a global positioning system (GPS) sensor. The GPS sensor may indicate, based on monitored position data, whether the devices 100 or 130 are moving and may also determine an environment in which the device is located (e.g., shopping mall, movie theater, park etc.). Using this information, either singularly or in combination with other sensors, the processor of the device may adjust the type of haptic output provided by the haptic actuator accordingly.
The wearable electronic device 100 and the electronic device 130 may also include a memory. The memory may be configured to store settings and/or orientation information received from the sensor or microphone. In addition, the memory may store data corresponding to a calendaring application, a contacts application and the like. Because each of the devices 100 and 130 has access to this additional data, the additional data may be used to select a type of haptic output.
For example, if the readings from the sensor indicate that the electronic device 130 is face-up on a table, the haptic actuator may be configured to output a first type of vibratory output. However, if data from the calendaring application indicates that the user of the electronic device 130 is in a meeting at the time the vibratory output is to be provided, a second, different type of vibratory output may be provided by the haptic actuator. In another embodiment, the haptic output may be delayed until the data from the calendaring application, or other such application, indicates that the user is free.
In yet another embodiment, the vibratory output may change based on a contact list. For example, if the electronic device 130 is in a first orientation and a first position with respect to a user, the haptic actuator may output a first type of vibration. However, if it is determined that the notification for which the vibration is output originates from a person of defined significance to the user (e.g., mother, father, wife, supervisor and so on), such as, for example, as defined in a contacts application, the type of vibratory output may be adjusted accordingly. For example, each individual in a contacts application may be associated with a different type of haptic feedback or combinations of types of haptic feedback.
In still yet other embodiments, the data from the calendaring application and contact information may be combined or arranged in a hierarchy. Thus, if the user is in a meeting but is receiving a notification from a person of importance to the user, the device may provide a tactile, audio, and/or visual notification even though the user would otherwise not be disturbed.
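Such a hierarchy might be sketched as follows; the decision labels and the VIP override rule are illustrative assumptions:

```python
def should_interrupt(in_meeting: bool, sender: str, vip_contacts: set) -> str:
    """Combine calendar state and contact significance into one decision.
    The labels and hierarchy here are hypothetical."""
    if in_meeting and sender not in vip_contacts:
        return "defer"            # wait until the calendar shows the user free
    if sender in vip_contacts:
        return "notify_distinct"  # per-contact haptic pattern for VIPs
    return "notify_default"


# A notification from a supervisor breaks through the meeting rule.
print(should_interrupt(True, "supervisor", {"mother", "supervisor"}))
```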
In some additional embodiments, the haptic, audio and/or visual output may change based on the determined operating environment of the electronic device 100. More specifically, the output, whether auditory, haptic, visual, or various combinations thereof, may vary based on a determined operating environment of the electronic device 100.
For example, a duration of haptic vibrations may have a first length when the electronic device 100 is placed on a table and have a second length when the electronic device 100 is placed in a pocket of a user. Likewise, auditory notifications, along with haptic output, may be provided when the electronic device is in a pocket of a user. In this scenario, visual notifications would not be provided because the electronic device 100 could determine that the display of the electronic device is not visible to a user. In yet another example, a volume of the auditory notification may be altered based on a determined level of ambient noise. Likewise, the brightness of the display may be altered based on a sensed amount of ambient light.
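These per-environment adjustments might be combined as in the following sketch, in which all scaling constants and thresholds are illustrative assumptions:

```python
def adjust_outputs(on_table: bool, in_pocket: bool,
                   ambient_noise_db: float, ambient_lux: float) -> dict:
    """Scale each output channel to the sensed environment."""
    return {
        # First duration on a table, a longer second one in a pocket.
        "haptic_ms": 300 if on_table else (900 if in_pocket else 500),
        # Suppress the visual channel when the display cannot be seen.
        "visual": not in_pocket,
        # Raise the ring volume with ambient noise, clamped to a range.
        "volume": min(1.0, max(0.2, ambient_noise_db / 80.0)),
        # Track ambient light with the display brightness.
        "brightness": min(1.0, max(0.1, ambient_lux / 500.0)),
    }


print(adjust_outputs(on_table=False, in_pocket=True,
                     ambient_noise_db=65.0, ambient_lux=4.0))
```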
Specifically, in the example shown in
As also discussed, the haptic actuator may output a pulse or vibration having different frequencies, patterns and/or durations. In embodiments where two or more haptic actuators are contained in the electronic device 200, one haptic actuator may be selected to output the vibration or pulse over another haptic actuator based on the orientation or location of the haptic actuator in the electronic device 200. For example, if a first haptic actuator is configured to output vibrations in a y axis and another haptic actuator is configured to output vibrations or pulses in the z axis, the haptic actuator configured to output vibrations in the y axis may be selected based on the determined orientation of the electronic device 200 and/or based on the determined position of the electronic device with respect to the user 210.
As also discussed above, the electronic device 200 may include two or more haptic actuators with each haptic actuator being configured to output vibrations or pulses in a different axis. As such, and continuing with the example above, a haptic actuator configured to output vibrations in the z axis may be selected over a haptic actuator configured to output vibrations in a y axis in order to provide haptic output with greater salience based on the current orientation and/or position of the electronic device 200.
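One hypothetical way to implement this axis-based selection between actuators, with an assumed orientation-to-axis mapping:

```python
def pick_actuator(orientation: str, actuators: dict) -> str:
    """Choose the actuator whose drive axis is expected to be most
    salient for the current orientation (hypothetical mapping)."""
    # E.g. a device lying flat may couple best through out-of-plane (z)
    # motion, while a device held upright may couple better along y.
    preferred_axis = {"flat": "z", "upright": "y"}.get(orientation, "z")
    for name, axis in actuators.items():
        if axis == preferred_axis:
            return name
    return next(iter(actuators))  # fall back to any available actuator


print(pick_actuator("flat", {"actuator_a": "y", "actuator_b": "z"}))
```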
As shown in
Based on this received data, the electronic device 300 may determine that a particular type of vibration or pulse may be required to gain the attention of the user 310. For example, the vibration or pulse provided by a haptic actuator may be output at a high frequency and for a long duration. Further, the vibrations or pulses may be output in different axes (e.g., a first pulse in a first axis and a second pulse in a second, different axis). In addition to the vibrations or pulse, an audible notification may be output by the electronic device 300.
As also shown in
For example, if the electronic device 400 includes an accelerometer and a microphone, the electronic device 400 may detect that it is in a classroom or meeting setting based on the sensors reporting no movement from the accelerometer and/or a relatively low ambient noise level from the microphone. Upon detecting that it is operating in this environment, the electronic device 400 may adjust the type of vibrations or pulses output by a haptic actuator.
In addition, the haptic actuator may be configured to output a test pulse or vibration. In response, the microphone may sense or otherwise analyze an amount of noise caused by the vibration or pulse (e.g., noise caused by the electronic device 400 vibrating on a hard surface) and adjust the type of haptic output provided. In another embodiment, if the vibration or pulse is too loud, a visual notification may be output by the electronic device 400. For example, a screen or light source of the electronic device 400 may flash or strobe in order to gain the attention of the user.
The method 500 begins when a sensor reading is received 510. According to one or more embodiments, the sensor reading may be received from a sensor contained in the electronic device. As discussed above, the sensor may be an accelerometer, a light sensor, an image sensor, and the like. Additionally, the sensor reading may be received from another input device such as, for example, a microphone. Although specific sensors and components are mentioned, any number of sensors and components may be used in various combinations to receive sensor readings and other data corresponding to the operating environment of the electronic device.
In embodiments, the sensor readings may be received continuously or substantially continuously. Thus, the electronic device would always be aware of its current orientation, position, and operating environment. In another embodiment, the sensor readings may occur periodically. For example, sensor readings may be taken at predetermined, time-based intervals.
In another embodiment, the sensor readings are taken when haptic output is to be provided. For example, if an incoming phone call is received by the electronic device, the processor of the electronic device may request sensor readings from one or more sensors prior to selecting and providing a type of haptic output. Based on the received sensor readings, the processor of the electronic device may determine the type of haptic output to provide based on the current sensor readings. In yet another embodiment, sensor readings may be requested when the electronic device changes state (e.g., is moved from a stationary position to a moving position by being picked up, carried and so on).
Once the sensor readings have been received, flow proceeds to operation 520 in which the operating parameters of the electronic device are determined. The operating parameters of the electronic device may include an orientation of the electronic device, a position of the electronic device with respect to a user of the electronic device and/or with respect to a second electronic device, whether the electronic device is stationary or moving, whether the electronic device is covered, partially covered or visible, and the like.
In certain embodiments, the viewability of a display screen of the electronic device may also be determined based on the received sensor readings. For example, the received sensor readings may indicate that the display of the electronic device is in an orientation in which the display is not currently visible to the user of the electronic device. Likewise, the sensor readings may indicate that the display screen of the electronic device is occluded or partially occluded from view.
In operation 530, a determination is made as to whether additional data about the electronic device is available. The additional data may include data from other applications or programs that are executing or available on the electronic device. For example, a calendaring application or a contacts application may provide additional data about a user's location or whether the user should be notified by a visual notification, an auditory notification, a tactile notification or some combination thereof.
Further, the additional information may help determine the orientation of the electronic device. For example, if a calendaring application indicates that the user is in a meeting, the electronic device may make a determination that the electronic device is positioned on a table or in a pocket of the user and may adjust the haptic output accordingly.
In other embodiments, the electronic device may learn various types of behavior based on the user's schedule and output vibrations or other notifications accordingly. For example, if an electronic device was in a particular orientation at a given time of day, or was in a particular orientation during a calendared appointment, the electronic device may use types of haptic output that were provided in those situations when the data from these sources indicate the user is again in a similar appointment or situation.
In other embodiments, data from applications executing on the computing device may be received first in an attempt to determine the current orientation and operating environment of the electronic device. Once the data from the applications is received, additional sensor data may be requested from the one or more sensors of the electronic device to verify the assumed orientation or position of the electronic device.
If operation 530 determines that additional information is available and/or is needed, flow proceeds to operation 540 and the additional information is provided to the processor of the electronic device. Once the additional information is received, flow proceeds to operation 550 and a type of haptic output is determined that will maximize salience. In embodiments, the type of haptic output may include the direction of the haptic output, the frequency of the haptic output, the duration of the haptic output, the pattern of the haptic output, the amount of power to be used in providing the haptic output and the like.
Returning back to operation 530, if it is determined that no additional information is needed (or wanted), flow proceeds directly to operation 550 and the type of haptic output is determined. As discussed, the type of haptic output may include the direction, the frequency, the pattern, the duration, etc. of the vibrations and pulses.
When the type of haptic output is determined, flow proceeds to operation 560 and the haptic output is provided. In certain embodiments, the haptic output may be provided by a single haptic actuator. In other embodiments, the haptic output may be provided by multiple haptic actuators. In embodiments where multiple haptic actuators are used, operation 550 may determine which haptic actuator is used. In addition, operation 550 may also determine whether multiple haptic actuators are active simultaneously or substantially simultaneously. In other embodiments, the haptic actuators may be actuated in sequence or in a staggered or alternating manner.
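Taken end to end, operations 510 through 560 of method 500 might be rendered as in the following hypothetical Python sketch; the callables, dictionary keys, and thresholds are assumptions rather than details of the disclosure:

```python
def method_500(read_sensors, get_app_data, actuators):
    """One hypothetical rendering of operations 510-560.

    read_sensors() -> dict of raw readings            (operation 510)
    get_app_data() -> dict of app context or None     (operations 530/540)
    actuators      -> list of available actuator ids  (operation 560)
    """
    readings = read_sensors()                                   # 510
    params = {"moving": readings.get("accel_g", 1.0) != 1.0,    # 520
              "covered": readings.get("lux", 100) < 10}
    extra = get_app_data()                                      # 530
    if extra:                                                   # 540
        params.update(extra)
    # 550: pick the output type expected to maximize salience, including
    # which actuator(s) to drive and in what pattern.
    output = {"pattern_ms": (300, 150, 300) if params["covered"] else (500,),
              "actuators": actuators if params["moving"] else actuators[:1]}
    return output                                               # 560


print(method_500(lambda: {"accel_g": 1.0, "lux": 4},
                 lambda: {"in_meeting": True}, ["a1", "a2"]))
```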
In a basic configuration, the electronic device 600 may include at least one processor 605 and an associated memory 610. The memory 610 may comprise, but is not limited to, volatile storage such as random access memory, non-volatile storage such as read-only memory, flash memory, or any combination thereof. The memory 610 may store an operating system 615 and one or more program modules 620 suitable for running software applications 655. The operating system 615 may be configured to control the electronic device 600 and/or one or more software applications 655 being executed by the operating system 615.
The electronic device 600 may have additional features or functionality beyond those expressly described herein. For example, the electronic device 600 may also include additional data storage devices, removable and non-removable, such as, for example, magnetic disks, optical disks, or tape. Exemplary storage devices are illustrated in
In certain embodiments, various program modules and data files may be stored in the system memory 610. The program modules 620 and the processor 605 may perform processes that include one or more of the operations of method 500 shown and described with respect to
As also shown in
The electronic device 600 also includes communication connections 645 that facilitate communications with additional computing devices 650. Such communication connections 645 may include an RF transmitter, a receiver, and/or transceiver circuitry, universal serial bus (USB) communications, parallel ports and/or serial ports.
As used herein, the term computer readable media may include computer storage media. Computer storage media may include volatile and nonvolatile media and/or removable and non-removable media implemented in any method or technology for the storage of information. Examples include computer-readable instructions, data structures, or program modules. The memory 610, the removable storage device 625, and the non-removable storage device 630 are all examples of computer storage media. Computer storage media may include RAM, ROM, electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other article of manufacture which can be used to store information and which can be accessed by the electronic device 600. Any such computer storage media may be part of the electronic device 600.
In certain embodiments, the system 705 may execute one or more applications or programs. These applications or programs include browser applications, e-mail applications, calendaring applications, contact manager applications, messaging applications, games, media player applications and the like.
One or more embodiments provide that application programs may be loaded into a memory 710 and may be executed by, or in association with, the operating system 715. Additional exemplary application programs may include phone programs, e-mail programs, personal information management (PIM) programs, word processing programs, spreadsheet programs, Internet browser programs, messaging programs, and the like. The system 705 also includes a non-volatile storage area 720 within the memory 710. The non-volatile storage area 720 may be used to store persistent information. In certain embodiments, the application programs may use and store information in the non-volatile storage area 720. A synchronization application or module (not shown) may also be included with the system 705 to synchronize applications or data resident on the device 700 with another computer or device. In embodiments, the device 700 includes a power supply 725. The power supply 725 may be a battery, solar cell, and the like that provides power to each of the components shown. The power supply 725 may also include an external power source, such as an AC adapter or other such connector that supplements or recharges the batteries. The device 700 may also include a radio 730 that performs the function of transmitting and receiving radio frequency communications. Additionally, communications received by the radio 730 may be disseminated to the application programs via the operating system 715. Likewise, communications from the application programs may be disseminated to the radio 730 as needed.
The electronic device 700 may also include a visual indicator 735, a keypad 760 and a display 765. In embodiments, the keypad may be a physical keypad or a virtual keypad generated on a touch screen display 765.
The visual indicator 735 may be used to provide visual notifications to a user of the electronic device 700. The electronic device 700 may also include an audio interface 740 for producing audible notifications and alerts. In certain embodiments, the visual indicator 735 is a light emitting diode (LED) or other such light source and the audio interface 740 is a speaker. In certain embodiments, the audio interface may be configured to receive audio input.
The audio interface 740 may also be used to provide and receive audible signals from a user of the electronic device 700. For example, a microphone may be used to receive audible input. In accordance with embodiments of the present disclosure, the microphone may also serve as an audio sensor to facilitate control of notifications such as described above. The system 705 may further include a video interface 750 that enables an operation of an on-board camera 755 to record still images, video, and the like.
In one or more embodiments, data and information generated or captured by the electronic device 700 may be stored locally. Additionally or alternatively, the data may be stored on any number of storage media that may be accessed by the electronic device 700 using the radio 730, a wired connection or a wireless connection between the electronic device 700 and a remote computing device. Additionally, data and information may be readily transferred between computing devices.
Embodiments of the present disclosure are described above with reference to block diagrams and operational illustrations of methods and the like. The operations described may occur out of the order shown in any of the figures. Additionally, one or more operations may be removed or executed substantially concurrently. For example, two blocks shown in succession may be executed substantially concurrently. Additionally, the blocks may be executed in the reverse order.
The description and illustration of one or more embodiments provided in this disclosure are not intended to limit or restrict the scope of the present disclosure as claimed. The embodiments, examples, and details provided in this disclosure are considered sufficient to convey possession and enable others to make and use the best mode of the claimed embodiments. Additionally, the claimed embodiments should not be construed as being limited to any embodiment, example, or detail provided above. Regardless of whether shown and described in combination or separately, the various features, including structural features and methodological features, are intended to be selectively included or omitted to produce an embodiment with a particular set of features. Having been provided with the description and illustration of the present application, one skilled in the art may envision variations, modifications, and alternate embodiments falling within the spirit of the broader aspects of the embodiments described herein that do not depart from the broader scope of the claimed embodiments.