Various embodiments relate generally to electronic devices that perform haptic events. More specifically, various embodiments relate to electronic devices having multiple actuators capable of providing localized haptic feedback.
Electronic devices often recreate a user's sense of touch by performing events that cause forces or vibrations to be applied to the user. These events, which support and enable haptic or kinesthetic communication, can be used to enhance the user's ability to remotely control an electronic device, improve the realism of virtual objects in computer simulations, etc. Many haptic devices incorporate tactile sensors that measure the forces exerted by the user on the electronic device.
Different haptic technologies are commonly found in electronic devices. For example, haptic feedback may take the form of a vibration performed in response to a touch event (i.e., a user interaction with the interface of an electronic device) or when a certain event occurs (e.g., an email or text message is received).
Electronic devices have conventionally included a single actuator that is responsible for performing the haptic events. Therefore, the force or vibration corresponding to a haptic event always originates from the same location, regardless of which type of haptic event is performed (e.g., different counts/durations of taps and vibrations).
Systems and techniques for providing localized haptic feedback by an electronic device are described herein. More specifically, an array of piezoelectric actuators can be disposed beneath the display of the electronic device. When a user interacts with content presented on the display (e.g., by touching the display), one or more of the piezoelectric actuators in the array can be induced into performing a haptic event.
For example, a power source may apply voltage to the piezoelectric actuator(s) (and thus induce the haptic event) when the user is watching a cinematic video, interacting with an application, or playing a video game. In some embodiments, a haptic processor that is communicatively coupled to touch circuitry within the electronic device is responsible for specifying how much voltage should be applied to each piezoelectric actuator by the power source. Consequently, touch events performed on the user device may affect which haptic event(s) are performed by the array of piezoelectric actuators. The localized haptic feedback provided by the piezoelectric actuator(s) can increase the realism of content experienced by the user.
The piezoelectric actuator(s) can be induced into performing the same haptic event or different types of haptic events. For example, the piezoelectric actuators disposed near the outer border of the display may vibrate periodically at a high intensity, while the piezoelectric actuators near the middle of the display may vibrate continuously at a low intensity.
One or more embodiments of the present invention are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements.
Systems and techniques for providing localized haptic feedback by an electronic device are described herein. More specifically, an array of piezoelectric actuators can be disposed within or beneath the display assembly of the electronic device. The piezoelectric actuators may be able to perform different types of haptic events based on what content is being shown by the electronic device, in response to a user interaction with the electronic device, etc.
These techniques can be used with any electronic device (also referred to herein as a “user device”) for which it is desirable to provide more realistic and targeted haptic feedback, such as personal computers, tablets, personal digital assistants (PDAs), mobile phones, game consoles and controllers (e.g., Sony PlayStation or Microsoft Xbox), mobile gaming devices (e.g., Sony PSP or Nintendo 3DS), music players (e.g., Apple iPod Touch), network-connected (“smart”) devices (e.g., televisions), virtual/augmented reality systems and controllers (e.g., Oculus Rift or Microsoft HoloLens), wearable devices (e.g., watches and fitness bands), and other portable electronic devices.
Brief definitions of terms, abbreviations, and phrases used throughout this application are given below.
Reference in this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the disclosure. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments necessarily mutually exclusive of other embodiments. Moreover, various features are described that may be exhibited by some embodiments and not by others. Similarly, various requirements are described that may be requirements for some embodiments and not for other embodiments.
Unless the context clearly requires otherwise, throughout the description and the claims, the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense, as opposed to an exclusive or exhaustive sense; that is to say, in the sense of “including, but not limited to.” As used herein, the terms “connected,” “coupled,” or any variant thereof, means any connection or coupling, either direct or indirect, between two or more elements; the coupling of or connection between the elements can be physical, logical, or a combination thereof. For example, two components may be coupled directly to one another or via one or more intermediary channels or components. As another example, devices may be coupled in such a way that information can be passed therebetween, while not sharing any physical connection with one another. Additionally, the words “herein,” “above,” “below,” and words of similar import, when used in this application, shall refer to this application as a whole and not to any particular portions of this application. Where the context permits, words in the Detailed Description using the singular or plural number may also include the plural or singular number respectively. The word “or,” in reference to a list of two or more items, covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list.
If the specification states a component or feature “may,” “can,” “could,” or “might” be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.
The term “module” refers broadly to software, hardware, or firmware components. Modules are typically functional components that can generate useful data or other output using specified input(s). A module may or may not be self-contained. An application program (also called an “application”) may include one or more modules, or a module can include one or more application programs.
The terminology used in the Detailed Description is intended to be interpreted in its broadest reasonable manner, even though it is being used in conjunction with certain examples. The terms used in this specification generally have their ordinary meanings in the art, within the context of the disclosure, and in the specific context where each term is used. For convenience, certain terms may be highlighted, for example using capitalization, italics, and/or quotation marks. The use of highlighting has no influence on the scope and meaning of a term; the scope and meaning of a term is the same, in the same context, whether or not it is highlighted. It will be appreciated that an element or feature can be described in more than one way.
Consequently, alternative language and synonyms may be used for any one or more of the terms discussed herein, and special significance is not to be placed on whether or not a term is elaborated or discussed herein. Synonyms for certain terms are provided. A recital of one or more synonyms does not exclude the use of other synonyms. The use of examples anywhere in this specification, including examples of any terms discussed herein, is illustrative only, and is not intended to further limit the scope and meaning of the disclosure or of any exemplified term. Likewise, the disclosure is not limited to the various embodiments given in this specification.
The user device could include other features as well, such as a camera, speaker, and a touch-sensitive button that are offset from the display 102. The camera, speaker, and/or touch-sensitive button may be located within an opaque border that surrounds the display 102 and is not responsive to user interactions (i.e., is not touch sensitive). The opaque border is often used to hide the various components that reside within the user device 100.
The haptic actuator 106 can provide tactile feedback in real time in the form of taps, vibrations, etc. (which are collectively referred to as “haptic events”). The type of haptic event performed by the haptic actuator 106 may correspond to how hard a user presses the display 102, where the user presses the display 102, etc. The haptic actuator 106 can be any kind of mechanical component that is capable of performing a haptic event. For example, the haptic actuator 106 may be a small motor that is driven by a processor and is electrically coupled to a rechargeable power supply disposed within the housing 104.
The display assembly 200 can include a protective substrate 202, an optically-clear bonding layer 204, driving lines 206 and sensing lines 208 disposed on a mounting substrate 210, and a display layer 212. Various embodiments can include some or all of these layers, as well as other layers (e.g., optically-clear adhesive layers).
The protective substrate 202 enables a user to interact with the display assembly 200 (e.g., by making contact with an outer surface using a finger 222). The protective substrate 202 is preferably substantially or entirely transparent and can be composed of glass, plastic, or any other suitable material (e.g., crystallized aluminum oxide).
Together, the driving lines 206 and sensing lines 208 form multiple electrodes (“nodes”) that create a coordinate grid for the display assembly 200. The coordinate grid may be used by a processor on a printed circuit board assembly (PCBA) 218 to determine the intent of a user interaction with the protective substrate 202. The driving lines 206 and/or sensing lines 208 can be mounted to or embedded within a transparent mounting substrate 210, such as glass or plastic. The driving lines 206, sensing lines 208, and/or mounting substrate 210 are collectively referred to herein as “touch circuitry 216.”
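To illustrate how such a coordinate grid might be used, the following is a minimal sketch (in Python, with hypothetical names such as read_delta and NOISE_FLOOR; this disclosure does not specify an algorithm) of resolving a touch location by energizing the driving lines and sampling the sensing lines:

    # Minimal sketch of resolving a touch location from a drive/sense grid.
    # All names here are hypothetical; real touch controllers apply far
    # more sophisticated filtering and interpolation.
    from typing import Callable, Optional, Tuple

    NOISE_FLOOR = 5.0  # minimum capacitance delta treated as a real touch

    def locate_touch(
        num_drive: int,
        num_sense: int,
        read_delta: Callable[[int, int], float],
    ) -> Optional[Tuple[int, int]]:
        """Return the (drive, sense) node with the largest capacitance
        change, or None if no node rises above the noise floor."""
        best, best_node = NOISE_FLOOR, None
        for d in range(num_drive):          # energize one driving line at a time
            for s in range(num_sense):      # sample every sensing line
                delta = read_delta(d, s)
                if delta > best:
                    best, best_node = delta, (d, s)
        return best_node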
An optically-clear bonding layer 204 may be used to bind the protective substrate 202 to the touch circuitry 216, which generates signals responsive to a user interaction with the protective substrate 202. The bonding layer 204 can include an acrylic-based or silicone-based adhesive, as well as one or more layers of indium-tin-oxide (ITO). Moreover, the bonding layer 204 is preferably substantially or entirely transparent (e.g., greater than 99% light transmission) and may display good adhesion to a variety of substrates, including glass, polyethylene terephthalate (PET), polycarbonate (PC), polymethyl methacrylate (PMMA), etc.
A display layer 212 is configured to display content with which the user may be able to interact. The display layer 212 could include, for example, a liquid crystal display (LCD) panel and a backlight assembly (e.g., a diffuser and a backlight) that is able to illuminate the LCD panel. Other display technologies could also be used, such as light emitting diodes (LEDs), organic light emitting diodes (OLEDs), electrophoretic/electronic ink (“e-ink”), etc. Air gaps may be present between or within some of these layers. For example, an air gap may be present between the diffuser and the backlight in the backlight assembly.
Similar to the display assembly 200 described above, the display assembly 400 can include several layers, including a protective substrate 402, an optically-clear bonding layer 404, touch circuitry 416, and a display layer 412.
The protective substrate 402 enables a user to interact with the display assembly 400 (e.g., by making contact with an outer surface using a finger 424). The protective substrate 402 is preferably substantially or entirely transparent and can be composed of glass, plastic, or any other suitable material (e.g., crystallized aluminum oxide).
The touch circuitry 416 creates a coordinate grid for the display assembly 400 that may be used by a processor on a PCBA 418 to determine the intent of a user interaction with the protective substrate 402. In some embodiments, driving lines 406 and/or sensing lines 408 are mounted to or embedded within a transparent substrate 410, such as glass or plastic. In other embodiments, the touch circuitry 416 is connected to touch-sensing elements (e.g., capacitors) that are disposed between display elements (e.g., liquid crystals) in an integrated display panel that supports touch functionality. One skilled in the art will recognize that “touch circuitry” can be used to refer to different techniques/technologies for registering and analyzing touch events.
An optically-clear bonding layer 404 may be used to bind the protective substrate 402 to the touch circuitry 416. The bonding layer 404 can include an acrylic-based or silicone-based adhesive, as well as one or more layers of ITO. Moreover, the bonding layer 404 is preferably substantially or entirely transparent (e.g., greater than 99% light transmission) and may display good adhesion to a variety of substrates, including glass, polyethylene terephthalate (PET), polycarbonate (PC), polymethyl methacrylate (PMMA), etc.
A display layer 412 is configured to display content with which the user may be able to interact. The display layer 412 could include, for example, a liquid crystal display (LCD) panel and a backlight assembly (e.g., a diffuser and a backlight) that is able to illuminate the LCD panel. However, as noted above, other display technologies could also be used, such as light emitting diodes (LEDs), organic light emitting diodes (OLEDs), electrophoretic/electronic ink (“e-ink”), etc.
An array of piezoelectric actuators 414 can be disposed beneath at least a portion of the display assembly 400. In some embodiments, the piezoelectric actuators are integrated into the display assembly 400 (e.g., within an optically-clear substrate), while in other embodiments the piezoelectric actuators are affixed to the inner side of the display layer 412 (or some other layer in the display assembly 400). The piezoelectric actuators may be microceramic transducers that perform a haptic event (e.g., a tap or vibration) in response to having a voltage applied by a power source 422.
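As a rough illustration of the drive signal such a transducer might receive, the sketch below generates a sinusoidal voltage burst; the frequency, amplitude, and sample rate are assumptions for illustration only, not values taken from this disclosure:

    # Sketch of a sinusoidal drive waveform for one piezoelectric actuator.
    # All waveform parameters are illustrative assumptions.
    import math

    def drive_waveform(amplitude_v: float, freq_hz: float,
                       duration_s: float, sample_rate_hz: int = 48000):
        """Yield voltage samples for a sine burst of the given duration."""
        n = int(duration_s * sample_rate_hz)
        for i in range(n):
            yield amplitude_v * math.sin(2 * math.pi * freq_hz * i / sample_rate_hz)

    # Example: a 30 ms "tap" burst at 200 Hz and 40 V peak.
    samples = list(drive_waveform(40.0, 200.0, 0.030))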
Although the array of piezoelectric actuators 414 is depicted as a grid in which each piezoelectric actuator is connected to its neighbors, other arrangements are also possible. For example, each piezoelectric actuator could be electrically coupled to the power source 422 and/or the haptic processor 420 so that the piezoelectric actuators are independently controllable. Such a configuration may also allow the piezoelectric actuators to simultaneously perform different haptic events. For example, the piezoelectric actuators near the bottom of the display assembly 400 could periodically vibrate, while the piezoelectric actuators near the top of the display assembly 400 could vibrate continuously. Similarly, each piezoelectric actuator in the array could vibrate at different intensities based on the distance between the corresponding piezoelectric actuator and the most recent touch event performed by a user.
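One way to realize the distance-based behavior described above is sketched below; the grid coordinates and exponential falloff constant are illustrative assumptions rather than part of this disclosure:

    # Sketch: scale each actuator's vibration intensity by its distance
    # from the most recent touch event. Layout and falloff are assumptions.
    import math

    def intensity_map(touch_xy, actuator_positions, falloff_mm=25.0):
        """Return a 0..1 intensity for each actuator, decaying with
        distance from the touch point."""
        intensities = []
        for (ax, ay) in actuator_positions:
            d = math.hypot(ax - touch_xy[0], ay - touch_xy[1])
            intensities.append(math.exp(-d / falloff_mm))
        return intensities

    # Example: a 2 x 2 array with a touch near the first actuator.
    print(intensity_map((5.0, 5.0), [(0, 0), (50, 0), (0, 50), (50, 50)]))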
The array of piezoelectric actuators 414 can also perform haptic events based on the digital content being shown by the display layer 412 at a given point in time. For example, some piezoelectric actuators may vibrate and others may remain still when the user interacts with an application by touching the protective substrate 402. The array of piezoelectric actuators 414 could also create a false sense of location by inducing certain piezoelectric actuators to perform haptic events.
In some embodiments, the array of piezoelectric actuators 602 is as wide and tall as the display itself. However, the array of piezoelectric actuators 602 need not always encompass the entire display. For example, the array of piezoelectric actuators may only extend across a subset of the display (e.g., only Zone #3), and the remainder of the display may be completely devoid of any piezoelectric actuators. The subset of the display may represent an area that is subject to frequent user interactions or an area that is expected to provide haptic feedback. For example, the array of piezoelectric actuators 602 may be positioned so that a piezoelectric actuator is aligned with each key of a keyboard shown on the display. As another example, piezoelectric actuators may be arranged around the outer edge of the display where a user is likely to grip the user device 600.
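The keyboard example can be made concrete with the following sketch, in which each key rectangle and actuator index is a hypothetical placeholder:

    # Sketch: align one actuator with each key of an on-screen keyboard.
    # Key rectangles and actuator indices are illustrative assumptions.

    KEY_RECTS = {          # key -> (x0, y0, x1, y1) in display coordinates
        "Q": (0, 0, 30, 40),
        "W": (30, 0, 60, 40),
        "E": (60, 0, 90, 40),
    }
    KEY_TO_ACTUATOR = {"Q": 0, "W": 1, "E": 2}

    def actuator_for_touch(x: float, y: float):
        """Return the actuator index under the touched key, if any."""
        for key, (x0, y0, x1, y1) in KEY_RECTS.items():
            if x0 <= x < x1 and y0 <= y < y1:
                return KEY_TO_ACTUATOR[key]
        return None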
The piezoelectric actuators could be microceramic transducers that perform haptic events in response to receiving a voltage. For example, the piezoelectric actuators could comprise a synthetic piezoceramic material (e.g., barium titanate, lead zirconate titanate (PZT), or potassium niobate) or a lead-free piezoceramic material (e.g., sodium potassium niobate or bismuth ferrite). Other materials could also be used, such as quartz or carbon nanotubes that include piezoelectric fibers.
The touch circuitry can be configured to generate an input signal (also referred to as a “touch event signal”) in response to the user interacting with the display assembly. The input signal can then be transmitted from the touch circuitry to a haptic processor (step 702). One skilled in the art will recognize that the systems and techniques described herein can be implemented based on other types of input (e.g., those provided by input/output devices, such as mice and keyboards) or no input at all (e.g., haptic events may be automatically performed based on content that is to be shown by the user device).
The haptic processor can analyze the input signal to determine an appropriate haptic event to be performed by one or more of the piezoelectric actuators within the array (step 703). More specifically, the haptic processor may analyze the metadata of the input signal, which could specify the strength of the touch event, the location (i.e., coordinates) of the touch event, and other contextual information (e.g., a timestamp or a designation of the content that is being shown by the user device).
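A minimal sketch of this analysis step follows; the metadata field names and force thresholds are assumptions for illustration, as the disclosure leaves the exact mapping to the implementer:

    # Sketch: select a haptic event from touch-event metadata.
    from dataclasses import dataclass

    @dataclass
    class TouchEvent:
        strength: float        # normalized press force, 0..1 (assumed scale)
        x: float               # touch coordinates on the display
        y: float
        timestamp_ms: int      # contextual information
        content_id: str        # designation of the content being shown

    def choose_haptic_event(event: TouchEvent) -> str:
        """Map a touch event to a haptic event type (thresholds assumed)."""
        if event.strength > 0.8:
            return "strong_tap"     # firm press -> pronounced tap
        if event.strength > 0.3:
            return "light_tap"
        return "none"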
In some embodiments, the haptic processor determines the appropriate haptic event by reviewing instructions that are to be executed by the user device to identify an application programming interface (API) call to a certain haptic event. For example, the instructions/code for an application being executed by the user device may include tags for certain haptic events (e.g., perform haptic event of type A when the user interacts with point B at time C). These calls to certain haptic events may be inserted by a developer when the application is being developed or could be added later on (e.g., as a patch/update). In some embodiments, the developer must choose from a predetermined set of haptic events that can be performed by the array of piezoelectric actuators. In other embodiments, the developer of the content is able to create unique haptic events that can be performed by the array of piezoelectric actuators. Thus, developers may be able to convert content created for conventional user devices so that the content is usable with the user devices described herein. One skilled in the art will recognize that the same techniques can be used for applications, programs, scripts, etc.
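The tagging approach might look like the following sketch; the haptic_tag function and its registry are hypothetical, not an existing API:

    # Sketch of developer-facing haptic "tags": perform an event of type A
    # when the user interacts with point B at time C. Hypothetical API.

    HAPTIC_TAGS = []

    def haptic_tag(event_type: str, point, time_ms: int):
        """Register a haptic event to fire for a touch at `point`
        around `time_ms` into the content."""
        HAPTIC_TAGS.append({"type": event_type, "point": point,
                            "time_ms": time_ms})

    # A developer might annotate an explosion in a game scene:
    haptic_tag("strong_vibration", point=(120, 340), time_ms=4500)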
When the appropriate haptic event has been identified, the haptic processor can generate an output signal that induces a power source to selectively apply a voltage to one or more of the piezoelectric actuators (step 704). The output signal may specify how much voltage is to be applied, how long the voltage is to be applied, which piezoelectric actuator(s) are to receive the voltage, etc.
Application of the voltage causes the piezoelectric actuator(s) to perform the appropriate haptic event (step 705). For example, a voltage could be applied to a single piezoelectric actuator or multiple piezoelectric actuators. Alternatively, different voltages could be applied to multiple piezoelectric actuators (and thereby induce haptic events of different types, strengths, etc.). Therefore, the appropriate haptic event may require that non-adjacent piezoelectric actuators simultaneously perform the same haptic event or different haptic events.
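Steps 704 and 705 can be summarized in a short sketch; the OutputSignal structure and apply_voltage callback are hypothetical stand-ins for the power-source interface, which this disclosure leaves abstract:

    # Sketch of the output signal: which actuators receive what voltage,
    # and for how long. Names and structure are assumptions.
    from dataclasses import dataclass
    from typing import Dict

    @dataclass
    class OutputSignal:
        voltages: Dict[int, float]   # actuator index -> volts to apply
        duration_ms: int             # how long the voltage is applied

    def perform_haptic_event(signal: OutputSignal, apply_voltage):
        """Drive each targeted actuator; untargeted actuators stay idle."""
        for actuator, volts in signal.voltages.items():
            apply_voltage(actuator, volts, signal.duration_ms)

    # Non-adjacent actuators can perform different events simultaneously,
    # e.g., a strong tap at actuator 0 and a weak one at actuator 7:
    sig = OutputSignal(voltages={0: 35.0, 7: 12.0}, duration_ms=50)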
The manufacturer can then select at least one region of the display assembly that will be able to provide localized haptic feedback (step 802). The region can be a subset of the display assembly or the entirety of the display assembly.
The array of piezoelectric actuators is then communicatively coupled to the haptic processor (step 804) and electrically coupled to a power source (step 805). The power source could be, for example, a rechargeable lithium-ion (Li-Ion) battery, a rechargeable nickel-metal hydride (NiMH) battery, a rechargeable nickel-cadmium (NiCad) battery, or any other power source suitable for an electronic user device. Other types of power sources may also be used. For example, some user devices may be designed with the intention that they remain electrically coupled to a power source (e.g., an outlet) during use and therefore do not require batteries at all. The haptic processor induces haptic events by controlling how the power source applies voltages to the array of piezoelectric actuators. Thus, the arrangement and coupling of the components enables the haptic processor to produce localized haptic feedback by selectively causing voltages to be applied to one or more piezoelectric actuators within the array (step 806).
Unless contrary to physical possibility, it is envisioned that the steps described above may be performed in various sequences and combinations. For instance, the array of piezoelectric actuators could be integrated within the display assembly itself (and thus may not need to be affixed to the display assembly as described in step 803). Additional steps could also be included in some embodiments. For example, the haptic processor and/or power source could also be coupled to other components of the user device. As another example, the user device may be configured to invoke and execute an application that allows a user to manually modify whether the array of piezoelectric actuators will perform haptic events, which haptic event(s) may be performed, the strength or duration of the haptic events to be performed, etc.
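Such user-adjustable settings might be modeled as in the sketch below; the structure and default values are assumptions for illustration:

    # Sketch of user-adjustable haptic settings. Fields and defaults
    # are illustrative assumptions, not part of this disclosure.
    from dataclasses import dataclass, field

    @dataclass
    class HapticSettings:
        enabled: bool = True                  # whether haptic events fire at all
        allowed_events: set = field(default_factory=lambda: {"tap", "vibration"})
        strength_scale: float = 1.0           # global intensity multiplier, 0..1
        max_duration_ms: int = 500            # cap on any single haptic event

    settings = HapticSettings()
    settings.strength_scale = 0.5             # user halves the intensity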
In various embodiments, the processing system 900 operates as part of a user device (e.g., the user device 600 described above).
The processing system 900 may be a server computer, a client computer, a personal computer (PC), a tablet PC, a laptop computer, a personal digital assistant (PDA), a mobile telephone, an iPhone®, an iPad®, a Blackberry®, a processor, a telephone, a web appliance, a network router, switch, or bridge, a console, a hand-held console, a gaming device, a music player, any portable device, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by the processing system.
While the main memory 906, non-volatile memory 910, and storage medium 926 (also called a “machine-readable medium”) are shown to be a single medium, the terms “machine-readable medium” and “storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store one or more sets of instructions 928. The terms “machine-readable medium” and “storage medium” shall also be taken to include any medium that is capable of storing, encoding, or carrying a set of instructions for execution by the computing system and that cause the computing system to perform any one or more of the methodologies of the presently disclosed embodiments.
In general, the routines executed to implement the embodiments of the disclosure may be implemented as part of an operating system or a specific application, component, program, object, module, or sequence of instructions referred to as “computer programs.” The computer programs typically comprise one or more instructions (e.g., instructions 904, 908, 928) set at various times in various memory and storage devices in a computer that, when read and executed by one or more processing units or processors 902, cause the processing system 900 to perform operations to execute elements involving the various aspects of the disclosure.
Moreover, while embodiments have been described in the context of fully functioning computers and computer systems, those skilled in the art will appreciate that the various embodiments are capable of being distributed as a program product in a variety of forms, and that the disclosure applies equally regardless of the particular type of machine or computer-readable media used to actually effect the distribution.
Further examples of machine-readable storage media, machine-readable media, or computer-readable (storage) media include, but are not limited to, recordable-type media such as volatile and non-volatile memory devices 910, floppy and other removable disks, hard disk drives, optical disks (e.g., Compact Disk Read-Only Memory (CD-ROMs) and Digital Versatile Disks (DVDs)), and transmission-type media, such as digital and analog communication links.
The network adapter 912 enables the processing system 900 to mediate data in a network 914 with an entity that is external to the processing system 900 through any known and/or convenient communications protocol supported by the processing system 900 and the external entity. The network adapter 912 can include one or more of a network adapter card, a wireless network interface card, a router, an access point, a wireless router, a switch, a multilayer switch, a protocol converter, a gateway, a bridge, a bridge router, a hub, a digital media receiver, and/or a repeater.
The network adapter 912 can include a firewall which can, in some embodiments, govern and/or manage permission to access/proxy data in a computer network, and track varying levels of trust between different machines and/or applications. The firewall can be any number of modules having any combination of hardware and/or software components able to enforce a predetermined set of access rights between a particular set of machines and applications, machines and machines, and/or applications and applications, for example, to regulate the flow of traffic and resource sharing between these varying entities. The firewall may additionally manage and/or have access to an access control list which details permissions including for example, the access and operation rights of an object by an individual, a machine, and/or an application, and the circumstances under which the permission rights stand.
As indicated above, the techniques introduced here can be implemented by, for example, programmable circuitry (e.g., one or more microprocessors) programmed with software and/or firmware, entirely in special-purpose hardwired (i.e., non-programmable) circuitry, or in a combination of such forms. Special-purpose circuitry can be in the form of, for example, one or more application-specific integrated circuits (ASICs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), etc.
The foregoing description of various embodiments has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the claimed subject matter to the precise forms disclosed. Many modifications and variations will be apparent to one skilled in the art. Embodiments were chosen and described in order to best describe the principles of the invention and its practical applications, thereby enabling others skilled in the relevant art to understand the claimed subject matter, the various embodiments, and the various modifications that are suited to the particular uses contemplated.
Although the above Detailed Description describes certain embodiments and the best mode contemplated, no matter how detailed the above appears in text, the embodiments can be practiced in many ways. Details of the systems and methods may vary considerably in their implementation details, while still being encompassed by the specification. As noted above, particular terminology used when describing certain features or aspects of various embodiments should not be taken to imply that the terminology is being redefined herein to be restricted to any specific characteristics, features, or aspects of the invention with which that terminology is associated. In general, the terms used in the following claims should not be construed to limit the invention to the specific embodiments disclosed in the specification, unless those terms are explicitly defined herein. Accordingly, the actual scope of the invention encompasses not only the disclosed embodiments, but also all equivalent ways of practicing or implementing the embodiments under the claims.
The language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this Detailed Description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of various embodiments is intended to be illustrative, but not limiting, of the scope of the embodiments, which is set forth in the following claims.
This application claims priority to and the benefit of U.S. Provisional Application No. 62/300,631, entitled “MOBILE DEVICES AND MOBILE DEVICE ACCESSORIES” (Attorney Docket No. 119306-8020.US00), filed on Feb. 26, 2016, and U.S. Provisional Application No. 62/318,137, entitled “LOCALIZED HAPTIC FEEDBACK BY ELECTRONIC DEVICES” (Attorney Docket No. 119306-8016.US00), filed on Apr. 4, 2016, each of which is incorporated herein by reference in its entirety.