The described embodiments generally relate to electronic devices, and, more particularly, to an electronic device having a light-transmissive cover and a display adjacent to or partially surrounding a keyboard of the device.
Many electronic devices include one or more input devices that enable a user to interact with the device. Example input devices include keyboards, trackpads, and mice. Laptop computers, for example, may include a base with a keyboard and a trackpad, and a display that is hinged to the base. The keyboard may accept typing inputs via a plurality of keys. The trackpad may accept touch inputs and may be used to move a cursor on the display and to select graphical or other user interface elements displayed on the display. Some traditional trackpads are limited to a trackpad area that is generally located between the user's hands when the hands are in a typing position. Most traditional trackpads are also visually static in appearance and tactile feel. The systems and techniques described herein are directed to an electronic device having a touch-sensitive cover and a display that produces a dynamic graphical output and may extend over the palm rest region of a keyboard.
A laptop computer includes a display portion comprising a primary display and a base portion coupled to the display portion. The base portion includes a keyboard and a light-transmissive cover defining a first touch-sensitive input region along a first side of the keyboard and a second touch-sensitive input region along a second side of the keyboard. The base portion also includes a first display under the first touch-sensitive input region and a second display under the second touch-sensitive input region. The light-transmissive cover may further define a third touch-sensitive input region along a third side of the keyboard and the base portion may further include a third display under the third touch-sensitive input region. The first touch-sensitive input region may correspond to a palm rest region that is below the keyboard, the second side of the keyboard may correspond to a left side of the keyboard, the third side of the keyboard may correspond to a right side of the keyboard, and the first display may have an equal or higher resolution than the second display and the third display.
The base portion may include a housing component and the light-transmissive cover may be a monolithic glass member that is attached to the housing component. The first touch-sensitive input region may have a first roughness and the second and third touch-sensitive input regions may have a second roughness that is smoother than the first roughness.
The light-transmissive cover may define a continuous input surface extending over the first touch-sensitive input region and the second touch-sensitive input region, and the light-transmissive cover may define a key web of the keyboard. The laptop computer may further include a light source configured to illuminate the key web. The light source may be configured to, in a first mode of operation, illuminate a first portion of the key web with a first illumination pattern and, in a second mode of operation, illuminate a second portion of the key web with a second illumination pattern. The first portion of the key web may overlap the second portion of the key web.
A laptop computer may include a display portion comprising a primary display configured to display a graphical user interface and a base portion coupled to the display portion. The base portion may include a light-transmissive cover defining a keyboard opening and a touch-sensitive input region adjacent at least one side of the keyboard opening and including a first region having a first surface texture and a second region that at least partially surrounds the first region and has a second surface texture that is different than the first surface texture. The base portion may also include an additional display below the touch-sensitive input region.
In a first mode of operation, touch inputs applied within the first region control a cursor displayed by the graphical user interface and touch inputs applied within the second region and outside of the first region do not control the cursor displayed by the graphical user interface. In a second mode of operation, the additional display displays a graphical output that at least partially surrounds the first region and defines an expanded input region that includes the first region and at least a portion of the second region, and touch inputs applied within the expanded input region control the cursor displayed by the graphical user interface.
The first surface texture of the first region may have a first roughness and the second surface texture of the second region may have a second roughness that is smoother than the first roughness. The graphical output may resemble a visual appearance of the first surface texture. The laptop computer may further include a haptic actuator coupled to the light-transmissive cover, and, in the second mode of operation, the haptic actuator may produce a haptic output via the light-transmissive cover in response to detecting a touch input that is inside of the expanded input region and outside of the first region. The haptic output may be configured to reduce a tactile difference between the first region and the second region when a finger is slid across the second region. The touch input may include an applied force, and the haptic output may be produced in response to determining that the applied force exceeds a threshold force. The haptic output may include a tactile impulse.
A laptop computer may include a display portion comprising a primary display configured to display a graphical user interface associated with an application program and a base portion pivotally coupled to the display portion. The base portion may include a keyboard, a light-transmissive cover defining a first input region along a first side of the keyboard and a second input region along a second side of the keyboard, a touch sensing system configured to detect touch inputs applied to the first input region and the second input region, and an additional display positioned under the light-transmissive cover. The additional display may be configured to display, along the first input region, a first graphical output associated with the application program, and display, along the second input region, a second graphical output associated with the application program. The first side of the keyboard may be a lateral side of the keyboard and the second side of the keyboard may be a palm rest region below the keyboard.
The additional display may be configured to display the first and second graphical outputs when the application program is active and, when a different application program is active, display a third graphical output instead of the first graphical output and display a fourth graphical output instead of the second graphical output. The first graphical output may define a first selectable control that controls a first operation of the application program and the third graphical output may define a second selectable control that controls a second operation of the different application program.
The graphical user interface of the application program may include a first portion of an image, the first graphical output may correspond to a selectable control that is configured to control an operation of the application program, and the second graphical output may include a second portion of the image. The second portion of the image may be all of the image, the second graphical output further may include a movable preview window, and the first portion of the image may correspond to a segment of the image within the movable preview window.
The disclosure will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements, and in which:
Reference will now be made in detail to representative embodiments illustrated in the accompanying drawings. It should be understood that the following descriptions are not intended to limit the embodiments to one preferred embodiment. To the contrary, it is intended to cover alternatives, modifications, and equivalents as can be included within the spirit and scope of the described embodiments as defined by the appended claims.
The embodiments described herein are generally directed to notebook or laptop computers with a dynamic display interface next to or surrounding the keyboard. For example, a laptop computer, which may be referred to as a portable computer or a notebook computer, may include a display portion that is hinged to a base portion that includes a keyboard. The dynamic display interface may include one or more additional displays positioned in the base portion of a laptop computer next to the keyboard. The dynamic display interface may also include touch sensing components and/or force sensing components, which may be aligned with or otherwise associated with the additional display(s) to enable the detection of touch and/or force inputs applied to the dynamic display interface. The dynamic display interface may span all or most of a palm rest region below a keyboard of a laptop computer (e.g., where a user's palms may sit when the user is typing). In some cases, the dynamic display interface may extend around one or more additional sides of the keyboard, and in some cases may completely surround the keyboard. The dynamic display interface may also use haptic outputs to provide tactile feedback to a user.
Providing a dynamic display interface next to or surrounding the keyboard of a laptop computer may enable various new and unique ways to interact with a laptop computer. For example, conventional laptops often have a dedicated trackpad of a fixed size positioned in a palm rest region of the computer. In contrast to some conventional laptops, the dynamic display interface described herein may include a touch-sensitive display that spans all or most of the palm rest region of a laptop computer (and optionally extends along two, three, or all four sides of a keyboard). The displays of the dynamic display interface may be used to dynamically produce differently sized and/or shaped trackpad-style input regions with which a user can control application programs or other aspects of the laptop. Notably, because the dynamic display interface uses dynamic displays, the particular size, shape, or other characteristic of a trackpad region may change depending on the mode of the laptop. For example, the location, shape, and/or size of a trackpad region may be modified based on the particular application that is being executed by the laptop. In one scenario, if a word processing application is active (e.g., being displayed on the primary display of the laptop), the trackpad region may be smaller or may be located in a different portion of the palm rest than if a gaming application is active. This may allow the user interface experience to be tailored for different applications or functions of the laptop computer. As used herein, a trackpad region may refer to an area on a palm rest region of a laptop computer that accepts touch and/or force inputs to control a cursor that is displayed on the primary display of a device. Touch and/or force inputs applied to a trackpad region may control other functions and/or operations of the device as well.
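By way of a non-limiting illustration, the following Swift sketch shows one way this kind of per-application trackpad sizing could be expressed in software. The type names, application categories, and dimensions are hypothetical and are used only to illustrate the concept, not to describe any particular implementation.

```swift
// Illustrative sketch: choosing a trackpad region's size and position from the
// active application. All type names, cases, and dimensions are hypothetical.
struct Rect {
    var x: Double, y: Double, width: Double, height: Double
}

enum ActiveApplication {
    case wordProcessor
    case game
    case photoEditor
}

// Returns the trackpad region (in palm-rest coordinates) to activate for the
// given application.
func trackpadRegion(for app: ActiveApplication, palmRest: Rect) -> Rect {
    switch app {
    case .wordProcessor:
        // A smaller, centered region reduces accidental palm contact while typing.
        return Rect(x: palmRest.x + palmRest.width * 0.3,
                    y: palmRest.y,
                    width: palmRest.width * 0.4,
                    height: palmRest.height * 0.8)
    case .game, .photoEditor:
        // An expanded region spans the full palm rest for large gestures.
        return palmRest
    }
}
```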
In addition to using a graphical output on the dynamic display interface to dynamically define the boundaries of a trackpad region, a dynamic display interface may use tactile cues to help define the boundaries of a trackpad region. For example, a haptic output delivered along the surface of the dynamic display interface may be used to help a user to tactilely perceive the boundaries of the trackpad region. More particularly, the device may produce a haptic output when the user's finger is within the boundary and not produce the haptic output when the user's finger is outside of the boundary.
In addition to being able to dynamically display different trackpad regions, the dynamic display interface may display graphical outputs, such as icons, buttons, sliders, or keys, that can be selected and/or manipulated to control the laptop. Moreover, the particular graphical outputs that are displayed are dynamic and can be changed based on the current state of the laptop and/or the particular application that is being executed. For example, if a photo editing application is being executed, the graphical outputs may be selectable icons representing photo editing controls or tools, and if a word processing application is being executed, the graphical outputs may be selectable icons representing text formatting tools. In this way, selectable controls may be displayed on the dynamic display interface in the base portion, rather than on the primary display of the laptop, thus freeing up space on the primary display for other graphical outputs. As used herein, a primary display refers to a display that provides a primary means of conveying visual information to the user, such as by displaying graphical user interfaces of application programs, operating systems, and the like. The primary display may be part of a hinged top portion of a laptop, such as is found in conventional laptop computers.
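As a non-limiting illustration, the Swift sketch below shows one possible mapping from an active application identifier to the selectable controls presented via the dynamic display interface. The control names and application identifiers are hypothetical and are not drawn from any real application or API.

```swift
// Illustrative sketch: selecting which controls the additional display presents
// based on the active application. Identifiers and control names are made up.
enum DynamicControl: String {
    case cropTool, exposureSlider, rotateTool        // photo-editing tools
    case boldToggle, italicToggle, fontSizeStepper   // text-formatting tools
}

func controls(forApplication identifier: String) -> [DynamicControl] {
    switch identifier {
    case "example.photoeditor":
        return [.cropTool, .exposureSlider, .rotateTool]
    case "example.wordprocessor":
        return [.boldToggle, .italicToggle, .fontSizeStepper]
    default:
        return []   // leave the region free for other graphical output or trackpad input
    }
}
```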
The dynamic display interface may define or provide touch-sensitive input regions located at any suitable location on the base portion. For example, as noted above, the dynamic display interface may include a display and associated touch (and/or force) sensing system positioned in a palm rest region of the laptop (e.g., below the keyboard). The dynamic display interface may also include displays and associated touch sensing systems and/or force sensing systems (which may be referred to as touch- and/or force-sensing systems for simplicity) along one or both lateral sides of the keyboard (e.g., on the left and right sides of the keyboard), and/or above the keyboard (e.g., above or in place of a function row).
In some cases, different touch-sensitive input regions around the keyboard may be associated with particular types of selectable content. The physically different locations may help indicate to a user what type of function or control may be available at that location. For example, a touch-sensitive input region that is along a left lateral side of the keyboard may be dedicated to presenting a scroll bar (or other interface). Thus, if an active application includes a graphical user interface with graphical content that can be scrolled (e.g., a document, webpage, image, etc.), the user can expect that a scroll control object will be available on the left lateral side of the keyboard regardless of what the particular application is. Similarly, the right lateral side of the keyboard may be dedicated to selectable icons that control a function of an active application. Thus, if an active application includes selectable controls, the user can expect that the controls will be available on the right lateral side of the keyboard, regardless of what the particular application is. In this way, the user experience may be consistent across multiple applications and/or graphical user interfaces that may be shown on the primary display of a laptop, as the user can expect a certain type or category of control (if relevant to the active application) to be available at a consistent location regardless of the particular application that is active. In some cases, the locations of the user interface elements may be customized by a user. For example, a user may prefer a scroll control object to be located on the right side of the keyboard. The device may allow the user to customize whether the scroll control object is located on the right side of the keyboard for all applications that use a scroll bar, or even select a subset of applications for which the scroll bar should be on the right side of the keyboard, and another subset for which the scroll bar should be on the left side of the keyboard. These and other user-selected customizations are possible for any of the graphical outputs, user interface objects, or other affordances described herein.
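A brief sketch of one way such a user customization could be represented follows; the preference structure, application identifiers, and default side are assumptions made only for illustration.

```swift
// Illustrative sketch: keeping a scroll control at a consistent keyboard side
// while honoring a per-application user override. All names are hypothetical.
enum KeyboardSide { case left, right, below, above }

struct ScrollControlPreferences {
    var defaultSide: KeyboardSide = .left
    var perApplicationOverrides: [String: KeyboardSide] = [:]

    // Returns the side of the keyboard where the scroll control should appear
    // for the given application identifier.
    func side(forApplication identifier: String) -> KeyboardSide {
        return perApplicationOverrides[identifier] ?? defaultSide
    }
}

// Usage: the user prefers the scroll control on the right for one application.
var prefs = ScrollControlPreferences()
prefs.perApplicationOverrides["example.drawingapp"] = .right
let side = prefs.side(forApplication: "example.drawingapp")   // .right
```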
The device 100 may be or may resemble a laptop computer, also known as a notebook or portable computer, that has a display portion 102 and a base portion 104 flexibly or pivotally coupled to the display portion 102 (e.g., so that the display portion 102 is able to rotate, pivot, flex, articulate, or otherwise move relative to the base portion 104). The display portion 102 includes a display 103, also referred to as a primary display, that provides a primary means of conveying visual information to the user, such as by displaying graphical user interfaces of application programs, operating systems, and the like. The display 103 may have or be characterized by a display resolution, which may be expressed as the number of pixels in a given area (e.g., pixels per inch). The display 103 may include various display components, such as liquid crystal display (LCD) components, light source(s) (e.g., light emitting diodes (LEDs), organic LEDs (OLEDs)), filter layers, polarizers, light diffusers, covers (e.g., glass or plastic cover sheets), and the like.
The display portion 102 and the base portion 104 may be coupled to one another such that they can be positioned in and moved between an open position and a closed position. In the open position, a user may be able to provide inputs to the device 100 via the base portion 104 (e.g., via a keyboard, trackpad region, etc.) while simultaneously viewing information on the display portion 102. In the closed position, the display portion 102 and the base portion 104 are collapsed against one another. More particularly, the display portion 102 and the base portion 104 may be hinged together (e.g., via a pivot or hinge mechanism 105) to form a clamshell device that can be moved between an open and a closed configuration.
The base portion 104 includes a keyboard 114. The keyboard 114 may include a plurality of individual keys that accept typing inputs. The keys may be movable, electromechanical keys in which, when a key is struck or otherwise actuated by a user, a keycap or other actuation surface moves and causes an electrical switch (e.g., a tactile dome switch) to open or close, thereby signaling that the key has been actuated. Other types of keys and/or switch technologies may also be used, including but not limited to movable keys that use capacitive sensors, optical sensors, resistive sensors, reed switches, or any other suitable sensors or switches. In some cases, the keyboard may not have movable keys, but instead may have a substantially solid single surface (e.g., glass, metal, plastic, ceramic, etc.) that defines multiple key regions and is associated with touch- and/or force-sensing systems that detect key inputs applied to the surface. The continuous surface of such keyboards may be part of a single cover that also defines the touch-sensitive display regions that are next to the keyboard, as described herein.
The base portion 104 may include a light-transmissive cover 112. The light-transmissive cover 112 may include a keyboard opening in which a keyboard or a part of a keyboard may be positioned. In some cases, the keyboard opening is a single opening in which a keyboard module that includes multiple keys (e.g., all the keys of a keyboard) is positioned. In such cases, the opening may be a substantially rectangular opening and the keyboard may be positioned at least partially within the single opening.
Except for the keyboard opening, the light-transmissive cover 112 may define a continuous portion of the top surface of the base portion 104 (which may be the top exterior surface of the base portion 104). The exterior surface of the light-transmissive cover 112 may define the input surface of the dynamic display interface as described herein. For example, graphical outputs may be viewed through the light-transmissive cover 112, and touch inputs may be applied to the exterior surface of the light-transmissive cover 112. Accordingly, the light-transmissive cover 112 may define both a structural component of the enclosure of the device and part of an active input/output system.
As used herein, a continuous surface may refer to a surface or member that has no seams, openings, through-holes, or other discontinuities. Thus, the light-transmissive cover 112 may lack seams, openings, through-holes, or other discontinuities in the portion of the light-transmissive cover 112 that surrounds the keyboard opening and extends to the edges of the base portion 104. In such cases, the keyboard opening (e.g., the single keyboard opening 223 in
In some cases the keyboard may be a virtual keyboard or otherwise lack moving, mechanical keys, and the light-transmissive cover 112 may define the input surface of the keyboard. In such cases, the light-transmissive cover 112 may lack openings for keys and may thus define a continuous surface over the entire top of the base portion 104.
The light-transmissive cover 112 may be formed from or include a light-transmissive material, such as glass, plastic, or light-transmissive ceramics. As used herein, light-transmissive may be used to refer to something that is transparent or translucent, or otherwise allows light to propagate therethrough. In some cases, transparent materials or components may introduce some diffusion, lensing effects, distortions, or the like (e.g., due to surface textures) while still allowing objects or images to be seen through the materials or components, and such deviations are understood to be within the scope of the meaning of transparent. Also, materials that are transparent may be coated, painted, or otherwise treated to produce a non-transparent (e.g., opaque) component; in such cases the material may still be referred to as transparent, even though the material may be part of an opaque component. Translucent components may be formed by producing a textured or frosted surface on an otherwise transparent material (e.g., clear glass). Translucent materials may also be used, such as translucent polymers, translucent ceramics, or the like. As used herein, a cover that is formed from or includes a light-transmissive material and allows graphical outputs to be viewed through the cover may be considered a light-transmissive cover. Further, any cover or portion of a cover that overlies an image-forming part of a display may be understood to be light-transmissive, even if that cover or portion of the cover is not explicitly stated to be light-transmissive.
In some cases, the light-transmissive cover 112, which may be referred to simply as a cover 112 for simplicity, is a single member, such as a single monolithic glass member, a single monolithic plastic member, or a single monolithic member formed from or including any other suitable material. In other cases, the cover 112 may be formed from multiple members, either of the same material or different materials, that are bonded, adhered, joined, or otherwise coupled together to define the cover 112. The cover 112 may be light-transmissive to allow one or more displays that are positioned under the cover 112 to be visible through the cover 112, as described in more detail herein. The cover 112 may be attached to a bottom housing component 110 to define substantially all of the enclosure of the base portion 104. The cover 112 may be attached to the bottom housing component 110 via adhesive, fasteners, mechanical interlocks, or other suitable techniques.
In some cases, some portions of the cover 112 may be masked to form opaque regions. The masking may be formed using any suitable technique such as depositing an ink, dye, film, or otherwise positioning an opaque material below the cover 112 (and above any other components or layers that are intended to remain hidden or occluded). The opaque regions may be used to cover or hide internal components of the base portion 104 that may detract from the aesthetic appearance of the device or are otherwise unnecessary to see, such as structural components, fasteners, non-image forming portions of displays, electrical components, adhesives, and the like.
The cover 112 may also define touch-sensitive input regions 109 along one or more sides of the keyboard 114. The touch-sensitive input regions 109, which may be part of the dynamic display interface, may also be associated with and/or at least partially defined by underlying displays that display graphical outputs within the touch-sensitive input regions 109, as well as touch- and/or force-sensing systems that detect touch and/or force inputs applied to the cover 112. Moreover, because the cover 112 is light-transmissive, graphical outputs from the displays may be visible through the cover 112. In this way, and as described herein, the touch-sensitive input regions 109 may function as supplementary user interfaces that not only accept inputs (e.g., touch inputs, gestures, etc.), but also display dynamic graphical content that can change based on the mode of operation of the device. As described herein, the mode of operation of a device may correspond to the application program that is active on the device 100.
The touch-sensitive input regions 109 (which may refer collectively to the touch-sensitive input regions 109-1-109-4) may be located at any suitable position along the cover 112, and may have any suitable size, shape, or other property. As shown in
The trackpad region 111 may be distinguished from surrounding areas using physical and/or permanent markings on the cover 112 (e.g., using paint, etching, masking, texturing, etc.). In other cases, the trackpad region 111 may be distinguished using a graphical output from a display underlying the first touch-sensitive input region 109-1, in which case there may be no physical marking or distinguishing feature on the cover 112 itself. Various examples of techniques for dynamically defining trackpad regions using a display under the touch-sensitive input region 109-1 are described herein.
The device 100 may also include second and third touch-sensitive input regions 109-2, 109-3 positioned along second and third sides of the keyboard 114, such as along right and left lateral sides of the keyboard, respectively. The device 100 may also include a fourth touch-sensitive input region 109-4 positioned along a fourth side of the keyboard 114 (e.g., above the keyboard, or between the keyboard and the hinge mechanism). Similar to the first touch-sensitive input region 109-1, displays underlying the second, third, and/or fourth touch-sensitive input regions 109-2-109-4 may display graphical outputs within the touch-sensitive input regions and which a user may select. While
As described herein, each of the touch-sensitive input regions 109 may be configured to display different types of user interface content, which may include images, virtual buttons, virtual dials, virtual keys, icons, or other graphical outputs. Moreover, the type of user interface content that is displayed in a given touch-sensitive input region 109 may be consistent, where feasible, across multiple states of operation of the device 100. For example, the third touch-sensitive input region 109-3 may display a scroll control object when an application being executed by the device 100 includes scrollable content, and the second touch-sensitive input region 109-2 may display selectable graphical outputs for controlling application functions when an application includes selectable functions. In some cases, if the application does not include scrollable content or selectable functions, the second and third touch-sensitive input regions 109-2, 109-3 may be blank, or they may have other types of content. By maintaining consistency between particular touch-sensitive input regions and particular types of content, user confusion can be avoided, as users will become familiar with a particular layout of user interface options that is consistent across multiple applications and/or states of the device.
As shown in
The keyboard is omitted from
The device 100 may also include a key web light source 202. The key web light source 202 may be positioned under the cover 112 and aligned with the key web 201 of the cover 112 to illuminate the key web 201. Because the key web 201 may be light-transmissive, the light source 202 may transmit light through the light-transmissive material of the key web 201. The key web light source 202 may be configured to dynamically illuminate different regions of the key web 201. For example, under some conditions (e.g., based on the particular application program being executed) the key web light source 202 may illuminate a portion of the key web 201 around a first subset of keys, and under other conditions (e.g., when a different application program is being executed), the key web light source 202 may illuminate a different portion of the key web 201 around a different subset of keys. Other types of visual outputs may also be provided by the key web light source 202, as described herein. In this way additional information may be provided to a user via the key web 201.
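As a non-limiting illustration, the following Swift sketch shows one way a device could select which keys' surrounding key web areas to illuminate for a given application. The key and application identifiers are hypothetical.

```swift
// Illustrative sketch: choosing the key web areas to illuminate based on the
// active application. Key and application identifiers are placeholders.
func illuminatedKeyWebKeys(forApplication identifier: String) -> Set<String> {
    switch identifier {
    case "example.videoeditor":
        return ["J", "K", "L"]         // e.g., highlight playback/shuttle keys
    case "example.game":
        return ["W", "A", "S", "D"]    // e.g., highlight movement keys
    default:
        return []                      // no application-specific highlight
    }
}
```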
The key web light source 202 may use any suitable light source and/or display technology. For example, the key web light source 202 may be or may include a pixelated display, such as one or more light emitting diode (LED) displays, organic LED (OLED) displays, liquid crystal displays (LCD), or the like, arranged in the shape as shown in
The device 100 may also include touch sensing components and/or force sensing components 203 (referred to as touch- and/or force-sensing components for simplicity). The touch- and/or force-sensing components are represented in
The touch- and/or force-sensing components 203 may include any suitable components and may use any suitable technique for sensing touch and/or force inputs. For example, the touch sensing components may include an array of capacitive electrodes that may operate using a self-capacitive or mutual-capacitive sensing scheme. A touch sensing system may determine, based on an electrical response from the touch sensing components, a location of one or more touch inputs applied to an input surface (e.g., the cover 112). The touch sensing system may be configured to determine the location of multiple simultaneous touch inputs. The force sensing components may include a strain sensor, capacitive gap sensor, or other force sensitive structure that is configured to produce an electrical response that corresponds to an amount of applied force associated with a touch input. The electrical response may increase continuously as the amount of applied force increases. Accordingly, the force sensing system may determine, based on the electrical response of the force sensing components, one or more properties of the applied force associated with a touch input (e.g., the magnitude of the applied force).
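By way of illustration only, the Swift sketch below outlines one common approach to estimating a touch location from a grid of capacitance readings (a weighted centroid). Actual touch-sensing pipelines also involve baselining, filtering, and multi-touch segmentation; the grid layout and threshold used here are assumptions.

```swift
// Illustrative sketch: estimating a single touch location as the weighted
// centroid of capacitance deltas that exceed a noise threshold.
struct TouchPoint { var x: Double; var y: Double }

// `deltas[row][col]` holds the capacitance change at each electrode.
// Returns nil when no value exceeds the threshold (i.e., no touch present).
func touchCentroid(deltas: [[Double]], threshold: Double) -> TouchPoint? {
    var sum = 0.0, sumX = 0.0, sumY = 0.0
    for (row, line) in deltas.enumerated() {
        for (col, value) in line.enumerated() where value > threshold {
            sum += value
            sumX += value * Double(col)
            sumY += value * Double(row)
        }
    }
    guard sum > 0 else { return nil }
    return TouchPoint(x: sumX / sum, y: sumY / sum)
}
```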
Touch and force sensing functions may be performed by a single system, or by multiple distinct systems. For example, the touch- and/or force-sensing components 203 may include one or more first electrode layers (or other components) that are coupled to dedicated circuitry for touch sensing, and one or more second electrode layers (or other components) that are coupled to separate dedicated circuitry for force sensing. As another example, the touch- and/or force-sensing components 203 may include different layers for touch and force sensing functions, but may be coupled to a single combined processing system. Other configurations and combinations are also possible, including touch- and/or force-sensing components and systems that are not substantially planar layers under the cover 112. In some cases, the touch- and/or force-sensing components 203 span multiple touch-sensitive input regions 109 (
The device 100 may also include a display system 205 under the cover 112. The display system 205 may be configured to display graphical outputs that are visible through the cover 112, and thus provide graphical output functionality to the touch-sensitive input regions, as described herein. Example graphical outputs that may be displayed by the display system 205 are discussed herein and include trackpad region boundaries, selectable application controls, scroll bars, images, photographs, etc.
The display system 205 may include one display or multiple displays. As shown, the display system 205 includes four displays 204-1, 204-2, 204-3, and 204-4 (collectively referred to as displays 204), each positioned under a different touch-sensitive input region of the device 100. For example, with reference to
Providing each touch-sensitive input region 109 with a separate display, as shown in
The displays 204 may be any suitable type of display. For example, the displays 204 may be or may include light emitting diode (LED) displays, organic LED (OLED) displays, liquid crystal displays (LCD), arrays of individual light sources (e.g., individual LEDs), or any other suitable type of display. Moreover, the displays 204 may all be the same type of display, or they may be different from one another. For example, the first display 204-1 may be an OLED display, and the second and third displays 204-2, 204-3 may be LED displays.
The device 100 may also include a haptic actuator 206 within the base portion 104. The haptic actuator 206 is configured to produce a haptic output that may include movement (e.g., vibration or displacement) of the cover 112. The movement caused by the haptic actuator 206 may be perceptible as tactile feedback to the user when the user is in contact with the cover 112 or other portion of the device 100. The haptic output may be produced by any suitable technique, including vibrations, oscillations, impulses (e.g., non-periodic forces or movements), local or global deformations of the cover 112, or any other suitable technique. As described herein, haptic outputs may be used for various output purposes, such as to indicate to a user that a touch and/or force input has been registered (e.g., simulating a click of a mechanical switch). For example, in some cases the haptic actuator 206 produces a haptic output in response to the device 100 determining that an applied force (e.g., a force of a touch input applied to the cover 112) exceeds a threshold force, as measured or detected by a force sensing system. More particularly, when a user presses on the cover 112 to make a selection, similar to clicking a mouse button or a heavy press on a trackpad surface, a force sensing system determines if the applied force exceeds a force threshold, and if so, the haptic actuator 206 may produce a haptic output to provide tactile feedback to the user and optionally to simulate the feel of a mechanical switch. The haptic output produced in response to detecting a force that exceeds the force threshold may be a tactile impulse, a vibration, or any other suitable haptic output. Haptic outputs may also be used to reduce a tactile difference between textured and untextured surfaces of the cover 112, as described herein.
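As a non-limiting illustration, the following Swift sketch shows one way the force-threshold behavior described above could be structured, including hysteresis so that a single press produces a single haptic click. The threshold values are arbitrary placeholders.

```swift
// Illustrative sketch: request a haptic impulse once when the measured force
// rises above a press threshold; re-arm only after it falls below a lower
// release threshold. Force values are placeholders, not real calibrations.
struct ForceClickDetector {
    let pressThreshold: Double = 3.0     // illustrative units
    let releaseThreshold: Double = 1.5   // illustrative units
    private var pressed = false

    // Feed successive force samples; returns true when a haptic click
    // should be produced for this sample.
    mutating func shouldClick(force: Double) -> Bool {
        if !pressed && force >= pressThreshold {
            pressed = true
            return true          // fire the haptic impulse once per press
        }
        if pressed && force <= releaseThreshold {
            pressed = false      // re-arm for the next press
        }
        return false
    }
}
```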
The haptic actuator 206 may be any suitable mechanism and may be coupled to the device 100 in any suitable way. In some cases, it is mounted to a stack that includes the cover 112, the display system 205, and the touch- and/or force-sensing components 203. In other cases, it is mounted to the bottom housing component 110 (or another component within the base portion 104), and forces that produce the haptic output are transmitted to the cover 112 indirectly through the physical structures of the base portion 104.
The bottom housing component 110 may form a bottom and one or more side surfaces of the base portion 104, and may define a cavity in which components of the device 100 are positioned. The bottom housing component 110 may be formed from any suitable material, including metal (e.g., aluminum, magnesium, titanium, metal alloys, etc.), glass, ceramic, plastic, or the like.
The base portion 220 includes a cover 222 that defines a top exterior surface of the base portion 220. The cover 222 may be similar to the cover 112 in all respects, except instead of defining a key web with multiple key openings, as illustrated by the cover 112 in
A keyboard assembly 224 may be positioned at least partially within the keyboard opening 223. The keyboard assembly 224 may include any suitable components. For example, the keyboard assembly 224 may include a substrate, key make sensing components (e.g., dome switches, electrodes, etc.), mechanical support structures (e.g., hinges, scissor mechanisms, guides, etc.), keycaps, fabric covers, sealing membranes, and the like. The keyboard assembly 224 may also include a key web and a light source that may dynamically illuminate different regions of the key web, similar to the key web 201 and light source 202 described with respect to
The base portion 220 may also include other components that are the same as or similar to those shown in
As described above, the displays in the base portion 104 may be configured to display graphical outputs that are visible through the cover 112. Such graphical outputs may be used for various purposes, such as to define selectable controls that control an operation of an application program executed by the device 100 or other functions of the device 100. In the case of the first display 204-1, this display may underlie the palm rest region of the device 100 (e.g., the portion of the cover 112 where a user typically rests his or her hands when typing on the keyboard). This region may also be where conventional trackpads are positioned. Because of the display 204-1, the device 100 may provide enhanced trackpad functionality as compared to conventional static trackpad regions.
While
The first touch-sensitive input region 309-1 may have a first region 316, which may correspond to a trackpad region, that is distinguishable from a second region 318 that at least partially surrounds the first region 316 (and may correspond to the rest of the first touch-sensitive input region 309-1). The first region 316 may be distinguished from the second region 318 in any suitable manner, including tactilely and/or visually. The first region 316 may be visually distinguished by a border (as shown) that defines the first region 316. The border may be formed in any suitable way, such as via paint, dye, etching (e.g., in the top or bottom surface of the cover 312), or any other suitable technique. As another example, the border may be produced by a display under the first touch-sensitive input region 309-1. In some cases, instead of or in addition to a visible border as described above, the first region 316 may be tactilely distinguished from the second region 318. In order to produce the tactile distinction between the first region 316 and the second region 318, the first region 316 may include a surface texture that is different from the surface texture of the second region 318. In some cases, the property that produces a tactile distinction also contributes to the visual distinction. For example, the surface texture in the first region 316 may be visually distinct from an un-textured or smooth surface of the second region 318.
The first region 316 may define a trackpad region that is active when the device 300 is in a first mode of operation. As used herein, a mode of operation may correspond to any suitable state or mode of the device 300. For example, a mode of operation may correspond to a particular application program, or a particular type, class, or category of application program, being active on the device 300. A mode of operation may also correspond to a particular type of user interface being active on the device.
In a first mode of operation, such as when a first application program is active, the device 300 treats the first region 316 as a trackpad region that can control the operation of the device 300 similar to a conventional trackpad, such as by controlling the position and/or movement of a cursor that is displayed on a display of the device 300. For example, in the first mode of operation, touch inputs that are applied to the cover 312 within the first region 316 may control a cursor displayed by a graphical user interface on the primary display of the device 300, while touch inputs applied within the second region 318 and outside of the first region 316 do not control the cursor displayed by the graphical user interface (e.g., touch inputs applied outside the first region 316 may be ignored).
When the device 300 is in a second mode of operation, such as when a second application program is active, the device 300 may activate an expanded trackpad region to provide a larger area (or a smaller or differently shaped area) in which a user may provide trackpad inputs. For example, in the second mode of operation, a display corresponding to the first touch-sensitive input region 309-1 (e.g., the display 204-1,
The device 300 may also respond to touch and/or force inputs differently based on the mode of operation of the device 300. For example, as suggested above, in the first mode of operation, the device 300 may be configured to respond in a particular way to inputs provided within the first region 316 (e.g., treating the inputs as trackpad inputs) while ignoring inputs provided outside of the first region 316. In the second mode of operation, the device may be configured to respond differently than in the first mode of operation. For example, the device 300 may treat inputs inside the first region 316 and inside the expanded input region 320 as trackpad inputs, while ignoring inputs provided outside the expanded input region 320.
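The mode-dependent gating described above might be expressed as in the following Swift sketch, in which only the region associated with the current mode is consulted when deciding whether a touch controls the cursor. The geometry type and mode names are illustrative assumptions.

```swift
// Illustrative sketch: deciding whether a touch controls the cursor depending
// on the current mode of operation and its associated active region.
struct Region {
    var x: Double, y: Double, width: Double, height: Double

    func contains(_ px: Double, _ py: Double) -> Bool {
        px >= x && px <= x + width && py >= y && py <= y + height
    }
}

enum TrackpadMode { case standard, expanded }

func touchControlsCursor(atX px: Double, y py: Double,
                         mode: TrackpadMode,
                         standardRegion: Region,
                         expandedRegion: Region) -> Bool {
    switch mode {
    case .standard: return standardRegion.contains(px, py)   // inputs outside are ignored
    case .expanded: return expandedRegion.contains(px, py)   // larger boundary is active
    }
}
```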
The expanded input region 320 may provide a larger trackpad region with which users may interact to control the device 300. However, the larger input surface may not necessarily be desirable or useful in all operating modes or for all application programs. For example, a large trackpad region may be advantageous when controlling graphics programs and games, where users manipulate graphical outputs displayed on a primary display (e.g., moving an animated character around a virtual world, translating or rotating graphical objects in a computer aided drafting program, or the like), but may be disadvantageous when inputting text into an email editor or word processing program. For example, a larger trackpad region, such as the expanded input region 320, may increase the likelihood of detecting unintended inputs due to a user's hands resting on the palm rest region during typing, and the larger size may not be necessary or particularly useful for such applications. By dynamically changing the size and/or shape of the trackpad region based on the mode of operation of the device, the device 300 may provide an input experience that is tailored for each of various different uses or functions of a device. Stated another way, the device 300 does not have to compromise on the size and/or shape of the trackpad to try to accommodate each and every possible use. Rather, the size and shape of the trackpad can be customized to an ideal or preferred size for each application. Further, users may be able to customize the size of the first region 316 and/or the expanded input region 320, and may in fact be able to customize the size of the input region for different application programs or other modes of operation of the device 300.
As noted above, haptic outputs may be provided via the cover 312 in response to the detection of particular inputs in the trackpad region. For example, when a user presses in the trackpad region with sufficient force (e.g., when the force exceeds a threshold), a haptic output may be produced via the cover 312 to inform the user that the input has been detected and/or registered. When the device 300 is in the first mode of operation, the haptic output may be produced when an input having an applied force that exceeds a threshold force is detected within the first region 316. In the first mode of operation, if an input exceeding the threshold force is detected outside the first region 316, the input may not be registered and the device 300 may produce no haptic output. In the second mode of operation, on the other hand, an input may be registered and a haptic output may be produced when an input having an applied force that exceeds a threshold force is detected within the expanded input region 320 (which includes and/or encompasses the area of the first region 316).
As described above, a portion of the cover of a device, such as a trackpad region, may have a surface texture that differs from its surroundings. The surface texture may help visually and/or tactilely distinguish a trackpad region of the cover (e.g., where touch inputs are detected and/or registered) from other regions of the cover (e.g., where touch inputs are ignored or where non-trackpad style inputs are displayed, such as buttons or icons). The surface texture may have a greater roughness than surrounding or adjacent areas of the cover. For example, the surface texture of the trackpad region may be rougher than the surrounding areas of a cover (e.g., the surrounding areas may be smoother than the trackpad region). The greater roughness of the trackpad region may result in lower friction (e.g., a lower coefficient of static and/or kinetic friction) than the surrounding areas, which may be polished surfaces. The lower friction of the rougher trackpad region may provide a different tactile feel than the smooth, polished surfaces, especially when a user is sliding a finger or appendage over the trackpad region to provide an input to the device. For example, it may feel as though it takes less force or effort for a user to slide a finger over the rougher, textured surface as compared to a less rough surface. The lower friction of a textured surface may be an effect of the texturing. For example, the texturing may reduce the overall contact area between a finger and the surface, which may lead to a lower actual or perceived frictional force between the finger and the surface (as compared to a non-textured surface).
The device 400 may include a cover 402 (which may be an embodiment of the cover 112) that may be light-transmissive along at least a portion of the cover 402. The cover 402 may define a single keyboard opening in which a keyboard may be positioned, or it may define a key web with multiple openings for keys of the keyboard. Like the cover 112, the cover 402 may define first, second, third, and fourth touch-sensitive input regions 409-1-409-4.
As shown in
The textured region 406 of the cover 402 may correspond to a trackpad region in which a user may provide touch inputs. In some cases, the textured region 406 represents the smallest area that the device offers as a trackpad region. Thus, if the device 400 is in a mode where trackpad-style inputs are accepted, a user can expect that inputs applied to the textured region 406 will be registered as inputs. However, as noted above, improved functionality and features may be provided by allowing displays that are positioned beneath the cover 402 to dynamically display differently sized and shaped input regions. Accordingly, techniques such as those described above for dynamically changing the size and shape of the input regions may be applied to devices with textured surfaces as well. Moreover, additional techniques may be used to help mitigate or reduce the visual and/or tactile differences between the areas of different surface textures.
While
As described herein, the graphical output 503 may be produced by a display under the palm rest region of a laptop computer. As shown, the graphical output 503 includes a solid border. However, it will be understood that other graphical outputs may be used instead of or in addition to the solid border that is shown, such as a graphical output that produces an appearance of a texture, as described above. In some cases, no graphical output is displayed.
In
In order to reduce the tactile difference between the textured region 504 and the adjacent region 505, a device may produce a haptic output when the user's finger 508 is detected in the adjacent region 505 and within the boundary of the expanded input region 506. The presence of the user's finger 508 outside the textured region 504 and within the adjacent region 505 may be determined by a touch- and/or force-sensing system of the device (e.g., the sensing system 909,
The haptic output may be any suitable haptic output, such as a vibration, oscillation, acoustic wave, electrostatic haptic output, local deformation of the cover 502, or the like. The haptic output may be produced by any suitable haptic actuator, such as a moving or oscillating mass, an electrode, a piezoelectric material, or the like. For example, the cover 502 or a portion thereof may be shifted side-to-side (e.g., in a direction that is substantially parallel with the exterior surface of the cover 502). The side-to-side movement may be induced by a side-to-side action of the haptic actuator. As another example, the cover 502 may be moved up and down (e.g., in a direction that is substantially perpendicular to the exterior surface of the cover). The up and down movement may be induced by an up and down action of the haptic actuator. The haptic output is represented in
The haptic output may reduce the tactile difference between the regions in several ways. For example, the haptic output may reduce an actual or perceived friction of the adjacent region 505 (which may have a higher coefficient of friction than the textured region 504). As another example, an electrostatic haptic output may produce an oscillating attractive or repulsive force on the finger 508 that continuously changes the actual or perceived friction of the adjacent region 505. In some cases, the haptic output does not actually change the coefficient of friction between the finger 508 and the cover 502. In such cases, the haptic output may still indicate to a user that his or her finger 508 is within the expanded trackpad region even when the finger 508 is no longer within the textured region 504. Accordingly, the user may easily determine when he or she is interacting with the trackpad region, even without having to look directly at the palm rest region. The haptic output may cease when the user's finger is outside of the expanded input region 506 to indicate to the user that his or her finger is outside of the region that is being used to accept trackpad style inputs.
The haptic output may be produced based on the detected location of a user's finger 508 relative to the textured region 504 and the adjacent region 505. For example, when the user's finger 508 is detected within the textured region 504, there may be no haptic output (or at least no haptic output that is configured to reduce a tactile difference). When the user's finger 508 is detected outside the textured region 504 and within the adjacent region 505, the haptic output may be produced to reduce the tactile difference between the regions. When the user's finger 508 is detected outside of the expanded input region 506, there may be no haptic output. Due to the lower roughness and the lack of a haptic output, the user may perceive a tactile difference outside of the expanded input region 506, indicating to the user that he or she is outside of the expanded input region 506 and that touch and/or force inputs may not be detected (or may be treated differently by the device).
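As a non-limiting illustration, the location-dependent haptic policy described above might be expressed as follows; whether a touch falls within the textured region or within the expanded boundary is assumed to be determined elsewhere by the touch-sensing system, and the names are illustrative.

```swift
// Illustrative sketch: no texture-compensating haptic inside the textured
// region, a compensating haptic in the smooth area within the expanded
// boundary, and none outside the expanded boundary.
enum HapticPolicy { case none, reduceTactileDifference }

func hapticPolicy(isInTexturedRegion: Bool,
                  isInExpandedRegion: Bool) -> HapticPolicy {
    if isInTexturedRegion {
        return .none                      // the texture itself provides the feel
    }
    if isInExpandedRegion {
        return .reduceTactileDifference   // e.g., drive an oscillating haptic output
    }
    return .none                          // outside the active input region
}
```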
The portion of the cover 502 in
As described herein, the cover of a device (e.g., a laptop computer) may be formed of a light-transmissive (e.g., transparent) material, such as glass, polymer, ceramic, etc., and a display may be positioned under the light-transmissive cover to display various types of graphical outputs through the cover. In some cases, graphical outputs may be displayed within regions that are typically used for trackpad inputs, such as on the palm rest region of a cover. As noted above, however, a cover may include textured regions to provide a desirable tactile feel to a trackpad region and to help tactilely and visually distinguish the trackpad region. However, such textured regions, while still being sufficiently light-transmissive to allow a graphical output to be viewed through the textured regions, may distort or diffuse the graphical output as compared to non-textured regions of the cover. Accordingly, graphical outputs that span a textured and a non-textured region (or that are displayed near a textured region) may be modified or otherwise configured to reduce the difference in appearance caused by the textured region.
In some cases, however, the graphical output 606 may be modified or otherwise configured to reduce the sharp transition between the distorted portion and the undistorted portion of a graphical output.
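One way such a transition might be softened, sketched below under the assumption of a simple linear ramp, is to increase a compensation factor (e.g., additional blur or brightness adjustment applied to the graphical output) gradually over a short feathering distance rather than switching it abruptly at the texture boundary. The feather width is an arbitrary illustrative value.

```swift
// Illustrative sketch: ramp a visual compensation factor from 0 to 1 over a
// feathering distance measured from the edge of the textured region.
func compensationFactor(distanceIntoTexturedRegion d: Double,
                        featherWidth: Double = 2.0) -> Double {
    if d <= 0 { return 0.0 }              // fully in the smooth region
    if d >= featherWidth { return 1.0 }   // fully inside the textured region
    return d / featherWidth               // linear ramp across the boundary
}
```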
As described above, a light-transmissive cover may include an integral key web (e.g., the key web may be a portion of a monolithic cover component). The key web may include or otherwise define members that are positioned between adjacent keycaps, which, like the rest of the cover, may be light-transmissive. The key web may also be associated with a light source that is configured to illuminate or otherwise display images, colors, or other visible outputs through the key web. As used herein, visible outputs may include any illuminated output that is visible to a user, including solid colored light, graphical outputs, images, or the like. As noted above, the light source may be or may include a display, one or more individual sources of light (e.g., individual LEDs), light guides, and/or any other combination of illumination components. Notably, the light source illuminates (or otherwise displays visible output through) the material that forms the key web, and not merely an empty space between adjacent keycaps.
The light-transmissive key web and the corresponding light source may be configured to produce various dynamic visual effects via the key web. For example, in cases where the light source is or includes a display, graphical outputs may be displayed through the key web. Further, different areas of the key web may be illuminated differently from one another, and the areas that are illuminated may change based on active applications, user preference, ambient light characteristics (e.g., as determined by light-sensitive sensors), or the like.
As shown in
The second mode of operation may correspond to a different application program being executed by the device. For example, while the illumination pattern shown in
Moreover, because the underlying light source may be a display that can selectively illuminate different regions, the key web is not limited to any particular fixed illumination pattern or set of illumination patterns. For example, individual application programs may specify what colors, images, or other graphical or visible outputs to present via the key web, and specify the locations and/or patterns of the colors, images, or other graphical or visible outputs. In this way, the manner in which the key web is illuminated may be widely variable. Moreover, the illumination pattern need not be static. Rather, the patterns may dynamically change to produce animations or other moving and/or changing visible output. For example, the key web can be animated to simulate a moving wave or band that directs the user's attention to a key or area of the keyboard. The animation could also expand outward from a key to provide an indication that an input was received. The latter effect may produce an appearance of drops landing in water, with each key strike producing a new ripple. As another example, an animation or visible output may be controlled by the speed of a user's typing. For example, the key web may transition between various colors while a user is typing. When the user ceases typing, the color ceases changing; when the user types faster, the color may change at a faster rate. Other effects are also contemplated.
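As a non-limiting illustration, the following Swift sketch shows one way a typing-speed-driven color effect for the key web could be computed; the time window, hue step, and frame-based update are arbitrary illustrative choices.

```swift
// Illustrative sketch: the key web hue advances at a rate proportional to the
// recent keystroke rate and stops changing when typing stops.
struct KeyWebColorAnimator {
    private var keystrokeTimes: [Double] = []   // timestamps in seconds
    private var hue: Double = 0.0               // 0.0 ..< 1.0

    mutating func recordKeystroke(at time: Double) {
        keystrokeTimes.append(time)
        keystrokeTimes.removeAll { time - $0 > 2.0 }   // keep a 2-second window
    }

    // Advances the hue based on the recent typing rate and returns the hue the
    // key web light source should display for this frame.
    mutating func hueForFrame(at time: Double, frameDuration: Double) -> Double {
        let recent = keystrokeTimes.filter { time - $0 <= 2.0 }.count
        let rate = Double(recent) / 2.0            // keystrokes per second
        hue = (hue + 0.05 * rate * frameDuration).truncatingRemainder(dividingBy: 1.0)
        return hue                                 // rate == 0 leaves the color unchanged
    }
}
```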
As noted above, the combination of a display and a light-transmissive (e.g., transparent) cover over the display in a base portion of a computer allows detailed graphical outputs to be displayed on areas of the cover below, above, and along the sides of a keyboard. The integration of touch- and/or force-sensing systems further allows a user to interact with such images in various ways. The particular graphical outputs that are displayed via the cover may depend on the state of operation of the device (e.g., the particular application program that is being executed, the graphical content that is displayed on a primary display, or the like). In some cases, the graphical output displayed in touch-sensitive regions that are adjacent a keyboard can be interacted with in order to control the operation of the device and/or an application program active on the device.
In
While the first application program is active, several graphical outputs may be displayed via the cover 812 (e.g., by an underlying display such as the display system 205,
While the first graphical output 822 is being displayed (and optionally any other graphical outputs shown or described), a second graphical output associated with the application program may be displayed along a second input region. For example,
Other graphical outputs may be displayed in addition to the first and second graphical outputs 822, 818 described with respect to
When the second application program is active, the device may display different graphical outputs via the cover 812 than those that are displayed when the first application program is active. For example, a third graphical output 828 associated with the second application program may be displayed along the palm rest region of the device 800 (e.g., the first touch-sensitive input region 109-1,
When the second application program is active, as shown in
When the second application program is active, the slider bar 820 may also be displayed. The slider bar 820 may be manipulated to scroll or move the portion of the map (or other graphical output) on the primary display. Once again, presenting the slider bar 820 in the same touch-sensitive input region (e.g., the third touch-sensitive input region 109-3,
When the third application program is active, the device may display different graphical outputs via the cover 812 than those that are displayed when the first and/or second application programs are active. For example, a fifth graphical output 840 associated with the third application program may be displayed along the palm rest region of the device 800 (e.g., the first touch-sensitive input region 109-1,
When the third application program is active, as shown in
For some application programs, it may be desirable to provide more selectable controls than may be conveniently included in the second touch-sensitive input region. In such cases, additional graphical outputs 842, 846 may be displayed via the cover 812. The additional graphical outputs 842, 846 may include more selectable controls that have a similar appearance and/or type of function as those in the sixth graphical output 844. In other cases, however, any other types of graphical output, affordance, or information may be presented in the additional graphical outputs 842, 846. Further, because the display within the base portion 804 may underlie multiple areas of the cover 812 (including optionally substantially the entire cover 812), the additional graphical outputs 842, 846 may be displayed at any suitable location on the cover 812. For example,
While
As shown in
The processing units 902 of
The memory 904 can store electronic data that can be used by the device 900. For example, a memory can store electronic data or content such as audio and video files, documents and applications, device settings and user preferences, timing and control signals or data for the various modules, data structures or databases, and so on. The memory 904 can be configured as any type of memory. By way of example only, the memory can be implemented as random access memory, read-only memory, Flash memory, removable memory, or other types of storage elements, or combinations of such devices.
The touch sensors 906 (also referred to herein as touch sensing systems) may detect various types of touch-based inputs and generate signals or data that are able to be accessed using processor instructions. The touch sensors 906 may use any suitable components and may rely on any suitable phenomena to detect physical inputs. For example, the touch sensors 906 may be capacitive touch sensors, resistive touch sensors, acoustic wave sensors, or the like. The touch sensors 906 may include any suitable components for detecting touch-based inputs and generating signals or data that are able to be accessed using processor instructions, including electrodes (e.g., electrode layers), physical components (e.g., substrates, spacing layers, structural supports, compressible elements, etc.), processors, circuitry, firmware, and the like. The touch sensors 906 may be integrated with or otherwise configured to detect touch inputs applied to a cover of a computing device, such as the cover 112 in
The force sensors 908 (also referred to herein as force sensing systems) may detect various types of force-based inputs and generate signals or data that are able to be accessed using processor instructions. The force sensors 908 may use any suitable components and may rely on any suitable phenomena to detect physical inputs. For example, the force sensors 908 may be strain-based sensors, piezoelectric-based sensors, piezoresistive-based sensors, capacitive sensors, resistive sensors, or the like. The force sensors 908 may include any suitable components for detecting force-based inputs and generating signals or data that are able to be accessed using processor instructions, including electrodes (e.g., electrode layers), physical components (e.g., substrates, spacing layers, structural supports, compressible elements, etc.), processors, circuitry, firmware, and the like. The force sensors 908 may be used in conjunction with various input mechanisms to detect various types of inputs. For example, the force sensors 908 may be used to detect clicks, presses, or other force inputs applied to a touch-sensitive input region (e.g., the touch-sensitive input regions 109,
The touch sensors 906 and the force sensors 908 (which may also be referred to as touch and force sensing systems) may be considered part of a sensing system 909. The sensing system 909 may include touch sensors alone, force sensors alone, or both touch and force sensors. Moreover, the sensing system 909 may provide touch sensing functions and/or force sensing functions using any configuration or combination of hardware and/or software components, systems, subsystems, and the like. For example, some force sensing components and associated circuitry may be capable of determining both a location of an input as well as a magnitude of force (e.g., a non-binary measurement) of the input. In such cases, a distinct physical touch-sensing mechanism may be omitted. In some examples, physical mechanisms and/or components may be shared by the touch sensors 906 and the force sensors 908. For example, an electrode layer that is used to provide a drive signal for a capacitive force sensor may also be used to provide a drive signal for a capacitive touch sensor. In some examples, a device includes functionally and/or physically distinct touch sensors and force sensors to provide the desired sensing functionality.
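One way such a combined sensing system could be organized is sketched below; the event fields, force threshold, and callback are assumptions made for illustration and do not describe the disclosed sensor hardware itself.

```python
# Hypothetical sketch: a sensing system that reports a single event carrying
# both a touch location and a non-binary force magnitude, so that a separate
# touch-only sensing path may be omitted when force sensing alone can localize
# the input.
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class InputEvent:
    x_mm: float         # location of the input on the cover
    y_mm: float
    force_grams: float  # measured force magnitude (non-binary)


class SensingSystem:
    def __init__(self, click_threshold_g: float = 150.0):
        self.click_threshold_g = click_threshold_g
        self.on_click: Optional[Callable[[InputEvent], None]] = None

    def process_raw_sample(self, x_mm: float, y_mm: float, force_grams: float) -> InputEvent:
        """Convert one raw sensor sample into an input event."""
        event = InputEvent(x_mm, y_mm, force_grams)
        # A force above the threshold is treated as a click even though the
        # cover has no mechanical switch; a haptic actuator could be fired here.
        if force_grams >= self.click_threshold_g and self.on_click:
            self.on_click(event)
        return event


sensing = SensingSystem()
sensing.on_click = lambda e: print(f"click at ({e.x_mm}, {e.y_mm}) with {e.force_grams} g")
sensing.process_raw_sample(120.0, 80.0, 200.0)
```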
The device 900 may also include one or more haptic actuator(s) 912. The haptic actuator(s) 912 may include one or more of a variety of haptic technologies such as, but not necessarily limited to, rotational haptic devices, linear actuators, piezoelectric devices, vibration elements, and so on. In general, the haptic actuator(s) 912 may be configured to provide punctuated and distinct feedback to a user of the device. More particularly, the haptic actuator(s) 912 may be adapted to produce a knock or tap sensation and/or a vibration sensation. Such haptic outputs may be provided in response to detection of touch- and/or force-based inputs, such as detection of force inputs on a touch-sensitive input region (e.g., the touch-sensitive input regions 109,
The one or more communication channels 910 may include one or more wireless interface(s) that are adapted to provide communication between the processing unit(s) 902 and an external device. In general, the one or more communication channels 910 may be configured to transmit and receive data and/or signals that may be interpreted by instructions executed on the processing units 902. In some cases, the external device is part of an external communication network that is configured to exchange data with wireless devices. Generally, the wireless interface(s) may communicate via radio frequency, optical, acoustic, and/or magnetic signals and may be configured to operate according to any suitable wireless protocol. Example communication interfaces include radio frequency cellular interfaces, fiber optic interfaces, acoustic interfaces, Bluetooth interfaces, infrared interfaces, USB interfaces, Wi-Fi interfaces, TCP/IP interfaces, network communications interfaces, or any other conventional communication interfaces.
As shown in
The foregoing description, for purposes of explanation, used specific nomenclature to provide a thorough understanding of the described embodiments. However, it will be apparent to one skilled in the art that the specific details are not required in order to practice the described embodiments. Thus, the foregoing descriptions of the specific embodiments described herein are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. It will be apparent to one of ordinary skill in the art that many modifications and variations are possible in view of the above teachings. Also, when used herein to refer to positions of components, the terms above and below, or their synonyms, do not necessarily refer to an absolute position relative to an external reference, but instead refer to the relative position of components with reference to the figures.
Moreover, the foregoing figures and descriptions include numerous concepts and features, which may be combined in numerous ways to achieve numerous benefits and advantages. Thus, features, components, elements, and/or concepts from various different figures may be combined to produce embodiments or implementations that are not necessarily shown or described together in the present description. Further, not all features, components, elements, and/or concepts shown in a particular figure or description are necessarily required in any particular embodiment and/or implementation. It will be understood that such embodiments and/or implementations fall within the scope of this description.
This application is a nonprovisional patent application of and claims the benefit of U.S. Provisional Patent Application No. 62/676,823, filed May 25, 2018 and titled “Portable Computer with Dynamic Display Interface,” the disclosure of which is hereby incorporated herein by reference in its entirety.