This disclosure relates generally to optical sensing interfaces, and more specifically to optical touchscreen interfaces.
Touchscreen displays have become commonplace in the realm of digital media. For example, digital e-book readers, mobile handsets, smartphones, tablet computers, and various multimedia devices are commonly equipped with touchscreen displays. As the name indicates, touchscreen displays generally include two essential components: a display and a touchscreen interface. The touchscreen interface can generally be positioned in front of the display such that the touchscreen interface covers most or all of the viewable area of the display. The touchscreen interface is configured to transfer data, commands, and responses from the outside world into the device. For example, the touchscreen interface may be used to move a cursor, icon, or other visual object, to navigate menus, and to make selections with respect to the GUI on the display. More specifically, the touchscreen interface can be configured to recognize the touch and position (among other possible attributes) of one or more human fingers or a stylus that contact the touchscreen interface and to send the data to a processor that controls the display.
There are a number of different touchscreen sensing technologies in use including resistive sensing, capacitive sensing, and surface acoustic wave (SAW) sensing. Another type of touchscreen sensing technology used is an infrared grid. Generally, an infrared grid touchscreen includes an array of light-emitting diodes (LEDs) and corresponding photo-detectors arranged along the edges of the touchscreen. The LEDs emit infrared (IR) light beams across the screen in horizontal and vertical patterns. The photo-detectors detect disruptions in the patterns caused by, for example, the presence of a finger or stylus. The choice of which sensing technology to use in a particular application can depend on one or more of a variety of factors including the type of display, the quality or clarity of the display desired (for example, the contrast of the display), cost, resistance to contaminants, resilience to wear, and resolution, among others.
The systems, methods and devices of this disclosure each have several innovative aspects, no single one of which is solely responsible for the desirable attributes disclosed herein.
One innovative aspect of the subject matter described in this disclosure can be implemented in an apparatus. The apparatus includes a light source, a photo-detector, and a substrate. The apparatus additionally includes an emitting waveguide that extends at least partially over the substrate, the emitting waveguide configured to propagate light from the light source to one or more corresponding emitting portions arranged along the emitting waveguide. Each emitting portion is configured to be capable of emitting at least a portion of the light from the emitting waveguide outwards from a plane defining the substrate. The apparatus further includes a sensing waveguide that extends at least partially over the substrate, the sensing waveguide including one or more sensing portions arranged along the sensing waveguide. Each sensing portion is arranged proximate a corresponding emitting portion and is configured to be capable of receiving light scattered by an object over the corresponding emitting portion. The sensing waveguide is configured to propagate the received light to the photo-detector.
In some implementations, the apparatus further includes a transparent lower cladding layer, a transparent core layer arranged over the lower cladding layer, the core layer including the emitting waveguide and the sensing waveguide, and a transparent upper cladding layer arranged over the core layer. In some implementations, the apparatus further comprises a transparent cover layer arranged over the upper cladding layer. In some implementations, the core layer includes a plurality of emitting waveguides and a plurality of sensing waveguides. The apparatus can further include a distributing waveguide optically coupled with the light source and configured to receive light emitted from the light source and to propagate the received light along the distributing waveguide. The distributing waveguide includes a plurality of turning portions, each turning portion arranged along the distributing waveguide and configured to reflect a portion of the light received by the turning portion into a corresponding one of the plurality of emitting waveguides. In some implementations, the reflected portions of light entering each emitting waveguide have substantially the same intensity. In some other implementations, each turning portion in the distributing waveguide reflects substantially the same percentage of the intensity of the light received by the turning portion as the other turning portions in the distributing waveguide reflect. Each turning portion can include a set of turning gratings. In some implementations, each set of turning gratings is configured as a set of nano-gratings. Each emitting portion and each sensing portion can include a set of gratings. In some implementations, each set of emitting or sensing gratings also is configured as a set of nano-gratings.
In some implementations, the photo-detector is part of an array of photo-detectors, each photo-detector of the array of photo-detectors being configured to receive light from a corresponding one of the sensing waveguides. Each emitting portion and the corresponding proximately-arranged sensing portion can be configured as a sensing point. In some such implementations, the apparatus is communicatively-coupled to a processor configured to determine a location of the sensing point when the photo-detector detects a threshold amount of light associated with the sensing point.
In some implementations, the emitting waveguides and the sensing waveguides extend along a width of the substrate along a direction of an x-axis of the substrate, and the distributing waveguide extends along a length of the substrate along a direction of a y-axis of the substrate orthogonal to the x-axis. In some such implementations, each emitting waveguide includes only one emitting portion and each sensing waveguide includes only one sensing portion. In some such implementations, each sensing point is associated with a particular x-coordinate position and a particular y-coordinate position because the emitting waveguides and corresponding sensing waveguides are arranged in a sufficiently dense periodic fashion along the length of the apparatus such that both an x- and a y-coordinate position of the object can be determined based on detection of light from a single sensing waveguide.
In some other implementations, each emitting waveguide includes a plurality of emitting portions and each sensing waveguide includes a plurality of sensing portions. In some such implementations, each of the sensing points along a corresponding pair of adjacent emitting and sensing waveguides is associated with the same particular y-coordinate position. In some such implementations, the apparatus is a first apparatus of a display device and the display device further includes a second apparatus disposed over or with the first apparatus. The second apparatus includes a plurality of second emitting waveguides that each extend at least partially over the substrate, each second emitting waveguide configured to propagate light to a plurality of corresponding second emitting portions arranged along the second emitting waveguide. Each second emitting portion is configured to be capable of emitting at least a portion of the light from the second emitting waveguide outwards from the plane defining the substrate. The second apparatus further includes a plurality of second sensing waveguides that each extend at least partially over the substrate, each second sensing waveguide including a plurality of second sensing portions arranged along the second sensing waveguide. Each second sensing portion is disposed proximate a corresponding emitting portion and is configured to be capable of receiving light scattered by an object over the corresponding second emitting portion. The second sensing waveguide is configured to propagate the received light to a photo-detector. In some such implementations, the second emitting waveguides and the second sensing waveguides extend along the length of the substrate along the direction of the y-axis, the second apparatus includes a second distributing waveguide that extends along the length of the substrate along the direction of the x-axis, and each of the sensing points of the second apparatus along a corresponding pair of adjacent second emitting and second sensing waveguides is associated with the same particular x-coordinate position. In some such implementations, the sensing points of the second apparatus are positioned directly over, or are positioned proximately offset from, the sensing points of the first apparatus, and when an object is suitably positioned on or over at least one sensing point of the first apparatus the object is also positioned over at least one sensing point of the second apparatus such that the processor is configured to determine an x-coordinate position of the object based on information from the second apparatus and to determine a y-coordinate position of the object based on information from the first apparatus. In some implementations, the second distributing waveguide of the second apparatus receives light from the light source of the first apparatus and distributes the light to the second emitting waveguides.
In another aspect, an apparatus includes light generating means, photo-detection means, and a substrate. The apparatus additionally includes first guiding means that extends at least partially over the substrate. The first guiding means is configured to propagate light from the light generating means to one or more corresponding emitting means arranged along the first guiding means. Each emitting means is configured to be capable of emitting at least a portion of the light from the first guiding means outwards from a plane defining the substrate. The apparatus further includes second guiding means that extends at least partially over the substrate. The second guiding means includes one or more sensing means arranged along the second guiding means. Each sensing means is arranged proximate a corresponding emitting means and is configured to be capable of receiving light scattered by an object over the corresponding emitting means. The second guiding means is configured to propagate the received light to the photo-detection means. In some implementations, each emitting means and each sensing means includes a set of nano-gratings.
Details of one or more implementations of the subject matter described in this specification are set forth in the accompanying drawings and the description below. Although the examples provided in this disclosure may be described in terms of, or in combination with, EMS and MEMS-based displays, the touchscreen concepts provided herein may apply to other types of displays, such as liquid crystal displays (LCDs), organic light-emitting diode (OLED) displays and field emission displays. Other features, aspects, and advantages will become apparent from the description, the drawings, and the claims. Note that the relative dimensions of the following figures may not be drawn to scale.
Like reference numbers and designations in the various drawings indicate like elements.
The disclosed implementations include examples of systems, apparatus, devices, components, methods, and techniques for producing an optical sensing interface and for using an optical sensing interface to detect touches or gestures applied on or over a display. Some implementations relate more specifically to optical touchscreen interfaces that utilize an array of waveguides to distribute light and a plurality of gratings to selectively reflect, scatter, and receive light in conjunction with the waveguides. For example, a distributing waveguide can distribute light from a light source to a plurality of emitting waveguides that extend across at least a portion of a display. Each emitting waveguide can be configured to emit light at one or more portions along the emitting waveguide. In some implementations, each of the emitting portions includes a number of gratings configured to scatter a portion of the light propagating through the emitting waveguide. The optical sensing interface also includes a photo-detector and a plurality of sensing waveguides that extend across at least a portion of the display. Each sensing waveguide can be configured to receive light at one or more portions along the sensing waveguide and to propagate the received light to the photo-detector for detection. In some implementations, each of the sensing portions includes a number of gratings configured to receive light scattered from an object near a proximate emitting portion and to propagate the received light through the sensing waveguide to the photo-detector.
Particular implementations of the subject matter described in this disclosure can be implemented to realize one or more of the following potential advantages. Some implementations are particularly useful or applicable to handheld computing devices in which a primary source or method of user input is a touchscreen or other sensing interface disposed on or over a display of the device. In some implementations, the optical sensing interfaces described herein can be configured to detect single- and multi-point touches or gestures including near-field touches or gestures; that is, touches or gestures that do not necessarily physically contact a surface of the sensing interface or display of the device. In contrast to some traditional IR-based touchscreens that require multiple light sources, in some implementations, only a single light source may be used. Additionally, while traditional IR-based touchscreens are based on sensing disruptions or drops in the amplitudes of light signals, and thus can suffer from shadowing effects, making them unusable for detecting multi-point touches or gestures, implementations described herein are based on sensing amplitude increases and are not as susceptible to shadowing. In some implementations, some or all of the waveguides of the optical sensing interfaces described herein can be produced using the same lower cladding layer, the same core materials, and the same upper cladding layer. In some implementations, the array of waveguides can be constructed on or in a flexible substrate. Additionally, the substrate and waveguides can be made very thin: for example, on the order of a few microns. Furthermore, the use of nano-gratings as components of sensing points, as described below, enables very high resolution: for example, in the sub-millimeter range.
The display 102 can include any suitable display screen technology. For example, the display 102 can be a MEMS-based display such as an interferometric modulator (IMOD)-based display or a digital MEMS shutter (DMS) display, an LCD display, or an LED display. The display 102 can generally be configured to display a graphical user interface (GUI) that facilitates interaction between a user of the device 100 and the operating system and other applications executing (or “running”) on the device 100. For example, the GUI may generally present programs, files, and operational options with graphical images. The graphical images may include, for example, windows, fields, dialog boxes, menus, other text, icons, buttons, cursors, scroll bars, among other presentations. During operation of the device 100, the user (hereinafter “user” and “viewer” may be used interchangeably) can select, activate, or manipulate various graphical images (hereinafter also referred to as “visual objects”) displayed on the display 102 to initiate function(s) associated with the visual object, to manipulate the visual object, or to input information to the device 100.
The device 100 includes one or more user input devices, including an optical sensing interface 104, that are operatively coupled to one or more processors or processing circuits (not shown) within the device 100. In some implementations, the optical sensing interface 104 can be configured as, or referred to as, a touchscreen interface. The optical sensing interface 104 can generally be in communication with a sensing interface controller (not shown). Generally, the optical sensing interface 104 and other input devices are configured to transfer data, commands, and responses from the outside world into the device 100. For example, the input devices may be used to move a cursor, icon, or other visual object, to navigate menus, and to make selections with respect to the GUI on the display 102. In some implementations, the input devices, such as the optical sensing interface 104, can be used to perform other operations including paging, scrolling, panning, dragging, “flicking,” “flinging,” and zooming, among other possibilities. Other input devices include buttons or keys, computer “mice,” trackballs, touchpads, and joysticks, among others.
The optical sensing interface 104 is generally configured to recognize the touch and position (among other possible attributes) of a “touch event” on or over the display 102. The touch event can be a touch, such as from one or more human fingers or a stylus that physically contacts a part of the sensing interface 104. In some implementations, the optical sensing interface 104 also can be configured to recognize near-field or other gestures applied over the optical sensing interface 104; that is, the optical sensing interface 104 can be configured to sense the positioning and motion of an object, such as a human finger or stylus, in close proximity to a part of the sensing interface 104. Such gestures applied over the optical sensing interface 104 may or may not intermittently physically or directly contact an upper (or outer) surface of the optical sensing interface 104. Thus, for purposes of some implementations herein, touch gestures include gestures that are sensed by the sensing interface 104 regardless of whether or not the gestures physically or directly contact the sensing device.
A processor, alone or in conjunction with other components including a sensing interface controller and the optical sensing interface 104, interprets each touch event and executes one or more instructions to perform an action or actions based on the touch event. In some implementations, the optical sensing interface 104 is configured to sense and distinguish between multiple touches, different magnitudes of touches, as well as the velocity (e.g., speed and direction) or acceleration of a touch as one or more fingers (or a stylus or other suitable object) are moved across or over the optical sensing interface 104. The optical sensing interface 104 can generally be positioned in front of the display 102 such that the optical sensing interface covers most or all of the viewable area of the display 102.
In some implementations, the optical sensing interface 104 registers touch events, generates signals in response to the registered touch events, and sends these signals to a sensing interface controller. The sensing interface controller then processes these signals and sends the processed data to the processor. In some implementations, the functionality of the sensing interface controller can be incorporated into or integrated with the processor. For example, a processor can be configured to receive touch event signals from the optical sensing interface 104 and to process or translate these signals into computer input events.
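For illustration only, the following sketch models the signal flow described above under assumed names and values: a sensing interface controller compares per-waveguide photo-detector readings against stored baselines and forwards processed touch-event records to a processor-side handler. The class, thresholds, and data layout are hypothetical and are not taken from the disclosed implementations.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class TouchEvent:
    waveguide_index: int   # which sensing waveguide registered scattered light
    amplitude: float       # detected light level above that waveguide's baseline


class SensingInterfaceController:
    """Hypothetical controller that turns raw photo-detector readings into touch events."""

    def __init__(self, baselines: List[float], threshold: float,
                 on_touch: Callable[[List[TouchEvent]], None]):
        self.baselines = baselines   # per-waveguide light levels with no touch present
        self.threshold = threshold   # minimum rise above baseline that registers a touch
        self.on_touch = on_touch     # processor-side handler for processed touch events

    def process_frame(self, readings: List[float]) -> None:
        """Compare one frame of per-waveguide readings against the stored baselines."""
        events = [TouchEvent(i, reading - baseline)
                  for i, (reading, baseline) in enumerate(zip(readings, self.baselines))
                  if reading - baseline >= self.threshold]
        if events:
            self.on_touch(events)


# Example: the reading for sensing waveguide 2 rises well above its baseline,
# so a single touch event is forwarded to the processor-side handler.
controller = SensingInterfaceController(
    baselines=[0.10, 0.09, 0.08, 0.07],
    threshold=0.05,
    on_touch=lambda events: print([(e.waveguide_index, round(e.amplitude, 3)) for e in events]),
)
controller.process_frame([0.10, 0.09, 0.15, 0.07])   # prints [(2, 0.07)]
```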
In some implementations, the optical sensing interface 104 is capable of recognizing multiple touch events that occur at different locations on the touch sensitive surface of the optical sensing interface 104 at the same or similar time; that is, the optical sensing interface 104 allows for multiple contact points or “touch points” to be detected and subsequently tracked simultaneously. In some implementations, the optical sensing interface 104 generates separate tracking signals for each touch point on the optical sensing interface 104 at the same time. Such an implementation may be referred to as a “multi-touch” interface.
In some implementations, the device 100 is operable to recognize gestures applied to the optical sensing interface 104 and to control aspects of the device 100 based on the gestures. For example, a gesture may be defined as a stylized single or multi-point touch event interaction with the optical sensing interface 104 that is mapped to one or more specific computing operations. As described, the gestures may be made through various hand and, more particularly, finger motions. The optical sensing interface 104 receives the gestures and the one or more processors execute instructions to carry out operations associated with the gestures. In some implementations, a memory (not shown) of the device 100 includes a gestural operation program and an associated gesture library, which may be a part of the operating system or a separate application. The gestural operation program can generally include a set of instructions that recognizes the occurrence of gestures and informs the processor what instructions to execute or actions to perform in response to the gestures. In some implementations, for example, when a user performs one or more gestures on or over the optical sensing interface 104, the optical sensing interface 104 relays gesture information to the processor, which, using and executing instructions from the memory, including the gestural operation program, interprets the gestures and controls different components of the device 100 based on the gestures. For example, the gestures may be identified as commands for performing actions in applications stored in the memory, modifying or manipulating visual objects displayed by the display 102, and modifying data stored in the memory. For example, the gestures may initiate commands associated with dragging, flicking, flinging, scrolling, paging, panning, zooming, rotating, and sizing. Additionally, the commands also may be associated with launching a particular program or application, opening a file or document, viewing a menu, viewing a video, making a selection, or executing other instructions.
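As a purely illustrative sketch of a gesture library and gestural operation program, the following maps recognized gesture names to the operations to be executed. The gesture names, handlers, and arguments are assumptions made for illustration and do not describe any particular implementation of the device 100.

```python
def scroll(delta):
    print(f"scroll by {delta}")

def zoom(factor):
    print(f"zoom by {factor}")

def launch(app):
    print(f"launch {app}")

# The gesture library: each recognized gesture name maps to the operation
# (here, a plain function) that the processor should execute for it.
GESTURE_LIBRARY = {
    "one_finger_drag": scroll,
    "two_finger_pinch": zoom,
    "double_tap": launch,
}

def handle_gesture(name, argument):
    """Dispatch a recognized gesture to its associated operation, if one exists."""
    handler = GESTURE_LIBRARY.get(name)
    if handler is not None:
        handler(argument)

handle_gesture("two_finger_pinch", 1.5)   # prints "zoom by 1.5"
```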
In some implementations, the device 100, and particularly the optical sensing interface 104 and the processor, is/are configured to immediately recognize the gestures applied to the optical sensing interface 104 such that actions associated with the gestures can be implemented at the same time (or substantially the same time as perceived by a viewer) as the gesture. That is, the gesture and the corresponding action occur effectively simultaneously. In some implementations, a visual object can be continuously manipulated based on the gesture applied to the optical sensing interface 104. That is, there may be a direct relationship between a gesture being applied to the optical sensing interface 104 and the visual object displayed by the display 102. For example, during a scrolling gesture, the visual object (such as text) displayed on the display 102 moves with the associated gesture (either in the same or the opposite direction, for example); that is, with the finger or other input across the optical sensing interface 104. As another example, during a dragging operation, the visual object (such as an icon, picture, or other image) being dragged moves across the display based on the velocity of the gesture. However, in some implementations, a visual object may continue to move after the gesture has ended. For example, during some scrolling operations, or during a flinging or flicking operation, the visual object's velocity and acceleration can be based on the velocity and acceleration associated with the gesture and may continue to move after the gesture has ceased based on the velocity or acceleration of the previously applied gesture.
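The post-gesture motion described above can be illustrated with a simple, assumed friction model (not taken from this disclosure): the visual object continues from the gesture's release velocity and decelerates each frame until it stops. The friction factor, frame rate, and stopping speed below are illustrative values only.

```python
def fling_positions(start, release_velocity, friction=0.90, frame_dt=1 / 60.0, min_speed=1.0):
    """Yield successive positions (pixels) of a flung visual object after the gesture ends."""
    position, velocity = start, release_velocity
    while abs(velocity) >= min_speed:   # stop once the remaining motion is imperceptible
        position += velocity * frame_dt
        velocity *= friction            # per-frame decay stands in for friction
        yield position

# A fling released at 1200 px/s coasts roughly 200 px before coming to rest.
positions = list(fling_positions(start=0.0, release_velocity=1200.0))
print(round(positions[-1], 1))
```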
In the illustrated implementation, the distributing waveguide 206 propagates the light received from the light source 208 along the length (or parallel to a y-axis) of the device 100. A plurality of turning portions 210 are arranged along the length of the distributing waveguide 206. In some implementations, each turning portion 210 includes a set of turning gratings for reflecting a portion of the light incident on the turning gratings into the corresponding emitting waveguide 212. In some such implementations, the turning gratings can be configured as nano-gratings passing or reflecting only certain wavelengths or ranges of wavelengths. For example, the turning gratings can be similar to Fiber-Bragg gratings in that they pass only certain wavelengths of light. Each turning portion 210 is configured to reflect a portion of the light received by the turning portion 210. For example, each turning portion 210 can typically reflect a small portion (e.g., less than 10%) and transmit the remainder through to the next turning portion 210.
Light reflected by each turning portion 210 is optically coupled into a corresponding one of a plurality of emitting waveguides 212. In the illustrated implementation, each of the emitting waveguides 212 extends along the width (or parallel to an x-axis) of the device 100. In some implementations, the turned light entering each emitting waveguide 212 has substantially the same intensity (for example, each emitting waveguide 212 receives substantially the same amount of light as the other emitting waveguides). In some other implementations, each turning portion 210 in the distributing waveguide 206 reflects substantially the same percentage of the intensity of the light received by the turning portion 210 as the other turning portions 210 in the distributing waveguide 206 reflect (for example, each emitting waveguide 212 receives approximately 10%—or some other suitable percentage—of the light incident on the corresponding turning portion 210). For example, if each turning portion 210 is configured to reflect 10% of the light received by the turning portion into the corresponding emitting waveguide 212, then the first turning portion 210 after the light source 208 would reflect 10% of the intensity of the light received from the light source and transmit 90% of the light received from the light source. In this example, the second turning portion 210 would receive the 90% intensity from the first turning portion 210, and would reflect 10% of that light—9% of the original light intensity—into the corresponding emitting waveguide 212 and transmit 90% of that light—81% of the original light intensity—to the third turning portion 210. Each emitting waveguide 212 propagates the light received by the corresponding turning portion 210 along the emitting waveguide 212 towards an emitting portion 214. In some implementations, each emitting portion 214 includes a set of gratings. For example, each set of gratings of each emitting portion 214 can be configured as a nano-grating or similar to a Fiber-Bragg grating. Each emitting portion 214 is configured to emit, leak, or scatter a portion of the light received by the emitting portion 214.
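The two distribution schemes described above can be illustrated numerically with the following sketch, which uses assumed values: a fixed per-portion reflectance produces geometrically decaying launch intensities (10%, 9%, 8.1%, ...), whereas launching equal intensities into every emitting waveguide requires turning portion k of N to reflect 1/(N - k) of the light reaching it. The function names and the four-waveguide example are illustrative only.

```python
def constant_fraction_launch(num_waveguides, fraction=0.10, source=1.0):
    """Intensity coupled into each emitting waveguide when every turning portion reflects a fixed fraction."""
    launched, remaining = [], source
    for _ in range(num_waveguides):
        launched.append(remaining * fraction)
        remaining *= (1.0 - fraction)   # the rest continues down the distributing waveguide
    return launched

def equal_intensity_fractions(num_waveguides):
    """Reflection fraction each turning portion needs so every emitting waveguide receives source/N."""
    return [1.0 / (num_waveguides - k) for k in range(num_waveguides)]

print([round(i, 4) for i in constant_fraction_launch(4)])    # [0.1, 0.09, 0.081, 0.0729]
print([round(f, 4) for f in equal_intensity_fractions(4)])   # [0.25, 0.3333, 0.5, 1.0]
```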
The optical sensing interface 104 also includes a corresponding plurality of sensing waveguides 216. In the illustrated implementation, each of the sensing waveguides 216 also extends along the width (or parallel to the x-axis) of the device 100. Each sensing waveguide 216 is optically coupled to or with a photo-detector array 220. In some implementations, the photo-detector array 220 includes a plurality of individual photo-detectors (not shown)—each individual photo-detector of the array being configured for detecting light from a single one of the sensing waveguides 216 or from a subset of the sensing waveguides 216. Each sensing waveguide 216 includes one or more sensing portions 218. Each sensing portion 218 is arranged proximate a corresponding one of the emitting portions 214 of an emitting waveguide 212. For example, in some implementations, each emitting waveguide 212, at least at the emitting portion 214, can be spaced apart from the nearest sensing waveguide 216, at least at the sensing portion 218, by approximately the width of the emitting or sensing waveguide. In some other implementations, other widths or spacings may be appropriate. In some implementations, each sensing portion 218 includes a set of gratings. For example, each set of gratings of each sensing portion 218 can be configured as a nano-grating or similar to a Fiber-Bragg grating. Each sensing portion 218 is configured to receive light scattered onto or into the gratings of the sensing portion 218 (by, for example, a finger or other object proximate the sensing portion 218) and to propagate the received light along the sensing waveguide 216 for transfer into, and detection by, the photo-detector array 220. Each sensing portion 218 and corresponding proximate emitting portion 214 can be referred to as a sensing point 222.
In some implementations, all of the waveguides—the distributing waveguide 206, the plurality of emitting waveguides 212, and the plurality of sensing waveguides 216—are planar waveguides.
In some implementations, the optical sensing interface 104 is constructed on a transparent substrate (not shown) upon which the lower cladding layer 332 is grown, deposited, or otherwise disposed. In some implementations, the substrate can be a flexible substrate. The cores of the distributing waveguide 206, the emitting waveguides 212, and the sensing waveguides 216 are then grown, deposited, or otherwise disposed over the lower cladding layer 332. The upper cladding layer is grown, deposited, or otherwise disposed over the cores and the lower cladding layer 332. In some implementations in which a flexible substrate is used, the cores of the distributing waveguide 206, the emitting waveguides 212, and the sensing waveguides 216 can be fabricated through one or more nanoimprint lithographic and/or roll-to-roll replication techniques. These techniques can result in very thin waveguides with nanometer scale dimensions and features. Such techniques also can be considerably less expensive than more traditional fabrication techniques including electron beam lithography, ion beam deposition, or other typical semiconductor fabrication processes. In some implementations, the lower cladding layer and the upper cladding layer are formed of silicon dioxide or “silica” (SiO2) or another suitable material that is transparent to visible light. In some implementations, the core is formed of doped SiO2. For example, in some implementations, the core of each waveguide is formed of heavily-doped SiO2 in which the dopant is germanium (Ge), although other dopants can be used. Generally, like traditional waveguides, the outer cladding layers have indices of refraction that are lower than the index of refraction of the core. For example, in one implementation, the core has an index of refraction, n, of 1.455, while the lower and upper cladding layers have indices of refraction, n, of 1.444. In some other implementations, a different dopant, a combination of two or more dopants, or a different dopant concentration can be used to fabricate a core having a different index of refraction to provide a greater or smaller difference relative to the index of refraction of the cladding layers.
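As a worked illustration of what the example indices above imply (a sketch, not a requirement of any implementation), the index contrast between core and cladding sets the critical angle for total internal reflection and the numerical aperture of the step-index guide:

```python
import math

n_core, n_clad = 1.455, 1.444   # example indices from the preceding paragraph

delta_n = n_core - n_clad                                   # index contrast (~0.011)
numerical_aperture = math.sqrt(n_core**2 - n_clad**2)       # NA of the step-index guide
critical_angle = math.degrees(math.asin(n_clad / n_core))   # TIR angle at the core/cladding boundary

print(f"delta n            = {delta_n:.3f}")                # 0.011
print(f"numerical aperture = {numerical_aperture:.3f}")     # ~0.179
print(f"critical angle     = {critical_angle:.1f} deg")     # ~83.0 deg from the boundary normal
```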
In some implementations, to produce the gratings 340 for the portion 338 of the core (e.g., the gratings for the emitting portion 214 or the sensing portion 218), the core is etched prior to deposition of the upper cladding layer 336 to produce a series of gratings 340 and a series of notches or gaps 342 of specified spacings and of specified periods. Generally, the spacings of the gaps 342 and the differences in the indices of refraction of the gratings 340 and whatever material is in the gaps 342 (for example, the upper cladding layer material) result in the various transmission and reflection coefficients for the light incident on the gratings 340. For example, the length of each of the gratings 340 along the waveguide 330 can be in the range of tens to hundreds or thousands of nanometers while the gaps 342 between adjacent gratings 340 also can be in the range of tens to hundreds or thousands of nanometers. Such gratings may be referred to as nano-gratings. However, generally, the desired dimensions of the gratings 340 and the dimensions of the gaps 342 will depend on the wavelength (or wavelengths) of the light from the light source 208. For example, the length of each grating 340 or gap 342 can be approximately λ/2 where λ is the wavelength of the light used (for example, 850 nm). In some implementations, the gaps 342 are filled with the upper cladding layer 336 when the upper cladding layer 336 is deposited. In some other implementations, the gaps 342 can be filled with a different material. As described above, the gratings of the emitting portions 214 or the gratings of the sensing portions 218 also can be configured as or similar to Fiber-Bragg gratings. In some other implementations, the gratings can be formed from doping or otherwise periodically changing the material properties of the emitting and sensing portions 214 and 218. The turning portions 210 of the distributing waveguide 206 can be implemented with turning gratings (e.g., 45° turning gratings) produced similar to the gratings 340 but at an angle (e.g., 45°) relative to the length of the distributing waveguide 206 to reflect portions of the light propagating along the distributing waveguide 206 into the emitting waveguides 212. In some implementations, the turning portions 210 also can be configured as or similar to Fiber-Bragg gratings.
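The grating sizing rule stated above can be illustrated with simple arithmetic. The additional scaling by the core index shown below is an added assumption (a common refinement for gratings embedded in a medium) rather than a statement from the preceding paragraph:

```python
wavelength_nm = 850.0   # example source wavelength from the text
n_core = 1.455          # example core index from the preceding paragraph

grating_length_nm = wavelength_nm / 2.0                # lambda/2 rule of thumb from the text
in_medium_length_nm = wavelength_nm / (2.0 * n_core)   # assumed refinement: half-wavelength inside the core

print(f"grating/gap length (lambda/2): {grating_length_nm:.0f} nm")    # 425 nm
print(f"scaled by core index:          {in_medium_length_nm:.0f} nm")  # ~292 nm
```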
As described above, in some implementations, each emitting waveguide 212 may receive a different intensity of light than the other emitting waveguides (for example, in an implementation where the turning portions 210 each turn an equal percentage of the light received by the turning portion into the corresponding waveguide as described above). In some such implementations, the corresponding sensing waveguides 216 each receive a baseline amount of scattered light proportional to the amount of light received by the corresponding emitting waveguide 212. In some such implementations, the photo-detector array 220 can be configured to distinguish between the sensing waveguides 216 based on the amount of light sensed by the photo-detector array 220. That is, in some implementations, the photo-detector array 220 does not require a sub-photo-detector dedicated to each sensing waveguide 216. In some implementations, the threshold sensitivities (e.g., for registering a touch event) for each of the sensing waveguides 216 also can be different and proportional to the amount of light received by the corresponding emitting waveguide 212.
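The proportional-threshold idea described above can be sketched as follows, using assumed numbers: each sensing waveguide is assigned a touch threshold that is a fixed multiple of its own baseline scatter rather than a single global threshold. The scatter ratio and gain below are illustrative values, not parameters of any disclosed implementation.

```python
launched = [0.10, 0.09, 0.081, 0.0729]   # per-waveguide launch intensities (e.g., the 10% scheme)
baseline_scatter_ratio = 0.02            # assumed fraction of launched light seen with no touch
touch_gain = 5.0                         # assumed rise over baseline that registers a touch

baselines = [intensity * baseline_scatter_ratio for intensity in launched]
thresholds = [baseline * touch_gain for baseline in baselines]

def touched(waveguide_index, reading):
    """A touch is registered when the reading exceeds that waveguide's own threshold."""
    return reading >= thresholds[waveguide_index]

print([round(t, 5) for t in thresholds])   # [0.01, 0.009, 0.0081, 0.00729]
print(touched(3, 0.008))                   # True: a small absolute signal, but large for waveguide 3
```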
Depending on the arrangement and density of the sensing points 222, and the resolution desired, different detection and position location schemes can be utilized to determine a location of a touch event or a movement of a touch gesture. In the example shown in
One technique involves increasing the density of the sensing points 222 by reducing the spacing between adjacent pairs of emitting and sensing waveguides 212 and 216.
In such implementations, each sensing point 222 can be pre-associated with a particular x-coordinate position and a particular y-coordinate position in, for example, a memory of the device 100. To ensure that a touch event can be detected anywhere on or proximately over the optical sensing interface 104, the emitting waveguides 212 and the corresponding sensing waveguides 216 are arranged in a sufficiently dense periodic fashion along the length of the device 100 such that both an x- and a y-coordinate position of an object can be determined based on detection, by the photo-detector array 220, of light from a single sensing waveguide 216.
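A minimal sketch of the pre-associated coordinate table described above follows, with an assumed geometry: when each sensing waveguide carries exactly one sensing point, detecting light above threshold on a given waveguide immediately yields both stored coordinates for its sensing point. The positions, indices, and threshold are illustrative assumptions.

```python
# Assumed layout: sensing points staggered across the width (x) as the waveguide
# pairs repeat down the length (y) of the optical sensing interface.
SENSING_POINT_POSITIONS = {
    0: (5.0, 2.0),    # sensing waveguide index -> (x mm, y mm)
    1: (15.0, 4.0),
    2: (25.0, 6.0),
    3: (5.0, 8.0),
}

def locate_touches(detections, threshold=0.0073):
    """Return (x, y) positions for every waveguide whose reading exceeds the threshold."""
    return [SENSING_POINT_POSITIONS[index]
            for index, reading in detections.items()
            if reading >= threshold]

print(locate_touches({0: 0.0002, 1: 0.0004, 2: 0.0110, 3: 0.0003}))   # [(25.0, 6.0)]
```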
In some other implementations, the density of the sensing points can be increased by increasing the number of emitting portions 214 and sensing portions 218 along each of the emitting waveguides 212 and sensing waveguides 216, respectively.
In some such implementations, to obtain the x-coordinate position, the device 100 further includes a second optical sensing interface.
In some implementations, each of the sensing points 922 of the second optical sensing interface 904 along a corresponding pair of adjacent second emitting and second sensing waveguides 912 and 916, respectively, is associated with the same particular x-coordinate position. In some implementations, the sensing points 922 of the second optical sensing interface 904 are positioned directly over, or are positioned proximately offset from, the sensing points 222 of the first optical sensing interface 804. In this way, when an object, such as a finger, is suitably positioned on or over the optical sensing interfaces, the object is positioned on or over at least one sensing point of the first optical sensing interface 804 and at least one sensing point of the second optical sensing interface 904 such that the processor is configured to determine an x-coordinate position of the object based on information from the second optical sensing interface 904 and to determine a y-coordinate position of the object based on information from the first optical sensing interface 804.
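The two-layer position determination described above can be sketched as follows, with assumed coordinate tables: the first optical sensing interface resolves the y-coordinate, the second resolves the x-coordinate, and a single touch is located by pairing the strongest above-threshold detection from each layer. The tables, readings, and threshold are illustrative only.

```python
FIRST_LAYER_Y = {0: 2.0, 1: 4.0, 2: 6.0, 3: 8.0}      # first-layer sensing waveguide -> y (mm)
SECOND_LAYER_X = {0: 5.0, 1: 15.0, 2: 25.0, 3: 35.0}  # second-layer sensing waveguide -> x (mm)

def locate(first_readings, second_readings, threshold=0.005):
    """Pair the strongest above-threshold detection in each layer into one (x, y) position."""
    def best(readings):
        index, value = max(readings.items(), key=lambda item: item[1])
        return index if value >= threshold else None

    y_index, x_index = best(first_readings), best(second_readings)
    if y_index is None or x_index is None:
        return None   # no touch registered on one of the layers
    return (SECOND_LAYER_X[x_index], FIRST_LAYER_Y[y_index])

print(locate({0: 0.001, 1: 0.009, 2: 0.002, 3: 0.001},
             {0: 0.001, 1: 0.001, 2: 0.008, 3: 0.002}))   # (25.0, 4.0)
```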
In some other implementations, the second optical sensing interface 904 can share one or both of the light source 208 or photo-detector array 220 of the first optical sensing interface 804.
When an optical sensing interface such as those described above with reference to
The IMOD display device can include an array of IMOD display elements which may be arranged in rows and columns. Each display element in the array can include at least a pair of reflective and semi-reflective layers, such as a movable reflective layer (i.e., a movable layer, also referred to as a mechanical layer) and a fixed partially reflective layer (i.e., a stationary layer), positioned at a variable and controllable distance from each other to form an air gap (also referred to as an optical gap, cavity or optical resonant cavity). The movable reflective layer may be moved between at least two positions. For example, in a first position, i.e., a relaxed position, the movable reflective layer can be positioned at a distance from the fixed partially reflective layer. In a second position, i.e., an actuated position, the movable reflective layer can be positioned more closely to the partially reflective layer. Incident light that reflects from the two layers can interfere constructively and/or destructively depending on the position of the movable reflective layer and the wavelength(s) of the incident light, producing either an overall reflective or non-reflective state for each display element. In some implementations, the display element may be in a reflective state when unactuated, reflecting light within the visible spectrum, and may be in a dark state when actuated, absorbing and/or destructively interfering light within the visible range. In some other implementations, however, an IMOD display element may be in a dark state when unactuated, and in a reflective state when actuated. In some implementations, the introduction of an applied voltage can drive the display elements to change states. In some other implementations, an applied charge can drive the display elements to change states.
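For rough intuition only, the following sketch evaluates the simplest form of the interference condition described above. It ignores the phase shifts at the reflective layers and oblique incidence that a real IMOD design accounts for, and the example gap value is assumed rather than taken from this disclosure.

```python
def resonant_wavelengths_nm(gap_nm, visible=(380.0, 750.0)):
    """Visible wavelengths roughly satisfying 2*d = m*lambda for integer m (phase shifts ignored)."""
    peaks = []
    order = 1
    while True:
        wavelength = 2.0 * gap_nm / order
        if wavelength < visible[0]:   # shorter than the visible band; higher orders only get shorter
            break
        if wavelength <= visible[1]:
            peaks.append((order, round(wavelength, 1)))
        order += 1
    return peaks

# An assumed ~650 nm gap would reflect strongly near 650 nm (m = 2) and 433 nm (m = 3).
print(resonant_wavelengths_nm(650.0))   # [(2, 650.0), (3, 433.3)]
```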
The depicted portion of the array in
In
The optical stack 16 can include a single layer or several layers. The layer(s) can include one or more of an electrode layer, a partially reflective and partially transmissive layer, and a transparent dielectric layer. In some implementations, the optical stack 16 is electrically conductive, partially transparent and partially reflective, and may be fabricated, for example, by depositing one or more of the above layers onto a transparent substrate 20. The electrode layer can be formed from a variety of materials, such as various metals, for example indium tin oxide (ITO). The partially reflective layer can be formed from a variety of materials that are partially reflective, such as various metals (e.g., chromium and/or molybdenum), semiconductors, and dielectrics. The partially reflective layer can be formed of one or more layers of materials, and each of the layers can be formed of a single material or a combination of materials. In some implementations, certain portions of the optical stack 16 can include a single semi-transparent thickness of metal or semiconductor which serves as both a partial optical absorber and electrical conductor, while different, electrically more conductive layers or portions (e.g., of the optical stack 16 or of other structures of the display element) can serve to bus signals between IMOD display elements. The optical stack 16 also can include one or more insulating or dielectric layers covering one or more conductive layers or an electrically conductive/partially absorptive layer.
In some implementations, at least some of the layer(s) of the optical stack 16 can be patterned into parallel strips, and may form row electrodes in a display device as described further below. As will be understood by one having ordinary skill in the art, the term “patterned” is used herein to refer to masking as well as etching processes. In some implementations, a highly conductive and reflective material, such as aluminum (Al), may be used for the movable reflective layer 14, and these strips may form column electrodes in a display device. The movable reflective layer 14 may be formed as a series of parallel strips of a deposited metal layer or layers (orthogonal to the row electrodes of the optical stack 16) to form columns deposited on top of supports, such as the illustrated posts 18, and an intervening sacrificial material located between the posts 18. When the sacrificial material is etched away, a defined gap 19, or optical cavity, can be formed between the movable reflective layer 14 and the optical stack 16. In some implementations, the spacing between posts 18 may be approximately 1-1000 μm, while the gap 19 may be less than approximately 10,000 Angstroms (Å).
In some implementations, each IMOD display element, whether in the actuated or relaxed state, can be considered as a capacitor formed by the fixed and moving reflective layers. When no voltage is applied, the movable reflective layer 14 remains in a mechanically relaxed state, as illustrated by the display element 12 on the left in
The display device 40 includes a housing 41, a display 30, an antenna 43, a speaker 45, an input device 48 (which can be or which can include the optical sensing interface 104 described above), and a microphone 46. The housing 41 can be formed from any of a variety of manufacturing processes, including injection molding and vacuum forming. In addition, the housing 41 may be made from any of a variety of materials, including, but not limited to: plastic, metal, glass, rubber and ceramic, or a combination thereof. The housing 41 can include removable portions (not shown) that may be interchanged with other removable portions of different color, or containing different logos, pictures, or symbols.
The display 30 may be any of a variety of displays, including a bi-stable or analog display, as described herein. The display 30 also can be configured to include a flat-panel display, such as plasma, EL, OLED, STN LCD, or TFT LCD, or a non-flat-panel display, such as a CRT or other tube device. In addition, the display 30 can include an IMOD-based display, as described herein.
Some of the components of the display device 40 are schematically illustrated in
The network interface 27 includes the antenna 43 and the transceiver 47 so that the display device 40 can communicate with one or more devices over a network. The network interface 27 also may have some processing capabilities to relieve, for example, data processing requirements of the processor 21. The antenna 43 can transmit and receive signals. In some implementations, the antenna 43 transmits and receives RF signals according to the IEEE 16.11 standard, including IEEE 16.11(a), (b), or (g), or the IEEE 802.11 standard, including IEEE 802.11a, b, g, n, and further implementations thereof. In some other implementations, the antenna 43 transmits and receives RF signals according to the Bluetooth® standard. In the case of a cellular telephone, the antenna 43 can be designed to receive code division multiple access (CDMA), frequency division multiple access (FDMA), time division multiple access (TDMA), Global System for Mobile communications (GSM), GSM/General Packet Radio Service (GPRS), Enhanced Data GSM Environment (EDGE), Terrestrial Trunked Radio (TETRA), Wideband-CDMA (W-CDMA), Evolution Data Optimized (EV-DO), 1xEV-DO, EV-DO Rev A, EV-DO Rev B, High Speed Packet Access (HSPA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Evolved High Speed Packet Access (HSPA+), Long Term Evolution (LTE), AMPS, or other known signals that are used to communicate within a wireless network, such as a system utilizing 3G, 4G or 5G technology. The transceiver 47 can pre-process the signals received from the antenna 43 so that they may be received by and further manipulated by the processor 21. The transceiver 47 also can process signals received from the processor 21 so that they may be transmitted from the display device 40 via the antenna 43.
In some implementations, the transceiver 47 can be replaced by a receiver. In addition, in some implementations, the network interface 27 can be replaced by an image source, which can store or generate image data to be sent to the processor 21. The processor 21 can control the overall operation of the display device 40. The processor 21 receives data, such as compressed image data from the network interface 27 or an image source, and processes the data into raw image data or into a format that can be readily processed into raw image data. The processor 21 can send the processed data to the driver controller 29 or to the frame buffer 28 for storage. Raw data typically refers to the information that identifies the image characteristics at each location within an image. For example, such image characteristics can include color, saturation and gray-scale level.
The processor 21 can include a microcontroller, CPU, or logic unit to control operation of the display device 40. The conditioning hardware 52 may include amplifiers and filters for transmitting signals to the speaker 45, and for receiving signals from the microphone 46. The conditioning hardware 52 may be discrete components within the display device 40, or may be incorporated within the processor 21 or other components.
The driver controller 29 can take the image data generated by the processor 21 either directly from the processor 21 or from the frame buffer 28 and can re-format the image data appropriately for high speed transmission to the array driver 22. In some implementations, the driver controller 29 can re-format the image data into a data flow having a raster-like format, such that it has a time order suitable for scanning across the display array 30. Then the driver controller 29 sends the formatted information to the array driver 22. Although a driver controller 29, such as an LCD controller, is often associated with the system processor 21 as a stand-alone Integrated Circuit (IC), such controllers may be implemented in many ways. For example, controllers may be embedded in the processor 21 as hardware, embedded in the processor 21 as software, or fully integrated in hardware with the array driver 22.
The array driver 22 can receive the formatted information from the driver controller 29 and can re-format the video data into a parallel set of waveforms that are applied many times per second to the hundreds, and sometimes thousands (or more), of leads coming from the display's x-y matrix of display elements.
In some implementations, the driver controller 29, the array driver 22, and the display array 30 are appropriate for any of the types of displays described herein. For example, the driver controller 29 can be a conventional display controller or a bi-stable display controller (such as an IMOD display element controller). Additionally, the array driver 22 can be a conventional driver or a bi-stable display driver (such as an IMOD display element driver). Moreover, the display array 30 can be a conventional display array or a bi-stable display array (such as a display including an array of IMOD display elements). In some implementations, the driver controller 29 can be integrated with the array driver 22. Such an implementation can be useful in highly integrated systems, for example, mobile phones, portable-electronic devices, watches or small-area displays.
In some implementations, the input device 48 can be configured to allow, for example, a user to control the operation of the display device 40. In addition to including an optical sensing interface as described above, the input device 48 also can collectively include a keypad, such as a QWERTY keyboard or a telephone keypad, a button, a switch, a rocker, or a pressure- or heat-sensitive membrane. The microphone 46 can be configured as an input device for the display device 40. In some implementations, voice commands through the microphone 46 can be used for controlling operations of the display device 40.
The power supply 50 can include a variety of energy storage devices. For example, the power supply 50 can be a rechargeable battery, such as a nickel-cadmium battery or a lithium-ion battery. In implementations using a rechargeable battery, the rechargeable battery may be chargeable using power coming from, for example, a wall socket or a photovoltaic device or array. Alternatively, the rechargeable battery can be wirelessly chargeable. The power supply 50 also can be a renewable energy source, a capacitor, or a solar cell, including a plastic solar cell or solar-cell paint. The power supply 50 also can be configured to receive power from a wall outlet.
In some implementations, control programmability resides in the driver controller 29 which can be located in several places in the electronic display system. In some other implementations, control programmability resides in the array driver 22. The above-described optimization may be implemented in any number of hardware and/or software components and in various configurations.
As used herein, a phrase referring to “at least one of” a list of items refers to any combination of those items, including single members. As an example, “at least one of: a, b, or c” is intended to cover: a, b, c, a-b, a-c, b-c, and a-b-c.
The various illustrative logics, logical blocks, modules, circuits and algorithm steps described in connection with the implementations disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. The interchangeability of hardware and software has been described generally, in terms of functionality, and illustrated in the various illustrative components, blocks, modules, circuits and steps described above. Whether such functionality is implemented in hardware or software depends upon the particular application and design constraints imposed on the overall system.
The hardware and data processing apparatus used to implement the various illustrative logics, logical blocks, modules and circuits described in connection with the aspects disclosed herein may be implemented or performed with a general purpose single- or multi-chip processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, or any conventional processor, controller, microcontroller, or state machine. A processor also may be implemented as a combination of computing devices, such as a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. In some implementations, particular steps and methods may be performed by circuitry that is specific to a given function.
In one or more aspects, the functions described may be implemented in hardware, digital electronic circuitry, computer software, firmware, including the structures disclosed in this specification and their structural equivalents, or in any combination thereof. Implementations of the subject matter described in this specification also can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on computer storage media for execution by, or to control the operation of, data processing apparatus.
Various modifications to the implementations described in this disclosure may be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other implementations without departing from the spirit or scope of this disclosure. Thus, the claims are not intended to be limited to the implementations shown herein, but are to be accorded the widest scope consistent with this disclosure, the principles and the novel features disclosed herein. Additionally, a person having ordinary skill in the art will readily appreciate that the terms “upper” and “lower” are sometimes used for ease of describing the figures, and indicate relative positions corresponding to the orientation of the figure on a properly oriented page, and may not reflect the proper orientation of, e.g., an IMOD display element as implemented.
Certain features that are described in this specification in the context of separate implementations also can be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation also can be implemented in multiple implementations separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
Similarly, while operations are depicted in the drawings in a particular order, a person having ordinary skill in the art will readily recognize that such operations need not be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. Further, the drawings may schematically depict one or more example processes in the form of a flow diagram. However, other operations that are not depicted can be incorporated in the example processes that are schematically illustrated. For example, one or more additional operations can be performed before, after, simultaneously, or between any of the illustrated operations. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the implementations described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products. Additionally, other implementations are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results.