The subject matter of the present disclosure relates to an electronic device having a display and a surrounding touch sensitive bezel for user interface and control.
There exist today many types of hand-held electronic devices, each of which utilizes some sort of user interface. The user interface typically includes an output device in the form of a display, such as a Liquid Crystal Display (LCD), and one or more input devices, which can be mechanically actuated (e.g., switches, buttons, keys, dials, joysticks, joy pads) or electrically activated (e.g., touch pads or touch screens). The display is typically configured to present visual information such as text and graphics, and the input devices are typically configured to perform operations such as issuing commands, making selections, or moving a cursor or selector of the electronic device. Each of these well-known devices has considerations such as size and shape limitations, costs, functionality, complexity, etc. that must be taken into account when designing the hand-held electronic device. In most cases, the user interface is positioned on the front face (or front surface) of the hand-held device for easy viewing of the display and easy manipulation of the input devices.
To elaborate, the telephone 10A typically includes a display 12 such as a character or graphical display, and input devices 14 such as a number pad and in some cases a navigation pad. The PDA 10B typically includes a display 12 such as a graphical display, and input devices 14 such as a stylus-based resistive touch screen and buttons. The media player 10C typically includes a display 12 such as a character or graphic display and input devices 14 such as buttons or wheels. The iPod® media player manufactured by Apple Computer, Inc. of Cupertino, Calif. is one example of a media player that includes both a display and input devices disposed next to the display. The remote control 10D typically includes an input device 14 such as a keypad and may or may not have a character display 12. The camera 10E typically includes a display 12 such as a graphic display and input devices 14 such as buttons. The GPS module 10F typically includes a display 12 such as a graphic display and input devices 14 such as buttons, and in some cases a joy pad.
Such prior art devices 10A-10F often employ a user interface in conjunction with the display 12 and input device 14. In one example,
The electronic device 20 has a housing 22 that contains the display 24 and the input device 26. The input device 26 typically requires a number of components, such as pressure pads, printed circuit board, integrated circuits, etc. Accordingly, the housing 22 for the electronic device 20 must typically be extended or enlarged beyond the size of the display 24 so that the electronic device 20 can accommodate the components of the input device 26. Consequently, due to the required components for the input device 26, the size of the housing 22 may in some cases be larger than is actually required to house just the display 24 and any other necessary components (i.e., processor, memory, power supply, etc.) for the device 20. In addition, placement of the display 24 and the input device 26 typically accommodates only one orientation of the device 20 when held by a user.
In another example,
In yet another example,
Recently, traditionally separate hand-held electronic devices have begun to be combined in limited ways. For example, the functionalities of a telephone have been combined with the functionalities of a PDA. One problem that has been encountered is in the way inputs are made into the device. Each of these devices has a particular set of input mechanisms or devices for providing inputs into the device. Some of these input mechanisms are generic to all the devices (e.g., power button) while others are not. The ones that are not generic are typically dedicated to a particular functionality of the device. By way of example, PDAs typically include a touch screen and a few dedicated buttons while cell phones typically include a numeric keypad and at least two dedicated buttons.
Thus, it is a challenge to design a device with limited input mechanisms without adversely affecting the numerous possible functions that the device can perform. As will be appreciated, it is preferable not to overload the electronic devices with a large number of input mechanisms as this tends to confuse the user and to take up valuable space, i.e., “real estate.” In the case of hand-held devices, space is at a premium because of their small size. At some point, there is not enough space on the device to house all the necessary buttons and switches, etc. This is especially true when considering that all these devices need a display that typically takes up a large amount of space on its own. To increase the number of input devices beyond some level, designers would have to decrease the size of the display. However, this will often leave a negative impression on the user because the user typically desires the largest display possible. Alternatively, to accommodate more input devices designers may opt to increase the size of the device. This, too, will often leave a negative impression on a user because it would make one-handed operations difficult, and at some point, the size of the device becomes so large that it is no longer considered a hand-held device.
Therefore, what is needed in the art is an improved user interface that works for multi-functional hand-held devices.
An electronic device has a display and has a touch sensitive bezel surrounding the display. Areas on the bezel are designated for controls used to operate the electronic device. Visual guides corresponding to the controls are displayed on the display adjacent the areas of the bezel designated for the controls. Touch data is generated by the bezel when a user touches an area of the bezel. The device determines which of the controls has been selected based on which designated area is associated with the touch data from the bezel. The device then initiates the determined control. The device can also have a sensor for determining the orientation of the device. Based on the orientation, the device can alter the areas designated on the bezel for the controls and can alter the location of the visual guides for the display so that they match the altered areas on the bezel if the orientation of the device has changed.
The foregoing summary is not intended to summarize each potential embodiment or every aspect of the present disclosure.
The foregoing summary, preferred embodiments, and other aspects of subject matter of the present disclosure will be best understood with reference to a detailed description of specific embodiments, which follows, when read in conjunction with the accompanying drawings, in which:
While the subject matter of the present disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and are herein described in detail. The figures and written description are not intended to limit the scope of the inventive concepts in any manner. Rather, the figures and written description are provided to illustrate the inventive concepts to a person skilled in the art by reference to particular embodiments, as required by 35 U.S.C. § 112.
Co-pending U.S. patent application Ser. No. 11/426,078, which has been incorporated herein by reference in its entirety, discloses electronic devices capable of configuring user inputs based on how the devices are to be used. The electronic devices may be multi-functional hand-held devices. The electronic devices have a user interface that requires no (or at most only a few) physical buttons, keys, or switches so that the display size of the electronic devices can be substantially increased. Preferably, the electronic devices eliminate such physical buttons, keys, or switches from a front surface of the electronic device so that additional surface area becomes available for a larger display on the electronic device. Ultimately, this strategy allows the electronic device to house a substantially full screen display. As used herein, a full screen display is a display that consumes, or at least dominates, a surface (e.g., a front surface) of the electronic device.
The hand-held device 50 may be constructed with only cross-functional physical buttons, i.e., there are no buttons dedicated to individual device functionalities. These types of buttons may include power buttons and hold switches. In another embodiment, the hand-held device 50 may not include any physical buttons at all. In some embodiments, the physical buttons are limited to only the sides 56 and back surface 58 of the hand-held device 50. In other embodiments, the physical buttons of the hand-held device 50 are limited to the upper and lower portion of the sides 56 so that there are no buttons in the regions of the sides 56 where a user would physically support the device 50 (i.e., holding region). In still other embodiments, the physical buttons may be located on the front surface 54, but only in the bezel 55 surrounding the display 60. In some embodiments, the buttons may be located on only the top and bottom surfaces 57 of the device 50.
As shown in the embodiment of
In some cases, it may be desirable to place buttons in the upper or lower regions of the side surfaces 56 out of the way of the grasping hand of the user. This may be particularly well suited when the housing 52 of the device 50 is elongated more than the standard width of a user's grasping hand. As shown in
As discussed above, the touch screen display 60 typically works in conjunction with a GUI presented on the display 60. The GUI shows user controls on the touch screen display 60, which in turn responds to user touches made in regions of the touch screen display 60 corresponding to the displayed user controls. The entire touch screen display 60 or only a portion may be used to show the user controls. Referring to
On the other hand, the control region 74 virtually represents those physical controls 76 that would normally be physically placed on a particular type of electronic device. That is, the virtual controls 76 displayed in the control region 74 essentially mimic physical controls for a particular type of device. For example, when the device 50 is operated with a PDA functionality, the control region 74 may include virtual representations of a hand writing recognition area, a navigation pad, and the standard function buttons. The standard and control regions 72 and 74 can be positioned at any position on the display 60 (top, bottom, sides, center, etc.). For example, as shown in
In another example,
In the embodiments of
The touch sensitive surfaces may be located on any surface of the housing 52, any side of the housing 52, any portion of any side of the housing 52, or at dedicated locations on the surface of the housing 52. For example, the touch sensitive surfaces may be located on the sides 56 or back surface 58 of the housing 52 and may even be located at the bezel (55;
The touch sensitive surfaces of the housing 52 may take the form of one or more touch panels that are positioned within the housing 52. The touch sensitive surfaces may alternatively or additionally be provided directly by the housing 52. That is, the touch sensing components of the touch sensitive surfaces may be integrated into, incorporated into, or disposed underneath the housing 52 such that the housing 52 itself is touch sensitive and forms part of the touch sensitive surfaces (rather than using a separate touch panel). Similar to a touch screen, such touch sensitive surfaces recognize touches and the positions of the touches on the surfaces. The electronic device 50 has circuitry (not shown), which can include a controller or the like, and the circuitry interprets the touches and thereafter performs actions based on the touch events. Touch sensitive surfaces can be constructed in the same manner as a touch screen, except the surfaces need not be substantially transparent. By way of example, the touch sensitive surfaces for the electronic device 50 may generally correspond to the touch sensitive housing described in detail in U.S. patent application Ser. No. 11/115,539, entitled “Hand-Held Electronic Device with Multiple Touch Sensing Devices,” filed Apr. 26, 2005, which has been incorporated herein by reference in its entirety.
Having a display 60 that encompasses almost the entire front surface 54 of the housing 52 of the electronic device 50 has several advantages discussed herein. In addition, having one or more touch sensitive surfaces on various portions of the housing 52 that allows a user to control the electronic device 50 can also provide several advantages discussed herein. As alluded to above, one or more touch sensitive surfaces can be located on the bezel 55 (i.e., the portion of the front surface 54 of the housing 52 that surrounds the display 60). Turning then to
The electronic device 100 includes a housing 102 and a display 110. The housing 102 holds the display 110, which can be any conventional display known and used in the art for electronic devices. Some common examples for the display 110 include a Liquid Crystal display (LCD), an electroluminescent display, and a touch screen display. The housing 102 also holds the touch sensitive bezel 120, which is positioned substantially around the perimeter of the display 110. (In the present embodiment, the bezel 120 is positioned entirely around the perimeter of the display 110 so that the bezel 120 essentially frames the display 110.) The housing 102 of the electronic device 100 also contains electronic components that provide a number of operations and features, such as memory access, communications, sound, power, etc. In addition, the electronic device 100 houses electronic components (discussed in more detail below) that are used to control operation of the display 110 and the bezel 120.
In one example, the electronic device 100 can be a picture frame having memory for storing digital pictures and for viewing on the display 110. In another example, the electronic device 100 can be a digital media device having the display 110, the touch sensitive bezel 120, and lacking most or all buttons or similar physical controls on the housing 102. In other examples, the electronic device 100 can be an electronic game, a personal digital assistant, a multimedia device, a cellular telephone, a portable video player, a portable navigation device, or the like.
The bezel 120 is touch sensitive and is used to obtain touch data from the user in response to touch events made by the user on the bezel 120. The electronic device 100 uses the touch data obtained with the bezel 120 to perform various operations and functions related to user interface and user control of the device 100. For example, the touch data obtained with the bezel 120 can control what is displayed with the device 100, what files are played, what the volume level is, what the settings for the display 110 are, etc.
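By way of a non-limiting illustration, the general flow just described can be sketched in a few lines of code: areas of the bezel are designated for user controls, and incoming touch data is matched against those designated areas to determine which control to initiate. The names used here (Rect, BezelController, designate, control_for_touch) are illustrative assumptions and do not appear in the disclosure.

```python
# Minimal sketch (an assumption, not the disclosure's implementation) of the
# bezel control flow: designate areas for controls, then match touch data
# against the designated areas to decide which control is invoked.

from dataclasses import dataclass

@dataclass
class Rect:
    x: int
    y: int
    w: int
    h: int

    def contains(self, px, py):
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

class BezelController:
    def __init__(self):
        self.areas = {}  # control name -> designated bezel area

    def designate(self, control, area):
        self.areas[control] = area  # area of the bezel assigned to this control

    def control_for_touch(self, px, py):
        # Determine which designated area is associated with the touch data.
        for control, area in self.areas.items():
            if area.contains(px, py):
                return control
        return None

bezel = BezelController()
bezel.designate("volume_up", Rect(0, 0, 40, 120))
print(bezel.control_for_touch(10, 50))  # -> "volume_up"
```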
A number of techniques can be used to obtain touch data with the touch sensitive bezel 120. In one embodiment, at least a portion of the bezel 120 includes a multi-touch input surface capable of generating touch data for a plurality of touch events made by the user simultaneously at different locations of the bezel 120. For example, the bezel 120 can include a capacitive sensor array and data acquisition circuitry for detecting when a user touches areas or locations on the bezel 120. The capacitive sensor array and data acquisition circuitry can be similar to those disclosed in U.S. patent application Ser. No. 10/949,060, filed Sep. 24, 2004 and entitled “Raw Data Track Pad Device and System,” which is incorporated herein by reference in its entirety. An example of such an embodiment for the bezel 120 is discussed below with reference to
In another embodiment, at least a portion of the bezel 120 includes a plurality of resistive or capacitive sensors and an integrated circuit for analyzing resistive or capacitive values caused by a user touching the bezel 120. An example of an embodiment for such a bezel 120 is discussed below with reference to
During operation of the device 100, areas or locations of the bezel 120 are designated for various user controls of the device 100. In one embodiment, particular user controls designated for areas of the bezel 120 may be indicated directly on the bezel 120 itself using graphics, words, or the like. In such an embodiment, the user controls having indications directly on the bezel 120 may be fixed and may be those user controls that a user would typically use when operating the device 100 in any of the possible modes or functionalities of the device 100. In another embodiment, particular user controls designated for areas of the bezel 120 may not have any visual indications appearing directly on the bezel 120 itself. Instead, the designated user controls may be in a logical or predetermined location on the bezel 120 that the user may know or expect.
In yet another embodiment, the electronic device 100 has user interface software or an application for displaying icons, menu items, pictures, or words (referred to herein as “visual guides”) 180 on the display 110. The visual guides 180 correspond to the user controls designated for areas or locations of the bezel 120 and are shown on the display 110 adjacent designated areas on the bezel 120. By way of example, the visual guides 180 in
As shown in
During operation, the user can touch designated areas (e.g., outlined area 121) on the bezel 120 to initiate user controls for the electronic device 100. Some examples of possible user controls include menu operations, cursor operations, and data entry operations. The user interface software operating on the display 110 shows the visual guides 180 in positions adjacent the areas 121 on the bezel 120 designated to perform the user controls so the user may know the general area of the bezel 120 designated for the corresponding user control indicated by the adjacent visual guide 180. The designated areas 121 can be arbitrarily positioned and sized around the bezel 120 depending on the context or content of what is being displayed. The number of distinct areas 121 that can be designated depends on the size of the display 110 and the bezel 120 and depends on what type of touch sensitive sensors are used for the touch sensitive bezel 120. In one example, one edge of the bezel 120 that is about 4 to 5 inches in length may accommodate about one-hundred distinct areas that can be designated for user controls.
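As a further non-limiting illustration, the placement of a visual guide adjacent its designated bezel area might be computed as in the following sketch; the display dimensions, the margin, and the helper name guide_position are assumptions made only for illustration.

```python
# Hedged sketch: position a visual guide just inside the display edge that is
# adjacent to a designated bezel area. All dimensions are illustrative.

DISPLAY_W, DISPLAY_H = 320, 240  # display size framed by the bezel (assumed)

def guide_position(area_side, offset_along_edge):
    """Return display coordinates for a guide adjacent to a bezel area.

    area_side: which edge of the bezel the designated area lies on.
    offset_along_edge: position of the area along that edge, in pixels.
    """
    margin = 8  # draw the guide a few pixels inside the display edge
    if area_side == "top":
        return (offset_along_edge, margin)
    if area_side == "bottom":
        return (offset_along_edge, DISPLAY_H - margin)
    if area_side == "left":
        return (margin, offset_along_edge)
    return (DISPLAY_W - margin, offset_along_edge)  # "right"

print(guide_position("left", 100))  # guide shown beside a left-bezel area
```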
In a further embodiment, the electronic device 100 may be capable of rotation and may have an orientation sensor (discussed in more detail below) for determining the orientation of the device 100. Based on the sensed orientation, the areas 121 on the bezel 120 designated for the user controls can be altered or relocated to match the current orientation of the device 100. Likewise, the user interface software operating on the device 100 can alter the location of the visual guides 180 to match the current position of the areas 121 on the bezel 120 designated for the user controls.
Now that details related to the electronic device 100, display 110, and bezel 120 for user interface and control have been discussed above in
Referring to
The electronic device 200 includes a housing (not shown), a display 210, display circuitry 212, a capacitive sensor array 220, data acquisition circuitry 230, and processing circuitry 240. The display 210 is positioned on the housing (not shown) and has a perimeter. The capacitive sensor array 220 is also positioned on the housing (not shown) and is positioned substantially around the perimeter of the display 210 so that the capacitive sensor array 220 forms part of a bezel for the display 210. The data acquisition circuitry 230 is coupled to the capacitive sensor array 220 and is used to acquire touch data from the array 220. The processing circuitry 240 is coupled to the data acquisition circuitry 230 and to the display 210.
As will be explained in more detail below, the processing circuitry 240 is configured to obtain touch data from the data acquisition circuitry 230, determine if at least one user control is invoked by the obtained touch data, and initiate at least one operation for the electronic device 200 based on the determined user control. The processing circuitry 240 includes a processor 250 and includes one or more software/firmware components that operate on the processor 250. These components include a display driver 251, a sensor driver 253, and system and/or user applications 260.
The applications 260 have one or more user controls 262 that a user can invoke by touching one or more areas of the capacitive sensor array 220 in order to change or control operation of the electronic device 200. To determine which user control 262 is invoked, the processing circuitry 240 designates one or more areas 221 of the capacitive sensor array 220 for the one or more user controls 262 of the applications 260. Then, when a user touches one or more areas 221 of the capacitive sensor array 220, the data acquisition circuitry 230 provides touch data to the processing circuitry 240. The capacitive sensor array 220 and data acquisition circuitry 230 are preferably capable of generating touch data that describes more than one simultaneously touched area 221 on the bezel so that the touch data can cover instances when the user touches one area only, touches more than one area simultaneously, or makes a pattern or gesture of touches on the array 220.
In turn, the processing circuitry 240 compares the obtained touch data to the one or more designated areas and determines which of the user controls 262 has been invoked by the user. The comparison of obtained touch data to the designated areas 221 may involve different levels of processing. In one level of processing, the processing circuitry 240 compares the location (e.g., rows and columns) that the obtained touch data occurred on the array 220 to the designated areas 221 for the user controls. If the obtained touch data occurs in the designated area 221 for a satisfactory time period or over an average extent of the area 221, for example, then the processing circuitry 240 determines that the corresponding user control has been invoked.
In other levels of processing, the obtained touch data can include one or more locations (e.g., rows and columns) being touched on the array 220, can include touch data obtained over an interval of time, can include changes in touch data over time, and can include other “aggregate” forms of touch data. In these levels of processing, the processing circuitry 240 recognizes “touch gestures” from the touch data and determines which control is invoked by the “touch gesture.” Some examples of “touch gestures” include a single “tap” at a location of the array 220, two or more sequential taps made at substantially the same location of the array 220 within predefined intervals of one another, touch events occurring substantially simultaneously at two or more locations of the array 220, sliding touches of one or more fingers by the user over the surface of the array 220, sustained touch at one location of the array 220 in conjunction with sliding or tapping touches at other locations of the array 220, and other combinations of the above.
To recognize such “touch gestures,” one or more areas 221 of the array 220 are associated with a control 262, and touch gestures involving one or more touches on those areas 221 are associated with the control 262. The touch gesture can be a single momentary tap, a sustained touch, two or more sequential taps, a sweep of a finger, and any other possible touch arrangement. To then determine if the control 262 has been invoked, the processing circuitry 240 determines if the touch data includes those areas 221 associated with the control 262 and determines from the touch data if the touch gesture associated with the control 262 has occurred on those areas 221.
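The following sketch illustrates one way such gesture matching could be implemented; the event format and the timing thresholds (TAP_MAX, DOUBLE_GAP, HOLD_MIN) are illustrative assumptions rather than values from the disclosure.

```python
# Hedged sketch of gesture classification for touches within one bezel area:
# a control is invoked only if the touch data matches the gesture assigned
# to that area. Thresholds are illustrative assumptions.

def classify_gesture(events):
    """events: list of (start_time, duration) touches in one bezel area."""
    TAP_MAX = 0.25     # a "tap" is a touch shorter than this (seconds)
    DOUBLE_GAP = 0.40  # max interval between two sequential taps
    HOLD_MIN = 1.0     # a sustained touch lasts at least this long
    if len(events) >= 2:
        (t1, d1), (t2, d2) = events[-2], events[-1]
        if d1 < TAP_MAX and d2 < TAP_MAX and (t2 - t1) < DOUBLE_GAP:
            return "double_tap"
    if events:
        _, d = events[-1]
        if d >= HOLD_MIN:
            return "sustained_touch"
        if d < TAP_MAX:
            return "tap"
    return None

print(classify_gesture([(0.0, 0.1), (0.3, 0.1)]))  # -> "double_tap"
```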
Turning from discussion of the capacitive sensor array 220, the processing circuitry 240 is also operatively connected to the display 210 by display circuitry 212. The display driver 251 is used to configure visual data (e.g., content, screens, user interface elements, visual guides, etc.) and to send or present the visual data to the display circuitry 212. The electronic device 200 preferably presents one or more visual guides 280 along the perimeter of the display 210. In addition, the one or more visual guides 280 are preferably displayed at locations on the display 210 adjacent to corresponding areas 221 of the capacitive sensor array 220 designated for the one or more user controls 262 associated with the one or more visual guides 280.
Given the overview of the electronic device 200 discussed above, we now turn to a more detailed discussion of the components of the electronic device 200 of the present embodiment. The capacitive sensor array 220 includes a plurality of rows and columns of capacitive sensors, and the array 220 may or may not be symmetrical. The rows and columns of the array 220 are positioned around the perimeter of the display 210.
The data acquisition circuit 230 includes multiplexer (“MUX”) circuitry coupled to the sensor array 220. In particular, two multiplexer circuits 232-1 and 232-2 (referred to as the MUX-1 and MUX-2 circuits) are coupled to the rows of the sensor array 220. Each row in the sensor array 220 can be electrically coupled to a reference voltage Vcc through the MUX-1 circuit 232-1 and can be electrically coupled to a storage capacitor 236 through the MUX-2 circuit 232-2. While not shown in detail, each column of sensor array 220 can be similarly coupled to a reference voltage Vcc and to a storage capacitor using column MUX circuits 234. Thus, a user touching a location or area 221 of the sensor array 220 can alter the capacitance measured at affected rows and columns of the array 220.
During operation, the MUX circuits 232 and 234 are responsible for coupling and stimulating successive elements of the sensor array 220 (e.g., rows, columns, or individual pixels—that is, an element at the intersection of a row and column) to the storage capacitor 236 in a controlled/sequenced manner and for indicating to the scan circuit 238 that a measurement cycle has begun. When the charge on the storage capacitor 236 reaches a specified value or threshold, the scan circuit 238 records the time required to charge the storage capacitor 236 to the specified threshold. Consequently, the scan circuit 238 provides a digital value that is a direct indication of the capacitance of the selected element of the sensor array 220.
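The measurement principle can be illustrated with a simple charge-sharing simulation: each cycle charges the sensor element to the reference voltage and shares its charge with the storage capacitor, and the number of cycles needed to reach the threshold stands in for the recorded charge time as a digital indication of capacitance. The component values below are illustrative assumptions, not values from the disclosure.

```python
# Hedged sketch of a charge-transfer capacitance measurement: count the
# charge-sharing cycles needed for the storage capacitor to cross a
# threshold. A larger sensor capacitance (e.g., due to a touch) transfers
# more charge per cycle and so needs fewer cycles. Values are illustrative.

def cycles_to_threshold(c_sensor, c_store=10e-9, vcc=3.3, v_th=2.0):
    v_store, cycles = 0.0, 0
    while v_store < v_th:
        # Charge sharing: sensor cap charged to Vcc dumps into the storage cap.
        v_store = (c_store * v_store + c_sensor * vcc) / (c_store + c_sensor)
        cycles += 1
    return cycles

untouched = cycles_to_threshold(c_sensor=10e-12)
touched = cycles_to_threshold(c_sensor=15e-12)  # a touch adds capacitance
print(untouched, touched)  # the change in count reveals the touch
```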
The sensor driver 253 obtains measured capacitance data from the acquisition circuitry 230. In turn, the sensor driver 253 processes the measured capacitance data and configures a corresponding control, command, operation, or other function designated by the row and column location of the capacitance data. Depending on what application, content, or the like is currently operating, the system and/or user applications 260 implement the corresponding user control 262. Implementation may affect what is currently being displayed on the display 210. Consequently, the display driver 251 may operate the display circuitry 212 coupled to the display 210 in response to an implemented control 262. For example, a new menu may be presented on the display 210 in response to an implemented user control 262.
As shown in
In another embodiment, the one or more sensors 270 can include one or more ambient light sensors for detecting the level of ambient light around the device 200. Preferably, the device 200 includes at least two such ambient light sensors 270 for redundancy. Based on the level of ambient light detected, the electronic device 200 can automatically adjust the contrast and/or brightness of the display 210 accordingly. In yet another embodiment, the one or more sensors 270 can include a motion sensor, such as a passive pyroelectric sensor. The motion sensor 270 can be used to detect motion of the electronic device 200 from a stationary state so that the device 200 can “wake up” (e.g., turn on or come out of a standby mode) or can show previously hidden visual guides 280 on the display 210 in response to being moved.
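As a simple illustration of the ambient light adjustment, the following sketch maps the readings of two redundant light sensors to a display brightness level; the mapping, units, and function name are illustrative assumptions.

```python
# Hedged sketch: read two redundant ambient light sensors and derive a
# display brightness setting. The mapping is an illustrative assumption.

def display_brightness(sensor_a, sensor_b, max_lux=1000):
    ambient = max(sensor_a, sensor_b)       # trust the brighter, presumably
    ambient = min(ambient, max_lux)         # unobstructed, of the two readings
    return 0.2 + 0.8 * (ambient / max_lux)  # brightness in [0.2, 1.0]

print(display_brightness(sensor_a=300, sensor_b=50))  # dim room -> dim screen
```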
Referring to
While the MUX-2 circuitry 232-2 couples the selected sensor row to the storage capacitor 236, a determination is made whether the storage capacitor's voltage reaches a specified threshold (Block 308). If so (i.e., the “Yes” prong of Block 308), the digital value corresponding to the time it took to charge the storage capacitor 236 to the specified threshold is recorded by the scan circuit 238 (Block 310). If the storage capacitor's voltage does not reach the specified threshold during the time that the MUX-2 circuitry 232-2 couples the selected sensor row to the storage capacitor 236 (i.e., the “No” prong of Block 308), then the acts of Blocks 302-308 are repeated.
Once a digital value corresponding to the capacitance of the selected row has been obtained (Block 310), a check is made to see if there are additional rows in the sensor array 220 that need to be sampled. If more rows need to be sampled, the process 300 goes to the next row at Block 314 and repeats the acts of Blocks 302-308. If it is determined at Block 312 that all the rows in the sensor array 220 have been sampled in accordance with Blocks 302-308, a similar sampling process is used to acquire a capacitance value for each column in the sensor array 220 (Block 316). Once all rows and all columns have been processed, the entire process 300 is repeated, which can be done at a predetermined interval (Block 318).
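The row-then-column sampling loop of process 300 might be sketched as follows; the measure() function stands in for the MUX selection and charge-to-threshold measurement of Blocks 302-310, and its name and signature are assumptions made only for illustration.

```python
# Hedged sketch of the sampling process: measure every row, then every
# column, and repeat the whole scan at a predetermined interval.

import time

def scan_array(n_rows, n_cols, measure, interval_s=0.02):
    while True:
        frame = {
            "rows": [measure("row", r) for r in range(n_rows)],
            "cols": [measure("col", c) for c in range(n_cols)],
        }
        yield frame             # one complete row-and-column sample
        time.sleep(interval_s)  # repeat at a predetermined interval

# Example with a stand-in measurement function returning a constant:
fake_measure = lambda kind, index: 0
frames = scan_array(8, 6, fake_measure)
print(next(frames)["rows"][:3])  # first sampled frame, first three rows
```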
Referring to
The electronic device 200 then obtains touch data with the capacitive sensor array 220 using the techniques disclosed herein (Block 408). The touch data in a basic form includes information of which rows and columns of the capacitive sensor array 220 have been touched (i.e., have exceeded the capacitance threshold). The electronic device 200 performs various forms of processing of the obtained touch data. For example, the touch data can be processed to determine how long the rows and columns have reached a threshold capacitance, to determine how long rows and columns have been below the threshold capacitance since an initial period of being above the threshold, and to determine other forms of information. Furthermore, to facilitate processing, the touch data can be aggregated together into predefined intervals of time and portions of the array 220. In addition, the touch data obtained at a first instance can be stored and later compared to touch data obtained at a subsequent instance to determine changes in the data over time, caused by a user's touch movement on the array 220, for example. These and other forms of processing of the touch data will be apparent to one skilled in the art with the benefit of the present disclosure.
After the touch data has preferably been processed with the techniques described briefly above, the electronic device 200 then compares the information from the touch data to the designations on the array 220 associated with the user controls for the device 200 (Block 410). From the comparison, the electronic device 200 determines which user control is invoked by the designated area 221 of the array 220 that the user has touched (Block 412). Then, the electronic device 200 initiates the determined user control to affect processing of the device 200 (Block 414). Once the user control is implemented, it may be necessary to update the display 210 and the designated user controls (Block 416). If the same visual guides 280 and designated user controls can be used, then the process returns to Block 408, for example, to obtain any new touch data with the array 220. If, however, new visual guides 280 and designated user controls are needed due to a change in the content of what is displayed or the context of the device's operation, then the process returns to Block 402 to designate new areas on the array 220 for user controls and proceeds to subsequent steps.
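One possible form of the comparison of Blocks 410-412 is sketched below: rows and columns that exceeded the capacitance threshold are intersected into touched points, which are then tested against the cells designated for each user control. The data layout and names are illustrative assumptions.

```python
# Hedged sketch of matching touch data against designated areas: rows and
# columns above the capacitance threshold intersect at touched locations,
# and each touched location is compared to the designated control areas.

def touched_points(row_hits, col_hits):
    # Rows and columns that exceeded the threshold intersect at the touches.
    return {(r, c) for r in row_hits for c in col_hits}

def invoked_controls(points, designations):
    """designations: control name -> set of (row, col) cells on the array."""
    return [name for name, cells in designations.items() if points & cells]

designations = {"page_up": {(0, c) for c in range(4)}}
points = touched_points(row_hits=[0], col_hits=[2])
print(invoked_controls(points, designations))  # -> ["page_up"]
```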
One skilled in the art will appreciate that the touch sensitive array 220 around the perimeter of the display 210 of the device 200 can be used to implement various user controls in some ways similar to how a conventional touch pad is used. In a brief example, the information at Block 410 may indicate a “tap” made by the user on the array 220. This “tap” (i.e., a touch by a finger on the array 220 for a “short” duration of time) may have been performed in a designated area 221 of the array 220. The electronic device 200 determines that the area 221 invoked by the “tap” is designated for performing a “page up” control of what is being displayed on the device 200. The time duration of the “tap” may indicate the amount or extent of the “page up” control. In response to the user control, the electronic device 200 causes what is being shown on the display 210 to page up as requested by the user.
As noted briefly above, the electronic device 200 of
If the orientation has changed at Block 434, the electronic device 200 determines how the orientation has changed and alters the designation of areas for user controls of the touch bezel (Block 436). In particular, the processing circuitry 240 alters how the one or more areas 221 of the capacitive sensor array 220 are designated for the one or more user controls 262 so that the designation better matches the new orientation of the device 200. In addition, if the display 210 is showing visual guides 280 for the corresponding areas 221 of the bezel, then the electronic device 200 also alters the location of the visual guides 280 on the display 210 so that they match the newly designated areas 221 for the user controls on the array 220 (Block 438). Then, the process 430 can end until called again during the operation of the electronic device 200.
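The remapping of Block 436 can be illustrated for a 90-degree rotation as follows; the array dimensions and the clockwise convention are assumptions made only for the sketch.

```python
# Hedged sketch of remapping designated areas after a 90-degree rotation so
# the controls stay in the same place relative to the user. Dimensions are
# illustrative assumptions.

N_ROWS, N_COLS = 8, 8  # assumed square sensor array around the display

def rotate_cell_90(cell):
    r, c = cell
    return (c, N_ROWS - 1 - r)  # 90-degree clockwise remap of one cell

def rotate_designations(designations):
    return {name: {rotate_cell_90(cell) for cell in cells}
            for name, cells in designations.items()}

before = {"menu": {(0, 0), (0, 1)}}  # top-left corner of the bezel
print(rotate_designations(before))   # remapped for the new orientation
```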
By way of example,
For example, the area 471A where the “Left” control 480 in
In the example of
Turning to
In previous embodiments, the touch sensitive bezel of the present disclosure is arranged substantially around the entire perimeter of the display. In one alternative shown in
Any user controls designated for areas 562 on these additional touch sensitive pads 560 may be preconfigured and may not change during operation. In this way, the user may know the functionality of the various pads 560 and can use the areas 562 to control features of the device 530 without the need for any visual guides 542 on the display 540. Alternatively, the user may be able to designate any user controls for these additional touch sensitive pads 560 using setup and configuration operations of the device 530. In yet another alternative, user controls for areas 562 of these additional pads 560 can be designated and re-designated by the electronic device 530 during operation in much the same way disclosed herein for areas 552 on the bezel 550. For example, areas 562 on the pads 560 can be designated for user controls similar to the areas 552 that can be designated on the bezel 550, and visual guides 542 can be displayed around the perimeter of the display 540 adjacent to corresponding areas 562 on the additional pads 560 in the same way that the visual guides 542 are displayed adjacent designated areas 552 of the bezel 550.
In
In additional alternatives shown in
In the embodiment of
The touch sensitive bezel 620 also includes a control module 630, which is housed in the electronic device and is shown here relative to the PCB 622 for illustrative purposes. The control module 630 is connected to the pads 626 of the PCB 622 by connections (not shown). The control module 630 has a plurality of components, including an infrared sensor, communication circuitry, accelerometer/inclinometer sensor, and other components. A suitable infrared sensor is an RE200B pyroelectric passive infrared sensor. A suitable accelerometer/inclinometer sensor is a KXP84 IC.
The electronic device 600 can also have a plurality of ambient light sensors 604 and a plurality of infrared (IR) modules 606, which are also shown here relative to the control module 630 for illustrative purposes. A suitable ambient light sensor is an ISL29001 light-to-digital sensor. The ambient light sensors 604 can be positioned in various locations on the housing of the electronic device and behind the display. The ambient light sensors 604 detect the level of ambient light near the display so that the electronic device can adjust the contrast, brightness, or backlighting of the display accordingly.
In
In an additional embodiment, the operation and arrangement of IC 634 and the pad element 620 of the present disclosure can use techniques disclosed in U.S. Patent Application Publication No. 2006/0032680, entitled “A Method of Increasing Spatial Resolution of Touch Sensitive Devices,” which is incorporated herein by reference in its entirety, to expand the detected sensitivity of the pad element 620.
In the embodiment shown in
In
To sense location, the device 700 uses many of the same techniques discussed above with reference to the capacitive sensor array of
To sense force, circuitry of the device 700 drives the drive paths 762 on the base layer 760 (one at a time) during a second time period. During this same time, the sense paths 744 on the bottom side of the substrate 740 are again interrogated to obtain data representing the strength or intensity of force applied to the cosmetic layer 730 by a user's touch. For example, when a force is applied by a user's finger on the cosmetic layer 730, the spring layer 750 deforms moving the sense paths 744 on the bottom of the substrate 740 closer to the drive paths 762 on the top of the base layer 760. A resulting change in mutual capacitance is then used to generate data indicative of the strength or intensity of an applied force. Additional details related to the layers and other aspects of this embodiment are disclosed in incorporated U.S. patent application Ser. No. 11/278,080.
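A simple parallel-plate model illustrates how the measured change in mutual capacitance could be converted into an estimate of the applied force: the new gap is inferred from the capacitance ratio, and the spring layer's deflection is converted to force. The constants below are illustrative assumptions, not values from the disclosure.

```python
# Hedged sketch of a force estimate from a mutual-capacitance change, using
# a parallel-plate model (C inversely proportional to the gap) and Hooke's
# law for the spring layer. All constants are illustrative assumptions.

def estimate_force(c_measured, c_rest, gap_rest=0.2e-3, k_spring=500.0):
    gap = gap_rest * (c_rest / c_measured)  # smaller gap -> larger capacitance
    deflection = gap_rest - gap             # how far the spring layer moved
    return k_spring * deflection            # F = k * x, in newtons

print(estimate_force(c_measured=1.25e-12, c_rest=1.0e-12))
```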
Using force and location detection, the bezel 720 of the present embodiment can provide additional user interface and controls. For example, a user's finger in
Given all of the previous discussion of the present disclosure, we now turn to an embodiment of an electronic device that incorporates one or more of the aspects and features discussed above. In
In
In
In
As shown in
If the user selects the toggle area 831 in the lower right corner, the screen 920 shows additional user controls. In
In
In
As shown by the example multimedia device 800 of
Furthermore, the touch sensitive bezel 820 according to the present disclosure can be used to obtain touch data corresponding to multiple user controls simultaneously. For example, the user controls of the bezel 820 can be configured so that one side of the bezel 820 controls brightness with a touch and drag motion by the user, while the other side of the bezel 820 controls contrast with a touch and drag motion by the user. Thus, using both of these sides of the bezel 820, the user can simultaneously adjust both the contrast and the brightness of the display 810 using touch and drag motions on the sides of the bezel 820. These and other possibilities and combinations will be apparent to one skilled in the art having the benefit of the present disclosure.
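The following sketch illustrates such simultaneous control, with drags on opposite sides of the bezel adjusting brightness and contrast from the same multi-touch frame; the field names and scaling factors are illustrative assumptions.

```python
# Hedged sketch of simultaneous bezel controls: a drag on the left side of
# the bezel adjusts brightness while a drag on the right side adjusts
# contrast, both taken from one multi-touch frame. Names are illustrative.

settings = {"brightness": 0.5, "contrast": 0.5}

def apply_drags(touches):
    """touches: list of dicts giving the bezel side and drag delta in pixels."""
    for t in touches:
        if t["side"] == "left":
            settings["brightness"] += t["delta"] * 0.005
        elif t["side"] == "right":
            settings["contrast"] += t["delta"] * 0.005
    for k in settings:  # clamp both settings to the valid range
        settings[k] = min(1.0, max(0.0, settings[k]))

apply_drags([{"side": "left", "delta": 40}, {"side": "right", "delta": -20}])
print(settings)  # both adjusted from one simultaneous multi-touch frame
```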
The foregoing description of preferred and other embodiments shows several different configurations of electronic devices. Certain features, details, and configurations were disclosed in conjunction with each embodiment. However, one skilled in the art will understand (1) that such features, details, and configurations can be used with the various different embodiments, even if such features, details, and configurations were not specifically mentioned in conjunction with a particular embodiment, and (2) that this disclosure contemplates various combinations of the features, details, and configurations disclosed herein. More specifically, the foregoing description of preferred and other embodiments is not intended to limit or restrict the scope or applicability of the inventive concepts conceived of by the Applicants. In exchange for disclosing the inventive concepts contained herein, the Applicants desire all patent rights afforded by the appended claims. Therefore, it is intended that the appended claims include all modifications and alterations to the full extent that they come within the scope of the following claims or the equivalents thereof.
This is a continuation of U.S. patent application Ser. No. 16/529,690, filed Aug. 1, 2019 (now U.S. Publication No. 2019-0354215, published on Nov. 21, 2019), which is a continuation of U.S. patent application Ser. No. 15/970,571, filed May 3, 2018 (now U.S. Pat. No. 10,386,980, issued on Aug. 20, 2019), which is a continuation of U.S. patent application Ser. No. 14/724,753, filed May 28, 2015 (now U.S. Pat. No. 9,983,742, issued on May 29, 2018), which is a continuation of U.S. patent application Ser. No. 12/486,710, filed Jun. 17, 2009 (now U.S. Pat. No. 9,047,009, issued on Jun. 2, 2015), which is a continuation of U.S. patent application Ser. No. 11/426,078, filed Jun. 23, 2006 (now U.S. Pat. No. 7,656,393, issued on Feb. 2, 2010), which is a continuation-in-part of U.S. patent application Ser. No. 11/367,749, filed Mar. 3, 2006 (U.S. Publication No. 2006-0197753, published on Sep. 7, 2006), which claims priority to U.S. Provisional Patent Application No. 60/658,777, filed Mar. 4, 2005 and U.S. Provisional Patent Application No. 60/663,345, filed Mar. 16, 2005, the entire disclosures of which are incorporated by reference herein in their entirety for all purposes. This application is also related to the following applications, which are all herein incorporated by reference in their entirety for all purposes: (1) U.S. patent application Ser. No. 10/188,182, entitled “Touch Pad for Handheld Device,” filed on Jul. 1, 2002 (now U.S. Pat. No. 7,046,230, issued on May 16, 2006); (2) U.S. patent application Ser. No. 10/722,948, entitled “Touch Pad for Handheld Device,” filed on Nov. 25, 2003 (now U.S. Pat. No. 7,495,659, issued on Feb. 24, 2009); (3) U.S. patent application Ser. No. 10/643,256, entitled “Movable Touch Pad with Added Functionality,” filed on Aug. 18, 2003 (now U.S. Pat. No. 7,499,040, issued on Mar. 3, 2009); (4) U.S. patent application Ser. No. 10/654,108, entitled “Ambidextrous Mouse,” filed on Sep. 2, 2003 (now U.S. Pat. No. 7,808,479, issued on Oct. 5, 2010); (5) U.S. patent application Ser. No. 10/840,862, entitled “Multipoint Touch Screen,” filed on May 6, 2004 (now U.S. Pat. No. 7,663,607, issued on Feb. 16, 2010); (6) U.S. patent application Ser. No. 10/903,964, entitled “Gestures for Touch Sensitive Input Devices,” filed on Jul. 30, 2004 (now U.S. Pat. No. 8,479,122, issued on Jul. 2, 2013); (7) U.S. patent application Ser. No. 11/038,590, entitled “Mode-Based Graphical User Interfaces for Touch Sensitive Input Devices,” filed on Jan. 18, 2005 (now U.S. Pat. No. 8,239,784, issued on Aug. 7, 2012); (8) U.S. patent application Ser. No. 11/057,050, entitled “Display Actuator,” filed on Feb. 11, 2005 (now abandoned); and (9) U.S. patent application Ser. No. 11/115,539, entitled “Hand-Held Electronic Device with Multiple Touch Sensing Devices,” filed Apr. 26, 2005 (now U.S. Pat. No. 7,800,592, issued on Sep. 21, 2010).
7236159 | Siversson | Jun 2007 | B1 |
7239800 | Bilbrey | Jul 2007 | B2 |
7240289 | Naughton et al. | Jul 2007 | B2 |
7312981 | Carroll | Dec 2007 | B2 |
7336260 | Martin et al. | Feb 2008 | B2 |
RE40153 | Westerman et al. | Mar 2008 | E |
7366995 | Montague | Apr 2008 | B2 |
7388578 | Tao | Jun 2008 | B2 |
7423636 | Sano et al. | Sep 2008 | B2 |
7452098 | Kerr | Nov 2008 | B2 |
7453439 | Kushler et al. | Nov 2008 | B1 |
7495659 | Marriott et al. | Feb 2009 | B2 |
7499039 | Roberts | Mar 2009 | B2 |
7499040 | Zadesky et al. | Mar 2009 | B2 |
7505785 | Callaghan et al. | Mar 2009 | B2 |
D592665 | Andre et al. | May 2009 | S |
RE40993 | Westerman | Nov 2009 | E |
7629961 | Casebolt et al. | Dec 2009 | B2 |
7652589 | Autor | Jan 2010 | B2 |
7663607 | Hotelling et al. | Feb 2010 | B2 |
7683889 | Rimas-Ribikauskas et al. | Mar 2010 | B2 |
7800592 | Kerr et al. | Sep 2010 | B2 |
7808479 | Hotelling et al. | Oct 2010 | B1 |
7932893 | Berthaud | Apr 2011 | B1 |
8479122 | Hotelling et al. | Jul 2013 | B2 |
8537115 | Hotelling et al. | Sep 2013 | B2 |
8665209 | Rimas-Ribikauskas et al. | Mar 2014 | B2 |
8704769 | Hotelling et al. | Apr 2014 | B2 |
8704770 | Hotelling et al. | Apr 2014 | B2 |
9047009 | King et al. | Jun 2015 | B2 |
9335868 | Hotelling et al. | May 2016 | B2 |
9785258 | Hotelling et al. | Oct 2017 | B2 |
9794397 | Min et al. | Oct 2017 | B2 |
9983742 | King et al. | May 2018 | B2 |
10156914 | Hotelling et al. | Dec 2018 | B2 |
20010011991 | Wang et al. | Aug 2001 | A1 |
20010011995 | Hinckley et al. | Aug 2001 | A1 |
20010035854 | Rosenberg et al. | Nov 2001 | A1 |
20010043189 | Brisebois et al. | Nov 2001 | A1 |
20010043545 | Aratani | Nov 2001 | A1 |
20010050673 | Davenport | Dec 2001 | A1 |
20010051046 | Watanabe et al. | Dec 2001 | A1 |
20020008691 | Hanajima et al. | Jan 2002 | A1 |
20020015024 | Westerman et al. | Feb 2002 | A1 |
20020033795 | Shahoian et al. | Mar 2002 | A1 |
20020033848 | Sciammarella et al. | Mar 2002 | A1 |
20020063688 | Shaw et al. | May 2002 | A1 |
20020067334 | Hinckley et al. | Jun 2002 | A1 |
20020084986 | Armstrong | Jul 2002 | A1 |
20020089545 | Levi Montalcini | Jul 2002 | A1 |
20020093487 | Rosenberg | Jul 2002 | A1 |
20020093492 | Baron | Jul 2002 | A1 |
20020118169 | Hinckley et al. | Aug 2002 | A1 |
20020118174 | Rodgers | Aug 2002 | A1 |
20020118848 | Karpenstein | Aug 2002 | A1 |
20020130839 | Wallace et al. | Sep 2002 | A1 |
20020130841 | Scott | Sep 2002 | A1 |
20020140676 | Kao | Oct 2002 | A1 |
20020154090 | Lin | Oct 2002 | A1 |
20020158838 | Smith et al. | Oct 2002 | A1 |
20020164156 | Bilbrey | Nov 2002 | A1 |
20020180763 | Kung | Dec 2002 | A1 |
20020190975 | Kerr | Dec 2002 | A1 |
20030002246 | Kerr | Jan 2003 | A1 |
20030006974 | Clough et al. | Jan 2003 | A1 |
20030011574 | Goodman | Jan 2003 | A1 |
20030025735 | Polgar et al. | Feb 2003 | A1 |
20030038776 | Rosenberg et al. | Feb 2003 | A1 |
20030043121 | Chen | Mar 2003 | A1 |
20030043174 | Hinckley et al. | Mar 2003 | A1 |
20030048260 | Matusis | Mar 2003 | A1 |
20030050092 | Yun | Mar 2003 | A1 |
20030063072 | Brandenberg et al. | Apr 2003 | A1 |
20030074977 | Doemens et al. | Apr 2003 | A1 |
20030076301 | Tsuk et al. | Apr 2003 | A1 |
20030076303 | Huppi | Apr 2003 | A1 |
20030076306 | Zadesky et al. | Apr 2003 | A1 |
20030085870 | Hinckley | May 2003 | A1 |
20030085882 | Lu | May 2003 | A1 |
20030095095 | Pihlaja | May 2003 | A1 |
20030095096 | Robbin et al. | May 2003 | A1 |
20030098851 | Brink | May 2003 | A1 |
20030098858 | Perski et al. | May 2003 | A1 |
20030107551 | Dunker | Jun 2003 | A1 |
20030107552 | Lu | Jun 2003 | A1 |
20030107607 | Nguyen | Jun 2003 | A1 |
20030117377 | Horie et al. | Jun 2003 | A1 |
20030122779 | Martin et al. | Jul 2003 | A1 |
20030142855 | Kuo et al. | Jul 2003 | A1 |
20030156098 | Shaw et al. | Aug 2003 | A1 |
20030179223 | Ying et al. | Sep 2003 | A1 |
20030184517 | Senzui et al. | Oct 2003 | A1 |
20030184520 | Wei | Oct 2003 | A1 |
20030206162 | Roberts | Nov 2003 | A1 |
20030206202 | Moriya | Nov 2003 | A1 |
20030210286 | Gerpheide et al. | Nov 2003 | A1 |
20030222848 | Solomon et al. | Dec 2003 | A1 |
20030222858 | Kobayashi | Dec 2003 | A1 |
20030234768 | Rekimoto et al. | Dec 2003 | A1 |
20040003947 | Kesselman et al. | Jan 2004 | A1 |
20040012572 | Sowden et al. | Jan 2004 | A1 |
20040021643 | Hoshino et al. | Feb 2004 | A1 |
20040021681 | Liao | Feb 2004 | A1 |
20040046739 | Gettemy | Mar 2004 | A1 |
20040046741 | Low et al. | Mar 2004 | A1 |
20040075676 | Rosenberg et al. | Apr 2004 | A1 |
20040099131 | Ludwig | May 2004 | A1 |
20040109013 | Goertz | Jun 2004 | A1 |
20040125086 | Hagermoser et al. | Jul 2004 | A1 |
20040130526 | Rosenberg | Jul 2004 | A1 |
20040138569 | Grunwald et al. | Jul 2004 | A1 |
20040139348 | Norris | Jul 2004 | A1 |
20040156192 | Kerr et al. | Aug 2004 | A1 |
20040212586 | Denny | Oct 2004 | A1 |
20040222979 | Knighton | Nov 2004 | A1 |
20040227736 | Kamrath et al. | Nov 2004 | A1 |
20040239622 | Proctor et al. | Dec 2004 | A1 |
20040242269 | Fadell | Dec 2004 | A1 |
20040242295 | Ghaly | Dec 2004 | A1 |
20040246231 | Large | Dec 2004 | A1 |
20040252109 | Trent et al. | Dec 2004 | A1 |
20040263483 | Aufderheide | Dec 2004 | A1 |
20040263484 | Mantysalo et al. | Dec 2004 | A1 |
20050012723 | Pallakoff | Jan 2005 | A1 |
20050030278 | Fu | Feb 2005 | A1 |
20050035951 | Bjorkengren | Feb 2005 | A1 |
20050035955 | Carter et al. | Feb 2005 | A1 |
20050043060 | Brandenberg et al. | Feb 2005 | A1 |
20050048955 | Ring | Mar 2005 | A1 |
20050052425 | Zadesky et al. | Mar 2005 | A1 |
20050057524 | Hill et al. | Mar 2005 | A1 |
20050068322 | Falcioni | Mar 2005 | A1 |
20050084138 | Inkster et al. | Apr 2005 | A1 |
20050104867 | Westerman et al. | May 2005 | A1 |
20050110767 | Gomes et al. | May 2005 | A1 |
20050110768 | Marriott et al. | May 2005 | A1 |
20050115816 | Gelfond et al. | Jun 2005 | A1 |
20050135053 | Carroll | Jun 2005 | A1 |
20050146509 | Geaghan et al. | Jul 2005 | A1 |
20050146513 | Hill et al. | Jul 2005 | A1 |
20050154798 | Nurmi | Jul 2005 | A1 |
20050168488 | Montague | Aug 2005 | A1 |
20050183035 | Ringel et al. | Aug 2005 | A1 |
20050190158 | Casebolt et al. | Sep 2005 | A1 |
20050212760 | Marvit et al. | Sep 2005 | A1 |
20050219228 | Alameh et al. | Oct 2005 | A1 |
20050228320 | Klinghult | Oct 2005 | A1 |
20050253818 | Nettamo | Nov 2005 | A1 |
20050259077 | Adams et al. | Nov 2005 | A1 |
20050264540 | Niwa | Dec 2005 | A1 |
20050275637 | Hinckley et al. | Dec 2005 | A1 |
20060010400 | Dehlin et al. | Jan 2006 | A1 |
20060022955 | Kennedy | Feb 2006 | A1 |
20060022956 | Lengeling et al. | Feb 2006 | A1 |
20060026521 | Hotelling et al. | Feb 2006 | A1 |
20060026535 | Hotelling et al. | Feb 2006 | A1 |
20060026536 | Hotelling et al. | Feb 2006 | A1 |
20060032680 | Elias et al. | Feb 2006 | A1 |
20060033724 | Chaudhri et al. | Feb 2006 | A1 |
20060044259 | Hotelling et al. | Mar 2006 | A1 |
20060050011 | Kamio | Mar 2006 | A1 |
20060053387 | Ording | Mar 2006 | A1 |
20060066582 | Lyon et al. | Mar 2006 | A1 |
20060073272 | Carel | Apr 2006 | A1 |
20060079969 | Seguin | Apr 2006 | A1 |
20060085757 | Andre et al. | Apr 2006 | A1 |
20060097991 | Hotelling et al. | May 2006 | A1 |
20060109252 | Kolmykov-zotov et al. | May 2006 | A1 |
20060109256 | Grant et al. | May 2006 | A1 |
20060146036 | Prados et al. | Jul 2006 | A1 |
20060181517 | Zadesky et al. | Aug 2006 | A1 |
20060181518 | Shen et al. | Aug 2006 | A1 |
20060183505 | Willrich | Aug 2006 | A1 |
20060197750 | Kerr et al. | Sep 2006 | A1 |
20060197753 | Hotelling | Sep 2006 | A1 |
20060232567 | Westerman et al. | Oct 2006 | A1 |
20060238517 | King et al. | Oct 2006 | A1 |
20060238518 | Westerman et al. | Oct 2006 | A1 |
20060238519 | Westerman et al. | Oct 2006 | A1 |
20060238520 | Westerman et al. | Oct 2006 | A1 |
20060238521 | Westerman et al. | Oct 2006 | A1 |
20060238522 | Westerman et al. | Oct 2006 | A1 |
20060244733 | Geaghan | Nov 2006 | A1 |
20060250357 | Safai | Nov 2006 | A1 |
20060267934 | Harley et al. | Nov 2006 | A1 |
20060267953 | Peterson et al. | Nov 2006 | A1 |
20060279548 | Geaghan | Dec 2006 | A1 |
20060284855 | Shintome | Dec 2006 | A1 |
20070043725 | Hotelling et al. | Feb 2007 | A1 |
20070046646 | Kwon et al. | Mar 2007 | A1 |
20070063923 | Koenig | Mar 2007 | A1 |
20070075968 | Hall et al. | Apr 2007 | A1 |
20070136064 | Carroll | Jun 2007 | A1 |
20070229464 | Hotelling et al. | Oct 2007 | A1 |
20070236466 | Hotelling | Oct 2007 | A1 |
20070247429 | Westerman | Oct 2007 | A1 |
20070257890 | Hotelling et al. | Nov 2007 | A1 |
20080012835 | Rimon et al. | Jan 2008 | A1 |
20080012838 | Rimon | Jan 2008 | A1 |
20080088602 | Hotelling | Apr 2008 | A1 |
20080158172 | Hotelling et al. | Jul 2008 | A1 |
20080211772 | Loucks | Sep 2008 | A1 |
20080238879 | Jaeger et al. | Oct 2008 | A1 |
20080266257 | Chiang | Oct 2008 | A1 |
20080278450 | Lashina | Nov 2008 | A1 |
20080297476 | Hotelling et al. | Dec 2008 | A1 |
20080297477 | Hotelling et al. | Dec 2008 | A1 |
20080297478 | Hotelling et al. | Dec 2008 | A1 |
20090059730 | Lyons et al. | Mar 2009 | A1 |
20090066670 | Hotelling et al. | Mar 2009 | A1 |
20090085894 | Gandhi et al. | Apr 2009 | A1 |
20090096757 | Hotelling et al. | Apr 2009 | A1 |
20090096758 | Hotelling et al. | Apr 2009 | A1 |
20090236648 | Maeda et al. | Sep 2009 | A1 |
20090295738 | Chiang | Dec 2009 | A1 |
20090295753 | King et al. | Dec 2009 | A1 |
20100026656 | Hotelling et al. | Feb 2010 | A1 |
20100112964 | Yi et al. | May 2010 | A1 |
20100164878 | Bestle et al. | Jul 2010 | A1 |
20110012835 | Hotelling et al. | Jan 2011 | A1 |
20120075238 | Minami et al. | Mar 2012 | A1 |
20120099406 | Lau et al. | Apr 2012 | A1 |
20120299859 | Kinoshita | Nov 2012 | A1 |
20140171156 | Pattikonda et al. | Jun 2014 | A1 |
20140375577 | Yeh et al. | Dec 2014 | A1 |
20150002440 | Huang et al. | Jan 2015 | A1 |
20150062050 | Zadesky et al. | Mar 2015 | A1 |
20150091790 | Forutanpour et al. | Apr 2015 | A1 |
20150111558 | Yang | Apr 2015 | A1 |
20150153895 | Hotelling | Jun 2015 | A1 |
20150186092 | Francis et al. | Jul 2015 | A1 |
20150261310 | Walmsley et al. | Sep 2015 | A1 |
20150261362 | King et al. | Sep 2015 | A1 |
20150346026 | Maass et al. | Dec 2015 | A1 |
20160070399 | Hotelling | Mar 2016 | A1 |
20160197753 | Hwang et al. | Jul 2016 | A1 |
20170068352 | Blondin et al. | Mar 2017 | A1 |
20170336964 | Kim et al. | Nov 2017 | A1 |
20180032158 | Hotelling et al. | Feb 2018 | A1 |
20180059809 | Mcclendon et al. | Mar 2018 | A1 |
20180067639 | Balaram | Mar 2018 | A1 |
20180217709 | Hotelling | Aug 2018 | A1 |
20180253172 | King et al. | Sep 2018 | A1 |
20190041991 | Hotelling et al. | Feb 2019 | A1 |
20190113988 | Hotelling et al. | Apr 2019 | A1 |
20190258395 | Balaram | Aug 2019 | A1 |
20190354215 | King et al. | Nov 2019 | A1 |
Foreign Patent Documents
Number | Date | Country
---|---|---
1243096 | Oct 1988 | CA |
1173672 | Feb 1998 | CN |
1343330 | Apr 2002 | CN |
1397870 | Feb 2003 | CN |
1529869 | Sep 2004 | CN |
1588432 | Mar 2005 | CN |
1720499 | Jan 2006 | CN |
1737827 | Feb 2006 | CN |
1942853 | Apr 2007 | CN |
4125049 | Jan 1992 | DE |
19722636 | Dec 1998 | DE |
10022537 | Nov 2000 | DE |
10201193 | Jul 2003 | DE |
10251296 | May 2004 | DE |
0288692 | Nov 1988 | EP |
0464908 | Jan 1992 | EP |
0498540 | Aug 1992 | EP |
0288692 | Jul 1993 | EP |
0653725 | May 1995 | EP |
0664504 | Jul 1995 | EP |
0464908 | Sep 1996 | EP |
0768619 | Apr 1997 | EP |
0795837 | Sep 1997 | EP |
0880091 | Nov 1998 | EP |
1026713 | Aug 2000 | EP |
1014295 | Jan 2002 | EP |
1241557 | Sep 2002 | EP |
1241558 | Sep 2002 | EP |
1505484 | Feb 2005 | EP |
1571537 | Sep 2005 | EP |
0899650 | Jun 2011 | EP |
2380583 | Apr 2003 | GB |
2393688 | Apr 2004 | GB |
2393688 | Jan 2006 | GB |
63106826 | May 1988 | JP |
S63257824 | Oct 1988 | JP |
S63292774 | Nov 1988 | JP |
H03237520 | Oct 1991 | JP |
H0764725 | Mar 1995 | JP |
H07160396 | Jun 1995 | JP |
H07182101 | Jul 1995 | JP |
H07319001 | Dec 1995 | JP |
H08161138 | Jun 1996 | JP |
H08166866 | Jun 1996 | JP |
H08211992 | Aug 1996 | JP |
H096525 | Jan 1997 | JP |
H09244810 | Sep 1997 | JP |
H09305262 | Nov 1997 | JP |
H10228350 | Aug 1998 | JP |
H10326149 | Dec 1998 | JP |
H11143606 | May 1999 | JP |
H11194863 | Jul 1999 | JP |
H11194872 | Jul 1999 | JP |
H11194883 | Jul 1999 | JP |
H11215217 | Aug 1999 | JP |
2000163031 | Jun 2000 | JP |
2000215549 | Aug 2000 | JP |
2000242424 | Sep 2000 | JP |
2000242428 | Sep 2000 | JP |
2000293289 | Oct 2000 | JP |
2000330946 | Nov 2000 | JP |
2001051790 | Feb 2001 | JP |
2001356878 | Dec 2001 | JP |
2002501271 | Jan 2002 | JP |
2002062972 | Feb 2002 | JP |
2002185630 | Jun 2002 | JP |
2002229719 | Aug 2002 | JP |
2002342033 | Nov 2002 | JP |
2002342034 | Nov 2002 | JP |
2003058316 | Feb 2003 | JP |
2003122506 | Apr 2003 | JP |
2003241872 | Aug 2003 | JP |
2003271309 | Sep 2003 | JP |
2003280807 | Oct 2003 | JP |
2003330611 | Nov 2003 | JP |
2004021933 | Jan 2004 | JP |
2004070920 | Mar 2004 | JP |
2004177993 | Jun 2004 | JP |
2004226715 | Aug 2004 | JP |
2004527847 | Sep 2004 | JP |
2004340991 | Dec 2004 | JP |
2005006259 | Jan 2005 | JP |
2005346244 | Dec 2005 | JP |
20010047975 | Jun 2001 | KR |
20020016080 | Mar 2002 | KR |
431607 | Apr 2001 | TW |
9005972 | May 1990 | WO |
9210823 | Jun 1992 | WO |
9417494 | Aug 1994 | WO |
9718547 | May 1997 | WO |
9723738 | Jul 1997 | WO |
9814863 | Apr 1998 | WO |
9926330 | May 1999 | WO |
9938149 | Jul 1999 | WO |
9949443 | Sep 1999 | WO |
0039907 | Jul 2000 | WO |
0055716 | Sep 2000 | WO |
0235461 | May 2002 | WO |
02052494 | Jul 2002 | WO |
03007227 | Jan 2003 | WO |
03001576 | Apr 2003 | WO |
03065192 | Aug 2003 | WO |
03077110 | Sep 2003 | WO |
03088176 | Oct 2003 | WO |
2004111816 | Dec 2004 | WO |
2005052773 | Jun 2005 | WO |
2006020305 | Feb 2006 | WO |
2006023569 | Mar 2006 | WO |
2004111816 | Apr 2006 | WO |
2006094308 | Sep 2006 | WO |
2006096501 | Sep 2006 | WO |
2006094308 | Dec 2006 | WO |
2006132817 | Dec 2006 | WO |
2006132817 | Aug 2007 | WO |
2007103631 | Sep 2007 | WO |
2007103631 | Nov 2008 | WO |
2010014560 | Feb 2010 | WO |
Other Publications
Entry |
---|
4-Wire Resistive Touchscreens, Available online at: <http://www.touchscreens.com/intro-touchtypes-4resistive.html>, Accessed on Aug. 5, 2005. |
5-Wire Resistive Touchscreens, Available online at: <http://www.touchscreens.com/intro-touchtypes-resistive.html>, Accessed on Aug. 5, 2005. |
A Brief Overview of Gesture Recognition, Available online at: <http://www.dai.ed.ac.uk/Cvonline/LOCA_COPIES/COHEN/gesture_overview.html>, Accessed on Apr. 20, 2004. |
About Quicktip, Available online at: <www.logicad3d.com/docs/qt.html>, Accessed on Apr. 8, 2002, 2 pages. |
Advisory Action received for U.S. Appl. No. 10/654,108, dated Apr. 17, 2007, 3 pages. |
Advisory Action received for U.S. Appl. No. 10/654,108, dated Apr. 20, 2009, 3 pages. |
Advisory Action received for U.S. Appl. No. 10/654,108, dated Feb. 12, 2008, 3 pages. |
Advisory Action received for U.S. Appl. No. 10/654,108, dated Mar. 25, 2010, 3 pages. |
Advisory Action received for U.S. Appl. No. 11/204,873, dated Jun. 17, 2009, 3 pages. |
Advisory Action received for U.S. Appl. No. 11/204,873, dated Mar. 3, 2017, 7 pages. |
Advisory Action received for U.S. Appl. No. 11/204,873, dated Sep. 17, 2018, 4 pages. |
Advisory Action received for U.S. Appl. No. 11/367,749, dated Aug. 25, 2010, 3 pages. |
Advisory Action received for U.S. Appl. No. 11/966,948, dated Sep. 23, 2016, 3 pages. |
Advisory Action received for U.S. Appl. No. 12/184,190, dated Feb. 25, 2013, 5 pages. |
Advisory Action received for U.S. Appl. No. 12/189,030, dated Nov. 8, 2013, 3 pages. |
Advisory Action received for U.S. Appl. No. 12/189,030, dated Nov. 12, 2013, 3 pages. |
Apple Unveils Optical Mouse and New Pro Keyboard, Available online at: http://www.apple.com/pr/library/2000/jul/19mousekeyboard.html, Jul. 19, 2000, 2 pages. |
Capacitive Position Sensing, Available online at: <http://www.synaptics.com/technology/cps.cfm>, Accessed on Aug. 5, 2005. |
Capacitive Touchscreens, Available online at: <http://www.touchscreens.com/intro-touchtypes-capacitive.html>, Accessed on Aug. 5, 2005. |
Comparing Touch Technologies, Available online at: <http://www.touchscreens.com/intro-touchtypes.html>, Accessed on Oct. 10, 2004. |
EPO Form 1507 received for European Patent Application No. 02761784.4, dated Nov. 19, 2004, 6 pages. |
Examination Report received for European Patent Application No. 06737515.4, dated Apr. 21, 2008, 5 pages. |
Examiner's Answer to Appeal Brief received for U.S. Appl. No. 11/204,873, dated Mar. 29, 2013, 22 pages. |
Examiner's Answer to Appeal Brief received for U.S. Appl. No. 11/367,749, dated Mar. 15, 2019, 5 pages. |
Examiner's Answer to Appeal Brief received for U.S. Appl. No. 12/184,190, dated Aug. 16, 2013, 6 pages. |
Examiner's Answer to Appeal Brief received for U.S. Appl. No. 12/189,030, dated May 30, 2014, 10 pages. |
Extended Search Report received for European Patent Application No. 09170572.3, dated Nov. 7, 2013, 8 pages. |
Final Office Action received for U.S. Appl. No. 10/654,108, dated Feb. 2, 2007, 17 pages. |
Final Office Action received for U.S. Appl. No. 10/654,108, dated Feb. 18, 2009, 13 pages. |
Final Office Action received for U.S. Appl. No. 10/654,108, dated Jan. 19, 2010, 16 pages. |
Final Office Action received for U.S. Appl. No. 10/654,108, dated Nov. 16, 2007, 13 pages. |
Final Office Action received for U.S. Appl. No. 11/115,539, dated Nov. 21, 2008, 24 pages. |
Final Office Action received for U.S. Appl. No. 11/115,539, dated Oct. 14, 2009, 11 pages. |
Final Office Action received for U.S. Appl. No. 11/204,873, dated Apr. 11, 2018, 29 pages. |
Final Office Action received for U.S. Appl. No. 11/204,873, dated Apr. 28, 2009, 19 pages. |
Final Office Action received for U.S. Appl. No. 11/204,873, dated Aug. 13, 2012, 22 pages. |
Final Office Action received for U.S. Appl. No. 11/204,873, dated Jan. 15, 2010, 24 pages. |
Final Office Action received for U.S. Appl. No. 11/204,873, dated Jun. 20, 2011, 17 pages. |
Final Office Action received for U.S. Appl. No. 11/204,873, dated Sep. 7, 2016, 25 pages. |
Final Office Action received for U.S. Appl. No. 11/204,873, dated Sep. 20, 2010, 15 pages. |
Final Office Action received for U.S. Appl. No. 11/367,749, dated Aug. 25, 2009, 13 pages. |
Final Office Action received for U.S. Appl. No. 11/367,749, dated Jan. 8, 2016, 11 pages. |
Final Office Action received for U.S. Appl. No. 11/367,749, dated Jan. 17, 2014, 13 pages. |
Final Office Action received for U.S. Appl. No. 11/367,749, dated Jul. 14, 2017, 11 pages. |
Final Office Action received for U.S. Appl. No. 11/367,749, dated Jun. 30, 2010, 10 pages. |
Final Office Action received for U.S. Appl. No. 11/367,749, dated Mar. 16, 2018, 13 pages. |
Final Office Action received for U.S. Appl. No. 11/966,948, dated Jul. 13, 2011, 10 pages. |
Final Office Action received for U.S. Appl. No. 11/966,948, dated Mar. 10, 2016, 15 pages. |
Final Office Action received for U.S. Appl. No. 11/966,948, dated Oct. 9, 2014, 11 pages. |
Final Office Action received for U.S. Appl. No. 12/184,190, dated Nov. 8, 2012, 21 pages. |
Final Office Action received for U.S. Appl. No. 12/188,948, dated Dec. 7, 2011, 13 pages. |
Final Office Action received for U.S. Appl. No. 12/188,988, dated Dec. 8, 2011, 15 pages. |
Final Office Action received for U.S. Appl. No. 12/189,030, dated Jun. 20, 2013, 17 pages. |
Final Office Action received for U.S. Appl. No. 12/189,030, dated Oct. 14, 2011, 15 pages. |
Final Office Action received for U.S. Appl. No. 12/486,710, dated Jan. 2, 2014, 30 pages. |
Final Office Action received for U.S. Appl. No. 12/486,710, dated Jul. 1, 2011, 20 pages. |
Final Office Action received for U.S. Appl. No. 12/486,710, dated Sep. 4, 2012, 25 pages. |
Final Office Action received for U.S. Appl. No. 12/890,437, dated Apr. 19, 2011, 20 pages. |
Final Office Action received for U.S. Appl. No. 14/535,101, dated Apr. 28, 2017, 9 pages. |
Final Office Action received for U.S. Appl. No. 14/535,101, dated Feb. 4, 2016, 7 pages. |
Final Office Action received for U.S. Appl. No. 14/535,101, dated May 18, 2018, 10 pages. |
Final Office Action received for U.S. Appl. No. 14/595,032, dated Aug. 3, 2015, 10 pages. |
Final Office Action received for U.S. Appl. No. 14/595,032, dated Feb. 15, 2017, 10 pages. |
Final Office Action received for U.S. Appl. No. 14/724,753, dated Nov. 25, 2016, 16 pages. |
Final Office Action received for U.S. Appl. No. 14/940,010, dated Aug. 23, 2016, 12 pages. |
Final Office Action received for U.S. Appl. No. 14/940,010, dated Feb. 2, 2018, 14 pages. |
Final Office Action received for U.S. Appl. No. 14/940,010, dated Feb. 26, 2019, 12 pages. |
Final Office Action received for U.S. Appl. No. 15/933,196, dated Oct. 31, 2019, 10 pages. |
Final Office Action received for U.S. Appl. No. 16/155,508, dated Oct. 11, 2019, 17 pages. |
FingerWorks—Gesture Guide—Application Switching, Available online at: <http://www.fingerworks.com/gesture_guide_apps.html>, Accessed on Aug. 27, 2004, 1 page. |
FingerWorks—Gesture Guide—Editing, Available online at: <http://www.fingerworks.com/gesure_guide_editing.html> Feb. 13, 2004, Accessed on Aug. 27, 2004, 1 page. |
FingerWorks—Gesture Guide—File Operations, Available online at: <http://www.fingerworks.com/gesture_guide_files.html> Jun. 18, 2004, Accessed on Aug. 27, 2004, 1 page. |
FingerWorks—Gesture Guide—Text Manipulation, Available online at: <http://www.fingerworks.com/gesture_guide_text_manip.html> Jun. 6, 2004, Accessed on Aug. 27, 2004, 2 pages. |
FingerWorks—Gesture Guide—Tips and Tricks, Available online at: <http://www.fingerworks.com/gesture_guide_tips.html>, Accessed on Aug. 27, 2004, 1 page. |
FingerWorks—Gesture Guide—Web, Available online at: <http://www.fingerworks.com/gesture_guide_web.html> Jun. 5, 2004, Accessed on Aug. 27, 2004, 1 page. |
FingerWorks—Guide to Hand Gestures for USB Touchpads, Available online at: <http://www.fingerworks.com/igesture_userguide.html>, Accessed on Aug. 27, 2004, 1 page. |
FingerWorks—iGesture—Technical Details, Available online at: <http://www.fingerworks.com/igesture_tech.html>, Accessed on Aug. 27, 2004, 1 page. |
FingerWorks—The Only Touchpads with Ergonomic Full-Hand Resting and Relaxation!, Available online at: <http://www.fingerworks.com/resting.html>, 2001, 1 page. |
FingerWorks—Tips for Typing on the Mini, Available online at: <http://www.fingerworks.com/mini_typing.html> Jun. 5, 2004, Accessed on Aug. 27, 2004, 2 pages. |
Gesture Recognition, Available online at: <http://www.fingerworks.com/gesture_recognition.html>, downloaded on Aug. 30, 2005, 2 pages. |
Glidepoint, Available online at: <http://www.cirque.com/technology/technology_gp.html>, Accessed on Aug. 5, 2005. |
How Do Touchscreen Monitors Know Where You're Touching?, Available online at: <http://electronics.howstuffworks.com/question716.html>, Jul. 7, 2008, 2 pages. |
How Does a Touchscreen Work?, Available online at: <http://www.touchscreens.com/intro-anatomy.html>, Accessed on Aug. 5, 2005. |
iGesture Pad - the MultiFinger USB TouchPad with Whole-Hand Gestures, Available online at: <http://www.fingerworks.com/igesture.html>, Accessed on Aug. 27, 2004, 2 pages. |
iGesture Products for Everyone (learn in minutes) Product Overview, Available online at: <FingerWorks.com>, Accessed on Aug. 30, 2005. |
Infrared Touchscreens, Available online at: <http://www.touchscreens.com/intro-touchtypes-infrared.html>, Accessed on Aug. 5, 2005. |
Intention to Grant received for European Patent Application No. 16186726.2, dated Jan. 24, 2020, 7 pages. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2006/007585, dated Jul. 25, 2006, 9 pages. |
International Search Report received for PCT Patent Application No. PCT/US2005/003325, dated Mar. 3, 2006. |
International Search Report received for PCT Patent Application No. PCT/US2006/008349, dated Oct. 6, 2006, 3 pages. |
International Search Report received for PCT Patent Application No. PCT/US2006/020341, dated Jun. 12, 2007, 7 pages. |
International Search Report received for PCT Patent Application No. PCT/US2007/062474, dated Jan. 3, 2008, 3 pages. |
International Search Report received for PCT Patent Application No. PCT/US2009/051874, dated Nov. 27, 2009, 3 pages. |
iRiver Clix Quick Start Guide, 2 pages. |
iRiver Clix User Guide, 1999-2006, 38 pages. |
Letter Restarting Period for Response received for U.S. Appl. No. 12/189,030, dated May 26, 2011, 16 pages. |
Mouse Emulation, FingerWorks, Available online at: <http://www.fingerworks.com/gesture_guide_mouse.html> Dec. 10, 2002, Accessed on Aug. 30, 2005. |
Mouse Gestures, Optimoz, May 21, 2004. |
Mouse Gestures in Opera, Available online at: <http://www.opera.com/products/desktop/xmouse/index.dml>, Accessed on Aug. 30, 2005. |
MultiTouch Overview, FingerWorks, Available online at: <http://www.fingerworks.com/multoverview.html>, Accessed on Aug. 30, 2005. |
Near Field Imaging Touchscreens, Available online at: <http://www.touchscreens.com/intro-touchtypes-nfi.html>, Accessed on Aug. 5, 2005. |
Neuros MP3 Digital Audio Computer, Available online at: <www.neurosaudio.com>, Accessed on Apr. 9, 2003, 1 page. |
Non-Final Office Action received for U.S. Appl. No. 10/654,108, dated Jul. 6, 2007, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 10/654,108, dated Jul. 14, 2009, 17 pages. |
Non-Final Office Action received for U.S. Appl. No. 10/654,108, dated Jun. 16, 2006, 8 pages. |
Non-Final Office Action received for U.S. Appl. No. 10/654,108, dated Oct. 28, 2008, 11 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/115,539, dated Apr. 6, 2009, 32 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/115,539, dated Jan. 15, 2010, 11 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/115,539, dated Jul. 2, 2008, 19 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/204,873, dated Apr. 6, 2016, 33 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/204,873, dated Aug. 9, 2017, 27 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/204,873, dated Feb. 28, 2011, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/204,873, dated Mar. 8, 2012, 21 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/204,873, dated May 20, 2010, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/204,873, dated Nov. 13, 2008, 11 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/204,873, dated Sep. 30, 2009, 21 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/367,749, dated Feb. 26, 2009, 11 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/367,749, dated Jan. 12, 2010, 9 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/367,749, dated Jul. 5, 2013, 9 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/367,749, dated May 14, 2015, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/367,749, dated Nov. 10, 2016, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/426,078, dated Apr. 2, 2009, 22 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/966,948, dated Apr. 6, 2017, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/966,948, dated Aug. 26, 2015, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/966,948, dated Feb. 27, 2014, 11 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/966,948, dated Jan. 28, 2011, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/184,190, dated Dec. 2, 2011, 22 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/184,190, dated Jul. 5, 2012, 20 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/188,948, dated Aug. 21, 2013, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/188,948, dated May 6, 2011, 11 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/188,988, dated Aug. 14, 2013, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/188,988, dated May 16, 2011, 15 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/189,030, dated Aug. 24, 2016, 19 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/189,030, dated Feb. 22, 2017, 23 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/189,030, dated May 6, 2011, 8 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/189,030, dated Nov. 28, 2012, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/486,710, dated Aug. 25, 2014, 29 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/486,710, dated Jun. 6, 2013, 29 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/486,710, dated Mar. 9, 2011, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/486,710, dated Mar. 28, 2012, 25 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/890,437, dated Dec. 28, 2010, 17 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/535,101, dated Aug. 11, 2016, 9 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/535,101, dated May 12, 2015, 8 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/535,101, dated Nov. 15, 2017, 9 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/595,032, dated Apr. 10, 2015, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/595,032, dated Jun. 27, 2016, 9 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/724,753, dated Apr. 20, 2016, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/724,753, dated Jun. 22, 2017, 9 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/940,010, dated Feb. 2, 2016, 9 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/940,010, dated May 17, 2017, 11 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/940,010, dated Oct. 12, 2018, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/257,713, dated Oct. 17, 2018, 9 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/727,455, dated Nov. 2, 2017, 17 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/933,196, dated Mar. 1, 2019, 9 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/155,508, dated Mar. 14, 2019, 15 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/209,137, dated Jan. 25, 2019, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/933,196, dated Apr. 6, 2020, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/403,370, dated Jul. 8, 2020, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/529,690, dated Apr. 6, 2020, 17 pages. |
Notice of Allowance received for Canadian Patent Application No. 2,820,737, dated Feb. 1, 2019, 1 page. |
Notice of Allowance received for U.S. Appl. No. 10/654,108, dated Jun. 1, 2010, 6 pages. |
Notice of Allowance received for U.S. Appl. No. 11/115,539, dated Jun. 2, 2010, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 11/426,078, dated Sep. 25, 2009, 14 pages. |
Notice of Allowance received for U.S. Appl. No. 12/184,190, dated Jan. 11, 2016, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 12/188,948, dated Jan. 2, 2014, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 12/188,988, dated Dec. 30, 2013, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 12/189,030, dated Jun. 5, 2017, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 12/486,710, dated Jan. 29, 2015, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 12/890,437, dated Jun. 18, 2013, 11 pages. |
Notice of Allowance received for U.S. Appl. No. 14/724,753, dated Jan. 24, 2018, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 15/257,713, dated Feb. 21, 2019, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 15/727,455, dated Apr. 12, 2018, 6 pages. |
Notice of Allowance received for U.S. Appl. No. 15/727,455, dated Aug. 15, 2018, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 15/970,571, dated Apr. 5, 2019, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 16/209,137, dated Jun. 14, 2019, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 16/529,690, dated Oct. 20, 2020, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 16/209,137, dated Sep. 27, 2019, 5 pages. |
Office Action received for Australian Patent Application No. 2017258903, dated Nov. 6, 2019, 6 pages. |
Office Action received for Canadian Patent Application No. 2600326, dated Oct. 21, 2019, 5 pages. |
Office Action received for Chinese Patent Application No. 201510846284.0, dated May 5, 2019, 14 pages (5 pages of English Translation and 9 pages of Official copy). |
Office Action received for European Patent Application No. 09170572.3, dated Mar. 12, 2020, 5 pages. |
Office Action received for Indian Patent Application No. 2266/KOLNP/2013, dated Apr. 30, 2019, 6 pages. |
Office Action received for Japanese Patent Application No. 2017-224353, dated May 13, 2019, 7 pages (4 pages of English Translation and 3 pages of Official Copy). |
Patent Board Decision received for U.S. Appl. No. 11/204,873, dated Dec. 14, 2015, 10 pages. |
Patent Board Decision received for U.S. Appl. No. 12/184,190, dated Dec. 23, 2015, 5 pages. |
Patent Board Decision received for U.S. Appl. No. 12/189,030, dated Jun. 2, 2016, 8 pages. |
PenTouch Capacitive Touchscreens, Available online at: <http://www.touchscreens.com/intro-touchtypes-pentouch.html>, Accessed on Aug. 5, 2005. |
Photographs of Innovations 2000 Best of Show Award Presented at the 2000 International CES Innovations 2000 Design & Engineering Showcase, Jan. 6, 2000, 1 page. |
Pre-interview First Office Action received for U.S. Appl. No. 15/257,713, dated May 9, 2018, 5 pages. |
Pre-interview First Office Action received for U.S. Appl. No. 16/403,370, dated Feb. 20, 2020, 3 pages. |
Product Overview—ErgoCommander, Available online at: <www.logicad3d.com/products/ErgoCommander.htm>, Accessed on Apr. 8, 2002, 2 pages. |
Product Overview—SpaceMouse Classic, Available online at: <www.logicad3d.com/products/Classic.htm>, Accessed on Apr. 8, 2002, 2 pages. |
Restriction Requirement received for U.S. Appl. No. 10/654,108, dated Mar. 20, 2006, 6 pages. |
Restriction Requirement received for U.S. Appl. No. 10/654,108, dated May 23, 2008, 7 pages. |
Restriction Requirement received for U.S. Appl. No. 11/966,948, dated Aug. 17, 2010, 5 pages. |
Restriction Requirement received for U.S. Appl. No. 12/184,190, dated Sep. 15, 2011, 6 pages. |
Search Report received for Chinese Patent Application No. 201310264394.7, dated Jan. 14, 2016, 4 pages (2 pages of English Translation and 2 pages of Official copy). |
Search Report received for Chinese Patent Application No. 201410259240.3, dated Sep. 27, 2016, 4 pages. |
Search Report received for European Patent Application No. 10010075.9, dated Nov. 9, 2010, 6 pages. |
Search Report received for European Patent Application No. 14169441.4, dated Jul. 7, 2014, 3 pages. |
Search Report received for European Patent Application No. 1621989, dated Mar. 27, 2006. |
Search Report received for European Patent Application No. 17203125.4, dated Mar. 5, 2018, 4 pages. |
Surface Acoustic Wave Touchscreens, Available online at: <http://www.touchscreens.com/intro-touchtypes-saw.html>, Accessed on Aug. 5, 2005. |
Symbol Commander, Available online at: <http://www.sensiva.com/symbolcomander/>, Accessed on Aug. 30, 2005. |
Synaptics Touch Pad Interfacing Guide, Synaptics Inc., Second Edition, San Jose, Mar. 25, 1998, pp. 1-90. |
Tips for Typing, FingerWorks, Available online at: <http://www.fingerworks.com/mini_typing.html>, Accessed on Aug. 30, 2005. |
Touch Technologies Overview, 3M Touch Systems, Massachusetts, 2001. |
Wacom Components—Technology, Available online at: <http://www.wacom-components.com/english/tech.asp>, Accessed on Oct. 10, 2004. |
Watershed Algorithm, Available online at: <http://rsb.info.nih.gov/ij/plugins/watershed.html>, Accessed on Aug. 5, 2005. |
Abraham et al., U.S. Appl. No. 60/364,400, filed Mar. 13, 2002, titled “Multi-button mouse”. |
Ahmad Subutai, "A Usable Real-Time 3D Hand Tracker", Proceedings of the 28th Asilomar Conference on Signals, Systems and Computers—Part 2 (of 2), vol. 2, Oct. 1994, 5 pages. |
Apple Computer, Inc., “Apple Pro Mouse”, Apple Pro Mouse Design Innovations Product Specification, Jul. 2000, pp. 1-11. |
Bang & Olufsen, "BeoCom 6000 User Guide and Sales Training Brochure Cover Sheets", 2000, 53 pages. |
Bier et al., "Toolglass and Magic Lenses: The See-Through Interface", In James Kajiya, editor, Computer Graphics (SIGGRAPH '93 Proceedings), vol. 27, Aug. 1993, pp. 73-80. |
Chapweske Adam, "PS/2 Mouse/Keyboard Protocol", Available online at: <http://panda.cs.ndsu.nodak.edu/~achapwes/PICmicro/PS2/ps2.htm>, 1999, 7 pages. |
Cirque Corporation, "OEM Touchpad Modules", Available online at: <www.glidepoint.com/sales/modules.index.shtml>, Accessed on Feb. 13, 2002, 5 pages. |
David Nagel, "More Details on the New Pro Keyboard and ButtonLess Mouse", Available online at: <http://www.creativemac.com/HTM/News/07_00/detailskeyboardmouse.htm>, Jul. 2000, 2 pages. |
De Meyer Kevin, "Buttonless and Finger Mouse: Crystal Optical Mouse", Available online at: <http://www.heatseekerz.net/index.php?page=articles&id=19&pagenum=2>, Feb. 14, 2002, 2 pages. |
Douglas et al., “The Ergonomics of Computer Pointing Devices”, 1997. |
Dreier Troy, "The Comfort Zone", Available online at: <http://www.pcmag.com/print_article/0,3048,a=22807,00.asp>, Mar. 12, 2002, 6 pages. |
Edwards Andru, “Gear Live Review: iRiver Clix Review”, Available online at: <http://www.gearlive.com/index.php/news/article/gear-live-review-iriver-clix-review-713400/>, Jul. 13, 2006, 5 pages. |
EVB Elektronik, "TSOP6238 IR Receiver Modules for Infrared Remote Control Systems", Jan. 2004, 1 page. |
Fiore Andrew, "Zen Touchpad", Available online at: <http://instruct1.cit.cornell.edu/courses/ee476/FinalProjects/s2000/fiore/touchpad.html>, May 2000, 6 pages. |
Fisher et al., “Repetitive Motion Disorders: The Design of Optimal Rate-Rest Profiles”, Human Factors, vol. 35, No. 2, Jun. 1993, pp. 283-304. |
Flaminio Michael, "IntelliMouse Explorer", Available online at: <http://www.insanely-great.com/reviews/intellimouse.html>, Oct. 4, 1999, 4 pages. |
Fukumoto et al., “ActiveClick: Tactile Feedback for Touch Panels”, In CHI 2001 Summary, 2001, pp. 121-122. |
Fukumoto et al., “Body Coupled Fingering: Wireless Wearable Keyboard”, CHI 97, Mar. 1997, pp. 147-154. |
Gadgetboy, "Point and Click with The Latest Mice", CNETAsia Product Review, Available online at: <www.asia.cnet.com/reviews...are/gadgetboy/0,39001770,38023590,00.htm>, Accessed on Dec. 5, 2001, 1 page. |
Gizmodo, "Logitech's MX Air Is No Longer Vapor", Available online at: <http://www.gizmodo.com.au/2007/07/logitechs_mx_air_is_no_longer.html>, 2007, 2 pages. |
Grevstad Eric, "Microsoft Wireless IntelliMouse Explorer Review: The Ultimate Pointing Machine", HardwareCentral Review, Jun. 24, 2003. |
Grevstad Eric, "Microsoft Wireless IntelliMouse Explorer Review: The Ultimate Pointing Machine", Available online at: <http://www.hardwarecentral.com/hardwarecentral/print/3826/1>, Oct. 9, 2001, 3 pages. |
Hardy Ian, “Fingerworks”, BBC World on Line, Mar. 7, 2002. |
Hillier et al., “Introduction to Operations Research”, 1986. |
Integration Associates Inc., "Proximity Sensor Demo Kit", User Guide, Version 0.62-Preliminary, Apr. 13, 2004, 14 pages. |
Jacob et al., “Integrality and Separability of Input Devices”, ACM Transactions on Computer-Human Interaction, vol. 1, Mar. 1994, pp. 3-26. |
Hinckley et al., "Touch-Sensing Input Devices", CHI '99 Proceedings, May 1999, pp. 223-230. |
Kionix, "KXP84 Series Summary Data Sheet", Oct. 21, 2005, 4 pages. |
Lee et al., "A Multi-Touch Three Dimensional Touch-Sensitive Tablet", CHI '85 Proceedings, Apr. 1985, pp. 21-25. |
Lee S., “A Fast Multiple-Touch-Sensitive Input Device”, A Thesis Submitted in Conformity with the Requirements for the Degree of Master of Applied Science in the Department of Electrical Engineering, University of Toronto, Oct. 1984, 115 pages. |
Matsushita et al., “HoloWall: Designing a Finger, Hand, Body and Object Sensitive Wall”, In Proceedings of UIST '97, Oct. 1997. |
Mattel, "System Service and Troubleshooting Manual", Available online at: <www.dsplib.com/intv/Master>, Accessed on Dec. 11, 2002, 1 page. |
Microsoft Inc., "Scroll and Zoom on a Microsoft Excel Sheet by Using the Microsoft Intellimouse Pointing Device", Available online at: <mk:@MSITStore:C:\Program%20Files\Microsoft%20Office\Office\1033\xlmain9.chm::/html/xlhowScrol>, 1999, 1 page. |
Peter et al., Unpublished U.S. Appl. No. 10/789,676, filed Feb. 27, 2004, titled “Shape Detecting Input Device”. |
Press Release, "iRiver Clix Delivers Complete Package for Portable Entertainment Fans", Available online at: <www.iriveramerican.com/images.pdf/iriv_clix.pdf>, May 17, 2006, 3 pages. |
Quantum Research Group, "QT510/QWheel Touch Slider IC", 2004-2005, 14 pages. |
Quek, "Unencumbered Gestural Interaction", IEEE Multimedia, vol. 3, 1996, pp. 36-47. |
Radwin, "Activation Force and Travel Effects on Overexertion in Repetitive Key Tapping", Human Factors, vol. 39, No. 1, Mar. 1997, pp. 130-140. |
Rekimoto et al., “ToolStone: Effective Use of the Physical Manipulation Vocabularies of Input Devices”, In Proc. of UIST 2000, 2000. |
Rekimoto J., “SmartSkin: An Infrastructure for Freehand Manipulation on Interactive Surfaces”, CHI 2002 Conference Proceedings, Conference on Human Factors in Computing Systems, Minneapolis, vol. 4, No. 1, Apr. 20-25, 2002, pp. 113-120. |
Rubine et al., “Programmable Finger-Tracking Instrument Controllers”, Computer Music Journal, vol. 14, No. 1, 1990, pp. 26-41. |
Rubine Dean, “Combining Gestures and Direct Manipulation”, CHI'92, May 3-7, 1992, pp. 659-660. |
Rubine Dean H. , “The Automatic Recognition of Gestures”, CMU-CS-91-202, Submitted in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Computer Science at Carnegie Mellon University, Dec. 1991, 285 pages. |
Rutledge et al., “Force-to-Motion Functions for Pointing”, Human-Computer Interaction—INTERACT, 1990. |
Safran David, “Letter re: Bang & Olufsen A/S”, Nixon Peabody, LLP with BeoCom 6000 Sales Training Brochure, May 21, 2004, 7 pages. |
Siracusa John, “The Puck is Dead (and the Little Keyboard Too)”, Available online at: <http://www.arstechnic.com/wanderdesk/3q00/macworld2k/mwny-2.html>, Jul. 2000, 6 pages. |
Sylvania, "Intellivision Intelligent Television Master Component Service Manual", 1979, pp. 1, 2 & 8. |
Tessler et al., "Touchpads: Three New Input Devices", Available online at: <www.macworld.com/1996/02/review/1806.html>, 1996, 2 pages. |
Texas Instruments, "TSC2003/I2C Touch Screen Controller", Data Sheet SBAS 162, Oct. 2001, 20 pages. |
Wellner Pierre, “The Digital Desk Calculators: Tangible Manipulation on a DeskTop Display”, In ACM UIST'91 Proceedings, Nov. 11-13, 1991, pp. 27-34. |
Westerman Wayne, “Hand Tracking, Finger Identification, and Chordic Manipulation on a Multi-Touch Surface”, A Dissertation Submitted to the Faculty of the University of Delaware in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Electrical Engineering, 1999, 363 pages. |
Williams Jim, “Applications for a Switched-Capacitor Instrumentation Building Block”, Linear Technology Application Note 3, Jul. 1985, pp. 1-16. |
Yamada et al., “A Switched-Capacitor Interface for Capacitive Pressure Sensors”, IEEE Transactions on Instrumentation and Measurement, vol. 41, No. 1, Feb. 1992, pp. 81-86. |
Yeh et al., “Switched Capacitor Interface Circuit for Capacitive Transducers”, IEEE, 1985. |
Zhai et al., “Dual Stream Input for Pointing and Scrolling”, Proceedings of CHI '97 Extended Abstracts, 1997. |
Zimmerman et al., “Applying Electric Field Sensing to Human-Computer Interfaces”, In CHI '85 Proceedings, 1995, pp. 280-287. |
Zuber, “Der Klanomeister”, Connect Magazine, Aug. 1998, pp. 52-53. |
Notice of Allowance received for U.S. Appl. No. 16/403,370, dated Nov. 19, 2020, 5 pages. |
Number | Date | Country
---|---|---
20210165449 A1 | Jun 2021 | US |
Number | Date | Country
---|---|---
60663345 | Mar 2005 | US | |
60658777 | Mar 2005 | US |
Relation | Number | Date | Country
---|---|---|---
Parent | 16529690 | Aug 2019 | US |
Child | 17175387 | US | |
Parent | 15970571 | May 2018 | US |
Child | 16529690 | US | |
Parent | 14724753 | May 2015 | US |
Child | 15970571 | US | |
Parent | 12486710 | Jun 2009 | US |
Child | 14724753 | US | |
Parent | 11426078 | Jun 2006 | US |
Child | 12486710 | US |
Relation | Number | Date | Country
---|---|---|---
Parent | 11367749 | Mar 2006 | US |
Child | 11426078 | US |