Scrolling based on rotational movement

Abstract
Improved approaches for users to interact with graphical user interfaces of computing devices are disclosed. A rotational user action supplied by a user via a user input device can provide accelerated scrolling. The accelerated nature of the scrolling enables users to scroll or traverse a lengthy data set (e.g., list of items) faster and with greater ease. The acceleration can be provided in successive stages and/or based on the speed of the rotational user action. In one embodiment, the rotational user action is transformed into linear action with respect to a graphical user interface. The resulting acceleration effect causes the linear action to be enhanced such that a lengthy data set is able to be rapidly traversed.
Description
BACKGROUND OF THE INVENTION

Field of the Invention


The present invention relates generally to a computing device and, more particularly, to a handheld computing device having a rotational input unit.


Description of the Related Art


There exist today many styles of input devices for performing operations with respect to a consumer electronic device. The operations generally correspond to moving a cursor and making selections on a display screen. By way of example, the input devices may include buttons, switches, keyboards, mice, trackballs, touch pads, joy sticks, touch screens and the like. Each of these devices has advantages and disadvantages that are taken into consideration when designing the consumer electronic device. In handheld computing devices, the input devices are typically buttons and switches. Buttons and switches are generally mechanical in nature and provide limited control with regard to the movement of a cursor (or other selector) and the making of selections. For example, they are generally dedicated to moving the cursor in a specific direction (e.g., arrow keys) or to making specific selections (e.g., enter, delete, number, etc.). In the case of handheld personal digital assistants (PDAs), the input devices tend to utilize touch-sensitive display screens. When using a touch screen, a user makes a selection on the display screen by pointing directly to objects on the screen using a stylus or finger.


In portable computing devices such as laptop computers, the input devices are commonly touch pads. With a touch pad, the movement of an input pointer (i.e., cursor) corresponds to the relative movements of the user's finger (or stylus) as the finger is moved along a surface of the touch pad. Touch pads can also make a selection on the display screen when one or more taps are detected on the surface of the touch pad. In some cases, any portion of the touch pad may be tapped, and in other cases, a dedicated portion of the touch pad may be tapped. In stationary devices such as desktop computers, the input devices are generally selected from keyboards, mice and trackballs. With a mouse, the movement of the input pointer corresponds to the relative movements of the mouse as the user moves the mouse along a surface. With a trackball, the movement of the input pointer corresponds to the relative movements of a ball as the user rotates the ball within a housing. Both mice and trackball devices generally include one or more buttons for making selections on the display screen.


In addition to allowing input pointer movements and selections with respect to a Graphical User Interface (GUI) presented on a display screen, the input devices may also allow a user to scroll across the display screen in the horizontal or vertical directions. For example, a mouse may include a scroll wheel that allows a user to simply roll the scroll wheel forward or backward to perform a scrolling action. In addition, touch pads may provide dedicated active areas that implement scrolling when the user passes his or her finger linearly across the active area in the x and y directions. Both devices may also implement scrolling via horizontal and vertical scroll bars that are displayed as part of the GUI. Using this technique, scrolling is implemented by positioning the input pointer over the desired scroll bar, selecting the desired scroll bar, and moving the scroll bar by moving the mouse or finger in the y direction (forwards and backwards) for vertical scrolling or in the x direction (left and right) for horizontal scrolling.


Further, consumer electronic products other than computers, such as cordless telephones, stereo receivers and compact-disc (CD) players, have used dials to enable users to select a phone number, a radio frequency and a specific CD, respectively. Here, typically, a limited-resolution display is used together with the dial. The display, at best, displays only a single item (number, frequency or label) in a low resolution manner using a character generator LCD. In other words, these devices have used single line, low resolution LCD readouts.


Thus, there is a continuing need for improved user input devices that facilitate greater ease of use of computing devices.


SUMMARY OF THE INVENTION

The present invention relates to improved approaches for users of computing devices to interact with graphical user interfaces. A rotational user action supplied by a user via a user input device can provide accelerated scrolling. The accelerated nature of the scrolling enables users to scroll or traverse a lengthy data set (e.g., list of items) faster and with greater ease. The acceleration can be provided in successive stages and/or based on the speed of the rotational user action. In one embodiment, the rotational user action is transformed into linear action with respect to a graphical user interface. The resulting acceleration effect causes the linear action to be enhanced such that a lengthy data set is able to be rapidly traversed. Other aspects and features of the invention will become apparent below. Although the type of computing device can vary, the invention is particularly well-suited for use with a media player.


Other aspects and advantages of the invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrating by way of example the principles of the invention.





BRIEF DESCRIPTION OF THE DRAWINGS

The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements and in which:



FIG. 1 is a flow diagram of scroll processing according to one embodiment of the invention.



FIG. 2 is a flow diagram of list navigation processing according to another embodiment of the invention.



FIG. 3 is a flow diagram of acceleration amount processing according to one embodiment of the invention.



FIG. 4 is a flow diagram of acceleration amount processing according to another embodiment of the invention.



FIG. 5 is a representative acceleration state machine according to one embodiment of the invention.



FIG. 6 is a flow diagram of next portion determination processing according to one embodiment of the invention.



FIG. 7A is a perspective diagram of a computer system in accordance with one embodiment of the invention.



FIG. 7B is a perspective diagram of a media player in accordance with one embodiment of the present invention.



FIG. 8A is a block diagram of a media player according to one embodiment of the invention.



FIG. 8B is a block diagram of a computing system according to one embodiment of the invention.



FIG. 9 shows the media player of FIG. 7B being used by a user in accordance with one embodiment of the invention.



FIG. 10A is a flow diagram of user input processing according to one embodiment of the invention.



FIG. 10B is a flow diagram of user input processing according to another embodiment of the invention.



FIG. 11 is a flow diagram of user input processing according to another embodiment of the invention.



FIG. 12 is a block diagram of a rotary input display system in accordance with one embodiment of the invention.





DETAILED DESCRIPTION OF THE INVENTION

The present invention relates to improved approaches for users of computing devices to interact with graphical user interfaces. A rotational user action supplied by a user via a user input device can provide accelerated scrolling. The accelerated nature of the scrolling enables users to scroll or traverse a lengthy data set (e.g., list of items) faster and with greater ease. The acceleration can be provided in successive stages and/or based on the speed of the rotational user action. In one embodiment, the rotational user action is transformed into linear action with respect to a graphical user interface. The resulting acceleration effect causes the linear action to be enhanced such that a lengthy data set is able to be rapidly traversed. Other aspects and features of the invention will become apparent below. Although the type of computing device can vary, the invention is particularly well-suited for use with a media player.


Embodiments of the invention are discussed below with reference to FIGS. 1-12. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes as the invention extends beyond these limited embodiments.



FIG. 1 is a flow diagram of scroll processing 100 according to one embodiment of the invention. The scroll processing 100 assists a user in scrolling through a data set. The scroll processing 100 initially receives 102 a number of units associated with a rotational user input. The number of units is an indication of an amount of rotational movement a user has invoked with respect to a rotational input device.


Next, an acceleration factor is determined 104. The acceleration factor is an indication of the degree of acceleration to be utilized with the scroll processing 100. After the acceleration factor is determined 104, the number of units that are associated with the rotational user input is modified 106 by the acceleration factor. In one embodiment, the number of units is modified by multiplication with the acceleration factor. In various other embodiments, the number of units can be modified in various other ways.


After the number of units has been modified 106, a next portion of the data set that is being scrolled through can be determined 108 based on the modified number of units. Once the next portion has been determined 108, the next portion of the data set can be presented 110. Typically, the next portion of the data set associated with the scroll processing 100 is presented 110 to the user that caused the rotational user input. In one embodiment, the next portion of the data set can be presented 110 to the user by displaying the next portion of the data set on a display device. In another embodiment of the invention, the next portion of the data set can be presented 110 to the user by displaying the next portion of the data set with at least one item distinctively or distinguishably displayed (e.g., highlighted) from the other items. In still another embodiment, the next portion of the data set can be presented 110 to the user by playing or executing a file. After the next portion of the data set has been presented 110, the scroll processing 100 is complete and ends. However, the scroll processing 100 will repeat for each rotational user input.
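
By way of illustration only, operations 102-110 might be combined as in the following minimal sketch (in C). The data model (a simple item index), the fixed acceleration factor and the chunking value are assumptions made for the example and are not part of the described embodiment; the actual determination of the acceleration factor and of the next portion is elaborated below with reference to FIGS. 3-6.

```c
/* A minimal sketch of the scroll processing 100 of FIG. 1.  The item-index
 * data model, the fixed 2x acceleration factor and the chunking value of 5
 * are assumptions made for the example. */
#include <stdio.h>

static int current_item = 0;              /* index of the item currently presented */

static int acceleration_factor(void)      /* operation 104, trivial stand-in */
{
    return 2;                             /* e.g. a 2x acceleration effect */
}

static void present(int item)             /* operation 110 */
{
    printf("presenting item %d\n", item);
}

static void scroll_processing(int units)  /* operation 102: units of rotational movement received */
{
    int af = acceleration_factor();                 /* operation 104 */
    int modified = (af > 0) ? units * af : units;   /* operation 106; an AF of 0 read as "no acceleration" */
    current_item += modified / 5;                   /* operation 108, chunking value of 5 (see FIG. 6) */
    present(current_item);                          /* operation 110 */
}

int main(void)
{
    scroll_processing(25);                /* one rotational user input of 25 units */
    return 0;
}
```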


Here, the faster the rate of rotational user input, the further down (or up) the list the next item to be presented will be. It should be noted that the rate of rotational user input can be relative or absolute in nature. Still further, the rate of rotational user input need not be an actual velocity value, but could be a count or other value that is proportional to or influenced by the rate of rotational user input.


A data set as used herein pertains to a set of data. As one example, the data set can be a list of items (e.g., a list of songs). As another example, the data set can be a media file (e.g., MP3 or other audio file, video file, or image file). In one embodiment, the data set can be considered a sequential data set because the data within the set is often sequential. For example, the songs in a list are arranged sequentially and the data within an audio file are also arranged sequentially.



FIG. 2 is a flow diagram of list navigation processing 200 according to another embodiment of the invention. The list navigation processing 200 initially determines 202 a rate of rotational user input (e.g., dial turn). The rotational user input is provided through user interaction with a rotational input device. A list length is then obtained 204 and a current item in the list is identified. Typically, the current item is the item in the list that is being displayed. In one embodiment, the current item is highlighted such that it is distinctively displayed from other items of the list that are simultaneously displayed.


A next item in the list to be displayed is then determined 206 based on the rotational user input. The determination 206 of the next item in the list can also be dependent on the list length and the current item in the list. For example, the greater the rate of the rotational user input, the further apart the next item is from the current item in the list. The rate of the rotational user input and the length of the list can affect whether acceleration (e.g., acceleration factor) is provided for navigating the list. Thereafter, the list navigation processing 200 displays 208 a next item and one or more subsequent (or neighboring) items thereto. For example, the next item and the one or more subsequent items can be displayed 208 by a display screen produced by a display device. Additionally, the list navigation processing 200 can provide 210 an audio feedback. The audio feedback provides an audible sound that indicates feedback to the user as to the rate at which the items in the list are being traversed. The audible feedback can thus also be proportional to the rate of rotational user input.



FIG. 3 is a flow diagram of acceleration amount processing 300 according to one embodiment of the invention. The acceleration amount processing 300 is, for example, processing that can be performed to determine an acceleration factor. In one embodiment, the acceleration amount processing 300 is, for example, suitable for use as the operation 104 illustrated in FIG. 1. In another embodiment, the acceleration amount processing 300 is, for example, suitable for use as a sub-operation for the operation 206 illustrated in FIG. 2.


The acceleration amount processing 300 initially determines 302 a speed of a rotational user input. As previously noted with respect to FIG. 1, the rotational user input is provided by a rotational input device that is interacted with by a user. In one embodiment, the speed of the rotational user input is determined 302 based on the number of rotational units identified by the rotational user input. More particularly, in another embodiment, the speed of the rotational user input is determined 302 based on the number of rotational units and an amount of time over which such rotational inputs were received. The speed of the rotational user input can, for example, be considered to be the speed of a user movement or the speed of rotation of a rotational input device.


After the speed of the rotational user input has been determined 302, a decision 304 determines whether the speed of the rotational user input is slow. The speed of the rotational user input can be determined or estimated, directly or indirectly, in a variety of ways. In one embodiment, a threshold is used to distinguish between slow and fast speeds of the rotational user input. The precise rate of rotation that is deemed to be the threshold between slow and fast can vary with application. The threshold can be determined experimentally based upon the particular application for which the acceleration amount processing 300 is utilized.


When the decision 304 determines that the speed of the rotational user input is slow, then the acceleration factor (AF) is set 306 to zero (0). On the other hand, when the decision 304 determines that the speed of the rotational user input is not slow (i.e., the speed is fast), then a decision 308 determines whether an amount of time (Δt1) since the last time the acceleration was altered exceeds a first threshold (TH1). When the decision 308 determines that the amount of time (Δt1) since the last acceleration update is longer than the first threshold amount (TH1), then the acceleration factor is modified 310. In particular, in this embodiment, the modification 310 causes the acceleration factor to be doubled.


Following the operation 310, as well as following the operation 306, an acceleration change time is stored 312. The acceleration change time reflects the time that the acceleration factor was last updated. The acceleration change time is stored such that the decision 308 can determine the amount of time since the acceleration was last modified (i.e., Δt1). Following the operation 312, as well as directly following the decision 308 when the amount of time since the last acceleration update was made is less than the first threshold (TH1), the acceleration amount processing 300 is complete and ends.


Hence, according to the acceleration amount processing 300, when the speed of the rotational user input is deemed slow, the acceleration factor is reset to zero (0), which indicates that no acceleration effect is imposed. On the other hand, when the speed of the rotational user input indicates that the speed of such rotation is fast, then the acceleration effect being imposed is doubled. In effect, then, if the user interacts with the rotational input device such that the speed of rotation is slow, then no acceleration effect is provided. In such case, the user can scroll through a data set (e.g., list, audio file) with high resolution. On the other hand, when the user interacts with the rotational input device with a high speed of rotation, then the acceleration effect is step-wise increased (e.g., via doubling or other means). The acceleration effect provided by the invention enables a user to interact with a rotational input device in an efficient, user-friendly manner such that long or extensive data sets can be scrolled through in a rapid manner.
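
By way of illustration, the acceleration amount processing 300 might be implemented as sketched below. The speed threshold, the first threshold TH1, the time units and the treatment of a doubling from "no acceleration" as a step to 2x (consistent with FIG. 5) are assumptions made for the example.

```c
/* A sketch of the acceleration amount processing 300 of FIG. 3.
 * SLOW_SPEED and TH1_MS are illustrative values only. */
#define SLOW_SPEED  50      /* units/second below which input is deemed "slow" (assumed) */
#define TH1_MS      250     /* first threshold TH1: minimum time between doublings (assumed) */

static int      accel_factor = 0;            /* AF; 0 means no acceleration is imposed */
static unsigned accel_change_time_ms = 0;    /* time AF was last updated (operation 312) */

void update_acceleration(int speed, unsigned now_ms)
{
    if (speed < SLOW_SPEED) {                             /* decision 304 */
        accel_factor = 0;                                 /* operation 306 */
        accel_change_time_ms = now_ms;                    /* operation 312 */
    } else if (now_ms - accel_change_time_ms > TH1_MS) {  /* decision 308: Δt1 > TH1? */
        /* operation 310: double the acceleration effect; a step from
         * "no acceleration" is taken to be 2x, as in FIG. 5 */
        accel_factor = (accel_factor == 0) ? 2 : accel_factor * 2;
        accel_change_time_ms = now_ms;                    /* operation 312 */
    }
    /* otherwise: input is fast but TH1 has not yet elapsed, so AF is unchanged */
}
```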



FIG. 4 is a flow diagram of acceleration amount processing 400 according to another embodiment of the invention. The acceleration amount processing 400 is generally similar to the acceleration amount processing 300 illustrated in FIG. 3. However, the acceleration amount processing 400 includes additional operations that can be optionally provided. More specifically, the acceleration amount processing 400 can utilize a decision 402 to determine whether a duration of time (Δt2) since the last rotational user input is greater than a second threshold (TH2). When the decision 402 determines that the duration of time (Δt2) since the last rotational user input exceeds the second threshold (TH2), then the acceleration factor is reset 306 to zero (0). Here, when the user has not provided a subsequent rotational user input for more than the duration of the second threshold (TH2), then the acceleration amount processing 400 is reset to no acceleration because it assumes that the user is restarting a scrolling operation and thus would not want to continue with a previous accelerated rate of scrolling.


The rate at which the acceleration effect is doubled is restricted such that the doubling (i.e., operation 310) can only occur at a rate below a maximum rate. The acceleration amount processing 400 also includes a decision 404 that determines whether the acceleration factor (AF) has reached a maximum acceleration factor (AFMAX). The decision 404 can be utilized to limit the maximum acceleration that can be imposed by the acceleration amount processing 400. For example, the acceleration factor (AF) could be limited to a factor of eight (8), representing that with maximum acceleration scrolling would occur at a rate eight (8) times faster than non-accelerated scrolling.


Still further, the acceleration amount processing 400 stores 406 a last input time. The last input time (t2) represents the time the last rotational user input was received (or processed). Note that the duration of time (Δt2) can be determined by the difference between a current time associated with an incoming rotational user input and the last input time (t2).


As previously noted, the acceleration amount processing 300, 400 is, for example, processing that can be performed to determine an acceleration factor. However, although not depicted in FIG. 3 or 4, when the length of the data set (e.g., list) is short, then the acceleration can be set to zero (i.e., no acceleration) and the acceleration amount processing 300, 400 can be bypassed. For example, in one embodiment, where the data set is a list, if the display screen can display only five (5) entries at a time, then the list can be deemed short if it does not include more than twenty (20) items. Consequently, according to another embodiment of the invention, the acceleration effect imposed by the invention can be dependent on the length of the data set (e.g., list).
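
A sketch extending the previous one with the optional checks of the acceleration amount processing 400 (idle reset via TH2, maximum acceleration factor) and the short-list bypass described above might look as follows. All numeric values are illustrative assumptions, not taken from the text.

```c
/* Extends the FIG. 3 sketch with the optional checks of FIG. 4 and the
 * short-list bypass described above.  All numeric values are assumed. */
#define SLOW_SPEED      50      /* units/second (assumed) */
#define TH1_MS          250     /* minimum time between doublings (assumed) */
#define TH2_MS          1000    /* idle time after which acceleration is reset (assumed) */
#define AF_MAX          8       /* maximum acceleration factor (8x, per the example above) */
#define SHORT_LIST_LEN  20      /* lists this short are never accelerated (per the example above) */

static int      af = 0;                 /* acceleration factor; 0 = no acceleration */
static unsigned accel_change_ms = 0;    /* operation 312 */
static unsigned last_input_ms = 0;      /* operation 406 */

void update_acceleration_400(int speed, unsigned now_ms, int list_length)
{
    if (list_length <= SHORT_LIST_LEN) {            /* short data set: bypass acceleration */
        af = 0;
    } else if (now_ms - last_input_ms > TH2_MS) {   /* decision 402: Δt2 > TH2, user restarted */
        af = 0;                                     /* reset 306 */
        accel_change_ms = now_ms;
    } else if (speed < SLOW_SPEED) {                /* decision 304 */
        af = 0;                                     /* operation 306 */
        accel_change_ms = now_ms;
    } else if (af < AF_MAX &&                       /* decision 404: honor the AFMAX cap */
               now_ms - accel_change_ms > TH1_MS) { /* decision 308 */
        af = (af == 0) ? 2 : af * 2;                /* operation 310 */
        accel_change_ms = now_ms;
    }
    last_input_ms = now_ms;                         /* operation 406: store the last input time */
}
```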


The accelerated scrolling can also be depicted as a state machine having states representing different acceleration levels or different rates of acceleration. The particulars of such a state machine will vary widely with implementation.



FIG. 5 is a representative acceleration state machine 500 according to one embodiment of the invention. The acceleration state machine 500 has four states of acceleration. A first state 502 provides no acceleration. From the first state 502, when the speed of a next rotational user input is slow, the acceleration state machine 500 remains at the first state 502. Alternatively, when the speed of the rotational user input is fast, the acceleration state machine 500 transitions from a first state 502 to a second state 504. The second state 504 provides 2× acceleration, meaning that the resulting rate of scrolling would be twice that of the first state. When the acceleration state machine 500 is at the second state 504, when the speed of a next rotational user input is slow, the acceleration state machine 500 transitions back to the first state 502. Alternatively, when the speed of the next rotational user input is fast, the acceleration state machine 500 transitions from the second state 504 to a third state 506. The third state 506 provides 4× acceleration, meaning that the rate of scrolling would be four times that of the first state 502 or twice that of the second state 504. At the third state 506, when the speed of the next rotational user input is slow, the acceleration state machine 500 transitions from the third state 506 to the first state 502. Alternatively, when the speed of the next rotational user input is fast, the acceleration state machine 500 transitions from the third state 506 to a fourth state 508. At the fourth state 508, 8× acceleration is provided, meaning that the acceleration rate of scrolling is eight times that of the first state 502, four times that of the second state 504, or twice that of the third state 506. At the fourth state 508, when the speed of the next rotational user input is slow, the acceleration state machine 500 transitions from the fourth state 508 to the first state 502. Alternatively, when the speed of the next rotational user input is fast, the acceleration state machine 500 remains at the fourth state 508.
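
By way of illustration, the state machine of FIG. 5 might be coded as follows; the function returns the scrolling multiplier for the state reached after one rotational user input, with a "fast" input advancing one state and a "slow" input returning to the first state.

```c
/* A sketch of the acceleration state machine 500 of FIG. 5. */
enum accel_state { STATE_1X = 502, STATE_2X = 504, STATE_4X = 506, STATE_8X = 508 };

static enum accel_state state = STATE_1X;    /* first state 502: no acceleration */

int accel_step(int input_is_fast)
{
    if (!input_is_fast) {
        state = STATE_1X;                    /* any state + slow input -> first state 502 */
    } else {
        switch (state) {
        case STATE_1X: state = STATE_2X; break;   /* 502 -> 504 */
        case STATE_2X: state = STATE_4X; break;   /* 504 -> 506 */
        case STATE_4X: state = STATE_8X; break;   /* 506 -> 508 */
        case STATE_8X: /* remain at 508 */ break;
        }
    }
    switch (state) {
    case STATE_2X: return 2;
    case STATE_4X: return 4;
    case STATE_8X: return 8;
    default:       return 1;                 /* first state: unaccelerated scrolling */
    }
}
```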



FIG. 6 is a flow diagram of next portion determination processing 600 according to one embodiment of the invention. The next portion determination processing 600 is, for example, processing performed by the operation 108 illustrated in FIG. 1.


The next portion determination processing 600 receives 602 the modified number of units. For example, at operation 106 of FIG. 1, the number of units was modified 106 by the acceleration factor to determine the modified number of units. A remainder value is then added 604 to the modified number of units. The remainder value pertains to a previously determined remainder value as discussed below. Next, the modified number of units is divided 606 by a chunking value to determine a next portion. The next portion is a subset of the data set that is eventually presented on a display device. For example, the next portion can pertain to one or more items in a list when the data set pertains to a list of items. In another example, the next portion can pertain to a segment or position in an audio file when the data set pertains to an audio file. In any case, the remainder value from the operation 606 is then saved 608 for subsequent usage in computing a subsequent next portion. Following the operation 608, the next portion determination processing 600 is complete and ends. Although the use of the remainder value is not necessary, the scrolling provided by the invention may be smoother to the user when the remainder is carried forward as described above.


As one example of the scroll processing according to the invention, consider the following exemplary case. Assume that the number of units associated with a rotational user input is 51 units. Also assume that an acceleration factor was determined to be 2. Hence, the modified number of units, according to one embodiment, would then be 102 units (51*2). In one implementation, a previous remainder value (if not stale) can be added to the modified number of units. Assume that the previous remainder value was 3, then the modified number of units becomes 105 (102+3). Thereafter, to determine the next portion of the data set, the modified number of units (105) is then divided by a chunking value (e.g., 5). Hence, the resulting value 21 indicates that the next portion of the data set to be presented (i.e., displayed on a display device) would be 21 items down (up) in the list from the current item.
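
The example can be verified with the following sketch of the next portion determination processing 600; the chunking value and the function names are illustrative assumptions.

```c
/* A sketch of the next portion determination processing 600, reproducing
 * the worked example above (51 units, acceleration factor 2, carried
 * remainder 3, chunking value 5). */
#include <stdio.h>

#define CHUNKING_VALUE 5

static int remainder_units = 0;   /* remainder carried between inputs (operations 604/608) */

int next_portion_offset(int units, int acceleration_factor)
{
    int modified = units * acceleration_factor;    /* operation 106 of FIG. 1 */
    modified += remainder_units;                   /* operation 604: add the prior remainder */
    int offset = modified / CHUNKING_VALUE;        /* operation 606: chunk into whole items */
    remainder_units = modified % CHUNKING_VALUE;   /* operation 608: save remainder for next time */
    return offset;                                 /* items to move down (or up) the list */
}

int main(void)
{
    remainder_units = 3;                                 /* previous remainder from the example */
    printf("%d items\n", next_portion_offset(51, 2));    /* prints "21 items": (51*2 + 3) / 5 */
    return 0;
}
```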


The scroll, list navigation or acceleration amount processing discussed above can be utilized with respect to an audio player having a screen that displays a list of songs, or that provides a scroll bar indicating position of playing within an audio file. Such an audio player typically displays different screens on the display. Each such screen can be individually scrolled through using separate position and acceleration values. Alternatively, the acceleration values can be shared across multiple different screens. Each such screen could be associated with a different list, a portion of which is displayed on the screen at a time and, through scrolling, the displayed portion can be altered in an accelerated manner. The data set being scrolled can be a list, or it can be an audio file whose play position is represented by a scroll bar. Additional details of screens suitable for use with an audio player are described in U.S. Provisional Patent Application No. 60/399,806, filed on Jul. 30, 2002, which is hereby incorporated herein by reference.



FIG. 7A is a perspective diagram of a computer system 650 in accordance with one embodiment of the invention. The computer system 650 includes a base housing 652 that encloses electronic circuitry that performs the computing operations for the computing system 650. Typically, the electronic circuitry includes a microprocessor, memory, I/O controller, graphics controller, etc. The housing 652 also provides a removable computer readable medium drive 654 in which a removable computer readable medium can be placed so as to electronically or optically read data therefrom. The computer housing 652 is also coupled to a display device 656 on which a screen display can be presented for a user of the computer system 650 to view. Still further, the computer system 650 includes a keyboard apparatus 658. The keyboard apparatus 658 allows a user to interact with a computer program (application program or operating system) performed by the computer system 650. In this regard, the keyboard apparatus 658 includes a plurality of keys 660 and a rotational input unit 662. The rotational input unit 662 allows a user to perform a rotational movement with respect to the rotational input unit 662. The rotational movement (rotational user input) can then be processed by the electronic circuitry of the computer system 650 and used to manipulate navigation or selection actions with respect to a graphical user interface being presented to the user on the display device 656. The keyboard apparatus 658 can also include a button 664 associated with the rotational input unit 662. As shown in FIG. 7A, the button 664 can be provided at a center region of the rotational input unit 662. However, the button 664 is not required and, if provided, can be placed elsewhere, such as outside the periphery of the rotational input unit 662.



FIG. 7B is a perspective diagram of a media player 700 in accordance with one embodiment of the present invention. The term “media player” generally refers to computing devices that are dedicated to processing media such as audio, video or other images. In one implementation, the media player is a portable computing device. Examples of media players include music players, game players, video players, video recorders, cameras and the like. These computing devices are generally portable so as to allow a user to listen to music, play games or video, record video or take pictures wherever the user travels. In one embodiment, the media player is a handheld device that is sized for placement into a pocket of the user (i.e., pocket-sized). By being pocket-sized, the user does not have to directly carry the device and therefore the device can be taken almost anywhere the user travels (e.g., the user is not limited by carrying a large, bulky and often heavy device, as in a portable computer). For example, in the case of a music player (e.g., MP3 player), a user may use the device while working out at the gym. In the case of a camera, a user may use the device while mountain climbing. Furthermore, the device may be operated by the user's hands; no reference surface such as a desktop is needed. In one implementation, the music player can be pocket-sized and rather lightweight (e.g., dimensions of 2.43 by 4.02 by 0.78 inches and a weight of 6.5 ounces) for true portability.


The media player 700 typically has connection capabilities that allow a user to upload and download data to and from a host device such as a general purpose computer (e.g., desktop computer or portable computer). For example, in the case of a camera, photo images may be downloaded to the general purpose computer for further processing (e.g., printing). With regard to music players, songs and playlists stored on the general purpose computer may be downloaded into the music player. In one embodiment, the media player 700 can be a pocket-sized handheld MP3 music player that allows a user to store a large collection of music.


As shown in FIG. 7B, the media player 700 includes a housing 702 that encloses various electrical components (including integrated circuit chips and other circuitry) to provide computing capabilities for the media player 700. The integrated circuit chips and other circuitry may include a microprocessor, memory (e.g., ROM or RAM), a power source (e.g., a battery), a circuit board, a hard drive, and various input/output (I/O) support circuitry. In the case of music players, the electrical components may include components for outputting music such as an amplifier and a digital signal processor (DSP). In the case of video recorders or cameras, the electrical components may include components for capturing images such as image sensors (e.g., charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS)) or optics (e.g., lenses, splitters, filters). The housing may also define the shape or form of the media player. That is, the contour of the housing 702 may embody the outward physical appearance of the media player 700.


The media player 700 also includes a display screen 704. The display screen 704 is used to display a Graphical User Interface (GUI) as well as other information to the user (e.g., text, objects, graphics). By way of example, the display screen 704 may be a liquid crystal display (LCD). In one particular embodiment, the display screen corresponds to a high-resolution display with a white LED backlight to give clear visibility in daylight as well as in low-light conditions. Additionally, according to one embodiment, the display screen 704 can be about 2 inches (measured diagonally) and provide a 160-by-128 pixel resolution. The display screen 704 can also operate to simultaneously display characters of multiple languages. As shown in FIG. 7B, the display screen 704 is visible to a user of the media player 700 through an opening 705 in the housing 702, and through a transparent wall 706 that is disposed over the opening 705. Although transparent, the transparent wall 706 may be considered part of the housing 702 since it helps to define the shape or form of the media player 700.


The media player 700 includes a rotational input device 710. The rotational input device 710 receives a rotational input action from a user of the media player 700. The rotational input action is used to control one or more control functions for controlling or interacting with the media player 700 (or application operating thereon). In one embodiment, the control function corresponds to a scrolling feature. The direction of scrolling can vary depending on implementation. For example, scrolling may be implemented vertically (up or down) or horizontally (left or right). In the case of a music player, a finger moved about the rotational input device 710 may initiate a control function for scrolling through a song menu displayed on the display screen 704. The term “scrolling” as used herein generally pertains to moving displayed data (e.g., text or graphics) across a viewing area on a display screen 704 so that at least one new item of data (e.g., line of text or graphics) is brought into view in the viewing area. In essence, the scrolling function allows a user to view sets of data currently outside of the viewing area. The viewing area may be the entire viewing area of the display screen 704 or it may be only a portion of the display screen 704 (e.g., a window frame).


By way of example, in the case of a music player (e.g., MP3 player), the scrolling feature may be used to help browse through songs stored in the music player. To elaborate, the display screen 704, during operation, may display a list of media items (e.g., songs). A user of the media player 700 is able to linearly scroll through the list of media items by providing a rotational input action using the rotational input device 710. The displayed items from the list of media items are varied commensurate with the rotational input action such that the user is able to effectively scroll through the list of media items. However, since the list of media items can be rather lengthy, the invention provides the ability for the user to rapidly traverse (or scroll) through the list of media items. In effect, the user is able to accelerate their traversal of the list of media items by providing the rotational input action at greater speeds. The direction of the rotational input action may be arranged to control the direction of scrolling.


In addition to the above, the media player 700 may also include one or more buttons 712. The buttons 712 are configured to provide one or more dedicated control functions for making selections or issuing commands associated with operating the media player 700. By way of example, in the case of a music player, the button functions may be associated with opening a menu, playing a song, fast forwarding a song, seeking through a menu and the like. In most cases, the button functions are implemented via a mechanical clicking action. The position of the buttons 712 relative to the rotational input device 710 may be widely varied. For example, they may be adjacent to one another or spaced apart. In the illustrated embodiment, the buttons 712 are configured to surround the inner and outer perimeter of the rotational input device 710. In this manner, the buttons 712 may provide tangible surfaces that define the outer boundaries of the rotational input device 710. As shown, there are four buttons 712A that surround the outer perimeter and one button 712B disposed in the center or middle of the rotational input device 710. By way of example, the plurality of buttons 712 may include a menu button, play/stop button, forward seek button, reverse seek button, and the like.


Moreover, the media player 700 may also include a power switch 714, a headphone jack 716 and a data port 718. The power switch 714 is configured to turn the media device 700 on and off. The headphone jack 716 is capable of receiving a headphone connector associated with headphones configured for listening to sound being outputted by the media device 700. The data port 718 is capable of receiving a data connector/cable assembly configured for transmitting and receiving data to and from a host device, such as a general purpose computer. By way of example, the data port 718 may be used to upload or download songs to and from the media device 700. The data port 718 may be widely varied. For example, the data port may be a PS/2 port, a serial port, a parallel port, a USB port, a FireWire port, and the like. In some cases, the data port 718 may be a radio frequency (RF) link or optical infrared (IR) link to eliminate the need for a cable. Although not shown in FIG. 7B, the media player 700 may also include a power port that receives a power connector/cable assembly configured for delivering power to the media player 700. In some cases, the data port 718 may serve as both a data and a power port.



FIG. 8A is a block diagram of a media player 800 according to one embodiment of the invention. The media player 800 can, for example, represent internal components of the media player 700.


The media player 800 includes a processor 802 that pertains to a microprocessor or controller for controlling the overall operation of the media player 800. The media player 800 stores media data pertaining to media items in a file system 804 and a cache 806. The file system 804 is, typically, a storage disk or a plurality of disks. The file system typically provides high capacity storage capability for the media player 800. However, since the access time to the file system 804 is relatively slow, the media player 800 also includes a cache 806. The cache 806 is, for example, Random-Access Memory (RAM) provided by semiconductor memory. The relative access time to the cache 806 is substantially shorter than for the file system 804. However, the cache 806 does not have the large storage capacity of the file system 804. Further, the file system 804, when active, consumes more power than does the cache 806. The power consumption is particularly important when the media player 800 is a portable media player that is powered by a battery (not shown).


The media player 800 also includes a user input device 808 that allows a user of the media player 800 to interact with the media player 800. For example, the user input device 808 can take a variety of forms, such as a button, keypad, dial, etc. Still further, the media player 800 includes a display 810 (screen display) that can be controlled by the processor 802 to display information to the user. A data bus 811 can facilitate data transfer between at least the file system 804, the cache 806, the processor 802, and the coder/decoder (CODEC) 812. The media player 800 can also include an audio feedback unit (not shown) to provide audio feedback for user interactions (such as with the user input device 808).


In one embodiment, the media player 800 serves to store a plurality of media items (e.g., songs) in the file system 804. When a user desires to have the media player play a particular media item, a list of available media items is displayed on the display 810. Then, using the user input device 808, a user can select one of the available media items. The processor 802, upon receiving a selection of a particular media item, supplies the media data (e.g., audio file) for the particular media item to a coder/decoder (CODEC) 812. The CODEC 812 then produces analog output signals for a speaker 814. The speaker 814 can be a speaker internal to the media player 800 or external to the media player 800. For example, headphones or earphones that connect to the media player 800 would be considered an external speaker.



FIG. 8B is a block diagram of a computing system 850 according to one embodiment of the invention. The computing system 850 can, for example, represent a portion of any of the computer system 650 shown in FIG. 7A, the media player 700 shown in FIG. 7B, or the media player 800 shown in FIG. 8A.


The computing system 850 includes a housing 852 that exposes a rotational input device 854. The housing 852 can be a computer's housing or an input/output device's housing. The rotational input device 854 permits a user to interact with the computing system 850 through a rotational action. The rotational action results from either rotation of the rotational input device 854 itself or by rotation of a stylus or user's finger about the rotational input device 854. As examples, the rotational input device 854 can be a rotary dial (including, e.g., a navigational wheel or a scroll wheel) capable of being rotated or a touch pad capable of rotational sensing. In one embodiment, the touch pad has a circular shape. A rotation pickup unit 856 couples to the rotational input device 854 to sense the rotational action. For example, the rotation pickup unit 856 can be optically or electrically coupled to the rotational input device 854.


The computing system 850 further includes a processor 858, a display 860 and an audio feedback unit 862. Signals pertaining to the rotational action are supplied to the processor 858. The processor 858 not only performs processing operations for application programs hosted by the computing system 850 but also can control the display 860 and the audio feedback unit 862. Alternatively, a specialized controller or other circuitry can support the processor 858 in controlling the display 860 or the audio feedback unit 862.


The processor 858 causes a display screen to be produced on the display 860. In one implementation, the display screen includes a selectable list of items (e.g., media items) from which a user may select one or more of the items. By the user providing a rotational action with respect to the rotational input device 854, the list can be scrolled through. The processor 858 receives the signals pertaining to the rotational action from the rotation pickup unit 856. The processor 858 then determines the next items of the list that are to be presented on a display screen by the display 860. In making this determination, the processor 858 can take into consideration the length of the list. Typically, the processor 858 will determine the rate of the rotational action such that the transitioning to different items in the media list can be performed at a rate proportional to the rate of the rotational action.


The processor 858 can also control the audio feedback unit 862 to provide audio feedback to a user. The audio feedback can, for example, be a clicking sound produced by the audio feedback unit 862. In one embodiment, the audio feedback unit 862 is a piezoelectric buzzer. As the rate of transitioning through the list of items increases, the frequency of the clicking sounds can increase. Alternatively, when the rate at which the rotational input device 854 is turned slows, the rate of transitioning through the list of items decreases, and thus the frequency of the clicking sounds correspondingly slows. Hence, the clicking sounds provide audio feedback to the user as to the rate at which the items within the list of items are being traversed.
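
One possible way to derive such feedback, assuming one click is emitted per item traversed and that the audio feedback unit 862 is driven from a periodic timer, is sketched below; the function name and units are illustrative assumptions.

```c
/* A sketch of deriving the click rate for the audio feedback unit 862.
 * Returns the interval between buzzer clicks in milliseconds; 0 means the
 * buzzer stays silent.  One click per item traversed is assumed. */
unsigned click_period_ms(int items_per_second)
{
    if (items_per_second <= 0)
        return 0;                                /* not scrolling: no clicks */
    /* The faster the list is traversed, the shorter the interval between
     * clicks and hence the higher the click frequency heard by the user. */
    return 1000u / (unsigned)items_per_second;
}
```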



FIG. 9 shows the media player 700 of FIG. 7B being used by a user 920 in accordance with one embodiment of the invention. In this embodiment, the user 920 is linearly scrolling (as shown by arrow 924) through a list of songs 922 displayed on the display screen 704 via a slider bar 923. As shown, the media player 700 is comfortably held in one hand 926 while being comfortably addressed by the other hand 928. This configuration generally allows the user 920 to easily actuate the rotational input device 710 with one or more fingers. For example, the thumb 930 and right-most fingers 931 (or left-most fingers if left handed) of the first hand 926 are used to grip the sides of the media player 700 while a finger 932 of the opposite hand 928 is used to actuate the rotational input device 710.


Referring to FIG. 9, and in accordance with one embodiment of the invention, the rotational input device 710 can be continuously actuated by a circular motion of the finger 932 as shown by arrow 934. For example, the finger may rotate relative to an imaginary axis. In particular, the finger can be rotated through 360 degrees of rotation without stopping. This form of motion may produce incremental or accelerated scrolling through the list of songs 922 being displayed on the display screen 704.



FIG. 10A is a flow diagram of user input processing 1000 according to one embodiment of the invention. The user input processing 1000 is, for example, performed with respect to the computer system 650 illustrated in FIG. 7A or the media player 700 illustrated in FIG. 7B.


The user input processing 1000 displays 1002 a graphical user interface. Then, a rotational movement associated with a user input action is received 1004. Here, the user input action is generally angular, as opposed to linear, and thus pertains to a rotational movement. As discussed in more detail below, the rotational movement can be provided by the user input action. In one example, the rotational movement can be caused by a user acting to rotate a navigational wheel through a user input action. In another example, the rotational movement can be caused by a user's finger or a stylus being moved in a rotational manner through a user input action with respect to a touch pad. After the rotational movement has been received 1004, the rotational movement is converted 1006 into a linear movement. The linear movement is then applied 1008 to at least one object of the graphical user interface. For example, the object of the graphical user interface can be a list, menu or other object having a plurality of selectable items. The linear movement can effect a scroll type action with respect to the object (e.g., list or menu). Alternatively, the linear movement can effect a level adjustment (e.g., volume adjustment) or position adjustment (e.g., slider bar position). After the linear movement has been applied 1008, the user input processing 1000 is complete and ends.
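
By way of illustration, the conversion 1006 of a rotational movement into a linear movement applied 1008 to a list object might be sketched as follows; the scale factor of thirty degrees per row and the list structure are assumptions made for the example.

```c
/* A sketch of operations 1006-1008: converting an angular (rotational)
 * movement into a linear movement applied to a GUI object. */
#define DEGREES_PER_ROW 30      /* rotation needed to move one list row (assumed) */

struct list_object {
    int selected_row;           /* row currently highlighted by the select bar */
    int row_count;              /* total number of rows in the list */
};

void apply_rotational_movement(struct list_object *list, int degrees_moved)
{
    int rows = degrees_moved / DEGREES_PER_ROW;   /* operation 1006: angular -> linear */

    list->selected_row += rows;                   /* operation 1008: apply to the object */
    if (list->selected_row < 0)                   /* clamp to the ends of the list */
        list->selected_row = 0;
    if (list->selected_row >= list->row_count)
        list->selected_row = list->row_count - 1;
}
```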



FIG. 10B is a flow diagram of user input processing 1050 according to another embodiment of the invention. The user input processing 1050 is, for example, performed with respect to the computer system 650 illustrated in FIG. 7A or the media player 700 illustrated in FIG. 7B.


The operations 1052-1060 performed by the user input processing 1050 are similar to the corresponding operations performed by the user input processing 1000 illustrated in FIG. 10A. Additionally, the user input processing 1050 operates to provide 1056 audible feedback corresponding to the rotational movements. In other words, as the rotational movement associated with a user input action is received 1054, audible feedback corresponding to the rotational movement is provided 1056. Such audible feedback provides the user with feedback concerning the extent to which rotational movement has been input. In one embodiment, the rotational movement associated with the user input action is converted into linear movement and applied to an object of a graphical user interface. For example, when the object of the graphical user interface is a multi-item list that is displayed for user scrolling and selection actions, the rotational movement associated with the user input action represents a distance traversed in the multi-item list. When acceleration is applied, the distance traversed is increased (e.g., multiplied). In one embodiment, the audible feedback is provided through a piezoelectric buzzer that is controlled by a processor (or other circuitry). For example, the audio feedback unit 862 shown in FIG. 8B can be a piezoelectric buzzer. The controller for the piezoelectric buzzer can, for example, be a processor of the computer system 650 or the media player 700, or some other circuitry coupled to the piezoelectric buzzer.



FIG. 11 is a flow diagram of user input processing 1100 according to another embodiment of the invention. The user input processing 1100 is, for example, performed by a computing device, such as the computer system 650 illustrated in FIG. 7A or the media player 700 illustrated in FIG. 7B.


The user input processing 1100 begins by the display 1102 of a portion of a list of items together with a select bar. The select bar typically points to or highlights one or more of the items of the list of items. In general, the select bar can be associated with any sort of visual indication specifying one or more of the items of the list of items. Hence, the select bar is one type of visual indicator. Next, a decision 1104 determines whether a rotational movement input has been received. When the decision 1104 determines that a rotational movement input has not yet been received, then a decision 1106 determines whether another input has been received. Here, the inputs are provided by a user of the computing device performing or associated with the user input processing 1100. When the decision 1106 determines that another input has been received, then other processing is performed 1108 to perform any operations or actions caused by the other input. Following the operation 1108, the user input processing 1100 is complete and ends. On the other hand, when the decision 1106 determines that no other input has been received, then the user input processing 1100 returns to repeat the decision 1104.


Once the decision 1104 determines that a rotational movement input has been received, then the rotational movement is converted 1110 to a linear movement. Then, a next portion of the list of items (and placement of the select bar over one of the items) is determined 1112. Thereafter, the next portion of the list of items is displayed 1114. The linear movement operates to move the select bar (or other visual identifier) within the list. In other words, the select bar is scrolled upwards or downwards (in an accelerated or unaccelerated manner) by the user in accordance with the linear motion. As the scrolling occurs, the portion of the list being displayed changes. Following the operation 1114, the user input processing 1100 is complete and ends. However, if desired, the user input processing 1100 can continue following operation 1114 by returning to the decision 1104 such that subsequent rotational movement inputs can be processed to view other portions of the list items in a similar manner.



FIG. 12 is a block diagram of a rotary input display system 1200 in accordance with one embodiment of the invention. By way of example, the rotary input display system 1200 can be provided by a computing device, such as the computer system 650 illustrated in FIG. 7A or the media player 700 illustrated in FIG. 7B. The rotary input display system 1200 utilizes a rotational input device 1202 and a display screen 1204. The rotational input device 1202 is configured to transform a rotational motion 1206 by a user input action (e.g., a swirling or whirling motion) into translational or linear motion 1208 on the display screen 1204. In one embodiment, the rotational input device 1202 is arranged to continuously determine either the angular position of the rotational input device 1202 or the angular position of an object relative to a planar surface 1209 of the rotational input device 1202. This allows a user to linearly scroll through a media list 1211 on the display screen 1204 by inducing the rotational motion 1206 with respect to the rotational input device 1202.


The rotary input display system 1200 also includes a control assembly 1212 that is coupled to the rotational input device 1202. The control assembly 1212 is configured to acquire the position signals from the sensors and to supply the acquired signals to a processor 1214 of the system. By way of example, the control assembly 1212 may include an application-specific integrated circuit (ASIC) that is configured to monitor the signals from the sensors to compute the angular location and direction (and optionally speed and acceleration) from the monitored signals and to report this information to the processor 1214.


The processor 1214 is coupled between the control assembly 1212 and the display screen 1204. The processor 1214 is configured to control display of information on the display screen 1204. In one sequence, the processor 1214 receives angular motion information from the control assembly 1212 and then determines the next items of the media list 1211 that are to be presented on the display screen 1204. In making this determination, the processor 1214 can take into consideration the length of the media list 1211. Typically, the processor 1214 will determine the rate of movement such that the transitioning to different items in the media list 1211 is performed in an accelerated manner when the rotational motion is provided at non-slow speeds, or at a rate proportional to the speed of the rotational motion. In effect, to the user, rapid rotational motion causes faster transitioning through the list of media items 1211. Alternatively, the control assembly 1212 and processor 1214 may be combined in some embodiments.


Although not shown, the processor 1214 can also control a buzzer 1216 to provide audio feedback to a user. The audio feedback can, for example, be a clicking sound produced by the buzzer 1216. In one embodiment, the buzzer 1216 is a piezoelectric buzzer. As the rate of transitioning through the list of media items increases, the frequency of the clicking sounds increases. Alternatively, when the rate of transitioning slows, the frequency of the clicking sounds correspondingly slows. Hence, the clicking sounds provide audio feedback to the user as to the rate at which the media items within the list of media items are being traversed.


The various aspects, features or embodiments of the invention described above can be used alone or in various combinations. The invention is preferably implemented by a combination of hardware and software, but can also be implemented in hardware or software. The invention can also be embodied as computer readable code on a computer readable medium. The computer readable medium is any data storage device that can store data which can thereafter be read by a computer system. Examples of the computer readable medium include read-only memory, random-access memory, CD-ROMs, DVDs, magnetic tape, optical data storage devices, and carrier waves. The computer readable medium can also be distributed over network-coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.


The advantages of the invention are numerous. Different embodiments or implementations may yield one or more of the following advantages. It should be noted that this is not an exhaustive list and there may be other advantages which are not described herein. One advantage of the invention is that a user is able to traverse through a displayed list of items using a rotational user input action. Another advantage of the invention is that a user is able to easily and rapidly traverse a lengthy list of items. Still another advantage of the invention is that the rate of traversal of the list of media items can be dependent on the rate of rotation of a dial (or navigation wheel). Yet still another advantage of the invention is that audible sounds are produced to provide feedback to users of their rate of traversal of the list of media items.


The many features and advantages of the present invention are apparent from the written description, and thus, it is intended by the appended claims to cover all such features and advantages of the invention. Further, since numerous modifications and changes will readily occur to those skilled in the art, it is not desired to limit the invention to the exact construction and operation as illustrated and described. Hence, all suitable modifications and equivalents may be resorted to as falling within the scope of the invention.

Claims
  • 1. A method, comprising: at a portable electronic device with a display, a rotational input device, and an audio feedback unit: displaying a list of items with the display; detecting rotational movement of the rotational input device itself in a first direction; and, in response to detecting rotational movement of the rotational input device itself in the first direction: scrolling the list of items to an entry in the list, and outputting, with the audio feedback unit, a plurality of audible sound effects that are output at different times at a particular rate, wherein the particular rate, which determines the times at which the audible sound effects are output, is based on a characteristic of the detected rotational movement.
  • 2. The method of claim 1, wherein the entry in the list corresponds to a media item.
  • 3. The method of claim 1, wherein the characteristic of the detected rotational movement is a speed of the detected rotational movement.
  • 4. The method of claim 1, wherein the particular rate increases as the detected rotational movement increases.
  • 5. The method of claim 1, wherein the particular rate is proportional to a rate of the detected rotational movement.
  • 6. The method of claim 1, wherein the particular rate indicates a rate at which the items in the list are being scrolled.
  • 7. The method of claim 1, including: in response to detecting rotational movement of the rotational input device itself: moving a visual indicator to the entry in the list.
  • 8. A portable electronic device, comprising: a display; a rotational input device; an audio feedback unit; a processor; and non-transitory computer readable storage media including instructions configured to be executed by the processor, including instructions for: displaying a list of items with the display; detecting rotational movement of the rotational input device itself in a first direction; and, in response to detecting rotational movement of the rotational input device itself in the first direction: scrolling the list of items to an entry in the list, and outputting, with the audio feedback unit, a plurality of audible sound effects that are output at different times at a particular rate, wherein the particular rate, which determines the times at which the audible sound effects are output, is based on a characteristic of the detected rotational movement.
  • 9. The device of claim 8, wherein the entry in the list corresponds to a media item.
  • 10. The device of claim 8, wherein the characteristic of the detected rotational movement is a speed of the detected rotational movement.
  • 11. The device of claim 8, wherein the particular rate increases as the detected rotational movement increases.
  • 12. The device of claim 8, wherein the particular rate is proportional to a rate of the detected rotational movement.
  • 13. The device of claim 8, wherein the particular rate indicates a rate at which the items in the list are being scrolled.
  • 14. The device of claim 8, including instructions configured to cause the processor to perform operations including: in response to detecting rotational movement of the rotational input device itself: moving a visual indicator to the entry in the list.
  • 15. A non-transitory computer readable storage media including instructions that when executed by a portable electronic device with a display, a rotational input device, an audio feedback unit, and a processor, cause the portable electronic device to: display a list of items with the display; detect rotational movement of the rotational input device itself in a first direction; and, in response to detecting rotational movement of the rotational input device itself in the first direction: scroll the list of items to an entry in the list, and output, with the audio feedback unit, a plurality of audible sound effects that are output at different times at a particular rate, wherein the particular rate, which determines the times at which the audible sound effects are output, is based on a characteristic of the detected rotational movement.
  • 16. The computer readable storage media of claim 15, wherein the entry in the list corresponds to a media item.
  • 17. The computer readable storage media of claim 15, wherein the characteristic of the detected rotational movement is a speed of the detected rotational movement.
  • 18. The computer readable storage media of claim 15, wherein the particular rate increases as the detected rotational movement increases.
  • 19. The computer readable storage media of claim 15, wherein the particular rate is proportional to a rate of the detected rotational movement.
  • 20. The computer readable storage media of claim 15, wherein the particular rate indicates a rate at which the items in the list are being scrolled.
  • 21. The computer readable storage media of claim 15, including instructions that when executed cause the portable electronic device to: in response to detecting rotational movement of the rotational input device itself: move a visual indicator to the entry in the list.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 11/959,942, filed Dec. 19, 2007, which is a divisional of U.S. application Ser. No. 10/256,716, filed Sep. 26, 2002, now U.S. Pat. No. 7,312,785, both of which are hereby incorporated by reference herein, and which claims benefit of priority from: (i) U.S. Provisional Patent Application No. 60/387,692, filed Jun. 10, 2002, and entitled “METHOD AND APPARATUS FOR USE OF ROTATIONAL USER INPUTS,” which is hereby incorporated by reference herein; (ii) U.S. Provisional Patent Application No. 60/359,551, filed Feb. 25, 2002, and entitled “TOUCH PAD FOR HANDHELD DEVICE,” which is hereby incorporated by reference herein; and (iii) U.S. Provisional Patent Application No. 60/346,237, filed Oct. 22, 2001, and entitled “METHOD AND SYSTEM FOR LIST SCROLLING,” which is hereby incorporated by reference herein. This application is related to U.S. patent application Ser. No. 10/072,765, filed Feb. 7, 2002, and entitled “MOUSE HAVING A ROTARY DIAL,” now U.S. Pat. No. 7,084,856, which is hereby incorporated by reference herein. This application is also related to U.S. patent application Ser. No. 10/188,182, filed Jul. 1, 2002, and entitled “TOUCH PAD FOR HANDHELD DEVICE,” now U.S. Pat. No. 7,046,230, which is incorporated by reference herein.

US Referenced Citations (586)
Number Name Date Kind
1061578 Wischhusen et al. May 1913 A
2063276 Thomas Dec 1936 A
2798907 Schneider Jul 1957 A
2903229 Lange Sep 1959 A
2945111 McCormick Jul 1960 A
3005055 Mattke Oct 1961 A
3965399 Walker, Jr. et al. Jun 1976 A
3996441 Ohashi Dec 1976 A
4029915 Ojima Jun 1977 A
4103252 Bobick Jul 1978 A
4110749 Janko et al. Aug 1978 A
4115670 Chandler Sep 1978 A
4121204 Welch et al. Oct 1978 A
4129747 Pepper, Jr. Dec 1978 A
4158216 Bigelow Jun 1979 A
4242676 Piguet et al. Dec 1980 A
4246452 Chandler Jan 1981 A
4264903 Bigelow Apr 1981 A
4266144 Bristol May 1981 A
4293734 Pepper, Jr. Oct 1981 A
D264969 McGoutry Jun 1982 S
4338502 Hashimoto et al. Jul 1982 A
4380007 Steinegger Apr 1983 A
4380040 Posset Apr 1983 A
4394649 Suchoff et al. Jul 1983 A
4475008 Doi et al. Oct 1984 A
4570149 Thornburg et al. Feb 1986 A
4583161 Gunderson et al. Apr 1986 A
4587378 Moore May 1986 A
4604786 Howie, Jr. Aug 1986 A
4613736 Shichijo et al. Sep 1986 A
4644100 Brenner et al. Feb 1987 A
4719524 Morishima et al. Jan 1988 A
4734034 Maness et al. Mar 1988 A
4736191 Matzke et al. Apr 1988 A
4739191 Puar Apr 1988 A
4739299 Eventoff et al. Apr 1988 A
4752655 Tajiri et al. Jun 1988 A
4755765 Ferland Jul 1988 A
4764717 Tucker et al. Aug 1988 A
4771139 DeSmet Sep 1988 A
4798919 Miessler et al. Jan 1989 A
4810992 Eventoff Mar 1989 A
4822957 Talmage, Jr. et al. Apr 1989 A
4831359 Newel May 1989 A
4849852 Mullins Jul 1989 A
4856993 Maness et al. Aug 1989 A
4860768 Hon et al. Aug 1989 A
4866602 Hall Sep 1989 A
4876524 Jenkins Oct 1989 A
4897511 Itaya et al. Jan 1990 A
4914624 Dunthorn Apr 1990 A
4917516 Retter Apr 1990 A
4934889 Kurosaki Jun 1990 A
4943889 Ohmatoi Jul 1990 A
4951036 Grueter et al. Aug 1990 A
4954823 Binstead Sep 1990 A
4976435 Shatford et al. Dec 1990 A
4990900 Kikuchi Feb 1991 A
5008497 Asher Apr 1991 A
5036321 Leach et al. Jul 1991 A
5053757 Meadows Oct 1991 A
5086870 Bolduc Feb 1992 A
5125077 Hall Jun 1992 A
5159159 Asher Oct 1992 A
5179648 Hauck Jan 1993 A
5186646 Pederson Feb 1993 A
5192082 Inoue et al. Mar 1993 A
5193669 Demeo et al. Mar 1993 A
5231326 Echols Jul 1993 A
5237311 Malley et al. Aug 1993 A
5278362 Ohashi Jan 1994 A
5305017 Gerpheide Apr 1994 A
5313027 Inoue et al. May 1994 A
D349280 Kaneko Aug 1994 S
5339213 O'Callaghan Aug 1994 A
5367199 Lefkowitz et al. Nov 1994 A
5374787 Miller et al. Dec 1994 A
5379057 Clough et al. Jan 1995 A
5404152 Nagai Apr 1995 A
5408621 Ben-Arie Apr 1995 A
5414445 Kaneko et al. May 1995 A
5416498 Grant May 1995 A
5424756 Ho et al. Jun 1995 A
5432531 Calder et al. Jul 1995 A
5438331 Gilligan et al. Aug 1995 A
D362431 Kaneko et al. Sep 1995 S
5450075 Waddington Sep 1995 A
5453761 Tanaka Sep 1995 A
5473343 Kimmich et al. Dec 1995 A
5473344 Bacon et al. Dec 1995 A
5479192 Carroll, Jr. et al. Dec 1995 A
5494157 Golenz et al. Feb 1996 A
5495566 Kwatinetz Feb 1996 A
5508703 Okamura et al. Apr 1996 A
5508717 Miller Apr 1996 A
5543588 Bisset et al. Aug 1996 A
5543591 Gillespie et al. Aug 1996 A
5555004 Ono et al. Sep 1996 A
5559301 Bryan, Jr. et al. Sep 1996 A
5559943 Cyr et al. Sep 1996 A
5561445 Miwa et al. Oct 1996 A
5564112 Hayes et al. Oct 1996 A
5565887 McCambridge et al. Oct 1996 A
5578817 Bidiville et al. Nov 1996 A
5581670 Bier et al. Dec 1996 A
5585823 Duchon et al. Dec 1996 A
5589856 Stein et al. Dec 1996 A
5589893 Gaughan et al. Dec 1996 A
5596347 Robertson et al. Jan 1997 A
5596697 Foster et al. Jan 1997 A
5598183 Robertson et al. Jan 1997 A
5611040 Brewer et al. Mar 1997 A
5611060 Belfiore et al. Mar 1997 A
5613137 Bertram et al. Mar 1997 A
5617114 Bier et al. Apr 1997 A
5627531 Posso et al. May 1997 A
5632679 Tremmel May 1997 A
5640258 Kurashima et al. Jun 1997 A
5648642 Miller et al. Jul 1997 A
D382550 Kaneko et al. Aug 1997 S
5657012 Tait Aug 1997 A
5661632 Register Aug 1997 A
D385542 Kaneko et al. Oct 1997 S
5675362 Clough et al. Oct 1997 A
5689285 Asher Nov 1997 A
5721849 Amro Feb 1998 A
5726687 Belfiore et al. Mar 1998 A
5729219 Armstrong et al. Mar 1998 A
5730165 Philipp Mar 1998 A
5748185 Stephan et al. May 1998 A
5751274 Davis May 1998 A
5754890 Holmdahl et al. May 1998 A
5764066 Novak et al. Jun 1998 A
5777605 Yoshinobu et al. Jul 1998 A
5786818 Brewer et al. Jul 1998 A
5790769 Buxton et al. Aug 1998 A
5798752 Buxton et al. Aug 1998 A
5805144 Scholder et al. Sep 1998 A
5808602 Sellers Sep 1998 A
5812239 Eger Sep 1998 A
5812498 Teres Sep 1998 A
5815141 Phares Sep 1998 A
5825351 Tam Oct 1998 A
5825353 Will Oct 1998 A
5828364 Siddiqui Oct 1998 A
5838304 Hall Nov 1998 A
5841078 Miller et al. Nov 1998 A
5841423 Carroll, Jr. et al. Nov 1998 A
D402281 Ledbetter et al. Dec 1998 S
5850213 Imai et al. Dec 1998 A
5852352 Suriano Dec 1998 A
5856645 Norton Jan 1999 A
5856822 Du et al. Jan 1999 A
5859629 Tognazzini Jan 1999 A
5861875 Gerpheide Jan 1999 A
5869791 Young Feb 1999 A
5875311 Bertram et al. Feb 1999 A
5883619 Ho et al. Mar 1999 A
5889236 Gillespie et al. Mar 1999 A
5889511 Ong et al. Mar 1999 A
5894117 Kamishima Apr 1999 A
5903229 Kishi May 1999 A
5907152 Dandiliker et al. May 1999 A
5907318 Medina May 1999 A
5909211 Combs et al. Jun 1999 A
5910802 Shields et al. Jun 1999 A
5914706 Kono Jun 1999 A
5923388 Kurashima et al. Jul 1999 A
D412940 Kato et al. Aug 1999 S
5933102 Miller et al. Aug 1999 A
5933141 Smith Aug 1999 A
5936619 Nagasaki et al. Aug 1999 A
5943044 Martinelli et al. Aug 1999 A
5953000 Weirich Sep 1999 A
5956019 Bang et al. Sep 1999 A
5959610 Silfvast Sep 1999 A
5959611 Smailagic et al. Sep 1999 A
5964661 Dodge Oct 1999 A
5973668 Watanabe Oct 1999 A
6000000 Hawkins et al. Dec 1999 A
6002093 Hrehor et al. Dec 1999 A
6002389 Kasser Dec 1999 A
6005299 Hengst Dec 1999 A
6025832 Sudo et al. Feb 2000 A
6031518 Adams et al. Feb 2000 A
6034672 Gaultier et al. Mar 2000 A
6057829 Silfvast May 2000 A
6075533 Chang Jun 2000 A
6084574 Bidiville Jul 2000 A
D430169 Scibora Aug 2000 S
6097372 Suzuki Aug 2000 A
6104790 Narayanaswami Aug 2000 A
6122526 Parulski et al. Sep 2000 A
6124587 Bidiville Sep 2000 A
6128006 Rosenberg et al. Oct 2000 A
6131048 Sudo et al. Oct 2000 A
6141068 Iijima Oct 2000 A
6147856 Karidis Nov 2000 A
6163312 Furuya Dec 2000 A
6166721 Kuroiwa et al. Dec 2000 A
6179496 Chou Jan 2001 B1
6181322 Nanavati Jan 2001 B1
D437860 Suzuki et al. Feb 2001 S
6188391 Seely et al. Feb 2001 B1
6188393 Shu Feb 2001 B1
6191774 Schena et al. Feb 2001 B1
6198054 Janniere Mar 2001 B1
6198473 Armstrong Mar 2001 B1
6211861 Rosenberg et al. Apr 2001 B1
6219038 Cho Apr 2001 B1
6222528 Gerpheide et al. Apr 2001 B1
D442592 Ledbetter et al. May 2001 S
6225976 Yates et al. May 2001 B1
6225980 Weiss et al. May 2001 B1
6226534 Aizawa May 2001 B1
6227966 Yokoi May 2001 B1
6229456 Engholm et al. May 2001 B1
D443616 Fisher et al. Jun 2001 S
6243078 Rosenberg Jun 2001 B1
6243080 Molne Jun 2001 B1
6243646 Ozaki et al. Jun 2001 B1
6248017 Roach Jun 2001 B1
6254477 Sasaki et al. Jul 2001 B1
6256011 Culver Jul 2001 B1
6259491 Ekedahl et al. Jul 2001 B1
6262717 Donohue et al. Jul 2001 B1
6262785 Kim Jul 2001 B1
6266050 Oh et al. Jul 2001 B1
6285211 Sample et al. Sep 2001 B1
D448810 Goto Oct 2001 S
6297795 Kato et al. Oct 2001 B1
6297811 Kent et al. Oct 2001 B1
6300939 Decker Oct 2001 B1
6300946 Lincke et al. Oct 2001 B1
6307539 Suzuki Oct 2001 B2
D450713 Masamitsu et al. Nov 2001 S
6314483 Goto et al. Nov 2001 B1
6321441 Davidson et al. Nov 2001 B1
6323845 Robbins Nov 2001 B1
6323846 Westerman et al. Nov 2001 B1
D452250 Chan Dec 2001 S
6340800 Zhai et al. Jan 2002 B1
6347290 Bartlett Feb 2002 B1
D454568 Andre et al. Mar 2002 S
6357887 Novak Mar 2002 B1
D455793 Lin Apr 2002 S
6373265 Morimoto et al. Apr 2002 B1
6373470 Andre et al. Apr 2002 B1
6377530 Burrows Apr 2002 B1
6396523 Segal et al. May 2002 B1
6404354 Decker Jun 2002 B1
6424338 Anderson Jul 2002 B1
6429846 Rosenberg et al. Aug 2002 B2
6429852 Adams et al. Aug 2002 B1
6452514 Philipp Sep 2002 B1
6461238 Rehkemper Oct 2002 B1
6465271 Ko et al. Oct 2002 B1
6473069 Gerpheide Oct 2002 B1
6492602 Asai et al. Dec 2002 B2
6492979 Kent et al. Dec 2002 B1
6496181 Bomer et al. Dec 2002 B1
6497412 Bramm Dec 2002 B1
D468365 Bransky et al. Jan 2003 S
D469109 Andre et al. Jan 2003 S
D472245 Andre et al. Mar 2003 S
6546231 Someya et al. Apr 2003 B1
6563487 Martin et al. May 2003 B2
6587091 Serpa Jul 2003 B2
6606244 Liu et al. Aug 2003 B1
6618909 Yang Sep 2003 B1
6636197 Goldenberg et al. Oct 2003 B1
6639584 Li Oct 2003 B1
6640250 Chang et al. Oct 2003 B1
6650975 Ruffner Nov 2003 B2
D483809 Lim Dec 2003 S
6658773 Rohne et al. Dec 2003 B2
6664951 Fujii et al. Dec 2003 B1
6677927 Bruck et al. Jan 2004 B1
6678891 Wilcox et al. Jan 2004 B1
6686904 Sherman et al. Feb 2004 B1
6686906 Salminen et al. Feb 2004 B2
6703550 Chu Mar 2004 B2
6704032 Falcon Mar 2004 B1
6724817 Simpson et al. Apr 2004 B1
6727889 Shaw Apr 2004 B2
D489731 Huang May 2004 S
6734883 Wynn et al. May 2004 B1
6738045 Hinckley et al. May 2004 B2
6750803 Yates et al. Jun 2004 B2
6781576 Tamura Aug 2004 B2
6784384 Park et al. Aug 2004 B2
6788288 Ano Sep 2004 B2
6791533 Su Sep 2004 B2
6795057 Gordon Sep 2004 B2
D497618 Andre et al. Oct 2004 S
6810271 Wood et al. Oct 2004 B1
6822640 Derocher Nov 2004 B2
6834975 Chu-Chia et al. Dec 2004 B2
6844872 Farag et al. Jan 2005 B1
6847351 Noguera Jan 2005 B2
6855899 Sotome Feb 2005 B2
6865718 Montalcini Mar 2005 B2
6886842 Vey et al. May 2005 B2
6894916 Reohr et al. May 2005 B2
D506476 Andre et al. Jun 2005 S
6922189 Fujiyoshi Jul 2005 B2
6930494 Tesdahl et al. Aug 2005 B2
6958614 Morimoto Oct 2005 B2
6977808 Lam et al. Dec 2005 B2
6978127 Bulthuis et al. Dec 2005 B1
6985137 Kaikuranta Jan 2006 B2
7006077 Uusimaki Feb 2006 B1
7019225 Matsumoto et al. Mar 2006 B2
7046230 Zadesky et al. May 2006 B2
7050292 Shimura et al. May 2006 B2
7058903 Jonach et al. Jun 2006 B1
7069044 Okada et al. Jun 2006 B2
7078633 Ihalainen Jul 2006 B2
7084856 Huppi Aug 2006 B2
7113196 Kerr Sep 2006 B2
7117136 Rosedale Oct 2006 B1
7119792 Andre et al. Oct 2006 B1
7215319 Kamijo et al. May 2007 B2
7233318 Farag et al. Jun 2007 B1
7236154 Kerr et al. Jun 2007 B1
7236159 Siversson Jun 2007 B1
7253643 Sequine Aug 2007 B1
7279647 Philipp Oct 2007 B2
7288732 Hashida Oct 2007 B2
7297883 Rochon et al. Nov 2007 B2
7310089 Baker et al. Dec 2007 B2
7312785 Tsuk Dec 2007 B2
7321103 Nakanishi et al. Jan 2008 B2
7325195 Arant Jan 2008 B1
7333092 Zadesky et al. Feb 2008 B2
7345671 Robbin et al. Mar 2008 B2
7348898 Ono Mar 2008 B2
7365737 Marvit et al. Apr 2008 B2
7382139 MacKey Jun 2008 B2
7394038 Chang Jul 2008 B2
7395081 Bonnelykke et al. Jul 2008 B2
7397467 Park et al. Jul 2008 B2
7439963 Geaghan et al. Oct 2008 B2
7466307 Trent, Jr. et al. Dec 2008 B2
7479949 Jobs et al. Jan 2009 B2
7486323 Lee et al. Feb 2009 B2
7495659 Marriott et al. Feb 2009 B2
7502016 Trent, Jr. et al. Mar 2009 B2
7503193 Schoene et al. Mar 2009 B2
7593782 Jobs et al. Sep 2009 B2
7645955 Huang et al. Jan 2010 B2
7671837 Forsblad et al. Mar 2010 B2
7689466 Benbrahim et al. Mar 2010 B1
7708051 Katsumi et al. May 2010 B2
7710393 Tsuk et al. May 2010 B2
7710394 Robbin et al. May 2010 B2
7710409 Robbin et al. May 2010 B2
7716582 Mueller May 2010 B2
7769794 Moore et al. Aug 2010 B2
7772507 Orr et al. Aug 2010 B2
7910843 Rothkopf et al. Mar 2011 B2
7932897 Elias et al. Apr 2011 B2
8125461 Weber et al. Feb 2012 B2
8188357 Robbin et al. May 2012 B2
8274479 Prest et al. Sep 2012 B2
8416198 Rathnam et al. Apr 2013 B2
8482530 Bollinger Jul 2013 B2
8683378 Bull et al. Mar 2014 B2
8866780 Rathnam et al. Oct 2014 B2
8933890 Lampell et al. Jan 2015 B2
20010000537 Inala et al. Apr 2001 A1
20010011991 Wang et al. Aug 2001 A1
20010011993 Saarinen Aug 2001 A1
20010033270 Osawa et al. Oct 2001 A1
20010043545 Aratani Nov 2001 A1
20010050673 Davenport Dec 2001 A1
20010051046 Watanabe et al. Dec 2001 A1
20020000978 Gerpheide Jan 2002 A1
20020011993 Lui et al. Jan 2002 A1
20020027547 Kamijo Mar 2002 A1
20020030665 Ano Mar 2002 A1
20020033848 Sciammarella et al. Mar 2002 A1
20020039493 Tanaka Apr 2002 A1
20020045960 Phillips et al. Apr 2002 A1
20020059584 Ferman et al. May 2002 A1
20020071550 Pletikosa Jun 2002 A1
20020089545 Montalcini Jul 2002 A1
20020103796 Hartley Aug 2002 A1
20020118131 Yates et al. Aug 2002 A1
20020118169 Hinckley et al. Aug 2002 A1
20020145594 Derocher Oct 2002 A1
20020154090 Lin Oct 2002 A1
20020158844 McLoone et al. Oct 2002 A1
20020164156 Bilbrey Nov 2002 A1
20020168947 Lemley Nov 2002 A1
20020180701 Hayama et al. Dec 2002 A1
20020196239 Lee Dec 2002 A1
20030002246 Kerr Jan 2003 A1
20030025679 Taylor et al. Feb 2003 A1
20030028346 Sinclair et al. Feb 2003 A1
20030043121 Chen Mar 2003 A1
20030043174 Hinckley et al. Mar 2003 A1
20030048250 Boon Mar 2003 A1
20030050092 Yun Mar 2003 A1
20030068053 Chu Apr 2003 A1
20030076301 Tsuk et al. Apr 2003 A1
20030076303 Huppi Apr 2003 A1
20030076306 Zadesky et al. Apr 2003 A1
20030091377 Hsu et al. May 2003 A1
20030095095 Pihlaja May 2003 A1
20030095096 Robbin et al. May 2003 A1
20030098851 Brink May 2003 A1
20030103043 Mulligan et al. Jun 2003 A1
20030122792 Yamamoto et al. Jul 2003 A1
20030135292 Husgafvel et al. Jul 2003 A1
20030142081 Lizuka et al. Jul 2003 A1
20030184517 Senzui et al. Oct 2003 A1
20030197740 Reponen Oct 2003 A1
20030206202 Moriya Nov 2003 A1
20030210537 Engelmann Nov 2003 A1
20030224831 Engstrom et al. Dec 2003 A1
20040027341 Derocher Feb 2004 A1
20040074756 Kawakami et al. Apr 2004 A1
20040080682 Dalton Apr 2004 A1
20040109028 Stern et al. Jun 2004 A1
20040109357 Cernea et al. Jun 2004 A1
20040145613 Stavely et al. Jul 2004 A1
20040150619 Baudisch et al. Aug 2004 A1
20040156192 Kerr et al. Aug 2004 A1
20040166912 Stienstra Aug 2004 A1
20040178997 Gillespie et al. Sep 2004 A1
20040200699 Matsumoto et al. Oct 2004 A1
20040215986 Shakkarwar Oct 2004 A1
20040224638 Fadell et al. Nov 2004 A1
20040239622 Proctor et al. Dec 2004 A1
20040252109 Trent, Jr. et al. Dec 2004 A1
20040252867 Lan et al. Dec 2004 A1
20040253989 Tupler et al. Dec 2004 A1
20040263388 Krumm et al. Dec 2004 A1
20040267874 Westberg et al. Dec 2004 A1
20050012644 Hurst et al. Jan 2005 A1
20050017957 Yi Jan 2005 A1
20050024341 Gillespie et al. Feb 2005 A1
20050030048 Bolender Feb 2005 A1
20050052425 Zadesky et al. Mar 2005 A1
20050052426 Hagermoser et al. Mar 2005 A1
20050052429 Philipp Mar 2005 A1
20050068304 Lewis et al. Mar 2005 A1
20050083299 Nagasaka Apr 2005 A1
20050083307 Aufderheide Apr 2005 A1
20050090288 Stohr et al. Apr 2005 A1
20050104867 Westerman et al. May 2005 A1
20050110768 Marriott et al. May 2005 A1
20050125147 Mueller Jun 2005 A1
20050129199 Abe Jun 2005 A1
20050139460 Hosaka Jun 2005 A1
20050140657 Park et al. Jun 2005 A1
20050143124 Kennedy et al. Jun 2005 A1
20050156881 Trent, Jr. et al. Jul 2005 A1
20050162402 Watanabe et al. Jul 2005 A1
20050204309 Szeto Sep 2005 A1
20050212760 Marvit et al. Sep 2005 A1
20050237308 Autio et al. Oct 2005 A1
20060017692 Wehrenberg et al. Jan 2006 A1
20060026521 Hotelling et al. Feb 2006 A1
20060032680 Elias et al. Feb 2006 A1
20060038791 MacKey Feb 2006 A1
20060095848 Naik May 2006 A1
20060097991 Hotelling et al. May 2006 A1
20060131156 Voelckers Jun 2006 A1
20060143574 Ito et al. Jun 2006 A1
20060174568 Kinoshita et al. Aug 2006 A1
20060181517 Zadesky et al. Aug 2006 A1
20060197750 Kerr et al. Sep 2006 A1
20060197753 Hotelling Sep 2006 A1
20060232557 Fallot-Burghardt Oct 2006 A1
20060236262 Bathiche et al. Oct 2006 A1
20060250377 Zadesky et al. Nov 2006 A1
20060274042 Krah et al. Dec 2006 A1
20060274905 Lindahl et al. Dec 2006 A1
20060279896 Bruwer Dec 2006 A1
20060284836 Philipp Dec 2006 A1
20070013671 Zadesky et al. Jan 2007 A1
20070018970 Tabasso et al. Jan 2007 A1
20070044036 Ishimura et al. Feb 2007 A1
20070052044 Forsblad Mar 2007 A1
20070052691 Zadesky et al. Mar 2007 A1
20070080936 Tsuk et al. Apr 2007 A1
20070080938 Robbin et al. Apr 2007 A1
20070080952 Lynch et al. Apr 2007 A1
20070083822 Robbin et al. Apr 2007 A1
20070085841 Tsuk et al. Apr 2007 A1
20070097086 Battles et al. May 2007 A1
20070120834 Boillot May 2007 A1
20070125852 Rosenberg Jun 2007 A1
20070126696 Boillot Jun 2007 A1
20070152975 Ogihara Jul 2007 A1
20070152977 Ng et al. Jul 2007 A1
20070152983 McKillop et al. Jul 2007 A1
20070155434 Jobs et al. Jul 2007 A1
20070157089 Van Os et al. Jul 2007 A1
20070180409 Sohn et al. Aug 2007 A1
20070242057 Zadesky et al. Oct 2007 A1
20070247421 Orsley et al. Oct 2007 A1
20070247443 Philipp Oct 2007 A1
20070271516 Carmichael Nov 2007 A1
20070273671 Zadesky et al. Nov 2007 A1
20070276525 Zadesky et al. Nov 2007 A1
20070279394 Lampell Dec 2007 A1
20070285404 Rimon et al. Dec 2007 A1
20070290990 Robbin et al. Dec 2007 A1
20070291016 Phillips Dec 2007 A1
20070296709 GuangHai Dec 2007 A1
20080001770 Ito et al. Jan 2008 A1
20080006453 Hotelling et al. Jan 2008 A1
20080006454 Hotelling Jan 2008 A1
20080007533 Hotelling et al. Jan 2008 A1
20080007539 Hotelling et al. Jan 2008 A1
20080012837 Marriott et al. Jan 2008 A1
20080018615 Zadesky et al. Jan 2008 A1
20080018616 Lampell Jan 2008 A1
20080018617 Ng et al. Jan 2008 A1
20080036473 Jansson Feb 2008 A1
20080036734 Forsblad et al. Feb 2008 A1
20080060925 Weber et al. Mar 2008 A1
20080062141 Chandhri Mar 2008 A1
20080066016 Dowdy et al. Mar 2008 A1
20080069412 Champagne et al. Mar 2008 A1
20080071810 Casto et al. Mar 2008 A1
20080079699 MacKey Apr 2008 A1
20080087476 Prest Apr 2008 A1
20080088582 Prest Apr 2008 A1
20080088596 Prest Apr 2008 A1
20080088597 Prest Apr 2008 A1
20080088600 Prest Apr 2008 A1
20080094352 Tsuk et al. Apr 2008 A1
20080098330 Tsuk et al. Apr 2008 A1
20080110739 Peng et al. May 2008 A1
20080111795 Bollinger May 2008 A1
20080143681 XiaoPing Jun 2008 A1
20080165144 Forstall et al. Jul 2008 A1
20080165158 Hotelling et al. Jul 2008 A1
20080166968 Tang et al. Jul 2008 A1
20080196945 Konstas Aug 2008 A1
20080201751 Ahmed et al. Aug 2008 A1
20080202824 Philipp et al. Aug 2008 A1
20080209442 Setlur et al. Aug 2008 A1
20080264767 Chen et al. Oct 2008 A1
20080280651 Duarte Nov 2008 A1
20080284742 Prest Nov 2008 A1
20080293274 Milan Nov 2008 A1
20080300055 Lutnick et al. Dec 2008 A1
20090021267 Golovchenko et al. Jan 2009 A1
20090026558 Bauer et al. Jan 2009 A1
20090033635 Wai Feb 2009 A1
20090036176 Ure Feb 2009 A1
20090058687 Rothkopf et al. Mar 2009 A1
20090058801 Bull Mar 2009 A1
20090058802 Orsley et al. Mar 2009 A1
20090064031 Bull et al. Mar 2009 A1
20090073130 Weber et al. Mar 2009 A1
20090078551 Kang Mar 2009 A1
20090109181 Hui et al. Apr 2009 A1
20090141046 Rathnam et al. Jun 2009 A1
20090160771 Hinckley et al. Jun 2009 A1
20090165708 Fadell et al. Jul 2009 A1
20090166098 Sunder Jul 2009 A1
20090167542 Culbert et al. Jul 2009 A1
20090167704 Terlizzi et al. Jul 2009 A1
20090170532 Lee et al. Jul 2009 A1
20090179854 Weber et al. Jul 2009 A1
20090197059 Weber et al. Aug 2009 A1
20090229892 Fisher et al. Sep 2009 A1
20090273573 Hotelling Nov 2009 A1
20090303204 Nasiri et al. Dec 2009 A1
20090307633 Haughay et al. Dec 2009 A1
20100045705 Vertegaal et al. Feb 2010 A1
20100058251 Rottler et al. Mar 2010 A1
20100060568 Fisher et al. Mar 2010 A1
20100073319 Lyon et al. Mar 2010 A1
20100149127 Fisher et al. Jun 2010 A1
20100214216 Nasiri et al. Aug 2010 A1
20100289759 Fisher et al. Nov 2010 A1
20100313409 Weber et al. Dec 2010 A1
20110005845 Hotelling et al. Jan 2011 A1
Foreign Referenced Citations (208)
Number Date Country
1139235 Jan 1997 CN
1455615 Nov 2003 CN
1499356 May 2004 CN
1659506 Aug 2005 CN
3615742 Nov 1987 DE
19722636 Dec 1998 DE
10022537 Nov 2000 DE
20019074 Jan 2001 DE
102004043663 Apr 2006 DE
0 178 157 Apr 1986 EP
0 419 145 Mar 1991 EP
0 498 540 Dec 1992 EP
0 521 683 Jan 1993 EP
0 551 778 Jul 1993 EP
0 674 288 Sep 1995 EP
0 731 407 Sep 1996 EP
0 880 091 Nov 1998 EP
1 026 713 Aug 2000 EP
1 081 922 Mar 2001 EP
1 098 241 May 2001 EP
1 133 057 Sep 2001 EP
1 162 826 Dec 2001 EP
1 168 396 Jan 2002 EP
1 205 836 May 2002 EP
1 244 053 Sep 2002 EP
1 251 455 Oct 2002 EP
1 263 193 Dec 2002 EP
1 347 481 Sep 2003 EP
1 376 326 Jan 2004 EP
1 467 392 Oct 2004 EP
1 482 401 Dec 2004 EP
1 496 467 Jan 2005 EP
1 517 228 Mar 2005 EP
1 542 437 Jun 2005 EP
1 589 407 Oct 2005 EP
1 784 058 May 2007 EP
1 841 188 Oct 2007 EP
1 850 218 Oct 2007 EP
1 876 058 Jan 2008 EP
1 876 711 Jan 2008 EP
2686440 Jul 1993 FR
2 015 167 Sep 1979 GB
2 072 389 Sep 1981 GB
2 315 186 Jan 1998 GB
2 333 215 Jul 1999 GB
2 391 060 Jan 2004 GB
2 402 105 Dec 2004 GB
57-95722 Jun 1982 JP
57-97626 Jun 1982 JP
61-117619 Jun 1986 JP
61-124009 Jun 1986 JP
63-020411 Jan 1988 JP
63-106826 May 1988 JP
63-181022 Jul 1988 JP
63-298518 Dec 1988 JP
3-57617 Jun 1991 JP
03-192418 Aug 1991 JP
04-032920 Feb 1992 JP
H04205408 Jul 1992 JP
05-041135 Feb 1993 JP
05-080938 Apr 1993 JP
05-101741 Apr 1993 JP
5-36623 May 1993 JP
05-189110 Jul 1993 JP
05-205565 Aug 1993 JP
05-211021 Aug 1993 JP
05-217464 Aug 1993 JP
05-233141 Sep 1993 JP
05-262276 Oct 1993 JP
05-265656 Oct 1993 JP
05-274956 Oct 1993 JP
05-289811 Nov 1993 JP
05-298955 Nov 1993 JP
05-325723 Dec 1993 JP
06-020570 Jan 1994 JP
06-028433 Feb 1994 JP
06-084428 Mar 1994 JP
06-089636 Mar 1994 JP
06-096639 Apr 1994 JP
06-111685 Apr 1994 JP
6-111695 Apr 1994 JP
06-139879 May 1994 JP
06-187078 Jul 1994 JP
06-267382 Sep 1994 JP
06-283993 Oct 1994 JP
06-333459 Dec 1994 JP
07-107574 Apr 1995 JP
7-41882 Jul 1995 JP
7-201249 Aug 1995 JP
07-201256 Aug 1995 JP
07-253838 Oct 1995 JP
7-261899 Oct 1995 JP
07-261922 Oct 1995 JP
7-296670 Nov 1995 JP
07-319001 Dec 1995 JP
08-016292 Jan 1996 JP
08-115158 May 1996 JP
08-203387 Aug 1996 JP
08-293226 Nov 1996 JP
08-298045 Nov 1996 JP
08-299541 Nov 1996 JP
08-316664 Nov 1996 JP
09-044289 Feb 1997 JP
09-069023 Mar 1997 JP
09-128148 May 1997 JP
9-134248 May 1997 JP
09-218747 Aug 1997 JP
09-230993 Sep 1997 JP
9-230993 Sep 1997 JP
09-231858 Sep 1997 JP
09-233161 Sep 1997 JP
9-251347 Sep 1997 JP
09-258895 Oct 1997 JP
09-288926 Nov 1997 JP
9-512979 Dec 1997 JP
10-063467 Mar 1998 JP
10-074127 Mar 1998 JP
10-074429 Mar 1998 JP
10-198507 Jul 1998 JP
10-227878 Aug 1998 JP
10-240693 Sep 1998 JP
10-320322 Dec 1998 JP
10-326149 Dec 1998 JP
11-024834 Jan 1999 JP
H1124835 Jan 1999 JP
11-068685 Mar 1999 JP
11-184607 Jul 1999 JP
11-194863 Jul 1999 JP
11-194872 Jul 1999 JP
11-194882 Jul 1999 JP
11-194883 Jul 1999 JP
11-194891 Jul 1999 JP
11-195353 Jul 1999 JP
11-203045 Jul 1999 JP
11-212725 Aug 1999 JP
11-272378 Oct 1999 JP
11-338628 Dec 1999 JP
2000-200147 Jul 2000 JP
2000-215549 Aug 2000 JP
2000-267777 Sep 2000 JP
2000-267786 Sep 2000 JP
2000-267797 Sep 2000 JP
2000-353045 Dec 2000 JP
2001-011769 Jan 2001 JP
2001-022508 Jan 2001 JP
2001-160850 Jun 2001 JP
2001-184158 Jul 2001 JP
3085481 Feb 2002 JP
2002-215311 Aug 2002 JP
2003015796 Jan 2003 JP
2003-060754 Feb 2003 JP
2003099198 Apr 2003 JP
2003-150303 May 2003 JP
2003-517674 May 2003 JP
2003-280799 Oct 2003 JP
2003-280807 Oct 2003 JP
2004-362097 Dec 2004 JP
2005-251218 Sep 2005 JP
2005-285140 Oct 2005 JP
2005-293606 Oct 2005 JP
2006-004453 Jan 2006 JP
2006-178962 Jul 2006 JP
2007-123473 May 2007 JP
1998-71394 Oct 1998 KR
1999-50198 Jul 1999 KR
2000-8579 Feb 2000 KR
2001-0052016 Jun 2001 KR
2001-0108361 Dec 2001 KR
2002-0065059 Aug 2002 KR
2006-0021678 Mar 2006 KR
431607 Apr 2001 TW
470193 Dec 2001 TW
547716 Aug 2003 TW
I220491 Aug 2004 TW
WO 9417494 Aug 1994 WO
WO 9500897 Jan 1995 WO
WO 9627968 Sep 1996 WO
WO 9814863 Apr 1998 WO
WO 9949443 Sep 1999 WO
WO 0102949 Jan 2001 WO
WO 0144912 Jun 2001 WO
WO 0208881 Jan 2002 WO
WO 03036457 May 2003 WO
WO 03044645 May 2003 WO
WO 03044956 May 2003 WO
WO 03025960 Sep 2003 WO
WO 03088176 Oct 2003 WO
WO 03090008 Oct 2003 WO
WO 2004001573 Dec 2003 WO
WO 2004040606 May 2004 WO
WO 2004091956 Oct 2004 WO
WO 2005055620 Jun 2005 WO
WO 2005076117 Aug 2005 WO
WO 2005114369 Dec 2005 WO
WO 2005124526 Dec 2005 WO
WO 2006020305 Feb 2006 WO
WO 2006021211 Mar 2006 WO
WO 2006037545 Apr 2006 WO
WO 2006104745 Oct 2006 WO
WO 2006135127 Dec 2006 WO
WO 2007025858 Mar 2007 WO
WO 2007078477 Jul 2007 WO
WO 2007084467 Jul 2007 WO
WO 2007089766 Aug 2007 WO
WO 2008007372 Jan 2008 WO
WO 2008045414 Apr 2008 WO
WO 2008045833 Apr 2008 WO
WO 0079772 Dec 2009 WO
Non-Patent Literature Citations (236)
Entry
3DConnexion, “Product Overview—ErgoCommander”, www.logicad3d.com/products/ergocommander.htm, downloaded Apr. 8, 2002, 2 pages.
3D Connexion, “Product Overview—SpaceMouse Classic”, www.logicad3d.com/products/classics.htm, downloaded Apr. 8, 2002, 2 pages.
3DConnexion, “About Quicktip®,” www.logicad3d.com/docs/qt.html, Apr. 8, 2002, 2 pages.
Ahl, “Controller Update”, Creative Computing, vol. 9, No. 12, Dec. 1983, 6 pages.
Ahmad, “A Usable Real-Time 3D Hand Tracker,” Proceedings of the 28th Asilomar Conference on Signals, Systems and Computers, Oct. 1994, 5 pages.
Alps Electric (USA) Inc., Alps Electric Introduces the GlidePoint Wave Keyboard; combines a gentily curved design with Alps, advanced GlidePoint technology, Business Wire, Oct. 21, 1996, 3 pages.
Alps Electric (USA) Inc., Alps Electric Ships GlidePoint Keyboard for the Macintosh; Includes a GlidePoint Touchpad, Erase-Eaze Backspace Key and Contoured Wrist Rest, Business Wire, Jul. 1, 1996, 2 pages.
Apple Store, “Apple Presents iPod Ultra-Portable MP3 Music Player Puts 1,000 Songs in Your Pocket”, http://www.apple.com/pr/library/2001/10/23Apple-Presents-iPod.html, Oct. 23, 2001, 3 pages.
“APS Show Guide to Exhibitors”, Physics Today, vol. 49, No. 3, Cervantes Convention Center, St. Louis, Missouri, Mar. 1996, 13 pages.
Apple Computer, Inc., “Apple Unveils Optical Mouse and New Pro Keyboard,” http://www.apple.com/ca/press/2000/07/mouse.html, Press Release, Jul. 19, 2000, 2 pages.
Atari, “Atari VCS/2600 Peripherals”, http://www.classicgaming.com/gamingmuseum/2600p.html, Feb. 28, 2007, 15 pages.
Baig, “Your PC Just Might Need a Mouse,” U.S. News & World Report, vol. 108, Issue 22, Jun. 4, 1990, 4 pages.
Bang & Olufsen Telecom a/s, “BeoCom 6000 User Guide 2000,” Denmark, Sep. 17, 2009, 54 pages.
Bartimo, “The Portables: Traveling Quickly”, Computerworld, Nov. 14, 1983, 6 pages.
Beijing Acer Information Co. Ltd., “Touchpad,” Notebook PC Manual, Feb. 16, 2005, 3 pages.
BeoCom 6000, Sales Training Brochure, date unknown, 5 pages.
Bray, “Phosphors Help Switch on Xenon,” PhysicsWorld/Physicsweb, Apr. 1999, 3 pages.
Brink et al., “Red-Flagging Antihistamines; Pumped-Up Portables, Debts Silver Lining; Easing Aching Hearts”, U.S. News & World Report, May 30, 1994, 2 pages.
Boling, “Programming Microsoft Windows ce.net, Third Edition,” May 28, 2003, 3 pages.
Buxton et al., “Issues and Techniques in Touch-Sensitive Tablet Input”, Computer Systems Research Institute, University of Toronto, Toronto, Ontario, Canada, Jul. 1, 1985, 10 pages.
Chapweske, “PS/2 Mouse/Keyboard Protocol”, http://www.Computer-Engineering.org, May 9, 2003, 7 pages.
Chen et al., “A Study in Interactive 3-D Rotation Using 2-D Control Devices,” Department of Electrical Engineering/Dynamic Graphics Project, University of Toronto, Aug. 1, 1988, 9 pages.
DevSys—Heatseekerz.net, “Crystal Optical Mouse,” http//www.heatseekers.net/index/php?page=articles&id=19&pagenum=3, Feb. 14, 2002, 2 pages.
Design News, A Cahners Publication, National Design Engineering Show Conference, McCormick Place, Chicago, Mar. 18-21, 1996, 87 pages.
Design News, A Cahners Publication, “Literature Plus”, Dec. 18, 1995, 30 pages.
Design News, A Cahners Publication, “Product News”, www.designnews.com, May 5, 1997, 56 pages.
Design News, A Cahners Publication, “Product News”, www.designnews.com, Jun. 9, 1997, 34 pages.
Digital Innovations, LLC “Neuros MP3 Digital Audio Computer”, www.neurosaudio.com, download Apr. 9, 2003, 6 pages.
EBV Electronic, “TSOP6238 IR Receiver Modules for Infrared Remote Control Systems,” Jan. 2004, 1 page.
Evans et al., “Tablet-Based Valuators that Provide One, Two, or Three Degrees of Freedom”, National Research Council of Canada, Ottawa, Ontario, Aug. 3, 1981, 7 pages.
Fiore, Andrew, “Zen Touchpad”, Cornell University, May 2000, 6 pages.
Gadgetboy, “Point and Click With the Latest Mice”, CNETAsia Product Reviews, www.asia.cnet/com/reviews, Oct. 10, 2001,1 page.
Gfroerer, “Photoluminescence in Analysis of Surfaces and Interfaces,” Encyclopedia of Analytical Chemistry, John Wiley & Sons Ltd, Chichester, 2000, Published online Sep. 15, 2006, 24 pages.
Gillespie, “Synaptics Touch Pad Interfacing Guide” Second Edition, Mar. 25, 1998, Synaptics, Inc., San Jose, California, 91 pages.
Harmony Central, “Diamond Multimedia Announces Rio PMP300 Portable MP3 Music Player,” Press Release, Sep. 14, 1998, 4 pages.
Information Access Company, a Thomson Corporation Company, “Triax Custom Controllers Due; Video Game Controllers,” vol. 67, No. 1, p. 122, HFD—The Weekly Home Furnishings Newspaper, Jan. 4, 1993, 2 pages.
Interlink Electronics, VersaPad: Integration Guide, www.interlinkelectronics.com, Feb. 27, 2001, 35 pages.
Jesitus, “Broken Promises? FoxMeyer's Project,” Penton/IPC Industry Week, Nov. 3, 1997, 6 pages.
Kobayashi et al., “Design of Dynamic Soundscape: Mapping Time to Space for Audio Browsing with Simultaneous Listening,” Massachusetts Institute of Technology, Sep. 1996, 60 pages.
Kobayashi et al., “Development of the Touch Switches with the Click Response,” Japan Aviation Electronics Industry, Ltd.; Translation of Summary, Mar. 1994, 5 pages.
Kobayashi et al., “Dynamic Soundscape: Mapping Time to Space for Audio Browsing,” Speech Inteface Group, MIT Media Library, Cambridge, MA, Mar. 22-27, 1997, 8 pages.
Luna Technologies Inernational, Inc., “Photoluminescent Safety Products,” http://www.lunaplast.com/photoluminescence.com, Dec. 27, 2005, 1 page.
Marriott et al., “Touch Pad for Handheld Device,” U.S. Appl. No. 10/722,948, filed Nov. 25, 2003, 49 pages.
Mims, “A Few Quick Pointers; Mouses, Touch Screens, Touch Pads, Light Pads, and the Like Can Make Your System Easier to Use,” Computers & Electronics, vol. 22, p. 64, May 1984, 6 pages.
Nass, “Touchpad Input Device Goes Digital to Give Portable Systems a Desktop ‘Mouse-Like’ Feel”, Penton Publishing Inc., Electronic Design, vol. 44, No. 18, Sep. 3, 1996, 2 pages.
Nixon Peabody LLP, Attorneys at Law, Letter sent to Mr. Richard Liu, re: Bang & Olufsen A/S—Apple U.S. Patent Application No. 2003/0095096, May 21, 2004, 2 pages.
Cirque Corporation, OEM Touchpad Modules, http://www.glidepoint.com/sales/moduels/index.shtml, Feb. 13, 2002, 5 pages.
Office Action, dated Mar. 4, 2004, received in U.S. Appl. No. 10/188,182, 8 pages.
Office Action, dated Jul. 30, 2004, received in U.S. Appl. No. 10/188,182, 8 pages.
Office Action, dated Sep. 30, 2004, received in U.S. Appl. No. 10/259,159, 14 pages.
Office Action, dated Jun. 16, 2005, received in U.S. Appl. No. 10/259,159, 16 pages.
Office Action, dated Sep. 21, 2005, received in U.S. Appl. No. 10/188,182, 18 pages.
Office Action, dated Jan. 11, 2006, received in U.S. Appl. No. 10/259,159, 15 pages.
Office Action, dated Jun. 2, 2006, received in U.S. Appl. No. 10/722,948, 12 pages.
Office Action, dated Aug. 3, 2006, received in U.S. Appl. No. 10/259,159, 15 pages.
Office Action, dated Oct. 13, 2006, received in U.S. Appl. No. 10/259,159, 18 pages.
Office Action, dated Oct. 27, 2006, received in U.S. Appl. No. 10/643,256, 14 pages.
Office Action, dated Dec. 12, 2006, received in U.S. Appl. No. 10/722,948, 14 pages.
Office Action, dated Dec. 29, 2006, received in Chinese Patent Application No. 200510103886.3, 3 pages.
Office Action, dated Jan. 18, 2007, received in U.S. Appl. No. 10/259,159, 18 pages.
Office Action, dated Mar. 23, 2007, received in U.S. Appl. No. 10/643,256, 11 pages.
Office Action, dated Jul. 13, 2007, received in U.S. Appl. No. 10/643,256, 13 pages.
Office Action, dated Jul. 13, 2007, received in U.S. Appl. No. 10/722,948, 15 pages.
Office Action, dated Oct. 4, 2007, received in U.S. Appl. No. 11/386,238, 12 pages.
Office Action, dated Oct. 4, 2007, received in U.S. Appl. No. 11/806,957, 14 pages.
Office Action, dated Nov. 20, 2007, received in U.S. Appl. No. 11/057,050, 33 pages.
Office Action, dated Dec. 12, 2007, received in U.S. Appl. No. 10/643,256, 11 pages.
Office Action, dated Jan. 30, 2008, received in U.S. Appl. No. 10/722,948, 17 pages.
Office Action, dated Jul. 9, 2008, received in U.S. Appl. No. 10/643,256, 12 pages.
Office Action, dated Aug. 19, 2008, received in U.S. Appl. No. 11/057,050, 23 pages.
Office Action, dated Sep. 17, 2008, received in U.S. Appl. No. 11/203,692, 8 pages.
Office Action, dated Nov. 26, 2008, received in U.S. Appl. No. 11/057,050, 25 pages.
Office Action, dated Dec. 24, 2008, received in U.S. Appl. No. 11/057,050, 25 pages.
Office Action, dated Jan. 26, 2009, received in U.S. Appl. No. 11/355,022, 15 pages.
Office Action, dated Jan. 27, 2009, received in U.S. Appl. No. 11/882,421, 15 pages.
Office Action, dated Feb. 20, 2009, received in U.S. Appl. No. 11/057,050, 25 pages.
Office Action, dated Feb. 23, 2009, received in U.S. Appl. No. 11/203,692, 13 pages.
Office Action, dated Mar. 5, 2009, received in U.S. Appl. No. 11/477,469, 12 pages.
Office Action, dated Jun. 25, 2009, received in U.S. Appl. No. 11/355,022, 18 pages.
Office Action, dated Jul. 24, 2009, received in U.S. Appl. No. 11/483,008, 17 pages.
Office Action, dated Jul. 27, 2009, received in U.S. Appl. No. 11/882,420, 17 pages.
Office Action, dated Aug. 4, 2009, received in U.S. Appl. No. 11/203,692, 12 pages.
Office Action, dated Aug. 6, 2009, received in U.S. Appl. No. 11/057,050, 30 pages.
Office Action, dated Aug. 10, 2009, received in U.S. Appl. No. 11/610,376, 11 pages.
Office Action, dated Aug. 12, 2009, received in U.S. Appl. No. 11/610,384, 20 pages.
Office Action, dated Sep. 1, 2009, received in U.S. Appl. No. 11/482,286, 14 pages.
Office Action, dated Sep. 15, 2009, received in U.S. Appl. No. 11/530,807, 15 pages.
Office Action, dated Oct. 5, 2009, received in U.S. Appl. No. 11/499,360, 7 pages.
Office Action, dated Jan. 14, 2010, received in U.S. Appl. No. 11/394,493, 20 pages.
Office Action, dated Jan. 15, 2010, received in U.S. Appl. No. 11/882,423, 22 pages.
Office Action, dated Jan. 25, 2010, received in U.S. Appl. No. 11/482,286, 17 pages.
Office Action, dated Jan. 27, 2010, received in U.S. Appl. No. 11/499,360, 8 pages.
Office Action, dated Feb. 4, 2010, received in U.S. Appl. No. 11/477,469, 14 pages.
Office Action, dated Mar. 30, 2010, received in U.S. Appl. No. 11/592,679, 13 pages.
Office Action, dated Mar. 30, 2010, received in U.S. Appl. No. 11/483,008, 20 pages.
Office Action, dated Mar. 30, 2010, received in U.S. Appl. No. 11/203,692, 15 pages.
Office Action dated Jun. 4, 2010, received in U.S. Appl. No. 11/530,807, 15 pages.
Office Action dated Jun. 7, 2010, received in U.S. Appl. No. 11/856,530, 15 pages.
Office Action dated Jun. 9, 2010, received in U.S. Appl. No. 11/482,286, 21 pages.
Office Action dated Jun. 11, 2010, received in U.S. Appl. No. 11/203,692, 17 pages.
Office Action dated Jun. 22, 2010, received in U.S. Appl. No. 11/394,493, 14 pages.
Office Action dated Jun. 22, 2010, received in U.S. Appl. No. 11/878,132, 32 pages.
Office Action dated Jun. 22, 2010, received in U.S. Appl. No. 11/882,882, 33 pages.
Office Action dated Jun. 22, 2010, received in U.S. Appl. No. 11/882,890, 15 pages.
Office Action dated Jun. 22, 2010, received in U.S. Appl. No. 11/812,383, 21 pages.
Office Action dated Jun. 23, 2010, received in U.S. Appl. No. 11/812,384, 29 pages.
Office Action dated Jun. 23, 2010, received in U.S. Appl. No. 11/882,889, 13 pages.
Office Action dated Jun. 25, 2010, received in U.S. Appl. No. 11/842,724, 22 pages.
Office Action, dated Jul. 8, 2010, received in U.S. Appl. No. 11/882,423, 19 pages.
Office Action, dated Jul. 9, 2010, received in U.S. Appl. No. 11/849, 801, 13 pages.
Office Action, dated Aug. 2, 2010, received in U.S. Appl. No. 11/882,004, 9 pages.
Office Action, dated Aug. 18, 2010, received in U.S. Appl. No. 11/882,424, 16 pages.
Office Action, dated Aug. 19, 2010, received in U.S. Appl. No. 11/882,422, 13 pages.
Office Action dated Sep. 16, 2010, received in U.S. Appl. No. 11/591,752, 14 pages.
Office Action dated Sep. 29, 2010, received in U.S. Appl. No. 11/882,003, 13 pages.
Office Action dated Oct. 1, 2010, received in U.S. Appl. No. 11/482,286, 28 pages.
Office Action dated Oct. 4, 2010, received in U.S. Appl. No. 11/057,050, 31 pages.
Office Action dated Oct. 13, 2010, received in U.S. Appl. No. 12/205,795, 23 pages.
Office Action dated Oct. 26, 2010, received in U.S. Appl. No. 11/882,423, 18 pages.
Office Action dated Oct. 27, 2010, received in U.S. Appl. No. 11/483,008, 23 pages.
Office Action dated Oct. 29, 2010, received in U.S. Appl. No. 11/838,845, 8 pages.
Office Action dated Nov. 16, 2010, received in U.S. Appl. No. 11/477,469, 13 pages.
Office Action dated Nov. 22, 2010, received in U.S. Appl. No. 11/203,692, 6 pages.
Office Action dated Dec. 3, 2010, received in U.S. Appl. No. 11/530,807, 17 pages.
Office Action dated Dec. 8, 2010, received in U.S. Appl. No. 11/482,286, 33 pages.
Office Action dated Dec. 9, 2010, received in U.S. Appl. No. 11/394,493, 13 pages.
Office Action dated Dec. 22, 2010, received in U.S. Appl. No. 11/882,427, 16 pages.
Office Action dated Jan. 7, 2011, received in U.S. Appl. No. 11/856,530, 13 pages.
Office Action dated Jan. 7, 2011, received in U.S. Appl. No. 12/205,795, 21 pages.
Office Action dated Feb. 1, 2011, received in U.S. Appl. No. 11/882,004, 16 pages.
Office Action dated Feb. 4, 2011, received in U.S. Appl. No. 11/849,801, 22 pages.
Office Action dated Feb. 17, 2011, received in U.S. Appl. No. 12/844,502, 11 pages.
Office Action dated Mar. 16, 2011, received in U.S. Appl. No. 11/882,003, 12 pages.
Office Action dated Mar. 21, 2011, received in U.S. Appl. No. 11/842,724, 29 pages.
Office Action dated Mar. 24, 2011, received in U.S. Appl. No. 11/591,752, 11 pages.
Office Action, dated Mar. 24, 2011, received in U.S. Appl. No. 12/205,757, 14 pages.
Office Action dated Mar. 31, 2011, received in U.S. Appl. No. 11/882,005, 7 pages.
Office Action dated Apr. 26, 2011, received in U.S. Appl. No. 11/838,845, 9 pages.
Perenson, “New & Improved, News of Announced Products and Upgrades”, PC Magazine, Sep. 10, 1996, 2 pages.
Petersen, “Koala Touch Tablet & Micro Illustrator Software,” InfoWorld, Oct. 10, 1983, 3 pages.
Petruzzellis, “Force-Sensing Resistors” Electronics Now, vol. 64, Issue 3, Mar. 1993, 8 pages.
Photographs of Innovations 2000 Best of Show Award Presented at the 2000 International CES InNovations 2000 Design & Engineering Showcase, Jan. 6, 2000, 1 page.
Palazzolo, “Master System Service and Troubleshooting Manual,” Intellivision, www.dsplib.com/intv/master, Dec. 12, 1997, 4 pages.
SanDisk, “Sansa Connect User Guide,” Oct. 17, 2007, 29 pages.
Schramm, “Playing with the iPhone's Accelerometer,” The Unofficial Apple Weblog, http://www.tuaw.com/2007/08/29/playing-with-the-iphones-accelerometer, Aug. 29, 2007, 5 pages.
Soderholm, “Sensing Systems for ‘Touch and Feel’,” Design News, May 8, 1989, 6 pages.
Sony, “Sony Presents Choice Without Compromise” at IBC '97, M2 Presswire, Jul. 24, 1997, 3 pages.
Spiwak, “A Great New Wireless Keyboard”, Popular Electronics, vol. 14, Issue 12, Dec. 1997, 8 pages.
Spiwak, “A Pair of Unusual Controllers”, Popular Electronics, vol. 14, Issue 4, Apr. 1997, 7 pages.
Suzuki, “Full Loading of Usable Software! Strongest Palm Series Packs 1000,” First Edition, Ascii Co., Ltd., Second Edition, 4 pages.
Sylvania, “Intellivision™ Intelligent Television Master Component Service Manual,” DSPLIB Website—Mattel Intellivsion, dated Jun. 21, 2011, 24 pages.
Telefon, “Der Klangmeister,” Connect Magazine, Aug. 1998, 2 pages.
Tessler, “Touchpads, Three New Input Devices”, Macworld, www.macworld.com/1996/02/review/1806.html, Feb. 13, 2002, 2 pages.
Tessler, “Point Pad”, Macworld, vol. 12, No. 10, Oct. 1995, 2 pages.
Tessler, “Smart Input: How to Choose From the New Generation of Innovative Inpute Devices”, Macworld, vol. 13, No. 5, May 1996, 10 pages.
Tessler, “Touchpads”, Alps Electronic, Macworld, vol. 13, No. 2, Feb. 1996, 4 pages.
The Laser Focus World Buyers Guide, “Manufactures,” a PennWell Publication, Dec. 1995, 162 pages.
The News, “Previews of Exhibitor Booths at the Philadelphia show”, Jan. 13, 1997, 22 pages.
Translation of Trekstor's Defense Statement to the District Court Mannheim, May 23, 2008, 37 pages.
Office Action, dated Sep. 30, 2004, received in U.S. Appl. No. 10/256,716, 11 pages.
Final Office Action, dated Jun. 24, 2005, received in U.S. Appl. No. 10/256,716, 12 pages.
Office Action, dated Jan. 10, 2006, received in U.S. Appl. No. 10/256,716, 12 pages.
Final Office Action, dated Aug. 3, 2006, received in U.S. Appl. No. 10/256,716, 15 pages.
Office Action, dated Oct. 13, 2006, received in U.S. Appl. No. 10/256,716, 16 pages.
Notice of Allowance, dated Jan. 25, 2007, received in U.S. Appl. No. 10/256,716, 4 pages.
Notice of Allowance, dated Jul. 16, 2007, received in U.S. Appl. No. 10/256,716, 4 pages.
Notice of Allowance, dated Nov. 1, 2007, received in U.S. Appl. No. 10/256,716, 4 pages.
Office Action, dated Jun. 9, 2006, received in Chinese Patent Application No. 02820867.6, which corresponds with U.S. Appl. No. 10/256,716, 4 pages.
Office Action, dated Jun. 20, 2008, received in Chinese Patent Application No. 200710090406.3, which corresponds with U.S. Appl. No. 10/256,716, 3 pages.
Office Action, dated Feb. 6, 2009, received in Chinese Patent Application No. 200710090406.3, which corresponds with U.S. Appl. No. 10/256,716, 3 pages.
Office Action, dated Aug. 7, 2009, received in Chinese Patent Application No. 200710090406.3, which corresponds with U.S. Appl. No. 10/256,716, 2 pages.
Office Action (Notification of Reexamination), dated Jul. 15, 2010, received in Chinese Patent Application No. 200710090406.3, which corresponds with U.S. Appl. No. 10/256,716, 4 pages.
Office Action, dated Nov. 30, 2011, received in Chinese Patent Application No. 200710090406.3, which corresponds with U.S. Appl. No. 10/256,716, 3 pages.
Office Action, dated Apr. 6, 2012, received in Chinese Patent Application No. 200710090406.3, which corresponds with U.S. Appl. No. 10/256,716, 3 pages.
Office Action, dated Aug. 21, 2009, received in Chinese Patent Application No. 200810008293.2, which corresponds with U.S. Appl. No. 10/256,716, 3 pages.
Office Action, dated Sep. 27, 2010, received in Chinese Patent Application No. 200810008293.2, which corresponds with U.S. Appl. No. 10/256,716, 3 pages.
Office Action, dated Apr. 13, 2006, received in European Patent Application No. 02 776 261.6, which corresponds with U.S. Appl. No. 10/256,716, 5 pages.
Office Action, dated Jun. 28, 2007, received in European Patent Application No. 02 776 261.6, which corresponds with U.S. Appl. No. 10/256,716, 5 pages.
Office Action, dated Oct. 2, 2008, received in European Patent Application No. 02 776 261.6, which corresponds with U.S. Appl. No. 10/256,716, 4 pages.
Office Action, dated Jun. 29, 2010, received in European Patent Application No. 02 776 261.6, which corresponds with U.S. Appl. No. 10/256,716, 4 pages.
Office Action (Observations by a Third Party), dated Nov. 3, 2008, received in European Application No. 02 776 261.6, which corresponds with U.S. Appl. No. 10/256,716, 11 pages.
Office Action (to Grant), dated Feb. 28, 2011, received in European Patent Application No. 02 776 261.6, which corresponds with U.S. Appl. No. 10/256,716, 5 pages.
Office Action, dated Mar. 8, 2012, received in European Patent Application No. 10 011 448.7, which corresponds with U.S. Appl. No. 10/256,716, 4 pages.
Office Action, dated Sep. 12, 2006, received in Japanese Patent Application No. 2003-538879, which corresponds with U.S. Appl. No. 10/256,716, 4 pages.
Official Inquiry from the Appeal Board, dated Sep. 22, 2008, received in Japanese Patent Application No. 2003-538879, which corresponds with U.S. Appl. No. 10/256,716, 5 pages.
Office Action, dated May 13, 2008, received in Japanese Patent Application No. 2007-057453, which corresponds with U.S. Appl. No. 10/256,716, 3 pages.
Office Action, dated Nov. 18, 2008, received in Japanese Patent Application No. 2008-179252, which corresponds with U.S. Appl. No. 10/256,716, 3 pages.
Office Action, dated Jun. 9, 2009, received in Japanese Patent Application No. 2008-179252, which corresponds with U.S. Appl. No. 10/256,716, 2 pages.
Office Action, dated Dec. 20, 2010, received in Japanese Patent Application No. 2008-179252, which corresponds with U.S. Appl. No. 10/256,716, 4 pages.
Office Action, dated Sep. 26, 2011, received in Japanese Patent Application No. 2008-179252, which corresponds with U.S. Appl. No. 10/256,716, 14 pages.
Office Action, dated Feb. 10, 2009, received in Japanese Patent Application No. 2008-179261, which corresponds with U.S. Appl. No. 10/256,716, 3 pages.
Office Action, dated Nov. 17, 2009, received in Japanese Patent Application No. 2008-179261, which corresponds with U.S. Appl. No. 10/256,716, 2 pages.
Office Action, dated Aug. 2, 2010, received in Japanese Patent Application No. 2008-179261, which corresponds with U.S. Appl. No. 10/256,716, 1 page.
Office Action, dated Sep. 20, 2011, received in Japanese Patent Application No. 2008-291198, which corresponds with U.S. Appl. No. 10/256,716, 8 pages.
Office Action, dated Jan. 9, 2006, received in Korean Patent Application No. 10-2004-7005119, which corresponds with U.S. Appl. No. 10/256,716, 3 pages.
Office Action, dated Aug. 29, 2006, received in Korean Patent Application No. 10-2004-7005119, which corresponds with U.S. Appl. No. 10/256,716, 3 pages.
Office Action, dated Aug. 31, 2007, received in Korean Patent Application No. 10-2004-7005119, which corresponds with U.S. Appl. No. 10/256,716, 3 pages.
Office Action, dated Feb. 11, 2008, received in Korean Patent Application No. 10-2004-7005119, which corresponds with U.S. Appl. No. 10/256,716, 3 pages.
Office Action, dated Jul. 16, 2008, received in Korean Patent Application No. 10-2004-7005119, which corresponds with U.S. Appl. No. 10/256,716, 2 pages.
Office Action, dated Jan. 11, 2008, received in Korean Patent Application No. 10-2007-7012309, which corresponds with U.S. Appl. No. 10/256,716, 4 pages.
Office Action, dated Mar. 2, 2009, received in Korean Patent Application No. 10-2007-7012309, which corresponds with U.S. Appl. No. 10/256,716, 3 pages.
Office Action, dated Jun. 30, 2008, received in Korean Patent Application No. 10-2008-7000097, which corresponds with U.S. Appl. No. 10/256,716, 3 pages.
Office Action, dated Dec. 19, 2008, received in Korean Patent Application No. 10-2008-7000097, which corresponds with U.S. Appl. No. 10/256,716, 8 pages.
Office Action, dated Jul. 29, 2009, received in Korean Patent Application No. 10-2008-7000097, which corresponds with U.S. Appl. No. 10/256,716, 3 pages.
Notice of Allowance, dated Mar. 3, 2010, received in Korean Patent Application No. 10-2008-7000097, which corresponds with U.S. Appl. No. 10/256,716, 5 pages.
Office Action, dated Mar. 3, 2010, received in Korean Patent Application No. 10-2009-7024888, which corresponds with U.S. Appl. No. 10/256,716, 3 pages.
Office Action, dated Jul. 26, 2010, received in Korean Patent Application No. 10-2010-7014838, which corresponds with U.S. Appl. No. 10/256,716, 3 pages.
Certificate of Grant, dated Jan. 31, 2011, received in Singapore Patent Application No. 200701908-6, which corresponds with U.S. Appl. No. 10/256,716, 1 page.
Office Action, dated Apr. 6, 2011, received in Singapore Patent Application No. 200907980-7, which corresponds with U.S. Appl. No. 10/256,716, 6 pages.
Certificate of Grant, dated May 6, 2014, received in Singapore Patent Application No. 2009079807, which corresponds with U.S. Appl. No. 10/256,716, 1 page.
Office Action, dated Aug. 7, 2009, received in U.S. Appl. No. 11/610,181, 20 pages.
Notice of Allowance, dated Jan. 14, 2010, received in U.S. Appl. No. 11/610,181, 7 pages.
Office Action, dated Jul. 7, 2009, received in U.S. Appl. No. 11/610,190, 26 pages.
Final Office Action, dated Dec. 31, 2009, received in U.S. Appl. No. 11/610,190, 25 pages.
Office Action, dated Apr. 28, 2010, received in U.S. Appl. No. 11/610,190, 29 pages.
Final Office Action, dated Aug. 6, 2010, received in U.S. Appl. No. 11/610,190, 30 pages.
Office Action, dated Apr. 19, 2011, received in U.S. Appl. No. 11/610,190, 25 pages.
Final Office Action, dated Sep. 20, 2011, received in U.S. Appl. No. 11/610,190, 27 pages.
Office Action, dated Mar. 14, 2012, received in U.S. Appl. No. 11/610,190, 26 pages.
Final Office Action, dated Dec. 6, 2012, received in U.S. Appl. No. 11/610,190, 25 pages.
Office Action dated Nov. 1, 2010, received in U.S. Appl. No. 11/959,918, 8 pages.
Office Action dated Mar. 31, 2011, received in U.S. Appl. No. 11/959,918, 9 pages.
Office Action, dated Mar. 3, 2014, received in U.S. Appl. No. 11/959,918, 9 pages.
Notice of Allowance, dated Nov. 24, 2014, received in U.S. Appl. No. 11/959,918, 7 pages.
Office Action, dated Oct. 26, 2010, received in U.S. Appl. No. 11/959,942, 27 pages.
Final Office Action, dated Jun. 23, 2011, received in U.S. Appl. No. 11/959,942, 30 pages.
Final Office Action, dated Mar. 21, 2012, received in U.S. Appl. No. 11/959,942, 23 pages.
Final Office Action, dated Sep. 28, 2012, received in U.S. Appl. No. 11/959,942, 26 pages.
Office Action, dated Jul. 15, 2014, received in U.S. Appl. No. 11/959,942, 20 pages.
Notice of Allowance, dated Dec. 5, 2014, received in U.S. Appl. No. 11/959,942, 8 pages.
Extended Search Report, dated Feb. 21, 2011, received in European Patent Application No. 10011448.7, which corresponds with U.S. Appl. No. 10/256,716, 8 pages.
Australian Search Report and Written Opinion, dated Aug. 20, 2008, received in Australian Patent Application No. SG 200701908-6, which corresponds with U.S. Appl. No. 10/256,716, 11 pages.
Australian Written Opinion, dated May 27, 2009, received in Australian Patent Application No. SG 200701908-6, which corresponds with U.S. Appl. No. 10/256,716, 5 pages.
Australian Written Opinion, dated Jan. 28, 2010, received in Australian Patent Application No. SG 200701908-6, which corresponds with U.S. Appl. No. 10/256,716, 5 pages.
Australian Search Report and Written Opinion, dated May 10, 2010, received in Australian Patent Application No. SG 200907980-7, which corresponds with U.S. Appl. No. 10/256,716, 9 pages.
International Search Report, dated Oct. 15, 2003, received in International Patent Application No. PCT/2002/033805, which corresponds with U.S. Appl. No. 10/256,716, 3 pages.
Related Publications (1)
Number Date Country
20150248175 A1 Sep 2015 US
Provisional Applications (3)
Number Date Country
60387692 Jun 2002 US
60359551 Feb 2002 US
60346237 Oct 2001 US
Divisions (1)
Number Date Country
Parent 10256716 Sep 2002 US
Child 11959942 US
Continuations (1)
Number Date Country
Parent 11959942 Dec 2007 US
Child 14685484 US