Techniques for interactive input to portable electronic devices

Abstract
Techniques for providing input to interactive and multitasking applications are disclosed. A game input area (surface or plane) receives input for multiple applications including an interactive application executed in connection with a scene. The input received is directed to the appropriate application based on one or more locations (e.g., points, positions, regions, portions) of the input area effectively identified when input is received (or entered). In addition, the manner in which input is received (or entered) can be used to determine which application should receive the input. The input area can additionally resemble or approximate the shape of a scene (e.g., game scene) to allow a person to provide input in a more intuitive way. Accordingly, input can be provided in a simple and more intuitive manner by effectively allowing the user to interact with the input area in a way that mimics or approximates a desired action (e.g., moving a ball or bat around by inputting a rotational movement). Examples of such interaction include positional, directional (e.g., rotational), and press or pressure input (or movement), which can easily be provided by a thumb or a finger, for example, on a touch screen.
Description

This application is related to: (i) U.S. Pat. No. 7,046,230, filed Jul. 2, 2002, and entitled “TOUCH PAD FOR HANDHELD DEVICE,” which is hereby incorporated herein by reference; (ii) U.S. patent application Ser. No. 10/722,948, filed Nov. 25, 2003, and entitled “TOUCH PAD FOR HANDHELD DEVICE,” which is hereby incorporated herein by reference; (iii) U.S. patent application Ser. No. 11/481,303, filed Jul. 3, 2006, and entitled “MEDIA MANAGEMENT SYSTEM FOR MANAGEMENT OF GAMES ACQUIRED FROM A MEDIA SERVER,” which is hereby incorporated herein by reference.


BACKGROUND OF THE INVENTION

Conventional input devices (e.g., a keyboard, a mouse) are used to provide input to various application programs (applications) running (or being executed) on conventional computing systems (e.g., personal computers). Generally speaking, providing input to an application program running on a mobile device (e.g., a portable media player, a mobile phone) poses a more difficult problem, especially when an “interactive” application (e.g., a gaming application) and/or multiple applications are to be supported. Broadly speaking, applications that receive or require input can be characterized as “interactive” applications.


Typically, interactive applications require input in connection with data or content displayed. The data or content displayed can be characterized as a “scene.” In general, data or content (or scene) displayed is manipulated or controlled based on the input when an interactive application is executed. Often, a person (or a human being) provides the input while viewing the data or content (or scene) displayed by the interactive application.


In a “multitasking” (or multiprogramming) computing environment, multiple applications are effectively supported at the same time. Those skilled in the art will readily appreciate that multitasking poses difficult technical challenges, especially when an interactive application is supported on a mobile device. Despite these challenges, interactive and multitasking applications have become increasingly more popular with users of mobile devices.


Accordingly, improved techniques for providing user input to interactive and multitasking applications would be useful.


SUMMARY OF THE INVENTION

Broadly speaking, the invention relates to improved techniques for providing user input to interactive and multitasking computing environments. The invention can be implemented in numerous ways, including as a method, an apparatus, and a computer readable medium. Several aspects and embodiments of the invention are discussed below.


One aspect of the invention provides a game input area (surface or plane) that can receive input for multiple applications including an interactive application executed in connection with a scene displayed on a display. Input is directed to the appropriate application based on one or more locations (e.g., points, positions, regions, portions) of the input area effectively identified when input is received (e.g., when input is entered by a person by touching a particular position on a touch screen). In addition, the manner in which input is received (or entered) can be used to determine which application should receive the input. By way of example, the same input area can be effectively used to receive input for a gaming application and a non-gaming application at substantially the same time. More particularly, the input area for a gaming application can effectively overlap or include a number of locations that are designated for a non-gaming application (e.g., a media playback application) and/or designated as such only if input is received in a particular manner (e.g., pushing or pressing of any location, or a particular designated location, would result in sending the input to the non-gaming application). As such, a location on the input area can be designated, for example, for a gaming application if input is received in a particular manner different than that designated for the non-gaming application (e.g., tapping or touching the region would send input to a gaming application, but pressing would result in a media player function). Accordingly, this aspect of the invention allows the same input area to be used for multiple applications. Hence, a person can use the same input area (e.g., a top surface or plane of a physical input device such as a touch screen) to multitask. In other words, the person can, for example, play a game and exercise control over another application (e.g., a media playback application) using the same input area.


Another aspect of the invention provides an input area that resembles or approximates the shape of a scene (e.g., game scene) associated with an application (e.g., a gaming application). Typically, the scene is used in connection with the application (e.g., a game scene is used to play a game, a record is displayed for a database program and manipulated based on input). Further, the scene is often controlled or manipulated based on input provided by a person. Typically, this requires one or more objects to be controlled or manipulated in the scene based on the input provided. It will be appreciated that an input area that resembles or approximates the scene allows a person to provide input in a more intuitive way. Further, input can be provided in a simple and more intuitive manner by effectively allowing the user to interact with the input area in a way that mimics or approximates a desired action or motion of an object displayed in the scene (e.g., moving a ball or bat around by mimicking the motion on the input area). Examples of such interactions include those that can be characterized as positional, directional, rotational, pressing and/or pushing type inputs (or movements).


It will be appreciated that these and other aspects of the invention can be combined to realize additional benefits. In general, the invention allows various applications to be integrated and used on devices that are not readily suitable for supporting multiple applications at the same time. As an example, a portable media player can be effectively integrated with various other applications including gaming applications. The media player can, for example, be used to play a game and still behave as a media player during the game play. It will also be appreciated that the media player can provide the same media presentation functions (e.g., play, pause, next, back) that users have become accustomed to and provide them in a familiar manner. In one embodiment, a music-based game is provided on a media player. The music-based game can, for example, use individualized music (e.g., music owned and/or stored by an individual). It will be appreciated that a person can use the same input area (or input device) to not only play the game but also control the music being played while the game is in progress. The game can be controlled by using intuitive and simple motions (e.g., directional and/or rotational movement and/or touching a particular location using a thumb or finger). In one embodiment, during game play, the media player can still be controlled in the manner familiar to users.


Other aspects and advantages of the invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrating by way of example the principles of the invention.





BRIEF DESCRIPTION OF THE DRAWINGS

The present invention will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements, and in which:



FIG. 1A depicts a computing environment where first and second application programs (or applications) are effectively executed by a computing device in accordance with one embodiment of the invention.



FIG. 1B depicts an input area (surface or plane) that resembles or approximates the scene associated with a first application in accordance with one embodiment of the invention.



FIG. 1C depicts a method for providing input to multiple application programs (or applications) using an input device in accordance with one embodiment of the invention.



FIG. 1D depicts a method for providing input to multiple application programs (or applications) in accordance with another embodiment of the invention.



FIGS. 2A-C depict a computing environment in accordance with one or more embodiments of the invention.



FIG. 2D depicts a method for executing an application program (or application) in connection with a scene in accordance with another embodiment of the invention.



FIGS. 3A-B depict game scenes in accordance with one or more embodiments of the invention.



FIGS. 3C-D depict a method for playing a game on a computing device in accordance with one embodiment of the invention.



FIG. 4A depicts a computing device in accordance with one embodiment of the invention.



FIGS. 4B-C depict entering input in accordance with one or more embodiments of the invention.



FIGS. 4D-F depict an input area in accordance with one embodiment of the invention.



FIG. 4G depicts a method for playing a game using an input device that effectively provides an input area that resembles a game scene in accordance with one embodiment of the invention.



FIG. 5 depicts a rotational movement that can be used to indicate a number within a larger range in accordance with one embodiment of the invention.



FIG. 6 depicts a media player in accordance with one embodiment of the invention.





DETAILED DESCRIPTION OF THE INVENTION

The invention pertains to improved techniques for providing user input to interactive and multitasking computing environments. The invention can be implemented in numerous ways, including as a method, an apparatus, and a computer readable medium. Several aspects and embodiments of the invention are discussed below.


One aspect of the invention provides a game input area (surface or plane) that can receive input for multiple applications including an interactive application executed in connection with a scene displayed on a display. Input is directed to the appropriate application based on one or more locations (e.g., points, positions, regions, portions) of the input area effectively identified when input is received (e.g., when input is entered by a person by touching a particular position on a touch screen). In addition, the manner in which input is received (or entered) can be used to determine which application should receive the input. By way of example, the same input area can be effectively used to receive input for a gaming application and a non-gaming application at substantially the same time. More particularly, the input area for a gaming application can effectively overlap or include a number of locations that are designated for a non-gaming application (e.g., a media playback application) and/or designated as such only if input is received in a particular manner (e.g., pushing or pressing of any location, or a particular designated location, would result in sending the input to the non-gaming application). As such, a location on the input area can be designated, for example, for a gaming application if input is received in a particular manner different than that designated for the non-gaming application (e.g., tapping or touching the region would send input to a gaming application, but pressing would result in a media player function). Accordingly, this aspect of the invention allows the same input area to be used for multiple applications. Hence, a person can use the same input area (e.g., a top surface or plane of a physical input device such as a touch screen) to multitask. In other words, the person can, for example, play a game and exercise control over another application (e.g., a media playback application) using the same input area.


Another aspect of the invention provides an input area that resembles or approximates the shape of a scene (e.g., game scene) associated with an application (e.g., a gaming application). Typically, the scene is used in connection with the application (e.g., a game scene is used to play a game, a record is displayed for a database program and manipulated based on input). Further, the scene is often controlled or manipulated based on input provided by a person. Typically, this requires one or more objects to be controlled or manipulated in the scene based on the input provided. It will be appreciated that an input area that resembles or approximates the scene allows a person to provide input in a more intuitive way. Further, input can be provided in a simple and more intuitive manner by effectively allowing the user to interact with the input area in a way that mimics or approximates a desired action or motion of an object displayed in the scene (e.g., moving a ball or bat around by mimicking the motion on the input area). Examples of such interactions include those that can be characterized as positional, directional, rotational, pressing and/or pushing type inputs (or movements).


It will be appreciated that these and other aspects of the invention can be combined to realize additional benefits. In general, the invention allows various applications to be integrated and used on devices that are not readily suitable for supporting multiple applications at the same time. As an example, a portable media player can be effectively integrated with various other applications including gaming applications. The media player can, for example, be used to play a game and still behave as a media player during the game play. It will also be appreciated that the media player can provide the same media presentation functions (e.g., play, pause, next, back) that users have become accustomed to and provide them in a familiar manner. In one embodiment, a music-based game is provided on a media player. The music-based game can, for example, use individualized music (e.g., music owned and/or stored by an individual). It will be appreciated that a person can use the same input area (or input device) to not only play the game but also control the music being played while the game is in progress. The game can be controlled by using intuitive and simple motions (e.g., directional and/or rotational movement and/or touching a particular location using a thumb or finger). In one embodiment, during game play, the media player can still be controlled in the manner familiar to users.


Embodiments of these aspects of the invention are discussed below with reference to FIGS. 1A-6. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes as the invention extends beyond these limited embodiments.


In accordance with one aspect of the invention, an input device can effectively provide input for multiple application programs (or applications) during execution or runtime when the applications are both being executed. To further elaborate, FIG. 1A depicts a computing environment 10 where first and second application programs (or applications) 14 and 16 are effectively executed by a computing device 12 (e.g., a personal computer, laptop, mobile phone, portable media player). Referring to FIG. 1A, an input device 20 effectively provides an input area (surface or plane) 22 for receiving input for both applications 14 and 16. More particularly, one or more locations (e.g., points, positions, regions, portions) 24 on the input area 22 are designated for receiving input for the first application program 14 when input is provided in a manner designated for the first application 14. By way of example, input that effectively pushes or presses on the one or more locations 24 can be designated for the first application 14. However, it will be appreciated that input provided in a different manner (e.g., touching, tapping, or rubbing over) can be designated and provided for the second application 16. Those skilled in the art will appreciate that the manner in which input can be provided can vary widely. Nevertheless, a few exemplary ways for providing input are discussed below.


It should be noted that input device 20 is especially well suited for situations where the first application is executed in connection with a scene 30 displayed on a display 32 of the computing environment 10. One example is a gaming application where the first scene 30 is a scene for a game (game scene) where various game objects are displayed and controlled (or manipulated) based on the input effectively provided by the input device 20. As another example, the first or second application (14 or 16) can be a media playback application for presentation of media. In any case, it will be appreciated that the input area (or surface) 22 can be used to effectively provide input for both the first and second applications 14 and 16. The input can be provided to one or both of the first and second applications 14 and 16 depending on the one or more locations of the input area 22 effectively identified when the input is received and/or the manner of receiving (or entering) the input.


In accordance with another aspect of the invention, the input area 22 (shown in FIG. 1A) can resemble or approximate the shape of the first scene 30. This allows input to be provided in a more intuitive manner as a person (or human being) can easily associate the input area 22 with the scene 30 typically displayed in connection with an application. The benefits of such arrangement become readily apparent for a gaming application where typically one or more game objects (e.g., a ball, a gun, a car) are effectively controlled (e.g., moved) in a game scene. As such, gaming applications are further discussed below in greater detail.


To further elaborate, FIG. 1B depicts an input area (or surface) 40 that resembles or approximates the scene 30 associated with a first application 14 (e.g., gaming application). Referring to FIG. 1B, it is apparent that the scene 30 can be visually mapped to the input area 40. As a result, input associated with the first application 14 can be provided in a more intuitive manner (e.g., by touching various points or positions of the input area 40 that correspond to various points or positions of the scene 30).


It should be noted that one or more locations (e.g., points, positions, portions, regions) 24 of the input area 40 can also be used to provide input for the second application 16. Generally, input for the second application 16 can be provided by interacting with a designated location (e.g., 24) of the input area 40 and/or by providing input in a particular manner (e.g., pressing down).



FIG. 1C depicts a method 50 for providing input to multiple application programs (or applications) using an input device in accordance with one embodiment of the invention. The input device can, for example, be the input device 20 (shown in FIG. 1A). In any case, the input device effectively provides an input area (or surface) for entering input for multiple active applications. Referring to FIG. 1C, initially, input is received (52). It should be noted that the input is received (or entered) in a particular manner (e.g., press, touch, rub, tap) and/or in connection with at least one location (e.g., a point, position, portion, or region) of the input area. Next, it is determined (54), based on the manner of receiving (or entering) the input and/or the at least one location of the input area effectively identified by the input, which one of a plurality of applications is to receive the input. Thereafter, the input is provided (56) to the appropriate application. The method 50 ends after the input is provided (56) to the application determined (54) to be the appropriate application for receiving the input.
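
Purely by way of illustration, the determination (54) can be sketched as a simple routing table keyed by location and manner. In the following Python sketch, the event fields, location names, and application labels are hypothetical and do not correspond to any particular embodiment:

    from dataclasses import dataclass

    @dataclass
    class InputEvent:
        region: str   # location of the input area resolved by the device
        manner: str   # e.g., "tap", "touch", "rub", "press"

    # Hypothetical designation table: (location, manner) pairs routed to
    # the non-gaming (e.g., media playback) application; anything else is
    # treated as input for the interactive (e.g., gaming) application.
    NON_GAMING = {
        ("play_pause", "press"),
        ("next", "press"),
        ("back", "press"),
    }

    def route(event: InputEvent) -> str:
        """Determine (54) which application is to receive the input."""
        if (event.region, event.manner) in NON_GAMING:
            return "media_player"
        return "game"

    # A press on the play/pause location goes to the media player, while
    # a tap on the very same location remains game input.
    assert route(InputEvent("play_pause", "press")) == "media_player"
    assert route(InputEvent("play_pause", "tap")) == "game"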



FIG. 1D depicts a method 70 for providing input to multiple application programs in accordance with another embodiment of the invention. Initially, it is determined (72) whether input has been received. If it is determined (72) that input has been received, it is next determined (74) whether the input is associated with one or more locations (e.g., points, positions, portions, regions) of an input area (or surface) designated for a first application. If it is determined (74) that the input is associated with one or more locations designated for the first application, it is then determined (76) whether the input is provided (received or entered) in a manner designated for the first application. In effect, if it is determined (74) that the input is associated with one or more locations designated for the first application and it is determined (76) that the input is provided in a manner designated for the first application, the input is provided (78) to the first application. It should be noted that the order in which the determinations (74) and (76) are made may be interchangeable, or only one of them may be necessary to determine whether to provide input to the first application. As one example, a system can, for example, be configured to send all input provided in a particular manner to a first application and/or all input associated with one or more particular locations to the first application. Those skilled in the art will understand other variations.


Referring back to FIG. 1D, if it is determined (74) that the input is not associated with one or more locations designated for the first application, or it is determined (76) that the input is not provided in a manner designated for the first application, it is determined (80) whether to automatically provide the input to a second application. As such, the input can be provided (82) to the second application and the method 70 can proceed to determine (72) whether another input has been received. Again, those skilled in the art will appreciate many other variations and will readily know that the determination (80) can represent a design or programming choice, namely, a choice of whether to automatically send input to the second application. Alternatively, additional checks can be made to determine (84) whether to send the input to the second application. By way of example, based on the manner and/or one or more locations associated with the input, it can be determined (84) whether to provide (82) the input to the second application (or a third application), and so on. Accordingly, if it is determined (84) to provide the input to the second application, the input is provided (82) to the second application. Thereafter, it is determined (72) whether input has been received and the method 70 proceeds in the same manner as described above to receive other input (72) and provide it to the appropriate application.
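
By way of further illustration, the decision flow of the method 70 can be sketched as follows, again with hypothetical names; the automatic-forwarding choice (80) is modeled as a configuration flag:

    from collections import namedtuple

    Event = namedtuple("Event", "region manner")

    def method_70(event, first_locations, first_manners,
                  auto_forward=True, second_accepts=()):
        """Sketch of FIG. 1D; returns the receiving application or None."""
        if event.region in first_locations and event.manner in first_manners:
            return "first_app"             # determinations (74) and (76) hold
        if auto_forward:                   # design choice (80)
            return "second_app"            # input provided (82)
        if (event.region, event.manner) in second_accepts:
            return "second_app"            # additional check (84), then (82)
        return None                        # input is ignored

    # Example: rubbing over a location designated for the first application
    # falls through to the second application when forwarding is automatic.
    print(method_70(Event("center", "rub"), {"center"}, {"tap"}))  # second_app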



FIG. 2A depicts a computing environment 100 in accordance with one embodiment of the invention. Referring to FIG. 2A, a computing system (or device) 102 effectively provides functionality labeled as an input component (or module) 104. More particularly, the input component 104 effectively provides or generates an input area 110 associated with a scene or area 108 displayed on a display 103. The scene 108 can, for example, be a part of a complete game scene displayed for a gaming application. As such, the scene 108 typically includes at least one object (e.g., ball, racket, gun, car) 113 that is controlled or manipulated (e.g., moved) when the first application 106 is being executed, for example, during game play. The object 113 can be displayed within and/or on a boundary of the scene 108 displayed on the display 103. It should be noted that although the display 103 is depicted as a separate component, it can be a part of the computing system 102 and/or configured for the computing system 102. Also, it will be appreciated that the input area 110 can include or can be effectively provided by an input device 105 (e.g., touch/control pad, touch screen) which interacts with the input component or module 104. The input area 110 can also be a virtual area or an area mapped to empty space where, for example, motion is detected by one or more motion detectors. In any case, the input area 110 resembles or approximates the scene 108 where one or more game objects 113 are to be controlled. Further, input provided can typically identify one or more locations (e.g., points, positions, portions, regions) of the input area 110 and/or can be received (or entered) in a particular manner (e.g., press, touch).


Such input can, for example, be associated with movement between first and second locations of the input area 110. As another example, input can be characterized as positional input that identifies or indicates a single location of the input area 110. In general, input identifies or indicates one or more locations of the input area 110. Referring to FIG. 2A, input can, for example, be entered by a thumb or finger 111 as positional input (e.g., by touching or tapping a particular location 115 of the input area 110 effectively provided, for example, as a part of a touch pad or touch screen). As another example, input can be characterized as directional movement (including rotational movement) entered by the thumb or finger 111 in various directions and between various locations of the input area 110. Referring to FIG. 2A, the directional movement of the thumb or finger 111 in the input area 110 is effectively mapped to movement of the game object 113 in the scene 108. As another example, “positional” movement of the thumb or finger 111 at location 115 effectively moves or places the game object 113 at a corresponding location of the scene 108.
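
As a minimal sketch of this correspondence, and assuming rectangular input and scene areas of hypothetical dimensions, the mapping of a location of the input area 110 to a location of the scene 108 can be a simple per-axis normalization:

    def to_scene(point, input_size, scene_size):
        """Map a location of the input area 110 to the corresponding
        location of the scene 108 by normalizing each axis (any monotone
        mapping preserving the resemblance would serve equally well)."""
        (x, y), (iw, ih), (sw, sh) = point, input_size, scene_size
        return (x / iw * sw, y / ih * sh)

    # A touch at (30, 20) on a 60 x 40 input area places the game object
    # at the center of a 320 x 240 scene.
    assert to_scene((30, 20), (60, 40), (320, 240)) == (160.0, 120.0)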


Although the examples shown in FIG. 2A demonstrate mapping input received in the input area 110 to movement of an object 113 in the scene 108, it will be appreciated that objects can be effectively controlled or manipulated in numerous other ways based on the input received by the input area 110. For example, positional input at location 115 of the input area 110 can effectively identify or select a particular game object at a corresponding location in the scene 108. The object can then be controlled (e.g., deleted, moved or modified) by default and/or based on subsequent positional and/or directional input. As such, it is possible to implement various other functions besides movement of objects. In general, a scene and/or one or more objects can be controlled or manipulated based on positional and/or directional input. However, for simplicity and ease of illustration, moving game objects in response to input received in an input area that effectively resembles or approximates a game area (or scene) will be described in greater detail below.


Referring back to FIG. 2A, non-gaming locations (e.g., points, positions, or regions) 132 and 134 are also shown in the input area 110. It will be appreciated that the non-gaming locations 132 and 134 can be designated for receiving input not directly connected to the game being played and/or game area (or scene) 108 being displayed. As such, locations 132 and 134 can be used to provide input for applications other than the game being played. Nevertheless, the locations 132 and 134 can still be part of the input area 110 and/or overlap with the input area 110, and as such, also used for playing a game (i.e., can receive directional and/or positional input for the game). In other words, the thumb or finger 111 can effectively use the game input area 110 to control both a game and a non-gaming application (e.g., a media player). By way of example, pressing or pushing on location 132 can be interpreted as input for a non-gaming application, but a tap or touch on the same location 132 can be interpreted as positional input provided for the game and used to manipulate the scene 108. However, a directional input (or movement) of the thumb or finger 111 over the location 132 (without pressing down) can be interpreted and provided as directional input for controlling the game if a pressing action of the location 132 is not detected.


To further elaborate, FIG. 2B depicts an input area 110 and a game scene (or area) 120 in accordance with one embodiment of the invention. Referring to FIG. 2B, a thumb or finger 111 can effectively input a directional movement (e.g., right to left, left to right). In response to the directional movement, a game object 113 is effectively controlled in the scene 120. More particularly, based on the directional movement input by the thumb or finger 111, the game object 113 is effectively controlled (e.g., moved). By way of example, directional movement between locations 202 and 204 of the input area 110 can be effectively transformed to movement of the object 113 between locations 212 and 214 of the scene 120. The locations 212 and 214 can, for example, correspond to the locations 202 and 204 of the input area 110. However, it should be noted that the directional movement can be interpreted in accordance with much more complex formulas. For example, factors including the distance between locations 202 and 204 and the time it takes to complete the movement between them can be used to additionally determine the speed and/or acceleration for moving the object 113. Further, directional movement can, for example, set the object 113 in motion until another input is received and/or a boundary of the scene 120 is reached.
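
For instance, a deliberately simple model of the speed determination mentioned above, in which the distance between the locations and the elapsed time yield a speed for the object 113, might read as follows (the units and the guard value are hypothetical):

    import math

    def swipe_speed(p1, p2, t1, t2):
        """Derive a speed for the game object from a directional movement:
        the distance between two input-area locations divided by the time
        taken to travel between them."""
        distance = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
        elapsed = max(t2 - t1, 1e-6)     # guard against zero-length gestures
        return distance / elapsed

    # A 50-unit swipe completed in a quarter second yields 200 units/s.
    print(swipe_speed((0, 0), (30, 40), 0.0, 0.25))   # 200.0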


In general, those skilled in the art will appreciate that directional input provided in the input area 110 can be interpreted or effectively mapped to one or more actions, operations, methods, or functions that are performed or invoked for an object and/or on its behalf. By way of example, in a gaming environment, an object representing a gun can be “fired,” or an object can “explode.” Again, for simplicity, the following examples only illustrate movement of the objects, but those skilled in the art will appreciate that virtually any action or operation can be implemented, for example, by defining methods or functions for various objects used by an application program. It should be noted that positional or directional input (or movement) can also be received in the interior of the input area 110. Referring to FIG. 2B, lines 220 and 222 demonstrate directional movement in the interior of the input area 110 which can, for example, cause movement of the game object 113 along the corresponding lines 230 and 232 of the game scene (or area) 120. As also shown, input can be received as rotational input 225 in clockwise or counterclockwise directions.


Referring now to FIG. 2C, entering positional input is depicted in accordance with one embodiment of the invention. More specifically, positional input is effectively provided by the finger or thumb 111 in the input area 110. Generally, positional input can be characterized as input that includes or effectively indicates a location (e.g., point, position, portion, region) of an input area. As such, positional input can be defined to be different from directional movement. By way of example, positional input can be defined as a tap or touch (e.g., coming in contact with an input device and/or its surface, plane or area for a predetermined amount of time). On the other hand, directional input can, for example, be defined as movement between two or more locations. Both directional and positional input can be further distinguished from a press (or push) associated with a sufficient amount of pressure exerted on an input area. Referring to FIG. 2C, positional input at location 240 can cause a game object 113 to move to (or appear at) a corresponding location 242 of the game area (or scene) 120. Similarly, positional input provided at location 246 can cause the object 113 to move to a corresponding location 244.
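
The distinctions drawn above between a press, directional movement, and positional input can be sketched as a small classifier; the pressure, travel, and timing thresholds below are hypothetical and would in practice be tuned for a particular device:

    def classify(contact_ms, moved_px, pressure):
        """Distinguish the manners of input described above."""
        if pressure >= 0.6:       # sufficient pressure exerted on the area
            return "press"
        if moved_px >= 8:         # movement between two or more locations
            return "directional"
        if contact_ms <= 200:     # brief contact at a single location
            return "tap"
        return "touch"            # longer contact, still positional

    # A light, stationary, 100 ms contact registers as a tap.
    assert classify(100, 2, 0.1) == "tap"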



FIG. 2D depicts a method 250 for executing an application program (or application) in connection with a scene in accordance with another embodiment of the invention. The application can, for example, be an interactive program (e.g., a game) requiring input to be entered in connection with a scene (e.g., a game scene). In any case, an input area (or surface or plane) is determined and/or effectively initiated (252) for receiving input for the application. It should be noted that the input area can resemble or approximate the shape of a scene displayed in connection with and/or for the application when the application is initiated or being executed. The input area may effectively have a fixed or predetermined shape. Alternatively, the input area may be determined in a dynamic manner and/or change as the shape of the game scene changes in order to more closely approximate the current game scene. In any case, after the input area has been determined and/or initiated (252), input associated with the input area is received (254). The input is associated with or effectively identifies one or more locations (e.g., points, positions, portions, regions) of the input area. Subsequently, the scene is controlled and/or manipulated (256) based on the one or more locations associated with the input and/or the manner in which the input was received, and the method 250 ends. It should be noted that input can also be received (or entered) in a particular manner. Moreover, the manner in which the input is received can also be used to control and/or manipulate the scene.


It will be appreciated that an input device can be physically shaped to resemble a game scene or at least a part of a game scene where one or more game objects are to be controlled. It is also possible to effectively generate an input area (or surface) that resembles a game scene where one or more game objects are controlled without requiring the input device to actually (or physically) be shaped like the scene. Referring to FIG. 3A, a game scene 302 can have virtually any shape. One or more game objects 304 can be controlled within the game scene 302 in an area 306. The area 306 is effectively mapped to an input area 308 provided by a physical device 310 (e.g., an input device) that may have virtually any shape.



FIG. 3B depicts an input device 324 that resembles a game scene 322 displayed on a display 320 in accordance with one embodiment of the invention. During the game, one or more game objects 328 are controlled based on input received by the input device 324. The input device 324 can, for example, be embedded in a portable computing system (e.g., phone, media player). In any case, the input device 324 effectively provides an input area or surface 326 (e.g., an upper surface) that resembles the game scene 322. From the perspective of a human user, input can be provided intuitively partly because the input area 326 can be easily matched with the game area 322. In addition, the game input area 326 (e.g., upper surface of the input device 324) can be used by the user to enter input for multiple applications. More particularly, the user can interact with one or more non-gaming locations (e.g., buttons) 330 of the input area 326 in order to control a non-gaming application (e.g., media player).



FIG. 3C depicts a method 350 for playing a game on a computing device in accordance with one embodiment of the invention. Initially, an input area (surface or plane) that resembles or approximates the shape of a game scene is determined and/or initiated (352) when the game is operational and/or being played. Next, input associated with the input area is received (354). The input effectively identifies one or more locations (e.g., points, positions, portions, regions) of the input area and/or is received in a particular manner. Thereafter, one or more game objects are controlled and/or manipulated (356) based on the one or more locations of the game scene identified by the input and the method 350 ends.


Those skilled in the art will appreciate that game objects can be controlled and/or manipulated based on various factors and techniques. A few exemplary operations are discussed below with reference to FIG. 3D. It should be noted that one or more of these operations can be used in block 356 of the method 350 illustrated in FIG. 3C depending on the desired system configuration. Referring to FIG. 3D, one or more locations of the game scene can be determined (358) based on one or more locations identified by the input and/or the manner of entering (or receiving) the input. Next, one or more game objects are identified (360). These objects can, for example, be displayed in the game scene. The objects are identified (360) based on the one or more locations associated with the input and/or corresponding locations of the game scene and/or the manner of receiving (or entering) the input. Thereafter, one or more operations are determined (362) to be performed on the one or more game objects. These operations can also be determined based on the one or more locations associated with the input and/or the game scene and/or the manner in which the input was received (or entered). Accordingly, the one or more operations are performed (364) and the method 356 ends.


To further elaborate, FIG. 4A depicts a computing device 402 in accordance with one embodiment of the invention. The computing device 402 can, for example, be a mobile device (e.g., a portable media player, mobile phone). The computing device 402 has a housing 403 that includes a display 406 and an input device 408. A game scene (or area) 404 is displayed on the display 406 configured for the computing device 402. It should be noted that the circular game scene (or area) 404 resembles the shape of the input device 408 which effectively provides an input area 410. During the game, objects 412 and 414 are effectively controlled based on input provided by a person who interacts with the input device 408 and in effect the input area 410. In one embodiment, game objects 412 and 414 respectively mimic the behavior of a ball and bat. Hence, the “ball” 412 can fall toward the “bat” 414 and be “hit” by the “bat” 414 to bounce back in an opposite direction. During game play, the “bat” 414 can be moved around the circumference of the game scene 404 which resembles a circle. The “bat” 414 is used to hit the “ball” 412 based on various factors (e.g., angle of contact, velocity of the bat or ball). For simplicity and ease of illustration, the input provided by a person can, for example, merely control (e.g., move) the “bat” 414 so that it can “hit” the “ball” 412 as it bounces back and forth in various directions and between various locations in the game scene 404. It will be appreciated that a person can conveniently use the input area 410 effectively provided by the input device 408 to control the movement of the “bat” 414 around the circumference of the circle 404. More particularly, rotational input can be used to effectively move the “bat” 414 around the circular game scene.


To further elaborate, FIGS. 4B and 4C depict entering inputs in accordance with embodiments of the invention. More particularly, FIG. 4B depicts entering a directional movement as rotational movement from a first position (P1) to a second position (P2) using a thumb or finger 111 to interact with the input device 408.


Referring to FIG. 4B, in response to the rotational movement (P1-P2) around or along the circumference or edge of the input area 410 (or input device 408), the “bat” 414 moves between the corresponding locations DP1-DP2 of the game scene 404. It will be appreciated that when the “bat” 414 is, for example, at location DP2, the person can enter a positional input that effectively moves the “bat” object 414 to a third position (DP3). Referring to FIG. 4C, a thumb or finger 111 can input a positional input, for example, by a tap or touch at position P3 to effectively move the “bat” object 414 from location DP2 to location DP3. Hence, the person can use a combination of rotational and positional input to intuitively control the movement of the “bat” 414 in order to play the game.
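
A minimal sketch of this control scheme, assuming a circular input area centered at a point (cx, cy), maps each contact point to an angular position of the “bat” 414:

    import math

    def bat_angle(x, y, cx, cy):
        """Map a contact point on the circular input area 410 to an
        angular position (radians) of the "bat" on the circular scene 404."""
        return math.atan2(y - cy, x - cx)

    # Rotational movement P1-P2 yields a stream of angles that the bat
    # follows (DP1-DP2); a single tap at P3 jumps the bat straight to DP3.
    print(math.degrees(bat_angle(100, 50, 50, 50)))   # 0.0 (3 o'clock)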


It should also be noted that functions not directly related or connected to the game can also be provided, for example, by one or more locations 420 and 430 that are effectively activated by the finger or thumb 111. By way of example, a location 420 (shown in FIG. 4C) can be a physical button or an area on a touch surface configured to be pressed or pushed by a pressing or pushing action in order to control a function (e.g., pause, play, next, back) associated with a media player. Referring to FIG. 4C, one or more locations 420 and 430 can also be designated for non-gaming functions (e.g., input provided to a media player for playing music). By way of example, a pressing input on location 430 can be effectively interpreted as input for a media player. As such, the thumb or finger 111 can press on the location 430 (e.g., press a physical button or a designated area on a touch surface) to control a media player. Again, it should be noted that the location 430 can still be part of the input area 410 provided for gaming, as it is possible to enter gaming input using rotational and positional inputs without activating a non-gaming functionality (i.e., by not pressing on the location 430). Also, it is possible to designate, for example, a location 431 for game play regardless of the manner in which input is entered. For example, a press or push on location 431 can cause a game action (e.g., cause the bat 414 to hit harder).


To further elaborate, FIGS. 4D, 4E and 4F depict an input area 452 in accordance with one embodiment of the invention. Input area 452 includes a location 454 designated for receiving input for both a gaming and a non-gaming application. Referring to FIG. 4D, a thumb or finger 111 can press down on the location 454 to effectively provide input to a non-gaming application, for example, a media player (e.g., to start or stop the music being played during the game). However, referring to FIG. 4E, a tap or touch of the position 454 by the finger or thumb 111 effectively provides positional input for a gaming application. It should be noted that the positional input can be on or over the location 454 without sufficient pressure to cause a pressing or pushing action to be detected. Referring to FIG. 4F, a rotational movement can touch (or go over) the designated location 454, likewise without sufficient pressure, so as to provide a rotational input to the gaming application.



FIG. 4G depicts a method 450 for playing a game using an input device that effectively provides an input area resembling the shape of a game scene in accordance with one embodiment of the invention. Initially, the game is initiated (451). Next, it is determined whether input associated with the input area resembling the game scene is received (452). If it is determined (452) that input is not received, it is determined (454) whether to end the game and the method 450 can end accordingly. In effect, the method 450 can wait for input or a determination (454) to end the game (e.g., by receiving a request or indication to end the game).


If it is determined (452) that input has been received, it is determined (454) whether the input is positional (positional input). If it is determined (454) that the input is positional, one or more game objects can be moved (456) to one or more corresponding locations (e.g., points, positions, portions, regions) of the game scene in response to the positional input. However, if it is determined (454) that the input is not positional input, it is determined (458) whether the input is directional (e.g., rotational) input (or movement). If so, one or more game objects can be moved (460) in accordance with the directional input (or directional movement). By way of example, a game object can be moved in the same direction and in a manner that mimics the directional input (or movement). On the other hand, if it is determined that the input is neither directional (458) nor positional (454), it is determined (462) whether the input is associated with a non-gaming location (e.g., a location designated for applications other than the gaming application) of the input area that has been pressed (or pushed). For example, one or more buttons or selected regions of the input area can be reserved to effectively provide input to and/or control another application (e.g., a media player).


Accordingly, if it is determined (462) that a non-gaming location of the input area has been pressed (or pushed), the input is provided (464) to another application (e.g., a non-gaming application such as a media player). After the input has been provided (464), it is determined (454) whether to end the game and the game can end accordingly. However, if it is determined (462) that a non-gaming location is not pressed, it is determined (470) whether a gaming location has been pressed. For example, one or more buttons provided in the input area and/or selected regions of the input areas can be reserved as one or more gaming locations for the gaming application. In addition to positional and directional input mechanisms, this provides yet another convenient mechanism for providing input to the gaming application. As such, if it is determined (470) that a gaming location has been pressed, the input is provided to the gaming application. It should be noted that if it is determined (470) that a gaming location has not been pressed, it is determined (454) whether to end the game. Although not depicted in FIG. 4G, those skilled in the art will appreciate that error-checking can also be performed to effectively verify the input. The method 450 ends when it is determined (454) to end the game.
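
Purely by way of illustration, one pass of the input handling of FIG. 4G can be summarized as follows; the handler names and the objects on which they are invoked are hypothetical interfaces rather than elements of the figures:

    def dispatch(kind, payload, game, media_player):
        """One pass of the FIG. 4G input handling."""
        if kind == "positional":
            game.move_object_to(payload)              # step 456
        elif kind == "directional":
            game.move_object_along(payload)           # step 460
        elif kind == "press" and payload in media_player.locations:
            media_player.handle_press(payload)        # steps 462/464
        elif kind == "press" and payload in game.press_locations:
            game.handle_press(payload)                # step 470
        # otherwise: fall through and check whether to end the game (454)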


It will be appreciated that directional and positional input are useful for implementing numerous functions and applications. Using directional and positional input in combination with an input area that resembles a game scene allows a person to enter input more intuitively, thereby allowing games to be played in a more convenient manner. Further, directional and/or positional input can be used to implement functionalities which can be difficult to implement using conventional techniques. By way of example, directional and/or positional input can be provided to effectively select or identify a number within a relatively large range, as required for various gaming applications. This range can, for example, represent money available for betting in a poker game. Generally, identifying or selecting a number within a relatively large range poses a difficult problem if the actual number is not specifically entered (e.g., by using a keyboard to enter the number).


Referring to FIG. 5, rotational input or movement 502 can be used to indicate a number within a larger range 504 (e.g., 1 to 10^n, where n ≥ 6). Those skilled in the art will appreciate that one or more factors, including the direction, distance traveled, speed and acceleration associated with a directional input or movement, can be used to effectively determine a number within the range 504. By way of example, a relatively slow rotational movement over a relatively small distance may indicate incrementing by one (1), which would result in updating a number provided and displayed on a display 506 (e.g., updating the number displayed by one). Accordingly, rotational input can be used to effectively increment by one (1) to reach the desired number. However, rotational input or movement 502 extending over relatively larger distances and/or provided relatively faster can be used to effectively increment by larger amounts, for example, tens or hundreds, and so on. Similarly, rotational input or movement 502 can effectively decrement in small or larger amounts. It should be noted that a “rounding off” effect can also be provided to effectively round off the selected number as deemed appropriate. By way of example, starting at the number 20, relatively slower rotational movement can initially increment by ones to yield the number 27. Subsequently, relatively larger rotational movement can result in increasing the number by a relatively larger increment, namely 10. However, rather than yielding the numbers 37 and 47, the rounding off effect may result in increasing the number to 40, 50, and so on. Those skilled in the art will appreciate that such rounding off can be implemented in accordance with various schemes and/or in consideration of the particular application or game. As such, the technique used for a particular type of poker game may differ from that used for a different type of game and/or can be customized by the user depending on his or her preference.
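
A sketch of such variable incrementing, using hypothetical arc-length and speed thresholds (in degrees and degrees per second) and applying the rounding-off effect to steps larger than one, might read:

    import math

    def next_value(current, arc_deg, speed_dps):
        """Advance the displayed number from one rotational movement."""
        if arc_deg < 15 and speed_dps < 90:   # small, slow arc: step by 1
            step = 1
        elif arc_deg < 60:                    # medium arc: step by 10
            step = 10
        else:                                 # long and/or fast arc: by 100
            step = 100
        value = current + step
        if step > 1:                          # "rounding off" effect
            value = math.ceil(value / step) * step
        return value

    # Starting at 27, a medium arc yields 40 rather than 37, and the next
    # one yields 50, matching the example above.
    print(next_value(27, 30, 50))   # 40
    print(next_value(40, 30, 50))   # 50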


In one embodiment, the input area 501 can also represent a range. Hence, positional movement may be used to select a number within the range 504. By way of example, touching or tapping an area or region 508 can effectively select the halfway point within the range 504. Subsequently, directional movement 502 can effectively increment or decrement by ones, tens, hundreds, thousands, and so on. Rotational input covering or extending over a boundary location 510 can effectively select the last number in the range (e.g., bet all the money available). Also, rotational movement may in effect start an incrementing or decrementing process that can continue as long as desired or until the end of the range 504 is reached. This process may proceed at a constant rate or accelerate as time goes by. For example, a right (or clockwise) rotational movement can increment by one and then effectively accelerate to increment by tens, hundreds or more. This incrementing can continue as long as a finger or thumb maintains contact with the input area or until directional movement in the opposite direction is received, thereby allowing a person to select “1500,” “25,000,” or “5,000,000” effectively by entering one or more rotational movements.


It should be noted that a “filtering” mechanism can be used to effectively ignore input (e.g., a small amount of rotational movement). By way of example, a relatively small amount of movement associated with the initiation or termination of rotational movement can be ignored. Typically, this type of movement can be expected and accounted for when input is provided by a human being. As such, the filtering mechanism can effectively ignore movement that can be considered to be unintended and/or a byproduct of the intended rotational movement.
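
Such a filtering mechanism can be sketched as a dead band applied at the start of a gesture; the 3-degree band below is a hypothetical value:

    def filter_rotation(angles, deadband_deg=3.0):
        """Suppress the small, likely unintended movement at the start of
        a rotational gesture; termination jitter could be handled
        symmetrically by applying the same band to the gesture's end."""
        if not angles:
            return []
        reported, origin, engaged = [], angles[0], False
        for a in angles:
            if not engaged and abs(a - origin) < deadband_deg:
                continue            # still within the dead band: ignore
            engaged = True
            reported.append(a)
        return reported

    # A 2-degree wobble before the real sweep is ignored:
    print(filter_rotation([0.0, 1.5, 2.0, 10.0, 25.0]))   # [10.0, 25.0]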


As noted above, an input area allows input to be provided for multiple applications in accordance with one aspect of the invention. This allows for integration of various applications. One embodiment of the invention effectively integrates a media player with gaming applications. FIG. 6 depicts a media player 600 in accordance with one embodiment of the invention. The media player 600 can, for example, be provided as an Apple iPod® media player (available from Apple Computer, Inc.) for playing music and/or viewing media (e.g., movies). An input device 602 effectively provides a circular input area (surface or plane) extending over various designated locations 604, 606, 608 and 610 which can be implemented as selectable areas. As such, these selectable areas can be used to control the media player (e.g., pause, play, forward and backward functions for a media player) and/or media related functions, such as, for example, browsing menus or directories to select or download media files. In addition, the media player 600 also provides the ability to play music-based games. These music-based games can, for example, be based on media content available to and/or stored by the media player 600. Hence, games can be tailored or individualized for different individuals based on digital media selected by users and/or generally familiar to them. By way of example, music files (digital audio files) stored on the media player 600 for a particular user can be used to play a music trivia game where a song is played and the person playing the game is prompted to identify the song and/or answer a question about the song (e.g., what year it was released).


Referring to FIG. 6, information about songs 1 and 2 is displayed while one of the songs is played. The user (or person) playing the game can then select one of the songs as the correct song by entering a rotational movement 612 in the direction of one of the displayed songs (song 1 or song 2). A timer 614 can display the time available for making the selection. A selection can be made by providing rotational and/or positional input. By way of example, a right (or clockwise) directional movement may effectively reach far enough to effectively select the second song (song 2). As another example, a relatively short directional movement to the right can effectively start the selection of song 2, as the right arrow 616 appears to become continuously filled to demonstrate the process of selecting the second song (song 2). However, directional movement 612 to the left (or counter-clockwise) can reverse the process and effectively result in filling the left arrow 618 in order to eventually cause the selection of the first song (song 1). It will be appreciated that a relatively quick and/or long directional movement 612 to the left can abruptly reverse the process of selecting the second song (song 2) and/or in effect immediately select the first song (song 1). It should be noted that while the game is being played the person playing the game can still use the selectable areas 604, 606, 608 and 610 to control the media player. In other words, the person can play the music-based game by interacting via rotational and/or positional input and also control the music being played using a familiar interface. Furthermore, the direction, extent, and/or manner of entering rotational input can be effectively used to allow games to be played in a simple and more intuitive manner.
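
As a sketch of this selection mechanism, rotational movement can be accumulated into a signed “fill” level for the two arrows; the 90-degree full-arc constant, the timeout, and the return values below are hypothetical:

    def update_selection(fill, delta_deg, elapsed_s, timeout_s=10.0):
        """Accumulate rotational movement 612 into the arrow fill used to
        select song 1 (full left) or song 2 (full right)."""
        FULL = 90.0                       # arc needed to complete a pick
        fill = max(-FULL, min(FULL, fill + delta_deg))
        if elapsed_s >= timeout_s:
            return fill, "timeout"        # timer 614 expired
        if fill >= FULL:
            return fill, "song 2"         # right arrow 616 fully filled
        if fill <= -FULL:
            return fill, "song 1"         # left arrow 618 fully filled
        return fill, None                 # selection still in progress

    # A quick, long counter-clockwise sweep abruptly reverses a nearly
    # complete right-hand selection, as described above:
    print(update_selection(80.0, -170.0, 3.0))   # (-90.0, 'song 1')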


The following applications are hereby incorporated herein by reference in their entirety for all purposes: (i) U.S. patent application Ser. No. 11/144,541, filed Jun. 3, 2005, and entitled “TECHNIQUES FOR PRESENTING SOUND EFFECTS ON A PORTABLE MEDIA PLAYER,” (ii) U.S. patent application Ser. No. 11/530,846, filed Sep. 11, 2006, and entitled “ALLOWING MEDIA AND GAMING ENVIRONMENTS TO EFFECTIVELY INTERACT AND/OR AFFECT EACH OTHER,” (iii) U.S. patent application Ser. No. 11/530,767, filed Sep. 11, 2006, and entitled “INTEGRATION OF VISUAL CONTENT RELATED TO MEDIA PLAYBACK INTO NON-MEDIA-PLAYBACK PROCESSING,” (iv) U.S. patent application Ser. No. 11/530,768, filed Sep. 11, 2006, and entitled “INTELLIGENT AUDIO MIXING AMONG MEDIA PLAYBACK AND AT LEAST ONE OTHER NON-PLAYBACK APPLICATION,” and (v) U.S. patent application Ser. No. 11/530,773, filed Sep. 11, 2006, and entitled “PORTABLE MEDIA PLAYBACK DEVICE INCLUDING USER INTERFACE EVENT PASSTHROUGH TO NON-MEDIA-PLAYBACK PROCESSING”.


The various aspects, features, embodiments or implementations of the invention described above can be used alone or in various combinations.


The many features and advantages of the present invention are apparent from the written description and, thus, it is intended by the appended claims to cover all such features and advantages of the invention. Further, since numerous modifications and changes will readily occur to those skilled in the art, the invention should not be limited to the exact construction and operation as illustrated and described. Hence, all suitable modifications and equivalents may be resorted to as falling within the scope of the invention.

Claims
  • 1. A method for providing input to multiple executable application programs configured to run at the same time on a mobile device, comprising: providing an input area, the input area at least partially spanning a touch surface or button associated with the input device; receiving at at least one first location of the input area an input provided in a particular manner, the manner comprising at least one of a positional input, a directional input, a rubbing input, a pressing input and a pushing input; determining whether the at least one first location and manner of providing the input is a location and manner designated for providing input to a first executable application program running on the mobile device or a second executable application program running on the mobile device, the manner designated for providing input to the first executable application program being different from the manner designated for providing input to the second executable application program; then providing the input to the first executable application program running on the mobile device if determined that the at least one first location and manner of providing the input is a location and manner designated for providing input to the first executable application program; and thereafter providing the input to a second executable application program running on the mobile device if the input is not provided to the first executable application program, wherein the at least one first location of the input area can be used to provide input to multiple applications running at the same time on the mobile device.
  • 2. The method of claim 1, wherein the step of providing the input to the second executable application program comprises providing the input to the second executable application program if the manner of providing the input is the manner designated for providing input to the second executable application program.
  • 3. The method of claim 1, wherein the step of providing the input to the second executable application program comprises providing the input to the second executable application program if the manner of providing input is a manner not designated for providing input to the first application or the input is associated with a location of the input area not designated to receive input for the first executable application program.
  • 4. The method of claim 1, wherein the step of providing the input to the second executable application program comprises providing the input to the second executable application program if the input is associated with a location of the input area designated to receive input for the second executable application program and the manner of providing input is the manner designated for providing input to the second application program.
  • 5. The method of claim 3, wherein the input is associated with multiple locations of the input area.
  • 6. The method of claim 3, wherein a location of the input area is designated to receive input for both the first executable application program and the second executable application program.
  • 7. The method of claim 1, wherein the positional input comprises one of touching and tapping and the directional input comprises rotational movement.
  • 8. The method of claim 1, wherein the second executable application program is a game or gaming application and is executed in connection with a scene displayed on a display, and wherein the input area resembles or approximates the scene.
  • 9. A computing device configured to provide input to multiple executable application programs configured to run at the same time on a mobile device using an input area, the input area at least partially spanning a touch surface or button associated with the computing device, the computing device being further configured to: receive at at least one first location of the input area an input provided in a particular manner, the manner comprising at least one of a positional input, a directional input, a rubbing input, a pressing input and a pushing input; determine whether the at least one first location and manner of providing the input is a location and manner designated for providing input to a first executable application program running on the mobile device or a second executable application program running on the mobile device, the manner designated for providing input to the first executable application program being different from the manner designated for providing input to the second executable application program; then provide the input to the first executable application program running on the mobile device if determined that the at least one first location and manner of providing the input is a location and manner designated for providing input to the first executable application program; and thereafter provide the input to a second executable application program running on the mobile device if the input is not provided to the first executable application program, wherein at least one first location of the input area can be used to provide input to multiple applications running at the same time on the mobile device.
  • 10. The computing device of claim 9, wherein the input area approximates or resembles a scene associated with the second executable application program.
  • 11. The computing device of claim 10, wherein the second executable application program is a game or gaming application.
  • 12. The computing device of claim 9, wherein: the touch surface or button is shaped as a circle or approximates a circle and is configured to receive rotational or circular movement as input, and the rotational or circular movement includes left and right rotational or circular movement.
  • 13. The computing device of claim 9, wherein the step of providing the input to the second executable application program comprises providing the input to the second executable application program if the manner of providing the input is the manner designated for providing input to the second executable application program.
  • 14. The computing device of claim 9, wherein the step of providing the input to the first executable application program comprises providing the input to the first executable application program if the input is associated with a location of the input area designated to receive input for the first executable application program, and the step of providing the input to the second executable application program comprises providing the input to the second executable application program if the manner of providing input is a manner not designated for providing input to the first application or the input is associated with a location of the input area not designated to receive input for the first executable application program.
  • 15. The computing device of claim 9, wherein the step of providing the input to the second executable application program further comprises providing the input to the second executable application program if the input is associated with a location of the input area designated to receive input for the second executable application program and the manner of providing input is the manner designated for providing input to the second executable application program.
  • 16. The computing device of claim 14, wherein the input is associated with multiple locations of the input area.
  • 17. The computing device of claim 14, wherein a location of the input area is designated to receive input for both the first executable application program and the second executable application program.
  • 18. The computing device of claim 9, wherein the positional input comprises one of touching and tapping and the directional input comprises rotational movement.
  • 19. A method comprising: providing an input device having an input area configured to enter input for at least a first active application and a second active application, the input area being configured to receive input associated with the first and second applications at substantially the same time and control the first application while the second application is in progress and control the second application while the first application is in progress; entering input in a particular manner and in connection with at least one first location of the input area; determining, based on the manner of entering the input and the at least one first location of the input area effectively identified by the input, which one of the first and second applications is to receive the input; and providing input to the appropriate application.
  • 20. The method of claim 19, wherein a location of the input area is designated to receive input for both the first active application and the second active application.
  • 21. The method of claim 19, wherein the manner comprises at least one of a positional input, a directional input, a rubbing input, a pressing input and a pushing input.
  • 22. The method of claim 21, wherein the positional input comprises one of touching and tapping and the directional input comprises rotational movement.
  • 23. The method of claim 19, wherein the second active application is a game or gaming application and is controlled in connection with a scene displayed on a display, and wherein the input area resembles or approximates the scene.
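The dispatch rule recited in claims 1, 9 and 19 can be summarized as follows: an input event carries a location on the input area and a manner of entry (e.g., tap, rotation, rub, press, push); the event is provided to the first application only when both its location and its manner are those designated for that application, and otherwise it falls through to the second application. The Python sketch below is one hedged reading of that rule; the event model, region names, and the AppDesignation structure are hypothetical illustrations, not part of the claims.

```python
# Hypothetical sketch of the location-and-manner dispatch rule of claims
# 1, 9 and 19. Region and manner names are illustrative only.

from dataclasses import dataclass

@dataclass
class InputEvent:
    location: str   # e.g., region of the input area: "top", "bottom", "center"
    manner: str     # e.g., "tap", "rotate", "rub", "press", "push"

@dataclass
class AppDesignation:
    name: str
    locations: frozenset   # locations designated to receive this app's input
    manners: frozenset     # manners designated for this app

def dispatch(event, first, second):
    # The first application receives the input only when both the location
    # and the manner are the ones designated for it...
    if event.location in first.locations and event.manner in first.manners:
        return first.name
    # ...otherwise the input is thereafter provided to the second application.
    return second.name

# Example: a tap on a media-player control area goes to the player, while a
# rotational movement at the same location (a shared location, per claim 6)
# goes to the game.
player = AppDesignation("media_player", frozenset({"top", "bottom"}),
                        frozenset({"tap"}))
game = AppDesignation("game", frozenset({"top", "bottom", "center"}),
                      frozenset({"rotate", "rub", "press"}))
print(dispatch(InputEvent("top", "tap"), player, game))     # -> "media_player"
print(dispatch(InputEvent("top", "rotate"), player, game))  # -> "game"
```

Note that the fall-through branch mirrors the claims' "thereafter providing" step: any input not consumed by the first application is offered to the second, which lets a single location serve both applications so long as their designated manners differ.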
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of priority from U.S. Provisional Patent Application No. 60/810,423, filed Jun. 2, 2006, and entitled “TECHNIQUES FOR INTERACTIVE INPUT TO PORTABLE ELECTRONIC DEVICES,” which is hereby incorporated herein by reference.

Provisional Applications (1)
Number Date Country
60810423 Jun 2006 US