Embodiments of the invention relate generally to the field of computerized sensations. More particularly, embodiments of the invention relate to an apparatus, system, and method for providing real-time sensations of temperature and texture to a handheld controller.
As audio-visual devices such as gaming platforms, smart phones, tablets, and televisions provide a higher level of interactive experience to their users, there is demand for providing richer real-time sensations to the users of such devices.
The term “interactive experience” herein refers to an experience in which a user interacts with a program (software, television broadcast, etc.) executing on an audio-visual device (e.g., a computer or television screen), provides real-time information to the program, and in response receives information back from the executing program.
An example of known real-time sensations is the vibration of a gaming controller. Vibrations may be generated when, for example, the user of the gaming controller encounters an undesired event associated with an audio-visual game while playing the game—e.g., a car driven by the user slides off a road, causing a vibration of the controller held by the user. However, such real-time sensations provided to a user are not rich enough (i.e., they lack triggering multiple human sensations) to immerse the user in the interactive experience.
Embodiments of the invention will be understood more fully from the detailed description given below and from the accompanying drawings of various embodiments of the invention, which, however, should not be taken to limit the invention to the specific embodiments, but are for explanation and understanding only.
Embodiments of the invention relate generally to the field of computerized sensations. More particularly, embodiments of the invention relate to an apparatus, system, and method for providing real-time sensations of temperature and texture to a user of a controller.
Described herein is an embodiment of a hand-held controller comprising: a first region to be touched by a user and to provide a real-time computer programmable texture sensation to the user in response to a first trigger signal generated by an interactive program; and a first mechanism, coupled to the first region, to cause the first region to roughen relative to a first state, and to cause the first region to smooth relative to a second state.
Described herein is an embodiment of a system comprising: a processor; an interactive application executing on the processor, the interactive application operable to generate a first trigger signal representing a context of the executing interactive program; and a hand-held controller comprising: a first region to be touched by a user and to provide a real-time computer programmable texture sensation to a user in response to the first trigger signal generated by the interactive program; and a first mechanism, coupled to the first region, to cause the first region to roughen relative to a first state, and to cause the first region to smooth relative to a second state.
Described herein is an embodiment of a method comprising: executing an interactive program on a processor; selecting levels of a computer programmable texture sensation via a user interface (UI) associated with the executing interactive program; positioning a controller to a context of the interactive program; receiving, by the controller, a first trigger signal in response to the positioning; and in response to receiving the first trigger signal, performing one of: roughening a first region of the controller relative to a first state; and smoothing the first region of the controller relative to a second state.
Embodiments of the invention relate generally to the field of computerized sensations. More particularly, embodiments of the invention relate to an apparatus, system, and method for providing real-time sensations of temperature and texture to a user of a controller. The term “temperature sensation” herein is interchangeably referred to as “thermal sensation.” The term “handheld controller” is also interchangeably referred to as a “controller.”
In one embodiment, an interactive program (i.e., software) is executed on a processor and displayed on an audio-visual device. In one embodiment, the interactive program is configured to generate a trigger signal when a user holding the controller (also referred to as the hand-held controller) points to a context displayed on the audio-visual device. In one embodiment, the trigger signal is received by the controller held by the user. In one embodiment, the trigger signal causes the controller to generate one or both sensations of temperature and texture to the user by means of regions on the controller in contact with the user. In one embodiment, the user can adjust the levels of sensations for temperature and/or texture via a user interface associated with the interactive program.
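The trigger-signal flow described above can be sketched as follows. This is a minimal illustrative model, not actual controller firmware; the class and field names (`TriggerSignal`, `Controller.apply_trigger`) are assumptions made for the sketch.

```python
from dataclasses import dataclass

@dataclass
class TriggerSignal:
    # Hypothetical fields: which sensation to drive and at what level.
    sensation: str   # "texture" or "temperature"
    level: int       # level adjustable via the program's user interface

class Controller:
    """Illustrative model of a controller that renders sensations."""
    def __init__(self):
        self.texture_level = 0
        self.temperature_level = 0

    def apply_trigger(self, signal: TriggerSignal) -> None:
        # The interactive program generates the trigger; the controller
        # reacts in real time by updating the region touched by the user.
        if signal.sensation == "texture":
            self.texture_level = signal.level
        elif signal.sensation == "temperature":
            self.temperature_level = signal.level

controller = Controller()
controller.apply_trigger(TriggerSignal("texture", 7))
controller.apply_trigger(TriggerSignal("temperature", 3))
```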
As used herein, unless otherwise specified, the use of the ordinal adjectives “first,” “second,” “third,” etc., to describe a common object merely indicates that different instances of like objects are being referred to, and is not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
In one embodiment, the program is configured to generate a first trigger signal when a user holding the controller points to a first context displayed on the audio-visual device. In one embodiment, the controller comprises a first region configured to be touched by the user to provide real-time computer programmable texture sensations to the user in response to receiving the first trigger signal associated with the first context. In one embodiment, the controller comprises a first mechanism, coupled to the first region, to cause the first region to roughen relative to a first state, and to cause the first region to smooth relative to a second state, wherein the first and second states represent levels of texture of the first region.
For example, in one embodiment a user holding the controller controls a character of an interactive game (also referred to as an interactive program) executing on a processor and displayed by the audio-visual device. When the user points the controller, which in one embodiment is being tracked by a motion detector, towards a first context of the game which represents a rough surface (e.g., the character walking on an unpaved surface), the first trigger signal is generated by the executing gaming program and transmitted to the controller held by the user. The controller then causes the first region of the controller in contact with the user's hand to roughen to provide a sensation of roughness to the user.
Referring to the same example, in one embodiment when the character of the user moves to a second context representing a smooth surface (e.g., the character walking on a polished concrete surface), the first trigger signal is generated again by the executing gaming program and transmitted to the controller held by the user. The controller then causes the first region of the controller in contact with the user's hand to smooth, providing a smooth sensation to the user.
In one embodiment, the controller comprises a second region configured to be touched by the user and to provide real-time computer programmable temperature (thermal) sensations to the user in response to a second trigger signal generated by the interactive program. In one embodiment, the controller comprises a second mechanism, coupled to the second region, to cause the second region to heat up relative to a third state and to cause the second region to cool down relative to a fourth state, wherein the first and the second regions reside on an outer surface of the controller, and wherein the third and fourth states represent thermal sensation levels provided by the second region.
For example, in one embodiment when the user points the controller towards a third context of the game which represents a hot surface or surrounding environment (e.g., the character is walking on an unpaved surface on a hot summer day), the second trigger signal is generated by the executing gaming program and transmitted to the controller held by the user. The controller then causes the second region of the controller in contact with the user's hand to heat up to provide a sensation of high temperature (hot environment) to the user. In this embodiment, the controller provides both sensations of roughness and high temperature, representing a hot unpaved surface, in response to the controller receiving the first and second trigger signals.
Referring to the same example, in one embodiment when the character of the user moves to a fourth context representing a smooth surface (e.g., the character walking on a polished marble surface at night), the second trigger signal is generated again by the executing gaming program and transmitted to the controller held by the user. The controller then causes the second region of the controller in contact with the user's hand to cool down to provide a sensation of coolness to the user. In this embodiment, the controller provides both sensations of smoothness and cool temperature, representing cool marble at night, in response to the controller receiving the first and second trigger signals.
The term “real-time” herein refers to providing sensations of temperature and/or texture to a user holding the hand-held controller such that the user perceives the sensations (within a few milliseconds) when the first and/or second trigger signals are generated by the interactive program and received by the hand-held controller.
In the following description, numerous details are discussed to provide a more thorough explanation of embodiments of the present invention. It will be apparent, however, to one skilled in the art, that embodiments of the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring embodiments of the present invention.
Note that in the corresponding drawings of the embodiments signals are represented with lines. Some lines may be thicker, to indicate more constituent signal paths, and/or have arrows at one or more ends, to indicate primary information flow direction. Such indications are not intended to be limiting. Rather, the lines are used in connection with one or more exemplary embodiments to facilitate easier understanding of a circuit or a logical unit. Any represented signal, as dictated by design needs or preferences, may actually comprise one or more signals that may travel in either direction and may be implemented with any suitable type of signal scheme, e.g., differential pair, single-ended, etc.
Examples of gaming consoles include those manufactured by Sony Computer Entertainment, Inc. and other manufacturers. In one embodiment, the audio-visual device 101 is a television, a monitor, a projector display, or other such displays and display systems which are capable of receiving and rendering video output from the computer system 102. In one embodiment, the audio-visual device 101 is a flat panel display which displays various contexts to a user. These contexts provide feedback to the controller 103 to generate real-time temperature and texture sensations to the user.
In one embodiment, a user 104 provides input to the interactive program by operating the controller 103. The term “operating” herein refers to moving the controller, pressing buttons on the controller, etc. In one embodiment, the controller 103 communicates wirelessly 106 with the computer system 102 for greater freedom of movement of the controller 103 than a wired connection. In one embodiment, the controller 103 includes any of various features for providing input to the interactive program, such as buttons, a joystick, directional pad, trigger, touchpad, touch screen, or other types of input mechanisms. One example of a controller is the Sony Dualshock 3® controller manufactured by Sony Computer Entertainment, Inc.
In one embodiment, the controller 103 is a motion controller that enables the user 104 to interface with and provide input to the interactive program by moving the controller 103. One example of a motion controller is the Playstation Move® controller, manufactured by Sony Computer Entertainment, Inc. Various technologies may be employed to detect the position and movement of a motion controller. For example, a motion controller may include various types of motion detection hardware, such as accelerometers, gyroscopes, and magnetometers. In some embodiments, a motion controller can include one or more cameras to capture images of a fixed reference object. The position and movement of the motion controller can then be determined through analysis of the images captured by the one or more cameras. In some embodiments, a motion controller may include an illuminated element which is tracked via a camera having a fixed position. In one embodiment, the tracked motion 107 of the controller 103 causes the generation of the first and second trigger signals from an interactive program that further cause generation of texture and temperature sensations, respectively, to the user 104 of the controller 103.
While the embodiments of the invention describe two trigger signals to provide two different sensations on the controller, the two different sensations may also be generated by a single trigger signal that informs the controller of what type of sensation to generate. In one embodiment, the controller receives the single trigger signal and determines which mechanism (first or second) is to generate the corresponding sensation.
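A single trigger signal carrying a sensation type could be dispatched inside the controller roughly as below. This is a sketch under an assumed encoding; the `kind` field and mechanism names are illustrative, since the actual signal format is implementation-specific.

```python
def dispatch(trigger: dict) -> str:
    # Hypothetical single-signal encoding: a "kind" field selects which
    # mechanism (first = texture, second = thermal) should respond.
    kind = trigger["kind"]
    if kind in ("roughen", "smooth"):
        return "first_mechanism"   # texture (roughen/smooth) mechanism
    if kind in ("heat", "cool"):
        return "second_mechanism"  # thermal (heat/cool) mechanism
    raise ValueError(f"unknown trigger kind: {kind}")

print(dispatch({"kind": "roughen"}))  # first_mechanism
```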
In one embodiment, the user 104 positions the controller 103 towards the character 111 of the executing program. As the character 111 moves away from a shaded tree 114 along the rough unpaved path 112 towards the hill 113 under the sun, the user 104 holding the controller 103 will experience several different sensations. In this example, the character 111 near the tree 114 experiences shade which results in cool temperature around the character 111.
When the character 111 is positioned near the tree 114, that represents a cool shaded area, the interactive program generates the second trigger signal. In one embodiment, the second trigger signal causes a second mechanism of the controller 103 to cool down a region of the controller 103 held by the user 104. In one embodiment, when the character 111 walks on the rough unpaved path 112 near the tree 114, the interactive program generates the first trigger signal. In one embodiment, in response to the first trigger signal, a first mechanism of the controller 103 causes a region of the controller 103 held by the user 104 to provide sensations of roughness.
When the character 111 walks on the rough unpaved path 112 towards the hill 113, the temperature of the area surrounding the character 111 rises because of the exposure of the surrounding area to the sun. In this example, the surface on which the character 111 walks is a smooth marble path to the hill 113. In one embodiment, when the character 111 walks away from the rough unpaved path 112 near the tree 114 towards the hill 113 via the smooth marble path, first and second trigger signals are generated by the interactive program. In one embodiment, in response to the first and second trigger signals, the first and second mechanisms of the controller 103 cause corresponding regions of the controller 103 held by the user 104 to provide sensations of smoothness (smooth marble surface leading to the hill 113) and high temperature because of the heat generated by the sun. The components comprising the first and second mechanisms of the controller are discussed with reference to several embodiments below.
In one embodiment, the controller 200 also includes an attachment 202 above the main body 201 of the controller 200. In one embodiment, the attachment 202 is illuminated with various colors in response to trigger signals generated by an interactive program. The controller 200 includes a handle portion for a user to grip, in which various regions 204 and 205 are defined that may be roughened/smoothed and heated/cooled, respectively. In the embodiments discussed herein, the region 204 is referred to as the first region 204, while the region 205 is referred to as the second region 205. In one embodiment, the first region 204 and the second region 205 are adjacent regions. In one embodiment, the first region 204 and the second region 205 form an outer surface which is configured to be held by a user.
In one embodiment, the controller 200 comprises a first mechanism 208 and a second mechanism 209. In one embodiment, the first mechanism 208 is coupled to the first region 204. In one embodiment, the first mechanism 208 is configured to cause the first region 204 to roughen or smooth relative to first and second states.
In one embodiment, the first state is defined as a number on a continuum of 10 to 1, where the number ‘10’ represents the roughest sensation while the number ‘1’ on the continuum represents the smoothest sensation. In one embodiment, the first state corresponds to a sandpaper grit size, which refers to the size of the particles of abrading material embedded in the sandpaper. A person skilled in the art would know that there are two common standards for measuring roughness of a surface: the United States Coated Abrasive Manufacturers Institute (CAMI) standard, now part of the Unified Abrasives Manufacturers' Association, and the Federation of European Producers of Abrasives (FEPA) ‘P’ grade. The FEPA system is the same as the ISO 6344 standard. In one embodiment, the first state is defined by the Japanese Industrial Standards Committee (JIS).
The embodiments discussed herein refer to the texture sensations in view of ‘P’ grade of the FEPA standard. A person skilled in the art may use any standard of measurement without changing the essence of the embodiments of the invention.
In one embodiment, the first state is in the range of P12-P36 FEPA. In one embodiment, the second state is in the range of P120-P250 FEPA. In one embodiment, both the first and second states are predetermined states, i.e., the states have a default value. In one embodiment, both the first and second states are the same. In one embodiment, both the first and second states are P60 FEPA. The higher the ‘P’ grade, the smoother the texture sensation.
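One way to relate the 10-to-1 roughness continuum to FEPA ‘P’ grades is a simple lookup. The specific grade assignments below are illustrative, chosen only to span the example ranges mentioned above (P12-P36 at the rough end, P120-P250 at the smooth end, P60 in between); they are not taken from any standard mapping.

```python
# Illustrative mapping from the UI continuum (10 = roughest, 1 = smoothest)
# to FEPA 'P' grit grades. A higher 'P' grade means a smoother sensation.
CONTINUUM_TO_FEPA = {
    10: 12, 9: 24, 8: 36, 7: 60, 6: 80,
    5: 100, 4: 120, 3: 150, 2: 200, 1: 250,
}

def fepa_grade(continuum_level: int) -> int:
    """Return the (assumed) FEPA 'P' grade for a UI continuum level."""
    return CONTINUUM_TO_FEPA[continuum_level]

print(fepa_grade(10))  # 12 -> roughest end of the continuum
print(fepa_grade(1))   # 250 -> smoothest end of the continuum
```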
In one embodiment, the second mechanism 209 is operable to cause the second region 205 to heat up or cool down relative to third and fourth states. In one embodiment, the third state is in the range of 100-120 degrees Fahrenheit. In one embodiment, the fourth state is in the range of 40-50 degrees Fahrenheit. In one embodiment, the third and fourth states are predetermined states, i.e., the states have a default value. In one embodiment, both the third and fourth states are of the same value. In one embodiment, both the third and fourth states are 100 degrees Fahrenheit. In one embodiment, the first, second, third, and fourth states are programmable.
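Since the thermal states are programmable, a controller could clamp any requested temperature to the state ranges. The sketch below uses the example limits from the text (100-120 °F for heating, 40-50 °F for cooling); the function itself is a hypothetical illustration, not part of any described firmware.

```python
def clamp_temperature(requested_f: float, heating: bool) -> float:
    # Keep commanded temperatures inside the example state ranges:
    # third state (heat): 100-120 F; fourth state (cool): 40-50 F.
    lo, hi = (100.0, 120.0) if heating else (40.0, 50.0)
    return max(lo, min(hi, requested_f))

print(clamp_temperature(150.0, heating=True))   # 120.0 (capped at hot limit)
print(clamp_temperature(33.0, heating=False))   # 40.0 (raised to cool limit)
```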
In one embodiment, the first region 204 comprises a fabric which is operable to be stretched or wrinkled by the first mechanism 208. In one embodiment, the first mechanism 208 comprises a push-pull mechanism which is operable to pull the fabric 204 taut along its length to cause the fabric 204 to smooth relative to the second state, and to relax the fabric 204 to cause it to roughen relative to the first state. In one embodiment, the first mechanism 208 further comprises an electric motor which is operable to cause the push-pull mechanism to pull or relax the fabric 204.
In one embodiment, the first mechanism 208 comprises a set of prongs and a push-pull mechanism which is operable to push the set of prongs towards the first region to cause a sensation of roughness on the fabric 204. In one embodiment, the push-pull mechanism is operable to pull the set of prongs away from the first region to cause a sensation of smoothness on the fabric 204.
In one embodiment, the second region 205 comprises a metalized fabric that is configured to be heated or cooled down nearly instantaneously. In one embodiment, the second region 205 comprises any fabric which is capable of transmitting heat or cold to a user holding the fabric. In one embodiment, the second region 205 is divided into two or more regions 206 and 210. In one embodiment, the region 206 of the second region 205 provides a sensation of heat to the user. In one embodiment, the region 210 of the second region 205 provides a sensation of coolness to the user.
While the embodiment of
In one embodiment, the buttons 207 and the trigger 203 comprise first and second regions to provide both sensations of texture and temperature to the buttons 207 and the trigger 203 respectively. In one embodiment, the first and second mechanisms are insulated from the upper half of the controller 200 to protect any circuitry in the upper half of the controller 200 from noise generated by first and second mechanisms 208 and 209.
In one embodiment, the fabric comprises a Miura-Ori fabric 310 of
Referring back to
Referring back to
In one embodiment, the electric motor 302 is held stable relative to the fabric region 204/301 by means of a chassis 305. In one embodiment, foam 306 or any comfortable material is placed between the chassis 305 and the first region (fabric) 204/301. One purpose of the foam 306 is to provide a comfortable grip (comprising regions 204/301 and 205 of the controller 200) to a user, and also to provide support to the first region (fabric) 204/301. In one embodiment, the surface of the foam 306 that contacts the fabric 204/301 is smooth enough to allow the fabric 204/301 to be pulled or relaxed without the pull or push forces causing any tension on the foam 306.
In one embodiment, the push-pull mechanism 304 comprises a clamp 307 which is operable to pull or relax the fabric 204/301 upon instructions from the logic unit 303 and the electric motor 302. In one embodiment, the electric motor 302 is configured to cause the clamp 307 to pull out the fabric 204 (e.g., pulling in the Miura-Ori fabric 310 of
In one embodiment, the push-pull mechanism 304 comprises magnets that cause the fabric 204/301 to be pulled in or pulled out when electric current flows through the magnets. In one embodiment, the logic unit 303 is operable to receive the first trigger signal from the interactive program and to determine when to cause the push-pull mechanism 304 to pull in or pull out the fabric 204/301 in response to the first trigger signal. In one embodiment, the logic unit 303 is programmable to adjust/change the response time of the push-pull mechanism 304.
The term “response time” herein refers to the time it takes the first and/or second mechanisms 208 and 209 to provide sensations of texture and/or temperature to the first and second regions 204 and 205.
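The trigger-to-actuation path with a programmable response time could be modeled as below. The class `PushPullLogic`, its method names, and the default delay value are illustrative assumptions for the sketch, not actual firmware behavior.

```python
import time

class PushPullLogic:
    """Illustrative logic unit driving a push-pull fabric mechanism."""

    def __init__(self, response_time_s: float = 0.005):
        # Programmable response time: delay before the mechanism actuates.
        self.response_time_s = response_time_s
        self.state = "relaxed"   # relaxed fabric provides the rough sensation

    def on_trigger(self, action: str) -> str:
        # "smooth" pulls the fabric taut; "roughen" relaxes it.
        time.sleep(self.response_time_s)
        self.state = "pulled" if action == "smooth" else "relaxed"
        return self.state

unit = PushPullLogic(response_time_s=0.001)
print(unit.on_trigger("smooth"))   # pulled
print(unit.on_trigger("roughen"))  # relaxed
```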
In one embodiment, the plate 407 comprises multiple plates (not shown) each of which is operable by the push-pull logic unit 402 independently. In such an embodiment, the push-pull logic unit 402 is configured to push out (411) or pull in (412) each of the multiple plates to cause some areas of the fabric 401 to smooth relative to other areas of the fabric 401. In one embodiment, the prongs 405 are of different shapes and sizes to cause different sensations of roughness when the prongs 405 are pushed out (411) relative to the fabric 401.
In one embodiment, the push-pull logic unit 402 is held stable relative to the fabric region 204/401 by means of the chassis 305. In one embodiment, foam 406 or any comfortable material is placed between the chassis 305 and the first region (fabric) 204/401. One purpose of the foam 406 is to provide a comfortable grip (comprising regions 204/401 and 205 of the controller 200) to a user, and also to provide support to the first region (fabric) 204/401.
In one embodiment, the logic unit 403 is operable to receive the first trigger signal from the interactive program and to determine when to cause the push-pull logic unit 402 to push out or pull in the prongs 405 in response to the first trigger signal. In one embodiment, the logic unit 403 is programmable to adjust/change the response time of the push-pull logic unit 402.
In one embodiment, the thermoelectric device comprises Peltier cells which are operable to be cooled or heated in response to a potential voltage across the Peltier cells. In one embodiment, the potential voltage across the Peltier cells is generated by the heating and cooling sources 612. In one embodiment, a Peltier cell is configured to evolve heat on one side of the cell and to withdraw heat from the opposite side of the cell to cause the opposite side of the cell to cool down. In such an embodiment, the same Peltier cell can be used both for heating and for cooling the second region 205/611. Another advantage of Peltier cells is that they comprise no moving parts and are thus resilient and durable for handling purposes.
In one embodiment, the heating and cooling sources 612 are configured to provide enough potential voltage to the Peltier cells to cause the Peltier cells to heat up within a range of 110 degrees Fahrenheit to 125 degrees Fahrenheit, and to cool down the Peltier cells within a range of 50 degrees Fahrenheit to 40 degrees Fahrenheit. In one embodiment, the voltage potential generated by the heating and cooling sources 612 is adjustable by User Interface (UI) of the interactive program.
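Because reversing the current through a Peltier cell swaps which face heats and which cools, the heating and cooling sources can share a single signed drive. The sketch below models that with a toy proportional controller; the gain, supply limit, and function name are illustrative assumptions, not a real device model.

```python
def peltier_command(target_f: float, current_f: float) -> float:
    """Return an illustrative signed drive voltage for a Peltier cell.

    Positive voltage heats the touched face, negative voltage cools it;
    the magnitude is a toy proportional term, not a real device model.
    """
    gain_v_per_degf = 0.1
    error = target_f - current_f
    # Clamp to a hypothetical +/-5 V supply limit.
    return max(-5.0, min(5.0, gain_v_per_degf * error))

print(peltier_command(target_f=115.0, current_f=75.0))  # 4.0 (heating)
print(peltier_command(target_f=45.0, current_f=75.0))   # -3.0 (cooling)
```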
In one embodiment, the thermoelectric device (Peltier cells) in the region 614 is insulated by shielding regions 615 and 616. In one embodiment, the shielding region 616 is foam. In one embodiment, the shielding region 615 is made of thick plastic that can withstand temperatures up to 130 degrees Fahrenheit for a continuous period of 5 minutes without deforming. In one embodiment, the logic unit (also referred to as a thermal controller) 613 is operable to determine when to activate the heating and cooling sources 612 in response to a second trigger signal from the interactive program.
Some embodiments may be described as a process which is usually depicted as a flowchart, a flow diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed concurrently (i.e., in parallel). Likewise, operations in a flowchart illustrated as concurrent processes may be performed sequentially in some embodiments. In addition, the order of the operations may be re-arranged. A process is terminated when its operations are completed. A process may correspond to a method, a program, a procedure, a method of manufacturing or fabrication, etc.
At block 801, an interactive program is executed on a processor of the computer system 102. At block 802, levels of texture sensations are selected by a user via the UI 700 associated with the interactive program. In one embodiment, a user may select a number from a texture sensation continuum shown in table 700. In one embodiment, a user may select roughness and smoothness sensation levels in terms of FEPA ‘P’ grade.
At block 803, the controller 200 is positioned by a user to a particular context of the executing interactive program as shown by the exemplary contexts of
In one embodiment, the first trigger signal indicates to the controller 200 to smooth the first region 204 of the controller 200. Accordingly, at block 806, the controller 200 causes the first region 204 to smooth relative to the second state. In one embodiment, as shown by arrow 808, the user may adjust the level of texture sensation (e.g., select a new level on the texture continuum in UI 700) in response to experiencing the smoothness sensation. Arrow 808 also indicates that, in one embodiment, the user bypasses block 802, after experiencing the smoothness sensation, and positions the controller 200 to a new context of the executing interactive program to receive another texture sensation.
In one embodiment, at block 813 in response to the second trigger signal, the controller 200 causes the second region 205 to heat up by means of the heating source (part of 512) relative to a third state. In one embodiment, as shown by arrow 815, the user may adjust the level of temperature sensation (e.g., select a new heat level from the UI 700) in response to experiencing the heat sensation. In one embodiment, at block 814, the controller 200 causes the second region to cool down by means of the cooling source (part of 512) relative to a fourth state. In one embodiment, as shown by arrow 816, the user may adjust the level of temperature sensation (e.g., select a new coolness level from the UI 700) in response to experiencing the coolness sensation.
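The heat/cool branch of this flow (blocks 813/814, with the UI adjustments of arrows 815/816) can be summarized in a short sketch. The function name, the string results, and the default state values (110 °F hot, 45 °F cool, within the example ranges) are all hypothetical.

```python
def thermal_flow(trigger, ui_level_f=None):
    """Illustrative sketch of the heat/cool flow (all names hypothetical).

    The second trigger selects heating toward the third state or cooling
    toward the fourth state; a UI selection overrides the default level.
    """
    third_state_f, fourth_state_f = 110.0, 45.0  # assumed default states
    if trigger == "heat":
        return f"heat second region to {ui_level_f or third_state_f} F"
    if trigger == "cool":
        return f"cool second region to {ui_level_f or fourth_state_f} F"
    return "no thermal change"

print(thermal_flow("heat"))        # uses the default third state
print(thermal_flow("cool", 42.0))  # uses the UI-selected coolness level
```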
In one embodiment, at block 903, the controller 200 pushes out (411) the set of prongs 405 (or any of the sets of prongs of
In one embodiment, at block 902, the controller 200 pulls in (410) the set of prongs 405 (or any of the sets of prongs of
In one embodiment, the machine-readable medium 1003 may include, but is not limited to, flash memory, optical disks, CD-ROMs, DVD-ROMs, RAMs, EPROMs, EEPROMs, magnetic or optical cards, or other types of machine-readable media suitable for storing electronic or computer-executable instructions. For example, embodiments of the invention may be downloaded as a computer program (e.g., BIOS) which may be transferred from a remote computer (e.g., a server) to a requesting computer (e.g., a client) by way of data signals via a communication link (e.g., a modem or network connection). The computer-executable instructions 1004 stored in the machine-readable medium 1003 are executed by a processor 1002 (discussed with reference to
In one embodiment, a platform unit 2000 is provided, with various peripheral devices connectable to the platform unit 2000. In one embodiment, the platform unit 2000 comprises: a Cell processor 2028; a Rambus® dynamic random access memory (XDRAM) unit 2026; a Reality Simulator graphics unit 2030 with a dedicated video random access memory (VRAM) unit 2032; and an I/O bridge 2034. In one embodiment, the platform unit 2000 also comprises a Blu Ray® Disk BD-ROM® optical disk reader 2040 for reading from a disk 2040A and a removable slot-in hard disk drive (HDD) 2036, accessible through the I/O bridge 2034. In one embodiment, the platform unit 2000 also comprises a memory card reader 2038 for reading compact flash memory cards, Memory Stick® memory cards and the like, which is similarly accessible through the I/O bridge 2034.
In one embodiment, the I/O bridge 2034 connects to multiple Universal Serial Bus (USB) 2.0 ports 2024; a gigabit Ethernet port 2022; an IEEE 802.11b/g wireless network (Wi-Fi) port 2020; and a Bluetooth® wireless link port 2018 capable of supporting up to seven Bluetooth® connections.
In operation, the I/O bridge 2034 handles all wireless, USB and Ethernet data, including data from one or more game controllers 2002/2003. For example when a user is playing a game, the I/O bridge 2034 receives data from the game (motion) controller 2003 (same as controller 200) via a Bluetooth® link and directs it to the Cell® processor 2028, which updates the current state of the game accordingly.
In one embodiment, the wireless, USB, and Ethernet ports also provide connectivity for other peripheral devices in addition to the game controllers 2002/2003, such as: a remote control 2004; a keyboard 2006; a mouse 2008; a portable entertainment device 2010 such as a Sony Playstation® Portable entertainment device; a video image sensor such as a Playstation® Eye video image sensor 2012; a microphone headset 2020; a microphone array 2015; a card reader 2016; and a memory card 2048 for the card reader 2016. Such peripheral devices may therefore in principle be connected to the platform unit 2000 wirelessly; for example, the portable entertainment device 2010 may communicate via a Wi-Fi ad-hoc connection, while the microphone headset 2020 may communicate via a Bluetooth link.
The provision of these interfaces means that the Sony Playstation 3® device is also potentially compatible with other peripheral devices such as digital video recorders (DVRs), set-top boxes, digital video image sensors, portable media players, Voice over IP telephones, mobile telephones, printers and scanners.
In one embodiment, the game controller 2002/2003 is operable to communicate wirelessly with the platform unit 2000 via the Bluetooth® link, or to be connected to a USB port, the USB connection also providing power by which to charge the battery of the game controller 2002/2003. In one embodiment, the game controller 2002/2003 also includes memory, a processor, a memory card reader, permanent memory such as flash memory, light emitters such as LEDs or infrared lights, a microphone and speaker, a digital video image sensor, a sectored photodiode, an internal clock, and a recognizable/identifiable shape such as a spherical section facing the game console.
In one embodiment, the game controller 2002/2003 is configured for three-dimensional location determination. Consequently, gestures and movements by the user of the game controller 2002/2003 may be translated as inputs to a game in addition to, or instead of, conventional button or joystick commands. Optionally, other wirelessly enabled peripheral devices such as the Playstation™ Portable device may be used as a controller. In the case of the Playstation™ Portable device, additional game or control information (for example, control instructions or number of lives) may be provided on the screen of the device. Other alternative or supplementary control devices may also be used, such as a dance mat (not shown), a light gun (not shown), a steering wheel and pedals (not shown), or the like.
In one embodiment, the remote control 2004 is also operable to communicate wirelessly with the platform unit 2000 via a Bluetooth link. The remote control 2004 comprises controls suitable for the operation of the Blu Ray™ Disk BD-ROM reader 2040 and for the navigation of disk content.
The Blu Ray™ Disk BD-ROM reader 2040 is operable to read CD-ROMs compatible with the Playstation® and PlayStation 2® devices, in addition to conventional pre-recorded and recordable CDs, and so-called Super Audio CDs. The reader 2040 is also operable to read DVD-ROMs compatible with the Playstation 2® and PlayStation 3® devices, in addition to conventional pre-recorded and recordable DVDs. The reader 2040 is further operable to read BD-ROMs compatible with the Playstation 3 device, as well as conventional pre-recorded and recordable Blu-Ray Disks.
The platform unit 2000 is operable to supply audio and video signals, either generated or decoded by the Playstation 3® device via the Reality Simulator graphics unit 2030, through audio 2050 and video connectors 2052 to an audio visual device 2042 such as the audio-visual device 101 of
In one embodiment, the video image sensor 2012 comprises a single charge coupled device (CCD) and an LED indicator. In some embodiments, the video image sensor 2012 includes software and hardware-based real-time data compression and encoding apparatus so that compressed video data may be transmitted in an appropriate format such as an intra-image based MPEG (Moving Picture Experts Group) standard for decoding by the platform unit 2000. In one embodiment, the video image sensor LED indicator is arranged to illuminate in response to appropriate control data from the platform unit 2000, for example, to signify adverse lighting conditions.
Embodiments of the video image sensor 2012 may variously connect to the platform unit 2000 via an HDMI, USB, Bluetooth® or Wi-Fi communication port. Embodiments of the video image sensor may include one or more associated microphones and may also be capable of transmitting audio data. In embodiments of the video image sensor, the CCD may have a resolution suitable for high-definition video capture. In one embodiment, the images captured by the video image sensor are incorporated within a game or interpreted as game control inputs. In another embodiment, the video image sensor is an infrared video image sensor suitable for detecting infrared light.
In one embodiment, the Power Processing Element (PPE) 2150 is based upon a two-way simultaneous multithreading Power 2070 compliant PowerPC core (PPU) 2155 running with an internal clock of 3.2 GHz. It comprises a 512 kB level 2 (L2) cache 2152 and a 32 kB level 1 (L1) cache 2151. The PPE 2150 is capable of eight single precision operations per clock cycle, translating to 25.6 GFLOPs at 3.2 GHz. The primary role of the PPE 2150 is to act as a controller for the SPEs 2110A-H, which handle most of the computational workload. In operation, the PPE 2150 maintains a job queue, scheduling jobs for the SPEs 2110A-H and monitoring their progress. Consequently, each SPE 2110A-H runs a kernel whose role is to fetch a job, execute it, and synchronize with the PPE 2150.
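For purposes of illustration only, the job-queue scheduling described above may be sketched as follows. This is a minimal sketch, not an embodiment: the function names, the representation of a job as a callable, and the use of Python threads as stand-ins for SPE kernels are all assumptions introduced solely for this example.

```python
import queue
import threading

def spe_kernel(spe_id, job_queue, results):
    """Kernel run on each (simulated) SPE: fetch a job from the
    PPE-maintained queue, execute it, and report the result back."""
    while True:
        job = job_queue.get()
        if job is None:                      # sentinel: no more work
            break
        results.append((spe_id, job()))      # execute, then synchronize back
        job_queue.task_done()

# "PPE" side: maintain a job queue and schedule jobs for eight "SPEs".
job_queue = queue.Queue()
results = []
workers = [threading.Thread(target=spe_kernel, args=(i, job_queue, results))
           for i in range(8)]
for w in workers:
    w.start()
for n in range(16):                          # enqueue 16 toy jobs
    job_queue.put(lambda n=n: n * n)
job_queue.join()                             # PPE monitors progress until done
for _ in workers:
    job_queue.put(None)                      # shut the kernels down
for w in workers:
    w.join()
```

The sketch captures only the division of roles: the controller thread owns the queue and monitors completion, while each worker repeatedly fetches, executes, and synchronizes, as described for the PPE 2150 and SPEs 2110A-H above.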
In one embodiment, each Synergistic Processing Element (SPE) 2110A-H comprises a respective Synergistic Processing Unit (SPU) 2120A-H, and a respective Memory Flow Controller (MFC) 2140A-H comprising in turn a respective Dynamic Memory Access Controller (DMAC) 2142A-H, a respective Memory Management Unit (MMU) 2144A-H and a bus interface (not shown). In one embodiment, each SPU 2120A-H is a RISC processor having local RAM 2130A-H.
In one embodiment, the Element Interconnect Bus (EIB) 2180 is a logically circular communication bus internal to the Cell processor 2028 which connects the above processor elements, namely the PPE 2150, the memory controller 2160, the dual bus interface controller 2170A, B and the eight SPEs 2110A-H, totaling twelve participants. Participants can simultaneously read and write to the bus at a rate of at least 8 bytes per clock cycle. As noted previously, each SPE 2110A-H comprises a DMAC 2142A-H for scheduling longer read or write sequences. The EIB 2180 comprises four channels, two each in the clockwise and anti-clockwise directions. Consequently, for twelve participants, the longest step-wise data-flow between any two participants is six steps in the appropriate direction.
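The six-step bound stated above follows from always choosing the shorter of the two ring directions. For illustration only, this choice may be sketched as follows; the function name and 0-indexed participant numbering are assumptions for this example and do not appear in any embodiment.

```python
def shortest_ring_path(src, dst, participants=12):
    """Return (direction, hops) for the shorter of the two ring
    directions between bus participants src and dst (0-indexed).
    Ties are resolved in the clockwise direction."""
    clockwise = (dst - src) % participants
    anticlockwise = (src - dst) % participants
    if clockwise <= anticlockwise:
        return ("clockwise", clockwise)
    return ("anticlockwise", anticlockwise)
```

With twelve participants, the worst case over all source/destination pairs is `participants // 2 = 6` hops, matching the six-step bound above.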
In one embodiment, the memory controller 2160 comprises an XDRAM interface 2162 through which the memory controller 2160 interfaces with XDRAM. The dual bus interface controller 2170A, B comprises a system interface 2172A, B.
A user interacts with the game client via the controller 200 of
Within scene A of
When a game client 1102A-C connects to a server processing module, user session control may be used to authenticate the user. An authenticated user can have associated virtualized distributed storage and virtualized network processing. Examples of items that can be stored as part of a user's virtualized distributed storage include purchased media such as, but not limited to, games, videos and music. Additionally, distributed storage can be used to save game status for multiple games, customized settings for individual games, and general settings for the game client. In one embodiment, the user geo-location module of the server processing is used to determine the geographic location of a user and their respective game client. The user's geographic location can be used by both the sharing/communication logic and the load balance processing service to optimize performance based on the geographic location and processing demands of multiple server processing modules. Virtualizing either or both network processing and network storage allows processing tasks from game clients to be dynamically shifted to underutilized server processing module(s). Thus, load balancing can be used to minimize latency associated with both recall from storage and data transmission between server processing modules and game clients.
The server processing module has instances of server application A and server application B. The server processing module is able to support multiple server applications as indicated by server application X1 and server application X2. In one embodiment, server processing is based on a cluster computing architecture that allows multiple processors within a cluster to process server applications. In another embodiment, a different type of multi-computer processing scheme is applied to process the server applications. This allows the server processing to be scaled in order to accommodate a larger number of game clients executing multiple client applications and corresponding server applications. Alternatively, server processing can be scaled to accommodate increased computing demands necessitated by more demanding graphics processing, game or video compression, or application complexity. In one embodiment, the server processing module performs the majority of the processing via the server application. This allows relatively expensive components such as graphics processors, RAM, and general processors to be centrally located, reducing the cost of the game client. Processed server application data is sent back to the corresponding game client via the Internet to be displayed on a monitor.
Scene C illustrates an exemplary application that can be executed by the game client and server processing module. For example, in one embodiment game client 1102C allows user C to create and view a buddy list 1120 that includes user A, user B, user D and user E. As shown in scene C, user C is able to see either real-time images or avatars of the respective users on monitor 1104C. Server processing executes the respective applications of game client 1102C and of the respective game clients 1102 of user A, user B, user D and user E. Because the server processing is aware of the applications being executed by game client B, the buddy list for user A can indicate which game user B is playing. Further still, in one embodiment, user A can view actual in-game video directly from user B. This is enabled by merely sending processed server application data for user B to game client A in addition to game client B.
In addition to being able to view video from buddies, the communication application can allow real-time communications between buddies. As applied to the previous example, this allows user A to provide encouragement or hints while watching the real-time video of user B. In one embodiment two-way real time voice communication is established through a client/server application. In another embodiment, a client/server application enables text chat. In still another embodiment, a client/server application converts speech to text for display on a buddy's screen.
Scene D and scene E illustrate user D and user E interacting with game consoles 1110D and 1110E, respectively, via their respective controllers 200. Each game console 1110D and 1110E is connected to the server processing module, illustrating a network in which the server processing modules coordinate game play for both game consoles and game clients. According to the embodiments of the invention, each user will receive real-time sensations of temperature and texture by means of their respective controller, which is configured to receive the first and second trigger signals from the interactive program based on the context of the interactive program.
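For illustration only, the handling of the first and second trigger signals at the controller may be sketched as a simple dispatch: the first trigger signal drives a temperature actuator and the second drives a texture actuator. The signal field names, values, and actuator interface below are hypothetical stand-ins introduced solely for this example and do not limit any embodiment.

```python
def handle_trigger(signal, state):
    """Dispatch a trigger signal received from the interactive program
    to the controller's sensation hardware (modeled here as a dict)."""
    if signal["type"] == "first":            # first trigger: temperature
        state["temperature_celsius"] = signal["level"]
    elif signal["type"] == "second":         # second trigger: texture
        state["texture_pattern"] = signal["pattern"]
    return state

state = {}
# e.g., the in-game car's engine heats up, then the car slides onto gravel
handle_trigger({"type": "first", "level": 40}, state)
handle_trigger({"type": "second", "pattern": "gravel"}, state)
```

The sketch shows only the context-driven dispatch: the interactive program decides, based on game context, which trigger signal to send, and the controller maps each signal type to the corresponding sensation.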
Reference in the specification to “an embodiment,” “one embodiment,” “some embodiments,” or “other embodiments” means that a particular feature, structure, or characteristic described in connection with the embodiments is included in at least some embodiments, but not necessarily all embodiments. The various appearances of “an embodiment,” “one embodiment,” or “some embodiments” are not necessarily all referring to the same embodiments. If the specification states a component, feature, structure, or characteristic “may,” “might,” or “could” be included, that particular component, feature, structure, or characteristic is not required to be included. If the specification or claim refers to “a” or “an” element, that does not mean there is only one of the elements. If the specification or claims refer to “an additional” element, that does not preclude there being more than one of the additional element.
While the invention has been described in conjunction with specific embodiments thereof, many alternatives, modifications and variations of such embodiments will be apparent to those of ordinary skill in the art in light of the foregoing description. For example, with reference to