The present invention relates generally to electronic devices and more specifically to displaying operational information about an electronic device.
Embodied Conversational Agents (ECAs) and avatars are known user interface elements, appearing, for example, in games and on the internet in chat rooms and internet shopping websites. Their use is attractive to certain market segments. In some games, the avatar is affected in many ways by events that arise within the game or by user input: it may age, die, or change its health. For some non-game devices, such as a user interface for controlling a complex electronic device, avatars are used to interact with the user and provide assistance. In one example, an animated dog entertains a user while a search is being done. In another example, a program may collect a user's past selections of television programs and make recommendations based on them.
The accompanying figures, where like reference numerals refer to identical or functionally similar elements throughout the separate views, together with the detailed description below, are incorporated in and form part of the specification, and serve to further illustrate embodiments of concepts that include the claimed invention. The figures and description explain various principles and advantages, in accordance with the embodiments.
Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of embodiments of the present invention.
Before describing in detail certain of the embodiments, it should be observed that the embodiments reside primarily in combinations of method steps and apparatus components related to displaying operational information about an electronic device. Accordingly, the apparatus, components and method steps have been represented where appropriate by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
In this document, relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms “comprises,” “comprising,” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “comprises . . . a” does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises the element.
Referring to
Referring to
Referring to
When the controller 305 monitors an input or receives an event, the controller 305 has information available (for example, coded within the programmed instructions or stored in a memory that is accessed by the controller under control of the programmed instructions) from which it can determine when a significant change of status occurs (e.g., the battery capacity has fallen below the next quartile). In response to a change of status determined from a monitored input or from an event driven input, the controller 305 sends the new operational status or event to the behavior engine 310, which provides the new operational status or event to the behavior database 317. The behavior database 317 uses the operational status or event to update a set of action or attribute states of the avatar from a previous set of states to a new set of states, based on a user-defined mapping of the operational status or event to changes of the action or attribute states of the avatar. The behavior database 317 generates new values that define a new graphic appearance of the avatar or avatar background, as well as associated audio and haptic signal values that are to be presented at the time the background and/or avatar's appearance changes. The mapping stored in the behavior database 317 is one that has been modified in response to user inputs to change these mappings. The values that define the appearance of the avatar and the associated audio and haptic signals are returned to the behavior engine 310, which couples the background and avatar appearance values to the graphics render engine 320, couples the audio signal values to the audio render engine 330, and couples the haptic signal values to the haptic render engine 350.
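The status-to-state mapping described above can be sketched in code. This is an illustrative sketch only: the class names, status keys, and state names (e.g., `BehaviorDatabase`, `battery_quartile`, `posture`) are assumptions chosen for clarity and are not taken from the embodiment, which does not specify a data format.

```python
# Illustrative sketch of the behavior database and behavior engine.
# All names and state values here are assumptions, not part of the embodiment.

# User-editable mapping from an operational status or event to changes
# of the avatar's action/attribute states (plus audio/haptic values).
DEFAULT_MAPPING = {
    ("battery_quartile", 3): {"posture": "energetic", "background": "bright"},
    ("battery_quartile", 1): {"posture": "tired", "background": "dim"},
    ("battery_quartile", 0): {"posture": "collapsed", "audio": "help_me.wav",
                              "haptic": "short_buzz"},
    ("text_message", "received"): {"accessory": "glasses", "haptic": "vibrate"},
}

class BehaviorDatabase:
    def __init__(self, mapping=None):
        # Start from a default mapping; user inputs may later change entries.
        self.mapping = dict(mapping or DEFAULT_MAPPING)

    def lookup(self, status_key, status_value):
        """Return the avatar state changes for one status change, if any."""
        return self.mapping.get((status_key, status_value), {})

class BehaviorEngine:
    def __init__(self, database):
        self.database = database
        # Previous set of avatar action/attribute states.
        self.avatar_states = {"posture": "energetic", "background": "bright"}

    def on_status_change(self, status_key, status_value):
        """Update the avatar states and return values for the render engines."""
        changes = self.database.lookup(status_key, status_value)
        # Appearance states persist; audio/haptic values are one-shot.
        self.avatar_states.update(
            {k: v for k, v in changes.items() if k not in ("audio", "haptic")})
        return {
            "appearance": dict(self.avatar_states),
            "audio": changes.get("audio"),
            "haptic": changes.get("haptic"),
        }

# Example: the battery capacity falls into the lowest quartile.
engine = BehaviorEngine(BehaviorDatabase())
result = engine.on_status_change("battery_quartile", 0)
```

In this sketch, the returned `appearance` values would be coupled to the graphics render engine, and the `audio` and `haptic` values to their respective render engines.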
The graphics render engine 320 uses input obtained from the avatar database 315, together with the background and avatar appearance values, to generate image information, wherein the image includes a background and one or more avatars that the user has selected from those in the database. The image may be combined with other display information (such as alerts or text information overlaid on the avatar and background) from the controller 305, or otherwise controlled by the controller 305 (such as by substituting an alternative complete image, when appropriate), through the display control 340, which generates the composite data necessary to drive a display 345. The display 345 is typically, but not necessarily, physically joined with the rest of the avatar control portion of the electronic device. (For example, they may be separate when the electronic device is a cellular phone/camera/game device that has a head-worn display.)
The audio render engine 330 converts the audio signal values to signals that drive an audio transducer (typically a speaker) 335. For example, the avatar's lips may move and an audio output may say “help me, I need energy” when the battery is critically low. The haptic render engine 350 converts the haptic signal values to signals that drive a haptic transducer (such as a vibrator) 355. For example, the electronic device may vibrate and the avatar may put on its glasses when a text message is received.
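The two render engines can be modeled as converters from signal values to transducer commands. The function names and signal formats below are assumptions for illustration; real render engines would drive hardware, whereas here the "transducers" are command lists that can be inspected.

```python
# Illustrative sketch only: signal-value formats and names are assumptions,
# not part of the embodiment.

def audio_render_engine(audio_signal_values, speaker_commands):
    """Convert audio signal values into commands driving the speaker 335."""
    for value in audio_signal_values:
        speaker_commands.append(("play", value))

def haptic_render_engine(haptic_signal_values, vibrator_commands):
    """Convert haptic signal values into commands driving the vibrator 355."""
    for value in haptic_signal_values:
        vibrator_commands.append(("vibrate", value))

# Example: a text message arrives, so the device vibrates (no audio).
speaker, vibrator = [], []
audio_render_engine([], speaker)
haptic_render_engine(["short_buzz"], vibrator)
```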
User inputs (not shown) for manipulating the mappings stored in the behavior database 317, and for selecting from a default set of avatars stored in the avatar database 315 or downloading a new avatar into the avatar database 315, are received by the controller 305 and converted to changes in the databases 315, 317. The user inputs are, of course, used for other purposes as well.
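Converting such a user input into a behavior-database change can be sketched as follows. The mapping keys and state names here are hypothetical, chosen only to illustrate the user-editable nature of the mapping.

```python
# Hypothetical sketch of converting a user input into a behavior-database
# change; keys and state names are assumptions, not part of the embodiment.
behavior_mappings = {
    ("text_message", "received"): {"accessory": "glasses", "haptic": "vibrate"},
}

def handle_user_mapping_input(mappings, status_key, status_value, new_states):
    """Store the user's chosen avatar response for one operational event."""
    mappings[(status_key, status_value)] = dict(new_states)

# Example: the user decides a received text message should make the avatar
# wave and the device buzz twice, instead of the default response.
handle_user_mapping_input(behavior_mappings, "text_message", "received",
                          {"action": "wave", "haptic": "double_buzz"})
```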
In the example described with reference to
Referring to
The mapping in
Although
It will be appreciated that embodiments of the invention described herein may be comprised of one or more conventional processors and unique stored program instructions that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of the embodiments of the invention described herein. The non-processor circuits may include, but are not limited to, a radio receiver, a radio transmitter, signal drivers, clock circuits, power source circuits, and user input devices. As such, these functions may be interpreted as steps of a method to display operational information about an electronic device. Alternatively, some or all functions could be implemented by a state machine that has no stored program instructions, or in one or more application specific integrated circuits (ASICs), in which each function or some combinations of certain of the functions are implemented as custom logic. Of course, a combination of these approaches could be used. Thus, methods and means for these functions have been described herein. In those situations for which functions of the embodiments of the invention can be implemented using a processor and stored program instructions, it will be appreciated that one means for implementing such functions is the media that stores the stored program instructions, be it magnetic storage or a signal conveying a file. Further, it is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein, will be readily capable of generating such stored program instructions and ICs with minimal experimentation.
In the foregoing specification, specific embodiments of the present invention have been described. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the present invention as set forth in the claims below. As one example, there could be embodiments in which more than one avatar is used, for example as a group of two or more presented simultaneously on one display. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of the present invention. The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as critical, required, or essential features or elements of any or all the claims. The invention is defined solely by the appended claims, including any amendments made during the pendency of this application and all equivalents of those claims as issued.
The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.
This application is related to a US application filed on even date herewith, having title “METHOD AND APPARATUS FOR DETERMINING THE APPEARANCE OF A CHARACTER DISPLAYED BY AN ELECTRONIC DEVICE”, having attorney docket number CML03970HI, and assigned to the assignee hereof.