This invention generally relates to user interfaces and more particularly to a dynamic user interface for wireless communication devices.
Wireless communication devices often have a user interface that includes a visual interface to convey information to the user. Conventional user interfaces typically include multiple pages, screens, and/or menus allowing a user to navigate through stored information, functional options, and other preferences. One or more screens may display events that have occurred at the wireless communication device. The events may include several different event types such as email events, voice mail events, call events, and text message events.
A user interface for a wireless communication device has a visual interface that includes multiple visual elements representing events of the same type where non-textual visual characteristics represent unique information of the events. Accordingly, the visual interface includes at least a first visual element representing a first event occurring at a wireless communication device and a second visual element representing a second event occurring at the wireless communication device and being of the same event type as the first event. The first visual element has a first non-textual visual characteristic representing first information related to the first event and the second visual element has a second non-textual visual characteristic representing second information related to the second event different from the first information. The event type is one of a received email event, a transmitted email event, a voice mail event, a missed call event, a received text message event, a sent text message event, a dialed call event, or a received call event.
For the example in
The visual interface 102 includes a first element 116 corresponding to the first event 108 and a second element 118 corresponding to the second event 110. Each element 116, 118 is any kind of visual image, icon, picture, cartoon, or visual representation that includes at least one non-textual visual characteristic 120, 122 corresponding to information 112, 114 of the associated event 108, 110. The controller 106 processes unique information 112, 114 associated with the events 108, 110 to generate the visual elements 116, 118 that include a non-textual visual characteristic 120, 122 corresponding to the unique information 112, 114. Accordingly, the first information 112 unique to the first event 108 is represented in the first non-textual visual characteristic 120 and the second information 114 unique to the second event 110 is represented by the second non-textual visual characteristic 122. Examples of non-textual visual characteristics 120, 122 include color, size, intensity, contrast, shape, transparency, motion, and position within the visual interface. Visual characteristics 120, 122 based on motion of an element 116, 118 may include a movement path, speed of movement, acceleration of the element, deceleration of the element, and predictability (randomness) of the movement. The elements 116, 118 may include some text or indicia in addition to the non-textual visual characteristic in some circumstances. Further, the visual interface 102 may include other elements, where some elements may represent different event types. Also, additional elements of the same type may include the same non-textual visual characteristics as other elements; this may occur where the information represented by the non-textual visual characteristic is the same for more than one event. Further, an element may include more than one non-textual visual characteristic representing different information.
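As a rough illustration (not part of the specification), the mapping from an event's unique information to a non-textual visual characteristic might be sketched as follows; the names `VisualElement` and `element_for_event`, and the color palette, are hypothetical assumptions.

```python
# Illustrative sketch only: deriving a non-textual visual characteristic
# (here, color) from an event's unique information. All names are
# hypothetical, not taken from the specification.
import zlib
from dataclasses import dataclass

PALETTE = ["blue", "red", "green", "yellow", "purple"]

@dataclass
class VisualElement:
    event_type: str
    color: str  # the non-textual visual characteristic

def element_for_event(event_type: str, unique_info: str) -> VisualElement:
    # A deterministic hash of the unique information (e.g. a calling
    # number) selects a color, so the same information always yields the
    # same characteristic while different information usually differs.
    color = PALETTE[zlib.crc32(unique_info.encode()) % len(PALETTE)]
    return VisualElement(event_type, color)
```

The deterministic hash is one possible design choice; it guarantees that repeated events carrying the same information are rendered consistently without storing any per-event state.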
The above scenarios are further discussed with the examples below after discussion of the interaction between the controller and visual interface.
The controller 106 is any processor, microprocessor, processor arrangement, computing device, logic or other electronics that are capable of running code to perform the functions described herein as well as facilitating the overall functionality of the wireless communication device 100. The controller 106 may include a memory and other hardware, software, and/or firmware for interfacing, communicating, and otherwise performing the functions.
At least some aspect of an event 108, 110 is processed, controlled, initiated, managed, invoked, monitored, and/or reacted to by the controller 106. Accordingly, the controller 106 is aware of all events. A user interface manager 124 of the controller generates the environment and elements shown on the visual interface. For the exemplary embodiments discussed herein, the user interface manager 124 is implemented by running code on a processor (such as the controller 106). Various graphics engines and processes may be invoked or implemented as part of the user interface manager 124. When an event 108, 110 occurs, the user interface manager 124 processes the corresponding information as dictated by the user interface manager code and/or user settings. The element 116 is generated with the appropriate non-textual visual characteristic 120 and displayed on the visual interface 102. The generated and visually displayed environments and elements may include any of numerous depictions, movements, themes, and visual aspects. Examples of some of the situations discussed above are provided below with reference to a fish pond environment.
For the following example, the first event 108 is a missed call from a first calling number and the second event 110 is a second missed call from a second calling number. The first information 112 is the first calling number and the second information 114 is the second calling number. In the examples, the visual interface 102 depicts fish swimming in a pond where the fish represent missed calls. The first information 112 results in a fish that has a blue body and the second information 114 results in a fish that has a red body. For this example, therefore, the non-textual visual characteristics 120, 122 are different colors. Accordingly, the user interface 104 provides a visual representation of a history of events where the user can easily determine at least some aspects of the events. By observing that the pond includes a blue fish and a red fish, the user can easily determine that two different callers have tried to reach the user. Where each color represents a caller that is known by the user, the fish convey the identity of the caller. For example, blue may be assigned to the phone number of a mobile device belonging to the user's spouse and red may be assigned to a device of the user's child.
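The caller-to-color assignment above can be sketched as a simple lookup; the phone numbers, color mapping, and function names below are illustrative assumptions, not part of the specification.

```python
# Illustrative sketch: a user-assigned mapping from known calling numbers
# to fish colors, as in the pond example. Numbers and colors are
# hypothetical placeholders.
CALLER_COLORS = {
    "555-0100": "blue",  # e.g. the user's spouse
    "555-0199": "red",   # e.g. the user's child
}
DEFAULT_COLOR = "gray"   # unknown callers

def fish_color(calling_number: str) -> str:
    # Known callers get their assigned color; unknown callers a default.
    return CALLER_COLORS.get(calling_number, DEFAULT_COLOR)

def pond_colors(missed_calls: list[str]) -> set[str]:
    # A glance at the pond: the set of distinct colors reveals how many
    # distinct known callers have tried to reach the user.
    return {fish_color(n) for n in missed_calls}
```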
In an example where the elements include more than one non-textual visual characteristic, each fish includes a transparency that corresponds to the age of the missed call. More recent missed calls are depicted as more opaque fish and older missed calls are represented by fish that are more transparent. Another example includes having older calls represented by slower moving fish. In a situation where the user's spouse called twice and the calls were missed, the visual interface may include two blue fish. Where call age is represented, the fish can be distinguished by an additional visual characteristic such as speed or transparency.
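One way to encode call age in transparency and speed is a simple linear fade; the linear model and its constants are assumptions for illustration only.

```python
# Illustrative sketch: transparency and speed as additional non-textual
# characteristics encoding the age of a missed call. The linear fade and
# the constants are assumptions, not taken from the specification.
def fish_opacity(age_minutes: float, fade_after: float = 60.0) -> float:
    # Recent calls -> fully opaque (1.0); calls older than `fade_after`
    # minutes bottom out at a faint 0.1 so they remain visible.
    return max(0.1, min(1.0, 1.0 - age_minutes / fade_after))

def fish_speed(age_minutes: float, max_speed: float = 10.0) -> float:
    # Older calls swim more slowly, here in proportion to their opacity.
    return max_speed * fish_opacity(age_minutes)
```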
In an example where other event types are represented in the visual interface, different types (species) of fish represent different event types. Starfish may represent voice mail messages, goldfish may represent text messages, and bluegills may represent email messages.
The environment depicted in the visual interface may convey a plethora of information visually to the user. A single glance at the visual interface provides information about the history and status of events that have occurred. Such a visual interface has several advantages over conventional user interfaces. One advantage includes the ability of the user to obtain information about multiple events without navigating through menus or accessing different screens. Although conventional interfaces may provide information about different types of events, they do not provide information regarding events of the same type. For example, some conventional visual interfaces may indicate that a voice mail is pending or that a call has been missed. An icon may provide this indication in some systems. The user, however, cannot determine how many voice mails or missed calls have occurred, or determine any information regarding the events other than that at least one event has occurred. If additional information is desired in these conventional interfaces, the user must access a different menu or specific screen.
Another advantage of the described embodiments over conventional systems includes improved privacy. In conventional systems, information regarding an event is often depicted with text describing the event. An eavesdropper can easily observe the information by looking at the screen without accessing the device. Where the information is conveyed with non-textual visual characteristics only known to the user of the device, however, the visual interface does not convey information to the eavesdropper. Such a situation may occur where the wireless communication device is left on a table without supervision of the user. A new event, such as an incoming text message, may be indicated purely by non-textual visual characteristics. Someone seeing the new element appear on the screen could not determine any information about the event.
The visual interface 302 is generated and managed by a user interface manager 124 implemented within a controller 106 such as a processor. Data related to one or more of a communication event 320, a detected event 322, and/or a generated event 324 is/are processed by the user interface manager 124 to generate an image within an environment 304 where the image includes a non-textual visual characteristic corresponding to the communication event 320, the detected event 322, the generated event 324, or a combination thereof. In
A communication event 320 is any event related to the communications with the wireless communication device 300. Examples of communications include receiving and transmitting voice, data, multimedia, music, video, email, and text message information as well as transmitting and receiving control signal data. Accordingly, examples of communication events 320 include the receipt of an email, transmission of an email, a received call, a dialed call, a missed call, a transmitted text message, a received text message, a received SMS message, a transmitted SMS message, and notification of a voicemail. Since the controller 106 manages all functions related to communications, the controller 106 is aware of all communication events 320. Information related to the communication events 320 is forwarded to the user interface manager 124 within the controller 106.
A detected event 322 is any event detected by an input device 326 such as a sensor or user interface. Examples of input devices include sensors, keypads, touch screens, buttons, touch pads, keyboards, microphones, motion detectors, orientation detectors, current detectors, voltage detectors, power detectors, and position detectors such as GPS devices. The visual interface 302 and the input device 326 may be the same device in some circumstances. Data from the input device 326 is received by the controller 106 and forwarded to the user interface manager 124.
A generated event 324 is any event that occurs due to code 328 running on a processor that is not based on an external event such as a communication event 320 or a detected event 322. The code 328 may be software or firmware and may be running on the same processor as the controller 106 or may be running on a separate computer, processor, controller, or other collection of electronics. Examples of generated events 324 include control signals for producing random or repetitive changes in an element or the visual interface environment. The changes may include changes in color, motion, size, position, or any other visual characteristic that may appear random, cyclical, or otherwise uncorrelated to external events. A more specific example of a generated event 324 includes generation of data for controlling the motion of a graphical fish in a graphical pond environment on the visual interface. The data may allow the fish to swim in a particular pattern or to swim randomly. If based solely on a generated event 324, therefore, the path of the fish is uncorrelated to communication events and detected events. In circumstances where the visual characteristics of the element are based on a combination of events, the particular visual characteristic related to the generated event may be modified by another event. For the fish example, a swimming fish may change direction or speed when a detected event occurs although the underlying motion is based on the generated data. More specifically, one example includes the detection of user input on a touch screen performing the function of the visual interface. The user may, for example, touch the visual interface directly in front of the path of the fish and, in response, the user interface manager modifies the generated path of the fish to change direction to avoid the position of the screen touched by the user.
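The interplay between a generated swim path and a detected touch can be sketched with simple 2-D vector math; the function name, step size, avoidance radius, and wander amount below are all illustrative assumptions.

```python
# Illustrative sketch: a generated swim path (random wander) modified by a
# detected event (a screen touch that the fish steers away from). All
# names and constants are assumptions, not from the specification.
import math
import random

def next_position(pos, heading, touch=None, step=5.0, avoid_radius=40.0):
    """Advance the fish one frame; return (new_pos, new_heading)."""
    x, y = pos
    if touch is not None and math.hypot(touch[0] - x, touch[1] - y) < avoid_radius:
        # Detected event overrides the generated path: the fish swims
        # directly away from the touched point.
        heading = math.atan2(y - touch[1], x - touch[0])
    else:
        # Generated event: a small random wander, uncorrelated with
        # communication or detected events.
        heading += random.uniform(-0.2, 0.2)
    return (x + step * math.cos(heading), y + step * math.sin(heading)), heading
```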
In the first screen shot 402, the element 408 has a first position. In the second screen shot 404, the element 408 is in the first position at the time user input begins. The element is in a second position in the third screen shot 406. User input is complete in the third screen shot 406. Accordingly, for the example of
In the first screen shot 422, the element 428 has a first position. For this example, the element 428 is a sun and the position in the first screen shot 422 represents a time in the morning. In the second screen shot 424, the element 428 is in a second position representing midday. The element 428 is in a third position in the third screen shot 426 representing evening. Accordingly, for the example of
In the first screen shot 502, the element 508 has a first position. For this example, the element 508 is a bird. In the second screen shot 504, the element 508 is in a second position. The element 508 is in a third position in the third screen shot 506. The position of the element 508 and the motion path through the environment is not based on any external events in this example. The user interface manager 124 processes the data to generate a change in non-textual visual characteristics of the element 508 resulting in a change in position and motion of the element 508 within the environment.
In the first screen shot 602, the element 608 has a first position and the second element 610 has a first position. For this example, the two elements 608, 610 are graphical birds. In the second screen shot 604, the element 608 is in a second position. The second element 610, however, has been deleted from the environment in response to a detected event. In this case, the detected event is user input through a touch screen. The element 608 is in a third position in the third screen shot 606. The position of the elements 608, 610 and the motion paths through the environment are not based on any external events in this example. The existence of the elements 608, 610, however, is based on the occurrence of the communication event of receiving a voice mail. Accordingly, the element 610 has non-textual visual characteristics based on a detected event, a communication event, and a generated event. Continuing with the example, the third element (vehicle) 612 represents an incoming call. In the second screen shot 604 the vehicle has a color (white) indicating that the call is currently being received. The non-textual visual characteristic of color indicates the state of an incoming call. In the third screen shot 606, the third element 612 has a color that is not white (indicated in
At step 702, it is determined that a first event of a particular event type has occurred. The user interface manager 124 receives data from other sections of the controller 106 indicating that an event has occurred.
At step 704, it is determined that a second event has occurred that is of the same event type as the first event. The user interface manager 124 receives data from other sections of the controller 106 indicating that the second event has occurred. Examples of suitable event types include received email events, transmitted email events, voice mail events, missed call events, received text message events, sent text message events, dialed call events, received call events, received advertisements, alarms, and calendar reminders.
At step 706, a first visual element 116 is generated on the visual interface 102 to represent the first event 108. The user interface manager 124 applies information 112 of the first event to generate the first visual element 116 having a non-textual visual characteristic 120 representing the information 112.
At step 708, the second visual element 118 is generated on the visual interface. The user interface manager 124 applies second information 114 of the second event 110 to generate the second visual element representing the second event and having a second non-textual visual characteristic 122 representing the second information 114. Accordingly, the first visual element 116 has a first non-textual visual characteristic 120 representing first information 112 related to the first event 108 and the second visual element 118 has a second non-textual visual characteristic 122 representing second information 114 related to the second event 110, where the first information 112 and the second information 114 are different. The visual interface 102, therefore, displays elements where each element looks different based on at least the information corresponding to the event that is represented. Examples of suitable information for the first information 112 and/or the second information 114 include a time received, a time sent, a time dialed, an alarm time, a battery charge, a remaining operation time, a calling party name, a calling party number, a called party number, a called party name, a sending party electronic mail (email) address, a receiving party email address, a message sending party name, a message sending party number, and a meeting type. Examples of suitable non-textual visual characteristics 120, 122 include element size, element shape, element color, element position within the visual interface, element speed of motion within the visual interface, element path of motion within the visual interface, and element direction of motion within the visual interface.
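Steps 702 through 708 can be sketched as a small manager that assigns a distinct characteristic per distinct piece of event information; the class and attribute names are hypothetical, not drawn from the specification.

```python
# Illustrative sketch of steps 702-708: two events of the same type yield
# two elements whose non-textual characteristic (color) differs because
# their underlying information differs. All names are hypothetical.
class UserInterfaceManager:
    PALETTE = ["blue", "red", "green", "yellow"]

    def __init__(self):
        self.elements = []   # (event_type, color) pairs shown on screen
        self._assigned = {}  # unique information -> assigned color

    def on_event(self, event_type: str, info: str) -> None:
        # Assign a distinct color per distinct piece of information
        # (e.g. per calling number), reusing it for repeat events.
        if info not in self._assigned:
            self._assigned[info] = self.PALETTE[len(self._assigned) % len(self.PALETTE)]
        self.elements.append((event_type, self._assigned[info]))
```

A second missed call from the same number thus reuses the first call's color, while a call from a new number introduces a new color.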
At step 802, it is determined that a communication event 320 has occurred. The user interface manager receives information from other functions within the controller or from other devices indicating that the communication event has occurred. Examples of communication events include received email events, transmitted email events, voice mail events, missed call events, received text message events, sent text message events, dialed call events, received call events, and received advertisements.
At step 804, a visual element having a first non-textual visual characteristic 306 is generated on a visual interface where the characteristic corresponds to the communication event 320.
At step 806, data corresponding to a detected event 322 is received from an input device 326. As described above, the input device 326 may be a user input device or a sensor. The input device, therefore, detects external events such as environmental statistics and occurrences, orientations, positions and locations (and changes to the orientation, position, and location) of the wireless communication device.
At step 808, a visual element having a second non-textual visual characteristic 308 is generated where the characteristic 308 corresponds to the detected event 322. Examples of suitable non-textual visual characteristics include element size, element shape, element color, element position within the visual interface, element speed of motion within the visual interface, element path of motion within the visual interface, and element direction of motion within the visual interface.
At step 810, data generated by code running on the processor is received. As discussed above, a generated event 324 is any event that occurs due to code 328 running on a processor that is not based on an external event such as communication event 320 or a detected event 322. The code 328 may be software or firmware and may be running on the same processor as the controller 106 or may be running on a separate computer, processor, controller, or other collection of electronics. Examples of generated events 324 include control signals for producing random or repetitive changes in an element or the visual interface environment. The changes may include changes in color, motion, size, position or any other visual characteristic that may appear random, cyclical or otherwise uncorrelated to external events.
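A generated event of the cyclical kind described above might be sketched as a frame-driven brightness cycle; the sine-based model and its period are assumptions for illustration.

```python
# Illustrative sketch of a generated event (step 810): code running on the
# processor produces a cyclical change in a visual characteristic that is
# uncorrelated with communication or detected events. The sine-based
# cycle and its constants are assumptions.
import math

def generated_brightness(frame: int, period: int = 120) -> float:
    # Brightness cycles smoothly within [0, 1], repeating every `period`
    # frames, independent of any external event.
    return 0.5 + 0.5 * math.sin(2 * math.pi * frame / period)
```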
At step 812, a visual element having a third non-textual visual characteristic based on the generated data is generated, where the characteristic corresponds to the generated event 324.
Therefore, the method provides management of a visual interface 302 to simultaneously display an environment 304 having different non-textual visual characteristics representing communication events, detected events, and generated events. The user can determine the status of communications, wireless communication device functions, and other occurrences by glancing at a single screen that also depicts an entertaining environment with generated elements that have changing visual characteristics not based on external events.
Clearly, other embodiments and modifications of this invention will occur readily to those of ordinary skill in the art in view of these teachings. The above description is illustrative and not restrictive. This invention is to be limited only by the following claims, which include all such embodiments and modifications when viewed in conjunction with the above specification and accompanying drawings. The scope of the invention should, therefore, be determined not with reference to the above description, but instead should be determined with reference to the appended claims along with their full scope of equivalents.
This application claims the benefit of priority of U.S. Provisional application No. 61/041,167 entitled “USER INTERFACE FOR MOBILE WIRELESS DEVICES”, docket number PRO 00898, filed Mar. 31, 2008 and incorporated by reference in its entirety, herein. This application is also related to U.S. application Ser. No. 12/413,482 entitled “CALCULATING ROUTE AND DISTANCE ON COMPUTERIZED MAP USING TOUCHSCREEN USER INTERFACE”, docket number UTL 00898, filed on Mar. 27, 2009 and incorporated by reference in its entirety, herein.