The features of the system that are believed to be novel are set forth with particularity in the appended claims. While the specification concludes with claims defining the features of the embodiments of the invention that are regarded as novel, it is believed that the method, system, and other embodiments will be better understood from a consideration of the following description, taken in conjunction with the accompanying drawing figures, in which like reference numerals identify like elements.
As required, detailed embodiments of the present method and system are disclosed herein. However, it is to be understood that the disclosed embodiments are merely exemplary and can be embodied in various forms. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the embodiments of the present invention in virtually any appropriately detailed structure. Further, the terms and phrases used herein are not intended to be limiting but rather to provide an understandable description of the embodiments herein.
The terms “a” or “an,” as used herein, are defined as one or more than one. The term “plurality,” as used herein, is defined as two or more than two. The term “another,” as used herein, is defined as at least a second or more. The terms “including” and/or “having,” as used herein, are defined as comprising (i.e., open language). The term “coupled,” as used herein, is defined as connected, although not necessarily directly, and not necessarily mechanically. The terms “processing” and “processor” can be defined as any number of suitable processors, controllers, units, or the like that are capable of carrying out a pre-programmed or programmed set of instructions.
The terms “program,” “software application,” and the like, as used herein, are defined as a sequence of instructions designed for execution on a computer system. A program, computer program, or software application may include a subroutine, a function, a procedure, an object method, an object implementation, an executable application, an applet, a midlet, a servlet, source code, object code, a shared library/dynamic load library, and/or another sequence of instructions designed for execution on a computer system. The term “sensory action” can be a physical response, a physical stimulation, or a physical action applied to a device. An “emotional component” can be defined as an audio attribute or visual attribute, such as text type, font size or color, audio volume, audio equalization, visual rendering, or another visual aspect, associated with a sensory action. A “multimedia message” can be defined as data, a packet, an audio response, or a visual response that can be communicated between devices, systems, or people in real-time or non-real-time. The term “real-time” can be defined as occurring at the moment with minimal delay, such that a response is perceived as immediate. The term “non-real-time” can be defined as occurring at a later time, such that a response is provided after a delay. A “sensory element” can be a transducer for converting a physical action to an electronic signal.
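By way of illustration only, the defined terms suggest a simple data model. The following Java sketch is one hypothetical arrangement; the class names, fields, and emotional categories are editorial assumptions and are not recited anywhere in this description.

```java
// Illustrative data model only; names, fields, and categories are assumptions.
class EmotionalComponent {
    enum Category { HAPPY, SAD, ANGRY, EXCITED, CALM, NEUTRAL }

    Category category;    // classified emotional category
    double intensity;     // normalized strength, 0.0 (soft) to 1.0 (hard)
    long durationMs;      // duration of the sensory action in milliseconds
    String renderHint;    // e.g., "color:red" or "vibrate:strong" (assumed format)
}

class MultimediaMessage {
    String text;                  // optional text element
    byte[] audio;                 // optional audio element
    byte[] video;                 // optional visual element
    EmotionalComponent emotion;   // embedded emotional component
}
```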
Embodiments of the invention provide a system and method for multi-dimensional action capture. Multi-dimensional action capture includes identifying an emotion during a communication and associating the emotion with the means of communication. Multi-dimensional action capture applies an emotional aspect to text, audio, and visual communication. For example, multi-dimensional action capture can sense a physical response during a communication, measure an intensity, duration, and location of the physical response, classify the measurements as belonging to an emotional category, and include an emotional component representing the emotional category within a message for conveying the emotion. The message and the emotional component can be decoded and presented to a user.
In practice, a multimedia message can be created that is associated with a sensory action, for example, a physical response. An emotional component can be assigned to the multimedia message based on the sensory action. The multimedia message can include at least one of a text, audio, or visual element that is modified based on the emotional component. For example, the emotional component provides instructions for adjusting an attribute of the text, such as font size or color, for conveying an emotion associated with the text. In one aspect, a level of the emotion can be determined by assessing a strength of a physical response. For example, a sensor element can measure an intensity, speed, and pressure of the response during a communication for classifying an emotion. The multimedia message can be conveyed in real-time such that the feedback provided by the physical response is imparted to the performance at the moment the feedback is provided. Understandably, a slight delay may exist, though it will not detrimentally affect the audience feedback. For example, audience members can squeeze a mobile device for adjusting an audio equalization of a live performance in real-time.
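As an illustration of this flow, the following sketch, which assumes the hypothetical data model sketched above and an arbitrary full-scale pressure value, measures a squeeze, grades its strength, and attaches the resulting emotional component to an outgoing message. The thresholds and render hints are assumptions, not prescribed values.

```java
// Minimal sketch of sense -> measure -> classify -> embed; thresholds assumed.
class EmotionCapture {
    static final double MAX_PRESSURE = 100.0;   // assumed full-scale sensor value

    static MultimediaMessage attachEmotion(MultimediaMessage msg,
                                           double pressure, long durationMs) {
        EmotionalComponent ec = new EmotionalComponent();
        ec.intensity = Math.min(1.0, pressure / MAX_PRESSURE);
        ec.durationMs = durationMs;
        if (ec.intensity > 0.66) {
            ec.category = EmotionalComponent.Category.ANGRY;    // hard squeeze
            ec.renderHint = "color:red;font-size:+4";
        } else if (ec.intensity > 0.33) {
            ec.category = EmotionalComponent.Category.EXCITED;  // medium squeeze
            ec.renderHint = "font-size:+2";
        } else {
            ec.category = EmotionalComponent.Category.CALM;     // soft squeeze
            ec.renderHint = "color:blue";
        }
        msg.emotion = ec;
        return msg;
    }
}
```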
Referring to
The mobile device 160 can also connect to the Internet 120 over a WLAN. Wireless Local Area Networks (WLANs) provide wireless access to the mobile communication environment 100 within a local geographical area. WLANs can also complement loading on a cellular system by offloading traffic, so as to increase capacity. WLANs are typically composed of a cluster of Access Points (APs) 140, also known as base stations. The mobile communication device 160 can communicate with other WLAN stations, such as the laptop 170, within the base station area 150. In typical WLAN implementations, the physical layer uses a variety of technologies, such as IEEE 802.11b or 802.11g. The physical layer may use infrared, frequency hopping spread spectrum in the 2.4 GHz band, or direct sequence spread spectrum in the 2.4 GHz band. The mobile device 160 can send and receive data to and from the server 130 or other remote servers in the mobile communication environment 100.
In one example, the mobile device 160 can send and receive multimedia data to and from the laptop 170 or other devices or systems over the WLAN connection or the RF connection. As another example, the mobile device can communicate directly with other mobile devices over non-network-assisted communications, for example, Mototalk. The multimedia data can include an emotional component for conveying a user's emotion. In one example, a user of the mobile device 160 can conduct a voice call to the laptop 170, or to another mobile device within the mobile communication environment 100. During the voice call, the user can squeeze the mobile device in a soft or hard manner for conveying one or more emotions during the voice call. The intensity of the squeeze can be conveyed to a device operated by another user and presented through a mechanical effect, such as a soft or hard vibration, or through an audio effect, such as a decrease or increase in volume. Accordingly, the other user may associate the vibration effect or the change in volume with an emotion of the user. The emotional component can be included in a data packet that can be transmitted to and from the mobile device 160 to provide an emotional aspect of the communication. A visual aspect can also be changed, such as an icon, a color, or an image, which may be present in a message or on a display.
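One conceivable way for a data packet to carry the emotional component is a fixed field layout, sketched below. The thirteen-byte layout is purely an assumption for illustration and presumes the data model sketched earlier.

```java
import java.nio.ByteBuffer;

// Hypothetical wire encoding of an emotional component inside a data packet.
class EmotionCodec {
    static byte[] encode(EmotionalComponent ec) {
        ByteBuffer buf = ByteBuffer.allocate(13);
        buf.put((byte) ec.category.ordinal());   // 1 byte: emotional category
        buf.putFloat((float) ec.intensity);      // 4 bytes: squeeze intensity
        buf.putLong(ec.durationMs);              // 8 bytes: action duration
        return buf.array();
    }

    static EmotionalComponent decode(byte[] packet) {
        ByteBuffer buf = ByteBuffer.wrap(packet);
        EmotionalComponent ec = new EmotionalComponent();
        ec.category = EmotionalComponent.Category.values()[buf.get()];
        ec.intensity = buf.getFloat();
        ec.durationMs = buf.getLong();
        return ec;
    }
}
```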
The mobile device 160 can be a cell-phone, a personal digital assistant, a portable music player, a handheld gaming device, or any other suitable communication device. The mobile device 160 and the laptop 170 can be equipped with a transmitter and receiver for communicating with the AP 140 according to the appropriate wireless communication standard. In one embodiment of the present invention, the mobile device 160 is equipped with an IEEE 802.11 compliant wireless medium access control (MAC) chipset for communicating with the AP 140. IEEE 802.11 specifies a wireless local area network (WLAN) standard developed by the Institute of Electrical and Electronics Engineers (IEEE). The standard does not generally specify technology or implementation but provides specifications for the physical (PHY) layer and Media Access Control (MAC) layer. The standard allows manufacturers of WLAN radio equipment to build interoperable network equipment.
Referring to
The media console 210 can create a multimedia message such as a text message, a voice note, a voice recording, a video clip, or any combination thereof. In another example, an icon or an avatar can be changed. An avatar is a virtual rendering of the user's own choosing that represents the user in a virtual environment, such as a game or a chat room. The media console 210 can transmit or receive multimedia messages via the communications unit 240 and render the media according to content descriptions, which can include an embedded emotional component. For example, the media console 210 can decode an emotional component associated with a multimedia message and adjust one or more attributes of the message based on the emotional component. For instance, the emotional component can instruct certain portions of text to be highlighted with a certain color, certain portions of the text to have a larger font size, or certain symbols to be included with the text, based on one or more sensory actions identified by the sensory elements 220.
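By way of example only, a receiving console might honor such instructions along the lines of the following sketch; the HTML-like markup is merely one assumed rendering path, not a required one.

```java
// Receiver-side sketch: map the decoded category onto text attributes.
class EmotionRenderer {
    static String styleText(String text, EmotionalComponent ec) {
        switch (ec.category) {
            case ANGRY:   return "<span style=\"color:red;font-weight:bold\">" + text + "</span>";
            case HAPPY:   return "<span style=\"color:green\">" + text + " \u263A</span>";
            case EXCITED: return "<span style=\"font-size:larger\">" + text + "!</span>";
            default:      return text;   // no adjustment for other categories
        }
    }
}
```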
Referring to
At step 301, the method can begin. At step 310, a multimedia message can be created. For example, referring back to
At step 320, a sensory action can be associated with the multimedia message. For example, referring back to
For example, a user may express one of many different emotions based on an assignment of the one or more sensory elements 220. For example, a first sensory element may signify a happy tone, whereas a second sensory element may signify a sad tone. The user can depress the sensory elements in accordance with an emotion during composition of a multimedia message or a reply to a message. In another example, the user may squeeze the device 160 during composition of a multimedia message to inject an emotional aspect into the message in accordance with one or more sensory actions. The user may squeeze certain portions of the phone harder or softer than other portions for changing an equalization of the audio composition. Notably, the various sensors impart differing changes to the audio composition. In another example, the user may receive a multimedia message and comment on the message by squeezing the phone or imparting a physical activity to the phone that can be detected by the sensory elements 220. For example, a user can orient the phone in a certain position, shake the phone up and down, or joggle the phone left and right to cause an emotional indicator to be added to the message. An intensity, duration, and location of the squeezing can be assessed for assigning a corresponding emotional component. The processor 230 can also evaluate an intensity of the sensory action, such as a soft, medium, or hard physical action, for respectively assigning one of a low, medium, or high priority to the intensity.
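The assignment and grading described above might be tabulated as in the following sketch; the element indices, tone bindings, and intensity thresholds are illustrative assumptions.

```java
// Sketch: bind sensory elements to tones and grade intensity into priority.
class SensoryAssignment {
    enum Priority { LOW, MEDIUM, HIGH }

    static EmotionalComponent.Category toneFor(int elementId) {
        switch (elementId) {
            case 1:  return EmotionalComponent.Category.HAPPY;  // first element
            case 2:  return EmotionalComponent.Category.SAD;    // second element
            default: return EmotionalComponent.Category.NEUTRAL;
        }
    }

    static Priority priorityFor(double intensity) {
        if (intensity > 0.66) return Priority.HIGH;    // hard physical action
        if (intensity > 0.33) return Priority.MEDIUM;  // medium physical action
        return Priority.LOW;                           // soft physical action
    }
}
```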
In one aspect, a multimedia message can be created that captures the emotional aspects of the hand movement. For example, one or more sensory elements 220 present on the cell phone can capture physical movement of the cell phone or physical actions applied to the phone. In another arrangement, the user can squeeze the cell phone for translating the hand movement into physical gestures. The squeeze allows a user to transmit an intensity grade with the message without needing to type additional descriptive adjectives. The intensity, duration, and speed of the sensory actions associated with the squeeze can be classified into an emotional category. For example, a hard squeeze can signify a harsh tone, whereas a soft squeeze can signify a passive tone. The emotional component can be communicated to a second user through the multimedia message. For example, upon receiving the multimedia message, the mobile device 160 can vibrate in accordance with the intensity, duration, and speed of the emotional component. Alternatively, an audio effect or video effect can be generated to convey the emotion.
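On the receiving side, the vibration behavior described here might be driven as sketched below; the Vibrator interface is a stand-in for whatever haptic API a given handset actually exposes.

```java
// Sketch: replay a received squeeze as a vibration of matching strength.
class HapticPlayback {
    interface Vibrator { void vibrate(int amplitude, long millis); }  // assumed API

    static void render(EmotionalComponent ec, Vibrator vib) {
        int amplitude = (int) (ec.intensity * 255);   // scale to 0..255
        vib.vibrate(amplitude, ec.durationMs);        // hard squeeze, hard buzz
    }
}
```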
At step 330, an emotional component can be assigned to the multimedia message based on the sensory action. For example, when the user squeezes the mobile device 160, an emotional component can be assigned to the multimedia message. For example, a lighting sequence or an auditory effect can be adjusted during playing of the multimedia message. As another example, during text messaging, an emotional component can be conveyed by changing the color of the text in accordance with a mood of the user. This does not require additional text, such as adjectives or text phrases, to describe the user's emotion. Accordingly, the emotional component can enhance the user experience without overburdening the user during interpretation of the original communication media. The emotional component thus provides a multi-dimensional aspect that complements the expressive aspect of the communication dialogue.
As another example, the emotional component can include a visual element to enhance the communication dialogue experience. Consider two people who are physically separated, speaking to one another on cell phones, and unable to see what the other is doing while speaking. Hand movement and gesture can be beneficial for conveying expressions and mood. Certain cultures use their hands expressively during conversation, which cannot be captured by a standard cell phone. Even a cell phone equipped with video may not have a sufficiently wide camera lens to capture the hand gestures. The hand gestures can be an integral element of the conversation which conveys emotion and engages the listening party. The processor 230 can determine a movement associated with the motion of the device 160 during hand movement and convey the movement as an emotional component to be rendered on a receiving device. The receiving device can adjust a lighting effect, an auditory effect, or a mechanical effect based on the movement. The movement may be intentional or unintentional on the part of the user.
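Assuming the device carries a three-axis accelerometer, the movement might be quantified roughly as follows; the gravity baseline and the notion of gesture energy are illustrative assumptions.

```java
// Sketch: estimate gesture energy as acceleration in excess of gravity.
class GestureCapture {
    static final double GRAVITY = 9.81;   // meters per second squared

    static double gestureEnergy(double ax, double ay, double az) {
        double magnitude = Math.sqrt(ax * ax + ay * ay + az * az);
        return Math.max(0.0, magnitude - GRAVITY);   // zero when at rest
    }
}
```

A receiving device could, for instance, scale a lighting or mechanical effect by the returned energy value.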
In practice, the media console 210 (See
At step 391, the method can end. Embodiments of the invention are not limited to messaging applications, and the method 300 can be practiced during real-time communication; that is, during an active voice call or media session. For example, the emotional components can be activated during the voice call to emphasize emotional aspects of the user's conversation captured during the communication dialogue.
Referring to
Referring to
The emotional component created can be dependent on the orientation. For example, the user may squeeze the mobile device to signal an action, such as confirming, acknowledging a response, or generating attention, to be associated with a multimedia message. The decision unit 430 (See
In another example, the sensory elements 220 may be associated with specific functionality. For example, one or more of the sensory elements 220 may be associated with an equalization of high-band, mid-band, and low-band frequencies. The user may adjust an audio equalization based on a location and an intensity of the sensory action. For instance, during composition of a multimedia message that includes voice and music, the user may depress the various sensory elements 220 to adjust an equalization of the voice and music during the composition. Understandably, the sensory elements 220 allow the user to selectively equalize the audio in an emotional sense. That is, the user can incorporate an emotional aspect into the multimedia message by adjusting the equalization through physical touch.
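The band assignment might be realized as in the sketch below, in which three normalized pressure readings drive the low-, mid-, and high-band gains; the gain range and element ordering are assumptions.

```java
// Sketch: map three sensory-element pressures onto equalizer band gains.
class SqueezeEqualizer {
    // pressures[0..2]: normalized 0.0..1.0 readings for low, mid, high bands
    static double[] bandGainsDb(double[] pressures) {
        double[] gains = new double[3];
        for (int i = 0; i < 3; i++) {
            gains[i] = 12.0 * pressures[i];   // 0 dB at rest, +12 dB at full squeeze
        }
        return gains;
    }
}
```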
In another aspect, the user can perform multiple squeezes of the mobile device 160 for signaling various commands. Understandably, the user can create a database of codes for associating various sensory actions with various actions or emotions. For example, if a menu list is presented on the mobile device 160 with one or more options to choose from, the user can associate a single squeeze with selection of the first item, a second squeeze with selection of the second item, or a hold-and-release squeeze with scrolling through the menu and selecting a list option. Alternatively, the user may receive a survey for a personal opinion on a subject matter. The user can emphasize responses to the survey through sensory activity picked up by the sensory elements. Embodiments of the invention are not limited to these arrangements, and one skilled in the art can appreciate the various configurations available to the user based on the type of sensory actions.
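Such a code book might be realized as a small decoder, as sketched below; the pattern vocabulary and the hold threshold are assumptions.

```java
// Sketch: decode squeeze patterns into menu actions per a user-defined code book.
class SqueezeCommands {
    enum Action { SELECT_FIRST, SELECT_SECOND, SCROLL }

    static Action decode(int squeezeCount, long holdMs) {
        if (holdMs > 800)      return Action.SCROLL;        // hold-and-release
        if (squeezeCount == 1) return Action.SELECT_FIRST;  // single squeeze
        return Action.SELECT_SECOND;                        // second squeeze
    }
}
```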
Referring to
The sensory element 220 may contain a sensory detector for measuring an intensity of a sensory action, such as a depressing action, a duration of the sensory action, a speed of the sensory action, and a pressure of the sensory action. For example, the sensory detector may include an infrared (IR) light source for evaluating the intensity, duration, and speed of the sensory action. The IR source may include a transmit element 222, which also serves as a receiver, and a reflection element 221. The transmit element 222 can emit a pulse of light that reflects off the reflection element 221 and returns to the transmit element. The time the light takes to travel the round-trip path can be measured to determine a distance. Accordingly, a speed of the top portion during a closing action can be measured. The sensory element 220 may also contain a pressure sensor that can measure the force of a closing action. For example, a top pressure sensor 223 can couple to a bottom pressure sensor 224 when the device is in a closed configuration. The pressure sensors can evaluate the firmness of the depressing action. Understandably, the sensory element 220 may include more or fewer than the number of components shown for measuring an intensity, speed, duration, and pressure of a sensory action. Embodiments of the invention are not herein limited to the arrangements or components shown, and various configurations are herein contemplated though not shown.
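The optical measurement reduces to a time-of-flight computation, sketched below; successive distance samples then yield the closing speed.

```java
// Sketch: distance from round-trip light travel time; speed from samples.
class ClosureSensor {
    static final double SPEED_OF_LIGHT = 3.0e8;   // meters per second

    static double distanceMeters(double roundTripSeconds) {
        return SPEED_OF_LIGHT * roundTripSeconds / 2.0;   // one-way distance
    }

    static double closingSpeedMps(double d1, double d2, double dtSeconds) {
        return (d1 - d2) / dtSeconds;   // positive while the device closes
    }
}
```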
The sensory elements 220 can be installed inside a keyboard or a phone keypad for monitoring key-stroke pressure and key depression speed during typing. The key pressure can be measured by the pressure sensor 224 at the bottom of the key stroke, directly under the key pad. The pressure sensor 224 can vary the current flowing through its sensor depending on the pressure that is applied during typing. This current can be sent to an analog-to-digital circuit and read by software as an increasing or decreasing applied pressure.
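In software, the readout can reduce to scaling the converter output, as in the following sketch; the 10-bit converter width is an assumption.

```java
// Sketch: normalize an ADC sample into a key-pressure value.
class KeyPressure {
    static final int ADC_MAX = 1023;   // assumed 10-bit analog-to-digital converter

    static double pressureFromAdc(int adcValue) {
        return (double) adcValue / ADC_MAX;   // 0.0 (light tap) to 1.0 (firm press)
    }
}
```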
Referring to
As in the previously recited example, the user may squeeze the phone hard during a voice conversation, which can be classified as a tone of anger. Alternatively, the user can rapidly squeeze the phone, indicating a tone of excitement or a point of emphasis. Further, the user may sustain a squeeze for emphasizing a passive or calm state. Understandably, various detection criteria can be employed for assessing the physical actions and identifying a corresponding emotional category. Notably, the decision unit 430 assigns an emotional category to a message for complementing the manner in which the message is presented.
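The detection criteria recited here might be expressed as simple rules, as in the sketch below; the numeric thresholds are illustrative assumptions only.

```java
// Sketch: rule-based classification per the recited criteria.
class DecisionUnit {
    static EmotionalComponent.Category classify(double intensity,
                                                int squeezesPerSecond,
                                                long holdMs) {
        if (intensity > 0.8)        return EmotionalComponent.Category.ANGRY;    // hard squeeze
        if (squeezesPerSecond >= 3) return EmotionalComponent.Category.EXCITED;  // rapid squeezes
        if (holdMs > 2000)          return EmotionalComponent.Category.CALM;     // sustained squeeze
        return EmotionalComponent.Category.NEUTRAL;
    }
}
```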
Referring to
Referring to
Where applicable, the present embodiments of the invention can be realized in hardware, software, or a combination of hardware and software. Any kind of computer system or other apparatus adapted for carrying out the methods described herein is suitable. A typical combination of hardware and software can be a mobile communications device with a computer program that, when loaded and executed, can control the mobile communications device such that it carries out the methods described herein. Portions of the present method and system may also be embedded in a computer program product, which comprises all the features enabling the implementation of the methods described herein and which, when loaded in a computer system, is able to carry out these methods.
While the preferred embodiments of the invention have been illustrated and described, it will be clear that the embodiments of the invention are not so limited. Numerous modifications, changes, variations, substitutions, and equivalents will occur to those skilled in the art without departing from the spirit and scope of the present embodiments of the invention as defined by the appended claims.