Animated messaging may enhance a user's experience when receiving a message. However, a user may be limited in creating a customized animated character-based message. For example, the user may have to select from a gallery of generic animated characters and/or rely on pre-programmed animations of the characters. Thus, the user may not be able to fully customize the character and/or the animation associated with the character, with respect to the message.
The following detailed description refers to the accompanying drawings. The same reference numbers in different drawings may identify the same or similar elements. Also, the following detailed description does not limit the invention.
Embodiments described herein relate to an animated messaging scheme that permits a user to create characters for animation. The user may create the animated message on a user device that includes an animation messaging client. The user may send the animated message to a recipient via an animated messaging server.
In one implementation, a user may take a picture (e.g., with a camera of the user device) or obtain a picture (e.g., from a photo gallery on the user device or from a photo gallery on the animated messaging server). The picture may be of any thing, such as, for example, a person, a living thing (e.g., a tree, a plant, an animal), or a non-living thing or object.
In one implementation, the user may select features (e.g., facial features, such as, eyes, mouth, head, or the like, bodily features, such as, torso, arms, legs, feet, hands) within the picture to be animated. In another implementation, the user may upload the picture to the animated messaging server and the animated messaging server may automatically select features (e.g., based on object recognition) within the picture to be animated. The user may create a message (e.g., a text message, an e-mail, a multimedia messaging service (MMS) message, or the like) and select animations to be performed with respect to the features selected. For example, the user may encode the message with selectable animations (e.g., emoticons, animation codes, or the like). The animated message may be generated based on the picture, the selected features, the message and the animation codes. The user may preview the animated message before sending the animated message. Once the animated message is completed, the user may send the message to another user.
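The encoding step described above can be sketched in a few lines. This is a hypothetical illustration only: the specification does not define a code syntax, so the token table, function name, and mapping below are all assumptions.

```python
# Hypothetical sketch: splitting a composed message into display text and
# an ordered list of animations, where emoticons/animation codes embedded
# in the text map to named feature animations. The code table and token
# syntax are illustrative, not taken from the specification.
import re

ANIMATION_CODES = {
    ":)": "smile",       # mouth feature animation
    ";)": "wink",        # eye feature animation
    "<wave>": "wave",    # arm feature animation
}

def parse_animated_message(message):
    """Return the display text and the ordered list of animations."""
    pattern = "|".join(re.escape(code) for code in ANIMATION_CODES)
    animations = [ANIMATION_CODES[m] for m in re.findall(pattern, message)]
    text = re.sub(pattern, "", message).strip()
    return text, animations

text, animations = parse_animated_message("Hi there :) see you soon <wave>")
```

In this sketch the animation codes are stripped from the delivered text and retained as a separate animation track, which is one plausible reading of "encode the message with selectable animations."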
In other embodiments, variations to the previously described implementation exist and will be described later. Additionally, in other implementations, the user may create the animated message according to a different order of operations than those described.
As illustrated in
User 105 may also select a background 155 and accessories 160 for car 115. For example, user 105 may select a scenic background (e.g., beach, meadow or the like) or a generic background (e.g., a color, a pattern, or the like). Additionally, user 105 may select accessories 160, such as, for example, clothing (e.g., shirt, pants, dress, blouse, hat, or the like), a costume, jewelry, and/or other types of items, to customize the appearance of car 115.
As illustrated in
As illustrated in
Referring back to
Although
As a result of the foregoing, user 105 may select any character as an animated character and customize animation associated with the character, with respect to the user's 105 message. Since embodiments and implementations have been broadly described, variations to the above embodiments and implementations will be discussed further below.
In this description, user 105-1 and 105-2 may be referred to generally as user 105, and user device 110-1 and 110-2 may be referred to generally as user device 110.
User device 110 may include a device having communication capability. User device 110 may include a portable, a mobile, or a handheld communication device. For example, user device 110 may include a wireless telephone (e.g., a mobile phone, a cellular phone, a smart phone), a computational device (e.g., a handheld computer, a laptop), a personal digital assistant (PDA), a web-browsing device, a personal communication systems (PCS) device, a vehicle-based device, and/or some other type of portable, mobile, or handheld communication device. In other implementations, user device 110 may include a stationary communication device. For example, user device 110 may include a computer (e.g., a desktop computer), a set top box in combination with a television, an Internet Protocol (IP) telephone, or some other type of stationary communication device. User device 110 may include AMC 125. AMC 125 will be described in greater detail below. User device 110 may connect to network 185 via a wired or wireless connection.
Network 185 may include one or multiple networks (wired and/or wireless) of any type. For example, network 185 may include a local area network (LAN), a wide area network (WAN), a telephone network, such as a Public Switched Telephone Network (PSTN), a Public Land Mobile Network (PLMN) or a cellular network, a satellite network, an intranet, the Internet, a data network, a private network, or a combination of networks. Network 185 may operate according to any number of protocols, standards, and/or generations (e.g., second, third, fourth).
Messaging server 195 may include a network device having communication capability. For example, messaging server 195 may include a network computer. Messaging server 195 may include AMS 197. AMS 197 will be described in greater detail below.
Housing 205 may include a structure to contain components of user device 110. For example, housing 205 may be formed from plastic, metal, or some other material. Housing 205 may support microphone 210, speaker 215, keypad 220, and display 225.
Microphone 210 may transduce a sound wave to a corresponding electrical signal. For example, a user may speak into microphone 210 during a telephone call or to execute a voice command. Speaker 215 may transduce an electrical signal to a corresponding sound wave. For example, a user may listen to music or listen to a calling party through speaker 215.
Keypad 220 may provide input to user device 110. Keypad 220 may include a standard telephone keypad, a QWERTY keypad, and/or some other type of keypad. Keypad 220 may also include one or more special purpose keys. In one implementation, each key of keypad 220 may be, for example, a pushbutton. A user may utilize keypad 220 for entering information, such as text or activating a special function.
Display 225 may output visual content and may operate as an input component. For example, display 225 may include a liquid crystal display (LCD), a plasma display panel (PDP), a field emission display (FED), a thin film transistor (TFT) display, or some other type of display technology. Display 225 may display, for example, text, images, and/or video information to a user. In one implementation, display 225 may include a touch-sensitive screen. Display 225 may correspond to a single-point input device (e.g., capable of sensing a single touch) or a multipoint input device (e.g., capable of sensing multiple touches that occur at the same time). Display 225 may implement, for example, a variety of sensing technologies, including but not limited to, capacitive sensing, surface acoustic wave sensing, resistive sensing, optical sensing, pressure sensing, infrared sensing, gesture sensing, etc.
Processing system 305 may include one or more processors, microprocessors, data processors, co-processors, network processors, application specific integrated circuits (ASICs), controllers, programmable logic devices, chipsets, field programmable gate arrays (FPGAs), or some other component that may interpret and/or execute instructions and/or data. Processing system 305 may control the overall operation, or a portion thereof, of user device 110, based on, for example, an operating system and/or various applications (e.g., applications 315).
Memory/storage 310 may include memory and/or secondary storage. For example, memory/storage 310 may include a random access memory (RAM), a dynamic random access memory (DRAM), a read only memory (ROM), a programmable read only memory (PROM), a flash memory, and/or some other type of memory. Memory/storage 310 may include a hard disk (e.g., a magnetic disk, an optical disk, a magneto-optic disk, a solid state disk, etc.) or some other type of computer-readable medium, along with a corresponding drive. The term “computer-readable medium” is intended to be broadly interpreted to include a memory, a secondary storage, a compact disc (CD), a digital versatile disc (DVD), or the like. The computer-readable medium may be implemented in a single device, in multiple devices, in a centralized manner, or in a distributed manner. The computer-readable medium may include a physical memory device or a logical memory device. A logical memory device may include memory space within a single physical memory device or spread across multiple physical memory devices.
Memory/storage 310 may store data, application(s), and/or instructions related to the operation of user device 110. For example, memory/storage 310 may include a variety of applications 315, such as, for example, an e-mail application, a telephone application, a camera application, a video application, a multi-media application, a music player application, a visual voicemail application, a contacts application, a data organizer application, a calendar application, an instant messaging application, a texting application, a web browsing application, a location-based application (e.g., a GPS-based application), a blogging application, and/or other types of applications (e.g., a word processing application, a spreadsheet application, etc.). Applications 315 may include AMC 125. AMC 125 may permit a user to create and send an animated message. AMC 125 will be described in greater detail below.
Communication interface 320 may permit user device 110 to communicate with other devices, networks, and/or systems. For example, communication interface 320 may include an Ethernet interface, a radio interface, a microwave interface, or some other type of wireless and/or wired interface.
As described herein, user device 110 may perform certain operations in response to processing system 305 executing software instructions contained in a computer-readable medium, such as memory/storage 310. The software instructions may be read into memory/storage 310 from another computer-readable medium or from another device via communication interface 320. The software instructions contained in memory/storage 310 may cause processing system 305 to perform processes described herein. Alternatively, hardwired circuitry may be used in place of or in combination with software instructions to implement processes described herein. Thus, implementations described herein are not limited to any specific combination of hardware circuitry and software.
Processing system 405 may include one or more processors, microprocessors, data processors, co-processors, network processors, application specific integrated circuits (ASICs), controllers, programmable logic devices, chipsets, field programmable gate arrays (FPGAs), or some other component that may interpret and/or execute instructions and/or data. Processing system 405 may control the overall operation, or a portion thereof, of messaging server 195, based on, for example, an operating system and/or various applications (e.g., applications 415).
Memory/storage 410 may include memory and/or secondary storage. For example, memory/storage 410 may include a RAM, a DRAM, a ROM, a PROM, a flash memory, and/or some other type of memory. Memory/storage 410 may include a hard disk (e.g., a magnetic disk, an optical disk, a magneto-optic disk, a solid state disk, etc.) or some other type of computer-readable medium, along with a corresponding drive.
Memory/storage 410 may store data, application(s), and/or instructions related to the operation of messaging server 195. For example, memory/storage 410 may include applications 415 that may permit a user to create and send an animated message. Applications 415 may include AMS 197. AMS 197 will be described in greater detail below. In one embodiment, applications 415 may include an authentication, authorization, and accounting (AAA) application. In other embodiments, messaging server 195 may not include an AAA application.
Communication interface 420 may permit messaging server 195 to communicate with other devices, networks, and/or systems. For example, communication interface 420 may include an Ethernet interface, a radio interface, or some other type of wireless and/or wired interface.
As described herein, messaging server 195 may perform certain operations in response to processing system 405 executing software instructions contained in a computer-readable medium, such as memory/storage 410. The software instructions may be read into memory/storage 410 from another computer-readable medium or from another device via communication interface 420. The software instructions contained in memory/storage 410 may cause processing system 405 to perform processes described herein. Alternatively, hardwired circuitry may be used in place of or in combination with software instructions to implement processes described herein. Thus, implementations described herein are not limited to any specific combination of hardware circuitry and software.
As previously described, user device 110 may include AMC 125. AMC 125 may operate synchronously with AMS 197 to provide user 105 with the ability to create an animated message and send the animated message to the recipient (e.g., another user 105), as illustrated in
In an exemplary embodiment, user 105 may need to log in with AMS 197 of messaging server 195 before utilizing an animated messaging service. In one embodiment, messaging server 195 may provide AAA services. In other embodiments, AMS 197 may negotiate with an AAA server (not illustrated) to provide AAA services.
Referring to
Referring to
Character gallery 608 may include a gallery of characters that may be stored on messaging server 195. The characters may be indexed according to various categories (e.g., animals, people, plant life, objects, etc.). Character gallery 608 may include popular people (e.g., movie stars, musicians, etc.), cartoon characters, generic characters, holiday characters, holiday icons (e.g., Valentine heart, Christmas tree), and other types of characters according to one or more category lists. Character gallery 608 may include free character content or premium character content (e.g., which user 105 may purchase).
User device gallery 610 may include a gallery of characters that are stored on user device 110. For example, user 105 may store pictures on his or her user device 110.
Take a picture 612 may permit user 105 to launch a camera (e.g., included with user device 110) and capture a picture. GUI 130 may permit user 105 to preview the picture before accepting the picture as the character to be animated. GUI 130 may permit user 105 to save the picture in user device gallery 610 or upload the picture to My Characters 614. My Characters 614 may be stored on messaging server 195 and correspond to a space where user 105 may store pictures and/or animated characters that user 105 has previously utilized for an animated message.
As previously described, when a character has been selected, features associated with the character may be animated. By way of example, the features may include facial features (e.g., head, nose, eyes, mouth) and bodily features (e.g., arms, legs, torso, hands, feet). In one embodiment, user 105 may select the features to be animated. For example,
Additionally, or alternatively, in another embodiment, messaging server 195 may select the feature areas of the character. For example, messaging server 195 or user device 110 may include an object recognition application that may be capable of discerning various features of a character, such as, for example, the head, eyes, mouth, legs, etc. In instances when picture 120 does not correspond to a thing that inherently has these features (e.g., a tree), default feature areas may be selected. Alternatively, as previously described, user 105 may designate feature areas of the character. Additionally, as previously described, GUI 130 may permit user 105 to select background 155 and accessories 160. Background 155 of GUI 130 may provide user 105 access to background content and accessories 160 of GUI 130 may provide user 105 access to accessories content from which user 105 may select.
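The fallback behavior described above can be sketched as follows. This is a minimal sketch under stated assumptions: the region representation (x, y, width, height boxes), the default values, and the function name are illustrative, not from the specification.

```python
# Illustrative sketch of the feature-selection fallback: use feature
# areas found by an object recognizer when available, and default
# feature areas otherwise (e.g., for a picture of a tree). All names
# and coordinate values below are assumptions.

DEFAULT_FEATURES = {
    "head":  (40, 10, 80, 80),   # (x, y, width, height)
    "eyes":  (55, 30, 50, 15),
    "mouth": (60, 60, 40, 12),
}

def select_feature_areas(detected):
    """Return feature areas, falling back to defaults when detection fails."""
    areas = dict(DEFAULT_FEATURES)
    areas.update(detected)        # detected areas override the defaults
    return areas

# A picture of a tree yields no recognizable facial features:
areas = select_feature_areas({})
# A portrait where only the eyes were recognized keeps the default mouth:
portrait = select_feature_areas({"eyes": (100, 80, 60, 20)})
```

Merging detected areas over a default set is one plausible way to guarantee that every character has animatable regions even when recognition finds none.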
Select a phrase 616 may permit user 105 to select from a list of pre-recorded audio phrases. The pre-recorded audio phrases may be categorized based on context. For example, pre-recorded phrases may include generic messages (e.g., "Call me", "See you tomorrow," "Meet you there," "I am running late," etc.), specialty messages (e.g., messages related to holidays, anniversaries, birthdays, etc.), and/or other types of messages from which user 105 may select.
Record a message 618 may permit user 105 to record a message. For example, user 105 may speak into microphone 210 of user device 110. When record a message 618 is selected, GUI 130 may provide user 105 with other selections, such as, record, play, stop, and accept. GUI 130 may indicate the length of time of the recorded message. GUI 130 may permit user 105 to name and save the recorded message file. GUI 130 may permit user 105 to save the recording on user device 110 or upload the recording to My Recordings 620. My Recordings 620 may be stored on messaging server 195 and correspond to a space where user 105 may store recordings and/or other audio files that user 105 has previously utilized for an animated message.
Compose a message 622 may permit user 105 to enter a message (e.g., by typing a message or utilizing a voice-to-text application). For example, depending on user device 110, user 105 may enter a message utilizing keypad 220 or GUI 130 may provide soft keys to enter a message. Additionally, as previously described, user 105 may select gestures to be added to the message. For example, referring to
With respect to select a phrase 616 and compose a message 622, GUI 130 may provide user 105 with selections of voices for the animated character. For example, GUI 130 may provide categories of male and female voices. User 105 may be permitted to select from celebrity voices or other types of voices (e.g., cartoon voices, etc.). GUI 130 may permit user 105 to select various languages (e.g., English, Spanish, French, etc.) in which the message is to be spoken.
As previously described, user 105 may preview the animated message by selecting preview 175, as illustrated in
Contacts 624 may permit user 105 to select from a contact list, a phone list, or the like, which may be stored on user device 110. User 105 may select the recipient(s) of the animated message from contacts 624. For example, user 105 may select a telephone number or an e-mail address of the recipient(s). Recipient 626 may permit user 105 to enter a telephone number or an e-mail address directly (e.g., without accessing a contact list). User 105 may send the animated message via messaging server 195.
Although
Process 700 may begin with receiving a login to create an animated message (block 705). For example, as previously described and illustrated with respect to
A session token may be received (block 710). For example, assuming the authentication process is successful, AMS 197 or the AAA server may respond to user device 110 (i.e., AMC 125) with authentication response 510 that includes a session token. The session token may have a time-to-live, in which the duration of the time-to-live may be configured by a network administrator. For example, the duration of the time-to-live may correspond to a single animated message session, multiple days, or one or more months. In one implementation, AMC 125 may erase the session token from memory/storage 310 if user device 110 is hard reset or powered off.
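The time-to-live behavior described above can be sketched briefly. This is a hedged illustration, assuming the token carries an expiry timestamp; the field names and TTL value are hypothetical, since the specification leaves the duration to a network administrator.

```python
# Minimal sketch of a session token with a configurable time-to-live.
# The record layout and the one-day TTL are illustrative assumptions.
import time

def issue_session_token(token_id, ttl_seconds):
    """Create a token record that expires ttl_seconds from now."""
    return {"id": token_id, "expires_at": time.time() + ttl_seconds}

def token_is_valid(token):
    """A token is usable only before its time-to-live elapses."""
    return time.time() < token["expires_at"]

token = issue_session_token("abc123", ttl_seconds=86400)  # e.g., one day
```

Under this sketch, erasing the token on a hard reset or power-off (as the client may do) simply forces a fresh login and a newly issued token.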
A picture may be selected (block 715). For example, AMC 125 may receive a selection of picture 120. In one implementation, as previously described, user 105 may take picture 120 with user device 110. AMC 125 may receive a user selection of picture 120 that was taken. In other implementations, AMC 125 may receive a user selection of picture 120 from character gallery 608, user device gallery 610, or My Characters 614.
Areas of the picture, which may be animated, may be designated (block 720). For example, as previously described and illustrated with respect to
A message may be composed (block 725). For example, as previously described and illustrated with respect to
Animation codes may be selected (block 730). For example, as previously described and illustrated with respect to
The animated message may be generated (block 735). For example, as previously described, user device 110, messaging server 195, or another device, may generate the animated message based on the user's 105 selections (e.g., the character, designation of features, the message, animation codes, etc.) pertaining to the animated message.
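One way the generating device might bundle these selections is sketched below. Every field name is an assumption made for illustration; the specification does not define a message format.

```python
# Hedged sketch: combining the user's selections (character picture,
# designated feature areas, message text, animation codes, background,
# accessories) into a single animated-message payload. The structure
# and field names are illustrative assumptions.

def generate_animated_message(picture, feature_areas, message_text,
                              animation_codes, background=None,
                              accessories=None):
    """Combine the user's selections into one message payload."""
    return {
        "character": picture,
        "features": feature_areas,
        "text": message_text,
        "animations": animation_codes,
        "background": background,
        "accessories": accessories or [],
    }

payload = generate_animated_message(
    picture="my_photo.jpg",              # hypothetical file name
    feature_areas={"mouth": (60, 60, 40, 12)},
    message_text="Call me",
    animation_codes=["smile"],
    background="beach",
)
```

Bundling the selections into one payload matches the description that generation may occur on user device 110, messaging server 195, or another device, since any of them could consume such a structure.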
The animated message may be sent (block 740). For example, as previously described, user 105 may send the animated message based on contacts 624 or recipient 626. For example, AMC 125 may receive a selection of a recipient via contacts 624 (e.g., a contacts list or telephone list residing on user device 110). Alternatively, AMC 125 may permit user 105 to enter a telephone number or e-mail address directly, without accessing a contacts list. The animated message may be sent via e-mail or as an MMS message according to the address or telephone number entered.
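The choice between e-mail and MMS delivery can be sketched as a simple dispatch on the entered address. The matching rules below are simplified assumptions for illustration, not a real address validator.

```python
# Illustrative sketch of choosing the delivery method from the recipient
# address: an e-mail address is sent as e-mail, a telephone number as an
# MMS message. The matching rules are simplified assumptions.

def delivery_method(address):
    """Pick e-mail or MMS transport based on the recipient address."""
    if "@" in address:
        return "email"
    digits = address.replace("-", "").replace(" ", "").lstrip("+")
    if digits.isdigit():
        return "mms"
    raise ValueError("unrecognized recipient address: %s" % address)

method = delivery_method("recipient@example.com")
phone_method = delivery_method("+1 555-0100")
```

Dispatching on address shape keeps the client's send path uniform whether the recipient came from contacts 624 or was entered directly via recipient 626.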
Although
The foregoing description of implementations provides illustration, but is not intended to be exhaustive or to limit the implementations to the precise form disclosed. Accordingly, modifications to the embodiments, implementations, etc., described herein may be possible.
The term “may” is used throughout this application and is intended to be interpreted, for example, as “having the potential to,” “configured to,” or “being able to,” and not in a mandatory sense (e.g., as “must”). The terms “a,” “an,” and “the” are intended to be interpreted to include one or more items. Where only one item is intended, the term “one” or similar language is used. Further, the phrase “based on” is intended to be interpreted as “based, at least in part, on,” unless explicitly stated otherwise. The term “and/or” is intended to be interpreted to include any and all combinations of one or more of the associated list items.
In addition, while a series of blocks has been described with regard to the process illustrated in
It will be apparent that the device(s) described herein may be implemented in many different forms of software or firmware in combination with hardware in the implementations illustrated in the figures. The actual software code (executable by hardware) or specialized control hardware used to implement these concepts does not limit the disclosure of the invention. Thus, the operation and behavior of a device(s) was described without reference to the specific software code—it being understood that software and control hardware can be designed to implement the concepts based on the description herein.
Even though particular combinations of features are recited in the claims and/or disclosed in the specification, these combinations are not intended to limit the disclosure of the invention. In fact, many of these features may be combined in ways not specifically recited in the claims and/or disclosed in the specification.
No element, act, or instruction used in the present application should be construed as critical or essential to the implementations described herein unless explicitly described as such.