The present invention relates to an interactive electronic message application, and, more particularly, to an interactive electronic message or greeting card application that provides for creating, displaying, editing, distributing and viewing of digital content with voice and video recordings and other audio-visual features.
Greeting cards and other electronic messages have long been ubiquitous tools of personal expression. Lately, electronic greeting cards and messages have taken an ever-increasing role in sending and receiving communications between individuals and, more simply, in recording messages or other information. Electronic greeting cards have largely focused on providing a customizable user experience by giving users the ability to modify text and photos.
In parallel, with the expanding availability of inexpensive storage media and computing, large amounts of audio and video data are being created and distributed over the Internet. In particular, portable computing devices such as smartphones and tablet computers are increasingly used to create and distribute digital content.
The general inventive concepts contemplate systems, methods, and apparatuses for creating, displaying, editing, distributing, and viewing high-resolution interactive electronic greeting cards and messages for present-day and future portable computing devices and their technologies. By way of example, to illustrate various aspects of the general inventive concepts, several exemplary embodiments of systems, methods, and/or apparatuses are disclosed herein.
Systems, methods, and apparatuses, according to one exemplary embodiment, contemplate an interactive application, which allows the users to fully customize and personalize the content of an interactive electronic greeting card and/or message.
Systems, methods, and apparatuses, according to one exemplary embodiment, contemplate an interactive electronic greeting card application or electronic message application, which allows the users to embed audio and visual data along with an interactive electronic greeting or message.
Systems, methods, and apparatuses, according to one exemplary embodiment, contemplate an interactive application comprising a digital character, wherein the digital character responds to a user's input by providing an audio and/or visual response. The user's input and the digital character's audio visual response are recorded for subsequent storage or distribution. The user input may be in the form of a voice command or a movement or a sound. The audio visual response may be in the form of a sound, or a pre-recorded answer, a changed graphical representation of the digital character or a combination of any of these items. The interactive application may be hosted on a portable computing device or any other computing device. The portable computing device (or other computing device), the interactive electronic message application, and the user are in communication with a server via one or more communications systems, such as the Internet.
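The source contains no code, so the interaction loop described above can only be sketched. The following Python sketch illustrates the general idea: a digital character maps each kind of user input (a voice command, a movement, or a sound) to an audio-visual response, and every exchange is logged for subsequent storage or distribution. All class and field names here are illustrative assumptions, not taken from the disclosure.

```python
import random
import time

class DigitalCharacter:
    """Illustrative sketch of the interactive digital character described above."""

    def __init__(self, prerecorded_answers):
        self.prerecorded_answers = prerecorded_answers
        # Every exchange is recorded for subsequent storage or distribution.
        self.interaction_log = []

    def respond(self, input_kind, payload=None):
        if input_kind == "voice":
            # A voice input triggers one of the pre-recorded answers.
            response = random.choice(self.prerecorded_answers)
        elif input_kind == "movement":
            # A movement changes the graphical representation of the character.
            response = f"animation:{payload}"
        else:
            # Any other sound gets a generic sound response.
            response = "sound:giggle"
        self.interaction_log.append(
            {"time": time.time(), "input": (input_kind, payload), "response": response}
        )
        return response

character = DigitalCharacter(["Yes!", "No way.", "Maybe..."])
reply = character.respond("voice", "Will it rain today?")
character.respond("movement", "shake")
```

The log doubles as the raw material for the recording feature: serializing `interaction_log` alongside captured audio/video would reproduce the stored session described later in the disclosure.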
Additional features and advantages will be set forth in part in the description that follows, and in part will be obvious from the description, or may be learned by practice of the embodiments disclosed herein. The objects and advantages of the embodiments disclosed herein will be realized and attained by means of the elements and combinations particularly pointed out in the appended claims. It is to be understood that both the foregoing brief summary and the following detailed description are exemplary and explanatory only and are not restrictive of the embodiments disclosed herein or as claimed.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate some exemplary embodiments disclosed herein, and together with the description, serve to explain principles of the exemplary embodiments disclosed herein.
The exemplary embodiments disclosed herein will now be described by reference to some more detailed embodiments, with occasional reference to the accompanying drawings. These exemplary embodiments may, however, be embodied in different forms and should not be construed as limited to the embodiments set forth herein. Rather, these exemplary embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the embodiments to those skilled in the art. The description of the exemplary embodiments below does not limit the terms used in the claims in any way. The terms of the claims have all of their full, ordinary meaning.
Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which these embodiments belong. The terminology used in the description herein is for describing exemplary embodiments only and is not intended to be limiting of the embodiments. As used in the specification, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. All publications, patent applications, patents, and other references mentioned herein are incorporated by reference in their entirety.
The following are definitions of exemplary terms used throughout the disclosure. Both singular and plural forms of all terms fall within each meaning:
“Computer” or “processing unit” as used herein includes, but is not limited to, any programmed or programmable electronic device, microprocessor, or logic circuit that can store, retrieve, and process data.
“Portable computing devices” include, but are not limited to, computing devices which combine the powers of a conventional computer in portable environments. Exemplary portable computing devices include portable computers, tablet computers, internet tablets, Personal Digital Assistants (PDAs), ultra mobile PCs (UMPCs), carputers (typically installed in automobiles), wearable computers, and smartphones. The term “portable computing device” can be used synonymously with the terms “computer” or “processing unit.”
A “web browser” as used herein, includes, but is not limited to, software for retrieving and presenting information resources on the World Wide Web. An information resource may be a web page, an image, a video, a sound, or any other type of electronic content.
“Software” or “computer program” or “application software” as used herein includes, but is not limited to, one or more computer or machine readable and/or executable instructions that cause a computer, a portable computing device, microprocessor, logic circuit, or other electronic device to perform functions, actions, and/or behave in a desired manner. The instructions may be embodied in various forms such as routines, algorithms, modules or programs, including separate applications or code from dynamically linked libraries. Software may also be implemented in various forms such as a stand-alone program, an app, a function call, a servlet, an applet, instructions stored in a memory or any other computer readable medium, part of an operating system or other type of executable instructions. It will be appreciated by one of ordinary skill in the art that the form of software is dependent on, for example, requirements of a desired application, the environment it runs on, and/or the desires of a designer/programmer or the like.
“Mobile application” or “mobile app” or “software application” or “application” or “app” as used herein, includes, but is not limited to, applications that run on smart phones, tablet computers, and other mobile or portable computing devices. The terms “mobile application” or “mobile app” or “software application” or “application” or “app” can be used synonymously with “software” or “computer program” or “application software.” Mobile applications allow users to connect to services which are traditionally available on the desktop or notebook platforms. Typically, these services access the internet, an intranet, or cellular or wireless fidelity (Wi-Fi) networks to access, retrieve, transmit, and share data.
A “network” as used herein, includes, but is not limited to, a collection of hardware components and computers or machines interconnected by communication channels that allow sharing of resources and information, including without limitation, the worldwide web or internet.
A “server” as used herein, includes, but is not limited to, a computer or a machine or a device on a network that manages network resources. The general term “server” may include specific types of servers, such as a File Server (a computer and storage device dedicated to storing files), Print Server (a computer that manages one or more printers), a Network Server (a computer that manages network traffic), and a Database Server (a computer system that processes database queries). Although servers are frequently dedicated to performing only server tasks, certain multiprocessing operating systems allow a server to manage other non-server related resources.
A “web server” as used herein, includes, but is not limited to, a server which serves content to a web browser by loading a file from a disk and serving it across a network to a user's web browser, typically using a hyper text transfer protocol (HTTP).
Reference will now be made to the drawings.
If the user elects to proceed to step 104 and to select and download one or more new digital characters at step 105, the user is re-directed to the action step at 103 after selecting and downloading the new digital character(s). Essentially, the user interacts with one digital character at a time.
When the user arrives at step 103, the user has the ability to perform two actions on the selected digital character: (1) record (step 106); and (2) browse (step 107). Regardless of the choice between steps 106 and 107, the user is presented with the same set of interactive choices as a follow-up. For example, steps 167 and 108 correspond to steps 122 and 123 respectively. Similarly, steps 109-111 correspond to steps 124-126, steps 112-116 correspond to steps 127-131, and steps 117-121 correspond to steps 132-136, respectively.
However, at step 106, if the user chooses to record their interactions with the digital character as opposed to simply browsing their interactions (as in step 107), the user is presented with additional steps 137-141 which will be described in further detail below. For the sake of brevity, only one set of user interactive choices outlined in steps 167 and 108-121 will be described in further detail. It will be understood that steps 122-136 correspond with steps 167 and 108-121 in both features and functionality, except that steps 167 and 108-121 are performed in conjunction with a user's recording of the interactive electronic greeting card screens while steps 122-136 are performed in conjunction with the user simply browsing the interactive electronic greeting card screens.
As the user first interacts with a digital character, the digital character is in a “ready” state. The user chooses to record the interactive session with the interactive electronic greeting card application at step 106. The user then initiates interaction with the digital character by either voice or movement. The digital character detects the user's voice at step 167 and the user's movement at step 108. If the user interacts with the digital character by speaking within audible proximity of the portable computing device hosting the interactive electronic message application, the digital character responds by providing an affirmative answer (step 109), a negative answer (step 110), or a “maybe” answer (step 111). All these answers are pre-recorded and are hosted as part of the interactive electronic message application on the portable computing device. The user may interact with the interactive electronic greeting card/message application by speaking or making any sound. In a preferred embodiment, the user interacts with the interactive electronic message application by speaking “questions” or “inquiries” or “statements” or “sounds” to which the interactive electronic greeting card application responds with its “answers” or “sounds.” The answers are intended to set one or more moods within the interactive message card application, including, but not limited to, humor.
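The three-way answer selection in steps 109-111 can be sketched as follows. This is a hedged illustration in Python, since the disclosure specifies no implementation; the category names mirror the text, while the specific answer strings and function name are assumptions.

```python
import random

# Pre-recorded answers hosted locally with the application, grouped into
# the three categories named in the text (steps 109, 110, and 111).
ANSWERS = {
    "affirmative": ["Absolutely!", "Yes, of course."],   # step 109
    "negative": ["No chance.", "Definitely not."],       # step 110
    "maybe": ["Ask me later.", "Hmm, perhaps."],         # step 111
}

def answer_voice_input():
    """Pick an answer category and a pre-recorded answer within it."""
    category = random.choice(list(ANSWERS))
    return category, random.choice(ANSWERS[category])
```

Because the answers are bundled with the application, this selection requires no network round-trip, which matches the text's statement that the answers are hosted on the portable computing device itself.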
The user may also interact with the interactive electronic message by making a movement on the portable computing device and/or within the interactive electronic greeting card application. With the interactive electronic message application open, the user may pinch or tap the screen of the portable computing device (at step 112), resulting in the digital character making an “ouch” sound (step 117). The user may double tap the screen (at step 113), resulting in the digital character making a “sneezing” sound (step 118). The user may swipe the screen (at step 114), resulting in the digital character making a “laughing” sound (step 119). The user may shake the portable computing device (at step 115), resulting in the digital character “waking up” and making an appropriate “awake” sound (step 120). Finally, the user may refrain from any interaction with the portable computing device or the interactive electronic message application (at step 116), resulting in the digital character making a “snoring” sound (step 121).
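The gesture-to-sound mapping in steps 112-121 is essentially a lookup table. The following minimal Python sketch makes that structure explicit; the gesture keys and sound labels mirror the text, while the identifier names are illustrative.

```python
# Mapping of user movements (steps 112-116) to the digital character's
# sound responses (steps 117-121), as described in the text.
GESTURE_SOUNDS = {
    "pinch_or_tap": "ouch",    # step 112 -> step 117
    "double_tap": "sneezing",  # step 113 -> step 118
    "swipe": "laughing",       # step 114 -> step 119
    "shake": "awake",          # step 115 -> step 120 ("waking up")
    "idle": "snoring",         # step 116 -> step 121 (no interaction)
}

def sound_for_gesture(gesture):
    """Return the character's sound for a gesture; idle/unknown snores."""
    return GESTURE_SOUNDS.get(gesture, "snoring")
```

A table-driven design like this would also make it straightforward to add or swap responses per digital character without changing the dispatch logic.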
Any voice and or movement by the user is detected as explained above, and the subsequent interactions with the user are recorded by the interactive electronic message application. The user then has a choice of additional steps in steps 137-141. The user is either able to play the video previously recorded (at step 137), share the recorded video on Facebook® (at step 138), share the recorded video on a friend's Facebook® profile (at step 139), send the recorded video via email (at step 140), or save the recorded video to the memory of the portable computing device (at step 141).
The user is also able to access an additional menu of options at step 142. The user may choose to locate stores by selecting the store locator option at step 143. Using this option, the user can search, by address, by zip code, or by both, for stores that may carry a paper card version of the interactive electronic greeting cards/messages used in this application (step 144). The user may also search generally for stores that carry any paper greeting cards or other products (step 144). The user may also choose the “Cards” option at step 148 to obtain additional information about the interactive electronic greeting card/message application or any other paper or electronic greeting cards. The user may select the About option at step 149 to get additional information regarding the interactive electronic greeting card/message application and/or its promoters. The user may also select the Settings option at step 1445 to either review the terms of service, privacy, and other legal documents (at step 147), or to turn their Facebook® login on or off.
Referring now to
While the server 201 is shown here as a single server for simplicity's sake, the server 201 may represent an application server, a database server, a web server, or any combination or configuration of servers necessary for the present invention. The server 201 may include one computer system or a plurality of computer systems. The portable computing device 202 may have a memory device to store and retrieve data, and it is in communication with the server 201 via one or more communications systems, such as the Internet 206. Similarly, one or more users 203 are in communication with the portable computing device 202 via one or more communications systems, such as the Internet 206.
In one embodiment, the type of “communication” referenced above in relation to system 100 may be “circuit communication.” Circuit communication, as used herein, indicates a communicative relationship between devices. Direct electrical, optical, and electromagnetic connections and indirect electrical, optical, and electromagnetic connections are examples of circuit communication. Two devices are in circuit communication if a signal from one is received by the other, regardless of whether the signal is modified by some other device. For example, two devices separated by one or more of the following—satellites, routers, gateways, transformers, optoisolators, digital or analog buffers, analog integrators, other electronic circuitry, fiber optic transceivers, etc.—are in circuit communication if a signal from one reaches the other, even though the signal is modified by the intermediate device(s). As a final example, two devices not directly connected to each other (e.g., a keyboard and a memory), but both capable of interfacing with a third device (e.g., a CPU), are in circuit communication.
In one exemplary embodiment, as illustrated in
Once the user 203 enters the app 204, the user 203 is greeted by a digital character. In the preferred embodiment, the digital character is a digital mustache 601, as shown in
After the initial greeting, the digital mustache 601 goes into a “ready” state as shown by the screen 701 in
In a preferred embodiment, the user 203 interacts with the app 204 by speaking “questions” or “sounds” or “inquiries” to the digital mustache 601, to which the digital mustache 601 responds with its “answers” or “sounds.” The answers are intended to set one or more moods within the app 204, including, but not limited to, humor. All the answers are pre-recorded within the app 204 and are hosted as part of the app 204 on the portable computing device 202. For example, in one embodiment, the digital mustache 601 is pre-built with fifty (50) pre-recorded answers. One of ordinary skill in the art will appreciate that any number of pre-recorded answers can be built into each digital mustache 601, and the pre-recorded answers may be added to or deleted from the digital mustache 601 at any time. The answers may be accessed and rendered either in a random fashion or via an algorithmic approach within the software of the app 204. The algorithmic approach may be fashioned to recognize the pitch, tone, speed, or other variables of the user's voice and render an appropriate pre-recorded response. For example, if the input is a deep-toned man's voice, the response may be tailored to target the preferences of a man for the targeted mood, for example humor, joy, or happiness. The same tailoring of the response can be done if the detected voice or input is a high-pitched and/or woman's voice.
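The "algorithmic approach" above is not specified further, but a minimal version of pitch-based tailoring can be sketched as follows. The frequency threshold, register labels, and answer strings are all assumptions for illustration; a real implementation would estimate the fundamental frequency from the microphone signal.

```python
import random

def select_answer(fundamental_hz, answers_by_register):
    """Pick a pre-recorded answer tailored to a crude voice-register estimate.

    Roughly 165 Hz is a commonly cited dividing line between typical adult
    male and female speaking ranges; the exact cutoff is an assumption.
    """
    register = "low" if fundamental_hz < 165.0 else "high"
    return register, random.choice(answers_by_register[register])

# Illustrative answer pools, one per voice register.
answers = {
    "low": ["Nice question, good sir.", "Indeed!"],
    "high": ["What a delightful question!", "Certainly!"],
}
```

The same dispatch structure extends naturally to other detected variables named in the text, such as speaking speed or tone, by adding further keys to the answer table.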
An exemplary view of the digital mustache 601 answering a user's question is shown in screen 801 of
The digital mustache may be designed to respond to certain motions (e.g. touching the screen) in one fashion (e.g. audio only, audio plus motion, motion only), while responding to certain actions (e.g. shaking of the device) in a different fashion (e.g. audio only, audio plus motion, motion only).
The entire interaction between the user 203 and the app 204 (via digital mustache 601) may be recorded by activating the record link 1002 shown in
With reference to link 1002 of
With further reference to the user input via a movement, an exemplary embodiment of such input may involve the user 203 shaking the portable computing device 202, as described above with reference to
After recording the interaction, the user 203 may be presented with one or more options. For instance, with reference to
As illustrated in
As illustrated by screen 1501 in
In one exemplary embodiment, the user 203 may be provided with more than one digital mustache 601 to choose from within the app 204. For instance, as illustrated in
The user 203 may select link 2202 in
The user 203 may select the Cards link 2302 to view available paper or electronic greeting cards, or any other information, as shown in the exemplary screen 2401 in
The user 203 may select the Locate a Store link 2303 to locate stores, as shown in screen 2501 of
The user 203 may then select the route calculator link 2703 (including, but not limited to, driving, biking, and walking) to allow the app 204 to calculate the distance between the inputted zip code and the address/zip code of the selected store. An exemplary route is shown in screen 2801 of
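The disclosure does not specify how the route calculator computes distance. As a hedged sketch, the great-circle (haversine) distance between two coordinates is a common building block for such a store-distance feature; the function name and radius value below are conventional, not from the source.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))
```

In practice the app would first geocode the user's zip code and the store's address into coordinates, then either report this straight-line distance or hand the pair to a routing service for driving, biking, or walking directions.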
A Settings screen 2901 with various options for privacy, terms of use, and social media use may be presented to the user 203, as shown in
The above description of specific embodiments has been given by way of example. From the disclosure given, those skilled in the art will not only understand the general inventive concepts and attendant advantages, but will also find apparent various changes and modifications to the structures and methods disclosed. For example, the general inventive concepts are not typically limited to any particular interface between a user and the user's mobile computing device. Thus, for example, use of alternative user input mechanisms, such as voice commands and keyboard entries, are within the spirit and scope of the general inventive concepts. As another example, although the embodiments disclosed herein have been primarily directed to a portable computing device, the general inventive concepts could be readily extended to a personal computer (PC) or other relatively fixed console computers, and may be pursued with reference to a website and/or other online or offline mechanisms. As another example, although the embodiments disclosed herein have been primarily directed to a mobile application on a portable computing device, the general inventive concepts could be readily extended to a mobile browser. Additionally, other browsing environments which permit the rendering and usage of the interactive greeting card application may be employed. For example, social networking applications such as Facebook® and Twitter® may be utilized to render and use the interactive greeting card's pages (e.g. within the Facebook® browser). It is sought, therefore, to cover all such changes and modifications as fall within the spirit and scope of the general inventive concepts, as described and claimed herein, and equivalents thereof.
This application claims the benefit of and priority to U.S. Provisional Application No. 61/619,808, entitled “INTERACTIVE MEDIA APPLICATION WITH AUDIO VISUAL RECORDING CAPABILITIES,” which was filed on Apr. 3, 2012. The entire disclosure of this application (U.S. Provisional Application No. 61/619,808) is incorporated herein by reference.