System for creating and distributing interactive advertisements to mobile devices

Information

  • Patent Grant
  • Patent Number
    10,846,717
  • Date Filed
    Wednesday, December 30, 2015
  • Date Issued
    Tuesday, November 24, 2020
Abstract
A system for mobile devices that facilitates the creation and dissemination of interactive advertisements to a plurality of mobile devices. A computer or PC comprising an interactive media creator is used to generate interactive advertisements and communicate them to a distribution server. Mobile devices have an interactive media client component to receive and present interactive media, such as these interactive advertisements, to a user. User responses are collected, and user interaction is monitored and reported. Charging for distributing advertisements is supported.
Description
BACKGROUND

1. Technical Field


The present invention relates generally to the interactions between a mobile device and a server within a network, and more specifically to the ability to provide interactive advertisements to a user of a mobile device.


2. Related Art


Electronic devices, such as mobile phones and personal digital assistants (PDAs), often contain small screens with very limited viewing area. They are constrained in terms of how much information can be displayed, and in terms of user interaction capabilities. The keyboards on cell phones, for example, are not conducive to user data entry, and only brief user inputs can be solicited from a user without annoying the user.


Often a user wants to seek online help using a mobile phone while conducting an activity such as fixing a problem with a car (changing a tire, for example) or baking a cake, without having to use a bulky notebook computer that might get damaged due to the constraints and hazards of the work area. The use of a computer/notebook to retrieve help information is not always possible when it is needed, such as during an accident on the highway or while cooking in a kitchen that has limited space. The use of a mobile phone is preferable in such circumstances, but mobile phones in general are not endowed with the features or applications necessary to facilitate easy access to such information in a format that is usable and convenient. The whole process of retrieving necessary information using a mobile phone is inconvenient due to the inability of Internet websites to provide information that a typical user can easily read, browse through or view on his mobile phone.


Information access from typical Internet based websites on mobile devices is quite often unsatisfactory and not useful due to several factors, not least of which are the multi-media and graphics rich format in which most Internet websites are designed and made available and the verbosity of their text. A mobile phone with a small screen is not a good candidate for viewing such complicated and graphics rich (with graphics, flash screens, video components, etc.) content—imagine a webpage being presented to a user that has a music component, a whole page of text (over 3 KB of text), three large diagrams, and a table of information, all on the same webpage. Such a multi-media webpage is very typical, and is obviously unsuitable for a mobile device.


User interaction in real time, such as that provided for a user using a PC on the Internet, is often not possible for a user using a cell phone. For example, the amount of textual information cannot be the full page of text that is typically made available on a PC. Graphical information also cannot be large, and too many graphical images should not appear on the same webpage. A typical website provides a rich multi-media experience that has several graphical images, large amounts of text, tables, etc. The same website, when accessed from a cell phone, would not only be unreadable, due to its large amount of text, graphics and even video, but also frustrating, the design of websites often being multi-media based (predominantly providing large multi-media web pages full of text, graphics, flash-based content and even videos). Often webpages on the Internet provide detailed information to a user while soliciting inputs from the user. Thus, there is a significant problem in presenting information to a mobile user in order to solicit user input when the user is using a cell phone rather than a PC.


Mobile devices such as a cell phone are therefore devices for which traditional websites are ill prepared to provide information. In addition, surveys or questionnaires that are created for Internet based access via a PC are not appropriate for cell phone access. Asking one or more detailed questions with information on how to answer them is possible on a web page that is accessed from a PC. However, the same web page would be unmanageable and difficult to browse and navigate on a cell phone with a small LCD screen and small keyboard for user input.


Further limitations and disadvantages of conventional and traditional approaches will become apparent to one of ordinary skill in the art through comparison of such systems with the present invention as set forth in the remainder of the present application with reference to the drawings.


BRIEF SUMMARY OF THE INVENTION

The present invention is directed to apparatus and methods of operation that are further described in the following Brief Description of the Drawings, the Detailed Description of the Invention, and the claims. Other features and advantages of the present invention will become apparent from the following detailed description of the invention made with reference to the accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS

The numerous objects and advantages of the present invention may be better understood by those skilled in the art by reference to the accompanying figures in which:



FIG. 1 is a perspective block diagram of a system for mobile devices that facilitates the creation and dissemination of interactive media to a plurality of other recipient mobile devices, wherein the interactive media is disseminated to the recipient mobile devices in a form that is compatible with the capabilities of the respective recipient mobile devices, and wherein the preferences of the user are also factored in.



FIG. 2 is a perspective block diagram of a system that supports interactive media creation and dissemination, that is facilitated by the use of a PC/computer, by a user, or by a hosted interactive media creator that is accessed by the user using a PC/notebook/laptop.



FIG. 3A is an exemplary display screen for a mobile device that supports the display of interactive media using an interactive media client component, or the browsing through an interactive media from the mobile device using a browser.



FIG. 3B is an exemplary screen of an interactive media client component on a mobile device wherein an interactive advertisement is displayed, that has been selected from a queue of advertisements.



FIG. 3C is an exemplary screen/window on a mobile device that is used by a user to set user preferences, specifically a selection of categories of interactive media to be delivered to the user, with a priority assigned to each.



FIG. 4 is a perspective block diagram of the interactive media management tree of information, a logical organization of interactive media, by a server in the system that facilitates creation and distribution of interactive media.



FIG. 5 is an interaction diagram that depicts an exemplary interaction between a recipient device used to respond to the interactive media such as advertisements and questionnaires, wherein the recipient device (a PC, notebook, PDA or laptop) is used by a user to access/retrieve interactive media from one or more distribution servers.



FIG. 6 is a perspective block diagram of a mobile device capable of receiving and playing/rendering interactive media and monitoring its usage.



FIG. 7 is a flow chart of the operation of a distribution server as it receives interactive media from a provider and communicates it eventually to users of mobile devices.



FIG. 8 is a flow chart of an exemplary operation of the server capable of distributing interactive media.





DETAILED DESCRIPTION OF THE DRAWINGS


FIG. 1 is a perspective block diagram of a system 105 for mobile devices that facilitates the creation and dissemination of interactive media to a plurality of other recipient mobile devices 111, 113, wherein the interactive media is disseminated to the recipient mobile devices 111, 113 in a form that is compatible with the capabilities of the respective recipient mobile devices 111, 113, and wherein the preferences of the user are also factored in. The system 105 comprises the interactive media creator 107, the plurality of other recipient mobile devices 111, 113 and a distribution server 109. The display of interactive media in a recipient mobile device, such as the recipient mobile device A 111, requires the use of a corresponding client component, such as a QClient, that can display/render interactive media, one at a time.


Each interactive media can comprise several components, some of which are graphics, video content, textual content, and/or audio content. These components may be adapted to make them more appropriate for the recipient devices. For example, graphics may be made more compatible (smaller or more compact) with a device if the device is not capable of displaying the default size (albeit small) presented by an interactive media creator 107.


The interactive media creator 107 that is communicatively coupled to the distribution server 109 via network 115 makes it possible for a user, such as an advertisement designer, to incorporate text, audio, voice, music, video, graphics, etc. into the interactive media. For example, each interactive media that is an audio guided activity (AGA) comprises textual descriptions, audio preambles, optional audio supplementary information, and optional textual supplementary information, for each step of a multi-step audio guided activity. An AGA is used to describe the method of cooking a dish using an associated recipe, the process of executing an operation, such as changing a tire on a car, using an associated multi-step operation, etc. The display of each step in a mobile device 111 involves the display of textual descriptions, the playing of audio information such as a preamble, the optional display of supplementary information and the playing of audio supplementary information, if available. A user can view (often using text, and even graphics if available) and optionally listen to the detailed descriptions of each step of an AGA, one step at a time, and browse through each step.


An interactive media that is an advertisement may comprise at least one of a graphic, music component, textual component and a video component, with simple user inputs such as a start, stop, pause, advance, cancel, replay, etc. For example, a simple advertisement may have just some text and a small graphic, with some background music that plays for 10 seconds, with user interaction supported for retrieving additional information, terminating the advertisement, and for advancing to the next advertisement.


Some of the plurality of recipient mobile devices 111, 113 can be legacy devices that do not have a necessary client component capable of handling the download and display of interactive media. Others of the plurality of other recipient mobile devices 111, 113 have the client component capable of handling the download and display of the interactive media.


In one embodiment, the distribution server 109 determines which recipient mobile device can handle interactive media (because they comprise the client component capable of handling the interactive media, and because the interactive media comprise metadata used to determine appropriateness for a device), and which need to be sent a simpler subset of the interactive media that can be displayed/rendered without the client component, such as by the use of a browser in the recipient mobile device. The browser may then be used to browse through a hosted version of the interactive media that is presented as a set of one or more web pages by the distribution server 109.
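The capability check described above can be sketched in code. This is an illustrative sketch only, assuming a simple boolean capability flag; the names (`Device`, `dispatch_media`) and the fallback URL scheme are hypothetical, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Device:
    phone_number: str
    has_client_component: bool  # assumed capability flag, e.g. from device metadata

def dispatch_media(device: Device, media_id: str) -> dict:
    """Return the payload the distribution server would send to this device."""
    if device.has_client_component:
        # Full package, rendered natively by the interactive media client.
        return {"type": "package", "media_id": media_id}
    # Legacy fallback: a hosted version presented as simple web pages
    # that the device's browser can display (hypothetical URL scheme).
    return {"type": "web_link",
            "url": f"https://server.example/media/{media_id}"}
```

A device with the client component receives the full package, while a legacy device receives only a link to the browser-viewable version.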


The interactive media is created/stored/distributed as packaged content with associated metadata, employing a structured format such as an XML file. For example, for an advertisement presented as interactive media, the following components may be provided:

    • an audio preamble, used to describe in audio form the purpose of the current advertisement and provide an overview of the product or service (other types of information may also be provided if necessary, in audio format)
    • a textual step description regarding the product or service, in succinct form, with minimal text, and
    • an audio supplementary information, providing additional details that may help a user better understand the product or service, its benefits, alternate products, if any, and any additional detail that may aid the user's comprehension of the product or service.
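The three components above could be packaged into an XML file along the following lines. The element names and structure here are an assumption for illustration; the patent specifies only that an XML-based structured format with metadata is used.

```python
import xml.etree.ElementTree as ET

def build_ad_package(category: str, preamble_audio: str,
                     step_text: str, supplement_audio: str) -> str:
    """Package an advertisement's components into an XML string.

    Element names (interactiveMedia, audioPreamble, ...) are illustrative.
    """
    root = ET.Element("interactiveMedia", {"category": category})
    # Audio preamble: describes the purpose of the advertisement in audio form.
    ET.SubElement(root, "audioPreamble", {"src": preamble_audio})
    # Succinct textual description with minimal text.
    ET.SubElement(root, "textualDescription").text = step_text
    # Audio supplementary information with additional product details.
    ET.SubElement(root, "audioSupplementary", {"src": supplement_audio})
    return ET.tostring(root, encoding="unicode")
```

The resulting string can be stored on the distribution server and parsed by the client component on the mobile device.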


The distribution server 109 is capable of converting a recipient list to a list of phone numbers or IP addresses as needed, in order to communicate the interactive media, or a notification regarding the availability of interactive media, to the recipient mobile devices 111, 113. In order to play all the components of an interactive media, if required, the recipient devices, such as the recipient device 111, have a client component that can handle all the components of an interactive media: audio, textual, graphics and even video components.


In one embodiment the client component, an interactive media client, is required in a recipient mobile device 111 to handle the components of an interactive media, such as audio components and textual components.


Some mobile devices, such as recipient mobile device B 113, may not have the interactive media client and therefore cannot play all the components of an interactive media. Instead, the distribution server 109 makes it possible for them to receive and display/play the interactive media by sending them the same interactive media in an alternate form, such as a simplified set of web pages, that the recipient mobile device B 113 can display using a browser or some other existing client in the recipient mobile device 113. In addition, the recipient mobile device B 113 will be sent a notification regarding the availability of an appropriate interactive media, the notification also comprising a link that can be activated to download the interactive client component so that it can be installed before displaying the interactive media.


The recipient mobile device B 113 without the interactive media client component gets an opportunity to download & install the necessary client component. The user can then activate the download link provided (in a notification) whereupon the interactive media client component is downloaded and installed automatically (or with user opt-in). The user of the recipient mobile device B 113 also is given the option, selectively, to receive a subset of interactive media that the recipient mobile device B 113 can handle without the client component.


The recipient mobile device 111 with the interactive media client component receives an interactive media and lets the user browse through it (if there are multiple steps or segments, the user can browse through each step or segment, view the textual components and listen to the audio components for each, interact with the interactive client component at the appropriate places, etc.). It is able to play/render/display all portions of an interactive media that may be provided, such as audio, text, graphics, video, etc., while also soliciting and acquiring user inputs at the appropriate places for the appropriate actions.


The distribution server 109 is capable of enhancing or modifying a received interactive media from a vendor or source that generates them. For example, the interactive media creator 107 may send an incomplete interactive media with two segments, each with only the audio preamble created (by a user recording the steps of an activity in audio form that incorporates brief descriptions of the steps involved), and the distribution server 109 incorporates a generic textual preamble and a generic textual description in order to complete the interactive media.


The distribution server 109 receives an interactive media from a user, incorporates text or graphics as needed, inserts a generic or customized prompt to the user, and sends the modified interactive media to recipients. The list of recipients is either specified by the user (such as an advertising company) along with the interactive media, or pre-configured and stored in the server 109 to be used to forward interactive media. In addition, the user might only provide a profile of recipients, or even multiple profiles of potential recipients, and the distribution server 109 is capable of identifying actual recipients based on these profiles. For example, if a profile provided identifies potential recipients as middle aged individuals with income of over $50,000 with interests in sports and music, the distribution server 109 is capable of identifying actual recipients and targeting them for the delivery of the interactive media. In one embodiment, it identifies recipients by searching through a database of registered recipients, i.e., individuals or companies that have registered to receive the interactive media and have provided a profile comprising their interests (sports, music, hiking, etc.) and hobbies, their preferences for interactive media categories, etc. In another embodiment, the distribution server 109 searches through one or more databases of subscriber information, the databases managed by it or managed by external systems or service providers. For example, the database may be maintained and managed by a real estate company (comprising their potential clients) and a bank (comprising their valued customers).
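The profile matching described above, using the middle-aged/$50,000/sports-and-music example, can be sketched as a simple filter. The field names (`age`, `income`, `interests`) and the matching rules are assumptions for illustration; the patent does not specify a matching algorithm.

```python
def match_recipients(target: dict, recipients: list) -> list:
    """Return registered recipients whose profile satisfies the target profile.

    target: {"min_age", "max_age", "min_income", "interests"} (assumed schema)
    """
    matched = []
    for r in recipients:
        # Age must fall within the targeted range.
        if not (target["min_age"] <= r["age"] <= target["max_age"]):
            continue
        # Income must meet the targeted minimum.
        if r["income"] < target["min_income"]:
            continue
        # At least one declared interest must overlap the target interests.
        if not set(r["interests"]) & set(target["interests"]):
            continue
        matched.append(r)
    return matched

# The example from the text: middle aged, income over $50,000,
# interested in sports and music.
target_profile = {"min_age": 35, "max_age": 55,
                  "min_income": 50000, "interests": ["sports", "music"]}
```

In practice the server would run an equivalent query against its database of registered recipients rather than filtering in memory.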


The distribution server 109 also supports both pull and push mode distribution of interactive media to mobile devices 111. It can send a notification of the availability of the interactive media (that a recipient may be interested in), and the recipient can trigger the retrieval of the interactive media by selecting it from a list or by some equivalent action on the mobile device 111. The triggering, or in general, the user interaction is facilitated by an interactive client component in the recipient mobile device 111, which is either made available by the manufacturer of the mobile handsets, subsequently downloaded over the air by the recipient from a server, or otherwise installed by the recipient (such as an owner of the mobile device 111). The interactive client component is able to process the received interactive media (or portions thereof), playing audio portions such as audio preambles, audio supplementary information, etc. and displaying graphics, textual preambles and textual descriptions of individual segments of a multi-segment content, facilitating interaction by the user during the viewing.


In one embodiment, the system 105 comprises the interactive media creator 107, mobile devices 111, 113, which are a combination of cellular phones, PDAs, etc., and the network 115, which is a wireless and/or wired network: a cellular network such as 3G, UMTS, CDMA, GSM, etc., a WLAN network, a WiMAX network, the Internet, Bluetooth, IrDA, etc.



FIG. 2 is a perspective block diagram of a system 205 that supports interactive media creation and dissemination, that is facilitated by the use of a PC/computer 231, by a user, or by a hosted interactive media creator 207 that is accessed by the user using a PC/notebook/laptop 233. The system 205 comprises the PC/computer 231 that a user uses to create interactive media, a server 217 that receives the interactive media and sends them to one or more recipient mobile devices 227 and recipient computer 211, and the hosted interactive media creator 207 that facilitates interactive media creation using the PC/laptop/computer 233, or via web pages provided by the server 217.


The system 205 also comprises a storage 215 that is used to store interactive media, user profiles, and the recipient profiles desired by individuals or companies interested in disseminating interactive media. It also comprises a media delivery & tracking component 219 that stores results and activity logs that can be used to track interactive media creation, dissemination, and other related activities. In addition, the system 205 comprises a billing system 223 that can facilitate billing for the creation of interactive media, the distribution of interactive media, the charges or payments made to recipients of interactive media for viewing the interactive media, the charges made to individuals and companies when a recipient views delivered interactive media, etc. In general, interactive media comprises content (with or without graphics and multimedia) that requires a user to interact with a client in the viewing of it, the experience comprising user interaction. User interaction comprises the user making a selection, choosing one or more items, clicking on displayed information, advancing, entering text as user inputs, providing audio inputs, or a combination of these.


The server 217 comprises a plurality of queues 235 for each user, wherein each of the plurality of queues holds a different category of interactive media for a recipient, or references to interactive media of a specific type or category that a user is likely to be interested in. In another related embodiment, the server 217 maintains several queues 235 of interactive media, some of the queues dedicated to specific categories of interactive media, to specific companies creating the interactive media, or to user groups. Other types of queues are also contemplated. When a new entry is made to any queue, target recipients are identified by the server 217, and either the interactive media is communicated to the recipients; a notification of its availability is communicated to the recipients while an entry is made in a queue for each of the recipients in the server 217 with a reference (such as an identification) to the actual interactive media stored along with it; or a copy of the interactive media is entered into a queue that is delivered to a recipient or browsed through by the recipient using the recipient mobile device 227.
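The per-user, per-category queue bookkeeping described above can be sketched as follows. This is a minimal in-memory sketch; the class and method names are illustrative, and a real server would persist the queues and deliver the notification records over the network.

```python
from collections import defaultdict, deque

class DistributionQueues:
    """Per-(user, category) queues of interactive media references."""

    def __init__(self):
        self.queues = defaultdict(deque)  # (user, category) -> media refs

    def enqueue(self, users: list, category: str, media_ref: str) -> list:
        """Add a media reference to each target user's queue and
        return the availability notifications to be sent."""
        notifications = []
        for user in users:
            self.queues[(user, category)].append(media_ref)
            notifications.append({"to": user, "category": category,
                                  "media_ref": media_ref})
        return notifications

    def next_for(self, user: str, category: str):
        """Pop the next waiting media reference for a user, if any."""
        q = self.queues[(user, category)]
        return q.popleft() if q else None
```

Enqueueing one advertisement for two identified recipients yields one queue entry and one notification per recipient, matching the push-with-reference variant described in the text.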


Interactive media creation is also facilitated by the hosted interactive media creation component 207 that can be accessed and used by a user employing the PC/Notebook/Laptop 233. An interactive media creation tool installed in the PC/Notebook/Laptop 231 may also be used by a user to create interactive media that can be uploaded to the server 217. A user with interactive media creation tool in the PC/Notebook/Laptop 231 creates an interactive media and sends the created interactive media to recipients/a mailing-list that the server 217 can communicate with.


The user can also employ a PC/Notebook/Laptop 231 communicatively coupled to the hosted interactive media creation component 207 to create interactive media with only audio inputs and textual inputs provided by the user for the various steps of an associated activity. The interactive media is likely to comprise audio and/or textual preambles for the steps of an audio guided activity, textual descriptions of the steps of the associated activity, supplementary information in audio and textual formats (even graphics and video formats) for each of the segments (if there are multiple segments), etc. Then the user provides a recipient list in one or more formats. The server 217 sends out the interactive media to recipients specified by the user, using their corresponding mobile phone numbers, IP addresses, email addresses, etc. A recipient user can use his recipient computer 211 to receive or browse through the interactive media. A different recipient user can use the recipient mobile device 227 to do the same.


When a recipient using the recipient mobile device 227 gets the interactive media on his mobile device, the segments of the interactive media themselves are provided to the recipient by the server 217, starting with the first segment of a multi-segment activity. Thus, at the beginning of the interactive media, the recipient would view the first segment, perhaps with an audio preamble and appropriate textual description, and would be able to activate an Info menu item to hear the audio preamble for the first segment. The user advances to the next segment by activating the Next menu item to proceed, etc. Alternatively, all segments are provided at once to the recipient mobile device 227, wherein the interactive media client component manages their local display/rendering.


In one embodiment, an XML based interactive media is created/stored by a user using a PC/notebook/laptop 231. It is created as an XML file comprising multiple segments, wherein each segment comprises:

    • an audio preamble,
    • graphics
    • a textual step description, and
    • an audio supplementary information.


The audio preamble and audio supplementary information are played/rendered during the display of a segment, when invoked by the user using appropriate menu-items or buttons. The textual segment description comprises a textual description in the form of a small paragraph. Optionally, a segment also comprises a graphic or a picture that is provided as part of the XML based interactive media.
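A client component might parse such a multi-segment XML file into per-segment records along the following lines. The element names here are assumed for illustration, since the patent does not fix a schema.

```python
import xml.etree.ElementTree as ET

def parse_segments(xml_text: str) -> list:
    """Parse a multi-segment interactive media XML document into a list of
    segment records (element names are illustrative assumptions)."""
    root = ET.fromstring(xml_text)
    segments = []
    for seg in root.findall("segment"):
        segments.append({
            # findtext returns None when an optional element is absent,
            # e.g. a segment without graphics.
            "audio_preamble": seg.findtext("audioPreamble"),
            "graphics": seg.findtext("graphics"),
            "description": seg.findtext("textualDescription"),
            "audio_supplementary": seg.findtext("audioSupplementary"),
        })
    return segments
```

The client can then display segments one at a time, playing the audio preamble when the user invokes the corresponding menu item.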



FIG. 3A is an exemplary display screen 309 for a mobile device 307 that supports the display of interactive media using an interactive media client component, or the browsing through an interactive media from the mobile device 307 using a browser. The mobile device 307 receives notifications sent to the user, such as those received as an SMS message (for example, a message of type Service message). The notifications offer the user an opportunity to download a client component that is capable of displaying an interactive media. The exemplary display screen 309 provides a list of interactive media 315 to the user, and the user can select one of them for display using a select button 313 or menu item provided. The user can exit the interactive media client by activating the back button 311 or menu item.


In one embodiment, the list of interactive media 315 comprises those provided to the user by a distribution server. Such a list is provided by a server based on user preferences and the user profile. A user can subscribe to one or more categories of interactive media, or one or more sources of interactive media (sources being content development companies, etc.), and the distribution server stores that information as part of the user's preferences and selects interactive media for delivery to the mobile device 307 based on it. In a related embodiment, such a list is provided based on a priority of interactive media determined by the distribution server. In a different embodiment, the user's selections of various subscriptions of interactive media are managed by the distribution server, which provides an RSS feed of the selected interactive media to the user on his mobile device 307.


In one embodiment, the mobile screen 309 is a screen saver screen that is displayed to the user when the user is not using the phone (meaning the phone has been idle for a while). The screen saver 309 on the mobile device gets a list of interactive media, such as a list of interactive mobile advertisements and surveys, and displays it to the user. The user can select one of them, advance to others subsequently, and exit the screen saver whenever the user wants to. In one embodiment, the screen saver 309 is provided references to interactive media as a list to be displayed, wherein the list is compiled by a server based on the user's preferences, subscriptions to interactive media, user profile (comprising the user's interests, hobbies, employment, residential location, etc.) or a combination thereof.



FIG. 3B is an exemplary screen of an interactive media client component 359 on a mobile device 357 wherein an interactive advertisement 367 is displayed, that has been selected from a queue of advertisements 363. Using a Next button 361 on the screen 359, a user can advance to the next interactive media on the current queue AdsQueue 363. The screen of the client component 359 can display interactive media from different queues when a user changes the current queue or selects a queue from a list of queues (shown in FIG. 3A). Each user has at least one queue of waiting interactive media at a server (not shown) that is accessible by the interactive media client component 359. A user can set the priority of interactive content from each queue, or a priority for content from more than one queue, or prioritize queues. A user can create a profile of the user's interests, hobbies, employment, etc. that is incorporated for prioritizing interactive media selected and presented to the user.



FIG. 3C is an exemplary screen/window 379 on a mobile device 377 that is used by a user to set user preferences, specifically a selection of categories of interactive media to be delivered to the user, with a priority assigned to each. The user preferences selected/provided by a user are communicated to a server that stores them and employs them to send interactive media to the user. The server selects/provides interactive media to the user satisfying user specified needs and preferences from the available interactive media, which is supplied by vendors of products, advertisers of products, services or suppliers of information or products, etc. The screen 379 makes it possible for a user to edit 373 the user preferences and save 371 updated preferences.



FIG. 4 is a perspective block diagram of the interactive media management tree 407 of information, a logical organization of interactive media, by a server in the system that facilitates creation and distribution of interactive media. The interactive media management tree 407 comprises several categories of interactive media, each category assigned a queue, such as a queue 1 for sports related interactive media 409, which in turn comprises news items 421, advertisements 423 and surveys 425. Similarly, technology related interactive media is assigned a queue 2 411 that can be used to store and distribute interactive media such as interactive demos 431, interactive coupons 433, interactive invitations for conferences 435, interactive advertisements 437, etc. Interactive advertisements 437 can be interactive service offers 441, interactive game demos 443, interactive software screenshots 445 (such as for applications), flash demos of products 447, etc.



FIG. 5 is an interaction diagram that depicts an exemplary interaction between a recipient device 507 used to respond to the interactive media, such as advertisements and questionnaires, wherein the recipient device (a PC, notebook, PDA or laptop 507) is used by a user to access/retrieve interactive media from one or more distribution servers 509. The recipient device 507 provides user preferences, such as categories of interactive media of interest to the user, to the distribution server 509, based upon which the distribution server 509 sends a push notification to the recipient device 507. After receiving the push notification, a user can initiate access of (one or more) interactive media. The distribution server sends one or more interactive media to the recipient device for review by the user. Additional information related to/associated with the interactive media, such as details of products and services associated with interactive advertisements, may be requested by a user, and the distribution server 509 sends them to the recipient device 507 for review by the user. The interactive media is typically created using an interactive content creation tool 511 that is communicatively coupled to the distribution server 509. The interactive content creation tool 511, or another external server, such as a billing server, can be the recipient of tracking information and reports sent by the distribution server 509.



FIG. 6 is a perspective block diagram of a mobile device 611 capable of receiving and playing/rendering interactive media and monitoring its usage. The mobile device 611 comprises an interactive media client 613 that in turn comprises a usage monitoring component 619. The mobile device 611 also comprises authentication circuitry 615, audio and video playback circuitry 617, processing circuitry 621, communication circuitry 623 and an interactive media storage 625.



FIG. 7 is a flow chart of the operation of a distribution server as it receives interactive media from a provider and eventually communicates it to users of mobile devices. At a start block 707, the operation starts when the server receives interactive media from a provider of interactive media. At a next block 709, the server determines who the recipients should be for the interactive media, based on available user preferences and metadata of the interactive media delivered by the provider. For example, the metadata comprises a category identification, a target profile describing a likely profile of recipients expected to be interested in the interactive media, security information such as credentials of the provider and authentication information, a digital signature of the interactive media for integrity checking, etc.
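The integrity check and recipient selection of block 709 can be sketched as follows. The HMAC scheme and shared key are assumptions for illustration; the patent specifies only that the metadata carries provider credentials and a digital signature of the media.

```python
import hashlib
import hmac

# Hypothetical shared secret between provider and server (the signing
# scheme itself is not specified by the flow chart).
PROVIDER_KEY = b"provider-shared-secret"

def sign(media_bytes):
    return hmac.new(PROVIDER_KEY, media_bytes, hashlib.sha256).hexdigest()

def accept_media(media_bytes, metadata, user_profiles):
    """Verify the media's signature, then select recipients whose stored
    preferences include the media's category (block 709)."""
    if not hmac.compare_digest(metadata["signature"], sign(media_bytes)):
        return None  # reject media that fails the integrity check
    return [uid for uid, prefs in user_profiles.items()
            if metadata["category"] in prefs]

media = b"<interactive ad>"
meta = {"category": "technology", "signature": sign(media)}
recipients = accept_media(media, meta, {"u1": {"technology"}, "u2": {"sports"}})
```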


Then, at a next block 711, the server adds the received and authenticated (and integrity checked) interactive media (or a reference thereto) to the queues of users who are determined to be targets for delivery. Then, at a next block 713, the server sends a notification to each user's mobile device to notify the user of the availability of the interactive media in the queue. In one embodiment, the server creates a list of references to the interactive media that is available, and sends it to the mobile device to be shown in a queue/list (or more than one queue/list) from which the user can select.
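Blocks 711 and 713 amount to enqueuing a media reference per target user and returning the per-user reference list sent in the notification. This is a minimal sketch; the function and identifier names are hypothetical.

```python
def enqueue_and_notify(queues, media_id, target_users):
    """Add a reference to the authenticated media to each target user's
    queue (block 711) and build the per-user notification list that is
    sent to the mobile device (block 713)."""
    for uid in target_users:
        queues.setdefault(uid, []).append(media_id)
    # Each user receives the current contents of his or her queue.
    return {uid: list(queues[uid]) for uid in target_users}

queues = {}
notices = enqueue_and_notify(queues, "ad-7", ["u1", "u2"])
enqueue_and_notify(queues, "survey-3", ["u1"])
```

Storing only references in the queue defers the media download to the moment the user actually selects an item, as in the blocks that follow.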


Then, at a next decision block 715, the user decides whether to view the interactive media, and either selects it for viewing or terminates viewing. In one embodiment, the user of the recipient mobile device browses through the list of available interactive media and selects one of them for viewing. If the user decides to view one of the items listed, control passes to a next block 719; otherwise, processing terminates at a next block 717.


If, at the decision block 715, the user on the mobile device decides to view an interactive media (either from a list presented, from a notification received for interactive media, or otherwise), then at a next block 719, the interactive media client component downloads the interactive media. Then, at a next block 721, the interactive media is displayed to enable viewing by the user, while the interactive media client component monitors usage by the user. Finally, at a next block 723, the viewing of the interactive media by the user is reported to the server by the client, and optionally to a billing server or external server (such as one associated with the provider) by the server. Then, control loops back to the decision block 715, where the user is provided an opportunity to view additional interactive media that may be available.
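The client-side loop of blocks 715 through 723 can be sketched as a function that repeatedly lets the user pick an item, downloads and displays it, and records a usage report per viewing. The callback-based structure is an illustrative assumption, not the claimed client design.

```python
def view_loop(available, choose, download, display):
    """Sketch of blocks 715-723: select (715), download (719), display
    and monitor (721), and report each viewing (723) until the user quits."""
    reports = []
    while True:
        media_id = choose(available)  # returning None ends viewing (block 717)
        if media_id is None:
            break
        content = download(media_id)                         # block 719
        display(content)                                     # block 721
        reports.append({"media": media_id, "viewed": True})  # block 723
    return reports

shown = []
picks = iter(["ad-1", "ad-2", None])  # simulated user selections
reports = view_loop(
    available=["ad-1", "ad-2"],
    choose=lambda items: next(picks),
    download=lambda mid: f"content:{mid}",
    display=shown.append,
)
```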



FIG. 8 is a flow chart of an exemplary operation of the server capable of distributing interactive media. Processing starts at a start block 805. Then, at a next block 807, the server receives interactive media and a recipient list from a provider of interactive media. Then, at a next block 809, the server processes the received interactive media and recipient list and stores them. Then, at a next block 811, the server notifies the recipients from the list about the availability of the interactive media. It can also communicate a reference to the interactive media to the mobile device as part of the notification. Then, at a next block 813, the server determines the device capabilities of the recipient devices, the users' preferences, etc., and tailors the interactive media to each recipient's mobile device when it is requested from that recipient mobile device. Then, at a next block 815, the server provides the interactive media to each of the recipient devices based on device capabilities. Then, at a next block 817, the server optionally receives usage information from the mobile devices and sends it (after optional collation) to a provider's server (or a billing server). Finally, processing terminates at an end block 821.
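The capability-based tailoring of blocks 813 and 815 might look like the following sketch, which adapts one piece of media to a device's screen width and supported formats. The capability fields and fallback rule are hypothetical; the flow chart states only that media is tailored to device capabilities.

```python
def tailor(media, capabilities):
    """Adapt interactive media to one device (blocks 813/815):
    shrink oversized imagery and fall back to a simpler rendition
    when the device lacks the media's native format."""
    out = dict(media)
    # Downscale images to fit the device's screen width.
    out["image_width"] = min(media["image_width"], capabilities["screen_width"])
    # Fall back to a text rendition when rich media is unsupported.
    if media["format"] not in capabilities["formats"]:
        out["format"] = "text"
    return out

ad = {"image_width": 640, "format": "flash"}
basic_phone = {"screen_width": 176, "formats": {"text", "mp3"}}
smartphone = {"screen_width": 800, "formats": {"text", "flash"}}
```

Tailoring per request, rather than pre-rendering every variant, matches the flow chart's note that media is adapted "when requested from recipient mobile device."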


The terms “circuit” and “circuitry” as used herein may refer to an independent circuit or to a portion of a multifunctional circuit that performs multiple underlying functions. For example, depending on the embodiment, processing circuitry may be implemented as a single chip processor or as a plurality of processing chips. Likewise, a first circuit and a second circuit may be combined in one embodiment into a single circuit or, in another embodiment, operate independently perhaps in separate chips. The term “chip”, as used herein, refers to an integrated circuit. Circuits and circuitry may comprise general or specific purpose hardware, or may comprise such hardware and associated software such as firmware or object code.


The terms “audio preamble” and “voice preamble” as used herein may refer to a recorded voice input that a user records to provide a question/prompt in human language, and that may also incorporate responses in multiple choice format to aid selection by a recipient. The audio preamble may be captured by a mobile device in MP3 format, AMR format, WMA format, etc.


The term “audio-assisted questionnaire” as used herein may refer to a questionnaire comprising audio portions, such as audio preambles, audio supplementary information, audio descriptions of multiple choices, etc., that make it possible for a recipient to listen to most of the information of the questions in a questionnaire (employing human voices, in audible form) without having to read all of it on the small screen of a mobile device, and without requiring scrolling through textual descriptions on a limited/constrained device.
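One way to represent a single question of such a questionnaire is a small record pairing the audio preamble with per-choice audio clips. The class, field names, and clip filenames below are hypothetical illustrations of the definition above.

```python
from dataclasses import dataclass, field

@dataclass
class AudioQuestion:
    """One question of an audio-assisted questionnaire: an audio preamble
    (e.g. an MP3 or AMR clip) plus audio descriptions of each choice, so a
    recipient can answer without reading the device's small screen."""
    preamble_clip: str                            # e.g. "q1_preamble.mp3"
    choices: dict = field(default_factory=dict)   # choice label -> audio clip

    def playlist(self):
        # Order in which a client would play the clips: preamble first,
        # then the audio description of each choice.
        return [self.preamble_clip, *self.choices.values()]

q = AudioQuestion("q1_preamble.mp3",
                  {"yes": "q1_yes.amr", "no": "q1_no.amr"})
```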


As one of ordinary skill in the art will appreciate, the terms “operably coupled” and “communicatively coupled,” as may be used herein, include direct coupling and indirect coupling via another component, element, circuit, or module where, for indirect coupling, the intervening component, element, circuit, or module does not modify the information of a signal but may adjust its current level, voltage level, and/or power level. As one of ordinary skill in the art will also appreciate, inferred coupling (i.e., where one element is coupled to another element by inference) includes direct and indirect coupling between two elements in the same manner as “operably coupled” and “communicatively coupled.”


The present invention has also been described above with the aid of method steps illustrating the performance of specified functions and relationships thereof. The boundaries and sequence of these functional building blocks and method steps have been arbitrarily defined herein for convenience of description. Alternate boundaries and sequences can be defined so long as the specified functions and relationships are appropriately performed. Any such alternate boundaries or sequences are thus within the scope and spirit of the claimed invention.


The present invention has been described above with the aid of functional building blocks illustrating the performance of certain significant functions. The boundaries of these functional building blocks have been arbitrarily defined for convenience of description. Alternate boundaries could be defined as long as the certain significant functions are appropriately performed. Similarly, flow diagram blocks may also have been arbitrarily defined herein to illustrate certain significant functionality. To the extent used, the flow diagram block boundaries and sequence could have been defined otherwise and still perform the certain significant functionality. Such alternate definitions of both functional building blocks and flow diagram blocks and sequences are thus within the scope and spirit of the claimed invention.


One of average skill in the art will also recognize that the functional building blocks, and other illustrative blocks, modules and components herein, can be implemented as illustrated or by discrete components, application specific integrated circuits, processors executing appropriate software and the like or any combination thereof.


Moreover, although described in detail for purposes of clarity and understanding by way of the aforementioned embodiments, the present invention is not limited to such embodiments. It will be obvious to one of average skill in the art that various changes and modifications may be practiced within the spirit and scope of the invention, as limited only by the scope of the appended claims.

Claims
  • 1. A method comprising: identifying, based on a request from a client device, an interactive-software-application demonstration comprising multiple segments without certain graphics for selectable options; generating graphics representing selectable options in a customized size and a customized format for incorporation within the multiple segments of the interactive-software-application demonstration based on detected capabilities of the client device; providing, to the client device via a communication network for display within an interactive-client component, the interactive-software-application demonstration comprising the multiple segments including the graphics representing the selectable options in the customized size and the customized format; receiving a usage report from the client device indicating a user's interaction with one or more of the graphics representing the selectable options in the customized size and the customized format within the multiple segments of the interactive-software-application demonstration, wherein a usage-monitoring component on the client device generates the usage report indicating the user's interaction with the one or more of the graphics; based on receiving the usage report, identifying a survey question concerning the interactive-software-application demonstration; and providing, to the client device via the communication network, the survey question concerning the interactive-software-application demonstration.
  • 2. The method of claim 1, wherein the interactive-software-application demonstration comprises an interactive demonstration of a game.
  • 3. The method of claim 1, wherein the interactive-software-application demonstration comprises an interactive demonstration of a product advertisement.
  • 4. The method of claim 1, wherein the interactive-software-application demonstration comprises an interactive demonstration of screenshots from a software application.
  • 5. The method of claim 1, further comprising receiving, from the client device via the communication network, an indication of a response to the survey question concerning the interactive-software-application demonstration.
  • 6. The method of claim 5, wherein generating the graphics representing the selectable options in the customized size and the customized format for incorporation within the multiple segments of the interactive-software-application demonstration comprises generating a compacted version of the graphics for a first type of computing device corresponding to the client device in a more compacted format than a version of the graphics for a second type of computing device.
  • 7. The method of claim 1, wherein the survey question comprises a first selectable response option and a second selectable response option.
  • 8. The method of claim 7, further comprising receiving, from the client device via the communication network, an indication of a selection of the first selectable response option.
  • 9. The method of claim 8, further comprising: determining that the first selectable response option relates to an additional survey question concerning the interactive-software-application demonstration; and providing, to the client device via the communication network, the additional survey question.
  • 10. A system, comprising: at least one processor; and at least one non-transitory computer readable storage medium storing instructions thereon that, when executed by the at least one processor, cause the system to: identify, based on a request from a client device, an interactive-software-application demonstration comprising multiple segments without certain graphics for selectable options; generate graphics representing selectable options in a customized size and a customized format for incorporation within the multiple segments of the interactive-software-application demonstration based on detected capabilities of the client device; provide, to the client device via a communication network for display within an interactive-client component, the interactive-software-application demonstration comprising the multiple segments including the graphics representing the selectable options in the customized size and the customized format; receive a usage report from the client device indicating a user's interaction with one or more of the graphics representing the selectable options in the customized size and the customized format within the multiple segments of the interactive-software-application demonstration, wherein a usage-monitoring component on the client device generates the usage report indicating the user's interaction with the one or more of the graphics; based on receiving the usage report, identify a survey question concerning the interactive-software-application demonstration; and provide, to the client device via the communication network, the survey question concerning the interactive-software-application demonstration.
  • 11. The system of claim 10, wherein the interactive-software-application demonstration comprises an interactive demonstration of a game.
  • 12. The system of claim 10, wherein the interactive-software-application demonstration comprises an interactive demonstration of screenshots from a software application.
  • 13. The system of claim 10, further comprising instructions that, when executed by the at least one processor, cause the system to receive, from the client device via the communication network, an indication of a response to the survey question concerning the interactive-software-application demonstration.
  • 14. The system of claim 13, further comprising instructions that, when executed by the at least one processor, cause the system to determine not to provide a particular survey question concerning the interactive-software-application demonstration based on the response to the survey question.
  • 15. The system of claim 10, further comprising instructions that, when executed by the at least one processor, cause the system to: determine that the client device comprises the interactive-client component; and based on the determination that the client device comprises the interactive-client component, provide the interactive-software-application demonstration in a format compatible with the interactive-client component.
  • 16. The system of claim 10, further comprising instructions that, when executed by the at least one processor, cause the system to: determine that the client device does not comprise the interactive-client component; and based on the determination that the client device does not comprise the interactive-client component, provide the interactive-software-application demonstration in a format compatible with a web browser.
  • 17. A non-transitory computer readable storage media storing instructions thereon that, when executed by a processor, cause a computer system to: identify, based on a request from a client device, an interactive-software-application demonstration comprising multiple segments without certain graphics for selectable options; generate graphics representing selectable options in a customized size and a customized format for incorporation within the multiple segments of the interactive-software-application demonstration based on detected capabilities of the client device; provide, to the client device via a communication network for display within an interactive-client component, the interactive-software-application demonstration comprising the multiple segments including the graphics representing the selectable options in the customized size and the customized format; receive a usage report from the client device indicating a user's interaction with one or more of the graphics representing the selectable options in the customized size and the customized format within the multiple segments of the interactive-software-application demonstration, wherein a usage-monitoring component on the client device generates the usage report concerning the user's interaction with the one or more of the graphics; based on receiving the usage report, identify a survey question concerning the interactive-software-application demonstration; and provide, to the client device via the communication network, the survey question concerning the interactive-software-application demonstration.
  • 18. The non-transitory computer readable storage media of claim 17, wherein the survey question comprises multiple selectable response options.
  • 19. The non-transitory computer readable storage media of claim 18, further comprising instructions that, when executed by the processor, cause the computer system to receive an indication of a selection of a selectable response option from among the multiple selectable response options.
  • 20. The non-transitory computer readable storage media of claim 18, further comprising instructions that, when executed by the processor, cause the computer system to: determine that the selectable response option relates to an additional survey question concerning the interactive-software-application demonstration; and provide, to the client device via the communication network, the additional survey question.
CROSS REFERENCES TO RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 13/869,678, filed Apr. 24, 2013, which is a continuation of U.S. application Ser. No. 13/397,136, filed Feb. 15, 2012, now issued as U.S. Pat. No. 8,433,299, which is a continuation of U.S. application Ser. No. 11/888,100, filed Jul. 30, 2007, now issued as U.S. Pat. No. 8,131,270, which claims the benefit of and priority to U.S. Provisional Application No. 60/899,493, filed Feb. 5, 2007. Each of the aforementioned applications and patents is hereby incorporated by reference herein in its entirety. This patent application makes reference to U.S. patent application Ser. No. 10/985,702, entitled “QUESTIONNAIRE NETWORK FOR MOBILE HANDSETS,” filed on Nov. 10, 2004. The complete subject matter of the above-referenced United States Patent Application is hereby incorporated herein by reference, in its entirety. This patent application makes reference to U.S. Provisional Patent Application Ser. No. 60/530,175, entitled “QUESTIONNAIRE NETWORK FOR MOBILE HANDSETS AND A TRADING SYSTEM FOR CONTRACTS ON USER COMMITMENTS TO ANSWER QUESTIONNAIRES,” filed on Dec. 17, 2003. The complete subject matter of the above-referenced United States Provisional Patent Application is hereby incorporated herein by reference, in its entirety. This patent application makes reference to U.S. Provisional Patent Application Ser. No. 60/849,715, entitled “QUESTIONNAIRE CLIENT FOR MOBILE DEVICE,” filed on Oct. 4, 2006. The complete subject matter of the above-referenced United States Provisional Patent Application is hereby incorporated herein by reference, in its entirety. This patent application makes reference to U.S. Provisional Patent Application Ser. No. 60/850,084, entitled “MOBILE DEVICE FOR CREATING ADHOC QUESTIONNAIRE,” filed on Oct. 7, 2006. The complete subject matter of the above-referenced United States Provisional Patent Application is hereby incorporated herein by reference, in its entirety. This patent application makes reference to U.S. Provisional Patent Application Ser. No. 60/858,546, entitled “QUESTIONNAIRE SERVER CAPABLE OF PROVIDING QUESTIONNAIRES BASED ON DEVICE CAPABILITIES,” filed on Nov. 13, 2006. The complete subject matter of the above-referenced United States Provisional Patent Application is hereby incorporated herein by reference, in its entirety.

20080092181 Britt Apr 2008 A1
20080098071 Jones et al. Apr 2008 A1
20080107244 Setzer May 2008 A1
20080109278 Rao May 2008 A1
20080119133 Rao May 2008 A1
20080119167 Rao May 2008 A1
20080124687 Post May 2008 A1
20080126113 Manning et al. May 2008 A1
20080126193 Robinson May 2008 A1
20080126226 Popkiewicz et al. May 2008 A1
20080132252 Altman et al. Jun 2008 A1
20080139239 O'Connor Jun 2008 A1
20080159178 Syrjanen et al. Jul 2008 A1
20080163075 Beck Jul 2008 A1
20080167946 Bezos Jul 2008 A1
20080201731 Howcroft Aug 2008 A1
20080209491 Hasek Aug 2008 A1
20080214162 Ramer et al. Sep 2008 A1
20080221986 Soicher et al. Sep 2008 A1
20080222046 McIsaac Sep 2008 A1
20080227076 Johnson Sep 2008 A1
20080261524 Grushkevich Oct 2008 A1
20080261625 Hughes Oct 2008 A1
20080267155 Aragones et al. Oct 2008 A1
20080269636 Burrows et al. Oct 2008 A1
20080281687 Hurwitz et al. Nov 2008 A1
20080281711 Bridges et al. Nov 2008 A1
20080288276 Harris Nov 2008 A1
20080294760 Sampson et al. Nov 2008 A1
20080299953 Rao Dec 2008 A1
20080301231 Mehta et al. Dec 2008 A1
20090011748 Hotta Jan 2009 A1
20090037265 Moona Feb 2009 A1
20090063379 Kelly Mar 2009 A1
20090076882 Mei et al. Mar 2009 A1
20090117845 Rao May 2009 A1
20090119693 Higgins et al. May 2009 A1
20090119700 Sansom May 2009 A1
20090125510 Graham et al. May 2009 A1
20090176522 Kowalewski et al. Jul 2009 A1
20090187814 Raff Jul 2009 A1
20090210347 Sarcanin Aug 2009 A1
20090240569 Ramer et al. Sep 2009 A1
20090254851 Scott et al. Oct 2009 A1
20090259552 Chenard et al. Oct 2009 A1
20090320077 Gazdzinski Dec 2009 A1
20100036970 Sidi et al. Feb 2010 A1
20100094878 Soroca et al. Apr 2010 A1
20100125498 Jaramillo May 2010 A1
20100128666 Masson et al. May 2010 A1
20100262923 Citrin et al. Oct 2010 A1
20100324971 Morsberger Dec 2010 A1
20110041077 Reiner Feb 2011 A1
20110113090 Peeri May 2011 A1
20110125838 Rao May 2011 A1
20110154397 Macrae et al. Jun 2011 A1
20110178877 Swix et al. Jul 2011 A1
20110197236 Rao Aug 2011 A1
20110265116 Stern et al. Oct 2011 A1
20120022905 Meyer et al. Jan 2012 A1
20120028230 Devereux Feb 2012 A1
20120060184 Nguyen et al. Mar 2012 A1
20120079525 Ellis et al. Mar 2012 A1
20120164937 Rao Jun 2012 A1
20120233644 Rao Sep 2012 A1
20120240146 Rao Sep 2012 A1
20120265613 Ramer et al. Oct 2012 A1
20120278823 Rogers et al. Nov 2012 A1
20120284324 Jarville et al. Nov 2012 A1
20120297311 Duggal Nov 2012 A1
20130096985 Robinson et al. Apr 2013 A1
20130238445 Rao Sep 2013 A1
20140038159 Rao Feb 2014 A1
20140120868 Rao May 2014 A1
20150381759 Rao Dec 2015 A1
20180337973 Rao Nov 2018 A1
20180375917 Rao Dec 2018 A1
Foreign Referenced Citations (2)
Number Date Country
2031498 Mar 2009 EP
WO 06051858 May 2006 WO
Non-Patent Literature Citations (173)
Entry
Fritzsche, David J., “Building Tutorials Using Wink”, 2005, Developments in Business Simulations and Experiential Learning, vol. 32 (Year: 2005).
“Gif_Text: A Graphics Text Generator”. archived on Oct. 11, 2006 at: http://web.archive.org/web/20061011024339/www.srehttp.org/apps/gif_text/mkbutton.htm (Year: 2006).
“Textplot: Display text information in a graphics plot” crawled on Aug. 11, 2003: https://webcache.googleusercontent.com/search?q=cache:hfcfGd1ZfCwJ:https://rdrr.io/cran/gplots/man/textplot.html+&cd=18&hl=en&ct=clnk&gl=us (Year: 2003).
WebdesignerDepot Staff (The Evolution of Cell Phone Design Between 1983-2009, May 2009).
U.S. Appl. No. 13/412,574, May 5, 2017, Notice of Allowance.
U.S. Appl. No. 13/869,678, May 18, 2017, Office Action.
U.S. Appl. No. 14/985,353, Apr. 14, 2017, Office Action.
U.S. Appl. No. 13/869,678, Oct. 5, 2017, Office Action.
U.S. Appl. No. 14/985,355, Jul. 12, 2017, Office Action.
U.S. Appl. No. 13/412,574, Apr. 21, 2016, Office Action.
U.S. Appl. No. 13/869,678, Apr. 8, 2016, Office Action.
U.S. Appl. No. 14/985,353, Apr. 1, 2016, Office Action.
U.S. Appl. No. 14/985,330, Dec. 30, 2015, Rao.
U.S. Appl. No. 14/985,334, Dec. 30, 2015, Rao.
U.S. Appl. No. 14/985,340, Dec. 30, 2015, Rao.
U.S. Appl. No. 14/985,342, Dec. 30, 2015, Rao.
U.S. Appl. No. 14/985,344, Dec. 30, 2015, Rao.
U.S. Appl. No. 14/985,351, Dec. 30, 2015, Rao.
U.S. Appl. No. 14/985,352, Dec. 30, 2015, Rao.
U.S. Appl. No. 14/985,353, Dec. 30, 2015, Rao.
U.S. Appl. No. 14/985,355, Dec. 30, 2015, Rao.
U.S. Appl. No. 10/985,702, Oct. 4, 2007, Office Action.
U.S. Appl. No. 10/985,702, Apr. 28, 2008, Office Action.
U.S. Appl. No. 10/985,702, Sep. 11, 2008, Office Action.
U.S. Appl. No. 10/985,702, Apr. 28, 2009, Office Action.
U.S. Appl. No. 10/985,702, Dec. 8, 2009, Office Action.
U.S. Appl. No. 10/985,702, Aug. 6, 2010, Office Action.
U.S. Appl. No. 11/010,985, Nov. 22, 2006, Office Action.
U.S. Appl. No. 11/010,985, May 18, 2007, Notice of Allowance.
U.S. Appl. No. 11/807,670, Dec. 22, 2009, Office Action.
U.S. Appl. No. 11/807,670, Sep. 7, 2010, Office Action.
U.S. Appl. No. 11/807,670, May 27, 2011, Office Action.
U.S. Appl. No. 11/807,670, Jan. 11, 2012, Office Action.
U.S. Appl. No. 11/807,670, May 17, 2012, Notice of Allowance.
U.S. Appl. No. 11/807,672, Jul. 9, 2009, Office Action.
U.S. Appl. No. 11/807,672, Jul. 29, 2010, Office Action.
U.S. Appl. No. 11/807,672, Apr. 27, 2011, Office Action.
U.S. Appl. No. 11/807,672, Mar. 20, 2012, Notice of Allowance.
U.S. Appl. No. 11/810,597, Jan. 28, 2010, Office Action.
U.S. Appl. No. 11/810,597, Oct. 13, 2010, Office Action.
U.S. Appl. No. 11/810,597, May 16, 2011, Office Action.
U.S. Appl. No. 11/810,597, Oct. 21, 2011, Office Action.
U.S. Appl. No. 11/810,597, Apr. 5, 2012, Office Action.
U.S. Appl. No. 11/810,597, Sep. 25, 2012, Office Action.
U.S. Appl. No. 11/821,771, Nov. 26, 2010, Office Action.
U.S. Appl. No. 11/821,771, Jun. 29, 2011, Office Action.
U.S. Appl. No. 11/821,771, Dec. 14, 2011, Notice of Allowance.
U.S. Appl. No. 11/823,006, Nov. 28, 2011, Office Action.
U.S. Appl. No. 11/823,006, Apr. 11, 2012, Office Action.
U.S. Appl. No. 11/823,006, Jun. 3, 2013, Office Action.
U.S. Appl. No. 11/823,006, Mar. 10, 2014, Office Action.
U.S. Appl. No. 11/881,195, Sep. 28, 2010, Office Action.
U.S. Appl. No. 11/881,195, Jun. 9, 2011, Office Action.
U.S. Appl. No. 11/881,195, May 21, 2012, Office Action.
U.S. Appl. No. 11/881,195, Oct. 18, 2012, Office Action.
U.S. Appl. No. 11/881,195, Jul. 18, 2013, Office Action.
U.S. Appl. No. 11/881,195, Dec. 11, 2013, Notice of Allowance.
U.S. Appl. No. 11/888,100, Aug. 4, 2010, Office Action.
U.S. Appl. No. 11/888,100, May 27, 2011, Office Action.
U.S. Appl. No. 11/888,100, Dec. 19, 2011, Notice of Allowance.
U.S. Appl. No. 11/891,193, Sep. 2, 2010, Office Action.
U.S. Appl. No. 11/891,193, May 16, 2011, Office Action.
U.S. Appl. No. 11/891,193, Jan. 27, 2012, Office Action.
U.S. Appl. No. 11/891,193, Apr. 13, 2012, Notice of Allowance.
U.S. Appl. No. 11/891,193, Jan. 4, 2013, Notice of Allowance.
U.S. Appl. No. 11/897,183, Oct. 5, 2010, Office Action.
U.S. Appl. No. 11/897,183, Mar. 15, 2011, Office Action.
U.S. Appl. No. 11/897,183, Dec. 16, 2011, Office Action.
U.S. Appl. No. 11/897,183, Jul. 2, 2012, Notice of Allowance.
U.S. Appl. No. 11/897,183, Oct. 16, 2012, Notice of Allowance.
U.S. Appl. No. 11/977,763, Aug. 4, 2010, Office Action.
U.S. Appl. No. 11/977,763, Apr. 4, 2011, Notice of Allowance.
U.S. Appl. No. 11/977,764, Sep. 2, 2010, Office Action.
U.S. Appl. No. 11/977,764, Feb. 22, 2011, Notice of Allowance.
U.S. Appl. No. 11/978,851, Feb. 24, 2011, Office Action.
U.S. Appl. No. 11/978,851, Nov. 2, 2011, Office Action.
U.S. Appl. No. 11/978,851, Jun. 18, 2012, Notice of Allowance.
U.S. Appl. No. 12/011,238, Jul. 8, 2010, Office Action.
U.S. Appl. No. 12/011,238, Feb. 9, 2011, Office Action.
U.S. Appl. No. 12/011,238, Sep. 14, 2011, Office Action.
U.S. Appl. No. 12/011,238, Aug. 14, 2012, Office Action.
U.S. Appl. No. 12/011,238, Feb. 27, 2013, Office Action.
U.S. Appl. No. 12/011,238, Sep. 19, 2013, Office Action.
U.S. Appl. No. 13/017,024, Nov. 21, 2012, Office Action.
U.S. Appl. No. 13/075,144, Aug. 25, 2011, Notice of Allowance.
U.S. Appl. No. 13/075,882, Mar. 25, 2013, Office Action.
U.S. Appl. No. 13/075,882, Oct. 8, 2013, Office Action.
U.S. Appl. No. 13/075,882, Oct. 17, 2014, Notice of Allowance.
U.S. Appl. No. 13/093,733, Sep. 14, 2011, Office Action.
U.S. Appl. No. 13/093,733, Jan. 26, 2012, Office Action.
U.S. Appl. No. 13/093,733, Mar. 19, 2012, Notice of Allowance.
U.S. Appl. No. 13/237,625, Oct. 15, 2012, Office Action.
U.S. Appl. No. 13/237,625, Nov. 30, 2012, Notice of Allowance.
U.S. Appl. No. 13/354,811, May 9, 2013, Notice of Allowance.
U.S. Appl. No. 13/397,136, Jun. 4, 2012, Office Action.
U.S. Appl. No. 13/397,136, Jan. 24, 2013, Notice of Allowance.
U.S. Appl. No. 13/402,880, Sep. 10, 2012, Office Action.
U.S. Appl. No. 13/402,880, Apr. 18, 2013, Office Action.
U.S. Appl. No. 13/402,880, Jun. 25, 2013, Notice of Allowance.
U.S. Appl. No. 13/412,574, Dec. 20, 2012, Office Action.
U.S. Appl. No. 13/412,574, Aug. 15, 2013, Office Action.
U.S. Appl. No. 13/412,574, Jan. 5, 2015, Office Action.
U.S. Appl. No. 13/412,574, Jul. 15, 2015, Office Action.
U.S. Appl. No. 13/473,603, Nov. 7, 2013, Office Action.
U.S. Appl. No. 13/473,606, May 30, 2014, Office Action.
U.S. Appl. No. 13/473,606, Aug. 21, 2014, Office Action.
U.S. Appl. No. 13/484,605, Oct. 11, 2012, Office Action.
U.S. Appl. No. 13/484,605, Jun. 25, 2013, Notice of Allowance.
U.S. Appl. No. 13/554,619, Mar. 12, 2013, Office Action.
U.S. Appl. No. 13/554,619, Jun. 13, 2013, Notice of Allowance.
U.S. Appl. No. 13/554,685, Dec. 29, 2014, Office Action.
U.S. Appl. No. 13/554,685, Feb. 26, 2015, Notice of Allowance.
U.S. Appl. No. 13/869,678, Oct. 20, 2015, Office Action.
U.S. Appl. No. 13/902,839, Oct. 30, 2013, Office Action.
U.S. Appl. No. 13/902,839, Feb. 4, 2014, Notice of Allowance.
U.S. Appl. No. 13/908,447, Jun. 24, 2015, Notice of Allowance.
U.S. Appl. No. 14/047,015, Nov. 30, 2015, Office Action.
U.S. Appl. No. 14/059,878, May 20, 2015, Notice of Allowance.
U.S. Appl. No. 14/059,878, Jun. 24, 2015, Notice of Allowance.
U.S. Appl. No. 14/147,600, Apr. 21, 2015, Office Action.
U.S. Appl. No. 14/147,600, Nov. 4, 2015, Notice of Allowance.
U.S. Appl. No. 11/891,193, Feb. 27, 2012, Office Action.
U.S. Appl. No. 13/017,024, Nov. 11, 2012, Office Action.
U.S. Appl. No. 13/075,882, Oct. 14, 2014, Notice of Allowance.
U.S. Appl. No. 14/047,015, Nov. 11, 2015, Office Action.
U.S. Appl. No. 14/985,353, Aug. 19, 2016, Office Action.
U.S. Appl. No. 14/985,353, Nov. 16, 2016, Office Action.
Internal Revenue Service, Department of the Treasury, 1040 Instruction, 2004, entire document.
U.S. Appl. No. 13/412,574, Dec. 12, 2016, Office Action.
U.S. Appl. No. 14/985,355, Feb. 7, 2017, Office Action.
U.S. Appl. No. 09/806,544, titled: “Conversational browser and conversational systems,” dated Jul. 2, 2001.
U.S. Appl. No. 61/471,991, titled “Tangible Anchoring System for Broadcast/Webcast Studios,” dated Apr. 5, 2011.
U.S. Appl. No. 13/869,678, Apr. 5, 2018, Office Action.
U.S. Appl. No. 14/985,355, Jun. 20, 2018, Office Action.
U.S. Appl. No. 14/985,330, Apr. 5, 2018, Office Action.
U.S. Appl. No. 14/985,334, May 3, 2018, Office Action.
U.S. Appl. No. 14/985,351, Apr. 16, 2018, Office Action.
U.S. Appl. No. 14/985,352, Aug. 28, 2018, Office Action.
U.S. Appl. No. 14/985,355, Dec. 28, 2017, Office Action.
U.S. Appl. No. 14/985,330, Nov. 17, 2017, Office Action.
U.S. Appl. No. 14/985,334, Nov. 17, 2017, Office Action.
U.S. Appl. No. 14/985,355, Nov. 16, 2018, Office Action.
U.S. Appl. No. 14/985,344, Aug. 3, 2018, Office Action.
U.S. Appl. No. 16/051,295, Oct. 4, 2018, Office Action.
U.S. Appl. No. 16/051,306, Oct. 4, 2018, Office Action.
U.S. Appl. No. 14/985,351, Nov. 29, 2018, Office Action.
U.S. Appl. No. 14/985,352, Feb. 25, 2019, Office Action.
U.S. Appl. No. 14/985,344, Mar. 8, 2019, Office Action.
U.S. Appl. No. 16/051,295, Feb. 21, 2019, Office Action.
U.S. Appl. No. 16/051,306, Feb. 21, 2019, Office Action.
Boudreaux, Toby, 2006, Deconcept.com, "SWFObject: Javascript Flash Player detection and embed script" archived on Jun. 13, 2006 at http://web.archive.org/web/20060613143233/http://blog.deconcept.com/swfobject/ (Year: 2006).
Developer's Home, 2006, developershome.com, "Using UAProf (User Agent Profile) to Detect User Agent Types and Device Capabilities", archived on Oct. 18, 2006 at http://web.archive.org/web/20061018093124/http://www.developershome.com/wap/detection/detection.asp?page=uaprof (Year: 2006).
U.S. Appl. No. 13/869,678, May 28, 2020, Notice of Allowance.
U.S. Appl. No. 14/985,352, Apr. 3, 2020, Notice of Allowance.
U.S. Appl. No. 16/051,306, Mar. 17, 2020, Notice of Allowance.
U.S. Appl. No. 16/455,555, Jun. 24, 2020, Office Action.
U.S. Appl. No. 14/985,342, Apr. 3, 2020, Office Action.
U.S. Appl. No. 15/931,334, Jun. 19, 2020, Office Action.
U.S. Appl. No. 14/985,352, Jun. 19, 2019, Office Action.
U.S. Appl. No. 14/985,352, Sep. 18, 2019, Office Action.
U.S. Appl. No. 14/985,344, Sep. 18, 2019, Office Action.
U.S. Appl. No. 16/051,295, May 30, 2019, Office Action.
U.S. Appl. No. 16/051,295, Nov. 6, 2019, Office Action.
U.S. Appl. No. 16/051,306, Sep. 18, 2019, Office Action.
U.S. Appl. No. 14/985,340, Oct. 4, 2019, Office Action.
U.S. Appl. No. 16/455,555, Oct. 30, 2019, Office Action.
U.S. Appl. No. 14/985,342, Oct. 31, 2019, Office Action.
U.S. Appl. No. 14/985,344, Jan. 9, 2020, Notice of Allowance.
U.S. Appl. No. 16/051,295, Feb. 20, 2020, Notice of Allowance.
U.S. Appl. No. 16/051,306, Dec. 11, 2019, Office Action.
U.S. Appl. No. 16/051,306, Feb. 19, 2020, Notice of Allowance.
U.S. Appl. No. 14/985,340, dated Sep. 21, 2020, Office Action.
U.S. Appl. No. 14/985,342, dated Aug. 26, 2020, Notice of Allowance.
Related Publications (1)
Number Date Country
20160110742 A1 Apr 2016 US
Provisional Applications (1)
Number Date Country
60899493 Feb 2007 US
Continuations (3)
Number Date Country
Parent 13869678 Apr 2013 US
Child 14985336 US
Parent 13397136 Feb 2012 US
Child 13869678 US
Parent 11888100 Jul 2007 US
Child 13397136 US