Providing user communication

Information

  • Patent Grant
  • 9071729
  • Patent Number
    9,071,729
  • Date Filed
    Tuesday, January 9, 2007
  • Date Issued
    Tuesday, June 30, 2015
Abstract
Systems and methods are disclosed for providing user communication. First, an invitation input may be received from a first user. The invitation input may comprise a recommendation to a second user to take an action regarding content to be delivered over a content delivery system. Then, the invitation input may be transmitted to the second user. Next, an acceptance input may be received from the second user. The acceptance input may comprise an acceptance to the invitation input. The acceptance input may then be transmitted to the first user in response to receiving the acceptance input.
Description
BACKGROUND

Service providers may deliver content to a user over a content delivery system. For example, conventional content delivery systems distribute content to a first user and a second user independently. In other words, the first user may watch a sports program while the second user may simultaneously watch a video-on-demand program. Independent content use, however, does not create a sense of shared, social entertainment for the users. Consequently, the first user may be socially detached and isolated from the second user. Stated another way, the conventional content delivery system may present an impersonal and unsocial user experience.


SUMMARY OF THE INVENTION

Consistent with embodiments of the present invention, systems and methods are disclosed for providing user communication. First, an invitation input may be received from a first user. The invitation input may comprise a recommendation to a second user to take an action regarding content to be delivered over a content delivery system. Then, the invitation input may be transmitted to the second user. Next, an acceptance input may be received from the second user. The acceptance input may comprise an acceptance to the invitation input. The acceptance input may then be transmitted to the first user in response to receiving the acceptance input.


Both the foregoing general description and the following detailed description are exemplary and explanatory only, and should not be considered to restrict the invention's scope, as described and claimed. Further, features and/or variations may be provided in addition to those set forth herein. For example, embodiments of the invention may be directed to various feature combinations and sub-combinations described in the detailed description.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate various embodiments of the present invention. In the drawings:



FIG. 1 is a block diagram of an operating environment including a content delivery system;



FIG. 2 is a block diagram of the communications processor;



FIG. 3 is a flow chart of a method for providing user communication; and



FIG. 4 is a flow chart of another method for providing user communication.





DETAILED DESCRIPTION

The following detailed description refers to the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the following description to refer to the same or similar elements. While embodiments of the invention may be described, modifications, adaptations, and other implementations are possible. For example, substitutions, additions, or modifications may be made to the elements illustrated in the drawings, and the methods described herein may be modified by substituting, reordering, or adding stages to the disclosed methods. Accordingly, the following detailed description does not limit the invention. Instead, the proper scope of the invention is defined by the appended claims.


Service providers may deliver content to users over a content delivery system independently. For example, a first user may receive a sports program from the content delivery system while a second user may simultaneously receive a video-on-demand program from the content delivery system. Independent content use by the users, however, does not create a shared, social entertainment experience. Consistent with embodiments of the invention, an invitation input may be received from the first user. The invitation input may comprise a recommendation to the second user to take an action regarding content to be delivered over the content delivery system. Then, the invitation input may be transmitted to the second user. Next, an acceptance input may be received from the second user. The acceptance input may comprise an acceptance to the invitation input. The acceptance input may then be transmitted to the first user in response to receiving the acceptance input. Consequently, the first user may be socially engaged with the second user, providing a personalized experience for both the first and second users.



FIG. 1 is a block diagram of a content delivery system 100. Consistent with embodiments of the present invention, system 100 may comprise an edge network 110, an edge quadrature amplitude modulation (QAM) device 115, a video-on-demand (VOD) server 120, a communications processor 125, a broadcast server 130, a modular cable modem termination system (M-CMTS) core 135, and a core network 140. In addition, system 100 may comprise a hybrid fiber-coax (HFC) network 145, a set-top-box (STB) 150, a television (TV) 155, a cable modem (CM) 160, a portable device 165, a personal computer (PC) 170, and a STB control device 175. Communications processor 125 will be discussed in greater detail below with respect to FIG. 2.


Edge network 110 may comprise a network providing, for example, full-duplex, two-way broadband services including broadband video and audio, cable television services, or telecommunications services. Edge network 110 may provide data by utilizing network data formats including, for example, i) Internet protocol (IP); ii) Ethernet; iii) digital subscriber line (DSL); iv) asynchronous transfer mode (ATM); and v) virtual private network (VPN). Edge network 110 may utilize managed network services. Edge network 110 may comprise various components including, for example, i) servers; ii) switches; iii) routers; iv) gateways; v) hubs; vi) fiber optic cable; vii) copper cable; and viii) terminations. The aforementioned are examples and edge network 110 may comprise other configurations for broadband service delivery and data switching over system 100.


Edge QAM 115 may provide modulation for various encoding formats (e.g. for data, audio, and video) and may distribute the signal down multiple broadband channels. Edge QAM 115 may modulate signals in, for example, multi-channel quadrature amplitude modulation. Edge QAM 115 may support broadcast and narrowcast with multi-program transport stream (MPTS) pass-through and single-program transport stream (SPTS) to MPTS multiplexing. Edge QAM 115 may meet data-over-cable service interface specification (DOCSIS) and downstream radio frequency interface (DRFI) performance specifications. Furthermore, edge QAM 115 may provide video over internet protocol and moving pictures expert group (MPEG) video simultaneously. Edge QAM 115 may provide various data switching functions and enable two-way, full-duplex communication within the broadband network. Edge QAM 115 may modulate and distribute broadcast multimedia services including, for example, i) a broadcast multi-media service; ii) a high-definition multimedia service; iii) a digital television multimedia service; iv) an analog multimedia service; v) a VOD service; vi) a streaming video service; vii) a multimedia messaging service; viii) a voice-over-internet protocol service (VoIP); ix) an interactive multimedia service; and x) an e-mail service. The aforementioned are examples and edge QAM 115 may comprise other configurations for different broadband and data services.


VOD server 120 may perform processes for providing video entertainment on demand. VOD server 120 may take MPEG compressed video off a hard disk or a networked service, format it into MPEG-TS packets inside a user datagram protocol (UDP) packet, and send it into edge network 110. Edge QAM 115 may receive the UDP packets, where Internet protocol (IP) encapsulation may be removed. The MPEG packets may be forwarded down one QAM channel on edge QAM 115 and onto HFC network 145.
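
The packetization described above can be pictured with a short sketch. This is illustrative only: the 188-byte MPEG-TS packet size and the grouping of seven TS packets per UDP datagram are common conventions assumed here (the specification does not fix them), and the edge address is a made-up placeholder.

    import socket

    TS_PACKET_SIZE = 188          # standard MPEG-TS packet length
    TS_PACKETS_PER_DATAGRAM = 7   # common grouping; assumed, not specified above

    def stream_ts_over_udp(ts_stream, edge_addr=("192.0.2.10", 5000)):
        """Read MPEG-TS packets from a file-like source and send them toward
        edge network 110 inside UDP datagrams (sketch only)."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        buffer = b""
        while True:
            packet = ts_stream.read(TS_PACKET_SIZE)
            if len(packet) < TS_PACKET_SIZE:
                break  # end of the content
            buffer += packet
            if len(buffer) == TS_PACKET_SIZE * TS_PACKETS_PER_DATAGRAM:
                sock.sendto(buffer, edge_addr)  # IP encapsulation is removed at edge QAM 115
                buffer = b""
        if buffer:
            sock.sendto(buffer, edge_addr)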


Broadcast server 130 may perform processes for providing broadcast services. Broadcast server 130 may use a broadcast signal and a narrowcast signal to deliver broadcast services to a broadcast system. Broadcast server 130 may receive video, audio, and data from fiber optic input, wireless input, recorded tape, recorded digital video disc, or satellite input. Broadcast server 130 may utilize digital signal formats and analog signal formats. Furthermore, broadcast server 130 may comprise a specialized receiver and data switching equipment for broadband distribution. In addition, broadcast server 130 may provide broadband multimedia services including, for example, i) the broadcast multi-media service; ii) the high-definition multimedia service; iii) the digital television multimedia service; iv) the analog multimedia service; v) the VOD service; vi) the streaming video service; vii) the multimedia messaging service; viii) the voice-over-internet protocol service (VoIP); ix) the interactive multimedia service; and x) the e-mail service. The aforementioned are examples and broadcast server 130 may comprise other components and systems for providing broadcast services in system 100.


M-CMTS core 135 may receive IP datagrams from core network 140. M-CMTS core 135 may then forward these IP datagrams to either a single QAM channel within edge QAM 115 with traditional DOCSIS encapsulation, or may forward the IP datagrams to multiple QAM channels within edge QAM 115, for example, using DOCSIS bonding. M-CMTS core 135 may support DOCSIS features and end-to-end IP within a next generation network architecture (NGNA), for example.
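
A toy sketch of that forwarding decision follows; the channel identifiers, the bonding flag, and the send_to_qam helper are hypothetical stand-ins for the edge QAM interface rather than anything defined in the specification.

    from itertools import cycle

    def forward_datagrams(datagrams, qam_channels, use_bonding=False):
        """Forward IP datagrams to a single QAM channel (traditional DOCSIS
        encapsulation) or spread them across several bonded channels."""
        if not use_bonding:
            for datagram in datagrams:
                send_to_qam(qam_channels[0], datagram)
        else:
            # DOCSIS bonding: distribute datagrams across multiple QAM channels.
            for channel, datagram in zip(cycle(qam_channels), datagrams):
                send_to_qam(channel, datagram)

    def send_to_qam(channel, datagram):
        print(f"QAM channel {channel}: {len(datagram)} bytes")  # placeholder transport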


Core network 140 may comprise any data or broadband network that may provide data and services to edge network 110, communications processor 125, broadcast server 130, or M-CMTS core 135. For example, core network 140 may comprise the Internet. In addition, core network 140 may comprise various components including, for example, i) servers; ii) switches; iii) routers; iv) gateways; v) hubs; vi) fiber optic cable; vii) copper cable; and viii) terminations. The aforementioned are examples and core network 140 may comprise other components and may supply other services using various other formats.


HFC network 145 may comprise a communications network (e.g. a cable TV network) that uses optical fiber, coaxial cable, or an optical fiber coaxial cable combination. Fiber in HFC network 145 may provide a high-speed backbone for broadband services. Coaxial cable may connect end users in HFC network 145 to the backbone. Such networks may use, for example, matching DOCSIS cable modems at a head end and at an end user's premises. Such a configuration may provide bi-directional paths and Internet access.


STB 150 may comprise a single component or a multi-component system for receiving broadband services. STB 150 may comprise a service consumer system combining several components including, for example, a set top box, cable modem 160, a network interface unit, a residential gateway, a terminal unit, a scrambler/descrambler, a digital storage media unit, an input/output port, a display device, a keyboard, and a mouse. STB 150 may encode and decode digital and analog signals, and provide interface capability for other components. STB 150 may utilize various operating systems and other software components. The end user's premises may contain STB 150. STB 150 may include all the functionality provided by a cable modem, such as CM 160, in one component and attach to TV 155, for example.


TV 155 may comprise an end use device for displaying delivered broadband services. TV 155 may comprise, for example, a television, a high definition television, a liquid crystal display unit (LCD), a video projection unit, or PC 170. The aforementioned are examples and TV 155 may comprise other display devices for delivered broadband services.


CM 160 may comprise, for example, a cable modem, a network server, a wireless fidelity data switch, or an Ethernet switch. CM 160 may provide data services to the user by accessing DOCSIS services from system 100. CM 160 may provide Internet access, video, or telephone services. The aforementioned are examples and CM 160 may comprise other data delivery devices.


Portable device 165 or PC 170 may comprise any personal computer, network switch, wireless switch, network hub, server, personal digital assistant, or home computing device. Portable device 165 or PC 170 may serve as user devices for data access from system 100. Portable device 165 and PC 170 may transmit and receive data and services from system 100.


STB control device 175 may comprise any input and output device for interfacing with STB 150. For example, STB control device 175 may be a remote control for using STB 150. STB control device 175, after proper programming, may interface with STB 150.


Embodiments consistent with the invention may comprise a system for providing user communication. The system may comprise a memory storage and a processing unit coupled to the memory storage. The processing unit may be operative to receive an invitation input from a first user. The invitation input may comprise a recommendation to a second user to take an action regarding content delivered over a content delivery system. Furthermore, the processing unit may be operative to transmit the invitation input to the second user.


Consistent with embodiments of the present invention, the aforementioned memory, processing unit, and other components may be implemented in a content delivery system, such as system 100 of FIG. 1. Any suitable combination of hardware, software, and/or firmware may be used to implement the memory, processing unit, or other components. By way of example, the memory, processing unit, or other components may be implemented with communications processor 125, in combination with system 100. The aforementioned system and processors are examples and other systems and processors may comprise the aforementioned memory, processing unit, or other components, consistent with embodiments of the present invention.



FIG. 2 shows communications processor 125 of FIG. 1 in more detail. As shown in FIG. 2, communications processor 125 may include a processing unit 210 and a memory unit 215. Memory 215 may include a communication software module 220 and a communication database 225. While executing on processing unit 210, communication software module 220 may perform processes for providing user communication, including, for example, one or more stages included in method 300 or method 400 described below with respect to FIG. 3 and FIG. 4. Furthermore, communication software module 220 and communication database 225 may be executed on or reside in any element shown in FIG. 1.
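
The arrangement in FIG. 2 can be sketched as a small object model. The class and method names below are illustrative assumptions; the specification names the components but not any programming interface.

    class CommunicationDatabase:
        """Stand-in for communication database 225: holds stored invitation and
        acceptance inputs keyed by the recipient's address."""
        def __init__(self):
            self._messages = {}

        def store(self, address, message):
            self._messages.setdefault(address, []).append(message)

        def retrieve(self, address):
            return self._messages.pop(address, [])

    class CommunicationSoftwareModule:
        """Stand-in for communication software module 220, executed on processing
        unit 210 to perform the stages of method 300 or method 400."""
        def __init__(self, database):
            self.database = database

        def handle_invitation(self, invitation):
            ...  # e.g. stages 310-320 of method 300

        def handle_acceptance(self, acceptance):
            ...  # e.g. stages 330-340 of method 300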


Communications processor 125 (“the processor”) may be implemented using a personal computer, a network computer, a mainframe, or other similar microcomputer-based workstation. The processor may comprise any computer operating environment, such as hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronic devices, minicomputers, mainframe computers, and the like. The processor may also operate in distributed computing environments where tasks are performed by remote processing devices. Furthermore, the processor may comprise a mobile terminal, such as a smart phone, a cellular telephone, a cellular telephone utilizing wireless application protocol (WAP), personal digital assistant (PDA), intelligent pager, portable computer, a hand held computer, a conventional telephone, a wireless fidelity (Wi-Fi) access point, or a facsimile machine. The aforementioned systems and devices are examples and the processor may comprise other systems or devices.



FIG. 3 is a flowchart setting forth the general stages involved in a method 300 consistent with an embodiment of the invention for providing user communication. Method 300 may be implemented using communications processor 125, as described above with respect to FIG. 2. Ways to implement the stages of method 300 will be described in greater detail below. Method 300 may begin at starting block 305 and proceed to stage 310 where communications processor 125 may receive an invitation input from a first user. The invitation input may comprise a recommendation to a second user to take an action regarding content to be delivered over content delivery system 100. For example, the invitation input may be created by the first user using a first user device. The first user device may comprise STB 150, CM 160, portable device 165, PC 170, or STB control device 175. When creating the invitation input, the first user may include, for example, an address identifying the second user. For example, the invitation input may include a second user's username, a second user's control device identification, a second user's account number, or a second user's e-mail address. The aforementioned are examples and the address identifying the second user may comprise other information and the first user device may comprise other elements.


Furthermore, as stated above, the invitation input may comprise the recommendation to the second user to take the action regarding the content to be delivered over a content delivery system 100. For example, the first user may include a recommendation to the second user to view a program. In addition, the recommendation may comprise a recommendation to jump to the program, a recommendation to record the program, or a reminder regarding the program. The first user may also include an attachment with the invitation input. The attachment may contain, for example, a portion of the content, a text message, a user content rating, related program information, a pointer to a specific location within content stored or to be stored at a memory location in VOD server 120, set-top 150, television 155, cable modem 160, personal computer 170, or portable device 165, or a computer file. Moreover, the invitation input may comprise instructions for handling the message including, for example, a date to send the invitation input or a time to send the invitation input.
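
Gathering the fields named in the two preceding paragraphs, one possible shape for an invitation input is sketched below; the field names and types are assumptions made for illustration, not definitions from the specification.

    from dataclasses import dataclass
    from datetime import datetime
    from enum import Enum
    from typing import Optional

    class RecommendedAction(Enum):
        VIEW = "view the program"
        JUMP = "jump to the program"
        RECORD = "record the program"
        REMIND = "reminder regarding the program"

    @dataclass
    class Attachment:
        content_portion: Optional[bytes] = None    # a portion of the content
        text_message: Optional[str] = None
        user_content_rating: Optional[str] = None
        related_program_info: Optional[str] = None
        content_pointer: Optional[str] = None      # pointer to a location within stored content
        computer_file: Optional[bytes] = None

    @dataclass
    class InvitationInput:
        sender: str                        # the first user
        recipient_address: str             # username, control device id, account number, or e-mail
        recommendation: RecommendedAction
        program_id: str
        attachment: Optional[Attachment] = None
        deliver_at: Optional[datetime] = None   # handling instruction: date/time to send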


From stage 310, where communications processor 125 receives the invitation input from the first user, method 300 may advance to stage 320 where communications processor 125 may transmit the invitation input to the second user. For example, communications processor 125 may parse the invitation input for the address for the second user or may obtain the address for the second user in any manner. Once communications processor 125 has the address for the second user, communications processor 125 may transmit the invitation input to the second user through system 100. Communications processor 125 may store the invitation input for later retrieval. Furthermore, communications processor 125 may redirect the invitation input to another system, for example, an Ethernet data system or the Internet.
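
A minimal sketch of stage 320 as just described, reusing the InvitationInput shape from the earlier sketch; the delivery and redirection helpers are hypothetical transport stubs, not functions disclosed in the specification.

    from datetime import datetime

    def transmit_invitation(processor_db, invitation, now=None):
        """Stage 320: obtain the second user's address, then deliver, store,
        or redirect the invitation input."""
        now = now or datetime.now()
        address = invitation.recipient_address  # parsed from (or looked up for) the invitation input

        # Honor handling instructions: hold the message until its delivery time.
        if invitation.deliver_at and invitation.deliver_at > now:
            processor_db.store(address, invitation)
            return "stored for later retrieval"

        if "@" in address:                                # e.g. an e-mail style address
            return redirect_externally(invitation)        # hand off to another system (e.g. the Internet)
        return deliver_over_system(address, invitation)   # deliver over content delivery system 100

    def deliver_over_system(address, invitation):
        return f"delivered to {address} over system 100"

    def redirect_externally(invitation):
        return "redirected to an external system"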


Communications processor 125 may transmit the invitation input to the second user device. For example, the address for the second user may correspond to the second user device, a memory location in the second user device, or a file storage folder within the second user device. The second user device may comprise STB 150, CM 160, portable device 165, PC 170, or STB control device 175. Communications processor 125 may perform processing on the invitation input, including aggregating with other invitation inputs, system data, and data from other system users. The invitation input may be viewed on the second user device by the second user. The second user may choose to ignore the invitation input, may delay responding to the invitation input, or may immediately respond to the invitation input.


From stage 320, where communications processor 125 transmits the invitation input to the second user, method 300 may advance to stage 330 where communications processor 125 may receive an acceptance input from the second user. For example, in response to receiving the invitation input, the second user may consider the invitation input and enter the acceptance input into the second user device. The acceptance input may be in response to the invitation input. For example, the second user may respond to the invitation input by rejecting the invitation input, postponing the invitation input, storing the invitation input for later retrieval, or sending a different content recommendation to the first user. The acceptance input may be transmitted from the second user device through system 100 or any other system. The acceptance input from the second user may not necessarily be a function of system 100. System 100 may be built so that the second user may receive the invitation input and take autonomous action on any recommendation or other element attached to the invitation input without notification being given to the first user.
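
The range of second-user responses enumerated above could be modeled as follows; the names are invented for illustration only.

    from dataclasses import dataclass
    from enum import Enum, auto
    from typing import Optional

    class ResponseKind(Enum):
        ACCEPT = auto()
        REJECT = auto()
        POSTPONE = auto()
        STORE_FOR_LATER = auto()
        COUNTER_RECOMMENDATION = auto()   # send a different content recommendation back

    @dataclass
    class AcceptanceInput:
        responder: str                    # the second user
        in_reply_to: str                  # identifies the invitation input being answered
        kind: ResponseKind
        counter_program_id: Optional[str] = None   # set only for COUNTER_RECOMMENDATION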


From stage 330, where communications processor 125 receives the acceptance input from the second user, method 300 may advance to stage 340 where communications processor 125 may transmit the acceptance input to the first user. For example, communications processor 125 may transmit the acceptance input to the first user through system 100 or any other system. The first user may receive the acceptance input on the first user device. For example, the first user may receive the acceptance input on the first user device while viewing content on the first user device. In response to the acceptance input, the first user may have the option, for example, to accept a different content recommendation, reject a different content recommendation, or join in a shared viewing session with the second user.
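
Read end to end, the four stages of method 300 amount to the hand-offs sketched below. The function is written against generic callables because the specification does not define a programming interface for the user devices; the toy usage underneath is purely illustrative.

    def method_300(receive_invitation, transmit_to_second, receive_acceptance, transmit_to_first):
        """Stages 310-340 of method 300 expressed as four hand-offs."""
        invitation = receive_invitation()      # stage 310: invitation input from the first user
        transmit_to_second(invitation)         # stage 320: transmit to the second user
        acceptance = receive_acceptance()      # stage 330: acceptance input from the second user
        transmit_to_first(acceptance)          # stage 340: transmit back to the first user
        return acceptance

    # Toy usage with in-memory stand-ins for the user devices.
    method_300(
        receive_invitation=lambda: {"to": "user2", "recommend": "record the game"},
        transmit_to_second=print,
        receive_acceptance=lambda: {"from": "user2", "kind": "accept"},
        transmit_to_first=print,
    )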



FIG. 4 is a flowchart setting forth the general stages involved in a method 400 consistent with embodiments of the invention for providing user communication. Method 400 may be implemented using communications processor 125, as described above in more detail with respect to FIG. 2. Ways to implement method 400's stages will be described in greater detail below. Method 400 may begin at starting block 405 and proceed to stage 410 where system 100 may provide content to a first user and a second user. The first user and the second user may receive and view the content on, for example, STB 150, CM 160, portable device 165, PC 170, or STB control device 175. The first user and the second user may view the same content substantially simultaneously and remotely.


From stage 410, where system 100 provides content to the first user and the second user, method 400 may advance to stage 420 where communications processor 125 may receive (e.g. over system 100) a first input from the first user in response to at least a portion of the content. The first input may be selected from an input set. The first user may define the input set using a portion of a plurality of selectable elements. Furthermore, the first input may include, for example, a login name, a password, and a username. The first user may assign a key from, for example, the keys on the first user device (e.g. STB control device 175) to elements within the input set. This may allow the first user to select from the input set. The input set may include, for example, a personal expression. The personal expression may be an alphanumeric input entered using keys from the first user device. The keys may have multiple alphanumeric meanings. Furthermore, the input set may include emotional icons, alphanumeric shorthand text, avatars, sound bites, video clips, or text words. In addition, the first user may input new text words for the first user's input set. The first user or communications processor 125 may enter the username as an element of the input set. The first user may transmit the first input by selecting the appropriately assigned key from STB control device 175.
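
One way to picture the input set and key assignments described above is a simple mapping from remote-control keys to selectable elements; the keys and elements below are made-up examples.

    # Hypothetical input set defined by the first user from the selectable elements.
    input_set = {
        "smiley": {"type": "emotional icon", "value": ":-)"},
        "brb":    {"type": "alphanumeric shorthand text", "value": "be right back"},
        "avatar": {"type": "avatar", "value": "knight.png"},
        "invite": {"type": "invitation", "value": "watch this with me"},
    }

    # Keys on STB control device 175 assigned by the first user to input-set elements.
    key_assignments = {"A": "smiley", "B": "brb", "C": "avatar", "D": "invite"}

    def select_from_input_set(pressed_key):
        """Resolve a key press on the first user device to an input-set element."""
        element_name = key_assignments.get(pressed_key)
        return input_set.get(element_name) if element_name else None

    print(select_from_input_set("D"))  # -> the 'invite' element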


Furthermore, the first input may comprise an invitation input to the second user for the second user to interact contemporaneously with the first user regarding at least a portion of the content. The invitation input may include an address of the second user. The address may comprise a username of the second user, a service location of the second user, a system identification of the second user, an e-mail address of the second user, or an account number of the second user. Furthermore, the invitation input may include a request to allow the first user to add the second user to the first user's input set. In addition, the invitation input may include a content recommendation, a content tuning instruction, a selected portion of the content, a pointer to a specific location within content stored or to be stored at a memory location in VOD server 120, set-top 150, television 155, cable modem 160, personal computer 170, or portable device 165, a computer file, a message to the second user, or a content recording instruction.


For example, the first user, after selecting the first user's username to enter the system, may enter the second user's account number to send the invitation input to the second user. Furthermore, the first user may select a content recommendation. The invitation input may be transmitted over system 100 to communications processor 125. Communications processor 125 may direct the invitation input to the second user.


From stage 420, where communications processor 125 receives the first input from the first user, method 400 may advance to stage 430 where communications processor 125 may transmit the first input to the second user. For example, the first input may be transmitted to the second user device over system 100 or any other system. Furthermore, communications processor 125 may store the first input for later transmission if, for example, the second user is not available or has temporarily blocked transmission of the first input. For example, if the second user has selected a block-future-input option, communications processor 125 may store the first input and try to transmit it later.
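
The store-and-retry behavior described above might look like the sketch below, with a hypothetical per-user status record; the specification leaves the exact mechanism open.

    user_status = {
        # Hypothetical per-user state kept by communications processor 125.
        "second_user": {"available": False, "blocked_senders": {"unwanted_sender"}},
    }

    pending_inputs = []  # first inputs stored for later transmission

    def transmit_first_input(sender, recipient, first_input):
        status = user_status.get(recipient, {})
        if sender in status.get("blocked_senders", set()) or not status.get("available", True):
            # The second user is unavailable or has blocked future input: store and retry later.
            pending_inputs.append((sender, recipient, first_input))
            return "stored for later transmission"
        return f"transmitted to {recipient} over system 100"

    def retry_pending():
        """Retry deferred transmissions once the second user becomes available."""
        still_pending = []
        for sender, recipient, first_input in pending_inputs:
            status = user_status.get(recipient, {})
            if status.get("available") and sender not in status.get("blocked_senders", set()):
                print(f"delivering deferred input to {recipient}")
            else:
                still_pending.append((sender, recipient, first_input))
        pending_inputs[:] = still_pending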


The second user may receive the first input where the first input may include the invitation input. The second user may receive the invitation input while viewing content different from the first user's viewed content. The invitation input may display on TV 155, for example, or any other device. The second user may accept the invitation input, deny the invitation input, send an instruction to ignore the invitation input, send an instruction to block future invitation inputs from the first user, store the invitation input for later retrieval, or send a text message to the first user regarding the invitation input. The second user may use the second user device to respond to the invitation input. For example, while watching a football game, the second user may receive the invitation input from the first user with a request to watch a movie. The second user may then accept the invitation by pressing a button on STB control device 175. By accepting the invitation, the second user's content may change to the content recommendation sent by the first user.
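
The football-to-movie example above can be sketched as the handler below; device behaviors such as tune_to are illustrative assumptions about STB 150, not functions disclosed in the specification.

    from types import SimpleNamespace

    class SecondUserSTB:
        """Toy stand-in for STB 150 on the second user's side."""
        def __init__(self, current_program):
            self.current_program = current_program

        def tune_to(self, program_id):
            self.current_program = program_id

        def handle_invitation(self, invitation, pressed_button):
            if pressed_button == "OK":                   # accept on STB control device 175
                self.tune_to(invitation.program_id)      # content changes to the recommendation
                return {"kind": "accept",
                        "confirmation": self.current_program == invitation.program_id}
            if pressed_button == "BACK":
                return {"kind": "deny"}
            return {"kind": "store for later retrieval"}

    # e.g. watching a football game when a movie recommendation arrives
    stb = SecondUserSTB(current_program="football_game")
    movie_invite = SimpleNamespace(program_id="movie_vod_123")
    print(stb.handle_invitation(movie_invite, pressed_button="OK"))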


Upon accepting the invitation input, the second user may transmit an acceptance input to the first user. The acceptance input may include information about the second user's selections regarding the invitation input. The acceptance input may include input from the second user's input set. For example, after accepting the invitation, STB 150 of the second user may transmit an acceptance input. The acceptance input may include a confirmation that the second user's content matches the first user's content. In addition, the second user may select to include a text message and a representative avatar from the second user's input set as an attachment.


The first user may receive the acceptance input, the attached confirmation, and the text message. The first user may select a representative avatar from the first user's input set and transmit the selection to the second user. Once communications processor 125 transmits the first input to the second user in stage 430, method 400 may then end at stage 440.


Embodiments of the invention, for example, may be implemented as a computer process (method), a computing system, or as an article of manufacture, such as a computer program product or computer readable media. The computer program product may be a computer storage media readable by a computer system and encoding a computer program of instructions for executing a computer process. The computer program product may also be a propagated signal on a carrier readable by a computing system and encoding a computer program of instructions for executing a computer process. Accordingly, the present invention may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.). In other words, embodiments of the present invention may take the form of a computer program product on a computer-usable or computer-readable storage medium having computer-usable or computer-readable program code embodied in the medium for use by or in connection with an instruction execution system. A computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.


The computer-usable or computer-readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. As more specific examples (a non-exhaustive list), the computer-readable medium may include the following: an electrical connection having one or more wires, a portable computer diskette, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, and a portable compact disc read-only memory (CD-ROM). Note that the computer-usable or computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory.


Embodiments of the present invention, for example, are described above with reference to block diagrams and/or operational illustrations of methods, systems, and computer program products according to embodiments of the invention. The functions/acts noted in the blocks may occur out of the order shown in any flowchart. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved.


While certain embodiments of the invention have been described, other embodiments may exist. Furthermore, although embodiments of the present invention have been described as being associated with data stored in memory and other storage mediums, data can also be stored on or read from other types of computer-readable media, such as secondary storage devices, like hard disks, floppy disks, or a CD-ROM, a carrier wave from the Internet, or other forms of RAM or ROM. Further, the disclosed methods' stages may be modified in any manner, including by reordering stages and/or inserting or deleting stages, without departing from the invention.


All rights including copyrights in the code included herein are vested in and the property of the Applicant. The Applicant retains and reserves all rights in the code included herein, and grants permission to reproduce the material only in connection with reproduction of the granted patent and for no other purpose.


While the specification includes examples, the invention's scope is indicated by the following claims. Furthermore, while the specification has been described in language specific to structural features and/or methodological acts, the claims are not limited to the features or acts described above. Rather, the specific features and acts described above are disclosed as examples for embodiments of the invention.

Claims
  • 1. A method for providing user communication, the method comprising: receiving, at a communication processor, in response to a selection of a key on a remote control device associated with the first user, one of a set of commands for transmitting an invitation input from the first user to a second user, wherein the set of commands is defined by the first user using a portion of a plurality of selectable elements, wherein each of the set of commands is assigned an unique key on the remote control device, wherein the invitation input comprising a recommendation to a second user to take an action regarding content being provided to the first user, the recommendation comprising an attachment having a portion of the content and a pointer to a specific location within the content, wherein the invitation input further comprising instructions for the communication processor for processing the invitation input, the instructions comprising an address and a date and a time to deliver the invitation input to the second user; processing, by the communication processor, the invitation input, wherein processing the invitation input comprises: storing the invitation input for later retrieval, at the communication processor, based on the date and time to deliver the invitation input, and determining a status of the second user, wherein determining the status comprises determining whether the second user is available to receive the invitation input; wherein determining whether the second user is available to receive the invitation input comprises determining whether the second user has blocked the first user, and storing the invitation input, in response to the second user blocking the first user, for later retrieval; and transmitting the invitation input to the second user based on the determined status.
  • 2. The method of claim 1, wherein receiving the invitation input comprises receiving the invitation input over a content delivery system.
  • 3. The method of claim 1, wherein receiving the invitation input comprises receiving the invitation input from a first user device comprising one of the following: a network interface unit, a residential gateway, a set-top box, a terminal unit, a scrambler/descrambler, a digital storage media unit, a control device, a television, an LCD screen, a cable modem, a computer, and a projection unit.
  • 4. The method of claim 1, wherein receiving the invitation input from the first user, the invitation input comprising the recommendation, comprises receiving the invitation input from the first user, the invitation input comprising the recommendation comprising at least one of the following: a recommendation to view a program, a recommendation to jump to the program, a recommendation to record the program, and a reminder regarding the program.
  • 5. The method of claim 1, wherein receiving the invitation input from the first user, the invitation input comprising the recommendation to the second user to take the action regarding the content to be delivered, comprises receiving the invitation input from the first user, the invitation input comprising the recommendation to the second user to take the action regarding the content to be delivered over a content delivery system comprising one of the following: a full-duplex, two-way broadband network, a hybrid fiber-coax (HFC) network, a data network, the internet, cable television network, and a telecommunications network.
  • 6. The method of claim 1, wherein receiving the invitation input from the first user, the invitation input comprising the recommendation to the second user to take the action regarding the content to be delivered, comprises receiving the invitation input from the first user, the invitation input comprising the recommendation to the second user to take the action regarding the content comprising one of the following: a broadcast multimedia service, a high-definition multimedia service, a digital television multimedia service, an analog multimedia service, a Video-on-Demand service, a streaming video, a multimedia messaging service, a voice over IP service, an interactive multimedia service, and an e-mail service.
  • 7. The method of claim 1, wherein transmitting the invitation input comprises transmitting the invitation input over a content delivery system.
  • 8. The method of claim 1, wherein transmitting the invitation input comprises transmitting the invitation input from a second user device.
  • 9. The method of claim 1, wherein transmitting the invitation input comprises transmitting the invitation input to a second user device comprising one of the following: a network interface unit, a residential gateway, a set-top box, a terminal unit, a scrambler/descrambler, a digital storage media unit, a control device, a television, an LCD screen, a cable modem, a computer, and a projection unit.
  • 10. The method of claim 1, wherein transmitting the invitation input comprises transmitting the invitation input to a second user device configured to store the invitation input in a database associated with the second user.
  • 11. The method of claim 1, further comprising: receiving an acceptance input from the second user, the acceptance input comprising an acceptance to the invitation input; and transmitting the acceptance input to the first user in response to receiving the acceptance input.
  • 12. The method of claim 1, wherein determining whether the second user is available to receive the invitation input comprises determining whether the second user is not available to accept the invitation input, and wherein transmitting the invitation input comprises transmitting the invitation input, in response to the second user not being available, to an alternate address associated with the second user.
  • 13. A method for providing user communication, the method comprising: providing content to a first user and a second user; receiving a first input from the first user, the first input being selected by the first user from a button on a remote control, the button on the remote control being configured to trigger one of a set of commands for transmitting an invitation input to a second user, the set of commands being predefined by the first user using a portion of a plurality of selectable elements, the set of commands associated with communicating information regarding the content being provided to the first user at a time the first input is received from the first user, wherein each one of the set of commands is assigned an unique key on the remote control device, wherein receiving the invitation input from the first user comprises receiving an attachment with the invitation input, and wherein the attachment comprises a portion of the content and a pointer to a specific location within the content, wherein the invitation input further comprising instructions for a communication processor for handling the invitation input, the instructions comprising an address for the second user, and a date and a time to deliver the invitation input to the second user; processing, by the communication processor, the invitation input, wherein processing the invitation input comprises: storing the invitation input for later retrieval, at the communication processor, based on the date and time to deliver the invitation input, and determining a status of the second user, wherein determining the status comprises determining whether the second user is available to receive the invitation input; wherein determining whether the second user is available to receive the invitation input comprises determining whether the second user has blocked the first user, and storing the invitation input, in response to the second user blocking the first user, for later retrieval; and requesting permission from the second user to add the second user to the first user's input set.
  • 14. The method of claim 13, further comprising: receiving the invitation input from the first user, the invitation input comprising an invitation from the first user to the second user for the second user to interact contemporaneously with the first user in response to at least the portion of the content; transmitting the invitation input to the second user; receiving an acceptance input from the second user, the acceptance input comprising an acceptance to the invitation; and transmitting the acceptance input to the first user in response to receiving the acceptance input.
  • 15. The method of claim 14, wherein receiving the invitation input from the first user, the invitation input comprising the invitation, further comprises receiving the invitation input from the first user wherein the invitation comprises at least one of the following: a username of the second user, a service location of the second user, a system identification of the second user, an e-mail address of the second user, and an account number of the second user.
  • 16. The method of claim 14, wherein receiving the invitation input from the first user further comprises receiving the invitation input comprising at least one of the following: a request to the second user for the second user to be added to the first user's input set, a content recommendation, a content tuning instruction, a message, and a content recording instruction.
  • 17. The method of claim 13, wherein receiving the first input comprises receiving the first input wherein at least one of the plurality of selectable elements corresponds to a personal expression.
  • 18. The method of claim 13, wherein receiving the first input comprises receiving the first input wherein the at least one of the plurality of selectable elements comprises at least one of the following: an emotional icon, an alphanumeric shorthand text, and an avatar.
  • 19. A system for providing user communication, the system comprising: a memory storage; and a processing unit coupled to the memory storage, wherein the processing unit is operative to: provide content to a first user; receive an input from the first user, the input being associated with a button on a remote control, the button being configured to activate one of a plurality of input commands for transmitting an invitation input to a second user, the plurality of input commands defined by the first user using a portion of a plurality of selectable elements, the plurality of input commands being associated with transmitting information associated with the content provided to the first user at a time the input from the first user is received, wherein each of the plurality of input commands is assigned an unique key on the remote control device; activate, in response to receiving the first input, an input set associated with transmitting the invitation input from the first user to a second user defined by the input set, the input set comprising an input of a username, a password, a message, and at least one recipient, the invitation input comprising a recommendation to a second user to take an action regarding the content to be delivered over a content delivery system, the recommendation comprising an attachment having a portion of the content and a pointer to a specific location within the content; process the invitation input, wherein processing the invitation input comprises: storing the invitation input for later retrieval, at the communication unit, based on the date and time to deliver the invitation input, and determining a status of the second user, wherein determining the status comprises determining whether the second user is available to receive the invitation input; wherein determining whether the second user is available to receive the invitation input comprises determining whether the second user has blocked the first user, and storing the invitation input, in response to the second user blocking the first user, for later retrieval; attach the content to be delivered to the invitation input; and transmit a request for permission from the second user to add the second user as contact of the first user's input set.
Related Publications (1)
Number Date Country
20080168506 A1 Jul 2008 US