1. Field of the Invention
The invention is directed to media data streaming, and more particularly to media data streaming in a Web Service environment and/or automation environment.
2. Related Art
Household, academic facility and/or business spaces now more commonly have more than one audio or video device, such as a CD/DVD player, portable MP3 player, tuner, preamp, power amp, speakers, VCR, DVR, computers running media players or connected to some other source of audio or video (e.g., Internet radio, satellite radio and the like), etc. Typically, a CD/DVD player from one company comes with its own remote control, and an amplifier by an entirely different company comes with its own remote control. The same space may have a PC with its keyboard and mouse, and yet another company's portable MP3 player with its own control switches. While each audio device does precisely what it was designed to do, each operates completely independently of the others, with the possible exception of the portable MP3 player, which may be connected to a PC for synchronization. As a result, a user ends up going from one keypad to another or juggling a series of remote controls in order to control the devices.
Since these audio/video and similar devices are not designed to communicate with each other, or their communication is very limited, access to these audio/video devices is limited by their physical locations. For example, it is difficult to play an MP3 file saved on the hard disk drive of a PC in one room or area (e.g., a child's bedroom) through speakers located in another room or area (e.g., an entertainment room). Thus, in order for a user to enjoy music of his or her choice whenever and wherever he or she wants, each room needs to be equipped with all the necessary audio/video equipment and digital audio/video content.
Also, the audio/video devices are not designed to communicate with other home devices (e.g., TV, lighting, security system, etc.). Thus, it is difficult, if not impossible, to converge the devices under common control for certain occasions. For example, in order to watch a movie, the user must turn on a TV, a DVD player and an audio amplifier by using three different remote controls. Then the user must set the TV to receive a video signal from the DVD player, set the audio amplifier to receive an audio signal from the DVD player and use yet another control unit to adjust the lighting of the room. Even when a user utilizes a universal remote, as is known in the art, the result is a plurality of devices that are still operated separately, albeit from a single universal remote. These devices do not converge as described above. Moreover, the devices lack any ability to send or receive media data from a source to a destination.
Accordingly, there is a need for a solution to the aforementioned accessibility, connectability and convergence issues that allows media data to be shared, transferred, or received on any desired device.
The invention meets the foregoing need using IP-based media streaming, which results in a significant increase in accessibility and usability of media data, and other advantages apparent from the discussion herein.
Accordingly, in one aspect of the invention, a method of streaming media data in a Web Service environment includes the steps of establishing a media streaming network between a first media device and a second media device, the media streaming network implemented with Web Service for Devices and a real-time media streaming protocol, processing media data for transfer using a media streaming application implemented in the first media device, transferring the processed media data via the media streaming network, receiving the processed media data from the media streaming network, and rendering the processed media data for playback using the media streaming application implemented in the second media device.
The media streaming application may be a filter-based media streaming application. The filter-based media streaming application may include at least one of a filter to read the media data from a file or input, a filter to decode the media data, a filter to transform the media data, and a filter to render the media data. The method may further include the steps of configuring at least one client and at least one device connected to a network, the at least one client and at least one device being configured with web services for devices and further configured to transfer media data with the media streaming application, wherein the at least one client comprises one of a TV, a personal computer, a personal digital assistant, a control panel, and a game controller and the at least one device comprises one of an audio system, a video system, an intercom system, a lighting system, a security system, and an HVAC system. The real-time media streaming protocol may be an internet real-time transport protocol (RTP). The media data may be audio data, video data or a combination of audio and video data.

Accordingly, in another aspect of the invention, a system for streaming media data in a Web Service environment includes a media streaming network implemented with Web Service and configured to transfer media data using a real-time media streaming protocol, a first media device connected to the media streaming network and implemented with a media streaming application for processing media data for transfer, and a second media device connected to the media streaming network and implemented with the media streaming application for rendering the processed media data from the first media device for playback. The media streaming application may be a filter-based media streaming application.
The filter-based media streaming application may include at least one of a filter to read the media data from a file or input, a filter to decode the media data, a filter to transform the media data, and a filter to render the media data.
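The filter chain described above can be sketched as a simple pipeline abstraction. This is an illustrative Python sketch only, not the DirectShow™ API itself; the filter names, the pass-through decoder and the gain transform are hypothetical stand-ins:

```python
# Minimal sketch of a filter-based media pipeline: each filter exposes
# process(), and a graph pushes buffers from the source filter to the renderer.
from typing import Callable, List

class Filter:
    def __init__(self, name: str, fn: Callable[[bytes], bytes]):
        self.name = name
        self.fn = fn

    def process(self, buf: bytes) -> bytes:
        return self.fn(buf)

class FilterGraph:
    """Connects filters in order: reader -> decoder -> transform -> renderer."""
    def __init__(self, filters: List[Filter]):
        self.filters = filters

    def run(self, buf: bytes) -> bytes:
        for f in self.filters:
            buf = f.process(buf)
        return buf

# Hypothetical filters: a reader stub, an identity "decoder", a transform
# that halves each sample value, and a renderer that collects its output.
rendered = []
graph = FilterGraph([
    Filter("read",      lambda b: b),                        # read from file/input
    Filter("decode",    lambda b: b),                        # decode (identity here)
    Filter("transform", lambda b: bytes(x // 2 for x in b)), # e.g. attenuate samples
    Filter("render",    lambda b: (rendered.append(b), b)[1]),
])
out = graph.run(bytes([200, 100, 50]))
```

Real filter graphs negotiate media types between pins before data flows; this sketch omits that step and simply pushes byte buffers downstream.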
Accordingly, in another aspect of the invention, an automation system may include the system for streaming media data noted above and further may include at least one client and at least one device connected to a network that are configured with web services for devices and further configured to transfer media data with the media streaming application, wherein the at least one client comprises one of a TV, a personal computer, a personal digital assistant, a control panel, and a game controller and the at least one device comprises one of an audio system, a video system, an intercom system, a lighting system, a security system, and an HVAC system.
The real-time media streaming protocol may be an internet real-time transport protocol (RTP). Each of the first and second media devices may include a network interface connected to the network, and a processor running an operating system (OS), a Web Service application and the media streaming application. The media data may be audio data, video data or a combination of audio and video data.
Accordingly, in another aspect of the invention, an access point for streaming media data in a Web Service environment includes a processor running an operating system (OS), a Web Service application and a real-time media streaming application, wherein the real-time media streaming application processes out-bound media data for transfer or renders in-bound media data for playback, a media terminal configured to interface with an external media device, wherein the media terminal receives the out-bound media data from the external media device or sends the in-bound media data rendered by the real-time media streaming application to the external media device, and a network interface connected to a media streaming network implemented with Web Service for Devices and a real-time media streaming protocol, wherein the network interface sends the out-bound media data processed by the real-time media streaming application to the media streaming network or receives the in-bound media data from the media streaming network.
The media streaming application may be a filter-based media streaming application. The filter-based media streaming application may include at least one of a filter to read the media data from a file or input, a filter to decode the media data, a filter to transform the media data, and a filter to render the media data. An automation system may include the access point as noted above and further may include at least one client and at least one device connected to a network that are configured with web services for devices and further configured to transfer media data with the media streaming application, wherein the at least one client comprises one of a TV, a personal computer, a personal digital assistant, a control panel, and a game controller and the at least one device comprises one of an audio system, a video system, an intercom system, a lighting system, a security system, and an HVAC system. The real-time media streaming protocol may be an internet real-time transport protocol (RTP). The in-bound and out-bound media data may be audio data, video data or a combination of audio and video data. The media terminal may be configured for at least one of audio in, video in and TV signal in. The media terminal may be configured for at least one of audio out or video out.
Accordingly, in another aspect of the invention, an intercom unit for a Web Service environment may include a network interface connected to a media streaming network implemented with a real-time media streaming protocol, a processor configured to run an operating system (OS), a Web Service application and a media streaming application, a microphone configured to collect a first voice signal, a speaker that reproduces a second voice signal transferred from the network interface, and a sound card that converts the first voice signal to a digital data stream and converts the second voice signal to an analog data stream, wherein the media streaming application processes the first voice signal for transfer and renders the second voice signal for playback.
The real-time media streaming protocol may be an internet real-time transport protocol (RTP). The media streaming application may be a filter-based media streaming application. The filter-based media streaming application may include at least one of a filter to read the media data from a file or input, a filter to decode the media data, a filter to transform the media data, and a filter to render the media data.
Accordingly, in another aspect of the invention, a method of establishing a voice communication in a Web Service environment includes the steps of collecting a first voice signal via a microphone, converting the first voice signal into a digital data stream, processing the digital data stream for transfer using a media streaming application, transferring the processed digital data stream via a media streaming network using a real-time media streaming protocol, receiving the processed digital data stream from the media streaming network, rendering the processed digital data stream using the media streaming application for playback, converting the rendered digital data stream to an analog audio signal, and playing the analog audio signal via a speaker.
The media streaming application may be a filter-based media streaming application. The filter-based media streaming application may include at least one of a filter to read the media data from a file or input, a filter to decode the media data, a filter to transform the media data, and a filter to render the media data. The real-time media streaming protocol may be an internet real-time transport protocol (RTP).
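The voice-communication steps above can be sketched end to end, with an in-memory queue standing in for the RTP media streaming network. All names are illustrative; real microphone capture, RTP transport, and speaker playback hardware are replaced by stubs:

```python
import queue

network = queue.Queue()  # stands in for the real-time media streaming network

def digitize(analog_samples):
    """A/D conversion stub: quantize floats in [-1, 1] to 8-bit values."""
    return bytes(int((s + 1.0) * 127.5) for s in analog_samples)

def send(stream: bytes):
    """'Process for transfer' and hand the stream to the network stub."""
    network.put(stream)

def receive() -> bytes:
    """Receive the processed stream from the network stub."""
    return network.get()

def to_analog(stream: bytes):
    """D/A conversion stub: 8-bit values back to floats in [-1, 1]."""
    return [b / 127.5 - 1.0 for b in stream]

captured = [0.0, 0.5, -0.5]     # first voice signal from the microphone
send(digitize(captured))        # convert, process, transfer
played = to_analog(receive())   # receive, render, convert, play via speaker
```

The round trip introduces only quantization error, so the played-back samples closely track the captured ones, mirroring the intercom path described above.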
Accordingly, in another aspect of the invention, a machine-readable medium includes stored instructions which, when executed by a processor, cause the processor to stream media data in a Web Service environment, the medium including instructions for establishing a media streaming network between a first media device and a second media device, the media streaming network implemented with Web Service and a real-time media streaming protocol, instructions for processing media data for transfer using a media streaming application implemented in the first media device, instructions for transferring the processed media data via the media streaming network, instructions for receiving the processed media data from the media streaming network, and instructions for rendering the processed media data for playback using the media streaming application implemented in the second media device.
The media streaming application may be a filter-based media streaming application. The filter-based media streaming application may include at least one of a filter to read the media data from a file or input, a filter to decode the media data, a filter to transform the media data, and a filter to render the media data. The real-time media streaming protocol may be an internet real-time transport protocol (RTP). The media data may be audio data, video data or a combination of audio and video data.
Additional features of the invention may be set forth or apparent from consideration of the following detailed description, drawings, and claims. Moreover, it is to be understood that both the foregoing summary of the invention and the following detailed description are exemplary and intended to provide further explanation without limiting the scope of the invention as claimed.
The accompanying drawings, which are included to provide a further understanding of the invention, are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the detailed description serve to explain the principles of the invention. No attempt is made to show structural details of the invention in more detail than may be necessary for a fundamental understanding of the invention and the various ways in which it may be practiced. In the drawings:
The embodiments of the invention and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments and examples that are described and/or illustrated in the accompanying drawings and detailed in the following description. It should be noted that the features illustrated in the drawings are not necessarily drawn to scale, and features of one embodiment may be employed with other embodiments as the skilled artisan would recognize, even if not explicitly stated herein. Descriptions of well-known components and processing techniques may be omitted so as to not unnecessarily obscure the embodiments of the invention. The examples used herein are intended merely to facilitate an understanding of ways in which the invention may be practiced and to further enable those of skill in the art to practice the embodiments of the invention. Accordingly, the examples and embodiments herein should not be construed as limiting the scope of the invention, which is defined solely by the appended claims and applicable law. Moreover, it is noted that like reference numerals represent similar parts throughout the several views of the drawings.
The software (i.e., application) enables the hardware such as the server 10, devices 102 and clients 104 to communicate with each other despite their different proprietary languages and communication protocols, and may provide the user with control over most or all of the hardware from a single client. The application may utilize at least one portion of the hardware to send commands to the devices 102 and receive feedback from them. The application may integrate centralized device control into a PC-based media environment (e.g., the Microsoft Media Center™ environment) that may store, organize and play digital media content. The user may use the same remote control 39 to listen to music, watch and record television, enjoy family photographs and home movies, as well as adjust the lighting, secure the home, adjust the temperature, distribute music throughout the house, check surveillance cameras and the like.
The application may be implemented with Web Services. The Web Services use the standard Internet protocol (IP) and are based on standard XML-related technologies such as SOAP (Simple Object Access Protocol) for communications and WSDL (Web Services Description Language) to describe interfaces. Devices implemented with Web Services for Devices (WSD) become black boxes on the network, providing services to any application, on any platform, written in any language. Moreover, the use of WSD provides the capabilities of Universal Plug and Play (UPnP), seamlessly connecting devices and simplifying implementation.
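The SOAP messaging that such Web Services rely on can be illustrated by building a minimal envelope. This is a generic sketch of the SOAP 1.1 envelope structure; the `SetVolume` action and its parameters are hypothetical, not part of any actual device interface described herein:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"  # SOAP 1.1 envelope namespace

def build_soap_request(action: str, params: dict) -> bytes:
    """Build a minimal SOAP envelope carrying a device-control action.
    The action name and parameters are caller-supplied (hypothetical here)."""
    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    call = ET.SubElement(body, action)
    for name, value in params.items():
        ET.SubElement(call, name).text = str(value)
    return ET.tostring(envelope)

# Hypothetical control message: set an audio zone's volume level.
msg = build_soap_request("SetVolume", {"zone": "kitchen", "level": 40})
```

A real WSD exchange would also carry WS-Addressing headers and be described by a WSDL document; this sketch shows only the envelope/body shape of the XML on the wire.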
Alternatively or additionally, if the server 10 or the PC 32 is running a SideShow™ enabled operating system such as Microsoft Windows Vista™, the devices may be configured as a SideShow™ device or “gadget.” A SideShow™ device or gadget may communicate with any client or device implemented with WSD in the network via protocols according to SideShow™ XML communication specifications. Moreover, the server 10 or the PC 32 using Microsoft Windows Vista™ may run a SideShow™ gadget application that provides a user interface rendering for the device and communicates with automation control devices via WSD technology.
The tuner card 40 may be any type of well-known card capable of receiving video signals that may be connected to a TV signal source such as an antenna, cable box, satellite receiver, or the like, and may provide a TV signal to any of the devices 102 or clients connected to the network 12. The tuner card 40 may convert the TV signal from an analog format to a digital format. The storage 42 may be a hard disk drive, memory card/reader, CD/DVD-ROM drive, MP3 player, or the like, and may store, for example, digital audio/video (media) content. The sound card 44 may be any type of well-known sound card capable of receiving audio signals and may be connected to an audio signal source such as a microphone, audio player, or the like, and may provide an audio signal from the audio signal source to any of the devices connected to the network 12. The sound card 44 may convert the audio signal from the analog format to the digital format. Although
Media devices implemented with Web Services may be directly connected to the network. For example, a Web Services (WS) enabled amplifier 48, WS enabled audio/video (A/V) device 50, WS enabled audio device 52 and WS enabled video device 54 may be directly connected to the network 12, and a user may control these devices 50, 52, 54 from any of the clients 104 (shown in detail in
For those media devices that are not implemented with Web Services, access points may be used to connect those devices to the network 12. For example, if a TV signal source and the tuner card 40 are located in separate rooms, an A/V access point 60 implemented with Web Services may be used to connect the TV signal source to the network. The A/V access point 60 may be equipped with a tuner card and connected to the network 12 wirelessly.
Similarly, an audio source and video source may be connected to the network 12 via an audio access point 62 and video access point 64, respectively. While the access points 60, 62, 64 may be used to interface with the media sources (e.g., TV, audio and video signal sources, and the like), the same access points may be used as media playback devices (e.g., TV, stereo system, video monitor, and the like). For example, an A/V access point 66 implemented with Web Services may provide a TV with a TV signal received from the tuner card 40 or the TV signal received from the A/V access point 60. An audio access point 68 may provide an audio playback device with an MP3 file stored in the storage 46. A video access point 70 may provide a monitor with the video signal received from the video access point 64. Using the same operational principles as the audio access points 62 and 68, an intercom system may be implemented. Such access points may utilize the technology disclosed in Applicant's copending patent application U.S. Patent Application No. (to be assigned), entitled NETWORK BASED DIGITAL ACCESS POINT DEVICE, filed Mar. 14, 2007, to Seale Moorer, et al., incorporated herein by reference in its entirety.
The access point 71 may include a processor 74 that may run an operating system (OS) and applications to implement Web Services in the access point 71. Thus, upon being connected to the network 12, the access point 71 may be quickly recognized as a device in the network 12 and controlled by the server 10 and the clients 104.
The processor 74 may also run a media streaming application 80, such as DirectShow™ (available from Microsoft Corporation, Redmond, Wash.), to provide a common interface for the media files. DirectShow™ divides a media processing task (e.g., a video or audio playback task) into a set of steps performed by components known as filters. DirectShow™ filter graphs may be used in video playback, in which the filters provide steps such as file parsing, video/audio de-multiplexing, decompressing, rendering and the like. Of course, the invention contemplates any similar or future version of DirectShow™, any other filter-graph application, and the like.
More specifically as shown in
The filter may include a number of pins 710 that represent connection points on the filter that may be connected to other filters. Pins 710 may be either output or input points. Depending on the filter, data is either requested from an output pin or sent to an input pin in order to transfer data between filters. The filters may be built using a set of C++ classes provided in the DirectShow™ SDK, called the DirectShow™ Base Classes, which handle much of the creation, registration and connection logic for the filter. For the filter graph to use filters automatically, they need to be registered in a separate DirectShow™ registry entry as well as being registered with COM. This registration can be managed by the DirectShow™ Base Classes. However, if the application adds the filters manually, they do not need to be registered at all.
The access point 71 may further include an analog to digital (A/D) converter 82, a TV tuner 84, a digital to analog (D/A) converter 86, video in terminal 88, audio in terminal 90, TV signal in terminal 92, video out terminal 94 and audio out terminal 96. Thus, the access point 71 may be capable of handling various media streams (e.g., TV signal, video signal and audio signal). While these components allow the access point 71 to interface with various types of devices, the access point 71 may be equipped with fewer components than what are shown in
Also, if the access point 71 is only used to interface a TV signal source, other components, such as the A/D converter 82, D/A converter 86, video in terminal 88, audio in terminal 90, video out terminal 94, and audio out terminal 96, may not be required. Thus, the access point 71 may include only necessary components for a specific purpose.
In operation, the video in terminal 88 may receive a video data stream from a video source, such as a camcorder, security camera, camera phone, VCR, DVD player, portable media player, DVR, game controller, or the like. Any kind of video in is contemplated by the invention, including composite, s-video, component, HDMI (High Definition Multimedia Interface), DVI (Digital Visual Interface) including DVI-A, DVI-D and DVI-I, IEEE 1394 (FireWire™), RGBHV, RGBS and the like, and any future protocols thereof. If the video data stream is in an analog format, it is converted to a digital data stream by the A/D converter 82. The digital video data stream from the A/D converter 82 or the video in terminal 88 is processed by the media streaming application 80. The processed video data stream is then transferred to the network 12 via the network interface 72 using a standard internet protocol, such as the internet real-time transport protocol (RTP). The data part of RTP is a thin protocol providing support for applications with real-time properties such as continuous media, including timing, reconstruction, loss detection, security and content identification. Thus, RTP provides support for real-time conferencing of a group of any size within the internet. This support includes source identification and support for gateways such as audio and video bridges, as well as a multicast-to-unicast translator. RTP further offers quality-of-service feedback from receivers to the multicast group, as well as support for the synchronization of different media streams.
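The RTP framing described above can be illustrated by packing the fixed 12-byte RTP header defined in RFC 3550. The payload type and SSRC values below are arbitrary examples chosen for illustration:

```python
import struct

def rtp_header(seq: int, timestamp: int, ssrc: int, payload_type: int,
               marker: bool = False) -> bytes:
    """Pack the fixed 12-byte RTP header (RFC 3550):
    version=2, no padding, no extension, zero CSRC entries."""
    byte0 = (2 << 6)                                   # V=2, P=0, X=0, CC=0
    byte1 = (int(marker) << 7) | (payload_type & 0x7F) # M bit + 7-bit payload type
    return struct.pack("!BBHII", byte0, byte1, seq & 0xFFFF,
                       timestamp & 0xFFFFFFFF, ssrc & 0xFFFFFFFF)

# Example packet: dynamic payload type 96, arbitrary SSRC, stub payload.
packet = rtp_header(seq=1, timestamp=160, ssrc=0x1234ABCD,
                    payload_type=96) + b"audio-payload"
```

The sequence number and timestamp fields are what enable the reconstruction, loss detection, and synchronization support described above; a receiver reorders packets by sequence number and schedules playback by timestamp.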
Similar to the video data stream, the audio in terminal 90 may receive an audio data stream from an audio source such as a microphone, radio, CD/DVD player, portable media player, or the like. Any type of audio in terminal is contemplated by the invention, including RCA, SPDIF, AES, EBU, TOSLINK, XLR interfaces and the like, and any future protocols. If necessary, an audio data stream in the analog format may be converted to a digital audio data stream by the A/D converter 82. The digital audio data stream from the A/D converter 82 or the audio in terminal 90 may be processed by the processor 74 using the media streaming application 80, such as DirectShow™, and transferred to the network via the network interface 72 using an internet standard protocol such as RTP. Similarly, the TV signal in terminal 92 may receive a TV signal from a TV signal source such as an antenna, cable outlet, set-top box, VCR, DVD, DVR, game controller or the like. The TV signal may then be converted to a digital media stream by the tuner 84. The digital TV signal stream is processed using the media streaming application 80, such as DirectShow™, for real-time multicast or unicast via the network 12 using an internet standard protocol such as RTP. The tuner 84 may communicate with the Web Service application 78 such that a user can control the tuner 84 via the clients 104.
In addition to interfacing with the media sources as mentioned above, the access point 71 may interface with media playback devices, such as an A/V device, video device or audio device, in order to play back the media stream data transferred from the network 12. For example, an audio data stream, which has been previously processed by a different device or access point using the media streaming application, is transferred to the network interface 72 via the network 12 using the internet standard protocol. The audio data stream is then rendered by the processor 74 using the media streaming application for playback. The rendered audio data stream may be converted to the analog format by the D/A converter 86 if necessary. The audio data stream rendered by the media streaming application 80 may then be transferred to the audio device via the audio out terminal 96. As with the audio in terminal, any type of audio out terminal is contemplated by the invention, including RCA, SPDIF, AES, EBU, TOSLINK, XLR interfaces and the like, and any future protocols.
The video data stream from the network 12 is processed in a similar manner to the audio data stream. Since the access point 71 is configured to interface with both the media sources and the media playback devices, real-time, end-to-end media streaming may be established by using two or more access points 71.
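The two-access-point arrangement can be sketched with a pair of UDP sockets on the loopback interface standing in for the media streaming network. This is an illustrative stub, not the actual access point implementation; the frame contents and addresses are hypothetical, and a real deployment would carry RTP packets rather than a bare byte string:

```python
import socket

# "Receiving" access point: bind a UDP socket where the stream would arrive.
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", 0))       # ephemeral port on loopback
recv_sock.settimeout(2.0)              # avoid blocking forever if a datagram is lost
addr = recv_sock.getsockname()

# "Sending" access point: stream a processed media buffer to the receiver.
send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_sock.sendto(b"processed-media-frame", addr)

# The receiver renders whatever frame arrives from the network.
frame, _ = recv_sock.recvfrom(2048)
send_sock.close()
recv_sock.close()
```

UDP is shown because RTP is conventionally carried over UDP; the sender corresponds to an access point interfacing a media source and the receiver to one interfacing a playback device.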
In accordance with various embodiments of the invention, the methods described herein are intended for operation with dedicated hardware implementations including, but not limited to, semiconductors, application specific integrated circuits, programmable logic arrays, and other hardware devices constructed to implement the methods and modules described herein. Moreover, various embodiments of the invention described herein are intended for operation as software programs running on a computer processor. Furthermore, alternative software implementations including, but not limited to, distributed processing or component/object distributed processing, parallel processing, virtual machine processing, any future enhancements, or any future protocol can also be used to implement the methods described herein.
It should also be noted that the software implementations of the invention as described herein are optionally stored on a tangible storage medium, such as: a magnetic medium such as a disk or tape; a magneto-optical or optical medium such as a disk; or a solid state medium such as a memory card or other package that houses one or more read-only (non-volatile) memories, random access memories, or other re-writable (volatile) memories. A digital file attachment to email or other self-contained information archive or set of archives is considered a distribution medium equivalent to a tangible storage medium. Accordingly, the invention is considered to include a tangible storage medium or distribution medium, as listed herein and including art-recognized equivalents and successor media, in which the software implementations herein are stored.
While the invention has been described in terms of exemplary embodiments, those skilled in the art will recognize that the invention can be practiced with modifications in the spirit and scope of the appended claims. These examples given above are merely illustrative and are not meant to be an exhaustive list of all possible designs, embodiments, applications or modifications of the invention.
This application claims priority to and the benefit of: Provisional Patent Application No. 60/782,734 filed on Mar. 16, 2006, entitled AUTOMATION CONTROL SYSTEM HAVING A CONFIGURATION TOOL AND TWO-WAY ETHERNET COMMUNICATION FOR WEB SERVICE MESSAGING, DISCOVERY, DESCRIPTIONS, AND EVENTING THAT IS CONTROLLABLE WITH A TOUCH-SCREEN DISPLAY, to Seale MOORER et al.; Provisional Patent Application No. 60/782,596 filed on Mar. 16, 2006, entitled AUTOMATION CONTROL SYSTEM HAVING DIGITAL MEDIA STREAMING, to Seale MOORER et al.; Provisional Patent Application No. 60/782,598 filed on Mar. 16, 2006, entitled AUTOMATION CONTROL SYSTEM HAVING DIGITAL LOGGING, to Seale MOORER et al.; Provisional Patent Application No. 60/782,635 filed on Mar. 16, 2006, entitled AUTOMATION CONTROL SYSTEM HAVING A CONTROL PANEL, to Seale MOORER et al.; Provisional Patent Application No. 60/782,599 filed on Mar. 16, 2006, entitled AUTOMATION CONTROL SYSTEM HAVING A CONFIGURATION TOOL, to Seale MOORER et al.; Provisional Patent Application No. 60/782,600 filed on Mar. 16, 2006, entitled AUTOMATION CONTROL SYSTEM HAVING DEVICE SCRIPTING, to Seale MOORER et al.; Provisional Patent Application No. 60/782,634 filed on Mar. 16, 2006, entitled DEVICE AUTOMATION USING NETWORKED DEVICE CONTROL HAVING A WEB SERVICES FOR DEVICE STACK, to Seale MOORER et al.; Provisional Patent Application No. 60/782,595 filed on Mar. 16, 2006, entitled WIRELESS DIGITAL AMPLIFIER CONFIGURED FOR WALL MOUNTING, SHELF MOUNTING, AND THE LIKE, to Seale MOORER et al.; Provisional Patent Application No. 60/785,275 filed on Mar. 24, 2006, entitled AUTOMATION SYSTEM, to Seale MOORER et al.; Provisional Patent Application No. 60/793,257 filed on Apr. 20, 2006, entitled TOUCH SCREEN FOR USE WITH AUTOMATION SYSTEMS, to Seale MOORER et al.; Provisional Patent Application No. 60/747,726 filed on May 19, 2006, entitled COOLING DEVICE FOR A TOUCH SCREEN AND THE LIKE, to Seale MOORER et al.; Provisional Patent Application No. 
60/746,287 filed on May 3, 2006, entitled HOME AUTOMATION SYSTEM AND THE LIKE, to Seale MOORER et al.; Provisional Patent Application No. 60/786,119 filed on Mar. 27, 2006, entitled HOME AUTOMATION PROGRAM CODE FOR SET TOP BOX OR SIMILAR CIRCUIT, to Steve CASHMAN; and Provisional Patent Application No. 60/857,774 filed Nov. 9, 2006, entitled PORTABLE MULTI-FUNCTIONAL MEDIA DEVICE, to Seale MOORER et al., all of which are hereby expressly incorporated by reference for all purposes as if fully set forth herein. Further, this application is related to the following U.S. Patent Applications: U.S. patent application Ser. No. 11/686,826, entitled NETWORK BASED DIGITAL ACCESS POINT DEVICE, filed Mar. 15, 2007, to Seale Moorer, et al.; U.S. patent application Ser. No. 11/686,896, entitled AUTOMATION CONTROL SYSTEM HAVING A CONFIGURATION TOOL AND TWO-WAY ETHERNET COMMUNICATION FOR WEB SERVICE MESSAGING, DISCOVERY, DESCRIPTION, AND EVENTING THAT IS CONTROLLABLE WITH A TOUCH-SCREEN DISPLAY, filed Mar. 15, 2007, to Seale Moorer, et al., now U.S. Pat. No. 7,509,402 issued on Mar. 24, 2009; U.S. patent application Ser. No. 11/686,884, entitled AUTOMATION CONTROL SYSTEM HAVING DIGITAL LOGGING, filed Mar. 15, 2007, to Seale Moorer, et al., now U.S. Pat. No. 7,496,627 issued on Feb. 24, 2009; U.S. patent application Ser. No. 11/686,893, entitled USER CONTROL INTERFACE FOR CONVERGENCE AND AUTOMATION SYSTEM, filed Mar. 15, 2007, to Seale Moorer, et al.; U.S. patent application Ser. No. 11/686,846, entitled DEVICE AUTOMATION USING NETWORKED DEVICE CONTROL HAVING A WEB SERVICES FOR DEVICES STACK, filed Mar. 15, 2007, to Seale Moorer, et al., now U.S. Pat. No. 7,587,464 issued on Sep. 8, 2009; U.S. patent application Ser. No. 11/686,875, entitled AUTOMATION CONTROL SYSTEM HAVING A CONFIGURATION TOOL, filed Mar. 15, 2007, to Seale Moorer, et al.; and U.S. patent application Ser. No. 11/686,889, entitled AUTOMATION CONTROL SYSTEM HAVING DEVICE SCRIPTING, filed Mar.
15, 2007, to Seale Moorer, et al., which are all hereby expressly incorporated by reference for all purposes as if fully set forth herein.
References Cited: U.S. Patent Documents
Number | Name | Date | Kind |
---|---|---|---|
4567557 | Burns | Jan 1986 | A |
4808841 | Ito et al. | Feb 1989 | A |
4989081 | Miyagawa et al. | Jan 1991 | A |
5086385 | Launey et al. | Feb 1992 | A |
5105186 | May | Apr 1992 | A |
5218552 | Stirk | Jun 1993 | A |
5237305 | Ishijuro | Aug 1993 | A |
5282028 | Johnson et al. | Jan 1994 | A |
5502618 | Chiou | Mar 1996 | A |
5565894 | Bates et al. | Oct 1996 | A |
5579221 | Mun | Nov 1996 | A |
5598523 | Fujita | Jan 1997 | A |
5621662 | Humphries et al. | Apr 1997 | A |
5623392 | Ma | Apr 1997 | A |
5666172 | Ida et al. | Sep 1997 | A |
5706191 | Bassett et al. | Jan 1998 | A |
5706290 | Shaw et al. | Jan 1998 | A |
5748444 | Honda et al. | May 1998 | A |
5787259 | Haroun | Jul 1998 | A |
5831823 | Hoedl | Nov 1998 | A |
5850340 | York | Dec 1998 | A |
5877957 | Bennett | Mar 1999 | A |
5922047 | Newlin et al. | Jul 1999 | A |
5956025 | Goulden et al. | Sep 1999 | A |
6020881 | Naughton et al. | Feb 2000 | A |
6029092 | Stein | Feb 2000 | A |
6061602 | Meyer | May 2000 | A |
6112127 | Bennett | Aug 2000 | A |
6139177 | Venkatraman et al. | Oct 2000 | A |
6147601 | Sandelman et al. | Nov 2000 | A |
6154681 | Drees et al. | Nov 2000 | A |
6160477 | Sandelman et al. | Dec 2000 | A |
6175872 | Neumann et al. | Jan 2001 | B1 |
6182094 | Humpleman et al. | Jan 2001 | B1 |
6192282 | Smith et al. | Feb 2001 | B1 |
6198479 | Humpleman et al. | Mar 2001 | B1 |
6201523 | Akiyama et al. | Mar 2001 | B1 |
6222729 | Yoshikawa | Apr 2001 | B1 |
6243707 | Humpleman et al. | Jun 2001 | B1 |
6263260 | Bodmer et al. | Jul 2001 | B1 |
6268857 | Fishkin et al. | Jul 2001 | B1 |
6275922 | Bertsch | Aug 2001 | B1 |
6278676 | Anderson et al. | Aug 2001 | B1 |
6288716 | Humpleman et al. | Sep 2001 | B1 |
6313990 | Cheon | Nov 2001 | B1 |
6314326 | Fuchu | Nov 2001 | B1 |
6353853 | Gravlin | Mar 2002 | B1 |
6385495 | Bennett | May 2002 | B1 |
6389331 | Jensen et al. | May 2002 | B1 |
6402109 | Dittmer | Jun 2002 | B1 |
6405103 | Ryan et al. | Jun 2002 | B1 |
6456892 | Dara-Abrams et al. | Sep 2002 | B1 |
6462654 | Sandelman et al. | Oct 2002 | B1 |
6473661 | Wollner | Oct 2002 | B1 |
6496575 | Vasell et al. | Dec 2002 | B1 |
6522346 | Meyer | Feb 2003 | B1 |
6523696 | Saito et al. | Feb 2003 | B1 |
6526581 | Edson | Feb 2003 | B1 |
6546419 | Humpleman | Apr 2003 | B1 |
6580950 | Johnson et al. | Jun 2003 | B1 |
6587739 | Abrams et al. | Jul 2003 | B1 |
6609038 | Croswell et al. | Aug 2003 | B1 |
6615088 | Myer et al. | Sep 2003 | B1 |
6633781 | Lee et al. | Oct 2003 | B1 |
6640141 | Bennett | Oct 2003 | B2 |
6663781 | Huling | Dec 2003 | B1 |
6690411 | Naidoo et al. | Feb 2004 | B2 |
6690979 | Smith | Feb 2004 | B1 |
6735619 | Sawada | May 2004 | B1 |
6756998 | Bilger | Jun 2004 | B1 |
6763040 | Hite et al. | Jul 2004 | B1 |
6778868 | Imamura et al. | Aug 2004 | B2 |
6782294 | Reich et al. | Aug 2004 | B2 |
6792319 | Bilger | Sep 2004 | B1 |
6792323 | Krzyzanowski et al. | Sep 2004 | B2 |
6792480 | Chaiken et al. | Sep 2004 | B2 |
6823223 | Gonzales et al. | Nov 2004 | B2 |
6834208 | Gonzales et al. | Dec 2004 | B2 |
6838978 | Aizu et al. | Jan 2005 | B2 |
6845275 | Gasiorek et al. | Jan 2005 | B2 |
6850149 | Park | Feb 2005 | B2 |
6859669 | An | Feb 2005 | B2 |
6865428 | Gonzales et al. | Mar 2005 | B2 |
6868292 | Ficco | Mar 2005 | B2 |
6868293 | Schurr et al. | Mar 2005 | B1 |
6870555 | Sekiguchi | Mar 2005 | B2 |
6891838 | Petite et al. | May 2005 | B1 |
6909921 | Bilger | Jun 2005 | B1 |
6912429 | Bilger | Jun 2005 | B1 |
6924727 | Nagaoka et al. | Aug 2005 | B2 |
6928576 | Sekiguchi | Aug 2005 | B2 |
6930599 | Naidoo et al. | Aug 2005 | B2 |
6957110 | Wewalaarachchi et al. | Oct 2005 | B2 |
6957275 | Sekiguchi | Oct 2005 | B1 |
6961763 | Wang et al. | Nov 2005 | B1 |
6965935 | Diong | Nov 2005 | B2 |
6967565 | Lingemann | Nov 2005 | B2 |
6980868 | Huang et al. | Dec 2005 | B2 |
6990379 | Gonzales et al. | Jan 2006 | B2 |
7047092 | Wimsatt | May 2006 | B2 |
7130719 | Ehlers et al. | Oct 2006 | B2 |
7136709 | Arling | Nov 2006 | B2 |
7170422 | Nelson et al. | Jan 2007 | B2 |
7174385 | Li | Feb 2007 | B2 |
7200683 | Wang et al. | Apr 2007 | B1 |
7201356 | Huang | Apr 2007 | B2 |
7203486 | Patel | Apr 2007 | B2 |
7225037 | Shani | May 2007 | B2 |
7260604 | Kuki | Aug 2007 | B2 |
7370280 | Ho et al. | May 2008 | B2 |
7380250 | Schechter et al. | May 2008 | B2 |
7453685 | Lube | Nov 2008 | B2 |
7505889 | Salmonsen et al. | Mar 2009 | B2 |
20010034754 | Elwahab et al. | Oct 2001 | A1 |
20010036192 | Chiles et al. | Nov 2001 | A1 |
20010039460 | Aisa | Nov 2001 | A1 |
20020000092 | Sharood et al. | Jan 2002 | A1 |
20020016639 | Smith et al. | Feb 2002 | A1 |
20020029085 | Park | Mar 2002 | A1 |
20020031120 | Rakib | Mar 2002 | A1 |
20020033760 | Kobayashi | Mar 2002 | A1 |
20020035404 | Ficco et al. | Mar 2002 | A1 |
20020044042 | Christensen | Apr 2002 | A1 |
20020047774 | Christensen | Apr 2002 | A1 |
20020111698 | Graziano et al. | Aug 2002 | A1 |
20020126443 | Zodnik | Sep 2002 | A1 |
20020152311 | Veltman et al. | Oct 2002 | A1 |
20020165953 | Diong | Nov 2002 | A1 |
20020174178 | Stawikowski | Nov 2002 | A1 |
20020180579 | Nagaoka et al. | Dec 2002 | A1 |
20020194328 | Hallenbeck | Dec 2002 | A1 |
20020196158 | Lee | Dec 2002 | A1 |
20030009515 | Lee et al. | Jan 2003 | A1 |
20030009537 | Wang | Jan 2003 | A1 |
20030028270 | Peterson et al. | Feb 2003 | A1 |
20030033028 | Bennett | Feb 2003 | A1 |
20030034898 | Shamoon et al. | Feb 2003 | A1 |
20030037166 | Ueno et al. | Feb 2003 | A1 |
20030040812 | Gonzales et al. | Feb 2003 | A1 |
20030040813 | Gonzales et al. | Feb 2003 | A1 |
20030040819 | Gonzales | Feb 2003 | A1 |
20030065407 | Johnson et al. | Apr 2003 | A1 |
20030069887 | Lucovsky et al. | Apr 2003 | A1 |
20030074088 | Gonzales | Apr 2003 | A1 |
20030083758 | Williamson | May 2003 | A1 |
20030101304 | King et al. | May 2003 | A1 |
20030103088 | Dresti et al. | Jun 2003 | A1 |
20030198938 | Murray | Oct 2003 | A1 |
20030200009 | von Kannewurff | Oct 2003 | A1 |
20030233432 | Davis et al. | Dec 2003 | A1 |
20040003051 | Krzyzanowski et al. | Jan 2004 | A1 |
20040004810 | Kim | Jan 2004 | A1 |
20040010327 | Terashima et al. | Jan 2004 | A1 |
20040010561 | Kim | Jan 2004 | A1 |
20040039459 | Daugherty et al. | Feb 2004 | A1 |
20040092282 | Kim et al. | May 2004 | A1 |
20040133314 | Ehlers | Jul 2004 | A1 |
20040138768 | Murray | Jul 2004 | A1 |
20040143629 | Bodin et al. | Jul 2004 | A1 |
20040176877 | Hesse | Sep 2004 | A1 |
20040213384 | Alles | Oct 2004 | A1 |
20040215694 | Podolsky | Oct 2004 | A1 |
20040215778 | Hesse et al. | Oct 2004 | A1 |
20040215816 | Hayes et al. | Oct 2004 | A1 |
20040237107 | Staples | Nov 2004 | A1 |
20040243257 | Theimer | Dec 2004 | A1 |
20040249922 | Hackman | Dec 2004 | A1 |
20040260407 | Wimsatt | Dec 2004 | A1 |
20040260427 | Wimsatt | Dec 2004 | A1 |
20040266439 | Lynch et al. | Dec 2004 | A1 |
20040267385 | Lingemann | Dec 2004 | A1 |
20040267876 | Kakivaya et al. | Dec 2004 | A1 |
20040267909 | Autret | Dec 2004 | A1 |
20050009498 | Ho | Jan 2005 | A1 |
20050021805 | De Petris et al. | Jan 2005 | A1 |
20050035717 | Adamson | Feb 2005 | A1 |
20050038708 | Wu | Feb 2005 | A1 |
20050055108 | Gonzales | Mar 2005 | A1 |
20050071419 | Lewontin | Mar 2005 | A1 |
20050080879 | Kim et al. | Apr 2005 | A1 |
20050085930 | Gonzales | Apr 2005 | A1 |
20050090915 | Geiwitz | Apr 2005 | A1 |
20050096753 | Arling et al. | May 2005 | A1 |
20050107897 | Callaghan | May 2005 | A1 |
20050108091 | Sotak | May 2005 | A1 |
20050113021 | Gosieski, Jr. et al. | May 2005 | A1 |
20050113943 | Nian | May 2005 | A1 |
20050119767 | Kiwimagi et al. | Jun 2005 | A1 |
20050119793 | Amundson et al. | Jun 2005 | A1 |
20050125083 | Kiko | Jun 2005 | A1 |
20050131551 | Ruutu | Jun 2005 | A1 |
20050131553 | Yoon et al. | Jun 2005 | A1 |
20050131558 | Braithwaite | Jun 2005 | A1 |
20050132405 | AbiEzzi | Jun 2005 | A1 |
20050149758 | Park | Jul 2005 | A1 |
20050159823 | Hayes et al. | Jul 2005 | A1 |
20050198063 | Thomas et al. | Sep 2005 | A1 |
20050198188 | Hickman | Sep 2005 | A1 |
20050198304 | Oliver et al. | Sep 2005 | A1 |
20050232583 | Kubota | Oct 2005 | A1 |
20050262227 | Heller et al. | Nov 2005 | A1 |
20050267605 | Lee et al. | Dec 2005 | A1 |
20050271355 | Gilor | Dec 2005 | A1 |
20060004920 | Hallenbeck | Jan 2006 | A1 |
20060009861 | Bonasia et al. | Jan 2006 | A1 |
20060020353 | Gonzales et al. | Jan 2006 | A1 |
20060053234 | Kumar et al. | Mar 2006 | A1 |
20060058900 | Johanson et al. | Mar 2006 | A1 |
20060069934 | Esch et al. | Mar 2006 | A1 |
20060106933 | Huang et al. | May 2006 | A1 |
20060118694 | Lee et al. | Jun 2006 | A1 |
20060126646 | Bedingfield, Sr. | Jun 2006 | A1 |
20060155802 | He et al. | Jul 2006 | A1 |
20070053376 | Oshima et al. | Mar 2007 | A1 |
20070073419 | Sesay | Mar 2007 | A1 |
20070083679 | Kikuchi | Apr 2007 | A1 |
20070104332 | Clemens et al. | May 2007 | A1 |
20070153459 | Wohlford et al. | Jul 2007 | A1 |
20070162567 | Ding | Jul 2007 | A1 |
20070247800 | Smith et al. | Oct 2007 | A1 |
20080108439 | Cole | May 2008 | A1 |
Related Publications
Number | Date | Country |
---|---|---|
20070220165 A1 | Sep 2007 | US |
Provisional Applications
Number | Date | Country |
---|---|---|
60782734 | Mar 2006 | US |
60782598 | Mar 2006 | US |
60782635 | Mar 2006 | US |
60782596 | Mar 2006 | US |
60782599 | Mar 2006 | US |
60782600 | Mar 2006 | US |
60782634 | Mar 2006 | US |
60782595 | Mar 2006 | US |
60785275 | Mar 2006 | US |
60793257 | Apr 2006 | US |
60747726 | May 2006 | US |
60746287 | May 2006 | US |
60786119 | Mar 2006 | US |
60857774 | Nov 2006 | US |