The present invention relates to informing users about media files located on the Web.
The Internet is a worldwide system of computer networks and is a public, self-sustaining facility that is accessible to tens of millions of people worldwide. The most widely used part of the Internet is the World Wide Web, often abbreviated “WWW” or simply referred to as “the Web.” The Web organizes information through the use of hypermedia. The HyperText Markup Language (“HTML”) is typically used to specify the contents and format of a hypermedia document (e.g., a web page).
A web page is the image or collection of images that is displayed to a user when the web page's HTML file is rendered by a browser application program. Each web page can contain embedded references to resources such as images, audio, video, documents, or other web pages. On the Web, the most common type of reference used to identify and locate resources is the Uniform Resource Locator, or URL. A user using a web browser can reach resources that are embedded in the web page being browsed by selecting “hyperlinks” or “links” on the web page that identify the resources through the resources' URLs.
Web pages frequently contain embedded references to media files, including audio and video files. Currently, there are two common ways in which such media files are accessed and played. In one way, a user using a web browser selects a link on a web page which leads to a media file and downloads the entire media file. After download is completed, the user accesses and plays the media file by using one of the media applications on the user's computer. Alternatively, when a web browser detects that there is a reference to a media file on a web page, the reference is automatically followed and downloading of the media file automatically begins. A plug-in application capable of playing the media file then automatically starts to play the media file within the web browser, either after completion of the download or as soon as enough data has been downloaded to initiate playing.
Media files are created in a variety of formats, however, and sometimes a media file cannot be accessed and played by any of the media applications or plug-in applications on the user's computer. In the current approaches, when a user or web browser downloads a media file, no attempt is made to check whether the media file is playable by one of the applications on the user's computer before attempting to download the entire media file. In the case that the media file is not playable by the user's computer, much time and bandwidth is lost in downloading data which is ultimately not useful to the user. If the user pays for Web communications on a per-byte basis, then downloading an unplayable media file incurs additional monetary loss for the user.
In the approach where a web browser automatically starts to play a media file when a portion of it has been downloaded, less time and money is wasted because an error will occur if the media file is not playable and this error frequently occurs before the entire media file has been downloaded. However, even under this approach, a significant amount of time and money may have already been expended in downloading the portion of the media file. This is because existing applications download the media file in a sequential fashion and do not limit the downloading to only the portion of the media file relevant for determining whether the media file is playable. Also, in the case where a user is paying for the amount of data downloaded, he or she may wish to know whether a media file is playable before downloading commences.
To circumvent these problems, some web browsers currently perform a basic check to compare downloadable media files with a set of available applications. This check is performed by detecting information embedded in the web page or supplied by the server that describes a media file embedded in the web page. This information describes what application is capable of playing the embedded media file, or allows the web browser to find an application registered to play the embedded media file, and the web browser then checks if this application is installed on the computer. If the necessary application is not installed, then the media file is not downloaded and an error message is generated. This type of basic check, however, suffers from poor quality of information because information embedded in the web page is often wrong or insufficient.
In view of the foregoing, there is a need for a way to accurately detect and communicate to a user whether a media file embedded in a web page is playable on a device with minimal downloading of data about the media and without downloading the entire media file.
The approaches described in this section are approaches that could be pursued, but not necessarily approaches that have been previously conceived or pursued. Therefore, unless otherwise indicated, it should not be assumed that any of the approaches described in this section qualify as prior art merely by virtue of their inclusion in this section.
The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements and in which:
In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, that the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the present invention.
Techniques are provided through which information about media files embedded in web pages is obtained from a server and analyzed to determine if the media files are playable on a device. Information is obtained from a server which also hosts the embedded media files. This information may be external to the media file or may be a portion of the media file itself. The determination of whether a media file is playable by a device is made based on information obtained from the server without downloading the entirety of the media file.
According to one technique, a request is sent to a server to obtain information about a media file's MIME type, which is a two-part identifier for file formats. The file format identified in the media file's MIME type is compared to a list of file formats which are known to be playable by a device. If there is a match, then the media file is determined to be playable by the device. Further information may be obtained and further analysis may be performed in the case that there is no match.
According to one technique, a request is sent to a server to obtain one or more portions of a media file. The portions of the media file which are obtained contain information about the media file's audio and video formats, including the coding and decoding schemes used to generate the media file's content. This information is analyzed with respect to the media applications and plug-ins installed on a device to determine whether the media file is playable by the device.
According to one technique, an indication of whether a media file embedded in a web page can be played on a device is displayed by a web browser on the web page. In addition, a user may choose to download the media file based on the indication of the media file's playability. Alternatively, a web browser may automatically commence downloading and/or playing the media file once it has been determined that the media file is playable by the device.
According to one technique, whether a media file embedded in a web page is playable by a device is determined asynchronously with respect to another process which requests the contents of the web page from the server.
According to one technique, the device is a mobile device which communicates with the server through wireless communications.
The flow diagram in
In block 102, a web browser detects that a web page contains an embedded reference to a media file. This may be done by examining the file extension of the file specified in an HREF attribute or EMBED element, or by examining the TYPE attribute of an EMBED or OBJECT element. A media file can be an audio file or a video file. A video file may in turn contain both video and audio tracks.
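For purposes of illustration only, the following TypeScript sketch shows one way such detection might be performed over a DOM-like representation of the web page; the selectors and the extension list are illustrative assumptions rather than part of the described technique.

```typescript
// Illustrative sketch of block 102: scan a parsed page for embedded media
// references. The extension list and selectors are assumptions for this example.
const mediaExtensions = ["mp3", "mp4", "m4a", "m4v", "mov"];

function findMediaReferences(doc: Document): string[] {
  const refs: string[] = [];
  // Links whose HREF path ends in a known media file extension.
  doc.querySelectorAll("a[href]").forEach((a) => {
    const href = a.getAttribute("href") ?? "";
    const ext = href.split("?")[0].split(".").pop()?.toLowerCase() ?? "";
    if (mediaExtensions.includes(ext)) refs.push(href);
  });
  // EMBED and OBJECT elements whose TYPE attribute names an audio or video MIME type.
  doc.querySelectorAll("embed[src], object[data]").forEach((el) => {
    const type = (el.getAttribute("type") ?? "").toLowerCase();
    const src = el.getAttribute("src") ?? el.getAttribute("data") ?? "";
    if (src && (type.startsWith("audio/") || type.startsWith("video/"))) refs.push(src);
  });
  return refs;
}
```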
Upon detection of an embedded media file, the web browser in block 104 sends a request to the server for the media file's MIME type. A MIME type is a two-part identifier for file formats. Examples of MIME types include “audio/mp3” (indicating an audio file in the MPEG-1 Audio Layer 3, or MP3, format) and “video/mp4” (indicating a video file in the MPEG4 format). In block 106, the server returns the media file's MIME type to the web browser.
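One possible realization of blocks 104 and 106, assuming the server honors HTTP HEAD requests so that only headers (and no media data) are transferred, is sketched below.

```typescript
// Illustrative sketch of blocks 104/106: ask the server for the media file's
// MIME type without downloading any of its content.
async function fetchMimeType(mediaUrl: string): Promise<string | null> {
  const response = await fetch(mediaUrl, { method: "HEAD" });
  // The Content-Type header carries the two-part MIME type, e.g. "video/mp4".
  return response.headers.get("Content-Type");
}
```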
Next, in block 108, the web browser compares the media file's MIME type with a list of known MIME types. The list of known MIME types is specific to each device on which the web browser operates because whether a media file is playable by a device depends on what applications have been installed on the specific device. In block 110, the web browser determines if there is a match between the media file's MIME type and a MIME type listed in the list of known MIME types. If there is a match, then the media file may be playable by the device. Further steps are performed to determine if the media file is indeed playable by the device.
In block 112, the media file's MIME type is compared with a second list of MIME types, which is a subset of the list of known MIME types. The second list consists of “immediately playable” MIME types. In block 114, if the web browser determines that there is a match between the media file's MIME type and a MIME type in the list of “immediately playable” MIME types, then block 116 is reached and the web browser determines that the media file is playable by the device. In other words, the list of “immediately playable” MIME types contains all MIME types which are known to be definitely playable by the device. For example, in one embodiment, the MIME type “audio/mp3”, which denotes an audio file in the MP3 format, is known to be playable by a device. Therefore, if a media file's MIME type is “audio/mp3”, then the web browser determines that this media file is playable without further investigation.
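The comparisons of blocks 108 through 116 might be realized as follows; the two device-specific lists shown are illustrative placeholders for lists derived from the applications installed on the device.

```typescript
// Illustrative sketch of blocks 108-116. The lists below stand in for
// device-specific lists built from the installed media applications.
const knownMimeTypes = new Set(["audio/mp3", "video/mp4", "video/quicktime"]);
const immediatelyPlayable = new Set(["audio/mp3"]);

type MimeCheckResult = "playable" | "needsInspection" | "unknown";

function checkMimeType(mimeType: string): MimeCheckResult {
  if (!knownMimeTypes.has(mimeType)) return "unknown";        // block 110: no match
  if (immediatelyPlayable.has(mimeType)) return "playable";   // blocks 112-116
  return "needsInspection";                                   // proceed to block 124
}
```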
On the other hand, if there is no match between the media file's MIME type and the list of “immediately playable” MIME types in block 114, then the web browser may need to perform further steps to determine whether the media file is playable on the device. These further steps commence with block 124 and are described in detail below with respect to
Going back to block 110, if there is no match between the media file's MIME type and the list of known MIME types, block 118 is performed. In block 118, the web browser compares the media file's file extension with a list of known file extensions. The file extension may be obtained from the path to the media file. Similar to the list of known MIME types, the list of known file extensions is specific to each device. The list of known file extensions contains all file extensions which may be played by a device. In block 120, if there is no match between the media file's file extension and a file extension in the list of known file extensions, then block 122 is reached and the web browser determines that this media file is not playable. On the other hand, if there is a match between the media file's extension and a file extension in the list of known file extensions, then the web browser may need to perform further steps to determine whether the media file is playable on the device. These further steps commence with block 124 and are described in detail below with respect to
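A sketch of the file-extension fallback of blocks 118 through 122 is given below, with an illustrative, device-specific extension list.

```typescript
// Illustrative sketch of blocks 118-122: fall back to the file extension when
// the MIME type is unrecognized. The extension list is an assumption.
const knownExtensions = new Set(["mp3", "mp4", "m4v", "mov"]);

function checkExtension(mediaUrl: string): "needsInspection" | "notPlayable" {
  const path = new URL(mediaUrl).pathname;            // assumes an absolute URL
  const ext = path.split(".").pop()?.toLowerCase() ?? "";
  return knownExtensions.has(ext) ? "needsInspection" : "notPlayable";
}
```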
As discussed above, if the web browser cannot determine whether a media file is playable by a device based on a media file's MIME type and file extension alone, then the web browser may request additional data from the server to perform further analysis. The flow diagram in
In block 124, the web browser makes an initial determination, based on the media file's MIME type and file extension, regarding what kind of format is in the media file. For example, if the media file has a file extension of “.MP4”, then the web browser determines that the media file is likely in a format in the MPEG4 family of formats. In another example, if the media file has an extension of “.MOV”, then the web browser determines that the media file is likely in a format in the QuickTime family of formats. Based on this initial determination, the web browser requests the first eight bytes of the media file from the server in block 126.
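Block 126 might be realized with an HTTP Range request, assuming the server supports partial content responses, so that only eight bytes are transferred.

```typescript
// Illustrative sketch of block 126: request only the first eight bytes of the
// media file, which hold the header of its first atom.
async function fetchFirstAtomHeader(mediaUrl: string): Promise<ArrayBuffer> {
  const response = await fetch(mediaUrl, { headers: { Range: "bytes=0-7" } });
  return response.arrayBuffer();   // eight bytes when the server returns 206 Partial Content
}
```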
When a media file is in the MPEG4 or QuickTime family of formats, the first eight bytes of the media file is the “header” of the first “atom” of the media file.
One type of atom is the “moov” (movie) atom. The contents of a “moov” atom include information about how the movie is encoded and a table of contents for the media file. Another type of atom is an “mdat” (movie data) atom. An “mdat” atom contains the video and/or audio data in the media file. A third type of atom is an “ftyp” (file type) atom. The “ftyp” atom identifies the format of the media file within the MPEG4 or QuickTime family. Significantly, atoms in an MPEG4 or QuickTime media file can be arranged in any order. That is, although a “moov” atom contains the table of contents for the media file, it may actually follow the “mdat” atom(s) of the media file. One exception is the “ftyp” atom: if a media file contains an “ftyp” atom, the “ftyp” atom is always the first atom in the media file.
In block 128, the web browser receives the first eight bytes of the media file. As discussed above, these eight bytes constitute the header of the first atom of the media file. The web browser analyzes the header, specifically the type field of the header, and determines whether the header is the header of an “ftyp” atom. In block 130, if the header does not indicate an “ftyp” atom, then the web browser downloads a “moov” atom from the server in block 142 in order to gather additional data about the media file. Block 142 is discussed in more detail below. If the header indicates an “ftyp” atom, then in block 132, the web browser requests the entirety of the “ftyp” atom from the server in order to perform analysis on the contents of the “ftyp” atom.
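Building on the fetchFirstAtomHeader sketch above, blocks 128 through 132 might be realized as follows. The sketch assumes the conventional MPEG4/QuickTime atom header layout of four big-endian length bytes followed by a four-character type code.

```typescript
// Illustrative sketch of blocks 128-132: parse the first atom header and, if it
// is an "ftyp" atom, request only the remainder of that atom.
function parseAtomHeader(buf: ArrayBuffer): { size: number; type: string } {
  const view = new DataView(buf);
  const size = view.getUint32(0);                                   // big-endian atom length
  const type = new TextDecoder("ascii").decode(buf.slice(4, 8));    // e.g. "ftyp", "moov", "mdat"
  return { size, type };
}

async function fetchFtypAtom(mediaUrl: string): Promise<ArrayBuffer | null> {
  const header = parseAtomHeader(await fetchFirstAtomHeader(mediaUrl));
  if (header.type !== "ftyp") return null;    // block 130: proceed to the "moov" path (block 142)
  // Block 132: request only the bytes of the "ftyp" atom itself.
  const rest = await fetch(mediaUrl, { headers: { Range: `bytes=0-${header.size - 1}` } });
  return rest.arrayBuffer();
}
```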
The content of a media file's “ftyp” atom contains a series of “profiles” from which the web browser can derive information about the format of the media file. Specifically, each profile contains information about what codec (coding and decoding) formats and bit rates are compatible with the audio and video data in the media file. An “ftyp” atom may contain more than one profile because the audio and video data in a media file may be compatible with multiple coding formats and bit rates.
In block 134, the web browser parses the profiles in the “ftyp” atom one at a time to extract format information. In block 136, the web browser compares the extracted format information to information about which formats are supported, or playable, by the device. Information about which formats are supported by a device may be derived from what types of applications are installed on the device. Once again, this information is specific to each device. If the web browser determines that a profile is supported, that is, the profile indicates a format compatible with the media file that is playable by the device, then the web browser determines in block 138 that the media file is playable by the device. In block 136, if the web browser determines that the profile is not supported, then further examination of the “ftyp” atom is performed.
In block 140, if the web browser detects that there are more profiles in the “ftyp” atom which have not been analyzed, block 134 is repeated. In block 134, the next profile is parsed and the format information contained in the profile is extracted. However, if the end of the “ftyp” atom has been reached and there are no more profiles to parse, then additional information is gathered by requesting and receiving data until the “moov” atom is received in block 142.
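For illustration, the loop of blocks 134 through 140 might be realized as below. The description above refers to “profiles”; this sketch reads them as the four-byte major and compatible brand codes that follow the “ftyp” header, which is an assumption about the intended mapping, and the supportedBrands set is an assumption standing in for the device's installed codec capabilities.

```typescript
// Illustrative sketch of blocks 134-140: walk the profiles (brand codes) of an
// "ftyp" atom and report a match against the device's supported formats.
const supportedBrands = new Set(["mp42", "M4V ", "qt  "]);   // illustrative only

function isFtypSupported(ftypAtom: ArrayBuffer): boolean {
  const bytes = new Uint8Array(ftypAtom);
  const code = (off: number) =>
    new TextDecoder("ascii").decode(bytes.slice(off, off + 4));
  // Major brand at offset 8; compatible brands follow the four-byte version field.
  const profiles = [code(8)];
  for (let offset = 16; offset + 4 <= bytes.length; offset += 4) {
    profiles.push(code(offset));
  }
  // Block 138 if any profile matches; block 142 (fetch "moov") otherwise.
  return profiles.some((profile) => supportedBrands.has(profile));
}
```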
As
In block 144, the web browser receives the “moov” atom from the server and analyzes it for information about the media file's format. A “moov” atom may itself be large and may contain several megabytes of data, so downloading an entire “moov” atom may also incur unwanted time and cost. A “moov” atom's content, however, is itself divided into “sub-atoms”, where the header of each sub-atom indicates the length and type of the sub-atom. Consequently, in block 144, the web browser downloads only the headers of the “moov” atom's sub-atoms, skipping over irrelevant content until it detects a sub-atom which contains information about the media file's format. For example, the “moov” atom may contain sub-atoms which contain information about the media file's audio track and the media file's video track.
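One way to realize the skipping behavior of block 144, again assuming HTTP Range support and reusing parseAtomHeader from the sketch above, is the following; the “trak” type used to identify track sub-atoms is the conventional MPEG4/QuickTime name for them.

```typescript
// Illustrative sketch of block 144: read each sub-atom header inside the
// "moov" atom and skip its body unless it is a track ("trak") sub-atom.
async function findTrackSubAtoms(
  mediaUrl: string,
  moovStart: number,   // byte offset of the "moov" atom within the file
  moovSize: number     // total length of the "moov" atom, from its header
): Promise<number[]> {
  const trackOffsets: number[] = [];
  let offset = moovStart + 8;                       // skip the "moov" header itself
  const end = moovStart + moovSize;
  while (offset + 8 <= end) {
    const res = await fetch(mediaUrl, { headers: { Range: `bytes=${offset}-${offset + 7}` } });
    const { size, type } = parseAtomHeader(await res.arrayBuffer());
    if (size < 8) break;                            // malformed header; stop rather than loop
    if (type === "trak") trackOffsets.push(offset); // candidate audio or video track
    offset += size;                                 // skip over the sub-atom's body
  }
  return trackOffsets;
}
```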
The web browser may extract information from the audio track and video track sub-atoms and compare it to the formats that are supported, or playable, by the device. For example, the video track sub-atom may indicate that the media file's video data has been compressed using “B-frames” (bi-directional frames), while the device does not have any applications which can play video data compressed into “B-frames”. In this example, in block 146, the web browser determines that the video track is not supported by the device, and determines in block 148 that the media file is not playable by the device. In another example, the video track sub-atom may indicate that the media file's video data has been compressed using “I-frames” (intra frames), and the device does contain at least one application that can play video data compressed into “I-frames”. In this example, if the audio track is similarly supported, then in block 146 the web browser determines that the audio and video tracks are playable and determines in block 138 that the media file is playable by the device.
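A sketch of the comparison of blocks 146, 148, and 138 follows; the TrackInfo fields and the capability flags are illustrative assumptions standing in for what would actually be extracted from the track sub-atoms and from the device's installed applications.

```typescript
// Illustrative sketch of blocks 146/148/138: compare extracted track
// characteristics against the device's (assumed) playback capabilities.
interface TrackInfo {
  kind: "audio" | "video";
  codec: string;           // e.g. "avc1", "mp4a" (illustrative codec identifiers)
  usesBFrames?: boolean;   // video tracks only
}

const deviceCapabilities = {
  codecs: new Set(["avc1", "mp4a"]),  // illustrative
  supportsBFrames: false,
};

function isTrackPlayable(track: TrackInfo): boolean {
  if (!deviceCapabilities.codecs.has(track.codec)) return false;   // block 148
  if (track.kind === "video" && track.usesBFrames && !deviceCapabilities.supportsBFrames) {
    return false;   // e.g. B-frame video on a device with no B-frame support
  }
  return true;      // contributes to block 138 when every track is playable
}
```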
Although an embodiment of the invention is described above in processes 100A and 100B (and
In one embodiment, once a web browser has determined that a media file is either playable or not playable, it may display this information to the user on the web browser screen. If the media file is playable, the web browser may further provide a selectable option for the user to download the media file for playing. Alternatively, the web browser may automatically download the media file and commence playing the media file. Finally, if the web browser determines that a media file is a video file, but that only one of the video file's audio and video tracks is playable, the web browser may display this detailed information to the user and provide a selectable option for the user to download the playable track for playing.
In another embodiment, the user may be informed of whether a media file is playable before any downloading commences. For example, the user may select an option where only the media file's MIME type is examined.
In one embodiment, the process of downloading information from the server and performing an analysis to determine whether a media file embedded in a web page is playable is performed asynchronously with respect to another process which downloads the contents of the web page. This asynchronous operation allows the web browser to load and display other contents of the web page for the user's viewing without waiting for the analysis on the embedded media file to finish. In one embodiment, when there are multiple media files embedded in the web page, a separate asynchronous process analyzes each media file.
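The asynchronous arrangement described above might be sketched as follows; determinePlayability and annotateLink are stand-ins for the analysis and display steps already discussed.

```typescript
// Illustrative sketch: analyze each embedded media file in its own promise so
// that loading and displaying the rest of the page is never blocked.
function analyzeEmbeddedMedia(
  mediaUrls: string[],
  determinePlayability: (url: string) => Promise<boolean>,
  annotateLink: (url: string, playable: boolean) => void
): void {
  for (const url of mediaUrls) {
    // Fire and forget: each result annotates the page whenever it arrives.
    determinePlayability(url)
      .then((playable) => annotateLink(url, playable))
      .catch(() => annotateLink(url, false));   // treat failures as "not playable"
  }
}
```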
In one embodiment, the device is a mobile device. Mobile devices, such as cellular phones, usually have less communication bandwidth and less storage capacity than wired devices, such as personal computers. Furthermore, users of mobile devices may be charged on a per-byte-downloaded basis. Therefore, it may be advantageous to inform users of mobile devices about whether a media file is playable without incurring the cost of downloading and storing an entire media file.
Computer system 300 may be coupled via bus 302 to a display 312, such as a cathode ray tube (CRT), for displaying information to a computer user. An input device 314, including alphanumeric and other keys, is coupled to bus 302 for communicating information and command selections to processor 304. Another type of user input device is cursor control 316, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 304 and for controlling cursor movement on display 312. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane.
The invention is related to the use of computer system 300 for implementing the techniques described herein. According to one embodiment of the invention, those techniques are performed by computer system 300 in response to processor 304 executing one or more sequences of one or more instructions contained in main memory 306. Such instructions may be read into main memory 306 from another machine-readable medium, such as storage device 310. Execution of the sequences of instructions contained in main memory 306 causes processor 304 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions to implement the invention. Thus, embodiments of the invention are not limited to any specific combination of hardware circuitry and software.
The term “machine-readable medium” as used herein refers to any medium that participates in providing data that causes a machine to operate in a specific fashion. In an embodiment implemented using computer system 300, various machine-readable media are involved, for example, in providing instructions to processor 304 for execution. Such a medium may take many forms, including but not limited to storage media and transmission media. Storage media includes both non-volatile media and volatile media. Non-volatile media includes, for example, optical or magnetic disks, such as storage device 310. Volatile media includes dynamic memory, such as main memory 306. Transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 302. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications. All such media must be tangible to enable the instructions carried by the media to be detected by a physical mechanism that reads the instructions into a machine.
Common forms of machine-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read.
Various forms of machine-readable media may be involved in carrying one or more sequences of one or more instructions to processor 304 for execution. For example, the instructions may initially be carried on a magnetic disk of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to computer system 300 can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal. An infra-red detector can receive the data carried in the infra-red signal and appropriate circuitry can place the data on bus 302. Bus 302 carries the data to main memory 306, from which processor 304 retrieves and executes the instructions. The instructions received by main memory 306 may optionally be stored on storage device 310 either before or after execution by processor 304.
Computer system 300 also includes a communication interface 318 coupled to bus 302. Communication interface 318 provides a two-way data communication coupling to a network link 320 that is connected to a local network 322. For example, communication interface 318 may be an integrated services digital network (ISDN) card or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 318 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN. Wireless links may also be implemented. In any such implementation, communication interface 318 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
Network link 320 typically provides data communication through one or more networks to other data devices. For example, network link 320 may provide a connection through local network 322 to a host computer 324 or to data equipment operated by an Internet Service Provider (ISP) 326. ISP 326 in turn provides data communication services through the world wide packet data communication network now commonly referred to as the “Internet” 328. Local network 322 and Internet 328 both use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on network link 320 and through communication interface 318, which carry the digital data to and from computer system 300, are exemplary forms of carrier waves transporting the information.
Computer system 300 can send messages and receive data, including program code, through the network(s), network link 320 and communication interface 318. In the Internet example, a server 330 might transmit a requested code for an application program through Internet 328, ISP 326, local network 322 and communication interface 318.
The received code may be executed by processor 304 as it is received, and/or stored in storage device 310, or other non-volatile storage for later execution. In this manner, computer system 300 may obtain application code in the form of a carrier wave.
In the foregoing specification, embodiments of the invention have been described with reference to numerous specific details that may vary from implementation to implementation. Thus, the sole and exclusive indicator of what is the invention, and is intended by the applicants to be the invention, is the set of claims that issue from this application, in the specific form in which such claims issue, including any subsequent correction. Any definitions expressly set forth herein for terms contained in such claims shall govern the meaning of such terms as used in the claims. Hence, no limitation, element, property, feature, advantage or attribute that is not expressly recited in a claim should limit the scope of such claim in any way. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
The present application is a continuation of U.S. application Ser. No. 12/143,119 filed on Jun. 20, 2008, which claims priority to provisional application No. 60/936,862 filed Jun. 22, 2007, the contents of which are incorporated herein in their entirety.