Claims
- 1. A method for transmitting object data over a computer network, comprising the steps of:
retrieving a request for at least one data object from a data processing device by a cache memory system, the cache memory system including a first cache memory and a second cache memory, the cache memory system being communicably coupled to the data processing device and at least one data storage device;
in the event of a cache miss, fetching a predetermined number of initial units of the requested data object from the data storage device and storing the initial object data units in the first cache memory by the cache memory system;
delivering the initial object data units to the data processing device while fetching remaining units of the requested data object from the data storage device and storing the remaining object data units in the second cache memory by the cache memory system; and
delivering the remaining object data units to the data processing device by the cache memory system.
- 2. The method of claim 1 wherein the retrieving step includes retrieving a request for a plurality of data objects from a script by the cache memory system.
- 3. The method of claim 2 further including the step of pre-processing the script by the cache memory system, wherein the first fetching step includes fetching predetermined numbers of initial units of the requested data objects from the data storage device in parallel, and wherein the second fetching step includes fetching remaining units of the requested data objects from the data storage device in parallel.
- 4. The method of claim 1 further including the steps of, at least at some times, fetching additional remaining units of the requested data object from the data storage device by the cache memory system, storing the additional remaining object data units in the second cache memory by the cache memory system, and delivering the additional remaining object data units to the data processing device by the cache memory system.
- 5. The method of claim 1 wherein the retrieving step includes retrieving a request for at least one data object from the data processing device by the cache memory system, the cache memory system including the first cache memory, the second cache memory, and a database, and further including the step of generating a record for the initial object data units stored in the first cache memory by the cache memory system, the record including at least one field selected from an object identifier field, a local object locator field, a number of data units stored field, an insert time field, an access time field, and an expire time field.
- 6. The method of claim 5 wherein the second fetching step includes fetching the remaining units of the requested data object at a point after the beginning of the data object based on a value in the number of data units stored field of the database record.
- 7. The method of claim 1 wherein the first fetching step includes fetching a predetermined fixed number of initial units of the requested data object from the data storage device to achieve a desired playback rate of the data.
- 8. The method of claim 1 wherein the first fetching step includes fetching the predetermined number of initial units of the requested data object from the data storage device to achieve a desired playback rate of the data, the predetermined number of initial units corresponding to a number of initial object data units provided by the data storage device during a fixed interval of time.
- 9. The method of claim 1 wherein the first fetching step includes fetching the predetermined number of initial units of the requested data object from the data storage device to achieve a desired playback rate of the data, the predetermined number of initial units corresponding to a number of initial object data units provided by the data storage device during a variable interval of time.
- 10. A system for transmitting object data over a computer network, comprising:
at least one data processing device;
at least one data storage device; and
a cache memory system communicably coupled to the data processing device and the data storage device, the cache memory system including a first cache memory and a second cache memory,
wherein the cache memory system is configured to retrieve a request for at least one data object from the data processing device, in the event of a cache miss fetch a predetermined number of initial units of the requested data object from the data storage device and store the initial object data units in the first cache memory, deliver the initial object data units to the data processing device while fetching remaining units of the requested data object from the data storage device and storing the remaining object data units in the second cache memory, and deliver the remaining object data units to the data processing device.
- 11. The system of claim 10 wherein the cache memory system is configured to retrieve a request for a plurality of data objects from a script.
- 12. The system of claim 11 wherein the cache memory system is further configured to pre-process the script, in the event of a cache miss fetch predetermined numbers of initial units of the requested data objects from the data storage device in parallel, and deliver the initial object data units to the data processing device while fetching remaining units of the requested data objects from the data storage device in parallel.
- 13. The system of claim 10 wherein the cache memory system is further configured to, at least at some times, fetch additional remaining units of the requested data object from the data storage device, store the additional remaining object data units in the second cache memory, and deliver the additional remaining object data units to the data processing device.
- 14. The system of claim 10 wherein the cache memory system further includes a database, and the cache memory system is further configured to generate a record for the initial object data units stored in the first cache memory, the record including at least one field selected from an object identifier field, a local object locator field, a number of data units stored field, an insert time field, an access time field, and an expire time field.
- 15. The system of claim 10 wherein the predetermined number of initial object data units is fixed.
- 16. The system of claim 10 wherein the predetermined number of initial object data units is variable.
- 17. The system of claim 10 wherein the predetermined number of initial object data units is determined to achieve a desired playback rate of the data, the predetermined number of initial units corresponding to a number of initial object data units provided by the data storage device during a fixed or variable interval of time.
- 18. The system of claim 10 further including a low-latency first link communicably coupling the data processing device and the cache memory system, and a second link communicably coupling the data storage device and the cache memory system.
- 19. The system of claim 18 wherein the first and second links comprise different physical media.
- 20. The system of claim 18 wherein the first and second links comprise respective portions of the same physical medium.
- 21. The system of claim 18 wherein the second link communicably couples the data storage device to the first and second cache memories.
- 22. The system of claim 18 further including a third link communicably coupling the data storage device to the first cache memory, the second link communicably coupling the data storage device to the second cache memory.
- 23. The system of claim 18 wherein the first link comprises a Local Area Network (LAN).
- 24. The system of claim 18 wherein the first link comprises a local bus.
- 25. The system of claim 24 wherein the local bus is selected from a Peripheral Component Interconnect (PCI) bus and a Small Computer System Interface (SCSI) bus.
- 26. The system of claim 18 wherein the second link is selected from an Ethernet LAN, a Wide Area Network (WAN), and the Internet.
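
The method of claim 1 can be pictured with a short sketch. The Python code below is illustrative only, not a rendering the claims prescribe: the storage object, its fetch_units(object_id, start, count) call, and the deliver() callback to the data processing device are assumptions introduced here. The predetermined number of initial units is sized from a desired playback rate and a startup interval, echoing claims 7 through 9.

```python
# Illustrative sketch only: the claims do not prescribe this API; the storage
# object, fetch_units() signature, and deliver() callback are assumptions.
import threading
from collections import OrderedDict


class CacheMemorySystem:
    """Two-tier cache of claim 1: initial units in a first cache memory,
    remaining units in a second cache memory."""

    def __init__(self, storage, playback_rate_units_per_sec, startup_interval_sec):
        self.storage = storage                    # the data storage device
        self.first_cache = OrderedDict()          # initial object data units
        self.second_cache = OrderedDict()         # remaining object data units
        # Claims 7-9: size the predetermined number of initial units so delivery
        # can begin at the desired playback rate for a fixed or variable interval.
        self.initial_units = int(playback_rate_units_per_sec * startup_interval_sec)

    def handle_request(self, object_id, deliver):
        """Serve one requested data object to the data processing device."""
        if object_id in self.first_cache:         # cache hit: serve from both caches
            for unit in self.first_cache[object_id]:
                deliver(unit)
            for unit in self.second_cache.get(object_id, []):
                deliver(unit)
            return

        # Cache miss: fetch the predetermined number of initial units and store
        # them in the first cache memory.
        initial = self.storage.fetch_units(object_id, start=0, count=self.initial_units)
        self.first_cache[object_id] = initial

        # Fetch the remaining units in the background while the initial units
        # are delivered (the claimed overlap of delivery and fetching).
        result = {}

        def fetch_remaining():
            result["units"] = self.storage.fetch_units(
                object_id, start=len(initial), count=None)

        worker = threading.Thread(target=fetch_remaining)
        worker.start()

        for unit in initial:                      # deliver initial object data units
            deliver(unit)

        worker.join()
        remaining = result["units"]
        self.second_cache[object_id] = remaining  # store in the second cache memory
        for unit in remaining:                    # deliver remaining object data units
            deliver(unit)
```

A background thread stands in here for the claimed overlap between delivering the initial units and fetching the remaining units; an actual cache memory system could equally use asynchronous I/O or dedicated hardware paths.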
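Claims 5, 6 and 14 recite a database record for the initial object data units. A minimal sketch of such a record, showing all six optional fields and using the number-of-data-units-stored field as the resume point for the second fetch per claim 6, might look as follows; the field names and the fetch_units() call are assumptions, not claim language.

```python
# Illustrative record layout only; claims 5 and 14 require "at least one" of
# these fields, and the names used here are assumptions.
import time
from dataclasses import dataclass, field


@dataclass
class CacheRecord:
    object_id: str                # object identifier field
    local_object_locator: str     # local object locator field (first-cache location)
    units_stored: int             # number of data units stored field
    insert_time: float = field(default_factory=time.time)   # insert time field
    access_time: float = field(default_factory=time.time)   # access time field
    expire_time: float = 0.0                                 # expire time field


def fetch_remaining_units(storage, record):
    """Claim 6: start the second fetch at the point after the units already
    stored, based on the number-of-data-units-stored field (assumed API)."""
    return storage.fetch_units(record.object_id,
                               start=record.units_stored,
                               count=None)
```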
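Claims 2, 3, 11 and 12 cover requests that arrive as a script naming several data objects, with the initial (and later the remaining) units of those objects fetched in parallel. The sketch below assumes the script lists one object identifier per line; the script format and storage API are assumptions, not part of the claims.

```python
# Illustrative only: the script format (one object identifier per line) and the
# fetch_units() storage API are assumptions, not claim language.
from concurrent.futures import ThreadPoolExecutor


def prefetch_script(storage, first_cache, initial_units, script_text):
    """Pre-process a script naming several data objects and fetch each object's
    initial units in parallel into the first cache memory (claims 3 and 12)."""
    object_ids = [line.strip() for line in script_text.splitlines() if line.strip()]
    with ThreadPoolExecutor() as pool:
        batches = pool.map(
            lambda oid: storage.fetch_units(oid, start=0, count=initial_units),
            object_ids)
        for oid, units in zip(object_ids, batches):
            first_cache[oid] = units
    return object_ids
```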
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority to U.S. Provisional Patent Application No. 60/284,853, filed Apr. 19, 2001, entitled CACHE FOR LARGE-OBJECT REAL-TIME LATENCY ELIMINATION.
Provisional Applications (1)

| Number   | Date     | Country |
|----------|----------|---------|
| 60284853 | Apr 2001 | US      |