Claims
- 1. A method for designing a cache comprising the steps of:
receiving requests forming a workload; performing a trace of said workload; analyzing said trace of said workload; and designing one or more stacks of cache entries based on said analysis of said trace of said workload.
- 2. The method as recited in claim 1 further comprising the step of:
designing a length of each of said one or more stacks of said cache entries based on said analysis of said trace of said workload.
- 3. The method as recited in claim 1, wherein said requests that form said workload are requests to access a disk.
- 4. The method as recited in claim 1, wherein said requests are issued from one or more clients in a network system to a network server, wherein said cache is associated with said network server.
- 5. The method as recited in claim 1, wherein said cache comprises a plurality of logical sections, wherein one of said plurality of logical sections comprises information, wherein one of said plurality of logical sections comprises a cache directory.
- 6. The method as recited in claim 5, wherein said cache directory in said cache comprises a logical based address associated with each cache entry in said cache.
- 7. The method as recited in claim 5, wherein said cache directory in said cache comprises a logical time stamp associated with each cache entry in said cache.
- 8. The method as recited in claim 5, wherein said cache directory in said cache comprises a frequency count associated with each cache entry in said cache.
- 9. The method as recited in claim 8, wherein said frequency count is a number of requests to a particular cache entry.
- 10. The method as recited in claim 8, wherein said one or more stacks are designed in an array based on said frequency count of said cache entries.
- 11. The method as recited in claim 10, wherein each of said one or more stacks comprises cache entries associated with a particular frequency count.
- 12. The method as recited in claim 7, wherein each of said one or more stacks comprises cache entries ordered from most recently used to least recently used based on said logical time stamps of said cache entries.
- 13. The method as recited in claim 12, wherein a cache entry at a least recently used stack position in a particular stack is evicted upon the storing of a new cache entry at a most recently used stack position in said particular stack.
- 14. The method as recited in claim 13, wherein information in said evicted cache entry is discarded.
- 15. The method as recited in claim 13, wherein said evicted cache entry is stored at said most recently used stack position of a next lower level stack except when said particular stack is a lowest level stack.
- 16. The method as recited in claim 13, wherein said evicted cache entry is stored at said most recently used stack position of a next higher level stack except when said particular stack is a highest level stack.
- 17. The method as recited in claim 13, wherein said new cache entry is stored at said most recently used stack position in said particular stack upon a cache miss.
- 18. A computer program product having a computer readable medium with computer program logic recorded thereon for designing a cache, comprising:
programming operable for receiving requests forming a workload; programming operable for performing a trace of said workload; programming operable for analyzing said trace of said workload; and programming operable for designing one or more stacks of cache entries based on said analysis of said trace of said workload.
- 19. The computer program product as recited in claim 18 further comprising:
programming operable for designing a length of each of said one or more stacks of said cache entries based on said analysis of said trace of said workload.
- 20. The computer program product as recited in claim 18, wherein said requests that form said workload are requests to access a disk.
- 21. The computer program product as recited in claim 18, wherein said requests are issued from one or more clients in a network system to a network server, wherein said cache is associated with said network server.
- 22. The computer program product as recited in claim 18, wherein said cache comprises a plurality of logical sections, wherein one of said plurality of logical sections comprises information, wherein one of said plurality of logical sections comprises a cache directory.
- 23. The computer program product as recited in claim 22, wherein said cache directory in said cache comprises a logical based address associated with each cache entry in said cache.
- 24. The computer program product as recited in claim 22, wherein said cache directory in said cache comprises a logical time stamp associated with each cache entry in said cache.
- 25. The computer program product as recited in claim 22, wherein said cache directory in said cache comprises a frequency count associated with each cache entry in said cache.
- 26. The computer program product as recited in claim 25, wherein said frequency count is a number of requests to a particular cache entry.
- 27. The computer program product as recited in claim 25, wherein said one or more stacks are designed in an array based on said frequency count of said cache entries.
- 28. The computer program product as recited in claim 27, wherein each of said one or more stacks comprises cache entries associated with a particular frequency count.
- 29. The computer program product as recited in claim 24, wherein each of said one or more stacks comprises cache entries ordered from most recently used to least recently used based on said logical time stamps of said cache entries.
- 30. The computer program product as recited in claim 29, wherein a cache entry at a least recently used stack position in a particular stack is evicted upon the storing of a new cache entry at a most recently used stack position in said particular stack.
- 31. The computer program product as recited in claim 30, wherein information in said evicted cache entry is discarded.
- 32. The computer program product as recited in claim 30, wherein said evicted cache entry is stored at said most recently used stack position of a next lower level stack except when said particular stack is a lowest level stack.
- 33. The computer program product as recited in claim 30, wherein said evicted cache entry is stored at said most recently used stack position of a next higher level stack except when said particular stack is a highest level stack.
- 34. The computer program product as recited in claim 30, wherein said new cache entry is stored at said most recently used stack position in said particular stack upon a cache miss.
- 35. A system comprising:
one or more clients; a server coupled to said one or more clients, wherein said server comprises:
a processor; a memory unit operable for storing a computer program operable for designing a cache; and a bus system coupling said processor to said memory unit, wherein said computer program is operable for performing the following programming steps:
receiving requests forming a workload; performing a trace of said workload; analyzing said trace of said workload; and designing one or more stacks of cache entries based on said analysis of said trace of said workload.
- 36. The system as recited in claim 35, wherein said computer program is further operable to perform the programming step:
designing a length of each of said one or more stacks of said cache entries based on said analysis of said trace of said workload.
- 37. The system as recited in claim 35, wherein said requests that form said workload are requests to access a disk.
- 38. The system as recited in claim 35, wherein said requests are issued from said one or more clients to said server.
- 39. The system as recited in claim 35, wherein said cache comprises a plurality of logical sections, wherein one of said plurality of logical sections comprises information, wherein one of said plurality of logical sections comprises a cache directory.
- 40. The system as recited in claim 39, wherein said cache directory in said cache comprises a logical based address associated with each cache entry in said cache.
- 41. The system as recited in claim 39, wherein said cache directory in said cache comprises a logical time stamp associated with each cache entry in said cache.
- 42. The system as recited in claim 39, wherein said cache directory in said cache comprises a frequency count associated with each cache entry in said cache.
- 43. The system as recited in claim 42, wherein said frequency count is a number of requests to a particular cache entry.
- 44. The system as recited in claim 42, wherein said one or more stacks are designed in an array based on said frequency count of said cache entries.
- 45. The system as recited in claim 44, wherein each of said one or more stacks comprises cache entries associated with a particular frequency count.
- 46. The system as recited in claim 41, wherein each of said one or more stacks comprises cache entries ordered from most recently used to least recently used based on said logical time stamps of said cache entries.
- 47. The system as recited in claim 46, wherein a cache entry at a least recently used stack position in a particular stack is evicted upon the storing of a new cache entry at a most recently used stack position in said particular stack.
- 48. The system as recited in claim 47, wherein information in said evicted cache entry is discarded.
- 49. The system as recited in claim 47, wherein said evicted cache entry is stored at said most recently used stack position of a next lower level stack except when said particular stack is a lowest level stack.
- 50. The system as recited in claim 47, wherein said evicted cache entry is stored at said most recently used stack position of a next higher level stack except when said particular stack is a highest level stack.
- 51. The system as recited in claim 47, wherein said new cache entry is stored at said most recently used stack position in said particular stack upon a cache miss.
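The trace-driven design steps recited in claims 1, 2, 18, 19, 35, and 36 can be illustrated with a short sketch. The Python fragment below is a minimal, hypothetical example rather than the specific analysis recited in the claims: it assumes a trace is simply a list of logical addresses, tallies a frequency count (the number of requests) per address, and sizes each stack in proportion to how many addresses fall into its frequency band. The function name design_stack_lengths and its parameters are illustrative assumptions.

```python
from collections import Counter

def design_stack_lengths(trace, cache_size, num_stacks):
    """Illustrative trace analysis: choose a length for each stack of
    cache entries based on the frequency counts observed in the trace."""
    # Frequency count per logical address: the number of requests to
    # that particular cache entry in the workload trace.
    freqs = Counter(trace)
    # Map each address to a stack level; frequency counts above the
    # number of stacks all fall into the highest level.
    levels = Counter(min(count, num_stacks) - 1 for count in freqs.values())
    total = sum(levels.values())
    # Size each stack in proportion to how many distinct addresses map
    # to its level, keeping at least one entry per stack.
    return [max(1, cache_size * levels.get(level, 0) // total)
            for level in range(num_stacks)]

# Example: a small workload trace of disk-block requests.
trace = ["a", "b", "a", "c", "a", "b", "d", "b", "a"]
print(design_stack_lengths(trace, cache_size=8, num_stacks=3))
```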
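A sketch of the multi-stack cache organization recited in claims 5 through 17 (and their computer program product and system counterparts) follows. It is an assumed, minimal Python model, not the claimed implementation: the cache directory keeps a logical address, a logical time stamp, and a frequency count for each entry; the stacks are held in an array indexed by frequency count; each stack is ordered from most recently used to least recently used; and an entry evicted from the least recently used position of a stack is stored at the most recently used position of the next lower level stack (the claim 15 variant), or discarded at the lowest level. The class and method names are hypothetical.

```python
from collections import OrderedDict

class MultiStackCache:
    """Minimal model: an array of stacks indexed by frequency count,
    each stack ordered from most recently used (MRU) to least
    recently used (LRU)."""

    def __init__(self, stack_lengths):
        # stack_lengths[i] is the length chosen for stack level i,
        # e.g. by a trace analysis such as the sketch above.
        self.stack_lengths = stack_lengths
        # One LRU-ordered stack per level; the first key is the MRU entry.
        self.stacks = [OrderedDict() for _ in stack_lengths]
        # Cache directory: logical address -> (level, frequency count, time stamp).
        self.directory = {}
        self.clock = 0  # source of logical time stamps

    def access(self, address, data=None):
        """Handle a request to a logical address; return the cached data,
        or None on a cache miss."""
        self.clock += 1
        if address in self.directory:
            level, freq, _ = self.directory.pop(address)
            cached = self.stacks[level].pop(address)
            freq += 1
            # Place the hit entry at the MRU position of the stack
            # associated with its new frequency count.
            new_level = min(freq - 1, len(self.stacks) - 1)
            self._insert_mru(new_level, address, cached, freq)
            return cached
        # Cache miss: store the new entry at the MRU position of the
        # lowest-level stack.
        self._insert_mru(0, address, data, 1)
        return None

    def _insert_mru(self, level, address, data, freq):
        stack = self.stacks[level]
        stack[address] = data
        stack.move_to_end(address, last=False)  # move to the MRU position
        self.directory[address] = (level, freq, self.clock)
        if len(stack) > self.stack_lengths[level]:
            self._evict_lru(level)

    def _evict_lru(self, level):
        # Evict the entry at the LRU stack position of this stack.
        victim, data = self.stacks[level].popitem(last=True)
        _, freq, _ = self.directory.pop(victim)
        if level == 0:
            return  # lowest-level stack: the evicted information is discarded
        # Otherwise store the evicted entry at the MRU position of the
        # next lower-level stack.
        self._insert_mru(level - 1, victim, data, freq)
```

In this sketch a request that misses enters at the most recently used position of the lowest-level stack, repeated hits promote an entry to higher-frequency stacks, and overflow at any level pushes the least recently used entry down toward eventual discard.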
CROSS REFERENCE TO RELATED APPLICATION
[0001] The present invention is related to the following U.S. Patent Application which is incorporated herein by reference:
[0002] Ser. No. ______ (Attorney Docket No. RSP920010002US1) entitled “Designing a Cache with Adaptive Reconfiguration” filed ______.