Claims
- 1. A method comprising the steps of:
storing a read data in a cache memory;
flagging a host data portion of the read data stored in the cache memory;
labeling a speculative data portion of the read data stored in the cache memory;
associating the host data to a first portion of a prioritization list; and
linking the speculative data to a second portion of the prioritization list, to facilitate a persistence of the speculative data stored in the cache memory for a period of time greater than a persistence of the host data stored in the cache memory.
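The prioritization scheme recited in claim 1 can be sketched in code. This is an illustrative model only, not the claimed implementation: the class name, method names, and block identifiers are all hypothetical. The key behavior is that host data is linked to the first-evicted (LRU) end of the list and speculative data to the last-evicted (MRU) end, so speculative data persists longer.

```python
from collections import OrderedDict

# Hypothetical sketch of claim 1: an eviction list in which host data is
# linked to the least-recently-used (first-evicted) portion and speculative
# data to the most-recently-used (last-evicted) portion.
class PrioritizationList:
    def __init__(self):
        # Insertion order models eviction order: front entries are evicted first.
        self._list = OrderedDict()

    def link_host(self, block_id, data):
        # Host data joins the LRU (first-evicted) portion of the list.
        self._list[block_id] = data
        self._list.move_to_end(block_id, last=False)

    def link_speculative(self, block_id, data):
        # Speculative data joins the MRU (last-evicted) portion of the list.
        self._list[block_id] = data
        self._list.move_to_end(block_id, last=True)

    def evict(self):
        # Remove and return the lowest-priority (LRU-end) entry.
        block_id, _data = self._list.popitem(last=False)
        return block_id

plist = PrioritizationList()
plist.link_speculative("roa0", b"...")   # read on arrival data
plist.link_host("host0", b"...")         # host data
plist.link_speculative("rla0", b"...")   # read look ahead data
assert plist.evict() == "host0"          # host data is evicted first
```

Under this model, the speculative blocks (`roa0`, `rla0`) remain cached after the host block is evicted, which is the persistence relationship claim 1 recites.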
- 2. The method of claim 1, in which the read data is stored in the cache memory by steps comprising:
receiving a host data read command for retrieval of the host data;
executing a seek command to retrieve the host data from a predetermined data sector;
reading a read on arrival data from a data sector preceding the predetermined data sector;
transducing the host data from the predetermined data sector;
retrieving a read look ahead data from a data sector subsequent to the predetermined data sector;
selecting a cache memory fragment sized to accommodate the read on arrival data along with the host data in addition to the read look ahead data; and
storing the read on arrival data along with the host data in addition to the read look ahead data in the cache memory fragment to form the read data.
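The read sequence of claim 2 can be sketched as follows. All names, the sector size, and the speculative sector counts are hypothetical; the point is that one read command yields read on arrival sectors (those passed before the target), the host sectors, and read look ahead sectors (those following the target), all stored in a single cache fragment.

```python
# Hypothetical sketch of claim 2: servicing a host read also captures the
# sectors read before arrival and the sectors read ahead, in one fragment.
SECTOR_SIZE = 512  # illustrative bytes-per-sector value

def service_read(disk, start_sector, count, roa_sectors=2, rla_sectors=2):
    # Read on arrival: sectors passed while the head settles on the target.
    roa = [disk[s] for s in range(start_sector - roa_sectors, start_sector)]
    # Host data: the sectors the host actually requested.
    host = [disk[s] for s in range(start_sector, start_sector + count)]
    # Read look ahead: sectors following the requested range.
    rla = [disk[s] for s in range(start_sector + count,
                                  start_sector + count + rla_sectors)]
    # One cache fragment sized to hold all three portions forms the read data.
    fragment = roa + host + rla
    return fragment, host

disk = {s: bytes([s % 256]) * SECTOR_SIZE for s in range(100)}
fragment, host = service_read(disk, 10, 3)
assert len(fragment) == 7  # 2 ROA + 3 host + 2 RLA sectors
```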
- 3. The method of claim 2, in which the predetermined data sector, the data sector preceding the predetermined data sector along with the data sector subsequent to the predetermined data sector are each sized to accommodate a substantially equal volume of data, and in which the cache memory is segmented into a plurality of cache memory blocks wherein each cache memory block is sized to accommodate a substantially equal volume of data as the volume of data accommodated by the predetermined data sector.
- 4. The method of claim 3, in which the host data occupies a plurality of predetermined data sectors, the read on arrival data occupies a plurality of data sectors preceding the host data, and the read look ahead data occupies a plurality of data sectors subsequent to the host data.
- 5. The method of claim 4, in which the cache memory fragment comprises a plurality of cache memory blocks with a first portion of the plurality of cache memory blocks storing the read on arrival data, a second portion of the plurality of cache memory blocks storing the host data, and a third portion of the cache memory blocks storing the read look ahead data.
- 6. The method of claim 5, in which flagging the host data of the read data comprises the steps of:
transferring the host data to the host;
identifying an initial cache memory block of the second portion of the plurality of cache memory blocks storing the host data;
setting a first host data pointer to the initial cache memory block of the second portion of the plurality of cache memory blocks storing the host data;
determining a final cache memory block of the second portion of the plurality of cache memory blocks storing the host data;
setting a second host data pointer to the final cache memory block of the second portion of the plurality of cache memory blocks storing the host data; and
associating the first host data pointer with the second host data pointer to identify a host data sub-fragment of the cache memory fragment.
- 7. The method of claim 5, in which labeling the speculative data portion of the read data comprises the steps of:
transferring the host data to the host;
identifying an initial cache memory block of the first portion of the plurality of cache memory blocks storing the read on arrival data;
setting a first read on arrival pointer to the initial cache memory block of the first portion of the plurality of cache memory blocks storing the read on arrival data;
determining a final cache memory block of the first portion of the plurality of cache memory blocks storing the read on arrival data;
setting a second read on arrival pointer to the final cache memory block of the first portion of the plurality of cache memory blocks storing the read on arrival data;
associating the first read on arrival pointer with the second read on arrival pointer to identify a read on arrival sub-fragment of the cache memory fragment;
identifying an initial cache memory block of the third portion of the plurality of cache memory blocks storing the read look ahead data;
setting a first read look ahead pointer to the initial cache memory block of the third portion of the plurality of cache memory blocks storing the read look ahead data;
determining a final cache memory block of the third portion of the plurality of cache memory blocks storing the read look ahead data;
setting a second read look ahead pointer to the final cache memory block of the third portion of the plurality of cache memory blocks storing the read look ahead data; and
associating the first read look ahead pointer with the second read look ahead pointer to identify a read look ahead sub-fragment of the cache memory fragment.
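The pointer mechanics of claims 6 and 7 can be sketched as follows. The function and key names are hypothetical; the claimed idea is that each sub-fragment of the cache fragment is identified by a pair of pointers: one to its initial cache memory block and one to its final block.

```python
# Hypothetical sketch of claims 6-7: sub-fragments of a cache fragment are
# identified by (first block, last block) pointer pairs.
def label_fragment(n_roa, n_host, n_rla):
    # Block indices within the fragment; each pair is inclusive.
    roa = (0, n_roa - 1)                                 # read on arrival sub-fragment
    host = (n_roa, n_roa + n_host - 1)                   # host data sub-fragment
    rla = (n_roa + n_host, n_roa + n_host + n_rla - 1)   # read look ahead sub-fragment
    return {"read_on_arrival": roa, "host": host, "read_look_ahead": rla}

# A fragment of 2 ROA blocks, 3 host blocks, and 2 RLA blocks:
labels = label_fragment(2, 3, 2)
assert labels["host"] == (2, 4)  # host sub-fragment spans blocks 2 through 4
```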
- 8. The method of claim 2, in which the cache memory fragment comprises:
a host data sub-fragment storing the host data;
a read on arrival data sub-fragment storing the read on arrival data; and
a read look ahead data sub-fragment storing the read look ahead data.
- 9. The method of claim 8, in which the prioritization list prioritizes removal of the read data stored in the cache memory, and in which the first portion of the prioritization list is a least-recently-used portion having a lowest priority and subject to an earliest removal of the read data from the cache memory, and wherein the host data sub-fragment is assigned to the least-recently-used portion of the prioritization list.
- 10. The method of claim 8, in which the prioritization list prioritizes removal of the read data stored in the cache memory, and in which the second portion of the prioritization list is a most-recently-used portion having a highest priority and subject to a delayed removal of the read data from the cache memory, and wherein the read on arrival data sub-fragment is assigned to the most-recently-used portion of the prioritization list.
- 11. The method of claim 8, in which the prioritization list prioritizes removal of the read data stored in the cache memory, and in which the second portion of the prioritization list is a most-recently-used portion having a highest priority and subject to a delayed removal of the read data from the cache memory, and wherein the read look ahead data sub-fragment is assigned to the most-recently-used portion of the prioritization list.
- 12. The method of claim 8, in which the prioritization list prioritizes removal of the read data stored in the cache memory, and in which the first portion of the prioritization list is a least-recently-used portion having a lowest priority and subject to an earliest removal of the read data from the cache memory, the second portion of the prioritization list is a most-recently-used portion having a highest priority and subject to a delayed removal of the read data from the cache memory, and wherein the host data sub-fragment is assigned to the least-recently-used portion of the prioritization list, the read on arrival data sub-fragment is assigned to the most-recently-used portion of the prioritization list, and the read look ahead data sub-fragment is assigned to the most-recently-used portion of the prioritization list, and further in which the read look ahead data sub-fragment and the read on arrival data sub-fragment persist in the cache memory for a time period greater than a time period the host data persists in the cache memory.
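The combined assignment of claim 12 can be sketched end to end. The list model and block names are hypothetical. Under cache pressure, host blocks sit at the LRU (first-evicted) end and both kinds of speculative blocks sit at the MRU (last-evicted) end, so the speculative sub-fragments persist in cache longer than the host data sub-fragment.

```python
# Hypothetical sketch of claim 12: host blocks evict before speculative
# blocks, so read on arrival and read look ahead data persist longer.
eviction_list = []  # front = evicted first (LRU end), back = evicted last (MRU end)

for block in ["host0", "host1"]:
    eviction_list.insert(0, block)   # host data sub-fragment -> LRU portion

for block in ["roa0", "rla0", "rla1"]:
    eviction_list.append(block)      # speculative sub-fragments -> MRU portion

# Cache pressure forces two evictions; both come from the host sub-fragment.
evicted = [eviction_list.pop(0) for _ in range(2)]
assert evicted == ["host1", "host0"]
assert all(b in eviction_list for b in ["roa0", "rla0", "rla1"])
```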
- 13. A data storage device comprising:
an apparatus storing a read data, the read data having a speculative data portion along with a host data portion; and
a printed circuit board assembly with a cache memory and a control processor communicating with the apparatus controlling retrieval of the read data, the cache memory storing the host data along with the speculative data, the control processor programmed with a routine to prioritize removal of the host data as well as the speculative data from the cache memory by steps for prioritizing removal of the read data from the cache memory.
- 14. The data storage device of claim 13, in which the steps for prioritizing removal of the read data from the cache memory comprise the steps of:
storing a read data in a cache memory;
flagging a host data portion of the read data stored in the cache memory;
labeling a speculative data portion of the read data stored in the cache memory;
associating the host data to a first portion of a prioritization list; and
linking the speculative data to a second portion of the prioritization list, to facilitate a persistence of the speculative data stored in the cache memory for a period of time greater than a persistence of the host data in the cache memory.
- 15. The data storage device of claim 14, in which the read data is stored in the cache memory by steps comprising:
receiving a host data read command for retrieval of the host data;
executing a seek command to retrieve the host data from a predetermined data sector;
reading a read on arrival data from a data sector preceding the predetermined data sector;
transducing the host data from the predetermined data sector;
retrieving a read look ahead data from a data sector subsequent to the predetermined data sector;
selecting a cache memory fragment sized to accommodate the read on arrival data along with the host data in addition to the read look ahead data; and
storing the read on arrival data along with the host data in addition to the read look ahead data in the cache memory fragment to form the read data.
- 16. The data storage device of claim 15, in which the cache memory fragment comprises:
a host data sub-fragment storing the host data;
a read on arrival data sub-fragment storing the read on arrival data; and
a read look ahead data sub-fragment storing the read look ahead data.
- 17. The data storage device of claim 16, in which the prioritization list prioritizes removal of the read data stored in the cache memory, and in which the first portion of the prioritization list is a least-recently-used portion having a lowest priority and subject to an earliest removal of the read data from the cache memory, and wherein the host data sub-fragment is assigned to the least-recently-used portion of the prioritization list.
- 18. The data storage device of claim 16, in which the prioritization list prioritizes removal of the read data stored in the cache memory, and in which the second portion of the prioritization list is a most-recently-used portion having a highest priority and subject to a delayed removal of the read data from the cache memory, and wherein the read on arrival data sub-fragment is assigned to the most-recently-used portion of the prioritization list.
- 19. The data storage device of claim 16, in which the prioritization list prioritizes removal of the read data stored in the cache memory, and in which the second portion of the prioritization list is a most-recently-used portion having a highest priority and subject to a delayed removal of the read data from the cache memory, and wherein the read look ahead data sub-fragment is assigned to the most-recently-used portion of the prioritization list.
- 20. The data storage device of claim 16, in which the prioritization list prioritizes removal of the read data stored in the cache memory, and in which the first portion of the prioritization list is a least-recently-used portion having a lowest priority and subject to an earliest removal of the read data from the cache memory, the second portion of the prioritization list is a most-recently-used portion having a highest priority and subject to a delayed removal of the read data from the cache memory, and wherein the host data sub-fragment is assigned to the least-recently-used portion of the prioritization list, the read on arrival data sub-fragment is assigned to the most-recently-used portion of the prioritization list, and the read look ahead data sub-fragment is assigned to the most-recently-used portion of the prioritization list, and further in which the read look ahead data sub-fragment and the read on arrival data sub-fragment persist in the cache memory for a time period greater than a time period the host data persists in the cache memory.
RELATED APPLICATIONS
[0001] This application claims priority to U.S. Provisional Application No. 60/373,940, filed Apr. 19, 2002, entitled "Method and Algorithm for Speculative Read Data Retention Prioritization."
Provisional Applications (1)

| Number | Date | Country |
| --- | --- | --- |
| 60373940 | Apr 2002 | US |