Claims
- 1. A method for updating a cache memory having multiple channels storing digital audio waveform samples for further processing, by updating with data from a main memory over a bus, comprising: determining a cache invalid size for a channel corresponding to a number of cache memory locations no longer required for said further processing; detecting the occurrence of a loop end; when said cache invalid size extends beyond said loop end, prior to calculation of a new current address, requesting data from said main memory across said bus only up to said loop end, with subsequent data requested from a loop start address; and calculating a new current address by subtracting a loop size from said loop end.
- 2. The method of claim 1 wherein said requesting data step uses a burst of data over said bus.
- 3. The method of claim 1 wherein requesting said subsequent data utilizes a separate bus request.
- 4. The method of claim 1 comprising: assigning a priority to a channel in accordance with a first priority scheme when a loop occurs in said channel; and assigning a priority to said channel in accordance with a second priority scheme when no loop occurs in said channel.
- 5. The method of claim 1 comprising: providing a set of request parameters for updating a channel in said cache memory in accordance with a priority scheme; requesting access to said bus; and replacing said set of request parameters in accordance with a change in priority between requesting access to said bus and a grant of access.
- 6. A cache memory system having multiple channels storing digital audio samples, coupled to a bus, comprising: channel address logic configured to determine a cache invalid size for a channel corresponding to a number of cache memory locations no longer required for further processing, and to detect the occurrence of a loop end; request logic configured to generate a request to fetch data corresponding to said cache invalid size only up to said loop end when said cache invalid size extends beyond said loop end, prior to calculation of a new current address, and fetch subsequent data from a loop start address; and logic for calculating a new current address by subtracting a loop size from said loop end.
- 7. The cache memory system of claim 6 comprising: a priority unit, coupled to said channel address logic and said request logic, configured to assign a priority to a channel in accordance with a first priority scheme when a loop occurs in said channel, and assign a priority to said channel in accordance with a second priority scheme when no loop occurs in said channel.
- 8. The cache memory system of claim 7 comprising: a priority queue coupled to said priority unit.
- 9. A cache memory system for updating a cache memory storing digital audio waveform samples for multiple channels, using a shared bus, comprising: request logic configured to provide a set of request parameters for updating a channel in said cache memory in accordance with a priority scheme; and accept logic configured to request access to said shared bus and to replace said set of request parameters in accordance with a change in priority between requesting access to said shared bus and a grant of access.
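The loop-wraparound refill recited in claims 1-3 can be sketched as a minimal model. This is an illustrative reading of the claim language, not the patented implementation: the function name `plan_cache_update` and all parameter names are hypothetical, and the new-current-address arithmetic follows one plausible interpretation in which the address that would extend past the loop end is reduced by the loop size.

```python
def plan_cache_update(current_addr, cache_invalid_size, loop_start, loop_end):
    """Plan main-memory fetches to refill invalidated cache locations for one
    channel, wrapping at the loop end (illustrative sketch of claims 1-3).

    Returns (requests, new_current_addr), where each request is an
    (address, length) pair to be issued over the bus.
    """
    loop_size = loop_end - loop_start
    if current_addr + cache_invalid_size <= loop_end:
        # No wrap: a single burst covers the whole invalid region (claim 2).
        return [(current_addr, cache_invalid_size)], current_addr + cache_invalid_size
    # The invalid region extends beyond the loop end: request data only up
    # to the loop end first, then fetch the remainder starting from the
    # loop start address (claim 3 notes this may be a separate bus request).
    first_len = loop_end - current_addr
    second_len = cache_invalid_size - first_len
    requests = [(current_addr, first_len), (loop_start, second_len)]
    # New current address: the address beyond the loop end, reduced by one
    # loop size, so playback continues from inside the loop region.
    new_current = current_addr + cache_invalid_size - loop_size
    return requests, new_current
```

For example, with a loop spanning addresses 0-100, a channel at address 90 needing 20 samples would issue two requests, one for the 10 samples up to the loop end and one for 10 samples from the loop start, leaving the new current address at 10.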
CROSS-REFERENCE TO RELATED APPLICATIONS
This application is a continuation of and claims the benefit of U.S. application Ser. No. 08/971,238, filed Nov. 15, 1997, now U.S. Pat. No. 6,138,207, the disclosure of which is incorporated herein by reference.
US Referenced Citations (7)
Continuations (1)

| | Number | Date | Country |
| --- | --- | --- | --- |
| Parent | 08/971238 | Nov 1997 | US |
| Child | 09/654969 | | US |