
Marginal Utility of RANDOM

The RANDOM list contains essentially only demand-paged, non-sequential data. For demand-paged data, LRU is a stack algorithm. In other words, increasing (respectively, decreasing) the length of the list leads to a smaller (respectively, larger) number of misses.
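The stack (inclusion) property can be checked empirically: for any access trace, an LRU cache of larger capacity never incurs more misses than a smaller one. The sketch below is illustrative only; the function name `lru_misses` and the trace are our own, not from the paper.

```python
from collections import OrderedDict

def lru_misses(trace, capacity):
    """Count misses for an LRU cache of the given capacity over an access trace."""
    cache = OrderedDict()  # front = least recently used
    misses = 0
    for page in trace:
        if page in cache:
            cache.move_to_end(page)        # hit: refresh recency
        else:
            misses += 1
            cache[page] = None
            if len(cache) > capacity:
                cache.popitem(last=False)  # evict least recently used
    return misses

# A demand trace with no sequential structure; by the stack property,
# misses are non-increasing in cache size.
trace = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8, 9, 7, 9, 3, 2, 3, 8, 4]
assert lru_misses(trace, 4) >= lru_misses(trace, 6)
```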

Let $r$ denote the rate of cache misses in RANDOM. To compute the marginal utility of RANDOM, we examine how $r$ changes, namely, the change $\Delta r$ in response to a small change $\Delta L$ in $L$. By the stack property, as $L$ increases (respectively, decreases), $r$ decreases (respectively, increases). Hence, $\Delta r/\Delta L$ is always negative. The marginal utility is measured by the magnitude of this quantity. Unlike the SEQ list, where an analytical model was necessary, for the RANDOM list the quantity ${\Delta r}/{\Delta L}$ can be estimated directly from real-time cache introspection.

We monitor $\Delta L$ of cache space at the bottom of the RANDOM list (see Figure 2) and observe the rate of hits $\Delta r$ in this region. A bottom hit indicates that a miss has been saved due to the cache space at the bottom of RANDOM. In other words, if the cache space $\Delta L$ at the bottom were not allocated to RANDOM, then the rate of misses $r$ would increase by $\Delta r$. We assume that ${\Delta r}/{\Delta L}$ is constant in a small region (generally much smaller than even $\Delta L$) at the bottom of the list where our locally adaptive algorithm is active. Hence, as a corollary, if a small amount of cache space were added to RANDOM, then the misses would decrease in proportion to ${\Delta r}/{\Delta L}$.
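The bottom-hit counting described above can be sketched as follows. This is our own illustrative code, not the paper's implementation: the class and method names (`RandomList`, `access`, `marginal_utility`) are hypothetical, and the linear scan to find a page's position is a simplification of what a real cache would do with per-page metadata.

```python
from collections import OrderedDict

class RandomList:
    """LRU list that counts hits landing in its bottom delta_L slots,
    giving an estimate of the magnitude of Delta r / Delta L."""

    def __init__(self, capacity, delta_L):
        self.capacity = capacity
        self.delta_L = delta_L           # size of the monitored bottom region
        self.lru = OrderedDict()         # front = least recently used (bottom)
        self.bottom_hits = 0
        self.accesses = 0

    def access(self, page):
        """Access a page; return True on hit, False on miss."""
        self.accesses += 1
        if page in self.lru:
            # Position from the bottom (0 = least recently used).
            pos = list(self.lru).index(page)
            if pos < self.delta_L:
                self.bottom_hits += 1    # a miss saved by the bottom cache space
            self.lru.move_to_end(page)
            return True
        self.lru[page] = None
        if len(self.lru) > self.capacity:
            self.lru.popitem(last=False)
        return False

    def marginal_utility(self):
        # Bottom-hit rate per unit of cache space: an estimate of |Delta r / Delta L|.
        if self.accesses == 0:
            return 0.0
        return (self.bottom_hits / self.accesses) / self.delta_L
```

For example, after filling a 4-slot list with pages 1-4, re-accessing page 1 is a hit at position 0, inside a bottom region of size 2, so it counts as a saved miss.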

However, allocating more cache space to RANDOM takes away an equal amount from SEQ and, hence, needs to be carefully weighed. The optimum is attained when the marginal utilities of the two lists are equalized.


Binny Gill 2005-02-14