
Latency

Figure 8 shows how latency is distributed across all requests for the local-sync, local-sync+FEC, and network-sync solutions. Latency is the time from when a local storage server sends a request until a remote storage server receives it. All three solutions show similar latency when the link exhibits zero loss, but local-sync+FEC and network-sync show considerably better latency than local-sync over a lossy link. Furthermore, the latency spread of the local-sync+FEC and network-sync solutions is considerably smaller than that of the local-sync solution, particularly as loss increases: proactive redundancy reduces latency jitter on lossy links. Smaller variance in this latency distribution helps to ensure that updates submitted as a group arrive at the remote site with minimal temporal skew, so the entire group can be written rather than none of it.
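To make the intuition concrete, the following is a minimal Monte Carlo sketch (not the actual protocol implementation) that contrasts delivery latency when a lost packet must be recovered by a retransmission timeout, roughly analogous to local-sync, against delivery latency when a proactive repair packet follows closely behind the original, roughly analogous to local-sync+FEC. The one-way latency, timeout, and repair spacing values are assumed for illustration only.

    import random
    import statistics

    ONE_WAY_MS = 25.0            # assumed one-way wide-area latency
    RTT_MS = 2 * ONE_WAY_MS
    TIMEOUT_MS = 2 * RTT_MS      # assumed retransmission timeout
    REPAIR_SPACING_MS = 1.0      # assumed gap before the proactive repair packet

    def latency_retransmit(loss):
        """Recovery driven by timeout and retransmission (local-sync-like)."""
        t = 0.0
        while random.random() < loss:
            t += TIMEOUT_MS      # each loss costs a full timeout before resending
        return t + ONE_WAY_MS    # the successful copy finally crosses the link

    def latency_fec(loss):
        """Recovery driven by a proactive repair packet (local-sync+FEC-like)."""
        if random.random() >= loss:
            return ONE_WAY_MS                       # original arrives
        if random.random() >= loss:
            return REPAIR_SPACING_MS + ONE_WAY_MS   # repair arrives shortly after
        # both copies lost: fall back to timeout-driven retransmission
        return TIMEOUT_MS + latency_retransmit(loss)

    def summarize(fn, loss, n=100_000):
        samples = [fn(loss) for _ in range(n)]
        return statistics.mean(samples), statistics.pstdev(samples)

    if __name__ == "__main__":
        for loss in (0.0, 0.01, 0.05, 0.10):
            m_r, s_r = summarize(latency_retransmit, loss)
            m_f, s_f = summarize(latency_fec, loss)
            print(f"loss={loss:.2f}  retransmit: {m_r:6.1f} +/- {s_r:5.1f} ms   "
                  f"fec: {m_f:6.1f} +/- {s_f:5.1f} ms")

Under this simplified model, both the mean and the standard deviation of the retransmission-based latency grow quickly with the loss rate, while the proactive-redundancy latency stays close to the one-way delay, mirroring the reduced jitter observed in Figure 8.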

Figure 9: Effect of varying wide-area one-way link loss on Aggregate Throughput.

Figure 10: Effect of varying wide-area link latency on Aggregate Throughput.


