USENIX Annual Technical Conference (New Orleans, 1998), Abstract

Increasing Effective Link Bandwidth by Suppressing Replicated Data

Jonathan Santos and David Wetherall
Massachusetts Institute of Technology

Abstract

In the Internet today, transfer rates are often limited by the bandwidth of a bottleneck link rather than the computing power available at the ends of the links. To address this problem, we have used inexpensive commodity hardware to design a novel link-layer caching and compression scheme that reduces bandwidth consumption. Our scheme is motivated by the prevalence of repeated transfers of the same information, as may occur due to HTTP, FTP, and DNS traffic. Unlike existing link compression schemes, it is able to detect and use the long-range correlation of repeated transfers. It also complements application-level systems that reduce bandwidth usage, e.g., Web caches, by providing additional protection at a lower level, as well as an alternative in situations where application-level cache deployment is not practical or economical.

We make three contributions in this paper. First, to motivate our scheme we show by packet trace analysis that there is significant replication of data at the packet level, mainly due to Web traffic. Second, we present an innovative link compression protocol well-suited to traffic with such long-range correlation. Third, we demonstrate by experimentation that the availability of inexpensive memory and general-purpose processors in PCs makes our protocol practical and useful at rates exceeding T3 (45 Mbps).
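
As a rough illustration of the scheme described above, the sketch below models payload suppression with a fingerprint cache kept in sync at both ends of a link: the first copy of a payload is sent in full, and later copies are replaced by a short fingerprint token. This is a minimal Python sketch of the general idea only; the class names, token format, and the choice of MD5 as the fingerprint are assumptions for illustration, not the authors' protocol (see the full paper for the actual design).

    import hashlib

    class SuppressingSender:
        """Sender side: replace payloads seen before with a short token."""
        def __init__(self):
            self.cache = set()  # fingerprints of payloads already sent

        def encode(self, payload: bytes):
            fp = hashlib.md5(payload).digest()  # 16-byte fingerprint
            if fp in self.cache:
                return (b'T', fp)       # repeat: suppress payload, send token
            self.cache.add(fp)
            return (b'P', payload)      # first sighting: send payload in full

    class SuppressingReceiver:
        """Receiver side: mirror the sender's cache to expand tokens."""
        def __init__(self):
            self.cache = {}  # fingerprint -> payload

        def decode(self, kind, data):
            if kind == b'T':
                return self.cache[data]  # cache hit: reconstruct payload
            fp = hashlib.md5(data).digest()
            self.cache[fp] = data
            return data

    # A page transferred twice costs only a 16-byte token the second time.
    sender, receiver = SuppressingSender(), SuppressingReceiver()
    page = b'<html>... the same page, transferred twice ...</html>'
    for _ in range(2):
        kind, data = sender.encode(page)
        assert receiver.decode(kind, data) == page

A real link protocol would also bound the cache size and cope with packet loss so that the two caches stay consistent; those details are addressed in the paper itself.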

  • View the full text of this paper in PDF form.
