
Demystifying Page Load Performance with WProf

Authors: 

Xiao Sophia Wang, Aruna Balasubramanian, Arvind Krishnamurthy, and David Wetherall, University of Washington

Abstract: 

Web page load time is a key performance metric that many techniques aim to reduce. Unfortunately, the complexity of modern Web pages makes it difficult to identify performance bottlenecks. We present WProf, a lightweight in-browser profiler that produces a detailed dependency graph of the activities that make up a page load. WProf is based on a model we developed to capture the constraints between network load, page parsing, JavaScript/CSS evaluation, and rendering activity in popular browsers. We combine WProf reports with critical path analysis to study the page load time of 350 Web pages under a variety of settings, including the use of end-host caching, SPDY instead of HTTP, and the mod_pagespeed server extension. We find that computation is a significant factor that makes up as much as 35% of the critical path, and that synchronous JavaScript plays a significant role in page load time by blocking HTML parsing. Caching reduces page load time, but the reduction is not proportional to the number of cached objects, because most object loads are not on the critical path. SPDY reduces page load time only for networks with high RTTs, and mod_pagespeed helps little on an average page.


Open Access Media

USENIX is committed to Open Access to the research presented at our events. Papers and proceedings are freely available to everyone once the event begins. Any video, audio, and/or slides that are posted after the event are also free and open to everyone. Support USENIX and our commitment to Open Access.

Paper: Wang PDF
Slides
Presentation Video
Presentation Audio (MP3 download)

Public Summary: 

by Rebecca Isaacs

The factors that lead to slow Page Load Times (PLTs) in web browsers are difficult to characterize, involving complex dependencies between various network and computational activities such as fetching and parsing of HTTP objects, and JavaScript and CSS evaluation.  This paper presents a profiling tool called WProf that not only identifies the PLT bottleneck for a given web page, but also generates a dependency graph that explains the bottleneck by revealing the importance of different events in the critical path.

WProf runs in WebKit browsers, where it instruments the timings of various page load activities and records the dependencies between them. It relies on a model of potential dependencies to identify which ones manifest during the load of a particular page. Understanding which dependencies arise is complicated by the fact that they range in nature from orderings on parsing, loading, and evaluation, to limits on resource consumption such as the number of TCP connections. Moreover, many dependencies are specific to a particular browser implementation. In this work the authors use controlled experimentation, together with inspection of code and documentation, to identify and classify the dependency policies imposed by four widely used browsers.
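To make the idea of critical path analysis over a dependency graph concrete, here is a minimal sketch in Python. The activity names, durations, and dependency edges are invented for illustration and are far simpler than a real WProf graph; the sketch only shows the underlying computation, which is a longest-path traversal of a DAG whose edges mean "cannot start before the predecessor finishes."

```python
# Toy critical-path analysis over a page-load dependency graph, in the
# spirit of WProf. All activities and millisecond durations below are
# hypothetical examples, not measurements from the paper.

duration = {
    "html_load": 120,   # fetch the main HTML document
    "html_parse": 40,   # parse HTML
    "js_load": 80,      # fetch a synchronous script
    "js_eval": 60,      # evaluate it (blocks further parsing)
    "css_load": 50,     # fetch a stylesheet
    "css_eval": 20,     # build CSS rules
    "render": 30,       # layout and paint
}

# deps[b] lists activities that must finish before b can start.
deps = {
    "html_parse": ["html_load"],
    "js_load": ["html_parse"],
    "js_eval": ["js_load", "css_eval"],  # script evaluation waits on CSS
    "css_load": ["html_parse"],
    "css_eval": ["css_load"],
    "render": ["js_eval", "css_eval"],
}

def critical_path(sink):
    """Return (finish_time, path) for the longest path ending at sink."""
    memo = {}
    def finish(a):
        if a not in memo:
            preds = deps.get(a, [])
            start = max((finish(p)[0] for p in preds), default=0)
            best = max(preds, key=lambda p: finish(p)[0], default=None)
            path = (finish(best)[1] if best else []) + [a]
            memo[a] = (start + duration[a], path)
        return memo[a]
    return finish(sink)

plt, path = critical_path("render")
print(plt, " -> ".join(path))
# In this toy graph, the synchronous script chain dominates: shrinking
# css_load alone would not reduce the page load time, since CSS is off
# the critical path.
```

Shortening any activity not on the returned path leaves the page load time unchanged, which is why the paper's finding that most object loads are off the critical path explains the limited benefit of caching.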

The second half of the paper uses WProf to identify the PLT bottlenecks of 350 popular websites. The study yields some interesting findings, such as that computation contributes a substantially greater fraction of critical-path time than expected. The authors also evaluate the impact of two web optimization technologies, SPDY and mod_pagespeed, providing insights into their effectiveness under different operating conditions.

This paper is timely and relevant—everyone suffers from poor page load performance and this work has direct implications for improvements to browser, web page and protocol design.  Although many of the resulting observations are well known, a systematic and comprehensive approach to dependency analysis and critical path extraction is an important step forward.  The PC hopes that more sophisticated applications will follow, such as automated page transformations, guidance for exploiting multicore hardware, or what-if analyses applied to browser and protocol design.
