JSMeter: Characterizing Real-World Behavior of JavaScript Programs

Web Performance Ballroom AB
Average rating: 4.04 (24 ratings)

JavaScript is widely used in web-based applications and is increasingly popular with developers. The so-called "browser wars" of recent years have focused on JavaScript performance, with vendors claiming comparative results based on benchmark suites such as SunSpider and V8. In this paper we evaluate the behavior of JavaScript web applications from commercial websites and compare this behavior with the benchmarks.

We measure three specific areas of JavaScript runtime behavior: 1) functions and code; 2) heap-allocated objects and data; 3) events and handlers. We find that the benchmarks are not representative of many real websites and that conclusions reached from measuring the benchmarks may be misleading.
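The first of these areas, function and code behavior, can be measured by dynamically instrumenting a program so that every call is counted. The following is a minimal sketch of that idea (not the authors' actual JSMeter tool, which instruments the interpreter itself); the `instrument` wrapper and the `square` example function are hypothetical illustrations.

```javascript
// Sketch: count function invocations by wrapping each function,
// the kind of dynamic measurement used to profile "functions and code".
const callCounts = new Map();

function instrument(name, fn) {
  // Return a wrapper that bumps the per-function counter, then delegates.
  return function (...args) {
    callCounts.set(name, (callCounts.get(name) || 0) + 1);
    return fn.apply(this, args);
  };
}

// Hypothetical application code under measurement:
let square = (x) => x * x;
square = instrument("square", square);

for (let i = 0; i < 5; i++) square(i);

console.log(callCounts.get("square")); // 5
```

A real measurement tool would apply this kind of wrapping automatically to every function (or, as in JSMeter, hook the engine directly), then report hot functions and call distributions rather than a single counter.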

Specific examples of such misleading conclusions include the following: that web applications have many loops, that non-string objects in web applications are extremely short-lived, and that web applications handle few events.

We hope our results will convince the JavaScript community to develop and adopt benchmarks that are more representative of real web applications. Additional information about the project can be found at

http://research.microsoft.com/en-us/projects/jsmeter/


Ben Livshits

Microsoft Research

Ben Livshits is a researcher at Microsoft Research in Redmond, WA. He received a B.A. from Cornell University in 1999, and his M.S. and Ph.D. from Stanford University in 2002 and 2006, respectively. Dr. Livshits' research interests include the application of sophisticated static and dynamic analysis techniques to finding errors in programs. He is known for his work on software reliability and especially tools to improve software security, with a primary focus on approaches to finding buffer overruns in C programs and a variety of security vulnerabilities (cross-site scripting, SQL injection, etc.) in Web-based applications. Lately he has been focused on how Web 2.0 application reliability, performance, and security can be improved through a combination of static and runtime techniques.


Ben Zorn

Microsoft Research

Bio: Ben Zorn is a Principal Researcher at Microsoft Research. After receiving a PhD in Computer Science from UC Berkeley in 1989, he served eight years on the Computer Science faculty at the University of Colorado in Boulder, receiving tenure and being promoted to Associate Professor in 1996. He left the University of Colorado in 1998 to join Microsoft Research, where he currently works. Ben’s research interests include programming language design and implementation and performance measurement and analysis. He has served as an Associate Editor of the ACM journals Transactions on Programming Languages and Systems and Transactions on Architecture and Code Optimization and he currently serves as a Member-at-Large of the SIGPLAN Executive Committee. For more information, visit his web page at http://research.microsoft.com/~zorn/.

Comments on this page are now closed.

Comments

Bjoern Kaiser
06/24/2010 9:37am PDT

Great talk with lots of information, but few takeaways besides "don't trust benchmarks you didn't manipulate yourself" ;-)

Ben Zorn
06/16/2010 9:43am PDT

Developers have a love/hate relationship with benchmarks. While we all know that they are unrepresentative, we still use them because we want to be able to compare apples with apples. In this talk, we discuss exactly what the SunSpider and V8 JavaScript benchmarks are doing and compare that behavior with the behavior of real web applications. The results confirm our worst fears about how unrepresentative benchmarks are and point us in new directions to improve performance in cases that matter.

Steve Souders
06/07/2010 8:28am PDT

Everyone loves benchmarks. The browser vendors are focused on improving JavaScript performance. And yet, most people agree that a good benchmark for real-world JavaScript performance has eluded us. Ben and Ben will share their findings from digging deep into the existing benchmarks and into some of the Web's most popular sites.

For Velocity China sponsorship information for companies outside China, contact Yvonne Romaine at yromaine@oreilly.com.

  • Google
  • Strangeloop
  • Yahoo! Inc.
  • Dyn Inc.
  • Facebook
  • Schooner Information Technology
  • Tilera
  • AlertSite
  • AppDynamics
  • Aptimize
  • CDNetworks
  • Circonus
  • Cloudscaling
  • Clustrix
  • Coradiant
  • Dell
  • DTO Solutions
  • MaxiScale
  • Neustar
  • Nokia
  • NorthScale, Inc.
  • Shopzilla
  • Splunk
  • Virident
  • Zoompf

For information on exhibition and sponsorship opportunities at the conference, contact Yvonne Romaine at yromaine@oreilly.com

Download the Velocity Sponsor/Exhibitor Prospectus

Download the Media & Promotional Partner Brochure (PDF) for information on trade opportunities with O'Reilly conferences, or contact mediapartners@oreilly.com

For media-related inquiries, contact Maureen Jennings at maureen@oreilly.com

To stay abreast of conference news and to receive email notification when registration opens, please sign up for the Velocity Conference bulletin (login required)

View a complete list of Velocity contacts