Overall Performance


 
Table 1: Execution time, in seconds, of the benchmarks using each JVM. The ratios on the right compare the JVMs pairwise, showing the slower JVM's execution time relative to the faster one's.
    Benchmark        JDK    Jupiter    Kaffe   Jupiter/JDK   Kaffe/Jupiter
  1 209_db           178s     282s      836s     1.59:1         2.96:1
  2 228_jack         112s     213s      567s     1.91:1         2.66:1
  3 201_compress     333s     700s     2314s     2.10:1         3.31:1
  4 222_mpegaudio    276s     649s     1561s     2.35:1         2.40:1
  5 213_javac        114s     313s      733s     2.74:1         2.35:1
  6 202_jess          93s     257s      608s     2.76:1         2.36:1
    Geometric Mean                               2.20:1         2.65:1
 

To test Jupiter's functionality and performance, we used it to run the single-threaded applications from the SPECjvm98 benchmark suite [18]. In this section, we present the execution times of these benchmarks on Jupiter and compare them with results from Kaffe 1.0.6 and from the Sun Microsystems JDK v1.2.2-L. We find that Jupiter is faster than Kaffe and slower than JDK.

Table 1 compares the execution times of each benchmark run on the three JVMs. All times were measured on a 533 MHz Pentium III with 512 MB of RAM running Linux, kernel version 2.2.19. Jupiter was compiled with gcc version 2.95.2 at optimization level -O3, with all source code combined into a single compilation unit to facilitate function inlining [16]. The times were reported by the UNIX "time" program, and therefore include all JVM initialization. All benchmarks were run with verification and JIT compilation disabled, since Jupiter does not yet support either feature. Averaged across all benchmarks (using the geometric mean), Jupiter was 2.20 times slower than JDK and 2.65 times faster than Kaffe.
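
As a sanity check, the reported means follow directly from the per-benchmark ratios in Table 1; the Jupiter/JDK figure, for instance, is the geometric mean of the six entries in the Jupiter/JDK column,

\mathrm{GM}_{\mathrm{Jupiter/JDK}} = \left(1.59 \times 1.91 \times 2.10 \times 2.35 \times 2.74 \times 2.76\right)^{1/6} \approx 2.20,

and the 2.65:1 Kaffe/Jupiter figure is obtained in the same way from the last column.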

