
Evaluation

This section illustrates both the current capabilities and the potential of our automated, script-driven, application-scenario-based performance evaluation methods and tools. The examples show how the tests used in current benchmarks are supported by the framework, and how they can serve as components of more sophisticated scenario-based performance patterns. We present results for two test types under two patterns to illustrate our methods. Section 4.1 presents results under the simple client-server pattern for the cubit and throughput test types, and Section 4.2 presents results under the proxy pattern for the same behaviors and test types. Section 4.3 demonstrates the use of the DSKI to reveal the components of system support overhead for the client-server pattern using a simple request-response behavior. We also demonstrate the portability of our method by presenting results for both Linux and Solaris. Table 1 describes the Linux testing environment for the cubit and throughput behavior tests, and Table 2 the corresponding Solaris environment. Note that the sending machine is slightly slower than the receiving machine: we originally used identical machines, but a machine failure forced us to substitute a different receiving machine for the tests presented here.


 

 
Table 1: Operating Environment Used for the Tests on the Linux Platform

  Parameter            Description
  -------------------  ------------------------------------------------
  Name of ORB          omniORB2, TAO
  Language Mapping     C++
  Operating System     Red Hat Linux 5.1, kernel 2.1.126
  CPU info             Pentium Pro 200 MHz, 128 MB RAM
  Compiler info        egcs-2.90.27 (egcs-1.0.2 release);
                       no optimizations
  Thread package       LinuxThreads 0.7
  Type of invocation   static
  Measurement method   getrusage
  Network info         ATM



 

 
Table 2: Operating Environment Used for the Tests on the Solaris Platform

  Parameter            Description
  -------------------  ------------------------------------------------
  Name of ORB          omniORB2, TAO, & CORBAplus
  Language Mapping     C++
  Operating System     Solaris 2.6
  CPU info             UltraSPARC-II 296 MHz (sending machine)
                       UltraSPARC-IIi 350 MHz (receiving machine)
                       128 MB RAM
  Compiler info        Sun C++ 4.2; no optimizations enabled
  Type of invocation   static
  Measurement method   getrusage
  Network info         ATM
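
Both environments use getrusage to measure per-process CPU consumption. As a point of reference, the sketch below shows the basic measurement idiom: bracketing a batch of invocations with getrusage calls and averaging the user and system time over the batch. This is a minimal illustration under stated assumptions, not the framework's actual test harness; the invoke() placeholder and the iteration count are hypothetical.

  #include <sys/resource.h>
  #include <cstdio>

  // Convert a timeval into microseconds for easy arithmetic.
  static long to_usec(const struct timeval &tv) {
      return tv.tv_sec * 1000000L + tv.tv_usec;
  }

  int main() {
      const int iterations = 1000;   // placeholder batch size
      struct rusage before, after;

      getrusage(RUSAGE_SELF, &before);
      for (int i = 0; i < iterations; ++i) {
          // invoke();  // placeholder for one request, e.g. a cubit call
      }
      getrusage(RUSAGE_SELF, &after);

      // Average user and system CPU time per invocation.
      long user = to_usec(after.ru_utime) - to_usec(before.ru_utime);
      long sys  = to_usec(after.ru_stime) - to_usec(before.ru_stime);
      printf("per-call CPU: %.2f us user, %.2f us system\n",
             (double)user / iterations, (double)sys / iterations);
      return 0;
  }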


Significant further development of our approach is desirable and is proceeding, but the current capabilities of the tools generally match, and in some respects modestly exceed, current practice. It is important to note that the framework is explicitly designed for user extension, precisely because no single developer or authority can know every significant aspect of ORB evaluation. Accumulating the CORBA community's collective wisdom about ORB evaluation would significantly advance the state of the art, and the script-based automated approach described here is designed to support such a collective effort.



 