A SOAP Performance Comparison of different WSRF Implementations
Roland Kübert, Axel Tenschert, Hai-Lang Thai
{kuebert, tenschert}@hlrs.de
High Performance Computing Center Stuttgart (HLRS), University of Stuttgart
SOAP Comparison to WSRF 28.11.2009
Introduction
• SOAP is the protocol used most often in web services communications
• WSRF uses SOAP as a communications protocol
• Today: performance analyses for SOAP toolkits have been performed for various cases and toolkits
• But: WSRF implementations have generally not been taken into account
Introduction
• The SOAP performance of three WSRF implementations is compared:
  – UNICORE 6 WSRFLite 1.8.6
  – Globus Toolkit 4 Java WS-Core 4.2.1
  – Apache Muse v2.2.0
• Benchmark results can indicate which implementation is favorable if performance is a key requirement
Related Work
• Investigation of the applicability of SOAP in real-time trading systems [5]
• Analysis of the feasibility of SOAP for scientific computing [1]
• Test of specific SOAP toolkits against Axis 1, gSoap, bSoap and XSUL in a generic SOAP benchmark suite [2]
• Investigation of WSRF-specific operations for Globus Toolkit v3.9.2, but without deeper conclusions [7]
• Analysis of the suitability of SOAP for wireless devices [4]
Related Work
• This work is based on the benchmark suite developed by Head et al. [2]
• This suite was selected because it was developed with the aim of providing a standard benchmark suite for quantifying, comparing and contrasting the performance of SOAP implementations
• It covers a wide range of use cases
Methodology: Software
• For each middleware, one service is developed
• Each service exposes 3 different types of operations:
  – Echo: received values are sent back
  – Receive: the number of received values is sent back
  – Send: for a received number, that many values are sent back
• All operations are implemented for primitive data types:
  – byte, double, int and string
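The echo/receive/send pattern above can be sketched in plain Java for the int case. This is an illustrative stand-in, not the actual benchmark code; the class and method names are assumptions, and in the real benchmark these operations are exposed through a WSRF service rather than called locally.

```java
// Hypothetical sketch of the three operation types for int, mirroring the
// echo/receive/send pattern described above.
public class BenchmarkService {
    // Echo: received values are sent back unchanged.
    public int[] echoInt(int[] values) {
        return values;
    }

    // Receive: only the number of received values is sent back.
    public int receiveInt(int[] values) {
        return values.length;
    }

    // Send: for a received number n, that many values are sent back.
    public int[] sendInt(int n) {
        int[] result = new int[n];
        for (int i = 0; i < n; i++) {
            result[i] = i; // dummy payload; only the size matters for the benchmark
        }
        return result;
    }
}
```

The same three operations are repeated per primitive type (byte, double, int, string), so each service exposes one such trio per type.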
Methodology: Software
• Additionally, two complex data types are used:
  – MeshInterfaceObject: consists of two integers that represent coordinates and a double that represents a field value at the given position
  – SimpleEvent: an object representing an event, composed of a sequence number (int), a time stamp (double) and a message (String)
• The operation echoVoid (void input and output) is implemented to test the latency of the SOAP stack
Methodology: Hardware
• Server: Dell Latitude D620 with Intel® Core2 Duo™ CPU T7400 2.17 GHz and 2 GB of memory
• Services benchmarked on:
  – Ubuntu Linux v9.04 (Kernel 2.6.28-11-generic)
  – Windows Vista Enterprise 32-bit Service Pack 1
• Client: Dell Optiplex 320 with Intel® Pentium® D CPU 3.00 GHz and 2 GB of memory
  – Windows Vista Enterprise 32-bit Service Pack 1
Results: Latency
• Performed by calling the void operation:
  – the operation needs no processing except that inherent in every SOAP message processing
  → a good indicator of the overhead imposed by the different toolkits

             Windows   Linux
  GT4        6 ms      42 ms
  Muse       4 ms      6 ms
  WSRFLite   2 ms      3 ms
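A per-call latency figure like the ones in the table is typically obtained by averaging many calls to the void operation. The sketch below is illustrative only: the local no-op stands in for the remote echoVoid, whereas in the benchmark the call crosses the full SOAP stack of the respective toolkit.

```java
// Illustrative latency probe: average the cost of many calls to a void
// operation. Here the call is a local stand-in for the remote echoVoid.
public class LatencyProbe {
    static void echoVoid() {
        // stand-in for the remote void operation (no payload, no processing)
    }

    // Mean milliseconds per call over the given number of iterations.
    public static double averageLatencyMillis(int iterations) {
        long start = System.nanoTime();
        for (int i = 0; i < iterations; i++) {
            echoVoid();
        }
        long elapsed = System.nanoTime() - start;
        return (elapsed / 1e6) / iterations;
    }
}
```

Averaging over many iterations smooths out scheduler noise and JIT warm-up effects, which matters for Java-based stacks like the three compared here.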
Results: Latency
• Ranking by measured overhead:
  1. WSRFLite
  2. Muse
  3. GT4
• General trend: all toolkits run faster under Windows
• A recent test of different VMs on Ubuntu Linux and Windows Vista showed opposite results [6]
• The trend that performance is slower under Linux holds for the other tests as well
Results: Serialization
• Serialization performance was tested with the send* operations:
  – an integer specifying the size of the array to be created by the service is sent over the wire
  – this is the only input parameter; an array of the corresponding size is then returned
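The client side of such a send* measurement can be sketched as a loop over array sizes, timing each request. This is a local sketch under assumed names; in the benchmark the stub call goes to the remote WSRF service, so the timed span covers the service's serialization of the reply.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative client loop for the send* measurement: for each array size,
// request an array of that size and record the elapsed time.
public class SerializationBenchmark {
    // Local stand-in for the remote sendDouble operation.
    static double[] sendDouble(int n) {
        return new double[n];
    }

    public static Map<Integer, Long> run(int[] sizes) {
        Map<Integer, Long> timings = new LinkedHashMap<>();
        for (int size : sizes) {
            long start = System.nanoTime();
            double[] result = sendDouble(size); // array of the requested size comes back
            long elapsed = System.nanoTime() - start;
            if (result.length != size) {
                throw new IllegalStateException("unexpected reply size");
            }
            timings.put(size, elapsed);
        }
        return timings;
    }
}
```

Because the request carries only a single integer, the measured time is dominated by the service-side serialization of the returned array.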
Results: Serialization
• Muse performs worst on both platforms
• GT4 has the best results when dealing with complex objects
• Otherwise, WSRFLite shows the best performance
• Ranking:
  1. WSRFLite (without complex objects)
  2. GT4
  3. Muse
Results: Deserialization
• Deserialization performance was tested with the receive* operations:
  – an array of objects was sent to the service
  – the size of the array was returned as an integer
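The receive pattern can be sketched for the base64 case discussed later: binary data is base64-encoded on the wire, and the service replies only with the element count, so the measured cost is dominated by deserialization of the request. Names here are illustrative, not the benchmark's actual code.

```java
import java.util.Base64;

// Illustrative service-side receive operation for base64 payloads:
// decode the incoming data and report only how many bytes arrived.
public class DeserializationProbe {
    public static int receiveBase64(String encoded) {
        byte[] data = Base64.getDecoder().decode(encoded);
        return data.length; // only the count travels back, keeping the reply tiny
    }
}
```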
Results: Deserialization
• Performance ranking under Windows:
  1. WSRFLite
  2. GT4
  3. Muse
• Performance difference for receiveBase64:
  1. WSRFLite
  2. Muse and GT4
• No differences under Linux
Results: End-to-End
• End-to-end performance was tested with the echo* operations:
  – each service returns the given input array
  – this incorporates both complex deserialization (when receiving) and serialization (when sending)
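The echo* contract above can be stated as a simple round-trip check: the reply must be element-wise identical to the request, so a single call exercises deserialization of the request and serialization of the same data in the reply. The stand-in below is local and illustrative.

```java
import java.util.Arrays;

// Illustrative end-to-end check for the echo* contract.
public class EchoCheck {
    // Local stand-in for the remote echo operation; a real service
    // deserializes the request and serializes a fresh, identical reply.
    static double[] echoDouble(double[] values) {
        return values.clone();
    }

    // The round trip is intact if the reply equals the payload element-wise.
    public static boolean roundTripIntact(double[] payload) {
        return Arrays.equals(payload, echoDouble(payload));
    }
}
```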
Results: End-to-End
• Ranking under Windows and Linux:
  1. WSRFLite
  2. GT4
  3. Muse
• echoBase64 operation:
  – WSRFLite still performs much better
  – GT4 and Muse show nearly the same results
Conclusions
• Future work:
  – investigate the performance of the three toolkits when making use of the capabilities of WSRF and other implemented specifications
  – investigate methods such as creating and destroying resources, getting and setting properties, or sending and receiving notifications
Thank You!
References
[1] K. Chiu, M. Govindaraju, and R. Bramley. Investigating the limits of SOAP performance for scientific computing. In HPDC ’02: Proceedings of the 11th IEEE International Symposium on High Performance Distributed Computing, page 246, Washington, DC, USA, 2002. IEEE Computer Society.
[2] M. R. Head, M. Govindaraju, A. Slominski, P. Liu, N. Abu-Ghazaleh, R. van Engelen, K. Chiu, and M. J. Lewis. A benchmark suite for SOAP-based communication in grid web services. In SC ’05: Proceedings of the 2005 ACM/IEEE Conference on Supercomputing, page 19, Washington, DC, USA, 2005. IEEE Computer Society.
[3] F. Ilinca, J.-F. Hetu, M. Audet, and R. Bramley. Simulation of 3-D mold-filling and solidification processes on distributed memory parallel architectures.
References
[4] J. Kangasharju, S. Tarkoma, and K. Raatikainen. Comparing SOAP performance for various encodings, protocols, and connections. In Personal Wireless Communications, volume 2775 of Lecture Notes in Computer Science, pages 397–406. Springer-Verlag, 2003.
[5] C. Kohlhoff and R. Steele. Evaluating SOAP for high performance business applications: Real-time trading systems, 2003.
[6] M. Larabel. Java performance: Ubuntu Linux vs. Windows Vista. http://www.phoronix.com/scan.php?page=article&item=java_vm_performance&num=1.
[7] M. Li, M. Qi, M. Rozati, and B. Yu. A WSRF based shopping cart system. In P. M. A. Sloot, A. G. Hoekstra, T. Priol, A. Reinefeld, and M. Bubak, editors, EGC, volume 3470 of Lecture Notes in Computer Science, pages 993–1001. Springer, 2005.