Monday, 12 January 2009

WSTest, some numbers - Updated

In my last post I talked about my Java implementation of WSTest, which can compete performance-wise with the Microsoft implementation. My initial tests were performed using virtual machines and Windows 2003, but since then I've managed to get my hands on an install CD of Windows 2008 and a gigabit switch, so I could finally run tests with some resemblance of a valid setup.

 

I still don't have server-class hardware to play with, but at least I can run the tests on more than one machine. The web service host server is a Thinkpad T61p (Core2 Duo T7700 @ 2.40GHz, 4GB RAM); the client machines are a MacBook (Core2 Duo @ 2.16GHz, 1GB RAM) and an HP Compaq 2510p (Core2 Duo U7600 @ 1.20GHz, 2GB RAM).

I used ApacheBench as the load-generating tool; here's an example invocation for the GetOrder test:

ab -c30 -n500000 -k -p post_files/getorder20.xml \
   -H 'SOAPAction: "uri:WSTestWeb-TestService/GetOrder"' \
   -T "text/xml;charset=UTF-8" \
   http://172.17.1.100:8050/WSTest
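
(In the command above: -c30 keeps 30 requests in flight concurrently, -n500000 issues 500,000 requests in total, -k enables HTTP keep-alive, -p POSTs the given file as the request body, and -H/-T set the SOAPAction header and the content type.)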

The WCF implementation tested was "WSTestSelfHost" (the numbers for "WSTest_IISHosted" are lower) running on Windows 2008 Std with all the latest updates and .NET 3.5 SP1. The Java implementation runs on Ubuntu 8.10 with the generic kernel, using the sun-java-jdk-1.6.10 JVM.

 

WSTest (results in tps, higher is better)

Test Name        Windows/.NET/WCF   Windows/Mina   Linux/Mina   Linux/Grizzly
GetOrder-20            5095              7660           8569           9962
GetOrder-100           2773              3907           4214           4923
EchoStruct-20          2808              8412           9939          11343
EchoStruct-100         1582              3913           4170           4742
EchoList-20            2520              7844           9200          10112
EchoList-100           1346              3766           4124           4531

Some notes:

  • The EchoSynthetic values aren't present because it's not clear to me what the "20" and the "100" are supposed to mean in this test.
  • My Java implementation had a huge drop in throughput in the GetOrder test going from 20 to 100 items; I will have to investigate the reason for this pathological behaviour. This has been fixed.
  • In some tests the Java results are more than 3 times as high!

 

Update: I've updated the values to include test runs with the Mina-based HTTP engine, both on Windows and on Linux. Results for Grizzly on Windows are not included because Grizzly aborts/resets connections way too frequently when running on Windows 2008.

published by luisneves at 00:45
Wednesday, 31 December 2008

A fast implementation of WSTest in Java

I've recently come across Microsoft's updated versions of the WSTest Web Services Benchmark and the .NET StockTrader Sample Application. They wasted no time bragging about the results :-)

Microsoft encourages people to download the benchmark kit and perform their own tests, so I did just that. I will ignore the StockTrader app for now because it is more complex to install and analyze, and focus on the WSTest benchmark. The .NET/WCF results are very good and the people at the benchmark lab seem to really know their stuff. It's a pity that the benchmark chose to compare .NET/WCF against WebSphere, probably the most expensive, slowest and most cumbersome of all Java application servers.

In Java-land there are faster options to choose from, so I decided to implement my own version of the benchmark to see just how fast (or how slow) a Java implementation can be.

The test is essentially an XML serialization/deserialization benchmark, so I picked the speedy JiBX as the Java/XML data-binding framework. JiBX is only as fast as the underlying XML parser, and the fastest StAX parser I know of is the Aalto XML Processor. We also need an HTTP layer, and for that I really like the Mina HTTP codec.
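
To make the data-binding part concrete, here is a minimal sketch of how the JiBX runtime API is typically driven. It assumes a hypothetical Order class (standing in for the WSTest order type) that already has a JiBX binding compiled into it; JiBX's binding compiler rewrites the class bytecode at build time. The class and method names are illustrative, not the actual code linked above, and the wiring of Aalto as the underlying parser is left out.

import java.io.ByteArrayOutputStream;
import java.io.InputStream;

import org.jibx.runtime.BindingDirectory;
import org.jibx.runtime.IBindingFactory;
import org.jibx.runtime.IMarshallingContext;
import org.jibx.runtime.IUnmarshallingContext;
import org.jibx.runtime.JiBXException;

// Illustrative helper: turns request bodies into Order objects and back.
public class OrderCodec {

    // The binding factory is thread-safe and relatively expensive to create,
    // so a single shared instance is kept; contexts are created per request.
    private static final IBindingFactory FACTORY;

    static {
        try {
            // Assumes Order has a JiBX binding compiled in at build time.
            FACTORY = BindingDirectory.getFactory(Order.class);
        } catch (JiBXException e) {
            throw new ExceptionInInitializerError(e);
        }
    }

    // Deserialize an incoming XML request body into an Order.
    public static Order read(InputStream body) throws JiBXException {
        IUnmarshallingContext uctx = FACTORY.createUnmarshallingContext();
        return (Order) uctx.unmarshalDocument(body, "UTF-8");
    }

    // Serialize an Order into the bytes of an XML response body.
    public static byte[] write(Order order) throws JiBXException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        IMarshallingContext mctx = FACTORY.createMarshallingContext();
        mctx.marshalDocument(order, "UTF-8", null, out);
        return out.toByteArray();
    }
}

The per-request cost then comes down to how fast the parser underneath can tokenize the XML, which is exactly why Aalto matters here.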

With all the ingredients in place it didn't take long to produce a benchmark implementation that doesn't suck :-). The code is available here.

And what about the results? Unfortunately I don't have a server-class machine lying around for running proper tests. However, I do have VirtualBox and two virtual machines, one with "Windows 2003 Server" that runs the "Self-Hosted" WSTest application and another with "Ubuntu 8.10 Server" that runs the Java implementation using the sun-java-jdk-1.6.10 JVM. Using soapUI as a load generator, the Linux/Java setup runs circles around Windows/.NET/WCF; in some cases the throughput numbers are more than twice as high. Of course, these results should be taken with a truckload of salt. The tests should have been performed on a proper server machine, using Windows 2008 Server for the .NET setup and with several machines running the load generators. I would love to hear from someone who has a "benchmark lab".

Update: The HTTP bits are now handled by Grizzly; performance seems to be better.

Update: Check the follow-up post for a more detailed performance test.

published by luisneves at 01:01
