WSDL 2.0 Interop Dashboard

Generated June 27, 2007, 14:52

This page summarizes the results of interop testing of the WSDL 2.0 Core and WSDL 2.0 Adjuncts specifications, in fulfillment of the CR exit criteria. The completeness of the Test Suite is also measured.

The results of interop testing prior to the namespace change can be found here.

The interop testing is based on three implementations: Apache Woden (a WSDL 2.0 parser), WSO2 WSAS (a web services engine that uses Woden and is built on Apache Axis2), and a WSDL 2.0 parser and web services engine from Canon.

Interop results

Component Model Test Results


These results verify the parsing of WSDL 2.0 documents: each implementation generates an XML interchange format representing the component model it produced, which is then compared against a baseline description of the component model.
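The comparison step can be sketched as follows. This is a hypothetical illustration, not the actual test harness: the element names and the canonicalization rules are invented, but the idea is the same — reduce both interchange documents to an order-insensitive canonical form and compare.

```python
# Hypothetical sketch of the component-model comparison step; the
# element/attribute names below are invented examples, not the real
# interchange format.
import xml.etree.ElementTree as ET

def canonical(elem):
    """Reduce an element tree to a nested tuple with sorted attributes
    and sorted children, so logically equal models compare equal."""
    children = sorted(canonical(c) for c in elem)
    return (elem.tag,
            tuple(sorted(elem.attrib.items())),
            (elem.text or "").strip(),
            tuple(children))

def models_match(actual_xml, baseline_xml):
    """True when the two serialized component models are equivalent."""
    return canonical(ET.fromstring(actual_xml)) == canonical(ET.fromstring(baseline_xml))

baseline = '<interface name="Hotel"><operation name="book"/><operation name="cancel"/></interface>'
actual   = '<interface name="Hotel"><operation name="cancel"/><operation name="book"/></interface>'
print(models_match(actual, baseline))  # True: child order is ignored
```

A real harness would also need to handle namespaces and property defaulting, which this sketch omits.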

81 (79%) passed | 101 (99%) passed | 81 (79%) passed

Message Exchange Test Results


These results represent the analysis of log files of message exchanges based on complex WSDL 2.0 documents, with each implementation run pair-wise against itself and the others.

Test case | Canon >> Canon | Canon >> WSO2 WSAS | WSO2 WSAS >> Canon | WSO2 WSAS >> WSO2 WSAS
MessageTest-1G (SOAP12) | 28 (100%) passed | 28 (100%) passed | 28 (100%) passed | 28 (100%) passed
MessageTest-4G (SOAP12) | 4 (67%) passed, 2 (33%) failed | 4 (67%) passed, 2 (33%) failed | 5 (83%) passed, 1 (17%) failed | 6 (100%) passed
MessageTest-5G (SOAP12) | 12 (100%) passed | 12 (100%) passed | 12 (100%) passed | 12 (100%) passed
MessageTest-6G (SOAP12) | 14 (100%) passed | 14 (100%) passed | 13 (93%) passed, 1 (7%) failed | 14 (100%) passed
ModuleComposition-1G (SOAP12) | 6 (100%) passed | 6 (100%) passed | 6 (100%) passed | 6 (100%) passed
LocationTemplate-1G (SOAP12) | 13 (93%) passed, 1 (7%) failed | 13 (93%) passed, 1 (7%) failed | 14 (100%) passed | 14 (100%) passed
LocationTemplate-2G (SOAP12) | 5 (83%) passed, 1 (17%) failed | (no data) | 5 (83%) passed, 1 (17%) failed | 6 (100%) passed
MessageTest-2G (HTTP) | 16 (100%) passed | 16 (100%) passed | 16 (100%) passed | 16 (100%) passed
MessageMultipart-1G (HTTP) | (no data) | (no data) | (no data) | 8 (100%) passed
MessageTest-3G (HTTP) | 33 (83%) passed, 7 (18%) failed | 39 (98%) passed, 1 (3%) failed | 39 (98%) passed, 1 (3%) failed | 40 (100%) passed

Validation Test Results


These results represent the analysis of validation log files generated by validating implementations consuming both good and bad documents in the test suite. Green and yellow both mean that the testcase was correctly validated; the difference is that yellow indicates the implementation did not correctly identify the specific assertion a "bad" testcase was designed to fail on.
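The green/yellow/red scoring rule described above can be sketched as a small classifier. The record shape and the assertion IDs are invented for illustration; the rule itself follows the paragraph above.

```python
# Hypothetical sketch of the green/yellow/red scoring rule; the
# parameters and assertion IDs are invented for illustration.
def score(doc_is_bad, reported_failure, expected_assertion, reported_assertion):
    """Return 'green', 'yellow', or 'red' for one validation-log entry."""
    if not doc_is_bad:
        # a good document must validate cleanly
        return "green" if not reported_failure else "red"
    if not reported_failure:
        return "red"       # bad document not caught at all
    if reported_assertion == expected_assertion:
        return "green"     # caught, and the right assertion cited
    return "yellow"        # caught, but the wrong assertion cited

print(score(True, True, "Interface-1028", "Interface-1028"))  # green
print(score(True, True, "Interface-1028", "MEP-2200"))        # yellow
```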

82 (34%) passed | 57 (24%) passed | 82 (34%) failed (21)
65 (27%) passed | 82 (34%) passed | 73 (30%) failed (22)

Test Suite Completeness

Test Case Coverage


An analysis of the use of various parts of the WSDL syntax (elements, attributes, URIs) in the test collection. Green indicates 4 or more occurrences of the construct, yellow indicates fewer than 4 occurrences, and red indicates zero occurrences.
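The colouring rule just stated is simple enough to express directly. The construct names and counts below are invented examples; the thresholds (4 or more green, 1-3 yellow, 0 red) come from the paragraph above.

```python
# Sketch of the coverage colouring rule: >= 4 occurrences is green,
# 1-3 is yellow, 0 is red. Construct names/counts are invented examples.
def coverage_colour(occurrences):
    if occurrences >= 4:
        return "green"
    return "yellow" if occurrences > 0 else "red"

example_counts = {"wsdl:interface": 120, "wsdl:import": 3, "wsdl:include": 0}
for construct, n in example_counts.items():
    print(construct, coverage_colour(n))
```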

75 (73%) have good coverage | 28 (27%) have minimal coverage

Assertion Coverage


Count of assertions violated by "bad" testcases, compared with the total number of assertions identified in the specs.

94 (44%) have coverage | 119 (56%) have no coverage