Monday, October 19, 2009

SOAPSonar - EMAP Certified HP QC Native Integration

According to Gartner's Hype Cycle for Application Development, 2009, SOA Testing has almost traversed the "Peak of Inflated Expectations" and is on a glide-slope towards the "Slope of Enlightenment." There is, of course, a "Trough of Disillusionment" in the middle, where most enterprises currently find themselves. However, mature enterprise SOA deployments have realized that they must look at SOA Testing more comprehensively and deploy specialized SOA Testing tools that are natively integrated with their existing QA/Testing infrastructure. The accelerated adoption and deployment of Web services has reached an inflection point, and large development and testing teams are demanding more efficient, centralized management.

HP Quality Center is the dominant QA infrastructure used by enterprises to support all essential aspects of centralized test management throughout the application life cycle. It provides a consistent, repeatable process for gathering requirements, planning and scheduling tests, analyzing results, and managing defects and issues. HP Quality Center software enables organizations to digitize specific quality processes and procedures within the larger application life cycle, and supports high levels of communication and collaboration among IT teams.

Crosscheck Networks SOAPSonar is a leading SOA Testing product that provides simple, intuitive and comprehensive testing for SOAP-, XML- and REST-based services. The SOAPSonar testing framework is easy to deploy and provides testing modes for functional, performance, compliance and security testing. Addressing enterprise customers' needs for tighter, native integration, Crosscheck Networks is the only SOA Testing Tool provider that provides an HP-EMAP certified integration between SOAPSonar and HP Quality Center.

The SOAPSonar-HP Quality Center integration enables rich collaboration of test resources and identified defects between large services teams - typically comprising five or more developers.

The video below shows how the native integration between SOAPSonar and HP Quality Center provides:

  • Consolidated visibility into service quality and potential business risk
  • Centralized sharing, storage and management of defects, test rules and results, and configuration settings
  • Real-time review and cross-service trend analysis.

Wednesday, June 24, 2009

Limits of Open Source SOA Testing Tools

In this article, we will discuss the limits of adopting an Open Source SOA testing tool for SOA and Web Services projects. Open Source has become an essential and popular resource for many tools and platforms used in SOA deployments. From operating systems such as Linux, to databases such as MySQL, and browsers such as Firefox, Open Source has a proven track record for cost-effective applications and tools.

SOA testing involves the ability to test SOAP-, XML-, and REST-based messaging against a service endpoint in order to assess the robustness, reliability, and resilience of the service. Comprehensive testing of a service endpoint involves four primary focus areas: Functional, Performance, Interoperability, and Security.
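At its core, testing a service endpoint starts with constructing a well-formed message. As a minimal sketch (the `GetQuote` operation, `symbol` parameter, and target namespace below are hypothetical, not from any real service):

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_soap_request(operation, params, target_ns="http://example.com/stock"):
    """Build a minimal SOAP 1.1 envelope for a hypothetical operation."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{target_ns}}}{operation}")
    for name, value in params.items():
        child = ET.SubElement(op, f"{{{target_ns}}}{name}")
        child.text = str(value)
    return ET.tostring(envelope, encoding="unicode")

# One functional test message, ready to POST to the endpoint under test
request = build_soap_request("GetQuote", {"symbol": "HPQ"})
```

A real harness would POST this over HTTP with the appropriate SOAPAction header and then assert on the response, which is where the four focus areas diverge.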

Functional testing verifies the proper behavior of services and builds regression test suites that automate testing and baseline the expected behavior of services, so that functionality can be quickly assessed and validated through the lifecycle of service revisions.

Performance testing provides a concurrent loading-agent framework that determines throughput and capacity statistics of the back-end service across a range of input and client-load variances, both to validate Service Level Agreement rates and to identify bottlenecks, potential architectural weaknesses, and performance dependencies.

Interoperability testing maximizes interoperability by measuring both the design characteristics of a service and its runtime adherence to standards and best practices. Isolating potential interoperability issues early in the lifecycle can significantly reduce integration effort when exposing the service to trading partners and clients, which may be built on a varying array of disparate web services technologies and platforms.

Security testing assesses the risk posture and robustness of a service with regard to vulnerability, data leakage, data privacy, and data integrity. Each web service is unique, based on the schema that defines the input and response message structure of how to communicate with the service. Using the WSDL schema as the source, security tests can be built that create boundary-condition tests for the service, which then identify how robustly the service handles inputs outside the range of expectation. Further, the various security and identity specifications set forth by the W3C and OASIS provide a framework to test the level of data integrity, data privacy, and access control on the service transactions and the endpoint itself.
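Deriving boundary-condition inputs from schema facets can be sketched as follows (a simplified illustration, assuming the `minLength`/`maxLength` facets have already been extracted from an `xsd:string` restriction in the WSDL schema):

```python
def boundary_values(constraints):
    """Derive boundary-condition test inputs from simple schema facets.

    constraints: dict with optional minLength/maxLength facets, as one
    might extract from an xsd:string restriction in a WSDL schema.
    """
    min_len = constraints.get("minLength", 0)
    max_len = constraints.get("maxLength", 256)
    return [
        "A" * min_len,        # at the lower bound: should be accepted
        "A" * max_len,        # at the upper bound: should be accepted
        "A" * (max_len + 1),  # just past the upper bound: should be rejected
        "",                   # empty input
        "\x00" * 8,           # control characters outside expectation
    ]

cases = boundary_values({"minLength": 1, "maxLength": 32})
```

Each generated value is then substituted into an otherwise valid request, and the service's handling of the out-of-range cases is observed.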

Open Source SOA Testing
The Open Source tools available today for SOA testing focus primarily on the functional testing of a service. Since functional testing comes first in the SOA development lifecycle and is adopted early in the development and implementation phase, it becomes widely adopted by development teams, both because it is free and because the use cases are often limited to simple unit testing of service messaging.

However, as services mature and move through the SOA lifecycle to system testing, integration, and pre-production analysis and validation, the other perspectives of SOA testing need to play a role in the comprehensive assessment of the quality, robustness, and capabilities of a published service. It is in these areas where the Open Source XML/REST testing tools fall short.

Open Source Limitations - SOA Functional Testing
Generally, the functional testing capabilities of an Open Source testing tool are adequate for simple SOA deployments that do not have complex WSDLs, schemas, or message patterns. Once the deployments gain more complexity, however, the challenges of functional testing move from single request-response testing to scenario testing, where functional behavior is measured not by one request-response, but rather by several transactions, each dependent on the other as a business functional unit. Testing these types of functional scenarios effectively requires the ability to maintain state between one test result and the next.
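Maintaining state between steps typically means extracting a value from one response and injecting it into the next request. A minimal sketch (the `CreateOrder`/`GetOrderStatus` operations and `orderId` field are hypothetical):

```python
import xml.etree.ElementTree as ET

def extract(response_xml, tag):
    """Pull a value out of one test step's response so the next step can use it."""
    root = ET.fromstring(response_xml)
    node = root.find(f".//{tag}")
    return node.text if node is not None else None

# Hypothetical response from step 1 of the scenario (e.g., CreateOrder)
step1_response = "<CreateOrderResponse><orderId>9473</orderId></CreateOrderResponse>"

# Carry the orderId forward into the step 2 request (e.g., GetOrderStatus)
order_id = extract(step1_response, "orderId")
step2_request = f"<GetOrderStatus><orderId>{order_id}</orderId></GetOrderStatus>"
```

A scenario-capable harness generalizes this: each step declares which response fragments to capture and where later requests consume them.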

Open Source Limitations - SOA Performance Testing
While there are several Open Source performance testing products on the market, these are primarily tools used in static web testing paradigms. When dealing with SOAP- and XML-based transactions, the static testing behavior of these performance harnesses cannot produce the unique wire signatures that actual service transactions require. Thus, when running performance tests with a web-based testing platform, the endpoints become inundated with static messages that are not characteristic of actual traffic patterns. In fact, in many cases the service endpoint itself is supposed to reject these static messages as replay attacks on the service.
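What "unique wire signature" means in practice is that every generated message must differ on the wire, typically via a fresh message ID and timestamp. A rough sketch (the header element names here are illustrative, not taken from any specification):

```python
import uuid
from datetime import datetime, timezone

def make_unique_message(body):
    """Wrap a payload with a fresh message ID and creation timestamp so each
    request carries a unique wire signature, as a replay-detecting endpoint
    expects. Header element names are illustrative only."""
    msg_id = f"uuid:{uuid.uuid4()}"
    created = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    return (f"<Envelope><Header><MessageID>{msg_id}</MessageID>"
            f"<Created>{created}</Created></Header>"
            f"<Body>{body}</Body></Envelope>")

# Two load iterations produce two distinct messages, unlike a static harness
a = make_unique_message("<Ping/>")
b = make_unique_message("<Ping/>")
```

A static harness replays byte-identical copies of `a`; a service-aware one regenerates the dynamic fields on every iteration.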

Another consideration of performance testing is the level of security and identity provisions that messages may be required to carry in order to access the service. The static Open Source performance testing harnesses do not provide solutions for message security and message identity requirements.

Open Source Limitations - SOA Interoperability Testing
The promise of SOA is an open, reusable architecture that lowers cost through reuse ROI. The challenge of SOA, however, is the ability to interoperate widely with other technologies that also communicate via SOAP, XML, and REST. Interoperability testing involves both design-time analysis of service characteristics, such as WSDL and schema, and run-time assessment of a service's robustness in consuming and handling message patterns that may fall outside the expected structures. Open Source toolkits leverage the available WS-I analysis framework to assess the design-time characteristics of a WSDL and schema according to published profiles, and also provide some run-time analysis reporting of message patterns. However, the Open Source toolkits do not provide the ability to generate messages that fall outside these expected patterns, which is the key to measuring the actual posture of the run-time service. In fact, it is in testing messages that are not expected that the true measure of a service's posture can be determined.
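Generating messages outside the expected patterns usually starts from a known-valid message and mutates its structure. A simplified sketch of that idea (the base message and element names are hypothetical):

```python
def mutate_message(valid_xml):
    """Generate message variants that fall outside the expected structure,
    to probe how a service handles non-conforming input."""
    return [
        valid_xml.replace("<symbol>", "<symbol><symbol>"),  # unexpected nesting
        valid_xml.replace("</symbol>", ""),                 # missing close tag
        valid_xml + "<trailing/>",                          # junk after the root
        valid_xml.replace("HPQ", "HPQ" * 10000),            # oversized element value
    ]

valid = "<GetQuote><symbol>HPQ</symbol></GetQuote>"
variants = mutate_message(valid)
```

Each variant is sent to the endpoint, and the pass/fail question is whether the service rejects it gracefully rather than faulting, hanging, or leaking internals.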

Open Source Limitations - SOA Security Testing
Security testing falls across many areas. From a threat perspective, security testing targets the integrity and structure of messages with injection attacks at the parameter and data-structure levels, in order to assess the behavior and resilience of the service endpoint when faced with data values and message structures outside of the expected format.
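Parameter-level injection testing can be sketched as substituting attack payloads into a request template (the payload list below is a small illustrative sample, and the `Login` template is hypothetical):

```python
INJECTION_PAYLOADS = [
    "' OR '1'='1",            # SQL injection probe
    "<![CDATA[<script>]]>",   # CDATA / script injection probe
    "]]><!--",                # XML structure break-out
    "A" * 100000,             # buffer-stress value
]

def inject_into_parameter(template, placeholder, payloads=INJECTION_PAYLOADS):
    """Substitute each attack payload into a request template at the
    given placeholder, yielding one probe message per payload."""
    return [template.replace(placeholder, p) for p in payloads]

template = "<Login><user>{USER}</user><pass>secret</pass></Login>"
probes = inject_into_parameter(template, "{USER}")
```

A fuller suite would drive such payloads into every parameter the WSDL schema declares, not just one placeholder.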

From a trust perspective, security testing involves PKI with encryption, signatures, and identity tokens. This requires testing frameworks that understand the various emerging standards from the W3C and OASIS, in order to support the wide range of security message formats. It also requires the means to retrieve and utilize X.509 certificates and private keys from a variety of sources, including the Windows Keystore, Java Keystore, SmartCards, PKCS#12 files, etc.

The Open Source tools are designed for general testing and message creation, but lack the in-depth security and identity features to be considered viable for this type of testing.

Adopting an Open Source tool for SOA testing seems the simplest, most cost-effective choice for developers and testers early on. However, you should plan for and consider the implications of a longer-term strategy with an Open Source testing tool: many other aspects of service testing, going beyond simple functional testing, contribute to a comprehensive testing solution across the entire SOA lifecycle.

Companies who specialize in SOA and Web Services testing focus their products on specific customer use-cases and testing needs, rather than the needs of the Open Source community.

You may find that paying for a testing solution ends up costing less than not paying for one.

Tuesday, August 26, 2008

SmartCard Testing for SOAP and Web Services

A.E.T. Europe B.V. and Crosscheck Networks Partner to Deliver Industry-First SOA Testing Solution for Strong Authentication.

Testing for signatures, encryption, decryption, and X.509 client authentication is now seamlessly provided within the SOAPSonar testing framework.
SOAPSonar provides the ability to use keys from a SmartCard to perform digital signatures, encryption, decryption, and SSL X509 mutual authentication. SOAPSonar provides a native integration with A.E.T SafeSign Client software to dynamically access the digital keying information on the card.

Setup for SmartCard Integration Support

SOAPSonar recognizes the existence of the A.E.T. SafeSign Client on the machine where it is running and integrates natively through its API to provide seamless access to the smartcard key pair. The first requirement for SmartCard support is to install the A.E.T. SafeSign Client; for more information about how to obtain and install it, please visit the A.E.T. website. For more information about how to obtain and install the Crosscheck Networks SOAPSonar testing tool, visit the Crosscheck Networks website.

Using a SmartCard Key for Signing, Encryption, or Decryption

Follow the steps below to use a SmartCard Key for digital signatures, encryption, or decryption:

1) Ensure A.E.T. SafeSign Client has been installed

2) Attach the card reader

3) Insert the card key

4) Go to the Test Case node in SOAPSonar and click on the Request Tasks tab

5) Create a new Signature, Encryption, or Decryption Task and click on the key icon to select the Key Pair

6) Go to the Current User->My Folder and select the key pair name of the smart card key

Using a SmartCard Key for SSL X509 Mutual Authentication

Follow the steps below to use a SmartCard Key for SSL X509 Mutual Authentication:

1) Ensure A.E.T. SafeSign Client has been installed

2) Attach the card reader

3) Insert the card key

4) Go to the Test Case node in SOAPSonar and click on the Authentication tab. If you want to create a global policy for authentication, instead click on the Policy node under the configuration tab and navigate to the Authentication tab.

5) Under the SSL authentication section, click on the key icon to select the Key Pair

6) Go to the Current User->My Folder and select the key pair name of the smart card key

Tuesday, February 19, 2008

Automated SOA Regression Testing

In testing SOA environments, I found it necessary to come up with a simple, automated means of creating reusable tests to quickly verify that services in a SOA deployment were still working as expected across version upgrades. I use a product from Crosscheck Networks called SOAPSonar, which provides automated regression testing and automated regression baseline creation. Some key features this product provides that were required for an automated solution include:

  1. Comparison to baseline measurements includes an XML diff capability that lets you isolate which portions of the response documents participate in the diff. SOAPSonar allows node-fragment, element-value, and attribute-value exclusions to be defined so that values that tend to differ in each response (timestamps, etc.) can be ignored and do not skew the diff results. Plus, it is simple drag-and-drop in a tree-view format, which makes the comparison easy.

  2. Regression baselines can be created automatically, by sending the entire test sequence to the service and storing the behavior as the baseline. SOAPSonar does this and automatically creates the XML diff profile, which can then be modified as needed for the verification characteristics (often no modification is required).

  3. Regression baselines can be easily shared. SOAPSonar stores the baseline policies within the test project file, which can then be shared with other teams. This allows the regression testing to be integrated directly into build automation or other existing scheduling harnesses.
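The exclusion-based XML diff idea can be sketched in a few lines (this is an illustration of the concept, not SOAPSonar's implementation; the tag names are hypothetical):

```python
import xml.etree.ElementTree as ET

def xml_diff(baseline, current, exclude_tags=()):
    """Compare two XML documents element by element, ignoring the text of
    any tag listed in exclude_tags (e.g. timestamps that differ per run)."""
    differences = []

    def walk(b, c, path):
        if b.tag != c.tag:
            differences.append(f"{path}: tag {b.tag} != {c.tag}")
            return
        if b.tag not in exclude_tags and (b.text or "").strip() != (c.text or "").strip():
            differences.append(f"{path}/{b.tag}: '{b.text}' != '{c.text}'")
        for bc, cc in zip(b, c):
            walk(bc, cc, f"{path}/{b.tag}")
        if len(b) != len(c):
            differences.append(f"{path}/{b.tag}: child count {len(b)} != {len(c)}")

    walk(ET.fromstring(baseline), ET.fromstring(current), "")
    return differences

baseline = "<resp><total>42</total><timestamp>2008-01-01</timestamp></resp>"
current  = "<resp><total>42</total><timestamp>2008-02-19</timestamp></resp>"

# With the timestamp excluded, the responses compare as equivalent
diffs = xml_diff(baseline, current, exclude_tags={"timestamp"})
```

Without the exclusion, the changed timestamp would register as a diff and skew the regression result, which is exactly the noise the exclusion feature removes.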

The steps below provide step-by-step instructions for how to baseline your services. To run these steps, you will first need to download an evaluation copy of SOAPSonar Enterprise Edition from the Crosscheck Networks website.

Creating a SOA Regression Baseline

Load up your WSDL in the SOAPSonar interface and create a set of inputs for your back-end services. This can be a single service or multiple services.

Go to the test suite view and press the “Create Regression Baseline” icon to have SOAPSonar run the tests and store the results in a baseline. This will result in the baseline regression policy editor appearing which allows you to specify exclusions if necessary on the document diff rules.

Generating a new baseline will run all test cases and test iterations in the test suite and store all the responses in sequence to an internal baseline regression data set. Before this baseline set of responses is generated, you will be presented with a dialog that allows you to choose how to initialize the success criteria functions for each baseline response. The options include an XML diff of the entire document, copying existing success criteria settings defined on each referenced test case, updating existing settings, or leaving the criteria empty by default where it can be edited and modified manually.

Regression Baseline Criteria Editor

Once the Baseline Regression data set has been created and stored, you are then presented with the Baseline Criteria editor where you can selectively choose for each test iteration how to determine success or failure. The criteria can be XML diff based where you choose to always diff the current test response against this stored baseline response and detect any variances.

To exclude items from the diff, simply click on the item and select the Exclude icon from the toolbar, or right-click on the node and select Exclude.

Running a SOA Baseline Regression Test

To run a baseline regression test, simply choose the “Use Saved Regression Baseline Success Criteria” option and press the Run Test button.

Generating an HTML Baseline SOA Regression Graphical Diff Report

After running the baseline test you can then generate and export an HTML report which highlights the Pass/Fail results of the test and also provides graphical diff reports for each test iteration. In this report, you can view the diff of the request vs. baseline request, response vs. baseline response, and each diff (with applied exclusions) for each XPath expression.

To generate the report, click on the log file and choose the report labeled “[HTML] Baseline Regression XML Diff Report”. You will be prompted for the directory to store the results. The results will be written to the selected directory with the name of the log file with a .htm extension and a subfolder of the test data created with the same name. The test data written with the report includes all actual XML requests and responses, as well as the baseline requests and responses.

The resulting report provides a summary and detailed view of the test results and also provides links to view the graphical diffs of:

1) Stored Baseline Request vs Currently Sent Request

2) Stored Baseline Response vs Currently Received Response

3) XML Fragment Diff of each diff rule defined.

The HTML report is contained in its own subdirectory and can be published to a central server.


Creating a baseline of expected service behavior in an automated way can significantly reduce the testing time for version updates of services and detect functional breakage. Given the complexity of version management among SOA components, a baseline strategy will pay big dividends in time savings over the life of the deployment.

Sunday, February 17, 2008

Intro to SOA Regression Testing: A Hands-on Approach

Learn SOA Regression Testing techniques through automated data sources and recording baseline tests.


Regress means to go backwards. Software regression testing is the means of identifying unintentional errors or bugs that may have been introduced as a result of changing a program module. The program module regresses by no longer working as it used to. Software development is an iterative process in which program modules are continually modified by teams of developers to meet changing system requirements. A typical software system with N modules has N-squared dependencies, so a flaw introduced in a modified module can have significant impact across the entire system.

Regression tests help identify changes between a selected product release and a previous release of the product, called a baseline. A baseline is a recorded snapshot of desirable product behavior. This expected behavior is then used to ensure that nothing in the system has been broken as a result of changes introduced in a program module. Establishing a regression testing framework is crucial for building reliable and stable software products.

Web services – the foundation of modern Service Oriented Architecture (SOA) – are self-contained, modular applications that one can describe, publish, locate, and invoke over a network. Web services are agnostic to operating system, hardware platform, communication protocol, and programming language. Most IT assets, such as application servers, RDBMS, CRM/ERP applications, and SaaS products, now advertise their interfaces as a Web Services Definition Language (WSDL) interface ready for SOAP/XML messaging. Using SOAP for system-to-system messaging and WSDL for interface description, IT professionals now have unprecedented flexibility in integrating IT assets across corporate domains. It is this flexibility of distributed computing provided by web services that makes developing and deploying a robust, resilient, and reliable Service Oriented Architecture challenging.

QA professionals face unique challenges in performing regression testing of a Service Oriented Architecture. The fundamental advantage of a Service Oriented Architecture is reuse of services across a distributed, technology-agnostic infrastructure. In a successful SOA deployment, the number and reuse of services should continue to increase. As the number of services and their reuse within a SOA increase, the difficulty in testing services increases dramatically, owing to the interdependencies of the services within a distributed environment. If one service's desired behavior changes, all the dependent services will exhibit faulty behavior. Thus, SOA architects, developers, and QA professionals are now responsible for adapting their testing techniques, selecting appropriate testing tools, and developing web services domain expertise to make their SOA deployments deliver business value reliably and securely.

In this article, we describe techniques for SOA Regression Testing through a hands-on approach that walks you through:

  • Setting up a simple web services consumer (client) and producer (server) environment.

  • Establishing an external MS Excel data source for driving test scenarios.

  • Recording an acceptable base-line run.

  • Simulating regression by changing the producer service.

  • Re-running the external test data and identifying producer service regression.
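The data-driven step of the walkthrough can be sketched in miniature. The article uses an MS Excel data source; here a CSV stand-in and a stubbed producer service are used so the sketch is self-contained (the symbols, prices, and service are all hypothetical):

```python
import csv
import io

# Stand-in for the external spreadsheet: each row is one test iteration
# (input symbol and the response value the baseline run recorded).
test_data = io.StringIO(
    "symbol,expected_price\n"
    "HPQ,48.10\n"
    "IBM,105.25\n"
)

def fake_quote_service(symbol):
    """Placeholder for the producer web service under test."""
    return {"HPQ": "48.10", "IBM": "105.25"}[symbol]

# Re-run every data row against the producer and collect regressions
failures = []
for row in csv.DictReader(test_data):
    actual = fake_quote_service(row["symbol"])
    if actual != row["expected_price"]:
        failures.append((row["symbol"], row["expected_price"], actual))
```

Simulating regression in this model is just changing what the producer returns: any row whose actual response no longer matches the recorded baseline lands in `failures`.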

After completing the hands-on walk through below, QA Professionals, Developers and Architects will have a strong foundation in establishing and extending test suites for regression testing within their Service Oriented Architecture.

Download the detailed document: Intro to SOA Regression Testing.

Sunday, February 10, 2008

Compress SOA Lifecycle through Development & QA Collaboration

Learn six effective ways of bridging the divide between SOA development and QA teams to compress the SOA Lifecycle and maximize resource utilization, improve focus, and keep projects on track.

Web services, the foundation of modern SOA, are being rolled out today in ever increasing numbers across enterprises. The key benefit of web services is reusability of services across applications in a distributed environment. Reuse is especially valuable in exposing monolithic, legacy functionality as self-contained services. With the promise of SOA and web services come also the challenges for successful implementation and testing. These challenges fall directly on the developer and QA teams to meet deadlines while also deploying a robust, resilient and reliable SOA solution.

The challenge of building robust and reliable services within a SOA exposes age-old fissures between Development and QA: who is really responsible for testing across distributed environments? In this article, we will explore the gaps and recommend ways of bridging these fissures to ensure greater efficiency in developing and deploying a web services-based Service Oriented Architecture.

1. The Best Way to Communicate - Speak the Same Language

SOA is a paradigm, not a technology. Thus, it is important to bound the paradigm by following the standards-based technology that the paradigm defines. Web services technologies such as WSDL, XML, SOAP, WS-Security, WS-Addressing, etc. are all bounded by OASIS or W3C standards that define how each of these technologies is to be utilized.

The overused term “Governance” is thrown around frequently in the SOA world. Some see it as an audit mechanism to keep things in line. However, governance as it applies to the SOA development lifecycle should be viewed as a guide or reference for enterprise-wide SOA best practices. Like most spoken languages with rules for syntax, semantics, grammar and idioms that form the basis of a conversation, SOA governance provides rules and constructs for the conversation between QA and development teams.

For example, the first tangible handoff from Development to QA involves a Web Services Definition Language (WSDL). WSDL defines the contract between the producer and a consumer of a SOA web service. The WSDL file itself describes the interface of the producing web service. Unfortunately, the specification for WSDL is broad and can lead to testing problems, deployment problems, and most importantly interoperability issues. To facilitate consistent and robust published WSDLs:

  • Define WSDL Best Practices: Define governance best practices for your organization’s WSDLs so that the producers and consumers of the WSDL have a common basis to measure interface robustness. The WS-I standard provides profiles that can serve as interoperability guidelines. Defining and enforcing additional rules for your organization’s WSDLs ensures measurable quality across enterprise-wide SOA teams.
  • Adherence Metrics: Provide a testing framework that allows consistent measurement of adherence to the organization’s governance criteria. Governance should be quantifiable within an enterprise, not a mere buzzword adorning corporate presentations and documents.
Bottom-line: Know your WSDLs. Focusing on the quality of the WSDLs is the first critical step in ensuring a robust and reliable SOA.

2. Invite Everyone to the Party - Meet Early and Often

Balancing timelines, deliverables, and tasks is a well-known shuffle in a typical software development lifecycle. The SOA lifecycle is no different. A common mistake in a SOA project is to wait and include teams — such as the QA team — as required, instead of inviting everyone to the party upfront. Typically, a development group is responsible for both the architecture and development of SOA web services. Unfortunately, this sense of sacred responsibility sometimes relegates the QA group to a secondary role in the early phases of the SOA lifecycle, where QA might not be invited to early architecture meetings. The secondary role assigned to QA also exists as a result of a historical and cultural bias against testing by development groups — “I am a developer, not a tester.”

Another reason for delaying QA’s involvement is to control the time and resource expense of different groups on a SOA project. However, this approach has just the opposite impact. Instead of saving time and resources, the project is delayed while the QA team ramps up on project goals and technical requirements.

Involving SOA architects, developers, and testers in all aspects of the planning and implementation process enhances the quality focus of the initiatives and paves the way for parallel efforts on test and implementation strategy. Each member of the team is equally responsible for a resilient, robust solution. For efficient team collaboration within a SOA project, consider the following actions:

  • Share Perspectives: Avoid downstream project pitfalls by sharing perspectives between QA and Development early in the planning process. Consider the impact of implementation decisions as they relate to QA tasks.
  • Define Roles and Responsibilities: Clearly determine and define roles and responsibilities early in the project to reduce redundancy in parallel tasks.
Bottom-line: Early QA involvement in the SOA lifecycle, even as silent observers in architectural meetings, is of utmost importance.

3. Get in the Fast Lane — Share Unit Tests to Jumpstart System Tests

In traditional software development lifecycles, unit testing and system testing are performed in different environments with different tools. With SOA implementations, however, this does not need to be the case. In fact, leveraging the framework and techniques already utilized and implemented by the development team provides an excellent opportunity to jumpstart the system testing initiatives.

Where unit testing reaches into higher-level testing scenarios, choosing a common framework — such as JUnit — rather than allowing a custom developer testing solution can facilitate reuse of the testing scenarios. This can be particularly useful in the handoff process between implementation and test. If the handoff includes not only the updated code and features but also a base set of test scenarios, the testing team gains a baseline to build from and can thus jumpstart the testing implementation for those features. The following are actions to consider:

  • Define Reusable Unit Tests: When planning unit testing, be cognizant of the system impact and isolate those tests which can be reusable in a system test environment.
  • Use Common Test Framework: For unit testing and system testing ensure that the same testing framework is used. This allows unit tests written against this framework to be reused for system testing baselines to jumpstart the scenario authoring.
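One way to realize a shared framework is a single test class whose scenarios run against either the local implementation or the deployed service. A minimal sketch in Python's unittest (the article's JUnit suggestion maps directly; the add service and its remote stub are hypothetical):

```python
import unittest

def local_add(a, b):
    """Unit-level target: call the implementation directly."""
    return a + b

def remote_add(a, b):
    """System-level stand-in: would invoke the deployed service.
    Stubbed here so the sketch is self-contained."""
    return local_add(a, b)

class AddServiceTests(unittest.TestCase):
    """One set of scenarios, pointed at either target, so unit tests
    seed the system-test baseline instead of being rewritten."""
    call = staticmethod(local_add)

    def test_adds_positive_numbers(self):
        self.assertEqual(self.call(2, 3), 5)

    def test_adds_negative_numbers(self):
        self.assertEqual(self.call(-2, -3), -5)

class SystemAddServiceTests(AddServiceTests):
    call = staticmethod(remote_add)  # same scenarios, system target
```

The subclass changes only the target, not the scenarios, which is the reuse the handoff between implementation and test is meant to capture.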

Bottom-line: Dependable, reusable, and automated unit tests are the foundation of a quality SOA Deployment.

4. Devel, QA, and Back Again — Compress the Issue Resolution Cycle

Once the SOA lifecycle has reached the testing phase, the cycle begins where issues are found, reported, and fed back to the development team. This becomes one of the most tangible areas to improve timelines and compress the lifecycle.

The development team members must go through the standard vetting process on issues. Without details about how to reproduce a problem, issue identification remains ambiguous. This is why choosing a testing framework that allows preservation of the test simulation can greatly reduce cyclic bug thrashing. A standard representation of the testing information (inputs, expected results, etc.) and the simulation sequence can be provided directly to the development team, where the issue can be reproduced, resolved, and later easily verified using the same sequence. Key aspects of improving the bug resolution cycle:

  • HTR Template: A “How to Reproduce” template that details everything that should be included when reporting issues, for example: type of machine, network environment, build number, and detailed sequence of steps. All issues should be reported using the same template.
  • Preserve Test Scenario: Provide a testing framework that allows preservation of the testing sequence in a file format. This allows sharing of this information among the teams for ease of reproducing the issue, verifying the fixed issue, and re-testing the scenario repeatedly to ensure the issue does not resurface.
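Preserving a test scenario in a file format amounts to capturing it as data rather than as ad-hoc manual steps. A minimal sketch using JSON (the HTR fields, operations, and `$prev.orderId` placeholder convention are all hypothetical):

```python
import json

# A test scenario captured as data, so any team member can replay it.
scenario = {
    "build": "2.1.0-rc3",  # hypothetical HTR field
    "steps": [
        {"operation": "CreateOrder",
         "input": {"sku": "X100", "qty": 2},
         "expected": {"status": "OK"}},
        {"operation": "GetOrderStatus",
         "input": {"orderId": "$prev.orderId"},  # value chained from step 1
         "expected": {"state": "PENDING"}},
    ],
}

def save_scenario(s):
    """Serialize a scenario so it can be attached to a bug report."""
    return json.dumps(s, indent=2)

def load_scenario(text):
    """Restore a scenario so a developer can replay the exact sequence."""
    return json.loads(text)

restored = load_scenario(save_scenario(scenario))
```

Because the round trip is lossless, the same file that reproduces the bug later verifies the fix and then joins the regression suite.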
Bottom-line: Clearly identify issues within a SOA by reproducing bugs and preserving test scenarios for regression testing.

5. Making SOA Security Testing Part of QA and Development

Often Security plays a pivotal role in SOA deployments, especially when applications with weak security controls are exposed as web services and these web services are consumed across corporate domains.

Traditionally, security testing has been the domain of security officers in an enterprise where security validation has been performed in a post-deployment scenario. This post-deployment approach has the following shortcomings:

  • Delayed Risk Identification: Statistics clearly show that detecting security bugs early in the life cycle of an application deployment yields dramatic cost savings. Rectifying a security flaw identified in an application after it has been developed and put into production has a disruptive impact, with severe financial implications in re-architecture costs and application downtime.
  • Inadequate Security Testing: Post-development testing is traditionally restricted to black box testing. Security techniques that have relied on black box testing methodology provide incomplete test coverage.
So how is SOA Security testing responsibility divided among the two groups?

  • Development Group Responsibilities:
    • Devel is responsible for White Box testing of SOA web services, based on the simple fact that it has the most visibility into, and knowledge of, the web services source code. Close inspection of exception-handling code and memory allocation/de-allocation techniques are examples that can have an immediate impact on reducing the risk posture of a web service.
    • If WS-Security is part of the deployment to secure web services messages, then it is Devel’s responsibility to ensure the basic provisions of WS-Security are implemented properly and are tested to ensure baseline interoperability. However, it is not Devel’s responsibility to test their WS-Security implementation against every vendor’s WS-Security implementation for interoperability. It is QA’s responsibility to build and maintain an extensive interoperability test matrix.
  • QA Group Responsibilities:
    • QA’s responsibility in the domain of security testing should involve every aspect of security testing methodology except for White Box testing. Black Box and Grey Box testing are two techniques that should be part of QA’s arsenal.
    • Another aspect of SOA security testing that falls into QA’s lap is the detection of sensitive data leakages in web services response messages. The detection of sensitive data leakage is a run-time issue that is highly dependent on creation of parameterized regression suites of web services request messages. If there are data leakage violations, it is QA’s responsibility to provide feedback to Devel on what filters to apply to prevent the leakage of certain data from the web services application.
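The parameterized leakage check described above can be sketched as a simple scan over response bodies. This is an illustrative sketch only — the patterns and the sample responses are hypothetical, not SOAPSonar internals:

```python
import re

# Hypothetical patterns for data that should never appear in a response.
LEAKAGE_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){15,16}\b"),
}

def scan_response(soap_response: str) -> list:
    """Return the names of sensitive-data patterns found in a response body."""
    return [name for name, pat in LEAKAGE_PATTERNS.items()
            if pat.search(soap_response)]

# Run the same check across a parameterized suite of responses.
responses = [
    "<GetUserResponse><name>Jane</name></GetUserResponse>",
    "<GetUserResponse><ssn>123-45-6789</ssn></GetUserResponse>",
]
violations = {}
for i, resp in enumerate(responses):
    found = scan_response(resp)
    if found:
        violations[i] = found   # feedback for Devel: which responses leaked what
print(violations)
```

Each hit identifies both the offending response and the category of data leaked, which is exactly the feedback QA would hand back to Devel when requesting a filter.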

SOA Security testing is best addressed by the divide-and-conquer approach between Devel and QA. WS-Security is a fairly new area for both groups and it promises to be the acid test of how far both groups will go to collaborate on detecting SOA security bugs.

Bottom-line: To augment the SOA security validation process, part of the security validation responsibilities should be shifted from typical security teams towards QA and Development Teams.

6. Find the Chain of Command — Understand Service Chain Testing

Service chaining is the invocation of services in a sequence where the output of one service becomes the input to the next. Since, in an enterprise deployment, chaining usually traverses multiple services, possibly across corporate boundaries, it is the responsibility of the QA team to ensure that the whole sequence of services is parameterized and validated.

On the other hand, the development team’s visibility into the service chain is limited to only the services it produces. A developer may produce a service that is dependent on a number of distributed web services. In such cases, where a service producer consumes other web services, the QA team should be responsible for testing all the external web services that are invoked by the service that the developers are producing.

It should also be noted that service chaining clearly requires an understanding of the interdependencies between different services and their inputs and outputs. Creating a test suite that encompasses the whole sequence of services requires some baseline knowledge of the SOA architecture. A QA team that is not involved in early architecture discussions will have a tougher time understanding how these operations interact. That is why attending SOA architecture meetings early helps in the creation of comprehensive test suites. This practice will reduce QA’s dependency on different Devel groups in an enterprise when it is ready to create service chain test suites close to the release date.
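The chaining pattern above — each response parameterizing the next request — can be sketched generically. The three services here are hypothetical stand-in callables, not real SOAP endpoints:

```python
def run_chain(initial_input, services):
    """Invoke services in order, feeding each response into the next request."""
    message = initial_input
    for service in services:
        message = service(message)   # in a real suite this would be a SOAP call
    return message

# Hypothetical three-service chain: lookup -> price -> order
lookup = lambda req: {"sku": "A-100", **req}
price  = lambda req: {"price": 42.0, **req}
order  = lambda req: {"order_id": 1, **req}

result = run_chain({"customer": "acme"}, [lookup, price, order])
print(result)
```

A break anywhere in the sequence fails the whole chain, which is why the test suite must parameterize and validate every hop, not just the final response.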

Bottom-line: Understanding service dependencies is crucial in developing effective test cases in a highly-distributed Service Oriented Architecture.


SOA testing plays a critical role in ensuring a successful deployment in any enterprise. A successful SOA testing strategy greatly hinges on cooperation between QA and Devel, where roles and responsibilities are shared fairly. An early indicator of a successful strategy is the level of QA’s involvement in the early part of the SOA life cycle, starting with attendance at SOA architecture meetings. An active role played early in the SOA lifecycle ensures QA’s independence during the later stages of deployment, without influence from Devel on how tests are to be conducted. A delicate balance between QA and Devel is achieved by QA taking responsibility for SOA system and integration testing while Devel takes responsibility for tasks such as White Box testing, Unit testing, and WSDL validation. This balance is supported by tools that provide clear demarcation between the roles of Devel and QA.

SOAPSonar and SOAPSimulator from Crosscheck Networks are tools that provide comprehensive functional, performance, interoperability and security testing of web services. SOAPSonar is a unified SOA Testing tool that clearly delineates the roles and responsibilities across the QA and Development Groups within a large enterprise focused on deploying a Services Oriented Architecture.

SOAPSimulator is a Service Simulation product that has a significant impact on the SOA Development Lifecycle by enabling QA professionals to start developing Test Cases even before the Service is coded. SOAPSimulator also enables SOA teams to move Quality upstream by requiring developers to meet governance criteria before releasing their services to the QA team. With the roles and responsibilities clearly defined, and SOA Testing tools such as SOAPSonar and SOAPSimulator that meet the needs of both QA and Developer communities, building and deploying a reliable and robust SOA is no longer a pipe dream.

About Crosscheck Networks

Crosscheck Networks’ mission is to provide products for testing, diagnosing and controlling enterprise Web Services. Crosscheck Networks’ products provide QA professionals, security personnel, and compliance officers with necessary information about the functional completeness, scalability, security and interoperability compliance of their Service Oriented Architecture (SOA). Through SOAPSonar and SOAPSimulator - industry leading comprehensive SOA testing products - IT professionals make more informed decisions that enable their companies to stay within corporate quality and regulatory boundaries. The SOA diagnostics architecture is the first in the industry to encompass security, compliance, performance and functional regression under a unified architectural framework. The architecture enables the product to be deployed from the core to the perimeter of an enterprise.

Visit Crosscheck Networks at

Saturday, January 19, 2008

Introduction to MTOM: Hands-on Approach

Learn how to use MTOM by building a simple prototype


With web services-based SOA now being deployed across Global 2000 enterprises, transmitting attachments such as MRI Scans, X-Rays, Design Documents and Business Contracts using SOAP messages has become a common practice. SOAP Message Transmission Optimization Mechanism (MTOM), is a W3C Recommendation designed for optimizing the electronic transmission of attachments. Through electronic transmission of documents, corporations can realize significant cost savings and better service levels by eliminating the use of postal mail. Paper-based manual tasks can be replaced with simple and efficient electronic processes where binary data can be transmitted between organizations through standards such as MTOM.

MTOM provides an elegant mechanism for efficiently transmitting binary data, such as images, PDF files, and MS Word documents, between systems. The Figure below shows the steps involved in transmitting data between a Consumer and Producer using MTOM.

MTOM Process

The Consumer application begins by sending a SOAP Message that contains complex data in Base64Binary encoded format. The Base64Binary data type represents arbitrary data (e.g., images, PDF files, Word docs) using 65 textual characters that can be displayed as part of a SOAP Message element. For the Send SOAP Message step (Step 1 in the Figure above), a sample SOAP Body with Base64Binary encoded element <tns:data> is as follows:
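For illustration, a body of this shape can be generated programmatically; the tns namespace URI and the file bytes below are hypothetical stand-ins, not values from the original sample:

```python
import base64

binary = b"%PDF-1.4 ..."  # stand-in for the bytes of a real PDF file
encoded = base64.b64encode(binary).decode("ascii")

# Assumed .NET-style default namespace; a real service defines its own.
soap_body = f"""<soap:Body>
  <tns:ByteEcho xmlns:tns="http://tempuri.org/">
    <tns:data>{encoded}</tns:data>
  </tns:ByteEcho>
</soap:Body>"""
print(soap_body)
```

Because the encoded value is plain text, it can sit inside the `<tns:data>` element like any other element content.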


An MTOM-aware web services engine detects the presence of Base64Binary encoded data types, <tns:data> in our example, and makes a decision – typically based on data size – to convert the Base64Binary data to MIME data with an XML-binary Optimization Package (xop) content type. The data conversion, shown in Step 2 of the Figure above, results in replacing the Base64Binary data with an <xop:Include> element that references the original raw bytes of the document being transmitted. The raw bytes are appended to the SOAP Message and are separated by a MIME boundary as shown below:

<tns:data><xop:Include href=""/></tns:data>
content-id: <>
content-type: application/octet-stream
content-transfer-encoding: binary
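The Step 2 conversion can be sketched by assembling a simplified multipart body by hand. The boundary string and content-id below are illustrative choices, not what a real MTOM engine emits:

```python
import base64

def to_xop(soap_with_b64, b64_value, boundary="MIMEBoundary"):
    """Swap an inline Base64Binary value for an <xop:Include> reference and
    append the decoded raw bytes as a separate binary MIME part."""
    cid = "raw-bytes@example.org"  # illustrative content-id
    include = (f'<xop:Include href="cid:{cid}" '
               'xmlns:xop="http://www.w3.org/2004/08/xop/include"/>')
    soap = soap_with_b64.replace(b64_value, include)  # reference, not inline data
    raw = base64.b64decode(b64_value)                 # recover the original raw bytes
    head = (f"--{boundary}\r\n"
            'Content-Type: application/xop+xml; type="text/xml"\r\n\r\n'
            f"{soap}\r\n"
            f"--{boundary}\r\n"
            f"Content-ID: <{cid}>\r\n"
            "Content-Type: application/octet-stream\r\n"
            "Content-Transfer-Encoding: binary\r\n\r\n")
    tail = f"\r\n--{boundary}--\r\n"
    return head.encode() + raw + tail.encode()

b64 = base64.b64encode(b"raw document bytes").decode()
wire = to_xop(f"<tns:data>{b64}</tns:data>", b64)
print(wire.decode(errors="replace"))
```

Note that the Base64Binary text disappears from the SOAP part entirely; only the `<xop:Include>` reference and the raw bytes travel on the wire.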

The raw binary data along with the SOAP Message and the MIME Boundary is transmitted over the wire to the Producer. The Producer then changes the raw binary data back to Base64Binary encoding for further processing. With this conversion between Base64Binary and raw binary MIME types, MTOM provides two significant advantages:
  1. Efficient Transmission: Base64Binary encoded data is ~33% larger than raw byte transmission using MIME. MTOM therefore reduces data bloat by converting Base64Binary encoding to raw bytes for transmission.

  2. Processing Simplicity: Base64Binary encoded data is composed of 65 textual characters. The data is represented within an element of a SOAP message. Security standards such as WS-Signatures and WS-Encryption can directly be applied to the SOAP Message. Once such operations are performed, the Base64Binary data can be converted to raw bytes for efficient transmission. Securing document transmission via SOAP, therefore, does not require additional standards for securing MIME-based attachments.
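The ~33% figure for the first advantage can be verified directly; the payload size below is an arbitrary stand-in:

```python
import base64
import os

payload = os.urandom(300_000)       # stand-in for a ~300 KB attachment
encoded = base64.b64encode(payload)

# base64 emits 4 output characters for every 3 input bytes
print(len(encoded) / len(payload))  # ~1.333, i.e. ~33% bloat
```

This is why MTOM converts back to raw bytes for transmission: the encoding overhead is paid only where the textual form is actually needed.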

In this paper, we will deploy a web service for processing binary data, ByteEcho(byte[] data), and examine using this service both without MTOM and with MTOM enabled.


For a hands-on understanding of MTOM, we will build a web service with a simple operation that echoes a byte stream. In this section, we will focus on setting up the SOA Testing framework for MTOM by installing the following components. Install the three products below in the listed sequence:

  1. Crosscheck Networks SOAPSonar Enterprise Edition:
    A .NET-based SOAP client used for comprehensive web services testing. Install the pre-requisites that the SOAPSonar installer asks you for, such as .NET Framework 2.0, if they are not already installed on your machine.

  2. Web Service Enhancements (WSE) 3.0 for Microsoft .NET Framework:
    This is an add-on to Microsoft .NET Framework 2.0 that enables developers to build secure Web services based on the latest Web services protocol specifications such as MTOM. Note: During installation, please select the Developer Version option.

  3. Microsoft .NET WebMatrix: This installer includes an IDE for building web services and a lightweight web server.
All three components can be installed on a Windows 2000/XP/2003/Vista machine with moderate resources. The web services Producer is the .NET WebMatrix server that supplies a web service with the ByteEcho(byte[] data) operation that applications can invoke over HTTP. In addition, it produces the WSDL file that defines the web service interface. This file provides all the necessary information for the consumer, SOAPSonar, to send SOAP requests to the target web service. SOAPSonar consumes and interprets the WSDL-based API published by the producer and invokes the web service.


To build a simple web service that illustrates handling binary data, follow these steps:
  1. Goto: Start > All Programs > Microsoft ASP .NET Web Matrix > ASP .NET Web Matrix.

  2. You will be prompted with the screen shown below. Fill in the information as shown: select "Web Services" in the left panel and the "Simple" template in the right panel, create a C:\WebServices folder to store the .asmx file, and pick C# as the language.

  3. Create a New File in WebMatrix

    Note: The OK button will be grayed until all the information on this panel is filled out.

  4. This will auto-generate a web service for you with an Add(int a, int b) operation as shown below. We will keep this auto-generated operation and include a new byte processing operation in the next step.

  5. WebMatrix C# Code

  6. Cut and paste the following code into the Web Matrix IDE right under the Add(int a, int b) operation:

  7. [WebMethod]

    public byte[] ByteEcho(byte[] data) {
        return data;
    }


  8. The IDE will look as follows:
  9. WebMatrix Code for Processing Binary Data

  10. Hit the play button in the IDE and it will prompt you to start the web application on port 8080. Your local firewall may prevent you from starting a listener on port 8080; if so, add the port to your firewall's list of allowable ports.

  11. Screen Shot to show WebMatrix Server Startup on Application Port

  12. A web browser page listing the Add and ByteEcho operations will appear. You can click Add and start playing with that operation. The ByteEcho operation does not accept input from the browser.


To setup the SOAPSonar test client, perform the following steps:

  1. Goto: Start > All Programs > Crosscheck Networks > SOAPSonar Enterprise Edition 3 > SOAPSonar Enterprise to launch SOAPSonar.

  2. Load the WSDL published at the .NET WebMatrix Endpoint http://localhost:8080/BinaryProcess.asmx?WSDL into SOAPSonar as shown in the figure below. Select the ByteEcho_1 test case in the left Project Tree Panel. Right-click on the data field and select Content Function > Base64Encode. A File Dialog will pop up to let you select a file from your system, such as a PDF or JPEG file. The filename will be embedded in the data field with a $b64:file prefix. Click SOAPSonar Save to commit changes and SOAPSonar Run to execute the test. You will see the SOAP response in the Response Panel.

  3. SOAPSonar Load WSDL

  4. Save the project by going to File > Save Project As.

  5. Click SOAPSonar Header in the Request Panel to review the Header information for the request. Select the (Sent Request) Tab as shown in the Figure below. The SOAP request and the Header information are also shown below. Make a note of the following information:

    1. Header Content-Length is 324771 bytes. This will vary based on the file you select.

    2. Content-type is text/xml. This will change when MTOM is enabled.

    3. The data field contains base64Encoded value of the selected binary file.

So far, you have successfully loaded a WSDL into the test client, SOAPSonar, and set up a simple consumer (SOAPSonar) to producer (WebMatrix) framework to send base64Encoded bytes to the WebMatrix server, which reflects the bytes back to SOAPSonar in the SOAP Response. Next, we will enable MTOM and review its impact on the SOAP Request and Response.


In this section, we will enable MTOM for WebMatrix and the SOAPSonar test client. To enable MTOM for WebMatrix, proceed as follows:
  1. Goto C:\Program Files\Microsoft WSE\v3.0\Tools and launch WseConfigEditor3.exe.

  2. With the WSE 3.0 Configuration tool, select the File Menu to open the web.config file located in C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\CONFIG.

  3. Under the General Tab, check both boxes to enable WSE 3.0 for the WebMatrix server as shown in the Figure below:

  4. WSE 3.0 Configurator

  5. Next, select the Messaging Tab and select “always” for the Server Mode MTOM Settings.
  6. WSE 3.0 Configurator

  7. Goto File > Save to save the new configuration and exit the WSE 3.0 configuration tool.

  8. Review the web.config file using a text editor and ensure that the following elements appear in the web.config file. As expected, the serverMode value for MTOM is set to “always.” The server will now only accept MTOM encoded messages. If a SOAP request is received by WebMatrix that is not MTOM, an HTTP error 415: "Media unsupported" is returned to the sender.

  9. <messaging>

    <mtom clientMode="Off" serverMode="always"/>

    </messaging>


  10. Re-send the base64Encoded SOAP request. Since WebMatrix receives a message that is not MTOM, an HTTP error 415: "Media unsupported" is returned to SOAPSonar, as displayed in the Response Panel.

  11. In the Project Tree, click on the Policy Settings node. Change the MTOM Setting – Client Mode to On as shown in the Figure below. This enables SOAPSonar to send MTOM encoded messages. Click to commit changes.

  12. SOAPSonar MTOM Setting

  13. Goto the ByteEcho_1 test case and click SOAPSonar Run to execute the test. Review the request sent to the server by clicking on the (Sent Request) Tab in the Request Panel. The SOAP request and the Header information are also shown below. Make a note of the following information:
    1. Header Content-Length is 244093 bytes. This will vary based on the file you select.

    2. Content-type is application/xop+xml. This indicates that an MTOM message is being generated by SOAPSonar.

    3. The data field now carries a MIMEBoundary-delimited part with content-transfer-encoding: binary.

    SOAPSonar MTOM Messages

    Note: The Header Content-Length for the Message Request with MTOM turned on is 244K, compared to 325K for the Message Request without MTOM. This corresponds to a ~25% reduction in message size even for a moderately sized message.

    Goto the panel shown in Step 8 above and check Show Raw Response for the MTOM settings. This turns off the binary-to-text encoding and enables you to view the raw binary content in the response panel.
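The serverMode "always" behaviour exercised in the steps above amounts to a content-type gate on incoming requests. This is an illustrative model of that decision, not WSE 3.0 source code:

```python
def mtom_gate(content_type):
    """Model a serverMode="always" endpoint: accept only MTOM requests
    (application/xop+xml), otherwise answer HTTP 415."""
    if "application/xop+xml" in content_type.lower():
        return 200  # MTOM request: process normally
    return 415      # not MTOM: "Media unsupported"

print(mtom_gate("text/xml; charset=utf-8"))                        # 415
print(mtom_gate('multipart/related; type="application/xop+xml"'))  # 200
```

This is why the plain text/xml request in Step 10 is rejected until the SOAPSonar Client Mode MTOM setting is turned On.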

You should now be comfortable with sending base64Binary encoded and MTOM encoded messages to a test server and viewing the responses in wire format and base64Binary encoded formats.


MTOM provides an efficient mechanism for transmitting binary data. MTOM’s approach of reducing the number of standards required for transmission while reducing the data bloat caused by Base64Binary encoding the entire attachment makes it an ideal standard for transmission of large binary content. Obviously, nothing comes for free. The gain in network transmission efficiency by reducing “wire footprint” is at the expense of CPU resources. The client has to expend processing resources to convert Base64Binary encoded type to MIME type before transmitting it over the wire. The receiver then performs another data type transformation from MIME to Base64Binary data. MTOM is ideal for use cases where a large number of external organizations want to transmit sizeable documents to an enterprise over the internet with low bandwidth availability.
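The CPU cost mentioned above comes from the two data-type conversions at each end of the wire; a quick sketch with a stand-in 1 MB document shows the round trip:

```python
import base64
import os
import time

payload = os.urandom(1_000_000)          # 1 MB stand-in document

start = time.perf_counter()
encoded = base64.b64encode(payload)      # sender-side conversion
decoded = base64.b64decode(encoded)      # receiver-side conversion
elapsed = time.perf_counter() - start

assert decoded == payload                # conversions are lossless
print(f"encode + decode of 1 MB: {elapsed:.4f}s")
```

The trade is CPU time at the endpoints for fewer bytes on the wire, which is exactly the low-bandwidth use case the paragraph above describes.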




US Phone: 1-888-CROSSCK (276-7725)
International Phone: 1-617-938-3956