I have discussed the SOA Evangelist, the Architects, and the Developers. Today I will discuss the role of the Testers and the characteristics required to contribute to a successful SOA implementation.
One of the most important roles, and one that I probably should have included in the Architects post, is the Testing Architect. As I wrote on CIO.com in my article Six Questions to Consider Before Building a SOA Testing Team, SOA testing requires much deeper technical knowledge, including development skills. It might be unrealistic to expect your entire testing team to possess the technical skills required to successfully test SOA, but your test architect and your leads should understand concepts like statefulness, distributed computing, and data services in order to properly validate the underlying architecture. They also need to be able to take developer test harnesses and update them with their own test scripts.
A successful SOA testing strategy starts with the test architect. This person must have in-depth knowledge of SOA and should work closely with the EA team. I actually recommend that this person be a member of the EA team, but every business and culture is different. The goal of the test architect should be to set up a framework and a core group of policies and procedures (part of IT governance) so that the rest of the test team has the tools and the guidance to successfully test SOA. Without an established testing architecture, the company will have to rely heavily on expert knowledge from the entire testing team. I have seen three scenarios through my personal experiences and through my research.
Scenario 1: Underestimating SOA
The first scenario is outright failure caused by not having the tools and knowledge required. This happens when a company does not realize that its current methodology and internal personnel are not well equipped for testing SOA. These companies do not invest enough in tools, training, and governance, and usually can only test the presentation layer and possibly the interfaces. By not understanding the concepts of SOA, they are unable to validate the architecture, which leads to poorly performing and fragile services.
Scenario 2: Paying through the nose
The second scenario I have seen is relying too heavily on "expert" consulting firms for testing. In this scenario the company bets the farm on an experienced SOA consulting firm and pays rates that far exceed $100/hr. This model cannot be sustained for any length of time unless the company is willing to burn huge amounts of capital (which is not a popular thing to do these days).
Scenario 3: Good balance of internal/external expertise
A more desirable scenario is to train or hire a SOA testing architect, build a solid testing platform tailored to the needs of the organization, and govern the testing process while training the other members of the team. Companies should hire one or more experienced SOA testers, find an experienced consultant or two, or train an experienced and credible internal candidate to take the lead in creating the testing architecture. At the same time, the testing experts are charged with transferring knowledge to the rest of the internal team members. This is critical because highly experienced SOA testers are a rare breed, in great demand, and a flight risk; the knowledge base must be grown internally.
The needs can be broken down into these categories: people, tools, and governance. So what are the characteristics of a successful SOA tester? The answer depends on the architecture that is implemented, which in turn is related to the tools and policies that are put in place. Below is a diagram I often use to discuss the typical layers of an architecture.
[Diagram: typical layers of an SOA architecture, from SOA Slides]
As I mentioned in the previous discussion about the developers, I see the need and desire to specialize within the layers. The same holds true for testers. Work within the layers happens simultaneously in development. I recommend an iterative development and testing approach, which means there should be testers working within the layers simultaneously as well.
Here is what I would strive to put in place, keeping in mind that these are roles and some people may fill multiple roles:
SOA Test Architect
A courageous leader with extensive working knowledge of SOA and distributed computing, integration testing experience, coding/scripting experience, and an understanding of the business. It is likely that you may have to go outside to hire this person or bring in a consultant to assist your top-level tester.
SOA testing leads
This person or persons must understand all layers of the architecture (let's not forget security either) and should be able to design test plans that validate both the architecture and the governance model. They should understand how to perform black-, white-, and gray-box tests. Testing abstracted services requires extensive testing in the areas of security, performance, and regression. Throw in the versioning capabilities of services and the fact that service consumers can use services in ways that were not anticipated, and the permutations of test cases start skyrocketing. The test leads need to practice risk-based testing and balance risks, timelines, and costs. There is just so much more at stake here than in traditional n-tier architectures.
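To see why the permutations skyrocket, consider a single service crossed against its versions, its consumers, and a handful of test scenarios. This is a minimal sketch; the version, consumer, and scenario names are hypothetical, and a real inventory would come from the service registry or governance tooling.

```python
from itertools import product

# Hypothetical dimensions for one abstracted service.
versions = ["v1.0", "v1.1", "v2.0"]
consumers = ["web portal", "batch job", "partner API"]
scenarios = ["happy path", "invalid input", "timeout", "auth failure"]

# Every combination is a candidate test case -- 3 versions x 3 consumers
# x 4 scenarios already yields 36 cases for a single service.
matrix = list(product(versions, consumers, scenarios))
print(len(matrix))  # -> 36
```

Risk-based testing is what keeps this tractable: the lead prioritizes the cells of this matrix by business impact rather than executing all of them.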
Business process testers
The business processes should be modeled within some tool and will likely call one or many services. The process flow requires a series of decision points based on variables and constants and can trigger events such as notifications, alerts, other processes, error-handling routines, services, and a variety of other possibilities. The testers need to validate the business process as a standalone entity. For example, if the business process is "validate credit card", the tester needs to ensure that this process handles the inputs correctly, properly runs the validation process, and generates the appropriate output. At this point, the tester need not be concerned with any other processes or services. These testers must work closely with the business and/or business analysts and do not need the breadth of technical knowledge that the leads and architects must have (although it would help). They should approach the testing from a business standpoint.
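A sketch of what "validate the process as a standalone entity" can look like, using the credit-card example. The function below is a local stand-in for the modeled business process (a Luhn check plus a length rule, which are my illustrative assumptions); in practice the tester would invoke the deployed process through its tool, not local code, but the shape of the test is the same: known inputs, asserted outputs, nothing else in scope.

```python
# Stand-in for the "validate credit card" business process.
def validate_credit_card(number: str) -> dict:
    digits = [int(d) for d in number if d.isdigit()]
    if len(digits) != 16:
        return {"valid": False, "reason": "wrong length"}
    # Luhn checksum: double every second digit from the right.
    total = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    ok = total % 10 == 0
    return {"valid": ok, "reason": None if ok else "bad checksum"}

# The tester exercises inputs and outputs of this process only:
assert validate_credit_card("4539 1488 0343 6467")["valid"] is True
assert validate_credit_card("1234")["reason"] == "wrong length"
print("process-level checks passed")
```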
Service testers
These testers must be much more technical and must understand how to work with XML, SOAP, WSDLs, networking and telecommunication concepts, statefulness, and various platforms and technologies (Java vs. .NET; Windows vs. Unix/Linux vs. mainframe; etc.).
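At a minimum, that means being comfortable assembling and dissecting SOAP envelopes by hand. Here is a small sketch using only the Python standard library; the CheckCredit operation and the example.com namespace are invented for illustration, and a real request would be generated from the service's WSDL.

```python
import xml.etree.ElementTree as ET

# A hypothetical SOAP 1.1 request for an illustrative CheckCredit service.
envelope = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <CheckCredit xmlns="http://example.com/credit">
      <CustomerId>42</CustomerId>
    </CheckCredit>
  </soap:Body>
</soap:Envelope>"""

# Namespace-aware parsing -- a common stumbling block for new SOA testers.
ns = {"soap": "http://schemas.xmlsoap.org/soap/envelope/",
      "c": "http://example.com/credit"}
root = ET.fromstring(envelope)
customer_id = root.find("soap:Body/c:CheckCredit/c:CustomerId", ns).text
print(customer_id)  # -> 42
```

The same namespace handling applies when asserting on response envelopes, which is where most service-level test scripts spend their time.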
User Interface testers
The company most likely already has an abundance of people who can test the UI. If your company is leveraging mashups, wireless devices, or consumer-facing UIs, the complexity of the testing will be greatly increased. These testers may need to become familiar with AJAX, RIA, portal technology, RSS, and a variety of social software.
Data services testers
These testers must understand concepts of data modeling, database CRUD operations, transformations, security and roles, authentication, and much more. Everything starts with the data and if errors are introduced in this layer, everything else is doomed to fail. You must have a very strong testing lead working in the data services layer since data is the foundation of any successful implementation.
Other areas of focus
The name of the game is speed to market and test automation is a critical component for making that a reality. Performance testing is extremely critical and organizations should practice simulations to try to anticipate future performance of all services. Validating the security model and the governance model should also be part of the overall test strategy. Whoever is responsible for designing the security test plan should read (and fully understand) this book.
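The skeleton of a service performance test is simple enough to sketch: call the service repeatedly, collect latencies, and compare percentiles against the agreed service-level targets. The call_service stub and the sample count below are placeholders; a real run would invoke the deployed service over the network under realistic load.

```python
import statistics
import time

# Stand-in for a real service invocation (e.g. an HTTP/SOAP call).
def call_service():
    time.sleep(0.001)  # simulated work

# Collect latency samples in milliseconds.
samples = []
for _ in range(50):
    start = time.perf_counter()
    call_service()
    samples.append((time.perf_counter() - start) * 1000)

# Report the percentiles a performance test would check against targets.
p50 = statistics.median(samples)
p95 = sorted(samples)[int(len(samples) * 0.95) - 1]
print(f"p50={p50:.1f}ms p95={p95:.1f}ms")
```

Percentiles, not averages, are what matter here: a healthy mean can hide the tail latencies that break a composite business process.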
I am by no means a testing expert (but I did stay at a Holiday Inn Express last night). I do read a number of testing blogs listed below:
- Randy Rice's Software Testing & Quality Blog
- Frank Cohen's blog
- The ITKO Lisa Soapbox: SOA testing, validation & virtualization
- SOA Testing
- Mike Kelly's testing blog