
Can SOA be bad for your health?

I recently featured in a podcast and wrote an article on the five SOA security traps, and one in particular sticks in my mind. The issue is flexibility - a good thing, most people agree, but in security and governance terms it can be a double-edged sword, and so it proves to be in the case of SOA.

The problem comes down to security domains. IT implementations can be thought of as a group of structures with varying levels of security - all the way from a community village where anyone can wander in anywhere, up to castles with moats, drawbridges and even boiling oil!

Imagine for example a company with a particular silo application which is highly sensitive and must be absolutely secure.

This could be implemented on a high-availability cluster with hardware encryption, and even have physical access controlled by putting it in a room with locks on the door and a guard!

Well, OK, this might be a little over the top, but the point is the company can take whatever measures it sees fit to implement a high-level security domain - think castle.

Now along comes SOA, with its philosophy of flexibility and shared, reusable services. Instead of running silos, applications become a linked set of services and logic, and the wonderful flexibility of SOA means these services could be running anywhere across the enterprise, on any platform and in any technology environment.

So suppose there is a shared 'create customer' service, and the high-security application switches to using this service instead of its own redundant create-customer code.

Now, since security is only as good as its weakest link, the security domain is broken. Someone just drilled a hole in the castle wall.
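The weakest-link principle can be sketched in a few lines. This is purely illustrative - the domain names and numeric levels are hypothetical, not from any real product - but it shows how the effective security of a composed application collapses to the least secure service it calls:

```python
# Hypothetical security levels for the 'village' / 'castle' metaphor.
SECURITY_LEVELS = {"village": 1, "town": 2, "castle": 3}

def effective_level(service_levels):
    """A composed application is only as secure as its weakest service."""
    return min(service_levels)

# The silo application alone sits safely in the 'castle' domain...
silo = [SECURITY_LEVELS["castle"]]
print(effective_level(silo))  # 3

# ...but once it reuses a shared 'create customer' service hosted in the
# 'village' domain, the whole workflow drops to the weakest link.
composed = [SECURITY_LEVELS["castle"], SECURITY_LEVELS["village"]]
print(effective_level(composed))  # 1
```

The arithmetic is trivial, but that is exactly the point: reuse silently takes the minimum, not the maximum, of the security levels involved.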

Of course, companies can take measures to ensure this disaster does not befall their critical apps. Procedures can be put in place to protect the integrity of the security domains, restricting changes to these applications and blocking them from SOA-based distribution.
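One way to make such a procedure concrete is a binding check in the service registry or governance layer: before an application consumes a shared service, verify that the service's domain is on that application's approved list. The application names, domains, and registry structure below are assumptions for illustration:

```python
# Hypothetical governance policy: which security domains each application
# is allowed to consume services from.
APPROVED_DOMAINS = {
    "payroll": {"castle"},                      # high-security silo: castle only
    "crm": {"castle", "town", "village"},       # low-sensitivity app: anywhere
}

def can_bind(app, service_domain):
    """Return True if the app may consume a service hosted in service_domain."""
    return service_domain in APPROVED_DOMAINS.get(app, set())

print(can_bind("payroll", "village"))  # False - reuse blocked at the boundary
print(can_bind("crm", "village"))      # True
```

Enforcing the check at bind time, rather than relying on developers to remember the policy, is what keeps a well-intentioned reuse decision from quietly drilling the hole in the castle wall.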

But many people are unaware of the exposure, and sometimes programmers, with the best of intentions, can accidentally end up compromising these domains.

In the end, it is up to management to put in place any education programs, working practices and policies and then to enforce them. But at least forewarned is forearmed.