For a long time I thought that SOA was process driven, meaning that the functional requirements were discovered by working downwards from a business process perspective to a comprehensive set of services that deliver the business value.
However, the more solutions I see, the less I believe it. Having said that, I realize I need to clarify that a bit more. Whenever you build an SOA from the business process downwards, it is smart to discover possible candidate services in your existing legacy environments (building upwards) so you can execute a 'meet in the middle' approach. This usually results in orchestrating business processes (with BPEL) and, simplified, composing composite services (based on atomic services from your legacy systems).
Is that bad? Not in itself, I think. It depends on the type and the number of processes. Suppose you have, on average, 50,000 running instances of a process. If you have to change the BPEL process you defined, what will you do? The first question to answer is: what do I do with my currently active processes? Can they continue using the existing process definition, or do they need to switch? And does that depend on the state they are in?
This problem gets worse the longer a process runs, because that increases the chance that a process change will occur while it is running. And it is not only functional changes that impact the running processes: an update of the underlying BPEL engine may do so as well. Of course, the problem also grows with the number of instances. Ever tried to restart a BPEL engine with 150,000 running instances?
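To make that first question concrete, here is a minimal sketch in Python of the decision you are forced to make per instance. All names are hypothetical (a real BPEL engine has its own versioning and migration mechanisms); the point is simply that whether an instance can switch to the new definition typically depends on the state it is in.

```python
# Hypothetical sketch: splitting running instances into those that can
# safely switch to a new process version and those that must finish on
# the old one. States and names are illustrative, not a real engine API.

# States in which old and new process definitions are assumed compatible.
MIGRATABLE_STATES = {"created", "awaiting_approval"}

def migration_plan(instances):
    """Decide, per instance, whether it migrates or stays on the old version."""
    migrate, keep_on_old = [], []
    for inst in instances:
        if inst["state"] in MIGRATABLE_STATES:
            migrate.append(inst["id"])
        else:
            keep_on_old.append(inst["id"])  # mid-flight: cannot switch
    return migrate, keep_on_old

migrate, keep = migration_plan([
    {"id": 1, "state": "created"},
    {"id": 2, "state": "compensating"},
    {"id": 3, "state": "awaiting_approval"},
])
```

With 50,000 running instances, producing and acting on exactly this kind of split is the work you sign up for every time the process definition changes.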
If it's a short-running process, you could stop all incoming transactions for that specific process and wait for the running instances to finish before you upgrade the process to its new version. Unfortunately, life is usually not that simple.
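That stop-and-drain approach can be sketched roughly as follows. The engine facade here is a stand-in I made up for illustration; real BPEL engines expose similar, but differently named, management operations.

```python
import time

class FakeEngine:
    """Stand-in for a BPEL engine's management API (all names hypothetical)."""
    def __init__(self, in_flight):
        self.in_flight = in_flight   # simulated number of running instances
        self.intake_open = True
        self.deployed = {}

    def suspend_intake(self, process_name):
        self.intake_open = False     # no new instances may start

    def resume_intake(self, process_name):
        self.intake_open = True

    def running_count(self, process_name):
        if self.in_flight:           # simulate instances finishing over time
            self.in_flight -= 1
        return self.in_flight

    def deploy(self, process_name, version):
        self.deployed[process_name] = version

def drain_and_upgrade(engine, process_name, new_version, poll_interval=0.01):
    """Quiesce: stop new instances, wait until nothing is running,
    then deploy the new version and reopen intake."""
    engine.suspend_intake(process_name)
    while engine.running_count(process_name) > 0:
        time.sleep(poll_interval)    # wait for in-flight instances to finish
    engine.deploy(process_name, new_version)
    engine.resume_intake(process_name)

engine = FakeEngine(in_flight=3)
drain_and_upgrade(engine, "OrderProcess", "2.0")
```

The catch is in the `while` loop: it only terminates quickly if the processes are genuinely short-running, which is exactly the assumption that fails for the long-running cases discussed here.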
Let's get back to the initial question: is an SOA process driven or not? If you follow the scenario I sketched above, you might end up with one that is. What you should keep in mind while designing an SOA application is that you have to look ahead to see which situations you need to be able to handle. If you know beforehand that you will have a (very) large number of active processes, that they will be long-running, and that they will very likely be subject to change, you might want to consider a different approach. Why? Because migrating running processes in a BPEL environment is very hard to do (and expensive too). You will need to make allowances in your architecture to prevent problems.
In that scenario, a more event-driven or message-driven approach may be the answer. By loosely coupling the stages a process goes through, using either events or messages, you gain a lot of flexibility when it comes to migrating to new versions of a process (step). This means that your 'highest level' process will probably not be visible as a process in your BPEL engine, though the underlying steps may be, again depending on how long the processes run.
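A minimal sketch of that idea, with in-memory queues standing in for a real message broker (JMS, AMQP, etc.) and all step names invented for illustration: each step consumes a message, does its work, and publishes a message for the next step. The end-to-end 'process' exists only implicitly in the message flow, so an individual step can be replaced or versioned without touching the others or migrating any central process instance.

```python
from collections import deque

# Queues stand in for a real message broker; names are hypothetical.
queues = {"validated": deque(), "priced": deque(), "done": deque()}

def publish(queue, msg):
    queues[queue].append(msg)

# Each step is an independent consumer. Deploying validate_order v2
# alongside v1 affects only this step; the chain as a whole is untouched.
def validate_order(order):
    order["valid"] = True
    publish("validated", order)

def price_order():
    order = queues["validated"].popleft()
    order["price"] = 100           # illustrative pricing
    publish("priced", order)

def ship_order():
    order = queues["priced"].popleft()
    order["shipped"] = True
    publish("done", order)

validate_order({"id": 42})
price_order()
ship_order()
```

The trade-off is visibility: nothing in this sketch shows you the overall process as one artifact, which is precisely what a BPEL engine would have given you.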