In many enterprise integration scenarios, a single event triggers a sequence of processing stages, each performing a specific function.
For example, suppose you need to encode the files in a directory structure and produce an output file. This can be achieved with the Pipes-and-Filters architectural style, which divides a larger processing task into a sequence of smaller tasks (filters) that run independently and are connected to each other by channels (pipes).
To demonstrate this scenario, this example creates a very simple pipeline that loads a file and processes it. It can be extended to run the stages in parallel to improve performance.
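Such a pipeline can be sketched in a few lines. In this minimal sketch (the filter names are hypothetical, not part of the original example), each filter is a generator that consumes items from the previous stage and yields results to the next; chaining the generators plays the role of the pipes, and Base64 stands in for whatever encoding the real task requires.

```python
import base64

def read_lines(lines):
    # Source filter: a real pipeline would read files from a
    # directory; here it just emits in-memory lines.
    for line in lines:
        yield line

def strip_filter(items):
    # Intermediate filter: normalizes each item.
    for item in items:
        yield item.strip()

def encode_filter(items):
    # Sink-side filter: encodes each item (Base64 as a stand-in).
    for item in items:
        yield base64.b64encode(item.encode()).decode()

def run_pipeline(lines):
    # Wire the filters together: each stage's output is the
    # next stage's input.
    return list(encode_filter(strip_filter(read_lines(lines))))

print(run_pipeline(["  hello ", "world"]))
```

Because each stage only sees an iterable of inputs and produces an iterable of outputs, any filter can be replaced or reordered without touching the others, which is the essence of the style.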
There are a few obvious benefits to using this architectural style:
- Divide and conquer : The separate processes can be independently designed
- Increase cohesion : The processes have functional cohesion
- Reduce coupling : The processes have only one input and one output
- Increase abstraction : The pipeline components are often good abstractions, hiding their internal details.
- Increase reusability : The processes can often be used in many different contexts
- Increase reuse : It is often possible to find reusable components to insert into a pipeline
- Design for flexibility : Filters can be added, removed, or reordered without affecting the rest of the pipeline
- Design for testability : It is normally easy to test the individual processes
- Design defensively : Rigorously check the inputs of each component, or else use design by contract
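The defensive-design point can be sketched as a filter that validates its input before processing (the filter name is hypothetical): bad data fails fast at the stage that receives it instead of flowing further down the pipeline.

```python
def upper_filter(items):
    # Defensive filter: rigorously check each input before
    # processing, so invalid data is rejected at this stage
    # rather than propagated downstream.
    for item in items:
        if not isinstance(item, str):
            raise TypeError(
                f"upper_filter expects str, got {type(item).__name__}")
        yield item.upper()

print(list(upper_filter(["ok", "fine"])))  # valid input passes through
try:
    list(upper_filter(["ok", 42]))         # invalid input is rejected
except TypeError as e:
    print(e)
```

An alternative to explicit checks is design by contract, where each filter documents the preconditions its pipe must satisfy and the postconditions it guarantees to the next stage.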