Design template for managing cascade processing


So, imagine spaghetti code: a task should go from step 1 to step 2, then to step 3, and so on. If any step fails, the whole process should abort. I can write this in a procedural manner, but I would like to know whether there is a design pattern for this.

I looked at Chain of Responsibility, but it does not fit: I already know which process should handle each task. Master/Worker does not fit either, because the task needs to be processed step by step.

The concrete problem at hand is: allow the user to upload an Excel file, compare the file to the database table structure, then update the database.

Steps would be:

  1. Upload the file - verify that the file was uploaded, if not, return.
  2. Verify that the file is an Excel file, if not, return.
  3. Read the file into a datatable, if error, return.
  4. Get the datatable from the database, if error, return.
  5. Load each datatable into its respective 2D array, if error, return.
  6. Compare whether the arrays have the same number of columns, if not, return. And so on.
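The steps above can be sketched as a sequence of check functions run in order, aborting on the first failure. This is only a minimal illustration; the step names and the `ctx` dictionary are hypothetical, not a real API:

```python
from typing import Callable, Dict, List, Optional

# Hypothetical steps: each returns an error message on failure, None on success.
def verify_uploaded(ctx: Dict[str, str]) -> Optional[str]:
    return None if ctx.get("file") else "no file uploaded"

def verify_is_excel(ctx: Dict[str, str]) -> Optional[str]:
    return None if ctx.get("file", "").endswith((".xls", ".xlsx")) else "not an Excel file"

def run_steps(ctx: Dict[str, str],
              steps: List[Callable[[Dict[str, str]], Optional[str]]]) -> Optional[str]:
    # Run each step in order; abort on the first failure ("if error, return").
    for step in steps:
        error = step(ctx)
        if error is not None:
            return error
    return None  # all steps passed

steps = [verify_uploaded, verify_is_excel]
print(run_steps({"file": "data.xlsx"}, steps))  # None (success)
print(run_steps({"file": "data.csv"}, steps))   # not an Excel file
```

Each step stays small and independently testable, and adding a new check is just appending to the list.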

Is there a way to use a design pattern for this? Thanks.

There is a design pattern to address this kind of situation, and it is quite well known: Pipeline, or as MSDN calls it, the Pipes and Filters pattern. Note that it is NOT part of the famous Gang of Four design patterns, and it can even be considered an architectural pattern.

It is mostly used when a large body of data is processed through smaller, independent stages (or filters):

This pattern is used for algorithms in which data flows through a sequence of tasks or stages.

Use the PipelineProcessing pattern when: The problem consists of performing a sequence of calculations, each of which can be broken down into distinct stages, on a sequence of inputs, such that for each input the calculations must be done in order, but it is possible to overlap computation of different stages for different inputs as indicated in the figures in the Motivation section.


Use this pattern when:

  • The processing required by an application can easily be decomposed into a set of discrete, independent steps.
  • The processing steps performed by an application have different scalability requirements.

By using this pattern you can perform your process through smaller, independent (and therefore more manageable) steps. You may be able to parallelize some of them for better performance, provide different implementations of each step (filter), and customize them easily, for example with a per-step failure strategy or validation.
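A minimal sketch of the pattern itself: each filter takes the output of the previous one and produces the input for the next, and the pipeline just runs them in order. The class and filter names here are illustrative, not from any particular library:

```python
from typing import Any, Callable, List

class Pipeline:
    """Minimal Pipes-and-Filters sketch: each filter feeds the next."""

    def __init__(self) -> None:
        self._filters: List[Callable[[Any], Any]] = []

    def add(self, f: Callable[[Any], Any]) -> "Pipeline":
        self._filters.append(f)
        return self  # allow fluent chaining

    def run(self, data: Any) -> Any:
        # Pass the data through each filter in order; an exception
        # raised by any filter aborts the whole pipeline.
        for f in self._filters:
            data = f(data)
        return data

# Illustrative filters standing in for real processing stages:
pipeline = (Pipeline()
            .add(str.strip)   # e.g. normalize the raw input
            .add(str.upper))  # e.g. transform it for the next stage

print(pipeline.run("  hello "))  # HELLO
```

Swapping, reordering, or unit-testing a single filter does not touch the others, which is where the maintainability benefit comes from.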