Friday, 31 October 2014

The Systems Development Life Cycle (SDLC) and ITIL (Mind the Gap series, 2 of 10)

Today I’ll provide a brief review of the Systems Development Life Cycle (SDLC), ITIL, and the relationship between them.

Each side tends to view the universe from its own point of view, without due recognition of the other or of the need for coordination.

The SDLC is mentioned in ITIL V3, as is almost every aspect of alternative approaches to IT and Service Management.
The “S” in SDLC is variously expanded as “system”, “software” or even “service” by different authorities.

The description of the precise steps in any project will vary in detail, as will the many approaches to development outlined over the years. Early “waterfall” development was soon improved by increased prototyping and more iterative approaches, with focus enhanced by the use of scrum teams and sprints.

SDLC introduces its own set of acronyms:

TOR – Terms of Reference

PID – Project Initiation Document (or sometimes PIC, where the C stands for Charter)

SPE – Software Performance Engineering

It also distinguishes between internal systems testing and external pilot-site testing.

For sites that work this way, any project of more than a set number of days has to be defined in a formal project management style, starting with project initiation.
The deliverable used to establish a project is often called a PID or a PIC, or some other document containing the terms of reference.

However, this project view needs to be mapped into a matrix of management, since of equal importance to the service is the infrastructure view of applications, which is just as interested in the growth of an existing application as in the arrival of a new one.

The infrastructure view has gained more impact with the momentum of ITIL. The library is now well known, albeit at a superficial level. Its history has been discussed in many places, with varying levels of authority and accuracy of memory. The key thing to remember is why it was introduced and to what purpose.

The “centre of IT expertise” for the UK government was aware in the 1980s of the increasing skills shortage in the public sector, and of the fact that for each new site significant money was paid to external consultants to provide what was effectively an operations manual for ITSM. So the centre gathered together a general description of good practice from a number of sources, with a view to publishing it at no profit.

If the same team had tackled it today, it would probably be a free download from the web.

It was meant to be just a general description, independent of hardware, operating system, software, database, network or any other variables.

As such, it was a question of “take it or leave it, adopt and adapt at will”, without implied “correct” answers as to which of many processes would tackle which activity within the detailed dataflow definitions for any one site.

It now generates such large revenue from foundation training and certification that a whole army of false prophets has raised it to a new, gospel-like level in order to drive the material into new areas and new markets. Maybe fragmentation of interests will cause fragmentation in the deliverables…

Next Monday I'll start the week with a look at ITIL’s description of the capacity management process. In the meantime, why not sign up to be part of our Community and get access to downloads, white papers and more: http://www.metron-athene.com/_downloads/index.html


Adam Grummitt
Distinguished Engineer

