Along with Andy and Adam, I attended the annual IT Service Management Forum (itSMF) Conference in London on the 7th and 8th of November 2011. It is regularly billed as the event of the year for service management professionals and all things ITIL. It is supported not only by delegates with a serious interest in Service Management but also by governing bodies, sponsors and 40 suppliers exhibiting their Service Management tools, training and consultancy. Notably, and perhaps reflective of these times of austerity, when I last attended the itSMF conference in 2009 there were more than double the number of suppliers exhibiting, and some notable leaders in the field were not represented at this year's event.

The main content of the conference was 50 presentations on a variety of subjects spread across the two days. Typically the presentations ranged from thinly disguised vendor sales pitches, through theoretical rather than pragmatic concepts, to real-world examples of initiatives undertaken and the associated experiences. As a general rule of thumb, it tends to be these real-world, experience-based presentations, along with targeted factual update sessions, that provide the most value.

The sessions I went to varied greatly in their interest and usefulness, with the main value coming from sessions covering:

  • Developing Effective Performance Management, which, although not a particularly revolutionary concept, did help crystallise a few ideas regarding a more systems-based approach to Performance Management. Such an approach would be more holistic but provide differing levels of granularity, so that the 'default' levels provided could be matched to the requirements of each audience, while still allowing drill-down to more granular information as and when required. One interesting idea was to use the Service Catalogue to identify the relative value of the Services offered to the business.
  • The ITIL 2011 Update session, which outlined the major changes introduced in the very recent publication of updated versions of the five ITIL manuals. Whilst it is clear that we will need to reference all of these updated versions, approximately 80% of the changes are focused on the Service Strategy manual, with those changes cascaded through the other manuals for consistency. The only potentially new concept is the introduction of a Continual Service Improvement Register, a concept we have advised clients on previously, even if not using that specific terminology.
  • A session describing the ITIL Master Qualification, which is currently under development. This will be the highest qualification attainable in ITIL and will be based upon validating candidates' capability to apply the principles of ITIL in the real world. It will not be based upon any form of courses or examinations; instead it will take the form of four stages (Application, Proposal, and Work Package creation and submission based upon real-world utilisation and experience) before a Final Interview. There will be a set of 40-45 requirements that will have to be met for candidates to be accepted onto the qualification programme, but further details are not yet available. The applicability to representatives from Smart421 is therefore not yet known, but besides the personal value of such a high qualification, it could also be a significant differentiator for Sales if anyone within Smart421 were to achieve it.
  • Mobility, Big Data and Precognitive Support, a session presented by Chris Dancy, the founder of ServiceSphere, which delivered an interesting and compelling view of the recent past, the present and what we should anticipate for the future. For example:
    • Mobility: there are 5.3 billion mobile subscribers (76 percent of the world population), with the growth led by China and India. Over 85 percent of new handsets are web-enabled, which will enable continents like Africa to leap-frog 'hard' infrastructure requirements such as land lines and PC-based networking entirely. Mobile devices are already ubiquitous and will become the de facto integration method, i.e. they will increasingly take precedence over 'traditional' PCs and laptops. Integration will be facilitated primarily through mobile apps; one projection is that cumulative mobile app downloads will reach 44 billion by 2016.
    • Big Data: a term the presenter thought will become as familiar as Cloud in the coming months and years. Statistical examples included that the amount of information generated worldwide between 1995 and 2000 was generated in two days in 2010, and in one day in 2011. Another example was that the entire internet in 2003 was the same size as Facebook alone is today!
    • This amount of data is facilitating a move towards more precognitive analysis, for example studying moods on Twitter via automated textual analysis, or mapping happiness on Foursquare across cities. As a directly connected example from outside the presentation, on the way home from the conference I heard a radio interview about perceptions of the value for money of 3D films, based upon automated textual analysis of blogs on film sites. Back in the presentation, these techniques are apparently already being used in extended news cycles by media organisations, with pre-event analysis occurring specifically in preparation for actual events – it sounds a bit like pro-active trending!
    • Finally, the speed of change that should be anticipated is already outlined to a large degree by the statistics above, but it is also argued on a wider scale that "we are still in the first minutes of the first day of the internet revolution", and that Kurzweil's Law of Accelerating Returns, with its implied exponential, non-linear growth in technological capabilities and human knowledge, is already playing out in line with Kurzweil's Six Epochs ideas. Whilst it could be argued that such considerations are overly cerebral and abstract, it doesn't take much consideration of the evidence and experience of the last 10 or 20 years to realise that future opportunities and challenges are going to be a world apart from what we have seen previously.
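The automated textual analysis of moods described above can be sketched with a simple lexicon-based scorer. This is a hypothetical, minimal illustration of the general technique, not the tooling used by the organisations mentioned; the word lists and sample tweets are invented.

```python
# Minimal lexicon-based mood scorer, in the spirit of the automated
# textual analysis of tweets described above. Real systems use far
# richer lexicons and models; this just shows the basic idea.
POSITIVE = {"happy", "great", "love", "excellent", "good"}
NEGATIVE = {"sad", "terrible", "hate", "awful", "bad"}

def mood_score(text):
    """Return a score in [-1, 1]: +1 if all sentiment words are positive,
    -1 if all are negative, 0 if the text carries no sentiment words."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(1 for w in words if w in POSITIVE)
    neg = sum(1 for w in words if w in NEGATIVE)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

tweets = [
    "I love this film, it was great!",
    "Terrible queue at the cinema, awful experience.",
    "Off to London for the conference.",
]
print([mood_score(t) for t in tweets])  # [1.0, -1.0, 0.0]
```

Aggregating such scores over millions of posts, by time or by location, is what makes the city-level "happiness maps" and pre-event trend analysis possible.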

Finally, Andy and I also attended a couple of sessions on management of 'The Cloud'. We found both sessions basic and were left with the clear impression that Smart421's capabilities and position are far more advanced than anything represented at the conference – so clearly this is an area we should be shouting about more!

On day 3 it was straight into the streamed conference sessions rather than plenary/spotlight sessions.

Cloud Computing

I went for the cloud computing stream in the morning, and quickly concluded that I'd made a good call. First up was Mark Skilton from Capgemini talking about the work of the Open Group cloud computing working group, specifically the ongoing work on a cloud ROI model, cloud business use cases and a "cloud buyer questionnaire". Mark's presentation style was to throw a wall of information at you very quickly, which I found very engaging, and he raised some really interesting points. For example, he stressed the importance of separating buyer and seller perspectives/use cases in the cloud discussion (as they tend to be rather blurred together in most discussions), there were some interesting discussions about pricing models and cloud elasticity (i.e. that infinite elasticity is an illusion), and also some debate around cloud architecture models and the "is a private cloud really a cloud?" argument.

The second cloud-related presentation from Francesco Pititto covered cloud in the (Italian) telco sector, and there were a couple of key points for me from that:

  • Cloud adoption gives the tier 1 telcos a means to combat the declining revenue challenges that they have with their traditional business, by re-exploiting their expensively developed assets, and with longer-tailed business opportunities, i.e. customers they could not have economically serviced before.
  • Telecom Italia are offering IaaS to the Italian market – this has confirmed to me something that I've been chewing over for a while. On the face of it, how could a largely country-specific business even hope to compete with a scale IaaS player like Amazon Web Services? Well, the point is that they have some USPs that allow them to compete – whilst they may have a much smaller data centre footprint and fewer economies of scale, they can guarantee that data remains in territory (thereby alleviating some customer concerns regarding legal/regulatory issues), they are a trusted brand in their territory, and they have an existing SME customer base to sell to (I stress SMEs as this is where the early adoption is).

Later on in the day, a presentation from Enrico Boverino raised an interesting point about ITIL's CMDB and the role it has to play in providing cloud governance. The basic point was that CMDBs require enhancement to be able to adequately support the dynamically assigned and elastic assets of the cloud computing world – and that this will be a barrier to cloud adoption (or at least to the adequate service management of cloud-based solutions).
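The enhancement being argued for can be sketched in a few lines: rather than treating configuration items as static records, a CMDB for elastic assets has to reconcile itself continuously against the cloud provider's live inventory, creating and retiring CI records as instances appear and disappear. This is a hypothetical illustration of the principle, with invented class names and instance ids, not a reference to any specific CMDB product.

```python
# Hypothetical sketch: a CMDB that copes with elastic cloud assets by
# reconciling its CI records against the provider's live inventory,
# retiring CIs whose instances no longer exist.
from dataclasses import dataclass

@dataclass
class ConfigurationItem:
    ci_id: str            # e.g. a cloud instance id
    ci_type: str
    status: str = "live"  # "live" or "retired"

class ElasticCMDB:
    def __init__(self):
        self.items = {}

    def reconcile(self, provider_inventory):
        """Sync CI records with the set of instance ids currently running."""
        for inst_id in provider_inventory:
            if inst_id not in self.items:
                self.items[inst_id] = ConfigurationItem(inst_id, "vm")
        for ci in self.items.values():
            ci.status = "live" if ci.ci_id in provider_inventory else "retired"

cmdb = ElasticCMDB()
cmdb.reconcile({"i-001", "i-002"})  # two instances spun up
cmdb.reconcile({"i-002", "i-003"})  # i-001 terminated, i-003 added
print(sorted((c.ci_id, c.status) for c in cmdb.items.values()))
# [('i-001', 'retired'), ('i-002', 'live'), ('i-003', 'live')]
```

Note that the retired CI is kept rather than deleted: service management still needs the history of short-lived instances for incident and problem analysis, which is exactly where a traditional, manually maintained CMDB falls down.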

ArchiMate

In the afternoon I attended the two ArchiMate sessions, which I was keen to hear, just to test my own reluctance to adopt ArchiMate within Smart421. The first one (from Harmen van den Berg) gave an overview which was a good intro to the motivations and capabilities of ArchiMate. I've never doubted that ArchiMate is superior to UML for enterprise architecture modelling, but my resistance to taking it any further up to now has been based on the reluctance to learn, and to educate staff and stakeholders about, yet another notation. I don't doubt that I could be more effective on holiday if I learned Spanish, but English is "good enough" and gets the job done. It's rather a lazy approach, but I guess it's all about time/energy investment vs return. So I wanted to "see the light" in this presentation, and to some extent I did. The most powerful aspect of ArchiMate that struck me was the elegance with which you can model and show the relationships through the architectural domains (business, data, application, technology) – this can be done in UML via stereotyping etc, but as it was a design goal for ArchiMate, it's really nicely done and clear in actual usage. Of course, it's relatively easy to pick up, so the personal investment for anyone used to modelling is small.

The second ArchiMate presentation was a case study from Alexander den Hartog about the use of ArchiMate to model the EA for a global organisation. Ironically, the thing that struck me about this presentation was the enthusiasm of the presenter and the excitement he portrayed about having got the EA of his organisation under control and understood – so I think he probably had the force of will to make this happen even if he hadn’t adopted ArchiMate (though it certainly made it easier). It was quite a tour de force of an EA case study – he’d actually got the entire EA modelled, and more importantly maintained – not something you see that often. Hats off to Alexander! This led me to question what had enabled this to happen in his organisation when so many others struggle to create baseline models and “keep them alive”. The reasons seemed to be:

  • He was the modeller – he owned the model, did all the updates, did his own governance (but had external reviews), and this makes things massively easier to manage. I guess I would describe this as a relatively agile modelling approach. Of course, it's not so scalable, but it was not a trivial organisation by any means, so it was interesting that this was possible. The key observation here is: if you've got someone who knows what they are doing, then you don't need an army of EAs to build and keep a model up to date. As we know, the more people you add, the more complex the communications paths become.
  • He had the right relationships (e.g. into the infrastructure teams), so he was alerted when things changed and the model did not go stale.
  • The communications strategy was right – the model was published in HTML format and shared with stakeholders and reviewed by them in this format.
  • The change management approach was clear – he published a “change document” that defined what had changed in the model and this was the key review vehicle. In their case, this output was produced and reviewed as a project initiation document which works for them.
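The observation in the first point above, that adding people makes communication paths multiply, is the classic quadratic growth from Brooks: a team of n people has n(n-1)/2 pairwise paths. A quick illustration:

```python
# Pairwise communication paths in a team of n people: n * (n - 1) / 2.
# One modeller has zero coordination overhead; ten have 45 paths to manage.
def communication_paths(n):
    return n * (n - 1) // 2

for n in (1, 2, 5, 10):
    print(n, communication_paths(n))
# 1 0
# 2 1
# 5 10
# 10 45
```

Which is exactly why a single capable modeller with good relationships could keep an entire EA alive where larger teams struggle.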

Summary of the conference

Overall I thought it was a good event – although I personally didn’t find some of the plenary sessions on the first two days as useful as some of the more specialised afternoon sessions. I’ve met some really clever people – whilst the travel etc is a PITA, there is no substitute for just meeting people face-to-face and hearing their viewpoints, having your preconceptions about EA challenged etc. I liked the way that it didn’t feel like anything was not up for debate, so there was some great interaction in some of the sessions – not that I agreed with all of it, but that’s not the point. Vive la difference.

Out of all the presentations I’ve attended over the three days, I’m pretty sure only one of them, whilst being entertaining, must have involved some kind of hallucinogenic assistance :). So that’s not a bad percentage…

I often think that we don't sing our own praises enough at Smart421, and this was reinforced to me the other day when I received April's customer survey results for the Service Management part of our business. It made me proud to be a Smartie! It's part of the payback for all the hard work that goes into activities like being early adopters of ITIL all those years ago etc…

Here are just some of the quotes, each from a different customer in the survey:

  • “absolutely pleased with their knowledge and confidence of the system”
  • “dependability and thoroughness of execution, transparency of reporting and communication allow low overheads in maintaining the relationship – when compared across other suppliers”
  • “the team have delivered consistently on or in advance of agreed timeframes”
  • “Commercial propositions are responded to quickly, that’s a great help with our customer being so dynamic”
  • “Reports/feedback from [the team has] always been detailed, constructive and informative. Far beyond what I would have expected based on the experience I’ve had with other support teams”
  • “…continue to provide excellent support, as do the whole team”
  • “Response time is excellent. Very good in taking ownership and resolving issues.”
  • “[The team] have maintained their excellent level of service”
  • “All of this individuals who I have worked with have gone above and beyond the call of duty for what has been asked of them. It’s a pleasure to work with you all.”

  • “Very responsive and alert to the needs of our customer’s business. An excellent partner to work alongside, as they feel like part of the project team and can be relied upon like a colleague.”

The full details of the survey results are available – just ping us an email. Forgive me the sales pitch :)

I’m starting to wonder if mention of the word ‘Agile’, in relation to software development, is already starting to be seen as some sort of swear-word, in just the same way as ‘Waterfall’ is frowned upon, even demonised.

Over my none-too-short career, I have worked in many different environments, using many different software development processes and quality standards. Those include military spec systems, banking and financial applications, telecommunications, pharmaceutical regulatory systems and assorted other application and integration projects. As a result, I have had to work on software with different requirements based on scale and rigour (consider BS5750, ISO9001 and FDA rules, for instance).

At Smart421, we have ISO accreditation for our software development processes as well as our ITIL/ISO-20000 service management activities. Being systems integrators, we will use whichever project management process is most suitable, or that which is requested by our clients. In this regard, our Prince2-based project management approach is our default choice, scaled to meet the needs of each particular project. We have also delivered projects using Rational Unified Process (RUP) and Scrum, as well as developing with other Agile software development approaches.

Given that expertise (mine and Smart421’s), it is quite clear that no one development process is going to be correct for all software projects. Returning to my initial comment, it seems to me that ‘Agile’ may already be suffering from too much adoption on projects where it is not entirely suitable. This results in poor delivery and customer expectations not being met, which is just the sort of problem that ‘Waterfall’ projects have been accused of over many years.

I’m not aiming to attack Agile, nor defend Waterfall, but just want to raise the point that both have their merits and that both have a number of failures (high or low profile). The natural ground for Agile is smaller-scale developments, although that does not preclude its use for large-scale deliverables, provided the level of rigour is increased to allow for this. That is not to say that Waterfall is the answer to delivering large projects, but it does tend to bring the associated rigour (documentation, whether seen as overhead or not) needed for such systems.

Of course, Waterfall has earned its criticism – often on very large scale, large budget failures. Agile may be lucky in that its failures tend to be on smaller-scale, smaller-budget projects. A further benefit is that, if approached properly, even failed projects will (should) deliver something of value. If an Agile project fails to do that, it doesn’t deserve to be called Agile either.

Recall the Agile Manifesto:

We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value:

Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan

That is, while there is value in the items on the right, we value the items on the left more

Then read Alistair Cockburn’s book ‘Agile Software Development’ and don’t just look for the worked example on XP, but read the sentiments and meaning in there. He advocates greater controls and levels of artefacts for larger projects, based on not only the scale of the problem, but on the importance of the solution. For a life-critical system, there is a need for much more rigour than on a discretionary, nice-to-have system. All quite obvious really, but something that a number of Agile proponents seem to miss.

I intend to add a further blog article about the tension between Agile processes and enterprise level, or SOA, software, to expand on my views as to how these may or may not fit together.

In the meantime, I’d like to hope that Agile converts don’t fail to see the wood for the trees, and that not too many projects are led to failure through inappropriate choice and use of such software development processes.
