IT contracts and disputes have long concentrated on functional specifications, tests and defects to assure that projects deliver what is intended. This is only part of what is required. This article examines the role of design in contract and dispute.
Customers of failed IT projects frequently protest that they do not like the way the solution has been designed: “I do not like it!” is all one hears. What can a customer do when drafting a contract to insulate against the risk of this failure? What operational measures can be put in place during delivery? How can an expert clarify the issues for the court?
Functions
The recent focus on design thinking has shone a light on something that many have known for years: the elimination of defects alone does not make a product good. Many contracts are replete with functional requirements. I, for one, have put a lot of effort into writing them over the years. When done well, these provide testable conditions necessary to the success of the service, for example, requiring the ability to process payments from customers. A function is a behaviour of the system that can be defined, delivered separately from others, and tested. If it works as expected, the test is passed. If not, we call it a defect and send it back to be fixed.
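To make this concrete, a functional requirement such as “the system must process payments from customers” reduces to testable conditions along the lines sketched below. This is a minimal sketch only: the `process_payment` function and its behaviour are hypothetical, invented for illustration rather than drawn from any real system.

```python
# A minimal sketch of a functional test. The payment-processing
# function below is hypothetical, invented for illustration.
import unittest

def process_payment(balance: float, amount: float) -> float:
    """Deduct a customer payment from a balance (illustrative stub)."""
    if amount <= 0:
        raise ValueError("payment amount must be positive")
    if amount > balance:
        raise ValueError("insufficient funds")
    return balance - amount

class TestProcessPayment(unittest.TestCase):
    def test_valid_payment_reduces_balance(self):
        # The testable condition: a valid payment leaves the expected balance.
        self.assertEqual(process_payment(100.0, 30.0), 70.0)

    def test_overdrawn_payment_is_rejected(self):
        # If this check fails, we record a defect and send it back to be fixed.
        with self.assertRaises(ValueError):
            process_payment(10.0, 30.0)

if __name__ == "__main__":
    unittest.main()
```

Each condition either passes or yields a defect; the appeal to contract drafters is that the test is binary and observable.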
Many experts charged with investigating failed systems start by analysing the defects. The patterns we see in the data tell us much about what was working and what was not.
Non-Functional Requirements
Contracts will often also contain non-functional requirements. These refer to the performance of the system in areas such as security, availability and response time. You will hope to see the performance levels specified in a schedule as a service level agreement (SLA). Testing should cover these too.
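By way of illustration only, a response-time SLA might be checked along these lines. The endpoint, the two-second target and the 95th-percentile measure are all assumptions made for the sketch, not terms from any real schedule.

```python
# A sketch of a non-functional (response-time) check against an assumed SLA.
import statistics
import time
import urllib.request

SLA_P95_SECONDS = 2.0  # assumed service level: 95th percentile under two seconds

def measure_response_times(url: str, samples: int = 20) -> list[float]:
    """Time a series of requests to the given URL."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        urllib.request.urlopen(url).read()
        timings.append(time.perf_counter() - start)
    return timings

if __name__ == "__main__":
    times = measure_response_times("https://example.com/")  # hypothetical endpoint
    # statistics.quantiles with n=20 yields 19 cut points; the 19th is the p95.
    p95 = statistics.quantiles(times, n=20)[18]
    assert p95 <= SLA_P95_SECONDS, "SLA breached: record as a defect"
```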
Your expert will look at these test results. They are not commonly at the heart of a dispute but may be examined along the way. These too may result in defects in testing.
For those of us who love data and contractual certainty, functional and non-functional tests and specifications have obvious appeal.
Indignant Customers
Customers in IT disputes commonly protest, in colourful language, that the system is “terrible”. Yes, there were defects, but there is more to it. The system just works horribly. It is illogical. Users trip over even simple transactions. They hate using it and rebel. Others who have gone live find that even a minor change has so many knock-on effects that the system is close to unsupportable.
Many such complaints are expressed inarticulately, which only adds to the frustrated fury. There are rarely explicit breaches that a customer can rely on, other than a thin “good industry practice” clause. This is changing. Stronger contractual provisions will help to support customers in holding suppliers to account. The best suppliers strive to deliver delightful outcomes and will not resist.
Fixed Price
The process of design is iterative. Good engineers and designers are expensive. A programme manager will typically separate a large endeavour into multiple work-streams, and a unifying solution architect is appointed to coordinate them and assure end-to-end coherence. As one part is designed, its favoured approach rubs up against issues raised by choices made in another. Steve Jobs, late of Apple, explained this beautifully.1
When should the design team stop refining? There is no right answer. They will typically keep going until ordered to stop or objectives have been met. It is much the same for a poet.
“All I can do is turn a phrase until it catches the light.”
– Clive James, May Week Was in June
If your contract is fixed price and that price is low, the product is often base metal, not burnished gold. Even if you expect the product to shine, you will still want value, but elegance has no fixed price and does not arrive on schedule. For most in business, this is too much. “Good enough” will rule. We still need good design to reach that standard, even if we are not in search of the artistry of the Sistine Chapel.2
Aspects of Design
Most of us think we know good design when we see it. Apple has long competed on the proclaimed beauty and utility of its user experience, sustaining a premium over competitors. This justifies its high costs of development.
When thinking of design, most first think of screen design. That is the window into the system. Good design goes far deeper. Areas that should be encompassed include:
- User experience – the totality of a user’s interaction with an organisation, its services and products, including the user interface and aesthetics.
- Data design – what is asked for and when; how what is known is structured and used.
- Process – the flow and sequence of operations to realise the user’s and the organisation’s objectives.
- Architecture – the selection and arrangement of the system building-blocks to realise the system’s objectives.
- Software – the internal design of the code and its realisation in a form that is easy to maintain. Or not.
- Deployment – the approach to delivering the completed, tested system to the business and users.
Design Frameworks
There is a large number of frameworks that describe aspects of the approach. Some, like Design Thinking and Agile, are content-free processes. Others offer useful insight into human and technical behaviour and how to bring the two together. I am a particular fan of Concepts in software.3 I do not advocate a customer imposing its preferred framework on a supplier. All good frameworks have significant overlap, and each suits some situations better than others.
IT architects typically seek to optimise the lifecycle cost of ownership of a system: they can invest more up-front, adding to development costs, in the hope of reducing the later costs of maintenance and support. You may observe that this objective is not the same as maximising sales value, customer loyalty or many other worthy business goals. If you prefer another objective, you should communicate that before your team starts; your team must be given the objectives and focus you seek.
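A back-of-envelope comparison makes the trade-off concrete. All figures below are invented for illustration; real estimates belong in a business case.

```python
# A sketch of a lifecycle (total cost of ownership) comparison.
# All figures are invented for illustration.
def lifecycle_cost(build: float, annual_support: float, years: int = 10) -> float:
    """Total cost of ownership: build cost plus support over the system's life."""
    return build + annual_support * years

# Option A: a cheaper build that is costlier to maintain.
option_a = lifecycle_cost(build=1_000_000, annual_support=300_000)  # 4,000,000
# Option B: more invested up-front in design, for lower support costs.
option_b = lifecycle_cost(build=1_500_000, annual_support=200_000)  # 3,500,000

assert option_b < option_a  # B wins over the lifecycle despite the dearer build
```

On these invented numbers, the dearer build is the cheaper system over ten years, which is precisely the architect’s point.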
Design in Contract
Your contract will of course contain a “good industry practice” clause. If you have that, award yourself 1 out of 10 as a start.
In the dialogue that precedes appointment and contract, a customer has a golden opportunity to explore what is most important to them. Design may be on your list.
When working as a sourcing advisor, I assess the supplier’s capability. Do they have an approach? What framework do they say they are using, and how do you and they assure that it is being followed? Does it hold water? Do any of the people on this delivery team have personal experience, or does the team rely on an occasional call at 2am with colleagues in Outer Mongolia? Do the skills of the named team cover all the areas you need? How will staff movement be measured, managed and reported? How will they measure design quality and report to you? You will see design change: who manages this, and how are the costs assigned? If you capture the interaction (or edited highlights reflecting your interests), you will build contractual certainty by holding the supplier to the delivery method they have contracted for.
Building Design Integrity In
The quality of design is likely to be soft and largely intangible. You may turn some of the approach into activities that must be undertaken and link milestone achievement to their delivery. You may also consider whether some aspects are best related to the achievement of business outcomes, such as the level of customer adoption and interaction with the solution. A minority of payments may be related to the achievement of outcome targets, where you can measure them confidently.
A supplier’s risk register is a useful artefact to focus the review of the proposed approach. Have they identified the major design decisions they must make, and scheduled work to make them early? They can be very expensive to change later. Expect them to state some dependencies on the customer. Are you capable of keeping up with the supplier? If not, what are you to do about it? If a supplier has little concept of risk and of the size of each bet, walk away now.
Design in Operation
Design is all about making choices and trade-offs. These should start with principles such as Agile’s “Working software over comprehensive documentation”.4 A good set of principles supports your leadership of the endeavour and allows designers to frame choices in the context of a rationale that supports the selection of the best option from the several they have considered.
Governance is about review, scrutiny and holding people to account. Design review is a sub-set of programme governance. When done properly, it centres on the ongoing critique of the design in the context of the principles. The critic must understand what is being reviewed, which is often a challenge. The reviewer is there to scrutinise, not to take over the design. That is an invisible line, frequently over-stepped at cost to the programme. Design Thinking puts great emphasis on user testing, which is an important step in the process, but many of the big bets go nowhere near the user and are untouched by it. They arise early (data structure, for example) and can be very expensive to change later. Design review is your principal assurance.
Obtaining good-quality user input is vital and difficult. Many failed programmes see the uninformed loudly instructing technical experts how to design, despite having no appreciation of the impact of design options. Managing this requires a high level of skill on the part of the supplier’s analyst and robust support from a consistent and decisive governance body.
Design in Disputes
Way back in 1977, a software researcher, Edsger Dijkstra, contrasted the “correctness problem” (whether a program meets its specification) with the “pleasantness problem” (whether the specification is appropriate to the situation of use).5 He identified the first as being susceptible to mathematical formulation and analysis; the second was unfamiliar territory for IT folk. He observed that, for the result of a user’s interaction with a system to be reliable, both the conformance (functional) element of the IT and the human aspect of the user’s interaction must behave as expected. He gave both equal weighting.
Since those early days, the human aspect has been relegated. I confess that, on occasion, I have joined this trend: on hearing “But I don’t like it!”, smiling sweetly and moving on. Sometimes I pause to ask, “What is the requirement that this behaviour breaches?”
Fixing bugs is necessary but does not fix bad design. Neither does it make bad software good.
It Just Works
Later research and entrepreneurial investment looked at what users sign up to, such as the Zoom video-conferencing service during lockdown. Users do not typically study the manuals for web-based apps. Gow and others noted that users infer theories of a system’s behaviour by observing their interactions with it, and come to rely on those theories.6 If the system behaves consistently and sensibly, they stick with it. If not, they seek another that performs more reliably (if they have the choice). Gow examined user behaviour in context: users generalise from the particular behaviour they see, assuming the system is consistent throughout.
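One way an assessor might probe this is sketched below: check that the same behaviour holds in every context a user meets it, so that the theory formed in one place transfers to another. The date-formatting functions and the invoice/statement contexts are hypothetical, chosen only to illustrate the idea.

```python
# A sketch of a behavioural-consistency check. The functions and
# contexts below are hypothetical, invented for illustration.
import datetime

def format_date_on_invoice(d: datetime.date) -> str:
    return d.strftime("%d %b %Y")

def format_date_on_statement(d: datetime.date) -> str:
    # A different format here would break the user's inferred theory:
    # a design defect, even though each screen "works" in isolation.
    return d.strftime("%d %b %Y")

def test_dates_render_consistently():
    # A user who learns the date format on an invoice should be able
    # to rely on the same format on a statement.
    sample = datetime.date(2024, 3, 1)
    assert format_date_on_invoice(sample) == format_date_on_statement(sample)

if __name__ == "__main__":
    test_dates_render_consistently()
    print("consistent: the user's inferred theory holds")
```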
This provides a test for the quality of design that is independent of the framework used. It also points to tests beyond the user interface. It allows the assessor to apply measurable, objective criteria and avoid subjective judgement.
Tests of Design Quality
The mantra “I don’t like it!” remains unsatisfactory in itself: it is uninformed, opinionated and divorced from the good industry practice to which it makes no reference. It can, however, be used to identify specific instances which, on investigation, overcome these challenges.
The Design Products
- Is there a design?
- Is the design complete?
- Is the design informative?
- Is the design consistent with the principles and itself?
- Is the as-built product consistent with the design?
The Design Process
- Did the supplier identify the process they were to use?
- Did they adhere to the declared process, framework and principles?
- Was the process governed effectively and consistently?
- What did the contemporaneous design scrutiny reveal?
- What did contemporaneous user testing reveal?
- Were the contemporaneous design products amended appropriately in the light of comments?
In assessing the above, the investigation of design will be more insightful if it has a complete set of design, build and test documents to work with. Should the supplier fall at the first hurdle (no design), the fig leaf over its modesty is likely to be blown away.
Should your contract provide rich pickings concerning the approach to be used, the above assessment can rely on the standards set within the contract. If not (as is most common), the applicable standard is “good industry practice”, not perfection. My approach is to start with whatever I can find within the contract and to refer, as needed, to whatever widely deployed frameworks I can identify, using these as an objective standard against which to assess design quality. Experts must consider the range of industry practice where more than one exists. Practice will also change over time.
Impact and Damage
Impact and damage may vary greatly from case to case. Although the complaint may start with minor cosmetic issues, the quality of design can go to the root of whether a system can be relied upon. A user who expects one form of system behaviour and is misled into acting on another may set off a disastrous chain of events.
As in all disputes, it is important to establish at an early stage which of the possible issues are likely to form the basis of a successful claim, narrowing the issues appropriately. Investigation can be expensive and must be kept at a proportionate level.7
Conclusion
Functional and non-functional testing are still necessary, but in many cases they are not sufficient. The better the contract and the subsequent scrutiny of delivery, the better the probability of achieving a favourable outcome. The best of all outcomes is a product delivered successfully, on schedule, at the first attempt. If things do go wrong, there are objective standards against which to measure design quality and deliver persuasive evidence.
References
This article was first published by the Society for Computers and Law and is reproduced with permission.
1. Steve Jobs’ rock tumbler metaphor: https://www.youtube.com/watch?v=njYciFC7mR8 (3:31)
2. Michelangelo took from July 1508 to October 1512 to paint the ceiling.
3. Daniel Jackson, The Essence of Software, Princeton University Press, 2021 (or, less clearly, https://sebokwiki.org/wiki/System_Concept_Definition)
4. https://agilemanifesto.org/
5. Edsger W. Dijkstra, “A position paper on software reliability” (EWD 627), 1977, at https://www.cs.utexas.edu/users/EWD/transcriptions/EWD06xx/EWD627.html, referred to by D. Jackson in The Essence of Software.
6. Jeremy Gow, Harold Thimbleby and Paul Cairns, “Misleading behaviour in interactive systems”, Proceedings of the British Computer Society HCI Conference, Research Press International, 2004. https://harold.thimbleby.net/cv/files/hci04gow.pdf
7. https://oareborough.com/Insights/managing-a-legal-dispute-part-1/