Continuous Requirements Engineering part 1/2

Published 11.10.2017
Reading time approx 5 min

This is the first part of a two-part blog.

Test cases complement requirements

Test design as requirements elicitation

Test design makes assumptions that complement requirements elicitation (Lu-Tze)

Every test case adds something to the requirement, specified or assumed, from which it is derived. Yes, you’re right: I do mean that all tests are requirements-based, even those called – very misleadingly – “structure-based”, because even test cases that check single source-code statements verify conformance with “what should be”, i.e. with the requirements.

Let us ponder a simple example. There is a requirement stating that a given function accepts users aged between 20 and 70. A test analyst, using equivalence partitioning and boundary value analysis, designs the following test cases: 19, 20, 21, 50, 69, 70, 71, then adds some more tests (what would you call this technique?) with the age values -1, 0, “hallo, world” and 20,000,000,000.
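The example above can be sketched as executable checks. This is a minimal sketch: the validator name accepts_age and the expected outcomes are my assumptions for illustration, not taken from the post.

```python
# Hypothetical validator for the example requirement:
# "the function accepts users aged between 20 and 70".
def accepts_age(age) -> bool:
    # Reject non-integers outright, then check the required interval.
    return isinstance(age, int) and 20 <= age <= 70

# Boundary values from equivalence partitioning / boundary value analysis,
# with the expected verdict each test case implicitly adds to the requirement.
boundary_cases = {19: False, 20: True, 21: True, 50: True,
                  69: True, 70: True, 71: False}

# The extra values from the post (an error-guessing style addition).
error_guesses = [-1, 0, "hallo, world", 20_000_000_000]

for age, expected in boundary_cases.items():
    assert accepts_age(age) == expected

for age in error_guesses:
    assert accepts_age(age) is False
```

Note that each (value, expected verdict) pair is itself a small, elicited requirement: the original spec never said what should happen for 19 or for “hallo, world”.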

The tester actually elicits additional requirements, more detailed than the original one, defining correct, expected system behaviour for some special values. Test design techniques are therefore generic requirements elicitation methods. Equivalence partitioning, for example, states that wherever a requirement defines an interval, you can automatically add further requirements defining system behaviour on the boundaries of that interval and outside it.

Another example: a requirement states that all record field values can be edited and changed. A test analyst creates a number of test cases, attempting various combinations of field changes, using a number of different values. The test cases make the initial requirement more detailed, by eliciting – using common sense, business knowledge or test design techniques – detailed examples of the requirement.
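One way those combinations of field changes might be enumerated is a simple cartesian product over candidate values per field. This is a sketch under my own assumptions: the record shape, the candidate values and update_record are invented for illustration.

```python
from itertools import product

# A hypothetical record whose fields must all be editable.
record = {"name": "Ann", "age": 30, "city": "Oslo"}

def update_record(rec, changes):
    # Apply a set of field edits and return the updated record.
    updated = dict(rec)
    updated.update(changes)
    return updated

# A few candidate values per field; the cartesian product yields concrete
# examples that make "all field values can be edited" more detailed.
candidates = {"name": ["Bob", ""], "age": [0, 150], "city": ["Pune", "X"]}

test_cases = [dict(zip(candidates, values))
              for values in product(*candidates.values())]

for changes in test_cases:
    result = update_record(record, changes)
    # Each edited field must hold its new value after the update.
    assert all(result[f] == v for f, v in changes.items())

print(len(test_cases))  # 8 combinations from 2 x 2 x 2 candidate values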

In agile scrum there is a practice which is a very obvious and conspicuous example of test design being the continuation of requirements elicitation under another name: specification by example.


A requirement, specified as a user story, is described in more detail using examples (acceptance scenarios, acceptance criteria), which are added to it and later used as acceptance test cases. Nice, wise and really good. You do not need to use agile scrum to adopt this method: it suits sequential development equally well. And you save much money by avoiding expensive and time-consuming requirements tracing tools, since requirements and test cases are together from the start, stored in the same document or tool.
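A minimal sketch of how this might look in code: the examples live right next to the user story and double as acceptance tests. The story text, the discount rule and the function name are entirely my assumptions, invented for illustration.

```python
# Hypothetical rule under test: 10% off orders of 100 or more.
def discount(order_total: float) -> float:
    return round(order_total * 0.10, 2) if order_total >= 100 else 0.0

user_story = "As a customer I get 10% off orders of 100 or more"

# Acceptance examples attached to the story, later run verbatim as
# acceptance test cases; no separate requirements tracing tool needed.
examples = [
    (99.99, 0.0),    # just below the threshold: no discount
    (100.00, 10.0),  # boundary: discount applies
    (250.00, 25.0),  # well above the threshold
]

for order_total, expected in examples:
    assert discount(order_total) == expected
```

The point is structural: because the examples and the story are one artifact, the “requirement” and its “tests” can never drift apart.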


All this is not only an academic or intellectual curiosity: it is of prime practical importance. The separation – traditional and still prevalent today – of requirements elicitation and test design procedures makes no sense, because it artificially separates two similar activities which, for all practical purposes, belong together. If they were closely connected in projects, and performed in co-operation, system development would be better: more effective and more efficient.

Test design as requirements modelling

I first learned how to model system behaviour using state diagrams not for requirements engineering, but for testing purposes. I needed a framework to help me understand complex system behaviour better than a chaotic, wordy requirements specification written in natural language could. Besides, having a model was handy for designing test cases, for keeping track of my test coverage, and even for generating fresh test ideas: a look at my state graph, or a glimpse of the empty cells in my state transition matrix, often put me into a very exploratory, creative state of mind.
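The dual use of such a model can be sketched in a few lines: the same transition table drives the tests, measures coverage, and exposes the empty matrix cells. The states and events below (a tiny login flow) are my own invented example, not from the post.

```python
# A small, assumed state model: (state, event) -> next state.
transitions = {
    ("logged_out", "login_ok"): "logged_in",
    ("logged_out", "login_fail"): "locked_out_check",
    ("logged_in", "logout"): "logged_out",
}

states = {"logged_out", "logged_in", "locked_out_check"}
events = {"login_ok", "login_fail", "logout"}

# Empty cells in the state/event matrix: behaviour the spec never defined.
# These are exactly the "fresh test ideas" a glance at the matrix suggests.
undefined = [(s, e) for s in states for e in events
             if (s, e) not in transitions]

covered = set()

def fire(state, event):
    # Execute one transition and record it for coverage tracking.
    covered.add((state, event))
    return transitions[(state, event)]

# One test path through the model.
state = "logged_out"
for event in ["login_ok", "logout"]:
    state = fire(state, event)

coverage = len(covered) / len(transitions)
print(f"transition coverage: {coverage:.0%}, undefined cells: {len(undefined)}")
```

Here two of three defined transitions are exercised, and six (state, event) cells remain undefined; each undefined cell is a question to take back to the requirements authors.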


However, making this model was not easy: I spent a lot of time developing it, and more time making sure it was really right. And, uh, I did find some ambiguities in the initial natural-language description. Making the guys who had written it talk to me was not easy, either. You know, they were VIPs: business analysts, rubbing shoulders with the CEO and the CIO, while I was just a humble tester. When I discovered that their description was not only ambiguous, but downright wrong here and there, my time investment grew greater still.


Whose job was I doing, then? A tester who goes into debugging is rightly said to spend her or his time doing the developer’s job. A tester who analyses requirements documents, builds models from them, and then improves the initial requirements spends time doing the requirements engineer’s job.


I do not mean to say that testers are diligent and good while requirements engineers are lazy and bad, because this is definitely not the case, nor is it the issue. The issue is that a lot of requirements analysis, modelling and verification work is performed for test design purposes, so – as in the previous section – my conclusion is that the separation of these two activities is very wrong and ineffective. Doing the same work twice, separately, by different people, and often at different times, is wasteful. We should change this and start working together, requirements engineers and testers.

Default requirements

Testing adds a number of universal, generic default requirements to other requirements. We are often not conscious of them. They need not be written down, because there are too many of them, and they are too obvious to justify wasting ink and paper on. For example, imagine a requirement defining that the system must, in certain situations, display a rose triangle in the upper left-hand corner of the screen. Why would you test it many times, with different data values, instead of just once or a few times? Because what is actually being tested is the implicit default requirement: “and it must work for all such situations, and the system is never allowed to crash”.
Another such implicit, generic requirement, which complements explicit requirements during test execution, is “and nothing else should happen, unless it is really trivial”. If, besides the required rose triangle, a little yellow dot sometimes appears as well, you may choose to ignore it; but if, instead of a little, harmless dot, you get (mind, the rose triangle is there as well, as it should be!) a two-minute-long film presentation, you may choose to report an incident.
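Both implicit requirements can be sketched as checks around the feature under test. Here show_triangle is a stand-in I invented for the rose-triangle feature, and the probe values are arbitrary; the point is that one explicit requirement is exercised with many inputs purely to test the unwritten ones.

```python
# Hypothetical feature under test: it should handle any input without
# raising, and return a description of what was drawn.
def show_triangle(value):
    return {"shape": "rose triangle", "corner": "upper left", "input": value}

# Many different data values for the single explicit requirement; what
# they really test is "it must work for all such situations, never crash".
probe_values = [0, -1, 10**12, "", "hallo", None, 3.14, [1, 2]]

for value in probe_values:
    result = show_triangle(value)              # must not raise
    assert result["shape"] == "rose triangle"  # the explicit requirement
    assert result["corner"] == "upper left"
    # Implicit requirement: "and nothing else should happen" - the result
    # contains only the expected fields, no surprise film presentations.
    assert set(result) == {"shape", "corner", "input"}
```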


The practical importance of this is again significant. Pretending that all – really all – requirements can and should be written down is futile, and wasteful. Knowing that there are many generic, commonly accepted, implicit requirements, used for testing but never written down, helps you handle them more effectively.


End of part 1/2


Book your place on one of Bogdan’s courses!

About author:

Bogdan Bereza

Tags

IT industry, Test automation