Requirements Elicitation and Analyst Improvement
This section of the paper focuses on tools and techniques designed to improve the analyst as a means of improving requirements elicitation. As stated at the beginning of the paper, many analysts lack the experience or knowledge required to use more advanced analysis techniques for requirements elicitation. Huang noted that in spite of all the powerful tools at an analyst's disposal for requirements elicitation, "the cognitive abilities of information analysts are still the most important determinant for the correctness of requirements specifications" (Huang, 2005, p. 19). The techniques identified in this section are hardware/software independent and can be taught relatively easily. As a result, analysts can increase their abilities and gain valuable knowledge more quickly, and will be better able to perform requirements analysis.
Pitts and Browne focus on adjusting the standard types of questions used in the interview process to improve requirements elicitation, compensating for analysts who do not necessarily have the more advanced cognitive analytical skills needed for good requirements elicitation. Refer to Figure 8 for a list of identified cognitive challenges and how they affect requirements elicitation. The strategies behind the prompts used in the interviews are procedural in nature, rather than interrogative. Pitts and Browne recommend a list of questions grounded in cognitive theory and argument strategies instead of traditional interrogative questions built around who, what, where, when, why, and how. They cited research showing that "context-independent prompts based on argument strategies elicit information from people that they might not otherwise evoke" (Pitts and Browne, 2007, p. 96). Table 2 from the article provides a brief summary of the kinds of argument strategies and the prompts that would be used to promote them. Pitts and Browne's results showed that participants in their study who used the new procedural prompts "obtained significantly greater quantity, breadth and depth of requirements following the introduction of prompts" (Pitts and Browne, 2007, p. 103).
Analysts selected to use procedural prompts in their requirements analysis received tutoring on how to use them, and were given full leeway on how many questions to actually use with participants. While the article does not say how long this instruction took, it is unlikely a significant amount of time was spent on it. Considering the substantial improvement achieved simply by using a different kind of question, procedural prompting should definitely be considered as a potential tool to provide to analysts, not only to improve requirements elicitation, but also to improve the analyst. One caveat should be noted: the study the authors performed to test the effectiveness of procedural prompts involved a second stage of addressing client requirements after a first stage was already completed. The group that used interrogative techniques also identified more requirements than in the first round; the procedural group simply identified more still. This implies that spending more time on requirements elicitation, and revisiting it in more than one stage, may lead to improved requirements elicitation regardless of technique.
Qurban and Austria focus on improving the communication skills of IS developers, specifically through experiential learning. Simply put, the purpose of experiential learning is to take past experience, derive knowledge from it, and apply that knowledge in future scenarios. The video below provides a practical example of experiential learning.
This video provides an everyday example of how experiential learning can improve a professional's skills in their field, in this case a special education teacher's. Qurban and Austria argue that this kind of learning can be applied to analysts to improve their communication with clients. For a brief overview of the theory of experiential learning and its four stages of progression, check out the article here. The article identifies and explains the four main processes of experiential learning: Concrete Experience, Reflective Observation, Abstract Conceptualization, and Active Experimentation. Qurban and Austria applied these four steps to a group of analysts presenting their perceived client requirements to the clients themselves, following the techniques provided by experiential learning. Developers first gained concrete experience by presenting a software prototype they had created to users while being recorded; they then watched their presentation as part of reflective observation; next, they watched an educational video to instruct them (abstract conceptualization); and finally, the analysts presented their prototype again to assess their communication skills (active experimentation). The results of the study showed that during the second presentation, analysts "were more inclined at post intervention to involve the users during systems design" (Qurban and Austria, 2009, p. 307).
Communication between analysts and clients is critical to successful requirements elicitation. Experiential learning moved the analysts toward a better mindset, encouraging them to involve the actual client more and thereby promoting better requirements analysis. It should be noted that this learning technique was applied within the Agile framework, which involves prototyping as well as iterating back and forth between requirements analysis and design. As such, this improved communication technique may only be useful in frameworks that support rapid analysis and development. Even so, the fact that it encouraged better client communication should not be ignored.
As you may have noticed, the section devoted to techniques for directly improving the analyst is noticeably shorter than the previous section on diagramming techniques. This is due primarily to the fact that the research I have uncovered has focused almost exclusively on diagrams as a means of improving requirements elicitation, as opposed to techniques that focus on the analyst. Whether this is because researchers believe a truly good model can make up for any deficiencies in an analyst, or because it is easier to create a new diagram than to create ways to improve analysts' skills, is difficult to determine. However, given the recurring emphasis in many studies on the difficulties novice analysts encounter during requirements elicitation, focusing on models is not enough. While a novice will most assuredly improve with experience, we can see that there are relatively simple ways to mitigate the challenges novice analysts must overcome while eliciting requirements from clients. Researchers should not shy away from the difficulties of creating these new techniques, as the returns may be far greater and more useful than those of simply adopting a new diagramming technique.
In this paper we have addressed some of the common causes of, as well as theories behind, the failure of requirements elicitation. We then reviewed several suggested diagramming techniques designed to improve the requirements elicitation process. Finally, we suggested several potential techniques to improve the analyst's skills, thereby indirectly improving requirements elicitation, and addressed potential reasons why this avenue of research has not yet been pursued further. The research addressed in this paper has shown that small changes and additions to the requirements elicitation process can improve its outcomes. It is abundantly clear that there is considerable room for improvement in the requirements elicitation process of systems analysis. With billions of dollars lost every year, and the potential for financial ruin, systems analysts cannot afford to ignore any potential means of improvement.