MSc: Requirements Engineering

Requirements Engineering

  • Course name: Requirements Engineering
  • Course number: SE-

Course characteristics

Key concepts of the class

  • Requirements elicitation
  • Requirements specification
  • Requirements prototyping and implementation
  • Requirements verification
  • Requirements traceability

What is the purpose of this course?

The course has the following key objectives:

  • To introduce the motivation, conceptual background and terminology on which requirements engineering relies.
  • To provide a comprehensive account of state-of-the-art techniques for requirements engineering.
  • To let students experience first-hand the requirements-related problems faced by real software teams.

Prerequisites

The course has been designed to be as self-contained as possible. It can nevertheless be useful to have a general understanding of the following:

  • Basics of software development
  • Basics of software testing
  • Basics of software design and the Unified Modelling Language (UML)
  • Basics of the software development process
  • Basics of software engineering

Recommendations for students on how to succeed in the course

I suggest here some simple video material that can help provide a smooth introduction to the course:

It can also be an advantage to read the introductory chapters of the main textbook: Axel van Lamsweerde, Requirements Engineering, Wiley Publishing (2009).


Course Objectives Based on Bloom’s Taxonomy

The “Requirements Engineering” course develops students’ skills at all six levels of Bloom’s taxonomy.

- What should a student remember at the end of the course?

By the end of the course, the students should be able to recognize and define:

  • System requirements
  • Software requirements
  • Domain knowledge
  • Environment assumptions
  • Environment-controlled phenomena
  • Machine-controlled phenomena
  • Environment-observed phenomena
  • Machine-observed phenomena
  • Problem space
  • Solution space
  • Prescriptive statements
  • Descriptive statements
  • Traceability links

- What should a student be able to understand at the end of the course?

By the end of the course, the students should be able to describe and explain (with examples):

  • Difference between system and software requirements
  • Difference between domain knowledge and environment assumptions
  • Pairwise differences between environment-controlled, machine-controlled, environment-observed and machine-observed phenomena
  • Difference between the world and the machine
  • Difference between problem and solution space
  • Difference between prescriptive and descriptive statements
  • Difference between vertical and horizontal traceability

- What should a student be able to apply at the end of the course?

By the end of the course, the students should be able to apply:

  • Requirements elicitation techniques
  • Requirements specification techniques
  • Prototyping and implementation techniques
  • Negotiation techniques for modifying requirements
  • Techniques for establishing traceability links, both vertical and horizontal
  • Parameterized unit testing (a minimal sketch follows this list)
  • Acceptance testing

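As a concrete illustration of the last two items in the list above, here is a minimal, hypothetical sketch of a parameterized unit test in Python with pytest. The function under test, the requirement ID it refers to, and the test data are invented for the example and are not part of the course material.

  import pytest

  # Hypothetical function under test, assumed to implement an invented
  # requirement REQ-12: "apply a percentage discount to a price".
  def discounted_price(price: float, discount_percent: float) -> float:
      if not 0 <= discount_percent <= 100:
          raise ValueError("discount must be between 0 and 100")
      return round(price * (1 - discount_percent / 100), 2)

  # Parameterized unit test: a single test body exercised with several
  # input/expected pairs, including boundary values.
  @pytest.mark.parametrize(
      "price, discount, expected",
      [
          (100.00, 0, 100.00),   # no discount
          (100.00, 25, 75.00),   # typical case
          (19.99, 100, 0.00),    # boundary: full discount
      ],
  )
  def test_discounted_price(price, discount, expected):
      assert discounted_price(price, discount) == expected

An acceptance test differs mainly in scope: it exercises the system through its external interface against a stakeholder-visible acceptance criterion rather than a single function.
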
- What inference can a student make based on the acquired knowledge?

By the end of the course, the students should be able to identify:

  • Lack of traceability links
  • Incorrectly implemented requirements
  • Incorrectly elicited requirements
  • Incompletely implemented requirements
  • Incompletely elicited requirements

- What judgements can a student make about the studied field?

By the end of the course, the students should be able to judge:

  • Completeness of a requirements document specified by others
  • Correctness of a requirements document specified by others
  • Completeness of an implementation developed by others with respect to the requirements
  • Correctness of an implementation developed by others with respect to the requirements
  • Traceability for software artifacts created by others
  • Presentations of other students

- What actions can students take based on their judgement?

By the end of the course, the students should be able to take appropriate actions for:

  • Eliciting missing requirements from stakeholders
  • Negotiating requirements modifications with stakeholders
  • Implementing missing functionality, with respect to the requirements, in software developed by others
  • Fixing functionality that incorrectly implements requirements in software developed by others
  • Introducing missing traceability links (a tagging convention is sketched after this list)
  • Writing additional tests to achieve sufficient requirements coverage

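To illustrate the traceability-related items above, one simple convention (an assumption for this sketch, not a prescribed course standard) is to tag each test with the identifier of the requirement it verifies, so that a requirements-to-tests traceability matrix can be generated from the test suite. The requirement IDs and the toy login logic below are invented for the example.

  import pytest

  # Toy system under test, invented for the example.
  _PASSWORDS = {"alice": "correct-password"}
  _FAILED_ATTEMPTS = {}

  def login(user: str, password: str) -> bool:
      if _PASSWORDS.get(user) != password:
          _FAILED_ATTEMPTS[user] = _FAILED_ATTEMPTS.get(user, 0) + 1
          return False
      return True

  def is_locked(user: str) -> bool:
      return _FAILED_ATTEMPTS.get(user, 0) >= 3

  # Each test declares the requirement(s) it covers via a custom "req"
  # marker with hypothetical IDs; a small script (or even grep) can then
  # extract these tags to build the traceability matrix.
  @pytest.mark.req("REQ-07")
  def test_wrong_password_is_rejected():
      assert login("alice", "wrong-password") is False

  @pytest.mark.req("REQ-07", "REQ-09")
  def test_account_locks_after_three_failed_attempts():
      for _ in range(3):
          login("alice", "wrong-password")
      assert is_locked("alice")

In a real project the custom marker would also be registered in the pytest configuration so that unknown-marker warnings are not raised.
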
Course evaluation

Evaluation

Course grade breakdown
Component                  Points (out of 100)
Practical assignments      60
Reading assignments        18
Project presentations      12
Classroom participation    10

Additional features of the course in terms of students’ performance assessment: none.

Grades range

Course grading range
Grade            Default range    Proposed range
A. Excellent     90-100           80-100
B. Good          75-89            65-79
C. Satisfactory  60-74            50-64
D. Poor          0-59             0-49


Additional grading features: the semester starts with the default range shown in the table above, but the thresholds may change slightly (usually being reduced) depending on how the semester progresses.

Resources and reference material

  • Handouts supplied by the instructor

Course Sections

The main sections of the course and the approximate distribution of teaching hours between them are as follows:

Course Sections
Section Section Title Teaching Hours
1 Requirements elicitation and documentation 20
2 Requirements prototyping and implementation 20
3 Requirements verification and traceability 20

Section 1

Section title:

Requirements elicitation and documentation

Topics covered in this section:

  • Foundations of requirements engineering
  • The world and the machine
  • Domain understanding and requirements elicitation
  • Questions for interviews
  • The requirements process
  • Use cases (a minimal template is sketched after this list)
  • Requirements specification and documentation

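For orientation, a use case in this course can be documented with a short template along the following lines; the field names follow common practice (e.g. Cockburn-style templates), and the example content is entirely hypothetical.

  Use case:       UC-03 Borrow a book
  Primary actor:  Library member
  Precondition:   The member is registered and has no overdue loans.
  Main success scenario:
    1. The member scans the book.
    2. The system checks the book's availability and the member's account.
    3. The system registers the loan and sets the due date.
  Extension:
    2a. The book is reserved by another member: the system refuses the loan.
  Postcondition:  The loan is recorded and traceable to the member's account.
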
What forms of evaluation were used to test students’ performance in this section?

Form of evaluation                                          Yes/No
Development of individual parts of software product code    No
Homework and group projects                                 Yes
Midterm evaluation                                           No
Testing (written or computer based)                         Yes
Reports                                                     Yes
Essays                                                       No
Oral polls                                                   No
Discussions                                                 Yes


Typical questions for ongoing performance evaluation within this section

  1. What is the WHY-Dimension of requirements engineering?
  2. What criteria are recommended for stakeholder analysis?
  3. Who is a stakeholder?
  4. What is an artifact-driven elicitation technique?
  5. What are the four principles for description in requirements engineering?
  6. What are the four facets of relationship between the world and the machine?
  7. What are the four kinds of denial in software engineering?
  8. What is a descriptive statement?
  9. What are the different kinds of information about the world?

Typical questions for seminar classes (labs) within this section

  1. Write down and present a proposal for a project to be implemented during the course.
  2. Propose a set of questions for a requirements elicitation interview.
  3. Conduct, audio record and transcribe an elicitation interview.
  4. Design use cases based on the elicitation transcript and audio recording.

Test questions for final assessment in this section

  1. Present your experience of preparing and conducting the elicitation interview.
  2. How did you choose the stakeholder for interviewing?
  3. Did the interview go according to the plan?
  4. Which of the initially prepared questions did you not ask during the interview? Why?
  5. Which questions did you have to ask in addition to the initially prepared ones? Why?
  6. If you were the interviewee, how relevant were the interviewer’s questions?
  7. What conflicts did you have when merging the interview transcripts of your team members?
  8. How did you solve the merging conflicts?
  9. What lessons have you learned based on your experience as an interviewer and an interviewee?
  10. Present use cases constructed based on the elicited information.
  11. How do the use cases trace to the interview transcript?
  12. How does the interview transcript trace to the use cases?

Section 2

Section title:

Requirements prototyping and implementation

Topics covered in this section:

  • Mapping use cases to object models (see the sketch after this list)
  • From use cases to user interface design
  • Activity diagrams
  • The psychopathology of everyday things
  • Seamless requirements
  • The anatomy of requirements

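As a minimal sketch of the first topic, the hypothetical "Borrow a book" use case from Section 1 could be mapped onto an object model roughly as follows: nouns in the use case become classes, use case steps become methods, and the postcondition becomes observable object state. The class and method names are invented for the example.

  from dataclasses import dataclass, field
  from datetime import date, timedelta

  @dataclass
  class Book:
      title: str
      reserved: bool = False

  @dataclass
  class Loan:
      book: Book
      due: date

  @dataclass
  class Member:
      name: str
      loans: list = field(default_factory=list)

      def borrow(self, book: Book, days: int = 14) -> Loan:
          # Use case step 2 / extension 2a: refuse a reserved book.
          if book.reserved:
              raise ValueError("book is reserved by another member")
          # Use case step 3: register the loan and set the due date.
          loan = Loan(book=book, due=date.today() + timedelta(days=days))
          self.loans.append(loan)
          return loan

The corresponding activity diagram and user interface can then be derived from the same steps, which keeps the requirements, the model and the implementation traceable to one another.
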
What forms of evaluation were used to test students’ performance in this section?

Form of evaluation                                          Yes/No
Development of individual parts of software product code    Yes
Homework and group projects                                 Yes
Midterm evaluation                                           No
Testing (written or computer based)                         Yes
Reports                                                     Yes
Essays                                                       No
Oral polls                                                  Yes
Discussions                                                 Yes


Typical questions for ongoing performance evaluation within this section

  1. What value do UML diagrams bring to the requirements engineering process?
  2. Define the “extend” relationship between use cases.
  3. Enumerate risk reduction tactics.
  4. Describe defining characteristics of what Davis calls “knowledge structure”.
  5. What activities does the risk management process involve?
  6. What do you call an active component in a use case diagram?
  7. What is the purpose of postconditions in use cases?
  8. Name different types of relationships in use case modelling.
  9. Categorize UML as either informal, semi-formal or formal notation.
  10. Define the “generalization” relationship in UML.

Typical questions for seminar classes (labs) within this section

  1. Reflect, individually and in teams, on the use cases and the use case diagram that you received for implementation from another team.
  2. Construct activity diagrams from the use cases and the use case diagram.
  3. Construct classes based on the activity diagrams.
  4. Design user interfaces based on the classes and activity diagrams.
  5. Develop a minimum viable product (MVP) implementing your input requirements.

Test questions for final assessment in this section

  1. Record and present a short demo of your MVP.
  2. What decisions did you have to take when implementing the MVP?
  3. How did you define your MVP?
  4. What did you have to change in the requirements document, and why?
  5. Which requirements did you decide to cover, and which not to cover, in the MVP, and why?
  6. Present lessons learned from developing the MVP.

Section 3

Section title:

Requirements verification and traceability

Topics covered in this section:

  • Parameterized unit tests
  • Goal modelling (the standard goal patterns are summarized after this list)
  • Scrum & User stories
  • Use case testing

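As background for the goal modelling topic, the standard goal patterns from van Lamsweerde's goal-oriented framework can be summarized, roughly, in temporal logic as follows, where C stands for the current condition and T for the target condition. This is a reminder sketch rather than the course's formal definition.

  \begin{align*}
    \textit{Achieve:}  \quad & C \Rightarrow \Diamond T       && \text{(the target condition eventually holds)} \\
    \textit{Cease:}    \quad & C \Rightarrow \Diamond \lnot T && \text{(the condition eventually stops holding)} \\
    \textit{Maintain:} \quad & C \Rightarrow \Box T           && \text{(the condition holds in every future state)} \\
    \textit{Avoid:}    \quad & C \Rightarrow \Box \lnot T     && \text{(the condition never holds in any future state)}
  \end{align*}

The Maintain pattern also has a weaker "unless" variant in the textbook; the form shown above is the simplest one.
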
What forms of evaluation were used to test students’ performance in this section?

Form of evaluation                                          Yes/No
Development of individual parts of software product code    Yes
Homework and group projects                                 Yes
Midterm evaluation                                           No
Testing (written or computer based)                         Yes
Reports                                                     Yes
Essays                                                      Yes
Oral polls                                                  Yes
Discussions                                                 Yes


Typical questions for ongoing performance evaluation within this section

  1. What do we call an active system component that plays a specific role in goal satisfaction?
  2. What do we call an autonomous and passive object in the object model, which cannot control the behaviour of instances of other objects?
  3. What is a goal?
  4. What goal pattern refers to every future state?
  5. What do we call an association in which the composite object and its components appear and disappear together in the system?
  6. Describe the goal refinement process.
  7. What is object specialization?
  8. Define the “maintain” goal pattern.
  9. Enumerate and describe different relationships between goals.

Typical questions for seminar classes (labs) within this section

  1. Construct parameterized unit tests for the MVP provided to you by another team.
  2. Reflect, individually and in teams, on the MVP provided to you by another team.
  3. Reflect, individually and in teams, on the user interface design of the MVP.
  4. Develop the MVP into usable, production-quality software.
  5. Construct use case tests and parameterized unit tests for the final implementation; update the requirements document as you go.
  6. Run the tests and fix the identified defects; update the requirements document as you go.
  7. Ensure pairwise mutual completeness between the requirements, final implementation and tests.
  8. Ensure pairwise mutual traceability between the requirements, final implementation and tests.
  9. Write a document that clearly describes how to run and use the final implementation.
  10. Describe in the document how to reproduce different use cases in the actual software.

Test questions for final assessment in this section

  1. Present the final system developed from the MVP received from another team.
  2. Introduce the project and its business goals.
  3. Evaluate the quality of the interview transcript.
  4. Evaluate the quality of the use cases.
  5. Evaluate the quality of the MVP and user interfaces.
  6. Reflect on the quality management process.
  7. Record and present demo of test runs.
  8. Reflect on teamwork and communication with other teams.
  9. Present lessons learned while implementing different parts of different projects coming from the other teams.
  10. Record and present demo runs of the software using the use cases as the reference.
  11. Describe strengths and weaknesses of the final implementation.
  12. Write an essay detailing your reflections on the overall course experience.