BSc: Software Quality, Reliability and Security
Software Quality, Reliability and Security
- Course name: Software Quality, Reliability and Security
- Code discipline:
- Subject area: Computer Science Fundamentals
Short Description
Building high-quality software is of utmost importance. However, it is easier said than done. The course is an overview of software quality and software reliability engineering methods. It includes an introduction to software quality, an overview of static analysis methods, testing techniques, and reliability engineering. The students will put the methods into practice during laboratory classes and dig deeper into the topics in a small, realistic project. The course promotes quality analysis automation, which is a “must-know” in modern software development practice. Therefore, the theoretical material taught in lectures is accompanied by continuous integration practice with the quality analysis methods demonstrated in labs. The course balances traditional lectures, laboratory classes, and a small course project in which students apply the concepts and methods they learn to real artifacts. The course project consists of a quality analysis of one or more open source projects. The final evaluation takes place at the Quality In Use hackathon, which brings challenges from industry on the evaluation of open source software components.
Prerequisites
Prerequisite subjects
Prerequisite topics
- Software development
- Software project
- Testing basics
- Continuous integration basics
Course Topics
Section | Topics within the section |
---|---|
Defining quality | |
Verification and testing | |
Reliability and security | |
Intended Learning Outcomes (ILOs)
What is the main purpose of this course?
The main goal of this course is to give students a comprehensive review of quality analysis methods: from defining quality for a given project, to selecting the appropriate analysis methods and setting the adequacy criteria.
ILOs defined at three levels
Level 1: What concepts should a student know/remember/explain?
By the end of the course, the students should be able to ...
- Explain several views on software quality.
- Describe trade-offs among quality attributes in quality models.
- Elaborate on the major differences between verification techniques.
- State adequacy criteria for verification.
- Understand the ways to calculate and enforce the necessary reliability.
- List software security verification techniques.
Level 2: What basic practical skills should a student be able to perform?
By the end of the course, the students should be able to ...
- Define a quality model of a software project in a given context.
- Select appropriate verification techniques for various quality properties such as security and justify their adequacy.
- Define a necessary reliability for a software project in a given context.
- Justify quality-related decisions to different stakeholders based on the cost of quality concepts.
Level 3: What complex comprehensive skills should a student be able to apply in real-life scenarios?
By the end of the course, the students should be able to ...
- Define a quality model for a given software development project.
- Set up a basic continuous integration pipeline that automates quality analysis.
Grading
Course grading range
Grade | Range | Description of performance |
---|---|---|
A. Excellent | 90-100 | - |
B. Good | 75-89 | - |
C. Satisfactory | 60-74 | - |
D. Fail | 0-59 | - |
Course activities and grading breakdown
Activity Type | Percentage of the overall course grade |
---|---|
Lab assignments | 30 |
Quizzes | 20 |
Course project | 30 |
Hackathon | 20 |
Recommendations for students on how to succeed in the course
- Take a basic tutorial on Continuous Integration.
- Review lectures on metrics.
- Review testing methods.
Resources, literature and reference materials
Open access resources
Closed access resources
- Tripathy, Priyadarshi, and Kshirasagar Naik. Software testing and quality assurance: theory and practice. John Wiley & Sons, 2011.
- Whittaker, James A., Jason Arbon, and Jeff Carollo. How Google tests software. Addison-Wesley, 2012.
- Whittaker, James A. Exploratory software testing: tips, tricks, tours, and techniques to guide test design. Pearson Education, 2009.
- More literature will be provided in the course package.
Software and tools used within the course
- GitLab
- SONAR
- SNYK
- Yandex Tank
- JMETER
- Lighthouse
- Selenium
- Cypress.io
Teaching Methodology: Methods, techniques, & activities
Activities and Teaching Methods
In the tables below, 1 indicates that a technique or activity is used in the given section and 0 indicates that it is not.
Teaching Techniques | Section 1 | Section 2 | Section 3 |
---|---|---|---|
Problem-based learning (students learn by solving open-ended problems without a strictly defined solution) | 1 | 1 | 1 |
Project-based learning (students work on a project) | 1 | 1 | 1 |
Differentiated learning (tasks and activities are offered at several levels of difficulty to fit students' needs and levels) | 1 | 1 | 1 |
Contextual learning (activities and tasks are connected to the real world to make it easier for students to relate to them) | 1 | 1 | 1 |
Developmental learning (tasks and material stretch abilities the students have not yet developed) | 1 | 1 | 1 |
Concentrated learning (classes on one large topic are logically grouped together) | 1 | 1 | 1 |
Inquiry-based learning | 1 | 1 | 1 |
Task-based learning | 1 | 1 | 1 |
Learning Activities | Section 1 | Section 2 | Section 3 |
---|---|---|---|
Lectures | 1 | 1 | 1 |
Lab exercises | 1 | 1 | 1 |
Experiments | 1 | 1 | 1 |
Case studies | 1 | 1 | 1 |
Development of individual parts of software product code | 1 | 1 | 1 |
Group projects | 1 | 1 | 1 |
Quizzes (written or computer based) | 1 | 1 | 1 |
Discussions | 1 | 1 | 1 |
Presentations by students | 1 | 1 | 1 |
Written reports | 1 | 1 | 1 |
Formative Assessment and Course Activities
Ongoing performance assessment
Section 1
Activity Type | Content | Is Graded? |
---|---|---|
Quiz | 1. What is the difference between must-have attributes and delighters in Kano’s concept? 2. Describe in your own words, with regard to ISO 25010, the following quality attributes: security, reliability, and maintainability. 3. Explain the difference between product quality, quality in use, and process quality. Provide 2-3 quality attributes of each category, briefly describing them. | 1 |
Individual Assignment | In this assignment, students should complete the laboratory assignment based on the instructions and material provided in class; for example, deploying the SONAR metrics system for a given open source project and calculating the technical debt metrics (the sketch after this table illustrates the kind of metric involved). | 1 |
Group Assignments | In this assignment, each group should build the quality model for their prototyping project and set up a continuous integration environment. | 1 |
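The sketch below is a minimal, illustrative Python example of the kind of static metric a tool such as SONAR reports: an approximate cyclomatic complexity computed from a snippet’s syntax tree. It is a classroom-style approximation, not how SonarQube itself computes the metric, and the `grade` function inside it is invented for the demonstration.

```python
import ast

# Decision-point node types counted by a common cyclomatic-complexity
# approximation: each occurrence adds one extra independent path.
_DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.IfExp)

def cyclomatic_complexity(source: str) -> int:
    """Approximate cyclomatic complexity of a Python snippet: 1 + decision points."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, _DECISION_NODES) for node in ast.walk(tree))

if __name__ == "__main__":
    snippet = """
def grade(score):
    if score >= 90:
        return "A"
    elif score >= 75:
        return "B"
    elif score >= 60:
        return "C"
    return "D"
"""
    # One straight-line path plus three if/elif branches -> complexity 4.
    print(cyclomatic_complexity(snippet))
```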
Section 2
Activity Type | Content | Is Graded? |
---|---|---|
Quiz | 1. What is the oracle problem? 2. What is an infeasible path? 3. Develop BVA test cases for an application that implements the logic defined in the exercise. 4. What is fuzz testing? How is it different from random testing? | 1 |
Individual Assignment | In this assignment, students should complete the laboratory assignment based on the instructions and material provided in class; for example, writing a set of tests for full basis path coverage of a given code snippet, or deploying tools in the continuous integration pipeline that measure branch coverage. | 1 |
Group Assignments | In this assignment, each group should build a set of mocks and stubs as well as a set of test cases to reach 70% branch coverage for the code of their prototype project (a test sketch illustrating boundary values, stubs, and branch coverage follows this table). | 1 |
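Below is a hedged sketch of the kind of tests the Section 2 assignments ask for: boundary-value test cases around an assumed limit, with a stub replacing an external dependency. The `transfer` function, the `LIMIT` value, and the payment-gateway stub are hypothetical examples, not artifacts from the course project.

```python
# test_transfer.py -- illustrative sketch only: `transfer`, LIMIT, and the
# gateway stub are hypothetical, not artifacts from the course project.
from unittest.mock import Mock
import pytest

LIMIT = 1000  # assumed business rule: transfers outside (0, LIMIT] are rejected

def transfer(amount: int, gateway) -> bool:
    """Send `amount` through `gateway` if it lies in the valid range (0, LIMIT]."""
    if amount <= 0 or amount > LIMIT:
        return False
    gateway.send(amount)
    return True

# Boundary-value analysis: exercise values just below, at, and just above
# each boundary of the valid range.
@pytest.mark.parametrize("amount,expected", [
    (-1, False), (0, False), (1, True),        # lower boundary 0
    (999, True), (1000, True), (1001, False),  # upper boundary LIMIT
])
def test_transfer_boundaries(amount, expected):
    gateway = Mock()  # stub standing in for the real payment gateway
    assert transfer(amount, gateway) is expected
    # The stub also records interactions, so we can check the call happened
    # exactly when a transfer was accepted.
    assert gateway.send.called is expected
```

Branch coverage for such a suite can then be measured with coverage.py, for example `coverage run --branch -m pytest test_transfer.py` followed by `coverage report`.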
Section 3
Activity Type | Content | Is Graded? |
---|---|---|
Quiz | 1. Explain the impact of utilization on response time. 2. List verification techniques and processes for verifying software security. 3. List exploratory testing tours and explain one of them. | 1 |
Individual Assignment | In this assignment, students should complete the laboratory assignment based on the instructions and material provided in class; for example, calculating the response time and mean time to failure for a given microservice, implementing a roll-back mechanism to improve reliability, or setting up Selenium for automated UI testing (see the calculation sketch after this table). | 1 |
Group Assignments | In this assignment, each group should apply performance testing for reliability assessment based on the operational profile. In addition, the operational profile should be used in the exploratory testing of the prototype project. | 1 |
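A minimal sketch of the Section 3 calculations, assuming a single-server queueing approximation for response time and a simple mean-interval estimate of mean time to failure; the figures used are illustrative only.

```python
# reliability_sketch.py -- minimal illustrative calculations; the single-queue
# formula and the sample data below are assumptions made for the example.

def response_time(service_time_s: float, utilization: float) -> float:
    """Response time of a single-server (M/M/1-style) queue: R = S / (1 - U).
    As utilization approaches 1, response time grows without bound."""
    if not 0.0 <= utilization < 1.0:
        raise ValueError("utilization must be in [0, 1)")
    return service_time_s / (1.0 - utilization)

def mean_time_to_failure(failure_times_h: list[float]) -> float:
    """Estimate MTTF as the mean interval between consecutive observed failures."""
    intervals = [later - earlier
                 for earlier, later in zip(failure_times_h, failure_times_h[1:])]
    return sum(intervals) / len(intervals)

if __name__ == "__main__":
    # A 100 ms service time gives a 200 ms response time at 50% utilization,
    # but 1000 ms at 90% utilization.
    print(response_time(0.100, 0.5), response_time(0.100, 0.9))
    # Failures observed at 0 h, 110 h, 250 h, and 360 h -> MTTF of 120 h.
    print(mean_time_to_failure([0.0, 110.0, 250.0, 360.0]))
```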
Final assessment
Section 1
- For the final assessment, groups present the project work they have accomplished during the course; the grading criteria are listed below.
- Grading criteria for the final project presentation:
  - (22pt) Criteria for artifact evaluation
    - Progress on quality gates automation
    - Level of coverage
    - Number of methods applied, for example:
      - Static analysis, including style checks
      - Unit testing, integration testing, UI testing
      - Mutation testing, fuzz testing, stress testing
    - Recovery implementation
    - UI testing
    - Statement coverage of at least 60% (per Google's minimum coverage requirement)
    - Dashboard and service monitoring
  - (8pt) Presentation content
    - Organization and process
    - Progress and results
    - Quality automation
    - Lessons learnt
- In addition, the course conducts a hackathon with industrial “customers” who are interested in evaluating the quality of a particular open source software component. Student teams are invited to analyze the quality of those components and deliver a presentation. The cases coming from industry differ widely; however, the grading criteria can be grouped as follows:
  - (40pt) Understanding the business context through the Quality Model
    - Interview the customer
    - Define the most critical quality requirements based on the examples from the Quality Model
    - Understand the priorities of the quality requirements
  - (40pt) Findings and soundness of analysis by applying adequate methods
    - Map the quality requirements to analysis methods
    - Conduct the analysis by investigating the project repository, running static analysis for metrics, checking and running the available tests, conducting exploratory testing, running random testing, or applying other methods
  - (20pt) Presentation quality
    - Summarize your findings in the form of a presentation
    - Make it clear what the highest-priority quality requirements for the customer are
    - Make it clear what the most adequate methods to analyze those quality requirements are
    - Show the soundness of your findings
Section 2
Section 3
The retake exam
Section 1
- The retake takes the form of an oral exam based on the material taught in class, together with an individual project that follows the lab material. Students need to demonstrate understanding of the course material and apply what they have learned in an individual prototype project. The grading criteria for each section are the same as for the final project presentation. A meeting is held before the retake itself to plan and agree on the project ideas and to answer questions.
Section 2
Section 3