MSTE: Analysis of Software Artifacts
Analysis of Software Artifacts
- Course name: Analysis of Software Artifacts
- Code discipline:
- Subject area: Computer Science Fundamentals
Short Description
Building software quality is a key aspect of any IT solution, whether a few hundred lines of code for a smartphone app or a few million lines of code for Enterprise Resource Planning software. The Analysis of Software Artifacts course provides techniques to develop confidence in the quality of the software being produced or acquired, regardless of its size and domain. The course adopts the view that software quality is not only the absence of defects but encompasses all the characteristics that bear on the ability of the software to satisfy stated and implied needs. Software quality is then defined from different perspectives: product quality, quality in use, and process quality, through the use of specific quality models. The course systematically explores different quality attributes and the techniques most appropriate to verify them. Specific topics include software testing, static analysis and model checking, inspections, technical debt, cost of software quality, planning for quality, quantitative models, and defect classifications. The course balances traditional lectures with small projects in which students apply the ideas they are learning to real artifacts. The group project consists of the preparation of a quality plan for an industry project. The final assessment takes place at a hackathon, where student teams evaluate the quality of an open-source software component at the request of an industrial “customer”.
Prerequisites
Prerequisite subjects
Prerequisite topics
- Software development
- Software project
- Testing basics
Course Topics
Section | Topics within the section |
---|---|
Defining quality | |
Verification and testing | |
Quality planning | |
Intended Learning Outcomes (ILOs)
What is the main purpose of this course?
What is the main goal of this course, formulated in one sentence? The main goal of this course is to comprehensively review with students the quality analysis methods: from defining quality for a given project to selecting the appropriate analysis methods and setting adequacy criteria, all conducted in a project setting.
ILOs defined at three levels
Level 1: What concepts should a student know/remember/explain?
By the end of the course, the students should be able to ...
- Explain several views on software quality.
- Describe trade-offs among quality attributes in quality models.
- Explain the major differences among verification techniques.
- State adequacy criteria for verification.
- Describe ways to calculate and enforce the necessary reliability (a toy sketch follows this list).
- List software security verification techniques.
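To make the reliability outcome above concrete, here is a minimal sketch assuming the standard exponential reliability model R(t) = exp(-λt); the function name and the numbers are illustrative, not taken from the course materials.

```python
import math

def max_failure_intensity(target_reliability: float, mission_time: float) -> float:
    """Largest failure intensity (failures per time unit) that still meets the
    reliability target over the mission time, assuming the exponential model
    R(t) = exp(-lambda * t), i.e. lambda = -ln(R) / t."""
    return -math.log(target_reliability) / mission_time

# Example: to survive an 8-hour session with 0.99 probability of no failure,
# the failure intensity must stay below ~0.00126 failures per hour.
print(max_failure_intensity(0.99, 8.0))
```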
Level 2: What basic practical skills should a student be able to perform?
By the end of the course, the students should be able to ...
- Define a quality model of a software project in a given context.
- Select appropriate verification techniques for various quality properties such as security and justify their adequacy.
- Define a necessary reliability for a software project in a given context.
- Justify quality-related decisions to different stakeholders based on cost of quality concepts.
Level 3: What complex comprehensive skills should a student be able to apply in real-life scenarios?
By the end of the course, the students should be able to ...
- Define a quality model for a given software development project.
- Set up a basic continuous integration pipeline that automates quality analysis (see the sketch after this list).
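As one possible concretization of the continuous integration outcome above, here is a minimal sketch of a quality-gate script. It assumes a Python project with flake8 and pytest installed; the tool choice and gate policy are illustrative, not prescribed by the course.

```python
"""Minimal CI quality gate: run the linter and the test suite, and fail the
build (non-zero exit code) if either step fails. A real pipeline would invoke
this script from its CI configuration."""
import subprocess
import sys

STEPS = [
    ["flake8", "."],            # static analysis: style and simple bug patterns
    ["pytest", "--maxfail=1"],  # test suite: stop at the first failure
]

def main() -> int:
    for step in STEPS:
        print(f"Running: {' '.join(step)}")
        result = subprocess.run(step)
        if result.returncode != 0:
            print(f"Quality gate failed at: {' '.join(step)}")
            return result.returncode
    print("Quality gate passed.")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```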
Grading
Course grading range
Grade | Range | Description of performance |
---|---|---|
A. Excellent | 80-100 | - |
B. Good | 65-79 | - |
C. Satisfactory | 50-64 | - |
D. Fail | 0-49 | - |
Course activities and grading breakdown
Activity Type | Percentage of the overall course grade |
---|---|
Midterm exam | 20 |
Final exam - Hackathon | 20 |
Quality plan | 10 |
Group assignments | 10 |
Individual assignments | 30 |
Participation | 10 |
Recommendations for students on how to succeed in the course
Review lectures on metrics.
Review testing methods.
Resources, literature and reference materials
Open access resources
Closed access resources
- Tripathy, Priyadarshi, and Kshirasagar Naik. Software Testing and Quality Assurance: Theory and Practice. John Wiley & Sons, 2011.
- more literature will be provided in the course package
Software and tools used within the course
- SonarQube
- Snyk
- Yandex Tank
- JMeter
- Lighthouse
- Selenium
- Cypress.io
Teaching Methodology: Methods, techniques, & activities
Activities and Teaching Methods
The tables below mark which techniques and methods are used in each section (1 = used, 0 = not used).
Teaching Techniques | Section 1 | Section 2 | Section 3 |
---|---|---|---|
Problem-based learning (students learn by solving open-ended problems without a strictly defined solution) | 1 | 1 | 1 |
Project-based learning (students work on a project) | 1 | 1 | 1 |
Differentiated learning (tasks and activities are provided at several levels of difficulty to fit students' needs and level) | 1 | 1 | 1 |
Contextual learning (activities and tasks are connected to the real world to make it easier for students to relate to them) | 1 | 1 | 1 |
Developmental learning (tasks and material stretch abilities the students have not yet tapped) | 1 | 1 | 1 |
Concentrated learning (classes on one large topic are logically grouped together) | 1 | 1 | 1 |
Inquiry-based learning | 1 | 1 | 1 |
Task-based learning | 1 | 1 | 1 |
Learning Activities | Section 1 | Section 2 | Section 3 |
---|---|---|---|
Lectures | 1 | 1 | 1 |
Lab exercises | 1 | 1 | 1 |
Experiments | 1 | 1 | 1 |
Case studies | 1 | 1 | 1 |
Development of individual parts of software product code | 1 | 1 | 1 |
Group projects | 1 | 1 | 1 |
Quizzes (written or computer based) | 1 | 1 | 1 |
Discussions | 1 | 1 | 1 |
Presentations by students | 1 | 1 | 1 |
Written reports | 1 | 1 | 1 |
Formative Assessment and Course Activities
Ongoing performance assessment
Section 1
Activity Type | Content | Is Graded? |
---|---|---|
Quiz | 1. What is the difference between must-have attributes and delighters in Kano’s model? 2. Describe in your own words, with regard to ISO 25010, the following quality attributes: security, reliability, and maintainability. 3. Explain the difference between product quality, quality in use, and process quality. Provide 2-3 quality attributes of each category, briefly describing them. | 1 |
Individual Assignment | In this assignment, students should reply to reading questions such as: What is the dominant quality view implicit in SCRUM and RUP? Explain in your own words, in no more than three sentences, the main contribution of one of the quality gurus, such as Ishikawa. What is the main difference between a quality model like ISO 25010 and the SAP Products Standard? | 1 |
Group Assignments | In this assignment, each group should apply SonarQube metrics to evaluate the technical debt of a given project and explain the results. Each group also has to define a quality model for the practicum project. | 1 |
Section 2
Activity Type | Content | Is Graded? |
---|---|---|
Quiz | 1. In the context of mutation testing: what is an equivalent mutant? What is the meaning of the terms “killed” and “dead on arrival”, and what is the difference between the two? 2. What is an infeasible path? 3. What is fuzz testing, and how is it different from random testing? 4. Give an example of the effect of the law of increasing complexity on maintainability. | 1 |
Individual Assignment | In this assignment, students should reply to reading questions such as: Develop BVA test cases for an application that implements the logic defined in the exercise (a sketch follows this table). Would you use combinatorial testing to derive test cases for a tree-like menu? Yes or no, and why? What is the relation between branch coverage and mutation testing? Using Humphrey’s capture-recapture procedure, how many latent defects can we estimate remain unidentified in a given piece of code? Give an example illustrating the general relationships between response time, throughput, and resource utilization. Construct a Design Structure Matrix for a given set of components. What does the DSM analysis tell you about the maintainability of this set of components? | 1 |
Group Assignments | For the practicum project, each group should define the set of verification methods and the adequacy criteria that correspond to the quality model. | 1 |
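As a hedged illustration of boundary value analysis (BVA) from the individual assignment above, the sketch below tests a hypothetical letter_grade function against the boundaries of this course’s own grading ranges (0-49, 50-64, 65-79, 80-100); the function and test names are assumptions for illustration only.

```python
import pytest  # assumes pytest is installed

def letter_grade(score: int) -> str:
    """Hypothetical function under test, mirroring the course grading table."""
    if score >= 80:
        return "A"
    if score >= 65:
        return "B"
    if score >= 50:
        return "C"
    return "D"

# Boundary value analysis: exercise each side of every range boundary.
@pytest.mark.parametrize("score,expected", [
    (0, "D"), (49, "D"),    # lowest value and the upper edge of the fail range
    (50, "C"), (64, "C"),   # boundaries of the C range
    (65, "B"), (79, "B"),   # boundaries of the B range
    (80, "A"), (100, "A"),  # boundaries of the A range
])
def test_letter_grade_boundaries(score, expected):
    assert letter_grade(score) == expected
```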
Section 3
Activity Type | Content | Is Graded? |
---|---|---|
Quiz | 1. What is Kruchten’s definition and taxonomy of technical debt? 2. According to Highsmith, what is the relation between technical debt and the cost of change? 3. What is a quality plan? 4. Give an example of a method for estimating the effort required to implement the quality plan. | 1 |
Individual Assignment | In this assignment, students should reply to reading questions such as: Give a definition of the quality artifacts contributing to the SQALE model (a toy debt-ratio sketch follows this table). Based on your experience with the group project, do the calculated technical debt metrics correspond to your intuition? Justify your answer. Given the project characteristics, what quality attributes should be tracked throughout the project? Give an example of a quality control measure for the top-priority quality attribute. Give a definition of the constituent parts of the cost of quality. | 1 |
Group Assignments | For the practicum project, each group should define a detailed quality plan that sets the goals for the performance indicators and specifies means of quality control. | 1 |
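As a toy illustration of SQALE-style technical debt measurement referenced above, the sketch below computes the debt ratio (remediation cost over development cost) and maps it to a letter rating. The thresholds mirror commonly cited SonarQube defaults and are an assumption, not part of the course specification.

```python
def debt_ratio(remediation_cost_days: float, development_cost_days: float) -> float:
    """SQALE-style technical debt ratio: estimated effort to fix all known
    issues divided by the estimated effort to develop the code from scratch."""
    return remediation_cost_days / development_cost_days

def sqale_rating(ratio: float) -> str:
    """Map a debt ratio to a letter rating. Thresholds follow commonly cited
    SonarQube defaults (an assumption, not a course requirement)."""
    if ratio <= 0.05:
        return "A"
    if ratio <= 0.10:
        return "B"
    if ratio <= 0.20:
        return "C"
    if ratio <= 0.50:
        return "D"
    return "E"

# Example: 12 days of remediation on a codebase estimated at 200 dev-days.
ratio = debt_ratio(12, 200)        # 0.06
print(ratio, sqale_rating(ratio))  # 0.06 B
```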
Final assessment
Section 1
- The course conducts a hackathon with industrial “customers” who are interested in evaluating the quality of a particular open-source software component. Student teams are invited to analyze the quality of those components and deliver a presentation. The cases coming from industry vary widely; however, the grading criteria can be grouped as follows:
- (40pt) Understanding the business context through the Quality Model
- Interview the customer
- Define the most critical quality requirements based on the examples from the Quality Model
- Understand the priorities of the quality requirements
- (40pt) Findings and soundness of analysis by applying adequate methods
- Map the quality requirements and analysis methods
- Conduct the analysis by investigating the project repository, running static analysis for metrics, checking and running the available tests, conducting exploratory testing, running random testing, or applying other techniques.
- (20pt) Presentation quality
- Summarize your findings in the form of a presentation.
- Make clear what the highest-priority quality requirements for the customer are
- Make clear what the most adequate methods to analyze those quality requirements are
- Show the soundness of your findings
The retake exam
Section 1
- The retake takes the form of an oral exam based on the material taught in class, together with an individual project that follows the lab material. Students need to demonstrate understanding of the course material and apply what they learned in an individual prototype project. The grading criteria for each section are the same as for the final project presentation. A meeting is held before the retake itself to plan and agree on project ideas and to answer questions.