= Analysis of Software Artifacts =
* '''Course name''': Analysis of Software Artifacts
* '''Code discipline''': SE-01
* '''Subject area''':

== Short Description ==
This course covers the following concepts: Quality Models and Metrics; Technical Debt; Verification Techniques including Testing, Static Analysis and Inspection; Adequacy Criteria; Process Quality.

== Prerequisites ==

=== Prerequisite subjects ===
* Software testing basics
* Notions of coverage
* Naik, Kshirasagar, and Priyadarshi Tripathy. Software testing and quality assurance: theory and practice. John Wiley & Sons, 2011.

=== Prerequisite topics ===

== Course Topics ==
{| class="wikitable"
|+ Course Sections and Topics
|-
! Section !! Topics within the section
|-
| Defining quality ||
# Introduction, Views on Quality
# Quality Models
# Measurements & Quality Metrics
|-
| Testing ||
# Verification Overview
# Measuring Test Adequacy
# Black Box Testing
# Modeling the Input Domain
# Combinatorial Testing
# Basis Path & Data Flow Testing
# Random & Mutation Testing
|-
| Static Analysis ||
# Inspections
# Static Analysis
# Model Checking
|-
| Advanced Analysis and Verification ||
# Performance Analysis & Verification
# Maintainability Analysis & Verification
# Security Analysis & Verification
# Organizational Quality & Process Improvement
|-
| Quality Planning ||
# Technical Debt
# Quality Planning - Cost of Quality
# Quality Planning - Project Quality
# Quality Plan for Practicum Project
|}

== Intended Learning Outcomes (ILOs) ==
   
=== What is the main purpose of this course? ===
Software quality is a key aspect of any IT solution, whether a few hundred lines of code for a smartphone app or a few million lines of code for Enterprise Resource Planning software. The Analysis of Software Artifacts course provides techniques to develop confidence in the quality of the software being produced or acquired, regardless of its size and domain. The course adopts the view that software quality is not only the absence of defects: it encompasses all the characteristics that bear on the ability of the software to satisfy stated and implied needs. Software quality is then defined from different perspectives, namely product quality, quality in use and process quality, through the use of specific quality models. The course systematically explores different quality attributes and the techniques most appropriate to verify them. Specific topics include software testing, static analysis and model checking, inspections, technical debt, the cost of software quality, planning for quality, quantitative models and defect classifications. The course balances traditional lectures with small projects in which students apply the ideas they are learning to real artifacts. The final project consists of the preparation of a quality plan for an industry project.
 
=== ILOs defined at three levels ===

==== Level 1: What concepts should a student know/remember/explain? ====
By the end of the course, the students should be able to explain:
* Several views on software quality.
* Trade-offs among quality attributes in quality models.
* Major differences between verification techniques.
* Adequacy criteria for verification.
* Cost of quality.
   
==== Level 2: What basic practical skills should a student be able to perform? ====
By the end of the course, the students should be able to describe and explain (with examples):
* The usage of quality models.
* The technical debt concept.
* Strengths and weaknesses of specific verification techniques.
* Quality planning.
   
==== Level 3: What complex comprehensive skills should a student be able to apply in real-life scenarios? ====
By the end of the course, the students should be able to:
* Define and execute a testing plan.
* Perform static analysis of the code.
* Define a quality plan.
* Justify quality-related decisions to different stakeholders.
== Grading ==
   
=== Course grading range ===
{| class="wikitable"
|+ Course grading range
|-
! Grade !! Range !! Description of performance
|-
| A. Excellent || 80-100 || -
|-
| B. Good || 65-79 || -
|-
| C. Satisfactory || 50-64 || -
|-
| D. Poor || 0-49 || -
|}
   
=== Course activities and grading breakdown ===
The students' performance will be evaluated as follows:
{| class="wikitable"
|+ Course grade breakdown
|-
! Activity Type !! Percentage of the overall course grade
|-
| Labs/seminar classes || 10
|-
| Interim performance assessment || 50
|-
| Exams || 40
|}
   
=== Recommendations for students on how to succeed in the course ===
   
== Resources, literature and reference materials ==
   
=== Open access resources ===
 
* Textbook:
* This course makes use of many reference materials that are posted to Moodle:
* David A. Garvin, What Does "Product Quality" Really Mean?
* Volker Kruger, Main Schools Of TQM: "the big five"
* Steve McConnell, Managing Technical Debt
* Jean-Louis Letouzey, Michel Ilkiewicz, Managing Technical Debt with SQALE Method
* Stephen Chin, Erik Huddleston, Walter Bodwell, and Israel Gat, The Economics of Technical Debt
* Douglas W. Hubbard, How to Measure Anything. Finding the Value of "Intangibles" in Business
* SEI, Foundations of Measurement
* Frank Buechner, Is 100% Code Coverage Enough?
* Brian Marick, How to Misuse Code Coverage
* Ben H. Smith, Laurie Williams, Should software testers use mutation analysis to augment a test set?
* Frank Buechner, Test Case Design Using the Classification Tree Method
* Thomas J. McCabe, Arthur H. Watson, Structured Testing: A Testing Methodology Using the Cyclomatic Complexity Metric
* Richard Hamlet, Random Testing
* Matt Warnock, Look out! It’s the fuzz!
* Karl E. Wiegers, Peer Reviews in Software: A Practical Guide
* DoD, Formal Inspections
* Jason Cohen, Questions for a Review Process
* Cohen, Prepare to Succeed - A Guide to Effective Code Review
* Michael A. Howard, A Process for Performing Security Code Reviews
* Stephan Wagner, Software Quality Economics for Combining Defect-Detection Techniques, 2005
* Ashok Shenvi, Defect Prevention with Orthogonal Defect Classification, 2009
* Brian Chess, Jacob West, Secure Programming with Static Analysis, 2007
* Mordechai Ben-Ari, A Primer on Model Checking, 2010
* Dave Jewell, Performance Engineering and Management Method, 2008
* Jane Hillston, Performance Modeling, Operational Laws
* Craig Shallahamer, Forecasting Oracle Performance (Ch. 5, Practical Queuing Theory), 2007
* Gerald Everett, Performance Testing, Chapter 9
* IEEE Guide for Software Verification and Validation Plans, 1993
* Rick D. Craig, Systematic Software Testing, 2002
* Peter Mell, A Complete Guide to the Common Vulnerability Scoring System, 2007
* NIST, Technical Guide to Information Security Testing and Assessment, 2008
* Boris Mutafelija, Systematic Process Improvement Using ISO 9001:2000 and CMMI, 2003
* Edward F. Weller, Practical Applications of Statistical Process Control, 2000
* Larry Webber, Michael Wallace, Quality Control for Dummies, 2007
* Mahesh S. Raisinghani, Six Sigma: concepts, tools, and applications, 2005
* SEI, Practical Software Measurement: Measuring for Process Management and Improvement, 1997
   
=== Closed access resources ===
   
=== Software and tools used within the course ===

= Teaching Methodology: Methods, techniques, & activities =

== Activities and Teaching Methods ==
{| class="wikitable"
|+ Activities within each section (1 = used in that section)
|-
! Learning Activities !! Section 1 !! Section 2 !! Section 3 !! Section 4 !! Section 5
|-
| Homework and group projects || 1 || 1 || 1 || 1 || 1
|-
| Midterm evaluation || 1 || 1 || 1 || 1 || 1
|-
| Reports || 1 || 1 || 1 || 1 || 1
|-
| Essays || 1 || 1 || 1 || 1 || 1
|-
| Discussions || 1 || 1 || 1 || 1 || 1
|}
== Formative Assessment and Course Activities ==
   
=== Ongoing performance assessment ===
==== Section 1 ====
{| class="wikitable"
|-
! Activity Type !! Content !! Is Graded?
|-
| Question || What is the dominant quality view implicit in SCRUM and RUP? || 1
|-
| Question || Explain in your own words, in no more than three sentences, the main contribution of one of the quality gurus, such as Ishikawa. || 1
|-
| Question || What is the difference between must-have attributes and delighters in Kano’s concept? || 1
|-
| Question || What is the main difference between a quality model like ISO 25010 and the SAP Products Standard? || 1
|-
| Question || Describe in your own words, with regard to ISO 25010, the following quality attributes: Security, Reliability and Maintainability. || 1
|-
| Question || Define the major quality focus of the customer in a given project. || 0
|-
| Question || Using SONAR, evaluate the maintainability of a given project. || 0
|-
| Question || Discuss your interpretation of the obtained quality level in a given project. || 0
|-
| Question || Describe how and for what quality models are useful. Provide an example from your studio project. || 0
|-
| Question || Map the requirement “the system shall be easy to maintain” to the ISO 25010 quality model. Provide a definition down to the metric level for at least two sub-characteristics of the requirement, and represent the mapping graphically. || 0
|}
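For the ISO 25010 mapping exercise above, a minimal sketch of what a metric-level mapping could look like. The sub-characteristic names (modifiability, testability) come from ISO 25010; the metrics and targets are hypothetical illustrations, not the standard's text:

<syntaxhighlight lang="python">
# Hypothetical metric-level mapping of "the system shall be easy to maintain"
# onto two ISO 25010 maintainability sub-characteristics (targets are invented).
requirement_mapping = {
    "maintainability": {
        "modifiability": {
            "metric": "average cyclomatic complexity per method",
            "target": "< 10",
        },
        "testability": {
            "metric": "statement coverage of the unit-test suite",
            "target": ">= 80%",
        },
    },
}

if __name__ == "__main__":
    for attribute, subs in requirement_mapping.items():
        for sub, m in subs.items():
            print(f"{attribute} / {sub}: {m['metric']} (target {m['target']})")
</syntaxhighlight>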
==== Section 2 ====
{| class="wikitable"
|-
! Activity Type !! Content !! Is Graded?
|-
| Question || In the context of mutation testing: a. What is an equivalent mutant? b. What is the meaning of the terms killed and dead on arrival? c. What is the difference between the two? || 1
|-
| Question || Develop BVA test cases for an application that implements the logic as defined in the exercise. || 1
|-
| Question || Would you use combinatorial testing to derive test cases for a tree-like menu? Yes, no, why? || 1
|-
| Question || What is the relation between branch coverage and mutation testing? || 1
|-
| Question || What is an infeasible path? || 1
|-
| Question || What is fuzz testing? How is it different from random testing? || 1
|-
| Question || What is the oracle problem? || 1
|-
| Question || Write a short code snippet that contains a possible null-pointer exception, and two different sets of test cases that achieve full branch coverage for the snippet. The first set of test cases should miss the defect; the second should trigger it. || 0
|-
| Question || Develop a classification tree covering all test-relevant aspects of a Merge method. The method accepts two ordered integer vectors with a maximum of 128 elements each and returns a single ordered vector with no duplicates formed from the elements of the input vectors. || 0
|-
| Question || Develop test cases for the logical function (A & B) <nowiki>|</nowiki> C -> D so that it achieves 100% MC/DC. || 0
|-
| Question || Develop test cases to achieve 100% basis path coverage using McCabe’s method for the program below. Include: control flow graph, basis paths, test cases. || 0
|}
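To make the branch-coverage exercise above concrete, here is a minimal Python sketch (with None playing the role of the null pointer). It is an illustration, not a prescribed solution: both test sets take every branch outcome, yet only the second triggers the defect, which is why branch coverage alone is a weak adequacy criterion.

<syntaxhighlight lang="python">
def sign_label(x):
    label = None
    if x > 0:
        label = "positive"
    if x < 0:
        label = "negative"
    # Defect: for x == 0 neither branch assigns label, so label is still
    # None and .upper() raises AttributeError (Python's analogue of a
    # null-pointer exception).
    return label.upper()

# Test set 1: {1, -1} exercises both outcomes of each decision, misses the defect.
assert sign_label(1) == "POSITIVE"
assert sign_label(-1) == "NEGATIVE"

# Test set 2: {1, -1, 0} achieves the same branch coverage and triggers it.
# sign_label(0)  # raises AttributeError
</syntaxhighlight>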
==== Section 3 ====
{| class="wikitable"
|-
! Activity Type !! Content !! Is Graded?
|-
| Question || Wiegers references Weinberg’s concept of “egoless programming.” What does he mean by this concept, and why is it relevant to peer review? || 1
|-
| Question || The literature suggests a number of limits on code review sessions. For each, list and justify a reasonable guideline. || 1
|-
| Question || Based on your reading, list two undesirable programmer attitudes that can emerge in an organization that mandates code reviews. Describe three mechanisms that management, the organization, or programmers can use to avoid the development of such attitudes. || 1
|-
| Question || Under what circumstances is model checking not a useful strategy? || 1
|-
| Question || In the context of model checking, what is a counterexample? || 1
|-
| Question || What is one common misconception about either the advantages or disadvantages of model checking versus static analysis that Engler and Musuvathi identify and debunk through experience in their article “Static analysis versus software model checking for bug finding”? || 1
|-
| Question || Using Humphrey’s capture-recapture procedure, how many latent defects can we estimate remain unidentified in a given piece of code? || 0
|-
| Question || You need to inspect a large banking system comprising around 20,000 lines of COBOL and 25,000 lines of newly written Java code with regard to vulnerabilities, but only have enough budget to look at 10,000. How would you prioritize which components to inspect without overrunning the budget? || 0
|-
| Question || Produce a program model for a given code example. Be sure to identify and describe the states, actions, transitions, initial state, and end states. || 0
|-
| Question || Create a Promela model that will print out the even integers from 0 to 100. || 0
|}
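For the capture-recapture exercise above, Humphrey-style estimation uses the standard two-inspector (Lincoln-Petersen) estimator. If inspector A finds <math>n_A</math> defects, inspector B finds <math>n_B</math>, and <math>n_{AB}</math> are found by both, then

<math>\hat{N} \approx \frac{n_A \, n_B}{n_{AB}}, \qquad \text{latent} \approx \hat{N} - \left(n_A + n_B - n_{AB}\right)</math>

With hypothetical counts <math>n_A = 20</math>, <math>n_B = 15</math>, <math>n_{AB} = 10</math>, the estimated defect population is <math>\hat{N} = 30</math>; since 25 distinct defects were found, about 5 would be estimated to remain.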
==== Section 4 ====
{| class="wikitable"
|-
! Activity Type !! Content !! Is Graded?
|-
| Question || Explain the impact of utilization on response time. || 1
|-
| Question || Explain how Amdahl’s law relates to performance improvements during the development process. || 1
|-
| Question || Explain how Little’s law applies to the evaluation of critical performance characteristics of systems (response time, queue size, etc.). || 1
|-
| Question || Give a definition of maintainability. What are the statements of Lehman’s Laws of Program Evolution? || 1
|-
| Question || Give an example of vulnerability prevention measures. || 1
|-
| Question || What is Juran’s view on process improvement? Which CMMI level best suits Juran’s methods? || 1
|-
| Question || You execute a benchmark test twice and find that the performance of the system was 30 transactions/hour the first time and 20 transactions/hour the second time. What is the average throughput? || 0
|-
| Question || From a pure performance point of view, is it better, the same, or worse to have a single server or two servers with half the speed? || 0
|-
| Question || You execute a load test for one hour, first during peak hour and again off-peak. During peak hour the system processes 20 transactions/hour. Off-peak it processes 30 transactions/hour. What is the average throughput? || 0
|-
| Question || Your current e-commerce traffic, 12.5 transactions per second, is served by two CPUs running at 65% of their maximum capability. With the launch of a new product, marketing is forecasting an increase of 30% in your web site traffic. Your job is to make a recommendation, from a pure performance perspective, on whether to upgrade your current system with faster CPUs or to buy two additional CPUs with the same capacity as the existing ones. It is estimated that the faster CPUs will reduce the current service time by 20%. || 0
|-
| Question || Draw a queuing diagram for the systems below and describe them using Kendall’s notation: a) a single-CPU system; b) a system comprising three web servers, to which requests are randomly directed, where each server contains two CPUs. || 0
|-
| Question || Software monitor data for an interactive system shows a CPU utilization of 75%, a 3-second CPU service demand, a response time of 15 seconds, and 10 active users. What is the average think time of these users? || 0
|-
| Question || Construct a Design Structure Matrix for a given set of components. What does the DSM analysis tell you about the maintainability of this set of components? || 0
|}
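A sketch of how the operational laws resolve the think-time exercise above, assuming the standard utilization law and the interactive response time law:

<math>X = \frac{U}{D} = \frac{0.75}{3\ \mathrm{s}} = 0.25\ \text{interactions/s}, \qquad Z = \frac{N}{X} - R = \frac{10}{0.25} - 15 = 25\ \mathrm{s}</math>

The same bookkeeping settles the throughput questions: over two equal one-hour runs, average throughput is total completions divided by total time, i.e. (30 + 20) / 2 = 25 transactions/hour.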
==== Section 5 ====
{| class="wikitable"
|-
! Activity Type !! Content !! Is Graded?
|-
| Question || What is Kruchten’s definition and taxonomy of Technical Debt? || 1
|-
| Question || According to Highsmith, what is the relation between Technical Debt and the Cost of Change? || 1
|-
| Question || In McConnell’s taxonomy, which type of Technical Debt can be positive? || 1
|-
| Question || Explain latent faults through the “tank and pipes” model. Give an example. || 1
|-
| Question || What is a Quality Plan? Give an example of an estimation method for the effort required to implement the quality plan. || 1
|-
| Question || Give a definition of the quality artifacts contributing to the SQALE model. || 0
|-
| Question || Based on the experience with the group project, do the calculated Technical Debt metrics correspond to your intuition? Justify. || 0
|-
| Question || Give an example of possible appraisal costs for a given project. || 0
|-
| Question || Present the quality model for the practicum project. || 0
|-
| Question || Present the quality control measures for the practicum project. || 0
|-
| Question || Present the quality plan for the practicum project with regard to the project milestones and available resources. || 0
|}
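As a companion to the SQALE questions above: technical debt is commonly reported as a remediation-to-development effort ratio; a minimal sketch, with hypothetical effort figures and illustrative output:

<syntaxhighlight lang="python">
# SQALE-style technical debt ratio: estimated remediation effort divided by
# (estimated) development effort. The figures below are hypothetical.
def debt_ratio(remediation_days: float, development_days: float) -> float:
    return remediation_days / development_days

ratio = debt_ratio(remediation_days=12.0, development_days=240.0)
print(f"Technical debt ratio: {ratio:.1%}")  # prints: Technical debt ratio: 5.0%
</syntaxhighlight>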
=== Final assessment ===
'''Section 1'''
# Explain the difference between product quality, quality in use and process quality. Provide 2-3 quality attributes of each category, briefly describing them.
# What quality view best encompasses the phrase "Quality consists of the extent to which a specimen [a product-brand-model-seller combination] possesses the service characteristics you desire"?
# Explain the difference between accuracy and precision of measurement methods.
# For each of the following quantities, indicate the scale (nominal, ordinal, interval, or ratio) of the data (just the scale, no justification required): a. Categories of defect types in a bug database. b. Branch coverage of a test suite. c. Severity of the defects in a bug database. d. Statement coverage of a test suite. e. Number of features delivered on a milestone.
'''Section 2'''
 
# Identify equivalence classes using the Decision Table Method for a given problem.
# Create a Classification Tree for the problem. Identify constraints. Indicate boundaries.
# Calculate the number of test cases required to achieve Basis Path coverage for a code sample.
# Provide a test set that achieves full Basis Path coverage for a code sample.
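A small illustration of the Basis Path items above, assuming McCabe's cyclomatic complexity <math>V(G) = E - N + 2P</math> (equivalently, the number of binary decisions plus one), which bounds the size of a basis path test set:

<syntaxhighlight lang="python">
def classify(x):
    # Two sequential decisions => V(G) = decisions + 1 = 3,
    # so a basis path test set needs 3 test cases.
    if x < 0:
        sign = "negative"
    else:
        sign = "non-negative"
    if x % 2 == 0:
        parity = "even"
    else:
        parity = "odd"
    return sign, parity

# One possible basis path set: each added case flips exactly one decision.
assert classify(-2) == ("negative", "even")
assert classify(2) == ("non-negative", "even")
assert classify(3) == ("non-negative", "odd")
</syntaxhighlight>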
'''Section 3'''
 
# Enumerate three limitations of many dynamic analyses, and describe a mitigation strategy to overcome each.
# Give an example of a circumstance (in terms of a system, property, program, defect, or pattern type) under which you would prefer: a) dynamic over static analysis; b) static over dynamic analysis; c) model checking over more lightweight static analysis.
# Describe two strategies for eliciting developer support and encouraging analysis tool adoption in an organization.
# Define the terms “sound” and “complete” with respect to an analysis tool, and explain or give examples of circumstances under which you would prefer one over the other in selecting a particular tool.
'''Section 4'''
 
# Give an example illustrating the general relationships between response time, throughput, and resource utilization.
# Suppose that during an observation period of 1 minute, a single resource (e.g., the CPU) is observed to be busy for 36 sec. A total of 1800 transactions were observed to arrive to the system. The total number of observed completions is 1800 transactions (i.e., as many completions as arrivals occurred in the observation period). What is: a) the mean service time per transaction, b) the utilization of the resource, c) the system throughput?
# Given a table listing the arrival times of groups of new transactions and the transactions currently in the system, calculate the response time of the system.
# A web server is monitored for 10 minutes and its CPU is observed to be busy 90% of the time. The web server log shows that 30,000 requests were processed in that period. What is the CPU service demand of the web server?
# Give an example of the effect of the law of increasing complexity on maintainability.
# Give an example of fuzzing for security testing.
# Give a brief definition of the CMMI levels.
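The measurement questions above reduce to the operational laws (service demand law <math>D = B/C</math>, utilization law <math>U = X \cdot D</math>); a worked sketch using the numbers given in the questions:

<math>S = \frac{B}{C} = \frac{36\ \mathrm{s}}{1800} = 0.02\ \mathrm{s}, \quad U = \frac{36}{60} = 60\%, \quad X = \frac{1800}{60\ \mathrm{s}} = 30\ \text{tps}; \qquad D_{\mathrm{CPU}} = \frac{0.9 \times 600\ \mathrm{s}}{30000} = 0.018\ \mathrm{s}\ \text{per request}</math>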
'''Section 5'''
# Explain the different types of technical debt that a project might incur.
# Give a definition of the constituent parts of the cost of quality.
# Given project characteristics, decide which quality attributes should be tracked throughout the project. Give an example of a quality control measure for the top-priority quality attribute.
# What are quality gates? Give an example of quality gates at 2 different milestones.
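For the cost-of-quality item above, the classical decomposition into prevention, appraisal, internal failure and external failure costs can be made concrete with a short sketch; all figures below are hypothetical:

<syntaxhighlight lang="python">
# Classical cost-of-quality decomposition; the category totals are
# hypothetical illustrations, not data from any real project.
coq = {
    "prevention": 10_000,        # training, coding standards, quality planning
    "appraisal": 15_000,         # reviews, testing, audits
    "internal_failure": 8_000,   # rework on defects found before release
    "external_failure": 20_000,  # field defects, support, penalties
}
conformance = coq["prevention"] + coq["appraisal"]
nonconformance = coq["internal_failure"] + coq["external_failure"]
print(f"Total cost of quality: {conformance + nonconformance}")  # 53000
</syntaxhighlight>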
=== The retake exam ===
'''Section 1'''

'''Section 2'''

'''Section 3'''

'''Section 4'''

'''Section 5'''
