MSc: Analysis of Software Artifacts

Analysis of Software Artifacts

  • Course name: Analysis of Software Artifacts
  • Code discipline: SE-01
  • Subject area:

Short Description

This course covers the following concepts: Quality Models and Metrics; Technical Debt; Verification Techniques including Testing, Static Analysis and Inspection; Adequacy Criteria; Process Quality.

Prerequisites

Prerequisite subjects

  • Software testing basics
  • Notions of coverage
  • Naik, Kshirasagar, and Priyadarshi Tripathy. Software testing and quality assurance: theory and practice. John Wiley & Sons, 2011.

Prerequisite topics

Course Topics

Course Sections and Topics
Section 1. Defining Quality
  1. Introduction, Views on Quality
  2. Quality Models
  3. Measurements & Quality Metrics
Section 2. Testing
  1. Verification Overview
  2. Measuring Test Adequacy
  3. Black Box Testing
  4. Modeling the Input Domain
  5. Combinatorial Testing
  6. Basis Path & Data Flow Testing
  7. Random & Mutation Testing
Section 3. Static Analysis
  1. Inspections
  2. Static Analysis
  3. Model Checking
Section 4. Advanced Analysis and Verification
  1. Performance Analysis & Verification
  2. Maintainability Analysis & Verification
  3. Security Analysis & Verification
  4. Organizational Quality & Process Improvement
Section 5. Quality Planning
  1. Technical Debt
  2. Quality Planning - Cost of Quality
  3. Quality Planning - Project Quality
  4. Quality Plan for Practicum Project

Intended Learning Outcomes (ILOs)

What is the main purpose of this course?

Software quality is a key aspect of any IT solution, whether it is a few hundred lines of code for a smartphone app or a few million lines of code for Enterprise Resource Planning software. The Analysis of Software Artifacts course provides techniques to develop confidence in the quality of the software being produced or acquired, regardless of its size and domain. The course adopts the view that software quality is not only the absence of defects: it encompasses all the characteristics that bear on the ability of the software to satisfy stated and implied needs. Software quality is then defined from different perspectives, namely product quality, quality in use, and process quality, through the use of specific quality models. The course systematically explores different quality attributes and the techniques most appropriate to verify them. Specific topics include software testing, static analysis and model checking, inspections, technical debt, the cost of software quality, planning for quality, quantitative models, and defect classifications. The course balances traditional lectures with small projects in which students apply the ideas they are learning to real artifacts. The final project consists of the preparation of a quality plan for an industry project.

ILOs defined at three levels

Level 1: What concepts should a student know/remember/explain?

By the end of the course, the students should be able to ...

  • Describe several views on software quality.
  • Explain trade-offs among quality attributes in quality models.
  • Explain the major differences between verification techniques.
  • Describe adequacy criteria for verification.
  • Explain the cost of quality.

Level 2: What basic practical skills should a student be able to perform?

By the end of the course, the students should be able to ...

  • Apply quality models.
  • Apply the concept of technical debt.
  • Assess the strengths and weaknesses of specific verification techniques.
  • Perform quality planning.

Level 3: What complex comprehensive skills should a student be able to apply in real-life scenarios?

By the end of the course, the students should be able to ...

  • Define and execute a testing plan.
  • Perform static analysis of the code.
  • Define a quality plan.
  • Justify quality related decisions to different stakeholders.

Grading

Course grading range

Grade Range Description of performance
A. Excellent 80-100 -
B. Good 65-79 -
C. Satisfactory 50-64 -
D. Poor 0-49 -

Course activities and grading breakdown

Activity Type | Percentage of the overall course grade
Labs/seminar classes | 10
Interim performance assessment | 50
Exams | 40

Recommendations for students on how to succeed in the course

Resources, literature and reference materials

Open access resources

  • Textbook:
  • This course makes use of many reference materials that are posted to Moodle:
  • David A. Garvin, What Does "Product Quality" Really Mean?
  • Volker Krüger, Main Schools of TQM: "The Big Five"
  • Steve McConnell, Managing Technical Debt
  • Jean-Louis Letouzey, Michel Ilkiewicz, Managing Technical Debt with the SQALE Method
  • Stephen Chin, Erik Huddleston, Walter Bodwell, and Israel Gat, The Economics of Technical Debt
  • Douglas W. Hubbard, How to Measure Anything: Finding the Value of "Intangibles" in Business
  • SEI, Foundations of Measurement
  • Frank Buechner, Is 100% Code Coverage Enough?
  • Brian Marick, How to Misuse Code Coverage
  • Ben H. Smith, Laurie Williams, Should software testers use mutation analysis to augment a test set?
  • Frank Buechner, Test Case Design Using the Classification Tree Method
  • Thomas J. McCabe, Arthur H. Watson, Structured Testing: A Testing Methodology Using the Cyclomatic Complexity Metric
  • Richard Hamlet, Random Testing
  • Matt Warnock, Look out! It’s the fuzz!
  • Karl E. Wiegers, Peer Reviews in Software: A Practical Guide
  • DoD, Formal Inspections
  • Jason Cohen, Questions for a Review Process
  • Jason Cohen, Prepare to Succeed - A Guide to Effective Code Review
  • Michael A. Howard, A Process for Performing Security Code Reviews
  • Stefan Wagner, Software Quality Economics for Combining Defect-Detection Techniques, 2005
  • Ashok Shenvi, Defect Prevention with Orthogonal Defect Classification, 2009
  • Brian Chess, Jacob West, Secure Programming with Static Analysis, 2007
  • Mordechai Ben-Ari, A Primer on Model Checking, 2010
  • Dave Jewell, Performance Engineering and Management Method, 2008
  • Jane Hillston, Performance Modeling, Operational Laws
  • Craig Shallahamer, Forecasting Oracle Performance (Ch. 5, Practical Queuing Theory), 2007
  • Gerald Everett, Performance Testing, Chapter 9
  • IEEE Guide for Software Verification and Validation Plans, 1993
  • Rick D. Craig, Systematic Software Testing, 2002
  • Peter Mell, A Complete Guide to the Common Vulnerability Scoring System, 2007
  • NIST, Technical Guide to Information Security Testing and Assessment, 2008
  • Boris Mutafelija, Systematic Process Improvement Using ISO 9001:2000 and CMMI, 2003
  • Edward F. Weller, Practical Applications of Statistical Process Control, 2000
  • Larry Webber, Michael Wallace, Quality Control for Dummies, 2007
  • Mahesh S. Raisinghani, Six Sigma: concepts, tools, and applications, 2005
  • SEI, Practical Software Measurement: Measuring for Process Management and Improvement, 1997

Closed access resources

Software and tools used within the course

Teaching Methodology: Methods, techniques, & activities

Activities and Teaching Methods

Activities within each section
Learning Activities | Section 1 | Section 2 | Section 3 | Section 4 | Section 5
Homework and group projects | 1 | 1 | 1 | 1 | 1
Midterm evaluation | 1 | 1 | 1 | 1 | 1
Reports | 1 | 1 | 1 | 1 | 1
Essays | 1 | 1 | 1 | 1 | 1
Discussions | 1 | 1 | 1 | 1 | 1

Formative Assessment and Course Activities

Ongoing performance assessment

Section 1

Activity Type | Content | Is Graded?
Question | What is the dominant quality view implicit in Scrum and RUP? | Yes
Question | Explain in your own words, in no more than three sentences, the main contribution of one of the quality gurus, such as Ishikawa. | Yes
Question | What is the difference between must-have attributes and delighters in Kano's model? | Yes
Question | What is the main difference between a quality model like ISO 25010 and the SAP Products Standard? | Yes
Question | Describe in your own words, with regard to ISO 25010, the following quality attributes: security, reliability, and maintainability. | Yes
Question | Identify the customer's major quality focus in a given project. | No
Question | Using SONAR, evaluate the maintainability of a given project. | No
Question | Discuss your interpretation of the quality level obtained for a given project. | No
Question | Describe how and why quality models are useful. Provide an example from your studio project. | No
Question | Map the requirement "the system shall be easy to maintain" to the ISO 25010 quality model. Provide a definition down to the metric level for at least two sub-characteristics of the requirement, and represent the mapping graphically. (A minimal sketch of such a mapping follows this table.) | No
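
For the mapping exercise above, here is a minimal sketch in Python of what a metric-level mapping could look like; the metric names and target values are illustrative assumptions, not prescribed by ISO 25010:

  # Hypothetical mapping of "the system shall be easy to maintain" onto two
  # ISO 25010 maintainability sub-characteristics, each tied to a metric.
  # Metrics and thresholds are invented for illustration.
  iso25010_mapping = {
      "Maintainability": {
          "Modularity": {
              "metric": "average efferent coupling per module",
              "target": "no more than 5 outgoing dependencies",
          },
          "Testability": {
              "metric": "branch coverage of the unit test suite",
              "target": "at least 80%",
          },
      },
  }

  for sub, details in iso25010_mapping["Maintainability"].items():
      print(f"{sub}: {details['metric']} ({details['target']})")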

Section 2

Activity Type | Content | Is Graded?
Question | In the context of mutation testing: a) What is an equivalent mutant? b) What is the meaning of the terms "killed" and "dead on arrival"? c) What is the difference between the two? | Yes
Question | Develop BVA test cases for an application that implements the logic defined in the exercise. | Yes
Question | Would you use combinatorial testing to derive test cases for a tree-like menu? Yes or no, and why? | Yes
Question | What is the relation between branch coverage and mutation testing? | Yes
Question | What is an infeasible path? | Yes
Question | What is fuzz testing? How does it differ from random testing? | Yes
Question | What is the oracle problem? | Yes
Question | Write a short code snippet that contains a possible null-pointer exception, and two different sets of test cases that achieve full branch coverage for the snippet. The first set of test cases should miss the defect; the second should trigger it. (An illustrative sketch follows this table.) | No
Question | Develop a classification tree covering all test-relevant aspects of a Merge method. The method accepts two ordered integer vectors with a maximum of 128 elements each and returns a single ordered vector, with no duplicates, formed from the elements of the input vectors. | No
Question | Develop test cases for the decision C -> D so that they achieve 100% MC/DC. | No
Question | Develop test cases to achieve 100% basis path coverage, using McCabe's method, for the given program. Include: control flow graph, basis paths, test cases. | No
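
A minimal sketch, in Python, of the kind of answer the branch-coverage question above asks for. The function and its tests are invented for illustration; in Python the null dereference surfaces as a TypeError on None rather than a NullPointerException, but the structure of the argument is the same:

  def scale(x, y):
      z = None
      if x > 0:
          z = y
      if y > 0:
          return z + 1  # defect: z is still None when x <= 0 and y > 0
      return 0

  # Test set 1: covers every branch of both if statements, misses the defect.
  assert scale(1, 1) == 2    # (x > 0) True,  (y > 0) True
  assert scale(-1, -1) == 0  # (x > 0) False, (y > 0) False

  # Test set 2: also covers every branch, but triggers the defect.
  assert scale(1, -1) == 0   # (x > 0) True,  (y > 0) False
  scale(-1, 1)               # (x > 0) False, (y > 0) True -> raises TypeError

Both sets achieve full branch coverage; only the second combines the branch outcomes that reach the faulty addition.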

Section 3

Activity Type | Content | Is Graded?
Question | Wiegers references Weinberg's concept of "egoless programming." What does he mean by this concept, and why is it relevant to peer review? | Yes
Question | The literature suggests a number of limits on code review sessions. For each, list and justify a reasonable guideline. | Yes
Question | Based on your reading, list two undesirable programmer attitudes that can emerge in an organization that mandates code reviews. Describe three mechanisms management, the organization, or programmers can use to avoid the development of such attitudes. | Yes
Question | Under what circumstances is model checking not a useful strategy? | Yes
Question | In the context of model checking, what is a counterexample? | Yes
Question | What is one common misconception about the advantages or disadvantages of model checking versus static analysis that Engler and Musuvathi identify and debunk through experience in their article "Static analysis versus software model checking for bug finding"? | Yes
Question | Using Humphrey's capture-recapture procedure, how many latent defects can we estimate remain unidentified in a given piece of code? (The estimator is sketched after this table.) | No
Question | You need to inspect a large banking system, comprising around 20,000 lines of COBOL and 25,000 lines of newly written Java code, for vulnerabilities, but you only have enough budget to look at 10,000 lines. How would you prioritize which components to inspect without overrunning the budget? | No
Question | Produce a program model for a given code example. Be sure to identify and describe the states, actions, transitions, initial state, and end states. | No
Question | Create a Promela model that prints the even integers from 0 to 100. | No
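
The capture-recapture question above uses Humphrey's two-inspector estimate, which follows the Lincoln-Petersen estimator: if inspector A finds n_A defects, inspector B finds n_B, and m defects are found by both, the estimated total defect count is

  N ≈ (n_A × n_B) / m

so roughly N − (n_A + n_B − m) defects remain latent. For instance, if A finds 20 defects, B finds 10, and 5 are common to both, then N ≈ 40 and about 15 defects are still unidentified.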

Section 4

Activity Type | Content | Is Graded?
Question | Explain the impact of utilization on response time. | Yes
Question | Explain how Amdahl's law relates to performance improvements during the development process. | Yes
Question | Explain how Little's law applies to evaluating critical performance characteristics of a system (response time, queue size, etc.). | Yes
Question | Give a definition of maintainability. What are the statements of Lehman's laws of program evolution? | Yes
Question | Give an example of vulnerability prevention measures. | Yes
Question | What is Juran's view on process improvement? Which CMMI level best suits Juran's methods? | Yes
Question | You execute a benchmark test twice and find that the performance of the system was 30 transactions/hour the first time and 20 transactions/hour the second time. What is the average throughput? | No
Question | From a pure performance point of view, is it better, the same, or worse to have a single server rather than two servers with half the speed? | No
Question | You execute a load test for one hour, first during the peak hour and again off-peak. During the peak hour the system processes 20 transactions/hour; off-peak it processes 30 transactions/hour. What is the average throughput? | No
Question | Your current e-commerce traffic, 12.5 transactions per second, is served by two CPUs running at 65% of their maximum capacity. With the launch of a new product, marketing forecasts a 30% increase in your web site traffic. Your job is to make a recommendation, from a pure performance perspective, on whether to upgrade the current system with faster CPUs or to buy two additional CPUs with the same capacity as the existing ones. It is estimated that the faster CPUs would reduce the current service time by 20%. | No
Question | Draw a queuing diagram for the systems below and describe them using Kendall's notation: a) a single-CPU system; b) a system comprising three web servers, to which requests are randomly directed, each server containing two CPUs. | No
Question | Software monitor data for an interactive system shows a CPU utilization of 75%, a 3-second CPU service demand, a response time of 15 seconds, and 10 active users. What is the average think time of these users? (The operational laws needed here are summarized after this table.) | No
Question | Construct a Design Structure Matrix for a given set of components. What does the DSM analysis tell you about the maintainability of this set of components? | No
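
The operational laws behind several of the exercises above, the think-time question in particular, are:

  U = X · D        (utilization law: utilization = throughput × service demand)
  N = X · R        (Little's law: population = throughput × response time)
  R = N / X − Z    (interactive response time law, with think time Z)

Applied to the monitor-data question: throughput X = U / D = 0.75 / 3 s = 0.25 requests/s, so the average think time is Z = N / X − R = 10 / 0.25 − 15 = 25 seconds.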

Section 5

Activity Type | Content | Is Graded?
Question | What are Kruchten's definition and taxonomy of technical debt? | Yes
Question | According to Highsmith, what is the relation between technical debt and the cost of change? | Yes
Question | In McConnell's taxonomy, which type of technical debt can be positive? | Yes
Question | Explain latent faults using the "tank and pipes" model. Give an example. | Yes
Question | What is a quality plan? Give an example of an estimation method for the effort required to implement a quality plan. | Yes
Question | Define the quality artifacts that contribute to the SQALE model. (A small computational sketch follows this table.) | No
Question | Based on your experience with the group project, do the calculated technical debt metrics correspond to your intuition? Justify your answer. | No
Question | Give an example of possible appraisal costs for a given project. | No
Question | Present the quality model for the practicum project. | No
Question | Present the quality control measures for the practicum project. | No
Question | Present the quality plan for the practicum project with regard to the project milestones and available resources. | No
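
To make the SQALE question above concrete, here is a small Python sketch of how a SQALE-style technical debt index and debt ratio are computed. The rule names and remediation costs are illustrative assumptions; the actual method defines them per rule in its quality model:

  # SQALE-style debt: the index sums the remediation costs of all open
  # violations; the debt ratio relates it to an estimated redevelopment cost.
  remediation_cost_minutes = {
      "missing_unit_test": 30,
      "duplicated_block": 20,
      "cyclic_dependency": 60,
  }
  violations = ["missing_unit_test", "duplicated_block", "duplicated_block"]

  sqale_index = sum(remediation_cost_minutes[v] for v in violations)  # 70 min
  development_cost = 8 * 60  # assumed: 8 person-hours to rebuild, in minutes
  debt_ratio = sqale_index / development_cost
  print(f"SQALE index: {sqale_index} min, debt ratio: {debt_ratio:.0%}")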

Final assessment

Section 1

  1. Explain the difference between product quality, quality in use, and process quality. Provide 2-3 quality attributes of each category, briefly describing them.
  2. Which quality view best encompasses the phrase "Quality consists of the extent to which a specimen [a product-brand-model-seller combination] possesses the service characteristics you desire"?
  3. Explain the difference between accuracy and precision of measurement methods.
  4. For each of the following quantities, indicate the scale (nominal, ordinal, interval, or ratio) of the data (just the scale, no justification required): a. Categories of defect types in a bug database. b. Branch coverage of a test suite. c. Severity of the defects in a bug database. d. Statement coverage of a test suite. e. Number of features delivered on a milestone.

Section 2

  1. Identify equivalence classes using the Decision Table method for a given problem.
  2. Create a classification tree for the problem. Identify constraints. Indicate boundaries.
  3. Calculate the number of test cases needed to achieve basis path coverage for a code sample (see the formula after this list).
  4. Provide a test set that achieves full basis path coverage for a code sample.
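
The basis path items above rest on McCabe's cyclomatic complexity: for a control flow graph with E edges, N nodes, and P connected components,

  V(G) = E − N + 2P

and V(G) equals the number of basis paths, i.e., the number of test cases a basis-path-adequate test set requires. For example, a single-procedure graph (P = 1) with 9 edges and 7 nodes yields V(G) = 9 − 7 + 2 = 4 test cases.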

Section 3

  1. Enumerate three limitations of many dynamic analyses, and describe a mitigation strategy to overcome each.
  2. Give an example of a circumstance (in terms of a system, property, program, defect, or pattern type) under which you would prefer: a) dynamic over static analysis; b) static over dynamic analysis; c) model checking over more lightweight static analysis.
  3. Describe two strategies for eliciting developer support and encouraging analysis tool adoption in an organization.
  4. Define the terms “sound” and “complete” with respect to an analysis tool, and explain or give examples of circumstances under which you would prefer one over the other in selecting a particular tool.

Section 4

  1. Give an example illustrating general relationships between response time, throughput, and resource utilization.
  2. Suppose that during an observation period of 1 minute, a single resource (e.g., the CPU) is observed to be busy for 36 sec. A total of 1800 transactions were observed to arrive at the system. The total number of observed completions is 1800 transactions (i.e., as many completions as arrivals occurred in the observation period). What is: a) the mean service time per transaction, b) the utilization of the resource, c) the system throughput? (The relevant operational laws are applied after this list.)
  3. Given a table listing the arrival times of groups of new transactions and the transactions currently in the system, calculate the response time of the system.
  4. A web server is monitored for 10 minutes and its CPU is observed to be busy 90% of the time. The web server log shows that 30,000 requests were processed in that period. What is the CPU service demand of the web server?
  5. Give an example of the effect of the law of increasing complexity on maintainability.
  6. Give an example of fuzzing for security testing.
  7. Give a brief definition of the CMMI levels.
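
Items 2 and 4 above follow directly from the operational laws. For item 2: mean service time S = B/C = 36 s / 1800 = 0.02 s; utilization U = B/T = 36 s / 60 s = 60%; throughput X = C/T = 1800 / 60 s = 30 transactions/s. For item 4: X = 30,000 requests / 600 s = 50 requests/s, so the CPU service demand is D = U/X = 0.90 / 50 = 0.018 s, i.e., 18 ms per request.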

Section 5

  1. Explain the different types of technical debt that a project might incur.
  2. Give a definition of the constituent parts of the cost of quality (a standard breakdown follows this list).
  3. Given project characteristics, determine which quality attributes should be tracked throughout the project. Give an example of a quality control measure for the top-priority quality attribute.
  4. What are quality gates? Give an example of quality gates at two different milestones.
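
For item 2, the classical breakdown of the cost of quality is

  Cost of Quality = (prevention + appraisal) + (internal failure + external failure)

where the first bracket is the cost of conformance (e.g., training, reviews, testing infrastructure) and the second is the cost of non-conformance (e.g., rework on defects found before release, field defects and support after release).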

The retake exam

Section 1

Section 2

Section 3

Section 4

Section 5