CIS 350 Spring 2013
Midterm Exam Study Guide
This document lists some important questions that you should be able to answer about the assigned readings, as well as some other questions brought up in lecture.
Chapter numbers refer to Braude & Bernstein, unless otherwise noted.
Note that this guide implies nothing about the types of questions you will see on the exam or their level of difficulty. It is purely an overview of the important concepts covered in the first half of the course.
Part 1: Building Software
Software Engineering and Process Models
- Reading: Chapters 1-3
- What is "software engineering"?
- What is a "process model"? Why is it needed?
- What is the "waterfall" model? What are its advantages and disadvantages?
- What is the "prototyping" model? What are its advantages and disadvantages?
- What is the "spiral" model? What are its advantages and disadvantages?
- What is the "Unified Process"? What are its advantages and disadvantages?
Agile Software Development
- Reading: Chapter 4
- Lecture slides: Jan 15
- What are the risks involved in software development that agile processes seek to avoid/limit/mitigate?
- What are some potential disadvantages of agile development processes?
- What are the "agile principles"?
- What is the "Scrum" model?
- What is the "eXtreme Programming" model?
- What is a "user story"? What is meant by "project velocity"? What is a "CRC card"? What is a "release" and how does it differ from an "iteration"?
Software Configuration Management
- Reading: Chapter 6
- Lecture notes: Jan 17
- Where does "change" come from in software development? Why do we care about it?
- What is the purpose of a Configuration Management (CM) system?
- What is meant by "software configuration item"?
- What are the important features of a CM repository?
- How do version control and change control help a software development team?
- Reading: Fowler, "Continuous Integration"
- What are the advantages of continuous integration? What problems does it try to avoid?
- Why does successful continuous integration depend on the availability of test code?
Requirements Analysis
- Reading: Chapters 10-13
- Lecture notes: Jan 22
- What is the difference between "high-level" and "low-level" requirements?
- What are the different types of questions you ask during requirements elicitation?
- What are the different ways of documenting requirements? What's the difference between a "use case" and a "user story"?
- What are "non-functional requirements"? What are "constraints"?
- What are some aspects of good requirements?
Part 2: Testing Software
Software Testing Concepts
- Reading: Chapter 25; P. Ammann and J. Offutt, Introduction to Software Testing, chapter 1
- Lecture notes: Jan 31
- What is the definition of "software testing"? What are its goals?
- What are the differences between "validation", "verification", and "testing"?
- What are the differences between "unit testing", "integration testing", "system testing", and "acceptance testing"?
- What are the definitions of "fault", "error", and "failure"?
- What are the definitions of "test requirement", "coverage", and "coverage level"?
- What is a "test oracle"? What does it mean for a test oracle to be "sound"? "complete"?
- Reading: Chapter 26
- Lecture notes: Feb 5, Feb 7, Feb 12
- What is the difference between "black-box testing" and "white-box testing"?
- What is "equivalence partitioning"? What is "boundary analysis"? What are "robustness cases"?
- What is the "single-fault assumption"?
- What is meant by "weak normal", "weak robust", "strong normal", and "strong robust" equivalence classes?
- What are the advantages of test-driven development, i.e., writing tests before you implement the code?
- What is a "control flow graph"? How is it used during testing?
- What is the difference between "statement coverage", "branch coverage", and "path coverage"? How are they used during testing?
- What does it mean for path coverage to "subsume" statement coverage?
- If your test suite achieves 100% statement coverage, does that mean all bugs have been found? Why or why not?
- What is a "path condition"? How is it used during testing? What does it mean for a path condition to be unsatisfiable?
- What is "mutation analysis"? How is it used in software testing?
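As a concrete illustration of equivalence partitioning, boundary analysis, and robustness cases, here is a sketch in Java. The `Grader` class and its partition edges are hypothetical, not from the readings; the idea is that each partition contributes one representative test, each boundary gets tests at and adjacent to the edge, and robustness cases probe just outside the valid input range.

```java
// Hypothetical method under test: maps a numeric score (0-100) to a letter grade.
public class Grader {
    public static char grade(int score) {
        if (score < 0 || score > 100) throw new IllegalArgumentException("score out of range");
        if (score >= 90) return 'A';
        if (score >= 80) return 'B';
        if (score >= 70) return 'C';
        return 'F';
    }

    public static void main(String[] args) {
        // Equivalence partitioning: one representative input per partition.
        assert grade(95) == 'A';
        assert grade(85) == 'B';
        assert grade(75) == 'C';
        assert grade(50) == 'F';
        // Boundary analysis: test at and just beside each partition edge.
        assert grade(90) == 'A';
        assert grade(89) == 'B';
        assert grade(0) == 'F';
        assert grade(100) == 'A';
        // Robustness case: an input just outside the valid range must be rejected.
        boolean rejected = false;
        try { grade(101); } catch (IllegalArgumentException e) { rejected = true; }
        assert rejected;
        System.out.println("all partition/boundary tests passed");
    }
}
```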
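Mutation analysis can likewise be illustrated with a small sketch. A mutation tool mechanically alters one operator at a time, and a test suite is judged by whether it "kills" each resulting mutant (i.e., some test distinguishes the mutant from the original). The `isAdult` example and its mutant below are hypothetical; note how only a boundary test tells them apart.

```java
public class MutationDemo {
    public static boolean isAdult(int age) { return age >= 18; }       // original
    public static boolean isAdultMutant(int age) { return age > 18; }  // mutant: >= replaced by >

    public static void main(String[] args) {
        // A weak suite that never probes the boundary lets the mutant survive:
        assert isAdult(30) == isAdultMutant(30);  // both true
        assert isAdult(5) == isAdultMutant(5);    // both false
        // Adding the boundary input kills the mutant, exposing the gap in the suite:
        assert isAdult(18);         // original: true
        assert !isAdultMutant(18);  // mutant: false, so this test distinguishes them
        System.out.println("mutant killed by the age == 18 test");
    }
}
```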
Part 3: Designing Software
Software Quality
- Reading: F. Brooks, "No Silver Bullet" (1987 version)
- Lecture notes: Feb 14
- What are the different "views" of software quality? How do they differ from each other?
- How can software quality be measured?
- What are the important characteristics of the ISO 9126 model for "external quality"? for "internal quality"?
- How is internal quality related to external quality?
- What are the "essential difficulties" of software development?
- Which past advances in software engineering have addressed "accidental difficulties"?
- What does Brooks say is the most important thing we can do to improve software?
- Reading: Chapter 15
- Lecture notes: Feb 14
- How is design related to quality?
- What is the difference between "procedural abstraction" and "data abstraction"?
- How are "abstraction" and "refinement" related?
- What is meant by "modularity"?
- How is information hiding achieved in languages like Java?
- What is "functional independence"?
- What is an "aspect"?
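On the information-hiding question: in Java, hiding is achieved with access modifiers, so clients program against a class's public operations while the representation stays private and can change freely. A minimal sketch, with an illustrative `Account` class (not from the text):

```java
// Data abstraction via information hiding: the representation (balance stored
// as cents in a long) is private, so it could later change (e.g., to BigDecimal)
// without breaking any caller that uses only the public interface.
public class Account {
    private long cents;  // hidden representation

    public void deposit(long amountCents) {
        if (amountCents < 0) throw new IllegalArgumentException("negative deposit");
        cents += amountCents;
    }

    public long balanceCents() { return cents; }

    public static void main(String[] args) {
        Account a = new Account();
        a.deposit(250);
        assert a.balanceCents() == 250;
        System.out.println("balance: " + a.balanceCents() + " cents");
    }
}
```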
Design Complexity Metrics
- Reading: Chapters 17 and 19
- Lecture notes: Feb 19, Feb 21
- What is meant by the "SOLID" principles of class design?
- What is meant by "grammatical parsing"? How do you use it to design a class?
- What is a CRC card?
- What are "cohesion" and "coupling"? Are they good or bad? How do they relate to the internal quality of a design? How do they relate to the external quality?
- What is the difference between "inheritance" and "composition"? When would you use each?
- What are the different categories of object-oriented design patterns?
- What is the Singleton design pattern? How is it implemented? How could it be used for "aspects"?
- What is the Observer design pattern? How is it implemented? How does it help address issues with one-to-many relationships between objects?
- What is the Model-View-Controller pattern? How is it used in Android?
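For the Singleton question, a minimal lazy-initialization sketch in Java; `AppConfig` is an illustrative name, and a production version would need to consider thread safety (e.g., an enum singleton or the static holder-class idiom):

```java
// Singleton: a private constructor prevents outside instantiation, and a
// static accessor lazily creates and returns the single shared instance.
public class AppConfig {
    private static AppConfig instance;

    private AppConfig() {}  // blocks `new AppConfig()` from client code

    public static AppConfig getInstance() {
        if (instance == null) instance = new AppConfig();  // not thread-safe as written
        return instance;
    }

    public static void main(String[] args) {
        // Every call yields the same object identity.
        assert AppConfig.getInstance() == AppConfig.getInstance();
        System.out.println("one shared instance");
    }
}
```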
Implementation and Analyzability
- Reading: Chapter 20; F. Tsui & O. Karam, Essentials of Software Engineering, chapter 8
- Lecture notes: Feb 26
- How is McCabe's cyclomatic complexity defined?
- How is Henry-Kafura structural complexity measured? What are fan-in and fan-out?
- What is the Chapin metric? How does it differ from McCabe and Henry-Kafura metrics?
- What are the different levels of cohesion? What are the different levels of coupling?
- Define the following metrics: Weighted Methods per Class; Depth of Inheritance Tree; Number of Children; Coupling Between Object Classes; Response for a Class; Lack of Cohesion of Methods
- How can each of those be measured? How do they relate to "internal quality"?
- What is Pairwise Coupling? How is it measured?
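McCabe's cyclomatic complexity can be read off a method's decision structure: V(G) = E - N + 2 over the control flow graph, or equivalently the number of binary decisions plus one. A sketch with a hypothetical method:

```java
public class Cyclomatic {
    // Two binary decisions (the loop continue/exit test and the if) give
    // V(G) = 2 + 1 = 3, so at least 3 tests are needed for the independent
    // paths: empty array, a positive element, a non-positive element.
    public static int countPositives(int[] xs) {
        int n = 0;                 // entry node
        for (int x : xs) {         // decision 1: loop condition
            if (x > 0) n++;        // decision 2: x > 0
        }
        return n;                  // exit node
    }

    public static void main(String[] args) {
        assert countPositives(new int[]{}) == 0;      // loop never taken
        assert countPositives(new int[]{5}) == 1;     // if-branch true
        assert countPositives(new int[]{-1}) == 0;    // if-branch false
        System.out.println("3 tests, one per independent path");
    }
}
```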
- Reading: Chapter 22
- Lecture notes: Mar 12
- What is an "invariant"? How is it used in the design of a method? What are the advantages of using it?
- What is a "class invariant"? "precondition"? "postcondition"?
- What is Java runtime assertion checking? How is it done?
- What is the Java Modeling Language? How are invariants specified?
- What do the Halstead Metrics attempt to quantify? How?
- According to Buse & Weimer, what affects code readability?
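On runtime assertion checking: Java's `assert` statement (enabled with `java -ea`) can check preconditions, postconditions, and a class invariant at method boundaries; JML would instead state the same conditions declaratively in `//@ requires`, `//@ ensures`, and `//@ invariant` annotations. The `Counter` class below is an illustrative sketch, not from the readings.

```java
public class Counter {
    private int count;  // class invariant: count >= 0

    public void increment() {
        int old = count;
        count++;
        assert count == old + 1 : "postcondition violated";
        assert invariant() : "class invariant violated";
    }

    public void decrement() {
        assert count > 0 : "precondition violated: counter already zero";
        count--;
        assert invariant() : "class invariant violated";
    }

    private boolean invariant() { return count >= 0; }

    public int value() { return count; }

    public static void main(String[] args) {
        Counter c = new Counter();
        c.increment();
        c.increment();
        c.decrement();
        assert c.value() == 1;
        System.out.println("value: " + c.value());
    }
}
```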
- Reading: R.V. Binder, "Design for testability in object-oriented systems"
- Lecture notes: Mar 14
- What characteristics lead to testable software?
- What are "controllability" and "observability"?
- What are the six factors that influence the testability of a design?
- What is meant by the "traceability" of a software design?
- What object-oriented metrics reflect the testability of a design? What is the difference between "scope metrics" and "complexity metrics"?
- What is meant by "fault sensitivity"? How is it related to testability?
- What sorts of features need to be added to a class to achieve Built-In Test (BIT) capability? What are some of the disadvantages of doing so?
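One way to picture Built-In Test capability, and Binder's controllability/observability distinction, is as extra hooks added to a class: a setter that drives the object into a chosen state (controllability) and a reporter that exposes state for a test oracle to check (observability). The `Thermostat` class and `bit*` method names below are hypothetical; the drawback they illustrate is extra interface surface that production clients should not touch.

```java
public class Thermostat {
    private double temperature;

    // Normal production interface.
    public void onReading(double t) { temperature = t; }
    public boolean heatOn() { return temperature < 18.0; }

    // BIT extensions (illustrative names):
    public void bitSetState(double t) { temperature = t; }   // controllability hook
    public double bitReportState() { return temperature; }   // observability hook

    public static void main(String[] args) {
        Thermostat t = new Thermostat();
        t.bitSetState(10.0);            // drive the object into a chosen state
        assert t.heatOn();              // oracle check on observable behavior
        assert t.bitReportState() == 10.0;  // oracle check on internal state
        t.bitSetState(25.0);
        assert !t.heatOn();
        System.out.println("BIT hooks exercised");
    }
}
```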