Systems & Software Engineering
Testing, Quality Assurance, and Security Techniques
This is a 3-day workshop promoting a cohesive, hands-on approach to testing, with exercises, examples, and templates that can be applied immediately. It addresses the roles and responsibilities of each participant in the software development process, outlines expectations, and provides mechanisms to measure performance and progress. The course emphasizes a practical approach to testing in order to create better products, and it addresses the ever-changing needs and resources of an organization.
In this workshop participants will learn how to move testing and QA techniques from "gut feeling and instinct" toward an engineering discipline. The discussion is directed at practical solutions to quality assurance problems, including techniques to ensure that an information system protects data and maintains functionality, and the question of how a QA team should be involved in defining security requirements. We specifically address the problems of a lack of resources, insufficient user community involvement, no budget for test automation, poor performance measures, overlapping responsibilities, and the common pitfalls in a testing process.
Who Should Attend
This workshop is extremely beneficial for quality assurance specialists, quality control analysts, system testers, programmers, end-users (customers), business analysts, systems analysts, project managers, team leaders, support analysts, engineers, and acceptance testers. Any organizations that have no formal development methodology — or are planning to adopt a development methodology such as Agile, or plan to test in an eXtreme Programming environment — should also consider this course.
Workshop Objectives
- Examine the differences between unit testing and system testing: where they overlap and how they can complement each other.
- Review testing in an Agile methodology such as eXtreme Programming and 'Programming by Contract'.
- Translate requirements into tests, and demonstrate the value of early testing vs. late testing in a project.
- Use structured techniques to compute test coverage and determine if it is adequate.
- Ensure that testers are testing scenarios that are of concern to the end user.
- Enable testers to repeat the steps that caused an error.
- Examine the levels of testing required during each stage of system development and maintenance, based upon organization size and structure.
- Develop strategies to implement better approaches to quality assurance in your organization, and clarify the role of the tester in the organization.
- Ensure that an information system protects data and maintains system functionality.
- Use diagramming techniques to identify testable conditions from specifications.
- Identify the appropriate metrics to measure progress and performance in your organization.
- Determine the appropriate quality initiatives that may be implemented during each phase of the System Development Life Cycle.
- Refine techniques for estimating the testing effort, and set test objectives.
- Write test plans that provide the desired level of test coverage.
- Assess readiness to acquire test tools and automate the testing process.
- Identify and explain the six basic security concepts.
- Establish criteria to start testing and determine when it is completed.
Outline
I. Introduction: Defect Detection or Defect Prevention?
The objectives of testing are examined and the responsibilities of testers at all levels are defined. Quality concepts are introduced along with a list of success factors that facilitate decision-making at each stage of the testing process. Numerous tests may be conducted during the SDLC, and each is the primary responsibility of a specific group. As each group completes a testing phase, a formal transition process is initiated before the next group begins. This section identifies entrance and exit criteria by phase for each component of the development team. The objective is to eliminate as much testing overlap as possible.
(a) Objectives
(b) Observations
(c) Impediments to Quality Exercise
(d) Role of the Tester
(e) Responsibilities
(f) Early Testing vs. Late Testing
(g) Quality Assurance Assessment
(h) Quality Issues and Elements
  - Quality Improvement Suggestions
  - Quality Tools and Steps
(i) Opportunities to Improve the Testing Process
(j) System Development Life Cycles
  - Waterfall SDLC
  - Spiral SDLC
(k) Phase Objectives
(l) Measuring Performance
(m) Reliability Metrics
(n) Testing Success Factors
(o) Product Development and Testing Phases

II. Major Software Development and Testing Issues
Successful software projects have clear specifications that highlight system objectives and serve as the vehicle for communicating information to everyone participating in the development process. Modern development practices recognize that changes to the plan may be required during the System Development Life Cycle and seek to manage those changes. This section emphasizes the contribution of testers throughout the SDLC.
(a) Preparing Specifications
  - Writing User Manuals
  - Writing Skills
(b) QA / QC Responsibilities
(c) Reviewing Project Specifications
  - Exercise: Sample Project Specification
  - Specification Review Example
  - Specification Problems?
  - Specification Problem Classification
  - Contents of a Specification
  - More Specification Guidelines
  - Tables in Specifications
(d) Scripts and Cases
(e) Unit vs. System (or Acceptance) Testing
(f) Confidence in Testing
(g) Positive and Negative Testing
(h) Blind Testing
(i) Unit-level Test Scripts
(j) System-level Test Scripts
(k) Managing Change
(l) Levels of Testing
(m) Responsibilities by Test Type

III. Test Methodologies and Checklists
Testing methodologies enable testers to compute their test coverage and to have confidence that all requirements will be tested. Both black box and white box techniques are used to demonstrate the methodologies, and these approaches are reinforced with exercises (a brief boundary value example follows this topic list). The use of methodologies in testing is an essential element of a quality assurance organization.
(a) Setting Test Objectives and Identifying Tests
(b) Test Planning
(c) Methodologies
  - Test Coverage Computation
  - Black Box Testing (Closed Box)
  - White Box Testing (Open Box)
(d) Boundary Value Analysis
  - Specification Example with Boundaries
  - Exercise: What Are the Boundary Conditions?
(e) Path Analysis (Cyclomatic Complexity)
  - Path Analysis Example
  - Creating a Flow Chart or Data Flow
  - Data Flow Example
  - Decision Tree
  - Complexity Analysis with Java Code
  - Path Analysis Example (Closed Box)
  - Test Scripts
  - Exercise: Data Flow (Accounting)
  - Exercise: Data Flow (Airline)
  - Exercise: Data Flow (Vesting)
(f) Decision Tables
  - Binary Arithmetic
  - Decision Table Example
  - Function Table
  - Exercise: Decision Table (Telephone)
(g) State Machines
(h) State Transition
(i) Factor Analysis
(j) OATS — Orthogonal Array Testing Strategy
  - OATS Example
  - Using OATS
(k) Pairs and Magic Squares
  - Odd Order Templates
(l) Embedded Systems
(m) Additional Tests
  - Employee Last Name Checklist
  - Table and Array Testing
  - Date Edit Checklist
  - Screen, Button, and Character Entry Checklists
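
As a brief illustration of the black box techniques above, here is a minimal boundary value analysis sketch in Java. The business rule, the ShippingRules class, and its thresholds are invented for this example; the point is simply that tests are placed at, just below, and just above each boundary, where off-by-one defects are most likely to hide.

```java
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

// Hypothetical rule, invented for illustration:
// orders of 100.00 or more ship free; smaller orders pay a 5.95 fee.
class ShippingRules {
    static final double FREE_SHIPPING_THRESHOLD = 100.00;
    static final double STANDARD_FEE = 5.95;

    static double shippingFee(double orderTotal) {
        return orderTotal >= FREE_SHIPPING_THRESHOLD ? 0.00 : STANDARD_FEE;
    }
}

// Boundary value analysis: one test at the boundary and one immediately
// on either side of it.
class ShippingRulesBoundaryTest {
    @Test void justBelowBoundaryPaysFee() {
        assertEquals(5.95, ShippingRules.shippingFee(99.99), 0.001);
    }
    @Test void exactlyAtBoundaryShipsFree() {
        assertEquals(0.00, ShippingRules.shippingFee(100.00), 0.001);
    }
    @Test void justAboveBoundaryShipsFree() {
        assertEquals(0.00, ShippingRules.shippingFee(100.01), 0.001);
    }
}
```

Because the tests exercise only the published rule and not the implementation, the same cases work whether the code is treated as a closed (black) box or an open (white) box.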

IV. Risk Analysis
Not everything in an application can be tested, so prioritizing tests is a major responsibility of the tester. For most systems an exhaustive set of tests is impractical, so identifying the crucial tests becomes imperative; this section helps identify those crucial situations.
(a) Categorical Analysis
(b) Factor Breakdown
(c) Business Rule Analysis
(d) Operational Matrix

V. Test Planning
Testing begins with a plan that unambiguously states the objectives. A suitable methodology is selected to provide adequate test coverage and to deliver the desired level of confidence that the software will perform as advertised. Testing is treated as a dynamic process that may continue after delivery and will certainly play a role in future system modifications. Appropriate record keeping is initiated and maintained through the life of the product.
(a) Unit Testing (Early Testing)
  - White Box Test Case Sources
  - Sample Unit Test Plan Table of Contents
  - Unit Testing Scenario
(b) Integration Testing and System Testing
(c) System / Acceptance Testing
  - Sample System (or Acceptance) Plan Table of Contents
  - Sample System (or Acceptance) Test Script
(d) Possible Test Plan Elements
  - Sample System (or Acceptance) Test Plan
(e) Creating the Regression Test
  - Regression Test Alternatives
  - Traceability Matrix
(f) Usability Testing
(g) Palm Compliance Testing Checklist
(h) Stopping Rules for Testing
(i) Estimating with Function Points
(j) How Do I Estimate the Testing Effort?
(k) Data Dictionaries
(l) Approaches to Testing
  - Top-Down
  - Bottom-Up
(m) Regression Testing
(n) Alternatives to Testing
(o) Test Notebook

VI. Test Plan Reviews
The test plan is the tester's verification document, and reviewing it for accuracy may be the most important phase of the product delivery process. Testability, completeness, sequencing, structure, and timing are the key factors discussed in this section.

VII. Promotion Rule for Applications
Promotion of applications should include a planned and coordinated approach for integrating the new system or version into the workflow. A smooth transition must include phasing out the old system and a gradual introduction of the new one, and providing for this transition requires a well-defined operational plan.

VIII. Test Modifications
All systems will be upgraded or enhanced at some point, and the modified applications must be tested to ensure both that existing functionality continues to work and that the new functionality works correctly. This chapter reviews the processes involved in testing modifications.
(a) Maintenance Issues
(b) Maintenance Testing
(c) Estimating the Modifications
(d) Cost Benefit Identification

IX. Upstream/Downstream Testing
Upstream/downstream testing is based on the view that interfacing systems either feed information into the changed system or receive data from it. The focus is on changing the inputs for upstream testing and changing the outputs for downstream testing.

X. Integration Testing Errors
The objective of integration testing is to ensure that components link and work together; the focus is on the effectiveness of functional interactions and on compatibility at the interfaces. Integration testing becomes imperative when defects that would be difficult to debug are otherwise likely to surface in system testing. This section helps define a consistent process for identifying integration testing errors (a small illustrative sketch follows the checklists).
(a) Checklist of Integration Issues
(b) Error Prevention Checklist
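
As one minimal illustration of the interface focus described above, the sketch below wires two hypothetical components together (the TaxCalculator and InvoiceService names are invented for this example) and verifies them as a pair, so that a mismatch in how the interface is interpreted shows up here rather than later in system testing.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import org.junit.jupiter.api.Test;

// Two hypothetical components that meet at an interface.
interface TaxCalculator {
    // Contract assumed for this sketch: returns the tax as an absolute amount.
    double taxFor(double netAmount);
}

class FlatRateTaxCalculator implements TaxCalculator {
    private final double rate;                       // 0.07 means 7%
    FlatRateTaxCalculator(double rate) { this.rate = rate; }
    public double taxFor(double netAmount) { return netAmount * rate; }
}

class InvoiceService {
    private final TaxCalculator taxCalculator;
    InvoiceService(TaxCalculator taxCalculator) { this.taxCalculator = taxCalculator; }

    // Gross total = net amount plus the tax supplied by the collaborator.
    double grossTotal(double netAmount) {
        return netAmount + taxCalculator.taxFor(netAmount);
    }
}

// Integration test: exercises the two components together, so an interface
// misunderstanding (absolute amount vs. percentage, fraction vs. percent)
// is caught at the point where the components are joined.
class InvoiceServiceIntegrationTest {
    @Test void grossTotalIncludesTaxFromRealCalculator() {
        InvoiceService service = new InvoiceService(new FlatRateTaxCalculator(0.07));
        assertEquals(107.00, service.grossTotal(100.00), 0.001);
    }
}
```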

XI. Degraded Mode Operation
Systems are typically tested under peak operating conditions; in live environments, however, conditions are not always optimal. This section helps identify those non-optimal conditions and provides a series of testing options for them.

XII. Defect Prevention
If we know what has gone wrong in the past, we should be able to avoid repeating those problems in the future. This section examines past lessons and suggests approaches to resolving problems. It recognizes that the chief responsibility of testing is defect prevention, not defect detection.
(a) Checklists
(b) Functional Specification Defects
(c) Design Defects
(d) Coding Defects
(e) Testing Defects
(f) Coding / Testing Rules
  - Quiz: Do You Have a Genius Programmer on Your Staff?

XIII. Test Management
Recording results and investigating the origin of defects allow testers to develop measures that will serve as guidelines for future projects. These documents play a significant role in organizing and planning the final stages of the development effort.
(a) Test Logs
  - Sample Defect Tracking Report
  - Test Log Scenarios
  - Retesting and Follow-up Procedures
(b) Root Cause Analysis

XIV. Problem Solving Techniques
The testing process identifies faults within the system, but identifying the failure may not provide enough information to correct the problem. Testers and other development teams may have to devise mechanisms to isolate the precise cause of the fault.
(a) Error Isolation
(b) Variable Tracers
(c) Flowcharts
(d) Deductive Questioning
(e) Structured Walkthroughs
(f) JAD — Joint Application Design

XV. Object-Oriented Testing
Object-oriented development requires adjustments to the unit testing and system testing process. The possibility of developing test clients is explored (a minimal test client sketch follows this topic list), along with selecting the appropriate methodology for OO testing.
(a) Overview
(b) Definitions
(c) OO vs. Traditional Testing
(d) Managing Complexity
(e) Abstraction
(f) Encapsulation
(g) Inheritance
(h) Object-Oriented Systems Testability Issues
(i) Object-Oriented Testing Approach
(j) Using Test Clients
(k) Other Testing Issues
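
As one way a test client might look, the sketch below drives a hypothetical class through its public interface. The BankAccount class and its methods are invented for this example; the point is that the client exercises the object's behavior and reports results independently of any production caller.

```java
// A minimal object under test, invented for this sketch.
class BankAccount {
    private double balance;

    void deposit(double amount) {
        if (amount <= 0) throw new IllegalArgumentException("deposit must be positive");
        balance += amount;
    }

    void withdraw(double amount) {
        if (amount <= 0) throw new IllegalArgumentException("withdrawal must be positive");
        if (amount > balance) throw new IllegalStateException("insufficient funds");
        balance -= amount;
    }

    double balance() { return balance; }
}

// Test client: a stand-alone driver that exercises the class through its
// public interface and reports pass/fail for each check.
public class BankAccountTestClient {
    public static void main(String[] args) {
        BankAccount account = new BankAccount();

        account.deposit(100.00);
        account.withdraw(40.00);
        check("balance after deposit and withdrawal", account.balance() == 60.00);

        boolean rejected = false;
        try {
            account.withdraw(500.00);   // should be refused: insufficient funds
        } catch (IllegalStateException expected) {
            rejected = true;
        }
        check("overdraft is rejected", rejected);
    }

    private static void check(String name, boolean passed) {
        System.out.println((passed ? "PASS  " : "FAIL  ") + name);
    }
}
```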

XVI. Software Tools for Testing
Some testing is impractical or impossible without automation. While automated testing is clearly a direction that most organizations should pursue, it is necessary to examine the potential benefits and the problems associated with automating tests.
(a) Automated Testing Considerations
(b) Test Tools

XVII. Web-based Client/Server Testing
Testing in the C/S and web environments adds new dimensions to traditional software quality assurance. Many methodologies may be used to identify testable conditions in software, but new approaches are required to address the database and communication issues of C/S and web-based testing.
(a) Web-based Testing — Where to Begin?
(b) What Will Be New?
(c) Determining What to Test
(d) Where to Test: Client-side or Server-side?
(e) Web Testing Responsibilities
(f) Web Testing Checklist
(g) Agile Methodology and Testing
  - Agile Manifesto
  - What Changes with Agile?
  - Agile Principles
(h) eXtreme Programming
  - Productivity Measure: Velocity
  - XP Basic Rules and Definitions
  - Testing in XP Shops
  - Basic XP Practices

XVIII. Systems Architecture
The architecture of our systems provides insight into what needs to be tested, and a systematic approach to testing these configurations is imperative. This section focuses on systematic approaches to testing each configuration.
(a) Test GUIs/APIs
(b) Local (One Node) Testing
(c) Individual Client/Server
(d) One Client/One Server
(e) Two Clients/Two Servers
(f) Multiple Clients/Multiple Servers

XIX. Cross Functional Analysis
Failure of interactions among features is a major source of system failures, so identifying the key interactions, and a strategy for testing them, is paramount for the organization. This section focuses on testing timing, feature impacts, and interdependencies.

XX. Database Integrity Testing
Inconsistencies in our data lead to defects in our applications that are not actually caused by the application code but by the data itself. This chapter highlights effective ways of providing verification and consistency checks for application data (a short sketch contrasting the two styles of checking follows this topic list).
(a) Declarative Checking
(b) Procedural Checking
(c) Checklists
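
The sketch below contrasts the two styles of checking named above. The orders table, its columns, and the in-memory H2 connection string are all assumptions made for this illustration: a declarative CHECK constraint lets the database refuse invalid rows up front, while a procedural query hunts for inconsistencies already present in the data.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class OrderDataIntegrityCheck {
    public static void main(String[] args) throws SQLException {
        // Assumes an in-memory H2 database on the classpath; any JDBC source would do.
        try (Connection conn = DriverManager.getConnection("jdbc:h2:mem:demo");
             Statement stmt = conn.createStatement()) {

            // Declarative checking: the database itself refuses inconsistent rows.
            stmt.execute(
                "CREATE TABLE orders (" +
                "  order_id   INT PRIMARY KEY," +
                "  quantity   INT NOT NULL CHECK (quantity > 0)," +
                "  unit_price DECIMAL(10,2) NOT NULL CHECK (unit_price >= 0)," +
                "  total      DECIMAL(10,2) NOT NULL)");

            stmt.execute("INSERT INTO orders VALUES (1, 2, 10.00, 20.00)");
            stmt.execute("INSERT INTO orders VALUES (2, 3, 5.00, 99.00)"); // wrong total

            // Procedural checking: a query (or program) searches for rows whose
            // derived values disagree with the stored ones.
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT order_id FROM orders WHERE total <> quantity * unit_price")) {
                while (rs.next()) {
                    System.out.println("Inconsistent total on order " + rs.getInt("order_id"));
                }
            }
        }
    }
}
```

The declarative constraints prevent bad rows from being inserted in the future, while the procedural check audits data that is already there; most database integrity testing needs both.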

XXI. Getting Started
Theory, concepts, and experience do not always point us in the correct direction for future application testing efforts. Every organization must develop a strategy to start the process and then refine its techniques. The Software Engineering Institute's Capability Maturity Model offers significant guidance for developing and testing systems.
(a) Capability Maturity Model
(b) CMM Level 1 — Initial
(c) CMM Level 2 — Repeatable
(d) CMM Level 3 — Defined
(e) CMM Level 4 — Managed
(f) CMM Level 5 — Optimizing

XXII. Appendix
(a) Unit Testing — A Very Short Parable
(b) Obtaining a Quality Commitment from Management
(c) Defect Classifications
(d) Severity Levels
(e) Date Testing
(f) Increasing Productivity
(g) Testing Environment
(h) System Testing Without a Specification
(i) Job Responsibilities of System Testers
(j) Web Sites

XXIII. Glossary


