• Reading Notes: Software Testing (by Ron Patton)


    Software Testing

    • Part I:The Big Picture
      • 1.Software Testing Background
        • Bug's formal definition

          • 1.The software doesn't do something that the product specification says it should do.

            2.The software does something that the product specification says it shouldn't do.

            3.The software does something that the product specification doesn't mention.

            4.The software doesn't do something that the product specification doesn't mention but should.

            5.The software is difficult to understand,hard to use,slow,
            or in the software tester's eyes will be viewed by the end user as just plain not right.

          The sources of Bugs

          • 1.Specification
            • The spec isn't written
            • The spec isn't thorough enough
            • The spec is constantly changing
            • Not communicated well to the entire development team
          • 2.Design
            • It's rushed
            • It's changed
            • Not well communicated
          • 3.Code
            • The Software's complexity
            • Poor documentation
            • Schedule pressure
            • Plain dumb mistakes
          • 4.Other
            • False positives
            • Duplicate bugs
            • Testing errors
            • etc.

          The costs increase tenfold as time increases

          The goal of a software tester is to find bugs,
          find them as early as possible,
          and make sure they get fixed

      • 2.Software Development Process
        • Hidden efforts
          • Customer Requirements

            Specification

            Schedules,such as the Gantt chart

            Software Design Documents,

            to plan and organize the code that is to be written 

            Test Documents

            • Test plan
            • Test cases
            • Bug reports
            • Test tools and automation
            • Metrics,statistics,summaries
        • Lifecycle models
          • Big-Bang
            • Its only virtue is simplicity
          • Code-and-Fix
            • A good introduction to software development
          • Waterfall
            • Three important things
              • 1.There's a large emphasis on specifying what the product will be
              • 2.The steps are discrete;there's no overlap
              • 3.There's no way to back up.
            • Disadvantage:Testing occurs only at the end
          • Spiral
            • Steps
              • 1.Determine objectives,alternatives and constraints
              • 2.Identify and resolve risks
              • 3.Evaluate alternatives
              • 4.Develop and test the current level
              • 5.Plan the next level
              • 6.Decide on the approach for the next level
            • Virtue:lower costs and finding problems earlier
      • 3.The Realities of Software Testing
        • Testing Axioms
          • It's impossible to test a program completely
          • Software testing is a risk-based exercise
          • Testing can't show that bugs don't exist
          • The more bugs you find,the more bugs there are
          • The pesticide paradox
            • The more you test software,the more immune it becomes to your tests
          • Not all the bugs you find will be fixed
            • There's not enough time
            • It's really not a bug
            • It's too risky to fix
            • It's just not worth it
          • When a bug's a bug is difficult to say
          • Product specifications are never final
          • Software testers aren't the most popular members of a project team
            • Find bugs early
            • Temper your enthusiasm
            • Don't just report bad news
          • Software testing is a disciplined technical profession
    • Part II:Testing Fundamentals
      • 4.Examining the specification
        • High-Level Review
          • Pretend to be the customer

            • Don't forget about software security

            Research existing standards and guidelines

            • Corporate Terminology and Conventions
            • Industry Requirements
            • Government Standards
            • Graphical User Interface(GUI)
            • Security Standards

            Review and test similar software:

            Scale,Complexity,Testability,Quality/Reliability,Security

        • Low-Level Review
          • Attributes checklist(Flush out oversights and omissions):

            Complete,Accurate,Precise and clear,Consistent,Relevant,Feasible,Code-free,Testable

            Terminology checklist:

            Help assure that all the details are defined

            • Always,every,all,none,never
            • Certainly,therefore,clearly,obviously,evidently
            • Some,sometimes,often,usually,ordinarily,customarily,most,mostly
            • Etc.,and so forth,and so on,such as
            • Good,fast,cheap,efficient,small,stable
            • Handled,processed,rejected,skipped,eliminated
            • If...then..(but missing else)
      • 5.Black-Box Testing
        • Dynamic Black-Box Testing
          • Testing without knowing exactly how the software works internally
          • Entering inputs,receiving outputs and checking the results according to the specification
        • Test-to-Pass and Test-to-Fail
        • Equivalence Partitioning
          • Similar inputs,similar outputs and similar operation
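
          A minimal sketch of equivalence partitioning (the validate_percent function is hypothetical, not from the book): inputs in the same partition should behave the same, so one representative value per class is enough.

```python
def validate_percent(value):
    """Accept an integer score from 0 to 100 inclusive."""
    if not isinstance(value, int):
        return False
    return 0 <= value <= 100

# One representative input per equivalence class:
partitions = {
    "below range": -5,       # invalid: negative
    "in range": 50,          # valid: typical value
    "above range": 150,      # invalid: too large
    "wrong type": "fifty",   # invalid: not an integer
}

expected = {"below range": False, "in range": True,
            "above range": False, "wrong type": False}

results = {name: validate_percent(v) for name, v in partitions.items()}
```
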
        • Data Testing
          • Buffer overruns are the number one cause of software security issues;they are caused by boundary condition bugs

            Sub-boundary conditions:Powers-of-Two

            Null:default,empty,blank,null,zero,and none

            Bad data:invalid,wrong,incorrect,and garbage data
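
            A boundary-value sketch of the conditions above (the 1..255 field and helper names are hypothetical): test just below, on, and just above each edge, plus the powers-of-two sub-boundaries (127/128) the chapter calls out.

```python
def boundary_values(lo, hi):
    """Return the classic boundary test inputs for an inclusive range."""
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

def accepts(n, lo=1, hi=255):
    """Stand-in for the software's range check on the field under test."""
    return lo <= n <= hi

# Edge values plus the powers-of-two sub-boundary (128 = 2**7):
cases = boundary_values(1, 255) + [127, 128]
results = [(n, accepts(n)) for n in cases]
```
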

        • State Testing
          • One side of software is the data:the numbers,words,inputs,outputs

            The other side is verifying the program's logic flow through its various states.

            Test-to-pass:State transition map

            • The following items:
              • Each unique state that the software can be in
              • The input or condition that takes it from one state to the next
              • Set condition and produced output when a state is entered or exited
            • Reducing the number of states and transitions to test
              • Visit each state at least once
              • Test the state-to-state transitions that look like the most common or popular
              • Test the least common paths between states
              • Test all the error states and returning from the error states
              • Test random state transitions
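
            The map can be sketched as a table of (state, input) pairs (the document-editor states are illustrative, not the book's example): every state and transition can then be visited systematically.

```python
# (state, input) -> next state; anything not listed leaves the state unchanged.
transitions = {
    ("closed", "open"): "viewing",
    ("viewing", "edit"): "editing",
    ("editing", "save"): "viewing",
    ("viewing", "close"): "closed",
}

def step(state, event):
    """Apply one input/condition to the current state."""
    return transitions.get((state, event), state)

# Test-to-pass: walk a common path that visits each state at least once.
state = "closed"
path = []
for event in ["open", "edit", "save", "close"]:
    state = step(state, event)
    path.append(state)
```
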

            Test-to-fail:Testing states to fail

            • Race conditions and bad timing
            • Repetition testing
              • The main reason is to look for memory leaks
            • Stress testing
              • Look at the software and determine what external resources and dependencies it has
            • Load testing
              • Don't forget about time as a load testing variable
            • Other Black-Box test techniques
              • Behave like a dumb user
              • Look for bugs where you've already found them
              • Think like a hacker
              • Follow experience,intuition,and hunches
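
            A repetition-testing sketch for the memory-leak point above (leaky_append is a deliberately leaky, hypothetical function): run the same operation many times and check whether traced memory keeps growing.

```python
import tracemalloc

cache = []

def leaky_append(data):
    cache.append(data)            # deliberately retains every input

tracemalloc.start()
before, _ = tracemalloc.get_traced_memory()
for _ in range(1000):
    leaky_append(bytes(1024))     # repeat the operation under test
after, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()

# Roughly 1 MB retained after 1000 repetitions signals a leak.
grew = (after - before) > 500_000
```
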
      • 6.Examining the Code:white-box testing
        • reason

          • Obvious reason:to find bugs early

            Other reason:gives the team's black-box testers ideas for test cases to apply when they receive the software for testing

          Formal review(structural analysis):

          Static white-box testing

          • Essential elements
            • Identify problems
            • Follow rules
            • Prepare
            • Write a report
          • Indirect results:communications,quality,team camaraderie,solutions
          • The method
            • Peer reviews:the easiest way
            • Walkthroughs:having at least one senior programmer as a reviewer
            • Inspections:the presenter or reader isn't the original programmer
            • Check the coding standards and guidelines
          • Generic code review checklist
            • Data reference errors:the primary cause of buffer overrun

              Caused by using a variable,constant,array,string,or record that hasn't been properly declared or initialized for how it's being used and referenced

              Data declaration errors:

              Caused by improperly declaring or using variables or constants.

              Computation errors:math

              Comparison errors:Susceptible to boundary condition problems

              Control flow errors:The result of loops and other control constructs in the language not behaving as expected

              Subroutine parameter errors:incorrect passing of data

              Other:languages,Portability,Compatibility,'warning' or 'informational' messages

          7.Dynamic white-box testing(structural testing):

          Seeing what the code does,directly testing and controlling the software

          • Unit testing
            • Bottom-up----test driver    

              Top-down----test stub

              Data coverage:

              Data flow,Sub-boundaries,Formulas and Equations,Error forcing

              Code coverage:

              Statement coverage,Path testing(branch coverage),Condition coverage

          • Integration testing
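
            The driver/stub distinction above can be sketched like this (the pricing functions are hypothetical): a bottom-up test driver calls a finished low-level unit directly, while a top-down test stub stands in for a unit that isn't written yet.

```python
def apply_discount(price, percent):
    """Low-level unit under test."""
    return round(price * (1 - percent / 100), 2)

def tax_stub(price):
    """Stub: fakes the not-yet-written tax unit (pretend tax is zero)."""
    return price

def checkout_total(price, percent, tax=tax_stub):
    """High-level unit; tested top-down with the stub plugged in."""
    return tax(apply_discount(price, percent))

# Bottom-up driver: feed the unit known inputs and check the outputs.
driver_result = apply_discount(100.0, 25)   # expect 75.0
stubbed_total = checkout_total(100.0, 25)   # stub keeps tax out of the test
```
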
    • Part III:Applying Your Testing Skills
      • 8.Configuration Testing

        • The different configuration possibilities:

          The PC,Components,Peripherals,Interfaces,Options and memory,Device drivers

          Decision-making process

          • 1.Decide the types of hardware you'll need
          • 2.Decide what hardware brands,models,and device drivers are available
          • 3.Decide which hardware features,modes,and options are possible
          • 4.Pare down the identified hardware configurations to a manageable set
          • 5.Identify your software's unique features that work with the hardware configurations
          • 6.Design the test cases to run on each configuration
          • 7.Execute the testing and rerun until the results satisfy your team
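
          Step 4 is where the combinatorics bite; a sketch with hypothetical hardware lists shows why the full cross-product must be pared down to a manageable set.

```python
from itertools import product

printers = ["LaserJet", "DeskJet", "Stylus"]
drivers = ["v1.0", "v2.0"]
modes = ["draft", "normal", "best"]

# Every combination of hardware, driver, and mode: 3 * 2 * 3 = 18 configs.
all_configs = list(product(printers, drivers, modes))

# Crude paring for illustration: keep every third combination as the test set.
# Real projects pare by popularity, risk, or pairwise coverage instead.
pared = all_configs[::3]
```
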

        9.Compatibility Testing

        • The job:checking that your software interacts with and shares information correctly with other software

          The goal:to make sure that this interaction works as users would expect

          1.Platform and application versions

          • Backward and forward compatibility

            The impact of testing multiple versions:

            Popularity,Age,Type,Manufacturer

          2.Standards and guidelines

          • High-level:Guide your product's general operation
          • Low-level:The nitty-gritty details

          3.Data sharing compatibility

          • File save and file load
          • File export and file import
          • Cut,copy,and paste
          • DDE,COM,and OLE

        10.Localization Testing/Internationalization Testing
        (Foreign-Language Testing)

        • Translation Issues

          • Text expansion

            • A good rule:expect up to a 100 percent increase in the size of individual words on a button

            ASCII,DBCS,and Unicode

            Hot keys and shortcuts

            Extended characters

            • to look for all the places that your software can accept character input or send output

            Computations on characters

            • word sorting
            • uppercase and lowercase conversion
            • Spellchecking ,etc.

            Reading left to right and right to left

            Text in graphics

            Keep the text out of the code:

            all text strings, error messages, and really anything that could possibly be translated should be stored in a separate file independent of the source code
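
          One way to sketch this (a plain lookup table, not a specific i18n library): every user-visible string lives in a per-language table keyed by message ID, so translators never touch the source code.

```python
# Hypothetical message catalog; in practice this would live in separate files.
MESSAGES = {
    "en": {"file_missing": "The file could not be found.",
           "save_ok": "Saved."},
    "de": {"file_missing": "Die Datei wurde nicht gefunden.",
           "save_ok": "Gespeichert."},
}

def message(key, lang="en"):
    """Look up a string by ID, falling back to English if untranslated."""
    return MESSAGES.get(lang, {}).get(key) or MESSAGES["en"][key]
```
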

          Localization Issues(native culture):

          Content,Data formats(formats for data units)

          Configuration and Compatibility Issues

          • Foreign platform configurations
            • Keyboards:the most language-dependent piece of hardware
            • Print,Paper sizes
            • Communication protocol
          • Data compatibility

        11.Usability Testing:

        How appropriate, functional, and effective the user's interaction with the software is

        • GUI testing:

          Graphical user interfaces

          • Important traits:Follows standards and guidelines,Intuitive,Consistent,Flexible,Comfortable,Correct,Useful

          Accessibility Testing:

          For the disabled

        12.Testing the Documentation

        • Packaging text and graphics

          Marketing material,ads,and other inserts

          Warranty/registration

          EULA:End User License Agreement

          Labels and stickers:
          the box,printed material,serial number stickers and labels that seal the EULA envelope

          Installation and setup instructions,User's manual,Online help

          Tutorials,wizards,and CBT(Computer based training)

          Samples,examples,and templates

          Error messages

        13.Testing for Software Security

        • It is a test-to-fail activity

          Buffer Overrun

          • Buffer overruns caused by improper handling of strings are by far the most common coding error
          • Using safe string functions

          Latent data:

          Data that "stays around" and isn't deleted from user to user

          • RAM slack
          • Disk slack

        14.Website Testing

        • Black-Box Testing

          • Text:the regular text and the text contained in graphics, scrolling marquees, forms, and so on

            Don't forget the text layout issues

            Hyperlinks:Look for orphan pages

            Graphics:make sure that the text wraps properly around the graphics

            Forms:the text boxes, list boxes, and other fields for entering or selecting information

            Objects and other simple miscellaneous functionality
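
            An orphan page exists on the site but is linked from nowhere; a sketch with a hypothetical site map (no network access) shows the check as a walk of the link graph from the home page.

```python
site_pages = {"index.html", "about.html", "old_promo.html"}
links = {                       # page -> pages it links to
    "index.html": {"about.html"},
    "about.html": {"index.html"},
    "old_promo.html": set(),    # nothing links *to* this page
}

def reachable(start="index.html"):
    """Return every page reachable by following hyperlinks from start."""
    seen, stack = set(), [start]
    while stack:
        page = stack.pop()
        if page not in seen:
            seen.add(page)
            stack.extend(links.get(page, ()))
    return seen

orphans = site_pages - reachable()
```
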

          Gray-Box Testing:

          HTML and web pages

          White-Box Testing

          • Dynamic content:

            such as the time

            Database-Driven web pages:

            Such as the inventories of e-commerce web pages

            Programmatically created web pages

            Server performance and loading

            Security

          Configuration and Compatibility Testing

          • Hardware platform,browser software and version
          • Browser plug-ins,Options
          • Video resolution and color depth
          • Text size
          • Modem speeds

          Usability Testing

          • Gratuitous Use of Bleeding-Edge Technology
          • Scrolling Text, Marquees, and Constantly Running Animations
          • Long Scrolling Pages
          • Non-Standard Link Colors
          • Outdated Information
          • Overly Long Download Times
          • Lack of Navigation Support
          • Orphan Pages
          • Complex Website Addresses (URLs)
          • Using Frames
    • Part IV:Supplementing Your Testing
      • 15.Automated Testing and Test Tools
        • The Benefits

          • Speed,Efficiency,Accuracy and precision,

            Resource reduction,Simulation and emulation,Relentlessness

          Test Tools:

          Non-invasive and invasive

          • Viewers and monitors
          • Drivers and stubs
          • Stress and load tools
          • Interference Injectors and Noise Generators
          • Analysis Tools

          Test Automation

          • Macro Recording and Playback

            • The biggest problem is lack of verification

              Playback speed can be another difficulty with macros

              Setting the playback position to be relative to the program's window
              rather than absolute to the screen can help

            Programmed Macros

            • Can pause their execution to prompt the tester with an expected result
              and a query for her to okay whether the test passed or failed

              Can also solve many timing problems of recorded macros by
              waiting for certain conditions to occur before they go on

              Defect:lack of verification,can only loop and repeat

            Fully Programmable Automated Testing Tools:

            Have the ability to perform verification

            Important issues

            • The software changes

              There's no substitute for the human eye and intuition

              Verification is hard to do

              It's easy to rely on automation too much

              Don't spend so much time working on tools and automation that you fail to test the software

              Some tools are invasive and can cause the software being tested to improperly fail
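
            What separates a fully programmable tool from a recorded macro is verification; a sketch (the calculator function stands in for the application under test — no real automation tool is assumed):

```python
def calculator(keys):
    """Stand-in for the application under test: evaluates '1+2=' keystrokes."""
    a, op, b = int(keys[0]), keys[1], int(keys[2])
    return a + b if op == "+" else a - b

def run_case(keys, expected):
    """Perform the action AND verify the actual result against the expected one."""
    actual = calculator(keys)
    return {"input": keys, "expected": expected,
            "actual": actual, "passed": actual == expected}

# The last case has a deliberately wrong expectation to show verification firing.
log = [run_case("1+2=", 3), run_case("5-9=", -4), run_case("2+2=", 5)]
failures = [c for c in log if not c["passed"]]
```
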

      • 16.Bug Bashes and Beta Testing:omit
    • Part V:Working with Test Documentation
      • 17.Test Plan

        • The Goal
          • To prescribe the scope, approach, resources, and schedule of the testing activities.
            To identify the items being tested, the features to be tested, the testing tasks to be performed,
            the personnel responsible for each task, and the risks associated with the plan

        • The Topics
          • High-Level expectations,People-Places-and Things,Definitions,
            Inter-group responsibilities,What will and won't be tested,Test phases,
            Test strategy,Resource requirements,Tester assignments,Test schedule,
            Test cases,Bug reporting,Metrics and statistics,Risks and issues

        18.Test Cases:

        organization, repeatability, tracking, and proof

        • Test Design:

          Identifiers,Features to be tested,Approach,Test case identification,Pass/fail criteria

          Test Cases:

          Identifiers,Test item,Input specification,Output specification,Environmental needs,Special procedural requirements,Intercase dependencies

          Test Procedures:

          Identifier,Purpose,Special requirements,Procedure steps

          Test Case Organization and Tracking
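
          The test-case fields above map naturally onto a structured record (the login case is hypothetical; the field names follow the list above), which is what gives the organization, repeatability, tracking, and proof the chapter asks for.

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    identifier: str
    test_item: str
    input_spec: str
    output_spec: str
    environment: str = "default"
    dependencies: list = field(default_factory=list)  # intercase dependencies

case = TestCase(
    identifier="TC-042",
    test_item="login form",
    input_spec="valid user name, empty password",
    output_spec="error message, no session created",
    dependencies=["TC-041"],
)
```
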

        19.Test Report

        • The Reasons for not fixing a bug
          • There's not enough time
          • It's really not a bug
          • It's too risky to fix
          • It's just not worth it
          • Ineffective bug reporting
        • Fundamental Principles
          • Report bugs as soon as possible

            Effectively describe the bugs:

            Minimal,Singular,Obvious and general,Reproducible

            Be non-judgmental in reporting bugs

            Follow up on your bug reports

        • Bugs
          • Severity

            • 1.System crash, data loss, data corruption, security breach  
            • 2.Operational error, wrong result, loss of functionality  
            • 3.Minor problem, misspelling, UI layout, rare occurrence  
            • 4.Suggestion  

            Priority

            • 1.Immediate fix, blocks further testing, very visible  
            • 2.Must fix before the product is released  
            • 3.Should fix when time permits  
            • 4.Would like to fix but the product can be released as is  

            Life cycle:

            New,Open,Review,Fixed,Closed,Rejected,Reopen,Deferred
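
            The life cycle above can be sketched as a transition table (the allowed moves here are an illustrative subset, not the book's full workflow), letting a tracker reject illegal moves such as closing a bug that was never reviewed.

```python
# state -> states it may legally move to (illustrative subset)
ALLOWED = {
    "New": {"Open", "Rejected"},
    "Open": {"Fixed", "Deferred", "Rejected"},
    "Fixed": {"Review"},
    "Review": {"Closed", "Reopen"},
    "Reopen": {"Open"},
    "Closed": {"Reopen"},
    "Rejected": set(),
    "Deferred": {"Open"},
}

def move(state, new_state):
    """Return the new state, or raise if the transition isn't allowed."""
    if new_state not in ALLOWED[state]:
        raise ValueError(f"illegal transition {state} -> {new_state}")
    return new_state
```
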

        20.Measuring Your Success:omit
