Software Testing Concepts (Manual Testing)

Definitions

Project: A project is something developed based on a particular customer's requirements and used by that customer only.
Product: A product is something developed based on the company's own specifications and used by multiple customers.
Quality: Quality is defined as not only the satisfaction of the requirements but also the presence of value (user friendliness).
Defect: A defect is defined as a deviation from the requirements.
Testing: Testing is a process in which defects are identified, isolated (separated), submitted for rectification, and it is ensured that the product is defect free, in order to produce a quality product in the end and hence customer satisfaction. (Or) Testing is the process of executing a program with the intent of finding errors. (Or) Verifying and validating the application with respect to the customer requirements. (Or) Finding the differences between the customer-expected values and the actual values. (Or) Testing should also ensure that a quality product is delivered to the customer.

Process of Developing a Project in a Software Company

BIDDING THE PROJECT: Bidding the project is defined as request for proposal, estimation and sign-off.
KICK-OFF MEETING: It is an initial meeting conducted in the software company soon after the project is signed off, in order to discuss the overview of the project and to select a project manager for the project. Usually the High Level Manager, Project Manager, Technical Manager, Quality Managers, Test Leads and Project Leads will be involved in this meeting.

PIN (Project Initiation Note): The PIN is a mail prepared by the project manager and sent to the CEO of the software company in order to get permission to start the project development.

SDLC (Software Development Life Cycle)
It contains 6 phases.
1. INITIAL PHASE / REQUIREMENT PHASE
2. ANALYSIS PHASE
3. DESIGN PHASE
4. CODING PHASE
5. TESTING PHASE
6. DELIVERY AND MAINTENANCE PHASE

Initial Phase
Task: Interacting with the customer and gathering the requirements.
Roles: BA (Business Analyst), EM (Engagement Manager)
Process
First of all the Business Analyst will take an appointment with the customer, collect the templates from the company, meet the customer on the appointed date, gather the requirements with the support of the templates and come back to the company with a requirements document. Then the Engagement Manager will check for extra requirements; if at all he finds any extra requirements, he is responsible for the excess cost of the project. The Engagement Manager is also responsible for prototype demonstration in case of confusing requirements.

Template: It is defined as a pre-defined format with pre-defined fields used for preparing a document perfectly.

Prototype: It is a rough and rapidly developed model used for demonstrating to the client in order to gather clear requirements and to win the confidence of the customer.

Proof: The proof of this phase is the requirements document, which is also called by the following names:
FRS (Functional Requirement Specification)
BRS (Business Requirement Specification)
CRS (Client/Customer Requirement Specification)
URS (User Requirement Specification)
BDD (Business Design Document)
BD (Business Document)
Note: Some companies may keep the overall information in one document called the 'BRS' and the detailed information in another document called the 'FRS'. But most companies will maintain both kinds of information in a single document.

Analysis Phase
Tasks:
Feasibility study.
Tentative planning.
Technology selection.
Requirement analysis.
Roles: System Analyst (SA), Project Manager (PM), Team Manager (TM)
Process
(I) Feasibility study: It is a detailed study of the requirements in order to check whether all the requirements are possible or not.
(II) Tentative planning: The resource planning and the time planning are temporarily done in this section.
(III) Technology selection: The list of all the technologies that are to be used to accomplish the project successfully will be analyzed and listed out here in this section.
(IV) Requirement analysis: The list of all the requirements like human resources, hardware and software required to accomplish this project successfully will be clearly analyzed and listed out here in this section.
Proof
The proof of this phase is the SRS (Software Requirement Specification).
Design Phase
Tasks:
HLD (High Level Designing)
LLD (Low Level Designing)
Roles:
HLD is done by the CA (Chief Architect).
LLD is done by the TL (Technical Lead).
Process
The Chief Architect will divide the whole project into modules by drawing some diagrams, and the Technical Lead will divide each module into sub-modules by drawing some diagrams using UML (Unified Modeling Language). The Technical Lead will also prepare the pseudo code.
Pseudo Code: It is a set of English-like instructions used for guiding the developer to develop the actual code easily.
Module: A module is defined as a group of related functionalities used to perform a major task.
Proof
The proof of this phase is the TDD (Technical Design Document).
Coding Phase
Task: Programming / Coding.
Roles: Developers / Programmers.
Process
Developers will develop the actual source code by using the pseudo code and following coding standards like proper indentation, color coding, proper commenting, etc.
Proof
The proof of this phase is the SCD (Source Code Document).
Testing Phase
Task: Testing.
Roles: Test Engineer.
Process
First of all the Test Engineers will receive the requirement documents and review them in order to understand the requirements.
If at all they get any doubts while understanding the requirements, they will prepare a Review Report (RR) with the list of all the doubts.
Once the clarifications are given and the requirements are clearly understood, they will take the test case template and write the test cases.
Once the build is released they will execute the test cases.
After execution, if at all they find any defects, they will list them out in a defect profile document.
Then they will send the defect profile to the developers and wait for the next build.
Once the next build is released they will once again execute the test cases.
If they find any defects they will follow the above procedure again and again till the product is defect free.
Once they feel the product is defect free they will stop the process.
Proof
The proof of this phase is a Quality Product.

Test Case: A test case is an idea of a Test Engineer, based on the requirements, to test a particular feature.

Delivery and Maintenance Phase
Delivery
Task: Installing the application in the client environment.
Roles: Senior Test Engineers / Deployment Engineers.
Process
The Senior Test Engineers or Deployment Engineers will go to the client place and install the application into the client environment with the help of the guidelines provided in the deployment document.
Maintenance
After the delivery, if at all any problem occurs, then that will become a task. Based on the problem the corresponding role will be appointed, and that role will define the process and solve the problem.
Where exactly does testing come into the picture? How many sorts of testing are there?
There are two sorts of testing.
1. Unconventional testing
2. Conventional testing
Unconventional Testing: It is a sort of testing in which the quality assurance people will check each and every outcome document right from the initial phase of the SDLC.
Conventional Testing: It is a sort of testing in which the test engineers will test the application in the testing phase of the SDLC.

TESTING METHODOLOGY (OR) TESTING TECHNIQUES
There are 3 methods.
1. Black Box Testing.
2. White Box Testing.
3. Gray Box Testing.
1) Black Box Testing: It is a method of testing in which one will perform testing only on the functional part of an application without having any structural knowledge. Usually test engineers perform it.
2) White Box Testing (Or) Glass Box Testing (Or) Clear Box Testing: It is a method of testing in which one will perform testing on the structural part of an application. Usually developers or white box testers perform it.
3) Gray Box Testing: It is a method of testing in which one will perform testing on both the functional part as well as the structural part of an application.
Note: A test engineer with structural knowledge will perform gray box testing.

LEVELS OF TESTING
There are 5 levels of testing.
1) Unit level testing
2) Module level testing
3) Integration level testing
4) System level testing
5) User acceptance level testing
1) Unit level testing: If one performs testing on a unit then that level of testing is known as unit level testing. It is white box testing; usually developers perform it.
Unit: It is defined as the smallest part of an application.
2) Module level testing: If one performs testing on a module, that is known as module level testing. It is black box testing; usually test engineers perform it.
3) Integration level testing: Once the modules are developed, the developers will develop some interfaces and integrate the modules with the help of those interfaces, and while integrating they will check whether the interfaces are working fine or not. It is white box testing and usually developers or white box testers perform it. The developers will integrate the modules using any one of the following approaches.
i) Top Down Approach (TDA): In this approach the parent modules are developed first and then integrated with the child modules.
ii) Bottom Up Approach (BUA): In this approach the child modules are developed first and then integrated with the corresponding parent modules.
iii) Hybrid Approach: This approach is a mix of both the Top Down and Bottom Up approaches.
iv) Big Bang Approach: Once all the modules are ready, integrating them all at once is known as the Big Bang approach.
STUB: While integrating the modules in the Top Down approach, if at all any mandatory module is missing, then that module is replaced with a temporary program known as a STUB.
DRIVER: While integrating the modules in the Bottom Up approach, if at all any mandatory module is missing, then that module is replaced with a temporary program known as a DRIVER. (A small illustrative sketch of a stub and a driver follows at the end of this section.)
4) System level testing: Once the application is deployed into the environment, if one performs testing on the system it is known as system level testing. It is black box testing and is usually done by the test engineers. At this level many types of testing are done; some of those are System Integration Testing, Load Testing, Performance Testing, Stress Testing, etc.
5) User acceptance testing: The same system testing done in the presence of the user is known as user acceptance testing. It is black box testing, usually done by the test engineers.
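To make the STUB and DRIVER ideas concrete, here is a minimal Python sketch; the loan/interest module names are purely illustrative assumptions, not taken from these notes.

# --- Top Down: the parent module is ready, the child module is missing ---
def calculate_interest_stub(amount, months):
    """STUB: temporary stand-in for the missing child module.
    It returns a fixed dummy value so the parent can still be integrated and tested."""
    return 0.0

def loan_summary(amount, months):
    """Parent module under test; it calls the (stubbed) child module."""
    interest = calculate_interest_stub(amount, months)
    return {"amount": amount, "months": months, "interest": interest}

# --- Bottom Up: the child module is ready, the parent module is missing ---
def calculate_interest(amount, months, rate=0.01):
    """Real child module that is already developed."""
    return amount * rate * months

def driver_for_calculate_interest():
    """DRIVER: temporary caller that replaces the missing parent module,
    used only to exercise the child module during integration."""
    print(calculate_interest(10000, 12))

if __name__ == "__main__":
    print(loan_summary(10000, 12))   # parent integrated and tested with a STUB
    driver_for_calculate_interest()  # child exercised through a DRIVER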
ENVIRONMENT
An environment is a combination of 3 layers.
1. Presentation Layer.
2. Business Layer.
3. Database Layer.

Types of Environment
There are 4 types of environments.
1. Standalone Environment / One-Tier Architecture.
2. Client-Server Environment / Two-Tier Architecture.
3. Web Environment / Three-Tier Architecture.
4. Distributed Environment / N-Tier Architecture.
1) Standalone Environment (Or) One-Tier Architecture: This environment contains all the three layers, that is, the Presentation layer, Business layer and Database layer, in a single tier.
2) Client-Server Environment (Or) Two-Tier Architecture: In this environment two tiers will be there; one tier is for the client and the other tier is for the database server. The Presentation layer and the Business layer will be present in each and every client, and the database will be present in the database server.
3) Web Environment: In this environment three tiers will be there; the client resides in one tier, the application server resides in the middle tier and the database server resides in the last tier. Every client will have the presentation layer, the application server will have the business layer and the database server will have the database layer.
4) Distributed Environment: It is the same as the Web Environment, but the business logic is distributed among multiple application servers in order to distribute the load.
Web Server: It is software that provides web services to the client.
Application Server: It is a server that holds the business logic. Ex: Tomcat, WebLogic, WebSphere, etc.

SOFTWARE DEVELOPMENT MODELS
There are 6 models.
1. Waterfall Model (or) Sequential Model
2. Prototype Model
3. Evolutionary Model
4. Spiral Model
5. Fish Model
6. V-Model
1) Waterfall Model (or) Sequential Model
[Diagram: Waterfall Model — the phases flow one after another: Initial (requirement gathering → BRS), Analysis (system design → SRS), Design (software design → TDD, GUI), Coding (implementation), Testing (black box testing), Delivery & Maintenance (delivery to the client). The testing flow shown is Unit Test → UTR, Integration Test → ITR, Module Test → MTR, System Test → STR, UAT → UATR.]
Advantages: It is a simple and easy-to-maintain model, and project implementation is very easy.
Drawbacks: New changes cannot be incorporated in the middle of the project development.
2) Prototype Model
[Diagram: Prototype Model — unclear requirements, prototype (hardware and software prototypes), demo to the client, requirements are refined, client confirmation, BRS document baselined, SRS document baselined.]
Advantages: Whenever the customer is unclear about the requirements, this is the best model to gather clear requirements.
Drawbacks: It is not a complete model. It is a time-consuming model. The prototype has to be built at the company's cost. The user may stick to the prototype and limit his requirements.
3) Evolutionary Model
[Diagram: Evolutionary Model — initial requirements → development → application → user evaluation; if the user does not accept, feedback with new requirements goes back into development; once the user accepts, the application is baselined.]
Advantages: Whenever the customer keeps evolving the requirements, this is the best suitable model.
Drawbacks: Deadlines are not clearly defined, and project monitoring and maintenance are difficult.
4) Spiral Model
[Diagram: Spiral Model — each cycle covers: defining the objectives, workarounds and constraints; risk and root cause analysis, estimation and contingencies; implementation; and refining and planning for the next cycle.]
Advantages: This is the best-suited model for highly risk-based projects.
Drawbacks: It is a time-consuming and costly model, and project monitoring and maintenance are difficult.

5) Fish Model
Verification: Verification is a process of checking conducted on each and every role of an organization, right from the start of the process till the end of the process, in order to check whether they are doing their tasks in the right manner according to the guidelines or not. Usually the documents are verified in this process of checking.
Validation: Validation is a process of checking conducted on the developed product in order to check whether it is working according to the requirements or not.
[Diagram: Fish Model — the phases Requirements Gathering, Analysis, Design, Coding, Testing and Delivery & Maintenance run along the body of the fish. Verification covers the documents (BRS review, SRS review, TDD review for the HLD/LLD, and white box testing on the SCD); validation covers the software build (black box testing / system testing and software changes).]
Advantages: As both verification and validation are done, the outcome of the Fish Model is a quality product.
Drawbacks: It is a time-consuming and costly model.
6) V-Model
[Diagram: V-Model — the development activities (Initial & Analysis producing the BRS and SRS, Design & Coding producing the TDD and SCD, then the software build) are paired with verification and validation activities: preparing the project plan and the test plan, requirement phase testing, design phase testing, program phase testing, and then, under the test management process, system testing, user acceptance testing, port testing, software efficiency testing and software changes, through delivery and maintenance.]
Advantages: As the verification and validation are done along with test management, the outcome of the V-Model is a quality product.
Drawbacks: It is a time-consuming and costly model.
TYPES OF TESTING
There are 18 types of testing.
1. Build Verification Testing.
2. Regression Testing.
3. Re-Testing.
4. Alpha (α) Testing.
5. Beta (β) Testing.
6. Static Testing.
7. Dynamic Testing.
8. Installation Testing.
9. Compatibility Testing.
10. Monkey Testing.
11. Exploratory Testing.
12. Usability Testing.
13. End-To-End Testing.
14. Port Testing.
15. Reliability Testing.
16. Mutation Testing.
17. Security Testing.
18. Adhoc Testing.
1) Sanity Testing / Build Verification Testing / Build Acceptance Testing: It is a type of testing in which one will conduct overall testing on the released build in order to check whether it is proper for further detailed testing or not. Some companies even call it Sanity Testing and also Smoke Testing. But some companies will say that, just before the release of the build, the developers conduct the overall testing in order to check whether the build is proper for detailed testing or not, and that is known as Smoke Testing; once the build is released, the testers once again conduct the overall testing in order to check whether the build is proper for further detailed testing or not, and that is known as Sanity Testing.
2) Regression Testing: It is a type of testing in which one will perform testing on the already tested functionality again and again. This is usually done in two scenarios (situations).
Scenario 1: Whenever defects are raised by the Test Engineer and rectified by the developer, and the next build is released to the testing department, then the Test Engineer will test the defect functionality and its related functionalities once again.
Scenario 2: Whenever some new changes are requested by the customer, those new features are incorporated by the developers, and the next build is released to the testing department, then the test engineers will test the already tested functionalities related to the new features once again. That is also known as regression testing.
Note: Testing the new features for the first time is new testing but not regression testing.
3) Re-Testing: It is a type of testing in which one will perform testing on the same functionality again and again with multiple sets of data in order to come to a conclusion whether the functionality is working fine or not.
4) Alpha (α) Testing: It is a type of testing in which one (i.e., our Test Engineer) will perform user acceptance testing in our company in the presence of the customer.
Advantages: If at all any defects are found there is a chance of rectifying them immediately.
5) Beta (β) Testing: It is a type of testing in which either third-party testers or end users will perform user acceptance testing at the client place before the actual implementation.
6) Static Testing: It is a type of testing in which one will perform testing on an application or its related factors without performing any actions. Ex: GUI Testing, Document Testing, Code Reviewing, etc.
7) Dynamic Testing: It is a type of testing in which one will perform testing on the application by performing some actions. Ex: Functional Testing.
8) Installation Testing: It is a type of testing in which one will install the application into the environment by following the guidelines given in the deployment document; if the installation is successful then one will come to the conclusion that the guidelines are correct, otherwise the guidelines are not correct.
9) Compatibility Testing: It is a type of testing in which one may have to install the application into multiple environments prepared with different combinations of environmental components in order to check whether the application is suitable for those environments or not. This is usually done for products.
10) Monkey Testing: It is a type of testing in which one will perform some abnormal actions intentionally on the application in order to check its stability.
11) Exploratory Testing: It is a type of testing in which usually a domain expert will perform testing on the application by exploring the functionality in parallel, without having knowledge of the requirements.
12) Usability Testing: It is a type of testing in which one will concentrate on the user friendliness of the application.
13) End-To-End Testing: It is a type of testing in which one will perform testing on a complete transaction from one end to the other end.
14) Port Testing: It is a type of testing in which one will check whether the application is comfortable or not after deploying it into the original client's environment.
15) Reliability Testing (or) Soak Testing: It is a type of testing in which one will perform testing on the application continuously for a long period of time in order to check its stability.
16) Mutation Testing: It is a type of testing in which one will perform testing by making some changes. For example, usually the developers will make many changes to the program and check its behaviour; this is known as mutation testing.
17) Security Testing: It is a type of testing in which one will usually concentrate on the following areas (see the sketch after this list of testing types).
i) Authentication Testing.
ii) Direct URL Testing.
iii) Firewall Leakage Testing.
i) Authentication Testing: It is a type of testing in which a Test Engineer will enter different combinations of usernames and passwords in order to check whether only the authorized persons are able to access the application or not.
ii) Direct URL Testing: It is a type of testing in which a Test Engineer will specify the direct URLs of secured pages and check whether they can be accessed or not.
iii) Firewall Leakage Testing: It is a type of testing in which one will log in as one level of user and try to access the unauthorized pages of another level in order to check whether the firewall is working properly or not.
18) Adhoc Testing: It is a type of testing in which one will perform testing on the application in his own style after understanding the requirements clearly.
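As a rough illustration of the authentication and direct URL checks described under security testing, here is a Python sketch using the requests library; the URLs, credentials, page text and expected status codes are assumptions made for the example, not details from these notes.

import requests

BASE = "http://example.com/app"  # hypothetical application URL

def check_authentication(username, password, should_succeed):
    """Authentication testing: try one username/password combination and
    compare the outcome with what is expected for that combination."""
    resp = requests.post(BASE + "/login", data={"user": username, "pwd": password})
    logged_in = resp.status_code == 200 and "Welcome" in resp.text  # assumed success marker
    return "PASS" if logged_in == should_succeed else "FAIL"

def check_direct_url(secured_url):
    """Direct URL testing: a secured page requested without logging in should be
    refused or redirected to the login page, not served directly."""
    resp = requests.get(secured_url, allow_redirects=False)
    return "PASS" if resp.status_code in (301, 302, 401, 403) else "FAIL"

if __name__ == "__main__":
    print(check_authentication("validuser", "validpass", should_succeed=True))
    print(check_authentication("validuser", "wrongpass", should_succeed=False))
    print(check_direct_url(BASE + "/admin/reports"))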
SOFTWARE TESTING LIFE CYCLE
It contains 6 phases.
1. TEST PLANNING.
2. TEST DEVELOPMENT.
3. TEST EXECUTION.
4. RESULT ANALYSIS.
5. BUG TRACKING.
6. REPORTING.
1) TEST PLANNING
Plan: A plan is a strategic document which describes how to perform a task in an effective, efficient and optimized way.
Optimization: Optimization is the process of utilizing the input resources to the maximum, or reducing them where possible, while getting the maximum possible output.
Test Plan: It is a strategic document which describes how to perform testing on an application in an effective, efficient and optimized way. The Test Lead prepares the test plan.
CONTENTS OF THE TEST PLAN
1.0 INTRODUCTION.
1.1 Objective.
1.2 Reference Documents.
2.0 COVERAGE OF TESTING.
2.1 Features to be Tested.
2.2 Features not to be Tested.
3.0 TEST STRATEGY.
3.1 Levels of Testing.
3.2 Types of Testing.
3.3 Test Design Techniques.
3.4 Configuration Management.
3.5 Test Metrics.
3.6 Terminology.
3.7 Automation Plan.
3.8 List of Automated Tools.
4.0 BASE CRITERIA.
4.1 Acceptance Criteria.
4.2 Suspension Criteria.
5.0 TEST DELIVERABLES.
6.0 TEST ENVIRONMENT.
7.0 RESOURCE PLANNING.
8.0 SCHEDULING.
9.0 STAFFING AND TRAINING.
10.0 RISKS AND CONTINGENCIES.
11.0 ASSUMPTIONS.
12.0 APPROVAL INFORMATION.
1.0 INTRODUCTION
1.1 Objective: The main purpose of the document is clearly described here in this section.
1.2 Reference Documents: The list of all the documents that are referred to in order to prepare the test plan is listed out here in this section.
2.0 COVERAGE OF TESTING
2.1 Features To Be Tested: The list of all the features within the scope is mentioned here in this section.
2.2 Features Not To Be Tested: The list of all the features that are not planned for testing, based on the following criteria, is mentioned here in this section.
Out-of-scope features.
Low risk areas.
Future functionalities.
Features that are skipped based on time constraints.
3.0 TEST STRATEGY
Test Strategy: It is defined as an organization-level term which is used for testing all the projects in the organization.
Test Plan: It is defined as a project-level term which describes how to test a particular project in the organization.
Note: The test strategy is common for all the projects, but the test plan varies from project to project.
3.1 Levels of Testing: The list of all the levels of testing that are maintained in that company is listed out here in this section.
3.2 Types of Testing: The list of all the types of testing that are followed by that company is listed out here in this section.
3.3 Test Design Techniques: The list of all the techniques that are followed by that company during test case development is listed out here in this section. Ex: BVA (Boundary Value Analysis), ECP (Equivalence Class Partitioning).
3.4 Configuration Management
3.5 Test Metrics: The list of all the tasks that are measured and maintained in terms of metrics is clearly mentioned here in this section.
3.6 Terminology: The list of all the terms and their corresponding meanings is listed out here in this section.
3.7 Automation Plan: The list of all the areas that are planned for automation in that company is listed out here in this section.
3.8 List of Automated Tools: The list of all the automated tools that are used in that company is listed out here in this section.
4.0 BASE CRITERIA
4.1 Acceptance Criteria: When to stop testing in a full-fledged manner, thinking that enough testing has been done on the application, is clearly described here in this section.
4.2 Suspension Criteria: When to stop testing suddenly and suspend the build is clearly mentioned here in this section.
5.0 TEST DELIVERABLES: The list of all the documents that are to be prepared and delivered in the testing phase is listed out here in this section.
6.0 TEST ENVIRONMENT: The customer-specified environment that is to be used for testing is clearly described here in this section.
7.0 RESOURCE PLANNING: Who has to do what is clearly described here in this section.
8.0 SCHEDULING: The starting dates and the ending dates of each and every task are clearly described here in this section.
9.0 STAFFING AND TRAINING: How much staff is to be recruited and what kind of training is to be provided are clearly planned and mentioned here in this section.
10.0 RISKS AND CONTINGENCIES: The list of all the potential risks and the corresponding contingency plans is listed out here in this section.
Risks
1. Unable to deliver the software within the deadlines.
2. Employees may leave the organization in the middle of the project development.
3. The customer may impose the deadlines.
4. Unable to test all the features within the time.
5. Lack of expertise.
Contingencies
1. Ensure proper planning.
2. People need to be maintained on the bench.
3. What is not to be tested has to be planned properly.
4. Severity- and priority-based execution.
5. Proper training needs to be provided.
11.0 ASSUMPTIONS: The list of all the assumptions that are to be assumed by a test engineer is listed out here in this section.
12.0 APPROVAL INFORMATION: Who will approve what is clearly mentioned here in this section.
2. TEST DEVELOPMENT
TYPES OF TEST CASES
Test cases are broadly divided into two types.
1. GUI Test Cases.
2. Functional Test Cases.
Functional test cases are further divided into two types.
1. Positive Test Cases.
2. Negative Test Cases.
GUIDELINES TO PREPARE GUI TEST CASES:
1. Check for the availability of all the objects.
2. Check for the alignment of the objects, if at all the customer has specified such requirements.
3. Check for the consistency of all the objects.
4. Check the spelling and grammar.
Apart from these guidelines, anything we test without performing any action will fall under GUI test cases.
GUIDELINES FOR DEVELOPING POSITIVE TEST CASES:
1. A test engineer must have a positive mindset.
2. A test engineer should consider the positive flow of the application.
3. A test engineer should use valid inputs from the point of view of the functionality.
GUIDELINES FOR DEVELOPING NEGATIVE TEST CASES:
1. A test engineer must have a negative mindset.
2. He should consider the negative flow of the application.
3. He should use at least one invalid input in a set of data.
Test Case Template:
1. Test Objective:
2. Test Scenario:
3. Test Procedure:
4. Test Data:
5. Test Cases:
1. Test Objective: The purpose of the document is clearly described here in this section.
2. Test Scenarios: The list of all the situations that are to be tested is listed out here in this section.
3. Test Procedure: A test procedure is a functionality-level term which describes how to test the functionality. So in this section one will describe the plan for testing the functionality.
4. Test Data: The data that is required for testing is made available here in this section.
5. Test Cases: The list of all the detailed test cases is listed out here in this section.
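To picture how one test case from this template is filled in during the test execution and result analysis phases described next, here is a minimal Python sketch; the class name and the sample values are only illustrative assumptions.

from dataclasses import dataclass

@dataclass
class TestCase:
    test_case_id: int
    test_case_type: str      # "+ve" or "-ve"
    description: str
    expected_value: str
    actual_value: str = ""   # filled in during test execution
    result: str = ""         # filled in during result analysis

    def analyse(self):
        """Result analysis: pass if the expected and actual values match, otherwise fail."""
        self.result = "Pass" if self.expected_value == self.actual_value else "Fail"
        return self.result

tc = TestCase(1, "+ve", "Enter a valid e-mail id", "It should accept")
tc.actual_value = "It should accept"   # behaviour observed during execution
print(tc.test_case_id, tc.analyse())   # -> 1 Pass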
Note: Some companies maintain all the above five fields individually for each and every scenario, but some companies maintain them commonly for all the scenarios.
3. TEST EXECUTION
During the test execution phase the test engineer will do the following.
1. He will perform the action that is described in the description column.
2. He will observe the actual behavior of the application.
3. He will document the observed value under the actual value column.
4. RESULT ANALYSIS
In this phase the test engineer will compare the expected value with the actual value and mark the result as pass if both match, otherwise mark the result as fail.
5. BUG TRACKING
Bug tracking is a process in which the defects are identified, isolated and managed.
DEFECT PROFILE DOCUMENT
Defect ID: The sequence of defect numbers is listed out here in this section.
Steps to Reproduce: The list of all the steps that are followed by a test engineer to identify the defect is listed out here in this section.
Submitter: The name of the test engineer who submits the defect is mentioned here in this section.
Date of Submission: The date on which the defect is submitted is mentioned here in this section.
Version Number: The corresponding version number is mentioned here in this section.
Build Number: The corresponding build number is mentioned here in this section.
Assigned To: The project lead or development lead will mention the name of the developer to whom the defect is assigned.
Severity: How serious the defect is, is described in terms of severity. It is classified into 4 types.
1. FATAL
2. MAJOR
3. MINOR
4. SUGGESTION
These are also denoted as Sev1/Sev2/Sev3/Sev4, S1/S2/S3/S4, or simply 1/2/3/4.
FATAL: If at all the problems are related to navigational blocks or unavailability of functionality, then such types of problems are treated as FATAL defects.
Note: These are also called show-stopper defects.
MAJOR: If at all the problems are related to the working of the features, then such types of problems are treated as MAJOR defects.
MINOR: If at all the problems are related to the look and feel of the application, then such types of problems are treated as MINOR defects.
SUGGESTIONS: If at all the problems are related to the value of the application, then such types of problems are treated as SUGGESTIONS.
Priority: The sequence in which the defects have to be rectified is described in terms of priority. It is classified into 4 types.
1. CRITICAL
2. HIGH
3. MEDIUM
4. LOW
Usually the FATAL defects are given CRITICAL priority, MAJOR defects are given HIGH priority, MINOR defects are given MEDIUM priority and SUGGESTION defects are given LOW priority, but depending upon the situation the priority may be changed by the project lead or development lead (a small sketch of this default mapping follows the examples below).
Ex: Low Severity, High Priority Case: In the case of a customer visit, all the look and feel defects, which are usually less severe, are given the highest priority.
High Severity, Low Priority Case: If at all some part of the application is not available because it is still under development, the test engineer will treat that as a FATAL defect, but the development lead will give less priority to those defects.
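The default severity-to-priority mapping described above, together with the lead's option to override it, can be sketched as a small lookup; this is only an illustration, not a prescribed implementation.

DEFAULT_PRIORITY = {
    "FATAL": "CRITICAL",
    "MAJOR": "HIGH",
    "MINOR": "MEDIUM",
    "SUGGESTION": "LOW",
}

def assign_priority(severity, override=None):
    """The project lead or development lead may override the default priority,
    e.g. a MINOR look-and-feel defect raised to CRITICAL before a customer visit."""
    return override if override else DEFAULT_PRIORITY[severity]

print(assign_priority("MAJOR"))                       # HIGH
print(assign_priority("MINOR", override="CRITICAL"))  # low severity, high priority case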
BUG LIFE CYCLE
[Diagram: Bug Life Cycle — when testing of a build finds a problem its status is set to New/Open; the developer checks whether it is really a defect. If not, it may be marked as Testers Mistake, As Per Design, or put on Hold; if yes, it is rectified and marked Fixed for Verification. In the next build the tester verifies the fix: if it is really rectified the status is set to Closed and testing of that defect stops, otherwise it is Reopened and the cycle repeats.]
New / Open: Whenever the defect is found for the first time the test engineer will set the status as New / Open. But some companies will say to set the status as only New at this stage, and once the developers accept the defect they will set the status as Open.
Reopen and Closed: Once the defects are rectified by the developer and the next build is released to the testing department, the testers will check whether the defects are rectified properly or not. If they feel they are rectified, they will set the status as Closed; otherwise they will set the status as Reopen.
Fixed for Verification / Fixed / Rectified: Whenever the defects raised by the test engineer are accepted by the developers and rectified, they will set the status as Fixed.
Hold: Whenever the developer is confused whether to accept or reject the defect, he will set the status as Hold.
Testers Mistake / Testers Error / Rejected: Whenever the developer is confident that it is not at all a defect, he will set the status as Rejected.
As Per Design (this is a rare case): Whenever some new changes are incorporated without informing the test engineers, the test engineers will raise them as defects, but the developers will set the status as 'As Per Design'.
Error: It is a problem related to the program.
Defect: If the test engineer identifies a problem with respect to the functionality, then it is called a defect.
Bug: If the developer accepts the defect, it is called a bug.
Fault / Failure: If the customer identifies the problem after delivery, it is called a fault / failure.
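A rough Python sketch of the statuses described above and the transitions between them, simplified and purely for illustration:

# Simplified bug life cycle: allowed status transitions as described in these notes.
TRANSITIONS = {
    "New/Open": ["Fixed", "Rejected", "As Per Design", "Hold"],
    "Hold": ["Fixed", "Rejected"],
    "Fixed": ["Closed", "Reopen"],   # the tester verifies the fix in the next build
    "Reopen": ["Fixed"],
    "Closed": [],
    "Rejected": [],
    "As Per Design": [],
}

def move(current_status, new_status):
    """Change a defect's status only if the transition is allowed."""
    if new_status not in TRANSITIONS.get(current_status, []):
        raise ValueError(current_status + " -> " + new_status + " is not allowed")
    return new_status

status = "New/Open"
status = move(status, "Fixed")    # developer rectifies the defect
status = move(status, "Reopen")   # tester finds it is not really rectified
status = move(status, "Fixed")
status = move(status, "Closed")   # verified properly and closed
print(status)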
6. BUG REPORTING
1) Classical Bug Reporting Process:
[Diagram: the test engineers (TE1, TE2, TE3) report the defects by mail to the Test Lead, who consolidates them and mails them to the Project Lead, who then assigns them to the corresponding developers (Dev1, Dev2, Dev3).]
Drawbacks:
1. Time consuming.
2. Redundancy.
3. No security.
2) Common Repository Oriented Bug Reporting Process:
[Diagram: the test engineers (TE1, TE2, TE3) place the defect reports in a common repository; the Test Lead and Project Lead review them there, and the developers (Dev1, Dev2, Dev3) pick them up from the same repository.]
Drawbacks:
1. Time consuming.
2. Redundancy.
3) Bug Tracking Tool Oriented Bug Reporting Process:
[Diagram: the test engineers (TE1, TE2, TE3) log the defects in a Bug Tracking Tool (BTT); the Test Lead, Project Lead and developers (Dev1, Dev2, Dev3) all access the defects through the tool.]
Bug Tracking Tool: It is a software application that can be accessed only by authorized persons and is used for managing the complete bug tracking process by providing all the required facilities along with a defect profile template.
Note: At the end of the testing process usually the test lead will prepare the test summary report, which is also called the test closure report.
TEST DESIGN TECHNIQUES:
While developing the test cases, if at all the test engineer finds some areas complex, then to overcome that complexity the test engineer will use test design techniques. Generally two types of techniques are used in most of the companies.
1. Boundary Value Analysis (BVA).
2. Equivalence Class Partitioning (ECP).
1) Boundary Value Analysis (BVA): Whenever the test engineers need to develop test cases for a range kind of input, they will go for boundary value analysis, which says to concentrate on the boundaries of the range. Usually they test with the following values: LB-1, LB, LB+1, MV, UB-1, UB, UB+1 (lower bound, mid value, upper bound).
2) Equivalence Class Partitioning (ECP): Whenever the test engineer needs to develop test cases for a feature which has a large number of validations, one will go for equivalence class partitioning, which says to first divide the inputs into classes (valid and invalid) and then prepare the test cases.
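The boundary values listed above (LB-1, LB, LB+1, MV, UB-1, UB, UB+1) can be generated mechanically; the following tiny Python sketch only illustrates that idea, and the function name is an assumption.

def boundary_values(lb, ub):
    """Boundary Value Analysis: test just below, on and just above each boundary,
    plus a mid value."""
    mv = (lb + ub) // 2
    return [lb - 1, lb, lb + 1, mv, ub - 1, ub, ub + 1]

print(boundary_values(4, 20))   # -> [3, 4, 5, 12, 19, 20, 21] character lengths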
Ex: Develop the test cases for an E-Mail text box whose validations are as follows.
Requirements:
1. It should accept a minimum of 4 characters and a maximum of 20 characters.
2. It should accept only small (lowercase) characters.
3. It should accept only the @ and _ special symbols.
Boundary Value Analysis:
LB-1 = 3 ch, LB = 4 ch, LB+1 = 5 ch, MV = 12 ch, UB-1 = 19 ch, UB = 20 ch, UB+1 = 21 ch
Equivalence Class Partitioning (ECP):
Valid: 4 char, 5 char, 12 char, 19 char, 20 char, a–z, @, _
Invalid: 3 char, 21 char, A–Z, 0–9, all the special symbols apart from @ and _, alphanumeric, blank space, decimal numbers.
Test Case Document:
Test Case ID | Test Case Type | Description | Expected Value
1 | +ve | Enter the value as per the VIT | It should accept.
2 | -ve | Enter the value as per the IIT | It should not accept.

Valid Input Table (VIT):
Sl No | Input
1 | abcd
2 | ab@zx
3 | abcdabcd@ab_
4 | abcdabcddcbaaccd_@z
5 | abcdabcdabcdabcdz@_x
6 | abcdabcdabcdabcd_xyz

Invalid Input Table (IIT):
Sl No | Input
1 | abc
2 | ABCD
3 | ABCD123
4 | 12345.5
5 | abcd abcd abcd abcd
6 | abcdabcd-----abc*#)
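As a rough illustration of how the VIT and IIT inputs above could drive an automated check, here is a Python sketch; the validation function is a hypothetical implementation of the three stated requirements, not code from these notes.

import re

def is_valid_email_value(value):
    """Hypothetical check for the three requirements: 4 to 20 characters,
    lowercase letters only, with @ and _ as the only allowed special symbols."""
    return bool(re.fullmatch(r"[a-z@_]{4,20}", value))

valid_inputs = ["abcd", "ab@zx", "abcdabcd@ab_",
                "abcdabcddcbaaccd_@z", "abcdabcdabcdabcdz@_x"]
invalid_inputs = ["abc", "ABCD", "ABCD123", "12345.5",
                  "abcd abcd abcd abcd", "abcdabcd-----abc*#)"]

for value in valid_inputs:        # positive test cases (VIT)
    print("PASS" if is_valid_email_value(value) else "FAIL", repr(value))

for value in invalid_inputs:      # negative test cases (IIT)
    print("PASS" if not is_valid_email_value(value) else "FAIL", repr(value))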
-: The End :-