AGILE

AGILE: Agile is an iterative and incremental approach to planning and guiding the project process, where more focus is on working functionality than on comprehensive documentation.


PRINCIPLES:
  • Individuals and interactions over processes and tools.
  • Working software over comprehensive documentation.
  • Customer collaboration over contract negotiation.
  • Responding to change over following a plan.
TERMS IN AGILE:

1) User Story - Requirement

2) Epic - Collection of user stories/ Requirements

3) Product Backlog - SRS/BRD/ All requirements

4) Sprint - Time-boxed period in which a set of functionality is completed and released

5) Sprint Planning Meeting - First meeting of every sprint, where the user stories to be taken up are selected.

6) Sprint Backlog: Committed user stories

7) Scrum Meeting: Daily stand up calls for 15 minutes

8) Sprint Retrospective Meeting: Meeting at the end of the sprint to discuss what went well and what can be improved

9) Story point: Estimation of a user story using the Fibonacci series (1, 2, 3, 5, 8, ...)
Eg: 1 story point can map to 1 hr or 1 day, depending on the team's baseline

10) Sprint Review: Review of functionality delivered

11) Burndown chart : To track progress

12) Burnup chart: To track release progress and to estimate when MVP (Minimum Viable product) can be provided 

13) Scrum Board: Visual board tracking sprint work (e.g., To Do / In Progress / Done)

14) Ceremonies: Meetings
Eg: Sprint Planning Meeting/Scrum Meeting/Sprint retrospective meeting
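The burndown/burnup idea (terms 11 and 12 above) can be sketched as a small calculation. The sprint size and daily completion numbers below are invented for illustration:

```python
# Hypothetical sprint: 40 committed story points over a 10-day sprint.
TOTAL_POINTS = 40
SPRINT_DAYS = 10

def ideal_burndown(total_points, sprint_days):
    """Ideal remaining points at the end of each day (straight line to zero)."""
    return [round(total_points - total_points * day / sprint_days, 1)
            for day in range(sprint_days + 1)]

def remaining(total_points, completed_per_day):
    """Actual remaining points, given the points completed each day."""
    left, series = total_points, [total_points]
    for done in completed_per_day:
        left -= done
        series.append(left)
    return series

ideal = ideal_burndown(TOTAL_POINTS, SPRINT_DAYS)
actual = remaining(TOTAL_POINTS, [3, 5, 4, 0, 6, 5, 4, 5, 4, 4])
```

Plotting `ideal` against `actual` gives the burndown chart; plotting cumulative completed work against total scope gives the burnup chart.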

-------------------------------------------------------------------------------------------------

Scrum is an agile development methodology/framework used in the development of software, based on iterative and incremental processes. Scrum is an adaptable, fast, flexible and effective agile framework designed to deliver value to the customer throughout the development of the project.

It ensures that Agile principles are followed.



Extreme Programming (XP) is an agile software development framework that aims to produce higher quality software, and a higher quality of life for the development team.




A Kanban board is an agile project management tool designed to help visualize work, limit work-in-progress, and maximize efficiency.

- Less structured than Scrum
- Focus on limiting Work In Progress



Feature      | Agile (e.g., Scrum)                 | Kanban
------------ | ----------------------------------- | -----------------------------
Approach     | Iterative, time-boxed sprints       | Continuous flow
Roles        | Defined roles (e.g., Scrum Master)  | No required roles
Planning     | Sprint planning at iteration start  | Ongoing, as work is pulled
Delivery     | At end of each sprint               | Continuous, as tasks complete
Board Usage  | Optional (Scrum board)              | Essential (Kanban board)
WIP Limits   | Not mandatory                       | Core principle
Ceremonies   | Required (e.g., stand-ups, reviews) | Optional
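The WIP-limit principle in the table can be sketched as a minimal board. The column names and limits below are illustrative, not taken from any real tool:

```python
# Minimal Kanban board sketch: columns with work-in-progress (WIP) limits.
class KanbanBoard:
    def __init__(self, wip_limits):
        self.wip_limits = wip_limits                 # e.g. {"In Progress": 2}
        self.columns = {name: [] for name in wip_limits}

    def pull(self, task, column):
        """Pull a task into a column only if its WIP limit allows it."""
        if len(self.columns[column]) >= self.wip_limits[column]:
            return False                             # limit reached, task must wait
        self.columns[column].append(task)
        return True

board = KanbanBoard({"To Do": 10, "In Progress": 2, "Done": 100})
```

Pulling a third task into "In Progress" fails until one of the two in-flight tasks moves on, which is exactly the flow-control Kanban is built around.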


------------------------------------------------------------------------------------------------------------------

SCRUM TEAM


The product owner represents the stakeholders of the project. The role is primarily responsible for setting the direction for product development or project progress.

The key responsibilities of a Product Owner include:
  • Product backlog management
  • Release management
  • Stakeholder management

The role of Scrum Master:

  • Facilitating the daily Scrum and Sprint initiatives.
  • Communicating between team members about evolving requirements and planning.
  • Coaching team members on delivering results.
  • Handling administrative tasks such as conducting meetings, facilitating collaboration, and eliminating hurdles affecting project progress.
  • Shielding team members from external interferences and distractions.
DEVELOPMENT TEAM MEMBERS:
  • Product designer
  • Writer
  • Programmer
  • Tester
  • UX specialist

STAKEHOLDERS:
  • The end user of the product
  • Business executives
  • Production support staff
  • Investors
  • External auditors
  • Scrum team members from associated projects and teams

ADDITIONAL ROLES:
  • Technical and domain experts with the knowledge of technology as well as a wide variety of stakeholder requirements or expectations.
  • An independent testing and audit team may join the Scrum team members and work throughout the product development lifecycle.
  • An Integrator may be required among large teams that work on independent but closely coordinated subsystems for a project. The responsibility for the Integrator would include integration of the subsystems as well as testing that may be performed by external testing teams.
  • An Architect Owner may be required for architectural envisioning, planning and decision making.
-------------------------------------------------------------------------------------------------
VELOCITY & CAPACITY

Sprint Velocity: is the average completed (estimated) story points over the past three to five iterations.

Team Capacity: is the total number of Scrum team members multiplied by the number of team productive days.

Eg:
Sprint Velocity = 32 story points; 6 team members working 8 hrs/day over a 10-day sprint
FOCUS FACTOR = 32 / (6*8) = 0.67
Team effective capacity = 0.67 * (6*8*10) = 321.6 hours
i.e. FOCUS FACTOR * (NO. OF RESOURCES * HOURS * DAYS)
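The focus-factor arithmetic can be reproduced directly; the numbers below match the worked example:

```python
# Focus factor and effective capacity, per the worked example:
# velocity 32, 6 members, 8 hours/day, 10 working days in the sprint.
def focus_factor(velocity, members, hours_per_day):
    """Velocity divided by one day's raw team hours."""
    return velocity / (members * hours_per_day)

def effective_capacity(ff, members, hours_per_day, days):
    """Focus factor applied to the sprint's total raw hours."""
    return ff * (members * hours_per_day * days)

ff = focus_factor(32, 6, 8)                             # 32 / 48 ~= 0.67
capacity = effective_capacity(round(ff, 2), 6, 8, 10)   # 0.67 * 480 = 321.6
```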

-------------------------------------------------------------------------------------------------

DISADVANTAGES OF AGILE:

1) Poor resource planning
2) Limited documentation
3) Fragmented output
4) No finite end
5) Difficult measurement
-------------------------------------------------------------------------------------------------


RISK BASED TESTING: This testing is done when there is a chance of negative impact on the production environment, which may happen due to,


1) Time constraint
2) Budget constraint
3) Resource constraint

To overcome this, the approach that can be used is the Prioritization Technique, where important/critical features & TCs are prioritized, and it has to be verified whether they are completed or not.
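One common way to prioritize is to score each test case by likelihood of failure times impact and run the highest scores first. The features and scores below are made up for illustration:

```python
# Hypothetical risk-based prioritization: risk score = likelihood x impact,
# highest score tested first.
test_cases = [
    {"name": "Login",         "likelihood": 3, "impact": 5},
    {"name": "Payment",       "likelihood": 4, "impact": 5},
    {"name": "Profile color", "likelihood": 2, "impact": 1},
]

def prioritize(cases):
    """Order test cases by descending risk score."""
    return sorted(cases, key=lambda c: c["likelihood"] * c["impact"], reverse=True)

ordered = prioritize(test_cases)
```

Under time or resource constraints, the team works down this list and accepts that the tail may go untested.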

-------------------------------------------------------------------------------------------------

REGRESSION SCOPE:

It depends on,

a) Project nature: Landing page/Professional page

b) Project scope:

   Small (Manual) / Medium (Manual+automation) / Large (Manual+automation)

c) Stability of the project

Then the final approach can be decided as,

1) Partial Regression

2) Full Regression

-------------------------------------------------------------------------------------------------

PROJECT REPORTING:

AGILE:
1) Burndown Chart: User Stories (target user stories) Vs Time (start time to target time)
2) Burnup chart: Completed work Vs Total scope over time, to track release progress
3) Sprint Report
4) Velocity Report: Committed Vs Completed
5) Cumulative flow diagram: No. of issues Vs Time
6) EPIC report
7) Epic burndown
8) Release burndown

DEVOPS:
9) Deployment frequency - DevOps

ISSUE ANALYSIS:
10) CREATED VS RESOLVED ISSUES
11) Resolution time report

FORECAST & MANAGEMENT:
12) Time tracking report
13) User Workload report

-------------------------------------------------------------------------------------------------
READY & DONE:

READY:
1) User story is understood by the team
2) Story is written in a simple format (As a user... I want to... So that...)
3) User story has clear business value
4) User story is estimated
5) User story dependencies are identified
6) User stories are small
7) User story acceptance criteria are defined

DONE:
1) Code is completed
2) Code is merged to main branch
3) Code review for code merge completed
4) Unit tests for story completed
5) User story is tested against the Acceptance Criteria
6) User story is demoed to the stake holders
7) Accepted by product owner
8) Documentation is updated
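The Done checklist above can be sketched as a simple gate: a story counts as "Done" only when every item is true. The field names here are invented for illustration:

```python
# Definition of Done gate: all checklist items must be satisfied.
DOD_ITEMS = [
    "code_complete", "merged_to_main", "code_reviewed", "unit_tested",
    "acceptance_criteria_met", "demoed", "po_accepted", "docs_updated",
]

def is_done(story):
    """A story is Done only if every DoD item is present and true."""
    return all(story.get(item, False) for item in DOD_ITEMS)

story = {item: True for item in DOD_ITEMS}
```

A single missing item (say, documentation not updated) keeps the story out of "Done", which is the whole point of the checklist.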

-------------------------------------------------------------------------------------------------








Testing Basics

SOFTWARE TESTING: It is the practice of executing the application under test with the intent of finding defects before the code is deployed to production.
-----------------------------------------------------------------------------------------------------------------------
PRINCIPLES OF TESTING:

1) Testing shows the presence of bugs: No application is error free
2) Pesticide Paradox: Repeating the same tests stops finding new bugs, so regression suites need new and updated test cases
3) Exhaustive testing is impossible: Not possible to test all possible combinations of data and scenarios
4) Early Testing: Start testing as early as possible in the life cycle
5) Defect clustering: A small number of modules usually contain most of the defects
6) Testing is context dependent: Different methodologies, techniques and types of testing are chosen based on the type and nature of the application
7) Absence-of-errors fallacy: A system with no known defects can still fail if it does not meet the user's needs

-----------------------------------------------------------------------------------------------------------------------
TYPES OF TESTING :

1) FUNCTIONAL TESTING: Testing the actual functionality developed by the developer against the requirements specified by the client.


2) NON-FUNCTIONAL TESTING: Testing to  check non functional aspects (Performance, usability, accessibility etc.)

Performance testing: Measuring response time.

Stress testing: Validates the system performance in the context of scarce resources, which involves running tests on low storage / memory configurations to identify bugs that may be undetectable in normal circumstances.

Load testing: Testing with huge no. of users.

Volume testing: Judges performance in the context of enormous amounts of data, involving an identification of where exactly the app fails, at what volume of data the system is unable to continue running. 

Usability testing: Basically to assess user-friendliness, GUI consistency, error reportage, and correct output in line with the business-specified requirements.

UI testing: Issues addressed here include layout, data movement from one page to another, and pop-ups for assistance if the system concludes that a user needs guidance.

Recovery testing: Validates if the app shuts down during failure without glitches and without affecting the system, and that the data is not lost. 

Compatibility testing: Checks overall compatibility with a range of operating systems, browsers, and devices, at varying strengths of configuration. 

Installation testing: Checks the smoothness of installs and uninstalls, and confirms that the app behavior remains steady if the disk space is limited. 

Documentation testing: Confirms the presence of guides, instructions, read-me, online assistance, release notes, etc. as part of the app package.

3) CHANGE RELATED TESTING :

  A) Confirmation/Retesting: Testing to confirm whether defects are fixed or not.

  B) Regression testing: Testing with intent to check whether application's old functionality is working fine after adding new functionality or after any modification in the code.

-----------------------------------------------------------------------------------------------------------------------
LEVELS OF TESTING:

A) Unit Testing: The smallest parts of the application, called 'units', are tested.

B) Integration Testing: Testing done with integrating different modules.

C) System Testing: End to end testing of application.

D) Acceptance Testing: Testing done by the client.

            i) Alpha Testing: Testing done by the client at the development site.

            ii) Beta Testing: Testing done by the client at the client's site.
---------------------------------------------------------------------------------------------------------------------
 METHODS OF TESTING:

1) White Box Testing: Testing the code of application -- Done by Developer

2) Black Box Testing: Testing the functionality of application -- Done by Tester

3) Gray Box Testing: White Box Testing + Black Box Testing
-----------------------------------------------------------------------------------------------------------------------
DIFFERENCE BETWEEN SMOKE & SANITY:

Smoke: Testing with the intent to check whether the build/functionality/feature is stable enough to be testable.

Sanity: Testing with the intent to check whether the new functionality works and reported bugs have actually been fixed.

-----------------------------------------------------------------------------------------------------------------------

DIFFERENCE BETWEEN ERROR, DEFECT, BUG & FAILURE:

1) A mistake in coding is called an error.
2) An error found by a tester is a defect.
3) A defect accepted by the developer is a bug.
4) When the build does not fulfil the requirement, it is a failure.
-----------------------------------------------------------------------------------------------------------------------

TYPES OF REVIEW:

1) Walkthrough: Informal review led by the author.

2) Technical review: Less formal review led by a trained moderator or technical expert.

3) Inspection: Most formal review, led by a trained moderator.

-----------------------------------------------------------------------------------------------------------------------
DIFFERENCE BETWEEN SEVERITY & PRIORITY:

Severity: Impact of the defect on the functionality.

Priority: How soon the defect should be fixed.
-----------------------------------------------------------------------------------------------------------------------
TESTING TECHNIQUES:

1) Equivalence partitioning: Dividing the input range into valid and invalid partitions and selecting one representative value from each.

Eg:    Requirement : Field should accept numeric values from 1 to 20

TC_01: Valid Range: To check if user is able to enter a value in the range 1 to 20 (Test data: 4)

TC_02: Invalid Range: To check if user is unable to enter a value less than 1 (Test data: -4)

TC_03: Invalid Range: To check if user is unable to enter a value greater than 20 (Test data: 25)
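The three partition test cases can be written as executable checks against a hypothetical validator for the 1-20 requirement:

```python
# Hypothetical validator for the "accept numeric values from 1 to 20" requirement.
def accepts(value):
    """True only for values in the valid partition 1..20."""
    return 1 <= value <= 20

# One representative value per partition:
assert accepts(4) is True      # valid partition (TC_01)
assert accepts(-4) is False    # invalid partition: below range (TC_02)
assert accepts(25) is False    # invalid partition: above range (TC_03)
```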

2) Boundary value analysis: Testing at and around the boundaries of the input range (the corner scenarios)

Eg:     Requirement : Field should accept numeric values from 1 to 20

      TC_01: To check if user is able to enter 0 in the field provided. (-ve test case)

      TC_02: To check if user is able to enter 1 in the field provided. (+ve test case)

      TC_03: To check if user is able to enter 2 in the field provided. (+ve test case)

      TC_04: To check if user is able to enter 19 in the field provided. (+ve test case)

      TC_05: To check if user is able to enter 20 in the field provided. (+ve test case)

      TC_06: To check if user is able to enter 21 in the field provided. (-ve test case)
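Likewise, the six boundary test cases can be run against the same hypothetical validator: exactly at each boundary (1 and 20) and one step outside and inside it:

```python
# Hypothetical validator for the "accept numeric values from 1 to 20" requirement.
def accepts(value):
    return 1 <= value <= 20

# Expected outcome at and around each boundary (TC_01 .. TC_06):
boundary_values = {0: False, 1: True, 2: True, 19: True, 20: True, 21: False}
results = {v: accepts(v) for v in boundary_values}
```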

3) Decision table testing: Decision table testing is a technique used in software testing to model complex business rules and logic. It helps testers systematically cover all possible combinations of conditions and actions.

Eg: Discount Calculation

Customer Type | Purchase Amount > $100 | Discount Applied
------------- | ---------------------- | ----------------
Regular       | Yes                    | 10%
Regular       | No                     | 0%
Premium       | Yes                    | 20%
Premium       | No                     | 10%
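The discount decision table can be encoded directly, which makes it easy to assert every rule combination:

```python
# Direct encoding of the discount decision table.
def discount(customer_type, amount):
    """Discount rate for a customer type and purchase amount."""
    over_100 = amount > 100
    table = {
        ("Regular", True):  0.10,
        ("Regular", False): 0.00,
        ("Premium", True):  0.20,
        ("Premium", False): 0.10,
    }
    return table[(customer_type, over_100)]
```

Decision table testing then means one test case per row, so all four combinations are covered.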

4) State Transition: State transition testing is a technique used to test systems where outputs depend on the sequence of events or states, not just current inputs. It’s especially useful for systems with workflows, status changes, or user interactions.

Eg: Online Order Processing

States:

  • Order Placed
  • Payment Pending
  • Payment Completed
  • Order Shipped
  • Order Delivered
  • Order Cancelled

Transitions:

  • Place order → Order Placed
  • Make payment → Payment Pending → Payment Completed
  • Ship order → Payment Completed → Order Shipped
  • Deliver order → Order Shipped → Order Delivered
  • Cancel order → Any state before delivery → Order Cancelled
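The transitions above can be encoded as a lookup table with a checker that rejects invalid jumps (e.g. shipping before payment). Here "any state before delivery → Order Cancelled" is read as including Order Shipped:

```python
# Allowed transitions for the order workflow above.
ALLOWED = {
    "Order Placed":      {"Payment Pending", "Order Cancelled"},
    "Payment Pending":   {"Payment Completed", "Order Cancelled"},
    "Payment Completed": {"Order Shipped", "Order Cancelled"},
    "Order Shipped":     {"Order Delivered", "Order Cancelled"},
    "Order Delivered":   set(),    # terminal state
    "Order Cancelled":   set(),    # terminal state
}

def can_transition(current, target):
    """True if the workflow allows moving from `current` to `target`."""
    return target in ALLOWED[current]
```

State transition test cases then cover each valid transition once, plus invalid ones (e.g. Order Placed straight to Order Shipped) that the system must reject.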

-----------------------------------------------------------------------------------------------------------------------

Statement coverage / Path coverage / Branch-Decision coverage / Linear code sequence & Jump:

  • 100% LCSAJ coverage will imply 100% Branch/Decision coverage
  • 100% Path coverage will imply 100% Statement coverage
  • 100% Branch/Decision coverage will imply 100% Statement coverage
  • 100% Path coverage will imply 100% Branch/Decision coverage
  • Branch coverage and Decision coverage are same
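The difference between statement and branch coverage shows up even on a tiny function: one test can execute every statement while still missing a branch.

```python
# f(5) executes every statement (the `if` body runs), giving 100% statement
# coverage -- but the False branch of the `if` is never taken, so branch
# coverage is only 50%. Adding f(-1) takes the False branch as well.
def f(x):
    result = x
    if x > 0:
        result = x * 2
    return result
```

This is why 100% branch/decision coverage implies 100% statement coverage, but not the other way around.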
-----------------------------------------------------------------------------------------------------------------------

SDLC (SOFTWARE DEVELOPMENT LIFE CYCLE)
-------------------------------------------------------------------------------------------------------------

STLC(SOFTWARE TESTING LIFE CYCLE)



------------------------------------------------------------------------------------------------------------
                                                                    SDLC MODELS

Waterfall: Once a phase is completed, the only option is to move to the next phase; this rigidity became a disadvantage and gave rise to other models.


              
V-Model: Development and testing are done in parallel. While the development team is busy coding, the testing team reviews documents and writes test cases.


Spiral: Once a module is developed and tested, it is given to the client, and after the client's approval, development of the next module starts.

-------------------------------------------------------------------------------------------------------------


                                                         DEFECT LIFE CYCLE



-------------------------------------------------------------------------------------------------------------
  TEST MANAGER ACTIVITIES



---------------------------------------------------------------------
HOW REQUIREMENT ANALYSIS IS DONE:

1) User stories don't always include enough information for development decisions.
2) Involve stakeholders regularly.
3) Identify who the end users and stakeholders are.
4) Prioritize requirements.
5) Focus on executable requirements.
6) Use the INVEST principle: Independent, Negotiable, Valuable, Estimable, Small & Testable.
---------------------------------------------------------------------