Manage Test Cases
Testsigma provides a comprehensive and user-friendly solution for efficiently creating, organising, and executing test cases. It enables teams to collaborate effectively, ensures maximum test coverage, and provides valuable insights into the testing workflow. Users can create test cases using either the Testsigma Recorder or by manually writing steps using NLP. This documentation guides users step-by-step on how to manage test cases in Testsigma to optimise the testing process.
Prerequisites
- Ensure you create a project before creating test cases in Testsigma.
Create Test Case
- Navigate to Create Tests > Test Cases in the left-side navbar. Click the Create Test Case button in the top right corner of the Test Case List page to create a test case.
- On the Test Case Details page, replace Untitled in the top left corner of the screen with a title for the test case.
- You can create the test steps for your test case using either of the following methods:
- Write Test Steps Manually using NLPs by clicking Add new step.
- Use Testsigma Recorder to Record steps.
You should install the Testsigma test step recorder extension to record the test steps for a web or mobile web app project. Check here for instructions on how to install it.
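For example, a manually written test case for a simple login flow might look like the steps below. This is only an illustrative sketch: the URL, element names, and data values are hypothetical, and the exact NLP phrasing is suggested by the step editor as you type, so your steps may read slightly differently.

```
Navigate to https://example.com/login
Enter admin@example.com in the Username field
Enter secret-password in the Password field
Click Login
Verify that the current page displays text "Dashboard"
```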
Test Case - Advanced Options
Use the right-side navbar on the Test Case Details page to access Advanced Options. These options help you manage and group test cases when creating Test Suites and Test Plans, and give you a better understanding of test coverage.
- Test Case Info: Click Test Case Info in the right-side navbar. You can update the Test Case Name and Description from there and view information about the Test Case's creation and last update.
- Ad-Hoc Runs: Click Ad-Hoc Runs in the right-side navbar. This will enable you to view the history and details of Ad-Hoc Run results for the Test Case.
- Test Case Settings: Click Test Case Settings in the right-side navbar, and configure the following options:
- Pre-Requisites: Specify any conditions or steps that must be completed before executing the test case, ensuring the test environment is properly set up.
- Test Data Profile: If you use parameter-type test data in your test steps, select the Test Data Profile that contains that data for the test case.
- Test Data Set: Specify which data set from the selected test data profile to use in the test case.
- Data-Driven (toggle): To repeatedly run the same test case with different input data sets, add a Test Data Profile and enable the Data-Driven toggle. Once enabled, you can narrow down which data sets from your test data profile are used by filtering on iteration, parameter, or set name (see the illustrative example after this options list). Refer to data-driven testing for more information.
- Iteration: You can filter sequential data sets using greater than, less than, or between operations. Note that this filtering only applies to sequential data sets.
- Set Name: Filter non-sequential data sets using this type of filtering. Use operations such as between, equals, contains, starts with, and ends with to filter the data sets by their names. The test case checks for any data set names with the specified name or a part of it.
- Parameter: Filter non-sequential data sets using this type of filtering. Use the values of parameters within the data sets to filter them.
- Fail Test Case if Visual Testing Fails: Enable this option to automatically mark the test case as failed if any visual discrepancies are detected during execution. This helps ensure the accuracy and consistency of the application's user interface across different environments and iterations.
- After Test Case: Define custom steps or tasks to perform after executing a test case using Testsigma. These steps facilitate clean-up or preparation for subsequent test cases. For more information, refer to After Test Case.
- If After Test Fails - Fail the Test Case: If the execution of after-test actions, such as post-validation checks or clean-up steps, encounters any failure or error, the test case will automatically fail. This ensures that any issues in the test's conclusion are promptly identified and resolved.
- If After Test Fails - Show Test Case Result: The test case result will be shown even if the after-test actions fail. This gives you a complete view of the test's behaviour and provides valuable insights into the overall test case execution, including any possible issues that may arise during the post-validation or clean-up phases.
- Mark this for AfterTest Suite: When you mark a test case for the AfterTest Suite, you ensure that it executes as part of the clean-up or finalisation process after the test suite's execution. This option helps maintain the test environment and ensure the proper closure of testing activities.
- Manage Test Case: Click Manage Test Case in the right-side navbar, and configure the following options:
- Status: Select the appropriate status for the Test Case to organise and manage the testing workflow.
- Draft: The test case is still a work in progress.
- Review: The test case is under inspection.
- Ready: The test case is active and ready to be executed.
- Obsolete: The test case is no longer valid.
- Rework: The test case needs to be updated.
- Priority: Select the priority level you want to set for this test case. For more information, refer to Test Case Priorities.
- Critical: Highest priority
- Major: Test case for a major feature
- Medium: Medium priority
- Minor: Test case for a minor feature
- Assignee: Choose the team member you want to assign to this test case. The team member assigned will receive notifications regarding test case failures during test case reviews.
- Reviewer: Assign a team member to review the accuracy, completeness, and adherence to testing standards of the test case. This promotes collaboration and ensures the quality of the test cases. For more information, refer to Test Case Review management.
- Test Type: Select the test type that applies to this test case. For more information, refer to Test Case Types.
- Unit Test: Test individual components or modules of the software to ensure their functionality in isolation.
- Integration: Test the interaction and compatibility between multiple components or modules to ensure they work together correctly.
- Functional: Test the functional requirements of a software application to ensure it meets the intended specifications.
- Non-functional: Test a software application's performance, usability, security, and other non-functional aspects.
- User Experience: Test a software application's overall user experience, usability, and interface to ensure it meets user expectations.
- Requirement: Create new requirements and associate them with test cases to establish traceability between requirements and test cases. This ensures that all necessary functionalities are adequately tested. For more information, refer to Requirements.
- Labels: Categorize test cases based on specific attributes such as modules, components, or testing phases by applying labels. Labels facilitate efficient filtering and searching, making managing and retrieving relevant test cases more manageable.
- Activity: Click Activity in the right-side navbar and view the History and Comments of test cases.
- Help: Click Help in the right-side navbar and access Examples, Action List, and Get Started for a general understanding of Test Cases.
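To make the Data-Driven filters described above more concrete, consider a hypothetical Test Data Profile named login_users with three data sets. The profile, set names, parameters, and values below are illustrative only and do not ship with Testsigma.

```
Test Data Profile: login_users   (hypothetical)

Set Name       | username           | role
---------------|--------------------|-------
smoke_admin    | admin@example.com  | admin
smoke_viewer   | viewer@example.com | viewer
regression_01  | user01@example.com | member
```

With this profile attached and the Data-Driven toggle enabled, an Iteration filter of between 1 and 2 would run the first two sets by position, a Set Name filter of starts with "smoke" would run smoke_admin and smoke_viewer, and a Parameter filter of role equals admin would run only smoke_admin.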
Edit Test Case
- Navigate to Create Tests > Test Cases in the left-side navbar.
- Select the Test Case in the Test Cases List page and follow the steps mentioned in the Test Case - Advanced Options section above to edit the test case.
Delete Test Case
- Navigate to Create Tests > Test Cases in the left-side navbar.
- You can delete the test case using any of the following methods:
- Click the Test Case you want to delete and open the Test Case Details Page. Then, click the Delete Button in the screen's top right corner.
- Click on the ellipsis icon for the Test Case you want to delete from the Test Case List page, and select Delete from the drop-down menu to remove the test case.
- Click the checkbox for the Test Cases you want to delete from the Test Case List page, and then click the Delete icon in the menu bar.
- A Delete Confirmation pop-up will appear. Click Delete to remove the test case from the project.
- When you delete test cases from the list, they are moved to the Trash folder. You can restore or permanently delete the test case from the Trash folder.
- Deleting the test case will also remove it from all associated test suites, test plans, and prerequisites.
Recover Deleted Test Case
- Navigate to Create Tests > Test Cases in the left-side navbar.
- In the Test Case List page, click Saved Filters and then select Trash (Deleted Test Cases) from the drop-down menu in the menu bar. This action will display a list of all the deleted test cases.
- Scroll through the list or use the search bar to locate the test case you want to recover or delete forever from the Trash and open it.
- Click the Restore button in the top right corner of the Test Case Details Page to recover it. Once you restore the test case, it will appear on the Test Cases List page.
- To permanently delete the Test Case, click the Delete Forever button. A confirmation pop-up screen will appear. You must enter DELETE in the field and click I Understand, delete this (Test Case Name).
Deleting the test case permanently will result in losing all Run reports and associated configurations.