How Jira and Links Explorer can boost your Agile transformation

Iterative and incremental development are critical elements of Agile. Each iteration is a fixed block of time in which the team must produce a deliverable or increment. Agile helps organizations evolve while maintaining a focus on rapid delivery.

Agile gained popularity a decade ago when the software industry started adopting agile development methods. These methodologies address the customer’s needs and deliver business value from the very beginning of the project. The iterative process reduces development risk and provides visibility and transparency at every stage of the project lifecycle.

Today, industries including automotive, telecommunications, and aerospace & defense have implemented Agile methodologies in their projects.

To manage projects, teams across the world use Agile frameworks and methodologies like Scrum, Kanban, or a mix of both. Atlassian Jira Software is an agile project management tool that supports these methodologies. Whether it’s agile boards, backlogs and sprints, or reports, Jira’s extensive customization options help teams plan, track, and manage their agile projects with ease.

As project size increases, so do the challenges, and implementing agile methodologies becomes difficult. Using a traceability application such as Links Explorer with Jira can help your agile project team trace and visualize the links between macro-level issues (Epics, Stories, etc.) and micro-level issues (tasks, subtasks, bugs, etc.).

In this article, we will discuss how you can enhance your agile project management capabilities in Jira using Links Explorer Traceability & Hierarchy.

Benefits of using Links Explorer Traceability & Hierarchy

To be a successful agile team, you need to foster an environment of collaboration and flexibility. Links Explorer is designed to support these elements and helps your agile team identify and analyze potential risks. Here’s how Links Explorer can add value to your team:

Unified Tree View

The unified tree view is one of the unique features of Links Explorer: it provides the complete hierarchy of a Jira issue and its related links in a single window. The tree view captures all relationship types created in Jira, including Epics, tasks, sub-tasks, and Portfolio hierarchies, making it easy for teams to interpret every relation.

It also supports cross-team collaboration by displaying relations from different projects. The optional preview feature in the Unified tree view displays information including Affected Versions, Fix Versions, Assignee, and Reporter.

The unified tree view displays crucial information for each issue in the hierarchy, including issue link type, status, and priority. This real-time information helps your team track and visualize overall project progress from a single window instead of switching between multiple tabs or windows, which reduces duplicate effort, supports flexibility, and helps team members recognize each other’s contributions.

You can access the unified tree view on the Jira issue page as well as the Board detail view page.

Interactive Depth Mode & Filters

It is essential to discover potential risks to ensure your project does not hit roadblocks. You can measure the magnitude of impact of linked Jira issues (across projects) using the Interactive Depth mode and filters available in Links Explorer. The Interactive Depth mode in the unified tree view helps your team analyze the impact of changes across the project at different levels, and the multiple filters (issue type, link type, and issue direction) give the team the exact refined view needed to assess the impact of any change request.

Traceability Reports

Every agile team looks for traceability and visibility throughout the software development process. Using Links Explorer, your team can generate end-to-end traceability reports that include the issue key and summary along with issue links (Epic, Story, Task, Sub-task, or other issue link types) across projects, plus issue status and priority.

These reports help the team see how issues of any type are linked with other Jira issues, in an easy-to-comprehend tabular format.

Every team is different, but becoming a successful Agile project team is possible as long as you invest in a capable tool. Links Explorer Traceability & Hierarchy ensures that your agile team does not miss out on any linked issue while working through the different phases of a project. By allowing your team to visualize, track, analyze, and report the progress of linked Jira issues, Links Explorer enhances the whole experience.

Industry leaders like Arm, Tesla, and many more are using Links Explorer in their transformation journey; now it’s time for you to transform your business with us.

For more information on Links Explorer Traceability & Hierarchy, the #1 user rated traceability app for Jira, visit

Or check it out on the Atlassian Marketplace today

Or contact us at


In this article, we will cover

  • The impact of the gaps in design and development processes
  • How ISO-9001 addresses these gaps
  • And how we can implement these using JIRA and Links Explorer Traceability & Hierarchy


Organizations trying to invent or innovate products often miss important aspects of product design and development. Typically, developers and engineers get so immersed in the process of “design and release” that they sometimes neglect the critical aspects of the path from A to B. This is why organizations need to introduce a comprehensive design system or process. Overlooking these processes can lead to

  • extended timelines and a large number of development iterations
  • product recalls, which cost companies tens of millions, destroy brand reputation, and erode customers’ trust.

ISO 9001 Design & Development clauses address these gaps by providing guidelines to organizations.


ISO 9001 contains various clauses that help companies achieve Total Quality through their Quality Management Systems. One such clause focuses on the Design & development of products and services.

ISO 9001 Design & Development defines seven steps to implement the product development life cycle.


Managing the design & development process for products manually, or in Excel spreadsheets, is not only complicated but also prone to errors. Here’s how we can simplify this process using Jira and Links Explorer.

We have categorized the seven steps of design & development into:

  • Planning
    • Design Planning
  • Engineering
    • Design Input & Output
    • Design Review
    • Design Verification & Validation
    • Design Control


  • Create Jira issues to plan the activities to be executed. Jira supports issues of multiple types, e.g. Requirements, Epics, Stories, Tasks, sub-tasks, etc.
  • These Jira issues can be linked with one another as required and assigned to respective users for execution with set deadlines.
  • Use Links Explorer’s Unified tree view for periodic review & update of the design plan.
  • The Unified Tree View displays a complete hierarchy of linked Jira issues in a tree structure​.
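The issue-creation and linking steps above can also be driven through Jira’s REST API. The sketch below builds the payloads that Jira’s REST v2 endpoints (`POST /rest/api/2/issue` and `POST /rest/api/2/issueLink`) expect; the project key, issue keys, and link type here are hypothetical, and the available issue and link types should be verified against your own instance.

```python
import json

def new_issue_payload(project_key, summary, issue_type):
    """Payload for POST /rest/api/2/issue (Jira REST v2)."""
    return {
        "fields": {
            "project": {"key": project_key},
            "summary": summary,
            "issuetype": {"name": issue_type},
        }
    }

def link_payload(inward_key, outward_key, link_type="Blocks"):
    """Payload for POST /rest/api/2/issueLink (Jira REST v2)."""
    return {
        "type": {"name": link_type},
        "inwardIssue": {"key": inward_key},
        "outwardIssue": {"key": outward_key},
    }

# Hypothetical QMS project: plan an Epic, a Task, and link them.
epic = new_issue_payload("QMS", "Design Planning", "Epic")
task = new_issue_payload("QMS", "Draft design inputs", "Task")
link = link_payload("QMS-1", "QMS-2")
print(json.dumps(link, indent=2))
```

Once issues are linked this way, the whole hierarchy becomes visible in the Unified Tree View described above.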


  • We can create Jira issues as placeholders for documents like user requirements, legal & regulatory requirements, etc.
  • And we can create documentation for each Design Input and Design Output in any system capable of version control.
  • Confluence is one choice, and we can store a reference to the Confluence document in the corresponding Jira issue.
  • Design reviews can be performed by an authorized user using Jira and Links Explorer’s Unified Tree View. We can easily navigate from the tree view to an issue’s detail page to determine whether all design requirements are captured correctly.
  • Design verification activities can be planned using Jira issues, and comments/observations can also be logged as Jira issues.
  • For design validation, one can create test cases, test plans, and test execution cycles within Jira. Jira also supports additional test case management apps.
  • If a change request arrives, the impact of changes on constituent parts can be assessed using the Traceability Report of Links Explorer. This ensures that the changes will not impact the intended use of the product.

The Design & Development process applies to every industry. Making Jira + Links Explorer a part of your QMS will streamline the process and help ease the path to compliance.

For more information on Links Explorer Traceability & Hierarchy visit

Or contact us at ​

Introducing vREST NG CLI – A command line partner of vREST NG Application

Hi friends! Today I would like to introduce vREST NG CLI, the command line partner of the vREST NG Application. With vREST NG CLI you can execute your API tests on the command line as well. It uses the same runner component as the vREST NG Application itself, so whatever works in the vREST NG UI will work seamlessly on the command line too, without any further hassle. It also provides clear, usable reports to help you debug API failures quickly.

You may download this utility from this GitHub link. vREST NG CLI provides the following two commands:

1. Run Command:

The run command helps with the following tasks:

  1. Execute your test cases on the command line. Simply provide the vREST NG project directory path to the CLI. For detailed instructions, please follow our guide link.
$ vrest-ng-cli run --projectdir=<VREST_NG_PROJECT_DIR_PATH>
  2. Monitor your APIs’ health at specific intervals by linking this command with any third-party scheduler or cron utility.
  3. Validate your application builds by integrating this command with any CI/CD server, such as Jenkins, TeamCity, Azure DevOps, CircleCI, Bitbucket Pipelines, etc.
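The build-validation use case can be sketched in a few lines of Python. This assumes the CLI exits with a non-zero status when any test fails (the usual convention for CI-oriented runners, worth confirming against the vREST NG docs), and the project path shown is a placeholder.

```python
import subprocess

def run_api_tests(project_dir, cli="vrest-ng-cli"):
    """Run the vREST NG CLI for a project directory.

    Returns True when the runner exits with status 0, which we assume
    means every test passed; a CI stage can fail the build otherwise.
    """
    result = subprocess.run([cli, "run", f"--projectdir={project_dir}"])
    return result.returncode == 0

# Hypothetical CI usage:
# if not run_api_tests("/ci/workspace/api-tests"):
#     raise SystemExit(1)
```

A scheduler or cron job can call the same function periodically for the API health-monitoring use case.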
vREST NG CLI with CI/CD servers

The run command provides the following loggers for detailed execution reports:

1. Console Logger: This logger outputs the test execution log on the console in a readable way. It prints the test summary, assertion results, and, in case of a test failure, a diff report.

Execution on the Command Line: If you execute the tests on the command line then vrest-ng-cli provides the report in the following way:

Execution in Jenkins: The same console report is also available in any CI/CD Servers like Jenkins server as shown below:


2. XUnit Logger: This logger generates an XML report that CI/CD servers like Jenkins can use to publish results. Reports published with the XUnit logger come in the following two types:

a) Failed Test Cases List Report


b) Individual Test Case Failure Report


3. JSON Logger: This logger generates the report in JSON format, which you may consume from any other tool or script.

4. CSV Logger: This logger generates the report in CSV format, so you can visualise it with Microsoft Excel, Numbers, etc.
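As a sketch of consuming the JSON report from a script: the report shape below is a hypothetical placeholder (check the actual JSON logger output for the real field names); the point is simply that a machine-readable report can be post-processed by any tool.

```python
import json

# Hypothetical report shape -- inspect the real JSON logger output
# for the actual schema before relying on this in a pipeline.
report_json = """
{
  "results": [
    {"summary": "create contact", "status": "passed"},
    {"summary": "empty name rejected", "status": "failed"},
    {"summary": "list contacts", "status": "passed"}
  ]
}
"""

report = json.loads(report_json)
failed = [r["summary"] for r in report["results"] if r["status"] == "failed"]
print(f"{len(report['results'])} tests, {len(failed)} failed: {failed}")
```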

2. Import Command

The import command helps with the following tasks:

  1. Import tests from various sources like vREST Cloud, Postman, and Swagger.
  2. Sync your tests automatically whenever the Swagger schema file changes.

One real-world use case of the import command: suppose you maintain your API specifications in Swagger files and the specs keep changing over time. Instead of updating the schema definitions for your API tests manually, you can schedule the import command to update your tests automatically by re-importing the Swagger files at set intervals.

So, in this way, you may use the vrest-ng-cli utility to continuously validate your application or keep your test cases in sync with Swagger definition changes.

If you are facing any issues with automated API testing in your organization, do contact us. We will arrange a live meeting to discuss your needs and provide a demo of our product vREST NG, showcasing its capabilities.

If you find this post helpful, do like, comment, and share it with your friends and colleagues. You can connect with me and follow me for articles on API test automation, and feel free to message me if you would like to discuss anything.

Boost your API Test efficiency using swagger and excel sheets

Today, in this post, I will describe a highly effective API test automation process and its benefits. The process combines the power of Swagger files and Excel sheets to make your API testing experience more seamless. I have personally applied it in various projects and found it very effective.

This process can be adopted with any API test automation tool or framework that supports importing Swagger files and reading test data from Excel sheets. For detailed instructions, you may look at my article on Swagger + Excel Sheets, a wonderful way of validating REST APIs.


1. Write API Specification:

The process starts with documenting your API specifications in Swagger files. Writing the API specification at the beginning is key to successful API test automation. The major benefit is resolving ambiguities among key stakeholders, e.g. the testing, backend, and frontend teams.

2. Make a Test Plan:

At this stage, make a test plan for how you are going to validate each API in your Swagger file. Answer the following basic questions:

  1. What different conditions would you like to validate for each API? This includes positive and negative test cases, boundary value conditions, etc., and essentially defines your API testing scope.
  2. How are you going to set up the initial state? You may set it up using several methods, e.g. by restoring the database state from a dump, by invoking an API, or by executing an external script or command.
  3. What initial state do you need for each API to cover its different conditions? Once you have listed all the conditions you would like to validate for an API, you will need to generate all the application data required to validate those conditions.

Making a test plan beforehand is also a key factor in successful API test automation. It gives you absolute clarity on how you will progress, the scope of testing, realistic deadlines, etc.

3. Generate Test Logic

Now comes the generation of test logic. Instead of writing it manually, we can use the Swagger file to generate it, because the Swagger file already contains all the information we need. The same test logic can be reused to validate the different conditions of an API, and it can easily be re-synced whenever the Swagger API spec changes.

4. Write Test Data

At this stage, we write the test data in our Excel sheets or CSV files. There should be a separate sheet for each API in the Swagger file. In each sheet we write the test data needed to validate the different conditions of the API, along with the expected API response for each condition.
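A per-API test-data sheet might look like the CSV below; the column names are made up for the sketch (a real tool defines its own), but the shape is the point: one row per condition to check, and adding a condition is just adding a row.

```python
import csv
import io

# Build an illustrative test-data sheet for a "create contact" API.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["iterationSummary", "name", "expectedStatus"])
writer.writerow(["valid contact", "Jane Doe", "201"])
writer.writerow(["empty name rejected", "", "400"])
writer.writerow(["name over 35 chars rejected", "x" * 36, "400"])
print(buf.getvalue())
```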


There are several benefits to this process; let me share some of the key ones. You may have already imagined them while reading this far.

1. Removes ambiguity:

Because both your API specification and your test plan are documented in this process, it resolves ambiguity among the key stakeholders and gives everyone absolute clarity while doing validation. Lack of clarity is a major reason behind many automation failures.

2. Increased Test efficiency:

As the Swagger file becomes the single source of truth for all generated test logic, your test efficiency increases drastically. The same test logic can be reused to validate all the conditions of an API, which increases reusability.

3. Easier to Maintain

Test cases written with this process are easier to maintain over time, because whenever your Swagger files change you can update the test logic very quickly. You may even write a script that updates your test logic from the Swagger file automatically, so your tests and API specification remain in sync at all times.

Maintaining test cases in Excel sheets is also easy: checking another condition of an API is as simple as adding a new row with the desired data.

4. Separation of concerns

This process separates the test data from the automation logic. Anybody can write test data in the Excel sheet without much technical knowledge, and clear objectives on what to write make authoring test data very quick.

Finally, I would like to hear your feedback on this approach; you can share it in the comments on this post. If you like it, do like or re-share it with your friends and colleagues, and let me know if you foresee any challenges in using this approach.


vREST NG – Taking your API Testing experience to the next level

Hi guys, today I will discuss how easy it is to perform API testing with vREST NG, whether you are a developer, a non-developer, or a tester. vREST NG provides a simple interface so that anybody can write API test cases quickly without writing a single line of code.

Before starting, I assume that you have already installed vREST NG and created a project in it. You can follow this tutorial within a few minutes using the vREST NG Pro version.

Now I will guide you step by step through writing an API test case in vREST NG. Let’s take a sample API that creates a record on the server. The details of the API are as follows:

API Endpoint:


Request Body:

    {
        "name": "Dheeraj Aggarwal",
        "designation": "Engineering Manager",
        "organization": "Optimizory Technologies",
        "aboutMe": "Passionate to make vREST NG - World's #1 API Testing Tool"
    }

So, these are the API details. With these details in hand, let’s quickly try to create a test case in vREST NG.

In vREST NG, just click on the plus icon to create your API test case.


A dialog window will appear. Just provide the test suite name, request method, API endpoint, and a meaningful summary for your test case.


Now click on Confirm button to create the test case. The created test case will look like this:


Now, let’s provide the request body for this test case. Click on the Request tab, then select the Body sub-tab, and enter the request body.


Now, let’s validate our sample API test case. To write assertions, simply click on the Validation tab. When you create a test case, the application automatically adds a Status Code assertion for you, which checks whether the status code equals 200.


Let’s execute this test case by clicking the “Run Single” button in the middle pane. Alternatively, click the “Run All” button in the left pane to execute all the test cases listed there.

When you run the test case, you can see the results in the rightmost pane.


So far, we have validated only the status code of our API response. Now let’s validate the response content as well. To do so, simply click on the “Generate Expected Body” button in the rightmost pane.

This operation automatically adds a Text Body assertion that compares the expected response body with the actual response body received.


It also automatically sets the expected response body in the Expected Body sub-tab in the middle pane.


Now, let’s execute the test again. This time the test fails. You can quickly analyse the failure by looking at the diff report.


Our test case failed because the response contains some dynamic values that change every time a record is created on the server by our sample API.

To handle this, you may ignore those values by using a wildcard, or star variable, in the expected body.

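Conceptually, a star variable turns the body comparison into a wildcard match, something like the sketch below, where `"*"` matches any value. This is an illustration of the idea, not vREST NG’s actual matching code, and its real star-variable semantics may differ in detail.

```python
# Sketch of wildcard matching in an expected body: a "*" value matches
# anything, so dynamic fields (ids, timestamps) don't fail the diff.
def matches(expected, actual):
    if expected == "*":
        return True
    if isinstance(expected, dict) and isinstance(actual, dict):
        return expected.keys() == actual.keys() and all(
            matches(value, actual[key]) for key, value in expected.items()
        )
    return expected == actual

expected = {"id": "*", "name": "Dheeraj Aggarwal", "createdAt": "*"}
actual = {"id": "5f1c", "name": "Dheeraj Aggarwal", "createdAt": "2020-07-25"}
print(matches(expected, actual))  # True
```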

Now, if you run the test case again, the Results tab shows that it passes.


You can also validate the structure of your API response by specifying a JSON schema in the Expected Schema tab. vREST NG provides very powerful response validation capabilities; in 99% of cases you will not need to write a single line of code to validate your API response, and you can build complex test scenarios within a few minutes.
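To illustrate what a schema check buys you, here is a stdlib-only sketch that verifies required keys and value types; the `schema` dict here is a hand-rolled stand-in, while vREST NG’s Expected Schema tab accepts full JSON Schema, which covers far more (nesting, patterns, enums, etc.).

```python
# Minimal structural check in the spirit of JSON Schema validation:
# required keys must exist, and present values must have the declared type.
schema = {
    "required": ["id", "name"],
    "types": {"id": str, "name": str, "age": int},
}

def structurally_valid(doc, schema):
    if not all(key in doc for key in schema["required"]):
        return False
    return all(
        isinstance(doc[key], typ)
        for key, typ in schema["types"].items()
        if key in doc
    )

print(structurally_valid({"id": "1", "name": "Jane", "age": 30}, schema))  # True
print(structurally_valid({"id": "1", "age": "thirty"}, schema))            # False
```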


Finally, I would like to say that it is very easy to write test cases in vREST NG. Anybody with a basic understanding of the HTTP protocol and access to the API specification can write test cases in vREST NG, regardless of their programming skills. In the walkthrough above, we did not write a single line of code.

You may even generate the tests from a Swagger file in vREST NG and feed test data via Excel sheets. For more information, please read my other post, Swagger + Excel Sheets, a wonderful way of validating REST APIs.


If you find this post helpful, do like or share it with your colleagues and friends. And try out vREST NG and let us know your feedback.


Swagger + Excel Sheets, a wonderful way of validating REST APIs

Swagger files (aka the OpenAPI Specification) are the most popular way of documenting API specifications, and Excel sheets provide an easy, simple way of writing structured data; anybody can fill in an Excel sheet regardless of their programming skills. Introducing vREST NG, an enterprise-ready application for automated API testing, which combines the power of both to make your API testing experience more seamless. This approach is also known as data driven testing.

Data Driven testing is an approach in which test data is written separately from the test logic or script.

So, this is how the process looks:

vREST NG uses Swagger files to generate all of the test logic and sample test-data CSV files. It then reads the test data from the CSV files, iterates over the rows, and runs the iterations one by one. In this post, we will look at the following in detail:

  1. How you may generate test cases from Swagger files.
  2. How you may feed test data to those generated test cases through an Excel sheet.

How to perform Data Driven API Testing in vREST NG

To walk through the process, I will use a sample test application, a contacts application that provides CRUD APIs. I will guide you through the following steps:

  1. Setup the test application
  2. Download and Install vREST NG Application
  3. Perform Data Driven API Testing in vREST NG

1. Setup the Test Application:

You may skip this step if you want to follow the instructions with your own test application.

Otherwise, just download the sample test application from this repository link. It is a NodeJS-based application, tested with NodeJS v10.16.2.

To setup this application, simply follow the instructions mentioned in the README file of the repository.

2. Download and Install vREST NG Application

Now, simply download the application from the vREST NG website and install it. Installation is simple, but if you need OS-specific instructions, you may follow this guide link.

After installation, start the vREST NG Application and choose the vREST NG Pro version when prompted in order to proceed.

Now, first set up a project by dragging any empty directory from your file system into the vREST NG workspace area. vREST NG will treat that directory as a project and store all the tests in it. For more information on setting up a project, please read this guide link.

For a quick start, if you don’t want to follow the whole process and just want to see the end result, you may download and add this project directory to the vREST NG application directly.

3. Performing Data Driven API Testing in vREST NG

vREST NG provides a quick 3 step process to perform data driven API Testing:

(a) Import the Swagger File

(b) Write Test Data in CSV Files

(c) Setup Environment

Now, we will see these steps in detail:

(a) Import the Swagger File

To import the Swagger file, simply click on the Importer button available in the top left corner of the vREST NG Application.

An import dialog window will open. In this dialog window:

  1. Select “Swagger” as Import Source
  2. Tick the option `Generate Data Driven Tests`. When this option is ticked, the vREST NG Importer generates data driven test cases for each API spec available in the Swagger file.
  3. Provide the swagger file. For this demonstration, I will use the swagger file from the test application repository. Download Swagger File

The dialog window will look something like this. Now click on the Import button to proceed.

The import process does the following:

1. It generates a test case for each API spec available in the Swagger file, and a test suite for each tag available in the Swagger file.

2. It automatically creates a sample CSV file for each test case, with the columns derived from your Swagger file.

We will discuss how to fill in this sheet later in this post.

3. The generated CSV files are also linked to their test cases automatically.

So, before every test execution, the API test reads the data from the linked CSV file, converts it into JSON format, and stores it in a variable named data. The test case then iterates over that data and runs the iterations. If you make a change in the CSV file, just run the test case again; it always picks up the latest state of the CSV file. There is no need to import again and again.
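That CSV-to-data step can be pictured as follows; the column names are illustrative, and in the real runner each resulting object drives one request/assert cycle.

```python
import csv
import io
import json

# Sketch of "linked CSV becomes a `data` variable": every row turns
# into one JSON object, and each object is one test iteration.
csv_text = """iterationSummary,name,expectedStatusCode
empty name,,400
valid contact,Jane Doe,201
"""

data = [dict(row) for row in csv.DictReader(io.StringIO(csv_text))]
print(json.dumps(data, indent=2))
for iteration in data:
    # the runner would fire the API request here, once per iteration
    print("running:", iteration["iterationSummary"])
```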

4. It automatically inserts variables into the API request params as per the API definitions available in the Swagger file. These variables’ values are picked up from the linked CSV file automatically.


5. It automatically adds response validation logic as well. A Status Code assertion validates the status code of the API response. A Text Body with Default Validator assertion compares the expected response body with the actual response body. A Text Body with Default Schema Validator assertion validates the API response against a JSON schema.

The expected status code will be picked up from the linked CSV file.

And the expected response body will also be picked up from the linked CSV file.

And the expected schema name is also picked up from the linked CSV file.

6. It imports all the Swagger schema definitions into the Schemas section of the Configuration tab.

You may refer to these schema definitions in the Expected Schema tab, as discussed earlier. In the CSV file, you just need to specify the respective schema name for each test iteration in the expectedSchema column.

(b) Write Test Data in CSV Files

We have already seen the data file generated by the import process. Let me show you the generated file again for the Create Contact API:

In this sample file, you may add test data for the various iterations of the Create Contact API. In the iterationSummary column, simply provide a meaningful summary for each iteration; it will show up in the Results tab of the vREST NG Application. You fill in this test data yourself, or you may even generate it through an external script.

Now, let’s add some test iterations in the linked CSV file.


With the above CSV file, we are checking two test conditions of our Create Contact API:

  1. When the name field is empty
  2. When the name field length exceeds the limit of 35 characters.

In the above CSV file, we have intentionally left the expectedBody column blank. We don’t need to fill this column in manually; we can fill in its value via the vREST NG Application itself.

Before executing the test case, we need to configure the baseURL variable for your test application in the Configuration tab:


Now, let’s execute this test in the vREST NG Application. Both iterations fail because the expected response body doesn’t match the actual response body:


Now, click on the “Copy Actual to Expected” button for each iteration. vREST NG will copy the actual response body directly into the expectedBody column of the CSV file.

After this operation, if you look at the CSV file again, you can see that vREST NG has filled in the expectedBody column for you.


Note: If you have this CSV file open in Microsoft Excel, you will need to close and reopen it to see the changes. Some code editors automatically detect changes on the file system and reflect them in real time.

Now, if you execute the test again, you can see that the tests pass.


You may also see the expected vs actual response for the selected test iteration:


And you may see the execution details of the selected iteration by going to Execution Tab:


So, in this way, you may add iterations in the CSV file: just add rows to your CSV file and run the test in the vREST NG Application directly, with no need to import again and again. It all just works seamlessly, which increases your test efficiency drastically.

(c) Setup Environment

For the generated tests, you may also need to set the initial application or DB state before executing your tests, so that you can run regressions in an automated way. Some ways of setting up the initial state are:

  1. Restore the database state from a backup
  2. Execute an external command or script
  3. Invoke a REST API that sets up the initial state

In this section, let’s see how you may execute an external command before running the tests. Our sample test application is simple and built for demonstrating vREST NG: it stores all the contacts data in a JSON file. So I already have the initial data in a JSON file, which I can copy into the test application’s project directory before executing the test cases.

You may specify the command as shown in the following image:


The above command restores the application state from the initial data already present in the dump.json file inside the vREST NG project directory.

Note: You will also need to specify the cpCmd variable in the Environments section, because on Linux/macOS the copy command is named cp while on Windows it is copy. For Windows, you may create another environment in the vREST NG application, so that your API tests can run on any machine just by switching the environment.

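As an aside, the same restore step can be written platform-neutrally in a small script, sidestepping the cp/copy split entirely. This is an alternative outside vREST NG, not its built-in mechanism, and the file names below are illustrative.

```python
import shutil
import tempfile
from pathlib import Path

def restore_state(dump_file, target_file):
    """Copy the initial-state dump over the app's data file.

    shutil works the same on Linux, macOS, and Windows, so one
    command serves every environment.
    """
    shutil.copyfile(dump_file, target_file)

# Demo with throwaway files standing in for the project directory.
workdir = Path(tempfile.mkdtemp())
(workdir / "dump.json").write_text('{"contacts": []}')
restore_state(workdir / "dump.json", workdir / "contacts.json")
print((workdir / "contacts.json").read_text())
```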

So, this is how easily you may perform data driven testing in vREST NG. Let me know if you found this post helpful via the comments, and do like or share it with your friends and colleagues.

Schema Validation | Advanced Response Validation Tutorial (Part 3)

Schema validation is the first thing that comes to mind when it comes to structural validation of your API’s responses. There are multiple ways of validating an API’s response, and schema validation is one of them.

Continue reading Schema Validation | Advanced Response Validation Tutorial (Part 3)

Scriptless Test Automation | Advanced Response Validation Tutorial (Part 1)

There are many tools out there that offer scriptless test automation, and I am pretty sure they offer some great features and functionality one can benefit from. The only issue is that a lack of simplicity in the tool itself can make usage so tiresome that it almost equals the effort required for scripted test automation.

Continue reading Scriptless Test Automation | Advanced Response Validation Tutorial (Part 1)

Introduction to Automated Data Driven Testing for REST APIs

One of the biggest concerns in the software testing industry will always be the investment that companies make in testing their software. That leaves us with the options of either reducing our resources or, better, optimizing our efforts. Data driven testing is one such approach: it can not only optimize your testing effort but also give better results with minimal effort.

Let’s take a deep dive into this subject.

Continue reading Introduction to Automated Data Driven Testing for REST APIs