Looping through a Data File in the Postman Collection Runner


Reading Time: 3 min.

Update, January 2020: Want to see how the Postman Collection Runner has evolved even further? Read our more recent blog post about Postman product improvements.

Postman’s Collection Runner lets you run all the requests inside a collection locally in the Postman app. It also runs API tests and generates reports so that you can measure the performance of your tests.

If you upload a data file to the collection runner, you can:

  • Test for hundreds of scenarios
  • Initialize a database
  • Streamline setup or teardown for testing

Let’s start with the basics.

Run a collection with the Collection Runner

To run a collection in the Postman app, click on the chevron (>) next to the collection’s name to expand the details view. Select the blue Run button to open the Collection Runner in a new window.

Verify the collection and environment if you’re using one, and hit the blue Run button. You’ll see the collection requests running in sequence and the results of your tests if you’ve written any.

Use data variables

What if you want to loop through data from a data file? This would allow you to test for hundreds of scenarios.

In the Postman app, you can import a CSV or JSON file and use the values from the data file in your requests and scripts. You do this with data variables, which use a syntax similar to environment and global variables.

Using data variables in requests

Text fields in the Postman app, like the authorization section or the parameters data editor, rely on string substitution to resolve variables. In these fields, use the double curly brace syntax, like {{variable-name}}.
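For example, assuming the data file has a city column (a hypothetical name used here for illustration), a request URL can reference it directly:

```
https://postman-echo.com/get?city={{city}}
```

On each iteration, {{city}} is replaced with the value from the current data row.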

Using data variables in scripts

The pre-request and test script sections of the Postman app rely on JavaScript (not text). Therefore, you can use Postman's pm.* API, like the pm.iterationData.get("variable-name") method, to access the values loaded from the data file.
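As a sketch, a test script can read the current row's values through pm.iterationData and compare them against the response. This snippet runs inside Postman's script sandbox (not standalone Node.js), and the city and expectedCount columns are hypothetical names for illustration:

```javascript
// Runs once per data row during a collection run
const expected = pm.iterationData.get("expectedCount");

pm.test("Response count matches the data file", function () {
    const body = pm.response.json();
    pm.expect(body.count).to.eql(expected);
});
```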

Use data files

Using a CSV file

The CSV file should be formatted so that the first row contains the variable names that you want to use inside the requests. After that, every row will be used as a data row. The line endings of the CSV file must be in the UNIX format. Each row should have the same number of columns.
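As a quick sanity check before uploading, the format rules above (a header row of variable names, followed by data rows with a consistent column count) can be verified with a short script. This is only an illustrative sketch with hypothetical column names:

```javascript
// A minimal CSV in the expected shape: the first row holds variable names,
// and each following row supplies the data for one iteration.
const csv = "city,searchVolume\nTokyo,100\nSapporo,82\nFukuoka,77";

// Every data row should have the same number of columns as the header.
const rows = csv.split("\n");
const width = rows[0].split(",").length;
const consistent = rows.every(row => row.split(",").length === width);
console.log(consistent); // true
```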

Using a JSON file

The JSON file should be formatted as an array of objects, where each object contains key-value pairs. The keys are used as the variable names, and the values as the data used within the request.
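For instance, a JSON data file equivalent to a two-column CSV might look like this (field names are hypothetical):

```json
[
  { "city": "Tokyo", "searchVolume": 100 },
  { "city": "Sapporo", "searchVolume": 82 },
  { "city": "Fukuoka", "searchVolume": 77 }
]
```

Each object in the array supplies the variable values for one iteration of the run.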

Try it yourself:

Click the orange Run in Postman button above to import this example collection into your local version of the Postman app.

We will run this collection using a data file about my 4th favorite type of Japanese food: ramen. The data file, exported from Google Search Trends, shows the volume of searches for “Ramen” by city. Download one or both of these sample data files, and give it a try.

If you’re looking for step-by-step instructions or helpful screenshots, check out the collection documentation.







9 thoughts on “Looping through a Data File in the Postman Collection Runner”

  • Bao


How do I loop through an array without using data files in the Collection Runner?

I have my environment variable array IDs = ['123', '234', '345'] (assume this is a very large array)

I want to run GET “/get/id={{id}}” with each id inside the array without uploading a data file. Is it possible to use a pre-request script to generate each environment variable “id”? Thanks


    • Bao

The array environment variable was obtained from a previous GET request.
Then I want to use another GET to loop through all those IDs to validate them.


  • Sudeep Gupta

Hi, can we start from a particular row of a data file? I mean, I have 5 rows in my data file and I want my script to start picking up the data from the 3rd row. Is it possible to do that in Postman?


    • Arlemi Turpault

You may be able to do that using pre-request and test scripts. If you’d like help with it, please post on the Postman community forum: community.postman.com


  • Josh

I ran the Postman runner for 37,700 iterations and the report shows only 24k.
Have you seen this before? I have a test in place that lets me know whether the response was a success or not. It shows 22k successes and 2k failures. The math does not add up here.


  • Sonika

    I have a request body like

    {
        "BookId": "1234",
        "Version": "34",
        "Id": "51",
        "Name": "CCC",
        "Modules": [
            {
                "Module1Id": "45",
                "Pages": "4",
                "Startchapter": "AS",
                "Endchapter": "DF"
            },
            {
                "Module2Id": "5",
                "Pages": "14",
                "Startchapter": "AFS",
                "Endchapter": "DFS"
            },
            {
                "Module3Id": "35",
                "Pages": "154",
                "Startchapter": "AFCS",
                "Endchapter": "DFRS"
            }
        ],
        "Data": {
            "Id": "67",
            "Code": "7"
        }
    }

    I am reading the module details from an external data file (CSV file).
    For some cases I have to send the request body with 2 modules' details, sometimes 1 module's details, and sometimes all 3 modules' details.

    How do I write an automation script in Postman to handle this, so that if my CSV file has a null/empty value for a module ID, it won't send that particular module's details?


    • Arlemi Turpault

      Hey there! The community forum would be a better place for that question: community.postman.com.


  • Tanusree Dutta

    How do I use Postman scripting to pull the endpoints, see what the data looks like, and use that formatting for the loop that checks counts?


  • Holden

    I have had a few examples where I have run the runner with 20K rows, but the runner runs well past 20K records. When I scroll down in the results, I can see the runner is actually processing empty rows from the CSV files. Is there a way to ensure only populated records are processed?

