
eykrehbein / strest

⚡️ CI-ready tests for REST APIs configured in YAML

License: MIT License

TypeScript 98.93% Dockerfile 0.51% JavaScript 0.56%
nodejs cli rest-api testing test-automation typescript javascript

strest's Issues

WIP: Log params

  my_request:
    url: foo.com
    method: GET
    log_request: true

This would log the request object and the resolved URI. It would be helpful to have a copy of the JSON and to see the URI with environment variables and their values populated.

It might be good to create a log object as:

  my_request:
    url: foo.com
    method: GET
    log:
      output: true
      file: true
      request: true

UI

Create a UI

strest-ui /tests loads all requests into visualization (use similar layout to insomnia UI)
Allow stepping through requests and displaying results
Requests can be modified and saved back to disk
Values of variables are displayed alongside the 'code', e.g. Env(FOO) → bar

Repeat until

Repeat a request until a criterion is met.

requests:
  userRequest:
    url: http://localhost:3001/user 
    method: GET
    data:
      params:
        name: testUser
    max_retries: 3
    delay: 3000
    validate:
      json:
        somethingCompleted: "true"

The above request executes a maximum of 3 times before the failure is propagated.

Support Value() across files

When running strest against a directory, it would be beneficial to use response data from file 1 as a Value() in file 2 request.
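A hypothetical sketch of how this could look — the request layout follows existing strest files, but the cross-file Value() reference is the proposed behavior, not an existing feature:

```yaml
# file1.strest.yml
requests:
  createUser:
    url: http://localhost:3001/user
    method: POST

# file2.strest.yml — Value() refers to a request defined in file1
requests:
  getUser:
    url: http://localhost:3001/user/Value(createUser.id)
    method: GET
```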

Support for form-data

In my case, my API accepts files and performs some operations on them, after which a new file is returned.
It would be good if strest supported form-data, which would work the same way as other data:
one would provide the name of the parameter and then the file path (either absolute or relative to the directory where strest is executed).
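One possible shape for this, sketched under the assumption that a form key would sit alongside the existing data options (the file() lookup syntax is hypothetical):

```yaml
requests:
  upload:
    url: http://localhost:3001/convert
    method: POST
    data:
      form:
        document: file(./fixtures/input.pdf)  # path relative to where strest runs
```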

Feature: Response code validation

Add feature to validate the response status code

version: 1

requests:
  test:
    url: https://echo.getpostman.com/status/400
    method: GET
    validate:
      code: 2xx # status code needs to be in range of 200 - 299

execution order

execution order should be:

test_dir
├── A_subdir
│   ├── 3.strest.yml
│   └── 4.strest.yml
├── Z_subdir
│   ├── 5.strest.yml
│   └── 6.strest.yml
├── 1.strest.yml
└── 2.strest.yml

Executing strest tests/success/ results in the following execution order:

✔ Testing todoOne succeeded (0.188s)
✔ Testing todoTwo succeeded (0.107s)
✔ Testing todoOne succeeded (0.221s)
✔ Testing todoTwo succeeded (0.102s)
✔ Testing postman succeeded (0.293s)
✔ Testing my_variable_request succeeded (0.102s)
✔ Testing environment succeeded (0.101s)
✔ Testing fake succeeded (0.195s)
✔ Testing arr succeeded (0.571s)
✔ Testing arr1 succeeded (0.106s)
✔ Testing value1 succeeded (0.215s)
✔ Testing value2 succeeded (0.184s)
✔ Testing basic_auth succeeded (0.19s)
✔ Testing logging succeeded (0.197s)
✔ Testing if_Set succeeded (0.372s)
✔ Testing skipped skipped (0.001s)
✔ Testing executed succeeded (0.105s)
✔ Testing codeValidate succeeded (0.102s)
✔ Testing code404 succeeded (0.671s)
✔ Testing headersValidate succeeded (0.124s)
✔ Testing jsonValidate succeeded (0.1s)
✔ Testing jsonpath succeeded (0.264s)
✖ Testing maxRetries failed to validate. Retrying... (1.192s)
✔ Testing maxRetries succeeded (0.193s)
✔ Testing raw succeeded (0.191s)
✔ Testing login succeeded (0.188s)
✔ Testing verify_login succeeded (0.41s)
✔ Testing verify_login_chained succeeded (0.282s)
✔ Testing chaining_var1 succeeded (0.096s)
✔ Testing chaining_var2 succeeded (0.108s)

Based on current tests, this should be the order:

✔ Testing todoOne succeeded (0.188s)
✔ Testing todoTwo succeeded (0.107s)
✔ Testing postman succeeded (0.293s)
✔ Testing todoOne succeeded (0.221s)
✔ Testing todoTwo succeeded (0.102s)
✔ Testing my_variable_request succeeded (0.102s)
✔ Testing basic_auth succeeded (0.19s)
✔ Testing login succeeded (0.188s)
✔ Testing verify_login succeeded (0.41s)
✔ Testing verify_login_chained succeeded (0.282s)
✔ Testing chaining_var1 succeeded (0.096s)
✔ Testing chaining_var2 succeeded (0.108s)

unit testing / CI

So far there are no unit tests for strest and no Travis CI / AppVeyor setup.

We should add these to catch regressions.
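A minimal .travis.yml sketch for a TypeScript project like this one; the npm script names are assumptions and should be adjusted to the repo's actual package.json scripts:

```yaml
language: node_js
node_js:
  - "8"
  - "10"
install:
  - npm install
script:
  - npm run build   # assumed: compiles the TypeScript sources
  - npm test        # assumed: runs the unit tests
```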

Verify against returned arrays

Say the returned response data has the following format:

[
  {
    "id": "5b9763b3ef7c4000018387cd",
    "from": "Bank",
    "to": "Bank",
    "amount": 0,
    "gameID": "SomePreDefiniedValue"
  }
]

How do I verify that "gameID" == "SomePreDefiniedValue"?
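One hedged possibility, assuming array elements could be addressed by index in a dotted validation path — this mirrors the jsonpath validation proposed elsewhere in these issues, not a shipped feature:

```yaml
requests:
  transactions:
    url: http://localhost:3001/transactions
    method: GET
    validate:
      jsonpath:
        0.gameID: SomePreDefiniedValue  # first element of the returned array
```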

Feature: Set execution order

Add a parameter to each test file that defines the order in which the files should be executed.

# File 1
version: 1
order: 1 # will be executed first
requests:
   ... 
# File 2
version: 1
order: 2 # will be executed after file 1
requests:
   ... 

Error response data

For failed responses, it would be useful for debugging to print the response data, including headers, to the console. These usually include a request ID and validation messages in the case of 4XX status codes.

Release 2.0

  • API
  • use HAR format for requests
  • nunjucks templating

Log output invisible on Solarized color scheme

It took me a few minutes to realize that the log output was not missing, but simply colored in such a way that makes it invisible in terminals using the Solarized color scheme. Selecting the text shows the log output.

Before: (screenshot: solarized-colors-bad-fs8)

After: (screenshot: solarized-colors-highlited-fs8)

Execute an array of folders or tests

strest execute_all_this.yml

where execute_all_this.yml contents are:

---
- path/to/folder/with/strest_dot_yml_files
- path/to/folder/my.strest.yml
- path/to/folder/with/more/strest_dot_yml_files

Codegen

Provide a codegen function.

strest --output curl

would emit the equivalent curl commands to perform the actions.
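For example, given a request like the following (the --output curl flag is the proposal, not an existing option):

```yaml
requests:
  createTodo:
    url: https://jsonplaceholder.typicode.com/todos
    method: POST
    data:
      json:
        title: foo
```

the codegen might emit something like: curl -X POST -H 'Content-Type: application/json' -d '{"title":"foo"}' https://jsonplaceholder.typicode.com/todos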

Issue with Env() on Mac OS and node 8

On Mac OS 10.13.6, with node 8.11.3, I have a test such as

requests:
  login:
    url: Env(URL)
    method: POST
    ....

I set URL in my env to http://foo.com/bar, but when I run the test it errors saying it can't talk to 127.0.0.1 for the url. Exact error:

strest ./post.yml

[ Strest ] Found 1 test file(s)
[ Strest ] Schema validation: 1 of 1 file(s) passed

✖ Testing login failed (0.041s)

Error: connect ECONNREFUSED 127.0.0.1:80

I took the exact same version of strest, the same strest test file, and the same env variable, on a Linux box with node 6.14.0, and the test works correctly (uses http://foo.com/bar). It seems as though the Env() call on node 8 doesn't recognize my environment variable on node 8 + Mac OS but does on node 6 + Linux. I did some googling but couldn't find any reported changes or issues with reading the env in node 8 vs. 6. On node 8 I wrote a one-liner node app to print process.env and I see my environment (including URL), so I don't think there are. Maybe an issue with the templating/YAML parsing between versions where the Env() call exists?

Output response

Assume 2 files where the goal is to output an env var from the request in file 1 and use it in file 2

File 1:

requests:
  test:
    url: https://postman-echo.com/get
    method: GET
    data:
      params:
        foo: bar
    output_env_vars:
      FILE_1_OUTPUT: args.foo

output:

{"args":{"foo":"bar"}}
requests:
  test:
    url: https://postman-echo.com/get
    method: GET
    data:
      params:
        foo: Env(FILE_1_OUTPUT)

Feature: Logfiles

Add an option to the CLI command to create a log file, formatted as a JSON array with an entry for each request, logging the response code, data, headers, and any error.

Extract from array if

Assume a request returns:

[ {"foo":1, "bar": 1},  {"foo":2, "bar": 2} ]

It would be beneficial to extract the value of key bar given a value for foo.

This is typically handled in code using loops or find functions.
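A hypothetical syntax sketch, assuming some kind of predicate selector were added (none of these keys exist today):

```yaml
requests:
  lookup:
    url: http://localhost:3001/items
    method: GET
    extract:
      myBar:
        from: response      # the returned array
        where: foo == 2     # predicate matched against each element
        select: bar         # key to pull from the matching element
```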

Schema validation fails silently

Describe the bug
I've written a request that fails schema validation but the strest run does not fail.

To Reproduce

  • create a request yml that is invalid
  • run it
  • the run passes

Expected behavior

  • the run should fail with a non-zero exit code

Additional context

[ Strest ] Found 3 test file(s)
[ Strest ] Schema validation: 2 of 3 file(s) passed

Executing tests in ./
Executing tests in: /app/tests/
Executing tests in: /app/tests/tokens/
✔ Testing userToken succeeded (0.573s)

[ Strest ] ✨  Done in 0.634s

The only indication that something went wrong is 2 of 3 file(s) passed at the top, which is easy to miss, especially in longer scripts.

allow self signed cert

error:

/ # strest foo.yml

[ Strest ] Found 1 test file(s)
[ Strest ] Schema validation: 1 of 1 file(s) passed


✖ Testing login failed (0.434s)

Error: self signed certificate

[ Strest ] Failed before finishing all requests

solution:
add a param to allow self-signed certs:
axios/axios#535
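Since strest uses axios under the hood, this could be surfaced as a per-request (or CLI) flag that relaxes TLS verification. The key name below is a sketch, not an existing option:

```yaml
requests:
  login:
    url: https://self-signed.example.com/auth
    method: POST
    allowInsecure: true   # hypothetical: skip TLS certificate verification for this request
```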

Nested functions

I haven't tested this, so maybe it is already possible.

Support n levels deep of this:

Value(foo.Env(BAR))

Postman to stREST

strest foo.postman.json --postman

This would convert a postman collection into a set of directories and files.

There's no reasonable way to convert all the functionality (the JavaScript scripts), so this should focus on creating the requests only.

color highlighting in vscode

Possibly add this to a readme.

Assuming Dark+ theme

using this plugin:
https://github.com/fabiospampinato/vscode-highlight

add these settings:

    "highlight.regexes": {
      "(Value\\(.*?\\))": {
        "regexFlags": "g",
        "filterFileRegex": ".*\\.strest\\.yml",
        "decorations": [
          { "color": "#9CDCFE" }
        ]
      },
        "(Env\\(.*?\\))": {
          "regexFlags": "g",
          "filterFileRegex": ".*\\.strest\\.yml",
          "decorations": [
            { "color": "#C586C0" }
          ]
        }
      }

Support BDD test

Hi guys,

With Postman, I follow the BDD-style Given...When...Then convention for naming requests in a flow. Example:

Given user has a balance 
When user topup +$10
Then user balance will be increased by $10

I suggest adding a new field to describe the above style.

If possible, please let me know the fastest way to add a new field; I'd be happy to contribute a PR.

Thanks for your awesome efforts.
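One way such a field could look, assuming a free-text description key were added per request (the description key is hypothetical):

```yaml
requests:
  topup:
    description: |
      Given user has a balance
      When user tops up +$10
      Then user balance will be increased by $10
    url: http://localhost:3001/topup
    method: POST
```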

Add Dockerfile

Installing node/npm could be avoided for those using Docker.

error code when test failing

Hi, thanks for writing this! I like the clean yaml approach to writing tests. So I started to do a little testing to see how this would fit in our processes for API validation / testing.

When running a simple test when I get an HTTP 400+ response I would expect this would result in setting the exit code in terminal.

Success

version: 1

requests:
  get-test-todo:
    url: https://jsonplaceholder.typicode.com/todos/1
    method: GET
$ strest test.yml

[ Strest ] Found 1 test file(s)
[ Strest ] Schema validation: 1 of 1 file(s) passed

✔ Testing get-app-version succeeded

[ Strest ] ✨  Done in 0.131s

$ echo $?
0

Failure

version: 1

requests:
  get-test-todo:
    url: https://jsonplaceholder.typicode.com/todos/230
    method: GET
$ strest test.yml

[ Strest ] Found 1 test file(s)
[ Strest ] Schema validation: 1 of 1 file(s) passed

✔ Testing get-test-todo succeeded
✖ Testing get-more-todo failed

Error: Request failed with status code 404

[ Strest ] ✨  Done in 0.453s

$ echo $?
0

If there are other good ways to act on a failing test, I am open to suggestions.

Thanks again for this project.

Improved response validation

Thanks for this nice tool!

When writing tests it's also important to test failures, e.g. making sure that submitting invalid credentials does not give an auth token. It would be nice if you could do more advanced assertions on the response like so:

requests:
  login: # a successful login
    url: http://localhost:8080/auth
    method: POST
    body:
      json:
        username: alice
        password: secret
    response:
      code: [200-300) # expect a response code in range 200 (inclusive) to 300 (exclusive)
      body: 
        type: Base64
  login using params:
    url: http://localhost:8080/auth
    method: POST
    params:
      username: alice
      password: secret
    response:
      code: [200-300) # expect a response code in range 200 (inclusive) to 300 (exclusive)
      body:
        type: Base64
  login wrong password:
    url: http://localhost:8080/auth
    method: POST
    body:
      json:
        username: alice
        password: 123
    response:
      code: [400-500) # expect a response code in range 400 (inclusive) to 500 (exclusive)
      body:
        json:
          error: wrong username or password

Stress testing ?

Why not add support for stress testing, letting the test writer define how many requests per second to simulate?
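A sketch of what per-request load settings might look like (all of these keys are hypothetical):

```yaml
requests:
  healthCheck:
    url: http://localhost:3001/health
    method: GET
    stress:
      requestsPerSecond: 100
      duration: 30   # seconds to sustain the load
```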

Reusable tests

I think it is a good idea to be able to extract repeatable tests in dedicated files and import them.

Ex: I want to modularize some tests by domain, and each domain needs to have an initial authentication request that is common to all tests.
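A hypothetical import mechanism, sketched on top of the existing file format (the imports key does not exist today):

```yaml
# auth.strest.yml — shared by every domain
requests:
  login:
    url: http://localhost:3001/auth
    method: POST

# orders.strest.yml
version: 1
imports:
  - ./auth.strest.yml   # run login before this file's own requests
requests:
  listOrders:
    url: http://localhost:3001/orders
    method: GET
```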

Curl to request

Add feature that converts a curl command to a request in yaml syntax
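For example — the conversion command itself is the proposal; the YAML shape below follows existing strest files, and the generated request name is an assumption:

```yaml
# Input:  curl -X POST -H 'Content-Type: application/json' \
#              -d '{"title":"foo"}' https://jsonplaceholder.typicode.com/todos
# Output:
requests:
  request1:
    url: https://jsonplaceholder.typicode.com/todos
    method: POST
    data:
      json:
        title: foo
```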

Feature: Basic Auth

support this:

version: 1

requests:
  login:
    url: https://postman-echo.com/basic-auth
    Auth:
        basic:
            username: postman
            password: password
    method: GET

This would result in a header:
Authorization: Basic cG9zdG1hbjpwYXNzd29yZA==

Add filename to log output

When executing against a directory, showing the filename (and possibly the sub-directories' names) would be beneficial.

WIP: validate pointer

Support this:

version: 1

requests:
  todoOne:
    url: https://jsonplaceholder.typicode.com/posts
    method: POST
    data:
      json:
        myArray:
        - foo: 1
          bar: 1
        - foo: 2
          bar: 2
    validate:
      jsonpath:
        myArray.1.foo: 2

Abort directory execution on failure

When executing strest foo_dir/, file 2 will still run its tests even if a request in file 1 fails.

Option 1

Support strest --abort foo_dir/

Option 2

Change the default behavior to abort directory execution if a request fails.
This might also require the ability to override the behavior on a per-file basis:

version: 1
continue: true
...

Define Environment

This would allow the defined env vars to be stored in an initial test file.

version: 1
environment:
  STREST_URL: https://jsonplaceholder.typicode.com
requests:
  environment:
    url: Env(STREST_URL)/todos/1
    method: GET

Filename()

Assume the filename is postman-echo.strest.yml

Access the filename in the request:

version: 1                            # only version at the moment

requests:                             # all test requests will be listed here
  testRequest:                        # name the request however you want
    url: https://Filename().com/get  # required
    method: GET                       # required

log_to_file

  logger:
    url: foo.com
    method: GET
    log_to_file: true

This would result in a file called logger.json that contains the output of the request.

Environment Variables

Use Environment Variables to replace {{}}

version: 1                            # only version at the moment
requests:                             # all test requests will be listed here
  foo:
    url: "{{MY_ENV_VAR}}/api/v1/bar"
    method: GET                       # required
