multiplex's People

Contributors

atheurer, k-rister, rafaelfolco

multiplex's Issues

update how mv-param sets are defined

Here is an example of the new format:

{
  "global-options": [
    {
      "name": "common-params",
      "params": [
        { "arg": "devices", "vals": ["0000:03:00.0,0000:03:00.1,0000:82:00.0,0000:82:00.1"] },
        { "arg": "send-teaching-warmup", "vals": ["ON"]},
        { "arg": "teaching-warmup-packet-type", "vals": ["generic"]},
        { "arg": "cpus", "vals": ["2,3,4,5,6,7,8,9,10,11"]},
        { "arg": "use-src-mac-flows", "vals": ["1"] },
        { "arg": "use-dst-mac-flows", "vals": ["1"] },
        { "arg": "num-flows", "vals": ["1024"] },
        { "arg": "one-shot", "vals": ["1"] },
        { "arg": "rate-unit", "vals": ["mpps"] },
        { "arg": "rate", "vals": ["14"] },
        { "arg": "validation-runtime", "vals": ["180"] },
        { "arg": "frame-size", "vals": ["64"] }
      ]
    }
  ],
  "sets": [
    { "include": "common-params",
      "params": [
        { "arg": "active-devices", "vals": ["0000:03:00.0,0000:03:00.1"]}
      ]
    },
    { "include": "common-params",
      "params": [
        { "arg": "active-devices", "vals": ["0000:82:00.0,0000:82:00.1"]}
      ]
    }
  ]
}

The difference is that, in the "sets" section, each set now has its own "params" array instead of placing the arg/vals objects in the same scope as the "include" object.
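
For contrast, a set entry in the older style would have put the arg/vals objects alongside the include, roughly like this (reconstructed for illustration, not copied from an actual file):

    [
        { "include": "common-params" },
        { "arg": "active-devices", "vals": ["0000:03:00.0,0000:03:00.1"] }
    ]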

Switch to Go code for multiplex

Break up the functionality into separate utils:

  • expand: take multi-val params (--rw=read,write) and output all single-val params (--rw=read, --rw=write).
  • validate: take multi-val params and validate them against bench-specific regexes.

All utils should use JSON for output by default (see the sketch below).
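
A minimal Python sketch of what the expand utility would do (illustrative only; the eventual Go implementation and exact output shape are still to be decided):

    import json

    def expand(param):
        # one single-val param per value in the multi-val param
        return [{"arg": param["arg"], "val": v} for v in param["vals"]]

    print(json.dumps(expand({"arg": "rw", "vals": ["read", "write"]})))
    # [{"arg": "rw", "val": "read"}, {"arg": "rw", "val": "write"}]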

control param usage/function

I think the main thing that is missing is the option to control what node/tier/role a parameter is intended for (schema-wise, that is).
Andrew and I have only talked about wanting to add that; nothing has been done yet.
We want to be able to avoid having to put client-, server-, or infra- in front of the individual parameters.
Plus, if we do that, we have to tell the tiers to ignore parameters destined for the other tiers.
If we move to something like { "arg": "foo", "vals": [ val1, val2, val3, etc. ], "role": "" }, then the parameters will be directed to the right place.
I think we were thinking that role would default to client if it's not specified.
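
A minimal sketch of the proposed default, assuming params in the { "arg", "vals", "role" } form above:

    def param_role(param):
        # role defaults to client when not specified
        return param.get("role", "client")

    print(param_role({"arg": "foo", "vals": ["val1"]}))                       # client
    print(param_role({"arg": "ifname", "vals": ["eth0"], "role": "server"}))  # server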

Unify on/off params into key-value --param={1,0}

Make on/off params consistent with the key-value ones.
On/off params represent options that are enabled or disabled by the param being present or omitted.
Key-value 0/1 params enable or disable options with option=1 and option=0.

The idea is that all args use the key-value form (arg=0 or arg=1) and multiplex knows which args are of the on/off type and should be omitted when disabled.
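
A rough sketch of the idea (the set of on/off args here is made up; multiplex would learn the real ones from the benchmark's requirements):

    # args that are plain on/off switches on the benchmark command line
    ON_OFF_ARGS = {"one-shot"}   # illustrative only

    def render_arg(arg, val):
        if arg in ON_OFF_ARGS:
            return "--" + arg if val == "1" else None   # omit when disabled
        return "--%s=%s" % (arg, val)

    print(render_arg("one-shot", "1"))     # --one-shot
    print(render_arg("one-shot", "0"))     # None (omitted)
    print(render_arg("frame-size", "64"))  # --frame-size=64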

log levels

Replace print statements with log levels (warning, debug, info).
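
A minimal sketch of what this could look like with the standard logging module (the messages are made up):

    import logging

    logging.basicConfig(level=logging.INFO, format="%(levelname)s: %(message)s")
    log = logging.getLogger("multiplex")

    log.debug("expanded param sets: ...")        # hidden unless the level is DEBUG
    log.info("loaded 2 parameter sets")          # normal progress output
    log.warning("no validation rule for 'bs'")   # something the user should notice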

Fix conversion of range vals

For params whose value is a range, e.g.:

25k-30k

the conversion needs to be applied to both the min and the max values.

Example: see the bench-fio multiplex.json.
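
A sketch (not the actual multiplex code) of converting both ends of a min-max range, assuming the bench-fio size convention where k is 1024:

    import re

    SUFFIX = {"k": 1024, "m": 1024**2, "g": 1024**3}

    def convert_range(val):
        m = re.fullmatch(r"(\d+)([kmg]?)-(\d+)([kmg]?)", val, re.IGNORECASE)
        if not m:
            return val
        lo = int(m.group(1)) * SUFFIX.get(m.group(2).lower(), 1)
        hi = int(m.group(3)) * SUFFIX.get(m.group(4).lower(), 1)
        return "%d-%d" % (lo, hi)

    print(convert_range("25k-30k"))   # 25600-30720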

disabled parameter is enabled on second set?

$ cat iperf-1.json
{
    "global-options": [
        {
            "name": "required",
            "params": [
                { "arg": "time", "vals": [ "30" ], "role": "client" },

                { "arg": "ifname", "vals" : [ "ens18" ], "role": "server", "enabled": "no" },
                { "arg": "ifname", "vals" : [ "eth0" ], "role": "server", "enabled": "yes" },

                { "arg": "bitrate", "vals": [ "1G" ], "role": "client", "enabled": "no" }
            ]
        }
    ],
    "sets": [
        {
            "include": "required",
            "params": [
                { "arg": "length", "vals": [ "16K" ] },
                { "arg": "protocol", "vals": [ "tcp" ] }
            ]
        },
        {
            "include": "required",
            "params": [
                { "arg": "length", "vals": [ "16K" ] },
                { "arg": "protocol", "vals": [ "udp" ] }
            ]
        }
    ]
}
$ /opt/crucible/subprojects/core/multiplex/multiplex.py --input iperf-1.json
[
    [
        {
            "arg": "time",
            "role": "client",
            "val": "30"
        },
        {
            "arg": "ifname",
            "role": "server",
            "val": "eth0"
        },
        {
            "arg": "length",
            "role": "client",
            "val": "16K"
        },
        {
            "arg": "protocol",
            "role": "client",
            "val": "tcp"
        }
    ],
    [
        {
            "arg": "time",
            "role": "client",
            "val": "30"
        },
        {
            "arg": "ifname",
            "role": "server",
            "val": "ens18"
        },
        {
            "arg": "bitrate",      <-----------------
            "role": "client",
            "val": "1G"
        },
        {
            "arg": "length",
            "role": "client",
            "val": "16K"
        },
        {
            "arg": "protocol",
            "role": "client",
            "val": "udp"
        }
    ]
]

enable / disable params in mv-params.json

add the ability to enable / disable params in the input mv-params.json

include:

"enabled": "true|false"

This requires the schema to understand the new key "enabled". The multiplex code needs to skip params that are not enabled (see the sketch below).
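
A minimal sketch of honoring the key, accepting both the "true|false" form proposed here and the "yes|no" form used in the report above:

    def is_enabled(param):
        return str(param.get("enabled", "true")).lower() not in ("false", "no")

    params = [
        {"arg": "ifname", "vals": ["ens18"], "enabled": "no"},
        {"arg": "ifname", "vals": ["eth0"], "enabled": "yes"},
        {"arg": "time", "vals": ["30"]},
    ]
    print([p["arg"] for p in params if is_enabled(p)])   # ['ifname', 'time']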

[multiplex.json] params validation

multiplex needs to learn about the multiplex.json file provided by a workload (https://github.com/perftool-incubator/bench-fio/blob/master/multiplex.json) and handle all of that: parameter validation, parameter transformation, default parameters, mandatory parameters, parameter presets, and so on.

This PR is about params validation.

    "time_smh" : { 
          "type": "validation",
          "description" : "time in seconds, minutes, or hours: 10s 2m 1h",
          "arguments" : [ "runtime", "steadystate_duration", "steadystate_ramp_time", "steadystate"  ],
          "value" : "^[0-9]+[smh]$"
    },

multiplex needs to read the settings file multiplex.json and validate all params listed in each arguments array.
Proposal (see the sketch below):

  • create a "type" key to identify validation or transformation.
  • value replaces the original value_regex key.
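
A minimal sketch of that validation pass, assuming requirements entries of the shape shown above:

    import re

    requirements = {
        "time_smh": {
            "type": "validation",
            "arguments": ["runtime", "steadystate_duration"],
            "value": "^[0-9]+[smh]$"
        }
    }

    def validate(arg, val):
        for req in requirements.values():
            if req.get("type") == "validation" and arg in req["arguments"]:
                return re.search(req["value"], val) is not None
        return True   # no rule for this arg (see the separate issue about requiring one)

    print(validate("runtime", "10s"))   # True
    print(validate("runtime", "90x"))   # False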

integration tests

add integration tests to ensure multiplex.py runs correctly (see the sketch after this list):

  • with no input
  • with no requirements
  • with bad requirements
  • with bad input
  • check return codes
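
A sketch of what such tests could look like with pytest and subprocess (the exact return-code behavior and fixture file names are assumptions):

    import subprocess
    import sys

    def run_multiplex(*args):
        return subprocess.run([sys.executable, "multiplex.py", *args],
                              capture_output=True, text=True)

    def test_no_input():
        # assumes multiplex.py exits non-zero when no input is given
        assert run_multiplex().returncode != 0

    def test_bad_input():
        # hypothetical fixture containing malformed mv-params JSON
        assert run_multiplex("--input", "tests/bad-mv-params.json").returncode != 0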

Load params presets from the requirements file

multiplex needs to learn about the multiplex.json file provided by a workload (https://github.com/perftool-incubator/bench-fio/blob/master/multiplex.json) and handle all of that: parameter validation, parameter transformation, default parameters, mandatory parameters, parameter presets, and so on.

This PR is about reading default and mandatory params.

The presets section contains essentials, defaults, plus custom classes of params with arbitrary preset names:

    "presets": {
        "essentials": [
            { "arg": "write_iops_log", "vals": ["fio"] },
            { "arg": "write_lat_log", "vals": ["fio"] }
        ],
        "defaults": [
            { "arg": "rw", "vals": ["read", "randread"] },
            { "arg": "bs", "vals": ["16k"] }
        ],
        "sequential-read": [
            { "arg": "rw", "vals": [ "read" ] },
            { "arg": "bs", "vals": [ "4k" ] }
        ]
    },

where:

  • essentials will always be used on top of everything else. These params are appended to each set of the params array.
  • defaults will only be used if not defined in the multi-value params file.
  • other presets will only be used where include: <preset_name> is defined

Precedence (see the sketch after this list):

  • 1st: essentials
  • 2nd:
    • param sets from multi-value params file
    • <preset_name> (classes of params presets), where include: <preset_name> is defined
  • 3rd: defaults
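
A sketch of that precedence (assuming, within the 2nd level, that the user's own params win over an included preset class):

    def merge_set(presets, user_set, include=None):
        merged = {}
        for p in presets.get("defaults", []):       # 3rd: defaults fill the gaps
            merged[p["arg"]] = p["vals"]
        if include:                                 # 2nd: included preset class
            for p in presets.get(include, []):
                merged[p["arg"]] = p["vals"]
        for p in user_set:                          # 2nd: multi-value params file
            merged[p["arg"]] = p["vals"]
        for p in presets.get("essentials", []):     # 1st: essentials, always on top
            merged[p["arg"]] = p["vals"]
        return merged

    presets = {
        "essentials": [{"arg": "write_iops_log", "vals": ["fio"]}],
        "defaults": [{"arg": "rw", "vals": ["read", "randread"]}, {"arg": "bs", "vals": ["16k"]}],
        "sequential-read": [{"arg": "rw", "vals": ["read"]}, {"arg": "bs", "vals": ["4k"]}],
    }
    print(merge_set(presets, [{"arg": "bs", "vals": ["64k"]}], include="sequential-read"))
    # {'rw': ['read'], 'bs': ['64k'], 'write_iops_log': ['fio']}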

make the generated test report visible

Generate an HTML report with:

pytest -q --html=logs/report.html --self-contained-html multiplex.py tests/*.py

and make the HTML report a visible/accessible artifact.

parameter validation only occurs if len(vals) > 1

multiplex/multiplex.py

Lines 127 to 134 in 7ac0b07

    if len(obj[set_idx]['vals']) > 1:
        for copies in range(0, len(obj[set_idx]['vals'])):
            param = obj[set_idx]['arg']
            val = obj[set_idx]['vals'][copies]
            # check if param passes validation pattern
            if 'validation_dict' in globals() and validation_dict is not None:
                if not param_validated(param, val):
                    return(None)

refactor t_global class

    class t_global(object):
        args = None

This was originally created to hold global vars from cmdline args.
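
One possible direction for the refactor (a sketch, not a decided design): pass the parsed arguments around explicitly instead of stashing them in a class used as a global namespace.

    import argparse

    def parse_args(argv=None):
        parser = argparse.ArgumentParser(prog="multiplex")
        parser.add_argument("--input")
        return parser.parse_args(argv)

    def main(argv=None):
        args = parse_args(argv)   # local, instead of t_global.args
        return args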

Every param must have a validation in the requirements file

If multiplex is instructed to load a requirements validation file, should it error on any parameter that does not have a validation rule?

The intent is that once a submission passes multiplex, it should run and not fail because of invalid parameters.
If we don't test all parameters, we can't guarantee that.

If need be, you can have a regex which is a catch-all for almost anything (see the example below).
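
For illustration, a hypothetical catch-all entry in the same style as the snippet above (the entry and argument names are made up):

    "free_form" : {
          "type": "validation",
          "description" : "catch-all: accept any non-empty value",
          "arguments" : [ "some_free_form_arg" ],
          "value" : "^.+$"
    },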

[multiplex.json] params transformation

multiplex needs to learn about the multiplex.json file provided by a workload (https://github.com/perftool-incubator/bench-fio/blob/master/multiplex.json) and handle all of that: parameter validation, parameter transformation, default parameters, mandatory parameters, parameter presets, and so on.

This PR is about params transformation.

  "requirements": {    "size_KMG" : {      "description" : "bytes in k/K (1024), m/M (1024^2) or g/G (1024^3): 4k 16M 1g",      "arguments" : [ "bs", "filesize", "io_size", "mem" ],      "value_regex" : "[0-9]+[kbmgKBMG]",      "value_transforms" : [ "s/([0-9]+)[gG]/($1*1024).\"M\"/e",        "s/([0-9]+)[mM]/($1*1024).\"K\"/e"      ]    },
  • add a "type" key: "type": "transformation"
  • value_transforms becomes value (see the sketch below)
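
A Python sketch of the effect of those size transforms (the real file uses Perl-style s///e expressions; applying them in sequence is an assumption):

    import re

    def transform_size(val):
        # G -> M, then M -> K, mirroring the two value_transforms above
        val = re.sub(r"([0-9]+)[gG]", lambda m: str(int(m.group(1)) * 1024) + "M", val)
        val = re.sub(r"([0-9]+)[mM]", lambda m: str(int(m.group(1)) * 1024) + "K", val)
        return val

    print(transform_size("16M"))   # 16384K
    print(transform_size("1g"))    # 1048576K (1g -> 1024M -> 1048576K)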

Please add example file

Below is an example file for a uperf benchmark. The goal of the benchmark is to run the tests for a point release of OpenShift. If possible, could this example file be added to the repo so others can modify it as needed?

    "common": [
    {
    	"name":"global",
    	"params": [
        { "arg": "duration", "vals": [ "90" ] },
        { "arg": "protocol", "vals": [ "tcp" ] },
        { "arg": "nthreads", "vals": [ "1","16","64" ] },
        { "arg": "server-ifname", "vals": [ "eth0" ] }
	]
    }
    ],
    "sets": [
        [
	    { "common": "global"},
            { "arg": "test-type", "vals": [ "stream" ] },
            { "arg": "wsize", "vals": [ "64", "16384" ] }
        ],
        [
	    { "common": "global"},
            { "arg": "test-type", "vals": [ "rr" ] },
            { "arg": "wsize", "vals": [ "64" ] },
            { "arg": "rsize", "vals": [ "1024","16384" ] }
        ]
    ]
}

add version key to json schema

bench-params.json should have a version tag in it,
much like we have for many other JSONs.
We are using json-validator to confirm the schema of this file in rickshaw-run.

We just add a new key somewhere, like "version".
See the schemas for rickshaw under ./subprojects/core/rickshaw/schema.
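
For illustration, a minimal file of the mv-params shape shown earlier with such a key added (the version string and placement are made up):

    {
        "version": "1.0",
        "global-options": [],
        "sets": []
    }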
