perftool-incubator / multiplex
Parameter multiplexer targeted at benchmark automation
License: Apache License 2.0
Here is an example of the new format:
{
  "global-options": [
    {
      "name": "common-params",
      "params": [
        { "arg": "devices", "vals": ["0000:03:00.0,0000:03:00.1,0000:82:00.0,0000:82:00.1"] },
        { "arg": "send-teaching-warmup", "vals": ["ON"] },
        { "arg": "teaching-warmup-packet-type", "vals": ["generic"] },
        { "arg": "cpus", "vals": ["2,3,4,5,6,7,8,9,10,11"] },
        { "arg": "use-src-mac-flows", "vals": ["1"] },
        { "arg": "use-dst-mac-flows", "vals": ["1"] },
        { "arg": "num-flows", "vals": ["1024"] },
        { "arg": "one-shot", "vals": ["1"] },
        { "arg": "rate-unit", "vals": ["mpps"] },
        { "arg": "rate", "vals": ["14"] },
        { "arg": "validation-runtime", "vals": ["180"] },
        { "arg": "frame-size", "vals": ["64"] }
      ]
    }
  ],
  "sets": [
    {
      "include": "common-params",
      "params": [
        { "arg": "active-devices", "vals": ["0000:03:00.0,0000:03:00.1"] }
      ]
    },
    {
      "include": "common-params",
      "params": [
        { "arg": "active-devices", "vals": ["0000:82:00.0,0000:82:00.1"] }
      ]
    }
  ]
}
The difference is that in the "sets" section a new "params" array is created, instead of including the arg/vals objects in the same scope as the "include" object.
Break up function into separate utils:
expand: take multi-val params (--rw=read,write) and output all single-val params (--rw=read, --rw=write).
validate: take multi-val params and validate them against bench-specific regexes.
All utils should use JSON for output by default.
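A minimal sketch of what the expand util could look like, assuming params arrive as the arg/vals objects shown earlier (the function name and layout here are illustrative, not the repo's actual code):

```python
import itertools
import json

def expand(params):
    """Expand multi-value params into every single-value combination.

    params: a list of {"arg": ..., "vals": [...]} objects.
    Returns a list of param sets, each a list of {"arg": ..., "val": ...}.
    """
    per_arg = [[{"arg": p["arg"], "val": v} for v in p["vals"]] for p in params]
    return [list(combo) for combo in itertools.product(*per_arg)]

# --rw=read,write and --bs=4k expand to two single-val param sets
params = [
    {"arg": "rw", "vals": ["read", "write"]},
    {"arg": "bs", "vals": ["4k"]},
]
print(json.dumps(expand(params)))
```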
Currently only a single include is supported:
"include": "<label>"
Maybe the best way to implement this would be an additional property that looks something like this:
"includes": [ "<label1>", "<label2>", ... ]
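One possible implementation, sketched in Python; resolve_includes and the fallback to the existing single "include" key are assumptions, not existing code:

```python
def resolve_includes(set_obj, global_options):
    """Merge params from every label in "includes" (falling back to the
    existing single "include" key) ahead of the set's own params."""
    by_name = {g["name"]: g["params"] for g in global_options}
    labels = list(set_obj.get("includes", []))
    if "include" in set_obj:
        # keep backward compatibility with the single-include format
        labels.insert(0, set_obj["include"])
    merged = []
    for label in labels:
        merged.extend(by_name[label])
    merged.extend(set_obj.get("params", []))
    return merged
```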
i think the main thing that is missing is the option to control what node/tier/role a parameter is intended for -- schema-wise, that is
andrew and i have only talked about wanting to add that, nothing has been done
we want to be able to avoid having to put client-, server-, or infra- in front of the individual parameters
plus if we do that we have to tell the tiers to ignore parameters destined for the other tiers
if we move to something like { "arg": "foo", "vals": [ val1, val2, val3, etc ], "role": "" } then the parameters will be directed to the right place
i think we were thinking that role would default to client if it's not specified
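The defaulting and filtering described above could be sketched like this (nothing here exists in the code yet; names are hypothetical):

```python
def param_role(param):
    # per the discussion above, an unspecified role defaults to "client"
    return param.get("role", "client")

def params_for_role(params, role):
    # each tier keeps only the params destined for it and ignores the rest
    return [p for p in params if param_role(p) == role]
```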
multiplex needs to learn about multiplex.json that would be provided by a workload (https://github.com/perftool-incubator/bench-fio/blob/master/multiplex.json) and handle all of that stuff.
This PR is about loading it as an argument: --requirements </path/to/multiplex.json>
add unit tests
make on/off params consistent with the key-value ones.
on/off params represent options that are enabled or disabled by the param being present or omitted.
key-value 0/1 params enable or disable options with option=0 and option=1.
The idea is that all args use the key-value form (arg=0 or arg=1) and multiplex will know which ones should instead be emitted as present/omitted (the on/off type).
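A sketch of how multiplex could render uniform arg=0/1 input once it knows which args are on/off style (ON_OFF_ARGS and render_cli are hypothetical names used only for illustration):

```python
# hypothetical registry of args that are on/off (present/absent) style
ON_OFF_ARGS = {"one-shot"}

def render_cli(params):
    """Render params as CLI flags: on/off args are emitted bare when "1"
    and omitted entirely when "0"; everything else stays key-value."""
    out = []
    for p in params:
        if p["arg"] in ON_OFF_ARGS:
            if p["val"] == "1":
                out.append("--" + p["arg"])
        else:
            out.append("--%s=%s" % (p["arg"], p["val"]))
    return out
```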
replace print with log levels (warning, debug, info)
params that are a range of vals:
25k-30k
the conversion needs to be applied to both the min and max vals
example: see bench-fio multiplex.json
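A sketch of a min-max aware conversion, assuming k/m/g multipliers of 1024 as in bench-fio's size_KMG rule (the function name and unit table are illustrative):

```python
import re

UNITS = {"": 1, "k": 1024, "m": 1024 ** 2, "g": 1024 ** 3}

def convert_range(val):
    """Convert a 'min-max' value like '25k-30k', applying the unit
    conversion to both endpoints rather than only one."""
    m = re.fullmatch(r"(\d+)([kmg]?)-(\d+)([kmg]?)", val.lower())
    if m is None:
        raise ValueError("not a min-max range: %r" % val)
    lo = int(m.group(1)) * UNITS[m.group(2)]
    hi = int(m.group(3)) * UNITS[m.group(4)]
    return lo, hi

print(convert_range("25k-30k"))  # (25600, 30720)
```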
$ cat iperf-1.json
{
"global-options": [
{
"name": "required",
"params": [
{ "arg": "time", "vals": [ "30" ], "role": "client" },
{ "arg": "ifname", "vals" : [ "ens18" ], "role": "server", "enabled": "no" },
{ "arg": "ifname", "vals" : [ "eth0" ], "role": "server", "enabled": "yes" },
{ "arg": "bitrate", "vals": [ "1G" ], "role": "client", "enabled": "no" }
]
}
],
"sets": [
{
"include": "required",
"params": [
{ "arg": "length", "vals": [ "16K" ] },
{ "arg": "protocol", "vals": [ "tcp" ] }
]
},
{
"include": "required",
"params": [
{ "arg": "length", "vals": [ "16K" ] },
{ "arg": "protocol", "vals": [ "udp" ] }
]
}
]
}
$ /opt/crucible/subprojects/core/multiplex/multiplex.py --input iperf-1.json
[
[
{
"arg": "time",
"role": "client",
"val": "30"
},
{
"arg": "ifname",
"role": "server",
"val": "eth0"
},
{
"arg": "length",
"role": "client",
"val": "16K"
},
{
"arg": "protocol",
"role": "client",
"val": "tcp"
}
],
[
{
"arg": "time",
"role": "client",
"val": "30"
},
{
"arg": "ifname",
"role": "server",
"val": "ens18"
},
{
"arg": "bitrate", <-----------------
"role": "client",
"val": "1G"
},
{
"arg": "length",
"role": "client",
"val": "16K"
},
{
"arg": "protocol",
"role": "client",
"val": "udp"
}
]
]
add the ability to enable / disable params in the input mv-params.json
include:
"enabled": "true|false"
This requires the schema to understand the new key "enabled". The multiplex code needs to skip params whose "enabled" key marks them as disabled.
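A minimal sketch of the filtering step. It assumes params are enabled by default, and accepts both the "true|false" spelling proposed here and the "yes|no" spelling used in the iperf example above:

```python
def filter_enabled(params):
    # keep params unless they are explicitly marked disabled
    return [p for p in params
            if str(p.get("enabled", "true")).lower() not in ("false", "no")]
```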
multiplex needs to learn about multiplex.json that would be provided by a workload (https://github.com/perftool-incubator/bench-fio/blob/master/multiplex.json) and handle all of that stuff -- parameter validation, parameter transformation, default parameters, mandatory parameters, parameter presets....
This PR is about params validation
"time_smh" : {
"type": "validation",
"description" : "time in seconds, minutes, or hours: 10s 2m 1h",
"arguments" : [ "runtime", "steadystate_duration", "steadystate_ramp_time", "steadystate" ],
"value" : "^[0-9]+[smh]$"
},
multiplex needs to read the settings file multiplex.json and validate all params listed in each rule's arguments array.
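A sketch of that validation pass against the time_smh rule above (the validate function and its error format are assumptions, not the repo's API):

```python
import re

def validate(params, requirements):
    """Check each param's vals against every validation rule whose
    "arguments" array names the arg; return a list of failures."""
    errors = []
    for name, req in requirements.items():
        if req.get("type") != "validation":
            continue
        rx = re.compile(req["value"])
        for p in params:
            if p["arg"] in req["arguments"]:
                for v in p["vals"]:
                    if rx.search(v) is None:
                        errors.append((name, p["arg"], v))
    return errors

requirements = {
    "time_smh": {
        "type": "validation",
        "description": "time in seconds, minutes, or hours: 10s 2m 1h",
        "arguments": ["runtime", "steadystate_duration",
                      "steadystate_ramp_time", "steadystate"],
        "value": "^[0-9]+[smh]$",
    }
}
print(validate([{"arg": "runtime", "vals": ["30s", "bogus"]}], requirements))
# [('time_smh', 'runtime', 'bogus')]
```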
proposal:
add integration tests to ensure multiplex.py runs work:
multiplex needs to learn about multiplex.json that would be provided by a workload (https://github.com/perftool-incubator/bench-fio/blob/master/multiplex.json) and handle all of that stuff -- parameter validation, parameter transformation, default parameters, mandatory parameters, parameter presets....
This PR is about reading default and mandatory params
The presets section contains essentials, defaults, plus custom classes of params with arbitrary preset names...
"presets": {
    "essentials": [
        { "arg": "write_iops_log", "vals": ["fio"] },
        { "arg": "write_lat_log", "vals": ["fio"] }
    ],
    "defaults": [
        { "arg": "rw", "vals": ["read", "randread"] },
        { "arg": "bs", "vals": ["16k"] }
    ],
    "sequential-read": [
        { "arg": "rw", "vals": [ "read" ] },
        { "arg": "bs", "vals": [ "4k" ] }
    ]
},
where:
- essentials: always used on top of everything else; these params are appended to each set of the params array.
- defaults: only used if not already defined in the multi-value params file.
Precedence:
1. essentials
2. sets from the multi-value params file
3. defaults
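The precedence rules could be sketched like this (apply_presets is a hypothetical helper, not the repo's code):

```python
def apply_presets(param_set, presets):
    """Merge presets into one expanded param set.
    essentials always win; defaults only fill args defined nowhere else."""
    essentials = presets.get("essentials", [])
    defaults = presets.get("defaults", [])
    ess_args = {p["arg"] for p in essentials}
    # essentials override anything from the mv-params file
    merged = [p for p in param_set if p["arg"] not in ess_args] + essentials
    # defaults only fill in args not defined anywhere else
    have = {p["arg"] for p in merged}
    merged += [d for d in defaults if d["arg"] not in have]
    return merged
```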
include --output arg (default = bench-params.json)
generate html report with:
pytest -q --html=logs/report.html --self-contained-html multiplex.py tests/*.py
and make the HTML report an artifact that is visible / accessible
Lines 127 to 134 in 7ac0b07
class t_global(object):
    args = None
originally created to handle global vars from cmdline args
If multiplex is instructed to load a requirements validation file, should we error on any parameter that does not have a validation rule?
The intent is that once a submission passes multiplex it should run and not fail because of invalid parameters.
If we don't test all parameters we can't guarantee that.
You can have a regex which is a catch-all for most anything if need be.
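Such a catch-all could look like this in the requirements file (a hypothetical rule; the key name and arguments list are illustrative):

```json
"anything_goes": {
    "type": "validation",
    "description": "catch-all: accepts any non-empty value",
    "arguments": [ "some_freeform_arg" ],
    "value": "^.+$"
}
```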
multiplex needs to learn about multiplex.json that would be provided by a workload (https://github.com/perftool-incubator/bench-fio/blob/master/multiplex.json) and handle all of that stuff -- parameter validation, parameter transformation, default parameters, mandatory parameters, parameter presets....
This PR is about params transformation
"requirements": {
    "size_KMG" : {
        "description" : "bytes in k/K (1024), m/M (1024^2) or g/G (1024^3): 4k 16M 1g",
        "arguments" : [ "bs", "filesize", "io_size", "mem" ],
        "value_regex" : "[0-9]+[kbmgKBMG]",
        "value_transforms" : [
            "s/([0-9]+)[gG]/($1*1024).\"M\"/e",
            "s/([0-9]+)[mM]/($1*1024).\"K\"/e"
        ]
    }
},
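For illustration, the perl-style s///e rules above can be re-expressed in Python (a sketch; multiplex's actual evaluation of value_transforms may differ):

```python
import re

# Python re-expression of the two transforms in size_KMG above:
#   s/([0-9]+)[gG]/($1*1024)."M"/e   then   s/([0-9]+)[mM]/($1*1024)."K"/e
TRANSFORMS = [
    (r"([0-9]+)[gG]", lambda m: str(int(m.group(1)) * 1024) + "M"),
    (r"([0-9]+)[mM]", lambda m: str(int(m.group(1)) * 1024) + "K"),
]

def apply_transforms(value):
    """Apply the ordered substitutions; note they chain, so "1g"
    becomes "1024M" and then "1048576K"."""
    for pattern, repl in TRANSFORMS:
        value = re.sub(pattern, repl, value)
    return value

print(apply_transforms("16M"))  # 16384K
print(apply_transforms("1g"))   # 1048576K
```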
move all input/output data from the test-* cases into separate file(s) and create a fixture to load them
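A sketch of the loader such a fixture could wrap (the tests/data layout and file naming are assumptions):

```python
import json
import pathlib

DATA_DIR = pathlib.Path("tests/data")  # hypothetical layout

def load_case(name, data_dir=DATA_DIR):
    """Load one test case's input/expected data from its own JSON file,
    instead of embedding the data in the test code."""
    return json.loads((pathlib.Path(data_dir) / (name + ".json")).read_text())

# in conftest.py this could become a parametrized fixture, e.g.:
#   @pytest.fixture(params=["test-include", "test-roles"])
#   def case(request):
#       return load_case(request.param)
```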
Below is an example file for a uperf benchmark. The goal of the benchmark is to run the tests for a point release of OpenShift. If possible, could this example file be added to the repo so others can modify it as needed?
{
"common": [
{
"name":"global",
"params": [
{ "arg": "duration", "vals": [ "90" ] },
{ "arg": "protocol", "vals": [ "tcp" ] },
{ "arg": "nthreads", "vals": [ "1","16","64" ] },
{ "arg": "server-ifname", "vals": [ "eth0" ] }
]
}
],
"sets": [
[
{ "common": "global"},
{ "arg": "test-type", "vals": [ "stream" ] },
{ "arg": "wsize", "vals": [ "64", "16384" ] }
],
[
{ "common": "global"},
{ "arg": "test-type", "vals": [ "rr" ] },
{ "arg": "wsize", "vals": [ "64" ] },
{ "arg": "rsize", "vals": [ "1024","16384" ] }
]
]
}
bench-params.json should have a version tag in it
much like we have for many other jsons
using json-validator for confirming the schema of this file in rickshaw-run
we just add a new key somewhere like "version"
see the schemas for rickshaw, ./subprojects/core/rickshaw/schema
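For illustration only, a top-level version key could look like this; the value and the wrapping of the sets output are hypothetical:

```json
{
    "version": "1.0",
    "sets": [
        [ { "arg": "time", "role": "client", "val": "30" } ]
    ]
}
```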