arcalot / arcaflow-plugin-sdk-python

Python SDK for creating Arcaflow plugins

Home Page: https://arcalot.github.io/arcaflow/creating-plugins/python/

License: Apache License 2.0


arcaflow-plugin-sdk-python's Introduction

Python SDK for the Arcaflow workflow engine (WIP)

How this SDK works

To create an Arcaflow plugin, you must specify a schema for each step you want to support. The schema describes two things:

  1. What your input parameters are and what their type is
  2. What your output parameters are and what their type is

Note that you can specify several possible outputs, depending on the outcome of your plugin execution. You should, however, never raise exceptions that bubble up outside your plugin. If you do, your plugin will crash and Arcaflow will not be able to retrieve the result data, including the error, from it.
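As a sketch of that pattern, input and output are plain dataclasses and a step returns an output ID plus the matching output object instead of raising. All class and function names here are illustrative, not SDK APIs:

```python
from dataclasses import dataclass


@dataclass
class InputParams:
    name: str


@dataclass
class SuccessOutput:
    message: str


@dataclass
class ErrorOutput:
    error: str


# In a real plugin this function would carry the SDK's step decorator,
# which ties the dataclasses above into the generated schema and maps
# each output ID ("success", "error") to an output dataclass.
def hello_world(params: InputParams):
    # Return the output ID plus the matching output object; never raise.
    return "success", SuccessOutput("Hello, {}!".format(params.name))
```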

With the schema, the plugin can run in the following modes:

  1. CLI mode, where a file with the input data is loaded and the plugin is executed
  2. GRPC mode (under development), where the plugin works in conjunction with the Arcaflow Engine to enable more complex workflows

For a detailed description please see the Arcalot website.


Requirements

To use this SDK, you need at least Python 3.9.


Run the example plugin

To run the example plugin, follow these steps:

  1. Check out this repository
  2. Create a venv in the current directory with python3 -m venv $(pwd)/venv
  3. Activate the venv by running source venv/bin/activate
  4. Run pip install -r requirements.txt
  5. Run ./example_plugin.py -f example.yaml

This should result in the following placeholder result being printed:

output_id: success
output_data:
  message: Hello, Arca Lot!

Generating a JSON schema file

Arcaflow plugins can generate their own JSON schema for both the input and the output schema. You can run the schema generation by calling:

./example_plugin.py --json-schema input
./example_plugin.py --json-schema output

If your plugin defines more than one step, you may need to pass the --step parameter.

Note: The Arcaflow schema system supports a few features that cannot be represented in JSON schema. The generated schema is for editor integration only.

Generating documentation

  1. Check out this repository
  2. Create a venv in the current directory with python3 -m venv $(pwd)/venv
  3. Activate the venv by running source venv/bin/activate
  4. Run pip install -r requirements.txt
  5. Run pip install sphinx
  6. Run pip install sphinx-rtd-theme
  7. Run sphinx-apidoc -o docs/ -f -a -e src/ --doc-project "Python SDK for Arcaflow"
  8. Run make -C docs html

Developing your plugin

We have a detailed guide on developing Python plugins on the Arcalot website.

arcaflow-plugin-sdk-python's People

Contributors

dependabot[bot], dustinblack, janosdebugs, jaredoconnell, jdowni000, mfleader, platform-engineering-bot, portante, sandrobonazzola, webbnh


arcaflow-plugin-sdk-python's Issues

TypeError: unsupported operand type(s) for 'in': 'str' and 'EnumMeta'

Describe the bug

When building a schema from an enum, the serialize function results in an error, as shown in this fio carpenter build.
It looks like the build worked previously, with 0.4.2 being called from here
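The underlying Python behavior can be reproduced without the SDK: on Python versions before 3.12, an `in` check of a plain value against an Enum class raises exactly this TypeError, so membership has to be tested against the member values (a sketch; `Color` is illustrative):

```python
import enum


class Color(enum.Enum):
    RED = "red"
    BLUE = "blue"


def is_valid_value(value, enum_cls):
    # `value in enum_cls` raises
    # TypeError: unsupported operand type(s) for 'in': 'str' and 'EnumMeta'
    # on Python < 3.12 when `value` is not an Enum member, so compare
    # against the member values instead.
    return value in (member.value for member in enum_cls)
```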

To reproduce

Requirements

  • carpenter binary
  • fio project
docker run \
    --rm \
    -e=IMAGE_TAG="0.0.1"\
    --volume /var/run/docker.sock:/var/run/docker.sock:z \
    --volume $PWD/../arcaflow-plugin-fio:/github/workspace:z \
    carpenter-img build --build

Parent Class Support

Please describe what you would like to see in this project

The SDK does not support parent classes: the properties are present on the objects, but the annotations inherited from the parent are missing.
The SDK should read those annotations as if they were declared in the child class.

Please describe your use case

This is useful when SDK input or output classes inherit from a shared base class.
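One way to collect annotations from parent classes is to walk the MRO, which is roughly what the fix needs to do (a sketch, not the SDK's actual code):

```python
from dataclasses import dataclass


def all_annotations(cls) -> dict:
    # Walk the MRO from base to child so that child annotations with the
    # same name override the parent's. __dict__ is used (rather than
    # getattr) so each class contributes only its own annotations.
    result = {}
    for base in reversed(cls.__mro__):
        result.update(base.__dict__.get("__annotations__", {}))
    return result


@dataclass
class Base:
    id: str


@dataclass
class Child(Base):
    count: int
```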

HTTP API

Please describe what you would like to see in this project

Allow an HTTP microservice to run from a plugin, with OpenAPI support.

Clearer error message for unsupported types

Concise and helpful error messages for unsupported types would be useful. For example, I tried using the boolean type (before booleans were supported) and got the following error message, which is too long and doesn't point at the actual issue:

Traceback (most recent call last):
  File "/usr/lib/python3.9/dataclasses.py", line 1033, in fields
    fields = getattr(class_or_instance, _FIELDS)
AttributeError: type object 'bool' has no attribute '__dataclass_fields__'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/sanantha/arcaflow_tets/venv/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 214, in _resolve_class
    fields_list = fields(t)
  File "/usr/lib/python3.9/dataclasses.py", line 1035, in fields
    raise TypeError('must be called with a dataclass type or instance')
TypeError: must be called with a dataclass type or instance

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/sanantha/fix_krkn/krkn/kraken/plugins/vmware/test_vmware_plugin.py", line 13, in <module>
    from kraken.plugins.vmware import vmware_plugin 
  File "/home/sanantha/fix_krkn/krkn/kraken/plugins/__init__.py", line 8, in <module>
    import kraken.plugins.vmware.vmware_plugin as vmware_plugin
  File "/home/sanantha/fix_krkn/krkn/kraken/plugins/vmware/vmware_plugin.py", line 441, in <module>
    def node_start(cfg: NodeScenarioConfig) -> typing.Tuple[str, typing.Union[NodeScenarioSuccessOutput, NodeScenarioErrorOutput]]:
  File "/home/sanantha/arcaflow_tets/venv/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 71, in step_decorator
    input=build_object_schema(input_param.annotation),
  File "/home/sanantha/arcaflow_tets/venv/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 393, in build_object_schema
    r = _Resolver.resolve(t)
  File "/home/sanantha/arcaflow_tets/venv/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 108, in resolve
    return cls._resolve_abstract_type(t, tuple(path))
  File "/home/sanantha/arcaflow_tets/venv/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 112, in _resolve_abstract_type
    result = cls._resolve(t, path)
  File "/home/sanantha/arcaflow_tets/venv/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 137, in _resolve
    return cls._resolve_type(t, path)
  File "/home/sanantha/arcaflow_tets/venv/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 171, in _resolve_type
    return _Resolver._resolve_class(t, path)
  File "/home/sanantha/arcaflow_tets/venv/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 224, in _resolve_class
    name, final_field = cls._resolve_dataclass_field(f, tuple(new_path))
  File "/home/sanantha/arcaflow_tets/venv/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 184, in _resolve_dataclass_field
    underlying_type = cls._resolve_field(t.type, path)
  File "/home/sanantha/arcaflow_tets/venv/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 127, in _resolve_field
    result = cls._resolve(t, path)
  File "/home/sanantha/arcaflow_tets/venv/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 151, in _resolve
    return cls._resolve_union(t, path)
  File "/home/sanantha/arcaflow_tets/venv/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 375, in _resolve_union
    result = cls._resolve_field(args[0], tuple(path))
  File "/home/sanantha/arcaflow_tets/venv/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 127, in _resolve_field
    result = cls._resolve(t, path)
  File "/home/sanantha/arcaflow_tets/venv/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 137, in _resolve
    return cls._resolve_type(t, path)
  File "/home/sanantha/arcaflow_tets/venv/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 171, in _resolve_type
    return _Resolver._resolve_class(t, path)
  File "/home/sanantha/arcaflow_tets/venv/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 216, in _resolve_class
    raise SchemaBuildException(
arcaflow_plugin_sdk.plugin.SchemaBuildException: Invalid schema definition for NodeScenarioConfig -> skip_openshift_checks: The passed class is not a dataclass. Please use the @dataclasses.dataclass decorator on your class.

Logging support

Plugins should support creating logs that are output in debug mode.

http circular dependencies crash doctest execution

Describe the bug

Running doctest on the project:

$ for i in  `find . -name "*.py"|grep -v venv`; do python -m coverage run -a -m doctest "$i"; done
Traceback (most recent call last):
  File "/usr/lib64/python3.9/doctest.py", line 2793, in <module>
    sys.exit(_test())
  File "/usr/lib64/python3.9/doctest.py", line 2781, in _test
    m = __import__(filename[:-3])
  File "~/arcaflow-plugin-sdk-python/./src/arcaflow_plugin_sdk/http.py", line 3, in <module>
    from twisted.web import server, resource
  File "~/arcaflow-plugin-sdk-python/venv/lib64/python3.9/site-packages/twisted/web/server.py", line 31, in <module>
    from twisted.internet import address, interfaces
  File "~/arcaflow-plugin-sdk-python/venv/lib64/python3.9/site-packages/twisted/internet/address.py", line 18, in <module>
    from twisted.internet.interfaces import IAddress
  File "~/arcaflow-plugin-sdk-python/venv/lib64/python3.9/site-packages/twisted/internet/interfaces.py", line 26, in <module>
    from twisted.python.failure import Failure
  File "~/arcaflow-plugin-sdk-python/venv/lib64/python3.9/site-packages/twisted/python/failure.py", line 26, in <module>
    from twisted.python import reflect
  File "~/arcaflow-plugin-sdk-python/venv/lib64/python3.9/site-packages/twisted/python/reflect.py", line 22, in <module>
    from twisted.python.compat import nativeString
  File "~/arcaflow-plugin-sdk-python/venv/lib64/python3.9/site-packages/twisted/python/compat.py", line 35, in <module>
    from http import cookiejar as cookielib
ImportError: cannot import name 'cookiejar' from partially initialized module 'http' (most likely due to a circular import) (~/arcaflow-plugin-sdk-python/./src/arcaflow_plugin_sdk/http.py)

Test coverage results from the above run:

$ python -m coverage report
Name                                         Stmts   Miss  Cover
----------------------------------------------------------------
docs/conf.py                                    13      0   100%
example_plugin.py                               30      4    87%
src/arcaflow_plugin_sdk/__init__.py              0      0   100%
src/arcaflow_plugin_sdk/annotations.py           3      0   100%
src/arcaflow_plugin_sdk/http.py                 85     83     2%
src/arcaflow_plugin_sdk/jsonschema.py           17     14    18%
src/arcaflow_plugin_sdk/plugin.py              154    122    21%
src/arcaflow_plugin_sdk/schema.py             1731    873    50%
src/arcaflow_plugin_sdk/serialization.py        44     29    34%
src/arcaflow_plugin_sdk/test_jsonschema.py      29     20    31%
src/arcaflow_plugin_sdk/test_plugin.py          33     16    52%
src/arcaflow_plugin_sdk/test_schema.py         722    621    14%
src/arcaflow_plugin_sdk/validation.py            7      0   100%
test_example_plugin.py                          17      8    53%
----------------------------------------------------------------
TOTAL                                         2885   1790    38%

To reproduce

Run doctest on the project

Additional context

$ pip list
Package                       Version
----------------------------- ---------
alabaster                     0.7.12
appdirs                       1.4.4
arcaflow-plugin-sdk           0.0.0
astroid                       2.12.7
attrs                         22.1.0
Automat                       20.2.0
Babel                         2.10.3
bandit                        1.7.4
certifi                       2022.6.15
charset-normalizer            2.1.1
constantly                    15.1.0
coverage                      6.4.4
dill                          0.3.5.1
docutils                      0.17.1
esbonio                       0.14.0
flake8                        5.0.4
gitdb                         4.0.9
GitPython                     3.1.27
hyperlink                     21.0.0
idna                          3.3
imagesize                     1.4.1
importlib-metadata            4.12.0
incremental                   21.3.0
isort                         5.10.1
Jinja2                        3.1.2
lazy-object-proxy             1.7.1
MarkupSafe                    2.1.1
mccabe                        0.7.0
packaging                     21.3
pbr                           5.10.0
pip                           20.2.4
platformdirs                  2.5.2
pycodestyle                   2.9.1
pydantic                      1.9.2
pyflakes                      2.5.0
pygls                         0.12.1
Pygments                      2.13.0
pylint                        2.15.0
pyparsing                     3.0.9
pyspellchecker                0.7.0
pytz                          2022.2.1
PyYAML                        6.0
requests                      2.28.1
setuptools                    50.3.2
six                           1.16.0
smmap                         5.0.0
snowballstemmer               2.2.0
Sphinx                        5.1.1
sphinx-rtd-theme              1.0.0
sphinxcontrib-applehelp       1.0.2
sphinxcontrib-devhelp         1.0.2
sphinxcontrib-htmlhelp        2.0.0
sphinxcontrib-jsmath          1.0.1
sphinxcontrib-qthelp          1.0.3
sphinxcontrib-serializinghtml 1.1.5
stevedore                     4.0.0
tomli                         2.0.1
tomlkit                       0.11.4
Twisted                       22.4.0
typeguard                     2.13.3
typing-extensions             4.3.0
urllib3                       1.26.12
wrapt                         1.14.1
zipp                          3.8.1
zope.interface                5.4.0

Type support: circular references

To accurately describe the schema itself, we need to support self-references:

@dataclass
class A:
    b: B

@dataclass
class B:
    a: A
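With string forward references, Python can already express a (breakable) version of this shape; the schema resolver would then need to resolve the strings into real classes, e.g. via `typing.get_type_hints` (a sketch; `Optional` is used here so the instances themselves remain constructible):

```python
import typing
from dataclasses import dataclass


@dataclass
class A:
    b: typing.Optional["B"] = None  # forward reference as a string


@dataclass
class B:
    a: typing.Optional["A"] = None


# get_type_hints() resolves the string forward references into the real
# classes, which is what the schema resolver would need to do.
hints = typing.get_type_hints(A, globalns=globals())
```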

Redundant data in circular imports

Currently, in #34 there are redundant IDs that are being used both as dict keys and as dataclass members. We should eliminate one of them to avoid bugs.

Better output when working with lists and annotations

Please describe what you would like to see in this project

The error's cause is unclear in this situation:

(venv) [jaredoconnell@fedora uperf]$ python3 uperf_plugin.py -s uperf --json-schema input > input/uperf.input.schema.json
Traceback (most recent call last):
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 271, in _resolve_class
    return schema.ObjectType(
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/schema.py", line 1139, in __init__
    attribute_annotations = cls_dict["__annotations__"]
KeyError: '__annotations__'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 389, in _resolve_list_annotation
    cls._resolve_abstract_type(args[0], tuple(new_path))
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 117, in _resolve_abstract_type
    result = cls._resolve(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 162, in _resolve
    return cls._resolve_annotated(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 345, in _resolve_annotated
    underlying_t = cls._resolve(args[0], path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 160, in _resolve
    return cls._resolve_union(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 461, in _resolve_union
    f = cls._resolve_field(args[i], tuple(new_path))
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 132, in _resolve_field
    result = cls._resolve(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 162, in _resolve
    return cls._resolve_annotated(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 345, in _resolve_annotated
    underlying_t = cls._resolve(args[0], path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 142, in _resolve
    return cls._resolve_type(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 186, in _resolve_type
    return _Resolver._resolve_class(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 276, in _resolve_class
    raise SchemaBuildException(path, "Failed to create object type: {}".format(e.__str__())) from e
arcaflow_plugin_sdk.plugin.SchemaBuildException: Invalid schema definition for UPerfParams -> profile -> groups -> items -> transactions -> items -> flow_ops -> items -> typing.Annotated -> typing.Union -> 0 -> typing.Annotated: Failed to create object type: '__annotations__'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 389, in _resolve_list_annotation
    cls._resolve_abstract_type(args[0], tuple(new_path))
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 117, in _resolve_abstract_type
    result = cls._resolve(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 142, in _resolve
    return cls._resolve_type(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 186, in _resolve_type
    return _Resolver._resolve_class(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 267, in _resolve_class
    name, final_field = cls._resolve_dataclass_field(f, tuple(new_path))
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 199, in _resolve_dataclass_field
    underlying_type = cls._resolve_field(t.type, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 132, in _resolve_field
    result = cls._resolve(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 156, in _resolve
    return cls._resolve_list_annotation(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 392, in _resolve_list_annotation
    raise SchemaBuildException(path, "Failed to create list type") from e
arcaflow_plugin_sdk.plugin.SchemaBuildException: Invalid schema definition for UPerfParams -> profile -> groups -> items -> transactions -> items -> flow_ops: Failed to create list type

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 389, in _resolve_list_annotation
    cls._resolve_abstract_type(args[0], tuple(new_path))
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 117, in _resolve_abstract_type
    result = cls._resolve(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 142, in _resolve
    return cls._resolve_type(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 186, in _resolve_type
    return _Resolver._resolve_class(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 267, in _resolve_class
    name, final_field = cls._resolve_dataclass_field(f, tuple(new_path))
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 199, in _resolve_dataclass_field
    underlying_type = cls._resolve_field(t.type, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 132, in _resolve_field
    result = cls._resolve(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 156, in _resolve
    return cls._resolve_list_annotation(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 392, in _resolve_list_annotation
    raise SchemaBuildException(path, "Failed to create list type") from e
arcaflow_plugin_sdk.plugin.SchemaBuildException: Invalid schema definition for UPerfParams -> profile -> groups -> items -> transactions: Failed to create list type

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/jaredoconnell/Documents/projects/arcalot/arcaflow-plugins/python/uperf/uperf_plugin.py", line 300, in <module>
    def run_uperf(params: UPerfParams) -> typing.Tuple[str, typing.Union[UPerfResults, UPerfError]]:
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 76, in step_decorator
    input=build_object_schema(input_param.annotation, True),
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 525, in build_object_schema
    r = _Resolver.resolve(t)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 113, in resolve
    return cls._resolve_abstract_type(t, tuple(path))
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 117, in _resolve_abstract_type
    result = cls._resolve(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 142, in _resolve
    return cls._resolve_type(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 186, in _resolve_type
    return _Resolver._resolve_class(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 267, in _resolve_class
    name, final_field = cls._resolve_dataclass_field(f, tuple(new_path))
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 199, in _resolve_dataclass_field
    underlying_type = cls._resolve_field(t.type, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 132, in _resolve_field
    result = cls._resolve(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 142, in _resolve
    return cls._resolve_type(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 186, in _resolve_type
    return _Resolver._resolve_class(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 267, in _resolve_class
    name, final_field = cls._resolve_dataclass_field(f, tuple(new_path))
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 199, in _resolve_dataclass_field
    underlying_type = cls._resolve_field(t.type, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 132, in _resolve_field
    result = cls._resolve(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 156, in _resolve
    return cls._resolve_list_annotation(t, path)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 392, in _resolve_list_annotation
    raise SchemaBuildException(path, "Failed to create list type") from e
arcaflow_plugin_sdk.plugin.SchemaBuildException: Invalid schema definition for UPerfParams -> profile -> groups: Failed to create list type

Relevant parts of the code:

@dataclass
class ProfileTransaction():
    flow_ops: typing.List[
        typing.Annotated[
            typing.Union[
                typing.Annotated[ProfileFlowOpConnect, annotations.discriminator_value("connect")],
                typing.Annotated[ProfileFlowOpAccept,  annotations.discriminator_value("accept")],
                typing.Annotated[ProfileFlowOpDisconnect, annotations.discriminator_value("disconnect")],
                typing.Annotated[ProfileFlowOpRead, annotations.discriminator_value("read")],
                typing.Annotated[ProfileFlowOpWrite, annotations.discriminator_value("write")],
                typing.Annotated[ProfileFlowOpRecv, annotations.discriminator_value("recv")],
                typing.Annotated[ProfileFlowOpSendto, annotations.discriminator_value("sendto")],
                typing.Annotated[ProfileFlowOpSendFile, annotations.discriminator_value("sendfile")],
                typing.Annotated[ProfileFlowOpSendFilev, annotations.discriminator_value("sendfilev")],
                typing.Annotated[ProfileFlowOpNOP, annotations.discriminator_value("NOP")],
                typing.Annotated[ProfileFlowOpThink, annotations.discriminator_value("think")]
            ],
            annotations.discriminator("flowop")
        ]
    ]
    iterations: typing.Optional[int]
    duration: typing.Optional[str] # TODO: Switch to the new time unit once it's added
    rate: typing.Optional[int] # Unit: Default uperf unit

@dataclass
class ProfileGroup():
    transactions: typing.List[ProfileTransaction]
    nthreads: int

@dataclass
class Profile():
    name: str
    groups: typing.List[ProfileGroup]

@dataclass
class UPerfParams:
    """
    This is the data structure for the input parameters of the step defined below.
    """
    server_addr: str
    profile: Profile
    protocol: IProtocol = IProtocol.TCP
    run_duration_ms: int = 5000

It's possible that the profile flow op values caused it, but I don't want to include too much.

Unserializing an empty dict returns an unclear error message

Describe the bug

If an empty dict is handed to the unserialize method, especially with a required discriminator field, the resulting error message isn't helpful.

To reproduce

Use the stressng plugin at https://github.com/mkarg75/arca-stressng and remove either the cpu or the vm part of the config file and run it.

The resulting error message concludes with:

arcaflow_plugin_sdk.schema.ConstraintException: Validation failed for 'stressor': This field is required

even though the default for the corresponding class is set to None.
We clearly need an if-else to determine what to return, but the error message could also be clearer.

Background task support

We should support plugins that execute tasks in the background and stop them later. This is useful in cases like running uperf, where a server needs to be started, then other tasks need to be executed, and then cleaned up later.

This could be achieved by giving the plugin the option to declare a "cleanup" function in the @plugin.step declaration, which would be passed the output ID and output data from the step itself. The cleanup step has no option to return any data itself.

Standalone operation

When running a plugin as a standalone, the starting step would be run, and the cleanup step would be run immediately afterwards. Alternatively, a command-line flag could be passed to keep the service running for a specified amount of time, or indefinitely. If a SIGINT or SIGTERM is received, the cleanup function should be called and the plugin stopped gracefully.
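In standalone mode this would behave roughly like the following driver (hypothetical; neither `run_standalone` nor the `cleanup` parameter exists in the SDK yet):

```python
def run_standalone(step, cleanup, params):
    # Run the step, then immediately run the declared cleanup function,
    # passing it the output ID and output data from the step.
    output_id, output_data = step(params)
    cleanup(output_id, output_data)
    return output_id, output_data


# Plain functions stand in for a plugin step and its cleanup:
calls = []
result = run_standalone(
    lambda params: ("success", {"pid": 42}),
    lambda output_id, output_data: calls.append((output_id, output_data)),
    {},
)
```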

Engine operation (ATP)

When a plugin is executed in the engine, the cleanup steps are executed at the end of the current workflow, in the reverse order in which they were started. After executing the step, the SDK needs to wait until it receives a signal. When a SIGINT or SIGTERM is received, the SDK needs to call the cleanup function to let the plugin clean up any background tasks that are running.

Real time debug logs

Please describe what you would like to see in this project

Display real-time logs output from a plugin when the --debug flag is set.

Please describe your use case

This is needed for plugins with a long run time, such as the kube-burner and web-burner plugins.

Loading a non-existent file creates a useless error message

When running the example plugin with a non-existent file, the stack trace is less than helpful:

Traceback (most recent call last):
  File "/home/janosdebugs/wolkenwalze/wolkenwalze-python-sdk/venv/lib/python3.9/site-packages/arcaflow_plugin_sdk/serialization.py", line 21, in load_from_file
    with open(file_name) as f:
FileNotFoundError: [Errno 2] No such file or directory: 'test.yaml'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/janosdebugs/wolkenwalze/wolkenwalze-python-sdk/venv/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 560, in run
    return _execute_file(step_id, s, options, stdout, stderr)
  File "/home/janosdebugs/wolkenwalze/wolkenwalze-python-sdk/venv/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 594, in _execute_file
    data = serialization.load_from_file(filename)
  File "/home/janosdebugs/wolkenwalze/wolkenwalze-python-sdk/venv/lib/python3.9/site-packages/arcaflow_plugin_sdk/serialization.py", line 24, in load_from_file
    raise LoadFromFileException("Failed to load YAML from {}: {}".format(file_name, e.__str__())) from e
arcaflow_plugin_sdk.serialization.LoadFromFileException: <exception str() failed>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/janosdebugs/wolkenwalze/wolkenwalze-python-sdk/./example_plugin.py", line 91, in <module>
    sys.exit(plugin.run(plugin.build_schema(
  File "/home/janosdebugs/wolkenwalze/wolkenwalze-python-sdk/venv/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 569, in run
    stderr.write(e.msg + '\n')
AttributeError: 'LoadFromFileException' object has no attribute 'msg'

Type support: timestamps

The type system should support timestamps of a specific time, stored at nanosecond scale in an int64. It should also be able to decode a string with an RFC 3339 time expression.
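A minimal sketch of the decoding side. Note that Python's datetime carries only microsecond precision, so sub-microsecond digits would be truncated in this approach:

```python
from datetime import datetime, timezone


def rfc3339_to_unix_nanos(value: str) -> int:
    """Decode an RFC 3339 string to Unix time at nanosecond scale
    (fits in an int64). Sketch only: precision below microseconds
    is lost because datetime cannot represent it."""
    # Python < 3.11 does not accept a trailing 'Z' in fromisoformat().
    dt = datetime.fromisoformat(value.replace("Z", "+00:00"))
    if dt.tzinfo is None:
        dt = dt.replace(tzinfo=timezone.utc)
    return int(dt.timestamp()) * 1_000_000_000 + dt.microsecond * 1_000
```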

API / transport protocol

We want a transport protocol that allows for 1) querying the schema of a plugin, 2) running a step with a specified input, and 3) querying the result of a plugin run.

Requirements

  • The protocol must function over standard input/output.
  • The protocol must not assume that there will always be a party listening to output. The engine will reconnect when it is time for it to fetch the results, but may disconnect intermittently if the API becomes unavailable.
  • The protocol must be strictly typed.
  • The protocol must be able to transport mixed data types and convey null values.

Options

Protobuf

Protobuf is the underlying protocol of gRPC. However, there is very little documentation on how to create custom RPC handlers, which makes it difficult to use. It is also not self-delimiting, so we would need to add some sort of framing for messages.
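
The framing could be as simple as a length prefix on each message; a minimal sketch:

```python
import struct
from typing import BinaryIO


def write_frame(stream: BinaryIO, payload: bytes) -> None:
    """Prefix each message with a 4-byte big-endian length so a
    non-self-delimiting encoding (such as a serialized Protobuf
    message) can be framed over stdin/stdout."""
    stream.write(struct.pack(">I", len(payload)))
    stream.write(payload)


def read_frame(stream: BinaryIO) -> bytes:
    """Read one length-prefixed frame back from the stream."""
    header = stream.read(4)
    if len(header) < 4:
        raise EOFError("stream closed mid-frame")
    (length,) = struct.unpack(">I", header)
    return stream.read(length)
```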

CBOR

CBOR has the advantage of being self-describing and self-delimiting. As a tradeoff, it is less well known and less compact than Protobuf.

Allow for ignoring fields on objects

We should have an option to selectively ignore unknown fields, applied to manual unserialization only. This is useful when unserializing from a data source that has many unneeded fields that we do not want to expose as an output.

The best way to implement this would be to add an extra parameter to the unserialize method of AbstractType, allowing callers to specify fields that should be skipped. The simpler option is to make this a boolean that ignores all unknown fields.
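
A sketch of the boolean variant. The class and exception names here are illustrative stand-ins, not the SDK's actual API:

```python
# Illustrative sketch of the proposed behavior; not the SDK's real classes.
class UnserializationError(Exception):
    pass


class ObjectUnserializer:
    def __init__(self, fields):
        self.fields = set(fields)

    def unserialize(self, data: dict, ignore_unknown: bool = False) -> dict:
        """Copy known fields; unknown keys either raise or are skipped,
        depending on the proposed boolean flag."""
        result = {}
        for key, value in data.items():
            if key not in self.fields:
                if ignore_unknown:
                    continue
                raise UnserializationError(f"Unknown field: {key}")
            result[key] = value
        return result
```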

dict (not Dict) unsupported

Right now, using a lowercase 'dict' is unsupported because it is not a dataclass. This is not obvious to users, so we should either make it work or detect the use of dict and suggest typing.Dict instead.
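
The detection could be a small check during schema building; a sketch (the function name is illustrative):

```python
import typing


def check_map_type(t) -> None:
    """Reject a bare `dict` annotation with a helpful hint instead of an
    opaque 'not a dataclass' failure (sketch of the proposed detection)."""
    if t is dict:
        raise TypeError(
            "Bare 'dict' is not supported; use typing.Dict[key_type, value_type] "
            "so the schema can be typed."
        )
    if typing.get_origin(t) is dict and not typing.get_args(t):
        raise TypeError("typing.Dict needs key and value type arguments.")
```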

Type support: generic types

Please describe what you would like to see in this project

Some tasks require a step to be able to accept any, or almost any input. We need a way to support generic types without sacrificing type checking.

Possible solutions for inputs:

  • A step may accept any input, disabling type checking in the process.
  • The user must manually declare the input and output for steps that can deal with any kind of data
  • The step has a two-stage process where first you must declare some parameters statically in the workflow, and the step then forms a schema
  • The step is given an opportunity to declare whether it can accept the given input schema
  • ?

Possible solutions for outputs:

  • A user must manually declare the output schema
  • The step may provide its schema based on the database schema
  • ?

The "any" type should not allow lists with inconsistent types

Please describe what you would like to see in this project

The existing implementation accepts lists with a different data type per list item. This is broadly incompatible with external use cases and generally bad practice. In the current state, an error is not exposed until a plugin attempts to use the data in an incompatible way, so errors are reported only to stderr, which we don't control directly. If the schema did not allow mixed types, the SDK would catch the error and we could give the user better feedback.

Please describe your use case

Given a list like this:

list:
  - 1
  - 2
  - string
  - 3

I may end up in an error condition with something like Elasticsearch, which will set the list type to an integer based on the first value and then choke on the string value. I would prefer this error to be caught and handled by the SDK directly.
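
The check the SDK could perform is straightforward; a sketch (ConstraintException here is a stand-in for the SDK's exception type):

```python
class ConstraintException(Exception):
    """Stand-in for the SDK's ConstraintException (assumption for the sketch)."""


def validate_homogeneous_list(items: list) -> None:
    """Reject lists whose items mix types, so the error surfaces in the
    SDK rather than in a downstream consumer such as Elasticsearch."""
    types = {type(item) for item in items}
    if len(types) > 1:
        raise ConstraintException(
            "List items must share one type, found: "
            f"{sorted(t.__name__ for t in types)}"
        )
```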

Conditional Requirements

Please describe what you would like to see in this project

It would be useful in some situations to have annotations that mark a field as required only if another field matches a specific value or set of values. This would, for example, allow requiring a timeout value when wait is set to true.

Please describe your use case

I have a wait field with a matching timeout field. The timeout does nothing when wait is false, so defining this relationship in the typing would be beneficial.
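
A sketch of the proposed semantics, in which the dependent field is only mandatory when the triggering field is actually set (present and not None); all names here are illustrative:

```python
def check_required_if(data: dict, field: str, required_if: str) -> None:
    """Proposed conditional-requirement check: `field` is mandatory only
    when `required_if` is actually set in the input (sketch, not the
    SDK's current implementation)."""
    trigger_set = data.get(required_if) is not None
    field_set = data.get(field) is not None
    if trigger_set and not field_set:
        raise ValueError(
            f"'{field}' is required because '{required_if}' is set"
        )
```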

Ease SDK/dependency updates

Please describe what you would like to see in this project

Currently, SDK dependencies, or the SDK itself, cause a lot of update churn in plugin repositories. We need an easier way to update plugin repositories and to trigger their rebuilds when the SDK or its dependencies change.

`required_if` is triggering even if the listed required field is not set

Describe the bug

Using schema.required_if in a schema object causes the SDK to report that the item is required because the other item is set, even when that other item is not actually set.

To reproduce

Set one item in an input schema to required_if for another item. Set both items as typing.Optional with default values of None. Do not set either option in the input, and then run the plugin. It reports:

$ python3.9 arcaflow_plugin_iperf3/iperf3_plugin.py -s client --debug -f configs/iperf3-client-example.yaml 
Traceback (most recent call last):
  File "/home/dblack/git/arcaflow-plugin-iperf3/arcaflow_plugin_iperf3/iperf3_plugin.py", line 138, in <module>
    plugin.run(
  File "/home/dblack/git/arcaflow-plugin-iperf3/venv/lib64/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 216, in run
    return _execute_file(step_id, s, options, stdin, stdout, stderr)
  File "/home/dblack/git/arcaflow-plugin-iperf3/venv/lib64/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 311, in _execute_file
    output_id, output_data = s(step_id, data)
  File "/home/dblack/git/arcaflow-plugin-iperf3/venv/lib64/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5622, in __call__
    output_id, output_data = self._call_step(
  File "/home/dblack/git/arcaflow-plugin-iperf3/venv/lib64/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5578, in _call_step
    return step(
  File "/home/dblack/git/arcaflow-plugin-iperf3/venv/lib64/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5509, in __call__
    result = self._handler(params)
  File "/home/dblack/git/arcaflow-plugin-iperf3/arcaflow_plugin_iperf3/iperf3_plugin.py", line 114, in iperf3_client
    input_params = client_input_params_schema.serialize(params)
  File "/home/dblack/git/arcaflow-plugin-iperf3/venv/lib64/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5315, in serialize
    return root_object.serialize(data, tuple(new_path))
  File "/home/dblack/git/arcaflow-plugin-iperf3/venv/lib64/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 4987, in serialize
    new_path, value = self._validate_property(data, path, field_id, property_id)
  File "/home/dblack/git/arcaflow-plugin-iperf3/venv/lib64/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5002, in _validate_property
    self._validate_not_set(data, property_field, tuple(new_path))
  File "/home/dblack/git/arcaflow-plugin-iperf3/venv/lib64/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5014, in _validate_not_set
    raise ConstraintException(
arcaflow_plugin_sdk.schema.ConstraintException: Validation failed for 'ClientInputParams -> udp': This field is required because 'udp_counters_64bit' is set

Schema snip:

...
    udp: typing.Annotated[
        typing.Optional[bool],
        schema.name("use UDP protocol"),
        schema.conflicts("sctp"),
        schema.required_if("udp_counters_64bit"),
        schema.description("use the UDP protocol for network traffic"),
    ] = None
    udp_counters_64bit: typing.Annotated[
        typing.Optional[bool],
        schema.id("udp-counters-64bit"),
        schema.name("UDP 64-bit counters"),
        schema.description("use 64-bit counters in UDP test packets"),
    ] = None
...

Input yaml:

port: 50000
interval: 10
time: 5

Additional context

See: https://github.com/arcalot/arcaflow-plugin-iperf3/blob/ffbc127e010b9c731fd5eb2967d634867f609975/arcaflow_plugin_iperf3/iperf3_schema.py#L191-L193

Unsafe dictionary key unsupported

Describe the bug

I cannot unpack a dictionary with an unsafe key, even though its corresponding safe key has been annotated with a metadata id that matches the unsafe key.

To reproduce

Create the Python environment for my fio_plugin.py and execute it with the Arcaflow plugin SDK and this fio YAML configuration file to generate the traceback seen in Additional context.

poisson-rate-submission.yaml

name: poisson-rate-submit
size: 1m
readwrite: randread
ioengine: libaio
iodepth: 32
io_submit_mode: offload
rate_iops: 50
rate_process: poisson
direct: 1

fio_plugin.py (excerpt; the imports below are added so the snippet is self-contained):

import dataclasses
import json
import subprocess
import sys
import typing
from dataclasses import dataclass, field
from typing import Optional, Union

from arcaflow_plugin_sdk import plugin

@dataclass
class FioParams:
    name: str
    size: str
    ioengine: str
    iodepth: int 
    io_submit_mode: str
    rate_iops: int
    rate_process: Optional[str] = 'linear'
    direct: Optional[int] = 0
    readwrite: Optional[str] = 'read'

@dataclass
class FioErrorOutput:
    error: str

@dataclass
class FioSuccessOutput:
    fio_version: str = field(metadata={
        "id": "fio version",
        "name": "fio version"
    })

fio_input_schema = plugin.build_object_schema(FioParams)
fio_output_schema = plugin.build_object_schema(FioSuccessOutput)

@plugin.step(
    id="workload",
    name="fio workload",
    description="run an fio workload",
    outputs={"success": FioSuccessOutput, "error": FioErrorOutput}
)
def run(params: FioParams) -> typing.Tuple[str, Union[FioSuccessOutput, FioErrorOutput]]:
    outfile_name = 'fio-plus'
    cmd = [
        'fio',
        *[
            f"--{key}={value}" for key, value in dataclasses.asdict(params).items()
        ],
        '--output-format=json+',
        f"--output=tmp/{outfile_name}.json"
    ]

    try:
        subprocess.check_output(
            cmd
        )
    except subprocess.CalledProcessError as error:
        return 'error', FioErrorOutput('oops')

    with open(f'tmp/{outfile_name}.json', 'r') as output_file:
        fio_results = output_file.read()

    fio_json = json.loads(fio_results)
    output = FioSuccessOutput(**fio_json)
    return 'success', output

if __name__ == '__main__':
    sys.exit(
        plugin.run(
            plugin.build_schema(
                run
            )
        )
    )

Example output from fio. I've limited it to the part that seems relevant. Executing the code will create a much larger data structure.
fio-output.json

{
  "fio version" : "fio-3.29"
}

Additional context

โฏ python fio_plugin.py -f etc-fio/poisson-rate-submission.yaml
Traceback (most recent call last):
  File "/home/mleader/workspace/arcaflow/arca-fio/fio_plugin.py", line 348, in <module>
    plugin.run(
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-7ELhrWBI-py3.9/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 494, in run
    return _execute_file(step_id, s, options, stdout, stderr)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-7ELhrWBI-py3.9/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 540, in _execute_file
    output_id, output_data = s(step_id, data)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-7ELhrWBI-py3.9/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 900, in __call__
    output_id, output_data = self._call_step(
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-7ELhrWBI-py3.9/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 860, in _call_step
    return step(
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-7ELhrWBI-py3.9/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 770, in __call__
    output_id, output_data = self.handler(params)
  File "/home/mleader/workspace/arcaflow/arca-fio/fio_plugin.py", line 341, in run
    output = FioSuccessOutput(**fio_json)
TypeError: __init__() got an unexpected keyword argument 'fio version'

Enhance Clarity in Error Messages that involve Enums

Describe the bug

The error message that I get from building my plugin's object schema with the SDK is verbose and does nothing to facilitate my debugging effort.

To reproduce

Setup

Clone my arcaflow plugin fio bug fix branch.

$ cd arcaflow-plugin-fio
$ git switch bugfix
$ poetry env use $(which python3.9)
$ poetry install
$ poetry shell

Run It

$ python fio_plugin.py -f fixtures/poisson-rate-submission_input.yaml

Traceback (most recent call last):
  File "/home/mleader/workspace/arcaflow/arcaflow-plugin-fio/fio_plugin.py", line 12, in <module>
    from fio_schema import (
  File "/home/mleader/workspace/arcaflow/arcaflow-plugin-fio/fio_schema.py", line 733, in <module>
    fio_input_schema = plugin.build_object_schema(FioJob)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-plugin-8vZa8fhA-py3.9/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 6052, in build_object_schema
    r = _SchemaBuilder.resolve(t, scope)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-plugin-8vZa8fhA-py3.9/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5403, in resolve
    return cls._resolve_abstract_type(t, t, tuple(path), scope)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-plugin-8vZa8fhA-py3.9/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5413, in _resolve_abstract_type
    result = cls._resolve(t, type_hints, path, scope)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-plugin-8vZa8fhA-py3.9/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5452, in _resolve
    return cls._resolve_type(t, type_hints, path, scope)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-plugin-8vZa8fhA-py3.9/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5500, in _resolve_type
    return cls._resolve_class(t, type_hints, path, scope)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-plugin-8vZa8fhA-py3.9/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5663, in _resolve_class
    name, final_field = cls._resolve_dataclass_field(f, type_hints[f.name], tuple(new_path), scope)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-plugin-8vZa8fhA-py3.9/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5534, in _resolve_dataclass_field
    underlying_type = cls._resolve_field(t.type, type_hints, path, scope)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-plugin-8vZa8fhA-py3.9/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5434, in _resolve_field
    result = cls._resolve(t, type_hints, path, scope)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-plugin-8vZa8fhA-py3.9/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5474, in _resolve
    return cls._resolve_annotated(t, type_hints, path, scope)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-plugin-8vZa8fhA-py3.9/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5801, in _resolve_annotated
    underlying_t = cls._resolve(args[0], args_hints[0], path, scope)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-plugin-8vZa8fhA-py3.9/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5452, in _resolve
    return cls._resolve_type(t, type_hints, path, scope)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-plugin-8vZa8fhA-py3.9/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5500, in _resolve_type
    return cls._resolve_class(t, type_hints, path, scope)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-plugin-8vZa8fhA-py3.9/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5663, in _resolve_class
    name, final_field = cls._resolve_dataclass_field(f, type_hints[f.name], tuple(new_path), scope)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-plugin-8vZa8fhA-py3.9/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5593, in _resolve_dataclass_field
    underlying_type.default = json.dumps(underlying_type.type.serialize(default))
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-plugin-8vZa8fhA-py3.9/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 3185, in serialize
    if data not in self._type:
  File "/home/mleader/.pyenv/versions/3.9.13/lib/python3.9/enum.py", line 397, in __contains__
    raise TypeError(
TypeError: unsupported operand type(s) for 'in': 'str' and 'EnumMeta'

ScopeType and ScopeSchema should not allow empty root values

Describe the bug

Currently, ScopeType and ScopeSchema allow an empty root type so that the SchemaBuilder can work. This skews the data model, as a scope without a root object is not a legitimate object. We should restructure the schema builder so that it does not need an empty scope.

Improved Info When Output Mismatching

Please describe what you would like to see in this project

When the plugin returns nothing (None), the error message is:

Traceback (most recent call last):
  File "/home/jaredoconnell/Documents/projects/arcalot/arcaflow-plugins/python/uperf/uperf_plugin.py", line 200, in <module>
    sys.exit(plugin.run(plugin.build_schema(
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 530, in run
    return _execute_file(step_id, s, options, stdout, stderr)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 576, in _execute_file
    output_id, output_data = s(step_id, data)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/schema.py", line 1021, in __call__
    output_id, output_data = self._call_step(
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/schema.py", line 981, in _call_step
    return step(
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/schema.py", line 891, in __call__
    output_id, output_data = self.handler(params)
TypeError: cannot unpack non-iterable NoneType object

However, it does not make it known that the return type is wrong, making it hard to debug.

Ideally, this case would be handled explicitly, with a message such as: Return type of <step> is X, Y expected.
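
A sketch of such an explicit check (the function and parameter names are illustrative, not the SDK's actual API):

```python
def validate_step_result(step_id: str, result, expected_outputs: set):
    """Turn 'cannot unpack non-iterable NoneType object' into an explicit
    message about the step's return type (sketch of the proposal)."""
    if not (isinstance(result, tuple) and len(result) == 2):
        raise TypeError(
            f"Return type of step '{step_id}' is {type(result).__name__}, "
            "expected a (output_id, output_data) tuple"
        )
    output_id, output_data = result
    if output_id not in expected_outputs:
        raise TypeError(
            f"Step '{step_id}' returned unknown output '{output_id}', "
            f"expected one of: {sorted(expected_outputs)}"
        )
    return output_id, output_data
```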

Readme needs updating

Please describe what you would like to see in this project

The readme of this repository still contains information from the very early phases of development and should be updated to point to the Arcalot website.

Enum Serialization Error

Describe the bug

When switching to the main branch, this error occurs:

Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5566, in _resolve_list_annotation
    cls._resolve_abstract_type(args[0], type_hints, tuple(new_path), scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5130, in _resolve_abstract_type
    result = cls._resolve(t, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5189, in _resolve
    return cls._resolve_annotated(t, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5504, in _resolve_annotated
    underlying_t = cls._resolve(args[0], args_hints[0], path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5187, in _resolve
    return cls._resolve_union(t, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5679, in _resolve_union
    f = cls._resolve_field(args[i], arg_hints[i], tuple(new_path), scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5151, in _resolve_field
    result = cls._resolve(t, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5189, in _resolve
    return cls._resolve_annotated(t, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5504, in _resolve_annotated
    underlying_t = cls._resolve(args[0], args_hints[0], path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5167, in _resolve
    return cls._resolve_type(t, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5213, in _resolve_type
    return cls._resolve_class(t, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5366, in _resolve_class
    name, final_field = cls._resolve_dataclass_field(f, type_hints[f.name], tuple(new_path), scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5301, in _resolve_dataclass_field
    underlying_type.default = json.dumps(default)
  File "/usr/lib64/python3.9/json/__init__.py", line 231, in dumps
    return _default_encoder.encode(obj)
  File "/usr/lib64/python3.9/json/encoder.py", line 199, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "/usr/lib64/python3.9/json/encoder.py", line 257, in iterencode
    return _iterencode(o, 0)
  File "/usr/lib64/python3.9/json/encoder.py", line 179, in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type IProtocol is not JSON serializable

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5566, in _resolve_list_annotation
    cls._resolve_abstract_type(args[0], type_hints, tuple(new_path), scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5130, in _resolve_abstract_type
    result = cls._resolve(t, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5167, in _resolve
    return cls._resolve_type(t, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5213, in _resolve_type
    return cls._resolve_class(t, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5366, in _resolve_class
    name, final_field = cls._resolve_dataclass_field(f, type_hints[f.name], tuple(new_path), scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5243, in _resolve_dataclass_field
    underlying_type = cls._resolve_field(t.type, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5151, in _resolve_field
    result = cls._resolve(t, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5183, in _resolve
    return cls._resolve_list_annotation(t, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5569, in _resolve_list_annotation
    raise SchemaBuildException(path, "Failed to create list type") from e
arcaflow_plugin_sdk.schema.SchemaBuildException: Invalid schema definition for Profile -> groups -> items -> transactions -> items -> flowops: Failed to create list type

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5566, in _resolve_list_annotation
    cls._resolve_abstract_type(args[0], type_hints, tuple(new_path), scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5130, in _resolve_abstract_type
    result = cls._resolve(t, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5167, in _resolve
    return cls._resolve_type(t, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5213, in _resolve_type
    return cls._resolve_class(t, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5366, in _resolve_class
    name, final_field = cls._resolve_dataclass_field(f, type_hints[f.name], tuple(new_path), scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5243, in _resolve_dataclass_field
    underlying_type = cls._resolve_field(t.type, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5151, in _resolve_field
    result = cls._resolve(t, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5183, in _resolve
    return cls._resolve_list_annotation(t, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5569, in _resolve_list_annotation
    raise SchemaBuildException(path, "Failed to create list type") from e
arcaflow_plugin_sdk.schema.SchemaBuildException: Invalid schema definition for Profile -> groups -> items -> transactions: Failed to create list type

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/plugin/test_uperf_plugin.py", line 4, in <module>
    import uperf_plugin
  File "/plugin/uperf_plugin.py", line 136, in <module>
    def run_uperf(params: Profile) -> typing.Tuple[str, typing.Union[UPerfResults, UPerfError]]:
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 76, in step_decorator
    input=build_object_schema(input_param.annotation),
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5755, in build_object_schema
    r = _SchemaBuilder.resolve(t, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5120, in resolve
    return cls._resolve_abstract_type(t, t, tuple(path), scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5130, in _resolve_abstract_type
    result = cls._resolve(t, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5167, in _resolve
    return cls._resolve_type(t, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5213, in _resolve_type
    return cls._resolve_class(t, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5366, in _resolve_class
    name, final_field = cls._resolve_dataclass_field(f, type_hints[f.name], tuple(new_path), scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5243, in _resolve_dataclass_field
    underlying_type = cls._resolve_field(t.type, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5151, in _resolve_field
    result = cls._resolve(t, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5183, in _resolve
    return cls._resolve_list_annotation(t, type_hints, path, scope)
  File "/usr/local/lib/python3.9/site-packages/arcaflow_plugin_sdk/schema.py", line 5569, in _resolve_list_annotation
    raise SchemaBuildException(path, "Failed to create list type") from e
arcaflow_plugin_sdk.schema.SchemaBuildException: Invalid schema definition for Profile -> groups: Failed to create list type

To reproduce

class ThinkType(enum.Enum):
    IDLE = "idle"
    BUSY = "busy"

class IProtocol(enum.Enum):
    TCP = "tcp"
    UDP = "udp"
    SSL = "ssl"
    SCTP = "sctp"
    VSOCK = "vsock"

or

@dataclass
class ThinkType(enum.Enum):
    IDLE = "idle"
    BUSY = "busy"

@dataclass
class IProtocol(enum.Enum):
    TCP = "tcp"
    UDP = "udp"
    SSL = "ssl"
    SCTP = "sctp"
    VSOCK = "vsock"

Both cause the above error.

Additional context

This did not happen in version 0.7.0.

Recursion depth exceeded on multiple typing.Annotated parameters

Describe the bug

The plugin throws an error when more than two arguments are passed to a dataclass attribute's typing.Annotated annotation.

To reproduce

Given an active Python environment based on my fio plugin,

@dataclass
class FioParams:
    name: str
    size: str
    ioengine: IoEngine
    iodepth: int
    rate_iops: int
    io_submit_mode: IoSubmitMode
    direct: typing.Annotated[Optional[int], validation.min(0), validation.max(1)] = 0
    atomic: typing.Annotated[Optional[int], validation.min(0), validation.max(1)] = 0
    buffered: typing.Annotated[Optional[int], validation.min(0), validation.max(1)] = 1
    readwrite: Optional[IoPattern] = IoPattern.read.value
    rate_process: Optional[RateProcess] = RateProcess.linear.value
โฏ python fio_plugin.py -f mocks/poisson-rate-submission_input.yaml --debug
Traceback (most recent call last):
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-7ELhrWBI-py3.10/lib/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 320, in _resolve_annotated
    underlying_t = args[i](underlying_t)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-7ELhrWBI-py3.10/lib/python3.10/site-packages/arcaflow_plugin_sdk/validation.py", line 58, in call
    effective_t.max = param
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-7ELhrWBI-py3.10/lib/python3.10/site-packages/arcaflow_plugin_sdk/schema.py", line 460, in max
    self.max = max
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-7ELhrWBI-py3.10/lib/python3.10/site-packages/arcaflow_plugin_sdk/schema.py", line 460, in max
    self.max = max
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-7ELhrWBI-py3.10/lib/python3.10/site-packages/arcaflow_plugin_sdk/schema.py", line 460, in max
    self.max = max
  [Previous line repeated 983 more times]
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-7ELhrWBI-py3.10/lib/python3.10/site-packages/arcaflow_plugin_sdk/schema.py", line 459, in max
    self._validate(self._min, max)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-7ELhrWBI-py3.10/lib/python3.10/site-packages/arcaflow_plugin_sdk/schema.py", line 429, in _validate
    if not isinstance(min, int):
RecursionError: maximum recursion depth exceeded while calling a Python object

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/mleader/workspace/arcaflow/arca-fio/fio_plugin.py", line 202, in <module>
    fio_input_schema = plugin.build_object_schema(FioParams)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-7ELhrWBI-py3.10/lib/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 473, in build_object_schema
    r = _Resolver.resolve(t)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-7ELhrWBI-py3.10/lib/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 109, in resolve
    return cls._resolve_abstract_type(t, tuple(path))
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-7ELhrWBI-py3.10/lib/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 113, in _resolve_abstract_type
    result = cls._resolve(t, path)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-7ELhrWBI-py3.10/lib/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 138, in _resolve
    return cls._resolve_type(t, path)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-7ELhrWBI-py3.10/lib/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 182, in _resolve_type
    return _Resolver._resolve_class(t, path)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-7ELhrWBI-py3.10/lib/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 235, in _resolve_class
    name, final_field = cls._resolve_dataclass_field(f, tuple(new_path))
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-7ELhrWBI-py3.10/lib/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 195, in _resolve_dataclass_field
    underlying_type = cls._resolve_field(t.type, path)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-7ELhrWBI-py3.10/lib/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 128, in _resolve_field
    result = cls._resolve(t, path)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-7ELhrWBI-py3.10/lib/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 158, in _resolve
    return cls._resolve_annotated(t, path)
  File "/home/mleader/.cache/pypoetry/virtualenvs/arca-fio-7ELhrWBI-py3.10/lib/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 322, in _resolve_annotated
    raise SchemaBuildException(
arcaflow_plugin_sdk.plugin.SchemaBuildException: Invalid schema definition for FioParams -> direct -> typing.Annotated -> 2: Failed to execute Annotated argument: maximum recursion depth exceeded while calling a Python object

Additional context

No error is thrown when only two parameters are passed to the annotation.

@dataclass
class FioParams:
    name: str
    size: str
    ioengine: IoEngine
    iodepth: int
    rate_iops: int
    io_submit_mode: IoSubmitMode
    direct: typing.Annotated[Optional[int], validation.min(0)] = 0
    atomic: typing.Annotated[Optional[int], validation.min(0)] = 0
    buffered: typing.Annotated[Optional[int], validation.min(0)] = 1
    readwrite: Optional[IoPattern] = IoPattern.read.value
    rate_process: Optional[RateProcess] = RateProcess.linear.value
โฏ python fio_plugin.py -f mocks/poisson-rate-submission_input.yaml --debug
output_id: success
output_data:
  fio version: fio-3.29
  timestamp: 1660068143
  timestamp_ms: 1660068143658
  time: Tue Aug  9 14:02:23 2022
  jobs:
  - jobname: poisson-rate-submit
    groupid: 0
    error: 0
    eta: 2147483647
    elapsed: 1
    job options:
      name: poisson-rate-submit
      size: 100k
      ioengine: sync
      iodepth: '32'
      rate_iops: '50'
      io_submit_mode: inline
      direct: '1'
      atomic: '0'
      buffered: '1'
      rw: randrw
      rate_process: poisson
    read:
      io_bytes: 28672
      io_kbytes: 28
      bw_bytes: 141241
      bw: 137
      iops: 34.482759
      runtime: 203
      total_ios: 7
      short_ios: 0
      drop_ios: 0
      slat_ns:
        min: 0
        max: 0
        mean: 0.0
        stddev: 0.0
        N: 0
      clat_ns:
        min: 300748
        max: 1064058
        mean: 487893.0
        stddev: 261419.937943
        N: 7
        percentile:
          '1.000000': 301056
          '5.000000': 301056
          '10.000000': 301056
          '20.000000': 333824
          '30.000000': 374784
          '40.000000': 374784
          '50.000000': 411648
          '60.000000': 464896
          '70.000000': 464896
          '80.000000': 464896
          '90.000000': 1056768
          '95.000000': 1056768
          '99.000000': 1056768
          '99.500000': 1056768
          '99.900000': 1056768
          '99.950000': 1056768
          '99.990000': 1056768
        bins:
          '301056': 1
          '333824': 1
          '374784': 1
          '411648': 1
          '464896': 2
          '1056768': 1
      lat_ns:
        min: 301981
        max: 1065275
        mean: 489172.285714
        stddev: 261379.680253
        N: 7
      bw_min: 0
      bw_max: 0
      bw_agg: 0.0
      bw_mean: 0.0
      bw_dev: 0.0
      bw_samples: 0
      iops_min: 0
      iops_max: 0
      iops_mean: 0.0
      iops_stddev: 0.0
      iops_samples: 0
    write:
      io_bytes: 73728
      io_kbytes: 72
      bw_bytes: 363192
      bw: 354
      iops: 88.669951
      runtime: 203
      total_ios: 18
      short_ios: 0
      drop_ios: 0
      slat_ns:
        min: 0
        max: 0
        mean: 0.0
        stddev: 0.0
        N: 0
      clat_ns:
        min: 68450
        max: 133276
        mean: 91804.888889
        stddev: 17230.069379
        N: 18
        percentile:
          '1.000000': 68096
          '5.000000': 68096
          '10.000000': 73216
          '20.000000': 75264
          '30.000000': 79360
          '40.000000': 82432
          '50.000000': 83456
          '60.000000': 95744
          '70.000000': 103936
          '80.000000': 107008
          '90.000000': 107008
          '95.000000': 134144
          '99.000000': 134144
          '99.500000': 134144
          '99.900000': 134144
          '99.950000': 134144
          '99.990000': 134144
        bins:
          '68096': 1
          '73216': 1
          '74240': 1
          '75264': 1
          '78336': 1
          '79360': 1
          '80384': 1
          '82432': 1
          '83456': 1
          '94720': 1
          '95744': 1
          '103936': 2
          '107008': 4
          '134144': 1
      lat_ns:
        min: 69953
        max: 135634
        mean: 93775.388889
        stddev: 17508.57394
        N: 18
      bw_min: 0
      bw_max: 0
      bw_agg: 0.0
      bw_mean: 0.0
      bw_dev: 0.0
      bw_samples: 0
      iops_min: 0
      iops_max: 0
      iops_mean: 0.0
      iops_stddev: 0.0
      iops_samples: 0
    trim:
      io_bytes: 0
      io_kbytes: 0
      bw_bytes: 0
      bw: 0
      iops: 0.0
      runtime: 0
      total_ios: 0
      short_ios: 0
      drop_ios: 0
      slat_ns:
        min: 0
        max: 0
        mean: 0.0
        stddev: 0.0
        N: 0
      clat_ns:
        min: 0
        max: 0
        mean: 0.0
        stddev: 0.0
        N: 0
      lat_ns:
        min: 0
        max: 0
        mean: 0.0
        stddev: 0.0
        N: 0
      bw_min: 0
      bw_max: 0
      bw_agg: 0.0
      bw_mean: 0.0
      bw_dev: 0.0
      bw_samples: 0
      iops_min: 0
      iops_max: 0
      iops_mean: 0.0
      iops_stddev: 0.0
      iops_samples: 0
    sync:
      total_ios: 0
      lat_ns:
        min: 0
        max: 0
        mean: 0.0
        stddev: 0.0
        N: 0
    job_runtime: 202
    usr_cpu: 0.0
    sys_cpu: 1.485149
    ctx: 30
    majf: 0
    minf: 15
    iodepth_level:
      '1': 100.0
      '2': 0.0
      '4': 0.0
      '8': 0.0
      '16': 0.0
      '32': 0.0
      '>=64': 0.0
    iodepth_submit:
      '0': 0.0
      '4': 100.0
      '8': 0.0
      '16': 0.0
      '32': 0.0
      '64': 0.0
      '>=64': 0.0
    iodepth_complete:
      '0': 0.0
      '4': 100.0
      '8': 0.0
      '16': 0.0
      '32': 0.0
      '64': 0.0
      '>=64': 0.0
    latency_ns:
      '2': 0.0
      '4': 0.0
      '10': 0.0
      '20': 0.0
      '50': 0.0
      '100': 0.0
      '250': 0.0
      '500': 0.0
      '750': 0.0
      '1000': 0.0
    latency_us:
      '2': 0.0
      '4': 0.0
      '10': 0.0
      '20': 0.0
      '50': 0.0
      '100': 44.0
      '250': 28.0
      '500': 24.0
      '750': 0.0
      '1000': 0.0
    latency_ms:
      '2': 4.0
      '4': 0.0
      '10': 0.0
      '20': 0.0
      '50': 0.0
      '100': 0.0
      '250': 0.0
      '500': 0.0
      '750': 0.0
      '1000': 0.0
      '2000': 0.0
      '>=2000': 0.0
    latency_depth: 32
    latency_target: 0
    latency_percentile: 100.0
    latency_window: 0
  disk_util:
  - name: dm-3
    read_ios: 5
    write_ios: 0
    read_merges: 0
    write_merges: 0
    read_ticks: 2
    write_ticks: 0
    in_queue: 2
    util: 2.857143
    aggr_read_ios: 7
    aggr_write_ios: 0
    aggr_read_merges: 0
    aggr_write_merge: 0
    aggr_read_ticks: 3
    aggr_write_ticks: 0
    aggr_in_queue: 3
    aggr_util: 3.08642
  - name: dm-0
    read_ios: 7
    write_ios: 0
    read_merges: 0
    write_merges: 0
    read_ticks: 3
    write_ticks: 0
    in_queue: 3
    util: 3.08642
    aggr_read_ios: 7
    aggr_write_ios: 0
    aggr_read_merges: 0
    aggr_write_merge: 0
    aggr_read_ticks: 2
    aggr_write_ticks: 0
    aggr_in_queue: 2
    aggr_util: 3.08642
  - name: nvme0n1
    read_ios: 7
    write_ios: 0
    read_merges: 0
    write_merges: 0
    read_ticks: 2
    write_ticks: 0
    in_queue: 2
    util: 3.08642

Type support: durations

Time durations should be supported by the typing system, denoting the number of nanoseconds that passed between two points in time. Only the duration should be stored, not the start and end times. The type should be able to unserialize from a string, decoding time expressions such as "5m2s". The unserialization should tolerate any included whitespace, but should error if unknown characters are encountered.
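A minimal parsing sketch for such a duration type could look like the following (the function and unit table are illustrative, not part of the SDK):

```python
import re

# Unit suffixes mapped to nanoseconds (illustrative sketch, not SDK API).
_UNITS = {
    "h": 3_600_000_000_000,
    "m": 60_000_000_000,
    "s": 1_000_000_000,
    "ms": 1_000_000,
    "us": 1_000,
    "ns": 1,
}

_TOKEN = re.compile(r"(\d+)([a-z]+)")


def parse_duration(expr: str) -> int:
    """Parse a time expression such as '5m2s' into nanoseconds.

    Included whitespace is ignored; any other unknown character raises
    ValueError.
    """
    compact = re.sub(r"\s+", "", expr)
    pos = 0
    total = 0
    for match in _TOKEN.finditer(compact):
        if match.start() != pos:
            raise ValueError(f"unexpected characters in duration: {expr!r}")
        value, unit = match.groups()
        if unit not in _UNITS:
            raise ValueError(f"unknown unit {unit!r} in duration: {expr!r}")
        total += int(value) * _UNITS[unit]
        pos = match.end()
    if pos != len(compact):
        raise ValueError(f"unexpected characters in duration: {expr!r}")
    return total
```

For example, `parse_duration("5m2s")` and `parse_duration(" 5m 2s ")` would both yield the same nanosecond count, while `"5m2x"` would raise.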

Types and schemas should check validity

Currently the schema and type objects do not check whether they receive correct data types in their constructors, which may lead to hidden bugs. Additionally, type objects should check in detail whether they match the underlying objects.
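As a sketch of what such constructor checking could look like (the class and exception names here are illustrative, not the SDK's actual API):

```python
class BadArgumentException(Exception):
    """Raised when a schema object is constructed with invalid arguments."""


class IntType:
    """Illustrative integer type object that validates its own bounds."""

    def __init__(self, min=None, max=None):
        # Reject wrongly-typed bounds up front instead of failing later.
        for name, value in (("min", min), ("max", max)):
            if value is not None and not isinstance(value, int):
                raise BadArgumentException(
                    f"{name} must be an int, got {type(value).__name__}"
                )
        if min is not None and max is not None and min > max:
            raise BadArgumentException("min must not exceed max")
        self._min = min
        self._max = max
```

Checking eagerly in the constructor surfaces the mistake at schema-build time, with a message naming the bad argument, rather than as an obscure failure during validation.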

Using typing.Annotated for non-optional fields throws an error when validation.required_if_not is used

Consider the data class

@dataclass
class NetworkScenarioConfig:
    node_interface_name: typing.Annotated[
        int, validation.required_if_not("label_selector")
    ]

    label_selector: typing.Annotated[
        typing.Optional[str], validation.required_if_not("node_interface_name")
    ] = field(
        default=None,
        metadata={
            "name": "Label selector",
            "description": "Kubernetes label selector for the target nodes. Required if node_interface_name is not set.\n"
            "See https://kubernetes.io/docs/concepts/overview/working-with-objects/labels/ for details.",
        }
    )

Running the Python program gives the following error:

Traceback (most recent call last):
  File "/root/test2022/krkn_network/krkn/venv_n/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 352, in _resolve_annotated
    underlying_t = args[i](underlying_t)
  File "/root/test2022/krkn_network/krkn/venv_n/lib/python3.9/site-packages/arcaflow_plugin_sdk/validation.py", line 124, in call
    raise BadArgumentException("required_if_not is only valid for fields on object types.")
arcaflow_plugin_sdk.schema.BadArgumentException: required_if_not is only valid for fields on object types.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/root/test2022/krkn_network/krkn/run_kraken.py", line 15, in <module>
    import kraken.pod_scenarios.setup as pod_scenarios
  File "/root/test2022/krkn_network/krkn/kraken/pod_scenarios/setup.py", line 4, in <module>
    from kraken.plugins import pod_plugin
  File "/root/test2022/krkn_network/krkn/kraken/plugins/__init__.py", line 11, in <module>
    from kraken.plugins.network.ingress_shaping import network_chaos
  File "/root/test2022/krkn_network/krkn/kraken/plugins/network/ingress_shaping.py", line 502, in <module>
    def network_chaos(cfg: NetworkScenarioConfig,
  File "/root/test2022/krkn_network/krkn/venv_n/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 76, in step_decorator
    input=build_object_schema(input_param.annotation, True),
  File "/root/test2022/krkn_network/krkn/venv_n/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 525, in build_object_schema
    r = _Resolver.resolve(t)
  File "/root/test2022/krkn_network/krkn/venv_n/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 113, in resolve
    return cls._resolve_abstract_type(t, tuple(path))
  File "/root/test2022/krkn_network/krkn/venv_n/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 117, in _resolve_abstract_type
    result = cls._resolve(t, path)
  File "/root/test2022/krkn_network/krkn/venv_n/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 142, in _resolve
    return cls._resolve_type(t, path)
  File "/root/test2022/krkn_network/krkn/venv_n/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 186, in _resolve_type
    return _Resolver._resolve_class(t, path)
  File "/root/test2022/krkn_network/krkn/venv_n/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 267, in _resolve_class
    name, final_field = cls._resolve_dataclass_field(f, tuple(new_path))
  File "/root/test2022/krkn_network/krkn/venv_n/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 199, in _resolve_dataclass_field
    underlying_type = cls._resolve_field(t.type, path)
  File "/root/test2022/krkn_network/krkn/venv_n/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 132, in _resolve_field
    result = cls._resolve(t, path)
  File "/root/test2022/krkn_network/krkn/venv_n/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 162, in _resolve
    return cls._resolve_annotated(t, path)
  File "/root/test2022/krkn_network/krkn/venv_n/lib/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 354, in _resolve_annotated
    raise SchemaBuildException(
arcaflow_plugin_sdk.plugin.SchemaBuildException: Invalid schema definition for NetworkScenarioConfig -> node_interface_name -> typing.Annotated -> 1: Failed to execute Annotated argument: required_if_not is only valid for fields on object types.

If node_interface_name is changed to
node_interface_name: typing.Annotated[typing.Optional[int], validation.required_if_not("label_selector")]
it seems to work. The same applies to other validation functions such as validation.min().

Drop HTTP support

Please describe what you would like to see in this project

Drop HTTP support from the SDK since nobody is using it and it's pulling in the Twisted dependency.

Using schema.ANY_TYPE returns NameError

Describe the bug

Using the schema.ANY_TYPE from latest version of arcaflow-plugin-sdk-python results in a NameError.

To reproduce

Sample implementation: https://github.com/engelmi/arcaflow-plugin-elasticsearch/tree/43ad64a3ab204a51ac2242da5b0bf8f9ca2d4965

When implementing a plugin:

  1. Install arcaflow-plugin-sdk-python@bce430a59a5782ef26aed9a196149031c667c6ff, e.g. via pip:
     pip install git+https://github.com/arcalot/arcaflow-plugin-sdk-python@bce430a59a5782ef26aed9a196149031c667c6ff
  2. Add a field to the schema definition:
     data: schema.ANY_TYPE = field()
  3. Run the plugin.

Additional context

Failure happens for typing.get_type_hints(t) when _resolve_class is called:
https://github.com/arcalot/arcaflow-plugin-sdk-python/blob/main/src/arcaflow_plugin_sdk/schema.py#L5656

Detect common type mistakes, and give useful feedback

Please describe what you would like to see in this project

One common mistake that several members of the team have made at some point is specifying a plain string instead of an enum member for a string enum in the Python SDK. It would be useful to detect common type errors like this and explain that the enum member must be referenced in code; the plain string belongs only in the YAML input.
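A sketch of such a check (hypothetical helper, not SDK code): when the value is a string that matches an enum member's value, the error message can point the user at the member they probably meant.

```python
import enum


class IoPattern(enum.Enum):
    read = "read"
    randrw = "randrw"


def check_enum_value(t, value):
    """Illustrative check: reject a raw string where an enum member is
    expected, and name the member the user probably meant."""
    if isinstance(value, t):
        return value
    if isinstance(value, str):
        for member in t:
            if member.value == value:
                raise TypeError(
                    f"expected a {t.__name__} member, got the string "
                    f"{value!r}; reference {t.__name__}.{member.name} in "
                    f"code and use the plain string only in the YAML input"
                )
    raise TypeError(f"expected {t.__name__}, got {type(value).__name__}")
```

Here `check_enum_value(IoPattern, "read")` would fail with a message suggesting `IoPattern.read`, instead of a generic type mismatch.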

TypeError unclear error

Describe the bug

This error is not transparent, which does not align with the goals of this project.

Traceback (most recent call last):
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/uperf_plugin.py", line 118, in <module>
    sys.exit(plugin.run(plugin.build_schema(
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 530, in run
    return _execute_file(step_id, s, options, stdout, stderr)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/plugin.py", line 576, in _execute_file
    output_id, output_data = s(step_id, data)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/schema.py", line 1031, in __call__
    serialized_output_data = self._serialize_output(step, output_id, output_data)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/schema.py", line 1004, in _serialize_output
    return step.outputs[output_id].serialize(output_data)
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/schema.py", line 692, in serialize
    result[property_id] = property_field.type.serialize(getattr(data, field_id), tuple(new_path))
  File "/home/jaredoconnell/Documents/projects/arcalot/arcalot-uperf/venv/lib64/python3.10/site-packages/arcaflow_plugin_sdk/schema.py", line 538, in serialize
    for key in data.keys():
TypeError: unbound method dict.keys() needs an argument

To reproduce

I accidentally passed a types.GenericAlias instead of a dict by writing dict[int, UPerfRawData] where an empty dict was intended.

I solved the problem, but a type check in the schema.py file would help in these situations.
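A check along these lines (a sketch, not the actual schema.py code) could catch a type expression being passed where an instance is expected:

```python
import types


def ensure_instance(data):
    """Illustrative check: reject a class or generic alias
    (e.g. dict[int, str]) passed where a concrete value such as {}
    was expected."""
    if isinstance(data, (type, types.GenericAlias)):
        raise TypeError(
            f"expected a value, but got the type expression {data!r}; "
            "did you mean to pass an instance, e.g. an empty dict?"
        )
    return data
```

Since Python 3.9, `dict[int, str]` is a `types.GenericAlias`, so this check turns the opaque "unbound method dict.keys() needs an argument" into an actionable message.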

Coverage Not Visible with Fork

Describe the bug

Coverage of the ATP server code is not visible because the coverage library does not follow the code into the fork. The code is in fact exercised, but its coverage should be reported so that future changes do not cause regressions in code coverage.

One solution may be to switch from fork to multiprocessing.

Matt came up with this for multiprocessing:

coverage run --concurrency=multiprocessing,thread -m unittest test_example_plugin.py src/arcaflow_plugin_sdk/test_* && coverage combine && coverage report

.coveragerc

[run]
concurrency=multiprocessing,thread

float type validation fails

Describe the bug

Trying to set a validation for input parameters of type float fails.

To reproduce

@dataclass
class InputParams:
    seconds: typing.Annotated[float, validation.min(0.0)] = field(
        metadata={
            "id": "seconds",
            "name": "seconds",
            "description": "number of seconds to wait as a floating point "
                           "number for subsecond precision."
        }
    )

leads to:

Traceback (most recent call last):
  File "~/arcaflow-plugin-wait/venv/lib64/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 352, in _resolve_annotated
    underlying_t = args[i](underlying_t)
  File "~/arcaflow-plugin-wait/venv/lib64/python3.9/site-packages/arcaflow_plugin_sdk/validation.py", line 28, in call
    effective_t.min = param
AttributeError: can't set attribute

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "~/arcaflow-plugin-wait/wait_plugin.py", line 49, in <module>
    def wait(
  File "~/arcaflow-plugin-wait/venv/lib64/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 76, in step_decorator
    input=build_object_schema(input_param.annotation, True),
  File "~/arcaflow-plugin-wait/venv/lib64/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 525, in build_object_schema
    r = _Resolver.resolve(t)
  File "~/arcaflow-plugin-wait/venv/lib64/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 113, in resolve
    return cls._resolve_abstract_type(t, tuple(path))
  File "~/arcaflow-plugin-wait/venv/lib64/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 117, in _resolve_abstract_type
    result = cls._resolve(t, path)
  File "~/arcaflow-plugin-wait/venv/lib64/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 142, in _resolve
    return cls._resolve_type(t, path)
  File "~/arcaflow-plugin-wait/venv/lib64/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 186, in _resolve_type
    return _Resolver._resolve_class(t, path)
  File "~/arcaflow-plugin-wait/venv/lib64/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 267, in _resolve_class
    name, final_field = cls._resolve_dataclass_field(f, tuple(new_path))
  File "~/arcaflow-plugin-wait/venv/lib64/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 199, in _resolve_dataclass_field
    underlying_type = cls._resolve_field(t.type, path)
  File "~/arcaflow-plugin-wait/venv/lib64/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 132, in _resolve_field
    result = cls._resolve(t, path)
  File "~/arcaflow-plugin-wait/venv/lib64/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 162, in _resolve
    return cls._resolve_annotated(t, path)
  File "~/arcaflow-plugin-wait/venv/lib64/python3.9/site-packages/arcaflow_plugin_sdk/plugin.py", line 354, in _resolve_annotated
    raise SchemaBuildException(
arcaflow_plugin_sdk.plugin.SchemaBuildException: Invalid schema definition for InputParams -> seconds -> typing.Annotated -> 1: Failed to execute Annotated argument: can't set attribute

Additional context

$ pip list
Package             Version
------------------- -------
arcaflow-plugin-sdk 0.7.0
attrs               22.1.0
Automat             20.2.0
bandit              1.7.4
constantly          15.1.0
flake8              5.0.4
gitdb               4.0.9
GitPython           3.1.27
hyperlink           21.0.0
idna                3.3
incremental         21.3.0
mccabe              0.7.0
pbr                 5.10.0
pip                 22.2.2
pycodestyle         2.9.1
pyflakes            2.5.0
PyYAML              6.0
setuptools          50.3.2
six                 1.16.0
smmap               5.0.0
stevedore           4.0.0
Twisted             22.4.0
typing_extensions   4.3.0
zope.interface      5.4.0

Allow a plugin to accept input parameters from stdin when used with the CLI

Please describe what you would like to see in this project

Currently when running a plugin from the CLI, the command expects a file path passed to the -f flag. When the plugin is containerized, it can only accept a file path that is accessible within the container, making it difficult to run the command directly with various input parameters. Usual bash hacks to pass stdin to the command don't seem to work because of file extension checks.

$ cat smallfile-example.yaml | podman run --rm smallfile-test -f /dev/stdin
Unsupported file extension: /dev/stdin
$ podman run --rm smallfile-test -f <(cat smallfile-example.yaml)
Unsupported file extension: /dev/fd/63

What would be nice is for the containerized command to simply accept input piped directly to stdin as a default.

$ cat smallfile-example.yaml | podman run --rm smallfile-test

Please describe your use case

I should be able to run the containerized plugin freely with any input parameters that match the input schema.
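One possible shape for this behavior (a hypothetical helper, not the SDK's current CLI handling): fall back to reading stdin when no file path is given.

```python
import sys


def load_raw_input(path=None):
    """Illustrative sketch: read the plugin input from the -f file path,
    or from stdin when no path (or '-') is given, so data can be piped
    directly into a containerized plugin."""
    if path is None or path == "-":
        return sys.stdin.read()
    with open(path) as handle:
        return handle.read()
```

The returned text would then be parsed as YAML regardless of its origin, sidestepping the file extension check for piped input.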

Type support: union types

We need support for union types of objects:

import abc
import dataclasses
import typing


class AbstractLoaderParams(abc.ABC):
    """
    This is the parent class that defines a method for constructing a set of command line arguments.
    """
    @abc.abstractmethod
    def get_cmd_args(self) -> typing.List[str]:
        """
        This function returns a list of arguments to pass to the command line.
        :return:
        """
        pass


@dataclasses.dataclass
class CPULoaderParams(AbstractLoaderParams):
    """
    This loader adds CPU parameters.
    """
    cores: int

    def get_cmd_args(self) -> typing.List[str]:
        return ["--cpu", str(self.cores)]


@dataclasses.dataclass
class MemoryLoaderParams(AbstractLoaderParams):
    """
    This loader adds memory parameters.
    """
    use_ram_mb: int

    def get_cmd_args(self) -> typing.List[str]:
        return ["--mem", str(self.use_ram_mb)]


@dataclasses.dataclass
class LoaderParams(AbstractLoaderParams):
    """
    This class is the root parameters class that collects a series of parameter objects and constructs
    command line arguments from them.
    """
    serial_loaders: typing.List[typing.Union[CPULoaderParams, MemoryLoaderParams]]

    def get_cmd_args(self) -> typing.List[str]:
        result = []
        for loaders in self.serial_loaders:
            for arg in loaders.get_cmd_args():
                result.append(arg)
        return result


LoaderParams(
    [
        CPULoaderParams(2),
        MemoryLoaderParams(64),
        CPULoaderParams(3),
    ]
)
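Unserializing such a union requires deciding which member a given input dict belongs to. One common approach, shown here as a hypothetical sketch with a `_type` discriminator key (not the SDK's actual mechanism), is:

```python
import dataclasses


@dataclasses.dataclass
class CPULoaderParams:
    cores: int


@dataclasses.dataclass
class MemoryLoaderParams:
    use_ram_mb: int


def unserialize_union(data, members):
    """Illustrative sketch: pick the union member via a discriminator key.

    members maps discriminator values to dataclass types, e.g.
    {"cpu": CPULoaderParams, "memory": MemoryLoaderParams}.
    """
    payload = dict(data)
    kind = payload.pop("_type", None)
    if kind not in members:
        raise ValueError(f"unknown union member: {kind!r}")
    return members[kind](**payload)
```

A discriminator keeps unserialization unambiguous even when two members happen to have structurally similar fields.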

Create a plugin and workflow for testing the SDK

Please describe what you would like to see in this project

The unit tests are covering the internal SDK, but we need test cases for the interface to the SDK. This can be done by creating a simple plugin, getting the relevant engine version, and running that plugin using the engine and the version of the SDK under test.
This is needed for integration testing.

This should ideally run in a GitHub workflow to verify that the changes work.

Please describe your use case

This should remove the need to manually substitute the new SDK version into another plugin and run it in the engine to verify that a PR didn't break anything.

We need to find a way to make sure that we can test and don't get bogged down by any of the following scenarios:

  • A change to the SDK that doesn't require an engine change
  • A change to the SDK that requires a changed engine
