
redfish-interop-validator's Introduction

Copyright 2017-2020 DMTF. All rights reserved.

Redfish Interop Validator

About

The Redfish Interop Validator is a Python 3 tool that validates a Redfish service against an interoperability profile given to the tool. The purpose of the tool is to verify that a specific service is compatible with vendor systems or system tools, based on the vendor's requirements as captured in a profile.

Introduction

This tool is designed to accept a profile conformant to the schema defined by the DMTF Redfish Interoperability Profile specification and to run against any valid Redfish service for a given device. It is not biased toward any specific hardware and depends only on the current Redfish specification.

Installation

From PyPI:

pip install redfish_interop_validator

From GitHub:

git clone https://github.com/DMTF/Redfish-Interop-Validator.git
cd Redfish-Interop-Validator
python setup.py sdist
pip install dist/redfish_interop_validator-x.x.x.tar.gz

Requirements

External modules are required. You may install the prerequisites by running:

pip3 install -r requirements.txt

If you have a previous beautifulsoup4 installation, use the following command:

pip3 install beautifulsoup4 --upgrade

The tool has no OS-specific dependencies and runs on Windows or Linux. The result logs are generated in HTML format, and a suitable browser, such as Chrome, Firefox, or Edge, is required to view the logs on the client system.

Execution Steps

The Redfish Interop Validator is designed to execute as a purely command-line tool, with no intermediate inputs expected during execution. Below are the step-by-step instructions for setting up the tool to run against any identified Redfish device for conformance testing:

Modify the config\example.ini file to enter the system details under the sections below.

[Tool]

Variable   Type     Definition
Version    string   Internal config version (optional)
Copyright  string   DMTF copyright (optional)
verbose    int      Level of verbosity (0-3)

[Interop]

Variable   Type     Definition
Profile    string   Name of the testing profile (mandatory)
Schema     string   Name of the JSON schema to test the profile against

[Host]

Variable     Type     Definition
ip           string   Host of the testing system, formatted as https://ip:port (http can be used as well)
username     string   Username for Basic authentication
password     string   Password for Basic authentication (removed from logs)
description  string   Description of the system being tested (optional)
forceauth    boolean  Force authentication even on http servers
authtype     string   Authorization type (Basic | Session | Token | None)
token        string   Token string for Token authentication

[Validator]

Variable               Type     Definition
payload                string   Option to test a specific payload or resource tree (see below)
logdir                 string   Place to save logs and run configs
oemcheck               boolean  Whether to check Oem items on the service
online_profiles        boolean  Whether to download online profiles
debugging              boolean  Whether to print debug output to the log
required_profiles_dir  string   Option to set the root folder of required profiles
collectionlimit        string   Sets a limit on links gathered from collections by type, e.g. "ComputerSystem 20" limits ComputerSystemCollection to 20 links
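
As a worked example, the sections above can be combined into a single configuration file along the lines of the following sketch. All values here are placeholders for illustration (the profile name, address, and credentials are not real); adjust them for the target system.

[Tool]
verbose = 1

[Interop]
Profile = profiles/MyProfile.v1_0_0.json

[Host]
ip = https://192.168.1.100:443
username = admin
password = mypassword
authtype = Basic

[Validator]
payload = Tree /redfish/v1
logdir = ./logs
oemcheck = True
online_profiles = True
debugging = False

A file like this can then be passed to the tool with its -c option, as shown in the invocation examples in the issues below, with the profile either taken from the [Interop] section or given on the command line.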

Payload options

The payload option takes two parameters, given as "option uri":

First parameter (Single, SingleFile, Tree, TreeFile): how to test the given payload. Single reports only on a single resource, while Tree reports on every link from that resource; the File variants read the payload from a local file rather than from the service.

Second parameter ([uri] or [filename]): the URI of the target payload, or the filename of a local file.
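
For example (the URI and filename here are hypothetical), a config entry of

payload = Tree /redfish/v1/Systems

would report on that resource and every link reachable from it, while

payload = SingleFile ./saved_system.json

would report only on the payload stored in that local file.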

HTML Log

To convert a previous HTML log into a csv file, use the following command:

python3 tohtml.py htmllogfile

Execution flow

    1. The Redfish Interop Validator starts with the Service Root resource by querying the service root URI and gathering the device information, the supported resources, and their links. Once the service root response is verified against the given profile (provided the profile contains requirements for ServiceRoot), the tool traverses all of the collections and navigation properties returned by the service.
    2. For each navigation property or collection of resources returned, it performs the following operations:
       i. Reads all navigation properties and resource collection members.
       ii. Queries the service with each individual resource URI and validates every resource returned by the service that is included in the profile given to the tool.
    3. Step 2 repeats until all URIs and resources are covered.
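
As an illustration only, the traversal described above amounts to a breadth-first walk over the service's @odata.id links, validating each resource whose type appears in the profile. The sketch below is not the validator's actual code; the get_resource and validate_resource callables and the profile layout are assumptions for the example.

from collections import deque

def collect_links(node):
    # Recursively gather @odata.id link targets from a decoded JSON payload
    links = []
    if isinstance(node, dict):
        for key, value in node.items():
            if key == "@odata.id" and isinstance(value, str):
                links.append(value)
            else:
                links.extend(collect_links(value))
    elif isinstance(node, list):
        for item in node:
            links.extend(collect_links(item))
    return links

def walk_service(get_resource, validate_resource, profile, root="/redfish/v1"):
    # Breadth-first traversal: validate every resource whose type is named
    # in the profile, then follow all of the links it exposes
    visited, queue = set(), deque([root])
    while queue:
        uri = queue.popleft()
        if uri in visited:
            continue
        visited.add(uri)
        payload = get_resource(uri)  # GET the resource payload from the service
        rtype = payload.get("@odata.type", "").lstrip("#").split(".")[0]
        if rtype in profile.get("Resources", {}):
            validate_resource(payload, profile["Resources"][rtype])
        queue.extend(link for link in collect_links(payload) if link not in visited)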

Upon validation of a resource, the following types of tests may occur:

  • Unlike the Service Validator, this program does not necessarily list and warn about problematic resources; those problems are expected to be found with the Service Validator and are ignored here.
  • When a resource is found, check whether it exists in the provided profile; otherwise ignore it and move on to the next available resources via its links.
  • Once the resource is initiated, validate it and the properties that exist in the profile given to the program, with the following possible tests:
    • MinVersion - Test the @odata.type/version of the resource being tested, which must be no lower than the given MinVersion in the profile
    • MinCount - Based on the @odata.count annotation, determine the size of a given collection or list, which must be at least the given MinCount in the profile
    • ReadRequirement - Test the existence of a property or resource, depending on whether it is Recommended or Mandatory (others unimplemented) in the profile
    • Members - Test a resource's "Members" property, which includes the MinCount test
    • MinSupportedValues - Test the enumerations of a particular property, based on the @odata.SupportedValues annotation and the values given in the profile
    • Writeable/WriteRequirement - Test whether the property is read-write capable, depending on whether the profile requires it
    • Comparison - Test between an enum property's value and the values in the profile, with a particular set of comparisons available (a small sketch of this comparison logic follows after this list):
      • AnyOf, AllOf = compare whether any or all of the given values exist in a list or single enum
      • GreaterThan, LessThan, Equal, ... = compare using the named common comparison (less, greater, or equal)
      • Absent, Present = compare whether a property exists or does not
    • ConditionalRequirements - Perform some of the above tests only if one of the specified conditions is true:
      • Subordinate - Test whether this resource is a child/link of the type tree listed
      • Comparison - Test whether a comparison to a certain value is true
    • ActionRequirements - Perform tests based on what Actions require, such as ReadRequirement and AllowableValues
    • Check whether a property may be nulled or is mandatory, and pass based on its requirement and nullability
    • For collections, validate each member against the same property requirements, expecting a list rather than a single property; otherwise validate normally.
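
To make the comparison keywords concrete, here is a minimal sketch of how they could be evaluated. This is illustrative only and not the tool's implementation; for this sketch a missing property is represented as None.

def evaluate_comparison(value, comparison, expected):
    # value: the property's value (or list of values); None means the property is absent
    # expected: the profile's "Values" list
    values = value if isinstance(value, list) else [value]
    if comparison == "Absent":
        return value is None
    if comparison == "Present":
        return value is not None
    if comparison == "AnyOf":
        return any(v in expected for v in values)
    if comparison == "AllOf":
        return all(e in values for e in expected)
    if comparison == "Equal":
        return all(v == expected[0] for v in values)
    if comparison == "GreaterThan":
        return all(v > expected[0] for v in values)
    if comparison == "LessThan":
        return all(v < expected[0] for v in values)
    raise ValueError("Unknown comparison: " + comparison)

# e.g. evaluate_comparison("UEFI", "Equal", ["UEFI"]) -> True
# e.g. evaluate_comparison(None, "Absent", []) -> True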

Conformance Logs - Summary and Detailed Conformance Report

The Redfish Interop Validator generates reports in the "logs" folder: a text version named "InteropLog_MM_DD_YYYY_HHMMSS.txt" and an HTML version named "InteropHtmlLog_MM_DD_YYYY_HHMMSS.html". The reports give a detailed view of the individual properties checked, with a Pass/Fail/Skip/Warning status for each resource checked for conformance.

A verbose log file, located at logs/ConformanceLog_MM_DD_YYYY_HHMMSS.html, may be referenced to diagnose tool problems when the stdout output is insufficient.

Release Process

  1. Go to the "Actions" page
  2. Select the "Release and Publish" workflow
  3. Click "Run workflow"
  4. Fill out the form
  5. Click "Run workflow"

redfish-interop-validator's People

Contributors

arnewiebalck, arno500, billdodd, eccomiqua, elfosardo, jautor, jordanchw, karinamw-se, mraineri, pwvancil, rgmain, samerhaj, smiller248, tomasg2012

Stargazers

 avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar

Watchers

 avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar

redfish-interop-validator's Issues

KeyError: 'Protocol' when running Redfish Interop Validator with OCPBasicServer profile

I ran the Redfish Interop Validator with the Open Compute Project profiles OCPBasicServer and OCPManagedDevice. The error message "KeyError: 'Protocol'" pops up when I run with the OCPBasicServer profile, but not with the OCPManagedDevice profile.

Full error msg:

C:\Users\leejenn1\work\Redfish\Redfish-Interop-Validator-master>python RedfishInteropValidator.py -c config\config.ini profiles\OCPBasicServer.v1_0.json
CacheMode or path invalid, defaulting to Off
ConfigURI: https:// [deleted]
System Info: Test Config, place your own description of target system here
authtype: Basic,  cachefilepath: None,  cachemode: Off,  certificatebundle: None,  certificatecheck: False,  httpproxy: None,
httpsproxy: None,  localonlymode: False,  logpath: ./logs,  metadatafilepath: ./SchemaFiles/metadata,  payloadfilepath: None,  payloadmode: Default,
schemasuffix: _v1.xml,  servicemode: False,  timeout: 30,  username: root,  usessl: True,  warnrecommended: False,

Start time: 03/08/18 - 10:46:19
Traceback (most recent call last):
  File "RedfishInteropValidator.py", line 977, in <module>
    sys.exit(main(sys.argv))
  File "RedfishInteropValidator.py", line 854, in main
    success, counts, results, xlinks, topobj = validateURITree('/redfish/v1', 'ServiceRoot', profile, expectedJson=jsonData)
  File "RedfishInteropValidator.py", line 632, in validateURITree
    serviceVersion = profile["Protocol"].get('MinVersion', '1.0.0')
KeyError: 'Protocol'

The profiles can be downloaded here:
http://opencompute.org/wiki/Hardware_Management/SpecsAndDesigns#Baseline_and_Server_profile

Cannot validate if property/version is deprecated

Hi,
I am trying to validate the Profile Simulator.
On one side:

  • the Redfish Profile Simulator

On the other side:

  • the Redfish Interop Validator with the new DSP package DSP8010_2019.3.zip

I got the errors:
Something went wrong ...
KeyError: 'Term'

for
PostalAddress:GPSCoords, Location:Info, PCIDevices, PostalAddress:PostalLocation, Fan:FanName

These properties have a deprecated status.
It seems that the Validator does NOT take versionDeprecated into account?

In getPropertyDetails(), tag['Term'] raises an exception because the 'deprecated' structures are Collections (a Redfish.Revisions Collection inside the Annotation tag) and the propertyTags variable is flattened beforehand, as with GPSCoords (screenshot omitted).

Any idea / suggestion to fix it ? I would be pleased,

Best Regards,
Francine

The log file should include name of Profile file used

The log file should include the name of the Profile file read as input, and of all referenced Profile files.

Extra credit would be a hash for each Profile file. Then the Profile owner could determine that their Profile files had not been altered prior to the execution run.

failPayloadError in Sessions

Hello
I get this error every time I try to validate the Profile Simulator with the Interop Validator:
DSP package DSP8010_2019.3.zip

Terminal:
1 failPayloadError errors in /redfish/v1/SessionService/Sessions
Counter({'skipOptional': 139, 'pass': 130, 'metadataNamespaces': 60, 'skipOem': 23, 'passGet': 14, 'serviceNamespaces': 13, 'warningPresent':
7, 'missingNamespaces': 4, 'badNamespaceInclude': 1, 'reflink': 1, 'failPayloadError': 1})

Please can you help to fix that? I'll be grateful :)

Regards,
Youssef.

Add support for checking registries

This feature requires that registries be stated at the ServiceRoot, which is an optional navigation property; otherwise, report that this cannot be checked.

Flagging ERRORs that are not errors

I am running against OCPServerHardwareManagement.v0_2_3.json profile.

My ComputerSystem resource has "SKU": "MXQ32200VV" but not a "PartNumber". The profile test is supposed to ensure that one or the other is implemented (which it is, so it's not an error). But the tool seems to treat every conditional comparison that evaluates to false as an error, so I get a false positive here.

INFO - ### Validating PropertyRequirements for PartNumber
INFO - propRequirement with value: DNE
INFO - Testing ReadRequirement 
	expected:Recommended, exists: False
INFO - 	Item is recommended but does not exist
INFO - 	pass True
INFO - Evaluating conditionalRequirements
INFO - Testing a comparison 
	('MXQ32200VV', 'Absent', [])
INFO - 	pass False
ERROR - 	NoPass

Lots of Problems with Config and README

I had a lot of problems using the readme and setting up for execution, and then hit a quick exception. I will list the problems here as encountered.

  1. It appears the program requires the profile to be listed on the command line. This is not shown in the readme. It also appears you need to specify the whole path?
  2. Is it possible to supply the profile in the config file? It does not seem to work.
  3. Does the config file even work at all? The option strings in the sample config file do not seem to match the strings in the source. I was not able to get anywhere with the config file, so I used the command line.

For a profile I used https://github.com/DMTF/spmf/tree/master/profiles/RedfishInteroperabilityProfile.v1_0_0.json
The system used for the test passes the Redfish-Service-Validator with no errors.

Need a simple text report

The comprehensive log is great, but for the purpose of logging bugs against our products, it would be really useful to have a text output report that is easy to read at a glance and has one line per error. Each error line should contain:

  • Schema
  • Property (property path) - or Action
  • URI associated with the Schema (e.g. if it fails on only some instances of a schema class)
  • Error specifics
  • Line number in log file to reference for more verbose detail

The ConditionalRequirements options are not flexible enough to support some condition cases.

Case1 and Case2 below both come from spec DSP0272_1.0.1.pdf.
In Case2, the condition options seem fine, but the Validator's result is a failure. The cause appears to be the missing "Comparison" option.
According to the spec, ConditionalRequirements does not state that the "Comparison" option is required.
I think the Validator should regard Case2 as a valid case.

Case1:

"IndicatorLED": {
          "ReadRequirement": "Recommended",
          "ConditionalRequirements": [{
            "Purpose": "Physical Systems must have a writable Indicator LED",
            "CompareProperty": "SystemType",
            "CompareType": "AnyOf",
            "CompareValues": ["Physical"],
            "ReadRequirement": "Mandatory",
            "WriteRequirement": "Mandatory",
            "Comparison": "AnyOf",
            "Values": ["Off"]
          }]
}

Case1 passes the Validator's check.

Case2:

"IndicatorLED": {
          "ReadRequirement": "Recommended",
          "ConditionalRequirements": [{
            "Purpose": "Physical Systems must have a writable Indicator LED",
            "CompareProperty": "SystemType",
            "CompareType": "AnyOf",
            "CompareValues": ["Physical"],
            "ReadRequirement": "Mandatory",
            "WriteRequirement": "Mandatory"
          }]
}

Case2 fails the Validator's check with the following error messages:

KeyError: 'Comparison'
ERROR - Could not finish validation check on this payload

"If Implemented" subordinate resource links treated as Mandatory

The "If Implemented" requirement needs to be handled carefully. The tool is failing if an "if implemented" subordinate resource link is not present. That could be because the link is a JSON object (may be considering the odata.id as mandatory since the object is present).

This could be an exception case to be handled.

Conditional requirements not passing

Conditionals with "CompareProperty" don't appear to function properly. I get an "ERROR - NoPass" in the log and in the HTML output (with no context, either).

Snippet from a Boot object requirement in a ComputerSystem profile:

 "UefiTargetBootSourceOverride": {
   "ConditionalRequirements": [
        {
        "CompareProperty": "BootSourceOverrideMode", 
        "Comparison": "Equal", 
        "Purpose": "If UEFI mode is selected, must allow for UEFI target.", 
        "ReadRequirement": "Mandatory", 
        "Values": [
           "UEFI"
          ]
        }
       ], 
      "ReadRequirement": "Recommended", 
      "WriteRequirement": "Mandatory"
    }, 

and in payload:

        "BootSourceOverrideEnabled": "Once",
        "BootSourceOverrideMode": "UEFI",
        "BootSourceOverrideTarget": "Hdd",
        "UefiTargetBootSourceOverride": "None",

The condition should apply, and the resulting Mandatory requirement is met. But the log output says:

INFO - inside complex Boot.UefiTargetBootSourceOverride
INFO - propRequirement with value: None
INFO - Testing ReadRequirement 
	expected:Recommended, exists: True
INFO - 	pass True
INFO - writeable 
	Mandatory
INFO - Evaluating conditionalRequirements
INFO - Testing a comparison 
	('UEFI', 'Equal', [])
INFO - 	pass False
ERROR - 	NoPass
INFO - 	Condition does not apply

And in the HTML output there's an "ERROR - NoPass" for that resource (but no context as to which property or requirement).

Should not flag errors for failure to find unneeded schema

I am getting multiple ERRORs for non-existent XML schema:

INFO - 	 URI /redfish/v1/UpdateService/FirmwareInventory/3, Type (SoftwareInventory), GET SUCCESS (time: 0.302655)
ERROR - getResourceObject: Namespace appears nonexistent in SchemaXML: #HpeiLOSoftwareInventory.v2_0_0.HpeiLOSoftwareInventory http://redfish.dmtf.org/schemas/v1/Resource_v1.xml

These are all for OEM schema (OEM extensions to standard Redfish schema) and are not referenced by the profile I'm testing against. The tool should not flag errors like this because they are not profile validation issues.

Support "URI" requirements

Profile spec v1.1 added URI patterns for requirements. This is very useful for easier profile construction, but the toolchain needs to support it in order for folks to implement that functionality.

Allow specifying multiple profiles per run

The tool only lets you run multiple profiles if they are required by the parent profile. Allow the program to specify multiple profiles in one run.

Additionally, consider allowing the program to run multiple profiles concurrently per run instead of running each independently. This could be arranged by passing all profiles into the main loop and iterating through them for related keys.

Errors found when checking Registry files.

We got some errors from the Redfish-Interop-Validator, and it seems that it doesn't extract the correct context of the resource.

Resource and error message:
/redfish/v1/schemas/registries/ResourceEvent.1.0.1.json ERROR - getResourceObject: Namespace appears nonexistent in SchemaXML: MessageRegistry.v1_0_0.MessageProperty http://redfish.dmtf.org/schemas/v1/MessageRegistryFile_v1.xml
/redfish/v1/schemas/registries/Base.1.4.0.json ERROR - getResourceObject: Namespace appears nonexistent in SchemaXML: MessageRegistry.v1_0_0.MessageProperty http://redfish.dmtf.org/schemas/v1/MessageRegistryFile_v1.xml
/redfish/v1/schemas/registries/TaskEvent.1.0.1.json ERROR - getResourceObject: Namespace appears nonexistent in SchemaXML: MessageRegistry.v1_0_0.MessageProperty http://redfish.dmtf.org/schemas/v1/MessageRegistryFile_v1.xml
  • We tried modifying the interop source code to set the context to None when the resource is a message registry; the tool then creates the corresponding context for it.
  • After that modification, the issues mentioned above disappear (and the context in the report is correct).

Could someone help to check it? Thanks.

Readme for this tool not clear on unique profile tests

The Readme clearly originated as a copy of the Redfish-Service-Validator readme. While there are a few edits to replace the name, the differences in operation from that tool are not made clear.

Does this tool process the special profile schema semantics? How is this accomplished in detail? Are new errors possible?

It seems the tool ignores the schema referenced by the service under test. Is this wise?

Is the intention to determine whether the system under test meets the minimum requirements of the profile?

I suggest the Readme needs more work to make the operation clearer, or should point to where this information is available.

Basic server profile

Has the Basic Server Profile been publicly published? If so, can we include it here in the profiles subdirectory? That would be convenient for building the test tree for the Redfish-Test-Framework.

Exception on handling Links[]

When the tool encounters a Links property that is incorrectly implemented as an array (Links [ ]) rather than an object (Links { }), it stops running instead of reporting the issue and continuing with the remaining tests.
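
For illustration (the link target here is hypothetical), a conforming payload implements Links as an object, while the problematic form implements it as an array:

"Links": {
    "Chassis": [
        { "@odata.id": "/redfish/v1/Chassis/1" }
    ]
}

Non-conforming form that triggers the exception:

"Links": [
    { "@odata.id": "/redfish/v1/Chassis/1" }
]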

How to test for "IfImplemented"?

Currently the usage of "IfImplemented" is to cover cases where the Redfish service doesn't need to implement a property or resource if the underlying functionality is not present in the hardware. For example, if a system is built without fans, then the "Fans" property in "Thermal" could be marked as "IfImplemented". But there isn't a good way to test for this (at least exclusively through Redfish). Maybe we need a hardware configuration file to describe the underlying hardware architecture of a given system to help validate a profile.

-Help command has confusing parameter definitions

Hello all,

On running "python3 RedFishInteropValidator.py -h" the help text is return as usual. I noticed that the definitions returned for the positional arg "profile" and the optional args "suffix" and "schema" the same. The definition reads "suffix of local schema files (for version differences)".

Can I get some clarity for these three args?

Cannot verify @ActionInfo in Actions property

Description

The current code for checking Action requirements in commonInterop.py:

if "Parameters" in entry:
innerDict = entry["Parameters"]
for k in innerDict:
item = innerDict[k]
annotation = decodeditem.get(str(k) + '@Redfish.AllowableValues', 'DNE')
# problem: if dne, skip
# assume mandatory
msg, success = validateRequirement(item.get('ReadRequirement', "Mandatory"), annotation)

only handles @Redfish.AllowableValues. It will treat the response below as a failure, even though the allowable values can be retrieved from the @Redfish.ActionInfo URI.

"Actions": {
    "#ComputerSystem.Reset": {
        "@Redfish.ActionInfo": "/redfish/v1/Systems/Self/ResetActionInfo",
        "target": "/redfish/v1/Systems/Self/Actions/ComputerSystem.Reset"
    }
},

Information

  • In Redfish Specification v1.8.0 (DSP0266), Section 9.5.13 mentions:

To specify the list of supported values for a parameter, the service may include the @Redfish.AllowableValues annotation.
...
The Resource may provide a separate @Redfish.ActionInfo Resource to describe the parameters and values that a particular instance or implementation supports.

  • In Redfish Schema Supplement v2019.3 (DSP0268), Section "Common objects" specifies that Actions property contains @Redfish.ActionInfo and target.

  • In Redfish Interoperability Profiles Specification v1.2.0 (DSP0272), Section 8.4.3 only defines that an Action requirement contains ReadRequirement, Parameters, and Purpose. Parameters is the requirement for any parameter available for this Action, but there is no statement specifying that it must be located in @Redfish.AllowableValues or the @Redfish.ActionInfo URI.

Tool should allow any profile version

The tool does not accept any profile with a version less than v1.0.0.

The tool should be able to read a profile with any version. In the OCP case, the baseline and server (which references the baseline) profiles may both be in draft (<1.0.0) prior to OCP approval.

If the desire is to restrict versions to >= 1.0.0 for "official" conformance runs, then I would suggest a flag in the config.ini file (e.g. OfficialTest = True).

Exceptions in execution of InteropValidator

There are a number of Tracebacks in the attached log.
I only looked at the first one, in which decodeditem is a string, not a list or a dict, and I can't see how "item" could have been created.
Anyway, could someone take a look at the exceptions?
python3 RedfishInteropValidator.py -c ./config.ini OCPBaselineHardwareManagement.v0_2_1.json
The target service is the Reddrum front end with an obmc backend.
Thanks
log5.txt

Referenceable array member checks should continue without odata.id

Embedded objects / arrays which are marked as referenceable members must contain an @odata.id to conform to the spec. This is properly flagged as an error in the tool.

However, the schema type of these arrays/objects can be assumed (and perhaps always correct?) from the parent, and so checking the properties within the array member should continue, not abort (as the tool does now).

ValidateMinVersion can not get pass for RedfishVersion in ServiceRoot

Hi,

Using the profile 'OCPBaselineHardwareManagement.v1_0_0.json' obtained from 'https://github.com/opencomputeproject/OCP-Profiles', which has a "Protocol" section with "MinVersion": "1.0".

And we have a RedfishVersion of "1.0.2" to be validated. After the run, the log shows the info below.

INFO - Testing minVersion
('1.0.2', '1.0')
INFO - pass False
ERROR - No Pass

It seems ValidateMinVersion() cannot get that comparison to pass. However, 1.0.2 is greater than 1.0 and should pass.

Thanks

// Vic
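
For reference, a minimal sketch of the comparison behavior the reporter expects, where "1.0.2" satisfies a MinVersion of "1.0" (illustrative only; this is not the validator's code and the function name is hypothetical):

def meets_min_version(service_version, min_version):
    # Compare dotted version strings numerically, e.g. "1.0.2" >= "1.0"
    def to_tuple(version):
        return tuple(int(part) for part in version.split('.'))
    svc, req = to_tuple(service_version), to_tuple(min_version)
    # Pad the shorter tuple with zeros so "1.0" compares as (1, 0, 0)
    width = max(len(svc), len(req))
    svc += (0,) * (width - len(svc))
    req += (0,) * (width - len(req))
    return svc >= req

print(meets_min_version("1.0.2", "1.0"))  # True: should pass per this report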

Unable to parse configuration: TypeError("'NoneType' object is not iterable",)

Hi,
I just tried the following :
python3 RedfishInteropValidator.py MyProfile -c config.ini

I get:
Unable to parse configuration: TypeError("'NoneType' object is not iterable",)

Config:

[Information]
Updated = May 8, 2017
Description = Redfish Interop Conformance Tool 0.91

[SystemInformation]
TargetIP = 10.10.10.10
SystemInfo = Test Config, place your own description of target system here
UserName = root
Password = my_pass
AuthType = None
Token = 123456SESSIONNauthcode
UseSSL = False
CertificateCheck = Off
CertificateBundle =

[Options]
MetadataFilePath = ./SchemaFiles
CacheMode = Off
CacheFilePath =
SchemaSuffix = _v1.xml
Timeout = 30
HttpProxy =
HttpsProxy =
LocalOnlyMode = True
ServiceMode = Off

[Validator]
PayloadMode = Default
PayloadFilePath =
LogPath = ./logs
WarnRecommended = False

What is wrong ?

Best Regards,
Francine

To me, it seems pull request #19 broke the tool

I am not seeing the output that was produced by earlier versions. For example, when the tool finds an EthernetInterface object, it used to check for the presence of the listed PropertyRequirements inside. This no longer happens.

Error on Collection if there are no members

The tool flags an error on required Collection elements if it does not find any members in the collection.

For example:

        "EventDestination": {
            "PropertyRequirements": {
...
            "Purpose": "The EventDestination resource describes the target of an Event subscription, including the types of Events subscribed and context to provide to the target in the Event payload.",
            "ReadRequirement": "Mandatory"
        },

In this case, the profile would like to verify that if an EventDestination exists in the Events Subscription collection, it contains the correct and mandatory properties as specified by the profile.

However, since most systems under test will NOT have any pre-existing subscriptions, the test will fail.

This is also true with a number of other cases (such as Jobs, Tasks, etc...)

Some ideas to enhance the tool to work around this issue:

  1. Output a more meaningful error message stating that the collection exists but is empty
  2. Add a policy in the tool's config file for whether to flag this as an error, a warning, or…
  3. Attempt to automatically create a collection member (if possible) and rerun the test. This could also be configurable in the tool's config file (as a policy)

fail.ServiceRoot.ReadRequirement error

By checking the profile 'OCPBaselineHardwareManagement' from https://github.com/opencomputeproject/OCP-Profiles, we got a 'fail.ServiceRoot.ReadRequirement' error.

After some digging in the code, we find the reason might be that objRes['ServiceRoot']['mark'] is not set to True.
Only the links following 'ServiceRoot' are checked and marked. However, validateURITree() does not try to mark 'ServiceRoot' after validating the top URI itself.

Hope it helps.

The RequiredProfiles property is not supported

A profile can extend another profile by using the RequiredProfiles property:

	"RequiredProfiles": {
		"OCPBaselineHardwareManagement": {
			"MinVersion": "1.0.0"
		}
	},

The validator should read each referenced profile and test for the requirements specified in the listed profile(s).

However, the validator appears to ignore this property.

Add tool version to output

Currently the tool does not have any version information in the output. It would be good to have a version string in the report so that a user knows what version of the tool was used to test a service.

Downloading Profiles from Repositories causes issues

The tool attempts to request a profile from a given repository, but fails for multiple reasons:

  • HTTP responses return file listings in bytes, which require UTF-8 decoding; the tool overlooks this.

  • The profile name is stripped by one character from the file listing, which causes a 404 error.

The check for @odata.context should be optional

The use of @odata.context was changed to be optional in the Redfish specification in the following issue:
DMTF/Redfish#2722
However, if we remove @odata.context from the resources, the tool still shows errors like the following:
ERROR - /redfish/v1: Json does not contain @odata.context

Validator fail to check property with "type" : "null"

Property "UefiTargetBootSourceOverride" definition in schema is:

"UefiTargetBootSourceOverride": {
    "description": "This property is the UEFI Device Path of the device to boot from when BootSourceOverrideTarget is UefiTarget.",
    "longDescription": "The value of this property shall be the UEFI device path of the override boot target. The valid values for this property are specified through the Redfish.AllowableValues annotation. BootSourceOverrideEnabled = Continuous is not supported for UEFI Boot Source Override as this setting is defined in UEFI as a one time boot only.",
    "readonly": false,
    "type": [
        "string",
        "null"
    ]
}

The actual value of "UefiTargetBootSourceOverride" in the Redfish service is:

"Boot": {
        "BootSourceOverrideEnabled": "Disabled",
        "BootSourceOverrideTarget": "None",
        "UefiTargetBootSourceOverride"": null,
        "BootSourceOverrideMode": "Legacy"
}

The condition option in the profile to validate the property "UefiTargetBootSourceOverride" is:

"UefiTargetBootSourceOverride": {
    "ReadRequirement": "Recommended",
    "ConditionalRequirements": [{
        "Purpose": "If UEFI mode is selected, must allow for UEFI target.",
        "CompareProperty": "BootSourceOverrideMode",
        "CompareType": "Equal",
        "CompareValues": ["UEFI"],
        "ReadRequirement": "Mandatory",
        "Comparison": "AnyOf",
        "Values": ["None", "RDOC1", "RDOC2"]
    }]              
}

Whatever value of "Values" I set in the condition option to validate the property "UefiTargetBootSourceOverride" (["Test1", "Test2"], ["null", "Test1", "Test2"], [null, "Test1", "Test2"] or ["None", "Test1", "Test2"]), the Validator always returns a failure with the error message:

TypeError: argument of type 'NoneType' is not iterable
ERROR - Could not finish validation check on this payload

How do I configure the condition option to handle a property whose "type" includes "null"?

How to disable OemCheck in Redfish-Interop-Validator?

I got the following error when running the Redfish-Interop-Validator:

ERROR - The following namespaces are referenced by the service, but are not included in $metadata:
OemMessageRegistry.v1_0_0

OemMessageRegistry.v1_0_0 is an OEM schema.
When I run the Redfish-Interop-Validator with "OemCheck = False" already added to config.ini, I still get the above error message in the resulting HTML file.
Is there any way to avoid the OEM schema check?
