
json-schema-test-suite's Issues

Possible error in properties tests

Section 5.4.4.4 of draft 4 of the spec and the example that follows it make it seem as though, if properties matches a key, that key is removed from the pool of keys that patternProperties checks.

However, this test has a correct implementation of patternProperties throwing an error against the key "foo", even though "foo" is one of properties, and validates successfully against the corresponding properties schema.

Could someone more experienced than me with JSON Schema take a look at this and see if I've made a mistake? Thanks!

Many of the tests currently utilize invalid JSON as test data

An issue that has arisen in the Ruby json-schema gem is that many of the test cases utilize invalid JSON as test data; that is, bare strings, integers, and booleans exist without an encapsulating object / array. While the majority of our validations don't check for valid JSON, some do, and thus this common test suite cannot be fully relied upon.

It is my assertion that a common test suite providing sample JSON data for validation should ensure that said data is actually valid JSON to begin with. There really isn't much point in validating invalid data.

Vendor external dependencies for tests

Right now the tests reference files on external servers, such as here (which references the schema for JSON schemas). Note that this isn't in refRemote.json where it might be more expected, instead it's in plain ref.json.

This works OK, but isn't great for users with choppy internet connections because their tests become non-deterministic. Might it make more sense to vendor the files that are currently being referenced remotely, adding them to the ./remotes/ directory so they can be served up at localhost?
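
If those files were vendored into ./remotes/, serving them locally is trivial. For illustration, a minimal static-server sketch in Python (assuming Python 3.7+ and a ./remotes/ directory; port 1234 is used only because that's what the existing localhost references in the suite point at):

    import functools
    import http.server
    import socketserver

    # Serve the vendored ./remotes directory on localhost:1234 so reference
    # resolution never has to leave the machine.
    handler = functools.partial(http.server.SimpleHTTPRequestHandler, directory="remotes")
    with socketserver.TCPServer(("localhost", 1234), handler) as httpd:
        httpd.serve_forever()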

Should invalid json data be included in the test suite?

I have encountered a problem while implementing a sort of interface to the test suite.

The module that I am testing doesn't check that the data is valid json. It assumes that you would only give it valid json, and would not give it invalid json.

I discussed this on perl irc in #perl-help, pasting http://paste.scsys.co.uk/494405?tidy=on&hl=on&submit=Format+it!

I notice that a PHP equivalent has a function which checks the validity of the data first. I understand why this is a good idea, but I can't see where the specification says how, or even whether, JSON Schema has an expected behaviour for invalid JSON data.

Test causing the error: https://github.com/json-schema/JSON-Schema-Test-Suite/blob/develop/tests/draft3/dependencies.json#L30

Resulting in 'false' expected, at character offset 0 (before "foo") at /[user]/perl5/perlbrew/perls/perl-5.14.2/lib/site_perl/5.14.2/JSON.pm line 171.

This test isn't present in the master branch, but only in develop.

Add test for multipleOf to see if a library has problem with floating point rounding errors

Hi,

please see: zaggino/z-schema#69

Using multipleOf can lead to wrong validation errors in some libraries, such as those written in JavaScript.

For example the following schema:

{ 
    "type": "object",
    "properties": {
      "decimal": {
        "type": "number",
        "multipleOf": 0.01
      }   
    }   
}

Validates correctly against the following documents:

{"decimal": 100}
{"decimal": 100.10}

But not against:

{"decimal": 136.67}

This is because of rounding errors of floats in JavaScript (see http://floating-point-gui.de/):

100 / 0.01
=> 10000

100.01 / 0.01
=> 10001

// but:
136.67 / 0.01
=> 13666.999999999998

Please add a test to validate against this error case.
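
For what it's worth, the same rounding appears in any language that uses IEEE 754 doubles, so a test along these lines would catch naive implementations regardless of language. A quick sketch in Python of both the failing naive check and one possible exact workaround (via fractions.Fraction; an illustration only, not a prescription for how validators must implement multipleOf):

    from fractions import Fraction

    value, multiple = 136.67, 0.01

    # Naive check: 136.67 / 0.01 is not exactly 13667 in binary floating point.
    print(value / multiple)                 # 13666.999999999998
    print((value / multiple).is_integer())  # False -> spurious validation error

    # Exact check: go through the decimal string representation before dividing.
    ratio = Fraction(repr(value)) / Fraction(repr(multiple))
    print(ratio)                   # 13667
    print(ratio.denominator == 1)  # True -> 136.67 really is a multiple of 0.01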

Draft 4: invalid definition schema

Hello, I have just seen that this test has a schema with a malformed type and expects validation to fail. Is there any clarification in the draft about what it really should do? In my validator, valico, I prefer to report an explicit error during the schema compile phase.

I will be glad if you can clarify this. Thanks!

Draft 04 features

I realize that draft 04 isn't finalized yet, but it makes some substantial changes and those of us trying to be ahead of the curve could definitely use tests for them. I'm intending to do a lot of this work myself; in fact, I just wanted to create an issue for it to let people know and to provide a centralized place to coordinate.

Is anyone aware of an actual list of the changes that have gone into 04 already? Or is the best approach to read both and compare? It would be unfortunate if so... I'll ask this in the Google group as well.

Wrong "nested refs" test, points to ignored member

The file tests/draft4/ref.json contains this schema at line 122:

        "schema": {
            "definitions": {
                "a": {"type": "integer"},
                "b": {"$ref": "#/definitions/a"},
                "c": {"$ref": "#/definitions/b"}
            },
            "$ref": "#/definitions/c"
        },

This example is wrong because, according to the JSON Reference RFC:

Any members other than "$ref" in a JSON Reference object SHALL be ignored.

This means that the member "definitions" SHALL be ignored (since it is a member beside the $ref), and so the $ref points to somewhere that should be ignored.

No "type" information in test schemas

Hi,

Playing around with this test suite I stumbled on sub-schemas without type information in a test case.
Then I also found top-level schemas without type:

e.g.

{
    "description": "additionalItems as schema",
    "schema": {
        "items": [{}],
        "additionalItems": {"type": "integer"}
    },
    "tests": [
        {
            "description": "additional items match schema",
            "data": [ null, 2, 3, 4 ],
            "valid": true
        },
        {
            "description": "additional items do not match schema",
            "data": [ null, 2, 3, "foo" ],
            "valid": false
        }
    ]
}

Are schemas without the type keyword valid?
How should one handle the suite then?

Regards,

mlarue

how to create multiple json objects in jsp?

I wrote JSP functions to list all file names and all sub-folder names in a directory, and I want to save the result in a JSON object. In the first step, I recursively read all the file names and all the sub-folder names, and I saved the results in a string:
res_data ="my_report3.rptdesign&my_report4.rptdesign&28.12.2015$my_report.rptdesign&my_report1.rptdesign&my_report2.rptdesign&30.12.2015$customerst.rptdesign&TopNPercent.rptdesign&22.12.2015$by_sup_ML.rptdesign&chartcwong.rptdesign&HTML5 Chart.rptdesign&23.12.2015$main_page.rptdesign&my_report.rptdesign&my_report18.rptdesign&my_report19.rptdesign&my_report20.rptdesign&my_report21.rptdesign&my_report22.rptdesign&my_report23.rptdesign&my_report24.rptdesign&postgreSQL.rptdesign&test.rptdesign&TopNPercent.rptdesign&PathFind_Report"

In the second step, I wrote the following code to write the result data into a JSON object:

JSONObject json_obj = new JSONObject();
JSONArray obj_array = new JSONArray();
String[] first_res_d = res_data.split("[$]");
for (int i = 0; i < first_res_d.length; i++)
{
    String[] second_res_d = first_res_d[i].split("[&]");
    int id_folder = second_res_d.length - 1;
    for (int j = 0; j < id_folder; j++)
    {
        json_obj.put("parent", second_res_d[id_folder]);
        json_obj.put("child", second_res_d[j]);
        obj_array.add(json_obj);
    }
}
out.println(obj_array);
But I only got the same JSON object repeated. The printed result is as follows:
[{"parent":"PathFind_Report","child":"TopNPercent.rptdesign"},{"parent":"PathFind_Report","child":"TopNPercent.rptdesign"},{"parent":"PathFind_Report","child":"TopNPercent.rptdesign"},{"parent":"PathFind_Report","child":"TopNPercent.rptdesign"},{"parent":"PathFind_Report","child":"TopNPercent.rptdesign"},{"parent":"PathFind_Report","child":"TopNPercent.rptdesign"},{"parent":"PathFind_Report","child":"TopNPercent.rptdesign"},{"parent":"PathFind_Report","child":"TopNPercent.rptdesign"},{"parent":"PathFind_Report","child":"TopNPercent.rptdesign"},{"parent":"PathFind_Report","child":"TopNPercent.rptdesign"},{"parent":"PathFind_Report","child":"TopNPercent.rptdesign"},{"parent":"PathFind_Report","child":"TopNPercent.rptdesign"},{"parent":"PathFind_Report","child":"TopNPercent.rptdesign"},{"parent":"PathFind_Report","child":"TopNPercent.rptdesign"},{"parent":"PathFind_Report","child":"TopNPercent.rptdesign"},{"parent":"PathFind_Report","child":"TopNPercent.rptdesign"},{"parent":"PathFind_Report","child":"TopNPercent.rptdesign"},{"parent":"PathFind_Report","child":"TopNPercent.rptdesign"},{"parent":"PathFind_Report","child":"TopNPercent.rptdesign"},{"parent":"PathFind_Report","child":"TopNPercent.rptdesign"},{"parent":"PathFind_Report","child":"TopNPercent.rptdesign"},{"parent":"PathFind_Report","child":"TopNPercent.rptdesign"}]

Could you give me some suggestions to correct my code, please? Thanks a lot in advance.

require and enum

I tripped over an issue when using the tdegrunt/jsonschema validator (implemented in JS) to validate enums. It disregards the value of the required property for enums and considers an enum optional only if 'null' is stated in the list of allowed values for the enum. It uses JSON-Schema-Test-Suite for its self-tests and all tests are passing. IMO the test suite should be more explicit about this case, so that it's clear how validators should treat enums which are not required. A short example follows:

schema:
{
    "type": "string",
    "enum": ["on", "off"],
    "required": false
}

data:
{ }

When using e.g. the Amanda schema validator, it will return success. But the jsonschema validator will fail in this case, because 'null' isn't stated among the values in 'enum'.

Invalid format specified in tests/draft3/optional/jsregex.json

The schema in the optional jsregex.json file specifies an invalid value for format.

"schema": { "format": "pattern" }

The "pattern" format is not defined in draft 3. It should be "regex" instead.

"schema": { "format": "regex" }

Are validators trying to be compliant with draft 3 supposed to implement "pattern"?

Suggestion "String of 1 is not a number" test

A new perl module JSON::Schema::AsType has recently been released.
It uses this test suite! (Hooray!)
I wonder if we should also include a test to check that a string of "1" is not a number?
Currently the code treats "1" as a number, and I figure this is not strictly correct.
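
For illustration, this is how such a test would behave against a validator that keeps JSON types distinct; a small sketch using the Python jsonschema package as just one example implementation:

    from jsonschema import Draft4Validator

    number_schema = {"type": "number"}

    print(Draft4Validator(number_schema).is_valid(1))    # True: a JSON number
    print(Draft4Validator(number_schema).is_valid("1"))  # False: a JSON string, even if it looks numeric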

Test for explicitly defined 'properties' when using 'patternProperties'

There is currently no test for explicitly defined 'properties', when also using 'patternProperties'.

The example in 5.4.4.5 of the JSON Schema Validation document seems to suggest that you can have something that matches a 'properties' entry, does not match any 'patternProperties' regex, and still be valid. The base schema example here: http://json-schema.org/example2.html states the following:

"
  • we have a properties keyword with only a / entry;
  • we use patternProperties to match other property names via a regular expression (note that it does not match /);
  • as additionalProperties is false, it constrains object properties to be either / or to match the regular expression
"

This seems to suggest that you can have '/' as a valid property, even though it does not match the regular expression within the 'patternProperties'.

json-schema.org also references the 'Space Telescope Science Institute' JSON Schema writing guide, which under 'patternProperties' states "Any properties explicitly defined in the properties keyword are also accepted".

There is currently no test for this, so many validators that use this test suite will currently handle this schema incorrectly:

{
    "$schema": "http://json-schema.org/draft-04/schema#",
    "type": "object",
    "properties": {
        "test": { "type": "string" }
    },
    "patternProperties": {
        "^.*$": {
            "type": "boolean"
        }
    },
    "additionalProperties": false,
    "uniqueProperties": true
}

Based on the above, because 'test' is explicitly defined I would expect:

{ "test" : "example" }

To be valid. However, there is currently no test for this, resulting in validators reporting this as invalid and expecting a bool.
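
One way to see how a given implementation treats this case is to run the schema through it and inspect the reported errors. A sketch with the Python jsonschema package (the schema is the one above minus the non-standard uniqueProperties keyword; no expected result is asserted here, since the expected behaviour is exactly what this issue asks to be pinned down):

    from jsonschema import Draft4Validator

    schema = {
        "$schema": "http://json-schema.org/draft-04/schema#",
        "type": "object",
        "properties": {"test": {"type": "string"}},
        "patternProperties": {"^.*$": {"type": "boolean"}},
        "additionalProperties": False,
    }

    validator = Draft4Validator(schema)
    for error in validator.iter_errors({"test": "example"}):
        # Each error records which keyword produced it and where.
        print(list(error.absolute_schema_path), error.message)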

join us on github.com/json-schema

Would you consider transferring this project and joining us on json-schema where we could put in a combined effort?

This will give you access to the json-schema.org site amongst other things.

Not everyone gets invited but this seems like the kind of project that would better serve the community if hosted from the official project and we would love to have you.

You will be awarded membership and still remain the maintainer and have the same privileges you currently enjoy with the test suite hosted here.

What do you say?

publish to npm

Would be pretty cool if you published this to npm so you could require it as a devDependency in node.
I can send a PR if you are interested in this.

More positive examples needed for IPv6 tests

IPv6 addresses are complex, therefore more positive examples are needed to verify validation of IPv6 addresses. This actually is a general concern with the test suite: features are often tested by checking for failed validation, which leaves huge room for false positives. Those negative tests often pass for the trivial reason that the tested validator doesn't even remotely understand the tested construct.

Anyway, here is a list of valid IPv6 addresses that should be tested:

  • RFC2373, section 2.2.1: FEDC:BA98:7654:3210:FEDC:BA98:7654:3210, 1080:0:0:0:8:800:200C:417A
  • RFC2373, section 2.2.2: 1080::8:800:200C:417A, FF01::101, ::1, ::
  • RFC2373, section 2.2.3: 0:0:0:0:0:0:13.1.68.3, 0:0:0:0:0:FFFF:129.144.52.38, ::13.1.68.3, ::FFFF:129.144.52.38
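
As a quick cross-check, all of the forms above parse with Python's standard ipaddress module (just one reference implementation, not the specification itself), so they make uncontroversial positive test data:

    import ipaddress

    candidates = [
        "FEDC:BA98:7654:3210:FEDC:BA98:7654:3210",
        "1080:0:0:0:8:800:200C:417A",
        "1080::8:800:200C:417A",
        "FF01::101",
        "::1",
        "::",
        "0:0:0:0:0:0:13.1.68.3",
        "0:0:0:0:0:FFFF:129.144.52.38",
        "::13.1.68.3",
        "::FFFF:129.144.52.38",
    ]

    for address in candidates:
        # IPv6Address raises ValueError for anything it cannot parse.
        ipaddress.IPv6Address(address)
        print("valid:", address)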

Paper suggests that the test suite is lacking

http://www2016.net/proceedings/proceedings/p263.pdf

Specifically, section 2:

"Table 1 shows the outcome of this process. It is important to mention that all validators successfully validate the JSON Schema test-suite [4]. As we can see, no two validators behave the same on all inputs, which is clearly not the desired behaviour. This illustrates the need for a formal definition of JSON Schema which will either disallow ambiguous schemas, or formally specify how these should be evaluated."

I'm not sure on their methods, and I'm not especially up for downloading random code from academic URLs.
I've contacted them regarding something else. If I get a reply, I'll push them to publish the code in a repo or as a gist, and see if they deliver. It's a shame they didn't feel the need to put it on github.

Utility program to "flatten" the test suite?

It would be nice to propose one: as the tests are, for now, separated into a whole bunch of files, running the test suite means walking the tree and opening/reading/closing each file.

So, why not a utility program which would flatten the whole suite into a single file? It would output an array with each test being an object having members "description", "test", "schema", "data" and "valid".

What do you think?
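
For illustration, a minimal sketch of such a flattener in Python, assuming the current tests/<draft>/ layout and emitting the array format proposed above (a sketch, not the suite's official tooling; the file name in the usage comment is hypothetical):

    import json
    import sys
    from pathlib import Path

    def flatten(draft_dir):
        """Collect every test in one draft directory into a single flat list."""
        flat = []
        for path in sorted(Path(draft_dir).glob("**/*.json")):
            for group in json.loads(path.read_text()):
                for test in group["tests"]:
                    flat.append({
                        "description": group["description"],
                        "test": test["description"],
                        "schema": group["schema"],
                        "data": test["data"],
                        "valid": test["valid"],
                    })
        return flat

    if __name__ == "__main__":
        # e.g. python flatten.py tests/draft4 > draft4-flat.json
        json.dump(flatten(sys.argv[1]), sys.stdout, indent=2)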

Default values

At the moment the test suite doesn't have any tests around default values. I've added a couple of tests (for draft4 and draft3) based on my understanding of how defaults should work.

Test for resolving references when there're other properties on schema

I'm missing a test here: https://github.com/json-schema/JSON-Schema-Test-Suite/blob/develop/tests/draft4/ref.json

var schemaA = {
    "id": "long-string",
    "type": "string",
    "maxLength": 4096
}
var schemaB = {
    "id": "person-object",
    "type": "object",
    "properties": {
        "name": {
            "$ref": "long-string",
            "maxLength": 10
        }
    }
}

Does the result need to respect "maxLength": 10 or not?
Does it need to respect both maxLengths? (Let's say it was not maxLength but format, which would make more sense.) Should the validator copy but not override the properties that sit alongside the $ref, or does the data have to match all of those properties as well as all properties of the referenced schema?

Test that $refs can resolve to different documents depending on the situation

Say schema A changes the resolution scope using id. Schema A has a child schema B with a $ref. When schema A is used to validate a document it runs schema B, whose $ref is resolved within the scope of schema A's id.

If schema B is referenced directly by another part of the document though, schema B's $ref should not be resolved within the context of schema A's id.

Automate sanity checking the suite

Write a small script to ensure:

  • all files are valid json
  • all test case schemas are valid
  • all descriptions are unique within each draft

Leverage travis.
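
A rough sketch of what that script could look like in Python, using the jsonschema package for the metaschema check (the directory layout and the exact notion of "unique description" are assumptions here, not requirements):

    import json
    from collections import defaultdict
    from pathlib import Path

    from jsonschema import Draft3Validator, Draft4Validator

    VALIDATORS = {"draft3": Draft3Validator, "draft4": Draft4Validator}

    for draft, validator in VALIDATORS.items():
        seen = defaultdict(list)
        for path in sorted(Path("tests", draft).glob("**/*.json")):
            groups = json.loads(path.read_text())        # fails loudly on invalid JSON
            for group in groups:
                validator.check_schema(group["schema"])  # fails loudly on an invalid test schema
                seen[group["description"]].append(path.name)
        duplicates = {d: files for d, files in seen.items() if len(files) > 1}
        assert not duplicates, f"duplicate case descriptions in {draft}: {duplicates}"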

Add `id` and `fullDescription` keys to tests

An immutable (never changing) ID field might be useful for test suite users.

A fullDescription would possibly be useful for elaborating on a test, since we restrict description to fewer than 60 characters so that it can be used to name test methods.

1.0 listed as not-integer

e39d537 added a test saying that, apparently, 1.0 should not validate as an integer.

This is a problem most notably for JavaScript/ECMAScript (I wonder where JSON comes from), where every number is an IEEE 754 64-bit floating point value; the test fails against my library even though it is listed as required and ES itself has no other numeric representation.

Differentiation between floating point and non-floating-point forms is listed as an optional (actually, suggested) part of v4, and is not mentioned at all in v3.

This test should, accordingly, be moved to the "optional tests" section.

JSON path for $ref and similar

I don't believe that being able to resolve a URL fragment into a sub-property of a schema or instance is a required part of JSON Schema. Identifying a specific property/sub-property inside a JSON document at a particular URL using a fragment identifier is a completely different specification.

Specifically, https://github.com/Julian/JSON-Schema-Test-Suite/blob/b7858cc3584ce8a8886a7edde3866f9776505b6f/tests/draft3/ref.json#L56-L65 isn't a valid schema; it should be throwing an error, yet the test claims that it should be valid. Also, # does not resolve to the URI of the current schema; it resolves to the URI of the current schema with an additional # character (which are not necessarily the same resources).

The JSON Schema spec does require that relative URLs be resolved appropriately: for example, inside a document with an id of http://example.com/a/b/schema, ../theSchema is a valid URL, and it resolves to http://example.com/a/theSchema.

Consider creating a node branch or providing links to Node.js fork

I've created a fork that provides support for Node.js validator development. The fork provides a package.json file and unit tests, and exposes the tests as a package that can be require()d for validator tests, rather than added as a git submodule.

The fork is available here and it has been published to npm here.

The fork is current for the latest develop branch.

I'm opening this issue for possible consideration in creating a node branch here in the main repo for accepting pull requests for node support updates. While the branch will be kept current with develop, it would never be merged to develop (or master) since it contains node-specific artifacts.

Alternatively, perhaps it might make sense to provide link information to the fork in this repo's README for the benefit of node developers.

Arrays of objects are not invalid

Working with the following schema:

{
  "$schema": "http://json-schema.org/draft-04/schema#",
  "id": "http://localhost:1234/test",
  "title": "test",
  "description": "Test",
  "type": "object",
  "properties": {
    "location": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "name": {"type": "string"}
        }
      }
    }
  }
}

And validating it against http://json-schema.org/draft-04/schema# turns out to be invalid according to https://github.com/Prestaul/skeemas and valid according to https://github.com/fge/json-schema-validator.

In my opinion this should be valid, let me know if I'm correct.

Part of 'additionalProperties' is not covered until 'definitions'

The tests for additionalProperties don't cover additionalProperties existing on its own without properties.

The test suite does cover additionalProperties existing on its own, but only within the tests for definitions[1] once you start implementing $refs. This can make implementing $refs confusing, because if you've implemented additionalProperties existing on its own incorrectly, then your $ref code will fail whether or not it's correct.

It would be good to add tests for additionalProperties existing on its own, without properties, to the additionalProperties test file.

[1] It's not immediately clear from this test that it relies on additionalProperties, but that's the keyword in the remote schema that actually causes the invalid data to fail validation.
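
The missing case is straightforward to pin down. For example, with the Python jsonschema package (used here only as one sample implementation), a schema consisting of nothing but additionalProperties already constrains every member, which is the behaviour the new tests would cover:

    from jsonschema import Draft4Validator

    # additionalProperties on its own: with no "properties" or "patternProperties",
    # every member of the instance counts as an "additional" property.
    schema = {"additionalProperties": {"type": "boolean"}}
    validator = Draft4Validator(schema)

    print(validator.is_valid({"foo": True}))   # True
    print(validator.is_valid({"foo": "bar"}))  # False: "foo" must match the additionalProperties schema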

Roadmap

@Julian have you given the road ahead much thought?

Some food for thought:

  • sanity checking intact - as per #14
  • implementation test script - the actual tests run against the specific supported library, complete with installation of the latest release and implementation unit tests to ensure everything is satisfied.
  • CI automation - hook up to travis to automate sanity checks and implementation scripts

Let's elaborate...

Improve regex based tests

Unfortunately, they all lead the user to believe that regexes are anchored. For instance, f.*o is used in patternProperties and corresponding member names are "fo", "fooo" or equivalent.

But in the context of JSON Schema, f.*o matches defoliation. Tests should be improved in that regard.
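
The distinction is the usual search-versus-fullmatch one. A small Python illustration of why the current member names cannot catch a validator that wrongly anchors its patterns (JSON Schema patterns are meant to use unanchored, search-style semantics):

    import re

    pattern = re.compile("f.*o")

    # Unanchored (search) semantics, which is what JSON Schema patterns should use:
    print(bool(pattern.search("fooo")))         # True
    print(bool(pattern.search("defoliation")))  # True: an "f...o" occurs inside the string

    # An implementation that anchors the pattern would still accept "fooo",
    # so only a name like "defoliation" distinguishes the two behaviours:
    print(bool(pattern.fullmatch("defoliation")))  # False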

More comprehensive $ref tests

The $ref tests in the suite currently do not do any testing where the "id" keyword changes the resolution scope. It would be very helpful for implementors to have such tests available, as proper evaluation of the $ref keyword can be quite confusing. The issue with such tests (correct me if I'm wrong) is that, for implementations that do not support inline dereferencing, we will need some sort of server serving the test schemas for this to be fully tested.

I am opening this ticket to get input on the best way these kinds of tests can be included in the suite. @Julian and I were considering a simple webserver included in bin/jsonschema_suite, but whether this is the best possibility, and how tests that use it should be written, remain up in the air. All input is appreciated.

(initial discussion started here python-jsonschema/jsonschema#66)

'id' resolution changes aren't tested in local refs

I don't believe this case is tested (a resolution scope change during the ref resolution):

    {
        "description": "nested refs",
        "schema": {
            "definitions": {
                "id" : "some://where.else",
                "a": {"$ref": "#/definitions/integerTest"},
                "integerTest" : { "type" : "string" }
            },
            "$ref": "#/definitions/a"
        },
        "tests": [
            {
                "description": "ref valid",
                "data": 5,
                "valid": true
            },
            {
                "description": "invalid",
                "data": "a",
                "valid": false
            }
        ]
    }

I think that the following should never be hit (and instead try to resolve some://where.else#/definitions/integerTest):

"integerTest" : { "type" : "string" }

Would I be right in thinking that? And am I right in thinking that this is not tested?

Add composer.json file and create a new tag

If this project has a composer.json file then other GitHub projects that use Composer can include this project as a dependency.

{
    "repositories": [
        {
            "type": "vcs",
            "url": "https://github.com/json-schema/JSON-Schema-Test-Suite"
        }
    ],
    "require": {
        "JSON-Schema-Test-Suite/JSON-Schema-Test-Suite": "1.0.0"
    }
}

Please also create a new tag that includes the latest updates to the project since the most recent tag, 1.0.0, is over five months old. Thanks!
