
outdated third party tools -> official testsuites #332

Closed
kvlahromei opened this issue Apr 17, 2015 · 6 comments

Comments

@kvlahromei

Hi, even though I'm still pretty new to Swagger, I've already noticed a problem: once you look at the ecosystem (editor, generators, mocks, ...), it's easy to get lost. I tried to get Swagger working with Python, but a lot of the libraries seem to be broken (with current examples) or have significant limitations (features, supported Swagger syntax, ...).

So, would it be possible to offer a test suite to third-party tool authors so they can verify that their tools fulfill the Swagger specification (e.g. a set of simple, medium, and complex Swagger examples that must be valid and covered by their own unit tests)? It would be really useful and would save time for developers who need to use these bindings / validators / ... :)
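As a rough illustration of what such a suite could look like for a consumer library, here is a minimal sketch: a parametrized pytest run over a shared directory of known-valid definitions. Note that `my_swagger_lib` and the `testsuite/` fixture layout are hypothetical placeholders, not part of any official suite.

```python
# Minimal sketch of a consumer-side conformance check (all names hypothetical):
# every document under testsuite/ is assumed to be a valid definition, so the
# library under test must load each one without raising an error.
from pathlib import Path

import pytest
import yaml

import my_swagger_lib  # hypothetical library under test

FIXTURES = sorted(Path("testsuite").glob("**/*.yaml"))  # e.g. simple/, medium/, complex/


@pytest.mark.parametrize("fixture", FIXTURES, ids=str)
def test_fixture_is_accepted(fixture):
    document = yaml.safe_load(fixture.read_text())
    spec = my_swagger_lib.load_spec(document)  # hypothetical entry point
    assert spec is not None
```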

@webron
Member

webron commented Apr 17, 2015

We've thought about it, but it's actually quite difficult. Since most tools are producers, there's no easy way to test them. Consumers, on the other hand, are fairly simple to test.

If you have any additional thoughts about it, we'd be happy to hear.

@RobDolinMS
Contributor

@kvlahromei While there are lots of tools, I don't believe the OpenAPI Initiative has an official test suite.

Would you mind if we either close this issue or mark it for Milestone=v3.Next?

@MikeRalphson
Member

@RobDolinMS if an official test-suite is desired, I'd be happy to help organise one based on (a subset of?) APIs.guru. We plan to create a v3.0.0 branch (initially static) containing conversions of the v2.0 definitions at the point the v3.0.0 specification is released, before we decide a date for converting the repository.

@darrelmiller
Member

@MikeRalphson It would be awesome to get a test suite going. I can envision how to get a bunch of input values, but I'm still not exactly clear how we verify that the tool "did the right thing". Considering some tools generate docs, some create client libraries, others just route... Is there any way to verify correctness? Or is that out of scope? Should the goal simply be to provide a comprehensive set of input documents and let the implementer decide what is the right thing for their scenario?

Could message/http representations be used as controlled input and output? e.g. given this request as a message/http representation and this OpenAPI definition, is THIS message/http a possible valid response?
Or would you have to describe all the potential responses to the input message?

Forgive the rambling, I'm just thinking out loud here.
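One hypothetical shape for such a test case, just to make the idea concrete: bundle a definition, a request, and a candidate response, and ask a checker whether that response is even declarable. The `Case` structure and `check_response` below are made up for illustration and only look at status codes, not headers or bodies.

```python
# Sketch of "given this request and this OpenAPI definition, is THIS a possible
# valid response?" -- deliberately shallow; everything here is hypothetical.
from dataclasses import dataclass


@dataclass
class Case:
    definition: dict  # parsed OpenAPI document
    method: str       # e.g. "get"
    path: str         # path as written in the definition, e.g. "/pets/{id}"
    status: str       # candidate response status code, e.g. "200"


def check_response(case: Case) -> bool:
    """True if the candidate status code is one the definition allows for this operation."""
    operation = case.definition["paths"][case.path][case.method]
    declared = operation.get("responses", {})
    return case.status in declared or "default" in declared


definition = {
    "paths": {
        "/pets/{id}": {
            "get": {
                "responses": {
                    "200": {"description": "ok"},
                    "default": {"description": "error"},
                }
            }
        }
    }
}

assert check_response(Case(definition, "get", "/pets/{id}", "200"))
assert check_response(Case(definition, "get", "/pets/{id}", "404"))  # allowed via "default"
```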

@MikeRalphson
Member

@darrelmiller

Is there any way to verify correctness?

That I think is taking a big first bite. Perhaps a first level of compliance for a consumer of the test-suite would simply be to self-certify "does not terminate abnormally on this input", followed by something like "no errors issued" and "no warnings issued".

Measuring completeness of implementation of the specification is probably a pipe-dream (cat(1) would be a certified consumer of the specification given the criteria above) unless you're talking about lossless two-way conversion to/from another format.
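Those levels could even be checked mechanically. Here is a sketch, assuming a hypothetical command-line tool and assuming it reports problems on stdout/stderr using the words "error" and "warning" (both assumptions, not a spec):

```python
# Tiered self-certification sketch: 0 = crashed, 1 = exited cleanly,
# 2 = no errors issued, 3 = no warnings issued. Tool name and output
# conventions are hypothetical.
import subprocess


def compliance_level(tool_cmd, fixture):
    result = subprocess.run(tool_cmd + [fixture], capture_output=True, text=True)
    if result.returncode != 0:
        return 0  # terminates abnormally on this input
    output = (result.stdout + result.stderr).lower()
    if "error" in output:
        return 1
    if "warning" in output:
        return 2
    return 3


print(compliance_level(["my-openapi-tool", "validate"], "testsuite/simple/petstore.yaml"))
```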

@handrews
Member

There are now several efforts in this general direction, so I'm going to close this issue in favor of more recent and more specific items: discussions and issues regarding the testable form of the specification should be filed with the OASComply project, and discussions regarding the certification project belong with Outreach. There are better (and active!) forums for this work elsewhere.
