feat: support JSON Schema as input for a Type #1159

Draft: TizzySaurus wants to merge 74 commits into arktypeio:main from TizzySaurus:json-schema-to-arktype (base: main)
Commits (74):
- 2c2bdf1 improve wording of type example (ssalbdivad)
- a3fd9cd Merge branch 'main' into v2docs-6 (ssalbdivad)
- 01583b0 runtime error messages increase font size (ssalbdivad)
- 9e3e772 rename RegexNode=>PatternNode (ssalbdivad)
- 85e5558 add changeset (ssalbdivad)
- 886e643 bump deps (ssalbdivad)
- 00baf27 fix morph with alias child (ssalbdivad)
- 2a94bcd add changelogs (ssalbdivad)
- 98533ad cyclic scope (ssalbdivad)
- ae9342e allow overriding aliases (ssalbdivad)
- a2ed10f add constraints to validation scope (ssalbdivad)
- 170bc6e remove constraints type parameter constraints (ssalbdivad)
- 9d0bf28 remove broken cyclic rereferences test (ssalbdivad)
- 8628a77 cleanup anonymous constraint display (ssalbdivad)
- 77562e3 Merge branch 'main' into json-schema-to-arktype (TizzySaurus)
- b7c38a0 Merge branch 'main' into json-schema-to-arktype (TizzySaurus)
- 0e5b2da Merge branch 'main' into json-schema-to-arktype (TizzySaurus)
- cf77faa Merge branch 'main' into json-schema-to-arktype (TizzySaurus)
- 4c4e61f Export from ark/schema/shared/traversal.ts so can import TraversalCon… (TizzySaurus)
- 934cd36 Add parsing & types for array JSON Schema (w/ excessively deep type e… (TizzySaurus)
- d3bc90d Add parsing & types for number/integer JSON Schema (TizzySaurus)
- da95a37 Add parsing & types for string JSON Schema (TizzySaurus)
- 4d230c1 Add ArkType Scope representing JSON Schema schemas (TizzySaurus)
- 48a654a Initialise ark/jsonschema package (TizzySaurus)
- b343bfe Add parsing & types for object JSON Schema (TizzySaurus)
- 98e9c03 Add parsing for allOf + anyOf + not + oneOf JSON Schemas (TizzySaurus)
- 6dbdbcd Add parsing for const + enum JSON Schemas (TizzySaurus)
- 7248b27 Add core parseJsonSchema with parsing and type inference logic (TizzySaurus)
- 79c0a79 Add ark/jsonschema/index.ts entry level to @ark/jsonschema (TizzySaurus)
- 28da503 Add @ark/jsonschema to root package.json devDepencies (TizzySaurus)
- 4127422 Merge branch 'main' into json-schema-to-arktype (TizzySaurus)
- 474a849 Export type Out from ark/type/keywords/inference.ts (TizzySaurus)
- 95e1b40 Fix @ark/jsonschema use of arktype constraint inference (TizzySaurus)
- 2e21553 Preliminary tests for @ark/jsonschema (TizzySaurus)
- 2d945b3 Merge branch 'main' into json-schema-to-arktype (TizzySaurus)
- 94cdef2 Merge branch 'json-schema-to-arktype' of personal.github.com:TizzySau… (TizzySaurus)
- 6903094 Add ts-ignore comment for excessively deep tuple spread (TizzySaurus)
- d65fa30 Add preliminary README.md for @ark/jsonschema (TizzySaurus)
- 8e97d87 Linting (TizzySaurus)
- 877f36e Fix example in README.md (TizzySaurus)
- d59336c Remove extra double-slash in comment (TizzySaurus)
- e150832 Remove old TODO (TizzySaurus)
- f625bc9 Remove old comment (TizzySaurus)
- dd130e8 Migrate .js imports to .ts imports (TizzySaurus)
- 4598d02 Remove accidentally added ark/jsonschema/del.ts file (TizzySaurus)
- f7bd853 Remove changeset (TizzySaurus)
- 03ba515 Use type.enumerated and type.unit utils for 'const' and 'enum' JSON S… (TizzySaurus)
- 01e296c Specify return type of function rather than double casting the return… (TizzySaurus)
- dc43e17 Make variable assignment clearer & remove debug log statement (TizzySaurus)
- be4fff2 Merge branch 'json-schema-to-arktype' of github.com:TizzySaurus/arkty… (TizzySaurus)
- b6e5c70 Update @ark/jsonschema 'scripts' and 'exports' to match new style in … (TizzySaurus)
- 423ba89 Formatting (TizzySaurus)
- 2fe1b7a Use conflatenateAll util instead of manually filtering out undefined … (TizzySaurus)
- d0e612b Remove accidentally added debugging file (TizzySaurus)
- 07a5215 Remove type inference from @ark/jsonschema (TizzySaurus)
- 2da7bb5 Merge branch 'main' into json-schema-to-arktype (TizzySaurus)
- a1f2911 Fix string tests (TizzySaurus)
- 0e1dac7 Remove redundant duplicate tests (TizzySaurus)
- fc67ff7 Fix broken types (TizzySaurus)
- aa81782 Use 'expected' and 'actual' props in ctx.reject and use 'Type' over '… (TizzySaurus)
- de4c7cc Use ctx.hasError instead of manually tracking ctx errors (TizzySaurus)
- ce9b213 Fix incorrect exports from ark/type/index.ts (TizzySaurus)
- ad9e053 Fix 'Expored variable innerParseJsonSchema has or is using name XXX f… (TizzySaurus)
- 8dc5826 Fix types for innerParseJsonSchema (TizzySaurus)
- c55e9aa Merge branch 'main' into json-schema-to-arktype (TizzySaurus)
- 624c0ac Merge branch 'main' into json-schema-to-arktype (TizzySaurus)
- 9826c00 add getDuplicatesOf array util to @ark/util, bumping the package vers… (TizzySaurus)
- 5272e44 Update JSON Schema object parsing logic to use new getDuplicatesOf ut… (TizzySaurus)
- aaf7d46 Merge branch 'main' into json-schema-to-arktype (TizzySaurus)
- 6221890 Bump @ark/util version from 0.25.0 to 0.26.0 due to new getDuplicates… (TizzySaurus)
- 20f446b Add support for JSON Schema 'prefixItems' keyword on arrays (TizzySaurus)
- 0fac361 Add unit tests for new 'prefixItems' support (TizzySaurus)
- 65112a1 Make JSON Schema types shared between arktype and @ark/jsonschema (TizzySaurus)
- 1567862 patch-fix broken 'arktype' and '@ark/util' unit tests (TizzySaurus)
@@ -0,0 +1,13 @@
# @arktype/jsonschema

## 1.0.0

### Initial Release

Released the initial implementation of the package.

Known limitations:

- No `dependencies` support
- No `if`/`else`/`then` support
- `multipleOf` only supports integers
@@ -0,0 +1,51 @@

# @arktype/jsonschema

## What is it?

@arktype/jsonschema is a package that converts a JSON Schema schema into an ArkType type. For example:

```js
import { parseJsonSchema } from "@ark/jsonschema"

const t = parseJsonSchema({ type: "string", minLength: 5, maxLength: 10 })
```

is equivalent to:

```js
import { type } from "arktype"

const t = type("5<=string<=10")
```

This enables easy adoption of ArkType for people who currently have JSON Schema based runtime validation in their codebase.
Where possible, the library also provides TypeScript type inference so that the runtime validation remains typesafe. Extending the above example, this means that the return type of the below `assertIsString` function would be correctly inferred as `string`:

```ts
const assertIsString = (data: unknown) => t.assert(data)
```
## Extra Type Safety

If you wish to ensure that your JSON Schema schemas are valid, you can do that too! Simply annotate your schema with the relevant schema type, like so:

```ts
import type { JsonSchema } from "arktype"

const integerSchema: JsonSchema.Numeric = {
  type: "integer",
  multipleOf: "3" // errors stating that 'multipleOf' must be a number
}
```

Note that for string schemas specifically, you must import the schema type from `@ark/jsonschema` instead of `arktype`. This is because `@ark/jsonschema` doesn't yet support the `format` keyword, whilst `arktype` does.

```ts
import type { StringSchema } from "@ark/jsonschema"

const stringSchema: StringSchema = {
  type: "string",
  minLength: "3" // errors stating that 'minLength' must be a number
}
```
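To make the equivalence concrete without installing either library, here is a hand-rolled sketch of the check the `{ type: "string", minLength: 5, maxLength: 10 }` schema above describes. The `isBoundedString` helper is hypothetical, for illustration only; it is not part of `@ark/jsonschema` or `arktype`.

```typescript
// Hypothetical standalone predicate equivalent to the schema
// { type: "string", minLength: 5, maxLength: 10 }
// (and to ArkType's "5<=string<=10" bound syntax).
const isBoundedString = (data: unknown): data is string =>
  typeof data === "string" && data.length >= 5 && data.length <= 10

console.log(isBoundedString("hello")) // true: 5 <= length <= 10
console.log(isBoundedString("hi")) // false: too short
console.log(isBoundedString(42)) // false: not a string
```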
@@ -0,0 +1,141 @@
```ts
import { attest, contextualize } from "@ark/attest"
import { parseJsonSchema } from "@ark/jsonschema"

// TODO: Add compound tests for arrays (e.g. maxItems AND minItems)
// TODO: Add explicit test for negative length constraint failing (since explicitly mentioned in spec)

contextualize(() => {
  it("type array", () => {
    const t = parseJsonSchema({ type: "array" })
    attest(t.json).snap({ proto: "Array" })
  })

  it("items & prefixItems", () => {
    const tItems = parseJsonSchema({ type: "array", items: { type: "string" } })
    attest(tItems.json).snap({ proto: "Array", sequence: "string" })
    attest(tItems.allows(["foo"])).equals(true)
    attest(tItems.allows(["foo", "bar"])).equals(true)
    attest(tItems.allows(["foo", 3, "bar"])).equals(false)

    const tItemsArr = parseJsonSchema({
      type: "array",
      items: [{ type: "string" }, { type: "number" }]
    })
    attest(tItemsArr.json).snap({
      proto: "Array",
      sequence: { prefix: ["string", "number"] },
      exactLength: 2
    })
    attest(tItemsArr.allows(["foo", 1])).equals(true)
    attest(tItemsArr.allows([1, "foo"])).equals(false)
    attest(tItemsArr.allows(["foo", 1, true])).equals(false)

    const tPrefixItems = parseJsonSchema({
      type: "array",
      prefixItems: [{ type: "string" }, { type: "number" }]
    })
    attest(tPrefixItems.json).snap({
      proto: "Array",
      sequence: { prefix: ["string", "number"] },
      exactLength: 2
    })
  })

  it("additionalItems", () => {
    const tItemsVariadic = parseJsonSchema({
      type: "array",
      items: [{ type: "string" }, { type: "number" }],
      additionalItems: { type: "boolean" }
    })
    attest(tItemsVariadic.json).snap({
      minLength: 2,
      proto: "Array",
      sequence: {
        prefix: ["string", "number"],
        variadic: [{ unit: false }, { unit: true }]
      }
    })
    attest(tItemsVariadic.allows(["foo", 1])).equals(true)
    attest(tItemsVariadic.allows([1, "foo", true])).equals(false)
    attest(tItemsVariadic.allows([false, "foo", 1])).equals(false)
    attest(tItemsVariadic.allows(["foo", 1, true])).equals(true)
  })

  it("contains", () => {
    const tContains = parseJsonSchema({
      type: "array",
      contains: { type: "number" }
    })
    const predicateRef =
      tContains.internal.firstReferenceOfKindOrThrow(
        "predicate"
      ).serializedPredicate
    attest(tContains.json).snap({
      proto: "Array",
      predicate: [predicateRef]
    })
    attest(tContains.allows([])).equals(false)
    attest(tContains.allows([1, 2, 3])).equals(true)
    attest(tContains.allows(["foo", "bar", "baz"])).equals(false)
  })

  it("maxItems", () => {
    const tMaxItems = parseJsonSchema({
      type: "array",
      maxItems: 5
    })
    attest(tMaxItems.json).snap({
      proto: "Array",
      maxLength: 5
    })

    attest(() => parseJsonSchema({ type: "array", maxItems: -1 })).throws(
      "maxItems must be an integer >= 0"
    )
  })

  it("minItems", () => {
    const tMinItems = parseJsonSchema({
      type: "array",
      minItems: 5
    })
    attest(tMinItems.json).snap({
      proto: "Array",
      minLength: 5
    })

    attest(() => parseJsonSchema({ type: "array", minItems: -1 })).throws(
      "minItems must be an integer >= 0"
    )
  })

  it("uniqueItems", () => {
    const tUniqueItems = parseJsonSchema({
      type: "array",
      uniqueItems: true
    })
    const predicateRef =
      tUniqueItems.internal.firstReferenceOfKindOrThrow(
        "predicate"
      ).serializedPredicate
    attest(tUniqueItems.json).snap({
      proto: "Array",
      predicate: [predicateRef]
    })
    attest(tUniqueItems.allows([1, 2, 3])).equals(true)
    attest(tUniqueItems.allows([1, 1, 2])).equals(false)
    attest(
      tUniqueItems.allows([
        { foo: { bar: ["baz", { qux: "quux" }] } },
        { foo: { bar: ["baz", { qux: "quux" }] } }
      ])
    ).equals(false)
    attest(
      // JSON Schema specifies that arrays must be in the same order to be classified as equal
      tUniqueItems.allows([
        { foo: { bar: ["baz", { qux: "quux" }] } },
        { foo: { bar: [{ qux: "quux" }, "baz"] } }
      ])
    ).equals(true)
  })
})
```
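The `uniqueItems` tests above depend on order-sensitive deep equality: JSON Schema treats two arrays as equal only when their elements match in the same order, so `["baz", {...}]` and `[{...}, "baz"]` count as distinct items. A minimal standalone sketch of that comparison; the `jsonEqual` and `hasUniqueItems` helpers are illustrative, not the actual `@ark/jsonschema` implementation.

```typescript
// Order-sensitive deep equality over JSON values: arrays compare
// element-by-element in order (array indices are keys here).
const jsonEqual = (a: unknown, b: unknown): boolean => {
  if (a === b) return true
  if (typeof a !== "object" || typeof b !== "object" || a === null || b === null)
    return false
  if (Array.isArray(a) !== Array.isArray(b)) return false
  const aKeys = Object.keys(a)
  const bKeys = Object.keys(b)
  if (aKeys.length !== bKeys.length) return false
  return aKeys.every(k =>
    jsonEqual((a as Record<string, unknown>)[k], (b as Record<string, unknown>)[k])
  )
}

// An array has unique items iff no element deep-equals a later element.
const hasUniqueItems = (arr: unknown[]): boolean =>
  arr.every((x, i) => arr.slice(i + 1).every(y => !jsonEqual(x, y)))

console.log(hasUniqueItems([1, 2, 3])) // true
console.log(hasUniqueItems([{ a: [1, 2] }, { a: [2, 1] }])) // true: order differs
console.log(hasUniqueItems([{ a: 1 }, { a: 1 }])) // false: deep duplicates
```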
@@ -0,0 +1,89 @@
```ts
import { attest, contextualize } from "@ark/attest"
import { parseJsonSchema } from "@ark/jsonschema"

// TODO: Compound tests for number (e.g. 'minimum' AND 'maximum')

contextualize(() => {
  it("type number", () => {
    const jsonSchema = { type: "number" } as const
    const expectedArkTypeSchema = { domain: "number" } as const

    const parsedNumberValidator = parseJsonSchema(jsonSchema)
    attest(parsedNumberValidator.json).snap(expectedArkTypeSchema)
  })

  it("type integer", () => {
    const t = parseJsonSchema({ type: "integer" })
    attest(t.json).snap({ domain: "number", divisor: 1 })
  })

  it("maximum & exclusiveMaximum", () => {
    const tMax = parseJsonSchema({
      type: "number",
      maximum: 5
    })
    attest(tMax.json).snap({
      domain: "number",
      max: 5
    })

    const tExclMax = parseJsonSchema({
      type: "number",
      exclusiveMaximum: 5
    })
    attest(tExclMax.json).snap({
      domain: "number",
      max: { rule: 5, exclusive: true }
    })

    attest(() =>
      parseJsonSchema({
        type: "number",
        maximum: 5,
        exclusiveMaximum: 5
      })
    ).throws(
      "ParseError: Provided number JSON Schema cannot have 'maximum' and 'exclusiveMaximum"
    )
  })

  it("minimum & exclusiveMinimum", () => {
    const tMin = parseJsonSchema({ type: "number", minimum: 5 })
    attest(tMin.json).snap({ domain: "number", min: 5 })

    const tExclMin = parseJsonSchema({
      type: "number",
      exclusiveMinimum: 5
    })
    attest(tExclMin.json).snap({
      domain: "number",
      min: { rule: 5, exclusive: true }
    })

    attest(() =>
      parseJsonSchema({
        type: "number",
        minimum: 5,
        exclusiveMinimum: 5
      })
    ).throws(
      "ParseError: Provided number JSON Schema cannot have 'minimum' and 'exclusiveMinimum"
    )
  })

  it("multipleOf", () => {
    const t = parseJsonSchema({ type: "number", multipleOf: 5 })
    attest(t.json).snap({ domain: "number", divisor: 5 })

    const tInt = parseJsonSchema({
      type: "integer",
      multipleOf: 5
    })
    attest(tInt.json).snap({ domain: "number", divisor: 5 })

    // JSON Schema allows decimal multipleOf, but ArkType doesn't.
    attest(() => parseJsonSchema({ type: "number", multipleOf: 5.5 })).throws(
      "AggregateError: multipleOf must be an integer"
    )
  })
})
```
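The `type: "integer"` mapping in the tests above works because a divisor of 1 is exactly the integrality test: `n % 1 === 0` holds precisely when `n` is an integer. A trivial standalone sketch (the `isInteger` helper is illustrative only):

```typescript
// {"type": "integer"} maps to domain "number" with divisor 1:
// a number is an integer iff it is evenly divisible by 1.
const isInteger = (n: number): boolean => n % 1 === 0

console.log(isInteger(4)) // true
console.log(isInteger(-3)) // true
console.log(isInteger(4.5)) // false
```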
@ssalbdivad are you open to lifting ArkType's restriction on divisors being non-integers? Or, if not (or as an interim solution), it seems this could be added as a custom `narrow` constraint?
You can add whatever arbitrary validation you want in a narrow, but the reason we don't support this is that there isn't a straightforward way to test float divisibility without complex and potentially fallible string math.

@TizzySaurus was actually working on an implementation, but it doesn't make sense to add a dependency or implement that much custom logic for such a niche case.

For rational values like `0.1`, you could test something like `(n: number) => (n * 10) % 1 === 0`. Outside that, though, things tend to get scary, so be careful.
Yes, I looked into this and it's just not feasible to do reliable floating-point division in Node.js:
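As context for the thread above: naive float remainder checks break down under IEEE-754 rounding, which is why ArkType restricts divisors to integers. Below is a hedged sketch of the scaling trick the `(n * 10) % 1 === 0` suggestion alludes to. The helper names `decimals` and `divisible` are hypothetical, and the approach assumes operands with short decimal expansions.

```typescript
// Naive check: fails for mathematically exact multiples due to rounding.
const naiveDivisible = (n: number, d: number) => n % d === 0

// 0.3 is a multiple of 0.1, but 0.3 % 0.1 evaluates to ~0.0999..., not 0:
console.log(naiveDivisible(0.3, 0.1)) // false

// Scaling both operands to integers first sidesteps the rounding error.
// Assumes both numbers have short decimal string representations
// (breaks for values printed in exponential notation, e.g. 1e-7).
const decimals = (x: number) => (String(x).split(".")[1] ?? "").length
const divisible = (n: number, d: number): boolean => {
  const scale = 10 ** Math.max(decimals(n), decimals(d))
  return Math.round(n * scale) % Math.round(d * scale) === 0
}

console.log(divisible(0.3, 0.1)) // true
console.log(divisible(0.35, 0.1)) // false
```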