feat(zmodel): "type" construct and strongly-typed Json fields #1813
Conversation
📝 Walkthrough: The changes in this pull request introduce a new grammar rule.
```ts
// fix "$[Model]Payload" type
const payloadModelMatch = modelsWithTypeField.find((m) => `$${m.name}Payload` === typeName);
```
Instead of replacement, could we slightly alter the type construction to support TS module augmentation?
Hi @nwidynski , I'm glad you found the work here 😄.
I think your proposal is a good approach to making the model types easily customizable. Can it be simplified to the following (without the nested `$user`)?
```ts
export interface $UserExtensionScalar {}

export interface $UserScalar {
    // original scalar fields
    ...
}

export type $UserPayload<ExtArgs extends $Extensions.InternalArgs = $Extensions.DefaultArgs> = {
    ...
    scalars: $Extensions.GetPayloadResult<
        Omit<$UserScalar, keyof $UserExtensionScalar> & $UserExtensionScalar,
        ExtArgs["result"]["user"]
    >
}
```
```ts
// user land
import '@prisma/client';

declare module '@prisma/client' {
    namespace Prisma {
        export interface $UserExtensionScalar {
            metadata: { ... }
        }
    }
}
```
Curious about your use case: when ZenStack's native support for typed JSON fields is available, do you still need this augmentation-based extension?
Hey,

Unfortunately the nested property is necessary, as TypeScript can't override existing declarations but only patch them. In order to make conditionals on those patches, a known property must be patched. It can have any name though; it doesn't have to be `$user`.

As for your second question: yes, I would still have a use case for them, in order to get strongly typed model results while they are separated into different extensions. This currently doesn't work well, especially for query extensions.
You can imagine the following example: add a result extension in ExtensionA and a query extension in ExtensionB. The result type of the query will never include the result fields defined in ExtensionA, even though they are available.
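The merging behavior this relies on can be sketched in plain TypeScript (a standalone sketch with hypothetical names, not the generated client code): two declarations of the same interface merge additively, which is why an extension can add fields but never remove or replace existing declarations.

```typescript
// Two declarations of the same interface in one scope merge additively.
interface UserScalars {
  id: string;
  email: string;
}

// A second declaration "patches" the first: the merged type now also
// requires `metadata`, but `id` and `email` cannot be removed or retyped.
interface UserScalars {
  metadata: { tags: string[] };
}

// The merged interface requires all fields from both declarations.
const user: UserScalars = {
  id: '1',
  email: 'a@b.c',
  metadata: { tags: ['admin'] },
};

const mergedKeys = Object.keys(user).sort();
```

This is the same mechanism that `declare module` augmentation uses, just scoped to a module's declarations.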
Got it. For the field-type patching, I was thinking of "omit then patch": `Omit<$UserScalar, keyof $UserExtensionScalar> & $UserExtensionScalar`. I haven't tried it yet though.
My current goal is to make the "type" declaration a preview feature in v2.8.0 to get more feedback about the direction. I think we can incorporate your suggestion after the general approach is confirmed viable.
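The "omit then patch" idea can be checked in isolation (a minimal standalone sketch with hypothetical field names, not the generated client types): `Omit` drops the keys being overridden, and the intersection adds them back with the extension's types, so the extension's field type wins even when the field already exists on the base type.

```typescript
interface UserScalar {
  id: string;
  // stored as a plain string in the base type
  metadata: string;
}

interface UserExtensionScalar {
  // the extension re-declares `metadata` with a richer type
  metadata: { tags: string[] };
}

// "Omit then patch": drop the overridden keys, then intersect with the patch.
// A plain intersection without Omit would produce `string & { tags: ... }`
// for `metadata`, which no value can satisfy.
type PatchedUser = Omit<UserScalar, keyof UserExtensionScalar> & UserExtensionScalar;

// `metadata` now has the extension's object type, not the base string type.
const patched: PatchedUser = {
  id: '1',
  metadata: { tags: ['admin'] },
};

const tagCount = patched.metadata.tags.length;
```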
Yes, that delivers additional convenience, although `$UserScalar` could also be reused in user land to build the extension, which would allow essentially hiding fields, something that wouldn't be possible with `Omit`:
```ts
// user land
import { Prisma } from '@prisma/client';

declare module '@prisma/client' {
    type UserScalar = Prisma.$UserScalar;
    namespace Prisma {
        export interface $UserExtensionScalar extends UserScalar {
            metadata: { ... }
        }
    }
}
```
- transform DateTime values from string to Date before return
- support `@default` and `@default(auth()...)`
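The string-to-Date transform mentioned above could look roughly like the following. This is a simplified sketch, not ZenStack's actual implementation: a real version would consult model metadata to know which fields are `DateTime`, whereas this one uses an ISO-timestamp regex heuristic (an assumption made here for illustration).

```typescript
// Recursively convert ISO-8601 timestamp strings into Date objects.
const ISO_DATETIME = /^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(\.\d+)?(Z|[+-]\d{2}:\d{2})$/;

function reviveDates(value: unknown): unknown {
  if (typeof value === 'string' && ISO_DATETIME.test(value)) {
    return new Date(value);
  }
  if (Array.isArray(value)) {
    return value.map(reviveDates);
  }
  if (value !== null && typeof value === 'object') {
    return Object.fromEntries(
      Object.entries(value as Record<string, unknown>).map(([k, v]) => [k, reviveDates(v)])
    );
  }
  return value;
}

const revived = reviveDates({
  profile: { createdAt: '2024-01-01T00:00:00.000Z', city: 'Issaquah' },
}) as { profile: { createdAt: Date; city: string } };
```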
Actionable comments posted: 19
🧹 Outside diff range and nitpick comments (37)
packages/schema/src/language-server/validator/typedef-validator.ts (2)
7-10: Enhance documentation with validation rules. The class documentation could be more detailed about what constitutes a valid type definition. Consider adding:
- List of validation rules being enforced
- Examples of valid and invalid type definitions
- Description of error messages that might be emitted
11-14: Consider additional TypeDef-level validations. The validate method currently checks for duplicate fields and validates individual fields. Consider adding:
- Validation for the type name (e.g., naming conventions, reserved words)
- Validation for circular type references if type fields can reference other types
- Validation for empty type definitions (no fields)
tests/integration/tests/enhancements/json/validation.test.ts (1)
3-39: Add JSDoc comments to describe the test suite's purpose. Consider adding documentation to explain:
- The purpose of these validation tests
- The relationship with the type system feature
- Requirements for JSON field typing
Example:

```ts
/**
 * Integration tests for JSON field typing in model definitions.
 * These tests verify that:
 * 1. Custom-typed JSON fields are only supported with PostgreSQL
 * 2. Custom-typed fields must include the @json attribute
 */
describe('JSON field typing', () => {
```

packages/schema/src/language-server/validator/zmodel-validator.ts (2)
79-81: Consider enhancing error handling. While the implementation follows the established pattern, consider adding error boundaries around the validation call to prevent validation failures from crashing the language server.

```diff
 checkTypeDef(node: TypeDef, accept: ValidationAcceptor): void {
-    this.shouldCheck(node) && new TypeDefValidator().validate(node, accept);
+    if (!this.shouldCheck(node)) return;
+    try {
+        new TypeDefValidator().validate(node, accept);
+    } catch (error) {
+        accept('error', `Type definition validation failed: ${error.message}`, {
+            node,
+        });
+    }
 }
```
Line range hint 27-43: Well-designed validation architecture. The validation system's architecture demonstrates good separation of concerns and extensibility:
- Clear separation between registration and implementation
- Consistent validation patterns across different constructs
- Easy to add new validators without modifying existing code
Consider documenting the validation architecture in the codebase to help future contributors understand the design patterns and extension points.
packages/runtime/src/enhancements/node/create-enhancement.ts (2)
94-100: Consider using optional chaining for more concise code. The type definition field collection logic is correct, but it could be more concise using optional chaining.

Here's a suggested improvement:

```diff
-if (options.modelMeta.typeDefs) {
-    allFields.push(
-        ...Object.values(options.modelMeta.typeDefs).flatMap((typeDefInfo) => Object.values(typeDefInfo.fields))
-    );
-}
+allFields.push(
+    ...Object.values(options.modelMeta.typeDefs ?? {}).flatMap((typeDefInfo) => Object.values(typeDefInfo.fields))
+);
```
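The equivalence this suggestion relies on can be checked in isolation (a standalone sketch with simplified types, not the actual `create-enhancement` code): `Object.values(x ?? {})` yields an empty array when `x` is `undefined`, so the explicit guard becomes unnecessary.

```typescript
type FieldInfo = { name: string };
type TypeDefInfo = { fields: Record<string, FieldInfo> };

function collectFields(typeDefs?: Record<string, TypeDefInfo>): FieldInfo[] {
  // `?? {}` makes the undefined case fall through to an empty array,
  // replacing the explicit `if (typeDefs) { ... }` guard.
  return Object.values(typeDefs ?? {}).flatMap((typeDefInfo) => Object.values(typeDefInfo.fields));
}

const some = collectFields({
  Profile: { fields: { age: { name: 'age' }, city: { name: 'city' } } },
});
const none = collectFields(undefined);
```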
154-156: Consider enhancement configuration and ordering. The JSON processor enhancement application has several potential issues to consider:

- Unlike other enhancements, it's not controlled by the `kinds` array, making it impossible to disable.
- The processing order might be significant: should it be applied before or after specific other enhancements?

Consider:

- Adding a new enhancement kind for JSON processing
- Documenting the enhancement order dependencies
- Making it configurable through the `kinds` array

Example implementation:

```diff
 const ALL_ENHANCEMENTS: EnhancementKind[] = [
-    'password', 'omit', 'policy', 'validation', 'delegate'
+    'password', 'omit', 'policy', 'validation', 'delegate', 'json'
 ];

-if (hasTypeDefField) {
+if (hasTypeDefField && kinds.includes('json')) {
     result = withJsonProcessor(result, options);
 }
```

tests/integration/tests/enhancements/json/typing.test.ts (1)
1-181: Consider test suite architectural improvements. To enhance maintainability and coverage:
- Extract common test utilities for repeated operations (e.g., user creation, validation)
- Create structured test data fixtures
- Add dedicated error test cases for each feature
- Consider using a test matrix approach to systematically cover combinations of optional/required fields
packages/language/src/zmodel.langium (2)
189-191: Review attribute compatibility with type definitions. The TypeDefField rule reuses DataModelFieldAttribute, but some model-specific attributes might not be applicable to type definitions. Consider creating a dedicated TypeDefFieldAttribute rule or adding validation logic.
193-194: Consider supporting Enum references in TypeDefFieldType. Currently, TypeDefFieldType can only reference other TypeDefs. Consider allowing references to Enum types as well, which would enable more comprehensive type definitions without adding complexity to the type system.

```diff
-    (type=BuiltinType | reference=[TypeDef:RegularID]) (array?='[' ']')? (optional?='?')?;
+    (type=BuiltinType | reference=[TypeDeclaration:RegularID]) (array?='[' ']')? (optional?='?')?;
```

tests/integration/tests/enhancements/json/crud.test.ts (4)
18-72: Consider enhancing test readability and error validation. While the test cases are comprehensive, consider these improvements:

- Add descriptive comments for each rejection case to clarify the validation being tested
- Validate error messages to ensure proper error reporting

Example improvement:

```diff
 // expecting object
-await expect(db.user.create({ data: { profile: 1 } })).toBeRejectedByPolicy();
+// should reject non-object values
+await expect(db.user.create({ data: { profile: 1 } }))
+    .toBeRejectedByPolicy('Profile must be an object');
+
+// should reject array values
 await expect(db.user.create({ data: { profile: [{ age: 18 }] } })).toBeRejectedByPolicy();
```
74-111: Consider adding more array validation test cases. The array validation test could be more comprehensive. Consider adding tests for:
- Empty array validation
- Array with mixed valid/invalid elements
- Maximum array length (if applicable)
Example additional test cases:
```ts
// should accept empty array
await expect(
    db.user.create({ data: { profiles: [] } })
).resolves.toMatchObject({
    profiles: [],
});

// should reject array with invalid elements
await expect(
    db.user.create({
        data: {
            profiles: [
                { age: 18 },
                { age: "invalid" } // invalid type
            ]
        }
    })
).toBeRejectedByPolicy();
```
194-233: Consider adding more robust DateTime validation. The DateTime default value test could be enhanced to:
- Validate the exact DateTime format
- Consider timezone implications
- Ensure the created timestamp is recent
Example improvement:
```ts
const now = new Date();
await expect(
    db.user.create({ data: { profile: { address: { state: 'WA' } } } })
).resolves.toMatchObject({
    profile: {
        address: { state: 'WA', city: 'Issaquah' },
        createdAt: expect.any(Date)
    }
}).then(result => {
    // Ensure timestamp is within the last second
    expect(result.profile.createdAt.getTime()).toBeGreaterThan(now.getTime() - 1000);
    expect(result.profile.createdAt.getTime()).toBeLessThan(now.getTime() + 1000);
});
```
235-270: Consider adding auth context error cases. The auth default value test could be enhanced to cover error scenarios:
- Missing auth context
- Invalid auth context structure
Example additional test cases:
```ts
// should handle missing auth context
const dbNoAuth = params.enhance();
await expect(
    dbNoAuth.user.create({ data: { profile: { nested: {} } } })
).rejects.toThrow(/auth context/i);

// should handle invalid auth context
const dbInvalidAuth = params.enhance({}); // empty object without id
await expect(
    dbInvalidAuth.user.create({ data: { profile: { nested: {} } } })
).rejects.toThrow(/auth.id/i);
```

packages/schema/src/plugins/zod/utils/schema-gen.ts (1)
177-177: Consider improving error handling for unknown reference types. The current implementation falls back to `z.any()` for unknown reference types, which is safe but might hide potential issues. Consider:

- Logging a warning for unexpected reference types
- Being more explicit about the types that can be handled

```diff
 } else if (isTypeDef(field.type.reference?.ref)) {
     schema = `z.lazy(() => ${upperCaseFirst(field.type.reference.ref.name)}Schema)`;
 } else {
+    console.warn(`Unexpected reference type for field. Falling back to z.any()`);
     schema = 'z.any()';
 }
```
Also applies to: 180-187
packages/schema/src/language-server/validator/attribute-application-validator.ts (1)
68-70: Consider enhancing the error message. The validation logic is correct, but the error message could be more helpful by mentioning which attributes are supported on type declaration fields.

Consider updating the error message to be more informative:

```diff
-accept('error', `attribute "${decl.name}" cannot be used on type declaration fields`, { node: attr });
+accept('error', `attribute "${decl.name}" cannot be used on type declaration fields. Only attributes marked with @@@supportTypeDef are allowed.`, { node: attr });
```

packages/schema/src/cli/plugin-runner.ts (1)
317-318: LGTM with a documentation suggestion. The `hasTypeDefFields` method is well-implemented with proper null safety through optional chaining. Consider adding JSDoc documentation to explain its purpose and return value.

Add JSDoc documentation:

```diff
+/**
+ * Checks if the model has any fields that reference a type definition.
+ * @param model The data model to check
+ * @returns true if the model has type definition fields, false otherwise
+ */
 private hasTypeDefFields(model: DataModel) {
     return model.fields.some((f) => isTypeDef(f.type.reference?.ref));
 }
```

packages/schema/src/language-server/validator/datamodel-validator.ts (1)
106-115: Consider adding documentation for custom type constraints. The validation enforces important constraints for custom-typed fields (requiring the @json attribute and the PostgreSQL provider). Consider adding code comments or documentation to explain these requirements and their rationale to improve maintainability.

Here's a suggested documentation addition:

```diff
+// Custom-typed fields have specific requirements:
+// 1. Must be annotated with @json attribute as they are stored as JSON
+// 2. Only supported with PostgreSQL provider due to JSON data type requirements
 if (isTypeDef(field.type.reference?.ref)) {
```

packages/schema/src/res/stdlib.zmodel (1)
710-713: Consider enhancing the documentation. While the `@json()` attribute is correctly defined, consider adding more details to the documentation:
- Explain the behavior when applied to a type definition
- Clarify any validation or formatting requirements
- Add examples of usage
packages/schema/src/plugins/enhancer/policy/policy-guard-generator.ts (1)
751-758: Consider improving the template literal readability. While the logic is correct, the template literal's formatting could be improved for better readability.

Consider this refactor:

```diff
-writer.write(
-    `hasValidation: ${
-        // explicit validation rules
-        hasValidationAttributes(model) ||
-        // type-def fields require schema validation
-        this.hasTypeDefFields(model)
-    }`
-);
+const hasValidation = [
+    // explicit validation rules
+    hasValidationAttributes(model),
+    // type-def fields require schema validation
+    this.hasTypeDefFields(model)
+].some(Boolean);
+writer.write(`hasValidation: ${hasValidation}`);
```

packages/schema/tests/schema/validation/attribute-validation.test.ts (1)
1358-1371: LGTM! Consider adding more test cases for other attributes. The test case effectively verifies that the `@omit` attribute cannot be used on type declaration fields. The test structure is clear and follows the established patterns in the file.

Consider adding more test cases to verify the behavior of other attributes (e.g., `@default`, `@map`, etc.) with type declarations to ensure comprehensive coverage. For example:

```ts
it('type def field attributes', async () => {
    await expect(
        loadModelWithError(`
            model User {
                id String @id
                profile Profile
            }
            type Profile {
                email String @omit
                name String @map("_name")
                createdAt DateTime @default(now())
            }
        `)
    ).resolves.toContain(`attribute "@omit" cannot be used on type declaration fields`);

    await expect(
        loadModelWithError(`
            model User {
                id String @id
                profile Profile
            }
            type Profile {
                email String @map("_email")
            }
        `)
    ).resolves.toContain(`attribute "@map" cannot be used on type declaration fields`);
});
```

tests/integration/tests/plugins/zod.test.ts (2)
1064-1071: Consider adding edge case tests. While the current test coverage is good, consider adding tests for:
- Empty objects
- Null values for optional fields
- Invalid JSON structures
1073-1082: Consider testing schema update scenarios. The test covers create scenarios but doesn't verify update operations. Consider adding tests for:
- Partial updates of Profile
- Updating nested Address fields
packages/schema/src/plugins/prisma/schema-generator.ts (1)
789-807: LGTM! Consider enhancing error handling. The field type resolution logic is well-structured and correctly handles type definitions by mapping them to JSON fields. However, consider adding more descriptive error messages when field types cannot be resolved.

```diff
 if (!fieldType) {
-    throw new PluginError(name, `Field type is not resolved: ${field.$container.name}.${field.name}`);
+    throw new PluginError(
+        name,
+        `Field type could not be resolved for ${field.$container.name}.${field.name}. ` +
+        `Expected an intrinsic type, reference type, or unsupported type declaration.`
+    );
 }
```

packages/schema/src/plugins/enhancer/enhance/model-typedef-generator.ts (2)
29-31: Enhance Comment Unwrapping Logic. The `unwrapTripleSlashComment` function (lines 29-31) removes leading slashes and whitespace from comments. However, it might not correctly handle comments with varying numbers of slashes or whitespace.

Improve the regular expression to accurately unwrap triple-slash comments:

```diff
 function unwrapTripleSlashComment(c: string): string {
-    return c.replace(/^[/]*\s*/, '');
+    return c.replace(/^\/\/\/\s?/, '');
 }
```

This change specifically targets triple-slash comments and preserves other comment styles.
51-62: Ensure Future-Proof Handling of Built-in Types. The `builtinTypeToTsType` function (lines 51-62) matches built-in ZModel types to TypeScript types. While the `ts-pattern` library ensures exhaustiveness, consider adding a default case to handle any new types added in the future.

Add a catch-all case to provide a clear error message for unhandled types:

```diff
 .with('Json', () => 'unknown')
+.otherwise((type) => {
+    throw new PluginError(name, `Unhandled built-in type: ${type}`);
+})
 .exhaustive();
```

This helps in maintaining the code when new types are introduced.
packages/runtime/src/enhancements/edge/json-processor.ts (1)
Line range hint 35-41: Consider adding type information to the `data` parameter. To improve type safety and code readability, consider adding a type annotation to the `data` parameter in the `processResultEntity` method. This will make it clear what type of data is expected to be processed.

```diff
-protected override async processResultEntity<T>(_method: PrismaProxyActions, data: T): Promise<T> {
+protected override async processResultEntity<T>(_method: PrismaProxyActions, data: T & Record<string, unknown>): Promise<T> {
```

packages/runtime/src/enhancements/node/json-processor.ts (2)
41-41: Consider defining specific types instead of using `any` for better type safety. Using the `any` type bypasses TypeScript's type checking, potentially allowing unintended values and reducing code reliability. Defining specific interfaces or types for `entityData` can improve code safety and maintainability.

Also applies to: 64-64
1-1: Limit the scope of ESLint disable comments. Disabling `@typescript-eslint/no-explicit-any` for the entire file may hide unintended uses of `any`. Enabling the rule globally and disabling it only where necessary encourages better typing practices.

Apply this diff to adjust the ESLint comments:

```diff
-/* eslint-disable @typescript-eslint/no-explicit-any */
```

Then, add inline disable comments only where `any` is unavoidable:

```diff
+// eslint-disable-next-line @typescript-eslint/no-explicit-any
 private async doPostProcess(entityData: any, model: string) {
```
Repeat for other occurrences of `any` where necessary.

packages/runtime/src/cross/model-meta.ts (3)
151-165: Consider standardizing terminology for 'type definition'. Throughout the comments, "type def" is used. For consistency and clarity, consider using "type definition" or "TypeDef" uniformly in the documentation.
175-179: Maintain consistent terminology in comments. In the comment for `typeDefs` in `ModelMeta`, consider using "Type definitions" to match the terminology used elsewhere.
212-213: Consider renaming parameter for consistency. In the `requireField` function, the parameter is still named `model`, although it can now refer to a type definition as well. For consistency with `resolveField`, consider renaming the parameter to `modelOrTypeDef`.
89-92: Consider renaming `getDefaultValue` to `getDefaultValueFromAuth` for clarity. The method name `getDefaultValue` doesn't clearly convey that the default value is derived from the `auth()` function. Consider renaming it to `getDefaultValueFromAuth` to improve code readability and maintain consistency with the previous naming convention.

```diff
-const defaultValue = this.getDefaultValue(fieldInfo);
+const defaultValue = this.getDefaultValueFromAuth(fieldInfo);
```

packages/schema/src/plugins/enhancer/enhance/index.ts (4)
68-73: Clarify the usage of `prismaTypesFixed` and related conditional logic. The variable `prismaTypesFixed` is introduced to determine if Prisma client type fixes are necessary, based on `needsLogicalClient` or `needsPrismaClientTypeFixes`. Ensure that the conditions under which `prismaTypesFixed` is set to `true` are accurate and that all dependent logic correctly accounts for both cases. Consider adding comments to explain the purpose of `prismaTypesFixed` for better code readability.
182-186: Clarify dual imports of `Prisma` and `PrismaClient`. The `createLogicalPrismaImports` method imports `Prisma` and `PrismaClient` from both `'${prismaImport}'` (aliased as `_Prisma` and `_PrismaClient`) and `'${prismaClientImport}'`. It might be confusing to have these dual imports without clear differentiation. Consider adding comments or renaming the aliases to clarify the distinct purposes of each import to improve maintainability.

Example:

```ts
return `import { Prisma as _OriginalPrisma, PrismaClient as _OriginalPrismaClient } from '${prismaImport}';
import type * as _P from '${prismaClientImport}';
import type { Prisma, PrismaClient } from '${prismaClientImport}';
export type { PrismaClient };
`;
```

And include comments explaining the roles of `_OriginalPrisma` and `Prisma`.
230-232: Consider combining `needsLogicalClient` and `needsPrismaClientTypeFixes` if appropriate. Both getters `needsLogicalClient` and `needsPrismaClientTypeFixes` are used to determine if additional processing is needed for the Prisma client. If these conditions are often checked together, consider whether they can be combined, or if additional documentation can clarify their distinct purposes.
359-370: Remove obsolete commented-out code for better code cleanliness. The block of commented-out code between lines 359 and 370 appears to be obsolete due to the introduction of `this.transformPrismaTypes(sf, sfNew, delegateInfo);`. Removing this dead code can improve readability and maintainability.

Apply this diff to remove the commented-out code:

```diff
-// if (delegateInfo.length > 0) {
-//     // transform types for delegated models
-//     this.transformDelegate(sf, sfNew, delegateInfo);
-//     sfNew.formatText();
-// } else {
-//     this.transformJsonFields(sf, sfNew);
-
-//     // // just copy
-//     // sfNew.replaceWithText(sf.getFullText());
-// }
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
⛔ Files ignored due to path filters (3)
- packages/language/src/generated/ast.ts is excluded by `!**/generated/**`, `!**/generated/**`
- packages/language/src/generated/grammar.ts is excluded by `!**/generated/**`, `!**/generated/**`
- packages/language/syntaxes/zmodel.tmLanguage.json is excluded by `!**/*.json`
📒 Files selected for processing (32)
- packages/language/src/zmodel.langium (4 hunks)
- packages/language/syntaxes/zmodel.tmLanguage (1 hunks)
- packages/plugins/swr/src/generator.ts (2 hunks)
- packages/plugins/tanstack-query/src/generator.ts (3 hunks)
- packages/runtime/src/cross/model-meta.ts (4 hunks)
- packages/runtime/src/cross/utils.ts (4 hunks)
- packages/runtime/src/enhancements/edge/json-processor.ts (1 hunks)
- packages/runtime/src/enhancements/node/create-enhancement.ts (3 hunks)
- packages/runtime/src/enhancements/node/default-auth.ts (6 hunks)
- packages/runtime/src/enhancements/node/json-processor.ts (1 hunks)
- packages/schema/src/cli/plugin-runner.ts (2 hunks)
- packages/schema/src/language-server/validator/attribute-application-validator.ts (3 hunks)
- packages/schema/src/language-server/validator/datamodel-validator.ts (2 hunks)
- packages/schema/src/language-server/validator/typedef-validator.ts (1 hunks)
- packages/schema/src/language-server/validator/zmodel-validator.ts (4 hunks)
- packages/schema/src/plugins/enhancer/enhance/index.ts (11 hunks)
- packages/schema/src/plugins/enhancer/enhance/model-typedef-generator.ts (1 hunks)
- packages/schema/src/plugins/enhancer/model-meta/index.ts (1 hunks)
- packages/schema/src/plugins/enhancer/policy/policy-guard-generator.ts (2 hunks)
- packages/schema/src/plugins/prisma/schema-generator.ts (2 hunks)
- packages/schema/src/plugins/zod/generator.ts (4 hunks)
- packages/schema/src/plugins/zod/transformer.ts (3 hunks)
- packages/schema/src/plugins/zod/utils/schema-gen.ts (4 hunks)
- packages/schema/src/res/stdlib.zmodel (5 hunks)
- packages/schema/tests/schema/validation/attribute-validation.test.ts (1 hunks)
- packages/sdk/src/model-meta-generator.ts (9 hunks)
- packages/sdk/src/utils.ts (3 hunks)
- packages/sdk/src/validation.ts (2 hunks)
- tests/integration/tests/enhancements/json/crud.test.ts (1 hunks)
- tests/integration/tests/enhancements/json/typing.test.ts (1 hunks)
- tests/integration/tests/enhancements/json/validation.test.ts (1 hunks)
- tests/integration/tests/plugins/zod.test.ts (1 hunks)
🧰 Additional context used
🪛 Biome
packages/runtime/src/enhancements/node/json-processor.ts
[error] 85-85: isNaN is unsafe. It attempts a type coercion. Use Number.isNaN instead.
See the MDN documentation for more details.
Unsafe fix: Use Number.isNaN instead.
(lint/suspicious/noGlobalIsNan)
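The difference the lint rule points at can be seen with a small standalone check (an illustration, not code from this PR): the global `isNaN` coerces its argument to a number first, while `Number.isNaN` returns `true` only for the actual `NaN` value.

```typescript
// Global isNaN coerces: 'abc' -> Number('abc') -> NaN -> true.
const globalOnString = isNaN('abc' as unknown as number); // true

// Number.isNaN does no coercion: a string is never the NaN value.
const strictOnString = Number.isNaN('abc' as unknown as number); // false

// Both agree on a genuine NaN value.
const globalOnNaN = isNaN(Number.NaN); // true
const strictOnNaN = Number.isNaN(Number.NaN); // true
```

The coercion in the global form is what makes it "unsafe": it silently treats non-numeric strings the same as a real `NaN` result.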
🔇 Additional comments (80)
packages/schema/src/plugins/enhancer/model-meta/index.ts (2)
1-2: LGTM: Import changes are appropriate. The addition of the `isTypeDef` import is consistent with the new type definition support being added.
Line range hint 9-18: LGTM: Type definition handling is well implemented. The changes properly integrate type definition support:

- Correctly filters type definitions using the type guard
- Appropriately passes type definitions to `generateModelMeta`
- Preserves existing functionality for data models and configuration options
packages/schema/src/language-server/validator/typedef-validator.ts (1)
1-23: Verify integration with language server error reporting.
Let's verify that the validator is properly integrated with the language server's error reporting system.
✅ Verification successful
Let me try with corrected paths to gather the necessary information.
Let me try one more verification to understand how the validator is integrated and used.
TypeDef validator is properly integrated with language server
The validation system is well-structured and correctly integrated:
- TypeDefValidator is registered and used by the ZModelValidationRegistry
- It follows the same validation pattern as other validators in the system
- It uses common validation utilities like
validateDuplicatedDeclarations
- The validator is part of a comprehensive validation system alongside other validators (enum, attribute, datamodel, etc.)
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Check if the validator is registered with the language server
echo "Checking validator registration..."
rg -l "TypeDefValidator" "packages/schema/src/language-server/**/*.ts"
# Check for consistent error message patterns
echo "Checking error message patterns..."
ast-grep --pattern 'accept.error($_, $_)'
Length of output: 356
Script:
#!/bin/bash
# Check if the validator is registered with the language server
echo "Checking validator registration..."
rg -l "TypeDefValidator" "packages/schema/src/language-server"
# Check for validator registration in main language server setup
echo "Checking language server setup..."
rg "TypeDefValidator" "packages/schema/src/language-server/zen-script-validator.ts" -A 5
# Check error reporting patterns in related validators
echo "Checking error patterns in validators..."
rg "accept\.error" "packages/schema/src/language-server/validator"
Length of output: 694
Script:
#!/bin/bash
# Check how TypeDefValidator is used in zmodel-validator
echo "Checking TypeDefValidator usage in zmodel-validator..."
rg "TypeDefValidator" -A 10 "packages/schema/src/language-server/validator/zmodel-validator.ts"
# Check the validation utilities being used
echo "Checking validation utilities..."
rg "validateDuplicatedDeclarations|validateAttributeApplication" "packages/schema/src/language-server/validator/utils.ts" -A 5
# Look for the main validator registration
echo "Checking main validator setup..."
fd "validator|validation" -e ts "packages/schema/src/language-server" --exec rg -l "registerValidator|validate"
Length of output: 2435
tests/integration/tests/enhancements/json/validation.test.ts (2)
1-1: LGTM! The import statement is clean and imports only what's needed for the tests.
4-20: Consider adding more test scenarios for comprehensive coverage.
While the test correctly validates the postgres requirement, consider adding test cases for:
- Different providers (mysql, sqlite) to ensure consistent behavior
- Multiple JSON fields in the same model
- Nested type definitions
Verify error message consistency across codebase.
Let's ensure the error message is consistently used across the codebase.
✅ Verification successful
Error message is consistently used across the codebase
The error message "Custom-typed field is only supported with postgresql provider" is correctly defined in the validator (`datamodel-validator.ts`) and properly tested in the integration test. No inconsistencies found.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for the error message string to ensure consistency
rg "Custom-typed field is only supported with \"postgresql\" provider" --type ts
Length of output: 424
packages/runtime/src/cross/utils.ts (4)
2-2: LGTM: Import changes are appropriate. The addition of the `TypeDefInfo` type import aligns with the new type definition functionality.
Line range hint 66-80: LGTM: Well-implemented generic function with proper type safety. The function effectively uses TypeScript's conditional types to provide accurate return type information based on the `throwIfNotFound` parameter.
97-102: LGTM: Good use of the new getModelInfo function. The refactoring to use `getModelInfo` improves code consistency.
Line range hint 49-65
: Consider enhancing type safety and verifying requireField usage.
While the function implementation is sound, consider these improvements:
- Add explicit return type annotation for better type safety
- Verify the behavior of
requireField
for handling missing fields
-export function getIdFields(modelMeta: ModelMeta, model: string, throwIfNotFound = false) {
+export function getIdFields(modelMeta: ModelMeta, model: string, throwIfNotFound = false): string[] {
✅ Verification successful
Type annotation suggestion can be safely applied, requireField implementation is robust
The requireField
function is properly implemented in packages/runtime/src/cross/model-meta.ts
. It throws an error when a field is not found, which aligns well with the throwIfNotFound
parameter in getIdFields
. The implementation includes proper error handling:
export function requireField(modelMeta: ModelMeta, model: string, field: string, isTypeDef = false) {
const f = resolveField(modelMeta, model, field, isTypeDef);
if (!f) {
throw new Error(`Field ${model}.${field} cannot be resolved`);
}
return f;
}
The suggested type annotation improvement remains valid:
-export function getIdFields(modelMeta: ModelMeta, model: string, throwIfNotFound = false) {
+export function getIdFields(modelMeta: ModelMeta, model: string, throwIfNotFound = false): string[] {
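The conditional-return-type pattern praised above can be sketched in isolation. This is a simplified stand-in (assumed `ModelInfo` shape and `registry`, not the real `ModelMeta` structures): when `throwIfNotFound` is true, `undefined` is excluded from the return type, so callers need no null check.

```typescript
// Sketch of a generic function whose return type depends on a boolean flag,
// using TypeScript conditional types (simplified stand-in types).
type ModelInfo = { name: string; idFields: string[] };

const registry: Record<string, ModelInfo> = {
    user: { name: 'User', idFields: ['id'] },
};

function getModelInfo<Throw extends boolean = false>(
    model: string,
    throwIfNotFound?: Throw
): Throw extends true ? ModelInfo : ModelInfo | undefined {
    const info = registry[model.toLowerCase()];
    if (!info && throwIfNotFound) {
        throw new Error(`Unable to load info for ${model}`);
    }
    // cast: TS cannot narrow a conditional return type inside the function body
    return info as Throw extends true ? ModelInfo : ModelInfo | undefined;
}

const maybe = getModelInfo('user'); // typed as ModelInfo | undefined
const certain = getModelInfo('user', true); // typed as ModelInfo
console.log(certain.idFields);
```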
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Verify the implementation of requireField function
# to ensure it properly handles missing fields
# Search for requireField implementation and its tests
rg -A 5 "requireField.*=.*function"
rg -A 5 "requireField.*test"
Length of output: 68
Script:
#!/bin/bash
# Let's try different search patterns and commands to locate requireField
# Search for requireField function definition using ast-grep
ast-grep --pattern 'function requireField($_) { $$$ }'
# Search for requireField export using ast-grep
ast-grep --pattern 'export function requireField($_) { $$$ }'
# Search for requireField variable declaration using ast-grep
ast-grep --pattern 'const requireField = $_'
# Broader search using ripgrep
rg "requireField" -B 2 -A 5
# Search for files containing requireField
fd -t f -e ts -e js -x grep -l "requireField" {}
Length of output: 7554
packages/language/syntaxes/zmodel.tmLanguage (1)
23-23
: LGTM! Keyword addition looks correct.
The 'type' keyword has been properly added to the control keywords pattern, maintaining alphabetical order and correct word boundary matching.
Let's verify that all language keywords are properly included in the syntax highlighting:
✅ Verification successful
All keywords are properly included in the syntax highlighting
After analyzing the grammar file, I can confirm that all language keywords defined in the grammar are correctly included in the TextMate syntax highlighting pattern. The keywords are sourced from:
- RegularID rule which explicitly lists keywords: 'model', 'enum', 'attribute', 'datasource', 'plugin', 'abstract', 'in', 'view', 'import', 'type'
- RegularIDWithTypeNames rule which adds type keywords: 'String', 'Boolean', 'Int', 'BigInt', 'Float', 'Decimal', 'DateTime', 'Json', 'Bytes', 'Null', 'Object', 'Any', 'Unsupported'
- Other keywords from grammar rules: 'true', 'false', 'function', 'this', 'null'
- Special field types: 'FieldReference', 'TransitiveFieldReference', 'ContextType'
All these keywords are present in the TextMate grammar pattern.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Cross-reference keywords in grammar definition with syntax highlighting
# Expected: All keywords defined in the grammar should be present in the syntax highlighting
# Search for keyword definitions in the grammar file
echo "Keywords defined in grammar:"
rg -i "keyword\s*=\s*['\"](.*?)['\"]" "packages/language/src/zmodel.langium"
# Note: This helps ensure we haven't missed any keywords in the TextMate grammar
# that are defined in the actual language grammar
Length of output: 147
Script:
#!/bin/bash
# Let's try a different approach to find keyword definitions in the grammar
# Look for terminal definitions which often contain keywords
echo "=== Searching for terminal definitions ==="
ast-grep --pattern 'terminal $_: /.*?/;' packages/language/src/zmodel.langium
# Also search for any rule definitions that might contain keywords
echo -e "\n=== Searching for rule definitions ==="
rg "^[[:space:]]*[A-Z][a-zA-Z]+:" packages/language/src/zmodel.langium
# Look at the grammar file content directly
echo -e "\n=== Grammar file content ==="
cat packages/language/src/zmodel.langium
Length of output: 9160
packages/schema/src/language-server/validator/zmodel-validator.ts (2)
10-10
: LGTM: Import statements are correctly added
The new imports for TypeDef
and TypeDefValidator
are properly placed and follow the existing import patterns.
Also applies to: 23-23
36-36
: LGTM: Validator registration is properly implemented
The TypeDef validator is correctly registered in the validation checks map, following the established pattern.
packages/runtime/src/enhancements/node/create-enhancement.ts (2)
13-13
: LGTM!
The import is correctly placed and follows the existing pattern of importing enhancement functions.
105-105
: LGTM!
The type definition detection flag follows the established pattern and is correctly implemented.
tests/integration/tests/enhancements/json/typing.test.ts (1)
1-3
: LGTM! Clean test structure.
The test file follows best practices with clear organization and proper imports.
packages/language/src/zmodel.langium (3)
14-14
: LGTM: TypeDef integration in AbstractDeclaration
The TypeDef declaration is correctly integrated at the top level alongside other primary language constructs.
225-225
: LGTM: Supporting rule updates
The changes to RegularID and TypeDeclaration properly integrate the new TypeDef construct into the existing type system.
Also applies to: 242-242
182-187
: LGTM: TypeDef grammar structure
The TypeDef rule follows a consistent pattern with other declarations in the grammar. Let's verify that circular type references are properly handled.
tests/integration/tests/enhancements/json/crud.test.ts (1)
1-16
: LGTM! Well-structured test setup.
The test suite setup follows best practices with proper database lifecycle management and cleanup.
packages/schema/src/plugins/zod/utils/schema-gen.ts (3)
9-11
: LGTM: Import additions are well-organized and necessary.
The new imports support the TypeDef functionality and error handling improvements.
Also applies to: 18-19, 24-25
238-239
: LGTM: Improved readability of path handling.
The line break makes the complex path handling logic more readable without changing the functionality.
284-284
: LGTM: Default value handling properly extended for TypeDef fields.
The implementation correctly handles all primitive types and special cases. Let's verify the boolean literal handling:
✅ Verification successful
Boolean default values are properly handled in the implementation
The verification confirms that:
- The getFieldSchemaDefault function correctly handles boolean literals through the isBooleanLiteral check
- Test cases demonstrate practical usage with @default(false) for Boolean fields
- The implementation is consistent with other primitive type handling (strings, numbers)
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Verify boolean default value handling in TypeDef fields
# Test: Search for TypeDef fields with boolean default values
ast-grep --pattern $'type $_ {
$field: Boolean @default($value)
}'
Length of output: 69
Script:
#!/bin/bash
# Let's try a broader search for boolean default values in both models and types
ast-grep --pattern '@default($value) $$$Boolean'
# Also search for test files that might contain boolean default examples
rg -l "Boolean.*@default" --type ts
# And check the specific function implementation
ast-grep --pattern 'function getFieldSchemaDefault(field: $type) {
$$$
}'
Length of output: 2146
Script:
#!/bin/bash
# Let's examine the actual implementation of getFieldSchemaDefault
rg -A 20 "export function getFieldSchemaDefault" packages/schema/src/plugins/zod/utils/schema-gen.ts
# And check some test cases to verify boolean default handling
rg "Boolean.*@default\(.*\)" tests/integration/tests/plugins/zod.test.ts
Length of output: 1166
packages/schema/src/language-server/validator/attribute-application-validator.ts (2)
17-18
: LGTM: Import additions for type definition support.
The new imports follow the existing pattern and are necessary for the type definition validation logic.
21-21
: LGTM: Import addition for attribute checking utility.
The hasAttribute
import is appropriately grouped with other utility imports and is used in the new type definition validation logic.
packages/plugins/swr/src/generator.ts (2)
13-13
: LGTM: Import addition for type definition support
Clean addition of isTypeDef
import to support the new type definition feature.
31-36
: LGTM: Type definitions handling in model metadata generation
The changes correctly extract type definitions from the model and pass them to the metadata generator, enabling support for the new "type" construct feature.
packages/schema/src/cli/plugin-runner.ts (2)
3-3
: LGTM: Import changes are appropriate.
The addition of DataModel
and isTypeDef
imports aligns with the new type definition functionality.
314-315
: LGTM: Validation logic properly extended.
The hasValidation
method now correctly considers both validation attributes and type definition fields, maintaining clean separation of concerns through the hasTypeDefFields
helper method.
packages/schema/src/language-server/validator/datamodel-validator.ts (2)
9-9
: LGTM: Well-organized imports
The new imports are properly organized and necessary for the added custom type validation functionality.
Also applies to: 11-18
107-115
: LGTM with security consideration
The validation logic for custom-typed fields is well-implemented. The checks for the @json attribute and the PostgreSQL provider are appropriate.
However, please ensure that proper JSON sanitization is implemented in the runtime layer when handling these custom-typed fields to prevent potential security vulnerabilities like JSON injection.
Let's verify if JSON sanitization is implemented:
packages/sdk/src/utils.ts (2)
33-33
: LGTM: Type definition support properly integrated
The addition of TypeDefField
to imports and the getAttribute
function signature is consistent with the new type definition feature, maintaining proper typing and existing patterns.
Also applies to: 127-127
464-466
: LGTM: Enhanced type safety with container validation
The added container type check is a good defensive programming practice that prevents potential runtime errors when dealing with fields from non-DataModel containers. This is particularly relevant with the introduction of type definitions.
packages/schema/src/res/stdlib.zmodel (4)
50-50
: LGTM: New field type added correctly
The addition of TypeDefField
to AttributeTargetField
enum is consistent with the type system enhancement.
179-182
: LGTM: Type support attribute well-defined
The @@@supportTypeDef()
meta-attribute is properly defined with clear documentation.
218-218
: LGTM: Default attribute correctly extended
The @default
attribute has been properly extended to support type definitions while maintaining existing functionality.
567-637
: LGTM: Validation attributes consistently updated
All validation attributes have been systematically updated to support type definitions while maintaining their original field type restrictions and documentation.
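For illustration, a hypothetical type using these attributes (mirroring their established use on model fields) might look like:

```zmodel
// Hypothetical illustration: field-level validation attributes applied to
// fields of a "type" declaration, mirroring their use on model fields.
type Profile {
    email   String  @email
    age     Int     @gte(0) @lt(150)
    website String? @url
}
```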
packages/plugins/tanstack-query/src/generator.ts (3)
14-14
: LGTM! Import changes are well-organized.
The addition of isTypeDef
import is properly grouped with related AST types and aligns with the type definition feature.
32-32
: LGTM! Type definitions are correctly filtered.
The implementation properly extracts type definitions from model declarations.
48-51
: LGTM! Model metadata generation properly includes type definitions.
The generateModelMeta
call is correctly updated to include type definitions while maintaining existing configuration.
packages/schema/src/plugins/enhancer/policy/policy-guard-generator.ts (2)
14-14
: LGTM!
The import of isTypeDef
is correctly placed with other AST-related imports and is used by the new hasTypeDefFields
method.
766-768
: LGTM!
The hasTypeDefFields
method is well-implemented:
- Uses efficient Array.some() for early termination
- Correctly handles optional type references with optional chaining
- Clear and descriptive method name
tests/integration/tests/plugins/zod.test.ts (1)
1029-1083
: LGTM! Comprehensive test coverage for type definitions.
The test case thoroughly validates:
- Type definition schema generation and validation
- Nested type definitions (Profile containing Address)
- Optional field handling
- JSON field integration with models
- Various validation scenarios including boundary conditions
packages/schema/src/plugins/prisma/schema-generator.ts (1)
813-816
: LGTM! Array type handling is correct.
The code properly handles array types with a special case for typed-JSON fields, ensuring they're correctly translated to scalar Json type. The inline comment provides clear context for the special case.
packages/sdk/src/validation.ts (2)
1-8
: Imports are correctly updated
The necessary imports for isDataModel
, isTypeDef
, and the associated types are appropriately added and used in the code.
38-38
: Verify that field-level validation attributes are correctly detected
In the else block starting at line 38, the code checks for validation attributes on field attributes. To ensure comprehensive validation, confirm that this logic correctly handles all possible field types and their attributes.
Run the following script to verify that fields with validation attributes are being handled properly:
This script searches the codebase for field declarations within both DataModel
and TypeDef
that have validation attributes, ensuring that your logic captures all relevant cases.
✅ Verification successful
This revised approach will help us:
- Understand how validation attributes are defined and checked
- See actual field declarations with attributes in the codebase
- Review the complete validation logic implementation
This will help us:
- See real examples of validation attributes in use
- Check test cases that verify the validation attribute detection logic
From the results, I can see that isValidationAttribute
is a function that checks if an attribute has a reference to @@@validation
. Let me search for actual validation attributes in the codebase.
This will help us:
- Find actual usage of validation attributes in the codebase
- Understand how validation attributes are defined and declared
Based on the results, I can now generate a final response. The codebase shows extensive usage of validation attributes and their proper detection:
Field-level validation attributes are correctly detected and handled
The validation system is working correctly because:
- The
isValidationAttribute
function properly checks for attributes marked with@@@validation
- The codebase shows consistent usage of validation attributes like:
- String validations: @email, @length, @regex, @startsWith, @endsWith, @contains, @url
- Numeric validations: @gt, @gte, @lt, @lte
- Date validations: @datetime
The test files demonstrate comprehensive coverage of these validation attributes in various scenarios, including field-level and model-level validations, confirming that the validation detection logic is working as intended.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Verify that fields with validation attributes are properly detected in all declarations.
# Finding field declarations with validation attributes
ast-grep --pattern $'(
data_model_declaration
| type_def_declaration
) {
fields: (
field_declaration [
attributes: (attribute(
decl: (reference(
refText: "@@@validation"
))
))
]
)
}'
Length of output: 289
Script:
#!/bin/bash
# Let's first understand what validation attributes are defined in the codebase
rg -g "*.ts" "isValidationAttribute" -A 5 -B 5
# Then find all field declarations with attributes to see the pattern
ast-grep --pattern 'field_declaration {
attributes: (
attribute_list {
$$$
}
)
}'
# Also check the implementation of validation.ts
fd "validation.ts" --exec cat {}
Length of output: 4642
Script:
#!/bin/bash
# Let's find actual usage of validation attributes in the codebase
rg -g "*.{ts,js}" "@@@validation" -B 2 -A 2
# Also check for any test files that verify this functionality
fd "validation" -e test.ts -e spec.ts --exec cat {}
Length of output: 100474
Script:
#!/bin/bash
# Find all field declarations with validation-related attributes
rg -g "*.{ts,js}" "@(email|length|gt|lt|gte|lte|startsWith|endsWith|contains|regex|url|datetime)" -B 2 -A 2
# Also check for any validation attribute declarations
rg -g "*.{ts,js}" "@@@validation" -B 5 -A 5
Length of output: 36994
packages/schema/src/plugins/enhancer/enhance/model-typedef-generator.ts (1)
37-40
: Handle Circular Type References
When converting field types, there's a possibility of circular references if a type refers to itself either directly or indirectly.
Ensure that the code handles circular references without causing a stack overflow or infinite recursion. Run the following script to detect circular references:
This script identifies type definitions that reference themselves, which could lead to circular dependencies.
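Beyond the shell script above, self- or mutually-referencing types can be detected programmatically with a simple DFS. This is a hypothetical sketch over an assumed name-to-referenced-names map, not the generator's actual data structures:

```typescript
// Detect a cycle among type definitions via depth-first search.
// `refs` maps each type name to the names of types it references.
function findCycle(refs: Record<string, string[]>): string[] | null {
    const visiting = new Set<string>();
    const done = new Set<string>();
    const path: string[] = [];

    const visit = (name: string): string[] | null => {
        // a node already on the current path means we closed a loop
        if (visiting.has(name)) return [...path.slice(path.indexOf(name)), name];
        if (done.has(name)) return null;
        visiting.add(name);
        path.push(name);
        for (const dep of refs[name] ?? []) {
            const cycle = visit(dep);
            if (cycle) return cycle;
        }
        visiting.delete(name);
        done.add(name);
        path.pop();
        return null;
    };

    for (const name of Object.keys(refs)) {
        const cycle = visit(name);
        if (cycle) return cycle;
    }
    return null;
}

console.log(findCycle({ Profile: ['Address'], Address: ['Profile'] }));
```

A generator could use such a check to either reject cyclic type graphs up front or switch to lazy references for the offending types.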
packages/runtime/src/enhancements/edge/json-processor.ts (6)
Line range hint 2-6
: LGTM!
The import statements are correct and necessary for the functionality implemented in this file.
Line range hint 8-21
: LGTM!
The withJsonProcessor
function is well-structured and follows best practices:
- It uses generics to ensure type safety for the
DbClient
parameter. - It returns a proxied version of the
DbClient
using themakeProxy
function. - The JSDoc comment provides clear documentation about the purpose and visibility of the function.
Line range hint 23-33
: LGTM!
The JsonProcessorHandler
class is properly defined and initialized:
- It extends the
DefaultPrismaProxyHandler
class. - The constructor initializes the necessary dependencies, including the
QueryUtils
instance. - The
queryUtils
property is declared as private to encapsulate it within the class.
Line range hint 43-59
: LGTM!
The doPostProcess
method is well-implemented:
- It retrieves the real model using the
getDelegateConcreteModel
method fromqueryUtils
. - It iterates over the fields of the
entityData
using thegetModelFields
function. - It resolves the field information using the
resolveField
function from themodelMeta
. - It handles different cases based on the field type (type definition or data model).
- It recursively processes nested data structures.
Line range hint 61-89
: LGTM!
The fixJsonDateFields
method is properly implemented:
- It checks if the
entityData
is an object or an array before processing. - It iterates over the properties of the
entityData
usingObject.entries
. - It resolves the field information for each property using the
resolveField
function. - It handles nested type definitions by recursively calling
fixJsonDateFields
. - It converts string values to
Date
objects for fields of typeDateTime
.
Line range hint 1-89
: Verify the integration of withJsonProcessor
in the codebase.
The withJsonProcessor
function is exported from this file, indicating that it is intended to be used in other parts of the codebase. It's important to ensure that it is properly integrated and utilized where necessary.
To verify the integration, you can run the following script:
This script will help you locate the places where withJsonProcessor
is imported and used. Review the results to ensure that it is being utilized correctly and in the appropriate contexts.
✅ Verification successful
Let me generate additional scripts to understand the relationship between the edge and node implementations of withJsonProcessor
.
withJsonProcessor
is properly integrated in both edge and node environments
The verification shows that:
- The function is correctly imported and used in both edge and node environments through their respective
create-enhancement.ts
files - Both environments follow the same integration pattern, applying JSON processing as part of the enhancement chain
- The implementation appears to be mirrored between edge and node directories, which is the expected pattern for environment-specific code
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for the usage of `withJsonProcessor` in TypeScript files
rg --type ts 'withJsonProcessor' -A 5
# Search for the import statement of `withJsonProcessor` in TypeScript files
rg --type ts 'from .*/json-processor' -A 5
Length of output: 2298
Script:
#!/bin/bash
# Compare the edge and node implementations
diff packages/runtime/src/enhancements/edge/json-processor.ts packages/runtime/src/enhancements/node/json-processor.ts
# Check where the edge version is imported/used
rg --type ts 'from.*edge.*json-processor' -A 5
# Check the create-enhancement in edge directory
rg --type ts 'withJsonProcessor' packages/runtime/src/enhancements/edge/create-enhancement.ts -A 5
Length of output: 637
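The string-to-Date revival behavior described for fixJsonDateFields can be sketched in isolation. This is a simplified stand-in (assumed metadata shape, not the runtime's actual ModelMeta): it walks a parsed JSON payload, revives DateTime fields into Date objects, and recurses into nested type-definition fields.

```typescript
// Simplified sketch of reviving DateTime strings inside typed JSON payloads.
type FieldMeta = {
    type: 'DateTime' | 'String' | 'Int' | 'TypeDef';
    fields?: Record<string, FieldMeta>; // present for nested type definitions
};

function fixJsonDateFields(data: unknown, meta: Record<string, FieldMeta>): void {
    if (Array.isArray(data)) {
        data.forEach((item) => fixJsonDateFields(item, meta));
        return;
    }
    if (typeof data !== 'object' || data === null) {
        return;
    }
    for (const [key, value] of Object.entries(data)) {
        const field = meta[key];
        if (!field) continue;
        if (field.type === 'DateTime' && typeof value === 'string') {
            (data as Record<string, unknown>)[key] = new Date(value);
        } else if (field.type === 'TypeDef' && field.fields) {
            fixJsonDateFields(value, field.fields); // recurse into nested type def
        }
    }
}

const meta: Record<string, FieldMeta> = {
    createdAt: { type: 'DateTime' },
    profile: { type: 'TypeDef', fields: { birthday: { type: 'DateTime' } } },
};
const payload: any = {
    createdAt: '2024-01-01T00:00:00.000Z',
    profile: { birthday: '2000-05-06T00:00:00.000Z' },
};
fixJsonDateFields(payload, meta);
console.log(payload.createdAt instanceof Date); // true
```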
packages/runtime/src/cross/model-meta.ts (2)
47-51
: Good addition of isTypeDef
property
The introduction of the isTypeDef
property in FieldInfo
is clear and follows the existing naming conventions.
199-205
: Parameter naming consistency
In the resolveField
function, the parameter modelOrTypeDef
is appropriately named to reflect its usage. The addition of the isTypeDef
parameter allows for flexibility in resolving fields from both models and type definitions.
packages/runtime/src/enhancements/node/default-auth.ts (4)
74-78
: LGTM!
The new conditional block correctly checks if the field is a type definition and invokes the setDefaultValueForTypeDefData
method to handle default values for the nested data structure.
Line range hint 118-203
: Excellent work on handling default values for type definitions!
The setDefaultValueForTypeDefData
method is well-structured and handles the recursive processing of nested data structures effectively. It correctly retrieves the type definition information using getTypeDefInfo
and iterates over the fields to set default values for type definitions.
The method also handles the case when the field itself is a type definition by recursively calling setDefaultValueForTypeDefData
.
Overall, this addition enhances the capability of the DefaultAuthHandler
to manage default values for complex, nested data structures defined as type definitions.
Line range hint 164-174
: LGTM!
The renaming of getDefaultValueFromAuth
to getDefaultValue
streamlines the method's purpose and aligns with the broader changes in this pull request. The logic within the method remains unchanged, ensuring that default values are retrieved based on the user context.
11-11
: Verify the import of getTypeDefInfo
from the correct module.
Ensure that the getTypeDefInfo
function is exported from the packages/runtime/src/cross/utils.ts
module. If not, update the import statement to reference the correct module.
✅ Verification successful
Since the output shows that getTypeDefInfo
is indeed exported as a function from packages/runtime/src/cross/utils.ts
, which matches exactly what the review comment was asking to verify, I can generate the final response.
getTypeDefInfo
import verification successful
The function is correctly exported from packages/runtime/src/cross/utils.ts
as expected.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Verify `getTypeDefInfo` is exported from `packages/runtime/src/cross/utils.ts`
# Test: Search for the export statement in `packages/runtime/src/cross/utils.ts`
# Expect: A match for the export statement
rg -F 'export function getTypeDefInfo' packages/runtime/src/cross/utils.ts
Length of output: 139
packages/sdk/src/model-meta-generator.ts (1)
69-74
: Ensure all calls to generate
function include the new typeDefs
parameter
The generate
function now accepts an additional parameter typeDefs: TypeDef[]
. Please verify that all invocations of generate
throughout the codebase have been updated to include this new parameter to prevent potential runtime errors.
Run the following script to find all calls to generate
and check if they include the typeDefs
parameter:
✅ Verification successful
Let me analyze the search results and generate additional verification scripts to check the usage of the generate
function specifically in the model-meta-generator context.
This will help us find the specific usages of the generate
function from model-meta-generator.ts rather than other generate functions in the codebase.
Let me search for the actual usage of the imported function to verify how it's being called.
All calls to generateModelMeta
have been updated with the required parameters
The verification shows that all usages of the function (imported as generateModelMeta
) are correctly passing all four required parameters:
- project
- models/dataModels
- typeDefs
- options
This is confirmed in all locations where it's used:
- packages/schema/src/plugins/enhancer/model-meta/index.ts
- packages/plugins/tanstack-query/src/generator.ts
- packages/plugins/swr/src/generator.ts
- packages/schema/src/plugins/enhancer/index.ts
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Verify that all calls to 'generate' include the 'typeDefs' parameter.
# Search for all invocations of 'generate(' in TypeScript files
rg --type ts --context 2 '\bgenerate\('
# Expected: All calls should include 'typeDefs' as the third argument.
Length of output: 40852
Script:
#!/bin/bash
# Search for imports of model-meta-generator to find where it's used
rg --type ts "from ['\"].*model-meta-generator['\"]"
# Search for any direct references to the generate function from model-meta-generator
ast-grep --pattern 'await generate($project, $models, $typeDefs, $options)'
Length of output: 234
Script:
#!/bin/bash
# Search for usages of generateModelMeta
rg --type ts -A 3 "generateModelMeta\("
# Also check for any direct imports and usages in the enhancer plugin which seems to be related
ast-grep --pattern 'import { generate } from "./model-meta-generator"'
Length of output: 1552
packages/schema/src/plugins/zod/generator.ts (6)
17-17
: LGTM!
The import statement is correct and necessary for the new functionality.
22-22
: LGTM!
The import statement is correct and necessary for the new functionality.
277-281
: LGTM!
The new code block correctly iterates over TypeDef
declarations in the model, skipping the excluded models, and generates schemas for each applicable TypeDef
using the new generateTypeDefSchema
method.
292-325
: LGTM!
The new generateTypeDefSchema
method is implemented correctly:
- It generates the schema name based on the
TypeDef
name. - It creates a new source file for the schema.
- It calls the
addPreludeAndImports
method to add the necessary prelude and import statements. - It generates the schema using
z.object
and iterates over the fields of theTypeDef
. - It applies the appropriate mode (strip, passthrough, or strict) based on the options.
- It returns the schema name.
393-393
: LGTM!
The updated generateModelSchema
method now calls the addPreludeAndImports
method with the model
and output
arguments, which is correct and necessary for the new functionality.
327-373
: Verify the handling of referenced types' schemas.
The updated addPreludeAndImports
method now handles importing schemas for referenced TypeDef
s. However, please ensure that the generated import statements for referenced types' schemas are correct and that the schemas for those types are generated properly.
Run the following script to verify the import statements and schema generation for referenced types:
✅ Verification successful
Based on the extensive codebase analysis, I can now generate the final response:
Referenced type schemas are properly handled
The implementation correctly handles referenced type schemas by:
- Detecting TypeDef references in fields using isTypeDef(field.type.reference?.ref)
- Generating proper import statements with correct casing using upperCaseFirst(refType)
- Generating schema files for TypeDefs through the generateTypeDefSchema method
- Using lazy evaluation with z.lazy() to handle circular dependencies
The code follows a consistent pattern across the codebase for handling TypeDefs and their schema generation, with proper integration into the Zod schema system.
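Why lazy evaluation matters here can be shown with a small plain-TypeScript stand-in for z.lazy (assumed Schema shape, not zod itself): wrapping a mutually referencing schema in a thunk defers its resolution until parse time, so declaration order no longer matters.

```typescript
// Plain-TypeScript stand-in for zod's z.lazy: the thunk is only invoked
// when parsing, after both schemas have been declared.
type Schema = { parse: (v: unknown) => boolean };

const lazy = (thunk: () => Schema): Schema => ({ parse: (v) => thunk().parse(v) });

// Two mutually referencing "schemas": the thunk breaks the circularity.
const addressSchema: Schema = {
    parse: (v: any) =>
        typeof v?.city === 'string' &&
        (v.resident === undefined || lazy(() => profileSchema).parse(v.resident)),
};

const profileSchema: Schema = {
    parse: (v: any) =>
        typeof v?.name === 'string' &&
        (v.address === undefined || lazy(() => addressSchema).parse(v.address)),
};

console.log(profileSchema.parse({ name: 'a', address: { city: 'x' } })); // true
```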
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Verify import statements and schema generation for referenced types.
# Test 1: Search for import statements of referenced types' schemas.
# Expect: Import statements for each unique referenced type.
rg -A 5 $'import \{ .+Schema \} from \'\.\/\w+\.schema\';'
# Test 2: Check if schemas are generated for referenced types.
# Expect: Schema files for each unique referenced type.
fd -e ts -x rg -q $'export const \w+Schema'
Length of output: 250
Script:
#!/bin/bash
# Let's try a different approach to verify the schema generation and imports
# Search for schema imports with proper escaping
rg "import \{ \w+Schema \} from '\./\w+\.schema'"
# Search for schema exports to verify generation
rg "export const \w+Schema = z\."
# Look for TypeDef references in the codebase
ast-grep --pattern 'isTypeDef($reference)'
# Check actual schema files
fd "\.schema\.ts$"
Length of output: 437
Script:
#!/bin/bash
# Let's try one more verification focusing on the TypeDef handling
# Look for TypeDef references in the code
ast-grep --pattern 'field.type.reference?.ref && isTypeDef(field.type.reference?.ref)'
# Search for schema file generation patterns
rg -A 3 "generateSchemaFile|writeSchemaFile"
# Check the actual implementation of schema generation for TypeDefs
ast-grep --pattern 'class $_ implements Generator {
$$$
generateTypeDefSchema($_) {
$$$
}
}'
# Look for any TypeDef-related code
rg "TypeDef" -A 3
Length of output: 51703
packages/schema/src/plugins/enhancer/enhance/index.ts (2)
109-110
: Ensure consistent import and function creation based on prismaTypesFixed
The conditional logic uses prismaTypesFixed
to decide between creating logical or simple Prisma imports and enhancement functions. Verify that both createLogicalPrismaImports
and createSimplePrismaImports
properly handle their respective scenarios and that there are no discrepancies in the imports or function signatures that might cause runtime errors.
Also applies to: 119-119
79-84
: Verify the compatibility of the export type * from
syntax
The code uses export type * from '${resultPrismaImport}';
to re-export all types from the specified module in TypeScript. This syntax requires TypeScript 4.5 or newer. Ensure that the project's TypeScript configuration is set to use version 4.5 or above to avoid compilation errors.
You can check the TypeScript version used in the project by reviewing the tsconfig.json
file or running:
✅ Verification successful
TypeScript version requirement is satisfied
The project uses TypeScript ^5.4.4 which fully supports the export type * from
syntax, as it was introduced in TypeScript 4.5. No compatibility issues are expected.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
# Check the TypeScript version specified in package.json
grep '"typescript":' package.json
# Alternatively, check the installed TypeScript compiler version
tsc --version
Length of output: 153
packages/schema/src/plugins/zod/transformer.ts (13)

3-3: Import `isTypeDef` to enhance type handling

The addition of `isTypeDef` from `@zenstackhq/sdk/ast` correctly supports the handling of type definitions within the transformer.

91-95: Include `extraImports` in generated content

The concatenation of `extraImports` with `objectSchema` ensures that necessary imports for type definitions are included in the generated schema files. This change enhances the modularity and correctness of the generated code.

100-100: Verify the accuracy of the updated regular expression

The regular expression in `createUpdateInputRegex` now accommodates `CreateMany` and `UpdateMany` operations. Ensure that this pattern accurately matches all intended input types without inadvertently capturing unintended ones.

104-105: Initialize `contextDataModel` and `extraImports` appropriately

Initializing `contextDataModel` as `DataModel | undefined` and `extraImports` as a string array ensures that they are correctly defined before use, preventing potential null or undefined reference errors.

116-125: Exclude discriminator fields from create/update schemas

The logic correctly identifies and filters out discriminator fields from create/update input schemas based on the `contextDataModel`. This ensures that discriminator fields are not inadvertently modified, preserving the integrity of the data model.
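The discriminator-exclusion step can be sketched in isolation. Note the `Field` shape and `isDiscriminator` flag below are made-up stand-ins for the actual AST types, not ZenStack's API:

```typescript
// Hypothetical sketch: drop discriminator fields before building
// create/update input schemas, since they must not be user-modifiable.
interface Field {
    name: string;
    isDiscriminator?: boolean;
}

function fieldsForCreateUpdate(fields: Field[]): Field[] {
    return fields.filter((f) => !f.isDiscriminator);
}

const fields: Field[] = [
    { name: 'id' },
    { name: 'kind', isDiscriminator: true }, // discriminator: excluded
    { name: 'title' },
];

console.log(fieldsForCreateUpdate(fields).map((f) => f.name)); // keeps only non-discriminators
```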
127-134: Import schemas for type definitions used in the model

The code efficiently imports schemas for type definitions by iterating over `typeDefFields`. This ensures that all necessary type schemas are available in the generated code.

139-141: Pass the `replaceJsonWithTypeDef` parameter accurately

Passing `replaceJsonWithTypeDef` as `!!createUpdateMatch` to `generateObjectSchemaField` aligns with the intended behavior of replacing JSON types with type definitions in create/update operations.

152-152: Return both `schemaFields` and `extraImports`

Returning `extraImports` along with `schemaFields` from `generateObjectSchemaFields` allows for the inclusion of necessary imports in the generated schema files, enhancing modularity and maintainability.

157-159: Extend `generateObjectSchemaField` with additional parameters

Adding `contextDataModel` and `replaceJsonWithTypeDef` as parameters to `generateObjectSchemaField` enhances its flexibility and allows for nuanced handling of field types, especially concerning type definitions.

170-174: Handle type definitions in input types

The logic correctly checks if a field's type is a type definition and, if so, replaces JSON schemas with the appropriate type definition schemas. This ensures that type definitions are accurately represented in the generated schemas.

189-211: Streamline Zod validators for basic types

The updated handling of basic types like `String`, `Int`, `Float`, `Decimal`, `BigInt`, `Boolean`, `DateTime`, and `Json` uses consistent patterns for wrapping with Zod validators. This enhances readability and maintainability.

220-230: Import necessary schemas for complex types

By adding schema imports for complex types and enums, the code ensures that all dependencies are resolved and the generated schemas are complete and functional.
108-111: Ensure `modelName` is defined before use

In line 111, `modelName` is used to find the `contextDataModel`, but `modelName` may not be defined in the current scope. Please verify that `modelName` is properly extracted from `createUpdateMatch` before being used.

Run the following script to confirm that `modelName` is defined:
private validateField(field: TypeDefField, accept: ValidationAcceptor): void {
    field.attributes.forEach((attr) => validateAttributeApplication(attr, accept));
}
Enhance field-level validations.

The current field validation only checks attributes. Consider adding validations for:

- Field name conventions and reserved words
- Field type validity (supported types, type references)
- Field modifiers and constraints
- Field-level documentation requirements

Example enhancement:

 private validateField(field: TypeDefField, accept: ValidationAcceptor): void {
     field.attributes.forEach((attr) => validateAttributeApplication(attr, accept));
+    this.validateFieldName(field, accept);
+    this.validateFieldType(field, accept);
+    this.validateFieldModifiers(field, accept);
 }
Committable suggestion skipped: line range outside the PR's diff.
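A minimal, self-contained sketch of what the suggested `validateFieldName` check could look like. The reserved-word list, identifier pattern, and return shape here are assumptions for illustration, not ZenStack's actual validation API:

```typescript
// Hypothetical field-name validation pass; the reserved-word list and
// error format are invented for this sketch.
const RESERVED_WORDS = new Set(['model', 'type', 'enum', 'attribute']);

interface ValidationIssue {
    field: string;
    message: string;
}

function validateFieldName(name: string): ValidationIssue | undefined {
    if (RESERVED_WORDS.has(name)) {
        return { field: name, message: `"${name}" is a reserved word` };
    }
    // require a plain identifier: letter/underscore, then letters/digits/underscores
    if (!/^[a-zA-Z_][a-zA-Z0-9_]*$/.test(name)) {
        return { field: name, message: `"${name}" is not a valid identifier` };
    }
    return undefined;
}
```

In the real validator this would report through the `ValidationAcceptor` rather than returning an issue object.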
it('requires field to have @json attribute', async () => {
    await expect(
        loadModelWithError(
            `
            type Profile {
                age Int @gt(0)
            }

            model User {
                id Int @id @default(autoincrement())
                profile Profile
                @@allow('all', true)
            }
            `
        )
    ).resolves.toContain('Custom-typed field must have @json attribute');
});
🛠️ Refactor suggestion

Add positive test cases to validate successful scenarios.

The current tests only cover error cases. Consider adding positive test cases to verify:

- Successful model creation with a valid `@json` attribute
- Proper type validation of JSON data at runtime
- Successful nested type definitions

Example positive test case:

it('successfully validates JSON field with correct type', async () => {
    const model = await loadModel(`
        type Profile {
            age Int @gt(0)
        }
        model User {
            id Int @id @default(autoincrement())
            profile Profile @json
            @@allow('all', true)
        }
    `);
    expect(model).toBeDefined();
});
/**
 * Gets info for a type def.
 */
export function getTypeDefInfo<Throw extends boolean = false>(
    modelMeta: ModelMeta,
    typeDef: string,
    throwIfNotFound: Throw = false as Throw
): Throw extends true ? TypeDefInfo : TypeDefInfo | undefined {
    const info = modelMeta.typeDefs?.[lowerCaseFirst(typeDef)];
    if (!info && throwIfNotFound) {
        throw new Error(`Unable to load info for ${typeDef}`);
    }
    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    return info as any;
}
🛠️ Refactor suggestion

Remove the `any` type assertion and consider code reuse.

Two concerns with the current implementation:

1. The `any` type assertion should be avoided. Consider using proper typing:

-// eslint-disable-next-line @typescript-eslint/no-explicit-any
-return info as any;
+return info;

2. This function closely mirrors `getModelInfo`. Consider extracting the common pattern to reduce duplication:

function getInfo<T, Throw extends boolean = false>(
    collection: Record<string, T> | undefined,
    key: string,
    entityType: string,
    throwIfNotFound: Throw = false as Throw
): Throw extends true ? T : T | undefined {
    const info = collection?.[lowerCaseFirst(key)];
    if (!info && throwIfNotFound) {
        throw new Error(`Unable to load info for ${entityType} ${key}`);
    }
    return info as any;
}

export function getModelInfo<Throw extends boolean = false>(
    modelMeta: ModelMeta,
    model: string,
    throwIfNotFound: Throw = false as Throw
): Throw extends true ? ModelInfo : ModelInfo | undefined {
    return getInfo(modelMeta.models, model, 'model', throwIfNotFound);
}

export function getTypeDefInfo<Throw extends boolean = false>(
    modelMeta: ModelMeta,
    typeDef: string,
    throwIfNotFound: Throw = false as Throw
): Throw extends true ? TypeDefInfo : TypeDefInfo | undefined {
    return getInfo(modelMeta.typeDefs, typeDef, 'type', throwIfNotFound);
}
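The `Throw extends true ? T : T | undefined` pattern these helpers rely on can be illustrated standalone. The `Registry` type and `getEntry` function below are invented for the example:

```typescript
// Demonstration of a conditional return type driven by a boolean type
// parameter: callers that pass `true` get a non-optional result type.
type Registry = Record<string, { name: string }>;

function getEntry<Throw extends boolean = false>(
    registry: Registry,
    key: string,
    throwIfNotFound: Throw = false as Throw
): Throw extends true ? { name: string } : { name: string } | undefined {
    const entry = registry[key];
    if (!entry && throwIfNotFound) {
        throw new Error(`Unable to load entry for ${key}`);
    }
    // the conditional return type cannot be proven by the compiler here
    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    return entry as any;
}

const reg: Registry = { user: { name: 'User' } };
const found = getEntry(reg, 'user', true); // typed as { name: string }
const maybe = getEntry(reg, 'missing');    // typed as { name: string } | undefined
console.log(found.name, maybe);
```

The `as any` inside the helper is the usual escape hatch for this pattern: TypeScript cannot narrow a conditional type on a generic parameter within the function body.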
it('works with simple field', async () => {
    await loadSchema(
        `
        type Profile {
            age Int @gt(0)
        }

        model User {
            id Int @id @default(autoincrement())
            profile Profile @json
            posts Post[]
            @@allow('all', true)
        }

        model Post {
            id Int @id @default(autoincrement())
            title String
            user User @relation(fields: [userId], references: [id])
            userId Int
        }
        `,
        {
            provider: 'postgresql',
            pushDb: false,
            compile: true,
            extraSourceFiles: [
                {
                    name: 'main.ts',
                    content: `
                    import { enhance } from '.zenstack/enhance';
                    import { PrismaClient } from '@prisma/client';
                    const prisma = new PrismaClient();
                    const db = enhance(prisma);

                    async function main() {
                        const u = await db.user.create({ data: { profile: { age: 18 }, posts: { create: { title: 'Post1' }} } });
                        console.log(u.profile.age);
                        const u1 = await db.user.findUnique({ where: { id: u.id } });
                        console.log(u1?.profile.age);
                        const u2 = await db.user.findMany({include: { posts: true }});
                        console.log(u2[0].profile.age);
                    }
                    `,
                },
            ],
        }
    );
});
🛠️ Refactor suggestion

Enhance test coverage and replace console.log with assertions.

While the basic functionality is tested, consider these improvements:

- Add negative test cases for age validation (e.g., age <= 0)
- Add error handling tests
- Replace console.log with proper assertions

Here's an example improvement:

 async function main() {
     const u = await db.user.create({ data: { profile: { age: 18 }, posts: { create: { title: 'Post1' }} } });
-    console.log(u.profile.age);
+    expect(u.profile.age).toBe(18);
     const u1 = await db.user.findUnique({ where: { id: u.id } });
-    console.log(u1?.profile.age);
+    expect(u1?.profile.age).toBe(18);
     const u2 = await db.user.findMany({include: { posts: true }});
-    console.log(u2[0].profile.age);
+    expect(u2[0].profile.age).toBe(18);
+
+    // Add negative test
+    await expect(
+        db.user.create({ data: { profile: { age: 0 }, posts: { create: { title: 'Post2' }} } })
+    ).rejects.toThrow();
 }
Committable suggestion skipped: line range outside the PR's diff.
it('works with optional field', async () => {
    await loadSchema(
        `
        type Profile {
            age Int @gt(0)
        }

        model User {
            id Int @id @default(autoincrement())
            profile Profile? @json
            @@allow('all', true)
        }
        `,
        {
            provider: 'postgresql',
            pushDb: false,
            compile: true,
            extraSourceFiles: [
                {
                    name: 'main.ts',
                    content: `
                    import { enhance } from '.zenstack/enhance';
                    import { PrismaClient } from '@prisma/client';
                    const prisma = new PrismaClient();
                    const db = enhance(prisma);

                    async function main() {
                        const u = await db.user.create({ data: { profile: { age: 18 } } });
                        console.log(u.profile?.age);
                        const u1 = await db.user.findUnique({ where: { id: u.id } });
                        console.log(u1?.profile?.age);
                        const u2 = await db.user.findMany();
                        console.log(u2[0].profile?.age);
                    }
                    `,
                },
            ],
        }
    );
});
🛠️ Refactor suggestion

Add null/undefined test cases and proper assertions.

The optional field test should explicitly verify null/undefined behavior.

Here's an example improvement:

 async function main() {
     const u = await db.user.create({ data: { profile: { age: 18 } } });
-    console.log(u.profile?.age);
+    expect(u.profile?.age).toBe(18);
     const u1 = await db.user.findUnique({ where: { id: u.id } });
-    console.log(u1?.profile?.age);
+    expect(u1?.profile?.age).toBe(18);
     const u2 = await db.user.findMany();
-    console.log(u2[0].profile?.age);
+    expect(u2[0].profile?.age).toBe(18);
+
+    // Test null case
+    const u3 = await db.user.create({ data: {} });
+    expect(u3.profile).toBeNull();
 }
Committable suggestion skipped: line range outside the PR's diff.
} else if (fieldInfo.type === 'DateTime' && typeof value === 'string') {
    // convert to Date
    const parsed = Date.parse(value);
    if (!isNaN(parsed)) {
Use `Number.isNaN` instead of global `isNaN` for safer type checking

Using `isNaN` can lead to unexpected results because it performs type coercion. `Number.isNaN` is more reliable since it doesn't coerce the value before checking if it's `NaN`.

Apply this diff to fix the issue:

-if (!isNaN(parsed)) {
+if (!Number.isNaN(parsed)) {
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

if (!Number.isNaN(parsed)) {
🧰 Tools
🪛 Biome
[error] 85-85: isNaN is unsafe. It attempts a type coercion. Use Number.isNaN instead.
See the MDN documentation for more details.
Unsafe fix: Use Number.isNaN instead.
(lint/suspicious/noGlobalIsNan)
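The coercion difference the lint rule warns about can be seen directly; `Date.parse` returns `NaN` for unparseable input, and only `Number.isNaN` checks the value without first coercing it:

```typescript
// Global isNaN coerces its argument to a number before testing;
// Number.isNaN tests the value as-is.
const parsed = Date.parse('not-a-date'); // NaN for unparseable input
console.log(Number.isNaN(parsed)); // true: the parse genuinely failed

// A non-number string coerces to NaN under global isNaN, which can mislead:
console.log(isNaN('hello' as unknown as number));        // true (coerced)
console.log(Number.isNaN('hello' as unknown as number)); // false (no coercion)
```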
// eslint-disable-next-line @typescript-eslint/no-explicit-any
private fixJsonDateFields(entityData: any, typeDef: string) {
    if (typeof entityData !== 'object' && !Array.isArray(entityData)) {
🛠️ Refactor suggestion

Include a null check for `entityData` to prevent runtime errors

Since `typeof null` returns `'object'`, the current condition does not exclude `null` values. This could lead to errors when `entityData` is `null`. Adding an explicit null check ensures that `null` values are handled appropriately.

Apply this diff to update the condition:

-if (typeof entityData !== 'object' && !Array.isArray(entityData)) {
+if (!entityData || (typeof entityData !== 'object' && !Array.isArray(entityData))) {
Committable suggestion skipped: line range outside the PR's diff.
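The `typeof null === 'object'` quirk motivating the null check can be demonstrated with a small stand-alone guard; the `isTraversable` helper below is hypothetical, not part of the PR:

```typescript
// typeof null is 'object', so a bare typeof check does not exclude null.
console.log(typeof null); // 'object'

// Hypothetical guard mirroring the suggested fix: traverse only real
// objects and arrays, never null or primitives.
function isTraversable(value: unknown): boolean {
    return !!value && typeof value === 'object'; // arrays are typeof 'object' too
}

console.log(isTraversable({ a: 1 })); // true
console.log(isTraversable([1, 2]));   // true
console.log(isTraversable(null));     // false
console.log(isTraversable('str'));    // false
```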
function writeTypeDefs(
    sourceFile: SourceFile,
    writer: CodeBlockWriter,
    typedDefs: TypeDef[],
    options: ModelMetaGeneratorOptions
) {
🛠️ Refactor suggestion

Inconsistent parameter naming: `typedDefs` vs `typeDefs`

The parameter is named `typedDefs`, but elsewhere in the code, `typeDefs` is used. To maintain consistency and avoid confusion, consider renaming `typedDefs` to `typeDefs`.

Apply this diff to update the parameter name and its usages:

 function writeTypeDefs(
     sourceFile: SourceFile,
     writer: CodeBlockWriter,
-    typedDefs: TypeDef[],
+    typeDefs: TypeDef[],
     options: ModelMetaGeneratorOptions
 ) {
-    if (typedDefs.length === 0) {
+    if (typeDefs.length === 0) {
         return;
     }
     writer.write('typeDefs:');
     writer.block(() => {
-        for (const typeDef of typedDefs) {
+        for (const typeDef of typeDefs) {
📝 Committable suggestion

function writeTypeDefs(
    sourceFile: SourceFile,
    writer: CodeBlockWriter,
    typeDefs: TypeDef[],
    options: ModelMetaGeneratorOptions
) {
    source = this.fixJsonFieldType(typeAlias, source);

    structure.type = source;
    return structure;
}

private fixJsonFieldType(typeAlias: TypeAliasDeclaration, source: string) {
    const modelsWithTypeField = this.model.declarations.filter(
        (d): d is DataModel => isDataModel(d) && d.fields.some((f) => isTypeDef(f.type.reference?.ref))
    );
    const typeName = typeAlias.getName();

    const getTypedJsonFields = (model: DataModel) => {
        return model.fields.filter((f) => isTypeDef(f.type.reference?.ref));
    };

    const replacePrismaJson = (source: string, field: DataModelField) => {
        return source.replace(
            new RegExp(`(${field.name}\\??\\s*):[^\\n]+`),
            `$1: ${field.type.reference!.$refText}${field.type.array ? '[]' : ''}${
                field.type.optional ? ' | null' : ''
            }`
        );
    };

    // fix "$[Model]Payload" type
    const payloadModelMatch = modelsWithTypeField.find((m) => `$${m.name}Payload` === typeName);
    if (payloadModelMatch) {
        const scalars = typeAlias
            .getDescendantsOfKind(SyntaxKind.PropertySignature)
            .find((p) => p.getName() === 'scalars');
        if (!scalars) {
            return source;
        }

        const fieldsToFix = getTypedJsonFields(payloadModelMatch);
        for (const field of fieldsToFix) {
            source = replacePrismaJson(source, field);
        }
    }

    // fix input/output types, "[Model]CreateInput", etc.
    const inputOutputModelMatch = modelsWithTypeField.find((m) => typeName.startsWith(m.name));
    if (inputOutputModelMatch) {
        const relevantTypePatterns = [
            'GroupByOutputType',
            '(Unchecked)?Create(\\S+?)?Input',
            '(Unchecked)?Update(\\S+?)?Input',
            'CreateManyInput',
            '(Unchecked)?UpdateMany(Mutation)?Input',
        ];
        const typeRegex = modelsWithTypeField.map(
            (m) => new RegExp(`^(${m.name})(${relevantTypePatterns.join('|')})$`)
        );
        if (typeRegex.some((r) => r.test(typeName))) {
            const fieldsToFix = getTypedJsonFields(inputOutputModelMatch);
            for (const field of fieldsToFix) {
                source = replacePrismaJson(source, field);
            }
        }
    }

    return source;
}
Safely handle dynamic regular expressions to prevent potential vulnerabilities

The `fixJsonFieldType` method constructs regular expressions dynamically using model and field names (e.g., lines 526 and 560). If model or field names contain special regex characters, this could lead to unintended matches or security issues like Regular Expression Denial of Service (ReDoS). Ensure that model and field names are properly escaped when used in regular expressions.

Apply this diff to escape special characters in model and field names:

+import escapeStringRegexp from 'escape-string-regexp';
 ...
-new RegExp(`(${field.name}\\??\\s*):[^\\n]+`),
+new RegExp(`(${escapeStringRegexp(field.name)}\\??\\s*):[^\\n]+`),
 ...
-new RegExp(`^(${m.name})(${relevantTypePatterns.join('|')})$`)
+new RegExp(`^(${escapeStringRegexp(m.name)})(${relevantTypePatterns.join('|')})$`)

Don't forget to install the `escape-string-regexp` package if it's not already a dependency:

npm install escape-string-regexp

This ensures that any special characters in model or field names are treated literally in the regular expressions.
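The over-matching risk can be reproduced with a minimal local escape helper (a stand-in for the `escape-string-regexp` package, written inline so the example is self-contained):

```typescript
// Minimal regex-escape helper: backslash-escape every regex metacharacter.
function escapeRegExp(s: string): string {
    return s.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
}

// A name containing a regex metacharacter:
const fieldName = 'a.b';
const unsafe = new RegExp(`^${fieldName}$`);
const safe = new RegExp(`^${escapeRegExp(fieldName)}$`);

console.log(unsafe.test('aXb')); // true: the unescaped '.' matches any character
console.log(safe.test('aXb'));   // false: the escaped '.' matches only a literal dot
console.log(safe.test('a.b'));   // true
```

Identifier-only names in ZModel likely avoid metacharacters in practice, but escaping makes the intent explicit and the code robust.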
private async generateExtraTypes(sf: SourceFile) {
    for (const decl of this.model.declarations) {
        if (isTypeDef(decl)) {
            generateTypeDefType(sf, decl);
        }
    }
}
Handle asynchronous operations correctly in `generateExtraTypes`

The method `generateExtraTypes` is declared as `async` but does not contain any `await` statements within it. If `generateTypeDefType` is an asynchronous function, ensure that it is awaited. If not, consider removing the `async` keyword from the method declaration to prevent confusion.

Apply this diff if `generateTypeDefType` is synchronous:

-private async generateExtraTypes(sf: SourceFile) {
+private generateExtraTypes(sf: SourceFile) {

Alternatively, if `generateTypeDefType` is asynchronous:

 for (const decl of this.model.declarations) {
     if (isTypeDef(decl)) {
-        generateTypeDefType(sf, decl);
+        await generateTypeDefType(sf, decl);
     }
 }
Committable suggestion skipped: line range outside the PR's diff.
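The underlying point — an `async` function returns a `Promise` even when its body is fully synchronous — can be checked directly with a toy function (invented for this illustration):

```typescript
// An async function always wraps its return value in a Promise, even with
// no await inside; callers that forget to await get a Promise, not the value.
async function syncLookingWork(): Promise<number> {
    return 42; // fully synchronous body
}

const result = syncLookingWork();
console.log(result instanceof Promise); // true
result.then((v) => console.log(v));     // 42
```

This is why keeping a misleading `async` keyword on a synchronous method is worth cleaning up: it changes the return type contract for every caller.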
Initial implementation of #784 and #783

- `type` declaration
- `@email`
- `@default` (including using `auth()` in it)

Out of scope:

- `@@validate`