Complex OpenAPI Validations with @property
#1624
Comments
+1 for allowing models to provide additional validation rules. I don't think this is as simple as it might look, though. IMO, we need to find a (generic) solution that will support other transports than just REST/OpenAPI, for example gRPC and GraphQL, and ideally allow connectors to set up some of these restrictions at the database level too. See also #699
IMO, this is beyond 4.0 GA scope.
Well ... shouldn't it be up to the extension implementing gRPC / GraphQL to determine what it wants to do with the metadata? I don't think it's feasible for us to have a common way of supporting all of these protocols. This issue came from me working on …
First of all, I see the pain point you have described and agree LB4 should provide an easy solution for that.

My objection is against adding REST-specific validations to the top-level property definition. For example, gRPC uses Protocol Buffers to specify the transport format; translating validations specified as JSON Schema to Protocol Buffers may be difficult, if possible at all (Protocol Buffers have very minimal validation rules).

I would like to propose to define an extension format allowing property definitions to carry additional, transport-specific metadata. Then we can define a convention for leveraging the extension format to provide additional JSON-Schema validations in the property definition; and finally modify the code translating LB models to JSON Schema to take these extensions into account.

An example showing a property definition using such an extension:

```js
{
  type: 'string',
  minLength: 10,
  // the LDL extension: jsonSchema
  jsonSchema: {
    // JSON-Schema constraints follow
    format: 'email'
  }
}
```

Optionally, we can also review OpenAPI constraints, pick those that seem to be "universal", promote them to top-level LDL (LoopBack definition language) and implement support for them in juggler/connectors/transports. For example, here is how an e-mail validation can be implemented as a juggler model validation today:

```js
function emailValidator(err, done) {
  var value = this.email;
  if (value == null)
    return;
  if (typeof value !== 'string')
    return err('string');
  if (value === '') return;
  if (!isEmail.validate(value))
    return err('email');
}

UserModel.validate('email', emailValidator, {
  message: g.f('Must provide a valid email'),
});
```

As I am thinking about this more, my conclusion is that for a validation rule to be eligible for inclusion in the top-level property definition (in LDL), we must have the validation implemented at the juggler level, because that's the place all data operations must go through. If a validation is not implemented at the transport level (e.g. because gRPC/Protocol Buffers do not allow it), then it's ok, because the values will still be validated by repository (juggler) operations. On the other hand, if a validation is implemented at the transport level only, then we have a problem - code working with models/repositories directly via the TS/JS API is effectively bypassing such validation rules! For example, test code or a database seeding script can easily create invalid model instances.

In that light, I am actually not sure if JSON-Schema validations applied at the REST transport level are the proper solution for avoiding a lot of custom validation on a per-route basis.

Thoughts?

@raymondfeng Please join our discussion, what's your opinion on this matter?
Hmm, I see the point you're making. I'm ok if we did the validation in juggler, as long as we do it somewhere. That said, I think my biggest concern is some inconsistency within the framework right now. IMO the decorators should support storing the OpenAPI validations and allow other layers of the framework to leverage the metadata and decide whether they wish to support the validation or not. I get the concern about allowing transport-specific validations / juggler validations ... and for that I would propose the following:
@bajtos If my understanding of what you're proposing is correct, that means there would be different extensions to the LDL which would do the validation at the ORM level regardless of the transport. Is that correct? Because if that's what you're proposing, that sounds like a good idea 👍. Right now I am going to take a leap of faith/hope, include the validation in the specification, do the validation in the frontend, and hope that this syntax will get to be supported later on in LoopBack 4 itself through some extension that one needs to enable. Given proper instruction I would be very open to helping contribute to such a solution.
Yes, that's what I would like to see!
I think this should be relatively straightforward to implement. The package repository-json-schema is responsible for converting model definitions in LoopBack format into JSON Schema documents; the function metaToJsonProperty builds the JSON Schema definition for a single model property. Now if my understanding of this part of our codebase is correct, then we simply need to add a few lines of code copying the extension data into the generated schema, e.g. `Object.assign(propDef, meta.jsonSchema);`. Besides this tiny code change, the pull request needs to include tests and documentation. @David-Mulder I am happy to help you along the way if you decide to contribute this feature. See our CONTRIBUTING guide to get started.
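A minimal sketch of what that change could look like, using a simplified stand-in for metaToJsonProperty (the real function in repository-json-schema has a different signature and handles many more cases):

```ts
// Simplified stand-in for the conversion step in repository-json-schema;
// names and shapes here are assumptions, not the actual implementation.
interface PropertyMeta {
  type: string;
  jsonSchema?: {[keyword: string]: unknown}; // the proposed extension
}

function metaToJsonProperty(meta: PropertyMeta): {[key: string]: unknown} {
  // Existing behaviour (simplified): map the LoopBack type to a JSON Schema type.
  const propDef: {[key: string]: unknown} = {type: meta.type};

  // Proposed addition: copy constraints declared under `jsonSchema`
  // (e.g. minLength, format) straight into the generated schema.
  if (meta.jsonSchema) {
    Object.assign(propDef, meta.jsonSchema);
  }

  return propDef;
}

// Produces {type: 'string', minLength: 10, format: 'email'}
console.log(metaToJsonProperty({
  type: 'string',
  jsonSchema: {minLength: 10, format: 'email'},
}));
```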
@bajtos Well, I added that "hack" in myself already, yeah (just because of its temporary nature), plus some code in the … Anyway, even once it becomes possible to develop validation extensions, it will be necessary to allow different options for JSON Schemas, as every implementation is different. In my case I am for example using …
Hmm, this is tricky. To allow a validator to consume custom PropertyDefinition properties, we would need a different way of converting model/property definitions into OpenAPI Schema. As for plugging in custom validations at the REST API layer, I think we haven't fully considered this aspect yet. I think the easiest option is to leverage Dependency Injection and allow applications to plug in a custom validator. Example usage:

```ts
app.bind(RestBindings.VALIDATOR).to(CustomParameterValidator);
```

The entry point where the pluggable validator would be used instead of the current hard-wired implementation is the request validation code in the REST layer - we are using AJV there under the hood. Obviously, this is all about validation at the REST API level. Customizing validations at the juggler/repository level is a whole new story.
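To sketch what that could look like - note that RestBindings.VALIDATOR and the validator interface below are hypothetical, taken from the proposal above rather than an existing LoopBack API:

```ts
// Hypothetical extension point; the interface and binding key are assumptions
// based on the proposal above, not an existing LoopBack API.
interface ParameterValidator {
  // Throws if `value` does not satisfy `schema`.
  validate(schema: object, value: unknown): void;
}

class CustomParameterValidator implements ParameterValidator {
  validate(schema: object, value: unknown): void {
    // Any engine could be plugged in here (AJV, joi, hand-written rules, ...).
    // This toy check only enforces `minLength` on string values.
    const {minLength} = schema as {minLength?: number};
    if (typeof value === 'string' && minLength != null && value.length < minLength) {
      throw new Error(`Value must be at least ${minLength} characters long`);
    }
  }
}

// Registration would then look roughly like the snippet above, e.g.:
// app.bind(RestBindings.VALIDATOR).toClass(CustomParameterValidator);
```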
Currently, what is the best approach for validating something like an email address?
Validating email addresses is tricky. We have had good experience with the isemail module, and I would recommend it for the actual validation. I don't have a good answer for how to wire up a custom validation in LB4 right now. I think we will need to improve LoopBack first and define an extension point allowing applications to register custom validations; see the discussion above for a few ideas.
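For the validation itself, isemail is easy to call directly; a minimal sketch (standalone, not wired into any LoopBack extension point, since that part is still open):

```ts
// Standalone email check using the isemail package.
import * as isEmail from 'isemail';

function assertValidEmail(value: string): void {
  if (!isEmail.validate(value)) {
    throw new Error(`"${value}" is not a valid email address`);
  }
}

assertValidEmail('jane.doe@example.com'); // passes
// assertValidEmail('not-an-email');      // would throw
```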
Any updates on that? I really appreciate the idea of maintaining all the validation at the ORM level; that makes it easier to write integration tests that assert the expected values in the database. It helps a lot in cases where people write ETL scripts to extract data from CSV files or other sources and persist that data using the LoopBack repository API. 😀
Or what's the best way of validating a model that isn't coming in through a request body endpoint? Using the isemail module is fine for email, but a generic validator would be much more helpful, no?
From a discussion with @raymondfeng and @bajtos: another way to do validation is through interceptors.
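A rough sketch of that approach using LoopBack 4's interceptor API - the argument position, regex, and error type are illustrative choices, not from this thread:

```ts
// Illustrative validating interceptor; assumes @loopback/core and
// @loopback/rest. Which argument holds the body is an assumption.
import {intercept, Interceptor, InvocationContext} from '@loopback/core';
import {HttpErrors} from '@loopback/rest';

const validateEmail: Interceptor = async (invocationCtx: InvocationContext, next) => {
  // Assume the first method argument is the request body with an `email` field.
  const body = invocationCtx.args[0] as {email?: string};
  if (!body || !body.email || !/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(body.email)) {
    throw new HttpErrors.UnprocessableEntity('Must provide a valid email');
  }
  return next();
};

// Applied to a controller method, e.g.:
// class UserController {
//   @intercept(validateEmail)
//   @post('/users')
//   async create(@requestBody() user: User) { /* ... */ }
// }
```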
@dhmlau for us, the payload of our endpoint is encoded, which would make the built-in AJV validation difficult. Instead, we did exactly what you recommended - we created a method decorator that runs the AJV validation for us, with defaults similar to the OpenAPI validation, and throws an error if the payload isn't valid.
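That comment doesn't include code, but such a decorator could look roughly like this - the decorator name, schema, and error type are made up for illustration:

```ts
// Hypothetical method decorator validating the first argument against a
// JSON Schema with AJV before invoking the original method.
import Ajv from 'ajv';
import {HttpErrors} from '@loopback/rest';

const ajv = new Ajv({allErrors: true});

function validateWith(schema: object) {
  const validate = ajv.compile(schema);
  return function (
    target: object,
    propertyKey: string,
    descriptor: PropertyDescriptor,
  ) {
    const original = descriptor.value;
    // Wrap the method: reject the call if the first argument fails validation.
    descriptor.value = function (...args: unknown[]) {
      if (!validate(args[0])) {
        throw new HttpErrors.UnprocessableEntity(ajv.errorsText(validate.errors));
      }
      return original.apply(this, args);
    };
  };
}

// Example usage (the schema mirrors what OpenAPI validation would enforce):
// class UserController {
//   @validateWith({
//     type: 'object',
//     required: ['email'],
//     properties: {email: {type: 'string', minLength: 10}},
//   })
//   async create(user: object) { /* ... */ }
// }
```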
Per the above comment, closing this issue as done.
Description / Steps to reproduce / Feature proposal
LoopBack's `@requestBody` decorator is supposed to be capable of validating complex OpenAPI Schemas. This only works when such a schema is passed to the spec via the `@model` decorator (top-down approach) ... vs. the more common route / CLI-recommended path of using the `@property` decorator.

If you add validation properties such as `minLength`, `format`, etc. to a property using the `@property` decorator, the validation properties don't even show up in `/openapi.json` and as such are never validated against the `requestBody`, since those validations aren't a part of the schema. (A minimal example model is shown at the end of this issue.)

Current Behavior
Validation properties are ignored when they are added via the `@property` decorator.

Expected Behavior

Validation properties should be honoured when they are added via the `@property` decorator.

Acceptance Criteria
- The `@property()` decorator should accept additional validation properties as per the OpenAPI Spec and add them to the schema of the Model (they should show up in `/openapi.json`) - see Refactor metaToJsonProperty to accept AJV keywords #2685
- `@requestBody()` should honour validation properties added via the `@property` decorator.
- `@model()` should also work in preserving the validation properties.

See Reporting Issues for more tips on writing good issues
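To illustrate, a minimal model showing the kind of property definition this issue is about (the model and field names are just an example):

```ts
// Example model; `minLength` and `format` are the validation keywords that
// currently never make it into /openapi.json when declared this way.
import {Entity, model, property} from '@loopback/repository';

@model()
export class Customer extends Entity {
  @property({
    type: 'string',
    required: true,
    // Desired behaviour: these constraints should appear in the generated
    // schema and be enforced by @requestBody() validation.
    minLength: 10,
    format: 'email',
  })
  email: string;

  constructor(data?: Partial<Customer>) {
    super(data);
  }
}
```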