Reimplement progressive type checking in Remap #6507
So, as mentioned on Discord, in general I agree with the gist of this issue, which is also why I brought this to your attention. If we do this (and again, I'm in favour), then the question becomes: how do we get there technically? I think I have a way forward that gives us the best of both worlds:
Let's take an example to demonstrate this:

Maybe Fallible

Given this program:

```
ok, err = to_string(.foo)
upcase(ok)
```

This would never compile because … If we were to use … The solution to this is, as @binarylogic already demonstrated, using … However, in this example, we can't actually be sure that …

Always Fallible

Let's take a look at another example:

```
ok = []
upcase(ok)
```

In this example, we know …

The Solution

Given these two examples, the solution is as follows:

…

The Work

…
Thanks for this write-up @JeanMertz! I think I agree with your general approach here. One suggestion I had was to allow … case. I think it will be common to have types that are …

Alternatively, and this is my preference, we lean even more towards the Go approach and have …

With either of those improvements, I still think your general idea of allowing compilation of programs where the types intersect, and then failing at runtime, is a good approach.
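The inline code in the comment above was lost from this copy, so the following is only a hedged sketch of what the "Go approach" might mean here: a fallible call assigned via `ok, err = ...` would give `ok` a type-appropriate default value on failure (the empty-string default below is an assumption, not current VRL behavior), so downstream expressions still type-check.

```
# Hypothetical Go-style defaulting (assumption, not current VRL semantics):
ok, err = to_string(.foo)   # on failure, ok would default to "" rather than null
upcase(ok)                  # would then always type-check, since ok is always a string
```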
I agree 👍 I'll see if I can sneak this in.
While thinking this through, I realized what you propose (and what Go does) is not possible in our case. Some of our functions are allowed to return different types based on their input. For example, the … So, there is no sane "default" value that the function can return in an error situation, unless we (arbitrarily) pick a default for the functions that can return more than one type. There are still options to consider, but even though I agree a solution to this would be another boost to the language's ergonomics, I'd probably lean towards tackling that after 0.12 ships, as there are trade-offs to all of them.
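The function referenced above was dropped from this copy; judging by the reply below, it is likely `slice`, which is one example of a VRL function whose return type depends on its input type, so no single default value fits. A hedged illustration:

```
slice("hello world", 0, 5)   # string input -> string output: "hello"
slice([1, 2, 3, 4], 0, 2)    # array input  -> array output: [1, 2]
```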
🤔
Do you have another example? Because with that one, it seems like you could just return an empty array, which would have no inner type (or possibly just give it the inner type of the array it was sliced from). I could be misunderstanding.
#6148 lifted type checking to a compile-time error since we felt it provided a better user experience. Specifically, we thought it provided the following benefits:
- Otherwise infallible functions, like `round`, would no longer be fallible. It was awkward to document otherwise infallible functions as fallible just to handle the argument-type runtime error.

But after updating a few VRL examples I'm questioning whether this was a mistake, since it adds considerable friction to the language. And while I understand we're at the finish line with the language, this change is worth a final discussion given how much it impacts the UX.
Examples
To facilitate this discussion let's run through two opposing examples:
Example 1: Typically infallible operation
Typically infallible operations, like rounding a number, heavily influenced our decision to raise type checking to compile-time errors. For example:
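The code block that followed here was lost from this copy; a hedged sketch of the kind of program in question (the field name `.duration` is an assumption):

```
.duration = round(.duration)
```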
With compile-time type checking
The user will be met with a compile-time error like:
And the correct program would look something like:
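This block was also lost; a hedged reconstruction of what the type-safe version might look like (the `to_float` coercion and the fallback value of 0 are assumptions):

```
.duration = round(to_float(.duration) ?? 0)
```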
Without compile-time type checking
The user will be met with a runtime error like:
Example 2: Typically fallible operation
Let's try a fallible operation, like parsing a Syslog log. This is an operation that can fail due to a malformed string; handling errors is expected:
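The original code block was lost here as well; a hedged sketch of the kind of program being discussed (the field name `.message` is an assumption):

```
structured, err = parse_syslog(.message)
if err != null {
    # record that the message could not be parsed
    .parse_error = err
} else {
    . = structured
}
```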
With compile-time type checking
The user will be met with a compile-time error like:
And the correct program would look something like:
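Again a hedged reconstruction: to satisfy the compile-time type checker, the program also has to coerce `.message` to a string (the `to_string(...) ?? ""` fallback is an assumption):

```
structured, err = parse_syslog(to_string(.message) ?? "")
if err != null {
    .parse_error = err
} else {
    . = structured
}
```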
Without compile-time type checking
The user will be met with a runtime error like:
Proposal
As you can see, compile-time type checking is less awkward for typically infallible operations (example 1) but more awkward for typically fallible operations (example 2). I'd argue that typically fallible operations (example 2) are what we should optimize for. For example, if I've written a few VRL programs and I write the following:
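The program referenced here was lost from this copy; a hedged sketch of the sort of thing meant (the field name and the empty-object fallback are assumptions):

```
. = parse_syslog(.message) ?? {}
```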
I should not be greeted with a compile-time type checking error. As far as I'm concerned, a malformed string and a non-string value are the same in this context: neither can be parsed.
I like this approach since type-safety is opt-in. The user can choose to add friction or not.
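The proposal details that "this approach" refers to appear to have been lost from this copy; as a hedged illustration of the opt-in idea, a user could rely on runtime error handling alone, or add explicit coercions to get compile-time safety (both forms below are assumptions about how the proposal would play out):

```
# No opt-in: handle any failure (malformed string or wrong type) at runtime
. = parse_syslog(.message) ?? {}

# Opt-in type safety: coerce the argument so the type checker is satisfied
. = parse_syslog(to_string(.message) ?? "") ?? {}
```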
Let's discuss and note our decision for posterity. If we decide to relax compile-time type checking, we should do this before 0.12 is released.