Unable to process all valid rows from a stream #1719
What kind of error are you seeing?
We first build a query that invokes a function in the select list, for example:

```sql
SELECT field1, problem_function(field2)
FROM some_table
```

We then hand that query to `queryStream()`. Many thousands of records get successfully piped into Postgres, but then at some point the function in our SELECT throws an error on one record. That Oracle error then causes the stream to be destroyed. Our specific error is an ORA- error raised by that function. Hopefully that adds a little more context; let me know if you have more questions about what we're experiencing.
It would be "interesting" to try and work out if errors can be ignored, and what specific ones (user errors?) can be ignored. Perhaps querying can just continue and then fail on the next fetch if the DB stops sending rows altogether. The simplest solution might be to keep all the data in Oracle DB :)
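A sketch of what "ignore user errors and continue" could look like from the consumer's side. One important wrinkle: once a JavaScript generator (or a Node stream) throws, it cannot be resumed, so the producer would have to surface per-row failures as values rather than exceptions. That is why this needs driver support; none of the names below are node-oracledb API, and the error text is hypothetical:

```javascript
// A producer that reports per-row failures as values instead of throwing,
// which is roughly what the driver would need to do internally for
// "ignore and continue" to be possible at all.
function* fetchRowsTolerant() {
  for (const id of [1, 2, 3, 4, 5]) {
    if (id === 3) {
      yield { error: 'ORA-xxxxx: problem_function failed for id 3' };
    } else {
      yield { row: { id } };
    }
  }
}

// Consumer keeps streaming, collecting failures on the side instead of
// aborting, so valid rows after a bad one are still delivered.
function consumeSkippingErrors() {
  const delivered = [];
  const failures = [];
  for (const item of fetchRowsTolerant()) {
    if (item.error) failures.push(item.error);
    else delivered.push(item.row.id);
  }
  return { delivered, failures };
}
```

Whether the Oracle client libraries can actually keep fetching after an ORA- error on one row is exactly the open question in this thread.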
Yeah, I agree it's possible that what I'm asking for is not feasible, depending on how the interaction works between the stream and the underlying driver.
The feedback about the goal and limitation is good to hear so that we can make changes, or help influence other Oracle teams' decisions. |
We have a use case where we are streaming thousands of records from a database and occasionally one record will have an error. We would love to have the option to keep streaming records even when a `read` fails. The current implementation always destroys the stream on any error: node-oracledb/lib/queryStream.js, line 106 in 1d0ee74.
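Given that the stream is destroyed on any error, one consumer-side workaround is to checkpoint the last key that was processed and, on failure, reissue the query starting past the failing row. A rough sketch under the assumption that rows carry a monotonically increasing, dense key; `queryFrom` stands in for re-running the real query with a `WHERE` cursor and is illustrative, not node-oracledb API:

```javascript
// Simulated query: yields rows with id > startAfter, throwing on the bad
// row the way a SELECT-list function error kills a real queryStream.
function* queryFrom(startAfter) {
  for (const id of [1, 2, 3, 4, 5]) {
    if (id <= startAfter) continue;
    if (id === 3) throw new Error('ORA-xxxxx: bad row at id 3'); // hypothetical
    yield { id };
  }
}

// Restart loop: on failure, record the failing key, skip past it, and
// resume the query from there until a pass completes without error.
function streamWithRestarts() {
  const delivered = [];
  const skipped = [];
  let cursor = 0;
  for (;;) {
    let lastSeen = cursor;
    try {
      for (const row of queryFrom(cursor)) {
        delivered.push(row.id);
        lastSeen = row.id;
      }
      return { delivered, skipped };
    } catch (err) {
      const badId = lastSeen + 1; // assumes dense, increasing ids; illustrative
      skipped.push(badId);
      cursor = badId;
    }
  }
}
```

The cost is one extra query per bad row, plus the assumption that the same row fails deterministically; but it does recover every valid row, which is what this issue is asking for.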