Start pulling in schedule refactor entities! #65
Conversation
two inline mega-nitpicks, and overall:
- after this PR, are we ready to use calitp-py in the existing locations where we use the utils versions of these classes? If not, what specific steps are needed to get there? I mostly want to make sure I understand, because for the schedule parsing work I need to contribute new related classes... What goes in here vs. in Airflow utils? Is our philosophy that stuff only goes in here if it's specifically needed by a PodOperator? I want to make sure we're not maintaining parallel versions for any longer than we absolutely have to.
- can you just call out in the PR description that we're moving to poetry for dependency management specifically? (PR title and description are kind of disparate right now, wondering if we can clarify that a bit for posterity, especially because working across the repos is already kinda confusing)
Yes, as soon as a good version of calitp-py is published to PyPI, we can PR in data-infra to add it as a dependency in Airflow and change the imports. I'd like us to generally default to putting things in here as long as they aren't actually importing Airflow; I think that will be easier to develop and test, and it proactively makes things available to pod operators. We can always reference specific git hashes while developing on the Airflow side if needed.
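To make that concrete, here's a rough sketch of what the Airflow-side import change could look like once calitp-py is installable; the package name (`calitp`), the helper name (`get_engine`), and the `utils` import path are illustrative placeholders, not the actual calitp-py API.

```python
# Hypothetical sketch only: "calitp" and "get_engine" are placeholder names,
# not the real calitp-py API.
try:
    # Once calitp-py is published (or pinned to a git hash), use the shared library.
    from calitp import get_engine
except ImportError:
    # Until then, fall back to the existing Airflow-side utils copy.
    from utils import get_engine
```

In the interim, the Airflow image could pin the library to a specific commit using pip's direct-reference syntax, e.g. something like `calitp @ git+https://github.com/<org>/calitp-py@<commit-sha>` (org and package name filled in appropriately); that is the "reference specific git hashes" option mentioned above.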
yep!
One consequence of the schedule refactor is new code that we will want to share across both Airflow and pod operators. This PR is the first step in that process; we'll continue to iterate on and improve this library so that less work is repeated across the pipeline.
Things implemented in this PR: