
Update reingestion workflows to load and report data #618

Merged

stacimc merged 39 commits into main from feature/reingestion-workflows on Sep 23, 2022

Conversation

stacimc
Contributor

@stacimc stacimc commented Jul 21, 2022

Fixes

Fixes WordPress/openverse#1502 by @stacimc

Description

This PR:

  • refactors the regular provider workflow factories slightly to make it possible to separate out the ingestion steps (pulling & loading data) from the reporting step (notifying Slack)
  • updates the reingestion workflow DAG factory to include the loading steps and to send a single Slack notification with summed data at the end
  • temporarily updates the schedule interval for reingestion workflows from @daily to @weekly, so that we can get a sense of how long they take when we roll them out and adjust more easily. I expect we may need to tweak the configurations.

A reingestion workflow will look like this:

Screen Shot 2022-09-07 at 1 37 28 PM

Don't panic 😄 To make it easier to see what's going on, I've altered the workflow configuration for Wikimedia here to generate a smaller number of reingestion days. You can see that each node in the graph represents an ingestion workflow for a particular day in the past. The suffix for the taskgroup is the day_shift (so ingest_data_day_shift_32 is ingesting data for 32 days in the past).

Screen Shot 2022-09-23 at 9 11 46 AM

Here's how I altered the configuration to get that graph:

ProviderReingestionWorkflow(
        dag_id="wikimedia_reingestion_workflow",
        ...
        daily_list_length=2,     # Only generate 2 ingestion days with a 1-day shift
        one_month_list_length=2, # Only generate 2 ingestion days with a 30-day shift
)

The daily_list_length, weekly_list_length, and so on tell the DAG factory how many taskgroups to generate at each level. So if you have daily_list_length=3, taskgroups for the day_shifts 1, 2, and 3 will be generated at that level. If you have weekly_list_length=3, you'll generate 3 new taskgroups for day_shifts counting by 7, starting from the last day_shift in the previous level (3) -- so 10, 17, 24.
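
For illustration, here's a minimal sketch of that calculation. This is not the factory's actual code; the function name and the 1/7/30-day multipliers are assumptions based on the description above.

def calculate_day_shift_partitions(
    daily_list_length=0, weekly_list_length=0, one_month_list_length=0
):
    # Each level contributes `list_length` day_shifts spaced by its multiplier,
    # continuing from the last day_shift of the previous level.
    levels = [
        (daily_list_length, 1),
        (weekly_list_length, 7),
        (one_month_list_length, 30),
    ]
    partitions = []
    last_shift = 0
    for list_length, multiplier in levels:
        partition = []
        for _ in range(list_length):
            last_shift += multiplier
            partition.append(last_shift)
        partitions.append(partition)
    return partitions

# calculate_day_shift_partitions(daily_list_length=3, weekly_list_length=3)
# -> [[1, 2, 3], [10, 17, 24], []]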

Between each level is a 'gather' task which acts as a gate. All the ingestion tasks in partition 0 must complete before gather_partition_0 can run; only then are partition 1 ingestion tasks processed. There is one partition for each of the day shift multipliers (i.e. partition 0 is today, partition 1 is the daily shifts (if any are configured), then weekly shifts, and so on).
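
Here's a rough Airflow sketch of that gating pattern -- not the real provider_dag_factory code; the DAG id, task ids, and partition contents are made up, and it assumes an Airflow 2.x version that has EmptyOperator.

from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.utils.task_group import TaskGroup

partitions = [[0], [1, 2], [9, 16]]  # example day_shifts, one list per partition

with DAG("example_reingestion_gating", start_date=datetime(2022, 9, 1), schedule_interval=None):
    upstream_gate = None
    for i, day_shifts in enumerate(partitions):
        groups = []
        for day_shift in day_shifts:
            with TaskGroup(group_id=f"ingest_data_day_shift_{day_shift}") as group:
                # Stand-ins for the real pull and load tasks
                EmptyOperator(task_id="pull_data") >> EmptyOperator(task_id="load_data")
            groups.append(group)
        gather = EmptyOperator(task_id=f"gather_partition_{i}")
        if upstream_gate is not None:
            # Nothing in this partition starts until the previous gather completes
            upstream_gate >> groups
        # The gather task waits on every ingestion group in this partition
        groups >> gather
        upstream_gate = gather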

When all of the tasks are completed, a single report_load_completion task reports a summary of the sum of all the pull_data durations, as well as the sum of all the record_counts (total number upserted, total duplicates, etc.).
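
The aggregation itself boils down to field-wise addition where missing counts are treated as zero (mirroring the (a or 0) + (b or 0) approach in the PR). A simplified sketch, with illustrative class and field names rather than the actual reporting code:

from dataclasses import dataclass, fields
from functools import reduce
from typing import Optional

@dataclass
class RecordCounts:
    upserted: Optional[int] = None
    duplicate: Optional[int] = None

    def __add__(self, other):
        # Treat missing counts as 0 so partial reports still sum cleanly
        return RecordCounts(
            **{
                f.name: (getattr(self, f.name) or 0) + (getattr(other, f.name) or 0)
                for f in fields(self)
            }
        )

per_day_counts = [RecordCounts(upserted=100, duplicate=5), RecordCounts(upserted=250)]
totals = reduce(lambda a, b: a + b, per_day_counts)
# totals == RecordCounts(upserted=350, duplicate=5)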

REVIEW GUIDE

This is a big PR, but many of the changes are refactoring of types or the addition of tests/docstrings. I am happy to find a way to split this if necessary for review, though. Here's my review guide by file:

  • DAGs.md: auto-generated docs for the reingestion workflows
  • common/loader/reporting.py: updates to the reporting tool to aggregate data for multiple ingestion tasks
  • common/loader/storage/* and provider_data_ingester.py: all changes to these files are done to append the ingestion date as a suffix to the tsv filenames for dated DAGs. This is necessary because reingestion tasks can run concurrently, so we can get collisions on tsv filenames when just using the execution date.
  • provider_dag_factory.py: the bulk of the work. This refactors the existing factories to split out ingestion and reporting tasks, and uses this to create new reingestion workflows.
  • provider_*workflows.py and provider_*_workflow_factory.py: minor cleanup, renames reingestion DAGs to use the word "reingestion" for clarity
  • tests/*: new and updated tests

NOTES

The regular provider flows for Europeana and Flickr are not yet enabled. Their reingestion workflows should NOT be turned on until those providers have been refactored and the 'regular' provider workflows are confirmed to be working.

The TSV filenames for all regular provider workflows remain the same. For reingestion workflows, they are further partitioned by the reingestion date (the day for which data is being processed), in order to prevent collisions when reingestion runs for multiple days at once. An example tsv filename for wikimedia_reingestion_workflow is: wikimedia_reingestion/year=2020/reingestion=2022-07-22/wikimedia_image_v001_20220922230222.tsv.
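
In other words, something along these lines -- a hypothetical helper, not the actual loader code; only the prefix layout matches the examples above:

from typing import Optional

def build_tsv_prefix(
    prefix_root: str, year: int, month: int, reingestion_date: Optional[str] = None
) -> str:
    if reingestion_date:
        # Reingestion runs for different days can execute concurrently, so the
        # reingestion date becomes its own partition to avoid filename collisions.
        return f"{prefix_root}/year={year}/reingestion={reingestion_date}/"
    return f"{prefix_root}/year={year}/month={month}/"

# build_tsv_prefix("wikimedia_commons", 2020, 11)
#   -> "wikimedia_commons/year=2020/month=11/"
# build_tsv_prefix("wikimedia_reingestion", 2020, 7, reingestion_date="2022-07-22")
#   -> "wikimedia_reingestion/year=2020/reingestion=2022-07-22/"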

Testing Instructions

  • Verify that a few regular provider DAGs, in particular Wikimedia Commons, run normally. You'll notice they look slightly different as the ingestion steps are now in a task group.
  • Run the wikimedia_reingestion_workflow. To make this quicker, you can try:
    • Setting a global ingestion_limit variable to something like 250 so that ingestion for each day completes early
    • Reducing the number of days to reingest by editing the configuration, as described above ^
  • Inspect the report_load_completion content for the wikimedia reingestion flow, as well as a 'normal' DAG to make sure they look correct with & without aggregation.

Checklist

  • My pull request has a descriptive title (not a vague title like Update index.md).
  • My pull request targets the default branch of the repository (main) or a parent feature branch.
  • My commit messages follow best practices.
  • My code follows the established code style of the repository.
  • I added or updated tests for the changes I made (if applicable).
  • I added or updated documentation (if applicable).
  • I tried running the project locally and verified that there are no visible errors.

Developer Certificate of Origin

Developer Certificate of Origin
Version 1.1

Copyright (C) 2004, 2006 The Linux Foundation and its contributors.
1 Letterman Drive
Suite D4700
San Francisco, CA, 94129

Everyone is permitted to copy and distribute verbatim copies of this
license document, but changing it is not allowed.


Developer's Certificate of Origin 1.1

By making a contribution to this project, I certify that:

(a) The contribution was created in whole or in part by me and I
    have the right to submit it under the open source license
    indicated in the file; or

(b) The contribution is based upon previous work that, to the best
    of my knowledge, is covered under an appropriate open source
    license and I have the right under that license to submit that
    work with modifications, whether created in whole or in part
    by me, under the same open source license (unless I am
    permitted to submit under a different license), as indicated
    in the file; or

(c) The contribution was provided directly to me by some other
    person who certified (a), (b) or (c) and I have not modified
    it.

(d) I understand and agree that this project and the contribution
    are public and that a record of the contribution (including all
    personal information I submit with it, including my sign-off) is
    maintained indefinitely and may be redistributed consistent with
    this project or the open source license(s) involved.

@openverse-bot openverse-bot added the 🚦 status: awaiting triage label Jul 21, 2022
@stacimc stacimc force-pushed the refactor/wikimedia-commons-to-use-base-class branch from 89f02de to abe67d5 on July 25, 2022 23:59
@sarayourfriend
Contributor

@stacimc Is this PR blocked on #614?

@AetherUnbound
Contributor

This is in draft pending the completion of the v1.3.0 milestone - once that's complete this can be rebased and undrafted!

@stacimc stacimc force-pushed the refactor/wikimedia-commons-to-use-base-class branch from bbecded to 1281943 on August 23, 2022 18:02
@stacimc stacimc force-pushed the feature/reingestion-workflows branch from b760655 to 5607f58 on August 24, 2022 23:45
@stacimc
Contributor Author

stacimc commented Aug 24, 2022

This took some effort to rebase 😅 but is now working again. I've tested Wikimedia and its reingestion workflow, and both are running well!

Screen Shot 2022-08-24 at 4 53 04 PM

I'll address the To Do items now, and there'll be some code cleanup needed after the rebase.

@obulat obulat added the 🟨 priority: medium, ✨ goal: improvement, and 🛠 goal: fix labels and removed the 🚦 status: awaiting triage label Sep 2, 2022
@stacimc stacimc force-pushed the refactor/wikimedia-commons-to-use-base-class branch from 88a621b to 6a72b97 on September 2, 2022 20:07
Base automatically changed from refactor/wikimedia-commons-to-use-base-class to main September 6, 2022 21:04
@stacimc stacimc requested a review from a team as a code owner September 8, 2022 23:33
Member

@krysal krysal left a comment

This is looking very interesting. I need to wrap my head around the reingestion process, so I'll come back and review this tomorrow with more time.
On another note, I think you can check off all those - [ ] items in the checklist 😄 Those are great instructions and a great PR description, very appreciated!

DAGs.md (review thread resolved)
Contributor

@AetherUnbound AetherUnbound left a comment

This is so impressive! It's a really significant change too, but will give us a lot more control and clarity around these reingestion workflows. I have a number of comments on a few different aspects, notably:

  1. The S3 prefix we use once uploaded
  2. The nomenclature used for certain variables/concepts

I'm so excited to be able to turn this on soon, great work!!

def _add_counts(self, a, b):
    return (a or 0) + (b or 0)

def __add__(self, other):
Contributor

So smart!! This is awesome 🤩

openverse_catalog/dags/common/loader/reporting.py (outdated review thread, resolved)
@@ -104,10 +149,19 @@ def report_completion(

    # Collect data into a single message
    message = f"""
*Provider*: `{provider_name}`
*DAG*: `{dag_id}`
Contributor

I kind of preferred using provider name so we didn't have to see _workflow over and over 😅 is there potentially a different way of showing that it's a reingestion workflow DAG without having to use dag_id directly?

Contributor Author

Previously we were already using dag_id, and just chopping _workflow off (i.e. dag_id.replace("_workflow", "")). We could continue doing so, but then for reingestion flows we'd get wikimedia_reingestion as the "provider", which isn't quite correct.

I can think of other things to do but they all seem like overkill to me. We could look for reingestion in the dag_id, but we'd still need to get the name of the actual provider. That would mean making a mapping somewhere, or else passing the information all the way through from the ingester class. I guess we could chop off _workflow if 'reingestion' isn't in the dag_id, and then report "provider" for regular flows and "dag_id" for reingestion 🤔

Contributor

Perhaps on the ProviderWorkflow dataclass we could specify provider here instead of dag_id?

Then we could also pass in the workflow type (regular/reingestion) to the reporting function which could be used to craft the message. On the other hand though, that all sounds like a lot of work so maybe DAG ID is just fine 😅

Comment on lines +161 to +163
"\n_Duration is the sum of the duration for each data pull task."
" It does not include loading time and does not account for data"
" pulls that may happen concurrently."
Contributor

One thought on a potential alternative way to do this down the line: we could average the durations across each aggregate. E.g. have average durations for the daily spread, weekly spread, and monthly spread; that way we could get a sense of how long each type of aggregate is taking on average!

Contributor Author

I like this idea! Probably overkill but it would be interesting to have some kind of summary of outliers. For Wikimedia for instance the current large-batch issue could make for some wild outliers 😮

Contributor

Oh definitely 😰 Yea not necessary now but a potential option down the line if we find we want more distinction!

openverse_catalog/dags/providers/provider_dag_factory.py (4 outdated review threads, resolved)
@@ -87,6 +95,34 @@ def _make_report_completion_contents_data(media_type: str):
]


def _make_report_completion_contents_list_data(media_type: str):
Contributor

These test cases are great!

Contributor

@AetherUnbound AetherUnbound left a comment

This is incredible, and such a significant change! I was able to run this locally and it worked great. I have one more note on the reporting, but otherwise this is good to go 🚀

openverse_catalog/dags/common/loader/reporting.py (outdated review thread, resolved)
Member

@krysal krysal left a comment

This looks awesome 👏

Avoids collisions on tsv filenames when reingestion runs for two different
dates at the same time.

- Removes the old implementation, which appended the ingestion date to the actual
filename
- Instead partitions by the reingestion date
@stacimc
Contributor Author

stacimc commented Sep 22, 2022

Okay, last bit of feedback addressed. The biggest thing was related to this comment thread, but I'm just going to flag this at the top level for visibility, cc @AetherUnbound:

The tsv filenames no longer append the date to the end for dated DAGs. They should work exactly as they used to for all normal flows, and for reingestion flows we add one final partition to the prefix with the ingestion date. I've tested Wikimedia and Wikimedia Reingestion, and this is what they look like:

# wikimedia_commons_workflow
wikimedia_commons/year=2020/month=11/wikimedia_image_v001_20220922230222.tsv

# wikimedia_reingestion_workflow
wikimedia_reingestion/year=2020/reingestion=2022-07-22/wikimedia_image_v001_20220922230222.tsv

Note that for reingestion, it's partitioned by year and not by month. That is because I have temporarily set the reingestion flows to run weekly instead of daily. When they get turned back on to the daily schedule, the further partitioning will be added automatically.

Sorry that took a while to get to; the templating gave me a ton of grief!

Contributor

@AetherUnbound AetherUnbound left a comment

I love all these changes! As difficult as that templating was to figure out, I really like the new prefix 💯

@stacimc
Contributor Author

stacimc commented Sep 23, 2022

As difficult as that templating was to figure out, I really like the new prefix

Agreed, it was definitely worth it! Much cleaner and more organized. Thanks everyone for the reviews :)

@stacimc stacimc merged commit 4a9c008 into main Sep 23, 2022
@stacimc stacimc deleted the feature/reingestion-workflows branch September 23, 2022 16:19