This repository has been archived by the owner on Aug 4, 2023. It is now read-only.

Add dayshift to tsv filenames for reingestion workflows #969

Merged
stacimc merged 6 commits into main from add/dayshift-to-tsv-suffix on Feb 14, 2023

Conversation

@stacimc (Contributor) commented Jan 17, 2023

Fixes

Fixes WordPress/openverse#1417 by @stacimc

Description

This adds the day_shift as a suffix to the tsv filenames for reingestion workflows, in order to prevent a race condition in which multiple reingestion days running at the same time attempt to write to a file with the same name. @AetherUnbound did an excellent investigation and write-up of the bug in this comment.

This PR:

  • adds the day_shift as a suffix, to ensure sufficient uniqueness. In non-reingestion DAGs, no additional suffix is added (see the sketch after this list).
  • removes the generate_tsv_filenames task while we're at it. This was a separate task that ran before the pull_data step and output the tsv filenames to XComs, so they could be grabbed later by the copy_to_s3 task. For scripts using the ProviderDataIngester base class, it did this by initializing the ingester class just to generate the filenames; the pull_data task would then re-initialize the ingester class and override its generated filenames with the previously generated ones. Since all the provider scripts now use the ingester class, this indirection is no longer necessary.
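
For reference, a minimal sketch of how the suffix folds into the filename. The helper name is hypothetical; the real logic lives in the MediaStore class:

from datetime import datetime

def build_tsv_filename(provider: str, media_type: str, tsv_suffix: str | None) -> str:
    # Hypothetical stand-in for the MediaStore filename logic; tsv_suffix is
    # the stringified day_shift for reingestion DAGs and None otherwise.
    path_components = [
        provider,
        media_type,
        "v001",
        datetime.now().strftime("%Y%m%d%H%M%S"),
        tsv_suffix,
    ]
    # filter(None, ...) drops the suffix entirely when it is None.
    return "_".join(filter(None, path_components)) + ".tsv"

build_tsv_filename("clevelandmuseum", "image", None)
# -> 'clevelandmuseum_image_v001_<timestamp>.tsv'
build_tsv_filename("wikimedia", "image", "4")
# -> 'wikimedia_image_v001_<timestamp>_4.tsv'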

Testing Instructions

General things to look for when testing a DAG:

  • Verify that the DAG completes successfully. Set the INGESTION_LIMIT Airflow variable to a very small number (<= 100) to speed up ingestion (see the example after this list).
  • Check the logs for the pull_data task and verify that you see logs reporting the tsv file paths for each media type the DAG supports.
  • Check the logs for the copy_to_s3 task and verify that it is accessing the same file path reported in the pull_data task.
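
One way to set that variable locally, assuming CLI access to the Airflow scheduler:

airflow variables set INGESTION_LIMIT 100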

Some DAGs to test:

  • A regular, non-reingestion DAG, for example cleveland_museum_workflow. You should see tsv filenames that do NOT have a day_shift appended to the end. Example pull_data log:
{media.py:183} INFO - Output path: /var/workflow_output/clevelandmuseum_image_v001_20230123230706.tsv
  • Test with a reingestion DAG, for example wikimedia_reingestion_workflow. Check the logs for a few different reingestion days and make sure that the appropriate day_shift is appended to the filenames. Example pull_data log for the ingest_data_day_shift_4 TaskGroup:
{factory_utils.py:45} INFO - Image store location: /var/workflow_output/wikimedia_image_v001_20230123231717_4.tsv
{factory_utils.py:45} INFO - Audio store location: /var/workflow_output/wikimedia_audio_audio_v001_20230123231717_4.tsv

Checklist

  • My pull request has a descriptive title (not a vague title like
    Update index.md).
  • My pull request targets the default branch of the repository (main) or
    a parent feature branch.
  • My commit messages follow best practices.
  • My code follows the established code style of the repository.
  • I added or updated tests for the changes I made (if applicable).
  • I added or updated documentation (if applicable).
  • I tried running the project locally and verified that there are no visible
    errors.
  • I ran the DAG documentation generator (if applicable).

Developer Certificate of Origin

Version 1.1

Copyright (C) 2004, 2006 The Linux Foundation and its contributors.
1 Letterman Drive
Suite D4700
San Francisco, CA, 94129

Everyone is permitted to copy and distribute verbatim copies of this
license document, but changing it is not allowed.


Developer's Certificate of Origin 1.1

By making a contribution to this project, I certify that:

(a) The contribution was created in whole or in part by me and I
    have the right to submit it under the open source license
    indicated in the file; or

(b) The contribution is based upon previous work that, to the best
    of my knowledge, is covered under an appropriate open source
    license and I have the right under that license to submit that
    work with modifications, whether created in whole or in part
    by me, under the same open source license (unless I am
    permitted to submit under a different license), as indicated
    in the file; or

(c) The contribution was provided directly to me by some other
    person who certified (a), (b) or (c) and I have not modified
    it.

(d) I understand and agree that this project and the contribution
    are public and that a record of the contribution (including all
    personal information I submit with it, including my sign-off) is
    maintained indefinitely and may be redistributed consistent with
    this project or the open source license(s) involved.

@stacimc added labels on Jan 17, 2023: bug (Something isn't working), 🟧 priority: high (Stalls work on the project or its dependents), 🛠 goal: fix (Bug fix), 💻 aspect: code (Concerns the software code in the repository)
@stacimc self-assigned this on Jan 17, 2023
@stacimc marked this pull request as ready for review on January 23, 2023 23:38
@stacimc requested a review from a team as a code owner on January 23, 2023 23:38
@stacimc marked this pull request as draft on January 23, 2023 23:41
@stacimc (Contributor, Author) commented Jan 23, 2023

Drafting to look at extended test failures.

@stacimc marked this pull request as ready for review on January 24, 2023 00:25
@AetherUnbound (Contributor) left a comment:

Oh this is rad, I totally forgot that generate_tsv_filenames was something we could get rid of now that we're fully converted to the ProviderDataIngester derivatives!

This looks good, but when I ran this locally, both the Wikimedia Commons DAG and the Cleveland one gave me XComs that had a suffix value:

[2023-01-31, 01:30:16 UTC] {factory_utils.py:44} INFO - Image store location: /var/workflow_output/wikimedia_image_v001_20230131013016_0.tsv
[2023-01-31, 01:30:16 UTC] {factory_utils.py:44} INFO - Audio store location: /var/workflow_output/wikimedia_audio_audio_v001_20230131013016_0.tsv

and

[2023-01-31, 01:34:33 UTC] {media.py:183} INFO - Output path: /var/workflow_output/clevelandmuseum_image_v001_20230131013433_None.tsv
[2023-01-31, 01:34:33 UTC] {factory_utils.py:44} INFO - Image store location: /var/workflow_output/clevelandmuseum_image_v001_20230131013433_None.tsv

Do you know why that's happening? 😮

datetime.now().strftime("%Y%m%d%H%M%S"),
tsv_suffix,
]
output_file = ("_").join(filter(None, path_components)) + ".tsv"
Contributor:

Hehe it's a little face 😁 ("_") (as a real note though I think the parens can be removed):

Suggested change
output_file = ("_").join(filter(None, path_components)) + ".tsv"
output_file = "_".join(filter(None, path_components)) + ".tsv"

Awesome use of filter!

@obulat (Contributor) commented Feb 7, 2023:

In Python, I am much more comfortable with list comprehensions than with filter. This would be clearer for me, but this is not a blocking comment:

output_file = "_".join([c for c in path_components if c is not None]) + ".tsv"

or, if 0 is not a valid value:

output_file = "_".join([c for c in path_components if c]) + ".tsv"

@@ -160,7 +166,7 @@ def _init_media_stores(self) -> dict[str, MediaStore]:

     for media_type, provider in self.providers.items():
         StoreClass = get_media_store_class(media_type)
-        media_stores[media_type] = StoreClass(provider)
+        media_stores[media_type] = StoreClass(provider, tsv_suffix=str(day_shift))
Contributor:

Oh, it's this line 😅 This is converting None and 0 to 'None' and '0' which don't pass the filter! So they're getting appended to the filename anyway.
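
A minimal illustration of that failure mode (hypothetical values):

day_shift = None  # a regular (non-reingestion) DAG passes no day shift

tsv_suffix = str(day_shift)  # -> 'None', a non-empty (truthy) string
"_".join(filter(None, ["wikimedia", "image", tsv_suffix]))
# -> 'wikimedia_image_None' -- the filter can no longer drop the suffix

# One possible fix: only stringify when a day shift was actually provided.
tsv_suffix = str(day_shift) if day_shift is not None else None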

@openverse-bot (Contributor) commented:

Based on the high urgency of this PR, the following reviewers are being gently reminded to review this PR:

@obulat
This reminder is being automatically generated due to the urgency configuration.

Excluding weekend[1] days, this PR was updated 2 day(s) ago. PRs labelled with high urgency are expected to be reviewed within 2 weekday(s)[2].

@stacimc, if this PR is not ready for a review, please draft it to prevent reviewers from getting further unnecessary pings.

Footnotes

  1. Specifically, Saturday and Sunday.

  2. For the purpose of these reminders, we treat Monday - Friday as weekdays. Please note that the job that generates these reminders runs at midnight UTC on Monday - Friday. This means that, depending on your timezone, you may be pinged outside of the expected range.

@obulat (Contributor) left a comment:

I am getting the same issue as @AetherUnbound, in load_from_s3:
Loading image/cleveland_museum/year=2023/clevelandmuseum_image_v001_20230207060449_None.tsv from S3 Bucket openverse-storage into provider_data_image_cleveland_museum_20230101T000000_0

Can we also add a test for these cases?
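
A sketch of the kind of test being asked for here, using a hypothetical build_tsv_filename helper standing in for the MediaStore filename logic:

import pytest

def build_tsv_filename(provider, media_type, timestamp, tsv_suffix=None):
    # Hypothetical stand-in for the MediaStore filename construction.
    parts = [provider, media_type, "v001", timestamp, tsv_suffix]
    return "_".join(filter(None, parts)) + ".tsv"

@pytest.mark.parametrize(
    "suffix, expected",
    [
        (None, "wikimedia_image_v001_20230101000000.tsv"),   # regular DAG: no suffix
        ("4", "wikimedia_image_v001_20230101000000_4.tsv"),  # reingestion day shift
        ("0", "wikimedia_image_v001_20230101000000_0.tsv"),  # day shift 0 must be kept
    ],
)
def test_tsv_filename_suffix(suffix, expected):
    assert build_tsv_filename("wikimedia", "image", "20230101000000", suffix) == expected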

@stacimc (Contributor, Author) commented Feb 9, 2023

Sorry for the delay getting back to this one. Rebased and fixed the identified issue 👍

@AetherUnbound (Contributor) left a comment:

I was able to verify, both with standard ingestion & reingestion, that this appropriately appends the day shift when necessary! I have some other comments/suggestions above, but nothing to block a merge 😄

@stacimc merged commit 7c5f0a1 into main on Feb 14, 2023
@stacimc deleted the add/dayshift-to-tsv-suffix branch on February 14, 2023 17:39
Development

Successfully merging this pull request may close these issues:

  • Load_data steps for image skipped during Wikimedia reingestion (WordPress/openverse#1417)