
Determine how jobs will be linked #16

Open · Tracked by #10
franTarkenton (Member) opened this issue Jul 10, 2023 · 0 comments
There are 4 separate jobs that need to run before the R-based data ingestion script can run.

We need to determine how to handle these convergent jobs.

Some options:

  • Implement the individual jobs as DAGs in an Airflow pipeline, with the R-based data ingestion triggered once the 4 data-prep jobs are complete
  • Evaluate whether a trigger can be set up using GHA that only proceeds when all 4 jobs are complete.
  • Staggered cron jobs (possibly as an initial option, but we really want to avoid this one)
  • Other???
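For the GHA option, if all four prep jobs can live in one workflow file, the convergence can be expressed directly with a `needs:` clause rather than a custom trigger. A minimal sketch (job names, schedule, and run commands are hypothetical placeholders, not the project's actual jobs):

```yaml
# Hypothetical single-workflow layout: the R ingestion job lists all
# four prep jobs in `needs:`, so it only starts after all four succeed.
name: data-pipeline
on:
  schedule:
    - cron: "0 6 * * *"  # placeholder schedule
jobs:
  prep-a:
    runs-on: ubuntu-latest
    steps:
      - run: echo "run prep job A"
  prep-b:
    runs-on: ubuntu-latest
    steps:
      - run: echo "run prep job B"
  prep-c:
    runs-on: ubuntu-latest
    steps:
      - run: echo "run prep job C"
  prep-d:
    runs-on: ubuntu-latest
    steps:
      - run: echo "run prep job D"
  r-ingestion:
    # Runs only if prep-a through prep-d all complete successfully;
    # if any fails, this job is skipped.
    needs: [prep-a, prep-b, prep-c, prep-d]
    runs-on: ubuntu-latest
    steps:
      - run: echo "run R-based data ingestion script"
```

If the prep jobs must remain in separate workflows, a `workflow_run` trigger fires when any one listed upstream workflow completes, so a true four-way fan-in would still need extra state (e.g. a step that checks the other workflows' latest status via the API before proceeding).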
@franTarkenton franTarkenton moved this to In Progress in RFC Backlog Dec 5, 2023
@franTarkenton franTarkenton moved this from In Progress to Sprint Backlog in RFC Backlog Dec 5, 2023
Labels: none yet
Projects: Status: Sprint Backlog
Development: no branches or pull requests
1 participant