Add a check for jobs running locally or in the cloud #90

Open
ejulio opened this issue Sep 11, 2019 · 0 comments

ejulio commented Sep 11, 2019

Currently, we may have some settings for spiders running locally and different settings for spiders running in Scrapy Cloud (Dash).
Usually, I add a check like `if 'SHUB_JOBKEY' not in os.environ:`.
However, it may not be the best check and, if this environment variable is ever deprecated, I would need to update my checks everywhere.
It would be nice to have this kind of check available, though I am not sure whether it belongs in this library or somewhere else.
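For reference, a minimal sketch of the kind of helper this asks for, based on the `SHUB_JOBKEY` check described above. The helper name `running_in_scrapy_cloud` and the settings in the usage example are just illustrative assumptions, not part of any existing API:

```python
import os


def running_in_scrapy_cloud() -> bool:
    # Scrapy Cloud sets SHUB_JOBKEY for every job, so its presence is used
    # here as the signal that the spider is running in the cloud rather
    # than locally (the same check described above).
    return "SHUB_JOBKEY" in os.environ


# Hypothetical use in a project's settings.py, for illustration only:
if running_in_scrapy_cloud():
    HTTPCACHE_ENABLED = False  # e.g. skip the local HTTP cache on Scrapy Cloud
else:
    HTTPCACHE_ENABLED = True   # but keep it enabled for local development runs
```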

Gallaecio transferred this issue from scrapinghub/python-scrapinghub Dec 16, 2024