
Potentially (M)RE for 'hanging' during rechunk based on CMIP6 recipes. #715

Open
jbusecke opened this issue Mar 21, 2024 · 45 comments

@jbusecke (Contributor)

As discussed in the last community meeting and on the call with @moradology and @ranchodeluxe today, I will try to boil down one of the CMIP6 feedstock examples that I observed to 'hang' (not completing within several hours, even though the caching was already complete!).

The dataset unique identifier is 'CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114', and the file urls are the 33 files listed in the recipe below.

A stripped-down recipe to reproduce this could look like this:

iid = 'CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114'

urls = [
'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_185001010130-185412312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_185501010130-185912312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_186001010130-186412312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_186501010130-186912312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_187001010130-187412312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_187501010130-187912312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_188001010130-188412312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_188501010130-188912312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_189001010130-189412312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_189501010130-189912312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_190001010130-190412312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_190501010130-190912312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_191001010130-191412312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_191501010130-191912312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_192001010130-192412312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_192501010130-192912312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_193001010130-193412312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_193501010130-193912312230.nc', 
'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_194001010130-194412312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_194501010130-194912312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_195001010130-195412312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_195501010130-195912312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_196001010130-196412312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_196501010130-196912312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_197001010130-197412312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_197501010130-197912312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_198001010130-198412312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_198501010130-198912312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_199001010130-199412312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_199501010130-199912312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_200001010130-200412312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_200501010130-200912312230.nc', 'https://esgf-data1.llnl.gov/thredds/fileServer/css03_data/CMIP6/CMIP/CMCC/CMCC-ESM2/historical/r1i1p1f1/3hr/pr/gn/v20210114/pr_3hr_CMCC-ESM2_historical_r1i1p1f1_gn_201001010130-201412312230.nc']

pattern = pattern_from_file_sequence(
    urls,
    concat_dim='time',
)
recipe = (
    f"Creating {iid}" >> beam.Create(pattern.items())
    | OpenURLWithFSSpec()
    # do not specify file type to accommodate both netCDF3 and netCDF4
    | OpenWithXarray(xarray_open_kwargs={"use_cftime": True})
    | Preprocessor()  # custom preprocessing transform from the CMIP6 feedstock
    | StoreToZarr(
        store_name=f"{iid}.zarr",
        combine_dims=pattern.combine_dim_keys,
        dynamic_chunking_fn=dynamic_chunking_func,  # defined elsewhere in the feedstock
    )
    | ConsolidateDimensionCoordinates()
    | ConsolidateMetadata()
)

(to be revised, just leaving this here as a starting point)

@ranchodeluxe do you mind sending me the steps you use to run this on a jupyterhub?

Will fill this in as I go along.

@jbusecke (Contributor Author)

Ok I got an example running here (successful submission log here). This is running on LEAP Dataflow, but it might not be too hard to run locally? I was a bit hesitant for now since it will cache about 100GB of data.
I guess I'll wait for a few hours and see if it gets stuck!

I can probably from there test/probe a bit more to reduce the complexity further.

@jbusecke (Contributor Author) commented Mar 21, 2024

Ok it has been running for about an hour, and has scaled up to ~6 workers, then back down to 3, but they all seem to be idling after a while.
[screenshots: Dataflow autoscaling and worker activity graphs]

FWIW the console indicates that this is stuck on the rechunk stage:
[screenshot: Dataflow job graph]

But from experience I know that these graphical representations can often be misleading.

@jbusecke (Contributor Author)

I think this is a good candidate! Will report back after a few hours.

@jbusecke (Contributor Author)

Some strategies to further simplify this:

  • Hardcode the target_chunks (and maybe try a bunch of different combos to see if some work, e.g. pure time chunking); see the sketch after this list
  • Reduce the number of files (is there some number that works?)
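For the first strategy, a minimal sketch of a hardcoded-chunks variant (reusing iid and urls from above; the chunk size here is a made-up placeholder, not the feedstock's actual value):

import apache_beam as beam
from pangeo_forge_recipes.patterns import pattern_from_file_sequence
from pangeo_forge_recipes.transforms import OpenURLWithFSSpec, OpenWithXarray, StoreToZarr

# drop the dynamic_chunking_fn and pin target_chunks instead
pattern = pattern_from_file_sequence(urls, concat_dim='time')
recipe = (
    beam.Create(pattern.items())
    | OpenURLWithFSSpec()
    | OpenWithXarray(xarray_open_kwargs={'use_cftime': True})
    | StoreToZarr(
        store_name=f'{iid}_hardcoded.zarr',
        combine_dims=pattern.combine_dim_keys,
        target_chunks={'time': 600},  # placeholder; swap in other combos, e.g. pure time chunking
    )
)

Swapping in urls[:n] for a few values of n would cover the second strategy with the same sketch.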

@norlandrhagen (Contributor)

In the other hanging recipe, it seemed like about three months of daily data would result in hanging.

@jbusecke (Contributor Author)

I'll see tomorrow whether there is some number of files below which it does not hang in this case, but I am not convinced this is representative of a 'stall'. The job's been running for 4 hours and basically doing nothing AFAICT:
[screenshot: Dataflow job metrics showing idle workers]

@jbusecke (Contributor Author)

If (and that's a big if, see above) we can believe the graph diagram, it is actually stuck on the GroupByKey stage 🤔

[screenshot: Dataflow job graph with the GroupByKey stage active]

@jbusecke changed the title from "Potentially (M)VE for 'hanging' during rechunk based on CMIP6 recipes." to "Potentially (M)RE for 'hanging' during rechunk based on CMIP6 recipes." on Mar 22, 2024
@jbusecke (Contributor Author)

I am moving the example to a separate repo to keep things neater.

@jbusecke (Contributor Author)

Also, I just cancelled the job from yesterday after it ran for 19 hours without really doing anything...
For later reference: job link (not publicly accessible atm).

@jbusecke (Contributor Author)

Currently trying out some variations of my failcase here. Will report back on what I find.

@jbusecke (Contributor Author)

Ok this is interesting. It seems that even using only 3 files (of 33) is enough to trigger this! If that holds true, it would make this example a lot more minimal.

@jbusecke (Contributor Author)

Got 4 examples in the oven:
[screenshot: four Dataflow jobs in progress]

I'll have to wait ~1 hour to be sure things are stuck.

@jbusecke (Contributor Author)

Ok, it seems that the complex rechunking structure has something to do with the problem (as I previously suspected): when running the recipes with chunks along the time axis only, both succeeded!
[screenshot: both time-only-chunk jobs succeeded]

This also gives us an approximate time that things "should take". So from now on I will assume that any recipe with a runtime of many hours is actually stuck.

@jbusecke (Contributor Author)

I am also calling it: the short example is in fact stuck!
[screenshot: short-example job showing no progress]
That is nice and reduces the size of the problem considerably!

@jbusecke (Contributor Author)

Next I will try to run the test_short_dynamic_chunking locally. Onwards.

@jbusecke (Contributor Author)

I went in a different direction, and I think I am now able to reproduce the problem without my dynamic rechunking. Simply setting the chunking 'orthogonal' (all chunks along lon, while concatting along time) reproduces the behavior with only 2 files!
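To make 'orthogonal' concrete, a quick sketch of the two chunking layouts (sizes are illustrative only):

# aligned with the concat dim: each output chunk maps onto a handful of input files
target_chunks_time_only = {'time': 600}  # illustrative size
# orthogonal to the concat dim: every output chunk needs a thin slice of *every* input file
target_chunks_lon_only = {'lon': 1}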

@norlandrhagen (Contributor)

Notes on cases to test:

  • Does this fail/hang on a single worker on the local runner? (see the config tweak after this list)
  • Verify that rechunking only in time succeeds
  • Can we narrow down which part of StoreToZarr is hanging?
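For the first bullet, the tweak is a one-liner in the runner config (a sketch, using the same LocalDirectBakery trait that appears in the configs below):

c.LocalDirectBakery.num_workers = 1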

@jbusecke (Contributor Author)

@norlandrhagen and I are in the middle of a pair coding session. I will jot down some notes (and edit later if needed).

We are currently trying to run two recipes locally:

  • 'time_only' only time chunks (this succeeded on dataflow)
  • 'lon_only' only lon chunks (this requires a full rechunk and is hanging on dataflow)

@jbusecke (Contributor Author) commented Mar 22, 2024

Ok, I just ran both examples with the direct runner via pangeo-forge-runner (on the LEAP JupyterHub with 16 cores) like this (thanks to @ranchodeluxe for the instructions):

  1. Install dependencies:

mamba create -n runner0102 python=3.11 -y
conda activate runner0102
pip install pangeo-forge-runner==0.10.2 --no-cache-dir

  2. Create a config.py:

c.Bake.prune = 0
c.Bake.bakery_class = 'pangeo_forge_runner.bakery.local.LocalDirectBakery'
c.LocalDirectBakery.num_workers = 16
BUCKET_PREFIX = "/home/jovyan/AREAS/PGF/hanging_bug/output"
c.TargetStorage.fsspec_class = "fsspec.implementations.local.LocalFileSystem"
c.TargetStorage.root_path = f"{BUCKET_PREFIX}/{{job_name}}/output"
c.InputCacheStorage.root_path = f"{BUCKET_PREFIX}/cache/"
c.InputCacheStorage.fsspec_class = "fsspec.implementations.local.LocalFileSystem"

  3. I then ran 'time_only' with:

pangeo-forge-runner bake \
  --repo=https://github.com/jbusecke/pgf_cmip_hanging_mre \
  --ref=main \
  --feedstock-subdir='feedstock' \
  --Bake.job_name=test_short_time_only_chunks \
  --Bake.recipe_id=test_short_time_only_chunks \
  -f config.py

  4. and 'lon_only' with:

pangeo-forge-runner bake \
  --repo=https://github.com/jbusecke/pgf_cmip_hanging_mre \
  --ref=main \
  --feedstock-subdir='feedstock' \
  --Bake.job_name=test_short_lon_only_chunks \
  --Bake.recipe_id=test_short_lon_only_chunks \
  -f config.py

These both worked! So I am not sure if my problem is in fact the same problem that @norlandrhagen and @ranchodeluxe encountered in #710.

I wonder if somebody can run this on Flink to add another data point to our fail/runner matrix?
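For anyone attempting that, a hedged sketch of the config change (assuming pangeo-forge-runner's Flink bakery and a Kubernetes cluster with the Flink operator available; untested):

c.Bake.bakery_class = 'pangeo_forge_runner.bakery.flink.FlinkOperatorBakery'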

@jbusecke (Contributor Author)

Ok, so with input from @cisaacstern I tried the same thing, except I used a GCS bucket for cache and output:

c.Bake.prune = 0
c.Bake.bakery_class = 'pangeo_forge_runner.bakery.local.LocalDirectBakery'
c.LocalDirectBakery.num_workers = 16
BUCKET_PREFIX = "gs://leap-scratch/jbusecke/pgf-hanging-bug-testing"
c.TargetStorage.fsspec_class = "gcsfs.GCSFileSystem"
c.TargetStorage.root_path = f"{BUCKET_PREFIX}/{{job_name}}/output"
c.InputCacheStorage.fsspec_class = "gcsfs.GCSFileSystem"
c.InputCacheStorage.root_path = f"{BUCKET_PREFIX}/cache/"

which also did not fail.

@jbusecke (Contributor Author) commented Mar 22, 2024

Ok @norlandrhagen and I just wrapped up. We devised a few new recipe_ids to test some other things:

  • test_short_lon_only_chunks_load: the same as test_short_lon_only_chunks, but sets load=True in OpenWithXarray (see the sketch after this list):
    • This full-on crashed on dataflow (even though there was plenty of memory to load all the data into it). The error message below indicates that this is happening on write! Job:
"Workflow failed. Causes: S05:Creating CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114|OpenURLWithFSSpec|OpenWithXarray|Preprocessor|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/Creating CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Creating CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114|OpenURLWithFSSpec|OpenWithXarray|Preprocessor|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/Creating CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Creating CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114|OpenURLWithFSSpec|OpenWithXarray|Preprocessor|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/Creating CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Creating CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114|OpenURLWithFSSpec|OpenWithXarray|Preprocessor|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/Creating CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Creating CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114|OpenURLWithFSSpec|OpenWithXarray|Preprocessor|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/Creating CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114/Map(decode)+Creating CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114|OpenURLWithFSSpec|OpenWithXarray|Preprocessor|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/OpenURLWithFSSpec/MapWithConcurrencyLimit/open_url+Creating CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114|OpenURLWithFSSpec|OpenWithXarray|Preprocessor|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/OpenWithXarray/Open with Xarray+Creating CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114|OpenURLWithFSSpec|OpenWithXarray|Preprocessor|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/Preprocessor/Fix coordinates+Creating CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114|OpenURLWithFSSpec|OpenWithXarray|Preprocessor|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/Preprocessor/Sanitize Attrs+Creating CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114|OpenURLWithFSSpec|OpenWithXarray|Preprocessor|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/StoreToZarr/DetermineSchema/MapTuple()+Creating CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114|OpenURLWithFSSpec|OpenWithXarray|Preprocessor|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/StoreToZarr/DetermineSchema/CombineGlobally(CombineXarraySchemas)/KeyWithVoid+Creating CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114|OpenURLWithFSSpec|OpenWithXarray|Preprocessor|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/StoreToZarr/DetermineSchema/CombineGlobally(CombineXarraySchemas)/CombinePerKey/GroupByKey+Creating CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114|OpenURLWithFSSpec|OpenWithXarray|Preprocessor|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/StoreToZarr/DetermineSchema/CombineGlobally(CombineXarraySchemas)/CombinePerKey/Combine/Partial+Creating 
CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114|OpenURLWithFSSpec|OpenWithXarray|Preprocessor|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/StoreToZarr/DetermineSchema/CombineGlobally(CombineXarraySchemas)/CombinePerKey/GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. If the logs only contain generic timeout errors related to accessing external resources, such as MongoDB, verify that the worker service account has permission to access the resource's subnetwork. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers:
  • test_short_time_only_chunks_load (PR, Job): The above really surprised me, so I wanted to double-check whether load=True affects the case that previously worked in the same way. THIS ALSO FAILS. I am completely confused by that.

  • test_full_nowrite (PR, Job): Just to confirm that only opening the files lazily does not cause an error. This indeed succeeded.
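For reference, the only change in the *_load variants is the eager-load flag on OpenWithXarray; the relevant line of the recipe becomes:

| OpenWithXarray(xarray_open_kwargs={'use_cftime': True}, load=True)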

I'll pick this up next week, but any feedback is very welcome.

@jbusecke (Contributor Author)

From the worker logs of the OG test_short_dynamic_chunks:

Operation ongoing in bundle process_bundle-2974329341843856250-42 for PTransform{name=Creating CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114|OpenURLWithFSSpec|OpenWithXarray|Preprocessor|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/StoreToZarr/Rechunk/MapTuple(combine_fragments)-ptransform-46, state=process-msecs} for at least 5827.01 seconds without outputting or completing.
Current Traceback:
  File "/usr/local/lib/python3.10/threading.py", line 973, in _bootstrap
    self._bootstrap_inner()

  File "/usr/local/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
    self.run()

  File "/usr/local/lib/python3.10/site-packages/apache_beam/utils/thread_pool_executor.py", line 53, in run
    self._work_item.run()

  File "/usr/local/lib/python3.10/site-packages/apache_beam/utils/thread_pool_executor.py", line 37, in run
    self._future.set_result(self._fn(*self._fn_args, **self._fn_kwargs))

  File "/usr/local/lib/python3.10/site-packages/apache_beam/runners/worker/sdk_worker.py", line 385, in task
    self._execute(

  File "/usr/local/lib/python3.10/site-packages/apache_beam/runners/worker/sdk_worker.py", line 311, in _execute
    response = task()

  File "/usr/local/lib/python3.10/site-packages/apache_beam/runners/worker/sdk_worker.py", line 386, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)

  File "/usr/local/lib/python3.10/site-packages/apache_beam/runners/worker/sdk_worker.py", line 650, in do_instruction
    return getattr(self, request_type)(

  File "/usr/local/lib/python3.10/site-packages/apache_beam/runners/worker/sdk_worker.py", line 688, in process_bundle
    bundle_processor.process_bundle(instruction_id))

  File "/usr/local/lib/python3.10/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1113, in process_bundle
    input_op_by_transform_id[element.transform_id].process_encoded(

  File "/usr/local/lib/python3.10/site-packages/apache_beam/runners/worker/bundle_processor.py", line 237, in process_encoded
    self.output(decoded_value)

  File "/tmp/cf09f9f279ec355c109713fd5ed97e51c62f0daeb39deea8ba13555949327808qx0kye0e/lib/python3.10/site-packages/apache_beam/transforms/core.py", line 2040, in <lambda>

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/pangeo_forge_recipes/rechunking.py", line 240, in combine_fragments
    ds_combined = xr.combine_nested(dsets_to_concat, concat_dim=concat_dims_sorted)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/combine.py", line 577, in combine_nested
    return _nested_combine(

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/combine.py", line 356, in _nested_combine
    combined = _combine_nd(

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/combine.py", line 232, in _combine_nd
    combined_ids = _combine_all_along_first_dim(

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/combine.py", line 267, in _combine_all_along_first_dim
    new_combined_ids[new_id] = _combine_1d(

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/combine.py", line 290, in _combine_1d
    combined = concat(

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/concat.py", line 250, in concat
    return _dataset_concat(

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/concat.py", line 631, in _dataset_concat
    combined_var = concat_vars(

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/variable.py", line 2925, in concat
    return Variable.concat(variables, dim, positions, shortcut, combine_attrs)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/variable.py", line 1693, in concat
    data = duck_array_ops.concatenate(arrays, axis=axis)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/duck_array_ops.py", line 356, in concatenate
    return _concatenate(as_shared_dtype(arrays), axis=axis)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/duck_array_ops.py", line 232, in as_shared_dtype
    arrays = [asarray(x, xp=xp) for x in scalars_or_arrays]

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/duck_array_ops.py", line 232, in <listcomp>
    arrays = [asarray(x, xp=xp) for x in scalars_or_arrays]

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/duck_array_ops.py", line 219, in asarray
    return data if is_duck_array(data) else xp.asarray(data)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/indexing.py", line 693, in __array__
    return np.asarray(self.get_duck_array(), dtype=dtype)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/indexing.py", line 696, in get_duck_array
    self._ensure_cached()

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/indexing.py", line 690, in _ensure_cached
    self.array = as_indexable(self.array.get_duck_array())

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/indexing.py", line 664, in get_duck_array
    return self.array.get_duck_array()

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/indexing.py", line 551, in get_duck_array
    array = self.array[self.key]

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/backends/h5netcdf_.py", line 51, in __getitem__
    return indexing.explicit_indexing_adapter(

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/indexing.py", line 858, in explicit_indexing_adapter
    result = raw_indexing_method(raw_key.tuple)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/backends/h5netcdf_.py", line 58, in _getitem
    return array[key]

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/h5netcdf/core.py", line 347, in __getitem__
    return self._h5ds[key]

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/h5py/_hl/dataset.py", line 758, in __getitem__
    return self._fast_reader.read(args)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/fsspec/spec.py", line 1856, in readinto
    data = self.read(out.nbytes)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/fsspec/spec.py", line 1846, in read
    out = self.cache._fetch(self.loc, self.loc + length)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/fsspec/caching.py", line 189, in _fetch
    self.cache = self.fetcher(start, end)  # new block replaces old

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/gcsfs/core.py", line 1921, in _fetch_range
    return self.gcsfs.cat_file(self.path, start=start, end=end)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/fsspec/asyn.py", line 118, in wrapper
    return sync(self.loop, func, *args, **kwargs)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/fsspec/asyn.py", line 91, in sync
    if event.wait(1):

  File "/usr/local/lib/python3.10/threading.py", line 607, in wait
    signaled = self._cond.wait(timeout)

  File "/usr/local/lib/python3.10/threading.py", line 324, in wait
    gotit = waiter.acquire(True, timeout)

@ranchodeluxe (Contributor) commented Mar 25, 2024

File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/fsspec/asyn.py", line 91, in sync
if event.wait(1):

File "/usr/local/lib/python3.10/threading.py", line 607, in wait
signaled = self._cond.wait(timeout)

File "/usr/local/lib/python3.10/threading.py", line 324, in wait
gotit = waiter.acquire(True, timeout)

Given this beautiful traceback, I say the next steps to reproduce are:

  1. make sure the gcsfs cache is enabled
  2. write a simpler recipe that just loops through all the gcsfs files and does only some partial steps from combine_fragments, to mimic that data read pattern (see the sketch after this list)
  3. if the first two steps reproduce the hang, then maybe we play with disabling fsspec caching
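A rough sketch of step 2 (the gs:// paths are hypothetical placeholders; the combine call mirrors the xr.combine_nested line from the traceback, and a comment marks where step 3's caching experiment would go):

import fsspec
import xarray as xr

cached_urls = [
    'gs://some-bucket/cache/part_1.nc',  # hypothetical cached input files
    'gs://some-bucket/cache/part_2.nc',
]
dsets = [
    xr.open_dataset(
        fsspec.open(url, mode='rb').open(),  # for step 3, try cache_type='none' here
        engine='h5netcdf',
        use_cftime=True,
    )
    for url in cached_urls
]
# same call that combine_fragments was stuck in, per the traceback above
combined = xr.combine_nested(dsets, concat_dim='time')
combined.load()  # forces the gcsfs byte-range reads that appear to hang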

@ranchodeluxe (Contributor)

@moradology found this: fsspec/gcsfs#379

@jbusecke (Contributor Author)

I just confirmed that the problem is not resolved when running on only one worker. @moradology and I also just stumbled upon some output that might indicate a bug in the groupby logic of rechunk. We will continue on this later and report back.

@jbusecke (Contributor Author) commented Mar 26, 2024

Ok, I think @moradology and I honed in on the problem earlier today, and I was able to produce what I think is unintended behavior from the rechunking stage locally. How and why that would only stall on dataflow still eludes me, but regardless, I think this is actually a pretty significant bug that needs to be dealt with before I can move on.

Ok, let me back up. My main hypothesis is that the combining of fragments (subfragments?) is not actually working properly!

I have evidence from the above example, but I think it is more instructive to look at this new tiny example, which shares the main attribute with the CMIP test examples: target chunks that are orthogonal to the concat dim, requiring a complete rechunk.

Take these two example datasets:

import xarray as xr
a = xr.DataArray(
    [[0,1],[2,3]], 
    dims=['x', 'time'], 
    coords={'x':[0,1], 'time':[0,1]}
).to_dataset(name='var').chunk({'time':1})
b = xr.DataArray(
    [[4,5],[6,7]], 
    dims=['x', 'time'], 
    coords={'x':[0,1], 'time':[2,3]}
).to_dataset(name='var').chunk({'time':1})

I am saving both of those files locally as netCDF files:

a.to_netcdf('/home/jovyan/AREAS/PGF/hanging_bug/simple_example/file_a.nc')
b.to_netcdf('/home/jovyan/AREAS/PGF/hanging_bug/simple_example/file_b.nc')

and 'simulate' the desired output zarr dataset:

ds_out = xr.concat(
    [
        xr.open_dataset(path) for path in [
            '/home/jovyan/AREAS/PGF/hanging_bug/simple_example/file_a.nc',
            '/home/jovyan/AREAS/PGF/hanging_bug/simple_example/file_b.nc'
        ]
    ],
    dim='time'
)
ds_out.chunk({'x':1, 'time':-1})
[screenshot: xarray repr of the rechunked output dataset]

Note how this bad boy has ONLY TWO output chunks!
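A quick sanity check of the chunk count (continuing from ds_out above):

rechunked = ds_out.chunk({'x': 1, 'time': -1})
print(rechunked['var'].data.chunks)     # ((1, 1), (4,))
print(rechunked['var'].data.numblocks)  # (2, 1) -> exactly two output chunks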

I then set up some boilerplate feedstock (I can make another repo if folks want to run this locally):

with the following files:
feedstock/meta.yaml:

title: "Simple Test"
description: ""
recipes:
  - id: test
    object: "recipe:recipe"
provenance:
  providers:
    - name: "Zenodo"
      description: "Zenodo"
      roles:
        - host
      url: https://zenodo.org/records/10261274
    - name: "TBD"
      description: "TODO"
      roles:
        - producer
        - licensor
  license: "unknown"
maintainers:
  - name: "Julius Busecke"
    orcid: 0000-0001-8571-865X
    github: jbusecke

feedstock/recipe.py:

import apache_beam as beam
from pangeo_forge_recipes.patterns import pattern_from_file_sequence
from pangeo_forge_recipes.transforms import (
    ConsolidateDimensionCoordinates,
    ConsolidateMetadata,
    OpenURLWithFSSpec,
    OpenWithXarray,
    StoreToZarr
)

urls = [
    "/home/jovyan/AREAS/PGF/hanging_bug/simple_example/file_a.nc",
    "/home/jovyan/AREAS/PGF/hanging_bug/simple_example/file_b.nc",
]

pattern = pattern_from_file_sequence(urls, concat_dim='time')
recipe = (
    beam.Create(pattern.items())
    | OpenURLWithFSSpec()
    # do not specify file type to accommodate both netCDF3 and netCDF4
    | OpenWithXarray(xarray_open_kwargs={'use_cftime': True})
    | StoreToZarr(
        store_name='simple_test.zarr',
        combine_dims=pattern.combine_dim_keys,
        target_chunks={'x': 1},
    )
    | ConsolidateDimensionCoordinates()
    | ConsolidateMetadata()
)

and requirements.txt:

pangeo-forge-esgf==0.1.1
pangeo-forge-recipes==0.10.6
dynamic-chunks==0.0.2
gcsfs
apache-beam[gcp]

I also have a config_local.py in the root directory:

c.Bake.prune = 0
c.Bake.bakery_class = 'pangeo_forge_runner.bakery.local.LocalDirectBakery'
c.LocalDirectBakery.num_workers = 16
BUCKET_PREFIX = "/home/jovyan/AREAS/PGF/hanging_bug/simple_example/output"
c.TargetStorage.fsspec_class = "fsspec.implementations.local.LocalFileSystem"
c.TargetStorage.root_path = f"{BUCKET_PREFIX}/{{job_name}}/output"
c.InputCacheStorage.root_path = f"{BUCKET_PREFIX}/cache/"
c.InputCacheStorage.fsspec_class = "fsspec.implementations.local.LocalFileSystem"

Not fully reproducible yet, but should be enough to get an idea.

I then run the following in the terminal:

mamba create -n runner0102 python=3.11 -y
conda activate runner0102
pip install pangeo-forge-runner==0.10.2 --no-cache-dir
cd /home/jovyan/AREAS/PGF/hanging_bug/simple_example # need to fix all these local paths...
pangeo-forge-runner bake \
  --repo=. \
  --ref=main \
  --feedstock-subdir='feedstock' \
  --Bake.job_name=simple_test \
  --Bake.recipe_id=test \
  -f config_local.py

The recipe runs with the following output (this is ugly AF and only for completeness; I'll fish out the important parts below):

Collecting pangeo-forge-esgf==0.1.1 (from -r feedstock/requirements.txt (line 1)) Using cached pangeo_forge_esgf-0.1.1-py3-none-any.whl.metadata (4.2 kB) Collecting pangeo-forge-recipes==0.10.6 (from -r feedstock/requirements.txt (line 2)) Using cached pangeo_forge_recipes-0.10.6-py3-none-any.whl.metadata (4.5 kB) Collecting dynamic-chunks==0.0.2 (from -r feedstock/requirements.txt (line 3)) Using cached dynamic_chunks-0.0.2-py3-none-any.whl.metadata (1.9 kB) Collecting gcsfs (from -r feedstock/requirements.txt (line 4)) Using cached gcsfs-2024.3.1-py2.py3-none-any.whl.metadata (1.6 kB) Collecting apache-beam[gcp] (from -r feedstock/requirements.txt (line 5)) Using cached apache_beam-2.55.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (6.6 kB) Collecting aiohttp (from pangeo-forge-esgf==0.1.1->-r feedstock/requirements.txt (line 1)) Using cached aiohttp-3.9.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (7.4 kB) Collecting requests (from pangeo-forge-esgf==0.1.1->-r feedstock/requirements.txt (line 1)) Using cached requests-2.31.0-py3-none-any.whl.metadata (4.6 kB) Collecting tqdm (from pangeo-forge-esgf==0.1.1->-r feedstock/requirements.txt (line 1)) Using cached tqdm-4.66.2-py3-none-any.whl.metadata (57 kB) Collecting backoff (from pangeo-forge-esgf==0.1.1->-r feedstock/requirements.txt (line 1)) Using cached backoff-2.2.1-py3-none-any.whl.metadata (14 kB) Collecting cftime (from pangeo-forge-recipes==0.10.6->-r feedstock/requirements.txt (line 2)) Using cached cftime-1.6.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (8.6 kB) Collecting dask>=2021.11.2 (from pangeo-forge-recipes==0.10.6->-r feedstock/requirements.txt (line 2)) Using cached dask-2024.3.1-py3-none-any.whl.metadata (3.8 kB) Collecting fastparquet (from pangeo-forge-recipes==0.10.6->-r feedstock/requirements.txt (line 2)) Using cached fastparquet-2024.2.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (4.1 kB) Collecting fsspec>=2023.4.0 (from fsspec[http]>=2023.4.0->pangeo-forge-recipes==0.10.6->-r feedstock/requirements.txt (line 2)) Using cached fsspec-2024.3.1-py3-none-any.whl.metadata (6.8 kB) Collecting h5netcdf (from pangeo-forge-recipes==0.10.6->-r feedstock/requirements.txt (line 2)) Using cached h5netcdf-1.3.0-py3-none-any.whl.metadata (13 kB) Collecting h5py>=3.3.0 (from pangeo-forge-recipes==0.10.6->-r feedstock/requirements.txt (line 2)) Using cached h5py-3.10.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (2.5 kB) Collecting kerchunk>=0.0.7 (from pangeo-forge-recipes==0.10.6->-r feedstock/requirements.txt (line 2)) Using cached kerchunk-0.2.4-py3-none-any.whl.metadata (4.3 kB) Collecting netcdf4 (from pangeo-forge-recipes==0.10.6->-r feedstock/requirements.txt (line 2)) Using cached netCDF4-1.6.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (1.8 kB) Collecting numcodecs>=0.9.0 (from pangeo-forge-recipes==0.10.6->-r feedstock/requirements.txt (line 2)) Using cached numcodecs-0.12.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (2.8 kB) Collecting xarray>=0.18.0 (from pangeo-forge-recipes==0.10.6->-r feedstock/requirements.txt (line 2)) Using cached xarray-2024.2.0-py3-none-any.whl.metadata (11 kB) Collecting zarr>=2.12.0 (from pangeo-forge-recipes==0.10.6->-r feedstock/requirements.txt (line 2)) Using cached zarr-2.17.1-py3-none-any.whl.metadata (5.7 kB) Collecting numpy (from dynamic-chunks==0.0.2->-r feedstock/requirements.txt (line 3)) Using cached 
numpy-1.26.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (61 kB) Collecting decorator>4.1.2 (from gcsfs->-r feedstock/requirements.txt (line 4)) Using cached decorator-5.1.1-py3-none-any.whl.metadata (4.0 kB) Collecting google-auth>=1.2 (from gcsfs->-r feedstock/requirements.txt (line 4)) Using cached google_auth-2.29.0-py2.py3-none-any.whl.metadata (4.7 kB) Collecting google-auth-oauthlib (from gcsfs->-r feedstock/requirements.txt (line 4)) Using cached google_auth_oauthlib-1.2.0-py2.py3-none-any.whl.metadata (2.7 kB) Collecting google-cloud-storage (from gcsfs->-r feedstock/requirements.txt (line 4)) Using cached google_cloud_storage-2.16.0-py2.py3-none-any.whl.metadata (6.1 kB) Collecting crcmod<2.0,>=1.7 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached crcmod-1.7-py3-none-any.whl Collecting orjson<4,>=3.9.7 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached orjson-3.9.15-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (49 kB) Collecting dill<0.3.2,>=0.3.1.1 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached dill-0.3.1.1-py3-none-any.whl Collecting cloudpickle~=2.2.1 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached cloudpickle-2.2.1-py3-none-any.whl.metadata (6.9 kB) Collecting fastavro<2,>=0.23.6 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached fastavro-1.9.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (5.5 kB) Collecting fasteners<1.0,>=0.3 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached fasteners-0.19-py3-none-any.whl.metadata (4.9 kB) Collecting grpcio!=1.48.0,<2,>=1.33.1 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached grpcio-1.62.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (4.0 kB) Collecting hdfs<3.0.0,>=2.1.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached hdfs-2.7.0-py3-none-any.whl.metadata (3.9 kB) Collecting httplib2<0.23.0,>=0.8 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached httplib2-0.22.0-py3-none-any.whl.metadata (2.6 kB) Collecting js2py<1,>=0.74 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached Js2Py-0.74-py3-none-any.whl.metadata (868 bytes) Collecting jsonschema<5.0.0,>=4.0.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached jsonschema-4.21.1-py3-none-any.whl.metadata (7.8 kB) Collecting jsonpickle<4.0.0,>=3.0.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached jsonpickle-3.0.3-py3-none-any.whl.metadata (7.3 kB) Collecting objsize<0.8.0,>=0.6.1 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached objsize-0.7.0-py3-none-any.whl.metadata (12 kB) Collecting packaging>=22.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached packaging-24.0-py3-none-any.whl.metadata (3.2 kB) Collecting pymongo<5.0.0,>=3.8.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached pymongo-4.6.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (22 kB) Collecting proto-plus<2,>=1.7.1 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached proto_plus-1.23.0-py3-none-any.whl.metadata (2.2 kB) Collecting protobuf!=4.0.*,!=4.21.*,!=4.22.0,!=4.23.*,!=4.24.*,<4.26.0,>=3.20.3 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached 
protobuf-4.25.3-cp37-abi3-manylinux2014_x86_64.whl.metadata (541 bytes) Collecting pydot<2,>=1.2.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached pydot-1.4.2-py2.py3-none-any.whl.metadata (8.0 kB) Collecting python-dateutil<3,>=2.8.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached python_dateutil-2.9.0.post0-py2.py3-none-any.whl.metadata (8.4 kB) Collecting pytz>=2018.3 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached pytz-2024.1-py2.py3-none-any.whl.metadata (22 kB) Collecting regex>=2020.6.8 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached regex-2023.12.25-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (40 kB) Collecting typing-extensions>=3.7.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached typing_extensions-4.10.0-py3-none-any.whl.metadata (3.0 kB) Collecting zstandard<1,>=0.18.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached zstandard-0.22.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (2.9 kB) Collecting pyarrow<15.0.0,>=3.0.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached pyarrow-14.0.2-cp311-cp311-manylinux_2_28_x86_64.whl.metadata (3.0 kB) Collecting pyarrow-hotfix<1 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached pyarrow_hotfix-0.6-py3-none-any.whl.metadata (3.6 kB) Collecting cachetools<6,>=3.1.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached cachetools-5.3.3-py3-none-any.whl.metadata (5.3 kB) Collecting google-api-core<3,>=2.0.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached google_api_core-2.18.0-py3-none-any.whl.metadata (2.7 kB) Collecting google-apitools<0.5.32,>=0.5.31 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached google_apitools-0.5.31-py3-none-any.whl Collecting google-auth-httplib2<0.2.0,>=0.1.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached google_auth_httplib2-0.1.1-py2.py3-none-any.whl.metadata (2.1 kB) Collecting google-cloud-datastore<3,>=2.0.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached google_cloud_datastore-2.19.0-py2.py3-none-any.whl.metadata (5.6 kB) Collecting google-cloud-pubsub<3,>=2.1.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached google_cloud_pubsub-2.21.0-py2.py3-none-any.whl.metadata (9.3 kB) Collecting google-cloud-pubsublite<2,>=1.2.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached google_cloud_pubsublite-1.9.0-py2.py3-none-any.whl.metadata (5.6 kB) Collecting google-cloud-bigquery<4,>=2.0.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached google_cloud_bigquery-3.19.0-py2.py3-none-any.whl.metadata (8.9 kB) Collecting google-cloud-bigquery-storage<3,>=2.6.3 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached google_cloud_bigquery_storage-2.24.0-py2.py3-none-any.whl.metadata (5.6 kB) Collecting google-cloud-core<3,>=2.0.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached google_cloud_core-2.4.1-py2.py3-none-any.whl.metadata (2.7 kB) Collecting google-cloud-bigtable<3,>=2.19.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5)) Using cached google_cloud_bigtable-2.23.0-py2.py3-none-any.whl.metadata (5.3 kB) Collecting google-cloud-spanner<4,>=3.0.0 (from apache-beam[gcp]->-r 
feedstock/requirements.txt (line 5))
Using cached google_cloud_spanner-3.44.0-py2.py3-none-any.whl.metadata (10 kB)
Collecting google-cloud-dlp<4,>=3.0.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached google_cloud_dlp-3.16.0-py2.py3-none-any.whl.metadata (5.3 kB)
Collecting google-cloud-language<3,>=2.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached google_cloud_language-2.13.3-py2.py3-none-any.whl.metadata (5.3 kB)
Collecting google-cloud-videointelligence<3,>=2.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached google_cloud_videointelligence-2.13.3-py2.py3-none-any.whl.metadata (5.7 kB)
Collecting google-cloud-vision<4,>=2 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached google_cloud_vision-3.7.2-py2.py3-none-any.whl.metadata (5.2 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached google_cloud_recommendations_ai-0.10.10-py2.py3-none-any.whl.metadata (5.3 kB)
Collecting google-cloud-aiplatform<2.0,>=1.26.0 (from apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached google_cloud_aiplatform-1.44.0-py2.py3-none-any.whl.metadata (27 kB)
Collecting aiosignal>=1.1.2 (from aiohttp->pangeo-forge-esgf==0.1.1->-r feedstock/requirements.txt (line 1))
Using cached aiosignal-1.3.1-py3-none-any.whl.metadata (4.0 kB)
Collecting attrs>=17.3.0 (from aiohttp->pangeo-forge-esgf==0.1.1->-r feedstock/requirements.txt (line 1))
Using cached attrs-23.2.0-py3-none-any.whl.metadata (9.5 kB)
Collecting frozenlist>=1.1.1 (from aiohttp->pangeo-forge-esgf==0.1.1->-r feedstock/requirements.txt (line 1))
Using cached frozenlist-1.4.1-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (12 kB)
Collecting multidict<7.0,>=4.5 (from aiohttp->pangeo-forge-esgf==0.1.1->-r feedstock/requirements.txt (line 1))
Using cached multidict-6.0.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (4.2 kB)
Collecting yarl<2.0,>=1.0 (from aiohttp->pangeo-forge-esgf==0.1.1->-r feedstock/requirements.txt (line 1))
Using cached yarl-1.9.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (31 kB)
Collecting click>=8.1 (from dask>=2021.11.2->pangeo-forge-recipes==0.10.6->-r feedstock/requirements.txt (line 2))
Using cached click-8.1.7-py3-none-any.whl.metadata (3.0 kB)
Collecting partd>=1.2.0 (from dask>=2021.11.2->pangeo-forge-recipes==0.10.6->-r feedstock/requirements.txt (line 2))
Using cached partd-1.4.1-py3-none-any.whl.metadata (4.6 kB)
Collecting pyyaml>=5.3.1 (from dask>=2021.11.2->pangeo-forge-recipes==0.10.6->-r feedstock/requirements.txt (line 2))
Using cached PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (2.1 kB)
Collecting toolz>=0.10.0 (from dask>=2021.11.2->pangeo-forge-recipes==0.10.6->-r feedstock/requirements.txt (line 2))
Using cached toolz-0.12.1-py3-none-any.whl.metadata (5.1 kB)
Collecting importlib-metadata>=4.13.0 (from dask>=2021.11.2->pangeo-forge-recipes==0.10.6->-r feedstock/requirements.txt (line 2))
Using cached importlib_metadata-7.1.0-py3-none-any.whl.metadata (4.7 kB)
Collecting googleapis-common-protos<2.0.dev0,>=1.56.2 (from google-api-core<3,>=2.0.0->apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached googleapis_common_protos-1.63.0-py2.py3-none-any.whl.metadata (1.5 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached oauth2client-4.1.3-py2.py3-none-any.whl.metadata (1.2 kB)
Collecting six>=1.12.0 (from google-apitools<0.5.32,>=0.5.31->apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached six-1.16.0-py2.py3-none-any.whl.metadata (1.8 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth>=1.2->gcsfs->-r feedstock/requirements.txt (line 4))
Using cached pyasn1_modules-0.4.0-py3-none-any.whl.metadata (3.4 kB)
Collecting rsa<5,>=3.1.4 (from google-auth>=1.2->gcsfs->-r feedstock/requirements.txt (line 4))
Using cached rsa-4.9-py3-none-any.whl.metadata (4.2 kB)
Collecting google-cloud-resource-manager<3.0.0dev,>=1.3.3 (from google-cloud-aiplatform<2.0,>=1.26.0->apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached google_cloud_resource_manager-1.12.3-py2.py3-none-any.whl.metadata (5.3 kB)
Collecting shapely<3.0.0dev (from google-cloud-aiplatform<2.0,>=1.26.0->apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached shapely-2.0.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (7.0 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached google_resumable_media-2.7.0-py2.py3-none-any.whl.metadata (2.2 kB)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.19.0->apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached grpc_google_iam_v1-0.13.0-py2.py3-none-any.whl.metadata (3.3 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached grpcio_status-1.62.1-py3-none-any.whl.metadata (1.3 kB)
Collecting overrides<8.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached overrides-7.7.0-py3-none-any.whl.metadata (5.8 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached sqlparse-0.4.4-py3-none-any.whl.metadata (4.0 kB)
Collecting deprecated>=1.2.14 (from google-cloud-spanner<4,>=3.0.0->apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached Deprecated-1.2.14-py2.py3-none-any.whl.metadata (5.4 kB)
Collecting grpc-interceptor>=0.15.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached grpc_interceptor-0.15.4-py3-none-any.whl.metadata (8.4 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-cloud-storage->gcsfs->-r feedstock/requirements.txt (line 4))
Using cached google_crc32c-1.5.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (2.3 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached pyparsing-3.1.2-py3-none-any.whl.metadata (5.1 kB)
Collecting tzlocal>=1.2 (from js2py<1,>=0.74->apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached tzlocal-5.2-py3-none-any.whl.metadata (7.8 kB)
Collecting pyjsparser>=2.5.1 (from js2py<1,>=0.74->apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached pyjsparser-2.7.1-py3-none-any.whl
Collecting jsonschema-specifications>=2023.03.6 (from jsonschema<5.0.0,>=4.0.0->apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached jsonschema_specifications-2023.12.1-py3-none-any.whl.metadata (3.0 kB)
Collecting referencing>=0.28.4 (from jsonschema<5.0.0,>=4.0.0->apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached referencing-0.34.0-py3-none-any.whl.metadata (2.8 kB)
Collecting rpds-py>=0.7.1 (from jsonschema<5.0.0,>=4.0.0->apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached rpds_py-0.18.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (4.1 kB)
Collecting ujson (from kerchunk>=0.0.7->pangeo-forge-recipes==0.10.6->-r feedstock/requirements.txt (line 2))
Using cached ujson-5.9.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (8.7 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached dnspython-2.6.1-py3-none-any.whl.metadata (5.8 kB)
Collecting charset-normalizer<4,>=2 (from requests->pangeo-forge-esgf==0.1.1->-r feedstock/requirements.txt (line 1))
Using cached charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (33 kB)
Collecting idna<4,>=2.5 (from requests->pangeo-forge-esgf==0.1.1->-r feedstock/requirements.txt (line 1))
Using cached idna-3.6-py3-none-any.whl.metadata (9.9 kB)
Collecting urllib3<3,>=1.21.1 (from requests->pangeo-forge-esgf==0.1.1->-r feedstock/requirements.txt (line 1))
Using cached urllib3-2.2.1-py3-none-any.whl.metadata (6.4 kB)
Collecting certifi>=2017.4.17 (from requests->pangeo-forge-esgf==0.1.1->-r feedstock/requirements.txt (line 1))
Using cached certifi-2024.2.2-py3-none-any.whl.metadata (2.2 kB)
Collecting pandas>=1.5 (from xarray>=0.18.0->pangeo-forge-recipes==0.10.6->-r feedstock/requirements.txt (line 2))
Using cached pandas-2.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (19 kB)
Collecting asciitree (from zarr>=2.12.0->pangeo-forge-recipes==0.10.6->-r feedstock/requirements.txt (line 2))
Using cached asciitree-0.3.3-py3-none-any.whl
Collecting cramjam>=2.3 (from fastparquet->pangeo-forge-recipes==0.10.6->-r feedstock/requirements.txt (line 2))
Using cached cramjam-2.8.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (4.2 kB)
Collecting requests-oauthlib>=0.7.0 (from google-auth-oauthlib->gcsfs->-r feedstock/requirements.txt (line 4))
Using cached requests_oauthlib-2.0.0-py2.py3-none-any.whl.metadata (11 kB)
Collecting wrapt<2,>=1.10 (from deprecated>=1.2.14->google-cloud-spanner<4,>=3.0.0->apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached wrapt-1.16.0-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (6.6 kB)
Collecting zipp>=0.5 (from importlib-metadata>=4.13.0->dask>=2021.11.2->pangeo-forge-recipes==0.10.6->-r feedstock/requirements.txt (line 2))
Using cached zipp-3.18.1-py3-none-any.whl.metadata (3.5 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam[gcp]->-r feedstock/requirements.txt (line 5))
Using cached pyasn1-0.6.0-py2.py3-none-any.whl.metadata (8.3 kB)
Collecting tzdata>=2022.7 (from pandas>=1.5->xarray>=0.18.0->pangeo-forge-recipes==0.10.6->-r feedstock/requirements.txt (line 2))
Using cached tzdata-2024.1-py2.py3-none-any.whl.metadata (1.4 kB)
Collecting locket (from partd>=1.2.0->dask>=2021.11.2->pangeo-forge-recipes==0.10.6->-r feedstock/requirements.txt (line 2))
Using cached locket-1.0.0-py2.py3-none-any.whl.metadata (2.8 kB)
Collecting oauthlib>=3.0.0 (from requests-oauthlib>=0.7.0->google-auth-oauthlib->gcsfs->-r feedstock/requirements.txt (line 4))
Using cached oauthlib-3.2.2-py3-none-any.whl.metadata (7.5 kB)
Using cached pangeo_forge_esgf-0.1.1-py3-none-any.whl (15 kB)
Using cached pangeo_forge_recipes-0.10.6-py3-none-any.whl (130 kB)
Using cached dynamic_chunks-0.0.2-py3-none-any.whl (12 kB)
Using cached gcsfs-2024.3.1-py2.py3-none-any.whl (34 kB)
Using cached fsspec-2024.3.1-py3-none-any.whl (171 kB)
Using cached aiohttp-3.9.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.3 MB)
Using cached cachetools-5.3.3-py3-none-any.whl (9.3 kB)
Using cached cloudpickle-2.2.1-py3-none-any.whl (25 kB)
Using cached dask-2024.3.1-py3-none-any.whl (1.2 MB)
Using cached decorator-5.1.1-py3-none-any.whl (9.1 kB)
Using cached fastavro-1.9.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.3 MB)
Using cached fasteners-0.19-py3-none-any.whl (18 kB)
Using cached google_api_core-2.18.0-py3-none-any.whl (138 kB)
Using cached google_auth-2.29.0-py2.py3-none-any.whl (189 kB)
Using cached google_auth_httplib2-0.1.1-py2.py3-none-any.whl (9.3 kB)
Using cached google_cloud_aiplatform-1.44.0-py2.py3-none-any.whl (4.2 MB)
Using cached google_cloud_bigquery-3.19.0-py2.py3-none-any.whl (232 kB)
Using cached google_cloud_bigquery_storage-2.24.0-py2.py3-none-any.whl (190 kB)
Using cached google_cloud_bigtable-2.23.0-py2.py3-none-any.whl (357 kB)
Using cached google_cloud_core-2.4.1-py2.py3-none-any.whl (29 kB)
Using cached google_cloud_datastore-2.19.0-py2.py3-none-any.whl (176 kB)
Using cached google_cloud_dlp-3.16.0-py2.py3-none-any.whl (167 kB)
Using cached google_cloud_language-2.13.3-py2.py3-none-any.whl (143 kB)
Using cached google_cloud_pubsub-2.21.0-py2.py3-none-any.whl (273 kB)
Using cached google_cloud_pubsublite-1.9.0-py2.py3-none-any.whl (287 kB)
Using cached google_cloud_recommendations_ai-0.10.10-py2.py3-none-any.whl (180 kB)
Using cached google_cloud_spanner-3.44.0-py2.py3-none-any.whl (357 kB)
Using cached google_cloud_storage-2.16.0-py2.py3-none-any.whl (125 kB)
Using cached google_cloud_videointelligence-2.13.3-py2.py3-none-any.whl (240 kB)
Using cached google_cloud_vision-3.7.2-py2.py3-none-any.whl (459 kB)
Using cached grpcio-1.62.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (5.5 MB)
Using cached h5py-3.10.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.8 MB)
Using cached hdfs-2.7.0-py3-none-any.whl (34 kB)
Using cached httplib2-0.22.0-py3-none-any.whl (96 kB)
Using cached Js2Py-0.74-py3-none-any.whl (1.0 MB)
Using cached jsonpickle-3.0.3-py3-none-any.whl (40 kB)
Using cached jsonschema-4.21.1-py3-none-any.whl (85 kB)
Using cached kerchunk-0.2.4-py3-none-any.whl (3.3 MB)
Using cached numcodecs-0.12.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (7.9 MB)
Using cached numpy-1.26.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (18.3 MB)
Using cached objsize-0.7.0-py3-none-any.whl (11 kB)
Using cached orjson-3.9.15-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (138 kB)
Using cached packaging-24.0-py3-none-any.whl (53 kB)
Using cached proto_plus-1.23.0-py3-none-any.whl (48 kB)
Using cached protobuf-4.25.3-cp37-abi3-manylinux2014_x86_64.whl (294 kB)
Using cached pyarrow-14.0.2-cp311-cp311-manylinux_2_28_x86_64.whl (38.0 MB)
Using cached pyarrow_hotfix-0.6-py3-none-any.whl (7.9 kB)
Using cached pydot-1.4.2-py2.py3-none-any.whl (21 kB)
Using cached pymongo-4.6.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (680 kB)
Using cached python_dateutil-2.9.0.post0-py2.py3-none-any.whl (229 kB)
Using cached pytz-2024.1-py2.py3-none-any.whl (505 kB)
Using cached regex-2023.12.25-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (785 kB)
Using cached requests-2.31.0-py3-none-any.whl (62 kB)
Using cached typing_extensions-4.10.0-py3-none-any.whl (33 kB)
Using cached xarray-2024.2.0-py3-none-any.whl (1.1 MB)
Using cached zarr-2.17.1-py3-none-any.whl (207 kB)
Using cached zstandard-0.22.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (5.4 MB)
Using cached apache_beam-2.55.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (15.8 MB)
Using cached backoff-2.2.1-py3-none-any.whl (15 kB)
Using cached cftime-1.6.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.4 MB)
Using cached fastparquet-2024.2.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.8 MB)
Using cached google_auth_oauthlib-1.2.0-py2.py3-none-any.whl (24 kB)
Using cached h5netcdf-1.3.0-py3-none-any.whl (43 kB)
Using cached netCDF4-1.6.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (5.5 MB)
Using cached tqdm-4.66.2-py3-none-any.whl (78 kB)
Using cached aiosignal-1.3.1-py3-none-any.whl (7.6 kB)
Using cached attrs-23.2.0-py3-none-any.whl (60 kB)
Using cached certifi-2024.2.2-py3-none-any.whl (163 kB)
Using cached charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (140 kB)
Using cached click-8.1.7-py3-none-any.whl (97 kB)
Using cached cramjam-2.8.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.0 MB)
Using cached Deprecated-1.2.14-py2.py3-none-any.whl (9.6 kB)
Using cached dnspython-2.6.1-py3-none-any.whl (307 kB)
Using cached frozenlist-1.4.1-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (272 kB)
Using cached google_cloud_resource_manager-1.12.3-py2.py3-none-any.whl (333 kB)
Using cached google_crc32c-1.5.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Using cached google_resumable_media-2.7.0-py2.py3-none-any.whl (80 kB)
Using cached googleapis_common_protos-1.63.0-py2.py3-none-any.whl (229 kB)
Using cached grpc_google_iam_v1-0.13.0-py2.py3-none-any.whl (25 kB)
Using cached grpc_interceptor-0.15.4-py3-none-any.whl (20 kB)
Using cached grpcio_status-1.62.1-py3-none-any.whl (14 kB)
Using cached idna-3.6-py3-none-any.whl (61 kB)
Using cached importlib_metadata-7.1.0-py3-none-any.whl (24 kB)
Using cached jsonschema_specifications-2023.12.1-py3-none-any.whl (18 kB)
Using cached multidict-6.0.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (128 kB)
Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Using cached overrides-7.7.0-py3-none-any.whl (17 kB)
Using cached pandas-2.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (13.0 MB)
Using cached partd-1.4.1-py3-none-any.whl (18 kB)
Using cached pyasn1_modules-0.4.0-py3-none-any.whl (181 kB)
Using cached pyparsing-3.1.2-py3-none-any.whl (103 kB)
Using cached PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (757 kB)
Using cached referencing-0.34.0-py3-none-any.whl (26 kB)
Using cached requests_oauthlib-2.0.0-py2.py3-none-any.whl (24 kB)
Using cached rpds_py-0.18.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.1 MB)
Using cached rsa-4.9-py3-none-any.whl (34 kB)
Using cached shapely-2.0.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.5 MB)
Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Using cached toolz-0.12.1-py3-none-any.whl (56 kB)
Using cached tzlocal-5.2-py3-none-any.whl (17 kB)
Using cached urllib3-2.2.1-py3-none-any.whl (121 kB)
Using cached yarl-1.9.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (328 kB)
Using cached ujson-5.9.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (53 kB)
Using cached oauthlib-3.2.2-py3-none-any.whl (151 kB)
Using cached pyasn1-0.6.0-py2.py3-none-any.whl (85 kB)
Using cached tzdata-2024.1-py2.py3-none-any.whl (345 kB)
Using cached wrapt-1.16.0-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (80 kB)
Using cached zipp-3.18.1-py3-none-any.whl (8.2 kB)
Using cached locket-1.0.0-py2.py3-none-any.whl (4.4 kB)
Installing collected packages: pytz, pyjsparser, docopt, crcmod, asciitree, zstandard, zipp, wrapt, urllib3, ujson, tzlocal, tzdata, typing-extensions, tqdm, toolz, sqlparse, six, rpds-py, regex, pyyaml, pyparsing, pyasn1, pyarrow-hotfix, protobuf, packaging, overrides, orjson, objsize, oauthlib, numpy, multidict, locket, jsonpickle, idna, grpcio, google-crc32c, fsspec, frozenlist, fasteners, fastavro, dnspython, dill, decorator, cramjam, cloudpickle, click, charset-normalizer, certifi, cachetools, backoff, attrs, yarl, shapely, rsa, requests, referencing, python-dateutil, pymongo, pydot, pyasn1-modules, pyarrow, proto-plus, partd, numcodecs, js2py, importlib-metadata, httplib2, h5py, grpc-interceptor, googleapis-common-protos, google-resumable-media, deprecated, cftime, aiosignal, zarr, requests-oauthlib, pandas, oauth2client, netcdf4, jsonschema-specifications, hdfs, h5netcdf, grpcio-status, google-auth, dask, aiohttp, xarray, pangeo-forge-esgf, kerchunk, jsonschema, grpc-google-iam-v1, google-auth-oauthlib, google-auth-httplib2, google-apitools, google-api-core, fastparquet, google-cloud-core, dynamic-chunks, apache-beam, pangeo-forge-recipes, google-cloud-vision, google-cloud-videointelligence, google-cloud-storage, google-cloud-spanner, google-cloud-resource-manager, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, google-cloud-aiplatform, gcsfs
Successfully installed aiohttp-3.9.3 aiosignal-1.3.1 apache-beam-2.55.0 asciitree-0.3.3 attrs-23.2.0 backoff-2.2.1 cachetools-5.3.3 certifi-2024.2.2 cftime-1.6.3 charset-normalizer-3.3.2 click-8.1.7 cloudpickle-2.2.1 cramjam-2.8.3 crcmod-1.7 dask-2024.3.1 decorator-5.1.1 deprecated-1.2.14 dill-0.3.1.1 dnspython-2.6.1 docopt-0.6.2 dynamic-chunks-0.0.2 fastavro-1.9.4 fasteners-0.19 fastparquet-2024.2.0 frozenlist-1.4.1 fsspec-2024.3.1 gcsfs-2024.3.1 google-api-core-2.18.0 google-apitools-0.5.31 google-auth-2.29.0 google-auth-httplib2-0.1.1 google-auth-oauthlib-1.2.0 google-cloud-aiplatform-1.44.0 google-cloud-bigquery-3.19.0 google-cloud-bigquery-storage-2.24.0 google-cloud-bigtable-2.23.0 google-cloud-core-2.4.1 google-cloud-datastore-2.19.0 google-cloud-dlp-3.16.0 google-cloud-language-2.13.3 google-cloud-pubsub-2.21.0 google-cloud-pubsublite-1.9.0 google-cloud-recommendations-ai-0.10.10 google-cloud-resource-manager-1.12.3 google-cloud-spanner-3.44.0 google-cloud-storage-2.16.0 google-cloud-videointelligence-2.13.3 google-cloud-vision-3.7.2 google-crc32c-1.5.0 google-resumable-media-2.7.0 googleapis-common-protos-1.63.0 grpc-google-iam-v1-0.13.0 grpc-interceptor-0.15.4 grpcio-1.62.1 grpcio-status-1.62.1 h5netcdf-1.3.0 h5py-3.10.0 hdfs-2.7.0 httplib2-0.22.0 idna-3.6 importlib-metadata-7.1.0 js2py-0.74 jsonpickle-3.0.3 jsonschema-4.21.1 jsonschema-specifications-2023.12.1 kerchunk-0.2.4 locket-1.0.0 multidict-6.0.5 netcdf4-1.6.5 numcodecs-0.12.1 numpy-1.26.4 oauth2client-4.1.3 oauthlib-3.2.2 objsize-0.7.0 orjson-3.9.15 overrides-7.7.0 packaging-24.0 pandas-2.2.1 pangeo-forge-esgf-0.1.1 pangeo-forge-recipes-0.10.6 partd-1.4.1 proto-plus-1.23.0 protobuf-4.25.3 pyarrow-14.0.2 pyarrow-hotfix-0.6 pyasn1-0.6.0 pyasn1-modules-0.4.0 pydot-1.4.2 pyjsparser-2.7.1 pymongo-4.6.2 pyparsing-3.1.2 python-dateutil-2.9.0.post0 pytz-2024.1 pyyaml-6.0.1 referencing-0.34.0 regex-2023.12.25 requests-2.31.0 requests-oauthlib-2.0.0 rpds-py-0.18.0 rsa-4.9 shapely-2.0.3 six-1.16.0 sqlparse-0.4.4 toolz-0.12.1 tqdm-4.66.2 typing-extensions-4.10.0 tzdata-2024.1 tzlocal-5.2 ujson-5.9.0 urllib3-2.2.1 wrapt-1.16.0 xarray-2024.2.0 yarl-1.9.4 zarr-2.17.1 zipp-3.18.1 zstandard-0.22.0
Using Any for unsupported type: typing.Sequence[~T]
Target Storage is FSSpecTarget(LocalFileSystem(, root_path="/home/jovyan/AREAS/PGF/hanging_bug/output/{job_name}/output")

Input Cache Storage is CacheFSSpecTarget(LocalFileSystem(, root_path="/home/jovyan/AREAS/PGF/hanging_bug/output/cache/")

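For orientation: the two storage locations above are pangeo-forge-recipes storage wrappers around an fsspec filesystem. A rough sketch of equivalent objects, assuming the 0.10.x `pangeo_forge_recipes.storage` API (the paths are taken from the log; everything else here is illustrative):

```python
from fsspec.implementations.local import LocalFileSystem
from pangeo_forge_recipes.storage import CacheFSSpecTarget, FSSpecTarget

fs = LocalFileSystem()

# Where the final Zarr store lands ({job_name} is templated in by the runner).
target = FSSpecTarget(fs, root_path="/home/jovyan/AREAS/PGF/hanging_bug/output/{job_name}/output")

# Where the source files were already cached before the hang was observed.
cache = CacheFSSpecTarget(fs, root_path="/home/jovyan/AREAS/PGF/hanging_bug/output/cache/")
```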
Parsing recipes...
Baking only recipe_id='test'
Job name is simple_test
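These lines come from `pangeo-forge-runner bake`. For reference, a hedged sketch of a local-run config that would produce them; the trait names follow pangeo-forge-runner's traitlets config, but treat the exact spelling (especially the worker-count trait on `LocalDirectBakery`) as an assumption rather than a verified API:

```python
# config.py, passed to: pangeo-forge-runner bake --repo=<feedstock> -f config.py
c.Bake.recipe_id = "test"        # matches "Baking only recipe_id='test'"
c.Bake.job_name = "simple_test"  # matches "Job name is simple_test"
c.Bake.bakery_class = "pangeo_forge_runner.bakery.local.LocalDirectBakery"
c.LocalDirectBakery.num_workers = 16  # assumed trait name; the logs show direct_num_workers=16

c.TargetStorage.fsspec_class = "fsspec.implementations.local.LocalFileSystem"
c.TargetStorage.root_path = "/home/jovyan/AREAS/PGF/hanging_bug/output/{job_name}/output"

c.InputCacheStorage.fsspec_class = "fsspec.implementations.local.LocalFileSystem"
c.InputCacheStorage.root_path = "/home/jovyan/AREAS/PGF/hanging_bug/output/cache/"
```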
Converting string literal type hint to Any: "beam.PCollection[Tuple[Index, xr.Dataset]]"
Converting string literal type hint to Any: "beam.PCollection[zarr.storage.FSStore]"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "Indexed[T]"
Converting string literal type hint to Any: "XarraySchema"
Converting string literal type hint to Any: "Indexed[T]"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection"
Storing Zarr with target_chunks ={'x': 1} to FSSpecTarget(fs=<fsspec.implementations.local.LocalFileSystem object at 0x7f418d078450>, root_path='/home/jovyan/AREAS/PGF/hanging_bug/output/simple_test/output/simple_test.zarr', fsspec_kwargs={})
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "XarraySchema"
Converting string literal type hint to Any: "zarr.storage.FSStore"
Converting string literal type hint to Any: "Optional[Dict[str, int]]"
Converting string literal type hint to Any: "Optional[Dict[str, str]]"
Converting string literal type hint to Any: "Optional[bool]"
Converting string literal type hint to Any: "Optional[Dict]"
Converting string literal type hint to Any: "zarr.storage.FSStore"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection[Tuple[Index, xr.Dataset]]"
Converting string literal type hint to Any: "beam.PCollection[zarr.storage.FSStore]"
Converting string literal type hint to Any: "beam.PCollection[zarr.storage.FSStore]"
Converting string literal type hint to Any: "beam.PCollection[zarr.storage.FSStore]"
Converting string literal type hint to Any: "beam.PCollection[zarr.storage.FSStore]"
Converting string literal type hint to Any: "beam.PCollection[zarr.storage.FSStore]"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection"
Using Any for unsupported type: typing.MutableMapping
Using Any for unsupported type: typing.MutableMapping
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection"
Converting string literal type hint to Any: "beam.PCollection"
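The important line in the block above is `Storing Zarr with target_chunks ={'x': 1}`: the recipe rechunks to single-element chunks along `x` while writing `simple_test.zarr`. A minimal sketch of the kind of test recipe that would emit that line, assuming the pangeo-forge-recipes 0.10.x transforms (the URLs and the concat dimension are placeholders, not the exact MRE):

```python
import apache_beam as beam
from pangeo_forge_recipes.patterns import pattern_from_file_sequence
from pangeo_forge_recipes.transforms import OpenURLWithFSSpec, OpenWithXarray, StoreToZarr

# Placeholder inputs; the real runs use the CMIP6 URLs listed at the top of this issue.
urls = ["https://example.com/part_0.nc", "https://example.com/part_1.nc"]
pattern = pattern_from_file_sequence(urls, concat_dim="x")

recipe = (
    beam.Create(pattern.items())
    | OpenURLWithFSSpec()  # the cache target is injected by the runner
    | OpenWithXarray(file_type=pattern.file_type)
    | StoreToZarr(
        store_name="simple_test.zarr",
        combine_dims=pattern.combine_dim_keys,
        target_chunks={"x": 1},  # the rechunk step that appears to hang
    )
)
```

The rechunk happens inside `StoreToZarr`, where each input fragment is split to the target chunk grid before the pieces are regrouped and written into the shared store.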
Running job for recipe test

==================== <function annotate_downstream_side_inputs at 0x7f418b674720> ====================
==================== <function fix_side_input_pcoll_coders at 0x7f418b674860> ====================
==================== <function pack_combiners at 0x7f418b674e00> ====================
==================== <function lift_combiners at 0x7f418b674ea0> ====================
==================== <function expand_sdf at 0x7f418b675080> ====================
==================== <function expand_gbk at 0x7f418b675120> ====================
==================== <function sink_flattens at 0x7f418b675260> ====================
==================== <function greedily_fuse at 0x7f418b675300> ====================
==================== <function read_to_impulse at 0x7f418b6753a0> ====================
==================== <function impulse_to_input at 0x7f418b675440> ====================
==================== <function sort_stages at 0x7f418b6756c0> ====================
==================== <function add_impulse_to_dangling_transforms at 0x7f418b675800> ====================
==================== <function setup_timer_mapping at 0x7f418b675620> ====================
==================== <function populate_data_channel_coders at 0x7f418b675760> ====================
starting control server on port 33051
starting data server on port 46465
starting state server on port 34557
starting logging server on port 45721
Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.SubprocessSdkWorkerHandler object at 0x7f418b77ec50> for environment ref_Environment_default_environment_1 (beam:env:harness_subprocess_python:v1, b'/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/bin/python -m apache_beam.runners.worker.sdk_worker_main')
Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.SubprocessSdkWorkerHandler object at 0x7f418b494710> for environment ref_Environment_default_environment_1 (beam:env:harness_subprocess_python:v1, b'/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/bin/python -m apache_beam.runners.worker.sdk_worker_main')
Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.SubprocessSdkWorkerHandler object at 0x7f418b495150> for environment ref_Environment_default_environment_1 (beam:env:harness_subprocess_python:v1, b'/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/bin/python -m apache_beam.runners.worker.sdk_worker_main')
Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.SubprocessSdkWorkerHandler object at 0x7f418b495b90> for environment ref_Environment_default_environment_1 (beam:env:harness_subprocess_python:v1, b'/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/bin/python -m apache_beam.runners.worker.sdk_worker_main')
Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.SubprocessSdkWorkerHandler object at 0x7f418b496650> for environment ref_Environment_default_environment_1 (beam:env:harness_subprocess_python:v1, b'/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/bin/python -m apache_beam.runners.worker.sdk_worker_main')
Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.SubprocessSdkWorkerHandler object at 0x7f418b497150> for environment ref_Environment_default_environment_1 (beam:env:harness_subprocess_python:v1, b'/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/bin/python -m apache_beam.runners.worker.sdk_worker_main')
Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.SubprocessSdkWorkerHandler object at 0x7f418b4ac090> for environment ref_Environment_default_environment_1 (beam:env:harness_subprocess_python:v1, b'/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/bin/python -m apache_beam.runners.worker.sdk_worker_main')
Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.SubprocessSdkWorkerHandler object at 0x7f418b4aefd0> for environment ref_Environment_default_environment_1 (beam:env:harness_subprocess_python:v1, b'/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/bin/python -m apache_beam.runners.worker.sdk_worker_main')
Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.SubprocessSdkWorkerHandler object at 0x7f418b4bf950> for environment ref_Environment_default_environment_1 (beam:env:harness_subprocess_python:v1, b'/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/bin/python -m apache_beam.runners.worker.sdk_worker_main')
Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.SubprocessSdkWorkerHandler object at 0x7f418b4c66d0> for environment ref_Environment_default_environment_1 (beam:env:harness_subprocess_python:v1, b'/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/bin/python -m apache_beam.runners.worker.sdk_worker_main')
Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.SubprocessSdkWorkerHandler object at 0x7f418b43dc50> for environment ref_Environment_default_environment_1 (beam:env:harness_subprocess_python:v1, b'/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/bin/python -m apache_beam.runners.worker.sdk_worker_main')
Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.SubprocessSdkWorkerHandler object at 0x7f418b4e3710> for environment ref_Environment_default_environment_1 (beam:env:harness_subprocess_python:v1, b'/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/bin/python -m apache_beam.runners.worker.sdk_worker_main')
Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.SubprocessSdkWorkerHandler object at 0x7f418b732bd0> for environment ref_Environment_default_environment_1 (beam:env:harness_subprocess_python:v1, b'/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/bin/python -m apache_beam.runners.worker.sdk_worker_main')
Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.SubprocessSdkWorkerHandler object at 0x7f418b4f63d0> for environment ref_Environment_default_environment_1 (beam:env:harness_subprocess_python:v1, b'/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/bin/python -m apache_beam.runners.worker.sdk_worker_main')
Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.SubprocessSdkWorkerHandler object at 0x7f418b4f6ad0> for environment ref_Environment_default_environment_1 (beam:env:harness_subprocess_python:v1, b'/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/bin/python -m apache_beam.runners.worker.sdk_worker_main')
Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.SubprocessSdkWorkerHandler object at 0x7f418b4f7490> for environment ref_Environment_default_environment_1 (beam:env:harness_subprocess_python:v1, b'/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/bin/python -m apache_beam.runners.worker.sdk_worker_main')
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 37909746 } message: "semi_persistent_directory: None" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:110" thread: "MainThread"
Worker: severity: WARN timestamp { seconds: 1711492592 nanos: 42463779 } message: "Discarding unparseable args: ['--direct_runner_use_stacked_bundle', '--pipeline_type_check']" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/options/pipeline_options.py:372" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 44964790 } message: "Pipeline_options: {'runner': 'DirectRunner', 'direct_num_workers': 16, 'direct_running_mode': 'multi_processing', 'gcp_oauth_scopes': ['https://www.googleapis.com/auth/bigquery', 'https://www.googleapis.com/auth/cloud-platform', 'https://www.googleapis.com/auth/devstorage.full_control', 'https://www.googleapis.com/auth/userinfo.email', 'https://www.googleapis.com/auth/datastore', 'https://www.googleapis.com/auth/spanner.admin', 'https://www.googleapis.com/auth/spanner.data', 'https://www.googleapis.com/auth/bigquery', 'https://www.googleapis.com/auth/cloud-platform', 'https://www.googleapis.com/auth/devstorage.full_control', 'https://www.googleapis.com/auth/userinfo.email', 'https://www.googleapis.com/auth/datastore', 'https://www.googleapis.com/auth/spanner.admin', 'https://www.googleapis.com/auth/spanner.data'], 'requirements_file': 'feedstock/requirements.txt', 'pickle_library': 'cloudpickle', 'save_main_session': True, 'sdk_worker_parallelism': '1', 'environment_cache_millis': '0'}" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:134" thread: "MainThread"
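That `Pipeline_options` dump is the key configuration detail: the job runs on the Python DirectRunner in `multi_processing` mode with 16 workers, which is why sixteen `SubprocessSdkWorkerHandler` entries appear further up. A standalone equivalent of those options using standard Beam flags (a sketch, not necessarily how the runner constructs them internally):

```python
from apache_beam.options.pipeline_options import PipelineOptions

# Mirrors the options logged by the SDK workers above.
options = PipelineOptions(
    runner="DirectRunner",
    direct_num_workers=16,
    direct_running_mode="multi_processing",
    pickle_library="cloudpickle",
    save_main_session=True,
)
```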
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 51962614 } message: "Creating state cache with size 0" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/statecache.py:234" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 52346229 } message: "Creating insecure control channel for localhost:33051." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:189" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 33755302 } message: "semi_persistent_directory: None" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:110" thread: "MainThread"
Worker: severity: WARN timestamp { seconds: 1711492592 nanos: 37890195 } message: "Discarding unparseable args: ['--direct_runner_use_stacked_bundle', '--pipeline_type_check']" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/options/pipeline_options.py:372" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 40186643 } message: "Pipeline_options: {'runner': 'DirectRunner', 'direct_num_workers': 16, 'direct_running_mode': 'multi_processing', 'gcp_oauth_scopes': ['https://www.googleapis.com/auth/bigquery', 'https://www.googleapis.com/auth/cloud-platform', 'https://www.googleapis.com/auth/devstorage.full_control', 'https://www.googleapis.com/auth/userinfo.email', 'https://www.googleapis.com/auth/datastore', 'https://www.googleapis.com/auth/spanner.admin', 'https://www.googleapis.com/auth/spanner.data', 'https://www.googleapis.com/auth/bigquery', 'https://www.googleapis.com/auth/cloud-platform', 'https://www.googleapis.com/auth/devstorage.full_control', 'https://www.googleapis.com/auth/userinfo.email', 'https://www.googleapis.com/auth/datastore', 'https://www.googleapis.com/auth/spanner.admin', 'https://www.googleapis.com/auth/spanner.data'], 'requirements_file': 'feedstock/requirements.txt', 'pickle_library': 'cloudpickle', 'save_main_session': True, 'sdk_worker_parallelism': '1', 'environment_cache_millis': '0'}" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:134" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 46410799 } message: "Creating state cache with size 0" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/statecache.py:234" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 46755790 } message: "Creating insecure control channel for localhost:33051." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:189" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 48684835 } message: "Control channel established." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:197" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 49096107 } message: "Initializing SDKHarness with unbounded number of workers." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:247" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 51619291 } message: "Python sdk harness starting." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:211" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 57036876 } message: "Creating insecure state channel for localhost:34557." instruction_id: "bundle_1" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:910" thread: "Thread-12"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 57407140 } message: "State channel established." instruction_id: "bundle_1" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:917" thread: "Thread-12"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 63988208 } message: "Control channel established." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:197" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 64730405 } message: "Initializing SDKHarness with unbounded number of workers." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:247" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 67382097 } message: "Python sdk harness starting." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:211" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 73147058 } message: "Creating client data channel for localhost:46465" instruction_id: "bundle_1" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/data_plane.py:787" thread: "Thread-12"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 82162857 } message: "Creating insecure state channel for localhost:34557." instruction_id: "bundle_16" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:910" thread: "Thread-12"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 82511425 } message: "State channel established." instruction_id: "bundle_16" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:917" thread: "Thread-12"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 87887525 } message: "Creating client data channel for localhost:46465" instruction_id: "bundle_16" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/data_plane.py:787" thread: "Thread-12"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 79263210 } message: "semi_persistent_directory: None" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:110" thread: "MainThread"
Worker: severity: WARN timestamp { seconds: 1711492592 nanos: 82616090 } message: "Discarding unparseable args: ['--direct_runner_use_stacked_bundle', '--pipeline_type_check']" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/options/pipeline_options.py:372" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 84073543 } message: "Pipeline_options: {'runner': 'DirectRunner', 'direct_num_workers': 16, 'direct_running_mode': 'multi_processing', 'gcp_oauth_scopes': ['https://www.googleapis.com/auth/bigquery', 'https://www.googleapis.com/auth/cloud-platform', 'https://www.googleapis.com/auth/devstorage.full_control', 'https://www.googleapis.com/auth/userinfo.email', 'https://www.googleapis.com/auth/datastore', 'https://www.googleapis.com/auth/spanner.admin', 'https://www.googleapis.com/auth/spanner.data', 'https://www.googleapis.com/auth/bigquery', 'https://www.googleapis.com/auth/cloud-platform', 'https://www.googleapis.com/auth/devstorage.full_control', 'https://www.googleapis.com/auth/userinfo.email', 'https://www.googleapis.com/auth/datastore', 'https://www.googleapis.com/auth/spanner.admin', 'https://www.googleapis.com/auth/spanner.data'], 'requirements_file': 'feedstock/requirements.txt', 'pickle_library': 'cloudpickle', 'save_main_session': True, 'sdk_worker_parallelism': '1', 'environment_cache_millis': '0'}" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:134" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 89889526 } message: "Creating state cache with size 0" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/statecache.py:234" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 90237855 } message: "Creating insecure control channel for localhost:33051." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:189" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 93428850 } message: "Control channel established." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:197" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 93870878 } message: "semi_persistent_directory: None" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:110" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 73761463 } message: "semi_persistent_directory: None" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:110" thread: "MainThread"
Worker: severity: WARN timestamp { seconds: 1711492592 nanos: 78383207 } message: "Discarding unparseable args: ['--direct_runner_use_stacked_bundle', '--pipeline_type_check']" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/options/pipeline_options.py:372" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 81748723 } message: "Pipeline_options: {'runner': 'DirectRunner', 'direct_num_workers': 16, 'direct_running_mode': 'multi_processing', 'gcp_oauth_scopes': ['https://www.googleapis.com/auth/bigquery', 'https://www.googleapis.com/auth/cloud-platform', 'https://www.googleapis.com/auth/devstorage.full_control', 'https://www.googleapis.com/auth/userinfo.email', 'https://www.googleapis.com/auth/datastore', 'https://www.googleapis.com/auth/spanner.admin', 'https://www.googleapis.com/auth/spanner.data', 'https://www.googleapis.com/auth/bigquery', 'https://www.googleapis.com/auth/cloud-platform', 'https://www.googleapis.com/auth/devstorage.full_control', 'https://www.googleapis.com/auth/userinfo.email', 'https://www.googleapis.com/auth/datastore', 'https://www.googleapis.com/auth/spanner.admin', 'https://www.googleapis.com/auth/spanner.data'], 'requirements_file': 'feedstock/requirements.txt', 'pickle_library': 'cloudpickle', 'save_main_session': True, 'sdk_worker_parallelism': '1', 'environment_cache_millis': '0'}" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:134" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 88768243 } message: "Creating state cache with size 0" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/statecache.py:234" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 89213371 } message: "Creating insecure control channel for localhost:33051." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:189" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 96416950 } message: "Control channel established." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:197" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 86698055 } message: "semi_persistent_directory: None" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:110" thread: "MainThread"
Worker: severity: WARN timestamp { seconds: 1711492592 nanos: 90617895 } message: "Discarding unparseable args: ['--direct_runner_use_stacked_bundle', '--pipeline_type_check']" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/options/pipeline_options.py:372" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 92302560 } message: "Pipeline_options: {'runner': 'DirectRunner', 'direct_num_workers': 16, 'direct_running_mode': 'multi_processing', 'gcp_oauth_scopes': ['https://www.googleapis.com/auth/bigquery', 'https://www.googleapis.com/auth/cloud-platform', 'https://www.googleapis.com/auth/devstorage.full_control', 'https://www.googleapis.com/auth/userinfo.email', 'https://www.googleapis.com/auth/datastore', 'https://www.googleapis.com/auth/spanner.admin', 'https://www.googleapis.com/auth/spanner.data', 'https://www.googleapis.com/auth/bigquery', 'https://www.googleapis.com/auth/cloud-platform', 'https://www.googleapis.com/auth/devstorage.full_control', 'https://www.googleapis.com/auth/userinfo.email', 'https://www.googleapis.com/auth/datastore', 'https://www.googleapis.com/auth/spanner.admin', 'https://www.googleapis.com/auth/spanner.data'], 'requirements_file': 'feedstock/requirements.txt', 'pickle_library': 'cloudpickle', 'save_main_session': True, 'sdk_worker_parallelism': '1', 'environment_cache_millis': '0'}" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:134" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 96750497 } message: "Creating state cache with size 0" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/statecache.py:234" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 96970319 } message: "Creating insecure control channel for localhost:33051." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:189" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 97162008 } message: "Initializing SDKHarness with unbounded number of workers." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:247" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 98423242 } message: "Python sdk harness starting." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:211" thread: "MainThread"
Worker: severity: WARN timestamp { seconds: 1711492592 nanos: 98398685 } message: "Discarding unparseable args: ['--direct_runner_use_stacked_bundle', '--pipeline_type_check']" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/options/pipeline_options.py:372" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 100009441 } message: "Pipeline_options: {'runner': 'DirectRunner', 'direct_num_workers': 16, 'direct_running_mode': 'multi_processing', 'gcp_oauth_scopes': ['https://www.googleapis.com/auth/bigquery', 'https://www.googleapis.com/auth/cloud-platform', 'https://www.googleapis.com/auth/devstorage.full_control', 'https://www.googleapis.com/auth/userinfo.email', 'https://www.googleapis.com/auth/datastore', 'https://www.googleapis.com/auth/spanner.admin', 'https://www.googleapis.com/auth/spanner.data', 'https://www.googleapis.com/auth/bigquery', 'https://www.googleapis.com/auth/cloud-platform', 'https://www.googleapis.com/auth/devstorage.full_control', 'https://www.googleapis.com/auth/userinfo.email', 'https://www.googleapis.com/auth/datastore', 'https://www.googleapis.com/auth/spanner.admin', 'https://www.googleapis.com/auth/spanner.data'], 'requirements_file': 'feedstock/requirements.txt', 'pickle_library': 'cloudpickle', 'save_main_session': True, 'sdk_worker_parallelism': '1', 'environment_cache_millis': '0'}" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:134" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 98696231 } message: "Initializing SDKHarness with unbounded number of workers." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:247" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 100250005 } message: "Python sdk harness starting." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:211" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 103517532 } message: "Creating state cache with size 0" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/statecache.py:234" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 103827714 } message: "Creating insecure control channel for localhost:33051." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:189" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 104492425 } message: "Control channel established." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:197" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 105378150 } message: "Control channel established." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:197" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 95809698 } message: "semi_persistent_directory: None" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:110" thread: "MainThread"
Worker: severity: WARN timestamp { seconds: 1711492592 nanos: 99285602 } message: "Discarding unparseable args: ['--direct_runner_use_stacked_bundle', '--pipeline_type_check']" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/options/pipeline_options.py:372" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 100885391 } message: "Pipeline_options: {'runner': 'DirectRunner', 'direct_num_workers': 16, 'direct_running_mode': 'multi_processing', 'gcp_oauth_scopes': ['https://www.googleapis.com/auth/bigquery', 'https://www.googleapis.com/auth/cloud-platform', 'https://www.googleapis.com/auth/devstorage.full_control', 'https://www.googleapis.com/auth/userinfo.email', 'https://www.googleapis.com/auth/datastore', 'https://www.googleapis.com/auth/spanner.admin', 'https://www.googleapis.com/auth/spanner.data', 'https://www.googleapis.com/auth/bigquery', 'https://www.googleapis.com/auth/cloud-platform', 'https://www.googleapis.com/auth/devstorage.full_control', 'https://www.googleapis.com/auth/userinfo.email', 'https://www.googleapis.com/auth/datastore', 'https://www.googleapis.com/auth/spanner.admin', 'https://www.googleapis.com/auth/spanner.data'], 'requirements_file': 'feedstock/requirements.txt', 'pickle_library': 'cloudpickle', 'save_main_session': True, 'sdk_worker_parallelism': '1', 'environment_cache_millis': '0'}" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:134" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 104528188 } message: "Creating state cache with size 0" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/statecache.py:234" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 104755401 } message: "Creating insecure control channel for localhost:33051." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:189" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 106796741 } message: "Control channel established." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:197" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 105781316 } message: "Initializing SDKHarness with unbounded number of workers." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:247" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 107492923 } message: "Python sdk harness starting." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:211" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 105438709 } message: "Initializing SDKHarness with unbounded number of workers." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:247" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 107643842 } message: "Python sdk harness starting." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:211" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 101886987 } message: "semi_persistent_directory: None" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:110" thread: "MainThread"
Worker: severity: WARN timestamp { seconds: 1711492592 nanos: 105623960 } message: "Discarding unparseable args: ['--direct_runner_use_stacked_bundle', '--pipeline_type_check']" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/options/pipeline_options.py:372" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 107106208 } message: "Pipeline_options: {'runner': 'DirectRunner', 'direct_num_workers': 16, 'direct_running_mode': 'multi_processing', 'gcp_oauth_scopes': ['https://www.googleapis.com/auth/bigquery', 'https://www.googleapis.com/auth/cloud-platform', 'https://www.googleapis.com/auth/devstorage.full_control', 'https://www.googleapis.com/auth/userinfo.email', 'https://www.googleapis.com/auth/datastore', 'https://www.googleapis.com/auth/spanner.admin', 'https://www.googleapis.com/auth/spanner.data', 'https://www.googleapis.com/auth/bigquery', 'https://www.googleapis.com/auth/cloud-platform', 'https://www.googleapis.com/auth/devstorage.full_control', 'https://www.googleapis.com/auth/userinfo.email', 'https://www.googleapis.com/auth/datastore', 'https://www.googleapis.com/auth/spanner.admin', 'https://www.googleapis.com/auth/spanner.data'], 'requirements_file': 'feedstock/requirements.txt', 'pickle_library': 'cloudpickle', 'save_main_session': True, 'sdk_worker_parallelism': '1', 'environment_cache_millis': '0'}" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:134" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 107265710 } message: "Initializing SDKHarness with unbounded number of workers." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:247" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 110610485 } message: "Python sdk harness starting." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:211" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 109577178 } message: "semi_persistent_directory: None" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:110" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 112775564 } message: "Creating state cache with size 0" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/statecache.py:234" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 113094329 } message: "Creating insecure control channel for localhost:33051." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:189" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 113529920 } message: "Creating insecure state channel for localhost:34557." instruction_id: "bundle_2" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:910" thread: "Thread-13"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 113919258 } message: "State channel established." instruction_id: "bundle_2" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:917" thread: "Thread-13"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 115277528 } message: "Control channel established." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:197" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 115544080 } message: "Creating client data channel for localhost:46465" instruction_id: "bundle_2" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/data_plane.py:787" thread: "Thread-13"
Worker: severity: WARN timestamp { seconds: 1711492592 nanos: 114557504 } message: "Discarding unparseable args: ['--direct_runner_use_stacked_bundle', '--pipeline_type_check']" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/options/pipeline_options.py:372" thread: "MainThread"
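For reference, the repeated `Pipeline_options` entries above correspond to a DirectRunner configuration roughly like the sketch below (the GCP OAuth scopes and `requirements_file` are injected by the deployment, so they are omitted here):

```python
from apache_beam.options.pipeline_options import PipelineOptions

# Sketch of the options recorded in the worker logs: 16 local worker
# processes (one OS process each) with cloudpickle serialization.
options = PipelineOptions(
    runner="DirectRunner",
    direct_num_workers=16,
    direct_running_mode="multi_processing",
    pickle_library="cloudpickle",
    save_main_session=True,
)
```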
[... the same SDK-harness start-up sequence ("Pipeline_options: {...}", "Discarding unparseable args", "Creating state cache with size 0", "Creating insecure control/state channel", "Control/State channel established", "Creating client data channel for localhost:46465") repeats here once per DirectRunner worker process, covering bundles 3 through 15; only the timestamps, instruction_ids, and thread names differ between repeats ...]
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 591678380 } message: "Caching file '/home/jovyan/AREAS/PGF/hanging_bug/simple_example/file_b.nc'" instruction_id: "bundle_17" transform_id: "Create|OpenURLWithFSSpec|OpenWithXarray|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/OpenURLWithFSSpec/MapWithConcurrencyLimit/open_url" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/pangeo_forge_recipes/storage.py:197" thread: "Thread-12"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 593292951 } message: "File '/home/jovyan/AREAS/PGF/hanging_bug/simple_example/file_b.nc' is already cached" instruction_id: "bundle_17" transform_id: "Create|OpenURLWithFSSpec|OpenWithXarray|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/OpenURLWithFSSpec/MapWithConcurrencyLimit/open_url" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/pangeo_forge_recipes/storage.py:203" thread: "Thread-12"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 596072912 } message: "Caching file '/home/jovyan/AREAS/PGF/hanging_bug/simple_example/file_a.nc'" instruction_id: "bundle_18" transform_id: "Create|OpenURLWithFSSpec|OpenWithXarray|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/OpenURLWithFSSpec/MapWithConcurrencyLimit/open_url" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/pangeo_forge_recipes/storage.py:197" thread: "Thread-13"
Worker: severity: INFO timestamp { seconds: 1711492592 nanos: 598029851 } message: "File '/home/jovyan/AREAS/PGF/hanging_bug/simple_example/file_a.nc' is already cached" instruction_id: "bundle_18" transform_id: "Create|OpenURLWithFSSpec|OpenWithXarray|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/OpenURLWithFSSpec/MapWithConcurrencyLimit/open_url" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/pangeo_forge_recipes/storage.py:203" thread: "Thread-13"
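The transform chain named in these entries (`Create|OpenURLWithFSSpec|OpenWithXarray|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata`) implies a minimal recipe shaped roughly like the sketch below (assuming the current pangeo-forge-recipes Beam API; the two local NetCDF paths come straight from the log, while the store name is a placeholder and the cache/target locations are injected by the runner):

```python
import apache_beam as beam

from pangeo_forge_recipes.patterns import pattern_from_file_sequence
from pangeo_forge_recipes.transforms import (
    ConsolidateDimensionCoordinates,
    ConsolidateMetadata,
    OpenURLWithFSSpec,
    OpenWithXarray,
    StoreToZarr,
)

# The two cached inputs named in the log, concatenated along 'time'.
pattern = pattern_from_file_sequence(
    [
        "/home/jovyan/AREAS/PGF/hanging_bug/simple_example/file_a.nc",
        "/home/jovyan/AREAS/PGF/hanging_bug/simple_example/file_b.nc",
    ],
    concat_dim="time",
)

recipe = (
    beam.Create(pattern.items())
    | OpenURLWithFSSpec()  # the log shows both files already cached
    | OpenWithXarray()
    | StoreToZarr(
        store_name="simple_example.zarr",  # placeholder name
        combine_dims=pattern.combine_dim_keys,
        target_chunks={"x": 1},  # matches the split_fragment entries below
    )
    | ConsolidateDimensionCoordinates()
    | ConsolidateMetadata()
)
```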
Worker: severity: INFO timestamp { seconds: 1711492593 nanos: 246053934 } message: "Splitting fragment = ({Dimension(name='time', operation=<CombineOp.CONCAT: 2>): IndexedPosition(value=0, indexed=True, dimsize=4)}, <xarray.Dataset> Size: 64B\nDimensions: (x: 2, time: 2)\nCoordinates:\n * x (x) int64 16B 0 1\n * time (time) int64 16B 0 1\nData variables:\n xarray_dataarray_variable (x, time) int64 32B ...), with target_chunks = {'x': 1} and schema = {'attrs': {}, 'coords': {'time': {'dims': ('time',), 'attrs': {}, 'dtype': 'int64', 'shape': (4,), 'encoding': {'fletcher32': False, 'dtype': dtype('int64'), 'original_shape': (2,), 'chunksizes': None, 'shuffle': False}}, 'x': {'dims': ('x',), 'attrs': {}, 'dtype': 'int64', 'shape': (2,), 'encoding': {'fletcher32': False, 'dtype': dtype('int64'), 'original_shape': (2,), 'chunksizes': None, 'shuffle': False}}}, 'data_vars': {'xarray_dataarray_variable': {'dims': ('x', 'time'), 'attrs': {}, 'dtype': 'int64', 'shape': (2, 4), 'encoding': {'fletcher32': False, 'dtype': dtype('int64'), 'original_shape': (2, 2), 'chunksizes': None, 'shuffle': False}}}, 'dims': {'time': 4, 'x': 2}, 'chunks': {'time': {0: 2, 1: 2}}}" instruction_id: "bundle_75" transform_id: "Create|OpenURLWithFSSpec|OpenWithXarray|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/StoreToZarr/Rechunk/FlatMap(split_fragment)" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/pangeo_forge_recipes/rechunking.py:35" thread: "Thread-13"
Worker: severity: INFO timestamp { seconds: 1711492593 nanos: 249351263 } message: "Splitting fragment = ({Dimension(name='time', operation=<CombineOp.CONCAT: 2>): IndexedPosition(value=2, indexed=True, dimsize=4)}, <xarray.Dataset> Size: 64B\nDimensions: (x: 2, time: 2)\nCoordinates:\n * x (x) int64 16B 0 1\n * time (time) int64 16B 2 3\nData variables:\n xarray_dataarray_variable (x, time) int64 32B ...), with target_chunks = {'x': 1} and schema = {'attrs': {}, 'coords': {'time': {'dims': ('time',), 'attrs': {}, 'dtype': 'int64', 'shape': (4,), 'encoding': {'fletcher32': False, 'dtype': dtype('int64'), 'original_shape': (2,), 'chunksizes': None, 'shuffle': False}}, 'x': {'dims': ('x',), 'attrs': {}, 'dtype': 'int64', 'shape': (2,), 'encoding': {'fletcher32': False, 'dtype': dtype('int64'), 'original_shape': (2,), 'chunksizes': None, 'shuffle': False}}}, 'data_vars': {'xarray_dataarray_variable': {'dims': ('x', 'time'), 'attrs': {}, 'dtype': 'int64', 'shape': (2, 4), 'encoding': {'fletcher32': False, 'dtype': dtype('int64'), 'original_shape': (2, 2), 'chunksizes': None, 'shuffle': False}}}, 'dims': {'time': 4, 'x': 2}, 'chunks': {'time': {0: 2, 1: 2}}}" instruction_id: "bundle_76" transform_id: "Create|OpenURLWithFSSpec|OpenWithXarray|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/StoreToZarr/Rechunk/FlatMap(split_fragment)" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/pangeo_forge_recipes/rechunking.py:35" thread: "Thread-13"
Worker: severity: INFO timestamp { seconds: 1711492593 nanos: 292631864 } message: "Combining group = (('time', 0), ('x', 0)), containing fragments = [({Dimension(name='time', operation=<CombineOp.CONCAT: 2>): IndexedPosition(value=0, indexed=True, dimsize=4), Dimension(name='x', operation=<CombineOp.CONCAT: 2>): IndexedPosition(value=0, indexed=True, dimsize=2)}, <xarray.Dataset> Size: 40B\nDimensions: (x: 1, time: 2)\nCoordinates:\n * x (x) int64 8B 0\n * time (time) int64 16B 0 1\nData variables:\n xarray_dataarray_variable (x, time) int64 16B ...)]" instruction_id: "bundle_81" transform_id: "Create|OpenURLWithFSSpec|OpenWithXarray|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/StoreToZarr/Rechunk/MapTuple(combine_fragments)" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/pangeo_forge_recipes/rechunking.py:171" thread: "Thread-12"
Worker: severity: INFO timestamp { seconds: 1711492593 nanos: 294358253 } message: "Combining group = (('time', 1), ('x', 0)), containing fragments = [({Dimension(name='time', operation=<CombineOp.CONCAT: 2>): IndexedPosition(value=2, indexed=True, dimsize=4), Dimension(name='x', operation=<CombineOp.CONCAT: 2>): IndexedPosition(value=0, indexed=True, dimsize=2)}, <xarray.Dataset> Size: 40B\nDimensions: (x: 1, time: 2)\nCoordinates:\n * x (x) int64 8B 0\n * time (time) int64 16B 2 3\nData variables:\n xarray_dataarray_variable (x, time) int64 16B ...)]" instruction_id: "bundle_82" transform_id: "Create|OpenURLWithFSSpec|OpenWithXarray|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/StoreToZarr/Rechunk/MapTuple(combine_fragments)" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/pangeo_forge_recipes/rechunking.py:171" thread: "Thread-13"
Worker: severity: INFO timestamp { seconds: 1711492593 nanos: 318058490 } message: "Combining group = (('time', 1), ('x', 1)), containing fragments = [({Dimension(name='time', operation=<CombineOp.CONCAT: 2>): IndexedPosition(value=2, indexed=True, dimsize=4), Dimension(name='x', operation=<CombineOp.CONCAT: 2>): IndexedPosition(value=1, indexed=True, dimsize=2)}, <xarray.Dataset> Size: 40B\nDimensions: (x: 1, time: 2)\nCoordinates:\n * x (x) int64 8B 1\n * time (time) int64 16B 2 3\nData variables:\n xarray_dataarray_variable (x, time) int64 16B ...)]" instruction_id: "bundle_90" transform_id: "Create|OpenURLWithFSSpec|OpenWithXarray|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/StoreToZarr/Rechunk/MapTuple(combine_fragments)" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/pangeo_forge_recipes/rechunking.py:171" thread: "Thread-12"
Worker: severity: INFO timestamp { seconds: 1711492593 nanos: 318111896 } message: "Combining group = (('time', 0), ('x', 1)), containing fragments = [({Dimension(name='time', operation=<CombineOp.CONCAT: 2>): IndexedPosition(value=0, indexed=True, dimsize=4), Dimension(name='x', operation=<CombineOp.CONCAT: 2>): IndexedPosition(value=1, indexed=True, dimsize=2)}, <xarray.Dataset> Size: 40B\nDimensions: (x: 1, time: 2)\nCoordinates:\n * x (x) int64 8B 1\n * time (time) int64 16B 0 1\nData variables:\n xarray_dataarray_variable (x, time) int64 16B ...)]" instruction_id: "bundle_89" transform_id: "Create|OpenURLWithFSSpec|OpenWithXarray|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/StoreToZarr/Rechunk/MapTuple(combine_fragments)" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/pangeo_forge_recipes/rechunking.py:171" thread: "Thread-12"
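The schema in the split/combine entries also shows why there are exactly four groups: the concatenated target is `(x: 2, time: 4)` with inherited time chunks `{0: 2, 1: 2}`, and `target_chunks={'x': 1}` additionally splits `x` into two, so each of the two input fragments is split along `x` and each `(time, x)` group combines exactly one sub-fragment. The two inputs can plausibly be reconstructed as below (hypothetical values; the dims, dtype, and variable name are read off the logged reprs):

```python
import numpy as np
import xarray as xr

# Hypothetical reconstruction of the two (x: 2, time: 2) int64 inputs
# implied by the schema; file_a covers time 0-1, file_b covers time 2-3.
for path, t0 in [("file_a.nc", 0), ("file_b.nc", 2)]:
    xr.Dataset(
        {"xarray_dataarray_variable": (("x", "time"), np.zeros((2, 2), dtype="int64"))},
        coords={"x": [0, 1], "time": [t0, t0 + 1]},
    ).to_netcdf(path)
```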
Worker: severity: INFO timestamp { seconds: 1711492593 nanos: 967823266 } message: "No more requests from control plane" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:288" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492593 nanos: 967950582 } message: "SDK Harness waiting for in-flight requests to complete" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:289" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492593 nanos: 968000650 } message: "Closing all cached grpc data channels." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/data_plane.py:820" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492593 nanos: 968036174 } message: "Closing all cached gRPC state handlers." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:929" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492593 nanos: 969187736 } message: "Done consuming work." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:301" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492593 nanos: 969241619 } message: "Python sdk harness exiting." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:213" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492594 nanos: 379099369 } message: "No more requests from control plane" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:288" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492594 nanos: 379231214 } message: "SDK Harness waiting for in-flight requests to complete" log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:289" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492594 nanos: 379267930 } message: "Closing all cached grpc data channels." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/data_plane.py:820" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492594 nanos: 379305124 } message: "Closing all cached gRPC state handlers." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:929" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492594 nanos: 380215883 } message: "Done consuming work." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:301" thread: "MainThread"
Worker: severity: INFO timestamp { seconds: 1711492594 nanos: 380269527 } message: "Python sdk harness exiting." log_location: "/tmp/1160bdab0eb239dffe3428b6abeb9ec15eb3eb0ecc2d24cd199b044e6dcf0669603vyq1m/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:213" thread: "MainThread"
[... the same six-line SDK-worker shutdown sequence ("No more requests from control plane" through "Python sdk harness exiting.") repeats, with later timestamps, for each of the remaining workers ...]

So this 'finishes' but the output looks wrong!

[image]

There are 4 (!) output chunks.

xr.open_dataset(
    '/home/jovyan/AREAS/PGF/hanging_bug/simple_example/output/simple_test/output/simple_test.zarr',
    engine='zarr',
    chunks={}
)
[image]

Digging a bit further into the log output from above I see something that Nathan and I were wondering about earlier:

"Combining group = ((\'time\', 1), (\'x\', 1)), containing fragments = [({Dimension(name=\'time\', operation=<CombineOp.CONCAT: 2>): IndexedPosition(value=2, indexed=True, dimsize=4), Dimension(name=\'x\', operation=<CombineOp.CONCAT: 2>): IndexedPosition(value=1, indexed=True, dimsize=2)}, <xarray.Dataset> Size: 40B\nDimensions:  (x: 1, time: 2)\nCoordinates:\n  * x        (x) int64 8B 1\n  * time     (time) int64 16B 2 3\nData variables:\n    var      (x, time) int64 16B ...)]" instruction_id: "bundle_82" transform_id: "Create|OpenURLWithFSSpec|OpenWithXarray|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/StoreToZarr/Rechunk/MapTuple(combine_fragments)"

If I read this correctly, it means that this is not actually combining any of the (sub)fragments, since the group only contains a single fragment to begin with. That should not happen in this case.

I drew up a sketch of what I expected to happen in this case:
[image]

I think that the grouping should contain exactly two fragments (which differ in time) for each chunk in x.

What seems to happen instead is that it writes the smallest unit (is this a fragment or a subfragment? I am very confused by the terminology) as chunks to the target zarr store.

This will result in a TON of writes of tiny chunks in certain scenarios, which I guess could at least explain why jobs are slow, and maybe why the risk of some random hang gets exponentially increased?


Phew, this was a long one, but in summary I think that something is broken in the groupby/combine_fragments logic, which does not actually produce the correct groups/lists/tuples of fragments that should be combined into an output chunk.

@ranchodeluxe
Contributor

nice write-up and graphics @jbusecke 🥳, well explained

@rabernat
Contributor

Nice digging Julius!

I think that something is broken in the groupby/combine_fragments logic which does not actually produce the correct groups/lists/tuples of fragments that should be combined in an output chunk.

The logic of these functions is unit-tested relatively thoroughly in test_rechunking.py. If you think there is a bug here, the next step would be to eliminate the complexity of the recipe execution and isolate the logical error via a unit test.

In general, we try to follow the Beam-recommended practices for our PTransforms:

Expose large, non-trivial, reusable sequential bits of the transform’s code, which others might want to reuse in ways you haven’t anticipated, as a regular function or class library. The transform should simply wire this logic together.

That means we shouldn't have to run beam code at all to debug and diagnose the sort of issue you're describing. You should be able to take your toy example and turn it into a simple test.

Happy to help with this.

@rabernat
Contributor

Target chunks that are orthogonal to the concat dim, requiring a complete rechunk

Also, FWIW, the rechunking algorithm in Pangeo Forge is really not optimized for this extreme case. It means basically splitting up every pixel of a dataset into an individual Xarray Dataset. However, it should definitely work, even if the performance is very poor.

@rabernat
Contributor

What seems to happen instead is that it writes the smallest unit (is this a fragment or subfragment - I am very confused by the terminology) as chunks to the target zarr store.

This is the expected behavior for target_chunks={'x': 1}.

Have you tried explicitly specifying the time chunks, i.e. target_chunks={'x': 1, 'time': 4}?

@jbusecke
Contributor Author

What seems to happen instead is that it writes the smallest unit (is this a fragment or subfragment - I am very confused by the terminology) as chunks to the target zarr store.

This is the expected behavior for target_chunks={'x': 1}.

Have you tried explicitly specifying the time chunks, i.e. target_chunks={'x': 1, 'time': 4}?

Oh yikes! Then I might have jumped down the wrong rabbit hole 😱. I'll double check this asap. In any case, let me roll back some of the strong language above, since I think I might have been overly immersed in my dynamic chunking function (CMIP) world, where the behavior is to return len(dim) for any dimension that is not specified. Either way, I need to find out, hahaha.

@jbusecke
Contributor Author

Ohhh damn, @rabernat was right. I was making a wrong assumption here. Testing with target_chunks={'x': 1, 'time': 4} gives 2 resulting chunks as expected!

Well, that is very good (in the sense that we did not have a big bug in recipes!), but it also sets us back in finding the reason for this bug.

The only thing I think I can say at this point is that the issue seems to be somehow triggered by producing a lot of fragments or grouping them (since these jobs seem to generally get stuck on a read, not a write, I am less suspicious of the steps after generating/combining fragments).

Welp, onwards...tomorrow.

@jbusecke
Contributor Author

FWIW I think it would be helpful to keep working on a schematic like the one above and label the following terms in there:

  • Files
  • Fragments
  • Sub-Fragments
  • Groups (of fragments)
  • Target chunks

I might have missed/double-counted some terms, but having those enumerated might help with future debugging/refactoring.

@jbusecke
Contributor Author

Target chunks that are orthogonal to the concat dim, requiring a complete rechunk

Also, FWIW, the rechunking algorithm in Pangeo Forge is really not optimized for this extreme case. It means basically splitting up every pixel of a dataset into a individual Xarray Dataset. However, it should definitely work, even if the performance is very poor.

Point well taken. I think that none of my dynamic rechunkings will ever be this extreme, but a full 'pixel-level' rechunk should still work as you say, and stress testing this might be helpful here to reveal the (still mysterious) underlying reason for my stalls.

@jbusecke
Contributor Author

jbusecke commented Mar 27, 2024

I think the ultimate point of confusion is the logger entry coming from this line.

Am I wrong to assume that this should receive all the fragments that 'land' in a given target chunk, and thus in our case it should display more than one fragment to actually combine? There is basically a special case where all the rechunking does is 'split' into smaller fragments, each of which is then written into a target chunk, but that should not be the case here (anymore 😆).

For my little toy example (updated version with explicit chunks) I get:

 "Combining group = ((\'time\', 1), (\'x\', 1)), containing fragments = [({Dimension(name=\'time\', operation=<CombineOp.CONCAT: 2>): IndexedPosition(value=2, indexed=True, dimsize=4), Dimension(name=\'x\', operation=<CombineOp.CONCAT: 2>): IndexedPosition(value=1, indexed=True, dimsize=2)}, <xarray.Dataset> Size: 40B\nDimensions:  (x: 1, time: 2)\nCoordinates:\n  * x        (x) int64 8B 1\n  * time     (time) int64 16B 2 3\nData variables:\n    var      (x, time) int64 16B ...)]"

Which I interpret as only a single fragment?

It would be helpful to get some clarification on this before I continue.

@rabernat
Contributor

Which I interpret as only a single fragment?

That is the correct interpretation. There is a list of fragments of length 1. Is that from the previous example (target_chunks={'x': 1}) or the updated version (target_chunks={'x': 1, 'time': 4})?

@jbusecke
Contributor Author

It is from the updated version with explicit chunks for all existing dimensions!

We saw similar one-fragment lists in the more complex recipes (even though I still need to double check there to exclude my error from above). Seeing only a single fragment in there is what led me down the path of making the toy example above in the first place.

I'll look further into the keys and grouping when I have the time.

@rabernat
Contributor

It is from the updated version with explicit chunks for all existing dimensions!

But then how is the dataset being written correctly? Could you show the full logging output?

I'd still encourage you to remove the PTransforms and Pipelines from the mix to focus on the core rechunking logic, which is quite simple:

pcoll
| beam.FlatMap(
    split_fragment,
    target_chunks=self.target_chunks,
    schema=beam.pvalue.AsSingleton(self.schema),
)
| beam.GroupByKey()  # this has major performance implications
| beam.MapTuple(combine_fragments)

@jbusecke
Contributor Author

@rabernat can you confirm that the basic assumption here is correct:

In the case sketched out above with target_chunks={'x': 1, 'time': 4}, would we expect a list of 2 fragments to be combined to show up in that log statement?

@rabernat
Contributor

can you confirm that the basic assumption here is correct:

Not without a simpler reproducer that just involves the functions split_fragment and combine_fragments. 😉 What you're saying sounds generally correct, but there is so much complexity around the full end-to-end recipe that I hesitate to make any claims with certainty.
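
A minimal, Beam-free reproducer along those lines could look like the sketch below. This is only a sketch: it assumes the helper signatures in pangeo_forge_recipes.rechunking and pangeo_forge_recipes.types as exercised in test_rechunking.py (worth double-checking against the installed version), and it mirrors the two-fragment toy case from the log output above.

import numpy as np
import xarray as xr
from pangeo_forge_recipes.rechunking import combine_fragments
from pangeo_forge_recipes.types import CombineOp, Dimension, Index, IndexedPosition

TIME = Dimension("time", CombineOp.CONCAT)
X = Dimension("x", CombineOp.CONCAT)

def make_fragment(time_start, x_start):
    # One (Index, Dataset) fragment: 2 time steps at a single x point,
    # matching the fragment reprs printed in the logs above.
    ds = xr.Dataset(
        {"var": (("x", "time"), np.zeros((1, 2)))},
        coords={"x": [x_start], "time": [time_start, time_start + 1]},
    )
    index = Index({
        TIME: IndexedPosition(time_start, dimsize=4),
        X: IndexedPosition(x_start, dimsize=2),
    })
    return index, ds

# Group key and fragment list for the target chunk at x=0 with
# target_chunks={'x': 1, 'time': 4}: two fragments that differ only in time.
group = (("x", 0),)
fragments = [make_fragment(0, 0), make_fragment(2, 0)]
index, combined = combine_fragments(group, fragments)
assert dict(combined.sizes) == {"x": 1, "time": 4}

If the assertion holds, the combine step behaves as expected for a two-fragment group, entirely independent of the Beam machinery.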

@jbusecke
Contributor Author

I am sorry, there must be something clearly wrong with the way I am testing this locally (on the LEAP hub). I just reran the example and we get the expected (2-fragment) logging call:

"Combining group = ((\'x\', 0),), containing fragments = [({Dimension(name=\'time\', operation=<CombineOp.CONCAT: 2>): IndexedPosition(value=2, indexed=True, dimsize=4), Dimension(name=\'x\', operation=<CombineOp.CONCAT: 2>): IndexedPosition(value=0, indexed=True, dimsize=2)}, <xarray.Dataset> Size: 40B\nDimensions:  (x: 1, time: 2)\nCoordinates:\n  * x        (x) int64 8B 0\n  * time     (time) int64 16B 2 3\nData variables:\n    var      (x, time) int64 16B ...), ({Dimension(name=\'time\', operation=<CombineOp.CONCAT: 2>): IndexedPosition(value=0, indexed=True, dimsize=4), Dimension(name=\'x\', operation=<CombineOp.CONCAT: 2>): IndexedPosition(value=0, indexed=True, dimsize=2)}, <xarray.Dataset> Size: 40B\nDimensions:  (x: 1, time: 2)\nCoordinates:\n  * x        (x) int64 8B 0\n  * time     (time) int64 16B 0 1\nData variables:\n    var      (x, time) int64 16B ...)]"

Somehow things are persisting across multiple runs of the recipe from a local repo, which might be related to git or to how things are saved/not saved on the hub 😡. Strong point for what you said above about testing without Beam!

I am very sorry for all the noise and loss of focus. I think I have convinced myself that the rechunking logic works as intended here and that the issue lies elsewhere.

@jbusecke
Contributor Author

I just wanted to reset and confirm that my (still complex) reproducer recipe test_short_lon_only_chunks, now with explicit chunks for all dimensions,

target_chunks = {'lon': 10, 'time': 29200, 'bnds': 2, 'lat': 192} # this only splits the array on lon, all other chunks are == len(dim)

still hangs!

The pertinent log output is here:

Operation ongoing in bundle process_bundle-764632957537901303-1759 for PTransform{name=Creating CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114|OpenURLWithFSSpec|OpenWithXarray|Preprocessor|StoreToZarr|ConsolidateDimensionCoordinates|ConsolidateMetadata/StoreToZarr/Rechunk/MapTuple(combine_fragments)-ptransform-46, state=process-msecs} for at least 1175.68 seconds without outputting or completing.
Current Traceback:
  File "/usr/local/lib/python3.10/threading.py", line 973, in _bootstrap
    self._bootstrap_inner()

  File "/usr/local/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
    self.run()

  File "/usr/local/lib/python3.10/site-packages/apache_beam/utils/thread_pool_executor.py", line 53, in run
    self._work_item.run()

  File "/usr/local/lib/python3.10/site-packages/apache_beam/utils/thread_pool_executor.py", line 37, in run
    self._future.set_result(self._fn(*self._fn_args, **self._fn_kwargs))

  File "/usr/local/lib/python3.10/site-packages/apache_beam/runners/worker/sdk_worker.py", line 385, in task
    self._execute(

  File "/usr/local/lib/python3.10/site-packages/apache_beam/runners/worker/sdk_worker.py", line 311, in _execute
    response = task()

  File "/usr/local/lib/python3.10/site-packages/apache_beam/runners/worker/sdk_worker.py", line 386, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)

  File "/usr/local/lib/python3.10/site-packages/apache_beam/runners/worker/sdk_worker.py", line 650, in do_instruction
    return getattr(self, request_type)(

  File "/usr/local/lib/python3.10/site-packages/apache_beam/runners/worker/sdk_worker.py", line 688, in process_bundle
    bundle_processor.process_bundle(instruction_id))

  File "/usr/local/lib/python3.10/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1113, in process_bundle
    input_op_by_transform_id[element.transform_id].process_encoded(

  File "/usr/local/lib/python3.10/site-packages/apache_beam/runners/worker/bundle_processor.py", line 237, in process_encoded
    self.output(decoded_value)

  File "/tmp/cf09f9f279ec355c109713fd5ed97e51c62f0daeb39deea8ba13555949327808yktlp6da/lib/python3.10/site-packages/apache_beam/transforms/core.py", line 2040, in <lambda>

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/pangeo_forge_recipes/rechunking.py", line 240, in combine_fragments
    ds_combined = xr.combine_nested(dsets_to_concat, concat_dim=concat_dims_sorted)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/combine.py", line 577, in combine_nested
    return _nested_combine(

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/combine.py", line 356, in _nested_combine
    combined = _combine_nd(

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/combine.py", line 232, in _combine_nd
    combined_ids = _combine_all_along_first_dim(

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/combine.py", line 267, in _combine_all_along_first_dim
    new_combined_ids[new_id] = _combine_1d(

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/combine.py", line 290, in _combine_1d
    combined = concat(

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/concat.py", line 250, in concat
    return _dataset_concat(

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/concat.py", line 631, in _dataset_concat
    combined_var = concat_vars(

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/variable.py", line 2925, in concat
    return Variable.concat(variables, dim, positions, shortcut, combine_attrs)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/variable.py", line 1693, in concat
    data = duck_array_ops.concatenate(arrays, axis=axis)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/duck_array_ops.py", line 356, in concatenate
    return _concatenate(as_shared_dtype(arrays), axis=axis)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/duck_array_ops.py", line 232, in as_shared_dtype
    arrays = [asarray(x, xp=xp) for x in scalars_or_arrays]

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/duck_array_ops.py", line 232, in <listcomp>
    arrays = [asarray(x, xp=xp) for x in scalars_or_arrays]

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/duck_array_ops.py", line 219, in asarray
    return data if is_duck_array(data) else xp.asarray(data)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/indexing.py", line 693, in __array__
    return np.asarray(self.get_duck_array(), dtype=dtype)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/indexing.py", line 696, in get_duck_array
    self._ensure_cached()

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/indexing.py", line 690, in _ensure_cached
    self.array = as_indexable(self.array.get_duck_array())

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/indexing.py", line 664, in get_duck_array
    return self.array.get_duck_array()

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/indexing.py", line 557, in get_duck_array
    array = array.get_duck_array()

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/coding/variables.py", line 75, in get_duck_array
    return self.func(self.array.get_duck_array())

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/indexing.py", line 551, in get_duck_array
    array = self.array[self.key]

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/backends/h5netcdf_.py", line 51, in __getitem__
    return indexing.explicit_indexing_adapter(

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/core/indexing.py", line 858, in explicit_indexing_adapter
    result = raw_indexing_method(raw_key.tuple)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/xarray/backends/h5netcdf_.py", line 58, in _getitem
    return array[key]

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/h5netcdf/core.py", line 347, in __getitem__
    return self._h5ds[key]

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/h5py/_hl/dataset.py", line 758, in __getitem__
    return self._fast_reader.read(args)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/fsspec/spec.py", line 1856, in readinto
    data = self.read(out.nbytes)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/fsspec/spec.py", line 1846, in read
    out = self.cache._fetch(self.loc, self.loc + length)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/fsspec/caching.py", line 189, in _fetch
    self.cache = self.fetcher(start, end)  # new block replaces old

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/gcsfs/core.py", line 1921, in _fetch_range
    return self.gcsfs.cat_file(self.path, start=start, end=end)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/fsspec/asyn.py", line 118, in wrapper
    return sync(self.loop, func, *args, **kwargs)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/fsspec/asyn.py", line 91, in sync
    if event.wait(1):

  File "/usr/local/lib/python3.10/threading.py", line 607, in wait
    signaled = self._cond.wait(timeout)

  File "/usr/local/lib/python3.10/threading.py", line 324, in wait
    gotit = waiter.acquire(True, timeout)

Which I interpret as being stuck on reading something... I would love to hear how others think about this.
Yesterday @moradology and I were wondering what parts of the cached files this line actually reads:

ds_combined = xr.combine_nested(dsets_to_concat, concat_dim=concat_dims_sorted)

Does this eagerly load the data to be combined, or just read the metadata at this point?

Is the issue that we are trying to read the same metadata from 'too many' processes? Or do we simply need to allow fsspec to fail and retry? Disable caching?
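
For what it's worth, the traceback above already hints at the answer: when the fragments were opened lazily and without dask, the xr.combine_nested call materializes the backend arrays, i.e. it triggers real reads through fsspec at combine time. A quick local check of that xarray behavior (a hypothetical snippet, not taken from the recipe; frag.nc is just a scratch file):

import numpy as np
import xarray as xr

ds = xr.Dataset({"var": (("time",), np.arange(4.0))})
ds.to_netcdf("frag.nc")

lazy = xr.open_dataset("frag.nc")  # lazy backend arrays, no dask
eager = xr.concat([lazy, lazy], "time")  # concat forces the backend to read
print(type(eager["var"].data))  # numpy.ndarray -> data was loaded eagerly

chunked = xr.open_dataset("frag.nc", chunks={})  # dask-backed (requires dask)
still_lazy = xr.concat([chunked, chunked], "time")
print(type(still_lazy["var"].data))  # dask array -> still lazy

That would make combine_fragments the first place the cached bytes are actually pulled through fsspec, which seems consistent with the hang surfacing inside gcsfs.cat_file in the traceback.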

One more possibly important piece of information here is that this does not seem to be just a single read being hung up. At least that is how I interpret all of these log messages, which have similar content to the one above but different 'bundles':
[image]

The frustrating thing about being back at this end of the equation is that this example only fails on Dataflow (@ranchodeluxe was able to run it on Flink, and I ran it locally), which makes the debugging very, very hard.

I would love nothing more than to really dig deeper into this today, but unfortunately I have too many other things on my plate. Maybe a little breather is also appropriate; this issue is a frustrating one.

Very curious if others have ideas how to proceed.

@jbusecke
Contributor Author

jbusecke commented Apr 1, 2024

Just a quick update here:

  • I have found a quick (and quite dirty) workaround with the help of @ranchodeluxe. This requirements.txt basically disables garbage collection within xarray, and that seems to solve the problem for now!!!
  • Based on discussion at the regular PGF meeting, I am currently trying out whether setting copy_to_local and/or load on OpenWithXarray also solves the problem (see the sketch below), but so far I have not been successful. I am reporting on some of the issues with copy_to_local in OpenWithXarray not working on Dataflow #722
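
For reference, those knobs live on OpenWithXarray. Here is a sketch of the relevant recipe wiring, assuming the standard pangeo-forge-recipes transforms named in the logs above; pattern, cache_target, and target_root are placeholders standing in for the surrounding recipe's objects:

import apache_beam as beam
from pangeo_forge_recipes.transforms import (
    OpenURLWithFSSpec,
    OpenWithXarray,
    StoreToZarr,
)

transforms = (
    beam.Create(pattern.items())
    | OpenURLWithFSSpec(cache=cache_target)
    | OpenWithXarray(
        file_type=pattern.file_type,
        copy_to_local=True,  # stage each cached file on local disk before opening
        load=True,           # eagerly load data into memory instead of lazy fsspec reads
    )
    | StoreToZarr(
        target_root=target_root,
        store_name="test_short_lon_only_chunks.zarr",
        combine_dims=pattern.combine_dim_keys,
        target_chunks={"lon": 10, "time": 29200, "bnds": 2, "lat": 192},
    )
)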

@jbusecke
Contributor Author

jbusecke commented Apr 1, 2024

OK, so some preliminary conclusions: at least for my case, setting copy_to_local=True does not fix the problem, it just moves it?

I was able to get around the issues with the local disk by specifying a larger attached disk size (see runner PR), but now instead of errors I am seeing all the usual stalling behavior again:

  • low CPU
  • steady memory
  • and warning messages like this:
Operation ongoing in bundle process_bundle-557078944501211849-694 for PTransform{name=Creating CMIP6.CMIP.CMCC.CMCC-ESM2.historical.r1i1p1f1.3hr.pr.gn.v20210114|OpenURLWithFSSpec|OpenWithXarray|Preprocessor|StoreToZarr/OpenWithXarray/Open with Xarray-ptransform-87, state=process-msecs} for at least 1486.46 seconds without outputting or completing.
Current Traceback:
  File "/usr/local/lib/python3.10/threading.py", line 973, in _bootstrap
    self._bootstrap_inner()

  File "/usr/local/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
    self.run()

  File "/usr/local/lib/python3.10/site-packages/apache_beam/utils/thread_pool_executor.py", line 53, in run
    self._work_item.run()

  File "/usr/local/lib/python3.10/site-packages/apache_beam/utils/thread_pool_executor.py", line 37, in run
    self._future.set_result(self._fn(*self._fn_args, **self._fn_kwargs))

  File "/usr/local/lib/python3.10/site-packages/apache_beam/runners/worker/sdk_worker.py", line 371, in task
    self._execute(

  File "/usr/local/lib/python3.10/site-packages/apache_beam/runners/worker/sdk_worker.py", line 297, in _execute
    response = task()

  File "/usr/local/lib/python3.10/site-packages/apache_beam/runners/worker/sdk_worker.py", line 372, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)

  File "/usr/local/lib/python3.10/site-packages/apache_beam/runners/worker/sdk_worker.py", line 625, in do_instruction
    return getattr(self, request_type)(

  File "/usr/local/lib/python3.10/site-packages/apache_beam/runners/worker/sdk_worker.py", line 663, in process_bundle
    bundle_processor.process_bundle(instruction_id))

  File "/usr/local/lib/python3.10/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1056, in process_bundle
    input_op_by_transform_id[element.transform_id].process_encoded(

  File "/usr/local/lib/python3.10/site-packages/apache_beam/runners/worker/bundle_processor.py", line 237, in process_encoded
    self.output(decoded_value)

  File "/tmp/ab3ce7b5c6a409796cee8c934cce7cfd4b3d5546a1c4bd716293176fd8cfc1a1ikgl03o0/lib/python3.10/site-packages/apache_beam/transforms/core.py", line 2040, in <lambda>

  File "/tmp/ab3ce7b5c6a409796cee8c934cce7cfd4b3d5546a1c4bd716293176fd8cfc1a1ikgl03o0/lib/python3.10/site-packages/pangeo_forge_recipes/transforms.py", line 233, in <lambda>

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/pangeo_forge_recipes/openers.py", line 233, in open_with_xarray
    _copy_btw_filesystems(url_or_file_obj, target_opener)

  File "/opt/apache/beam-venv/beam-venv-worker-sdk-0-0/lib/python3.10/site-packages/pangeo_forge_recipes/storage.py", line 33, in _copy_btw_filesystems
    with output_opener as target:

dataflow job.

I am not sure whether this is the same deadlocking issue in fsspec, but unfortunately for now it seems like copy_to_local is not a solution to the problem.

EDIT: As per the convo with @rabernat over in #722, I tried setting both copy_to_local=True and load=True and got something that I understand even less:
[image]

This one sounds curious:

Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/apache_beam/runners/worker/data_plane.py", line 652, in _read_inputs
    for elements in elements_iterator:
  File "/usr/local/lib/python3.10/site-packages/grpc/_channel.py", line 541, in __next__
    return self._next()
  File "/usr/local/lib/python3.10/site-packages/grpc/_channel.py", line 967, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNKNOWN
	details = "Exception iterating requests!"
	debug_error_string = "None"
>

Def out of my wheelhouse, but maybe @ranchodeluxe or @moradology have an idea?

@moradology
Contributor

moradology commented Apr 1, 2024

Yeah, this is one of the things that can make it difficult in the extreme to debug distributed systems. The seemingly meaningless trace here is a symptom of the generality and abstraction at the heart of their execution model: your functions and data have to get passed around inside something capable of executing them in various contexts. A rough translation of what this looks like to me is: 'status was set to cancelled while in the process of iterating through the data we need to operate over'. It could well be that there's an underlying trace further up (or one that got swallowed), but I wouldn't be shocked to see this as output from killing the master process.
