Since I want to use remote logging, I pass a sensitive environment variable (the S3 connection) into Airflow using the Helm chart. When I check the task logs through the UI, the following error appears in the webserver log:

```
[2023-06-22T02:07:13.549+0000] {app.py:1744} ERROR - Exception on /get_logs_with_metadata [GET]
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.8/site-packages/flask/app.py", line 2529, in wsgi_app
response = self.full_dispatch_request()
File "/home/airflow/.local/lib/python3.8/site-packages/flask/app.py", line 1825, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/home/airflow/.local/lib/python3.8/site-packages/flask/app.py", line 1823, in full_dispatch_request
rv = self.dispatch_request()
File "/home/airflow/.local/lib/python3.8/site-packages/flask/app.py", line 1799, in dispatch_request
return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/www/auth.py", line 47, in decorated
return func(*args, **kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/www/decorators.py", line 125, in wrapper
return f(*args, **kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/session.py", line 76, in wrapper
return func(*args, session=session, **kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/www/views.py", line 1546, in get_logs_with_metadata
logs, metadata = task_log_reader.read_log_chunks(ti, try_number, metadata)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/log/log_reader.py", line 62, in read_log_chunks
logs, metadatas = self.log_handler.read(ti, try_number, metadata=metadata)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/log/file_task_handler.py", line 412, in read
log, out_metadata = self._read(task_instance, try_number_element, metadata)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/amazon/aws/log/s3_task_handler.py", line 146, in _read
return super()._read(ti, try_number, metadata)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/log/file_task_handler.py", line 311, in _read
remote_messages, remote_logs = self._read_remote_logs(ti, try_number, metadata)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/amazon/aws/log/s3_task_handler.py", line 122, in _read_remote_logs
keys = self.hook.list_keys(bucket_name=bucket, prefix=prefix)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 82, in wrapper
return func(*bound_args.args, **bound_args.kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 443, in list_keys
for page in response:
File "/home/airflow/.local/lib/python3.8/site-packages/botocore/paginate.py", line 269, in __iter__
response = self._make_request(current_kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/botocore/paginate.py", line 357, in _make_request
return self._method(**current_kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/botocore/client.py", line 530, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/botocore/client.py", line 943, in _make_api_call
http, parsed_response = self._make_request(
File "/home/airflow/.local/lib/python3.8/site-packages/botocore/client.py", line 966, in _make_request
return self._endpoint.make_request(operation_model, request_dict)
File "/home/airflow/.local/lib/python3.8/site-packages/botocore/endpoint.py", line 119, in make_request
return self._send_request(request_dict, operation_model)
File "/home/airflow/.local/lib/python3.8/site-packages/botocore/endpoint.py", line 198, in _send_request
request = self.create_request(request_dict, operation_model)
File "/home/airflow/.local/lib/python3.8/site-packages/botocore/endpoint.py", line 134, in create_request
self._event_emitter.emit(
File "/home/airflow/.local/lib/python3.8/site-packages/botocore/hooks.py", line 412, in emit
return self._emitter.emit(aliased_event_name, **kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/botocore/hooks.py", line 256, in emit
return self._emit(event_name, kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/botocore/hooks.py", line 239, in _emit
response = handler(**kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/botocore/signers.py", line 105, in handler
return self.sign(operation_name, request)
File "/home/airflow/.local/lib/python3.8/site-packages/botocore/signers.py", line 189, in sign
auth.add_auth(request)
File "/home/airflow/.local/lib/python3.8/site-packages/botocore/auth.py", line 418, in add_auth
raise NoCredentialsError()
botocore.exceptions.NoCredentialsError: Unable to locate credentials
```

Obviously, I did put credentials in the S3 connection environment variable. Why does this error occur, and what is the solution without adding extra environment variables?

My Helm values:

```yaml
secret:
  - envName: "AIRFLOW_CONN_S3_TEST_CONN"
    secretName: s3-test-connection
    secretKey: "AIRFLOW_CONN_S3_TEST_CONN"

extraSecrets:
  s3-test-connection:
    data: |
      AIRFLOW_CONN_S3_TEST_CONN: 'base64_encoded_s3-test-connection_string'

logging:
  remote_logging: "True"
  remote_base_log_folder: s3://airflow-test/logs
  remote_log_conn_id: s3-test-connection
  encrypt_s3_logs: 'False'
```

Reference
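(For context, the redacted value above is the base64 encoding of an ordinary Airflow connection string; an illustrative, placeholder-only sketch of what it decodes to is below — the key, secret, and region are made up, not the real ones.)

```yaml
# Illustrative only: what the base64-encoded secret value decodes to.
# The plain-text form is an Airflow AWS connection URI, e.g.
#   aws://AKIAIOSFODNN7EXAMPLE:url-encoded-secret-key@/?region_name=ap-northeast-2
# which can be encoded with:  echo -n '<uri>' | base64
extraSecrets:
  s3-test-connection:
    data: |
      AIRFLOW_CONN_S3_TEST_CONN: 'base64_encoded_s3-test-connection_string'
```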
Replies: 1 comment 2 replies
Not sure if this is all (it might be a problem with your encoding, etc.): https://airflow.apache.org/docs/apache-airflow/stable/howto/connection.html#storing-connections-in-environment-variables

Your connection env name is wrong (too short) and does not match the connection name. It should be `AIRFLOW_CONN_{CONN_ID}`, all uppercase. I am also not sure what happens with the `-` in your `s3-test-connection`; in most cases we use `_` in connection names, so you might want to change that as well.
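A minimal sketch of values that would line up, assuming the connection id is renamed to `s3_test_conn` (with underscores) so it follows the `AIRFLOW_CONN_{CONN_ID}` rule; the secret name, bucket, placeholder value, and layout are carried over from the question:

```yaml
# Hedged sketch, not tested against a live cluster.
# Rule: for conn_id "s3_test_conn", Airflow resolves the env var
# AIRFLOW_CONN_S3_TEST_CONN (the conn_id uppercased, prefixed with AIRFLOW_CONN_).
secret:
  - envName: "AIRFLOW_CONN_S3_TEST_CONN"    # must be AIRFLOW_CONN_{CONN_ID}, uppercase
    secretName: s3-test-connection          # Kubernetes Secret name; dashes are fine here
    secretKey: "AIRFLOW_CONN_S3_TEST_CONN"  # key inside that Secret

extraSecrets:
  s3-test-connection:
    data: |
      AIRFLOW_CONN_S3_TEST_CONN: 'base64_encoded_s3-test-connection_string'

logging:
  remote_logging: "True"
  remote_base_log_folder: s3://airflow-test/logs
  remote_log_conn_id: s3_test_conn   # matches the {CONN_ID} part of the env var above
  encrypt_s3_logs: 'False'
```

The key point is that `remote_log_conn_id` and the suffix of the env var refer to the same connection id; the Kubernetes Secret name itself can keep the dashes.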