Airflow 1.10.10 [core] vs 1.10.15 [logging] AWS S3 remote logging

Problem description

After moving the logging settings from the [core] section to [logging], I can no longer get remote logging to AWS S3 to work.

This is what I moved:

[logging]
# The folder where airflow should store its log files
# This path must be absolute
base_log_folder = /usr/local/airflow/logs

# Airflow can store logs remotely in AWS S3, Google Cloud Storage or Elastic Search.
# Users must supply an Airflow connection id that provides access to the storage
# location. If remote_logging is set to true, see UPDATING.md for additional
# configuration requirements.
remote_logging = True
remote_log_conn_id = MyS3Conn
remote_base_log_folder = s3://bucket/tst/
encrypt_s3_logs = False

# Logging level
logging_level = INFO
fab_logging_level = WARN

# Logging class
# Specify the class that will specify the logging configuration
# This class has to be on the python classpath
# logging_config_class = my.path.default_local_settings.LOGGING_CONFIG
logging_config_class =

# Log format
# we need to escape the curly braces by adding an additional curly brace
log_format = [%%(asctime)s] {%%(filename)s:%%(lineno)d} %%(levelname)s - %%(message)s
simple_log_format = %%(asctime)s %%(levelname)s - %%(message)s

# Log filename format
# we need to escape the curly braces by adding an additional curly brace
log_filename_template = {{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log
log_processor_filename_template = {{ filename }}.log
# Name of handler to read task instance logs.
# Default to use task handler.
task_log_reader = task

I only moved the properties. `airflow upgrade_check` reports "Logging configuration has been moved to new section", and that check passes as OK.

I have apache-airflow[crypto,postgres,ssh,s3,log]==1.10.15 installed. All of these properties worked fine when they lived in [core]; remote logging stopped working once I moved them to [logging].

I couldn't find any information on how to set this up. I only found this, but it just says that the following configuration options have been moved from [core] to the new [logging] section.

Tags: amazon-web-services, amazon-s3, airflow

Solution


You should keep using [core] for the logging settings on 1.10.15; only once you upgrade to Airflow >= 2.0.0 should you move them to the [logging] section.
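For 1.10.15, the same keys kept under [core] would look like this (the bucket path and connection id are the asker's own values from the question):

```ini
[core]
# The folder where airflow should store its log files (absolute path)
base_log_folder = /usr/local/airflow/logs

# Remote logging to AWS S3: these keys must stay under [core] in 1.10.x
remote_logging = True
remote_log_conn_id = MyS3Conn
remote_base_log_folder = s3://bucket/tst/
encrypt_s3_logs = False
```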

The `upgrade_check` message means the settings move to the [logging] section in >= 2.0.0. Keeping them under [core] will continue to work there too; it just emits a deprecation warning.
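A minimal sketch of why the move breaks 1.10.x, using plain `configparser` rather than Airflow's actual code: the 1.10.x configuration lookups ask for these keys under the `core` section, so values placed only under `logging` are never consulted.

```python
import configparser

# Simplified illustration, NOT Airflow's real code: 1.10.x looks up
# remote logging settings in [core], so a value moved to [logging]
# is silently ignored and the default (remote logging off) wins.
cfg = configparser.ConfigParser()
cfg.read_string("""
[core]

[logging]
remote_logging = True
remote_base_log_folder = s3://bucket/tst/
""")

# What a 1.10.x-style lookup effectively does:
remote_logging = cfg.getboolean("core", "remote_logging", fallback=False)
print(remote_logging)  # False: the value under [logging] is never read
```

Moving `remote_logging = True` back under `[core]` makes the same lookup return `True`, which is why the answer above says to keep the keys in `[core]` until 2.0.0.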
