Airflow: Log file is not local, unsupported remote log location

I cannot see the logs attached to tasks in the Airflow user interface (screenshot of the UI error not reproduced here).

Log-related parameters in my airflow.cfg file:

  • remote_base_log_folder =
  • base_log_folder = /home/my_projects/ksaprice_project/airflow/logs
  • worker_log_server_port = 8793
  • child_process_log_directory = /home/my_projects/ksaprice_project/airflow/logs/scheduler

Even though remote_base_log_folder is left empty, the UI tries to fetch the log from http://:8793/log/tutorial/print_date/2017-08-02T00:00:00, and I do not understand this behavior. According to these settings, the workers should store the logs in /home/my_projects/ksaprice_project/airflow/logs, and the logs should be read from that same location, not fetched remotely.
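The empty host portion of that URL suggests the webserver is filling in an empty hostname for this task instance when it builds the worker log URL. The snippet below is only a simplified illustration of that construction (worker_log_url is a made-up helper, not the actual Airflow source):

 # Simplified illustration (not the actual Airflow code) of how the webserver
 # could assemble the worker log URL when the log file is not found locally.
 # If no hostname is recorded for the task instance, the URL comes out as
 # "http://:8793/...", which matches the URL shown above.
 def worker_log_url(ti_hostname, worker_log_server_port, dag_id, task_id, execution_date):
     # <host>:<port>/log/<dag_id>/<task_id>/<execution_date>
     return "http://{host}:{port}/log/{dag}/{task}/{date}".format(
         host=ti_hostname, port=worker_log_server_port,
         dag=dag_id, task=task_id, date=execution_date)

 print(worker_log_url("", 8793, "tutorial", "print_date", "2017-08-02T00:00:00"))
 # -> http://:8793/log/tutorial/print_date/2017-08-02T00:00:00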

Update: contents of the task_instance table (screenshot not reproduced here).

+13
airflow apache-airflow airflow-scheduler
4 answers

I also ran into the same problem.

Setting the variables below in airflow.cfg worked for me. Use the machine's FQDN in place of {hostname} instead of localhost.

 endpoint_url = http://{hostname}:8080
 base_url = http://{hostname}:8080
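If you are not sure what FQDN your machine reports, a quick way to check from Python (just a convenience check, not part of Airflow) is:

 # Print the names this machine reports for itself; use the FQDN
 # (or the proper DNS name) in place of {hostname} in airflow.cfg.
 import socket

 print(socket.gethostname())  # short host name
 print(socket.getfqdn())      # fully qualified domain name, if DNS is configured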

Good luck

+7

As you can see in the first screenshot, there is a timestamp; make sure your logs directory contains a folder/file with that timestamp as its name.

You are looking at the user interface, so first make sure the log files were actually created in the log directory. In my case the log folder looks like this:

 (AIRFLOW-ENV) [cloudera@quickstart dags]$ ll /home/cloudera/workspace/python/airflow_home/logs/my_test_dag/my_sensor_task
 total 8
 -rw-rw-rw- 1 cloudera cloudera 3215 Nov 14 08:45 2017-11-12T12:00:00
 -rw-rw-rw- 1 cloudera cloudera 2694 Nov 14 08:45 2017-11-14T08:36:06.920727
 (AIRFLOW-ENV) [cloudera@quickstart dags]$

So my log URL is:

 http://localhost:8080/admin/airflow/log?task_id=my_sensor_task&dag_id=my_test_dag&execution_date=2017-11-14T08:36:06.920727 

When you go to your DAG and open the Graph View, you will see a drop-down next to "Run"; select the appropriate run, then click the corresponding task/operator in the graph and choose "View Log".
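If you prefer to check from outside the UI, the small sketch below tests whether the expected log file exists, assuming the directory layout shown above (base_log_folder/<dag_id>/<task_id>/<execution_date>); adjust the folder and IDs to your own setup:

 # Check whether the log file the UI is looking for actually exists on disk.
 # Assumes the layout shown above: base_log_folder/<dag_id>/<task_id>/<execution_date>
 import os

 base_log_folder = "/home/cloudera/workspace/python/airflow_home/logs"  # from airflow.cfg
 dag_id = "my_test_dag"
 task_id = "my_sensor_task"
 execution_date = "2017-11-14T08:36:06.920727"

 log_path = os.path.join(base_log_folder, dag_id, task_id, execution_date)
 print(log_path, "exists" if os.path.exists(log_path) else "is missing")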

+1

I also ran into this; in my case the tasks had not actually run yet, so there were no log files for the UI to show, and I had to trigger the tasks first.

I also set new DAGs to be unpaused by default with this change in my airflow.cfg:

 dags_are_paused_at_creation = False
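If you want to confirm which value the running Airflow actually sees for this option, something like the following should work; the import path is the one used by recent Airflow releases and may differ in older versions:

 # Read the effective value of dags_are_paused_at_creation from airflow.cfg.
 # Import path assumed for Airflow 1.10+/2.x; older releases expose the same
 # getboolean() directly on the airflow.configuration module instead.
 from airflow.configuration import conf

 print(conf.getboolean("core", "dags_are_paused_at_creation"))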
0

I am also facing the same problem. I tried changing the hostname to the FQDN, but I still have the same issue. Has anyone solved this?

0
