I configured an Airflow server installed within a conda environment to run some scheduled automations. Currently, I launch the scheduler, workers and webserver directly using `nohup`, but I'd like to use systemd to manage it more robustly.
However, I'm having trouble starting the service with `systemctl start`. I added the following to my `.service` file:

```
ExecStartPre=. /etc/profile ; /home/shared/miniconda2/bin/conda activate airflow
ExecStart=/home/shared/miniconda2/envs/airflow/bin/airflow webserver --pid /run/airflow/webserver.pid
```
(where `shared/` is not a user, just a folder inside `/home/` to which all users have access)
`ExecStart` requires the `airflow` conda environment, in which Airflow is actually installed, to be activated. To do this, I added the two commands seen in `ExecStartPre`: the second one actually activates the environment. Running it alone returns `CommandNotFoundError: Your shell has not been properly configured to use 'conda activate'.`, so I added the first one to ensure `/etc/profile.d/conda.sh` is loaded.
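Note that systemd does not pass `Exec*` lines through a shell, and each command runs in its own process, so anything `conda activate` exports in `ExecStartPre` is lost before `ExecStart` runs. One common alternative (a sketch reusing the paths from above, which may need adjusting for your layout) is to skip activation entirely and put the environment's `bin/` directory on `PATH` via `Environment=`:

```ini
# [Service] fragment (sketch): putting the conda env's bin/ first on PATH
# lets both airflow and the gunicorn it spawns resolve correctly.
[Service]
Environment="PATH=/home/shared/miniconda2/envs/airflow/bin:/usr/bin:/bin"
ExecStart=/home/shared/miniconda2/envs/airflow/bin/airflow webserver --pid /run/airflow/webserver.pid
```

For most conda environments, activation only adjusts environment variables such as `PATH`, so setting them statically in the unit file achieves the same effect without needing a shell.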
However, this still fails, seemingly while trying to run the Gunicorn server:

```
Running the Gunicorn Server with:
Workers: 4 sync
Host: 0.0.0.0:8080
Timeout: 120
Logfiles: - -
=================================================================
Traceback (most recent call last):
  File "/home/shared/miniconda2/envs/airflow/bin/airflow", line 28, in <module>
    args.func(args)
  File "/home/shared/miniconda2/envs/airflow/lib/python2.7/site-packages/airflow/bin/cli.py", line 844, in webserver
    gunicorn_master_proc = subprocess.Popen(run_args)
  File "/home/shared/miniconda2/envs/airflow/lib/python2.7/subprocess.py", line 390, in __init__
    errread, errwrite)
  File "/home/shared/miniconda2/envs/airflow/lib/python2.7/subprocess.py", line 1025, in _execute_child
    raise child_exception
OSError: [Errno 2] No such file or directory
```

(timestamps omitted to improve readability a bit)
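For context, `OSError: [Errno 2] No such file or directory` raised by `subprocess.Popen` means the child executable itself could not be found, which here points at `gunicorn` not being on the service's `PATH`. A minimal sketch of that failure mode (Python 3 syntax; the missing command name is made up):

```python
import subprocess

# Popen raises an OSError with errno 2 (ENOENT) when the executable
# cannot be found on PATH -- consistent with gunicorn being invisible
# to the systemd-launched process.
try:
    subprocess.Popen(["definitely-not-on-path-gunicorn"])
except OSError as exc:
    print(exc.errno)  # → 2
```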
This raises a number of questions, for example how to make proper use of the `enable` and `restart` options that systemd offers.

I hope by now you have solved your problem, but for reference here is my solution.
The way I solved it was to add a level of indirection for launching Airflow. I have a bash script, e.g. `airflow_runner.sh`, with the following:

```bash
#!/bin/bash
# Source conda's shell hooks so `conda activate` works non-interactively
. /etc/profile.d/conda.sh
conda activate airflow-env
$AIRFLOW_BIN/airflow "$@"
```
Then in my systemd unit file:

```ini
ExecStart=PATH_TO_SCRIPT/airflow_runner.sh webserver --pid /run/airflow/webserver.pid
```
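Putting it together, a minimal unit file using the wrapper might look like this (a sketch: the paths, the `AIRFLOW_BIN` value, and the extra directives are assumptions, not from the original answer):

```ini
[Unit]
Description=Airflow webserver
After=network.target

[Service]
# AIRFLOW_BIN is consumed by airflow_runner.sh; path assumes the
# miniconda2 layout from the question.
Environment="AIRFLOW_BIN=/home/shared/miniconda2/envs/airflow/bin"
ExecStart=PATH_TO_SCRIPT/airflow_runner.sh webserver --pid /run/airflow/webserver.pid
# Create /run/airflow for the pid file and restart on crashes
RuntimeDirectory=airflow
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

`RuntimeDirectory=airflow` is worth noting: `/run` is typically a tmpfs, so without it the `/run/airflow` directory for the pid file would not exist after a reboot.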