I am working on Windows and have set up Airflow via Docker. I have many Python scripts on Windows that read from and write to several locations (SSH connections, Windows folders, and so on). Copying all of those inputs into my Docker image would be a lot of work, so I would like Airflow to execute these scripts as if they were running on Windows.
How can this be done, if it is possible at all?
Here is the script I run as a DAG:
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.python_operator import PythonOperator
from datetime import datetime, timedelta
# Following are defaults which can be overridden later on
default_args = {
'owner': 'test',
'depends_on_past': False,
'start_date': datetime(2018, 11, 27),
'email': ['test@gmail.com'],
'email_on_failure': True,
'email_on_retry': True,
'retries': 1,
'retry_delay': timedelta(minutes=1),
}
dag = DAG('Helloworld', default_args=default_args)
###########################################################
# Here's where I want to execute my windows python script #
###########################################################
t1 = PythonOperator(
    dag=dag,
    task_id='my_task_powered_by_python',
    provide_context=False,
    python_callable=r"C:\Users\user\Documents\script.py")
t2 = BashOperator(
    task_id='task_2',
    bash_command='echo "Hello World from Task 2"',
    dag=dag)
t3 = BashOperator(
    task_id='task_3',
    bash_command='echo "Hello World from Task 3"',
    dag=dag)
t4 = BashOperator(
    task_id='task_4',
    bash_command='echo "Hello World from Task 4"',
    dag=dag)
t2.set_upstream(t1)
t3.set_upstream(t1)
t4.set_upstream(t2)
t4.set_upstream(t3)
I also tried replacing t1 with a BashOperator that calls the script directly:
t1 = BashOperator(
    task_id='task_1',
    bash_command='python /usr/local/airflow/scripts/script1.py',
    dag=dag)
However, I still run into the same problem: my Python script is executed as a Bash process inside Docker, not as a Windows process.
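Since a container cannot spawn processes on its host, one commonly suggested workaround is a sketch like the following: enable an SSH server on the Windows host (e.g. Windows' optional OpenSSH Server feature), install the paramiko package in the Airflow image, and have the task SSH back to the host and run the script there. The host name host.docker.internal, the user name, and the password below are assumptions/placeholders, not values from my setup; paramiko is imported lazily inside the function so the helper can be used without it.

```python
def build_windows_command(script_path: str) -> str:
    """Quote the path so spaces in e.g. C:\\Users\\... survive the remote shell."""
    return 'python "{}"'.format(script_path)

def run_on_windows_host(script_path: str,
                        host: str = "host.docker.internal",  # Docker Desktop's alias for the host (assumption)
                        user: str = "user",                  # hypothetical credentials
                        password: str = "secret") -> str:
    """Run a Python script on the Windows host over SSH and return its stdout."""
    import paramiko  # lazy import: only needed when actually connecting

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username=user, password=password)
    try:
        _, stdout, stderr = client.exec_command(build_windows_command(script_path))
        exit_code = stdout.channel.recv_exit_status()  # block until the script finishes
        if exit_code != 0:
            raise RuntimeError(stderr.read().decode())
        return stdout.read().decode()
    finally:
        client.close()

# Wired into the DAG, this would replace the string python_callable above:
# t1 = PythonOperator(
#     task_id='my_task_powered_by_python',
#     python_callable=lambda: run_on_windows_host(r"C:\Users\user\Documents\script.py"),
#     dag=dag)
```

This keeps the scripts, their inputs, and their Windows-only paths on the host; the container only triggers them and collects the exit status, so a non-zero exit on Windows still fails the Airflow task.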