How to retrieve default_args in a python_callable


I need to be able to access the default_args defined as part of the DAG definition from inside a PythonOperator's python_callable. Maybe it's my unfamiliarity with Python or Airflow in general, but could someone guide me on how to achieve this?
Following is a code sample of what I am trying to achieve:
from datetime import datetime, timedelta
import pprint

from airflow import DAG
from airflow.hooks.webhdfs_hook import WebHDFSHook
from airflow.operators.python_operator import PythonOperator

pp = pprint.PrettyPrinter(indent=4)

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'email': 'xyz@xyz.com',
    'email_on_failure': True,  # boolean flag; the alert address comes from 'email'
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
    'start_date': datetime(2017, 5, 15, 23, 20),
    'end_date': datetime(2017, 5, 16, 23, 45),
    'touchfile_path': '/user/myname/touchfiles/',
}

dag = DAG(
    'test',
    default_args=default_args,
    template_searchpath=['/Users/myname/Desktop/utils/airflow/resources'],
    user_defined_macros=dict(SCHEMA_NAME='abc'),
    # schedule_interval='*/2 * * * *'
    schedule_interval='@once')

def webhdfs_touchfile_create(ds, *args, **kwargs):
    web_hdfs_hook = WebHDFSHook('webhdfs_default')
    client = web_hdfs_hook.get_conn()
    client.write("/user/myname/airflow_hdfs", "stringToWrite")
    pp.pprint(kwargs)

task1 = PythonOperator(
    task_id='task1',
    provide_context=True,  # pass the task context into the callable's kwargs
    python_callable=webhdfs_touchfile_create,
    # quoted so it is a Jinja string; this lookup is the part in question
    templates_dict={'attr1': "{{ default_args['touchfile_path'] }}"},
    dag=dag)
Since templates_dict is the only PythonOperator attribute on which Jinja templating works, how can I retrieve the 'touchfile_path' parameter in there?
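
For reference, a minimal sketch of two ways this can work, assuming Airflow 1.x behaviour: provide_context=True merges the template context (including the rendered templates_dict and the dag object) into the callable's kwargs, and the Jinja template context exposes the DAG as dag. The task_id and the attr1 key are carried over from the question.

def webhdfs_touchfile_create(ds, *args, **kwargs):
    # option 1: the rendered templates_dict is merged into kwargs by PythonOperator
    rendered = kwargs['templates_dict']['attr1']
    # option 2: the context also carries the DAG object, whose default_args
    # is the plain dict that was passed at DAG construction time
    direct = kwargs['dag'].default_args['touchfile_path']
    print(rendered, direct)

task1 = PythonOperator(
    task_id='task1',
    provide_context=True,
    python_callable=webhdfs_touchfile_create,
    # 'dag' is available in the Jinja template context; bare 'default_args' is not
    templates_dict={'attr1': "{{ dag.default_args['touchfile_path'] }}"},
    dag=dag)

Note that the bare default_args lookup in the original snippet would render empty, since default_args itself is not a key in the template context; going through dag.default_args, or skipping templating entirely and reading kwargs['dag'] in the callable, sidesteps that.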
