Airflow

1.8.0

-------------------------

[AIRFLOW-900] Double trigger should not kill original task instance
[AIRFLOW-900] Fixes bugs in LocalTaskJob for double run protection
[AIRFLOW-932] Do not mark tasks removed when backfilling
[AIRFLOW-961] Run on_kill when SIGTERMed
[AIRFLOW-910] Use parallel task execution for backfills
[AIRFLOW-967] Wrap strings in native for py2 ldap compatibility
[AIRFLOW-941] Use defined parameters for psycopg2
[AIRFLOW-719] Prevent DAGs from ending prematurely
[AIRFLOW-938] Use test for True in task_stats queries
[AIRFLOW-937] Improve performance of task_stats
[AIRFLOW-933] Use ast.literal_eval rather than eval because ast.literal_eval does not execute input
[AIRFLOW-925] Revert airflow.hooks change that cherry-pick picked
[AIRFLOW-919] Running tasks with no start date shouldn't break a DAG's UI
[AIRFLOW-802] Add spark-submit operator/hook
[AIRFLOW-897] Prevent dagruns from failing with unfinished tasks
[AIRFLOW-861] make pickle_info endpoint be login_required
[AIRFLOW-853] use utf8 encoding for stdout line decode
[AIRFLOW-856] Make sure execution date is set for local client
[AIRFLOW-830][AIRFLOW-829][AIRFLOW-88] Reduce Travis log verbosity
[AIRFLOW-831] Restore import to fix broken tests
[AIRFLOW-794] Access DAGS_FOLDER and SQL_ALCHEMY_CONN exclusively from settings
[AIRFLOW-694] Fix config behaviour for empty envvar
[AIRFLOW-365] Set dag.fileloc explicitly and use for Code view
[AIRFLOW-931] Do not set QUEUED in TaskInstances
[AIRFLOW-899] Tasks in SCHEDULED state should be white in the UI instead of black
[AIRFLOW-895] Address Apache release non-compliance issues
[AIRFLOW-893][AIRFLOW-510] Fix crashing webservers when a dagrun has no start date
[AIRFLOW-793] Enable compressed loading in S3ToHiveTransfer
[AIRFLOW-863] Example DAGs should have recent start dates
[AIRFLOW-869] Refactor mark success functionality
[AIRFLOW-814] Fix Presto*CheckOperator.__init__
[AIRFLOW-844] Fix cgroups directory creation
[AIRFLOW-816] Use static nvd3 and d3
[AIRFLOW-821] Fix py3 compatibility
[AIRFLOW-817] Check for None value of execution_date in endpoint
[AIRFLOW-822] Close db before exception
[AIRFLOW-815] Add prev/next execution dates to template variables
[AIRFLOW-813] Fix unterminated unit tests in SchedulerJobTest
[AIRFLOW-813] Fix unterminated scheduler unit tests
[AIRFLOW-806] UI should properly ignore DAG doc when it is None
[AIRFLOW-812] Fix the scheduler termination bug.
[AIRFLOW-780] Fix dag import errors no longer working
[AIRFLOW-783] Fix py3 incompatibility in BaseTaskRunner
[AIRFLOW-810] Correct down_revision dag_id/state index creation
[AIRFLOW-807] Improve scheduler performance for large DAGs
[AIRFLOW-798] Check return_code before forcing termination
[AIRFLOW-139] Let psycopg2 handle autocommit for PostgresHook
[AIRFLOW-776] Add missing cgroups devel dependency
[AIRFLOW-777] Fix expression to check if a DagRun is in running state
[AIRFLOW-785] Don't import CgroupTaskRunner at global scope

1.7.2

-------------------------

[AIRFLOW-463] Link Airflow icon to landing page
[AIRFLOW-149] Task Dependency Engine + Why Isn't My Task Running View
[AIRFLOW-361] Add default failure handler for the Qubole Operator
[AIRFLOW-353] Fix dag run status update failure
[AIRFLOW-447] Store source URIs in Python 3 compatible list
[AIRFLOW-443] Make module names unique when importing
[AIRFLOW-444] Add Google authentication backend
[AIRFLOW-446][AIRFLOW-445] Adds missing dataproc submit options
[AIRFLOW-431] Add CLI for CRUD operations on pools
[AIRFLOW-329] Update Dag Overview Page with Better Status Columns
[AIRFLOW-360] Fix style warnings in models.py
[AIRFLOW-425] Add white fill for null state tasks in tree view.
[AIRFLOW-69] Use dag runs in backfill jobs
[AIRFLOW-415] Make dag_id not found error clearer
[AIRFLOW-416] Use ordinals in README's company list
[AIRFLOW-369] Allow setting default DAG orientation
[AIRFLOW-410] Add 2 Q/A to the FAQ in the docs
[AIRFLOW-407] Add different colors for some sensors
[AIRFLOW-414] Improve error message for missing FERNET_KEY
[AIRFLOW-406] Sphinx/rst fixes
[AIRFLOW-412] Fix lxml dependency
[AIRFLOW-413] Fix unset path bug when backfilling via pickle
[AIRFLOW-78] Airflow clear leaves dag_runs
[AIRFLOW-402] Remove NamedHivePartitionSensor static check, add docs
[AIRFLOW-394] Add an option to the Task Duration graph to show cumulative times
[AIRFLOW-404] Retry download if unpacking fails for hive
[AIRFLOW-276] Gunicorn rolling restart
[AIRFLOW-399] Remove dags/testdruid.py
[AIRFLOW-400] models.py/DAG.set_dag_runs_state() does not correctly set state
[AIRFLOW-395] Fix colon/equal signs typo for resources in default config
[AIRFLOW-397] Documentation: Fix typo "instatiating" to "instantiating"
[AIRFLOW-395] Remove trailing commas from resources in config
[AIRFLOW-388] Add a new chart for Task_Tries for each DAG
[AIRFLOW-322] Fix typo in FAQ section
[AIRFLOW-375] Pylint fixes
[AIRFLOW-386] Limit scope to user email only
[AIRFLOW-383] Cleanup example qubole operator dag
[AIRFLOW-160] Parse DAG files through child processes
[AIRFLOW-381] Manual UI Dag Run creation: require dag_id field
[AIRFLOW-373] Enhance CLI variables functionality
[AIRFLOW-379] Enhance Variables page functionality: import/export variables
[AIRFLOW-331] Modify the LDAP authentication config lines in the 'Security' sample code
[AIRFLOW-356][AIRFLOW-355][AIRFLOW-354] Replace nobr, enable DAG only exists locally message, change edit DAG icon
[AIRFLOW-362] Import __future__ division

1.7.1

-------------------------

- Fix: Don't treat premature tasks as could_not_run tasks
- AIRFLOW-92 Avoid unneeded upstream_failed session closes (apache/incubator-airflow#1485)
- Add logic to lock DB and avoid race condition
- Handle queued tasks from multiple jobs/executors
- AIRFLOW-52 Warn about overwriting tasks in a DAG
- Fix corner case with joining processes/queues (1473)
- [AIRFLOW-52] Fix bottlenecks when working with many tasks
- Add columns to toggle extra detail in the connection list view.
- Log the number of errors when importing DAGs
- Log dagbag metrics duplicate messages in queue into Statsd (1406)
- Clean up issue template (1419)
- Correct mistaken arg.foreground to arg.daemon in the CLI
- Reinstate imports for github enterprise auth
- Use os.execvp instead of subprocess.Popen for the webserver
- Revert from using "--foreground" to "--daemon"
- Implement a Cloudant hook
- Add missing args to `airflow clear`
- Fixed a bug in the scheduler: num_runs used where runs intended
- Add multiprocessing support to the scheduler
- Partial fix to make sure next_run_date cannot be None
- Support list/get/set variables in the CLI
- Properly handle BigQuery booleans in BigQuery hook.
- Added the ability to view XCom variables in webserver
- Change DAG.tasks from a list to a dict
- Add support for zipped dags
- Stop creating hook on instantiating of S3 operator
- Use subquery in views to find running DAGs
- Prevent DAGs from being reloaded on every scheduler iteration
- Add a missing word to docs
- Document the parameters of `DbApiHook`
- added oracle operator with existing oracle hook
- Add PyOpenSSL to Google cloud gcp_api.
- Remove executor error unit test
- Add DAG inference, deferral, and context manager
- Don't return error when writing files to Google cloud storage.
- Fix GCS logging for gcp_api.
- Ensure attr is in scope for error message
- Fixing misnamed PULL_REQUEST_TEMPLATE
- Extract non_pooled_task_slot_count into a configuration param
- Update plugins.rst for clarity on the example (1309)
- Fix s3 logging issue
- Add twitter feed example dag
- Github ISSUE_TEMPLATE & PR_TEMPLATE cleanup
- Reduce logger verbosity
- Adding a PR Template
- Add Lucid to list of users
- Fix usage of asciiart
- Use session instead of outdated main_session for are_dependencies_met
- Fix celery flower port allocation
- Fix for missing edit actions due to flask-admin upgrade
- Fix typo in comment in prioritize_queued method
- Add HipchatOperator
- Include all example dags in backfill unit test
- Make sure skipped jobs are actually skipped
- Fixing a broken example dag, example_skip_dag.py
- Add consistent and thorough signal handling and logging
- Allow Operators to specify SKIPPED status internally
- Update docstring for executor trap unit test
- Doc: explain the usage of Jinja templating for templated params
- Don't schedule runs before the DAG's start_date
- Fix infinite retries with pools, with test
- Fix handling of deadlocked jobs
- Show only Airflow's deprecation warnings
- Set DAG_FOLDER for unit tests
- Missing comma in setup.py
- Deprecate *args and **kwargs in BaseOperator
- Raise deep scheduler exceptions to force a process restart.
- Change inconsistent example DAG owners
- Fix module path of send_email_smtp in configuration
- added Gentner Lab to list of users
- Increase timeout time for unit test
- Fix reading strings from conf
- CHORE - Remove Trailing Spaces
- Fix SSHExecuteOperator crash when using a custom ssh port
- Add note about airflow components to template
- Rewrite BackfillJob logic for clarity
- Add unit tests
- Fix miscellaneous bugs and clean up code
- Fix logic for determining DagRun states
- Make SchedulerJob not run EVERY queued task
- Improve BackfillJob handling of queued/deadlocked tasks
- Introduce ignore_depends_on_past parameters
- Use Popen with CeleryExecutor
- Rename user table to users to avoid conflict with postgres
- Beware of negative pool slots.
- Add support for calling_format from boto to S3_Hook
- Add pypi meta data and sync version number
- Set dags_are_paused_at_creation's default value to True
- Resurface S3Log class eaten by rebase/push -f
- Add missing session.commit() at end of initdb
- Validate that subdag tasks have pool slots available, and test
- Use urlparse for remote GCS logs, and add unit tests
- Make webserver worker timeout configurable
- Fixed scheduling for once interval
- Use psycopg2's API for serializing postgres cell values
- Make the provide_session decorator more robust
- update link to Lyft's website
- use num_shards instead of partitions to be consistent with batch ingestion
- Add documentation links to README
- Update docs with separate configuration section
- Fix airflow.utils deprecation warning code being Python 3 incompatible
- Extract dbapi cell serialization into its own method
- Set Postgres autocommit as supported only if server version is < 7.4
- Use refactored utils module in unit test imports
- Add changelog for 1.7.0
- Use LocalExecutor on Travis if possible
- remove unused logging,errno, MiniHiveCluster imports
- remove extra import of logging lib
- Fix required gcloud version
- Refactoring utils into smaller submodules
- Properly measure number of task retry attempts
- Add function to get configuration as dict, plus unit tests
- Merge branch 'master' into hivemeta_sasl
- Add wiki link to README.md
- [hotfix] make email.Utils > email.utils for py3
- Add the missing "Date" header to the warning e-mails
- Check name of SubDag class instead of class itself
- [hotfix] removing repo_token from .coveralls.yml
- Set the service_name in coveralls.yml
- Fixes 1223
- Update Airflow docs for remote logging
- Add unit tests for trapping Executor errors
- Make sure Executors properly trap errors
- Fix HttpOpSensorTest to use fake request session
- Linting
- Add an example on pool usage in the documentation
- Add two methods to the BigQuery hook's base cursor: run_table_upsert, which adds a table or updates an existing table, and run_grant_dataset_view_access, which grants view access to a given dataset for a given table (see the sketch after this list)
- Allow tasks to reference upstream and downstream tasks using strings instead of object references
- Fix typos in models.py
- Fix broken links in documentation
- [hotfix] fixing the Scheduler CLI to make dag_id optional
- Update link to Common Pitfalls wiki page in README
- Allow disabling periodic committing when inserting rows with DbApiHook
- added Glassdoor to "who uses airflow"
- Fix typo preventing from launching webserver
- Documentation badge
- Fixing ISSUE_TEMPLATE name to include .md suffix
- Adding an ISSUE_TEMPLATE to ensure that issues are adequately defined
- Linting & debugging
- Refactoring the CLI to be data-driven
- Updating the Bug Reporting protocol in the Contributing.md file
- Fixing the docs
- clean up references to old session
- remove session reference
- resolve conflict
- clear xcom data when task instance starts
- replace main_session with provide_session
- Add extras to installation.rst
- Changes to Contributing to reflect more closely the current state of development.
- Modifying README to link to the wiki committer list
- docs: fixes a spelling mistake in default config
- Set KillMode to 'control-group' for webservice.service
- Set KillMode to 'control-group' for worker.service
- Linting
- Fix WebHdfsSensor
- Adding more licenses to pass checks
- fixing landscape's config
- [hotfix] typo that made it in master
- [hotfix] fixing landscape requirement detection
- Make testing on hive conditional
- Merge remote-tracking branch 'upstream/master' into minicluster
- Update README.md
- Throwing in a few licenses to pass the build
- Adding a reqs.txt for landscape.io
- Pointing to a reqs file
- Some linting
- Adding a .landscape.yml file
- badge for pypi version
- Add license and ignore for sql and csv
- Use correct connection id
- Use correct table name
- Provide data for ci tests
- new badge for showing staleness of reqs
- removing requirements.txt as it is uni-dimensional
- Make it work on py3
- Remove decode for logging
- Also keep py2 compatible
- More py3 fixes
- Convert to bytes for py3 compat
- Make sure to be py3 compatible
- Use unicodecsv to make it py3 compatible
- Replace tabs with spaces; remove unused import
- Merge remote-tracking branch 'upstream/master'
- Support decimal types in MySQL to GCS
- Make sure to write binary as string can be unicode
- Ignore metastore
- More impyla fixes
- Test HivemetaStore if python 2
- Allow users to set hdfs_namenode_principal in HDFSHook config
- Add tests for Hiveserver2 and fix some issues from impyla
- Merge branch 'impyla' into minicluster
- Allow testing of Hive operators and hooks; SASL is used (NoSasl in the connection string is not possible), and tests have been adjusted
- Treat SKIPPED and SUCCESS the same way when evaluating depends_on_past=True
- fix bigquery hook
- version cap for gcp_api
- Fix typo when returning VerticaHook
- Adding fernet key to use it as part of stdout commands
- Adding support for ssl parameters. (picking up from jthomas123)
- more detail in error message.
- Make sure paths don't conflict because of trailing /
- change gcs_hook to self.hook
- refactor remote log read/write and add GCS support
- Only use multipart upload in S3Hook if file is large enough
- Merge branch 'airbnb/master'
- Add GSSAPI SASL to HiveMetaStoreHook.
- Add warning for deprecated setting
- Use kerberos_service_name = 'hive' as standard instead of 'impala'.
- Use GSSAPI instead of KERBEROS and provide backwards compatibility
- ISSUE-1123 Use impyla instead of pyhs2
- set celery_executor to use queue name as exchange
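
A hypothetical usage sketch for the two BigQuery cursor methods described above (run_table_upsert and run_grant_dataset_view_access). The contrib-era import path, connection id, and parameter names (dataset_id, table_resource, source_dataset, view_dataset, view_table) are assumptions inferred from the entry, not confirmed signatures.

```python
# Hypothetical sketch; import path, connection id, and parameter names are
# assumptions based on the changelog entry above, not confirmed signatures.
from airflow.contrib.hooks.bigquery_hook import BigQueryHook

hook = BigQueryHook(bigquery_conn_id="bigquery_default")
cursor = hook.get_conn().cursor()

# Create the table if it does not exist, otherwise update it in place.
cursor.run_table_upsert(
    dataset_id="analytics",
    table_resource={
        "tableReference": {"tableId": "daily_events"},
        "schema": {"fields": [{"name": "event_id", "type": "STRING"}]},
    },
)

# Grant the reporting view read access to the source dataset.
cursor.run_grant_dataset_view_access(
    source_dataset="analytics",
    view_dataset="reporting",
    view_table="daily_events_view",
)
```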

1.4.4

-------------------------

[AIRFLOW-2248] Fix wrong param name in RedshiftToS3Transfer doc
[AIRFLOW-1433][AIRFLOW-85] New Airflow Webserver UI with RBAC support
[AIRFLOW-1235] Fix webserver's odd behaviour
[AIRFLOW-1460] Allow restoration of REMOVED TI's
[AIRFLOW-2235] Fix wrong docstrings in two operators
[AIRFLOW-XXX] Fix chronological order for companies using Airflow
[AIRFLOW-2124] Upload Python file to a bucket for Dataproc
[AIRFLOW-2212] Fix ungenerated sensor API reference
[AIRFLOW-2226] Rename google_cloud_storage_default to google_cloud_default
[AIRFLOW-2211] Rename hdfs_sensors.py to hdfs_sensor.py for consistency
[AIRFLOW-2225] Update document to include DruidDbApiHook
[AIRFLOW-2202] Add filter support in HiveMetastoreHook().max_partition()
[AIRFLOW-2220] Remove duplicate numeric list entry in security.rst
[AIRFLOW-XXX] Update tutorial documentation
[AIRFLOW-2215] Update celery task to preserve environment variables and improve logging on exception
[AIRFLOW-2185] Use state instead of query param
[AIRFLOW-2183] Refactor DruidHook to enable sql
[AIRFLOW-2203] Defer cycle detection
[AIRFLOW-2203] Remove Useless Commands.
[AIRFLOW-2203] Cache signature in apply_defaults
[AIRFLOW-2203] Speed up Operator Resources
[AIRFLOW-2203] Cache static rules (trigger/weight)
[AIRFLOW-2203] Store task ids as sets not lists
[AIRFLOW-2205] Remove unsupported args from JdbcHook doc
[AIRFLOW-2207] Fix flaky test that uses app.cached_app()
[AIRFLOW-2206] Remove unsupported args from JdbcOperator doc
[AIRFLOW-2140] Add Kubernetes scheduler to SparkSubmitOperator
[AIRFLOW-XXX] Add Xero to list of users
[AIRFLOW-2204] Fix webserver debug mode
[AIRFLOW-102] Fix test_complex_template always succeeds
[AIRFLOW-442] Add SFTPHook
[AIRFLOW-2169] Add schema to MySqlToGoogleCloudStorageOperator
[AIRFLOW-2184][AIRFLOW-2138] Google Cloud Storage allow wildcards
[AIRFLOW-1588] Cast Variable value to string
[AIRFLOW-2199] Fix invalid reference to logger
[AIRFLOW-2191] Change scheduler heartbeat logs from info to debug
[AIRFLOW-2106] SalesForce hook sandbox option
[AIRFLOW-2197] Silence hostname_callable config error message
[AIRFLOW-2150] Use lighter call in HiveMetastoreHook().max_partition()
[AIRFLOW-2186] Change the way logging is carried out in few ops
[AIRFLOW-2181] Convert password_auth and test_password_endpoints from DOS to UNIX
[AIRFLOW-2187] Fix Broken Travis CI due to AIRFLOW-2123
[AIRFLOW-2175] Check that filepath is not None
[AIRFLOW-2173] Don't check task IDs for concurrency reached check
[AIRFLOW-2168] Remote logging for Azure Blob Storage
[AIRFLOW-XXX] Add DocuTAP to list of users
[AIRFLOW-2176] Change the way logging is carried out in BQ Get Data Operator
[AIRFLOW-2177] Add mock test for GCS Download op
[AIRFLOW-2123] Install CI dependencies from setup.py
[AIRFLOW-2129] Presto hook calls _parse_exception_message but defines _get_pretty_exception_message
[AIRFLOW-2174] Fix typos and wrongly rendered documents
[AIRFLOW-2171] Store delegated credentials
[AIRFLOW-2166] Restore BQ run_query dialect param
[AIRFLOW-2163] Add HBC Digital to users of airflow
[AIRFLOW-2065] Fix race-conditions when creating loggers
[AIRFLOW-2147] Plugin manager: added 'sensors' attribute
[AIRFLOW-2059] taskinstance query is awful, un-indexed, and does not scale
[AIRFLOW-2159] Fix a few typos in salesforce_hook
[AIRFLOW-2132] Add step to initialize database
[AIRFLOW-2160] Fix bad rowid deserialization
[AIRFLOW-2161] Add Vevo to list of companies using Airflow
[AIRFLOW-2149] Add link to apache Beam documentation to create self executing Jar
[AIRFLOW-2151] Allow getting the session from AwsHook
[AIRFLOW-2097] tz referenced before assignment
[AIRFLOW-2152] Add Multiply to list of companies using Airflow
[AIRFLOW-1551] Add operator to trigger Jenkins job
[AIRFLOW-2034] Fix mixup between %s and {} when using str.format. The convention is to use .format for string formatting outside logging and lazy %s formatting inside logging calls (see the comment in https://github.com/apache/incubator-airflow/pull/2823/files). Problematic cases were identified with `grep -r '%s' ./* | grep '\.format('` (see the sketch after this list)
[AIRFLOW-2102] Add custom_args to Sendgrid personalizations
[AIRFLOW-1035][AIRFLOW-1053] import unicode_literals to parse Unicode in HQL
[AIRFLOW-2127] Keep loggers during DB migrations
[AIRFLOW-2146] Resolve issues with BQ using DbApiHook methods
[AIRFLOW-2087] Scheduler Report shows incorrect Total task number
[AIRFLOW-2139] Remove unnecessary boilerplate to get DataFrame using pandas_gbq
[AIRFLOW-2125] Using binary package psycopg2-binary
[AIRFLOW-2142] Include message on mkdir failure
[AIRFLOW-1615] SSHHook: use port specified by Connection
[AIRFLOW-2122] Handle boolean values in SSHHook
[AIRFLOW-XXX] Add Tile to the list of users
[AIRFLOW-2130] Add missing Operators to API Reference docs
[AIRFLOW-XXX] Add timeout units (seconds)
[AIRFLOW-2134] Add Alan to the list of companies that use Airflow
[AIRFLOW-2133] Remove references to GitHub issues in CONTRIBUTING
[AIRFLOW-2131] Remove confusing AirflowImport docs
[AIRFLOW-1852] Allow hostname to be overridden
[AIRFLOW-2126] Add Bluecore to active users
[AIRFLOW-1618] Add feature to create GCS bucket
[AIRFLOW-2108] Fix log indentation in BashOperator
[AIRFLOW-2115] Fix doc links to PythonHosted
[AIRFLOW-XXX] Add contributor from Easy company
[AIRFLOW-1882] Add ignoreUnknownValues option to gcs_to_bq operator
[AIRFLOW-2089] Add on kill for SparkSubmit in Standalone Cluster
[AIRFLOW-2113] Address missing DagRun callbacks. Since the handle_callback method belongs to the DAG object, the list of tasks can be retrieved directly with get_task, reducing communication with the database and making Airflow more lightweight
[AIRFLOW-2112] Fix svg width for Recent Tasks on UI.
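
The formatting convention referenced in AIRFLOW-2034 above, shown as a minimal sketch; the logger name and values are purely illustrative.

```python
import logging

log = logging.getLogger(__name__)

task_id = "extract_orders"   # illustrative values
retries = 3

# Inside logging calls: pass %s placeholders and arguments separately, so
# interpolation is deferred until the record is actually emitted.
log.info("Task %s will be retried %s times", task_id, retries)

# Outside logging: build strings eagerly with str.format.
message = "Task {} will be retried {} times".format(task_id, retries)
```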

1.4.1

-------------------------

[AIRFLOW-230] [HiveServer2Hook] Add multi-statement support
[AIRFLOW-142] setup_env.sh doesn't download hive tarball if hdp is specified as distro
[AIRFLOW-223] Make the IP address that Flower binds to configurable
[AIRFLOW-218] Add option to enable webserver gunicorn access/error logs
[AIRFLOW-213] Add "Closes X" phrase to commit messages
[AIRFLOW-68] Align start_date with the schedule_interval
[AIRFLOW-9] Improving docs to meet Apache's standards
[AIRFLOW-131] Make XCom.clear more selective
[AIRFLOW-214] Fix occasional detached TaskInstance
[AIRFLOW-206] Add commit to close PR
[AIRFLOW-206] Always load local log files if they exist
[AIRFLOW-211] Fix JIRA "resolve" vs "close" behavior
[AIRFLOW-64] Add note about relative DAGS_FOLDER
[AIRFLOW-114] Sort plugins dropdown
[AIRFLOW-209] Add scheduler tests and improve lineage handling
[AIRFLOW-207] Improve JIRA auth workflow
[AIRFLOW-187] Improve PR tool UX
[AIRFLOW-155] Documentation of Qubole Operator
Optimize and refactor process_dag
[AIRFLOW-185] Handle empty versions list
[AIRFLOW-201] Fix for HiveMetastoreHook + kerberos
[AIRFLOW-202] Fix stray print line
[AIRFLOW-196] Fix bug that exception is not handled in HttpSensor
[AIRFLOW-195] Add toggle support to subdag clearing in the CLI
[AIRFLOW-23] Support for Google Cloud DataProc
[AIRFLOW-25] Configuration for Celery always required
[AIRFLOW-190] Add codecov and remove download count
[AIRFLOW-168] Correct evaluation of once schedule
[AIRFLOW-183] Fetch log from remote when worker returns 4xx/5xx response
[AIRFLOW-181] Fix failing unpacking of hadoop by redownloading
[AIRFLOW-176] remove unused formatting key
[AIRFLOW-167] Add dag_state option to the CLI
[AIRFLOW-178] Fix bug so that zip file is detected in DAG folder
[AIRFLOW-176] Improve PR Tool JIRA workflow
[AIRFLOW-45] Support hidden Airflow Variables
[AIRFLOW-175] Run git-reset before checkout in PR tool
[AIRFLOW-157] Make PR tool Py3-compat; add JIRA command
[AIRFLOW-170] Add missing apply_defaults

1.0.0

-------------------------

[AIRFLOW-624] Fix setup.py to not import airflow.version as version
[AIRFLOW-779] Task should fail with specific message when deleted
[AIRFLOW-778] Fix completely broken MetastorePartitionSensor
[AIRFLOW-739] Set pickle_info log to debug
[AIRFLOW-771] Make S3 logs append instead of clobber
[AIRFLOW-773] Fix flaky datetime addition in api test
[AIRFLOW-219][AIRFLOW-398] Cgroups + impersonation
[AIRFLOW-683] Add jira hook, operator and sensor
[AIRFLOW-762] Add Google DataProc delete operator
[AIRFLOW-760] Update systemd config
[AIRFLOW-759] Use previous dag_run to verify depends_on_past
[AIRFLOW-757] Set a more sensible default for child_process_log_directory
[AIRFLOW-692] Open XCom page to super-admins only
[AIRFLOW-737] Fix HDFS Sensor directory.
[AIRFLOW-747] Fix retry_delay not honoured
[AIRFLOW-558] Add support for the dag.catchup=(True|False) option (see the sketch after this list)
[AIRFLOW-489] Allow specifying execution date in trigger_dag API
[AIRFLOW-738] Commit deleted xcom items before insert
[AIRFLOW-729] Add Google Cloud Dataproc cluster creation operator
[AIRFLOW-728] Add Google BigQuery table sensor
[AIRFLOW-741] Log to debug instead of info for app.py
[AIRFLOW-731] Fix period bug for NamedHivePartitionSensor
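
The dag.catchup option from AIRFLOW-558 above is set on the DAG itself; a minimal sketch, with the dag_id, start date, and schedule purely illustrative.

```python
from datetime import datetime

from airflow import DAG

# With catchup=False the scheduler only creates runs from the most recent
# interval onward instead of backfilling every interval since start_date.
dag = DAG(
    dag_id="example_no_catchup",      # illustrative name
    start_date=datetime(2017, 1, 1),
    schedule_interval="@daily",
    catchup=False,
)
```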
