docker-airflow's Introduction

docker-airflow


This repository contains the Dockerfile of apache-airflow for Docker's automated build, published to the public Docker Hub Registry.

Information

  • Based on the official Python slim image, and uses the official Postgres as backend and Redis as queue
  • Install Docker
  • Install Docker Compose

Installation

Pull the image from the Docker repository.

docker pull puckel/docker-airflow

Build

Optionally install Extra Airflow Packages and/or Python dependencies at build time:

docker build --rm --build-arg AIRFLOW_DEPS="datadog,dask" -t puckel/docker-airflow .
docker build --rm --build-arg PYTHON_DEPS="flask_oauthlib>=0.9" -t puckel/docker-airflow .

or combined:

docker build --rm --build-arg AIRFLOW_DEPS="datadog,dask" --build-arg PYTHON_DEPS="flask_oauthlib>=0.9" -t puckel/docker-airflow .

Don't forget to update the airflow images in the docker-compose files to puckel/docker-airflow:latest.
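For example, in docker-compose-LocalExecutor.yml (a minimal sketch; the webserver service name follows the compose files in this repository):

  webserver:
    image: puckel/docker-airflow:latest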

Usage

By default, docker-airflow runs Airflow with SequentialExecutor:

docker run -d -p 8080:8080 puckel/docker-airflow webserver

If you want to run another executor, use the other docker-compose.yml files provided in this repository.

For LocalExecutor:

docker-compose -f docker-compose-LocalExecutor.yml up -d

For CeleryExecutor:

docker-compose -f docker-compose-CeleryExecutor.yml up -d

NB: If you want the example DAGs loaded (they are not by default; the entrypoint default is LOAD_EX=n), set the LOAD_EX environment variable to y:

docker run -d -p 8080:8080 -e LOAD_EX=y puckel/docker-airflow

If you want to use the Ad hoc query feature, make sure you've configured your connections: go to Admin -> Connections, edit "postgres_default", and set these values (equivalent to the values in airflow.cfg / the docker-compose*.yml files):

  • Host: postgres
  • Schema: airflow
  • Login: airflow
  • Password: airflow

For encrypted connection passwords (with the Local or Celery executor), you must use the same fernet_key everywhere. By default docker-airflow generates a fernet_key at startup, so you have to set an environment variable in the docker-compose file (e.g. docker-compose-LocalExecutor.yml) to use the same key across containers. To generate a fernet_key:

docker run puckel/docker-airflow python -c "from cryptography.fernet import Fernet; FERNET_KEY = Fernet.generate_key().decode(); print(FERNET_KEY)"
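Then pin the generated key on every Airflow container, for example in docker-compose-LocalExecutor.yml (a minimal sketch; replace <generated_key> with the output of the command above, FERNET_KEY being the variable name read by this image's entrypoint):

  webserver:
    image: puckel/docker-airflow:latest
    environment:
      - FERNET_KEY=<generated_key>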

Configuring Airflow

It's possible to set any configuration value for Airflow from environment variables, which take precedence over values from airflow.cfg.

The general rule is the environment variable should be named AIRFLOW__<section>__<key>, for example AIRFLOW__CORE__SQL_ALCHEMY_CONN sets the sql_alchemy_conn config option in the [core] section.
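For example, a sketch overriding sql_alchemy_conn at container start (the database host here is a placeholder):

docker run -d -p 8080:8080 -e AIRFLOW__CORE__SQL_ALCHEMY_CONN="postgresql+psycopg2://airflow:airflow@somehost:5432/airflow" puckel/docker-airflow webserver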

Check out the Airflow documentation for more details.

You can also define connections via environment variables by prefixing them with AIRFLOW_CONN_ - for example, AIRFLOW_CONN_POSTGRES_MASTER=postgres://user:password@localhost:5432/master for a connection called "postgres_master". The value is parsed as a URI. This works for hooks etc., but the connection won't show up in the "Ad-hoc Query" section unless an (empty) connection with the same name is also created in the DB.
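For example, a sketch passing the connection above to the container (user, password and host are placeholders):

docker run -d -p 8080:8080 -e AIRFLOW_CONN_POSTGRES_MASTER="postgres://user:password@localhost:5432/master" puckel/docker-airflow webserver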

Custom Airflow plugins

Airflow allows for custom user-created plugins, which are typically found in the ${AIRFLOW_HOME}/plugins folder. Documentation on plugins can be found here.

In order to incorporate plugins into your docker container:

  • Create the folder plugins/ with your custom plugins.
  • Mount the folder as a volume by doing either of the following (see the sketch after this list):
    • Include the folder as a volume in command-line -v $(pwd)/plugins/:/usr/local/airflow/plugins
    • Use docker-compose-LocalExecutor.yml or docker-compose-CeleryExecutor.yml, which contain support for adding the plugins folder as a volume
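A minimal compose sketch of the second option (the dags volume mirrors the layout used by the compose files in this repository):

  webserver:
    image: puckel/docker-airflow:latest
    volumes:
      - ./dags:/usr/local/airflow/dags
      - ./plugins:/usr/local/airflow/plugins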

Install custom python package

  • Create a file "requirements.txt" with the desired Python modules
  • Mount this file as a volume -v $(pwd)/requirements.txt:/requirements.txt (or add it as a volume in the docker-compose file)
  • The entrypoint.sh script then executes pip install on it (with the --user option); see the example below
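For example (a sketch; the module list is a placeholder):

echo "docker" > requirements.txt
docker run -d -p 8080:8080 -v $(pwd)/requirements.txt:/requirements.txt puckel/docker-airflow webserver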

UI Links

  • Airflow: localhost:8080
  • Flower: localhost:5555

Scale the number of workers

Easy scaling using docker-compose:

docker-compose -f docker-compose-CeleryExecutor.yml scale worker=5

This can be used to scale to a multi-node setup using Docker Swarm.
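On recent docker-compose releases, where the scale subcommand is deprecated, the equivalent is:

docker-compose -f docker-compose-CeleryExecutor.yml up -d --scale worker=5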

Running other airflow commands

If you want to run other airflow sub-commands, such as list_dags or clear, you can do so like this:

docker run --rm -ti puckel/docker-airflow airflow list_dags

or with your docker-compose setup, like this:

docker-compose -f docker-compose-CeleryExecutor.yml run --rm webserver airflow list_dags

You can also use this to run a bash shell or any other command in the same environment that airflow would be run in:

docker run --rm -ti puckel/docker-airflow bash
docker run --rm -ti puckel/docker-airflow ipython

Simplified SQL database configuration using PostgreSQL

If the executor type is set to anything other than SequentialExecutor, you'll need an SQL database. Here is a list of PostgreSQL configuration variables and their default values. They're used to compute the AIRFLOW__CORE__SQL_ALCHEMY_CONN and AIRFLOW__CELERY__RESULT_BACKEND variables for you when needed, if you don't provide them explicitly:

Variable            Default value  Role
POSTGRES_HOST       postgres       Database server host
POSTGRES_PORT       5432           Database server port
POSTGRES_USER       airflow        Database user
POSTGRES_PASSWORD   airflow        Database password
POSTGRES_DB         airflow        Database name
POSTGRES_EXTRAS     empty          Extra connection parameters

You can also use those variables to adapt your compose file to match an existing PostgreSQL instance managed elsewhere.
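For example, a sketch pointing the containers at an existing instance (host and password are placeholders):

  environment:
    - POSTGRES_HOST=db.example.com
    - POSTGRES_PORT=5432
    - POSTGRES_USER=airflow
    - POSTGRES_PASSWORD=<password>
    - POSTGRES_DB=airflow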

Please refer to the Airflow documentation to understand the use of extras parameters, for example in order to configure a connection that uses TLS encryption.

Here's an important thing to consider:

When specifying the connection as URI (in AIRFLOW_CONN_* variable) you should specify it following the standard syntax of DB connections, where extras are passed as parameters of the URI (note that all components of the URI should be URL-encoded).

Therefore you must provide the extras parameters URL-encoded, starting with a leading ?. For example:

POSTGRES_EXTRAS="?sslmode=verify-full&sslrootcert=%2Fetc%2Fssl%2Fcerts%2Fca-certificates.crt"

Simplified Celery broker configuration using Redis

If the executor type is set to CeleryExecutor, you'll need a Celery broker. Here is a list of Redis configuration variables and their default values. They're used to compute the AIRFLOW__CELERY__BROKER_URL variable for you if you don't provide it explicitly:

Variable        Default value  Role
REDIS_PROTO     redis://       Protocol
REDIS_HOST      redis          Redis server host
REDIS_PORT      6379           Redis server port
REDIS_PASSWORD  empty          If Redis is password protected
REDIS_DBNUM     1              Database number

You can also use those variables to adapt your compose file to match an existing Redis instance managed elsewhere.
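For example, a sketch pointing the containers at an existing password-protected instance (host and password are placeholders):

  environment:
    - REDIS_HOST=redis.example.com
    - REDIS_PASSWORD=<password>

With the remaining defaults, the computed broker URL would then be redis://:<password>@redis.example.com:6379/1.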

Wanna help?

Fork, improve and PR.

docker-airflow's People

Contributors

achm6174, adamunger, ajhodges, arihantsurana, ashb, darkk, diraol, edrzmr, eshizhan, hadsed-genapsys, ianburrell, jabas06, javioverflow, jbdalido, jmcarp, kaxil, kristi, lamroger, maxcountryman, medmrgh, mendhak, mhousley, msn1444, nbardelot, nsepetys, odannyc, puckel, ryanrussell, theodoresiu, ttaschke


docker-airflow's Issues

How to install custom pip packages?

Curious if there is any way to control installing custom pip packages? I was hoping that you could perhaps use something like the ONBUILD triggers on the Python image.

To be more specific, the reason I'm asking is that I tried using the DockerOperator in a DAG and I'm running into the following error: No module named docker.

My guess is that we need to install the docker package.
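A minimal sketch of one way to bake the missing module into the image, using the PYTHON_DEPS build argument from the Build section above (the package name is taken from the error message):

docker build --rm --build-arg PYTHON_DEPS="docker" -t puckel/docker-airflow .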

Scheduler unstable

Hi,

Thanks for the docker image. When I run a pipeline, more often than not I find that the scheduler is quite unstable; it starts off with the error: ERROR - [Errno 104] Connection reset by peer

The error log, for your reference:

ERROR - [Errno 104] Connection reset by peer
2016-06-17T11:42:51.799483436Z Traceback (most recent call last):
2016-06-17T11:42:51.799494203Z File "/usr/local/lib/python2.7/dist-packages/airflow/jobs.py", line 755, in _execute
2016-06-17T11:42:51.799501615Z executor.heartbeat()
2016-06-17T11:42:51.799507897Z File "/usr/local/lib/python2.7/dist-packages/airflow/executors/base_executor.py", line 107, in heartbeat
2016-06-17T11:42:51.799514286Z self.sync()
2016-06-17T11:42:51.799520586Z File "/usr/local/lib/python2.7/dist-packages/airflow/executors/celery_executor.py", line 74, in sync
2016-06-17T11:42:51.799527236Z state = async.state
2016-06-17T11:42:51.799533355Z File "/usr/local/lib/python2.7/dist-packages/celery/result.py", line 394, in state
2016-06-17T11:42:51.799540325Z return self._get_task_meta()['status']
2016-06-17T11:42:51.799546778Z File "/usr/local/lib/python2.7/dist-packages/celery/result.py", line 339, in _get_task_meta
2016-06-17T11:42:51.799553135Z return self._maybe_set_cache(self.backend.get_task_meta(self.id))
2016-06-17T11:42:51.799559187Z File "/usr/local/lib/python2.7/dist-packages/celery/backends/amqp.py", line 163, in get_task_meta
2016-06-17T11:42:51.799565969Z binding.declare()
2016-06-17T11:42:51.799571830Z File "/usr/local/lib/python2.7/dist-packages/kombu/entity.py", line 521, in declare
2016-06-17T11:42:51.799578902Z self.exchange.declare(nowait)
2016-06-17T11:42:51.799585186Z File "/usr/local/lib/python2.7/dist-packages/kombu/entity.py", line 174, in declare
2016-06-17T11:42:51.799591931Z nowait=nowait, passive=passive,
2016-06-17T11:42:51.799598206Z File "/usr/local/lib/python2.7/dist-packages/amqp/channel.py", line 615, in exchange_declare
2016-06-17T11:42:51.799604873Z self._send_method((40, 10), args)
2016-06-17T11:42:51.799611275Z File "/usr/local/lib/python2.7/dist-packages/amqp/abstract_channel.py", line 56, in _send_method
2016-06-17T11:42:51.799631815Z self.channel_id, method_sig, args, content,
2016-06-17T11:42:51.799639848Z File "/usr/local/lib/python2.7/dist-packages/amqp/method_framing.py", line 221, in write_method
2016-06-17T11:42:51.799646601Z write_frame(1, channel, payload)
2016-06-17T11:42:51.799652763Z File "/usr/local/lib/python2.7/dist-packages/amqp/transport.py", line 182, in write_frame
2016-06-17T11:42:51.799659065Z frame_type, channel, size, payload, 0xce,
2016-06-17T11:42:51.799665133Z File "/usr/lib/python2.7/socket.py", line 224, in meth
2016-06-17T11:42:51.799671254Z return getattr(self._sock,name)(*args)
2016-06-17T11:42:51.799676859Z error: [Errno 104] Connection reset by peer
2016-06-17T11:42:51.799967150Z [2016-06-17 11:42:51,799] {jobs.py:759} ERROR - Tachycardia!
2016-06-17T11:45:01.136584726Z Fri Jun 17 11:45:01 UTC 2016 - waiting for RabbitMQ... 1/10
2016-06-17T11:47:13.872341907Z Fri Jun 17 11:47:13 UTC 2016 - waiting for RabbitMQ... 2/10
2016-06-17T11:49:26.352538395Z Fri Jun 17 11:49:26 UTC 2016 - waiting for RabbitMQ... 3/10

Kindly direct me to the fix.

Regards,
Vijay Raajaa G S

Airflow unable to fetch logs

*** Log file isn't local.
*** Fetching here: http://:8793/log/tutl1/print_date1/2017-04-03T09:27:24
*** Failed to fetch log file from worker.

*** Reading remote logs...
*** Unsupported remote log location.

I'm using airflow-1.8.0
When I try to run it as the airflow user the logs come through, but when I modify the Dockerfile to run as root, the logs do not. Can you help me out in finding the issue?

I'm using Celery executor

Kubernetes deployment

Hi, I've been working on a Kubernetes deployment.

Is that something you'd be interested in adding here?

ImportError: No module named packaging.version

I tried to rebuild docker-airflow, and I get the following error:

Installing collected packages: cryptography, idna, pyasn1, setuptools, enum34, ipaddress, cffi, appdirs, packaging, pycparser, pyparsing
  Running setup.py install for cryptography
    
    Installed /tmp/pip-build-WXBrEA/cryptography/cffi-1.9.1-py2.7-linux-x86_64.egg
    Searching for pycparser
    Reading https://pypi.python.org/simple/pycparser/
    Best match: pycparser 2.17
    Downloading https://pypi.python.org/packages/be/64/1bb257ffb17d01f4a38d7ce686809a736837ad4371bcc5c42ba7a715c3ac/pycparser-2.17.tar.gz#md5=ca98dcb50bc1276f230118f6af5a40c7
    Processing pycparser-2.17.tar.gz
    Writing /tmp/easy_install-P3qata/pycparser-2.17/setup.cfg
    Running pycparser-2.17/setup.py -q bdist_egg --dist-dir /tmp/easy_install-P3qata/pycparser-2.17/egg-dist-tmp-nrAJxO
    warning: no previously-included files matching 'yacctab.*' found under directory 'tests'
    warning: no previously-included files matching 'lextab.*' found under directory 'tests'
    warning: no previously-included files matching 'yacctab.*' found under directory 'examples'
    warning: no previously-included files matching 'lextab.*' found under directory 'examples'
    zip_safe flag not set; analyzing archive contents...
    pycparser.ply.yacc: module references __file__
    pycparser.ply.yacc: module MAY be using inspect.getsourcefile
    pycparser.ply.yacc: module MAY be using inspect.stack
    pycparser.ply.lex: module references __file__
    pycparser.ply.lex: module MAY be using inspect.getsourcefile
    pycparser.ply.ygen: module references __file__
    
    Installed /tmp/pip-build-WXBrEA/cryptography/pycparser-2.17-py2.7.egg
    
    no previously-included directories found matching 'docs/_build'
    warning: no previously-included files matching '*' found under directory 'vectors'
    generating cffi module 'build/temp.linux-x86_64-2.7/_padding.c'
    generating cffi module 'build/temp.linux-x86_64-2.7/_constant_time.c'
    generating cffi module 'build/temp.linux-x86_64-2.7/_openssl.c'
    building '_openssl' extension
    x86_64-linux-gnu-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fno-strict-aliasing -D_FORTIFY_SOURCE=2 -g -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/usr/include/python2.7 -c build/temp.linux-x86_64-2.7/_openssl.c -o build/temp.linux-x86_64-2.7/build/temp.linux-x86_64-2.7/_openssl.o
    x86_64-linux-gnu-gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -D_FORTIFY_SOURCE=2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wl,-z,relro -D_FORTIFY_SOURCE=2 -g -fstack-protector-strong -Wformat -Werror=format-security build/temp.linux-x86_64-2.7/build/temp.linux-x86_64-2.7/_openssl.o -lssl -lcrypto -o build/lib.linux-x86_64-2.7/cryptography/hazmat/bindings/_openssl.so
    building '_constant_time' extension
    x86_64-linux-gnu-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fno-strict-aliasing -D_FORTIFY_SOURCE=2 -g -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/usr/include/python2.7 -c build/temp.linux-x86_64-2.7/_constant_time.c -o build/temp.linux-x86_64-2.7/build/temp.linux-x86_64-2.7/_constant_time.o
    x86_64-linux-gnu-gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -D_FORTIFY_SOURCE=2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wl,-z,relro -D_FORTIFY_SOURCE=2 -g -fstack-protector-strong -Wformat -Werror=format-security build/temp.linux-x86_64-2.7/build/temp.linux-x86_64-2.7/_constant_time.o -o build/lib.linux-x86_64-2.7/cryptography/hazmat/bindings/_constant_time.so
    building '_padding' extension
    x86_64-linux-gnu-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fno-strict-aliasing -D_FORTIFY_SOURCE=2 -g -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/usr/include/python2.7 -c build/temp.linux-x86_64-2.7/_padding.c -o build/temp.linux-x86_64-2.7/build/temp.linux-x86_64-2.7/_padding.o
    x86_64-linux-gnu-gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -D_FORTIFY_SOURCE=2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wl,-z,relro -D_FORTIFY_SOURCE=2 -g -fstack-protector-strong -Wformat -Werror=format-security build/temp.linux-x86_64-2.7/build/temp.linux-x86_64-2.7/_padding.o -o build/lib.linux-x86_64-2.7/cryptography/hazmat/bindings/_padding.so
  Found existing installation: setuptools 5.5.1
    Not uninstalling setuptools at /usr/lib/python2.7/dist-packages, owned by OS
  Running setup.py install for cffi
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/usr/local/lib/python2.7/dist-packages/setuptools/__init__.py", line 12, in <module>
        import setuptools.version
      File "/usr/local/lib/python2.7/dist-packages/setuptools/version.py", line 1, in <module>
        import pkg_resources
      File "/usr/local/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 70, in <module>
        import packaging.version
    ImportError: No module named packaging.version
    Complete output from command /usr/bin/python -c "import setuptools, tokenize;__file__='/tmp/pip-build-WXBrEA/cffi/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record /tmp/pip-7DvuKF-record/install-record.txt --single-version-externally-managed --compile:
    Traceback (most recent call last):

  File "<string>", line 1, in <module>

  File "/usr/local/lib/python2.7/dist-packages/setuptools/__init__.py", line 12, in <module>

    import setuptools.version

  File "/usr/local/lib/python2.7/dist-packages/setuptools/version.py", line 1, in <module>

    import pkg_resources

  File "/usr/local/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 70, in <module>

    import packaging.version

ImportError: No module named packaging.version

Looks like the version of pip is 1.5.6, whereas upgrading to pip 9+ seems to fix the issue (cf. pypa/setuptools#937).
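A minimal workaround sketch, assuming you upgrade pip in the Dockerfile before any other pip install step:

RUN pip install --upgrade pip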

Mysql Fails

If I clone this repo and then run docker-compose up, it seems everything starts OK until an example DAG is run. Then the whole thing fails.

scheduler_1 | [2016-03-08 10:38:03,426] {jobs.py:632} ERROR - (_mysql_exceptions.OperationalError) (2013, 'Lost connection to MySQL server during query') [SQL: u'SELECT task_instance.task_id AS task_instance_task_id, task_instance.dag_id AS task_instance_dag_id, task_instance.execution_date AS task_instance_execution_date, task_instance.start_date AS task_instance_start_date, task_instance.end_date AS task_instance_end_date, task_instance.duration AS task_instance_duration, task_instance.state AS task_instance_state, task_instance.try_number AS task_instance_try_number, task_instance.hostname AS task_instance_hostname, task_instance.unixname AS task_instance_unixname, task_instance.job_id AS task_instance_job_id, task_instance.pool AS task_instance_pool, task_instance.queue AS task_instance_queue, task_instance.priority_weight AS task_instance_priority_weight, task_instance.operator AS task_instance_operator, task_instance.queued_dttm AS task_instance_queued_dttm \nFROM task_instance \nWHERE task_instance.dag_id = %s AND task_instance.task_id = %s AND task_instance.execution_date = %s \n LIMIT %s'] [parameters: ('example_bash_operator', 'also_run_this', datetime.datetime(2016, 3, 1, 0, 0), 1)]
scheduler_1 | Traceback (most recent call last):
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/airflow/jobs.py", line 629, in _execute
scheduler_1 |     self.process_dag(dag, executor)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/airflow/jobs.py", line 475, in process_dag
scheduler_1 |     ti.refresh_from_db()
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/airflow/models.py", line 667, in refresh_from_db
scheduler_1 |     TI.execution_date == self.execution_date,
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/query.py", line 2634, in first
scheduler_1 |     ret = list(self[0:1])
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/query.py", line 2457, in __getitem__
scheduler_1 |     return list(res)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/query.py", line 2736, in __iter__
scheduler_1 |     return self._execute_and_instances(context)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/query.py", line 2751, in _execute_and_instances
scheduler_1 |     result = conn.execute(querycontext.statement, self._params)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 914, in execute
scheduler_1 |     return meth(self, multiparams, params)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/sql/elements.py", line 323, in _execute_on_connection
scheduler_1 |     return connection._execute_clauseelement(self, multiparams, params)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1010, in _execute_clauseelement
scheduler_1 |     compiled_sql, distilled_params
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1146, in _execute_context
scheduler_1 |     context)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1341, in _handle_dbapi_exception
scheduler_1 |     exc_info
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/util/compat.py", line 200, in raise_from_cause
scheduler_1 |     reraise(type(exception), exception, tb=exc_tb, cause=cause)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1139, in _execute_context
scheduler_1 |     context)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/default.py", line 450, in do_execute
scheduler_1 |     cursor.execute(statement, parameters)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/MySQLdb/cursors.py", line 226, in execute
scheduler_1 |     self.errorhandler(self, exc, value)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/MySQLdb/connections.py", line 36, in defaulterrorhandler
scheduler_1 |     raise errorvalue
scheduler_1 | OperationalError: (_mysql_exceptions.OperationalError) (2013, 'Lost connection to MySQL server during query') [SQL: u'SELECT task_instance.task_id AS task_instance_task_id, task_instance.dag_id AS task_instance_dag_id, task_instance.execution_date AS task_instance_execution_date, task_instance.start_date AS task_instance_start_date, task_instance.end_date AS task_instance_end_date, task_instance.duration AS task_instance_duration, task_instance.state AS task_instance_state, task_instance.try_number AS task_instance_try_number, task_instance.hostname AS task_instance_hostname, task_instance.unixname AS task_instance_unixname, task_instance.job_id AS task_instance_job_id, task_instance.pool AS task_instance_pool, task_instance.queue AS task_instance_queue, task_instance.priority_weight AS task_instance_priority_weight, task_instance.operator AS task_instance_operator, task_instance.queued_dttm AS task_instance_queued_dttm \nFROM task_instance \nWHERE task_instance.dag_id = %s AND task_instance.task_id = %s AND task_instance.execution_date = %s \n LIMIT %s'] [parameters: ('example_bash_operator', 'also_run_this', datetime.datetime(2016, 3, 1, 0, 0), 1)]
scheduler_1 | [2016-03-08 10:38:03,496] {jobs.py:653} ERROR - (sqlalchemy.exc.InvalidRequestError) Can't reconnect until invalid transaction is rolled back [SQL: u'SELECT dag.dag_id AS dag_dag_id, dag.is_paused AS dag_is_paused, dag.is_subdag AS dag_is_subdag, dag.is_active AS dag_is_active, dag.last_scheduler_run AS dag_last_scheduler_run, dag.last_pickled AS dag_last_pickled, dag.last_expired AS dag_last_expired, dag.scheduler_lock AS dag_scheduler_lock, dag.pickle_id AS dag_pickle_id, dag.fileloc AS dag_fileloc, dag.owners AS dag_owners \nFROM dag \nWHERE dag.dag_id = %s \n LIMIT %s'] [parameters: [immutabledict({})]]
scheduler_1 | Traceback (most recent call last):
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/airflow/jobs.py", line 624, in _execute
scheduler_1 |     dag = dagbag.get_dag(dag.dag_id)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/airflow/models.py", line 155, in get_dag
scheduler_1 |     orm_dag = DagModel.get_current(dag_id)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/airflow/models.py", line 1907, in get_current
scheduler_1 |     obj = session.query(cls).filter(cls.dag_id == dag_id).first()
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/query.py", line 2634, in first
scheduler_1 |     ret = list(self[0:1])
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/query.py", line 2457, in __getitem__
scheduler_1 |     return list(res)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/query.py", line 2736, in __iter__
scheduler_1 |     return self._execute_and_instances(context)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/query.py", line 2751, in _execute_and_instances
scheduler_1 |     result = conn.execute(querycontext.statement, self._params)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 914, in execute
scheduler_1 |     return meth(self, multiparams, params)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/sql/elements.py", line 323, in _execute_on_connection
scheduler_1 |     return connection._execute_clauseelement(self, multiparams, params)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1010, in _execute_clauseelement
scheduler_1 |     compiled_sql, distilled_params
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1078, in _execute_context
scheduler_1 |     None, None)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1341, in _handle_dbapi_exception
scheduler_1 |     exc_info
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/util/compat.py", line 200, in raise_from_cause
scheduler_1 |     reraise(type(exception), exception, tb=exc_tb, cause=cause)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1071, in _execute_context
scheduler_1 |     conn = self._revalidate_connection()
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 391, in _revalidate_connection
scheduler_1 |     "Can't reconnect until invalid "
scheduler_1 | StatementError: (sqlalchemy.exc.InvalidRequestError) Can't reconnect until invalid transaction is rolled back [SQL: u'SELECT dag.dag_id AS dag_dag_id, dag.is_paused AS dag_is_paused, dag.is_subdag AS dag_is_subdag, dag.is_active AS dag_is_active, dag.last_scheduler_run AS dag_last_scheduler_run, dag.last_pickled AS dag_last_pickled, dag.last_expired AS dag_last_expired, dag.scheduler_lock AS dag_scheduler_lock, dag.pickle_id AS dag_pickle_id, dag.fileloc AS dag_fileloc, dag.owners AS dag_owners \nFROM dag \nWHERE dag.dag_id = %s \n LIMIT %s'] [parameters: [immutabledict({})]]
scheduler_1 | [2016-03-08 10:38:03,508] {jobs.py:602} ERROR - (sqlalchemy.exc.InvalidRequestError) Can't reconnect until invalid transaction is rolled back [SQL: u'SELECT slot_pool.id AS slot_pool_id, slot_pool.pool AS slot_pool_pool, slot_pool.slots AS slot_pool_slots, slot_pool.description AS slot_pool_description \nFROM slot_pool'] [parameters: [immutabledict({})]]
scheduler_1 | Traceback (most recent call last):
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/airflow/jobs.py", line 600, in _execute
scheduler_1 |     self.prioritize_queued(executor=executor, dagbag=dagbag)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/airflow/utils.py", line 132, in wrapper
scheduler_1 |     result = func(*args, **kwargs)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/airflow/jobs.py", line 500, in prioritize_queued
scheduler_1 |     pools = {p.pool: p for p in session.query(models.Pool).all()}
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/query.py", line 2588, in all
scheduler_1 |     return list(self)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/query.py", line 2736, in __iter__
scheduler_1 |     return self._execute_and_instances(context)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/query.py", line 2751, in _execute_and_instances
scheduler_1 |     result = conn.execute(querycontext.statement, self._params)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 914, in execute
scheduler_1 |     return meth(self, multiparams, params)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/sql/elements.py", line 323, in _execute_on_connection
scheduler_1 |     return connection._execute_clauseelement(self, multiparams, params)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1010, in _execute_clauseelement
scheduler_1 |     compiled_sql, distilled_params
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1078, in _execute_context
scheduler_1 |     None, None)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1341, in _handle_dbapi_exception
scheduler_1 |     exc_info
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/util/compat.py", line 200, in raise_from_cause
scheduler_1 |     reraise(type(exception), exception, tb=exc_tb, cause=cause)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1071, in _execute_context
scheduler_1 |     conn = self._revalidate_connection()
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 391, in _revalidate_connection
scheduler_1 |     "Can't reconnect until invalid "
scheduler_1 | StatementError: (sqlalchemy.exc.InvalidRequestError) Can't reconnect until invalid transaction is rolled back [SQL: u'SELECT slot_pool.id AS slot_pool_id, slot_pool.pool AS slot_pool_pool, slot_pool.slots AS slot_pool_slots, slot_pool.description AS slot_pool_description \nFROM slot_pool'] [parameters: [immutabledict({})]]
scheduler_1 | [2016-03-08 10:38:03,517] {jobs.py:653} ERROR - (sqlalchemy.exc.InvalidRequestError) Can't reconnect until invalid transaction is rolled back [SQL: u'SELECT dag.dag_id AS dag_dag_id, dag.is_paused AS dag_is_paused, dag.is_subdag AS dag_is_subdag, dag.is_active AS dag_is_active, dag.last_scheduler_run AS dag_last_scheduler_run, dag.last_pickled AS dag_last_pickled, dag.last_expired AS dag_last_expired, dag.scheduler_lock AS dag_scheduler_lock, dag.pickle_id AS dag_pickle_id, dag.fileloc AS dag_fileloc, dag.owners AS dag_owners \nFROM dag \nWHERE dag.is_paused = true'] [parameters: [immutabledict({})]]
scheduler_1 | Traceback (most recent call last):
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/airflow/jobs.py", line 621, in _execute
scheduler_1 |     paused_dag_ids = dagbag.paused_dags()
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/airflow/models.py", line 339, in paused_dags
scheduler_1 |     DagModel.is_paused == True)]
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/query.py", line 2736, in __iter__
scheduler_1 |     return self._execute_and_instances(context)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/query.py", line 2751, in _execute_and_instances
scheduler_1 |     result = conn.execute(querycontext.statement, self._params)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 914, in execute
scheduler_1 |     return meth(self, multiparams, params)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/sql/elements.py", line 323, in _execute_on_connection
scheduler_1 |     return connection._execute_clauseelement(self, multiparams, params)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1010, in _execute_clauseelement
scheduler_1 |     compiled_sql, distilled_params
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1078, in _execute_context
scheduler_1 |     None, None)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1341, in _handle_dbapi_exception
scheduler_1 |     exc_info
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/util/compat.py", line 200, in raise_from_cause
scheduler_1 |     reraise(type(exception), exception, tb=exc_tb, cause=cause)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1071, in _execute_context
scheduler_1 |     conn = self._revalidate_connection()
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 391, in _revalidate_connection
scheduler_1 |     "Can't reconnect until invalid "
scheduler_1 | StatementError: (sqlalchemy.exc.InvalidRequestError) Can't reconnect until invalid transaction is rolled back [SQL: u'SELECT dag.dag_id AS dag_dag_id, dag.is_paused AS dag_is_paused, dag.is_subdag AS dag_is_subdag, dag.is_active AS dag_is_active, dag.last_scheduler_run AS dag_last_scheduler_run, dag.last_pickled AS dag_last_pickled, dag.last_expired AS dag_last_expired, dag.scheduler_lock AS dag_scheduler_lock, dag.pickle_id AS dag_pickle_id, dag.fileloc AS dag_fileloc, dag.owners AS dag_owners \nFROM dag \nWHERE dag.is_paused = true'] [parameters: [immutabledict({})]]
[The same two "Can't reconnect until invalid transaction is rolled back" tracebacks, for the slot_pool and dag queries, repeat from here on in the scheduler output.]
scheduler_1 |     reraise(type(exception), exception, tb=exc_tb, cause=cause)
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1071, in _execute_context
scheduler_1 |     conn = self._revalidate_connection()
scheduler_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 391, in _revalidate_connection
scheduler_1 |     "Can't reconnect until invalid "
scheduler_1 | StatementError: (sqlalchemy.exc.InvalidRequestError) Can't reconnect until invalid transaction is rolled back [SQL: u'SELECT dag.dag_id AS dag_dag_id, dag.is_paused AS dag_is_paused, dag.is_subdag AS dag_is_subdag, dag.is_active AS dag_is_active, dag.last_scheduler_run AS dag_last_scheduler_run, dag.last_pickled AS dag_last_pickled, dag.last_expired AS dag_last_expired, dag.scheduler_lock AS dag_scheduler_lock, dag.pickle_id AS dag_pickle_id, dag.fileloc AS dag_fileloc, dag.owners AS dag_owners \nFROM dag \nWHERE dag.is_paused = true'] [parameters: [immutabledict({})]]
scheduler_1 | [2016-03-08 10:38:03,565] {jobs.py:602} ERROR - (sqlalchemy.exc.InvalidRequestError) Can't reconnect until invalid transaction is rolled back [SQL: u'SELECT slot_pool.id AS slot_pool_id, slot_pool.pool AS slot_pool_pool, slot_pool.slots AS slot_pool_slots, slot_pool.description AS slot_pool_description \nFROM slot_pool'] [parameters: [immutabledict({})]]
[... identical traceback and StatementError repeated ...]
scheduler_1 | [2016-03-08 10:38:03,567] {models.py:124} INFO - Filling up the DagBag from /usr/local/airflow/dags
scheduler_1 | [2016-03-08 10:38:03,570] {models.py:197} INFO - Importing /usr/local/lib/python2.7/dist-packages/airflow/example_dags/example_http_operator.py
scheduler_1 | [2016-03-08 10:38:03,644] {models.py:324} WARNING - (sqlalchemy.exc.InvalidRequestError) Can't reconnect until invalid transaction is rolled back [SQL: u'SELECT dag.dag_id AS dag_dag_id, dag.is_paused AS dag_is_paused, dag.is_subdag AS dag_is_subdag, dag.is_active AS dag_is_active, dag.last_scheduler_run AS dag_last_scheduler_run, dag.last_pickled AS dag_last_pickled, dag.last_expired AS dag_last_expired, dag.scheduler_lock AS dag_scheduler_lock, dag.pickle_id AS dag_pickle_id, dag.fileloc AS dag_fileloc, dag.owners AS dag_owners \nFROM dag \nWHERE dag.dag_id = %s \n LIMIT %s'] [parameters: [immutabledict({})]]
scheduler_1 | [2016-03-08 10:38:03,646] {models.py:197} INFO - Importing /usr/local/lib/python2.7/dist-packages/airflow/example_dags/example_python_operator.py
scheduler_1 | [2016-03-08 10:38:03,662] {models.py:324} WARNING - (sqlalchemy.exc.InvalidRequestError) Can't reconnect until invalid transaction is rolled back [SQL: u'SELECT dag.dag_id AS dag_dag_id, dag.is_paused AS dag_is_paused, dag.is_subdag AS dag_is_subdag, dag.is_active AS dag_is_active, dag.last_scheduler_run AS dag_last_scheduler_run, dag.last_pickled AS dag_last_pickled, dag.last_expired AS dag_last_expired, dag.scheduler_lock AS dag_scheduler_lock, dag.pickle_id AS dag_pickle_id, dag.fileloc AS dag_fileloc, dag.owners AS dag_owners \nFROM dag \nWHERE dag.dag_id = %s \n LIMIT %s'] [parameters: [immutabledict({})]]
scheduler_1 | [2016-03-08 10:38:03,667] {models.py:197} INFO - Importing /usr/local/lib/python2.7/dist-packages/airflow/example_dags/example_branch_operator.py
scheduler_1 | [2016-03-08 10:38:03,689] {models.py:324} WARNING - (sqlalchemy.exc.InvalidRequestError) Can't reconnect until invalid transaction is rolled back [SQL: u'SELECT dag.dag_id AS dag_dag_id, dag.is_paused AS dag_is_paused, dag.is_subdag AS dag_is_subdag, dag.is_active AS dag_is_active, dag.last_scheduler_run AS dag_last_scheduler_run, dag.last_pickled AS dag_last_pickled, dag.last_expired AS dag_last_expired, dag.scheduler_lock AS dag_scheduler_lock, dag.pickle_id AS dag_pickle_id, dag.fileloc AS dag_fileloc, dag.owners AS dag_owners \nFROM dag \nWHERE dag.dag_id = %s \n LIMIT %s'] [parameters: [immutabledict({})]]
scheduler_1 | [2016-03-08 10:38:03,693] {models.py:197} INFO - Importing /usr/local/lib/python2.7/dist-packages/airflow/example_dags/example_xcom.py
scheduler_1 | [2016-03-08 10:38:03,700] {models.py:324} WARNING - (sqlalchemy.exc.InvalidRequestError) Can't reconnect until invalid transaction is rolled back [SQL: u'SELECT dag.dag_id AS dag_dag_id, dag.is_paused AS dag_is_paused, dag.is_subdag AS dag_is_subdag, dag.is_active AS dag_is_active, dag.last_scheduler_run AS dag_last_scheduler_run, dag.last_pickled AS dag_last_pickled, dag.last_expired AS dag_last_expired, dag.scheduler_lock AS dag_scheduler_lock, dag.pickle_id AS dag_pickle_id, dag.fileloc AS dag_fileloc, dag.owners AS dag_owners \nFROM dag \nWHERE dag.dag_id = %s \n LIMIT %s'] [parameters: [immutabledict({})]]
scheduler_1 | [2016-03-08 10:38:03,703] {models.py:197} INFO - Importing /usr/local/lib/python2.7/dist-packages/airflow/example_dags/example_short_circuit_operator.py
scheduler_1 | [2016-03-08 10:38:03,711] {models.py:324} WARNING - (sqlalchemy.exc.InvalidRequestError) Can't reconnect until invalid transaction is rolled back [SQL: u'SELECT dag.dag_id AS dag_dag_id, dag.is_paused AS dag_is_paused, dag.is_subdag AS dag_is_subdag, dag.is_active AS dag_is_active, dag.last_scheduler_run AS dag_last_scheduler_run, dag.last_pickled AS dag_last_pickled, dag.last_expired AS dag_last_expired, dag.scheduler_lock AS dag_scheduler_lock, dag.pickle_id AS dag_pickle_id, dag.fileloc AS dag_fileloc, dag.owners AS dag_owners \nFROM dag \nWHERE dag.dag_id = %s \n LIMIT %s'] [parameters: [immutabledict({})]]
scheduler_1 | [2016-03-08 10:38:03,713] {models.py:197} INFO - Importing /usr/local/lib/python2.7/dist-packages/airflow/example_dags/example_bash_operator.py
scheduler_1 | [2016-03-08 10:38:03,715] {models.py:324} WARNING - (sqlalchemy.exc.InvalidRequestError) Can't reconnect until invalid transaction is rolled back [SQL: u'SELECT dag.dag_id AS dag_dag_id, dag.is_paused AS dag_is_paused, dag.is_subdag AS dag_is_subdag, dag.is_active AS dag_is_active, dag.last_scheduler_run AS dag_last_scheduler_run, dag.last_pickled AS dag_last_pickled, dag.last_expired AS dag_last_expired, dag.scheduler_lock AS dag_scheduler_lock, dag.pickle_id AS dag_pickle_id, dag.fileloc AS dag_fileloc, dag.owners AS dag_owners \nFROM dag \nWHERE dag.dag_id = %s \n LIMIT %s'] [parameters: [immutabledict({})]]
scheduler_1 | [2016-03-08 10:38:03,720] {models.py:197} INFO - Importing /usr/local/lib/python2.7/dist-packages/airflow/example_dags/tutorial.py
scheduler_1 | [2016-03-08 10:38:03,725] {models.py:324} WARNING - (sqlalchemy.exc.InvalidRequestError) Can't reconnect until invalid transaction is rolled back [SQL: u'SELECT dag.dag_id AS dag_dag_id, dag.is_paused AS dag_is_paused, dag.is_subdag AS dag_is_subdag, dag.is_active AS dag_is_active, dag.last_scheduler_run AS dag_last_scheduler_run, dag.last_pickled AS dag_last_pickled, dag.last_expired AS dag_last_expired, dag.scheduler_lock AS dag_scheduler_lock, dag.pickle_id AS dag_pickle_id, dag.fileloc AS dag_fileloc, dag.owners AS dag_owners \nFROM dag \nWHERE dag.dag_id = %s \n LIMIT %s'] [parameters: [immutabledict({})]]
scheduler_1 | [2016-03-08 10:38:03,726] {jobs.py:611} ERROR - Failed at reloading the dagbag
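The scheduler errors above all share one root cause: the connection to the metadata database dropped mid-transaction, and SQLAlchemy refuses to run further queries on that session until the failed transaction is rolled back, so every subsequent query (`slot_pool`, `dag`) fails the same way until the scheduler gives up reloading the DagBag. A minimal recovery sketch, assuming the scheduler runs as a compose service named `scheduler` (adjust to the service names in your docker-compose file) — restarting it once the database is reachable gives it a fresh session:

```
# Restart the scheduler once the metadata DB is reachable again, so it
# opens a fresh SQLAlchemy session instead of reusing the invalidated one.
docker-compose restart scheduler

# Or, without compose, restart the container directly:
docker restart <scheduler-container-id>
```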
webserver_1 | [2016-03-08 10:38:03 +0000] [41] [ERROR] Exception in worker process:
webserver_1 | Traceback (most recent call last):
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/gunicorn/arbiter.py", line 515, in spawn_worker
webserver_1 |     worker.init_process()
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/gunicorn/workers/base.py", line 122, in init_process
webserver_1 |     self.load_wsgi()
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/gunicorn/workers/base.py", line 130, in load_wsgi
webserver_1 |     self.wsgi = self.app.wsgi()
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/gunicorn/app/base.py", line 67, in wsgi
webserver_1 |     self.callable = self.load()
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/gunicorn/app/wsgiapp.py", line 65, in load
webserver_1 |     return self.load_wsgiapp()
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/gunicorn/app/wsgiapp.py", line 52, in load_wsgiapp
webserver_1 |     return util.import_app(self.app_uri)
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/gunicorn/util.py", line 368, in import_app
webserver_1 |     app = eval(obj, mod.__dict__)
webserver_1 |   File "<string>", line 1, in <module>
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/airflow/www/app.py", line 119, in cached_app
webserver_1 |     app = create_app(config)
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/airflow/www/app.py", line 39, in create_app
webserver_1 |     from airflow.www import views
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/airflow/www/views.py", line 1687, in <module>
webserver_1 |     class ChartModelView(wwwutils.DataProfilingMixin, AirflowModelView):
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/airflow/www/views.py", line 1771, in ChartModelView
webserver_1 |     .group_by(models.Connection.conn_id)
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/query.py", line 2736, in __iter__
webserver_1 |     return self._execute_and_instances(context)
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/query.py", line 2749, in _execute_and_instances
webserver_1 |     close_with_result=True)
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/query.py", line 2740, in _connection_from_session
webserver_1 |     **kw)
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/session.py", line 893, in connection
webserver_1 |     execution_options=execution_options)
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/session.py", line 898, in _connection_for_bind
webserver_1 |     engine, execution_options)
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/session.py", line 334, in _connection_for_bind
webserver_1 |     conn = bind.contextual_connect()
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 2039, in contextual_connect
webserver_1 |     self._wrap_pool_connect(self.pool.connect, None),
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 2078, in _wrap_pool_connect
webserver_1 |     e, dialect, self)
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1405, in _handle_dbapi_exception_noconnection
webserver_1 |     exc_info
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/util/compat.py", line 200, in raise_from_cause
webserver_1 |     reraise(type(exception), exception, tb=exc_tb, cause=cause)
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 2074, in _wrap_pool_connect
webserver_1 |     return fn()
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/pool.py", line 376, in connect
webserver_1 |     return _ConnectionFairy._checkout(self)
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/pool.py", line 713, in _checkout
webserver_1 |     fairy = _ConnectionRecord.checkout(pool)
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/pool.py", line 480, in checkout
webserver_1 |     rec = pool._do_get()
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/pool.py", line 1060, in _do_get
webserver_1 |     self._dec_overflow()
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/util/langhelpers.py", line 60, in __exit__
webserver_1 |     compat.reraise(exc_type, exc_value, exc_tb)
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/pool.py", line 1057, in _do_get
webserver_1 |     return self._create_connection()
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/pool.py", line 323, in _create_connection
webserver_1 |     return _ConnectionRecord(self)
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/pool.py", line 449, in __init__
webserver_1 |     self.connection = self.__connect()
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/pool.py", line 607, in __connect
webserver_1 |     connection = self.__pool._invoke_creator(self)
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/strategies.py", line 97, in connect
webserver_1 |     return dialect.connect(*cargs, **cparams)
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/default.py", line 385, in connect
webserver_1 |     return self.dbapi.connect(*cargs, **cparams)
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/MySQLdb/__init__.py", line 81, in Connect
webserver_1 |     return Connection(*args, **kwargs)
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/MySQLdb/connections.py", line 204, in __init__
webserver_1 |     super(Connection, self).__init__(*args, **kwargs2)
webserver_1 | OperationalError: (_mysql_exceptions.OperationalError) (2003, "Can't connect to MySQL server on 'mysql' (111)")
[... identical traceback and OperationalError repeated ...]
webserver_1 | [2016-03-08 10:38:03 +0000] [44] [ERROR] Exception in worker process:
[... same traceback and OperationalError as worker 41 ...]
webserver_1 | [2016-03-08 10:38:03 +0000] [41] [INFO] Worker exiting (pid: 41)
webserver_1 | [2016-03-08 10:38:03 +0000] [43] [ERROR] Exception in worker process:
[... same traceback and OperationalError as worker 41 ...]
webserver_1 | [2016-03-08 10:38:03 +0000] [44] [INFO] Worker exiting (pid: 44)
webserver_1 | [2016-03-08 10:38:03 +0000] [43] [INFO] Worker exiting (pid: 43)
webserver_1 | [2016-03-08 10:38:03 +0000] [42] [ERROR] Exception in worker process:
[... same traceback and OperationalError as worker 41 ...]
webserver_1 |     self.connection = self.__connect()
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/pool.py", line 607, in __connect
webserver_1 |     connection = self.__pool._invoke_creator(self)
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/strategies.py", line 97, in connect
webserver_1 |     return dialect.connect(*cargs, **cparams)
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/default.py", line 385, in connect
webserver_1 |     return self.dbapi.connect(*cargs, **cparams)
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/MySQLdb/__init__.py", line 81, in Connect
webserver_1 |     return Connection(*args, **kwargs)
webserver_1 |   File "/usr/local/lib/python2.7/dist-packages/MySQLdb/connections.py", line 204, in __init__
webserver_1 |     super(Connection, self).__init__(*args, **kwargs2)
webserver_1 | OperationalError: (_mysql_exceptions.OperationalError) (2003, "Can't connect to MySQL server on 'mysql' (111)")
webserver_1 | [2016-03-08 10:38:03 +0000] [42] [INFO] Worker exiting (pid: 42)
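For reference, error 2003 with errno 111 (connection refused) usually just means the webserver started before MySQL was accepting connections. One hedged workaround is to wait for the database port inside the container before launching Airflow. A minimal sketch using netcat, where the mysql host name and port 3306 are assumptions matching the compose file:

i=0
while ! nc -z mysql 3306 >/dev/null 2>&1; do
  i=$((i + 1))
  if [ $i -ge 10 ]; then
    echo "$(date) - mysql:3306 still unreachable, giving up"
    exit 1
  fi
  echo "$(date) - waiting for mysql:3306..."
  sleep 5
done
exec airflow webserver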

Flower Crashing

I'm borrowing a lot from your docker compose to deploy in an openshift environment and the Flower pod crashes consistently with the following error...

Traceback (most recent call last):
  File "/usr/local/bin/flower", line 11, in <module>
    load_entry_point('flower==0.9.1', 'console_scripts', 'flower')()
  File "/usr/local/lib/python2.7/dist-packages/flower/__main__.py", line 11, in main
    flower.execute_from_commandline()
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/base.py", line 311, in execute_from_commandline
    return self.handle_argv(self.prog_name, argv[1:])
  File "/usr/local/lib/python2.7/dist-packages/flower/command.py", line 59, in handle_argv
    return self.run_from_argv(prog_name, argv)
  File "/usr/local/lib/python2.7/dist-packages/flower/command.py", line 36, in run_from_argv
    self.apply_env_options()
  File "/usr/local/lib/python2.7/dist-packages/flower/command.py", line 71, in apply_env_options
    value = option.type(value)
ValueError: invalid literal for int() with base 10: 'tcp://172.30.195.95:5555'

I think this is due to the fact that the pod is trying to access celery via IP rather than service name. The flower pod works fine and actually connects to http://localhost:5555 when I import the service from the kube-airflow project. I'm having trouble finding where the IP address is passed into the Flower pod, so I can change it from IP to service name. Can you point me in the right direction as to which file I should modify to fix the issue with the Flower pod?
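For reference, Flower applies any environment variable prefixed with FLOWER_ as a command-line option (that's the apply_env_options frame in the traceback above), and Kubernetes/OpenShift injects FLOWER_PORT=tcp://<cluster-ip>:5555 for a service named flower. A hedged workaround, assuming that's the collision here, is to unset the injected variable before starting Flower, or to rename the service so the injected prefix no longer matches:

command: ["/bin/sh", "-c", "unset FLOWER_PORT; airflow flower"]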

Docker-compose.yml is missing a scheduler to run triggered jobs

I tried adding this to the docker-compose.yml

     image: puckel/docker-airflow
     restart: always
     links:
         - mysqldb:mysqldb
         - rabbitmq:rabbitmq
     command: scheduler

but the db doesn't get set up in time and the scheduler has a hard time.

scheduler_1 | sqlalchemy.exc.ProgrammingError: (_mysql_exceptions.ProgrammingError) (1146, "Table 'airflow.job' doesn't exist") [SQL: u'INSERT INTO job (dag_id, state, job_type, start_date, end_date, latest_heartbeat, executor_class, hostname, unixname) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)'] [parameters: (None, 'running', 'SchedulerJob', datetime.datetime(2015, 9, 10, 4, 50, 21, 405881), None, datetime.datetime(2015, 9, 10, 4, 50, 21, 405910), 'CeleryExecutor', '25a5024ff2aa', 'root')]
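A hedged sketch of a scheduler service that tolerates this race: restart: always makes Docker relaunch the scheduler until initdb has created the tables (service names mirror the snippet above):

     scheduler:
         image: puckel/docker-airflow
         restart: always
         links:
             - mysqldb:mysqldb
             - rabbitmq:rabbitmq
         command: scheduler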

flower service fails on vanilla 'docker-compose up'

flower_1    | [2016-03-24 00:43:30,633] {__init__.py:36} INFO - Using executor CeleryExecutor
flower_1    | Traceback (most recent call last):
flower_1    |   File "/usr/local/bin/flower", line 5, in <module>
flower_1    |     from pkg_resources import load_entry_point
flower_1    |   File "/usr/lib/python2.7/dist-packages/pkg_resources.py", line 2876, in <module>
flower_1    |     working_set = WorkingSet._build_master()
flower_1    |   File "/usr/lib/python2.7/dist-packages/pkg_resources.py", line 451, in _build_master
flower_1    |     return cls._build_from_requirements(__requires__)
flower_1    |   File "/usr/lib/python2.7/dist-packages/pkg_resources.py", line 464, in _build_from_requirements
flower_1    |     dists = ws.resolve(reqs, Environment())
flower_1    |   File "/usr/lib/python2.7/dist-packages/pkg_resources.py", line 639, in resolve
flower_1    |     raise DistributionNotFound(req)
flower_1    | pkg_resources.DistributionNotFound: pytz==2015.7
dockerairflow_flower_1 exited with code 0
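pkg_resources is refusing to launch flower because the exact pytz==2015.7 pin it was installed against is no longer satisfied. A hedged fix, assuming another dependency upgraded pytz after install, is to restore the pin in a derived image and recreate the container:

FROM puckel/docker-airflow
RUN pip install pytz==2015.7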

Error when building an SSH connection to a remote Host

Hi, I got the following error when trying to load a new DAG (logs from the scheduler):

[2016-11-16 10:44:39,687] {jobs.py:680} INFO - Starting the scheduler
[2016-11-16 10:44:39,687] {models.py:154} INFO - Filling up the DagBag from /usr/local/airflow/dags
[2016-11-16 10:44:39,794] {base_hook.py:53} INFO - Using connection to: 10.1.100.71
[2016-11-16 10:44:39,794] {models.py:250} ERROR - Failed to import: /usr/local/airflow/dags/osf_dag_v3.py
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/airflow/models.py", line 247, in process_file
    m = imp.load_source(mod_name, filepath)
  File "/usr/local/airflow/dags/osf_dag_v3.py", line 29, in <module>
    ssh_hook = SSHHook(conn_id='ssh_conn')
  File "/usr/local/lib/python2.7/dist-packages/airflow/contrib/hooks/ssh_hook.py", line 55, in __init__
    self.key_file = conn.extra_dejson.get('key_file', None)
  File "/usr/local/lib/python2.7/dist-packages/airflow/models.py", line 590, in extra_dejson
    if self.extra:
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/attributes.py", line 293, in __get__
    return self.descriptor.__get__(instance, owner)
  File "/usr/local/lib/python2.7/dist-packages/airflow/models.py", line 534, in get_extra
    return FERNET.decrypt(bytes(self._extra, 'utf-8')).decode()
  File "/usr/local/lib/python2.7/dist-packages/cryptography/fernet.py", line 103, in decrypt
    raise InvalidToken
InvalidToken

PythonOperator does not work

I'm running into an issue where the PythonOperator just straight up fails when running airflow backfills. Even an extremely simple use case (https://gist.github.com/fbdiehlad/9e356d9a303339eb3c3f1be3475efce2) fails immediately when I try to run a backfill against it. Oddly enough, when I run the task using airflow test, it works fine. I'm thinking this is some kind of permissions issue from running the setup with docker? Has anyone else run into this before?

Rabbitmq is always on restarting

Hi everybody,
I've run into a problem where the rabbitmq service is always restarting, with status: Restarting(1).
The problem happened when there was not enough disk space and the postgres service was shut down.
After I restarted the whole Airflow docker stack, I got this problem.
Docker version 1.12.3 on a Ubuntu Linux version 14.04.5 LTS

"cannot import name viewkeys" on HEAD

Just to warn you that updating your script to fetch HEAD (given a COMMIT ID in the build args) gives the following error:

Traceback (most recent call last):
  File "/usr/local/bin/airflow", line 4, in <module>
    __import__('pkg_resources').run_script('airflow==1.9.0.dev0+apache.incubating', 'airflow')
  File "/usr/local/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 738, in run_script
    self.require(requires)[0].run_script(script_name, ns)
  File "/usr/local/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 1499, in run_script
    exec(code, namespace, namespace)
  File "/usr/local/lib/python2.7/dist-packages/airflow-1.9.0.dev0+apache.incubating-py2.7.egg/EGG-INFO/scripts/airflow", line 28, in <module>
    args.func(args)
  File "/usr/local/lib/python2.7/dist-packages/airflow-1.9.0.dev0+apache.incubating-py2.7.egg/airflow/bin/cli.py", line 734, in webserver
    app = cached_app(conf)
  File "/usr/local/lib/python2.7/dist-packages/airflow-1.9.0.dev0+apache.incubating-py2.7.egg/airflow/www/app.py", line 161, in cached_app
    app = create_app(config)
  File "/usr/local/lib/python2.7/dist-packages/airflow-1.9.0.dev0+apache.incubating-py2.7.egg/airflow/www/app.py", line 60, in create_app
    from airflow.www import views
  File "/usr/local/lib/python2.7/dist-packages/airflow-1.9.0.dev0+apache.incubating-py2.7.egg/airflow/www/views.py", line 27, in <module>
    import bleach
  File "/usr/local/lib/python2.7/dist-packages/bleach-2.0.0-py2.7.egg/bleach/__init__.py", line 5, in <module>
    from bleach.linkifier import (
  File "/usr/local/lib/python2.7/dist-packages/bleach-2.0.0-py2.7.egg/bleach/linkifier.py", line 4, in <module>
    import html5lib
  File "build/bdist.linux-x86_64/egg/html5lib/__init__.py", line 16, in <module>
  File "build/bdist.linux-x86_64/egg/html5lib/html5parser.py", line 2, in <module>
ImportError: cannot import name viewkeys
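viewkeys is what html5lib imports from six, so a stale six/html5lib pair pulled in around bleach 2.0 is the likely culprit. A hedged workaround (the base tag, and the idea that upgrading the pair resolves it, are assumptions) is a derived image:

FROM puckel/docker-airflow
# assumption: an outdated six/html5lib pair is behind the missing viewkeys symbol
RUN pip install --upgrade six html5lib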

docker-compose issue

I have tried to run docker-compose -f docker-compose-CeleryExecutor.yml up -d

But received these errors:
ERROR: for webserver No such image: sha256:3631d65918560df72f335194d391370a09ea944ab9f7a9f225611829bd883bdf

ERROR: for flower No such image: sha256:3631d65918560df72f335194d391370a09ea944ab9f7a9f225611829bd883bdf
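That error usually means compose is reusing containers that were created from an image which has since been deleted. A hedged fix is to recreate everything from a freshly pulled image:

docker-compose -f docker-compose-CeleryExecutor.yml down
docker-compose -f docker-compose-CeleryExecutor.yml pull
docker-compose -f docker-compose-CeleryExecutor.yml up -d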

airflow scheduler stuck

Since airflow version 1.7.1.3, the scheduler gets stuck, probably due to a deadlock.
Setting the number of runs to 5 with auto restart didn't solve the problem.

possible solutions:

  • Using an older version of airflow, e.g. 1.6.2
  • Running docker-compose restart scheduler from time to time.

reference:
https://issues.apache.org/jira/browse/AIRFLOW-401
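A hedged sketch of the second workaround: bound the scheduler's lifetime with -n so Docker restarts it on exit (the value 5 is only an example):

     scheduler:
         image: puckel/docker-airflow
         restart: always
         command: scheduler -n 5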

How to run jobs on docker-airflow?

Hi there!
I've installed your docker-airflow images and they run examples fine (except http requests, but it's for another ticket).
Now I would like to run my own DAG, could you please explain how to do it with your images?
a) can/should I use running containers to launch DAGs?
b) can/should I rather install airflow distribution, configure & use that as a client?

I think this would be valuable information for others; could you please add the instructions to your README?

Thanks!
Krzysiek
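Regarding (a): with these images, the usual approach is to mount your DAG folder into the running containers rather than installing a separate Airflow client. A hedged example (the local ./dags path is an assumption):

docker run -d -p 8080:8080 -v $(pwd)/dags:/usr/local/airflow/dags puckel/docker-airflow webserver

With the docker-compose files, adding the same path under volumes: on the webserver, scheduler and worker services has the same effect.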

docker-airflow for version 1.8.0

Hi @puckel ,
are you planning a new docker container for the new Airflow 1.8.0 that will be out in a couple of days?
I have been using your airflow docker container for months now. Great job and thanks!

SSH from docker container to other VMs

Hello,
I have an airflow task that runs a BashOperator with the below command.
ssh root@xxxxx python2.7 xxx.py

The task failed, so I went into the worker container and found that ssh is missing from the container.
I can't find online how to install ssh in a docker container. If there is anything you can help with, it's very much appreciated.
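The image is Debian-based, so a hedged way to get the ssh client into the worker is a derived image (the package name assumes Debian, and the USER switch assumes the image runs as the airflow user):

FROM puckel/docker-airflow
USER root
RUN apt-get update \
 && apt-get install -y --no-install-recommends openssh-client \
 && rm -rf /var/lib/apt/lists/*
USER airflow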

support new DAGs

Submitting new workflows at runtime is not yet supported by Airflow.
Please provide a workaround to support adding new DAGs.

Different FERNET_KEY across docker containers

Hi @puckel ,

I just noticed that the fernet keys are different across containers, which makes the encryption not reversible. This is probably because each container will generate a new key when launching the image and running entrypoint.sh.

I was thinking of a solution that moves the key generation from entrypoint.sh to the Dockerfile, but this could leave security flaws, as the key would possibly be built into the image and pushed to a remote registry afterwards.

Please let me know what you have in mind about this issue, or correct me if I misunderstood anything.

Thanks,
Tianlong
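A hedged sketch of the middle ground: keep generating the key outside the image, then feed the same value to every Airflow service through the compose file (the ${FERNET_KEY} substitution assumes a compose version that supports it):

     webserver:
         environment:
             - FERNET_KEY=${FERNET_KEY}
     scheduler:
         environment:
             - FERNET_KEY=${FERNET_KEY}
     worker:
         environment:
             - FERNET_KEY=${FERNET_KEY}

Export FERNET_KEY with your generated key before docker-compose up, and all containers encrypt and decrypt with the same key without it ever being baked into the image.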

ldap failing

Hello, I had no problem setting up ldap authentication before,
and after a couple of modifications to the code I am no longer able to bind the ldap connection, with the webserver showing AuthenticationError: Cannot bind to ldap server.

I am able to telnet from the container, so I know this isn't a network problem.
If I can get any help or a way to debug this, it would be really helpful, since I've been stuck on this for a couple of days now.

Thanks in advance for any help.

Can't connect to MySQL via local socket

Hi, great progress on packaging airflow as containers, thanks!

After pulling and building the latest version of the docker images I'm still not able to access the data-profiling features. For ad-hoc queries and charts I'm getting this error:

(2002, "Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)")

screen shot 2015-11-13 at 12 45 25 pm

Any advice?

Why is the container of the scheduler always restarting instead of using the official scheduler script?

I am currently looking into porting docker-airflow onto Kubernetes, and I looked through the work of kube-airflow. One thing that puzzled me is that the exit of the scheduler makes Kubernetes think it crashed, so it gets into a CrashLoopBackOff.
An alternative would be to run airflow_scheduler_autorestart.sh instead. Is there a reason why this script is not used in the docker container for the scheduler?

Can't access webserver

I can't access the webserver.
I have noticed, though, that the webserver port doesn't propagate correctly.
screenshot from 2017-01-17 15 32 39

I have attached a screenshot from docker ps -a

Last Airflow commits

Hi @puckel ,
I wanted to ask how I can get the latest Airflow commits into my container.
Is there a way to update frequently from Airflow's master branch?

Error connected to database

Using the docker-compose up command, most of it works except I get the following error:

mariadb_1   | 150722 14:09:17 [Warning] Aborted connection 4 to db: 'airflow' user: 'airflow' host: '172.17.0.20' (Unknown error)

Going to mess around with it, but wanted to post the issue.

failed to build puckel/docker-airflow

Hi,

when trying to build puckel/docker-airflow I got this error: Cython.Compiler.Errors.InternalError: Internal compiler error: 'algos_common_helper.pxi' not found

More logs:

Downloading/unpacking pandas<1.0.0,>=0.15.2 (from airflow[celery,hdfs,hive,postgres]==1.7.1.3)
Running setup.py (path:/tmp/pip-build-fReTFb/pandas/setup.py) egg_info for package pandas

package init file 'pandas/io/tests/sas/__init__.py' not found (or not a regular file)
pandas/index.pyx: cannot find cimported module 'datetime'
pandas/index.pyx: cannot find cimported module 'util'
./pandas/hashtable.pxd: cannot find cimported module 'khash'
Traceback (most recent call last):
  File "<string>", line 17, in <module>
  File "/tmp/pip-build-fReTFb/pandas/setup.py", line 680, in <module>
    **setuptools_kwargs)
  File "/usr/lib/python2.7/distutils/core.py", line 151, in setup
    dist.run_commands()
  File "/usr/lib/python2.7/distutils/dist.py", line 953, in run_commands
    self.run_command(cmd)
  File "/usr/lib/python2.7/distutils/dist.py", line 972, in run_command
    cmd_obj.run()
  File "<string>", line 15, in replacement_run
  File "/usr/local/lib/python2.7/dist-packages/setuptools/command/egg_info.py", line 306, in find_sources
    mm.run()
  File "/usr/local/lib/python2.7/dist-packages/setuptools/command/egg_info.py", line 533, in run
    self.add_defaults()
  File "/usr/local/lib/python2.7/dist-packages/setuptools/command/egg_info.py", line 562, in add_defaults
    sdist.add_defaults(self)
  File "/usr/local/lib/python2.7/dist-packages/setuptools/command/py36compat.py", line 36, in add_defaults
    self._add_defaults_ext()
  File "/usr/local/lib/python2.7/dist-packages/setuptools/command/py36compat.py", line 119, in _add_defaults_ext
    build_ext = self.get_finalized_command('build_ext')
  File "/usr/lib/python2.7/distutils/cmd.py", line 312, in get_finalized_command
    cmd_obj.ensure_finalized()
  File "/usr/lib/python2.7/distutils/cmd.py", line 109, in ensure_finalized
    self.finalize_options()
  File "/usr/local/lib/python2.7/dist-packages/Cython/Distutils/build_ext.py", line 19, in finalize_options
    self.distribution.ext_modules)
  File "/usr/local/lib/python2.7/dist-packages/Cython/Build/Dependencies.py", line 809, in cythonize
    aliases=aliases)
  File "/usr/local/lib/python2.7/dist-packages/Cython/Build/Dependencies.py", line 714, in create_extension_list
    kwds = deps.distutils_info(file, aliases, base).values
  File "/usr/local/lib/python2.7/dist-packages/Cython/Build/Dependencies.py", line 590, in distutils_info
    return (self.transitive_merge(filename, self.distutils_info0, DistutilsInfo.merge)
  File "/usr/local/lib/python2.7/dist-packages/Cython/Build/Dependencies.py", line 600, in transitive_merge
    node, extract, merge, seen, {}, self.cimported_files)[0]
  File "/usr/local/lib/python2.7/dist-packages/Cython/Build/Dependencies.py", line 605, in transitive_merge_helper
    deps = extract(node)
  File "/usr/local/lib/python2.7/dist-packages/Cython/Build/Dependencies.py", line 581, in distutils_info0
    externs = self.cimports_and_externs(filename)[1]
  File "/usr/local/lib/python2.7/dist-packages/Cython/Utils.py", line 44, in wrapper
    res = cache[args] = f(self, *args)
  File "/usr/local/lib/python2.7/dist-packages/Cython/Build/Dependencies.py", line 491, in cimports_and_externs
    for include in self.included_files(filename):
  File "/usr/local/lib/python2.7/dist-packages/Cython/Utils.py", line 44, in wrapper
    res = cache[args] = f(self, *args)
  File "/usr/local/lib/python2.7/dist-packages/Cython/Build/Dependencies.py", line 474, in included_files
    include_path = self.context.find_include_file(include, None)
  File "/usr/local/lib/python2.7/dist-packages/Cython/Compiler/Main.py", line 274, in find_include_file
    error(pos, "'%s' not found" % filename)
  File "/usr/local/lib/python2.7/dist-packages/Cython/Compiler/Errors.py", line 177, in error
    raise InternalError(message)
Cython.Compiler.Errors.InternalError: Internal compiler error: 'algos_common_helper.pxi' not found
Complete output from command python setup.py egg_info:
running egg_info

creating pip-egg-info/pandas.egg-info

writing requirements to pip-egg-info/pandas.egg-info/requires.txt

writing pip-egg-info/pandas.egg-info/PKG-INFO

writing top-level names to pip-egg-info/pandas.egg-info/top_level.txt

writing dependency_links to pip-egg-info/pandas.egg-info/dependency_links.txt

writing manifest file 'pip-egg-info/pandas.egg-info/SOURCES.txt'

warning: manifest_maker: standard file '-c' not found


Cleaning up...
Command python setup.py egg_info failed with error code 1 in /tmp/pip-build-fReTFb/pandas
Storing debug log for failure in /root/.pip/pip.log
The command '/bin/sh -c set -ex && buildDeps=' python-dev python-pip libkrb5-dev libsasl2-dev libssl-dev libffi-dev build-essential libblas-dev liblapack-dev ' && echo "deb http://http.debian.net/debian jessie-backports main" >/etc/apt/sources.list.d/backports.list && apt-get update -yqq && apt-get install -yqq --no-install-recommends $buildDeps apt-utils curl netcat locales && apt-get install -yqq -t jessie-backports python-requests libpq-dev && apt-get install -y sshpass && apt-get install -y openssh-server && sed -i 's/^# en_US.UTF-8 UTF-8$/en_US.UTF-8 UTF-8/g' /etc/locale.gen && locale-gen && update-locale LANG=en_US.UTF-8 LC_ALL=en_US.UTF-8 && useradd -ms /bin/bash -d ${AIRFLOW_HOME} airflow && pip install --upgrade setuptools && pip install Cython && pip install pytz==2015.7 && pip install python-dateutil && pip install numpy && pip install cryptography && pip install pyOpenSSL && pip install ndg-httpsclient && pip install pyasn1 && pip install psycopg2 && pip install airflow[celery,postgres,hive,hdfs]==$AIRFLOW_VERSION && apt-get remove --purge -yqq $buildDeps libpq-dev && apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/* /usr/share/man /usr/share/doc /usr/share/doc-base' returned a non-zero code: 1


Any ideas?
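The 'algos_common_helper.pxi' failure is pandas being compiled from source against an incompatible Cython. A hedged workaround, assuming a pip new enough to pick up prebuilt manylinux wheels, is to upgrade the build tooling earlier in the Dockerfile so pandas arrives as a wheel instead of compiling:

pip install --upgrade pip setuptools wheel
pip install "pandas<1.0.0,>=0.15.2"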

Hdfs hook dependency on snakebite throwing error

[2017-03-28 14:53:07,932] {models.py:266} ERROR - Failed to import: /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_http_operator.py
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/airflow/models.py", line 263, in process_file
    m = imp.load_source(mod_name, filepath)
  File "/usr/local/lib/python3.5/imp.py", line 172, in load_source
    module = _load(spec)
  File "<frozen importlib._bootstrap>", line 693, in _load
  File "<frozen importlib._bootstrap>", line 673, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 673, in exec_module
  File "<frozen importlib._bootstrap>", line 222, in _call_with_frames_removed
  File "/usr/local/lib/python3.5/site-packages/airflow/example_dags/example_http_operator.py", line 20, in <module>
    from airflow.operators.sensors import HttpSensor
  File "/usr/local/lib/python3.5/site-packages/airflow/operators/sensors.py", line 33, in <module>
    from airflow.hooks.hdfs_hook import HDFSHook
  File "/usr/local/lib/python3.5/site-packages/airflow/hooks/hdfs_hook.py", line 20, in <module>
    from snakebite.client import Client, HAClient, Namenode, AutoConfigClient
  File "/usr/local/lib/python3.5/site-packages/snakebite/client.py", line 1473
    baseTime = min(time * (1L << retries), cap);
                            ^
SyntaxError: invalid syntax
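snakebite is Python-2-only (the 1L long literal is invalid Python 3 syntax), so any Python 3 image that pulls in the hdfs extra hits this as soon as a DAG import reaches the HDFS hook. A hedged workaround is to build without the hdfs extra, reusing the AIRFLOW_DEPS build arg from the README (the remaining extras list is only an example):

docker build --rm --build-arg AIRFLOW_DEPS="celery,postgres,hive" -t puckel/docker-airflow .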

SSH Hook Problem - Failed to create remote temp file

Hi Puckel,

I've got a problem with the Airflow SSH Hook. When I start my DAG, I get an error that Airflow "Failed to create remote temp file". I want to open an ssh connection via Airflow. I set everything up in Airflow Connections and in my DAG. Maybe you can help me?!
When I run this DAG (sshjonas.py) outside of a docker container on a local Airflow installation, everything works...

BR,
Jonas

dag.txt

issue running LocalExecutor docker-compose file

Hey there,

Loving the docker setup you put together here, it's been very helpful in getting me closer to a production airflow configuration, and learn more about the components!

I am having some difficulty running the docker-compose-LocalExecutor.yml setup though, using docker v1.12.1 and running on osx. I attached the docker-compose logs output here. I tried this with the CeleryExecutor as well, and had some very similar results. I tried removing the local volumes just to ensure I was in a clean environment, but things just don't seem to work :/

If you have any suggestions on how I might get things running (including the scheduler and web UI components for flower/airflow running) please let me know!

airflow-logs.txt

Different python version..

Python 2.7.9 has a bug that affects the boto library.

What would be the way to install different version of Python on the containers?
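There is no build switch for the interpreter itself; the hedged route is to change the base image in the Dockerfile and rebuild. A sketch (the python:3.6-slim tag is only an example, and every later Dockerfile step that assumed the Debian system python would need re-checking):

FROM python:3.6-slim
# ...followed by the rest of this repo's Dockerfile steps, adapted to the new base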

Flower is always restarting

hi,
I am using the latest version of this docker-airflow, and I noticed that flower is always restarting and I don't have access to the webserver.

HTTP/1.1 200 OK
[2016-11-15 16:45:46,526] {__init__.py:36} INFO - Using executor CeleryExecutor
[2016-11-15 16:45:46,584] {driver.py:120} INFO - Generating grammar tables from /usr/lib/python2.7/lib2to3/Grammar.txt
[2016-11-15 16:45:46,607] {driver.py:120} INFO - Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
Traceback (most recent call last):
  File "/usr/local/bin/airflow", line 15, in <module>
    args.func(args)
  File "/usr/local/lib/python2.7/dist-packages/airflow/bin/cli.py", line 576, in flower
    os.execvp("flower", ['flower', '-b', broka, port, api])
  File "/usr/lib/python2.7/os.py", line 346, in execvp
    _execvpe(file, args)
  File "/usr/lib/python2.7/os.py", line 382, in _execvpe
    func(fullname, *argrest)
OSError: [Errno 2] No such file or directory
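os.execvp failing with [Errno 2] means the flower executable simply isn't on the PATH in that container. A hedged fix, assuming the celery extra didn't pull flower in, is to install it in a derived image:

FROM puckel/docker-airflow
RUN pip install flower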

Unable to connect using jdbc connection

What I'm trying to accomplish

Connect to a Teradata database using the jdbc connection type.

Error message I'm receiving

java.lang.RuntimeException: Class com.teradata.jdbc.TeraDriver not found

Setup

volumes:
            - /usr/local/airflow_project/airflow/:/usr/local/airflow/
mkdir /usr/local/airflow/drivers/teradata/
copy terajdbc4.jar
copy tdgssconfig.jar

driver path
/usr/local/airflow_project/airflow/drivers/teradata/terajdbc4.jar:/usr/local/airflow_project/airflow/drivers/teradata/tdgssconfig.jar

screen shot 2017-02-28 at 9 32 39 am

Steps I've taken that moved the issue forward

  1. added && apt-get install -yqq default-jre \ to the dockerfile. I was getting a java error previously

Steps I've tried since then

  1. adding those paths to a classpath env variable
  2. installing JayDeBeApi==0.2.0
  3. using stable branch
  4. using 1.8.0rc4 branch
  5. using python3
  6. running the code from [airflow's source](https://github.com/apache/incubator-airflow/blob/master/airflow/hooks/jdbc_hook.py) on my local machine. (It actually won't let me attach a zip because "something goes really wrong", maybe related to the jar in it? Anyway, python code below.)

screen shot 2017-02-28 at 9 32 16 am

Python code

"""
    General hook for jdbc db access.
    If a connection id is specified, host, port, schema, username and password will be taken from the predefined connection.
    Raises an airflow error if the given connection id doesn't exist.
    Otherwise host, port, schema, username and password can be specified on the fly.
    :param jdbc_url: jdbc connection url
    :type jdbc_url: string
    :param jdbc_driver_name: jdbc driver name
    :type jdbc_driver_name: string
    :param jdbc_driver_loc: path to jdbc driver
    :type jdbc_driver_loc: string
    :param conn_id: reference to a predefined database
    :type conn_id: string
    :param sql: the sql code to be executed
    :type sql: string or string pointing to a template file. File must have
        a '.sql' extensions.
"""

import jaydebeapi

conn_name_attr = 'jdbc_conn_id'
default_conn_name = 'jdbc_default'
supports_autocommit = True

# Adapted from Airflow's JdbcHook.get_conn; the commented lines show where the
# connection parameters would normally come from the predefined connection.
def get_conn(self):
    # conn = self.get_connection(getattr(self, self.conn_name_attr))
    # host = conn.host
    # login = conn.login
    # psw = conn.password
    # jdbc_driver_loc = conn.extra_dejson.get('extra__jdbc__drv_path')
    # jdbc_driver_name = conn.extra_dejson.get('extra__jdbc__drv_clsname')

    conn = jaydebeapi.connect(jdbc_driver_name,
                              [str(host), str(login), str(psw)],
                              jdbc_driver_loc)
    return conn

conn = jaydebeapi.connect('com.teradata.jdbc.TeraDriver',
                          ['server','user','password'],
                          ['/Users/user/drivers/tdgssconfig.jar',
                           '/Users/user/drivers/terajdbc4.jar'])

Question on updating DAGs in compose

This is really cool, thanks for putting it together. What in your view would be the best way to update DAGs in the context of this deployment? Would you suggest just forking this repo and updating the whole docker-compose deployment every time one needs to update a DAG, or possibly placing the DAGs in a container volume mounted in the Airflow containers?
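The volume route is usually the lighter-weight one: the containers keep running and the scheduler picks up changed DAG files on its next DagBag refresh, with no image rebuild. A hedged fragment to add to each Airflow service in the compose file (the host-side ./dags path is an assumption):

     volumes:
         - ./dags:/usr/local/airflow/dags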

passwords aren't encrypted when using celery compose

Not sure what's going on here, but after doing the following steps I can't seem to get the passwords in connections encrypted.

git clone https://github.com/puckel/docker-airflow.git
docker-compose -f docker-compose-CeleryExecutor.yml up -d
# check http://localhost:8080/admin/connection/
# nothing is encrypted

My understanding is that these should all be encrypted as

  1. there is a valid fernet_key in place.
  2. We've installed the crypto package

I've tried changing the fernet key but that doesn't seem to work either.

What's really strange is that if I don't use the celery compose and instead just run it, the connections are encrypted. Any ideas what's going on?

Relevant Resource?
https://airflow.incubator.apache.org/faq.html#why-are-connection-passwords-still-not-encrypted-in-the-metadata-db-after-i-installed-airflow-crypto

screen shot 2017-02-26 at 5 09 51 pm

Encoding issue when opening a log from example_http_operator

I'm having a small issue with opening logs for the post_op task in the example_http_operator DAG. This seems to work fine on my local machine running with SequentialExecutor, but after a fresh setup with the docker container it seems to crash.

I tried a bunch of things, including setting the locales in the Dockerfile, but that doesn't seem to help. I'm a bit lost as to whether this is my own incapability (new to Docker) or whether this is reproducible by someone else.

The following stacktrace is displayed when opening a log from the failed post_op task:

Traceback (most recent call last):
  File "/usr/lib/python2.7/dist-packages/flask/app.py", line 1817, in wsgi_app
    response = self.full_dispatch_request()
  File "/usr/lib/python2.7/dist-packages/flask/app.py", line 1477, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/usr/lib/python2.7/dist-packages/flask/app.py", line 1381, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/usr/lib/python2.7/dist-packages/flask/app.py", line 1475, in full_dispatch_request
    rv = self.dispatch_request()
  File "/usr/lib/python2.7/dist-packages/flask/app.py", line 1461, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/usr/lib/python2.7/dist-packages/flask_admin/base.py", line 68, in inner
    return self._run_view(f, *args, **kwargs)
  File "/usr/lib/python2.7/dist-packages/flask_admin/base.py", line 359, in _run_view
    return fn(self, *args, **kwargs)
  File "/usr/lib/python2.7/dist-packages/flask_login.py", line 755, in decorated_view
    return func(*args, **kwargs)
  File "/usr/lib/python2.7/dist-packages/airflow/www/utils.py", line 103, in wrapper
    return f(*args, **kwargs)
  File "/usr/lib/python2.7/dist-packages/airflow/www/views.py", line 786, in log
    log = log.decode('utf-8') if PY2 else log
  File "/usr/lib/python2.7/encodings/utf_8.py", line 16, in decode
    return codecs.utf_8_decode(input, errors, True)
UnicodeEncodeError: 'ascii' codec can't encode character u'\u2019' in position 5872: ordinal not in range(128)
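The u'\u2019' is a non-ASCII quote in the task log being decoded under an ASCII default encoding. One hedged mitigation is to force a UTF-8 locale in the Airflow containers; the Dockerfile already runs locale-gen for en_US.UTF-8, so exporting it per service may be enough:

     environment:
         - LANG=en_US.UTF-8
         - LC_ALL=en_US.UTF-8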

password authentication failed for user 'airflow'

As I already have redis and postgres containers on my server, I rewrote the docker-compose file as below:

version: '2'
services:
    webserver:
        image: puckel/docker-airflow:latest
        restart: always
        network_mode: bridge
        external_links: 
            - bi-postgres:postgres
            - redis
        environment:
            - LOAD_EX=n
            - FERNET_KEY=46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho=
            - EXECUTOR=Celery
            - POSTGRES_USER=airflow
            - POSTGRES_PASSWORD=airflow
            - POSTGRES_DB=airflow
        volumes:
            - /root/notebooks/dags:/usr/local/airflow/dags
        ports:
            - "5002:8080"
        command: webserver

    flower:
        image: puckel/docker-airflow:latest
        restart: always
        network_mode: bridge
        external_links:
            - redis
        environment:
            - EXECUTOR=Celery
        ports:
            - "5555:5555"
        command: flower

    scheduler:
        image: puckel/docker-airflow:latest
        restart: always
        depends_on:
            - webserver
        volumes:
            - /root/notebooks/dags:/usr/local/airflow/dags
        environment:
            - LOAD_EX=n
            - FERNET_KEY=46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho=
            - EXECUTOR=Celery
        command: scheduler -n 5

    worker:
        image: puckel/docker-airflow:latest
        restart: always
        depends_on:
            - scheduler
        volumes:
            - /root/notebooks/dags:/usr/local/airflow/dags
        environment:
            - FERNET_KEY=46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho=
            - EXECUTOR=Celery
        command: worker

And I got an issue running initdb (DB: postgresql+psycopg2://airflow:***@postgres/airflow), but redis connects fine. Please help.
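One thing that stands out in the file above: only the webserver declares the external_links and POSTGRES_* variables, so the scheduler and worker fall back to defaults when they build their database connection. A hedged fix, assuming the image's entrypoint derives sql_alchemy_conn from the POSTGRES_* variables, is to repeat both blocks on the scheduler and worker services:

     scheduler:
         network_mode: bridge
         external_links:
             - bi-postgres:postgres
             - redis
         environment:
             - POSTGRES_USER=airflow
             - POSTGRES_PASSWORD=airflow
             - POSTGRES_DB=airflow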

How to change airflow static folder root path

Hello,

Our current airflow static folder resides under the root path. We are looking for a way to change the static root path [/static]. Is there a specific configuration file using [/static]? If so, how do we change it? It would be highly appreciated if anybody could help sort out this issue.

vanilla docker-compose up fails

Running on VirtualBox (4GB, 4 Cores) on MacOS:

  • MySQL container is restarted repeatedly
Version: '5.7.12'  socket: '/var/run/mysqld/mysqld.sock'  port: 3306  MySQL Community Server (GPL)
2016-06-01T16:50:31.139441Z 4 [Note] Aborted connection 4 to db: 'airflow' user: 'airflow' host: '172.17.0.4' (Got an error reading communication packets)
2016-06-01T16:50:31.567596Z 3 [Note] Aborted connection 3 to db: 'airflow' user: 'airflow' host: '172.17.0.4' (Got an error reading communication packets)
2016-06-01T16:50:32.419459Z 6 [Note] Aborted connection 6 to db: 'airflow' user: 'airflow' host: '172.17.0.4' (Got an error reading communication packets)
2016-06-01T16:50:32.421728Z 7 [Note] Aborted connection 7 to db: 'airflow' user: 'airflow' host: '172.17.0.4' (Got an error reading communication packets)
2016-06-01T16:50:33.186267Z 10 [Note] Aborted connection 10 to db: 'airflow' user: 'airflow' host: '172.17.0.4' (Got an error reading communication packets)
2016-06-01T16:50:33.224555Z 9 [Note] Aborted connection 9 to db: 'airflow' user: 'airflow' host: '172.17.0.4' (Got an error reading communication packets)
2016-06-01T16:50:33.974580Z 13 [Note] Aborted connection 13 to db: 'airflow' user: 'airflow' host: '172.17.0.4' (Got an error reading communication packets)
2016-06-01T16:50:34.039175Z 12 [Note] Aborted connection 12 to db: 'airflow' user: 'airflow' host: '172.17.0.4' (Got an error reading communication packets)
2016-06-01T16:50:36.152814Z 16 [Note] Aborted connection 16 to db: 'airflow' user: 'airflow' host: '172.17.0.4' (Got an error reading communication packets)
  • There seems to be an error in entrypoint.sh:
scheduler_1  | ./entrypoint.sh: line 15: [: too many arguments
scheduler_1  | ./entrypoint.sh: line 15: [: too many arguments
scheduler_1  | ./entrypoint.sh: line 15: [: too many arguments
scheduler_1  | ./entrypoint.sh: line 15: [: too many arguments
scheduler_1  | ./entrypoint.sh: line 29: [: too many arguments
scheduler_1  | ./entrypoint.sh: line 29: [: too many arguments
scheduler_1  | ./entrypoint.sh: line 29: [: too many arguments

The scheduler seems to be restarted every few seconds, the rest of the services less often
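'[: too many arguments' is the classic symptom of an unquoted variable expanding to several words inside a test in entrypoint.sh. A hedged illustration of the underlying pattern and its fix (the EXECUTOR variable is an assumption, not the actual contents of lines 15 and 29):

# fragile: a multi-word expansion of $EXECUTOR makes [ see too many arguments
if [ $EXECUTOR = Celery ]; then
  echo "celery mode"
fi

# robust: quoting keeps the expansion a single word
if [ "$EXECUTOR" = "Celery" ]; then
  echo "celery mode"
fi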

LOAD_EX in multiple services

Hi @puckel,

I just noticed that it is suggested to uncomment LOAD_EX=n in two services, in both docker-compose-CeleryExecutor.yml and docker-compose-LocalExecutor.yml. Is that how it is supposed to be? I'd intuitively expect this ENV variable to be needed only once, because I don't think more than one service can be responsible for adding extra DAGs. How about removing the extra occurrence of LOAD_EX=n if it's not needed?

Thanks a lot for crafting the docker recipe for airflow - sooo cool to be able to launch airflow in under a minute without any preps!

Logs endpoint broken due to 1.5.1 config

Whenever I try to access the logs endpoint in the admin console, webserver has the following exception:

2015-09-29 18:12:58,901 - airflow.www.app - ERROR - Exception on /admin/airflow/log [GET]
Traceback (most recent call last):
  File "/usr/lib/python2.7/dist-packages/flask/app.py", line 1817, in wsgi_app
    response = self.full_dispatch_request()
  File "/usr/lib/python2.7/dist-packages/flask/app.py", line 1477, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/usr/lib/python2.7/dist-packages/flask/app.py", line 1381, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/usr/lib/python2.7/dist-packages/flask/app.py", line 1475, in full_dispatch_request
    rv = self.dispatch_request()
  File "/usr/lib/python2.7/dist-packages/flask/app.py", line 1461, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/usr/lib/python2.7/dist-packages/flask_admin/base.py", line 68, in inner
    return self._run_view(f, *args, **kwargs)
  File "/usr/lib/python2.7/dist-packages/flask_admin/base.py", line 359, in _run_view
    return fn(self, *args, **kwargs)
  File "/usr/lib/python2.7/dist-packages/airflow/www/app.py", line 795, in log
    loc = loc.format(**locals())
KeyError: 'AIRFLOW_HOME'
2015-09-29 18:12:58,902 - tornado.access - ERROR - 500 GET /admin/airflow/log?task_id=run_this_last&dag_id=example_bash_operator&execution_date=2015-09-28T00:00:00 (192.168.99.1) 5.89ms

The reason for this error is that Airflow is trying to format a string with {AIRFLOW_HOME} in it and failing because the key AIRFLOW_HOME does not appear in locals. The root of this problem seems to be that the {AIRFLOW_HOME} key is in the docker-airflow config/airflow.cfg under base_log_folder. It's easy enough to fix this problem (by cloning the repo and modifying the config), but I didn't know if there was a step which I was missing that would correctly format the config in the process of bringing the containers up. Really the only step that I'm performing is docker-compose up -d.
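For anyone hitting this before an upstream fix: replacing the templated path in config/airflow.cfg with the literal AIRFLOW_HOME these images use sidesteps the KeyError (a sketch; /usr/local/airflow is the home directory the containers are built with):

[core]
base_log_folder = /usr/local/airflow/logs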

Worker and Scheduler constantly restarting

Hi there,

We have been running Airflow inside Docker for about 1.5 months without any bigger issues. However, for a few days now the worker and scheduler keep restarting without executing any of the DAGs.
The worker log shows an Unrecoverable error: TypeError:

[2016-11-09 14:41:04,432: CRITICAL/MainProcess] Unrecoverable error: TypeError('unorderable types: NoneType() <= int()',)
Traceback (most recent call last):
  File "/usr/local/lib/python3.4/dist-packages/celery/worker/worker.py", line 203, in start
    self.blueprint.start(self)
  File "/usr/local/lib/python3.4/dist-packages/celery/bootsteps.py", line 115, in start
    self.on_start()
  File "/usr/local/lib/python3.4/dist-packages/celery/apps/worker.py", line 143, in on_start
    self.emit_banner()
  File "/usr/local/lib/python3.4/dist-packages/celery/apps/worker.py", line 159, in emit_banner
    string(self.colored.reset(self.extra_info() or '')),
  File "/usr/local/lib/python3.4/dist-packages/celery/apps/worker.py", line 188, in extra_info
    if self.loglevel <= logging.INFO:
TypeError: unorderable types: NoneType() <= int()

I have no idea what caused this issue. Do you have an idea or a clue what could go wrong?

Very large Airflow Logs!

Hi,

how come Airflow can generate 140 GB of logs under /stor/OSF_Tracking/docker/overlay2/d90dc3c63c3849bef95d82a47e67c138b54db7316860252a75ca308351cd00fe/diff/usr/local/airflow/logs ?

My other questions: is there a way to suppress that, or set up something like log rotation?
How do I put those logs in a docker volume?
Any suggestion is very much appreciated.
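A hedged two-part mitigation: mount the log directory as a named volume so it stops growing the container's writable layer, and prune old task logs on a schedule (the volume name and the 30-day cutoff are assumptions):

docker run -d -p 8080:8080 -v airflow-logs:/usr/local/airflow/logs puckel/docker-airflow webserver

# e.g. from cron on the host: delete task logs older than 30 days
docker run --rm -v airflow-logs:/logs alpine find /logs -type f -mtime +30 -delete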

"Failed to fetch log file from worker" when running LocalExecutor

I'm seeing this in the web interface when trying to access the logs of a task, but only when running LocalExecutor - could this be a misconfiguration?

*** Log file isn't local.
*** Fetching here: http://33d456e47018:8793/log/g_pipelines/rawlogs_pipeline/2016-10-24T05:30:00
*** Failed to fetch log file from worker.

*** Reading remote logs...
*** Unsupported remote log location.
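With LocalExecutor everything runs on the same host, so a hedged fix is to share one logs volume between the webserver and scheduler containers; the log file is then local to the webserver and it never needs the (stale) worker hostname. A sketch for the compose file (the host-side ./logs path is an assumption):

     webserver:
         volumes:
             - ./logs:/usr/local/airflow/logs
     scheduler:
         volumes:
             - ./logs:/usr/local/airflow/logs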
