
ambari-airflow-mpack's People

Contributors

miho120

ambari-airflow-mpack's Issues

airflow mpack kerberos

The mpack generates the wrong kerberos option names in airflow.cfg. Wrong name (left) versus correct name (right):
kerberos_ccache = ccache
kerberos_kinit_path = kinit_path
kerberos_reinit_frequency = reinit_frequency
kerberos_principal = principal
kerberos_keytab = keytab
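
For reference, a minimal sketch of the [kerberos] section airflow.cfg expects (values are illustrative placeholders, not the mpack's defaults):

[kerberos]
# placeholder values for illustration
ccache = /tmp/airflow_krb5_ccache
principal = airflow
reinit_frequency = 3600
kinit_path = kinit
keytab = /etc/security/keytabs/airflow.keytab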

Ambari doesn't find the Airflow service when trying to add it

Hi,
I have installed the mpack. It exists in /var/lib/ambari-server/resources/mpacks directory. Running Ambari 2.7.0.0. Ambari has been restarted.
When I try to add Airflow in Ambari as a service (Services/Add Service), it is not listed among the services.
What can be the problem?
We are using HDP 3.0.
I tried to add {"stack_name" : "HDP", "stack_version" : "3.0"} to the mpack.json file but it is still not shown among the available services to add.

Regards Tomas.
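
For anyone debugging the same thing, a hedged checklist (the archive path below is a placeholder): the mpack has to be installed through ambari-server and the server restarted afterwards, since the service only shows up once the stack definitions are reloaded.

# placeholder path to the downloaded mpack archive
ambari-server install-mpack --mpack=/tmp/airflow-service-mpack.tar.gz --verbose
ambari-server restart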

quick links

"Quick link" didn't change default port ( 8080 in my case already use on the host) when i changed it into -
Advanced airflow-webserver-site block:
Web server port = 8089

ldap

Hi.
When I add it as a property under "Custom airflow-webserver-site", it is not applied.
I need to add auth_backend_web=airflow.contrib.auth.backends.ldap_auth
But the option does not appear in airflow.cfg and Airflow does not start.
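
For comparison, a minimal sketch of what LDAP auth looks like in a hand-edited airflow.cfg on Airflow 1.x (server, DNs, and password are placeholders); the mpack would need to render the equivalent of this:

[webserver]
authenticate = True
auth_backend = airflow.contrib.auth.backends.ldap_auth

[ldap]
# placeholder LDAP server and bind settings
uri = ldaps://ldap.example.com:636
user_filter = objectClass=*
user_name_attr = uid
bind_user = cn=manager,dc=example,dc=com
bind_password = secret
basedn = dc=example,dc=com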

Service not installed properly

I installed the mpack, but the Ambari interface does not seem to set up the Airflow service correctly.
Any ideas?
Also, is it possible to initialize a database other than the default SQLite one?

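On the database question: a hedged sketch of pointing Airflow at MySQL instead of the default SQLite via the [core] section (the connection string is a placeholder), followed by re-running airflow initdb against the new database:

[core]
# placeholder MySQL connection string; SQLite is the default backend
sql_alchemy_conn = mysql://airflow:airflow_pass@db-host:3306/airflow
executor = LocalExecutor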

Does upgrading the mpack affect installed services?

Hi,

We have installed the mpack and created the Airflow service, but we realized that some parameters are missing or wrong. When we upgrade the mpack, does it affect already installed services?

Thank you,
Mustafa
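
A hedged sketch of the upgrade path (the archive path is a placeholder). As far as I understand, upgrade-mpack replaces the service definition on the server but does not rewrite configuration values already stored for an installed service, so wrong parameters still need to be corrected in the Ambari UI:

ambari-server upgrade-mpack --mpack=/tmp/airflow-service-mpack.tar.gz --verbose
ambari-server restart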

local worker

Ambari sends an alert when there is no Airflow Worker.
But I don't want it; I run Airflow in local executor mode.

Airflow service auto stopping

Hi!
I went through the steps written in the repo and provided all the necessary configuration in the XML files. Airflow installs successfully but then stops on its own just a couple of minutes later. The worker stops, and whatever port I give it (e.g. 8088 or 8085) returns "Connection refused", even though no other service is running on that port.
Can you please help me get the Airflow service running?

installation with ambari server

I successfully installed the mpack, but the installation then fails in ambari-server. I am using Ambari 2.7.3.0 and HDP 3.1.0.0. I also cannot uninstall the mpack now. I tried:
ambari-server uninstall-mpack --mpack=airflow-service-mpack.tar.gz
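
For what it's worth, uninstall-mpack expects the mpack name rather than the archive file; a hedged sketch (the exact name comes from this mpack's mpack.json and is only a placeholder here):

ambari-server uninstall-mpack --mpack-name=airflow-ambari-mpack
ambari-server restart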

Unable to install Airflow Components on Machines with Python 2.7 and Python 3.5

Hi,
I tried to install the mpack on machines which had Python 3.5 and Python 2.7.14 installed. The symbolic links for the Python installations are separate. But whenever the Airflow Worker tries to install, it fails while trying to install the docutils package via pip.
Ambari version: 2.7.3
HDP version: 3.1.0

Is this a known issue?
Every other service installed via custom mpacks works fine.

airflow initdb error

[root@wsjylog03 ~]# airflow initdb
[2018-09-26 10:50:31,727] {__init__.py:45} INFO - Using executor LocalExecutor
DB: mysql://airflow:***@10.10.1.12/airflow
[2018-09-26 10:50:31,887] {db.py:312} INFO - Creating tables
INFO [alembic.runtime.migration] Context impl MySQLImpl.
INFO [alembic.runtime.migration] Will assume non-transactional DDL.
Traceback (most recent call last):
  File "/usr/bin/airflow", line 28, in <module>
    args.func(args)
  File "/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 897, in initdb
    db_utils.initdb()
  File "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 114, in initdb
    schema='airflow_ci'))
  File "<string>", line 4, in __init__
  File "/usr/lib/python2.7/site-packages/sqlalchemy/orm/state.py", line 424, in _initialize_instance
    manager.dispatch.init_failure(self, args, kwargs)
  File "/usr/lib/python2.7/site-packages/sqlalchemy/util/langhelpers.py", line 66, in __exit__
    compat.reraise(exc_type, exc_value, exc_tb)
  File "/usr/lib/python2.7/site-packages/sqlalchemy/orm/state.py", line 421, in _initialize_instance
    return manager.original_init(*mixed[1:], **kwargs)
  File "/usr/lib/python2.7/site-packages/airflow/models.py", line 578, in __init__
    self.extra = extra
  File "<string>", line 1, in __set__
  File "/usr/lib/python2.7/site-packages/airflow/models.py", line 639, in set_extra
    fernet = get_fernet()
  File "/usr/lib/python2.7/site-packages/airflow/models.py", line 105, in get_fernet
    return Fernet(configuration.get('core', 'FERNET_KEY').encode('utf-8'))
  File "/usr/lib64/python2.7/site-packages/cryptography/fernet.py", line 34, in __init__
    key = base64.urlsafe_b64decode(key)
  File "/usr/lib64/python2.7/base64.py", line 112, in urlsafe_b64decode
    return b64decode(s, '-_')
  File "/usr/lib64/python2.7/base64.py", line 76, in b64decode
    raise TypeError(msg)

Fix: modify /var/lib/ambari-server/resources/common-services/AIRFLOW/1.9.0/configuration/airflow-core-site.xml, changing the fernet key from "mMGXRdfFpdUZDYmt8Ur1xVmspyOkYKtBlkv91dB8SVs" to "mMGXRdfFpdUZDYmt8Ur1xVmspyOkYKtBlkv91dB8SVs=" (the trailing "=" padding is required for the key to base64-decode):

<property>
  <name>fernet_key</name>
  <value>mMGXRdfFpdUZDYmt8Ur1xVmspyOkYKtBlkv91dB8SVs=</value>
  <display-name>Fernet key</display-name>
  <description>Secret key to save connection passwords in the db.</description>
</property>
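
If a fresh key is preferred over the bundled one, a valid (correctly padded) Fernet key can be generated with the cryptography package that Airflow already depends on; a hedged one-liner, not specific to this mpack:

python -c "from cryptography.fernet import Fernet; print(Fernet.generate_key())"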

pip repo path

I can't install packages from the public repo because the host has no internet access. I can only install Python packages from an internal repo. So could you please add an option to set the repo path during installation?
For example:
pip install airflow --index-url https://link --trusted-host host
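
As a workaround until such an option exists, the index can usually be set host-wide in pip's own configuration, so every pip call made by the mpack scripts picks it up (URL and host below are placeholders):

# /etc/pip.conf
[global]
index-url = https://pypi.internal.example.com/simple
trusted-host = pypi.internal.example.com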

error regarding python-pip

Trying to use the mpack from the master branch went smoothly, but during installation it throws the following exception:

raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install python-pip' returned 1. Error: Nothing to do

We do not have a yum package for python-pip, so I installed pip from the get-pip.py file on my node, but it still throws the above error.

Could you please help me fix or bypass that error? Also, I am using virtual environments and could not find where this file is located: AIRFLOW_HOME/airflow_control.sh
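
For the record, pip can be bootstrapped without the python-pip yum package (a hedged sketch for Python 2.7; the mpack scripts may still expect the yum package to be resolvable, so this alone may not clear the error):

curl -fsSL -o get-pip.py https://bootstrap.pypa.io/pip/2.7/get-pip.py
python get-pip.py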

Error while trying to install the service on Ambari HDP 2.6

Hi,

Thanks for writing this mpack for Ambari. It is really useful for making Airflow more integrated with the whole HDP stack.

I was able to make the service available in Ambari, but not to install it. There seems to be an issue running pip for one package:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/AIRFLOW/1.10.0/package/scripts/airflow_scheduler_control.py", line 55, in <module>
    AirflowScheduler().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 367, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/AIRFLOW/1.10.0/package/scripts/airflow_scheduler_control.py", line 16, in install
    Execute(format("export SLUGIFY_USES_TEXT_UNIDECODE=yes && pip install --upgrade {airflow_pip_params} pip && pip install --upgrade {airflow_pip_params} docutils pytest-runner && pip install --upgrade {airflow_pip_params} --ignore-installed apache-airflow[all]==1.10.0 apache-airflow[celery]==1.10.0"))
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'export SLUGIFY_USES_TEXT_UNIDECODE=yes && pip install --upgrade  pip && pip install --upgrade  docutils pytest-runner && pip install --upgrade  --ignore-installed apache-airflow[all]==1.10.0 apache-airflow[celery]==1.10.0' returned 1. Requirement already up-to-date: pip in /usr/local/lib/python2.7/site-packages (18.0)
Collecting docutils
  Using cached https://files.pythonhosted.org/packages/50/09/c53398e0005b11f7ffb27b7aa720c617aba53be4fb4f4f3f06b9b5c60f28/docutils-0.14-py2-none-any.whl
Collecting pytest-runner
  Using cached https://files.pythonhosted.org/packages/72/a4/d7a5738a3096f22a98bec1609e237b250ebff04e5ea2930305d485337263/pytest_runner-4.2-py2.py3-none-any.whl
Installing collected packages: docutils, pytest-runner
  Found existing installation: docutils 0.11
Cannot uninstall 'docutils'. It is a distutils installed project and thus we cannot accurately determine which files belong to it which would lead to only a partial uninstall.
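
The usual workaround for a distutils-installed docutils (a hedged suggestion, not something the mpack does itself) is to tell pip not to try to remove the old copy before retrying the service install:

pip install --ignore-installed docutils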

Default Kerberos configuration does not work properly

The current default value of the kerberos ccache is /tmp/airflow_krb5_ccache.

But this conflicts with the systemd unit configuration of the scheduler and webserver, which set PrivateTmp.

PrivateTmp tells systemd to start the airflow daemon with a private /tmp mount instead of the /tmp of the Unix file system. It prevents the airflow process from sharing access to /tmp as usual.

BUT the airflow kerberos ccache is set to store the TGT cache in /tmp/airflow_krb5_ccache. That cache file cannot be found by the airflow scheduler or webserver when they are started by systemd.

I think we can change the ccache default value to /usr/local/airflow/airflow_krb5_ccache.
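
A hedged sketch of the two settings that clash, assuming unit files along the lines of the ones shipped by the mpack (the exact contents are an assumption on my part):

# systemd unit for the scheduler/webserver: each service gets its own private /tmp
[Service]
PrivateTmp=true

# airflow.cfg: proposed default so the ticket cache lives outside /tmp
[kerberos]
ccache = /usr/local/airflow/airflow_krb5_ccache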

Exception in airflow worker

Hello,

We get the following exception in the worker process. Can anybody suggest a fix?

Sep 2 18:20:52 comp-prod1 airflow_control.sh: [2019-09-02 16:20:52,089] {jobs.py:1108} INFO - No tasks to consider for execution.
Sep 2 18:20:52 comp-prod1 airflow_control.sh: Traceback (most recent call last):
Sep 2 18:20:52 comp-prod1 airflow_control.sh: File "/usr/bin/airflow", line 32, in <module>
Sep 2 18:20:52 comp-prod1 airflow_control.sh: args.func(args)
Sep 2 18:20:52 comp-prod1 airflow_control.sh: File "/usr/lib/python2.7/site-packages/airflow/utils/cli.py", line 74, in wrapper
Sep 2 18:20:52 comp-prod1 airflow_control.sh: return f(*args, **kwargs)
Sep 2 18:20:52 comp-prod1 airflow_control.sh: File "/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 955, in worker
Sep 2 18:20:52 comp-prod1 airflow_control.sh: from airflow.executors.celery_executor import app as celery_app
Sep 2 18:20:52 comp-prod1 airflow_control.sh: File "/usr/lib/python2.7/site-packages/airflow/executors/celery_executor.py", line 24, in <module>
Sep 2 18:20:52 comp-prod1 airflow_control.sh: from celery import Celery
Sep 2 18:20:52 comp-prod1 airflow_control.sh: File "/usr/lib/python2.7/site-packages/celery/local.py", line 511, in __getattr__
Sep 2 18:20:52 comp-prod1 airflow_control.sh: module = __import__(self._object_origins[name], None, None, [name])
Sep 2 18:20:52 comp-prod1 airflow_control.sh: File "/usr/lib/python2.7/site-packages/celery/app/__init__.py", line 5, in <module>
Sep 2 18:20:52 comp-prod1 airflow_control.sh: from celery import _state
Sep 2 18:20:52 comp-prod1 airflow_control.sh: File "/usr/lib/python2.7/site-packages/celery/_state.py", line 17, in <module>
Sep 2 18:20:52 comp-prod1 airflow_control.sh: from celery.utils.threads import LocalStack
Sep 2 18:20:52 comp-prod1 airflow_control.sh: File "/usr/lib/python2.7/site-packages/celery/utils/__init__.py", line 10, in <module>
Sep 2 18:20:52 comp-prod1 airflow_control.sh: from .nodenames import worker_direct, nodename, nodesplit
Sep 2 18:20:52 comp-prod1 airflow_control.sh: File "/usr/lib/python2.7/site-packages/celery/utils/nodenames.py", line 9, in <module>
Sep 2 18:20:52 comp-prod1 airflow_control.sh: from kombu.entity import Exchange, Queue
Sep 2 18:20:52 comp-prod1 airflow_control.sh: File "/usr/lib/python2.7/site-packages/kombu/entity.py", line 9, in <module>
Sep 2 18:20:52 comp-prod1 airflow_control.sh: from .serialization import prepare_accept_content
Sep 2 18:20:52 comp-prod1 airflow_control.sh: File "/usr/lib/python2.7/site-packages/kombu/serialization.py", line 456, in <module>
Sep 2 18:20:52 comp-prod1 airflow_control.sh: for ep, args in entrypoints('kombu.serializers'): # pragma: no cover
Sep 2 18:20:52 comp-prod1 airflow_control.sh: File "/usr/lib/python2.7/site-packages/kombu/utils/compat.py", line 89, in entrypoints
Sep 2 18:20:52 comp-prod1 airflow_control.sh: for ep in importlib_metadata.entry_points().get(namespace, [])
Sep 2 18:20:52 comp-prod1 airflow_control.sh: File "/usr/lib/python2.7/site-packages/importlib_metadata/__init__.py", line 468, in entry_points
Sep 2 18:20:52 comp-prod1 airflow_control.sh: ordered = sorted(eps, key=by_group)
Sep 2 18:20:52 comp-prod1 airflow_control.sh: File "/usr/lib/python2.7/site-packages/importlib_metadata/__init__.py", line 466, in <genexpr>
Sep 2 18:20:52 comp-prod1 airflow_control.sh: dist.entry_points for dist in distributions())
Sep 2 18:20:52 comp-prod1 airflow_control.sh: File "/usr/lib/python2.7/site-packages/importlib_metadata/__init__.py", line 372, in <genexpr>
Sep 2 18:20:52 comp-prod1 airflow_control.sh: cls._search_path(path, pattern)
Sep 2 18:20:52 comp-prod1 airflow_control.sh: File "/usr/lib/python2.7/site-packages/importlib_metadata/__init__.py", line 381, in _switch_path
Sep 2 18:20:52 comp-prod1 airflow_control.sh: return pathlib.Path(path)
Sep 2 18:20:52 comp-prod1 airflow_control.sh: File "/usr/lib/python2.7/site-packages/pathlib2/__init__.py", line 1256, in __new__
Sep 2 18:20:52 comp-prod1 airflow_control.sh: self = cls._from_parts(args, init=False)
Sep 2 18:20:52 comp-prod1 airflow_control.sh: File "/usr/lib/python2.7/site-packages/pathlib2/__init__.py", line 898, in _from_parts
Sep 2 18:20:52 comp-prod1 airflow_control.sh: drv, root, parts = self._parse_args(args)
Sep 2 18:20:52 comp-prod1 airflow_control.sh: File "/usr/lib/python2.7/site-packages/pathlib2/__init__.py", line 891, in _parse_args
Sep 2 18:20:52 comp-prod1 airflow_control.sh: return cls._flavour.parse_parts(parts)
Sep 2 18:20:52 comp-prod1 airflow_control.sh: File "/usr/lib/python2.7/site-packages/pathlib2/__init__.py", line 250, in parse_parts
Sep 2 18:20:52 comp-prod1 airflow_control.sh: parsed.append(intern(x))
Sep 2 18:20:52 comp-prod1 airflow_control.sh: TypeError: can't intern subclass of string
Sep 2 18:20:52 comp-prod1 systemd: airflow-worker.service: main process exited, code=exited, status=1/FAILURE
Sep 2 18:20:52 comp-prod1 systemd: Unit airflow-worker.service entered failed state.
Sep 2 18:20:52 comp-prod1 systemd: airflow-worker.service failed.

Setup Error

Hi!
I'm using Ambari version 2.6.1.3 (deploying HDP 2.6.4 from Hortonworks).
I downloaded and installed the Airflow mpack successfully.
But when I add the Airflow service in Ambari, the install action fails.
It shows this message:

Caught an exception while executing custom service command: <class 'ambari_agent.AgentException.AgentException'>: 'Script /var/lib/ambari-agent/cache/common-services/AIRFLOW/1.9.0/package/scripts/airflow_worker_control.py does not exist'; 'Script /var/lib/ambari-agent/cache/common-services/AIRFLOW/1.9.0/package/scripts/airflow_worker_control.py does not exist'

I don't know why the script package does not exist on the ambari-agent. Could you please check which Ambari version you are running where it works OK?
Many thanks!
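
One thing that sometimes helps with a stale script cache (a hedged suggestion, not a confirmed fix for this mpack): restart the server and the agents so /var/lib/ambari-agent/cache is re-synced from the server's common-services directory.

ambari-server restart
# on every cluster node
ambari-agent restart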

Require airflow kerberos daemon with systemd

Hi, miho120:

When configuring Airflow with Kerberos authorization, the airflow kerberos daemon is a must for Airflow to work properly with other applications like Hive. airflow kerberos takes responsibility for renewing the TGT.

Please consider adding it to the systemd scripts. Thanks.
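
A hedged sketch of what such a unit could look like (user, paths, and dependencies are placeholders, not taken from the mpack):

[Unit]
Description=Airflow kerberos ticket renewer
After=network.target

[Service]
# placeholder service account and binary path
User=airflow
ExecStart=/usr/bin/airflow kerberos
Restart=on-failure

[Install]
WantedBy=multi-user.target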

Error installing Airflow

stderr:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/AIRFLOW/1.10.0/package/scripts/airflow_worker_control.py", line 61, in
AirflowWorker().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/AIRFLOW/1.10.0/package/scripts/airflow_worker_control.py", line 14, in install
self.install_packages(env)
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 849, in install_packages
retry_count=agent_stack_retry_count)
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in init
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/packaging.py", line 30, in action_install
self._pkg_manager.install_package(package_name, self.__create_context())
File "/usr/lib/ambari-agent/lib/ambari_commons/repo_manager/yum_manager.py", line 219, in install_package
shell.repository_manager_executor(cmd, self.properties, context)
File "/usr/lib/ambari-agent/lib/ambari_commons/shell.py", line 753, in repository_manager_executor
raise RuntimeError(message)
RuntimeError: Failed to execute command '/usr/bin/yum -y install python-pip', exited with code '1', message: 'Error: Nothing to do
'
stdout:
2019-06-21 11:32:41,730 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=None -> 3.1
2019-06-21 11:32:41,735 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2019-06-21 11:32:41,736 - Skipping creation of User and Group as host is sys prepped or ignore_groupsusers_create flag is on
2019-06-21 11:32:41,736 - Skipping setting dfs cluster admin and tez view acls as host is sys prepped
2019-06-21 11:32:41,736 - FS Type: HDFS
2019-06-21 11:32:41,736 - Directory['/etc/hadoop'] {'mode': 0755}
2019-06-21 11:32:41,750 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'stdevhdfs', 'group': 'grpstdevhadoop'}
2019-06-21 11:32:41,750 - Writing File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] because contents don't match
2019-06-21 11:32:41,751 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'stdevhdfs', 'group': 'grpstdevhadoop', 'mode': 01777}
2019-06-21 11:32:41,761 - Repository['HDP-3.1-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.1.0.0', 'action': ['prepare'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2019-06-21 11:32:41,766 - Repository['HDP-3.1-GPL-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.1.0.0', 'action': ['prepare'], 'components': [u'HDP-GPL', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2019-06-21 11:32:41,768 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7', 'action': ['prepare'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2019-06-21 11:32:41,770 - Repository[None] {'action': ['create']}
2019-06-21 11:32:41,771 - File['/tmp/tmpTZx74k'] {'content': '[HDP-3.1-repo-1]\nname=HDP-3.1-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.1.0.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-3.1-GPL-repo-1]\nname=HDP-3.1-GPL-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.1.0.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-1]\nname=HDP-UTILS-1.1.0.22-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2019-06-21 11:32:41,771 - Writing File['/tmp/tmpTZx74k'] because contents don't match
2019-06-21 11:32:41,771 - File['/tmp/tmpchiThk'] {'content': StaticFile('/etc/yum.repos.d/ambari-hdp-1.repo')}
2019-06-21 11:32:41,772 - Writing File['/tmp/tmpchiThk'] because contents don't match
2019-06-21 11:32:41,772 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-06-21 11:32:41,840 - Skipping installation of existing package unzip
2019-06-21 11:32:41,840 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-06-21 11:32:41,847 - Skipping installation of existing package curl
2019-06-21 11:32:41,847 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-06-21 11:32:41,853 - Skipping installation of existing package hdp-select
2019-06-21 11:32:41,857 - The repository with version 3.1.0.0-78 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2019-06-21 11:32:41,861 - Skipping stack-select on AIRFLOW because it does not exist in the stack-select package structure.
2019-06-21 11:32:42,032 - Package['krb5-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-06-21 11:32:42,099 - Installing package krb5-devel ('/usr/bin/yum -y install krb5-devel')
2019-06-21 11:32:49,450 - Package['python-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-06-21 11:32:49,458 - Skipping installation of existing package python-devel
2019-06-21 11:32:49,459 - Package['sqlite-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-06-21 11:32:49,466 - Installing package sqlite-devel ('/usr/bin/yum -y install sqlite-devel')
2019-06-21 11:32:50,442 - Package['openssl-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-06-21 11:32:50,450 - Installing package openssl-devel ('/usr/bin/yum -y install openssl-devel')
2019-06-21 11:32:52,367 - Package['mysql-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-06-21 11:32:52,375 - Installing package mysql-devel ('/usr/bin/yum -y install mysql-devel')
2019-06-21 11:32:53,648 - Package['python-pip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-06-21 11:32:53,656 - Installing package python-pip ('/usr/bin/yum -y install python-pip')
2019-06-21 11:32:54,045 - The repository with version 3.1.0.0-78 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2019-06-21 11:32:54,050 - Skipping stack-select on AIRFLOW because it does not exist in the stack-select package structure.

Command failed after 1 tries
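
For reference, on CentOS/RHEL 7 the python-pip package normally comes from the EPEL repository, so a hedged pre-step before retrying the install would be:

yum -y install epel-release
yum -y install python-pip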

GIT LFS Issue

Hi ,

I have a Git LFS issue with your repository:

[root@sandbox-hdp ~]# git lfs clone https://github.com/miho120/ambari-airflow-mpack
Cloning into 'ambari-airflow-mpack'...
remote: Counting objects: 130, done.
remote: Total 130 (delta 0), reused 0 (delta 0), pack-reused 130
Receiving objects: 100% (130/130), 573.38 KiB | 503.00 KiB/s, done.
Resolving deltas: 100% (45/45), done.
Checking connectivity... done.
Git LFS: (0 of 2 files) 0 B / 250.84 MB
batch response: This repository is over its data quota. Purchase more data packs to restore access.
error: failed to fetch some objects from 'https://github.com/miho120/ambari-airflow-mpack.git/info/lfs'
