
steven-matison / dfhz_hdp_mpack

Install Ambari 2.7.5 with HDP 3.1.4 without using Hortonworks repositories.

License: Apache License 2.0

Python 91.77% TSQL 3.58% Shell 3.29% Batchfile 0.91% Roff 0.18% Groovy 0.28%
centos7 opensuse ambari ambari-service hdp hdp3

dfhz_hdp_mpack's Introduction

dfhz_hdp_mpack

HDP 3.1.4.0 Management Pack for Ambari

Install Ambari from MOSGA RPMs:

CENTOS7

wget -O /etc/yum.repos.d/mosga.repo https://makeopensourcegreatagain.com/repos/centos/7/ambari/2.7.5.0/mosga-ambari.repo
yum install ambari-server ambari-agent -y
ambari-server setup -s
ambari-server start
ambari-agent start

Management Pack Installation - HDP 3.1.4.0

ambari-server install-mpack --mpack=https://github.com/steven-matison/dfhz_hdp_mpack/raw/master/hdp-ambari-mpack-3.1.4.0.tar.gz --verbose
ambari-server restart
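After the restart, the installed mpack should be visible under the server's mpacks directory. A minimal sketch for checking this, assuming Ambari's default resources path (adjust if ambari-server was configured elsewhere):

```python
import os

def installed_mpacks(root="/var/lib/ambari-server/resources/mpacks"):
    # List management packs the server has unpacked; returns [] if the
    # directory does not exist (e.g. no mpacks installed yet).
    if not os.path.isdir(root):
        return []
    return sorted(e for e in os.listdir(root)
                  if os.path.isdir(os.path.join(root, e)))
```

If the list is empty after a seemingly successful install-mpack run, check the ambari-server log for unpack errors.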

Management Pack Removal

ambari-server uninstall-mpack --mpack-name=hdp-ambari-mpack
ambari-server restart


dfhz_hdp_mpack's Issues

Repository for Debian

Hello Steven,
My planned servers run Debian. Could you share any information on this? How can I use your HDP build on Debian?
Best regards

When installing Apache Flink 1.9.3 as an Ambari service, a KeyError: u'flink' occurs

Installing Apache Flink 1.9.3 as an Ambari service fails with KeyError: u'flink':

2021-10-01 10:23:04,925 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=None -> 3.1
2021-10-01 10:23:04,930 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2021-10-01 10:23:04,932 - Group['flink'] {}
2021-10-01 10:23:04,933 - Group['livy'] {}
2021-10-01 10:23:04,933 - Group['spark'] {}
2021-10-01 10:23:04,933 - Group['hdfs'] {}
2021-10-01 10:23:04,933 - Group['hadoop'] {}
2021-10-01 10:23:04,933 - Group['users'] {}
2021-10-01 10:23:04,934 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-10-01 10:23:04,935 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stack-hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-197.json', '/var/lib/ambari-agent/cache/stack-hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-197.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1_2', '']
2021-10-01 10:23:04,958 - The repository with version 3.1.4.0-315 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2021-10-01 10:23:04,962 - Skipping stack-select on FLINK because it does not exist in the stack-select package structure.
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stack-hooks/before-ANY/scripts/hook.py", line 38, in <module>
    BeforeAnyHook().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stack-hooks/before-ANY/scripts/hook.py", line 31, in hook
    setup_users()
  File "/var/lib/ambari-agent/cache/stack-hooks/before-ANY/scripts/shared_initialization.py", line 50, in setup_users
    groups = params.user_to_groups_dict[user],
KeyError: u'flink'
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stack-hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-52.json', '/var/lib/ambari-agent/cache/stack-hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-52.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1_2', '']
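The KeyError comes from the before-ANY hook looking up each service user in user_to_groups_dict, which is built from cluster-env and does not include the flink user added by the third-party service. A minimal sketch of the failure mode and a defensive fallback (the dict contents and the default group are illustrative, not Ambari's actual data):

```python
# Illustrative reconstruction of the failing lookup in
# shared_initialization.py. The real user_to_groups_dict is derived from
# cluster-env; 'flink' is absent because the service was added out-of-band.
user_to_groups_dict = {
    "hive": ["hadoop"],
    "hdfs": ["hdfs", "hadoop"],
}

def groups_for(user):
    # The shipped hook does user_to_groups_dict[user], which raises
    # KeyError: u'flink' for unknown users. Falling back to a default
    # group (assumed 'hadoop' here) avoids the crash.
    return user_to_groups_dict.get(user, ["hadoop"])

print(groups_for("flink"))  # falls back instead of raising KeyError
```

The .get() fallback is a common local patch to the agent-side hook; the cleaner fix is to ensure the new service's user actually appears in the cluster's user/group mapping.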

Ambari on SUSE

I have downloaded all of the CentOS repositories from
https://makeopensourcegreatagain.com/repos/

First question: can we install Ambari on SUSE? To my understanding it was built for CentOS 7; please let me know (y/n).

Second: if there is a build repository for SUSE, please share the link along with instructions for installing on SUSE.

I have been trying and facing a lot of issues. While investigating the logs I found that the problem is missing OS packages: the redhat-lab pack, for one, is missing on SUSE, so I think it is not advisable to install on SUSE.

Please advise.

Can't install Hue.

I have a problem when I try to install Hue from Ambari's 'Add Service' functionality.
It fails immediately, throwing an error during the Hue Server Install step.

Here is the error log:

stderr:
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stack-hooks/before-ANY/scripts/hook.py", line 38, in <module>
    BeforeAnyHook().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stack-hooks/before-ANY/scripts/hook.py", line 31, in hook
    setup_users()
  File "/var/lib/ambari-agent/cache/stack-hooks/before-ANY/scripts/shared_initialization.py", line 50, in setup_users
    groups = params.user_to_groups_dict[user],
KeyError: u'hue'
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stack-hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-816.json', '/var/lib/ambari-agent/cache/stack-hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-816.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1_2', '']

stdout: /var/lib/ambari-agent/data/output-816.txt
2021-04-14 15:36:48,045 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=None -> 3.1
2021-04-14 15:36:48,052 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2021-04-14 15:36:48,054 - Group['livy'] {}
2021-04-14 15:36:48,055 - Group['spark'] {}
2021-04-14 15:36:48,056 - Group['ranger'] {}
2021-04-14 15:36:48,056 - Group['hdfs'] {}
2021-04-14 15:36:48,056 - Group['hue'] {}
2021-04-14 15:36:48,056 - Group['zeppelin'] {}
2021-04-14 15:36:48,057 - Group['hadoop'] {}
2021-04-14 15:36:48,057 - Group['users'] {}
2021-04-14 15:36:48,057 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-04-14 15:36:48,059 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-04-14 15:36:48,060 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-04-14 15:36:48,061 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-04-14 15:36:48,062 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2021-04-14 15:36:48,063 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-04-14 15:36:48,064 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger', 'hadoop'], 'uid': None}
2021-04-14 15:36:48,065 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2021-04-14 15:36:48,066 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2021-04-14 15:36:48,067 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-04-14 15:36:48,068 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2021-04-14 15:36:48,070 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2021-04-14 15:36:48,071 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2021-04-14 15:36:48,072 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stack-hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-816.json', '/var/lib/ambari-agent/cache/stack-hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-816.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1_2', '']
2021-04-14 15:36:48,098 - The repository with version 3.1.4.0-315 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2021-04-14 15:36:48,103 - Skipping stack-select on HUE because it does not exist in the stack-select package structure.
Command failed after 1 tries

Can anybody help me?
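The failing statement is the user_to_groups_dict[user] lookup in the hook's setup_users; it crashes because 'hue' is missing from the user/group mapping the server shipped with the command. One way to confirm is to inspect the command JSON referenced in the error; this is a hedged diagnostic sketch, and the 'clusterLevelParams'/'user_groups' keys are assumptions about the command file layout, not verified against this Ambari version:

```python
import json

def users_in_command(path):
    # List the users present in the user_groups blob of an ambari-agent
    # command file. user_groups is itself a JSON string inside the JSON.
    with open(path) as fh:
        cmd = json.load(fh)
    blob = cmd.get("clusterLevelParams", {}).get("user_groups", "{}")
    return sorted(json.loads(blob))

# Example (path taken from the log above):
# print(users_in_command("/var/lib/ambari-agent/data/command-816.json"))
```

If 'hue' is absent from that list, the KeyError follows directly.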

Ambari Metrics Collector and Ambari Metrics Grafana are not working.

While trying to set up Ambari using the stacks from your makeopensourcegreatagain.com URL, we are facing the following issues: Grafana metrics, infra-solr, Kafka, and Ranger are not getting installed as required. Can you kindly help us with it?


Ambari - Grafana

stderr:
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/AMBARI_METRICS/package/scripts/metrics_grafana.py", line 84, in <module>
    AmsGrafana().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/AMBARI_METRICS/package/scripts/metrics_grafana.py", line 48, in start
    not_if = params.grafana_process_exists_cmd,
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
    returns=self.resource.returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/sbin/ambari-metrics-grafana start' returned 127. -bash: /usr/sbin/ambari-metrics-grafana: No such file or directory
stdout:
2021-03-01 11:32:59,645 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=3.1.4.0-315 -> 3.1.4.0-315
2021-03-01 11:32:59,682 - Using hadoop conf dir: /usr/hdp/3.1.4.0-315/hadoop/conf
2021-03-01 11:33:00,062 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=3.1.4.0-315 -> 3.1.4.0-315
2021-03-01 11:33:00,073 - Using hadoop conf dir: /usr/hdp/3.1.4.0-315/hadoop/conf
2021-03-01 11:33:00,075 - Group['livy'] {}
2021-03-01 11:33:00,078 - Group['spark'] {}
2021-03-01 11:33:00,078 - Group['ranger'] {}
2021-03-01 11:33:00,078 - Group['hdfs'] {}
2021-03-01 11:33:00,079 - Group['zeppelin'] {}
2021-03-01 11:33:00,079 - Group['hadoop'] {}
2021-03-01 11:33:00,080 - Group['users'] {}
2021-03-01 11:33:00,081 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-03-01 11:33:00,083 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-03-01 11:33:00,084 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-03-01 11:33:00,086 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-03-01 11:33:00,088 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-03-01 11:33:00,089 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger', 'hadoop'], 'uid': None}
2021-03-01 11:33:00,091 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2021-03-01 11:33:00,093 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2021-03-01 11:33:00,095 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2021-03-01 11:33:00,096 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2021-03-01 11:33:00,098 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2021-03-01 11:33:00,100 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-03-01 11:33:00,101 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2021-03-01 11:33:00,103 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-03-01 11:33:00,105 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-03-01 11:33:00,107 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-03-01 11:33:00,108 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-03-01 11:33:00,110 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2021-03-01 11:33:00,112 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2021-03-01 11:33:00,122 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2021-03-01 11:33:00,122 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2021-03-01 11:33:00,124 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2021-03-01 11:33:00,127 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2021-03-01 11:33:00,128 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2021-03-01 11:33:00,141 - call returned (0, '1009')
2021-03-01 11:33:00,143 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1009'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2021-03-01 11:33:00,152 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1009'] due to not_if
2021-03-01 11:33:00,153 - Group['hdfs'] {}
2021-03-01 11:33:00,154 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2021-03-01 11:33:00,155 - FS Type: HDFS
2021-03-01 11:33:00,156 - Directory['/etc/hadoop'] {'mode': 0755}
2021-03-01 11:33:00,186 - File['/usr/hdp/3.1.4.0-315/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2021-03-01 11:33:00,188 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2021-03-01 11:33:00,218 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2021-03-01 11:33:00,237 - Skipping Execute[('setenforce', '0')] due to not_if
2021-03-01 11:33:00,238 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2021-03-01 11:33:00,243 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2021-03-01 11:33:00,244 - Directory['/var/run/hadoop/hdfs'] {'owner': 'hdfs', 'cd_access': 'a'}
2021-03-01 11:33:00,245 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2021-03-01 11:33:00,253 - File['/usr/hdp/3.1.4.0-315/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2021-03-01 11:33:00,256 - File['/usr/hdp/3.1.4.0-315/hadoop/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
2021-03-01 11:33:00,268 - File['/usr/hdp/3.1.4.0-315/hadoop/conf/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2021-03-01 11:33:00,291 - File['/usr/hdp/3.1.4.0-315/hadoop/conf/hadoop-metrics2.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2021-03-01 11:33:00,292 - File['/usr/hdp/3.1.4.0-315/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2021-03-01 11:33:00,294 - File['/usr/hdp/3.1.4.0-315/hadoop/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2021-03-01 11:33:00,301 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop', 'mode': 0644}
2021-03-01 11:33:00,309 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2021-03-01 11:33:00,315 - Skipping unlimited key JCE policy check and setup since it is not required
2021-03-01 11:33:00,331 - Skipping stack-select on AMBARI_METRICS because it does not exist in the stack-select package structure.
2021-03-01 11:33:00,742 - Using hadoop conf dir: /usr/hdp/3.1.4.0-315/hadoop/conf
2021-03-01 11:33:00,747 - checked_call['hostid'] {}
2021-03-01 11:33:00,762 - checked_call returned (0, '10ac1215')
2021-03-01 11:33:00,769 - Directory['/etc/ambari-metrics-grafana/conf'] {'owner': 'ams', 'group': 'hadoop', 'create_parents': True, 'recursive_ownership': True, 'mode': 0755}
2021-03-01 11:33:00,773 - Directory['/var/log/ambari-metrics-grafana'] {'owner': 'ams', 'group': 'hadoop', 'create_parents': True, 'recursive_ownership': True, 'mode': 0755}
2021-03-01 11:33:00,774 - Directory['/var/lib/ambari-metrics-grafana'] {'owner': 'ams', 'group': 'hadoop', 'create_parents': True, 'recursive_ownership': True, 'mode': 0755}
2021-03-01 11:33:00,774 - Directory['/var/run/ambari-metrics-grafana'] {'owner': 'ams', 'group': 'hadoop', 'create_parents': True, 'recursive_ownership': True, 'mode': 0755}
2021-03-01 11:33:00,782 - File['/etc/ambari-metrics-grafana/conf/ams-grafana-env.sh'] {'content': InlineTemplate(...), 'owner': 'ams', 'group': 'hadoop'}
2021-03-01 11:33:00,790 - File['/etc/ambari-metrics-grafana/conf/ams-grafana.ini'] {'content': InlineTemplate(...), 'owner': 'ams', 'group': 'hadoop', 'mode': 0600}
2021-03-01 11:33:00,792 - Execute[('chown', '-R', u'ams', '/etc/ambari-metrics-grafana/conf')] {'sudo': True}
2021-03-01 11:33:00,803 - Execute[('chown', '-R', u'ams', u'/var/log/ambari-metrics-grafana')] {'sudo': True}
2021-03-01 11:33:00,815 - Execute[('chown', '-R', u'ams', u'/var/lib/ambari-metrics-grafana')] {'sudo': True}
2021-03-01 11:33:00,826 - Execute[('chown', '-R', u'ams', u'/var/run/ambari-metrics-grafana')] {'sudo': True}
2021-03-01 11:33:00,852 - Directory['/usr/lib/ambari-logsearch-logfeeder/conf'] {'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2021-03-01 11:33:00,854 - Generate Log Feeder config file: /usr/lib/ambari-logsearch-logfeeder/conf/input.config-ambari-metrics.json
2021-03-01 11:33:00,854 - File['/usr/lib/ambari-logsearch-logfeeder/conf/input.config-ambari-metrics.json'] {'content': Template('input.config-ambari-metrics.json.j2'), 'mode': 0644}
2021-03-01 11:33:00,856 - Execute['/usr/sbin/ambari-metrics-grafana start'] {'not_if': "ambari-sudo.sh su ams -l -s /bin/bash -c 'test -f /var/run/ambari-metrics-grafana/grafana-server.pid && ps -p cat /var/run/ambari-metrics-grafana/grafana-server.pid'", 'user': 'ams'}
2021-03-01 11:33:01,069 - Skipping stack-select on AMBARI_METRICS because it does not exist in the stack-select package structure.

Command failed after 1 tries
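Exit code 127 together with "No such file or directory" means the shell never found /usr/sbin/ambari-metrics-grafana, i.e. the ambari-metrics-grafana package did not actually land on this host. A small pre-flight sketch makes that explicit; the paths checked are taken from the log, and any others you add are your own assumptions:

```python
import os

def missing_binaries(paths):
    # Return the subset of expected service binaries absent on this host.
    return [p for p in paths if not os.path.exists(p)]

# Run on the failing host before retrying the service start:
# missing_binaries(["/usr/sbin/ambari-metrics-grafana"])
```

A non-empty result points at a packaging/repo problem rather than a Grafana configuration problem.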

Ambari - Ranger
stderr:
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/RANGER/package/scripts/ranger_admin.py", line 258, in <module>
    RangerAdmin().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/RANGER/package/scripts/ranger_admin.py", line 117, in start
    solr_cloud_util.setup_solr_client(params.config, custom_log4j = params.custom_log4j)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/solr_cloud_util.py", line 249, in setup_solr_client
    content=StaticFile(solrCliFilename)
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 123, in action_create
    content = self._get_content()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 160, in _get_content
    return content()
  File "/usr/lib/ambari-agent/lib/resource_management/core/source.py", line 52, in __call__
    return self.get_content()
  File "/usr/lib/ambari-agent/lib/resource_management/core/source.py", line 76, in get_content
    raise Fail("{0} Source file {1} is not found".format(repr(self), path))
resource_management.core.exceptions.Fail: StaticFile('/usr/lib/ambari-infra-solr-client/solrCloudCli.sh') Source file /usr/lib/ambari-infra-solr-client/solrCloudCli.sh is not found
stdout:
2021-03-01 11:28:41,349 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=3.1.4.0-315 -> 3.1.4.0-315
2021-03-01 11:28:41,387 - Using hadoop conf dir: /usr/hdp/3.1.4.0-315/hadoop/conf
2021-03-01 11:28:41,723 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=3.1.4.0-315 -> 3.1.4.0-315
2021-03-01 11:28:41,733 - Using hadoop conf dir: /usr/hdp/3.1.4.0-315/hadoop/conf
2021-03-01 11:28:41,736 - Group['livy'] {}
2021-03-01 11:28:41,738 - Group['spark'] {}
2021-03-01 11:28:41,739 - Group['ranger'] {}
2021-03-01 11:28:41,739 - Group['hdfs'] {}
2021-03-01 11:28:41,740 - Group['zeppelin'] {}
2021-03-01 11:28:41,740 - Group['hadoop'] {}
2021-03-01 11:28:41,740 - Group['users'] {}
2021-03-01 11:28:41,741 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-03-01 11:28:41,743 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-03-01 11:28:41,745 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-03-01 11:28:41,747 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-03-01 11:28:41,749 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-03-01 11:28:41,751 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger', 'hadoop'], 'uid': None}
2021-03-01 11:28:41,752 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2021-03-01 11:28:41,754 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2021-03-01 11:28:41,756 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2021-03-01 11:28:41,758 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2021-03-01 11:28:41,759 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2021-03-01 11:28:41,761 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-03-01 11:28:41,763 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2021-03-01 11:28:41,765 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-03-01 11:28:41,766 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-03-01 11:28:41,768 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-03-01 11:28:41,770 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-03-01 11:28:41,771 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2021-03-01 11:28:41,774 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2021-03-01 11:28:41,784 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2021-03-01 11:28:41,785 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2021-03-01 11:28:41,787 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2021-03-01 11:28:41,790 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2021-03-01 11:28:41,791 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2021-03-01 11:28:41,806 - call returned (0, '1009')
2021-03-01 11:28:41,807 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1009'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2021-03-01 11:28:41,816 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1009'] due to not_if
2021-03-01 11:28:41,817 - Group['hdfs'] {}
2021-03-01 11:28:41,818 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2021-03-01 11:28:41,819 - FS Type: HDFS
2021-03-01 11:28:41,819 - Directory['/etc/hadoop'] {'mode': 0755}
2021-03-01 11:28:41,850 - File['/usr/hdp/3.1.4.0-315/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2021-03-01 11:28:41,852 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2021-03-01 11:28:41,881 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2021-03-01 11:28:41,893 - Skipping Execute[('setenforce', '0')] due to not_if
2021-03-01 11:28:41,894 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2021-03-01 11:28:41,899 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2021-03-01 11:28:41,900 - Directory['/var/run/hadoop/hdfs'] {'owner': 'hdfs', 'cd_access': 'a'}
2021-03-01 11:28:41,901 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2021-03-01 11:28:41,908 - File['/usr/hdp/3.1.4.0-315/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2021-03-01 11:28:41,911 - File['/usr/hdp/3.1.4.0-315/hadoop/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
2021-03-01 11:28:41,922 - File['/usr/hdp/3.1.4.0-315/hadoop/conf/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2021-03-01 11:28:41,942 - File['/usr/hdp/3.1.4.0-315/hadoop/conf/hadoop-metrics2.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2021-03-01 11:28:41,943 - File['/usr/hdp/3.1.4.0-315/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2021-03-01 11:28:41,945 - File['/usr/hdp/3.1.4.0-315/hadoop/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2021-03-01 11:28:41,952 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop', 'mode': 0644}
2021-03-01 11:28:41,959 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2021-03-01 11:28:41,966 - Skipping unlimited key JCE policy check and setup since it is not required
2021-03-01 11:28:42,439 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=3.1.4.0-315 -> 3.1.4.0-315
2021-03-01 11:28:42,555 - File['/var/lib/ambari-agent/tmp/postgresql-42.2.19.jar'] {'content': DownloadSource('http://ambari.server:8080/resources/postgresql-42.2.19.jar'), 'mode': 0644}
2021-03-01 11:28:42,568 - Not downloading the file from http://ambari.server:8080/resources/postgresql-42.2.19.jar, because /var/lib/ambari-agent/tmp/postgresql-42.2.19.jar already exists
2021-03-01 11:28:42,585 - Execute[('cp', '--remove-destination', u'/var/lib/ambari-agent/tmp/postgresql-42.2.19.jar', u'/usr/hdp/current/ranger-admin/ews/lib')] {'path': ['/bin', '/usr/bin/'], 'sudo': True}
2021-03-01 11:28:42,599 - File['/usr/hdp/current/ranger-admin/ews/lib/postgresql-42.2.19.jar'] {'mode': 0644}
2021-03-01 11:28:42,600 - ModifyPropertiesFile['/usr/hdp/current/ranger-admin/install.properties'] {'owner': 'ranger', 'properties': ...}
2021-03-01 11:28:42,627 - Modifying existing properties file: /usr/hdp/current/ranger-admin/install.properties
2021-03-01 11:28:42,646 - File['/usr/hdp/current/ranger-admin/install.properties'] {'owner': 'ranger', 'content': ..., 'group': None, 'mode': None, 'encoding': 'utf-8'}
2021-03-01 11:28:42,647 - ModifyPropertiesFile['/usr/hdp/current/ranger-admin/install.properties'] {'owner': 'ranger', 'properties': {'SQL_CONNECTOR_JAR': u'/usr/hdp/current/ranger-admin/ews/lib/postgresql-42.2.19.jar'}}
2021-03-01 11:28:42,647 - Modifying existing properties file: /usr/hdp/current/ranger-admin/install.properties
2021-03-01 11:28:42,649 - File['/usr/hdp/current/ranger-admin/install.properties'] {'owner': 'ranger', 'content': ..., 'group': None, 'mode': None, 'encoding': 'utf-8'}
2021-03-01 11:28:42,650 - ModifyPropertiesFile['/usr/hdp/current/ranger-admin/install.properties'] {'owner': 'ranger', 'properties': {'audit_store': u'solr'}}
2021-03-01 11:28:42,651 - Modifying existing properties file: /usr/hdp/current/ranger-admin/install.properties
2021-03-01 11:28:42,653 - File['/usr/hdp/current/ranger-admin/install.properties'] {'owner': 'ranger', 'content': ..., 'group': None, 'mode': None, 'encoding': 'utf-8'}
2021-03-01 11:28:42,654 - ModifyPropertiesFile['/usr/hdp/current/ranger-admin/install.properties'] {'owner': 'ranger', 'properties': {'ranger_admin_max_heap_size': u'1g'}}
2021-03-01 11:28:42,654 - Modifying existing properties file: /usr/hdp/current/ranger-admin/install.properties
2021-03-01 11:28:42,656 - File['/usr/hdp/current/ranger-admin/install.properties'] {'owner': 'ranger', 'content': ..., 'group': None, 'mode': None, 'encoding': 'utf-8'}
2021-03-01 11:28:42,657 - Separate DBA property not set. Assuming Ranger DB and DB User exists!
2021-03-01 11:28:42,657 - Execute['ambari-python-wrap /usr/hdp/current/ranger-admin/db_setup.py'] {'logoutput': True, 'environment': {'RANGER_ADMIN_HOME': u'/usr/hdp/current/ranger-admin', 'JAVA_HOME': u'/usr/jdk64/jdk1.8.0_112'}, 'user': 'ranger'}
2021-03-01 11:28:46,000 [I] DB FLAVOR :POSTGRES
2021-03-01 11:28:46,001 [I] --------- Verifying Ranger DB connection ---------
2021-03-01 11:28:46,001 [I] Checking connection..
2021-03-01 11:28:46,002 [JISQL] /usr/jdk64/jdk1.8.0_112/bin/java -cp /usr/hdp/current/ranger-admin/ews/lib/postgresql-42.2.19.jar:/usr/hdp/current/ranger-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://172.16.21.18/ranger -u rangeradmin -p '' -noheader -trim -c ; -query "select 1;"
2021-03-01 11:28:46,555 [I] Checking connection passed.
2021-03-01 11:28:46,555 [I] --------- Verifying version history table ---------
2021-03-01 11:28:46,555 [JISQL] /usr/jdk64/jdk1.8.0_112/bin/java -cp /usr/hdp/current/ranger-admin/ews/lib/postgresql-42.2.19.jar:/usr/hdp/current/ranger-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://172.16.21.18/ranger -u rangeradmin -p '' -noheader -trim -c ; -query "select * from (select table_name from information_schema.tables where table_catalog='ranger' and table_name = 'x_db_version_h') as temp;"
2021-03-01 11:28:47,294 [I] Table x_db_version_h already exists in database 'ranger'
2021-03-01 11:28:47,294 [I] --------- Importing Ranger Core DB Schema ---------
2021-03-01 11:28:47,306 [JISQL] /usr/jdk64/jdk1.8.0_112/bin/java -cp /usr/hdp/current/ranger-admin/ews/lib/postgresql-42.2.19.jar:/usr/hdp/current/ranger-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://172.16.21.18/ranger -u rangeradmin -p '' -noheader -trim -c ; -query "select version from x_db_version_h where version = 'CORE_DB_SCHEMA' and active = 'Y';"
2021-03-01 11:28:47,847 [I] CORE_DB_SCHEMA is already imported
2021-03-01 11:28:47,848 [JISQL] /usr/jdk64/jdk1.8.0_112/bin/java -cp /usr/hdp/current/ranger-admin/ews/lib/postgresql-42.2.19.jar:/usr/hdp/current/ranger-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://172.16.21.18/ranger -u rangeradmin -p '' -noheader -trim -c ; -query "select version from x_db_version_h where version = 'DB_PATCHES' and inst_by = 'Ranger 1.2.0.3.1.4.0-315' and active = 'Y';"
2021-03-01 11:28:48,341 [I] DB_PATCHES have already been applied
2021-03-01 11:28:48,353 - Directory['/usr/hdp/current/ranger-admin/conf'] {'owner': 'ranger', 'group': 'ranger', 'create_parents': True}
2021-03-01 11:28:48,355 - File['/var/lib/ambari-agent/tmp/postgresql-42.2.19.jar'] {'content': DownloadSource('http://ambari.server:8080/resources/postgresql-42.2.19.jar'), 'mode': 0644}
2021-03-01 11:28:48,355 - Not downloading the file from http://ambari.server:8080/resources/postgresql-42.2.19.jar, because /var/lib/ambari-agent/tmp/postgresql-42.2.19.jar already exists
2021-03-01 11:28:48,359 - Execute[('cp', '--remove-destination', u'/var/lib/ambari-agent/tmp/postgresql-42.2.19.jar', u'/usr/hdp/current/ranger-admin/ews/lib')] {'path': ['/bin', '/usr/bin/'], 'sudo': True}
2021-03-01 11:28:48,375 - File['/usr/hdp/current/ranger-admin/ews/lib/postgresql-42.2.19.jar'] {'mode': 0644}
2021-03-01 11:28:48,376 - ModifyPropertiesFile['/usr/hdp/current/ranger-admin/install.properties'] {'owner': 'ranger', 'properties': ...}
2021-03-01 11:28:48,377 - Modifying existing properties file: /usr/hdp/current/ranger-admin/install.properties
2021-03-01 11:28:48,393 - File['/usr/hdp/current/ranger-admin/install.properties'] {'owner': 'ranger', 'content': ..., 'group': None, 'mode': None, 'encoding': 'utf-8'}
2021-03-01 11:28:48,394 - ModifyPropertiesFile['/usr/hdp/current/ranger-admin/install.properties'] {'owner': 'ranger', 'properties': {'SQL_CONNECTOR_JAR': u'/usr/hdp/current/ranger-admin/ews/lib/postgresql-42.2.19.jar'}}
2021-03-01 11:28:48,395 - Modifying existing properties file: /usr/hdp/current/ranger-admin/install.properties
2021-03-01 11:28:48,397 - File['/usr/hdp/current/ranger-admin/install.properties'] {'owner': 'ranger', 'content': ..., 'group': None, 'mode': None, 'encoding': 'utf-8'}
2021-03-01 11:28:48,398 - File['/usr/lib/ambari-agent/DBConnectionVerification.jar'] {'content': DownloadSource('http://ambari.server:8080/resources/DBConnectionVerification.jar'), 'mode': 0644}
2021-03-01 11:28:48,404 - Not downloading the file from http://ambari.server:8080/resources/DBConnectionVerification.jar, because /var/lib/ambari-agent/tmp/DBConnectionVerification.jar already exists
2021-03-01 11:28:48,458 - Directory['/usr/lib/ambari-logsearch-logfeeder/conf'] {'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2021-03-01 11:28:48,459 - Generate Log Feeder config file: /usr/lib/ambari-logsearch-logfeeder/conf/input.config-ranger.json
2021-03-01 11:28:48,459 - File['/usr/lib/ambari-logsearch-logfeeder/conf/input.config-ranger.json'] {'content': Template('input.config-ranger.json.j2'), 'mode': 0644}
2021-03-01 11:28:48,465 - Execute['/usr/jdk64/jdk1.8.0_112/bin/java -cp /usr/lib/ambari-agent/DBConnectionVerification.jar:/usr/hdp/current/ranger-admin/ews/lib/postgresql-42.2.19.jar:/usr/hdp/current/ranger-admin/ews/lib/* org.apache.ambari.server.DBConnectionVerification 'jdbc:postgresql://172.16.21.18:5432/ranger' rangeradmin [PROTECTED] org.postgresql.Driver'] {'environment': {}, 'path': ['/usr/sbin:/sbin:/usr/local/bin:/bin:/usr/bin'], 'tries': 5, 'try_sleep': 10}
2021-03-01 11:28:49,725 - Execute[('ln', '-sf', u'/usr/hdp/current/ranger-admin/ews/webapp/WEB-INF/classes/conf', u'/usr/hdp/current/ranger-admin/conf')] {'not_if': 'ls /usr/hdp/current/ranger-admin/conf', 'sudo': True, 'only_if': 'ls /usr/hdp/current/ranger-admin/ews/webapp/WEB-INF/classes/conf'}
2021-03-01 11:28:49,739 - Skipping Execute[('ln', '-sf', u'/usr/hdp/current/ranger-admin/ews/webapp/WEB-INF/classes/conf', u'/usr/hdp/current/ranger-admin/conf')] due to not_if
2021-03-01 11:28:49,740 - Directory['/usr/hdp/current/ranger-admin/'] {'owner': 'ranger', 'group': 'ranger', 'recursive_ownership': True}
2021-03-01 11:28:50,451 - Directory['/var/run/ranger'] {'owner': 'ranger', 'group': 'hadoop', 'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2021-03-01 11:28:50,452 - Directory['/var/log/ranger/admin'] {'owner': 'ranger', 'group': 'ranger', 'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2021-03-01 11:28:50,453 - File['/usr/hdp/current/ranger-admin/conf/ranger-admin-default-site.xml'] {'owner': 'ranger', 'group': 'ranger'}
2021-03-01 11:28:50,454 - File['/usr/hdp/current/ranger-admin/conf/security-applicationContext.xml'] {'owner': 'ranger', 'group': 'ranger'}
2021-03-01 11:28:50,455 - Execute[('ln', '-sf', u'/usr/hdp/current/ranger-admin/ews/ranger-admin-services.sh', '/usr/bin/ranger-admin')] {'not_if': 'ls /usr/bin/ranger-admin', 'sudo': True, 'only_if': 'ls /usr/hdp/current/ranger-admin/ews/ranger-admin-services.sh'}
2021-03-01 11:28:50,464 - Skipping Execute[('ln', '-sf', u'/usr/hdp/current/ranger-admin/ews/ranger-admin-services.sh', '/usr/bin/ranger-admin')] due to not_if
2021-03-01 11:28:50,465 - XmlConfig['ranger-admin-site.xml'] {'group': 'ranger', 'conf_dir': '/usr/hdp/current/ranger-admin/conf', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'ranger', 'configurations': ...}
2021-03-01 11:28:50,484 - Generating config: /usr/hdp/current/ranger-admin/conf/ranger-admin-site.xml
2021-03-01 11:28:50,485 - File['/usr/hdp/current/ranger-admin/conf/ranger-admin-site.xml'] {'owner': 'ranger', 'content': InlineTemplate(...), 'group': 'ranger', 'mode': 0644, 'encoding': 'UTF-8'}
2021-03-01 11:28:50,607 - Directory['/usr/hdp/current/ranger-admin/conf/ranger_jaas'] {'owner': 'ranger', 'group': 'ranger', 'mode': 0700}
2021-03-01 11:28:50,613 - File['/usr/hdp/current/ranger-admin/ews/webapp/WEB-INF/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'ranger', 'group': 'ranger', 'mode': 0644}
2021-03-01 11:28:50,616 - Execute[(u'/usr/jdk64/jdk1.8.0_112/bin/java', '-cp', u'/usr/hdp/current/ranger-admin/cred/lib/*', 'org.apache.ranger.credentialapi.buildks', 'create', u'rangeradmin', '-value', [PROTECTED], '-provider', u'jceks://file/etc/ranger/admin/rangeradmin.jceks')] {'logoutput': True, 'environment': {'JAVA_HOME': u'/usr/jdk64/jdk1.8.0_112'}, 'sudo': True}
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
The alias rangeradmin already exists!! Will try to delete first.
FOUND value of [interactive] field in the Class [org.apache.hadoop.security.alias.CredentialShell] = [true]
Deleting credential: rangeradmin from CredentialProvider: jceks://file/etc/ranger/admin/rangeradmin.jceks
Credential rangeradmin has been successfully deleted.
Provider jceks://file/etc/ranger/admin/rangeradmin.jceks was updated.
WARNING: You have accepted the use of the default provider password
by not configuring a password in one of the two following locations:
* In the environment variable HADOOP_CREDSTORE_PASSWORD
* In a file referred to by the configuration entry
hadoop.security.credstore.java-keystore-provider.password-file.
Please review the documentation regarding provider passwords in
the keystore passwords section of the Credential Provider API
Continuing with the default provider password.

rangeradmin has been successfully created.
Provider jceks://file/etc/ranger/admin/rangeradmin.jceks was updated.
2021-03-01 11:28:52,569 - Execute[(u'/usr/jdk64/jdk1.8.0_112/bin/java', '-cp', u'/usr/hdp/current/ranger-admin/cred/lib/*', 'org.apache.ranger.credentialapi.buildks', 'create', u'trustStoreAlias', '-value', [PROTECTED], '-provider', u'jceks://file/etc/ranger/admin/rangeradmin.jceks')] {'logoutput': True, 'environment': {'JAVA_HOME': u'/usr/jdk64/jdk1.8.0_112'}, 'sudo': True}
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
The alias trustStoreAlias already exists!! Will try to delete first.
FOUND value of [interactive] field in the Class [org.apache.hadoop.security.alias.CredentialShell] = [true]
Deleting credential: trustStoreAlias from CredentialProvider: jceks://file/etc/ranger/admin/rangeradmin.jceks
Credential trustStoreAlias has been successfully deleted.
Provider jceks://file/etc/ranger/admin/rangeradmin.jceks was updated.
WARNING: You have accepted the use of the default provider password
by not configuring a password in one of the two following locations:
* In the environment variable HADOOP_CREDSTORE_PASSWORD
* In a file referred to by the configuration entry
hadoop.security.credstore.java-keystore-provider.password-file.
Please review the documentation regarding provider passwords in
the keystore passwords section of the Credential Provider API
Continuing with the default provider password.

trustStoreAlias has been successfully created.
Provider jceks://file/etc/ranger/admin/rangeradmin.jceks was updated.
2021-03-01 11:28:53,839 - File['/etc/ranger/admin/rangeradmin.jceks'] {'owner': 'ranger', 'only_if': 'test -e /etc/ranger/admin/rangeradmin.jceks', 'group': 'ranger', 'mode': 0640}
2021-03-01 11:28:53,847 - File['/etc/ranger/admin/.rangeradmin.jceks.crc'] {'owner': 'ranger', 'only_if': 'test -e /etc/ranger/admin/.rangeradmin.jceks.crc', 'group': 'ranger', 'mode': 0640}
2021-03-01 11:28:53,853 - XmlConfig['core-site.xml'] {'group': 'ranger', 'conf_dir': '/usr/hdp/current/ranger-admin/conf', 'mode': 0644, 'configuration_attributes': {u'final': {u'fs.defaultFS': u'true'}}, 'owner': 'ranger', 'configurations': ...}
2021-03-01 11:28:53,872 - Generating config: /usr/hdp/current/ranger-admin/conf/core-site.xml
2021-03-01 11:28:53,873 - File['/usr/hdp/current/ranger-admin/conf/core-site.xml'] {'owner': 'ranger', 'content': InlineTemplate(...), 'group': 'ranger', 'mode': 0644, 'encoding': 'UTF-8'}
2021-03-01 11:28:53,970 - File['/usr/hdp/current/ranger-admin/conf/ranger-admin-env.sh'] {'owner': 'ranger', 'content': InlineTemplate(...), 'group': 'ranger', 'mode': 0755}
2021-03-01 11:28:53,977 - Execute['ambari-python-wrap /usr/hdp/current/ranger-admin/db_setup.py -javapatch'] {'logoutput': True, 'environment': {'RANGER_ADMIN_HOME': u'/usr/hdp/current/ranger-admin', 'JAVA_HOME': u'/usr/jdk64/jdk1.8.0_112'}, 'user': 'ranger'}
2021-03-01 11:28:54,452 [I] DB FLAVOR :POSTGRES
2021-03-01 11:28:54,453 [I] --------- Verifying Ranger DB connection ---------
2021-03-01 11:28:54,453 [I] Checking connection..
2021-03-01 11:28:54,453 [JISQL] /usr/jdk64/jdk1.8.0_112/bin/java -cp /usr/hdp/current/ranger-admin/ews/lib/postgresql-42.2.19.jar:/usr/hdp/current/ranger-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://172.16.21.18/ranger -u rangeradmin -p '' -noheader -trim -c ; -query "select 1;"
2021-03-01 11:28:54,929 [I] Checking connection passed.
2021-03-01 11:28:54,929 [JISQL] /usr/jdk64/jdk1.8.0_112/bin/java -cp /usr/hdp/current/ranger-admin/ews/lib/postgresql-42.2.19.jar:/usr/hdp/current/ranger-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://172.16.21.18/ranger -u rangeradmin -p '' -noheader -trim -c ; -query "select version from x_db_version_h where version = 'JAVA_PATCHES' and inst_by = 'Ranger 1.2.0.3.1.4.0-315' and active = 'Y';"
2021-03-01 11:28:55,364 [I] JAVA_PATCHES have already been applied
2021-03-01 11:28:55,378 - Execute['ambari-python-wrap /usr/hdp/current/ranger-admin/db_setup.py -changepassword -pair admin [PROTECTED] [PROTECTED] -pair rangerusersync [PROTECTED] [PROTECTED] -pair rangertagsync [PROTECTED] [PROTECTED] -pair keyadmin [PROTECTED] [PROTECTED] '] {'logoutput': True, 'environment': {'RANGER_ADMIN_HOME': u'/usr/hdp/current/ranger-admin', 'JAVA_HOME': u'/usr/jdk64/jdk1.8.0_112'}, 'tries': 3, 'user': 'ranger', 'try_sleep': 5}
2021-03-01 11:28:55,861 [I] DB FLAVOR :POSTGRES
2021-03-01 11:28:55,861 [I] --------- Verifying Ranger DB connection ---------
2021-03-01 11:28:55,862 [I] Checking connection..
2021-03-01 11:28:55,862 [JISQL] /usr/jdk64/jdk1.8.0_112/bin/java -cp /usr/hdp/current/ranger-admin/ews/lib/postgresql-42.2.19.jar:/usr/hdp/current/ranger-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://172.16.21.18/ranger -u rangeradmin -p '' -noheader -trim -c ; -query "select 1;"
2021-03-01 11:28:56,298 [I] Checking connection passed.
2021-03-01 11:28:56,299 [JISQL] /usr/jdk64/jdk1.8.0_112/bin/java -cp /usr/hdp/current/ranger-admin/ews/lib/postgresql-42.2.19.jar:/usr/hdp/current/ranger-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://172.16.21.18/ranger -u rangeradmin -p '' -noheader -trim -c ; -query "select version from x_db_version_h where version = 'DEFAULT_ALL_ADMIN_UPDATE' and active = 'Y';"
2021-03-01 11:28:56,759 [I] Ranger all admins default password has already been changed!!
2021-03-01 11:28:56,772 - Directory['/var/log/ambari-infra-solr-client'] {'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2021-03-01 11:28:56,774 - Directory['/usr/lib/ambari-infra-solr-client'] {'recursive_ownership': True, 'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2021-03-01 11:28:56,775 - File['/usr/lib/ambari-infra-solr-client/solrCloudCli.sh'] {'content': StaticFile('/usr/lib/ambari-infra-solr-client/solrCloudCli.sh'), 'mode': 0755}

Command failed after 1 tries
