Friday, December 29, 2017

Pointing the Ranger Solr Plugin's audit destination at Ambari Infra on a Kerberized HDP cluster (Unofficial)

Prerequisites:

Solr and the Ranger Solr Plugin are installed
mPack 2.2.9 and HDP 2.6.2 or later are recommended (RANGER-1446, RANGER-1658)
HDFS audit logs are confirmed to be written to Ambari Infra correctly


1) Copy /etc/hadoop/conf/ranger-hdfs-audit.xml from the NameNode host to the following file on the Solr host:
/opt/lucidworks-hdpsearch/solr/server/solr-webapp/webapp/WEB-INF/classes/ranger-solr-audit.xml

2) Edit ranger-solr-audit.xml as below, then restart Solr from Ambari

xasecure.audit.destination.solr.batch.filespool.dir = /var/log/solr/audit/solr/spool 
xasecure.audit.jaas.Client.option.keyTab = /etc/security/keytabs/solr.service.keytab 
xasecure.audit.jaas.Client.option.principal = solr/_HOST@YOUR_PRINCIPAL 
xasecure.audit.solr.solr_url = (empty value) 

3) Edit /var/lib/ambari-agent/cache/common-services/RANGER/0.4.0/package/scripts/setup_ranger_xml.py (add the ('solr', 'solr') entry at the end of the list):

service_default_principals_map = [('hdfs', 'nn'), ('hbase', 'hbase'), ('hive', 'hive'), ('kafka', 'kafka'), ('kms', 'rangerkms'), 
('knox', 'knox'), ('nifi', 'nifi'), ('storm', 'storm'), ('yarn', 'yarn'), ('solr', 'solr')]

Because this modifies the Ambari Agent's cache copy, the same file must also be changed on the Ambari Server side.
Delete any .pyc or .pyo files if they exist.
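
The edited map can be sketched as follows. This is a simplified stand-in for the structure in setup_ranger_xml.py, and default_principal is a hypothetical helper added purely for illustration, not Ambari code:

```python
# Simplified sketch of the map in setup_ranger_xml.py after the edit;
# ('solr', 'solr') is the entry being added, the rest mirrors the stock list.
service_default_principals_map = [
    ('hdfs', 'nn'), ('hbase', 'hbase'), ('hive', 'hive'), ('kafka', 'kafka'),
    ('kms', 'rangerkms'), ('knox', 'knox'), ('nifi', 'nifi'),
    ('storm', 'storm'), ('yarn', 'yarn'),
    ('solr', 'solr'),  # added so the solr user is also registered
]

# Hypothetical helper (illustration only): look up the default principal
# short name for a given service.
def default_principal(service):
    return dict(service_default_principals_map).get(service)

print(default_principal('solr'))  # → solr
```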

4) Restart Ranger Admin from Ambari (this adds the users above to the Solr role, so the 403 errors stop)

Ambari 2.6.0 may not work under a Japanese locale

Starting with Ambari 2.6.0, yumrpm.py changed significantly.
It appears to run "yum list available" and "yum list installed" when installing or starting a service.
If those commands produce non-standard output, the service install or start may fail.
By "non-standard output" I mean that with Red Hat Satellite or certain yum plugins, "yum list xxxx" prints a few extra lines before and after the package list.
The Satellite case is reportedly fixed in Ambari 2.6.2.

So the (potential) problem is that some code assumes English output, as in the following line:

    return self._lookup_packages(cmd, 'Available Packages')

In a Japanese locale, yum's output is also in Japanese, so it never matches the string above. In that case _lookup_packages() falls back to ignoring the first three lines, which works in some situations and fails in others, depending on how many preamble lines the output actually has.
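
As a rough illustration, the header-match-or-skip-three-lines behavior looks like this (a simplified stand-in, not Ambari's actual _lookup_packages() code):

```python
# Simplified stand-in for the header-match / skip-preamble logic in
# Ambari's yum output parsing (not the real yumrpm.py code).
def parse_packages(yum_output, header='Installed Packages'):
    lines = yum_output.splitlines()
    if header in lines:
        # English locale: the section header is found, parsing is exact.
        body = lines[lines.index(header) + 1:]
    else:
        # Non-English locale: fall back to skipping 3 assumed preamble lines.
        body = lines[3:]
    return [ln.split()[0] for ln in body if ln.strip()]

# Japanese locale, plain yum: exactly 3 preamble lines, the fallback works.
ja_plain = ("読み込んだプラグイン:fastestmirror\n"
            "Loading mirror speeds from cached hostfile\n"
            "インストール済みパッケージ\n"
            "foo.x86_64  1.0  @base")
print(parse_packages(ja_plain))      # → ['foo.x86_64']

# Japanese locale with an extra plugin banner (e.g. Satellite): 4 preamble
# lines, so the section header leaks into the package list.
ja_satellite = ("読み込んだプラグイン:fastestmirror, rhnplugin\n"
                "This system is receiving updates from RHN Classic or Red Hat Satellite.\n"
                "Loading mirror speeds from cached hostfile\n"
                "インストール済みパッケージ\n"
                "foo.x86_64  1.0  @base")
print(parse_packages(ja_satellite))  # → ['インストール済みパッケージ', 'foo.x86_64']
```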

A simple workaround is to add "export LANG=C" to /var/lib/ambari-agent/bin/ambari-agent.
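
The effect of the workaround can be seen with any locale-sensitive command; this quick demo (not part of the Ambari setup itself) forces the C locale on a child process, just as "export LANG=C" would do for the agent and the yum commands it spawns:

```python
import os
import subprocess

# Run a locale-sensitive command with the C locale forced, so its output
# is in English regardless of the system-wide locale settings.
env = dict(os.environ, LANG='C', LC_ALL='C')
weekday = subprocess.check_output(['date', '+%A'], env=env).decode().strip()
print(weekday)  # an English weekday name even on a ja_JP system
```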


Notes:

[root@sandbox-hdp ~]# cat /etc/sysconfig/i18n
LANG="en_US.UTF-8"
SYSFONT="latarcyrheb-sun16"

Now switch it to Japanese:
[root@sandbox-hdp ~]# cat /etc/sysconfig/i18n
LANG="ja_JP.utf8"
SYSFONT="latarcyrheb-sun16"

After logging out and back in, or:
[root@sandbox-hdp ~]# . /etc/sysconfig/i18n
[root@sandbox-hdp ~]# locale
LANG=ja_JP.utf8
LC_CTYPE="ja_JP.utf8"
LC_NUMERIC="ja_JP.utf8"
LC_TIME="ja_JP.utf8"
LC_COLLATE="ja_JP.utf8"
LC_MONETARY="ja_JP.utf8"
LC_MESSAGES="ja_JP.utf8"
LC_PAPER="ja_JP.utf8"
LC_NAME="ja_JP.utf8"
LC_ADDRESS="ja_JP.utf8"
LC_TELEPHONE="ja_JP.utf8"
LC_MEASUREMENT="ja_JP.utf8"
LC_IDENTIFICATION="ja_JP.utf8"
LC_ALL=

TODO: On CentOS 7, would it be "localectl set-locale LANG=ja_JP.utf8;export LC_CTYPE=ja_JP.UTF-8"?

[root@sandbox-hdp ~]# yum list installed | head
読み込んだプラグイン:fastestmirror, ovl, priorities
インストール済みパッケージ
ConsoleKit.x86_64                       0.4.1-6.el6              @base
ConsoleKit-libs.x86_64                  0.4.1-6.el6              @base
GConf2.x86_64                           2.28.0-7.el6             @base
MAKEDEV.x86_64                          3.24-6.el6               @CentOS/6.8
ORBit2.x86_64                           2.14.17-6.el6_8          @base
PyQt4.x86_64                            4.6.2-9.el6              @base
R.x86_64                                3.4.1-1.el6              @epel
R-core.x86_64                           3.4.1-1.el6              @epel

TODO: possibly another workaround? (unclear what side effects it would have)
repositories.legacy-override.enabled=true

Friday, December 15, 2017

TODO: Sandbox HDP 2.6.1: Ambari Infra fails to start

Starting Ambari Infra right after creating the Sandbox fails with the following error:
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/AMBARI_INFRA/0.1.0/package/scripts/infra_solr.py", line 123, in <module>
    InfraSolr().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 329, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/AMBARI_INFRA/0.1.0/package/scripts/infra_solr.py", line 46, in start
    self.configure(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 119, in locking_configure
    original_configure(obj, *args, **kw)
  File "/var/lib/ambari-agent/cache/common-services/AMBARI_INFRA/0.1.0/package/scripts/infra_solr.py", line 41, in configure
    setup_infra_solr(name = 'server')
  File "/var/lib/ambari-agent/cache/common-services/AMBARI_INFRA/0.1.0/package/scripts/setup_infra_solr.py", line 118, in setup_infra_solr
    security_json_location=security_json_file_location
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/solr_cloud_util.py", line 159, in setup_kerberos_plugin
    Execute(setup_kerberos_plugin_cmd)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'ambari-sudo.sh JAVA_HOME=/usr/lib/jvm/java /usr/lib/ambari-infra-solr-client/solrCloudCli.sh --zookeeper-connect-string sandbox.hortonworks.com:2181 --znode /infra-solr --setup-kerberos-plugin' returned 1. Using default ZkCredentialsProvider
Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
Client environment:host.name=sandbox.hortonworks.com
Client environment:java.version=1.8.0_141
Client environment:java.vendor=Oracle Corporation
Client environment:java.home=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.141-2.b16.el6_9.x86_64/jre
Client environment:java.class.path=/usr/lib/ambari-infra-solr-client:/usr/lib/ambari-infra-solr-client/libs/log4j-1.2.17.jar:/usr/lib/ambari-infra-solr-client/libs/junit-4.10.jar:/usr/lib/ambari-infra-solr-client/libs/commons-cli-1.3.1.jar:/usr/lib/ambari-infra-solr-client/libs/noggit-0.6.jar:/usr/lib/ambari-infra-solr-client/libs/jackson-core-asl-1.9.9.jar:/usr/lib/ambari-infra-solr-client/libs/stax2-api-3.1.4.jar:/usr/lib/ambari-infra-solr-client/libs/jcl-over-slf4j-1.7.7.jar:/usr/lib/ambari-infra-solr-client/libs/tools-1.7.0.jar:/usr/lib/ambari-infra-solr-client/libs/slf4j-api-1.7.2.jar:/usr/lib/ambari-infra-solr-client/libs/solr-solrj-5.5.2.jar:/usr/lib/ambari-infra-solr-client/libs/guava-16.0.jar:/usr/lib/ambari-infra-solr-client/libs/commons-io-2.1.jar:/usr/lib/ambari-infra-solr-client/libs/commons-collections-3.2.2.jar:/usr/lib/ambari-infra-solr-client/libs/httpmime-4.4.1.jar:/usr/lib/ambari-infra-solr-client/libs/easymock-3.4.jar:/usr/lib/ambari-infra-solr-client/libs/utility-1.0.0.0-SNAPSHOT.jar:/usr/lib/ambari-infra-solr-client/libs/objenesis-2.2.jar:/usr/lib/ambari-infra-solr-client/libs/zookeeper-3.4.6.jar:/usr/lib/ambari-infra-solr-client/libs/antlr-2.7.7.jar:/usr/lib/ambari-infra-solr-client/libs/commons-lang-2.5.jar:/usr/lib/ambari-infra-solr-client/libs/antlr4-runtime-4.5.3.jar:/usr/lib/ambari-infra-solr-client/libs/slf4j-log4j12-1.7.2.jar:/usr/lib/ambari-infra-solr-client/libs/httpclient-4.4.1.jar:/usr/lib/ambari-infra-solr-client/libs/commons-beanutils-1.9.2.jar:/usr/lib/ambari-infra-solr-client/libs/httpcore-4.4.1.jar:/usr/lib/ambari-infra-solr-client/libs/commons-logging-1.1.1.jar:/usr/lib/ambari-infra-solr-client/libs/woodstox-core-asl-4.4.1.jar:/usr/lib/ambari-infra-solr-client/libs/commons-codec-1.8.jar:/usr/lib/ambari-infra-solr-client/libs/checkstyle-6.19.jar:/usr/lib/ambari-infra-solr-client/libs/jackson-mapper-asl-1.9.13.jar:/usr/lib/ambari-infra-solr-client/libs/hamcrest-core-1.1.jar:/usr/lib/ambari-infra-solr-client/libs/ambari-logsearch
-solr-client-2.5.1.0.159.jar
Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
Client environment:java.io.tmpdir=/tmp
Client environment:java.compiler=<NA>
Client environment:os.name=Linux
Client environment:os.arch=amd64
Client environment:os.version=3.13.0-86-generic
Client environment:user.name=root
Client environment:user.home=/root
Client environment:user.dir=/var/lib/ambari-agent
Initiating client connection, connectString=sandbox.hortonworks.com:2181 sessionTimeout=15000 watcher=org.apache.solr.common.cloud.SolrZkClient$3@5e91993f
Waiting for client to connect to ZooKeeper
Opening socket connection to server sandbox.hortonworks.com/172.18.0.2:2181. Will not attempt to authenticate using SASL (unknown error)
Socket connection established to sandbox.hortonworks.com/172.18.0.2:2181, initiating session
Session establishment complete on server sandbox.hortonworks.com/172.18.0.2:2181, sessionid = 0x160548c0ea90005, negotiated timeout = 15000
Watcher org.apache.solr.common.cloud.ConnectionManager@350d2264 name:ZooKeeperConnection Watcher:sandbox.hortonworks.com:2181 got event WatchedEvent state:SyncConnected type:None path:null path:null type:None
Client is connected to ZooKeeper
Using default ZkACLProvider
Setup kerberos plugin in security.json
KeeperErrorCode = NoAuth for /infra-solr/security.json
org.apache.zookeeper.KeeperException$NoAuthException: KeeperErrorCode = NoAuth for /infra-solr/security.json
 at org.apache.zookeeper.KeeperException.create(KeeperException.java:113)
 at org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
 at org.apache.zookeeper.ZooKeeper.setData(ZooKeeper.java:1270)
 at org.apache.solr.common.cloud.SolrZkClient$8.execute(SolrZkClient.java:362)
 at org.apache.solr.common.cloud.SolrZkClient$8.execute(SolrZkClient.java:359)
 at org.apache.solr.common.cloud.ZkCmdExecutor.retryOperation(ZkCmdExecutor.java:60)
 at org.apache.solr.common.cloud.SolrZkClient.setData(SolrZkClient.java:359)
 at org.apache.solr.common.cloud.SolrZkClient.setData(SolrZkClient.java:546)
 at org.apache.ambari.logsearch.solr.commands.EnableKerberosPluginSolrZkCommand.putFileContent(EnableKerberosPluginSolrZkCommand.java:63)
 at org.apache.ambari.logsearch.solr.commands.EnableKerberosPluginSolrZkCommand.executeZkCommand(EnableKerberosPluginSolrZkCommand.java:54)
 at org.apache.ambari.logsearch.solr.commands.EnableKerberosPluginSolrZkCommand.executeZkCommand(EnableKerberosPluginSolrZkCommand.java:32)
 at org.apache.ambari.logsearch.solr.commands.AbstractZookeeperRetryCommand.createAndProcessRequest(AbstractZookeeperRetryCommand.java:38)
 at org.apache.ambari.logsearch.solr.commands.AbstractRetryCommand.retry(AbstractRetryCommand.java:45)
 at org.apache.ambari.logsearch.solr.commands.AbstractRetryCommand.run(AbstractRetryCommand.java:40)
 at org.apache.ambari.logsearch.solr.AmbariSolrCloudClient.setupKerberosPlugin(AmbariSolrCloudClient.java:162)
 at org.apache.ambari.logsearch.solr.AmbariSolrCloudCLI.main(AmbariSolrCloudCLI.java:518)
... (snip) ...
Maximum retries exceeded: 5
Return code: 1
stdout:   /var/lib/ambari-agent/data/output-187.txt
2017-12-15 04:05:17,819 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-12-15 04:05:17,973 - Stack Feature Version Info: stack_version=2.6, version=2.6.1.0-129, current_cluster_version=2.6.1.0-129 -> 2.6.1.0-129
2017-12-15 04:05:17,974 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
User Group mapping (user_group) is missing in the hostLevelParams
2017-12-15 04:05:17,975 - Group['livy'] {}
2017-12-15 04:05:17,976 - Group['spark'] {}
2017-12-15 04:05:17,976 - Group['ranger'] {}
2017-12-15 04:05:17,977 - Group['zeppelin'] {}
2017-12-15 04:05:17,977 - Group['hadoop'] {}
2017-12-15 04:05:17,977 - Group['users'] {}
2017-12-15 04:05:17,977 - Group['knox'] {}
2017-12-15 04:05:17,978 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-12-15 04:05:17,978 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-12-15 04:05:17,979 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-12-15 04:05:17,983 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-12-15 04:05:17,985 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-12-15 04:05:17,986 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-12-15 04:05:17,987 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-12-15 04:05:17,988 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger']}
2017-12-15 04:05:17,988 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-12-15 04:05:17,989 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop']}
2017-12-15 04:05:17,990 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-12-15 04:05:17,991 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-12-15 04:05:17,992 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-12-15 04:05:17,992 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-12-15 04:05:17,993 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-12-15 04:05:17,994 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-12-15 04:05:17,995 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-12-15 04:05:17,995 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-12-15 04:05:17,996 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-12-15 04:05:17,997 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-12-15 04:05:17,998 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-12-15 04:05:17,998 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-12-15 04:05:17,999 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-12-15 04:05:18,001 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-12-15 04:05:18,052 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2017-12-15 04:05:18,055 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2017-12-15 04:05:18,056 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-12-15 04:05:18,057 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2017-12-15 04:05:18,106 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2017-12-15 04:05:18,107 - Group['hdfs'] {}
2017-12-15 04:05:18,107 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'hdfs']}
2017-12-15 04:05:18,108 - FS Type: 
2017-12-15 04:05:18,108 - Directory['/etc/hadoop'] {'mode': 0755}
2017-12-15 04:05:18,130 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2017-12-15 04:05:18,131 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2017-12-15 04:05:18,152 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2017-12-15 04:05:18,206 - Skipping Execute[('setenforce', '0')] due to not_if
2017-12-15 04:05:18,207 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2017-12-15 04:05:18,209 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2017-12-15 04:05:18,210 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2017-12-15 04:05:18,214 - File['/usr/hdp/current/hadoop-client/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2017-12-15 04:05:18,217 - File['/usr/hdp/current/hadoop-client/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
2017-12-15 04:05:18,224 - File['/usr/hdp/current/hadoop-client/conf/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2017-12-15 04:05:18,236 - File['/usr/hdp/current/hadoop-client/conf/hadoop-metrics2.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2017-12-15 04:05:18,237 - File['/usr/hdp/current/hadoop-client/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2017-12-15 04:05:18,238 - File['/usr/hdp/current/hadoop-client/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2017-12-15 04:05:18,244 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop'}
2017-12-15 04:05:18,293 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2017-12-15 04:05:18,702 - Directory['/var/log/ambari-infra-solr'] {'owner': 'infra-solr', 'group': 'hadoop', 'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2017-12-15 04:05:18,704 - Directory['/var/run/ambari-infra-solr'] {'owner': 'infra-solr', 'group': 'hadoop', 'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2017-12-15 04:05:18,705 - Directory['/opt/ambari_infra_solr/data'] {'owner': 'infra-solr', 'group': 'hadoop', 'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2017-12-15 04:05:18,706 - Directory['/opt/ambari_infra_solr/data/resources'] {'owner': 'infra-solr', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2017-12-15 04:05:18,707 - Directory['/usr/lib/ambari-infra-solr'] {'group': 'hadoop', 'cd_access': 'a', 'create_parents': True, 'recursive_ownership': True, 'owner': 'infra-solr', 'mode': 0755}
2017-12-15 04:05:18,707 - Changing owner for /usr/lib/ambari-infra-solr from 1025 to infra-solr
2017-12-15 04:05:18,707 - Changing group for /usr/lib/ambari-infra-solr from 1025 to hadoop
2017-12-15 04:05:19,030 - Directory['/etc/ambari-infra-solr/conf'] {'group': 'hadoop', 'cd_access': 'a', 'create_parents': True, 'mode': 0755, 'owner': 'infra-solr', 'recursive_ownership': True}
2017-12-15 04:05:19,031 - File['/var/log/ambari-infra-solr/solr-install.log'] {'content': '', 'owner': 'infra-solr', 'group': 'hadoop', 'mode': 0644}
2017-12-15 04:05:19,031 - Writing File['/var/log/ambari-infra-solr/solr-install.log'] because it doesn't exist
2017-12-15 04:05:19,031 - Changing owner for /var/log/ambari-infra-solr/solr-install.log from 0 to infra-solr
2017-12-15 04:05:19,032 - Changing group for /var/log/ambari-infra-solr/solr-install.log from 0 to hadoop
2017-12-15 04:05:19,045 - File['/etc/ambari-infra-solr/conf/infra-solr-env.sh'] {'owner': 'infra-solr', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0755}
2017-12-15 04:05:19,047 - File['/opt/ambari_infra_solr/data/solr.xml'] {'owner': 'infra-solr', 'content': InlineTemplate(...), 'group': 'hadoop'}
2017-12-15 04:05:19,049 - File['/etc/ambari-infra-solr/conf/log4j.properties'] {'owner': 'infra-solr', 'content': InlineTemplate(...), 'group': 'hadoop'}
2017-12-15 04:05:19,055 - File['/etc/ambari-infra-solr/conf/custom-security.json'] {'owner': 'infra-solr', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0640}
2017-12-15 04:05:19,056 - Execute['ambari-sudo.sh JAVA_HOME=/usr/lib/jvm/java /usr/lib/ambari-infra-solr-client/solrCloudCli.sh --zookeeper-connect-string sandbox.hortonworks.com:2181 --znode /infra-solr --create-znode --retry 30 --interval 5'] {}
2017-12-15 04:05:19,744 - Execute['ambari-sudo.sh JAVA_HOME=/usr/lib/jvm/java /usr/lib/ambari-infra-solr-client/solrCloudCli.sh --zookeeper-connect-string sandbox.hortonworks.com:2181/infra-solr --cluster-prop --property-name urlScheme --property-value http'] {}
2017-12-15 04:05:20,411 - Execute['ambari-sudo.sh JAVA_HOME=/usr/lib/jvm/java /usr/lib/ambari-infra-solr-client/solrCloudCli.sh --zookeeper-connect-string sandbox.hortonworks.com:2181 --znode /infra-solr --setup-kerberos-plugin'] {}

Command failed after 1 tries

Investigation:
[root@sandbox ~]# ls -ltra /usr/hdp/current/ranger-admin/contrib/solr_for_audit_setup/conf/solrconfig.xml
-r-xr--r-- 1 ranger ranger 73711 May 31  2017 /usr/hdp/current/ranger-admin/contrib/solr_for_audit_setup/conf/solrconfig.xml

Is Kerberos required? (The NoAuthException on /infra-solr/security.json, together with "Will not attempt to authenticate using SASL" in the log above, suggests the CLI connected to ZooKeeper unauthenticated and was rejected by the znode's ACL.)