@miharp
Last active June 29, 2018 10:08
LZO Notes

We use s3cmd to download and package HDP and HDP-UTILS from s3://dev.hortonworks.com.

s3cmd sync command that excludes LZO from the HDP stack:

s3cmd sync --no-check-md5 --exclude 'hadooplzo/*' --exclude '*m2-artifacts.tar' --exclude '*.tar.gz' --exclude '*.src.rpm' s3://dev.hortonworks.com/HDP/${SUSE_VERSION}/2.x/BUILDS/$VERSION-${BUILD_NUMBER}/ $hdpDir

s3cmd sync command that excludes LZO from HDP-UTILS:

s3cmd sync --no-check-md5 --no-preserve --exclude 'yum/*' --exclude 'yum-utils/*' --exclude 'yum-metadata-parser/*' --exclude 'hadoop-lzo/*' --exclude 'lzo/*' --exclude '*.tar.gz' --exclude '*.src.rpm' s3://public-repo-1.hortonworks.com/${hdpUtils}/repos/${SUSE_VERSION}/ $hdpUtils/repos/${SUSE_VERSION}/
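s3cmd matches each --exclude pattern as a glob against the object path relative to the sync source, and adding --dry-run to the commands above previews what would transfer without downloading anything. As a hedged local sketch (plain POSIX shell, not s3cmd itself; the paths are made up for illustration), the shell case construct applies the same glob semantics, so it can sanity-check which paths a pattern would drop:

```shell
#!/bin/sh
# Local sanity check of the --exclude globs used above.
# This is a sketch only: s3cmd does the real matching, against the key
# path relative to the sync source. Paths below are hypothetical.
would_exclude() {  # would_exclude PATTERN PATH -> exit 0 if excluded
  case "$2" in
    $1) return 0 ;;   # glob matches: this path would be excluded
    *)  return 1 ;;
  esac
}

would_exclude 'hadooplzo/*' 'hadooplzo/some-package.rpm' \
  && echo "hadooplzo rpm: excluded"
would_exclude 'hadooplzo/*' 'hadoop/some-package.rpm' \
  || echo "hadoop rpm: kept"
```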

Blueprint deploy fails during 'HDFS Client Install' with Ambari 2.6.0.0-267 and HDP 2.6.3.0-235

stderr:   /var/lib/ambari-agent/data/errors-5.txt

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 73, in <module>
    HdfsClient().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 367, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 35, in install
    import params
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/params.py", line 25, in <module>
    from params_linux import *
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/params_linux.py", line 391, in <module>
    lzo_packages = get_lzo_packages(stack_version_unformatted)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/get_lzo_packages.py", line 45, in get_lzo_packages
    lzo_packages += [script_instance.format_package_name("hadooplzo_${stack_version}"),
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 538, in format_package_name
    raise Fail("Cannot match package for regexp name {0}. Available packages: {1}".format(name, self.available_packages_in_repos))
resource_management.core.exceptions.Fail: Cannot match package for regexp name hadooplzo_${stack_version}. Available packages: ['accumulo', 'accumulo-conf-standalone', 'accumulo-source', 'accumulo_2_6_3_0_235', 'accumulo_2_6_3_0_235-conf-standalone', 'accumulo_2_6_3_0_235-source', 'atlas-metadata', 'atlas-metadata-falcon-plugin', 'atlas-metadata-hive-plugin', 'atlas-metadata-sqoop-plugin', 'atlas-metadata-storm-plugin', 'atlas-metadata_2_6_3_0_235', 'atlas-metadata_2_6_3_0_235-falcon-plugin', 'atlas-metadata_2_6_3_0_235-hive-plugin', 'atlas-metadata_2_6_3_0_235-sqoop-plugin', 'atlas-metadata_2_6_3_0_235-storm-plugin', 'bigtop-jsvc', 'bigtop-tomcat', 'datafu', 'datafu_2_6_3_0_235', 'druid', 'druid_2_6_3_0_235', 'falcon', 'falcon-doc', 'falcon_2_6_3_0_235', 'falcon_2_6_3_0_235-doc', 'flume', 'flume-agent', 'flume_2_6_3_0_235', 'flume_2_6_3_0_235-agent', 'hadoop', 'hadoop-client', 'hadoop-conf-pseudo', 'hadoop-doc', 'hadoop-hdfs', 'hadoop-hdfs-datanode', 'hadoop-hdfs-fuse', 'hadoop-hdfs-journalnode', 'hadoop-hdfs-namenode', 'hadoop-hdfs-secondarynamenode', 'hadoop-hdfs-zkfc', 'hadoop-httpfs', 'hadoop-httpfs-server', 'hadoop-libhdfs', 'hadoop-mapreduce', 'hadoop-mapreduce-historyserver', 'hadoop-source', 'hadoop-yarn', 'hadoop-yarn-nodemanager', 'hadoop-yarn-proxyserver', 'hadoop-yarn-resourcemanager', 'hadoop-yarn-timelineserver', 'hadoop_2_6_3_0_235', 'hadoop_2_6_3_0_235-client', 'hadoop_2_6_3_0_235-conf-pseudo', 'hadoop_2_6_3_0_235-doc', 'hadoop_2_6_3_0_235-hdfs', 'hadoop_2_6_3_0_235-hdfs-datanode', 'hadoop_2_6_3_0_235-hdfs-fuse', 'hadoop_2_6_3_0_235-hdfs-journalnode', 'hadoop_2_6_3_0_235-hdfs-namenode', 'hadoop_2_6_3_0_235-hdfs-secondarynamenode', 'hadoop_2_6_3_0_235-hdfs-zkfc', 'hadoop_2_6_3_0_235-httpfs', 'hadoop_2_6_3_0_235-httpfs-server', 'hadoop_2_6_3_0_235-libhdfs', 'hadoop_2_6_3_0_235-mapreduce', 'hadoop_2_6_3_0_235-mapreduce-historyserver', 'hadoop_2_6_3_0_235-source', 'hadoop_2_6_3_0_235-yarn', 'hadoop_2_6_3_0_235-yarn-nodemanager', 
'hadoop_2_6_3_0_235-yarn-proxyserver', 'hadoop_2_6_3_0_235-yarn-resourcemanager', 'hadoop_2_6_3_0_235-yarn-timelineserver', 'hbase', 'hbase-doc', 'hbase-master', 'hbase-regionserver', 'hbase-rest', 'hbase-thrift', 'hbase-thrift2', 'hbase_2_6_3_0_235', 'hbase_2_6_3_0_235-doc', 'hbase_2_6_3_0_235-master', 'hbase_2_6_3_0_235-regionserver', 'hbase_2_6_3_0_235-rest', 'hbase_2_6_3_0_235-thrift', 'hbase_2_6_3_0_235-thrift2', 'hdp-select', 'hive', 'hive-hcatalog', 'hive-hcatalog-server', 'hive-jdbc', 'hive-metastore', 'hive-server', 'hive-server2', 'hive-webhcat', 'hive-webhcat-server', 'hive2', 'hive2-jdbc', 'hive2_2_6_3_0_235', 'hive2_2_6_3_0_235-jdbc', 'hive_2_6_3_0_235', 'hive_2_6_3_0_235-hcatalog', 'hive_2_6_3_0_235-hcatalog-server', 'hive_2_6_3_0_235-jdbc', 'hive_2_6_3_0_235-metastore', 'hive_2_6_3_0_235-server', 'hive_2_6_3_0_235-server2', 'hive_2_6_3_0_235-webhcat', 'hive_2_6_3_0_235-webhcat-server', 'kafka', 'kafka_2_6_3_0_235', 'knox', 'knox_2_6_3_0_235', 'livy', 'livy2', 'livy2_2_6_3_0_235', 'livy_2_6_3_0_235', 'mahout', 'mahout-doc', 'mahout_2_6_3_0_235', 'mahout_2_6_3_0_235-doc', 'oozie', 'oozie-client', 'oozie-common', 'oozie-sharelib', 'oozie-sharelib-distcp', 'oozie-sharelib-hcatalog', 'oozie-sharelib-hive', 'oozie-sharelib-hive2', 'oozie-sharelib-mapreduce-streaming', 'oozie-sharelib-pig', 'oozie-sharelib-spark', 'oozie-sharelib-sqoop', 'oozie-webapp', 'oozie_2_6_3_0_235', 'oozie_2_6_3_0_235-client', 'oozie_2_6_3_0_235-common', 'oozie_2_6_3_0_235-sharelib', 'oozie_2_6_3_0_235-sharelib-distcp', 'oozie_2_6_3_0_235-sharelib-hcatalog', 'oozie_2_6_3_0_235-sharelib-hive', 'oozie_2_6_3_0_235-sharelib-hive2', 'oozie_2_6_3_0_235-sharelib-mapreduce-streaming', 'oozie_2_6_3_0_235-sharelib-pig', 'oozie_2_6_3_0_235-sharelib-spark', 'oozie_2_6_3_0_235-sharelib-sqoop', 'oozie_2_6_3_0_235-webapp', 'phoenix', 'phoenix_2_6_3_0_235', 'pig', 'pig_2_6_3_0_235', 'ranger-admin', 'ranger-atlas-plugin', 'ranger-hbase-plugin', 'ranger-hdfs-plugin', 'ranger-hive-plugin', 
'ranger-kafka-plugin', 'ranger-kms', 'ranger-knox-plugin', 'ranger-solr-plugin', 'ranger-storm-plugin', 'ranger-tagsync', 'ranger-usersync', 'ranger-yarn-plugin', 'ranger_2_6_3_0_235-admin', 'ranger_2_6_3_0_235-atlas-plugin', 'ranger_2_6_3_0_235-hbase-plugin', 'ranger_2_6_3_0_235-hdfs-plugin', 'ranger_2_6_3_0_235-hive-plugin', 'ranger_2_6_3_0_235-kafka-plugin', 'ranger_2_6_3_0_235-kms', 'ranger_2_6_3_0_235-knox-plugin', 'ranger_2_6_3_0_235-solr-plugin', 'ranger_2_6_3_0_235-storm-plugin', 'ranger_2_6_3_0_235-tagsync', 'ranger_2_6_3_0_235-usersync', 'ranger_2_6_3_0_235-yarn-plugin', 'shc', 'shc_2_6_3_0_235', 'slider', 'slider_2_6_3_0_235', 'spark', 'spark-master', 'spark-python', 'spark-worker', 'spark-yarn-shuffle', 'spark2', 'spark2-master', 'spark2-python', 'spark2-worker', 'spark2-yarn-shuffle', 'spark2_2_6_3_0_235', 'spark2_2_6_3_0_235-master', 'spark2_2_6_3_0_235-python', 'spark2_2_6_3_0_235-worker', 'spark2_2_6_3_0_235-yarn-shuffle', 'spark_2_6_3_0_235', 'spark_2_6_3_0_235-master', 'spark_2_6_3_0_235-python', 'spark_2_6_3_0_235-worker', 'spark_2_6_3_0_235-yarn-shuffle', 'spark_llap', 'spark_llap_2_6_3_0_235', 'sqoop', 'sqoop-metastore', 'sqoop_2_6_3_0_235', 'sqoop_2_6_3_0_235-metastore', 'storm', 'storm-slider-client', 'storm_2_6_3_0_235', 'storm_2_6_3_0_235-slider-client', 'tez', 'tez_2_6_3_0_235', 'tez_hive2', 'tez_hive2_2_6_3_0_235', 'zeppelin', 'zeppelin_2_6_3_0_235', 'zookeeper', 'zookeeper-server', 'zookeeper_2_6_3_0_235', 'zookeeper_2_6_3_0_235-server', 'R-KernSmooth', 'R-MASS', 'R-Matrix', 'R-Matrix-devel', 'R-base', 'R-base-devel', 'R-boot', 'R-class', 'R-cluster', 'R-codetools', 'R-compiler', 'R-core', 'R-core-devel', 'R-core-doc', 'R-core-libs', 'R-core-packages', 'R-datasets', 'R-foreign', 'R-grDevices', 'R-graphics', 'R-grid', 'R-lattice', 'R-methods', 'R-mgcv', 'R-nlme', 'R-nnet', 'R-parallel', 'R-recommended-packages', 'R-rpart', 'R-spatial', 'R-splines', 'R-stats', 'R-stats4', 'R-survival', 'R-tcltk', 'R-tools', 'R-utils', 'extjs', 
'ganglia-devel', 'ganglia-gmetad', 'ganglia-gmond', 'ganglia-gmond-modules-python', 'ganglia-web', 'info', 'libapr1-devel', 'libconfuse-devel', 'libconfuse0', 'libexpat-devel', 'libganglia', 'lua-rrdtool', 'makeinfo', 'mysql-connector-java', 'mysql-connector-java', 'nagios', 'nagios-devel', 'nagios-plugins', 'nagios-www', 'netcat-openbsd', 'python-rrdtool', 'rrdtool', 'rrdtool-devel', 'snappy', 'snappy-devel', 'texinfo']
stdout:   /var/lib/ambari-agent/data/output-5.txt

2017-10-31 03:00:09,276 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2017-10-31 03:00:09,282 - Using hadoop conf dir: /usr/hdp/2.6.3.0-235/hadoop/conf
2017-10-31 03:00:09,283 - Group['hdfs'] {}
2017-10-31 03:00:09,285 - Group['hadoop'] {}
2017-10-31 03:00:09,285 - Group['users'] {}
2017-10-31 03:00:09,285 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2017-10-31 03:00:09,286 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2017-10-31 03:00:09,287 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2017-10-31 03:00:09,287 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2017-10-31 03:00:09,288 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2017-10-31 03:00:09,289 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2017-10-31 03:00:09,289 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2017-10-31 03:00:09,290 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2017-10-31 03:00:09,291 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2017-10-31 03:00:09,291 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2017-10-31 03:00:09,292 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2017-10-31 03:00:09,293 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2017-10-31 03:00:09,293 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2017-10-31 03:00:09,294 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2017-10-31 03:00:09,295 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-31 03:00:09,296 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-10-31 03:00:09,303 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2017-10-31 03:00:09,303 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2017-10-31 03:00:09,304 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-31 03:00:09,306 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-31 03:00:09,307 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2017-10-31 03:00:09,316 - call returned (0, '1003')
2017-10-31 03:00:09,316 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1003'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2017-10-31 03:00:09,322 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1003'] due to not_if
2017-10-31 03:00:09,323 - Group['hdfs'] {}
2017-10-31 03:00:09,323 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2017-10-31 03:00:09,324 - FS Type: 
2017-10-31 03:00:09,324 - Directory['/etc/hadoop'] {'mode': 0755}
2017-10-31 03:00:09,340 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2017-10-31 03:00:09,341 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2017-10-31 03:00:09,355 - Repository['HDP-2.6-repo-1'] {'append_to_file': False, 'base_url': 'http://blackmoon1.labs.teradata.com/HDP/sles12/2.x/updates/2.6.3.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2017-10-31 03:00:09,364 - Flushing package manager cache since repo file content is about to change
2017-10-31 03:00:09,364 - checked_call[['zypper', 'clean', '--all']] {'sudo': True}
2017-10-31 03:00:09,424 - checked_call returned (0, 'All repositories have been cleaned up.')
2017-10-31 03:00:09,424 - File['/etc/zypp/repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://blackmoon1.labs.teradata.com/HDP/sles12/2.x/updates/2.6.3.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-10-31 03:00:09,425 - Writing File['/etc/zypp/repos.d/ambari-hdp-1.repo'] because contents don't match
2017-10-31 03:00:09,425 - Repository['HDP-UTILS-1.1.0.21-repo-1'] {'append_to_file': True, 'base_url': 'http://blackmoon1.labs.teradata.com/HDP-UTILS-1.1.0.21/repos/sles12', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2017-10-31 03:00:09,430 - Flushing package manager cache since repo file content is about to change
2017-10-31 03:00:09,430 - checked_call[['zypper', 'clean', '--all']] {'sudo': True}
2017-10-31 03:00:09,776 - checked_call returned (0, 'All repositories have been cleaned up.')
2017-10-31 03:00:09,776 - File['/etc/zypp/repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://blackmoon1.labs.teradata.com/HDP/sles12/2.x/updates/2.6.3.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.21-repo-1]\nname=HDP-UTILS-1.1.0.21-repo-1\nbaseurl=http://blackmoon1.labs.teradata.com/HDP-UTILS-1.1.0.21/repos/sles12\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-10-31 03:00:09,777 - Writing File['/etc/zypp/repos.d/ambari-hdp-1.repo'] because contents don't match
2017-10-31 03:00:09,777 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-10-31 03:00:10,240 - Skipping installation of existing package unzip
2017-10-31 03:00:10,240 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-10-31 03:00:10,661 - Skipping installation of existing package curl
2017-10-31 03:00:10,661 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-10-31 03:00:10,975 - Skipping installation of existing package hdp-select
2017-10-31 03:00:11,405 - Using hadoop conf dir: /usr/hdp/2.6.3.0-235/hadoop/conf
2017-10-31 03:00:11,412 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2017-10-31 03:00:11,435 - Using hadoop conf dir: /usr/hdp/2.6.3.0-235/hadoop/conf
2017-10-31 03:00:11,449 - Command repositories: HDP-2.6-repo-1, HDP-UTILS-1.1.0.21-repo-1
2017-10-31 03:00:11,449 - Applicable repositories: HDP-2.6-repo-1, HDP-UTILS-1.1.0.21-repo-1
2017-10-31 03:00:11,450 - Looking for matching packages in the following repositories: HDP-2.6-repo-1, HDP-UTILS-1.1.0.21-repo-1
2017-10-31 03:00:12,390 - No package found for hadooplzo_${stack_version}(hadooplzo_(\d|_)+$)
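The failure follows directly from the excluded packages: Ambari expands hadooplzo_${stack_version} to the regexp shown in the log line above (hadooplzo_(\d|_)+$) and looks for a match in the available-package list; when hadooplzo/* was excluded from the repo, nothing matches and format_package_name raises Fail. A small shell sketch (hypothetical package names; Python's \d written as [0-9] for grep -E) reproduces the condition:

```shell
#!/bin/sh
# Regexp taken from the log line above; package names are illustrative.
# With a hadooplzo_* package present, grep finds a match; without one,
# grep exits non-zero -- the condition that makes Ambari raise Fail.
printf '%s\n' \
  hadoop_2_6_3_0_235 \
  hadoop_2_6_3_0_235-client \
  hdp-select \
| grep -E 'hadooplzo_([0-9]|_)+$' || echo "no match: install fails"
```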
vatsalprakash2 commented Jun 29, 2018

Any updates on this? I'm facing a similar issue while installing the YARN App Timeline Server through Ambari blueprints:
resource_management.core.exceptions.Fail: Cannot match package for regexp name hadoop_${stack_version}-yarn. Available packages: ['Pacchetti', '1.7.0.2.6.4', '1.7.0.2.6.4', '1.7.0.2.6.4', '1.7.0.2.6.4', '1.7.0.2.6.4', '1.7.0.2.6.4', '0.8.0.2.6.4', '0.8.0.2.6.4', '0.8.0.2.6.4', '0.8.0.2.6.4', '0.8.0.2.6.4', '0.8.0.2.6.4', '0.8.0.2.6.4', '0.8.0.2.6.4', '0.8.0.2.6.4', '0.8.0.2.6.4', '1.0', '6.0', '1.3.0.2.6.4', '1.3.0.2.6.4', '0.10.1.2.6.4', '0.10.1.2.6.4', '0.10.0.2.6.4', '0.10.0.2.6.4', '0.10.0.2.6.4', '0.10.0.2.6.4', '1.5.2.2.6.4', '1.5.2.2.6.4', '1.5.2.2.6.4', '1.5.2.2.6.4', '2.7.3.2.6.4', '2.7.3.2.6.4', '2.7.3.2.6.4', '2.7.3.2.6.4', '2.7.3.2.6.4', '2.7.3.2.6.4', '2.7.3.2.6.4', '2.7.3.2.6.4', '2.7.3.2.6.4', '2.7.3.2.6.4', '2.7.3.2.
