Spark-Shell unable to use netlib-java

I am trying to link the native BLAS libraries in a simple example, to test the performance difference against the "regular" implementation. I followed the instructions posted on the official netlib-java GitHub, but I still run into errors similar to this thread.
I am running the pre-built Spark 2.3 with Hadoop 2.7; I also tried building Spark with the netlib-java option explicitly enabled, as described on the Spark MLlib page, but with no success.
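
For reference, "explicitly enabling the netlib-java option" means building with the netlib-lgpl Maven profile; my build invocation looked roughly along these lines (a sketch, run from a Spark 2.3 source checkout; the --name label is arbitrary):

    # Build a Spark distribution with netlib-java's native proxies bundled;
    # -Pnetlib-lgpl is the profile documented on the Spark MLlib page.
    ./dev/make-distribution.sh --name netlib-blas --tgz -Pnetlib-lgpl -Phadoop-2.7
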
The specific error message that puzzles me is this:
    spark-shell --packages com.github.fommil.netlib:all:1.1.2
Ivy Default Cache set to: /home/user/.ivy2/cache
The jars for the packages stored in: /home/user/.ivy2/jars
:: loading settings :: url = jar:file:/home/user/spark/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
com.github.fommil.netlib#all added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
    confs: [default]
    found com.github.fommil.netlib#all;1.1.2 in central
    found net.sourceforge.f2j#arpack_combined_all;0.1 in central
    found com.github.fommil.netlib#core;1.1.2 in central
    found com.github.fommil.netlib#netlib-native_ref-osx-x86_64;1.1 in central
    found com.github.fommil.netlib#native_ref-java;1.1 in central
    found com.github.fommil#jniloader;1.1 in central
    found com.github.fommil.netlib#netlib-native_ref-linux-x86_64;1.1 in central
    found com.github.fommil.netlib#netlib-native_ref-linux-i686;1.1 in central
    found com.github.fommil.netlib#netlib-native_ref-win-x86_64;1.1 in central
    found com.github.fommil.netlib#netlib-native_ref-win-i686;1.1 in central
    found com.github.fommil.netlib#netlib-native_ref-linux-armhf;1.1 in central
    found com.github.fommil.netlib#netlib-native_system-osx-x86_64;1.1 in central
    found com.github.fommil.netlib#native_system-java;1.1 in central
    found com.github.fommil.netlib#netlib-native_system-linux-x86_64;1.1 in central
    found com.github.fommil.netlib#netlib-native_system-linux-i686;1.1 in central
    found com.github.fommil.netlib#netlib-native_system-linux-armhf;1.1 in central
    found com.github.fommil.netlib#netlib-native_system-win-x86_64;1.1 in central
    found com.github.fommil.netlib#netlib-native_system-win-i686;1.1 in central
:: resolution report :: resolve 1196ms :: artifacts dl 16ms
    :: modules in use:
    com.github.fommil#jniloader;1.1 from central in [default]
    com.github.fommil.netlib#all;1.1.2 from central in [default]
    com.github.fommil.netlib#core;1.1.2 from central in [default]
    com.github.fommil.netlib#native_ref-java;1.1 from central in [default]
    com.github.fommil.netlib#native_system-java;1.1 from central in [default]
    com.github.fommil.netlib#netlib-native_ref-linux-armhf;1.1 from central in [default]
    com.github.fommil.netlib#netlib-native_ref-linux-i686;1.1 from central in [default]
    com.github.fommil.netlib#netlib-native_ref-linux-x86_64;1.1 from central in [default]
    com.github.fommil.netlib#netlib-native_ref-osx-x86_64;1.1 from central in [default]
    com.github.fommil.netlib#netlib-native_ref-win-i686;1.1 from central in [default]
    com.github.fommil.netlib#netlib-native_ref-win-x86_64;1.1 from central in [default]
    com.github.fommil.netlib#netlib-native_system-linux-armhf;1.1 from central in [default]
    com.github.fommil.netlib#netlib-native_system-linux-i686;1.1 from central in [default]
    com.github.fommil.netlib#netlib-native_system-linux-x86_64;1.1 from central in [default]
    com.github.fommil.netlib#netlib-native_system-osx-x86_64;1.1 from central in [default]
    com.github.fommil.netlib#netlib-native_system-win-i686;1.1 from central in [default]
    com.github.fommil.netlib#netlib-native_system-win-x86_64;1.1 from central in [default]
    net.sourceforge.f2j#arpack_combined_all;0.1 from central in [default]
    :: evicted modules:
    com.github.fommil.netlib#core;1.1 by [com.github.fommil.netlib#core;1.1.2] in [default]
    ---------------------------------------------------------------------
    |                  |            modules            ||   artifacts   |
    |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
    ---------------------------------------------------------------------
    |      default     |   19  |   0   |   0   |   1   ||   17  |   0   |
    ---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent
    confs: [default]
    0 artifacts copied, 17 already retrieved (0kB/15ms)
2018-06-13 07:53:28 WARN  Utils:66 - Your hostname, wdfl30003105a resolves to a loopback address: 127.0.0.1; using 10.18.91.86 instead (on interface enp1s0)
2018-06-13 07:53:28 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2018-06-13 07:53:30 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Warning: Local jar /home/user/.ivy2/jars/net.sourceforge.f2j_arpack_combined_all-0.1.jar does not exist, skipping.
Warning: Local jar /home/user/.ivy2/jars/com.github.fommil.netlib_netlib-native_ref-osx-x86_64-1.1.jar does not exist, skipping.
Warning: Local jar /home/user/.ivy2/jars/com.github.fommil.netlib_netlib-native_ref-linux-i686-1.1.jar does not exist, skipping.
Warning: Local jar /home/user/.ivy2/jars/com.github.fommil.netlib_netlib-native_ref-win-x86_64-1.1.jar does not exist, skipping.
Warning: Local jar /home/user/.ivy2/jars/com.github.fommil.netlib_netlib-native_ref-win-i686-1.1.jar does not exist, skipping.
Warning: Local jar /home/user/.ivy2/jars/com.github.fommil.netlib_netlib-native_ref-linux-armhf-1.1.jar does not exist, skipping.
Warning: Local jar /home/user/.ivy2/jars/com.github.fommil.netlib_netlib-native_system-osx-x86_64-1.1.jar does not exist, skipping.
Warning: Local jar /home/user/.ivy2/jars/com.github.fommil.netlib_netlib-native_system-linux-x86_64-1.1.jar does not exist, skipping.
Warning: Local jar /home/user/.ivy2/jars/com.github.fommil.netlib_netlib-native_system-linux-i686-1.1.jar does not exist, skipping.
Warning: Local jar /home/user/.ivy2/jars/com.github.fommil.netlib_netlib-native_system-linux-armhf-1.1.jar does not exist, skipping.
Warning: Local jar /home/user/.ivy2/jars/com.github.fommil.netlib_netlib-native_system-win-x86_64-1.1.jar does not exist, skipping.
Warning: Local jar /home/user/.ivy2/jars/com.github.fommil.netlib_netlib-native_system-win-i686-1.1.jar does not exist, skipping.
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
2018-06-13 07:53:44 ERROR SparkContext:91 - Failed to add file:/home/user/.ivy2/jars/net.sourceforge.f2j_arpack_combined_all-0.1.jar to Spark environment
java.io.FileNotFoundException: Jar /home/user/.ivy2/jars/net.sourceforge.f2j_arpack_combined_all-0.1.jar not found
    at org.apache.spark.SparkContext.addJarFile$1(SparkContext.scala:1807)
    at org.apache.spark.SparkContext.addJar(SparkContext.scala:1837)
    at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:457)
    at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:457)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:457)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2486)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:930)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:921)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
    at org.apache.spark.repl.Main$.createSparkSession(Main.scala:103)
    at $line3.$read$$iw$$iw.<init>(<console>:15)
    at $line3.$read$$iw.<init>(<console>:43)
    at $line3.$read.<init>(<console>:45)
    at $line3.$read$.<init>(<console>:49)
    at $line3.$read$.<clinit>(<console>)
    at $line3.$eval$.$print$lzycompute(<console>:7)
    at $line3.$eval$.$print(<console>:6)
    at $line3.$eval.$print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
    at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
    at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
    at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:79)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:79)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(SparkILoop.scala:79)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:79)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:79)
    at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:91)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:78)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:78)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:78)
    at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:77)
    at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:110)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
    at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
    at org.apache.spark.repl.Main$.doMain(Main.scala:76)
    at org.apache.spark.repl.Main$.main(Main.scala:56)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2018-06-13 07:53:44 ERROR SparkContext:91 - Failed to add file:/home/user/.ivy2/jars/com.github.fommil.netlib_netlib-native_ref-osx-x86_64-1.1.jar to Spark environment
java.io.FileNotFoundException: Jar /home/user/.ivy2/jars/com.github.fommil.netlib_netlib-native_ref-osx-x86_64-1.1.jar not found
    (stack trace identical to the first one above)
2018-06-13 07:53:44 ERROR SparkContext:91 - Failed to add file:/home/user/.ivy2/jars/com.github.fommil.netlib_netlib-native_ref-linux-i686-1.1.jar to Spark environment
java.io.FileNotFoundException: Jar /home/user/.ivy2/jars/com.github.fommil.netlib_netlib-native_ref-linux-i686-1.1.jar not found

Essentially it says that the libraries cannot be found. Interestingly, though, my ~/.ivy2/jars folder shows that all the required packages are present, just under slightly different names:
    ~/.ivy2/jars$ ls
com.github.fommil_jniloader-1.1.jar
com.github.fommil.netlib_core-1.1.2.jar
com.github.fommil.netlib_native_ref-java-1.1.jar
com.github.fommil.netlib_native_system-java-1.1.jar
com.github.fommil.netlib_netlib-native_ref-linux-armhf-1.1-natives.jar
com.github.fommil.netlib_netlib-native_ref-linux-i686-1.1-natives.jar
com.github.fommil.netlib_netlib-native_ref-linux-x86_64-1.1.jar
com.github.fommil.netlib_netlib-native_ref-linux-x86_64-1.1-natives.jar
com.github.fommil.netlib_netlib-native_ref-osx-x86_64-1.1-natives.jar
com.github.fommil.netlib_netlib-native_ref-win-i686-1.1-natives.jar
com.github.fommil.netlib_netlib-native_ref-win-x86_64-1.1-natives.jar
com.github.fommil.netlib_netlib-native_system-linux-armhf-1.1-natives.jar
com.github.fommil.netlib_netlib-native_system-linux-i686-1.1-natives.jar
com.github.fommil.netlib_netlib-native_system-linux-x86_64-1.1-natives.jar
com.github.fommil.netlib_netlib-native_system-osx-x86_64-1.1-natives.jar
com.github.fommil.netlib_netlib-native_system-win-i686-1.1-natives.jar
com.github.fommil.netlib_netlib-native_system-win-x86_64-1.1-natives.jar
net.sourceforge.f2j_arpack_combined_all-0.1-javadoc.jar
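
Independent of the naming issue, this is how I check which backend netlib-java actually picks up once a shell does come up (BLAS.getInstance is part of netlib-java's public API; piping the line into spark-shell is just for brevity):

    # Prints the concrete BLAS class netlib-java resolved at runtime:
    # NativeSystemBLAS/NativeRefBLAS mean a native library was loaded,
    # F2jBLAS means the pure-Java fallback is in use.
    echo 'println(com.github.fommil.netlib.BLAS.getInstance().getClass.getName)' \
      | spark-shell --packages com.github.fommil.netlib:all:1.1.2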

I don't know enough about packaging, or about how Spark/Scala handle these packages internally. For packages with a classifier (note the -natives suffix in the listing above) there appears to be a workaround in Maven, but sadly the spark-shell --packages option does not let me specify anything similar.
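
To illustrate the Maven side: the maven-dependency-plugin can name a classifier explicitly (coordinate form groupId:artifactId:version:packaging:classifier), and the downloaded file could then be handed to spark-shell via --jars instead of --packages. A sketch, untested, for one of the natives jars:

    # Fetch a single natives-classifier jar explicitly, then bypass
    # --packages by pointing --jars at the downloaded file.
    mvn dependency:get \
        -Dartifact=com.github.fommil.netlib:netlib-native_system-linux-x86_64:1.1:jar:natives
    spark-shell --jars "$HOME/.m2/repository/com/github/fommil/netlib/netlib-native_system-linux-x86_64/1.1/netlib-native_system-linux-x86_64-1.1-natives.jar"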

I also tried renaming all the files so that those errors no longer appear, but then the familiar 2018-06-13 08:32:40 WARN BLAS:61 - Failed to load implementation from: com.github.fommil.netlib.NativeSystemBLAS shows up again. - dennlinger

This happens sometimes; delete your ~/.ivy2 folder and try again. - Kaushal

I have already tried that several times; I deleted both the ~/.ivy2 and ~/.m2 folders, but it still has no effect. As mentioned, I also tried building from source, in which case this should not even happen. - dennlinger
1 Answer

I ended up doing something similar, but this question was specifically about not having to compile Spark myself. According to the online documentation, it should still work somehow... - dennlinger
