spark-shell: How to Control the JAR Load Order

I'm running spark-shell locally and specifying the path to some third-party JARs:

$ spark-shell --driver-class-path /Myproject/LIB/*

Inside the shell, I typed:

scala> import com.google.common.collect.Lists
<console>:19: error: object collect is not a member of package com.google.common
   import com.google.common.collect.Lists
                            ^

I assume that Spark loaded /usr/local/spark-1.4.0-bin-hadoop2.6/lib/spark-assembly-1.4.0-hadoop2.6.0.jar first, and that jar does not contain the com.google.common.collect package.

/Myproject/LIB/ contains google-collections-1.0.jar, which does have com.google.common.collect. However, this jar seems to be ignored.
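
One quick way to test this assumption is to ask the JVM where a class from that package was actually loaded from; this relies on com.google.common.base.Optional being one of the few Guava classes the Spark 1.x assembly keeps unshaded:

scala> classOf[com.google.common.base.Optional[_]].getProtectionDomain.getCodeSource.getLocation

If the printed URL points at spark-assembly-1.4.0-hadoop2.6.0.jar rather than at a jar under /Myproject/LIB/, the assembly is indeed shadowing the directory.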

Question: How do I tell spark-shell to load the JARs from --driver-class-path before the ones in spark-1.4.0-bin-hadoop2.6/lib/?

Answer:

Use --jars instead of --driver-class-path; it takes the jar files as a comma-separated list (see spark-shell --help):

$ spark-shell --jars $(echo ./Myproject/LIB/*.jar | tr ' ' ',')
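
With the jars picked up this way, the import from the failing session should now resolve; a quick check (Lists.newArrayList is a static factory shipped in google-collections-1.0.jar):

scala> import com.google.common.collect.Lists
import com.google.common.collect.Lists

scala> Lists.newArrayList("a", "b")   // resolves against google-collections-1.0.jar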

Note that --driver-class-path takes an ordinary java classpath, so the jars should be joined with ':' (the platform path separator on Linux) rather than ',':

$ spark-shell --driver-class-path $(echo ./Myproject/LIB/*.jar | tr ' ' ':')
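
The same setting can also be passed as the spark.driver.extraClassPath property, for which --driver-class-path is a shorthand; a minimal sketch, assuming the same ./Myproject/LIB/ layout:

$ spark-shell --conf spark.driver.extraClassPath="$(echo ./Myproject/LIB/*.jar | tr ' ' ':')"

Either form only affects the driver JVM; to ship the jars to the executors as well, keep the --jars option from the answer above.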
