Import data from SQL Server into local Spark

I want to import a table from SQL Server into local Spark using the JDBC driver and run SQL queries on top of it. I downloaded sqljdbc for SQL Server and added this line to spark-env.sh in the conf directory:

SPARK_CLASSPATH= "C:/Program Files/Microsoft SQL Server/sqljdbc_4.0/enu/sqljdbc4.jar" ./bin/spark-shell 

As suggested here

Then I used this line to load the data:

 df = sqlContext.load(source="jdbc", url="jdbc:sqlserver:dd", dbtable="Reporting.dbo.datatable") 

However, it throws an error like:

 Py4JJavaError: An error occurred while calling o28.load.
 : java.sql.SQLException: No suitable driver found for jdbc:sqlserver:PC-BFS2
     at java.sql.DriverManager.getConnection(Unknown Source)
     at java.sql.DriverManager.getConnection(Unknown Source)
     at org.apache.spark.sql.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:118)
     at org.apache.spark.sql.jdbc.JDBCRelation.<init>(JDBCRelation.scala:128)
     at org.apache.spark.sql.jdbc.DefaultSource.createRelation(JDBCRelation.scala:113)
     at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:269)
     at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
     at java.lang.reflect.Method.invoke(Unknown Source)
     at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
     at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
     at py4j.Gateway.invoke(Gateway.java:259)
     at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
     at py4j.commands.CallCommand.execute(CallCommand.java:79)
     at py4j.GatewayConnection.run(GatewayConnection.java:207)
     at java.lang.Thread.run(Unknown Source)
3 answers

Earlier versions of Spark can have this problem if the jar is not also on the driver class path, so try adding the jar to the driver class path (which can be specified with --driver-class-path). I am not very familiar with deploying on Windows systems, but you can also specify the properties in conf/spark-env.cmd, as suggested at http://spark.apache.org/docs/latest/configuration.html .
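For example, a sketch assuming the jar path from the question and the server name PC-BFS2 (the database name and the myuser/mypassword credentials are placeholders to adjust). Note the jdbc:sqlserver:// prefix, which the Microsoft driver expects; a URL of the form jdbc:sqlserver:PC-BFS2, as in the error message, would likely not be recognized even with the jar in place:

 bin/pyspark --driver-class-path "C:/Program Files/Microsoft SQL Server/sqljdbc_4.0/enu/sqljdbc4.jar"

and then, inside the shell:

 # Placeholders: replace myuser/mypassword with real credentials
 df = sqlContext.load(source="jdbc",
                      url="jdbc:sqlserver://PC-BFS2;databaseName=Reporting;user=myuser;password=mypassword",
                      dbtable="dbo.datatable")
 df.registerTempTable("datatable")
 sqlContext.sql("SELECT COUNT(*) FROM datatable").show()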


Add the following line to spark-defaults.conf

 spark.driver.extraClassPath "C:/Program Files/Microsoft SQL Server/sqljdbc_4.0/enu/*" 
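After restarting the shell, a quick way to check that the jar was picked up is to ask the driver JVM for the driver class through the Py4J gateway; a sketch (com.microsoft.sqlserver.jdbc.SQLServerDriver is the standard class name shipped in sqljdbc4.jar):

 # Returns a JavaClass if sqljdbc4.jar is on the driver classpath;
 # raises Py4JJavaError (ClassNotFoundException) otherwise
 sc._jvm.java.lang.Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver")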

Another way to do this is to pass the driver location when running spark-submit, as shown below (this works well for me). Note that the path contains spaces and needs to be quoted:

 ./bin/spark-submit --driver-class-path "C:/Program Files/Microsoft SQL Server/sqljdbc_4.0/enu/sqljdbc4.jar" --master spark://ip:7077 mycode.py

Please also make sure you specify the sqljdbc jar location in $SPARK_HOME/conf/spark-defaults.conf, in the spark.jars property.
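For reference, a minimal mycode.py matching the command above might look like this; a sketch, with the server, database, and credentials as placeholders:

 from pyspark import SparkContext
 from pyspark.sql import SQLContext

 # Create the contexts; the application name is arbitrary
 sc = SparkContext(appName="SqlServerImport")
 sqlContext = SQLContext(sc)

 # Load the table over JDBC; note the jdbc:sqlserver:// prefix
 df = sqlContext.load(source="jdbc",
                      url="jdbc:sqlserver://PC-BFS2;databaseName=Reporting;user=myuser;password=mypassword",
                      dbtable="dbo.datatable")
 df.printSchema()

 sc.stop()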

