
Spark startup error: sparksession should only be created and accessed on the driver

sparksession should only be created and accessed on the driver.

When starting Spark, the following error appears:
Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "DBCP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.

This happens because, when using Hive on Spark, the MySQL JDBC driver is not found on the classpath at startup.

Solutions:

1. Pass the MySQL driver with --jars when starting the shell (a concrete example follows below):

spark-shell --master local[2] --jars $HIVE_HOME/lib/<MySQL driver jar>
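For example, if the connector jar under $HIVE_HOME/lib is mysql-connector-java-5.1.17.jar (the same jar named in option 2 below; adjust the file name to whatever version is actually installed), the full command would look like this:

# Start spark-shell with the MySQL JDBC driver added to the classpath.
spark-shell --master local[2] \
  --jars $HIVE_HOME/lib/mysql-connector-java-5.1.17.jar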

2. Add the following export to spark-env.sh:

export SPARK_CLASSPATH=$HIVE_HOME/lib/mysql-connector-java-5.1.17.jar
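Whichever option is used, a quick way to confirm the driver is now being picked up is to restart the shell and query the Hive metastore. The sketch below assumes option 1, the 5.1.17 connector jar, and a Spark 2.x or later shell where the spark session object is predefined; if the fix works, it finishes without the NucleusException above:

# Start spark-shell with the MySQL driver and list the Hive databases
# by feeding a single statement to the shell on stdin.
spark-shell --master local[2] \
  --jars $HIVE_HOME/lib/mysql-connector-java-5.1.17.jar <<'EOF'
spark.sql("show databases").show()
EOF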
