Spring Spark Cassandra - Whitelabel Error Page

I am trying to use Spark and Cassandra via Spring in NetBeans, but I get the following error:

type=Internal Server Error, status=500 Failed to open native connection to Cassandra at {127.0.0.1}:9042.

Spark and Cassandra were working fine before I tried to integrate Spring. I already have data in my Cassandra database, which I fetch and process through Spark. Basically, I want to print the result (a Matrix) on the /welcome page through a RestController.
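Before digging into the code, it may be worth confirming that Cassandra's native transport is actually up and listening on 9042 (the commands below assume a default single-node setup on 127.0.0.1 with the Cassandra tools on the PATH):

```shell
# The node should be reported as UN (Up/Normal)
nodetool status

# Should open a CQL shell if the native transport on port 9042 is reachable
cqlsh 127.0.0.1 9042
```

If cqlsh cannot connect either, the problem is on the Cassandra side rather than in the Spring/Spark wiring.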

Here is my very simple file structure: image

And here is my pom.xml:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.mycompany</groupId>
<artifactId>my-app</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<build>
    <plugins>

        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>2.3.2</version>
            <configuration>
                <debug>true</debug>
            </configuration>
        </plugin>

        <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
            <version>2.0.0.RELEASE</version>
            <executions>
                <execution>
                    <goals>
                        <goal>repackage</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>

    </plugins>
</build>
<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
</properties>

<dependencies>
    <!--Spring dependencies-->
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
        <version>2.9.0</version>
    </dependency>
    <dependency>
        <groupId>com.google.code.gson</groupId>
        <artifactId>gson</artifactId>
        <version>2.8.2</version>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
        <version>2.0.0.RELEASE</version>
    </dependency>
    <dependency>
        <groupId>org.springframework</groupId>
        <artifactId>spring-core</artifactId>
        <version>5.0.4.RELEASE</version>
    </dependency>

    <!--Spark dependencies-->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.2.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.2.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.11</artifactId>
        <version>2.2.1</version>
    </dependency>

    <!--Cassandra dependencies--> 

    <!--Spark cassandra connector dependencies-->
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.11</artifactId>
        <version>2.0.7</version>
    </dependency>
</dependencies>
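With Spark, Spring Boot, and the Spark Cassandra connector all on one classpath, conflicting transitive versions (the DataStax driver, Guava, Netty) are a common source of "failed to open native connection" errors. As a sanity check, the Maven dependency plugin can show which versions actually get resolved:

```shell
# Print the resolved dependency tree, filtered to the Cassandra driver and Guava
mvn dependency:tree -Dincludes=com.datastax.cassandra
mvn dependency:tree -Dincludes=com.google.guava
```

If two libraries pull in incompatible versions of the same artifact, that will show up here before it shows up as a runtime failure.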

Spark context and session initialization:

@Configuration
public class Sparkstart { 
    @Bean
    public SparkSession sparksession() {
       SparkSession sp = SparkSession
             .builder()
             .master("local[*]")
             .appName("preprocessing")
             .config("spark.cassandra.connection.host","127.0.0.1")
             .getOrCreate();
       return sp;
    }

    @Bean
    public JavaSparkContext sc(){
        JavaSparkContext sc = new JavaSparkContext(sparksession().sparkContext());
        return sc;
    }       
}
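As a side note, inside a @Configuration class it is usually cleaner to let Spring inject one bean into another via a method parameter instead of calling the @Bean method directly. A sketch of the same configuration written that way (the explicit port setting is my addition; 9042 is the connector's default anyway):

```java
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class Sparkstart {

    @Bean
    public SparkSession sparksession() {
        return SparkSession
                .builder()
                .master("local[*]")
                .appName("preprocessing")
                .config("spark.cassandra.connection.host", "127.0.0.1")
                // Default native transport port; shown here only to make it explicit
                .config("spark.cassandra.connection.port", "9042")
                .getOrCreate();
    }

    @Bean
    public JavaSparkContext sc(SparkSession sparksession) {
        // Spring passes the SparkSession bean in as a parameter,
        // so the builder above runs exactly once
        return new JavaSparkContext(sparksession.sparkContext());
    }
}
```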

The class where I fetch the data from my Cassandra database:

@Component
public class Aftersparkstart {
    @Autowired
    private SparkSession sp;

    @Autowired
    private JavaSparkContext sc;

    @Autowired
    private Pearsonclass prs;

    public Matrix start() {

        List<String> desclist = new ArrayList<>();
        desclist.add(some data);
        desclist.add(some data);

        Dataset<Row> peakset = sp.read().format("org.apache.spark.sql.cassandra")
                .options(new HashMap<String, String>() {
                    {
                        put("keyspace", "mdb");
                        put("table", "filepeaks");
                    }
                })
                .load()
                .select(col("directoryname"), col("description"), col("intensity"))
                .filter(col("description").isin(desclist.toArray()));

        Dataset<Row> finalpeaks = peakset
                .groupBy(peakset.col("description"), peakset.col("directoryname"))
                .avg("intensity")
                .orderBy(asc("directoryname"), asc("description"));

        Matrix r = prs.pearsonmethod(finalpeaks, dirlist, desclist);
        return r;
    }
}
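Unrelated to the connection error: the double-brace options map above creates an anonymous HashMap subclass that holds a reference to the enclosing bean, which can cause trouble if it ever ends up captured in a Spark closure. A plain helper (the class and method names here are mine, just for illustration) builds the same map without that:

```java
import java.util.HashMap;
import java.util.Map;

public class CassandraOptions {
    // Builds the options map for the Cassandra data source without the
    // anonymous-subclass ("double brace") idiom.
    public static Map<String, String> forTable(String keyspace, String table) {
        Map<String, String> opts = new HashMap<>();
        opts.put("keyspace", keyspace);
        opts.put("table", table);
        return opts;
    }
}
```

It would then be used as `sp.read().format("org.apache.spark.sql.cassandra").options(CassandraOptions.forTable("mdb", "filepeaks")).load()`.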

And the class where the Spark processing happens:

@Component
public class Pearsonclass{

    public Matrix pearsonmethod(Dataset<Row> peaks, List<String> dirlist, List<String> desclist) {
        // ...stuff...
        return r2;
    }
}

And finally the RestController:

@RestController
public class Firstcontroller {

    @Autowired
    private Aftersparkstart str;

    @RequestMapping("/welcome")
    public Matrix welcome() {
        //return wlc.retrievemsg();
        return str.start();
    }
}

I'm pretty sure I'm missing something in the dependencies, but I can't figure out what!

Got it! I just upgraded my Cassandra version from 3.11.0 to 3.11.2. The problem was that the JDK was incompatible with Cassandra: my JDK 1.8.0_162-8u162 was not compatible with the previous Cassandra version..!
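For anyone hitting the same wall, checking the two versions side by side makes this kind of mismatch easy to spot (assuming nodetool is on the PATH and the node is running):

```shell
java -version        # e.g. 1.8.0_162
nodetool version     # reports the running Cassandra release, e.g. 3.11.2
```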