Is there any way to directly query the Oracle DB with a where clause instead of filtering the complete DataFrame in Spark SQL 2.x?

I am using spark-sql 2.3.1 and I am connecting to an Oracle DB to get a DataFrame.

Code:

    ora_df
      .option("schema", schema)
      .option("partitionColumn", partitionColumn)
      .option("dbtable", query)
      .load()

Issue: I would like to filter the table by date. So is there any way to pass the query with a dynamically built where clause?

The usual way is like this:

    val dataframe_mysql_4 = spark.read.jdbc(
      jdbcUrl,
      s"""(select DISTINCT type from family where type like '${val1}') f""",
      connectionProperties)

Set up your connection string, and then, with the value of val1 set in a variable, you can add a where clause that uses it. Note the f: the subquery passed in place of a table name needs an alias.
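Applied back to the Oracle case, the same idea works with the DataFrameReader options from the question: build the where clause dynamically and pass the whole thing as an aliased subquery through dbtable, so the filter is evaluated by Oracle instead of by Spark after loading everything. Below is a minimal sketch, assuming hypothetical connection details and names (jdbcUrl, my_schema.my_table, event_date, id are all placeholders):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("oracle-pushdown").getOrCreate()

    // Hypothetical placeholders -- adjust to your environment.
    val jdbcUrl   = "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1"
    val startDate = "2019-01-01"   // the dynamically supplied date bound

    // Build the where clause dynamically and wrap the whole query as an aliased
    // subquery. Oracle evaluates the filter, so only matching rows reach Spark.
    val query =
      s"""(select * from my_schema.my_table
           where event_date >= TO_DATE('$startDate', 'YYYY-MM-DD')) t"""

    val ora_df = spark.read
      .format("jdbc")
      .option("url", jdbcUrl)
      .option("driver", "oracle.jdbc.OracleDriver")
      .option("user", "db_user")
      .option("password", "db_password")
      .option("dbtable", query)            // the subquery takes the place of a table name
      .option("partitionColumn", "id")     // numeric column used to parallelize reads
      .option("lowerBound", "1")
      .option("upperBound", "1000000")
      .option("numPartitions", "8")
      .load()

If I recall correctly, Spark 2.4 added a dedicated query option for this purpose, but on 2.3.1 the aliased-subquery form through dbtable is the way to go. Also, since the date is interpolated directly into the SQL string, make sure the value comes from a trusted source.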