Spark SQL when joining two or more tables using two select statements

This is my statement:

val Porders = sqlContext.sql(
    """SELECT count(STATUS_CD) 
    FROM s_order 
    WHERE STATUS_CD = 'pending' AND ROW_ID IN 
        ( SELECT so.ROW_ID FROM s_order so 
        JOIN s_order_item soi 
        ON so.ROW_ID = soi.ORDER_ID 
        JOIN s_order_type sot 
        ON so.ORDER_TYPE_ID = sot.ROW_ID 
        JOIN s_product sp 
        ON soi.PROD_ID = sp.ROW_ID
        WHERE (sp.NAME like '%VIP%' OR sp.NAME like '%BIZ%' OR sp.NAME like '%UniFi%') 
        AND LOWER(sot.NAME) = 'new install')
    """)

I get the following error:

ERROR : java.lang.RuntimeException: [3.3] failure: identifier expected
( SELECT so.ROW_ID FROM s_order so JOIN s_order_item soi 
  ^

What could be the reason?

This happens because subqueries are not supported: see SPARK-4226.

Even a query like
sqlContext.sql(
  """SELECT count(STATUS_CD)
     FROM s_order
     WHERE STATUS_CD = 'pending' AND ROW_ID IN
       (SELECT * FROM s_order)
  """)

does not currently work (speaking of Spark SQL 1.5.1).

Try replacing your subquery with a join, e.g. https://dev.mysql.com/doc/refman/5.1/en/rewriting-subqueries.html
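For your query, a join-based rewrite might look like the sketch below (assuming the same tables and columns as in your original statement). Note the `DISTINCT`: the join through `s_order_item` can produce one row per order item, so counting without deduplication could inflate the result relative to the `IN` version.

```scala
// Sketch: IN-subquery rewritten as plain joins, which Spark SQL 1.5.1 supports.
// DISTINCT on so.ROW_ID keeps the count equal to the number of qualifying
// orders, even when an order has several matching items.
val pendingOrders = sqlContext.sql(
  """SELECT count(DISTINCT so.ROW_ID)
     FROM s_order so
     JOIN s_order_item soi
       ON so.ROW_ID = soi.ORDER_ID
     JOIN s_order_type sot
       ON so.ORDER_TYPE_ID = sot.ROW_ID
     JOIN s_product sp
       ON soi.PROD_ID = sp.ROW_ID
     WHERE so.STATUS_CD = 'pending'
       AND (sp.NAME LIKE '%VIP%' OR sp.NAME LIKE '%BIZ%' OR sp.NAME LIKE '%UniFi%')
       AND LOWER(sot.NAME) = 'new install'
  """)
```

Since every filter from the subquery now lives in the outer `WHERE` clause, no nested `SELECT` is needed at all.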