PySpark: using RANGE BETWEEN in a Spark SQL window function

When I run:

    spark.sql('''
        select client,
               avg(amount) over (
                   partition by client
                   order by my_timestamp
                   range between interval 30 days preceding and current row
               ) as moving_avg_amount
        from my_table''')

it works.
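
For context, here is a minimal, self-contained sketch that reproduces the working query above. The table and column names (my_table, client, amount, my_timestamp) come from the question; the sample rows and the session name are made up for illustration.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("range-between-demo").getOrCreate()

    # Made-up sample rows matching the schema implied by the question.
    df = spark.createDataFrame(
        [("a", 10.0, "2024-01-01"),
         ("a", 20.0, "2024-01-15"),
         ("a", 30.0, "2024-02-20")],
        ["client", "amount", "day"],
    ).select("client", "amount", F.to_timestamp("day").alias("my_timestamp"))

    df.createOrReplaceTempView("my_table")

    # 30-day trailing average per client, up to and including the current row.
    spark.sql('''
        select client, my_timestamp,
               avg(amount) over (
                   partition by client
                   order by my_timestamp
                   range between interval 30 days preceding and current row
               ) as moving_avg_amount
        from my_table''').show()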

But if I want to exclude the last X days, it fails:

    ... range between interval 30 days preceding and 12 days preceding ...

or:

    ... range between interval 30 days preceding and interval 12 days preceding ...

What is the correct syntax?
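
One commonly suggested workaround, sketched here rather than guaranteed for every Spark version, is to avoid interval bounds altogether: order the window by the timestamp converted to epoch seconds with unix_timestamp, and express both frame bounds as plain numbers of seconds.

    spark.sql('''
        select client,
               avg(amount) over (
                   partition by client
                   order by unix_timestamp(my_timestamp)  -- epoch seconds
                   range between 2592000 preceding        -- 30 * 86400
                             and 1036800 preceding        -- 12 * 86400
               ) as moving_avg_amount
        from my_table''')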

It has been suggested elsewhere that range intervals do not work properly in Spark SQL.
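
If the SQL interval syntax does misbehave on your Spark version, the same frame can also be expressed through the DataFrame API, where rangeBetween takes plain long offsets. A sketch, assuming df holds the columns from the question:

    from pyspark.sql import functions as F, Window

    SECONDS_PER_DAY = 86400

    # Frame: rows whose timestamp falls between 30 and 12 days
    # before the current row's timestamp, per client.
    w = (
        Window.partitionBy("client")
        .orderBy(F.unix_timestamp("my_timestamp"))
        .rangeBetween(-30 * SECONDS_PER_DAY, -12 * SECONDS_PER_DAY)
    )

    result = df.withColumn("moving_avg_amount", F.avg("amount").over(w))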