Filtering a dataframe in pyspark

I have a dataframe that looks like this:

[Row(cardholder_nbr=u'10', category_nbr=87, club_nbr=6279, membership_nbr=u'780941902', retail_all=51.23, sub_category_nbr=1, system_item_nbr=3572924, unit_qty=1.0, visit_date=u'2016-08-05', visit_nbr=1608057826021405)]

I want to filter it on a visit-date range from 2016-09-15 to 2016-09-23. How can I do that? If I run:

df = sqlContext.sql("SELECT * from df_table WHERE visit_date >= '2016-09-15' AND visit_date <= '2016-09-23'")

I get this error:

"invalid literal for int()"

How do I filter it correctly?

Can you try this, and let us know if it works?

from pyspark.sql import Row, SparkSession
import pyspark.sql.functions as f

# Build a SparkContext so the example is self-contained
spark = SparkSession.builder.getOrCreate()
sc = spark.sparkContext

data = [Row(cardholder_nbr=u'10', category_nbr=87, club_nbr=6279, membership_nbr=u'780941902', retail_all=51.23, sub_category_nbr=1, system_item_nbr=3572924, unit_qty=1.0, visit_date=u'2016-08-05', visit_nbr=1608057826021405),
        Row(cardholder_nbr=u'10', category_nbr=87, club_nbr=6279, membership_nbr=u'780941902', retail_all=51.23, sub_category_nbr=1, system_item_nbr=3572924, unit_qty=1.0, visit_date=u'2016-09-16', visit_nbr=1608057826021405)]
df = sc.parallelize(data).toDF()
df.show()
# Filter on the visit_date column directly, combining the two bounds with &
df.where((f.col("visit_date") >= '2016-09-15') & (f.col("visit_date") <= '2016-09-23')).show()
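The string comparison above is safe because visit_date is stored in ISO-8601 format (YYYY-MM-DD), which sorts lexicographically in the same order as chronologically. A quick pure-Python sketch of why the range filter works:

```python
# ISO-8601 date strings (YYYY-MM-DD) compare lexicographically in
# chronological order, so plain string comparison gives a correct range filter.
dates = ["2016-08-05", "2016-09-16", "2016-09-23", "2016-10-01"]

# Keep only dates inside the inclusive range, exactly as the where() clause does
in_range = [d for d in dates if "2016-09-15" <= d <= "2016-09-23"]
print(in_range)  # → ['2016-09-16', '2016-09-23']
```

If visit_date ever arrives in a non-ISO format, this assumption breaks; in that case cast the column first (e.g. with `f.to_date`) before comparing.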