How to escape single quotes in UNLOAD
conn_string = "dbname='{}' port='{}' user='{}' password='{}' host='{}'"\
    .format(dbname, port, user, password, host_url)
sql = """UNLOAD ('select col1,col2 from %s.visitation_hourly_summary_us where col4= '2018-07-10' and col5= '1';') TO 's3://%s/%s/%s.csv' \
credentials 'aws_access_key_id=%s;aws_secret_access_key=%s' \
MANIFEST GZIP ALLOWOVERWRITE;Commit;""" \
    % (schema_name, s3_bucket_name, schema, table, aws_access_key_id,
       aws_secret_access_key)
con = psycopg2.connect(conn_string)
cur = con.cursor()
cur.execute(sql)
I'm trying to run the script above to read a table and then create a file in S3.
Since my columns are strings, I can't escape the single quotes, and I get a syntax error near them.
I've also tried adding \ in the where condition, but it still shows the same error.
Any help is much appreciated.
Thanks
You can put the values in double quotes.
'select col1,col2 from %s.visitation_hourly_summary_us where col4= "2018-07-10" and col5= "1";'
As Sarang says, just replace the single quotes around the col4 and col5 values of the query with double quotes.
However, I'd suggest splitting the string into smaller chunks for easier reading and maintenance. That way you should be able to use execute as chepner suggests
(and as the MySQL documentation shows):
# Create the inner SQL statement. Notice the single quotes for the general
# string and the double quotes for the col4 and col5 values
sql_stmt = ('SELECT col1, col2 '
            'FROM %s.visitation_hourly_summary_us '
            'WHERE col4 = "2018-07-10" AND col5 = "1";' % schema_name)

# Format the s3 path
s3_target = 's3://%s/%s/%s.csv' % (s3_bucket_name, schema, table)

# Format credentials string
s3_credentials = 'aws_access_key_id=%s;aws_secret_access_key=%s' % (
    aws_access_key_id, aws_secret_access_key)

# Create a tuple with all preformatted strings
data = (sql_stmt, s3_target, s3_credentials)

# Format the s3 query skeleton
s3_stmt = ("UNLOAD ('%s') TO '%s' "
           "CREDENTIALS '%s' "
           "MANIFEST GZIP ALLOWOVERWRITE;Commit;")

con = psycopg2.connect(conn_string)
cur = con.cursor()
cur.execute(s3_stmt, data)
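One caveat with the skeleton approach: psycopg2's `%s` placeholders apply their own quoting to each bound string, which can mangle an UNLOAD statement whose inner query must arrive as one literal. A minimal sketch (bucket, schema, and column values are made up for illustration) that assembles the statement in plain Python instead, doubling any single quote inside the inner query as Redshift expects:

```python
# Inner query written with ordinary single quotes around its literals.
inner_query = ("SELECT col1, col2 FROM myschema.visitation_hourly_summary_us "
               "WHERE col4 = '2018-07-10' AND col5 = '1'")

# Redshift rule: a literal single quote inside UNLOAD('...') is doubled.
escaped = inner_query.replace("'", "''")

s3_target = 's3://my-bucket/exports/summary.csv'  # hypothetical path
credentials = 'aws_access_key_id=;aws_secret_access_key='  # fill in real keys

unload_sql = ("UNLOAD ('%s') TO '%s' "
              "CREDENTIALS '%s' "
              "MANIFEST GZIP ALLOWOVERWRITE;" % (escaped, s3_target, credentials))
print(unload_sql)
```

The assembled string can then be passed to `cur.execute(unload_sql)` with no parameter tuple.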
(A single quote can be sent as) -> \\'
I've used this in both R and Python. See the solution below:
If your SQL query is
Select * from sample_table where register_date='2018-12-31'
then write the unload command like this:
sql= """unload ('Select * from tnltemp.otpsuccess_details where register_date=\\'2018-12-31\\' ')
to 's3://my-bucket/migration/exported_sample_table_' credentials
'aws_access_key_id=12234123;aws_secret_access_key=12345'
DELIMITER AS ','
NULL AS ''
parallel off;"""
cur = con.cursor()
cur.execute(sql)
You may want to enclose the value with two single quotes.
If your query contains quotes (for example to enclose literal values),
put the literal between two sets of single quotation marks—you must
also enclose the query between single quotation marks:
Example:
UNLOAD ('select * from venue where venuestate=''NV''')
From the Redshift documentation:
https://docs.aws.amazon.com/redshift/latest/dg/r_UNLOAD.html
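Following that rule, the doubling can be done programmatically before the query is spliced into UNLOAD; a small sketch (the helper name is my own):

```python
def escape_for_unload(query):
    """Double every single quote so the query can sit inside UNLOAD('...')."""
    return query.replace("'", "''")

inner = "select * from venue where venuestate='NV'"
stmt = "UNLOAD ('%s')" % escape_for_unload(inner)
print(stmt)  # UNLOAD ('select * from venue where venuestate=''NV''')
```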
You can also use the postgres style:
unload
($$
select * from table where id='ABC'
$$)
to 's3://bucket/queries_results/20150324/table_dump/'
credentials 'aws_access_key_id=;aws_secret_access_key='
;
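With dollar quoting, the inner query needs no escaping at all; a sketch of building the same statement in Python (the bucket path mirrors the example above and is illustrative):

```python
inner = "select * from table where id='ABC'"
s3_path = 's3://bucket/queries_results/20150324/table_dump/'

# $$ ... $$ delimits the query, so every single quote inside it
# passes through untouched.
stmt = ("unload ($$\n%s\n$$)\n"
        "to '%s'\n"
        "credentials 'aws_access_key_id=;aws_secret_access_key='\n"
        ";" % (inner, s3_path))
print(stmt)
```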