AmazonS3 - connecting with Python Boto according to specific permissions
I am trying to connect to Amazon S3 via Boto 2.38.0 and Python 3.4.3.
The S3 account is owned by another company, and they granted only these permissions:
"Statement":
[
    {
        "Effect": "Allow",
        "Action": "s3:ListBucket",
        "Resource": "arn:aws:s3:::GA-Exports",
        "Condition": {
            "StringLike": {
                "s3:prefix": "Events_3112/*"
            }
        }
    },
    {
        "Effect": "Allow",
        "Action": [
            "s3:GetObject",
            "s3:GetObjectAcl",
            "s3:GetBucketAcl"
        ],
        "Resource": "arn:aws:s3:::GA-Exports/Events_3112/*",
        "Condition": {}
    }
]
I can connect and retrieve a specific file if I set its name, but I need to retrieve all the data from S3 (for example, to determine from a script which files I have not downloaded yet).
from boto.s3.connection import S3Connection
from boto.s3.connection import OrdinaryCallingFormat

s3_connection = S3Connection(access_key, secret_key, calling_format=OrdinaryCallingFormat())
bucket = s3_connection.get_bucket(__bucket_name, validate=False)
key = bucket.get_key(file_name)
This works, but
all_buckets = s3_connection.get_all_buckets()
raises an error:
S3ResponseError: S3ResponseError: 403 Forbidden
<?xml version="1.0" encoding="UTF-8"?>
<Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>19D20ADCFFC899ED</RequestId><HostId>eI4CzQqAvOnjcXJNZyyk+drFHjO9+yj0EtP+vJ5f/D7D4Dh2HFL3UvCacy9nP/wT</HostId></Error>
With the S3 Browser software, I can right-click > "export file list" and get what I need. But how can I do that in Python?
Edit:
Finally found the answer:
bucket_name = 'GA-Exports'
s3_connection = S3Connection(access_key, secret_key, calling_format=OrdinaryCallingFormat())
bucket = s3_connection.get_bucket(bucket_name, validate=False)
for key in bucket.list(prefix='Events_3112/DEV/'):
    print(key.name, key.size, key.last_modified)
Thanks for your help! :)
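The original goal was to determine from a script which files have not been downloaded yet. A minimal sketch of that comparison, assuming downloads are kept flat in one local directory (the `missing_keys` helper and the prefix default are my assumptions, not part of the original post); the key names would come from `bucket.list()` as above:

```python
import os


def missing_keys(remote_key_names, local_dir, prefix='Events_3112/DEV/'):
    """Return the remote key names that have no matching file in local_dir.

    remote_key_names: iterable of key-name strings, e.g. from
    [k.name for k in bucket.list(prefix=...)].
    Keys are compared by the filename part after the prefix.
    """
    # If the download directory does not exist yet, everything is missing.
    local_files = set(os.listdir(local_dir)) if os.path.isdir(local_dir) else set()
    missing = []
    for name in remote_key_names:
        filename = name[len(prefix):] if name.startswith(prefix) else name
        # Skip "folder placeholder" keys whose name equals the prefix.
        if filename and filename not in local_files:
            missing.append(name)
    return missing
```

For example, if `a.csv` is already in the local directory, only `Events_3112/DEV/b.csv` would be reported as missing.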
You won't be allowed to get all buckets; the permissions state that you can only list the contents of the "GA-Exports" bucket:
from boto.s3.connection import S3Connection
from boto.s3.connection import OrdinaryCallingFormat

# this is to avoid a 301 Moved Permanently error when using OrdinaryCallingFormat
if '.' in __bucket_name:
    conn = S3Connection(access_key, secret_key, calling_format=OrdinaryCallingFormat())
else:
    conn = S3Connection(access_key, secret_key)

bucket = conn.get_bucket(__bucket_name, validate=False)
keys = bucket.list(prefix='Events_3112/')  # an iterable of the keys within the bucket
# another option is to use bucket.get_all_keys()
for key in keys:
    print(key.name)  # or whatever you want to do with each file name
    # recall this is only the key name, not the file itself :-D
See the full bucket object reference at http://boto.readthedocs.org/en/latest/ref/s3.html#module-boto.s3.bucket
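Once the keys are listed, the files themselves still have to be fetched with boto 2's `Key.get_contents_to_filename`. A sketch, assuming you want a local directory layout mirroring the key paths (the `local_path_for` helper and the `'downloads'` directory name are hypothetical, not from the answer above):

```python
import os


def local_path_for(key_name, dest_dir):
    """Map an S3 key like 'Events_3112/DEV/file.csv' to a local path,
    recreating the key's "folder" structure under dest_dir."""
    path = os.path.join(dest_dir, *key_name.split('/'))
    os.makedirs(os.path.dirname(path), exist_ok=True)
    return path


# With a boto 2 bucket object (requires credentials, so not run here):
# for key in bucket.list(prefix='Events_3112/'):
#     if not key.name.endswith('/'):  # skip folder placeholder keys
#         key.get_contents_to_filename(local_path_for(key.name, 'downloads'))
```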
Edit: added a fix for the 301 Moved Permanently error received when accessing S3 via OrdinaryCallingFormat, and added @garnaat's comment about the prefix (thanks!)