Bypass a custom exception but print all other exceptions
I am running the try-except code below:
try:
    paths = file_system_client.get_paths("{0}/{1}/0/{2}/{3}/{4}".format(container_initial_folder, container_second_folder, chronological_date[0], chronological_date[1], chronological_date[2]), recursive=True)
    list_of_paths=["abfss://{0}@{1}.dfs.core.windows.net/".format(storage_container_name, storage_account_name)+path.name for path in paths if ".avro" in path.name]
except Exception as e:
    if e=="AccountIsDisabled":
        pass
    else:
        print(e)
I don't want to print the following error when my try-except encounters it, nor do I want it to stop my program's execution:

"(AccountIsDisabled) The specified account is disabled.
RequestId:3159a59e-d01f-0091-5f71-2ff884000000
Time:2020-05-21T13:09:03.3540242Z"

I just want to skip it, and print any other error/exception (e.g. TypeError, ValueError, etc.) that might occur.
Is this doable in Python 3?
Please note that the .get_paths() method belongs to the azure.storage.filedatalake module, which connects Python directly to Azure Data Lake for path extraction.
I am giving this clarification to point out that the exception I am trying to bypass is not a built-in one.
[UPDATE] After following the suggested answer, I modified my code to:
import sys
import functools
import operator
from concurrent.futures import ThreadPoolExecutor
from azure.storage.filedatalake._models import StorageErrorException
from azure.storage.filedatalake import DataLakeServiceClient, DataLakeFileClient

storage_container_name="name1" #confidential
storage_account_name="name2" #confidential
storage_account_key="password" #confidential
container_initial_folder="name3" #confidential
container_second_folder="name4" #confidential

def datalake_connector(storage_account_name, storage_account_key):
    datalake_client = DataLakeServiceClient(account_url="{0}://{1}.dfs.core.windows.net".format("https", storage_account_name), credential=storage_account_key)
    print("Client successfully created!")
    return datalake_client

def create_list_paths(chronological_date,
                      container_initial_folder="name3",
                      container_second_folder="name4",
                      storage_container_name="name1",
                      storage_account_name="name2"
                      ):
    list_of_paths=list()
    print("1. success")
    paths = file_system_client.get_paths("{0}/{1}/0/{2}/{3}/{4}".format(container_initial_folder, container_second_folder, chronological_date[0], chronological_date[1], chronological_date[2]), recursive=True)
    print("2. success")
    list_of_paths=["abfss://{0}@{1}.dfs.core.windows.net/".format(storage_container_name, storage_account_name)+path.name for path in paths if ".avro" in path.name]
    print("3. success")
    list_of_paths=functools.reduce(operator.iconcat, result, [])
    return list_of_paths

service_client = datalake_connector(storage_account_name, storage_account_key)
file_system_client = service_client.get_file_system_client(file_system=storage_container_name)

try:
    list_of_paths=[]
    executor=ThreadPoolExecutor(max_workers=8)
    print("Start path extraction!")
    list_of_paths=[executor.submit(create_list_paths, i, container_initial_folder, storage_container_name, storage_account_name).result() for i in date_list]
except:
    print("no success")
    print(sys.exc_info())
Unfortunately, StorageErrorException could not be handled for some reason, and I still get the following stdout:
You want to compare the exception's type, so change your condition to:

    if type(e)==AccountIsDisabled:

Example:
class AccountIsDisabled(Exception):
    pass

print("try #1")
try:
    raise AccountIsDisabled
except Exception as e:
    if type(e)==AccountIsDisabled:
        pass
    else:
        print(e)

print("try #2")
try:
    raise Exception('hi', 'there')
except Exception as e:
    if type(e)==AccountIsDisabled:
        pass
    else:
        print(e)
Output:
try #1
try #2
('hi', 'there')
Reference: [Python 3.Docs]: Compound statements - The try statement.
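A related point, added for completeness: `type(e) == AccountIsDisabled` matches only that exact class, whereas `except` clauses match via `isinstance`, so a dedicated `except AccountIsDisabled:` clause also swallows subclasses, which is usually what you want. A minimal sketch using a made-up `AccountIsDisabled` hierarchy:

```python
class AccountIsDisabled(Exception):
    pass

class AccountIsDisabledSubclass(AccountIsDisabled):
    """Hypothetical subclass, only for illustration."""
    pass

def classify(exc):
    """Raise exc and report which handler it lands in."""
    try:
        raise exc
    except AccountIsDisabled:
        # Catches AccountIsDisabled AND its subclasses (isinstance semantics);
        # the type(e)==AccountIsDisabled check would miss the subclass.
        return "skipped"
    except Exception as e:
        return "printed: {}".format(e)

print(classify(AccountIsDisabled()))          # skipped
print(classify(AccountIsDisabledSubclass()))  # skipped
print(classify(ValueError("boom")))           # printed: boom
```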
There are several ways to achieve this. Here is one:
try:
    # ...
except StorageErrorException:
    pass
except:
    print(sys.exc_info()[1])
Note that a bare except: is tricky, as you might silently handle exceptions that you shouldn't. An alternative is to only catch the exceptions that your code could explicitly raise:
try:
    # ...
except StorageErrorException:
    pass
except (SomeException, SomeOtherException, SomeOtherOtherException) as e:
    print(e)
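One more caveat, not part of the original answer: a bare `except:` catches everything derived from `BaseException`, including `SystemExit` and `KeyboardInterrupt`, while `except Exception:` leaves those alone. A small sketch:

```python
def handle(exc):
    """Raise exc and report which handler classifies it."""
    try:
        raise exc
    except Exception:
        # Ordinary errors (ValueError, TypeError, ...) land here.
        return "handled"
    except BaseException:
        # A bare `except:` would also swallow these
        # (SystemExit, KeyboardInterrupt, ...).
        return "swallowed"

print(handle(ValueError("boom")))  # handled
print(handle(SystemExit(0)))       # swallowed
```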
A quick look over the [MS.Docs]: filedatalake package and the source code revealed that StorageErrorException (which extends the [MS.Docs]: HttpResponseError class) is the one you need to handle.
You might also want to check [SO]: About catching ANY exception.
Regarding the failure to catch the exception: there appear to be two classes with the same name:
- azure.storage.blob._generated.models._models_py3.StorageErrorException (currently imported)
- azure.storage.filedatalake._generated.models._models_py3.StorageErrorException
I don't know the rationale behind this (I haven't worked with the package), but having a package raise an exception defined in another package, while also defining one with the very same name itself, seems lame.
Anyway, importing the correct exception solves the problem.
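When two exception classes share a name like this, printing the caught exception's fully qualified class name tells you which one was actually raised, and therefore which one to import. A debugging sketch using a built-in exception as a stand-in (not the real Azure classes):

```python
def qualified_name(exc):
    """Return 'module.ClassName' for the class of a caught exception."""
    cls = type(exc)
    return "{}.{}".format(cls.__module__, cls.__name__)

try:
    raise KeyError("which module?")
except Exception as e:
    # For the Azure case this would print something like
    # azure.storage.blob._generated.models._models_py3.StorageErrorException,
    # revealing which of the two same-named classes was raised.
    print(qualified_name(e))  # builtins.KeyError
```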
As a side note, when dealing with situations like this, don't import just the bare name; spell out the full module path so there is no ambiguity about which class you get:

    from azure.storage.filedatalake._generated.models import StorageErrorException