How to download an image from the internet using Google Colab / Jupyter

I need to download an image using a URL. I managed to get the URLs of the images I need to download, but now I don't know how to download them to my local computer. I am using Google Colab / Jupyter. Thanks! Here is my code so far:

from bs4 import BeautifulSoup
import requests
import json
import urllib.request

#input userid - plan: have program read userids from csv or excel file
userid = xxxxxxxx

#use Globe API to get data
source = requests.get('https://api.globe.gov/search/v1/measurement/protocol/measureddate/userid/?protocols=land_covers&startdate=2020-05-04&enddate=2020-07-16&userid=' + str(userid) +'&geojson=FALSE&sample=FALSE').text
#set up BeautifulSoup4
soup = BeautifulSoup(source, 'lxml')

#Isolate the Json data and put it into a string called "paragraph"
body = soup.find('body')
paragraph = body.p.text

#load the string into a python object
data = json.loads(paragraph)

#pick out the needed information and store them
for landcover in data['results']:
  siteId = landcover['siteId']
  measuredDate = landcover['measuredDate']
  latitude = landcover['latitude']
  longitude = landcover['longitude']
  protocol = landcover['protocol']
  DownURL = landcover['data']['landcoversDownwardPhotoUrl']
  #Here is where I want to download the url contained in 'DownURL'

Try:

from google.colab import files as FILE
import os

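# fetch the raw image bytes and write them to a file in the Colab runtime's filesystem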
img_data = requests.get(DownURL).content
with open('image_name.jpg', 'wb') as handler:
    handler.write(img_data)

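# FILE.download() sends the saved file from the Colab runtime to the browser's local downloads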
FILE.download('image_name.jpg')
os.remove('image_name.jpg') # to save up space

If you do not want to set up an image name or a counter variable that keeps incrementing on each loop iteration, you can call a random function instead.
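
For illustration, here is a minimal sketch of that idea, assuming the `data` object and loop from the question's code; the counter-based name `landcover_{i}.jpg` and the uuid alternative are illustrative choices, not part of the original answer:

import os
import uuid
import requests
from google.colab import files

for i, landcover in enumerate(data['results']):
    DownURL = landcover['data']['landcoversDownwardPhotoUrl']

    # counter-based name: landcover_0.jpg, landcover_1.jpg, ...
    filename = f'landcover_{i}.jpg'
    # or a random name instead of the counter:
    # filename = f'{uuid.uuid4().hex}.jpg'

    # fetch the image and save it in the Colab runtime
    img_data = requests.get(DownURL).content
    with open(filename, 'wb') as handler:
        handler.write(img_data)

    # send it to the local machine, then free space in the runtime
    files.download(filename)
    os.remove(filename)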