A tough scraping case using Selenium

So I am trying to scrape a web table using Selenium, extracting the table with an XPath:

Earlier I tried searching for a table class but could not find any table, so I decided to look for the div element instead.

from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions
from bs4 import BeautifulSoup
import pandas as pd

# driver is assumed to be initialised and the page already loaded
xpath = "//div[@class='table-scroller ScrollableTable__table-scroller QuoteHistoryTable__table__scroller QuoteHistoryTable__QuoteHistoryTable__table__scroller']"
WebDriverWait(driver, 10).until(
    expected_conditions.visibility_of_element_located((By.XPATH, xpath)))
source = driver.page_source
driver.quit()
soup = BeautifulSoup(source, "html5lib")

table = soup.find('div', {'class': 'table-scroller ScrollableTable__table-scroller QuoteHistoryTable__table__scroller QuoteHistoryTable__QuoteHistoryTable__table__scroller'})
df = pd.read_html(str(table), flavor='html5lib', header=0, thousands='.', decimal=',')
print(df[0])

The problem I am running into is that only the headers get printed, plus a first row full of NaNs:

Why am I not getting the values of the table? Why is scraping this content so hard?
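For reference, here is a minimal, self-contained check (with illustrative numbers, not pulled from the live site) showing that `read_html` only yields values when the markup it receives actually contains populated `<table>` rows; rows rendered later by JavaScript never reach it:

```python
from io import StringIO

import pandas as pd

# Minimal static <table> mimicking the page's number formats
# (values here are made up for illustration).
html = """<table>
  <tr><th>KURS</th><th>VOLUME</th></tr>
  <tr><td>229,60</td><td>1.953.199</td></tr>
</table>"""

# Same thousands/decimal settings as in the snippet above
df = pd.read_html(StringIO(html), header=0, thousands='.', decimal=',')[0]
print(df)
```

With populated rows the European-formatted strings parse into 229.6 and 1953199; with an empty `<tbody>` you would get exactly the headers-plus-NaN result described above.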

Edit: @DebanjanB was able to provide a great answer, but I cannot reproduce its output. What could be the reason behind this?

To extract the contents of the OMX Stockholm 30 <table> using Selenium, you can use the following Locator Strategy:

  • Using XPATH:

    print(WebDriverWait(driver, 20).until(EC.visibility_of_element_located((By.XPATH, "//h4[text()='OMX Stockholm 30']//following::div[2]//table"))).text)
    
  • Note: You have to add the following imports:

    from selenium.webdriver.support.ui import WebDriverWait
    from selenium.webdriver.common.by import By
    from selenium.webdriver.support import expected_conditions as EC
    
  • Console output:

    VÆRDIPAPIR
    KURS
    ÆNDRING I %
    ÆNDRING
    VOLUME
    BUD
    UDBUD
    OPDATERET
    ABB LTD
    229,60
    0,13%
    0,30
    1.953.199 229,20 229,30 18.09.2020
    ALFA LAVAL AB
    210,50
    1,20%
    2,50
    1.513.953 210,30 210,40 18.09.2020
    ASSA ABLOY AB SER. B
    216,00
    1,55%
    3,30
    3.250.421 216,20 216,40 18.09.2020
    ASTRAZENECA PLC
    995,10
    0,56%
    5,50
    507.005 994,70 995,00 18.09.2020
    ATLAS COPCO AB SER. A
    425,60
    1,89%
    7,90
    2.313.361 425,80 426,10 18.09.2020
    ATLAS COPCO AB SER. B
    376,60
    2,78%
    10,20
    971.096 376,60 376,90 18.09.2020
    AUTOLIV INC. SDB
    655,00
    -1,18%
    -7,80
    279.485 656,80 657,40 18.09.2020
    BOLIDEN AB
    275,80
    1,03%
    2,80
    2.450.311 276,60 276,80 18.09.2020
    ELECTROLUX, AB SER. B
    194,60
    0,34%
    0,65
    1.381.656 195,00 195,10 18.09.2020
    ERICSSON, TELEFONAB. L M SER.
    98,26
    1,30%
    1,26
    17.811.892 98,12 98,16 18.09.2020
    ESSITY AB SER. B
    306,40
    -0,20%
    -0,60
    1.795.692 306,20 306,40 18.09.2020
    GETINGE AB SER. B
    188,10
    1,65%
    3,05
    864.843 188,05 188,15 18.09.2020
    HENNES & MAURITZ AB, H &#3
    157,85
    -1,68%
    -2,70
    5.188.908 157,85 157,90 18.09.2020
    HEXAGON AB SER. B
    677,20
    0,06%
    0,40
    776.831 676,20 676,80 18.09.2020
    INVESTOR AB SER. B
    584,60
    1,53%
    8,80
    1.681.508 585,00 585,20 18.09.2020
    KINNEVIK AB SER. B
    336,95
    3,34%
    10,90
    1.118.689 336,35 336,55 18.09.2020
    NORDEA BANK ABP
    68,37
    -1,85%
    -1,29
    11.846.193 68,45 68,48 18.09.2020
    SANDVIK AB
    185,10
    1,54%
    2,80
    3.874.524 185,00 185,10 18.09.2020
    SECURITAS AB SER. B
    140,00
    -0,53%
    -0,75
    1.545.060 140,20 140,35 18.09.2020
    SKANDINAVISKA ENSKILDA BANKEN
    81,38
    -3,46%
    -2,92
    10.968.672 81,38 81,42 18.09.2020
    

Update

As you mentioned in the comments ...either it times out or I get only the headers... this effectively implies that our locator was correct and the issue lies with the rendering. In that case you can scrollIntoView() and use the following solution:

driver.get('https://www.euroinvestor.dk/markeder/aktier/sverige/omx-stockholm-30/21')
driver.execute_script("return arguments[0].scrollIntoView(true);", WebDriverWait(driver, 20).until(EC.visibility_of_element_located((By.XPATH, "//h4[text()='OMX Stockholm 30']"))))
print(WebDriverWait(driver, 20).until(EC.visibility_of_element_located((By.XPATH, "//h4[text()='OMX Stockholm 30']//following::div[2]//table"))).text)
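If the `.text` dump shown above is all you can get hold of, it can still be reshaped into rows. A minimal sketch, assuming eight header lines followed by five-line records whose last line packs volume, bid, ask and date together (a made-up single-record sample stands in for the full dump):

```python
# Stand-in for the element's .text output (one record shown)
text = """VÆRDIPAPIR
KURS
ÆNDRING I %
ÆNDRING
VOLUME
BUD
UDBUD
OPDATERET
ABB LTD
229,60
0,13%
0,30
1.953.199 229,20 229,30 18.09.2020"""

lines = text.splitlines()
headers, body = lines[:8], lines[8:]

rows = []
for i in range(0, len(body), 5):
    # First four lines of a record: name, price, change %, change
    name, kurs, pct, chg = body[i:i + 4]
    # Fifth line packs volume, bid, ask and date separated by spaces
    volume, bid, ask, date = body[i + 4].split()
    rows.append([name, kurs, pct, chg, volume, bid, ask, date])

print(rows[0])
```

From there the rows can be fed straight into `pd.DataFrame(rows, columns=headers)` if a frame is needed.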

If you inspect the page's network requests, you may notice an endpoint that serves you the right information as JSON:

https://api.euroinvestor.dk/indices/21/instruments

You can read directly from that URL using pandas (you don't even need Selenium):

instruments = pd.read_json('https://api.euroinvestor.dk/indices/21/instruments')
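The same idea works on any list of JSON records without touching the network; a sketch with a made-up payload (the field names are assumptions for illustration, not the endpoint's documented schema):

```python
import pandas as pd

# Stand-in for the JSON the endpoint might return; field names
# are hypothetical, not confirmed against the real API.
payload = [
    {"name": "ABB LTD", "last": 229.60, "volume": 1953199},
    {"name": "ALFA LAVAL AB", "last": 210.50, "volume": 1513953},
]

instruments = pd.DataFrame.from_records(payload)
print(instruments)
```

Once loaded, the frame can be filtered and exported like any other, with no HTML parsing involved.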

Be sure to check the API's terms of use (especially any rate limits); otherwise you may get blocked.