How to calculate probabilities using numpy.histogram and then use them to calculate KL divergence?

In the code below, `density=True` makes each bin report a probability density function value. Now, if I have to compute P(x), can I say that `hist` gives probabilities? For example, if the midpoint of the first bin is 0.5, can I say that the probability at x=0.5 is `hist[0]`? I need to compute the KL divergence, which uses P(x).

import numpy as np

x = np.array([0, 0, 0, 0, 0, 3, 3, 2, 2, 2, 1, 1, 1, 1])
hist, bin_edges = np.histogram(x, bins=10, density=True)

When you set `density=True`, NumPy returns a probability density function (call it p). Theoretically speaking, p(0.5) = 0, because probability is defined as the area under the PDF curve. You can read more details about it here. So if you want a probability, you have to define a range of interest and sum up the PDF values over that range.
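To illustrate, here is a minimal sketch (using the array from the question) that turns the density values into per-bin probabilities by multiplying each density by its bin width:

```python
import numpy as np

x = np.array([0, 0, 0, 0, 0, 3, 3, 2, 2, 2, 1, 1, 1, 1])
hist, bin_edges = np.histogram(x, bins=10, density=True)

# Probability mass of each bin = density * bin width
bin_widths = np.diff(bin_edges)
probs = hist * bin_widths

# The bin masses sum to 1, as a valid distribution should
print(np.isclose(probs.sum(), 1.0))  # -> True
```

So `probs[0]` is the probability that x falls anywhere inside the first bin, which is not the same thing as "the probability at x=0.5".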

As for KL, I can share my solution for computing mutual information (which is basically KL divergence):

import numpy as np
from scipy import ndimage

EPS = np.finfo(float).eps  # prevent log(0) / division by zero

def mutual_information(x, y, sigma=1):
    bins = (256, 256)
    # joint histogram
    hist_xy = np.histogram2d(x, y, bins=bins)[0]

    # smooth it out for better results
    ndimage.gaussian_filter(hist_xy, sigma=sigma, mode='constant', output=hist_xy)

    # compute marginals
    hist_xy = hist_xy + EPS  # prevent division by zero
    hist_xy = hist_xy / np.sum(hist_xy)
    hist_x = np.sum(hist_xy, axis=0)
    hist_y = np.sum(hist_xy, axis=1)

    # compute mutual information
    mi = (np.sum(hist_xy * np.log(hist_xy))
          - np.sum(hist_x * np.log(hist_x))
          - np.sum(hist_y * np.log(hist_y)))
    return mi
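As a sanity check, here is a self-contained usage sketch (the random test data is mine, not from the original): mutual information should be large for strongly dependent variables and near zero for independent ones.

```python
import numpy as np
from scipy import ndimage

EPS = np.finfo(float).eps

def mutual_information(x, y, sigma=1):
    bins = (256, 256)
    hist_xy = np.histogram2d(x, y, bins=bins)[0]
    ndimage.gaussian_filter(hist_xy, sigma=sigma, mode='constant', output=hist_xy)
    hist_xy = hist_xy + EPS
    hist_xy = hist_xy / np.sum(hist_xy)
    hist_x = np.sum(hist_xy, axis=0)
    hist_y = np.sum(hist_xy, axis=1)
    return (np.sum(hist_xy * np.log(hist_xy))
            - np.sum(hist_x * np.log(hist_x))
            - np.sum(hist_y * np.log(hist_y)))

rng = np.random.default_rng(0)
a = rng.normal(size=10_000)

# A variable is highly informative about a near-copy of itself...
mi_dependent = mutual_information(a, a + 0.01 * rng.normal(size=a.size))

# ...and carries almost no information about independent noise
mi_independent = mutual_information(a, rng.normal(size=a.size))

print(mi_dependent > mi_independent)  # -> True
```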

Edit: KL can be computed like this (note that I have not tested this!):

import numpy as np
from scipy import ndimage

EPS = np.finfo(float).eps  # prevent log(0) / division by zero

def kl(x, y, sigma=1, bins=(256, 256)):
    # joint histogram
    hist_xy = np.histogram2d(x, y, bins=bins)[0]

    # smooth it out for better results
    ndimage.gaussian_filter(hist_xy, sigma=sigma, mode='constant', output=hist_xy)

    # compute marginals
    hist_xy = hist_xy + EPS  # prevent division by zero
    hist_xy = hist_xy / np.sum(hist_xy)
    hist_x = np.sum(hist_xy, axis=0)
    hist_y = np.sum(hist_xy, axis=1)

    # KL(P_x || P_y) = sum P_x * log(P_x / P_y)
    kl = -np.sum(hist_x * np.log(hist_y / hist_x))
    return kl
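Alternatively, for the 1-D case in the question, you could estimate both distributions with `np.histogram` on a shared set of bin edges and compute KL(P || Q) directly. A sketch (the helper name `kl_1d` and the shared-bins choice are mine):

```python
import numpy as np

def kl_1d(p_samples, q_samples, bins=10):
    # Shared bin edges so that P and Q are comparable bin by bin
    lo = min(p_samples.min(), q_samples.min())
    hi = max(p_samples.max(), q_samples.max())
    edges = np.linspace(lo, hi, bins + 1)

    # Per-bin probabilities (counts normalized to sum to 1)
    p, _ = np.histogram(p_samples, bins=edges)
    q, _ = np.histogram(q_samples, bins=edges)
    eps = np.finfo(float).eps
    p = p / p.sum() + eps
    q = q / q.sum() + eps

    # KL(P || Q) = sum p * log(p / q)
    return np.sum(p * np.log(p / q))

rng = np.random.default_rng(0)
same = kl_1d(rng.normal(size=5000), rng.normal(size=5000))
shifted = kl_1d(rng.normal(size=5000), rng.normal(2.0, size=5000))
print(shifted > same)  # -> True
```

Note the asymmetry: KL(P || Q) != KL(Q || P) in general, so pick the direction your application actually needs.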

Also, for best results you should compute `sigma` with some heuristic, e.g. A rule-of-thumb bandwidth estimator.
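For example, Silverman's rule of thumb for a Gaussian kernel could be sketched like this (converting the bandwidth from data units into histogram-bin units is my own assumption, not part of the original answer):

```python
import numpy as np

def silverman_bandwidth(samples):
    """Silverman's rule of thumb: h = 0.9 * min(std, IQR/1.34) * n**(-1/5)."""
    n = samples.size
    std = samples.std(ddof=1)
    iqr = np.subtract(*np.percentile(samples, [75, 25]))  # 75th - 25th percentile
    return 0.9 * min(std, iqr / 1.34) * n ** (-0.2)

rng = np.random.default_rng(0)
data = rng.normal(size=1000)
h = silverman_bandwidth(data)

# For the histogram smoothing above, the bandwidth (in data units)
# would need to be expressed as a sigma in bins, e.g. with 256 bins:
bin_width = (data.max() - data.min()) / 256
sigma = h / bin_width
print(h > 0)  # -> True
```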