Distribution plot with wrong total value
I made a distribution plot with the following code:
from numpy import *
import numpy as np
import matplotlib.pyplot as plt

sigma = 4.1
x = np.linspace(-6*sigma, 6*sigma, 200)

def distr(n):
    def g(x):
        return (1/(sigma*sqrt(2*pi)))*exp(-0.5*(x/sigma)**2)
    FxSum = 0
    a = list()
    for i in range(n):
        # divide into 200 parts and sum one by one
        numb = g(-6*sigma + (12*sigma*i)/n)
        FxSum += numb
        a.append(FxSum)
    return a

plt.plot(x, distr(len(x)))
plt.show()
Of course, this is meant as a way to get the result without using hist(), cdf(), or any other ready-made option from the Python libraries.
Why doesn't the sum come out to 1? It should not depend on (for example) sigma.
Almost correct, but to integrate you have to multiply each function value g(x) by the width of your small interval dx (12*sigma/200). That product is the area of the slice you are summing up:
from numpy import *
import numpy as np
import matplotlib.pyplot as plt

sigma = 4.1
x = np.linspace(-6*sigma, 6*sigma, 200)

def distr(n):
    def g(x):
        return (1/(sigma*sqrt(2*pi)))*exp(-0.5*(x/sigma)**2)
    FxSum = 0
    a = list()
    for i in range(n):
        # divide into 200 parts and sum one by one;
        # each term is the area of one slice: height g(x_i) times width dx = 12*sigma/200
        numb = g(-6*sigma + (12*sigma*i)/n) * (12*sigma/200)
        FxSum += numb
        a.append(FxSum)
    return a

plt.plot(x, distr(len(x)))
plt.show()
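
As a quick cross-check (my addition, not part of the original answer), the same running Riemann sum can be written with np.cumsum. With the dx factor in place, the last cumulative value should land very close to 1 and the curve should track the normal CDF. scipy is used here only to verify the hand-rolled sum against a reference; the question deliberately avoids using such library routines for the result itself.

import numpy as np
from scipy.stats import norm   # reference only, to verify the hand-rolled sum

sigma = 4.1
n = 200
dx = 12*sigma/n                              # width of one slice

xs = -6*sigma + 12*sigma*np.arange(n)/n      # same sample points as the loop above
g = (1/(sigma*np.sqrt(2*np.pi))) * np.exp(-0.5*(xs/sigma)**2)
cdf_approx = np.cumsum(g) * dx               # vectorised version of the running sum

print(cdf_approx[-1])                        # ~1.0: total probability mass over +-6 sigma
print(abs(cdf_approx - norm.cdf(xs, scale=sigma)).max())  # ~dx*max(g), i.e. a few percent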