Is there a way to change a function's update list without re-compiling it in Theano?
Specifically, I want to change the learning rate at different stages of training. Something like:

for i in range(iter_num):
    learn_rate = i * alpha
    do_training(learn_rate, ...)

Obviously, re-compiling a new function for every iteration would be too slow. So I'm wondering whether there is a better way to do this in Theano?
Thanks!
You can make the learning rate a symbolic variable and pass it into the training function, like this:
import numpy
import theano
import theano.tensor as tt

def compile(input_size, hidden_size, output_size):
    # Shared parameters of a single-hidden-layer network
    W_h = theano.shared(numpy.random.standard_normal(size=(input_size, hidden_size)).astype(theano.config.floatX))
    b_h = theano.shared(numpy.zeros((hidden_size,), dtype=theano.config.floatX))
    W_y = theano.shared(numpy.random.standard_normal(size=(hidden_size, output_size)).astype(theano.config.floatX))
    b_y = theano.shared(numpy.zeros((output_size,), dtype=theano.config.floatX))

    x = tt.matrix('x')            # minibatch of inputs
    z = tt.ivector('z')           # integer class labels
    learning_rate = tt.scalar()   # symbolic learning rate, supplied at call time

    h = tt.tanh(theano.dot(x, W_h) + b_h)
    y = tt.nnet.softmax(theano.dot(h, W_y) + b_y)
    cost = tt.nnet.categorical_crossentropy(y, z).mean()

    # The updates reference the symbolic learning rate, so the step size
    # can change on every call without re-compiling the function.
    updates = [(p, p - learning_rate * tt.grad(cost, p)) for p in (W_h, b_h, W_y, b_y)]
    return theano.function([x, z, learning_rate], outputs=cost, updates=updates)

def main():
    input_size = 5
    hidden_size = 4
    output_size = 3
    train = compile(input_size, hidden_size, output_size)
    print(train([[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]], [1, 2], 0.1))

main()
Note that the training function now takes three parameters; the third one is the learning rate.
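With the learning rate as an ordinary input, the loop from your question becomes a plain Python loop over calls to the one compiled function. A minimal sketch, assuming the compile function above; iter_num, alpha, and the toy data here are made-up placeholders:

iter_num = 10
alpha = 0.01
inputs = [[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]]  # hypothetical training batch
labels = [1, 2]

train = compile(5, 4, 3)
for i in range(iter_num):
    learn_rate = i * alpha                     # any schedule works; it is just a number
    cost = train(inputs, labels, learn_rate)   # compiled once, called many times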
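As an aside, another common Theano idiom for the same problem is to hold the learning rate in a shared variable and change it with set_value between calls; the compiled function then keeps only its two data arguments. A minimal sketch of that variant (not part of the original answer):

# Learning rate stored in a shared variable instead of passed as an input.
lr = theano.shared(numpy.asarray(0.1, dtype=theano.config.floatX))

# Inside compile(), the updates would then use lr directly:
#     updates = [(p, p - lr * tt.grad(cost, p)) for p in (W_h, b_h, W_y, b_y)]
#     return theano.function([x, z], outputs=cost, updates=updates)

# The schedule mutates the shared value between calls; no re-compilation occurs:
for i in range(10):
    lr.set_value(numpy.asarray(i * 0.01, dtype=theano.config.floatX))
    # cost = train(inputs, labels)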