
Initializers in TensorFlow

1 Initializers in TensorFlow

TensorFlow ships with a number of built-in initializers that make it convenient to initialize variables. The most common ones are introduced below:

1.1 truncated_normal_initializer

This initializer draws from a truncated normal distribution; consult a statistics reference for the formal definition.

import tensorflow as tf

# initial values are drawn from a normal distribution truncated
# at two standard deviations from the mean
tensor = tf.get_variable("tensor",
                         shape=[3, 4],
                         initializer=tf.truncated_normal_initializer(mean=5.0, stddev=0.1))

The TensorFlow documentation describes this initializer as follows:

These values are similar to values from a `random_normal_initializer`
except that values more than two standard deviations from the mean
are discarded and re-drawn. This is the recommended initializer for
neural network weights and filters.

As the documentation notes, TensorFlow recommends this initializer for neural-network weights and filters.
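
One way to see the truncation in action is to draw a larger sample and check its range; a minimal sketch (the variable name `weights` and the sample size are illustrative, not from the original post):

weights = tf.get_variable("weights",
                          shape=[10000],
                          initializer=tf.truncated_normal_initializer(mean=0.0, stddev=1.0))
sess = tf.Session()
sess.run(tf.global_variables_initializer())
samples = sess.run(weights)
# every sample should lie within two standard deviations of the mean
print(abs(samples).max())  # expected to stay below 2.0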

1.2 random_normal_initializer

Random normal distribution; the mean and standard deviation can be specified.

tensor = tf.get_variable("tensor",
                         shape=[3, 4],
                         initializer=tf.random_normal_initializer(mean=5.0, stddev=0.1))
sess = tf.Session()
sess.run(tf.global_variables_initializer())
print(sess.run(tensor))
# output
[[ 4.97685289  5.06061602  4.92227983  4.94275761]
 [ 5.0433836   5.07488203  4.89840937  5.08210135]
 [ 4.88602161  4.9784441   4.9026494   4.94235086]]
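
As a rough sanity check, the sample statistics should sit near the requested parameters; a minimal sketch (with only 12 samples the estimates are coarse):

import numpy as np

values = sess.run(tensor)
# should land near mean=5.0 and stddev=0.1, more closely for larger shapes
print(np.mean(values), np.std(values))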

1.3 random_uniform_initializer

Random uniform distribution; every uniform_initializer variant draws from some uniform distribution. Samples fall in the half-open interval [minval, maxval).

tensor = tf.get_variable("tensor",
                         shape=[3, 4],
                         initializer=tf.random_uniform_initializer(minval=-1, maxval=1))
# output
[[-0.43593144 -0.7915628  -0.79432058 -0.70454049]
 [ 0.94479799 -0.89193916  0.13897324  0.15223503]
 [ 0.22761178  0.26544452  0.71627235 -0.87396431]]
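
To confirm the sampling range, fetch the variable and check its extremes; a minimal sketch:

sess = tf.Session()
sess.run(tf.global_variables_initializer())
values = sess.run(tensor)
# every sample lies in the half-open interval [minval, maxval)
print(values.min() >= -1, values.max() < 1)  # expected: True True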

1.4 constant_initializer

As the name suggests, this initializes the variable with a constant value.

tensor = tf.get_variable("tensor",
                         shape=[3, 4],
                         initializer=tf.constant_initializer(5))
# output
[[ 5.  5.  5.  5.]
 [ 5.  5.  5.  5.]
 [ 5.  5.  5.  5.]]
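
Besides a scalar, tf.constant_initializer also accepts a list of values, which fills the variable in row-major order; a sketch (the name tensor2 is only to avoid clashing with the snippet above):

tensor2 = tf.get_variable("tensor2",
                          shape=[3, 4],
                          initializer=tf.constant_initializer([0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11]))
# expected values after initialization:
# [[  0.   1.   2.   3.]
#  [  4.   5.   6.   7.]
#  [  8.   9.  10.  11.]]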

1.5 ones_initializer

An all-ones initializer: every element of the variable is set to 1.

tensor = tf.get_variable("tensor",
                         shape=[3, 4],
                         initializer=tf.ones_initializer())
# output
[[ 1.  1.  1.  1.]
 [ 1.  1.  1.  1.]
 [ 1.  1.  1.  1.]]

1.6 zeros_initializer

An all-zeros initializer: every element of the variable is set to 0.

tensor = tf.get_variable("tensor",
                         shape=[3, 4],
                         initializer=tf.zeros_initializer())
# output
[[ 0.  0.  0.  0.]
 [ 0.  0.  0.  0.]
 [ 0.  0.  0.  0.]]
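
In practice these initializers are often combined; a common pattern (the layer sizes here are illustrative) is truncated-normal weights with zero-initialized biases:

weights = tf.get_variable("w", shape=[784, 10],
                          initializer=tf.truncated_normal_initializer(stddev=0.1))
biases = tf.get_variable("b", shape=[10],
                         initializer=tf.zeros_initializer())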

The above covers the most commonly used initializers; the remaining ones are documented on the official site.