
Build a fully customizable model by subclassing tf.keras.Model and defining your own forward pass. Create the layers in the __init__ method and set them as attributes of the class instance, then define the forward pass in the call method:

import tensorflow as tf
from tensorflow.keras import layers

class MyModel(tf.keras.Model):
    def __init__(self, num_classes=10):
        super(MyModel, self).__init__(name='my_model')
        self.num_classes = num_classes
        # Layers are created here and attached as attributes of the instance.
        self.layer1 = layers.Dense(32, activation='relu')
        self.layer2 = layers.Dense(num_classes, activation='softmax')

    def call(self, inputs):
        # Forward pass: chain the layers defined in __init__.
        h1 = self.layer1(inputs)
        out = self.layer2(h1)
        return out

    def compute_output_shape(self, input_shape):
        shape = tf.TensorShape(input_shape).as_list()
        shape[-1] = self.num_classes
        return tf.TensorShape(shape)

model = MyModel(num_classes=10)
model.compile(optimizer=tf.keras.optimizers.RMSprop(0.001),
              loss=tf.keras.losses.categorical_crossentropy,
              metrics=['accuracy'])
model.fit(train_x, train_y, batch_size=16, epochs=5)
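The snippets in this section use train_x, train_y (and later val_x, val_y) without defining them; they stand in for your own dataset. A minimal sketch of random dummy data that makes the examples runnable end to end, assuming an illustrative 72 input features and 10 classes:

import numpy as np

# Hypothetical dummy data: 1000 samples with 72 features and 10 one-hot classes.
train_x = np.random.random((1000, 72))
train_y = tf.keras.utils.to_categorical(
    np.random.randint(10, size=(1000,)), num_classes=10)

# A smaller validation set, used later with validation_data=(val_x, val_y).
val_x = np.random.random((200, 72))
val_y = tf.keras.utils.to_categorical(
    np.random.randint(10, size=(200,)), num_classes=10)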

4.3 Custom layers

Create a custom layer by subclassing tf.keras.layers.Layer and implementing the following methods:

  • build: creates the layer's weights. Add weights with the add_weight method.
  • call: defines the forward pass.
  • compute_output_shape: specifies how to compute the layer's output shape from the input shape. Optionally, the layer can be made serializable by implementing the get_config method and the from_config class method.
class MyLayer(layers.Layer):
    def __init__(self, output_dim, **kwargs):
        self.output_dim = output_dim
        super(MyLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        # Create the layer's weights once the input shape is known.
        shape = tf.TensorShape((input_shape[1], self.output_dim))
        self.kernel = self.add_weight(name='kernel1', shape=shape,
                                      initializer='uniform', trainable=True)
        super(MyLayer, self).build(input_shape)

    def call(self, inputs):
        return tf.matmul(inputs, self.kernel)

    def compute_output_shape(self, input_shape):
        shape = tf.TensorShape(input_shape).as_list()
        shape[-1] = self.output_dim
        return tf.TensorShape(shape)

    def get_config(self):
        # Record output_dim alongside the base layer config for serialization.
        base_config = super(MyLayer, self).get_config()
        base_config['output_dim'] = self.output_dim
        return base_config

    @classmethod
    def from_config(cls, config):
        return cls(**config)

model = tf.keras.Sequential([
    MyLayer(10),
    layers.Activation('softmax')
])

model.compile(optimizer=tf.keras.optimizers.RMSprop(0.001),
              loss=tf.keras.losses.categorical_crossentropy,
              metrics=['accuracy'])
model.fit(train_x, train_y, batch_size=16, epochs=5)
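Because MyLayer implements get_config and from_config, an instance can be turned into a plain config dict and rebuilt from it. A quick sketch of that round trip (the variable names here are illustrative):

layer = MyLayer(10)
config = layer.get_config()            # dict with 'output_dim' plus the base layer config
restored = MyLayer.from_config(config)
print(restored.output_dim)             # 10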

4.4 Callbacks

callbacks = [
    # Stop training when val_loss has not improved for 2 consecutive epochs.
    tf.keras.callbacks.EarlyStopping(patience=2, monitor='val_loss'),
    # Write TensorBoard logs to ./logs.
    tf.keras.callbacks.TensorBoard(log_dir='./logs')
]

model.fit(train_x, train_y, batch_size=16, epochs=5,
          callbacks=callbacks, validation_data=(val_x, val_y))
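Beyond the built-in callbacks, you can hook into training yourself by subclassing tf.keras.callbacks.Callback and overriding methods such as on_epoch_end. A minimal sketch (the class name and printed message are illustrative, not part of the original example):

class PrintValLoss(tf.keras.callbacks.Callback):
    # fit() calls this at the end of every epoch with the collected metrics in logs.
    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        print('epoch {}: val_loss={:.4f}'.format(epoch, logs.get('val_loss', float('nan'))))

model.fit(train_x, train_y, batch_size=16, epochs=5,
          callbacks=[PrintValLoss()], validation_data=(val_x, val_y))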

5 Saving and restoring

5.1 Saving weights

model = tf.keras.Sequential([
    layers.Dense(64, activation='relu'),
    layers.Dense(10, activation='softmax')])

model.compile(optimizer=tf.keras.optimizers.Adam(0.001),
              loss='categorical_crossentropy',
              metrics=['accuracy'])

# Save and restore weights as a TensorFlow checkpoint.
model.save_weights('./weights/model')
model.load_weights('./weights/model')

# Save and restore weights in the Keras HDF5 format (inferred from the .h5 extension).
model.save_weights('./model.h5')
model.load_weights('./model.h5')
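save_weights stores only the weights, not the architecture, so load_weights must be called on a model with the same structure. A sketch of restoring into a freshly built model (assuming the checkpoint saved above exists; the input size 72 matches the illustrative dummy data from earlier):

new_model = tf.keras.Sequential([
    layers.Dense(64, activation='relu'),
    layers.Dense(10, activation='softmax')])
new_model.build(input_shape=(None, 72))   # create the variables before loading weights
new_model.load_weights('./weights/model')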