How to reduce the number of kernels/filters in a trained model in Tensorflow?

by Safi   Last Updated October 10, 2019 02:26 AM

I have a trained model, and I want to retrain the same model with a few filters/kernels removed from an existing layer. E.g. given
conv1 = tf.get_variable('conv1_1', shape=(11, 11, 3, 64), initializer=tf.contrib.layers.xavier_initializer()), I want to resize this tensor so that it has the shape (11, 11, 3, 20) but keeps the same name and position in the graph, i.e. is exactly the same variable. Thanks in advance for the help.

I have tried tf.reshape, but it gives an error because the number of elements in a and b does not match. I have also tried tf.assign(a, b, validate_shape=False).
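(To see why the reshape attempt fails: reshaping only rearranges the existing elements, so the total element count must stay the same, whereas dropping filters actually removes elements. A quick NumPy sketch of the difference, independent of TensorFlow:)

```python
import numpy as np

# A weight tensor with the same shape as conv1_1: 11*11*3*64 elements
w = np.arange(11 * 11 * 3 * 64, dtype=np.float32).reshape(11, 11, 3, 64)

# Reshape cannot change the element count: 11*11*3*64 != 11*11*3*20
try:
    w.reshape(11, 11, 3, 20)
except ValueError:
    print("reshape failed: element counts differ")

# Slicing keeps the first 20 filters and discards the rest
w_small = w[:, :, :, :20]
print(w_small.shape)
```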

self.weights = {
    'conv1_': tf.get_variable('conv1_l1', shape=(11, 11, 3, 64), initializer=tf.contrib.layers.xavier_initializer()),
    'conv2_': tf.get_variable('conv2_l1', shape=(7, 7, 64, 128), initializer=tf.contrib.layers.xavier_initializer())
}

1 Answer

What you want to do is partially achievable.

Having a variable with the exact same name as one that is already defined is not possible: TensorFlow builds a data-flow graph, and each node must be uniquely identifiable to avoid ambiguity. If you want to reuse the same name, you can do so with variable scoping, placing the two variables in different scopes.

But for assigning a part of the variable to another you can use the following code.

import tensorflow as tf
import numpy as np


with tf.variable_scope('old'):
  conv1 = tf.get_variable('conv1_1', shape=(11, 11, 3, 64), initializer=tf.contrib.layers.xavier_initializer())
with tf.variable_scope('new'):
  conv_res_1 = tf.get_variable('conv1_1', shape=(11, 11, 3, 20), initializer=tf.contrib.layers.xavier_initializer())

assign_op = tf.assign(conv_res_1, conv1[:, :, :, :20])

with tf.Session() as sess:
  sess.run(tf.global_variables_initializer())
  w_1, w_res_1 = sess.run([conv1, assign_op])
  assert np.all(w_1[:, :, :, :20] == w_res_1)
  print(w_1[0, 0, 0, 0], w_res_1[0, 0, 0, 0])
October 10, 2019 01:31 AM
