Getting unwanted and duplicate blocks in tensorboard but not in model.summary()

Problem description

I am using a CNN architecture, adding a few more layers in between, and creating a new model. Calling model.summary() on this new model, everything looks aligned fine, but in TensorBoard I see duplicate blocks connected to the new blocks: a second copy of the network drawn in parallel with the old network's blocks, with some connections between them.

I am using tensorboard = TensorBoard() for the TensorBoard callback in Keras.

Please advise why I see these connections and old network blocks in parallel with the new model's blocks, while model.summary() looks totally fine.

I am trying to understand this, so any detail will help.

Tags: graph, keras, conv-neural-network, tensorboard

Solution


I have faced a similar issue.
The main reason is that every time a model is created, every layer gets a new name. For example, suppose you have a model with two 2D convolutional layers followed by a dense layer. Creating the model for the first time and executing model.summary() results in the layer names below:

  • conv2d_1
  • conv2d_2
  • dense_1

Re-executing the same code yields:

  • conv2d_3
  • conv2d_4
  • dense_2
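This renaming can be illustrated with a small pure-Python sketch of the counter behaviour (a toy model of Keras's auto-naming scheme, not its actual implementation):

```python
import itertools
from collections import defaultdict

# Toy illustration of Keras-style auto-naming: one global counter per layer
# type persists across model constructions, so rebuilding the same model
# produces fresh names (conv2d_3, conv2d_4, dense_2, ...) instead of
# reusing conv2d_1, conv2d_2, dense_1.
_counters = defaultdict(itertools.count)

def auto_name(layer_type):
    # itertools.count() starts at 0, so add 1 to match Keras's "_1" suffixes.
    return f"{layer_type}_{next(_counters[layer_type]) + 1}"

def build_model():
    # Two conv layers followed by a dense layer, as in the example above.
    return [auto_name("conv2d"), auto_name("conv2d"), auto_name("dense")]

first = build_model()   # ['conv2d_1', 'conv2d_2', 'dense_1']
second = build_model()  # ['conv2d_3', 'conv2d_4', 'dense_2']
```

Because the counters are shared state rather than per-model, the second build's graph contains entirely new node names, which TensorBoard treats as a separate subgraph.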

At the same time, the new run's event files are written into the same log directory that TensorBoard is already reading, so the old graph is kept alongside the new one. Hence, parallel blocks of layers with different names appear.
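One way to avoid the mixed graphs is to give every run its own log directory, so TensorBoard never sees two graphs in one place. A minimal sketch (the directory-naming scheme and the `unique_log_dir` helper are illustrative, not a Keras API):

```python
import os
import time

def unique_log_dir(root="logs"):
    # Give every training run its own timestamped subdirectory so
    # TensorBoard does not mix the previous run's graph with the new one.
    run_dir = os.path.join(root, time.strftime("run_%Y%m%d-%H%M%S"))
    os.makedirs(run_dir, exist_ok=True)
    return run_dir

# Hypothetical usage with the Keras callback from the question:
# tensorboard = TensorBoard(log_dir=unique_log_dir())
```

Deleting the old log directory before re-running has the same effect; calling keras.backend.clear_session() before rebuilding the model also resets the layer-name counters, so the rebuilt graph reuses the original names.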

