Caveats when using a Keras model for prediction

Why is the training error much higher than the test error?

A Keras model has two modes: training mode and test mode. Regularization mechanisms such as Dropout and L1/L2 weight penalties are disabled in test mode.
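A minimal sketch of this difference (assuming the TensorFlow Keras API; the toy model and input below are made up for illustration): the same Dropout layer behaves differently depending on whether it is called in training or inference mode.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Toy model containing only a Dropout layer, which is active in training mode only.
model = keras.Sequential([keras.Input(shape=(4,)), layers.Dropout(0.5)])

x = np.ones((1, 4), dtype="float32")
print(model(x, training=True))   # some entries zeroed, the rest scaled by 1 / (1 - 0.5)
print(model(x, training=False))  # identity: Dropout is disabled at inference/predict time
```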

In addition, the training error is the average of the per-batch errors over the training data. Because the model keeps changing during an epoch, the batches at the start of the epoch tend to have larger errors than the batches at the end. The test error reported at the end of an epoch, on the other hand, is computed with the model in its end-of-epoch state, so the network produces a smaller error.

[Tips] You can record the training and validation error of each epoch via a callback and plot them; if there is a large gap between the training-error curve and the validation-error curve, your model probably has an overfitting problem. This issue, of course, has nothing to do with Keras itself.
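For example (a minimal sketch; `model`, the data arrays, and the hyperparameters are placeholders): the built-in History callback returned by `fit` already records the per-epoch losses, so plotting the two curves takes only a few lines.

```python
import matplotlib.pyplot as plt

# Assumes `model` is an already-compiled Keras model and that
# (x_train, y_train) / (x_val, y_val) are your training and validation sets.
history = model.fit(
    x_train, y_train,
    validation_data=(x_val, y_val),
    epochs=20,
    batch_size=32,
)

# `History.history` holds the per-epoch training and validation loss.
plt.plot(history.history["loss"], label="training loss")
plt.plot(history.history["val_loss"], label="validation loss")
plt.xlabel("epoch")
plt.ylabel("loss")
plt.legend()
plt.show()
```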

The Keras Chinese documentation points out this misconception. In my view, the root cause lies in how the network is implemented: a dropout layer can be implemented either in the "standard" form or in the "inverted" form, and this choice determines whether the probability p is applied at training time or at test time.
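The difference between the two forms can be sketched in plain NumPy (hypothetical helper functions, not Keras code; Keras itself uses the inverted form, so nothing special needs to happen at predict time):

```python
import numpy as np

def dropout_standard(x, p, training):
    """'Standard' dropout: drop units with probability p during training;
    at test time keep everything but scale activations by (1 - p)."""
    if training:
        mask = np.random.rand(*x.shape) >= p
        return x * mask
    return x * (1.0 - p)

def dropout_inverted(x, p, training):
    """'Inverted' dropout: scale the kept units by 1 / (1 - p) during
    training, so that test-time inference is a plain forward pass."""
    if training:
        mask = np.random.rand(*x.shape) >= p
        return x * mask / (1.0 - p)
    return x
```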

Caveats when fine-tuning with pre-trained weights:

Do not take the layers you add, initialize them randomly, and connect them directly on top of the pre-trained layers (see the sketch after the list below):

  • in order to perform fine-tuning, all layers should start with properly trained weights: for instance you should not slap a randomly initialized fully-connected network on top of a pre-trained convolutional base. This is because the large gradient updates triggered by the randomly initialized weights would wreck the learned weights in the convolutional base. In our case this is why we first train the top-level classifier, and only then start fine-tuning convolutional weights alongside it.
  • we choose to only fine-tune the last convolutional block rather than the entire network in order to prevent overfitting, since the entire network would have a very large entropic capacity and thus a strong tendency to overfit. The features learned by low-level convolutional blocks are more general, less abstract than those found higher-up, so it is sensible to keep the first few blocks fixed (more general features) and only fine-tune the last one (more specialized features).
  • fine-tuning should be done with a very slow learning rate, and typically with the SGD optimizer rather than an adaptive learning rate optimizer such as RMSProp. This is to make sure that the magnitude of the updates stays very small, so as not to wreck the previously learned features.
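Putting the three points together, a fine-tuning workflow might look like the following sketch (assumptions: TensorFlow Keras, the VGG16 application as the pre-trained base, a binary-classification head, and placeholder datasets `train_ds`/`val_ds`):

```python
from tensorflow import keras
from tensorflow.keras import layers

# Load a pre-trained convolutional base without its classifier head.
base = keras.applications.VGG16(include_top=False, weights="imagenet",
                                input_shape=(150, 150, 3))
base.trainable = False  # freeze the whole base first

# Attach a new (randomly initialized) classifier on top.
model = keras.Sequential([
    base,
    layers.Flatten(),
    layers.Dense(256, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])

# Step 1: train only the top classifier so its weights become reasonable.
model.compile(optimizer="rmsprop", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)

# Step 2: unfreeze only the last convolutional block ("block5" in VGG16)
# and fine-tune it together with the classifier.
base.trainable = True
for layer in base.layers:
    layer.trainable = layer.name.startswith("block5")

# Use SGD with a very small learning rate so the updates stay small
# and do not wreck the pre-trained features.
model.compile(optimizer=keras.optimizers.SGD(learning_rate=1e-4, momentum=0.9),
              loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)
```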

Source: https://blog.csdn.net/xiaojiajia007/article/details/73771311
