Study Notes: Normalization

Published: 2019-03-07


Normalization Methods in Deep Learning

Normalization techniques are crucial in deep learning, particularly for training deep neural networks efficiently. These techniques help address issues like Internal Covariate Shift, which can hinder training due to varying input distributions across batches. Here, we explore key normalization methods and their applications.

Batch Normalization (BN)

Batch Normalization (BN) was introduced to address the Internal Covariate Shift issue. By normalizing each batch to have a mean of 0 and variance of 1, BN stabilizes the input distribution, preventing gradient issues like vanishing or exploding gradients. BN introduces learnable parameters β and γ to shift and scale the normalized values, allowing the model to adapt to different data distributions.
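The BN transform described above can be sketched in a few lines of NumPy. This is a minimal forward-pass illustration (the function name and toy shapes are mine, not from the post; a real layer would also track running statistics for inference):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature over the batch axis (axis 0) to mean 0, variance 1
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Learnable scale (gamma) and shift (beta) restore representational power
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
x = rng.normal(size=(32, 8))                     # batch of 32, 8 features
y = batch_norm(x, gamma=np.ones(8), beta=np.zeros(8))
```

With gamma = 1 and beta = 0, each output feature has mean 0 and unit variance across the batch.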

Advantages of BN

  • Reduces Internal Covariate Shift, accelerating training.
  • Mitigates gradient issues, especially with activation functions like sigmoid or tanh.
  • Less sensitive to initial parameter values, enhancing training stability.

Disadvantages of BN

  • Batch size sensitivity: Small batch sizes lead to inaccurate mean and variance estimates.
  • Requires computing batch statistics during training and maintaining running averages of them for inference, adding bookkeeping overhead.
  • Not suitable for RNNs due to varying sequence lengths.

Layer Normalization (LN)

Layer Normalization (LN) normalizes all features within a single sample, computing the mean and variance over the feature dimension rather than over the batch. Unlike BN, LN does not depend on batch size, making it suitable for small batches and even a batch of one. Because each sample is normalized independently, LN is also a better fit for RNNs, where sequence lengths vary.
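The contrast with BN is easiest to see in code: LN reduces over the last (feature) axis rather than the batch axis. A minimal NumPy sketch (names and shapes are illustrative):

```python
import numpy as np

def layer_norm(x, gamma, beta, eps=1e-5):
    # Normalize over the feature axis of each sample, independently of the batch
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

rng = np.random.default_rng(1)
x = rng.normal(size=(4, 16))                     # works identically for size=(1, 16)
y = layer_norm(x, gamma=np.ones(16), beta=np.zeros(16))
```

Each row (sample) of the output has mean 0 regardless of how many rows the batch contains.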

Differences Between BN and LN

  • BN normalizes each feature across the batch dimension, while LN normalizes all features within each individual sample.
  • LN does not require storing batch statistics, saving memory.

Weight Normalization (WN)

Weight Normalization (WN) normalizes filter weights rather than features. It decouples each weight vector into a magnitude g and a direction v, reparameterizing w = g · v/‖v‖, so that the two can be trained independently. Unlike BN and LN, WN does not depend on the input data distribution, making it a form of parameter normalization.
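The reparameterization w = g · v/‖v‖ is a one-liner; here is a hedged NumPy sketch for a weight matrix with one g per output unit (shapes and names are my own for illustration):

```python
import numpy as np

def weight_norm(v, g):
    # w = g * v / ||v||: g carries the magnitude, v contributes only direction
    return g[:, None] * v / np.linalg.norm(v, axis=1, keepdims=True)

rng = np.random.default_rng(2)
v = rng.normal(size=(4, 8))          # 4 output units, 8 inputs each
g = np.array([1.0, 2.0, 3.0, 4.0])   # learnable magnitude per output unit
w = weight_norm(v, g)
```

After the transform, the norm of each row of w equals the corresponding g, no matter how v is scaled during training.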

Instance Normalization (IN)

Instance Normalization (IN) is ideal for tasks like image style transfer, where the model's output depends on individual image instances rather than the entire batch. IN normalizes features within each image instance, ensuring consistent output while allowing diverse styles across instances.
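Per-instance normalization means reducing over the spatial dimensions of each (sample, channel) pair separately. A minimal sketch in NumPy for an (N, C, H, W) tensor (toy shapes are mine):

```python
import numpy as np

def instance_norm(x, eps=1e-5):
    # Normalize each (sample, channel) pair over its own spatial dimensions only,
    # so every image instance is standardized independently of the batch
    mean = x.mean(axis=(2, 3), keepdims=True)
    var = x.var(axis=(2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

rng = np.random.default_rng(3)
x = rng.normal(size=(2, 3, 8, 8))    # (N, C, H, W)
y = instance_norm(x)
```

Because no statistic is shared across the batch, the stylistic contrast within each image is preserved independently, which is why IN suits style transfer.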

Group Normalization (GN)

Group Normalization (GN) addresses the limitations of BN by grouping features along the channel dimension. Each group is normalized separately, reducing the impact of batch size on normalization accuracy. GN is particularly effective for small batches and can be seen as a middle ground between BN and LN.

Key Characteristics of GN

  • Divides features into groups and normalizes each group separately.
  • Reduces dependency on batch size, making it suitable for small batches.
  • Combines aspects of BN and LN, providing a flexible normalization approach.
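The group-then-normalize idea above amounts to one reshape plus a per-group standardization. A hedged NumPy sketch (shapes and the group count are illustrative):

```python
import numpy as np

def group_norm(x, num_groups, eps=1e-5):
    # Split the channel axis into groups, normalize each group per sample
    n, c, h, w = x.shape
    xg = x.reshape(n, num_groups, c // num_groups, h, w)
    mean = xg.mean(axis=(2, 3, 4), keepdims=True)
    var = xg.var(axis=(2, 3, 4), keepdims=True)
    xg = (xg - mean) / np.sqrt(var + eps)
    return xg.reshape(n, c, h, w)

rng = np.random.default_rng(4)
x = rng.normal(size=(2, 8, 4, 4))
y = group_norm(x, num_groups=4)
```

The middle-ground nature of GN is visible in the parameter: num_groups=1 normalizes all channels together (LN-like), while num_groups=C normalizes each channel alone (IN-like); neither extreme touches the batch axis.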

Switchable Normalization (SN)

Switchable Normalization (SN) lets each layer learn the most appropriate normalization during training by learning softmax weights over the statistics of BN, LN, and IN. This adaptability addresses the limitation of fixed normalization methods, removing the need to hand-pick a normalizer for each application.
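A simplified sketch of the idea: compute IN, LN, and BN statistics for the same tensor, then blend them with learned softmax weights. This is a hedged illustration of the mechanism, not the full SN layer (the learnable per-channel scale/shift and training of the weights are omitted):

```python
import numpy as np

def switchable_norm(x, w_mean, w_var, eps=1e-5):
    # Softmax-weighted mix of IN, LN, and BN statistics for an (N, C, H, W) tensor
    def softmax(z):
        e = np.exp(z - z.max())
        return e / e.sum()
    mu_in = x.mean(axis=(2, 3), keepdims=True)       # per sample, per channel
    v_in = x.var(axis=(2, 3), keepdims=True)
    mu_ln = x.mean(axis=(1, 2, 3), keepdims=True)    # per sample
    v_ln = x.var(axis=(1, 2, 3), keepdims=True)
    mu_bn = x.mean(axis=(0, 2, 3), keepdims=True)    # per channel, across batch
    v_bn = x.var(axis=(0, 2, 3), keepdims=True)
    wm, wv = softmax(w_mean), softmax(w_var)
    mean = wm[0] * mu_in + wm[1] * mu_ln + wm[2] * mu_bn
    var = wv[0] * v_in + wv[1] * v_ln + wv[2] * v_bn
    return (x - mean) / np.sqrt(var + eps)

rng = np.random.default_rng(5)
x = rng.normal(size=(2, 3, 4, 4))
y = switchable_norm(x, w_mean=np.zeros(3), w_var=np.zeros(3))  # equal weights
```

In training, w_mean and w_var would be learned per layer, so a layer that benefits from, say, IN-style statistics can shift its weights toward them automatically.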

Summary

Normalization techniques like BN, LN, WN, IN, GN, and SN each address specific challenges in deep learning. Understanding their unique properties and applications can help choose the optimal method for a given task, ensuring efficient and stable training of deep neural networks.

Reprinted from: http://tiajz.baihongyu.com/
