Glossary

What is Zero-centric / Zero-bias Initialization?

Zero-centric or zero-bias initialization is a technique used in machine learning and deep learning for setting a model's initial parameters around zero: in the zero-bias variant the bias terms are set exactly to zero, while in the zero-centric variant the weights are drawn from a distribution centered at zero. The aim is to avoid introducing arbitrary offsets in the early stages of training, which can improve convergence speed and the overall performance of the model.
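
As a concrete illustration, the sketch below shows this convention in NumPy. The layer sizes and the 0.01 standard deviation are arbitrary choices for the example, not values prescribed by the technique.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_layer(fan_in, fan_out):
    """Zero-centric / zero-bias initialization for one dense layer:
    weights come from a distribution centered at zero, biases start
    at exactly zero."""
    W = rng.normal(loc=0.0, scale=0.01, size=(fan_in, fan_out))
    b = np.zeros(fan_out)
    return W, b

W, b = init_layer(784, 128)
print(W.mean())   # close to 0: the weight distribution is zero-centered
print(b.any())    # False: every bias is exactly zero
```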


In deep learning, the initialization of network weights has a profound impact on the final model's performance. Starting the biases at zero and centering the weight distribution at zero gives the network a neutral starting point: no unit receives an arbitrary head start, and activations are not systematically pushed in one direction. This reduces spurious signal in the early training stage and helps the optimizer settle into good solutions more quickly.
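
A small experiment (with arbitrary layer sizes) suggests why the scale of a zero-centered weight distribution also matters: if the spread is too large, a saturating activation such as tanh is driven to its extremes from the very first forward pass.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=(1000, 256))  # a batch of standardized inputs

for scale in (0.05, 1.0):         # small vs. overly large spread
    W = rng.normal(0.0, scale, size=(256, 256))  # zero-centered weights
    a = np.tanh(x @ W)
    # Fraction of activations pushed into tanh saturation (|a| > 0.99):
    print(scale, np.mean(np.abs(a) > 0.99))
```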


However, initializing the weights themselves to zero has a serious downside. When all weights start at zero, every neuron in a layer computes the same output during forward propagation and therefore receives the same gradient during backpropagation; the neurons stay identical after every update, so this symmetry is never broken and the model cannot learn distinct features. For this reason, weights are usually initialized with a random scheme such as Xavier (Glorot) initialization or He initialization, while biases can safely start at zero.
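
The symmetry problem, and the random schemes that avoid it, can be seen in a few lines of NumPy (the layer shapes are arbitrary; the bounds follow the standard Xavier and He formulas):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=(4, 8))        # a tiny batch of inputs

# All-zero weights: every hidden unit computes the identical function,
# so outputs (and hence gradients) match unit for unit -- the symmetry
# is never broken and the units can never differentiate.
h = np.tanh(x @ np.zeros((8, 5)))
print(np.unique(h))                # [0.] -- all units agree everywhere

def xavier_uniform(fan_in, fan_out):
    # Xavier/Glorot uniform: limit = sqrt(6 / (fan_in + fan_out))
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def he_normal(fan_in, fan_out):
    # He normal (suited to ReLU): std = sqrt(2 / fan_in)
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

h = np.tanh(x @ xavier_uniform(8, 5))
print(np.allclose(h[:, 0], h[:, 1]))   # False: units start out distinct

h = np.maximum(0, x @ he_normal(8, 5))
print(np.allclose(h[:, 0], h[:, 1]))   # False here as well
```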