Batch-Instance Normalization for Adaptively Style-Invariant Neural Networks
IN performs a form of style normalization, showing that simply adjusting the feature statistics (namely, the mean and variance) of a generator network can control the style of the generated image.
We propose Batch-Instance Normalization (BIN) to normalize the styles adaptively to the task and selectively to individual feature maps. It learns to control how much of the style information is propagated through each channel of features by leveraging a learnable gate parameter. If the style associated with a feature map is irrelevant to or disturbs the task, BIN closes the gate to suppress the style using IN. If, on the other hand, the style carries information important to the task, BIN opens the gate to preserve the style using BN. This simple collaboration of the two normalization methods can be directly incorporated into existing BN- or IN-based models, requiring only a few additional parameters.
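The gated blend of BN and IN described above can be sketched as follows. This is an illustrative NumPy implementation, not the authors' code; the function name `batch_instance_norm` and the parameter shapes are assumptions, and in practice the gate would be clamped to [0, 1] and learned jointly with the affine parameters.

```python
import numpy as np

def batch_instance_norm(x, rho, gamma, beta, eps=1e-5):
    """Hypothetical BIN sketch.

    x:     feature maps of shape (N, C, H, W)
    rho:   per-channel gate in [0, 1], shape (C,); 1 -> BN, 0 -> IN
    gamma, beta: per-channel affine parameters, shape (C,)
    """
    # Batch-norm statistics: computed over (N, H, W) for each channel,
    # so the per-channel style is preserved across samples.
    mu_b = x.mean(axis=(0, 2, 3), keepdims=True)
    var_b = x.var(axis=(0, 2, 3), keepdims=True)
    x_bn = (x - mu_b) / np.sqrt(var_b + eps)

    # Instance-norm statistics: computed over (H, W) for each sample
    # and channel, so the style of each individual map is removed.
    mu_i = x.mean(axis=(2, 3), keepdims=True)
    var_i = x.var(axis=(2, 3), keepdims=True)
    x_in = (x - mu_i) / np.sqrt(var_i + eps)

    # The learnable gate rho interpolates between the two normalizations
    # per channel, then a shared affine transform is applied.
    rho = rho.reshape(1, -1, 1, 1)
    y = rho * x_bn + (1.0 - rho) * x_in
    return gamma.reshape(1, -1, 1, 1) * y + beta.reshape(1, -1, 1, 1)
```

Setting a channel's gate to 1 recovers plain BN for that channel (style preserved), while setting it to 0 recovers IN (style suppressed), which is why the method drops into existing BN- or IN-based models with only the extra gate parameters.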
We assume the mean and variance of a convolutional feature map encode a certain style attribute. In other words, the information carried by each feature map can be divided into two components: a style (the mean and variance of activations) and a shape (the spatial configuration of activations).
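This style/shape decomposition can be made concrete with a small sketch: the style is the pair of first- and second-order statistics, and the shape is what remains after those statistics are normalized out. The helper name `split_style_shape` is hypothetical and chosen here for illustration.

```python
import numpy as np

def split_style_shape(f, eps=1e-5):
    """Decompose a single feature map f of shape (H, W) into
    its style (mean, variance) and its shape (normalized activations)."""
    mu, var = f.mean(), f.var()
    # The shape component keeps the spatial configuration but has
    # (approximately) zero mean and unit variance, i.e. no style.
    shape = (f - mu) / np.sqrt(var + eps)
    return (mu, var), shape
```

Under this view, IN discards the `(mu, var)` style component and keeps only the shape, which is exactly what BIN's closed gate does per channel.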