Hoang Dang

RNNs and Transformers

The self-attention mechanism works by computing attention weights between each input token and all other input tokens and using these weights to compute a weighted sum of the input token emb...
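The teaser above describes the core of self-attention: each token scores itself against every other token, and those scores weight a sum over the tokens' value vectors. A minimal single-head NumPy sketch (the projection matrices and names here are illustrative, not taken from the post):

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    # Project each input token embedding to a query, key, and value vector
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Scaled dot-product scores between every query and every key
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Softmax over keys turns scores into attention weights per token
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output token is a weighted sum of all value vectors
    return weights @ V
```

Each row of `weights` sums to 1, so the output for a token is a convex combination of the value vectors of all tokens, including itself.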

GAN - Generative Adversarial Networks

Object Detection and Segmentation

Optimizing Performance

CHAPTER 3. CNN IN DEPTH Image augmentation is a very common method to: avoid overfitting, increase the robustness of the network, and introduce rotational, translational, and scale invariance as well as in...
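The augmentation idea summarized in that teaser can be sketched in a few lines of NumPy: random flips and small random translations applied at training time, so the network never sees the exact same image twice. This is a minimal illustration under assumed H×W×C image arrays, not the post's implementation:

```python
import numpy as np

def augment(img, rng, pad=2):
    # Random horizontal flip: introduces left-right invariance
    if rng.random() < 0.5:
        img = img[:, ::-1]
    # Random translation by up to `pad` pixels in each direction:
    # reflect-pad the image, then crop a randomly offset window
    padded = np.pad(img, ((pad, pad), (pad, pad), (0, 0)), mode="reflect")
    dy, dx = rng.integers(0, 2 * pad + 1, size=2)
    return padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
```

Libraries such as torchvision or Albumentations add rotation and scale jitter in the same spirit; the key point is that each transform preserves the label while varying the pixels.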

FNN & CNN Concepts