
Optimiser

Jun 13, 2024 · 1 min read

In deep learning, an optimiser determines how each parameter (i.e., weight) of a model should be updated, based on the value of the loss function.

Many common neural network optimisers are based on gradient descent.

Types

• Stochastic gradient descent
• Adaptive moment estimation (Adam)
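The two update rules listed above can be sketched in plain Python (a minimal illustration on a one-parameter quadratic loss; the hyperparameter values are illustrative assumptions, not recommendations):

```python
# Minimal sketch (no autodiff): repeated SGD and Adam updates on a single
# parameter w, minimising loss(w) = (w - 3)^2.

def grad(w):
    # d/dw of (w - 3)^2
    return 2.0 * (w - 3.0)

def sgd(w, lr=0.1, steps=100):
    for _ in range(steps):
        w -= lr * grad(w)  # vanilla gradient descent update
    return w

def adam(w, lr=0.1, steps=100, b1=0.9, b2=0.999, eps=1e-8):
    m = v = 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g        # first-moment (mean) estimate
        v = b2 * v + (1 - b2) * g * g    # second-moment (uncentred variance) estimate
        m_hat = m / (1 - b1 ** t)        # bias correction for the zero-initialised moments
        v_hat = v / (1 - b2 ** t)
        w -= lr * m_hat / (v_hat ** 0.5 + eps)
    return w
```

SGD follows the raw gradient scaled by the learning rate, while Adam rescales each step by running estimates of the gradient's first and second moments, making its effective step size roughly invariant to the gradient's magnitude.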

In code

During model training, we first define an optimiser, for example stochastic gradient descent:

optimiser = torch.optim.SGD(model.parameters(), lr=0.01)

PyTorch provides several optimiser classes in the torch.optim module.
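The optimiser interface boils down to two calls per training iteration: zero_grad() to clear accumulated gradients and step() to apply the update. A toy pure-Python mimic of that pattern (an assumed simplification for illustration, not the real torch.optim code):

```python
# Toy mimic of the torch.optim interface: parameters carry a .grad field,
# zero_grad() clears gradients, and step() applies one SGD update.

class Param:
    def __init__(self, value):
        self.value = value
        self.grad = 0.0

class SGD:
    def __init__(self, params, lr):
        self.params = list(params)
        self.lr = lr

    def zero_grad(self):
        for p in self.params:
            p.grad = 0.0

    def step(self):
        for p in self.params:
            p.value -= self.lr * p.grad  # gradient descent update

# Usage: minimise (w - 3)^2 with the same loop shape as a PyTorch
# training loop (zero_grad -> backward -> step).
w = Param(0.0)
optimiser = SGD([w], lr=0.1)
for _ in range(100):
    optimiser.zero_grad()
    w.grad = 2.0 * (w.value - 3.0)  # hand-computed gradient, standing in for loss.backward()
    optimiser.step()
```

In real PyTorch code the gradient assignment is replaced by loss.backward(), which populates each parameter's .grad via autograd.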


