

CPD163 - Mathematical Engineering of Deep Learning - Part One: Foundations

  • 20 Apr 2023
  • 9:00 AM - 5:00 PM (AEST)
  • Room 163, 14 Sir Christopher Ondaatje Ave, Macquarie University, NSW 2109, Australia

Registration


Registration is closed

The NSW Branch and the School of Mathematical and Physical Sciences, Macquarie University, are offering this workshop: Mathematical Engineering of Deep Learning - Part One: Foundations.

In this workshop, we first present the general feed-forward deep neural network. After exploring the expressive power of deep neural networks, we dive into the details of training: the back-propagation algorithm for gradient evaluation, as well as practical aspects such as weight initialisation, dropout, and batch normalisation.
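For a taste of this first part, here is a minimal NumPy sketch (our own illustration, not workshop material) of a one-hidden-layer network with a forward pass and gradients computed by back-propagation; all variable names and toy dimensions are our own choices.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy regression data: 8 samples, 4 features, scalar targets.
    X = rng.normal(size=(8, 4))
    y = rng.normal(size=(8, 1))

    # One hidden layer of width 16, weights initialised with small random values.
    W1 = rng.normal(scale=0.1, size=(4, 16)); b1 = np.zeros(16)
    W2 = rng.normal(scale=0.1, size=(16, 1)); b2 = np.zeros(1)

    # Forward pass: affine map -> ReLU activation -> affine map.
    z1 = X @ W1 + b1
    a1 = np.maximum(z1, 0.0)           # ReLU
    y_hat = a1 @ W2 + b2
    loss = np.mean((y_hat - y) ** 2)   # mean squared error

    # Back-propagation: apply the chain rule layer by layer, output to input.
    d_yhat = 2.0 * (y_hat - y) / len(y)
    dW2 = a1.T @ d_yhat
    db2 = d_yhat.sum(axis=0)
    d_a1 = d_yhat @ W2.T
    d_z1 = d_a1 * (z1 > 0)             # derivative of ReLU is 0 or 1
    dW1 = X.T @ d_z1
    db1 = d_z1.sum(axis=0)

    print(loss, dW1.shape, dW2.shape)

A gradient-descent step would then update each parameter, e.g. W1 -= learning_rate * dW1.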

The second part of the workshop concentrates on convolutional neural networks. Much of the success of deep learning is due to the strength of convolutional neural networks when applied to images and similar data formats. The concepts of channels and filter design are introduced, followed by an exploration of common state-of-the-art architectures that have made significant impacts and are still in use today.
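As a small taste of the convolution operation, the sketch below (our own illustration, not course code) computes a single-channel 2-D "valid" convolution of an image with a filter, as convolutional layers do (strictly a cross-correlation, since the kernel is not flipped).

    import numpy as np

    def conv2d(image, kernel):
        """Single-channel 2-D cross-correlation with 'valid' padding."""
        H, W = image.shape
        kH, kW = kernel.shape
        out = np.empty((H - kH + 1, W - kW + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                # Each output pixel is the dot product of the kernel
                # with the image patch beneath it.
                out[i, j] = np.sum(image[i:i + kH, j:j + kW] * kernel)
        return out

    image = np.arange(36, dtype=float).reshape(6, 6)
    vertical_edge = np.array([[1.0, 0.0, -1.0]] * 3)  # simple edge detector
    print(conv2d(image, vertical_edge))               # -> 4x4 feature map

A convolutional layer stacks many such filters across the input channels, producing one output channel (feature map) per filter.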

The last part of this workshop covers sequence models. These models are critical for data such as text, with applications in natural language processing. We explore recurrent neural networks and their generalisations, including long short-term memory models, gated recurrent units, encoder-decoder models for end-to-end language translation, and the attention mechanism underlying transformers. The workshop includes deep learning demonstrations using R and Python software.
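To illustrate one building block from this final part, here is a short sketch (again our own, with toy dimensions) of scaled dot-product attention, the core operation inside transformers.

    import numpy as np

    def softmax(x, axis=-1):
        x = x - x.max(axis=axis, keepdims=True)   # subtract max for stability
        e = np.exp(x)
        return e / e.sum(axis=axis, keepdims=True)

    def attention(Q, K, V):
        """Scaled dot-product attention: every query attends to all keys,
        and each output is the attention-weighted average of the values."""
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)        # (n_queries, n_keys)
        weights = softmax(scores, axis=-1)     # each row sums to 1
        return weights @ V                     # (n_queries, d_v)

    rng = np.random.default_rng(0)
    Q = rng.normal(size=(3, 8))      # 3 query vectors of dimension 8
    K = rng.normal(size=(5, 8))      # 5 key vectors
    V = rng.normal(size=(5, 4))      # 5 value vectors of dimension 4
    print(attention(Q, K, V).shape)  # -> (3, 4)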

Course Outline

  • The General Fully Connected Architecture
  • The Expressive Power of Neural Networks
  • Activation Function Alternatives
  • The Back-Propagation Algorithm
  • Weight Initialisation and Batch Normalisation
  • Mitigating Overfitting with Dropout and Regularisation
  • Overview of Convolutional Neural Networks
  • The Convolution Operation
  • Building a Convolutional Layer
  • Building a Convolutional Neural Network
  • Inception, ResNets, and Other Landmark Architectures
  • Recurrent Neural Networks
  • Long Short-Term Memory Models
  • Gated Recurrent Unit Models
  • Encoder-Decoder Models for End-to-End Translation
  • Transformer Models
Presenters:

Prof. Benoit Liquet-Weiland and Dr Sarat Moka

Prof. Liquet is a Professor of Mathematical and Computational Statistics in the School of Mathematical and Physical Sciences at Macquarie University. In addition, he is affiliated with the University of Queensland and with the Université de Pau et des Pays de l'Adour (UPPA). He was previously affiliated with ACEMS (the ARC Centre of Excellence for Mathematical and Statistical Frontiers) at Queensland University of Technology. Throughout his career he has worked extensively on developing novel statistical models, mainly to provide new tools for analysing clinical, health, and biological data arising from epidemiological studies. In 2011, he moved into the field of computational biology and generalised some of these methods so that they scale to high-throughput ("omic") data. He taught an advanced course on the mathematical engineering of deep learning at the Australian Mathematical Sciences Institute (AMSI) Summer School in 2021. A draft of his co-authored book on the concepts of deep learning is available at https://deeplearningmath.org. He works on applied statistics, as well as on the development of R packages and on industrial applications such as machine learning.

Dr Sarat Moka is a Lecturer in the School of Mathematics and Statistics at UNSW. He was previously a Research Fellow in the School of Mathematical and Physical Sciences at Macquarie University and an ACEMS (ARC Centre of Excellence for Mathematical and Statistical Frontiers) postdoctoral fellow at The University of Queensland. He obtained a PhD in Applied Probability from the Tata Institute of Fundamental Research, Mumbai, and a Master of Engineering from the Indian Institute of Science, Bengaluru. He works on problems in statistics, applied probability, and deep learning. In addition, he has worked on the modelling of cancer data and on the Safe Blues project (https://safeblues.org/). He has taught advanced courses in statistics and deep learning and is currently co-writing a book, "The Mathematical Engineering of Deep Learning" (https://deeplearningmath.org/).


Course Timetable (including morning tea, lunchtime and afternoon tea breaks)

Below is the tentative schedule for the workshop; it may be adjusted on the day.

 

9:00 – 10:45     Session 1
10:45 – 11:15    Morning tea break (30 mins)
11:15 – 12:45    Session 2
12:45 – 13:45    Lunch break (1 hour)
13:45 – 15:15    Session 3
15:15 – 15:45    Afternoon tea break (30 mins)
15:45 – 17:00    Session 4

Target Audience: Delegates from engineering, signal processing, statistics, physics, econometrics, operations research, quantitative management, pure mathematics, bioinformatics, applied machine learning, or applied deep learning backgrounds.

Requirements:

Delegates are required to bring their own laptop with R, RStudio, Python, and Jupyter (via Anaconda) installed.

Cancellation Policy:

Cancellations received prior to Friday, 14th April will be refunded, minus a $20 administration fee. From then onwards, no part of the registration fee will be refunded. However, registrations are transferable within the same organisation. Please advise any changes to events@statsoc.org.au.
