
COSC420 - Neural Networks

Semester 1, 2019

Hi, and welcome to COSC420 in 2019.  It's a very interesting time for neural networks right now, with huge strides in theory, performance, and commercial application. Salaries in some cases are going crazy. If you haven't seen it before, here is the standard course overview:

Introduction to artificial neural networks (ANNs) and "deep learning". These are computational tools inspired by the brain. They offer new perspectives on computation, and insights into human cognition. Deep learning is a currently popular technology underlying advances in artificial intelligence, as used by organisations such as Google, Baidu, Microsoft and Apple.

As this topic will be new to most, the course is divided into two phases. The first introduces basic material and will be in a lecture-driven format. We will cover an introduction to the topic, the practical use and tuning of ANNs, and fundamental algorithms and architectures, including a selection of: 1-layer nets, multi-layer nets, backpropagation, Hopfield nets, Boltzmann machines, unsupervised learning, dynamic architectures, reinforcement learning, and "deep learning" / convolutional nets.

The second phase covers current research within the field, and will be in a more open, student-driven format. The material will be different every year, as it will be driven by your specific interests. Topics which have been frequently covered in the past include: advanced neural network theory, applications to vision and robotics, purpose-built hardware ("neurocomputers"), applications within artificial life and software agents, neural/symbolic hybrids, mathematical and Bayesian interpretations, and applications to neuroscience and psychological modelling.

COSC420 is good preparation, but is not required, for COSC421 and COSC422.


Lectures are held in Owheo G34, Tuesdays 9am-11am. The lecturer is Anthony Robins, Owheo 2.53, phone 479-8314, email


• Assignment: 25%, due Friday 3rd May, 5pm (by email to me). Note the departmental policy of -10% per working day if late.


• Presentation and reviews: 10%, due date TBA (varies by student)

• Quizzes and participation: 5%

• Final exam: 60%

Lecture notes and readings

Lecture notes will be distributed in PDF form via email. There is no set textbook, but readings will be drawn from the following free online sources (lecture notes may sometimes refer to them by [abbreviation]):

[Kriesel] A Brief Introduction to Neural Networks

[Rojas] Neural Networks: A Systematic Introduction

[GBC] Deep Learning

[Nielsen] Neural Networks and Deep Learning

[NIPS] Electronic Proceedings of the Neural Information Processing Systems Conference

The following texts are of historical interest (lecture notes may sometimes refer to them by [abbreviation]).

[Hay09] Haykin, S., 2009, Neural Networks and Learning Machines. (Third edition) Pearson Education, NJ.

[Hay99] Haykin, S., 1999, Neural Networks A Comprehensive Foundation. (Second edition) Macmillan, NY.

[HKP] Hertz, J., Krogh, A. & Palmer, R.G., 1991, Introduction to the Theory of Neural Computation. Addison-Wesley, Redwood City CA.

[O'Reilly] O'Reilly, R. & Munakata, Y., 2000, Computational Explorations in Cognitive Neuroscience: Understanding the Mind by Simulating the Brain. MIT Press, Cambridge MA.

[Vol1] Rumelhart D.E., McClelland, J.L. & the PDP Research Group, 1986, Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Volume 1: Foundations. MIT Press, Cambridge MA.

[Vol2] McClelland, J.L., Rumelhart D.E. & the PDP Research Group, 1986, Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Volume 2: Psychological and Biological Models. MIT Press, Cambridge MA.


The library has many good books - try a few relevant keyword searches. The following are useful journals:

Connection Science (Sci: Phy CY14)

Network: Computation in Neural Systems (Sci: Phy N643)

Neural Computation (Sci: Phy N811)

Neural Networks (Sci: Phy N818)

IEEE Transactions on Neural Networks (Sci: Phy I718)

The excellent conference proceedings "Neural Information Processing Systems" (NIPS) are available electronically (link above); Volumes 1-19 are also in the Science library (Sci: QA/76.87/A725).

Class representatives

The class rep for 2019 is TBA. Volunteers are welcome any time!

Course outline and email

Further information about learning objectives, workload, and other details, as specified by Otago policy, can be found here.

Unless otherwise arranged, email regarding this course will go to your address, which you should check frequently for university-related correspondence.



Any problems or questions - please get in touch!

Anthony Robins

Owheo Room 253, phone 479-8314, email