COSC420 - Neural Networks

Semester 1, 2018 (The Guide to Enrolment 2018 incorrectly says this paper is not offered in 2018)

Hi, and welcome to COSC420 in 2018.  If you haven't seen it before, here is the standard overview:

Despite its slow "hardware", the brain is a much more powerful and sophisticated computational system than any computer ever built. What can the brain teach us about computation and how to perform complex tasks such as natural language processing, vision, and control and optimisation problems? Artificial neural networks are a family of methods that try to address these issues and explore "brain-like" computation, information processing and learning. "Deep learning" is a currently popular technology based on convolutional neural networks, used by organisations such as Google, Baidu, Microsoft and Apple.

This paper will cover an introduction to neural networks, a survey of algorithms and architectures (including deep learning), and practical applications of neural networks. Depending on student interest we may also cover the impact of neural networks in other fields (such as psychology, neuroscience, parallel computers), or related topics such as artificial life, software agents, and robotics.

COSC420 is good preparation, but is not required, for COSC421 and COSC422.


Lectures are held in Owheo G34, Tuesdays 11am-1pm. The lecturer is Anthony Robins, Owheo 2.53, phone 479-8314, email


• Assignment: 25%, due date TBA (by email to me). Note the departmental policy of -10% per working day if late.

• Presentation and reviews: 10%, due dates TBA (vary by student)

• Quizzes and participation: 5%

• Final exam: 60% - date and venue to be arranged.


The content of the course is fairly fluid this year, and can be influenced by student interests! We will start out by introducing the basics and the practical use of neural nets. We then look at important architectures such as 1-layer nets, multi-layer nets (back propagation), Hopfield nets, competitive learning, self-organising maps, recurrent nets, dynamic architectures and others.
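As a taste of the simplest architecture mentioned above, a 1-layer net can be trained with the classic perceptron learning rule in a few lines of Python. This is an illustrative sketch only, not course material: the AND task, learning rate, and epoch count are arbitrary choices of ours.

```python
# Minimal 1-layer perceptron trained on the logical AND function.
# Illustrative sketch only: task, learning rate, and epoch count
# are arbitrary choices, not part of the course material.

def step(x):
    """Threshold activation: fire (1) if the weighted sum is non-negative."""
    return 1 if x >= 0 else 0

# (input pair, target) examples for AND
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]   # weights
b = 0.0          # bias
lr = 0.1         # learning rate

for epoch in range(20):
    for (x1, x2), target in data:
        y = step(w[0] * x1 + w[1] * x2 + b)
        err = target - y
        # Perceptron learning rule: w <- w + lr * error * input
        w[0] += lr * err * x1
        w[1] += lr * err * x2
        b += lr * err

print([step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in data])
# prints [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop settles on a correct set of weights; a single layer cannot learn non-separable problems such as XOR, which is one motivation for the multi-layer nets covered next.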

Then we can choose from a range of topics, determined by student interest. Possibilities include:

  • deep learning (convolutional neural networks)
  • neural nets and the brain ("real neuroscience")
  • neural nets and cognition (models of memory, learning, language etc.)
  • parallel hardware implementations ("neurocomputers")
  • catastrophic forgetting and ongoing learning (my own research)
  • neural network and symbolic hybrids (e.g. nets and decision trees)
  • mathematical and Bayesian interpretations
  • practical applications of neural nets
  • artificial life / software agents
  • robotics / vision
  • other - suggest your own!

Lecture notes and readings

The Department has gone "paperless", so lecture notes and readings are distributed as PDFs via email. There is no set textbook. The books below will be referred to, often by the [abbreviation] shown. Readings for each lecture will be taken from these sources.

[Hay09] Haykin, S., 2009, Neural Networks and Learning Machines. (Third edition) Pearson Education, NJ.

[Hay99] Haykin, S., 1999, Neural Networks A Comprehensive Foundation. (Second edition) Macmillan, NY.

[HKP] Hertz, J., Krogh, A. & Palmer, R.G., 1991, Introduction to the Theory of Neural Computation. Addison-Wesley, Redwood City CA.

[O'Reilly] O'Reilly, R. & Munakata, Y., 2000, Computational Explorations in Cognitive Neuroscience: Understanding the Mind by Simulating the Brain. MIT Press, Cambridge MA.

[Vol1] Rumelhart D.E., McClelland, J.L. & the PDP Research Group, 1986, Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Volume 1: Foundations. MIT Press, Cambridge MA.

[Vol2] McClelland, J.L., Rumelhart D.E. & the PDP Research Group, 1986, Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Volume 2: Psychological and Biological Models. MIT Press, Cambridge MA.


The library has many good books - try a few relevant keyword searches. The excellent conference proceedings "Advances in Neural Information Processing Systems" (NIPS) Volumes 1 - 19 are in the Science library (Sci: QA/76.87/A725); later volumes are available online. See also the following journals:

Connection Science (Sci: Phy CY14)

Network: Computation in Neural Systems (Sci: Phy N643)

Neural Computation (Sci: Phy N811)

Neural Networks (Sci: Phy N818)

IEEE Transactions on Neural Networks (Sci: Phy I718)


My Neural Networks page has information about neural nets research and teaching at Otago, as well as links to FAQs and sites for journals, conferences, demos, simulators, ftp sites, online bibliographies, etc.

Class representatives

The class rep for 2016 is Joseph Cahill-Lane. Other volunteers are welcome any time!

Course outline and email

Further information about learning objectives, workload, and other details as specified by Otago policy, can be found here.

Unless otherwise arranged, email regarding this course will go to your address, which you should check frequently for university-related correspondence.


Any problems or questions - please get in touch!

Anthony Robins

Owheo 2.53, phone 479-8314, email