Module Catalogues, Xi'an Jiaotong-Liverpool University   
Module Code: CSE301
Module Title: Bio-Computation
Module Level: Level 3
Module Credits: 5.00
Academic Year: 2018/19
Semester: SEM1
Originating Department: Computer Science and Software Engineering
Pre-requisites: N/A
Aims
1. To introduce students to a range of topics in the field of artificial neural networks, and to provide them with hands-on familiarity with some of the established techniques.

2. To highlight some contemporary issues within the domain of neural computation with regard to biologically-motivated computing, particularly in relation to multidisciplinary research.

3. To emphasise the need to keep up-to-date in developing areas of science and technology and provide some skills necessary to achieve this.

4. To enable students to make reasoned decisions about the engineering of machine learning systems.
Learning outcomes 
By the end of the module students will be expected to:

1. Account for the biological and historical developments of neural computation. Describe the nature and operation of the Perceptron, MLP, Convolutional Neural Network (CNN), stacked auto-encoders, competitive learning, Oja's learning rule, and SOM networks, and assess the appropriate applications and limitations of ANNs;

2. Apply their knowledge to some emerging research issues in the field;

3. Understand how ANN models work, both in general terms and with respect to specific applications, e.g., prediction and classification. This understanding will be reinforced by first-hand experience in problem solving and assessed by coursework and examination;

4. Understand some contemporary topics in artificial neural networks, and deep neural networks in particular, including the CNN, deep belief net, and stacked auto-encoder, with awareness of their advantages and applications;

5. Demonstrate awareness of some modern machine learning concepts; and

6. Demonstrate familiarity with the essentials of Matlab and relevant toolboxes, so as to enable exploration of the above in practical applications of ANNs.

Method of teaching and learning 
1. Didactic component - the core of the teaching is lecture-based, with Q&A and feedback. Lectures are supported by tutorials and labs/practicals.

2. Self-learning component - students are encouraged to read around the subject materials.

3. Comprehension/review exercise - two continuous assessments, following supervised discussion and Q&A sessions in the seminars.

4. Case studies will be supplied to help students place the course material in context.

Lectures 1-2: Biological basics and historical context of neural computation: neurons, synapses, the brain, neural computation and computational neuroscience, Hebb's rule, the McCulloch-Pitts neuron, the Perceptron and nonlinear separability, dynamical systems, etc.;
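
As a minimal illustration of the Perceptron learning rule named above, the following is a Python/NumPy sketch for orientation only (the module's practical work uses Matlab, and the function names here are illustrative, not part of any course material):

```python
import numpy as np

def train_perceptron(X, y, epochs=20, lr=1.0):
    # Rosenblatt's rule: w <- w + lr * (target - output) * input
    Xb = np.hstack([X, np.ones((len(X), 1))])   # bias handled as an extra input
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, ti in zip(Xb, y):
            out = 1 if xi @ w > 0 else 0        # hard-threshold activation
            w += lr * (ti - out) * xi           # update only on mistakes
    return w

def predict(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (Xb @ w > 0).astype(int)

# Logical AND is linearly separable, so the rule converges to a solution
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w = train_perceptron(X, y)
```

XOR, by contrast, admits no separating weight vector for a single Perceptron, which is exactly the nonlinear-separability limitation that motivates the MLP in later lectures.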

Lectures 3-4: The nature of machine learning and issues related to solving problems using ANNs, including regression, prediction, and pattern classification;

Lectures 5-8: The MLP and relevant issues in practice; supervised learning; the multilayer perceptron contrasted with the Perceptron; sigmoidal activation functions, the generalised delta rule, adaptation and learning, convergence, gradient descent, recent developments;
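
To show how the generalised delta rule and gradient descent listed above fit together, here is a sketch (in Python rather than the Matlab used in the labs) for the simplest case of a single sigmoid unit; the helper names are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_sigmoid_unit(X, y, epochs=10000, lr=0.5):
    # Batch gradient descent on squared error for one sigmoid unit.
    # The delta term (t - o) * o * (1 - o) is the generalised delta rule
    # specialised to a single output unit (o * (1 - o) is sigmoid's derivative).
    Xb = np.hstack([X, np.ones((len(X), 1))])   # bias as an extra input
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        o = sigmoid(Xb @ w)
        delta = (y - o) * o * (1 - o)
        w += lr * delta @ Xb                    # step down the error gradient
    return w

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 0., 0., 1.])                  # logical AND targets
w = train_sigmoid_unit(X, y)
preds = (sigmoid(np.hstack([X, np.ones((4, 1))]) @ w) > 0.5).astype(int)
```

In a full MLP the same delta terms are propagated backwards through the hidden layers, which is the backpropagation algorithm covered in these lectures.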

Lectures 9-12: Convolutional Neural Network (CNN), with key concepts including filters (convolution) and pooling; efficient convolution algorithms; backpropagation for CNNs;
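
The two CNN building blocks named above, filtering and pooling, can be sketched directly in Python/NumPy (a toy forward pass only; the function names are illustrative, and "convolution" is implemented as the cross-correlation commonly used in CNN layers):

```python
import numpy as np

def conv2d_valid(img, kernel):
    # 'valid' 2-D cross-correlation: slide the filter over the image
    # with no padding, taking a weighted sum at each position.
    H, W = img.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool2x2(x):
    # Non-overlapping 2x2 max pooling (trims odd edges)
    H, W = x.shape
    return x[:H // 2 * 2, :W // 2 * 2].reshape(H // 2, 2, W // 2, 2).max(axis=(1, 3))

img = np.arange(16.0).reshape(4, 4)
k = np.array([[1.0, 0.0], [0.0, 1.0]])   # sums each pixel with its lower-right neighbour
feat = conv2d_valid(img, k)              # 3x3 feature map
pooled = max_pool2x2(feat)               # 1x1 after pooling
```

The same filter weights are reused at every position (weight sharing), which is what keeps CNN parameter counts small relative to fully connected layers.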

Lectures 13-16: Deep belief nets; the Boltzmann machine (BM) and restricted Boltzmann machine (RBM); contrastive divergence; deep Boltzmann machines; convolutional Boltzmann machines;

Lectures 17-20: Auto-encoder, denoising auto-encoder, stacked auto-encoder;

Lectures 21-24: Associative memory models; auto-associative and hetero-associative memory models; associative memory models as efficient tools for pattern recognition;

Lectures 25-28: Recurrent neural networks and dynamic neural network models; the Hopfield model; the Elman network and its applications in time-series prediction;
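
As a small illustration of the Hopfield model listed above (a Python sketch with illustrative function names; Hebbian outer-product storage and threshold updates, here applied synchronously for simplicity):

```python
import numpy as np

def hopfield_train(patterns):
    # Hebbian outer-product rule over +/-1 patterns; no self-connections
    P = np.array(patterns, dtype=float)
    W = P.T @ P
    np.fill_diagonal(W, 0.0)
    return W

def hopfield_recall(W, state, steps=5):
    # Repeated threshold updates pull the state toward a stored attractor
    s = np.array(state, dtype=float)
    for _ in range(steps):
        s = np.where(W @ s >= 0, 1.0, -1.0)
    return s

stored = [1, -1, 1, -1, 1, -1]          # one +/-1 pattern to memorise
W = hopfield_train([stored])
noisy = [1, -1, 1, -1, -1, -1]          # the same pattern with one bit flipped
recalled = hopfield_recall(W, noisy)
```

Recovering a stored pattern from a corrupted cue in this way is exactly the auto-associative memory behaviour discussed in lectures 21-24.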

Lectures 29-32: Unsupervised learning; k-means clustering and hybrid learning models; Oja's rule and PCA networks; applications of PCA and Hebbian learning;
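
The link between Oja's rule and PCA named above can be demonstrated in a few lines: a single linear neuron trained with Oja's rule converges to the data's first principal component. A Python sketch on synthetic data (the variable names and data are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
# Correlated 2-D data whose dominant principal component lies along [1, 1]
t = rng.normal(size=500)
X = np.column_stack([t, t]) + 0.1 * rng.normal(size=(500, 2))

w = rng.normal(size=2)                 # random initial weight vector
eta = 0.01
for _ in range(20):                    # a few passes over the data
    for x in X:
        y = w @ x                      # linear neuron's output
        w += eta * y * (x - y * w)     # Oja's rule: Hebbian term plus decay

pc1 = np.array([1.0, 1.0]) / np.sqrt(2)
alignment = abs(w / np.linalg.norm(w) @ pc1)
```

The `- y * w` decay term is what distinguishes Oja's rule from plain Hebbian learning: it keeps the weight vector bounded (near unit norm) instead of growing without limit.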

Lectures 33-36: Competitive learning; Kohonen self-organising maps (SOM); the nature of unsupervised competitive learning and its relation to clustering; comparisons with statistical methods such as k-means.
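
A toy 1-D Kohonen map makes the competitive-learning update above concrete (a Python sketch with illustrative parameter choices; the labs use Matlab):

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.uniform(0.0, 1.0, size=2000)   # scalar inputs drawn from [0, 1]

# A 1-D SOM with 5 nodes, started clustered in the middle; training
# should spread the node weights out to cover the input range.
weights = rng.uniform(0.4, 0.6, size=5)
n = len(data)
idx = np.arange(5)
for step, x in enumerate(data):
    frac = step / n
    lr = 0.5 * (1.0 - frac)                               # decaying learning rate
    sigma = max(2.0 * (1.0 - frac), 0.5)                  # shrinking neighbourhood
    bmu = int(np.argmin(np.abs(weights - x)))             # best-matching unit
    h = np.exp(-((idx - bmu) ** 2) / (2.0 * sigma ** 2))  # neighbourhood function
    weights += lr * h * (x - weights)                     # pull BMU and neighbours toward x
```

Unlike k-means, which moves only the winning centre, the neighbourhood function also drags the winner's neighbours on the map toward the input, and that is what gives the SOM its topological ordering.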

Delivery Hours
                Lectures  Seminars  Tutorials  Lab/Practicals  Fieldwork/Placement  Other (Private study)  Total
Hours/Semester  26        -         13         11              -                    100                    150


Sequence  Method               % of Final Mark
1         Written Examination  80.00
2         Assessment Task      10.00
3         Assessment Task      10.00

Module Catalogue generated from SITS CUT-OFF: 5/22/2018 9:50:52 PM