CNN Assignment - Milestone 1
Overview
The purpose of this assignment is to learn about the inner workings
of CNNs. You are asked to implement LeNet5, a pioneering CNN for
MNIST digit recognition.
Assignment
This assignment is worth 100 points. Please work on it by
yourself. For this milestone, you are asked to implement the
architecture of LeNet5 as well as the forward pass. You will also be
asked to perform a limited amount of testing. In the process of
implementing it, we will make some modifications, as informed by
current best practice.
- Create new Java files/classes entitled "LeNet5.java" and
"MNISTCNN.java". They will be used to implement the LeNet5 CNN and the
set-up and testing of LeNet5 on the MNIST data set.
- Since you are already working with the MNIST data set, the code for
importing it into your project, reading the files and displaying the
images should be reused from your FFNet project. Notice that you need
to pass 2D images to the CNN rather than linearized images.
- In the "LeNet5.java" file, for this milestone, implement the
forward pass as documented in section II.B of the "Gradient-Based
Learning Applied to Document Recognition" paper by Yann LeCun, Léon
Bottou, Yoshua Bengio and Patrick Haffner.
- Set up the weight matrices as we worked out in class through the
LeNet5 architecture worksheet.
- Ensure that in the prior part, you do not use magic
numbers. Define constants that you will use to declare the arrays and
that you will reuse later in the code (see the constants sketch after
this list).
- Initialize the weights to small random numbers in the half-open
range [-0.05, 0.05), as you have done for FFNet (see the
initialization sketch after this list).
- Next, implement the padding, unless you send LeNet5 already padded
inputs (see the padding sketch after this list).
- Implement C1, the first convolutional layer, ensuring you use the
ReLU activation function (see the convolution sketch after this list).
- Please ensure you use structured programming principles. You
should have procedures for initializing the weights, for the
convolutional layers, for the pooling layers and for the classifier as
necessary.
- Implement S2, using max pooling (see the pooling sketch after this list).
- Implement C3, using the connectivity specified in Table I and
again using ReLU (see the connectivity sketch after this list).
- Implement S4. This should be the same process as used for S2,
again using max pooling.
- Implement C5. Be mindful of the full connectivity and how it is
set up. Use ReLU (see the fully connected layer sketch after this
list).
- Implement F6, which is also fully connected. Use ReLU.
- Implement the output layer. Use softmax (see the softmax sketch
after this list).
- Test your network. For some very basic testing, please run the LeNet5Testing.java code. Fix any
errors.
- Complete the lab manual.
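Implementation sketches
The sketches below illustrate one possible way to code several of the
steps above. They are minimal, stand-alone examples, not required
code: all class, constant and method names are illustrative, and your
LeNet5.java may organize things differently.

This first sketch shows the kind of named constants that replace
magic numbers; the values follow the standard LeNet5 layer sizes.

    // Illustrative constant names; the values are the standard LeNet5 sizes.
    public class LeNet5Constants {
        public static final int INPUT_SIZE   = 32;  // 32x32 padded input image
        public static final int KERNEL_SIZE  = 5;   // 5x5 convolution kernels
        public static final int POOL_SIZE    = 2;   // 2x2 pooling windows
        public static final int C1_MAPS      = 6;   // feature maps in C1 (28x28 each)
        public static final int C3_MAPS      = 16;  // feature maps in C3 (10x10 each)
        public static final int C5_UNITS     = 120;
        public static final int F6_UNITS     = 84;
        public static final int OUTPUT_UNITS = 10;  // digits 0-9
    }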
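This initialization sketch fills a weight array uniformly in
[-0.05, 0.05): Random.nextDouble() returns a value in [0, 1), so
scaling and shifting it gives the required half-open range. The class
and method names are illustrative.

    import java.util.Random;

    public class WeightInit {
        private static final double WEIGHT_RANGE = 0.05;
        private static final Random RNG = new Random();

        // Fills a 2D weight array (e.g. one 5x5 kernel) with values in [-0.05, 0.05).
        public static void initialize(double[][] weights) {
            for (int r = 0; r < weights.length; r++) {
                for (int c = 0; c < weights[r].length; c++) {
                    // nextDouble() is in [0, 1), so this expression is in [-0.05, 0.05).
                    weights[r][c] = (RNG.nextDouble() * 2.0 - 1.0) * WEIGHT_RANGE;
                }
            }
        }
    }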
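This padding sketch zero-pads a 28x28 MNIST image by 2 pixels on each
side to produce the 32x32 input LeNet5 expects. Names are
illustrative.

    public class Padding {
        // Returns a new (size + 2*padding) x (size + 2*padding) image, zero
        // everywhere except for the original pixels copied into the centre.
        public static double[][] pad(double[][] image, int padding) {
            int paddedSize = image.length + 2 * padding;
            double[][] out = new double[paddedSize][paddedSize];  // Java zero-initializes
            for (int r = 0; r < image.length; r++) {
                for (int c = 0; c < image[r].length; c++) {
                    out[r + padding][c + padding] = image[r][c];
                }
            }
            return out;
        }
    }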
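This convolution sketch computes one C1-style feature map: a "valid"
5x5 convolution over a single input map, plus a bias, followed by
ReLU. On a 32x32 input with a 5x5 kernel the output is 28x28. For C3
you would sum such convolutions over several input maps before adding
the bias and applying ReLU. The single-map signature and names are
illustrative.

    public class ConvLayer {
        public static double relu(double x) {
            return Math.max(0.0, x);
        }

        // "Valid" convolution of one input map with one kernel, then bias and ReLU.
        public static double[][] convolve(double[][] input, double[][] kernel, double bias) {
            int k = kernel.length;
            int outSize = input.length - k + 1;        // 32 - 5 + 1 = 28 for C1
            double[][] out = new double[outSize][outSize];
            for (int r = 0; r < outSize; r++) {
                for (int c = 0; c < outSize; c++) {
                    double sum = bias;
                    for (int i = 0; i < k; i++) {
                        for (int j = 0; j < k; j++) {
                            sum += input[r + i][c + j] * kernel[i][j];
                        }
                    }
                    out[r][c] = relu(sum);
                }
            }
            return out;
        }
    }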
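This pooling sketch halves a feature map with 2x2 max pooling at
stride 2, which is what S2 and S4 need (28x28 to 14x14, and 10x10 to
5x5). Names are illustrative.

    public class MaxPool {
        // 2x2 max pooling with stride 2; assumes the map has even dimensions.
        public static double[][] pool(double[][] map) {
            int outSize = map.length / 2;
            double[][] out = new double[outSize][outSize];
            for (int r = 0; r < outSize; r++) {
                for (int c = 0; c < outSize; c++) {
                    double m = map[2 * r][2 * c];
                    m = Math.max(m, map[2 * r][2 * c + 1]);
                    m = Math.max(m, map[2 * r + 1][2 * c]);
                    m = Math.max(m, map[2 * r + 1][2 * c + 1]);
                    out[r][c] = m;
                }
            }
            return out;
        }
    }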
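One way to encode the C3-to-S2 connectivity is a boolean table
indexed as connected[c3Map][s2Map]; during the C3 forward pass a
feature map only sums convolutions over the S2 maps marked true. The
entries below follow the commonly reproduced form of Table I, but
check them against the paper before relying on them.

    public class C3Connectivity {
        // CONNECTED[c3Map][s2Map] is true when that S2 map feeds that C3 map.
        public static final boolean[][] CONNECTED = buildTable();

        private static boolean[][] buildTable() {
            // For each of the 16 C3 maps, the indices of the S2 maps it reads.
            int[][] inputs = {
                {0, 1, 2},    {1, 2, 3},    {2, 3, 4},    {3, 4, 5},
                {0, 4, 5},    {0, 1, 5},    {0, 1, 2, 3}, {1, 2, 3, 4},
                {2, 3, 4, 5}, {0, 3, 4, 5}, {0, 1, 4, 5}, {0, 1, 2, 5},
                {0, 1, 3, 4}, {1, 2, 4, 5}, {0, 2, 3, 5}, {0, 1, 2, 3, 4, 5}
            };
            boolean[][] table = new boolean[16][6];
            for (int map = 0; map < inputs.length; map++) {
                for (int s2 : inputs[map]) {
                    table[map][s2] = true;
                }
            }
            return table;
        }
    }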
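This fully connected layer sketch applies a weight matrix, biases and
ReLU. It fits F6 directly, and it also fits C5 once the sixteen 5x5
S4 maps are flattened into a single input vector (a 5x5 kernel over a
5x5 map makes C5 effectively fully connected to S4). Names are
illustrative.

    public class DenseLayer {
        // weights has one row per output unit; biases has one entry per output unit.
        public static double[] forward(double[] input, double[][] weights, double[] biases) {
            double[] out = new double[weights.length];
            for (int u = 0; u < weights.length; u++) {
                double sum = biases[u];
                for (int i = 0; i < input.length; i++) {
                    sum += weights[u][i] * input[i];
                }
                out[u] = Math.max(0.0, sum);  // ReLU
            }
            return out;
        }
    }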
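This softmax sketch is for the output layer. Subtracting the largest
logit before exponentiating avoids overflow in Math.exp and does not
change the result.

    public class Softmax {
        // Converts the 10 output-layer activations into probabilities that sum to 1.
        public static double[] softmax(double[] logits) {
            double max = Double.NEGATIVE_INFINITY;
            for (double x : logits) {
                max = Math.max(max, x);
            }
            double[] out = new double[logits.length];
            double sum = 0.0;
            for (int i = 0; i < logits.length; i++) {
                out[i] = Math.exp(logits[i] - max);
                sum += out[i];
            }
            for (int i = 0; i < out.length; i++) {
                out[i] /= sum;
            }
            return out;
        }
    }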
Submission
Please submit the "LeNet5.java" and "LeNet5Testing.java" files as well as
the lab manual to the appropriate drop-box on Moodle.