Day 13

Today I continued the tests for my classifier experiment. I worked on finding the optimal parameters for the Multi-Layer Perceptron (MLP) classifier, trying to determine the best hidden layer shape and weight decay. The hidden layer shape specifies how many hidden layers the network has and how many neurons (the individual processing units of the network) are in each layer. I started with only the knowledge that there would be at most three hidden layers. Weight decay is a penalty that shrinks the weight values a little after each training iteration to prevent overfitting to the training data. I knew that the best value for weight decay would be 1e-4, 5e-4, or 0, so I wrote a program to test different combinations of hidden layer shapes and weight decay values.

While these tests were running, I documented some of the classifiers I was working with, including the MLP classifier. This process helped me better understand MLP and the other classifiers because it made me work through each part of their code. Eventually the parameter tests finished, and they suggest that a weight decay of 5e-4 gives the most accurate results. I am still narrowing down the hidden layer shape, although it seems that with a single layer, the number of neurons should be between 50 and 100.
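A search like the one described above can be sketched roughly as follows. This is only an illustration, assuming scikit-learn's `MLPClassifier` (where weight decay corresponds to the L2 penalty parameter `alpha`) and a synthetic dataset; the actual experiment's data and code may differ.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

# Illustrative stand-in for the experiment's dataset.
X, y = make_classification(n_samples=200, n_features=20, random_state=0)

param_grid = {
    # Candidate hidden layer shapes: one to three layers (sizes are examples).
    "hidden_layer_sizes": [(50,), (100,), (50, 50), (100, 50, 25)],
    # The three candidate weight decay (L2 penalty) values from the post.
    "alpha": [0.0, 1e-4, 5e-4],
}

# Try every combination with cross-validation and keep the most accurate one.
search = GridSearchCV(
    MLPClassifier(max_iter=300, random_state=0),
    param_grid,
    cv=3,
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```

Exhaustively trying each combination this way is feasible here because the grid is small (4 shapes × 3 decay values).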
Today was different because I was the only high school intern in the lab, as Ryan is out of town this week. However, the college students were there and I still had Ron for help.
