cbrwn/MLP

super simple multilayer perceptron neural network
MultiLayer Perceptron

This is a simple multilayer perceptron neural network class

I wrote everything from scratch with a VERY BASIC understanding of neural networks, so nothing about this is optimized or efficient or well done

It can accept any number of inputs, have any number of hidden layers with any number of neurons each, and produce any number of outputs. It uses tanh (or sigmoid if you want) as the activation function, and it's very slow because I implemented my matrix class super badly
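The core of a layer like the ones described above is just a matrix-vector product followed by the activation function. This is a minimal illustrative sketch (not the repo's actual code, which uses its own matrix class); the function names here are made up for the example:

```python
import math

def forward_layer(inputs, weights, biases, activation=math.tanh):
    """One dense layer: out[j] = activation(sum_i weights[j][i] * inputs[i] + biases[j]).

    weights is a list of rows, one row of input weights per neuron.
    Swap in a sigmoid for `activation` to mirror the README's other option.
    """
    return [activation(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

# Stacking calls gives the full feed-forward pass: each layer's output
# becomes the next layer's input, for any number of hidden layers.
hidden = forward_layer([1.0, 0.0], [[0.5, -0.5], [0.3, 0.8]], [0.0, 0.1])
output = forward_layer(hidden, [[1.0, -1.0]], [0.0])
```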

It's a feed-forward network which uses backpropagation to 'learn' - using supervised learning, or possibly reinforcement learning

It has only really been tested by learning to solve XOR, but I plan on having it do the classic handwritten digit recognition thing and also having it learn to play some games
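For reference, here is what the XOR test looks like as a standalone sketch: a tiny 2-4-1 network trained with plain backpropagation and gradient descent. This uses NumPy for the matrix math rather than the repo's own matrix class, and the layer sizes, learning rate, and iteration count are arbitrary choices for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR dataset: output is 1 when exactly one input is 1
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# 2 inputs -> 4 hidden neurons (tanh) -> 1 output (sigmoid)
W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(5000):
    # forward pass
    h = np.tanh(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: mean-squared-error gradients, chain rule layer by layer
    d_out = (out - y) * out * (1 - out)      # sigmoid derivative
    d_h = (d_out @ W2.T) * (1 - h ** 2)      # tanh derivative
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

loss = float(np.mean((out - y) ** 2))
```

After training, thresholding `out` at 0.5 should recover the XOR truth table (barring an unlucky local minimum, which full-batch MSE training on XOR can occasionally hit).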

Resources I used to make this:
