ML on the Mind
I made fun of PyBrain a while back. Found myself using it again, still dissatisfied but reluctantly putting stuff together. If you’ve got a simple ML problem and can’t be arsed to hack together a backprop learner, it does serve its purpose. I’m still peeved, however, because it’s slow as balls. I’m aware that backprop isn’t the fastest learning algorithm, but I would have expected a bit more speed. I wonder if they’re accelerating with numpy, or if they’re doing everything sequentially.
I have a 256 unit visible layer, a 512 unit hidden layer, a 128 unit hidden layer, and a 256 unit output layer. With 1000 training examples, each epoch is taking 20-30 minutes. :\
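For a sanity check on what "fast" should look like, here's a rough numpy sketch of one batched backprop epoch on the same-shaped net (256-512-128-256, 1000 examples, sigmoid activations and squared error). This is not PyBrain's code, and the activations, loss, and learning rate are my assumptions; it's just meant to show that if everything is vectorized, an epoch is a handful of matrix multiplies, nowhere near 20-30 minutes.

```python
import time
import numpy as np

rng = np.random.default_rng(0)
sizes = [256, 512, 128, 256]          # layer sizes from the post
X = rng.standard_normal((1000, 256))  # 1000 dummy training examples
Y = rng.standard_normal((1000, 256))

# one weight matrix and bias vector per layer transition
Ws = [rng.standard_normal((m, n)) * 0.01 for m, n in zip(sizes[:-1], sizes[1:])]
bs = [np.zeros(n) for n in sizes[1:]]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def epoch(X, Y, Ws, bs, lr=0.1):
    # forward pass over the whole batch at once
    acts = [X]
    for W, b in zip(Ws, bs):
        acts.append(sigmoid(acts[-1] @ W + b))
    # backward pass: squared-error gradient through the sigmoids
    delta = (acts[-1] - Y) * acts[-1] * (1 - acts[-1])
    for i in reversed(range(len(Ws))):
        grad_W = acts[i].T @ delta / len(X)
        grad_b = delta.mean(axis=0)
        if i > 0:  # propagate before updating this layer's weights
            delta = (delta @ Ws[i].T) * acts[i] * (1 - acts[i])
        Ws[i] -= lr * grad_W
        bs[i] -= lr * grad_b
    return float(np.mean((acts[-1] - Y) ** 2))

t0 = time.time()
loss = epoch(X, Y, Ws, bs)
print(f"one epoch: {time.time() - t0:.3f}s, loss {loss:.4f}")
```

On any reasonably modern machine this runs in a fraction of a second per epoch, which would put the blame on a per-sample, per-weight Python loop somewhere in the library rather than on backprop itself.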