Message Boards

Deep Learning Library for Mathematica

Posted 9 years ago

Dear all,

I have been working on an open source deep learning library in Mathematica which I would like to share.

The code is hosted on github under an MIT license: https://github.com/jfrancis71/Machine-Vision

I have successfully used it to train a face recognition system, and also to automatically drive my Lego EV3 robot.

Face Recognition: https://www.youtube.com/watch?v=TdRtUnSppB0

EV3 Car: https://www.youtube.com/watch?v=DCad82UdDFA

This has arisen from a homegrown project, so the code is probably a little scruffy (and not well documented) at the moment. Having said that it does work and I would welcome constructive feedback and help turning this into an active open source project.

It's not really aimed at someone who just wants a good face recogniser; the new built-in Mathematica functions are probably better for that. It's more suitable for someone interested in the internals or in developing their own neural nets.

The distinction from the current Mathematica machine learning functionality is that this library is designed to support multi-layer networks, including convolutional layers (much used in computer vision at present).

There is more information available on the github site in the link above.

Thanks, Julian Francis.

POSTED BY: Julian Francis
7 Replies
Posted 9 years ago

Julian, I am working in a very similar field and have been building generalized machine learning engines using Mathematica 10. Shoot me an e-mail if you would like to chat about your work and its direction. I believe there could be some heavy overlap in what we are doing, and I would like to share what I have been doing with you. My e-mail is wduhe@yottaforce.co and my Skype ID is william.john.pierre.duhe - would love to hear back from you soon.

POSTED BY: William Duhe

I was following this GPU acceleration tutorial that uses OpenCLLink and CUDALink with a package for RecurrentNeuralNetworks, but the link to download the RNN package is broken. I reached out to the developer, and his email is bouncing.

http://www.ims2015.net/proceedings/papers/29.pdf

I think this will be very helpful for you though. I have a beefy NVIDIA card and would love to get something working.
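
In case it helps anyone reading along, a quick way to check whether Mathematica's CUDALink can see an NVIDIA card before trying any GPU-accelerated code (these are standard CUDALink functions, nothing specific to the RNN package in the paper):

```
(* Load CUDALink and check whether a usable NVIDIA GPU and driver are available *)
Needs["CUDALink`"]
CUDAQ[]            (* True if CUDALink can use a GPU on this machine *)
CUDAInformation[]  (* details of the detected device(s) *)
```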

BTW: I have a small budget to contribute if you are interested. :)

POSTED BY: David Johnston
Posted 9 years ago

The link that you (David) posted does seem to work. It looks like they have quite a small recurrent neural net going, but it looks like fun. I'll certainly revisit that when I get around to implementing a recurrent net.

I think for the next few weeks I plan on still playing around with the existing network for pattern recognition. There's quite a lot of room for improvement in the way I obtain my training data sets; there might be much better ways of doing things. I've hardly played with learning rates at all, just sticking with the bog-standard default (0.01). Also, I suspect there are much better neural architectures (hopefully just using the existing layer types).
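
For reference, here is a minimal sketch of the plain gradient-descent update that the learning rate controls; this is generic Wolfram Language, not necessarily how CognitoNet structures its actual training loop:

```
(* Vanilla SGD step: the learning rate scales how far the weights move
   along the negative gradient. Illustrative only. *)
sgdStep[weights_, gradients_, learningRate_ : 0.01] :=
  weights - learningRate*gradients;

(* Example: the default rate of 0.01 versus a more cautious 0.001 *)
sgdStep[{0.5, -0.2}, {0.1, 0.3}]
sgdStep[{0.5, -0.2}, {0.1, 0.3}, 0.001]
```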

So for the moment I think my focus is on making these things perform really well (as opposed to fast).

Just in reference to your paper link: although I can see how much fun it would be to write one's own GPU kernels, I'd probably go down NVIDIA's cuDNN route. Just from personal experience, there's quite an art to writing extremely efficient GPU code, and I'd be pretty tempted to leverage someone else's, especially if it's basically doing the same calculations (which it probably would be).

I'm out of the UK for a couple of weeks (and without internet), so I won't be doing anything on this until I return.

Anyway, those are my thoughts.

Kind regards, Julian.

POSTED BY: Julian Francis
Posted 9 years ago

Hi David,

Yes, I am definitely thinking about RNNs. I haven't got any practical experience of building RNNs, so I'm not entirely sure whether this will be a simple change or require a significant architectural change. Clearly much of the existing code will be relevant; after all, we still need to do forward and backward propagation and calculate gradients, etc. But how much more work is involved I'm not sure.

For the short term I'm planning on basically using the net, seeing what I can teach it to do, and finding out what architectures are well suited to different pattern recognition tasks.

After that, there are two projects competing for my attention. One is the RNN route, as you mention. The other is that the library is slower than some of the state-of-the-art learning packages, as they can tap into NVIDIA cards to get hardware-accelerated learning. I have thought it would be quite nice to give CognitoNet the same capability. I probably wouldn't write my own GPU code; I'd probably use NVIDIA's cuDNN library, which I think does everything CognitoNet would need. That would involve writing a DLL to forward inbound Mathematica calls onto the cuDNN library (and back again). I don't think it's particularly difficult, but it's not trivial code either.
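
To give a rough idea of what the Mathematica side of such a bridge might look like, here is a minimal sketch. The DLL name "cuDNNLink" and the exported function "convolutionForward" are hypothetical; the actual DLL would be written in C against Wolfram LibraryLink and cuDNN, and Mathematica would just load it and pass tensors through:

```
(* Hypothetical bridge: load a convolution-forward function from a DLL that
   wraps cuDNN. All names here are placeholders, not an existing library. *)
cudnnConvolutionForward = LibraryFunctionLoad[
   "cuDNNLink",                                       (* hypothetical DLL name *)
   "convolutionForward",                              (* hypothetical exported C function *)
   {{Real, 4, "Constant"}, {Real, 4, "Constant"}},    (* input batch and filter bank *)
   {Real, 4}                                          (* convolved output tensor *)
   ];

(* Usage would then be: output = cudnnConvolutionForward[inputBatch, filters] *)
```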

I have got some code that I call Dream code, which I'll aim to migrate in; it does the same sort of stuff as DeepDream. It just varies the input picture to maximise a particular neuron's output, and it fits quite nicely into the feed-forward architecture. It's quite fun, but it's not as spectacular as DeepDream, I think primarily because DeepDream can work with higher-resolution images, whereas I primarily work on image patches of size 32×32 (although I am starting to move onto patches of 64×64). I think making it look really cool would involve moving to larger images, around 256×256, and that would require GPU support to be practical, hence my NVIDIA interest.
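
In outline, the idea is just gradient ascent on the input image. A minimal sketch, where netActivationGradient is a hypothetical helper standing in for whatever computes the gradient of the chosen neuron's activation with respect to the input:

```
(* Nudge the image in the direction that increases the chosen neuron's output.
   netActivationGradient is a placeholder, not an actual CognitoNet function. *)
dreamStep[net_, image_, neuron_, stepSize_ : 0.1] :=
  image + stepSize*netActivationGradient[net, image, neuron];

(* Repeat the step to gradually amplify whatever the neuron responds to *)
dream[net_, image_, neuron_, iterations_ : 50] :=
  Nest[dreamStep[net, #, neuron] &, image, iterations]
```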

The license for CognitoNet is very liberal, so please feel free to contribute!

Thanks, Julian.

POSTED BY: Julian Francis

Great work on this. You obviously spent a lot of time on the Wiki and all that. Looks great.

I really appreciate your hard work and willingness to share it.

Are you thinking of building a similar package for RNNs? I would love to do cool stuff like DeepDream does.

POSTED BY: David Johnston
Posted 9 years ago

Thanks for sharing, Julian :)

POSTED BY: Thomas Eli

@Julian Francis, this is a very nice project, thanks for sharing! You might consider applying to the Wolfram Summer School, where similar projects are worked on. You would have a chance to connect with Wolfram folks who work on the built-in machine learning tools, and also with many international students with similar interests.

POSTED BY: Vitaliy Kaurov