David Picard
ETIS - ENSEA
friday 24 april 2015
Since we had a few publications on the topic of distributed machine learning (in particular a Neurocomputing paper on distributed PCA: "Asynchronous Gossip Principal Components Analysis"), let's talk a bit more about it. My Ph.D. student Jérôme Fellus has rolled out the first version of his libagml library. This is a distributed machine learning library in C++ that relies on Gossip protocols.
The main page is here: http://perso-etis.ensea.fr/~jerofell/software.html
The way it works is dead simple: you have a mother class that corresponds to a node, and all you have to do is derive it to implement your specific local computation and aggregation procedures. All the networking, instantiation, etc. is handled by the library. Nice, isn't it?
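As a rough illustration of the pattern (libagml itself is C++; the Java sketch below uses made-up names and is not its actual API), a node boils down to something like this:

```java
// Hypothetical sketch of the derive-a-node pattern (libagml is C++;
// these names are invented for illustration only).
abstract class GossipNode<T> {
    /** compute this node's local estimate from its own data */
    abstract T localComputation();

    /** merge an estimate received from a neighbour into ours */
    abstract T aggregate(T mine, T received);

    // networking, message exchange and node instantiation are the
    // framework's job; subclasses only provide the two methods above
}

// example: gossip-based averaging of values held by the nodes
class AverageNode extends GossipNode<Double> {
    private final double localValue;

    AverageNode(double v) { localValue = v; }

    @Override
    Double localComputation() { return localValue; }

    @Override
    Double aggregate(Double mine, Double received) {
        return (mine + received) / 2.0; // pairwise gossip average
    }
}
```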
friday 29 november 2013
Here we go again, it seems I'm only alternating between new publications and updates to jkms on this page.
Version 2.2.
The library can now optionally depend on EJML in order to accelerate the eigen-decomposition. I had a lot of fun implementing some algorithms (Jacobi, QR, Householder transforms, Givens rotations, ...), which allows the library to perform all the linear algebra it needs on its own. However, it will never be competitive with dedicated libraries. So I checked the current pure Java BLAS libraries, and EJML is probably the best out there (kudos to the people behind it). I made a simple wrapper that checks whether the library is in the classpath, and uses it in that case. No older code should break because of this. If it does, email me quickly...
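For reference, such a check can be as simple as a reflective classpath probe; here is a minimal sketch of the idea, with hypothetical names (this is not the actual jkms wrapper):

```java
// Minimal sketch of an optional-dependency probe (hypothetical names,
// not the actual jkernelmachines wrapper).
public final class EjmlSupport {

    /** true if EJML was found on the classpath at load time */
    public static final boolean AVAILABLE = detect();

    private static boolean detect() {
        try {
            // DenseMatrix64F is a core EJML class; if it loads, EJML is there
            Class.forName("org.ejml.data.DenseMatrix64F");
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // callers would branch on AVAILABLE to pick the EJML-backed
        // eigen-decomposition or fall back to the built-in Jacobi/QR code
        System.out.println("EJML on classpath: " + AVAILABLE);
    }
}
```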
Next, I will wrap more things around EJML (i.e. not only eig), but I still want jkms to be totally autonomous. That is, no existing feature will ever require EJML (nor any other library).
Another new feature is a fast kernel based on the Nyström approximation, with an active learning strategy for fast training. This is part of the work with Hedi Tabia that I presented at BMVC last September.
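For the curious, the core idea of the Nyström approximation is to replace the full kernel k(x, z) by k(x, L)^T W^{-1} k(z, L), where L is a small set of m landmark samples and W the m x m kernel matrix among them. Here is a minimal sketch of that idea with a generic kernel interface (hypothetical code, not the jkernelmachines implementation):

```java
import java.util.Arrays;

// Hedged sketch of the Nyström idea (hypothetical names, not the actual
// jkernelmachines API): the full kernel k(x, z) is approximated by
// k(x, L)^T W^{-1} k(z, L) with L a small set of m landmark samples.
public class NystromSketch {

    /** a generic kernel function (hypothetical interface) */
    public interface Kernel { double k(double[] x, double[] z); }

    private final double[][] landmarks;
    private final double[][] W; // m x m kernel among the landmarks
    private final Kernel kernel;

    public NystromSketch(double[][] landmarks, Kernel kernel) {
        this.landmarks = landmarks;
        this.kernel = kernel;
        int m = landmarks.length;
        W = new double[m][m];
        for (int i = 0; i < m; i++)
            for (int j = 0; j < m; j++)
                W[i][j] = kernel.k(landmarks[i], landmarks[j]);
        // in practice one would regularize W (e.g. W + lambda*I)
        // to keep the solve below well conditioned
    }

    /** kernel vector between x and the landmarks */
    private double[] kvec(double[] x) {
        double[] c = new double[landmarks.length];
        for (int i = 0; i < c.length; i++)
            c[i] = kernel.k(x, landmarks[i]);
        return c;
    }

    /** approximate kernel value k(x, L)^T W^{-1} k(z, L) */
    public double approx(double[] x, double[] z) {
        double[] y = solve(W, kvec(z));
        double[] c = kvec(x);
        double s = 0;
        for (int i = 0; i < c.length; i++) s += c[i] * y[i];
        return s;
    }

    /** naive Gaussian elimination with partial pivoting: solves A u = b */
    private static double[] solve(double[][] A0, double[] b0) {
        int n = b0.length;
        double[][] A = new double[n][];
        for (int i = 0; i < n; i++) A[i] = Arrays.copyOf(A0[i], n);
        double[] b = Arrays.copyOf(b0, n);
        for (int k = 0; k < n; k++) {
            int p = k;
            for (int i = k + 1; i < n; i++)
                if (Math.abs(A[i][k]) > Math.abs(A[p][k])) p = i;
            double[] t = A[k]; A[k] = A[p]; A[p] = t;
            double tb = b[k]; b[k] = b[p]; b[p] = tb;
            for (int i = k + 1; i < n; i++) {
                double f = A[i][k] / A[k][k];
                for (int j = k; j < n; j++) A[i][j] -= f * A[k][j];
                b[i] -= f * b[k];
            }
        }
        double[] u = new double[n];
        for (int i = n - 1; i >= 0; i--) {
            double s = b[i];
            for (int j = i + 1; j < n; j++) s -= A[i][j] * u[j];
            u[i] = s / A[i][i];
        }
        return u;
    }
}
```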
monday 10 june 2013
The linear algebra package is at the moment very rough. I find it somehow useful to perform some kind of pre-processing (a PCA for example). At the moment, my matrix code is a bit slow. If I ever find the time to write solid matrix operations, I will add some nice features like low-rank approximations of kernels (Nyström).
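To fix ideas, here is the flavour of pre-processing I mean, as a minimal sketch (not jkms code): the leading principal component obtained by power iteration, without ever forming the covariance matrix explicitly.

```java
import java.util.Random;

// Minimal sketch (not actual jkernelmachines code) of PCA-style
// pre-processing: the leading eigenvector of the covariance matrix,
// found by power iteration on the centered data.
public class PcaSketch {

    static double[] leadingDirection(double[][] x, int iters) {
        int n = x.length, d = x[0].length;
        // center the data
        double[] mean = new double[d];
        for (double[] row : x)
            for (int j = 0; j < d; j++) mean[j] += row[j] / n;
        // random start vector
        Random rnd = new Random(0);
        double[] v = new double[d];
        for (int j = 0; j < d; j++) v[j] = rnd.nextGaussian();
        for (int it = 0; it < iters; it++) {
            // w = (1/n) * sum_i (x_i - mean) * <x_i - mean, v>
            double[] w = new double[d];
            for (double[] row : x) {
                double dot = 0;
                for (int j = 0; j < d; j++) dot += (row[j] - mean[j]) * v[j];
                for (int j = 0; j < d; j++) w[j] += (row[j] - mean[j]) * dot / n;
            }
            // normalize to keep the iteration stable
            double norm = 0;
            for (double wj : w) norm += wj * wj;
            norm = Math.sqrt(norm);
            for (int j = 0; j < d; j++) v[j] = w[j] / norm;
        }
        return v;
    }
}
```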
Nevertheless, I suggest always picking the latest git version instead of these releases. The API is very stable now and should not change significantly, which means that all the code you write now will be supported in the next few years. Thus, picking the latest git always ensures you have the bug fixes and so on (I don't release versions only for bug fixes).
One more thing: JKernelMachines has been published in JMLR last month. I encourage you to read the paper and to cite it if you ever use the code for your publications.
tuesday 05 march 2013
JKernelMachines version number bumped to 2.0!
The big changes are:
friday 16 november 2012
I am organizing a special session with Philippe Gosselin at this year's ESANN conference.
Machine Learning for multimedia applications
David Picard, ETIS – ENSEA (France)
Philippe-Henri Gosselin, INRIA Rennes (France)
In recent years, many multimedia applications have been greatly improved by leveraging machine learning techniques. These applications include image and video classification, object recognition, image and video retrieval, and event detection.
However, these multimedia applications also uncover new machine learning problems in areas such as mid-level feature learning, distance learning, feature combination, and so on.
This special session is intended for research papers that apply machine learning to multimedia problems. The following topics are of particular interest: