LibAGML first release

Since we have had a few publications on the topic of distributed machine learning (in particular a Neurocomputing paper on distributed PCA: "Asynchronous Gossip Principal Components Analysis"), let's talk a bit more about it. My Ph.D. student Jérôme Fellus has rolled out the first version of his libagml library. It is a distributed machine learning library in C++ that relies on gossip protocols.

The main page is here:

The way it works is dead simple: you have a mother class that corresponds to a node, and all you have to do is derive it to implement your specific local computation and aggregation procedures. All the networking, instantiation, etc. is handled by the library. Nice, isn't it?
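
To make the pattern concrete, here is a minimal sketch of what a derived node could look like. It is written in Java purely for illustration (libagml itself is C++), and all class and method names here are hypothetical, not the actual libagml API:

    // Hypothetical base class: one instance per network node.
    // The library would handle networking and peer selection; the user
    // only fills in the local computation and the aggregation rule.
    public abstract class GossipNode {
        // Local computation performed on this node's own data.
        public abstract void computeLocal();
        // Aggregation of an estimate received from a random neighbor.
        public abstract void aggregate(double[] remoteEstimate);
        // Current local estimate, sent out when this node gossips.
        public abstract double[] estimate();
    }

    // Example derivation: plain gossip averaging of a local vector.
    class AveragingNode extends GossipNode {
        private double[] x; // local estimate

        AveragingNode(double[] init) { x = init.clone(); }

        public void computeLocal() { /* nothing to do for plain averaging */ }

        public void aggregate(double[] r) {
            // Pairwise averaging: both nodes converge to the network mean.
            for (int i = 0; i < x.length; i++) x[i] = 0.5 * (x[i] + r[i]);
        }

        public double[] estimate() { return x.clone(); }
    }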

No news, good news

This post should explain why I have not posted anything new here for a long time. First, I've been very busy doing research this past year. In no particular order:

  • My very first PhD student, Romain Negrel, successfully defended his thesis in December 2014. He is now pursuing a postdoc at GREYC, Caen, France.
  • I was invited for a month at TU Darmstadt between August and September, working with Dr.-Ing. V. Willert and Thomas Guthier on learning local descriptors. While the work isn't finished yet (read: not published), it is very interesting. All of this was funded by the DAAD, for which I am very grateful.
  • I've been working a lot recently on computer vision and image processing for cultural heritage purposes. To that end, I've set up a benchmark using images from the BnF, based on a project of ours funded by the Labex Patrima. A paper on that has been accepted in IEEE Signal Processing Magazine. Within this project, we are starting to work on interesting things involving deep learning with my new postdoc Yi Ren.
  • Still on cultural heritage and image processing, I'm organizing a special session at this year's edition of GRETSI. The call for papers is here; you are of course all invited to submit as many papers as you can to share new work in this interesting area.
  • I've been publishing quite a few things in the past months (at my scale, of course), all of which you can find on the publications page.

Now for things only loosely related to research: I've been elected head of the computer science department at the ENSEA, which means that I now have a lot of administrative work to do. If you have any inquiries regarding CS at our graduate school, I guess I am now the guy to ask.

Also, I've been releasing an album with my oldest band, which you can download for free here.

Some Publications, JKMS, ESANN

2014 is set to be a good year! We already have the reviews for a few papers I've been working on lately. Some are in the ML domain (an ICPR paper with Romain Negrel on supervised sparse subspace learning, an ESANN paper with Jérôme Fellus on decentralized PCA), others in CV (two journal papers in revision on low-level visual descriptors with Olivier Kihl), and one in 3D indexing with Hedi Tabia (CVPR poster).

Other than that, I've pushed version 2.3 of jkms. I've tagged it the "density edition" since most of the changes are related to density estimators (mostly one-class SVM). I've introduced the density version of SimpleMKL, which could be useful to perform model selection. Basically, if you set C=1, you'll get a Parzen estimator, albeit one that selects its kernel from a specific set.
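
For intuition: in a Parzen estimator every training sample carries the same weight, so the density is just the averaged kernel response over the training set. A minimal, self-contained sketch of that estimator (plain Java, not jkms's actual classes; the Gaussian kernel and the gamma parameter are illustrative choices):

    import java.util.List;

    // Parzen (kernel density) estimator: every training sample gets the
    // same weight 1/n, as in the C=1 case discussed above.
    public class ParzenDensity {
        private final List<double[]> samples;
        private final double gamma;

        public ParzenDensity(List<double[]> samples, double gamma) {
            this.samples = samples;
            this.gamma = gamma;
        }

        // Gaussian RBF kernel k(x, y) = exp(-gamma * ||x - y||^2).
        private double kernel(double[] x, double[] y) {
            double d2 = 0;
            for (int i = 0; i < x.length; i++) {
                double d = x[i] - y[i];
                d2 += d * d;
            }
            return Math.exp(-gamma * d2);
        }

        // Density estimate: average kernel response over the training set.
        public double density(double[] x) {
            double s = 0;
            for (double[] xi : samples) s += kernel(x, xi);
            return s / samples.size();
        }
    }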

Finally, I'll be in Brugge next week for the ESANN 2014 conference. A good way to start new projects, if anyone volunteers!

JKernelMachines 2.2

Here we go again; it seems I'm only alternating between publication news and jkms updates on this page.

New in version 2.2:

  • Fast kernel using the Nyström approximation (with a fast active learning procedure, as in (Tabia BMVC13))
  • Large-scale kernel SVM using the Nyström approximation
  • New algorithms and better tuning in the algebra package
  • Multithreading support for algebra
  • Optional dependency on EJML for faster eigendecomposition (the check is done at runtime; compatible with older code)
  • Revised and online Javadoc

The library can now optionally depend on EJML to accelerate the eigendecomposition. I had a lot of fun implementing some algorithms (Jacobi, QR, Householder transforms, Givens rotations, ...), which allow the library to perform all its linear algebra on its own. However, it will never be competitive with dedicated libraries. So I surveyed the current pure-Java linear algebra libraries, and EJML is probably the best out there (kudos to the people behind it). I made a simple wrapper that checks whether the library is in the classpath, and uses it in that case. No older code should break because of this. If it does, email me quickly...
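
The runtime check itself is a classic reflection trick. Here is a minimal sketch of the idea (the probe class is EJML's SimpleMatrix, but the wrapper class and method names are made up, not jkms's actual internals):

    // Detect EJML on the classpath once, at class-loading time, and let
    // callers pick the fast path only when it is available.
    public final class EigenBackend {
        private static final boolean HAS_EJML = probe();

        private static boolean probe() {
            try {
                Class.forName("org.ejml.simple.SimpleMatrix");
                return true;
            } catch (ClassNotFoundException e) {
                return false; // EJML absent: fall back to the pure-Java solver
            }
        }

        public static boolean usesEjml() {
            return HAS_EJML;
        }
    }

Since no EJML types appear in any method signature, code compiled against such a wrapper loads and runs fine whether or not the jar is present.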

Next, I will wrap more things around EJML (i.e., not only eig), but I still want jkms to be totally autonomous. That is, no existing feature will ever require EJML (or any other library).

Another new feature is a fast kernel based on the Nyström approximation, with an active learning strategy for fast training. This was part of the work I did with Hedi Tabia and presented at BMVC last September.

BMVC 2013, 3DOR 2014

I'll be at BMVC in Bristol next week to present some work done with Hedi Tabia on 3D similarities using curves.

Computing similarities between curves is usually very expensive, since it requires re-parametrizing one of the curves so as to map it into the same affine space as the other. To avoid this heavy processing, we proposed to use the Nyström approximation of the kernel on a well-chosen training set found by active learning. Nothing excessively fancy, but very effective in terms of computation time.
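
For reference, the standard Nyström construction goes like this (notation mine, not the paper's): pick m landmark samples z_1, ..., z_m from the training set, let W be the m×m kernel matrix among the landmarks, and let C be the n×m matrix of kernel values between all samples and the landmarks. Then

    \[
      K \;\approx\; C\,W^{+}\,C^{\top},
      \qquad
      \varphi(x) \;=\; W^{-1/2}\bigl(k(x,z_1),\dots,k(x,z_m)\bigr)^{\top},
    \]

so comparing a new curve costs only m expensive curve-kernel evaluations against the landmarks instead of one re-parametrization per pair, and the active learning step amounts to picking landmarks that make this approximation tight.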

On the other hand, Hedi is co-organizing the next 3DOR 2014, which will be held in Strasbourg just before Eurographics. The website is brand new, so expect more information soon.
