JKernelMachines 2.2

Here we go again; it seems I only alternate between new publications and updates to JKernelMachines on this page.

Version 2.2.

  • Fast kernel using the Nyström approximation (with a fast active learning procedure as in (Tabia, BMVC 2013))
  • Large-scale kernel SVM using the Nyström approximation
  • New algorithms and better tuning in the algebra package
  • Multithreading support for algebra
  • Optional dependency on EJML for faster eigen-decomposition (the check is done at runtime, compatible with older code)
  • Revised and online Javadoc

The library can now optionally depend on EJML in order to accelerate the eigen-decomposition. I had a lot of fun implementing some algorithms (Jacobi, QR, Householder transforms, Givens rotations, ...), which allows the library to perform all the BLAS operations it needs on its own. However, it will never be competitive with dedicated libraries. So I surveyed the current pure-Java linear algebra libraries, and EJML is probably the best out there (kudos to the people behind it). I made a simple wrapper that checks whether the library is in the classpath and uses it in that case. No older code should break because of this. If it does, email me quickly...
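The general pattern for such an optional dependency is simple. Here is a minimal sketch of a runtime classpath check (the class and method names below are illustrative, not the actual jkms wrapper API):

```java
// Hypothetical sketch of an optional-dependency check; names are
// illustrative and do not match the actual jkms wrapper.
public final class OptionalEjml {

    // Detected once, when this class is loaded.
    public static final boolean EJML_AVAILABLE =
            isInClasspath("org.ejml.simple.SimpleMatrix");

    // True if the given class can be loaded from the current classpath.
    public static boolean isInClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    // Callers take the fast path only when EJML is present; otherwise
    // they fall back to the built-in pure-Java eigen-decomposition.
}
```

A wrapper built this way degrades gracefully: dropping EJML into the classpath switches implementations without any code change, and removing it never breaks compilation.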

Next, I will wrap more things around EJML (i.e., not only eig), but I still want jkms to be totally autonomous. That is, no existing feature will ever require EJML (nor any other library).

Another new feature is a fast kernel based on the Nyström approximation, with an active learning strategy for fast training. This is part of the work I did with Hedi Tabia and presented at BMVC last September.

BMVC 2013, 3DOR 2014

I'll be at BMVC in Bristol next week to present some work done with Hedi Tabia on 3D similarities using curves.

Similarity on curves is usually very expensive, since it requires re-parametrizing one of the curves so as to map it into the same affine space as the other. To circumvent this heavy processing, we proposed to use the Nyström approximation for kernels on a well-chosen training set found by active learning. Nothing excessively fancy, but very effective in terms of computational time.
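For readers unfamiliar with the trick: the Nyström method approximates a kernel as k(x, y) ≈ k_x^T W^{-1} k_y, where W is the kernel matrix over m chosen landmark points and k_x collects the kernel values between x and those landmarks. The toy sketch below (an RBF kernel and a naive Gauss-Jordan inverse) is purely illustrative and is not the jkms implementation:

```java
// Toy sketch of the Nyström kernel approximation (illustrative only).
public class NystromSketch {

    // Gaussian RBF kernel with unit bandwidth.
    static double rbf(double[] a, double[] b) {
        double d2 = 0;
        for (int i = 0; i < a.length; i++) {
            double d = a[i] - b[i];
            d2 += d * d;
        }
        return Math.exp(-d2);
    }

    // Invert a small matrix by Gauss-Jordan elimination with partial pivoting.
    static double[][] invert(double[][] w) {
        int n = w.length;
        double[][] a = new double[n][2 * n];
        for (int i = 0; i < n; i++) {
            System.arraycopy(w[i], 0, a[i], 0, n);
            a[i][n + i] = 1.0;
        }
        for (int col = 0; col < n; col++) {
            int piv = col;
            for (int r = col + 1; r < n; r++)
                if (Math.abs(a[r][col]) > Math.abs(a[piv][col])) piv = r;
            double[] tmp = a[col]; a[col] = a[piv]; a[piv] = tmp;
            double p = a[col][col];
            for (int j = 0; j < 2 * n; j++) a[col][j] /= p;
            for (int r = 0; r < n; r++) {
                if (r == col) continue;
                double f = a[r][col];
                for (int j = 0; j < 2 * n; j++) a[r][j] -= f * a[col][j];
            }
        }
        double[][] inv = new double[n][n];
        for (int i = 0; i < n; i++) System.arraycopy(a[i], n, inv[i], 0, n);
        return inv;
    }

    // Approximate k(x, y) ≈ k_x^T W^{-1} k_y using m landmark points.
    static double approxKernel(double[] x, double[] y, double[][] landmarks) {
        int m = landmarks.length;
        double[][] w = new double[m][m];
        for (int i = 0; i < m; i++)
            for (int j = 0; j < m; j++)
                w[i][j] = rbf(landmarks[i], landmarks[j]);
        double[][] winv = invert(w);
        double[] kx = new double[m], ky = new double[m];
        for (int i = 0; i < m; i++) {
            kx[i] = rbf(x, landmarks[i]);
            ky[i] = rbf(y, landmarks[i]);
        }
        double s = 0;
        for (int i = 0; i < m; i++)
            for (int j = 0; j < m; j++)
                s += kx[i] * winv[i][j] * ky[j];
        return s;
    }
}
```

A useful sanity check is that the approximation is exact whenever x and y are themselves landmarks; the quality elsewhere depends entirely on how well the landmarks are chosen, which is where active learning comes in.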

On the other hand, Hedi is co-organizing the next 3DOR 2014, which will be held in Strasbourg just before Eurographics. The website is brand new, so expect more information soon.

JKernelMachines 2.1 release

I released a new version of JKernelMachines with the following features:
  • New algorithms: SDCA (Shalev-Shwartz 2013), SAG (Le Roux 2012)
  • New custom matrix kernel to handle train and test data separately
  • Added the fvec file format
  • Added an experimental package for linear algebra and corresponding processing (e.g., PCA, KPCA); use at your own risk!
  • Added an example app to perform VOC-style classification
  • Lots of bug fixes
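To give a flavor of what SDCA does (this is a toy sketch under simplifying assumptions, not the jkms implementation), the hinge-loss linear SVM admits a closed-form coordinate update in the dual, which makes each step very cheap:

```java
import java.util.Random;

// Toy SDCA sketch for a linear SVM with hinge loss, following the
// update rule of Shalev-Shwartz & Zhang (2013); illustrative only.
public class SdcaSketch {

    // Minimizes lambda/2 * ||w||^2 + (1/n) * sum_i hinge(y_i, w.x_i).
    // Returns the primal weight vector.
    public static double[] train(double[][] x, double[] y,
                                 double lambda, int epochs, long seed) {
        int n = x.length, d = x[0].length;
        double[] alpha = new double[n]; // dual variables, kept in [0, 1]
        double[] w = new double[d];     // w = (1/(lambda*n)) * sum_i alpha_i y_i x_i
        Random rnd = new Random(seed);
        for (int t = 0; t < epochs * n; t++) {
            int i = rnd.nextInt(n);
            double norm2 = 0, margin = 0;
            for (int j = 0; j < d; j++) {
                norm2 += x[i][j] * x[i][j];
                margin += w[j] * x[i][j];
            }
            if (norm2 == 0) continue;
            // Closed-form coordinate step for the hinge loss, then
            // clip so the dual variable stays feasible.
            double step = (1 - y[i] * margin) * lambda * n / norm2;
            double updated = Math.max(0, Math.min(1, alpha[i] + step));
            double delta = updated - alpha[i];
            alpha[i] = updated;
            for (int j = 0; j < d; j++)
                w[j] += delta * y[i] * x[i][j] / (lambda * n);
        }
        return w;
    }
}
```

The appeal of SDCA over plain SGD is that the duality gap gives a natural stopping criterion, and no step-size tuning is needed.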

The linear algebra package is at the moment very rough. I find it somewhat useful for performing some kind of pre-processing (like a PCA, for example). At the moment, my matrix code is a bit slow. If I ever find the time to implement solid matrix operations, I will add some nice features like low-rank approximations of kernels (Nyström).
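As an illustration of the kind of pre-processing meant here, the leading principal component of a data set can be extracted without any full eigen-decomposition, using power iteration on the centered data (a self-contained toy sketch, not the jkms API):

```java
// Toy PCA sketch via power iteration (illustrative only).
public class PcaSketch {

    // Return the leading principal direction (unit vector) of the data.
    public static double[] firstComponent(double[][] data, int iters) {
        int n = data.length, d = data[0].length;
        // Center the data.
        double[] mean = new double[d];
        for (double[] row : data)
            for (int j = 0; j < d; j++) mean[j] += row[j] / n;
        double[][] c = new double[n][d];
        for (int i = 0; i < n; i++)
            for (int j = 0; j < d; j++) c[i][j] = data[i][j] - mean[j];
        // Power iteration on C^T C, without ever forming the covariance matrix.
        double[] v = new double[d];
        v[0] = 1;
        for (int t = 0; t < iters; t++) {
            double[] u = new double[d];
            for (double[] row : c) {
                double p = 0;
                for (int j = 0; j < d; j++) p += row[j] * v[j];
                for (int j = 0; j < d; j++) u[j] += p * row[j];
            }
            double norm = 0;
            for (double uj : u) norm += uj * uj;
            norm = Math.sqrt(norm);
            for (int j = 0; j < d; j++) v[j] = u[j] / norm;
        }
        return v;
    }
}
```

Projecting each sample onto a few such directions is exactly the cheap dimensionality-reduction step one typically runs before feeding data to a kernel machine.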

Nevertheless, I suggest always picking the latest git version instead of these releases. The API is very stable now and should not change significantly, which means that any code you write now will be supported for the next few years. Picking the latest git thus ensures that you always have the bug fixes and so on (I don't release versions only for bug fixes).

One more thing: JKernelMachines was published in JMLR last month. I encourage you to read the paper and to cite it if you ever use the code for your publications.

Some publications in 2013

The year is beginning with a small batch of publications on the different topics I'm working on.

Two years ago, we developed a new signature for image retrieval and classification based on tensor aggregation, which we named VLAT. The paper giving the full details of the method (plus a bonus for cheap large-scale computation) has now been published in Computer Vision and Image Understanding at Elsevier. In the meantime, Romain Negrel (Ph.D. student) has completely redesigned the method to improve its effectiveness. His work has now been accepted in IEEE Multimedia. There are some nice experiments in this paper, including large-scale retrieval (1M images) at very low bitrate (less than 64 bytes per image).

On the video front, we have a paper accepted at MVA 2013 with Olivier Kihl (postdoc), on video descriptors using polynomial expansions. We have very good results on well-known data sets, which makes me think this approach is very promising.

In 3D object retrieval, we have an accepted paper at 3DOR with Hedi Tabia. This was a pretty straightforward extension of our still-image indexing methods to 3D objects, and it works well.

On a totally different topic, I recently wrote a paper with my colleague Aymeric Histace on modeling an insect (the bark beetle) using a multi-agent system. This was something I hadn't done for years, and it was fun to do. The novelty in our approach is that we let the chemical markers released by the agents and the environment evolve according to a system of partial differential equations modeling the physical spreading. This concurrent evolution between the MAS and the PDE makes the behavior of the agents a lot less predictable. This work was partly done by Marie-Charlotte Desseroit (undergrad student) during an internship last summer, which I find pretty impressive.

JKernelMachines 2.0

JKernelMachines version number bumped to 2.0!

The big changes are:

  • All classes have migrated under fr.lip6.jkernelmachines.* This breaks backward compatibility! (hence the 2.0 version number).
  • Separation of the core library and unit testing
  • Junit testing added
  • Lots of bug fixes
  • Better examples, and many useless test classes removed
  • A small demo script to benchmark the library was added
As always, check the mloss page or go directly to GitHub.
