Face Detection and Recognition for Linux
FAQ
- Why several, separate scripts?
- Why is the GUI so bad?
- Truth be told: I'm a command-line person at heart. That's why, rather than writing one big, monolithic program which does everything in a rather clumsy, mouse-controlled way, I wrote several small scripts which each do one thing only, but can be used from the command line in just the way I like to use them.
This also, in a roundabout way, explains why the GUI is so bad: I'm just not a GUI person. I have no idea what makes a good GUI (tell me, and you just might see it happen). So I came up with what I initially thought made sense. However, I can recognise a bad GUI when I see one, and it is clear that this one needs at least some improvements (coming soon):
- An option to select more than one face at a time and then apply an operation to all of them at once. This is certainly needed for removal, and might come in handy for tagging; it would also be nice to be able to select several faces and then query for other faces similar to any of them.
- This needs to be faster. And I even know how to do it…
- This needs an undo button. But don't hold your breath on that one.
- It should be possible to minimise the image preview if not needed.
- The image preview should come up with ROIs scaled appropriately.
I'm sure you could come up with more – let me know what you think!
- I just started face_sort, but I do not see any faces, only crud?
- Well, you did read the README, didn't you? By default, this will start up with what it thinks are not faces, so that you can start by weeding these out. Don't worry, you will come to some faces eventually. Or simply press the 'Random Thumbnail' button a few times, until you see faces (not recommended, but understandable :-). For more info, have a look at "How does it work" (and then "run face_sort") above.
- Why can't your program tell my girlfriend apart from my mum?
- Several reasons:
- Running face-learn on the actual data you are later going to classify means that recognition will be heavily biased towards the faces that appear most often (I'll presume that's your girlfriend, not your mum).
- Eigenfaces, and in particular the implementation used here, are a very simple classifier. They know nothing about faces as such, so in addition to what you think looks similar, they will also consider the pose of the head, the texture of the image (which has a lot to do with lighting conditions and the camera used), and even the background. It is possible to account for most of this variation, but this hasn't been done yet; it means that all faces cocked slightly to one side will look more similar to each other (even though they show different subjects) than to images of the same person cocked differently. The sketch after this list shows just how little the basic technique "knows" about faces.
- Has it occurred to you that your girlfriend does indeed look like your mum?
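For the curious, here is a minimal sketch of the general eigenfaces technique, in Python with numpy. This is an illustration of how such a classifier typically works, not the actual code used by these scripts. Every pixel of the flattened face crop ends up in the feature vector, which is why pose, lighting and background matter just as much as the face itself; and because recognition is nearest-neighbour against the training set, a training set dominated by one person will be biased towards that person:

    # Minimal eigenfaces sketch (illustrative only, not this tool's code).
    # Faces are flattened grayscale crops, all of the same size.
    import numpy as np

    def train_eigenfaces(faces, n_components=20):
        """faces: (n_samples, h*w) array of flattened grayscale face crops."""
        mean = faces.mean(axis=0)
        centred = faces - mean
        # SVD of the mean-centred data; the rows of vt are the eigenfaces.
        _, _, vt = np.linalg.svd(centred, full_matrices=False)
        eigenfaces = vt[:n_components]
        # Project the training set into "face space".
        weights = centred @ eigenfaces.T
        return mean, eigenfaces, weights

    def classify(face, mean, eigenfaces, weights, labels):
        """Nearest neighbour in eigenface space; returns a training label."""
        w = (face - mean) @ eigenfaces.T
        distances = np.linalg.norm(weights - w, axis=1)
        return labels[np.argmin(distances)]

Aligning the crops beforehand (eyes at fixed positions, lighting normalised) would remove much of the nuisance variation described above; nothing in the sketch itself distinguishes a cocked head from a different person.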
- Why was this face (or that, or another one) not detected?
- Again, there's more than one possible reason:
- This uses a "frontal face detector", which basically assumes that people are staring straight into the camera, head held high (if you're the proud owner of one of those so-called biometric passports, you'll know exactly how to look). If people do not look directly at the camera, detection results will slowly deteriorate; if people cock their heads, detection results will be even worse. Essentially, the closer your face is to a straight-on, passport-style pose, the better…
- There's a bug in OpenCV which doesn't help things (it gets worse, the bigger the image).
- There have been reports that OpenCV's face detector is somewhat racist as well as beardist. While I cannot confirm that from my own observations, it is certainly not an uncommon problem, as most commonly available databases which could be used for training tend to be one or more of sexist, ageist, racist, etc. (basically, all they ever show are 20-year-old aspiring computer scientists from Pakistan). You can always train your own filter if you think you can do better. Good luck.
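To make the first two points concrete, here is a hedged sketch of the general approach: OpenCV's stock frontal-face Haar cascade, driven from Python. The parameters and the downscaling workaround are my assumptions (typical defaults), not necessarily what these scripts do; recent opencv-python packages expose the bundled cascade files via cv2.data.haarcascades:

    # Frontal face detection with OpenCV's stock Haar cascade
    # (illustrative sketch; parameters are assumed defaults).
    import cv2

    def detect_faces(path, max_side=1024):
        img = cv2.imread(path)
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        # Downscale large images first; remember the factor so the
        # detected ROIs can be mapped back to the original image.
        scale = min(1.0, float(max_side) / max(gray.shape))
        small = cv2.resize(gray, None, fx=scale, fy=scale)
        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        faces = cascade.detectMultiScale(small, scaleFactor=1.1,
                                         minNeighbors=4)
        # Map detections back to original image coordinates.
        return [(int(x / scale), int(y / scale),
                 int(w / scale), int(h / scale))
                for (x, y, w, h) in faces]

Running the detector on a downscaled copy and mapping the ROIs back is a common way to sidestep the size-related problems mentioned above, and it is faster, too.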