ICB 2015 - 1st Competition on Counter Measures to Finger Vein Spoofing Attacks

The BEAT project, the Swiss Centre for Biometrics Research and Testing and the Biometrics group at the Idiap Research Institute organize the 1st Competition on Counter Measures to Finger Vein Spoofing Attacks. The competition will be part of the 8th IAPR International Conference on Biometrics (ICB-2015):

The 1st Competition on Counter Measures to Finger Vein Spoofing Attacks

8th IAPR International Conference on Biometrics (ICB-2015)

May 20-22, 2015, Phuket, Thailand

http://icb2015.org/

Motivation

Despite their reliability and wide usage in many environments where security is vital, finger vein recognition systems are still vulnerable to direct sensory attacks, i.e. spoofing attacks. In a spoofing attack, an invalid user may gain access to the system by presenting counterfeit biometric evidence of a valid user.
Finger vein recognition is a growing technology for which advanced spoofing attacks are emerging. Unfortunately, the number of available databases and anti-spoofing systems is still limited. Furthermore, the number of baseline anti-spoofing systems whose source code is publicly available for comparison and reproducible research is even more limited.
The objective of this competition is to challenge the proposed anti-spoofing algorithms with spoofing attacks. The proposed methods will therefore be evaluated on the Spoofing-Attack finger vein database, which contains two sub-tasks: full printed and cropped printed images. Additionally, the competition aims to address the lack of available baseline anti-spoofing algorithms for comparison and reproducible research. For that reason, the competition encourages publishing the source code of the developed spoofing counter-measures as free software.

Database

The competition will be carried out on the Spoofing-Attack finger vein database. The database consists of real accesses and printed attacks from 110 clients (440 samples). The following table shows the number of samples in each set, per protocol.

Protocol    Training set    Development set    Test set
full        120             120                200
cropped     120             120                200

Attack protocols are used to evaluate the (binary classification) performance of counter-measures to spoofing attacks. The database is split into two sub-tasks according to the visual information available: full printed images and cropped printed images.

The samples in the Spoofing-Attack database are provided as ".png" image files. The image samples and the lists of files for each protocol can be downloaded directly from the official Spoofing-Attack finger vein website.
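
As an illustration, the samples can be iterated over with a few lines of Python; this is a minimal sketch with hypothetical paths (the names of the list files and image directories depend on where you unpack the database), using the Pillow library to load the ".png" files:

from PIL import Image

# Hypothetical location of one of the protocol file lists.
with open("protocols/full/train.txt") as f:
    for line in f:
        stem = line.strip()
        # Hypothetical image directory layout; adjust to your local copy.
        image = Image.open("images/%s.png" % stem)
        # ... feed `image` to your feature extraction / training code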

Submission of results

The participating teams can already download the training and development data and start developing their algorithms (an End User Licence Agreement needs to be signed). To ensure that sample labels are not used to infer the class of an image, a set of unlabeled test images will be made available at evaluation time (November 3rd, 2014).

The participants need to submit two score files: one for the development set (by November 2nd, 2014) and one for the anonymized test set (by November 23rd, 2014). Only participants that submit the development results will be considered for the competition results.
The files need to have two columns:

filename score

The first column corresponds to the file name of the sample, while the second one contains the score of the decision. The scores must be normalized between 0 and 1 (0.0 <= score <= 1.0). Please note that we expect real accesses to have larger scores than attacks.

Here is an example of how the score file for the development set should look:

dev/real/022-M/022_L_2 0.5394
dev/real/006-M/006_R_1 0.7444
dev/spoof/025-F/025_L_2 0.017

The score file for the test set of the full protocol should accordingly look like this (example):

test_file_002 0.8921
test_file_012 0.0001
test_file_033 0.9932
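
As an illustration, a score file in this format could be produced with a few lines of Python. Only the two-column output format above is prescribed; the raw classifier outputs and the min-max normalization below are hypothetical:

# Hypothetical raw classifier outputs for three development samples
# (file names follow the examples above; the scores are made up).
raw = {
    "dev/real/022-M/022_L_2": 2.7,
    "dev/real/006-M/006_R_1": 3.1,
    "dev/spoof/025-F/025_L_2": -4.2,
}

# Min-max normalization maps the raw scores into the required [0, 1]
# range while keeping real accesses above attacks.
lo, hi = min(raw.values()), max(raw.values())
with open("development_scores.txt", "w") as f:
    for name, score in raw.items():
        f.write("%s %.4f\n" % (name, (score - lo) / (hi - lo)))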

Evaluation and ranking

The ranking will be based on the HTER (Half-Total Error Rate) measured on the anonymized test set data using an 'a priori' threshold calculated on the development data, using the two score files that you are going to provide. The threshold to be used is the value that equalizes the false-acceptance and false-rejection rates, a.k.a. the equal-error rate (EER). The smaller the HTER on the test set, the higher your rank will be.

The logic we will use is the following:

1) Compute the threshold for the EER on the development set

2) Apply this threshold to discriminate the test data and calculate the HTER
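
For illustration, these two steps can be sketched in plain Python/NumPy as follows. This is only a sketch of the ranking logic, not the organizers' actual evaluation code; "negatives" are the attack scores and "positives" are the real-access scores:

import numpy as np

def eer_threshold(negatives, positives):
    # Scan the observed scores for the threshold where the
    # false-acceptance rate (FAR) best matches the false-rejection
    # rate (FRR), i.e. the operating point of the EER.
    best_t, best_gap = 0.0, float("inf")
    for t in np.sort(np.concatenate([negatives, positives])):
        far = np.mean(negatives >= t)  # attacks accepted
        frr = np.mean(positives < t)   # real accesses rejected
        if abs(far - frr) < best_gap:
            best_t, best_gap = t, abs(far - frr)
    return best_t

def hter(negatives, positives, threshold):
    # Half-Total Error Rate: the average of FAR and FRR at a fixed threshold.
    far = np.mean(negatives >= threshold)
    frr = np.mean(positives < threshold)
    return (far + frr) / 2.0

# 1) threshold from the development scores; 2) HTER on the test scores:
# t = eer_threshold(dev_attacks, dev_real)
# print(hter(test_attacks, test_real, t))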

The participants will also be required to submit a short description of their method. Additional credit will be given to participants who provide the source code of their method as free software.
The results of all submitted algorithms, together with a short description of each, will be submitted for publication at ICB-2015.

Useful links

Participating teams can use Bob, a free Python-based machine learning and signal processing toolbox.
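
For example, Bob's bob.measure module provides routines for this kind of evaluation. The snippet below is a sketch assuming the eer_threshold and farfrr functions of bob.measure (please check the Bob documentation for your version), with made-up score arrays:

import numpy
import bob.measure

# Hypothetical score arrays: attacks are negatives, real accesses positives.
dev_neg = numpy.array([0.02, 0.10, 0.31])
dev_pos = numpy.array([0.54, 0.74, 0.91])
test_neg = numpy.array([0.00, 0.12, 0.40])
test_pos = numpy.array([0.61, 0.89, 0.99])

threshold = bob.measure.eer_threshold(dev_neg, dev_pos)
far, frr = bob.measure.farfrr(test_neg, test_pos, threshold)
print("HTER = %.4f" % ((far + frr) / 2.0))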

How to engage

Please register your team here by September 14th, 2014. Deadline extended to September 21st, 2014.

Important dates

Registration due (DEADLINE EXTENDED)                      September 21st, 2014
Deadline for development set results                      November 2nd, 2014
* Availability of test set                                November 3rd, 2014
Submission of results and method description              November 23rd, 2014
Submission of the results for publication at ICB-2015     December 15th, 2014

* Only participants that submit the development results will be considered for the competition results.

Organizers

Dr. Pedro Tome ( pedro DOT tome AT idiap DOT ch)

Dr. Sébastien Marcel ( sebastien DOT marcel AT idiap DOT ch)
