📰Concept

CATCH & function creep

Problem

The police use predictive policing methods to detect criminals and suspects more quickly. Since 2016, for example, the facial recognition program "CATCH" has been in use. The software is also called facial comparison, because CATCH does not judge who someone "is", but only "who someone looks like".

CATCH is currently used to verify the identity of a suspected or convicted person against a dataset. The facial recognition system is therefore not yet used in real time to locate suspects in public spaces, although trials testing this are already under way. In principle, real-time facial recognition in public space is already technically possible, so it may well be applied in the near future.

Because the police are likely to want to apply the technology in real time at some point, there is a risk of "function creep".

Function creep

When technology is at some point used for purposes other than those originally intended, this is called "function creep".

In information technology, function creep is more the rule than the exception: sooner or later, technology ends up being used for purposes other than those originally intended. The problem with function creep is that what is excluded by law today often ends up happening anyway in the future.

Legislation

The legal provision on which the current use is based, Article 55c Sv, was not created to regulate this technology. The rules date from a time when facial recognition technology was not yet in use. As a consequence, no specific legal basis has been drawn up for facial recognition, and there has been little social or political debate about its applications in criminal proceedings. This problem will persist as long as further applications of this technology are based on this provision.

How CATCH works

1) CATCH's process currently works as follows: a police officer sends a request with a photo to the Center for Biometrics. This can be a photo taken to identify a person who has been arrested, but also an investigation photo, such as images from (security) cameras in public space.

2) Police officers may only submit a request "for the proper performance of their duties". The request is registered, so the center can see who made it and on what date. The center cannot see which offence the person is suspected of or which case is involved.

3) The center assesses whether the photo is of good enough quality to be useful for the technology. The quality of an investigation photo can be "improved", but the center may not manipulate the images. To check whether a suspected or convicted person is already listed in the criminal justice chain database, the face in the investigation photo is compared with all the faces in that database. The minimum degree of similarity and the number of results to be shown can be specified (a rough sketch of this comparison follows after these steps).

4) A team then assesses which face from the results is the most similar.

5) That photo, together with the original investigation photo, then goes to two experts, who assess independently of each other to what extent the faces correspond. If both believe the faces are likely to belong to the same person, the identity of that person from the criminal justice chain database, and the degree of similarity, is reported to the police officer who made the request.

6) If the police officer thinks he already knows who the person in the investigation photo is, he can also ask the center to compare the investigation photo with one specific face from the criminal justice chain database. The outcome of the comparison is a detection indication; on that basis, an investigation into a person can be started.

7) The center does not assess whether the investigation images were obtained legitimately. It acts on the principle of trust: it assumes that the police officer complied with the law when obtaining the images and that the request was made "for the proper performance of their duties".
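
To make step 3 more concrete: the one-to-many comparison can be pictured as scoring a probe face against every face in the database and keeping only sufficiently similar candidates. The sketch below is my own minimal illustration, not the actual CATCH implementation; it assumes faces have already been turned into numeric embeddings by some face recognition model, and the function names, database layout and threshold value are all hypothetical.

```python
# Minimal sketch of the one-to-many comparison in step 3. NOT the actual
# CATCH implementation: it assumes faces have already been turned into
# numeric embeddings by some face recognition model, and the database
# layout, threshold and function names are hypothetical.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def compare_one_to_many(probe: np.ndarray,
                        database: dict[str, np.ndarray],
                        min_similarity: float = 0.6,
                        max_results: int = 10) -> list[tuple[str, float]]:
    """Return up to max_results database entries whose similarity to the
    probe face meets the minimum threshold, best match first."""
    scored = [(person_id, cosine_similarity(probe, embedding))
              for person_id, embedding in database.items()]
    candidates = [(pid, score) for pid, score in scored if score >= min_similarity]
    return sorted(candidates, key=lambda item: item[1], reverse=True)[:max_results]
```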

Oversight

The Police Data Authority conducts random checks to verify that requests are made lawfully. The center is separate from the police teams conducting the investigations, and police officers cannot use CATCH themselves. The processes within the center are also separated: one team applies the facial recognition technology and another team analyses the results. There is, however, no separate supervision of the decisions the center makes within the framework of CATCH. The investigation report of the Center for Biometrics is added to the suspect's file. No figures are available on how often CATCH has presented a false identity as "most likely".

Threats

According to the Dutch police, the system is neutral and based on objective crime figures. This is not correct, however: prejudices and stereotypes influence the models and algorithms, and that leads to discriminatory outcomes, with higher risk scores for certain social groups.

Applying such systems can lead to "over-policing" of minority communities, because the algorithms are trained on biased or "dirty" data. Critics often point out that such systems can produce dangerous feedback loops: more patrols in a neighbourhood produce more recorded crime there, which in turn justifies even more patrols.
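
To make the feedback loop concrete, here is a toy simulation of my own (not based on any real police system): two neighbourhoods have exactly the same true crime rate, but a small initial bias in patrol allocation keeps reinforcing itself, because crime is only recorded where the system looks.

```python
# Toy simulation of the feedback loop, purely illustrative. Both
# neighbourhoods have the SAME true crime rate; only the initial patrol
# allocation is biased. Because crime is only recorded where patrols are,
# the bias feeds on itself until all attention lands on one neighbourhood.
true_crime_rate = {"A": 0.5, "B": 0.5}   # identical underlying rates
patrols = {"A": 60, "B": 40}             # slightly biased starting point

for year in range(5):
    # Recorded crime depends on how much you look, not only on crime itself.
    recorded = {n: true_crime_rate[n] * patrols[n] for n in patrols}
    hot = max(recorded, key=recorded.get)
    cold = min(recorded, key=recorded.get)
    # "Data-driven" reallocation: move patrols toward the apparent hotspot.
    shift = min(10, patrols[cold])
    patrols[hot] += shift
    patrols[cold] -= shift
    print(f"year {year}: recorded={recorded}, patrols={patrols}")
```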

Such systems therefore require transparency in the field of big data and AI in the public sector: self-learning algorithms cannot be properly audited once they have been implemented.

Concept

With my concept I want to show the danger that "function creep" entails once certain techniques are applied more widely within predictive policing systems.

Driven by the desire of investigative services to work more efficiently, certain functionalities can also end up being applied in "nasty scenarios" of which we as citizens are not aware.

By creating a screening system in which you gain insight into "how likely" the system considers you to be a criminal, I want to create attention and awareness among unsuspecting citizens. I also want to raise the question of whether we want such systems in our society, and whether they violate our privacy too much.

The purpose of my application is to give users insight into how A.I. systems hold certain biases about them and build a profile of them with self-learning software. This puts you as a person in a certain "box", which may seem innocent, but because your profile is linked to crime figures it takes a less innocent turn and can lead to ethnic profiling and discrimination against groups with certain characteristics. By being transparent and explaining how the system arrived at its outcomes, awareness can be created.

In the future, the police could use a similar application and then target the persons or locations that are classified as "high risk".

How it works

  • The user's face acts as input and is analysed by the A.I. system, which is fed with data to make a certain "prediction".

  • The A.I. system will be able to detect 3 characteristics (gender, age, ethnicity).

    • I want to link registered Dutch crime statistics on suspects (crime category, gender, age and migration background) to the predicted facial characteristics.

  • The user will then be able to view his "risk profile".

  • A graph with the top 3 crimes in the user's "risk profile" will be displayed (with risk scores); see the sketch below.
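
The sketch below shows how the risk-profile step could work in its simplest form. All attribute labels and figures are invented placeholders, not real statistics; the real concept would draw on registered suspect figures and would also include migration background as a dimension, which is exactly where the ethnic profiling risk sits.

```python
# Sketch of the risk-profile lookup. All rates are INVENTED placeholders
# standing in for real registered-suspect statistics; in the actual concept
# migration background would be a third key, which is precisely the
# ethically charged part.
SUSPECT_RATES = {
    # (gender, age group) -> hypothetical suspects per 10,000, by crime type
    ("male", "18-25"):   {"theft": 42, "vandalism": 31, "assault": 18, "fraud": 6},
    ("male", "26-40"):   {"theft": 25, "fraud": 15, "assault": 12, "vandalism": 8},
    ("female", "18-25"): {"theft": 14, "fraud": 7, "vandalism": 5, "assault": 3},
}

def risk_profile(gender: str, age_group: str, top_n: int = 3) -> list[tuple[str, int]]:
    """Top-N crime categories for the predicted attributes.

    Note what happens here: an individual's "risk" is read off from
    group-level statistics. That is the move the concept wants to expose.
    """
    rates = SUSPECT_RATES.get((gender, age_group), {})
    return sorted(rates.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

# Example: attributes as predicted by the face-analysis step.
print(risk_profile("male", "18-25"))
# -> [('theft', 42), ('vandalism', 31), ('assault', 18)]
```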

Module add-ons

  • Upload photos and obtain risk profile.

  • Compare risk profiles with other persons (who is more likely to commit a certain crime; see the sketch after this list).

  • Do I need a mask? (printing a face mask that is "safe" in certain neighbourhoods)
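
The compare add-on could be as simple as comparing two risk profiles per crime category. A hypothetical sketch, assuming profiles shaped as a mapping from crime category to risk score:

```python
# Hypothetical sketch of the compare add-on: given two risk profiles
# (crime category -> risk score), say who scores higher for one crime.
def more_likely(profile_a: dict[str, float],
                profile_b: dict[str, float],
                crime: str) -> str:
    score_a = profile_a.get(crime, 0.0)
    score_b = profile_b.get(crime, 0.0)
    if score_a == score_b:
        return "equal risk score"
    return "person A" if score_a > score_b else "person B"

print(more_likely({"theft": 42}, {"theft": 14}, "theft"))  # -> person A
```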

Possibilities in the future

Identification can be extended to locating terrorists in a crowd, which can be extended to locating 'potential terrorists', then to locating people who regularly visit a particular location, and finally to tracking particular groups or individuals.

People can be tracked on a variety of grounds, such as appearance, certain behaviours, beliefs or political convictions, and this observation can take place covertly or in public. These developments sound extreme, but technology already makes it possible to observe and follow groups and individuals intimately. This is a very unwanted and unhealthy situation and must be avoided.

📔Concept ideas

V1 Sketches

  • Introduction (short)

  • Scan face

    • check face

    • if face is covered - suspicious

    • if face is visible - chance of certain crimes

  • See results

  • Explain how the results are built (see the flow sketch below)
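
A rough, runnable sketch of this V1 flow. Every "detection" below is a hard-coded stub; a real version would plug in a camera feed and the face-analysis model at the marked points.

```python
# Rough sketch of the V1 flow. All stubs are placeholders for a real
# camera feed and face-analysis model.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Face:
    covered: bool

def scan_face() -> Optional[Face]:
    # Stub standing in for camera capture + face detection.
    return Face(covered=False)

def run_screening() -> None:
    print("Introduction (short): what this screening does and why.")
    face = scan_face()
    if face is None:
        print("No face found, nothing to screen.")
        return
    if face.covered:
        print("Result: flagged as 'suspicious' (face is covered).")
        return
    # Stub results standing in for the risk-profile lookup sketched earlier.
    results = {"theft": 0.42, "vandalism": 0.31, "assault": 0.18}
    print("Results:", results)
    print("How these results are built: group statistics, not your actions.")

if __name__ == "__main__":
    run_screening()
```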

✏️ Sketches
