We Just Started Using Face Recognition. Can the Technology Already Be Cracked?

Perhaps no other biometric technology has received as much attention as face recognition at the very start of its large-scale deployment.

Fingerprint recognition, the technology that pioneered practical biometrics, was in the limelight when it first reached the market: countless companies rushed in and expanded its applications. Over time, however, it became clear that fingerprint recognition works only as a verification module. Its range of applications is wide and market demand is large, but it never broke out of that narrow niche. Finger vein recognition and palmprint recognition were then proposed as more secure, more reassuring verification modules, yet a growing chorus argued that biometric technology should not exist merely for verification.

Later, the concepts of "contact" and "non-contact" biometrics became widely known. Fingerprints, finger veins, and palm prints were classified as "contact", while faces, irises, gaits, and the like were classified as "non-contact". Since then, people have been debating the differences between the two categories and their respective application scenarios.

In recent years, with the spread of ultra-high-definition front-end cameras, "non-contact" biometrics has gradually come to the fore, and face recognition in particular carries the greatest expectations.

"Face scanning" in the narrow sense, "face recognition" in the broad sense

When face recognition comes up, most people think of scanning their face to open a door, to pay, or to unlock a phone; some have experienced scanning their face to enter a train station or board a flight; and withdrawing cash or using one's face as an electronic ID card still strikes many as astonishing news. But is this the whole picture of face recognition? Of course not. These are just "face scanning" in the narrow sense.

As I noted at the beginning of this article, "perhaps no other biometric technology has received as much attention at the start of its large-scale deployment." If we talk only about "face scanning", then it differs little from other biometric technologies: is it not all just verification? But face recognition in the broad sense is a different matter.

Combined with big data, face recognition finds far wider use. Take public security as an example. When investigating and handling cases, police departments often encounter unidentified persons: lost elderly people, missing children, suspects who refuse to talk, unclaimed bodies. Traditional methods often cannot resolve such cases. With a face retrieval system, the person's face image is fed into the system, which automatically searches a massive population database and lists the most similar candidates. Officers then screen the system's results manually to establish the target's true identity.
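The retrieval step described above can be sketched in a few lines, assuming faces have already been converted into fixed-length embedding vectors by some recognition model. All names, dimensions, and data here are illustrative placeholders, not details of any real system.

```python
import numpy as np

# Hypothetical setup: 10,000 enrolled persons, each represented by a
# 128-dimensional face embedding. The "query" is a noisy copy of one
# enrolled face, standing in for a new probe photo of the same person.
rng = np.random.default_rng(42)
database = rng.normal(size=(10_000, 128))
query = database[1234] + 0.1 * rng.normal(size=128)

def top_k_matches(db, q, k=5):
    """Return indices of the k database entries most similar to q (cosine)."""
    db_n = db / np.linalg.norm(db, axis=1, keepdims=True)
    q_n = q / np.linalg.norm(q)
    sims = db_n @ q_n
    # argpartition finds the k largest scores, then we sort just those k.
    idx = np.argpartition(-sims, k)[:k]
    return idx[np.argsort(-sims[idx])]

print(top_k_matches(database, query))  # person 1234 should rank first
```

A real system would return these candidates with their similarity scores for the manual screening stage; the ranking logic is the same.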

Behind this application is a sign that face recognition technology is maturing. The security industry has a natural affinity for AI deployment (large volumes of data and a high tolerance for trial and error), and in recent years, as the Skynet surveillance network and intelligent systems have been rolled out nationwide, urban order has been managed and protected more effectively, people's everyday needs are handled and answered promptly, and violations can be accurately identified and punished.

At the same time, face recognition applications are drawing criticism

On June 21, an article in The Wall Street Journal noted that although Amazon CEO Jeff Bezos has said he would resist government investigators seeking personal information from the company's devices, the company has been selling face recognition technology to law enforcement agencies and private companies. Many people believe that using this technology threatens personal privacy.

In their view, applying face recognition to front-end surveillance cameras may upset the balance between privacy and utility. Suppose the US police had such security cameras plus a "blacklist" photo library of suspicious persons. Anyone who happened to resemble someone in that library could be stopped and questioned by officers the moment they entered a camera's view. Most Americans do not want to live in that world.

Some time ago, a face recognition study claiming to reveal the facial appearance of criminals also caused an uproar. Researchers at Shanghai Jiao Tong University collected 1,856 photos of Chinese citizens aged 18 to 55, covering different ethnicities, genders, ages, and even facial expressions; 730 of them were convicted criminals. They then applied artificial intelligence techniques to study criminal facial features and concluded that:

the curvature of prisoners' upper lips is 23% greater than that of ordinary people;

the distance between the inner corners of prisoners' eyes is 6% narrower than that of ordinary people;

the angle between the tip of the nose and the corners of the mouth is 20% smaller in prisoners than in ordinary people.

Using four algorithms, the computer analyzed these citizens' facial features to infer the features criminals have in common. The paper concludes that artificial intelligence can identify criminals by appearance with an accuracy of up to 89.51%.

As soon as this research was made public, it was roundly criticized by netizens: if such technology were applied in the real world, innocent people could well be wronged while the real culprits go unpunished.

If the first two controversies are about privacy, the next one is a technical problem.

Recently, an article pointed out that face recognition, today's most popular AI application, varies enormously in accuracy across races: the error rate for black women is as high as 21%-35%, while the error rate for white men is below 1%.

In response, the manufacturers concerned explained that high error rates on dark-skinned faces are a common phenomenon: first, data sets of dark-skinned faces are scarce; second, the facial features of dark-skinned subjects are harder to extract.

Anti-face-recognition products emerge

Recently, University of Toronto professor Parham Aarabi and his graduate student Avishek Bose developed an algorithm that dynamically defeats face recognition systems by applying subtle "light transformations" to images.

Aarabi likewise cited privacy as the motivation for the invention: "As face recognition technology becomes ever more widespread, personal privacy has become a real problem that needs solving. That is why we are developing this anti-face-recognition system."

According to Aarabi, they mainly rely on adversarial training, pitting two neural networks against each other: one network learns to detect faces from the data, while the other tries to disrupt the first network's task.

Reportedly, their algorithm was trained on a data set of more than 600 face photos spanning different ethnicities, lighting conditions, and backgrounds (an industry-standard library). The two neural networks form a real-time adversarial "filter" that can be applied to any image. Because the target is specific individual pixels, the changes are almost imperceptible to the naked eye. For example, if the detection network looks for the corner of an eye, the disruption algorithm adjusts those pixels so the eye corner becomes less conspicuous. The perturbations the algorithm introduces are tiny, but to the detector they are enough to fool the system.
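The core idea, a small gradient-guided pixel perturbation that lowers a detector's score, can be illustrated with a toy example. This is not the Toronto team's system: their filter is a trained neural network, whereas this sketch uses a stand-in logistic "detector" and a single FGSM-style step, purely to show how a near-imperceptible change can fool a detector.

```python
import numpy as np

# Stand-in "detector": a logistic score over pixel values (hypothetical).
rng = np.random.default_rng(0)
w = rng.normal(size=64)          # detector weights
x = rng.uniform(0, 1, size=64)   # a flattened 8x8 "image"

def detect_score(img):
    """Probability the detector assigns to 'face present'."""
    return 1.0 / (1.0 + np.exp(-img @ w))

# Gradient of the score with respect to each pixel.
s = detect_score(x)
grad = s * (1 - s) * w

# Nudge every pixel by at most eps in the direction that lowers the score.
eps = 0.05
x_adv = np.clip(x - eps * np.sign(grad), 0, 1)

print(detect_score(x))           # score on the original image
print(detect_score(x_adv))       # lower score on the perturbed image
print(np.abs(x_adv - x).max())   # perturbation never exceeds eps
```

Against a real detector the gradient would come from backpropagation through the network, but the principle is the same: per-pixel changes bounded by a small eps that the eye ignores and the detector does not.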

In fact, degrading the recognition rate of face recognition products is nothing new. As early as 2016, researchers at Carnegie Mellon University designed a spectacle frame that could mislead facial recognition systems into misidentifying the wearer.

To sum up

That is roughly my take on the face recognition hot topic. If you ask how I see it, I will say there is nothing new under the sun: in the era of big data there is not much privacy left anyway, and the key lies in corporate ethics and government regulation. As for the contest between anti-face-recognition systems and face recognition technology, I am happy to see it happen. Spear against shield, each side playing against the other, is exactly how gaps get found and technology improves.
