Yes, facial recognition is fun. It lets us unlock our phones with a glance, spares us the effort of searching for the Renaissance painting our faces most resemble, and animates emoji with our own expressions in front of the camera.
But there’s a darker side you’ve probably heard about recently. Facial recognition technology facilitates state surveillance, can be deployed without subjects’ explicit consent, and can amplify racial discrimination and enable unlawful harassment.
So far, companies that develop and use facial recognition technology have faced few laws. But as ethical concerns mount, some governments have begun to act. The EU’s sweeping privacy rules require companies to obtain clear consent from anyone whose face they scan (and impose hefty fines on companies that fail to comply). In the United States, regulators have adopted no comparable requirements.
Microsoft, one of the biggest names in facial recognition technology, thinks that is a mistake.
Last week, Microsoft president Bradford L. Smith called for federal regulation of facial recognition in the United States. “We live in a nation of laws, and the government needs to play an important role in regulating facial recognition technology,” Smith wrote in a blog post.
Smith is worried not only about how tech companies will use facial recognition, but also about how governments will keep their own use of it in check. “The only effective way to manage the use of technology by a government is for the government proactively to manage this use itself,” Smith wrote.
Smith is the latest addition to a growing chorus of tech-industry voices speaking out against problematic uses of the technology.
At Google: Google employees signed an open letter demanding that CEO Sundar Pichai cancel Project Maven, a Department of Defense contract, warning that it could create a “personalized AI surveillance machine” allowing military drones to use facial recognition software.
At Microsoft: Microsoft employees published a letter of their own, calling on the company to cancel its contract with the controversial Immigration and Customs Enforcement agency (ICE). While CEO Satya Nadella downplayed the company’s relationship with the agency responsible for separating migrant children from their families, employees worried that Microsoft’s work could nonetheless support it.
At Amazon: Amazon employees watched in horror as US government authorities forcibly separated migrant children from their parents, and called on CEO Jeff Bezos to stop selling the company’s facial recognition software, Rekognition, to law enforcement and government contractors.
There is, admittedly, a cynical way to read Smith’s letter. “Microsoft’s biggest competitors (Apple, Google, Amazon, and Facebook) all use facial recognition in various forms and are leaders in developing the technology. Microsoft may reckon that taking a stand in favor of regulation would serve it better than continuing to compete on a playing field without rules,” says Slate senior technology writer Will Oremus.
But the more charitable reading, that these concerns can no longer be ignored, shouldn’t be ruled out either. We have already seen the technology misused: misidentifying thousands of innocent people, and scanning countless travelers at US borders and airports with no way for them to opt out. Facebook, for its part, was caught scanning users’ faces without explicit consent even after the EU’s General Data Protection Regulation (GDPR) came into effect. These activities form a dangerous pattern, and they threaten personal freedom if they remain unchecked.
Without sufficient oversight, border control and law enforcement agencies can easily abuse facial recognition programs. And this is not a far-fetched idea: when police in South Wales used facial recognition to identify suspects during a football game, 87.5 percent of its matches turned out to be false positives. Without proper procedures or legal recourse for the wrongly accused, innocent people could spend years in prison.
If the US government does not intervene in a meaningful way, Silicon Valley will be left to face the choice alone: throw moral values out the window, “don’t be evil” be damned, or regulate itself.
But at least Microsoft has set the tone, putting some pressure on regulators and on the major players behind a technology that has “social implications and potential for widespread abuse,” as Smith puts it. Given the restive mood in Silicon Valley, regulators may soon have no choice but to intervene, safeguarding our freedom and security in the process.