
China Uses DNA to Map Faces, With Help From the West

TUMXUK, China — In a dusty city in the Xinjiang region on China’s western frontier, the authorities are testing the rules of science.

With a million or more ethnic Uighurs and others from predominantly Muslim minority groups swept up in detentions across Xinjiang, officials in Tumxuk have gathered blood samples from hundreds of Uighurs — part of a mass DNA collection effort dogged by questions about consent and how the data will be used.

In Tumxuk, at least, there is a partial answer: Chinese scientists are trying to find a way to use a DNA sample to create an image of a person’s face.

The technology, which is also being developed in the United States and elsewhere, is in the early stages of development and can produce rough pictures good enough only to narrow a manhunt or perhaps eliminate suspects. But given the crackdown in Xinjiang, experts on ethics in science worry that China is building a tool that could be used to justify and intensify racial profiling and other state discrimination against Uighurs.

In the long term, experts say, it may even be possible for the Communist government to feed images produced from a DNA sample into the mass surveillance and facial recognition systems that it is building, tightening its grip on society by improving its ability to track dissidents and protesters as well as criminals.

Some of this research is taking place in labs run by China’s Ministry of Public Security, and at least two Chinese scientists working with the ministry on the technology have received funding from respected institutions in Europe. International scientific journals have published their findings without examining the origin of the DNA used in the studies or vetting the ethical questions raised by collecting such samples in Xinjiang.

In papers, the Chinese scientists said they followed norms set by international associations of scientists, which would require that the men in Tumxuk (pronounced TUM-shook) gave their blood willingly. But in Xinjiang, many people have no choice. The government collects samples under the veneer of a mandatory health checkup program, according to Uighurs who have fled the country. Those placed in internment camps — two of which are in Tumxuk — also have little choice.

The police prevented reporters from The New York Times from interviewing Tumxuk residents, making verifying consent impossible. Many residents had vanished in any case. On the road to one of the internment camps, an entire neighborhood had been bulldozed into rubble.

New York Times reporters in Tumxuk recorded video of a large number of destroyed Uighur buildings along a road that led to a re-education camp.

Growing numbers of scientists and human rights activists say the Chinese government is exploiting the openness of the international scientific community to harness research into the human genome for questionable purposes.

Already, China is exploring using facial recognition technology to sort people by ethnicity. It is also researching how to use DNA to tell if a person is a Uighur. Research on the genetics behind the faces of Tumxuk’s men could help bridge the two.

The Chinese government is building “essentially technologies used for hunting people,” said Mark Munsterhjelm, an assistant professor at the University of Windsor in Ontario who tracks Chinese interest in the technology.

In the world of science, Dr. Munsterhjelm said, “there’s a kind of culture of complacency that has now given way to complicity.”

Sketching someone’s face based solely on a DNA sample sounds like science fiction. It isn’t.

The process is called DNA phenotyping. Scientists use it to analyze genes for traits like skin color, eye color and ancestry. A handful of companies and scientists are trying to perfect the science to create facial images sharp and accurate enough to identify criminals and victims.

The Maryland police used it last year to identify a murder victim. In 2015, the police in North Carolina arrested a man on two counts of murder after crime-scene DNA indicated the killer had fair skin, brown or hazel eyes, dark hair, and little evidence of freckling. The man pleaded guilty.

Despite such examples, experts widely question phenotyping’s effectiveness. Currently, it often produces facial images that are too smooth or indistinct to look like the face being replicated. DNA cannot indicate other factors that determine how people look, such as age or weight. DNA can reveal gender and ancestry, but the technology can be hit or miss when it comes to generating an image as specific as a face.

Phenotyping also raises ethical issues, said Pilar Ossorio, a professor of law and bioethics at the University of Wisconsin-Madison. The police could use it to round up large numbers of people who resemble a suspect, or use it to target ethnic groups. And the technology raises fundamental issues of consent from those who never wanted to be in a database to begin with.

“What the Chinese government is doing should be a warning to everybody who kind of goes along happily thinking, ‘How could anyone be worried about these technologies?’” Dr. Ossorio said.

With the ability to reconstruct faces, the Chinese police would have yet another genetic tool for social control. The authorities have already gathered millions of DNA samples in Xinjiang. They have also collected data from the hundreds of thousands of Uighurs and members of other minority groups locked up in detention camps in Xinjiang as part of a campaign to stop terrorism. Chinese officials have depicted the camps as benign facilities that offer vocational training, though documents describe prisonlike conditions, while testimonies from many who have been inside cite overcrowding and torture.

Even beyond the Uighurs, China has the world’s largest DNA database, with more than 80 million profiles as of July, according to Chinese news reports.

“If I were to find DNA at a crime scene, the first thing I would do is to find a match in the 80 million data set,” said Peter Claes, an imaging specialist at the Catholic University of Leuven in Belgium, who has studied DNA-based facial reconstruction for a decade. “But what do you do if you don’t find a match?”

Though the technology is far from accurate, he said, “DNA phenotyping can bring a solution.”

To unlock the genetic mysteries behind the human face, the police in China turned to Chinese scientists with connections to leading institutions in Europe.

One of them was Tang Kun, a specialist in human genetic diversity at the Shanghai-based Partner Institute for Computational Biology, which was founded in part by the Max Planck Society, a top research group in Germany.

The German organization also provided $22,000 a year in funding to Dr. Tang because he conducted research at an institute affiliated with it, said Christina Beck, a spokeswoman for the Max Planck Society. Dr. Tang said the grant had run out before he began working with the police, according to Dr. Beck.

Another expert involved in the research was Liu Fan, a professor at the Beijing Institute of Genomics who is also an adjunct assistant professor at Erasmus University Medical Center in the Netherlands.

Both were named as authors of a 2018 study on Uighur faces in the journal Hereditas (Beijing), published by the government-backed Chinese Academy of Sciences. They were also listed as authors of a study examining DNA samples taken last year from 612 Uighurs in Tumxuk that appeared in April in Human Genetics, a journal published by Springer Nature, which also publishes the influential journal Nature.

Both papers named numerous other authors, including Li Caixia, chief forensic scientist at the Ministry of Public Security.

In an interview, Dr. Tang said he did not know why he was named as an author of the April paper, though he said it might have been because his graduate students worked on it. He said he had ended his affiliation with the Chinese police in 2017 because he felt their biological samples and research were subpar.

“To be frank, you overestimate how genius the Chinese police is,” said Dr. Tang, who had recently shut down a business focused on DNA testing and ancestry.

Like other geneticists, Dr. Tang has long been fascinated by Uighurs because their mix of European and East Asian features can help scientists identify genetic variants associated with physical traits. In his earlier studies, he said, he collected blood samples himself from willing subjects.

Dr. Tang said the police approached him in 2016, offering access to DNA samples and funding. At the time, he was a professor at the Partner Institute for Computational Biology, which is run by the Chinese Academy of Sciences but was founded in 2005 in part with funding from the Max Planck Society and still receives some grants and recommendations for researchers from the German group.

Dr. Beck, the Max Planck spokeswoman, said Dr. Tang had told the organization that he began working with the police in 2017, after it had stopped funding his research a year earlier.

But an employment ad on a government website suggests the relationship began earlier. The Ministry of Public Security placed the ad in 2016 seeking a researcher to help explore the “DNA of physical appearance traits.” It said the person would report to Dr. Tang and to Dr. Li, the ministry’s chief forensic scientist.

Dr. Tang did not respond to additional requests for comment. The Max Planck Society said Dr. Tang had not reported his work with the police as required while holding a position at the Partner Institute, which he did not leave until last year.

The Max Planck Society “takes this issue very seriously” and will ask its ethics council to review the matter, Dr. Beck said.

It is not clear when Dr. Liu, the assistant professor at Erasmus University Medical Center, began working with the Chinese police. Dr. Liu says in his online résumé that he is a visiting professor at the Ministry of Public Security at a lab for “on-site traceability technology.”

In 2015, while holding a position with Erasmus, he also took a post at the Beijing Institute of Genomics. Two months later, the Beijing institute signed an agreement with the Chinese police to establish an innovation center to study cutting-edge technologies “urgently needed by the public security forces,” according to the institute’s website.

Dr. Liu did not respond to requests for comment.

Erasmus said that Dr. Liu remained employed by the university as a part-time researcher and that his position in China was “totally independent” of the one in the Netherlands. It added that Dr. Liu had not received any funding from the university for the research papers, though he listed his affiliation with Erasmus on the studies. Erasmus made inquiries about his research and determined there was no need for further action, according to a spokeswoman.

Erasmus added that it could not be held responsible “for any research that has not taken place under the auspices of Erasmus” by Dr. Liu, even though it continued to employ him.

Still, Dr. Liu’s work suggests that sources of funding could be mingled.

In September, he was one of seven authors of a paper on height in Europeans published in the journal Forensic Science International. The paper said it was backed by a grant from the European Union — and by a grant from China’s Ministry of Public Security.

Dr. Tang said he was unaware of the origins of the DNA samples examined in the two papers, the 2018 paper in Hereditas (Beijing) and the Human Genetics paper published in April. The publishers of the papers said they were unaware, too.

Hereditas (Beijing) did not respond to a request for comment. Human Genetics said it had to trust scientists who said they had received informed consent from donors. Local ethics committees are generally responsible for verifying that the rules were followed, it said.

Springer Nature said on Monday that it had strengthened its guidelines on papers involving vulnerable groups of people and that it would add notes of concern to previously published papers.

In the papers, the authors said their methods had been approved by the ethics committee of the Institute of Forensic Science of China. That organization is part of the Ministry of Public Security, China’s police.

With 161,000 residents, most of them Uighurs, the agricultural settlement of Tumxuk is governed by the powerful Xinjiang Production and Construction Corps, a quasi-military organization formed by decommissioned soldiers sent to Xinjiang in the 1950s to develop the region.

Credit: Human Genetics

The state news media described Tumxuk, which is dotted with police checkpoints, as one of the “gateways and major battlefields for Xinjiang’s security work.”

In January 2018, the town got a high-tech addition: a forensic DNA lab run by the Institute of Forensic Science of China, the same police research group responsible for the work on DNA phenotyping.

Procurement documents showed the lab relied on software systems made by Thermo Fisher Scientific, a Massachusetts company, to work with genetic sequencers that analyze DNA fragments. Thermo Fisher announced in February that it would suspend sales to the region, saying in a statement that it had decided to do so after undertaking “fact-specific assessments.”

For the Human Genetics study, samples were processed by a higher-end sequencer made by an American firm, Illumina, according to the authors. It is not clear who owned the sequencer. Illumina did not respond to requests for comment.

The police sought to prevent two Times reporters from conducting interviews in Tumxuk, stopping them upon arrival at the airport for interrogation. Government minders then tailed the reporters and later forced them to delete all photos, audio and video recordings taken on their phones in Tumxuk.

Uighurs and human rights groups have said the authorities collected DNA samples, images of irises and other personal data during mandatory health checks.

In an interview, Zhou Fang, the head of the health commission in Tumxuk, said residents voluntarily accepted free health checks under a public health program known as Physicals for All and denied that DNA samples were collected.

“I’ve never heard of such a thing,” he said.

The questions angered Zhao Hai, the deputy head of Tumxuk’s foreign affairs office. He called a Times reporter “shameless” for asking a question linking the health checks with the collection of DNA samples.

“Do you think America has the ability to do these free health checks?” he asked. “Only the Communist Party can do that!”

Real Estate, and Personal Injury Lawyers. Contact us at: https://westlakelegal.com 

In Hong Kong Protests, Faces Become Weapons

HONG KONG — The police officers wrestled with Colin Cheung in an unmarked car. They needed his face.

They grabbed his jaw to force his head in front of his iPhone. They slapped his face. They shouted, “Wake up!” They pried open his eyes. It all failed: Mr. Cheung had disabled his phone’s facial-recognition login with a quick button mash as soon as they grabbed him.

As Hong Kong convulses amid weeks of protests, demonstrators and the police have turned identity into a weapon. The authorities are tracking protest leaders online and seeking their phones. Many protesters now cover their faces, and they fear that the police are using cameras and possibly other tools to single out targets for arrest.

And when the police stopped wearing identification badges as the violence escalated, some protesters began to expose officers’ identities online. One fast-growing channel on the social messaging app Telegram seeks and publishes personal information about officers and their families. The channel, “Dadfindboy,” has more than 50,000 subscribers and advocates violence in crude and cartoonish ways. Rival pro-government channels seek to unmask protesters in a similar fashion.

Mr. Cheung, who was arrested last week on suspicion of “conspiring and abetting murder,” subscribes to the “Dadfindboy” channel, although he denied being among its founders, as the police have said, and condemned posts calling for violence. He believes he was targeted by the police because he developed a tool that could compare images against a set of photos of officers to find matches — a project he later abandoned.

“I don’t want them to be like secret police,” said Mr. Cheung, who was released on bail and has not been charged with wrongdoing. “If law enforcement officers don’t wear anything to show their identity, they’ll become corrupt. They’ll be able to do whatever they want.”

“With the tool, ordinary citizens can tell who the police are,” he added.

Hong Kong is at the bleeding edge of a significant change in the authorities’ ability to track dangerous criminals and legitimate political protesters alike — and in their targets’ ability to fight back. Across the border in China, the police often catch people with digital fingerprints gleaned using one of the world’s most invasive surveillance systems. The advent of facial-recognition technology and the rapid expansion of a vast network of cameras and other tracking tools have begun to increase those capabilities substantially.

The transformation strikes a strong chord in Hong Kong. The protests began over a proposed bill that would have allowed the city to extradite criminal suspects to mainland China, where the police and courts ultimately answer to the Communist Party.

A protester spray-painted a security camera outside the Chinese government’s liaison office in Hong Kong this week. As the protests have intensified, faces and identities have become potent weapons on both sides. Credit: Chris McGrath/Getty Images

The authorities in Hong Kong have outlined strict privacy controls for the use of facial recognition and the collection of other biometric data, although the extent of their efforts is unclear. They also appear to be using other technological methods for tracking protesters. Last month, a 22-year-old man was arrested for being the administrator of a Telegram group.

Protesters are responding. On Sunday, as another demonstration turned into a violent confrontation with the police, some of those involved shined laser pointers at police cameras and used spray paint to block the lenses of surveillance cameras in front of the Chinese government’s liaison office. Riot officers carried cameras on poles just behind the front lines as they fired tear gas and rubber bullets.

The protesters’ ire intensified after the police removed identification numbers from their uniforms, presumably to keep violent misconduct from being reported to city leaders. To some protesters, the move suggested the police were taking a cue from the mainland, where officers lack public accountability and often do not identify themselves.

“Why do the police get away while we’re getting attacked?” said Billy Tsui, a hairdresser. “If they do something wrong, they should face legal consequences.” He said that he favored peace over violence but that he also had some sympathy for the Telegram group exposing officers as a check on police misconduct.

“The original intention is just to identify who are the policemen,” Mr. Tsui, 21, said. “If they hide their numbers and don’t show their identity, this is the only way to know.”

Hong Kong police representatives have said personal information about officers and their friends and relatives had been posted online in an act known as doxxing. On July 3, the police said they had arrested eight people accused of, among other things, disclosing personal information without approval. A police spokesman said members of the police force had reported more than 800 incidents in which officers or their family members had been harassed following the data releases.

“Dadfindboy” — a play on the name of a Facebook group created under the auspices of helping mothers find their children, but which ultimately became a way for pro-government groups to gather photos of protesters — is one forum for the doxxing of police officers. By turns facetious, juvenile, cruel and profane in tone, the channel repeatedly reveals personal information about the family members of police officers, sometimes with intimate social-media photos.

The channel has featured calls for violence, often in cartoonish ways, although there is no proof that it has incited any specific acts. One post instructed protesters on how to use a slingshot. Another explained how to make a blowtorch using aerosol deodorant. A recent poll queried the channel’s followers about how best to deal with the police. Options included prison, gas chamber, live burial, guillotine, and machine-gun execution. Live burial prevailed with about one-third of the vote.

The police grabbed Mr. Cheung 11 days after the Telegram channel was created, accusing him of administering it. They also accused him of posting a guide on how to assassinate police officers. Mr. Cheung denies the allegations, and a New York Times search could not find posts matching what the police described.

Mr. Cheung, a skinny 29-year-old, was grabbed at a mall around noon on July 18, according to his account. Four plainclothes officers waited for him to unlock his phone and then jumped on him, trying to pry it out of his hands.

After the officers tried to use his face to unlock the phone, they took him to a police station, where, he said, he was roughed up and interrogated. Later, officers went to his home and used a USB drive loaded with hacking software to break into his computers, according to his account of the incident. He said that he had been held for more than 10 hours and that he was not sure how the police had identified him.

Hong Kong police confirmed the investigation, but they declined to comment further on it.

The police may have been motivated by the facial-recognition tool, which Mr. Cheung said he had showed off in a Facebook video he posted last month. Making use of Google technology, Mr. Cheung, a college dropout who studied computer science, built an algorithm to identify police officers based on a small collection of photos that had been posted online. He said he did not continue to pursue the project because of a lack of time.

Mr. Cheung said his detention had cemented his fears. He said the plainclothes officers who arrested him did not identify themselves until they reached the police station. Later, an investigator in a suit urged him to open his phone as a way of demonstrating his innocence “to the 42nd floor” — a phrase Mr. Cheung said seemed to refer to high-ranking police officials. He did not believe that the police ultimately gained access to the phone, although they did break into his other devices.

The police also did not initially allow him to make a call. Only when he said he planned to play Ping-Pong with his uncle did they relent and let him make one. He said he contacted a friend instead, adding, “I hate sports.”

Mr. Cheung also said he believed he had been followed by plainclothes officers since his arrest. When he arrived an hour late to an interview with The Times, he said it was because he was trying to lose a tail. With the help of his black Tesla, he said, he managed to outrace whoever it was on the highways of the New Territories in northern Hong Kong.

“The cops are getting more and more aggressive,” he said. “I don’t think they have permission to unlock my iPhone or any device. They are starting to be out of their mind.”

“I think it’s more possible to see that Hong Kong will become China,” he said.

Qiqing Lin contributed research.

Real Estate, and Personal Injury Lawyers. Contact us at: https://westlakelegal.com 

In Hong Kong Protests, Faces Become Weapons

HONG KONG — The police officers wrestled with Colin Cheung in an unmarked car. They needed his face.

They grabbed his jaw to force his head in front of his iPhone. They slapped his face. They shouted, “Wake up!” They pried open his eyes. It all failed: Mr. Cheung had disabled his phone’s facial-recognition login with a quick button mash as soon as they grabbed him.

As Hong Kong convulses amid weeks of protests, demonstrators and the police have turned identity into a weapon. The authorities are tracking protest leaders online and seeking their phones. Many protesters now cover their faces, and they fear that the police are using cameras and possibly other tools to single out targets for arrest.

And when the police stopped wearing identification badges as the violence escalated, some protesters began to expose officers’ identities online. One fast-growing channel on the social messaging app Telegram seeks and publishes personal information about officers and their families. The channel, “Dadfindboy,” has more than 50,000 subscribers and advocates violence in crude and cartoonish ways. Rival pro-government channels seek to unmask protesters in a similar fashion.

Mr. Cheung, who was arrested last week on a suspicion of “conspiring and abetting murder,” subscribes to the “Dadfindboy” channel, although he denied being among its founders as the police have said and he condemned posts calling for violence. He believes he was targeted by the police because he developed a tool that could compare images against a set of photos of officers to find matches — a project he later abandoned.

“I don’t want them to be like secret police,” said Mr. Cheung, who was released on bail and has not been charged with wrongdoing. “If law enforcement officers don’t wear anything to show their identity, they’ll become corrupt. They’ll be able to do whatever they want.”

“With the tool, ordinary citizens can tell who the police are,” he added.

Hong Kong is at the bleeding edge of a significant change in the authorities’ ability to track dangerous criminals and legitimate political protesters alike — and in their targets’ ability to fight back. Across the border in China, the police often catch people with digital fingerprints gleaned using one of the world’s most invasive surveillance systems. The advent of facial- recognition technology and the rapid expansion of a vast network of cameras and other tracking tools has begun to increase those capabilities substantially.

The transformation strikes a strong chord in Hong Kong. The protests began over a proposed bill that would have allowed the city to extradite criminal suspects to mainland China, where the police and courts ultimately answer to the Communist Party.

ImageWestlake Legal Group merlin_158271216_f7c85bbf-41a3-4c85-9425-eac0a183069e-articleLarge In Hong Kong Protests, Faces Become Weapons Telegram LLC Surveillance of Citizens by Government Smartphones Privacy Politics and Government Police Brutality, Misconduct and Shootings Identification Devices Hong Kong facial recognition software Face Demonstrations, Protests and Riots Computers and the Internet Communist Party of China China cameras Attacks on Police

A protester spray-painted a security camera outside the Chinese government’s liaison office in Hong Kong this week. As the protests have intensified, faces and identities have become potent weapons on both sides.CreditChris Mcgrath/Getty Images

The authorities in Hong Kong have outlined strict privacy controls for the use of facial recognition and the collection of other biometric data, although the extent of their efforts is unclear. They also appear to be using other technological methods for tracking protesters. Last month, a 22-year old man was arrested for being the administrator of a Telegram group.

Protesters are responding. On Sunday, as another demonstration turned into a violent confrontation with the police, some of those involved shined laser pointers at police cameras and used spray paint to block the lenses of surveillance cameras in front of the Chinese government’s liaison office. Riot officers carried cameras on poles just behind the front lines as they fired tear gas and rubber bullets.

The protesters’ ire intensified after the police removed identification numbers from their uniforms, presumably to keep violent misconduct from being reported to city leaders. To some protesters, the move suggested the police were taking a cue from the mainland, where officers lack public accountability and often do not identify themselves.

“Why do the police get away while we’re getting attacked?” said Billy Tsui, a hairdresser. “If they do something wrong, they should face legal consequences.” He said that he favored peace over violence but that he also had some sympathy for the Telegram group exposing officers as a check on police misconduct.

“The original intention is just to identify who are the policemen,” Mr. Tsui, 21, said. “If they hide their numbers and don’t show their identity, this is the only way to know.”

Hong Kong police representatives have said personal information about officers and their friends and relatives had been posted online in an act known as doxxing. On July 3, the police said they had arrested eight people accused of, among other things, disclosing personal information without approval. A police spokesman said members of the police force had reported more than 800 incidents in which officers or their family members had been harassed following the data releases.

“Dadfindboy” — a play on the name of a Facebook group created under the auspices of helping mothers find their children, but which ultimately became a way for pro-government groups to gather photos of protesters — is one forum for the doxxing of police officers. By turns facetious, juvenile, cruel and profane in tone, the channel repeatedly reveals personal information and photos, some of them intimate, of the family members of police officers, sometimes with intimate social-media photos.

The channel has featured calls for violence, often in cartoonish ways, although there is no proof that it has incited any specific acts. One post instructed protesters on how to master using a slingshot. Another explained how to make a blow torch using aerosol deodorant. A recent poll queried the channel’s followers about how best to deal with the police. Options included prison, gas chamber, live burial, guillotine, and machine-gun execution. Live burial prevailed with about one-third of the vote.

The police grabbed Mr. Cheung 11 days after the Telegram channel was created, accusing him of administering it. They also accused him of posting a guide on how to assassinate police officers. Mr. Cheung denies the allegations, and a New York Times search could not find posts matching what the police described.

Mr. Cheung, a skinny 29-year-old, was grabbed at a mall around noon on July 18, according to his account. Four plainclothes officers waited for him to unlock his phone and then jumped on him, trying to pry it out of his hands.

After the officers tried to use his face to unlock the phone, they took him to a police station, where, he said, he was roughed up and interrogated. Later, officers went to his home and used a USB drive loaded with hacking software to break into his computers, according to his account of the incident. He said that he had been held for more than 10 hours and that he was not sure how the police had identified him.

Hong Kong police confirmed the investigation, but they declined to comment further on it.

The police may have been motivated by the facial-recognition tool, which Mr. Cheung said he had shown off in a Facebook video he posted last month. Making use of Google technology, Mr. Cheung, a college dropout who studied computer science, built an algorithm to identify police officers based on a small collection of photos that had been posted online. He said he did not continue to pursue the project because of a lack of time.

Mr. Cheung said his detention had cemented his fears. He said the plainclothes officers who arrested him did not identify themselves until they reached the police station. Later, an investigator in a suit urged him to open his phone as a way of demonstrating his innocence “to the 42nd floor” — a phrase Mr. Cheung said seemed to refer to high-ranking police officials. He did not believe that the police ultimately gained access to the phone, although they did break into his other devices.

The police also did not initially allow him to make a call. Only when he said he planned to play Ping-Pong with his uncle did they relent and let him. He said he contacted a friend instead, adding “I hate sports.”

Mr. Cheung also said he believed he had been followed by plainclothes officers since his arrest. When he arrived an hour late to an interview with The Times, he said it was because he was trying to lose a tail. With the help of his black Tesla, he said, he managed to outrace whoever it was on the highways of the New Territories in northern Hong Kong.

“The cops are getting more and more aggressive,” he said. “I don’t think they have permission to unlock my iPhone or any device. They are starting to be out of their mind.”

“I think it’s more possible to see that Hong Kong will become China,” he said.

Qiqing Lin contributed research.

Real Estate, and Personal Injury Lawyers. Contact us at: https://westlakelegal.com 

Facial Recognition Tech Is Growing Stronger, Thanks to Your Face

SAN FRANCISCO — Dozens of databases of people’s faces are being compiled without their knowledge by companies and researchers, with many of the images then being shared around the world, in what has become a sprawling ecosystem fueling the spread of facial recognition technology.

The databases are pulled together with images from social networks, photo websites, dating services like OkCupid and cameras placed in restaurants and on college quads. While there is no precise count of the data sets, privacy activists have pinpointed repositories that were built by Microsoft, Stanford University and others, with one holding more than 10 million images while another had more than two million.

The face compilations are being driven by the race to create leading-edge facial recognition systems. This technology learns how to identify people by analyzing as many digital pictures as possible using “neural networks,” which are complex mathematical systems that require vast amounts of data to learn to recognize patterns.
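The matching step such systems perform can be sketched in a few lines: a trained network maps each photo to a numeric “embedding” vector, and two photos are judged to show the same person when their embeddings point in nearly the same direction. The sketch below is illustrative only; the toy four-dimensional vectors, names and threshold are invented for the example (real systems use embeddings of 128 or more dimensions learned from millions of images).

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def same_person(emb1, emb2, threshold=0.8):
    """Declare a match when the embeddings are sufficiently similar."""
    return cosine_similarity(emb1, emb2) >= threshold

# Toy embeddings, standing in for the output of a trained network.
alice_photo_1 = [0.9, 0.1, 0.3, 0.7]
alice_photo_2 = [0.8, 0.2, 0.35, 0.65]  # same person, different photo
stranger = [0.1, 0.9, 0.7, 0.1]

print(same_person(alice_photo_1, alice_photo_2))  # similar vectors: True
print(same_person(alice_photo_1, stranger))       # dissimilar vectors: False
```

The vast image troves described in the article are what trains the network that produces these embeddings in the first place; the more faces it sees, the more reliably similar faces land near one another.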

Tech giants like Facebook and Google have most likely amassed the largest face data sets, which they do not distribute, according to research papers. But other companies and universities have widely shared their image troves with researchers, governments and private enterprises in Switzerland, India, China, Australia and Singapore for training artificial intelligence, according to academics, activists and public papers.

Companies and labs have gathered facial images for more than a decade, and the databases are merely one part of building facial recognition technology. But people often have no idea that their faces ended up in them. And while names are typically not attached to the photos, individuals can be recognized because each face is unique to a person.


A visualization of 2,000 of the identities included in the MS Celeb database from Microsoft. Credit: Open Data Commons Public Domain Dedication and License, via Megapixels

Questions about the data sets are rising because the technologies that they have enabled are now being used in potentially invasive ways. Documents released last Sunday revealed that Immigration and Customs Enforcement officials employed facial recognition technology to scan motorists’ photos to identify undocumented immigrants. The F.B.I. also spent more than a decade using such systems to compare driver’s license and visa photos against the faces of suspected criminals, according to a Government Accountability Office report last month. On Wednesday, a congressional hearing tackled the government’s use of the technology.

There is no oversight of the data sets. Activists and others said they were angered by the possibility that people’s likenesses had been used to build ethically questionable technology and that the images could be misused. At least one face database created in the United States was shared with a company in China that has been linked to ethnic profiling of the country’s minority Uighur Muslims.

Over the past several weeks, some companies and universities, including Microsoft and Stanford, removed their face data sets from the internet because of privacy concerns. But given that the images were already so well distributed, they are most likely still being used in the United States and elsewhere, researchers and activists said.

“You come to see that these practices are intrusive, and you realize that these companies are not respectful of privacy,” said Liz O’Sullivan, who oversaw one of these databases at the artificial intelligence start-up Clarifai. She said she left the New York-based company in January to protest such practices.

“The more ubiquitous facial recognition becomes, the more exposed we all are to being part of the process,” said Liz O’Sullivan, a technologist who worked at the artificial intelligence start-up Clarifai. Credit: Nathan Bajar for The New York Times

“The more ubiquitous facial recognition becomes, the more exposed we all are to being part of the process,” she said.

Google, Facebook and Microsoft declined to comment.


One database, which dates to 2014, was put together by researchers at Stanford. It was called Brainwash, after a San Francisco cafe of the same name, where the researchers tapped into a camera. Over three days, the camera took more than 10,000 images, which went into the database, the researchers wrote in a 2015 paper. The paper did not address whether cafe patrons knew their images were being taken and used for research. (The cafe has closed.)

The Stanford researchers then shared Brainwash. According to research papers, it was used in China by academics associated with the National University of Defense Technology and Megvii, an artificial intelligence company that The New York Times previously reported has provided surveillance technology for monitoring Uighurs.

The Brainwash data set was removed from its original website last month after Adam Harvey, an activist in Germany who tracks the use of these repositories through a website called MegaPixels, drew attention to it. Links between Brainwash and papers describing work to build A.I. systems at the National University of Defense Technology in China have also been deleted, according to documentation from Mr. Harvey.

Stanford researchers who oversaw Brainwash did not respond to requests for comment. “As part of the research process, Stanford routinely makes research documentation and supporting materials available publicly,” a university official said. “Once research materials are made public, the university does not track their use, nor do university officials.”

Duke University researchers also started a database in 2014 using eight cameras on campus to collect images, according to a 2016 paper published as part of the European Conference on Computer Vision. The cameras were marked with signs, said Carlo Tomasi, the Duke computer science professor who helped create the database. The signs gave a number or email for people to opt out.

The Duke researchers ultimately gathered more than two million video frames with images of over 2,700 people, according to the paper. They also posted the data set, named Duke MTMC, online. It was later cited in myriad documents describing work to train A.I. in the United States, China, Japan, Britain and elsewhere.

Duke University researchers started building a database in 2014 using eight cameras on campus to collect images. Credit: Open Data Commons Attribution License, via Megapixels

The Duke researchers ultimately gathered more than two million video frames with images of over 2,700 people. Credit: Open Data Commons Attribution License, via Megapixels

Dr. Tomasi said that his research group did not do face recognition and that the MTMC was unlikely to be useful for such technology because of poor angles and lighting.

“Our data was recorded to develop and test computer algorithms that analyze complex motion in video,” he said. “It happened to be people, but it could have been bicycles, cars, ants, fish, amoebas or elephants.”

At Microsoft, researchers have claimed on the company’s website to have created one of the biggest face data sets. The collection, called MS Celeb, spanned over 10 million images of more than 100,000 people.

MS Celeb was ostensibly a database of celebrities, whose images are considered fair game because they are public figures. But MS Celeb also brought in photos of privacy and security activists, academics and others, such as Shoshana Zuboff, the author of the book “The Age of Surveillance Capitalism,” according to documentation from Mr. Harvey of the MegaPixels project. MS Celeb was distributed internationally, before being removed this spring after Mr. Harvey and others flagged it.

Kim Zetter, a cybersecurity journalist in San Francisco who has written for Wired and The Intercept, was one of the people who unknowingly became part of the Microsoft data set.

“We’re all just fodder for the development of these surveillance systems,” she said. “The idea that this would be shared with foreign governments and military is just egregious.”

Matt Zeiler, founder and chief executive of Clarifai, the A.I. start-up, said his company had built a face database with images from OkCupid, a dating site. He said Clarifai had access to OkCupid’s photos because some of the dating site’s founders invested in his company.

He added that he had signed a deal with a large social media company — he declined to disclose which — to use its images in training face recognition models. The social network’s terms of service allow for this kind of sharing, he said.

“There has to be some level of trust with tech companies like Clarifai to put powerful technology to good use, and get comfortable with that,” he said.

An OkCupid spokeswoman said Clarifai contacted the company in 2014 “about collaborating to determine if they could build unbiased A.I. and facial recognition technology” and that the dating site “did not enter into any commercial agreement then and have no relationship with them now.” She did not address whether Clarifai had gained access to OkCupid’s photos without its consent.

Clarifai used the images from OkCupid to build a service that could identify the age, sex and race of detected faces, Mr. Zeiler said. The start-up also began working on a tool to collect images from a website called Insecam — short for “insecure camera” — which taps into surveillance cameras in city centers and private spaces without authorization. Clarifai’s project was shut down last year after some employees protested and before any images were gathered, he said.

Mr. Zeiler said Clarifai would sell its facial recognition technology to foreign governments, military operations and police departments provided the circumstances were right. It did not make sense to place blanket restrictions on the sale of technology to entire countries, he added.

Ms. O’Sullivan, the former Clarifai technologist, has joined a civil rights and privacy group called the Surveillance Technology Oversight Project. She is now part of a team of researchers building a tool that will let people check whether their image is part of the openly shared face databases.

“You are part of what made the system what it is,” she said.


Heeeere’s Jimmy: “The Shining,” starring Jim Carrey, part three


One last entry in this growing genre to cleanse the palate at the end of a long week. Having done two famous scenes from the movie already with Carrey’s face substituted for Jack Nicholson’s, the deepfaker responsible for this series was destined to eventually take a shot at the most famous scene of all. I agree with the YouTube commenter who said this would have been perfect if only “Heeeeere’s Johnny” could have been replaced with “Alllllllllrighty then.”

Soon. Another year or two, figure, and the technology will be there.

Actually, this deepfake is interesting because it’s *not* perfect. It’s the most technically flawed of the three Carrey “Shining” clips I’ve posted this week. I don’t know whether the person who made it didn’t bother trying to swap in Carrey’s face for Nicholson’s in the side shots of Jack Torrance attacking the bathroom door with an axe or if the tech just doesn’t handle profile shots well yet, but to my eye that looks like Nicholson, not Carrey. You’ll see visual artifacts for a moment or two as well around Carrey’s face in the famous close-ups of Nicholson peeking through the hacked door and terrorizing Shelley Duvall. Whether that’s because this clip was rushed out or because close-ups are just inherently more difficult for face-swapping than shots taken from 10 feet away, where detail is less noticeable, I leave to the pros to say.

Next week are we gonna get Nicholson swapped in for Carrey in “Dumb & Dumber”? I might pay to see that.

The post Heeeere’s Jimmy: “The Shining,” starring Jim Carrey, part three appeared first on Hot Air.


Take two: “The Shining,” starring Jim Carrey


An evening palate cleanser to follow up on this post from earlier in the week. There will come a point when these deepfake mindfarks become routine and lose the frisson of wonder which they currently inspire, probably sooner than we think.

But not yet. Not today.

The deepfake maestro who’s putting these together clearly has found a muse in the Carrey-for-Nicholson swap. This vid’s both better and worse than the one I posted a few days ago. Worse in the sense that the face-mapping isn’t quite as seamless here as it was in the previous clip. There are moments in this one in which Carrey’s face seems a touch too large for Nicholson’s head, to a slightly unnatural degree.

But better in the sense that this scene is iconic, which makes it that much trippier to see a different actor cast in the main part. It’s like watching the Obi-Wan/Vader lightsaber battle in “Star Wars” except with Patrick Stewart doing the honors instead of Alec Guinness.

They’ve gotta work on voice-swapping too, though. I want the full experience.

The post Take two: “The Shining,” starring Jim Carrey appeared first on Hot Air.


Freaky: “The Shining” — starring Jim Carrey


To cleanse the palate, an unnerving late-night deepfake. I approve of a YouTube commenter’s suggestion to follow this up by swapping Jack Nicholson’s face onto Carrey’s in the “rhino” scene in Ace Ventura 2.

What makes this work so well, I think, is partly the technology and partly the inspiration of casting Carrey as Jack Torrance. The face-mapping is stellar; if you didn’t know the film (or Carrey’s voice) you would certainly take the clip at face value, no pun intended. But it’s effective too because it’s surprisingly easy to imagine Carrey in this role. The sense one typically gets watching Nicholson is of a pressure vessel that’s straining and liable to burst. The sense one gets watching Carrey is more of a fireworks show, periodically going off in every direction. Each is an unstable reaction happening before your eyes, which makes them compelling to watch. Go figure that you might think of Carrey when recasting a character who’s one blizzard away from threatening his family with an axe.

Can you imagine what these clips will be like when they nail down the voice technology too? The vocal mismatch is the only thing stopping this from being a true actor-for-actor substitution in an iconic film.

The guy who made this, by the way, turns out to be the same guy who made that Bill Hader/Arnold Schwarzenegger clip I posted a few weeks ago. That wasn’t the only time he’s had some fun with Hader’s impressions either. Watch the second clip below for the witchy effect achieved when Hader suddenly morphs into Al Pacino. (I think it’s the fact that their hair’s similar in color that makes it so good.) This guy has a page full of other deepfakes, with new ones coming every day. As far as anyone knows, he’s an amateur. This is what amateurs can do now.

The post Freaky: “The Shining” — starring Jim Carrey appeared first on Hot Air.


San Francisco Bans Facial Recognition Technology

SAN FRANCISCO — The San Francisco Board of Supervisors on Tuesday enacted the first ban by a major city on the use of facial recognition technology by police and all other municipal agencies.

The vote was 8 to 1 in favor, with two members who support the bill absent. There will be an obligatory second vote next week, but it is seen as a formality.

Police forces across America have begun turning to facial recognition to search for both small-time criminal suspects and perpetrators of mass carnage: authorities used the technology to help identify the gunman in the mass killing at an Annapolis, Md., newspaper in June. But civil liberties groups have expressed unease about the technology’s potential abuse by government amid fears that it may push the United States in the direction of an overly oppressive surveillance state.

Aaron Peskin, the city supervisor who announced the bill, said that it sent a particularly strong message to the nation, coming from a city transformed by tech.

“I think part of San Francisco being the real and perceived headquarters for all things tech also comes with a responsibility for its local legislators,” said Mr. Peskin, who represents neighborhoods on the northeast side of the city. “We have an outsize responsibility to regulate the excesses of technology precisely because they are headquartered here.”

Similar bans are under consideration in Oakland and in Somerville, Mass., outside of Boston. In Massachusetts, a bill in the state legislature would put a moratorium on facial recognition and other remote biometric surveillance systems. On Capitol Hill, a bill introduced last month would ban users of commercial face recognition technology from collecting and sharing data for identifying or tracking consumers without their consent, although it does not address the government’s uses of the technology.

Matt Cagle, an attorney with the ACLU of Northern California, summed up the broad concerns of critics Tuesday: Facial recognition technology, he said, “provides government with unprecedented power to track people going about their daily lives. That’s incompatible with a healthy democracy.”

The San Francisco proposal, he added, “is really forward-looking and looks to prevent the unleashing of this dangerous technology against the public.”


A security camera in San Francisco. Credit: Eric Risberg/Associated Press

In one form or another, facial recognition is already being used in many U.S. airports and big stadiums, and by a number of other police departments. The pop star Taylor Swift has reportedly incorporated the technology at one of her shows, using it to help identify stalkers.

The issue has been particularly charged in San Francisco, a city with a rich history of incubating dissent and individual liberties, but one that has also suffered lately from high rates of property crime. A local group called Stop Crime SF asked supervisors to exempt local prosecutors, police and sheriffs from the ordinance when performing investigative duties, and to exempt the airport as well.

The group had been encouraging residents to send a form letter to supervisors. It argued that the ordinance “could have unintended consequences that make us less safe by severely curtailing the use of effective traditional video surveillance by burying agencies like the police department in a bureaucratic approval process.”

The facial recognition fight in San Francisco is largely theoretical — the police department does not currently deploy facial recognition technology, and the city’s airport and ports, where it is used, are under federal jurisdiction and are not covered by the legislation.

Some local homeless shelters use biometric finger scans and photos to track shelter usage, said Jennifer Friedenbach, the executive director of the Coalition on Homelessness. The practice has driven undocumented residents away from the shelters, she added.

Mr. Cagle and other experts said that it was difficult to know exactly how widespread the technology was in the U.S. “Basically governments and companies have been very secretive about where it’s being used, so the public is largely in the dark about the state of play,” he said.

But Dave Maass, senior investigative researcher at the Electronic Frontier Foundation, offered a partial list of police departments that he said used the technology, including Las Vegas, Orlando, San Jose, San Diego, New York City, Boston, Detroit and Durham, N.C.

Other users, Mr. Maass said, include the Colorado Department of Public Safety, the Pinellas County Sheriff’s Office, the California Department of Justice and the Virginia State Police.

U.S. Customs and Border Protection is now using facial recognition in many U.S. airports and ports of sea entry. At airports, international travelers stand before cameras, then have their pictures matched against photos provided in their passport applications. The agency says the process complies with privacy laws, but it has still come in for criticism from the Electronic Privacy Information Center, which argues that the government, though promising travelers that they may opt out, has made it increasingly difficult to do so.

But there is a broader concern. “When you have the ability to track people in physical space, in effect everybody becomes subject to the surveillance of the government,” said Marc Rotenberg, the group’s executive director.
