Alondra Nelson wants to make science and technology fairer


The pandemic taught us a lesson that Alondra Nelson says we need to learn again and again: science and technology are deeply entangled with society, with inequality, and with social life.

A year into a pandemic that politicized science, and after a presidential campaign, President Joe Biden appointed Nelson deputy director for science and society at the White House Office of Science and Technology Policy, a newly created position. Nelson will build a science and society division within OSTP to address issues ranging from data and democracy to STEM education. In another first, Biden made his science adviser, Eric Lander, who also heads OSTP, a member of his cabinet.

Nelson has spent her career at the intersection of race, technology, and society, writing on topics such as how Afrofuturism can make the world a better place and how the Black Panthers’ use of health care as a form of activism led to the organization’s early interest in genetics. She is the author of several books, including The Social Life of DNA, which traces the rise of the consumer genetic testing industry and how the desire to understand their ancestry made Black people and Mormons early adopters of the technology.

Nelson is a professor at the Institute for Advanced Study in Princeton, New Jersey. Before her appointment, she was writing a book on OSTP and the Obama administration’s major science projects, including a series of reports on artificial intelligence and government policy.

In her first official remarks after assuming the new position in January, Nelson called science a social phenomenon and said that technologies like artificial intelligence can reveal or reflect the dangerous social structures that lie beneath the pursuit of scientific progress. In an interview with WIRED, Nelson said that Black communities in particular have been overexposed to the harms of science and technology while being underserved by their benefits.

In the interview, she also talked about the Biden administration’s scientific moonshot plans, why the administration has taken no formal position on banning facial recognition, and the issues involving emerging technology and society that she believes must be addressed during her time in government. An edited transcript follows.

WIRED: In January, you talked about “the dangerous social structures hidden beneath the scientific progress we pursue” and mentioned gene editing and artificial intelligence. What prompted you to single out gene editing and AI in your first public speech in this position?

Alondra Nelson: I think what genetic science and artificial intelligence have in common is that they are data-centric. Some of what we know about data and how data analysis works at scale holds true for aspects of large-scale genomic analysis and machine learning, so those are the foundations. I think the problems we still need to solve as a country concern the sources of the data analyzed with AI tools, and who gets to decide which variables are used and which questions scientific and technical research asks. What I hope is different about this OSTP is an honesty about the past: technology has harmed some communities and left others out, shutting people out of technical work.

Racial equity and restoring trust in government were identified as key issues on the administration’s first day of work, which means the work of science and technology policy has to deal honestly with the past. Part of restoring trust in government, and in the idea that science and technology can do any good in the world at all, is being open about the history of science and technology’s shortcomings and failures.

Unfortunately, there are many examples. Next month will mark another anniversary of the Associated Press story that exposed the Tuskegee syphilis study, which ran for 40 years, so we have to face that anniversary again. And of course we have a problem in AI research: the data being used are incomplete, and that incompleteness means the inferences drawn from them are incomplete and inaccurate, with truly disproportionate harm to Black and brown communities, especially when these tools are used in social services and the criminal justice system.

Lander said at his confirmation hearing that OSTP will address discrimination that stems from algorithmic bias. How will that work?
