Whether it’s by telling Alexa to play your favorite song or using facial recognition to unlock your phone, AI is used on a daily basis to make life easier. Despite the benefits, there are also downsides to the technology, such as bias risks and security concerns.

“Too often, these tools are used to limit our opportunities and prevent our access to critical resources or services,” said the White House in a statement. “In America and around the world, systems supposed to help with patient care have proven unsafe, ineffective, or biased. Algorithms used in hiring and credit decisions have been found to reflect and reproduce existing unwanted inequities or embed new harmful bias and discrimination.”

To protect civil rights and democratic values, the White House Office of Science and Technology Policy identified five principles to guide the design, use, and deployment of automated systems. When applied, these principles are meant to “set backstops against potential harms.”

The five pillars are:

Safe and effective systems
Algorithmic discrimination protections
Data privacy
Notice and explanation
Human alternatives, consideration, and fallback

The blueprint states that the framework applies to automated systems that “have the potential to meaningfully impact the American public’s rights, opportunities, or access to critical resources or services.”

Despite these efforts to control the negative effects of AI, some critics argue that the Blueprint for an AI Bill of Rights will not be effective because it is not a legally binding document and “lacks teeth,” according to a WIRED article.

Navigating the uncharted waters of AI regulation will surely pose challenges, but at least the White House has shed light on an important issue affecting both people’s everyday lives and the tech industry as a whole.