As the healthcare industry evaluates how to pay for artificial intelligence (AI) solutions, experts say data and real-world evidence are essential to any payment decision. In this Forbes article, McDermott Partners Dale C. Van Demark and Jiayan Chen provide insight into some of the regulatory challenges AI presents.
“For AI to be paid for, you need data that shows your product is making a difference,” Chen notes. “To do that, you need massive quantities of data to develop the tool or algorithm, but you also have to show that it works in a real-world setting.”
As governments lift COVID-19 pandemic restrictions, employers are turning to artificial intelligence tools to accelerate their hiring processes.
However, these AI-based tools can open businesses up to discrimination claims if they are not careful, according to McDermott partner Brian Mead.
“[The technology] could decide that certain words [are] unlikely to [yield] successful candidates, and then it’s prescreening out members of protected classes and categories of applicants in a discriminatory way,” Mead said in a recent Law360 article.
The seismic, virtually overnight transformation of healthcare delivery as a result of the pandemic has flung open doors to innovation, as a diverse cross-section of digital health and life sciences stakeholders mobilize crisis resources; adjust operations for enhanced screening, sanitization and social distancing measures; harness telehealth capabilities to deliver healthcare remotely; and identify opportunities for smarter, better healthcare going forward.
Writing for The US-Israel Legal Review, partners from McDermott’s Health practice highlight the challenges and opportunities that digital health and life sciences operators and investors should consider as the industry charts a course through the changed post-pandemic healthcare landscape.
In January 2020, the Supreme Court declined to hear the question of whether Facebook broke Illinois law when it introduced a photo-tagging feature that homed in on users’ faces and tagged them without their consent; Facebook has since settled with the users for $550 million. The Illinois law is part of a patchwork of laws applicable to facial recognition technology (FRT).
McDermott’s Ashley Winton contributes to the second installment of a three-part article series on FRT. This article examines the applicable legal framework and regulatory guidance, including intellectual property rights, general privacy legislation, specific state biometric data laws and more.