Attorney Sara Jodka Quoted in Financial Post Article on Artificial Intelligence in the Workplace
- Jodka, Sara H.
- Media Mentions
Attorney Sara Jodka was recently quoted in an article entitled “When sexist, racist robots discriminate, are their owners at fault?” published in the Financial Post. The article examines whether employers can be held legally accountable for biases in artificial intelligence (AI) software.
Some experts claim that AI is increasingly biased against women and non-white people; even robots, they say, can be sexist and racist. The bias may not be deliberate but may instead reflect unconscious biases introduced during the software’s development.
Ms. Jodka, who offers preventative counselling services to employers, says employers should “look under the hood” of the technology and verify that the software uses an appropriate range of “data sets” — the criteria fed into the software that power its determinations.
“Because it may be hard to determine precisely the extent to which the developers or the data sets are prone to blind biases, employers should contract around liability by demanding tight clauses fully indemnifying them against damages occasioned by discriminatory technology,” says Ms. Jodka.
To read the full article, please click here.