Attorney Sara Jodka Quoted in Financial Post Article on Artificial Intelligence in the Workplace
- Jodka, Sara H.
- Media Mentions
Attorney Sara Jodka was recently quoted in an article entitled “When sexist, racist robots discriminate, are their owners at fault?” published in the Financial Post. The article examines whether employers can be held legally accountable for biases in artificial intelligence (AI) software.
Some experts claim that AI is increasingly biased against women and non-white people. Even robots, they claim, are being sexist and racist. The bias may not be deliberate; rather, it may reflect unconscious biases introduced during the development of the software.
Ms. Jodka, who offers preventative counseling services to employers, says employers should “look under the hood” of the technology and verify that the software uses an appropriate range of “data sets,” the criteria fed into the software that power its determinations.
“Because it may be hard to determine precisely the extent to which the developers or the data sets are prone to blind biases, employers should contract around liability by demanding tight clauses fully indemnifying them against damages occasioned by discriminatory technology,” says Ms. Jodka.
To read the full article, please click here.