Unit 5 The AI Era: Industry Innovation and Ethical Reflections

Wrap Up

A. Listen to the talk and fill in the blanks with the words from the box. If necessary, change the form of the words.

Script
W:
AI systems, like humans, are expected to follow social norms and remain fair and unbiased. Bias in AI often comes from flawed input data, and flawed data will likely lead to biased outcomes. For example, one U.S. company used the résumés of its current employees to train an AI hiring model. The result? The system showed bias against women: résumés mentioning “women” were automatically rejected. Developers should carefully review their data to prevent unintended discrimination. This case highlights the importance of being sensitive and open when using AI. Bias isn’t always intentional, but its impact can be serious. To use AI ethically, users must recognize and correct bias in data and models to ensure fair outcomes.