When ChatGPT Goes Wrong 😈 | The Prompt
Actually, this type of inequality (racism, gender bias, etc.) within computer systems isn't new. I remember reading an article, in The New York Times or a similar outlet, about a computer program that sifted through resumes for job applications. Based on existing data, the system had learned that white males were the best hires, because white males were the predominant gender and ethnicity at the company, and so it shortlisted only or mainly white males for interviews based purely on the words used in each resume. Essentially, words like "Mom" disqualified a candidate because they identified the applicant as female. Unfortunately, as long as humans operate these systems and populate them with their own biases, the same inequality that exists in the world now will continue to exist in an AI-driven world.
I think you should try ChatGPT. Then you can know when it goes wrong.
And when it lies: https://twitter.com/Kantrowitz/status/1613168223054188545?s=20&t=-tl_wlliHz4PbSw_6-bCWg
Also, when you ask it to make comedic jokes about certain races it refuses, but when you ask for jokes about Indians, it makes curry jokes.
I think I should ask ChatGPT itself this question.
This is worrying, but I can't say it's surprising. We live in a world built on inequality, and if we are the ones teaching AI then of course this inequality will be transferred and adopted by it. And so the cycle continues.
You must try it first. We are using it for our Recruitment Agency for Malaysia in Pakistan website. It has never gone wrong.