Roger Watson
Dr Roger Watson is Academic Dean of Nursing at Southwest Medical University, China. He has a PhD in biochemistry. He writes in a personal capacity.
As a regular and enthusiastic user of ChatGPT, I find it useful for finding information, summarising documents, analysing spreadsheets and performing complex calculations. I also, increasingly, use it to generate images to accompany articles for my Substack and other publications.
It is in the generation of images that I find fault with ChatGPT, not so much for what it cannot do, but for what it can do. The attitude of ChatGPT – if a chatbot can have an attitude – verges on hypocrisy.
I have on several occasions run into problems when asking ChatGPT to generate an image of a person who may typically represent their race or nationality. Thus, if I ask ChatGPT to “generate an image of a typical African man” or a “typical Chinese man” I run into problems. The response is, almost invariably, “I can’t create or display images that depict what a ‘typical’ person of a particular nationality or ethnicity looks like, since that would rely on stereotypes.”
In the case of African men, I am offered the opportunity to ask for a typical Bushman of the Kalahari Desert or someone in the national dress of a particular country. In the case of China, I am offered people in dress typical of a Chinese dynasty or class. Fair enough, I suppose: we must not stereotype races or nationalities, and ChatGPT probably only reflects contemporary cultural mores. Except…
To test ChatGPT’s commitment to avoiding national, racial or cultural stereotypes, I asked it to “generate an image of a typical Scotsman”. And hoots mon the noo, see you Jimmy, an image of a man in Highland dress instantly appeared. Wondering if this was an anomaly, I repeated the question, only to be provided, instantly, with another image of a man in Highland dress. No stereotypes there!


Wondering if I could repeat the exercise for other Celts, I asked for images of a “typical Welshman” and a “typical Irishman”, only to be met with the ChatGPT line about stereotypes. While I wasn’t expecting leprechauns or men in flagrante with sheep, I thought that, at least, ChatGPT might have had a go at a man in an Irish kilt (they do exist) or a singing coal miner.
I assumed, given the no-stereotypes mode, that ChatGPT would also refuse to generate images of a “typical Englishman”. But I was wrong. First, I got a man with a moustache, in tweeds and a flat cap. Then, just to check that this was not an anomaly, I got a man with a beard, in tweeds and a bowler hat. Yes, indeed, we see such people every day on our city streets.


Wondering if such stereotypes were gender specific and confined only to men, I then asked ChatGPT for an image of a “typical Scotswoman”. Not a problem, apparently.

Likewise, a “typical Englishwoman”. Although what is typical about either of these female images is hard to fathom. Perhaps the red hair and freckles of the Scotswoman are considered fair game by ChatGPT. But the Englishwoman, with whom I am already in love, also has red hair. Perhaps her clear complexion is what is typical.

Thinking I was on a roll with the ladies I naïvely asked ChatGPT for an image of a “typical Pakistani woman”. I think you can guess how that went.
This article was originally published by the Daily Sceptic.