A few years ago, American friends of mine bought a home in a European country in order to obtain an E.U. passport. (The country has an immigration program supporting this.) They state that they are doing so in case the United States presidential election goes as they fear.
They, and I, have no doubt that the U.S. would fall to some form of authoritarianism if the wrong candidate were elected. They, and I, are white, well educated, upper middle class and not immigrants, with a wide range of well-connected and financially stable friends. Our demographic backgrounds are relevant to my question, which is about the ethics of leaving a country because its democratic institutions are failing.
As members of groups that are most likely to retain many tangible privileges and least likely to be negatively affected, do we have an ethical obligation to stay and help those who will be affected more harshly than we will be, or is it ethically acceptable to leave the country? — Name Withheld
From the Ethicist:
To say “my country, right or wrong,” G.K. Chesterton complained, was like saying “my mother, drunk or sober.” If your mother took to drink (or, in another example of his, if your son committed murder), a loving relationship meant that you couldn’t be blithely indifferent. I take his point about alcoholic parents and murderous offspring. By my lights, though, when genuine patriots say “my country, right or wrong,” they mean that it’s their country whether or not they agree with what is done in its name. That’s the opposite of giving the country a free pass. It expresses a commitment to trying to help your country do what it should — which is how we should normally feel about our families too.
A sign of your identification with a country, your sense that it’s yours, is the pride — and the shame — that you feel about things done by your country and your compatriots. It’s a concern for national honor. Another sign is a sense of shared responsibility for the country’s fate. Leaving your country because you think it has gone off the rails isn’t really consistent with this sense of shared responsibility or with a commitment to trying to make things better. Unless you think that staying will put you in jeopardy, or that leaving will contribute to restoring your country (the way the Free French departed France when it was occupied in World War II, in order to regroup elsewhere), skedaddling does strike me as unpatriotic.
Now, patriotism is a fine thing, in my view, but that doesn’t mean patriotic self-sacrifice is a duty. If you’re convinced that life here will be unbearable for you, you are morally free to go. Morally free doesn’t mean morally admirable, though. You make it clear that this country has treated you well; let me note that people can have patriotic hopes for a country that has treated them badly. Frederick Douglass was an American patriot despite having been enslaved under his country’s laws.
I don’t share your fear that America is about to tumble into authoritarianism. I grew up under civilian and military dictatorships in Ghana. My father was a political prisoner. I think I have a sense of the conditions under which authoritarianism can arise. But if I did believe what you believe, I confess that, as a patriot, I would want to stick around and join with others to help bring us back from disaster.
Readers Respond
The previous question was from a reader who was feeling queasy about A.I. art. He wrote: “My friends and I use a website for tabletop role-playing games (think Dungeons & Dragons). When making a character for a ‘Lord of the Rings’ game, I found what looked to be the perfect piece online: a Celtic-looking warrior in the style of Alphonse Mucha. … This particular piece seems to be available only in an Etsy shop, where the creator apparently uses A.I. prompts to generate images. The price is nominal: a few dollars. Yet I cannot help thinking that those who make A.I.-generated art are taking other artists’ work, essentially recreating it and then profiting from it. I’m not sure what the best move is.”
In his response, the Ethicist noted: “There’s a sense in which A.I. image generators — such as DALL-E 3, Midjourney and Stable Diffusion — make use of the intellectual property of the artists whose work they’ve been trained on. But the same is true of human artists. The history of art is the history of people borrowing and adapting techniques and tropes from earlier work, with occasional moments of deep originality. … As forms of artificial intelligence grow increasingly widespread, we need to get used to so-called ‘centaur’ models — collaborations between human and machine cognition. … Plenty of people, I know, view A.I. systems as simply parasitic on human creativity and deny that they can be in the service of it. I’m suggesting that there’s something wrong with this picture.” (Reread the full question and answer here.)
⬥
I thoroughly enjoy reading the Ethicist, and this response was just as thought-provoking as usual. In my opinion, the real question one needs to ask when using A.I. models is: Are the models themselves ethical? Why are the tech companies pushing so hard for the adoption of these services? The commercial models exist to turn a profit and are built on unethical and immoral labor practices (such as OpenAI’s reported financial exploitation of Kenyan workers). Additionally, if you read the terms of service for these commercial A.I. models, you may be agreeing not to hold the companies liable for any infringement claims arising from the generated work, and in many cases they will not indemnify the end user against such claims. Is that ethical? Or is it just business as usual? For now, from a legal standpoint, generative A.I. isn’t copyright- or brand-safe. According to the Copyright Alliance, as of mid-2024 there were 25 pending court cases regarding generative A.I., including one filed by the very publication the Ethicist writes for. — Jim
⬥
As an artist, I could not disagree more with the Ethicist about the legitimacy of A.I.-generated art. I have spent thousands of hours in the studio, studying, working and going for a B.F.A. and an M.F.A. I have sought out artists to study with. Most nights in school I was in the studio working until 11 or 12, while also taking early morning classes. I studied anatomy in depth, passing all my exams on that subject. Then I worked for years to pay off all of those loans. I have worked hard to increase the depth of my skill and knowledge for over 50 years. Anyone taking the A.I. shortcut is stealing from me! — Bepe
⬥
As an author and an art historian, I perceive no problems in selling A.I.-generated work if the buyer knows what she or he is buying. I see A.I. as opening doors, not closing them. — Betty
⬥
I feel that the Ethicist did not spend enough time considering the ethics of how large language models (LLMs) are trained — not as burgeoning artists mimicking the masters to hone their skills, but as tools produced by incredibly wealthy companies that trampled artists’ preferences and did not pay for the use of their work in a for-profit venture. Further, in its training, the A.I. did not take the “ideas” of the work, but the literal work itself. (I consider it akin to laundered crowdsourcing.) A user who is now generating art with the LLM tools is simply continuing the cycle of theft that allowed the tool to exist in the first place. — Olivia
⬥
I am an illustrator and educator. My work has been published in The New York Times. With every cell of my body, I disagree with the Ethicist’s answer about the value of generative A.I. art. My work was scraped by one of the popular tools, so I don’t see this as a centaur model, but as clear copyright theft. There are two clear problems with generative A.I. art: 1) the use of the expressive art of living, working artists (see the recent European Union study treating this use of work as copyright infringement), and 2) this is not how images are made. No visual artist thinks of a sentence and a picture appears. There is no human judgment present, because each word in a prompt is part of a word/image pair — that is the beating heart of the machine. It is a zombie nostalgia maker (it does not pull from our imagination, but from the road already traveled), creating exploitative products and nothing resembling cultural production. It is ironic that the Ethicist mentions the list of creatives in the credits of a Pixar movie; the richest companies on the planet are unleashing A.I. tools in order to obliterate those workers. This movement is not offering the tools of creativity to the world; it’s encouraging technological feudalism. — Joe
⬥
The Ethicist’s response neglects to consider that using A.I. technologies is not just a matter of negotiating how they consume and regurgitate the work of real artists — it’s also a matter of considering how these technologies are funded, how they consume energy and how they are used for much more insidious purposes than creating fantasy art. As an artist myself, I often reflect on a quote attributed to Joseph Beuys: “You cannot wait for a tool without blood on it.” In the case of A.I., there is just too much blood to be overlooked. Using these tools even for inane purposes normalizes the industry’s ability to disrupt the economy with bloated and misguided investment, to accelerate our climate crisis with excessive energy consumption and to perpetuate dangerous systemic biases. I would recommend instead that the letter writer buy a sketchbook and take a drawing class to discover his own creative voice in a less harmful manner. — Martha
⬥
Is it ethical to profit from somebody else’s work? Pretty much every idea is a permutation of an idea somebody else had. So we’re left with compromise, and to me, copyright law seems a reasonable one. Just as anyone can now write and sell a Sherlock Holmes story because Sir Arthur Conan Doyle’s works are in the public domain, you too can make and sell A.I.-derived images from long-dead artists’ oeuvres. If an artist’s work is not in the public domain, you negotiate royalties. Simple and ethical. But I would never call the result of such a process “art.” — John