Recognizing the potential and pitfalls of generative artificial intelligence (AI) tools in the workplace, the Nieman Foundation for Journalism at Harvard has created these guidelines for its staff members and contractors, including contributors to Nieman’s three publications: Nieman Journalism Lab, Nieman Reports, and Nieman Storyboard.
When carefully overseen by humans, generative AI tools can be useful in parts of our workflow.
This policy reflects Nieman’s long-standing commitment to ethical reporting and the journalistic principles of truth, accuracy, and transparency. The guidelines are designed to protect Nieman’s reputation and credibility and safeguard audience trust. This document will be updated and modified as needed as AI technology evolves.
The use of AI-generated content, including text, images, audio, and video, is subject to the following guidelines:
We do not publish stories drafted or edited using generative AI tools.
A staff member or designated person will always vet the following material before it is published or incorporated into a story:
- Information gathered through AI research tools
- AI-generated transcriptions and translations
- AI-generated text for social media posts or headlines
- Copyedits suggested by generative AI tools
We will not use AI image generators to create photorealistic depictions of real people or places, or to alter images of real people or places. Any AI-generated illustrations or composites we publish will be clearly labeled, and the image caption will disclose the method of generation. Every visual must serve a clear editorial purpose and uphold our responsibility to inform, not mislead.
Any AI-generated material in our publications or on our websites, including images and translations, will be clearly identified for readers.
In addition, Nieman adheres to the guidelines for generative AI use issued by the Harvard Public Affairs & Communications Office:
- AI should augment human creativity and decision-making, not replace it. Transparency in how AI is used for content creation and decision-making is critical, especially when it comes to automated content. Ethical concerns such as bias, intellectual property, and AI’s role in content should be addressed upfront. Clear communication about the involvement of AI fosters trust with the audience.
- AI can help scale content creation, but quality and relevance must remain a top priority.
- Digital content strategists may use AI to assess what content performs best and optimize content for targeted audiences as well as search engines.
All freelance contributors should familiarize themselves with these policies and follow them in their work for the Nieman Foundation and its publications.
Last updated September 30, 2025