Faculty Guidance for AI Use at the University of Northwestern – St. Paul
University of Northwestern – St. Paul is committed to the ethical and responsible use of advanced automated tools and technology (such as generative artificial intelligence and large language models) in order to foster an environment that upholds our mission and values as a comprehensive university devoted to promoting Christ-centered higher learning and scholarship. Faculty have a unique opportunity to benefit directly from these tools in teaching and scholarship and also to shape students' impressions of and practices with these emerging technologies.
...
The following are possible approaches you may take for your course[i]. For each approach, consider how you can provide students with both a clear, values-driven curricular rationale for your chosen approach and training in the responsible and effective use of these tools.
Prohibit All Use
Some curricular contexts require unaided student performance.
In these cases, faculty should be explicit about the details and consequences of the prohibition and should also explain the rationale for it (such as targeted skill development) to help students understand the context and broader professional or disciplinary norms and expectations.
Allow Use Only as Permitted
Some curricular contexts may allow, encourage, or require use of these supporting tools in limited, pre-determined ways.
In these cases, clearly indicate the scenarios, assignments, and learning tasks for which use of these tools is appropriate. It would also be helpful to clarify your rationale, your explicit expectations for citation or acknowledgement, and the limitations, risks, and responsible use of the tools within the permitted contexts.
Allow Broader Use with Explicit Acknowledgment/Attribution
Some curricular contexts may allow, encourage, or require more versatile student use of supporting tools but still require explicit, formal acknowledgement of use and citation.
It would be helpful to explicitly indicate how students should acknowledge the use of these tools. Remind students of the tools’ limitations, risks, and their responsibility for all work and communication they claim as their own.
Freely Allow Use
In some cases, free use of these tools without formal acknowledgement may be appropriate.
In such cases, it is necessary to guide student understanding of the boundaries of responsible use and of the tools' limitations and risks. Students may still need to describe how supporting tools were used, even when formal acknowledgement is not required.
With any approach that permits use of these tools, it may be helpful to ask students to submit a brief description of how they used advanced automated tools along with any course work. Students’ academic work and output should be contextualized within the larger scholarly conversations pertaining to topics and disciplines (e.g., how knowledge is produced, vetted, and valued). See suggested syllabus statements for each of these approaches on the “AI Resources for UNW” page on Confluence.
...
Appropriately acknowledge use of these tools
Effective and responsible use of these tools
Limitations, biases, and risks of these tools
Ethical issues related to the use and development of these tools
Equity, access, and privacy issues related to the commercial nature of these tools
...
When using AI tools like GPT-based chatbots, faculty and students should not enter or request private, protected, or confidential information. These tools cannot guarantee the privacy or security of such data. Data entered into these tools may be collected and used as training data or shared without your knowledge. Anyone using AI tools should thoroughly understand and comply with the stated user agreements.
...
If you suspect a student has used these tools in a way not permitted by your own or departmental explicit, written policies, employ the procedures in the University Policy on Academic Integrity (reach out to the Registrar for assistance with Trad courses or Sarah Arthur for AGS/DE courses).
Be aware that conclusive evidence of a student's use of advanced automated tools in submitted work will likely be difficult to obtain, and tools that detect AI-generated content may not be reliable. For these reasons, prevention and education about responsible, professional use may be better in the long term than relying on detection of misuse. You are encouraged to continually consider ways that assessment design may discourage misuse and encourage responsible use of these tools.
Last updated: 5.25.23
...
[i] Four approaches from the University of Delaware's "Considerations for using and addressing advanced automated tools in coursework and assignments," https://ctal.udel.edu/advanced-automated-tools/