
Faculty Guidance for AI Use at the University of Northwestern – St. Paul

University of Northwestern – St. Paul is committed to the ethical and responsible use of advanced automated tools and technology (such as generative artificial intelligence and large language models) to foster an environment that upholds our mission and values as a comprehensive university committed to promoting Christ-centered higher learning and scholarship. Faculty have a unique opportunity to benefit directly from the use of these tools in teaching and scholarship and to also shape student impressions and practices with these emerging technologies.

Since AI represents a novel domain of tools and methodologies, existing standards of ethics and professionalism in your academic disciplines may not yet adequately address faculty use of these tools. In your own teaching and scholarship, seek to model responsible, God-honoring use of these tools (e.g., by clearly acknowledging sources that contribute to your work).

Approaches to Selecting AI Policies for your Course Activities

Instructors are responsible for considering appropriate uses of advanced automated tools in their courses and individual learning activities, and for explicitly communicating to students the rationale, expectations, parameters, and opportunities involved.

The following are possible approaches you may take for your course[i]. For each approach, consider how you can provide students with both clear value-driven, curricular rationale for your chosen approach and training in responsible and effective use of these tools.

  1. Prohibit All Use

    • Some curricular contexts require unaided student performance.

    • In these cases, faculty should be explicit about the details and consequences of the prohibition, and should also explain its rationale (such as targeted skill development) to help students understand the context and the broader professional or disciplinary norms and expectations.

  2. Allow Use Only as Permitted

    • Some curricular contexts may allow, encourage, or require use of supports in limited, pre-determined ways.

    • In these cases, clearly indicate the scenarios, assignments, and learning tasks for which use of these tools is appropriate. It is also helpful to clarify your rationale, your explicit expectations for citation or acknowledgement, and the limitations, risks, and responsible use of the tools within permitted contexts.

  3. Allow Broader Use with Explicit Acknowledgment

    • Some curricular contexts may allow, encourage, or require more versatile student use of supporting tools but still require explicit, formal acknowledgement of use and citation.

    • It would be helpful to explicitly indicate how students should acknowledge the use of these tools. Remind students of the tools’ limitations, risks, and their responsibility for all work and communication they claim as their own.

  4. Freely Allow Use

    • In some cases, free use of these tools without formal acknowledgement may be appropriate.

    • In such cases, it is necessary to guide student understanding of the boundaries of responsible use and tool limitations or risks. Students may still need to describe how supporting tools were used, even if not explicitly acknowledged.

With any approach that permits use of these tools, it may be helpful to ask students to submit a brief description of how they used advanced automated tools along with any course work. Students’ academic work and output should be contextualized within the larger scholarly conversations pertaining to topics and disciplines (e.g., how knowledge is produced, vetted, and valued). See suggested syllabus statements for each of these approaches on the “AI Resources for UNW” page on Confluence.

Expectations for Responsible Use

In any case where advanced automated tools are used, faculty members are responsible for continually improving their own and their students’ awareness of the following:

  1. Appropriate acknowledgement of the use of these tools

  2. Effective and responsible use of these tools

  3. Limitations, biases, and risks of these tools

  4. Ethical issues related to the use and development of these tools

  5. Equity, access, and privacy issues related to the commercial nature of these tools

When using AI tools like GPT-based chatbots, faculty and students should not enter or request private, protected, or confidential information. These tools cannot guarantee privacy or security of such data. Data entered into these tools may be collected and used as training data or shared without your knowledge. Anyone using AI tools should thoroughly understand and comply with stated user agreements.

These tools are rapidly changing and will therefore require faculty members to keep learning about these areas in order to lead students in navigating them effectively.

Accountability Procedures

If you suspect a student has used these tools in a way not permitted by your own or departmental explicit, written policies, employ the procedures in the University Policy on Academic Integrity (reach out to the Registrar for assistance with Trad courses or Sarah Arthur for AGS/DE courses).

Be aware that conclusive evidence of a student's use of advanced automated tools in submitted work will likely be difficult to obtain, and tools that detect AI-generated content may not be reliable. For these reasons, prevention and education about responsible, professional use may be better in the long term than relying on detection of misuse. You are encouraged to continually consider ways that assessment design may discourage misuse and encourage responsible use of these tools.

Last updated: 5.25.23


[i] Four approaches from University of Delaware’s “Considerations for using and addressing advanced automated tools in coursework and assignments” https://ctal.udel.edu/advanced-automated-tools/
