The recent release of powerful generative AI tools has unsettled higher education. Tools such as ChatGPT, Gemini Advanced, DALL-E, LLaMA, Claude, Pi, and Copilot are developing rapidly. These tools can generate and revise text, code, and images in response to prompts, raising questions about academic integrity, the nature of our assignments, and new ways we may ask students to think and learn.

Below, you can find an overview of these generative AI tools and potential ways faculty might respond.

Keep in mind that we are still in the early days of considering how these tools can be used and what their impacts will be.

For additional considerations related to using these tools, read the statement from Information Systems and Computing (ISC), Guidance on Use of Generative Artificial Intelligence.

If you would like to discuss concerns related to generative AI and your teaching, CETLI staff are available to meet. CETLI can also work with programs or departments to bring instructors together to consider strategies and implications related to this new platform.

What is Generative AI? What Does It Do?

ChatGPT was the first of what are now many generative AI chatbots that produce coherent written responses and computer code in response to a user’s natural language prompts and adapt their responses based on user feedback or additional prompts. Many are available for free (usually with user registration), but some of the most advanced tools, such as GPT-4, require a paid subscription. 

These tools have already been used to produce a wide range of essays, exam solutions, letters, fiction, blog posts, and many other outputs at various levels of quality and accuracy. Materials are usually clearly structured and grammatically correct. They can also generate code and fix coding bugs. They can tackle certain problem sets. And these tools can produce summaries of readings, including materials from PDFs.

While there is a lot generative AI can do, it also has varied and changing limitations. These tools periodically generate what have come to be called “hallucinations”: completely made-up information that looks plausible. Some, but not all, of the tools struggle with referencing sources. Some might produce citations that look correct but do not refer to an actual article or book. Others may present information without any way to verify where it comes from or whether it is valid. Similarly, generative AI can produce harmful, biased, misleading, or simply inaccurate content. All of this means that knowing a field is helpful for using generative AI most effectively.

Additionally, many tools lack knowledge of anything after their training cutoff dates (2021 for ChatGPT; 2023 for Claude), and only some have access to search the internet.

Alongside these performance concerns, generative AI raises concerns related to privacy and intellectual property. Penn's Office of Privacy, Audit and Compliance notes instructors' responsibility to protect students' privacy when using generative AI. 

  • Do not enter any information that could identify a student, particularly student names or detailed descriptions of student work or engagement with the class.
  • Do not enter student work (papers or projects), even anonymized, without the student's permission.
  • Do not require that students enter their own work or other personal information into generative AI.

If you want to understand better how these chatbots work, The New York Times has a series called How to Become an Expert on A.I. that can help.

Ways to Address Generative AI: Academic Integrity and Rethinking Assignments

The rapid spread of generative AI means instructors will need to be clear with students about their expectations regarding use of these tools. Further, at this point, there is no clearly effective automated way to detect AI-produced materials. That means instructors who want to limit their students' use of generative AI, to ensure students are doing the thinking the assignments are designed to cultivate, may need new approaches. See the strategies below for ideas.

Under Penn’s Code of Academic Integrity, students may not use unauthorized assistance in their academic work. It is up to instructors to decide what that means and to let students know. While some instructors may consider any significant use of AI-generated work to be an academic integrity violation, others may allow students to use AI-generated content in particular instances, such as for brainstorming, to inform revisions, or for a particular assignment. Some may allow students to use generative AI as long as they disclose their use of such tools, with the reminder that students are responsible for the accuracy of what they submit. Therefore, it’s important to be clear about the policies for your class, or for particular assignments or activities, and whether they will vary.

You should include your policy in your syllabus and on Canvas. Policies might include:

  • You are not allowed to use generative AI (such as ChatGPT) for your work in this class. Using such tools will be considered a violation of Penn’s Code of Academic Integrity, and suspected use will be reported to the Center for Community Standards and Accountability. Please contact me if you have questions about this policy.
  • You may use generative AI tools for your work in this class, but you must contact me first so we can discuss how you plan to use these tools and how you will indicate their use in your work. If you do not first discuss it with me, using such tools will be considered a violation of Penn’s Code of Academic Integrity.

You can find additional sample policies from Penn faculty.

You will also likely want to address this in class, especially when introducing new assignments or projects.

Being transparent with students about the purpose of an assignment can help students appreciate what they are learning, the importance of the skills they are developing, and the excitement of creating their own ideas. Consider discussing with students the ways that having core knowledge in a field makes their use of generative AI more powerful, so they understand the value of that learning even with access to these tools.

Seeing what the tools produce can help you think about what you’re assigning to students – and what an AI-generated response might look like. Consider sharing what you’ve found with students. Pointing out any issues you’ve seen regarding the quality and accuracy may help students better understand these tools and their limitations (both for their classwork and beyond).

Be aware that generative AI will give different answers to the same prompt. You cannot assume that you will see the same AI-generated answer that your students see. And students can improve the answers they get by refining their prompts.

Finally, different AI tools have different capabilities and so may provide different kinds of answers. In general, the paid version of GPT-4 is currently far more capable than other tools.

Since most written work, coding, and projects done outside of class can be created using generative AI, if you want students to do that work on their own, consider steps that help you confirm that your students have done the thinking themselves. You can do this in conversation or through adjustments to what students submit.

Oral confirmation checks

Set an expectation that students will meet with you to talk about their work after submitting it. In large classes where this might be difficult, instructors can pick a percentage of students (10-20%) at random for brief conversations about their work. In these discussions, which may last as little as five to ten minutes, instructors ask students about their ideas and process, probing beyond the submitted materials.

Students may find these oral checks stressful. To minimize that, emphasize that you are not grading them on their discussion with you and that they can work through ideas in front of you. You might also note the intellectual value of the exercise.

Have students submit work that verifies their process

This is a way of allowing students to show that they’ve done the work. Some of these approaches also can make students more aware of their learning. For instance, you might ask students to:

  • Submit, along with the assignment, the notes they took on sources to prepare their papers, presentations, or projects.
  • Write briefly about a source or approach they considered but decided NOT to use and why they did not. This could be done in class (more secure) or in advance and handed in with the final assignment.
  • Write briefly in class when submitting the assignment about something specific they learned from doing the work.

Check references to source material

Although generative AI is rapidly improving its ability to cite sources accurately, requiring students to cite sources and reviewing those citations is still a way to spot less subtle uses of these tools. The tools tend to be weaker when citing materials not widely available online.

Ask to see work in progress

Consider asking students to show their work in progress or submit components of assignments in steps prior to the final deadline, so you can see and give feedback on the development of their ideas or work over time. This might include submitting an outline, a list of sources, an explanation of their approach, or a first draft before the final product.

Requiring work done in steps can deter, but not prevent, students from using generative AI if they are determined to cheat; generative AI can also create these intermediate steps if a student works out the right prompts. Such staged assignments, however, can help ensure that students don’t feel pressure to complete an assignment at the last minute and turn to generative AI in the crunch.

Relatedly, consider policies that give students some flexibility when they fall behind in class, or the option to revise and resubmit certain assignments after your feedback when they don’t do well. This may make students less tempted to turn to AI tools when they fall behind or worry they cannot perform at their best.

Making Generative AI Part of Your Teaching

Some instructors may want to incorporate AI tools into their teaching deliberately so that students can develop skills related to working with these tools. Some may find them a useful way to help students learn about their fields or course content. 

In such cases, consider designing assignments that ask students to engage with AI tools and AI-generated materials. You might ask students to:

  • Analyze work that you or they generate from an AI tool. For example, students could:
    • Try to improve or revise code produced by AI.
    • Fix inaccuracies and address gaps from an essay produced by AI.
    • Trace AI assertions back to their sources to examine how we know what we know and the value of sources.
    • Review or reflect on an answer generative AI created in response to something they know about.
  • Experiment with how different prompts, guidance, and directions result in different outputs from generative AI tools and reflect on how the results might inform more productive and appropriate uses of these tools.
  • Hold a dialogue on a course topic with the AI. Students could submit the conversation and/or a reflection on what they learned from the back and forth.

Alternatively, some instructors may decide they want students to practice collaborating with the AI as a thinking partner of sorts. In such cases:

  • Decide what aspects of the work you want students doing on their own and what aspects might productively involve generative AI. For instance, some instructors may decide students can brainstorm ideas with the AI but should draft on their own; others may want students to come up with the ideas themselves but allow them to use the AI for feedback on drafts for revision.
  • If you are open to students using such tools, it can be valuable to have them explain how they have used them, so students are explicit and intentional about their choices.
  • Make clear to students that they are responsible for any inaccuracies in content or citations generated through this collaboration, and that they should own whatever positions they take in submissions.

In other cases, generative AI may support students as they take on more advanced thinking. For instance:

  • Asking AI to serve as an in-class tutor or coach. Instead of using class time to lecture, you might ask students to solve problems on the topic and make use of generative AI tools when they get stuck. Some instructors have tried this in classes that ask students to write computer code to answer questions or analyze data.
  • Encouraging students to work with the tools to help them in a revision process. Here, the prompts students use are important. For instance, students might have AI review an essay and ask for alternate points of view not addressed, or for AI to identify the main points and mirror them back.
  • Having students explain a complex topic to the AI, asking it to pose questions as if it were a college student trying to learn the field.

If you plan to use any generative AI with your students, encourage them to read the specific tool’s privacy policy and terms of use. Since many of these tools collect information on users, give students the option not to use them if they are uncomfortable with how their data is collected and used. You can offer alternatives, such as having students work in groups or providing samples you or others have generated with the tool.