Course Policy on the use of Generative AI Tools
You are permitted to use “Generative AI Tools” (such as ChatGPT, GitHub Copilot, and others) according to a set of rules. Note that some assignments explicitly ask you not to use GenAI tools.
My general rule of thumb is: treat an AI tool as you would another student. You can talk to it and get help from it — it is a helper, not a replacement — but you cannot have it do your assignments for you.
- Students are responsible for what they turn in. If a helper (automated or otherwise) gives you bad advice, the responsibility is yours.
- Students must disclose the tools they use, how they used them, and why.
- Students must not use GenAI tools in situations where I specifically ask them not to (e.g., content surveys).
- Tools should be an assistant in your learning, not a replacement for your efforts.
- If you are in doubt, ask the course staff.
For many programming assignments, you will be encouraged to use an AI tool (such as GitHub Copilot). Our experience is that such tools are quite capable of handling the basic parts of the assignments (in part because example solutions have leaked into their training sets). We hope that students use them to complete those basic parts, take the time to understand what the tool has done, and then build on that baseline to create more interesting things.
Using these tools effectively is a skill you can, and should, develop.