"I did not really like the AI based questions on the written homework. I found that they actually made it more difficult for me to learn the information as these problems were typically so arithmetically challenging so that AI could assist but the AI would complete the problem incorrectly and teach you the wrong methods of computation."
"I think that in the future, less focus on using AI tools in written homework and more of a focus on providing practice would be helpful in teaching students the content."
"While the AI allowed problems were very interesting, I felt like sometimes they were too complex for the scope of the course"
"The webassigns and the practice problems provided in class helped, but I would say the homework with the AI function did not help me out as much to understanding the concepts being learned"
"... [I would suggest] also get rid of the homework question that uses AI as a form of help"
"While the AI questions can be interesting, they're not useful for the quizzes/exams and end up being pretty frustrating and time–consuming"
I did not forbid AI use on calculus homework problems (most of the grade came from in-class quizzes and exams), but I directed students to use AI to get hints rather than complete solutions. However, I am not sure how widely this hint prompt was used. Prompt for students to get a hint on a problem:
Creating homework solutions (since Spring 2023): Copilot dramatically outperforms chat-based LLMs with real-time course correction.
Example workflow (expert use while leveraging AI speed):
LaTeX + Mathematica integration: write solutions in LaTeX, verifying the computations in Mathematica.
LLMs cannot compute reliably, but they can write about computations when properly guided
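One way to exploit this split in practice (a sketch in Python, with SymPy standing in for Mathematica; the workflow shown is illustrative, not the author's exact setup) is to have the computer do the computing and let the LLM only narrate results that have already been verified:

```python
# Sketch: verify a computation ourselves, then hand only the *verified*
# result to an LLM to write up. SymPy stands in for Mathematica here.
import sympy as sp

x = sp.symbols("x")

# The computation the homework solution should discuss.
integrand = x * sp.exp(-x**2)
antiderivative = sp.integrate(integrand, x)

# Independent check: differentiating the result must recover the integrand.
assert sp.simplify(sp.diff(antiderivative, x) - integrand) == 0

# Only the checked result, as LaTeX the LLM cannot get wrong, goes into
# the prompt for the write-up.
verified_latex = sp.latex(sp.Eq(sp.Integral(integrand, x), antiderivative))
print(verified_latex)
```

The point of the design is that the LLM never sees an unchecked formula: every expression it is asked to write about has already passed a symbolic round-trip test.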
(video sped up)
Email efficiency prompt: a one-line stub becomes a professional quick response (I hope it doesn't sound AI-generated). To use it, I select the email and my one-line stub and send both to an LLM. The response lands in my clipboard; I paste it in, compare it against what I meant to say, and verify before sending.
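The mechanics can be scripted. Here is a minimal sketch in Python that assembles the prompt from the incoming email and the one-line stub; the instruction wording, the function name, and the example messages are my assumptions, not the author's actual prompt. Sending the prompt to an LLM and copying the reply to the clipboard are left to whatever tooling you use:

```python
# Sketch: build an email-expansion prompt from an incoming message and a
# one-line stub. The instruction text below is illustrative, not a
# known-good prompt.
def build_email_prompt(incoming_email: str, stub: str) -> str:
    return (
        "Expand my one-line stub into a brief, professional reply to the "
        "email below. Keep my tone plain so it does not sound AI-generated.\n\n"
        f"Email:\n{incoming_email}\n\n"
        f"My stub:\n{stub}\n"
    )

if __name__ == "__main__":
    prompt = build_email_prompt(
        "Dear Professor, may I take the quiz a day early? Best, A. Student",
        "yes, Thursday 3pm my office",
    )
    print(prompt)
```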
(video sped up)
At UVA, faculty advise students on course selection, career paths, and academic requirements. I created a custom GPT loaded with department documents: course catalogs and prerequisites, degree requirements, common advising scenarios
Access was given to advisors rather than students, so that only vetted advice reaches students.
Cost: a single month of ChatGPT Plus ($20) is enough to create shareable bots that remain accessible to anyone, permanently.
Pressing questions from my colleagues, where LLMs are mostly inadequate so far:
OCR for handwritten notes: mass-converting handwritten math drafts into LaTeX.
Hand-drawn figure sketches to TikZ: transforming rough diagrams into complete figures with an LLM yields abysmal results. It works much better if you describe the figure in words.
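As a concrete illustration of "describe the figure in words": a verbal spec such as "a unit circle with a radius drawn at 45 degrees and the angle labeled theta" is the kind of description an LLM handles far better than a scanned sketch. The TikZ below is a hand-checked sketch of what such a description maps to, not LLM output:

```latex
% Figure specified in words: unit circle, radius at 45 degrees, angle labeled.
\documentclass[tikz]{standalone}
\begin{document}
\begin{tikzpicture}
  \draw (0,0) circle (1);            % unit circle
  \draw[->] (0,0) -- (45:1);         % radius at 45 degrees
  \draw (0,0) -- (0.5,0);            % reference segment along the x-axis
  \draw (0.35,0) arc (0:45:0.35);    % arc marking the angle
  \node at (22.5:0.55) {$\theta$};   % angle label
\end{tikzpicture}
\end{document}
```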
Are we widening the gap between AI-savvy and traditional students?
What happens when the AI is wrong and students lack expertise to know?
Is efficiency worth the pedagogical cost?
It helps to have an honest discussion with students on the first day about the AI policy for the class: What are students comfortable with? What should we allow, and what should we not?
Having students invested in the decision helps.
Continue the conversation