How to Create a Solid AI Statement for Your Syllabus | Faculty Guide


Integrating generative AI tools into your classroom requires clear communication with students about expectations and boundaries. A well-crafted AI policy in your syllabus not only sets standards for academic integrity but also guides students toward ethical and productive AI use. There is no single “right” policy; the right one is the one that you can commit to, communicate clearly, and assess fairly. Before writing your statement, consider exploring the following resources to guide content and wording:

  1. Examples of content-specific syllabi statements
  2. Website for simple generation of statements 

Below are examples of three different perspectives on AI use: Against, Avoid, and Adopt.


Against: AI use is restricted 

Choosing to restrict AI use is a legitimate pedagogical decision. In some courses, such as developmental writing, clinical skills labs, oral communication, and hands-on technical programs, the thinking process itself is the learning outcome. If AI generates the product, the learning does not occur. That said, a no-AI policy is also the hardest to enforce without clarity and consistency. The key is specificity: not just "no AI," but what counts as AI, what the consequences are, and how you'll handle gray areas like grammar checkers or autocomplete.

Sample Syllabus Statement 

The use of artificial intelligence tools (such as ChatGPT, Claude, Copilot, Gemini, or similar tools) is not allowed in this course. All submitted work must reflect your own thinking and be written in your own words. Using AI to generate, rewrite, summarize, outline, or otherwise produce course content, whether in full or in part, counts as academic dishonesty and will be addressed under Kirkwood’s Academic Integrity Policy. This policy supports the core goal of the course: developing your own skills. Tools such as spell-check and basic grammar support built into Word or Google Docs are permitted. If you are unsure whether a tool is allowed, ask before submitting your work.

In Practice 

    • Assignments emphasize in‑class work, personal reflection, live demonstrations, or process‑based tasks. 

    • Major projects include checkpoints or are completed partially under instructor supervision. 

    • Assignment instructions restate “no AI use” instead of relying on the syllabus alone. 

    • Rubrics reward original thinking and process, not polish. 

    • The same standard is applied consistently from the first assignment to the last. 


Avoid: Limited AI Use  

A partial‑use policy (“some AI is okay”) is currently the most common approach, and the most likely to lead to inconsistency if not intentionally designed. Without clear boundaries, students often interpret limited permission as broad approval. Partial‑use policies work best when they: 

  1. Distinguish phases of work (e.g., brainstorming vs. drafting),
  2. Separate task types (research vs. writing vs. calculation), or
  3. Define AI’s role (assistant vs. author).

Equally important is a disclosure requirement. Asking students to document their AI use supports accountability and helps them develop metacognitive awareness of when AI assists learning and when it replaces it. Here is an example AI Disclosure Statement.

Sample Syllabus Statement 

Artificial intelligence tools may be used in this course only in the specific ways outlined for each assignment. Assignment instructions will clearly state whether AI is allowed and for what purpose. When AI use is permitted, you must include a brief AI Disclosure Statement with your submission, describing how the tool was used. Any AI use beyond what is explicitly allowed for an assignment, including using AI to draft or rewrite content, will be treated as academic dishonesty. You remain responsible for all submitted work. This includes checking AI output for accuracy and revising it meaningfully. Any errors or inaccuracies in AI-generated content are your responsibility.

In Practice 

    • Every major assignment explicitly states whether AI is allowed and for what purpose. 

    • AI Disclosure Statements are required whenever AI use is permitted. 

    • Rubrics assess critical judgment, revision quality, and accuracy, not AI output. 

    • Expectations are reinforced before each major assignment. 

    • The policy is enforced consistently across low‑ and high‑stakes work. 


Adopt: AI as a Professional and Analytical Tool 

This approach treats AI as a normal professional tool, similar to a calculator or Excel, while still requiring students to think critically about how and why they use it. Students may use AI throughout their work, but they remain fully responsible for the accuracy, quality, and originality of everything they submit. This approach works well in courses that reflect real‑world practice and in courses where instructors want to explicitly teach AI literacy, ethical use, and critical evaluation. In some assignments, AI may function as a support tool; in others, AI output itself may become something students are asked to analyze, critique, or improve. To maintain academic integrity, students could be required to declare their AI use and demonstrate meaningful engagement rather than uncritical acceptance of AI output.

Sample Syllabus Statement 

Artificial intelligence tools are permitted and encouraged in this course. You may use AI for any stage of your work: research, brainstorming, drafting, editing, formatting, and problem-solving. Using AI tools effectively is a professional skill, and this course treats it as one. However, you are responsible for everything you submit. This means you must verify the accuracy of AI-generated content, critically evaluate AI output rather than accepting it uncritically, and add meaningful analysis, judgment, and original thinking beyond what AI alone produces. Submitting AI output without meaningful engagement (that is, using AI as a replacement for your own thinking rather than a tool to support it) will result in a grade that reflects the lack of original contribution. All submitted work must include a brief note at the end describing how AI was used (one to three sentences is fine). This is not punitive; it is professional practice. In the workforce, being transparent about your tools is a mark of credibility.

In Practice 

    • AI tools may be used freely for brainstorming, drafting, editing, research support, or idea generation.

    • Some assignments may explicitly require students to evaluate, revise, compare, or critique AI‑generated content rather than simply use it.

    • Students submit an AI Use Declaration Form with their work.

    • Rubrics focus on effective use of AI, quality of analysis, integration of ideas, and evidence of student judgment.

    • Assignments require elements AI cannot provide on its own, such as local context, personal experience, class‑specific material, or in‑class explanation.

    • Students are able to explain and defend their work; follow‑up questions or reflections may be required.

The resources below may help you identify typical assignments that align with this policy. 

  1. The AI Pedagogy Project (AI Assignments examples) from Harvard’s metaLAB
  2. Building Bridges of Knowledge (BBK), The City University of New York

These assignments include tasks in which students critique and critically evaluate AI outputs, for example by conducting an “interview” with a historical figure generated by AI. AI tools may be part of an assignment, but equivalent materials must be available for students who choose not to use AI.


In Summary: Choosing Your AI Policy 

If you choose Against (AI use is restricted):

    • Your key moves are: Name specific tools and define what counts as AI; design process-based assignments with checkpoints; restate the policy on every assignment sheet; have an explicit Day 1 conversation; apply rubrics that reward process and original thinking over polish.

    • Watch out for: Vague prohibitions students can work around; inconsistent enforcement across the semester; not revisiting the policy after Week 1; failing to distinguish permitted tools (spell-check) from prohibited ones.

If you choose Avoid (limited AI use):

    • Your key moves are: Specify AI permissions at the assignment level, not just in the syllabus; distinguish phases of work (brainstorming vs. drafting); require AI Disclosure Statements whenever AI is permitted; define AI's role (assistant vs. author); enforce consistently across low- and high-stakes work.

    • Watch out for: "Limited permission" creeping into broad approval; unclear lines between assignments; treating disclosure statements as optional; students assuming AI output is automatically accurate and not verifying it.

If you choose Adopt (AI as a support tool and a subject of analysis):

    • Your key moves are: Frame AI as a professional skill, not a shortcut; require brief AI use disclosures with all submissions; design assignments that require judgment, local context, reflection, or in‑class explanation; include tasks where students evaluate, critique, or improve AI output; assess reasoning and decision‑making, not AI quality.

    • Watch out for: “You can use AI” becoming “AI does the work”; assignments generic enough that AI output earns full credit; grading AI polish instead of student thinking; students unable to explain or defend their work.

Kirkwood Community College  |  AISD 


A Final Word 

Writing an effective AI policy isn't a one-time task; it's an ongoing teaching practice. The most important thing is not which policy you choose, but whether you've thought carefully enough about your course's learning outcomes to justify it, communicated it clearly enough that students can follow it, and committed to it consistently enough that it remains meaningful. 

Questions about AI policy development, assignment design, or instructional strategies? Reach out to the AISD team. 


Related Articles

    • A T.R.E.A.T. For Your Syllabus: An AI Syllabus Policy Framework

    • AI & Assessments: A Thoughtful Approach for Faculty

    • What should I include in my Generative AI syllabus policy?

    • The T.E.A.C.H. Framework for AI Literacy

    • Can AI . . . create rubrics?