Artificial intelligence (AI) tools and systems are significantly shaping the education landscape, presenting both opportunities and challenges for institutions, faculty, students, and professional staff.
This AI Preparedness Checklist was created by 1EdTech's Emerging Digital Pedagogies Innovation Leadership Network (ILN) to provide institutions with guiding prompts for establishing protocols, policies, and best practices for using AI in teaching and learning.
1EdTech's Artificial Intelligence Preparedness Checklist
Kickstart the important discussion on artificial intelligence at your institution.
We've organized this resource into four sections, each with guiding prompts and guidance to help your conversations and planning meetings.
Need a Printable Version to Use as a Worksheet?
Download the AI Preparedness Checklist
Foster collaboration across your campus, define core values, establish procurement guidelines for AI, engage library and academic stakeholders, strategize, and determine fair fees.
Has an institutional advisory group on AI, led by a senior member of the institutional leadership team (usually within a presidential or provostial portfolio), been established at your institution?
An institutional advisory group on AI in teaching and learning is seen as a strong asset for institutions. Such a group might be given a remit to identify areas in teaching and learning that require an institutional response or guidance, or to inform the resources that will be allocated or created to help faculty, programs, and departments better understand this technology and their options for using AI in their teaching.
While this Preparedness Checklist focuses on AI in teaching and learning, we acknowledge that AI may also impact your institution's general productivity and administrative operations.
Does your institution have a separate group working on AI in those contexts, and is there a structure in place to ensure alignment between AI for teaching and learning and AI in productivity and administrative operations?
As part of their response to AI, a number of institutions have developed a Values Statement to help guide policy development and decision-making. These Values documents can include statements on ethics and equity and establish priorities, for example, around student success or academic freedom.
Has your institution considered or developed such a Values document in relation to AI? If not, which role or body would be responsible for developing and maintaining such a document?
Has your institution developed any RFx (Request for Proposals/Information) or contract language that can address issues related to AI during procurements?
For example, does your institution include language that it must retain the right to turn AI features on or off rather than leave that to the discretion of vendors?
Has your institution's library, bookstore, and/or academic administration units been engaged in the AI conversation?
Are relevant staff checking updated license terms and contracts to ensure that the institution's rights and privileges are protected? Are relevant units connected to the overall governance structures being put in place? Increasingly, publishers are attempting to embed restrictions on how licensed content may be used with AI tools.
This could include licensed content that is part of an institution’s library/journals collection or for course textbooks. Typically, institutions have more than one unit involved in acquisitions, such as a library, bookstore, and/or academic administration (for example, an institution may have a textbook-inclusive access program).
Does your institution have a process for reviewing possible changes to products introducing AI elements?
Does your institution have existing master agreements that override any unfavorable changes to terms and conditions and/or privacy policies?
Increasingly, we are seeing companies trying to add AI components and functionality to their existing products. This can be accompanied by unilateral changes to a company’s published terms and conditions and/or privacy policies, allowing it to leverage (or exploit) an institution’s data to deliver those AI services in ways that a school may not have intended.
Has a relevant communications plan or strategy been implemented as part of your institution's governance or oversight process?
Have appropriate communications staff been assigned relevant tasks, such as standardizing communiques and maintaining standardized websites?
One challenge many institutions face is decentralization; a particular unit may be working on some aspect of this area and may issue communications or publish guidance that has not been vetted by your governance process and may be in conflict with centrally issued guidance.
Does your school have mechanisms in place to mitigate these potential risks?
Has your institution provided an opportunity for students to offer their insights and opinions on the use of AI in teaching and learning?
Has your institution established or updated any policies or guidelines on ancillary or similar fees for AI teaching tools?
Some AI applications may require a licensing or subscription fee. Has a separate budget been set aside?
Prioritize instructor guidance, safeguard student information, seek legal counsel, and align guidelines with evolving AI technology to ensure ethical and secure use of outside tools.
Has your institution developed or updated an institutional policy regarding the use of third-party tools by faculty where no contract or agreement is in place to protect PII or intellectual property?
In other words, can students refuse to use such tools, even if an instructor ‘requires’ them?
Does your institutional policy address the scenario where a faculty member has uploaded student work (aka student IP) into such a 3rd-party tool without the student’s consent (for the purposes of trying to catch ‘cheating’ or otherwise)?
Does your institution have standardized reviews or audits of vendor security and privacy policies and practices, especially for educational technologies?
If yes, have those review and audit protocols been examined to make sure that they can cover AI-type tools?
For example, many AI companies rely on input data to train their models and applications and may have included in their Terms & Conditions non-revocable use of data and metadata generated by members of your school community. Do your review and audit procedures look for such terms, and do they attempt to mitigate any associated risks or reject those terms outright?
Have you involved your legal counsel in developing or reviewing any relevant policies?
Have you involved your legal counsel in reviewing either RFx (Request for Proposals/Information) tender documents and/or contracts with suppliers (this might include legal counsel’s review of Terms and Conditions, for example)?
Some institutions have mature Research Ethics guidelines for quality assurance and quality improvement (QA/QI) studies, program evaluation activities, performance reviews, or testing within normal educational requirements when used exclusively for assessment, management, or improvement purposes.
For faculty who may intend to research how they use AI tools within their courses, does your institution have such guidelines, and have they been reviewed or updated to ensure that they align with new developments in AI technology?
Does your institution have any formal structures or organizations in place that deal with issues of equity, inclusiveness, bias, and/or accessibility?
If yes, have representatives from those bodies been included in your institutional conversations about AI in teaching and learning?
Has your institution developed any formal language regarding equity, inclusivity, bias, and/or accessibility that can be included in policies, guidance, RFx/tender calls, and/or contracts and agreements as it pertains to AI?
Adapt teaching and testing methods for students using AI tools, prioritize support for faculty with resources to design effective assessments, and foster an environment of innovation and educational excellence.
How can instructional design and assessment approaches be adapted to account for students' now easy access to AI tools?
This topic is currently the most explored AI-related conversation after academic integrity and detection, and it is being discussed at most institutions.
It is not the intent of this checklist to propose or recommend specific approaches, as that would entail an entirely different kind of document and approach. However, institutions should be working towards developing resources and support for instructors and instructional designers, with a particular focus on assessment design and evaluation.
As such, this item is directly related to the Faculty and Professional Staff literacy questions above, with the additional question of whether institutions have begun assembling, documenting, and sharing assessment approaches, with an emphasis on updated authentic assessment techniques.
Has your institution developed or updated an institutional academic integrity policy regarding the use of tools that purport to detect the use of AI in student work?
Does your institution have a mature escalation pathway for cases of academic integrity violations that may result from the use of AI tools?
Has your institution created sample statements for faculty to include in course syllabi and course assignments to help shape the message to students about what AI technology is or is not allowed?
Embrace academic integrity through mastering proper citations and staying informed about AI-generated work. Establish a dedicated copyright unit, provide comprehensive AI training, and develop user-friendly modules to facilitate learning.
Does your institution have any standardized protocols regarding citations?
Are those decisions left to the discretion of individual faculty members (e.g., based on the discipline of study, MLA, APA, etc.)? Has your institution initiated any faculty development and/or student information literacy training regarding proper citation of AI-generated or AI-assisted writing and research?
Does your institution have an office or unit that deals with copyright issues (training, advice, support, etc.)?
One of the biggest concerns around the use of AI tools, particularly Generative AI tools, is the inappropriate use of copyright-protected materials, especially in Language Model training or prompt writing.
In other words, the risk of individuals uploading content for which they do not have copyright clearance into AI tools (legal risks, reputational risks, etc.). These offices are often affiliated with library systems but may be located elsewhere. If not, do you have another unit that has been identified to support issues around AI and copyright?
If yes, has your existing copyright support office begun to develop guidelines, training, support, etc., in this domain? Is that unit’s work aligned with other governance processes in place at your institution?
Has your institution developed or delivered any training programs or sessions for students related to their use of AI?
If not, would that be an institution-wide initiative, or, based on the culture of your school, would it be left to departments or even individual faculty? If left to departments or faculty, has your institution provided guidance and/or materials for them to use?
We note that several institutions have begun to develop self-paced modules or courses for students, often delivered via an LMS. In some cases, these have been offered to other institutions via Creative Commons licensing. Has your institution begun investigating the efficacy of this approach in your organizational context?
Has your institution created or delivered any faculty development sessions or materials related to the use of AI in teaching?
If not, who would be responsible within your institution for delivering this kind of faculty development (e.g., a teaching centre, institutional librarians, etc.)?
Has your institution created or delivered any professional development sessions or materials related to the use of AI?
If not, who would be responsible within your institution for delivering this kind of professional development (e.g., a professional development center, institutional librarians, etc.)?
The use of AI by professional staff may extend beyond the use of AI in teaching and learning and into corporate and productivity management, for example, through the use of productivity tools that are being enhanced with AI capabilities by vendors.
Of particular concern is the inadvertent uploading of confidential information. Has your institution begun to develop policies and procedures for this context? (See section on copyright-related questions as well).
Does your institution foster and support the creation of communities of practice?
If yes, has your institution created such (a) community(ies) around the norms of behavior and best practices for using generative AI in teaching and learning?
Has your institution designated an institutional facilitator role for such a CoP? Does your institution have a formal structure or process in place where CoP conversations can contribute to governance (see the leadership category above), and vice versa?
Does your institution offer any career services for either students/graduates and/or existing professional staff?
If yes, have career services professionals been included in the conversations around AI? Are there structures or processes in place that ensure trends and issues seen by career services staff can contribute to governance conversations?
When it comes to fostering experimentation, has your institution set up any environments (“walled gardens”) and supports for those who wish to experiment with AI, particularly in a teaching and learning context?
This could be for faculty (and staff) as well as for students. These environments might be for experimenting safely with Generative AI tools, as well as tools for those who may wish to try and develop or train their own language models and applications.
If not, where in your institution's structure would this best be situated? Some readers of this checklist may sense an undercurrent of centralized control in these questions; that is not the intent. In particular, institutions will need to find a balance between protecting their community (intellectual property, personal information, academic freedoms, etc.) and fostering development and experimentation with AI.