<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[The Intelligent Academic™: AI Governance]]></title><description><![CDATA[Practical guidance for academic leaders navigating AI governance — including policy templates, faculty development strategies, regulatory updates, and frameworks for responsible institutional adoption.]]></description><link>https://www.theintelligentacademic.com/s/ai-governance</link><image><url>https://substackcdn.com/image/fetch/$s_!PPp1!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8b827426-4813-4680-a805-c22ef9098b06_353x353.png</url><title>The Intelligent Academic™: AI Governance</title><link>https://www.theintelligentacademic.com/s/ai-governance</link></image><generator>Substack</generator><lastBuildDate>Thu, 14 May 2026 23:46:38 GMT</lastBuildDate><atom:link href="https://www.theintelligentacademic.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Viber Apps, LLC]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[theintelligentacademic@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[theintelligentacademic@substack.com]]></itunes:email><itunes:name><![CDATA[Billy Oglesby]]></itunes:name></itunes:owner><itunes:author><![CDATA[Billy Oglesby]]></itunes:author><googleplay:owner><![CDATA[theintelligentacademic@substack.com]]></googleplay:owner><googleplay:email><![CDATA[theintelligentacademic@substack.com]]></googleplay:email><googleplay:author><![CDATA[Billy Oglesby]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[A Framework for Responsible AI 
Adoption]]></title><description><![CDATA[A guide for academic and research institutions to navigate the complexities of artificial intelligence with a structured and ethical framework.]]></description><link>https://www.theintelligentacademic.com/p/a-framework-for-responsible-ai-adoption</link><guid isPermaLink="false">https://www.theintelligentacademic.com/p/a-framework-for-responsible-ai-adoption</guid><dc:creator><![CDATA[Billy Oglesby]]></dc:creator><pubDate>Sun, 08 Feb 2026 00:12:19 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!iK8d!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feddb9132-a2c3-4361-b3ba-50b7a622696e_1536x1024.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!iK8d!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feddb9132-a2c3-4361-b3ba-50b7a622696e_1536x1024.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!iK8d!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feddb9132-a2c3-4361-b3ba-50b7a622696e_1536x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!iK8d!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feddb9132-a2c3-4361-b3ba-50b7a622696e_1536x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!iK8d!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feddb9132-a2c3-4361-b3ba-50b7a622696e_1536x1024.heic 1272w, 
https://substackcdn.com/image/fetch/$s_!iK8d!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feddb9132-a2c3-4361-b3ba-50b7a622696e_1536x1024.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!iK8d!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feddb9132-a2c3-4361-b3ba-50b7a622696e_1536x1024.heic" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/eddb9132-a2c3-4361-b3ba-50b7a622696e_1536x1024.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:380742,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://theintelligentacademic.substack.com/i/187247206?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feddb9132-a2c3-4361-b3ba-50b7a622696e_1536x1024.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!iK8d!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feddb9132-a2c3-4361-b3ba-50b7a622696e_1536x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!iK8d!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feddb9132-a2c3-4361-b3ba-50b7a622696e_1536x1024.heic 848w, 
https://substackcdn.com/image/fetch/$s_!iK8d!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feddb9132-a2c3-4361-b3ba-50b7a622696e_1536x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!iK8d!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feddb9132-a2c3-4361-b3ba-50b7a622696e_1536x1024.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>Artificial intelligence is no longer a futuristic concept but a present-day reality that is rapidly transforming academia
and research. From automating administrative tasks to accelerating data analysis, the potential of AI is immense. However, with great power comes great responsibility. The unguided adoption of AI technologies can lead to a host of ethical, legal, and reputational risks, including biased outcomes, privacy violations, and a lack of accountability. To harness the full potential of AI while mitigating its inherent risks, institutions need a robust framework for responsible AI adoption.</p><p>This article presents a comprehensive framework designed to guide academic and research institutions in their journey toward responsible AI adoption. It integrates five core principles of ethical AI with a four-phased adoption process, providing a clear roadmap from initial assessment to long-term monitoring.</p><h3>The Core Principles of Responsible AI</h3><p>At the heart of any responsible AI strategy is a set of core principles that serves as a moral compass for the development and deployment of AI systems. These principles, adapted from the <a href="https://professional.dce.harvard.edu/blog/building-a-responsible-ai-framework-5-key-principles-for-organizations/">work of leading institutions like Harvard University</a>, provide a foundation for ethical AI governance.</p><ul><li><p><strong>Fairness:</strong> AI systems must be designed and implemented to ensure equitable outcomes for all individuals and groups. This means actively working to identify and mitigate biases in data and algorithms.</p></li><li><p><strong>Transparency:</strong> The inner workings of AI systems should be understandable and explainable to the extent possible. This includes providing clarity on the data used to train the AI, the logic behind its decisions, and the potential limitations of the system.</p></li><li><p><strong>Accountability:</strong> There must be clear lines of responsibility for the outcomes of AI systems. 
Since AI itself cannot be held accountable, institutions must establish a governance structure that designates who is responsible for the development, deployment, and oversight of AI.</p></li><li><p><strong>Privacy:</strong> The privacy of individuals must be protected at all stages of the AI lifecycle. This involves implementing robust data protection measures to safeguard personally identifiable information.</p></li><li><p><strong>Security:</strong> AI systems and the data they rely on must be secure from both internal and external threats.</p></li></ul><h3>A Phased Framework for AI Adoption</h3><p>While principles provide the &#8220;why&#8221; of responsible AI, a phased framework provides the &#8220;how.&#8221; This four-phased approach, inspired by <a href="https://business.adobe.com/resources/sdk/the-ai-inflection-point.html">Adobe&#8217;s responsible AI adoption model</a>, offers a structured process for implementing AI in a way that is both strategic and ethical.</p><ol><li><p><strong>Assess.</strong> The journey begins with a thorough assessment of the institution&#8217;s readiness for AI. This involves a comprehensive audit of the existing technical infrastructure, governance frameworks, AI literacy, and data management practices.</p></li><li><p><strong>Pilot.</strong> Before a full-scale rollout, it is essential to pilot AI solutions in a controlled environment. This allows the institution to test the technology, evaluate its impact on a smaller scale, and identify any unforeseen challenges.</p></li><li><p><strong>Scale.</strong> Once a pilot has proven successful, the next step is to scale the AI solution across the institution. This requires careful planning and execution to ensure a smooth transition and to maximize the benefits of the technology.</p></li><li><p><strong>Monitor.</strong> The adoption of AI is not a one-time event but an ongoing process that requires continuous monitoring and evaluation. 
This final phase involves tracking the performance of the AI system, assessing its impact on key metrics, and ensuring that it continues to operate in a fair, transparent, and accountable manner.</p></li></ol><h3>Integrating Principles and Phases</h3><p>The true power of this framework lies in the integration of the five core principles within each of the four adoption phases. For example, during the Assess phase, the principle of Fairness would guide the evaluation and selection of AI vendors. In the Pilot phase, Transparency would shape how the system&#8217;s data sources and decision logic are documented and explained to participants. During the Scale phase, Accountability would be paramount, with clearly designated ownership for each deployed system. And in the Monitor phase, Privacy and Security would remain ongoing concerns as data flows and threat landscapes evolve.</p><p>By weaving these ethical principles into the fabric of the AI adoption process, institutions can create a culture of responsible innovation that builds trust with students, faculty, staff, and the wider community.</p><p>The adoption of artificial intelligence presents both immense opportunities and significant challenges for academic institutions. By embracing a framework for responsible AI adoption that is grounded in ethical principles and a structured implementation process, institutions can navigate this complex landscape with confidence. 
The future of AI in academia is not just about what we can achieve, but how we achieve it.</p>]]></content:encoded></item><item><title><![CDATA[A Practical Guide to AI and Academic Integrity for Faculty]]></title><description><![CDATA[Strategies for fostering a culture of integrity and adapting your pedagogy in the age of artificial intelligence.]]></description><link>https://www.theintelligentacademic.com/p/a-practical-guide-to-ai-and-academic</link><guid isPermaLink="false">https://www.theintelligentacademic.com/p/a-practical-guide-to-ai-and-academic</guid><dc:creator><![CDATA[Billy Oglesby]]></dc:creator><pubDate>Sun, 08 Feb 2026 00:02:40 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!24fT!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a4b3f68-b5c4-43c7-9d99-75bcb71d6461_1536x1024.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!24fT!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a4b3f68-b5c4-43c7-9d99-75bcb71d6461_1536x1024.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!24fT!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a4b3f68-b5c4-43c7-9d99-75bcb71d6461_1536x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!24fT!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a4b3f68-b5c4-43c7-9d99-75bcb71d6461_1536x1024.heic 848w, 
https://substackcdn.com/image/fetch/$s_!24fT!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a4b3f68-b5c4-43c7-9d99-75bcb71d6461_1536x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!24fT!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a4b3f68-b5c4-43c7-9d99-75bcb71d6461_1536x1024.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!24fT!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a4b3f68-b5c4-43c7-9d99-75bcb71d6461_1536x1024.heic" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/1a4b3f68-b5c4-43c7-9d99-75bcb71d6461_1536x1024.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:396180,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://theintelligentacademic.substack.com/i/187246709?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a4b3f68-b5c4-43c7-9d99-75bcb71d6461_1536x1024.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!24fT!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a4b3f68-b5c4-43c7-9d99-75bcb71d6461_1536x1024.heic 424w, 
https://substackcdn.com/image/fetch/$s_!24fT!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a4b3f68-b5c4-43c7-9d99-75bcb71d6461_1536x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!24fT!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a4b3f68-b5c4-43c7-9d99-75bcb71d6461_1536x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!24fT!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a4b3f68-b5c4-43c7-9d99-75bcb71d6461_1536x1024.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>The rapid proliferation of generative artificial intelligence has introduced a new and complex set of challenges to higher education. For faculty, the foremost concern often revolves around academic integrity. The ease with which students can generate text, solve problems, and even create entire essays using AI tools has understandably led to widespread anxiety about the future of academic honesty. However, a singular focus on detecting AI-generated content is not only unreliable and potentially inequitable in practice, but also distracts from a more fundamental opportunity: to rethink our pedagogical strategies and foster a more robust culture of academic integrity.</p><p>This post offers a practical guide for faculty seeking to navigate this new terrain. Drawing on recommendations from leading institutions, we will explore proactive strategies for setting clear expectations, designing resilient assessments, and responding to potential misconduct in a way that is both fair and pedagogically sound.</p>
      <p>
          <a href="https://www.theintelligentacademic.com/p/a-practical-guide-to-ai-and-academic">
              Read more
          </a>
      </p>
   ]]></content:encoded></item><item><title><![CDATA[Creating Your Department's AI Use Policy]]></title><description><![CDATA[Navigating the complexities of AI integration in higher education and crafting a policy that fosters innovation while upholding academic integrity.]]></description><link>https://www.theintelligentacademic.com/p/creating-your-departments-ai-use</link><guid isPermaLink="false">https://www.theintelligentacademic.com/p/creating-your-departments-ai-use</guid><dc:creator><![CDATA[Billy Oglesby]]></dc:creator><pubDate>Sat, 07 Feb 2026 23:55:02 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!gUwp!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F056b8481-b188-4139-881e-61e5eceb8981_1024x1024.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!gUwp!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F056b8481-b188-4139-881e-61e5eceb8981_1024x1024.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!gUwp!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F056b8481-b188-4139-881e-61e5eceb8981_1024x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!gUwp!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F056b8481-b188-4139-881e-61e5eceb8981_1024x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!gUwp!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F056b8481-b188-4139-881e-61e5eceb8981_1024x1024.heic 1272w, 
https://substackcdn.com/image/fetch/$s_!gUwp!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F056b8481-b188-4139-881e-61e5eceb8981_1024x1024.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!gUwp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F056b8481-b188-4139-881e-61e5eceb8981_1024x1024.heic" width="1024" height="1024" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/056b8481-b188-4139-881e-61e5eceb8981_1024x1024.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1024,&quot;width&quot;:1024,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:117421,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://theintelligentacademic.substack.com/i/187246118?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F056b8481-b188-4139-881e-61e5eceb8981_1024x1024.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!gUwp!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F056b8481-b188-4139-881e-61e5eceb8981_1024x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!gUwp!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F056b8481-b188-4139-881e-61e5eceb8981_1024x1024.heic 848w, 
https://substackcdn.com/image/fetch/$s_!gUwp!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F056b8481-b188-4139-881e-61e5eceb8981_1024x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!gUwp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F056b8481-b188-4139-881e-61e5eceb8981_1024x1024.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>The rapid proliferation of generative artificial intelligence is reshaping higher education, presenting both unprecedented 
opportunities and significant challenges. As students and faculty increasingly turn to AI tools for a wide range of academic tasks, the need for clear and effective AI use policies has become paramount. For academic departments, the absence of such a policy can lead to confusion, inconsistency, and a potential erosion of academic integrity.</p><p>This post offers a step-by-step guide for academic leaders to develop a thoughtful and effective AI use policy for their department. By taking a proactive and collaborative approach, departments can create a framework that not only mitigates the risks associated with AI but also harnesses its transformative potential.</p>
      <p>
          <a href="https://www.theintelligentacademic.com/p/creating-your-departments-ai-use">
              Read more
          </a>
      </p>
   ]]></content:encoded></item></channel></rss>