FEATURE

PARTNERS IN POSSIBILITY

Menlo’s Human-Centered Approach to AI

COVER IMAGE:

Illustration by Claire Dickman ’26


By Liza Bennigson, Senior Writer and Content Strategist

Video: Jacob Gordon, Videographer

“Code Is Poetry,” reads a poster in Menlo’s computer science lab, a whimsical space designed for innovation. Scattered amidst the laptops are dozens of rubber ducks, mascots of coding and of Menlo’s customized AI chatbot (in the form of a talking duck) that has been trained to coach stumped students through coding challenges without revealing the answers.

“The Duck” also reflects a deeper purpose of Menlo’s computer science curriculum: to empower students to engage with new technologies with curiosity, care, and an entrepreneurial spirit, ensuring the tools support—not replace—their own thinking. This philosophy was brought to life when students in Douglas Kiang’s App Design and Development class partnered with David, a community member whose wife lost her voice to ALS. The couple had been communicating by writing notes to each other—requiring them to be side by side—but David envisioned an app that would allow their messages to appear on each other’s devices in real time, from anywhere.

VIDEO: Menlo’s Human-Centered Approach to AI

The App Design students were eager to dive in but had just completed the first level of computer science and hadn’t yet learned the coding language used by iOS apps. So they leveraged AI to learn the language, accelerate coding, and free up time for what mattered most: design thinking, prototyping, iterating, and even spending an “empathy day” without speaking. “Having AI do more of the background work let us really focus on what matters,” said Ananya ’26.

This kind of collaboration exemplifies Menlo’s intentional, human-centered, and values-driven partnership with technology. “In a world of smart machines, our role as an education institution is not only to teach students how to use the technology but to cultivate the human qualities AI cannot replicate—empathy, creativity, ethical reasoning, joy, and the capacity to connect and engage deeply with one another and the world around us,” said Head of School Than Healy.

Students presenting their projects in the Ethical Leadership in the Age of AI MTerm course

EXPLORING WITH CURIOSITY

Even for those excited about AI, the knowledge that the tools available today will be the weakest and least accurate of our lifetimes can feel daunting for students and educators alike. “It’s a little scary to see that AI is progressing at a rate that makes you think, ‘Wow, maybe what I want to do in the future is not going to be there anymore,’” said Ben ’25. “But, I feel like it would be worse to hide from it; you sort of have to learn how to use it.”

As with anything new, faculty are learning, experimenting, and bringing a range of perspectives to the challenge of integrating AI into meaningful learning experiences. Philosophy teacher Jack Bowen initially shared some of his colleagues’ concerns about potential academic integrity issues. But as he began exploring, his outlook shifted. “This is an awesome tool,” he said, “and it is their future.”

Emphasizing a student-centered approach, Bowen’s class chose a thought-provoking issue they were passionate about and formulated their own questions, opinions, and conclusions before asking their favorite philosophers to weigh in via AI. As they ran back and forth between laptops, excitedly comparing responses from some of history’s greatest thinkers, Bowen continued to stoke their curiosity: “If Socrates agrees with you, how can you push further?” “What have you missed that Aristotle understands?”

Students in Zach Blickensderfer’s Advanced Topics in Computer Science (ATCS) class were similarly stretched to think beyond their own limits. Having spent the first nine months of the year learning about different ways to solve problems independent of AI aid, they were excited for the opportunity to partner with it for their final projects.

“I don’t really want it to do my code for me,” said Damian ’25, who asked AI what concepts he needed to learn in order to build a Guitar Hero-esque 2D rhythm game and which courses would be most helpful for building those skills. “I really enjoy figuring stuff out on my own,” he said. “It’s really hard, but it’s rewarding to look at the end result and think, ‘I did that. That was my own work.’”

“I feel like it would be worse to hide from it; you sort of have to learn how to use it.”

—Ben ’25

Ethics in Tech (Tethics) Club co-leaders recruit new members at the Upper School Club Fair.

BUILDING INTELLECTUAL GRIT

Than Healy worries that losing this process of figuring things out—the productive struggle—is one of the greatest risks of an over-reliance on AI. “What we know about education is that students are better prepared for the world that they’re going to find because of the nights of sitting up with the problem set that wasn’t working itself out or the paper that wasn’t writing itself,” he explained as he shared his concerns with Harvard political philosopher Michael Sandel during his visit to Menlo in March.

In typical Sandel fashion, he responded to Healy’s dilemma with a question. “And so how do we design a curriculum and an educational philosophy that doesn’t deprive the students of the struggle that leads to learning, while at the same time engages them with the remarkable new opportunities presented by AI?”

Finding ways to meaningfully assess the process of learning, not just the product, is a distinction that has grown increasingly vital for Menlo faculty in the age of AI. “We know that the more aware kids are of what they do when they get stuck—how to get unstuck and what approaches help them learn best—the better they are at being lifelong learners, which is essentially what we’re trying to do,” said Kiang.

History teacher Dr. Sabahat Adil and her colleagues are deliberately offering more choice and variety in research topics to increase engagement and ownership, so students don’t want to outsource their assignments to AI. The team is also providing additional scaffolding for students throughout big projects, regularly meeting one-on-one to track where they started and how it’s going.

“This is the part of the process that AI really subverts,” said English teacher Jay Bush. “It can get in the way of their ability to think critically, to move from one step to another, to plan out extended assignments, to do multiple drafts.”

Hanna ’26 feels this tension firsthand. “The process of turning something that you don’t know into something that you do know, that’s something ChatGPT could never recreate,” she said.

Hanna looks back fondly on her days in freshman English, when her class spent weeks trying to make sense of Shakespeare’s Othello. Although they had read the assigned pages, none of them fully grasped the storyline, and their interpretations varied widely. Whereas students today might be tempted to turn to AI for a plot summary, the exchange of diverse perspectives enriched the collective learning experience. “Making the human mistakes that are so integral to learning, gradually increasing understanding, and ultimately coming to a resolution together, I think that’s what learning is all about,” she said.

“Making the human mistakes that are so integral to learning, gradually increasing understanding, and ultimately coming to a resolution together, I think that’s what learning is all about.”

—Hanna ’26

Peer presentations in Ethical Leadership in the Age of AI

“AI Habits of Heart and Mind” project at the 8th Grade Service Learning Fair

Margaret Ramsey's 9th grade English class used AI to provide counterarguments in rhetorical speech writing.

DEVELOPING UNIQUE VOICES

Elsewhere in the English Department, Dr. Rachel Blumenthal doesn’t just wonder about the effects of AI on the teaching and learning of writing; she coauthored a journal article about it. In “Does Writing Have a Future in the Age of AI?,” Blumenthal shares her thoroughly researched ruminations: while AI may assist in composing, it cannot supplant the writer’s role in imagining, questioning, and reshaping the world.

“Your writing is your brand. Your writing is your voice. Your writing is your self, your thinking, made manifest to others,” Blumenthal shared at a fall 2023 AI Assembly. Later, she collaborated with colleagues to develop the Five Tenets of AI Use in the Humanities Classroom to put these ideals into practice. These tenets—AI output is subject to errors; output is a starting line, not a finish line; prompt quality matters; attribution is necessary; and you are responsible for your digital footprint—help clarify expectations, offer examples of generic vs. nuanced prompts, and underscore the responsibility that accompanies ethical AI use.

Ask English teacher Oscar King IV what he thinks about the impact of AI on writing, and he’ll likely quote William Shakespeare’s Macbeth, Act 5, Scene 5.

Life’s but a walking shadow, a poor player
That struts and frets his hour upon the stage
And then is heard no more. It is a tale
Told by an idiot, full of sound and fury,
Signifying nothing.

King frequently uses the idea of “sound and fury signifying nothing” to describe AI’s capacity. “It writes these things that seem really tactile and really rich, but they don’t have any actual utility or existential import,” he explains. He shows his students an AI-generated essay to exemplify what average work looks like, knowing full well his students aren’t willing to settle.

King does appreciate the way AI can give positive feedback and help students build confidence in their own voices. He spends a lot of time helping them develop their “unique thumbprints” by drawing on what AI lacks: visceral reactions, authenticity, flexible imagination, and life experiences. “My students are more the piece of art or the product that I’m trying to make than any of their papers or writing assignments,” he said.

When potential academic integrity issues arise, King simply asks his students, “Why?” He wants to discover where in the process he lost their buy-in and help them understand the value of the work they’re doing. “The goal is not to write the next best textual analysis for the sake of the textual analysis,” he said. “This text should tell you something about yourself, should help other people learn something about you as a thinker, as a feeler, as a connector, and should help you be able to then act better in the world.”

“[AI] writes these things that seem really tactile and really rich, but they don’t have any actual utility or existential import.”

—Upper School English teacher Oscar King IV

Exploring Photoshop’s latest AI tools in Ryan Bowden’s Advanced Photography class

Prepping peer presentations in Ethical Leadership in the Age of AI

“Doubling down on our character work is critical to the kind of transformational preparation we are providing our children.”

—Than Healy

INSTILLING VALUES

“AI tools will magnify the morality of the society that they serve,” said Healy in his 2025 State of the School remarks. “Doubling down on our character work is critical to the kind of transformational preparation we are providing our children.”

To this end, Menlo is gradually developing and introducing AI-Forward courses, purposefully and thoughtfully integrating machine learning into the core educational experience. A new MTerm course for juniors, Ethical Leadership in the Age of AI, explored what makes a good leader through the lens of morality, critical thinking, and conflict resolution. Kiang, King, and Moviemaking teacher Dr. Christina Ri experimented with using AI for everything from designing activities and analyzing student journals to building the course itself. Students leveraged AI to create websites, enlisted it as an improv coach, and used it to sharpen their negotiation skills. “AI-Forward means being proactive about integrating AI into how we solve problems, make decisions, and approach real-world challenges,” reflected one student. “In this class...I learned how to guide AI with clear intentions, question its outputs, and use it to expand my own thinking.”

This fall, Menlo is incorporating a series of seven AI literacy classes into its 9th Grade Seminar to shed light on how large language models work and explore the attendant ethical, environmental, and bias-related issues. In the spring, Blumenthal and Kiang are joining forces to co-teach Literature in the Age of AI, an upper-level English course where students will explore what role AI can—or should—play in authoring culturally relevant literature that reflects and shapes our evolving conception of childhood.

Advanced Photography teacher Ryan Bowden reminds his students that true creativity begins with mastering the craft itself—every brush, layer, and adjustment—before stepping into the evolving world of artificial intelligence. With Adobe’s latest AI tools at their fingertips, Bowden’s class explores issues of ethics, copyright, and artistic integrity, complementing technical skills with values-based learning.

In Menlo’s Middle School, a student-led initiative helped raise the visibility of responsible AI usage. For his Eighth Grade Service Learning Project, Obasi ’29 interviewed local educational experts and used the Middle School Habits of Heart and Mind as a template to shape his four pillars of the AI Habits of Heart and Mind—transparency, ethical behavior, self-awareness, and community—which he presented at the Service Learning Fair in March. “You should have collaboration with AI but you shouldn’t be letting the machine do all the work,” Obasi cautioned his classmates.

“AI-Forward means being proactive about integrating AI into how we solve problems, make decisions, and approach real-world challenges.”

—Ethical Leadership in the Age of AI student

Upper School students with Mr. Blickensderfer at the Computer Science Share Fair

COLLABORATING CREATIVELY

Through professional development, collaboration, and persistent practice, Menlo faculty members are exploring meaningful ways to use AI as a pedagogical copilot to enrich their courses and support student growth.

Many teachers are finding ways to have AI assist with tedious tasks so they can reclaim time for connection, mentorship, play, and more meaningful dialogue with their students. Others are building customized versions of ChatGPT to use as coaches in various areas of the curriculum. Several are partnering with AI to design and enhance dynamic lesson plans.

Math teacher Dennis Millstein is creating an Advanced Topics in Statistics class from scratch, exploring various statistical theories with students in a wide variety of domains. He worked with AI to come up with real-world applications that would lead students to explore, get stuck, and feel motivated to experiment with these different mathematical models.

Through months of collaboration, Millstein and the chatbot mapped out everything from a two-sided game to accompany the “random walk” statistical technique to examples of biological phenomena that let students compare different probability distributions against predictive studies on how a disease will grow or recede. He doubts he could have crafted such an ambitious syllabus without AI, but stressed the vital role of his own judgment in addressing assumptions, misinformation, and bias along the way.

Douglas Kiang consults with a student at the Computer Science Share Fair

LEARNING TOGETHER

“As with the personal computer, the internet, and the smartphone before it, AI presents us with an array of new possibilities and challenges to navigate as a school,” said Healy. “As our entire community learns, iterates, and experiments with AI, we are guided by our shared commitment to shaping kind, ethical, capable humans equipped to make a meaningful impact in the world.”

As it turns out, the best way to help students harness this technology wisely is through the growth and development of the qualities that make us fundamentally human: creativity, critical thinking, humor, resilience, and a strong moral compass. Grounded in these values, collaborations with AI are empowering curious students and transformational teachers to think beyond their own preconceived limits.

“Just because you don’t know how to do something doesn’t mean you have to stop working on it,” exclaimed Ananya of the ALS communication app she helped develop last spring. Her eyes grew wide as she recounted working on a project with real-world applications. “It’s one thing to make something for fun, and it’s another to make something that has an actual purpose and is meant to help people.”

Meanwhile, Mr. Kiang has been busy testing various AI capabilities and sharing his takeaways in a weekly blog series. Reinforcing Ananya’s experience with the ALS app development, he wrote, “Coding is getting easier, but design and empathy are more important than ever. The AI can write code, but it can’t understand your users or the real problems you’re trying to solve.” And then, as if in a nod to his students and colleagues, he added, “That’s where humans come in.”

“AI can write code, but it can’t understand your users or the real problems you’re trying to solve.”

—Douglas Kiang


© 2025 Menlo School. All rights reserved. Design by CDA.