    This AI tutor helps college students reason without giving them answers

March 17, 2026 · 6 min read



    Students using AI to cheat on homework or tests is a source of much discussion. But some scholars argue the greater risk of students using AI is that they will simply not learn.

    Approximately 90% of 1,100 U.S. students surveyed at two-year and four-year colleges in 2025 reported using generative AI for everything from drafting assignments to clarifying complex concepts.

    But when students use AI as a tutor or study partner, not as an immediate answer generator, does it make it easier or harder for them to learn?

We are economists who tried to answer this question by designing an AI tool using ChatGPT’s custom GPT feature, with the chatbot’s web access disabled.

We named the tool Macro Buddy and trained it to guide some students in one of our undergraduate macroeconomics courses at the University of Wisconsin-La Crosse through their reasoning rather than giving them direct answers.

    We found in our research, conducted in spring 2025, that students who used Macro Buddy, alongside peer discussion, earned higher exam scores than students who worked alone, without this AI tutor.

    Meet your new tutor

    One of our macroeconomics courses enrolled 140 undergraduate students, mostly in their first or second year of college, divided across four sections.

Students’ course materials, assignments and exams were identical across all four sections. Students took all exams in person, and they were generally not allowed to use AI tools, collaborate with classmates, or reference notes or other materials during the tests.

    As a result, exam scores reflected what students understood and could explain on their own—without the help of AI or any other outside source.

After all students took their first exam, we randomly assigned each of the four class sections a different study format.

    We prompted one group of students to work individually, without Macro Buddy; another group of students worked in groups, without Macro Buddy; a third group of students worked individually, with Macro Buddy; and a fourth group of students worked in groups, with Macro Buddy.

    We wanted to compare how different study approaches—working alone, working with classmates, using Macro Buddy or combining both—altered how well students did on exams.
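The four conditions described above form a 2×2 design: peer discussion (yes/no) crossed with access to Macro Buddy (yes/no). As an illustration only, the random assignment of sections to conditions can be sketched as follows; the section labels are hypothetical, not the authors’ actual section identifiers:

```python
import random

# Hypothetical section labels standing in for the four class sections.
sections = ["A", "B", "C", "D"]

# The 2x2 study-format conditions: peer discussion x AI-tutor access.
conditions = [
    {"group_work": False, "macro_buddy": False},  # control: individual, no AI
    {"group_work": True,  "macro_buddy": False},  # peer discussion only
    {"group_work": False, "macro_buddy": True},   # AI tutor only
    {"group_work": True,  "macro_buddy": True},   # both combined
]

def assign_conditions(sections, conditions, seed=0):
    """Randomly pair each section with exactly one study-format condition."""
    rng = random.Random(seed)
    shuffled = sections[:]
    rng.shuffle(shuffled)
    return dict(zip(shuffled, conditions))

assignment = assign_conditions(sections, conditions)
```

Because sections (not individual students) were randomized, each student’s study format was determined entirely by which section they had enrolled in.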

    Macro Buddy’s skills

    We trained Macro Buddy with the help of lecture transcripts, slides and homework questions specifically from this macroeconomics course.

    Macro Buddy had internet access turned off, so it relied only on the instructor’s course materials.

    Macro Buddy was designed to act like a tutor, not an answer machine. Instead of giving students complete solutions, Macro Buddy asked follow-up questions meant to guide students toward an answer.

    For example, if a student asked why lower prices might increase consumers’ spending, Macro Buddy would not offer a quick, full explanation. It might instead ask what happens to people’s purchasing power when prices fall. The student would then have to connect the concepts and explain their reasoning, in their own words, step by step.
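The article does not publish Macro Buddy’s actual configuration. As a rough illustration of the “guide, don’t answer” design, a Socratic-style tutor chatbot is typically set up with a system instruction along these lines; the wording below is hypothetical, not the authors’:

```python
# Hypothetical system prompt illustrating a Socratic tutor configuration.
# This is NOT the authors' actual Macro Buddy setup.
TUTOR_SYSTEM_PROMPT = (
    "You are a macroeconomics tutor. Never give the final answer directly. "
    "Instead, ask one short guiding question that leads the student to the "
    "next step of the reasoning. Ground every question in the provided "
    "course materials only."
)

def build_request(student_question: str) -> list:
    """Assemble the chat messages a tutor chatbot would be called with."""
    return [
        {"role": "system", "content": TUTOR_SYSTEM_PROMPT},
        {"role": "user", "content": student_question},
    ]

messages = build_request("Why might lower prices increase consumer spending?")
```

With a setup like this, the model responds to the purchasing-power question with a prompt of its own rather than a finished explanation, leaving the connecting step to the student.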

    This distinction between explaining an idea and receiving a finished answer matters.

    An AI tool that simply delivers answers can allow students to skip thinking through a problem. One study found that when college students rely on a chatbot as a crutch, they perform worse when they no longer have access to it. A tool that asks questions requires students to do the work themselves, even while receiving guidance. This is the very process that makes learning stick.

    What happened to students’ learning

    The one group of students that continued working individually, without AI, served as our control group.

    The other three groups changed how they studied: One began working in groups without AI, one worked individually with Macro Buddy, and the last group combined group work with Macro Buddy.

Students’ average scores declined on the second exam across all four study groups.

    By the third exam, however, differences across sections became clearer.

Students who used both Macro Buddy and group discussion earned the highest average scores. Students who used Macro Buddy alone also scored higher than those who worked alone without it. Students who worked in groups without Macro Buddy showed smaller improvements compared with students in the other groups.

    The third exam happened several weeks after we introduced the new study formats.

    By that point, students in the combined group may have grown more comfortable using Macro Buddy to test their understanding, while also explaining ideas to classmates. Working with peers meant having to articulate reasoning clearly and respond to questions, which can deepen understanding over time.

    Why this matters

Some critics of AI worry that students will rely on it to do the hardest parts of learning for them. This reflects a fear that students may stop practicing the skills that build expertise. Students become experts in their fields by struggling with confusing material, revising explanations and testing whether they truly understand an idea.

Our experiment suggests that this erosion of learning is not an inevitable consequence of using AI.

    We found that when AI is designed as a tutor that asks questions instead of simply giving answers—and when students are also required to explain their reasoning to classmates—the technology can support learning rather than replace it.

    Most students today use general-purpose chatbots that are not designed as tutors. They type in a question and receive a response. But our findings suggest that even small design choices, such as building an AI chatbot with guiding questions, can shape how students engage with the material.

    Peer discussion also adds something to the learning process that AI cannot provide: social accountability and exposure to alternative reasoning.

    Together, these practices encourage students to think through problems more actively.

    The evidence from our experiment highlights a practical distinction: AI can be used to replace thinking, or it can be used to support it. The impact may depend less on the technology itself and more on how it is structured and integrated into learning.

    Saharnaz Babaei-Balderlou is a teaching assistant professor of economics at the University of Wisconsin-La Crosse.

    Shishir Shakya is an assistant professor of economics at Appalachian State University.

    This article is republished from The Conversation under a Creative Commons license. Read the original article.
