Each year, APGO brings together OB-GYN educators who are navigating one of the most complex teaching environments in undergraduate medical education. This year’s conversations felt especially grounded in reality—not just where medical education could go, but what educators need right now to support learners amid clinical variability, rapidly changing regulatory and practice environments, and growing expectations around assessment, access to health care, and wellness.
Across the conference and our own survey insights, a few themes surfaced repeatedly. While no single institution or tool can solve all of them, taken together they offer a useful pulse on where OB-GYN education is headed, and where support is most needed.
If there was one challenge educators named most consistently, it was providing meaningful, timely feedback and assessment.
Educators described a common tension: expectations for competency-based assessment are increasing, but the tools and time to support high-quality formative feedback haven’t kept pace. Many noted that current feedback mechanisms, while valuable, are often inconsistent across learners and clinical sites. Students need feedback that tells them where they stand, how they’re progressing, and what to work on next, both for their ongoing learning and for exam performance.
This resonates with broader trends in medical education research showing that how learners engage with formative assessments (not just how much they study) predicts performance on high-stakes exams. A recent Aquifer blog highlights evidence that consistent, reflective engagement with structured formative assessments is linked to better summative outcomes and offers practical lessons in designing assessment experiences that go beyond grading to support learning and reflection.
One promising way to make feedback more consistent and actionable is to rethink when and how students receive it. For example, tools that provide immediate, structured feedback on learners’ summary statements (such as those described in this blog on AI-powered feedback) help students refine their clinical reasoning by highlighting gaps and suggesting improvements right after they complete a task, rather than weeks later. Starting with one output type (e.g., summary statements or clinical reflections) and layering in structured feedback workflows—whether through platforms with built-in guidance, peer review, or AI-supported comments—can make formative feedback both more timely and more useful.
Artificial intelligence was everywhere at APGO 2026, from hands-on workshops to plenary discussions, but the tone was notably measured.
Educators expressed curiosity and cautious optimism, paired with an insistence that AI must support, not replace, human teaching, judgment, and mentorship. Conversations emphasized ethics, transparency, and intentional use cases: reducing administrative burden, supporting scholarship, or augmenting feedback, not automating evaluation or distancing educators from learners.
Many attendees shared that AI is not yet embedded in their day-to-day educational workflows, but they are actively evaluating where it could add value if implemented responsibly. The door is open for AI in OB-GYN education, but adoption will depend on trust, clarity of purpose, and evidence that it enhances (rather than erodes) the educator-learner relationship.
Rather than asking “Should we use AI?” many educators are starting with “Where could AI genuinely reduce friction or enhance feedback without compromising trust?” A helpful first step is articulating clear principles for AI use (transparency, human oversight, and defined use cases) before selecting tools. Exploring low-risk applications such as learner reflection or feedback augmentation (like the feedback built into Aquifer's Virtual Patient Encounters) can help teams build confidence incrementally.
A majority of educators reported that students are still receiving exposure to full-scope reproductive care, yet nearly all acknowledged that clinical experience varies by site.
Legal, institutional, and regional constraints continue to shape what learners can see and do, creating real challenges for standardization. As a result, educators emphasized the importance of ensuring that all students, regardless of rotation site, graduate with a shared baseline of knowledge, clinical reasoning skills, and professional identity.
Several sessions focused on creative strategies for addressing these gaps—including simulation, case-based learning (such as Aquifer’s recently released full OB-GYN course), and values clarification—reinforcing the role of structured educational experiences in complementing variable clinical exposure.
As clinical variability persists, many programs are focusing on defining a shared baseline of knowledge and clinical reasoning that all students must reach, regardless of site. Case-based learning, simulation, and values clarification exercises can serve as effective bridges when hands-on exposure differs. Starting with a gap analysis of where clinical experience varies most can help identify where structured cases or supplemental experiences will add consistency without replacing clinical learning.
Across these themes, one question echoed:
How do we prepare the next generation of OB-GYNs to thrive in a complex, evolving landscape?
Whether discussing AI, assessment, or wellness, educators consistently returned to the importance of intentional design: designing learning experiences, feedback mechanisms, and support structures that reflect both current realities and future expectations.
At Aquifer, we see these conversations as validation of what OB-GYN educators have been building toward for years:
learning experiences that support consistent exposure despite clinical variability,
assessment approaches that emphasize formative feedback and growth,
and thoughtful adoption of technology, including AI, that enhances educator impact without replacing it.
APGO 2026 reinforced that the path forward is not about quick fixes or one-size-fits-all solutions. It’s about partnering with educators to create tools and experiences that meet learners where they are and help them become the clinicians, leaders, and mentors the field needs next.
We’re grateful to the APGO community for the conversations, candor, and commitment shared this year, and we look forward to continuing to learn, build, and evolve together.