Two Years, Two Kids, and Baby Steps for Personal Health AI
My second child was born three months ago, almost exactly two years after our first. We dusted off Expecting Better, dug the newborn stuff out from the back of the closet, and re-downloaded our old constellation of apps. Three months in, the experience has been remarkably different, and not just because our second is a markedly worse sleeper. The major difference is AI.
Two years is nothing in healthcare. It’s one regulatory cycle, half a clinical trial, a rounding error in the life of an EHR contract. But in consumer technology, two years is a generation. Watching the same parenting apps evolve between one newborn and the next has been a surprisingly sharp lens on where AI in health actually stands today, and where it’s headed.
The Apps Are Getting Smarter. The Gaps Are Getting Clearer.
The tools we used the first time around (Huckleberry for tracking feeds, sleep and diapers, Snoo for a smart bassinet, a handful of prenatal and postpartum apps) have all added AI-driven features in the interim. Huckleberry now offers an AI expert alongside its real-time schedule guidance. Several maternal health apps have added chatbot interfaces. These features are genuinely useful, but they’re all siloed: each app builds intelligence on top of its own data exhaust, and none sees the full picture.
The other major shift is our go-to source of truth. With our first baby, we lived in Reddit, mining r/NewParents and r/ScienceBasedParenting for the particular relief of finding a thread where someone was facing the exact same problem at the exact same stage.
Now we live in ChatGPT. The answers come faster and sharper, but something is lost without knowing there was a real, live, seasoned parent on the other side of that thread. And we’ve hit some limits, too. ChatGPT can synthesize a dozen studies on wake windows and yet still give advice that feels generic, missing an obvious tip a grandmother would catch from one phone call. The reasoning is impressive. The personalization isn’t there yet.
The Copy-Paste Problem
The real unlocks come from trying to combine these tools. More than once, we’ve found ourselves screenshotting a day’s worth of data from Huckleberry — nap times, wake windows, feeding logs — and pasting it into ChatGPT: “Here’s how the day went. What should we expect tonight? Any bedtime advice?” The answers are remarkably good. But the workflow is wonky. We’re the integration layer. Copy, paste, prompt, interpret.
Conceivably, we could use Claude Code to rebuild any of these apps from scratch and produce a truly personalized “Master Newborn Tool”. Take Huckleberry’s tracking UI, Snoo’s data feeds, ChatGPT’s reasoning engine, and stitch them together the way we actually want to use it. As my colleague Eric Berry recently wrote, AI is quietly dissolving the value of software that merely organizes workflows without owning something deeper.
While we can expect a wave of innovation to come from that kind of DIY assembly, there’s a reason hardware companies look particularly well positioned for the longer run. This week, Oura announced its first proprietary AI model, purpose-built for women’s health and grounded in the continuous biometric data its ring collects. That’s intelligence trained on high-frequency, closed-loop data streams.
Snoo has a version of the same advantage: real-time infant sleep inputs that software-only competitors don’t see unless users manually enter the data. Of course, that moat erodes once the kid outgrows the bassinet (or, in our case, decides it feels more like Mr. Toad’s Wild Ride than gentle soothing).
Conflicts and Frictions
There is an uncomfortable question lurking here: What happens when the machines disagree? When Huckleberry says the baby needs a final nap and ChatGPT says to skip it. When one wearable flags a heart rate trend and a different one says it’s likely just artifact. New parents are particularly vulnerable here: sleep-deprived, anxious, and primed to defer to anything that sounds authoritative. But it extends to all patients. The hidden advantage of the current copy-paste workflow is that we’re still the ones in the loop. We can look at ChatGPT’s advice and say, “That’s dumb. Definitely won’t work.” As more of this gets stitched together, the question of who retains that veto becomes increasingly urgent.
Perhaps the harder question, though, is where we actually want to keep some friction. There’s a real difference between “AI, suggest a nap schedule” and “AI, put my baby to bed.” One is helpful; the other crosses a line, because the coaxing, the soothing, the slow learning of what this specific little human needs to meet and mold the world, are part of what it means to be a parent. That instinct won’t disappear when the domain shifts to healthcare. Companies that nail the art of what to automate and what to leave in human hands will outlast the ones that optimize purely for seamless experiences.
Just as importantly, every family’s needs are inherently unique (much like our newborns). Some want to track every feed and nap, many don’t. Some want to incorporate microbiome data, others want to track their own sleep and stress alongside the baby’s. This is the very phenomenon that makes personal health such a tantalizing space. For a platform to win broadly, it needs to synthesize across whichever sources and outputs a given user actually cares about.
The Parallel to Broader Healthcare
Zoom out, and this is the same dynamic we explored in a previous post. Every health system, payor, and EHR vendor is adding AI features. Chatbots for intake, copilots for clinicians, automation for prior auth. Useful, but incremental.
The copy-paste problem in our parenting stack is a microcosm of the copy-paste problem in healthcare, and patients adopting AI aren’t waiting for the system to solve it. They’re already dragging lab results into ChatGPT, uploading discharge summaries, asking an AI to explain what their doctor didn’t have time to.
However, it’s worth remembering how early we still are. The majority of Americans haven’t opened ChatGPT and aren’t copy-pasting anything into a prompt yet. That’s the opportunity. The companies that get ahead of this won’t expect patients to assemble their own AI stack. They’ll handle the stitching invisibly. And when the experience feels like magic for patients and sleep-deprived parents alike, it will be game-changing. That’s when AI for personal health truly comes of age. I have a hunch it will happen faster than we think, but not without some rough patches along the way.