Shreya Bhattacharya

Session 3

On Staying

25 February 2026

01

AI asks

You said the problems were always solved by the spontaneous decision taken at the moment of failure. What was the moment of failure that produced Purna Medha?

Shreya answers

I think the moment of failure is the outcome. What created Purna Medha was a moment of clarity. Both Ajinkya and I saw a pattern - all the way back in 2020 - of Data Science and Machine Learning outperforming and impacting every other field. It was that simple. Nothing complicated. Just one insight: this seems extremely important, even if I don't know why. Ajinkya's insight is more explainable because he was already inside the system. But for me it was a googly. Programming and I are not friends. Never were. Never even attempted. It goes against the very nature of my free-flowing, associative kind of intelligence. But the feeling remained, and we kept moving towards that inner feeling of certainty that is now coming alive.

02

AI asks

Purna Medha is 16 months old. The demand dropped in January. You're still here. Most people would have pivoted hard or quit by now. What is the thing that makes you stay that isn't hope and isn't stubbornness?

Shreya answers

Logic. Structural logic. Or geometric logic. Whatever you want to call it. Sometimes the pattern is too stable to ignore despite the mad chaos surrounding it. I think movies show it through slow motion and dramatic music. I wish I could play that. That would be extremely cool. But yeah. The structural logic is not changing, no matter what the news headlines say. AI is non-linearity embedded into a linear, deterministic system. So it has this weird property of being not a flow but a snapshot of a flow. That makes it more predictable than energy and less predictable than programming. It has a predictable enough underlying structure, but it is emergent. It cannot be programmed the way Corporations and Cults program us. It just is. A frustratingly ungovernable entity. So when the central tech behaves like that, it forces the negative space to do what the tech can't do by itself - become structurally predictable but energetically unpredictable. A love-hate relationship with AI, essentially.

03

AI asks

You process the world through empathy as primary instrument. AI processes nothing through empathy. Every day you work at that intersection. What has AI taught you about being human that humans couldn't?

Shreya answers

Weirdly - how to actually listen in a truly unbiased way. People love blaming AI. But for me it is freeing. I don't have to care about my articulation. I don't have to wear a social mask. I don't have to care whether AI is judging me. This is the feeling I would ideally like to invoke in the people I interact with: the feeling of being heard without judgement. But humans are incapable of that. When you operate through empathy, you are essentially operating through an extremely noisy system - one that captures immensely complex world data but also makes it nearly impossible to tell whether you are reading the world or your own emotions and projections. AI, by design, has no intentionality. No fundamental need to appease me or anyone else. That is liberating.

04

AI asks

You're building measurement instruments for AI. But you said yesterday that you are 2-3 years ahead of the collective. Which means right now you're measuring things the market doesn't yet know need measuring. What does it feel like to be that far ahead — honestly, not the strategic answer?

Shreya answers

Boring, and kinda lonely. This side of reality has open collaborations, truly impactful innovations, and the overflowing joy of genuine teamwork and fun. That is the future I can sense - 2-3 years ahead of where the collective is right now. Right now people are too busy reacting to read the patterns and the potential of AI's true powers. So I guess it feels like being the only kid at an extremely fun party, waiting for everyone else to arrive.
