You explain something to one person and it goes over their head. Same thing, different person, clicks instantly. The difference isn't the information. It's knowing how someone hears things. That's what Tenure learns. Not just what you told it, but how to reach you. Most memory systems store facts and hope the model figures out relevance. Tenure converts observations into direct instructions the model acts on immediately. Fully local, encrypted at rest, nothing leaves localhost.
We have all been there. You already told the model you use Fastify. That you want clarifying questions before long responses. That you chose the MongoDB raw driver over Mongoose for a specific reason. Next session, it has no idea and meets you as a total stranger.
The other thing that bothered me about existing systems: they store facts but not what to do with them. "Character doesn't like being yelled at" is a fact. "Use reactions to raised voices as the primary window into this character's interior life, and let that discomfort shape every confrontation scene without stating it explicitly" is an instruction the model can act on without additional inference. Every belief in Tenure carries a why_it_matters field that does this conversion at extraction time, when the model has the full conversational context to get it right.
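To make the fact-versus-instruction distinction concrete, here is a minimal sketch of what a belief record carrying a why_it_matters field could look like. Only the why_it_matters field name comes from Tenure itself; every other field and function name here is an assumption for illustration, not the project's actual schema:

```typescript
// Hypothetical sketch of a Tenure-style belief record.
// Only `why_it_matters` is named by the project; the rest is assumed.
interface Belief {
  fact: string;            // the raw observation extracted from conversation
  why_it_matters: string;  // actionable instruction derived at extraction time
}

// At injection time, the model receives the instruction, not the bare fact,
// so no additional inference is needed to act on it.
function toInstruction(belief: Belief): string {
  return belief.why_it_matters;
}

const belief: Belief = {
  fact: "Character doesn't like being yelled at",
  why_it_matters:
    "Use reactions to raised voices as the primary window into this " +
    "character's interior life, and let that discomfort shape every " +
    "confrontation scene without stating it explicitly",
};

console.log(toInstruction(belief));
```

The key design point is that the conversion happens once, at extraction time, while the model still has the full conversational context, rather than at retrieval time when that context is gone.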
Most memory systems hand the model a pile of vaguely related facts and hope it figures out what's relevant, because vector search can't tell your MongoDB decision from your Redis decision from your TypeScript preferences. They all live in the same semantic neighborhood. Tenure retrieves the specific belief that applies to the current moment.
Tenure is a local-first, privacy-focused proxy that sits between your client and any LLM. Next session, it already knows your stack, your preferences, and what you ruled out. It doesn't meet you as a stranger.
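Because Tenure sits in the request path as a proxy, wiring it up should mostly mean pointing your existing client at localhost instead of the provider. The sketch below assumes an OpenAI-compatible endpoint; the port and path are guesses for illustration, so check the project's README for the real values:

```typescript
// Sketch: routing an OpenAI-style chat request through a local proxy.
// The base URL below is a hypothetical example, not Tenure's documented
// endpoint.
const TENURE_BASE_URL = "http://localhost:8787/v1"; // assumed port and path

function chatRequest(baseUrl: string, message: string) {
  return {
    url: `${baseUrl}/chat/completions`,
    method: "POST" as const,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "gpt-4o-mini", // forwarded to whichever LLM you actually use
      messages: [{ role: "user", content: message }],
    }),
  };
}

const req = chatRequest(TENURE_BASE_URL, "Scaffold a new Fastify route");
console.log(req.url); // http://localhost:8787/v1/chat/completions
```

Nothing in the request itself changes; the proxy injects the relevant beliefs before forwarding, which is why any client that can change its base URL can use it.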
About Tenure on Product Hunt
“Your AI finally learns how to talk to you”
Tenure launched on Product Hunt on May 14th, 2026 and earned 56 upvotes and 1 comment, placing #76 on the daily leaderboard.
Tenure was featured in Open Source (68.4k followers), Privacy (11.1k followers), Artificial Intelligence (468.5k followers) and GitHub (41.2k followers) on Product Hunt. Together, these topics include over 132.1k products, making this a competitive space to launch in.
Who hunted Tenure?
Tenure was hunted by Jeff Flynt. A “hunter” on Product Hunt is the community member who submits a product to the platform — uploading the images, the link, and tagging the makers behind it. Hunters typically write the first comment explaining why a product is worth attention, and their followers are notified the moment they post. Around 79% of featured launches on Product Hunt are self-hunted by their makers, but a well-known hunter still acts as a signal of quality to the rest of the community. See the full all-time top hunters leaderboard to discover who is shaping the Product Hunt ecosystem.