CEE Developers Lead the Way with Backboard.io’s Portable AI Memory Infrastructure

  • AI infrastructure platform Backboard.io boasts top LongMemEval and LoCoMo benchmark scores, surpassing frontier models
  • The startup was founded in 2025 to solve AI statelessness with persistent, portable memory
  • Its strong CEE traction reveals pragmatic, cost-conscious developer practices
  • Backboard.io’s future focus is the marketplace and ecosystem of AI tools to amplify open source innovation at scale

This February, Backboard.io, the Canadian startup developing an infrastructure platform of persistent, shared memory for AI systems across sessions and models, announced that it scored 93.4% overall accuracy on the LongMemEval benchmark and 90.1% accuracy on the LoCoMo benchmark. These mark the highest publicly reported results under consistent methodology so far.

Founders with Proven Track Records and the Structural Trigger

Backboard was founded in 2025 by Rob Imbeault (CEO), who previously co-founded and scaled Assent into a global enterprise SaaS company, and Jonathan Murray, a multi-sector founder who has built vertically integrated technology systems across hardware, software, and applied AI.

Their trigger to start Backboard.io was structural. As large language models became capable enough to power real workflows, it became apparent they were fundamentally stateless (i.e. lacking built-in memory of past interactions). Enterprises needed smarter models with durable, portable, auditable memory that could persist across agents, tools, and time. Having both built infrastructure businesses before, the founders recognized that the real bottleneck was not model intelligence but the absence of a shared memory layer. As multi-model routing and multi-agent systems accelerated, it became clear this was the moment to build foundational infrastructure rather than another surface-level wrapper. With a shared understanding of how to build systems that enterprises can actually rely on at scale, the duo believed they were the right team to deliver this solution.

Developer-Centric AI Infrastructure

They built Backboard.io as a developer-centric AI infrastructure platform that addresses a core limitation of large language models, their statelessness, by providing a persistent, portable memory layer and a unified API on top of 2,000+ models. This lets applications retain context, preferences, and long-term knowledge across sessions and models without stitching together separate tools. The platform combines stateful memory, context management, retrieval-augmented generation, and multi-model orchestration into one stack, so teams can build smarter, scalable AI systems faster.
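The "externalized memory" pattern described above can be illustrated with a minimal sketch. The class and function names below are hypothetical stand-ins, not Backboard.io's actual SDK: the point is only that conversation state lives outside any single model, so the same history can be handed to whichever provider serves the next turn.

```python
# Illustrative sketch of a model-agnostic, persistent memory layer.
# All names here are hypothetical; they do not reflect Backboard.io's real API.

class MemoryStore:
    """Persists messages outside any single model, keyed by session."""

    def __init__(self):
        self._sessions = {}

    def append(self, session_id, role, content):
        self._sessions.setdefault(session_id, []).append(
            {"role": role, "content": content}
        )

    def history(self, session_id):
        return list(self._sessions.get(session_id, []))


def chat(store, session_id, model, user_message):
    """Send a message to *any* model, with history drawn from the shared store."""
    store.append(session_id, "user", user_message)
    context = store.history(session_id)  # same context regardless of provider
    reply = f"[{model}] saw {len(context)} messages"  # stand-in for a real model call
    store.append(session_id, "assistant", reply)
    return reply


store = MemoryStore()
chat(store, "s1", "model-a", "My name is Ada.")
# Swap providers mid-conversation; the state travels with the store:
reply = chat(store, "s1", "model-b", "What is my name?")
```

Because the store, not the model, owns the transcript, swapping `model-a` for `model-b` mid-session loses nothing: the second call still sees all three prior messages.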

Raif Barbaros, Partner at Mistral Venture Partners

‘Most people have seen that ChatGPT can “remember” things about you. But that memory is locked inside ChatGPT. Developers cannot get that same persistent memory through the standard API or move that state across different models or tools, but Backboard.io gives developers that superpower. We provide portable, persistent memory at the message level that works across models, not just inside one product. That means any AI tool can remember a user’s history, preferences, decisions, and context over time, even if the underlying model changes,’ Mr Murray explains.

Investor Confidence

This January, Backboard.io raised a sizable Pre-Seed round from Mistral Venture Partners, N49P, Garage Capital, and Developer Capital.

‘Backboard isn’t another AI wrapper. It’s foundational infrastructure. Rob has the vision and operational discipline to make Backboard the standard for how AI systems remember and retrieve information,’ Mistral Venture Partners’ partner Raif Barbaros states.

‘Rob is an exceptional founder who could be building anything but has smartly chosen the Memory Layer of the AI stack as his next challenge. When we looked into the competitors and the technical decisions they made, it became clear to us that Backboard is more performant,’ Developer Capital’s CEO Jordan Steiner agrees.

Benchmarks and Practical Validation

Jordan Steiner, CEO of Developer Capital

The capital injection allowed for the rapid expansion of the company’s capabilities, integration of thousands of models, and top scores on major AI memory benchmarks like LoCoMo and LongMemEval.

‘The dataset is designed to examine not just an LLM but any LLM-based system’s capabilities and blind spots in a fine-grained manner. Raw human performance is somewhere around 88 percent. Breaking the 90-percent threshold requires superhuman consistency in recall and reasoning. Most high-performing frontier models currently score around 80 percent on LoCoMo. The system built by Backboard.io is a far better attempt at simulating memory as it manifests in humans. It is practical, cheaper, scalable, and doesn’t rely solely on brute-force LLM processing for answers,’ creator of the LoCoMo benchmark and research scientist at Databricks Adyasha Maharana comments in an Ottawa Business Journal article.

Architecture Philosophy

The investor confidence and the benchmark scores reinforce Backboard.io’s vision of memory as foundational infrastructure rather than a bolt-on feature. Emphasizing memory as a foundation and not a sideloaded plugin meant making early architectural choices that prioritized performance, stability, and cost efficiency over endless surface-level customization.

Adyasha Maharana, Creator of the LoCoMo Benchmark, Research Scientist at Databricks

‘You can always make something more configurable, but too many knobs often make systems slower, more complex, and more expensive. We chose a more opinionated core so memory is faster, more reliable, and affordable at scale. A simple analogy is a house. You can build plumbing and electrical systems into the foundation, or you can try to run extension cords and water lines along the walls after the house is finished. In theory, you can customize everything afterward, but it is messier, less efficient, and more prone to failure,’ Mr Imbeault explains.

Real-World Performance

It’s worth remembering that benchmarks like LoCoMo and LongMemEval are academic constructs: their standardized methodology provides clean validation, but the scores still need to translate into real client ROI metrics, e.g. error rates in production workflows, developer productivity, and latency in multi-session systems. With this in mind, Backboard.io optimizes memory at the message level, across real, evolving conversations with persistent state. The structural difference is that most competing products and open source approaches operate on prompts, not state, so they cannot do the same without controlling memory at the message layer.

‘Because our architecture manages memory natively at the message level, we can measure performance the way real systems behave in production, across sessions, across agents, and over time. That is not just a benchmarking advantage. It is a fundamental architectural difference. We built for real-world usage first. The fact that it also crushes traditional benchmarks is simply the outcome of designing for reality, not the test,’ Mr Imbeault emphasizes.

CEE Developer Engagement

The Backboard.io team mentions notable traction and strong developer engagement in Ukraine and Poland. According to Mr Murray, developers in these countries reveal a much stronger bias toward pragmatism. These developer communities were some of the first to actively experiment with multi-agent orchestration to drive token costs down. Instead of brute forcing tasks with a single expensive frontier model, they were coordinating multiple smaller assistants behind an orchestration layer to complete the same workflow faster and at a fraction of the cost, often using open source models.

‘That pattern showed us something important. In parts of Western developer ecosystems, there is still a tendency to anchor to a single model provider, likely because large platforms have distributed generous credits in exchange for early loyalty. In Central and Eastern Europe, we saw less attachment and more cost discipline. The focus was not brand alignment, it was performance per dollar and architectural efficiency. The distinct trend emerging is clear. The future is not one giant model doing everything. It is orchestrated systems of models, optimized dynamically for cost, speed, and task fit. That mindset aligns closely with our belief that portable state and model-agnostic memory are foundational, because orchestration only works well when memory is not locked into one provider,’ Mr Murray tells ITKeyMedia.

Open Marketplace and Tooling

He believes that CEE has a strong open source and academic culture, so Backboard.io’s focus should be on building infrastructure that amplifies that ingenuity. Over the coming months, the team is launching an open marketplace of tools built on Backboard, where developers and research teams can publish assistants, plugins, and workflows that others can fork, extend, and deploy. That creates a direct path from research prototype to reusable, production-ready infrastructure.

A model comparison tool is also scheduled for release. It allows developers to test multiple models against each other in real time, including open source models versus expensive reasoning models. The goal is to give builders empirical clarity on performance, cost, and memory behavior, so they can make informed architectural decisions without guesswork.

More broadly, the team is moving quickly to support open source projects like OpenClaw by releasing plugins and assistants as soon as possible, ensuring that Backboard integrates into the tools developers are already using. The intent is to bridge local research and open source momentum with persistent memory infrastructure that makes those innovations deployable at scale.

Isolation, Privacy, and Compliance

Backboard.io’s infrastructure involves shared memory across agents, which naturally raises questions about isolation and privacy. In multi-tenant, regulated environments (e.g. healthcare or finance), trust boundaries and compliance need to work with a unified memory layer. To address this, Backboard.io scopes memory to an assistant_id rather than a global pool, so assistants are siloed by default, and developers control what gets reused by choosing which assistant a workflow is attached to. Memories also persist across threads for the same assistant, which is powerful for continuity; the clean trust boundary in regulated, multi-tenant environments is therefore “one assistant per tenant, patient, account, or case,” so each tenant has an isolated memory store by design.

To share memory across agents, a developer opts into it by intentionally reusing the same assistant_id across agents or by building an explicit sharing layer on top with their own access controls and policy. On the compliance side, memory can be kept off when required, or managed directly by viewing, disabling, and deleting individual memories.
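The scoping model described above can be sketched in a few lines. This is an illustrative in-memory analogue, assuming only the behavior stated in the article (memory keyed to an assistant_id, siloed by default, shared only by deliberately reusing the same id); the class and method names are not Backboard.io's API.

```python
# Sketch of per-assistant memory scoping: siloed by default, shared by opt-in.
# Hypothetical names; an illustration of the behavior described, not a real SDK.

class ScopedMemory:
    def __init__(self):
        self._memories = {}  # assistant_id -> list of memory entries

    def remember(self, assistant_id, fact):
        self._memories.setdefault(assistant_id, []).append(fact)

    def recall(self, assistant_id):
        # Only this assistant's memories are visible: siloed by default.
        return list(self._memories.get(assistant_id, []))


mem = ScopedMemory()

# One assistant per tenant keeps memory stores isolated by design:
mem.remember("assistant-tenant-a", "prefers metric units")
mem.remember("assistant-tenant-b", "billing cycle: monthly")

# Sharing is opt-in: two agents deliberately reuse the SAME assistant_id.
mem.remember("assistant-shared", "project deadline: Q3")
agent_1_view = mem.recall("assistant-shared")
agent_2_view = mem.recall("assistant-shared")
```

Tenant A's assistant can never surface tenant B's entry, because recall is keyed strictly by assistant_id; cross-agent sharing happens only when both agents are pointed at the same id on purpose.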

Continuity Across Models

Rob Imbeault, Founder and CEO at Backboard.io

Importantly, Backboard preserves continuity across model swaps because the memory layer is externalized and stateful. The conversation state, structured memory, and message-level relevance are not tied to any single model, which means developers can move from one provider to another without losing continuity.

‘Where continuity can get challenged is around context window management. When you move from a very large context window model to a smaller one, you have to be deliberate about what gets surfaced and compressed, otherwise you risk truncation or signal loss. We have an early solution in place that is working effectively today, and we’re close to pushing a more robust production tool that intelligently manages context packaging and relevance across model sizes. We’re also seeing multi-agent swarms built on Backboard naturally help with this, since specialized agents can operate within smaller windows while relying on the shared persistent state,’ Mr Imbeault comments.
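The context-packaging problem Mr Imbeault describes, deciding what to surface when moving from a large-window model to a smaller one, can be sketched with a simple relevance-then-budget selection. This greedy strategy and its crude word-count token estimate are illustrative assumptions, not Backboard.io's actual algorithm.

```python
# Sketch of context packaging across model window sizes: rank stored memories
# by relevance, then fill the target model's token budget greedily.
# Illustrative strategy only; not Backboard.io's production approach.

def pack_context(memories, token_budget):
    """Pick the most relevant memories that fit a model's context window.

    memories: list of (relevance_score, text) tuples.
    token_budget: rough capacity of the target model, in tokens.
    """
    ranked = sorted(memories, key=lambda m: m[0], reverse=True)
    packed, used = [], 0
    for score, text in ranked:
        cost = len(text.split())  # crude token estimate for the sketch
        if used + cost <= token_budget:
            packed.append(text)
            used += cost
    return packed


memories = [
    (0.9, "user is allergic to peanuts"),
    (0.2, "user mentioned the weather once"),
    (0.7, "user's preferred language is Polish"),
]
# A small-window model gets only the highest-signal memories:
small = pack_context(memories, token_budget=10)
```

Under a tight budget, the low-relevance weather remark is dropped while the two high-signal facts are surfaced, which is the "deliberate about what gets surfaced and compressed" behavior the quote calls for.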

Hackathons as Live Testbeds

Backboard.io organizes hackathons in collaboration with groups like AI Collective and regional partners as a live testbed for format, developer engagement, and follow-on conversion. They serve two purposes:

  • On the professional side, they help organizations and senior engineers understand how to implement persistent memory in real workflows. In that setting, adoption is the metric. The organizers look at API key creation, assistant deployments, message volume, and whether teams move from a prototype into an active build within 30 to 60 days.
  • On the university side, hackathons are a top-of-funnel GTM motion with awareness as the first objective. Backboard.io can measure developer signups, credit activation, continued usage after the event, open source releases built on Backboard, and whether projects evolve into startups or sustained tools.

‘Participation is easy to count. Real impact is measured in retained builders and deployed systems,’ Mr Murray summarizes.

‘The model landscape is shifting constantly, open source is accelerating, and new agent frameworks appear almost weekly. What we can say with confidence is that Backboard will continue to focus on enabling developers first. We’re building for builders, listening closely to feedback, and shipping quickly in response to how people are actually using the platform,’ Mr Imbeault adds.

Future Ecosystem Expansion

Over the next few years, this likely means expanding the ecosystem layer around Backboard.io as much as the core infrastructure itself. The planned marketplace is a big part of that, creating distribution channels for tools built on Backboard.io so developers can turn prototypes into adopted products.

‘In regions like CEE, where there is strong open source and academic momentum, we see an opportunity to support local ingenuity with persistent memory infrastructure that makes those ideas deployable at scale. Our role is not to predict every trend, but to provide the stateful foundation that allows developers to adapt as the landscape evolves,’ the CEO remarks.

Backboard.io is redefining how AI systems retain and apply knowledge, turning inherently stateless models into stateful, agent-ready platforms capable of sustaining context, preferences, and workflows over time. Its traction in Central and Eastern Europe highlights a region that combines technical ingenuity with pragmatic, cost-conscious development practices, making it a natural proving ground for scalable AI infrastructure. By bridging local open source innovation with a persistent memory layer, Backboard.io is not just advancing AI performance—it’s empowering CEE developers to build the next generation of intelligent, adaptable systems.
