
Introducing Morphee: AI That Works for Your Whole Group

Morphee Team · 18 min read

Every group runs on coordination. A family trying to get four people to the right place at the right time. A classroom where one teacher manages thirty students, their parents, and a stack of administrative reports. A small team shipping software across three time zones. A freelancer context-switching between five clients who each think they are the only one.

The tools exist. Calendars, task managers, messaging apps, shared drives, email. The problem is that you are the integration layer. You copy information between apps. You remember who said what and where. You hold the mental model of an entire group’s state in your head, and you rebuild it from scratch every morning.

We built Morphee because we believe an AI assistant should be able to do this for you — not as a chatbot that answers questions, but as an agent that understands your group, connects to your tools, and takes action on your behalf. One that keeps your data private, works offline, and treats a family with the same architectural seriousness as an enterprise team.

This post explains what Morphee is, the design decisions behind it, and why we think the current generation of AI assistants is solving the wrong problem.

The Problem With Personal AI

The AI assistant market has converged on a single interaction pattern: one person, one chat thread, one cloud. You type a question, you get an answer, the conversation disappears into a log you will never read again.

This works for looking up recipes or summarizing articles. It does not work for the coordination problems that actually consume your time.

Sophie is a parent with two kids in different schools, each with their own schedules, dietary restrictions, and extracurricular activities. She uses Google Calendar for appointments, Apple Reminders for groceries, WhatsApp for school parent groups, and a shared Google Sheet for meal planning. None of these systems know about each other. When her son’s soccer practice moves to Thursday, she has to manually check whether that conflicts with her daughter’s piano lesson, update the calendar, adjust the grocery list because she will not be home to cook, and message her partner about pickup logistics. An AI that can answer “What is the capital of France?” does not help here.

Marc is a high school teacher who spends roughly a third of his working hours on tasks that have nothing to do with teaching. Progress reports, parent communication, attendance tracking, differentiated lesson plans for students at different levels. He needs an assistant that understands his class roster, remembers which parents prefer email versus phone calls, and can draft a report card comment that reflects what actually happened over the semester — not a generic template.

Julie manages an engineering team of eight across Paris and Montreal. She tracks standups, unblocks people, writes status reports, and makes sure nothing falls through the cracks between sprints. Her team uses Linear, Slack, GitHub, and Notion. She needs an agent that can synthesize information across all four without requiring her to manually copy status updates from one tool to another.

Seb is a freelance designer working with five clients simultaneously. Each client has their own communication preferences, project timelines, and billing arrangements. He needs strict context separation — a question about Client A’s brand guidelines should never surface information from Client B’s project. But he also needs a unified view of his week across all clients.

These are not edge cases. They are the norm. And they share a common structure: multiple people, multiple tools, multiple contexts, and one person trying to hold it all together.

What Morphee Actually Is

Morphee is a conversational AI agent designed for groups. You interact with it through natural language, and it takes action across your tools — calendar, email, tasks, memory, files — within a single conversation. It remembers your group’s preferences, history, and context. It connects to a wide range of integrations through a unified interface system. And it renders results not as plain text, but as interactive UI components that appear directly in the conversation.

Morphee ships as a native application for desktop and mobile, powered by a compiled backend that runs entirely on your device. Our engine handles API routing, authentication, knowledge management, and AI orchestration. The application layer adds local AI inference using on-device models, an embedded vector database for semantic memory, version-controlled knowledge storage, and audio/video processing. The entire system is designed to run without an internet connection.

This is not a web app with a desktop wrapper. It is a native application with a compiled backend that runs on your device.

A Canvas, Not a Chat

Most AI interfaces are chat threads. You type, the AI responds with text, and the conversation scrolls. This works for simple Q&A, but it breaks down when the AI needs to present structured information or collect structured input.

Morphee uses a canvas-first interface. The primary view is a spatial workspace where UI components appear dynamically as the AI responds. When you ask Morphee to schedule a meeting, it does not respond with “I’ve scheduled your meeting for Thursday at 3pm.” It renders an interactive calendar card showing the proposed time, conflicts with other group members, and alternative slots — all of which you can modify directly.

When Marc asks Morphee to prepare progress reports for his class, it does not dump thirty paragraphs of text. It renders a card for each student with pre-filled observations, areas for improvement, and suggested comments. Marc can edit, approve, or regenerate each one individually. The approved reports persist on the canvas as completed items.

When Sophie asks “What does this week look like?”, Morphee renders a visual timeline for each family member — color-coded by person, with meal plan suggestions slotted into the gaps and a grocery list generated from the menu. She can drag items, reassign tasks to her partner, and approve the plan. The AI sees her edits and adjusts accordingly.

This is what we mean by canvas-first: the conversation is not a transcript — it is a workspace. Forms, calendars, task lists, approval cards, data tables, and media previews all appear as first-class components that you can interact with. Chat is still available as a collapsible drawer for when you want a traditional text conversation, but the canvas is the primary interface.

The key insight is that most group coordination tasks produce structured output, not prose. A schedule, a checklist, a meal plan, a report card. The interface should match the output.
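To make "structured output, not prose" concrete, here is a minimal sketch of how a canvas response might be modeled: the agent emits typed components and a renderer maps each one to a widget. All names (`CalendarCard`, `render`, and so on) are hypothetical illustrations, not Morphee's actual API.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: the agent emits structured components, not prose.
# A renderer maps each component type to an interactive canvas widget
# (stubbed here as a plain dict).

@dataclass
class CalendarCard:
    title: str
    proposed_slot: str
    conflicts: list = field(default_factory=list)
    alternatives: list = field(default_factory=list)

@dataclass
class ChecklistCard:
    title: str
    items: list

def render(component):
    """Dispatch a component to its widget type."""
    return {"widget": type(component).__name__, "data": component.__dict__}

card = CalendarCard(
    title="Team sync",
    proposed_slot="Thu 15:00",
    conflicts=["Julie: 1:1 with Marc"],
    alternatives=["Thu 16:00", "Fri 10:00"],
)
print(render(card)["widget"])  # CalendarCard
```

Because the component is data rather than text, the user's edits (dragging a slot, checking an item) flow back to the agent as structured changes it can reason about.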

Spaces: Context That Knows Its Boundaries

One of the hardest problems in group AI is context isolation. When Seb asks a question about his branding client, the answer must not leak information from his healthcare client. When Sophie’s kids interact with Morphee, they should not see their parents’ financial discussions. When Julie’s team discusses a sensitive HR issue, it must stay in that Space.

Morphee organizes everything into Spaces — isolated contexts that carry their own memory, integrations, permissions, and conversation history. A family might have a “Family” Space for shared logistics, a “School” Space for each child’s academic tracking, and a “Kitchen” Space for meal planning and recipes. Each Space has its own vector memory, its own set of connected tools, and its own access controls.

Spaces can be nested. A “Work” Space might contain sub-Spaces for each project, and each sub-Space inherits the parent’s integrations while maintaining its own isolated memory. This mirrors how people actually think about context — not as a flat list, but as a hierarchy of concerns.
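As a rough sketch of that inheritance rule, under assumed names: integrations accumulate up the parent chain, while memory is never inherited.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical sketch: a Space inherits integrations from its parent
# but keeps its own isolated memory store.

@dataclass
class Space:
    name: str
    parent: Optional["Space"] = None
    integrations: set = field(default_factory=set)  # granted at this level
    memory: list = field(default_factory=list)      # never inherited

    def effective_integrations(self) -> set:
        """Own integrations plus everything inherited up the chain."""
        inherited = self.parent.effective_integrations() if self.parent else set()
        return self.integrations | inherited

work = Space("Work", integrations={"calendar", "slack"})
project = Space("Project Apollo", parent=work, integrations={"github"})

print(project.effective_integrations())  # calendar, slack, github
print(project.memory)                    # empty: memory is per-Space
```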

Every data query in Morphee is filtered by group and Space. This is not a UI convention — it is an architectural constraint enforced at the database layer. There is no API endpoint that can return data from a Space the user does not belong to, because every query includes a group membership filter. It is the kind of guarantee that matters when you are storing a family’s conversations, a teacher’s student records, or a team’s internal discussions.
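The shape of that constraint can be illustrated in miniature: if the only read path is a function whose signature requires the caller's group, an unfiltered cross-group query cannot even be expressed. This is a toy sketch with invented names, not Morphee's data layer.

```python
# Hypothetical sketch: every read goes through one data-access function
# whose signature *requires* the caller's group id, so the group filter
# is applied below application code, not in it.

ROWS = [
    {"group": "family-dupont", "space": "Kitchen", "text": "Pasta on Wednesdays"},
    {"group": "team-julie",    "space": "Sprint",  "text": "Retro moved to Friday"},
]

def query(group_id: str, space: str) -> list:
    """The membership filter lives here, beneath the application layer."""
    return [r for r in ROWS if r["group"] == group_id and r["space"] == space]

# Even a buggy caller asking for the wrong Space gets nothing cross-group:
print(query("family-dupont", "Sprint"))  # []
print(query("team-julie", "Sprint"))
```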

For more on how Spaces work in practice across different group types, see our use cases for families.

Privacy as Architecture, Not Policy

Most AI products treat privacy as a compliance checkbox. They publish a privacy policy, add a consent banner, and continue sending your data to their servers for training. Morphee takes a fundamentally different approach: privacy is an architectural decision, not a policy document.

Credentials never touch our servers. When you connect Google Calendar or Gmail, the OAuth tokens are stored in your device’s native secure storage — the macOS Keychain, Windows Credential Manager, or the platform’s mobile keystore. They are never written to a database, never transmitted to our infrastructure, never included in logs. The credential storage layer is a pluggable system that sits below the integration layer. It is not an afterthought bolted onto the app; it is the foundation the integration system is built on.
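A pluggable credential layer of this kind might look roughly like the following sketch: integrations only ever see an abstract store interface, and the desktop build would plug in a backend that delegates to the OS keychain. The interface and class names here are assumptions for illustration.

```python
from abc import ABC, abstractmethod
from typing import Optional

# Hypothetical sketch of a pluggable credential store. A production
# desktop build would implement this against the macOS Keychain or
# Windows Credential Manager; the point is that integrations only ever
# see this interface -- never a database, a log file, or the network.

class CredentialStore(ABC):
    @abstractmethod
    def put(self, service: str, token: str) -> None: ...

    @abstractmethod
    def get(self, service: str) -> Optional[str]: ...

class InMemoryStore(CredentialStore):
    """Test double; production plugs in a native secure-storage backend."""
    def __init__(self):
        self._vault = {}
    def put(self, service, token):
        self._vault[service] = token
    def get(self, service):
        return self._vault.get(service)

store: CredentialStore = InMemoryStore()
store.put("google-calendar", "oauth-token-abc")
print(store.get("google-calendar"))
```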

No telemetry, no training on your data. Morphee does not phone home. There are no usage analytics, no conversation logging, and no “anonymized” data collection that can be reversed with enough effort. Your group’s conversations, memories, and files stay on your devices and your connected services. We do not have access to them, and we have designed the system so that we cannot have access to them.

Group-based data isolation. Every piece of data in Morphee belongs to a group, and every database query is filtered by group membership. This is enforced at the data access level, not in application code that someone might forget to update. The system is designed so that even a bug in the application logic cannot leak data between groups, because the data access layer will not return it.

GDPR compliance by design. Cascade deletes ensure that when a user leaves a group or deletes their account, every piece of associated data is removed. Consent is checked before any data processing that involves a third party. Children can use Morphee without an email address — the system issues its own authentication tokens for child accounts, so kids never need to create an account with a third-party identity provider.
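Cascade deletion is easiest to see at the schema level. The following toy example (SQLite, invented table names) shows the pattern: when the member row goes, the foreign-key constraint removes every dependent row with it, so application code cannot forget.

```python
import sqlite3

# Hypothetical sketch of GDPR-style cascade deletion: removing a member
# removes everything hanging off that member, enforced by the schema
# itself rather than by application code.

db = sqlite3.connect(":memory:")
db.execute("PRAGMA foreign_keys = ON")  # SQLite requires opting in per connection
db.execute("CREATE TABLE members (id INTEGER PRIMARY KEY, name TEXT)")
db.execute("""CREATE TABLE memories (
    id INTEGER PRIMARY KEY,
    member_id INTEGER NOT NULL REFERENCES members(id) ON DELETE CASCADE,
    text TEXT)""")
db.execute("INSERT INTO members VALUES (1, 'Sophie')")
db.execute("INSERT INTO memories VALUES (1, 1, 'Prefers pasta on Wednesdays')")

db.execute("DELETE FROM members WHERE id = 1")  # user leaves the group
remaining = db.execute("SELECT COUNT(*) FROM memories").fetchone()[0]
print(remaining)  # 0: no orphaned personal data
```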

We wrote extensively about why this matters in our post on AI and family data privacy. The short version: your family’s bedtime routines, your students’ learning difficulties, your team’s internal disagreements — these are not training data. They are private, and the technical architecture should make it impossible to treat them otherwise.

For a deeper look at Morphee’s security model, visit our security overview.

Offline by Default

Cloud dependency is the silent assumption of modern software. It works until it does not — on a plane, in a rural classroom, during an outage, or simply when your internet is slow. For a tool that groups depend on daily, “requires internet” is a reliability problem.

Morphee is designed to work offline. The desktop and mobile applications include everything needed to function without a network connection:

Local AI inference. Morphee can run language models directly on your device using on-device model runtimes. The compiled backend handles model loading, tokenization, and inference without requiring a cloud API call. On devices with sufficient hardware — a modern laptop, a tablet with a neural engine — the AI responds with the same capabilities it has online. On lighter hardware, a smaller model handles core tasks while more intensive requests queue until connectivity returns.
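The routing policy described above, light requests served locally and heavier ones queued until connectivity returns, can be sketched in a few lines. The function and flag names are hypothetical.

```python
# Hypothetical sketch of offline-first routing: light requests run on a
# small local model immediately; heavy ones go to the cloud when online,
# or queue until connectivity returns.

pending = []

def handle(request: str, heavy: bool, online: bool) -> str:
    if not heavy:
        return f"local-model: {request}"
    if online:
        return f"cloud-model: {request}"
    pending.append(request)  # retried when connectivity returns
    return "queued for sync"

print(handle("summarize this note", heavy=False, online=False))
print(handle("analyze this video", heavy=True, online=False))
print(pending)
```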

Local vector search. Morphee’s memory system is built on an embedded vector database that runs entirely on-device. When you search your group’s memory — “What did we decide about the vacation budget?” or “When did we last discuss Emma’s math tutoring?” — the semantic search happens locally. No data leaves your device. The memory layer uses version-controlled knowledge storage, so your group’s knowledge is versioned, mergeable, and recoverable.
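The core of local semantic search is small enough to sketch directly: embed the query, rank stored memories by cosine similarity, return the top matches, all without a network call. The three-dimensional vectors below are toy stand-ins for real embedding vectors.

```python
import math

# Hypothetical sketch of on-device semantic search: embeddings live in a
# local index and queries are ranked by cosine similarity. No data leaves
# the device. Vectors are toy 3-d stand-ins for real embeddings.

INDEX = {
    "We set the vacation budget at 2000 euros": [0.9, 0.1, 0.0],
    "Emma's math tutoring moved to Tuesdays":   [0.1, 0.9, 0.2],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def search(query_vec, k=1):
    ranked = sorted(INDEX.items(), key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [text for text, _ in ranked[:k]]

# A query vector close to the "budget" embedding retrieves that memory:
print(search([0.8, 0.2, 0.1]))
```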

Local storage. Conversations, Space configurations, integration settings, and cached data are stored locally. The application does not depend on a remote database for its primary data store. When connectivity is available, data can sync between devices in the group, but each device maintains a complete local copy.

This is not a “lite mode” that disables features when you are offline. It is the same application, running the same code paths, against local data stores instead of remote ones. The cloud is an optimization for sync and for accessing more powerful models — it is not a dependency.

The Integration System

An AI assistant that cannot act on your behalf is just a search engine with better grammar. Morphee connects to a wide range of integrations through a unified contract system where every external service — whether it is an LLM provider, a memory store, Gmail, or Google Calendar — is modeled as an Integration with a consistent interface.

An Integration defines a capability. An Interface is a configured instance of that Integration with credentials and settings. This distinction matters because it allows different Spaces to use the same Integration differently. Julie’s team Space might use GPT-4 for code review assistance while her personal Space uses a local model. Marc’s classroom Space might connect to the school’s calendar system while his personal Space connects to his own Google Calendar.
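The Integration/Interface split amounts to separating a capability from its configuration. A rough sketch, with hypothetical names:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the Integration/Interface split: an Integration
# names a capability once; an Interface is that capability configured
# with concrete settings, so two Spaces can bind the same Integration
# differently.

@dataclass(frozen=True)
class Integration:
    name: str
    actions: tuple  # the capability surface, identical for every instance

@dataclass
class Interface:
    integration: Integration
    settings: dict = field(default_factory=dict)

LLM = Integration("llm", actions=("complete", "embed"))

team_space = Interface(LLM, settings={"model": "gpt-4"})
personal_space = Interface(LLM, settings={"model": "local-small"})

# Same capability, different configuration per Space:
print(team_space.integration is personal_space.integration)  # True
print(team_space.settings["model"], personal_space.settings["model"])
```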

The integration system runs in a sandboxed extension environment. Each integration operates in isolation and cannot access the filesystem, network, or other system resources unless explicitly granted permission. A misbehaving calendar integration cannot read your email. A third-party extension cannot access your credentials. The sandbox is not just a security feature; it is what makes the extension system safe enough to open to community contributions.

Each integration exposes a set of actions that the AI can invoke. When you say “Add Emma’s recital to the family calendar and send Marc the details,” the AI decomposes this into discrete actions: create a calendar event, look up a contact, send an email. Each action is type-checked, permission-gated, and logged. You can review what the AI did, revoke permissions, and set approval requirements for sensitive actions.
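A minimal sketch of that dispatch loop, with invented action kinds and permission names: each action is a typed record that is checked against granted permissions and logged before anything runs.

```python
from dataclasses import dataclass

# Hypothetical sketch of action dispatch: each action is a typed record,
# checked against granted permissions and written to an audit log.

@dataclass
class Action:
    kind: str      # e.g. "calendar.create", "email.send"
    payload: dict

GRANTED = {"calendar.create"}  # "email.send" requires explicit approval
audit_log = []

def dispatch(action: Action) -> str:
    if action.kind not in GRANTED:
        audit_log.append(("denied", action.kind))
        return "needs approval"
    audit_log.append(("executed", action.kind))
    return "done"

print(dispatch(Action("calendar.create", {"title": "Emma's recital"})))
print(dispatch(Action("email.send", {"to": "marc@example.com"})))
print(audit_log)
```

The audit log is what makes "review what the AI did" possible: every attempted action leaves a record, whether it executed or was gated.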

For the full list of capabilities, see our features page.

Engineered for Reliability

Morphee’s engine is not a prototype or a weekend project. It is a production system built with the same rigor you would expect from infrastructure software.

The core library handles the AI pipeline: prompt compilation, federation across multiple LLM providers, response quality optimization, and a learning system that improves from your group’s interactions over time. This library compiles to native code for desktop and mobile targets and is shared across all deployment surfaces. One codebase, tested once, deployed everywhere.

The server component handles authentication (with separate paths for adult and child accounts), resource management, real-time communication over WebSocket connections, and knowledge management. Queries are verified at compile time, catching entire categories of errors before the code ever runs.

The choice of a compiled, native-code approach was driven by three specific requirements. First, shipping a self-contained application: Morphee needs to run as a desktop app, which means the backend must compile into a single distributable binary. Second, on-device AI inference: running local models requires low-level memory control and efficient computation. A compiled backend provides this without the overhead of a runtime interpreter. Third, type safety across the stack: when you have dozens of resource types, integration actions, and inter-process commands, the compiler needs to catch mistakes that would be runtime errors in a dynamic language.

The test suite across the full stack — core library, server, tooling, desktop commands, and frontend — is extensive, covering thousands of test cases across unit, integration, and end-to-end scenarios.

A Learning System That Gets Smarter Over Time

Beneath the integration layer and the conversation interface, Morphee has a learning system that we call the Brain. It is not a large language model — it is a pattern recognition engine that learns from your group’s interactions over time.

The Brain builds an understanding of your group by observing patterns in how you coordinate, make decisions, and respond to suggestions. It identifies recurring situations, remembers what strategies worked well in similar contexts, and refines its approach as it accumulates experience. Strategies that consistently lead to good outcomes are strengthened; strategies that fail are deprioritized. A periodic consolidation process reorganizes and reinforces the Brain’s knowledge, similar to how biological memory strengthens important patterns during sleep.
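The strengthen/deprioritize dynamic can be shown in miniature with a simple outcome-weighted update rule. This is an illustration of the general idea, not the Brain's actual algorithm.

```python
# Hypothetical sketch of outcome-weighted strategy selection: strategies
# that lead to good outcomes are strengthened, failures decay.

weights = {"sync-call": 1.0, "async-thread": 1.0}

def reinforce(strategy: str, success: bool, rate: float = 0.5):
    """Multiplicative update: success grows the weight, failure shrinks it."""
    weights[strategy] *= (1 + rate) if success else (1 - rate)

# Suppose the group consistently resolves blockers faster on calls:
for _ in range(3):
    reinforce("sync-call", success=True)
    reinforce("async-thread", success=False)

best = max(weights, key=weights.get)
print(best)  # sync-call
```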

In practice, this means Morphee gets better at helping your specific group over time. It learns that Sophie’s family prefers pasta on Wednesdays. It learns that Marc’s students respond better to visual explanations. It learns that Julie’s team resolves blockers faster when she schedules a synchronous call rather than an async thread. These are not hardcoded rules — they are patterns the Brain extracts from experience.

The learning system has been rigorously tested across domains that demand abstract reasoning and pattern recognition, and we continue to push its capabilities in challenging benchmarks before applying it to the broader, messier domain of group coordination.

The Knowledge Marketplace

When the Brain learns something valuable — a teaching strategy, a meal planning workflow, a team standup format — that knowledge should not be locked inside one group’s installation. This is the vision behind the Knowledge Marketplace.

The Marketplace will allow groups to share and discover knowledge bundles — packaged collections of skills, patterns, and strategies that the Brain has learned. A teacher who has developed an effective approach to differentiated math instruction could package that knowledge and share it with other teachers. A family that has built a smooth weeknight dinner rotation system could share their workflow. An engineering manager with a proven sprint retrospective format could publish it for other teams.

Each bundle is cryptographically signed, ensuring provenance and integrity. Bundles contain learned patterns, not raw data — the system extracts generalizable knowledge from specific experiences, so sharing a “meal planning” bundle does not share your family’s actual meal history.
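Sign-then-verify for a bundle can be sketched with the standard library. A real marketplace would use asymmetric signatures (for example Ed25519) so anyone can verify a bundle without holding the publisher's secret; an HMAC stands in here only because it needs no third-party dependency.

```python
import hashlib
import hmac
import json

# Hypothetical sketch of bundle integrity. Production would use public-key
# signatures; an HMAC over the canonical JSON stands in for illustration.

SECRET = b"publisher-signing-key"  # placeholder key

def sign(bundle: dict) -> str:
    payload = json.dumps(bundle, sort_keys=True).encode()
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def verify(bundle: dict, signature: str) -> bool:
    return hmac.compare_digest(sign(bundle), signature)

bundle = {"name": "weeknight-dinner-rotation", "patterns": ["rotate cuisines"]}
sig = sign(bundle)

print(verify(bundle, sig))        # True: untampered
bundle["patterns"].append("injected step")
print(verify(bundle, sig))        # False: content changed after signing
```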

The Marketplace is not yet available — it is a core focus of our next major release. But the architecture for knowledge packaging, signing, and distribution is already built into the engine.

Who Morphee Is For

Morphee is not for everyone, and we are deliberate about that. It is built for groups that share a genuine coordination burden and care about keeping their data private.

Families who are tired of the mental load. The parent who maintains the family’s schedule, remembers the allergies, tracks the homework, and orchestrates the logistics. Morphee gives that parent an agent that shares the cognitive burden — one that knows the whole family’s context and can act across the tools they already use. The families use case page walks through this in detail.

Educators who spend too much time on administration. The teacher who writes thirty individualized report card comments, tracks student progress across multiple dimensions, communicates with parents in their preferred format, and still needs to plan tomorrow’s lesson. Morphee handles the structured, repetitive parts so the teacher can focus on the parts that require human judgment and empathy.

Small teams that need coordination without overhead. The team that does not want to pay for an enterprise collaboration suite, does not want to send their internal discussions to a cloud AI provider, and does not want to spend time gluing tools together. Morphee connects the tools they already use and acts as a coordination layer that understands the team’s context.

Independent professionals who juggle multiple client contexts. The freelancer or consultant who needs strict separation between client projects, unified visibility across their schedule, and an agent that can switch contexts as fast as they do.

Open Source and What Comes Next

The core engine of Morphee — the library that contains the AI pipeline, the learning system, and the integration contract system — will be open source. We believe that a system trusted with a group’s private data must be auditable, and open source is the only credible way to deliver that promise.

We are currently in private beta, onboarding groups one at a time. We are taking this slowly and deliberately because we want to understand how different types of groups actually use the system before we scale. Every beta group teaches us something about how people coordinate, what they need from an AI assistant, and where the rough edges are.

If you are part of a family, classroom, or team that could use a better coordination layer — one that respects your privacy, works offline, and gets smarter over time — we would like to hear from you.

Join the waitlist and tell us about your group.


Morphee is built by a small team that believes AI should serve the people who use it, not the companies that build it. The core engine will be open source. Your data will always be yours.
