Privacy · January 10, 2026

Your Client Data Is Sensitive. Here's How We Treat It.

Real estate agents handle deeply personal information. Any AI tool that touches it needs to take privacy seriously.


Real estate agents know things about their clients that most people wouldn't share with their closest friends. Financial situations. Divorce timelines. Family health issues affecting housing decisions. Job changes that haven't been announced yet.

This information flows naturally in conversation because clients trust their agent. And when an AI assistant is part of that conversation, it needs to earn that trust too.

The cloud problem

Most AI tools process your data on shared cloud infrastructure. Your conversations go to a server farm alongside millions of other users' data. Even with encryption and access controls, the fundamental reality is: your data is on someone else's computer, in someone else's data center, subject to someone else's policies.

For casual use, that's fine. For client conversations that include financial details, personal circumstances, and negotiation strategies? That's a risk most agents haven't thought about carefully enough.

How Matilda handles data

Matilda runs on dedicated local hardware managed by Shafiq & Company. Here's what that means in practice:

  • Physical isolation: Your data lives on a specific machine, not in a shared cloud. There's no multi-tenant architecture where a misconfiguration could expose your data to other users.
  • No training on your data: Your conversations aren't used to train AI models. What you tell Matilda stays with Matilda.
  • You own everything: Contacts, deal notes, conversation history: it's all yours. You can request a full export or deletion at any time.
  • Workspace isolation: Each client's Matilda instance runs in its own isolated workspace. There's no shared state between agents.

What we don't do

Some practices that are common in the industry, and that we explicitly avoid:

  • We don't sell or share your data with third parties.
  • We don't use your data for advertising.
  • We don't aggregate your data with other users' data for analytics.
  • We don't retain your data after you cancel the service, beyond a reasonable deletion window.

The compliance angle

Real estate transactions involve regulated information. While Matilda isn't a compliance tool, our architecture makes compliance easier:

  • Data residency is clear: it's on a specific machine in a specific location.
  • Audit trails are straightforward: everything is logged locally.
  • Data deletion is real: when we delete it, it's gone. Not "marked as deleted in our database."

Choosing tools wisely

When evaluating any AI tool for your business, ask these questions:

  1. Where does my data physically live?
  2. Is my data used to train models or improve the service for other users?
  3. Can I get a complete export of my data?
  4. Can I get my data truly deleted?
  5. What happens to my data if the company shuts down?

If a vendor can't give you clear, specific answers to these questions, that tells you something.

Our commitment

Privacy isn't a feature we added to Matilda. It's an architecture decision we made before writing a single line of code. Running on local hardware isn't the easiest way to build an AI product: it's harder, more expensive, and requires hands-on management. We chose it because it's the right way to handle sensitive client data.

Your clients trust you with their most personal information. We think the tools you use should be worthy of that trust.