About Us
Built by practitioners, for practitioners
We started Data Foundations because we kept seeing the same pattern: expensive data initiatives that produced dashboards nobody trusted and pipelines nobody could maintain.
Why we exist
Most data consulting engagements follow the same playbook: bring in a large team, produce a strategy deck nobody approves, build dashboards nobody views, and leave. Six months later, the client has a modern data stack held together with hope and Slack messages - and analysts still doing the real work in Excel.
Data Foundations Consulting exists to break that cycle. We focus on the foundational work that determines whether everything built on top actually works: platform engineering, data modeling, pipeline architecture, and the knowledge transfer that makes it stick.
We are not a staffing firm that happens to do data. We are not a product company disguised as a consultancy. We are practitioners who build with Fabric, Databricks, Airflow, Terraform, and dbt every day - and we bring that depth to every engagement.
Every project has three non-negotiable outcomes: your systems work in production, your team can operate them independently, and you have documentation for everything we built. If those three things are not true when we leave, we have not done our job.
Our Philosophy
Three principles that define every engagement
Sustainability
Build systems that outlast the engagement.
We design every component with long-term maintainability as a primary constraint. That means standard tooling, clear abstractions, comprehensive testing, and documentation that your team actually uses. When we leave, the system should be getting better - not degrading.
Measurable Success
Define outcomes before writing code.
Every engagement starts with explicit success criteria tied to business outcomes. We would rather ship version 0.1 to production next month than spend six months designing the perfect 3-year platform. We measure delivery by what your team can do after we leave - not by hours billed or slides delivered. If the metrics are not moving, we change the approach.
Knowledge Transfer
Transfer capability; don't create dependency.
Paired delivery, structured training, and embedded documentation are not add-ons - they are core to how we work. Our goal is to make ourselves unnecessary. The best outcome is a client who does not need us anymore because their team has fully absorbed the capability.
Leadership
Carlos has spent over a decade building and operating data platforms at enterprise scale - the kind that handle $85B+ in assets and run across 17 countries. He has led data engineering teams, architected cloud-native platforms on Azure, GCP, and Databricks, and delivered production AI systems across financial services, real estate investment management, healthcare, and technology.
He started DFC because he kept watching the same cycle: a consulting firm shows up, builds something impressive, and leaves. Six months later the client's team can't maintain it, so they hire another firm. The most expensive data projects are the ones that get rebuilt. By investing in solid foundations - clean architecture, proper testing, and genuine knowledge transfer - you build once and iterate forward instead of starting over.
Carlos works directly on every engagement. There is no bench of junior consultants waiting to be staffed. When you work with DFC, you work with the person who architected the platform, writes the Terraform, reviews the dbt models, and pairs with your engineers until they don't need him anymore.
Want to work together?
Book a discovery call to discuss your data challenges. No pitch deck, no pressure - just a conversation about where you are and where you want to be.