Securing Sensitive Data in Microsoft Fabric: Best Practices for Data Masking and Encryption
Buragapu Ritesh | November 11, 2025

Introduction

As data volumes increase and cyber threats become more sophisticated, safeguarding sensitive information has never been more crucial. Microsoft Fabric is a robust platform designed to help organizations protect their data, with data masking and encryption being two of the most effective tools for this purpose. In this blog, we’ll explore straightforward and efficient ways to use these tools in Microsoft Fabric to ensure data security, maintain privacy compliance, and meet regulatory requirements. Additionally, we’ll touch on the latest OneLake security model, which now extends security beyond the application access level.

Understanding Microsoft Fabric’s Security Approach

  • Keeping data safe is very important for any organization. Rules like the GDPR (General Data Protection Regulation) require companies to protect sensitive information. Good data security helps prevent unauthorized access and data leaks. 
  • It also builds trust with customers and protects the company’s reputation. Microsoft Fabric has the right tools to help meet these needs and keep your data secure. 

Data Masking in Microsoft Fabric

  • Data masking means hiding sensitive information by replacing it with fake but realistic values. This way, people who don’t have permission can’t see the real data.  
  • It’s especially useful when working on development, testing, or analytics, or when sharing data with third parties because it protects private details while still allowing the data to be used.  
  • Microsoft Fabric supports data masking to help keep your information safe in these situations. 

Types of Data Masking in Microsoft Fabric

Microsoft Fabric supports different types of data masking to suit various needs. Here are the two main types: 

  • Dynamic Data Masking 

This type hides sensitive data in real-time when someone tries to view or query it, without changing the actual data in the database. 
Use case:  
Ideal for protecting live production data from users who shouldn’t see sensitive information. 

  • Static Data Masking 

This method permanently replaces sensitive information with fake values in a copy of the data. The original data stays untouched. 
Use case:  
Useful for safely sharing data for testing, training, or with external partners—without exposing real details. 

These masking options help you control who can see what, while still allowing the data to be used effectively. A small sketch of the static option follows. 
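
To make static masking concrete, here is a minimal PySpark sketch you could run in a Fabric notebook (where `spark` is predefined). It assumes a hypothetical Lakehouse table named `customers` with `email` and `ssn` columns; the original table is left untouched and a masked copy is written out for sharing.

```python
# Static masking sketch for a Fabric notebook (PySpark). Table and column
# names (customers, email, ssn) are hypothetical; the source table is untouched.
from pyspark.sql import functions as F

src = spark.read.table("customers")

masked = (
    src
    # Replace the local part of the e-mail with a fixed token.
    .withColumn("email", F.regexp_replace("email", r"^[^@]+", "xxxx"))
    # Keep a salted hash of the SSN so joins still work but the value is hidden.
    .withColumn("ssn", F.sha2(F.concat(F.col("ssn"), F.lit("my-salt")), 256))
)

# Write the masked copy to a separate table that is safe to share.
masked.write.mode("overwrite").saveAsTable("customers_masked")
```

Dynamic masking, by contrast, is configured on the warehouse table itself; the implementation steps below include a sketch of that as well.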

Steps to Implement Data Masking in Fabric

1. Identify Sensitive Data

  • Determine which columns contain Personally Identifiable Information (PII), financial data, or other regulated information. 
  • Examples: Social Security Numbers, email addresses, credit card numbers, etc. (a quick column-name scan is sketched below).
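
As a starting point, a quick scan of column names can help build the shortlist. The table name and keyword list below are purely illustrative assumptions; use Microsoft Purview classifications or a manual review for the authoritative inventory.

```python
# Illustrative shortlist of likely-sensitive columns based on their names.
# The table name and keyword list are assumptions; use Microsoft Purview
# classifications or a manual review for the authoritative inventory.
PII_HINTS = ["ssn", "social", "email", "phone", "credit", "card", "salary", "birth"]

df = spark.read.table("customers")   # hypothetical Lakehouse table

candidates = [c for c in df.columns if any(hint in c.lower() for hint in PII_HINTS)]
print("Columns to review for masking:", candidates)
```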

2. Choose the Type of Masking

  • Dynamic Data Masking (DDM) – masks data at query time without changing the actual data (see the sketch after this list). 
  • Static Data Masking – replaces sensitive values in a copy of the dataset; ideal for sharing or testing. 
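
For dynamic data masking, the rules are defined on the warehouse tables themselves. The sketch below applies two masking rules through the Warehouse’s T-SQL endpoint using pyodbc; the connection string, table, and column names are placeholders, and the masking functions (`email()`, `partial()`) follow the standard T-SQL dynamic data masking syntax.

```python
# Sketch: apply dynamic data masking via the Warehouse SQL endpoint.
# Connection details, table, and column names are placeholders.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-warehouse-sql-endpoint>;Database=<your-warehouse>;"
    "Authentication=ActiveDirectoryInteractive;"
)
cur = conn.cursor()

# Mask the e-mail column at query time; the stored value is unchanged.
cur.execute("""
    ALTER TABLE dbo.Customers
    ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');
""")

# Show only the last four digits of the card number.
cur.execute("""
    ALTER TABLE dbo.Customers
    ALTER COLUMN CardNumber ADD MASKED WITH (FUNCTION = 'partial(0, "XXXX-XXXX-XXXX-", 4)');
""")
conn.commit()
```

Only principals that are later granted the UNMASK permission see the real values; everyone else gets the masked output.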

3. Assign Role-Based Access

  • Ensure only authorized users (e.g., admins, analysts) can view unmasked data. 
  • Use role-based security in Microsoft Fabric Data Warehouse or Lakehouse to restrict access.

4. Combine with Row-Level Security (Optional but Recommended)

  • Apply RLS to control which rows users can see, in addition to column masking (a minimal predicate sketch follows).
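
Here is a minimal row-level security sketch using the standard T-SQL security-policy pattern and the pyodbc cursor from the previous sketch; the predicate function, table, and role names are hypothetical.

```python
# Row-level security sketch using the standard T-SQL security-policy pattern.
# `cur`/`conn` come from the previous sketch; function, table, and role names
# are hypothetical.
cur.execute("""
    CREATE FUNCTION dbo.fn_region_filter(@Region AS varchar(50))
    RETURNS TABLE
    WITH SCHEMABINDING
    AS
    RETURN SELECT 1 AS allowed
           WHERE (@Region = 'Europe' AND IS_ROLEMEMBER('EuropeAnalysts') = 1)
              OR IS_ROLEMEMBER('GlobalAnalysts') = 1;
""")
cur.execute("""
    CREATE SECURITY POLICY dbo.RegionFilter
    ADD FILTER PREDICATE dbo.fn_region_filter(Region) ON dbo.Customers
    WITH (STATE = ON);
""")
conn.commit()
```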

5. Test the Masking Setup

  • Query the masked columns as both (a quick test sketch follows this list): 
    • Privileged users (should see real data) 
    • Restricted users (should see masked values) 
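
One way to test is to impersonate a restricted user and compare the results with your own (privileged) view. The sketch below uses EXECUTE AS, which is the standard SQL Server / Azure SQL impersonation pattern; verify it is available on your Fabric Warehouse before relying on it. The user name is hypothetical and `cur` is the cursor from the earlier sketches.

```python
# Sketch: compare what a restricted user sees with the privileged view.
# `cur` is the pyodbc cursor from the earlier sketches; the user name is
# hypothetical, and EXECUTE AS is the standard T-SQL impersonation pattern.
cur.execute("EXECUTE AS USER = 'restricted_analyst';")
for row in cur.execute("SELECT TOP 5 Email, CardNumber FROM dbo.Customers;"):
    print("restricted view:", row)      # expect masked values, e.g. jXXX@XXXX.com
cur.execute("REVERT;")

for row in cur.execute("SELECT TOP 5 Email, CardNumber FROM dbo.Customers;"):
    print("privileged view:", row)      # an identity with UNMASK sees the real data
```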

6. Regularly Review and Update Masking Policies

  • As data models change, keep masking rules updated. 
  • Audit user access and ensure masking is applied where needed. 

Architecture for Securing Sensitive Data in Microsoft Fabric

How Microsoft Fabric Keeps Your Data Safe

  • Microsoft Fabric protects sensitive data using several layers of security, including data masking, encryption, and access control. 
  • At the centre is the Fabric Data Warehouse, where your data is stored. Dynamic data masking is used to hide private information from users who don’t have permission to see it. For example, someone without access will see a masked version of an email or ID, while authorized users can see the full data. 
  • All data is encrypted while stored (at rest) using Microsoft’s own encryption keys. If your organization needs more control, you can use your own keys with Azure Key Vault—this is called customer-managed keys (CMK). 
  • Access to data is tightly controlled using roles like Admin, Member, and Viewer. Only users with special “UNMASK” permission can see the real, unmasked data. Others will only see protected, masked versions. 
(Figure: Microsoft Fabric security architecture)

To add more protection, Fabric also supports: 

  • Multi-factor authentication (MFA) 
  • Network isolation 
  • Monitoring with Microsoft Defender for Cloud 

This secure setup allows organizations to analyse and share data safely while following privacy rules and keeping unauthorized users out. 

Microsoft Fabric Data Encryption

Data encryption is like locking your data with a digital key. It turns readable information into scrambled code so that no one can understand it unless they have the right key to unlock it. 

In Microsoft Fabric, encryption helps keep your data safe in two main ways: 

  • At rest: When data is stored in Fabric, it’s automatically encrypted. 
  • In transit: When data is being sent between services, it’s protected using secure connections. 

By default, Microsoft uses its own secure keys to handle encryption. But if your organization needs more control—especially for compliance reasons—you can use your own encryption keys stored in Azure Key Vault. This is called customer-managed keys (CMK). 

Encryption is important because even if someone gets access to your files or network, they won’t be able to read the data without the key. It’s one of the best ways to protect sensitive information. 
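
Purely as an illustration of the idea (Fabric handles its encryption internally, so you never write this code for Fabric data), here is what symmetric encryption looks like in a small Python example:

```python
# Conceptual illustration only: Fabric encrypts your data for you automatically.
from cryptography.fernet import Fernet

key = Fernet.generate_key()               # the "digital key"
cipher = Fernet(key)

token = cipher.encrypt(b"salary=95000")   # scrambled bytes, unreadable without the key
print(token)

print(cipher.decrypt(token))              # only a key holder recovers b"salary=95000"
```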

Encryption at Rest and In Transit

Microsoft Fabric always keeps your data safe. 

  • At rest: Data is encrypted automatically when it’s stored in Fabric, so even if someone gets into the storage, they can’t read it without the right key. 
  • In transit: When data is moving between services or users, it’s protected using secure connections (TLS 1.2+). 

Using Customer-Managed Keys (CMK) with Azure Key Vault

If your organization needs more control over encryption, you can use your own keys instead of Microsoft’s default ones. These are called Customer-Managed Keys (CMK). 
With CMK, you store your keys in Azure Key Vault, which lets you control: 

  • Who can access the keys 
  • When keys are rotated or revoked 
  • How long each key is valid 

This is especially useful for industries with strict data compliance requirements. 
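
As a rough sketch of the key-management side, the snippet below creates and rotates a key in Azure Key Vault with the azure-keyvault-keys SDK. The vault URL and key name are placeholders, and pointing Fabric at this key as its CMK is configured through the Fabric/Azure admin settings rather than in code.

```python
# Sketch: manage a customer-managed key in Azure Key Vault.
# Vault URL and key name are placeholders; linking the key to Fabric as a CMK
# is configured in the Fabric/Azure admin settings, not in this snippet.
from azure.identity import DefaultAzureCredential
from azure.keyvault.keys import KeyClient

client = KeyClient(
    vault_url="https://<your-vault>.vault.azure.net",
    credential=DefaultAzureCredential(),
)

# Create the RSA key that will act as the customer-managed key.
key = client.create_rsa_key("fabric-cmk", size=3072)
print("created:", key.name)

# Rotate the key on your own schedule so older key material is retired.
rotated = client.rotate_key("fabric-cmk")
print("new version:", rotated.properties.version)
```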

Role-Based Access Control (RBAC) in Fabric

Not everyone should see sensitive data. That’s where RBAC comes in. 
Fabric uses roles like Admin, Member, Contributor, and Viewer to control what users can access or do. 
For example: 

  • Only Admins can manage settings and permissions. 
  • Only users with the “UNMASK” permission can see original, unmasked data. 

This helps you keep control over who sees what. 
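
On the SQL side, the UNMASK permission is granted or revoked like any other permission. A small sketch, using the same kind of pyodbc connection as the earlier masking example and hypothetical role names:

```python
# Sketch: only a trusted role can see unmasked values; everyone else stays masked.
# `cur`/`conn` are a pyodbc cursor/connection to the Warehouse SQL endpoint
# (as in the earlier masking sketch); role names are placeholders.
cur.execute("GRANT UNMASK TO [DataStewards];")        # e.g. admins / approved analysts
cur.execute("REVOKE UNMASK FROM [ReportViewers];")    # viewers keep the masked view
conn.commit()
```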

Best Practices for Encryption in Microsoft Fabric

  • Always enable encryption: Make sure your data is encrypted both when it’s stored (at rest) and when it’s being sent or received (in transit). Microsoft Fabric uses strong encryption by default, but it’s good to double-check. 
  • Use customer-managed keys (CMK): For sensitive or regulated data, use your own encryption keys stored in Azure Key Vault. This gives you more control over who can access your data and helps meet strict compliance needs. 
  • Rotate and audit keys regularly: Change your encryption keys on a regular schedule and keep track of who is using them. This adds an extra layer of security and helps prevent misuse. 

Following these simple steps helps ensure your data always stays protected and under your control. 

Common Use Cases and Industry Scenarios

  • Healthcare: Mask patient data while allowing doctors to access full records. 
  • Finance: Protect account numbers, SSNs, and salaries during reporting. 
  • Retail: Share sales data for analytics without exposing customer info. 
  • Government: Ensure compliance with strict privacy laws. 

Visualizing the Impact of Data Masking

Showing before-and-after examples of masked data helps users understand its value; a small helper that produces this masked view is sketched below. 

  • Original email: john.doe@example.com 
  • Masked view: jXXX.XXX@XXXX.com 
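
The masked view above can be produced by a very simple transformation. A tiny, illustrative Python helper (not a Fabric built-in) that yields the same shape:

```python
# Illustrative helper (not a Fabric function) that produces the masked view above.
def mask_email(email: str) -> str:
    """Keep the first character and the top-level domain; hide everything else."""
    local, _, domain = email.partition("@")
    tld = domain.rsplit(".", 1)[-1]
    return f"{local[:1]}XXX.XXX@XXXX.{tld}"

print(mask_email("john.doe@example.com"))   # -> jXXX.XXX@XXXX.com
```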

Tools for Monitoring and Auditing Access

Fabric integrates with Microsoft Defender for Cloud to monitor security activity. 
You can also: 

  • Track who accessed data 
  • Detect unusual access patterns 
  • Set alerts for suspicious behaviours 

This adds another layer of protection. 

OneLake Security Model: Key Features

Microsoft Fabric’s OneLake offers a comprehensive and granular security model that goes beyond traditional encryption and data masking techniques. Here’s a breakdown of its key features: 

  • End-to-End Encryption: All data stored in OneLake is encrypted at rest by default using Microsoft-managed keys. Organizations can also use Customer Managed Keys (CMK) stored in Azure Key Vault for even more control over their encryption. 
  • Encryption in Transit: Data is always encrypted during transfer using TLS 1.2+, ensuring secure communication between services and users. 
  • Granular, Unified Access Control: OneLake enables organizations to set access controls at various levels—object-level (e.g., folders or tables), column-level, and row-level. These controls are natively enforced across all tools in the Microsoft Fabric ecosystem, including Spark, SQL, and semantic models. 
  • Workspace Roles and Inheritance: Access permissions are set at the workspace level and are inherited by objects within the workspace. This allows for granular access control based on different workloads such as analytics, engineering, and business intelligence (BI). 
  • Multi-Geo and Logical Separation: OneLake supports multi-region deployments, ensuring data sovereignty and compliance for global organizations. It also enforces logical separation between tenants and workspaces, providing an extra layer of security. 
  • Role Enforcement: Only explicitly defined roles, users, or groups are granted access. Non-admin users are granted least-privilege access based on the specific needs of each workload. 

Advanced Controls and Data Protection

OneLake doesn’t just stop at basic encryption or access control. It supports several advanced features for data protection: 

  • Data Masking and Anonymization: OneLake offers both dynamic and static data masking, with policy-based masking capabilities applied at various schema layers. It includes built-in functions for hashing and redaction of sensitive information like Personally Identifiable Information (PII). 
  • Row-Level Security (RLS), Column-Level Security (CLS), and Object-Level Security (OLS): These features let you control access to data based on user identity, user group, and the sensitivity of the data. You can restrict access to specific rows, columns, or even entire tables, ensuring data is only accessible to those who need it. 
  • Integration with Microsoft Defender for Cloud: Continuous monitoring, auditing, and alerting of data access and policy changes are built-in, keeping track of all security events. 

OneLake Security Architecture

OneLake’s security architecture is layered for maximum protection: 

  • Access Governance: All access to OneLake is governed by layered security controls, including workspace permissions, data access roles, object-level security definitions, and encryption. 
  • New Features (Data Access Roles): The new OneLake data access roles (in preview) allow administrators to grant explicit read access at the folder or table level. This helps enforce security policies (like RLS and CLS) centrally, reducing the complexity of security configurations. 
  • Compliance and Control: For industries with strict compliance requirements, OneLake supports Customer Managed Keys (CMK) for at-rest data encryption, giving organizations full control over their encryption keys. 

Best Practices for Microsoft Fabric OneLake Security

To get the most out of OneLake’s security model, follow these best practices: 

  • Principle of Least Privilege: Always grant the minimum access needed. Avoid giving users more permissions than necessary. 
  • Workload-Specific Permissions: Use specific permissions for different workloads (like Spark, SQL, and BI). This helps segment access based on operational needs. 
  • Regularly Review Security Policies: Periodically review your masking, RLS, and CLS policies to ensure they are still aligned with your data protection needs as models evolve. 
  • Use CMK for Sensitive Workloads: For highly sensitive data, enforce Customer Managed Keys (CMK) and regularly rotate or audit the keys. 
  • Continuous Monitoring: Use Microsoft Defender for Cloud to monitor access events and changes in security policies to ensure ongoing protection. 

OneLake’s Security Model Explained

OneLake serves as the central data lake in Microsoft Fabric, supporting hierarchical security enforced at multiple levels, such as: 

  • Workspace Level (overall permissions) 
  • Item Level (e.g., lakehouses, folders, or tables) 
  • Data Level (row/column-level security) 

It’s also important to understand the distinction between control plane permissions (which govern how data is managed and shared) and data plane permissions (which govern how data is accessed or queried). 

OneLake uses four workspace roles: 

  • Admin – Full control over everything 
  • Member – Full access to the workspace, but not full administrative rights 
  • Contributor – Can modify content, but with some restrictions 
  • Viewer – Can only view data, with no modification rights 

Permissions are inherited across levels, meaning permissions granted at the workspace level cascade down to objects like folders, tables, and even rows or columns. 

Deep Dive: OneLake Security Features

Here’s a closer look at how OneLake handles security: 

  • Security Roles: Admins or Members can define which users or groups can access specific parts of a lakehouse, whether that’s folders, files, tables, columns, or rows. This allows for precise control over what data can be accessed, ensuring that sensitive data is minimized and only accessible to those who need it. 
  • Row-Level and Column-Level Security (RLS/CLS): With these features, organizations can restrict access based on data sensitivity. For example, a team can be restricted to see only transactions from their region or have columns containing PII masked for additional protection. 
  • Role-Based Access: These advanced security features primarily apply to users with the Viewer role, who are only granted access to what has been explicitly allowed. Admins, Members, and Contributors typically have full access unless specifically removed from their roles. 
  • Permission Inheritance: Permissions set at the folder or table level automatically propagate to subfolders and files. This reduces complexity and makes it easier to manage security across large datasets. 


Integrated Auditing and Threat Protection

  • Monitoring: Microsoft Defender for Cloud provides real-time monitoring, alerting, and reporting on access attempts and patterns to ensure ongoing security. 
  • Audit Log Integration: All security events and data access activities are logged and can be exported for regulatory review or to detect unusual behavior. 

Security Architecture Visualization

The security architecture is best represented through a layered diagram, which highlights key components such as: 

  • Role and Permission Configuration in OneLake (at the center) 
  • OLS, CLS, and RLS applied at the data layer (for data protection and access controls) 
  • Connections to Compute Engines like SQL, Spark, and Power BI 
  • Enforcement Layers at: 
    • Network (isolation of data) 
    • Identity (through Entra ID) 
    • Monitoring (via Defender for Cloud) 

Sample Implementation: Lakehouse Scenario

  • Configure OLS: Restrict access to sensitive data, like a “Transactions” table, to only the finance team. 
  • Apply CLS: Mask critical information like “SSN” and “AccountNumber” for non-privileged users. 
  • Set RLS: Use row-level security to ensure only the relevant regional teams (e.g., Europe) can access customer data for their specific region. 
  • Audit Setup: Enable logging for all access attempts and track any suspicious activities using Defender for Cloud. 

Industry Use Cases

  • Healthcare: Doctors can only access data related to their own patients, ensuring patient confidentiality. 
  • Finance: Analysts have access to only the data relevant to their projects, with sensitive information fully masked from other departments. 
  • Government: Implement multi-geo data location controls, secure key management, and audit logging to comply with regulatory requirements. 

Advanced OneLake Security Controls

Granular Role Definition:

OneLake offers highly granular control over who can access what. Security roles are integrated directly into the data plane, meaning admins can define specific roles that restrict access to certain tables or folders. These roles can also include detailed permissions down to individual rows or columns, ensuring the right people only access the right data.

DefaultReader Model:

By default, users with ReadAll permission access data through the DefaultReader role in OneLake. This default role provides broad read access across the lakehouse. However, admins can customize or completely remove this access to enforce least-privilege policies, ensuring users only have the exact level of access they need.

Consistent Permission Enforcement:

Once roles are assigned, permissions are automatically inherited by the different analytic workloads (Spark, SQL, and Power BI) without needing to reconfigure security for each tool. This ensures consistent access control across the entire Microsoft Fabric ecosystem.

Integration with Microsoft Purview:

OneLake integrates seamlessly with Microsoft Purview, enabling metadata-driven security assignments. This allows organizations to catalog and classify sensitive data, manage security settings, and maintain a full audit trail for compliance and discovery.

Disaster Recovery and Data Sovereignty

Zone-Redundant Storage (ZRS) & Locally Redundant Storage (LRS): 
Data stored in OneLake is protected with Azure’s ZRS or LRS, ensuring 11–12 nines durability over the course of a year. This setup protects against hardware failures or zone disruptions, providing reliable data storage even in the face of technical issues. 

Business Continuity & Disaster Recovery (BCDR):

OneLake supports geo-replication across two Azure regions for business continuity and disaster recovery (BCDR). If one region experiences a failure, data access can be restored quickly from the secondary region via OneLake APIs, ensuring minimal downtime and continuous availability for mission-critical workloads.

Data Sovereignty:

OneLake’s workspace location selection ensures compliance with regional regulations related to data residency and sovereignty. Organizations can configure their regions and failover capabilities to meet their multi-geo requirements, ensuring data stays within legal boundaries.

Security Architecture Explained

Logical Tenant Isolation:

Microsoft Fabric ensures that each organization’s data and computing environment are logically isolated. This is achieved using secure virtual networks, application-layer controls, and centralized metadata/authorization services, ensuring no cross-contamination of data between tenants.

Authentication & Authorization:

All users and services are validated through Microsoft Entra ID (formerly Azure AD). Resource permissions are stored centrally, with authorization checks happening at every level, from workspace entry to granular data access.

Network & Application Security:

All interactions with OneLake pass through secure web front ends. On the backend, managed code and secure endpoints are used to process requests, ensuring that no user-authored code is executed directly, maintaining a strong defense against attacks.

Visual Reference

The official Microsoft Fabric security architecture diagram illustrates this multi-layered approach. It covers everything from browser/client interactions to authentication, metadata management, backend capacity, and secured virtual networks, providing a comprehensive view of how security is structured. 

Implementation Example and Best Practices

Lakehouse Security Example:

Here’s a practical security setup for a Finance team (a sketch of the column-level piece follows the list): 

  • Role Creation: Create a role for the Finance team with access to the “Transactions” table, allowing access to “Amount” and “Date” columns, but restricting access to “AccountNumber” using Column-Level Security (CLS). 
  • Row-Level Security (RLS): Apply RLS to ensure each team only sees data relevant to their region (e.g., the European team only sees European transactions). 
  • Audit Logging: Enable audit logging with Microsoft Defender for Cloud to track access and suspicious activities. 
  • DevOps/GitHub Integration: Connect the workspace to DevOps or GitHub for automated code backups. 
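
For the column-level piece of this setup, here is a short sketch run against the Warehouse SQL endpoint, using the same pyodbc connection pattern as earlier; the role, table, and column names are placeholders, and the row-level predicate would follow the same shape as the RLS sketch shown in the masking steps.

```python
# Column-level security for the Finance scenario above (T-SQL via the
# Warehouse SQL endpoint). Role, table, and column names are placeholders;
# `cur`/`conn` are the pyodbc cursor and connection from the earlier sketch.
cur.execute("""
    GRANT SELECT ([Amount], [Date])
    ON dbo.Transactions TO [FinanceTeam];
""")
# No grant on [AccountNumber]: FinanceTeam queries that reference it are denied.
conn.commit()
```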

Best Practices:

  • Use Purview for cataloging and classifying sensitive data to ensure compliance. 
  • Audit security roles regularly to ensure adherence to the principle of least privilege. 
  • Enable BCDR for all compliance-critical workspaces to ensure high availability. 
  • Review region selection to meet residency and legal compliance requirements. 

Real-World Scenarios

Regulated Industries:

In industries like healthcare or finance, granular security controls such as OLS, CLS, and RLS can be leveraged to ensure sensitive data (like patient records or financial transactions) is protected and meets regulatory standards. 

Analytics at Scale:

OneLake’s security model is designed to scale for large, complex data estates, offering a centralized, consistent, and high-performance security framework that can handle a vast amount of data with diverse access needs.

Multi-Cloud and On-Prem Integration:

OneLake offers unified security, regardless of whether the source data is in the cloud, on-premises, or across multiple cloud environments. This integration supports a hybrid architecture and ensures that security practices are well-documented and auditable, regardless of the underlying infrastructure.

Conclusion

Securing sensitive data in Microsoft Fabric requires a layered approach that combines data masking, encryption, and strong access controls. 

By implementing these best practices, organizations can confidently unlock the power of analytics while minimizing risk and ensuring compliance. 

Securing Sensitive Data in Microsoft Fabric - FAQs

What does data security in Microsoft Fabric cover?
Data security within Microsoft Fabric is an end-to-end framework that protects sensitive data with features such as encryption at rest and in transit, role-based access control, dynamic data masking, and compliance tools like Microsoft Purview integration. This ensures that regulatory requirements, such as GDPR, are met while keeping data protected from unauthorised access and leaks.

How does data masking in Microsoft Fabric protect sensitive information?
Data Masking in Microsoft Fabric helps secure sensitive data by hiding or obfuscating values for unauthorised users, using methods like dynamic data masking and static data masking. This means that only privileged users can view the true data, while others see anonymised or partially masked information, which is crucial for ensuring compliance and maintaining privacy.

What are the best practices for Microsoft Fabric security architecture?
Best practices for Microsoft Fabric security architecture include implementing granular permissions via workspace roles, enforcing object, column, and row-level security, and leveraging customer-managed keys (CMK) for data encryption. Organisations should regularly review compliance settings, enable multi-factor authentication, and use monitoring with Microsoft Defender for Cloud.

How are governance and compliance supported in Microsoft Fabric?
Governance and compliance in Microsoft Fabric are supported by tools like data lineage, information protection labels, Purview integration, and audit logging, making it easier to catalogue data, prevent loss, and enforce regulatory policies. Fabric’s configurable security ensures organisations can adjust controls to meet evolving compliance standards such as GDPR.

How can organisations extend their existing data security to Microsoft Fabric?
Organisations can extend data security to Microsoft Fabric by integrating it with existing Microsoft Entra (Azure AD) tenants for unified identity management, deploying customer-managed keys in Azure Key Vault, and applying security features like dynamic data masking and private endpoints. This unified approach enforces consistent security controls across cloud and on-premises data assets.
Organisations can extend data security to Microsoft Fabric by integrating it with existing Microsoft Entra (Azure AD) tenants for unified identity management, deploying customer-managed keys in Azure Key Vault, and applying security features like dynamic data masking and private endpoints. This unified approach enforces consistent security controls across cloud and on-premises data assets.