- Overview
- Immuta Architecture
- Install Immuta in an Air-Gapped Environment
- Overview
- Databricks Spark Integration Overview
- Databricks Spark Pre-Configuration Details
- Databricks Change Data Feed
- Databricks Libraries
- DBFS Access
- Delta Lake API
- Environment Variables
- Ephemeral Overrides
- Py4j Security Error
- S3 Access in Databricks
- Scala Cluster Security Details
- Security Configuration for Performance
- Spark Direct File Reads
- Google BigQuery
- Add a License Key
- Overview
- Projects and Purposes
- Value of Projects
- Create Purposes and Acknowledgement Statements
- Create a Project
- Derived Data Sources
- Create a Derived Data Source
- Use Project UDFs (Databricks)
- Policy Adjustments (Public Preview)
- HIPAA Expert Determination (Public Preview)
- Adjust a Policy (Public Preview)
- Use Expert Determination (Public Preview)
- Immuta Proof of Value (POV)
- Data Setup
- Schema Monitoring and Automatic Sensitive Data Discovery
- Separating Policy Definition from Role Definition - Dynamic Attributes
- Policy Boolean Logic
- Exception-Based Policy Authoring
- Hierarchical Tag-Based Policy Definitions
- Subscription Policies - Benefits of Attribute-Based Table GRANTs
- Purpose-Based Exceptions
- Query Your Data Guide