Data Engineering · 15 min read · November 16, 2025

Databricks November 2025: What Data Teams Need to Know

Databricks' new features: TISAX compliance controls, an SFTP connector, and ABAC access control. What they mean for your data pipelines and how to use them.

Databricks released three features in November 2025 that solve real problems data teams face: TISAX compliance controls, an SFTP connector for Auto Loader, and Attribute-Based Access Control (ABAC).

These aren't flashy announcements. They're practical tools that make data pipelines easier to build and secure. Based on our work with teams implementing these features, here's what each one does and when you'd use it.

The Three Updates

TISAX Compliance Controls (Public Preview): Workspace-level compliance features aligned with TISAX standards. Useful if you work with automotive or manufacturing data.

SFTP Connector (Public Preview): Auto Loader now reads files from SFTP servers. No more manual file transfers or custom scripts.

Attribute-Based Access Control (ABAC) (Public Preview): Tag-based access control. Define policies once, apply them across tables, databases, and workspaces.

Let's break down each one.

Why These Updates Matter

Why do these features exist? Data teams face three common problems:

Compliance complexity: Regulated industries need audit trails and compliance controls. Building these yourself is time-consuming and error-prone.

Legacy system integration: Many companies still use SFTP for file transfers. Moving files manually between SFTP and cloud storage adds overhead and delays.

Access control sprawl: As data grows, managing permissions becomes a nightmare. Hundreds of tables, dozens of teams, complex rules. Traditional RBAC doesn't scale.

These three features address each problem directly. They're production-ready solutions to real workflow pain points.

TISAX Compliance Controls

TISAX stands for Trusted Information Security Assessment Exchange. It's a standard for information security in the automotive industry. If you're working with car manufacturers or suppliers, you've probably heard of it.

What it does:

  • Adds workspace-level compliance controls
  • Helps meet TISAX audit requirements
  • Provides compliance reporting and documentation

When you need it:

  • Working with automotive industry data
  • Handling sensitive manufacturing data
  • Required to meet TISAX standards for client contracts

What it means for you: Configure Databricks workspaces to meet TISAX requirements without building custom compliance tooling. The controls are built into the platform.

Example scenario: A car manufacturer needs to share production data with suppliers. The contract requires TISAX compliance. You configure the Databricks workspace with TISAX controls enabled. Auditors can verify compliance through built-in reporting.

Getting started:

  1. Enable TISAX compliance in workspace settings
  2. Configure compliance policies
  3. Review compliance reports regularly
  4. Document controls for audits

What gets audited:

  • Access controls and authentication
  • Data encryption at rest and in transit
  • Audit logging and monitoring
  • Incident response procedures
  • Data retention and deletion policies
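
One piece of this you can check programmatically today is the audit trail. A minimal sketch using Databricks system tables, assuming system tables are enabled in your account (system.access.audit is the built-in audit log):

# Review recent access and permission events from the built-in
# audit log (system tables must be enabled for the account)
recent_events = spark.sql("""
    SELECT event_time, user_identity.email AS user_email,
           service_name, action_name
    FROM system.access.audit
    WHERE event_date >= date_sub(current_date(), 30)
    ORDER BY event_time DESC
    LIMIT 100
""")
display(recent_events)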

Real-world example: A German automotive supplier processes production data for multiple OEMs. Each contract requires TISAX Level 3 compliance. Before this feature, they maintained separate compliance documentation and custom audit reports.

Now they turn on TISAX controls in their Databricks workspace, configure policies once, and generate compliance reports on demand. Auditors can verify controls directly in the platform.

Cost considerations: TISAX compliance controls don't add extra licensing costs. You pay for Databricks compute and storage as usual. The compliance overhead is minimal—mostly configuration time and periodic audit reviews.

Limitations:

  • Still in public preview (expect changes)
  • Requires Unity Catalog for full feature set
  • Some compliance features may need additional configuration
  • Not a replacement for comprehensive security practices

It's early, but if TISAX compliance is blocking your projects, this feature removes a major hurdle.

SFTP Connector for Auto Loader

Auto Loader is Databricks' streaming file ingestion tool. It watches cloud storage (S3, ADLS, GCS) for new files and processes them automatically. Now it works with SFTP servers too.

What it does:

  • Connects Auto Loader to SFTP servers
  • Streams files from SFTP as they arrive
  • No manual file transfers needed

When you need it:

  • Legacy systems that only support SFTP
  • Partner integrations via SFTP
  • Systems that can't write directly to cloud storage

What it means for you: Replace manual SFTP file transfers with automated streaming. Auto Loader handles the connection, file detection, and processing.

Example scenario: A partner sends daily sales data via SFTP. Your team downloads files manually, uploads to S3, then processes them. With the SFTP connector, Auto Loader reads directly from the SFTP server. Files process automatically as they arrive.

Setup:

# SFTP connection details (these live in the Unity Catalog connection
# shown below; the stream itself only references the connection name
# and the remote path)
sftp_options = {
    "host": "partner-sftp.example.com",
    "username": "your_username",
    "password": "your_password",  # Or use Databricks secrets (see below)
    "path": "/data/incoming"
}

# Read files with Auto Loader through the named connection
df = spark.readStream \
    .format("cloudFiles") \
    .option("cloudFiles.format", "csv") \
    .option("cloudFiles.connectionName", "sftp_connection") \
    .load(sftp_options["path"])

Using Databricks secrets (recommended):

# Store credentials in Databricks secrets
# databricks secrets create-scope --scope sftp-secrets
# databricks secrets put --scope sftp-secrets --key password

# Same connection details, now pulling the password from the secret
# scope instead of hardcoding it
sftp_options = {
    "host": "partner-sftp.example.com",
    "username": "your_username",
    "password": dbutils.secrets.get(scope="sftp-secrets", key="password"),
    "path": "/data/incoming",
    "port": 22,  # Default SFTP port
    "timeout": 30000  # 30 seconds, in milliseconds
}

df = spark.readStream \
    .format("cloudFiles") \
    .option("cloudFiles.format", "csv") \
    .option("cloudFiles.connectionName", "sftp_connection") \
    .option("cloudFiles.inferColumnTypes", "true") \
    .load(sftp_options["path"])

Connection configuration: You need to create a connection object in Unity Catalog first. This stores the SFTP server details securely:

CREATE CONNECTION sftp_connection
TYPE SFTP
OPTIONS (
    host 'partner-sftp.example.com',
    port '22',
    user 'your_username',
    password secret('sftp-secrets', 'password')
);
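
With the connection in place, the last step is writing the stream somewhere durable. A minimal sketch, assuming a target table bronze.partner_sales and a checkpoint path of our choosing (both names are illustrative):

# Write the SFTP stream to a Delta table; the checkpoint records
# which files have been processed, so reruns pick up only new arrivals
df.writeStream \
    .format("delta") \
    .option("checkpointLocation", "/Volumes/main/bronze/checkpoints/partner_sales") \
    .trigger(availableNow=True) \
    .toTable("bronze.partner_sales")

For large backlogs, you can also cap each micro-batch on the read side with the cloudFiles.maxFilesPerTrigger option.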

Benefits:

  • Eliminates manual file transfers
  • Processes files as they arrive
  • Handles connection failures automatically
  • Scales with file volume
  • Supports incremental processing (only new files)
  • Works with existing Auto Loader features (schema inference, checkpointing)

Common use cases:

  • Partner data feeds (daily/weekly batch files)
  • Legacy system exports (ERP, CRM systems)
  • Third-party vendor data (marketing, sales data)
  • Regulatory reporting files

Troubleshooting:

  • Connection timeouts: Increase timeout value, check network connectivity
  • Authentication failures: Verify credentials, check SFTP server logs
  • File not found errors: Verify path exists, check permissions
  • Slow performance: SFTP is slower than cloud storage; consider batch size tuning

Performance considerations: SFTP is slower than reading from cloud storage. Expect 2-5x slower throughput compared to S3/ADLS. For high-volume scenarios, you might still want to stage files in cloud storage first.

Use SFTP connector when:

  • Files arrive infrequently (daily/weekly)
  • File sizes are moderate (< 1GB per file)
  • Real-time processing isn't critical

Limitations:

  • Still in public preview
  • Requires SFTP server credentials
  • Network connectivity between Databricks and SFTP server
  • Slower than cloud storage (network overhead)
  • Some SFTP servers may have connection limits
  • Doesn't support SFTP key-based auth yet (password only)

Migration path: If you're currently using a Lambda function or Airflow DAG to download SFTP files and upload to S3, you can replace it with the SFTP connector.

The migration steps:

  1. Set up SFTP connection in Unity Catalog
  2. Update your Auto Loader code to use SFTP path
  3. Test with a small subset of files
  4. Monitor for a few days before decommissioning old process

If you're moving files from SFTP to cloud storage just to process them, this connector removes that step.

Attribute-Based Access Control (ABAC)

ABAC is an access control model that grants access based on attributes. In Databricks, those attributes are tags: instead of managing permissions for each table or database, you tag resources and define policies based on the tags.

What it does:

  • Defines access policies using tags
  • Applies policies across multiple resources
  • Centralizes access control management

When you need it:

  • Complex access control requirements
  • Many tables with similar access patterns
  • Need to change access rules frequently
  • Compliance requirements for data classification

What it means for you: Tag tables with attributes like "sensitive", "department:finance", or "region:eu". Then define policies like "users with role 'analyst' can read tables tagged 'department:finance'".

When you add a new table, tag it and the policy applies automatically.

Example scenario: You have 200 tables across multiple databases. Some contain PII, some are finance data, some are EU-only. Without ABAC, you manage permissions for each table individually. With ABAC, you tag tables and define policies:

  • Tag: "contains_pii" → Policy: Only data engineers can read
  • Tag: "department:finance" → Policy: Finance team can read
  • Tag: "region:eu" → Policy: EU users only

How it works:

  1. Tag your data assets (tables, databases, workspaces)
  2. Define policies based on tags
  3. Policies apply automatically to tagged resources

Example policy:

-- Tag a table
ALTER TABLE sales.transactions 
SET TAGS ('contains_pii', 'department:finance', 'region:us');

-- Policy: Users with 'analyst' role can read tables tagged 'department:finance'
-- (Policy defined in Unity Catalog)

Common tag patterns: Organize tags hierarchically for easier policy management:

-- Data classification tags
ALTER TABLE sales.transactions SET TAGS ('data_class:confidential');

-- Department tags
ALTER TABLE finance.revenue SET TAGS ('department:finance', 'department:accounting');

-- Geographic tags
ALTER TABLE eu.customers SET TAGS ('region:eu', 'country:de', 'gdpr:true');

-- Sensitivity tags
ALTER TABLE hr.employees SET TAGS ('contains_pii', 'contains_salary', 'hr_data:true');
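
Unity Catalog exposes applied tags through information_schema, which makes it easy to review what's actually tagged. A quick sketch (my_catalog is a placeholder for your catalog name):

# List every table tag in a catalog to review against your convention
applied_tags = spark.sql("""
    SELECT schema_name, table_name, tag_name, tag_value
    FROM my_catalog.information_schema.table_tags
    ORDER BY schema_name, table_name, tag_name
""")
display(applied_tags)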

Policy examples (syntax is illustrative and may change while ABAC is in preview):

-- Policy: Finance team can read finance tables
-- Matches any table tagged 'department:finance'
CREATE POLICY finance_read_policy
AS PERMISSIVE
FOR SELECT
TO ROLE finance_analyst
USING (TAG('department') = 'finance');

-- Policy: EU users only for GDPR data
CREATE POLICY eu_only_policy
AS PERMISSIVE
FOR SELECT
TO ROLE eu_analyst
USING (TAG('region') = 'eu' AND TAG('gdpr') = 'true');

-- Policy: Data engineers can write to staging tables
CREATE POLICY staging_write_policy
AS PERMISSIVE
FOR INSERT, UPDATE
TO ROLE data_engineer
USING (TAG('environment') = 'staging');

Real-world scenario: A financial services company has 500+ tables across multiple databases. Access rules are complex:

  • Finance team needs read access to finance tables
  • Analysts need read access to non-PII tables
  • Data engineers need write access to staging tables
  • EU team needs access to EU-tagged tables only
  • PII tables require special approval

Without ABAC: Manage 500+ table permissions, update each when rules change, audit is manual.

With ABAC: Tag tables once, define 8 policies, changes apply automatically. New tables inherit policies from tags.

Tagging strategy: Start with a simple tagging scheme, then expand:

  1. Data classification: public, internal, confidential, restricted
  2. Department: finance, sales, hr, operations
  3. Region: us, eu, apac
  4. Environment: dev, staging, prod
  5. Sensitivity: contains_pii, contains_salary, gdpr_covered
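
A scheme like this is straightforward to apply in bulk from a notebook. A sketch using the same SET TAGS syntax as above (the table list and tag values are illustrative):

# Bulk-apply convention tags; in practice, drive this from a config
# file or your data catalog inventory
tables_to_tag = {
    "finance.revenue": ["data_class:confidential", "department:finance"],
    "sales.transactions": ["data_class:internal", "department:sales", "region:us"],
}
for table, tags in tables_to_tag.items():
    tag_list = ", ".join(f"'{t}'" for t in tags)
    spark.sql(f"ALTER TABLE {table} SET TAGS ({tag_list})")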

Benefits:

  • Centralized policy management
  • Easier to audit and understand access
  • Scales to many resources
  • Changes apply automatically
  • Self-documenting (tags show data characteristics)
  • Reduces permission errors (policies are consistent)

Common pitfalls:

  • Over-tagging: Too many tags make policies complex. Start simple.
  • Tag inconsistency: Use a tag naming convention and document it.
  • Policy conflicts: Multiple policies can conflict. Test thoroughly.
  • Migration complexity: Migrating from RBAC to ABAC requires planning.

Migration from RBAC: If you're using traditional role-based access, migrating to ABAC follows these steps:

  1. Audit current permissions (what access exists today?)
  2. Identify common patterns (which tables share access rules?)
  3. Design tag schema (what tags capture these patterns?)
  4. Tag existing tables (apply tags to current resources)
  5. Create policies (define policies based on tags)
  6. Test in dev workspace (verify policies work correctly)
  7. Roll out gradually (start with one database, expand)

When not to use it:

  • Simple access patterns (RBAC might be enough)
  • Small number of resources (fewer than 20 tables)
  • Access rules are unique per resource (no patterns)
  • Team is new to Unity Catalog (learn Unity Catalog first)

ABAC shines when you have many resources with similar access patterns. Instead of managing 200 table permissions, you manage 10 policies.

The Bigger Picture: Databricks in 2025

These features are part of a larger trend. Databricks is focusing on:

Enterprise readiness: Compliance controls, better access management, more connectors. Making the platform work for regulated industries and large organizations.

Easier data ingestion: SFTP connector joins existing connectors for cloud storage, databases, and streaming sources. The goal is to connect to any data source without custom code.

Better governance: ABAC is part of Unity Catalog's governance features. Tag-based policies, data lineage, and centralized metadata management.

What this means: Databricks is evolving from a data processing tool to a complete data platform. These November updates fill gaps that teams were working around with custom solutions.

Should You Use These Features?

TISAX Compliance: Use it if you need TISAX compliance. Otherwise, skip it. It's very specific to automotive and manufacturing industries.

SFTP Connector: Use it if you're moving files from SFTP servers. It's simpler than building custom SFTP integration code. If you're already using Auto Loader for cloud storage, this extends it to SFTP.

ABAC: Use it if you have complex access control needs across many resources. If you have 10 tables with simple permissions, RBAC is probably enough. If you have 100+ tables with department-based, region-based, or sensitivity-based access, ABAC will save time.

Getting Started

All three features are in public preview. That means:

  • They're functional but may change
  • Some edge cases might not be handled
  • Documentation is still being updated
  • Support might be limited

Recommendation:

  • Test in a development workspace first
  • Start with SFTP connector if you have that use case (lowest risk)
  • Try ABAC on a subset of tables before rolling out widely
  • Wait on TISAX unless you need it

Testing approach:

  1. Set up a test workspace
  2. Try the feature with sample data
  3. Verify it works for your use case
  4. Plan migration from current approach
  5. Roll out gradually in production

Troubleshooting Common Issues

SFTP Connector Problems:

Connection refused errors:

  • Verify SFTP server is accessible from Databricks clusters
  • Check firewall rules (Databricks IP ranges)
  • Verify port number (default 22)
  • Test connection manually with SFTP client first
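
A quick reachability check you can run from a notebook on the cluster itself, using only the Python standard library (host name is an example):

# Verify the cluster can open a TCP connection to the SFTP port
import socket

try:
    socket.create_connection(("partner-sftp.example.com", 22), timeout=10).close()
    print("SFTP port is reachable from this cluster")
except OSError as e:
    print(f"Cannot reach SFTP server: {e}")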

Authentication failures:

  • Double-check username and password
  • Verify credentials in Databricks secrets
  • Check SFTP server authentication logs
  • Some servers require specific user formats (e.g., domain\user)

Slow performance:

  • SFTP is inherently slower than cloud storage
  • Consider batch processing instead of streaming for large files
  • Check network latency between Databricks and SFTP server
  • Verify SFTP server isn't rate-limiting connections

ABAC Policy Issues:

Policies not applying:

  • Verify tags are set correctly (check spelling, case sensitivity)
  • Ensure Unity Catalog is enabled
  • Check policy syntax (SQL conditions must be valid)
  • Verify user has correct role assigned

Access denied errors:

  • Check if multiple policies conflict
  • Verify tag values match policy conditions exactly
  • Review policy precedence (permissive vs restrictive)
  • Check if table-level grants override policies
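
To see the direct grants that may interact with tag policies on a specific table (table name is an example):

# Direct grants on the table, independent of ABAC policies
display(spark.sql("SHOW GRANTS ON TABLE sales.transactions"))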

Tag management:

  • Document your tagging convention
  • Use consistent tag names (avoid typos)
  • Consider tag validation scripts
  • Review tags periodically for consistency
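
A simple validation pass can catch drift automatically. A sketch that checks tag keys against the documented convention (the allowed set mirrors the examples above; my_catalog is a placeholder):

# Flag tags that aren't part of the documented naming convention
allowed_keys = {"data_class", "department", "region", "country",
                "environment", "contains_pii", "contains_salary",
                "gdpr", "hr_data"}
rows = spark.sql(
    "SELECT table_name, tag_name FROM my_catalog.information_schema.table_tags"
).collect()
for row in rows:
    key = row.tag_name.split(":")[0]  # 'department:finance' -> 'department'
    if key not in allowed_keys:
        print(f"Unexpected tag '{row.tag_name}' on table {row.table_name}")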

TISAX Compliance:

Audit failures:

  • Review compliance reports regularly
  • Document any exceptions
  • Ensure all required controls are enabled
  • Keep audit logs for required retention period

Configuration issues:

  • Verify Unity Catalog is enabled (required for full features)
  • Check workspace admin permissions
  • Review compliance policy settings
  • Consult Databricks support for specific requirements

Common Questions

Are these features free? They're part of Databricks, so you pay for compute and storage. No additional licensing fees for these features.

When will they reach general availability? Databricks doesn't publish timelines for preview features, but features usually stay in preview for 3-6 months before GA.

Can I use them in production? Yes, but with caution. Preview features can change. Have a rollback plan. Monitor for issues. Start with non-critical workloads.

Do they work with Unity Catalog? ABAC requires Unity Catalog, the SFTP connector relies on a Unity Catalog connection object, and TISAX controls need Unity Catalog for the full feature set. Plan on Unity Catalog for all three.

What about other cloud providers? These features are available on Azure Databricks. AWS and GCP availability may vary. Check Databricks documentation for your cloud provider.

Performance impact:

  • SFTP connector: 2-5x slower than cloud storage (network overhead)
  • ABAC: Minimal performance impact (policy evaluation is fast)
  • TISAX controls: Negligible performance impact (mostly configuration)

Can I combine these features? Yes. For example, you can use SFTP connector to ingest data, tag tables with ABAC tags, and enable TISAX compliance controls—all in the same workspace.

What if I need help?

  • Check Databricks documentation (updated regularly for preview features)
  • Post questions in Databricks Community forums
  • Contact Databricks support (preview features may have limited support)
  • Review release notes for known issues

The Bottom Line

These November updates solve specific problems. TISAX compliance for automotive data. SFTP ingestion for legacy systems. ABAC for complex access control.

They're not revolutionary. They're practical. They remove friction from common workflows.

Quick decision guide:

  • Need TISAX compliance? Enable TISAX controls. It's the only option if you work with automotive suppliers.
  • Moving files from SFTP? Try the SFTP connector. It's simpler than custom scripts and integrates with Auto Loader.
  • Managing 50+ tables with complex access? Explore ABAC. It scales better than manual permissions.

What to do next:

  1. Evaluate your needs: Do any of these features solve real problems you have?
  2. Start small: Pick one feature, test in dev workspace
  3. Measure impact: How much time does it save? How much complexity does it remove?
  4. Expand gradually: Roll out to production after testing

These features are stable enough for production use if they solve your problem. Test thoroughly since they're still in preview.

The trend is clear: Databricks is making enterprise data operations easier. These features are part of that. More connectors, better governance, compliance tooling. The platform is maturing.

Try them in a dev workspace. See if they fit your workflows. If they do, they'll save you time and custom code.

Resources:

  • Databricks documentation: Check the latest docs for these features
  • Unity Catalog guide: ABAC requires Unity Catalog knowledge
  • Auto Loader docs: SFTP connector extends Auto Loader
  • Databricks Community: Ask questions, share experiences

Remember: These are tools, not silver bullets. They solve specific problems well. If your problem doesn't match, stick with what works. But if you're dealing with SFTP transfers, complex access control, or TISAX compliance, these features are worth exploring.
