#StackBounty: #java #apache-spark #databricks How to detect Databricks environment programmatically

Bounty: 50

I’m writing a Spark job that needs to be runnable locally as well as on Databricks.

The code has to differ slightly between the two environments (file paths, mainly), so I’m trying to find a way to detect whether the job is running on Databricks. The best approach I’ve found so far is to look for a “dbfs” directory in the root directory and, if it’s there, assume the job is running on Databricks. This doesn’t feel like the right solution. Does anyone have a better idea?
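One commonly suggested alternative to probing the filesystem is checking for an environment variable that the Databricks runtime sets on its clusters. The sketch below assumes the `DATABRICKS_RUNTIME_VERSION` variable is present on Databricks and absent locally; verify this holds on your specific runtime version before relying on it.

```java
public final class EnvDetect {

    /**
     * Returns true when the process appears to be running on a Databricks
     * cluster. Assumption: Databricks sets DATABRICKS_RUNTIME_VERSION in the
     * environment of driver and executor JVMs; a local run does not.
     */
    public static boolean isRunningOnDatabricks() {
        return System.getenv("DATABRICKS_RUNTIME_VERSION") != null;
    }

    public static void main(String[] args) {
        // Example: pick a base path depending on the detected environment.
        String basePath = isRunningOnDatabricks()
                ? "dbfs:/mnt/data"      // hypothetical DBFS mount for illustration
                : "file:/tmp/data";     // hypothetical local path for illustration
        System.out.println("Reading from: " + basePath);
    }
}
```

Because the check is a plain environment-variable lookup, it works the same on the driver and inside executor tasks, and it avoids coupling the detection logic to the layout of the root filesystem.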

