What would you like to happen?
Using the new Delta Standalone library from Databricks, it should be possible to support both reading data from and writing data to Delta Lake.
https://docs.delta.io/latest/delta-standalone.html
We can add a Databricks server metadata type that lets us automatically configure the Hadoop Configuration objects for connectivity. This might also help with issue #2639 (DBFS).
The trick will be to reuse the code and functionality of our Parquet Writer transform to avoid duplication.
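As a rough illustration of how the two pieces could fit together, here is a minimal sketch using the Delta Standalone API: the snapshot iterator covers the read side, while the write side assumes the Parquet Writer transform has already produced a data file that we then commit to the Delta transaction log. The table path, column name, and file details below are hypothetical placeholders, and the `Configuration` object is where the proposed Databricks server metadata would plug in.

```java
import io.delta.standalone.DeltaLog;
import io.delta.standalone.Operation;
import io.delta.standalone.OptimisticTransaction;
import io.delta.standalone.Snapshot;
import io.delta.standalone.actions.AddFile;
import io.delta.standalone.data.CloseableIterator;
import io.delta.standalone.data.RowRecord;
import org.apache.hadoop.conf.Configuration;

import java.util.Collections;

public class DeltaLakeSketch {
  public static void main(String[] args) throws Exception {
    // The Hadoop Configuration would be populated from the proposed
    // Databricks server metadata (credentials, endpoints, and so on).
    Configuration hadoopConf = new Configuration();

    // Read side: open the latest snapshot of a Delta table.
    DeltaLog log = DeltaLog.forTable(hadoopConf, "/data/events"); // hypothetical table path
    Snapshot snapshot = log.snapshot();
    try (CloseableIterator<RowRecord> rows = snapshot.open()) {
      while (rows.hasNext()) {
        RowRecord row = rows.next();
        System.out.println(row.getString("id")); // "id" is a hypothetical column
      }
    }

    // Write side: the Parquet Writer transform produces the data file;
    // a Delta Standalone transaction then commits it to the table log.
    OptimisticTransaction txn = log.startTransaction();
    AddFile file = new AddFile(
        "part-00000.snappy.parquet",      // path relative to the table root (hypothetical)
        Collections.emptyMap(),           // partition values (unpartitioned table)
        1024L,                            // file size in bytes
        System.currentTimeMillis(),       // modification time
        true,                             // dataChange
        null,                             // stats
        null);                            // tags
    txn.commit(
        Collections.singletonList(file),
        new Operation(Operation.Name.WRITE),
        "Apache Hop");
  }
}
```

This keeps the Parquet encoding logic entirely in the existing transform; the Delta-specific work reduces to snapshot reads and transaction-log commits.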
Issue Priority
Priority: 3
Issue Component
Component: Transforms