I believe there's a small bug in `dbt_project.yml` where the `file_format` key gets set. When trying to upgrade, we get the following errors:
```
Compilation Error in model seed_executions (models/sources/seed_executions.sql)
Invalid file format provided:
Expected one of: text, csv, json, jdbc, parquet, orc, hive, delta, libsvm, hudi

> in macro dbt_databricks_validate_get_file_format (macros/materializations/incremental/validate.sql)
> called by macro materialization_incremental_databricks (macros/materializations/incremental/incremental.sql)
> called by model seed_executions (models/sources/seed_executions.sql)
```
And similar for each dbt artifacts model.
I believe the fix is to change the piece below from `target.name` to `target.type`:

```yaml
# current
+file_format: '{{ "delta" if target.name == "databricks" else "" }}'
# proposed
+file_format: '{{ "delta" if target.type == "databricks" else "" }}'
```

With `target.name`, the check only passes when the profile target happens to be named `databricks`; for any other target name the expression renders an empty string, which is what trips the file format validation above. `target.type` is the adapter type, so it equals `databricks` whenever the Databricks adapter is in use, regardless of what the target is called.
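To illustrate the difference between the two expressions, here is a minimal sketch using the `jinja2` package, with a hypothetical target context whose profile target is named `dev` but whose adapter type is `databricks`:

```python
from jinja2 import Template

# Hypothetical stand-in for dbt's `target` context variable:
# the profile target is named "dev", the adapter type is "databricks".
target = {"name": "dev", "type": "databricks"}

by_name = Template('{{ "delta" if target.name == "databricks" else "" }}')
by_type = Template('{{ "delta" if target.type == "databricks" else "" }}')

# The name-based check misses and renders an empty string, which is
# not a valid file_format; the type-based check renders "delta".
print(repr(by_name.render(target=target)))  # ''
print(repr(by_type.render(target=target)))  # 'delta'
```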
For reference, the line in question is in `dbt_artifacts/dbt_project.yml`, line 15 at commit `6077894`.