Comments (6)
@coreystokes-8451 Hey, I have a PR with a fix for the second issue with global init scripts. After another export, it should ignore the dependencies if you have no global init scripts defined in the workspace.
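A rough sketch of what that fix might look like (a simplified illustration, not databricks-sync's actual code; the function name and signature are hypothetical):

```python
# The exported cluster resources carry a depends_on entry pointing at the
# global init scripts resource. When the source workspace has no global init
# scripts, that dependency points at nothing and the import fails, so the
# sketch below drops it.
INIT_SCRIPT_DEP = "databricks_global_init_script.databricks_global_init_scripts"

def prune_init_script_dependency(cluster_resource: dict,
                                 workspace_init_scripts: list) -> dict:
    """Remove the global-init-script dependency when the workspace has none."""
    if not workspace_init_scripts:
        deps = [d for d in cluster_resource.get("depends_on", [])
                if d != INIT_SCRIPT_DEP]
        if deps:
            cluster_resource["depends_on"] = deps
        else:
            # Terraform rejects an empty depends_on list, so drop the key.
            cluster_resource.pop("depends_on", None)
    return cluster_resource
```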
For the cluster policy, can you please post the contents of the failing cluster policy file:
databricks_cluster_policy_C9628DA3D2000019.tf.json
I am not able to reproduce the issue.
from databricks-sync.
@stikkireddy Here are the contents of the file:
{
"resource": {
"databricks_cluster_policy": {
"databricks_cluster_policy_C9628DA3D2000019": {
"definition": "{\"spark_conf.spark.databricks.cluster.profile\": {\"type\": \"fixed\", \"hidden\": false, \"value\": \"singleNode\"}, \"spark_version\": {\"type\": \"unlimited\", \"defaultValue\": \"9.1.x-scala2.12\"}, \"autotermination_minutes\": {\"type\": \"fixed\", \"value\": 30, \"hidden\": true}, \"ssh_public_keys.*\": {\"type\": \"forbidden\", \"hidden\": true}, \"custom_tags.GBxBusinessUnit\": {\"type\": \"fixed\", \"value\": \"kpm\"}, \"custom_tags.GBxGroup\": {\"type\": \"fixed\", \"value\": \"TmMedia\"}, \"custom_tags.GBxApplicationName\": {\"type\": \"unlimited\", \"defaultValue\": \"managedteam\"}, \"custom_tags.GBxEnvironment\": {\"type\": \"fixed\", \"value\": \"stg\"}, \"custom_tags.product\": {\"type\": \"unlimited\", \"isOptional\": false, \"defaultValue\": \"${product}\"}, \"custom_tags.dbx_profit_stream\": {\"type\": \"allowlist\", \"defaultValue\": \"<Enter Profit Stream>\", \"values\": [\"insights\", \"enterprise_capabilities\", \"bcc\", \"in_store\", \"kpm\", \"merch_analytics\", \"merch_ops\", \"digital\", \"kroger_media\", \"loyalty\", \"marketing\", \"comml_platforms\", \"inmar_partnership\", \"iri_partnership\", \"non_endemic\", \"patient_monitoring\", \"pearl_rock\", \"strategic_partnerships\"]}, \"init_scripts.0.dbfs.destination\": {\"type\": \"unlimited\", \"defaultValue\": \"dbfs:/init-scripts/foundation-components.sh\", \"isOptional\": true}, \"init_scripts.1.dbfs.destination\": {\"type\": \"unlimited\", \"defaultValue\": \"dbfs:/databricks/unravel/unravel-db-sensor-archive/dbin/install-unravel.sh\", \"isOptional\": false}, \"cluster_log_conf.type\": {\"type\": \"fixed\", \"value\": \"DBFS\"}, \"cluster_log_conf.path\": {\"type\": \"fixed\", \"value\": \"dbfs:/cluster-logs\"}, \"spark_conf.spark.driver.extraJavaOptions\": {\"type\": \"fixed\", \"value\": \"-Dcom.unraveldata.client.rest.request.timeout.ms=1000 -Dcom.unraveldata.client.rest.conn.timeout.ms=1000 -Dcom.unraveldata.client.resolve.hostname=true -Dcom.unraveldata.agent.metrics.ganglia_enabled=true -javaagent:/dbfs/databricks/unravel/unravel-agent-pack-bin/btrace-agent.jar=config=driver,script=StreamingProbe.btclass,libs=spark-3.0\"}, \"spark_conf.spark.unravel.server.hostport\": {\"type\": \"fixed\", \"value\": \"unravel-nonprod.8451.cloud:4043\"}, \"spark_conf.spark.eventLog.enabled\": {\"type\": \"fixed\", \"value\": \"true\"}, \"spark_conf.spark.eventLog.dir\": {\"type\": \"fixed\", \"value\": \"dbfs:/databricks/unravel/eventLogs/\"}, \"spark_conf.spark.executor.extraJavaOptions\": {\"type\": \"fixed\", \"value\": \"-Dcom.unraveldata.client.rest.request.timeout.ms=1000 -Dcom.unraveldata.client.rest.conn.timeout.ms=1000 -Dcom.unraveldata.client.resolve.hostname=true -javaagent:/dbfs/databricks/unravel/unravel-agent-pack-bin/btrace-agent.jar=config=executor,libs=spark-3.0\"}, \"spark_conf.spark.unravel.shutdown.delay.ms\": {\"type\": \"fixed\", \"value\": \"300\"}, \"spark_conf.spark.databricks.repl.allowedLanguages\": {\"type\": \"fixed\", \"value\": \"python,sql\"}, \"instance_pool_id\": {\"type\": \"allowlist\", \"defaultValue\": \"0525-144559-sorts3-pool-j050ndks\", \"values\": [\"0525-144559-sorts3-pool-j050ndks\", \"0518-175345-ace72-pool-tutlv4io\", \"0525-144601-fired4-pool-4njo3v6o\", \"0525-144602-said5-pool-0ez8dggw\", \"0525-144604-basin6-pool-o7xc71r4\", \"0518-175342-yokel71-pool-34o1ayzl\", \"0518-175345-angle74-pool-lurl6274\", \"0518-175345-clear73-pool-v2bpr99c\"]}, \"num_workers\": {\"type\": \"fixed\", \"value\": 0, \"hidden\": true}}",
"name": "TmMedia - Single Node Clusters"
}
}
}
}
@coreystokes-8451 Thanks for this info. It is failing because of `${product}`, which Terraform tries to interpolate as a variable.
I will fix this by encoding the definition in base64 on export and decoding it at import time. This avoids having to escape special characters that Terraform would otherwise treat as variable interpolation. PR #114 has the fix for you; it will be released as 0.3.1.
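The base64 round trip can be sketched like this (a simplified illustration, not the actual databricks-sync implementation; the function names are hypothetical):

```python
import base64

def encode_definition(definition_json: str) -> str:
    """Base64-encode a policy definition on export so tokens like
    "${product}" never reach Terraform as raw text to interpolate."""
    return base64.b64encode(definition_json.encode("utf-8")).decode("ascii")

def decode_definition(encoded: str) -> str:
    """Decode the definition back to plain JSON at import time."""
    return base64.b64decode(encoded).decode("utf-8")

definition = '{"custom_tags.product": {"type": "unlimited", "defaultValue": "${product}"}}'
encoded = encode_definition(definition)
# The encoded string contains no "${" sequence, so it is safe to embed
# in generated .tf.json and decode on the other side.
```

In generated Terraform configuration, the decode step could also be expressed with Terraform's built-in `base64decode` function rather than in Python.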
@stikkireddy Thank You!
@stikkireddy I think I found another issue, this time with importing a cluster. Let me know if you want me to open a separate issue for this:
2022-06-07 15:03:08 [INFO] │ Error: Conflicting configuration arguments
2022-06-07 15:03:08 [INFO] │
2022-06-07 15:03:08 [INFO] │ with databricks_cluster.databricks_cluster_0525_145639_efwf0ccj,
2022-06-07 15:03:08 [INFO] │ on databricks_cluster_0525_145639_efwf0ccj.tf.json line 13, in resource.databricks_cluster.databricks_cluster_0525_145639_efwf0ccj:
2022-06-07 15:03:08 [INFO] │ 13: "driver_node_type_id": "${var.Standard_E8as_v4}",
2022-06-07 15:03:08 [INFO] │
2022-06-07 15:03:08 [INFO] │ "driver_node_type_id": conflicts with instance_pool_id
2022-06-07 15:03:08 [INFO] ╵
2022-06-07 15:03:08 [INFO] ╷
2022-06-07 15:03:08 [INFO] │ Error: Conflicting configuration arguments
2022-06-07 15:03:08 [INFO] │
2022-06-07 15:03:08 [INFO] │ with databricks_cluster.databricks_cluster_0525_145639_efwf0ccj,
2022-06-07 15:03:08 [INFO] │ on databricks_cluster_0525_145639_efwf0ccj.tf.json line 28, in resource.databricks_cluster.databricks_cluster_0525_145639_efwf0ccj:
2022-06-07 15:03:08 [INFO] │ 28: "instance_pool_id": "${databricks_instance_pool.databricks_instance_pool_0525_144559_sorts3_pool_j050ndks.id}",
2022-06-07 15:03:08 [INFO] │
2022-06-07 15:03:08 [INFO] │ "instance_pool_id": conflicts with driver_node_type_id
2022-06-07 15:03:08 [INFO] ╵
2022-06-07 15:03:08 [INFO] ╷
2022-06-07 15:03:08 [INFO] │ Error: Conflicting configuration arguments
2022-06-07 15:03:08 [INFO] │
2022-06-07 15:03:08 [INFO] │ with databricks_cluster.databricks_cluster_0525_145639_efwf0ccj,
2022-06-07 15:03:08 [INFO] │ on databricks_cluster_0525_145639_efwf0ccj.tf.json line 29, in resource.databricks_cluster.databricks_cluster_0525_145639_efwf0ccj:
2022-06-07 15:03:08 [INFO] │ 29: "node_type_id": "${var.Standard_E8as_v4}",
Here is the JSON for the exported cluster. We use pre-defined instance pools:
{
"resource": {
"databricks_cluster": {
"databricks_cluster_0525_145639_efwf0ccj": {
"autotermination_minutes": 30,
"cluster_name": "test-cluster",
"custom_tags": {
"ResourceClass": "SingleNode"
},
"depends_on": [
"databricks_global_init_script.databricks_global_init_scripts"
],
"driver_node_type_id": "${var.Standard_E8as_v4}",
"dynamic": [
{
"azure_attributes": {
"content": {
"availability": "SPOT_WITH_FALLBACK_AZURE",
"first_on_demand": 2147483647,
"spot_bid_max_price": -1.0
},
"for_each": "${upper(var.CLOUD) == \"AZURE\" ? [1] : []}"
}
}
],
"enable_elastic_disk": "${upper(var.CLOUD) == \"AZURE\" ? true : true}",
"enable_local_disk_encryption": false,
"instance_pool_id": "${databricks_instance_pool.databricks_instance_pool_0525_144559_sorts3_pool_j050ndks.id}",
"node_type_id": "${var.Standard_E8as_v4}",
"num_workers": 0,
"single_user_name": "[email protected]",
"spark_conf": {
"spark.databricks.cluster.profile": "singleNode",
"spark.databricks.delta.preview.enabled": "true",
"spark.databricks.passthrough.enabled": "true",
"spark.master": "local[*, 4]"
},
"spark_env_vars": {
"PYSPARK_PYTHON": "/databricks/python3/bin/python3"
},
"spark_version": "10.4.x-scala2.12"
}
}
}
}
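The errors above arise because the exported resource sets `node_type_id` and `driver_node_type_id` alongside `instance_pool_id`, and the Databricks Terraform provider rejects that combination. A manual workaround can be sketched as a post-processing step on the exported resource (a hypothetical helper, not part of databricks-sync, assuming the instance pool should take precedence):

```python
def resolve_pool_conflicts(cluster: dict) -> dict:
    """If the exported cluster references an instance pool, drop the
    node-type fields that conflict with instance_pool_id so the
    generated .tf.json passes validation."""
    if "instance_pool_id" in cluster:
        cluster.pop("node_type_id", None)
        cluster.pop("driver_node_type_id", None)
    return cluster
```

This assumes both the driver and the workers come from the pool; a cluster whose driver uses a separate pool or explicit node type would need different handling.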
@coreystokes-8451 Please create another issue for this (can you also go to the source cluster and post its JSON?). I want to confirm whether both the driver and the workers are associated with pools.