databricks.getCluster
Note: If you have a fully automated setup with workspaces created by databricks.MwsWorkspaces or azurerm_databricks_workspace, make sure to add a depends_on attribute so that this data source is read only after the workspace exists; otherwise you may see default auth: cannot configure default credentials errors.
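A minimal sketch of that wiring (the resource names are hypothetical, the workspace arguments are elided, and an explicit provider is threaded into the invoke):

```python
import pulumi
import pulumi_databricks as databricks

# Hypothetical workspace; the real resource needs account/network arguments.
workspace = databricks.MwsWorkspaces("this")

# An explicit provider that is only configured once the workspace exists,
# so invokes routed through it cannot run before credentials are available.
ws_provider = databricks.Provider(
    "ws",
    host=workspace.workspace_url,
    opts=pulumi.ResourceOptions(depends_on=[workspace]),
)

# Route the lookup through the explicit provider.
info = databricks.get_cluster_output(
    cluster_name="my-cluster",
    opts=pulumi.InvokeOptions(provider=ws_provider),
)
```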
Retrieves information about a databricks.Cluster using its id. The id can be retrieved programmatically using the databricks.getClusters data source.
Example Usage
Retrieve attributes of each cluster in a workspace
import * as pulumi from "@pulumi/pulumi";
import * as databricks from "@pulumi/databricks";
const all = databricks.getClusters({});
const allGetCluster = all.then(all => all.ids.reduce((__obj, id) => ({
    ...__obj,
    [id]: databricks.getCluster({
        clusterId: id,
    }),
}), {}));
import pulumi
import pulumi_databricks as databricks
all = databricks.get_clusters()
all_get_cluster = {cluster_id: databricks.get_cluster(cluster_id=cluster_id) for cluster_id in all.ids}
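Stripped of the SDK, the comprehension above is just a fan-out from a list of cluster ids to one lookup per id. Here fetch_cluster is a hypothetical stand-in for databricks.get_cluster, since the real invoke needs a live workspace:

```python
# fetch_cluster is a stand-in for databricks.get_cluster: the real invoke
# returns the cluster's full attribute set for a given cluster id.
def fetch_cluster(cluster_id: str) -> dict:
    return {"cluster_id": cluster_id}

# e.g. the ids returned by databricks.get_clusters()
cluster_ids = ["0123-456789-abcdef12", "0123-456789-abcdef34"]

# One lookup per cluster, keyed by cluster id.
clusters = {cid: fetch_cluster(cid) for cid in cluster_ids}
```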
using System.Collections.Generic;
using System.Linq;
using Pulumi;
using Databricks = Pulumi.Databricks;
return await Deployment.RunAsync(() =>
{
var all = Databricks.GetClusters.Invoke();
    var allGetCluster = all.Apply(result => result.Ids.ToDictionary(
        id => id,
        id => Databricks.GetCluster.Invoke(new()
        {
            ClusterId = id,
        })));
});
Related Resources
The following resources are often used in the same context:
- End to end workspace management guide.
- databricks.Cluster to create Databricks Clusters.
- databricks.ClusterPolicy to create a cluster policy, which limits the ability to create clusters based on a set of rules.
- databricks.InstancePool to manage instance pools to reduce cluster start and auto-scaling times by maintaining a set of idle, ready-to-use instances.
- databricks.Job to manage Databricks Jobs to run non-interactive code in a databricks_cluster.
- databricks.Library to install a library on databricks_cluster.
- databricks.Pipeline to deploy Delta Live Tables.
Using getCluster
Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.
function getCluster(args: GetClusterArgs, opts?: InvokeOptions): Promise<GetClusterResult>
function getClusterOutput(args: GetClusterOutputArgs, opts?: InvokeOptions): Output<GetClusterResult>
def get_cluster(cluster_id: Optional[str] = None,
cluster_info: Optional[GetClusterClusterInfo] = None,
cluster_name: Optional[str] = None,
id: Optional[str] = None,
opts: Optional[InvokeOptions] = None) -> GetClusterResult
def get_cluster_output(cluster_id: Optional[pulumi.Input[str]] = None,
cluster_info: Optional[pulumi.Input[GetClusterClusterInfoArgs]] = None,
cluster_name: Optional[pulumi.Input[str]] = None,
id: Optional[pulumi.Input[str]] = None,
opts: Optional[InvokeOptions] = None) -> Output[GetClusterResult]
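In Python, for instance, the difference between the two forms looks like this (a sketch; the cluster name and the wrapped id are illustrative, and a configured workspace is assumed):

```python
import pulumi
import pulumi_databricks as databricks

# Direct form: blocks and returns a plain GetClusterResult.
info = databricks.get_cluster(cluster_name="my-cluster")

# Output form: accepts Input-wrapped arguments (e.g. an Output coming from
# another resource) and returns Output[GetClusterResult].
cluster_id = pulumi.Output.from_input("0123-456789-abcdef12")  # illustrative
info_out = databricks.get_cluster_output(cluster_id=cluster_id)
pulumi.export("nodeType", info_out.cluster_info.node_type_id)
```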
func LookupCluster(ctx *Context, args *LookupClusterArgs, opts ...InvokeOption) (*LookupClusterResult, error)
func LookupClusterOutput(ctx *Context, args *LookupClusterOutputArgs, opts ...InvokeOption) LookupClusterResultOutput
> Note: This function is named LookupCluster in the Go SDK.
public static class GetCluster
{
public static Task<GetClusterResult> InvokeAsync(GetClusterArgs args, InvokeOptions? opts = null)
public static Output<GetClusterResult> Invoke(GetClusterInvokeArgs args, InvokeOptions? opts = null)
}
public static CompletableFuture<GetClusterResult> getCluster(GetClusterArgs args, InvokeOptions options)
// Output-based functions aren't available in Java yet
fn::invoke:
function: databricks:index/getCluster:getCluster
arguments:
# arguments dictionary
The following arguments are supported:
- ClusterId string - The id of the cluster.
- ClusterInfo GetClusterClusterInfo - block, consisting of the fields documented under Supporting Types below.
- ClusterName string - The exact name of the cluster to search.
- Id string - cluster ID.

Every SDK accepts the same arguments with language-specific casing and types: Go uses the same PascalCase names; Java, TypeScript, and YAML use camelCase (clusterId); Python uses snake_case (cluster_id).
getCluster Result
The following output properties are available:
- ClusterId string - cluster ID.
- ClusterInfo GetClusterClusterInfo - block, consisting of the fields documented under Supporting Types below.
- ClusterName string - Cluster name, which doesn't have to be unique.
- Id string - cluster ID.

Every SDK returns the same properties with language-specific casing and types.
Supporting Types
GetClusterClusterInfo
- Autoscale GetClusterClusterInfoAutoscale
- AutoterminationMinutes int - Automatically terminate the cluster after it is inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
- AwsAttributes GetClusterClusterInfoAwsAttributes
- AzureAttributes GetClusterClusterInfoAzureAttributes
- ClusterCores double
- ClusterId string - The id of the cluster.
- ClusterLogConf GetClusterClusterInfoClusterLogConf
- ClusterLogStatus GetClusterClusterInfoClusterLogStatus
- ClusterMemoryMb int
- ClusterName string - The exact name of the cluster to search.
- ClusterSource string
- CreatorUserName string
- CustomTags Dictionary<string, string> - Additional tags for cluster resources.
- DataSecurityMode string - Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Defaults to NONE, i.e. no security feature enabled.
- DefaultTags Dictionary<string, string>
- DockerImage GetClusterClusterInfoDockerImage
- Driver GetClusterClusterInfoDriver
- DriverInstancePoolId string - Similar to instance_pool_id, but for the driver node.
- DriverNodeTypeId string - The node type of the Spark driver.
- EnableElasticDisk bool - Use autoscaling local storage.
- EnableLocalDiskEncryption bool - Enable local disk encryption.
- Executors List<GetClusterClusterInfoExecutor>
- GcpAttributes GetClusterClusterInfoGcpAttributes
- InitScripts List<GetClusterClusterInfoInitScript>
- InstancePoolId string - The pool of idle instances the cluster is attached to.
- JdbcPort int
- LastRestartedTime int
- LastStateLossTime int
- NodeTypeId string - Any supported databricks.getNodeType id.
- NumWorkers int
- PolicyId string - Identifier of the cluster policy used to validate the cluster and preset certain defaults.
- RuntimeEngine string - The type of runtime engine of the cluster.
- SingleUserName string - The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- SparkConf Dictionary<string, string> - Map with key-value pairs to fine-tune Spark clusters.
- SparkContextId int
- SparkEnvVars Dictionary<string, string> - Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- SparkVersion string - Runtime version of the cluster.
- Spec GetClusterClusterInfoSpec
- SshPublicKeys List<string> - SSH public key contents that will be added to each Spark node in this cluster.
- StartTime int
- State string
- StateMessage string
- TerminatedTime int
- TerminationReason GetClusterClusterInfoTerminationReason
- WorkloadType GetClusterClusterInfoWorkloadType

Every SDK exposes these fields with its own casing and types: Go keeps the PascalCase names with map[string]string and []T; Java and TypeScript use camelCase with Map<String,String>/List<...> and {[key: string]: string}/T[] respectively; Python uses snake_case with Mapping[str, str] and Sequence[...]; YAML uses camelCase with Property Map values.
GetClusterClusterInfoAutoscale
- MaxWorkers int
- MinWorkers int
GetClusterClusterInfoAwsAttributes
- Availability string
- EbsVolumeCount int
- EbsVolumeIops int
- EbsVolumeSize int
- EbsVolumeThroughput int
- EbsVolumeType string
- FirstOnDemand int
- InstanceProfileArn string
- SpotBidPricePercent int
- ZoneId string
GetClusterClusterInfoAzureAttributes
- Availability string
- FirstOnDemand int
- LogAnalyticsInfo GetClusterClusterInfoAzureAttributesLogAnalyticsInfo
- SpotBidMaxPrice double
GetClusterClusterInfoAzureAttributesLogAnalyticsInfo
- LogAnalyticsPrimaryKey string
- LogAnalyticsWorkspaceId string
GetClusterClusterInfoClusterLogConf
GetClusterClusterInfoClusterLogConfDbfs
- Destination string
GetClusterClusterInfoClusterLogConfS3
- Destination string
- CannedAcl string
- EnableEncryption bool
- EncryptionType string
- Endpoint string
- KmsKey string
- Region string
GetClusterClusterInfoClusterLogStatus
- LastAttempted int
- LastException string
GetClusterClusterInfoDockerImage
- BasicAuth GetClusterClusterInfoDockerImageBasicAuth
- Url string
GetClusterClusterInfoDockerImageBasicAuth
GetClusterClusterInfoDriver
- HostPrivateIp string
- InstanceId string
- NodeAwsAttributes GetClusterClusterInfoDriverNodeAwsAttributes
- NodeId string
- PrivateIp string
- PublicDns string
- StartTimestamp int
GetClusterClusterInfoDriverNodeAwsAttributes
- IsSpot bool
GetClusterClusterInfoExecutor
- HostPrivateIp string
- InstanceId string
- NodeAwsAttributes GetClusterClusterInfoExecutorNodeAwsAttributes
- NodeId string
- PrivateIp string
- PublicDns string
- StartTimestamp int
GetClusterClusterInfoExecutorNodeAwsAttributes
- IsSpot bool
GetClusterClusterInfoGcpAttributes
- Availability string
- BootDiskSize int
- GoogleServiceAccount string
- LocalSsdCount int
- UsePreemptibleExecutors bool
- ZoneId string
GetClusterClusterInfoInitScript
GetClusterClusterInfoInitScriptAbfss
- Destination string
GetClusterClusterInfoInitScriptDbfs
- Destination string
GetClusterClusterInfoInitScriptFile
- Destination string
GetClusterClusterInfoInitScriptGcs
- Destination string
GetClusterClusterInfoInitScriptS3
- Destination string
- CannedAcl string
- EnableEncryption bool
- EncryptionType string
- Endpoint string
- KmsKey string
- Region string
- Destination string
- CannedAcl string
- EnableEncryption bool
- EncryptionType string
- Endpoint string
- KmsKey string
- Region string
- destination String
- cannedAcl String
- enableEncryption Boolean
- encryptionType String
- endpoint String
- kmsKey String
- region String
- destination string
- cannedAcl string
- enableEncryption boolean
- encryptionType string
- endpoint string
- kmsKey string
- region string
- destination str
- canned_acl str
- enable_encryption bool
- encryption_type str
- endpoint str
- kms_key str
- region str
- destination String
- cannedAcl String
- enableEncryption Boolean
- encryptionType String
- endpoint String
- kmsKey String
- region String
GetClusterClusterInfoInitScriptVolumes
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoInitScriptWorkspace
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoSpec
- ClusterId string - The id of the cluster
- DriverInstancePoolId string - similar to instance_pool_id, but for driver node.
- DriverNodeTypeId string - The node type of the Spark driver.
- EnableElasticDisk bool - Use autoscaling local storage.
- EnableLocalDiskEncryption bool - Enable local disk encryption.
- NodeTypeId string - Any supported databricks.getNodeType id.
- SparkVersion string - Runtime version of the cluster.
- ApplyPolicyDefaultValues bool
- Autoscale GetClusterClusterInfoSpecAutoscale
- AwsAttributes GetClusterClusterInfoSpecAwsAttributes
- AzureAttributes GetClusterClusterInfoSpecAzureAttributes
- ClusterLogConf GetClusterClusterInfoSpecClusterLogConf
- ClusterMountInfos List<GetClusterClusterInfoSpecClusterMountInfo>
- ClusterName string - The exact name of the cluster to search
- CustomTags Dictionary<string, string> - Additional tags for cluster resources.
- DataSecurityMode string - Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Defaults to NONE, i.e. no security feature enabled.
- DockerImage GetClusterClusterInfoSpecDockerImage
- GcpAttributes GetClusterClusterInfoSpecGcpAttributes
- IdempotencyToken string - An optional token to guarantee the idempotency of cluster creation requests.
- InitScripts List<GetClusterClusterInfoSpecInitScript>
- InstancePoolId string - The pool of idle instances the cluster is attached to.
- Libraries List<GetClusterClusterInfoSpecLibrary>
- NumWorkers int
- PolicyId string - Identifier of Cluster Policy to validate cluster and preset certain defaults.
- RuntimeEngine string - The type of runtime of the cluster
- SingleUserName string - The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- SparkConf Dictionary<string, string> - Map with key-value pairs to fine-tune Spark clusters.
- SparkEnvVars Dictionary<string, string> - Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- SshPublicKeys List<string> - SSH public key contents that will be added to each Spark node in this cluster.
- WorkloadType GetClusterClusterInfoSpecWorkloadType
- ClusterId string - The id of the cluster
- DriverInstancePoolId string - similar to instance_pool_id, but for driver node.
- DriverNodeTypeId string - The node type of the Spark driver.
- EnableElasticDisk bool - Use autoscaling local storage.
- EnableLocalDiskEncryption bool - Enable local disk encryption.
- NodeTypeId string - Any supported databricks.getNodeType id.
- SparkVersion string - Runtime version of the cluster.
- ApplyPolicyDefaultValues bool
- Autoscale GetClusterClusterInfoSpecAutoscale
- AwsAttributes GetClusterClusterInfoSpecAwsAttributes
- AzureAttributes GetClusterClusterInfoSpecAzureAttributes
- ClusterLogConf GetClusterClusterInfoSpecClusterLogConf
- ClusterMountInfos []GetClusterClusterInfoSpecClusterMountInfo
- ClusterName string - The exact name of the cluster to search
- CustomTags map[string]string - Additional tags for cluster resources.
- DataSecurityMode string - Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Defaults to NONE, i.e. no security feature enabled.
- DockerImage GetClusterClusterInfoSpecDockerImage
- GcpAttributes GetClusterClusterInfoSpecGcpAttributes
- IdempotencyToken string - An optional token to guarantee the idempotency of cluster creation requests.
- InitScripts []GetClusterClusterInfoSpecInitScript
- InstancePoolId string - The pool of idle instances the cluster is attached to.
- Libraries []GetClusterClusterInfoSpecLibrary
- NumWorkers int
- PolicyId string - Identifier of Cluster Policy to validate cluster and preset certain defaults.
- RuntimeEngine string - The type of runtime of the cluster
- SingleUserName string - The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- SparkConf map[string]string - Map with key-value pairs to fine-tune Spark clusters.
- SparkEnvVars map[string]string - Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- SshPublicKeys []string - SSH public key contents that will be added to each Spark node in this cluster.
- WorkloadType GetClusterClusterInfoSpecWorkloadType
- clusterId String - The id of the cluster
- driverInstancePoolId String - similar to instance_pool_id, but for driver node.
- driverNodeTypeId String - The node type of the Spark driver.
- enableElasticDisk Boolean - Use autoscaling local storage.
- enableLocalDiskEncryption Boolean - Enable local disk encryption.
- nodeTypeId String - Any supported databricks.getNodeType id.
- sparkVersion String - Runtime version of the cluster.
- applyPolicyDefaultValues Boolean
- autoscale GetClusterClusterInfoSpecAutoscale
- awsAttributes GetClusterClusterInfoSpecAwsAttributes
- azureAttributes GetClusterClusterInfoSpecAzureAttributes
- clusterLogConf GetClusterClusterInfoSpecClusterLogConf
- clusterMountInfos List<GetClusterClusterInfoSpecClusterMountInfo>
- clusterName String - The exact name of the cluster to search
- customTags Map<String,String> - Additional tags for cluster resources.
- dataSecurityMode String - Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Defaults to NONE, i.e. no security feature enabled.
- dockerImage GetClusterClusterInfoSpecDockerImage
- gcpAttributes GetClusterClusterInfoSpecGcpAttributes
- idempotencyToken String - An optional token to guarantee the idempotency of cluster creation requests.
- initScripts List<GetClusterClusterInfoSpecInitScript>
- instancePoolId String - The pool of idle instances the cluster is attached to.
- libraries List<GetClusterClusterInfoSpecLibrary>
- numWorkers Integer
- policyId String - Identifier of Cluster Policy to validate cluster and preset certain defaults.
- runtimeEngine String - The type of runtime of the cluster
- singleUserName String - The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- sparkConf Map<String,String> - Map with key-value pairs to fine-tune Spark clusters.
- sparkEnvVars Map<String,String> - Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- sshPublicKeys List<String> - SSH public key contents that will be added to each Spark node in this cluster.
- workloadType GetClusterClusterInfoSpecWorkloadType
- clusterId string - The id of the cluster
- driverInstancePoolId string - similar to instance_pool_id, but for driver node.
- driverNodeTypeId string - The node type of the Spark driver.
- enableElasticDisk boolean - Use autoscaling local storage.
- enableLocalDiskEncryption boolean - Enable local disk encryption.
- nodeTypeId string - Any supported databricks.getNodeType id.
- sparkVersion string - Runtime version of the cluster.
- applyPolicyDefaultValues boolean
- autoscale GetClusterClusterInfoSpecAutoscale
- awsAttributes GetClusterClusterInfoSpecAwsAttributes
- azureAttributes GetClusterClusterInfoSpecAzureAttributes
- clusterLogConf GetClusterClusterInfoSpecClusterLogConf
- clusterMountInfos GetClusterClusterInfoSpecClusterMountInfo[]
- clusterName string - The exact name of the cluster to search
- customTags {[key: string]: string} - Additional tags for cluster resources.
- dataSecurityMode string - Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Defaults to NONE, i.e. no security feature enabled.
- dockerImage GetClusterClusterInfoSpecDockerImage
- gcpAttributes GetClusterClusterInfoSpecGcpAttributes
- idempotencyToken string - An optional token to guarantee the idempotency of cluster creation requests.
- initScripts GetClusterClusterInfoSpecInitScript[]
- instancePoolId string - The pool of idle instances the cluster is attached to.
- libraries GetClusterClusterInfoSpecLibrary[]
- numWorkers number
- policyId string - Identifier of Cluster Policy to validate cluster and preset certain defaults.
- runtimeEngine string - The type of runtime of the cluster
- singleUserName string - The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- sparkConf {[key: string]: string} - Map with key-value pairs to fine-tune Spark clusters.
- sparkEnvVars {[key: string]: string} - Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- sshPublicKeys string[] - SSH public key contents that will be added to each Spark node in this cluster.
- workloadType GetClusterClusterInfoSpecWorkloadType
- cluster_id str - The id of the cluster
- driver_instance_pool_id str - similar to instance_pool_id, but for driver node.
- driver_node_type_id str - The node type of the Spark driver.
- enable_elastic_disk bool - Use autoscaling local storage.
- enable_local_disk_encryption bool - Enable local disk encryption.
- node_type_id str - Any supported databricks.getNodeType id.
- spark_version str - Runtime version of the cluster.
- apply_policy_default_values bool
- autoscale GetClusterClusterInfoSpecAutoscale
- aws_attributes GetClusterClusterInfoSpecAwsAttributes
- azure_attributes GetClusterClusterInfoSpecAzureAttributes
- cluster_log_conf GetClusterClusterInfoSpecClusterLogConf
- cluster_mount_infos Sequence[GetClusterClusterInfoSpecClusterMountInfo]
- cluster_name str - The exact name of the cluster to search
- custom_tags Mapping[str, str] - Additional tags for cluster resources.
- data_security_mode str - Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Defaults to NONE, i.e. no security feature enabled.
- docker_image GetClusterClusterInfoSpecDockerImage
- gcp_attributes GetClusterClusterInfoSpecGcpAttributes
- idempotency_token str - An optional token to guarantee the idempotency of cluster creation requests.
- init_scripts Sequence[GetClusterClusterInfoSpecInitScript]
- instance_pool_id str - The pool of idle instances the cluster is attached to.
- libraries Sequence[GetClusterClusterInfoSpecLibrary]
- num_workers int
- policy_id str - Identifier of Cluster Policy to validate cluster and preset certain defaults.
- runtime_engine str - The type of runtime of the cluster
- single_user_name str - The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- spark_conf Mapping[str, str] - Map with key-value pairs to fine-tune Spark clusters.
- spark_env_vars Mapping[str, str] - Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- ssh_public_keys Sequence[str] - SSH public key contents that will be added to each Spark node in this cluster.
- workload_type GetClusterClusterInfoSpecWorkloadType
- clusterId String - The id of the cluster
- driverInstancePoolId String - similar to instance_pool_id, but for driver node.
- driverNodeTypeId String - The node type of the Spark driver.
- enableElasticDisk Boolean - Use autoscaling local storage.
- enableLocalDiskEncryption Boolean - Enable local disk encryption.
- nodeTypeId String - Any supported databricks.getNodeType id.
- sparkVersion String - Runtime version of the cluster.
- applyPolicyDefaultValues Boolean
- autoscale Property Map
- awsAttributes Property Map
- azureAttributes Property Map
- clusterLogConf Property Map
- clusterMountInfos List<Property Map>
- clusterName String - The exact name of the cluster to search
- customTags Map<String> - Additional tags for cluster resources.
- dataSecurityMode String - Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Defaults to NONE, i.e. no security feature enabled.
- dockerImage Property Map
- gcpAttributes Property Map
- idempotencyToken String - An optional token to guarantee the idempotency of cluster creation requests.
- initScripts List<Property Map>
- instancePoolId String - The pool of idle instances the cluster is attached to.
- libraries List<Property Map>
- numWorkers Number
- policyId String - Identifier of Cluster Policy to validate cluster and preset certain defaults.
- runtimeEngine String - The type of runtime of the cluster
- singleUserName String - The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- sparkConf Map<String> - Map with key-value pairs to fine-tune Spark clusters.
- sparkEnvVars Map<String> - Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- sshPublicKeys List<String> - SSH public key contents that will be added to each Spark node in this cluster.
- workloadType Property Map
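The sparkEnvVars description above states that key-value pairs of the form (X,Y) are exported as X='Y' when the driver and workers launch. A minimal illustration of that rendering (the helper itself is hypothetical, not provider code):

```python
# Illustrative only: renders spark_env_vars entries the way the description
# above says they are exported while launching the driver and workers.
def render_env_exports(spark_env_vars: dict) -> list:
    return [f"export {key}='{value}'" for key, value in sorted(spark_env_vars.items())]
```

For example, a pair ("PYSPARK_PYTHON", "/databricks/python3/bin/python3") becomes export PYSPARK_PYTHON='/databricks/python3/bin/python3'.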
GetClusterClusterInfoSpecAutoscale
- MaxWorkers int
- MinWorkers int
- MaxWorkers int
- MinWorkers int
- maxWorkers Integer
- minWorkers Integer
- maxWorkers number
- minWorkers number
- max_workers int
- min_workers int
- maxWorkers Number
- minWorkers Number
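A cluster spec carries either a fixed numWorkers or an autoscale block with minWorkers/maxWorkers. A small sketch (hypothetical helper; the either/or relationship between num_workers and autoscale is an assumption drawn from how these fields are typically used) that derives the worker-count range from a spec-like mapping:

```python
# Hypothetical helper: derives (min, max) workers from a spec-like dict.
# Assumes a spec has either an "autoscale" block or a fixed "num_workers".
def worker_range(spec: dict) -> tuple:
    autoscale = spec.get("autoscale")
    if autoscale:
        return (autoscale["min_workers"], autoscale["max_workers"])
    n = spec.get("num_workers", 0)
    return (n, n)
```

A fixed-size cluster thus yields a degenerate range such as (4, 4).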
GetClusterClusterInfoSpecAwsAttributes
- Availability string
- EbsVolumeCount int
- EbsVolumeIops int
- EbsVolumeSize int
- EbsVolumeThroughput int
- EbsVolumeType string
- FirstOnDemand int
- InstanceProfileArn string
- SpotBidPricePercent int
- ZoneId string
- Availability string
- EbsVolumeCount int
- EbsVolumeIops int
- EbsVolumeSize int
- EbsVolumeThroughput int
- EbsVolumeType string
- FirstOnDemand int
- InstanceProfileArn string
- SpotBidPricePercent int
- ZoneId string
- availability String
- ebsVolumeCount Integer
- ebsVolumeIops Integer
- ebsVolumeSize Integer
- ebsVolumeThroughput Integer
- ebsVolumeType String
- firstOnDemand Integer
- instanceProfileArn String
- spotBidPricePercent Integer
- zoneId String
- availability string
- ebsVolumeCount number
- ebsVolumeIops number
- ebsVolumeSize number
- ebsVolumeThroughput number
- ebsVolumeType string
- firstOnDemand number
- instanceProfileArn string
- spotBidPricePercent number
- zoneId string
- availability str
- ebs_volume_count int
- ebs_volume_iops int
- ebs_volume_size int
- ebs_volume_throughput int
- ebs_volume_type str
- first_on_demand int
- instance_profile_arn str
- spot_bid_price_percent int
- zone_id str
- availability String
- ebsVolumeCount Number
- ebsVolumeIops Number
- ebsVolumeSize Number
- ebsVolumeThroughput Number
- ebsVolumeType String
- firstOnDemand Number
- instanceProfileArn String
- spotBidPricePercent Number
- zoneId String
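Among the AWS attributes above, firstOnDemand interacts with the cluster size: in the Databricks cluster API the first firstOnDemand nodes are placed on on-demand capacity, and the remainder follow the availability setting. A hedged sketch of that split (hypothetical helper; the semantics stated here are an assumption based on the cluster API, not something this data source enforces):

```python
# Illustrative sketch: given first_on_demand and the total node count,
# estimate how many nodes land on on-demand vs spot capacity.
# Assumption: the first `first_on_demand` nodes are on-demand; the rest
# follow the availability setting (e.g. SPOT_WITH_FALLBACK).
def on_demand_spot_split(first_on_demand: int, total_nodes: int) -> tuple:
    on_demand = min(first_on_demand, total_nodes)
    return (on_demand, total_nodes - on_demand)
```

So with first_on_demand=1 and five nodes, one node is on-demand and four are eligible for spot.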
GetClusterClusterInfoSpecAzureAttributes
- availability String
- firstOnDemand Number
- logAnalyticsInfo Property Map
- spotBidMaxPrice Number
GetClusterClusterInfoSpecAzureAttributesLogAnalyticsInfo
- LogAnalyticsPrimaryKey string
- LogAnalyticsWorkspaceId string
- LogAnalyticsPrimaryKey string
- LogAnalyticsWorkspaceId string
- logAnalyticsPrimaryKey String
- logAnalyticsWorkspaceId String
- logAnalyticsPrimaryKey string
- logAnalyticsWorkspaceId string
- logAnalyticsPrimaryKey String
- logAnalyticsWorkspaceId String
GetClusterClusterInfoSpecClusterLogConf
GetClusterClusterInfoSpecClusterLogConfDbfs
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoSpecClusterLogConfS3
- Destination string
- CannedAcl string
- EnableEncryption bool
- EncryptionType string
- Endpoint string
- KmsKey string
- Region string
- Destination string
- CannedAcl string
- EnableEncryption bool
- EncryptionType string
- Endpoint string
- KmsKey string
- Region string
- destination String
- cannedAcl String
- enableEncryption Boolean
- encryptionType String
- endpoint String
- kmsKey String
- region String
- destination string
- cannedAcl string
- enableEncryption boolean
- encryptionType string
- endpoint string
- kmsKey string
- region string
- destination str
- canned_acl str
- enable_encryption bool
- encryption_type str
- endpoint str
- kms_key str
- region str
- destination String
- cannedAcl String
- enableEncryption Boolean
- encryptionType String
- endpoint String
- kmsKey String
- region String
GetClusterClusterInfoSpecClusterMountInfo
GetClusterClusterInfoSpecClusterMountInfoNetworkFilesystemInfo
- ServerAddress string
- MountOptions string
- ServerAddress string
- MountOptions string
- serverAddress String
- mountOptions String
- serverAddress string
- mountOptions string
- server_address str
- mount_options str
- serverAddress String
- mountOptions String
GetClusterClusterInfoSpecDockerImage
- url String
- basicAuth Property Map
GetClusterClusterInfoSpecDockerImageBasicAuth
GetClusterClusterInfoSpecGcpAttributes
- Availability string
- BootDiskSize int
- GoogleServiceAccount string
- LocalSsdCount int
- UsePreemptibleExecutors bool
- ZoneId string
- Availability string
- BootDiskSize int
- GoogleServiceAccount string
- LocalSsdCount int
- UsePreemptibleExecutors bool
- ZoneId string
- availability String
- bootDiskSize Integer
- googleServiceAccount String
- localSsdCount Integer
- usePreemptibleExecutors Boolean
- zoneId String
- availability string
- bootDiskSize number
- googleServiceAccount string
- localSsdCount number
- usePreemptibleExecutors boolean
- zoneId string
- availability str
- boot_disk_size int
- google_service_account str
- local_ssd_count int
- use_preemptible_executors bool
- zone_id str
- availability String
- bootDiskSize Number
- googleServiceAccount String
- localSsdCount Number
- usePreemptibleExecutors Boolean
- zoneId String
GetClusterClusterInfoSpecInitScript
- Abfss GetClusterClusterInfoSpecInitScriptAbfss
- Dbfs GetClusterClusterInfoSpecInitScriptDbfs
- File GetClusterClusterInfoSpecInitScriptFile
- Gcs GetClusterClusterInfoSpecInitScriptGcs
- S3 GetClusterClusterInfoSpecInitScriptS3
- Volumes GetClusterClusterInfoSpecInitScriptVolumes
- Workspace GetClusterClusterInfoSpecInitScriptWorkspace
- Abfss GetClusterClusterInfoSpecInitScriptAbfss
- Dbfs GetClusterClusterInfoSpecInitScriptDbfs
- File GetClusterClusterInfoSpecInitScriptFile
- Gcs GetClusterClusterInfoSpecInitScriptGcs
- S3 GetClusterClusterInfoSpecInitScriptS3
- Volumes GetClusterClusterInfoSpecInitScriptVolumes
- Workspace GetClusterClusterInfoSpecInitScriptWorkspace
- abfss GetClusterClusterInfoSpecInitScriptAbfss
- dbfs GetClusterClusterInfoSpecInitScriptDbfs
- file GetClusterClusterInfoSpecInitScriptFile
- gcs GetClusterClusterInfoSpecInitScriptGcs
- s3 GetClusterClusterInfoSpecInitScriptS3
- volumes GetClusterClusterInfoSpecInitScriptVolumes
- workspace GetClusterClusterInfoSpecInitScriptWorkspace
- abfss GetClusterClusterInfoSpecInitScriptAbfss
- dbfs GetClusterClusterInfoSpecInitScriptDbfs
- file GetClusterClusterInfoSpecInitScriptFile
- gcs GetClusterClusterInfoSpecInitScriptGcs
- s3 GetClusterClusterInfoSpecInitScriptS3
- volumes GetClusterClusterInfoSpecInitScriptVolumes
- workspace GetClusterClusterInfoSpecInitScriptWorkspace
- abfss GetClusterClusterInfoSpecInitScriptAbfss
- dbfs GetClusterClusterInfoSpecInitScriptDbfs
- file GetClusterClusterInfoSpecInitScriptFile
- gcs GetClusterClusterInfoSpecInitScriptGcs
- s3 GetClusterClusterInfoSpecInitScriptS3
- volumes GetClusterClusterInfoSpecInitScriptVolumes
- workspace GetClusterClusterInfoSpecInitScriptWorkspace
GetClusterClusterInfoSpecInitScriptAbfss
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoSpecInitScriptDbfs
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoSpecInitScriptFile
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoSpecInitScriptGcs
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoSpecInitScriptS3
- Destination string
- CannedAcl string
- EnableEncryption bool
- EncryptionType string
- Endpoint string
- KmsKey string
- Region string
- Destination string
- CannedAcl string
- EnableEncryption bool
- EncryptionType string
- Endpoint string
- KmsKey string
- Region string
- destination String
- cannedAcl String
- enableEncryption Boolean
- encryptionType String
- endpoint String
- kmsKey String
- region String
- destination string
- cannedAcl string
- enableEncryption boolean
- encryptionType string
- endpoint string
- kmsKey string
- region string
- destination str
- canned_acl str
- enable_encryption bool
- encryption_type str
- endpoint str
- kms_key str
- region str
- destination String
- cannedAcl String
- enableEncryption Boolean
- encryptionType String
- endpoint String
- kmsKey String
- region String
GetClusterClusterInfoSpecInitScriptVolumes
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoSpecInitScriptWorkspace
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoSpecLibrary
- cran Property Map
- egg String
- jar String
- maven Property Map
- pypi Property Map
- requirements String
- whl String
GetClusterClusterInfoSpecLibraryCran
GetClusterClusterInfoSpecLibraryMaven
- Coordinates string
- Exclusions List<string>
- Repo string
- Coordinates string
- Exclusions []string
- Repo string
- coordinates String
- exclusions List<String>
- repo String
- coordinates string
- exclusions string[]
- repo string
- coordinates str
- exclusions Sequence[str]
- repo str
- coordinates String
- exclusions List<String>
- repo String
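The Maven library block above is a triple of coordinates, optional exclusions, and an optional repo. As a quick illustration of how these fields fit together (the formatter below is hypothetical, not part of the provider):

```python
# Hypothetical formatter: summarizes a Maven library block built from the
# coordinates/exclusions/repo fields listed above.
def describe_maven_library(lib: dict) -> str:
    parts = [lib["coordinates"]]
    if lib.get("repo"):
        parts.append(f"from {lib['repo']}")
    if lib.get("exclusions"):
        parts.append("excluding " + ", ".join(lib["exclusions"]))
    return " ".join(parts)
```

Coordinates follow the usual groupId:artifactId:version form, e.g. com.databricks:spark-xml_2.12:0.13.0.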
GetClusterClusterInfoSpecLibraryPypi
GetClusterClusterInfoSpecWorkloadType
GetClusterClusterInfoSpecWorkloadTypeClients
GetClusterClusterInfoTerminationReason
- Code string
- Parameters Dictionary<string, string>
- Type string
- Code string
- Parameters map[string]string
- Type string
- code String
- parameters Map<String,String>
- type String
- code string
- parameters {[key: string]: string}
- type string
- code str
- parameters Mapping[str, str]
- type str
- code String
- parameters Map<String>
- type String
GetClusterClusterInfoWorkloadType
GetClusterClusterInfoWorkloadTypeClients
Package Details
- Repository
- databricks pulumi/pulumi-databricks
- License
- Apache-2.0
- Notes
- This Pulumi package is based on the databricks Terraform Provider.