Google Cloud Native is in preview. Google Cloud Classic is fully supported.
google-native.dataplex/v1.Task
Creates a task resource within a lake. Auto-naming is currently not supported for this resource.
Create Task Resource
Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.
Constructor syntax
new Task(name: string, args: TaskArgs, opts?: CustomResourceOptions);
@overload
def Task(resource_name: str,
args: TaskArgs,
opts: Optional[ResourceOptions] = None)
@overload
def Task(resource_name: str,
opts: Optional[ResourceOptions] = None,
execution_spec: Optional[GoogleCloudDataplexV1TaskExecutionSpecArgs] = None,
lake_id: Optional[str] = None,
task_id: Optional[str] = None,
trigger_spec: Optional[GoogleCloudDataplexV1TaskTriggerSpecArgs] = None,
description: Optional[str] = None,
display_name: Optional[str] = None,
labels: Optional[Mapping[str, str]] = None,
location: Optional[str] = None,
notebook: Optional[GoogleCloudDataplexV1TaskNotebookTaskConfigArgs] = None,
project: Optional[str] = None,
spark: Optional[GoogleCloudDataplexV1TaskSparkTaskConfigArgs] = None)
func NewTask(ctx *Context, name string, args TaskArgs, opts ...ResourceOption) (*Task, error)
public Task(string name, TaskArgs args, CustomResourceOptions? opts = null)
type: google-native:dataplex/v1:Task
properties: # The arguments to resource properties.
options: # Bag of options to control resource's behavior.
Parameters
- name string
- The unique name of the resource.
- args TaskArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- resource_name str
- The unique name of the resource.
- args TaskArgs
- The arguments to resource properties.
- opts ResourceOptions
- Bag of options to control resource's behavior.
- ctx Context
- Context object for the current deployment.
- name string
- The unique name of the resource.
- args TaskArgs
- The arguments to resource properties.
- opts ResourceOption
- Bag of options to control resource's behavior.
- name string
- The unique name of the resource.
- args TaskArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- name String
- The unique name of the resource.
- args TaskArgs
- The arguments to resource properties.
- options CustomResourceOptions
- Bag of options to control resource's behavior.
Constructor example
The following reference example uses placeholder values for all input properties.
var exampletaskResourceResourceFromDataplexv1 = new GoogleNative.Dataplex.V1.Task("exampletaskResourceResourceFromDataplexv1", new()
{
ExecutionSpec = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskExecutionSpecArgs
{
ServiceAccount = "string",
Args =
{
{ "string", "string" },
},
KmsKey = "string",
MaxJobExecutionLifetime = "string",
Project = "string",
},
LakeId = "string",
TaskId = "string",
TriggerSpec = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskTriggerSpecArgs
{
Type = GoogleNative.Dataplex.V1.GoogleCloudDataplexV1TaskTriggerSpecType.TypeUnspecified,
Disabled = false,
MaxRetries = 0,
Schedule = "string",
StartTime = "string",
},
Description = "string",
DisplayName = "string",
Labels =
{
{ "string", "string" },
},
Location = "string",
Notebook = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskNotebookTaskConfigArgs
{
Notebook = "string",
ArchiveUris = new[]
{
"string",
},
FileUris = new[]
{
"string",
},
InfrastructureSpec = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecArgs
{
Batch = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesArgs
{
ExecutorsCount = 0,
MaxExecutorsCount = 0,
},
ContainerImage = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeArgs
{
Image = "string",
JavaJars = new[]
{
"string",
},
Properties =
{
{ "string", "string" },
},
PythonPackages = new[]
{
"string",
},
},
VpcNetwork = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkArgs
{
Network = "string",
NetworkTags = new[]
{
"string",
},
SubNetwork = "string",
},
},
},
Project = "string",
Spark = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskSparkTaskConfigArgs
{
ArchiveUris = new[]
{
"string",
},
FileUris = new[]
{
"string",
},
InfrastructureSpec = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecArgs
{
Batch = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesArgs
{
ExecutorsCount = 0,
MaxExecutorsCount = 0,
},
ContainerImage = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeArgs
{
Image = "string",
JavaJars = new[]
{
"string",
},
Properties =
{
{ "string", "string" },
},
PythonPackages = new[]
{
"string",
},
},
VpcNetwork = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkArgs
{
Network = "string",
NetworkTags = new[]
{
"string",
},
SubNetwork = "string",
},
},
MainClass = "string",
MainJarFileUri = "string",
PythonScriptFile = "string",
SqlScript = "string",
SqlScriptFile = "string",
},
});
example, err := dataplex.NewTask(ctx, "exampletaskResourceResourceFromDataplexv1", &dataplex.TaskArgs{
ExecutionSpec: &dataplex.GoogleCloudDataplexV1TaskExecutionSpecArgs{
ServiceAccount: pulumi.String("string"),
Args: pulumi.StringMap{
"string": pulumi.String("string"),
},
KmsKey: pulumi.String("string"),
MaxJobExecutionLifetime: pulumi.String("string"),
Project: pulumi.String("string"),
},
LakeId: pulumi.String("string"),
TaskId: pulumi.String("string"),
TriggerSpec: &dataplex.GoogleCloudDataplexV1TaskTriggerSpecArgs{
Type: dataplex.GoogleCloudDataplexV1TaskTriggerSpecTypeTypeUnspecified,
Disabled: pulumi.Bool(false),
MaxRetries: pulumi.Int(0),
Schedule: pulumi.String("string"),
StartTime: pulumi.String("string"),
},
Description: pulumi.String("string"),
DisplayName: pulumi.String("string"),
Labels: pulumi.StringMap{
"string": pulumi.String("string"),
},
Location: pulumi.String("string"),
Notebook: &dataplex.GoogleCloudDataplexV1TaskNotebookTaskConfigArgs{
Notebook: pulumi.String("string"),
ArchiveUris: pulumi.StringArray{
pulumi.String("string"),
},
FileUris: pulumi.StringArray{
pulumi.String("string"),
},
InfrastructureSpec: &dataplex.GoogleCloudDataplexV1TaskInfrastructureSpecArgs{
Batch: &dataplex.GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesArgs{
ExecutorsCount: pulumi.Int(0),
MaxExecutorsCount: pulumi.Int(0),
},
ContainerImage: &dataplex.GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeArgs{
Image: pulumi.String("string"),
JavaJars: pulumi.StringArray{
pulumi.String("string"),
},
Properties: pulumi.StringMap{
"string": pulumi.String("string"),
},
PythonPackages: pulumi.StringArray{
pulumi.String("string"),
},
},
VpcNetwork: &dataplex.GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkArgs{
Network: pulumi.String("string"),
NetworkTags: pulumi.StringArray{
pulumi.String("string"),
},
SubNetwork: pulumi.String("string"),
},
},
},
Project: pulumi.String("string"),
Spark: &dataplex.GoogleCloudDataplexV1TaskSparkTaskConfigArgs{
ArchiveUris: pulumi.StringArray{
pulumi.String("string"),
},
FileUris: pulumi.StringArray{
pulumi.String("string"),
},
InfrastructureSpec: &dataplex.GoogleCloudDataplexV1TaskInfrastructureSpecArgs{
Batch: &dataplex.GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesArgs{
ExecutorsCount: pulumi.Int(0),
MaxExecutorsCount: pulumi.Int(0),
},
ContainerImage: &dataplex.GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeArgs{
Image: pulumi.String("string"),
JavaJars: pulumi.StringArray{
pulumi.String("string"),
},
Properties: pulumi.StringMap{
"string": pulumi.String("string"),
},
PythonPackages: pulumi.StringArray{
pulumi.String("string"),
},
},
VpcNetwork: &dataplex.GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkArgs{
Network: pulumi.String("string"),
NetworkTags: pulumi.StringArray{
pulumi.String("string"),
},
SubNetwork: pulumi.String("string"),
},
},
MainClass: pulumi.String("string"),
MainJarFileUri: pulumi.String("string"),
PythonScriptFile: pulumi.String("string"),
SqlScript: pulumi.String("string"),
SqlScriptFile: pulumi.String("string"),
},
})
var exampletaskResourceResourceFromDataplexv1 = new Task("exampletaskResourceResourceFromDataplexv1", TaskArgs.builder()
.executionSpec(GoogleCloudDataplexV1TaskExecutionSpecArgs.builder()
.serviceAccount("string")
.args(Map.of("string", "string"))
.kmsKey("string")
.maxJobExecutionLifetime("string")
.project("string")
.build())
.lakeId("string")
.taskId("string")
.triggerSpec(GoogleCloudDataplexV1TaskTriggerSpecArgs.builder()
.type("TYPE_UNSPECIFIED")
.disabled(false)
.maxRetries(0)
.schedule("string")
.startTime("string")
.build())
.description("string")
.displayName("string")
.labels(Map.of("string", "string"))
.location("string")
.notebook(GoogleCloudDataplexV1TaskNotebookTaskConfigArgs.builder()
.notebook("string")
.archiveUris("string")
.fileUris("string")
.infrastructureSpec(GoogleCloudDataplexV1TaskInfrastructureSpecArgs.builder()
.batch(GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesArgs.builder()
.executorsCount(0)
.maxExecutorsCount(0)
.build())
.containerImage(GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeArgs.builder()
.image("string")
.javaJars("string")
.properties(Map.of("string", "string"))
.pythonPackages("string")
.build())
.vpcNetwork(GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkArgs.builder()
.network("string")
.networkTags("string")
.subNetwork("string")
.build())
.build())
.build())
.project("string")
.spark(GoogleCloudDataplexV1TaskSparkTaskConfigArgs.builder()
.archiveUris("string")
.fileUris("string")
.infrastructureSpec(GoogleCloudDataplexV1TaskInfrastructureSpecArgs.builder()
.batch(GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesArgs.builder()
.executorsCount(0)
.maxExecutorsCount(0)
.build())
.containerImage(GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeArgs.builder()
.image("string")
.javaJars("string")
.properties(Map.of("string", "string"))
.pythonPackages("string")
.build())
.vpcNetwork(GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkArgs.builder()
.network("string")
.networkTags("string")
.subNetwork("string")
.build())
.build())
.mainClass("string")
.mainJarFileUri("string")
.pythonScriptFile("string")
.sqlScript("string")
.sqlScriptFile("string")
.build())
.build());
exampletask_resource_resource_from_dataplexv1 = google_native.dataplex.v1.Task("exampletaskResourceResourceFromDataplexv1",
execution_spec={
"service_account": "string",
"args": {
"string": "string",
},
"kms_key": "string",
"max_job_execution_lifetime": "string",
"project": "string",
},
lake_id="string",
task_id="string",
trigger_spec={
"type": google_native.dataplex.v1.GoogleCloudDataplexV1TaskTriggerSpecType.TYPE_UNSPECIFIED,
"disabled": False,
"max_retries": 0,
"schedule": "string",
"start_time": "string",
},
description="string",
display_name="string",
labels={
"string": "string",
},
location="string",
notebook={
"notebook": "string",
"archive_uris": ["string"],
"file_uris": ["string"],
"infrastructure_spec": {
"batch": {
"executors_count": 0,
"max_executors_count": 0,
},
"container_image": {
"image": "string",
"java_jars": ["string"],
"properties": {
"string": "string",
},
"python_packages": ["string"],
},
"vpc_network": {
"network": "string",
"network_tags": ["string"],
"sub_network": "string",
},
},
},
project="string",
spark={
"archive_uris": ["string"],
"file_uris": ["string"],
"infrastructure_spec": {
"batch": {
"executors_count": 0,
"max_executors_count": 0,
},
"container_image": {
"image": "string",
"java_jars": ["string"],
"properties": {
"string": "string",
},
"python_packages": ["string"],
},
"vpc_network": {
"network": "string",
"network_tags": ["string"],
"sub_network": "string",
},
},
"main_class": "string",
"main_jar_file_uri": "string",
"python_script_file": "string",
"sql_script": "string",
"sql_script_file": "string",
})
const exampletaskResourceResourceFromDataplexv1 = new google_native.dataplex.v1.Task("exampletaskResourceResourceFromDataplexv1", {
executionSpec: {
serviceAccount: "string",
args: {
string: "string",
},
kmsKey: "string",
maxJobExecutionLifetime: "string",
project: "string",
},
lakeId: "string",
taskId: "string",
triggerSpec: {
type: google_native.dataplex.v1.GoogleCloudDataplexV1TaskTriggerSpecType.TypeUnspecified,
disabled: false,
maxRetries: 0,
schedule: "string",
startTime: "string",
},
description: "string",
displayName: "string",
labels: {
string: "string",
},
location: "string",
notebook: {
notebook: "string",
archiveUris: ["string"],
fileUris: ["string"],
infrastructureSpec: {
batch: {
executorsCount: 0,
maxExecutorsCount: 0,
},
containerImage: {
image: "string",
javaJars: ["string"],
properties: {
string: "string",
},
pythonPackages: ["string"],
},
vpcNetwork: {
network: "string",
networkTags: ["string"],
subNetwork: "string",
},
},
},
project: "string",
spark: {
archiveUris: ["string"],
fileUris: ["string"],
infrastructureSpec: {
batch: {
executorsCount: 0,
maxExecutorsCount: 0,
},
containerImage: {
image: "string",
javaJars: ["string"],
properties: {
string: "string",
},
pythonPackages: ["string"],
},
vpcNetwork: {
network: "string",
networkTags: ["string"],
subNetwork: "string",
},
},
mainClass: "string",
mainJarFileUri: "string",
pythonScriptFile: "string",
sqlScript: "string",
sqlScriptFile: "string",
},
});
type: google-native:dataplex/v1:Task
properties:
description: string
displayName: string
executionSpec:
args:
string: string
kmsKey: string
maxJobExecutionLifetime: string
project: string
serviceAccount: string
labels:
string: string
lakeId: string
location: string
notebook:
archiveUris:
- string
fileUris:
- string
infrastructureSpec:
batch:
executorsCount: 0
maxExecutorsCount: 0
containerImage:
image: string
javaJars:
- string
properties:
string: string
pythonPackages:
- string
vpcNetwork:
network: string
networkTags:
- string
subNetwork: string
notebook: string
project: string
spark:
archiveUris:
- string
fileUris:
- string
infrastructureSpec:
batch:
executorsCount: 0
maxExecutorsCount: 0
containerImage:
image: string
javaJars:
- string
properties:
string: string
pythonPackages:
- string
vpcNetwork:
network: string
networkTags:
- string
subNetwork: string
mainClass: string
mainJarFileUri: string
pythonScriptFile: string
sqlScript: string
sqlScriptFile: string
taskId: string
triggerSpec:
disabled: false
maxRetries: 0
schedule: string
startTime: string
type: TYPE_UNSPECIFIED
Task Resource Properties
To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.
Inputs
In Python, inputs that are objects can be passed either as argument classes or as dictionary literals.
The Task resource accepts the following input properties:
- ExecutionSpec Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskExecutionSpec
- Spec related to how a task is executed.
- LakeId string
- TaskId string
- Required. Task identifier.
- TriggerSpec Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskTriggerSpec
- Spec related to how often and when a task should be triggered.
- Description string
- Optional. Description of the task.
- DisplayName string
- Optional. User friendly display name.
- Labels Dictionary<string, string>
- Optional. User-defined labels for the task.
- Location string
- Notebook Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskNotebookTaskConfig
- Config related to running scheduled Notebooks.
- Project string
- Spark Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskSparkTaskConfig
- Config related to running custom Spark tasks.
- ExecutionSpec GoogleCloudDataplexV1TaskExecutionSpecArgs
- Spec related to how a task is executed.
- LakeId string
- TaskId string
- Required. Task identifier.
- TriggerSpec GoogleCloudDataplexV1TaskTriggerSpecArgs
- Spec related to how often and when a task should be triggered.
- Description string
- Optional. Description of the task.
- DisplayName string
- Optional. User friendly display name.
- Labels map[string]string
- Optional. User-defined labels for the task.
- Location string
- Notebook GoogleCloudDataplexV1TaskNotebookTaskConfigArgs
- Config related to running scheduled Notebooks.
- Project string
- Spark GoogleCloudDataplexV1TaskSparkTaskConfigArgs
- Config related to running custom Spark tasks.
- executionSpec GoogleCloudDataplexV1TaskExecutionSpec
- Spec related to how a task is executed.
- lakeId String
- taskId String
- Required. Task identifier.
- triggerSpec GoogleCloudDataplexV1TaskTriggerSpec
- Spec related to how often and when a task should be triggered.
- description String
- Optional. Description of the task.
- displayName String
- Optional. User friendly display name.
- labels Map<String,String>
- Optional. User-defined labels for the task.
- location String
- notebook GoogleCloudDataplexV1TaskNotebookTaskConfig
- Config related to running scheduled Notebooks.
- project String
- spark GoogleCloudDataplexV1TaskSparkTaskConfig
- Config related to running custom Spark tasks.
- executionSpec GoogleCloudDataplexV1TaskExecutionSpec
- Spec related to how a task is executed.
- lakeId string
- taskId string
- Required. Task identifier.
- triggerSpec GoogleCloudDataplexV1TaskTriggerSpec
- Spec related to how often and when a task should be triggered.
- description string
- Optional. Description of the task.
- displayName string
- Optional. User friendly display name.
- labels {[key: string]: string}
- Optional. User-defined labels for the task.
- location string
- notebook GoogleCloudDataplexV1TaskNotebookTaskConfig
- Config related to running scheduled Notebooks.
- project string
- spark GoogleCloudDataplexV1TaskSparkTaskConfig
- Config related to running custom Spark tasks.
- execution_spec GoogleCloudDataplexV1TaskExecutionSpecArgs
- Spec related to how a task is executed.
- lake_id str
- task_id str
- Required. Task identifier.
- trigger_spec GoogleCloudDataplexV1TaskTriggerSpecArgs
- Spec related to how often and when a task should be triggered.
- description str
- Optional. Description of the task.
- display_name str
- Optional. User friendly display name.
- labels Mapping[str, str]
- Optional. User-defined labels for the task.
- location str
- notebook GoogleCloudDataplexV1TaskNotebookTaskConfigArgs
- Config related to running scheduled Notebooks.
- project str
- spark GoogleCloudDataplexV1TaskSparkTaskConfigArgs
- Config related to running custom Spark tasks.
- executionSpec Property Map
- Spec related to how a task is executed.
- lakeId String
- taskId String
- Required. Task identifier.
- triggerSpec Property Map
- Spec related to how often and when a task should be triggered.
- description String
- Optional. Description of the task.
- displayName String
- Optional. User friendly display name.
- labels Map<String>
- Optional. User-defined labels for the task.
- location String
- notebook Property Map
- Config related to running scheduled Notebooks.
- project String
- spark Property Map
- Config related to running custom Spark tasks.
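Across every SDK above, only four inputs are required: the execution spec, the lake ID, the task ID, and the trigger spec; the remaining properties are optional. As an illustration only (this helper is not part of any Pulumi SDK), the distinction can be checked on a plain-dict argument set before constructing the resource:

```python
# Required Task inputs, per the property list above. Illustrative only;
# the Pulumi SDK performs its own validation at deployment time.
REQUIRED_INPUTS = {"execution_spec", "lake_id", "task_id", "trigger_spec"}

def missing_required_inputs(args: dict) -> list:
    """Return the required Task inputs absent from an argument dict, sorted."""
    return sorted(REQUIRED_INPUTS - args.keys())
```

For example, `missing_required_inputs({"lake_id": "my-lake"})` reports the other three required property names.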
Outputs
All input properties are implicitly available as output properties. Additionally, the Task resource produces the following output properties:
- CreateTime string
- The time when the task was created.
- ExecutionStatus Pulumi.GoogleNative.Dataplex.V1.Outputs.GoogleCloudDataplexV1TaskExecutionStatusResponse
- Status of the latest task executions.
- Id string
- The provider-assigned unique ID for this managed resource.
- Name string
- The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.
- State string
- Current state of the task.
- Uid string
- System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.
- UpdateTime string
- The time when the task was last updated.
- CreateTime string
- The time when the task was created.
- ExecutionStatus GoogleCloudDataplexV1TaskExecutionStatusResponse
- Status of the latest task executions.
- Id string
- The provider-assigned unique ID for this managed resource.
- Name string
- The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.
- State string
- Current state of the task.
- Uid string
- System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.
- UpdateTime string
- The time when the task was last updated.
- createTime String
- The time when the task was created.
- executionStatus GoogleCloudDataplexV1TaskExecutionStatusResponse
- Status of the latest task executions.
- id String
- The provider-assigned unique ID for this managed resource.
- name String
- The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.
- state String
- Current state of the task.
- uid String
- System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.
- updateTime String
- The time when the task was last updated.
- createTime string
- The time when the task was created.
- executionStatus GoogleCloudDataplexV1TaskExecutionStatusResponse
- Status of the latest task executions.
- id string
- The provider-assigned unique ID for this managed resource.
- name string
- The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.
- state string
- Current state of the task.
- uid string
- System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.
- updateTime string
- The time when the task was last updated.
- create_time str
- The time when the task was created.
- execution_status GoogleCloudDataplexV1TaskExecutionStatusResponse
- Status of the latest task executions.
- id str
- The provider-assigned unique ID for this managed resource.
- name str
- The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.
- state str
- Current state of the task.
- uid str
- System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.
- update_time str
- The time when the task was last updated.
- createTime String
- The time when the task was created.
- executionStatus Property Map
- Status of the latest task executions.
- id String
- The provider-assigned unique ID for this managed resource.
- name String
- The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.
- state String
- Current state of the task.
- uid String
- System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.
- updateTime String
- The time when the task was last updated.
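Because the name output follows a fixed pattern, its components can be recovered with a short parser. The following sketch is illustrative only (the function name is invented here, not part of the SDK):

```python
import re

# Pattern of the Task `name` output documented above:
# projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}
_TASK_NAME = re.compile(
    r"^projects/(?P<project_number>[^/]+)"
    r"/locations/(?P<location_id>[^/]+)"
    r"/lakes/(?P<lake_id>[^/]+)"
    r"/tasks/(?P<task_id>[^/]+)$"
)

def parse_task_name(name: str) -> dict:
    """Split a relative task resource name into its path components."""
    match = _TASK_NAME.match(name)
    if match is None:
        raise ValueError(f"not a task resource name: {name!r}")
    return match.groupdict()
```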
Supporting Types
GoogleCloudDataplexV1JobResponse, GoogleCloudDataplexV1JobResponseArgs
- EndTime string
- The time when the job ended.
- ExecutionSpec Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskExecutionSpecResponse
- Spec related to how a task is executed.
- Labels Dictionary<string, string>
- User-defined labels for the task.
- Message string
- Additional information about the current state.
- Name string
- The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.
- RetryCount int
- The number of times the job has been retried (excluding the initial attempt).
- Service string
- The underlying service running a job.
- ServiceJob string
- The full resource name for the job run under a particular service.
- StartTime string
- The time when the job was started.
- State string
- Execution state for the job.
- Trigger string
- Job execution trigger.
- Uid string
- System generated globally unique ID for the job.
- EndTime string
- The time when the job ended.
- ExecutionSpec GoogleCloudDataplexV1TaskExecutionSpecResponse
- Spec related to how a task is executed.
- Labels map[string]string
- User-defined labels for the task.
- Message string
- Additional information about the current state.
- Name string
- The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.
- RetryCount int
- The number of times the job has been retried (excluding the initial attempt).
- Service string
- The underlying service running a job.
- ServiceJob string
- The full resource name for the job run under a particular service.
- StartTime string
- The time when the job was started.
- State string
- Execution state for the job.
- Trigger string
- Job execution trigger.
- Uid string
- System generated globally unique ID for the job.
- endTime String
- The time when the job ended.
- executionSpec GoogleCloudDataplexV1TaskExecutionSpecResponse
- Spec related to how a task is executed.
- labels Map<String,String>
- User-defined labels for the task.
- message String
- Additional information about the current state.
- name String
- The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.
- retryCount Integer
- The number of times the job has been retried (excluding the initial attempt).
- service String
- The underlying service running a job.
- serviceJob String
- The full resource name for the job run under a particular service.
- startTime String
- The time when the job was started.
- state String
- Execution state for the job.
- trigger String
- Job execution trigger.
- uid String
- System generated globally unique ID for the job.
- endTime string
- The time when the job ended.
- executionSpec GoogleCloudDataplexV1TaskExecutionSpecResponse
- Spec related to how a task is executed.
- labels {[key: string]: string}
- User-defined labels for the task.
- message string
- Additional information about the current state.
- name string
- The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.
- retryCount number
- The number of times the job has been retried (excluding the initial attempt).
- service string
- The underlying service running a job.
- serviceJob string
- The full resource name for the job run under a particular service.
- startTime string
- The time when the job was started.
- state string
- Execution state for the job.
- trigger string
- Job execution trigger.
- uid string
- System generated globally unique ID for the job.
- end_time str
- The time when the job ended.
- execution_spec GoogleCloudDataplexV1TaskExecutionSpecResponse
- Spec related to how a task is executed.
- labels Mapping[str, str]
- User-defined labels for the task.
- message str
- Additional information about the current state.
- name str
- The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.
- retry_count int
- The number of times the job has been retried (excluding the initial attempt).
- service str
- The underlying service running a job.
- service_job str
- The full resource name for the job run under a particular service.
- start_time str
- The time when the job was started.
- state str
- Execution state for the job.
- trigger str
- Job execution trigger.
- uid str
- System generated globally unique ID for the job.
- endTime String
- The time when the job ended.
- executionSpec Property Map
- Spec related to how a task is executed.
- labels Map<String>
- User-defined labels for the task.
- message String
- Additional information about the current state.
- name String
- The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.
- retryCount Number
- The number of times the job has been retried (excluding the initial attempt).
- service String
- The underlying service running a job.
- serviceJob String
- The full resource name for the job run under a particular service.
- startTime String
- The time when the job was started.
- state String
- Execution state for the job.
- trigger String
- Job execution trigger.
- uid String
- System generated globally unique ID for the job.
GoogleCloudDataplexV1TaskExecutionSpec, GoogleCloudDataplexV1TaskExecutionSpecArgs
- Service
Account string - Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
- Args Dictionary<string, string>
- Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.
- Kms
Key string - Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
- Max
Job stringExecution Lifetime - Optional. The maximum duration after which the job execution is expired.
- Project string
- Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
- Service
Account string - Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
- Args map[string]string
- Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of a key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: ${task_id} and ${job_time}. To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. If other keys are present in the args, TASK_ARGS is passed as the last argument.
- KmsKey string
- Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
- MaxJobExecutionLifetime string
- Optional. The maximum duration after which the job execution expires.
- Project string
- Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
- serviceAccount String
- Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
- args Map<String,String>
- Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of a key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: ${task_id} and ${job_time}. To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. If other keys are present in the args, TASK_ARGS is passed as the last argument.
- kmsKey String
- Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
- maxJobExecutionLifetime String
- Optional. The maximum duration after which the job execution expires.
- project String
- Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
- serviceAccount string
- Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
- args {[key: string]: string}
- Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of a key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: ${task_id} and ${job_time}. To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. If other keys are present in the args, TASK_ARGS is passed as the last argument.
- kmsKey string
- Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
- maxJobExecutionLifetime string
- Optional. The maximum duration after which the job execution expires.
- project string
- Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
- service_account str
- Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
- args Mapping[str, str]
- Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of a key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: ${task_id} and ${job_time}. To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. If other keys are present in the args, TASK_ARGS is passed as the last argument.
- kms_key str
- Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
- max_job_execution_lifetime str
- Optional. The maximum duration after which the job execution expires.
- project str
- Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
- serviceAccount String
- Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
- args Map<String>
- Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of a key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: ${task_id} and ${job_time}. To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. If other keys are present in the args, TASK_ARGS is passed as the last argument.
- kmsKey String
- Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
- maxJobExecutionLifetime String
- Optional. The maximum duration after which the job execution expires.
- project String
- Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
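As a sketch of how these fields fit together in the YAML constructor syntax shown above (lake, project, bucket, and service account names are all hypothetical placeholders, and the task's spark/notebook config is elided), an execution spec that passes both keyword and positional args might look like:

```yaml
type: google-native:dataplex/v1:Task
properties:
  lakeId: my-lake
  location: us-central1
  taskId: daily-report
  triggerSpec:
    type: ON_DEMAND
  executionSpec:
    serviceAccount: task-runner@my-project.iam.gserviceaccount.com
    maxJobExecutionLifetime: 3600s  # duration string
    args:
      INPUT_PATH: gs://my-bucket/${task_id}/input      # ${task_id} is interpolated before launch
      TASK_ARGS: "--run-date,${job_time},--mode,full"  # positional args as a comma-separated string
```

Because other keys are present alongside TASK_ARGS here, the positional arguments are passed to the driver last.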
GoogleCloudDataplexV1TaskExecutionSpecResponse, GoogleCloudDataplexV1TaskExecutionSpecResponseArgs
- Args Dictionary<string, string>
- Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of a key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: ${task_id} and ${job_time}. To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. If other keys are present in the args, TASK_ARGS is passed as the last argument.
- KmsKey string
- Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
- MaxJobExecutionLifetime string
- Optional. The maximum duration after which the job execution expires.
- Project string
- Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
- ServiceAccount string
- Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
- Args map[string]string
- Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of a key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: ${task_id} and ${job_time}. To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. If other keys are present in the args, TASK_ARGS is passed as the last argument.
- KmsKey string
- Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
- MaxJobExecutionLifetime string
- Optional. The maximum duration after which the job execution expires.
- Project string
- Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
- ServiceAccount string
- Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
- args Map<String,String>
- Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of a key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: ${task_id} and ${job_time}. To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. If other keys are present in the args, TASK_ARGS is passed as the last argument.
- kmsKey String
- Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
- maxJobExecutionLifetime String
- Optional. The maximum duration after which the job execution expires.
- project String
- Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
- serviceAccount String
- Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
- args {[key: string]: string}
- Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of a key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: ${task_id} and ${job_time}. To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. If other keys are present in the args, TASK_ARGS is passed as the last argument.
- kmsKey string
- Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
- maxJobExecutionLifetime string
- Optional. The maximum duration after which the job execution expires.
- project string
- Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
- serviceAccount string
- Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
- args Mapping[str, str]
- Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of a key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: ${task_id} and ${job_time}. To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. If other keys are present in the args, TASK_ARGS is passed as the last argument.
- kms_key str
- Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
- max_job_execution_lifetime str
- Optional. The maximum duration after which the job execution expires.
- project str
- Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
- service_account str
- Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
- args Map<String>
- Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of a key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: ${task_id} and ${job_time}. To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. If other keys are present in the args, TASK_ARGS is passed as the last argument.
- kmsKey String
- Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
- maxJobExecutionLifetime String
- Optional. The maximum duration after which the job execution expires.
- project String
- Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
- serviceAccount String
- Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
GoogleCloudDataplexV1TaskExecutionStatusResponse, GoogleCloudDataplexV1TaskExecutionStatusResponseArgs
- LatestJob Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1JobResponse
- Latest job execution.
- UpdateTime string
- Last update time of the status.
- LatestJob GoogleCloudDataplexV1JobResponse
- Latest job execution.
- UpdateTime string
- Last update time of the status.
- latestJob GoogleCloudDataplexV1JobResponse
- Latest job execution.
- updateTime String
- Last update time of the status.
- latestJob GoogleCloudDataplexV1JobResponse
- Latest job execution.
- updateTime string
- Last update time of the status.
- latest_job GoogleCloudDataplexV1JobResponse
- Latest job execution.
- update_time str
- Last update time of the status.
- latestJob Property Map
- Latest job execution.
- updateTime String
- Last update time of the status.
GoogleCloudDataplexV1TaskInfrastructureSpec, GoogleCloudDataplexV1TaskInfrastructureSpecArgs
- Batch Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResources
- Compute resources needed for a Task when using Dataproc Serverless.
- ContainerImage Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntime
- Container image runtime configuration.
- VpcNetwork Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetwork
- VPC network.
- Batch GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResources
- Compute resources needed for a Task when using Dataproc Serverless.
- ContainerImage GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntime
- Container image runtime configuration.
- VpcNetwork GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetwork
- VPC network.
- batch GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResources
- Compute resources needed for a Task when using Dataproc Serverless.
- containerImage GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntime
- Container image runtime configuration.
- vpcNetwork GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetwork
- VPC network.
- batch GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResources
- Compute resources needed for a Task when using Dataproc Serverless.
- containerImage GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntime
- Container image runtime configuration.
- vpcNetwork GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetwork
- VPC network.
- batch GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResources
- Compute resources needed for a Task when using Dataproc Serverless.
- container_image GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntime
- Container image runtime configuration.
- vpc_network GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetwork
- VPC network.
- batch Property Map
- Compute resources needed for a Task when using Dataproc Serverless.
- containerImage Property Map
- Container image runtime configuration.
- vpcNetwork Property Map
- VPC network.
GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResources, GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesArgs
- ExecutorsCount int
- Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
- MaxExecutorsCount int
- Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
- ExecutorsCount int
- Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
- MaxExecutorsCount int
- Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
- executorsCount Integer
- Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
- maxExecutorsCount Integer
- Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
- executorsCount number
- Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
- maxExecutorsCount number
- Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
- executors_count int
- Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
- max_executors_count int
- Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
- executorsCount Number
- Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
- maxExecutorsCount Number
- Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
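As a minimal YAML sketch (not a complete task definition), auto-scaling is enabled by setting max_executors_count above executors_count:

```yaml
infrastructureSpec:
  batch:
    executorsCount: 2      # initial executors, valid range 2-100
    maxExecutorsCount: 10  # > executorsCount, so auto-scaling up to 10 executors is enabled
```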
GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesResponse, GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesResponseArgs
- ExecutorsCount int
- Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
- MaxExecutorsCount int
- Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
- ExecutorsCount int
- Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
- MaxExecutorsCount int
- Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
- executorsCount Integer
- Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
- maxExecutorsCount Integer
- Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
- executorsCount number
- Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
- maxExecutorsCount number
- Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
- executors_count int
- Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
- max_executors_count int
- Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
- executorsCount Number
- Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
- maxExecutorsCount Number
- Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntime, GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeArgs
- Image string
- Optional. Container image to use.
- JavaJars List<string>
- Optional. A list of Java JARs to add to the classpath. Valid input includes Cloud Storage URIs to JAR binaries. For example, gs://bucket-name/my/path/to/file.jar
- Properties Dictionary<string, string>
- Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
- PythonPackages List<string>
- Optional. A list of Python packages to be installed. Valid formats include Cloud Storage URIs to pip-installable libraries. For example, gs://bucket-name/my/path/to/lib.tar.gz
- Image string
- Optional. Container image to use.
- JavaJars []string
- Optional. A list of Java JARs to add to the classpath. Valid input includes Cloud Storage URIs to JAR binaries. For example, gs://bucket-name/my/path/to/file.jar
- Properties map[string]string
- Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
- PythonPackages []string
- Optional. A list of Python packages to be installed. Valid formats include Cloud Storage URIs to pip-installable libraries. For example, gs://bucket-name/my/path/to/lib.tar.gz
- image String
- Optional. Container image to use.
- javaJars List<String>
- Optional. A list of Java JARs to add to the classpath. Valid input includes Cloud Storage URIs to JAR binaries. For example, gs://bucket-name/my/path/to/file.jar
- properties Map<String,String>
- Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
- pythonPackages List<String>
- Optional. A list of Python packages to be installed. Valid formats include Cloud Storage URIs to pip-installable libraries. For example, gs://bucket-name/my/path/to/lib.tar.gz
- image string
- Optional. Container image to use.
- javaJars string[]
- Optional. A list of Java JARs to add to the classpath. Valid input includes Cloud Storage URIs to JAR binaries. For example, gs://bucket-name/my/path/to/file.jar
- properties {[key: string]: string}
- Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
- pythonPackages string[]
- Optional. A list of Python packages to be installed. Valid formats include Cloud Storage URIs to pip-installable libraries. For example, gs://bucket-name/my/path/to/lib.tar.gz
- image str
- Optional. Container image to use.
- java_jars Sequence[str]
- Optional. A list of Java JARs to add to the classpath. Valid input includes Cloud Storage URIs to JAR binaries. For example, gs://bucket-name/my/path/to/file.jar
- properties Mapping[str, str]
- Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
- python_packages Sequence[str]
- Optional. A list of Python packages to be installed. Valid formats include Cloud Storage URIs to pip-installable libraries. For example, gs://bucket-name/my/path/to/lib.tar.gz
- image String
- Optional. Container image to use.
- javaJars List<String>
- Optional. A list of Java JARs to add to the classpath. Valid input includes Cloud Storage URIs to JAR binaries. For example, gs://bucket-name/my/path/to/file.jar
- properties Map<String>
- Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
- pythonPackages List<String>
- Optional. A list of Python packages to be installed. Valid formats include Cloud Storage URIs to pip-installable libraries. For example, gs://bucket-name/my/path/to/lib.tar.gz
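Putting these fields together, a hedged YAML sketch of a container image runtime (the image name and bucket paths are hypothetical; the prefix:property key format follows the Cluster properties convention described above) might look like:

```yaml
infrastructureSpec:
  containerImage:
    image: gcr.io/my-project/custom-spark:latest  # hypothetical custom image
    javaJars:
      - gs://my-bucket/jars/deps.jar              # Cloud Storage URI to a JAR binary
    pythonPackages:
      - gs://my-bucket/libs/helpers.tar.gz        # pip-installable library
    properties:
      "spark:spark.executor.memory": 4g           # prefix:property format
```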
GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeResponse, GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeResponseArgs
- Image string
- Optional. Container image to use.
- JavaJars List<string>
- Optional. A list of Java JARs to add to the classpath. Valid input includes Cloud Storage URIs to JAR binaries. For example, gs://bucket-name/my/path/to/file.jar
- Properties Dictionary<string, string>
- Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
- PythonPackages List<string>
- Optional. A list of Python packages to be installed. Valid formats include Cloud Storage URIs to pip-installable libraries. For example, gs://bucket-name/my/path/to/lib.tar.gz
- Image string
- Optional. Container image to use.
- JavaJars []string
- Optional. A list of Java JARs to add to the classpath. Valid input includes Cloud Storage URIs to JAR binaries. For example, gs://bucket-name/my/path/to/file.jar
- Properties map[string]string
- Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
- PythonPackages []string
- Optional. A list of Python packages to be installed. Valid formats include Cloud Storage URIs to pip-installable libraries. For example, gs://bucket-name/my/path/to/lib.tar.gz
- image String
- Optional. Container image to use.
- javaJars List<String>
- Optional. A list of Java JARs to add to the classpath. Valid input includes Cloud Storage URIs to JAR binaries. For example, gs://bucket-name/my/path/to/file.jar
- properties Map<String,String>
- Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
- pythonPackages List<String>
- Optional. A list of Python packages to be installed. Valid formats include Cloud Storage URIs to pip-installable libraries. For example, gs://bucket-name/my/path/to/lib.tar.gz
- image string
- Optional. Container image to use.
- javaJars string[]
- Optional. A list of Java JARs to add to the classpath. Valid input includes Cloud Storage URIs to JAR binaries. For example, gs://bucket-name/my/path/to/file.jar
- properties {[key: string]: string}
- Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
- pythonPackages string[]
- Optional. A list of Python packages to be installed. Valid formats include Cloud Storage URIs to pip-installable libraries. For example, gs://bucket-name/my/path/to/lib.tar.gz
- image str
- Optional. Container image to use.
- java_jars Sequence[str]
- Optional. A list of Java JARs to add to the classpath. Valid input includes Cloud Storage URIs to JAR binaries. For example, gs://bucket-name/my/path/to/file.jar
- properties Mapping[str, str]
- Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
- python_packages Sequence[str]
- Optional. A list of Python packages to be installed. Valid formats include Cloud Storage URIs to pip-installable libraries. For example, gs://bucket-name/my/path/to/lib.tar.gz
- image String
- Optional. Container image to use.
- javaJars List<String>
- Optional. A list of Java JARs to add to the classpath. Valid input includes Cloud Storage URIs to JAR binaries. For example, gs://bucket-name/my/path/to/file.jar
- properties Map<String>
- Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
- pythonPackages List<String>
- Optional. A list of Python packages to be installed. Valid formats include Cloud Storage URIs to pip-installable libraries. For example, gs://bucket-name/my/path/to/lib.tar.gz
GoogleCloudDataplexV1TaskInfrastructureSpecResponse, GoogleCloudDataplexV1TaskInfrastructureSpecResponseArgs
- Batch
Pulumi.
Google Native. Dataplex. V1. Inputs. Google Cloud Dataplex V1Task Infrastructure Spec Batch Compute Resources Response - Compute resources needed for a Task when using Dataproc Serverless.
- Container
Image Pulumi.Google Native. Dataplex. V1. Inputs. Google Cloud Dataplex V1Task Infrastructure Spec Container Image Runtime Response - Container Image Runtime Configuration.
- VpcNetwork Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkResponse
- VPC network.
- Batch GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesResponse
- Compute resources needed for a Task when using Dataproc Serverless.
- ContainerImage GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeResponse
- Container image runtime configuration.
- VpcNetwork GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkResponse
- VPC network.
- batch GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesResponse
- Compute resources needed for a Task when using Dataproc Serverless.
- containerImage GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeResponse
- Container image runtime configuration.
- vpcNetwork GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkResponse
- VPC network.
- batch GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesResponse
- Compute resources needed for a Task when using Dataproc Serverless.
- containerImage GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeResponse
- Container image runtime configuration.
- vpcNetwork GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkResponse
- VPC network.
- batch GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesResponse
- Compute resources needed for a Task when using Dataproc Serverless.
- container_image GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeResponse
- Container image runtime configuration.
- vpc_network GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkResponse
- VPC network.
- batch Property Map
- Compute resources needed for a Task when using Dataproc Serverless.
- containerImage Property Map
- Container image runtime configuration.
- vpcNetwork Property Map
- VPC network.
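Taken together, these fields form the infrastructureSpec property map. The sketch below mirrors that shape with plain Python dicts standing in for the Pulumi input types; the image name, network name, and resource counts are hypothetical placeholders, and all three sub-objects are optional:

```python
# A sketch of an infrastructureSpec property map, mirroring the fields above.
# All values here are hypothetical placeholders.
infrastructure_spec = {
    "batch": {
        "executorsCount": 2,        # compute resources for Dataproc Serverless
        "maxExecutorsCount": 10,
    },
    "containerImage": {
        "image": "us-docker.pkg.dev/my-project/my-repo/spark:latest",
        "properties": {"spark.executor.memory": "4g"},
    },
    "vpcNetwork": {
        "network": "default",
        "networkTags": ["dataplex-task"],
    },
}

# Each of the three sub-objects may be omitted from a minimal spec.
assert set(infrastructure_spec) <= {"batch", "containerImage", "vpcNetwork"}
```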
GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetwork, GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkArgs
- Network string
- Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
- NetworkTags List<string>
- Optional. List of network tags to apply to the job.
- SubNetwork string
- Optional. The Cloud VPC sub-network in which the job is run.
- Network string
- Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
- NetworkTags []string
- Optional. List of network tags to apply to the job.
- SubNetwork string
- Optional. The Cloud VPC sub-network in which the job is run.
- network String
- Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
- networkTags List<String>
- Optional. List of network tags to apply to the job.
- subNetwork String
- Optional. The Cloud VPC sub-network in which the job is run.
- network string
- Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
- networkTags string[]
- Optional. List of network tags to apply to the job.
- subNetwork string
- Optional. The Cloud VPC sub-network in which the job is run.
- network str
- Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
- network_tags Sequence[str]
- Optional. List of network tags to apply to the job.
- sub_network str
- Optional. The Cloud VPC sub-network in which the job is run.
- network String
- Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
- networkTags List<String>
- Optional. List of network tags to apply to the job.
- subNetwork String
- Optional. The Cloud VPC sub-network in which the job is run.
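In the underlying API, network and subNetwork are alternatives: the job runs either in a named Cloud VPC network or in a specific sub-network, not both. A minimal sketch in plain Python (this helper and its field values are hypothetical, not part of any official SDK):

```python
# Build a vpcNetwork property map, choosing exactly one of network / sub_network.
# Mirrors the field list above; a sketch, not an official client helper.
def vpc_network(network=None, sub_network=None, network_tags=None):
    if (network is None) == (sub_network is None):
        raise ValueError("set exactly one of network or sub_network")
    spec = {"networkTags": list(network_tags or [])}
    if network is not None:
        spec["network"] = network          # e.g. the project's default network
    else:
        spec["subNetwork"] = sub_network   # a specific VPC sub-network
    return spec

assert vpc_network(network="default") == {"networkTags": [], "network": "default"}
```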
GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkResponse, GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkResponseArgs
- Network string
- Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
- NetworkTags List<string>
- Optional. List of network tags to apply to the job.
- SubNetwork string
- Optional. The Cloud VPC sub-network in which the job is run.
- Network string
- Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
- NetworkTags []string
- Optional. List of network tags to apply to the job.
- SubNetwork string
- Optional. The Cloud VPC sub-network in which the job is run.
- network String
- Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
- networkTags List<String>
- Optional. List of network tags to apply to the job.
- subNetwork String
- Optional. The Cloud VPC sub-network in which the job is run.
- network string
- Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
- networkTags string[]
- Optional. List of network tags to apply to the job.
- subNetwork string
- Optional. The Cloud VPC sub-network in which the job is run.
- network str
- Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
- network_tags Sequence[str]
- Optional. List of network tags to apply to the job.
- sub_network str
- Optional. The Cloud VPC sub-network in which the job is run.
- network String
- Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
- networkTags List<String>
- Optional. List of network tags to apply to the job.
- subNetwork String
- Optional. The Cloud VPC sub-network in which the job is run.
GoogleCloudDataplexV1TaskNotebookTaskConfig, GoogleCloudDataplexV1TaskNotebookTaskConfigArgs
- Notebook string
- Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
- ArchiveUris List<string>
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- FileUris List<string>
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- InfrastructureSpec Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpec
- Optional. Infrastructure specification for the execution.
- Notebook string
- Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
- ArchiveUris []string
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- FileUris []string
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- InfrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpec
- Optional. Infrastructure specification for the execution.
- notebook String
- Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
- archiveUris List<String>
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- fileUris List<String>
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpec
- Optional. Infrastructure specification for the execution.
- notebook string
- Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
- archiveUris string[]
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- fileUris string[]
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpec
- Optional. Infrastructure specification for the execution.
- notebook str
- Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
- archive_uris Sequence[str]
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- file_uris Sequence[str]
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructure_spec GoogleCloudDataplexV1TaskInfrastructureSpec
- Optional. Infrastructure specification for the execution.
- notebook String
- Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
- archiveUris List<String>
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- fileUris List<String>
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructureSpec Property Map
- Optional. Infrastructure specification for the execution.
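A notebook task config bundles the required notebook path with optional per-executor file and archive URIs and an infrastructure spec. A sketch as a plain property map (bucket and object names are hypothetical):

```python
# A sketch of a notebook task config property map, mirroring the fields above.
# All URIs are hypothetical placeholders.
notebook_task = {
    "notebook": "gs://my-bucket/notebooks/etl.ipynb",    # required: input notebook
    "archiveUris": ["gs://my-bucket/deps/libs.tar.gz"],  # extracted in each executor's working dir
    "fileUris": ["gs://my-bucket/conf/settings.json"],   # placed in each executor's working dir
    "infrastructureSpec": {"vpcNetwork": {"network": "default"}},
}

# Only "notebook" is required; the other three fields are optional.
assert "notebook" in notebook_task
```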
GoogleCloudDataplexV1TaskNotebookTaskConfigResponse, GoogleCloudDataplexV1TaskNotebookTaskConfigResponseArgs
- ArchiveUris List<string>
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- FileUris List<string>
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- InfrastructureSpec Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecResponse
- Optional. Infrastructure specification for the execution.
- Notebook string
- Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
- ArchiveUris []string
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- FileUris []string
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- InfrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpecResponse
- Optional. Infrastructure specification for the execution.
- Notebook string
- Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
- archiveUris List<String>
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- fileUris List<String>
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpecResponse
- Optional. Infrastructure specification for the execution.
- notebook String
- Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
- archiveUris string[]
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- fileUris string[]
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpecResponse
- Optional. Infrastructure specification for the execution.
- notebook string
- Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
- archive_uris Sequence[str]
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- file_uris Sequence[str]
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructure_spec GoogleCloudDataplexV1TaskInfrastructureSpecResponse
- Optional. Infrastructure specification for the execution.
- notebook str
- Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
- archiveUris List<String>
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- fileUris List<String>
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructureSpec Property Map
- Optional. Infrastructure specification for the execution.
- notebook String
- Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
GoogleCloudDataplexV1TaskSparkTaskConfig, GoogleCloudDataplexV1TaskSparkTaskConfigArgs
- ArchiveUris List<string>
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- FileUris List<string>
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- InfrastructureSpec Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpec
- Optional. Infrastructure specification for the execution.
- MainClass string
- The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
- MainJarFileUri string
- The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
- PythonScriptFile string
- The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
- SqlScript string
- The query text. The execution args are used to declare a set of script variables (set key="value";).
- SqlScriptFile string
- A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
- ArchiveUris []string
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- FileUris []string
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- InfrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpec
- Optional. Infrastructure specification for the execution.
- MainClass string
- The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
- MainJarFileUri string
- The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
- PythonScriptFile string
- The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
- SqlScript string
- The query text. The execution args are used to declare a set of script variables (set key="value";).
- SqlScriptFile string
- A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
- archiveUris List<String>
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- fileUris List<String>
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpec
- Optional. Infrastructure specification for the execution.
- mainClass String
- The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
- mainJarFileUri String
- The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
- pythonScriptFile String
- The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
- sqlScript String
- The query text. The execution args are used to declare a set of script variables (set key="value";).
- sqlScriptFile String
- A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
- archiveUris string[]
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- fileUris string[]
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpec
- Optional. Infrastructure specification for the execution.
- mainClass string
- The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
- mainJarFileUri string
- The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
- pythonScriptFile string
- The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
- sqlScript string
- The query text. The execution args are used to declare a set of script variables (set key="value";).
- sqlScriptFile string
- A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
- archive_uris Sequence[str]
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- file_uris Sequence[str]
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructure_spec GoogleCloudDataplexV1TaskInfrastructureSpec
- Optional. Infrastructure specification for the execution.
- main_class str
- The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
- main_jar_file_uri str
- The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
- python_script_file str
- The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
- sql_script str
- The query text. The execution args are used to declare a set of script variables (set key="value";).
- sql_script_file str
- A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
- archiveUris List<String>
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- fileUris List<String>
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructureSpec Property Map
- Optional. Infrastructure specification for the execution.
- mainClass String
- The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
- mainJarFileUri String
- The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
- pythonScriptFile String
- The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
- sqlScript String
- The query text. The execution args are used to declare a set of script variables (set key="value";).
- sqlScriptFile String
- A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
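In the underlying API, mainClass, mainJarFileUri, pythonScriptFile, sqlScript, and sqlScriptFile are alternative ways of naming the Spark driver, and a task sets exactly one of them. A small sketch validator over a plain property map (this helper is illustrative, not part of any official SDK; the URIs are hypothetical):

```python
# Which of the Spark task config fields name the driver; exactly one should be set.
DRIVER_FIELDS = {"mainClass", "mainJarFileUri", "pythonScriptFile",
                 "sqlScript", "sqlScriptFile"}

def validate_spark_config(cfg: dict) -> dict:
    """Check that exactly one driver field is set on a Spark task config map."""
    drivers = DRIVER_FIELDS & cfg.keys()
    if len(drivers) != 1:
        raise ValueError(f"expected exactly one driver field, got {sorted(drivers)}")
    return cfg

ok = validate_spark_config({
    "pythonScriptFile": "gs://my-bucket/jobs/job.py",   # hypothetical driver script
    "fileUris": ["gs://my-bucket/conf/app.conf"],
})
assert "pythonScriptFile" in ok
```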
GoogleCloudDataplexV1TaskSparkTaskConfigResponse, GoogleCloudDataplexV1TaskSparkTaskConfigResponseArgs
- ArchiveUris List<string>
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- FileUris List<string>
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- InfrastructureSpec Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecResponse
- Optional. Infrastructure specification for the execution.
- MainClass string
- The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
- MainJarFileUri string
- The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
- PythonScriptFile string
- The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
- SqlScript string
- The query text. The execution args are used to declare a set of script variables (set key="value";).
- SqlScriptFile string
- A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
- ArchiveUris []string
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- FileUris []string
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- InfrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpecResponse
- Optional. Infrastructure specification for the execution.
- MainClass string
- The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
- MainJarFileUri string
- The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
- PythonScriptFile string
- The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
- SqlScript string
- The query text. The execution args are used to declare a set of script variables (set key="value";).
- SqlScriptFile string
- A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
- archiveUris List<String>
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- fileUris List<String>
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpecResponse
- Optional. Infrastructure specification for the execution.
- mainClass String
- The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
- mainJarFileUri String
- The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
- pythonScriptFile String
- The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
- sqlScript String
- The query text. The execution args are used to declare a set of script variables (set key="value";).
- sqlScriptFile String
- A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
- archiveUris string[]
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- fileUris string[]
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpecResponse
- Optional. Infrastructure specification for the execution.
- mainClass string
- The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
- mainJarFileUri string
- The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
- pythonScriptFile string
- The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
- sqlScript string
- The query text. The execution args are used to declare a set of script variables (set key="value";).
- sqlScriptFile string
- A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
- archive_uris Sequence[str]
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- file_uris Sequence[str]
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructure_spec GoogleCloudDataplexV1TaskInfrastructureSpecResponse
- Optional. Infrastructure specification for the execution.
- main_class str
- The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
- main_jar_file_uri str
- The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
- python_script_file str
- The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
- sql_script str
- The query text. The execution args are used to declare a set of script variables (set key="value";).
- sql_script_file str
- A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
- archiveUris List<String>
- Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- fileUris List<String>
- Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructureSpec Property Map
- Optional. Infrastructure specification for the execution.
- mainClass String
- The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
- mainJarFileUri String
- The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
- pythonScriptFile String
- The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
- sqlScript String
- The query text. The execution args are used to declare a set of script variables (set key="value";).
- sqlScriptFile String
- A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
GoogleCloudDataplexV1TaskTriggerSpec, GoogleCloudDataplexV1TaskTriggerSpecArgs
- Type
Pulumi.
Google Native. Dataplex. V1. Google Cloud Dataplex V1Task Trigger Spec Type - Immutable. Trigger type of the user-specified Task.
- Disabled bool
- Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
- Max
Retries int - Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
- Schedule string
- Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
- Start
Time string - Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
- Type
Google
Cloud Dataplex V1Task Trigger Spec Type - Immutable. Trigger type of the user-specified Task.
- Disabled bool
- Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
- Max
Retries int - Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
- Schedule string
- Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
- Start
Time string - Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
- type
Google
Cloud Dataplex V1Task Trigger Spec Type - Immutable. Trigger type of the user-specified Task.
- disabled Boolean
- Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
- max
Retries Integer - Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
- schedule String
- Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
- startTime String
- Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
- type GoogleCloudDataplexV1TaskTriggerSpecType
- Immutable. Trigger type of the user-specified Task.
- disabled boolean
- Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
- maxRetries number
- Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
- schedule string
- Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
- startTime string
- Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
- type GoogleCloudDataplexV1TaskTriggerSpecType
- Immutable. Trigger type of the user-specified Task.
- disabled bool
- Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
- max_retries int
- Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
- schedule str
- Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
- start_time str
- Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
- type "TYPE_UNSPECIFIED" | "ON_DEMAND" | "RECURRING"
- Immutable. Trigger type of the user-specified Task.
- disabled Boolean
- Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
- maxRetries Number
- Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
- schedule String
- Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
- startTime String
- Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
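The schedule field above accepts an optional "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}" prefix ahead of the cron expression. As a minimal sketch of that convention (the `split_schedule` helper below is illustrative, not part of the provider), the prefix can be separated and the zone checked against the IANA database with the Python standard library:

```python
from __future__ import annotations

from zoneinfo import ZoneInfo, ZoneInfoNotFoundError


def split_schedule(schedule: str) -> tuple[str | None, str]:
    """Split an optional CRON_TZ=/TZ= prefix off a trigger-spec schedule.

    Returns (iana_time_zone or None, cron_expression). Raises ValueError
    when the prefix names a zone that is not in the IANA time zone database.
    """
    for prefix in ("CRON_TZ=", "TZ="):
        if schedule.startswith(prefix):
            tz, _, cron = schedule[len(prefix):].partition(" ")
            try:
                ZoneInfo(tz)  # validate against the IANA tz database
            except ZoneInfoNotFoundError:
                raise ValueError(f"not a valid IANA time zone: {tz}") from None
            return tz, cron.strip()
    # No prefix: the whole string is a bare cron expression.
    return None, schedule
```

For example, `split_schedule("CRON_TZ=America/New_York 1 * * * *")` yields the zone `America/New_York` and the cron expression `1 * * * *`.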
GoogleCloudDataplexV1TaskTriggerSpecResponse, GoogleCloudDataplexV1TaskTriggerSpecResponseArgs
- Disabled bool
- Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
- MaxRetries int
- Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
- Schedule string
- Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
- StartTime string
- Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
- Type string
- Immutable. Trigger type of the user-specified Task.
- Disabled bool
- Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
- MaxRetries int
- Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
- Schedule string
- Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
- StartTime string
- Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
- Type string
- Immutable. Trigger type of the user-specified Task.
- disabled Boolean
- Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
- maxRetries Integer
- Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
- schedule String
- Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
- startTime String
- Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
- type String
- Immutable. Trigger type of the user-specified Task.
- disabled boolean
- Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
- maxRetries number
- Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
- schedule string
- Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
- startTime string
- Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
- type string
- Immutable. Trigger type of the user-specified Task.
- disabled bool
- Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
- max_retries int
- Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
- schedule str
- Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
- start_time str
- Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
- type str
- Immutable. Trigger type of the user-specified Task.
- disabled Boolean
- Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
- maxRetries Number
- Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
- schedule String
- Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
- startTime String
- Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
- type String
- Immutable. Trigger type of the user-specified Task.
GoogleCloudDataplexV1TaskTriggerSpecType, GoogleCloudDataplexV1TaskTriggerSpecTypeArgs
- TypeUnspecified
- TYPE_UNSPECIFIED: Unspecified trigger type.
- OnDemand
- ON_DEMAND: The task runs one time, shortly after task creation.
- Recurring
- RECURRING: The task is scheduled to run periodically.
- GoogleCloudDataplexV1TaskTriggerSpecTypeTypeUnspecified
- TYPE_UNSPECIFIED: Unspecified trigger type.
- GoogleCloudDataplexV1TaskTriggerSpecTypeOnDemand
- ON_DEMAND: The task runs one time, shortly after task creation.
- GoogleCloudDataplexV1TaskTriggerSpecTypeRecurring
- RECURRING: The task is scheduled to run periodically.
- TypeUnspecified
- TYPE_UNSPECIFIED: Unspecified trigger type.
- OnDemand
- ON_DEMAND: The task runs one time, shortly after task creation.
- Recurring
- RECURRING: The task is scheduled to run periodically.
- TypeUnspecified
- TYPE_UNSPECIFIED: Unspecified trigger type.
- OnDemand
- ON_DEMAND: The task runs one time, shortly after task creation.
- Recurring
- RECURRING: The task is scheduled to run periodically.
- TYPE_UNSPECIFIED
- TYPE_UNSPECIFIED: Unspecified trigger type.
- ON_DEMAND
- ON_DEMAND: The task runs one time, shortly after task creation.
- RECURRING
- RECURRING: The task is scheduled to run periodically.
- "TYPE_UNSPECIFIED"
- TYPE_UNSPECIFIED: Unspecified trigger type.
- "ON_DEMAND"
- ON_DEMAND: The task runs one time, shortly after task creation.
- "RECURRING"
- RECURRING: The task is scheduled to run periodically.
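Tying the enum and trigger-spec fields together, a RECURRING task might be declared in Pulumi YAML as sketched below. The resource type and the `triggerSpec` property names come from this page; the lake/task IDs, service account, and Spark entry point are placeholder assumptions, and `executionSpec`/`spark` details are abbreviated:

```yaml
resources:
  nightlySparkTask:
    type: google-native:dataplex/v1:Task
    properties:
      lakeId: my-lake                    # assumed lake ID
      taskId: hourly-report              # assumed task ID
      triggerSpec:
        type: RECURRING
        # Runs at minute 1 of every hour, Eastern time (CRON_TZ prefix).
        schedule: CRON_TZ=America/New_York 1 * * * *
        maxRetries: 3
        disabled: false                  # set true to pause without deleting
      executionSpec:
        serviceAccount: sa@my-project.iam.gserviceaccount.com   # assumed
      spark:
        mainClass: com.example.Report    # assumed Spark entry point
```

Setting `disabled: true` later pauses the recurring schedule without cancelling runs that are already in progress, matching the `disabled` field's description above.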
Package Details
- Repository
- Google Cloud Native pulumi/pulumi-google-native
- License
- Apache-2.0