
We recommend using Azure Native.

Azure v6.10.0 published on Tuesday, Nov 19, 2024 by Pulumi

azure.datafactory.DataFlow



    Manages a Data Flow inside an Azure Data Factory.

    Example Usage

    import * as pulumi from "@pulumi/pulumi";
    import * as azure from "@pulumi/azure";
    
    const example = new azure.core.ResourceGroup("example", {
        name: "example-resources",
        location: "West Europe",
    });
    const exampleAccount = new azure.storage.Account("example", {
        name: "example",
        location: example.location,
        resourceGroupName: example.name,
        accountTier: "Standard",
        accountReplicationType: "LRS",
    });
    const exampleFactory = new azure.datafactory.Factory("example", {
        name: "example",
        location: example.location,
        resourceGroupName: example.name,
    });
    const exampleLinkedCustomService = new azure.datafactory.LinkedCustomService("example", {
        name: "linked_service",
        dataFactoryId: exampleFactory.id,
        type: "AzureBlobStorage",
        typePropertiesJson: pulumi.interpolate`{
      "connectionString": "${exampleAccount.primaryConnectionString}"
    }
    `,
    });
    const example1 = new azure.datafactory.DatasetJson("example1", {
        name: "dataset1",
        dataFactoryId: exampleFactory.id,
        linkedServiceName: exampleLinkedCustomService.name,
        azureBlobStorageLocation: {
            container: "container",
            path: "foo/bar/",
            filename: "foo.txt",
        },
        encoding: "UTF-8",
    });
    const example2 = new azure.datafactory.DatasetJson("example2", {
        name: "dataset2",
        dataFactoryId: exampleFactory.id,
        linkedServiceName: exampleLinkedCustomService.name,
        azureBlobStorageLocation: {
            container: "container",
            path: "foo/bar/",
            filename: "bar.txt",
        },
        encoding: "UTF-8",
    });
    const example1FlowletDataFlow = new azure.datafactory.FlowletDataFlow("example1", {
        name: "example",
        dataFactoryId: exampleFactory.id,
        sources: [{
            name: "source1",
            linkedService: {
                name: exampleLinkedCustomService.name,
            },
        }],
        sinks: [{
            name: "sink1",
            linkedService: {
                name: exampleLinkedCustomService.name,
            },
        }],
        script: `source(
      allowSchemaDrift: true, 
      validateSchema: false, 
      limit: 100, 
      ignoreNoFilesFound: false, 
      documentForm: 'documentPerLine') ~> source1 
    source1 sink(
      allowSchemaDrift: true, 
      validateSchema: false, 
      skipDuplicateMapInputs: true, 
      skipDuplicateMapOutputs: true) ~> sink1
    `,
    });
    const example2FlowletDataFlow = new azure.datafactory.FlowletDataFlow("example2", {
        name: "example",
        dataFactoryId: exampleFactory.id,
        sources: [{
            name: "source1",
            linkedService: {
                name: exampleLinkedCustomService.name,
            },
        }],
        sinks: [{
            name: "sink1",
            linkedService: {
                name: exampleLinkedCustomService.name,
            },
        }],
        script: `source(
      allowSchemaDrift: true, 
      validateSchema: false, 
      limit: 100, 
      ignoreNoFilesFound: false, 
      documentForm: 'documentPerLine') ~> source1 
    source1 sink(
      allowSchemaDrift: true, 
      validateSchema: false, 
      skipDuplicateMapInputs: true, 
      skipDuplicateMapOutputs: true) ~> sink1
    `,
    });
    const exampleDataFlow = new azure.datafactory.DataFlow("example", {
        name: "example",
        dataFactoryId: exampleFactory.id,
        sources: [{
            name: "source1",
            flowlet: {
                name: example1FlowletDataFlow.name,
                parameters: {
                    Key1: "value1",
                },
            },
            dataset: {
                name: example1.name,
            },
        }],
        sinks: [{
            name: "sink1",
            flowlet: {
                name: example2FlowletDataFlow.name,
                parameters: {
                    Key1: "value1",
                },
            },
            dataset: {
                name: example2.name,
            },
        }],
        script: `source(
      allowSchemaDrift: true, 
      validateSchema: false, 
      limit: 100, 
      ignoreNoFilesFound: false, 
      documentForm: 'documentPerLine') ~> source1 
    source1 sink(
      allowSchemaDrift: true, 
      validateSchema: false, 
      skipDuplicateMapInputs: true, 
      skipDuplicateMapOutputs: true) ~> sink1
    `,
    });
    
    import pulumi
    import pulumi_azure as azure
    
    example = azure.core.ResourceGroup("example",
        name="example-resources",
        location="West Europe")
    example_account = azure.storage.Account("example",
        name="example",
        location=example.location,
        resource_group_name=example.name,
        account_tier="Standard",
        account_replication_type="LRS")
    example_factory = azure.datafactory.Factory("example",
        name="example",
        location=example.location,
        resource_group_name=example.name)
    example_linked_custom_service = azure.datafactory.LinkedCustomService("example",
        name="linked_service",
        data_factory_id=example_factory.id,
        type="AzureBlobStorage",
        type_properties_json=example_account.primary_connection_string.apply(lambda primary_connection_string: f"""{{
      "connectionString": "{primary_connection_string}"
    }}
    """))
    example1 = azure.datafactory.DatasetJson("example1",
        name="dataset1",
        data_factory_id=example_factory.id,
        linked_service_name=example_linked_custom_service.name,
        azure_blob_storage_location={
            "container": "container",
            "path": "foo/bar/",
            "filename": "foo.txt",
        },
        encoding="UTF-8")
    example2 = azure.datafactory.DatasetJson("example2",
        name="dataset2",
        data_factory_id=example_factory.id,
        linked_service_name=example_linked_custom_service.name,
        azure_blob_storage_location={
            "container": "container",
            "path": "foo/bar/",
            "filename": "bar.txt",
        },
        encoding="UTF-8")
    example1_flowlet_data_flow = azure.datafactory.FlowletDataFlow("example1",
        name="example",
        data_factory_id=example_factory.id,
        sources=[{
            "name": "source1",
            "linked_service": {
                "name": example_linked_custom_service.name,
            },
        }],
        sinks=[{
            "name": "sink1",
            "linked_service": {
                "name": example_linked_custom_service.name,
            },
        }],
        script="""source(
      allowSchemaDrift: true, 
      validateSchema: false, 
      limit: 100, 
      ignoreNoFilesFound: false, 
      documentForm: 'documentPerLine') ~> source1 
    source1 sink(
      allowSchemaDrift: true, 
      validateSchema: false, 
      skipDuplicateMapInputs: true, 
      skipDuplicateMapOutputs: true) ~> sink1
    """)
    example2_flowlet_data_flow = azure.datafactory.FlowletDataFlow("example2",
        name="example",
        data_factory_id=example_factory.id,
        sources=[{
            "name": "source1",
            "linked_service": {
                "name": example_linked_custom_service.name,
            },
        }],
        sinks=[{
            "name": "sink1",
            "linked_service": {
                "name": example_linked_custom_service.name,
            },
        }],
        script="""source(
      allowSchemaDrift: true, 
      validateSchema: false, 
      limit: 100, 
      ignoreNoFilesFound: false, 
      documentForm: 'documentPerLine') ~> source1 
    source1 sink(
      allowSchemaDrift: true, 
      validateSchema: false, 
      skipDuplicateMapInputs: true, 
      skipDuplicateMapOutputs: true) ~> sink1
    """)
    example_data_flow = azure.datafactory.DataFlow("example",
        name="example",
        data_factory_id=example_factory.id,
        sources=[{
            "name": "source1",
            "flowlet": {
                "name": example1_flowlet_data_flow.name,
                "parameters": {
                    "Key1": "value1",
                },
            },
            "dataset": {
                "name": example1.name,
            },
        }],
        sinks=[{
            "name": "sink1",
            "flowlet": {
                "name": example2_flowlet_data_flow.name,
                "parameters": {
                    "Key1": "value1",
                },
            },
            "dataset": {
                "name": example2.name,
            },
        }],
        script="""source(
      allowSchemaDrift: true, 
      validateSchema: false, 
      limit: 100, 
      ignoreNoFilesFound: false, 
      documentForm: 'documentPerLine') ~> source1 
    source1 sink(
      allowSchemaDrift: true, 
      validateSchema: false, 
      skipDuplicateMapInputs: true, 
      skipDuplicateMapOutputs: true) ~> sink1
    """)
    
    package main
    
    import (
    	"fmt"
    
    	"github.com/pulumi/pulumi-azure/sdk/v6/go/azure/core"
    	"github.com/pulumi/pulumi-azure/sdk/v6/go/azure/datafactory"
    	"github.com/pulumi/pulumi-azure/sdk/v6/go/azure/storage"
    	"github.com/pulumi/pulumi/sdk/v3/go/pulumi"
    )
    
    func main() {
    	pulumi.Run(func(ctx *pulumi.Context) error {
    		example, err := core.NewResourceGroup(ctx, "example", &core.ResourceGroupArgs{
    			Name:     pulumi.String("example-resources"),
    			Location: pulumi.String("West Europe"),
    		})
    		if err != nil {
    			return err
    		}
    		exampleAccount, err := storage.NewAccount(ctx, "example", &storage.AccountArgs{
    			Name:                   pulumi.String("example"),
    			Location:               example.Location,
    			ResourceGroupName:      example.Name,
    			AccountTier:            pulumi.String("Standard"),
    			AccountReplicationType: pulumi.String("LRS"),
    		})
    		if err != nil {
    			return err
    		}
    		exampleFactory, err := datafactory.NewFactory(ctx, "example", &datafactory.FactoryArgs{
    			Name:              pulumi.String("example"),
    			Location:          example.Location,
    			ResourceGroupName: example.Name,
    		})
    		if err != nil {
    			return err
    		}
    		exampleLinkedCustomService, err := datafactory.NewLinkedCustomService(ctx, "example", &datafactory.LinkedCustomServiceArgs{
    			Name:          pulumi.String("linked_service"),
    			DataFactoryId: exampleFactory.ID(),
    			Type:          pulumi.String("AzureBlobStorage"),
    			TypePropertiesJson: exampleAccount.PrimaryConnectionString.ApplyT(func(primaryConnectionString string) (string, error) {
    				return fmt.Sprintf("{\n  \"connectionString\": \"%v\"\n}\n", primaryConnectionString), nil
    			}).(pulumi.StringOutput),
    		})
    		if err != nil {
    			return err
    		}
    		example1, err := datafactory.NewDatasetJson(ctx, "example1", &datafactory.DatasetJsonArgs{
    			Name:              pulumi.String("dataset1"),
    			DataFactoryId:     exampleFactory.ID(),
    			LinkedServiceName: exampleLinkedCustomService.Name,
    			AzureBlobStorageLocation: &datafactory.DatasetJsonAzureBlobStorageLocationArgs{
    				Container: pulumi.String("container"),
    				Path:      pulumi.String("foo/bar/"),
    				Filename:  pulumi.String("foo.txt"),
    			},
    			Encoding: pulumi.String("UTF-8"),
    		})
    		if err != nil {
    			return err
    		}
    		example2, err := datafactory.NewDatasetJson(ctx, "example2", &datafactory.DatasetJsonArgs{
    			Name:              pulumi.String("dataset2"),
    			DataFactoryId:     exampleFactory.ID(),
    			LinkedServiceName: exampleLinkedCustomService.Name,
    			AzureBlobStorageLocation: &datafactory.DatasetJsonAzureBlobStorageLocationArgs{
    				Container: pulumi.String("container"),
    				Path:      pulumi.String("foo/bar/"),
    				Filename:  pulumi.String("bar.txt"),
    			},
    			Encoding: pulumi.String("UTF-8"),
    		})
    		if err != nil {
    			return err
    		}
    		example1FlowletDataFlow, err := datafactory.NewFlowletDataFlow(ctx, "example1", &datafactory.FlowletDataFlowArgs{
    			Name:          pulumi.String("example"),
    			DataFactoryId: exampleFactory.ID(),
    			Sources: datafactory.FlowletDataFlowSourceArray{
    				&datafactory.FlowletDataFlowSourceArgs{
    					Name: pulumi.String("source1"),
    					LinkedService: &datafactory.FlowletDataFlowSourceLinkedServiceArgs{
    						Name: exampleLinkedCustomService.Name,
    					},
    				},
    			},
    			Sinks: datafactory.FlowletDataFlowSinkArray{
    				&datafactory.FlowletDataFlowSinkArgs{
    					Name: pulumi.String("sink1"),
    					LinkedService: &datafactory.FlowletDataFlowSinkLinkedServiceArgs{
    						Name: exampleLinkedCustomService.Name,
    					},
    				},
    			},
    			Script: pulumi.String(`source(
      allowSchemaDrift: true, 
      validateSchema: false, 
      limit: 100, 
      ignoreNoFilesFound: false, 
      documentForm: 'documentPerLine') ~> source1 
    source1 sink(
      allowSchemaDrift: true, 
      validateSchema: false, 
      skipDuplicateMapInputs: true, 
      skipDuplicateMapOutputs: true) ~> sink1
    `),
    		})
    		if err != nil {
    			return err
    		}
    		example2FlowletDataFlow, err := datafactory.NewFlowletDataFlow(ctx, "example2", &datafactory.FlowletDataFlowArgs{
    			Name:          pulumi.String("example"),
    			DataFactoryId: exampleFactory.ID(),
    			Sources: datafactory.FlowletDataFlowSourceArray{
    				&datafactory.FlowletDataFlowSourceArgs{
    					Name: pulumi.String("source1"),
    					LinkedService: &datafactory.FlowletDataFlowSourceLinkedServiceArgs{
    						Name: exampleLinkedCustomService.Name,
    					},
    				},
    			},
    			Sinks: datafactory.FlowletDataFlowSinkArray{
    				&datafactory.FlowletDataFlowSinkArgs{
    					Name: pulumi.String("sink1"),
    					LinkedService: &datafactory.FlowletDataFlowSinkLinkedServiceArgs{
    						Name: exampleLinkedCustomService.Name,
    					},
    				},
    			},
    			Script: pulumi.String(`source(
      allowSchemaDrift: true, 
      validateSchema: false, 
      limit: 100, 
      ignoreNoFilesFound: false, 
      documentForm: 'documentPerLine') ~> source1 
    source1 sink(
      allowSchemaDrift: true, 
      validateSchema: false, 
      skipDuplicateMapInputs: true, 
      skipDuplicateMapOutputs: true) ~> sink1
    `),
    		})
    		if err != nil {
    			return err
    		}
    		_, err = datafactory.NewDataFlow(ctx, "example", &datafactory.DataFlowArgs{
    			Name:          pulumi.String("example"),
    			DataFactoryId: exampleFactory.ID(),
    			Sources: datafactory.DataFlowSourceArray{
    				&datafactory.DataFlowSourceArgs{
    					Name: pulumi.String("source1"),
    					Flowlet: &datafactory.DataFlowSourceFlowletArgs{
    						Name: example1FlowletDataFlow.Name,
    						Parameters: pulumi.StringMap{
    							"Key1": pulumi.String("value1"),
    						},
    					},
    					Dataset: &datafactory.DataFlowSourceDatasetArgs{
    						Name: example1.Name,
    					},
    				},
    			},
    			Sinks: datafactory.DataFlowSinkArray{
    				&datafactory.DataFlowSinkArgs{
    					Name: pulumi.String("sink1"),
    					Flowlet: &datafactory.DataFlowSinkFlowletArgs{
    						Name: example2FlowletDataFlow.Name,
    						Parameters: pulumi.StringMap{
    							"Key1": pulumi.String("value1"),
    						},
    					},
    					Dataset: &datafactory.DataFlowSinkDatasetArgs{
    						Name: example2.Name,
    					},
    				},
    			},
    			Script: pulumi.String(`source(
      allowSchemaDrift: true, 
      validateSchema: false, 
      limit: 100, 
      ignoreNoFilesFound: false, 
      documentForm: 'documentPerLine') ~> source1 
    source1 sink(
      allowSchemaDrift: true, 
      validateSchema: false, 
      skipDuplicateMapInputs: true, 
      skipDuplicateMapOutputs: true) ~> sink1
    `),
    		})
    		if err != nil {
    			return err
    		}
    		return nil
    	})
    }
    
    using System.Collections.Generic;
    using System.Linq;
    using Pulumi;
    using Azure = Pulumi.Azure;
    
    return await Deployment.RunAsync(() => 
    {
        var example = new Azure.Core.ResourceGroup("example", new()
        {
            Name = "example-resources",
            Location = "West Europe",
        });
    
        var exampleAccount = new Azure.Storage.Account("example", new()
        {
            Name = "example",
            Location = example.Location,
            ResourceGroupName = example.Name,
            AccountTier = "Standard",
            AccountReplicationType = "LRS",
        });
    
        var exampleFactory = new Azure.DataFactory.Factory("example", new()
        {
            Name = "example",
            Location = example.Location,
            ResourceGroupName = example.Name,
        });
    
        var exampleLinkedCustomService = new Azure.DataFactory.LinkedCustomService("example", new()
        {
            Name = "linked_service",
            DataFactoryId = exampleFactory.Id,
            Type = "AzureBlobStorage",
            TypePropertiesJson = exampleAccount.PrimaryConnectionString.Apply(primaryConnectionString => @$"{{
      ""connectionString"": ""{primaryConnectionString}""
    }}
    "),
        });
    
        var example1 = new Azure.DataFactory.DatasetJson("example1", new()
        {
            Name = "dataset1",
            DataFactoryId = exampleFactory.Id,
            LinkedServiceName = exampleLinkedCustomService.Name,
            AzureBlobStorageLocation = new Azure.DataFactory.Inputs.DatasetJsonAzureBlobStorageLocationArgs
            {
                Container = "container",
                Path = "foo/bar/",
                Filename = "foo.txt",
            },
            Encoding = "UTF-8",
        });
    
        var example2 = new Azure.DataFactory.DatasetJson("example2", new()
        {
            Name = "dataset2",
            DataFactoryId = exampleFactory.Id,
            LinkedServiceName = exampleLinkedCustomService.Name,
            AzureBlobStorageLocation = new Azure.DataFactory.Inputs.DatasetJsonAzureBlobStorageLocationArgs
            {
                Container = "container",
                Path = "foo/bar/",
                Filename = "bar.txt",
            },
            Encoding = "UTF-8",
        });
    
        var example1FlowletDataFlow = new Azure.DataFactory.FlowletDataFlow("example1", new()
        {
            Name = "example",
            DataFactoryId = exampleFactory.Id,
            Sources = new[]
            {
                new Azure.DataFactory.Inputs.FlowletDataFlowSourceArgs
                {
                    Name = "source1",
                    LinkedService = new Azure.DataFactory.Inputs.FlowletDataFlowSourceLinkedServiceArgs
                    {
                        Name = exampleLinkedCustomService.Name,
                    },
                },
            },
            Sinks = new[]
            {
                new Azure.DataFactory.Inputs.FlowletDataFlowSinkArgs
                {
                    Name = "sink1",
                    LinkedService = new Azure.DataFactory.Inputs.FlowletDataFlowSinkLinkedServiceArgs
                    {
                        Name = exampleLinkedCustomService.Name,
                    },
                },
            },
            Script = @"source(
      allowSchemaDrift: true, 
      validateSchema: false, 
      limit: 100, 
      ignoreNoFilesFound: false, 
      documentForm: 'documentPerLine') ~> source1 
    source1 sink(
      allowSchemaDrift: true, 
      validateSchema: false, 
      skipDuplicateMapInputs: true, 
      skipDuplicateMapOutputs: true) ~> sink1
    ",
        });
    
        var example2FlowletDataFlow = new Azure.DataFactory.FlowletDataFlow("example2", new()
        {
            Name = "example",
            DataFactoryId = exampleFactory.Id,
            Sources = new[]
            {
                new Azure.DataFactory.Inputs.FlowletDataFlowSourceArgs
                {
                    Name = "source1",
                    LinkedService = new Azure.DataFactory.Inputs.FlowletDataFlowSourceLinkedServiceArgs
                    {
                        Name = exampleLinkedCustomService.Name,
                    },
                },
            },
            Sinks = new[]
            {
                new Azure.DataFactory.Inputs.FlowletDataFlowSinkArgs
                {
                    Name = "sink1",
                    LinkedService = new Azure.DataFactory.Inputs.FlowletDataFlowSinkLinkedServiceArgs
                    {
                        Name = exampleLinkedCustomService.Name,
                    },
                },
            },
            Script = @"source(
      allowSchemaDrift: true, 
      validateSchema: false, 
      limit: 100, 
      ignoreNoFilesFound: false, 
      documentForm: 'documentPerLine') ~> source1 
    source1 sink(
      allowSchemaDrift: true, 
      validateSchema: false, 
      skipDuplicateMapInputs: true, 
      skipDuplicateMapOutputs: true) ~> sink1
    ",
        });
    
        var exampleDataFlow = new Azure.DataFactory.DataFlow("example", new()
        {
            Name = "example",
            DataFactoryId = exampleFactory.Id,
            Sources = new[]
            {
                new Azure.DataFactory.Inputs.DataFlowSourceArgs
                {
                    Name = "source1",
                    Flowlet = new Azure.DataFactory.Inputs.DataFlowSourceFlowletArgs
                    {
                        Name = example1FlowletDataFlow.Name,
                        Parameters = 
                        {
                            { "Key1", "value1" },
                        },
                    },
                    Dataset = new Azure.DataFactory.Inputs.DataFlowSourceDatasetArgs
                    {
                        Name = example1.Name,
                    },
                },
            },
            Sinks = new[]
            {
                new Azure.DataFactory.Inputs.DataFlowSinkArgs
                {
                    Name = "sink1",
                    Flowlet = new Azure.DataFactory.Inputs.DataFlowSinkFlowletArgs
                    {
                        Name = example2FlowletDataFlow.Name,
                        Parameters = 
                        {
                            { "Key1", "value1" },
                        },
                    },
                    Dataset = new Azure.DataFactory.Inputs.DataFlowSinkDatasetArgs
                    {
                        Name = example2.Name,
                    },
                },
            },
            Script = @"source(
      allowSchemaDrift: true, 
      validateSchema: false, 
      limit: 100, 
      ignoreNoFilesFound: false, 
      documentForm: 'documentPerLine') ~> source1 
    source1 sink(
      allowSchemaDrift: true, 
      validateSchema: false, 
      skipDuplicateMapInputs: true, 
      skipDuplicateMapOutputs: true) ~> sink1
    ",
        });
    
    });
    
    package generated_program;
    
    import com.pulumi.Context;
    import com.pulumi.Pulumi;
    import com.pulumi.core.Output;
    import com.pulumi.azure.core.ResourceGroup;
    import com.pulumi.azure.core.ResourceGroupArgs;
    import com.pulumi.azure.storage.Account;
    import com.pulumi.azure.storage.AccountArgs;
    import com.pulumi.azure.datafactory.Factory;
    import com.pulumi.azure.datafactory.FactoryArgs;
    import com.pulumi.azure.datafactory.LinkedCustomService;
    import com.pulumi.azure.datafactory.LinkedCustomServiceArgs;
    import com.pulumi.azure.datafactory.DatasetJson;
    import com.pulumi.azure.datafactory.DatasetJsonArgs;
    import com.pulumi.azure.datafactory.inputs.DatasetJsonAzureBlobStorageLocationArgs;
    import com.pulumi.azure.datafactory.FlowletDataFlow;
    import com.pulumi.azure.datafactory.FlowletDataFlowArgs;
    import com.pulumi.azure.datafactory.inputs.FlowletDataFlowSourceArgs;
    import com.pulumi.azure.datafactory.inputs.FlowletDataFlowSourceLinkedServiceArgs;
    import com.pulumi.azure.datafactory.inputs.FlowletDataFlowSinkArgs;
    import com.pulumi.azure.datafactory.inputs.FlowletDataFlowSinkLinkedServiceArgs;
    import com.pulumi.azure.datafactory.DataFlow;
    import com.pulumi.azure.datafactory.DataFlowArgs;
    import com.pulumi.azure.datafactory.inputs.DataFlowSourceArgs;
    import com.pulumi.azure.datafactory.inputs.DataFlowSourceFlowletArgs;
    import com.pulumi.azure.datafactory.inputs.DataFlowSourceDatasetArgs;
    import com.pulumi.azure.datafactory.inputs.DataFlowSinkArgs;
    import com.pulumi.azure.datafactory.inputs.DataFlowSinkFlowletArgs;
    import com.pulumi.azure.datafactory.inputs.DataFlowSinkDatasetArgs;
    import java.util.List;
    import java.util.ArrayList;
    import java.util.Map;
    import java.io.File;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    
    public class App {
        public static void main(String[] args) {
            Pulumi.run(App::stack);
        }
    
        public static void stack(Context ctx) {
            var example = new ResourceGroup("example", ResourceGroupArgs.builder()
                .name("example-resources")
                .location("West Europe")
                .build());
    
            var exampleAccount = new Account("exampleAccount", AccountArgs.builder()
                .name("example")
                .location(example.location())
                .resourceGroupName(example.name())
                .accountTier("Standard")
                .accountReplicationType("LRS")
                .build());
    
            var exampleFactory = new Factory("exampleFactory", FactoryArgs.builder()
                .name("example")
                .location(example.location())
                .resourceGroupName(example.name())
                .build());
    
            var exampleLinkedCustomService = new LinkedCustomService("exampleLinkedCustomService", LinkedCustomServiceArgs.builder()
                .name("linked_service")
                .dataFactoryId(exampleFactory.id())
                .type("AzureBlobStorage")
            .typePropertiesJson(exampleAccount.primaryConnectionString().applyValue(primaryConnectionString -> String.format("""
{
  "connectionString": "%s"
}
""", primaryConnectionString)))
                .build());
    
            var example1 = new DatasetJson("example1", DatasetJsonArgs.builder()
                .name("dataset1")
                .dataFactoryId(exampleFactory.id())
                .linkedServiceName(exampleLinkedCustomService.name())
                .azureBlobStorageLocation(DatasetJsonAzureBlobStorageLocationArgs.builder()
                    .container("container")
                    .path("foo/bar/")
                    .filename("foo.txt")
                    .build())
                .encoding("UTF-8")
                .build());
    
            var example2 = new DatasetJson("example2", DatasetJsonArgs.builder()
                .name("dataset2")
                .dataFactoryId(exampleFactory.id())
                .linkedServiceName(exampleLinkedCustomService.name())
                .azureBlobStorageLocation(DatasetJsonAzureBlobStorageLocationArgs.builder()
                    .container("container")
                    .path("foo/bar/")
                    .filename("bar.txt")
                    .build())
                .encoding("UTF-8")
                .build());
    
            var example1FlowletDataFlow = new FlowletDataFlow("example1FlowletDataFlow", FlowletDataFlowArgs.builder()
                .name("example")
                .dataFactoryId(exampleFactory.id())
                .sources(FlowletDataFlowSourceArgs.builder()
                    .name("source1")
                    .linkedService(FlowletDataFlowSourceLinkedServiceArgs.builder()
                        .name(exampleLinkedCustomService.name())
                        .build())
                    .build())
                .sinks(FlowletDataFlowSinkArgs.builder()
                    .name("sink1")
                    .linkedService(FlowletDataFlowSinkLinkedServiceArgs.builder()
                        .name(exampleLinkedCustomService.name())
                        .build())
                    .build())
                .script("""
    source(
      allowSchemaDrift: true, 
      validateSchema: false, 
      limit: 100, 
      ignoreNoFilesFound: false, 
      documentForm: 'documentPerLine') ~> source1 
    source1 sink(
      allowSchemaDrift: true, 
      validateSchema: false, 
      skipDuplicateMapInputs: true, 
      skipDuplicateMapOutputs: true) ~> sink1
                """)
                .build());
    
            var example2FlowletDataFlow = new FlowletDataFlow("example2FlowletDataFlow", FlowletDataFlowArgs.builder()
                .name("example")
                .dataFactoryId(exampleFactory.id())
                .sources(FlowletDataFlowSourceArgs.builder()
                    .name("source1")
                    .linkedService(FlowletDataFlowSourceLinkedServiceArgs.builder()
                        .name(exampleLinkedCustomService.name())
                        .build())
                    .build())
                .sinks(FlowletDataFlowSinkArgs.builder()
                    .name("sink1")
                    .linkedService(FlowletDataFlowSinkLinkedServiceArgs.builder()
                        .name(exampleLinkedCustomService.name())
                        .build())
                    .build())
                .script("""
    source(
      allowSchemaDrift: true, 
      validateSchema: false, 
      limit: 100, 
      ignoreNoFilesFound: false, 
      documentForm: 'documentPerLine') ~> source1 
    source1 sink(
      allowSchemaDrift: true, 
      validateSchema: false, 
      skipDuplicateMapInputs: true, 
      skipDuplicateMapOutputs: true) ~> sink1
                """)
                .build());
    
            var exampleDataFlow = new DataFlow("exampleDataFlow", DataFlowArgs.builder()
                .name("example")
                .dataFactoryId(exampleFactory.id())
                .sources(DataFlowSourceArgs.builder()
                    .name("source1")
                    .flowlet(DataFlowSourceFlowletArgs.builder()
                        .name(example1FlowletDataFlow.name())
                        .parameters(Map.of("Key1", "value1"))
                        .build())
                    .dataset(DataFlowSourceDatasetArgs.builder()
                        .name(example1.name())
                        .build())
                    .build())
                .sinks(DataFlowSinkArgs.builder()
                    .name("sink1")
                    .flowlet(DataFlowSinkFlowletArgs.builder()
                        .name(example2FlowletDataFlow.name())
                        .parameters(Map.of("Key1", "value1"))
                        .build())
                    .dataset(DataFlowSinkDatasetArgs.builder()
                        .name(example2.name())
                        .build())
                    .build())
                .script("""
    source(
      allowSchemaDrift: true, 
      validateSchema: false, 
      limit: 100, 
      ignoreNoFilesFound: false, 
      documentForm: 'documentPerLine') ~> source1 
    source1 sink(
      allowSchemaDrift: true, 
      validateSchema: false, 
      skipDuplicateMapInputs: true, 
      skipDuplicateMapOutputs: true) ~> sink1
                """)
                .build());
    
        }
    }
    
    resources:
      example:
        type: azure:core:ResourceGroup
        properties:
          name: example-resources
          location: West Europe
      exampleAccount:
        type: azure:storage:Account
        name: example
        properties:
          name: example
          location: ${example.location}
          resourceGroupName: ${example.name}
          accountTier: Standard
          accountReplicationType: LRS
      exampleFactory:
        type: azure:datafactory:Factory
        name: example
        properties:
          name: example
          location: ${example.location}
          resourceGroupName: ${example.name}
      exampleLinkedCustomService:
        type: azure:datafactory:LinkedCustomService
        name: example
        properties:
          name: linked_service
          dataFactoryId: ${exampleFactory.id}
          type: AzureBlobStorage
          typePropertiesJson: |
            {
              "connectionString": "${exampleAccount.primaryConnectionString}"
            }        
      example1:
        type: azure:datafactory:DatasetJson
        properties:
          name: dataset1
          dataFactoryId: ${exampleFactory.id}
          linkedServiceName: ${exampleLinkedCustomService.name}
          azureBlobStorageLocation:
            container: container
            path: foo/bar/
            filename: foo.txt
          encoding: UTF-8
      example2:
        type: azure:datafactory:DatasetJson
        properties:
          name: dataset2
          dataFactoryId: ${exampleFactory.id}
          linkedServiceName: ${exampleLinkedCustomService.name}
          azureBlobStorageLocation:
            container: container
            path: foo/bar/
            filename: bar.txt
          encoding: UTF-8
      exampleDataFlow:
        type: azure:datafactory:DataFlow
        name: example
        properties:
          name: example
          dataFactoryId: ${exampleFactory.id}
          sources:
            - name: source1
              flowlet:
                name: ${example1FlowletDataFlow.name}
                parameters:
                  Key1: value1
              dataset:
                name: ${example1.name}
          sinks:
            - name: sink1
              flowlet:
                name: ${example2FlowletDataFlow.name}
                parameters:
                  Key1: value1
              dataset:
                name: ${example2.name}
          script: "source(\n  allowSchemaDrift: true, \n  validateSchema: false, \n  limit: 100, \n  ignoreNoFilesFound: false, \n  documentForm: 'documentPerLine') ~> source1 \nsource1 sink(\n  allowSchemaDrift: true, \n  validateSchema: false, \n  skipDuplicateMapInputs: true, \n  skipDuplicateMapOutputs: true) ~> sink1\n"
      example1FlowletDataFlow:
        type: azure:datafactory:FlowletDataFlow
        name: example1
        properties:
          name: example
          dataFactoryId: ${exampleFactory.id}
          sources:
            - name: source1
              linkedService:
                name: ${exampleLinkedCustomService.name}
          sinks:
            - name: sink1
              linkedService:
                name: ${exampleLinkedCustomService.name}
          script: "source(\n  allowSchemaDrift: true, \n  validateSchema: false, \n  limit: 100, \n  ignoreNoFilesFound: false, \n  documentForm: 'documentPerLine') ~> source1 \nsource1 sink(\n  allowSchemaDrift: true, \n  validateSchema: false, \n  skipDuplicateMapInputs: true, \n  skipDuplicateMapOutputs: true) ~> sink1\n"
      example2FlowletDataFlow:
        type: azure:datafactory:FlowletDataFlow
        name: example2
        properties:
          name: example
          dataFactoryId: ${exampleFactory.id}
          sources:
            - name: source1
              linkedService:
                name: ${exampleLinkedCustomService.name}
          sinks:
            - name: sink1
              linkedService:
                name: ${exampleLinkedCustomService.name}
          script: "source(\n  allowSchemaDrift: true, \n  validateSchema: false, \n  limit: 100, \n  ignoreNoFilesFound: false, \n  documentForm: 'documentPerLine') ~> source1 \nsource1 sink(\n  allowSchemaDrift: true, \n  validateSchema: false, \n  skipDuplicateMapInputs: true, \n  skipDuplicateMapOutputs: true) ~> sink1\n"
    

    Create DataFlow Resource

    Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.

    Constructor syntax

    new DataFlow(name: string, args: DataFlowArgs, opts?: CustomResourceOptions);
    @overload
    def DataFlow(resource_name: str,
                 args: DataFlowArgs,
                 opts: Optional[ResourceOptions] = None)
    
    @overload
    def DataFlow(resource_name: str,
                 opts: Optional[ResourceOptions] = None,
                 data_factory_id: Optional[str] = None,
                 sinks: Optional[Sequence[DataFlowSinkArgs]] = None,
                 sources: Optional[Sequence[DataFlowSourceArgs]] = None,
                 annotations: Optional[Sequence[str]] = None,
                 description: Optional[str] = None,
                 folder: Optional[str] = None,
                 name: Optional[str] = None,
                 script: Optional[str] = None,
                 script_lines: Optional[Sequence[str]] = None,
                 transformations: Optional[Sequence[DataFlowTransformationArgs]] = None)
    func NewDataFlow(ctx *Context, name string, args DataFlowArgs, opts ...ResourceOption) (*DataFlow, error)
    public DataFlow(string name, DataFlowArgs args, CustomResourceOptions? opts = null)
    public DataFlow(String name, DataFlowArgs args)
    public DataFlow(String name, DataFlowArgs args, CustomResourceOptions options)
    
    type: azure:datafactory:DataFlow
    properties: # The arguments to resource properties.
    options: # Bag of options to control resource's behavior.
    
    

    Parameters

    name string
    The unique name of the resource.
    args DataFlowArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.
    resource_name str
    The unique name of the resource.
    args DataFlowArgs
    The arguments to resource properties.
    opts ResourceOptions
    Bag of options to control resource's behavior.
    ctx Context
    Context object for the current deployment.
    name string
    The unique name of the resource.
    args DataFlowArgs
    The arguments to resource properties.
    opts ResourceOption
    Bag of options to control resource's behavior.
    name string
    The unique name of the resource.
    args DataFlowArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.
    name String
    The unique name of the resource.
    args DataFlowArgs
    The arguments to resource properties.
    options CustomResourceOptions
    Bag of options to control resource's behavior.

    Constructor example

    The following reference example uses placeholder values for all input properties.

    var dataFlowResource = new Azure.DataFactory.DataFlow("dataFlowResource", new()
    {
        DataFactoryId = "string",
        Sinks = new[]
        {
            new Azure.DataFactory.Inputs.DataFlowSinkArgs
            {
                Name = "string",
                Dataset = new Azure.DataFactory.Inputs.DataFlowSinkDatasetArgs
                {
                    Name = "string",
                    Parameters = 
                    {
                        { "string", "string" },
                    },
                },
                Description = "string",
                Flowlet = new Azure.DataFactory.Inputs.DataFlowSinkFlowletArgs
                {
                    Name = "string",
                    DatasetParameters = "string",
                    Parameters = 
                    {
                        { "string", "string" },
                    },
                },
                LinkedService = new Azure.DataFactory.Inputs.DataFlowSinkLinkedServiceArgs
                {
                    Name = "string",
                    Parameters = 
                    {
                        { "string", "string" },
                    },
                },
                RejectedLinkedService = new Azure.DataFactory.Inputs.DataFlowSinkRejectedLinkedServiceArgs
                {
                    Name = "string",
                    Parameters = 
                    {
                        { "string", "string" },
                    },
                },
                SchemaLinkedService = new Azure.DataFactory.Inputs.DataFlowSinkSchemaLinkedServiceArgs
                {
                    Name = "string",
                    Parameters = 
                    {
                        { "string", "string" },
                    },
                },
            },
        },
        Sources = new[]
        {
            new Azure.DataFactory.Inputs.DataFlowSourceArgs
            {
                Name = "string",
                Dataset = new Azure.DataFactory.Inputs.DataFlowSourceDatasetArgs
                {
                    Name = "string",
                    Parameters = 
                    {
                        { "string", "string" },
                    },
                },
                Description = "string",
                Flowlet = new Azure.DataFactory.Inputs.DataFlowSourceFlowletArgs
                {
                    Name = "string",
                    DatasetParameters = "string",
                    Parameters = 
                    {
                        { "string", "string" },
                    },
                },
                LinkedService = new Azure.DataFactory.Inputs.DataFlowSourceLinkedServiceArgs
                {
                    Name = "string",
                    Parameters = 
                    {
                        { "string", "string" },
                    },
                },
                RejectedLinkedService = new Azure.DataFactory.Inputs.DataFlowSourceRejectedLinkedServiceArgs
                {
                    Name = "string",
                    Parameters = 
                    {
                        { "string", "string" },
                    },
                },
                SchemaLinkedService = new Azure.DataFactory.Inputs.DataFlowSourceSchemaLinkedServiceArgs
                {
                    Name = "string",
                    Parameters = 
                    {
                        { "string", "string" },
                    },
                },
            },
        },
        Annotations = new[]
        {
            "string",
        },
        Description = "string",
        Folder = "string",
        Name = "string",
        Script = "string",
        ScriptLines = new[]
        {
            "string",
        },
        Transformations = new[]
        {
            new Azure.DataFactory.Inputs.DataFlowTransformationArgs
            {
                Name = "string",
                Dataset = new Azure.DataFactory.Inputs.DataFlowTransformationDatasetArgs
                {
                    Name = "string",
                    Parameters = 
                    {
                        { "string", "string" },
                    },
                },
                Description = "string",
                Flowlet = new Azure.DataFactory.Inputs.DataFlowTransformationFlowletArgs
                {
                    Name = "string",
                    DatasetParameters = "string",
                    Parameters = 
                    {
                        { "string", "string" },
                    },
                },
                LinkedService = new Azure.DataFactory.Inputs.DataFlowTransformationLinkedServiceArgs
                {
                    Name = "string",
                    Parameters = 
                    {
                        { "string", "string" },
                    },
                },
            },
        },
    });
    
    example, err := datafactory.NewDataFlow(ctx, "dataFlowResource", &datafactory.DataFlowArgs{
    	DataFactoryId: pulumi.String("string"),
    	Sinks: datafactory.DataFlowSinkArray{
    		&datafactory.DataFlowSinkArgs{
    			Name: pulumi.String("string"),
    			Dataset: &datafactory.DataFlowSinkDatasetArgs{
    				Name: pulumi.String("string"),
    				Parameters: pulumi.StringMap{
    					"string": pulumi.String("string"),
    				},
    			},
    			Description: pulumi.String("string"),
    			Flowlet: &datafactory.DataFlowSinkFlowletArgs{
    				Name:              pulumi.String("string"),
    				DatasetParameters: pulumi.String("string"),
    				Parameters: pulumi.StringMap{
    					"string": pulumi.String("string"),
    				},
    			},
    			LinkedService: &datafactory.DataFlowSinkLinkedServiceArgs{
    				Name: pulumi.String("string"),
    				Parameters: pulumi.StringMap{
    					"string": pulumi.String("string"),
    				},
    			},
    			RejectedLinkedService: &datafactory.DataFlowSinkRejectedLinkedServiceArgs{
    				Name: pulumi.String("string"),
    				Parameters: pulumi.StringMap{
    					"string": pulumi.String("string"),
    				},
    			},
    			SchemaLinkedService: &datafactory.DataFlowSinkSchemaLinkedServiceArgs{
    				Name: pulumi.String("string"),
    				Parameters: pulumi.StringMap{
    					"string": pulumi.String("string"),
    				},
    			},
    		},
    	},
    	Sources: datafactory.DataFlowSourceArray{
    		&datafactory.DataFlowSourceArgs{
    			Name: pulumi.String("string"),
    			Dataset: &datafactory.DataFlowSourceDatasetArgs{
    				Name: pulumi.String("string"),
    				Parameters: pulumi.StringMap{
    					"string": pulumi.String("string"),
    				},
    			},
    			Description: pulumi.String("string"),
    			Flowlet: &datafactory.DataFlowSourceFlowletArgs{
    				Name:              pulumi.String("string"),
    				DatasetParameters: pulumi.String("string"),
    				Parameters: pulumi.StringMap{
    					"string": pulumi.String("string"),
    				},
    			},
    			LinkedService: &datafactory.DataFlowSourceLinkedServiceArgs{
    				Name: pulumi.String("string"),
    				Parameters: pulumi.StringMap{
    					"string": pulumi.String("string"),
    				},
    			},
    			RejectedLinkedService: &datafactory.DataFlowSourceRejectedLinkedServiceArgs{
    				Name: pulumi.String("string"),
    				Parameters: pulumi.StringMap{
    					"string": pulumi.String("string"),
    				},
    			},
    			SchemaLinkedService: &datafactory.DataFlowSourceSchemaLinkedServiceArgs{
    				Name: pulumi.String("string"),
    				Parameters: pulumi.StringMap{
    					"string": pulumi.String("string"),
    				},
    			},
    		},
    	},
    	Annotations: pulumi.StringArray{
    		pulumi.String("string"),
    	},
    	Description: pulumi.String("string"),
    	Folder:      pulumi.String("string"),
    	Name:        pulumi.String("string"),
    	Script:      pulumi.String("string"),
    	ScriptLines: pulumi.StringArray{
    		pulumi.String("string"),
    	},
    	Transformations: datafactory.DataFlowTransformationArray{
    		&datafactory.DataFlowTransformationArgs{
    			Name: pulumi.String("string"),
    			Dataset: &datafactory.DataFlowTransformationDatasetArgs{
    				Name: pulumi.String("string"),
    				Parameters: pulumi.StringMap{
    					"string": pulumi.String("string"),
    				},
    			},
    			Description: pulumi.String("string"),
    			Flowlet: &datafactory.DataFlowTransformationFlowletArgs{
    				Name:              pulumi.String("string"),
    				DatasetParameters: pulumi.String("string"),
    				Parameters: pulumi.StringMap{
    					"string": pulumi.String("string"),
    				},
    			},
    			LinkedService: &datafactory.DataFlowTransformationLinkedServiceArgs{
    				Name: pulumi.String("string"),
    				Parameters: pulumi.StringMap{
    					"string": pulumi.String("string"),
    				},
    			},
    		},
    	},
    })
    
    var dataFlowResource = new DataFlow("dataFlowResource", DataFlowArgs.builder()
        .dataFactoryId("string")
        .sinks(DataFlowSinkArgs.builder()
            .name("string")
            .dataset(DataFlowSinkDatasetArgs.builder()
                .name("string")
                .parameters(Map.of("string", "string"))
                .build())
            .description("string")
            .flowlet(DataFlowSinkFlowletArgs.builder()
                .name("string")
                .datasetParameters("string")
                .parameters(Map.of("string", "string"))
                .build())
            .linkedService(DataFlowSinkLinkedServiceArgs.builder()
                .name("string")
                .parameters(Map.of("string", "string"))
                .build())
            .rejectedLinkedService(DataFlowSinkRejectedLinkedServiceArgs.builder()
                .name("string")
                .parameters(Map.of("string", "string"))
                .build())
            .schemaLinkedService(DataFlowSinkSchemaLinkedServiceArgs.builder()
                .name("string")
                .parameters(Map.of("string", "string"))
                .build())
            .build())
        .sources(DataFlowSourceArgs.builder()
            .name("string")
            .dataset(DataFlowSourceDatasetArgs.builder()
                .name("string")
                .parameters(Map.of("string", "string"))
                .build())
            .description("string")
            .flowlet(DataFlowSourceFlowletArgs.builder()
                .name("string")
                .datasetParameters("string")
                .parameters(Map.of("string", "string"))
                .build())
            .linkedService(DataFlowSourceLinkedServiceArgs.builder()
                .name("string")
                .parameters(Map.of("string", "string"))
                .build())
            .rejectedLinkedService(DataFlowSourceRejectedLinkedServiceArgs.builder()
                .name("string")
                .parameters(Map.of("string", "string"))
                .build())
            .schemaLinkedService(DataFlowSourceSchemaLinkedServiceArgs.builder()
                .name("string")
                .parameters(Map.of("string", "string"))
                .build())
            .build())
        .annotations("string")
        .description("string")
        .folder("string")
        .name("string")
        .script("string")
        .scriptLines("string")
        .transformations(DataFlowTransformationArgs.builder()
            .name("string")
            .dataset(DataFlowTransformationDatasetArgs.builder()
                .name("string")
                .parameters(Map.of("string", "string"))
                .build())
            .description("string")
            .flowlet(DataFlowTransformationFlowletArgs.builder()
                .name("string")
                .datasetParameters("string")
                .parameters(Map.of("string", "string"))
                .build())
            .linkedService(DataFlowTransformationLinkedServiceArgs.builder()
                .name("string")
                .parameters(Map.of("string", "string"))
                .build())
            .build())
        .build());
    
    data_flow_resource = azure.datafactory.DataFlow("dataFlowResource",
        data_factory_id="string",
        sinks=[{
            "name": "string",
            "dataset": {
                "name": "string",
                "parameters": {
                    "string": "string",
                },
            },
            "description": "string",
            "flowlet": {
                "name": "string",
                "dataset_parameters": "string",
                "parameters": {
                    "string": "string",
                },
            },
            "linked_service": {
                "name": "string",
                "parameters": {
                    "string": "string",
                },
            },
            "rejected_linked_service": {
                "name": "string",
                "parameters": {
                    "string": "string",
                },
            },
            "schema_linked_service": {
                "name": "string",
                "parameters": {
                    "string": "string",
                },
            },
        }],
        sources=[{
            "name": "string",
            "dataset": {
                "name": "string",
                "parameters": {
                    "string": "string",
                },
            },
            "description": "string",
            "flowlet": {
                "name": "string",
                "dataset_parameters": "string",
                "parameters": {
                    "string": "string",
                },
            },
            "linked_service": {
                "name": "string",
                "parameters": {
                    "string": "string",
                },
            },
            "rejected_linked_service": {
                "name": "string",
                "parameters": {
                    "string": "string",
                },
            },
            "schema_linked_service": {
                "name": "string",
                "parameters": {
                    "string": "string",
                },
            },
        }],
        annotations=["string"],
        description="string",
        folder="string",
        name="string",
        script="string",
        script_lines=["string"],
        transformations=[{
            "name": "string",
            "dataset": {
                "name": "string",
                "parameters": {
                    "string": "string",
                },
            },
            "description": "string",
            "flowlet": {
                "name": "string",
                "dataset_parameters": "string",
                "parameters": {
                    "string": "string",
                },
            },
            "linked_service": {
                "name": "string",
                "parameters": {
                    "string": "string",
                },
            },
        }])
    
    const dataFlowResource = new azure.datafactory.DataFlow("dataFlowResource", {
        dataFactoryId: "string",
        sinks: [{
            name: "string",
            dataset: {
                name: "string",
                parameters: {
                    string: "string",
                },
            },
            description: "string",
            flowlet: {
                name: "string",
                datasetParameters: "string",
                parameters: {
                    string: "string",
                },
            },
            linkedService: {
                name: "string",
                parameters: {
                    string: "string",
                },
            },
            rejectedLinkedService: {
                name: "string",
                parameters: {
                    string: "string",
                },
            },
            schemaLinkedService: {
                name: "string",
                parameters: {
                    string: "string",
                },
            },
        }],
        sources: [{
            name: "string",
            dataset: {
                name: "string",
                parameters: {
                    string: "string",
                },
            },
            description: "string",
            flowlet: {
                name: "string",
                datasetParameters: "string",
                parameters: {
                    string: "string",
                },
            },
            linkedService: {
                name: "string",
                parameters: {
                    string: "string",
                },
            },
            rejectedLinkedService: {
                name: "string",
                parameters: {
                    string: "string",
                },
            },
            schemaLinkedService: {
                name: "string",
                parameters: {
                    string: "string",
                },
            },
        }],
        annotations: ["string"],
        description: "string",
        folder: "string",
        name: "string",
        script: "string",
        scriptLines: ["string"],
        transformations: [{
            name: "string",
            dataset: {
                name: "string",
                parameters: {
                    string: "string",
                },
            },
            description: "string",
            flowlet: {
                name: "string",
                datasetParameters: "string",
                parameters: {
                    string: "string",
                },
            },
            linkedService: {
                name: "string",
                parameters: {
                    string: "string",
                },
            },
        }],
    });
    
    type: azure:datafactory:DataFlow
    properties:
        annotations:
            - string
        dataFactoryId: string
        description: string
        folder: string
        name: string
        script: string
        scriptLines:
            - string
        sinks:
            - dataset:
                name: string
                parameters:
                    string: string
              description: string
              flowlet:
                datasetParameters: string
                name: string
                parameters:
                    string: string
              linkedService:
                name: string
                parameters:
                    string: string
              name: string
              rejectedLinkedService:
                name: string
                parameters:
                    string: string
              schemaLinkedService:
                name: string
                parameters:
                    string: string
        sources:
            - dataset:
                name: string
                parameters:
                    string: string
              description: string
              flowlet:
                datasetParameters: string
                name: string
                parameters:
                    string: string
              linkedService:
                name: string
                parameters:
                    string: string
              name: string
              rejectedLinkedService:
                name: string
                parameters:
                    string: string
              schemaLinkedService:
                name: string
                parameters:
                    string: string
        transformations:
            - dataset:
                name: string
                parameters:
                    string: string
              description: string
              flowlet:
                datasetParameters: string
                name: string
                parameters:
                    string: string
              linkedService:
                name: string
                parameters:
                    string: string
              name: string
    

    DataFlow Resource Properties

    To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.

    Inputs

    In Python, inputs that are objects can be passed either as argument classes or as dictionary literals.

    The DataFlow resource accepts the following input properties:

    DataFactoryId string
The ID of the Data Factory with which to associate the Data Flow. Changing this forces a new resource to be created.
    Sinks List<DataFlowSink>
    One or more sink blocks as defined below.
    Sources List<DataFlowSource>
    One or more source blocks as defined below.
    Annotations List<string>
    List of tags that can be used for describing the Data Factory Data Flow.
    Description string
    The description for the Data Factory Data Flow.
    Folder string
The folder in which this Data Flow is placed. If not specified, the Data Flow appears at the root level.
    Name string
    Specifies the name of the Data Factory Data Flow. Changing this forces a new resource to be created.
    Script string
    The script for the Data Factory Data Flow.
    ScriptLines List<string>
    The script lines for the Data Factory Data Flow.
    Transformations List<DataFlowTransformation>
    One or more transformation blocks as defined below.
    DataFactoryId string
The ID of the Data Factory with which to associate the Data Flow. Changing this forces a new resource to be created.
    Sinks []DataFlowSinkArgs
    One or more sink blocks as defined below.
    Sources []DataFlowSourceArgs
    One or more source blocks as defined below.
    Annotations []string
    List of tags that can be used for describing the Data Factory Data Flow.
    Description string
    The description for the Data Factory Data Flow.
    Folder string
The folder in which this Data Flow is placed. If not specified, the Data Flow appears at the root level.
    Name string
    Specifies the name of the Data Factory Data Flow. Changing this forces a new resource to be created.
    Script string
    The script for the Data Factory Data Flow.
    ScriptLines []string
    The script lines for the Data Factory Data Flow.
    Transformations []DataFlowTransformationArgs
    One or more transformation blocks as defined below.
    dataFactoryId String
The ID of the Data Factory with which to associate the Data Flow. Changing this forces a new resource to be created.
    sinks List<DataFlowSink>
    One or more sink blocks as defined below.
    sources List<DataFlowSource>
    One or more source blocks as defined below.
    annotations List<String>
    List of tags that can be used for describing the Data Factory Data Flow.
    description String
    The description for the Data Factory Data Flow.
    folder String
The folder in which this Data Flow is placed. If not specified, the Data Flow appears at the root level.
    name String
    Specifies the name of the Data Factory Data Flow. Changing this forces a new resource to be created.
    script String
    The script for the Data Factory Data Flow.
    scriptLines List<String>
    The script lines for the Data Factory Data Flow.
    transformations List<DataFlowTransformation>
    One or more transformation blocks as defined below.
    dataFactoryId string
The ID of the Data Factory with which to associate the Data Flow. Changing this forces a new resource to be created.
    sinks DataFlowSink[]
    One or more sink blocks as defined below.
    sources DataFlowSource[]
    One or more source blocks as defined below.
    annotations string[]
    List of tags that can be used for describing the Data Factory Data Flow.
    description string
    The description for the Data Factory Data Flow.
    folder string
The folder in which this Data Flow is placed. If not specified, the Data Flow appears at the root level.
    name string
    Specifies the name of the Data Factory Data Flow. Changing this forces a new resource to be created.
    script string
    The script for the Data Factory Data Flow.
    scriptLines string[]
    The script lines for the Data Factory Data Flow.
    transformations DataFlowTransformation[]
    One or more transformation blocks as defined below.
    data_factory_id str
The ID of the Data Factory with which to associate the Data Flow. Changing this forces a new resource to be created.
    sinks Sequence[DataFlowSinkArgs]
    One or more sink blocks as defined below.
    sources Sequence[DataFlowSourceArgs]
    One or more source blocks as defined below.
    annotations Sequence[str]
    List of tags that can be used for describing the Data Factory Data Flow.
    description str
    The description for the Data Factory Data Flow.
    folder str
    The folder that this Data Flow is in. If not specified, the Data Flow will appear at the root level.
    name str
    Specifies the name of the Data Factory Data Flow. Changing this forces a new resource to be created.
    script str
    The script for the Data Factory Data Flow.
    script_lines Sequence[str]
    The script lines for the Data Factory Data Flow.
    transformations Sequence[DataFlowTransformationArgs]
    One or more transformation blocks as defined below.
    dataFactoryId String
    The ID of the Data Factory in which to associate the Data Flow. Changing this forces a new resource to be created.
    sinks List<Property Map>
    One or more sink blocks as defined below.
    sources List<Property Map>
    One or more source blocks as defined below.
    annotations List<String>
    List of tags that can be used for describing the Data Factory Data Flow.
    description String
    The description for the Data Factory Data Flow.
    folder String
    The folder that this Data Flow is in. If not specified, the Data Flow will appear at the root level.
    name String
    Specifies the name of the Data Factory Data Flow. Changing this forces a new resource to be created.
    script String
    The script for the Data Factory Data Flow.
    scriptLines List<String>
    The script lines for the Data Factory Data Flow.
    transformations List<Property Map>
    One or more transformation blocks as defined below.

    Outputs

    All input properties are implicitly available as output properties. Additionally, the DataFlow resource produces the following output properties:

    Id string
    The provider-assigned unique ID for this managed resource.
    Id string
    The provider-assigned unique ID for this managed resource.
    id String
    The provider-assigned unique ID for this managed resource.
    id string
    The provider-assigned unique ID for this managed resource.
    id str
    The provider-assigned unique ID for this managed resource.
    id String
    The provider-assigned unique ID for this managed resource.

    Look up Existing DataFlow Resource

    Get an existing DataFlow resource’s state with the given name, ID, and optional extra properties used to qualify the lookup.

    public static get(name: string, id: Input<ID>, state?: DataFlowState, opts?: CustomResourceOptions): DataFlow
    @staticmethod
    def get(resource_name: str,
            id: str,
            opts: Optional[ResourceOptions] = None,
            annotations: Optional[Sequence[str]] = None,
            data_factory_id: Optional[str] = None,
            description: Optional[str] = None,
            folder: Optional[str] = None,
            name: Optional[str] = None,
            script: Optional[str] = None,
            script_lines: Optional[Sequence[str]] = None,
            sinks: Optional[Sequence[DataFlowSinkArgs]] = None,
            sources: Optional[Sequence[DataFlowSourceArgs]] = None,
            transformations: Optional[Sequence[DataFlowTransformationArgs]] = None) -> DataFlow
    func GetDataFlow(ctx *Context, name string, id IDInput, state *DataFlowState, opts ...ResourceOption) (*DataFlow, error)
    public static DataFlow Get(string name, Input<string> id, DataFlowState? state, CustomResourceOptions? opts = null)
    public static DataFlow get(String name, Output<String> id, DataFlowState state, CustomResourceOptions options)
    Resource lookup is not supported in YAML
    name
    The unique name of the resulting resource.
    id
    The unique provider ID of the resource to lookup.
    state
    Any extra arguments used during the lookup.
    opts
    A bag of options that control this resource's behavior.
    resource_name
    The unique name of the resulting resource.
    id
    The unique provider ID of the resource to lookup.
    name
    The unique name of the resulting resource.
    id
    The unique provider ID of the resource to lookup.
    state
    Any extra arguments used during the lookup.
    opts
    A bag of options that control this resource's behavior.
    name
    The unique name of the resulting resource.
    id
    The unique provider ID of the resource to lookup.
    state
    Any extra arguments used during the lookup.
    opts
    A bag of options that control this resource's behavior.
    name
    The unique name of the resulting resource.
    id
    The unique provider ID of the resource to lookup.
    state
    Any extra arguments used during the lookup.
    opts
    A bag of options that control this resource's behavior.
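    The `id` argument to the lookup functions above is the provider-assigned resource ID. As a minimal sketch, such an ID can be assembled from its path segments, which follow the standard ARM resource ID pattern; the helper name and all values below are placeholders, not part of the provider API:

    ```typescript
    // Hypothetical helper: builds the ARM-style resource ID that the
    // lookup functions above take as their `id` argument. All names
    // and the subscription GUID are placeholder values.
    function dataFlowResourceId(
        subscriptionId: string,
        resourceGroup: string,
        factoryName: string,
        dataFlowName: string,
    ): string {
        return (
            `/subscriptions/${subscriptionId}` +
            `/resourceGroups/${resourceGroup}` +
            `/providers/Microsoft.DataFactory/factories/${factoryName}` +
            `/dataflows/${dataFlowName}`
        );
    }

    // Pass the result to DataFlow.get alongside a logical name, e.g.:
    //   const existing = azure.datafactory.DataFlow.get("example", id);
    const id = dataFlowResourceId(
        "00000000-0000-0000-0000-000000000000",
        "example-resources",
        "example-factory",
        "example-dataflow",
    );
    ```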
    The following state arguments are supported:
    Annotations List<string>
    List of tags that can be used for describing the Data Factory Data Flow.
    DataFactoryId string
    The ID of the Data Factory in which to associate the Data Flow. Changing this forces a new resource to be created.
    Description string
    The description for the Data Factory Data Flow.
    Folder string
    The folder that this Data Flow is in. If not specified, the Data Flow will appear at the root level.
    Name string
    Specifies the name of the Data Factory Data Flow. Changing this forces a new resource to be created.
    Script string
    The script for the Data Factory Data Flow.
    ScriptLines List<string>
    The script lines for the Data Factory Data Flow.
    Sinks List<DataFlowSink>
    One or more sink blocks as defined below.
    Sources List<DataFlowSource>
    One or more source blocks as defined below.
    Transformations List<DataFlowTransformation>
    One or more transformation blocks as defined below.
    Annotations []string
    List of tags that can be used for describing the Data Factory Data Flow.
    DataFactoryId string
    The ID of the Data Factory in which to associate the Data Flow. Changing this forces a new resource to be created.
    Description string
    The description for the Data Factory Data Flow.
    Folder string
    The folder that this Data Flow is in. If not specified, the Data Flow will appear at the root level.
    Name string
    Specifies the name of the Data Factory Data Flow. Changing this forces a new resource to be created.
    Script string
    The script for the Data Factory Data Flow.
    ScriptLines []string
    The script lines for the Data Factory Data Flow.
    Sinks []DataFlowSinkArgs
    One or more sink blocks as defined below.
    Sources []DataFlowSourceArgs
    One or more source blocks as defined below.
    Transformations []DataFlowTransformationArgs
    One or more transformation blocks as defined below.
    annotations List<String>
    List of tags that can be used for describing the Data Factory Data Flow.
    dataFactoryId String
    The ID of the Data Factory in which to associate the Data Flow. Changing this forces a new resource to be created.
    description String
    The description for the Data Factory Data Flow.
    folder String
    The folder that this Data Flow is in. If not specified, the Data Flow will appear at the root level.
    name String
    Specifies the name of the Data Factory Data Flow. Changing this forces a new resource to be created.
    script String
    The script for the Data Factory Data Flow.
    scriptLines List<String>
    The script lines for the Data Factory Data Flow.
    sinks List<DataFlowSink>
    One or more sink blocks as defined below.
    sources List<DataFlowSource>
    One or more source blocks as defined below.
    transformations List<DataFlowTransformation>
    One or more transformation blocks as defined below.
    annotations string[]
    List of tags that can be used for describing the Data Factory Data Flow.
    dataFactoryId string
    The ID of the Data Factory in which to associate the Data Flow. Changing this forces a new resource to be created.
    description string
    The description for the Data Factory Data Flow.
    folder string
    The folder that this Data Flow is in. If not specified, the Data Flow will appear at the root level.
    name string
    Specifies the name of the Data Factory Data Flow. Changing this forces a new resource to be created.
    script string
    The script for the Data Factory Data Flow.
    scriptLines string[]
    The script lines for the Data Factory Data Flow.
    sinks DataFlowSink[]
    One or more sink blocks as defined below.
    sources DataFlowSource[]
    One or more source blocks as defined below.
    transformations DataFlowTransformation[]
    One or more transformation blocks as defined below.
    annotations Sequence[str]
    List of tags that can be used for describing the Data Factory Data Flow.
    data_factory_id str
    The ID of the Data Factory in which to associate the Data Flow. Changing this forces a new resource to be created.
    description str
    The description for the Data Factory Data Flow.
    folder str
    The folder that this Data Flow is in. If not specified, the Data Flow will appear at the root level.
    name str
    Specifies the name of the Data Factory Data Flow. Changing this forces a new resource to be created.
    script str
    The script for the Data Factory Data Flow.
    script_lines Sequence[str]
    The script lines for the Data Factory Data Flow.
    sinks Sequence[DataFlowSinkArgs]
    One or more sink blocks as defined below.
    sources Sequence[DataFlowSourceArgs]
    One or more source blocks as defined below.
    transformations Sequence[DataFlowTransformationArgs]
    One or more transformation blocks as defined below.
    annotations List<String>
    List of tags that can be used for describing the Data Factory Data Flow.
    dataFactoryId String
    The ID of the Data Factory in which to associate the Data Flow. Changing this forces a new resource to be created.
    description String
    The description for the Data Factory Data Flow.
    folder String
    The folder that this Data Flow is in. If not specified, the Data Flow will appear at the root level.
    name String
    Specifies the name of the Data Factory Data Flow. Changing this forces a new resource to be created.
    script String
    The script for the Data Factory Data Flow.
    scriptLines List<String>
    The script lines for the Data Factory Data Flow.
    sinks List<Property Map>
    One or more sink blocks as defined below.
    sources List<Property Map>
    One or more source blocks as defined below.
    transformations List<Property Map>
    One or more transformation blocks as defined below.

    Supporting Types

    DataFlowSink, DataFlowSinkArgs

    Name string
    The name for the Data Flow Sink.
    Dataset DataFlowSinkDataset
    A dataset block as defined below.
    Description string
    The description for the Data Flow Sink.
    Flowlet DataFlowSinkFlowlet
    A flowlet block as defined below.
    LinkedService DataFlowSinkLinkedService
    A linked_service block as defined below.
    RejectedLinkedService DataFlowSinkRejectedLinkedService
    A rejected_linked_service block as defined below.
    SchemaLinkedService DataFlowSinkSchemaLinkedService
    A schema_linked_service block as defined below.
    Name string
    The name for the Data Flow Sink.
    Dataset DataFlowSinkDataset
    A dataset block as defined below.
    Description string
    The description for the Data Flow Sink.
    Flowlet DataFlowSinkFlowlet
    A flowlet block as defined below.
    LinkedService DataFlowSinkLinkedService
    A linked_service block as defined below.
    RejectedLinkedService DataFlowSinkRejectedLinkedService
    A rejected_linked_service block as defined below.
    SchemaLinkedService DataFlowSinkSchemaLinkedService
    A schema_linked_service block as defined below.
    name String
    The name for the Data Flow Sink.
    dataset DataFlowSinkDataset
    A dataset block as defined below.
    description String
    The description for the Data Flow Sink.
    flowlet DataFlowSinkFlowlet
    A flowlet block as defined below.
    linkedService DataFlowSinkLinkedService
    A linked_service block as defined below.
    rejectedLinkedService DataFlowSinkRejectedLinkedService
    A rejected_linked_service block as defined below.
    schemaLinkedService DataFlowSinkSchemaLinkedService
    A schema_linked_service block as defined below.
    name string
    The name for the Data Flow Sink.
    dataset DataFlowSinkDataset
    A dataset block as defined below.
    description string
    The description for the Data Flow Sink.
    flowlet DataFlowSinkFlowlet
    A flowlet block as defined below.
    linkedService DataFlowSinkLinkedService
    A linked_service block as defined below.
    rejectedLinkedService DataFlowSinkRejectedLinkedService
    A rejected_linked_service block as defined below.
    schemaLinkedService DataFlowSinkSchemaLinkedService
    A schema_linked_service block as defined below.
    name str
    The name for the Data Flow Sink.
    dataset DataFlowSinkDataset
    A dataset block as defined below.
    description str
    The description for the Data Flow Sink.
    flowlet DataFlowSinkFlowlet
    A flowlet block as defined below.
    linked_service DataFlowSinkLinkedService
    A linked_service block as defined below.
    rejected_linked_service DataFlowSinkRejectedLinkedService
    A rejected_linked_service block as defined below.
    schema_linked_service DataFlowSinkSchemaLinkedService
    A schema_linked_service block as defined below.
    name String
    The name for the Data Flow Sink.
    dataset Property Map
    A dataset block as defined below.
    description String
    The description for the Data Flow Sink.
    flowlet Property Map
    A flowlet block as defined below.
    linkedService Property Map
    A linked_service block as defined below.
    rejectedLinkedService Property Map
    A rejected_linked_service block as defined below.
    schemaLinkedService Property Map
    A schema_linked_service block as defined below.
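    To show how a sink block's fields fit together, here is a minimal sketch of its shape as plain TypeScript types mirroring the properties listed above; the interface names and all values are hypothetical, only `name` is required, and the nested blocks are optional references:

    ```typescript
    // Sketch of the nested reference blocks (dataset, linked_service,
    // etc.): each names a Data Factory object plus optional parameters.
    interface ReferenceBlock {
        name: string;
        parameters?: { [key: string]: string };
    }

    // Sketch of a single sink block, mirroring the property list above.
    interface SinkBlock {
        name: string;
        description?: string;
        dataset?: ReferenceBlock;
        flowlet?: ReferenceBlock & { datasetParameters?: string };
        linkedService?: ReferenceBlock;
        rejectedLinkedService?: ReferenceBlock;
        schemaLinkedService?: ReferenceBlock;
    }

    // Placeholder values for illustration:
    const sink: SinkBlock = {
        name: "sink1",
        dataset: {
            name: "dataset2",
            parameters: { Key1: "value1" },
        },
    };
    ```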

    DataFlowSinkDataset, DataFlowSinkDatasetArgs

    Name string
    The name for the Data Factory Dataset.
    Parameters Dictionary<string, string>
    A map of parameters to associate with the Data Factory dataset.
    Name string
    The name for the Data Factory Dataset.
    Parameters map[string]string
    A map of parameters to associate with the Data Factory dataset.
    name String
    The name for the Data Factory Dataset.
    parameters Map<String,String>
    A map of parameters to associate with the Data Factory dataset.
    name string
    The name for the Data Factory Dataset.
    parameters {[key: string]: string}
    A map of parameters to associate with the Data Factory dataset.
    name str
    The name for the Data Factory Dataset.
    parameters Mapping[str, str]
    A map of parameters to associate with the Data Factory dataset.
    name String
    The name for the Data Factory Dataset.
    parameters Map<String>
    A map of parameters to associate with the Data Factory dataset.

    DataFlowSinkFlowlet, DataFlowSinkFlowletArgs

    Name string
    The name for the Data Factory Flowlet.
    DatasetParameters string
    Specifies the reference data flow parameters from the dataset.
    Parameters Dictionary<string, string>
    A map of parameters to associate with the Data Factory Flowlet.
    Name string
    The name for the Data Factory Flowlet.
    DatasetParameters string
    Specifies the reference data flow parameters from the dataset.
    Parameters map[string]string
    A map of parameters to associate with the Data Factory Flowlet.
    name String
    The name for the Data Factory Flowlet.
    datasetParameters String
    Specifies the reference data flow parameters from the dataset.
    parameters Map<String,String>
    A map of parameters to associate with the Data Factory Flowlet.
    name string
    The name for the Data Factory Flowlet.
    datasetParameters string
    Specifies the reference data flow parameters from the dataset.
    parameters {[key: string]: string}
    A map of parameters to associate with the Data Factory Flowlet.
    name str
    The name for the Data Factory Flowlet.
    dataset_parameters str
    Specifies the reference data flow parameters from the dataset.
    parameters Mapping[str, str]
    A map of parameters to associate with the Data Factory Flowlet.
    name String
    The name for the Data Factory Flowlet.
    datasetParameters String
    Specifies the reference data flow parameters from the dataset.
    parameters Map<String>
    A map of parameters to associate with the Data Factory Flowlet.

    DataFlowSinkLinkedService, DataFlowSinkLinkedServiceArgs

    Name string
    The name for the Data Factory Linked Service.
    Parameters Dictionary<string, string>
    A map of parameters to associate with the Data Factory Linked Service.
    Name string
    The name for the Data Factory Linked Service.
    Parameters map[string]string
    A map of parameters to associate with the Data Factory Linked Service.
    name String
    The name for the Data Factory Linked Service.
    parameters Map<String,String>
    A map of parameters to associate with the Data Factory Linked Service.
    name string
    The name for the Data Factory Linked Service.
    parameters {[key: string]: string}
    A map of parameters to associate with the Data Factory Linked Service.
    name str
    The name for the Data Factory Linked Service.
    parameters Mapping[str, str]
    A map of parameters to associate with the Data Factory Linked Service.
    name String
    The name for the Data Factory Linked Service.
    parameters Map<String>
    A map of parameters to associate with the Data Factory Linked Service.

    DataFlowSinkRejectedLinkedService, DataFlowSinkRejectedLinkedServiceArgs

    Name string
    The name for the Data Factory Linked Service with schema.
    Parameters Dictionary<string, string>
    A map of parameters to associate with the Data Factory Linked Service.
    Name string
    The name for the Data Factory Linked Service with schema.
    Parameters map[string]string
    A map of parameters to associate with the Data Factory Linked Service.
    name String
    The name for the Data Factory Linked Service with schema.
    parameters Map<String,String>
    A map of parameters to associate with the Data Factory Linked Service.
    name string
    The name for the Data Factory Linked Service with schema.
    parameters {[key: string]: string}
    A map of parameters to associate with the Data Factory Linked Service.
    name str
    The name for the Data Factory Linked Service with schema.
    parameters Mapping[str, str]
    A map of parameters to associate with the Data Factory Linked Service.
    name String
    The name for the Data Factory Linked Service with schema.
    parameters Map<String>
    A map of parameters to associate with the Data Factory Linked Service.

    DataFlowSinkSchemaLinkedService, DataFlowSinkSchemaLinkedServiceArgs

    Name string
    The name for the Data Factory Linked Service with schema.
    Parameters Dictionary<string, string>
    A map of parameters to associate with the Data Factory Linked Service.
    Name string
    The name for the Data Factory Linked Service with schema.
    Parameters map[string]string
    A map of parameters to associate with the Data Factory Linked Service.
    name String
    The name for the Data Factory Linked Service with schema.
    parameters Map<String,String>
    A map of parameters to associate with the Data Factory Linked Service.
    name string
    The name for the Data Factory Linked Service with schema.
    parameters {[key: string]: string}
    A map of parameters to associate with the Data Factory Linked Service.
    name str
    The name for the Data Factory Linked Service with schema.
    parameters Mapping[str, str]
    A map of parameters to associate with the Data Factory Linked Service.
    name String
    The name for the Data Factory Linked Service with schema.
    parameters Map<String>
    A map of parameters to associate with the Data Factory Linked Service.

    DataFlowSource, DataFlowSourceArgs

    Name string
    The name for the Data Flow Source.
    Dataset DataFlowSourceDataset
    A dataset block as defined below.
    Description string
    The description for the Data Flow Source.
    Flowlet DataFlowSourceFlowlet
    A flowlet block as defined below.
    LinkedService DataFlowSourceLinkedService
    A linked_service block as defined below.
    RejectedLinkedService DataFlowSourceRejectedLinkedService
    A rejected_linked_service block as defined below.
    SchemaLinkedService DataFlowSourceSchemaLinkedService
    A schema_linked_service block as defined below.
    Name string
    The name for the Data Flow Source.
    Dataset DataFlowSourceDataset
    A dataset block as defined below.
    Description string
    The description for the Data Flow Source.
    Flowlet DataFlowSourceFlowlet
    A flowlet block as defined below.
    LinkedService DataFlowSourceLinkedService
    A linked_service block as defined below.
    RejectedLinkedService DataFlowSourceRejectedLinkedService
    A rejected_linked_service block as defined below.
    SchemaLinkedService DataFlowSourceSchemaLinkedService
    A schema_linked_service block as defined below.
    name String
    The name for the Data Flow Source.
    dataset DataFlowSourceDataset
    A dataset block as defined below.
    description String
    The description for the Data Flow Source.
    flowlet DataFlowSourceFlowlet
    A flowlet block as defined below.
    linkedService DataFlowSourceLinkedService
    A linked_service block as defined below.
    rejectedLinkedService DataFlowSourceRejectedLinkedService
    A rejected_linked_service block as defined below.
    schemaLinkedService DataFlowSourceSchemaLinkedService
    A schema_linked_service block as defined below.
    name string
    The name for the Data Flow Source.
    dataset DataFlowSourceDataset
    A dataset block as defined below.
    description string
    The description for the Data Flow Source.
    flowlet DataFlowSourceFlowlet
    A flowlet block as defined below.
    linkedService DataFlowSourceLinkedService
    A linked_service block as defined below.
    rejectedLinkedService DataFlowSourceRejectedLinkedService
    A rejected_linked_service block as defined below.
    schemaLinkedService DataFlowSourceSchemaLinkedService
    A schema_linked_service block as defined below.
    name str
    The name for the Data Flow Source.
    dataset DataFlowSourceDataset
    A dataset block as defined below.
    description str
    The description for the Data Flow Source.
    flowlet DataFlowSourceFlowlet
    A flowlet block as defined below.
    linked_service DataFlowSourceLinkedService
    A linked_service block as defined below.
    rejected_linked_service DataFlowSourceRejectedLinkedService
    A rejected_linked_service block as defined below.
    schema_linked_service DataFlowSourceSchemaLinkedService
    A schema_linked_service block as defined below.
    name String
    The name for the Data Flow Source.
    dataset Property Map
    A dataset block as defined below.
    description String
    The description for the Data Flow Source.
    flowlet Property Map
    A flowlet block as defined below.
    linkedService Property Map
    A linked_service block as defined below.
    rejectedLinkedService Property Map
    A rejected_linked_service block as defined below.
    schemaLinkedService Property Map
    A schema_linked_service block as defined below.

    DataFlowSourceDataset, DataFlowSourceDatasetArgs

    Name string
    The name for the Data Factory Dataset.
    Parameters Dictionary<string, string>
    A map of parameters to associate with the Data Factory dataset.
    Name string
    The name for the Data Factory Dataset.
    Parameters map[string]string
    A map of parameters to associate with the Data Factory dataset.
    name String
    The name for the Data Factory Dataset.
    parameters Map<String,String>
    A map of parameters to associate with the Data Factory dataset.
    name string
    The name for the Data Factory Dataset.
    parameters {[key: string]: string}
    A map of parameters to associate with the Data Factory dataset.
    name str
    The name for the Data Factory Dataset.
    parameters Mapping[str, str]
    A map of parameters to associate with the Data Factory dataset.
    name String
    The name for the Data Factory Dataset.
    parameters Map<String>
    A map of parameters to associate with the Data Factory dataset.

    DataFlowSourceFlowlet, DataFlowSourceFlowletArgs

    Name string
    The name for the Data Factory Flowlet.
    DatasetParameters string
    Specifies the reference data flow parameters from the dataset.
    Parameters Dictionary<string, string>
    A map of parameters to associate with the Data Factory Flowlet.
    Name string
    The name for the Data Factory Flowlet.
    DatasetParameters string
    Specifies the reference data flow parameters from the dataset.
    Parameters map[string]string
    A map of parameters to associate with the Data Factory Flowlet.
    name String
    The name for the Data Factory Flowlet.
    datasetParameters String
    Specifies the reference data flow parameters from the dataset.
    parameters Map<String,String>
    A map of parameters to associate with the Data Factory Flowlet.
    name string
    The name for the Data Factory Flowlet.
    datasetParameters string
    Specifies the reference data flow parameters from the dataset.
    parameters {[key: string]: string}
    A map of parameters to associate with the Data Factory Flowlet.
    name str
    The name for the Data Factory Flowlet.
    dataset_parameters str
    Specifies the reference data flow parameters from the dataset.
    parameters Mapping[str, str]
    A map of parameters to associate with the Data Factory Flowlet.
    name String
    The name for the Data Factory Flowlet.
    datasetParameters String
    Specifies the reference data flow parameters from the dataset.
    parameters Map<String>
    A map of parameters to associate with the Data Factory Flowlet.

    DataFlowSourceLinkedService, DataFlowSourceLinkedServiceArgs

    Name string
    The name for the Data Factory Linked Service.
    Parameters Dictionary<string, string>
    A map of parameters to associate with the Data Factory Linked Service.
    Name string
    The name for the Data Factory Linked Service.
    Parameters map[string]string
    A map of parameters to associate with the Data Factory Linked Service.
    name String
    The name for the Data Factory Linked Service.
    parameters Map<String,String>
    A map of parameters to associate with the Data Factory Linked Service.
    name string
    The name for the Data Factory Linked Service.
    parameters {[key: string]: string}
    A map of parameters to associate with the Data Factory Linked Service.
    name str
    The name for the Data Factory Linked Service.
    parameters Mapping[str, str]
    A map of parameters to associate with the Data Factory Linked Service.
    name String
    The name for the Data Factory Linked Service.
    parameters Map<String>
    A map of parameters to associate with the Data Factory Linked Service.

    DataFlowSourceRejectedLinkedService, DataFlowSourceRejectedLinkedServiceArgs

    Name string
    The name for the Data Factory Linked Service with schema.
    Parameters Dictionary<string, string>
    A map of parameters to associate with the Data Factory Linked Service.
    Name string
    The name for the Data Factory Linked Service with schema.
    Parameters map[string]string
    A map of parameters to associate with the Data Factory Linked Service.
    name String
    The name for the Data Factory Linked Service with schema.
    parameters Map<String,String>
    A map of parameters to associate with the Data Factory Linked Service.
    name string
    The name for the Data Factory Linked Service with schema.
    parameters {[key: string]: string}
    A map of parameters to associate with the Data Factory Linked Service.
    name str
    The name for the Data Factory Linked Service with schema.
    parameters Mapping[str, str]
    A map of parameters to associate with the Data Factory Linked Service.
    name String
    The name for the Data Factory Linked Service with schema.
    parameters Map<String>
    A map of parameters to associate with the Data Factory Linked Service.

    DataFlowSourceSchemaLinkedService, DataFlowSourceSchemaLinkedServiceArgs

    Name string
    The name for the Data Factory Linked Service with schema.
    Parameters Dictionary<string, string>
    A map of parameters to associate with the Data Factory Linked Service.
    Name string
    The name for the Data Factory Linked Service with schema.
    Parameters map[string]string
    A map of parameters to associate with the Data Factory Linked Service.
    name String
    The name for the Data Factory Linked Service with schema.
    parameters Map<String,String>
    A map of parameters to associate with the Data Factory Linked Service.
    name string
    The name for the Data Factory Linked Service with schema.
    parameters {[key: string]: string}
    A map of parameters to associate with the Data Factory Linked Service.
    name str
    The name for the Data Factory Linked Service with schema.
    parameters Mapping[str, str]
    A map of parameters to associate with the Data Factory Linked Service.
    name String
    The name for the Data Factory Linked Service with schema.
    parameters Map<String>
    A map of parameters to associate with the Data Factory Linked Service.

    DataFlowTransformation, DataFlowTransformationArgs

    Name string
    The name for the Data Flow transformation.
    Dataset DataFlowTransformationDataset
    A dataset block as defined below.
    Description string
    The description for the Data Flow transformation.
    Flowlet DataFlowTransformationFlowlet
    A flowlet block as defined below.
    LinkedService DataFlowTransformationLinkedService
    A linked_service block as defined below.
    Name string
    The name for the Data Flow transformation.
    Dataset DataFlowTransformationDataset
    A dataset block as defined below.
    Description string
    The description for the Data Flow transformation.
    Flowlet DataFlowTransformationFlowlet
    A flowlet block as defined below.
    LinkedService DataFlowTransformationLinkedService
    A linked_service block as defined below.
    name String
    The name for the Data Flow transformation.
    dataset DataFlowTransformationDataset
    A dataset block as defined below.
    description String
    The description for the Data Flow transformation.
    flowlet DataFlowTransformationFlowlet
    A flowlet block as defined below.
    linkedService DataFlowTransformationLinkedService
    A linked_service block as defined below.
    name string
    The name for the Data Flow transformation.
    dataset DataFlowTransformationDataset
    A dataset block as defined below.
    description string
    The description for the Data Flow transformation.
    flowlet DataFlowTransformationFlowlet
    A flowlet block as defined below.
    linkedService DataFlowTransformationLinkedService
    A linked_service block as defined below.
    name str
    The name for the Data Flow transformation.
    dataset DataFlowTransformationDataset
    A dataset block as defined below.
    description str
    The description for the Data Flow transformation.
    flowlet DataFlowTransformationFlowlet
    A flowlet block as defined below.
    linked_service DataFlowTransformationLinkedService
    A linked_service block as defined below.
    name String
    The name for the Data Flow transformation.
    dataset Property Map
    A dataset block as defined below.
    description String
    The description for the Data Flow transformation.
    flowlet Property Map
    A flowlet block as defined below.
    linkedService Property Map
    A linked_service block as defined below.
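    A transformation block carries the same kind of nested references as sinks and sources, minus the rejected and schema linked services. As a minimal sketch (interface names and values hypothetical), a list of transformation blocks might be shaped like this:

    ```typescript
    // Sketch of a transformation block mirroring the property list
    // above; only `name` is required, the nested blocks are optional.
    interface TransformationBlock {
        name: string;
        description?: string;
        dataset?: { name: string; parameters?: Record<string, string> };
        flowlet?: { name: string; datasetParameters?: string };
        linkedService?: { name: string; parameters?: Record<string, string> };
    }

    // Placeholder values for illustration:
    const transformations: TransformationBlock[] = [
        { name: "filterRows", description: "An example transformation" },
    ];
    ```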

    DataFlowTransformationDataset, DataFlowTransformationDatasetArgs

    C#
    Name string
    The name for the Data Factory Dataset.
    Parameters Dictionary<string, string>
    A map of parameters to associate with the Data Factory dataset.
    Go
    Name string
    The name for the Data Factory Dataset.
    Parameters map[string]string
    A map of parameters to associate with the Data Factory dataset.
    Java
    name String
    The name for the Data Factory Dataset.
    parameters Map<String,String>
    A map of parameters to associate with the Data Factory dataset.
    TypeScript
    name string
    The name for the Data Factory Dataset.
    parameters {[key: string]: string}
    A map of parameters to associate with the Data Factory dataset.
    Python
    name str
    The name for the Data Factory Dataset.
    parameters Mapping[str, str]
    A map of parameters to associate with the Data Factory dataset.
    YAML
    name String
    The name for the Data Factory Dataset.
    parameters Map<String>
    A map of parameters to associate with the Data Factory dataset.
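
    As a concrete illustration, a dataset block carrying a parameters map could look like the TypeScript value below. This is a sketch with a locally declared shape so it stands alone; the parameter names `path` and `filename` are hypothetical, echoing the Example Usage dataset:

    ```typescript
    // Shape of a dataset block as documented above (local declaration,
    // so the snippet runs without the Pulumi SDK; values are hypothetical).
    const dataset: { name: string; parameters?: { [key: string]: string } } = {
        name: "dataset1",
        parameters: {
            path: "foo/bar/",    // hypothetical dataset parameter
            filename: "foo.txt", // hypothetical dataset parameter
        },
    };

    console.log(Object.keys(dataset.parameters ?? {}).length);
    ```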

    DataFlowTransformationFlowlet, DataFlowTransformationFlowletArgs

    C#
    Name string
    The name for the Data Factory Flowlet.
    DatasetParameters string
    Specifies the reference data flow parameters from the dataset.
    Parameters Dictionary<string, string>
    A map of parameters to associate with the Data Factory Flowlet.
    Go
    Name string
    The name for the Data Factory Flowlet.
    DatasetParameters string
    Specifies the reference data flow parameters from the dataset.
    Parameters map[string]string
    A map of parameters to associate with the Data Factory Flowlet.
    Java
    name String
    The name for the Data Factory Flowlet.
    datasetParameters String
    Specifies the reference data flow parameters from the dataset.
    parameters Map<String,String>
    A map of parameters to associate with the Data Factory Flowlet.
    TypeScript
    name string
    The name for the Data Factory Flowlet.
    datasetParameters string
    Specifies the reference data flow parameters from the dataset.
    parameters {[key: string]: string}
    A map of parameters to associate with the Data Factory Flowlet.
    Python
    name str
    The name for the Data Factory Flowlet.
    dataset_parameters str
    Specifies the reference data flow parameters from the dataset.
    parameters Mapping[str, str]
    A map of parameters to associate with the Data Factory Flowlet.
    YAML
    name String
    The name for the Data Factory Flowlet.
    datasetParameters String
    Specifies the reference data flow parameters from the dataset.
    parameters Map<String>
    A map of parameters to associate with the Data Factory Flowlet.
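
    A flowlet block can be sketched the same way. The shape below is declared locally so the snippet is self-contained; the `datasetParameters` string and the `env` parameter are hypothetical placeholders, not values prescribed by the API:

    ```typescript
    // A flowlet block as documented above: datasetParameters is a single
    // string referencing dataset parameters, while parameters is a map.
    const flowlet: {
        name: string;
        datasetParameters?: string;
        parameters?: { [key: string]: string };
    } = {
        name: "flowlet1",
        datasetParameters: "ParameterA", // hypothetical reference string
        parameters: { env: "dev" },      // hypothetical flowlet parameter
    };

    console.log(flowlet.name);
    ```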

    DataFlowTransformationLinkedService, DataFlowTransformationLinkedServiceArgs

    C#
    Name string
    The name for the Data Factory Linked Service.
    Parameters Dictionary<string, string>
    A map of parameters to associate with the Data Factory Linked Service.
    Go
    Name string
    The name for the Data Factory Linked Service.
    Parameters map[string]string
    A map of parameters to associate with the Data Factory Linked Service.
    Java
    name String
    The name for the Data Factory Linked Service.
    parameters Map<String,String>
    A map of parameters to associate with the Data Factory Linked Service.
    TypeScript
    name string
    The name for the Data Factory Linked Service.
    parameters {[key: string]: string}
    A map of parameters to associate with the Data Factory Linked Service.
    Python
    name str
    The name for the Data Factory Linked Service.
    parameters Mapping[str, str]
    A map of parameters to associate with the Data Factory Linked Service.
    YAML
    name String
    The name for the Data Factory Linked Service.
    parameters Map<String>
    A map of parameters to associate with the Data Factory Linked Service.
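
    For completeness, a linked_service block follows the same name-plus-parameters shape. This sketch reuses the `linked_service` name from the Example Usage section; the `container` parameter is hypothetical:

    ```typescript
    // A linked_service block as documented above (local shape so the
    // snippet stands alone; the parameter key/value are hypothetical).
    const linkedService: { name: string; parameters?: { [key: string]: string } } = {
        name: "linked_service",
        parameters: { container: "container" }, // hypothetical parameter
    };

    console.log(linkedService.name);
    ```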

    Import

    Data Factory Data Flow can be imported using the resource id, e.g.

    $ pulumi import azure:datafactory/dataFlow:DataFlow example /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/example/providers/Microsoft.DataFactory/factories/example/dataflows/example
    

    To learn more about importing existing cloud resources, see Importing resources.

    Package Details

    Repository
    Azure Classic pulumi/pulumi-azure
    License
    Apache-2.0
    Notes
    This Pulumi package is based on the azurerm Terraform Provider.