Google Cloud Native is in preview. Google Cloud Classic is fully supported.
google-native.bigquerydatatransfer/v1.TransferConfig
Creates a new data transfer configuration. Auto-naming is currently not supported for this resource.
Create TransferConfig Resource
Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.
Constructor syntax
new TransferConfig(name: string, args?: TransferConfigArgs, opts?: CustomResourceOptions);
@overload
def TransferConfig(resource_name: str,
args: Optional[TransferConfigArgs] = None,
opts: Optional[ResourceOptions] = None)
@overload
def TransferConfig(resource_name: str,
opts: Optional[ResourceOptions] = None,
authorization_code: Optional[str] = None,
data_refresh_window_days: Optional[int] = None,
data_source_id: Optional[str] = None,
destination_dataset_id: Optional[str] = None,
disabled: Optional[bool] = None,
display_name: Optional[str] = None,
email_preferences: Optional[EmailPreferencesArgs] = None,
encryption_configuration: Optional[EncryptionConfigurationArgs] = None,
location: Optional[str] = None,
name: Optional[str] = None,
notification_pubsub_topic: Optional[str] = None,
params: Optional[Mapping[str, str]] = None,
project: Optional[str] = None,
schedule: Optional[str] = None,
schedule_options: Optional[ScheduleOptionsArgs] = None,
service_account_name: Optional[str] = None,
user_id: Optional[str] = None,
version_info: Optional[str] = None)
func NewTransferConfig(ctx *Context, name string, args *TransferConfigArgs, opts ...ResourceOption) (*TransferConfig, error)
public TransferConfig(string name, TransferConfigArgs? args = null, CustomResourceOptions? opts = null)
public TransferConfig(String name, TransferConfigArgs args)
public TransferConfig(String name, TransferConfigArgs args, CustomResourceOptions options)
type: google-native:bigquerydatatransfer/v1:TransferConfig
properties: # The arguments to resource properties.
options: # Bag of options to control resource's behavior.
Parameters
- name string
- The unique name of the resource.
- args TransferConfigArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- resource_name str
- The unique name of the resource.
- args TransferConfigArgs
- The arguments to resource properties.
- opts ResourceOptions
- Bag of options to control resource's behavior.
- ctx Context
- Context object for the current deployment.
- name string
- The unique name of the resource.
- args TransferConfigArgs
- The arguments to resource properties.
- opts ResourceOption
- Bag of options to control resource's behavior.
- name string
- The unique name of the resource.
- args TransferConfigArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- name String
- The unique name of the resource.
- args TransferConfigArgs
- The arguments to resource properties.
- options CustomResourceOptions
- Bag of options to control resource's behavior.
Constructor example
The following reference example uses placeholder values for all input properties.
var transferConfigResource = new GoogleNative.BigQueryDataTransfer.V1.TransferConfig("transferConfigResource", new()
{
AuthorizationCode = "string",
DataRefreshWindowDays = 0,
DataSourceId = "string",
DestinationDatasetId = "string",
Disabled = false,
DisplayName = "string",
EmailPreferences = new GoogleNative.BigQueryDataTransfer.V1.Inputs.EmailPreferencesArgs
{
EnableFailureEmail = false,
},
EncryptionConfiguration = new GoogleNative.BigQueryDataTransfer.V1.Inputs.EncryptionConfigurationArgs
{
KmsKeyName = "string",
},
Location = "string",
Name = "string",
NotificationPubsubTopic = "string",
Params =
{
{ "string", "string" },
},
Project = "string",
Schedule = "string",
ScheduleOptions = new GoogleNative.BigQueryDataTransfer.V1.Inputs.ScheduleOptionsArgs
{
DisableAutoScheduling = false,
EndTime = "string",
StartTime = "string",
},
ServiceAccountName = "string",
VersionInfo = "string",
});
example, err := bigquerydatatransfer.NewTransferConfig(ctx, "transferConfigResource", &bigquerydatatransfer.TransferConfigArgs{
AuthorizationCode: pulumi.String("string"),
DataRefreshWindowDays: pulumi.Int(0),
DataSourceId: pulumi.String("string"),
DestinationDatasetId: pulumi.String("string"),
Disabled: pulumi.Bool(false),
DisplayName: pulumi.String("string"),
EmailPreferences: &bigquerydatatransfer.EmailPreferencesArgs{
EnableFailureEmail: pulumi.Bool(false),
},
EncryptionConfiguration: &bigquerydatatransfer.EncryptionConfigurationArgs{
KmsKeyName: pulumi.String("string"),
},
Location: pulumi.String("string"),
Name: pulumi.String("string"),
NotificationPubsubTopic: pulumi.String("string"),
Params: pulumi.StringMap{
"string": pulumi.String("string"),
},
Project: pulumi.String("string"),
Schedule: pulumi.String("string"),
ScheduleOptions: &bigquerydatatransfer.ScheduleOptionsArgs{
DisableAutoScheduling: pulumi.Bool(false),
EndTime: pulumi.String("string"),
StartTime: pulumi.String("string"),
},
ServiceAccountName: pulumi.String("string"),
VersionInfo: pulumi.String("string"),
})
var transferConfigResource = new TransferConfig("transferConfigResource", TransferConfigArgs.builder()
.authorizationCode("string")
.dataRefreshWindowDays(0)
.dataSourceId("string")
.destinationDatasetId("string")
.disabled(false)
.displayName("string")
.emailPreferences(EmailPreferencesArgs.builder()
.enableFailureEmail(false)
.build())
.encryptionConfiguration(EncryptionConfigurationArgs.builder()
.kmsKeyName("string")
.build())
.location("string")
.name("string")
.notificationPubsubTopic("string")
.params(Map.of("string", "string"))
.project("string")
.schedule("string")
.scheduleOptions(ScheduleOptionsArgs.builder()
.disableAutoScheduling(false)
.endTime("string")
.startTime("string")
.build())
.serviceAccountName("string")
.versionInfo("string")
.build());
transfer_config_resource = google_native.bigquerydatatransfer.v1.TransferConfig("transferConfigResource",
authorization_code="string",
data_refresh_window_days=0,
data_source_id="string",
destination_dataset_id="string",
disabled=False,
display_name="string",
email_preferences={
"enable_failure_email": False,
},
encryption_configuration={
"kms_key_name": "string",
},
location="string",
name="string",
notification_pubsub_topic="string",
params={
"string": "string",
},
project="string",
schedule="string",
schedule_options={
"disable_auto_scheduling": False,
"end_time": "string",
"start_time": "string",
},
service_account_name="string",
version_info="string")
const transferConfigResource = new google_native.bigquerydatatransfer.v1.TransferConfig("transferConfigResource", {
authorizationCode: "string",
dataRefreshWindowDays: 0,
dataSourceId: "string",
destinationDatasetId: "string",
disabled: false,
displayName: "string",
emailPreferences: {
enableFailureEmail: false,
},
encryptionConfiguration: {
kmsKeyName: "string",
},
location: "string",
name: "string",
notificationPubsubTopic: "string",
params: {
string: "string",
},
project: "string",
schedule: "string",
scheduleOptions: {
disableAutoScheduling: false,
endTime: "string",
startTime: "string",
},
serviceAccountName: "string",
versionInfo: "string",
});
type: google-native:bigquerydatatransfer/v1:TransferConfig
properties:
authorizationCode: string
dataRefreshWindowDays: 0
dataSourceId: string
destinationDatasetId: string
disabled: false
displayName: string
emailPreferences:
enableFailureEmail: false
encryptionConfiguration:
kmsKeyName: string
location: string
name: string
notificationPubsubTopic: string
params:
string: string
project: string
schedule: string
scheduleOptions:
disableAutoScheduling: false
endTime: string
startTime: string
serviceAccountName: string
versionInfo: string
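The reference example above uses placeholder values only. As a rough sketch of what a real configuration might look like, the following Python program configures a Cloud Storage transfer; the data source ID, params keys, and schedule string are illustrative assumptions and must be taken from the data source's own documentation (see ListDataSources and the per-source "Setting up a data transfer" pages).

import pulumi
import pulumi_google_native as google_native

# Illustrative sketch only: the data_source_id, params keys, and schedule string
# depend on the data source being configured; consult ListDataSources and the
# per-source documentation for the values that apply in your case.
gcs_transfer = google_native.bigquerydatatransfer.v1.TransferConfig(
    "gcsTransfer",
    display_name="Nightly GCS load",
    data_source_id="google_cloud_storage",        # assumed data source ID
    destination_dataset_id="reporting",           # target BigQuery dataset
    location="us",
    schedule="every 24 hours",                    # illustrative; see the schedule format notes below
    params={
        # Keys are specific to the Cloud Storage data source (assumed here).
        "data_path_template": "gs://my-bucket/exports/*.csv",
        "destination_table_name_template": "events",
        "file_format": "CSV",
    },
    email_preferences=google_native.bigquerydatatransfer.v1.EmailPreferencesArgs(
        enable_failure_email=True,
    ),
)

pulumi.export("transfer_config_name", gcs_transfer.name)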
TransferConfig Resource Properties
To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.
Inputs
In Python, inputs that are objects can be passed either as argument classes or as dictionary literals.
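For example, the email_preferences input can be given either way (a minimal sketch; all other arguments are omitted for brevity):

import pulumi_google_native as google_native

# Typed argument class:
cfg_a = google_native.bigquerydatatransfer.v1.TransferConfig(
    "cfg-a",
    email_preferences=google_native.bigquerydatatransfer.v1.EmailPreferencesArgs(
        enable_failure_email=True,
    ),
)

# Equivalent dictionary literal with snake_case keys:
cfg_b = google_native.bigquerydatatransfer.v1.TransferConfig(
    "cfg-b",
    email_preferences={"enable_failure_email": True},
)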
The TransferConfig resource accepts the following input properties:
- AuthorizationCode string
- Optional OAuth2 authorization code to use with this transfer configuration. This is required only if transferConfig.dataSourceId is 'youtube_channel' and new credentials are needed, as indicated by CheckValidCreds. In order to obtain authorization_code, make a request to the following URL: https://www.gstatic.com/bigquerydatatransfer/oauthz/auth?redirect_uri=urn:ietf:wg:oauth:2.0:oob&response_type=authorization_code&client_id=client_id&scope=data_source_scopes * The client_id is the OAuth client_id of the data source as returned by the ListDataSources method. * data_source_scopes are the scopes returned by the ListDataSources method. Note that this should not be set when service_account_name is used to create the transfer config.
- DataRefreshWindowDays int
- The number of days to look back to automatically refresh the data. For example, if data_refresh_window_days = 10, then every day BigQuery reingests data for [today-10, today-1], rather than ingesting data for just [today-1]. Only valid if the data source supports the feature. Set the value to 0 to use the default value.
- DataSourceId string
- Data source ID. This cannot be changed once the data transfer is created. The full list of available data source IDs can be returned through an API call: https://cloud.google.com/bigquery-transfer/docs/reference/datatransfer/rest/v1/projects.locations.dataSources/list
- DestinationDatasetId string
- The BigQuery target dataset id.
- Disabled bool
- Is this config disabled. When set to true, no runs are scheduled for a given transfer.
- DisplayName string
- User specified display name for the data transfer.
- EmailPreferences Pulumi.GoogleNative.BigQueryDataTransfer.V1.Inputs.EmailPreferences
- Email notifications will be sent according to these preferences to the email address of the user who owns this transfer config.
- EncryptionConfiguration Pulumi.GoogleNative.BigQueryDataTransfer.V1.Inputs.EncryptionConfiguration
- The encryption configuration part. Currently, it is only used for the optional KMS key name. The BigQuery service account of your project must be granted permissions to use the key. Read methods will return the key name applied in effect. Write methods will apply the key if it is present, or otherwise try to apply project default keys if it is absent.
- Location string
- Name string
- The resource name of the transfer config. Transfer config names have the form either projects/{project_id}/locations/{region}/transferConfigs/{config_id} or projects/{project_id}/transferConfigs/{config_id}, where config_id is usually a UUID, even though it is not guaranteed or required. The name is ignored when creating a transfer config.
- NotificationPubsubTopic string
- Pub/Sub topic where notifications will be sent after transfer runs associated with this transfer config finish. The format for specifying a pubsub topic is: projects/{project}/topics/{topic}
- Params Dictionary<string, string>
- Parameters specific to each data source. For more information see the bq tab in the 'Setting up a data transfer' section for each data source. For example the parameters for Cloud Storage transfers are listed here: https://cloud.google.com/bigquery-transfer/docs/cloud-storage-transfer#bq
- Project string
- Schedule string
- Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid format: 1st,3rd monday of month 15:30, every wed,fri of jan,jun 13:15, and first sunday of quarter 00:00. See more explanation about the format here: https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format NOTE: The minimum interval time between recurring transfers depends on the data source; refer to the documentation for your data source.
- ScheduleOptions Pulumi.GoogleNative.BigQueryDataTransfer.V1.Inputs.ScheduleOptions
- Options customizing the data transfer schedule.
- ServiceAccountName string
- Optional service account email. If this field is set, the transfer config will be created with this service account's credentials. It requires that the requesting user calling this API has permissions to act as this service account. Note that not all data sources support service account credentials when creating a transfer config. For the latest list of data sources, read about using service accounts.
- UserId string
- Deprecated. Unique ID of the user on whose behalf transfer is done.
- VersionInfo string
- Optional version info. This is required only if transferConfig.dataSourceId is not 'youtube_channel' and new credentials are needed, as indicated by CheckValidCreds. In order to obtain version info, make a request to the following URL: https://www.gstatic.com/bigquerydatatransfer/oauthz/auth?redirect_uri=urn:ietf:wg:oauth:2.0:oob&response_type=version_info&client_id=client_id&scope=data_source_scopes * The client_id is the OAuth client_id of the data source as returned by the ListDataSources method. * data_source_scopes are the scopes returned by the ListDataSources method. Note that this should not be set when service_account_name is used to create the transfer config.
- AuthorizationCode string
- Optional OAuth2 authorization code to use with this transfer configuration. This is required only if transferConfig.dataSourceId is 'youtube_channel' and new credentials are needed, as indicated by CheckValidCreds. In order to obtain authorization_code, make a request to the following URL: https://www.gstatic.com/bigquerydatatransfer/oauthz/auth?redirect_uri=urn:ietf:wg:oauth:2.0:oob&response_type=authorization_code&client_id=client_id&scope=data_source_scopes * The client_id is the OAuth client_id of the data source as returned by the ListDataSources method. * data_source_scopes are the scopes returned by the ListDataSources method. Note that this should not be set when service_account_name is used to create the transfer config.
- DataRefreshWindowDays int
- The number of days to look back to automatically refresh the data. For example, if data_refresh_window_days = 10, then every day BigQuery reingests data for [today-10, today-1], rather than ingesting data for just [today-1]. Only valid if the data source supports the feature. Set the value to 0 to use the default value.
- DataSourceId string
- Data source ID. This cannot be changed once the data transfer is created. The full list of available data source IDs can be returned through an API call: https://cloud.google.com/bigquery-transfer/docs/reference/datatransfer/rest/v1/projects.locations.dataSources/list
- DestinationDatasetId string
- The BigQuery target dataset id.
- Disabled bool
- Is this config disabled. When set to true, no runs are scheduled for a given transfer.
- DisplayName string
- User specified display name for the data transfer.
- EmailPreferences EmailPreferencesArgs
- Email notifications will be sent according to these preferences to the email address of the user who owns this transfer config.
- EncryptionConfiguration EncryptionConfigurationArgs
- The encryption configuration part. Currently, it is only used for the optional KMS key name. The BigQuery service account of your project must be granted permissions to use the key. Read methods will return the key name applied in effect. Write methods will apply the key if it is present, or otherwise try to apply project default keys if it is absent.
- Location string
- Name string
- The resource name of the transfer config. Transfer config names have the form either projects/{project_id}/locations/{region}/transferConfigs/{config_id} or projects/{project_id}/transferConfigs/{config_id}, where config_id is usually a UUID, even though it is not guaranteed or required. The name is ignored when creating a transfer config.
- NotificationPubsubTopic string
- Pub/Sub topic where notifications will be sent after transfer runs associated with this transfer config finish. The format for specifying a pubsub topic is: projects/{project}/topics/{topic}
- Params map[string]string
- Parameters specific to each data source. For more information see the bq tab in the 'Setting up a data transfer' section for each data source. For example the parameters for Cloud Storage transfers are listed here: https://cloud.google.com/bigquery-transfer/docs/cloud-storage-transfer#bq
- Project string
- Schedule string
- Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid format: 1st,3rd monday of month 15:30, every wed,fri of jan,jun 13:15, and first sunday of quarter 00:00. See more explanation about the format here: https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format NOTE: The minimum interval time between recurring transfers depends on the data source; refer to the documentation for your data source.
- ScheduleOptions ScheduleOptionsArgs
- Options customizing the data transfer schedule.
- ServiceAccountName string
- Optional service account email. If this field is set, the transfer config will be created with this service account's credentials. It requires that the requesting user calling this API has permissions to act as this service account. Note that not all data sources support service account credentials when creating a transfer config. For the latest list of data sources, read about using service accounts.
- UserId string
- Deprecated. Unique ID of the user on whose behalf transfer is done.
- VersionInfo string
- Optional version info. This is required only if transferConfig.dataSourceId is not 'youtube_channel' and new credentials are needed, as indicated by CheckValidCreds. In order to obtain version info, make a request to the following URL: https://www.gstatic.com/bigquerydatatransfer/oauthz/auth?redirect_uri=urn:ietf:wg:oauth:2.0:oob&response_type=version_info&client_id=client_id&scope=data_source_scopes * The client_id is the OAuth client_id of the data source as returned by the ListDataSources method. * data_source_scopes are the scopes returned by the ListDataSources method. Note that this should not be set when service_account_name is used to create the transfer config.
- authorizationCode String
- Optional OAuth2 authorization code to use with this transfer configuration. This is required only if transferConfig.dataSourceId is 'youtube_channel' and new credentials are needed, as indicated by CheckValidCreds. In order to obtain authorization_code, make a request to the following URL: https://www.gstatic.com/bigquerydatatransfer/oauthz/auth?redirect_uri=urn:ietf:wg:oauth:2.0:oob&response_type=authorization_code&client_id=client_id&scope=data_source_scopes * The client_id is the OAuth client_id of the data source as returned by the ListDataSources method. * data_source_scopes are the scopes returned by the ListDataSources method. Note that this should not be set when service_account_name is used to create the transfer config.
- dataRefreshWindowDays Integer
- The number of days to look back to automatically refresh the data. For example, if data_refresh_window_days = 10, then every day BigQuery reingests data for [today-10, today-1], rather than ingesting data for just [today-1]. Only valid if the data source supports the feature. Set the value to 0 to use the default value.
- dataSourceId String
- Data source ID. This cannot be changed once the data transfer is created. The full list of available data source IDs can be returned through an API call: https://cloud.google.com/bigquery-transfer/docs/reference/datatransfer/rest/v1/projects.locations.dataSources/list
- destinationDatasetId String
- The BigQuery target dataset id.
- disabled Boolean
- Is this config disabled. When set to true, no runs are scheduled for a given transfer.
- displayName String
- User specified display name for the data transfer.
- emailPreferences EmailPreferences
- Email notifications will be sent according to these preferences to the email address of the user who owns this transfer config.
- encryptionConfiguration EncryptionConfiguration
- The encryption configuration part. Currently, it is only used for the optional KMS key name. The BigQuery service account of your project must be granted permissions to use the key. Read methods will return the key name applied in effect. Write methods will apply the key if it is present, or otherwise try to apply project default keys if it is absent.
- location String
- name String
- The resource name of the transfer config. Transfer config names have the form either projects/{project_id}/locations/{region}/transferConfigs/{config_id} or projects/{project_id}/transferConfigs/{config_id}, where config_id is usually a UUID, even though it is not guaranteed or required. The name is ignored when creating a transfer config.
- notificationPubsubTopic String
- Pub/Sub topic where notifications will be sent after transfer runs associated with this transfer config finish. The format for specifying a pubsub topic is: projects/{project}/topics/{topic}
- params Map<String,String>
- Parameters specific to each data source. For more information see the bq tab in the 'Setting up a data transfer' section for each data source. For example the parameters for Cloud Storage transfers are listed here: https://cloud.google.com/bigquery-transfer/docs/cloud-storage-transfer#bq
- project String
- schedule String
- Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid format: 1st,3rd monday of month 15:30, every wed,fri of jan,jun 13:15, and first sunday of quarter 00:00. See more explanation about the format here: https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format NOTE: The minimum interval time between recurring transfers depends on the data source; refer to the documentation for your data source.
- scheduleOptions ScheduleOptions
- Options customizing the data transfer schedule.
- serviceAccountName String
- Optional service account email. If this field is set, the transfer config will be created with this service account's credentials. It requires that the requesting user calling this API has permissions to act as this service account. Note that not all data sources support service account credentials when creating a transfer config. For the latest list of data sources, read about using service accounts.
- userId String
- Deprecated. Unique ID of the user on whose behalf transfer is done.
- versionInfo String
- Optional version info. This is required only if transferConfig.dataSourceId is not 'youtube_channel' and new credentials are needed, as indicated by CheckValidCreds. In order to obtain version info, make a request to the following URL: https://www.gstatic.com/bigquerydatatransfer/oauthz/auth?redirect_uri=urn:ietf:wg:oauth:2.0:oob&response_type=version_info&client_id=client_id&scope=data_source_scopes * The client_id is the OAuth client_id of the data source as returned by the ListDataSources method. * data_source_scopes are the scopes returned by the ListDataSources method. Note that this should not be set when service_account_name is used to create the transfer config.
- authorizationCode string
- Optional OAuth2 authorization code to use with this transfer configuration. This is required only if transferConfig.dataSourceId is 'youtube_channel' and new credentials are needed, as indicated by CheckValidCreds. In order to obtain authorization_code, make a request to the following URL: https://www.gstatic.com/bigquerydatatransfer/oauthz/auth?redirect_uri=urn:ietf:wg:oauth:2.0:oob&response_type=authorization_code&client_id=client_id&scope=data_source_scopes * The client_id is the OAuth client_id of the data source as returned by the ListDataSources method. * data_source_scopes are the scopes returned by the ListDataSources method. Note that this should not be set when service_account_name is used to create the transfer config.
- dataRefreshWindowDays number
- The number of days to look back to automatically refresh the data. For example, if data_refresh_window_days = 10, then every day BigQuery reingests data for [today-10, today-1], rather than ingesting data for just [today-1]. Only valid if the data source supports the feature. Set the value to 0 to use the default value.
- dataSourceId string
- Data source ID. This cannot be changed once the data transfer is created. The full list of available data source IDs can be returned through an API call: https://cloud.google.com/bigquery-transfer/docs/reference/datatransfer/rest/v1/projects.locations.dataSources/list
- destinationDatasetId string
- The BigQuery target dataset id.
- disabled boolean
- Is this config disabled. When set to true, no runs are scheduled for a given transfer.
- displayName string
- User specified display name for the data transfer.
- emailPreferences EmailPreferences
- Email notifications will be sent according to these preferences to the email address of the user who owns this transfer config.
- encryptionConfiguration EncryptionConfiguration
- The encryption configuration part. Currently, it is only used for the optional KMS key name. The BigQuery service account of your project must be granted permissions to use the key. Read methods will return the key name applied in effect. Write methods will apply the key if it is present, or otherwise try to apply project default keys if it is absent.
- location string
- name string
- The resource name of the transfer config. Transfer config names have the form either projects/{project_id}/locations/{region}/transferConfigs/{config_id} or projects/{project_id}/transferConfigs/{config_id}, where config_id is usually a UUID, even though it is not guaranteed or required. The name is ignored when creating a transfer config.
- notificationPubsubTopic string
- Pub/Sub topic where notifications will be sent after transfer runs associated with this transfer config finish. The format for specifying a pubsub topic is: projects/{project}/topics/{topic}
- params {[key: string]: string}
- Parameters specific to each data source. For more information see the bq tab in the 'Setting up a data transfer' section for each data source. For example the parameters for Cloud Storage transfers are listed here: https://cloud.google.com/bigquery-transfer/docs/cloud-storage-transfer#bq
- project string
- schedule string
- Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid format: 1st,3rd monday of month 15:30, every wed,fri of jan,jun 13:15, and first sunday of quarter 00:00. See more explanation about the format here: https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format NOTE: The minimum interval time between recurring transfers depends on the data source; refer to the documentation for your data source.
- scheduleOptions ScheduleOptions
- Options customizing the data transfer schedule.
- serviceAccountName string
- Optional service account email. If this field is set, the transfer config will be created with this service account's credentials. It requires that the requesting user calling this API has permissions to act as this service account. Note that not all data sources support service account credentials when creating a transfer config. For the latest list of data sources, read about using service accounts.
- userId string
- Deprecated. Unique ID of the user on whose behalf transfer is done.
- versionInfo string
- Optional version info. This is required only if transferConfig.dataSourceId is not 'youtube_channel' and new credentials are needed, as indicated by CheckValidCreds. In order to obtain version info, make a request to the following URL: https://www.gstatic.com/bigquerydatatransfer/oauthz/auth?redirect_uri=urn:ietf:wg:oauth:2.0:oob&response_type=version_info&client_id=client_id&scope=data_source_scopes * The client_id is the OAuth client_id of the data source as returned by the ListDataSources method. * data_source_scopes are the scopes returned by the ListDataSources method. Note that this should not be set when service_account_name is used to create the transfer config.
- authorization_code str
- Optional OAuth2 authorization code to use with this transfer configuration. This is required only if transferConfig.dataSourceId is 'youtube_channel' and new credentials are needed, as indicated by CheckValidCreds. In order to obtain authorization_code, make a request to the following URL: https://www.gstatic.com/bigquerydatatransfer/oauthz/auth?redirect_uri=urn:ietf:wg:oauth:2.0:oob&response_type=authorization_code&client_id=client_id&scope=data_source_scopes * The client_id is the OAuth client_id of the data source as returned by the ListDataSources method. * data_source_scopes are the scopes returned by the ListDataSources method. Note that this should not be set when service_account_name is used to create the transfer config.
- data_refresh_window_days int
- The number of days to look back to automatically refresh the data. For example, if data_refresh_window_days = 10, then every day BigQuery reingests data for [today-10, today-1], rather than ingesting data for just [today-1]. Only valid if the data source supports the feature. Set the value to 0 to use the default value.
- data_source_id str
- Data source ID. This cannot be changed once the data transfer is created. The full list of available data source IDs can be returned through an API call: https://cloud.google.com/bigquery-transfer/docs/reference/datatransfer/rest/v1/projects.locations.dataSources/list
- destination_dataset_id str
- The BigQuery target dataset id.
- disabled bool
- Is this config disabled. When set to true, no runs are scheduled for a given transfer.
- display_name str
- User specified display name for the data transfer.
- email_preferences EmailPreferencesArgs
- Email notifications will be sent according to these preferences to the email address of the user who owns this transfer config.
- encryption_configuration EncryptionConfigurationArgs
- The encryption configuration part. Currently, it is only used for the optional KMS key name. The BigQuery service account of your project must be granted permissions to use the key. Read methods will return the key name applied in effect. Write methods will apply the key if it is present, or otherwise try to apply project default keys if it is absent.
- location str
- name str
- The resource name of the transfer config. Transfer config names have the form either projects/{project_id}/locations/{region}/transferConfigs/{config_id} or projects/{project_id}/transferConfigs/{config_id}, where config_id is usually a UUID, even though it is not guaranteed or required. The name is ignored when creating a transfer config.
- notification_pubsub_topic str
- Pub/Sub topic where notifications will be sent after transfer runs associated with this transfer config finish. The format for specifying a pubsub topic is: projects/{project}/topics/{topic}
- params Mapping[str, str]
- Parameters specific to each data source. For more information see the bq tab in the 'Setting up a data transfer' section for each data source. For example the parameters for Cloud Storage transfers are listed here: https://cloud.google.com/bigquery-transfer/docs/cloud-storage-transfer#bq
- project str
- schedule str
- Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid format: 1st,3rd monday of month 15:30, every wed,fri of jan,jun 13:15, and first sunday of quarter 00:00. See more explanation about the format here: https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format NOTE: The minimum interval time between recurring transfers depends on the data source; refer to the documentation for your data source.
- schedule_options ScheduleOptionsArgs
- Options customizing the data transfer schedule.
- service_account_name str
- Optional service account email. If this field is set, the transfer config will be created with this service account's credentials. It requires that the requesting user calling this API has permissions to act as this service account. Note that not all data sources support service account credentials when creating a transfer config. For the latest list of data sources, read about using service accounts.
- user_id str
- Deprecated. Unique ID of the user on whose behalf transfer is done.
- version_info str
- Optional version info. This is required only if transferConfig.dataSourceId is not 'youtube_channel' and new credentials are needed, as indicated by CheckValidCreds. In order to obtain version info, make a request to the following URL: https://www.gstatic.com/bigquerydatatransfer/oauthz/auth?redirect_uri=urn:ietf:wg:oauth:2.0:oob&response_type=version_info&client_id=client_id&scope=data_source_scopes * The client_id is the OAuth client_id of the data source as returned by the ListDataSources method. * data_source_scopes are the scopes returned by the ListDataSources method. Note that this should not be set when service_account_name is used to create the transfer config.
- authorizationCode String
- Optional OAuth2 authorization code to use with this transfer configuration. This is required only if transferConfig.dataSourceId is 'youtube_channel' and new credentials are needed, as indicated by CheckValidCreds. In order to obtain authorization_code, make a request to the following URL: https://www.gstatic.com/bigquerydatatransfer/oauthz/auth?redirect_uri=urn:ietf:wg:oauth:2.0:oob&response_type=authorization_code&client_id=client_id&scope=data_source_scopes * The client_id is the OAuth client_id of the data source as returned by the ListDataSources method. * data_source_scopes are the scopes returned by the ListDataSources method. Note that this should not be set when service_account_name is used to create the transfer config.
- dataRefreshWindowDays Number
- The number of days to look back to automatically refresh the data. For example, if data_refresh_window_days = 10, then every day BigQuery reingests data for [today-10, today-1], rather than ingesting data for just [today-1]. Only valid if the data source supports the feature. Set the value to 0 to use the default value.
- dataSourceId String
- Data source ID. This cannot be changed once the data transfer is created. The full list of available data source IDs can be returned through an API call: https://cloud.google.com/bigquery-transfer/docs/reference/datatransfer/rest/v1/projects.locations.dataSources/list
- destinationDatasetId String
- The BigQuery target dataset id.
- disabled Boolean
- Is this config disabled. When set to true, no runs are scheduled for a given transfer.
- displayName String
- User specified display name for the data transfer.
- emailPreferences Property Map
- Email notifications will be sent according to these preferences to the email address of the user who owns this transfer config.
- encryptionConfiguration Property Map
- The encryption configuration part. Currently, it is only used for the optional KMS key name. The BigQuery service account of your project must be granted permissions to use the key. Read methods will return the key name applied in effect. Write methods will apply the key if it is present, or otherwise try to apply project default keys if it is absent.
- location String
- name String
- The resource name of the transfer config. Transfer config names have the form either projects/{project_id}/locations/{region}/transferConfigs/{config_id} or projects/{project_id}/transferConfigs/{config_id}, where config_id is usually a UUID, even though it is not guaranteed or required. The name is ignored when creating a transfer config.
- notificationPubsubTopic String
- Pub/Sub topic where notifications will be sent after transfer runs associated with this transfer config finish. The format for specifying a pubsub topic is: projects/{project}/topics/{topic}
- params Map<String>
- Parameters specific to each data source. For more information see the bq tab in the 'Setting up a data transfer' section for each data source. For example the parameters for Cloud Storage transfers are listed here: https://cloud.google.com/bigquery-transfer/docs/cloud-storage-transfer#bq
- project String
- schedule String
- Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid format: 1st,3rd monday of month 15:30, every wed,fri of jan,jun 13:15, and first sunday of quarter 00:00. See more explanation about the format here: https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format NOTE: The minimum interval time between recurring transfers depends on the data source; refer to the documentation for your data source.
- scheduleOptions Property Map
- Options customizing the data transfer schedule.
- serviceAccountName String
- Optional service account email. If this field is set, the transfer config will be created with this service account's credentials. It requires that the requesting user calling this API has permissions to act as this service account. Note that not all data sources support service account credentials when creating a transfer config. For the latest list of data sources, read about using service accounts.
- userId String
- Deprecated. Unique ID of the user on whose behalf transfer is done.
- versionInfo String
- Optional version info. This is required only if transferConfig.dataSourceId is not 'youtube_channel' and new credentials are needed, as indicated by CheckValidCreds. In order to obtain version info, make a request to the following URL: https://www.gstatic.com/bigquerydatatransfer/oauthz/auth?redirect_uri=urn:ietf:wg:oauth:2.0:oob&response_type=version_info&client_id=client_id&scope=data_source_scopes * The client_id is the OAuth client_id of the data source as returned by the ListDataSources method. * data_source_scopes are the scopes returned by the ListDataSources method. Note that this should not be set when service_account_name is used to create the transfer config.
Outputs
All input properties are implicitly available as output properties. Additionally, the TransferConfig resource produces the following output properties:
- DatasetRegion string
- Region in which BigQuery dataset is located.
- Id string
- The provider-assigned unique ID for this managed resource.
- NextRunTime string
- Next time when data transfer will run.
- OwnerInfo Pulumi.GoogleNative.BigQueryDataTransfer.V1.Outputs.UserInfoResponse
- Information about the user whose credentials are used to transfer data. Populated only for transferConfigs.get requests. In case the user information is not available, this field will not be populated.
- State string
- State of the most recently updated transfer run.
- UpdateTime string
- Data transfer modification time. Ignored by server on input.
- DatasetRegion string
- Region in which BigQuery dataset is located.
- Id string
- The provider-assigned unique ID for this managed resource.
- NextRunTime string
- Next time when data transfer will run.
- OwnerInfo UserInfoResponse
- Information about the user whose credentials are used to transfer data. Populated only for transferConfigs.get requests. In case the user information is not available, this field will not be populated.
- State string
- State of the most recently updated transfer run.
- UpdateTime string
- Data transfer modification time. Ignored by server on input.
- datasetRegion String
- Region in which BigQuery dataset is located.
- id String
- The provider-assigned unique ID for this managed resource.
- nextRunTime String
- Next time when data transfer will run.
- ownerInfo UserInfoResponse
- Information about the user whose credentials are used to transfer data. Populated only for transferConfigs.get requests. In case the user information is not available, this field will not be populated.
- state String
- State of the most recently updated transfer run.
- updateTime String
- Data transfer modification time. Ignored by server on input.
- datasetRegion string
- Region in which BigQuery dataset is located.
- id string
- The provider-assigned unique ID for this managed resource.
- nextRunTime string
- Next time when data transfer will run.
- ownerInfo UserInfoResponse
- Information about the user whose credentials are used to transfer data. Populated only for transferConfigs.get requests. In case the user information is not available, this field will not be populated.
- state string
- State of the most recently updated transfer run.
- updateTime string
- Data transfer modification time. Ignored by server on input.
- dataset_region str
- Region in which BigQuery dataset is located.
- id str
- The provider-assigned unique ID for this managed resource.
- next_run_time str
- Next time when data transfer will run.
- owner_info UserInfoResponse
- Information about the user whose credentials are used to transfer data. Populated only for transferConfigs.get requests. In case the user information is not available, this field will not be populated.
- state str
- State of the most recently updated transfer run.
- update_time str
- Data transfer modification time. Ignored by server on input.
- datasetRegion String
- Region in which BigQuery dataset is located.
- id String
- The provider-assigned unique ID for this managed resource.
- nextRunTime String
- Next time when data transfer will run.
- ownerInfo Property Map
- Information about the user whose credentials are used to transfer data. Populated only for transferConfigs.get requests. In case the user information is not available, this field will not be populated.
- state String
- State of the most recently updated transfer run.
- updateTime String
- Data transfer modification time. Ignored by server on input.
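Because these values are assigned by the service, they are only available as Pulumi outputs once the resource has been created. A minimal Python sketch of reading them follows; the transfer configuration itself uses illustrative values.

import pulumi
import pulumi_google_native as google_native

transfer_config = google_native.bigquerydatatransfer.v1.TransferConfig(
    "transferConfigResource",
    data_source_id="google_cloud_storage",   # illustrative
    destination_dataset_id="my_dataset",
    display_name="example transfer",
    location="us",
)

# Output properties resolve only after the deployment has run.
pulumi.export("dataset_region", transfer_config.dataset_region)
pulumi.export("next_run_time", transfer_config.next_run_time)
pulumi.export("state", transfer_config.state)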
Supporting Types
EmailPreferences, EmailPreferencesArgs
- EnableFailureEmail bool
- If true, email notifications will be sent on transfer run failures.
- EnableFailureEmail bool
- If true, email notifications will be sent on transfer run failures.
- enableFailureEmail Boolean
- If true, email notifications will be sent on transfer run failures.
- enableFailureEmail boolean
- If true, email notifications will be sent on transfer run failures.
- enable_failure_email bool
- If true, email notifications will be sent on transfer run failures.
- enableFailureEmail Boolean
- If true, email notifications will be sent on transfer run failures.
EmailPreferencesResponse, EmailPreferencesResponseArgs
- EnableFailureEmail bool
- If true, email notifications will be sent on transfer run failures.
- EnableFailureEmail bool
- If true, email notifications will be sent on transfer run failures.
- enableFailureEmail Boolean
- If true, email notifications will be sent on transfer run failures.
- enableFailureEmail boolean
- If true, email notifications will be sent on transfer run failures.
- enable_failure_email bool
- If true, email notifications will be sent on transfer run failures.
- enableFailureEmail Boolean
- If true, email notifications will be sent on transfer run failures.
EncryptionConfiguration, EncryptionConfigurationArgs
- KmsKeyName string
- The name of the KMS key used for encrypting BigQuery data.
- KmsKeyName string
- The name of the KMS key used for encrypting BigQuery data.
- kmsKeyName String
- The name of the KMS key used for encrypting BigQuery data.
- kmsKeyName string
- The name of the KMS key used for encrypting BigQuery data.
- kms_key_name str
- The name of the KMS key used for encrypting BigQuery data.
- kmsKeyName String
- The name of the KMS key used for encrypting BigQuery data.
EncryptionConfigurationResponse, EncryptionConfigurationResponseArgs
- KmsKeyName string
- The name of the KMS key used for encrypting BigQuery data.
- KmsKeyName string
- The name of the KMS key used for encrypting BigQuery data.
- kmsKeyName String
- The name of the KMS key used for encrypting BigQuery data.
- kmsKeyName string
- The name of the KMS key used for encrypting BigQuery data.
- kms_key_name str
- The name of the KMS key used for encrypting BigQuery data.
- kmsKeyName String
- The name of the KMS key used for encrypting BigQuery data.
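The KMS key name is the full Cloud KMS resource name of the key. A minimal Python sketch is shown below; all names are placeholders, and it is assumed the BigQuery service account has already been granted permission to use the key.

import pulumi_google_native as google_native

encrypted_transfer = google_native.bigquerydatatransfer.v1.TransferConfig(
    "encryptedTransfer",
    data_source_id="google_cloud_storage",   # illustrative
    destination_dataset_id="my_dataset",
    display_name="CMEK-protected transfer",
    location="us",
    encryption_configuration=google_native.bigquerydatatransfer.v1.EncryptionConfigurationArgs(
        # Full Cloud KMS resource name of the key (placeholder values).
        kms_key_name="projects/my-project/locations/us/keyRings/my-ring/cryptoKeys/my-key",
    ),
)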
ScheduleOptions, ScheduleOptionsArgs
- DisableAutoScheduling bool
- If true, automatic scheduling of data transfer runs for this configuration will be disabled. The runs can be started on an ad-hoc basis using the StartManualTransferRuns API. When automatic scheduling is disabled, the TransferConfig.schedule field will be ignored.
- EndTime string
- Defines time to stop scheduling transfer runs. A transfer run cannot be scheduled at or after the end time. The end time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
- StartTime string
- Specifies time to start scheduling transfer runs. The first run will be scheduled at or after the start time according to a recurrence pattern defined in the schedule string. The start time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
- DisableAutoScheduling bool
- If true, automatic scheduling of data transfer runs for this configuration will be disabled. The runs can be started on an ad-hoc basis using the StartManualTransferRuns API. When automatic scheduling is disabled, the TransferConfig.schedule field will be ignored.
- EndTime string
- Defines time to stop scheduling transfer runs. A transfer run cannot be scheduled at or after the end time. The end time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
- StartTime string
- Specifies time to start scheduling transfer runs. The first run will be scheduled at or after the start time according to a recurrence pattern defined in the schedule string. The start time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
- disableAutoScheduling Boolean
- If true, automatic scheduling of data transfer runs for this configuration will be disabled. The runs can be started on an ad-hoc basis using the StartManualTransferRuns API. When automatic scheduling is disabled, the TransferConfig.schedule field will be ignored.
- endTime String
- Defines time to stop scheduling transfer runs. A transfer run cannot be scheduled at or after the end time. The end time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
- startTime String
- Specifies time to start scheduling transfer runs. The first run will be scheduled at or after the start time according to a recurrence pattern defined in the schedule string. The start time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
- disableAutoScheduling boolean
- If true, automatic scheduling of data transfer runs for this configuration will be disabled. The runs can be started on an ad-hoc basis using the StartManualTransferRuns API. When automatic scheduling is disabled, the TransferConfig.schedule field will be ignored.
- endTime string
- Defines time to stop scheduling transfer runs. A transfer run cannot be scheduled at or after the end time. The end time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
- startTime string
- Specifies time to start scheduling transfer runs. The first run will be scheduled at or after the start time according to a recurrence pattern defined in the schedule string. The start time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
- disable_auto_scheduling bool
- If true, automatic scheduling of data transfer runs for this configuration will be disabled. The runs can be started on an ad-hoc basis using the StartManualTransferRuns API. When automatic scheduling is disabled, the TransferConfig.schedule field will be ignored.
- end_time str
- Defines time to stop scheduling transfer runs. A transfer run cannot be scheduled at or after the end time. The end time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
- start_time str
- Specifies time to start scheduling transfer runs. The first run will be scheduled at or after the start time according to a recurrence pattern defined in the schedule string. The start time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
- disableAutoScheduling Boolean
- If true, automatic scheduling of data transfer runs for this configuration will be disabled. The runs can be started on an ad-hoc basis using the StartManualTransferRuns API. When automatic scheduling is disabled, the TransferConfig.schedule field will be ignored.
- endTime String
- Defines time to stop scheduling transfer runs. A transfer run cannot be scheduled at or after the end time. The end time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
- startTime String
- Specifies time to start scheduling transfer runs. The first run will be scheduled at or after the start time according to a recurrence pattern defined in the schedule string. The start time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
ScheduleOptionsResponse, ScheduleOptionsResponseArgs
- DisableAutoScheduling bool
- If true, automatic scheduling of data transfer runs for this configuration will be disabled. The runs can be started on an ad-hoc basis using the StartManualTransferRuns API. When automatic scheduling is disabled, the TransferConfig.schedule field will be ignored.
- EndTime string
- Defines time to stop scheduling transfer runs. A transfer run cannot be scheduled at or after the end time. The end time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
- StartTime string
- Specifies time to start scheduling transfer runs. The first run will be scheduled at or after the start time according to a recurrence pattern defined in the schedule string. The start time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
- DisableAutoScheduling bool
- If true, automatic scheduling of data transfer runs for this configuration will be disabled. The runs can be started on an ad-hoc basis using the StartManualTransferRuns API. When automatic scheduling is disabled, the TransferConfig.schedule field will be ignored.
- EndTime string
- Defines time to stop scheduling transfer runs. A transfer run cannot be scheduled at or after the end time. The end time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
- StartTime string
- Specifies time to start scheduling transfer runs. The first run will be scheduled at or after the start time according to a recurrence pattern defined in the schedule string. The start time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
- disableAutoScheduling Boolean
- If true, automatic scheduling of data transfer runs for this configuration will be disabled. The runs can be started on an ad-hoc basis using the StartManualTransferRuns API. When automatic scheduling is disabled, the TransferConfig.schedule field will be ignored.
- endTime String
- Defines time to stop scheduling transfer runs. A transfer run cannot be scheduled at or after the end time. The end time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
- startTime String
- Specifies time to start scheduling transfer runs. The first run will be scheduled at or after the start time according to a recurrence pattern defined in the schedule string. The start time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
- disableAutoScheduling boolean
- If true, automatic scheduling of data transfer runs for this configuration will be disabled. The runs can be started on an ad-hoc basis using the StartManualTransferRuns API. When automatic scheduling is disabled, the TransferConfig.schedule field will be ignored.
- endTime string
- Defines time to stop scheduling transfer runs. A transfer run cannot be scheduled at or after the end time. The end time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
- startTime string
- Specifies time to start scheduling transfer runs. The first run will be scheduled at or after the start time according to a recurrence pattern defined in the schedule string. The start time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
- disable_auto_scheduling bool
- If true, automatic scheduling of data transfer runs for this configuration will be disabled. The runs can be started on an ad-hoc basis using the StartManualTransferRuns API. When automatic scheduling is disabled, the TransferConfig.schedule field will be ignored.
- end_time str
- Defines time to stop scheduling transfer runs. A transfer run cannot be scheduled at or after the end time. The end time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
- start_time str
- Specifies time to start scheduling transfer runs. The first run will be scheduled at or after the start time according to a recurrence pattern defined in the schedule string. The start time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
- disableAutoScheduling Boolean
- If true, automatic scheduling of data transfer runs for this configuration will be disabled. The runs can be started on an ad-hoc basis using the StartManualTransferRuns API. When automatic scheduling is disabled, the TransferConfig.schedule field will be ignored.
- endTime String
- Defines time to stop scheduling transfer runs. A transfer run cannot be scheduled at or after the end time. The end time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
- startTime String
- Specifies time to start scheduling transfer runs. The first run will be scheduled at or after the start time according to a recurrence pattern defined in the schedule string. The start time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
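startTime and endTime bound the window in which runs may be scheduled, while the recurrence itself comes from the transfer config's schedule field. A minimal Python sketch follows; the schedule string is illustrative, and RFC 3339 timestamps are assumed for the window bounds.

import pulumi_google_native as google_native

windowed_transfer = google_native.bigquerydatatransfer.v1.TransferConfig(
    "windowedTransfer",
    data_source_id="google_cloud_storage",    # illustrative
    destination_dataset_id="my_dataset",
    display_name="Transfer with a bounded schedule window",
    location="us",
    schedule="every 24 hours",                 # illustrative recurrence
    schedule_options=google_native.bigquerydatatransfer.v1.ScheduleOptionsArgs(
        disable_auto_scheduling=False,
        start_time="2024-01-01T00:00:00Z",     # assumed RFC 3339 timestamp format
        end_time="2024-12-31T00:00:00Z",
    ),
)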
UserInfoResponse, UserInfoResponseArgs
- Email string
- E-mail address of the user.
- Email string
- E-mail address of the user.
- email String
- E-mail address of the user.
- email string
- E-mail address of the user.
- email str
- E-mail address of the user.
- email String
- E-mail address of the user.
Package Details
- Repository
- Google Cloud Native pulumi/pulumi-google-native
- License
- Apache-2.0