dbt Cloud v0.1.25 published on Friday, Nov 8, 2024 by Pulumi
dbtcloud.getBigQueryConnection
Using getBigQueryConnection
Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.
function getBigQueryConnection(args: GetBigQueryConnectionArgs, opts?: InvokeOptions): Promise<GetBigQueryConnectionResult>
function getBigQueryConnectionOutput(args: GetBigQueryConnectionOutputArgs, opts?: InvokeOptions): Output<GetBigQueryConnectionResult>
def get_big_query_connection(connection_id: Optional[int] = None,
project_id: Optional[int] = None,
opts: Optional[InvokeOptions] = None) -> GetBigQueryConnectionResult
def get_big_query_connection_output(connection_id: Optional[pulumi.Input[int]] = None,
project_id: Optional[pulumi.Input[int]] = None,
opts: Optional[InvokeOptions] = None) -> Output[GetBigQueryConnectionResult]
func LookupBigQueryConnection(ctx *Context, args *LookupBigQueryConnectionArgs, opts ...InvokeOption) (*LookupBigQueryConnectionResult, error)
func LookupBigQueryConnectionOutput(ctx *Context, args *LookupBigQueryConnectionOutputArgs, opts ...InvokeOption) LookupBigQueryConnectionResultOutput
> Note: This function is named LookupBigQueryConnection in the Go SDK.
public static class GetBigQueryConnection
{
public static Task<GetBigQueryConnectionResult> InvokeAsync(GetBigQueryConnectionArgs args, InvokeOptions? opts = null)
public static Output<GetBigQueryConnectionResult> Invoke(GetBigQueryConnectionInvokeArgs args, InvokeOptions? opts = null)
}
public static CompletableFuture<GetBigQueryConnectionResult> getBigQueryConnection(GetBigQueryConnectionArgs args, InvokeOptions options)
// Output-based functions aren't available in Java yet
fn::invoke:
  function: dbtcloud:index/getBigQueryConnection:getBigQueryConnection
  arguments:
    # arguments dictionary
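Filled in with example values, an invocation from a Pulumi YAML program might look like the following sketch. The connection and project IDs are placeholders, and bigqueryConnection is just an illustrative variable name:

```yaml
variables:
  bigqueryConnection:
    fn::invoke:
      function: dbtcloud:index/getBigQueryConnection:getBigQueryConnection
      arguments:
        connectionId: 12345  # placeholder connection ID
        projectId: 67890     # placeholder dbt Cloud project ID
outputs:
  # Result properties are referenced with standard interpolation
  bigqueryGcpProject: ${bigqueryConnection.gcpProjectId}
```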
The following arguments are supported:

- connectionId (int) - Connection Identifier
- projectId (int) - Project ID to create the connection in

Argument names follow each SDK's casing convention: ConnectionId and ProjectId in .NET and Go, connectionId and projectId in Java, TypeScript/JavaScript, and YAML, and connection_id and project_id in Python.
getBigQueryConnection Result

The following output properties are available:

- authProviderX509CertUrl (string) - Auth Provider X509 Cert URL for the Service Account
- authUri (string) - Auth URI for the Service Account
- clientEmail (string) - Service Account email
- clientId (string) - Client ID of the Service Account
- clientX509CertUrl (string) - Client X509 Cert URL for the Service Account
- connectionId (int) - Connection Identifier
- dataprocClusterName (string) - Dataproc cluster name for PySpark workloads
- dataprocRegion (string) - Google Cloud region for PySpark workloads on Dataproc
- executionProject (string) - Project to bill for query execution
- gcpProjectId (string) - GCP project ID
- gcsBucket (string) - URI for a Google Cloud Storage bucket to host Python code executed via Dataproc
- id (string) - The provider-assigned unique ID for this managed resource
- isActive (bool) - Whether the connection is active
- isConfiguredForOauth (bool) - Whether the connection is configured for OAuth
- location (string) - Location to create new Datasets in
- maximumBytesBilled (int) - Max number of bytes that can be billed for a given BigQuery query
- name (string) - Connection name
- priority (string) - The priority with which to execute BigQuery queries
- privateKey (string) - Private key of the Service Account
- privateKeyId (string) - Private key ID of the Service Account
- projectId (int) - Project ID to create the connection in
- retries (int) - Number of retries for queries
- timeoutSeconds (int) - Timeout in seconds for queries
- tokenUri (string) - Token URI for the Service Account
- type (string) - The type of connection

Property names and primitive types follow each SDK's conventions: PascalCase with int/bool in .NET and Go, camelCase with Integer/Boolean in Java, camelCase with number/boolean in TypeScript/JavaScript and YAML, and snake_case with int/str/bool (e.g. connection_id, timeout_seconds) in Python.
Package Details
- Repository: dbtcloud pulumi/pulumi-dbtcloud
- License: Apache-2.0
- Notes: This Pulumi package is based on the dbtcloud Terraform Provider.