We recommend new projects start with resources from the AWS provider.
aws-native.bedrock.ApplicationInferenceProfile
Definition of AWS::Bedrock::ApplicationInferenceProfile Resource Type
Create ApplicationInferenceProfile Resource
Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.
Constructor syntax
TypeScript
new ApplicationInferenceProfile(name: string, args?: ApplicationInferenceProfileArgs, opts?: CustomResourceOptions);
Python
@overload
def ApplicationInferenceProfile(resource_name: str,
                                args: Optional[ApplicationInferenceProfileArgs] = None,
                                opts: Optional[ResourceOptions] = None)
@overload
def ApplicationInferenceProfile(resource_name: str,
                                opts: Optional[ResourceOptions] = None,
                                description: Optional[str] = None,
                                inference_profile_name: Optional[str] = None,
                                model_source: Optional[ApplicationInferenceProfileInferenceProfileModelSourcePropertiesArgs] = None,
                                tags: Optional[Sequence[_root_inputs.TagArgs]] = None)
Go
func NewApplicationInferenceProfile(ctx *Context, name string, args *ApplicationInferenceProfileArgs, opts ...ResourceOption) (*ApplicationInferenceProfile, error)
C#
public ApplicationInferenceProfile(string name, ApplicationInferenceProfileArgs? args = null, CustomResourceOptions? opts = null)
Java
public ApplicationInferenceProfile(String name, ApplicationInferenceProfileArgs args)
public ApplicationInferenceProfile(String name, ApplicationInferenceProfileArgs args, CustomResourceOptions options)
YAML
type: aws-native:bedrock:ApplicationInferenceProfile
properties: # The arguments to resource properties.
options: # Bag of options to control resource's behavior.
Parameters
TypeScript
- name string - The unique name of the resource.
- args ApplicationInferenceProfileArgs - The arguments to resource properties.
- opts CustomResourceOptions - Bag of options to control resource's behavior.
Python
- resource_name str - The unique name of the resource.
- args ApplicationInferenceProfileArgs - The arguments to resource properties.
- opts ResourceOptions - Bag of options to control resource's behavior.
Go
- ctx Context - Context object for the current deployment.
- name string - The unique name of the resource.
- args ApplicationInferenceProfileArgs - The arguments to resource properties.
- opts ResourceOption - Bag of options to control resource's behavior.
C#
- name string - The unique name of the resource.
- args ApplicationInferenceProfileArgs - The arguments to resource properties.
- opts CustomResourceOptions - Bag of options to control resource's behavior.
Java
- name String - The unique name of the resource.
- args ApplicationInferenceProfileArgs - The arguments to resource properties.
- options CustomResourceOptions - Bag of options to control resource's behavior.
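For example, a minimal Python program that creates an application inference profile might look like the following sketch. Every resource name, profile name, description, tag value, and ARN below is an illustrative placeholder, not a value prescribed by this page.

import pulumi_aws_native as aws_native

# A minimal sketch: all names, values, and ARNs are illustrative placeholders.
profile = aws_native.bedrock.ApplicationInferenceProfile(
    "exampleProfile",
    inference_profile_name="example-profile",
    description="Tracks usage and cost for the example application",
    model_source=aws_native.bedrock.ApplicationInferenceProfileInferenceProfileModelSourcePropertiesArgs(
        copy_from="arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-haiku-20240307-v1:0",
    ),
    # TagArgs is the shared root-level tag type from this provider.
    tags=[aws_native.TagArgs(key="team", value="platform")],
)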
ApplicationInferenceProfile Resource Properties
To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.
Inputs
In Python, inputs that are objects can be passed either as argument classes or as dictionary literals.
The ApplicationInferenceProfile resource accepts the following input properties:
C#
- Description string - Description of the inference profile
- InferenceProfileName string - The name of the inference profile.
- ModelSource Pulumi.AwsNative.Bedrock.Inputs.ApplicationInferenceProfileInferenceProfileModelSourceProperties - Contains configurations for the inference profile to copy as the resource.
- Tags List<Pulumi.AwsNative.Inputs.Tag> - List of Tags
Go
- Description string - Description of the inference profile
- InferenceProfileName string - The name of the inference profile.
- ModelSource ApplicationInferenceProfileInferenceProfileModelSourcePropertiesArgs - Contains configurations for the inference profile to copy as the resource.
- Tags []TagArgs - List of Tags
Java
- description String - Description of the inference profile
- inferenceProfileName String - The name of the inference profile.
- modelSource ApplicationInferenceProfileInferenceProfileModelSourceProperties - Contains configurations for the inference profile to copy as the resource.
- tags List<Tag> - List of Tags
TypeScript
- description string - Description of the inference profile
- inferenceProfileName string - The name of the inference profile.
- modelSource ApplicationInferenceProfileInferenceProfileModelSourceProperties - Contains configurations for the inference profile to copy as the resource.
- tags Tag[] - List of Tags
Python
- description str - Description of the inference profile
- inference_profile_name str - The name of the inference profile.
- model_source ApplicationInferenceProfileInferenceProfileModelSourcePropertiesArgs - Contains configurations for the inference profile to copy as the resource.
- tags Sequence[TagArgs] - List of Tags
YAML
- description String - Description of the inference profile
- inferenceProfileName String - The name of the inference profile.
- modelSource Property Map - Contains configurations for the inference profile to copy as the resource.
- tags List<Property Map> - List of Tags
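Per the note above, object inputs in Python can be written either way; the two forms below are equivalent, in a sketch where the ARN is an illustrative placeholder:

import pulumi_aws_native as aws_native

# Typed args class:
model_source = aws_native.bedrock.ApplicationInferenceProfileInferenceProfileModelSourcePropertiesArgs(
    copy_from="arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-haiku-20240307-v1:0",
)
# Equivalent dictionary literal; keys use the same snake_case names as the args class:
model_source = {
    "copy_from": "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-haiku-20240307-v1:0",
}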
Outputs
All input properties are implicitly available as output properties. Additionally, the ApplicationInferenceProfile resource produces the following output properties:
C#
- CreatedAt string - Timestamp
- Id string - The provider-assigned unique ID for this managed resource.
- InferenceProfileArn string - The Amazon Resource Name (ARN) of the inference profile.
- InferenceProfileId string - The unique identifier of the inference profile.
- InferenceProfileIdentifier string - Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
- Models List<Pulumi.AwsNative.Bedrock.Outputs.ApplicationInferenceProfileInferenceProfileModel> - List of model configurations
- Status Pulumi.AwsNative.Bedrock.ApplicationInferenceProfileInferenceProfileStatus - The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
- Type Pulumi.AwsNative.Bedrock.ApplicationInferenceProfileInferenceProfileType - The type of the inference profile. The following types are possible: SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock; you can route inference requests across regions with these inference profiles. APPLICATION – The inference profile was created by a user; this type of inference profile can track metrics and costs when invoking the model in it, and may route requests to one or multiple regions.
- UpdatedAt string - Timestamp
Go
- CreatedAt string - Timestamp
- Id string - The provider-assigned unique ID for this managed resource.
- InferenceProfileArn string - The Amazon Resource Name (ARN) of the inference profile.
- InferenceProfileId string - The unique identifier of the inference profile.
- InferenceProfileIdentifier string - Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
- Models []ApplicationInferenceProfileInferenceProfileModel - List of model configurations
- Status ApplicationInferenceProfileInferenceProfileStatus - The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
- Type ApplicationInferenceProfileInferenceProfileType - The type of the inference profile. The following types are possible: SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock; you can route inference requests across regions with these inference profiles. APPLICATION – The inference profile was created by a user; this type of inference profile can track metrics and costs when invoking the model in it, and may route requests to one or multiple regions.
- UpdatedAt string - Timestamp
Java
- createdAt String - Timestamp
- id String - The provider-assigned unique ID for this managed resource.
- inferenceProfileArn String - The Amazon Resource Name (ARN) of the inference profile.
- inferenceProfileId String - The unique identifier of the inference profile.
- inferenceProfileIdentifier String - Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
- models List<ApplicationInferenceProfileInferenceProfileModel> - List of model configurations
- status ApplicationInferenceProfileInferenceProfileStatus - The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
- type ApplicationInferenceProfileInferenceProfileType - The type of the inference profile. The following types are possible: SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock; you can route inference requests across regions with these inference profiles. APPLICATION – The inference profile was created by a user; this type of inference profile can track metrics and costs when invoking the model in it, and may route requests to one or multiple regions.
- updatedAt String - Timestamp
TypeScript
- createdAt string - Timestamp
- id string - The provider-assigned unique ID for this managed resource.
- inferenceProfileArn string - The Amazon Resource Name (ARN) of the inference profile.
- inferenceProfileId string - The unique identifier of the inference profile.
- inferenceProfileIdentifier string - Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
- models ApplicationInferenceProfileInferenceProfileModel[] - List of model configurations
- status ApplicationInferenceProfileInferenceProfileStatus - The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
- type ApplicationInferenceProfileInferenceProfileType - The type of the inference profile. The following types are possible: SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock; you can route inference requests across regions with these inference profiles. APPLICATION – The inference profile was created by a user; this type of inference profile can track metrics and costs when invoking the model in it, and may route requests to one or multiple regions.
- updatedAt string - Timestamp
Python
- created_at str - Timestamp
- id str - The provider-assigned unique ID for this managed resource.
- inference_profile_arn str - The Amazon Resource Name (ARN) of the inference profile.
- inference_profile_id str - The unique identifier of the inference profile.
- inference_profile_identifier str - Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
- models Sequence[ApplicationInferenceProfileInferenceProfileModel] - List of model configurations
- status ApplicationInferenceProfileInferenceProfileStatus - The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
- type ApplicationInferenceProfileInferenceProfileType - The type of the inference profile. The following types are possible: SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock; you can route inference requests across regions with these inference profiles. APPLICATION – The inference profile was created by a user; this type of inference profile can track metrics and costs when invoking the model in it, and may route requests to one or multiple regions.
- updated_at str - Timestamp
YAML
- createdAt String - Timestamp
- id String - The provider-assigned unique ID for this managed resource.
- inferenceProfileArn String - The Amazon Resource Name (ARN) of the inference profile.
- inferenceProfileId String - The unique identifier of the inference profile.
- inferenceProfileIdentifier String - Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
- models List<Property Map> - List of model configurations
- status "ACTIVE" - The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
- type "APPLICATION" | "SYSTEM_DEFINED" - The type of the inference profile. The following types are possible: SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock; you can route inference requests across regions with these inference profiles. APPLICATION – The inference profile was created by a user; this type of inference profile can track metrics and costs when invoking the model in it, and may route requests to one or multiple regions.
- updatedAt String - Timestamp
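Once created, these outputs are available on the resource instance; for example, a sketch that exports a few of them, assuming the `profile` resource from the sketch earlier on this page:

import pulumi

# Assumes `profile` is the ApplicationInferenceProfile created earlier.
pulumi.export("inferenceProfileArn", profile.inference_profile_arn)
pulumi.export("inferenceProfileId", profile.inference_profile_id)
pulumi.export("status", profile.status)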
Supporting Types
ApplicationInferenceProfileInferenceProfileModel, ApplicationInferenceProfileInferenceProfileModelArgs
C#
- ModelArn string - ARN for Foundation Models in Bedrock. These models can be used as base models for model customization jobs.
Go
- ModelArn string - ARN for Foundation Models in Bedrock. These models can be used as base models for model customization jobs.
Java
- modelArn String - ARN for Foundation Models in Bedrock. These models can be used as base models for model customization jobs.
TypeScript
- modelArn string - ARN for Foundation Models in Bedrock. These models can be used as base models for model customization jobs.
Python
- model_arn str - ARN for Foundation Models in Bedrock. These models can be used as base models for model customization jobs.
YAML
- modelArn String - ARN for Foundation Models in Bedrock. These models can be used as base models for model customization jobs.
ApplicationInferenceProfileInferenceProfileModelSourceProperties, ApplicationInferenceProfileInferenceProfileModelSourcePropertiesArgs
C#
- CopyFrom string - Source ARN for a custom inference profile to copy its regional load balancing config from. This can be either a foundation model ARN or a predefined inference profile ARN.
Go
- CopyFrom string - Source ARN for a custom inference profile to copy its regional load balancing config from. This can be either a foundation model ARN or a predefined inference profile ARN.
Java
- copyFrom String - Source ARN for a custom inference profile to copy its regional load balancing config from. This can be either a foundation model ARN or a predefined inference profile ARN.
TypeScript
- copyFrom string - Source ARN for a custom inference profile to copy its regional load balancing config from. This can be either a foundation model ARN or a predefined inference profile ARN.
Python
- copy_from str - Source ARN for a custom inference profile to copy its regional load balancing config from. This can be either a foundation model ARN or a predefined inference profile ARN.
YAML
- copyFrom String - Source ARN for a custom inference profile to copy its regional load balancing config from. This can be either a foundation model ARN or a predefined inference profile ARN.
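Both accepted forms, sketched as Python dictionary literals with illustrative placeholder ARNs (the account and model IDs are made up):

# Copy the regional routing configuration from a foundation model:
from_foundation_model = {
    "copy_from": "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-haiku-20240307-v1:0",
}
# ...or from a predefined (system-defined) cross-region inference profile:
from_inference_profile = {
    "copy_from": "arn:aws:bedrock:us-east-1:111122223333:inference-profile/us.anthropic.claude-3-haiku-20240307-v1:0",
}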
ApplicationInferenceProfileInferenceProfileStatus, ApplicationInferenceProfileInferenceProfileStatusArgs
C#
- Active - ACTIVE
Go
- ApplicationInferenceProfileInferenceProfileStatusActive - ACTIVE
Java
- Active - ACTIVE
TypeScript
- Active - ACTIVE
Python
- ACTIVE - ACTIVE
YAML
- "ACTIVE" - ACTIVE
ApplicationInferenceProfileInferenceProfileType, ApplicationInferenceProfileInferenceProfileTypeArgs
C#
- Application - APPLICATION
- SystemDefined - SYSTEM_DEFINED
Go
- ApplicationInferenceProfileInferenceProfileTypeApplication - APPLICATION
- ApplicationInferenceProfileInferenceProfileTypeSystemDefined - SYSTEM_DEFINED
Java
- Application - APPLICATION
- SystemDefined - SYSTEM_DEFINED
TypeScript
- Application - APPLICATION
- SystemDefined - SYSTEM_DEFINED
Python
- APPLICATION - APPLICATION
- SYSTEM_DEFINED - SYSTEM_DEFINED
YAML
- "APPLICATION" - APPLICATION
- "SYSTEM_DEFINED" - SYSTEM_DEFINED
Tag, TagArgs
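The body of this shared type is not reproduced on this page. As used by the tags input above, it is a plain key/value pair; a sketch, assuming the root-level TagArgs class exported by pulumi_aws_native (the tag values are placeholders):

import pulumi_aws_native as aws_native

# Both forms are accepted for a tag entry; the values are illustrative.
tags = [
    aws_native.TagArgs(key="env", value="prod"),
    {"key": "cost-center", "value": "1234"},
]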
Package Details
- Repository: AWS Native pulumi/pulumi-aws-native
- License: Apache-2.0