
We recommend new projects start with resources from the AWS provider.

AWS Cloud Control v1.9.0 published on Monday, Nov 18, 2024 by Pulumi

aws-native.bedrock.ApplicationInferenceProfile


    Definition of AWS::Bedrock::ApplicationInferenceProfile Resource Type

    Create ApplicationInferenceProfile Resource

    Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.

    Constructor syntax

    TypeScript

    new ApplicationInferenceProfile(name: string, args?: ApplicationInferenceProfileArgs, opts?: CustomResourceOptions);

    Python

    @overload
    def ApplicationInferenceProfile(resource_name: str,
                                    args: Optional[ApplicationInferenceProfileArgs] = None,
                                    opts: Optional[ResourceOptions] = None)

    @overload
    def ApplicationInferenceProfile(resource_name: str,
                                    opts: Optional[ResourceOptions] = None,
                                    description: Optional[str] = None,
                                    inference_profile_name: Optional[str] = None,
                                    model_source: Optional[ApplicationInferenceProfileInferenceProfileModelSourcePropertiesArgs] = None,
                                    tags: Optional[Sequence[_root_inputs.TagArgs]] = None)

    Go

    func NewApplicationInferenceProfile(ctx *Context, name string, args *ApplicationInferenceProfileArgs, opts ...ResourceOption) (*ApplicationInferenceProfile, error)

    C#

    public ApplicationInferenceProfile(string name, ApplicationInferenceProfileArgs? args = null, CustomResourceOptions? opts = null)

    Java

    public ApplicationInferenceProfile(String name, ApplicationInferenceProfileArgs args)
    public ApplicationInferenceProfile(String name, ApplicationInferenceProfileArgs args, CustomResourceOptions options)

    YAML

    type: aws-native:bedrock:ApplicationInferenceProfile
    properties: # The arguments to resource properties.
    options: # Bag of options to control resource's behavior.
    
    

    Parameters

    TypeScript

    name string
    The unique name of the resource.
    args ApplicationInferenceProfileArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.

    Python

    resource_name str
    The unique name of the resource.
    args ApplicationInferenceProfileArgs
    The arguments to resource properties.
    opts ResourceOptions
    Bag of options to control resource's behavior.

    Go

    ctx Context
    Context object for the current deployment.
    name string
    The unique name of the resource.
    args ApplicationInferenceProfileArgs
    The arguments to resource properties.
    opts ResourceOption
    Bag of options to control resource's behavior.

    C#

    name string
    The unique name of the resource.
    args ApplicationInferenceProfileArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.

    Java

    name String
    The unique name of the resource.
    args ApplicationInferenceProfileArgs
    The arguments to resource properties.
    options CustomResourceOptions
    Bag of options to control resource's behavior.
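    As a minimal sketch, a Pulumi YAML program that creates an application inference profile might look like the following. All names and the model ARN here are illustrative assumptions, not values from this page.

```yaml
# Pulumi.yaml — hypothetical project; resource and profile names are placeholders.
name: bedrock-profile-demo
runtime: yaml
resources:
  appProfile:
    type: aws-native:bedrock:ApplicationInferenceProfile
    properties:
      inferenceProfileName: demo-app-profile
      modelSource:
        # Placeholder foundation-model ARN; substitute one available in your region.
        copyFrom: arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-sonnet-20240229-v1:0
```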

    ApplicationInferenceProfile Resource Properties

    To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.

    Inputs

    In Python, inputs that are objects can be passed either as argument classes or as dictionary literals.

    The ApplicationInferenceProfile resource accepts the following input properties:

    C#

    Description string
    Description of the inference profile
    InferenceProfileName string
    The name of the inference profile.
    ModelSource Pulumi.AwsNative.Bedrock.Inputs.ApplicationInferenceProfileInferenceProfileModelSourceProperties
    Contains configurations for the inference profile to copy as the resource.
    Tags List<Pulumi.AwsNative.Inputs.Tag>
    List of Tags

    Go

    Description string
    Description of the inference profile
    InferenceProfileName string
    The name of the inference profile.
    ModelSource ApplicationInferenceProfileInferenceProfileModelSourcePropertiesArgs
    Contains configurations for the inference profile to copy as the resource.
    Tags TagArgs
    List of Tags

    Java

    description String
    Description of the inference profile
    inferenceProfileName String
    The name of the inference profile.
    modelSource ApplicationInferenceProfileInferenceProfileModelSourceProperties
    Contains configurations for the inference profile to copy as the resource.
    tags List<Tag>
    List of Tags

    TypeScript

    description string
    Description of the inference profile
    inferenceProfileName string
    The name of the inference profile.
    modelSource ApplicationInferenceProfileInferenceProfileModelSourceProperties
    Contains configurations for the inference profile to copy as the resource.
    tags Tag[]
    List of Tags

    Python

    description str
    Description of the inference profile
    inference_profile_name str
    The name of the inference profile.
    model_source ApplicationInferenceProfileInferenceProfileModelSourcePropertiesArgs
    Contains configurations for the inference profile to copy as the resource.
    tags Sequence[TagArgs]
    List of Tags

    YAML

    description String
    Description of the inference profile
    inferenceProfileName String
    The name of the inference profile.
    modelSource Property Map
    Contains configurations for the inference profile to copy as the resource.
    tags List<Property Map>
    List of Tags
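    All of the inputs above can be combined in one resource declaration. As an illustrative sketch in YAML (every name, description, tag, and ARN below is a placeholder):

```yaml
properties:
  inferenceProfileName: demo-app-profile      # placeholder name
  description: Routes and meters requests for the demo app
  modelSource:
    # Placeholder foundation-model ARN to copy routing configuration from.
    copyFrom: arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-sonnet-20240229-v1:0
  tags:
    - key: team            # illustrative tag
      value: ml-platform
```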

    Outputs

    All input properties are implicitly available as output properties. Additionally, the ApplicationInferenceProfile resource produces the following output properties:

    C#

    CreatedAt string
    Time Stamp
    Id string
    The provider-assigned unique ID for this managed resource.
    InferenceProfileArn string
    The Amazon Resource Name (ARN) of the inference profile.
    InferenceProfileId string
    The unique identifier of the inference profile.
    InferenceProfileIdentifier string
    Inference profile identifier. Supports both system-defined inference profile ids, and inference profile ARNs.
    Models List<Pulumi.AwsNative.Bedrock.Outputs.ApplicationInferenceProfileInferenceProfileModel>
    List of model configuration
    Status Pulumi.AwsNative.Bedrock.ApplicationInferenceProfileInferenceProfileStatus
    The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
    Type Pulumi.AwsNative.Bedrock.ApplicationInferenceProfileInferenceProfileType
    The type of the inference profile. The following types are possible:

    • SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock. You can route inference requests across regions with these inference profiles.
    • APPLICATION – The inference profile was created by a user. This type of inference profile can track metrics and costs when invoking the model in it. The inference profile may route requests to one or multiple regions.
    UpdatedAt string
    Time Stamp

    Go
    CreatedAt string
    Time Stamp
    Id string
    The provider-assigned unique ID for this managed resource.
    InferenceProfileArn string
    The Amazon Resource Name (ARN) of the inference profile.
    InferenceProfileId string
    The unique identifier of the inference profile.
    InferenceProfileIdentifier string
    Inference profile identifier. Supports both system-defined inference profile ids, and inference profile ARNs.
    Models []ApplicationInferenceProfileInferenceProfileModel
    List of model configuration
    Status ApplicationInferenceProfileInferenceProfileStatus
    The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
    Type ApplicationInferenceProfileInferenceProfileType
    The type of the inference profile. The following types are possible:

    • SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock. You can route inference requests across regions with these inference profiles.
    • APPLICATION – The inference profile was created by a user. This type of inference profile can track metrics and costs when invoking the model in it. The inference profile may route requests to one or multiple regions.
    UpdatedAt string
    Time Stamp

    Java
    createdAt String
    Time Stamp
    id String
    The provider-assigned unique ID for this managed resource.
    inferenceProfileArn String
    The Amazon Resource Name (ARN) of the inference profile.
    inferenceProfileId String
    The unique identifier of the inference profile.
    inferenceProfileIdentifier String
    Inference profile identifier. Supports both system-defined inference profile ids, and inference profile ARNs.
    models List<ApplicationInferenceProfileInferenceProfileModel>
    List of model configuration
    status ApplicationInferenceProfileInferenceProfileStatus
    The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
    type ApplicationInferenceProfileInferenceProfileType
    The type of the inference profile. The following types are possible:

    • SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock. You can route inference requests across regions with these inference profiles.
    • APPLICATION – The inference profile was created by a user. This type of inference profile can track metrics and costs when invoking the model in it. The inference profile may route requests to one or multiple regions.
    updatedAt String
    Time Stamp

    TypeScript
    createdAt string
    Time Stamp
    id string
    The provider-assigned unique ID for this managed resource.
    inferenceProfileArn string
    The Amazon Resource Name (ARN) of the inference profile.
    inferenceProfileId string
    The unique identifier of the inference profile.
    inferenceProfileIdentifier string
    Inference profile identifier. Supports both system-defined inference profile ids, and inference profile ARNs.
    models ApplicationInferenceProfileInferenceProfileModel[]
    List of model configuration
    status ApplicationInferenceProfileInferenceProfileStatus
    The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
    type ApplicationInferenceProfileInferenceProfileType
    The type of the inference profile. The following types are possible:

    • SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock. You can route inference requests across regions with these inference profiles.
    • APPLICATION – The inference profile was created by a user. This type of inference profile can track metrics and costs when invoking the model in it. The inference profile may route requests to one or multiple regions.
    updatedAt string
    Time Stamp

    Python
    created_at str
    Time Stamp
    id str
    The provider-assigned unique ID for this managed resource.
    inference_profile_arn str
    The Amazon Resource Name (ARN) of the inference profile.
    inference_profile_id str
    The unique identifier of the inference profile.
    inference_profile_identifier str
    Inference profile identifier. Supports both system-defined inference profile ids, and inference profile ARNs.
    models Sequence[ApplicationInferenceProfileInferenceProfileModel]
    List of model configuration
    status ApplicationInferenceProfileInferenceProfileStatus
    The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
    type ApplicationInferenceProfileInferenceProfileType
    The type of the inference profile. The following types are possible:

    • SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock. You can route inference requests across regions with these inference profiles.
    • APPLICATION – The inference profile was created by a user. This type of inference profile can track metrics and costs when invoking the model in it. The inference profile may route requests to one or multiple regions.
    updated_at str
    Time Stamp

    YAML
    createdAt String
    Time Stamp
    id String
    The provider-assigned unique ID for this managed resource.
    inferenceProfileArn String
    The Amazon Resource Name (ARN) of the inference profile.
    inferenceProfileId String
    The unique identifier of the inference profile.
    inferenceProfileIdentifier String
    Inference profile identifier. Supports both system-defined inference profile ids, and inference profile ARNs.
    models List<Property Map>
    List of model configuration
    status "ACTIVE"
    The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
    type "APPLICATION" | "SYSTEM_DEFINED"
    The type of the inference profile. The following types are possible:

    • SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock. You can route inference requests across regions with these inference profiles.
    • APPLICATION – The inference profile was created by a user. This type of inference profile can track metrics and costs when invoking the model in it. The inference profile may route requests to one or multiple regions.
    updatedAt String
    Time Stamp
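    Output properties can be referenced elsewhere in a program or exported as stack outputs. Assuming a resource named appProfile (a hypothetical name), a YAML sketch:

```yaml
outputs:
  # Export the ARN and status of the profile as stack outputs.
  profileArn: ${appProfile.inferenceProfileArn}
  profileStatus: ${appProfile.status}
```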

    Supporting Types

    ApplicationInferenceProfileInferenceProfileModel, ApplicationInferenceProfileInferenceProfileModelArgs

    ModelArn string (C#, Go) / modelArn String (Java, YAML) / modelArn string (TypeScript) / model_arn str (Python)
    ARN for Foundation Models in Bedrock. These models can be used as base models for model customization jobs.

    ApplicationInferenceProfileInferenceProfileModelSourceProperties, ApplicationInferenceProfileInferenceProfileModelSourcePropertiesArgs

    CopyFrom string (C#, Go) / copyFrom String (Java, YAML) / copyFrom string (TypeScript) / copy_from str (Python)
    Source ARN for a custom inference profile to copy its regional load-balancing configuration from. This can be either a foundation model ARN or a predefined inference profile ARN.
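    Since the source can be either kind of ARN, a modelSource block can take either shape; both ARNs below are illustrative placeholders (the account ID is fictitious):

```yaml
modelSource:
  # Either a foundation-model ARN (placeholder):
  copyFrom: arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-sonnet-20240229-v1:0
  # ...or a system-defined cross-region inference-profile ARN (placeholder, commented out):
  # copyFrom: arn:aws:bedrock:us-east-1:111122223333:inference-profile/us.anthropic.claude-3-sonnet-20240229-v1:0
```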

    ApplicationInferenceProfileInferenceProfileStatus, ApplicationInferenceProfileInferenceProfileStatusArgs

    ACTIVE
    Enum member names per language: Active (C#, Java, Node.js), ApplicationInferenceProfileInferenceProfileStatusActive (Go), ACTIVE (Python), "ACTIVE" (YAML).

    ApplicationInferenceProfileInferenceProfileType, ApplicationInferenceProfileInferenceProfileTypeArgs

    APPLICATION
    Enum member names per language: Application (C#, Java, Node.js), ApplicationInferenceProfileInferenceProfileTypeApplication (Go), APPLICATION (Python), "APPLICATION" (YAML).

    SYSTEM_DEFINED
    Enum member names per language: SystemDefined (C#, Java, Node.js), ApplicationInferenceProfileInferenceProfileTypeSystemDefined (Go), SYSTEM_DEFINED (Python), "SYSTEM_DEFINED" (YAML).

    Tag, TagArgs

    Key string (C#, Go) / key String (Java, YAML) / key string (TypeScript) / key str (Python)
    The key name of the tag.

    Value string (C#, Go) / value String (Java, YAML) / value string (TypeScript) / value str (Python)
    The value of the tag.

    Package Details

    Repository
    AWS Native pulumi/pulumi-aws-native
    License
    Apache-2.0