
We recommend new projects start with resources from the AWS provider.

AWS Cloud Control v1.9.0 published on Monday, Nov 18, 2024 by Pulumi

aws-native.sagemaker.InferenceExperiment


    Resource Type definition for AWS::SageMaker::InferenceExperiment

    Create InferenceExperiment Resource

    Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.

    Constructor syntax

    new InferenceExperiment(name: string, args: InferenceExperimentArgs, opts?: CustomResourceOptions);
    @overload
    def InferenceExperiment(resource_name: str,
                            args: InferenceExperimentArgs,
                            opts: Optional[ResourceOptions] = None)
    
    @overload
    def InferenceExperiment(resource_name: str,
                            opts: Optional[ResourceOptions] = None,
                            model_variants: Optional[Sequence[InferenceExperimentModelVariantConfigArgs]] = None,
                            type: Optional[InferenceExperimentType] = None,
                            role_arn: Optional[str] = None,
                            endpoint_name: Optional[str] = None,
                            name: Optional[str] = None,
                            kms_key: Optional[str] = None,
                            data_storage_config: Optional[InferenceExperimentDataStorageConfigArgs] = None,
                            desired_state: Optional[InferenceExperimentDesiredState] = None,
                            schedule: Optional[InferenceExperimentScheduleArgs] = None,
                            shadow_mode_config: Optional[InferenceExperimentShadowModeConfigArgs] = None,
                            status_reason: Optional[str] = None,
                            tags: Optional[Sequence[_root_inputs.TagArgs]] = None,
                            description: Optional[str] = None)
    func NewInferenceExperiment(ctx *Context, name string, args InferenceExperimentArgs, opts ...ResourceOption) (*InferenceExperiment, error)
    public InferenceExperiment(string name, InferenceExperimentArgs args, CustomResourceOptions? opts = null)
    public InferenceExperiment(String name, InferenceExperimentArgs args)
    public InferenceExperiment(String name, InferenceExperimentArgs args, CustomResourceOptions options)
    
    type: aws-native:sagemaker:InferenceExperiment
    properties: # The arguments to resource properties.
    options: # Bag of options to control resource's behavior.
    
    

    Parameters

    name string
    The unique name of the resource.
    args InferenceExperimentArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.
    resource_name str
    The unique name of the resource.
    args InferenceExperimentArgs
    The arguments to resource properties.
    opts ResourceOptions
    Bag of options to control resource's behavior.
    ctx Context
    Context object for the current deployment.
    name string
    The unique name of the resource.
    args InferenceExperimentArgs
    The arguments to resource properties.
    opts ResourceOption
    Bag of options to control resource's behavior.
    name string
    The unique name of the resource.
    args InferenceExperimentArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.
    name String
    The unique name of the resource.
    args InferenceExperimentArgs
    The arguments to resource properties.
    options CustomResourceOptions
    Bag of options to control resource's behavior.

    InferenceExperiment Resource Properties

    To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.

    Inputs

    In Python, inputs that are objects can be passed either as argument classes or as dictionary literals.

    The InferenceExperiment resource accepts the following input properties:

    EndpointName string
    The name of the endpoint.
    ModelVariants List<Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentModelVariantConfig>
    An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
    RoleArn string
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
    Type Pulumi.AwsNative.SageMaker.InferenceExperimentType
    The type of the inference experiment that you want to run.
    DataStorageConfig Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentDataStorageConfig
    The Amazon S3 location and configuration for storing inference request and response data.
    Description string
    The description of the inference experiment.
    DesiredState Pulumi.AwsNative.SageMaker.InferenceExperimentDesiredState
    The desired state of the experiment after a start or stop operation.
    KmsKey string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
    Name string
    The name for the inference experiment.
    Schedule Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentSchedule

    The duration for which the inference experiment ran or will run.

    The maximum duration that you can set for an inference experiment is 30 days.

    ShadowModeConfig Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentShadowModeConfig
    The configuration for the ShadowMode experiment type: the production variant that receives all inference requests, the shadow variant to which Amazon SageMaker replicates a percentage of those requests, and the percentage of requests that SageMaker replicates.
    StatusReason string
    The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
    Tags List<Pulumi.AwsNative.Inputs.Tag>
    An array of key-value pairs to apply to this resource.
    EndpointName string
    The name of the endpoint.
    ModelVariants []InferenceExperimentModelVariantConfigArgs
    An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
    RoleArn string
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
    Type InferenceExperimentType
    The type of the inference experiment that you want to run.
    DataStorageConfig InferenceExperimentDataStorageConfigArgs
    The Amazon S3 location and configuration for storing inference request and response data.
    Description string
    The description of the inference experiment.
    DesiredState InferenceExperimentDesiredState
    The desired state of the experiment after a start or stop operation.
    KmsKey string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
    Name string
    The name for the inference experiment.
    Schedule InferenceExperimentScheduleArgs

    The duration for which the inference experiment ran or will run.

    The maximum duration that you can set for an inference experiment is 30 days.

    ShadowModeConfig InferenceExperimentShadowModeConfigArgs
    The configuration for the ShadowMode experiment type: the production variant that receives all inference requests, the shadow variant to which Amazon SageMaker replicates a percentage of those requests, and the percentage of requests that SageMaker replicates.
    StatusReason string
    The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
    Tags []TagArgs
    An array of key-value pairs to apply to this resource.
    endpointName String
    The name of the endpoint.
    modelVariants List<InferenceExperimentModelVariantConfig>
    An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
    roleArn String
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
    type InferenceExperimentType
    The type of the inference experiment that you want to run.
    dataStorageConfig InferenceExperimentDataStorageConfig
    The Amazon S3 location and configuration for storing inference request and response data.
    description String
    The description of the inference experiment.
    desiredState InferenceExperimentDesiredState
    The desired state of the experiment after a start or stop operation.
    kmsKey String
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
    name String
    The name for the inference experiment.
    schedule InferenceExperimentSchedule

    The duration for which the inference experiment ran or will run.

    The maximum duration that you can set for an inference experiment is 30 days.

    shadowModeConfig InferenceExperimentShadowModeConfig
    The configuration for the ShadowMode experiment type: the production variant that receives all inference requests, the shadow variant to which Amazon SageMaker replicates a percentage of those requests, and the percentage of requests that SageMaker replicates.
    statusReason String
    The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
    tags List<Tag>
    An array of key-value pairs to apply to this resource.
    endpointName string
    The name of the endpoint.
    modelVariants InferenceExperimentModelVariantConfig[]
    An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
    roleArn string
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
    type InferenceExperimentType
    The type of the inference experiment that you want to run.
    dataStorageConfig InferenceExperimentDataStorageConfig
    The Amazon S3 location and configuration for storing inference request and response data.
    description string
    The description of the inference experiment.
    desiredState InferenceExperimentDesiredState
    The desired state of the experiment after a start or stop operation.
    kmsKey string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
    name string
    The name for the inference experiment.
    schedule InferenceExperimentSchedule

    The duration for which the inference experiment ran or will run.

    The maximum duration that you can set for an inference experiment is 30 days.

    shadowModeConfig InferenceExperimentShadowModeConfig
    The configuration for the ShadowMode experiment type: the production variant that receives all inference requests, the shadow variant to which Amazon SageMaker replicates a percentage of those requests, and the percentage of requests that SageMaker replicates.
    statusReason string
    The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
    tags Tag[]
    An array of key-value pairs to apply to this resource.
    endpoint_name str
    The name of the endpoint.
    model_variants Sequence[InferenceExperimentModelVariantConfigArgs]
    An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
    role_arn str
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
    type InferenceExperimentType
    The type of the inference experiment that you want to run.
    data_storage_config InferenceExperimentDataStorageConfigArgs
    The Amazon S3 location and configuration for storing inference request and response data.
    description str
    The description of the inference experiment.
    desired_state InferenceExperimentDesiredState
    The desired state of the experiment after a start or stop operation.
    kms_key str
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
    name str
    The name for the inference experiment.
    schedule InferenceExperimentScheduleArgs

    The duration for which the inference experiment ran or will run.

    The maximum duration that you can set for an inference experiment is 30 days.

    shadow_mode_config InferenceExperimentShadowModeConfigArgs
    The configuration for the ShadowMode experiment type: the production variant that receives all inference requests, the shadow variant to which Amazon SageMaker replicates a percentage of those requests, and the percentage of requests that SageMaker replicates.
    status_reason str
    The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
    tags Sequence[TagArgs]
    An array of key-value pairs to apply to this resource.
    endpointName String
    The name of the endpoint.
    modelVariants List<Property Map>
    An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
    roleArn String
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
    type "ShadowMode"
    The type of the inference experiment that you want to run.
    dataStorageConfig Property Map
    The Amazon S3 location and configuration for storing inference request and response data.
    description String
    The description of the inference experiment.
    desiredState "Running" | "Completed" | "Cancelled"
    The desired state of the experiment after a start or stop operation.
    kmsKey String
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
    name String
    The name for the inference experiment.
    schedule Property Map

    The duration for which the inference experiment ran or will run.

    The maximum duration that you can set for an inference experiment is 30 days.

    shadowModeConfig Property Map
    The configuration for the ShadowMode experiment type: the production variant that receives all inference requests, the shadow variant to which Amazon SageMaker replicates a percentage of those requests, and the percentage of requests that SageMaker replicates.
    statusReason String
    The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
    tags List<Property Map>
    An array of key-value pairs to apply to this resource.
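Putting the required inputs together, a minimal Pulumi Python program might look like the sketch below. The resource name, endpoint name, model name, role ARN, variant name, and instance settings are all placeholders rather than values from this page, and the `SHADOW_MODE` member assumes the usual Pulumi enum naming for the `ShadowMode` value.

```python
import pulumi_aws_native as aws_native

# Sketch only: every name, ARN, and instance setting below is a placeholder.
experiment = aws_native.sagemaker.InferenceExperiment(
    "my-experiment",
    name="my-experiment",
    type=aws_native.sagemaker.InferenceExperimentType.SHADOW_MODE,
    endpoint_name="my-endpoint",
    role_arn="arn:aws:iam::123456789012:role/my-sagemaker-role",
    # In Python, nested object inputs can be passed as dictionary literals.
    model_variants=[{
        "model_name": "my-model",
        "variant_name": "my-variant",
        "infrastructure_config": {
            "infrastructure_type": "RealTimeInference",
            "real_time_inference_config": {
                "instance_type": "ml.m5.large",
                "instance_count": 1,
            },
        },
    }],
)
```

Optional inputs such as `schedule`, `data_storage_config`, and `shadow_mode_config` can be added in the same dictionary-literal style.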

    Outputs

    All input properties are implicitly available as output properties. Additionally, the InferenceExperiment resource produces the following output properties:

    Arn string
    The Amazon Resource Name (ARN) of the inference experiment.
    CreationTime string
    The timestamp at which you created the inference experiment.
    EndpointMetadata Pulumi.AwsNative.SageMaker.Outputs.InferenceExperimentEndpointMetadata
    Id string
    The provider-assigned unique ID for this managed resource.
    LastModifiedTime string
    The timestamp at which you last modified the inference experiment.
    Status Pulumi.AwsNative.SageMaker.InferenceExperimentStatus
    The status of the inference experiment.
    Arn string
    The Amazon Resource Name (ARN) of the inference experiment.
    CreationTime string
    The timestamp at which you created the inference experiment.
    EndpointMetadata InferenceExperimentEndpointMetadata
    Id string
    The provider-assigned unique ID for this managed resource.
    LastModifiedTime string
    The timestamp at which you last modified the inference experiment.
    Status InferenceExperimentStatus
    The status of the inference experiment.
    arn String
    The Amazon Resource Name (ARN) of the inference experiment.
    creationTime String
    The timestamp at which you created the inference experiment.
    endpointMetadata InferenceExperimentEndpointMetadata
    id String
    The provider-assigned unique ID for this managed resource.
    lastModifiedTime String
    The timestamp at which you last modified the inference experiment.
    status InferenceExperimentStatus
    The status of the inference experiment.
    arn string
    The Amazon Resource Name (ARN) of the inference experiment.
    creationTime string
    The timestamp at which you created the inference experiment.
    endpointMetadata InferenceExperimentEndpointMetadata
    id string
    The provider-assigned unique ID for this managed resource.
    lastModifiedTime string
    The timestamp at which you last modified the inference experiment.
    status InferenceExperimentStatus
    The status of the inference experiment.
    arn str
    The Amazon Resource Name (ARN) of the inference experiment.
    creation_time str
    The timestamp at which you created the inference experiment.
    endpoint_metadata InferenceExperimentEndpointMetadata
    id str
    The provider-assigned unique ID for this managed resource.
    last_modified_time str
    The timestamp at which you last modified the inference experiment.
    status InferenceExperimentStatus
    The status of the inference experiment.
    arn String
    The Amazon Resource Name (ARN) of the inference experiment.
    creationTime String
    The timestamp at which you created the inference experiment.
    endpointMetadata Property Map
    id String
    The provider-assigned unique ID for this managed resource.
    lastModifiedTime String
    The timestamp at which you last modified the inference experiment.
    status "Creating" | "Created" | "Updating" | "Starting" | "Stopping" | "Running" | "Completed" | "Cancelled"
    The status of the inference experiment.

    Supporting Types

    InferenceExperimentCaptureContentTypeHeader, InferenceExperimentCaptureContentTypeHeaderArgs

    CsvContentTypes List<string>
    The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
    JsonContentTypes List<string>
    The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
    CsvContentTypes []string
    The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
    JsonContentTypes []string
    The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
    csvContentTypes List<String>
    The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
    jsonContentTypes List<String>
    The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
    csvContentTypes string[]
    The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
    jsonContentTypes string[]
    The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
    csv_content_types Sequence[str]
    The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
    json_content_types Sequence[str]
    The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
    csvContentTypes List<String>
    The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
    jsonContentTypes List<String>
    The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
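In Python dictionary-literal form, a capture-content-type header could be sketched as follows; the specific MIME types are illustrative placeholders, not required values.

```python
# Dictionary form of InferenceExperimentCaptureContentTypeHeaderArgs (sketch).
# The MIME types here are illustrative placeholders.
content_type_header = {
    "csv_content_types": ["text/csv"],
    "json_content_types": ["application/json"],
}

# Both fields are plain lists of strings.
assert all(isinstance(t, str)
           for types in content_type_header.values()
           for t in types)
```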

    InferenceExperimentDataStorageConfig, InferenceExperimentDataStorageConfigArgs

    Destination string
    The Amazon S3 bucket where the inference request and response data is stored.
    ContentType Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentCaptureContentTypeHeader
    Configuration specifying how to treat different content type headers. If no headers are specified, SageMaker base64-encodes the captured data by default.
    KmsKey string
    The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
    Destination string
    The Amazon S3 bucket where the inference request and response data is stored.
    ContentType InferenceExperimentCaptureContentTypeHeader
    Configuration specifying how to treat different content type headers. If no headers are specified, SageMaker base64-encodes the captured data by default.
    KmsKey string
    The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
    destination String
    The Amazon S3 bucket where the inference request and response data is stored.
    contentType InferenceExperimentCaptureContentTypeHeader
    Configuration specifying how to treat different content type headers. If no headers are specified, SageMaker base64-encodes the captured data by default.
    kmsKey String
    The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
    destination string
    The Amazon S3 bucket where the inference request and response data is stored.
    contentType InferenceExperimentCaptureContentTypeHeader
    Configuration specifying how to treat different content type headers. If no headers are specified, SageMaker base64-encodes the captured data by default.
    kmsKey string
    The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
    destination str
    The Amazon S3 bucket where the inference request and response data is stored.
    content_type InferenceExperimentCaptureContentTypeHeader
    Configuration specifying how to treat different content type headers. If no headers are specified, SageMaker base64-encodes the captured data by default.
    kms_key str
    The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
    destination String
    The Amazon S3 bucket where the inference request and response data is stored.
    contentType Property Map
    Configuration specifying how to treat different content type headers. If no headers are specified, SageMaker base64-encodes the captured data by default.
    kmsKey String
    The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
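A data storage config in dictionary form might be sketched like this; the S3 destination and KMS key alias are placeholders, and the nested content-type header reuses the type described above.

```python
# Sketch of InferenceExperimentDataStorageConfigArgs as a dictionary literal.
# The S3 destination and KMS key alias are placeholders, not real resources.
data_storage_config = {
    "destination": "s3://my-bucket/inference-capture/",
    "kms_key": "alias/my-kms-key",
    "content_type": {
        "csv_content_types": ["text/csv"],
    },
}

assert data_storage_config["destination"].startswith("s3://")
```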

    InferenceExperimentDesiredState, InferenceExperimentDesiredStateArgs

    Running
    Running
    Completed
    Completed
    Cancelled
    Cancelled
    InferenceExperimentDesiredStateRunning
    Running
    InferenceExperimentDesiredStateCompleted
    Completed
    InferenceExperimentDesiredStateCancelled
    Cancelled
    Running
    Running
    Completed
    Completed
    Cancelled
    Cancelled
    Running
    Running
    Completed
    Completed
    Cancelled
    Cancelled
    RUNNING
    Running
    COMPLETED
    Completed
    CANCELLED
    Cancelled
    "Running"
    Running
    "Completed"
    Completed
    "Cancelled"
    Cancelled
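The three desired-state values above are passed as plain strings in YAML or dictionary inputs; a trivial sketch:

```python
# The desired-state string values listed above.
VALID_DESIRED_STATES = ("Running", "Completed", "Cancelled")

# For example, to request that a running experiment complete:
desired_state = "Completed"
assert desired_state in VALID_DESIRED_STATES
```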

    InferenceExperimentEndpointMetadata, InferenceExperimentEndpointMetadataArgs

    EndpointName string
    The name of the endpoint.
    EndpointConfigName string
    The name of the endpoint configuration.
    EndpointStatus Pulumi.AwsNative.SageMaker.InferenceExperimentEndpointMetadataEndpointStatus
    The status of the endpoint. For the possible values, see the endpoint status type below.
    EndpointName string
    The name of the endpoint.
    EndpointConfigName string
    The name of the endpoint configuration.
    EndpointStatus InferenceExperimentEndpointMetadataEndpointStatus
    The status of the endpoint. For the possible values, see the endpoint status type below.
    endpointName String
    The name of the endpoint.
    endpointConfigName String
    The name of the endpoint configuration.
    endpointStatus InferenceExperimentEndpointMetadataEndpointStatus
    The status of the endpoint. For the possible values, see the endpoint status type below.
    endpointName string
    The name of the endpoint.
    endpointConfigName string
    The name of the endpoint configuration.
    endpointStatus InferenceExperimentEndpointMetadataEndpointStatus
    The status of the endpoint. For the possible values, see the endpoint status type below.
    endpoint_name str
    The name of the endpoint.
    endpoint_config_name str
    The name of the endpoint configuration.
    endpoint_status InferenceExperimentEndpointMetadataEndpointStatus
    The status of the endpoint. For the possible values, see the endpoint status type below.
    endpointName String
    The name of the endpoint.
    endpointConfigName String
    The name of the endpoint configuration.
    endpointStatus "Creating" | "Updating" | "SystemUpdating" | "RollingBack" | "InService" | "OutOfService" | "Deleting" | "Failed"
    The status of the endpoint. For the possible values, see the endpoint status type below.

    InferenceExperimentEndpointMetadataEndpointStatus, InferenceExperimentEndpointMetadataEndpointStatusArgs

    Creating
    Creating
    Updating
    Updating
    SystemUpdating
    SystemUpdating
    RollingBack
    RollingBack
    InService
    InService
    OutOfService
    OutOfService
    Deleting
    Deleting
    Failed
    Failed
    InferenceExperimentEndpointMetadataEndpointStatusCreating
    Creating
    InferenceExperimentEndpointMetadataEndpointStatusUpdating
    Updating
    InferenceExperimentEndpointMetadataEndpointStatusSystemUpdating
    SystemUpdating
    InferenceExperimentEndpointMetadataEndpointStatusRollingBack
    RollingBack
    InferenceExperimentEndpointMetadataEndpointStatusInService
    InService
    InferenceExperimentEndpointMetadataEndpointStatusOutOfService
    OutOfService
    InferenceExperimentEndpointMetadataEndpointStatusDeleting
    Deleting
    InferenceExperimentEndpointMetadataEndpointStatusFailed
    Failed
    Creating
    Creating
    Updating
    Updating
    SystemUpdating
    SystemUpdating
    RollingBack
    RollingBack
    InService
    InService
    OutOfService
    OutOfService
    Deleting
    Deleting
    Failed
    Failed
    Creating
    Creating
    Updating
    Updating
    SystemUpdating
    SystemUpdating
    RollingBack
    RollingBack
    InService
    InService
    OutOfService
    OutOfService
    Deleting
    Deleting
    Failed
    Failed
    CREATING
    Creating
    UPDATING
    Updating
    SYSTEM_UPDATING
    SystemUpdating
    ROLLING_BACK
    RollingBack
    IN_SERVICE
    InService
    OUT_OF_SERVICE
    OutOfService
    DELETING
    Deleting
    FAILED
    Failed
    "Creating"
    Creating
    "Updating"
    Updating
    "SystemUpdating"
    SystemUpdating
    "RollingBack"
    RollingBack
    "InService"
    InService
    "OutOfService"
    OutOfService
    "Deleting"
    Deleting
    "Failed"
    Failed

    InferenceExperimentModelInfrastructureConfig, InferenceExperimentModelInfrastructureConfigArgs

    InfrastructureType Pulumi.AwsNative.SageMaker.InferenceExperimentModelInfrastructureConfigInfrastructureType
    The type of infrastructure on which the model is deployed.
    RealTimeInferenceConfig Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentRealTimeInferenceConfig
    The infrastructure configuration for deploying the model to real-time inference.
    InfrastructureType InferenceExperimentModelInfrastructureConfigInfrastructureType
    The type of infrastructure on which the model is deployed.
    RealTimeInferenceConfig InferenceExperimentRealTimeInferenceConfig
    The infrastructure configuration for deploying the model to real-time inference.
    infrastructureType InferenceExperimentModelInfrastructureConfigInfrastructureType
    The type of infrastructure on which the model is deployed.
    realTimeInferenceConfig InferenceExperimentRealTimeInferenceConfig
    The infrastructure configuration for deploying the model to real-time inference.
    infrastructureType InferenceExperimentModelInfrastructureConfigInfrastructureType
    The type of infrastructure on which the model is deployed.
    realTimeInferenceConfig InferenceExperimentRealTimeInferenceConfig
    The infrastructure configuration for deploying the model to real-time inference.
    infrastructure_type InferenceExperimentModelInfrastructureConfigInfrastructureType
    The type of infrastructure on which the model is deployed.
    real_time_inference_config InferenceExperimentRealTimeInferenceConfig
    The infrastructure configuration for deploying the model to real-time inference.
    infrastructureType "RealTimeInference"
    The type of infrastructure on which the model is deployed.
    realTimeInferenceConfig Property Map
    The infrastructure configuration for deploying the model to real-time inference.
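Since RealTimeInference is the only infrastructure type listed, a model infrastructure config reduces to the real-time settings; a dictionary sketch with placeholder instance values:

```python
# Sketch of InferenceExperimentModelInfrastructureConfigArgs as a dict.
# The instance type and count are illustrative placeholders.
infrastructure_config = {
    "infrastructure_type": "RealTimeInference",
    "real_time_inference_config": {
        "instance_type": "ml.m5.large",
        "instance_count": 1,
    },
}

assert infrastructure_config["infrastructure_type"] == "RealTimeInference"
```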

    InferenceExperimentModelInfrastructureConfigInfrastructureType, InferenceExperimentModelInfrastructureConfigInfrastructureTypeArgs

    RealTimeInference
    RealTimeInference
    InferenceExperimentModelInfrastructureConfigInfrastructureTypeRealTimeInference
    RealTimeInference
    RealTimeInference
    RealTimeInference
    RealTimeInference
    RealTimeInference
    REAL_TIME_INFERENCE
    RealTimeInference
    "RealTimeInference"
    RealTimeInference

    InferenceExperimentModelVariantConfig, InferenceExperimentModelVariantConfigArgs

    InfrastructureConfig Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentModelInfrastructureConfig
    The configuration for the infrastructure that the model will be deployed to.
    ModelName string
    The name of the Amazon SageMaker Model entity.
    VariantName string
    The name of the variant.
    InfrastructureConfig InferenceExperimentModelInfrastructureConfig
    The configuration for the infrastructure that the model will be deployed to.
    ModelName string
    The name of the Amazon SageMaker Model entity.
    VariantName string
    The name of the variant.
    infrastructureConfig InferenceExperimentModelInfrastructureConfig
    The configuration for the infrastructure that the model will be deployed to.
    modelName String
    The name of the Amazon SageMaker Model entity.
    variantName String
    The name of the variant.
    infrastructureConfig InferenceExperimentModelInfrastructureConfig
    The configuration for the infrastructure that the model will be deployed to.
    modelName string
    The name of the Amazon SageMaker Model entity.
    variantName string
    The name of the variant.
    infrastructure_config InferenceExperimentModelInfrastructureConfig
    The configuration for the infrastructure that the model will be deployed to.
    model_name str
    The name of the Amazon SageMaker Model entity.
    variant_name str
    The name of the variant.
    infrastructureConfig Property Map
    The configuration for the infrastructure that the model will be deployed to.
    modelName String
    The name of the Amazon SageMaker Model entity.
    variantName String
    The name of the variant.

    InferenceExperimentRealTimeInferenceConfig, InferenceExperimentRealTimeInferenceConfigArgs

    InstanceCount int
    The number of instances of the type specified by InstanceType.
    InstanceType string
    The instance type the model is deployed to.
    InstanceCount int
    The number of instances of the type specified by InstanceType.
    InstanceType string
    The instance type the model is deployed to.
    instanceCount Integer
    The number of instances of the type specified by InstanceType.
    instanceType String
    The instance type the model is deployed to.
    instanceCount number
    The number of instances of the type specified by InstanceType.
    instanceType string
    The instance type the model is deployed to.
    instance_count int
    The number of instances of the type specified by InstanceType.
    instance_type str
    The instance type the model is deployed to.
    instanceCount Number
    The number of instances of the type specified by InstanceType.
    instanceType String
    The instance type the model is deployed to.

    InferenceExperimentSchedule, InferenceExperimentScheduleArgs

    EndTime string
    The timestamp at which the inference experiment ended or will end.
    StartTime string
    The timestamp at which the inference experiment started or will start.
    EndTime string
    The timestamp at which the inference experiment ended or will end.
    StartTime string
    The timestamp at which the inference experiment started or will start.
    endTime String
    The timestamp at which the inference experiment ended or will end.
    startTime String
    The timestamp at which the inference experiment started or will start.
    endTime string
    The timestamp at which the inference experiment ended or will end.
    startTime string
    The timestamp at which the inference experiment started or will start.
    end_time str
    The timestamp at which the inference experiment ended or will end.
    start_time str
    The timestamp at which the inference experiment started or will start.
    endTime String
    The timestamp at which the inference experiment ended or will end.
    startTime String
    The timestamp at which the inference experiment started or will start.
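Both schedule fields are plain strings. A small sketch building the pair as ISO-8601 timestamps, assuming that is the format these string fields expect:

```python
from datetime import datetime, timedelta, timezone

# Hedged sketch: build a StartTime/EndTime window as ISO-8601 strings.
def make_schedule(duration_days: int) -> dict:
    """Return a schedule dict spanning the given number of days from now."""
    start = datetime.now(timezone.utc)
    end = start + timedelta(days=duration_days)
    return {
        "startTime": start.isoformat(timespec="seconds"),
        "endTime": end.isoformat(timespec="seconds"),
    }
```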

    InferenceExperimentShadowModeConfig, InferenceExperimentShadowModeConfigArgs

    ShadowModelVariants List<Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentShadowModelVariantConfig>
    List of shadow variant configurations.
    SourceModelVariantName string
    The name of the production variant, which takes all the inference requests.
    ShadowModelVariants []InferenceExperimentShadowModelVariantConfig
    List of shadow variant configurations.
    SourceModelVariantName string
    The name of the production variant, which takes all the inference requests.
    shadowModelVariants List<InferenceExperimentShadowModelVariantConfig>
    List of shadow variant configurations.
    sourceModelVariantName String
    The name of the production variant, which takes all the inference requests.
    shadowModelVariants InferenceExperimentShadowModelVariantConfig[]
    List of shadow variant configurations.
    sourceModelVariantName string
    The name of the production variant, which takes all the inference requests.
    shadow_model_variants Sequence[InferenceExperimentShadowModelVariantConfig]
    List of shadow variant configurations.
    source_model_variant_name str
    The name of the production variant, which takes all the inference requests.
    shadowModelVariants List<Property Map>
    List of shadow variant configurations.
    sourceModelVariantName String
    The name of the production variant, which takes all the inference requests.

    InferenceExperimentShadowModelVariantConfig, InferenceExperimentShadowModelVariantConfigArgs

    SamplingPercentage int
    The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
    ShadowModelVariantName string
    The name of the shadow variant.
    SamplingPercentage int
    The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
    ShadowModelVariantName string
    The name of the shadow variant.
    samplingPercentage Integer
    The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
    shadowModelVariantName String
    The name of the shadow variant.
    samplingPercentage number
    The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
    shadowModelVariantName string
    The name of the shadow variant.
    sampling_percentage int
    The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
    shadow_model_variant_name str
    The name of the shadow variant.
    samplingPercentage Number
    The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
    shadowModelVariantName String
    The name of the shadow variant.
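The two shadow-mode types above nest together: the top-level config names the production variant and holds a list of shadow variants. A sketch with a range check on the sampling percentage (the 1–100 bound is an assumption for illustration, not stated on this page):

```python
# Hedged sketch: combine ShadowModeConfig and ShadowModelVariantConfig
# into the nested dict shape shown above.
def make_shadow_mode_config(source_variant: str, shadow_variant: str,
                            sampling_percentage: int) -> dict:
    """Replicate a percentage of the production variant's requests."""
    # Assumed bound: a percentage of requests should fall in 1..100.
    if not 1 <= sampling_percentage <= 100:
        raise ValueError("sampling_percentage must be between 1 and 100")
    return {
        "sourceModelVariantName": source_variant,
        "shadowModelVariants": [{
            "shadowModelVariantName": shadow_variant,
            "samplingPercentage": sampling_percentage,
        }],
    }
```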

    InferenceExperimentStatus, InferenceExperimentStatusArgs

    Creating
    Creating
    Created
    Created
    Updating
    Updating
    Starting
    Starting
    Stopping
    Stopping
    Running
    Running
    Completed
    Completed
    Cancelled
    Cancelled
    InferenceExperimentStatusCreating
    Creating
    InferenceExperimentStatusCreated
    Created
    InferenceExperimentStatusUpdating
    Updating
    InferenceExperimentStatusStarting
    Starting
    InferenceExperimentStatusStopping
    Stopping
    InferenceExperimentStatusRunning
    Running
    InferenceExperimentStatusCompleted
    Completed
    InferenceExperimentStatusCancelled
    Cancelled
    Creating
    Creating
    Created
    Created
    Updating
    Updating
    Starting
    Starting
    Stopping
    Stopping
    Running
    Running
    Completed
    Completed
    Cancelled
    Cancelled
    Creating
    Creating
    Created
    Created
    Updating
    Updating
    Starting
    Starting
    Stopping
    Stopping
    Running
    Running
    Completed
    Completed
    Cancelled
    Cancelled
    CREATING
    Creating
    CREATED
    Created
    UPDATING
    Updating
    STARTING
    Starting
    STOPPING
    Stopping
    RUNNING
    Running
    COMPLETED
    Completed
    CANCELLED
    Cancelled
    "Creating"
    Creating
    "Created"
    Created
    "Updating"
    Updating
    "Starting"
    Starting
    "Stopping"
    Stopping
    "Running"
    Running
    "Completed"
    Completed
    "Cancelled"
    Cancelled
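The status values above can be grouped for polling logic. Which states are terminal is an assumption here for illustration, not something this page states:

```python
# Hedged sketch: classify the status strings listed above. Treating
# Completed and Cancelled as terminal is an assumption.
TERMINAL_STATUSES = {"Completed", "Cancelled"}
TRANSITIONAL_STATUSES = {"Creating", "Updating", "Starting", "Stopping"}

def is_terminal(status: str) -> bool:
    """Return True if the experiment has reached an assumed end state."""
    return status in TERMINAL_STATUSES
```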

    InferenceExperimentType, InferenceExperimentTypeArgs

    ShadowMode
    ShadowMode
    InferenceExperimentTypeShadowMode
    ShadowMode
    ShadowMode
    ShadowMode
    ShadowMode
    ShadowMode
    SHADOW_MODE
    ShadowMode
    "ShadowMode"
    ShadowMode

    Tag, TagArgs

    Key string
    The key name of the tag.
    Value string
    The value of the tag.
    Key string
    The key name of the tag.
    Value string
    The value of the tag.
    key String
    The key name of the tag.
    value String
    The value of the tag.
    key string
    The key name of the tag.
    value string
    The value of the tag.
    key str
    The key name of the tag.
    value str
    The value of the tag.
    key String
    The key name of the tag.
    value String
    The value of the tag.
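Tags are a flat list of key/value pairs. A small helper converting a plain mapping into that shape:

```python
# Hedged sketch: turn {key: value} into the key/value list shape above.
def to_tags(tags: dict) -> list:
    """Convert a mapping into a sorted list of tag dicts."""
    return [{"key": k, "value": v} for k, v in sorted(tags.items())]
```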

    Package Details

    Repository
    AWS Native pulumi/pulumi-aws-native
    License
    Apache-2.0