Tuned Model and BYOM (Bring Your Own Model)

AI Factsheets now supports importing custom foundation models (BYOM) and tuning existing models to improve performance on specific tasks. You can create a regular prompt with a tuned or custom model by passing the deployment ID and model type.

Note

  • Tuned models and BYOM are supported on CPD >= 5.1.

  • To use a tuned model or BYOM, specify both the model_type and deployment_id parameters.

  • When model_type and deployment_id are supplied, model_id is not required.
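The parameter rules above can be sketched as a small helper that assembles keyword arguments for create_prompt. This helper is hypothetical, for illustration only, and is not part of the facts client:

```python
def prompt_kwargs(name, task_id, model_id=None, model_type=None, deployment_id=None):
    """Assemble create_prompt keyword arguments.

    Hypothetical helper illustrating the note above: tuned_model/byom
    prompts need a deployment_id (and no model_id); standard prompts
    need a model_id.
    """
    if model_type in ("tuned_model", "byom"):
        # Tuned model / BYOM path: deployment_id is required, model_id is not.
        if deployment_id is None:
            raise ValueError("deployment_id is required for tuned_model/byom")
        return {"name": name, "task_id": task_id,
                "model_type": model_type, "deployment_id": deployment_id}
    # Standard path: identify the foundation model by model_id.
    if model_id is None:
        raise ValueError("model_id is required when model_type is not set")
    return {"name": name, "task_id": task_id, "model_id": model_id}
```

The resulting dictionary can then be unpacked into the call, for example facts.client.assets.create_prompt(input_mode="structured", prompt_details=prompt_template, **kwargs).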

create_prompt(self, input_mode: str, name: str, task_id: str, prompt_details: PromptTemplate, model_id: str = None, model_type: str = None, deployment_id: str = None, description: str = None, container_type: str = None, container_id: str = None)

Create a regular prompt template asset. Supported for CPD version >=4.7.0.

Parameters:
  • input_mode (str) – The mode in which the prompt is being created. Currently supports “structured” and “freeflow” modes.

  • name (str) – The name of the prompt being created.

  • task_id (str) – The task the prompt template is intended for (for example, summarization).

  • prompt_details (PromptTemplate) – Holds information about model version details, prompt variables, instructions, input/output prefixes, and example data.

  • model_id (str) – (Optional) The identifier of the model associated with the prompt.

  • model_type (str) – (Optional) The type of the model (e.g., tuned_model or byom).

  • deployment_id (str) – (Optional) The deployment ID of the tuned or custom model (required for tuned_model and byom).

  • description (str) – (Optional) A description of the prompt being created.

  • container_type (str) – (Optional) The type of container in which to save the prompt.

  • container_id (str) – (Optional) The ID of the container in which to save the prompt.

Returns:

An AIGovAssetUtilities object representing the created asset.

Return type:

AIGovAssetUtilities

Example 1: Creating a Structured Prompt Template with `tuned_model`

Create a structured prompt template using a tuned_model:

prompt_template = PromptTemplate(
    model_version={"number": "2.0.0-rc.7", "tag": "tag", "description": "Description of the model version"},
    input="Input text to be given",
    prompt_variables={"text": "value"},
    prompt_instruction="Your prompt instruction",
    input_prefix="Your input prefix",
    output_prefix="Your output prefix",
    examples={
        "What is the capital of France{text}?": "The capital of France is Paris{text}.",
        "Who wrote '1984{text}'?": "George Orwell wrote '1984'{text}."
    },
    model_parameters={
        "decoding_method": "greedy",
        "max_new_tokens": 2034,
        "min_new_tokens": 0,
        "random_seed": 0,
        "top_k": 0,
        "top_p": 0
    }
)
structured_prompt = facts.client.assets.create_prompt(
    input_mode="structured",
    name="Structured prompt sample",
    task_id="summarization",
    model_type="tuned_model",
    deployment_id="**********",
    description="My first structured prompt",
    prompt_details=prompt_template
)

Example 2: Creating a Structured Prompt Template with `BYOM`

Create a structured prompt template using BYOM:

prompt_template = PromptTemplate(
    model_version={"number": "2.0.0-rc.7", "tag": "tag", "description": "Description of the model version"},
    input="Input text to be given",
    prompt_variables={"text": "value"},
    prompt_instruction="Your prompt instruction",
    input_prefix="Your input prefix",
    output_prefix="Your output prefix",
    examples={
        "What is the capital of France{text}?": "The capital of France is Paris{text}.",
        "Who wrote '1984{text}'?": "George Orwell wrote '1984'{text}."
    },
    model_parameters={
        "decoding_method": "greedy",
        "max_new_tokens": 2034,
        "min_new_tokens": 0,
        "random_seed": 0,
        "top_k": 0,
        "top_p": 0
    }
)
structured_prompt = facts.client.assets.create_prompt(
    input_mode="structured",
    name="Structured prompt sample",
    task_id="summarization",
    model_type="byom",
    deployment_id="**********",
    description="My first structured prompt",
    prompt_details=prompt_template
)

Example 3: Creating a Freeflow Prompt Template with `tuned_model`

Create a freeflow prompt template using a tuned_model:

prompt_template = PromptTemplate(
    input="Input text to be given",
    prompt_variables={"text": "value"},
    model_parameters={
        "decoding_method": "greedy",
        "max_new_tokens": 2034,
        "min_new_tokens": 0,
        "random_seed": 0,
        "top_k": 0,
        "top_p": 0
    }
)
freeflow_prompt = facts.client.assets.create_prompt(
    input_mode="freeflow",
    name="Freeflow prompt sample",
    task_id="summarization",
    model_type="tuned_model",
    deployment_id="**********",
    description="My first freeflow prompt",
    prompt_details=prompt_template
)

Example 4: Creating a Freeflow Prompt Template with `BYOM`

Create a freeflow prompt template using BYOM:

prompt_template = PromptTemplate(
    input="Input text to be given",
    prompt_variables={"text": "value"},
    model_parameters={
        "decoding_method": "greedy",
        "max_new_tokens": 2034,
        "min_new_tokens": 0,
        "random_seed": 0,
        "top_k": 0,
        "top_p": 0
    }
)
freeflow_prompt = facts.client.assets.create_prompt(
    input_mode="freeflow",
    name="Freeflow prompt sample",
    task_id="summarization",
    model_type="byom",
    deployment_id="**********",
    description="My first freeflow prompt",
    prompt_details=prompt_template
)