Regular Prompts

Note

Additional details and the list of valid values for task_id can be found in the Task class.

Sample formats for model_parameters can be downloaded here (txt).

create_prompt(self, input_mode: str, name: str, task_id: str, prompt_details: PromptTemplate, model_id: str = None, model_type: str = None, deployment_id: str = None, description: str = None, container_type: str = None, container_id: str = None) → AIGovAssetUtilities

Create a Regular Prompt Template Asset.

Parameters:
  • input_mode (str) – The mode in which the prompt is being created. Currently supports “structured” and “freeflow” modes.

  • name (str) – The name of the prompt being created.

  • model_id (str) – The identifier of the model associated with the prompt.

  • task_id (str) – The task for which the prompt template is created; valid values are listed in the Task class.

  • prompt_details (PromptTemplate) – Holds information about model version details, prompt variables, instructions, input/output prefixes, and example data.

  • description (str) – (Optional) Description of the prompt to be created.

  • container_id (str) – (Optional) The identifier of the container in which to save the prompt.

Return Type:

AIGovAssetUtilities
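
The examples that follow assume an authenticated facts client is already available as facts_client and that PromptTemplate has been imported. The snippet below is a minimal setup sketch, not part of create_prompt itself; the API key and container values are placeholders, and the exact constructor arguments depend on your environment.

# Minimal setup sketch: the credentials and container details below are placeholders
from ibm_aigov_facts_client import AIGovFactsClient, PromptTemplate

facts_client = AIGovFactsClient(api_key="<your API key>",            # placeholder credential
                                container_type="project",            # assumed container type ("project" or "space")
                                container_id="<your container id>")  # placeholder project/space ID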

Example-1 (Creating a Structured prompt template asset):

prompt_template = PromptTemplate(model_version={"number": "2.0.0-rc.7", "tag": "tag", "description": "Description of the model version"},
                             input="Input text to be given",
                             prompt_variables={"text": "value"},
                             prompt_instruction="Your prompt instruction",
                             input_prefix="Your input prefix",
                             output_prefix="Your output prefix",
                             examples={"What is the capital of France{text}?": "The capital of France is Paris{text}.",
                                       "Who wrote '1984{text}'?": "George Orwell wrote '1984'{text}."},
                             model_parameters={"decoding_method": "greedy",
                                               "max_new_tokens": 2034,
                                               "min_new_tokens": 0,
                                               "random_seed": 0,
                                               "top_k": 0,
                                               "top_p": 0})

structured_prompt = facts_client.assets.create_prompt(input_mode="structured",
                                                      name="structured prompt sample",
                                                      task_id="summarization",
                                                      model_id="ibm/granite-13b-chat-v2",
                                                      description="My First structured prompt",
                                                      prompt_details=prompt_template)

Example-2 (Creating a Freeflow prompt template asset):

prompt_template = PromptTemplate(input="Input text to be given",
                             prompt_variables={"text": "value"},
                             model_parameters={"decoding_method": "greedy",
                                               "max_new_tokens": 2034,
                                               "min_new_tokens": 0,
                                               "random_seed": 0,
                                               "top_k": 0,
                                               "top_p": 0})

freeflow_prompt = facts_client.assets.create_prompt(input_mode="freeflow",
                                                    name="Freeflow prompt sample",
                                                    task_id="summarization",
                                                    model_id="ibm/granite-13b-chat-v2",
                                                    description="My First Freeflow prompt",
                                                    prompt_details=prompt_template)
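
To save the prompt to a specific project or space rather than the client's default container, the optional container_type and container_id arguments from the signature above can be passed explicitly. The sketch below is illustrative only; the container values are placeholders and "project" is an assumed container type.

# Sketch: passing the optional container arguments explicitly (placeholder values)
freeflow_prompt = facts_client.assets.create_prompt(input_mode="freeflow",
                                                    name="Freeflow prompt sample",
                                                    task_id="summarization",
                                                    model_id="ibm/granite-13b-chat-v2",
                                                    description="My First Freeflow prompt",
                                                    prompt_details=prompt_template,
                                                    container_type="project",          # assumed value, e.g. "project" or "space"
                                                    container_id="<your project id>")  # placeholder container ID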