promptrefiner: Using GPT-4 to Create a Perfect System Prompt for Your Local LLM | by Amirarsalan Rajabi | Apr, 2024

Image created by DALL·E 3

In this tutorial, we will explore promptrefiner, a tiny Python tool I created that uses GPT-4 to help you craft a perfect system prompt for your local LLM.

The Python code for this article is available here:

https://github.com/amirarsalan90/promptrefiner.git
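Before diving in, here is the idea in a nutshell: you run your current system prompt against the local model, show GPT-4 both the prompt and the local model's (unsatisfying) output, and ask GPT-4 to rewrite the prompt. The sketch below is only a minimal illustration of that loop, assuming the official OpenAI Python client; the function name and the instruction wording are my own placeholders, not promptrefiner's actual API:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def refine_system_prompt(current_prompt: str, input_text: str, local_output: str) -> str:
    """Ask GPT-4 to rewrite a system prompt based on how the local LLM performed.

    Illustrative sketch only; promptrefiner's real interface may differ.
    """
    critique_request = (
        "I am prompting a small local LLM (Mistral 7b). Here is my current "
        f"system prompt:\n\n{current_prompt}\n\n"
        f"Given this input text:\n\n{input_text}\n\n"
        f"the local model produced:\n\n{local_output}\n\n"
        "Rewrite the system prompt so the local model follows the task more "
        "precisely. Return only the improved system prompt."
    )
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": critique_request}],
    )
    return response.choices[0].message.content
```

You would call this in a loop: test the returned prompt on the local model, inspect the output, and feed both back in until the result is satisfactory.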

Crafting an effective and detailed system prompt for your program can be a challenging process of trial and error, particularly when working with smaller LLMs such as a 7b language model. Unlike larger models, which can generally interpret and follow less detailed prompts, a smaller language model like Mistral 7b is more sensitive to the exact wording of your system prompt.

Let’s imagine a scenario where you’re working with a text that mentions several individuals and their contributions or roles. You want your local language model, say Mistral 7b, to distill this information into a list of Python strings, each pairing a name with the details associated with it in the text. Take the following paragraph as a case:

Screenshot of the input text. Image created by the author

For this example, I would like a prompt that reliably gets the LLM to return a string like the following:

“””
[“Elon Musk: Colonization of Mars”, “Stephen Hawking: Warnings about AI”, “Greta Thunberg: Environmentalism”, “Digital revolution: Technological advancement and existential risks”, “Modern dilemma: Balancing ambition with environmental responsibility”]
“””
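Since the model returns this list as plain text, you will typically want to parse it into an actual Python list. Here is a minimal sketch of how that could be done with ast.literal_eval (the helper below is illustrative, not part of promptrefiner):

```python
import ast

def parse_llm_list(raw_output: str) -> list[str]:
    """Parse an LLM response that should contain a Python-style list of strings."""
    # Trim whitespace and any surrounding quote characters the model may add
    cleaned = raw_output.strip().strip('"').strip()
    # Keep only the bracketed portion, in case the model adds extra prose
    start, end = cleaned.find("["), cleaned.rfind("]")
    if start == -1 or end == -1:
        raise ValueError("No Python list found in model output")
    parsed = ast.literal_eval(cleaned[start : end + 1])
    if not isinstance(parsed, list) or not all(isinstance(s, str) for s in parsed):
        raise ValueError("Model output is not a list of strings")
    return parsed

raw = '["Elon Musk: Colonization of Mars", "Stephen Hawking: Warnings about AI"]'
print(parse_llm_list(raw))
# ['Elon Musk: Colonization of Mars', 'Stephen Hawking: Warnings about AI']
```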

When we use an instruction fine-tuned language model (a language model that is fine-tuned for interactive conversations), the prompt usually consists of two parts: 1) a system prompt, and 2) a user prompt. For this example, consider the following system and user prompt:

Screenshot of the system + user prompt. Image created by the author

The first part of this prompt is my system prompt, which tells the LLM how to generate the answer, and the second part is my user prompt, which is…
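As a rough illustration of how such a system + user pair could be sent to a local Mistral 7b, here is a sketch assuming the model is served through an OpenAI-compatible API (for example with vLLM or the llama.cpp server); the server address, model name, and prompt wording are my placeholders, not the article's exact prompt:

```python
from openai import OpenAI

# Assumption: a local server (e.g. vLLM or llama.cpp) exposes an
# OpenAI-compatible API for Mistral 7b at this address.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

# Illustrative system prompt; the article's exact wording is in the screenshot.
system_prompt = (
    "You are an expert text analyst. Extract every person or concept mentioned "
    "in the user's text, and return ONLY a Python list of strings, each formatted "
    "as 'Name: short description of their role or contribution'."
)

user_prompt = "Elon Musk envisions the colonization of Mars, while ..."  # the input text

response = client.chat.completions.create(
    model="mistralai/Mistral-7B-Instruct-v0.2",
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ],
    temperature=0.0,  # deterministic output helps when strict formatting is expected
)
print(response.choices[0].message.content)
```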
