
Add prompt converter#41

Closed
sainivedh wants to merge 9 commits into main from prompt-converter

Conversation


@sainivedh sainivedh commented Apr 1, 2025

Adds a tool to convert OpenAI-style prompts to the respective LLM prompt formats:

  • alpaca
  • falcon
  • mosaic
  • llama2_chat
  • claude
  • mistral
  • ollama
  • Hugging Face chat template
  • Gemini text + image

E.g.:

```python
from clarifai_datautils.text.prompt_factory import hf_chat_template

messages = [
    {"role": "system", "content": "You are a helpful assistant that answers in JSON."},
    {"role": "user", "content": "Create a user named Alice, who lives in 42, Wonderland Avenue."},
]

print(hf_chat_template("Qwen/QwQ-32B", messages, hf_token="xxx"))
```

Output:

```
<|im_start|>system
You are a helpful assistant that answers in JSON.<|im_end|>
<|im_start|>user
Create a user named Alice, who lives in 42, Wonderland Avenue.<|im_end|>
```

Comment thread clarifai_datautils/text/prompt_factory.py Fixed
@sainivedh sainivedh marked this pull request as ready for review April 1, 2025 09:44
@sainivedh sainivedh requested a review from luv-bansal April 1, 2025 10:17
Contributor

Copilot AI left a comment


Copilot reviewed 3 out of 3 changed files in this pull request and generated no comments.

Comments suppressed due to low confidence (1)

tests/test_prompt_factory.py:6

  • [nitpick] Consider renaming 'test_codellama_prompt_format' to 'test_llama_2_chat_pt_format' to clearly reflect the underlying function being tested.
def test_codellama_prompt_format():



# Llama2 prompt template
def llama_2_chat_pt(messages):

I don't think we need most of these old model prompt formats, since we're not going to use the older models. Keeping them just adds unnecessary clutter to the file.
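For context, the Llama 2 chat format these helpers target wraps each turn in `[INST]`/`[/INST]` markers with an optional `<<SYS>>` block on the first user turn. A minimal sketch of such a converter (function name and details are illustrative, not the PR's actual implementation):

```python
# Illustrative sketch: convert OpenAI-style messages into the Llama 2
# chat prompt format. Not the PR's code; names are hypothetical.

def llama2_chat_prompt(messages: list) -> str:
    system = ""
    prompt = ""
    for msg in messages:
        if msg["role"] == "system":
            # The system prompt is wrapped in <<SYS>> tags and prepended
            # to the next user turn.
            system = f"<<SYS>>\n{msg['content']}\n<</SYS>>\n\n"
        elif msg["role"] == "user":
            prompt += f"[INST] {system}{msg['content']} [/INST]"
            system = ""  # only the first user turn carries the system block
        elif msg["role"] == "assistant":
            prompt += f" {msg['content']} "
    return prompt
```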

raise e


def _gemini_vision_convert_messages(messages: list):

I see it just returns a single list of prompts and images; do we need this?

The Gemini SDK only supports a dict-based list of messages, so do you think we should instead have a util function that converts OpenAI messages to Gemini messages?

return prompt


def anthropic_pt(

Same with this. The latest Anthropic SDK only supports messages for inference, so should we have a util function that converts OpenAI messages to Anthropic messages?
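Sketch of what that util could look like: the Anthropic Messages API takes the system prompt as a separate top-level parameter, with only user/assistant turns in `messages` (names here are illustrative, not the PR's code):

```python
# Illustrative sketch: OpenAI -> Anthropic Messages API conversion.
# Returns (system_prompt, messages) since Anthropic passes the system
# prompt separately from the user/assistant turns.

def openai_to_anthropic(messages: list) -> tuple:
    system_parts = []
    converted = []
    for msg in messages:
        if msg["role"] == "system":
            system_parts.append(msg["content"])
        else:
            converted.append({"role": msg["role"], "content": msg["content"]})
    return "\n".join(system_parts), converted
```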

return content


def hf_chat_template(model: str,

This is good, but I think we could also add another function to convert OpenAI messages to HF transformers messages.

@ackizilkale
Contributor

@sainivedh @luv-bansal what are we going to do with this PR?

@sainivedh sainivedh closed this Apr 21, 2025
@sainivedh sainivedh deleted the prompt-converter branch April 21, 2025 18:07
@sainivedh
Author

@luv-bansal and I discussed that this should go in the SDK runner/utils, so we're keeping this on hold for now.


5 participants