Conversation
Copilot reviewed 3 out of 3 changed files in this pull request and generated no comments.
Comments suppressed due to low confidence (1)
tests/test_prompt_factory.py:6
- [nitpick] Consider renaming 'test_codellama_prompt_format' to 'test_llama_2_chat_pt_format' to clearly reflect the underlying function being tested.
def test_codellama_prompt_format():
# Llama2 prompt template
def llama_2_chat_pt(messages):
I don't think we need most of these old model prompt formats, since we're not going to use the older models. Keeping them just adds unnecessary clutter to the file.
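For context on what `llama_2_chat_pt` does, here is a minimal sketch of the Llama-2 chat prompt format (the `[INST]` / `<<SYS>>` wrapping from Meta's reference implementation). This is a stand-in for illustration, not the exact code in this PR:

```python
# Minimal sketch of a Llama-2 chat prompt builder. Assumes OpenAI-style
# messages: a list of {"role": ..., "content": ...} dicts.
def llama_2_chat_pt(messages):
    prompt = ""
    system = ""
    for msg in messages:
        if msg["role"] == "system":
            # The system text is wrapped in <<SYS>> tags inside the
            # first [INST] block.
            system = f"<<SYS>>\n{msg['content']}\n<</SYS>>\n\n"
        elif msg["role"] == "user":
            prompt += f"[INST] {system}{msg['content']} [/INST]"
            system = ""  # only the first user turn carries the system prompt
        elif msg["role"] == "assistant":
            prompt += f" {msg['content']} "
    return prompt
```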
raise e
def _gemini_vision_convert_messages(messages: list):
I see it just returns a single list of prompts and images. Do we need this?
The Gemini SDK only supports a dict of message lists, so do you think we should instead have a util function that converts OpenAI messages to Gemini messages?
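The util function suggested above could be sketched like this. It assumes the `role`/`parts` message shape used by the google-generativeai SDK (with `"model"` in place of OpenAI's `"assistant"`); the exact shape should be verified against the SDK docs before adopting it:

```python
# Sketch: convert OpenAI-style chat messages into Gemini-style
# {"role": "user"|"model", "parts": [...]} dicts.
def openai_to_gemini_messages(messages):
    gemini_messages = []
    for msg in messages:
        # Gemini uses "model" where OpenAI uses "assistant"; system
        # messages are folded into the first user turn here.
        role = "model" if msg["role"] == "assistant" else "user"
        if gemini_messages and gemini_messages[-1]["role"] == role:
            # Gemini expects strictly alternating roles, so merge
            # consecutive same-role turns (e.g. system + user).
            gemini_messages[-1]["parts"].append(msg["content"])
        else:
            gemini_messages.append({"role": role, "parts": [msg["content"]]})
    return gemini_messages
```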
return prompt
def anthropic_pt(
Same with this. Anthropic's latest SDK only supports messages for inference, so should we have a util function that converts OpenAI messages to Anthropic messages?
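A sketch of that converter, assuming the Anthropic Messages API shape where the system prompt is passed separately from the `messages` list (the name `openai_to_anthropic_messages` is hypothetical):

```python
# Sketch: split OpenAI-style messages into (system, messages) for the
# Anthropic Messages API, which takes the system prompt as its own
# parameter and only accepts user/assistant roles in the message list.
def openai_to_anthropic_messages(messages):
    system_parts = []
    anthropic_messages = []
    for msg in messages:
        if msg["role"] == "system":
            system_parts.append(msg["content"])
        else:
            anthropic_messages.append(
                {"role": msg["role"], "content": msg["content"]}
            )
    return "\n".join(system_parts), anthropic_messages
```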
return content
def hf_chat_template(model: str,
This is good, but I think we could also add another function to convert OpenAI messages to HF transformers messages.
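OpenAI chat messages are already close to the `{"role", "content"}` dicts that transformers' `tokenizer.apply_chat_template` accepts, so the suggested helper mostly needs to normalize them. A sketch (the helper name is hypothetical; the `apply_chat_template` call is left commented to keep the snippet self-contained):

```python
# Sketch: normalize OpenAI-style messages for use with HF chat templates.
def openai_to_hf_messages(messages):
    hf_messages = []
    for msg in messages:
        # Drop OpenAI-specific keys (name, function_call, ...) and skip
        # turns with no text content.
        if msg.get("content"):
            hf_messages.append({"role": msg["role"], "content": msg["content"]})
    return hf_messages

# Usage with transformers (assumed API, per the transformers docs):
# from transformers import AutoTokenizer
# tok = AutoTokenizer.from_pretrained(model)
# prompt = tok.apply_chat_template(openai_to_hf_messages(messages),
#                                  tokenize=False, add_generation_prompt=True)
```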
@sainivedh @luv-bansal what are we going to do with this PR?
@luv-bansal and I discussed that this should live in the SDK runner/utils, so we're keeping this on hold for now.
Support OpenAI to respective LLM prompt converter tool

Supported formats:
- alpaca
- falcon
- mosaic
- llama2_chat
- claude
- mistral
- ollama
- Huggingface chat template
- Gemini text + image

Eg: