Default LangChain prompt templates

It seems to me that the default prompt templates in LangChain are very much designed for the OpenAI API and don't work so well with other models.
I'm talking about document retrieval chains and chat-with-your-data types of applications. Those chains use templates under the hood.
I'm playing with Llama to run locally or in a private network, and as I understand it, the Llama instruction format is quite different from the OpenAI instruction format, so the chains mentioned above frequently fail for that reason.
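For context, this is roughly the kind of workaround I've been experimenting with: overriding the chain's default prompt with a Llama-2-style `[INST]` template. It's only a rough sketch; `llm` and `retriever` stand in for whatever local model and vector store retriever you have.

```python
from langchain.prompts import PromptTemplate
from langchain.chains import RetrievalQA

# Llama 2 chat-style [INST]/<<SYS>> wrapper instead of LangChain's
# default OpenAI-flavoured QA prompt.
llama_template = """[INST] <<SYS>>
You are a helpful assistant. Answer using only the context below.
<</SYS>>

Context:
{context}

Question: {question} [/INST]"""

llama_prompt = PromptTemplate(
    template=llama_template,
    input_variables=["context", "question"],
)

# Pass the custom prompt to the underlying "stuff" documents chain
# via chain_type_kwargs so it replaces the built-in template.
qa_chain = RetrievalQA.from_chain_type(
    llm=llm,              # placeholder: e.g. a LlamaCpp or HuggingFacePipeline LLM
    chain_type="stuff",
    retriever=retriever,  # placeholder: your vector store retriever
    chain_type_kwargs={"prompt": llama_prompt},
)
```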

Is there some more intuitive way to design templates that are compatible across models?


Please provide the following details:

  1. OpenAI model used.
  2. Model used on your local machine.
  3. Details about the prompt and responses received.