Adding Memory to AzureChatOpenAI

I’m wondering how I can add memory to a model (e.g. AzureChatOpenAI).
In the code below, I’m attaching the memory to the model using the bind(memory=memory) method. However, when I call invoke() on the model, it does not seem to pick up any information from the memory context. Any idea what may be going wrong and how to fix it?

from langchain_community.chat_models import AzureChatOpenAI
from langchain.memory import ConversationBufferMemory
llm = AzureChatOpenAI(
    deployment_name=config.azure_openai_deployment_name,
    openai_api_version=config.openai.api_version,
    openai_api_key=config.openai.api_key,
    azure_endpoint=config.openai.api_base,
    temperature=0,
)
memory = ConversationBufferMemory(memory_key="chat_history")
memory.save_context({"input": "John Doe is an AI Developer. Michael is John Doe's manager"},
                    {"output": "Ok, I'll remember that"})
memory.save_context({"input": "John Doe answers 1 + 1 as 11"},
                    {"output": "Ok, I'll remember that"})
llm.bind(memory=memory)
# pprint(deep_print_object(llm))
response = llm.invoke(input="Who is John Doe?")
print(response)

I’m expecting something like ‘John Doe is an AI Developer’. However, the code gives me the answer below:
‘John Doe is a name used in the United States as a placeholder for an anonymous or unknown person. … Jane Doe.’