How to do chain-of-thought reasoning on top of a vector store?

How can I apply chain-of-thought reasoning like this:

user_message = f"""
by how much is the BlueWave Chromebook more expensive \
than the TechPro Desktop"""

messages = [
    {'role': 'system',
     'content': system_message},
    {'role': 'user',
     'content': f"{delimiter}{user_message}{delimiter}"},
]

response = get_completion_from_messages(messages)
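
For context, `system_message`, `delimiter`, and `get_completion_from_messages` are the chain-of-thought setup from the prompt-engineering course notebook. Roughly (this is my paraphrase, the exact wording may differ):

delimiter = "####"
system_message = f"""
Follow these steps to answer the customer query. \
The customer query will be delimited with {delimiter}.

Step 1: Decide which products the user is asking about.
Step 2: Look up the relevant product information, including prices.
Step 3: Reason step by step to work out the answer.

Show your reasoning for each step before giving the final answer.
"""

import openai  # assumes openai.api_key is set elsewhere

def get_completion_from_messages(messages, model="gpt-3.5-turbo", temperature=0):
    # thin wrapper around the chat completions endpoint (openai<1.0 style)
    response = openai.ChatCompletion.create(
        model=model,
        messages=messages,
        temperature=temperature,
    )
    return response.choices[0].message["content"]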

How do I apply that same pattern to the output of a vector store, where the docs say to interact with the index like this?

    from langchain.chat_models import ChatOpenAI
    from langchain.embeddings import OpenAIEmbeddings
    from langchain.indexes import VectorstoreIndexCreator
    from langchain.vectorstores import DocArrayInMemorySearch

    # loader is a document loader (e.g. a CSVLoader) created earlier
    index = VectorstoreIndexCreator(
        vectorstore_cls=DocArrayInMemorySearch,
        embedding=OpenAIEmbeddings(),
    ).from_loaders([loader])

    llm = ChatOpenAI(temperature=0.2, max_tokens=5000, model="gpt-4", batch_size=5)

    response = index.query(my_query, llm=llm)
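
In other words, is the right approach just to pull the matching documents out of the vector store myself and splice them into the same messages structure? Something like the sketch below, where k=3 and the "Relevant product info" framing are my own guesses, and it reuses get_completion_from_messages from above:

# fetch the documents most similar to the query from the underlying vector store
docs = index.vectorstore.similarity_search(my_query, k=3)
context = "\n\n".join(doc.page_content for doc in docs)

# feed the retrieved context into the same chain-of-thought message format
messages = [
    {'role': 'system',
     'content': system_message},
    {'role': 'user',
     'content': f"{delimiter}{my_query}{delimiter}\n\nRelevant product info:\n{context}"},
]

response = get_completion_from_messages(messages)

Or is there a cleaner way to get the chain-of-thought system prompt into the index.query() call itself, e.g. via a retrieval chain with a custom prompt?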