Need help with embedding bridgetower-large-itm-mlm-itc

Inside the `utils.py` file there is this function, which currently calls the Prediction Guard API. How do I change it so it runs locally (I'm working in VS Code)?

def bt_embedding_from_prediction_guard(prompt, base64_image):
    client = _getPredictionGuardClient()
    message = {"text": prompt,}
    if base64_image is not None and base64_image != "":
        if not isBase64(base64_image): 
            raise TypeError("image input must be in base64 encoding!")
        message['image'] = base64_image
    response = client.embeddings.create(
        model="bridgetower-large-itm-mlm-itc",
        input=[message]
    )
    return response['data'][0]['embedding']

The following is my modification, as far as I understand it. Is this right? If I'm wrong, please help me correct it.

from transformers import AutoModel
models = AutoModel.from_pretrained("BridgeTower/bridgetower-large-itm-mlm-itc")  
def bt_embedding_from_local(prompt, base64_image):
    message = {"text": prompt,}
    if base64_image is not None and base64_image != "":
        if not isBase64(base64_image): 
            raise TypeError("image input must be in base64 encoding!")
        message['image'] = base64_image
    response = embeddings.create(
        model=models,
        input=[message]
    )
    return response['data'][0]['embedding']
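For comparison, here is my best guess at a fully local version after reading the `transformers` docs. This is only a sketch under my assumptions: I use `BridgeTowerProcessor` and `BridgeTowerForContrastiveLearning` (a plain `AutoModel` has no `embeddings.create` method), I assume BridgeTower needs both text and an image, and I am not sure whether `cross_embeds` is the right output field to return. The function name `bt_embedding_local` and the `isBase64` helper are mine.

```python
import base64
import io

MODEL_ID = "BridgeTower/bridgetower-large-itm-mlm-itc"

# Loaded lazily on first call so importing this module stays cheap
# and the large weights are only downloaded when actually needed.
_processor = None
_model = None


def isBase64(s):
    """Round-trip check: decode, re-encode, and compare to the input."""
    try:
        return base64.b64encode(base64.b64decode(s)) == s.encode()
    except Exception:
        return False


def bt_embedding_local(prompt, base64_image):
    # Heavy deps are imported here so the helper above works without them.
    import torch
    from transformers import BridgeTowerProcessor, BridgeTowerForContrastiveLearning

    global _processor, _model
    if _processor is None:
        _processor = BridgeTowerProcessor.from_pretrained(MODEL_ID)
        _model = BridgeTowerForContrastiveLearning.from_pretrained(MODEL_ID)
        _model.eval()

    # Assumption: BridgeTower is a vision-language model, so unlike the
    # Prediction Guard endpoint it needs an image alongside the text.
    if not base64_image:
        raise ValueError("BridgeTower expects both text and an image input")
    if not isBase64(base64_image):
        raise TypeError("image input must be in base64 encoding!")

    image = Image = io.BytesIO(base64.b64decode(base64_image))
    from PIL import Image  # noqa: E402 -- optional dep, imported lazily
    pil_image = Image.open(io.BytesIO(base64.b64decode(base64_image))).convert("RGB")

    inputs = _processor(images=pil_image, text=prompt, return_tensors="pt")
    with torch.no_grad():
        outputs = _model(**inputs)

    # My guess: return the joint cross-modal embedding; the output also
    # exposes text_embeds and image_embeds if a single modality is wanted.
    return outputs.cross_embeds[0].tolist()
```

Does that look like the right direction, or is there a simpler way to mirror the `embeddings.create` behavior locally?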