L2_NLP.ipynb does not work locally

Hi, it seems the libraries and/or the model have changed. Trying to run this notebook locally, I noticed I had to add !pip install torch at the beginning of the notebook.
Then, when trying to create the pipeline:
  chatbot = pipeline(task="conversational",
                     model="facebook/blenderbot-400M-distill")
I get the error message:
KeyError: "Unknown task conversational, available tasks are ['audio-classification', 'automatic-speech-recognition', 'depth-estimation', 'document-question-answering', 'feature-extraction', 'fill-mask', 'image-classification', 'image-feature-extraction', 'image-segmentation', 'image-to-image', 'image-to-text', 'mask-generation', 'ner', 'object-detection', 'question-answering', 'sentiment-analysis', 'summarization', 'table-question-answering', 'text-classification', 'text-generation', 'text-to-audio', 'text-to-speech', 'text2text-generation', 'token-classification', 'translation', 'video-classification', 'visual-question-answering', 'vqa', 'zero-shot-audio-classification', 'zero-shot-classification', 'zero-shot-image-classification', 'zero-shot-object-detection', 'translation_XX_to_YY']"
Any hints or suggestions?
Thanks!


Hey, it’s been a while, but did you manage to find a fix?

edit: seems to have been deprecated.

The conversational task was deprecated because it can be confusing as many models can be used for conversation, depending on your specific needs. I’d recommend checking out the text-generation task instead. This task is more general and will allow you to find models that are suitable for your conversational application.
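For example, a minimal sketch of the recommended text-generation task (assuming transformers is installed and the model can be downloaded; distilgpt2 is just a small illustrative model here, not the course’s model, which is a seq2seq model and needs text2text-generation instead):

```python
from transformers import pipeline

# Minimal sketch of the general-purpose "text-generation" task.
# "distilgpt2" is an illustrative small causal LM, chosen only to keep
# the download small; it is not the model used in the course notebook.
generator = pipeline(task="text-generation", model="distilgpt2")

# The pipeline takes a plain string and returns a list of dicts.
result = generator("Hello, how are you today?", max_new_tokens=20)
print(result[0]["generated_text"])
```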

The updated way to do this seems a lot less neat and more complicated tbh.

Hi,

You are right, the ConversationalPipeline class has been deprecated and its functionality has been merged into the TextGenerationPipeline.

However, taking into consideration what is explained in the video “Selection models” and
the link about the model in the notebook (L2_NLP)

Info about ‘blenderbot-400M-distill’

in my view, the easiest way to deal with this issue while keeping the same model is the following:

STEP1 Open the link ‘blenderbot-400M-distill’

STEP2 Go to the “Use this model” section and select Transformers, where you can find the code snippet
to use:

  # Use a pipeline as a high-level helper
  from transformers import pipeline

  pipe = pipeline("text2text-generation", model="facebook/blenderbot-400M-distill")

STEP3 If you want to use the same variable names as in the original notebook (L2_NLP), just replace pipe
with chatbot, etc.:

  chatbot = pipeline(task="text2text-generation", model="facebook/blenderbot-400M-distill")

STEP4 Run the code and you will get the same output as in the original notebook (L2_NLP).

Hoping this can help you.


Hi,

thanks for looking into this.

What you propose in STEP2 (using text2text-generation) works, but then the lab breaks again since the Conversation class seems to be gone, so the code:

from transformers import Conversation

does not run, and similarly the following lines…

Is there any replacement for the Conversation class?

Thanks and best regards,

Peter

Hi,

To answer your question: keep in mind that if you switch from the conversational task to the text2text-generation task with the Hugging Face pipeline, there are a few key changes you’ll need to make in your code.

Here’s how you can modify the code for text2text-generation:

Key Differences:

Task Change:
Instead of using task="conversational", you will use task="text2text-generation".

No Conversation Object:
The Conversation object is specific to the conversational task and manages multi-turn conversations. With text2text-generation, you only need to pass the user_message as input, and the model will generate a corresponding text output.
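A minimal sketch of that difference (the old pattern is shown in comments for contrast; downloading the model requires an internet connection):

```python
from transformers import pipeline

# Recreate the pipeline under the notebook's variable name, using the
# text2text-generation task instead of the removed "conversational" one.
chatbot = pipeline(task="text2text-generation",
                   model="facebook/blenderbot-400M-distill")

# Old, deprecated pattern (no longer works in recent transformers):
#   from transformers import Conversation
#   conversation = Conversation("Hi, how are you?")
#   conversation = chatbot(conversation)

# New pattern: pass the user message directly as a string.
output = chatbot("Hi, how are you?")
print(output[0]["generated_text"])
```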

Hoping this can help you.

PS: Let me know if you also need some suggestions for the remaining code “To include prior conversations in the LLM’s context”.
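As a teaser, one common way to include prior conversations with text2text-generation is to fold the past turns into the input string yourself, since there is no Conversation object tracking them anymore. A hypothetical sketch (the build_context helper and the turn format are my own illustration, not a transformers API):

```python
# Hypothetical helper: fold prior (user, bot) turns into one input string
# that can then be passed to the text2text-generation pipeline.
def build_context(history, user_message, max_turns=3):
    """Join the last `max_turns` exchanges plus the new user message."""
    turns = []
    for past_user, past_bot in history[-max_turns:]:
        turns.append(past_user)
        turns.append(past_bot)
    turns.append(user_message)
    return "\n".join(turns)

history = [("Hi, how are you?", "I'm doing well, thanks! And you?")]
prompt = build_context(history, "Great! Any winter activity ideas?")
print(prompt)
# The combined prompt would then go to the pipeline, e.g.:
#   response = chatbot(prompt)
```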
