Invite code for LangSmith

This course uses LangSmith (although it calls it LangChain Platform) to demonstrate visually what’s happening under the hood and encourages students to do the same. I tried signing up for LangSmith but it’s still in closed beta. Does anyone have an invite code I can use to get access to it?

Hi,

You don’t need LangSmith to use LangChain. To use it, just run “pip install langchain” in your terminal or notebook environment.
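For example, a minimal chain with no LangSmith involved looks something like this (just a sketch; it assumes an OpenAI API key is configured, and import paths vary between LangChain versions):

# Minimal LangChain example, no LangSmith required
# (assumes OPENAI_API_KEY is set; import paths vary by version)
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

llm = OpenAI(temperature=0)
prompt = PromptTemplate.from_template("Summarize in one sentence: {text}")
chain = LLMChain(llm=llm, prompt=prompt)

print(chain.run(text="LangChain chains together prompts, LLM calls, and tools."))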

Sam Witteveen has a good series of videos explaining how to use it while providing the Colab Code Notebook.

Moderator edit: You can find the video by searching on YouTube for “LangChain Basics Tutorial #1”.

Yes, I understand that LangChain doesn’t need LangSmith, but LangSmith provides visual insight into all the intermediate steps that LangChain goes through. That’s what I’m hoping to get.

I agree. I’m also on the waitlist for LangSmith. It looks like an excellent low-code tool for speeding up Gen AI development.

Browsing the DeepLearning.AI courses, I just found one that may help.


For those waiting for LangSmith, in the meantime you can pass verbose=True to your chain to get detailed information about all the intermediate steps:

from langchain.chains import ConversationalRetrievalChain

# llm, retriever, and memory are created earlier in the course notebook
qa = ConversationalRetrievalChain.from_llm(
    llm,
    retriever=retriever,
    memory=memory,
    verbose=True,  # log each intermediate step as the chain runs
)

While this verbose setting is quite helpful, I’m still not able to see the LLM’s responses in the intermediate steps. For that, I’ve set up a proxy that all HTTP requests go through: I pip-installed pip_system_certs (so Python trusts the proxy’s certificate from the Windows certificate store) and used Fiddler’s HTTPS capture feature to intercept requests and responses on a Windows machine. On a Mac, you should be able to do the same thing with a tool like Charles Proxy.
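If you don’t want a system-wide capture, you can also point Python’s HTTP stack at the proxy explicitly through environment variables. A rough sketch (127.0.0.1:8888 is Fiddler’s default port; adjust it to whatever your proxy listens on):

import os

# Route Python's HTTP(S) traffic through the local debugging proxy
# (8888 is Fiddler's default port; change it to match your setup)
os.environ["HTTP_PROXY"] = "http://127.0.0.1:8888"
os.environ["HTTPS_PROXY"] = "http://127.0.0.1:8888"

# With pip_system_certs installed, Python trusts the proxy's root
# certificate from the system store, so decrypted HTTPS capture works.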

Another cool thing you could try is LangChain’s debug flag. It prints a lot of intermediate output.

import langchain

# Global debug flag: prints the full inputs and outputs of every chain and LLM call
langchain.debug = True

debug is even better. Thank you!

This is how I’m configuring it:

from langchain.globals import set_debug, set_verbose

set_debug(True)    # full, raw intermediate inputs and outputs
set_verbose(True)  # more readable step-by-step logging