Integrate LangSmith with Langchain
· ☕ 3 min read · 🤖 Naresh Mehta

LangSmith is a platform that lets you track and analyze the performance of your LLM applications. It provides features such as tracing and debugging. LangSmith is available as a Python package installable with pip, and also as a Docker image, but the simplest way to start is the hosted platform at smith.langchain.com, which only requires an API key.

If you have followed my previous article, Use LocalAI server with Langchain, you should have a LocalAI server running and a LangChain setup to interact with it. You can download the example code from that article and run it to interact with the LocalAI server.
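With that setup in place, wiring LangSmith in is mostly configuration: LangChain reads a handful of environment variables and starts sending traces on its own. A minimal sketch follows; the API key and project name are placeholders, not real values:

```python
import os

# Placeholders -- substitute your own LangSmith API key and project name.
os.environ["LANGCHAIN_TRACING_V2"] = "true"            # turn on LangSmith tracing
os.environ["LANGCHAIN_API_KEY"] = "ls__your-api-key"   # key from smith.langchain.com
os.environ["LANGCHAIN_PROJECT"] = "localai-langsmith"  # traces are grouped per project

# Nothing else changes in your LangChain code: with these variables set,
# every chain/LLM invocation is traced to LangSmith automatically.
```

You can equally export the same variables in your shell before launching the script; the project name is arbitrary and only controls how runs are grouped in the LangSmith UI.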


Use LocalAI server with Langchain
· ☕ 7 min read · 🤖 Naresh Mehta

Now, you might have used Gemini, ChatGPT, DeepSeek, Claude, and the like for your daily activities or for asking the odd question. Most of these services ask for a subscription. If you are not tech-savvy, you might not be aware that it is very easy to run an AI server on your own machine, with no subscription and no cloud bill. The best part is that a normal PC or Mac will work just fine. Depending on your hardware there will of course be limits on the models you can use, but running an SLM (Small Language Model) or a smaller LLM (Large Language Model) is a piece of cake.