LangChain

Langchain docs

I knew I had to come back to it at some point.

LLM Chains API

Source code in github for HuggingFaceEndpoint

Here is the real API for LangChain

How do I navigate the LLMResult object in LangChain LLMs?

Search for: How do I navigate the LLMResult object in LangChain LLMs?

What is the PydanticOutputParser and how do I use it in LangChain?

Search for: What is the PydanticOutputParser and how to use it in LangChain?
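
A rough sketch of how the PydanticOutputParser seems to work, assuming a recent langchain_core with pydantic v2; the Flower model, its fields, and the question are made up for illustration:

from langchain_core.output_parsers import PydanticOutputParser
from langchain_core.prompts import PromptTemplate
from pydantic import BaseModel, Field

# Hypothetical target schema; the parser coerces the LLM output into this model
class Flower(BaseModel):
    name: str = Field(description="name of the flower")
    color: str = Field(description="typical color of the flower")

parser = PydanticOutputParser(pydantic_object=Flower)

# The parser supplies format instructions that get injected into the prompt
prompt = PromptTemplate(
    template="Answer the question.\n{format_instructions}\n{question}\n",
    input_variables=["question"],
    partial_variables={"format_instructions": parser.get_format_instructions()},
)

# chain = prompt | llm | parser    # llm: any LangChain LLM, e.g. the HF endpoint below
# flower = chain.invoke({"question": "Describe a rose."})    # -> Flower instance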


import hfdriver as aiutils
import baselog as log

from langchain_community.chat_models import ChatHuggingFace

# This import is important:
# it lets Pylance recognize the HuggingFaceEndpoint type used in the annotations below
from langchain_community.llms.huggingface_endpoint import HuggingFaceEndpoint

from langchain_core.outputs.llm_result import LLMResult

def getAHfLLM(token, endpoint):
    hf_ep = HuggingFaceEndpoint(
        endpoint_url=endpoint,
        huggingfacehub_api_token=token,
        task="text-generation"
    )
    return hf_ep

def getASampleHFEndPoint() -> HuggingFaceEndpoint:
    token = aiutils.getAPIKey()
    ep_url = aiutils.getSampleHFEndPoint()
    return getAHfLLM(token, ep_url)

def testEndPoint():
    llm = getASampleHFEndPoint()
    # generate() expects a list of prompt strings
    reply = llm.generate(["Are roses red?"])
    log.ph("Reply from LLM", f"{reply}")
    log.ph("Json from there", reply.json())
    getSingleTextFromHFEndPointReply(reply)
   
def getSingleTextFromHFEndPointReply(reply: LLMResult):
    # flatten() unpacks the nested generations into a list of LLMResults,
    # each carrying a single Generation
    output = reply.flatten()
    firstResult = output[0]
    log.ph("First", firstResult)
    firstGenList = firstResult.generations[0]
    log.ph("First Gen List", firstGenList)
    firstGen = firstGenList[0]
    log.ph("First Gen", firstGen)
    text = firstGen.text
    log.ph("First Gen text", text)
    return text


def localTest():
    log.ph1("Starting local test")
    testEndPoint()
    log.ph1("End local test")

if __name__ == '__main__':
    localTest()

Reply from LLM

***********************

generations=[[Generation(text="\n\nAre violets blue?\n\nIs love a game we play?\n\nOr is it something true?\n\nThese questions and more\n\nAre asked by love's devotees\n\nAs they seek to understand\n\nThe mysteries of love's decree\n\nIs love a force of nature?\n\nOr is it something we choose?\n\nIs it a flame that burns brightly,\n\nOr a gentle breeze that softly blows")]] llm_output=None run=[RunInfo(run_id=UUID('4e877cff-b330-4bb0-acef-437ac0d655bf'))]

Json from there

***********************

{"generations": [[{"text": "\n\nAre violets blue?\n\nIs love a game we play?\n\nOr is it something true?\n\nThese questions and more\n\nAre asked by love's devotees\n\nAs they seek to understand\n\nThe mysteries of love's decree\n\nIs love a force of nature?\n\nOr is it something we choose?\n\nIs it a flame that burns brightly,\n\nOr a gentle breeze that softly blows", "generation_info": null, "type": "Generation"}]], "llm_output": null, "run": [{"run_id": "4e877cff-b330-4bb0-acef-437ac0d655bf"}]}

First

***********************

generations=[[Generation(text="\n\nAre violets blue?\n\nIs love a game we play?\n\nOr is it something true?\n\nThese questions and more\n\nAre asked by love's devotees\n\nAs they seek to understand\n\nThe mysteries of love's decree\n\nIs love a force of nature?\n\nOr is it something we choose?\n\nIs it a flame that burns brightly,\n\nOr a gentle breeze that softly blows")]] llm_output=None run=None

First Gen List

***********************

[Generation(text="\n\nAre violets blue?\n\nIs love a game we play?\n\nOr is it something true?\n\nThese questions and more\n\nAre asked by love's devotees\n\nAs they seek to understand\n\nThe mysteries of love's decree\n\nIs love a force of nature?\n\nOr is it something we choose?\n\nIs it a flame that burns brightly,\n\nOr a gentle breeze that softly blows")]

First Gen

***********************

text="\n\nAre violets blue?\n\nIs love a game we play?\n\nOr is it something true?\n\nThese questions and more\n\nAre asked by love's devotees\n\nAs they seek to understand\n\nThe mysteries of love's decree\n\nIs love a force of nature?\n\nOr is it something we choose?\n\nIs it a flame that burns brightly,\n\nOr a gentle breeze that softly blows"


First Gen text
***********************


Are violets blue?

Is love a game we play?

Or is it something true?

These questions and more

Are asked by love's devotees

As they seek to understand

The mysteries of love's decree

Is love a force of nature?

Or is it something we choose?

Is it a flame that burns brightly,

Or a gentle breeze that softly blows

Guides: Custom LLMs
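
A minimal sketch of the custom-LLM pattern from that guide, as I understand it: subclass LLM and implement _call and _llm_type. The EchoLLM class and its behavior are made up for illustration.

from typing import Any, List, Optional
from langchain_core.language_models.llms import LLM

class EchoLLM(LLM):
    """Toy custom LLM that just echoes the prompt back."""

    @property
    def _llm_type(self) -> str:
        return "echo-llm"

    def _call(
        self,
        prompt: str,
        stop: Optional[List[str]] = None,
        **kwargs: Any,
    ) -> str:
        # A real implementation would call a model or an HTTP endpoint here
        return f"echo: {prompt}"

# llm = EchoLLM()
# print(llm.invoke("Are roses red?"))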

Chroma docs on LangChain

Chroma wrapper basics at langchain

  1. Docs
  2. Integrations
  3. Components
  4. Vector stores
  5. Chroma

Creating custom embeddings in Langchain

Search for: Creating custom embeddings in Langchain

Text embedding models are documented here

All of the available embedding models are explained here individually

  1. docs
  2. integrations
  3. text-embedding

Sentence transformers are here

Here is the LangChain base class source code: Embeddings in langchain_core/embeddings.py
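
My reading of that base class: a custom embedder only has to implement embed_documents and embed_query. A minimal sketch, assuming sentence-transformers is installed locally; the class name and model name are just examples.

from typing import List
from langchain_core.embeddings import Embeddings
from sentence_transformers import SentenceTransformer

class LocalSentenceEmbeddings(Embeddings):
    """Custom embedder that wraps a local sentence-transformers model."""

    def __init__(self, model_name: str = "all-MiniLM-L6-v2"):
        self.model = SentenceTransformer(model_name)

    def embed_documents(self, texts: List[str]) -> List[List[float]]:
        # One vector per input text
        return [vector.tolist() for vector in self.model.encode(texts)]

    def embed_query(self, text: str) -> List[float]:
        return self.model.encode(text).tolist()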

Other derived classes are here

Sample embedder: hugging face local

Langchain openai embedder

Fast Embedder

Fake embedder

Core understanding: Get started with Hugging Face embeddings using an API

Core understanding: OpenAI api embeddings

Core understanding: Chromadb embeddings

More on embedding models: sbert.net (a quick usage sketch follows the list below)

  1. something called sbert.net
  2. various pre-trained models
  3. from sentence_transformers import SentenceTransformer
  4. Various models with their perf metrics
  5. Semantic search
  6. multi-qa models
  7. Multi-lingual models
  8. Image and text
  9. ...and more
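
A quick semantic-search sketch with the sentence_transformers API from the list above; the model name and the sample texts are just examples.

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("multi-qa-MiniLM-L6-cos-v1")   # one of the multi-qa models

docs = [
    "Roses are red.",
    "Chroma is a vector database.",
    "LangChain chains LLM calls together.",
]
doc_embeddings = model.encode(docs, convert_to_tensor=True)
query_embedding = model.encode("What color are roses?", convert_to_tensor=True)

# Cosine similarity between the query and every document
scores = util.cos_sim(query_embedding, doc_embeddings)[0]
best = int(scores.argmax())
print(docs[best], float(scores[best]))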

Chroma as a database is documented here (a short store-and-query sketch follows this list)

  1. Under: docs/components/vector stores/chroma
  2. install
  3. store
  4. split
  5. Creating a Chroma db using a collection from Chromadb native
  6. Making Chroma a server
  7. Update and delete data
  8. Retrievers
  9. Similarity search
  10. Metadata filtering
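
A short store-and-query sketch for the store/similarity-search/metadata-filtering items above, assuming the langchain_community Chroma wrapper; the documents, persist directory, and model name are made up for illustration.

from langchain_core.documents import Document
from langchain_community.embeddings import HuggingFaceEmbeddings
from langchain_community.vectorstores import Chroma

docs = [
    Document(page_content="Roses are red.", metadata={"topic": "flowers"}),
    Document(page_content="Chroma is a vector database.", metadata={"topic": "vectordb"}),
]
embeddings = HuggingFaceEmbeddings(model_name="all-MiniLM-L6-v2")

# Store: embed the documents and persist them to disk
db = Chroma.from_documents(docs, embeddings, persist_directory="./chroma_db")

# Similarity search
hits = db.similarity_search("What is Chroma?", k=1)

# Metadata filtering
filtered = db.similarity_search("red", k=1, filter={"topic": "flowers"})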

How do I know whether a LangChain vector database has already been created and exists?

Search for: How do I know if a LangChain vector database has been created or exists?
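
One pragmatic answer I have seen, assuming the store is persisted on disk with Chroma: check whether the persist directory already holds data, and load instead of re-creating. The helper name and arguments below are mine, not a LangChain API.

import os
from langchain_community.vectorstores import Chroma

def load_or_create_db(persist_dir, docs, embeddings):
    # If the persist directory already holds data, reuse the existing store;
    # otherwise embed the documents and build a new one
    if os.path.isdir(persist_dir) and os.listdir(persist_dir):
        return Chroma(persist_directory=persist_dir, embedding_function=embeddings)
    return Chroma.from_documents(docs, embeddings, persist_directory=persist_dir)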

VectorStore base class in github

Chroma source code

LangChain AI-powered chat: questions and answers (these imports come together in the sketch after the list)

  1. from langchain.chains import LLMChain
  2. from langchain.prompts import PromptTemplate
  3. from vectorlib.database import DatabaseRepo
  4. from vectorlib.database import Database
  5. from langchain_core.vectorstores import VectorStore
  6. from langchain_core.runnables import RunnablePassthrough
  7. from langchain_core.output_parsers import StrOutputParser
  8. from langchain_core.language_models.llms import LLM
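
How I think these imports fit together as a retrieval QA chain, sketched in the LCEL style; llm and db are assumed to be something like the HuggingFaceEndpoint and Chroma store from the notes above, and the prompt text is made up.

from langchain_core.prompts import PromptTemplate
from langchain_core.runnables import RunnablePassthrough
from langchain_core.output_parsers import StrOutputParser

prompt = PromptTemplate.from_template(
    "Answer the question using only this context:\n{context}\n\nQuestion: {question}\n"
)

def format_docs(docs):
    # Collapse the retrieved Documents into one context string
    return "\n\n".join(doc.page_content for doc in docs)

retriever = db.as_retriever(search_kwargs={"k": 3})    # db: a vector store such as Chroma

chain = (
    {"context": retriever | format_docs, "question": RunnablePassthrough()}
    | prompt
    | llm                                              # llm: e.g. the HuggingFaceEndpoint above
    | StrOutputParser()
)

answer = chain.invoke("Are roses red?")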