Model #2
Conversation
VinhDevNguyen
left a comment
Good job on your first step, mate.
Please help me fix the issues I've commented on in these files.
Many thanks.
Please delete this file
```python
class DefChromaEF(OllamaEmbeddings):
    def __init__(self, ef):
        self.ef = ef
        self.model = "phi3"

    def embed_documents(self, texts):
        return self.ef(texts)

    def embed_query(self, query):
        return self.ef([query])[0]


chroma_client = chromadb.Client()
collection = chroma_client.create_collection(name='rag_data', embedding_function=DefChromaEF.embed_documents)
```
- Please check out this link to learn more about dynamically adding embeddings for new documents in Chroma DB.
- We might need to move the embedding code block into a separate module instead of keeping everything in one place. That way we can avoid re-embedding and re-storing the data in Chroma DB on every run.
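The "embed once, reuse later" suggestion above could be sketched roughly like this. `EmbeddingStore` and `fake_embed` are hypothetical names, and a plain dict stands in for the Chroma collection so the sketch stays self-contained; with the real client you would persist via something like `chroma_client.get_or_create_collection(...)` and `collection.add(...)` instead.

```python
# embeddings_store.py -- hypothetical module name; a minimal sketch of
# skipping documents that were already embedded. A dict stands in for the
# Chroma collection.

class EmbeddingStore:
    def __init__(self, embed_fn):
        self.embed_fn = embed_fn   # e.g. the embed_documents function from the PR
        self._cache = {}           # doc_id -> embedding

    def add_documents(self, docs):
        """Embed only the documents we have not seen before; return how many were embedded."""
        new = {doc_id: text for doc_id, text in docs.items() if doc_id not in self._cache}
        if new:
            vectors = self.embed_fn(list(new.values()))
            self._cache.update(zip(new.keys(), vectors))
        return len(new)


if __name__ == "__main__":
    def fake_embed(texts):
        # stand-in for the real embedding model
        return [[float(len(t))] for t in texts]

    store = EmbeddingStore(fake_embed)
    print(store.add_documents({"d1": "hello", "d2": "world"}))  # embeds both -> 2
    print(store.add_documents({"d1": "hello", "d3": "again"}))  # only d3 is new -> 1
```

On a second run against a persisted collection, `add_documents` would embed nothing, which is the point of splitting this into its own module.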
```python
llm = Ollama(model="phi3")

batch_size = 10
```
What will happen if we change the batch size?
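For context on the question above: assuming `batch_size` controls how many documents are sent to the embedding model per call (an assumption about how this PR uses it), a larger value means fewer round-trips but more memory and longer individual requests, while a smaller value does the opposite. A minimal sketch of that chunking:

```python
# Hypothetical illustration of what batch_size changes: texts are split into
# chunks of at most batch_size items, one embedding call per chunk.

def batched(items, batch_size):
    """Yield successive chunks of at most batch_size items."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]


texts = [f"doc {i}" for i in range(25)]
print([len(chunk) for chunk in batched(texts, 10)])  # -> [10, 10, 5], i.e. 3 calls
print([len(chunk) for chunk in batched(texts, 5)])   # -> 5 chunks of 5, i.e. 5 calls
```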
```python
## Winner winner chicken dinner
response = retrieval_chain.invoke({"input": "how can langsmith help with testing?"})
print(":::ROUND 2:::")
print(response["answer"])
```
Can you help me create a function that allows input from the terminal so we can chat until the user decides to stop?
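A minimal sketch of the requested terminal loop. It assumes `retrieval_chain` is the chain built in this PR; `input_fn`/`output_fn` are injectable so the loop is easy to test, and the stop words (`exit`/`quit` or an empty line) are an assumption.

```python
# Hypothetical chat loop around the PR's retrieval_chain: keep prompting the
# user until they type exit/quit (or an empty line).

def chat_loop(chain, input_fn=input, output_fn=print):
    while True:
        question = input_fn("You: ").strip()
        if question.lower() in {"", "exit", "quit"}:
            output_fn("Bye!")
            break
        response = chain.invoke({"input": question})
        output_fn("Bot: " + response["answer"])
```

Called as `chat_loop(retrieval_chain)`, this would replace the single hard-coded `invoke` above.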
We can delete this file from the PR; you can keep working with it on your local machine.