
Model #2 (Open)

VinhDevNguyen wants to merge 6 commits into dev from model

Conversation

@VinhDevNguyen
Owner

No description provided.

Owner Author

@VinhDevNguyen left a comment


Good job on your first step, mate.

By the way, please help me fix the issues I've commented on in those files.

Many thanks.

Comment thread: asd.py (Outdated)
Owner Author


Please delete this file

Collaborator


Yes king

Comment thread: process/rag_data_test.py
Comment on lines +14 to +26
class DefChromaEF(OllamaEmbeddings):
    def __init__(self, ef):
        self.ef = ef
        self.model = "phi3"

    def embed_documents(self, texts):
        return self.ef(texts)

    def embed_query(self, query):
        return self.ef([query])[0]

chroma_client = chromadb.Client()
collection = chroma_client.create_collection(name='rag_data', embedding_function=DefChromaEF.embed_documents)
Owner Author


  • Please check out this link to learn more about dynamically adding more embeddings of new documents in Chroma DB.

  • We might need to move the embedding code block to a separate module instead of keeping everything in one place. This way, we can avoid re-embedding and storing the data in Chroma DB repeatedly.
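A minimal sketch of that second suggestion: a standalone module that wraps any embedding callable and caches vectors by text hash, so repeated runs don't re-embed the same documents before storing them in Chroma. `CachedEmbedder` and `embed_fn` are hypothetical names for illustration, not part of this PR:

```python
import hashlib


class CachedEmbedder:
    """Wraps an embedding callable and caches results by text hash,
    so repeated calls never re-embed the same document."""

    def __init__(self, embed_fn):
        self.embed_fn = embed_fn  # e.g. a Chroma or Ollama embedding function
        self._cache = {}

    def _key(self, text):
        return hashlib.sha256(text.encode("utf-8")).hexdigest()

    def embed_documents(self, texts):
        missing = [t for t in texts if self._key(t) not in self._cache]
        if missing:
            # Only texts not seen before hit the real embedder.
            for text, vec in zip(missing, self.embed_fn(missing)):
                self._cache[self._key(text)] = vec
        return [self._cache[self._key(t)] for t in texts]

    def embed_query(self, query):
        return self.embed_documents([query])[0]
```

Keeping this in its own module would let both the ingestion script and the query script share one embedder without duplicating work.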

Comment thread: process/rag_data_test.py

llm = Ollama(model="phi3")

batch_size = 10
Owner Author


What will happen if we change the batch size?
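To make the question concrete: `batch_size` controls how many documents go into each model call, so a larger value means fewer round-trips but more memory per request. A toy sketch (the `batched` helper is hypothetical, not from the PR):

```python
def batched(items, batch_size):
    """Yield successive slices of `items` with at most batch_size elements."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]


# 25 documents with batch_size=10 -> 3 calls of sizes 10, 10, 5;
# raising batch_size reduces the call count but grows each request.
docs = [f"doc-{n}" for n in range(25)]
sizes = [len(b) for b in batched(docs, 10)]  # [10, 10, 5]
```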

Comment thread: process/rag_main.py (Outdated)
Comment on lines +41 to +44
## Winner winner chicken dinner
response = retrieval_chain.invoke({"input": "how can langsmith help with testing?"})
print(":::ROUND 2:::")
print(response["answer"])
Owner Author


Can you help me create a function that allows input from the terminal so we can chat until the user decides to stop?
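One possible shape for that function, assuming the `retrieval_chain.invoke({"input": ...})` interface from the snippet above. `chat_loop` and its injectable `input_fn`/`output_fn` parameters are hypothetical, added so the loop can be exercised without a real terminal:

```python
def chat_loop(chain, input_fn=input, output_fn=print):
    """Prompt the user repeatedly and answer with the chain until they quit."""
    transcript = []
    while True:
        try:
            user_input = input_fn("You: ").strip()
        except EOFError:  # Ctrl-D also ends the chat
            break
        if user_input.lower() in {"exit", "quit", ""}:
            break
        response = chain.invoke({"input": user_input})
        answer = response["answer"]
        output_fn(f"Bot: {answer}")
        transcript.append((user_input, answer))
    return transcript
```

Called as `chat_loop(retrieval_chain)`, it keeps reading from the terminal until the user types `exit`, `quit`, or an empty line.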

Comment thread: process/test_llm.py (Outdated)
Owner Author


We can delete this file; you can keep working with it on your local machine.


Labels: none yet
Projects: none yet
2 participants