Storage System Notes #66
Idea 1
1. Contextual Retrieval: the model identifies the required context and invokes a function to fetch specific data from the knowledge graph. Example:
Response:
```json
{
  "rank": "Lieutenant",
  "relationships": [
    { "type": "serves_in", "target": 2 }
  ]
}
```
2. Structure this further by
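Idea 1 above could be sketched roughly like this, assuming a simple in-memory knowledge graph keyed by node id; the graph contents, field names, and `fetch_context` function are hypothetical, mirroring the example response:

```python
# Hypothetical sketch of Idea 1: the model asks for a node and the
# storage layer answers with its attributes and relationships.
# The graph shape below is an assumption based on the example above.

KNOWLEDGE_GRAPH = {
    1: {
        "rank": "Lieutenant",
        "relationships": [{"type": "serves_in", "target": 2}],
    },
    2: {"name": "Starfleet", "relationships": []},
}


def fetch_context(node_id: int) -> dict:
    """Return the stored facts for one node, or an empty record."""
    return KNOWLEDGE_GRAPH.get(node_id, {"relationships": []})
```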
I'll need to think of a retrieval and delivery method too. I might need to have all outputs processed by another model in the background and trigger some post-hooks.
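The background-processing idea could look something like this minimal post-hook pipeline; the hook registry and function names are assumptions for illustration, not an existing API:

```python
# Hypothetical post-hook pipeline: every model output passes through
# registered hooks (e.g. a background summariser) before delivery.
from typing import Callable

_post_hooks: list[Callable[[str], str]] = []


def register_post_hook(hook: Callable[[str], str]) -> None:
    """Add a hook that will transform every output."""
    _post_hooks.append(hook)


def process_output(text: str) -> str:
    """Run the raw model output through every registered hook in order."""
    for hook in _post_hooks:
        text = hook(text)
    return text
```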
Must-read goodness: https://cookbook.openai.com/examples/embedding_wikipedia_articles_for_search
Vector databases should be able to achieve this.
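A minimal sketch of what a vector database would do here: cosine-similarity search over an in-memory store. The toy embeddings and `top_match` helper are assumptions; a real system would embed text with a model and use a proper index:

```python
# Stand-in for a vector database: rank stored embeddings by cosine
# similarity to a query embedding and return the closest key.
import math


def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


def top_match(query: list[float], store: dict[str, list[float]]) -> str:
    """Return the stored key whose embedding is closest to the query."""
    return max(store, key=lambda k: cosine(query, store[k]))
```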
This:
```json
{
  "text": "The user is asking a simple mathematical question. I need to provide the correct answer.",
  "thought": true
}
```
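If outputs carry a `thought` flag like the record above, separating internal reasoning from user-visible text is a one-liner; the `visible_messages` helper is hypothetical, using the `text`/`thought` field names from that example:

```python
# Sketch: drop records flagged as internal thoughts, keep the rest.
# Field names "text" and "thought" follow the JSON example above.


def visible_messages(messages: list[dict]) -> list[str]:
    """Return only the user-visible text of a message list."""
    return [m["text"] for m in messages if not m.get("thought", False)]
```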
Idea 2
Idea 3
Similar to the above, and I'd still rather have core learned offline information stored, well, offline.
Using this as a scratch pad for ideas.
The context window is limited to 128000 tokens.
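Any storage design has to trim history to fit that limit. A minimal sketch, assuming token counts per message are already known and the oldest messages are dropped first (the `trim_history` helper is hypothetical):

```python
# Keep the most recent messages whose token counts fit the budget.
# Messages are (text, token_count) pairs; oldest are dropped first.
CONTEXT_LIMIT = 128_000


def trim_history(
    messages: list[tuple[str, int]], limit: int = CONTEXT_LIMIT
) -> list[tuple[str, int]]:
    """Return the newest suffix of messages whose total tokens fit limit."""
    kept: list[tuple[str, int]] = []
    total = 0
    for text, tokens in reversed(messages):
        if total + tokens > limit:
            break
        kept.append((text, tokens))
        total += tokens
    return list(reversed(kept))
```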