Class ModelCache
Cache of model requests, keyed by prompt and model configuration.
Inheritance
object
ModelCache
Inherited Members
object.Equals(object)
object.Equals(object, object)
object.GetHashCode()
object.GetType()
object.MemberwiseClone()
object.ReferenceEquals(object, object)
object.ToString()
Namespace: DotnetPrompt.Abstractions.LLM
Assembly: DotnetPrompt.Abstractions.dll
Syntax
public class ModelCache
Constructors
ModelCache(IDistributedCache)
Declaration
public ModelCache(IDistributedCache cache)
Parameters
Type | Name | Description
---|---|---
IDistributedCache | cache | Distributed cache backing the model cache.
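A minimal construction sketch, assuming the in-memory IDistributedCache implementation from Microsoft.Extensions.Caching.Memory; any other IDistributedCache (Redis, SQL Server, etc.) could be supplied instead.

```csharp
using DotnetPrompt.Abstractions.LLM;
using Microsoft.Extensions.Caching.Distributed;
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.Options;

// Back the cache with an in-memory IDistributedCache for illustration.
IDistributedCache cache = new MemoryDistributedCache(
    Options.Create(new MemoryDistributedCacheOptions()));

var modelCache = new ModelCache(cache);
```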
Methods
GetPromptsAsync(string, IList<string>)
Get prompts that are already cached.
Declaration
public Task<(Dictionary<int, IList<Generation>> ExistingPrompts, string LLMString, IList<int> MissingPromptIdxs, IList<string> MissingPrompts)> GetPromptsAsync(string llmString, IList<string> prompts)
Parameters
Type | Name | Description
---|---|---
string | llmString | String identifying the model and its configuration.
IList<string> | prompts | Prompts to look up in the cache.
Returns
Type | Description
---|---
Task<(Dictionary<int, IList<Generation>> ExistingPrompts, string LLMString, IList<int> MissingPromptIdxs, IList<string> MissingPrompts)> | Cached generations keyed by prompt index, the LLM string, and the indexes and texts of prompts not found in the cache.
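A sketch of a cache lookup, continuing from the construction example above. The llmString value is a placeholder; its exact format is defined by the calling model wrapper, not by this class.

```csharp
using System.Collections.Generic;

// Placeholder identifier for the model and its configuration.
string llmString = "model=text-davinci-003;temperature=0.7";
IList<string> prompts = new List<string> { "Hello", "Summarize this text." };

var (existingPrompts, key, missingIdxs, missingPrompts) =
    await modelCache.GetPromptsAsync(llmString, prompts);

// existingPrompts maps a prompt's index to its cached generations;
// missingIdxs and missingPrompts identify the prompts that still need a model call.
```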
UpdateCache(Dictionary<int, IList<Generation>>, string, IList<int>, ModelResult, IList<string>)
Update the cache and get the LLM output.
Declaration
public Task<IDictionary<string, object>> UpdateCache(Dictionary<int, IList<Generation>> existingPrompts, string llmString, IList<int> missingPromptIndexes, ModelResult newResults, IList<string> prompts)
Parameters
Type | Name | Description
---|---|---
Dictionary<int, IList<Generation>> | existingPrompts | Generations already found in the cache, keyed by prompt index.
string | llmString | String identifying the model and its configuration.
IList<int> | missingPromptIndexes | Indexes of the prompts that were not cached.
ModelResult | newResults | Model results for the previously missing prompts.
IList<string> | prompts | The full list of prompts.
Returns
Type | Description
---|---
Task<IDictionary<string, object>> | The combined LLM output.
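Continuing the sketch, a hedged example of the read-through flow: look up cached generations, run the model only for the missing prompts, then write the new results back. The model.GenerateAsync call is hypothetical and stands in for whatever produces a ModelResult; it is not part of this class.

```csharp
var (existingPrompts, key, missingIdxs, missingPrompts) =
    await modelCache.GetPromptsAsync(llmString, prompts);

if (missingPrompts.Count > 0)
{
    // Hypothetical model call: anything that returns a ModelResult
    // for the uncached prompts will do here.
    ModelResult newResults = await model.GenerateAsync(missingPrompts);

    // Store the new generations and receive the combined LLM output.
    IDictionary<string, object> output = await modelCache.UpdateCache(
        existingPrompts, key, missingIdxs, newResults, prompts);
}
```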