📄️ dspy.HFModel
Initialize HFModel within your program with the desired model to load. Here's an example call:
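A minimal sketch of such a call, assuming `dspy` is installed and using `meta-llama/Llama-2-7b-hf` as a stand-in for whichever checkpoint you want to load:

```python
import dspy

# Load an open-source model from the Hugging Face Hub by name.
# The model identifier below is a placeholder -- substitute any
# causal LM you have access to; weights are downloaded on first use.
llama = dspy.HFModel(model="meta-llama/Llama-2-7b-hf")
```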
📄️ dspy.ChatModuleClient
Prerequisites
📄️ dspy.OllamaLocal
Adapted from documentation provided by https://github.com/insop
📄️ dspy.HFClientTGI
Prerequisites
📄️ dspy.TensorRTModel
TensorRT-LLM by NVIDIA is one of the most highly optimized inference engines for running open-source Large Language Models locally or in production.
📄️ dspy.HFClientVLLM
Setting up the vLLM Server
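As a rough sketch, launching a vLLM server typically looks like the following; the model name and port here are placeholders, not values prescribed by DSPy:

```shell
# Start vLLM's OpenAI-compatible API server.
# Replace the model identifier and port with your own choices.
python -m vllm.entrypoints.openai.api_server \
    --model mosaicml/mpt-7b \
    --port 8000
```

Once the server is running, the client in your DSPy program is pointed at the corresponding host and port.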