Skills for production LLM inference serving, covering vLLM, TensorRT-LLM, llama.cpp, and SGLang. Use this plugin when deploying models for production inference.
Run these commands in Claude Code to add this plugin to your environment. The marketplace must be added before you can install its plugins.

Add the marketplace:
/plugin marketplace add tianhao909/AI-Research-SKILLs-cn

Install plugins:
/plugin
Plugin: inference-serving
Marketplace: ai-research-skills
Author: @tianhao909