NRP-Managed LLMs
Accessing NRP LiteLLM
We use the LiteLLM proxy to provide access to the LLMs we run on Nautilus.
Start by logging in to the NRP LiteLLM UI. If you're coming from an .edu domain, you'll be placed into the NRP team unless you're on a special SDSC list, in which case you'll be placed in the SDSC team. Otherwise, request a team assignment in Matrix.
Once you're a member of a team, you can create tokens and access the models.
To create a token, open the Virtual Keys tab and create a new token for API access. The UI also includes examples of using the API.
The base URL for all models is https://llm.nrp-nautilus.io.
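The proxy exposes an OpenAI-compatible API, so any OpenAI-style client pointed at the base URL above should work. A minimal stdlib-only sketch, assuming the standard `/v1/chat/completions` endpoint and a virtual key from the Virtual Keys tab (the `sk-...` key below is a placeholder):

```python
import json
import urllib.request

BASE_URL = "https://llm.nrp-nautilus.io"
API_KEY = "sk-..."  # your LiteLLM virtual key (placeholder)

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request for the LiteLLM proxy."""
    payload = {
        "model": model,  # a LiteLLM name from the table below, e.g. "gemma3"
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# With a valid key, send the request and print the reply:
# with urllib.request.urlopen(build_chat_request("gemma3", "Hello!")) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

The same request works with the official `openai` Python SDK by setting `base_url=BASE_URL` and `api_key=API_KEY` on the client.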
LiteLLM name | Model | Features |
---|---|---|
gemma3 | google/gemma-3-27b-it | agentic AI workflows, 128K tokens, speaks 140+ languages, hit 1338 ELO on LMArena |
llama3 | meta-llama/Llama-3.2-90B-Vision-Instruct | multimodal (vision), 128K tokens |
llama3-sdsc | meta-llama/Llama-3.3-70B-Instruct | 8 languages, 128K tokens, tool use |
DeepSeek-R1-Distill-Qwen-32B | deepseek-ai/DeepSeek-R1-Distill-Qwen-32B | reasoning (distilled from DeepSeek-R1) |
embed-mistral | intfloat/e5-mistral-7b-instruct | embeddings |
gorilla | gorilla-llm/gorilla-openfunctions-v2 | function calling |
llava-onevision | llava-hf/llava-onevision-qwen2-7b-ov-hf | vision |
olmo | allenai/OLMo-2-0325-32B-Instruct | open source |
phi3 | microsoft/Phi-3.5-vision-instruct | vision |
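The embeddings model in the table is reached through the OpenAI-style embeddings endpoint rather than chat completions. A stdlib-only sketch, assuming the standard `/v1/embeddings` path and a placeholder virtual key:

```python
import json
import urllib.request

BASE_URL = "https://llm.nrp-nautilus.io"
API_KEY = "sk-..."  # your LiteLLM virtual key (placeholder)

def build_embedding_request(texts: list[str]) -> urllib.request.Request:
    """Build an OpenAI-style embeddings request for the embed-mistral model."""
    payload = {"model": "embed-mistral", "input": texts}
    return urllib.request.Request(
        f"{BASE_URL}/v1/embeddings",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# With a valid key, each input string comes back as one embedding vector:
# with urllib.request.urlopen(build_embedding_request(["hello"])) as resp:
#     vector = json.loads(resp.read())["data"][0]["embedding"]
```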
The following apps provide access to the models without requiring additional tokens.
NRP OpenWebUI
https://nrp-openwebui.nrp-nautilus.io - OpenWebUI chat. Deprecated in favor of LibreChat.