Deploy AstrBot via Compshare
Compshare is UCloud's GPU compute rental and LLM API platform, offering compute resources for AI, deep learning, and scientific workloads.
AstrBot provides an Ollama + AstrBot one-click self-deployment image on Compshare, and also supports Compshare model APIs.
Default image spec: RTX 3090 24GB + Intel 16-core + 64GB RAM + 200GB system disk. Billing is pay-as-you-go, so please monitor your balance.
- Register a Compshare account via this link.
- Open the AstrBot image page and create an instance.
- After deployment, open JupyterLab from the console.
- In JupyterLab, create a new terminal and run:

```shell
cd
./astrbot_booter.sh
```

If startup succeeds, you should see output similar to:

```
(py312) root@f8396035c96d:/workspace# cd
./astrbot_booter.sh
Starting AstrBot...
Starting ollama...
Both services started in the background.
```

After startup, open http://<instance-public-ip>:6185 in your browser to access the AstrBot dashboard.
You can find the public IP in Console -> Basic Network (Public).
It may take around 30 seconds before the page becomes reachable.
Log in with username astrbot and password astrbot.
After logging in, you can reset your password and continue setup.
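If you are scripting the deployment, the roughly 30-second warm-up above can be handled with a small polling loop instead of refreshing the browser. A minimal sketch; `wait_for` is an illustrative helper, not part of AstrBot, and the URL placeholder must be filled in with your instance's public IP:

```python
import time
import urllib.error
import urllib.request


def wait_for(url: str, attempts: int = 10, delay: float = 5.0,
             probe=urllib.request.urlopen) -> bool:
    """Retry a GET until the URL answers or attempts run out."""
    for _ in range(attempts):
        try:
            probe(url, timeout=5)
            return True  # dashboard answered
        except (urllib.error.URLError, OSError):
            time.sleep(delay)  # not up yet, wait and retry
    return False


# Example (fill in your instance's public IP from the console):
# wait_for("http://<instance-public-ip>:6185", attempts=12)
```

The `probe` parameter exists so the loop can be exercised without network access; in normal use the defaults are enough.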
The instance imports Ollama-DeepSeek-R1-32B by default.
The image includes Ollama. You can pull any model and host it locally on the instance.
- Choose a model from Ollama Search.
- Connect to the instance terminal via SSH (from Compshare Console -> Instance List -> Console Command and Password).
- Run `ollama pull <model-name>` and wait for completion.
- In AstrBot Dashboard -> Providers, edit `ollama_deepseek-r1`, update the model name, and save.
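Before pointing the provider at the new model, you can confirm the pull succeeded by querying Ollama's local REST API. A sketch assuming Ollama's default port 11434; `model_names` is an illustrative helper:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local port


def model_names(tags_json: dict) -> list[str]:
    """Extract model names from Ollama's /api/tags response."""
    return [m["name"] for m in tags_json.get("models", [])]


# Example against a live instance:
# with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=5) as resp:
#     print(model_names(json.load(resp)))
```

A model listed here can be entered verbatim as the model name in the provider configuration.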
AstrBot supports direct access to model APIs provided by Compshare.
- Find the model you want at Compshare Model Center.
- In AstrBot Dashboard -> Providers, click + Add Provider, then choose Compshare. If Compshare is not listed, choose OpenAI-compatible access and set the API Base URL to https://api.modelverse.cn/v1. Enter the model name in the model configuration and save.
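To sanity-check the Compshare endpoint outside AstrBot, you can send a plain OpenAI-compatible chat request to the same base URL. A minimal stdlib sketch; the API key and model name below are placeholders you must replace with your own values from the Compshare console:

```python
import json
import urllib.request

API_BASE = "https://api.modelverse.cn/v1"  # base URL from the step above
API_KEY = "YOUR_API_KEY"                   # placeholder: your Compshare key
MODEL = "your-model-name"                  # placeholder: a model from the Model Center


def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible /chat/completions request (no network I/O)."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{API_BASE}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )


# To actually send (requires a valid key and network access):
# with urllib.request.urlopen(build_chat_request("Hello!"), timeout=30) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

If this request succeeds, the same base URL, key, and model name will work in the AstrBot provider form.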
In AstrBot Dashboard, click Chat and run /provider to view and switch your active provider.
Then send a normal message to test whether the model works.
You can follow the latest platform integration guides in the AstrBot Documentation. Open the docs and check the left sidebar under Messaging Platforms.
- Lark: Connect to Lark
- LINE: Connect to LINE
- DingTalk: Connect to DingTalk
- WeCom: Connect to WeCom
- WeChat Official Account: Connect to WeChat Official Account
- QQ Official Bot: Connect to QQ Official API
- KOOK: Connect to KOOK
- Slack: Connect to Slack
- Discord: Connect to Discord
- More methods: AstrBot Documentation
For more capabilities, see the AstrBot Documentation.