Deploy via Compshare

Compshare is UCloud's GPU compute rental and LLM API platform, offering compute resources for AI, deep learning, and scientific workloads.

AstrBot provides an Ollama + AstrBot one-click self-deployment image on Compshare, and also supports Compshare model APIs.

Use the Ollama + AstrBot One-Click Image

Default image spec: RTX 3090 (24 GB VRAM), 16-core Intel CPU, 64 GB RAM, and a 200 GB system disk. Billing is pay-as-you-go, so monitor your balance.

  1. Register a Compshare account via this link.
  2. Open the AstrBot image page and create an instance.
  3. After deployment, open JupyterLab from the console.
  4. In JupyterLab, create a new terminal and run:
bash
cd
./astrbot_booter.sh

If startup succeeds, you should see output similar to:

txt
(py312) root@f8396035c96d:/workspace# cd
./astrbot_booter.sh
Starting AstrBot...
Starting ollama...
Both services started in the background.

After startup, open http://<instance-public-ip>:6185 in your browser to access the AstrBot dashboard. You can find the public IP in Console -> Basic Network (Public).

It may take around 30 seconds before the page becomes reachable.
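Rather than refreshing the browser, you can poll the dashboard port until it answers. A minimal sketch, assuming the default port 6185; the IP below is a placeholder for your instance's public IP:

```python
import socket

def dashboard_url(public_ip: str, port: int = 6185) -> str:
    """Build the AstrBot dashboard URL for a given instance public IP."""
    return f"http://{public_ip}:{port}"

def is_port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(timeout)
        return sock.connect_ex((host, port)) == 0

# Example with a placeholder IP; on a real instance you might loop:
#   while not is_port_open("203.0.113.10", 6185): time.sleep(5)
print(dashboard_url("203.0.113.10"))
```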


Log in with the username astrbot and the password astrbot.

After logging in, you can reset your password and continue setup.

The instance imports Ollama-DeepSeek-R1-32B by default.

Use Other Models

Pull Models with Ollama

The image includes Ollama. You can pull any model and host it locally on the instance.

  1. Choose a model from Ollama Search.
  2. Connect to the instance terminal via SSH (from Compshare Console -> Instance List -> Console Command and Password).
  3. Run ollama pull <model-name> and wait for completion.
  4. In AstrBot Dashboard -> Providers, edit ollama_deepseek-r1, update the model name, and save.
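Besides the `ollama pull` CLI in step 3, Ollama also exposes an HTTP API on its default port 11434, where a pull is a POST to `/api/pull`. A sketch that only constructs the request, assuming Ollama's default local endpoint and the `model` body field used by recent Ollama API versions (nothing is sent here):

```python
import json

OLLAMA_BASE = "http://127.0.0.1:11434"  # Ollama's default listen address

def build_pull_request(model: str) -> tuple[str, bytes]:
    """Build the URL and JSON body for Ollama's /api/pull endpoint."""
    url = f"{OLLAMA_BASE}/api/pull"
    body = json.dumps({"model": model}).encode()
    return url, body

# "qwen2.5:7b" is just an example tag from Ollama Search.
url, body = build_pull_request("qwen2.5:7b")
print(url)
```

On the instance itself you could then send the request with `urllib.request` or `curl`; the CLI in step 3 does the same thing.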


Use Compshare Model API

AstrBot supports direct access to model APIs provided by Compshare.

  1. Find the model you want at Compshare Model Center.
  2. In AstrBot Dashboard -> Providers, click + Add Provider, then choose Compshare. If Compshare is not listed, choose OpenAI-compatible access and set API Base URL to https://api.modelverse.cn/v1. Enter the model name in model configuration and save.
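Because this route is OpenAI-compatible, any standard chat-completions client works against the same base URL. A minimal request-building sketch for verification; the API key and model name below are placeholders, and only the payload is constructed (no request is sent):

```python
import json

API_BASE = "https://api.modelverse.cn/v1"  # Compshare's OpenAI-compatible base URL

def build_chat_request(api_key: str, model: str, user_message: str):
    """Build URL, headers, and JSON body for an OpenAI-compatible chat completion."""
    url = f"{API_BASE}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",  # key from the Compshare console
        "Content-Type": "application/json",
    }
    body = {
        "model": model,  # use the exact model ID shown in the Compshare Model Center
        "messages": [{"role": "user", "content": user_message}],
    }
    return url, headers, json.dumps(body)

url, headers, payload = build_chat_request("sk-placeholder", "example-model-id", "Hello!")
print(url)
```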

Test

In the AstrBot Dashboard, click Chat and run /provider to view and switch the active provider.

Then send a regular message to verify that the model responds.


Connect to Messaging Platforms

You can follow the latest platform integration guides in the AstrBot Documentation. Open the docs and check the left sidebar under Messaging Platforms.

  • Lark: Connect to Lark
  • LINE: Connect to LINE
  • DingTalk: Connect to DingTalk
  • WeCom: Connect to WeCom
  • WeChat Official Account: Connect to WeChat Official Account
  • QQ Official Bot: Connect to QQ Official API
  • KOOK: Connect to KOOK
  • Slack: Connect to Slack
  • Discord: Connect to Discord
  • More methods: AstrBot Documentation

More Features

For more capabilities, see the AstrBot Documentation.
