πŸ“¦ agentscope-ai / CoPaw


CoPaw


[Documentation] [δΈ­ζ–‡ README]

CoPaw Logo

Works for you, grows with you.

Your Personal AI Assistant; easy to install, deploy on your own machine or on the cloud; supports multiple chat apps with easily extensible capabilities.

Core capabilities:

- Every channel: DingTalk, Feishu, QQ, Discord, iMessage, and more. One assistant, connected wherever you need it.
- Under your control: memory and personalization stay with you. Deploy locally or in the cloud; send scheduled reminders to any channel.
- Skills: built-in cron scheduling; custom skills in your workspace are auto-loaded. No lock-in.

What you can do

- Social: daily digests of hot posts (Xiaohongshu, Zhihu, Reddit); Bilibili/YouTube video summaries.
- Productivity: newsletter digests delivered to DingTalk/Feishu/QQ; contacts pulled from email/calendar.
- Creative: describe your goal, let it run overnight, and get a draft the next day.
- Research: track tech/AI news; build a personal knowledge base.
- Desktop: organize files, read and summarize documents, request files in chat.
- Explore: combine Skills and cron into your own agentic app.


Recommended reading:

- I want to run CoPaw in 3 commands: Quick Start β†’ open the Console in your browser.
- I want to chat in DingTalk / Feishu / QQ: Quick Start β†’ Channels.
- I don't want to install Python: the one-line install handles Python automatically, or use ModelScope one-click for the cloud.

Quick Start

pip install (recommended)

If you prefer managing Python yourself:

pip install copaw
copaw init --defaults
copaw app

Then open http://127.0.0.1:8088/ in your browser for the Console (chat with CoPaw, configure the agent). To chat in DingTalk, Feishu, QQ, etc., add a channel as described in the Channels docs.

Console

One-line install (beta, continuously improving)

No Python required β€” the installer handles everything:

macOS / Linux:

curl -fsSL https://copaw.agentscope.io/install.sh | bash

Windows (PowerShell):

irm https://copaw.agentscope.io/install.ps1 | iex

Then open a new terminal and run:

copaw init --defaults   # or: copaw init (interactive)
copaw app

Install options

macOS / Linux:

# Install a specific version
curl -fsSL ... | bash -s -- --version 0.0.2

# Install from source (dev/testing)
curl -fsSL ... | bash -s -- --from-source

# With local model support
bash install.sh --extras llamacpp    # llama.cpp (cross-platform)
bash install.sh --extras mlx         # MLX (Apple Silicon)
bash install.sh --extras llamacpp,mlx

# Upgrade β€” just re-run the installer
curl -fsSL ... | bash

# Uninstall
copaw uninstall          # keeps config and data
copaw uninstall --purge  # removes everything

Windows (PowerShell):

# Install a specific version
irm ... | iex; .\install.ps1 -Version 0.0.2

# Install from source (dev/testing)
.\install.ps1 -FromSource

# With local model support
.\install.ps1 -Extras llamacpp      # llama.cpp (cross-platform)
.\install.ps1 -Extras mlx           # MLX
.\install.ps1 -Extras llamacpp,mlx

# Upgrade β€” just re-run the installer
irm ... | iex

# Uninstall
copaw uninstall          # keeps config and data
copaw uninstall --purge  # removes everything

Using Docker

docker pull agentscope/copaw:latest
docker run -p 8088:8088 -v copaw-data:/app/working agentscope/copaw:latest

Then open http://127.0.0.1:8088/ for the Console. Config, memory, and skills are stored in the copaw-data volume. To pass API keys (e.g. DASHSCOPE_API_KEY), add -e VAR=value or --env-file .env to docker run.
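For example, rather than repeating -e for each key, the run command above can load every variable from a local .env file. This is a sketch; the file contents and key value are placeholders:

```shell
# Contents of .env (placeholder value; add only the keys you actually use)
# DASHSCOPE_API_KEY=sk-your-key-here

# Start CoPaw with every variable from .env injected into the container
docker run -p 8088:8088 \
  -v copaw-data:/app/working \
  --env-file .env \
  agentscope/copaw:latest
```

Keeping keys in a .env file also keeps them out of your shell history.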

The image is built from scratch. To build it yourself, see the "Build Docker image" section in scripts/README.md, then push the result to your registry.

Using ModelScope

No local install? Use the ModelScope Studio one-click cloud setup. Set your Studio to non-public so that others cannot control your CoPaw.

Deploy on Alibaba Cloud ECS

To run CoPaw on Alibaba Cloud (ECS), use the one-click deployment: open the CoPaw on Alibaba Cloud (ECS) deployment link and follow the prompts. For step-by-step instructions, see Alibaba Cloud Developer: Deploy your AI assistant in 3 minutes.


API Key

If you use a cloud LLM (e.g. DashScope, ModelScope), you must set an API key before chatting. CoPaw will not work until a valid key is configured.

Where to set it:

  • copaw init: the init flow includes a step to configure the LLM provider and API key. Follow the prompts to choose a provider and enter your key.
  • Console: after copaw app, open http://127.0.0.1:8088/ β†’ Settings β†’ Models. Select a provider, fill in the API Key field, then activate that provider and model.
  • Environment variable: for DashScope, set DASHSCOPE_API_KEY in your shell or in a .env file in the working directory.

Tools that need extra keys (e.g. TAVILY_API_KEY for web search) can be set in Console Settings β†’ Environment variables; see Config for details.
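As a concrete sketch, a .env file in the working directory could look like the following. All values are placeholders; set only the keys you actually need:

```shell
# .env in the CoPaw working directory (placeholder values)
DASHSCOPE_API_KEY=sk-your-dashscope-key

# Optional tool keys, e.g. for web search
TAVILY_API_KEY=tvly-your-tavily-key
```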

Using local models only? If you use Local Models (llama.cpp or MLX), you do not need any API key.

Local Models

CoPaw can run LLMs entirely on your machine β€” no API keys or cloud services required.

Backend     Best for                                    Install
llama.cpp   Cross-platform (macOS / Linux / Windows)    pip install 'copaw[llamacpp]'
MLX         Apple Silicon Macs (M1/M2/M3/M4)            pip install 'copaw[mlx]'

After installing, download a model and start chatting:

copaw models download Qwen/Qwen3-4B-GGUF
copaw models   # select the downloaded model
copaw app      # start the server

You can also download and manage local models from the Console UI.


Documentation

Topic          Description
Introduction   What CoPaw is and how you use it
Quick start    Install and run (local or ModelScope Studio)
Console        Web UI for chat and agent config
Channels       DingTalk, Feishu, QQ, Discord, iMessage, and more
Heartbeat      Scheduled check-in or digest
Local Models   Run models locally with llama.cpp or MLX
CLI            Init, cron jobs, skills, clean
Skills         Extend and customize capabilities
FAQ            Common questions and troubleshooting tips
Memory         Context management and long-term memory
Config         Working directory and config file
Full docs in this repo: website/public/docs/.


Install from source

git clone https://github.com/agentscope-ai/CoPaw.git
cd CoPaw
pip install -e .

  • Dev (tests, formatting): pip install -e ".[dev]"
  • Console (build frontend): cd console && npm ci && npm run build, then copaw app from project root.

Why CoPaw?

CoPaw stands for both "Co Personal Agent Workstation" and "co-paw", a partner always by your side. More than a cold tool, CoPaw is a warm "little paw" always ready to lend a hand (or a paw!). It is the ultimate teammate for your digital life.


Built by

AgentScope team Β· AgentScope Β· AgentScope Runtime Β· ReMe


Contact us

Discord Β· DingTalk

License

CoPaw is released under the Apache License 2.0.