A Chinese tutorial is available on my blog.
For network programming learning purposes only.

The previous Python version is archived in the `py` branch; it is unstable and hard to maintain (async in Python caused many problems).

This project provides an API nearly identical to the OpenAI API (with `gpt-3.5-turbo` and `gpt-4`). The only difference is that you use a Copilot application token instead of an OpenAI token.
Download the latest release for your platform. Unzip it and run `cogpt-get-apptoken` (or `cogpt-get-apptoken.exe` on Windows). You can see how to use a proxy with `./cogpt-get-apptoken -h`.
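A minimal sketch of that flow (the exact prompts and proxy flags are whatever the binary itself prints; the `ghu_` prefix of the resulting token is referenced later in this README):

```sh
# print usage, including the proxy-related options
./cogpt-get-apptoken -h

# obtain a Copilot application token (a string starting with ghu_)
./cogpt-get-apptoken
```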
- `GET /` — returns `Hi, I'm CoGPT.`
- `GET /health` — returns `{"status":"OK"}`
- `GET /v1/models`
- `POST /v1/chat/completions`
- `POST /v1/embeddings` — the embeddings API
Note that this API is not fully compatible with the OpenAI API. For the `input` field, the OpenAI API accepts the following types:

- `string`: the string that will be turned into an embedding.
- `array` of strings: the array of strings that will be turned into embeddings.
- `array` of integers: the array of integers that will be turned into an embedding.
- `array` of arrays of integers: the array of arrays containing integers that will be turned into embeddings.

Unfortunately, this service only accepts the first two types, as well as arrays of arrays containing strings.
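As a sketch of how requests look, the examples below assume the default `localhost:8080` address and that the Copilot application token (`ghu_...`) is sent as a Bearer token, as with the OpenAI API; the embeddings model name is only illustrative:

```sh
# chat completion (replace ghu_xxxxxxx with your Copilot application token)
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer ghu_xxxxxxx" \
  -d '{
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Hello"}]
      }'

# embeddings: "input" may be a string, an array of strings,
# or an array of arrays of strings
curl http://localhost:8080/v1/embeddings \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer ghu_xxxxxxx" \
  -d '{
        "model": "text-embedding-ada-002",
        "input": ["first sentence", "second sentence"]
      }'
```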
This service is not designed to be deployed on the public network. The best way to use it is to deploy it on your own computer or local network. You can deploy it on the public network, but only for yourself.

DO NOT share your token with others. If a token is accessed from many different IPs, it will be banned, and if too many tokens are requested from one IP, something bad may happen.

So again, ONLY for yourself. DO NOT try any of these approaches.
```sh
mkdir CoGPT && cd CoGPT
```

Then create a `docker-compose.yml` file with the following content:
```yaml
version: '3'
services:
  cogpt-api:
    image: geniucker/cogpt:latest
    environment:
      - HOST=0.0.0.0
    ports:
      - 8080:8080
    volumes:
      - ./db:/app/db
      - ./log:/app/log
    restart: unless-stopped
    container_name: cogpt-api
```
If you want to use the development version, replace `geniucker/cogpt:latest` with `geniucker/cogpt:dev`.

By default, the service listens on port 8080. If you want to change the port, edit `docker-compose.yml` and change the port in the `ports` section. For example, to listen on port 80, change `8080:8080` to `80:8080`.

Other config options can also be changed in the `environment` section. Or, more conveniently, you can edit the `.env` file (copy `.env.example` to `.env` and edit it). Note that the config for `db` and `log` should be changed in the `volumes` section of `docker-compose.yml`.

All config options are listed in Config.

Then run `docker compose up -d` to start the service.
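After the container is up, a quick way to check that it is serving (assuming the default `8080:8080` port mapping) is the health endpoint shown earlier:

```sh
docker compose up -d
curl http://localhost:8080/health   # expected: {"status":"OK"}
docker compose logs -f cogpt-api    # follow the service logs
```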
Download the latest release for your platform. Unzip it and run `cogpt-api` (or `cogpt-api.exe` on Windows).

By default, the service listens on `localhost:8080`. For configuration, see Config.
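For instance, the listen address can be overridden for a single run with the environment variables from the Config table below (values here are illustrative):

```sh
# run with the defaults (localhost:8080)
./cogpt-api

# or expose it on the local network on another port for this run
HOST=0.0.0.0 PORT=9000 ./cogpt-api
```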
For Linux distributions based on systemd, you can follow the steps below.

First, download the latest release for Linux. Unzip it, move `cogpt-api` to `/opt/cogpt/`, and grant it executable permission.

Then copy the content of cogpt-api.service to `/etc/systemd/system/cogpt-api.service`.

If you need to change the config, edit the `/opt/cogpt-api/.env` file.

Finally, run the following commands to enable and start the service:

```sh
sudo systemctl enable cogpt-api
sudo systemctl start cogpt-api
```

Run `sudo systemctl stop cogpt-api` to stop the service.
Run `sudo systemctl disable cogpt-api` to disable the service.
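A condensed sketch of the setup above, assuming the unzipped release and the repository's cogpt-api.service file are in the current directory (`daemon-reload` is a standard systemd step not spelled out in the original steps):

```sh
sudo mkdir -p /opt/cogpt
sudo mv cogpt-api /opt/cogpt/
sudo chmod +x /opt/cogpt/cogpt-api
sudo cp cogpt-api.service /etc/systemd/system/cogpt-api.service
sudo systemctl daemon-reload
sudo systemctl enable cogpt-api
sudo systemctl start cogpt-api
sudo systemctl status cogpt-api   # verify the service is running
```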
For macOS, services are based on `launchd`.

First, download the latest release for macOS. Unzip it, move `cogpt-api` to `/opt/cogpt/`, and grant it executable permission.

Then copy the content of com.cogpt-api.plist to `/Library/LaunchDaemons/com.cogpt-api.plist`.

If you need to change the config, edit the `/opt/cogpt-api/.env` file.

Finally, run `sudo launchctl load /Library/LaunchDaemons/com.cogpt-api.plist` to start the service.
Run `sudo launchctl unload /Library/LaunchDaemons/com.cogpt-api.plist` to stop the service.
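The same steps in condensed form, assuming the unzipped release and the repository's com.cogpt-api.plist file are in the current directory:

```sh
sudo mkdir -p /opt/cogpt
sudo mv cogpt-api /opt/cogpt/
sudo chmod +x /opt/cogpt/cogpt-api
sudo cp com.cogpt-api.plist /Library/LaunchDaemons/com.cogpt-api.plist
sudo launchctl load /Library/LaunchDaemons/com.cogpt-api.plist
```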
For Windows, we can use scheduled tasks. You can follow the steps below.

First, download the latest release for Windows. Unzip it to a directory, say `C:\CoGPT\`.

Then create a file `cogpt-api-service.ps1` in `C:\CoGPT\` and copy the content of cogpt-api-service.ps1 into it.

Start a PowerShell with administrator permission and run the following commands:

```powershell
cd C:\CoGPT\
./cogpt-api-service.ps1 enable
```

Here are all the commands you can use. All of them should be run in PowerShell with administrator permission.

```powershell
./cogpt-api-service.ps1 enable   # enable and start the service
./cogpt-api-service.ps1 disable  # stop and disable the service
./cogpt-api-service.ps1 start    # start the service
./cogpt-api-service.ps1 stop     # stop the service
./cogpt-api-service.ps1 restart  # restart the service
./cogpt-api-service.ps1 status   # check the status of the service
```
If you want to share this service with your friends, it is not safe to share your GitHub app token directly. This feature is designed for that situation: you can create a map from so-called share tokens to real GitHub app tokens.

The first way is to set an environment variable or edit the `.env` file. Set `SHARE_TOKEN` to a string like `share-xxxxxxx1:ghu_xxxxxxx1,share-xxxxxxx2:ghu_xxxxxxx2`. The format is `share-token:real-token,share-token:real-token`. You can add as many pairs as you want.

The other way is to use a command line argument. You can run `./cogpt-api -share-token share-xxxxxxx1:ghu_xxxxxxx1,share-xxxxxxx2:ghu_xxxxxxx2` to start the service. Again, you can add as many pairs as you want.

With this configured, when you make a request with a token that starts with `share-`, the service will use the real token mapped to that share token. If you make a request with a token that starts with `ghu_`, the service will use that token directly.

Note that share tokens must start with `share-`. Map entries that don't start with `share-` will be ignored.

To generate a random share token, download the latest release for your platform, unzip it, and run `./gen-share-token`.
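As a sketch of how this fits together (the tokens are placeholders, and the Bearer header is assumed to work the same way as with a real `ghu_` token):

```sh
# start the service with a share-token -> real-token map
./cogpt-api -share-token share-xxxxxxx1:ghu_xxxxxxx1,share-xxxxxxx2:ghu_xxxxxxx2

# a friend only needs the share token to call the API
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer share-xxxxxxx1" \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "Hi"}]}'
```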
Edit `.env`, set environment variables, or pass command line arguments.

Here are the config options and their default values (`.env` or environment variables):
| keys | default | description |
|---|---|---|
| `HOST` | `localhost` | Host to listen on |
| `PORT` | `8080` | Port to listen on |
| `CACHE` | `true` | Whether to cache tokens in a sqlite database. If false, tokens will be cached in memory |
| `CACHE_PATH` | `db/cache.sqlite3` | Path to the sqlite database. Only used if `CACHE` is true |
| `DEBUG` | `false` | Whether to enable debug mode. If true, the service will print debug info |
| `LOG_LEVEL` | `info` | Log level |
| `SHARE_TOKEN` | `""` | Map of share tokens to real tokens, e.g. `SHARE_TOKEN=share-xxxxxxx1:ghu_xxxxxxx1,share-xxxxxxx2:ghu_xxxxxxx2` |
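For reference, a `.env` built from this table might look like the following (all values are illustrative, and the `SHARE_TOKEN` pairs are placeholders):

```env
HOST=0.0.0.0
PORT=8080
CACHE=true
CACHE_PATH=db/cache.sqlite3
DEBUG=false
LOG_LEVEL=info
SHARE_TOKEN=share-xxxxxxx1:ghu_xxxxxxx1,share-xxxxxxx2:ghu_xxxxxxx2
```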
For command line arguments, run `./cogpt-api -h` to see the help message.

Precedence: command line arguments > environment variables > `.env`.

Environment variables for proxies are also supported: `ALL_PROXY`, `HTTPS_PROXY`, and `HTTP_PROXY`. You can also set a proxy via command line arguments; run `./cogpt-api -h` to see the help message. Precedence: command line arguments > environment variables (`ALL_PROXY` > `HTTPS_PROXY` > `HTTP_PROXY`).
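For example, to route outbound requests through a local proxy using only environment variables (the proxy address is illustrative):

```sh
export HTTPS_PROXY=http://127.0.0.1:7890
./cogpt-api
```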