Commit f07a0f4

Merge pull request #49 from dbpunk-labs/feat/move_sdk_to_single_module
fix: add chinese readme
2 parents c89fea5 + 3dd22bc commit f07a0f4

29 files changed: +216 −89 lines

.github/workflows/ci.yaml

Lines changed: 2 additions & 2 deletions
```diff
@@ -22,9 +22,9 @@ jobs:
       run: |
         WS_DIR=`pwd`
         bash start_sandbox.sh
-        cd ${WS_DIR}/kernel
+        cd ${WS_DIR}/sdk
         pytest tests/*.py
-        cd ${WS_DIR}/agent
+        cd ${WS_DIR}/kernel
         pytest tests/*.py
     - uses: actions/upload-artifact@v3
       if: failure()
```

.github/workflows/release_libraries.yml

Lines changed: 6 additions & 0 deletions
```diff
@@ -43,6 +43,12 @@ jobs:
           packages-dir: up/dist
           password: ${{ secrets.PYPI_TOKEN }}

+      - name: Publish SDK
+        uses: pypa/gh-action-pypi-publish@release/v1
+        with:
+          packages-dir: sdk/dist
+          password: ${{ secrets.PYPI_TOKEN }}
+
       - name: docker login
         uses: docker/login-action@v1
         with:
```

README.md

Lines changed: 4 additions & 6 deletions
```diff
@@ -7,6 +7,8 @@
 [![PyPI - Version](https://img.shields.io/pypi/v/octopus_chat)](https://pypi.org/project/octopus-chat/)
 ![PyPI - Downloads](https://img.shields.io/pypi/dd/octopus_chat)

+[中文](./README_zh_cn.md)
+
 > ## Octopus
 > an open-source code interpreter for terminal users

@@ -53,7 +55,6 @@ You can use /help to look for help
 * Octopus Agent: Manages client requests, uses ReAct to process complex tasks, and stores user-assembled applications.
 * Octopus Terminal Cli: Accepts user requests, sends them to the Agent, and renders rich results. Currently supports Discord, iTerm2, and Kitty terminals.

-For security, it is recommended to run the kernel and agent as Docker containers.

 ## Features

@@ -84,19 +85,16 @@ if you have any advice for the roadmap. please create a discuession to talk abou

 ## API Service Supported

-|name|status| note|
+|name|status| installation|
 |----|----------------|---|
 |[Openai GPT 3.5/4](https://openai.com/product#made-for-developers) | ✅ fully supported|the detail installation steps|
 |[Azure Openai GPT 3.5/4](https://azure.microsoft.com/en-us/products/ai-services/openai-service) | ✅ fully supported|the detail install steps|
 |[LLama.cpp Server](https://github.com/ggerganov/llama.cpp/tree/master/examples/server) | ✔️ supported| You must start the llama cpp server by yourself|

 ## Platforms Supported

-|name|status| note|
+|name|status| installation|
 |----|----------------|---|
 |ubuntu 22.04 | ✅ fully supported|the detail installation steps|
 |macos | ✅ fully supported|the detail install steps|

-## Thanks
-
-* [Octopus icons created by Whitevector - Flaticon](https://www.flaticon.com/free-icons/octopus)
```

README_zh_cn.md

Lines changed: 97 additions & 0 deletions
New file (97 added lines), translated from Chinese:

<p align="center">
<img width="100px" src="https://github.com/dbpunk-labs/octopus/assets/8623385/6c60cb2b-415f-4979-9dc2-b8ce1958e17a" align="center"/>

![GitHub Workflow Status (with event)](https://img.shields.io/github/actions/workflow/status/dbpunk-labs/octopus/ci.yml?branch=main&style=flat-square)
[![Discord](https://badgen.net/badge/icon/discord?icon=discord&label)](https://discord.gg/UjSHsjaz66)
[![Twitter Follow](https://img.shields.io/twitter/follow/OCopilot7817?style=flat-square)](https://twitter.com/OCopilot7817)
[![PyPI - Version](https://img.shields.io/pypi/v/octopus_chat)](https://pypi.org/project/octopus-chat/)
![PyPI - Downloads](https://img.shields.io/pypi/dd/octopus_chat)

[English](./README.md)

> ## Octopus
> An open-source code interpreter for terminal users

<p align="center">
<img width="1000px" src="https://github.com/dbpunk-labs/octopus/assets/8623385/3ccb2d00-7231-4014-9dc5-f7f3e487c8a2" align="center"/>

## Getting Started

Install Octopus on your local machine. You can choose between OpenAI and CodeLlama-7B.

Local requirements:

* python 3 >= 3.10
* pip
* docker

Install the Octopus launcher:

```bash
pip install octopus_up
```

Initialize the local environment with the launcher; at this step you choose either OpenAI or CodeLlama-7B:

```
octopus_up
```

Start using Octopus by running `octopus` on the command line:

```
Welcome to use octopus❤️ . To ask a programming question, simply type your question and press esc + enter
You can use /help to look for help

[1]🎧>
```

## How Octopus Works

![octopus_simple](https://github.com/dbpunk-labs/octopus/assets/8623385/e5bfb3fb-74a5-4c60-8842-a81ee54fcb9d)

* Octopus Kernel: the code execution engine, currently implemented on top of a notebook
* Octopus Agent: handles user requests, sends them to the LLM service API, and forwards the generated code to the Octopus Kernel for execution
* Octopus CLI: sends user requests to the Agent and renders the code, text, and images the Agent returns

All components exchange data as streams, so every token the model writes is shown on the command line in real time.

## Features

* Automatically executes code in a Docker environment
* Experimental: displays images in the iTerm2 and Kitty terminals
* Supports uploading files to the Octopus Kernel with the `/up` command; you can use it while writing your question
* Experimental: assembles model-generated code snippets into an application and runs it directly with the `/run` command
* Supports copying output text and code to the clipboard with the `/cc` command
* Supports question history, saved locally

If you have a feature request, please open a discussion to talk it over with the community.

## Roadmap

* Improve Octopus's usability and security
* Add a memory system so Octopus can serve each user better
* Strengthen the Agent's code generation
* Strengthen the Kernel's code execution
* Support GPU acceleration for video-processing tasks

The whole plan is still a draft; if you would like to join the discussion, come to the Discord server.

## Demo

[video](https://github.com/dbpunk-labs/octopus/assets/8623385/bea76119-a705-4ae1-907d-cb4e0a0c18a5)

## API Services Supported

|name|status|installation|
|----|----------------|---|
|[Openai GPT 3.5/4](https://openai.com/product#made-for-developers) | ✅ fully supported|installation steps for the OpenAI API|
|[Azure Openai GPT 3.5/4](https://azure.microsoft.com/en-us/products/ai-services/openai-service) | ✅ fully supported|installation steps for the Azure OpenAI API|
|[LLama.cpp Server](https://github.com/ggerganov/llama.cpp/tree/master/examples/server) | ✔️ partially supported| installation steps using the llama.cpp server|

## Platforms Supported

|name|status|installation|
|----|----------------|---|
|ubuntu 22.04 | ✅ fully supported|detailed installation steps|
|macos | ✅ fully supported|detailed installation steps|
|windows | ✅ fully supported|detailed installation steps|

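The README above describes the components exchanging data as streams, so each token the model produces is rendered on the terminal as soon as it arrives. A minimal sketch of that producer/consumer shape (all names here are invented for illustration, not the project's actual API), using an `asyncio.Queue` between the "model" and the "terminal":

```python
import asyncio

async def model(queue):
    # Stand-in for the LLM side: emit tokens one by one, then a sentinel.
    for token in ["print", "(", "'hi'", ")"]:
        await queue.put(token)
    await queue.put(None)

async def terminal(queue):
    # Stand-in for the CLI side: render each token as soon as it arrives.
    rendered = []
    while (token := await queue.get()) is not None:
        rendered.append(token)  # a real CLI would repaint the screen here
    return rendered

async def main():
    queue = asyncio.Queue()
    producer = asyncio.create_task(model(queue))
    rendered = await terminal(queue)
    await producer
    return rendered

tokens = asyncio.run(main())
print("".join(tokens))
```

The consumer never waits for the full response; it drains the queue token by token, which is what makes the terminal feel live.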
agent/setup.py

Lines changed: 1 addition & 0 deletions
```diff
@@ -35,6 +35,7 @@
     install_requires=[
         "octopus_proto",
         "octopus_kernel",
+        "octopus_sdk",
         "grpcio-tools>=1.57.0",
         "grpc-google-iam-v1>=0.12.6",
         "aiofiles",
```

agent/src/octopus_agent/agent_server.py

Lines changed: 5 additions & 16 deletions
```diff
@@ -1,18 +1,6 @@
 # vim:fenc=utf-8
 #
 # Copyright (C) 2023 dbpunk.com Author imotai <[email protected]>
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.

 import asyncio
 import logging
@@ -32,13 +20,13 @@
 from typing import AsyncIterable, Any, Dict, List, Optional, Sequence, Union, Type
 from tempfile import gettempdir
 from grpc.aio import ServicerContext, server
-from octopus_kernel.sdk.kernel_sdk import KernelSDK
+from octopus_sdk.kernel_sdk import KernelSDK
+from octopus_sdk.utils import parse_image_filename
 from .agent_llm import LLMManager
 from .agent_builder import build_mock_agent, build_openai_agent, build_codellama_agent
 import databases
 import orm
 from datetime import datetime
-from .utils import parse_image_filename

 config = dotenv_values(".env")
 LOG_LEVEL = (
@@ -72,7 +60,6 @@ class LiteApp(orm.Model):
         "saved_filenames": orm.String(max_length=512, allow_null=True),
     }

-
 class AgentRpcServer(AgentServerServicer):

     def __init__(self):
@@ -164,7 +151,7 @@ async def run(
             ),
         )
         function_result = None
-        async for (result, respond) in agent.call_function(lite_app.code):
+        async for (result, respond) in agent.call_function(lite_app.code, context):
             if context.cancelled():
                 break
             function_result = result
@@ -304,6 +291,8 @@ async def worker(task, agent, queue, context):
             )
             yield respond
         finally:
+            is_cancelled = context.cancelled()
+            logger.debug(f" the context is cancelled {is_cancelled}")
             if context.cancelled():
                 try:
                     logger.warning("cancel the request by stop kernel")
```

agent/src/octopus_agent/agent_setup.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -17,7 +17,7 @@
 """ """
 import click
 import asyncio
-from octopus_agent.agent_sdk import AgentSDK
+from octopus_sdk.agent_sdk import AgentSDK


 async def add_kernel(endpoint, api_key, kernel_endpoint, kernel_api_key):
```

agent/src/octopus_agent/base_agent.py

Lines changed: 6 additions & 2 deletions
```diff
@@ -22,7 +22,7 @@
 from pydantic import BaseModel, Field
 from octopus_proto.kernel_server_pb2 import ExecuteResponse
 from octopus_proto.agent_server_pb2 import OnAgentAction, TaskRespond, OnAgentActionEnd, FinalRespond
-from .utils import parse_image_filename, process_char_stream
+from octopus_sdk.utils import parse_image_filename, process_char_stream

 logger = logging.getLogger(__name__)

@@ -48,7 +48,9 @@ class BaseAgent:
     def __init__(self, sdk):
         self.kernel_sdk = sdk

-    async def call_function(self, code, iteration=1, token_usage=0, model_name=""):
+    async def call_function(
+        self, code, context, iteration=1, token_usage=0, model_name=""
+    ):
         """
         run code with kernel
         """
@@ -60,6 +62,8 @@ async def call_function(self, code, iteration=1, token_usage=0, model_name=""):
         if not is_alive:
             await self.kernel_sdk.start(kernel_name="python3")
         async for kernel_respond in self.kernel_sdk.execute(code=code):
+            if context.cancelled():
+                break
             # process the stdout
             if kernel_respond.output_type == ExecuteResponse.StdoutType:
                 kernel_output = json.loads(kernel_respond.output)["text"]
```

agent/src/octopus_agent/codellama_agent.py

Lines changed: 1 addition & 0 deletions
```diff
@@ -82,6 +82,7 @@ async def handle_function(
         function_result = None
         async for (result, respond) in self.call_function(
             code,
+            context,
             iteration=iteration,
             token_usage=token_usage,
             model_name=model_name,
```

agent/src/octopus_agent/mock_agent.py

Lines changed: 6 additions & 2 deletions
```diff
@@ -60,7 +60,9 @@ async def call_ai(self, prompt, queue, iteration):
         )
         return message

-    async def handle_call_function(self, code, queue, explanation, saved_filenames=[]):
+    async def handle_call_function(
+        self, code, queue, explanation, context, saved_filenames=[]
+    ):
         tool_input = json.dumps({
             "code": code,
             "explanation": explanation,
@@ -78,7 +80,7 @@ async def handle_call_function(self, code, queue, explanation, saved_filenames=[
             )
         )
         function_result = None
-        async for (result, respond) in self.call_function(code):
+        async for (result, respond) in self.call_function(code, context):
             function_result = result
             if respond:
                 await queue.put(respond)
@@ -95,7 +97,9 @@ async def arun(self, task, queue, context, max_iteration=5):
         if message.get("code", None):
             function_result = await self.handle_call_function(
                 message["code"],
+                queue,
                 message["explanation"],
+                context,
                 message.get("saved_filenames", []),
             )
             await queue.put(
```
