feat(web-api): add a chat stream method to the web client #1755
Conversation
Codecov Report ✅ All modified and coverable lines are covered by tests. Additional details and impacted files:

```
@@           Coverage Diff            @@
##            ai-apps    #1755   +/-  ##
=========================================
  Coverage          ?   85.12%
=========================================
  Files             ?      115
  Lines             ?    13068
  Branches          ?        0
=========================================
  Hits              ?    11124
  Misses            ?     1944
  Partials          ?        0
```

☔ View full report in Codecov by Sentry.
🔏 A few fast jots for the wonderful reviewers!
📚 note: The `docs/reference` is autogenerated with the latest code changes. I'll follow up with separate PRs that both require this and hide these from the preview, since regular review ought to happen with the comments elsewhere IMHO.
🔐 note: Ongoing discussion is happening for this - #1759.
legacy_source = re.sub(r"^from slack_sdk.web.chat_stream import ChatStream\n", "", legacy_source, flags=re.MULTILINE) | ||
legacy_source = re.sub(r"(?s)def chat_stream.*?(?=def)", "", legacy_source) |
🌲 note: The `LegacyWebClient` won't find this `chat_stream` since responses from the various APIs can't be parsed quickly - these are either `async` or sync. Instead we might recommend updating to the latest version.
🔗 https://docs.slack.dev/tools/python-slack-sdk/v3-migration
📝 note: Planning to include this script as part of the `generate_api_docs.sh` scripts in a follow-up to guarantee the latest changes are used. This'll match how unit tests are set up IIRC.
📚 note: Similar changes in #1759 instead prefer to generate docs before tests.
📠 note: This file is autogenerated from `slack_sdk/web/client.py`! However, kind review that checks for correct `async` usage is so appreciated!
```python
if markdown_text:
    self._buffer += markdown_text
```
📝 note: Planning to follow up in the `node` implementation to match this. I find adding to the buffer to be clearer here than below when stopping the stream.
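For anyone following along, here's a minimal sketch of the buffering idea. The callable passed in stands in for the `chat.appendStream` API call, and `_buffer_size` plus the flush-on-threshold behavior are assumptions that match this PR only in spirit:

```python
from typing import Callable, Optional


class BufferedStreamerSketch:
    """Illustrative only: append() collects text and flushes once enough accumulates."""

    def __init__(self, send: Callable[..., dict], buffer_size: int = 256):
        self._send = send          # stands in for the chat.appendStream API call
        self._buffer = ""
        self._buffer_size = buffer_size

    def append(self, *, markdown_text: str, **kwargs) -> Optional[dict]:
        if markdown_text:
            self._buffer += markdown_text
        if len(self._buffer) >= self._buffer_size:
            text, self._buffer = self._buffer, ""
            return self._send(markdown_text=text, **kwargs)
        return None  # buffered locally; no request was made
```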
🎁 note: The `node` implementation is in progress here: slackapi/node-slack-sdk#2388
📌 And a question on overall implementations I forgot!
```python
def append(
    self,
    *,
    markdown_text: str,
    **kwargs,
) -> Optional[SlackResponse]:
```
🗣️ question: Are these arguments flexible enough for future changes? It's not obvious how `kwargs` might be buffered if the call to append doesn't send a request.
🐝 ramble: We can of course follow up with updates in later releases, but I want to avoid strangeness in implementations in the meantime...
🧠 I think this is flexible for future changes.
Imagine in the future that the `chat.appendStream` method supports other content aside from `markdown_text`. If that happens, we can update it to be `markdown_text: Optional[str]`. This is a non-breaking change for existing apps and allows us to add new arguments such as `new_things: Optional[str]` or `new_things: Optional[Dict[str, str]]`.
Today, the `kwargs` would only work as long as `markdown_text` is a required field. But I think that's the right choice right now, since it is a required field to append text.
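A sketch of that hypothetical evolution; `new_things` is a placeholder for whatever future content the API might accept, not a planned argument:

```python
from typing import Dict, Optional


class ChatStreamFutureSketch:
    """Hypothetical only: shows how the signature could relax without breaking callers."""

    def append(
        self,
        *,
        markdown_text: Optional[str] = None,          # was required; now optional
        new_things: Optional[Dict[str, str]] = None,  # placeholder for future content
        **kwargs,
    ) -> Optional[dict]:
        # Existing calls like append(markdown_text="...") continue to work unchanged.
        ...
```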
@mwbrooks Wow thanks for calling out that changing "required" to "optional" isn't breaking... That might've caught me out earlier!
I think we'll have to explore additional buffer configuration if those later changes arise, but I now feel confident in this approach 🙏 ✨
✅ Very solid and well done, @zimeg! I feel like I'm short-changing you by not providing deeper feedback, but this came out exactly as I was hoping! 🙌🏻
🧠 This is aligned very well with the Node SDK implementation, and I've noticed that you've noted a small improvement to align both around appending text.
🔮 Thinking about the future, our concern is whether this can expand with the future changes to `chat_stream`. I've left a comment below, but I can see a path that's non-breaking and flexible for us. So I'm happy with it!
🧪 I've manually tested this by wiring it up to our AI Sample App. I was able to adjust the buffer size successfully and it significantly reduced the code and overhead.
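For reference, the manual test described above amounts to something like the following sketch; the token and IDs are placeholders, and `buffer_size` is assumed to be accepted by `client.chat_stream` as described in the docstring excerpt further down:

```python
from slack_sdk import WebClient

client = WebClient(token="xoxb-...")  # placeholder token

# A larger buffer means fewer chat.appendStream calls for the same text,
# which helps avoid rate limits when streaming long responses.
streamer = client.chat_stream(
    channel="C0123456789",
    thread_ts="1700000001.123456",
    buffer_size=1024,  # larger than the 256 default set by the web client
)
streamer.append(markdown_text="Streaming a longer reply...")
streamer.stop()
```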
```python
# Swap the sync ChatStream import for the async variant.
async_source = re.sub(
    "from slack_sdk.web.chat_stream import ChatStream",
    "from slack_sdk.web.async_chat_stream import AsyncChatStream",
    async_source,
)
# Rename return type annotations and constructor calls.
async_source = re.sub(r"ChatStream:", "AsyncChatStream:", async_source)
async_source = re.sub(r"ChatStream\(", "AsyncChatStream(", async_source)
# Add awaits to the streaming calls in the generated async source.
async_source = re.sub(
    r" client.chat_stream\(",
    " await client.chat_stream(",
    async_source,
)
async_source = re.sub(
    r" streamer.append\(",
    " await streamer.append(",
    async_source,
)
async_source = re.sub(
    r" streamer.stop\(",
    " await streamer.stop(",
    async_source,
)
```
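The net effect of those substitutions on generated usage would look roughly like this; a sketch assuming `AsyncWebClient`, with a placeholder token and IDs:

```python
import asyncio

from slack_sdk.web.async_client import AsyncWebClient


async def main():
    client = AsyncWebClient(token="xoxb-...")  # placeholder token
    streamer = await client.chat_stream(
        channel="C0123456789",
        thread_ts="1700000001.123456",
    )
    await streamer.append(markdown_text="**hello wo")
    await streamer.append(markdown_text="rld!**")
    await streamer.stop()


asyncio.run(main())
```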
note: oof, you definitely had to push the boundaries of this script with that helper!
@mwbrooks Agreed. This wasn't a favorite change to make but I think it'll let us land this with autogenerated changes 🤖
legacy_source = re.sub(r"^from slack_sdk.web.chat_stream import ChatStream\n", "", legacy_source, flags=re.MULTILINE) | ||
legacy_source = re.sub(r"(?s)def chat_stream.*?(?=def)", "", legacy_source) |
question: Can we revert this since the `ChatStream` isn't landing in the legacy client?
@mwbrooks Let's keep these for now - this removes the `chat_stream` implementation from the legacy client altogether after the original source file is copied over!
Example:
```python
streamer = client.chat_stream(
    channel="C0123456789",
    thread_ts="1700000001.123456",
    recipient_team_id="T0123456789",
    recipient_user_id="U0123456789",
)
streamer.append(markdown_text="**hello wo")
streamer.append(markdown_text="rld!**")
streamer.stop()
```
praise: very nice 👌🏻
slack_sdk/web/chat_stream.py (Outdated)
```
recipient_user_id: The encoded ID of the user to receive the streaming text. Required when streaming to channels.
buffer_size: The length of markdown_text to buffer in-memory before calling a method. Increasing this value
    decreases the number of method calls made for the same amount of text, which is useful to avoid rate limits.
    Default: 256.
```
question: Technically, this class doesn't have a default buffer size, right? The 256 default is set by the web client's `client.chat_stream` constructor.
```python
start_request = self.thread.server.chat_stream_requests.get("/chat.startStream", {})
self.assertEqual(start_request.get("channel"), "C0123456789")
self.assertEqual(start_request.get("thread_ts"), "123.000")
self.assertEqual(start_request.get("recipient_team_id"), "T0123456789")
self.assertEqual(start_request.get("recipient_user_id"), "U0123456789")
```
praise: This is so nice!
await streamer.append(markdown_text="e is", token="xoxb-chat_stream_test_token1") | ||
await streamer.append(markdown_text=" bold!") | ||
await streamer.append(markdown_text="*") | ||
await streamer.stop(markdown_text="*", token="xoxb-chat_stream_test_token2") |
suggestion: I noticed that we don't have a test for `.stop(blocks="...", ...)`. Since this is an important feature for the new feedback buttons, we should test that blocks are passed through to the API.
The node tests are also updated in slackapi/node-slack-sdk#2388 to match!
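A sketch of what such a test could cover, assuming the mock server records the `chat.stopStream` body the same way it records `chat.startStream` above; the attribute names and block shape are illustrative and may differ from the actual suite:

```python
@async_test
async def test_streams_stop_passes_blocks_through_to_the_api(self):
    # Hypothetical test: self.client and chat_stream_requests mirror the
    # chat.startStream assertions above and may not match the real fixtures.
    streamer = await self.client.chat_stream(
        channel="C0123456789",
        thread_ts="123.000",
    )
    await streamer.append(markdown_text="All done!")
    await streamer.stop(
        blocks=[{"type": "section", "text": {"type": "mrkdwn", "text": "All done!"}}],
    )
    stop_request = self.thread.server.chat_stream_requests.get("/chat.stopStream", {})
    self.assertIn("blocks", stop_request)
```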
```python
        await streamer.stop()

    @async_test
    async def test_streams_errors_when_appending_to_a_completed_stream(self):
```
praise: Great use-case!
```python
class ChatStreamMockHandler(MockHandler):
    """Extended mock handler that captures request bodies for chat stream methods"""
```
suggestion(non-blocking, future): It's very impressive that you've built your own mock server for handling the tests. We may want to talk to @WilliamBergamin to see if he has opinions or ideas around this approach. But from my perspective, this is contained to our `chat_stream` tests and builds upon the existing mock server work, so I think we should move forward with it.
@mwbrooks Kind thanks. I found this to be decent for confirming expected arguments and headers are being sent with streaming requests, but I support other approaches!
For now let's use this as a starting point, but we should revisit it if other suggestions come up 🤖
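For readers outside the repository, the general shape of such a handler might look like this sketch; it assumes JSON request bodies and a standard-library HTTP server rather than the existing `MockHandler` base used in the actual tests:

```python
import json
from http.server import BaseHTTPRequestHandler


class ChatStreamMockHandlerSketch(BaseHTTPRequestHandler):
    """Illustrative only: records each streaming request body by path so tests
    can assert on the arguments that were sent."""

    # Shared across requests so a test can inspect what was captured.
    chat_stream_requests: dict = {}

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        raw = self.rfile.read(length) if length else b"{}"
        self.chat_stream_requests[self.path] = json.loads(raw)  # e.g. "/chat.startStream"
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(b'{"ok": true, "ts": "123.000"}')
```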
Co-authored-by: Michael Brooks <[email protected]>
@mwbrooks I appreciate your reviews so much! Thanks for sharing kind suggestions and thoughts of the future too 👾 ✨ I'll merge this for a release that we can test with, but I remain so open to improving both the organization of files and tests in follow-up changes. Let's hope to keep the implementations similar!
Sample app reference WIP: slack-samples/bolt-python-assistant-template#16 🧪 ✨
Summary

This PR adds a `ChatStream` class that creates chat streams with `.chat_stream()` 🤖

Preview

Testing

WIP

Notes

Category

- `/docs` (Documents)
- `tests`/`integration_tests` (Automated tests for this library)

Requirements

Run `python3 -m venv .venv && source .venv/bin/activate && ./scripts/run_validation.sh` after making the changes.