[AIT-273] Message per response guide for Vercel AI SDK #3151
base: main
Conversation
Important: Review skipped. Auto reviews are disabled on this repository; check the settings in the CodeRabbit UI.
Walkthrough
This PR adds a new guide document for implementing token streaming with the Vercel AI SDK using a message-per-response pattern, including a corresponding navigation entry. The guide covers setup, prerequisites, and code examples demonstrating publisher and subscriber implementations for streaming responses via Ably.
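For context, the subscriber side of such a message-per-response setup typically looks something like the sketch below. This is not code from the PR; the channel name and message name are illustrative assumptions.

```typescript
import * as Ably from 'ably';

// Sketch only: the channel name and message name below are assumptions,
// not taken from the guide under review.
const realtime = new Ably.Realtime({ key: process.env.ABLY_API_KEY! });
const channel = realtime.channels.get('ai-response');

// Log whatever payload arrives for each streamed response. How appended
// tokens surface to subscribers depends on the append mechanism the guide uses.
await channel.subscribe('response', (message) => {
  console.log('Received:', message.data);
});
```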
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~15 minutes
🚥 Pre-merge checks: ✅ Passed checks (5 passed)
@coderabbitai review
✅ Actions performed: Review triggered.
Actionable comments posted: 1
🤖 Fix all issues with AI agents
In `@src/pages/docs/guides/ai-transport/vercel-message-per-response.mdx`:
- Around line 40-46: Update the installation command and version note to
reference AI SDK 6.x: replace the install line "npm install ai@^5 ably@^2" with
"npm install ai@^6 ably@^2" (or "ai@6" per style) and change the Aside text that
currently says "This guide uses version 5.x of the AI SDK" to "This guide uses
version 6.x of the AI SDK" (or similar wording), ensuring any other mentions of
"5.x" in this file are updated unless the guide is intentionally for legacy v5
users.
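For quick reference, the updated installation line the comment asks for would be:

```shell
# Version pins as suggested in the review comment above
npm install ai@^6 ably@^2
```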
🧹 Nitpick comments (1)
src/pages/docs/guides/ai-transport/vercel-message-per-response.mdx (1)
196-231: Consider adding basic error handling for production readiness. The publish/append pattern is correct, but the example lacks error handling. While this is a guide focused on the happy path, a brief mention of error handling strategies (e.g., try-catch around the initial publish, handling append failures) would help users build more robust implementations.
💡 Optional: Add minimal error handling
```diff
 case 'text-start':
   // Publish initial empty message when response starts
-  const result = await channel.publish({
-    name: 'response',
-    data: ''
-  });
-
-  // Capture the message serial for appending tokens
-  msgSerial = result.serials[0];
+  try {
+    const result = await channel.publish({
+      name: 'response',
+      data: ''
+    });
+    // Capture the message serial for appending tokens
+    msgSerial = result.serials[0];
+  } catch (err) {
+    console.error('Failed to publish initial message:', err);
+  }
   break;
```
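The append failures mentioned above could be guarded in the same spirit. A minimal sketch follows; `appendTokenToMessage` here is a hypothetical placeholder for whatever append call the guide actually uses, not a real Ably or AI SDK API:

```typescript
// Hypothetical signature for the guide's append step; adjust to the real call.
type AppendTokenToMessage = (serial: string, token: string) => Promise<void>;

async function safeAppend(
  append: AppendTokenToMessage,
  serial: string,
  token: string
): Promise<void> {
  try {
    await append(serial, token);
  } catch (err) {
    // A silently dropped token corrupts the streamed text, so log at minimum;
    // a production implementation might retry or re-publish the text so far.
    console.error('Failed to append token to message', serial, err);
  }
}
```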
src/pages/docs/guides/ai-transport/vercel-message-per-response.mdx (outdated comment, resolved)
Force-pushed a01a124 to 916151f (compare)
Force-pushed 916151f to 12a8018 (compare)
GregHolmes left a comment
Two comments below,
But also we'd need to add these to the AIT Landing (about) page: https://ably.com/docs/ai-transport
I can't test this just yet as I'm still trying to get an account to test with.
src/pages/docs/guides/ai-transport/vercel-message-per-response.mdx (outdated comment, resolved)
The review comment below is on this line inside the `<Code>` shell block of the guide:
`mkdir ably-vercel-message-per-response && cd ably-vercel-message-per-response`
Suggested change:
```diff
-mkdir ably-vercel-message-per-response && cd ably-vercel-message-per-response
+mkdir ably-vercel-example && cd ably-vercel-example
```
This is very minor, but to keep consistency with the other guides.
Hmm, I think the OpenAI and Anthropic guides use `-example`, but if a user follows multiple guides for a specific provider they would have to delete the existing folder(s), which is why I switched to this format.
Just got an API key and tested: no issues from me with the code. So just the two comments above.
Force-pushed 12a8018 to 2b0edbf (compare)
V6 is the latest stable release (since December), so use it. No changes needed for our code samples. Also updated the Vercel event streaming doc links to point to a more specific page.
Force-pushed 2b0edbf to 2d4d50a (compare)
Description
Following the existing message-per-response guides, this adds a guide for the Vercel AI SDK.
Review App
Checklist
Summary by CodeRabbit