With this patch, the edge function begins streaming the content right away.
This is a minor optimization for the non-streaming use case, since no large
audio file is buffered on the server before transfer. The main gain, however,
is for the streaming use case: as the data trickles in, it is forwarded to
the client in pass-through fashion.
This implementation has been largely inspired by the Vercel AI (stream) SDK,
available at https://github.com/vercel-labs/ai/, and in particular by the work
of @jridgewell on https://github.com/vercel-labs/ai/issues/90 and related
issues.
Once pending changes land in edge-runtime and Next.js, we'll have full
stream cancellation and token savings (#57).
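The pass-through idea can be sketched with a small identity relay over web
streams (this is a minimal illustration, not the repo's actual code; the
function name is hypothetical):

```typescript
// Sketch: relay an upstream ReadableStream to the client without
// buffering, using an identity TransformStream. Each chunk is emitted
// downstream as soon as it arrives from upstream.
function passThrough(upstream: ReadableStream<Uint8Array>): ReadableStream<Uint8Array> {
  const { readable, writable } = new TransformStream<Uint8Array, Uint8Array>();
  // pipeTo resolves when upstream closes; errors propagate to the reader
  upstream.pipeTo(writable).catch(() => { /* aborts surface on the readable side */ });
  return readable;
}
```

In an edge function, the returned readable would be handed directly to
`new Response(...)`, so nothing accumulates server-side.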
This may be rough around the edges, but should not cause issues. The
implementation is defensive and validates the return types excessively,
as the OpenAI API is brittle and can easily misbehave.
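The kind of defensive validation meant here can be sketched as a guard that
never trusts the parsed payload (the chunk shape below is illustrative of
OpenAI chat-completion deltas; the function name is hypothetical):

```typescript
// Sketch: defensively extract the text delta from a parsed API message.
// Any unexpected shape yields an empty string instead of throwing.
interface ChatDelta { content?: string; }

function extractDelta(parsed: unknown): string {
  if (typeof parsed !== 'object' || parsed === null) return '';
  const choices = (parsed as { choices?: unknown }).choices;
  if (!Array.isArray(choices) || choices.length === 0) return '';
  const delta = (choices[0] as { delta?: ChatDelta }).delta;
  return typeof delta?.content === 'string' ? delta.content : '';
}
```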
Uses an Edge function to access the Google API and return the search
results (.items[]) to the client (browser).
Adds all type definitions (browser<>edge and edge<>google) and honors
environment variables: when both new environment variables are set at
build time, the user won't be asked for keys.
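The edge-side extraction can be sketched as a small defensive helper that
forwards only .items[] (the item fields and function name are illustrative
assumptions, not the repo's actual types):

```typescript
// Sketch: forward only the .items[] array of a Google search response
// to the browser; missing or malformed payloads yield an empty list.
interface SearchItem { title: string; link: string; snippet: string; }

function extractItems(json: unknown): SearchItem[] {
  if (typeof json !== 'object' || json === null) return [];
  const items = (json as { items?: unknown }).items;
  return Array.isArray(items) ? (items as SearchItem[]) : [];
}
```

Keeping the fetch on the edge means the API key stays server-side; the
browser only ever sees the extracted items.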
Basically this implements #98, via ReAct.
Note: Steps is capped upstream at 50, so we use that as the maximum. When
the seed is left random, the API does not return it, so a good generated
image can't be reproduced, unless a random seed was explicitly set
beforehand.
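The reproducibility workaround above amounts to choosing the random seed
client-side before the request (a minimal sketch; the request shape and
function name are assumptions):

```typescript
// Sketch: clamp steps to the upstream limit and pick a concrete random
// seed ourselves, so the generation can be replayed later even though
// the API never echoes back an internally-chosen random seed.
const MAX_STEPS = 50; // upstream limit

function makeGenerationRequest(prompt: string, steps: number, seed?: number) {
  return {
    prompt,
    steps: Math.min(steps, MAX_STEPS),
    // record the seed on our side instead of letting the API randomize it
    seed: seed ?? Math.floor(Math.random() * 2 ** 32),
  };
}
```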