Mirror of https://github.com/enricoros/big-AGI.git
synced 2026-05-10 21:50:14 -07:00
Compare commits
61 Commits
| SHA1 |
|---|
| ff4857b9ac |
| 5b557705e7 |
| cd70c4dd84 |
| 9eb2ef05de |
| 8fae15d343 |
| bca5a1ac78 |
| d899fb7e3b |
| 0f05b70e3b |
| 7b121a3a95 |
| d4e414f99c |
| a7f322ef38 |
| d4494bf2e0 |
| 78cf74e3f2 |
| cfaed03603 |
| a8e3183733 |
| 9395db0fd5 |
| 8c75061178 |
| de0cdded87 |
| d225541da2 |
| 7a0008de5a |
| 0bdd817d6d |
| d606975584 |
| af56c2c1af |
| 73de7df0fb |
| 3ca80d6a6e |
| eb9e5362fe |
| 45d1ca7437 |
| e492ccfb04 |
| d01b6acd51 |
| eec81d5d73 |
| 03423ce58c |
| e2e7ea972d |
| 91b770d2c8 |
| 79500e6d8b |
| 4ede66cf2b |
| 40bff32442 |
| 3fc8e8efa0 |
| 12ea5f218d |
| d47c0e45af |
| 298d0201d2 |
| a6bde2377e |
| 76778c5ab7 |
| 11565f5ac8 |
| 6c5131996b |
| 9b4301cd90 |
| c73bbaf0d4 |
| 163257e052 |
| cf689ca9a9 |
| 4a65389b71 |
| 5de7762238 |
| 06655ced46 |
| 60a775b869 |
| 5a3645bd43 |
| 54d37e663a |
| f4c056fa9f |
| 8f53fa7407 |
| 2f9a4ea00f |
| ee7dae827e |
| 6fe94e344a |
| 3376867966 |
| 7f84160a62 |
@@ -1,7 +1,7 @@
# BIG-AGI 🧠✨

Welcome to big-AGI 👋, the GPT application for professionals that need function, form,
simplicity, and speed. Powered by the latest models from 8 vendors and
simplicity, and speed. Powered by the latest models from 11 vendors and
open-source model servers, `big-AGI` offers best-in-class Voice and Chat with AI Personas,
visualizations, coding, drawing, calling, and quite more -- all in a polished UX.

@@ -21,8 +21,21 @@ shows the current developments and future ideas.
- Got a suggestion? [_Add your roadmap ideas_](https://github.com/enricoros/big-agi/issues/new?&template=roadmap-request.md)
- Want to contribute? [_Pick up a task!_](https://github.com/users/enricoros/projects/4/views/4) - _easy_ to _pro_

### What's New in 1.11.0 · Jan 16, 2024 · Singularity

https://github.com/enricoros/big-AGI/assets/1590910/a6b8e172-0726-4b03-a5e5-10cfcb110c68

- **Find chats**: search in titles and content, with frequency ranking. [#329](https://github.com/enricoros/big-AGI/issues/329)
- **Commands**: command auto-completion (type '/'). [#327](https://github.com/enricoros/big-AGI/issues/327)
- **[Together AI](https://www.together.ai/products#inference)** inference platform support (good speed and newer models). [#346](https://github.com/enricoros/big-AGI/issues/346)
- Persona Creator history, deletion, custom creation, fix llm API timeouts
- Enable adding up to five custom OpenAI-compatible endpoints
- Developer enhancements: new 'Actiles' framework

### What's New in 1.10.0 · Jan 6, 2024 · The Year of AGI

https://github.com/enricoros/big-AGI/assets/32999/fbb1be49-5c38-49c8-86fa-3705700f6c39

- **New UI**: for both desktop and mobile, sets the stage for future scale. [#201](https://github.com/enricoros/big-AGI/issues/201)
- **Conversation Folders**: enhanced conversation organization. [#321](https://github.com/enricoros/big-AGI/issues/321)
- **[LM Studio](https://lmstudio.ai/)** support and improved token management
@@ -40,22 +53,7 @@ shows the current developments and future ideas.
- Layout fix for Firefox users
- Developer enhancements: Text2Image subsystem, Optima layout, ScrollToBottom library, Panes library, and Llms subsystem updates.

### What's New in 1.8.0 · Dec 20, 2023

- **Google Gemini Support**: Use the newest Google models. [#275](https://github.com/enricoros/big-agi/issues/275)
- **Mistral Platform**: Mixtral and future models support. [#273](https://github.com/enricoros/big-agi/issues/273)
- **Diagram Instructions**. Thanks to @joriskalz! [#280](https://github.com/enricoros/big-agi/pull/280)
- Ollama Chats: Enhanced chatting experience. [#270](https://github.com/enricoros/big-agi/issues/270)
- Mac Shortcuts Fix: Improved UX on Mac
- **Single-Tab Mode**: Data integrity with single window. [#268](https://github.com/enricoros/big-agi/issues/268)
- **Updated Models**: Latest Ollama (v0.1.17) and OpenRouter models
- Official Downloads: Easy access to the latest big-AGI on [big-AGI.com](https://big-agi.com)
- For developers: [troubleshot networking](https://github.com/enricoros/big-AGI/issues/276#issuecomment-1858591483), fixed Vercel deployment, cleaned up the LLMs/Streaming framework

### What's New in... ?

> [To The Moon And Back, Attachment Theory, Surf's Up, Loaded, and more releases...](docs/changelog.md).
> Check out the [big-AGI open roadmap](https://github.com/users/enricoros/projects/4/views/2)
For full details and former releases, check out the [changelog](docs/changelog.md).

## ✨ Key Features 👊

+11 -2
@@ -5,11 +5,20 @@ by release.

- For the live roadmap, please see [the GitHub project](https://github.com/users/enricoros/projects/4/views/2)

### 1.11.0 - Jan 2024
### 1.12.0 - Jan 2024

- milestone: [1.11.0](https://github.com/enricoros/big-agi/milestone/11)
- milestone: [1.12.0](https://github.com/enricoros/big-agi/milestone/12)
- work in progress: [big-AGI open roadmap](https://github.com/users/enricoros/projects/4/views/2), [help here](https://github.com/users/enricoros/projects/4/views/4)

### What's New in 1.11.0 · Jan 16, 2024 · Singularity

- **Find chats**: search in titles and content, with frequency ranking. [#329](https://github.com/enricoros/big-AGI/issues/329)
- **Commands**: command auto-completion (type '/'). [#327](https://github.com/enricoros/big-AGI/issues/327)
- **[Together AI](https://www.together.ai/products#inference)** inference platform support. [#346](https://github.com/enricoros/big-AGI/issues/346)
- Persona Creator history, deletion, custom creation, fix llm API timeouts
- Enable adding up to five custom OpenAI-compatible endpoints
- Developer enhancements: new 'Actiles' framework

### What's New in 1.10.0 · Jan 6, 2024 · The Year of AGI

- **New UI**: for both desktop and mobile, sets the stage for future scale. [#201](https://github.com/enricoros/big-AGI/issues/201)

@@ -28,6 +28,7 @@ GEMINI_API_KEY=
MISTRAL_API_KEY=
OLLAMA_API_HOST=
OPENROUTER_API_KEY=
TOGETHERAI_API_KEY=

# Model Observability: Helicone
HELICONE_API_KEY=

@@ -85,6 +86,7 @@ requiring the user to enter an API key
| `MISTRAL_API_KEY` | The API key for Mistral | Optional |
| `OLLAMA_API_HOST` | Changes the backend host for the Ollama vendor. See [config-ollama.md](config-ollama.md) | |
| `OPENROUTER_API_KEY` | The API key for OpenRouter | Optional |
| `TOGETHERAI_API_KEY` | The API key for Together AI | Optional |

### Model Observability: Helicone

Generated
+2 -2
@@ -1,12 +1,12 @@
{
"name": "big-agi",
"version": "1.10.0",
"version": "1.11.0",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "big-agi",
"version": "1.10.0",
"version": "1.11.0",
"hasInstallScript": true,
"dependencies": {
"@dqbd/tiktoken": "^1.0.7",

+1 -1
@@ -1,6 +1,6 @@
{
"name": "big-agi",
"version": "1.10.0",
"version": "1.11.0",
"private": true,
"scripts": {
"dev": "next dev",

@@ -4,13 +4,13 @@ import { Panel, PanelGroup } from 'react-resizable-panels';

import { useTheme } from '@mui/joy';

import { useCapabilityTextToImage } from '~/modules/t2i/t2i.client';
import { DiagramConfig, DiagramsModal } from '~/modules/aifn/digrams/DiagramsModal';
import { FlattenerModal } from '~/modules/aifn/flatten/FlattenerModal';
import { TradeConfig, TradeModal } from '~/modules/trade/TradeModal';
import { getChatLLMId, useChatLLM } from '~/modules/llms/store-llms';
import { imaginePromptFromText } from '~/modules/aifn/imagine/imaginePromptFromText';
import { speakText } from '~/modules/elevenlabs/elevenlabs.client';
import { useChatLLM, useModelsStore } from '~/modules/llms/store-llms';
import { useCapabilityTextToImage } from '~/modules/t2i/t2i.client';

import { Brand } from '~/common/app.config';
import { ConfirmationModal } from '~/common/components/ConfirmationModal';
@@ -148,7 +148,7 @@ export function AppChat() {
// Execution

const _handleExecute = React.useCallback(async (chatModeId: ChatModeId, conversationId: DConversationId, history: DMessage[]) => {
const { chatLLMId } = useModelsStore.getState();
const chatLLMId = getChatLLMId();
if (!chatModeId || !conversationId || !chatLLMId) return;

// "/command ...": overrides the chat mode
@@ -358,7 +358,7 @@ export function AppChat() {
// Shortcuts

const handleOpenChatLlmOptions = React.useCallback(() => {
const { chatLLMId } = useModelsStore.getState();
const chatLLMId = getChatLLMId();
if (!chatLLMId) return;
openLlmOptions(chatLLMId);
}, [openLlmOptions]);

@@ -7,10 +7,12 @@ export const CommandsAlter: ICommandsProvider = {
getCommands: () => [{
primary: '/assistant',
alternatives: ['/a'],
arguments: ['text'],
description: 'Injects assistant response',
}, {
primary: '/system',
alternatives: ['/s'],
arguments: ['text'],
description: 'Injects system message',
}],

@@ -8,6 +8,7 @@ export const CommandsBrowse: ICommandsProvider = {

getCommands: () => [{
primary: '/browse',
arguments: ['URL'],
description: 'Assistant will download the web page',
Icon: LanguageIcon,
}],

@@ -9,7 +9,8 @@ export const CommandsDraw: ICommandsProvider = {
getCommands: () => [{
primary: '/draw',
alternatives: ['/imagine', '/img'],
description: 'Generate an image from text',
arguments: ['prompt'],
description: 'Assistant will draw the text',
Icon: FormatPaintIcon,
}],

@@ -7,7 +7,6 @@ export const CommandsHelp: ICommandsProvider = {
getCommands: () => [{
primary: '/help',
alternatives: ['/?'],
noArgs: true,
description: 'Display this list of commands',
}],

@@ -8,6 +8,7 @@ export const CommandsReact: ICommandsProvider = {

getCommands: () => [{
primary: '/react',
arguments: ['prompt'],
description: 'Use the AI ReAct strategy to answer your query (as sidebar)',
Icon: PsychologyIcon,
}],

@@ -5,7 +5,7 @@ import type { CommandsProviderId } from './commands.registry';
export interface ChatCommand {
primary: string; // The primary command
alternatives?: string[]; // Alternative commands
noArgs?: boolean; // Whether the command requires arguments
arguments?: string[]; // Arguments for the command
description: string; // Description of what the command does
// usage?: string; // Example of how to use the command
Icon?: FunctionComponent; // Icon to display next to the command

@@ -46,7 +46,7 @@ export function extractChatCommand(input: string): TextCommandPiece[] {
if (cmd.primary === potentialCommand || cmd.alternatives?.includes(potentialCommand)) {

// command needs arguments: take the rest of the input as parameters
if (cmd.noArgs !== true) {
if (cmd.arguments?.length) {
const params = firstSpaceIndex >= 0 ? inputTrimmed.substring(firstSpaceIndex + 1) : '';
return [{ type: 'cmd', providerId: provider.id, command: potentialCommand, params: params || undefined, isError: !params || undefined }];
}

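The hunk above switches argument detection from a `noArgs` flag to checking the command's declared `arguments` list. A minimal, self-contained sketch of that matching rule (names such as `parseSlashCommand` and the simplified return shape are illustrative, not the repo's actual API):

```typescript
// Hypothetical sketch of the extraction rule in the diff above: a command
// that declares `arguments` consumes the rest of the input as its params.
interface SketchCommand {
  primary: string;
  alternatives?: string[];
  arguments?: string[];
}

interface ParsedCommand {
  command: string;
  params?: string;
  isError?: boolean;
}

function parseSlashCommand(input: string, commands: SketchCommand[]): ParsedCommand | null {
  const trimmed = input.trim();
  if (!trimmed.startsWith('/')) return null;

  // the candidate command is everything up to the first space
  const firstSpace = trimmed.indexOf(' ');
  const potential = firstSpace >= 0 ? trimmed.slice(0, firstSpace) : trimmed;

  for (const cmd of commands) {
    if (cmd.primary === potential || cmd.alternatives?.includes(potential)) {
      // command declares arguments: the rest of the input becomes params,
      // and missing params are flagged as an error
      if (cmd.arguments?.length) {
        const params = firstSpace >= 0 ? trimmed.slice(firstSpace + 1) : '';
        return { command: potential, params: params || undefined, isError: !params || undefined };
      }
      return { command: potential };
    }
  }
  return null;
}
```

Under this rule, `/img` with no prompt matches via the alternative but is marked as an error, mirroring the `isError: !params || undefined` expression in the diff.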
@@ -1,7 +1,7 @@
import * as React from 'react';
import { shallow } from 'zustand/shallow';

import { Box, IconButton, ListDivider, ListItemDecorator, MenuItem, Tooltip } from '@mui/joy';
import { Box, IconButton, ListDivider, ListItemButton, ListItemDecorator, Tooltip } from '@mui/joy';
import AddIcon from '@mui/icons-material/Add';
import DeleteOutlineIcon from '@mui/icons-material/DeleteOutline';
import FileDownloadIcon from '@mui/icons-material/FileDownload';
@@ -16,12 +16,13 @@ import { conversationTitle, DConversationId, useChatStore } from '~/common/state
import { useOptimaDrawers } from '~/common/layout/optima/useOptimaDrawers';
import { useUIPreferencesStore } from '~/common/state/store-ui';
import { useUXLabsStore } from '~/common/state/store-ux-labs';
import DebounceInput from '~/common/components/DebounceInput';

import { ChatFolderList } from './folder/ChatFolderList';
import { ChatDrawerItemMemo, ChatNavigationItemData } from './ChatNavigationItem';

// type ListGrouping = 'off' | 'persona';

// type ListGrouping = 'off' | 'persona';

/*
* Optimization: return a reduced version of the DConversation object for 'Drawer Items' purposes,
@@ -53,7 +54,7 @@ export const useChatNavigationItems = (activeConversationId: DConversationId | n
conversationId: _c.id,
isActive: _c.id === activeConversationId,
isEmpty: !_c.messages.length && !_c.userTitle,
title: conversationTitle(_c, 'New Title'),
title: conversationTitle(_c),
messageCount: _c.messages.length,
assistantTyping: !!_c.abortController,
systemPurposeId: _c.systemPurposeId,
@@ -84,6 +85,8 @@ function ChatDrawerItems(props: {
}) {

// local state
const [debouncedSearchQuery, setDebouncedSearchQuery] = React.useState('');

// const [grouping] = React.useState<ListGrouping>('off');
const { onConversationDelete, onConversationNew, onConversationActivate } = props;

@@ -95,7 +98,6 @@ function ChatDrawerItems(props: {
const labsEnhancedUI = useUXLabsStore(state => state.labsEnhancedUI);

// derived state
const maxChatMessages = chatNavItems.reduce((longest, _c) => Math.max(longest, _c.messageCount), 1);
const selectConversationsCount = chatNavItems.length;
const nonEmptyChats = selectConversationsCount > 1 || (selectConversationsCount === 1 && !chatNavItems[0].isEmpty);
const singleChat = selectConversationsCount === 1;
@@ -118,6 +120,35 @@ function ChatDrawerItems(props: {
}, [onConversationDelete, singleChat]);


// Filter chatNavItems based on the search query and rank them by search frequency
const filteredChatNavItems = React.useMemo(() => {
if (!debouncedSearchQuery) return chatNavItems;
return chatNavItems
.map(item => {
// Get the conversation by ID
const conversation = useChatStore.getState().conversations.find(c => c.id === item.conversationId);
// Calculate the frequency of the search term in the title and messages
const titleFrequency = (item.title.toLowerCase().match(new RegExp(debouncedSearchQuery.toLowerCase(), 'g')) || []).length;
const messageFrequency = conversation?.messages.reduce((count, message) => {
return count + (message.text.toLowerCase().match(new RegExp(debouncedSearchQuery.toLowerCase(), 'g')) || []).length;
}, 0) || 0;
// Return the item with the searchFrequency property
return {
...item,
searchFrequency: titleFrequency + messageFrequency,
};
})
// Exclude items with a searchFrequency of 0
.filter(item => item.searchFrequency > 0)
// Sort the items by searchFrequency in descending order
.sort((a, b) => b.searchFrequency! - a.searchFrequency!);
}, [chatNavItems, debouncedSearchQuery]);


// basis for the underline bar
const bottomBarBasis = filteredChatNavItems.reduce((longest, _c) => Math.max(longest, _c.searchFrequency ?? _c.messageCount), 1);


// grouping
/*let sortedIds = conversationIDs;
if (grouping === 'persona') {
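The memo above ranks chats by how often the query occurs in the title plus all messages, drops non-matches, and sorts descending. The counting-and-ranking core can be sketched as a standalone function; note that the diff builds its `RegExp` from the raw query, so the `escapeRegExp` helper here is an addition that guards against metacharacters in inputs like `c++`:

```typescript
// Sketch of the frequency ranking used by the chat search above. Names
// (rankChats, RankableChat) are illustrative, not the repo's actual API.
function escapeRegExp(s: string): string {
  return s.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
}

// count case-insensitive occurrences of `query` in `text`
function countOccurrences(text: string, query: string): number {
  if (!query) return 0;
  const re = new RegExp(escapeRegExp(query.toLowerCase()), 'g');
  return (text.toLowerCase().match(re) || []).length;
}

interface RankableChat {
  title: string;
  messages: string[];
}

// Rank chats by total query frequency in title + messages, dropping misses.
function rankChats<T extends RankableChat>(chats: T[], query: string): (T & { searchFrequency: number })[] {
  return chats
    .map(chat => ({
      ...chat,
      searchFrequency: countOccurrences(chat.title, query)
        + chat.messages.reduce((n, m) => n + countOccurrences(m, query), 0),
    }))
    .filter(chat => chat.searchFrequency > 0)
    .sort((a, b) => b.searchFrequency - a.searchFrequency);
}
```

The `searchFrequency` result doubles as the basis for the underline bar, matching the `bottomBarBasis` reduction in the diff.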
@@ -154,6 +185,16 @@ function ChatDrawerItems(props: {
/>

{/* Folders List */}
{/*<Box sx={{*/}
{/*  display: 'grid',*/}
{/*  gridTemplateRows: !useFolders ? '0fr' : '1fr',*/}
{/*  transition: 'grid-template-rows 0.42s cubic-bezier(.17,.84,.44,1)',*/}
{/*  '& > div': {*/}
{/*    padding: useFolders ? 2 : 0,*/}
{/*    transition: 'padding 0.42s cubic-bezier(.17,.84,.44,1)',*/}
{/*    overflow: 'hidden',*/}
{/*  },*/}
{/*}}>*/}
{useFolders && (
<ChatFolderList
folders={folders}
@@ -161,13 +202,23 @@ function ChatDrawerItems(props: {
onFolderSelect={props.setSelectedFolderId}
/>
)}
{/*</Box>*/}

{/* Chats List */}
<PageDrawerList variant='plain' noTopPadding noBottomPadding tallRows>

{useFolders && <ListDivider sx={{ mb: 0 }} />}

<MenuItem disabled={props.disableNewButton} onClick={handleButtonNew} sx={PageDrawerTallItemSx}>
{/* Search Input Field */}
<DebounceInput
onDebounce={setDebouncedSearchQuery}
debounceTimeout={300}
placeholder='Search...'
aria-label='Search'
sx={{ m: 2 }}
/>

<ListItemButton disabled={props.disableNewButton} onClick={handleButtonNew} sx={PageDrawerTallItemSx}>
<ListItemDecorator><AddIcon /></ListItemDecorator>
<Box sx={{
// style
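The `DebounceInput` added above defers search updates by 300 ms so the ranking memo only re-runs once typing pauses. The underlying pattern is a plain timer reset; this is a generic sketch of that idea, not the component's actual implementation:

```typescript
// Generic debounce sketch behind components like DebounceInput above:
// each call cancels the pending timer, so `fn` fires once input goes quiet.
function debounce<A extends unknown[]>(fn: (...args: A) => void, ms: number): (...args: A) => void {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: A) => {
    if (timer !== undefined) clearTimeout(timer);
    timer = setTimeout(() => fn(...args), ms);
  };
}
```

Wired to an input's change handler with `onDebounce`-style semantics, three rapid keystrokes produce a single state update carrying the final value.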
@@ -182,7 +233,7 @@ function ChatDrawerItems(props: {
New chat
{/*<KeyStroke combo='Ctrl + Alt + N' sx={props.disableNewButton ? { opacity: 0.5 } : undefined} />*/}
</Box>
</MenuItem>
</ListItemButton>

{/*<ListDivider sx={{ mt: 0 }} />*/}

@@ -201,13 +252,13 @@ function ChatDrawerItems(props: {
{/* </ToggleButtonGroup>*/}
{/*</ListItem>*/}

{chatNavItems.map(item =>
{filteredChatNavItems.map(item =>
<ChatDrawerItemMemo
key={'nav-' + item.conversationId}
item={item}
isLonely={singleChat}
maxChatMessages={(labsEnhancedUI || softMaxReached) ? maxChatMessages : 0}
showSymbols={showSymbols}
bottomBarBasis={(labsEnhancedUI || softMaxReached || debouncedSearchQuery) ? bottomBarBasis : 0}
onConversationActivate={handleConversationActivate}
onConversationDelete={handleConversationDelete}
/>)}
@@ -217,26 +268,28 @@ function ChatDrawerItems(props: {

<Box sx={{ display: 'flex', alignItems: 'center' }}>

<MenuItem onClick={props.onConversationImportDialog} sx={{ flex: 1 }}>
<ListItemButton onClick={props.onConversationImportDialog} sx={{ flex: 1 }}>
<ListItemDecorator>
<FileUploadIcon />
</ListItemDecorator>
Import
{/*<OpenAIIcon sx={{ ml: 'auto' }} />*/}
</MenuItem>
</ListItemButton>

<MenuItem disabled={!nonEmptyChats} onClick={() => props.onConversationExportDialog(props.activeConversationId)} sx={{ flex: 1, display: 'flex', justifyContent: 'flex-end', gap: 2.5 }}>
<ListItemButton disabled={!nonEmptyChats} onClick={() => props.onConversationExportDialog(props.activeConversationId)} sx={{ flex: 1 }}>
<ListItemDecorator>
<FileDownloadIcon />
</ListItemDecorator>
Export
<FileDownloadIcon />
</MenuItem>
</ListItemButton>
</Box>

<MenuItem disabled={!nonEmptyChats} onClick={props.onConversationsDeleteAll}>
<ListItemButton disabled={!nonEmptyChats} onClick={props.onConversationsDeleteAll}>
<ListItemDecorator>
<DeleteOutlineIcon />
</ListItemDecorator>
Delete {selectConversationsCount >= 2 ? `all ${selectConversationsCount} chats` : 'chat'}
</MenuItem>
</ListItemButton>

</PageDrawerList>

@@ -9,7 +9,6 @@ import { SystemPurposeId, SystemPurposes } from '../../../../data';

import { DConversationId, useChatStore } from '~/common/state/store-chats';
import { InlineTextarea } from '~/common/components/InlineTextarea';
import { useUIPreferencesStore } from '~/common/state/store-ui';

const DEBUG_CONVERSATION_IDs = false;
@@ -25,13 +24,14 @@ export interface ChatNavigationItemData {
messageCount: number;
assistantTyping: boolean;
systemPurposeId: SystemPurposeId;
searchFrequency?: number;
}

function ChatNavigationItem(props: {
item: ChatNavigationItemData,
isLonely: boolean,
maxChatMessages: number,
showSymbols: boolean,
bottomBarBasis: number,
onConversationActivate: (conversationId: DConversationId, closeMenu: boolean) => void,
onConversationDelete: (conversationId: DConversationId) => void,
}) {
@@ -40,11 +40,8 @@ function ChatNavigationItem(props: {
const [isEditingTitle, setIsEditingTitle] = React.useState(false);
const [deleteArmed, setDeleteArmed] = React.useState(false);

// external state
const doubleClickToEdit = useUIPreferencesStore(state => state.doubleClickToEdit);

// derived state
const { conversationId, isActive, title, messageCount, assistantTyping, systemPurposeId } = props.item;
const { conversationId, isActive, title, messageCount, assistantTyping, systemPurposeId, searchFrequency } = props.item;
const isNew = messageCount === 0;

// auto-close the arming menu when clicking away
@@ -62,7 +59,11 @@ function ChatNavigationItem(props: {

const handleTitleEdited = (text: string) => {
setIsEditingTitle(false);
useChatStore.getState().setUserTitle(conversationId, text);
useChatStore.getState().setUserTitle(conversationId, text.trim());
};

const handleTitleEditCancel = () => {
setIsEditingTitle(false);
};

const handleDeleteButtonShow = (event: React.MouseEvent) => {
@@ -85,14 +86,14 @@ function ChatNavigationItem(props: {

const textSymbol = SystemPurposes[systemPurposeId]?.symbol || '❓';
const buttonSx: SxProps = { ml: 1, ...(isActive ? { color: 'white' } : {}) };
const buttonSx: SxProps = isActive ? { color: 'white' } : {};

const progress = props.maxChatMessages ? 100 * messageCount / props.maxChatMessages : 0;
const progress = props.bottomBarBasis ? 100 * (searchFrequency ?? messageCount) / props.bottomBarBasis : 0;

return (
<ListItemButton
variant={isActive ? 'soft' : 'plain'} color='neutral'
onClick={handleConversationActivate}
onClick={!isActive ? handleConversationActivate : event => event.preventDefault()}
sx={{
// py: 0,
position: 'relative',
@@ -105,7 +106,7 @@ function ChatNavigationItem(props: {
{/* Optional progress bar, underlay */}
{progress > 0 && (
<Box sx={{
backgroundColor: 'neutral.softActiveBg',
backgroundColor: 'neutral.softBg',
position: 'absolute', left: 0, bottom: 0, width: progress + '%', height: 4,
}} />
)}
@@ -118,28 +119,33 @@ function ChatNavigationItem(props: {
alt='typing' variant='plain'
src='https://i.giphy.com/media/jJxaUysjzO9ri/giphy.webp'
sx={{
width: 24,
height: 24,
width: '1.5rem',
height: '1.5rem',
borderRadius: 'var(--joy-radius-sm)',
}}
/>
) : (
<Typography sx={{ fontSize: '18px' }}>
<Typography>
{isNew ? '' : textSymbol}
</Typography>
)}
</ListItemDecorator>}


{/* Text */}
{!isEditingTitle ? (

<Box onDoubleClick={() => doubleClickToEdit ? handleTitleEdit() : null} sx={{ flexGrow: 1 }}>
{DEBUG_CONVERSATION_IDs ? conversationId.slice(0, 10) : title}{assistantTyping && '...'}
</Box>
<Typography
level={isActive ? 'title-md' : 'body-md'}
onDoubleClick={handleTitleEdit}
sx={{ flex: 1 }}
>
{DEBUG_CONVERSATION_IDs ? conversationId.slice(0, 10) : (title.trim() ? title : 'Chat')}{assistantTyping && '...'}
</Typography>

) : (

<InlineTextarea initialText={title} onEdit={handleTitleEdited} sx={{ ml: -1.5, mr: -0.5, flexGrow: 1 }} />
<InlineTextarea initialText={title} onEdit={handleTitleEdited} onCancel={handleTitleEditCancel} sx={{ ml: -1.5, mr: -0.5, flexGrow: 1 }} />

)}

@@ -153,8 +159,17 @@ function ChatNavigationItem(props: {
{/* <EditIcon />*/}
{/*</IconButton>*/}

{/* Display search frequency if it exists and is greater than 0 */}
{searchFrequency && searchFrequency > 0 && (
<Box sx={{ ml: 1 }}>
<Typography level='body-sm'>
{searchFrequency}
</Typography>
</Box>
)}

{/* Delete Arming */}
{!props.isLonely && !deleteArmed && (
{!props.isLonely && !deleteArmed && !searchFrequency && (
<IconButton
variant={isActive ? 'solid' : 'outlined'}
size='sm'
@@ -166,7 +181,7 @@ function ChatNavigationItem(props: {
)}

{/* Delete / Cancel buttons */}
{!props.isLonely && deleteArmed && <>
{!props.isLonely && deleteArmed && !searchFrequency && <>
<IconButton size='sm' variant='solid' color='danger' sx={buttonSx} onClick={handleConversationDelete}>
<DeleteOutlineIcon />
</IconButton>

@@ -1,7 +1,7 @@
import * as React from 'react';
import { DragDropContext, Draggable, DropResult } from 'react-beautiful-dnd';

import { List, ListItem, ListItemButton, ListItemContent, ListItemDecorator, MenuList, Sheet, Typography } from '@mui/joy';
import { List, ListItem, ListItemButton, ListItemContent, ListItemDecorator, Sheet, Typography } from '@mui/joy';
import FolderIcon from '@mui/icons-material/Folder';

import { DFolder, useFolderStore } from '~/common/state/store-folders';
@@ -30,7 +30,7 @@ export function ChatFolderList(props: {

return (
<Sheet variant='soft' sx={{ p: 2 }}>
<MenuList
<List
variant='plain'
sx={(theme) => ({
'& ul': {
@@ -129,7 +129,7 @@ export function ChatFolderList(props: {
</StrictModeDroppable>
</DragDropContext>
</ListItem>
</MenuList>
</List>

<AddFolderButton />
</Sheet>

@@ -3,7 +3,7 @@ import * as React from 'react';
import FolderIcon from '@mui/icons-material/Folder';

import type { DConversationId } from '~/common/state/store-chats';
import { DropdownItems, GoodDropdown } from '~/common/components/GoodDropdown';
import { DropdownItems, PageBarDropdown } from '~/common/layout/optima/components/PageBarDropdown';
import { useFolderStore } from '~/common/state/store-folders';

@@ -61,7 +61,7 @@ export function useFolderDropdown(conversationId: DConversationId | null) {
return null;

return (
<GoodDropdown
<PageBarDropdown
items={folderItems}
value={currentFolderId}
onChange={handleFolderChange}

@@ -7,7 +7,7 @@ import SettingsIcon from '@mui/icons-material/Settings';

import { DLLM, DLLMId, DModelSourceId, useModelsStore } from '~/modules/llms/store-llms';

import { GoodDropdown, DropdownItems } from '~/common/components/GoodDropdown';
import { PageBarDropdown, DropdownItems } from '~/common/layout/optima/components/PageBarDropdown';
import { KeyStroke } from '~/common/components/KeyStroke';
import { useOptimaLayout } from '~/common/layout/optima/useOptimaLayout';

@@ -54,7 +54,7 @@ function AppBarLLMDropdown(props: {


return (
<GoodDropdown
<PageBarDropdown
items={llmItems}
value={props.chatLlmId} onChange={handleChatLLMChange}
placeholder={props.placeholder || 'Models …'}

||||
@@ -7,7 +7,7 @@ import CallIcon from '@mui/icons-material/Call';
|
||||
import { SystemPurposeId, SystemPurposes } from '../../../../data';
|
||||
|
||||
import { DConversationId, useChatStore } from '~/common/state/store-chats';
|
||||
import { GoodDropdown } from '~/common/components/GoodDropdown';
|
||||
import { PageBarDropdown } from '~/common/layout/optima/components/PageBarDropdown';
|
||||
import { launchAppCall } from '~/common/app.routes';
|
||||
import { useUIPreferencesStore } from '~/common/state/store-ui';
|
||||
import { useUXLabsStore } from '~/common/state/store-ux-labs';
|
||||
@@ -42,7 +42,7 @@ function AppBarPersonaDropdown(props: {
}

return (
<GoodDropdown
<PageBarDropdown
items={SystemPurposes} showSymbols={zenMode !== 'cleaner'}
value={props.systemPurposeId} onChange={handleSystemPurposeChange}
appendOption={appendOption}

@@ -36,6 +36,10 @@ import { useOptimaLayout } from '~/common/layout/optima/useOptimaLayout';
import { useUIPreferencesStore } from '~/common/state/store-ui';
import { useUXLabsStore } from '~/common/state/store-ux-labs';

import type { ActileItem, ActileProvider } from './actile/ActileProvider';
import { providerCommands } from './actile/providerCommands';
import { useActileManager } from './actile/useActileManager';

import type { AttachmentId } from './attachments/store-attachments';
import { Attachments } from './attachments/Attachments';
import { getTextBlockText, useLLMAttachments } from './attachments/useLLMAttachments';
@@ -187,13 +191,61 @@ export function Composer(props: {
};


// Text actions
// Mode menu

const handleTextAreaTextChange = React.useCallback((e: React.ChangeEvent<HTMLTextAreaElement>) => {
const handleModeSelectorHide = () => setChatModeMenuAnchor(null);

const handleModeSelectorShow = (event: React.MouseEvent<HTMLAnchorElement>) =>
setChatModeMenuAnchor(anchor => anchor ? null : event.currentTarget);

const handleModeChange = (_chatModeId: ChatModeId) => {
handleModeSelectorHide();
setChatModeId(_chatModeId);
};


// Actiles

const onActileCommandSelect = React.useCallback((item: ActileItem) => {
if (props.composerTextAreaRef.current) {
const textArea = props.composerTextAreaRef.current;
const currentText = textArea.value;
const cursorPos = textArea.selectionStart;

// Find the position where the command starts
const commandStart = currentText.lastIndexOf('/', cursorPos);

// Construct the new text with the autocompleted command
const newText = currentText.substring(0, commandStart) + item.label + ' ' + currentText.substring(cursorPos);

// Update the text area with the new text
setComposeText(newText);

// Move the cursor to the end of the autocompleted command
const newCursorPos = commandStart + item.label.length + 1;
textArea.setSelectionRange(newCursorPos, newCursorPos);
}
}, [props.composerTextAreaRef, setComposeText]);

const actileProviders: ActileProvider[] = React.useMemo(() => {
return [providerCommands(onActileCommandSelect)];
}, [onActileCommandSelect]);

const { actileComponent, actileInterceptKeydown } = useActileManager(actileProviders, props.composerTextAreaRef);


// Text typing

const handleTextareaTextChange = React.useCallback((e: React.ChangeEvent<HTMLTextAreaElement>) => {
setComposeText(e.target.value);
}, [setComposeText]);

const handleTextareaKeyDown = React.useCallback((e: React.KeyboardEvent) => {
const handleTextareaKeyDown = React.useCallback((e: React.KeyboardEvent<HTMLTextAreaElement>) => {
// disable keyboard handling if the actile is visible
if (actileInterceptKeydown(e))
return;

// Enter: primary action
if (e.key === 'Enter') {

// Alt: append the message instead
@@ -209,20 +261,8 @@ export function Composer(props: {
|
||||
return e.preventDefault();
|
||||
}
|
||||
}
|
||||
}, [assistantAbortible, chatModeId, composeText, enterIsNewline, handleSendAction]);
|
||||
|
||||
|
||||
// Mode menu
|
||||
|
||||
const handleModeSelectorHide = () => setChatModeMenuAnchor(null);
|
||||
|
||||
const handleModeSelectorShow = (event: React.MouseEvent<HTMLAnchorElement>) =>
|
||||
setChatModeMenuAnchor(anchor => anchor ? null : event.currentTarget);
|
||||
|
||||
const handleModeChange = (_chatModeId: ChatModeId) => {
|
||||
handleModeSelectorHide();
|
||||
setChatModeId(_chatModeId);
|
||||
};
|
||||
}, [actileInterceptKeydown, assistantAbortible, chatModeId, composeText, enterIsNewline, handleSendAction]);
|
||||
|
||||
|
||||
// Mic typing & continuation mode
|
||||
@@ -453,7 +493,7 @@ export function Composer(props: {
|
||||
minRows={isMobile ? 5 : 5} maxRows={10}
|
||||
placeholder={textPlaceholder}
|
||||
value={composeText}
|
||||
onChange={handleTextAreaTextChange}
|
||||
onChange={handleTextareaTextChange}
|
||||
onDragEnter={handleTextareaDragEnter}
|
||||
onDragStart={handleTextareaDragStart}
|
||||
onKeyDown={handleTextareaKeyDown}
|
||||
@@ -663,6 +703,9 @@ export function Composer(props: {
|
||||
/>
|
||||
)}
|
||||
|
||||
{/* Actile */}
|
||||
{actileComponent}
|
||||
|
||||
</Grid>
|
||||
</Box>
|
||||
);
|
||||
|
||||
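The splice in `onActileCommandSelect` above can be sketched as a standalone pure function. This is an illustrative helper (the name `spliceCommand` is not in the diff): it locates the `/` trigger backwards from the caret, replaces the typed prefix with the selected command label plus a trailing space, and returns the new caret position.

```typescript
// Hypothetical standalone version of the autocomplete splice used in
// onActileCommandSelect: replace the partially-typed '/...' command with
// the full label and compute where the caret should land.
function spliceCommand(text: string, caretPos: number, label: string): { text: string; caret: number } {
  // find where the '/' trigger starts, scanning backwards from the caret
  const commandStart = text.lastIndexOf('/', caretPos);
  // splice in the full command label followed by a space
  const newText = text.substring(0, commandStart) + label + ' ' + text.substring(caretPos);
  // caret lands just after the inserted label and its trailing space
  return { text: newText, caret: commandStart + label.length + 1 };
}
```

For example, with the text `hello /dr`, caret at the end, and the selected command `/draw`, the result is `hello /draw ` with the caret after the space.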
@@ -0,0 +1,81 @@
import * as React from 'react';

import { Box, ListItem, ListItemButton, ListItemDecorator, Sheet, Typography } from '@mui/joy';

import { CloseableMenu } from '~/common/components/CloseableMenu';

import type { ActileItem } from './ActileProvider';


export function ActilePopup(props: {
anchorEl: HTMLElement | null,
onClose: () => void,
title?: string,
items: ActileItem[],
activeItemIndex: number | undefined,
activePrefixLength: number,
onItemClick: (item: ActileItem) => void,
children?: React.ReactNode
}) {

const hasAnyIcon = props.items.some(item => !!item.Icon);

return (
<CloseableMenu open anchorEl={props.anchorEl} onClose={props.onClose} noTopPadding>

{!!props.title && (
<Sheet variant='soft' sx={{ p: 1, borderBottom: '1px solid', borderBottomColor: 'neutral.softActiveBg' }}>
{/*<ListItemDecorator/>*/}
<Typography level='title-md'>
{props.title}
</Typography>
</Sheet>
)}

{!props.items.length && (
<ListItem variant='soft' color='primary'>
<Typography level='body-md'>
No matching command
</Typography>
</ListItem>
)}

{props.items.map((item, idx) => {
const labelBold = item.label.slice(0, props.activePrefixLength);
const labelNormal = item.label.slice(props.activePrefixLength);
return (
<ListItemButton
key={item.id}
variant={idx === props.activeItemIndex ? 'soft' : undefined}
onClick={() => props.onItemClick(item)}
>
{hasAnyIcon && (
<ListItemDecorator>
{item.Icon ? <item.Icon /> : null}
</ListItemDecorator>
)}
<Box>

<Box sx={{ display: 'flex', alignItems: 'center', gap: 1 }}>
<Typography level='title-md' color='primary'>
<span style={{ fontWeight: 600, textDecoration: 'underline' }}>{labelBold}</span>{labelNormal}
</Typography>
{item.argument && <Typography level='body-sm'>
{item.argument}
</Typography>}
</Box>

{!!item.description && <Typography level='body-xs'>
{item.description}
</Typography>}
</Box>
</ListItemButton>
);
},
)}

{props.children}

</CloseableMenu>
);
}
@@ -0,0 +1,21 @@
import type { FunctionComponent } from 'react';

export interface ActileItem {
id: string;
label: string;
argument?: string;
description?: string;
Icon?: FunctionComponent;
}

type ActileProviderIds = 'actile-commands' | 'actile-attach-reference';

export interface ActileProvider {
id: ActileProviderIds;
title: string;

checkTriggerText: (trailingText: string) => boolean;

fetchItems: () => Promise<ActileItem[]>;
onItemSelect: (item: ActileItem) => void;
}
@@ -0,0 +1,23 @@
import { ActileItem, ActileProvider } from './ActileProvider';


export const providerAttachReference: ActileProvider = {
id: 'actile-attach-reference',
title: 'Attach Reference',

checkTriggerText: (trailingText: string) =>
trailingText.endsWith(' @'),

fetchItems: async () => {
return [{
id: 'test-1',
label: 'Attach This',
description: 'Attach this to the message',
Icon: undefined,
}];
},

onItemSelect: (item: ActileItem) => {
console.log('Selected item:', item);
},
};
@@ -0,0 +1,23 @@
import { ActileItem, ActileProvider } from './ActileProvider';
import { findAllChatCommands } from '../../../commands/commands.registry';


export const providerCommands = (onItemSelect: (item: ActileItem) => void): ActileProvider => ({
id: 'actile-commands',
title: 'Chat Commands',

checkTriggerText: (trailingText: string) =>
trailingText.trim() === '/',

fetchItems: async () => {
return findAllChatCommands().map((cmd) => ({
id: cmd.primary,
label: cmd.primary,
argument: cmd.arguments?.join(' ') ?? undefined,
description: cmd.description,
Icon: cmd.Icon,
}));
},

onItemSelect,
});
@@ -0,0 +1,118 @@
import * as React from 'react';
import { ActileItem, ActileProvider } from './ActileProvider';
import { ActilePopup } from './ActilePopup';


export const useActileManager = (providers: ActileProvider[], anchorRef: React.RefObject<HTMLElement>) => {

// state
const [popupOpen, setPopupOpen] = React.useState(false);
const [provider, setProvider] = React.useState<ActileProvider | null>(null);

const [items, setItems] = React.useState<ActileItem[]>([]);
const [activeSearchString, setActiveSearchString] = React.useState<string>('');
const [activeItemIndex, setActiveItemIndex] = React.useState<number>(0);


// derived state
const activeItems = React.useMemo(() => {
const search = activeSearchString.trim().toLowerCase();
return items.filter(item => item.label.toLowerCase().startsWith(search));
}, [items, activeSearchString]);
const activeItem = activeItemIndex >= 0 && activeItemIndex < activeItems.length ? activeItems[activeItemIndex] : null;


const handleClose = React.useCallback(() => {
setPopupOpen(false);
setProvider(null);
setItems([]);
setActiveSearchString('');
setActiveItemIndex(0);
}, []);

const handlePopupItemClicked = React.useCallback((item: ActileItem) => {
provider?.onItemSelect(item);
handleClose();
}, [handleClose, provider]);

const handleEnterKey = React.useCallback(() => {
activeItem && handlePopupItemClicked(activeItem);
}, [activeItem, handlePopupItemClicked]);


const actileInterceptKeydown = React.useCallback((_event: React.KeyboardEvent<HTMLTextAreaElement>): boolean => {

// Popup open: Intercept

const { key, currentTarget, ctrlKey, metaKey } = _event;

if (popupOpen) {
if (key === 'Escape' || key === 'ArrowLeft') {
_event.preventDefault();
handleClose();
} else if (key === 'ArrowUp') {
_event.preventDefault();
setActiveItemIndex((prevIndex) => (prevIndex > 0 ? prevIndex - 1 : activeItems.length - 1));
} else if (key === 'ArrowDown') {
_event.preventDefault();
setActiveItemIndex((prevIndex) => (prevIndex < activeItems.length - 1 ? prevIndex + 1 : 0));
} else if (key === 'Enter' || key === 'ArrowRight' || key === 'Tab' || (key === ' ' && activeItems.length === 1)) {
_event.preventDefault();
handleEnterKey();
} else if (key === 'Backspace') {
handleClose();
} else if (key.length === 1 && !ctrlKey && !metaKey) {
setActiveSearchString((prev) => prev + key);
setActiveItemIndex(0);
}
return true;
}

// Popup closed: Check for triggers

// optimization
if (key !== '/' && key !== '@')
return false;

const trailingText = (currentTarget.value || '') + key;

// check all rules to find one that triggers
for (const provider of providers) {
if (provider.checkTriggerText(trailingText)) {
setProvider(provider);
setPopupOpen(true);
setActiveSearchString(key);
provider
.fetchItems()
.then(items => setItems(items))
.catch(error => {
handleClose();
console.error('Failed to fetch popup items:', error);
});
return true;
}
}

return false;
}, [activeItems.length, handleClose, handleEnterKey, popupOpen, providers]);


const actileComponent = React.useMemo(() => {
return !popupOpen ? null : (
<ActilePopup
anchorEl={anchorRef.current}
onClose={handleClose}
title={provider?.title}
items={activeItems}
activeItemIndex={activeItemIndex}
activePrefixLength={activeSearchString.length}
onItemClick={handlePopupItemClicked}
/>
);
}, [activeItemIndex, activeItems, activeSearchString.length, anchorRef, handleClose, handlePopupItemClicked, popupOpen, provider?.title]);

return {
actileComponent,
actileInterceptKeydown,
};
};
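The ArrowUp/ArrowDown branches above implement wrap-around selection over the filtered item list. The same arithmetic can be sketched as a pure function (the name `moveSelection` is illustrative, not from the diff): moving up from the first item wraps to the last, and moving down from the last wraps to the first.

```typescript
// Illustrative sketch of the wrap-around selection arithmetic that
// useActileManager applies on ArrowUp (delta = -1) and ArrowDown (delta = 1).
function moveSelection(index: number, count: number, delta: -1 | 1): number {
  if (count === 0) return 0;                 // nothing to select
  return delta === -1
    ? (index > 0 ? index - 1 : count - 1)    // ArrowUp: wrap to the last item
    : (index < count - 1 ? index + 1 : 0);   // ArrowDown: wrap to the first item
}
```

This mirrors the two `setActiveItemIndex` updaters in the keydown handler, with `count` standing in for `activeItems.length`.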
@@ -5,7 +5,6 @@ import { useQuery } from '@tanstack/react-query';
import { Box, Typography } from '@mui/joy';

import { createConversationFromJsonV1 } from '~/modules/trade/trade.client';
import { useHasChatLinkItems } from '~/modules/trade/store-module-trade';

import { Brand } from '~/common/app.config';
import { InlineError } from '~/common/components/InlineError';
@@ -79,14 +78,14 @@ export function AppChatLink(props: { linkId: string }) {
refetchOnWindowFocus: false,
staleTime: 1000 * 60 * 60 * 24, // 24 hours
});
const hasLinkItems = useHasChatLinkItems();
// const hasLinkItems = useHasChatLinkItems();


// pluggable UI

const drawerContent = React.useMemo(() => <AppChatLinkDrawerContent />, []);
const menuItems = React.useMemo(() => <AppChatLinkMenuItems />, []);
usePluggableOptimaLayout(hasLinkItems ? drawerContent : null, null, menuItems, 'AppChatLink');
usePluggableOptimaLayout(drawerContent, null, menuItems, 'AppChatLink');


const pageTitle = (data?.conversation && conversationTitle(data.conversation)) || 'Chat Link';

@@ -1,7 +1,7 @@
import * as React from 'react';
import TimeAgo from 'react-timeago';

import { Box, ListDivider, ListItem, ListItemDecorator, MenuItem, Typography } from '@mui/joy';
import { Box, ListDivider, ListItem, ListItemButton, ListItemDecorator, Typography } from '@mui/joy';
import ArrowBackIcon from '@mui/icons-material/ArrowBack';

import { useChatLinkItems } from '~/modules/trade/store-module-trade';
@@ -28,26 +28,28 @@ export function AppChatLinkDrawerContent() {

return <PageDrawerList>

<MenuItem
onClick={closeDrawerOnMobile}
component={Link} href={ROUTE_INDEX} noLinkStyle
>
<ListItemDecorator><ArrowBackIcon /></ListItemDecorator>
{Brand.Title.Base}
</MenuItem>
{notEmpty && (
<ListItemButton
onClick={closeDrawerOnMobile}
component={Link} href={ROUTE_INDEX} noLinkStyle
>
<ListItemDecorator><ArrowBackIcon /></ListItemDecorator>
{Brand.Title.Base}
</ListItemButton>
)}

{notEmpty && <ListDivider />}

{notEmpty && <ListItem>
<ListItem>
<Typography level='body-sm'>
Links shared by you
{notEmpty ? 'Links shared by you' : 'No prior shared links'}
</Typography>
</ListItem>}
</ListItem>

{notEmpty && <Box sx={{ overflowY: 'auto' }}>
{chatLinkItems.map(item => (

<MenuItem
<ListItemButton
key={'chat-link-' + item.objectId}
component={Link} href={getChatLinkRelativePath(item.objectId)} noLinkStyle
sx={{
@@ -61,7 +63,7 @@ export function AppChatLinkDrawerContent() {
<Typography level='body-xs'>
<TimeAgo date={item.createdAt} />
</Typography>
</MenuItem>
</ListItemButton>

))}
</Box>}

@@ -10,7 +10,7 @@ import { platformAwareKeystrokes } from '~/common/components/KeyStroke';


// update this variable every time you want to broadcast a new version to clients
export const incrementalVersion: number = 11;
export const incrementalVersion: number = 12;

const B = (props: { href?: string, children: React.ReactNode }) => {
const boldText = <Typography color={!!props.href ? 'primary' : 'neutral'} sx={{ fontWeight: 600 }}>{props.children}</Typography>;
@@ -59,10 +59,24 @@ export const newsCallout =
// news and feature surfaces
export const NewsItems: NewsItem[] = [
// still unannounced: phone calls, split windows, ...
{
versionCode: '1.11.0',
versionName: 'Singularity',
versionMoji: '🌌🌠',
versionDate: new Date('2024-01-16T06:30:00Z'),
items: [
{ text: <><B href={RIssues + '/329'}>Search</B> past conversations (@joriskalz) 🔍</>, issue: 329 },
{ text: <>Quick <B href={RIssues + '/327'}>commands pane</B> (open with '/')</>, issue: 327 },
{ text: <><B>Together AI</B> Inference platform support</>, issue: 346 },
{ text: <>Persona creation: <B href={RIssues + '/301'}>history</B></>, issue: 301 },
{ text: <>Persona creation: fix <B href={RIssues + '/328'}>API timeouts</B></>, issue: 328 },
{ text: <>Support up to five <B href={RIssues + '/323'}>OpenAI-compatible</B> endpoints</>, issue: 323 },
],
},
{
versionCode: '1.10.0',
versionName: 'The Year of AGI',
versionMoji: '🎊✨',
// versionMoji: '🎊✨',
versionDate: new Date('2024-01-06T08:00:00Z'),
items: [
{ text: <><B href={RIssues + '/201'}>New UI</B> for desktop and mobile, enabling future expansions</>, issue: 201 },

@@ -3,11 +3,33 @@ import * as React from 'react';
import { Container, ListDivider, Sheet, Typography } from '@mui/joy';

import { themeBgApp } from '~/common/app.theme';
import { usePluggableOptimaLayout } from '~/common/layout/optima/useOptimaLayout';

import { PersonaCreator } from './PersonaCreator';
import { Creator } from './creator/Creator';
import { CreatorDrawer } from './creator/CreatorDrawer';
import { Viewer } from './creator/Viewer';


export function AppPersonas() {

// state
const [selectedSimplePersonaId, setSelectedSimplePersonaId] = React.useState<string | null>(null);


// pluggable UI

const drawerContent = React.useMemo(() => {
return (
<CreatorDrawer
selectedSimplePersonaId={selectedSimplePersonaId}
setSelectedSimplePersonaId={setSelectedSimplePersonaId}
/>
);
}, [selectedSimplePersonaId]);

usePluggableOptimaLayout(drawerContent, null, null, 'AppPersonas');


return (
<Sheet sx={{
flexGrow: 1,
@@ -24,7 +46,9 @@ export function AppPersonas() {

<ListDivider sx={{ my: 2 }} />

<PersonaCreator />
{!!selectedSimplePersonaId && <Viewer selectedSimplePersonaId={selectedSimplePersonaId} />}

<Creator display={!selectedSimplePersonaId} />

</Container>


@@ -1,317 +0,0 @@
import * as React from 'react';

import { Alert, Box, Button, Card, CardContent, CircularProgress, Grid, Input, LinearProgress, Tab, TabList, TabPanel, Tabs, Textarea, Typography } from '@mui/joy';
import ContentCopyIcon from '@mui/icons-material/ContentCopy';
import SettingsAccessibilityIcon from '@mui/icons-material/SettingsAccessibility';
import TextFieldsIcon from '@mui/icons-material/TextFields';
import YouTubeIcon from '@mui/icons-material/YouTube';

import { RenderMarkdown } from '../chat/components/message/RenderMarkdown';

import { GoodModal } from '~/common/components/GoodModal';
import { GoodTooltip } from '~/common/components/GoodTooltip';
import { apiQuery } from '~/common/util/trpc.client';
import { copyToClipboard } from '~/common/util/clipboardUtils';
import { lineHeightTextarea } from '~/common/app.theme';
import { useFormRadioLlmType } from '~/common/components/forms/useFormRadioLlmType';

import { LLMChainStep, useLLMChain } from './useLLMChain';


function extractVideoID(videoURL: string): string | null {
const regExp = /^(?:https?:\/\/)?(?:www\.)?(?:youtube\.com\/(?:watch\?v=|embed\/)|youtu\.be\/)([^#&?]*).*/;
const match = videoURL.match(regExp);
return (match && match[1]?.length == 11) ? match[1] : null;
}
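The `extractVideoID` helper above (from the removed `PersonaCreator.tsx`) accepts `watch?v=`, `embed/`, and `youtu.be/` URL forms, and only returns the captured segment when it is exactly 11 characters long, which is the canonical YouTube video-ID length. Exercised standalone:

```typescript
// Copy of the extractVideoID helper from the diff above, for illustration:
// returns the 11-character YouTube video ID, or null for non-YouTube URLs.
function extractVideoID(videoURL: string): string | null {
  const regExp = /^(?:https?:\/\/)?(?:www\.)?(?:youtube\.com\/(?:watch\?v=|embed\/)|youtu\.be\/)([^#&?]*).*/;
  const match = videoURL.match(regExp);
  return (match && match[1]?.length == 11) ? match[1] : null;
}
```

For example, both `https://www.youtube.com/watch?v=M_wZpSEvOkc` and `https://youtu.be/M_wZpSEvOkc` yield `M_wZpSEvOkc`, while a non-YouTube URL yields `null`.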


function useTranscriptFromVideo(videoID: string | null) {
const { data, isFetching, isError, error } =
apiQuery.ytpersona.getTranscript.useQuery({ videoId: videoID || '' }, {
enabled: !!videoID,
refetchOnWindowFocus: false,
staleTime: Infinity,
});
return {
title: data?.videoTitle ?? null,
thumbnailUrl: data?.thumbnailUrl ?? null,
transcript: data?.transcript?.trim() ?? null,
isFetching,
isError, error,
};
}


const PersonaCreationSteps: LLMChainStep[] = [
{
name: 'Analyzing the transcript / text',
setSystem: 'You are skilled in analyzing and embodying diverse characters. You meticulously study transcripts to capture key attributes, draft comprehensive character sheets, and refine them for authenticity. Feel free to make assumptions without hedging, be concise and be creative.',
addUserInput: true,
addUser: 'Conduct comprehensive research on the provided transcript. Identify key characteristics of the speaker, including age, professional field, distinct personality traits, style of communication, narrative context, and self-awareness. Additionally, consider any unique aspects such as their use of humor, their cultural background, core values, passions, fears, personal history, and social interactions. Your output for this stage is an in-depth written analysis that exhibits an understanding of both the superficial and more profound aspects of the speaker\'s persona.',
},
{
name: 'Defining the character',
addPrevAssistant: true,
addUser: 'Craft your documented analysis into a draft of the \'You are a...\' character sheet. It should encapsulate all crucial personality dimensions, along with the motivations and aspirations of the persona. Keep in mind to balance succinctness and depth of detail for each dimension. The deliverable here is a comprehensive draft of the character sheet that captures the speaker\'s unique essence.',
},
{
name: 'Crossing the t\'s',
addPrevAssistant: true,
addUser: 'Compare the draft character sheet with the original transcript, validating its content and ensuring it captures both the speaker’s overt characteristics and the subtler undertones. Omit unknown information, fine-tune any areas that require clarity, have been overlooked, or require more authenticity. Use clear and illustrative examples from the transcript to refine your sheet and offer meaningful, tangible reference points. Your output is a coherent, comprehensive, and nuanced instruction that begins with \'You are a...\' and serves as a go-to guide for an actor recreating the persona.',
},
// {
// name: 'Shrink',
// addPrevAssistant: true,
// addUser: 'Now remove all the uncertain information, omit unknown information, Your output is a coherent, comprehensive, and nuanced instruction that begins with \'You are a...\' and serves as a go-to guide for a recreating the persona.',
// },
];


export function PersonaCreator() {
// state
const [selectedTab, setSelectedTab] = React.useState(0);
const [inputText, setInputText] = React.useState<string | null>(null);
const [videoURL, setVideoURL] = React.useState('');
const [videoID, setVideoID] = React.useState('');
const [personaText, setPersonaText] = React.useState('');

// external state
const [personaLlm, llmComponent] = useFormRadioLlmType('Persona Creation Model');


// chain to convert a text input string (e.g. youtube transcript) into a persona prompt
const savePersona = React.useCallback((personaPrompt: string) => {
// TODO.. save the persona prompt here
}, []);

const { isFinished, isTransforming, chainProgress, chainIntermediates, chainStepName, chainOutput, chainError, abortChain } =
useLLMChain(PersonaCreationSteps, personaLlm?.id, inputText ?? undefined, savePersona);


// fetch transcript when the Video ID is ready, then store it
const { transcript, thumbnailUrl, title, isFetching, isError, error: transcriptError } =
useTranscriptFromVideo(videoID);
React.useEffect(() => setInputText(transcript), [transcript]);


// Reset the relevant state when the selected tab changes
React.useEffect(() => {
// reset state
setVideoURL('');
setVideoID('');
setInputText(null);
setPersonaText('');
}, [selectedTab]);


// [Tab: 0] Video download

const handleVideoIdChange = (e: React.ChangeEvent<HTMLInputElement>) => setVideoURL(e.target.value);

const handleFetchTranscript = (e: React.FormEvent<HTMLFormElement>) => {
e.preventDefault(); // stop the form submit
const videoId = extractVideoID(videoURL);
if (!videoId) {
setVideoURL('Invalid');
} else {
setInputText(null);
setVideoID(videoId);
}
};


// [Tab: 1] Text input

const handlePersonaTextChange = (e: React.ChangeEvent<HTMLTextAreaElement>) => setPersonaText(e.target.value);

return <>

<Typography level='title-sm' mb={3}>
Create the <em>System Prompt</em> of an AI Persona from YouTube or Text.
</Typography>

<Tabs defaultValue={0} variant='outlined'
value={selectedTab}
onChange={(_event, newValue) => setSelectedTab(newValue as number)}>
<TabList sx={{ minHeight: 48 }}>
<Tab>From YouTube Video</Tab>
<Tab>From Text</Tab>
</TabList>

{/* YouTube URL inputs */}
<TabPanel value={0} sx={{ p: 3 }}>

<Typography level='title-md' startDecorator={<YouTubeIcon sx={{ color: '#f00' }} />} sx={{ mb: 3 }}>
YouTube -> Persona
</Typography>

<form onSubmit={handleFetchTranscript}>
<Input
required
type='url'
fullWidth
variant='outlined'
placeholder='YouTube Video URL'
value={videoURL}
onChange={handleVideoIdChange}
sx={{ mb: 1.5 }}
/>
<Box sx={{ display: 'flex', justifyContent: 'space-between', alignItems: 'center' }}>
<Button type='submit' variant='solid' disabled={isFetching || isTransforming || !videoURL} loading={isFetching} sx={{ minWidth: 140 }}>
Create
</Button>
<GoodTooltip title='This example comes from the popular Fireship YouTube channel, which presents technical topics with irreverent humor.'>
<Button variant='outlined' color='neutral' onClick={() => setVideoURL('https://www.youtube.com/watch?v=M_wZpSEvOkc')}>
Example
</Button>
</GoodTooltip>
</Box>
</form>
</TabPanel>

{/* Text area for users to paste copied text */}
<TabPanel value={1} sx={{ p: 3 }}>

<Typography level='title-md' startDecorator={<TextFieldsIcon />} sx={{ mb: 3 }}>
<b>Text</b> -> Persona
</Typography>

<Textarea
variant='outlined'
minRows={4} maxRows={8}
placeholder='Paste your text here...'
value={personaText}
onChange={handlePersonaTextChange}
sx={{
backgroundColor: 'background.level1',
'&:focus-within': {
backgroundColor: 'background.popup',
},
lineHeight: lineHeightTextarea,
mb: 1.5,
}}
/>
<Box sx={{ display: 'flex', justifyContent: 'space-between', alignItems: 'center' }}>
<Button variant='solid' disabled={isFetching || isTransforming || !personaText} onClick={() => setInputText(personaText)} sx={{ minWidth: 140 }}>
Create
</Button>
{!!personaText?.length && <Typography level='body-sm'>{personaText.length.toLocaleString()}</Typography>}
</Box>
</TabPanel>
</Tabs>

{/* LLM selector (chat vs fast) */}
{!isTransforming && !isFinished && <Box sx={{ mt: 3 }}>{llmComponent}</Box>}

{/* Errors */}
{isError && (
<Alert color='warning' sx={{ mt: 1 }}>
<Typography component='div'>{transcriptError?.message || 'Unknown error'}</Typography>
</Alert>
)}
{!!chainError && (
<Alert color='warning' sx={{ mt: 1 }}>
<Typography component='div'>{chainError}</Typography>
</Alert>
)}

{/* Persona! */}
{chainOutput && <>
<Card sx={{ boxShadow: 'md', mt: 3 }}>
<Box sx={{ display: 'flex', justifyContent: 'space-between', alignItems: 'center' }}>
<Typography level='title-lg' color='success' startDecorator={<SettingsAccessibilityIcon color='success' />}>
Persona Prompt
</Typography>
<GoodTooltip title='Copy system prompt'>
<Button color='success' onClick={() => copyToClipboard(chainOutput, 'Persona prompt')} endDecorator={<ContentCopyIcon />} sx={{ minWidth: 120 }}>
Copy
</Button>
</GoodTooltip>
</Box>
<CardContent>
<Alert variant='soft' color='success' sx={{ mb: 1 }}>
You may now copy the text below and use it as Custom prompt!
</Alert>
<RenderMarkdown textBlock={{ type: 'text', content: chainOutput }} />
</CardContent>
</Card>
</>}

{/* Input: Transcript/Text */}
{inputText && <>
<Typography level='title-lg' sx={{ mt: 3, mb: 0.5 }}>
Input Data
</Typography>

<Card>
<CardContent>
<Typography level='title-md' sx={{ mb: 1 }}>
{title || 'Transcript / Text'}
</Typography>
<Box>
{!!thumbnailUrl && <picture><img src={thumbnailUrl} alt='YouTube Video Thumbnail' height={80} style={{ float: 'left', marginRight: 8 }} /></picture>}
<Typography level='body-sm'>
{inputText.slice(0, 280)}...
</Typography>
</Box>
</CardContent>
</Card>
</>}

{/* Intermediate outputs rendered as cards in a grid */}
{chainIntermediates && chainIntermediates.length > 0 && <>
<Typography level='title-lg' sx={{ mt: 3, mb: 0.5 }}>
{isTransforming ? 'Working...' : 'Intermediate Work'}
</Typography>

<Grid container spacing={2}>
{chainIntermediates.map((intermediate, i) =>
<Grid xs={12} sm={6} md={4} key={i}>
<Card sx={{ height: '100%' }}>
<CardContent>
<Typography level='title-sm' sx={{ mb: 1 }}>
{i + 1}. {PersonaCreationSteps[i].name}
</Typography>
<Typography level='body-sm'>
{intermediate?.slice(0, 140)}...
</Typography>
</CardContent>
</Card>
</Grid>,
)}
</Grid>
</>}


{/* Dialog: Embodiment Progress */}
{isTransforming && <GoodModal open>
<Box sx={{ display: 'flex', flexDirection: 'column', alignItems: 'center', my: 2 }}>
<CircularProgress color='primary' value={Math.max(10, 100 * chainProgress)} />
</Box>
<Box>
<Typography color='success' level='title-lg'>
Embodying Persona ...
</Typography>
<Typography level='title-sm' sx={{ mt: 1 }}>
Using: {personaLlm?.label}
</Typography>
</Box>
<Box>
<Typography color='success' level='title-sm' sx={{ fontWeight: 600 }}>
{chainStepName}
</Typography>
<LinearProgress color='success' determinate value={Math.max(10, 100 * chainProgress)} sx={{ mt: 1.5 }} />
</Box>
<Typography level='title-sm'>
This may take 1-2 minutes. Do not close this window or the progress will be lost.
While larger models will produce higher quality prompts,
if you experience any errors (e.g. LLM timeouts, or context overflows for larger videos)
please try again with faster/smaller models.
</Typography>
<Button variant='soft' color='neutral' onClick={abortChain} sx={{ ml: 'auto', minWidth: 100, mt: 3 }}>
Cancel
</Button>
</GoodModal>}

</>;
}
@@ -0,0 +1,298 @@
import * as React from 'react';

import { Alert, Box, Button, Card, CardContent, CircularProgress, Divider, FormLabel, Grid, IconButton, LinearProgress, Tab, TabList, TabPanel, Tabs, Typography } from '@mui/joy';
import AddIcon from '@mui/icons-material/Add';
import ContentCopyIcon from '@mui/icons-material/ContentCopy';
import SettingsAccessibilityIcon from '@mui/icons-material/SettingsAccessibility';

import { RenderMarkdown } from '../../chat/components/message/RenderMarkdown';

import { LLMChainStep, useLLMChain } from '~/modules/aifn/useLLMChain';

import { GoodTooltip } from '~/common/components/GoodTooltip';
import { copyToClipboard } from '~/common/util/clipboardUtils';
import { useFormEditTextArray } from '~/common/components/forms/useFormEditTextArray';
import { useLLMSelect } from '~/common/components/forms/useLLMSelect';
import { useToggleableBoolean } from '~/common/util/useToggleableBoolean';

import { FromText } from './FromText';
import { FromYouTube } from './FromYouTube';
import { prependSimplePersona, SimplePersonaProvenance } from '../store-app-personas';


// delay to start a new chain after the previous one finishes
const CONTINUE_DELAY: number | false = false;


const Prompts: string[] = [
  'You are skilled in analyzing and embodying diverse characters. You meticulously study transcripts to capture key attributes, draft comprehensive character sheets, and refine them for authenticity. Feel free to make assumptions without hedging, be concise and be creative.',
  'Conduct comprehensive research on the provided transcript. Identify key characteristics of the speaker, including age, professional field, distinct personality traits, style of communication, narrative context, and self-awareness. Additionally, consider any unique aspects such as their use of humor, their cultural background, core values, passions, fears, personal history, and social interactions. Your output for this stage is an in-depth written analysis that exhibits an understanding of both the superficial and more profound aspects of the speaker\'s persona.',
  'Craft your documented analysis into a draft of the \'You are a...\' character sheet. It should encapsulate all crucial personality dimensions, along with the motivations and aspirations of the persona. Keep in mind to balance succinctness and depth of detail for each dimension. The deliverable here is a comprehensive draft of the character sheet that captures the speaker\'s unique essence.',
  'Compare the draft character sheet with the original transcript, validating its content and ensuring it captures both the speaker’s overt characteristics and the subtler undertones. Omit unknown information, fine-tune any areas that require clarity, have been overlooked, or require more authenticity. Use clear and illustrative examples from the transcript to refine your sheet and offer meaningful, tangible reference points. Your output is a coherent, comprehensive, and nuanced instruction that begins with \'You are a...\' and serves as a go-to guide for an actor recreating the persona.',
];

const PromptTitles: string[] = [
  'Common: Creator System Prompt',
  'Analyze the transcript',
  'Define the character',
  'Cross the t\'s',
];

// chain to convert a text input string (e.g. youtube transcript) into a persona prompt
function createChain(instructions: string[], titles: string[]): LLMChainStep[] {
  return [
    {
      name: titles[1],
      setSystem: instructions[0],
      addUserInput: true,
      addUser: instructions[1],
    },
    {
      name: titles[2],
      addPrevAssistant: true,
      addUser: instructions[2],
    },
    {
      name: titles[3],
      addPrevAssistant: true,
      addUser: instructions[3],
    },
  ];
}


export const PersonaPromptCard = (props: { content: string }) =>
  <Card sx={{ boxShadow: 'md', mt: 3 }}>

    <Box sx={{ display: 'flex', justifyContent: 'space-between', alignItems: 'center' }}>
      <Typography level='title-lg' color='success' startDecorator={<SettingsAccessibilityIcon color='success' />}>
        Persona Prompt
      </Typography>
      <GoodTooltip title='Copy system prompt'>
        <Button color='success' onClick={() => copyToClipboard(props.content, 'Persona prompt')} endDecorator={<ContentCopyIcon />} sx={{ minWidth: 120 }}>
          Copy
        </Button>
      </GoodTooltip>
    </Box>

    <CardContent>
      <Alert variant='soft' color='success' sx={{ mb: 1 }}>
        You may now copy the text below and use it as Custom prompt!
      </Alert>
      <RenderMarkdown textBlock={{ type: 'text', content: props.content }} />
    </CardContent>
  </Card>;


export function Creator(props: { display: boolean }) {

  // state
  const advanced = useToggleableBoolean();
  const [selectedTab, setSelectedTab] = React.useState(0);
  const [chainInputText, setChainInputText] = React.useState<string | null>(null);
  const [inputProvenance, setInputProvenance] = React.useState<SimplePersonaProvenance | null>(null);
  const [showIntermediates, setShowIntermediates] = React.useState(false);

  // external state
  const [personaLlm, llmComponent] = useLLMSelect(true, 'Persona Creation Model');


  // editable prompts
  const {
    strings: editedInstructions, stringEditors: instructionEditors,
  } = useFormEditTextArray(Prompts, PromptTitles);

  const creationChainSteps = React.useMemo(() => {
    return createChain(editedInstructions, PromptTitles);
  }, [editedInstructions]);

  const llmLabel = personaLlm?.label || undefined;
  const savePersona = React.useCallback((personaPrompt: string, inputText: string) => {
    prependSimplePersona(personaPrompt, inputText, inputProvenance ?? undefined, llmLabel);
  }, [inputProvenance, llmLabel]);

  const {
    // isFinished,
    isTransforming,
    chainProgress,
    chainIntermediates,
    chainStepName,
    chainStepInterimChars,
    chainOutput,
    chainError,
    userCancelChain,
    restartChain,
  } = useLLMChain(creationChainSteps, personaLlm?.id, chainInputText ?? undefined, savePersona);


  // Reset the relevant state when the selected tab changes
  React.useEffect(() => {
    setChainInputText(null);
  }, [selectedTab]);


  // [debug] Restart the chain when complete after a delay
  const debugRestart = !!CONTINUE_DELAY && !isTransforming && (chainProgress === 1 || !!chainError);
  React.useEffect(() => {
    if (debugRestart) {
      const timeout = setTimeout(restartChain, CONTINUE_DELAY);
      return () => clearTimeout(timeout);
    }
  }, [debugRestart, restartChain]);


  const handleCreate = React.useCallback((text: string, provenance: SimplePersonaProvenance) => {
    setChainInputText(text);
    setInputProvenance(provenance);
  }, []);

  const handleCancel = React.useCallback(() => {
    setChainInputText(null);
    setInputProvenance(null);
    userCancelChain();
  }, [userCancelChain]);


  // Hide the GFX, but not the logic (hooks)
  if (!props.display)
    return null;

  return <>

    <Typography level='title-sm' mb={3}>
      Create the <em>System Prompt</em> of an AI Persona from YouTube or Text.
    </Typography>


    {/* Inputs */}
    <Tabs
      variant='outlined'
      defaultValue={0}
      value={selectedTab}
      onChange={(_event, newValue) => setSelectedTab(newValue as number)}
      sx={{
        // boxShadow: 'sm',
        borderRadius: 'md',
        // overflow: 'hidden',
        display: isTransforming ? 'none' : undefined,
      }}
    >
      <TabList sx={{ minHeight: '3rem' }}>
        <Tab>From YouTube Video</Tab>
        <Tab>From Text</Tab>
      </TabList>
      <TabPanel keepMounted value={0} sx={{ p: 3 }}>
        <FromYouTube isTransforming={isTransforming} onCreate={handleCreate} />
      </TabPanel>
      <TabPanel keepMounted value={1} sx={{ p: 3 }}>
        <FromText isCreating={isTransforming} onCreate={handleCreate} />
      </TabPanel>

      <Divider orientation='horizontal' />

      <Box sx={{ p: 3, display: 'flex', flexDirection: 'column', gap: 2 }}>
        {llmComponent}

        {advanced.on && (
          <Box sx={{ my: 1, display: 'flex', flexDirection: 'column', gap: 2 }}>
            {instructionEditors}
          </Box>
        )}

        <FormLabel onClick={advanced.toggle} sx={{ textDecoration: 'underline', cursor: 'pointer' }}>
          {advanced.on ? 'Hide Advanced' : 'Advanced: Prompts'}
        </FormLabel>
      </Box>
    </Tabs>


    {/* Embodiment Progress */}
    {/* <GoodModal open> */}
    {isTransforming && <Card><CardContent sx={{ display: 'flex', flexDirection: 'column', gap: 3 }}>
      <Box sx={{ display: 'flex', flexDirection: 'column', alignItems: 'center', my: 2 }}>
        <CircularProgress color='primary' value={Math.max(10, 100 * chainProgress)} />
      </Box>
      <Box>
        <Typography color='success' level='title-lg'>
          Embodying Persona ...
        </Typography>
        <Typography level='title-sm' sx={{ mt: 1 }}>
          Using: {personaLlm?.label}
        </Typography>
      </Box>
      <Box>
        <Typography color='success' level='title-sm' sx={{ fontWeight: 600 }}>
          {chainStepName}
        </Typography>
        <LinearProgress color='success' determinate value={Math.max(10, 100 * chainProgress)} sx={{ mt: 1.5 }} />
        <Typography level='body-sm' sx={{ mt: 1 }}>
          {chainStepInterimChars === null ? 'Loading ...' : `Generating (${chainStepInterimChars.toLocaleString()} bytes) ...`}
        </Typography>
      </Box>
      <Typography level='title-sm'>
        This may take 1-2 minutes.
        While larger models will produce higher quality prompts,
        if you experience any errors (e.g. LLM timeouts, or context overflows for larger videos)
        please try again with faster/smaller models.
      </Typography>
      <Button variant='soft' color='neutral' onClick={handleCancel} sx={{ ml: 'auto', minWidth: 100, mt: 3 }}>
        Cancel
      </Button>
    </CardContent></Card>}


    {/* Errors */}
    {!!chainError && (
      <Alert color='warning' sx={{ mt: 1 }}>
        <Typography component='div'>{chainError}</Typography>
      </Alert>
    )}

    {/* The Persona (Output) */}
    {chainOutput && <>
      <PersonaPromptCard content={chainOutput} />
    </>}


    {/* Input + Intermediate outputs (with expander) */}
    {(isTransforming || chainIntermediates?.length > 0) && <>
      <Box sx={{ display: 'flex', justifyContent: 'space-between', alignItems: 'flex-end', mt: 3, mb: 0.5, mx: 1 }}>
        <Typography level='title-lg'>
          {isTransforming ? 'Working ...' : 'Intermediate Work'}
        </Typography>
        <IconButton size='sm' variant={showIntermediates ? 'solid' : 'outlined'} onClick={() => setShowIntermediates(s => !s)}>
          <AddIcon />
        </IconButton>
      </Box>
      <Grid container spacing={2}>
        <Grid xs={12} md={showIntermediates ? 12 : 6}>
          <Card sx={{ height: '100%', overflow: 'hidden' }}>
            <CardContent>
              <Typography color='success' level='title-sm' sx={{ mb: 1 }}>
                Input Text
              </Typography>
              <Typography level='body-sm'>
                {showIntermediates ? chainInputText : (chainInputText?.slice(0, 280) + '...')}
              </Typography>
            </CardContent>
          </Card>
        </Grid>
        {chainIntermediates.map((intermediate, i) =>
          <Grid xs={12} md={showIntermediates ? 12 : 6} key={i}>
            <Card sx={{ height: '100%', overflow: 'hidden' }}>
              <CardContent>
                <Typography color='success' level='title-sm' sx={{ mb: 1 }}>
                  {i + 1}. {intermediate.name}
                </Typography>
                <Typography level='body-sm'>
                  {showIntermediates ? intermediate.output : (intermediate.output?.slice(0, 280) + '...')}
                </Typography>
              </CardContent>
            </Card>
          </Grid>,
        )}
      </Grid>
    </>}

  </>;
}
@@ -0,0 +1,174 @@
import * as React from 'react';

import { Box, Button, IconButton, ListItemButton, ListItemDecorator, Sheet, Tooltip, Typography } from '@mui/joy';
import CheckBoxIcon from '@mui/icons-material/CheckBox';
import CheckBoxOutlineBlankIcon from '@mui/icons-material/CheckBoxOutlineBlank';
import DeleteOutlineIcon from '@mui/icons-material/DeleteOutline';
import Diversity2Icon from '@mui/icons-material/Diversity2';
import DoneIcon from '@mui/icons-material/Done';

import { PageDrawerHeader } from '~/common/layout/optima/components/PageDrawerHeader';
import { PageDrawerList } from '~/common/layout/optima/components/PageDrawerList';
import { useOptimaDrawers } from '~/common/layout/optima/useOptimaDrawers';

import { CreatorDrawerItem } from './CreatorDrawerItem';
import { deleteSimplePersona, useSimplePersonas } from '../store-app-personas';


export function CreatorDrawer(props: {
  selectedSimplePersonaId: string | null,
  setSelectedSimplePersonaId: (simplePersonaId: string | null) => void,
}) {

  // selection mode
  const [selectMode, setSelectMode] = React.useState(false);
  const [selectedIds, setSelectedIds] = React.useState<Set<string>>(new Set());

  // external state
  const { closeDrawer } = useOptimaDrawers();
  const { simplePersonas } = useSimplePersonas();


  // derived state
  const hasPersonas = simplePersonas.length > 0;


  // Simple Persona Operations

  const { setSelectedSimplePersonaId } = props;

  const handleSimplePersonaUnselect = React.useCallback(() => {
    setSelectedSimplePersonaId(null);
  }, [setSelectedSimplePersonaId]);

  const handleSimplePersonaDelete = React.useCallback((simplePersonaId: string) => {
    deleteSimplePersona(simplePersonaId);
    handleSimplePersonaUnselect();
  }, [handleSimplePersonaUnselect]);


  // Selection

  const handleSelectionClose = React.useCallback(() => {
    setSelectMode(false);
    setSelectedIds(new Set());
  }, []);

  const handleSelectionToggleId = React.useCallback((simplePersonaId: string) => {
    setSelectedIds(prevSelectedIds => {
      const newSelectedItems = new Set(prevSelectedIds);
      if (newSelectedItems.has(simplePersonaId))
        newSelectedItems.delete(simplePersonaId);
      else
        newSelectedItems.add(simplePersonaId);
      return newSelectedItems;
    });
  }, []);

  const handleSelectionInvert = React.useCallback(() => {
    setSelectedIds(prevSelectedIds => {
      const newSelectedIds = new Set(prevSelectedIds);
      simplePersonas.forEach(persona => {
        if (newSelectedIds.has(persona.id))
          newSelectedIds.delete(persona.id);
        else
          newSelectedIds.add(persona.id);
      });
      return newSelectedIds;
    });
  }, [simplePersonas]);

  const handleSelectionDelete = React.useCallback(() => {
    selectedIds.forEach(simplePersonaId => {
      deleteSimplePersona(simplePersonaId);
    });
    // clear the selection after deletion
    setSelectedIds(new Set());
  }, [selectedIds]);


  return <>

    {/* Drawer Header */}
    <PageDrawerHeader
      title={selectMode ? 'Selection Mode' : 'Recent'}
      onClose={selectMode ? handleSelectionClose : closeDrawer}
      startButton={(!hasPersonas || selectMode) ? undefined :
        <Tooltip title={selectMode ? 'Done' : 'Select'}>
          <IconButton onClick={selectMode ? handleSelectionClose : () => setSelectMode(true)}>
            {selectMode ? <DoneIcon /> : <CheckBoxOutlineBlankIcon />}
          </IconButton>
        </Tooltip>
      }
    />

    <PageDrawerList
      variant='plain'
      noTopPadding noBottomPadding tallRows
      onClick={handleSimplePersonaUnselect}
    >

      {selectMode ? (
        // Selection Header
        <Sheet variant='soft' color='warning' invertedColors>
          <Box sx={{ display: 'flex', alignItems: 'center', px: 1, minHeight: '3rem' }}>
            <Button
              variant='plain'
              color='warning'
              startDecorator={selectedIds.size === simplePersonas.length ? <CheckBoxOutlineBlankIcon /> : <CheckBoxIcon />}
              onClick={handleSelectionInvert}
            >
              {selectedIds.size === simplePersonas.length ? 'Select None' : selectedIds.size !== 0 ? 'Invert' : 'Select All'}
            </Button>
            <Button
              variant='solid'
              color='warning'
              startDecorator={<DeleteOutlineIcon />}
              onClick={handleSelectionDelete}
              disabled={selectedIds.size === 0}
              sx={{ ml: 'auto' }}
            >
              Delete
            </Button>
          </Box>
        </Sheet>
      ) : (
        // Create Button
        <ListItemButton
          variant={props.selectedSimplePersonaId ? 'plain' : 'soft'}
          onClick={handleSimplePersonaUnselect}
        >
          <ListItemDecorator>
            <Diversity2Icon />
          </ListItemDecorator>
          <Typography level='title-sm' sx={!props.selectedSimplePersonaId ? { fontWeight: 600 } : undefined}>
            Create
          </Typography>
        </ListItemButton>
      )}

      {/* Personas [] */}
      <Box sx={{ flex: 1, overflowY: 'auto' }}>
        {simplePersonas.map(item =>
          <CreatorDrawerItem
            key={item.id}
            item={item}
            isActive={item.id === props.selectedSimplePersonaId}
            isSelected={selectedIds.has(item.id)}
            isSelection={selectMode}
            onClick={(event) => {
              event.stopPropagation();
              if (selectMode)
                handleSelectionToggleId(item.id);
              else
                props.setSelectedSimplePersonaId(item.id);
            }}
            onDelete={handleSimplePersonaDelete}
          />,
        )}
      </Box>

    </PageDrawerList>

  </>;
}
@@ -0,0 +1,100 @@
import * as React from 'react';
import TimeAgo from 'react-timeago';

import { Box, Checkbox, IconButton, ListItemButton, ListItemDecorator, Typography } from '@mui/joy';
import CloseIcon from '@mui/icons-material/Close';
import DeleteOutlineIcon from '@mui/icons-material/DeleteOutline';
import TextFieldsIcon from '@mui/icons-material/TextFields';
import YouTubeIcon from '@mui/icons-material/YouTube';

import type { SimplePersona } from '../store-app-personas';


export function CreatorDrawerItem(props: {
  item: SimplePersona,
  isActive: boolean,
  isSelected: boolean,
  isSelection: boolean,
  onClick: (event: React.MouseEvent) => void,
  onDelete: (simplePersonaId: string) => void,
}) {

  // state
  const [deleteArmed, setDeleteArmed] = React.useState(false);


  // derived

  const { item, isActive } = props;

  const thumbnailUrl = item.pictureUrl || ((item.inputProvenance?.type === 'youtube' && item.inputProvenance.thumbnailUrl) ? item.inputProvenance.thumbnailUrl : undefined);

  const icon = thumbnailUrl
    ? <picture style={{ lineHeight: 0 }}><img src={thumbnailUrl} alt='Simple Persona Thumbnail' width={20} height={20} /></picture>
    : item.inputProvenance?.type === 'text'
      ? <TextFieldsIcon />
      : item.inputProvenance?.type === 'youtube'
        ? <YouTubeIcon />
        : undefined;


  return (
    <ListItemButton
      variant={isActive ? 'soft' : undefined}
      onClick={props.onClick}
      sx={{
        '&:hover > button': { opacity: 1 },
      }}
    >
      {/* Symbol or Thumbnail picture */}
      <ListItemDecorator>
        {props.isSelection ? (
          <Checkbox checked={props.isSelected} />
        ) : icon}
      </ListItemDecorator>

      <Box sx={{ overflow: 'hidden' }}>

        {/* Title or System prompt (ellipsized) */}
        <Typography level='title-sm' sx={{ overflow: 'hidden', whiteSpace: 'nowrap', textOverflow: 'ellipsis' }}>
          {item.name || (item.systemPrompt?.slice(0, 40) + '...')}
        </Typography>

        {/* creation Model */}
        {/*{!!item.llmLabel && <Typography level='body-xs' sx={{ overflow: 'hidden', whiteSpace: 'nowrap', textOverflow: 'ellipsis' }}>*/}
        {/*  {item.llmLabel}*/}
        {/*</Typography>}*/}

        {/* creation Date */}
        <Typography level='body-xs'>
          {!!item.creationDate && <TimeAgo date={item.creationDate} />}
        </Typography>

      </Box>


      {/* Delete Arming */}
      {!props.isSelection && !deleteArmed && (
        <IconButton
          variant={isActive ? 'solid' : 'outlined'}
          size='sm'
          sx={{ opacity: { xs: 1, sm: 0 }, transition: 'opacity 0.2s' }}
          onClick={() => setDeleteArmed(on => !on)}
        >
          <DeleteOutlineIcon />
        </IconButton>
      )}

      {/* Delete / Cancel buttons */}
      {!props.isSelection && deleteArmed && <>
        <IconButton size='sm' variant='solid' color='danger' onClick={() => props.onDelete(item.id)}>
          <DeleteOutlineIcon />
        </IconButton>
        <IconButton size='sm' variant='solid' color='neutral' onClick={() => setDeleteArmed(false)}>
          <CloseIcon />
        </IconButton>
      </>}

    </ListItemButton>
  );
}
@@ -0,0 +1,68 @@
import * as React from 'react';

import { Box, Button, Textarea, Typography } from '@mui/joy';
import TextFieldsIcon from '@mui/icons-material/TextFields';

import { lineHeightTextarea } from '~/common/app.theme';

import type { SimplePersonaProvenance } from '../store-app-personas';


// minimum number of characters required to create from text
const MIN_CHARS = 100;


export function FromText(props: {
  isCreating: boolean;
  onCreate: (text: string, provenance: SimplePersonaProvenance) => void;
}) {

  // state
  const [text, setText] = React.useState('');

  const handleCreateFromText = (e: React.FormEvent<HTMLFormElement>) => {
    e.preventDefault(); // stop the form submit
    props.onCreate(text, { type: 'text' });
  };

  return <>

    <Typography level='title-md' startDecorator={<TextFieldsIcon />} sx={{ mb: 3 }}>
      <b>Text</b> -> Persona
    </Typography>

    <form onSubmit={handleCreateFromText}>
      <Textarea
        required
        variant='outlined'
        minRows={4} maxRows={8}
        placeholder='Paste your text here...'
        value={text}
        onChange={event => setText(event.target.value)}
        sx={{
          backgroundColor: 'background.level1',
          '&:focus-within': {
            backgroundColor: 'background.popup',
          },
          lineHeight: lineHeightTextarea,
          mb: 1.5,
        }}
      />

      <Box sx={{ display: 'flex', justifyContent: 'space-between', alignItems: 'center' }}>
        <Button
          type='submit' variant='solid'
          disabled={props.isCreating || text?.length < MIN_CHARS}
          sx={{ minWidth: 140 }}
        >
          Create
        </Button>

        <Typography level='body-sm'>
          {text.length < MIN_CHARS ? `(${MIN_CHARS - text.length})` : text.length.toLocaleString()}
        </Typography>
      </Box>
    </form>

  </>;
}
@@ -0,0 +1,165 @@
import * as React from 'react';

import type { SxProps } from '@mui/joy/styles/types';
import { Box, Button, Card, IconButton, Input, Typography } from '@mui/joy';
import CloseIcon from '@mui/icons-material/Close';
import YouTubeIcon from '@mui/icons-material/YouTube';

import { useYouTubeTranscript, YTVideoTranscript } from '~/modules/youtube/useYouTubeTranscript';

import { GoodTooltip } from '~/common/components/GoodTooltip';
import { InlineError } from '~/common/components/InlineError';

import type { SimplePersonaProvenance } from '../store-app-personas';


function extractVideoID(videoURL: string): string | null {
  const regExp = /^(?:https?:\/\/)?(?:www\.)?(?:youtube\.com\/(?:watch\?v=|embed\/)|youtu\.be\/)([^#&?]*).*/;
  const match = videoURL.match(regExp);
  return (match && match[1]?.length == 11) ? match[1] : null;
}


function YouTubeVideoTranscriptCard(props: { transcript: YTVideoTranscript, onClose: () => void, sx?: SxProps }) {
  const { transcript } = props;
  return (
    <Card
      variant='soft'
      sx={{
        border: '1px dashed',
        borderColor: 'neutral.solidBg',
        p: 1,
        ...props.sx,
      }}
    >
      <Box sx={{ position: 'relative' }}>
        {!!transcript.thumbnailUrl && (
          <picture style={{ lineHeight: 0 }}>
            <img
              src={transcript.thumbnailUrl}
              alt='YouTube Video Thumbnail'
              height={80}
              style={{ float: 'left', marginRight: 8 }}
            />
          </picture>
        )}

        {/*<Box sx={{ display: 'flex', flexDirection: 'column', gap: 0.5 }}>*/}
        <Typography level='title-sm'>
          {transcript?.title}
        </Typography>
        <Typography level='body-xs' sx={{ mt: 0.75 }}>
          {transcript?.transcript.slice(0, 280)}...
        </Typography>
        {/*</Box>*/}

        <IconButton
          size='sm'
          onClick={props.onClose}
          sx={{
            position: 'absolute', top: -8, right: -8,
            borderRadius: 'md',
          }}>
          <CloseIcon />
        </IconButton>
      </Box>
    </Card>
  );
}


export function FromYouTube(props: {
  isTransforming: boolean;
  onCreate: (text: string, provenance: SimplePersonaProvenance) => void;
}) {

  // state
  const [videoURL, setVideoURL] = React.useState('');
  const [videoID, setVideoID] = React.useState<string | null>(null);

  // external state

  const { onCreate } = props;
  const onNewTranscript = React.useCallback((transcript: YTVideoTranscript) => {
    // setVideoID(null); // reset the video ID, to cycle the refetch
    onCreate(
      transcript.transcript,
      {
        type: 'youtube',
        url: videoURL,
        title: transcript.title,
        thumbnailUrl: transcript.thumbnailUrl,
      },
    );
  }, [onCreate, videoURL]);

  const {
    transcript,
    isFetching, isError, error,
  } = useYouTubeTranscript(videoID, onNewTranscript);


  const handleVideoURLChange = (e: React.ChangeEvent<HTMLInputElement>) => {
    setVideoID(null);
    setVideoURL(e.target.value);
  };

  const handleCreateFromTranscript = (e: React.FormEvent<HTMLFormElement>) => {
    e.preventDefault(); // stop the form submit

    const videoId = extractVideoID(videoURL) || null;
    if (!videoId)
      setVideoURL('Invalid');

    // kick-start the transcript fetch
    setVideoID(videoId);
  };


  return <>

    <Typography level='title-md' startDecorator={<YouTubeIcon sx={{ color: '#f00' }} />} sx={{ mb: 3 }}>
      YouTube -> Persona
    </Typography>

    <form onSubmit={handleCreateFromTranscript}>
      <Input
        required
        type='url'
        fullWidth
        disabled={isFetching || props.isTransforming}
        variant='outlined'
        placeholder='YouTube Video URL'
        value={videoURL}
        onChange={handleVideoURLChange}
        sx={{ mb: 1.5 }}
      />

      <Box sx={{ display: 'flex', justifyContent: 'space-between', alignItems: 'center' }}>
        <Button
          type='submit' variant='solid'
          disabled={isFetching || props.isTransforming || !videoURL}
          loading={isFetching}
          sx={{ minWidth: 140 }}
        >
          Create
        </Button>

        <GoodTooltip title='This example comes from the popular Fireship YouTube channel, which presents technical topics with irreverent humor.'>
          <Button variant='outlined' color='neutral' onClick={() => setVideoURL('https://www.youtube.com/watch?v=M_wZpSEvOkc')}>
            Example
          </Button>
        </GoodTooltip>
      </Box>
    </form>

    {isError && (
      <InlineError error={error} sx={{ mt: 3 }} />
    )}

    {!!transcript && !!videoID && (
      <YouTubeVideoTranscriptCard transcript={transcript} onClose={() => setVideoID(null)} sx={{ mt: 3 }} />
    )}

  </>;
}
@@ -0,0 +1,36 @@
import * as React from 'react';
import TimeAgo from 'react-timeago';

import { Typography } from '@mui/joy';

import { Link } from '~/common/components/Link';

import { PersonaPromptCard } from './Creator';
import { useSimplePersona } from '../store-app-personas';


export function Viewer(props: { selectedSimplePersonaId: string }) {

  // external state
  const { simplePersona } = useSimplePersona(props.selectedSimplePersonaId);

  if (!simplePersona)
    return <Typography level='body-sm'>Loading Persona...</Typography>;

  return <>

    <Typography level='title-sm'>
      This <em>System Prompt</em> was created <TimeAgo date={simplePersona.creationDate} />
      using the <strong>{simplePersona.llmLabel}</strong> model.
    </Typography>

    <PersonaPromptCard content={simplePersona.systemPrompt || ''} />

    {/* tell about the Provenances */}
    <Typography level='body-sm' sx={{ mt: 3 }}>
      {simplePersona.inputProvenance?.type === 'youtube' && <>The source was this YouTube video: <Link href={simplePersona.inputProvenance.url} target='_blank'>{simplePersona.inputProvenance.title}</Link>.</>}
      {simplePersona.inputProvenance?.type === 'text' && <>The source was a text snippet of {simplePersona.inputText?.length.toLocaleString()} characters.</>}
    </Typography>

  </>;
}
@@ -0,0 +1,101 @@
import { create } from 'zustand';
import { persist } from 'zustand/middleware';
import { shallow } from 'zustand/shallow';

import { createBase36Uid } from '~/common/util/textUtils';


/**
 * Very simple personas store for the "Persona Creator" - note that we shall
 * switch to a more complex personas store in the future, as for now we mainly
 * save system prompts so that we don't lose what was created.
 */
export interface SimplePersona {
  id: string;
  name?: string;
  systemPrompt: string; // The system prompt is very important and required
  creationDate: string; // ISO string format
  pictureUrl?: string; // Optional picture URL
  // source material
  inputProvenance?: SimplePersonaProvenance;
  inputText: string;
  // llm used
  llmLabel?: string;
}

export type SimplePersonaProvenance = {
  type: 'youtube';
  url: string;
  title?: string;
  thumbnailUrl?: string;
} | {
  type: 'text';
};


interface AppPersonasStore {

  // state
  simplePersonas: SimplePersona[];

  // actions
  prependSimplePersona: (systemPrompt: string, inputText: string, inputProvenance?: SimplePersonaProvenance, llmLabel?: string) => void;
  deleteSimplePersona: (id: string) => void;

}

/**
 * DO NOT USE outside of this application - this is a very simple store for Personas so that
 * they're not immediately lost.
 */
const useAppPersonasStore = create<AppPersonasStore>()(persist(
  (_set, _get) => ({

    simplePersonas: [],

    prependSimplePersona: (systemPrompt: string, inputText: string, inputProvenance?: SimplePersonaProvenance, llmLabel?: string) =>
      _set(state => ({
        simplePersonas: [
          {
            id: createBase36Uid(state.simplePersonas.map(persona => persona.id)),
            systemPrompt,
            creationDate: new Date().toISOString(),
            inputProvenance,
            inputText,
            llmLabel,
          },
          ...state.simplePersonas,
        ],
      })),

    deleteSimplePersona: (simplePersonaId: string) =>
      _set(state => ({
        simplePersonas: state.simplePersonas.filter(persona => persona.id !== simplePersonaId),
      })),

  }),
  {
    name: 'app-app-personas',
    version: 1,
  },
));

export function useSimplePersonas() {
  const simplePersonas = useAppPersonasStore(state => state.simplePersonas, shallow);
  return { simplePersonas };
}

export function useSimplePersona(simplePersonaId: string) {
  const simplePersona = useAppPersonasStore(state => {
    return state.simplePersonas.find(persona => persona.id === simplePersonaId) ?? null;
  }, shallow);
  return { simplePersona };
}

export function prependSimplePersona(systemPrompt: string, inputText: string, inputProvenance?: SimplePersonaProvenance, llmLabel?: string) {
  useAppPersonasStore.getState().prependSimplePersona(systemPrompt, inputText, inputProvenance, llmLabel);
}

export function deleteSimplePersona(simplePersonaId: string) {
  useAppPersonasStore.getState().deleteSimplePersona(simplePersonaId);
}
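The store above keeps personas newest-first and deletes them by id. A minimal, framework-free sketch of the same prepend/delete semantics (the `SimplePersonaList` class and its shape are hypothetical illustrations; the real store above uses zustand with persistence):

```typescript
// Illustrative only: mirrors prependSimplePersona / deleteSimplePersona above
// without zustand. Names here are hypothetical, not part of the codebase.
interface PersonaLike { id: string; systemPrompt: string; }

class SimplePersonaList {
  private personas: PersonaLike[] = [];

  prepend(id: string, systemPrompt: string): void {
    // newest first, as in the store's prependSimplePersona
    this.personas = [{ id, systemPrompt }, ...this.personas];
  }

  delete(id: string): void {
    // immutable filter, as in the store's deleteSimplePersona
    this.personas = this.personas.filter(p => p.id !== id);
  }

  all(): PersonaLike[] { return this.personas; }
}

const list = new SimplePersonaList();
list.prepend('a', 'You are concise.');
list.prepend('b', 'You are verbose.');
list.delete('a');
console.log(list.all().map(p => p.id)); // prints [ 'b' ]
```

Prepending (rather than appending) means the most recently created persona is the first one rendered in the list.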
@@ -114,6 +114,7 @@ export const navItems: {
    icon: Diversity2Icon,
    type: 'app',
    route: '/personas',
    drawer: true,
    hideBar: true,
  },
  {
@@ -33,6 +33,7 @@ export function CloseableMenu(props: {
  noBottomPadding?: boolean,
  sx?: SxProps,
  zIndex?: number,
  listRef?: React.Ref<HTMLUListElement>,
  children?: React.ReactNode,
}) {

@@ -71,6 +72,7 @@ export function CloseableMenu(props: {
    >
      <ClickAwayListener onClickAway={handleClose}>
        <MenuList
          ref={props.listRef}
          // variant={props.variant} color={props.color}
          onKeyDown={handleListKeyDown}
          sx={{
@@ -0,0 +1,70 @@
import * as React from 'react';
import Input, { InputProps } from '@mui/joy/Input';
import ClearIcon from '@mui/icons-material/Clear';
import SearchIcon from '@mui/icons-material/Search';

type DebounceInputProps = Omit<InputProps, 'onChange'> & {
  onDebounce: (value: string) => void;
  debounceTimeout: number;
};

const DebounceInput: React.FC<DebounceInputProps> = ({
  onDebounce,
  debounceTimeout,
  ...rest
}) => {
  const [inputValue, setInputValue] = React.useState('');
  const timerRef = React.useRef<ReturnType<typeof setTimeout>>();

  const handleChange = (event: React.ChangeEvent<HTMLInputElement>) => {
    const newValue = event.target.value;
    setInputValue(newValue); // Update internal state immediately for a responsive UI

    if (timerRef.current) {
      clearTimeout(timerRef.current);
    }

    timerRef.current = setTimeout(() => {
      onDebounce(newValue); // Call onDebounce after the debounce timeout
    }, debounceTimeout);
  };

  React.useEffect(() => {
    return () => {
      if (timerRef.current) {
        clearTimeout(timerRef.current);
      }
    };
  }, []);

  const handleClear = () => {
    setInputValue(''); // Clear internal state
    onDebounce(''); // Call onDebounce with empty string
  };

  return (
    <Input
      {...rest}
      value={inputValue}
      onChange={handleChange}
      aria-label={rest['aria-label'] || 'Search'}
      startDecorator={<SearchIcon />}
      endDecorator={
        inputValue && (
          <ClearIcon
            onClick={handleClear}
            tabIndex={0}
            onKeyPress={(event) => {
              if (event.key === 'Enter' || event.key === ' ') {
                handleClear();
              }
            }}
            aria-label="Clear search"
          />
        )
      }
    />
  );
};

export default React.memo(DebounceInput);
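The component above implements the classic trailing-edge debounce: every keystroke resets the timer, and only the last value within the timeout window reaches `onDebounce`. The same pattern in plain TypeScript, stripped of React (the `makeDebounced` helper is a hypothetical name for illustration):

```typescript
// Trailing-edge debounce: each call resets the timer, so only the
// last value within the window fires the callback.
function makeDebounced(onDebounce: (value: string) => void, debounceTimeout: number) {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (value: string) => {
    if (timer) clearTimeout(timer);
    timer = setTimeout(() => onDebounce(value), debounceTimeout);
  };
}

// three rapid "keystrokes": only the final value reaches the callback
const seen: string[] = [];
const onType = makeDebounced(v => seen.push(v), 50);
onType('a');
onType('ab');
onType('abc');
setTimeout(() => console.log(seen), 100); // prints [ 'abc' ]
```

This is why the component keeps `inputValue` in local state separately: the text field updates on every keystroke, while expensive consumers (e.g. a search) only see the debounced value.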
@@ -23,7 +23,7 @@ export function GoodModal(props: {
  const showBottomClose = !!props.onClose && props.hideBottomClose !== true;
  return (
    <Modal open={props.open} onClose={props.onClose}>
      <ModalOverflow sx={{p:1}}>
      <ModalOverflow sx={{ p: 1 }}>
        <ModalDialog
          sx={{
            minWidth: { xs: 360, sm: 500, md: 600, lg: 700 },
@@ -19,7 +19,11 @@ export const GoodTooltip = (props: {
    placement={props.placement}
    variant={(props.isError || props.isWarning) ? 'soft' : undefined}
    color={props.isError ? 'danger' : props.isWarning ? 'warning' : undefined}
    sx={{ maxWidth: { sm: '50vw', md: '25vw' }, ...props.sx }}
    sx={{
      maxWidth: { sm: '50vw', md: '25vw' },
      whiteSpace: 'break-spaces',
      ...props.sx,
    }}
  >
    {props.children}
  </Tooltip>;
@@ -9,13 +9,15 @@ import { FormLabelStart } from './FormLabelStart';
 * Text form field (e.g. enter a host)
 */
export function FormTextField(props: {
  title: string | React.JSX.Element, description?: string | React.JSX.Element,
  title: string | React.JSX.Element,
  description?: string | React.JSX.Element,
  tooltip?: string | React.JSX.Element,
  placeholder?: string, isError?: boolean, disabled?: boolean,
  value: string | undefined, onChange: (text: string) => void,
}) {
  return (
    <FormControl orientation='horizontal' disabled={props.disabled} sx={{ flexWrap: 'wrap', justifyContent: 'space-between', alignItems: 'center' }}>
      <FormLabelStart title={props.title} description={props.description} />
      <FormLabelStart title={props.title} description={props.description} tooltip={props.tooltip} />
      <Input
        variant='outlined' placeholder={props.placeholder} error={props.isError}
        value={props.value} onChange={event => props.onChange(event.target.value)}
@@ -0,0 +1,42 @@
import * as React from 'react';

import { Box, FormControl, IconButton, Textarea, Tooltip } from '@mui/joy';
import ReplayIcon from '@mui/icons-material/Replay';

import { FormLabelStart } from '~/common/components/forms/FormLabelStart';


/**
 * A simple UI component, string array (and titles array) in -> edited string array out
 */
export function useFormEditTextArray(initialStrings: string[], titles: string[]) {

  // state
  const [strings, setStrings] = React.useState<string[]>(initialStrings);

  const editString = React.useCallback((i: number, text: string) => {
    setStrings(s => s.map((s, j) => j === i ? text : s));
  }, []);

  const stringEditors = React.useMemo(() => strings.map((text, i) =>
    <FormControl key={i} orientation='vertical'>
      <FormLabelStart title={i > 0 ? `${i}. ${titles[i]}` : titles[i]} />
      <Box sx={{ display: 'flex', alignItems: 'start', gap: 1 }}>
        <Textarea
          value={text}
          size='sm'
          variant='outlined'
          onChange={event => editString(i, event.target.value)}
          sx={{ flex: 1, backgroundColor: 'background.level1', boxShadow: 'none' }}
        />
        <Tooltip title='Reset'>
          <IconButton size='sm' onClick={() => editString(i, initialStrings[i])}>
            <ReplayIcon />
          </IconButton>
        </Tooltip>
      </Box>
    </FormControl>,
  ), [editString, initialStrings, strings, titles]);

  return { strings, stringEditors };
}
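The core of the hook above is the immutable per-index update inside `editString`: map over the array, replacing only position `i`, so React sees a new array reference while the original stays intact. A standalone sketch of that step (`editAt` is a hypothetical name for illustration):

```typescript
// Immutably replace the string at one index -- the same map-based
// update the hook's editString performs via setStrings.
function editAt(strings: string[], i: number, text: string): string[] {
  return strings.map((s, j) => (j === i ? text : s));
}

const initial = ['system prompt', 'greeting'];
const edited = editAt(initial, 1, 'hello!');
console.log(edited);  // prints [ 'system prompt', 'hello!' ]
console.log(initial); // prints [ 'system prompt', 'greeting' ] -- unchanged
```

Because `initialStrings` is never mutated, the per-row Reset button can always restore the original value with `editString(i, initialStrings[i])`.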
@@ -0,0 +1,119 @@
import * as React from 'react';
import { shallow } from 'zustand/shallow';

import { Box, FormControl, ListDivider, ListItemDecorator, Option, Select } from '@mui/joy';

import { DLLM, DLLMId, useModelsStore } from '~/modules/llms/store-llms';
import { findVendorById } from '~/modules/llms/vendors/vendors.registry';

import { FormLabelStart } from '~/common/components/forms/FormLabelStart';
import { IModelVendor } from '~/modules/llms/vendors/IModelVendor';


/**
 * Select the Model, synced with either Global (Chat) LLM state, or local
 *
 * @param localState if true, the state is local to the hook, otherwise the global chat model is changed
 * @param label label of the select, use '' to hide it
 * @param placeholder placeholder of the select
 */
export function useLLMSelect(localState: boolean = true, label: string = 'Model', placeholder: string = 'Models …'): [DLLM | null, React.JSX.Element | null] {

  // state
  const localSwitch = React.useRef(localState);

  // external state
  const { llms, globalChatLLMId, globalSetChatLLMId } = useModelsStore(state => ({
    llms: state.llms,
    globalChatLLMId: state.chatLLMId,
    globalSetChatLLMId: state.setChatLLMId,
  }), shallow);

  // local state initially synced to the global state (may be used or not)
  const [localLLMId, setLocalLLMId] = React.useState<DLLMId | null>(globalChatLLMId);

  // global/local (stable) switch - do not change at runtime
  const chatLLMId = localSwitch.current ? localLLMId : globalChatLLMId;
  const setChatLLMId = localSwitch.current ? setLocalLLMId : globalSetChatLLMId;


  // derived state
  const chatLLM = chatLLMId ? llms.find(llm => llm.id === chatLLMId) ?? null : null;


  const component = React.useMemo(() => {
    // hide invisible models, except the current model
    const filteredLLMs = llms.filter(llm => !llm.hidden || llm.id === chatLLMId);

    // create the option items
    let formerVendor: IModelVendor | null = null;
    const options = filteredLLMs.map((llm) => {

      const vendor = findVendorById(llm._source?.vId);
      const vendorChanged = vendor !== formerVendor;
      const addSeparator = vendorChanged && formerVendor !== null;
      if (vendorChanged)
        formerVendor = vendor;

      return (
        <React.Fragment key={'llm-' + llm.id}>
          {addSeparator && <ListDivider />}
          <Option
            value={llm.id}
            sx={llm.id === chatLLMId ? { fontWeight: 500 } : undefined}
          >
            {!!vendor?.Icon && (
              <ListItemDecorator>
                <vendor.Icon />
              </ListItemDecorator>
            )}
            {/*<Tooltip title={llm.description}>*/}
            {llm.label}
            {/*</Tooltip>*/}
            {/*{llm.gen === 'sdxl' && <Chip size='sm' variant='outlined'>XL</Chip>} {llm.label}*/}
          </Option>
        </React.Fragment>
      );
    });

    // create the component
    return (
      <FormControl>
        {!!label && <FormLabelStart title={label} />}
        <Box sx={{ display: 'flex', justifyContent: 'space-between' }}>
          <Select
            variant='outlined'
            value={chatLLMId}
            onChange={(_event, value) => value && setChatLLMId(value)}
            placeholder={placeholder}
            slotProps={{
              listbox: {
                sx: {
                  // larger list
                  '--ListItem-paddingLeft': '1rem',
                  '--ListItem-minHeight': '2.5rem',
                  // minWidth: '100%',
                },
              },
              button: {
                sx: {
                  // show the full name on the button
                  whiteSpace: 'inherit',
                },
              },
            }}
            sx={{
              flex: 1,
              // minWidth: '200',
            }}
          >
            {options}
          </Select>
        </Box>
      </FormControl>
    );
  }, [chatLLMId, label, llms, placeholder, setChatLLMId]);


  return [chatLLM, component];
}
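The option-building pass above groups models by vendor: it tracks the previous vendor and emits a `<ListDivider />` whenever the vendor changes, but not before the first group. The same single-pass grouping logic in plain TypeScript (`withSeparators` and `LlmLike` are hypothetical names for illustration):

```typescript
// Insert a 'divider' marker whenever the vendor changes between
// consecutive items, skipping a leading divider -- the same pass the
// hook performs while mapping models to <Option> elements.
interface LlmLike { id: string; vendorId: string; }

function withSeparators(llms: LlmLike[]): (LlmLike | 'divider')[] {
  const out: (LlmLike | 'divider')[] = [];
  let formerVendor: string | null = null;
  for (const llm of llms) {
    if (formerVendor !== null && llm.vendorId !== formerVendor)
      out.push('divider');
    formerVendor = llm.vendorId;
    out.push(llm);
  }
  return out;
}

const items = withSeparators([
  { id: 'model-a1', vendorId: 'vendor-a' },
  { id: 'model-a2', vendorId: 'vendor-a' },
  { id: 'model-b1', vendorId: 'vendor-b' },
]);
console.log(items.filter(i => i === 'divider').length); // prints 1
```

This keeps the list visually partitioned per vendor without a separate grouping data structure: one linear scan, O(n).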
@@ -0,0 +1,23 @@
import * as React from 'react';
import { SvgIcon } from '@mui/joy';
import { SxProps } from '@mui/joy/styles/types';
export function TogetherIcon(props: { sx?: SxProps }) {
return <SvgIcon viewBox='0 0 976 180' width='24' height='24' strokeWidth={0} stroke='none' fill='currentColor' strokeLinecap='butt' strokeLinejoin='miter' {...props}>
<path d='M972.52,2.94 C970.56,0.98 968.08,0 965.072,0 C962.066,0 959.584,0.98 957.624,2.94 C955.664,4.9 954.684,7.383 954.684,10.388 C954.684,13.393 955.664,15.876 957.624,17.836 C959.584,19.796 962.066,20.776 965.072,20.776 C968.08,20.776 970.56,19.796 972.52,17.836 C974.48,15.876 975.46,13.393 975.46,10.388 C975.46,7.383 974.48,4.9 972.52,2.94' id='Fill-1' />
<polygon points='957.036 138.572 973.11 138.572 973.11 38.612 957.036 38.612' />
<path d='M571.079,48.412 C563.631,41.095 553.831,37.436 541.679,37.436 C533.839,37.436 527.175,39.069 521.687,42.336 C516.33,45.603 512.018,49.849 508.751,55.076 L508.751,1.372 L492.679,1.372 L492.679,138.572 L508.751,138.572 L508.751,88.592 C508.751,77.355 511.43,68.469 516.787,61.936 C522.275,55.403 529.658,52.136 538.935,52.136 C547.69,52.136 554.419,54.88 559.123,60.368 C563.827,65.725 566.179,73.565 566.179,83.888 L566.179,138.572 L582.251,138.572 L582.251,82.908 C582.251,67.228 578.527,55.729 571.079,48.412' id='Fill-3' />
<path d='M628.806,58.996 C634.555,54.031 641.219,51.548 648.798,51.548 C657.291,51.548 664.282,53.9 669.77,58.604 C675.389,63.308 678.59,69.907 679.374,78.4 L619.006,78.4 C619.79,70.429 623.057,63.961 628.806,58.996 Z M695.642,91.728 C695.903,88.592 696.034,86.305 696.034,84.868 C695.773,75.199 693.617,66.771 689.566,59.584 C685.515,52.397 679.962,46.909 672.906,43.12 C665.85,39.331 657.749,37.436 648.602,37.436 C639.325,37.436 631.093,39.592 623.906,43.904 C616.719,48.085 611.101,54.031 607.05,61.74 C603.13,69.449 601.17,78.4 601.17,88.592 C601.17,98.653 603.195,107.604 607.246,115.444 C611.427,123.153 617.242,129.164 624.69,133.476 C632.138,137.657 640.827,139.748 650.758,139.748 C662.126,139.748 671.73,136.547 679.57,130.144 C687.41,123.611 692.31,115.248 694.27,105.056 L678.002,105.056 C676.303,111.459 672.906,116.489 667.81,120.148 C662.845,123.676 656.769,125.44 649.582,125.44 C640.305,125.44 632.791,122.5 627.042,116.62 C621.293,110.74 618.287,102.9 618.026,93.1 L618.026,91.728 L695.642,91.728 L695.642,91.728 Z' />
<path d='M767.098,38.612 L767.098,54.292 L759.062,54.292 C749.523,54.292 742.663,57.493 738.482,63.896 C734.431,70.299 732.406,78.204 732.406,87.612 L732.406,138.572 L716.334,138.572 L716.334,38.612 L730.25,38.612 L732.406,53.704 C735.28,49.131 739.005,45.472 743.578,42.728 C748.151,39.984 754.488,38.612 762.59,38.612 L767.098,38.612' id='Fill-5' />
<path d='M430.874,53.312 L412.842,53.312 L412.842,38.612 L430.874,38.612 L430.874,10.584 L446.946,10.584 L446.946,38.612 L472.23,38.612 L472.23,53.312 L446.946,53.312 L446.946,112.308 C446.946,116.489 447.73,119.495 449.298,121.324 C450.996,123.023 453.871,123.872 457.922,123.872 L475.366,123.872 L475.366,138.572 L456.942,138.572 C447.534,138.572 440.804,136.481 436.754,132.3 C432.834,128.119 430.874,121.52 430.874,112.504 L430.874,53.312' id='Fill-6' />
<path d='M336.682,58.996 C342.431,54.031 349.095,51.548 356.674,51.548 C365.167,51.548 372.158,53.9 377.646,58.604 C383.264,63.308 386.466,69.907 387.25,78.4 L326.882,78.4 C327.666,70.429 330.932,63.961 336.682,58.996 Z M403.518,91.728 C403.779,88.592 403.91,86.305 403.91,84.868 C403.648,75.199 401.492,66.771 397.442,59.584 C393.391,52.397 387.838,46.909 380.782,43.12 C373.726,39.331 365.624,37.436 356.478,37.436 C347.2,37.436 338.968,39.592 331.782,43.904 C324.595,48.085 318.976,54.031 314.926,61.74 C311.006,69.449 309.046,78.4 309.046,88.592 C309.046,98.653 311.071,107.604 315.122,115.444 C319.303,123.153 325.118,129.164 332.566,133.476 C340.014,137.657 348.703,139.748 358.634,139.748 C370.002,139.748 379.606,136.547 387.446,130.144 C395.286,123.611 400.186,115.248 402.146,105.056 L385.878,105.056 C384.179,111.459 380.782,116.489 375.686,120.148 C370.72,123.676 364.644,125.44 357.458,125.44 C348.18,125.44 340.667,122.5 334.918,116.62 C329.168,110.74 326.163,102.9 325.902,93.1 L325.902,91.728 L403.518,91.728 L403.518,91.728 Z' />
<path d='M268.728,107.996 C265.984,113.484 262.064,117.796 256.968,120.932 C252.003,123.937 246.319,125.44 239.916,125.44 C229.985,125.44 221.949,122.043 215.808,115.248 C209.797,108.323 206.792,99.437 206.792,88.592 C206.792,77.747 209.797,68.927 215.808,62.132 C221.949,55.207 229.985,51.744 239.916,51.744 C246.319,51.744 252.003,53.312 256.968,56.448 C262.064,59.584 265.984,64.027 268.728,69.776 C271.472,75.395 272.844,81.797 272.844,88.984 C272.844,96.04 271.472,102.377 268.728,107.996 Z M274.804,38.612 L272.648,55.86 C269.381,49.98 264.873,45.472 259.124,42.336 C253.375,39.069 246.449,37.436 238.348,37.436 C229.201,37.436 220.969,39.592 213.652,43.904 C206.335,48.216 200.585,54.227 196.404,61.936 C192.353,69.645 190.328,78.531 190.328,88.592 C190.328,99.176 192.353,108.323 196.404,116.032 C200.585,123.741 206.269,129.621 213.456,133.672 C220.773,137.723 229.071,139.748 238.348,139.748 C254.028,139.748 265.461,133.607 272.648,121.324 L272.648,133.084 C272.648,154.121 261.868,164.64 240.308,164.64 C231.945,164.64 225.085,162.941 219.728,159.544 C214.371,156.147 211.039,151.312 209.732,145.04 L193.268,145.04 C194.575,155.885 199.279,164.248 207.38,170.128 C215.612,176.008 226.196,178.948 239.132,178.948 C272.191,178.948 288.72,163.856 288.72,133.672 L288.72,38.612 L274.804,38.612 L274.804,38.612 Z' />
<g id='Group-12' transform='translate(0.000000, 10.584000)'>
<path d='M152.886,97.02 C150.142,102.639 146.222,107.016 141.126,110.152 C136.161,113.288 130.411,114.856 123.878,114.856 C117.345,114.856 111.53,113.288 106.434,110.152 C101.469,107.016 97.614,102.639 94.87,97.02 C92.126,91.401 90.754,85.064 90.754,78.008 C90.754,70.952 92.126,64.615 94.87,58.996 C97.614,53.377 101.469,49 106.434,45.864 C111.53,42.728 117.345,41.16 123.878,41.16 C130.411,41.16 136.161,42.728 141.126,45.864 C146.222,49 150.142,53.377 152.886,58.996 C155.63,64.615 157.002,70.952 157.002,78.008 C157.002,85.064 155.63,91.401 152.886,97.02 Z M167.194,51.352 C163.013,43.643 157.133,37.632 149.554,33.32 C142.106,29.008 133.547,26.852 123.878,26.852 C114.209,26.852 105.585,29.008 98.006,33.32 C90.558,37.632 84.743,43.643 80.562,51.352 C76.381,59.061 74.29,67.947 74.29,78.008 C74.29,88.069 76.381,96.955 80.562,104.664 C84.743,112.373 90.558,118.384 98.006,122.696 C105.585,127.008 114.209,129.164 123.878,129.164 C133.547,129.164 142.106,127.008 149.554,122.696 C157.133,118.384 163.013,112.373 167.194,104.664 C171.375,96.955 173.466,88.069 173.466,78.008 C173.466,67.947 171.375,59.061 167.194,51.352 L167.194,51.352 Z' />
<path d='M17.972,42.728 L-0.06,42.728 L-0.06,28.028 L17.972,28.028 L17.972,0 L34.044,0 L34.044,28.028 L59.328,28.028 L59.328,42.728 L34.044,42.728 L34.044,101.724 C34.044,105.905 34.828,108.911 36.396,110.74 C38.095,112.439 40.969,113.288 45.02,113.288 L62.464,113.288 L62.464,127.988 L44.04,127.988 C34.632,127.988 27.903,125.897 23.852,121.716 C19.932,117.535 17.972,110.936 17.972,101.92 L17.972,42.728' id='Fill-11' mask='url(#mask-2)' />
</g>
<path d='M911.164,97.804 C911.164,106.297 908.355,113.157 902.736,118.384 C897.117,123.48 889.408,126.028 879.608,126.028 C872.944,126.028 867.652,124.525 863.732,121.52 C859.812,118.515 857.852,114.529 857.852,109.564 C857.852,98.457 865.3,92.904 880.196,92.904 L911.164,92.904 L911.164,97.804 Z M933.9,123.872 C929.457,123.872 927.236,121.455 927.236,116.62 L927.236,73.5 C927.236,61.871 923.708,52.985 916.652,46.844 C909.727,40.572 899.861,37.436 887.056,37.436 C875.035,37.436 865.235,40.18 857.656,45.668 C850.208,51.156 845.896,58.8 844.72,68.6 L860.792,68.6 C861.837,63.504 864.581,59.453 869.024,56.448 C873.597,53.312 879.347,51.744 886.272,51.744 C894.112,51.744 900.188,53.573 904.5,57.232 C908.943,60.891 911.164,65.987 911.164,72.52 L911.164,79.38 L881.764,79.38 C868.697,79.38 858.701,82.059 851.776,87.416 C844.981,92.773 841.584,100.483 841.584,110.544 C841.584,119.56 844.916,126.681 851.58,131.908 C858.375,137.135 867.325,139.748 878.432,139.748 C893.067,139.748 904.239,134.195 911.948,123.088 C912.079,128.184 913.516,132.039 916.26,134.652 C919.004,137.265 923.577,138.572 929.98,138.572 L938.8,138.572 L938.8,123.872 L933.9,123.872 L933.9,123.872 Z' />
<path d='M795.627,138.162 C804.074,138.162 810.922,131.314 810.922,122.867 C810.922,114.42 804.074,107.572 795.627,107.572 C787.18,107.572 780.332,114.42 780.332,122.867 C780.332,131.314 787.18,138.162 795.627,138.162' id='Fill-14' />
</SvgIcon>;
}
@@ -27,7 +27,7 @@ export function MobileNavListItem(props: { currentApp?: NavItemApp }) {
        gap: 1,
      }}
    >
      {navItems.apps.filter(app => ['Chat', 'News'].includes(app.name)).map(app =>
      {navItems.apps.filter(app => ['Chat', 'Personas', 'News'].includes(app.name)).map(app =>
        <Button
          key={'app-' + app.name}
          disabled={!!app.automatic}

@@ -35,7 +35,7 @@ export function MobileNavListItem(props: { currentApp?: NavItemApp }) {
          variant={app == props.currentApp ? 'soft' : 'solid'}
          onClick={() => Router.push(app.route)}
        >
          {app.name}
          {app == props.currentApp ? app.name : <app.icon />}
        </Button>,
      )}
    </ButtonGroup>
@@ -16,7 +16,7 @@ export type DropdownItems = Record<string, {
/**
 * A Select component that blends-in nicely (cleaner, easier to the eyes)
 */
export function GoodDropdown<TValue extends string>(props: {
export function PageBarDropdown<TValue extends string>(props: {
  items: DropdownItems,
  prependOption?: React.JSX.Element,
  appendOption?: React.JSX.Element,
@@ -1,7 +1,7 @@
import * as React from 'react';

import type { SxProps } from '@mui/joy/styles/types';
import { ColorPaletteProp, MenuList, VariantProp } from '@mui/joy';
import { ColorPaletteProp, List, VariantProp } from '@mui/joy';


export const PageDrawerTallItemSx: SxProps = {

@@ -15,6 +15,7 @@ export const PageDrawerTallItemSx: SxProps = {
export function PageDrawerList(props: {
  variant?: VariantProp,
  color?: ColorPaletteProp,
  onClick?: () => void,
  largeIcons?: boolean,
  tallRows?: boolean,
  noTopPadding?: boolean,

@@ -23,9 +24,10 @@ export function PageDrawerList(props: {
}) {

  return (
    <MenuList
    <List
      variant={props.variant}
      color={props.color}
      onClick={props.onClick}
      sx={{
        // size of the list items
        '--List-radius': 0,

@@ -40,9 +42,12 @@ export function PageDrawerList(props: {
        border: 'none',
        ...(!!props.noTopPadding && { pt: 0 }),
        ...(!!props.noBottomPadding && { pb: 0 }),

        // clipping/scrolling
        overflow: 'hidden',
      }}
    >
      {props.children}
    </MenuList>
    </List>
  );
}
@@ -60,5 +60,11 @@ export const useOptimaDrawers = () => {
  const context = React.useContext(UseOptimaDrawers);
  if (!context)
    throw new Error('useOptimaDrawer must be used within an OptimaDrawerProvider');
  // NOTE: shall we merge Drawers and Layout? They cascade anyway, and there are benefits to having them together
  // const { appPaneContent } = useOptimaLayout();
  // return {
  //   ...context,
  //   isDrawerOpen: context.isDrawerOpen && !!appPaneContent,
  // };
  return context;
};
@@ -3,7 +3,7 @@ import { createJSONStorage, devtools, persist } from 'zustand/middleware';
import { shallow } from 'zustand/shallow';
import { v4 as uuidv4 } from 'uuid';

import { DLLMId, useModelsStore } from '~/modules/llms/store-llms';
import { DLLMId, getChatLLMId } from '~/modules/llms/store-llms';

import { IDB_MIGRATION_INITIAL, idbStateStorage } from '../util/idbUtils';
import { countModelTokens } from '../util/token-counter';

@@ -354,7 +354,7 @@ export const useChatStore = create<ConversationsStore>()(devtools(
  editMessage: (conversationId: string, messageId: string, updatedMessage: Partial<DMessage>, setUpdated: boolean) =>
    _get()._editConversation(conversationId, conversation => {

      const chatLLMId = useModelsStore.getState().chatLLMId;
      const chatLLMId = getChatLLMId();
      const messages = conversation.messages.map((message: DMessage): DMessage =>
        message.id === messageId
          ? {

@@ -542,7 +542,7 @@ function updateDMessageTokenCount(message: DMessage, llmId: DLLMId | null, force
 * Convenience function to update a set of messages, using the current chatLLM
 */
function updateTokenCounts(messages: DMessage[], forceUpdate: boolean, debugFrom: string): number {
  const { chatLLMId } = useModelsStore.getState();
  const chatLLMId = getChatLLMId();
  return 3 + messages.reduce((sum, message) => 4 + updateDMessageTokenCount(message, chatLLMId, forceUpdate, debugFrom) + sum, 0);
}
@@ -1,5 +1,5 @@
:root {
  --AGI-Nav-width: 52px;
  --AGI-Drawer-width: 320px;
  --AGI-Desktop-Drawer-width: clamp(320px, 22.5vw, 450px);
  --AGI-Desktop-Drawer-width: clamp(320px, 22.1vw, 450px);
}
@@ -1,5 +1,5 @@
import { getFastLLMId } from '~/modules/llms/store-llms';
import { llmChatGenerateOrThrow } from '~/modules/llms/llm.client';
import { useModelsStore } from '~/modules/llms/store-llms';

import { useChatStore } from '~/common/state/store-chats';

@@ -10,7 +10,7 @@ import { useChatStore } from '~/common/state/store-chats';
export function autoTitle(conversationId: string) {

  // use valid fast model
  const { fastLLMId } = useModelsStore.getState();
  const fastLLMId = getFastLLMId();
  if (!fastLLMId) return;

  // only operate on valid conversations, without any title

@@ -27,7 +27,7 @@ export function autoTitle(conversationId: string) {
  });

  // LLM
  void llmChatGenerateOrThrow(fastLLMId, [
  llmChatGenerateOrThrow(fastLLMId, [
    { role: 'system', content: `You are an AI conversation titles assistant who specializes in creating expressive yet few-words chat titles.` },
    {
      role: 'user', content:

@@ -39,17 +39,21 @@ export function autoTitle(conversationId: string) {
        historyLines.join('\n') +
        '```\n',
    },
  ], null, null).then(chatResponse => {
  ], null, null)
    .then(chatResponse => {

    const title = chatResponse?.content
      ?.trim()
      ?.replaceAll('"', '')
      ?.replace('Title: ', '')
      ?.replace('title: ', '');
      const title = chatResponse?.content
        ?.trim()
        ?.replaceAll('"', '')
        ?.replace('Title: ', '')
        ?.replace('title: ', '');

    if (title)
      useChatStore.getState().setAutoTitle(conversationId, title);
      if (title)
        useChatStore.getState().setAutoTitle(conversationId, title);

  });
    })
    .catch(err => {
      console.error('Failed to generate auto title', err);
    });

}
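The `.then` handler above normalizes the model's reply before using it as a conversation title: trim whitespace, drop quotation marks, and strip a leading "Title: " prefix in either case. The same clean-up as a standalone function (`cleanTitle` is a hypothetical name; a regex replaces the component's `replaceAll` for the same effect):

```typescript
// Normalize an LLM reply into a usable title, mirroring the optional
// chain in autoTitle's .then handler; returns null for empty/missing input.
function cleanTitle(content: string | undefined): string | null {
  const title = content
    ?.trim()
    ?.replace(/"/g, '')       // the handler above uses replaceAll('"', '')
    ?.replace('Title: ', '')
    ?.replace('title: ', '');
  return title || null;
}

console.log(cleanTitle('Title: "Trip Planning Chat"')); // prints Trip Planning Chat
console.log(cleanTitle(undefined));                     // prints null
```

Returning `null` for an empty result matches the `if (title)` guard above, so a blank model reply never overwrites the conversation title.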
@@ -1,5 +1,5 @@
import { getFastLLMId } from '~/modules/llms/store-llms';
import { llmChatGenerateOrThrow } from '~/modules/llms/llm.client';
import { useModelsStore } from '~/modules/llms/store-llms';


const simpleImagineSystemPrompt =

@@ -11,7 +11,7 @@ Provide output as a lowercase prompt and nothing else.`;
 * Creates a caption for a drawing or photo given some description - used to elevate the quality of the imaging
 */
export async function imaginePromptFromText(messageText: string): Promise<string | null> {
  const { fastLLMId } = useModelsStore.getState();
  const fastLLMId = getFastLLMId();
  if (!fastLLMId) return null;
  try {
    const chatResponse = await llmChatGenerateOrThrow(fastLLMId, [
@@ -1,7 +1,11 @@
import * as React from 'react';

import { DLLMId, useModelsStore } from '~/modules/llms/store-llms';
import { llmChatGenerateOrThrow, VChatMessageIn } from '~/modules/llms/llm.client';
import { DLLMId, findLLMOrThrow } from '~/modules/llms/store-llms';
import { llmStreamingChatGenerate, VChatMessageIn } from '~/modules/llms/llm.client';


// set to true to log to the console
const DEBUG_CHAIN = false;


export interface LLMChainStep {

@@ -16,32 +20,58 @@ export interface LLMChainStep {
/**
 * React hook to manage a chain of LLM transformations.
 */
export function useLLMChain(steps: LLMChainStep[], llmId: DLLMId | undefined, chainInput: string | undefined, onSuccess?: (output: string) => void) {
export function useLLMChain(steps: LLMChainStep[], llmId: DLLMId | undefined, chainInput: string | undefined, onSuccess?: (output: string, input: string) => void) {

  // state
  const [chain, setChain] = React.useState<ChainState | null>(null);
  const [error, setError] = React.useState<string | null>(null);
  const [chainStepInterimText, setChainStepInterimText] = React.useState<string | null>(null);
  const chainAbortController = React.useRef(new AbortController());

  // restart Chain on inputs change
  React.useEffect(() => {
    // abort any ongoing chain, if any
    chainAbortController.current.abort();

  // abort an ongoing chain, if any
  const abortChain = React.useCallback((reason: string) => {
    DEBUG_CHAIN && console.log('chain: abort (' + reason + ')');
    chainAbortController.current.abort(reason);
    chainAbortController.current = new AbortController();
    setChain(null);
  }, []);

  // error if no LLM
  const userCancelChain = React.useCallback(() => {
    abortChain('user canceled');
    setError('Canceled');
  }, [abortChain]);

  // starts a chain with the given inputs
  const startChain = React.useCallback((inputText: string | undefined, llmId: DLLMId | undefined, steps: LLMChainStep[]) => {
    DEBUG_CHAIN && console.log('chain: restart', { textLen: inputText?.length, llmId, stepsCount: steps.length });

    // abort any former running chain
    abortChain('restart');

    // init state
    setError(!llmId ? 'LLM not provided' : null);
    setChain((inputText && llmId)
      ? initChainState(llmId, inputText, steps)
      : null,
    );
    setChainStepInterimText(null);

    // abort if no input
    if (!chainInput || !llmId)
      return;
  }, [abortChain]);

    // start the chain
    setChain(initChainState(llmId, chainInput, steps));
    return () => chainAbortController.current.abort();
  }, [chainInput, llmId, steps]);
  // restarts this chain
  const restartChain = React.useCallback(() => {
    startChain(chainInput, llmId, steps);
  }, [chainInput, llmId, startChain, steps]);


  // perform Step on Chain update
  // lifecycle: Start on inputs change + Abort on unmounts
  React.useEffect(() => {
    restartChain();
    return () => abortChain('unmount');
  }, [restartChain, abortChain]);


  // stepper: perform Step on Chain updates
  React.useEffect(() => {
    // skip step if the chain has been aborted
    const _chainAbortController = chainAbortController.current;

@@ -57,7 +87,7 @@ export function useLLMChain(steps: LLMChainStep[], llmId: DLLMId | undefined, ch
    // safety check (re-processing the same step shall never happen)
    const chainStep = chain.steps[stepIdx];
    if (chainStep.output)
      return console.log('WARNING - Output overlap - why is this happening?', chainStep);
      return console.log('WARNING - Output overlap - FIXME', chainStep);

    // execute step instructions
    let llmChatInput: VChatMessageIn[] = [...chain.chatHistory];

@@ -79,21 +109,30 @@ export function useLLMChain(steps: LLMChainStep[], llmId: DLLMId | undefined, ch
    const globalToStepListener = () => stepAbortController.abort('chain aborted');
    _chainAbortController.signal.addEventListener('abort', globalToStepListener);

    // LLM call
    llmChatGenerateOrThrow(llmId, llmChatInput, null, null, chain.overrideResponseTokens ?? undefined)
      .then(({ content }) => {
        stepDone = true;
    // interim text
    let interimText = '';
    setChainStepInterimText(null);

    // LLM call (streaming, cancelable)
    llmStreamingChatGenerate(llmId, llmChatInput, null, null, stepAbortController.signal,
      (update) => {
        update.text && setChainStepInterimText(interimText = update.text);
      })
      .then(() => {
        if (stepAbortController.signal.aborted)
          return;
        const chainState = updateChainState(chain, llmChatInput, stepIdx, content);
        const chainState = updateChainState(chain, llmChatInput, stepIdx, interimText);
|
||||
if (chainState.output && onSuccess)
|
||||
onSuccess(chainState.output);
|
||||
onSuccess(chainState.output, chainState.input);
|
||||
setChain(chainState);
|
||||
})
|
||||
.catch((err) => {
|
||||
stepDone = true;
|
||||
if (!stepAbortController.signal.aborted)
|
||||
setError(`Transformation error: ${err?.message || err?.toString() || err || 'unknown'}`);
|
||||
})
|
||||
.finally(() => {
|
||||
stepDone = true;
|
||||
setChainStepInterimText(null);
|
||||
});
|
||||
|
||||
// abort if unmounted before the LLM call ends, or if the full chain has been aborted
|
||||
@@ -102,7 +141,7 @@ export function useLLMChain(steps: LLMChainStep[], llmId: DLLMId | undefined, ch
|
||||
stepAbortController.abort('step aborted');
|
||||
_chainAbortController.signal.removeEventListener('abort', globalToStepListener);
|
||||
};
|
||||
}, [chain, llmId]);
|
||||
}, [chain, llmId, onSuccess]);
|
||||
|
||||
|
||||
return {
|
||||
@@ -111,12 +150,11 @@ export function useLLMChain(steps: LLMChainStep[], llmId: DLLMId | undefined, ch
|
||||
chainOutput: chain?.output ?? null,
|
||||
chainProgress: chain?.progress ?? 0,
|
||||
chainStepName: chain?.steps?.find((step) => !step.isComplete)?.ref.name ?? null,
|
||||
chainIntermediates: chain?.steps?.map((step) => step.output ?? null)?.filter(out => out) ?? [],
|
||||
chainStepInterimChars: chainStepInterimText?.length ?? null,
|
||||
chainIntermediates: chain?.steps?.map((step) => ({ name: step.ref.name, output: step.output ?? null })).filter(i => !!i.output) ?? [],
|
||||
chainError: error,
|
||||
abortChain: () => {
|
||||
chainAbortController.current.abort('user canceled');
|
||||
setError('Canceled');
|
||||
},
|
||||
userCancelChain,
|
||||
restartChain,
|
||||
};
|
||||
}
|
||||
|
||||
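The stepper effect above wires a per-step AbortController to the chain-wide one through an 'abort' listener, and removes the listener on cleanup. A standalone sketch of that wiring (the helper name `linkStepToChain` is mine, not the hook's):

```typescript
// Sketch: chain a per-step AbortController to a chain-wide one, as the
// stepper effect above does with globalToStepListener. Illustrative only.
function linkStepToChain(chainController: AbortController): { stepController: AbortController, unlink: () => void } {
  const stepController = new AbortController();
  // aborting the whole chain aborts the current step too
  const onChainAbort = () => stepController.abort('chain aborted');
  chainController.signal.addEventListener('abort', onChainAbort);
  // cleanup: detach so a later chain abort cannot touch a finished step
  const unlink = () => chainController.signal.removeEventListener('abort', onChainAbort);
  return { stepController, unlink };
}
```

The `unlink` cleanup mirrors the `removeEventListener` call in the effect's teardown: without it, an old step controller would keep receiving chain aborts.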
@@ -140,10 +178,7 @@ interface StepState {

function initChainState(llmId: DLLMId, input: string, steps: LLMChainStep[]): ChainState {
// max token allocation for the job
const { llms } = useModelsStore.getState();
const llm = llms.find(llm => llm.id === llmId);
if (!llm)
throw new Error(`LLM ${llmId} not found`);
const llm = findLLMOrThrow(llmId);

const overrideResponseTokens = llm.maxOutputTokens;
const safeInputLength = (llm.contextTokens && overrideResponseTokens)
@@ -167,7 +202,7 @@ function initChainState(llmId: DLLMId, input: string, steps: LLMChainStep[]): Ch
}

function updateChainState(chain: ChainState, history: VChatMessageIn[], stepIdx: number, output: string): ChainState {
const steps = chain.steps.length;
const stepsCount = chain.steps.length;
return {
...chain,
steps: chain.steps.map((step, i) =>
@@ -177,8 +212,8 @@ function updateChainState(chain: ChainState, history: VChatMessageIn[], stepIdx:
isComplete: true,
} : step),
chatHistory: history,
progress: Math.round(100 * (stepIdx + 1) / steps) / 100,
output: (stepIdx === steps - 1) ? output : null,
progress: Math.round(100 * (stepIdx + 1) / stepsCount) / 100,
output: (stepIdx === stepsCount - 1) ? output : null,
};
}

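The `progress` arithmetic in `updateChainState` rounds the completed-step fraction to two decimals. A minimal standalone sketch of that computation (the function name is illustrative, not the file's):

```typescript
// Progress after completing step `stepIdx` out of `stepsCount`, rounded to
// two decimals exactly as updateChainState computes it.
function progressAfterStep(stepIdx: number, stepsCount: number): number {
  return Math.round(100 * (stepIdx + 1) / stepsCount) / 100;
}
```

Rounding via `Math.round(100 * x) / 100` keeps the stored fraction stable for display as a percentage.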
@@ -33,6 +33,7 @@ export const backendRouter = createTRPCRouter({
hasLlmOllama: !!env.OLLAMA_API_HOST,
hasLlmOpenAI: !!env.OPENAI_API_KEY || !!env.OPENAI_API_HOST,
hasLlmOpenRouter: !!env.OPENROUTER_API_KEY,
hasLlmTogetherAI: !!env.TOGETHERAI_API_KEY,
hasVoiceElevenLabs: !!env.ELEVENLABS_API_KEY,
} satisfies BackendCapabilities;
}),

@@ -14,6 +14,7 @@ export interface BackendCapabilities {
hasLlmOllama: boolean;
hasLlmOpenAI: boolean;
hasLlmOpenRouter: boolean;
hasLlmTogetherAI: boolean;
hasVoiceElevenLabs: boolean;
}

@@ -37,6 +38,7 @@ const useBackendStore = create<BackendStore>()(
hasLlmOllama: false,
hasLlmOpenAI: false,
hasLlmOpenRouter: false,
hasLlmTogetherAI: false,
hasVoiceElevenLabs: false,

loadedCapabilities: false,

@@ -46,7 +46,7 @@ export async function llmChatGenerateOrThrow<TSourceSetup = unknown, TAccess = u
const access = vendor.getTransportAccess(partialSourceSetup);

// get any vendor-specific rate limit delay
const delay = vendor.getRateLimitDelay?.(llm) ?? 0;
const delay = vendor.getRateLimitDelay?.(llm, partialSourceSetup) ?? 0;
if (delay > 0)
await new Promise(resolve => setTimeout(resolve, delay));

@@ -75,7 +75,7 @@ export async function llmStreamingChatGenerate<TSourceSetup = unknown, TAccess =
const access = vendor.getTransportAccess(partialSourceSetup); // as ChatStreamInputSchema['access'];

// get any vendor-specific rate limit delay
const delay = vendor.getRateLimitDelay?.(llm) ?? 0;
const delay = vendor.getRateLimitDelay?.(llm, partialSourceSetup) ?? 0;
if (delay > 0)
await new Promise(resolve => setTimeout(resolve, delay));


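Both call sites above use the same optional rate-limit hook: a vendor may simply omit `getRateLimitDelay`, and optional chaining plus nullish coalescing degrade to a zero-millisecond delay. A reduced sketch of the idiom (the `VendorLike` shape is an assumption for illustration, not the real `IModelVendor` interface):

```typescript
// Sketch of the optional rate-limit hook: vendors may omit getRateLimitDelay,
// in which case `?.()` yields undefined and `?? 0` turns it into no delay.
interface VendorLike {
  getRateLimitDelay?: (llmId: string) => number;
}

function rateLimitDelayMs(vendor: VendorLike, llmId: string): number {
  return vendor.getRateLimitDelay?.(llmId) ?? 0;
}
```

The caller can then `await new Promise(resolve => setTimeout(resolve, delay))` only when the delay is positive, as both functions above do.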
@@ -6,14 +6,14 @@ import DeleteOutlineIcon from '@mui/icons-material/DeleteOutline';
import VisibilityIcon from '@mui/icons-material/Visibility';
import VisibilityOffIcon from '@mui/icons-material/VisibilityOff';

import { DLLMId, useModelsStore } from '~/modules/llms/store-llms';
import { findVendorById } from '~/modules/llms/vendors/vendors.registry';

import { FormLabelStart } from '~/common/components/forms/FormLabelStart';
import { GoodModal } from '~/common/components/GoodModal';
import { GoodTooltip } from '~/common/components/GoodTooltip';
import { settingsGap } from '~/common/app.theme';

import { DLLMId, useModelsStore } from '../store-llms';
import { findVendorById } from '../vendors/vendors.registry';


function VendorLLMOptions(props: { llmId: DLLMId }) {
// get LLM (warning: this will refresh all children components on every change of any LLM field)
@@ -134,7 +134,7 @@ export function LLMOptionsModal(props: { id: DLLMId, onClose: () => void }) {
</Typography>}
<Typography level='body-xs'>
context tokens: <b>{llm.contextTokens ? llm.contextTokens.toLocaleString() : 'not provided'}</b>{` · `}
max output tokens: <b>{llm.maxOutputTokens ? llm.maxOutputTokens.toLocaleString() : 'not provided'}</b><br/>
max output tokens: <b>{llm.maxOutputTokens ? llm.maxOutputTokens.toLocaleString() : 'not provided'}</b><br />
{!!llm.created && `created: ${(new Date(llm.created * 1000)).toLocaleString()} · `}
{/*· tags: {llm.tags.join(', ')}*/}
config: {JSON.stringify(llm.options)}

@@ -5,12 +5,12 @@ import { Box, Chip, IconButton, List, ListItem, ListItemButton, Typography } fro
import SettingsOutlinedIcon from '@mui/icons-material/SettingsOutlined';
import VisibilityOffOutlinedIcon from '@mui/icons-material/VisibilityOffOutlined';

import { DLLM, DLLMId, DModelSourceId, useModelsStore } from '~/modules/llms/store-llms';
import { IModelVendor } from '~/modules/llms/vendors/IModelVendor';
import { findVendorById } from '~/modules/llms/vendors/vendors.registry';

import { GoodTooltip } from '~/common/components/GoodTooltip';

import { DLLM, DLLMId, DModelSourceId, useModelsStore } from '../store-llms';
import { IModelVendor } from '../vendors/IModelVendor';
import { findVendorById } from '../vendors/vendors.registry';


function ModelItem(props: { llm: DLLM, vendor: IModelVendor, chipChat: boolean, chipFast: boolean, chipFunc: boolean, onClick: () => void }) {

@@ -24,42 +24,40 @@ function ModelItem(props: { llm: DLLM, vendor: IModelVendor, chipChat: boolean,
if (llm.contextTokens) {
tooltip += llm.contextTokens.toLocaleString() + ' tokens';
if (llm.maxOutputTokens)
tooltip += ' / ' + llm.maxOutputTokens.toLocaleString() + ' max output tokens'
tooltip += ' / ' + llm.maxOutputTokens.toLocaleString() + ' max output tokens';
} else
tooltip += 'token count not provided';

return (
<ListItem>
<ListItemButton onClick={props.onClick} sx={{ alignItems: 'center', gap: 1 }}>
<ListItemButton color='primary' onClick={props.onClick} sx={{ alignItems: 'center', gap: 1 }}>

{/* Model Name */}
<GoodTooltip title={tooltip}>
<Typography sx={llm.hidden ? { color: 'neutral.plainDisabledColor' } : undefined}>
{label}
</Typography>
</GoodTooltip>
{/* Model Name */}
<GoodTooltip title={tooltip}>
<Typography sx={llm.hidden ? { color: 'neutral.plainDisabledColor' } : undefined}>
{label}
</Typography>
</GoodTooltip>

{/* --> */}
<Box sx={{ flex: 1 }} />
{/* --> */}
<Box sx={{ flex: 1 }} />

{props.chipChat && <Chip size='sm' variant='plain' sx={{ boxShadow: 'sm' }}>chat</Chip>}
{props.chipChat && <Chip size='sm' variant='plain' sx={{ boxShadow: 'sm' }}>chat</Chip>}

{props.chipFast && <Chip size='sm' variant='plain' sx={{ boxShadow: 'sm' }}>fast</Chip>}
{props.chipFast && <Chip size='sm' variant='plain' sx={{ boxShadow: 'sm' }}>fast</Chip>}

{props.chipFunc && <Chip size='sm' variant='plain' sx={{ boxShadow: 'sm' }}>𝑓n</Chip>}
{props.chipFunc && <Chip size='sm' variant='plain' sx={{ boxShadow: 'sm' }}>𝑓n</Chip>}

{llm.hidden && (
<IconButton disabled size='sm'>
<VisibilityOffOutlinedIcon />
</IconButton>
)}

<IconButton size='sm'>
<SettingsOutlinedIcon />
{llm.hidden && (
<IconButton disabled size='sm'>
<VisibilityOffOutlinedIcon />
</IconButton>
)}

</ListItemButton>
</ListItem>
<IconButton size='sm'>
<SettingsOutlinedIcon />
</IconButton>

</ListItemButton>
);
}

@@ -113,11 +111,10 @@ export function ModelsList(props: {
}

return (
<List variant='soft' size='sm' sx={{
borderRadius: 'sm',
pl: { xs: 0, md: 1 },
overflowY: 'auto',
}}>
<List
variant='soft' size='sm'
sx={{ borderRadius: 'md', overflowY: 'auto' }}
>
{items.length > 0 ? items : (
<ListItem>
<Typography level='body-sm'>

@@ -3,13 +3,13 @@ import { shallow } from 'zustand/shallow';

import { Box, Checkbox, Divider } from '@mui/joy';

import { DModelSource, DModelSourceId, useModelsStore } from '~/modules/llms/store-llms';
import { createModelSourceForDefaultVendor, findVendorById } from '~/modules/llms/vendors/vendors.registry';

import { GoodModal } from '~/common/components/GoodModal';
import { settingsGap } from '~/common/app.theme';
import { useOptimaLayout } from '~/common/layout/optima/useOptimaLayout';

import { DModelSource, DModelSourceId, useModelsStore } from '../store-llms';
import { createModelSourceForDefaultVendor, findVendorById } from '../vendors/vendors.registry';

import { LLMOptionsModal } from './LLMOptionsModal';
import { ModelsList } from './ModelsList';
import { ModelsSourceSelector } from './ModelsSourceSelector';
@@ -19,7 +19,7 @@ function VendorSourceSetup(props: { source: DModelSource }) {
const vendor = findVendorById(props.source.vId);
if (!vendor)
return 'Configuration issue: Vendor not found for Source ' + props.source.id;
return <vendor.SourceSetupComponent sourceId={props.source.id} />;
return <vendor.SourceSetupComponent key={props.source.id} sourceId={props.source.id} />;
}



@@ -1,18 +1,18 @@
import * as React from 'react';
import { shallow } from 'zustand/shallow';

import { Avatar, Badge, Box, Button, IconButton, ListItemDecorator, MenuItem, Option, Select, Typography } from '@mui/joy';
import { Avatar, Badge, Box, Button, Chip, IconButton, ListItemDecorator, MenuItem, Option, Select, Typography } from '@mui/joy';
import AddIcon from '@mui/icons-material/Add';
import DeleteOutlineIcon from '@mui/icons-material/DeleteOutline';

import type { IModelVendor } from '~/modules/llms/vendors/IModelVendor';
import { DModelSourceId, useModelsStore } from '~/modules/llms/store-llms';
import { createModelSourceForVendor, findAllVendors, findVendorById, ModelVendorId } from '~/modules/llms/vendors/vendors.registry';

import { CloseableMenu } from '~/common/components/CloseableMenu';
import { ConfirmationModal } from '~/common/components/ConfirmationModal';
import { useIsMobile } from '~/common/components/useMatchMedia';

import type { IModelVendor } from '../vendors/IModelVendor';
import { DModelSourceId, useModelsStore } from '../store-llms';
import { createModelSourceForVendor, findAllVendors, findVendorById, ModelVendorId } from '../vendors/vendors.registry';


/*function locationIcon(vendor?: IModelVendor | null) {
if (vendor && vendor.id === 'openai' && ModelVendorOpenAI.hasBackendCap?.())
@@ -81,22 +81,37 @@ export function ModelsSourceSelector(props: {
const vendorItems = React.useMemo(() => findAllVendors()
.filter(v => !!v.instanceLimit)
.map(vendor => {
const sourceCount = modelSources.filter(source => source.vId === vendor.id).length;
const enabled = vendor.instanceLimit > sourceCount;
const sourceInstanceCount = modelSources.filter(source => source.vId === vendor.id).length;
const enabled = vendor.instanceLimit > sourceInstanceCount;
return {
vendor,
enabled,
sourceCount,
component: (
<MenuItem key={vendor.id} disabled={!enabled} onClick={() => handleAddSourceFromVendor(vendor.id)}>
<ListItemDecorator>
{vendorIcon(vendor, !!vendor.hasBackendCap && vendor.hasBackendCap())}
</ListItemDecorator>
{vendor.name}
{/*{sourceCount > 0 && ` (added)`}*/}

{/*{sourceInstanceCount > 0 && ` (added)`}*/}

{/* Free indication */}
{!!vendor.hasFreeModels && ` 🎁`}
{/*{!!vendor.instanceLimit && ` (${sourceCount}/${vendor.instanceLimit})`}*/}
{vendor.location === 'local' && <span style={{ opacity: 0.5 }}>local</span>}

{/* Multiple instance hint */}
{vendor.instanceLimit > 1 && !!sourceInstanceCount && enabled && (
<Typography component='span' level='body-sm'>
#{sourceInstanceCount + 1}
{/*/{vendor.instanceLimit}*/}
</Typography>
)}

{/* Local chip */}
{vendor.location === 'local' && (
<Chip variant='outlined' size='sm'>
local
</Chip>
)}
</MenuItem>
),
};

@@ -99,6 +99,7 @@ export async function llmStreamingRelayHandler(req: NextRequest): Promise<Respon
case 'oobabooga':
case 'openai':
case 'openrouter':
case 'togetherai':
requestAccess = openAIAccess(access, model.id, '/v1/chat/completions');
body = openAIChatCompletionPayload(model, history, null, null, 1, true);
vendorStreamParser = createStreamParserOpenAI();

@@ -3,6 +3,7 @@ import { LLM_IF_OAI_Chat, LLM_IF_OAI_Complete, LLM_IF_OAI_Fn, LLM_IF_OAI_Vision
import type { ModelDescriptionSchema } from '../llm.server.types';
import { wireMistralModelsListOutputSchema } from './mistral.wiretypes';
import { wireOpenrouterModelsListOutputSchema } from './openrouter.wiretypes';
import { wireTogetherAIListOutputSchema } from '~/modules/llms/server/openai/togetherai.wiretypes';


// [Azure] / [OpenAI]
@@ -375,6 +376,95 @@ export function openRouterModelToModelDescription(wireModel: object): ModelDescr
}


// [Together AI]

const _knownTogetherAIChatModels: ManualMappings = [
{
idPrefix: 'NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO',
label: 'Nous Hermes 2 - Mixtral 8x7B-DPO',
description: 'Nous Hermes 2 Mixtral 7bx8 DPO is the new flagship Nous Research model trained over the Mixtral 7bx8 MoE LLM. The model was trained on over 1,000,000 entries of primarily GPT-4 generated data, as well as other high quality data from open datasets across the AI landscape, achieving state of the art performance on a variety of tasks.',
contextWindow: 32768,
pricing: {
cpmPrompt: 0.0006,
cpmCompletion: 0.0006,
},
interfaces: [LLM_IF_OAI_Chat],
},
{
idPrefix: 'NousResearch/Nous-Hermes-2-Mixtral-8x7B-SFT',
label: 'Nous Hermes 2 - Mixtral 8x7B-SFT',
description: 'Nous Hermes 2 Mixtral 7bx8 SFT is the new flagship Nous Research model trained over the Mixtral 7bx8 MoE LLM. The model was trained on over 1,000,000 entries of primarily GPT-4 generated data, as well as other high quality data from open datasets across the AI landscape, achieving state of the art performance on a variety of tasks.',
contextWindow: 32768,
pricing: {
cpmPrompt: 0.0006,
cpmCompletion: 0.0006,
},
interfaces: [LLM_IF_OAI_Chat],
},
{
idPrefix: 'mistralai/Mixtral-8x7B-Instruct-v0.1',
label: 'Mixtral-8x7B Instruct',
description: 'The Mixtral-8x7B Large Language Model (LLM) is a pretrained generative Sparse Mixture of Experts.',
contextWindow: 32768,
pricing: {
cpmPrompt: 0.0006,
cpmCompletion: 0.0006,
},
interfaces: [LLM_IF_OAI_Chat],
},
{
idPrefix: 'mistralai/Mistral-7B-Instruct-v0.2',
label: 'Mistral (7B) Instruct v0.2',
description: 'The Mistral-7B-Instruct-v0.2 Large Language Model (LLM) is an improved instruct fine-tuned version of Mistral-7B-Instruct-v0.1.',
contextWindow: 32768,
pricing: {
cpmPrompt: 0.0002,
cpmCompletion: 0.0002,
},
interfaces: [LLM_IF_OAI_Chat],
},
{
idPrefix: 'NousResearch/Nous-Hermes-2-Yi-34B',
label: 'Nous Hermes-2 Yi (34B)',
description: 'Nous Hermes 2 - Yi-34B is a state of the art Yi Fine-tune',
contextWindow: 4097,
pricing: {
cpmPrompt: 0.0008,
cpmCompletion: 0.0008,
},
interfaces: [LLM_IF_OAI_Chat],
},
];

export function togetherAIModelsToModelDescriptions(wireModels: unknown): ModelDescriptionSchema[] {

function togetherAIModelToModelDescription(model: { id: string, created: number }) {
return fromManualMapping(_knownTogetherAIChatModels, model.id, model.created, undefined, {
idPrefix: model.id,
label: model.id.replaceAll('/', ' · ').replaceAll(/[_-]/g, ' '),
description: 'New Together AI Model',
contextWindow: null, // unknown
interfaces: [LLM_IF_OAI_Chat], // assume..
hidden: true,
});
}

function togetherAIModelsSort(a: ModelDescriptionSchema, b: ModelDescriptionSchema): number {
if (a.hidden && !b.hidden)
return 1;
if (!a.hidden && b.hidden)
return -1;
if (a.created !== b.created)
return (b.created || 0) - (a.created || 0);
return a.id.localeCompare(b.id);
}

return wireTogetherAIListOutputSchema.parse(wireModels)
.map(togetherAIModelToModelDescription)
.sort(togetherAIModelsSort);
}


// Helpers

type ManualMappings = ManualMapping[];

@@ -12,10 +12,12 @@ import { fixupHost } from '~/common/util/urlUtils';

import { OpenAIWire, WireOpenAICreateImageOutput, wireOpenAICreateImageOutputSchema, WireOpenAICreateImageRequest } from './openai.wiretypes';
import { llmsChatGenerateWithFunctionsOutputSchema, llmsListModelsOutputSchema, ModelDescriptionSchema } from '../llm.server.types';
import { lmStudioModelToModelDescription, localAIModelToModelDescription, mistralModelsSort, mistralModelToModelDescription, oobaboogaModelToModelDescription, openAIModelToModelDescription, openRouterModelFamilySortFn, openRouterModelToModelDescription } from './models.data';
import { lmStudioModelToModelDescription, localAIModelToModelDescription, mistralModelsSort, mistralModelToModelDescription, oobaboogaModelToModelDescription, openAIModelToModelDescription, openRouterModelFamilySortFn, openRouterModelToModelDescription, togetherAIModelsToModelDescriptions } from './models.data';


const openAIDialects = z.enum(['azure', 'lmstudio', 'localai', 'mistral', 'oobabooga', 'openai', 'openrouter']);
const openAIDialects = z.enum([
'azure', 'lmstudio', 'localai', 'mistral', 'oobabooga', 'openai', 'openrouter', 'togetherai',
]);

export const openAIAccessSchema = z.object({
dialect: openAIDialects,
@@ -133,6 +135,11 @@ export const llmOpenAIRouter = createTRPCRouter({

// [non-Azure]: fetch openAI-style for all but Azure (will be then used in each dialect)
const openAIWireModelsResponse = await openaiGET<OpenAIWire.Models.Response>(access, '/v1/models');

// [Together] missing the .data property
if (access.dialect === 'togetherai')
return { models: togetherAIModelsToModelDescriptions(openAIWireModelsResponse) };

let openAIModels: OpenAIWire.Models.ModelDescription[] = openAIWireModelsResponse.data || [];

// de-duplicate by ids (can happen for local servers.. upstream bugs)
@@ -303,6 +310,7 @@ const DEFAULT_HELICONE_OPENAI_HOST = 'oai.hconeai.com';
const DEFAULT_MISTRAL_HOST = 'https://api.mistral.ai';
const DEFAULT_OPENAI_HOST = 'api.openai.com';
const DEFAULT_OPENROUTER_HOST = 'https://openrouter.ai/api';
const DEFAULT_TOGETHERAI_HOST = 'https://api.together.xyz';

export function openAIAccess(access: OpenAIAccessSchema, modelRefId: string | null, apiPath: string): { headers: HeadersInit, url: string } {
switch (access.dialect) {
@@ -416,6 +424,22 @@ export function openAIAccess(access: OpenAIAccessSchema, modelRefId: string | nu
},
url: orHost + apiPath,
};

case 'togetherai':
const togetherKey = access.oaiKey || env.TOGETHERAI_API_KEY || '';
const togetherHost = fixupHost(access.oaiHost || DEFAULT_TOGETHERAI_HOST, apiPath);
if (!togetherKey || !togetherHost)
throw new Error('Missing TogetherAI API Key or Host. Add it on the UI (Models Setup) or server side (your deployment).');

return {
headers: {
'Authorization': `Bearer ${togetherKey}`,
'Content-Type': 'application/json',
'Accept': 'application/json',
},
url: togetherHost + apiPath,
};

}
}

@@ -426,7 +450,7 @@ export function openAIChatCompletionPayload(model: OpenAIModelSchema, history: O
...(functions && { functions: functions, function_call: forceFunctionName ? { name: forceFunctionName } : 'auto' }),
...(model.temperature && { temperature: model.temperature }),
...(model.maxTokens && { max_tokens: model.maxTokens }),
n,
...(n > 1 && { n }),
stream,
};
}

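The payload change above sends `n` only when it exceeds 1, using the `...(condition && { key })` conditional-spread idiom: when the condition is false, spreading `false` is a no-op and the key is omitted entirely. A reduced sketch (the function name and field set are illustrative, not the router's):

```typescript
// Build a chat-completion-style payload where `n` is included only when
// meaningful, via conditional object spread (as in the change above).
function buildPayload(model: string, n: number, stream: boolean): Record<string, unknown> {
  return {
    model,
    ...(n > 1 && { n }),
    stream,
  };
}
```

Omitting the key (rather than sending `n: 1`) matters for strict upstream APIs that reject or mishandle unexpected defaults.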
@@ -24,7 +24,7 @@ export namespace OpenAIWire {
presence_penalty?: number;
max_tokens?: number;
stream: boolean;
n: number;
n?: number;
// [FN0613]
functions?: RequestFunctionDef[],
function_call?: 'auto' | 'none' | {

@@ -0,0 +1,12 @@
import { z } from 'zod';


// [Together AI] Models List API - Response

export const wireTogetherAIListOutputSchema = z.array(z.object({
id: z.string(),
object: z.literal('model'),
created: z.number(),
}));

// export type WireTogetherAIListOutput = z.infer<typeof wireTogetherAIListOutputSchema>;
@@ -163,9 +163,23 @@ export const useModelsStore = create<LlmsStore>()(


addSource: (source: DModelSource) =>
set(state => ({
sources: [...state.sources, source],
})),
set(state => {

// re-number all sources for the given vendor
let n = 0;
const sourceVId = source.vId;

return {
sources: [...state.sources, source].map(_source =>
_source.vId != sourceVId
? _source
: {
..._source,
label: _source.label.replace(/ #\d+$/, '') + (++n > 1 ? ` #${n}` : ''),
},
),
};
}),

removeSource: (id: DModelSourceId) =>
set(state => {
@@ -239,6 +253,10 @@ export const useModelsStore = create<LlmsStore>()(
);


export const getChatLLMId = (): DLLMId | null => useModelsStore.getState().chatLLMId;

export const getFastLLMId = (): DLLMId | null => useModelsStore.getState().fastLLMId;

export function findLLMOrThrow<TSourceSetup, TLLMOptions>(llmId: DLLMId): DLLM<TSourceSetup, TLLMOptions> {
const llm = useModelsStore.getState().llms.find(llm => llm.id === llmId);
if (!llm) throw new Error(`LLM ${llmId} not found`);
@@ -276,8 +294,15 @@ function findLlmIdBySuffix(llms: DLLM[], suffixes: string[], fallbackToFirst: bo
for (const llm of llms)
if (llm.id.endsWith(suffix))
return llm.id;
if (!fallbackToFirst) return null;

// otherwise return first that's not hidden
for (const llm of llms)
if (!llm.hidden)
return llm.id;

// otherwise return first id
return fallbackToFirst ? llms[0].id : null;
return llms[0].id;
}



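The new `addSource` above renumbers all of a vendor's sources: it strips any trailing ` #N` suffix, then re-appends ` #2`, ` #3`, and so on from the second instance onward, so the first instance keeps a clean label. The labeling rule in isolation (the function name is mine, not the store's):

```typescript
// Re-number one vendor's source labels, as the addSource change above does:
// strip an existing ' #N' suffix, then suffix ' #2', ' #3', ... in order,
// leaving the first instance unsuffixed.
function renumberLabels(labels: string[]): string[] {
  let n = 0;
  return labels.map(label =>
    label.replace(/ #\d+$/, '') + (++n > 1 ? ` #${n}` : ''));
}
```

Renumbering the whole group on each insert keeps labels consistent even after earlier instances were removed.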
@@ -4,7 +4,7 @@ import type { TRPCClientErrorBase } from '@trpc/client';
import type { DLLM, DLLMId, DModelSourceId } from '../store-llms';
import type { ModelDescriptionSchema } from '../server/llm.server.types';
import type { ModelVendorId } from './vendors.registry';
import type { VChatFunctionIn, VChatMessageIn, VChatMessageOrFunctionCallOut, VChatMessageOut } from '~/modules/llms/llm.client';
import type { VChatFunctionIn, VChatMessageIn, VChatMessageOrFunctionCallOut, VChatMessageOut } from '../llm.client';


export interface IModelVendor<TSourceSetup = unknown, TAccess = unknown, TLLMOptions = unknown, TDLLM = DLLM<TSourceSetup, TLLMOptions>> {
@@ -29,7 +29,7 @@ export interface IModelVendor<TSourceSetup = unknown, TAccess = unknown, TLLMOpt

getTransportAccess(setup?: Partial<TSourceSetup>): TAccess;

getRateLimitDelay?(llm: TDLLM): number;
getRateLimitDelay?(llm: TDLLM, setup: Partial<TSourceSetup>): number;

rpcUpdateModelsQuery: (
access: TAccess,

@@ -26,6 +26,7 @@ export function OpenAILLMOptions(props: { llm: DLLM<unknown, LLMOptionsOpenAI> }
// derived state
const { id: llmId, maxOutputTokens, options } = props.llm;
const { llmResponseTokens, llmTemperature } = normalizeOpenAIOptions(options);
const { updateLLMOptions } = useModelsStore.getState();

// state (here because the initial state depends on props)
const [overheat, setOverheat] = React.useState(llmTemperature > 1);
@@ -34,9 +35,9 @@ export function OpenAILLMOptions(props: { llm: DLLM<unknown, LLMOptionsOpenAI> }

const handleOverheatToggle = React.useCallback(() => {
if (overheat && llmTemperature > 1)
useModelsStore.getState().updateLLMOptions(llmId, { llmTemperature: 1 });
updateLLMOptions(llmId, { llmTemperature: 1 });
setOverheat(!overheat);
}, [llmId, llmTemperature, overheat]);
}, [llmId, llmTemperature, overheat, updateLLMOptions]);


return <>
@@ -47,7 +48,7 @@ export function OpenAILLMOptions(props: { llm: DLLM<unknown, LLMOptionsOpenAI> }
min={0} max={overheat ? 2 : 1} step={0.1} defaultValue={0.5}
valueLabelDisplay='on'
value={llmTemperature}
onChange={value => useModelsStore.getState().updateLLMOptions(llmId, { llmTemperature: value })}
onChange={value => updateLLMOptions(llmId, { llmTemperature: value })}
endAdornment={showOverheatButton &&
<Tooltip title={overheat ? 'Disable LLM Overheating' : 'Increase Max LLM Temperature to 2'} sx={{ p: 1 }}>
<IconButton
@@ -66,7 +67,7 @@ export function OpenAILLMOptions(props: { llm: DLLM<unknown, LLMOptionsOpenAI> }
min={256} max={maxOutputTokens} step={256} defaultValue={1024}
valueLabelDisplay='on'
value={llmResponseTokens}
onChange={value => useModelsStore.getState().updateLLMOptions(llmId, { llmResponseTokens: value })}
onChange={value => updateLLMOptions(llmId, { llmResponseTokens: value })}
/>
) : (
<InlineError error='Max Output Tokens: Token computations are disabled because this model does not declare the context window size.' />

+10
-9
@@ -25,7 +25,7 @@ const HELICONE_OPENAI_HOST = 'oai.hconeai.com';
|
||||
export function OpenAISourceSetup(props: { sourceId: DModelSourceId }) {
|
||||
|
||||
// state
|
||||
const advanced = useToggleableBoolean();
|
||||
const advanced = useToggleableBoolean(!!props.sourceId?.includes('-'));
|
||||
|
||||
// external state
|
||||
const { source, sourceHasLLMs, access, updateSetup } =
|
||||
@@ -57,6 +57,15 @@ export function OpenAISourceSetup(props: { sourceId: DModelSourceId }) {
|
||||
placeholder='sk-...'
|
||||
/>
|
||||
|
||||
{advanced.on && <FormTextField
|
||||
title='API Endpoint'
|
||||
tooltip={`An OpenAI compatible endpoint to be used in place of 'api.openai.com'.\n\nCould be used for Helicone, Cloudflare, or other OpenAI compatible cloud or local services.\n\nExamples:\n - ${HELICONE_OPENAI_HOST}\n - localhost:1234`}
|
||||
description={<><Link level='body-sm' href='https://www.helicone.ai' target='_blank'>Helicone</Link>, <Link level='body-sm' href='https://developers.cloudflare.com/ai-gateway/' target='_blank'>Cloudflare</Link></>}
|
||||
placeholder={`e.g., ${HELICONE_OPENAI_HOST}, https://gateway.ai.cloudflare.com/v1/<ACCOUNT_TAG>/<GATEWAY_URL_SLUG>/openai, etc..`}
|
||||
value={oaiHost}
|
||||
onChange={text => updateSetup({ oaiHost: text })}
|
||||
/>}
|
||||
|
||||
{advanced.on && <FormTextField
|
||||
title='Organization ID'
|
||||
description={<Link level='body-sm' href={`${Brand.URIs.OpenRepo}/issues/63`} target='_blank'>What is this</Link>}
|
||||
@@ -65,14 +74,6 @@ export function OpenAISourceSetup(props: { sourceId: DModelSourceId }) {
|
||||
onChange={text => updateSetup({ oaiOrg: text })}
|
||||
/>}
|
||||
|
||||
{advanced.on && <FormTextField
|
||||
title='API Host'
|
||||
description={<><Link level='body-sm' href='https://www.helicone.ai' target='_blank'>Helicone</Link>, <Link level='body-sm' href='https://developers.cloudflare.com/ai-gateway/' target='_blank'>Cloudflare</Link></>}
|
||||
placeholder={`e.g., ${HELICONE_OPENAI_HOST} or https://gateway.ai.cloudflare.com/v1/<ACCOUNT_TAG>/<GATEWAY_URL_SLUG>/openai`}
|
||||
value={oaiHost}
|
||||
onChange={text => updateSetup({ oaiHost: text })}
|
||||
/>}
|
||||
|
||||
{advanced.on && <FormTextField
|
||||
title='Helicone Key'
|
||||
description={<>Generate <Link level='body-sm' href='https://www.helicone.ai/keys' target='_blank'>here</Link></>}
|
||||
|
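The new 'API Endpoint' field accepts either a bare host such as `localhost:1234` or a full URL. Downstream code therefore needs to normalize the user-supplied value before building requests. The helper below is a hypothetical sketch of that normalization, not part of this diff:

```typescript
// Hypothetical helper (not from the codebase): normalize a user-supplied
// OpenAI-compatible endpoint. Bare hosts get a scheme ('http' for localhost,
// 'https' otherwise) and trailing slashes are stripped.
function normalizeApiHost(host: string): string {
  let h = host.trim();
  if (!h) return '';
  if (!/^https?:\/\//i.test(h))
    h = (h.startsWith('localhost') ? 'http://' : 'https://') + h;
  return h.replace(/\/+$/, '');
}
```

With this, `normalizeApiHost('localhost:1234')` yields `http://localhost:1234`, while a Helicone-style host gets an `https://` prefix.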
+1 -1
@@ -39,7 +39,7 @@ export const ModelVendorOpenAI: IModelVendor<SourceSetupOpenAI, OpenAIAccessSchema
   name: 'OpenAI',
   rank: 10,
   location: 'cloud',
-  instanceLimit: 1,
+  instanceLimit: 5,
   hasBackendCap: () => backendCaps().hasLlmOpenAI,

   // components
@@ -0,0 +1,81 @@
+import * as React from 'react';
+
+import { Alert, Typography } from '@mui/joy';
+
+import { FormInputKey } from '~/common/components/forms/FormInputKey';
+import { FormSwitchControl } from '~/common/components/forms/FormSwitchControl';
+import { InlineError } from '~/common/components/InlineError';
+import { Link } from '~/common/components/Link';
+import { SetupFormRefetchButton } from '~/common/components/forms/SetupFormRefetchButton';
+import { useToggleableBoolean } from '~/common/util/useToggleableBoolean';
+
+import { DModelSourceId } from '../../store-llms';
+import { useLlmUpdateModels } from '../useLlmUpdateModels';
+import { useSourceSetup } from '../useSourceSetup';
+
+import { ModelVendorTogetherAI } from './togetherai.vendor';
+
+
+const TOGETHERAI_REG_LINK = 'https://api.together.xyz/settings/api-keys';
+
+
+export function TogetherAISourceSetup(props: { sourceId: DModelSourceId }) {
+
+  // state
+  const advanced = useToggleableBoolean();
+
+  // external state
+  const {
+    source, access,
+    partialSetup, sourceSetupValid, updateSetup,
+  } = useSourceSetup(props.sourceId, ModelVendorTogetherAI);
+
+  // derived state
+  const { oaiKey: togetherKey } = access;
+
+  // validate if url is a well formed proper url with zod
+  const needsUserKey = !ModelVendorTogetherAI.hasBackendCap?.();
+  const shallFetchSucceed = !needsUserKey || (!!togetherKey && sourceSetupValid);
+  const showKeyError = !!togetherKey && !sourceSetupValid;
+
+  // fetch models
+  const { isFetching, refetch, isError, error } =
+    useLlmUpdateModels(ModelVendorTogetherAI, access, shallFetchSucceed, source);
+
+
+  return <>
+
+    <FormInputKey
+      id='togetherai-key' label='Together AI Key'
+      rightLabel={<>{needsUserKey
+        ? !togetherKey && <Link level='body-sm' href={TOGETHERAI_REG_LINK} target='_blank'>request Key</Link>
+        : '✔️ already set in server'}
+      </>}
+      value={togetherKey} onChange={value => updateSetup({ togetherKey: value })}
+      required={needsUserKey} isError={showKeyError}
+      placeholder='...'
+    />
+
+    <Typography level='body-sm'>
+      The Together Inference platform allows you to run recent machine learning models with good speed and low
+      cost. See the <Link href='https://www.together.ai/' target='_blank'>Together AI</Link> website for more
+      information.
+    </Typography>
+
+    {advanced.on && <FormSwitchControl
+      title='Rate Limiter' on='Enabled' off='Disabled'
+      description={partialSetup?.togetherFreeTrial ? 'Free trial: 2 requests/2s' : 'Disabled'}
+      checked={partialSetup?.togetherFreeTrial ?? false}
+      onChange={on => updateSetup({ togetherFreeTrial: on })}
+    />}
+
+    {advanced.on && !!partialSetup?.togetherFreeTrial && <Alert variant='soft'>
+      Note: Please refresh the models list if you toggle the rate limiter.
+    </Alert>}
+
+    <SetupFormRefetchButton refetch={refetch} disabled={/*!shallFetchSucceed ||*/ isFetching} loading={isFetching} error={isError} advanced={advanced} />
+
+    {isError && <InlineError error={error} />}
+
+  </>;
+}
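The `shallFetchSucceed` gate in the new setup component is a small piece of boolean logic worth isolating: models are fetched either when the backend already holds a key, or when the user supplied one that passes the vendor's validation. A minimal sketch (function shape and names are illustrative, not from the codebase):

```typescript
// Sketch of the fetch-gating logic: a user key is only required when the
// backend has no server-side key configured; when it is required, it must
// also pass the vendor's validateSetup check.
function shallFetchSucceed(backendHasKey: boolean, userKey: string, keyValid: boolean): boolean {
  const needsUserKey = !backendHasKey;
  return !needsUserKey || (!!userKey && keyValid);
}
```

For instance, with a server-side key configured the fetch proceeds even with an empty user key; without one, an empty or invalid key blocks the fetch.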
@@ -0,0 +1,77 @@
+import { backendCaps } from '~/modules/backend/state-backend';
+
+import { TogetherIcon } from '~/common/components/icons/TogetherIcon';
+
+import type { IModelVendor } from '../IModelVendor';
+import type { OpenAIAccessSchema } from '../../server/openai/openai.router';
+
+import { LLMOptionsOpenAI, ModelVendorOpenAI } from '../openai/openai.vendor';
+import { OpenAILLMOptions } from '../openai/OpenAILLMOptions';
+
+import { TogetherAISourceSetup } from './TogetherAISourceSetup';
+
+
+export interface SourceSetupTogetherAI {
+  togetherKey: string;
+  togetherHost: string;
+  togetherFreeTrial: boolean;
+}
+
+export const ModelVendorTogetherAI: IModelVendor<SourceSetupTogetherAI, OpenAIAccessSchema, LLMOptionsOpenAI> = {
+  id: 'togetherai',
+  name: 'Together AI',
+  rank: 17,
+  location: 'cloud',
+  instanceLimit: 1,
+  hasBackendCap: () => backendCaps().hasLlmTogetherAI,
+
+  // components
+  Icon: TogetherIcon,
+  SourceSetupComponent: TogetherAISourceSetup,
+  LLMOptionsComponent: OpenAILLMOptions,
+
+  // functions
+  initializeSetup: () => ({
+    togetherKey: '',
+    togetherHost: 'https://api.together.xyz',
+    togetherFreeTrial: false,
+  }),
+  validateSetup: (setup) => {
+    return setup.togetherKey?.length >= 64;
+  },
+  getTransportAccess: (partialSetup) => ({
+    dialect: 'togetherai',
+    oaiKey: partialSetup?.togetherKey || '',
+    oaiOrg: '',
+    oaiHost: partialSetup?.togetherHost || '',
+    heliKey: '',
+    moderationCheck: false,
+  }),
+
+  // there is a delay for Together AI free-trial API calls
+  getRateLimitDelay: (_llm, partialSetup) => {
+    const now = Date.now();
+    const elapsed = now - nextGenerationTs;
+    const wait = partialSetup?.togetherFreeTrial
+      ? 1000 + 50 /* 1 second for a free-trial call, plus some safety margin */
+      : 50;
+
+    if (elapsed < wait) {
+      const delay = wait - elapsed;
+      nextGenerationTs = now + delay;
+      return delay;
+    } else {
+      nextGenerationTs = now;
+      return 0;
+    }
+  },
+
+
+  // OpenAI transport ('togetherai' dialect in 'access')
+  rpcUpdateModelsQuery: ModelVendorOpenAI.rpcUpdateModelsQuery,
+  rpcChatGenerateOrThrow: ModelVendorOpenAI.rpcChatGenerateOrThrow,
+  streamingChatGenerateOrThrow: ModelVendorOpenAI.streamingChatGenerateOrThrow,
+};
+
+// rate limit timestamp
+let nextGenerationTs = 0;
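The `getRateLimitDelay` function above throttles generations with a module-level timestamp. It can be factored into a standalone sketch for illustration; the factory shape and the injected clock are not from the codebase, but the delay arithmetic mirrors the diff (free-trial calls spaced ~1050 ms apart, others 50 ms):

```typescript
// Minimal sketch of the timestamp-based throttle used by getRateLimitDelay.
// The clock is injected so the behavior can be exercised deterministically.
function makeRateLimiter(now: () => number) {
  let nextGenerationTs = 0; // a module-level variable in the real code
  return function getDelayMs(freeTrial: boolean): number {
    const t = now();
    const elapsed = t - nextGenerationTs;
    const wait = freeTrial ? 1000 + 50 : 50; // 1 s + safety margin for free trial
    if (elapsed < wait) {
      const delay = wait - elapsed;
      nextGenerationTs = t + delay; // push the next slot past the wait window
      return delay;
    }
    nextGenerationTs = t;
    return 0;
  };
}
```

With a fake clock, the first call returns 0 and an immediate second free-trial call returns a positive delay, which the caller is expected to sleep for.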
+1
@@ -22,6 +22,7 @@ export function useSourceSetup<TSourceSetup, TAccess, TLLMOptions>(sourceId: DMo
   return {
     source,
+    partialSetup: source?.setup ?? null, // NOTE: do not use - prefer ACCESS; only used in 1 edge case now
     access,
     sourceHasLLMs: !!sourceLLMs.length,
     sourceSetupValid,
+5 -2
@@ -8,6 +8,7 @@ import { ModelVendorOllama } from './ollama/ollama.vendor';
 import { ModelVendorOoobabooga } from './oobabooga/oobabooga.vendor';
 import { ModelVendorOpenAI } from './openai/openai.vendor';
 import { ModelVendorOpenRouter } from './openrouter/openrouter.vendor';
+import { ModelVendorTogetherAI } from '~/modules/llms/vendors/togetherai/togetherai.vendor';

 import type { IModelVendor } from './IModelVendor';
 import { DLLMId, DModelSource, DModelSourceId, findLLMOrThrow, findSourceOrThrow } from '../store-llms';
@@ -22,7 +23,8 @@ export type ModelVendorId =
   | 'ollama'
   | 'oobabooga'
   | 'openai'
-  | 'openrouter';
+  | 'openrouter'
+  | 'togetherai';

 /** Global: Vendor Instances Registry **/
 const MODEL_VENDOR_REGISTRY: Record<ModelVendorId, IModelVendor> = {
@@ -36,6 +38,7 @@ const MODEL_VENDOR_REGISTRY: Record<ModelVendorId, IModelVendor> = {
   oobabooga: ModelVendorOoobabooga,
   openai: ModelVendorOpenAI,
   openrouter: ModelVendorOpenRouter,
+  togetherai: ModelVendorTogetherAI,
 } as Record<string, IModelVendor>;

 const MODEL_VENDOR_DEFAULT: ModelVendorId = 'openai';
@@ -83,7 +86,7 @@ export function createModelSourceForVendor(vendorId: ModelVendorId, otherSources
   // create the source
   return {
     id: sourceId,
-    label: vendor.name + (sourceN > 0 ? ` #${sourceN}` : ''),
+    label: vendor.name, // NOTE: will be (re/) numbered upon adding to the store
     vId: vendorId,
     setup: vendor.initializeSetup?.() || {},
   };
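Registering the new vendor touches three places because the registry is keyed by a string-literal union: the `ModelVendorId` type, the `Record` of instances, and the import. A trimmed-down sketch of the pattern (vendor fields and ranks are illustrative, not the full `IModelVendor` shape):

```typescript
// Sketch of the vendor-registry pattern the diff extends: keying the Record
// by the ModelVendorId union documents that every id has a registered entry.
type ModelVendorId = 'openai' | 'openrouter' | 'togetherai';

interface VendorStub {
  id: ModelVendorId;
  name: string;
  rank: number; // display ordering; values here are illustrative
}

const MODEL_VENDOR_REGISTRY: Record<ModelVendorId, VendorStub> = {
  openai: { id: 'openai', name: 'OpenAI', rank: 10 },
  openrouter: { id: 'openrouter', name: 'OpenRouter', rank: 20 },
  togetherai: { id: 'togetherai', name: 'Together AI', rank: 17 },
};

function findVendorById(vendorId?: ModelVendorId): VendorStub | null {
  return vendorId ? MODEL_VENDOR_REGISTRY[vendorId] ?? null : null;
}
```

Lookups then resolve the new id like any other, and a missing id degrades to `null` rather than throwing.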
@@ -0,0 +1,52 @@
+// Copyright (c) 2023-2024 Enrico Ros
+// This subsystem is responsible for fetching the transcript of a YouTube video.
+// It is used by the Big-AGI Persona Creator to create a character sheet.
+
+import * as React from 'react';
+
+import { apiQuery } from '~/common/util/trpc.client';
+
+
+export interface YTVideoTranscript {
+  title: string;
+  transcript: string;
+  thumbnailUrl: string;
+}
+
+export function useYouTubeTranscript(videoID: string | null, onNewTranscript: (transcript: YTVideoTranscript) => void) {
+
+  // state
+  const [transcript, setTranscript] = React.useState<YTVideoTranscript | null>(null);
+
+  // data
+  const { data, isFetching, isError, error } = apiQuery.youtube.getTranscript.useQuery({
+    videoId: videoID || '',
+  }, {
+    enabled: !!videoID,
+    refetchOnWindowFocus: false,
+    staleTime: Infinity,
+  });
+
+
+  // update the transcript when the underlying data changes
+  React.useEffect(() => {
+    if (!data) {
+      // setTranscript(null);
+      return;
+    }
+    const transcript = {
+      title: data.videoTitle,
+      transcript: data.transcript,
+      thumbnailUrl: data.thumbnailUrl,
+    };
+    setTranscript(transcript);
+    onNewTranscript(transcript);
+  }, [data, onNewTranscript]);
+
+
+  return {
+    transcript,
+    isFetching,
+    isError, error,
+  };
+}
@@ -1,4 +1,6 @@
-// noinspection ExceptionCaughtLocallyJS
+// Copyright (c) 2023-2024 Enrico Ros
+// This subsystem is responsible for fetching the transcript of a YouTube video.
+// It is used by the Big-AGI Persona Creator to create a character sheet.

 import { TRPCError } from '@trpc/server';
 import { z } from 'zod';
@@ -29,7 +31,7 @@ const youtubeTranscriptionSchema = z.object({
 });


-export const ytPersonaRouter = createTRPCRouter({
+export const youtubeRouter = createTRPCRouter({

   /**
    * Get the transcript for a YouTube video ID
@@ -8,7 +8,7 @@ import { llmGeminiRouter } from '~/modules/llms/server/gemini/gemini.router';
 import { llmOllamaRouter } from '~/modules/llms/server/ollama/ollama.router';
 import { llmOpenAIRouter } from '~/modules/llms/server/openai/openai.router';
 import { prodiaRouter } from '~/modules/t2i/prodia/prodia.router';
-import { ytPersonaRouter } from '../../apps/personas/ytpersona.router';
+import { youtubeRouter } from '~/modules/youtube/youtube.router';

 /**
  * Primary rooter, and will be sitting on an Edge Runtime.
@@ -22,7 +22,7 @@ export const appRouterEdge = createTRPCRouter({
   llmOllama: llmOllamaRouter,
   llmOpenAI: llmOpenAIRouter,
   prodia: prodiaRouter,
-  ytpersona: ytPersonaRouter,
+  youtube: youtubeRouter,
 });

 // export type definition of API
@@ -33,6 +33,9 @@ export const env = createEnv({
     // LLM: OpenRouter
     OPENROUTER_API_KEY: z.string().optional(),

+    // LLM: Together AI
+    TOGETHERAI_API_KEY: z.string().optional(),
+
     // Helicone - works on both OpenAI and Anthropic vendors
     HELICONE_API_KEY: z.string().optional(),
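Declaring the new variable as `z.string().optional()` means an unset `TOGETHERAI_API_KEY` is simply `undefined` at startup rather than a validation failure. A hypothetical zod-free illustration of that contract (this helper is not part of the codebase; treating the empty string as unset is an assumption matching typical env handling):

```typescript
// Hypothetical sketch: read an optional env var, mapping absent or empty
// values to undefined instead of failing.
function readOptionalEnv(name: string, env: Record<string, string | undefined>): string | undefined {
  const value = env[name];
  return value ? value : undefined;
}
```

Server code can then gate the Together AI backend capability on whether the value is defined.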