feat: Discover article, localization, suggestions

- Article: title + link (truncated), title in the URL, articleTitle on Message
- Localized Sources, Research Progress, Answer, step titles, formingAnswer
- Suggestions: prompt without a hard-coded example, more variety, "What else to ask" label
- embeddedTranslations, countryToLocale, locale instruction for the LLM

Co-authored-by: Cursor <cursoragent@cursor.com>
This commit is contained in: home
2026-02-21 00:37:06 +03:00
parent f4d945a2b5
commit 3fa83bc605
68 changed files with 2301 additions and 345 deletions

.env.example

@@ -0,0 +1,25 @@
# Server config. Copy to .env (in the repo root or in apps/frontend/) or set these in your deploy panel.

# === LLM (exactly one of the options below is required) ===
# Option 1: Ollama (one server: one URL; for two servers, also set OLLAMA_EMBEDDING_BASE_URL)
LLM_PROVIDER=ollama
OLLAMA_BASE_URL=http://localhost:11434
# OLLAMA_EMBEDDING_BASE_URL=http://embedding-host:11434
LLM_CHAT_MODEL=ministral-3:3b
LLM_EMBEDDING_MODEL=nomic-embed-text

# Option 2: Timeweb Cloud AI (comment out the Ollama block above)
# LLM_PROVIDER=timeweb
# TIMEWEB_API_BASE_URL=https://api.timeweb.cloud
# TIMEWEB_AGENT_ACCESS_ID=
# TIMEWEB_API_KEY=
# LLM_CHAT_MODEL=gpt-4
# TIMEWEB_X_PROXY_SOURCE=

# === Optional ===
# SearXNG: required for web search. Public instances often return 429 (rate limited).
# Run your own: docker run -d -p 4000:8080 searxng/searxng
# SEARXNG_API_URL=http://localhost:4000
# SEARXNG_FALLBACK_URL= # comma-separated; used when the primary is unreachable
# DATA_DIR=./data
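The `SEARXNG_FALLBACK_URL` comment above describes an ordered list of backup instances. A minimal sketch of how the primary URL and the comma-separated fallbacks could be combined into one candidate list (hypothetical helper, not code from this commit):

```typescript
// Hypothetical sketch: build an ordered list of SearXNG instances to try,
// primary first, then each comma-separated fallback.
function searxngCandidates(env: Record<string, string | undefined>): string[] {
  const primary = env.SEARXNG_API_URL ? [env.SEARXNG_API_URL] : [];
  const fallbacks = (env.SEARXNG_FALLBACK_URL ?? '')
    .split(',')
    .map((u) => u.trim())
    .filter(Boolean); // drop empty entries from trailing commas
  return [...primary, ...fallbacks];
}
```

A caller would iterate the list and move on whenever a request fails or returns 429.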

AUDIT-PERFORMANCE.md

@@ -0,0 +1,69 @@
# GooSeek build performance audit

## Root cause of slow loading

### 1. Explicit use of Webpack instead of Turbopack (CRITICAL)

**Problem:** The `dev` script in `package.json` uses `next dev --webpack`, which disables Turbopack.

**Impact:**
- Turbopack: cold start ~1.1s, HMR ~90ms
- Webpack: cold start ~4.2s, HMR ~800ms
- **Difference: up to 4x slower startup, up to 10x slower updates**

**Fix:** Drop the `--webpack` flag; Next.js 16 uses Turbopack by default.

---

### 2. Heavy dependencies (size in node_modules)

| Package            | Size  | Used in                                      | Loaded              |
|--------------------|-------|----------------------------------------------|---------------------|
| pdf-parse          | 83 MB | uploads/manager.ts (PDF parsing)             | Server, API         |
| @huggingface       | 49 MB | TransformersProvider (embeddings in browser) | Dynamic import      |
| jspdf              | 31 MB | Navbar.tsx (chat export to PDF)              | Client, on export   |
| mathjs             | 16 MB | calculationWidget (calculation agent)        | Server, API         |
| better-sqlite3     | 12 MB | drizzle/db (native addon)                    | Server              |
| lightweight-charts | 3 MB  | Stock.tsx (stock charts)                     | Client, with widget |

---

### 3. Import chain on layout load

```
layout.tsx
  → configManager (config/index.ts)
    → getModelProvidersUIConfigSection (models/providers/index.ts)
      ALL 9 providers: OpenAI, Ollama, Timeweb, Gemini, Transformers, Groq, Lemonade, Anthropic, LMStudio
```

**Problem:** Even in env-only mode (Timeweb), every provider is loaded on each request to build the UI config, and Webpack/Turbopack analyzes the whole dependency graph.

**In env-only mode:** `getModelProvidersUIConfigSection` is never called, but the import still runs when the config module is loaded.

---

### 4. Native modules
- `better-sqlite3` is compiled at install time, slowing down `npm install`
- `@napi-rs/canvas` (optional) is required by pdf-parse

---

## Optimization recommendations

### Quick wins (do now)
1. **Drop `--webpack`**: done (removed from `build`; `dev` was already without it)
2. **Dynamic import for jsPDF**: already implemented in Navbar.tsx
3. **Dynamic import for Stock (lightweight-charts)**: already implemented in Renderer.tsx

### Medium priority
4. **Lazy provider loading**: extracted into `config/providersLoader.ts`; providers are loaded only when not in env-only mode
5. **Turbopack for production builds**: Next.js 16 uses Turbopack by default (`next build` with no flags)

### Long term
6. **pdf-parse: dynamic import**: done. pdf-parse (83 MB) is now loaded only when a PDF is actually parsed, not at startup. A worker via child_process was not implemented because of Turbopack limitations (static analysis of fork paths).
7. **Migrate better-sqlite3 → libsql**: deferred; `src/lib/db/migrate.ts` must be rewritten first (see CONTINUE.md)
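The lazy-loading pattern behind items 2, 3, 4 and 6 can be sketched in a few lines (a generic sketch; `node:path` stands in for a heavy dependency such as pdf-parse or jspdf, and the names are illustrative, not the repo's actual call sites):

```typescript
// Generic lazy loader with caching: the heavy module is imported on first
// use instead of at server start, and the in-flight promise is reused.
function lazy<T>(loader: () => Promise<T>): () => Promise<T> {
  let cached: Promise<T> | undefined;
  return () => (cached ??= loader());
}

// node:path stands in here for a heavy dependency like pdf-parse.
const loadPath = lazy(async () => await import('node:path'));

async function joinLazy(a: string, b: string): Promise<string> {
  const path = await loadPath();
  return path.posix.join(a, b); // posix join for a deterministic separator
}
```

Only the first call pays the import cost; later calls reuse the cached promise, which is the effect the audit relies on for pdf-parse and jsPDF.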

CONTINUE.md

@@ -0,0 +1,23 @@
# Unfinished work: start here

## Task (original request)
Continue the optimizations from AUDIT-PERFORMANCE.md and AGENTS.MD.

## Done
- Dynamic import for pdf-parse in `manager.ts`: loaded only when a PDF is parsed
- AUDIT-PERFORMANCE.md updated with completed items marked

## Remaining (in priority order)
1. **Migrate better-sqlite3 → libsql**: top priority per the audit (deferred).
   - **Blocker:** `src/lib/db/migrate.ts` uses better-sqlite3 (sync API). It needs a full rewrite for async libsql before db/index.ts can be migrated.
   - Files: `src/lib/db/index.ts`, `src/lib/db/migrate.ts`, `drizzle.config.ts`, `package.json`
   - Steps: rewrite migrate.ts on top of createClient from @libsql/client, then swap it into index.ts.
2. **Optional: pdf-parse in a worker**: deferred because of Turbopack.
   - The current approach (dynamic import) already takes the load off startup.
   - A worker via child_process needs a separate .cjs file at a known path; Turbopack traces fork() and fails. Options: build the uploads route with webpack, or move parsing into a separate Node service.

## Context for picking this up
- Changed files: `apps/frontend/src/lib/uploads/manager.ts`
- db is used in: `api/chat`, `api/chats`, `api/chats/[id]`, `lib/agents/search`


@@ -1,6 +1,6 @@
 /// <reference types="next" />
 /// <reference types="next/image-types/global" />
-import "./.next/dev/types/routes.d.ts";
+import "./.next/types/routes.d.ts";
 // NOTE: This file should not be edited
 // see https://nextjs.org/docs/app/api-reference/config/typescript for more information.


@@ -1,3 +1,13 @@
+import path from 'node:path';
+import { fileURLToPath } from 'node:url';
+import { createRequire } from 'node:module';
+
+// Load .env from the monorepo root (apps/frontend -> root)
+const __dirname = path.dirname(fileURLToPath(import.meta.url));
+const rootDir = path.resolve(__dirname, '..', '..');
+const require = createRequire(import.meta.url);
+require('dotenv').config({ path: path.join(rootDir, '.env') });
+
 import pkg from './package.json' with { type: 'json' };

 /** @type {import('next').NextConfig} */


@@ -3,8 +3,8 @@
   "version": "1.12.1",
   "private": true,
   "scripts": {
-    "dev": "next dev --webpack",
-    "build": "next build --webpack",
+    "dev": "next dev",
+    "build": "next build",
     "start": "next start",
     "lint": "next lint"
   },
@@ -14,12 +14,12 @@
     "@headlessui/tailwindcss": "^0.2.2",
     "@huggingface/transformers": "^3.8.1",
     "@icons-pack/react-simple-icons": "^12.3.0",
-    "better-sqlite3": "^11.9.1",
     "@phosphor-icons/react": "^2.1.10",
     "@radix-ui/react-tooltip": "^1.2.8",
     "@tailwindcss/typography": "^0.5.12",
     "@toolsycc/json-repair": "^0.1.22",
     "axios": "^1.8.3",
+    "better-sqlite3": "^11.9.1",
     "clsx": "^2.1.0",
     "drizzle-orm": "^0.40.1",
     "js-tiktoken": "^1.0.21",
@@ -58,8 +58,11 @@
     "@types/react": "^18",
     "@types/react-dom": "^18",
     "@types/react-syntax-highlighter": "^15.5.13",
+    "@types/trusted-types": "^2.0.7",
     "@types/turndown": "^5.0.6",
+    "@types/unist": "^3.0.3",
     "autoprefixer": "^10.0.1",
+    "dotenv": "^16.4.5",
     "drizzle-kit": "^0.30.5",
     "eslint": "^8",
     "eslint-config-next": "14.1.4",


@@ -24,13 +24,14 @@ const chatModelSchema: z.ZodType<ModelWithProvider> = z.object({
   key: z.string({ message: 'Chat model key must be provided' }),
 });

-const embeddingModelSchema: z.ZodType<ModelWithProvider> = z.object({
-  providerId: z.string({
-    message: 'Embedding model provider id must be provided',
-  }),
-  key: z.string({ message: 'Embedding model key must be provided' }),
+const embeddingModelSchema = z.object({
+  providerId: z.string().optional().default(''),
+  key: z.string().optional().default(''),
 });
+
+const LOCALIZATION_SERVICE_URL =
+  process.env.LOCALIZATION_SERVICE_URL ?? 'http://localhost:4003';

 const bodySchema = z.object({
   message: messageSchema,
   optimizationMode: z.enum(['speed', 'balanced', 'quality'], {
@@ -43,8 +44,10 @@ const bodySchema = z.object({
     .default([]),
   files: z.array(z.string()).optional().default([]),
   chatModel: chatModelSchema,
-  embeddingModel: embeddingModelSchema,
+  embeddingModel: embeddingModelSchema.optional().default({ providerId: '', key: '' }),
   systemInstructions: z.string().nullable().optional().default(''),
+  /** locale (ru, en, etc.): the answer language; derived from geo when not provided */
+  locale: z.string().optional(),
 });

 type Body = z.infer<typeof bodySchema>;
@@ -116,6 +119,27 @@ export const POST = async (req: Request) => {
   const body = parseBody.data as Body;
   const { message } = body;

+  let locale = body.locale;
+  if (!locale) {
+    try {
+      const localeRes = await fetch(`${LOCALIZATION_SERVICE_URL}/api/locale`, {
+        headers: {
+          'x-forwarded-for': req.headers.get('x-forwarded-for') ?? '',
+          'x-real-ip': req.headers.get('x-real-ip') ?? '',
+          'user-agent': req.headers.get('user-agent') ?? '',
+          'accept-language': req.headers.get('accept-language') ?? '',
+        },
+      });
+      if (localeRes.ok) {
+        const data = (await localeRes.json()) as { locale?: string };
+        locale = data.locale ?? undefined;
+      }
+    } catch {
+      /* localization-service unavailable */
+    }
+  }
+  const effectiveLocale = locale ?? 'en';
+
   if (message.content === '') {
     return Response.json(
       {
@@ -127,13 +151,18 @@ export const POST = async (req: Request) => {
   const registry = new ModelRegistry();

-  const [llm, embedding] = await Promise.all([
-    registry.loadChatModel(body.chatModel.providerId, body.chatModel.key),
-    registry.loadEmbeddingModel(
+  const llm = await registry.loadChatModel(
+    body.chatModel.providerId,
+    body.chatModel.key,
+  );
+
+  let embedding: Awaited<ReturnType<ModelRegistry['loadEmbeddingModel']>> | null = null;
+  if (body.embeddingModel?.providerId) {
+    embedding = await registry.loadEmbeddingModel(
       body.embeddingModel.providerId,
       body.embeddingModel.key,
-    ),
-  ]);
+    );
+  }

   const history: ChatTurnMessage[] = body.history.map((msg) => {
     if (msg[0] === 'human') {
@@ -210,7 +239,8 @@ export const POST = async (req: Request) => {
     }
   });

-  agent.searchAsync(session, {
+  agent
+    .searchAsync(session, {
       chatHistory: history,
       followUp: message.content,
       chatId: body.message.chatId,
@@ -222,7 +252,16 @@ export const POST = async (req: Request) => {
       mode: body.optimizationMode,
       fileIds: body.files,
       systemInstructions: body.systemInstructions || 'None',
+      locale: effectiveLocale,
     },
-  });
+    })
+    .catch((err: Error) => {
+      console.error('[Chat] searchAsync failed:', err);
+      session.emit('error', {
+        data:
+          err?.message ||
+          'Ошибка при поиске. Проверьте настройки SearXNG или LLM в Settings.',
+      });
+    });

   ensureChatExists({
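The fallback chain this route implements (explicit `locale`, then the localization-service, then `'en'`) can be sketched without the service. `parseAcceptLanguage` below is a hypothetical stand-in for the service's geo/header logic, not code from this commit:

```typescript
// Hypothetical sketch of the locale fallback chain used in the chat route:
// explicit locale -> derived from Accept-Language -> 'en'.
function parseAcceptLanguage(header: string | null): string | undefined {
  if (!header) return undefined;
  // "ru-RU,ru;q=0.9,en;q=0.8" -> "ru"
  const first = header.split(',')[0]?.trim();
  if (!first) return undefined;
  return first.split('-')[0].toLowerCase() || undefined;
}

function resolveLocale(
  explicit: string | undefined,
  acceptLanguage: string | null,
): string {
  return explicit ?? parseAcceptLanguage(acceptLanguage) ?? 'en';
}
```

An explicit `locale` from the client always wins; the header is only consulted when the field is absent, mirroring the `effectiveLocale` logic above.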


@@ -1,4 +1,5 @@
 import configManager from '@/lib/config';
+import { isEnvOnlyMode } from '@/lib/config/serverRegistry';
 import ModelRegistry from '@/lib/models/registry';
 import { NextRequest, NextResponse } from 'next/server';
 import { ConfigModelProvider } from '@/lib/config/types';
@@ -32,6 +33,7 @@ export const GET = async (req: NextRequest) => {
     return NextResponse.json({
       values,
       fields,
+      envOnlyMode: isEnvOnlyMode(),
     });
   } catch (err) {
     console.error('Error in getting config: ', err);


@@ -1,4 +1,4 @@
-import { searchSearxng } from '@/lib/searxng';
+import { searchSearxng, type SearxngSearchResult } from '@/lib/searxng';
 import configManager from '@/lib/config';
 import { getSearxngURL } from '@/lib/config/serverRegistry';
@@ -206,7 +206,7 @@ export const GET = async (req: Request) => {
   );
   const settled = await Promise.allSettled(searchPromises);
   const allResults = settled
-    .filter((r): r is PromiseFulfilledResult<{ url?: string; title?: string }[]> => r.status === 'fulfilled')
+    .filter((r): r is PromiseFulfilledResult<SearxngSearchResult[]> => r.status === 'fulfilled')
     .flatMap((r) => r.value);

   data = allResults


@@ -1,19 +1,40 @@
+import {
+  getConfiguredModelProviders,
+  isEnvOnlyMode,
+} from '@/lib/config/serverRegistry';
 import ModelRegistry from '@/lib/models/registry';
 import { NextRequest } from 'next/server';

-export const GET = async (req: Request) => {
+export const GET = async () => {
   try {
-    const registry = new ModelRegistry();
+    const envOnlyMode = isEnvOnlyMode();
+    const configuredProviders = getConfiguredModelProviders();
+
+    const registry = new ModelRegistry();
     const activeProviders = await registry.getActiveProviders();

     const filteredProviders = activeProviders.filter((p) => {
       return !p.chatModels.some((m) => m.key === 'error');
     });

+    // env-only: if a provider has an empty chatModels list, substitute the model from the config
+    const providers =
+      envOnlyMode && configuredProviders.length > 0
+        ? filteredProviders.map((p) => {
+            if (p.chatModels.length > 0) return p;
+            const configProvider = configuredProviders.find((c) => c.id === p.id);
+            const fallbackChat =
+              (configProvider?.chatModels?.length ?? 0) > 0
+                ? configProvider!.chatModels
+                : [{ key: 'gpt-4', name: 'gpt-4' }];
+            return { ...p, chatModels: fallbackChat };
+          })
+        : filteredProviders;
+
     return Response.json(
       {
-        providers: filteredProviders,
+        providers,
+        envOnlyMode: envOnlyMode,
       },
       {
         status: 200,
@@ -33,6 +54,10 @@ export const GET = async (req: Request) => {
 };

 export const POST = async (req: NextRequest) => {
+  if (isEnvOnlyMode()) {
+    return Response.json({ message: 'Not available.' }, { status: 405 });
+  }
+
   try {
     const body = await req.json();
     const { type, name, config } = body;
@@ -49,7 +74,6 @@ export const POST = async (req: NextRequest) => {
     }

     const registry = new ModelRegistry();
     const newProvider = await registry.addProvider(type, name, config);
     return Response.json(


@@ -17,7 +17,8 @@ export const POST = async (
   const writer = responseStream.writable.getWriter();
   const encoder = new TextEncoder();

-  const disconnect = session.subscribe((event, data) => {
+  let unsub: () => void = () => {};
+  unsub = session.subscribe((event, data) => {
     if (event === 'data') {
       if (data.type === 'block') {
         writer.write(
@@ -56,7 +57,7 @@ export const POST = async (
         ),
       );
       writer.close();
-      disconnect();
+      setImmediate(() => unsub());
     } else if (event === 'error') {
       writer.write(
         encoder.encode(
@@ -67,12 +68,12 @@ export const POST = async (
         ),
       );
       writer.close();
-      disconnect();
+      setImmediate(() => unsub());
     }
   });

   req.signal.addEventListener('abort', () => {
-    disconnect();
+    unsub();
     writer.close();
   });


@@ -5,6 +5,7 @@ import { ModelWithProvider } from '@/lib/models/types';
 interface SuggestionsGenerationBody {
   chatHistory: any[];
   chatModel: ModelWithProvider;
+  locale?: string;
 }

 export const POST = async (req: Request) => {
@@ -24,6 +25,7 @@ export const POST = async (req: Request) => {
         role: role === 'human' ? 'user' : 'assistant',
         content,
       })),
+      locale: body.locale,
     },
     llm,
   );


@@ -1,4 +1,8 @@
 import { NextRequest, NextResponse } from 'next/server';
+import {
+  getEmbeddedTranslations,
+  translationLocaleFor,
+} from '@/lib/localization/embeddedTranslations';

 const LOCALIZATION_SERVICE_URL =
   process.env.LOCALIZATION_SERVICE_URL ?? 'http://localhost:4003';
@@ -7,31 +11,27 @@
   req: NextRequest,
   { params }: { params: Promise<{ locale: string }> },
 ) {
-  try {
-    const { locale } = await params;
-    if (!locale) {
-      return NextResponse.json(
-        { error: 'Locale is required' },
-        { status: 400 },
-      );
-    }
+  const resolved = await params;
+  const localeParam = resolved?.locale ?? '';
+  if (!localeParam) {
+    return NextResponse.json(
+      { error: 'Locale is required' },
+      { status: 400 },
+    );
+  }

+  try {
+    const fetchLocale = translationLocaleFor(localeParam);
     const res = await fetch(
-      `${LOCALIZATION_SERVICE_URL}/api/translations/${encodeURIComponent(locale)}`,
+      `${LOCALIZATION_SERVICE_URL}/api/translations/${encodeURIComponent(fetchLocale)}`,
     );
-    if (!res.ok) {
-      return NextResponse.json(
-        { error: 'Failed to fetch translations' },
-        { status: 502 },
-      );
-    }
+    if (res.ok) {
       const data = await res.json();
       return NextResponse.json(data);
+    }
   } catch {
-    return NextResponse.json(
-      { error: 'Failed to fetch translations' },
-      { status: 500 },
-    );
+    /* localization-service unavailable */
   }
+
+  return NextResponse.json(getEmbeddedTranslations(localeParam));
 }


@@ -0,0 +1,4 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 32 32">
<rect width="32" height="32" rx="6" fill="#EA580C"/>
<text x="16" y="22" font-family="Arial" font-size="18" font-weight="bold" fill="white" text-anchor="middle">G</text>
</svg>



@@ -9,8 +9,10 @@ import { Toaster } from 'sonner';
 import ThemeProvider from '@/components/theme/Provider';
 import { LocalizationProvider } from '@/lib/localization/context';
 import configManager from '@/lib/config';
+import { isEnvOnlyMode } from '@/lib/config/serverRegistry';
 import SetupWizard from '@/components/Setup/SetupWizard';
 import { ChatProvider } from '@/lib/hooks/useChat';
+import { ClientOnly } from '@/components/ClientOnly';

 const roboto = Roboto({
   weight: ['300', '400', '500', '700'],
@@ -39,6 +41,13 @@ export default function RootLayout({
           <ThemeProvider>
             <LocalizationProvider>
               {setupComplete ? (
+                <ClientOnly
+                  fallback={
+                    <div className="flex min-h-screen items-center justify-center bg-light-primary dark:bg-dark-primary">
+                      <div className="h-8 w-8 animate-spin rounded-full border-2 border-[#EA580C] border-t-transparent" />
+                    </div>
+                  }
+                >
                 <ChatProvider>
                   <Sidebar>{children}</Sidebar>
                   <Toaster
@@ -46,13 +55,17 @@ export default function RootLayout({
                       unstyled: true,
                       classNames: {
                         toast:
-                          'bg-light-secondary dark:bg-dark-secondary dark:text-white/70 text-black-70 rounded-lg p-4 flex flex-row items-center space-x-2',
+                          'bg-light-secondary dark:bg-dark-secondary dark:text-white/70 text-black/70 rounded-lg p-4 flex flex-row items-center space-x-2',
                       },
                     }}
                   />
                 </ChatProvider>
+                </ClientOnly>
               ) : (
-                <SetupWizard configSections={configSections} />
+                <SetupWizard
+                  configSections={configSections}
+                  envConfigRequired={!isEnvOnlyMode()}
+                />
               )}
             </LocalizationProvider>
           </ThemeProvider>


@@ -12,6 +12,7 @@ import { motion, AnimatePresence } from 'framer-motion';
 import { useEffect, useState } from 'react';
 import { ResearchBlock, ResearchBlockSubStep } from '@/lib/types';
 import { useChat } from '@/lib/hooks/useChat';
+import { useTranslation } from '@/lib/localization/context';

 const getStepIcon = (step: ResearchBlockSubStep) => {
   if (step.type === 'reasoning') {
@@ -33,22 +34,27 @@ const getStepIcon = (step: ResearchBlockSubStep) => {
 const getStepTitle = (
   step: ResearchBlockSubStep,
   isStreaming: boolean,
+  t: (key: string) => string,
 ): string => {
   if (step.type === 'reasoning') {
-    return isStreaming && !step.reasoning ? 'Thinking...' : 'Thinking';
+    return isStreaming && !step.reasoning ? t('chat.brainstorming') : t('chat.thinking');
   } else if (step.type === 'searching') {
-    return `Searching ${step.searching.length} ${step.searching.length === 1 ? 'query' : 'queries'}`;
+    const n = step.searching.length;
+    return t('chat.searchingQueries').replace('{count}', String(n)).replace('{plural}', n === 1 ? t('chat.query') : t('chat.queries'));
   } else if (step.type === 'search_results') {
-    return `Found ${step.reading.length} ${step.reading.length === 1 ? 'result' : 'results'}`;
+    const n = step.reading.length;
+    return t('chat.foundResults').replace('{count}', String(n)).replace('{plural}', n === 1 ? t('chat.result') : t('chat.results'));
   } else if (step.type === 'reading') {
-    return `Reading ${step.reading.length} ${step.reading.length === 1 ? 'source' : 'sources'}`;
+    const n = step.reading.length;
+    return t('chat.readingSources').replace('{count}', String(n)).replace('{plural}', n === 1 ? t('chat.source') : t('chat.sources'));
   } else if (step.type === 'upload_searching') {
-    return 'Scanning your uploaded documents';
+    return t('chat.scanningDocs');
   } else if (step.type === 'upload_search_results') {
-    return `Reading ${step.results.length} ${step.results.length === 1 ? 'document' : 'documents'}`;
+    const n = step.results.length;
+    return t('chat.readingDocs').replace('{count}', String(n)).replace('{plural}', n === 1 ? t('chat.document') : t('chat.documents'));
   }
-  return 'Processing';
+  return t('chat.processing');
 };

 const AssistantSteps = ({
@@ -60,6 +66,7 @@ const AssistantSteps = ({
   status: 'answering' | 'completed' | 'error';
   isLast: boolean;
 }) => {
+  const { t } = useTranslation();
   const [isExpanded, setIsExpanded] = useState(
     isLast && status === 'answering' ? true : false,
   );
@@ -84,8 +91,8 @@ const AssistantSteps = ({
           <div className="flex items-center gap-2">
             <Brain className="w-4 h-4 text-black dark:text-white" />
             <span className="text-sm font-medium text-black dark:text-white">
-              Research Progress ({block.data.subSteps.length}{' '}
-              {block.data.subSteps.length === 1 ? 'step' : 'steps'})
+              {t('chat.researchProgress')} ({block.data.subSteps.length}{' '}
+              {block.data.subSteps.length === 1 ? t('chat.step') : t('chat.steps')})
             </span>
           </div>
           {isExpanded ? (
@@ -130,7 +137,7 @@ const AssistantSteps = ({
                   <div className="flex-1 pb-1">
                     <span className="text-sm font-medium text-black dark:text-white">
-                      {getStepTitle(step, isStreaming)}
+                      {getStepTitle(step, isStreaming, t)}
                     </span>
                     {step.type === 'reasoning' && (
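The `{count}`/`{plural}` substitution used repeatedly in `getStepTitle` above can be factored into a small helper (a sketch only; the component keeps the inline `.replace` chain, and these template strings are illustrative):

```typescript
// Sketch of the placeholder substitution used by getStepTitle:
// "{count}" is filled with the item count, "{plural}" with the
// singular or plural noun depending on that count.
function formatStep(
  template: string,
  count: number,
  singular: string,
  plural: string,
): string {
  return template
    .replace('{count}', String(count))
    .replace('{plural}', count === 1 ? singular : plural);
}
```

With translated templates like `'Searching {count} {plural}'`, the same helper works for every step type; only the noun pair changes per translation key.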


@@ -20,6 +20,8 @@ export interface Message extends BaseMessage {
   query: string;
   responseBlocks: Block[];
   status: 'answering' | 'completed' | 'error';
+  /** Article title when navigating from Discover (Summary) */
+  articleTitle?: string;
 }

 export interface File {


@@ -0,0 +1,25 @@
'use client';
import { useEffect, useState } from 'react';
/**
 * Defers rendering of children until the component has mounted on the client.
 * Avoids the "Can't perform a React state update on a component that hasn't mounted yet"
 * error during hydration with next-themes and other providers.
 */
export function ClientOnly({
children,
fallback = null,
}: {
children: React.ReactNode;
fallback?: React.ReactNode;
}) {
const [mounted, setMounted] = useState(false);
useEffect(() => {
setMounted(true);
}, []);
if (!mounted) return fallback;
return <>{children}</>;
}


@@ -9,7 +9,7 @@ const MajorNewsCard = ({
   isLeft?: boolean;
 }) => (
   <Link
-    href={`/?q=Summary: ${item.url}`}
+    href={`/?q=${encodeURIComponent(`Summary: ${item.url}`)}&title=${encodeURIComponent(item.title)}`}
     className="w-full group flex flex-row items-stretch gap-6 h-60 py-3"
     target="_blank"
   >


@@ -3,7 +3,7 @@ import Link from 'next/link';
 const SmallNewsCard = ({ item }: { item: Discover }) => (
   <Link
-    href={`/?q=Summary: ${item.url}`}
+    href={`/?q=${encodeURIComponent(`Summary: ${item.url}`)}&title=${encodeURIComponent(item.title)}`}
     className="rounded-3xl overflow-hidden bg-light-secondary dark:bg-dark-secondary shadow-sm shadow-light-200/10 dark:shadow-black/25 group flex flex-col"
     target="_blank"
   >


@@ -21,6 +21,7 @@ import SearchVideos from './SearchVideos';
import { useSpeech } from 'react-text-to-speech'; import { useSpeech } from 'react-text-to-speech';
import ThinkBox from './ThinkBox'; import ThinkBox from './ThinkBox';
import { useChat, Section } from '@/lib/hooks/useChat'; import { useChat, Section } from '@/lib/hooks/useChat';
import { useTranslation } from '@/lib/localization/context';
import Citation from './MessageRenderer/Citation'; import Citation from './MessageRenderer/Citation';
import AssistantSteps from './AssistantSteps'; import AssistantSteps from './AssistantSteps';
import { ResearchBlock } from '@/lib/types'; import { ResearchBlock } from '@/lib/types';
@@ -58,6 +59,7 @@ const MessageBox = ({
researchEnded, researchEnded,
chatHistory, chatHistory,
} = useChat(); } = useChat();
const { t } = useTranslation();
const parsedMessage = section.parsedTextBlocks.join('\n\n'); const parsedMessage = section.parsedTextBlocks.join('\n\n');
const speechMessage = section.speechMessage || ''; const speechMessage = section.speechMessage || '';
@@ -70,7 +72,10 @@ const MessageBox = ({
const sources = sourceBlocks.flatMap((block) => block.data); const sources = sourceBlocks.flatMap((block) => block.data);
const hasContent = section.parsedTextBlocks.length > 0; const hasContent =
section.parsedTextBlocks.some(
(t) => typeof t === 'string' && t.trim().length > 0,
);
const { speechStatus, start, stop } = useSpeech({ text: speechMessage }); const { speechStatus, start, stop } = useSpeech({ text: speechMessage });
@@ -103,12 +108,36 @@ const MessageBox = ({
}, },
}; };
const isSummaryArticle =
section.message.query.startsWith('Summary: ') &&
section.message.articleTitle;
const summaryUrl = isSummaryArticle
? section.message.query.slice(9)
: undefined;
return ( return (
<div className="space-y-6"> <div className="space-y-6">
<div className={'w-full pt-8 break-words'}> <div className="w-full pt-8 break-words lg:max-w-[60%]">
<h2 className="text-black dark:text-white font-medium text-3xl lg:w-9/12"> {isSummaryArticle ? (
<div className="space-y-1 min-w-0 overflow-hidden">
<h2 className="text-black dark:text-white font-medium text-3xl">
{section.message.articleTitle}
</h2>
<a
href={summaryUrl}
target="_blank"
rel="noopener noreferrer"
className="block text-sm text-black/60 dark:text-white/60 hover:text-black/80 dark:hover:text-white/80 truncate min-w-0"
title={summaryUrl}
>
{summaryUrl}
</a>
</div>
) : (
<h2 className="text-black dark:text-white font-medium text-3xl">
{section.message.query} {section.message.query}
</h2> </h2>
)}
</div> </div>
<div className="flex flex-col space-y-9 lg:space-y-0 lg:flex-row lg:justify-between lg:space-x-9"> <div className="flex flex-col space-y-9 lg:space-y-0 lg:flex-row lg:justify-between lg:space-x-9">
@@ -121,7 +150,7 @@ const MessageBox = ({
<div className="flex flex-row items-center space-x-2"> <div className="flex flex-row items-center space-x-2">
<BookCopy className="text-black dark:text-white" size={20} /> <BookCopy className="text-black dark:text-white" size={20} />
<h3 className="text-black dark:text-white font-medium text-xl"> <h3 className="text-black dark:text-white font-medium text-xl">
Sources {t('chat.sources')}
</h3> </h3>
</div> </div>
<MessageSources sources={sources} /> <MessageSources sources={sources} />
@@ -152,7 +181,7 @@ const MessageBox = ({
<div className="flex items-center gap-2 p-3 rounded-lg bg-light-secondary dark:bg-dark-secondary border border-light-200 dark:border-dark-200"> <div className="flex items-center gap-2 p-3 rounded-lg bg-light-secondary dark:bg-dark-secondary border border-light-200 dark:border-dark-200">
<Disc3 className="w-4 h-4 text-black dark:text-white animate-spin" /> <Disc3 className="w-4 h-4 text-black dark:text-white animate-spin" />
<span className="text-sm text-black/70 dark:text-white/70"> <span className="text-sm text-black/70 dark:text-white/70">
Brainstorming... {t('chat.brainstorming')}
</span> </span>
</div> </div>
)} )}
@@ -170,11 +199,23 @@ const MessageBox = ({
size={20} size={20}
/> />
<h3 className="text-black dark:text-white font-medium text-xl"> <h3 className="text-black dark:text-white font-medium text-xl">
Answer {t('chat.answer')}
</h3> </h3>
</div> </div>
)} )}
{!hasContent && sources.length > 0 && isLast && loading && (
<p className="text-sm text-black/60 dark:text-white/60 italic">
{t('chat.formingAnswer')}
</p>
)}
{!hasContent && sources.length > 0 && isLast && !loading && (
<p className="text-sm text-black/60 dark:text-white/60 italic">
{t('chat.answerFailed')}
</p>
)}
{hasContent && ( {hasContent && (
<> <>
<Markdown <Markdown
@@ -229,7 +270,7 @@ const MessageBox = ({
size={20} size={20}
/> />
<h3 className="text-black dark:text-white font-medium text-xl"> <h3 className="text-black dark:text-white font-medium text-xl">
Related {t('chat.related')}
</h3> </h3>
</div> </div>
<div className="space-y-0"> <div className="space-y-0">
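The `isSummaryArticle` branch above relies on `'Summary: '` being exactly 9 characters when it calls `query.slice(9)`. A minimal sketch of that prefix handling, with the helper name being an illustration rather than code from the diff:

```typescript
// 'Summary: ' has 9 characters, matching the slice(9) in the component.
const SUMMARY_PREFIX = 'Summary: ';

// Returns the article URL when the query carries the prefix,
// undefined otherwise (the non-article case renders the plain query).
function extractSummaryUrl(query: string): string | undefined {
  return query.startsWith(SUMMARY_PREFIX)
    ? query.slice(SUMMARY_PREFIX.length)
    : undefined;
}
```

Deriving the slice offset from `SUMMARY_PREFIX.length` avoids the magic number drifting out of sync with the prefix string.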
View File
@@ -20,7 +20,9 @@ import { AnimatePresence } from 'motion/react';
import { motion } from 'framer-motion'; import { motion } from 'framer-motion';
const Attach = () => { const Attach = () => {
const { files, setFiles, setFileIds, fileIds } = useChat(); const { files, setFiles, setFileIds, fileIds, embeddingModelProvider } = useChat();
if (!embeddingModelProvider?.providerId) return null;
const [loading, setLoading] = useState(false); const [loading, setLoading] = useState(false);
const fileInputRef = useRef<any>(); const fileInputRef = useRef<any>();
View File
@@ -11,7 +11,9 @@ import { AnimatePresence } from 'motion/react';
import { motion } from 'framer-motion'; import { motion } from 'framer-motion';
const AttachSmall = () => { const AttachSmall = () => {
const { files, setFiles, setFileIds, fileIds } = useChat(); const { files, setFiles, setFileIds, fileIds, embeddingModelProvider } = useChat();
if (!embeddingModelProvider?.providerId) return null;
const [loading, setLoading] = useState(false); const [loading, setLoading] = useState(false);
const fileInputRef = useRef<any>(); const fileInputRef = useRef<any>();
View File
@@ -10,6 +10,7 @@ import { AnimatePresence, motion } from 'motion/react';
const ModelSelector = () => { const ModelSelector = () => {
const [providers, setProviders] = useState<MinimalProvider[]>([]); const [providers, setProviders] = useState<MinimalProvider[]>([]);
const [envOnlyMode, setEnvOnlyMode] = useState(false);
const [isLoading, setIsLoading] = useState(true); const [isLoading, setIsLoading] = useState(true);
const [searchQuery, setSearchQuery] = useState(''); const [searchQuery, setSearchQuery] = useState('');
@@ -25,8 +26,9 @@ const ModelSelector = () => {
throw new Error('Failed to fetch providers'); throw new Error('Failed to fetch providers');
} }
const data: { providers: MinimalProvider[] } = await res.json(); const data: { providers: MinimalProvider[]; envOnlyMode?: boolean } = await res.json();
setProviders(data.providers); setProviders(data.providers);
setEnvOnlyMode(data.envOnlyMode ?? false);
} catch (error) { } catch (error) {
console.error('Error loading providers:', error); console.error('Error loading providers:', error);
} finally { } finally {
@@ -73,6 +75,8 @@ const ModelSelector = () => {
})) }))
.filter((provider) => provider.chatModels.length > 0); .filter((provider) => provider.chatModels.length > 0);
if (envOnlyMode) return null;
return ( return (
<Popover className="relative w-full max-w-[15rem] md:max-w-md lg:max-w-lg"> <Popover className="relative w-full max-w-[15rem] md:max-w-md lg:max-w-lg">
{({ open }) => ( {({ open }) => (
View File
@@ -9,8 +9,10 @@ import {
import { File } from 'lucide-react'; import { File } from 'lucide-react';
import { Fragment, useState } from 'react'; import { Fragment, useState } from 'react';
import { Chunk } from '@/lib/types'; import { Chunk } from '@/lib/types';
import { useTranslation } from '@/lib/localization/context';
const MessageSources = ({ sources }: { sources: Chunk[] }) => { const MessageSources = ({ sources }: { sources: Chunk[] }) => {
const { t } = useTranslation();
const [isDialogOpen, setIsDialogOpen] = useState(false); const [isDialogOpen, setIsDialogOpen] = useState(false);
const closeModal = () => { const closeModal = () => {
@@ -52,7 +54,7 @@ const MessageSources = ({ sources }: { sources: Chunk[] }) => {
)} )}
<p className="text-xs text-black/50 dark:text-white/50 overflow-hidden whitespace-nowrap text-ellipsis"> <p className="text-xs text-black/50 dark:text-white/50 overflow-hidden whitespace-nowrap text-ellipsis">
{source.metadata.url.includes('file_id://') {source.metadata.url.includes('file_id://')
? 'Uploaded File' ? t('chat.uploadedFile')
: source.metadata.url.replace(/.+\/\/|www.|\..+/g, '')} : source.metadata.url.replace(/.+\/\/|www.|\..+/g, '')}
</p> </p>
</div> </div>
@@ -90,7 +92,7 @@ const MessageSources = ({ sources }: { sources: Chunk[] }) => {
})} })}
</div> </div>
<p className="text-xs text-black/50 dark:text-white/50"> <p className="text-xs text-black/50 dark:text-white/50">
View {sources.length - 3} more {t('chat.viewMore').replace('{count}', String(sources.length - 3))}
</p> </p>
</button> </button>
)} )}
@@ -109,7 +111,7 @@ const MessageSources = ({ sources }: { sources: Chunk[] }) => {
> >
<DialogPanel className="w-full max-w-md transform rounded-2xl bg-light-secondary dark:bg-dark-secondary border border-light-200 dark:border-dark-200 p-6 text-left align-middle shadow-xl transition-all"> <DialogPanel className="w-full max-w-md transform rounded-2xl bg-light-secondary dark:bg-dark-secondary border border-light-200 dark:border-dark-200 p-6 text-left align-middle shadow-xl transition-all">
<DialogTitle className="text-lg font-medium leading-6 dark:text-white"> <DialogTitle className="text-lg font-medium leading-6 dark:text-white">
Sources {t('chat.sources')}
</DialogTitle> </DialogTitle>
<div className="grid grid-cols-2 gap-2 overflow-auto max-h-[300px] mt-2 pr-2"> <div className="grid grid-cols-2 gap-2 overflow-auto max-h-[300px] mt-2 pr-2">
{sources.map((source, i) => ( {sources.map((source, i) => (
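The `viewMore` change above substitutes a `{count}` placeholder inside a translated template. A sketch of that substitution, where the template value is an assumed translation string, not one taken from the diff:

```typescript
// Replaces the first '{count}' placeholder in a translation template,
// mirroring t('chat.viewMore').replace('{count}', String(n)).
function formatViewMore(template: string, count: number): string {
  return template.replace('{count}', String(count));
}
```

Note that `String.prototype.replace` with a string pattern only substitutes the first occurrence, which is sufficient for a single-placeholder template like this one.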
View File
@@ -9,7 +9,6 @@ import {
PopoverPanel, PopoverPanel,
Transition, Transition,
} from '@headlessui/react'; } from '@headlessui/react';
import jsPDF from 'jspdf';
import { useChat, Section } from '@/lib/hooks/useChat'; import { useChat, Section } from '@/lib/hooks/useChat';
import { SourceBlock } from '@/lib/types'; import { SourceBlock } from '@/lib/types';
@@ -73,7 +72,8 @@ const exportAsMarkdown = (sections: Section[], title: string) => {
downloadFile(`${title || 'chat'}.md`, md, 'text/markdown'); downloadFile(`${title || 'chat'}.md`, md, 'text/markdown');
}; };
const exportAsPDF = (sections: Section[], title: string) => { const exportAsPDF = async (sections: Section[], title: string) => {
const { default: jsPDF } = await import('jspdf');
const doc = new jsPDF(); const doc = new jsPDF();
const date = new Date( const date = new Date(
sections[0]?.message?.createdAt || Date.now(), sections[0]?.message?.createdAt || Date.now(),
@@ -294,7 +294,7 @@ const Navbar = () => {
</button> </button>
<button <button
className="w-full flex items-center gap-3 px-3 py-2 text-left rounded-xl hover:bg-light-secondary dark:hover:bg-dark-secondary transition-colors duration-200" className="w-full flex items-center gap-3 px-3 py-2 text-left rounded-xl hover:bg-light-secondary dark:hover:bg-dark-secondary transition-colors duration-200"
onClick={() => exportAsPDF(sections, title || '')} onClick={() => void exportAsPDF(sections, title || '')}
> >
<FileDown size={16} className="text-[#EA580C]" /> <FileDown size={16} className="text-[#EA580C]" />
<div> <div>
View File
@@ -40,7 +40,7 @@ const NewsArticleWidget = () => {
<div className="w-full text-xs text-red-400">Could not load news.</div> <div className="w-full text-xs text-red-400">Could not load news.</div>
) : article ? ( ) : article ? (
<a <a
href={`/?q=Summary: ${article.url}`} href={`/?q=${encodeURIComponent(`Summary: ${article.url}`)}&title=${encodeURIComponent(article.title)}`}
className="flex flex-row items-stretch w-full h-full relative overflow-hidden group" className="flex flex-row items-stretch w-full h-full relative overflow-hidden group"
> >
<div className="relative w-24 min-w-24 max-w-24 h-full overflow-hidden"> <div className="relative w-24 min-w-24 max-w-24 h-full overflow-hidden">
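The widget change above wraps both the `Summary:` query and the article title in `encodeURIComponent`, so URLs containing `?`, `&`, or spaces survive as a single query parameter. A sketch of that link construction, with the function name and sample values being assumptions for illustration:

```typescript
// Builds the Discover article link the same way the widget does:
// each parameter value is percent-encoded before joining.
function buildSummaryLink(articleUrl: string, articleTitle: string): string {
  return `/?q=${encodeURIComponent(`Summary: ${articleUrl}`)}&title=${encodeURIComponent(articleTitle)}`;
}
```

Without the encoding, an article URL with its own query string (e.g. `...?x=1`) would be cut off at the first `&` when the link is parsed.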
View File
@@ -4,6 +4,7 @@ import { useState } from 'react';
import Lightbox from 'yet-another-react-lightbox'; import Lightbox from 'yet-another-react-lightbox';
import 'yet-another-react-lightbox/styles.css'; import 'yet-another-react-lightbox/styles.css';
import { Message } from './ChatWindow'; import { Message } from './ChatWindow';
import { useTranslation } from '@/lib/localization/context';
type Image = { type Image = {
url: string; url: string;
@@ -20,6 +21,7 @@ const SearchImages = ({
chatHistory: [string, string][]; chatHistory: [string, string][];
messageId: string; messageId: string;
}) => { }) => {
const { t } = useTranslation();
const [images, setImages] = useState<Image[] | null>(null); const [images, setImages] = useState<Image[] | null>(null);
const [loading, setLoading] = useState(false); const [loading, setLoading] = useState(false);
const [open, setOpen] = useState(false); const [open, setOpen] = useState(false);
@@ -70,7 +72,7 @@ const SearchImages = ({
> >
<div className="flex flex-row items-center space-x-2"> <div className="flex flex-row items-center space-x-2">
<ImagesIcon size={17} /> <ImagesIcon size={17} />
<p>Search images</p> <p>{t('chat.searchImages')}</p>
</div> </div>
<PlusIcon className="text-[#EA580C]" size={17} /> <PlusIcon className="text-[#EA580C]" size={17} />
</button> </button>
View File
@@ -4,6 +4,7 @@ import { useRef, useState } from 'react';
import Lightbox, { GenericSlide, VideoSlide } from 'yet-another-react-lightbox'; import Lightbox, { GenericSlide, VideoSlide } from 'yet-another-react-lightbox';
import 'yet-another-react-lightbox/styles.css'; import 'yet-another-react-lightbox/styles.css';
import { Message } from './ChatWindow'; import { Message } from './ChatWindow';
import { useTranslation } from '@/lib/localization/context';
type Video = { type Video = {
url: string; url: string;
@@ -24,7 +25,7 @@ declare module 'yet-another-react-lightbox' {
} }
} }
const Searchvideos = ({ const SearchVideos = ({
query, query,
chatHistory, chatHistory,
messageId, messageId,
@@ -33,6 +34,7 @@ const Searchvideos = ({
chatHistory: [string, string][]; chatHistory: [string, string][];
messageId: string; messageId: string;
}) => { }) => {
const { t } = useTranslation();
const [videos, setVideos] = useState<Video[] | null>(null); const [videos, setVideos] = useState<Video[] | null>(null);
const [loading, setLoading] = useState(false); const [loading, setLoading] = useState(false);
const [open, setOpen] = useState(false); const [open, setOpen] = useState(false);
@@ -87,7 +89,7 @@ const Searchvideos = ({
> >
<div className="flex flex-row items-center space-x-2"> <div className="flex flex-row items-center space-x-2">
<VideoIcon size={17} /> <VideoIcon size={17} />
<p>Search videos</p> <p>{t('chat.searchVideos')}</p>
</div> </div>
<PlusIcon className="text-[#EA580C]" size={17} /> <PlusIcon className="text-[#EA580C]" size={17} />
</button> </button>
@@ -126,7 +128,7 @@ const Searchvideos = ({
/> />
<div className="absolute bg-white/70 dark:bg-black/70 text-black/70 dark:text-white/70 px-2 py-1 flex flex-row items-center space-x-1 bottom-1 right-1 rounded-md"> <div className="absolute bg-white/70 dark:bg-black/70 text-black/70 dark:text-white/70 px-2 py-1 flex flex-row items-center space-x-1 bottom-1 right-1 rounded-md">
<PlayCircle size={15} /> <PlayCircle size={15} />
<p className="text-xs">Video</p> <p className="text-xs">{t('chat.video')}</p>
</div> </div>
</div> </div>
)) ))
@@ -150,7 +152,7 @@ const Searchvideos = ({
/> />
<div className="absolute bg-white/70 dark:bg-black/70 text-black/70 dark:text-white/70 px-2 py-1 flex flex-row items-center space-x-1 bottom-1 right-1 rounded-md"> <div className="absolute bg-white/70 dark:bg-black/70 text-black/70 dark:text-white/70 px-2 py-1 flex flex-row items-center space-x-1 bottom-1 right-1 rounded-md">
<PlayCircle size={15} /> <PlayCircle size={15} />
<p className="text-xs">Video</p> <p className="text-xs">{t('chat.video')}</p>
</div> </div>
</div> </div>
))} ))}
@@ -220,4 +222,4 @@ const Searchvideos = ({
); );
}; };
export default Searchvideos; export default SearchVideos;
View File
@@ -19,7 +19,7 @@ import SearchSection from './Sections/Search';
import Select from '@/components/ui/Select'; import Select from '@/components/ui/Select';
import Personalization from './Sections/Personalization'; import Personalization from './Sections/Personalization';
const sections = [ const baseSections = [
{ {
key: 'preferences', key: 'preferences',
name: 'Preferences', name: 'Preferences',
@@ -62,13 +62,20 @@ const SettingsDialogue = ({
setIsOpen: (active: boolean) => void; setIsOpen: (active: boolean) => void;
}) => { }) => {
const [isLoading, setIsLoading] = useState(true); const [isLoading, setIsLoading] = useState(true);
const [config, setConfig] = useState<any>(null); const [config, setConfig] = useState<{
values: Record<string, unknown>;
fields: Record<string, unknown>;
envOnlyMode?: boolean;
} | null>(null);
const sections = config?.envOnlyMode
? baseSections.filter((s) => s.key !== 'models')
: baseSections;
const [activeSection, setActiveSection] = useState<string>(sections[0].key); const [activeSection, setActiveSection] = useState<string>(sections[0].key);
const [selectedSection, setSelectedSection] = useState(sections[0]); const [selectedSection, setSelectedSection] = useState(sections[0]);
useEffect(() => { useEffect(() => {
setSelectedSection(sections.find((s) => s.key === activeSection)!); setSelectedSection(sections.find((s) => s.key === activeSection) ?? sections[0]);
}, [activeSection]); }, [activeSection, sections]);
useEffect(() => { useEffect(() => {
if (isOpen) { if (isOpen) {
@@ -84,6 +91,9 @@ const SettingsDialogue = ({
const data = await res.json(); const data = await res.json();
setConfig(data); setConfig(data);
if (data.envOnlyMode && activeSection === 'models') {
setActiveSection('preferences');
}
} catch (error) { } catch (error) {
console.error('Error fetching config:', error); console.error('Error fetching config:', error);
toast.error('Failed to load configuration.'); toast.error('Failed to load configuration.');
@@ -190,7 +200,7 @@ const SettingsDialogue = ({
className="!text-xs lg:!text-sm" className="!text-xs lg:!text-sm"
/> />
</div> </div>
{selectedSection.component && ( {selectedSection.component && config && (
<div className="flex flex-1 flex-col overflow-hidden"> <div className="flex flex-1 flex-col overflow-hidden">
<div className="border-b border-light-200/60 px-6 pb-6 lg:pt-6 dark:border-dark-200/60 flex-shrink-0"> <div className="border-b border-light-200/60 px-6 pb-6 lg:pt-6 dark:border-dark-200/60 flex-shrink-0">
<div className="flex flex-col"> <div className="flex flex-col">
@@ -204,8 +214,8 @@ const SettingsDialogue = ({
</div> </div>
<div className="flex-1 overflow-y-auto"> <div className="flex-1 overflow-y-auto">
<selectedSection.component <selectedSection.component
fields={config.fields[selectedSection.dataAdd]} fields={config.fields[selectedSection.dataAdd] as never}
values={config.values[selectedSection.dataAdd]} values={config.values[selectedSection.dataAdd] as never}
/> />
</div> </div>
</div> </div>
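The settings change above replaces a non-null assertion (`sections.find(...)!`) with a `?? sections[0]` fallback, so removing the `models` section in `envOnlyMode` can no longer leave `selectedSection` undefined. A minimal sketch of that lookup, with the types reduced to what the lookup needs:

```typescript
type SectionEntry = { key: string };

// Returns the section matching activeKey, or the first section when the
// key is absent (e.g. after envOnlyMode filters a section out).
function selectSection(sections: SectionEntry[], activeKey: string): SectionEntry {
  return sections.find((s) => s.key === activeKey) ?? sections[0];
}
```

The fallback matters exactly in the new code path: when `envOnlyMode` drops `models` while `activeSection` still points at it, the old `!` assertion would have produced `undefined` at runtime.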
View File
@@ -15,10 +15,12 @@ const SetupConfig = ({
configSections, configSections,
setupState, setupState,
setSetupState, setSetupState,
envConfigRequired,
}: { }: {
configSections: UIConfigSections; configSections: UIConfigSections;
setupState: number; setupState: number;
setSetupState: (state: number) => void; setSetupState: (state: number) => void;
envConfigRequired?: boolean;
}) => { }) => {
const [providers, setProviders] = useState<ConfigModelProvider[]>([]); const [providers, setProviders] = useState<ConfigModelProvider[]>([]);
const [isLoading, setIsLoading] = useState(true); const [isLoading, setIsLoading] = useState(true);
@@ -41,10 +43,10 @@ const SetupConfig = ({
} }
}; };
if (setupState === 2) { if (setupState === 2 && !envConfigRequired) {
fetchProviders(); fetchProviders();
} }
}, [setupState]); }, [setupState, envConfigRequired]);
const handleFinish = async () => { const handleFinish = async () => {
try { try {
@@ -69,6 +71,36 @@ const SetupConfig = ({
const hasProviders = const hasProviders =
visibleProviders.filter((p) => p.chatModels.length > 0).length > 0; visibleProviders.filter((p) => p.chatModels.length > 0).length > 0;
if (envConfigRequired && setupState >= 2) {
return (
<div className="w-[95vw] md:w-[80vw] lg:w-[65vw] mx-auto px-2 sm:px-4 md:px-6 flex flex-col space-y-6">
<motion.div
initial={{ opacity: 0, y: 20 }}
animate={{
opacity: 1,
y: 0,
transition: { duration: 0.5, delay: 0.1 },
}}
className="w-full h-[calc(95vh-80px)] bg-light-primary dark:bg-dark-primary border border-light-200 dark:border-dark-200 rounded-xl shadow-sm flex flex-col overflow-hidden"
>
<div className="flex-1 overflow-y-auto px-3 sm:px-4 md:px-6 py-4 md:py-6 flex items-center justify-center">
<div className="flex flex-col items-center text-center max-w-md">
<p className="text-sm text-black/70 dark:text-white/70">
Сервис настраивается. Попробуйте позже.
</p>
<button
onClick={() => window.location.reload()}
className="mt-4 px-4 py-2 rounded-lg bg-light-200 dark:bg-dark-200 text-black/70 dark:text-white/70 hover:bg-light-300 dark:hover:bg-dark-300 text-sm transition-colors"
>
Обновить
</button>
</div>
</div>
</motion.div>
</div>
);
}
return ( return (
<div className="w-[95vw] md:w-[80vw] lg:w-[65vw] mx-auto px-2 sm:px-4 md:px-6 flex flex-col space-y-6"> <div className="w-[95vw] md:w-[80vw] lg:w-[65vw] mx-auto px-2 sm:px-4 md:px-6 flex flex-col space-y-6">
{setupState === 2 && ( {setupState === 2 && (
View File
@@ -7,8 +7,10 @@ import SetupConfig from './SetupConfig';
const SetupWizard = ({ const SetupWizard = ({
configSections, configSections,
envConfigRequired,
}: { }: {
configSections: UIConfigSections; configSections: UIConfigSections;
envConfigRequired?: boolean;
}) => { }) => {
const [showWelcome, setShowWelcome] = useState(true); const [showWelcome, setShowWelcome] = useState(true);
const [showSetup, setShowSetup] = useState(false); const [showSetup, setShowSetup] = useState(false);
@@ -27,7 +29,7 @@ const SetupWizard = ({
await delay(1500); await delay(1500);
setSetupState(2); setSetupState(2);
})(); })();
}, []); }, [envConfigRequired]);
return ( return (
<div className="bg-light-primary dark:bg-dark-primary h-screen w-screen fixed inset-0 overflow-hidden"> <div className="bg-light-primary dark:bg-dark-primary h-screen w-screen fixed inset-0 overflow-hidden">
@@ -112,6 +114,7 @@ const SetupWizard = ({
configSections={configSections} configSections={configSections}
setupState={setupState} setupState={setupState}
setSetupState={setSetupState} setSetupState={setSetupState}
envConfigRequired={envConfigRequired}
/> />
</motion.div> </motion.div>
)} )}
View File
@@ -13,6 +13,7 @@ import {
import Link from 'next/link'; import Link from 'next/link';
import { useSelectedLayoutSegments } from 'next/navigation'; import { useSelectedLayoutSegments } from 'next/navigation';
import React, { useState, type ReactNode } from 'react'; import React, { useState, type ReactNode } from 'react';
import { useTranslation } from '@/lib/localization/context';
import Layout from './Layout'; import Layout from './Layout';
import { import {
Description, Description,
@@ -29,25 +30,26 @@ const VerticalIconContainer = ({ children }: { children: ReactNode }) => {
const Sidebar = ({ children }: { children: React.ReactNode }) => { const Sidebar = ({ children }: { children: React.ReactNode }) => {
const segments = useSelectedLayoutSegments(); const segments = useSelectedLayoutSegments();
const [isOpen, setIsOpen] = useState<boolean>(true); const [isOpen, setIsOpen] = useState<boolean>(true);
const { t } = useTranslation();
const navLinks = [ const navLinks = [
{ {
icon: Home, icon: Home,
href: '/', href: '/',
active: segments.length === 0 || segments.includes('c'), active: segments.length === 0 || segments.includes('c'),
label: 'Home', label: t('nav.home'),
}, },
{ {
icon: Search, icon: Search,
href: '/discover', href: '/discover',
active: segments.includes('discover'), active: segments.includes('discover'),
label: 'Discover', label: t('nav.discover'),
}, },
{ {
icon: BookOpenText, icon: BookOpenText,
href: '/library', href: '/library',
active: segments.includes('library'), active: segments.includes('library'),
label: 'Library', label: t('nav.library'),
}, },
]; ];
View File
@@ -1,8 +1,10 @@
import React from 'react'; import React from 'react';
import dynamic from 'next/dynamic';
import { Widget } from '../ChatWindow'; import { Widget } from '../ChatWindow';
import Weather from './Weather'; import Weather from './Weather';
import Calculation from './Calculation'; import Calculation from './Calculation';
import Stock from './Stock';
const Stock = dynamic(() => import('./Stock'), { ssr: false });
const Renderer = ({ widgets }: { widgets: Widget[] }) => { const Renderer = ({ widgets }: { widgets: Widget[] }) => {
return widgets.map((widget, index) => { return widgets.map((widget, index) => {
View File
@@ -1,4 +1,7 @@
export const getSuggestions = async (chatHistory: [string, string][]) => { export const getSuggestions = async (
chatHistory: [string, string][],
locale?: string,
) => {
const chatModel = localStorage.getItem('chatModelKey'); const chatModel = localStorage.getItem('chatModelKey');
const chatModelProvider = localStorage.getItem('chatModelProviderId'); const chatModelProvider = localStorage.getItem('chatModelProviderId');
@@ -13,6 +16,7 @@ export const getSuggestions = async (chatHistory: [string, string][]) => {
providerId: chatModelProvider, providerId: chatModelProvider,
key: chatModel, key: chatModel,
}, },
locale,
}), }),
}); });
View File
@@ -12,6 +12,7 @@ class APISearchAgent {
enabledSources: input.config.sources, enabledSources: input.config.sources,
query: input.followUp, query: input.followUp,
llm: input.config.llm, llm: input.config.llm,
locale: input.config.locale,
}); });
const widgetPromise = WidgetExecutor.executeAll({ const widgetPromise = WidgetExecutor.executeAll({
@@ -69,6 +70,7 @@ class APISearchAgent {
finalContextWithWidgets, finalContextWithWidgets,
input.config.systemInstructions, input.config.systemInstructions,
input.config.mode, input.config.mode,
input.config.locale,
); );
const answerStream = input.config.llm.streamText({ const answerStream = input.config.llm.streamText({
View File
@@ -1,6 +1,6 @@
import z from 'zod'; import z from 'zod';
import { ClassifierInput } from './types'; import { ClassifierInput } from './types';
import { classifierPrompt } from '@/lib/prompts/search/classifier'; import { getClassifierPrompt } from '@/lib/prompts/search/classifier';
import formatChatHistoryAsString from '@/lib/utils/formatHistory'; import formatChatHistoryAsString from '@/lib/utils/formatHistory';
const schema = z.object({ const schema = z.object({
@@ -39,7 +39,7 @@ export const classify = async (input: ClassifierInput) => {
messages: [ messages: [
{ {
role: 'system', role: 'system',
content: classifierPrompt, content: getClassifierPrompt(input.locale),
}, },
{ {
role: 'user', role: 'user',
View File
@@ -11,6 +11,48 @@ import { TextBlock } from '@/lib/types';
class SearchAgent { class SearchAgent {
async searchAsync(session: SessionManager, input: SearchAgentInput) { async searchAsync(session: SessionManager, input: SearchAgentInput) {
try {
await this.doSearch(session, input);
} catch (err) {
console.error('[SearchAgent] Fatal:', err);
const blocks = session.getAllBlocks();
const sourceBlock = blocks.find((b) => b.type === 'source');
let sources: { metadata?: { title?: string } }[] =
sourceBlock?.type === 'source' ? sourceBlock.data : [];
if (sources.length === 0) {
const researchBlock = blocks.find(
(b): b is typeof b & { data: { subSteps?: { type: string; reading?: { metadata?: { title?: string } }[] }[] } } =>
b.type === 'research' && 'subSteps' in (b.data ?? {}),
);
const searchStep = researchBlock?.data?.subSteps?.find(
(s) => s.type === 'search_results' && Array.isArray(s.reading),
);
if (searchStep && 'reading' in searchStep) {
sources = searchStep.reading ?? [];
}
}
if (sources.length > 0) {
const lines = sources.slice(0, 10).map(
(s: { metadata?: { title?: string } }, i: number) =>
`${i + 1}. **${s?.metadata?.title ?? 'Источник'}**`,
);
session.emitBlock({
id: crypto.randomUUID(),
type: 'text',
data: `## По найденным источникам\n\n${lines.join('\n')}\n\n*Ответ LLM недоступен (400). Проверьте модель gemini-3-flash в Settings или попробуйте другую модель.*`,
});
} else {
session.emitBlock({
id: crypto.randomUUID(),
type: 'text',
data: `Ошибка: ${err instanceof Error ? err.message : String(err)}. Проверьте LLM в Settings.`,
});
}
session.emit('end', {});
}
}
private async doSearch(session: SessionManager, input: SearchAgentInput) {
const exists = await db.query.messages.findFirst({ const exists = await db.query.messages.findFirst({
where: and( where: and(
eq(messages.chatId, input.chatId), eq(messages.chatId, input.chatId),
@@ -56,6 +98,7 @@ class SearchAgent {
enabledSources: input.config.sources, enabledSources: input.config.sources,
query: input.followUp, query: input.followUp,
llm: input.config.llm, llm: input.config.llm,
locale: input.config.locale,
}); });
const widgetPromise = WidgetExecutor.executeAll({ const widgetPromise = WidgetExecutor.executeAll({
@@ -98,12 +141,19 @@ class SearchAgent {
type: 'researchComplete', type: 'researchComplete',
}); });
const MAX_RESULTS_FOR_WRITER = 15;
const MAX_CONTENT_PER_RESULT = 180;
const findingsForWriter =
searchResults?.searchFindings.slice(0, MAX_RESULTS_FOR_WRITER) ?? [];
const finalContext = const finalContext =
searchResults?.searchFindings findingsForWriter
.map( .map((f, index) => {
(f, index) => const content =
`<result index=${index + 1} title=${f.metadata.title}>${f.content}</result>`, f.content.length > MAX_CONTENT_PER_RESULT
) ? f.content.slice(0, MAX_CONTENT_PER_RESULT) + '…'
: f.content;
return `<result index=${index + 1} title="${String(f.metadata.title).replace(/"/g, "'")}">${content}</result>`;
})
.join('\n') || ''; .join('\n') || '';
const widgetContext = widgetOutputs const widgetContext = widgetOutputs
@@ -118,55 +168,61 @@ class SearchAgent {
finalContextWithWidgets, finalContextWithWidgets,
input.config.systemInstructions, input.config.systemInstructions,
input.config.mode, input.config.mode,
input.config.locale,
); );
const answerStream = input.config.llm.streamText({ const answerStream = input.config.llm.streamText({
messages: [ messages: [
{ { role: 'system', content: writerPrompt },
role: 'system',
content: writerPrompt,
},
...input.chatHistory, ...input.chatHistory,
{ { role: 'user', content: input.followUp },
role: 'user',
content: input.followUp,
},
], ],
options: { maxTokens: 4096 },
}); });
let responseBlockId = ''; let responseBlockId = '';
let hasContent = false;
for await (const chunk of answerStream) { for await (const chunk of answerStream) {
if (!chunk.contentChunk && !responseBlockId) continue;
if (!responseBlockId) { if (!responseBlockId) {
const block: TextBlock = { const block: TextBlock = {
id: crypto.randomUUID(), id: crypto.randomUUID(),
type: 'text', type: 'text',
data: chunk.contentChunk, data: chunk.contentChunk,
}; };
session.emitBlock(block); session.emitBlock(block);
responseBlockId = block.id; responseBlockId = block.id;
if (chunk.contentChunk) hasContent = true;
} else { } else {
const block = session.getBlock(responseBlockId) as TextBlock | null; const block = session.getBlock(responseBlockId) as TextBlock | null;
if (block) {
if (!block) {
continue;
}
block.data += chunk.contentChunk; block.data += chunk.contentChunk;
if (chunk.contentChunk) hasContent = true;
session.updateBlock(block.id, [ session.updateBlock(block.id, [
{ { op: 'replace', path: '/data', value: block.data },
op: 'replace',
path: '/data',
value: block.data,
},
]); ]);
} }
} }
}
if (!hasContent && findingsForWriter.length > 0) {
const lines = findingsForWriter.slice(0, 10).map((f, i) => {
const title = f.metadata.title ?? 'Без названия';
const excerpt =
f.content.length > 120 ? f.content.slice(0, 120) + '…' : f.content;
return `${i + 1}. **${title}** — ${excerpt}`;
});
session.emitBlock({
id: crypto.randomUUID(),
type: 'text',
data: `## По найденным источникам\n\n${lines.join('\n\n')}\n\n*Ответ LLM недоступен. Проверьте модель в Settings.*`,
});
}
session.emit('end', {}); session.emit('end', {});
try {
await db await db
.update(messages) .update(messages)
.set({ .set({
@@ -180,6 +236,9 @@ class SearchAgent {
), ),
) )
.execute(); .execute();
} catch (dbErr) {
console.error('[SearchAgent] DB update failed:', dbErr);
}
} }
} }
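The hunk above caps how much retrieved context reaches the writer model: at most 15 findings, each truncated to 180 characters, with double quotes in titles rewritten so the `title="..."` attribute stays parseable. A sketch of that trimming, with the `Finding` shape reduced to the fields the logic touches:

```typescript
type Finding = { metadata: { title?: string }; content: string };

// Same limits the diff introduces in SearchAgent.
const MAX_RESULTS_FOR_WRITER = 15;
const MAX_CONTENT_PER_RESULT = 180;

// Builds the <result> context string handed to the writer prompt:
// caps the result count, truncates long content with an ellipsis,
// and swaps double quotes in titles for single quotes.
function buildWriterContext(findings: Finding[]): string {
  return findings
    .slice(0, MAX_RESULTS_FOR_WRITER)
    .map((f, index) => {
      const content =
        f.content.length > MAX_CONTENT_PER_RESULT
          ? f.content.slice(0, MAX_CONTENT_PER_RESULT) + '…'
          : f.content;
      const title = String(f.metadata.title).replace(/"/g, "'");
      return `<result index=${index + 1} title="${title}">${content}</result>`;
    })
    .join('\n');
}
```

Bounding the context this way keeps the prompt within small-model limits, which is also why the same hunk adds `maxTokens: 4096` to the `streamText` call.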
View File
@@ -24,9 +24,10 @@ class ActionRegistry {
fileIds: string[]; fileIds: string[];
mode: SearchAgentConfig['mode']; mode: SearchAgentConfig['mode'];
sources: SearchSources[]; sources: SearchSources[];
hasEmbedding?: boolean;
}): ResearchAction[] { }): ResearchAction[] {
return Array.from( return Array.from(this.actions.values()).filter((action) =>
this.actions.values().filter((action) => action.enabled(config)), action.enabled(config),
); );
} }
@@ -35,6 +36,7 @@ class ActionRegistry {
fileIds: string[]; fileIds: string[];
mode: SearchAgentConfig['mode']; mode: SearchAgentConfig['mode'];
sources: SearchSources[]; sources: SearchSources[];
hasEmbedding?: boolean;
}): Tool[] { }): Tool[] {
const availableActions = this.getAvailableActions(config); const availableActions = this.getAvailableActions(config);
@@ -50,6 +52,7 @@ class ActionRegistry {
fileIds: string[]; fileIds: string[];
mode: SearchAgentConfig['mode']; mode: SearchAgentConfig['mode'];
sources: SearchSources[]; sources: SearchSources[];
hasEmbedding?: boolean;
}): string { }): string {
const availableActions = this.getAvailableActions(config); const availableActions = this.getAvailableActions(config);
View File
@@ -13,9 +13,10 @@ const schema = z.object({
  const uploadsSearchAction: ResearchAction<typeof schema> = {
    name: 'uploads_search',
    enabled: (config) =>
-     (config.classification.classification.personalSearch &&
+     config.hasEmbedding !== false &&
+     ((config.classification.classification.personalSearch &&
        config.fileIds.length > 0) ||
-     config.fileIds.length > 0,
+     config.fileIds.length > 0),
    schema,
    getToolDescription: () =>
      `Use this tool to perform searches over the user's uploaded files. This is useful when you need to gather information from the user's documents to answer their questions. You can provide up to 3 queries at a time. You will have to use this every single time if this is present and relevant.`,
@@ -27,6 +28,10 @@ const uploadsSearchAction: ResearchAction<typeof schema> = {
  Never use this tool to search the web or for information that is not contained within the user's uploaded files.
  `,
    execute: async (input, additionalConfig) => {
+     if (!additionalConfig.embedding) {
+       return { type: 'search_results' as const, results: [] };
+     }
      input.queries = input.queries.slice(0, 3);
      const researchBlock = additionalConfig.session.getBlock(

View File

@@ -24,6 +24,7 @@ class Researcher {
      fileIds: input.config.fileIds,
      mode: input.config.mode,
      sources: input.config.sources,
+     hasEmbedding: !!input.config.embedding,
    });
    const availableActionsDescription =
@@ -32,6 +33,7 @@ class Researcher {
      fileIds: input.config.fileIds,
      mode: input.config.mode,
      sources: input.config.sources,
+     hasEmbedding: !!input.config.embedding,
    });
    const researchBlockId = crypto.randomUUID();
@@ -63,6 +65,7 @@ class Researcher {
      i,
      maxIteration,
      input.config.fileIds,
+     input.config.locale,
    );
    const actionStream = input.config.llm.streamText({

View File

@@ -6,13 +6,15 @@ import { ChatTurnMessage, Chunk } from '@/lib/types';
  export type SearchSources = 'web' | 'discussions' | 'academic';
+ /** locale from geo (e.g. ru, en): the language of the answer */
  export type SearchAgentConfig = {
    sources: SearchSources[];
    fileIds: string[];
    llm: BaseLLM<any>;
-   embedding: BaseEmbedding<any>;
+   embedding: BaseEmbedding<any> | null;
    mode: 'speed' | 'balanced' | 'quality';
    systemInstructions: string;
+   locale?: string;
  };
  export type SearchAgentInput = {
@@ -47,6 +49,7 @@ export type ClassifierInput = {
    enabledSources: SearchSources[];
    query: string;
    chatHistory: ChatTurnMessage[];
+   locale?: string;
  };
  export type ClassifierOutput = {
@@ -64,7 +67,7 @@ export type ClassifierOutput = {
  export type AdditionalConfig = {
    llm: BaseLLM<any>;
-   embedding: BaseEmbedding<any>;
+   embedding: BaseEmbedding<any> | null;
    session: SessionManager;
  };
@@ -111,6 +114,7 @@ export interface ResearchAction<
    fileIds: string[];
    mode: SearchAgentConfig['mode'];
    sources: SearchSources[];
+   hasEmbedding?: boolean;
  }) => boolean;
  execute: (
    params: z.infer<TSchema>,

View File

@@ -1,11 +1,12 @@
  import formatChatHistoryAsString from '@/lib/utils/formatHistory';
- import { suggestionGeneratorPrompt } from '@/lib/prompts/suggestions';
+ import { getSuggestionGeneratorPrompt } from '@/lib/prompts/suggestions';
  import { ChatTurnMessage } from '@/lib/types';
  import z from 'zod';
  import BaseLLM from '@/lib/models/base/llm';
  type SuggestionGeneratorInput = {
    chatHistory: ChatTurnMessage[];
+   locale?: string;
  };
  const schema = z.object({
@@ -22,7 +23,7 @@ const generateSuggestions = async (
    messages: [
      {
        role: 'system',
-       content: suggestionGeneratorPrompt,
+       content: getSuggestionGeneratorPrompt(input.locale),
      },
      {
        role: 'user',
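The diff above swaps a static system prompt for a locale-aware factory. A minimal sketch of what such a `getSuggestionGeneratorPrompt(locale)` could look like; the wording is illustrative only, not the project's actual prompt:

```typescript
// Hypothetical sketch of a locale-aware prompt factory like the one
// imported above. The prompt text is an assumption for illustration.
function getSuggestionGeneratorPrompt(locale?: string): string {
  const base =
    'Generate 3 diverse follow-up questions for the conversation below.';
  // When a locale is known, ask the model to answer in that language.
  return locale
    ? `${base}\nRespond strictly in the language with ISO code "${locale}".`
    : base;
}
```

Passing the locale into the prompt (rather than translating afterwards) keeps the suggestions in the user's language with a single LLM call.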

View File

@@ -2,7 +2,7 @@ import path from 'node:path';
  import fs from 'fs';
  import { Config, ConfigModelProvider, UIConfigSections } from './types';
  import { hashObj } from '../serverUtils';
- import { getModelProvidersUIConfigSection } from '../models/providers';
+ import { loadModelProvidersUIConfigSection } from './providersLoader';
  class ConfigManager {
    configPath: string = path.join(
@@ -193,10 +193,79 @@ class ConfigManager {
    return config;
  }
- private initializeFromEnv() {
-   /* providers section*/
-   const providerConfigSections = getModelProvidersUIConfigSection();
+ /** Env-only mode: LLM from LLM_PROVIDER=ollama|timeweb, no UI for providers */
+ public isEnvOnlyMode(): boolean {
+   return ['ollama', 'timeweb'].includes(
+     (process.env.LLM_PROVIDER ?? '').toLowerCase(),
+   );
+ }
+ private buildEnvProvider(): ConfigModelProvider | null {
+   const providerType = (process.env.LLM_PROVIDER ?? '').toLowerCase();
+   if (providerType === 'ollama') {
+     const defaultUrl =
+       process.env.DOCKER
+         ? 'http://host.docker.internal:11434'
+         : 'http://localhost:11434';
+     const baseURL = process.env.OLLAMA_BASE_URL ?? defaultUrl;
+     const embeddingBaseURL =
+       process.env.OLLAMA_EMBEDDING_BASE_URL || baseURL;
+     const chatModel = process.env.LLM_CHAT_MODEL || 'llama3.2';
+     const embeddingModel = process.env.LLM_EMBEDDING_MODEL || 'nomic-embed-text';
+     return {
+       id: 'env-ollama',
+       name: 'Ollama',
+       type: 'ollama',
+       chatModels: [{ key: chatModel, name: chatModel }],
+       embeddingModels: [{ key: embeddingModel, name: embeddingModel }],
+       config: {
+         baseURL,
+         ...(embeddingBaseURL !== baseURL && { embeddingBaseURL }),
+       },
+       hash: hashObj({ baseURL, embeddingBaseURL }),
+     };
+   }
+   if (providerType === 'timeweb') {
+     const baseURL = process.env.TIMEWEB_API_BASE_URL || 'https://api.timeweb.cloud';
+     const agentAccessId = process.env.TIMEWEB_AGENT_ACCESS_ID || '';
+     const apiKey = process.env.TIMEWEB_API_KEY || '';
+     const model = process.env.LLM_CHAT_MODEL || 'gpt-4';
+     const xProxySource = process.env.TIMEWEB_X_PROXY_SOURCE;
+     if (!agentAccessId || !apiKey) return null;
+     return {
+       id: 'env-timeweb',
+       name: 'Timeweb Cloud AI',
+       type: 'timeweb',
+       chatModels: [{ key: model, name: model }],
+       embeddingModels: [],
+       config: {
+         baseURL,
+         agentAccessId,
+         apiKey,
+         model,
+         xProxySource: xProxySource || undefined,
+       },
+       hash: hashObj({ baseURL, agentAccessId }),
+     };
+   }
+   return null;
+ }
+ private initializeFromEnv() {
+   const envProvider = this.buildEnvProvider();
+   if (envProvider) {
+     this.currentConfig.modelProviders = [envProvider];
+     this.currentConfig.setupComplete = true;
+     this.uiConfigSections.modelProviders = [];
+   } else {
+     const providerConfigSections = loadModelProvidersUIConfigSection();
      this.uiConfigSections.modelProviders = providerConfigSections;
      const newProviders: ConfigModelProvider[] = [];
@@ -217,7 +286,7 @@
        newProvider.config[field.key] =
          process.env[field.env!] ||
          field.default ||
-         ''; /* Env var must exist for providers */
+         '';
        if (field.required) newProvider.required?.push(field.key);
      });
@@ -246,6 +315,7 @@
      });
      this.currentConfig.modelProviders.push(...newProviders);
+   }
    /* search section */
    this.uiConfigSections.search.forEach((f) => {
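The env-only switch above is keyed entirely off `LLM_PROVIDER`. A self-contained sketch of that check, taking the environment as a parameter instead of reading `process.env` so it can be exercised in isolation:

```typescript
// Sketch of the isEnvOnlyMode check added above: env-only mode is active
// iff LLM_PROVIDER names one of the two supported providers,
// case-insensitively. The env parameter stands in for process.env.
function isEnvOnlyMode(env: Record<string, string | undefined>): boolean {
  return ['ollama', 'timeweb'].includes((env.LLM_PROVIDER ?? '').toLowerCase());
}
```

Any other value (or an unset variable) keeps the UI-driven provider configuration path.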

View File

@@ -0,0 +1,11 @@
import type { ModelProviderUISection } from './types';
/**
* Lazy-loads getModelProvidersUIConfigSection only when needed (not in env-only mode).
* Avoids loading all 9 providers into the module graph during layout/config initialization.
*/
export function loadModelProvidersUIConfigSection(): ModelProviderUISection[] {
// eslint-disable-next-line @typescript-eslint/no-require-imports
const { getModelProvidersUIConfigSection } = require('../models/providers');
return getModelProvidersUIConfigSection();
}
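The loader above defers `require` until call time so the nine provider modules stay out of the initial module graph. The same idea in a generic, framework-free form (the memoization here is an addition for illustration; in the real loader, Node's module cache plays that role):

```typescript
// Generic sketch of the lazy-loading trick used by providersLoader:
// the factory does not run until the value is first requested, and the
// result is cached so later calls are free.
function lazy<T>(factory: () => T): () => T {
  let cached: { value: T } | null = null;
  return () => {
    if (cached === null) cached = { value: factory() }; // first call pays the cost
    return cached.value;
  };
}
```

In env-only mode the returned thunk is simply never invoked, so the heavy module is never loaded.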

View File

@@ -13,3 +13,5 @@ export const getConfiguredModelProviderById = (
  export const getSearxngURL = () =>
    configManager.getConfig('search.searxngURL', '');
+ export const isEnvOnlyMode = (): boolean => configManager.isEnvOnlyMode();

View File

@@ -1,14 +1,13 @@
  import { drizzle } from 'drizzle-orm/better-sqlite3';
  import Database from 'better-sqlite3';
  import * as schema from './schema';
- import path from 'path';
+ import path from 'node:path';
  const DATA_DIR = process.env.DATA_DIR
    ? path.resolve(process.cwd(), process.env.DATA_DIR)
    : process.cwd();
  const sqlite = new Database(path.join(DATA_DIR, 'data', 'db.sqlite'));
- const db = drizzle(sqlite, {
-   schema: schema,
- });
+ const db = drizzle(sqlite, { schema });
  export default db;

View File

@@ -16,6 +16,7 @@ import { toast } from 'sonner';
  import { getSuggestions } from '../actions';
  import { MinimalProvider } from '../models/types';
  import { getAutoMediaSearch } from '../config/clientRegistry';
+ import { useTranslation } from '@/lib/localization/context';
  import { applyPatch } from 'rfc6902';
  import { Widget } from '@/components/ChatWindow';
@@ -55,6 +56,7 @@ type ChatContext = {
    message: string,
    messageId?: string,
    rewrite?: boolean,
+   articleTitle?: string,
  ) => Promise<void>;
  rewrite: (messageId: string) => void;
  setChatModelProvider: (provider: ChatModelProvider) => void;
@@ -105,11 +107,10 @@
  const data = await res.json();
  const providers: MinimalProvider[] = data.providers;
+ const envOnlyMode = data.envOnlyMode ?? false;
  if (providers.length === 0) {
-   throw new Error(
-     'No chat model providers found, please configure them in the settings page.',
-   );
+   throw new Error('Сервис настраивается. Попробуйте позже.');
  }
  const chatModelProvider =
@@ -117,9 +118,7 @@
    providers.find((p) => p.chatModels.length > 0);
  if (!chatModelProvider) {
-   throw new Error(
-     'No chat models found, pleae configure them in the settings page.',
-   );
+   throw new Error('Сервис настраивается. Попробуйте позже.');
  }
  chatModelProviderId = chatModelProvider.id;
@@ -133,12 +132,11 @@
    providers.find((p) => p.id === embeddingModelProviderId) ??
    providers.find((p) => p.embeddingModels.length > 0);
- if (!embeddingModelProvider) {
-   throw new Error(
-     'No embedding models found, pleae configure them in the settings page.',
-   );
+ if (!embeddingModelProvider && !envOnlyMode) {
+   throw new Error('Сервис настраивается. Попробуйте позже.');
  }
+ if (embeddingModelProvider) {
    embeddingModelProviderId = embeddingModelProvider.id;
    const embeddingModel =
@@ -147,10 +145,15 @@
      ) ?? embeddingModelProvider.embeddingModels[0];
    embeddingModelKey = embeddingModel.key;
-   localStorage.setItem('chatModelKey', chatModelKey);
-   localStorage.setItem('chatModelProviderId', chatModelProviderId);
    localStorage.setItem('embeddingModelKey', embeddingModelKey);
    localStorage.setItem('embeddingModelProviderId', embeddingModelProviderId);
+ } else {
+   embeddingModelProviderId = '';
+   embeddingModelKey = '';
+ }
+ localStorage.setItem('chatModelKey', chatModelKey);
+ localStorage.setItem('chatModelProviderId', chatModelProviderId);
  setChatModelProvider({
    key: chatModelKey,
@@ -269,6 +272,7 @@ export const chatContext = createContext<ChatContext>({
  export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
    const params: { chatId: string } = useParams();
+   const { localeCode } = useTranslation();
    const searchParams = useSearchParams();
    const initialMessage = searchParams.get('q');
@@ -533,7 +537,12 @@
    chatHistory.current = chatHistory.current.slice(0, index * 2);
    const messageToRewrite = messages[index];
-   sendMessage(messageToRewrite.query, messageToRewrite.messageId, true);
+   sendMessage(
+     messageToRewrite.query,
+     messageToRewrite.messageId,
+     true,
+     messageToRewrite.articleTitle,
+   );
  };
  useEffect(() => {
@@ -542,7 +551,8 @@
      toast.error('Cannot send message before the configuration is ready');
      return;
    }
-   sendMessage(initialMessage);
+   const articleTitle = searchParams.get('title') ?? undefined;
+   sendMessage(initialMessage, undefined, false, articleTitle);
  }
  // eslint-disable-next-line react-hooks/exhaustive-deps
  }, [isConfigReady, isReady, initialMessage]);
@@ -552,7 +562,11 @@
  return async (data: any) => {
    if (data.type === 'error') {
-     toast.error(data.data);
+     const msg =
+       typeof data.data === 'string'
+         ? data.data
+         : (data.data as { message?: string })?.message ?? 'Ошибка поиска';
+     toast.error(msg);
      setLoading(false);
      setMessages((prev) =>
        prev.map((msg) =>
@@ -688,7 +702,7 @@
    );
    if (hasSourceBlocks && !hasSuggestions) {
-     const suggestions = await getSuggestions(newHistory);
+     const suggestions = await getSuggestions(newHistory, localeCode);
      const suggestionBlock: Block = {
        id: crypto.randomBytes(7).toString('hex'),
        type: 'suggestion',
@@ -715,6 +729,7 @@
    message,
    messageId,
    rewrite = false,
+   articleTitle,
  ) => {
    if (loading || !message) return;
    setLoading(true);
@@ -736,6 +751,7 @@
      responseBlocks: [],
      status: 'answering',
      createdAt: new Date(),
+     ...(articleTitle && { articleTitle }),
    };
    setMessages((prevMessages) => [...prevMessages, newMessage]);
@@ -773,6 +789,7 @@
        providerId: embeddingModelProvider.providerId,
      },
      systemInstructions: localStorage.getItem('systemInstructions'),
+     locale: localeCode,
    }),
  });
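The error-handling change above has to cope with a stream that emits either a plain string or an object carrying a `message` field. Extracted as a standalone helper (the function name and the Russian fallback text mirror the diff; the helper itself is hypothetical):

```typescript
// Sketch of the error-payload normalization from the message handler above:
// accept a string as-is, pull .message off an object, otherwise fall back
// to the generic search-error copy used in the UI.
function normalizeErrorMessage(data: unknown): string {
  if (typeof data === 'string') return data;
  const msg = (data as { message?: string } | null)?.message;
  return msg ?? 'Ошибка поиска';
}
```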

View File

@@ -3,6 +3,9 @@
  * Resolves the locale from geolocation (via geo-device-service) and fetches translations.
  */
+ import { fetchContextWithGeolocation } from './geoDevice';
+ import { localeFromCountryCode } from './localization/countryToLocale';
  const LOCALE_API = '/api/locale';
  const TRANSLATIONS_API = '/api/translations';
@@ -49,17 +52,118 @@ export async function fetchLocaleWithClient(): Promise<LocalizationContext> {
    return res.json();
  }
+ /** IP providers return country_code: a fallback for when geo-device is unavailable */
+ async function fetchCountryFromIp(): Promise<string | null> {
+   const providers: Array<{
+     url: string;
+     getCountry: (d: Record<string, unknown>) => string | null;
+   }> = [
+     {
+       url: 'https://get.geojs.io/v1/ip/geo.json',
+       getCountry: (d) => {
+         const cc = d.country_code ?? d.country;
+         return typeof cc === 'string' ? cc.toUpperCase() : null;
+       },
+     },
+     {
+       url: 'https://ipwhois.app/json/',
+       getCountry: (d) => {
+         const cc = d.country_code ?? d.country;
+         return typeof cc === 'string' ? cc.toUpperCase() : null;
+       },
+     },
+   ];
+   for (const p of providers) {
+     try {
+       const res = await fetch(p.url);
+       const d = (await res.json()) as Record<string, unknown>;
+       const country = p.getCountry(d);
+       if (country) return country;
+     } catch {
+       /* try the next provider */
+     }
+   }
+   return null;
+ }
+ /**
+  * Resolve the locale with geo-context first (the same source the weather widget uses).
+  * If geo-context returns a countryCode, use it.
+  * Otherwise fall back to IP lookup (geojs, ipwhois), then to the localization service.
+  */
+ export async function fetchLocaleWithGeoFirst(): Promise<LocalizationContext> {
+   try {
+     const ctx = await fetchContextWithGeolocation();
+     const cc = ctx.geo?.countryCode;
+     if (cc) {
+       const locale = localeFromCountryCode(cc);
+       return {
+         locale,
+         language: locale,
+         region: ctx.geo?.region ?? null,
+         countryCode: cc,
+         timezone: ctx.geo?.timezone ?? null,
+         source: 'geo',
+       };
+     }
+   } catch {
+     /* geo-context unavailable */
+   }
+   try {
+     const countryCode = await fetchCountryFromIp();
+     if (countryCode) {
+       const locale = localeFromCountryCode(countryCode);
+       return {
+         locale,
+         language: locale,
+         region: null,
+         countryCode,
+         timezone: null,
+         source: 'geo',
+       };
+     }
+   } catch {
+     /* IP fallback unavailable */
+   }
+   try {
+     return await fetchLocaleWithClient();
+   } catch {
+     const nav =
+       typeof navigator !== 'undefined' ? navigator.language : undefined;
+     const fallback = nav ? nav.split('-')[0].toLowerCase() : 'en';
+     return {
+       locale: fallback,
+       language: fallback,
+       region: null,
+       countryCode: null,
+       timezone: null,
+       source: 'fallback',
+     };
+   }
+ }
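The resolution order added above (geo-context, then IP providers, then the localization service, then `navigator.language`) is one recurring pattern: try sources in order, swallow failures, and return the first usable value. A synchronous, self-contained sketch of that pattern:

```typescript
// Generic sketch of the fallback chain used by fetchLocaleWithGeoFirst
// and fetchCountryFromIp: each source is a thunk that may return null
// or throw; the first non-null result wins.
function firstNonNull<T>(sources: Array<() => T | null>): T | null {
  for (const source of sources) {
    try {
      const value = source();
      if (value !== null) return value; // first usable value wins
    } catch {
      // a throwing source simply falls through to the next one
    }
  }
  return null;
}
```

Keeping each source behind its own try/catch is what lets a dead geo service degrade to IP lookup instead of failing the whole locale resolution.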
+ import {
+   getEmbeddedTranslations,
+   translationLocaleFor,
+ } from './localization/embeddedTranslations';
  /**
   * Fetch translations for a locale.
+  * Falls back to the embedded translations when the API is unavailable.
   */
  export async function fetchTranslations(
    locale: string,
  ): Promise<Record<string, string>> {
+   const fetchLocale = translationLocaleFor(locale);
+   try {
      const res = await fetch(
-       `${TRANSLATIONS_API}/${encodeURIComponent(locale)}`,
+       `${TRANSLATIONS_API}/${encodeURIComponent(fetchLocale)}`,
      );
-     if (!res.ok) throw new Error('Failed to fetch translations');
-     return res.json();
+     if (res.ok) return res.json();
+   } catch {
+     /* API unavailable */
+   }
+   return getEmbeddedTranslations(locale);
  }
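`translationLocaleFor` (imported above) is not shown in this diff; a plausible sketch of its behavior, assuming it collapses region variants and maps unsupported locales to `en` before hitting the API. This is an assumption about the helper, not its actual implementation:

```typescript
// ASSUMED behavior of translationLocaleFor: normalize 'ru-RU' to 'ru'
// and fall back to 'en' when the base locale is not served.
// The real helper lives in embeddedTranslations.ts and may differ.
const SUPPORTED_TRANSLATION_LOCALES = new Set(['en', 'ru', 'de', 'fr', 'es', 'uk']);

function translationLocaleFor(locale: string): string {
  const base = locale.toLowerCase().split('-')[0];
  return SUPPORTED_TRANSLATION_LOCALES.has(base) ? base : 'en';
}
```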
  /**

View File

@@ -9,7 +9,7 @@ import {
    useState,
  } from 'react';
  import {
-   fetchLocaleWithClient,
+   fetchLocaleWithGeoFirst,
    fetchTranslations,
    type LocalizationContext as LocaleContext,
  } from '@/lib/localization';
@@ -37,9 +37,44 @@ const defaultTranslations: Translations = {
    'chat.send': 'Send',
    'chat.newChat': 'New Chat',
    'chat.suggestions': 'Suggestions',
+   'discover.title': 'Discover',
+   'discover.region': 'Region',
+   'weather.title': 'Weather',
+   'weather.feelsLike': 'Feels like',
    'common.loading': 'Loading...',
    'common.error': 'Error',
    'common.retry': 'Retry',
+   'nav.home': 'Home',
+   'nav.discover': 'Discover',
+   'nav.library': 'Library',
+   'chat.related': 'Follow-up questions',
+   'chat.searchImages': 'Search images',
+   'chat.searchVideos': 'Search videos',
+   'chat.video': 'Video',
+   'chat.uploadedFile': 'Uploaded File',
+   'chat.viewMore': 'View {count} more',
+   'chat.sources': 'Sources',
+   'chat.researchProgress': 'Research Progress',
+   'chat.step': 'step',
+   'chat.steps': 'steps',
+   'chat.answer': 'Answer',
+   'chat.brainstorming': 'Brainstorming...',
+   'chat.formingAnswer': 'Forming answer...',
+   'chat.answerFailed': 'Could not form an answer from the sources found. Try rephrasing or shortening your query.',
+   'chat.thinking': 'Thinking',
+   'chat.searchingQueries': 'Searching {count} {plural}',
+   'chat.foundResults': 'Found {count} {plural}',
+   'chat.readingSources': 'Reading {count} {plural}',
+   'chat.scanningDocs': 'Scanning your uploaded documents',
+   'chat.readingDocs': 'Reading {count} {plural}',
+   'chat.processing': 'Processing',
+   'chat.query': 'query',
+   'chat.queries': 'queries',
+   'chat.result': 'result',
+   'chat.results': 'results',
+   'chat.source': 'source',
+   'chat.document': 'document',
+   'chat.documents': 'documents',
  };
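The `{count}` and `{plural}` placeholders in the strings above imply a small interpolation step somewhere in the `t()` helper. A sketch of what that substitution presumably looks like (the helper itself is outside this diff):

```typescript
// Hypothetical placeholder interpolation for templates like
// 'Searching {count} {plural}': known keys are substituted, unknown
// placeholders are left intact so missing params are visible.
function interpolate(
  template: string,
  params: Record<string, string | number>,
): string {
  return template.replace(/\{(\w+)\}/g, (match, key) =>
    key in params ? String(params[key]) : match,
  );
}
```

Leaving unknown placeholders untouched makes a missing parameter show up literally in the UI instead of silently dropping text.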
  const LocalizationContext = createContext<LocalizationContextValue | null>(null);
@@ -60,9 +95,16 @@ export function LocalizationProvider({
    let cancelled = false;
    async function load() {
      try {
-       const loc = await fetchLocaleWithClient();
+       const loc = await fetchLocaleWithGeoFirst();
        if (cancelled) return;
-       const trans = await fetchTranslations(loc.locale);
+       let trans: Record<string, string>;
+       try {
+         trans = await fetchTranslations(loc.locale);
+       } catch {
+         trans = (await import('./embeddedTranslations')).getEmbeddedTranslations(
+           loc.locale,
+         );
+       }
        if (cancelled) return;
        setState({
          locale: loc,

View File

@@ -0,0 +1,61 @@
/**
 * Maps ISO 3166-1 alpha-2 country codes to locales (mirrors localization-service).
 * Used when localization-service is unavailable but geo-context (weather) works.
*/
const COUNTRY_TO_LOCALE: Record<string, string> = {
RU: 'ru',
UA: 'uk',
BY: 'be',
KZ: 'kk',
US: 'en',
GB: 'en',
AU: 'en',
CA: 'en',
DE: 'de',
AT: 'de',
CH: 'de',
FR: 'fr',
BE: 'fr',
ES: 'es',
MX: 'es',
AR: 'es',
IT: 'it',
PT: 'pt',
BR: 'pt',
PL: 'pl',
NL: 'nl',
SE: 'sv',
NO: 'nb',
DK: 'da',
FI: 'fi',
CZ: 'cs',
SK: 'sk',
HU: 'hu',
RO: 'ro',
BG: 'bg',
HR: 'hr',
RS: 'sr',
GR: 'el',
TR: 'tr',
JP: 'ja',
CN: 'zh',
TW: 'zh-TW',
KR: 'ko',
IN: 'hi',
TH: 'th',
VN: 'vi',
ID: 'id',
MY: 'ms',
SA: 'ar',
AE: 'ar',
EG: 'ar',
IL: 'he',
IR: 'fa',
};
export function localeFromCountryCode(
countryCode: string | null | undefined,
): string {
if (!countryCode) return 'en';
return COUNTRY_TO_LOCALE[countryCode.toUpperCase()] ?? 'en';
}
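The function above reduces to an uppercase dictionary lookup with an `en` default. A self-contained sketch with a trimmed mapping, mirroring that behavior:

```typescript
// Trimmed re-statement of localeFromCountryCode for illustration;
// the full COUNTRY_TO_LOCALE table is defined above.
const COUNTRY_TO_LOCALE: Record<string, string> = {
  RU: 'ru',
  DE: 'de',
  US: 'en',
  BR: 'pt',
};

function localeFromCountryCode(countryCode: string | null | undefined): string {
  if (!countryCode) return 'en'; // no geo signal: default to English
  return COUNTRY_TO_LOCALE[countryCode.toUpperCase()] ?? 'en';
}
```

Uppercasing the input makes the lookup tolerant of providers that return `ru` instead of `RU`.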

View File

@@ -0,0 +1,310 @@
/**
 * Embedded translations: the fallback used when localization-service is unavailable.
 * Keep in sync with apps/localization-service/src/translations/index.ts
*/
export const embeddedTranslations: Record<
string,
Record<string, string>
> = {
en: {
'app.title': 'GooSeek',
'app.searchPlaceholder': 'Search...',
'empty.subtitle': 'Research begins here.',
'input.askAnything': 'Ask anything...',
'input.askFollowUp': 'Ask a follow-up',
'chat.send': 'Send',
'chat.newChat': 'New Chat',
'chat.suggestions': 'Suggestions',
'discover.title': 'Discover',
'discover.region': 'Region',
'weather.title': 'Weather',
'weather.feelsLike': 'Feels like',
'common.loading': 'Loading...',
'common.error': 'Error',
'common.retry': 'Retry',
'nav.home': 'Home',
'nav.discover': 'Discover',
'nav.library': 'Library',
'chat.related': 'Follow-up questions',
'chat.searchImages': 'Search images',
'chat.searchVideos': 'Search videos',
'chat.video': 'Video',
'chat.uploadedFile': 'Uploaded File',
'chat.viewMore': 'View {count} more',
'chat.sources': 'Sources',
'chat.researchProgress': 'Research Progress',
'chat.step': 'step',
'chat.steps': 'steps',
'chat.answer': 'Answer',
'chat.brainstorming': 'Brainstorming...',
'chat.formingAnswer': 'Forming answer...',
'chat.answerFailed': 'Could not form an answer from the sources found. Try rephrasing or shortening your query.',
'chat.thinking': 'Thinking',
'chat.searchingQueries': 'Searching {count} {plural}',
'chat.foundResults': 'Found {count} {plural}',
'chat.readingSources': 'Reading {count} {plural}',
'chat.scanningDocs': 'Scanning your uploaded documents',
'chat.readingDocs': 'Reading {count} {plural}',
'chat.processing': 'Processing',
'chat.query': 'query',
'chat.queries': 'queries',
'chat.result': 'result',
'chat.results': 'results',
'chat.source': 'source',
'chat.document': 'document',
'chat.documents': 'documents',
},
ru: {
'app.title': 'GooSeek',
'app.searchPlaceholder': 'Поиск...',
'empty.subtitle': 'Исследование начинается здесь.',
'input.askAnything': 'Спросите что угодно...',
'input.askFollowUp': 'Задайте уточняющий вопрос',
'chat.send': 'Отправить',
'chat.newChat': 'Новый чат',
'chat.suggestions': 'Подсказки',
'discover.title': 'Обзор',
'discover.region': 'Регион',
'weather.title': 'Погода',
'weather.feelsLike': 'Ощущается как',
'common.loading': 'Загрузка...',
'common.error': 'Ошибка',
'common.retry': 'Повторить',
'nav.home': 'Главная',
'nav.discover': 'Обзор',
'nav.library': 'Библиотека',
'chat.related': 'Что ещё спросить',
'chat.searchImages': 'Поиск изображений',
'chat.searchVideos': 'Поиск видео',
'chat.video': 'Видео',
'chat.uploadedFile': 'Загруженный файл',
'chat.viewMore': 'Ещё {count}',
'chat.sources': 'Источники',
'chat.researchProgress': 'Прогресс исследования',
'chat.step': 'шаг',
'chat.steps': 'шагов',
'chat.answer': 'Ответ',
'chat.brainstorming': 'Размышляю...',
'chat.formingAnswer': 'Формирование ответа...',
'chat.answerFailed': 'Не удалось сформировать ответ на основе найденных источников. Попробуйте переформулировать или сократить запрос.',
'chat.thinking': 'Размышляю',
'chat.searchingQueries': 'Поиск по {count} {plural}',
'chat.foundResults': 'Найдено {count} {plural}',
'chat.readingSources': 'Чтение {count} {plural}',
'chat.scanningDocs': 'Сканирование загруженных документов',
'chat.readingDocs': 'Чтение {count} {plural}',
'chat.processing': 'Обработка',
'chat.query': 'запросу',
'chat.queries': 'запросам',
'chat.result': 'результату',
'chat.results': 'результатов',
'chat.source': 'источнику',
'chat.document': 'документу',
'chat.documents': 'документов',
},
de: {
'app.title': 'GooSeek',
'app.searchPlaceholder': 'Suchen...',
'empty.subtitle': 'Die Recherche beginnt hier.',
'input.askAnything': 'Fragen Sie alles...',
'input.askFollowUp': 'Folgefrage stellen',
'chat.send': 'Senden',
'chat.newChat': 'Neuer Chat',
'chat.suggestions': 'Vorschläge',
'discover.title': 'Entdecken',
'discover.region': 'Region',
'weather.title': 'Wetter',
'weather.feelsLike': 'Gefühlt wie',
'common.loading': 'Laden...',
'common.error': 'Fehler',
'common.retry': 'Wiederholen',
'nav.home': 'Start',
'nav.discover': 'Entdecken',
'nav.library': 'Bibliothek',
'chat.related': 'Weitere Fragen',
'chat.searchImages': 'Bilder suchen',
'chat.searchVideos': 'Videos suchen',
'chat.video': 'Video',
'chat.uploadedFile': 'Hochgeladene Datei',
'chat.viewMore': '{count} weitere',
'chat.sources': 'Quellen',
'chat.researchProgress': 'Forschungsfortschritt',
'chat.step': 'Schritt',
'chat.steps': 'Schritte',
'chat.answer': 'Antwort',
'chat.brainstorming': 'Brainstorming...',
'chat.formingAnswer': 'Antwort wird formuliert...',
'chat.answerFailed': 'Antwort konnte nicht aus den gefundenen Quellen erstellt werden. Versuchen Sie es anders zu formulieren.',
'chat.thinking': 'Nachdenken',
'chat.searchingQueries': 'Suche {count} {plural}',
'chat.foundResults': '{count} {plural} gefunden',
'chat.readingSources': 'Lese {count} {plural}',
'chat.scanningDocs': 'Durchsuche hochgeladene Dokumente',
'chat.readingDocs': 'Lese {count} {plural}',
'chat.processing': 'Verarbeitung',
'chat.query': 'Abfrage',
'chat.queries': 'Abfragen',
'chat.result': 'Ergebnis',
'chat.results': 'Ergebnisse',
'chat.source': 'Quelle',
'chat.document': 'Dokument',
'chat.documents': 'Dokumente',
},
fr: {
'app.title': 'GooSeek',
'app.searchPlaceholder': 'Rechercher...',
'empty.subtitle': 'La recherche commence ici.',
'input.askAnything': 'Posez une question...',
'input.askFollowUp': 'Poser une question de suivi',
'chat.send': 'Envoyer',
'chat.newChat': 'Nouveau chat',
'chat.suggestions': 'Suggestions',
'discover.title': 'Découvrir',
'discover.region': 'Région',
'weather.title': 'Météo',
'weather.feelsLike': 'Ressenti',
'common.loading': 'Chargement...',
'common.error': 'Erreur',
'common.retry': 'Réessayer',
'nav.home': 'Accueil',
'nav.discover': 'Découvrir',
'nav.library': 'Bibliothèque',
'chat.related': 'Questions de suivi',
'chat.searchImages': 'Rechercher des images',
'chat.searchVideos': 'Rechercher des vidéos',
'chat.video': 'Vidéo',
'chat.uploadedFile': 'Fichier téléchargé',
'chat.viewMore': '{count} de plus',
'chat.sources': 'Sources',
'chat.researchProgress': 'Progression de la recherche',
'chat.step': 'étape',
'chat.steps': 'étapes',
'chat.answer': 'Réponse',
'chat.brainstorming': 'Brainstorming...',
'chat.formingAnswer': 'Formulation de la réponse...',
'chat.answerFailed': 'Impossible de formuler une réponse à partir des sources trouvées. Essayez de reformuler.',
'chat.thinking': 'Réflexion',
'chat.searchingQueries': 'Recherche de {count} {plural}',
'chat.foundResults': '{count} {plural} trouvés',
'chat.readingSources': 'Lecture de {count} {plural}',
'chat.scanningDocs': 'Analyse des documents téléchargés',
'chat.readingDocs': 'Lecture de {count} {plural}',
'chat.processing': 'Traitement',
'chat.query': 'requête',
'chat.queries': 'requêtes',
'chat.result': 'résultat',
'chat.results': 'résultats',
'chat.source': 'source',
'chat.document': 'document',
'chat.documents': 'documents',
},
es: {
'app.title': 'GooSeek',
'app.searchPlaceholder': 'Buscar...',
'empty.subtitle': 'La investigación comienza aquí.',
'input.askAnything': 'Pregunte lo que sea...',
'input.askFollowUp': 'Hacer una pregunta de seguimiento',
'chat.send': 'Enviar',
'chat.newChat': 'Nuevo chat',
'chat.suggestions': 'Sugerencias',
'discover.title': 'Descubrir',
'discover.region': 'Región',
'weather.title': 'Clima',
'weather.feelsLike': 'Sensación térmica',
'common.loading': 'Cargando...',
'common.error': 'Error',
'common.retry': 'Reintentar',
'nav.home': 'Inicio',
'nav.discover': 'Descubrir',
'nav.library': 'Biblioteca',
'chat.related': 'Preguntas de seguimiento',
'chat.searchImages': 'Buscar imágenes',
'chat.searchVideos': 'Buscar videos',
'chat.video': 'Video',
'chat.uploadedFile': 'Archivo subido',
'chat.viewMore': '{count} más',
'chat.sources': 'Fuentes',
'chat.researchProgress': 'Progreso de investigación',
'chat.step': 'paso',
'chat.steps': 'pasos',
'chat.answer': 'Respuesta',
'chat.brainstorming': 'Lluvia de ideas...',
'chat.formingAnswer': 'Formando respuesta...',
'chat.answerFailed': 'No se pudo formar una respuesta a partir de las fuentes encontradas. Intente reformular.',
'chat.thinking': 'Pensando',
'chat.searchingQueries': 'Buscando {count} {plural}',
'chat.foundResults': 'Encontrados {count} {plural}',
'chat.readingSources': 'Leyendo {count} {plural}',
'chat.scanningDocs': 'Escaneando documentos subidos',
'chat.readingDocs': 'Leyendo {count} {plural}',
'chat.processing': 'Procesando',
'chat.query': 'consulta',
'chat.queries': 'consultas',
'chat.result': 'resultado',
'chat.results': 'resultados',
'chat.source': 'fuente',
'chat.document': 'documento',
'chat.documents': 'documentos',
},
uk: {
'app.title': 'GooSeek',
'app.searchPlaceholder': 'Пошук...',
'empty.subtitle': 'Дослідження починається тут.',
'input.askAnything': 'Запитайте що завгодно...',
'input.askFollowUp': 'Задайте уточнююче питання',
'chat.send': 'Надіслати',
'chat.newChat': 'Новий чат',
'chat.suggestions': 'Підказки',
'discover.title': 'Огляд',
'discover.region': 'Регіон',
'weather.title': 'Погода',
'weather.feelsLike': 'Відчувається як',
'common.loading': 'Завантаження...',
'common.error': 'Помилка',
'common.retry': 'Повторити',
'nav.home': 'Головна',
'nav.discover': 'Огляд',
'nav.library': 'Бібліотека',
'chat.related': 'Що ще запитати',
'chat.searchImages': 'Пошук зображень',
'chat.searchVideos': 'Пошук відео',
'chat.video': 'Відео',
'chat.uploadedFile': 'Завантажений файл',
'chat.viewMore': 'Ще {count}',
'chat.sources': 'Джерела',
'chat.researchProgress': 'Прогрес дослідження',
'chat.step': 'крок',
'chat.steps': 'кроків',
'chat.answer': 'Відповідь',
'chat.brainstorming': 'Роздуми...',
'chat.formingAnswer': 'Формування відповіді...',
'chat.answerFailed': 'Не вдалося сформувати відповідь на основі знайдених джерел. Спробуйте переформулювати.',
'chat.thinking': 'Роздуми',
'chat.searchingQueries': 'Пошук за {count} {plural}',
'chat.foundResults': 'Знайдено {count} {plural}',
'chat.readingSources': 'Читання {count} {plural}',
'chat.scanningDocs': 'Сканування завантажених документів',
'chat.readingDocs': 'Читання {count} {plural}',
'chat.processing': 'Обробка',
'chat.query': 'запиту',
'chat.queries': 'запитам',
'chat.result': 'результату',
'chat.results': 'результатів',
'chat.source': 'джерелу',
'chat.document': 'документу',
'chat.documents': 'документів',
},
};
/** Locale used to request translations (be, kk → ru, since we don't store separate translations for them) */
export function translationLocaleFor(locale: string): string {
const base = locale.split('-')[0]?.toLowerCase() ?? 'en';
if (base === 'be' || base === 'kk') return 'ru';
return base;
}
export function getEmbeddedTranslations(locale: string): Record<string, string> {
const key = translationLocaleFor(locale);
return embeddedTranslations[key] ?? embeddedTranslations.en;
}
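The fallback chain above can be sketched in a self-contained form. This is a minimal model of `translationLocaleFor` and `getEmbeddedTranslations` with the dictionaries trimmed to two keys for illustration; the real dictionaries hold the full key sets shown earlier.

```typescript
// Trimmed-down dictionaries: real code holds full per-language key sets.
const embeddedTranslations: Record<string, Record<string, string>> = {
  en: { 'chat.sources': 'Sources' },
  ru: { 'chat.sources': 'Источники' },
};

// Belarusian and Kazakh have no dedicated dictionaries, so they reuse Russian.
function translationLocaleFor(locale: string): string {
  const base = locale.split('-')[0]?.toLowerCase() ?? 'en';
  if (base === 'be' || base === 'kk') return 'ru';
  return base;
}

// Unknown locales fall back to English.
function getEmbeddedTranslations(locale: string): Record<string, string> {
  return embeddedTranslations[translationLocaleFor(locale)] ?? embeddedTranslations.en;
}

console.log(getEmbeddedTranslations('be-BY')['chat.sources']); // "Источники"
console.log(getEmbeddedTranslations('fr-FR')['chat.sources']); // "Sources" (en fallback here)
```

Note that in this trimmed sketch `fr` falls back to English only because the mini-dictionary omits it; the real module ships a full French dictionary.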

View File

@@ -2,6 +2,7 @@ import { ModelProviderUISection } from '@/lib/config/types';
 import { ProviderConstructor } from '../base/provider';
 import OpenAIProvider from './openai';
 import OllamaProvider from './ollama';
+import TimewebProvider from './timeweb';
 import GeminiProvider from './gemini';
 import TransformersProvider from './transformers';
 import GroqProvider from './groq';
@@ -12,6 +13,7 @@ import LMStudioProvider from './lmstudio';
 export const providers: Record<string, ProviderConstructor<any>> = {
   openai: OpenAIProvider,
   ollama: OllamaProvider,
+  timeweb: TimewebProvider,
   gemini: GeminiProvider,
   transformers: TransformersProvider,
   groq: GroqProvider,

View File

@@ -9,6 +9,7 @@ import OllamaEmbedding from './ollamaEmbedding';
 interface OllamaConfig {
   baseURL: string;
+  embeddingBaseURL?: string;
 }

 const providerConfigFields: UIConfigField[] = [
@@ -31,35 +32,33 @@ class OllamaProvider extends BaseModelProvider<OllamaConfig> {
     super(id, name, config);
   }

+  private async fetchModels(baseURL: string): Promise<Model[]> {
+    const res = await fetch(`${baseURL}/api/tags`, {
+      method: 'GET',
+      headers: { 'Content-type': 'application/json' },
+    });
+    const data = await res.json();
+    return (data.models ?? []).map((m: { name?: string; model?: string }) => ({
+      name: m.model ?? m.name ?? '',
+      key: m.model ?? m.name ?? '',
+    }));
+  }
+
   async getDefaultModels(): Promise<ModelList> {
     try {
-      const res = await fetch(`${this.config.baseURL}/api/tags`, {
-        method: 'GET',
-        headers: {
-          'Content-type': 'application/json',
-        },
-      });
-
-      const data = await res.json();
-
-      const models: Model[] = data.models.map((m: any) => {
-        return {
-          name: m.name,
-          key: m.model,
-        };
-      });
-
-      return {
-        embedding: models,
-        chat: models,
-      };
+      const [chatModels, embeddingModels] = await Promise.all([
+        this.fetchModels(this.config.baseURL),
+        this.fetchModels(
+          this.config.embeddingBaseURL ?? this.config.baseURL,
+        ),
+      ]);
+      return { chat: chatModels, embedding: embeddingModels };
     } catch (err) {
       if (err instanceof TypeError) {
         throw new Error(
           'Error connecting to Ollama API. Please ensure the base URL is correct and the Ollama server is running.',
         );
       }
       throw err;
     }
   }
@@ -106,18 +105,22 @@ class OllamaProvider extends BaseModelProvider<OllamaConfig> {
     return new OllamaEmbedding({
       model: key,
-      baseURL: this.config.baseURL,
+      baseURL: this.config.embeddingBaseURL ?? this.config.baseURL,
     });
   }

-  static parseAndValidate(raw: any): OllamaConfig {
+  static parseAndValidate(raw: unknown): OllamaConfig {
     if (!raw || typeof raw !== 'object')
       throw new Error('Invalid config provided. Expected object');
-    if (!raw.baseURL)
+    const obj = raw as Record<string, unknown>;
+    if (!obj.baseURL)
       throw new Error('Invalid config provided. Base URL must be provided');

     return {
-      baseURL: String(raw.baseURL),
+      baseURL: String(obj.baseURL),
+      embeddingBaseURL: obj.embeddingBaseURL
+        ? String(obj.embeddingBaseURL)
+        : undefined,
     };
   }
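The split chat/embedding endpoint above comes down to one nullish-coalescing fallback. A minimal sketch of that resolution, with the helper name `resolveEmbeddingURL` being illustrative only:

```typescript
interface OllamaConfig {
  baseURL: string;
  // Optional second host serving only the embedding model
  // (maps to OLLAMA_EMBEDDING_BASE_URL in .env.example).
  embeddingBaseURL?: string;
}

// The dedicated embedding host wins when set; otherwise
// the shared chat host handles embeddings too.
function resolveEmbeddingURL(config: OllamaConfig): string {
  return config.embeddingBaseURL ?? config.baseURL;
}

const oneServer: OllamaConfig = { baseURL: 'http://localhost:11434' };
const twoServers: OllamaConfig = {
  baseURL: 'http://localhost:11434',
  embeddingBaseURL: 'http://embedding-host:11434',
};

console.log(resolveEmbeddingURL(oneServer));  // http://localhost:11434
console.log(resolveEmbeddingURL(twoServers)); // http://embedding-host:11434
```

Because the fallback is resolved at call time in both `getDefaultModels` and `loadEmbeddingModel`, a single-server setup needs no extra configuration.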

View File

@@ -0,0 +1,172 @@
import { UIConfigField } from '@/lib/config/types';
import BaseModelProvider from '../../base/provider';
import { Model, ModelList, ProviderMetadata } from '../../types';
import BaseLLM from '../../base/llm';
import BaseEmbedding from '../../base/embedding';
import TimewebLLM from './timewebLLM';
interface TimewebConfig {
baseURL: string;
agentAccessId: string;
apiKey: string;
model: string;
xProxySource?: string;
}
const providerConfigFields: UIConfigField[] = [
{
type: 'string',
name: 'API Base URL',
key: 'baseURL',
description: 'Timeweb Cloud AI API base URL',
required: true,
placeholder: 'https://api.timeweb.cloud',
env: 'TIMEWEB_API_BASE_URL',
scope: 'server',
},
{
type: 'string',
name: 'Agent Access ID',
key: 'agentAccessId',
description: 'Agent access ID from Timeweb Cloud AI',
required: true,
placeholder: '',
env: 'TIMEWEB_AGENT_ACCESS_ID',
scope: 'server',
},
{
type: 'password',
name: 'API Key',
key: 'apiKey',
description: 'Bearer token for Timeweb Cloud AI',
required: true,
placeholder: '',
env: 'TIMEWEB_API_KEY',
scope: 'server',
},
{
type: 'string',
name: 'Model',
key: 'model',
description: 'Model key (e.g. gpt-4)',
required: true,
placeholder: 'gpt-4',
env: 'LLM_CHAT_MODEL',
scope: 'server',
},
{
type: 'string',
name: 'X-Proxy-Source',
key: 'xProxySource',
description: 'Optional header for Timeweb API',
required: false,
placeholder: '',
env: 'TIMEWEB_X_PROXY_SOURCE',
scope: 'server',
},
];
class TimewebProvider extends BaseModelProvider<TimewebConfig> {
constructor(id: string, name: string, config: TimewebConfig) {
super(id, name, config);
}
getTimewebBaseURL(): string {
const base = this.config.baseURL.replace(/\/$/, '');
return `${base}/api/v1/cloud-ai/agents/${this.config.agentAccessId}/v1`;
}
async getDefaultModels(): Promise<ModelList> {
try {
const url = `${this.getTimewebBaseURL()}/models`;
const res = await fetch(url, {
headers: {
Authorization: `Bearer ${this.config.apiKey}`,
'Content-Type': 'application/json',
...(this.config.xProxySource && {
'x-proxy-source': this.config.xProxySource,
}),
},
});
if (!res.ok) {
throw new Error(`Timeweb API error: ${res.status} ${res.statusText}`);
}
const data = (await res.json()) as { data?: { id: string }[] };
const models: Model[] = (data.data ?? []).map((m) => ({
name: m.id,
key: m.id,
}));
// If the API returned an empty list, fall back to the model from the config
const chat =
models.length > 0
? models
: [{ name: this.config.model, key: this.config.model }];
return {
chat,
embedding: [],
};
} catch (err) {
return {
chat: [{ name: this.config.model, key: this.config.model }],
embedding: [],
};
}
}
async getModelList(): Promise<ModelList> {
return this.getDefaultModels();
}
async loadChatModel(key: string): Promise<BaseLLM<unknown>> {
return new TimewebLLM({
apiKey: this.config.apiKey,
baseURL: this.getTimewebBaseURL(),
model: key,
defaultHeaders: this.config.xProxySource
? { 'x-proxy-source': this.config.xProxySource }
: undefined,
});
}
async loadEmbeddingModel(_key: string): Promise<BaseEmbedding<unknown>> {
throw new Error(
'Timeweb Cloud AI does not provide embedding models. Use Ollama for embeddings.',
);
}
static parseAndValidate(raw: unknown): TimewebConfig {
if (!raw || typeof raw !== 'object')
throw new Error('Invalid config provided. Expected object');
const obj = raw as Record<string, unknown>;
if (!obj.baseURL || !obj.agentAccessId || !obj.apiKey)
throw new Error(
'Invalid config. baseURL, agentAccessId and apiKey are required',
);
return {
baseURL: String(obj.baseURL),
agentAccessId: String(obj.agentAccessId),
apiKey: String(obj.apiKey),
model: String(obj.model || 'gpt-4'),
xProxySource: obj.xProxySource ? String(obj.xProxySource) : undefined,
};
}
static getProviderConfigFields(): UIConfigField[] {
return providerConfigFields;
}
static getProviderMetadata(): ProviderMetadata {
return {
key: 'timeweb',
name: 'Timeweb Cloud AI',
};
}
}
export default TimewebProvider;
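The agent endpoint built by `getTimewebBaseURL` is the piece most worth double-checking, since every request is routed through it. A standalone sketch of the same construction (the function name here is illustrative; the path segments match the provider code above):

```typescript
// Build the OpenAI-compatible base URL for a Timeweb Cloud AI agent.
// A trailing slash on the configured base URL is stripped first,
// so the path never contains a double slash.
function timewebAgentURL(baseURL: string, agentAccessId: string): string {
  const base = baseURL.replace(/\/$/, '');
  return `${base}/api/v1/cloud-ai/agents/${agentAccessId}/v1`;
}

console.log(timewebAgentURL('https://api.timeweb.cloud/', 'agent-123'));
// https://api.timeweb.cloud/api/v1/cloud-ai/agents/agent-123/v1
```

The resulting URL is then handed to the OpenAI SDK client as `baseURL`, which is why `/models` and chat completions work against it unchanged.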

View File

@@ -0,0 +1,265 @@
import OpenAI from 'openai';
import BaseLLM from '../../base/llm';
import { zodTextFormat, zodResponseFormat } from 'openai/helpers/zod';
import {
GenerateObjectInput,
GenerateOptions,
GenerateTextInput,
GenerateTextOutput,
StreamTextOutput,
} from '../../types';
import { parse } from 'partial-json';
import z from 'zod';
import {
ChatCompletionAssistantMessageParam,
ChatCompletionMessageParam,
ChatCompletionTool,
ChatCompletionToolMessageParam,
} from 'openai/resources/index.mjs';
import { Message } from '@/lib/types';
import { repairJson } from '@toolsycc/json-repair';
type TimewebConfig = {
apiKey: string;
baseURL: string;
model: string;
options?: GenerateOptions;
defaultHeaders?: Record<string, string>;
};
class TimewebLLM extends BaseLLM<TimewebConfig> {
openAIClient: OpenAI;
constructor(protected config: TimewebConfig) {
super(config);
this.openAIClient = new OpenAI({
apiKey: this.config.apiKey,
baseURL: this.config.baseURL,
defaultHeaders: this.config.defaultHeaders,
});
}
convertToOpenAIMessages(messages: Message[]): ChatCompletionMessageParam[] {
return messages.map((msg) => {
if (msg.role === 'tool') {
return {
role: 'tool',
tool_call_id: msg.id,
content: msg.content,
} as ChatCompletionToolMessageParam;
} else if (msg.role === 'assistant') {
return {
role: 'assistant',
content: msg.content,
...(msg.tool_calls &&
msg.tool_calls.length > 0 && {
tool_calls: msg.tool_calls?.map((tc) => ({
id: tc.id,
type: 'function' as const,
function: {
name: tc.name,
arguments: JSON.stringify(tc.arguments),
},
})),
}),
} as ChatCompletionAssistantMessageParam;
}
return msg;
});
}
async generateText(input: GenerateTextInput): Promise<GenerateTextOutput> {
const openaiTools: ChatCompletionTool[] = [];
input.tools?.forEach((tool) => {
openaiTools.push({
type: 'function',
function: {
name: tool.name,
description: tool.description,
parameters: z.toJSONSchema(tool.schema),
},
});
});
const response = await this.openAIClient.chat.completions.create({
model: this.config.model,
tools: openaiTools.length > 0 ? openaiTools : undefined,
messages: this.convertToOpenAIMessages(input.messages),
temperature:
input.options?.temperature ?? this.config.options?.temperature ?? 1.0,
top_p: input.options?.topP ?? this.config.options?.topP,
max_completion_tokens:
input.options?.maxTokens ?? this.config.options?.maxTokens,
stop: input.options?.stopSequences ?? this.config.options?.stopSequences,
frequency_penalty:
input.options?.frequencyPenalty ??
this.config.options?.frequencyPenalty,
presence_penalty:
input.options?.presencePenalty ?? this.config.options?.presencePenalty,
});
if (response.choices && response.choices.length > 0) {
return {
content: response.choices[0].message.content ?? '',
toolCalls:
response.choices[0].message.tool_calls
?.map((tc) => {
if (tc.type === 'function') {
return {
name: tc.function.name,
id: tc.id!,
arguments: JSON.parse(tc.function.arguments ?? '{}'),
};
}
return undefined;
})
.filter((tc): tc is NonNullable<typeof tc> => tc !== undefined) ?? [],
additionalInfo: {
finishReason: response.choices[0].finish_reason ?? undefined,
},
};
}
throw new Error('No response from Timeweb');
}
async *streamText(
input: GenerateTextInput,
): AsyncGenerator<StreamTextOutput> {
const openaiTools: ChatCompletionTool[] = [];
input.tools?.forEach((tool) => {
openaiTools.push({
type: 'function',
function: {
name: tool.name,
description: tool.description,
parameters: z.toJSONSchema(tool.schema),
},
});
});
let stream;
try {
stream = await this.openAIClient.chat.completions.create({
model: this.config.model,
messages: this.convertToOpenAIMessages(input.messages),
tools: openaiTools.length > 0 ? openaiTools : undefined,
temperature:
input.options?.temperature ?? this.config.options?.temperature ?? 1.0,
top_p: input.options?.topP ?? this.config.options?.topP,
max_completion_tokens:
input.options?.maxTokens ?? this.config.options?.maxTokens,
stop: input.options?.stopSequences ?? this.config.options?.stopSequences,
frequency_penalty:
input.options?.frequencyPenalty ??
this.config.options?.frequencyPenalty,
presence_penalty:
input.options?.presencePenalty ?? this.config.options?.presencePenalty,
stream: true,
});
} catch (err: unknown) {
const e = err as {
status?: number;
error?: { message?: string; code?: string };
response?: { status?: number; body?: unknown };
};
const details = [
e?.status != null && `status=${e.status}`,
e?.error?.message && `error=${e.error.message}`,
e?.error?.code && `code=${e.error.code}`,
e?.response?.body != null &&
`body=${JSON.stringify(e.response.body).slice(0, 300)}`,
]
.filter(Boolean)
.join(', ');
console.error(
`[Timeweb] streamText failed: ${details || String(err)}`,
err,
);
throw err;
}
const receivedToolCalls: { name: string; id: string; arguments: string }[] =
[];
for await (const chunk of stream) {
if (chunk.choices && chunk.choices.length > 0) {
const toolCalls = chunk.choices[0].delta.tool_calls;
yield {
contentChunk: chunk.choices[0].delta.content ?? '',
toolCallChunk:
toolCalls?.map((tc) => {
if (!receivedToolCalls[tc.index!]) {
const call = {
name: tc.function?.name ?? '',
id: tc.id ?? '',
arguments: tc.function?.arguments ?? '',
};
receivedToolCalls[tc.index!] = call;
return { ...call, arguments: parse(call.arguments || '{}') };
} else {
const existingCall = receivedToolCalls[tc.index!];
existingCall.arguments += tc.function?.arguments ?? '';
return {
...existingCall,
arguments: parse(existingCall.arguments),
};
}
}) || [],
done: chunk.choices[0].finish_reason !== null,
additionalInfo: {
finishReason: chunk.choices[0].finish_reason ?? undefined,
},
};
}
}
}
async generateObject<T>(input: GenerateObjectInput): Promise<T> {
const response = await this.openAIClient.chat.completions.create({
messages: this.convertToOpenAIMessages(input.messages),
model: this.config.model,
temperature:
input.options?.temperature ?? this.config.options?.temperature ?? 1.0,
top_p: input.options?.topP ?? this.config.options?.topP,
max_completion_tokens:
input.options?.maxTokens ?? this.config.options?.maxTokens,
stop: input.options?.stopSequences ?? this.config.options?.stopSequences,
frequency_penalty:
input.options?.frequencyPenalty ??
this.config.options?.frequencyPenalty,
presence_penalty:
input.options?.presencePenalty ?? this.config.options?.presencePenalty,
response_format: zodResponseFormat(input.schema, 'object'),
});
if (response.choices && response.choices.length > 0) {
try {
return input.schema.parse(
JSON.parse(
repairJson(response.choices[0].message.content ?? '{}', {
extractJson: true,
}) as string,
),
) as T;
} catch (err) {
throw new Error(`Error parsing response from Timeweb: ${err}`);
}
}
throw new Error('No response from Timeweb');
}
async *streamObject<T>(
input: GenerateObjectInput,
): AsyncGenerator<Partial<T>> {
const result = await this.generateObject<T>(input);
yield result;
}
}
export default TimewebLLM;
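The trickiest part of `streamText` is reassembling tool-call arguments that arrive as JSON fragments spread across chunks, keyed by `index`. A simplified self-contained model of that accumulation (the real code additionally parses each partial state with the `partial-json` package; here we only parse the final concatenation, and all names are illustrative):

```typescript
// One streamed fragment of a tool call, as delivered by the chat
// completions delta: the index ties fragments of the same call together.
interface ToolCallChunk {
  index: number;
  id?: string;
  name?: string;
  argumentsFragment?: string;
}

function assembleToolCalls(chunks: ToolCallChunk[]) {
  const calls: { id: string; name: string; arguments: string }[] = [];
  for (const c of chunks) {
    if (!calls[c.index]) {
      // First fragment for this index carries id and name.
      calls[c.index] = {
        id: c.id ?? '',
        name: c.name ?? '',
        arguments: c.argumentsFragment ?? '',
      };
    } else {
      // Later fragments only extend the raw arguments string.
      calls[c.index].arguments += c.argumentsFragment ?? '';
    }
  }
  // Parse only once the full string has been accumulated.
  return calls.map((c) => ({ ...c, arguments: JSON.parse(c.arguments || '{}') }));
}

const result = assembleToolCalls([
  { index: 0, id: 'call_1', name: 'search', argumentsFragment: '{"query":' },
  { index: 0, argumentsFragment: '"paris weather"}' },
]);
console.log(result[0].arguments.query); // "paris weather"
```

Parsing per chunk (as the real code does via `partial-json`) lets the UI render tool calls while arguments are still streaming; parsing only at the end, as in this sketch, is the simpler but less responsive variant.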

View File

@@ -0,0 +1,52 @@
/** Maps a locale code to a language name for the LLM */
const LOCALE_TO_LANGUAGE: Record<string, string> = {
ru: 'Russian',
en: 'English',
de: 'German',
fr: 'French',
es: 'Spanish',
it: 'Italian',
pt: 'Portuguese',
uk: 'Ukrainian',
pl: 'Polish',
zh: 'Chinese',
ja: 'Japanese',
ko: 'Korean',
ar: 'Arabic',
tr: 'Turkish',
be: 'Belarusian',
kk: 'Kazakh',
sv: 'Swedish',
nb: 'Norwegian',
da: 'Danish',
fi: 'Finnish',
cs: 'Czech',
sk: 'Slovak',
hu: 'Hungarian',
ro: 'Romanian',
bg: 'Bulgarian',
hr: 'Croatian',
sr: 'Serbian',
el: 'Greek',
hi: 'Hindi',
th: 'Thai',
vi: 'Vietnamese',
id: 'Indonesian',
ms: 'Malay',
he: 'Hebrew',
fa: 'Persian',
};
/**
 * Returns an instruction for the LLM about the response language.
 * Append it to the system prompt of every request (classifier, researcher, writer).
 */
export function getLocaleInstruction(locale?: string): string {
if (!locale) return '';
const lang = locale.split('-')[0];
const languageName = LOCALE_TO_LANGUAGE[lang] ?? lang;
return `
<response_language>
User's locale is ${locale}. Always format your response in ${languageName}, regardless of the language of the query or search results. Even when the discussed content is in another language, respond in ${languageName}.
</response_language>`;
}
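The function above is small enough to verify in isolation. A self-contained sketch with the language map trimmed to two entries (the full map is listed above):

```typescript
// Trimmed map: the real module covers ~35 languages.
const LOCALE_TO_LANGUAGE: Record<string, string> = {
  ru: 'Russian',
  en: 'English',
};

function getLocaleInstruction(locale?: string): string {
  if (!locale) return '';
  const lang = locale.split('-')[0];
  // Unknown codes degrade gracefully: the raw code is passed to the LLM.
  const languageName = LOCALE_TO_LANGUAGE[lang] ?? lang;
  return `
<response_language>
User's locale is ${locale}. Always format your response in ${languageName}, regardless of the language of the query or search results. Even when the discussed content is in another language, respond in ${languageName}.
</response_language>`;
}

console.log(getLocaleInstruction('ru-RU').includes('Russian')); // true
console.log(getLocaleInstruction() === '');                     // true
```

Returning an empty string for a missing locale is what makes the `prompt + getLocaleInstruction(locale)` concatenation in the prompt builders safe to call unconditionally.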

View File

@@ -1,4 +1,6 @@
-export const classifierPrompt = `
+import { getLocaleInstruction } from '../locale';
+
+const baseClassifierPrompt = `
 <role>
 Assistant is an advanced AI system designed to analyze the user query and the conversation history to determine the most appropriate classification for the search operation.
 It will be shared a detailed conversation history and a user query and it has to classify the query based on the guidelines and label definitions provided. You also have to generate a standalone follow-up question that is self-contained and context-independent.
@@ -62,3 +64,7 @@ You must respond in the following JSON format without any extra text, explanatio
 }
 </output_format>
 `;
+
+export function getClassifierPrompt(locale?: string): string {
+  return baseClassifierPrompt + getLocaleInstruction(locale);
+}

View File

@@ -1,5 +1,6 @@
 import BaseEmbedding from '@/lib/models/base/embedding';
 import UploadStore from '@/lib/uploads/store';
+import { getLocaleInstruction } from '../locale';

 const getSpeedPrompt = (
   actionDesc: string,
@@ -323,6 +324,7 @@ export const getResearcherPrompt = (
   i: number,
   maxIteration: number,
   fileIds: string[],
+  locale?: string,
 ) => {
   let prompt = '';
@@ -350,5 +352,5 @@ export const getResearcherPrompt = (
       break;
   }

-  return prompt;
+  return prompt + getLocaleInstruction(locale);
 };

View File

@@ -1,7 +1,10 @@
+import { getLocaleInstruction } from '../locale';
+
 export const getWriterPrompt = (
   context: string,
   systemInstructions: string,
   mode: 'speed' | 'balanced' | 'quality',
+  locale?: string,
 ) => {
   return `
 You are GooSeek, an AI model skilled in web search and crafting detailed, engaging, and well-structured answers. You excel at summarizing web pages and extracting relevant information to create professional, blog-style responses.
@@ -22,17 +25,19 @@ You are GooSeek, an AI model skilled in web search and crafting detailed, engagi
 - **Conclusion or Summary**: Include a concluding paragraph that synthesizes the provided information or suggests potential next steps, where appropriate.

 ### Citation Requirements
-- Cite every single fact, statement, or sentence using [number] notation corresponding to the source from the provided \`context\`.
+- Cite every single fact from **search_results** using [number] notation. Citations [1], [2], etc. refer ONLY to sources in search_results.
+- **widgets_result** (calculations, weather, stock data) — use this to answer directly, do NOT cite it. This data is already shown to the user; integrate it naturally into your response.
 - Integrate citations naturally at the end of sentences or clauses as appropriate. For example, "The Eiffel Tower is one of the most visited landmarks in the world[1]."
-- Ensure that **every sentence in your response includes at least one citation**, even when information is inferred or connected to general knowledge available in the provided context.
+- When search_results exist: ensure every fact from search has a citation. When ONLY widgets_result has relevant data (e.g. calculation), no citations needed — just answer using the widget data.
 - Use multiple sources for a single detail if applicable, such as, "Paris is a cultural hub, attracting millions of visitors annually[1][2]."
 - Always prioritize credibility and accuracy by linking all statements back to their respective context sources.
 - Avoid citing unsupported assumptions or personal interpretations; if no source supports a statement, clearly indicate the limitation.

 ### Special Instructions
+- **IMPORTANT**: The context contains two sections: \`search_results\` (web search) and \`widgets_result\` (calculations, weather, stocks). If widgets_result has the answer (e.g. "The result of 2+2 is 4"), USE IT. Do not say "no relevant information" when the answer is in widgets_result.
 - If the query involves technical, historical, or complex topics, provide detailed background and explanatory sections to ensure clarity.
 - If the user provides vague input or if relevant information is missing, explain what additional details might help refine the search.
-- If no relevant information is found, say: "Hmm, sorry I could not find any relevant information on this topic. Would you like me to search again or ask something else?" Be transparent about limitations and suggest alternatives or ways to reframe the query.
+- Only if BOTH search_results AND widgets_result lack relevant information, say: "Hmm, sorry I could not find any relevant information on this topic. Would you like me to search again or ask something else?"
 ${mode === 'quality' ? "- YOU ARE CURRENTLY SET IN QUALITY MODE, GENERATE VERY DEEP, DETAILED AND COMPREHENSIVE RESPONSES USING THE FULL CONTEXT PROVIDED. ASSISTANT'S RESPONSES SHALL NOT BE LESS THAN AT LEAST 2000 WORDS, COVER EVERYTHING AND FRAME IT LIKE A RESEARCH REPORT." : ''}

 ### User instructions
@@ -50,5 +55,6 @@ You are GooSeek, an AI model skilled in web search and crafting detailed, engagi
 </context>

 Current date & time in ISO format (UTC timezone) is: ${new Date().toISOString()}.
+${getLocaleInstruction(locale)}
 `;
 };

View File

@@ -1,17 +1,17 @@
-export const suggestionGeneratorPrompt = `
-You are an AI suggestion generator for an AI powered search engine. You will be given a conversation below. You need to generate 4-5 suggestions based on the conversation. The suggestion should be relevant to the conversation that can be used by the user to ask the chat model for more information.
-You need to make sure the suggestions are relevant to the conversation and are helpful to the user. Keep a note that the user might use these suggestions to ask a chat model for more information.
-Make sure the suggestions are medium in length and are informative and relevant to the conversation.
-
-Sample suggestions for a conversation about Elon Musk:
-{
-  "suggestions": [
-    "What are Elon Musk's plans for SpaceX in the next decade?",
-    "How has Tesla's stock performance been influenced by Elon Musk's leadership?",
-    "What are the key innovations introduced by Elon Musk in the electric vehicle industry?",
-    "How does Elon Musk's vision for renewable energy impact global sustainability efforts?"
-  ]
-}
+import { getLocaleInstruction } from '../locale';
+
+const baseSuggestionPrompt = `
+You are an AI suggestion generator for an AI powered search engine. You will be given a conversation below. Generate 4-5 DIFFERENT follow-up questions the user could ask to learn more.
+
+Rules:
+- Each suggestion must be a distinct question about a different aspect or angle of the topic
+- Avoid similar or repetitive phrasing across suggestions
+- Focus on what was actually discussed — suggest natural next steps: details, comparisons, implications, counterarguments, recent developments
+- Keep suggestions medium length, concrete and actionable

 Today's date is ${new Date().toISOString()}
 `;
+
+export function getSuggestionGeneratorPrompt(locale?: string): string {
+  return baseSuggestionPrompt + getLocaleInstruction(locale);
+}

View File

@@ -1,5 +1,11 @@
import { getSearxngURL } from './config/serverRegistry'; import { getSearxngURL } from './config/serverRegistry';
const FALLBACK_INSTANCES = (
process.env.SEARXNG_FALLBACK_URL
? process.env.SEARXNG_FALLBACK_URL.split(',').map((u) => u.trim())
: ['https://searx.tiekoetter.com', 'https://search.sapti.me']
).filter(Boolean);
interface SearxngSearchOptions { interface SearxngSearchOptions {
categories?: string[]; categories?: string[];
engines?: string[]; engines?: string[];
@@ -7,7 +13,7 @@ interface SearxngSearchOptions {
pageno?: number; pageno?: number;
} }
interface SearxngSearchResult { export interface SearxngSearchResult {
title: string; title: string;
url: string; url: string;
img_src?: string; img_src?: string;
@@ -18,49 +24,70 @@ interface SearxngSearchResult {
iframe_src?: string; iframe_src?: string;
} }
function buildSearchUrl(baseUrl: string, query: string, opts?: SearxngSearchOptions): string {
const params = new URLSearchParams();
params.append('format', 'json');
params.append('q', query);
if (opts) {
Object.entries(opts).forEach(([key, value]) => {
if (value == null) return;
params.append(
key,
Array.isArray(value) ? value.join(',') : String(value),
);
});
}
const base = baseUrl.trim().replace(/\/$/, '');
const prefix = /^https?:\/\//i.test(base) ? '' : 'http://';
return `${prefix}${base}/search?${params.toString()}`;
}
export const searchSearxng = async ( export const searchSearxng = async (
query: string, query: string,
   opts?: SearxngSearchOptions,
 ) => {
   const searxngURL = getSearxngURL();
-  if (!searxngURL?.trim()) {
-    throw new Error(
-      'SearxNG is not configured. Please set the SearxNG URL in Settings → Search.',
-    );
-  }
-
-  let baseUrl = searxngURL.trim().replace(/\/$/, '');
-  if (!/^https?:\/\//i.test(baseUrl)) {
-    baseUrl = `http://${baseUrl}`;
-  }
-
-  const url = new URL(`${baseUrl}/search?format=json`);
-  url.searchParams.append('q', query);
-
-  if (opts) {
-    Object.keys(opts).forEach((key) => {
-      const value = opts[key as keyof SearxngSearchOptions];
-      if (Array.isArray(value)) {
-        url.searchParams.append(key, value.join(','));
-        return;
-      }
-      url.searchParams.append(key, value as string);
-    });
-  }
-
-  const res = await fetch(url);
-  const text = await res.text();
-  const contentType = res.headers.get('content-type') ?? '';
-  const isJson =
-    contentType.includes('application/json') ||
-    text.trim().startsWith('{') ||
-    text.trim().startsWith('[');
-
-  if (!isJson) {
-    throw new Error(
-      'SearxNG returned HTML instead of JSON. Check that the URL includes http:// or https:// (e.g. http://localhost:4000) and that SearxNG is running.',
-    );
-  }
+  const candidates: string[] = [];
+  if (searxngURL?.trim()) {
+    let u = searxngURL.trim().replace(/\/$/, '');
+    if (!/^https?:\/\//i.test(u)) u = `http://${u}`;
+    candidates.push(u);
+  }
+  FALLBACK_INSTANCES.forEach((u) => {
+    const trimmed = u.trim().replace(/\/$/, '');
+    if (trimmed && !candidates.includes(trimmed)) candidates.push(trimmed);
+  });
+
+  let lastError: Error | null = null;
+  const FETCH_TIMEOUT_MS = 15_000;
+
+  for (const baseUrl of candidates) {
+    try {
+      const fullUrl = buildSearchUrl(baseUrl, query, opts);
+      const controller = new AbortController();
+      const timeoutId = setTimeout(() => controller.abort(), FETCH_TIMEOUT_MS);
+      const res = await fetch(fullUrl, { signal: controller.signal });
+      clearTimeout(timeoutId);
+      const text = await res.text();
+      const ok = res.ok;
+      const contentType = res.headers.get('content-type') ?? '';
+      const isJson =
+        contentType.includes('application/json') || text.trim().startsWith('{') || text.trim().startsWith('[');
+
+      if (res.status === 429) {
+        const err = new Error(
+          `SearXNG ${baseUrl}: лимит запросов (429). Укажите свой инстанс в настройках или попробуйте позже.`,
+        );
+        err.name = 'SearxngRateLimit';
+        throw err;
+      }
+
+      if (!isJson) {
+        throw new Error(
+          `SearXNG ${baseUrl}: ответ не JSON (HTTP ${res.status}). Проверьте URL и поддержку format=json.`,
+        );
+      }

@@ -68,5 +95,43 @@ export const searchSearxng = async (
-  const results: SearxngSearchResult[] = data.results ?? [];
-  const suggestions: string[] = data.suggestions ?? [];
-
-  return { results, suggestions };
-};
+      const results: SearxngSearchResult[] = data.results ?? [];
+      const suggestions: string[] = data.suggestions ?? [];
+
+      if (!ok && results.length === 0) {
+        const errMsg = text.slice(0, 200) || `HTTP ${res.status}`;
+        throw new Error(`SearXNG ${baseUrl}: ${errMsg}`);
+      }
+
+      return { results, suggestions };
+    } catch (err) {
+      lastError = err instanceof Error ? err : new Error(String(err));
+      const cause = (err as { cause?: { code?: string } })?.cause;
+      const isAbort = lastError.name === 'AbortError';
+      const isNetwork =
+        isAbort ||
+        lastError.message.includes('fetch failed') ||
+        lastError.message.includes('Invalid URL') ||
+        cause?.code === 'ECONNREFUSED' ||
+        cause?.code === 'ECONNRESET';
+      const isRateLimit =
+        lastError.name === 'SearxngRateLimit' ||
+        lastError.message.includes('429');
+      const hasFallback = candidates.indexOf(baseUrl) < candidates.length - 1;
+      if (hasFallback && (isNetwork || isRateLimit)) {
+        continue;
+      }
+      if (isAbort) {
+        throw new Error(
+          `SearXNG ${baseUrl}: таймаут ${FETCH_TIMEOUT_MS / 1000}с. Проверьте подключение или используйте локальный инстанс.`,
+        );
+      }
+      throw lastError;
+    }
+  }
+
+  throw (
+    lastError ??
+    new Error(
+      'SearXNG not configured. Set SEARXNG_API_URL or run Docker (includes SearXNG).',
+    )
+  );
+};
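
The fallback loop above reduces to a small reusable pattern: try each candidate in order, move on only when another candidate remains and the error class is retryable, otherwise rethrow. A minimal sketch of that pattern (the names `tryCandidates`, `fetchOne` and `isRetryable` are illustrative, not part of the commit's API):

```typescript
// Sketch of the candidate-fallback pattern: try each base URL in order;
// on a retryable error continue to the next one, otherwise rethrow.
type Fetcher = (baseUrl: string) => Promise<string>;

async function tryCandidates(
  candidates: string[],
  fetchOne: Fetcher,
  isRetryable: (err: Error) => boolean,
): Promise<string> {
  let lastError: Error | null = null;
  for (const baseUrl of candidates) {
    try {
      return await fetchOne(baseUrl);
    } catch (err) {
      lastError = err instanceof Error ? err : new Error(String(err));
      // Retry only if another candidate is left AND the error is retryable.
      const hasFallback = candidates.indexOf(baseUrl) < candidates.length - 1;
      if (hasFallback && isRetryable(lastError)) continue;
      throw lastError;
    }
  }
  throw lastError ?? new Error('no candidates configured');
}

// Simulated instances: the first one always rate-limits, the second succeeds.
const fetchOne: Fetcher = async (u) =>
  u === 'http://a' ? Promise.reject(new Error('429')) : `ok from ${u}`;

tryCandidates(['http://a', 'http://b'], fetchOne, (e) => e.message.includes('429'))
  .then((r) => console.log(r)); // → "ok from http://b"
```

Note the design choice mirrored from the commit: a non-retryable error (e.g. malformed JSON) fails fast even when fallbacks remain, so configuration mistakes are not masked by instance-hopping.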

View File

@@ -3,8 +3,6 @@ import BaseEmbedding from "../models/base/embedding"
 import crypto from "crypto"
 import fs from 'fs';
 import { splitText } from '../utils/splitText';
-import { PDFParse } from 'pdf-parse';
-import { CanvasFactory } from 'pdf-parse/worker';
 import officeParser from 'officeparser'

 const supportedMimeTypes = ['application/pdf', 'application/vnd.openxmlformats-officedocument.wordprocessingml.document', 'text/plain'] as const
@@ -116,15 +114,12 @@ class UploadManager {
         fs.writeFileSync(contentPath, JSON.stringify(data, null, 2));
         return contentPath;
-      case 'application/pdf':
+      case 'application/pdf': {
+        const { PDFParse } = await import('pdf-parse');
+        const { CanvasFactory } = await import('pdf-parse/worker');
         const pdfBuffer = fs.readFileSync(filePath);
-
-        const parser = new PDFParse({
-          data: pdfBuffer,
-          CanvasFactory
-        })
-
-        const pdfText = await parser.getText().then(res => res.text)
+        const parser = new PDFParse({ data: pdfBuffer, CanvasFactory });
+        const pdfText = (await parser.getText())?.text ?? ''
         const pdfSplittedText = splitText(pdfText, 512, 128)
         const pdfEmbeddings = await this.embeddingModel.embedText(pdfSplittedText)
@@ -147,6 +142,7 @@ class UploadManager {
         fs.writeFileSync(pdfContentPath, JSON.stringify(pdfData, null, 2));
         return pdfContentPath;
+      }
       case 'application/vnd.openxmlformats-officedocument.wordprocessingml.document':
         const docBuffer = fs.readFileSync(filePath);
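
The point of moving `PDFParse` from a top-level import to `await import('pdf-parse')` is that the heavy dependency is loaded on first use rather than at server startup. A stubbed sketch of that lazy-loading effect (`loadParser` stands in for the real dynamic `import()`, which additionally caches the module after the first call):

```typescript
// Lazy loading sketch: nothing is loaded until the first document arrives.
let loads = 0;
const loadParser = async () => {
  loads++; // in the real code this line is `await import('pdf-parse')`
  return { getText: (buf: Buffer) => buf.toString('utf8') };
};

async function extractText(buf: Buffer): Promise<string> {
  const parser = await loadParser(); // deferred until a PDF is actually processed
  return parser.getText(buf);
}

console.log(loads); // → 0 (nothing loaded at "startup")
extractText(Buffer.from('hello')).then((t) => console.log(t, loads)); // → "hello 1"
```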

View File

@@ -13,10 +13,9 @@
     "jsx": "react-jsx",
     "incremental": true,
     "plugins": [{ "name": "next" }],
-    "paths": {
-      "@/*": ["./src/*"]
-    },
-    "target": "ES2017"
+    "paths": { "@/*": ["./src/*"] },
+    "target": "ES2017",
+    "typeRoots": ["../../node_modules/@types"]
   },
   "include": [
     "next-env.d.ts",

File diff suppressed because one or more lines are too long

View File

@@ -16,6 +16,13 @@ const en = {
   'common.loading': 'Loading...',
   'common.error': 'Error',
   'common.retry': 'Retry',
+  'chat.related': 'Related',
+  'chat.searchImages': 'Search images',
+  'chat.searchVideos': 'Search videos',
+  'chat.video': 'Video',
+  'chat.uploadedFile': 'Uploaded File',
+  'chat.viewMore': 'View {count} more',
+  'chat.sources': 'Sources',
 } as const;

 const ru = {
@@ -34,6 +41,13 @@ const ru = {
   'common.loading': 'Загрузка...',
   'common.error': 'Ошибка',
   'common.retry': 'Повторить',
+  'chat.related': 'Похожие',
+  'chat.searchImages': 'Поиск изображений',
+  'chat.searchVideos': 'Поиск видео',
+  'chat.video': 'Видео',
+  'chat.uploadedFile': 'Загруженный файл',
+  'chat.viewMore': 'Ещё {count}',
+  'chat.sources': 'Источники',
 } as const;

 const de = {
@@ -52,6 +66,13 @@ const de = {
   'common.loading': 'Laden...',
   'common.error': 'Fehler',
   'common.retry': 'Wiederholen',
+  'chat.related': 'Ähnlich',
+  'chat.searchImages': 'Bilder suchen',
+  'chat.searchVideos': 'Videos suchen',
+  'chat.video': 'Video',
+  'chat.uploadedFile': 'Hochgeladene Datei',
+  'chat.viewMore': '{count} weitere',
+  'chat.sources': 'Quellen',
 } as const;

 const fr = {
@@ -70,6 +91,13 @@ const fr = {
   'common.loading': 'Chargement...',
   'common.error': 'Erreur',
   'common.retry': 'Réessayer',
+  'chat.related': 'Similaires',
+  'chat.searchImages': 'Rechercher des images',
+  'chat.searchVideos': 'Rechercher des vidéos',
+  'chat.video': 'Vidéo',
+  'chat.uploadedFile': 'Fichier téléchargé',
+  'chat.viewMore': '{count} de plus',
+  'chat.sources': 'Sources',
 } as const;

 const es = {
@@ -88,6 +116,13 @@ const es = {
   'common.loading': 'Cargando...',
   'common.error': 'Error',
   'common.retry': 'Reintentar',
+  'chat.related': 'Relacionado',
+  'chat.searchImages': 'Buscar imágenes',
+  'chat.searchVideos': 'Buscar videos',
+  'chat.video': 'Video',
+  'chat.uploadedFile': 'Archivo subido',
+  'chat.viewMore': '{count} más',
+  'chat.sources': 'Fuentes',
 } as const;

 const uk = {
@@ -106,6 +141,13 @@ const uk = {
   'common.loading': 'Завантаження...',
   'common.error': 'Помилка',
   'common.retry': 'Повторити',
+  'chat.related': 'Повʼязані',
+  'chat.searchImages': 'Пошук зображень',
+  'chat.searchVideos': 'Пошук відео',
+  'chat.video': 'Відео',
+  'chat.uploadedFile': 'Завантажений файл',
+  'chat.viewMore': 'Ще {count}',
+  'chat.sources': 'Джерела',
 } as const;

 export const translations: Record<string, Record<TranslationKeys, string>> = {
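
The `{count}` placeholder in `chat.viewMore` implies an interpolation step in the translation helper. The project's actual `t()` signature is not shown in this diff; a minimal sketch of the assumed substitution logic:

```typescript
// Placeholder interpolation for translation strings like 'View {count} more'.
// Assumed shape — the real t() helper is outside this diff.
function interpolate(template: string, params: Record<string, string | number>): string {
  // Replace each {key} with its value; unknown keys are left untouched.
  return template.replace(/\{(\w+)\}/g, (match, key) =>
    key in params ? String(params[key]) : match,
  );
}

console.log(interpolate('View {count} more', { count: 7 })); // → "View 7 more"
console.log(interpolate('Ещё {count}', { count: 3 })); // → "Ещё 3"
```

Leaving unknown keys intact (rather than replacing them with an empty string) makes missing parameters visible in the UI instead of silently dropping text.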

package-lock.json (generated, 303 lines changed)
View File

@@ -295,12 +295,12 @@
   "@headlessui/tailwindcss": "^0.2.2",
   "@huggingface/transformers": "^3.8.1",
   "@icons-pack/react-simple-icons": "^12.3.0",
+  "@libsql/client": "^0.17.0",
   "@phosphor-icons/react": "^2.1.10",
   "@radix-ui/react-tooltip": "^1.2.8",
   "@tailwindcss/typography": "^0.5.12",
   "@toolsycc/json-repair": "^0.1.22",
   "axios": "^1.8.3",
-  "better-sqlite3": "^11.9.1",
   "clsx": "^2.1.0",
   "drizzle-orm": "^0.40.1",
   "js-tiktoken": "^1.0.21",
@@ -332,15 +332,17 @@
   "zod": "^4.1.12"
 },
 "devDependencies": {
-  "@types/better-sqlite3": "^7.6.12",
   "@types/jspdf": "^2.0.0",
   "@types/node": "^24.8.1",
   "@types/pdf-parse": "^1.1.4",
   "@types/react": "^18",
   "@types/react-dom": "^18",
   "@types/react-syntax-highlighter": "^15.5.13",
+  "@types/trusted-types": "^2.0.7",
   "@types/turndown": "^5.0.6",
+  "@types/unist": "^3.0.3",
   "autoprefixer": "^10.0.1",
+  "dotenv": "^16.4.5",
   "drizzle-kit": "^0.30.5",
   "eslint": "^8",
   "eslint-config-next": "14.1.4",
@@ -2260,6 +2262,167 @@
"@jridgewell/sourcemap-codec": "^1.4.14"
}
},
"node_modules/@libsql/client": {
"version": "0.17.0",
"resolved": "https://registry.npmjs.org/@libsql/client/-/client-0.17.0.tgz",
"integrity": "sha512-TLjSU9Otdpq0SpKHl1tD1Nc9MKhrsZbCFGot3EbCxRa8m1E5R1mMwoOjKMMM31IyF7fr+hPNHLpYfwbMKNusmg==",
"license": "MIT",
"dependencies": {
"@libsql/core": "^0.17.0",
"@libsql/hrana-client": "^0.9.0",
"js-base64": "^3.7.5",
"libsql": "^0.5.22",
"promise-limit": "^2.7.0"
}
},
"node_modules/@libsql/core": {
"version": "0.17.0",
"resolved": "https://registry.npmjs.org/@libsql/core/-/core-0.17.0.tgz",
"integrity": "sha512-hnZRnJHiS+nrhHKLGYPoJbc78FE903MSDrFJTbftxo+e52X+E0Y0fHOCVYsKWcg6XgB7BbJYUrz/xEkVTSaipw==",
"license": "MIT",
"dependencies": {
"js-base64": "^3.7.5"
}
},
"node_modules/@libsql/darwin-arm64": {
"version": "0.5.22",
"resolved": "https://registry.npmjs.org/@libsql/darwin-arm64/-/darwin-arm64-0.5.22.tgz",
"integrity": "sha512-4B8ZlX3nIDPndfct7GNe0nI3Yw6ibocEicWdC4fvQbSs/jdq/RC2oCsoJxJ4NzXkvktX70C1J4FcmmoBy069UA==",
"cpu": [
"arm64"
],
"license": "MIT",
"optional": true,
"os": [
"darwin"
]
},
"node_modules/@libsql/darwin-x64": {
"version": "0.5.22",
"resolved": "https://registry.npmjs.org/@libsql/darwin-x64/-/darwin-x64-0.5.22.tgz",
"integrity": "sha512-ny2HYWt6lFSIdNFzUFIJ04uiW6finXfMNJ7wypkAD8Pqdm6nAByO+Fdqu8t7sD0sqJGeUCiOg480icjyQ2/8VA==",
"cpu": [
"x64"
],
"license": "MIT",
"optional": true,
"os": [
"darwin"
]
},
"node_modules/@libsql/hrana-client": {
"version": "0.9.0",
"resolved": "https://registry.npmjs.org/@libsql/hrana-client/-/hrana-client-0.9.0.tgz",
"integrity": "sha512-pxQ1986AuWfPX4oXzBvLwBnfgKDE5OMhAdR/5cZmRaB4Ygz5MecQybvwZupnRz341r2CtFmbk/BhSu7k2Lm+Jw==",
"license": "MIT",
"dependencies": {
"@libsql/isomorphic-ws": "^0.1.5",
"cross-fetch": "^4.0.0",
"js-base64": "^3.7.5",
"node-fetch": "^3.3.2"
}
},
"node_modules/@libsql/isomorphic-ws": {
"version": "0.1.5",
"resolved": "https://registry.npmjs.org/@libsql/isomorphic-ws/-/isomorphic-ws-0.1.5.tgz",
"integrity": "sha512-DtLWIH29onUYR00i0GlQ3UdcTRC6EP4u9w/h9LxpUZJWRMARk6dQwZ6Jkd+QdwVpuAOrdxt18v0K2uIYR3fwFg==",
"license": "MIT",
"dependencies": {
"@types/ws": "^8.5.4",
"ws": "^8.13.0"
}
},
"node_modules/@libsql/linux-arm-gnueabihf": {
"version": "0.5.22",
"resolved": "https://registry.npmjs.org/@libsql/linux-arm-gnueabihf/-/linux-arm-gnueabihf-0.5.22.tgz",
"integrity": "sha512-3Uo3SoDPJe/zBnyZKosziRGtszXaEtv57raWrZIahtQDsjxBVjuzYQinCm9LRCJCUT5t2r5Z5nLDPJi2CwZVoA==",
"cpu": [
"arm"
],
"license": "MIT",
"optional": true,
"os": [
"linux"
]
},
"node_modules/@libsql/linux-arm-musleabihf": {
"version": "0.5.22",
"resolved": "https://registry.npmjs.org/@libsql/linux-arm-musleabihf/-/linux-arm-musleabihf-0.5.22.tgz",
"integrity": "sha512-LCsXh07jvSojTNJptT9CowOzwITznD+YFGGW+1XxUr7fS+7/ydUrpDfsMX7UqTqjm7xG17eq86VkWJgHJfvpNg==",
"cpu": [
"arm"
],
"license": "MIT",
"optional": true,
"os": [
"linux"
]
},
"node_modules/@libsql/linux-arm64-gnu": {
"version": "0.5.22",
"resolved": "https://registry.npmjs.org/@libsql/linux-arm64-gnu/-/linux-arm64-gnu-0.5.22.tgz",
"integrity": "sha512-KSdnOMy88c9mpOFKUEzPskSaF3VLflfSUCBwas/pn1/sV3pEhtMF6H8VUCd2rsedwoukeeCSEONqX7LLnQwRMA==",
"cpu": [
"arm64"
],
"license": "MIT",
"optional": true,
"os": [
"linux"
]
},
"node_modules/@libsql/linux-arm64-musl": {
"version": "0.5.22",
"resolved": "https://registry.npmjs.org/@libsql/linux-arm64-musl/-/linux-arm64-musl-0.5.22.tgz",
"integrity": "sha512-mCHSMAsDTLK5YH//lcV3eFEgiR23Ym0U9oEvgZA0667gqRZg/2px+7LshDvErEKv2XZ8ixzw3p1IrBzLQHGSsw==",
"cpu": [
"arm64"
],
"license": "MIT",
"optional": true,
"os": [
"linux"
]
},
"node_modules/@libsql/linux-x64-gnu": {
"version": "0.5.22",
"resolved": "https://registry.npmjs.org/@libsql/linux-x64-gnu/-/linux-x64-gnu-0.5.22.tgz",
"integrity": "sha512-kNBHaIkSg78Y4BqAdgjcR2mBilZXs4HYkAmi58J+4GRwDQZh5fIUWbnQvB9f95DkWUIGVeenqLRFY2pcTmlsew==",
"cpu": [
"x64"
],
"license": "MIT",
"optional": true,
"os": [
"linux"
]
},
"node_modules/@libsql/linux-x64-musl": {
"version": "0.5.22",
"resolved": "https://registry.npmjs.org/@libsql/linux-x64-musl/-/linux-x64-musl-0.5.22.tgz",
"integrity": "sha512-UZ4Xdxm4pu3pQXjvfJiyCzZop/9j/eA2JjmhMaAhe3EVLH2g11Fy4fwyUp9sT1QJYR1kpc2JLuybPM0kuXv/Tg==",
"cpu": [
"x64"
],
"license": "MIT",
"optional": true,
"os": [
"linux"
]
},
"node_modules/@libsql/win32-x64-msvc": {
"version": "0.5.22",
"resolved": "https://registry.npmjs.org/@libsql/win32-x64-msvc/-/win32-x64-msvc-0.5.22.tgz",
"integrity": "sha512-Fj0j8RnBpo43tVZUVoNK6BV/9AtDUM5S7DF3LB4qTYg1LMSZqi3yeCneUTLJD6XomQJlZzbI4mst89yspVSAnA==",
"cpu": [
"x64"
],
"license": "MIT",
"optional": true,
"os": [
"win32"
]
},
"node_modules/@mixmark-io/domino": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/@mixmark-io/domino/-/domino-2.2.0.tgz",
@@ -2529,6 +2692,12 @@
"@tybys/wasm-util": "^0.10.0"
}
},
"node_modules/@neon-rs/load": {
"version": "0.0.4",
"resolved": "https://registry.npmjs.org/@neon-rs/load/-/load-0.0.4.tgz",
"integrity": "sha512-kTPhdZyTQxB+2wpiRcFWrDcejc4JI6tkPuS7UZCG4l6Zvc5kU/gGQ/ozvHTh1XR5tS+UlfAfGuPajjzQjCiHCw==",
"license": "MIT"
},
"node_modules/@next/env": {
"version": "16.1.6",
"resolved": "https://registry.npmjs.org/@next/env/-/env-16.1.6.tgz",
@@ -3673,8 +3842,8 @@
 "version": "2.0.7",
 "resolved": "https://registry.npmjs.org/@types/trusted-types/-/trusted-types-2.0.7.tgz",
 "integrity": "sha512-ScaPdn1dQczgbl0QFTeTOmVHFULt394XJgOQNoyVhZ6r2vLnMLJfBPd53SB52T/3G36VI1/g2MZaX0cwDuXsfw==",
-"license": "MIT",
-"optional": true
+"devOptional": true,
+"license": "MIT"
 },
"node_modules/@types/turndown": {
"version": "5.0.6",
@@ -3696,6 +3865,15 @@
"integrity": "sha512-ko/gIFJRv177XgZsZcBwnqJN5x/Gien8qNOn0D5bQU/zAzVf9Zt3BlcUiLqhV9y4ARk0GbT3tnUiPNgnTXzc/Q==",
"license": "MIT"
},
"node_modules/@types/ws": {
"version": "8.18.1",
"resolved": "https://registry.npmjs.org/@types/ws/-/ws-8.18.1.tgz",
"integrity": "sha512-ThVF6DCVhA8kUGy+aazFQ4kXQ7E1Ty7A3ypFOe0IcJV8O/M511G99AW24irKrW56Wt44yG9+ij8FaqoBGkuBXg==",
"license": "MIT",
"dependencies": {
"@types/node": "*"
}
},
"node_modules/@typescript-eslint/parser": {
"version": "6.21.0",
"resolved": "https://registry.npmjs.org/@typescript-eslint/parser/-/parser-6.21.0.tgz",
@@ -5371,6 +5549,57 @@
"url": "https://opencollective.com/express"
}
},
"node_modules/cross-fetch": {
"version": "4.1.0",
"resolved": "https://registry.npmjs.org/cross-fetch/-/cross-fetch-4.1.0.tgz",
"integrity": "sha512-uKm5PU+MHTootlWEY+mZ4vvXoCn4fLQxT9dSc1sXVMSFkINTJVN8cAQROpwcKm8bJ/c7rgZVIBWzH5T78sNZZw==",
"license": "MIT",
"dependencies": {
"node-fetch": "^2.7.0"
}
},
"node_modules/cross-fetch/node_modules/node-fetch": {
"version": "2.7.0",
"resolved": "https://registry.npmjs.org/node-fetch/-/node-fetch-2.7.0.tgz",
"integrity": "sha512-c4FRfUm/dbcWZ7U+1Wq0AwCyFL+3nt2bEw05wfxSz+DWpWsitgmSgYmy2dQdWyKC1694ELPqMs/YzUSNozLt8A==",
"license": "MIT",
"dependencies": {
"whatwg-url": "^5.0.0"
},
"engines": {
"node": "4.x || >=6.0.0"
},
"peerDependencies": {
"encoding": "^0.1.0"
},
"peerDependenciesMeta": {
"encoding": {
"optional": true
}
}
},
"node_modules/cross-fetch/node_modules/tr46": {
"version": "0.0.3",
"resolved": "https://registry.npmjs.org/tr46/-/tr46-0.0.3.tgz",
"integrity": "sha512-N3WMsuqV66lT30CrXNbEjx4GEwlow3v6rr4mCcv6prnfwhS01rkgyFdjPNBYd9br7LpXV1+Emh01fHnq2Gdgrw==",
"license": "MIT"
},
"node_modules/cross-fetch/node_modules/webidl-conversions": {
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/webidl-conversions/-/webidl-conversions-3.0.1.tgz",
"integrity": "sha512-2JAn3z8AR6rjK8Sm8orRC0h/bcl/DqL7tRPdGZ4I1CjdF+EaMLmYxBHyXuKL849eucPFhvBoxMsflfOb8kxaeQ==",
"license": "BSD-2-Clause"
},
"node_modules/cross-fetch/node_modules/whatwg-url": {
"version": "5.0.0",
"resolved": "https://registry.npmjs.org/whatwg-url/-/whatwg-url-5.0.0.tgz",
"integrity": "sha512-saE57nupxk6v3HY35+jzBwYa0rKSy0XR8JSxZPwgLr7ys0IBzhGviA1/TUGJLmSVqs8pb9AnvICXEuOHLprYTw==",
"license": "MIT",
"dependencies": {
"tr46": "~0.0.3",
"webidl-conversions": "^3.0.0"
}
},
"node_modules/cross-spawn": {
"version": "7.0.6",
"resolved": "https://registry.npmjs.org/cross-spawn/-/cross-spawn-7.0.6.tgz",
@@ -5688,6 +5917,19 @@
"@types/trusted-types": "^2.0.7"
}
},
"node_modules/dotenv": {
"version": "16.6.1",
"resolved": "https://registry.npmjs.org/dotenv/-/dotenv-16.6.1.tgz",
"integrity": "sha512-uBq4egWHTcTt33a72vpSG0z3HnPuIl6NqYcTrKEg2azoEyl2hpW0zqlxysq2pK9HlDIHyHyakeYaYnSAwd8bow==",
"dev": true,
"license": "BSD-2-Clause",
"engines": {
"node": ">=12"
},
"funding": {
"url": "https://dotenvx.com"
}
},
"node_modules/drizzle-kit": {
"version": "0.30.6",
"resolved": "https://registry.npmjs.org/drizzle-kit/-/drizzle-kit-0.30.6.tgz",
@@ -8594,6 +8836,12 @@
"url": "https://github.com/sponsors/panva"
}
},
"node_modules/js-base64": {
"version": "3.7.8",
"resolved": "https://registry.npmjs.org/js-base64/-/js-base64-3.7.8.tgz",
"integrity": "sha512-hNngCeKxIUQiEUN3GPJOkz4wF/YvdUdbNL9hsBcMQTkKzboD7T/q3OYOuuPZLUE6dBxSGpwhk5mwuDud7JVAow==",
"license": "BSD-3-Clause"
},
"node_modules/js-tiktoken": {
"version": "1.0.21",
"resolved": "https://registry.npmjs.org/js-tiktoken/-/js-tiktoken-1.0.21.tgz",
@@ -8860,6 +9108,47 @@
"node": ">= 0.8.0"
}
},
"node_modules/libsql": {
"version": "0.5.22",
"resolved": "https://registry.npmjs.org/libsql/-/libsql-0.5.22.tgz",
"integrity": "sha512-NscWthMQt7fpU8lqd7LXMvT9pi+KhhmTHAJWUB/Lj6MWa0MKFv0F2V4C6WKKpjCVZl0VwcDz4nOI3CyaT1DDiA==",
"cpu": [
"x64",
"arm64",
"wasm32",
"arm"
],
"license": "MIT",
"os": [
"darwin",
"linux",
"win32"
],
"dependencies": {
"@neon-rs/load": "^0.0.4",
"detect-libc": "2.0.2"
},
"optionalDependencies": {
"@libsql/darwin-arm64": "0.5.22",
"@libsql/darwin-x64": "0.5.22",
"@libsql/linux-arm-gnueabihf": "0.5.22",
"@libsql/linux-arm-musleabihf": "0.5.22",
"@libsql/linux-arm64-gnu": "0.5.22",
"@libsql/linux-arm64-musl": "0.5.22",
"@libsql/linux-x64-gnu": "0.5.22",
"@libsql/linux-x64-musl": "0.5.22",
"@libsql/win32-x64-msvc": "0.5.22"
}
},
"node_modules/libsql/node_modules/detect-libc": {
"version": "2.0.2",
"resolved": "https://registry.npmjs.org/detect-libc/-/detect-libc-2.0.2.tgz",
"integrity": "sha512-UX6sGumvvqSaXgdKGUsgZWqcUyIXZ/vZTrlRT/iobiKhGL0zL4d3osHj3uqllWJK+i+sixDS/3COVEOFbupFyw==",
"license": "Apache-2.0",
"engines": {
"node": ">=8"
}
},
"node_modules/lie": {
"version": "3.3.0",
"resolved": "https://registry.npmjs.org/lie/-/lie-3.3.0.tgz",
@@ -10602,6 +10891,12 @@
"integrity": "sha512-3ouUOpQhtgrbOa17J7+uxOTpITYWaGP7/AhoR3+A+/1e9skrzelGi/dXzEYyvbxubEF6Wn2ypscTKiKJFFn1ag==",
"license": "MIT"
},
"node_modules/promise-limit": {
"version": "2.7.0",
"resolved": "https://registry.npmjs.org/promise-limit/-/promise-limit-2.7.0.tgz",
"integrity": "sha512-7nJ6v5lnJsXwGprnGXga4wx6d1POjvi5Qmf1ivTRxTjH4Z/9Czja/UCMLVmB9N93GeWOU93XaFaEt6jbuoagNw==",
"license": "ISC"
},
"node_modules/prop-types": {
"version": "15.8.1",
"resolved": "https://registry.npmjs.org/prop-types/-/prop-types-15.8.1.tgz",