Compare commits


13 Commits

Author SHA1 Message Date
user
1310ff3220 now files do presigned url setup 2026-03-02 21:40:02 +02:00
user
34aa52ec7c well yuh 2026-03-02 21:26:47 +02:00
user
939d4ca8a8 welll let's see now if the main is fixed 2026-03-02 20:26:42 +02:00
user
2c7ab39693 removed useless server ts file 2026-03-02 20:08:43 +02:00
user
fa6374f6cf idk 2026-03-02 19:51:36 +02:00
user
a4280404dc better docker file 2026-03-02 19:49:46 +02:00
user
5c84f4a672 semi-thicker, but eh works now 2026-03-02 19:46:58 +02:00
user
a7ed537b07 added host+port env in prod script for proc. app 2026-03-02 19:25:03 +02:00
user
be45cc3fa7 resolved type errors and other proc. app build error 2026-03-02 19:23:59 +02:00
user
6069a16d80 proc. package script fix 2026-03-02 19:09:48 +02:00
user
ed0a7d5c67 dockerfile fix 2026-03-02 18:42:53 +02:00
user
e6e27de780 dockerfile fix 2026-03-02 18:22:36 +02:00
user
29f5721684 updated readme file 2026-03-01 20:52:31 +02:00
23 changed files with 411 additions and 353 deletions

View File

@@ -1,41 +1,34 @@
APP_NAME=${{project.APP_NAME}}
NODE_ENV=${{project.NODE_ENV}}
LOG_LEVEL=${{project.LOG_LEVEL}}
REDIS_URL=${{project.REDIS_URL}}
DATABASE_URL=${{project.DATABASE_URL}}
COUCHBASE_CONNECTION_STRING=${{project.COUCHBASE_CONNECTION_STRING}}
COUCHBASE_USERNAME=${{project.COUCHBASE_USERNAME}}
COUCHBASE_PASSWORD=${{project.COUCHBASE_PASSWORD}}
INTERNAL_API_KEY=${{project.INTERNAL_API_KEY}}
DEBUG_KEY=${{project.DEBUG_KEY}}
QDRANT_URL=${{project.QDRANT_URL}}
QDRANT_API_KEY=${{project.QDRANT_API_KEY}}
PUBLIC_URL=${{project.PUBLIC_URL}}
PROCESSOR_API_URL=${{project.PROCESSOR_API_URL}}
BETTER_AUTH_SECRET=${{project.BETTER_AUTH_SECRET}}
BETTER_AUTH_URL=${{project.BETTER_AUTH_URL}}
GOOGLE_CLIENT_ID=${{project.GOOGLE_CLIENT_ID}}
GOOGLE_CLIENT_SECRET=${{project.GOOGLE_CLIENT_SECRET}}
GOOGLE_OAUTH_SERVER_URL=${{project.GOOGLE_OAUTH_SERVER_URL}}
GOOGLE_OAUTH_SERVER_PORT=${{project.GOOGLE_OAUTH_SERVER_PORT}}
TWOFA_SECRET=${{project.TWOFA_SECRET}}
TWO_FA_SESSION_EXPIRY_MINUTES=${{project.TWO_FA_SESSION_EXPIRY_MINUTES}}
TWO_FA_REQUIRED_HOURS=${{project.TWO_FA_REQUIRED_HOURS}}
GEMINI_API_KEY=${{project.GEMINI_API_KEY}}
DEFAULT_ADMIN_EMAIL=${{project.DEFAULT_ADMIN_EMAIL}}
DEFAULT_ADMIN_PASSWORD=${{project.DEFAULT_ADMIN_PASSWORD}}
RESEND_API_KEY=${{project.RESEND_API_KEY}}
FROM_EMAIL=${{project.FROM_EMAIL}}
OPENROUTER_API_KEY=${{project.OPENROUTER_API_KEY}}
MODEL_NAME=${{project.MODEL_NAME}}
OCR_SERVICE_URL=${{project.OCR_SERVICE_URL}}
OCR_SERVICE_TIMEOUT=${{project.OCR_SERVICE_TIMEOUT}}
OCR_SERVICE_MAX_RETRIES=${{project.OCR_SERVICE_MAX_RETRIES}}
OCR_FORCE_OCR=${{project.OCR_FORCE_OCR}}
OCR_DEBUG=${{project.OCR_DEBUG}}
OTEL_SERVICE_NAME=${{project.OTEL_SERVICE_NAME}}
OTEL_TRACES_EXPORTER=${{project.OTEL_TRACES_EXPORTER}}
OTEL_EXPORTER_OTLP_HTTP_ENDPOINT=${{project.OTEL_EXPORTER_OTLP_HTTP_ENDPOINT}}
OTEL_EXPORTER_OTLP_GRPC_ENDPOINT=${{project.OTEL_EXPORTER_OTLP_GRPC_ENDPOINT}}
OTEL_EXPORTER_OTLP_ENDPOINT=${{project.OTEL_EXPORTER_OTLP_ENDPOINT}}
OTEL_EXPORTER_OTLP_PROTOCOL=${{project.OTEL_EXPORTER_OTLP_PROTOCOL}}
OTEL_RESOURCE_ATTRIBUTES=${{project.OTEL_RESOURCE_ATTRIBUTES}}
R2_BUCKET_NAME=${{project.R2_BUCKET_NAME}}
R2_REGION=${{project.R2_REGION}}

196
README.md
View File

@@ -1,197 +1,3 @@
# Illusory MAPP
Internal automation platform for aggregating device data into one backend, then running coordinated automations on top of that data.
## Current Goal (Server-Side Stage 1)
Build the server-side ingestion and admin visibility for:
1. Device registration
2. Device heartbeat/ping tracking
3. SMS ingestion (ongoing)
4. Media metadata ingestion (initial sync)
5. Admin UI to inspect devices, SMS, and media
Mobile app implementation is out of scope for now.
## Scope and Non-Goals
### In scope now
- Processor endpoints that receive mobile data
- Domain logic + persistence for devices, SMS, media sync metadata
- Admin UI read views with polling for SMS (every 5 seconds)
- Observability for ingestion and read flows (logs + traces)
### Not in scope now
- Mobile app UX/screens or client implementation
- Automation execution workflows
- Advanced media processing pipelines
## Delivery Plan (Actionable Tasks)
Use this as the step-by-step implementation checklist.
### Phase 0: Align Critical Decisions (before coding)
- [x] Confirm device identity strategy (`deviceId` from mobile vs server-generated key).
- [x] Confirm admin ownership rule for newly registered devices (single admin user mapping).
- [x] Confirm SMS dedup strategy (client message id, hash, or `(deviceId + timestamp + sender + body)`).
- [x] Confirm media sync contract: metadata-only now vs metadata + upload references.
- [x] Confirm minimal auth for processor endpoints (shared token or signed header) for Stage 1.
Decisions captured:
- Keep two identifiers per device:
- `id` (server-generated primary key)
- `externalDeviceId` (sent by mobile)
- All devices are owned by the single admin user in Stage 1.
- SMS dedup uses:
- optional client message id when available
- fallback deterministic hash from `(deviceId + timestamp + sender + body)`.
- Media sync is metadata plus upload reference (`fileId`) to the file domain/R2 object.
- Stage 1 endpoint auth:
- register is open on the trusted network
- ping/sync endpoints must include a device id header; the request is allowed only if the device exists.
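The fallback dedup key from the decision above can be sketched as follows — a minimal sketch using Node's `crypto`; the `SmsIdentity` shape and the NUL separator are illustrative assumptions, not the shipped code:

```typescript
import { createHash } from "node:crypto";

// Hypothetical input shape mirroring the (deviceId + timestamp + sender + body) rule.
interface SmsIdentity {
  deviceId: string;
  timestamp: number; // epoch millis of sentAt
  sender: string;
  body: string;
}

// Deterministic fallback id used when no client message id is available.
// A fixed separator avoids collisions like ("ab","c") vs ("a","bc").
export function smsDedupKey(sms: SmsIdentity): string {
  return createHash("sha256")
    .update([sms.deviceId, String(sms.timestamp), sms.sender, sms.body].join("\u0000"))
    .digest("hex");
}
```

Because the key is deterministic, a replayed sync batch maps to the same keys and can be skipped on insert.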
### Phase 1: Data Model and DB Migrations
Target: `packages/db/schema/*` + new migration files.
- [x] Add `mobile_device` table:
- Fields: `id`, `externalDeviceId`, `name`, `manufacturer`, `model`, `androidVersion`, `ownerUserId`, `lastPingAt`, `createdAt`, `updatedAt`.
- [x] Add `mobile_sms` table:
- Fields: `id`, `deviceId`, `externalMessageId?`, `sender`, `recipient?`, `body`, `sentAt`, `receivedAt?`, `rawPayload?`, `createdAt`.
- [x] Add `mobile_media_asset` table:
- Fields: `id`, `deviceId`, `externalMediaId?`, `mimeType`, `filename?`, `capturedAt?`, `sizeBytes?`, `hash?`, `metadata`, `createdAt`.
- [x] Add indexes for common reads:
- `mobile_device.lastPingAt`
- `mobile_sms.deviceId + sentAt desc`
- `mobile_media_asset.deviceId + createdAt desc`
- [x] Export schema from `packages/db/schema/index.ts`.
Definition of done:
- [x] Tables and indexes exist in schema + migration files.
- [x] Naming matches existing conventions.
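As a rough illustration of the Phase 1 schema work, the `mobile_sms` table with its `deviceId + sentAt` index might look like this in Drizzle — column names and types here are assumptions; the real definitions live in `packages/db/schema/*`:

```typescript
import { index, pgTable, text, timestamp } from "drizzle-orm/pg-core";

// Sketch only: mirrors the Phase 1 field list for mobile_sms,
// plus the "deviceId + sentAt desc" read index from the checklist.
export const mobileSms = pgTable(
  "mobile_sms",
  {
    id: text("id").primaryKey(),
    deviceId: text("device_id").notNull(),
    externalMessageId: text("external_message_id"),
    sender: text("sender").notNull(),
    recipient: text("recipient"),
    body: text("body").notNull(),
    sentAt: timestamp("sent_at", { withTimezone: true }).notNull(),
    receivedAt: timestamp("received_at", { withTimezone: true }),
    createdAt: timestamp("created_at", { withTimezone: true }).notNull().defaultNow(),
  },
  (t) => [index("mobile_sms_device_sent_idx").on(t.deviceId, t.sentAt)],
);
```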
### Phase 2: Logic Domain (`@pkg/logic`)
Target: `packages/logic/domains/mobile/*` (new domain).
- [x] Create `data.ts` with Valibot schemas and inferred types for:
- register device payload
- ping payload
- sms sync payload
- media sync payload
- admin query filters/pagination
- [x] Create `errors.ts` with domain error constructors using `getError`.
- [x] Create `repository.ts` with ResultAsync operations for:
- upsert device
- update last ping
- bulk insert sms (idempotent)
- bulk insert media metadata (idempotent)
- list devices
- get device detail
- list sms by device (paginated, latest first)
- list media by device (paginated, latest first)
- delete single media asset
- delete device (removes all child SMS and media assets and, by extension, the associated files in R2 + DB)
- [x] Create `controller.ts` as use-case layer.
- [x] Wrap repository/controller operations with `traceResultAsync` and include `fctx`.
Definition of done:
- [x] Processor and main app can consume this domain without direct DB usage.
- [x] No ad-hoc thrown errors in domain flow; Result pattern is preserved.
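The idempotent bulk-insert requirement above can be sketched as a pure filtering step — a simplified stand-in; the real repository works through Drizzle and ResultAsync, and the row shape here is hypothetical:

```typescript
// Hypothetical row shape; the real repository uses the schema tables.
interface IncomingSms {
  dedupKey: string;
  sender: string;
  body: string;
  sentAt: number;
}

// Drop rows whose dedup key is already stored (or repeated in-batch),
// so a replayed sync request inserts nothing twice.
export function filterNewSms(
  batch: IncomingSms[],
  existingKeys: ReadonlySet<string>,
): IncomingSms[] {
  const seen = new Set(existingKeys);
  const fresh: IncomingSms[] = [];
  for (const sms of batch) {
    if (seen.has(sms.dedupKey)) continue;
    seen.add(sms.dedupKey);
    fresh.push(sms);
  }
  return fresh;
}
```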
### Phase 3: Processor Ingestion API (`apps/processor`)
Target: `apps/processor/src/domains/mobile/router.ts`.
- [x] Replace stub endpoints with full handlers:
- `POST /register`
- `PUT /ping`
- `PUT /sms/sync`
- `PUT /media/sync`
- [x] Validate request body using schemas from `@pkg/logic/domains/mobile/data`.
- [x] Build `FlowExecCtx` per request (flow id always; user/session when available).
- [x] Call mobile controller methods; return `{ data, error }` response shape.
- [x] Add basic endpoint protection agreed in Phase 0.
- [x] Add request-level structured logging for success/failure.
Definition of done:
- [x] Endpoints persist data into new tables.
- [x] Endpoint failures return normalized domain errors.
### Phase 4: Admin UI in `apps/main`
Target:
- `apps/main/src/lib/domains/mobile/*` (new)
- `apps/main/src/routes/(main)/devices/[deviceId]` (new)
- [x] Add remote functions:
- `getDevicesSQ`
- `getDeviceDetailSQ`
- `getDeviceSmsSQ`
- `getDeviceMediaSQ`
- [x] Add VM (`devices.vm.svelte.ts`) that:
- fetches devices list
- fetches selected device detail
- polls SMS every 5 seconds while device view is open
- handles loading/error states
- [x] Add UI pages/components:
- `/dashboard` list with device identity + last ping
- `/devices/[deviceId]` detail with tabs:
- Device info
- SMS feed
- Media assets list
- [x] Add sidebar/navigation entry for Devices.
Definition of done:
- [x] Admin can browse devices and open each device detail.
- [x] SMS view refreshes every 5 seconds and shows new data.
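The 5-second SMS polling behaviour can be sketched framework-agnostically; the actual VM is `devices.vm.svelte.ts` and holds its state in Svelte, so this class is an illustrative assumption:

```typescript
// Minimal polling sketch: fetch immediately on start, then on an interval,
// and stop cleanly when the device view closes.
export class SmsPoller {
  private timer: ReturnType<typeof setInterval> | null = null;

  constructor(
    private readonly fetchSms: () => Promise<void>,
    private readonly intervalMs = 5_000,
  ) {}

  start(): void {
    if (this.timer) return; // already polling
    void this.fetchSms(); // immediate first load
    this.timer = setInterval(() => void this.fetchSms(), this.intervalMs);
  }

  stop(): void {
    if (this.timer) {
      clearInterval(this.timer);
      this.timer = null;
    }
  }
}
```

Calling `start()` twice is a no-op, so the view can be defensive about re-mounts without double-polling.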
### Phase 5: Observability Stage 1
Targets:
- `packages/logic/core/observability.ts` (use existing helpers)
- Processor/mobile domain handlers and repository/controller paths
- [x] Add span names for key flows:
- `mobile.register`
- `mobile.ping`
- `mobile.sms.sync`
- `mobile.media.sync`
- `mobile.devices.list`
- `mobile.device.detail`
- [x] Add structured domain events with device id, counts, durations.
- [x] Ensure errors include `flowId` consistently.
Definition of done:
- [x] Can trace one request from processor endpoint to DB operation via shared `flowId`.
### Phase 6: Handoff Readiness (Team Test Phase)
- [x] Prepare endpoint payload examples for mobile team (in a `spec.mobile.md` file).
- [x] Provide a short operator checklist:
- register device
- verify ping updates
- verify sms appears in admin
- verify media appears in admin
## Suggested Build Order
1. Phase 0
2. Phase 1
3. Phase 2
4. Phase 3
5. Phase 4
6. Phase 5
7. Phase 6
Illusory MAPP is an internal project focused on building the foundation for future automation workflows. Right now, the platform provides a working ingestion backend and a simple admin dashboard: devices can register and sync data, and the dashboard lets the team inspect each device's current state, including SMS history and uploaded gallery/media assets. This gives us a stable operational base for visibility and data quality before layering on higher-level automation features.

View File

@@ -28,12 +28,14 @@
"@pkg/logic": "workspace:*",
"@pkg/result": "workspace:*",
"@pkg/settings": "workspace:*",
"argon2": "^0.43.0",
"better-auth": "^1.4.20",
"date-fns": "^4.1.0",
"import-in-the-middle": "^3.0.0",
"nanoid": "^5.1.6",
"neverthrow": "^8.2.0",
"qrcode": "^1.5.4",
"sharp": "^0.34.5",
"valibot": "^1.2.0"
},
"devDependencies": {

View File

@@ -51,6 +51,10 @@ const deleteFileInputSchema = v.object({
fileId: v.string(),
});
const getFileAccessUrlInputSchema = v.object({
fileId: v.string(),
});
export const deleteFileSC = command(deleteFileInputSchema, async (input) => {
const event = getRequestEvent();
const fctx = await getFlowExecCtxForRemoteFuncs(event.locals);
@@ -64,6 +68,22 @@ export const deleteFileSC = command(deleteFileInputSchema, async (input) => {
: { data: null, error: res.error };
});
export const getFileAccessUrlSQ = query(
getFileAccessUrlInputSchema,
async (input) => {
const event = getRequestEvent();
const fctx = await getFlowExecCtxForRemoteFuncs(event.locals);
if (!fctx.userId) {
return unauthorized(fctx);
}
const res = await fc.getFileAccessUrl(fctx, input.fileId, fctx.userId, 3600);
return res.isOk()
? { data: res.value, error: null }
: { data: null, error: res.error };
},
);
export const cleanupDanglingFilesSC = command(v.object({}), async () => {
const event = getRequestEvent();
const fctx = await getFlowExecCtxForRemoteFuncs(event.locals);

View File

@@ -1,5 +1,10 @@
import type { File } from "@pkg/logic/domains/files/data";
import { cleanupDanglingFilesSC, deleteFileSC, getFilesSQ } from "./files.remote";
import {
cleanupDanglingFilesSC,
deleteFileSC,
getFileAccessUrlSQ,
getFilesSQ,
} from "./files.remote";
import { toast } from "svelte-sonner";
class FilesViewModel {
@@ -104,6 +109,25 @@ class FilesViewModel {
this.cleanupLoading = false;
}
}
async openFile(fileId: string) {
try {
const result = await getFileAccessUrlSQ({ fileId });
if (result?.error || !result?.data?.url) {
toast.error(result?.error?.message || "Failed to open file", {
description: result?.error?.description || "Please try again later",
});
return;
}
window.open(result.data.url, "_blank", "noopener,noreferrer");
} catch (error) {
toast.error("Failed to open file", {
description:
error instanceof Error ? error.message : "Please try again later",
});
}
}
}
export const filesVM = new FilesViewModel();

View File

@@ -1,4 +1,5 @@
import { getMobileController } from "@pkg/logic/domains/mobile/controller";
import { getFileController } from "@pkg/logic/domains/files/controller";
import {
mobilePaginationSchema,
listMobileDeviceMediaFiltersSchema,
@@ -12,6 +13,7 @@ import { command, getRequestEvent, query } from "$app/server";
import * as v from "valibot";
const mc = getMobileController();
const fc = getFileController();
const getDevicesInputSchema = v.object({
search: v.optional(v.string()),
@@ -104,9 +106,34 @@ export const getDeviceMediaSQ = query(
});
const res = await mc.listDeviceMedia(fctx, filters, input.pagination);
return res.isOk()
? { data: res.value, error: null }
: { data: null, error: res.error };
if (res.isErr()) {
return { data: null, error: res.error };
}
const rows = res.value.data;
const withAccessUrls = await Promise.all(
rows.map(async (row) => {
const access = await fc.getFileAccessUrl(
fctx,
row.fileId,
fctx.userId!,
3600,
);
return {
...row,
r2Url: access.isOk() ? access.value.url : null,
};
}),
);
return {
data: {
...res.value,
data: withAccessUrls,
},
error: null,
};
},
);

View File

@@ -98,14 +98,13 @@
{new Date(item.uploadedAt).toLocaleString()}
</Table.Cell>
<Table.Cell>
<a
href={item.r2Url}
target="_blank"
rel="noreferrer"
<button
type="button"
onclick={() => void filesVM.openFile(item.id)}
class="text-xs text-primary underline"
>
Open
</a>
</button>
</Table.Cell>
<Table.Cell>
<Button

View File

@@ -1,93 +0,0 @@
import { getUserController } from "@pkg/logic/domains/user/controller";
import type { Session, User } from "@pkg/logic/domains/user/data";
import { auth } from "@pkg/logic/domains/auth/config.base";
import type { RequestHandler } from "@sveltejs/kit";
import { env } from "$env/dynamic/private";
import { api } from "$lib/api";
async function createContext(locals: App.Locals) {
return { ...env, locals };
}
async function getExecutionContext(sess: Session, user: User) {
const flowId = crypto.randomUUID();
return {
flowId: flowId,
userId: user.id,
sessionId: sess.id,
};
}
async function getContext(headers: Headers) {
const sess = await auth.api.getSession({ headers });
if (!sess?.session) {
return false;
}
// @ts-ignore
const fCtx = getExecutionContext(sess.session, sess.user);
return await getUserController()
.getUserInfo(fCtx, sess.user.id)
.match(
(user) => {
return {
user: user,
session: sess.session,
fCtx: fCtx,
};
},
(error) => {
console.error(error);
return false;
},
);
}
export const GET: RequestHandler = async ({ request }) => {
const context = await getContext(request.headers);
if (!context || typeof context === "boolean") {
return new Response("Unauthorized", { status: 401 });
}
return api.fetch(request, await createContext(context));
};
export const HEAD: RequestHandler = async ({ request }) => {
const context = await getContext(request.headers);
if (!context || typeof context === "boolean") {
return new Response("Unauthorized", { status: 401 });
}
return api.fetch(request, await createContext(context));
};
export const POST: RequestHandler = async ({ request }) => {
const context = await getContext(request.headers);
if (!context || typeof context === "boolean") {
return new Response("Unauthorized", { status: 401 });
}
return api.fetch(request, await createContext(context));
};
export const PUT: RequestHandler = async ({ request }) => {
const context = await getContext(request.headers);
if (!context || typeof context === "boolean") {
return new Response("Unauthorized", { status: 401 });
}
return api.fetch(request, await createContext(context));
};
export const DELETE: RequestHandler = async ({ request }) => {
const context = await getContext(request.headers);
if (!context || typeof context === "boolean") {
return new Response("Unauthorized", { status: 401 });
}
return api.fetch(request, await createContext(context));
};
export const OPTIONS: RequestHandler = async ({ request }) => {
const context = await getContext(request.headers);
if (!context || typeof context === "boolean") {
return new Response("Unauthorized", { status: 401 });
}
return api.fetch(request, await createContext(context));
};

View File

@@ -8,7 +8,14 @@ export default defineConfig({
plugins: [sveltekit(), tailwindcss(), Icons({ compiler: "svelte" })],
ssr: {
external: ["argon2", "node-gyp-build"],
external: [
"argon2",
"node-gyp-build",
"sharp",
"@img/sharp-linux-x64",
"@img/sharp-linuxmusl-x64",
"@img/sharp-wasm32",
],
},
resolve: {

View File

@@ -4,7 +4,7 @@
"scripts": {
"dev": "tsx watch src/index.ts",
"build": "tsc",
"start": "node dist/index.js"
"prod": "HOST=0.0.0.0 PORT=3000 tsx src/index.ts"
},
"dependencies": {
"@opentelemetry/api": "^1.9.0",

View File

@@ -7,6 +7,8 @@ import { Hono } from "hono";
const app = new Hono();
app.use("*", httpTelemetryMiddleware);
const host = process.env.HOST || "0.0.0.0";
const port = Number(process.env.PORT || "3000");
app.get("/health", (c) => {
return c.json({ ok: true });
@@ -21,9 +23,10 @@ app.route("/api/v1/mobile", mobileRouter);
serve(
{
fetch: app.fetch,
port: 3000,
port,
hostname: host,
},
(info) => {
console.log(`Server is running on http://localhost:${info.port}`);
console.log(`Server is running on http://${host}:${info.port}`);
},
);

View File

@@ -1,4 +1,4 @@
FROM node:25.6.1 AS production
FROM node:25.6.1 AS base
RUN npm i -g pnpm
@@ -8,6 +8,7 @@ WORKDIR /app
COPY package.json pnpm-lock.yaml pnpm-workspace.yaml turbo.json ./
COPY packages/settings packages/settings
COPY packages/db packages/db
RUN pnpm install

View File

@@ -1,6 +1,6 @@
FROM node:25.6.1-alpine AS production
FROM node:25.6.1-alpine AS deps
RUN apk add --no-cache xh
RUN npm i -g pnpm
WORKDIR /app
@@ -10,17 +10,12 @@ COPY apps/processor/package.json ./apps/processor/package.json
COPY packages ./packages
RUN pnpm install
RUN pnpm install --frozen-lockfile
COPY apps/processor ./apps/processor
RUN pnpm install
COPY scripts ./scripts
EXPOSE 3000
EXPOSE 9001
RUN chmod +x scripts/*.sh
CMD ["/bin/sh", "scripts/prod.start.sh", "apps/processor"]
CMD ["pnpm", "--filter", "@apps/processor", "run", "prod"]

View File

@@ -43,6 +43,33 @@ export class FileController {
});
}
getFileAccessUrl(
fctx: FlowExecCtx,
fileId: string,
userId: string,
expiresIn: number = 3600,
) {
return traceResultAsync({
name: "logic.files.controller.getFileAccessUrl",
fctx,
attributes: {
"app.user.id": userId,
"app.file.id": fileId,
"app.file.access_ttl_sec": expiresIn,
},
fn: () =>
this.fileRepo
.getFileById(fctx, fileId, userId)
.andThen((file) =>
this.storageRepo.generatePresignedDownloadUrl(
fctx,
file.objectKey,
expiresIn,
),
),
});
}
uploadFile(
fctx: FlowExecCtx,
userId: string,

View File

@@ -91,6 +91,14 @@ export type PresignedUploadResponse = v.InferOutput<
typeof presignedUploadResponseSchema
>;
export const presignedFileAccessResponseSchema = v.object({
url: v.string(),
expiresIn: v.pipe(v.number(), v.integer()),
});
export type PresignedFileAccessResponse = v.InferOutput<
typeof presignedFileAccessResponseSchema
>;
export const fileUploadResultSchema = v.object({
success: v.boolean(),
file: v.optional(fileSchema),

View File

@@ -1,6 +1,9 @@
import { ResultAsync, errAsync, okAsync } from "neverthrow";
import { FlowExecCtx } from "@core/flow.execution.context";
import type { PresignedUploadResponse } from "./data";
import type {
PresignedFileAccessResponse,
PresignedUploadResponse,
} from "./data";
import { R2StorageClient } from "@pkg/objectstorage";
import { type Err } from "@pkg/result";
import { fileErrors } from "./errors";
@@ -209,6 +212,78 @@ export class StorageRepository {
});
}
generatePresignedDownloadUrl(
fctx: FlowExecCtx,
objectKey: string,
expiresIn: number,
): ResultAsync<PresignedFileAccessResponse, Err> {
const startedAt = Date.now();
logDomainEvent({
event: "files.storage.presigned_download.started",
fctx,
meta: { objectKey, expiresIn },
});
return ResultAsync.fromPromise(
this.storageClient.generatePresignedDownloadUrl(objectKey, expiresIn),
(error) => {
logDomainEvent({
level: "error",
event: "files.storage.presigned_download.failed",
fctx,
durationMs: Date.now() - startedAt,
error,
meta: { objectKey },
});
return fileErrors.presignedUrlFailed(
fctx,
error instanceof Error ? error.message : String(error),
);
},
).andThen((result) => {
if (result.error) {
logDomainEvent({
level: "error",
event: "files.storage.presigned_download.failed",
fctx,
durationMs: Date.now() - startedAt,
error: result.error,
meta: { objectKey, stage: "storage_response" },
});
return errAsync(
fileErrors.presignedUrlFailed(fctx, String(result.error)),
);
}
const data = result.data;
if (!data?.downloadUrl) {
logDomainEvent({
level: "error",
event: "files.storage.presigned_download.failed",
fctx,
durationMs: Date.now() - startedAt,
error: {
code: "NO_PRESIGNED_DATA",
message: "No presigned download data returned",
},
meta: { objectKey },
});
return errAsync(fileErrors.noPresignedData(fctx));
}
logDomainEvent({
event: "files.storage.presigned_download.succeeded",
fctx,
durationMs: Date.now() - startedAt,
meta: { objectKey, expiresIn },
});
return okAsync({
url: data.downloadUrl,
expiresIn: data.expiresIn,
});
});
}
deleteFile(
fctx: FlowExecCtx,
objectKey: string,

View File

@@ -4,12 +4,14 @@ import {
GetObjectCommand,
HeadObjectCommand,
ListObjectsV2Command,
type ListObjectsV2CommandOutput,
PutObjectCommand,
S3Client,
} from "@aws-sdk/client-s3";
import type {
FileMetadata,
FileUploadConfig,
PresignedDownloadResult,
PresignedUrlResult,
UploadOptions,
UploadResult,
@@ -265,6 +267,46 @@ export class R2StorageClient {
}
}
/**
* Generate presigned URL for private object download/view
*/
async generatePresignedDownloadUrl(
objectKey: string,
expiresIn: number = 3600,
): Promise<Result<PresignedDownloadResult>> {
try {
const command = new GetObjectCommand({
Bucket: this.config.bucketName,
Key: objectKey,
});
const downloadUrl = await getSignedUrl(this.s3Client, command, {
expiresIn,
});
logger.info(`Generated presigned download URL for ${objectKey}`);
return {
data: {
downloadUrl,
expiresIn,
},
};
} catch (error) {
logger.error("Failed to generate presigned download URL:", error);
return {
error: getError(
{
code: ERROR_CODES.STORAGE_ERROR,
message: "Failed to generate presigned download URL",
description: "Could not create download URL",
detail: "S3 presigned URL generation failed",
},
error,
),
};
}
}
/**
* Get file from R2
*/
@@ -493,13 +535,14 @@ export class R2StorageClient {
let continuationToken: string | undefined = undefined;
do {
const command = new ListObjectsV2Command({
const command: ListObjectsV2Command = new ListObjectsV2Command({
Bucket: this.config.bucketName,
Prefix: prefix,
ContinuationToken: continuationToken,
});
const response = await this.s3Client.send(command);
const response: ListObjectsV2CommandOutput =
await this.s3Client.send(command);
const pageKeys =
response.Contents?.map((item) => item.Key).filter(
(key): key is string => Boolean(key),

View File

@@ -60,6 +60,14 @@ export const presignedUrlResultSchema = v.object({
});
export type PresignedUrlResult = v.InferOutput<typeof presignedUrlResultSchema>;
export const presignedDownloadResultSchema = v.object({
downloadUrl: v.string(),
expiresIn: v.pipe(v.number(), v.integer()),
});
export type PresignedDownloadResult = v.InferOutput<
typeof presignedDownloadResultSchema
>;
// File Validation Result Schema
export const fileValidationResultSchema = v.object({
isValid: v.boolean(),
@@ -108,9 +116,15 @@ export const fileProcessingResultSchema = v.object({
metadata: v.optional(v.record(v.string(), v.any())),
error: v.optional(v.string()),
});
export type FileProcessingResult = v.InferOutput<
typeof fileProcessingResultSchema
>;
export type BinaryFileData = Uint8Array<ArrayBufferLike>;
export type FileProcessingResult = {
processed: boolean;
originalFile?: BinaryFileData;
processedFile?: BinaryFileData;
thumbnail?: BinaryFileData;
metadata?: Record<string, any>;
error?: string;
};
// File Security Result Schema (from utils.ts)
export const fileSecurityResultSchema = v.object({

View File

@@ -1,6 +1,10 @@
import { createHash } from "crypto";
import type { DocumentProcessingOptions, FileProcessingResult } from "../data";
function asByteArray(input: Buffer | Uint8Array): Uint8Array {
return Uint8Array.from(input);
}
/**
* Process documents (PDF, text files, etc.)
*/
@@ -78,8 +82,8 @@ async function processPDF(
return {
processed: true,
originalFile: buffer,
processedFile: buffer, // PDFs typically don't need processing
originalFile: asByteArray(buffer),
processedFile: asByteArray(buffer), // PDFs typically don't need processing
metadata,
};
}
@@ -108,8 +112,8 @@ async function processTextFile(
return {
processed: true,
originalFile: buffer,
processedFile: buffer,
originalFile: asByteArray(buffer),
processedFile: asByteArray(buffer),
metadata,
};
}
@@ -125,8 +129,8 @@ async function processGenericDocument(
return {
processed: true,
originalFile: buffer,
processedFile: buffer,
originalFile: asByteArray(buffer),
processedFile: asByteArray(buffer),
metadata,
};
}

View File

@@ -145,9 +145,11 @@ export async function processImage(
return {
processed: true,
originalFile: inputBuffer,
processedFile: processedBuffer,
thumbnail: thumbnailBuffer,
originalFile: Uint8Array.from(inputBuffer),
processedFile: Uint8Array.from(processedBuffer),
thumbnail: thumbnailBuffer
? Uint8Array.from(thumbnailBuffer)
: undefined,
metadata,
};
} catch (error) {

View File

@@ -49,8 +49,8 @@ export async function processVideo(
return {
processed: true,
originalFile: inputBuffer,
processedFile: inputBuffer, // Videos are typically not re-encoded during upload
originalFile: Uint8Array.from(inputBuffer),
processedFile: Uint8Array.from(inputBuffer), // Videos are typically not re-encoded during upload
metadata,
};
} catch (error) {

6
pnpm-lock.yaml generated
View File

@@ -65,6 +65,9 @@ importers:
'@pkg/settings':
specifier: workspace:*
version: link:../../packages/settings
argon2:
specifier: ^0.43.0
version: 0.43.1
better-auth:
specifier: ^1.4.20
version: 1.4.20(@sveltejs/kit@2.53.4(@opentelemetry/api@1.9.0)(@sveltejs/vite-plugin-svelte@6.2.4(svelte@5.53.6)(vite@7.3.1(@types/node@25.3.3)(jiti@2.6.1)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2)))(svelte@5.53.6)(typescript@5.9.3)(vite@7.3.1(@types/node@25.3.3)(jiti@2.6.1)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2)))(drizzle-kit@0.31.9)(drizzle-orm@0.45.1(@opentelemetry/api@1.9.0)(@types/pg@8.16.0)(bun-types@1.3.9)(kysely@0.28.11)(postgres@3.4.8))(svelte@5.53.6)(vitest@4.0.18(@opentelemetry/api@1.9.0)(@types/node@25.3.3)(jiti@2.6.1)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2))
@@ -83,6 +86,9 @@ importers:
qrcode:
specifier: ^1.5.4
version: 1.5.4
sharp:
specifier: ^0.34.5
version: 0.34.5
valibot:
specifier: ^1.2.0
version: 1.2.0(typescript@5.9.3)

View File

@@ -0,0 +1,95 @@
#!/usr/bin/env python3
from __future__ import annotations
import argparse
import re
from pathlib import Path
ENV_ASSIGNMENT_RE = re.compile(
r"^(\s*(?:export\s+)?)([A-Za-z_][A-Za-z0-9_]*)(\s*=\s*)(.*)$"
)
def split_inline_comment(rhs: str) -> tuple[str, str]:
in_single = False
in_double = False
escaped = False
for i, char in enumerate(rhs):
if escaped:
escaped = False
continue
if char == "\\":
escaped = True
continue
if char == "'" and not in_double:
in_single = not in_single
continue
if char == '"' and not in_single:
in_double = not in_double
continue
if char == "#" and not in_single and not in_double:
if i == 0 or rhs[i - 1].isspace():
return rhs[:i], rhs[i:]
return rhs, ""
def transform_line(line: str) -> str:
stripped = line.strip()
if stripped == "" or stripped.startswith("#"):
return line
newline = ""
raw = line
if line.endswith("\r\n"):
newline = "\r\n"
raw = line[:-2]
elif line.endswith("\n"):
newline = "\n"
raw = line[:-1]
match = ENV_ASSIGNMENT_RE.match(raw)
if not match:
return line
prefix, key, delimiter, rhs = match.groups()
value_part, comment_part = split_inline_comment(rhs)
trailing_ws = value_part[len(value_part.rstrip()) :]
placeholder = f"${{{{project.{key}}}}}"
return f"{prefix}{key}{delimiter}{placeholder}{trailing_ws}{comment_part}{newline}"
def generate_example_env(source_path: Path, target_path: Path) -> None:
lines = source_path.read_text(encoding="utf-8").splitlines(keepends=True)
transformed = [transform_line(line) for line in lines]
target_path.write_text("".join(transformed), encoding="utf-8")
def main() -> None:
parser = argparse.ArgumentParser(
description="Generate .env.example using ${{project.KEY}} placeholders."
)
parser.add_argument("--source", default=".env", help="Path to source .env file")
parser.add_argument(
"--target", default=".env.example", help="Path to output .env.example file"
)
args = parser.parse_args()
source_path = Path(args.source)
target_path = Path(args.target)
if not source_path.exists():
raise FileNotFoundError(f"Source env file not found: {source_path}")
generate_example_env(source_path, target_path)
if __name__ == "__main__":
main()