KnoksPix – AI-Powered Image & Code Companion

(Banner image)

Badges: Live Preview · Smoke Test · CI/CD · Issues · Last Commit · Code Size · Top Language · PR Previews · License · Docker · Python · TypeScript · Static Analysis · Tests

KnoksPix is an AI-first creative workspace combining:

  • Intelligent image editing & adjustment workflow
  • Generative augmentation via Gemini / (optional) local Starcoder2 backend
  • Cross‑platform delivery (Web + Desktop [Electron] + Mobile [Capacitor/Android])
  • Extensible architecture (pluggable model backends & tools)

Demo: View the app in AI Studio


Table of Contents

  1. Features
  2. Screenshots
  3. Quick Start
  4. Frontend
  5. Backend API (Starcoder2)
  6. Deployment Options
  7. Preview Deployments
  8. Environment Variables
  9. Testing
  10. Architecture
  11. Security
  12. Release Checklist
  13. Contributing
  14. License

Features

| Area | Highlights |
| --- | --- |
| Image Editing | Crop, filters, adjustment panel, object layer cards |
| AI Integration | Gemini API for generation (pluggable) |
| Local LLM Option | Optional Starcoder2 backend with SSE streaming, OpenAI-compatible endpoint |
| Performance | Vite + React + code splitting |
| Cross Platform | Web, Electron desktop builds, Android (Capacitor) |
| Tooling | Jest tests, GitHub Actions smoke & CI, PR preview deploys |
| Observability | Prometheus metrics, structured logging (backend) |
| Rate Limiting | SlowAPI sliding window on backend |

Screenshots

(Placeholder screenshot captures, 2025-09-08 — to be replaced; see Release Checklist.)


Quick Start

Frontend Only (using hosted Gemini)

```bash
git clone https://github.com/knoksen/knoksPix.git
cd knoksPix
npm ci
echo "GEMINI_API_KEY=your_key" > .env.local
npm run dev
```

Open http://localhost:5173

Full Stack (with local Starcoder2 backend)

```bash
cp backend/.env.sample backend/.env
# Edit backend/.env as needed (MODEL_ID, tokens, mock mode etc.)
docker compose up --build
# Frontend (separate terminal)
npm run dev
```

Backend API docs: http://localhost:8000/docs


Frontend

React + TypeScript + Vite. Core UI elements live in components/. State is localized per panel, keeping bundles lean. Tests under tests/ use Jest + React Testing Library.

Build production bundle:

```bash
npm run build
```

Serve locally:

```bash
npm run preview
```

Backend API Service (Starcoder2)

Located in backend/. Provides:

  • POST /v1/chat/completions – OpenAI-style (stream or non-stream)
  • POST /v1/generate – Simple prompt generation (stream or non-stream)
  • GET /metrics – Prometheus metrics
  • GET /healthz – Liveness check

Streaming uses Server-Sent Events (SSE). The chat endpoint emits OpenAI-compatible chat.completion.chunk objects; the generation endpoint emits { "text": "..." } chunks followed by a [DONE] sentinel.

Mock Mode (no model download): set USE_MOCK_GENERATION=1.

Backend Setup

```bash
cd backend
python -m venv .venv && source .venv/bin/activate  # Windows: .venv\Scripts\activate
pip install -r requirements.txt -r requirements-dev.txt
cp .env.sample .env  # edit values
uvicorn main:app --reload --port 8000
```

Configuration Reference

| Variable | Purpose | Default |
| --- | --- | --- |
| MODEL_ID | Hugging Face model ID to load | bigcode/starcoder2-3b |
| HF_TOKEN | (Optional) auth for private models | empty |
| STARCODER2_API_TOKEN | Bearer token required from clients | changeme |
| USE_MOCK_GENERATION | Skip model load; return synthetic outputs | 0 |
| MAX_NEW_TOKENS_LIMIT | Hard upper bound on user-requested tokens | 512 |
| RATE_LIMIT | slowapi rate expression | 100/minute |
| LOG_LEVEL | Logging threshold | INFO |
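The backend's actual settings code is not reproduced here; the following is a minimal sketch of how these variables might be read and validated. The variable names and defaults match the table above, but the helper functions themselves are hypothetical:

```python
import os


def load_settings(env=os.environ):
    """Read backend settings from the environment, applying the documented defaults."""
    return {
        "model_id": env.get("MODEL_ID", "bigcode/starcoder2-3b"),
        "hf_token": env.get("HF_TOKEN", ""),
        "api_token": env.get("STARCODER2_API_TOKEN", "changeme"),
        "use_mock": env.get("USE_MOCK_GENERATION", "0") == "1",
        "max_new_tokens_limit": int(env.get("MAX_NEW_TOKENS_LIMIT", "512")),
        "rate_limit": env.get("RATE_LIMIT", "100/minute"),
        "log_level": env.get("LOG_LEVEL", "INFO"),
    }


def clamp_max_new_tokens(requested: int, settings: dict) -> int:
    """Enforce the hard upper bound regardless of what the client asks for."""
    return max(1, min(requested, settings["max_new_tokens_limit"]))
```

With the defaults, a request asking for 4096 new tokens would be clamped to 512.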

Streaming Protocol Details

Chat endpoint (/v1/chat/completions, stream=true):

```
data: {"id":"...","object":"chat.completion.chunk","choices":[{"delta":{"content":"def"}}]}
data: {"id":"...","object":"chat.completion.chunk","choices":[{"delta":{"content":" add"}}]}
data: {"id":"...","object":"chat.completion.chunk","choices":[{"delta":{},"finish_reason":"stop"}]}
data: [DONE]
```

Generate endpoint (/v1/generate, stream=true):

```
data: {"text":"partial token"}
data: {"text":" more"}
data: [DONE]
```

Both endpoints send text/event-stream; charset=utf-8 and can be consumed with any SSE client. Non‑stream mode aggregates full text in a single JSON object.
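As a sketch of consuming this protocol, the following parser (a hypothetical helper, not part of the repo) accumulates chat deltas from raw SSE lines until the [DONE] sentinel:

```python
import json


def assemble_chat_stream(sse_lines):
    """Accumulate assistant text from OpenAI-style chat.completion.chunk SSE lines.

    Each data line looks like 'data: {...}'; the stream ends with 'data: [DONE]'.
    """
    parts = []
    for line in sse_lines:
        if not line.startswith("data: "):
            continue  # skip blank keep-alive lines and SSE comments
        payload = line[len("data: "):].strip()
        if payload == "[DONE]":
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0].get("delta", {})
        if "content" in delta:
            parts.append(delta["content"])
    return "".join(parts)
```

Fed the example chat chunks above, this yields the concatenated string "def add".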

Observability

Metric names (Prometheus):

  • http_requests_total / latency histograms (instrumentator defaults)
  • Custom counters (planned): token usage per request (placeholder if not yet present)

Dashboards: Point Grafana at the Prometheus service (see docker-compose.yml).

Performance Notes

  • Use mock mode in CI to avoid multi‑GB model pulls.
  • For GPU: extend Dockerfile build args (CUDA / ROCm) and launch with --gpus=all (already hinted in compose).
  • Adjust MAX_NEW_TOKENS_LIMIT to guard latency & memory.
  • Consider swapping to a TextIteratorStreamer for finer token pacing (roadmap).

Example (chat streaming):

```bash
curl -N \
   -H "Authorization: Bearer $STARCODER2_API_TOKEN" \
   -H "Content-Type: application/json" \
   -d '{
      "model": "bigcode/starcoder2-3b",
      "messages": [{"role":"user","content":"Write a Python hello world"}],
      "stream": true
   }' \
   http://localhost:8000/v1/chat/completions
```

Example (generate streaming):

```bash
curl -N \
   -H "Authorization: Bearer $STARCODER2_API_TOKEN" \
   -H "Content-Type: application/json" \
   -d '{
      "prompt": "def add(a,b):\n    return a+b",
      "max_new_tokens": 64,
      "stream": true
   }' \
   http://localhost:8000/v1/generate
```

Python Helper Client

```python
from starcoder2_client import ChatClient

client = ChatClient(base_url="http://localhost:8000", token="changeme")
resp = client.chat([{"role": "user", "content": "Write a Python function add(a,b)."}])
print(resp["choices"][0]["message"]["content"])
```
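The starcoder2_client implementation itself is not reproduced here; as a rough, hypothetical sketch of what such a client does under the hood (assuming only the OpenAI-style endpoint and bearer-token auth described above):

```python
import json
import urllib.request


class ChatClient:
    """Minimal sketch of a client for the /v1/chat/completions endpoint."""

    def __init__(self, base_url, token, model="bigcode/starcoder2-3b"):
        self.base_url = base_url.rstrip("/")
        self.token = token
        self.model = model

    def build_request(self, messages, stream=False):
        """Construct the URL, headers, and JSON body for a chat call."""
        body = {"model": self.model, "messages": messages, "stream": stream}
        headers = {
            "Authorization": f"Bearer {self.token}",
            "Content-Type": "application/json",
        }
        return f"{self.base_url}/v1/chat/completions", headers, json.dumps(body)

    def chat(self, messages):
        url, headers, data = self.build_request(messages)
        req = urllib.request.Request(url, data=data.encode(), headers=headers)
        # Network call; requires a running backend.
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())
```

The real client may differ in naming and error handling; this only illustrates the request shape.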

Deployment Options

This section is intentionally brief. Full deployment matrices, one-click buttons, hosting comparisons, and hardening tips live in docs/DEPLOYMENT.md.

Common quick paths:

| Scenario | Command / Action |
| --- | --- |
| Local frontend only | `npm run dev` |
| Local full stack (mock) | `(cd backend && USE_MOCK_GENERATION=1 uvicorn main:app --port 8000) & npm run dev` |
| Docker backend | `docker compose up --build` |
| GitHub Pages deploy | Auto on push to main |
| PR preview | Auto Surge URL comment |

See docs/DEPLOYMENT.md for buttons & providers.

Badges: Deploy Frontend · Run Docker · Open Metrics


Preview Deployments

  • Each PR triggers a build + Surge deployment.
  • A unique URL is commented automatically on the PR.
  • Main branch builds to GitHub Pages (frontend) + can trigger backend container build (optional pipeline extension).

Environment Variables

Frontend (.env.local):

```
GEMINI_API_KEY=your_key
VITE_API_BASE=http://localhost:8000
```

Backend (backend/.env): (see backend/.env.sample for full list)

```
MODEL_ID=bigcode/starcoder2-3b
STARCODER2_API_TOKEN=changeme
USE_MOCK_GENERATION=1
RATE_LIMIT=100/minute
MAX_NEW_TOKENS_LIMIT=512
LOG_LEVEL=INFO
```

Testing

Run UI tests:

```bash
npm test
```

Backend tests (mock mode):

```bash
pytest
```

Architecture

```mermaid
flowchart LR
   A[React/Vite Frontend] -->|Fetch / Chat| B((FastAPI Backend))
   B -->|Generation| C[Starcoder2 Model]
   B -->|Mock Mode| C2[(In-Memory Mock)]
   B -->|/metrics| D[(Prometheus)]
   D --> G[Grafana]
   B -->|Structured Logs| L[(Log Aggregator)]
```

  • Rate limiting: slowapi sliding window
  • Streaming: SSE for both chat + generate
  • Metrics: latency, request counts, token counter
  • Multi-platform packaging: Electron + Capacitor

Security

  • Secrets only via environment variables
  • Bearer token auth on backend when token set
  • Rate limiting enabled by default
  • No secrets shipped in PR preview builds
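Rate limits are configured as `count/period` strings (e.g. `100/minute`, the format shown in the configuration reference). A small hypothetical parser illustrating how such an expression maps to a request budget:

```python
# Seconds per period keyword in "count/period" rate expressions.
_PERIODS = {"second": 1, "minute": 60, "hour": 3600, "day": 86400}


def parse_rate_limit(expr: str):
    """Parse an expression like '100/minute' into (max_requests, window_seconds)."""
    count, period = expr.split("/")
    return int(count), _PERIODS[period.strip().lower()]
```

So the default `100/minute` allows at most 100 requests per 60-second window.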

Release Checklist

| Item | Status | Notes |
| --- | --- | --- |
| README updated | ✅ | Current file |
| License present | ✅ | MIT license in repo |
| Env samples | ✅ | backend/.env.sample |
| CI smoke tests | ✅ | smoke.yml badge passing |
| PR preview pipeline | ✅ | Surge deployment configured |
| Backend health endpoint | ✅ | /healthz present |
| Metrics endpoint | ✅ | /metrics (Prometheus) |
| Rate limiting | ✅ | slowapi configured |
| Streaming verified | ✅ | SSE implemented on both endpoints |
| Mock mode | ✅ | USE_MOCK_GENERATION=1 |
| Image assets | ⚠️ | Replace placeholder screenshots |
| Deployment buttons | ✅ | Added Netlify/Vercel/etc. |
| Backend tests | ✅ | Mock mode tests present |
| Release workflow | ✅ | Tag push triggers Electron & dist build |
| Dependabot | ✅ | .github/dependabot.yml configured |
| CodeQL scan | ✅ | codeql.yml workflow added |
| SBOM generation | ✅ | sbom.yml workflow (CycloneDX) |

Run Locally (Frontend)

Prerequisites: Node.js 20 or higher

  1. Clone the repository:

```bash
git clone https://github.com/knoksen/knoksPix.git
cd knoksPix
```

  2. Install dependencies:

```bash
npm ci
```

  3. Set up environment variables: create a `.env.local` file in the root directory and add your Gemini API key: `GEMINI_API_KEY=your_api_key_here`

  4. Start the development server:

```bash
npm run dev
```

  5. Open http://localhost:5173 in your browser

Local Backend + Frontend Together

  1. Start the backend (mock mode for speed):

```bash
(cd backend && USE_MOCK_GENERATION=1 uvicorn main:app --port 8000)
```

  2. Start the frontend:

```bash
npm run dev
```

  3. Set `VITE_API_BASE=http://localhost:8000` in `.env.local` if calling the backend.

Deployment

See Deployment Options for matrix. Production frontend is emitted to dist/.

License

MIT. See LICENSE.

Contributing

  1. Fork the repository
  2. Create a feature branch: git checkout -b feat/awesome
  3. Commit: git commit -m 'feat: add awesome capability'
  4. Push: git push origin feat/awesome
  5. Open a Pull Request – preview URL will be auto‑attached.

Conventional commit prefixes (feat:, fix:, docs:) encouraged. Small, focused PRs merge faster.