Tech Stack

This page is a complete inventory of every important piece of technology in Dashify. For each one we explain, in plain English, what it does and why it earned a place in the stack. If a tool is used, it is listed here.

Languages

TypeScript is used everywhere: server, client, worker, build scripts, even this docs site. It is JavaScript with a type system bolted on top. Types catch mistakes at write-time instead of at run-time and make refactoring safe across a codebase this large. The project uses strict mode, which means the compiler refuses to compile code where a value could be null or undefined unless you handle that case.
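
A small sketch of what strict mode buys. The `User` shape and `greet` helper below are hypothetical, not Dashify code:

```typescript
// With "strict": true (which includes strictNullChecks), the compiler
// refuses to dereference a possibly-undefined value until it is checked.
interface User {
  id: string;
  displayName?: string; // may be absent
}

const users: User[] = [{ id: "u1", displayName: "Ada" }, { id: "u2" }];

function findUser(id: string): User | undefined {
  return users.find((u) => u.id === id);
}

function greet(id: string): string {
  const user = findUser(id);
  // Without this guard, `user.displayName` fails to compile:
  // "Object is possibly 'undefined'".
  if (!user) return "Unknown user";
  return `Hello, ${user.displayName ?? user.id}`;
}
```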

JavaScript appears in places where TypeScript would be overkill: small helper scripts and configuration files.

The browser app

React 18 is the UI library. It lets us describe screens as a tree of components and re-renders only the parts that change.

Vite is the build tool. It turns the developer's TypeScript and React code into the bundles the browser actually downloads. Vite is fast, both during development (instant hot-reload) and when producing the final production build.

Redux Toolkit + RTK Query is how the client manages state. RTK Query is the layer that talks to the API: it handles caching, request deduplication, optimistic updates, and automatic retries.
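
To make "request deduplication" concrete, here is the idea reduced to a few lines of plain TypeScript. This illustrates the behaviour only; it is not RTK Query's actual implementation:

```typescript
// Concurrent callers asking for the same cache key share one in-flight
// request instead of each firing their own.
const inflight = new Map<string, Promise<unknown>>();

function dedupedFetch<T>(key: string, fetcher: () => Promise<T>): Promise<T> {
  const existing = inflight.get(key);
  if (existing) return existing as Promise<T>;
  // Evict the key once the request settles so a later call refetches.
  const p = fetcher().finally(() => inflight.delete(key));
  inflight.set(key, p);
  return p;
}
```

Ten components mounting at once and asking for the same resource produce one network request; they all await the same promise.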

Reactstrap + Bootstrap 5 provide the base component library: modals, dropdowns, forms, navigation. The look and feel is then layered on top with custom styles.

Socket.IO (the client) opens a persistent connection to the API for real-time features (chat, notifications, board updates). It uses WebSockets and gracefully falls back to long polling if the network gets in the way.

Workbox powers the service worker. It precaches the application shell, runtime-caches images and fonts, and lets the app load instantly on repeat visits.

CKEditor 5 is the rich-text editor used in the knowledge base and announcement pages.

DOMPurify sanitises any HTML the application is about to render. Where rich-text input is involved, DOMPurify is the last line of defence against cross site scripting.

Recharts + ApexCharts render the charts on dashboards and reports.

Zustand is a tiny state library used inside the chat module for ephemeral local state that does not belong in Redux.

The API server

Node.js 20 is the runtime.

Express 4 is the HTTP framework: it routes incoming requests to the right handler and exposes the middleware stack.

Mongoose is the ODM (object-document mapper) that talks to MongoDB. It defines schemas, validates documents on save, and exposes a familiar API for queries.

Redis (via the ioredis client) is used for session storage, BullMQ queue plumbing, the Socket.IO Redis adapter (for fanning out events across multiple API instances), and rate limit counters.

BullMQ is the queue. The API enqueues a job with a name and a payload; the worker process consumes it. Failed jobs retry with exponential backoff. A bull-board UI exposes queue health to super-admins.
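
The retry delays grow exponentially. Assuming BullMQ's built-in exponential strategy with a one-second base (the base value is an assumption, not a documented Dashify setting), the schedule works out like this:

```typescript
// delay = baseMs * 2^(attempt - 1): 1s, 2s, 4s, 8s, ...
function retryDelayMs(attempt: number, baseMs = 1000): number {
  return baseMs * 2 ** (attempt - 1);
}
```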

Pino is the structured logger. Every log line is JSON with a request id, the user id, the tenant id, and the route. Easy to grep, easy to ship to a log aggregator.

Helmet sets a long list of security-related HTTP response headers (HSTS, X-Frame-Options, X-Content-Type-Options, and more, see the Security section).

csrf-csrf implements the double submit CSRF protection pattern (covered in detail in the Security section).
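
Stripped to its core, double submit means the token travels to the server twice and the server only compares the two copies. The real csrf-csrf library also signs the cookie; this sketch omits that:

```typescript
// The cookie copy and the header copy must both be present and identical.
// An attacker on another origin can force the header or the cookie, but
// not both with the same value.
function csrfCheck(
  cookieToken: string | undefined,
  headerToken: string | undefined
): boolean {
  return !!cookieToken && !!headerToken && cookieToken === headerToken;
}
```

A production check would also use a constant-time comparison to avoid leaking token bytes through timing.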

rate-limiter-flexible (backed by Redis) applies per-IP and per-account rate limits to login, password-reset, and other sensitive routes.
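
rate-limiter-flexible supports several algorithms; the simplest mental model is a fixed-window counter. The in-memory class below is illustrative only — in production the counters live in Redis so every API instance sees the same counts:

```typescript
// Allow at most `limit` hits per key per window; reset when a new
// window starts.
class FixedWindowLimiter {
  private counts = new Map<string, { windowStart: number; count: number }>();

  constructor(private limit: number, private windowMs: number) {}

  allow(key: string, now: number): boolean {
    const entry = this.counts.get(key);
    if (!entry || now - entry.windowStart >= this.windowMs) {
      this.counts.set(key, { windowStart: now, count: 1 });
      return true;
    }
    entry.count++;
    return entry.count <= this.limit;
  }
}
```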

@node-rs/argon2 hashes user passwords. Argon2 is the current best-in-class password-hashing algorithm; the reasons are spelled out in the Authentication section.

otplib generates and verifies the six-digit codes used in two-factor authentication.

openid-client implements OIDC for single sign-on. Each tenant can configure its own identity provider (Okta, Azure AD, Google Workspace, etc.) and let users sign in with their corporate credentials.

Multer handles multipart form uploads (file attachments).

Cloudinary is the file CDN. Files are uploaded directly to Cloudinary and the API only stores a reference.

Swagger / OpenAPI documents every API endpoint. The docs are visible in development at /api-docs.

Zod validates incoming request payloads at the boundary. Anything that fails validation is rejected before it ever touches business logic.
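
The real code uses Zod schemas (with `z.infer` providing the static type for free); the hand-rolled equivalent below shows what "reject at the boundary" means in practice. The `CreateProjectInput` shape is hypothetical:

```typescript
// Everything arriving over HTTP is `unknown` until proven otherwise.
interface CreateProjectInput {
  name: string;
  tenantId: string;
}

function parseCreateProject(body: unknown): CreateProjectInput {
  if (typeof body !== "object" || body === null) {
    throw new Error("invalid body");
  }
  const { name, tenantId } = body as Record<string, unknown>;
  if (typeof name !== "string" || name.length === 0) {
    throw new Error("name is required");
  }
  if (typeof tenantId !== "string") {
    throw new Error("tenantId is required");
  }
  // Past this point the handler works with a fully typed value.
  return { name, tenantId };
}
```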

The worker

The worker shares the same codebase as the API but runs worker.ts instead of app.ts. It registers BullMQ consumers for every queue and shuts down gracefully when the platform restarts (drains in flight jobs, refuses new ones, then exits).
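
The drain sequence can be abstracted as a small state machine. This is an illustrative sketch, not the worker's actual code; in practice BullMQ's `Worker.close()` does the waiting:

```typescript
// Graceful shutdown: stop accepting, let in-flight jobs finish, then exit.
class DrainableWorker {
  private accepting = true;
  private inflight = 0;

  tryAccept(): boolean {
    if (!this.accepting) return false; // refuse new jobs during shutdown
    this.inflight++;
    return true;
  }

  finish(): void {
    this.inflight--;
  }

  beginShutdown(): void {
    this.accepting = false;
  }

  get drained(): boolean {
    return !this.accepting && this.inflight === 0;
  }
}
```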

The worker also runs a nightly indexer that walks every tenant's knowledge base, files, and announcements, embeds the content using nomic-embed-text via Ollama, and stores the vectors in Qdrant for the AI assistant to retrieve later.

Storage

MongoDB 7 is the primary database. Documents are organised by collection (users, organisations, projects, work items, etc.) and almost every document carries a tenantId (the foreign key to its owning organisation).

Redis 7 is the cache, queue backend, pub/sub broker, and rate limit store all in one.

Qdrant is a vector database: it stores embeddings (long lists of floating-point numbers that represent the semantic meaning of text) and answers similarity queries in milliseconds. The AI assistant uses it to find which passages in your tenant's data are most relevant to a question before passing them to the language model.
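
Under the hood, "similarity" usually means cosine similarity between vectors. Qdrant answers these queries at scale with an approximate index; the naive version below exists only to build intuition:

```typescript
// Cosine similarity: 1 means same direction (same meaning),
// 0 means unrelated, -1 means opposite.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```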

AI

Ollama is the engine that runs language models locally. It serves a simple HTTP API and Dashify talks to it like any other service.

Qwen 2.5 (1.5B) is the chat model. Small enough to run on a laptop, capable enough to give useful answers when grounded in retrieved context.

nomic-embed-text is the embedding model. It turns a passage of text into a vector of 768 floating-point numbers that captures its meaning.

@anthropic-ai/sdk is also installed: historically, Dashify supported a "bring your own Anthropic API key" mode for tenants that prefer a hosted model. The on-prem path is the default; the hosted path remains for tenants that ask for it.

Testing

Vitest is the unit / integration test runner. Fast, Vite-native, friendly with TypeScript.

React Testing Library drives the component tests on the client.

Playwright runs end-to-end tests against a real browser and a real API.

Testcontainers spins up real MongoDB and Redis instances inside Docker for integration tests so we are never testing against mocks (and never get burned by mock-vs-prod drift).

k6 runs load tests. Useful for finding the point where rate limits or database queries become the bottleneck.

MSW (Mock Service Worker) intercepts API calls in some unit tests when a real server is overkill.

Observability

Pino for structured logs (already mentioned).

Prometheus scrapes metrics from the API (request counts, request durations, queue lengths, etc.).

Grafana visualises Prometheus metrics on dashboards.

Jaeger + OpenTelemetry capture distributed traces: the path of a single request through every service it touches.

Sentry captures uncaught exceptions and unhandled promise rejections in production. Local dev does not initialise Sentry.

DevOps

Docker + Docker Compose package every service into containers and run them together with one command.

GitHub Actions runs CI on every push: linting, type-checking, tests, security scans (Trivy), and the docs site deploy.

pnpm + pnpm workspaces manage dependencies across the monorepo.

Husky + lint-staged + commitlint enforce formatting, linting, and conventional commit messages on every git commit.

How they all fit

Key takeaways

  • TypeScript everywhere; React + Vite on the client; Node + Express on the server.
  • MongoDB for documents, Redis for fast questions, Qdrant for vectors, Ollama for the local LLM.
  • Pino, Prometheus, Grafana, Jaeger, and Sentry give the platform a glass front so failures cannot hide.
  • Vitest, Playwright, Testcontainers, and k6 cover unit, integration, end-to-end, and load testing.