My submission for FBLA 2026 Website Coding & Development.

Westuffind - FBLA 2026

Overview

This is a lost-and-found application built with SvelteKit for the 2026 FBLA Website Coding & Development event. Users can browse items, post found items, and manage their listings. The application is designed for fast performance and a seamless user experience.

Features

  • User authentication (login/signup/logout)
    • Token-based, email-only authentication for non-admin users
  • Browse/search items
  • Post found items
  • Inquire about items
  • Claim items
  • Email notifications
  • Themes

Installation

To set up the project locally, follow these steps:

Prerequisites

  • Node.js and npm (for the manual setup)
  • Docker and Docker Compose (for the containerized setup)

Clone the repository

git clone https://git.marinodev.com/MarinoDev/FBLA25
cd FBLA25

Create a .env file in the root directory and configure the environment variables; .env.example is provided as a template. Download a LLaMA-compatible LLM (and its matching mmproj file) into llm-models. Qwen3-VL-2B-Instruct is recommended.
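For illustration, a filled-in .env might look like the following sketch. The variable names below are hypothetical placeholders, not taken from this repository — the authoritative list of variables is in .env.example:

```shell
# Hypothetical example only — consult .env.example for the actual variable names.
DATABASE_URL=postgres://user:password@localhost:5432/westuffind
SMTP_HOST=smtp.example.com
LLM_MODEL_PATH=./llm-models/model.gguf
```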

Docker

A Dockerfile and docker-compose.yml file are provided for running the application in a Docker container.
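Assuming a standard Compose setup (the exact service layout in this repository's docker-compose.yml has not been verified here), the whole stack can typically be built and started in the background with:

```shell
# Build images as needed and start all services detached.
docker compose up -d --build
```

Stop it again with `docker compose down`.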

Manual

Using Docker is strongly recommended, as it bundles the database and the AI. To run the application manually instead:

Install dependencies

npm install

Start the development server

npm run dev

Go to http://localhost:5173/ (or the port shown in the terminal).

Deployment

To deploy the application, build it and start the production server:

npm run build
node build

Resources Used

Technologies

Libraries