Show HN: LUML – an open source (Apache 2.0) MLOps/LLMOps platform
Hi HN,

We built LUML (https://github.com/luml-ai/luml), an open-source (Apache 2.0) MLOps/LLMOps platform that covers experiment tracking, a model registry, LLM tracing, deployments, and more.

It separates the control plane from your data and compute. Artifacts are self-contained: each model artifact includes all of its metadata (experiment snapshots, dependencies, etc.) and stays in your own storage (S3-compatible or Azure).
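To make the "self-contained artifact" idea concrete, here's a rough sketch of the pattern: the model is bundled together with a manifest of its metadata, so the archive can sit in any S3-compatible bucket and be understood on its own. The file names and manifest fields below are hypothetical illustrations, not LUML's actual format.

```python
import json
import tarfile
import tempfile
from pathlib import Path

# Hypothetical layout for illustration only -- not LUML's actual artifact format.
workdir = Path(tempfile.mkdtemp())
(workdir / "model.bin").write_bytes(b"\x00fake-weights\x00")

# The manifest travels with the model, so reading the artifact
# never requires a lookup against a central server.
manifest = {
    "experiment": {"run_id": "run-42", "metrics": {"accuracy": 0.93}},
    "dependencies": ["numpy==2.1.0", "torch==2.4.0"],
}
(workdir / "manifest.json").write_text(json.dumps(manifest, indent=2))

# Pack everything into one archive that can live in any S3-compatible bucket.
archive = workdir / "artifact.tar.gz"
with tarfile.open(archive, "w:gz") as tar:
    tar.add(workdir / "model.bin", arcname="model.bin")
    tar.add(workdir / "manifest.json", arcname="manifest.json")

# Recovering the metadata needs only the archive itself.
with tarfile.open(archive, "r:gz") as tar:
    loaded = json.load(tar.extractfile("manifest.json"))
print(loaded["dependencies"])
```

The payoff of this shape is that the control plane only ever needs a pointer to the archive; the weights and metadata never leave your bucket.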

File transfers go directly between your machine and your storage, and execution happens on compute nodes that you host and connect to LUML.

We’d love you to try the platform and share your feedback!

Why is it better than MLflow?
1. MLflow is awkward to self-host for a larger team. LUML provides projects, permissions, and similar team features out of the box; you just connect a storage bucket once.

2. LUML is an end-to-end platform, so it has a much larger scope than MLflow.