We built LUML (https://github.com/luml-ai/luml), an open-source (Apache 2.0) MLOps/LLMOps platform covering experiment tracking, a model registry, LLM tracing, deployments, and more.
It separates the control plane from your data and compute. Artifacts are self-contained: each model artifact includes all of its metadata (experiment snapshot, dependencies, etc.) and stays in your own storage (S3-compatible or Azure).
File transfers go directly between your machine and storage, and execution happens on compute nodes you host and connect to LUML.
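As a rough illustration of the self-contained artifact idea, here's a minimal sketch of what a manifest bundling all metadata with the model might look like. The field names and values are hypothetical, not LUML's actual schema:

```python
import json

def build_manifest(model_name, version, experiment_snapshot, dependencies):
    """Bundle everything needed to identify and reproduce a model
    into one document that travels with the artifact in your storage.
    (Hypothetical structure for illustration only.)"""
    return {
        "model": model_name,
        "version": version,
        "experiment": experiment_snapshot,  # e.g. params and metrics captured at train time
        "dependencies": dependencies,       # pinned package list
    }

manifest = build_manifest(
    "churn-classifier",
    "1.3.0",
    {"params": {"lr": 0.01}, "metrics": {"auc": 0.91}},
    ["scikit-learn==1.4.2"],
)
print(json.dumps(manifest, indent=2))
```

Because the manifest lives next to the model files in your own bucket, the control plane never needs a copy of your data.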
We’d love you to try the platform and share your feedback!
LUML is an end-to-end platform, so its scope is much larger than MLflow's.