Why build this when Replit or VS Code Live Share already exist?
Because my goal was simplicity (and education). I wanted something lightweight for beginners who just want to write and share simple Python scripts (alone or with others), without downloads, paywalls, or extra noise. There’s also no AI/copilot built in, which is something many teachers and learners actually prefer. Also, it’s free lol
Tech stack (frontend):
- React + TailwindCSS
- CodeMirror for linting
- Y.js for real-time syncing and live cursors
- Skulpt to execute Python in the browser (for safety - I initially wanted Docker containers, but that would eat too much memory at scale. Skulpt has a limited library, so unfortunately imports like pygame won't work)

I don’t enjoy frontend or UI design much, so I leaned on AI for some design help, but all the logic/code is mine. Deployed via Vercel.
Tech stack (backend):
- Django (channels, auth, and celery/redis support made it a great fit)
- PostgreSQL via Supabase
- JWT + OAuth authentication
- Redis for channel layers + caching
- Fully Dockerized and deployed on a VPS (8GB RAM, $7/mo deal)
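For context, here's roughly what it looks like to point channels, caching, and Celery at the same Redis instance. This is a generic sketch, not my actual settings; the hostname and db numbers are placeholders:

```python
# settings.py (sketch) -- one Redis instance backing channels, caching, and Celery.
# The "redis" hostname and db numbers are placeholders, not the real config.

CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "channels_redis.core.RedisChannelLayer",
        "CONFIG": {"hosts": [("redis", 6379)]},
    },
}

CACHES = {
    "default": {
        "BACKEND": "django_redis.cache.RedisCache",
        "LOCATION": "redis://redis:6379/1",
    },
}

CELERY_BROKER_URL = "redis://redis:6379/2"
```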
Data models: Users <-> Groups -> Projects -> Code
- Users can join many groups
- Groups can have multiple projects
- Each project belongs to one group and has one code file (kept simple for beginners, though I may add a file system later)
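In Django terms, that maps to something like this. It's a simplified sketch; the model and field names are illustrative, not necessarily what's in the repo:

```python
# models.py (sketch) -- model/field names are illustrative.
from django.contrib.auth.models import User
from django.db import models


class Group(models.Model):
    name = models.CharField(max_length=100)
    # Users <-> Groups: a user can join many groups
    members = models.ManyToManyField(User, related_name="groups_joined")


class Project(models.Model):
    name = models.CharField(max_length=100)
    # Groups -> Projects: each project belongs to exactly one group
    group = models.ForeignKey(Group, on_delete=models.CASCADE, related_name="projects")
    # Projects -> Code: one code "file" per project, kept as a single text field
    code = models.TextField(blank=True, default="")
```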
There were a lot of issues I ran into while building this, especially on the backend. My biggest one was figuring out how to build a reliable, smart autosave system. I couldn't just save on every keystroke, because that would overwhelm the database, especially at scale. So I came up with a solution I'm really proud of: I use Redis to cache active projects, then have Celery loop through those active projects every minute and persist their code to the db. I track a user count for each project every time someone joins or leaves, and when the count drops to 0, I save the code one last time and remove the project from Redis. Redis is extremely fast, so writing the code to it on every keystroke is not a problem at all.

I'm essentially hitting four birds with one stone here: I reuse Redis, which I'd already integrated for my channel layers, to track active projects and to cache the code, so when a new user enters a project it's served from Redis instead of hitting the db. I even get to use Redis as my message broker for Celery (I didn't use RabbitMQ because I wanted to conserve storage instead of dockerizing an entirely new service). This should also work really well at scale, since Celery offloads the autosaving work away from the backend. The code also saves whenever someone leaves a project. A rough sketch of the flow is below.

Another issue I came across later was people trying to send a huge load of text, so I capped it at 1 MB (will tinker with this).
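Here's roughly what that flow looks like. This is a simplified sketch, not the actual PyTogether code: the Redis key names, the Project model import, and the join/edit/leave hooks are all illustrative.

```python
# Rough sketch of the autosave flow (illustrative names, not the real implementation).
import redis
from celery import shared_task

from projects.models import Project  # hypothetical app/model path

r = redis.Redis(host="redis", port=6379, db=0, decode_responses=True)

MAX_CODE_BYTES = 1_000_000  # ~1 MB cap on incoming text


def on_join(project_id: int) -> str:
    """A collaborator joins: bump the user count and serve the code from Redis,
    seeding the cache from Postgres only if it isn't there yet."""
    r.incr(f"project:{project_id}:users")
    code = r.get(f"project:{project_id}:code")
    if code is None:
        code = Project.objects.get(pk=project_id).code
        r.set(f"project:{project_id}:code", code)
    r.sadd("active_projects", project_id)
    return code


def on_edit(project_id: int, code: str) -> None:
    """Every edit lands in Redis only -- no DB write per keystroke."""
    if len(code.encode()) > MAX_CODE_BYTES:
        return  # reject oversized payloads
    r.set(f"project:{project_id}:code", code)


def on_leave(project_id: int) -> None:
    """Save on every leave; when the last user leaves, evict the project from Redis."""
    persist(project_id)
    remaining = r.decr(f"project:{project_id}:users")
    if remaining <= 0:
        r.srem("active_projects", project_id)
        r.delete(f"project:{project_id}:code", f"project:{project_id}:users")


def persist(project_id: int) -> None:
    """Flush the cached code for one project to Postgres."""
    code = r.get(f"project:{project_id}:code")
    if code is not None:
        Project.objects.filter(pk=project_id).update(code=code)


@shared_task
def autosave_active_projects() -> None:
    """Celery beat task (runs every minute): persist every active project."""
    for pid in r.smembers("active_projects"):
        persist(int(pid))
```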
Deployment on a VPS was another beast. I spent ~8 hours wrangling Nginx, Certbot, Docker, and GitHub Actions to get everything up and running. It was frustrating, but I learned a lot.
If you’re curious or wanna see the work yourself, the source code is here: https://github.com/SJRiz/pytogether. Feel free to contribute!
I’m still learning, so any feedback (and contributions) would be amazing!