zkmon · 30 minutes ago:
> There should be some balanced path in the middle somewhere, but I haven’t stumbled across a formal version of it after all these decades.

That's very simple. The balanced path depends directly on how much of the requirements and assumptions are going to change during the lifetime of the thing you are building.

Engineering is helpful only to the extent you can foresee the future changes. Anything beyond that requires evolution.

You are able to comment on the complexity of that large company only because you are standing 50 years in the future from when those things started to take shape. If you were designing it 50 years back, you would end up with the same complexity.

Nature's answer to it is to consolidate and compact. Everything that falls onto earth gets compacted into solid rock over time by the huge pressure of its weight. All complexity and features are flattened out. Companies undergo similar dynamics, driven by pressures over time rather than by big-bang engineering design up front.

“A complex system that works is invariably found to have evolved from a simple system that worked. The inverse proposition also appears to be true: A complex system designed from scratch never works and cannot be made to work. You have to start over, beginning with a working simple system.” (Gall’s Law)
I think the important part here is "from scratch". Typically when you're designing a new (second, third, whatever) system to replace the old one, you actually take the good and the bad parts of the previous design into account, so it's no longer from scratch. That's what allows it to succeed (at least in my experience it usually does).
This is often quoted, but I wonder whether it's actually strictly true, at least if you keep to a reasonable definition of "works". It's certainly not true in mechanical engineering.
People misinterpret this and think they can incrementally build a skyscraper out of a shed.
Ah, the Second System Effect, and the lesson learned from it.
But this is about the first systems? I tend to tell people, the fourth try usually sticks.

The first is too ambitious and ends in an unmaintainable pile around a good core idea.

The second tries to "get everything right" and suffers second system syndrome.

The third gets it right, but only for a bunch of central business needs. You learned, after all. It is good exactly because it does not try to get _everything_ right like the second did.

The fourth patches up some more features to scoop up B and C prios and calls it a day.

Sometimes, often in BigCorp: the creators move on and it slowly deteriorates from lack of maintenance...

YZF · 2 hours ago:
So true.
iafan · 2 hours ago:
> There are two main schools of thought in software development about how to build really big, complicated stuff.

> The most prevalent one, these days, is that you gradually evolve the complexity over time. You start small and keep adding to it.

> The other school is that you lay out a huge specification that would fully work through all of the complexity in advance, then build it.

I think AI will drive an interesting shift in how people build software. We'll see a move toward creating and iterating on specifications rather than implementations themselves.

In a sense, a specification is the most compact definition of your software possible. The knowledge density per "line" is much higher than in any programming language. This makes specifications easier to read, reason about, and iterate on—whether with AI or with peers.

I can imagine open source projects that will revolve entirely around specifications, not implementations. These specs could be discussed, with people contributing thoughts instead of pull requests. The more articulated the idea, the higher its chance of being "merged" into the working specification. For maintainers, reviewing "idea merge requests" and discussing them with AI assistants before updating the spec would be easier than reviewing code.

Specifications could be versioned just like software implementations, with running versions and stable releases. They could include addendums listing platform-specific caveats or library recommendations. With a good spec, developers could build their own tools in any language. One would be able to get a new version of the spec, diff it with the current one, and ask AI to implement the difference, or discuss what is needed for you personally and what is not. Similarly, it would be easier to "patch" the specification with your own requirements than to modify ready-made software.
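
A rough sketch of what that diff-and-implement flow could look like; the spec filename, the version tags, and the ask_ai helper are all made up for illustration:

    import subprocess

    def ask_ai(prompt: str) -> str:
        """Stand-in for a call to whatever model or agent you use."""
        raise NotImplementedError

    # Diff the spec version your code implements against the new release.
    spec_diff = subprocess.run(
        ["git", "diff", "spec-v1.2", "spec-v2.0", "--", "SPEC.md"],
        capture_output=True, text=True, check=True,
    ).stdout

    # Hand only the delta to the model and ask it to update the implementation,
    # skipping whatever you have marked as not needed for you personally.
    ask_ai("My code implements v1.2 of the attached spec. Here is the diff "
           "to v2.0; update the implementation to match:\n" + spec_diff)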

Interesting times.

Iceberg is, primarily, a spec [0]. It defines exactly what data is stored and how it is interacted with. The community debates spec changes broadly first; see a recent one on cross-platform SQL UDFs [1].

We have yet to see a largely LLM-driven language implementation, but it is surely possible. I imagine it would be easier to tell the LLM to instead translate the Java implementation to whatever language you need. A vibe-coded implementation could do major damage to a company's data.

[0] https://iceberg.apache.org/spec/ [1] https://lists.apache.org/thread/whbgoc325o99vm4b599f0g1owhgw...

iafan · 52 minutes ago:
If I had a spec for something non-trivial, I probably would ask AI to create a test suite first. Or port tests from an existing system since each test is typically orders of magnitude easier to rewrite in any language, and then run AI in a loop until the tests pass.
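
A minimal sketch of that loop, assuming an agent-style ask_ai helper that revises the code in place (the helper and the choice of pytest are illustrative):

    import subprocess

    def ask_ai(prompt: str) -> None:
        """Stand-in for an agent-style model call that edits the code."""
        raise NotImplementedError

    # The ported test suite is the ground truth; iterate until it passes.
    for attempt in range(10):
        result = subprocess.run(["pytest", "-q"], capture_output=True, text=True)
        if result.returncode == 0:
            print(f"all tests pass after {attempt} fix-up rounds")
            break
        # Feed the failures back and let the model try again.
        ask_ai("These tests still fail; fix the implementation:\n" + result.stdout)
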
> I can imagine open source projects that will revolve entirely around specifications

This is a really good observation and I predict you will be correct.

There is a consequence of this for SaaS. You can imagine an example SaaS that one might need to vibecode to save money. The reason it's not possible now is not because Claude can't do it, it's because getting the right specs (like you suggested) is hard work. A well-written spec will not only contain the best practices for that domain of software but also all the legal compliance BS that comes along with it.

With a proper specification that is also modular, I imagine we will be able to see more vibecoded SaaS.

Overall I think your prediction is really strong.

You can look at the Web as a starter: https://html.spec.whatwg.org/#history-2

> The WHATWG was based on several core principles, (..) and that specifications need to be detailed enough that implementations can achieve complete interoperability without reverse-engineering each other.

But in my experience you need more than a spec, because an implementation is not just something that implements a spec, it is also the result of making many architectural choices in how the spec is implemented.

Also, even with detailed specs, AI still needs additional guidance. For example, a couple of weeks ago Cursor unleashed thousands of agents with access to web standards and the shared WPT test suite; the result was total nonsense.

So the future might rather be like a Russian doll of specs: start with a high-level system description, and then support it with finer-grained specs of parts of the system. This could go down all the way to the code itself: existing architectural patterns provide a spec for how to code a feature that is just a variation of such a pattern. Then whenever your system needs to do something new, you have to provide the code patterns for it. The AI is then relegated to its strength: applying existing patterns.

TLA+ has a concept of refinement, which is kind of what I described above as Russian dolls, but applied only to TLA+ specs.

Here is a quote that describes the idea:

> There is no fundamental distinction between specifications and implementations. We simply have specifications, some of which implement other specifications. A Java program can be viewed as a specification of a JVM (Java Virtual Machine) program, which can be viewed as a specification of an assembly language program, which can be viewed as a specification of an execution of the computer's machine instructions, which can be viewed as a specification of an execution of its register-transfer level design, and so on.

Source: https://cseweb.ucsd.edu/classes/sp05/cse128/ (chapter 1, last page)
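
To get a feel for refinement outside TLA+, here is a toy sketch (all names invented): a concrete counter that steps by 1 refines an abstract spec that only demands the value never decreases, and checking refinement amounts to checking that every concrete trace is also a legal abstract trace:

    # Abstract spec: a step is legal if the value never decreases.
    def abstract_step_ok(before: int, after: int) -> bool:
        return after >= before

    # Concrete "implementation": a counter that increments by exactly 1.
    def concrete_trace(steps: int) -> list[int]:
        return list(range(steps + 1))

    # Refinement check: every step of the concrete trace is also a legal
    # abstract step, so the counter implements (refines) the abstract spec.
    trace = concrete_trace(5)
    assert all(abstract_step_ok(a, b) for a, b in zip(trace, trace[1:]))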

There are parallels of thought here to template and macro libraries.

One issue is that a spec without a working reference implementation is essentially the same as a pull request that's never been successfully compiled. Generalization is good but you can't get away from actually doing the thing at the end of the day.

I've run into this issue with C++ templates before. Throw a type at a template that it hasn't previously been tested with and it can fall apart in new and exciting ways.

Interested in ideas for this. I've mulled over different compact DSLs for specs, but unstructured (beyond file-specific ownership boundaries) has served me better.
I think it has to be modular and reusable. For example, a GDPR compliance spec should be open-sourced and reused by all SaaS specs.
Show me an example of a large complex software system built from spec rather than evolved.
iafan · 27 minutes ago:
Everything that touches hardware, for example: the Bluetooth stack, HDMI, you name it.

Everything W3C does. Go is evolving through specs first. Probably every other programming language these days.

People already do that for humankind-scale projects where there have to be multiple implementations that can talk to each other. Iteration is inevitable for anything that gains traction, but it can still be iteration on specs first rather than on code.

The Evolution method outlined also seems born from the Continuous Delivery paradigm that subscription business models required. I would argue Engineering is the superior approach, as the Lean/Agile methods of production were born from physical engineering projects whose end result was complete. Evolution seems even more chaotic because an improper 'devops' paradigm was imposed rather than one that emerged organically, as you would expect with an evolutionary method.

AI assistance would seem to favor the engineering approach, as the friction of teams and personalities is reduced in favor of quick feasibility testing and complete planning.

A major factor supporting evolution over big up-front design is the drift in system requirements over time. Even on large military-style projects there is apparently "discovery", and the more years that pass, the more the requirements change.
zppln · 47 minutes ago:
This isn't my experience. Requirements tend to settle over time (unless they're stupidly written). Users tend to like things to stay the same, with perhaps some improvement to performance here and there.

But if anything, all development is the search for the requirements. Some just value writing them down.

YZF · 2 hours ago:
Even if the requirements are indeed fixed your understanding of the problem domain evolves.
Software cannot be built like skyscrapers because the sponsors know about the malleability of the medium and treat it like a lump of clay that, with a bit of added water, can be reshaped into something else.
ako · 2 hours ago:
You're mixing up design and manufacturing. A skyscraper is first completely designed (on paper, CAD systems, prototypes) before it is manufactured. In software engineering, coding is often more a design phase than a manufacturing phase.

Designers need malleability; that is why they all want digital design systems.

baxtr · 1 hour ago:
Funny you bring up the clay analogy.

It was discussed intensively here just 2 days ago.

https://news.ycombinator.com/item?id=46881543

YZF · 2 hours ago:
But software is in fact not very malleable at all. It's true the medium supports change (it's just a bunch of bits), but change is actually hard and expensive, perhaps more so than in other mediums.
I'd argue it's more malleable than a skyscraper.

How rapidly has business software changed since COVID? Yet how many skyscrapers remain partially unoccupied in big cities like London, because of the recent arrival of widespread hybrid working?

The buildings are structurally unchanged and haven't been demolished to make way for buildings that better support hybrid working. Sure, office fit-outs are more oriented towards smaller simultaneous attendance, with more hot desking. Also, a new industry boom around team-building socials has arrived. Virtual skeet shooting or golf, for example.

On the whole, engineered cities are unchanged, their ancient and rigid specifications lacking the foresight to include requirements that accommodate hybrid working. Software, meanwhile, has adapted and, as the OP says, evolved.

ako · 2 hours ago:
With LLMs it's becoming very malleable.