I get it, but in general I don't get the OO hate.
It's all about the problem domain imo. I can't imagine building something like a graphics framework without some subtyping.
Unfortunately, people often use crap examples for OO. The worst is probably employee, where employee and contractor are subtypes of worker, or some other chicanery like that.
Of course in the real world a person can be both employee and contractor at the same time, can flit between those roles and many others, can temporarily park a role (e.g sabbatical) and many other permutations, all while maintaining history and even allowing for corrections of said history.
It would be hard to find any domain less suited to OO than HR records. I think these terrible examples are a primary reason why some people believe that OO is useless or worse than useless.
Most code bases don't need dynamically loaded objects designed with interfaces that can be swapped out. In fact, that functionality is nearly never useful. But that's how most people wrote Java code.
It was terrible and taught me to avoid applying for jobs that used Java.
I like OOP and often use it. But mostly just as an encapsulation of functionality, and I never use interfaces or the like.
To the point that there are people who will assert that the GoF book, published before Java was invented, actually contains Java in it.
It was so rare that the GoF thought they needed to write a book to teach people how to use those patterns when they eventually find them.
But after the book was published, those patterns became "advanced programming that is worth testing for in job interviews", and people started to code for their CVs. The same happened briefly with refactoring, and for much longer with unit tests and the other XP activities (like TDD).
At the same time, Java's popularity was exploding on enterprise software.
Again, Smalltalk did it first, and is actually one of the two languages in the famous GoF book used to create all the OOP patterns people complain about, the other being C++.
I didn't claim it does. To make the point though: bare functions are a much simpler building block, and a much cleaner building block than classes. Classes by their nature put state and behavior in one place. If one doesn't need that, then a class is actually not the right concept to go for (assuming one has the choice, which one doesn't in Java). A few constants and a bunch of functions would be a simpler and fully sufficient concept in that case. And how does one group those? Well, a module.
In Java you are basically forced to create unnecessary classes, which only have static functions as members, to achieve similar simplicity, but then you've still got that ugly class thing thrown in unnecessarily.
In a few other languages maybe things are based on different things than functions. Like words in Forth or something. But even they can be interpreted to be functions, with a few implicit arguments. And you can just write them down. No need to put them into some class or some blabliblub concept.
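To make the point concrete, here is a hedged sketch of that Java workaround: a hypothetical utility "module" (all names invented for illustration) that is a class in syntax only.

```java
// Hypothetical example: Java has no module-level functions, so a "module"
// of constants and functions becomes a final class with a private
// constructor and only static members.
final class MathUtils {
    static final double GOLDEN_RATIO = 1.6180339887;

    private MathUtils() {} // never instantiated; the class is only a namespace

    static int clamp(int value, int lo, int hi) {
        return Math.max(lo, Math.min(hi, value));
    }
}
```

The class adds nothing here; in a language with real modules, the constant and the function would simply sit at the top level.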
As mentioned in another reply, Java did not invent this, it was building upon Smalltalk and SELF, with a little bit of Objective-C on the side, and C++ like syntax.
Try to create a single function in Smalltalk, or SELF.
http://stephane.ducasse.free.fr/FreeBooks.html
It is also no accident that when Java came onto the scene, some big Smalltalk names like IBM migrated their Smalltalk tooling to Java practically overnight, and to this day Eclipse still has the same object browser as any Smalltalk environment.
Smalltalk,
https://www.researchgate.net/figure/The-Smalltalk-browser-sh...
Which you will find a certain similarity including with NeXTSTEP navigation tools, and eventually OS X Finder,
The code browser in Eclipse
https://i.sstatic.net/4OFEM.png
By the way, in OOP languages like Python, even functions are objects,
    Python 3.14.0 (tags/v3.14.0:ebf955d, Oct  7 2025, 10:15:03) [MSC v.1944 64 bit (AMD64)] on win32
    Type "help", "copyright", "credits" or "license" for more information.
    >>> def sum(x, y): return x + y
    ...
    >>> sum
    <function sum at 0x0000017A9778D4E0>
    >>> dir(sum)
    ['__annotate__', '__annotations__', '__builtins__', '__call__', '__class__', '__closure__', '__code__', '__defaults__', '__delattr__', '__dict__', '__dir__', '__doc__', '__eq__', '__format__', '__ge__', '__get__', '__getattribute__', '__getstate__', '__globals__', '__gt__', '__hash__', '__init__', '__init_subclass__', '__kwdefaults__', '__le__', '__lt__', '__module__', '__name__', '__ne__', '__new__', '__qualname__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__str__', '__subclasshook__', '__type_params__']
    >>> type(sum)
    <class 'function'>
    >>> sum.__name__
    'sum'
    >>> sum.__class__
    <class 'function'>

https://en.wikipedia.org/wiki/Modular_programming
> Languages that formally support the module concept include Ada, ALGOL, BlitzMax, C++, C#, Clojure, COBOL, Common Lisp, D, Dart, eC, Erlang, Elixir, Elm, F, F#, Fortran, Go, Haskell, IBM/360 Assembler, IBM System/38 and AS/400 Control Language (CL), IBM RPG, Java, Julia, MATLAB, ML, Modula, Modula-2, Modula-3, Morpho, NEWP, Oberon, Oberon-2, Objective-C, OCaml, several Pascal derivatives (Component Pascal, Object Pascal, Turbo Pascal, UCSD Pascal), Perl, PHP, PL/I, PureBasic, Python, R, Ruby,[4] Rust, JavaScript,[5] Visual Basic (.NET) and WebDNA.
If the whole complaint is that you cannot have a bare bones function outside of a class, Java is not alone.
Predating Java by several decades, Smalltalk, StrongTalk, SELF, Eiffel, Sather, BETA.
And naturally lets not forget C#, that came after Java.
    > that was actively encouraged by the design of the language.
    > I never use interfaces or the like.
Interfaces for everything, abstract classes “just in case,” dependency injection frameworks that exist mainly to manage all the interfaces. Java (and often Enterprise C#) is all scaffolding built to appease the compiler and the ideology of “extensibility” before there’s any actual complexity to extend.
You can write clean, functional, concise Java today, especially with records, pattern matching, and lambdas, but the culture around the language was forged in a time when verbosity was king.
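As a rough sketch of what that concise style looks like (illustrative names only; needs Java 21+ for records, sealed types, and switch pattern matching):

```java
// Records + sealed interfaces + pattern-matching switch: no getters/setters
// boilerplate, and the switch is exhaustiveness-checked by the compiler.
sealed interface Shape permits Circle, Rect {}
record Circle(double radius) implements Shape {}
record Rect(double w, double h) implements Shape {}

class Areas {
    static double area(Shape s) {
        return switch (s) {
            case Circle c -> Math.PI * c.radius() * c.radius();
            case Rect r -> r.w() * r.h();
        };
    }
}
```

No default branch is needed: because `Shape` is sealed, the compiler knows the switch covers every case.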
Then my templated impl header can be very heavy without killing my build times since only the interface base class is #included.
Not sure if this is as common in Java.
C++ is a hell of a language.
The challenge with modern C++ projects is that every individual TU can take forever to build because it involves parsing massive header files. Oftentimes you can make this faster with "unity builds" that combine multiple C++ files into a single TU since the individual .cpp file's build time is negligible compared to your chonky headers.
The reason the header files are so massive is because using a templated entity (function or class) requires seeing the ENTIRE DEFINITION at the point of use, because otherwise the compiler doesn't know if the substitution will be successful. You can't forward declare a templated entity like you would with normal code.[2]
If you want to avoid including these definitions, you create an abstract interface and inherit from that in your templated implementations, then pass the abstract interface around.
[1] or linking with mold
[2] There used to be a feature that allowed forward declaring templated entities called "export". A single compiler tried to implement it and it was such a failure it was removed from the language. https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2003/n14...
   t = new T(); // T is a template parameter class
    > C++ uses reified generics
    > does c++ have reified generics?
    > C++ templates do not provide reified generics in the same sense as languages like C# or Java (to a limited extent). Reified generics mean that the type information of generic parameters is available and accessible at runtime.

In any event, you have to use weird (I think “unsafe”) reflection tricks to get the type info back at runtime in Java. To the point where it makes you think it’s not supported by the language design but rather a clever accident that someone figured out how to abuse.
I think people are still too ready to use massive, hulking frameworks for every little thing, of course, but the worst of the 'enterprise' stuff seems to have been banished.
Always makes me think of that AbstractProxyFactorySomething or similar that I saw in Keycloak, for when you want to implement your own password quality criteria. When you step back a bit and think about what you actually want to have, you realize that all you want is a function that takes a string as input and returns a boolean, depending on whether the password is strong enough or fulfills all criteria. Maybe you want to output a list of unmet criteria, if you want to make it complex. But no, it's AbstractProxyFactorySomething.
Here is a tiny interface that will do what you need:
    @FunctionalInterface
    public interface IPasswordChecker
    {
        boolean isValid(String password);
    }
Example:
    final IPasswordChecker passwordChecker = (String password) -> password.length() >= 16;

Has anyone ever actually done this?
But if it was as convoluted to use as it is in Java, I wouldn't. And also, it's not enterprise CRUD. Enterprise CRUD resists complex architectures like nothing else.
Perhaps I'm not following, but dynamically loaded objects are the core feature of shared libraries. Among its purposes, it allows code to be reused and even updated without having to recompile the project. That's pretty useful.
Interfaces are also very important. They allow your components to be testable and mockable. You cannot have quality software without these basic testing techniques. Also, interfaces are extremely important to allow your components to be easily replaced even at runtime.
Perhaps you haven't had the opportunity to experience the advantages of using these techniques, or were you mindful of when you benefited from them. We tend to remember the bad parts and assume the good parts are a given. But personal tastes don't refute the value and usefulness of features you never learned to appreciate.
> Perhaps I'm not following, but dynamically loaded objects are the core feature of shared libraries. Among it's purposes, it allows code to be reused and even updated without having to recompile the project. That's pretty useful.
> Interfaces are also very important. They allow your components to be testable and mockable. You cannot have quality software without these basic testing techniques. Also, interfaces are extremely important to allow your components to be easily replaced even at runtime.
I don't think GP was saying that Dynamically loaded objects are not needed, or that Interfaces are not needed.
I read it more as "Dynamically loaded interfaces that can be swapped out are not needed".
Of course you can, wtf?
Mocks are often the reason tests are green while the app doesn't work :)
Explain then what is your alternative to unit and integration tests.
> Mocks are often the reason tests are green while the app doesn't work :)
I don't think that's a valid assumption. Tests just verify the system under test, and test doubles are there only to provide inputs in a way that isolates your system under test. If your tests either leave out invariants that are behind bugs and regressions or have invalid/insufficient inputs, the problem lies in how you created tests, not in the concept of a mock.
A workman and his tools.
For an example of such intertwined architecture, see Mutter, the window manager of GNOME Shell (the program that manages windows on the Linux desktop). The code that handles key presses (accessibility features, shortcuts) needs objects like MetaDisplay or MetaSeat and cannot be tested in isolation; you figuratively need half of Wayland for it to work.
The good tests use the black-box principle; i.e. they only use public APIs and do not rely on knowledge of the inner workings of a component. When the component changes, the tests do not break. Tests with mocks rely on knowing how the component works and which functions it calls; tests with mocks become brittle, break often, and require a lot of effort to update when the code changes.
Avoid mocks as much as you can.
I am fine with having fake implementations and so forth, but the whole "when function X is called with Y arguments, return Z" thing is bad. It leads to very tight coupling of the test code with the implementation, and often means the tests are only testing against the engineer's understanding of what's happening - which is the same thing they coded against in the first place. I've seen GP's example of tests being green but the code not working correctly a number of times because of that.
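For contrast, here is a hand-written fake in Java (all names invented for illustration): it records what happened, so the test asserts on observable behavior instead of stubbing "when function X is called with Y, return Z".

```java
import java.util.ArrayList;
import java.util.List;

// The dependency, expressed as a small interface.
interface Mailer {
    void send(String to, String body);
}

// A hand-written fake: a real implementation that records calls,
// rather than a when/then stub tied to the caller's internals.
class FakeMailer implements Mailer {
    final List<String> sent = new ArrayList<>();
    public void send(String to, String body) { sent.add(to + ": " + body); }
}

// The system under test depends only on the interface.
class Greeter {
    private final Mailer mailer;
    Greeter(Mailer mailer) { this.mailer = mailer; }
    void greet(String user) { mailer.send(user, "hello " + user); }
}
```

The fake survives refactorings of `Greeter` as long as the observable outcome (a mail was sent) stays the same, which is exactly where when/then mocks tend to break.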
Tests against real components instead of mocks.
>If your tests either leave out invariants that are behind bugs and regressions or have invalid/insufficient inputs, the problem lies in how you created tests, not in the concept of a mock.
Nowadays external components can be very complex systems, e.g. DBs, messaging queues, third-party APIs and so on.
A lot of things can go wrong that you aren't even aware of, so you can't get the mocks right.
Examples? fuckin emojis.
On a mocked in-memory database they work fine, but fail on a real DB due to encoding settings.
Wdym?
You're testing, say, a simple CRUD operation, e.g. creating an HN thread.
With a mocked DB it passes; with a real DB it fails due to an encoding issue.
The result is that tests are green, but the app does not work.
This is pretty important since "unit tests" would be far too constraining for reasonable modifications to the compiler, e.g. adding a new pass could change the actual output code without modifying the semantics.
I mean, they run a single pass with some small LLVM IR input and check that the output IR is fine.
You really haven't argued anything, so ending on a "you must be personally blind jab" just looks dumb.
Java I think gets attacked this way because a lot of developers, especially in the early 2000s, were entering the industry only familiar with scripting languages they'd used for personal hobby projects, and then Java was the first time they encountered languages and projects that involved hundreds of developers. Scripting codebases didn't define interfaces or types for anything even though that limits your project scalability, unit testing was often kinda just missing or very superficial, and there was an ambient assumption that all dependencies are open source and last forever whilst the apps themselves are throwaway.
The Java ecosystem quickly evolved into the enterprise server space and came to make very different assumptions, like:
• Projects last a long time, may churn through thousands of developers over their lifetimes and are used in big mission critical use cases.
• Therefore it's better to impose some rules up front and benefit from the discipline later.
• Dependencies are rare things that create supplier risks, you purchase them at least some of the time, they exist in a competitive market, and they can be transient, e.g. your MQ vendor may go under or be outcompeted by a better one. In turn that means standardized interfaces are useful.
So the Java community focused on standardizing interfaces to big chunky dependencies like relational databases, message queuing engines, app servers and ORMs, whereas the scripting language communities just said YOLO and anyway why would you ever want more than MySQL?
Very different sets of assumptions lead to different styles of coding. And yes it means Java can seem more abstract. You don't send queries to a PostgreSQL or MySQL object, you send it to an abstract Connection which represents standardized functionality, then if you want to use DB specific features you can unwrap it to a vendor specific interface. It makes things easier to port.
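The unwrap idea can be sketched with made-up stand-in types (these are deliberately NOT the real java.sql API; the real one uses `Connection.unwrap(Class)`):

```java
// "Program to the abstract interface, unwrap for vendor features."
// All type names here are illustrative stand-ins, not real JDBC types.
interface Conn {
    String query(String sql);
}

interface PgConn extends Conn {
    String copyInto(String table); // a hypothetical vendor-only bulk-load feature
}

class PgConnImpl implements PgConn {
    public String query(String sql) { return "rows for: " + sql; }
    public String copyInto(String table) { return "COPY " + table; }
}

class App {
    // Portable code only sees Conn; using a vendor feature is an
    // explicit, visible downcast with a portable fallback.
    static String bulkLoad(Conn c, String table) {
        if (c instanceof PgConn pg) {
            return pg.copyInto(table);
        }
        return "fallback: INSERT loop into " + table;
    }
}
```

Swapping `PgConnImpl` for another vendor's implementation leaves `bulkLoad` compiling and working, which is the portability the standardized-interface culture was after.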
I suspect many OOP haters have experienced what I'm currently experiencing: stateful objects for handling calculations that should be stateless, a confusing bag of methods that are sometimes hidden behind getters so you can't even easily tell where the computation is happening, etc.
Sorry to learn, hope you don't get scar tissue from it.
Most programs in my experience are about manipulating records: retrieve something from a database, manipulate it a bit (change values), update it back.
Here OOP does a good job: you create the data structures that you need to manipulate, but expose the exact interface to effect the changes in a way that respects the domain rules.
I do get that this isn't every domain out there and _no size fits all_, but I don't get the OP complaints.
I currently think that most of the anger about OOP is either related to bad practices (overuse) or to lack of knowledge from newcomers. OOP is a tool like any other and can be used wrong.
And then there's a reason they're teaching the "functional core, imperative shell" pattern.
It’s certainly possible to write good code in Java but it does still lend itself to abuse by the kind of person that treated Design Patterns as a Bible.
I have a vague idea of what the Bible says, but I have my favorite parts that I sometimes get loud about. Specifically, please think really hard before making a Singleton, and then don't do it.
Exactly. This is the way to think about it, imo. One of those places is GUI frameworks, I think, and there I am fine doing OOP, because I don't have a better idea how to get things done, and most GUI frameworks/toolkits/whatever are designed in an OOP way anyway. Other places I just try to go functional.
OOP is a collection of ideas about how to write code. We should use those ideas when they are useful and ignore them when they are not.
But many people don't want to put in the critical thinking required to do that, so instead they hide behind the shield of "SOLID principles" and "best practice" to justify their bad code (not knocking the SOLID principles, it's just that people use them to justify making things object oriented when they shouldn't be).
As with everything, there isn't a golden rule to follow. Sometimes OO makes sense, sometimes it doesn't. I rarely use it, or abstractions in general, but there are some things where it's just the right fit.
This, this, this. So much this.
Back when I was in uni, Sun had donated basically an entire lab of those computer terminals that you used to sign in to with a smart card (I forgot the name). In exchange, the uni agreed to teach all programming-related classes in Java, and to have the professors certify in Java (never mind the fact that nobody ever used that laboratory, because the lab techs had no idea how to work with those terminals).
As a result of this, every class, from algorithms to software architecture, felt like a Java cult indoctrination. One of the professors actually said C was dead because Java was clearly superior.
At our uni (around 1998/99) all professors said that, except the Haskell teacher, who indeed called Java a mistake (but C also).
Tale as old as time.
To be honest, I'm convinced the reason so many people dislike Java is because they have had to use it in a professional context only. It's not really a hobbyist language.
> Sounds like a problem with poor code rather than something unique to OOP.
And yeah, OO may lean a bit towards more indirection, but it definitely doesn't force you to write code like that. If you go through too many levels, that's entirely on the developer.
  > I can't imagine building something like a graphics framework without some subtyping.
They use higher-order types to implement subtyping as a library, with combinators. For example, you can take your fudget that does not (fully) implement some functionality, wrap it into another one that does (or knows how to), and have a combined fudget that fully implements what you need. Much like parser combinators.
It's the misuse of OO constructs that gives it a bad name, almost always that is inheritance being overused/misused. Encapsulation and modularity are important for larger code bases, and polymorphism is useful for making code simpler, smaller and more understandable.
Maybe the extra-long names in Java don't help either, along with the overuse/forced use of patterns? At least it's not Hungarian notation.
A sample: pandas loc, iloc etc. Or Haskell's scanl1. Or Scheme's cdr and car. (I know, most of the latter examples are common functions that you'll learn after a while, but still, reading them at first is terrible.)
My first contact with a modern OO language was C# after years of C++. And I remember how I thought it awkward that the codebase looked like everything was spelled out. Until I realize that it is easier to read, and that's the main quality for a codebase.
> CMMetadataFormatDescriptionCreateWithMetadataFormatDescriptionAndMetadataSpecifications(allocator:sourceDescription:metadataSpecifications:formatDescriptionOut:)
https://developer.apple.com/documentation/coremedia/cmmetada...:)
I think people focus a lot on inheritance but the core idea of OO is more the grouping of values and functions. Conceptually, you think about how methods transforms the data you are manipulating and that’s a useful way to think about programs.
This complexity doesn’t really disappear when you leave an OO language, actually. The way most complex OCaml programs are structured, with modules grouping one main type and the functions working on it, is in a lot of ways inspired by OO.
Encapsulation.
Which I think is misunderstood a lot, both by practitioners and critics.
Even with non-obfuscated code, if you're working with a decompilation you don't get any of the accompanying code comments or documentation. The more abstractions are present, the harder it is to understand what's going on. And, the harder it is to figure out what code changes are needed to implement your desired feature.
C++ vtables are especially annoying. You can see the dispatch, but it's really hard to find the corresponding implementation from static analysis alone. If I had to choose between "no variable names" and "no vtables", I'd pick the latter.
> Everything is dispatched dynamically
Well, not everything, there is NS_DIRECT. The reason for that being that dynamic dispatch is expensive - you have to keep a lot of metadata about it in the heap for sometimes rarely-used messages. (It's not about CPU usage.)
Unfortunately there were so many bad examples from the old Java "every thing needs a dozen factories and thousands of interfaces" days that most people haven't seen the cases where it works well.
Computers work on data. Every single software problem is a data problem. Learning to think about problems in a data oriented way will make you a better developer and will make many difficult problems easier to think about and to write software to solve.
In addition to that, data oriented software almost inherently runs faster because it uses the cache more efficiently.
The objects that fall out of data oriented development represent what is actually going on inside the application instead of how an observer would model it naively.
I really like data oriented development and I wish I had examples I could show, but they are all $employer’s.
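A generic sketch of the idea (an invented example, not anyone's production code): parallel primitive arrays iterated linearly, instead of one heap object per record.

```java
// Object-per-record layout: each Particle is a separate heap object,
// so a loop over them chases pointers scattered across the heap.
class Particle {
    double x, vx;
    Particle(double x, double vx) { this.x = x; this.vx = vx; }
}

// Data-oriented layout: parallel arrays, contiguous in memory,
// iterated linearly, which the cache and prefetcher handle well.
class Particles {
    final double[] x, vx;
    Particles(int n) { x = new double[n]; vx = new double[n]; }

    void step(double dt) {
        for (int i = 0; i < x.length; i++) x[i] += vx[i] * dt;
    }
}
```

The "objects" that fall out (a `Particles` batch with a `step` operation) describe what the program actually does, rather than mirroring one real-world particle per instance.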
The keyword being "some".
Yes, there are those who can use OOP responsibly, but in my (fortunately short) experience with Enterprise Java, they are outnumbered by the cargo-cult dogma of architecture astronauts who advocate a "more is better" approach to abstraction and design patterns. That's how you end up with things like AbstractSingletonProxyFactoryBean.
Also, I dislike design patterns overuse, DDD done Uncle Bob style.
Also we can think of where OOP drives many teams to:
https://steve-yegge.blogspot.com/2006/03/execution-in-kingdo...
https://factoryfactoryfactory.net/
https://github.com/EnterpriseQualityCoding/FizzBuzzEnterpris...
    > https://github.com/EnterpriseQualityCoding/FizzBuzzEnterpriseEdition
    > FizzBuzz Enterprise Edition is a no-nonsense implementation of FizzBuzz made by serious businessmen for serious business purposes.
While React technically uses some OOP, in practice it's a pretty non-OOP way to do UI. Same with e.g. ImGUI (C++), Clay (C). I suppose for the React case there's still an OOP thing called the DOM underneath, but that's pretty abstracted.
In practice most of the useful parts of OOP can be done with a "bag/record of functions". (Though not all. OCaml has some interesting stuff wrt. the FP+OOP combo which hasn't been done elsewhere, but that may just be because it wasn't ultimately all that useful.)
Function calls have state, in React. Think about that for a second! It totally breaks the most basic parts of programming theory taught in day one of any coding class. The resulting concepts map pretty closely:
• React function -> instantiate or access a previously instantiated object.
• useState -> define an object field
• Code inside the function: constructor logic
• Return value: effectively a getResult() style method
The difference is that the underlying stateful objects, implemented in OOP using inheritance (check out the Blink code), are covered up by the vdom diffing. It's a very complicated and indirect way to do a bunch of method calls on stateful objects.
The React model doesn't work for a lot of things. I just Googled [react editor component] and the first hit is https://primereact.org/editor/ which appears to be an ultra-thin wrapper around a library called Quill. Quill isn't a React component, it's a completely conventional OOP library. That's because modelling a rich text editor as a React component would be weird and awkward. The data structures used for the model aren't ideal for direct modification or exposure. You really need the encapsulation provided by objects with properties and methods.
    Welcome to Node.js v24.10.0.
    Type ".help" for more information.
    > const fn = (x) => x + x
    undefined
    > typeof(fn)
    'function'
    > Object.getOwnPropertyNames(fn)
    [ 'length', 'name' ]
    > fn.name
    'fn'
    > fn.length
    1
    > Object.getPrototypeOf(fn)
    [Function (anonymous)] Object

The game used to be simple, both conceptually and code-wise, but obviously it became more and more bloated the more developers touched it and the more bureaucracy was added. Now it's a complete nightmare, and I bet it's a nightmare for the developers too, considering how hard it is for them to fix even basic issues which have been in the game for like a decade at this point.
The devs also wrote a write-up here about how they handle the desyncs in netcode [1].
[1] https://medium.com/project-slippi/fighting-desyncs-in-melee-...
You have to have Factories and inheritance..
/s
> Also, what makes you think it’s the most popular server?
Because it's the only server software that can actually scale and support a long-term server with feature and bugfix stability. Its popularity bears out in what hosting companies say people are most commonly using. Though I'm not sure if there is a specific publicly published statistic to point to to prove this - there is bStats global stats, but it is biased towards the Paper ecosystem.
Fabric is getting close with certain optimization and bugfixing mods, but it's still not there. Paper has a checklist of what optimizations and fixes must be included for a release to proceed, whereas Fabric ecosystem is still a hodgepodge of different things that are only available on specific Minecraft versions.
Paper does seem to have its own site for plugins, Hangar or something? (Don't have my web history on this PC.) But the community support doesn't seem nearly as fleshed out.
It is incredible though, before 1.21 the last time I played around with MC server hosting was probably around 1.8 days, when it seemed like you only had Bukkit and a few plugins for it
I’m not really clear on mod vs plugin vs mixin, I was just trying to refer to whatever software does the decompilation work rather than just consuming APIs provided by projects that do.
Sounds like it’s correct that Paper didn’t do its own mod API, but incorrect that Paper doesn’t do its own decompilation work.
> By playercount, it is the largest (custom, standalone) MC server software in the world.
Do you have a source on this? Not trying to accuse you of anything, I just know that a few servers claim this, and don’t know if we have reliable numbers.
> As of 1.16.5 [(2021)], Forge will be using Mojang’s Official Mappings, or MojMaps, for the forseeable future
Pretty sure this applies to NeoForge as well: https://neoforged.net/personal/sciwhiz12/what-are-mappings/
According to the article, official mappings can be found here: https://piston-meta.mojang.com/mc/game/version_manifest_v2.j...
"hn$z" is a heck of a lot smaller than "tld.organization.product.domain.concern.ClassName"
> Our scanner can only do so much however. As a developer you can further improve parsing performance by increasing the information density of your programs. The easiest way to do so is by minifying your source code, stripping out unnecessary whitespace, and to avoid non-ASCII identifiers where possible.
It's the closest I've ever seen to someone literally being one of the hackers from The Matrix, literally staring at hexadecimal and changing chars one at a time.
Minecraft also has a plugin system based around JSON file datapacks, but it's a lot more limited. It's more at the level of adding a few cool features to custom maps than completely modding the game.
- They left in the code the debug features that they used to strip out.
- They left in the code their testing infrastructure, which they also used to strip out.
- They started making everything namespaced to differentiate contents between mods (like in this week's snapshot they made gamerules namespaced with the "minecraft:" prefix like items and blocks and whatnot)
- They are adding a lot more "building blocks" type features that both allow new /easier things in datapacks, and in mods as well.
Method patching with Mixins is less needed now because the game's internal APIs are more versatile than ever.
For example, in Micronaut (which is what I'm more familiar with) you can use @Replaces or a BeanCreatedEventListener to swap out objects at injection time with compatible objects you provide. If a use-site injects Collection<SomeInterface> you can just implement that interface yourself, annotate your class with @Singleton or @Prototype and now your object will appear in those collections. You can use @Order to control the ordering of that collection too to ensure your code runs before the other implementations. And so on - there's lots of ways to write code that modifies the execution of other code, whilst still being understandable and debuggable.
Players who were teenagers when the game first came out are now 29 to 35 years old. It's a pretty ancient game at this point. From my experience, most contemporary modders are in their late 20s.
We're still relying on legacy code written by inexperienced kids, though...
If you remember entire contraptions of command blocks doing stuff like playing Pokemon Red in Minecraft or "one commands" that summoned an entire obelisk of command blocks, the introduction of datapacks pretty much replaced both of those.
This is called the inner-platform effect, where in order to avoid programming in the original language, you invent a worse programming language. Apparently it used to be a big killer of enterprise software. It's also one of the reasons Minecraft needs ten times the RAM it used to. To be fair, we have fifty times as much RAM as we did when Minecraft came out, but wouldn't you rather have it put to use doing extended view distance, extended world height, and shaders?
---
Edit: https://web.archive.org/web/20100708183651/http://notch.tumb...
Makes it feel lightweight I think.
Here's some examples, particularly of his antisemitism to better illustrate the issues
Bethesda games have the same ecosystem - they do provide an official plugin system, but since modders aren't content with the restrictions of that system, they reverse engineered the game(s, this has been going on since Oblivion) and made a script extender that hacks the game in-memory to inject scripts (hence the name).
1. Use MultiMC to manage instances with various mods, since mods are rarely compatible with each other, and since each version of a mod is often only compatible with a single specific point release of the game itself.
Never download any EXE files to get a mod, that does sound sketch AF.
2. mods are always packaged for a particular Loader (some package for multiples and some require Forge, Fabric, or NeoForge), and MultiMC can install any of them into a given instance. Aside from different startup screens there seems to be no difference so idk why we need 3 different ones.
3. Curseforge's website and modrinth both seem to be legit places to get mods from. I personally find the installable Curseforge program itself to be bad and spammy, and would never use that, but the site still lets you directly download the jars you need, and lets you check "Dependencies" to find out what other mods you need.
PrismLauncher, a popular MultiMC fork, has direct integration with Curseforge and Modrinth, while being completely ad-free. Best of both worlds.
A few mods are not available because Curseforge allows mod authors the option to force ad monetization by blocking API access, but these are few and far between.
And there's a makedeb for it! https://mpr.makedeb.org/packages/prismlauncher
One wonders why Mojang didn’t embed Lua or Python or something and instead hand-rolled an even shittier version of Bash. The only reason MC servers like Hypixel exist is because the community developed an API on top of the vanilla jar that makes plugin development easy. Even with that there is still no way for servers to run client-side code, severely limiting what you can do. They could’ve easily captured all of Roblox’s marketshare but just let that opportunity slip through their fingers. Through this and a series of other boneheaded decisions (huge breaking changes, changes to the base game, lack of optimization), they have seriously fractured their ecosystem:
- PvP is in 1.8 (a version from 2015) or sometimes even 1.7 (from 2013)
- Some technical Minecraft is latest, some is in 1.12 (from 2017)
- Adventure maps are latest version
- Casual players play Bedrock (an entirely different codebase!)
The words “stable API” have never been said in the Mojang offices. So the community made their own for different versions, servers use the Bukkit 1.8 API, client 1.8 mods use Forge, latest mods use Forge or Fabric. The deobfuscated names are of little utility because the old names are so well ingrained, and modders will also probably avoid them for legal reasons.
That's not their main means of distribution; most often those sites were just third parties unrelated to the mod authors that repackaged the mod and somehow got better SEO. But TBF back in the day the UX/UI for installing mods was pretty terrible. Nowadays there are more standardized and moderated distribution websites from which you just download the .jar of the mod.
> And as far as I know there is no sandboxing at all in the game (uhm, no pun intended) so once installed the mod has full access to your computer?
This is totally true though.
I used to use Prism Launcher, which would just give me a search box with sources like Modrinth and CurseForge on the side. Usually I preferred Modrinth, but some modpacks were on CurseForge only. I never really downloaded a shady modpack from some random website outside those two; in fact, sometimes I never even opened a website, just Prism Launcher itself lol
[1]Fabric uses Mixins, while [2]Forge uses a more event-based system: hooks are patched into Minecraft's source code at fixed points, firing events that mods can subscribe to.
To me it's just incredible. It's not often that I see users owning an abstraction instead of the developers.
I wonder, from a modding perspective, would it be better if all public methods were just the API users can call, and the developers themselves created a way for mods to exist?
[1] https://wiki.fabricmc.net/tutorial:mixin_introduction [2] https://docs.minecraftforge.net/en/latest/concepts/lifecycle...
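The event-hook approach in [2] boils down to: the game calls out at fixed points, and mods subscribe. A toy sketch of that shape (names invented for illustration, not the actual Forge API; a real mixin, by contrast, rewrites the game's bytecode directly):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

public class ToyEventBus {
    private final Map<String, List<Consumer<String>>> listeners = new HashMap<>();

    // A mod subscribes to a named event, e.g. "blockBroken".
    public void subscribe(String event, Consumer<String> handler) {
        listeners.computeIfAbsent(event, k -> new ArrayList<>()).add(handler);
    }

    // The game code is patched once to fire events at fixed points;
    // mods never touch the game's code, only the bus.
    public void fire(String event, String payload) {
        for (Consumer<String> h : listeners.getOrDefault(event, List.of())) {
            h.accept(payload);
        }
    }

    public static void main(String[] args) {
        ToyEventBus bus = new ToyEventBus();
        List<String> log = new ArrayList<>();
        bus.subscribe("blockBroken", pos -> log.add("drop extra loot at " + pos));
        bus.fire("blockBroken", "10,64,10"); // the game's hook point
        System.out.println(log);
    }
}
```

The trade-off the thread describes falls out of this shape: events only exist where the loader chose to add hooks, whereas mixins can patch anywhere, at the cost of breaking more often.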
It's the way Vintage Story implemented modding. They developed the whole game as engine + mod API + a hooking layer for stuff outside the API.
Then most of the gameplay is implemented as mods on top of the engine, using the API and hooks. And those tools are open source, with a central distribution point for mods, so servers can push required mods and updates to clients as they join.
Marvellous and elegant design. Makes running a server with client side mods a breeze, because mods are automatically pushed to the clients.
Though in the end, you can't really open all the interfaces and expect it to be stable without making some huge trade-offs. When it works, it's extremely pleasing. Some Vintage Story mods that are made purely with the mod API can work across major game versions. Even the server/client version check is intentionally loose, as mismatched versions can still interact across most mechanics.
In practice, to balance API evolution against stability, not everything in the game is in the API, so you have to use the hooking API, and stuff that is not exposed tends to break much more often, so those mods require manual updates, just like in Minecraft (though not as bad, tbh. In Minecraft nowadays modders tend to support both the Fabric and NeoForge/Forge APIs, targeting each for at least a few major versions. In Vintage Story, you only gotta support one modding API heh).
> ... you can't really open all the interfaces and expect it to be stable without making some huge trade offs.
Another game I often play with a huge open interface is Crusader Kings 3, and Paradox games in general. Most of the gameplay is implemented in their engine's scripting language. But as you said, when the game gets a big update, most mods simply don't work anymore.
If the support of the community dies down, many mods with much work and craft in them don't get updated anymore and rot away as the game gets updates. Quite sad actually.
That's why I also quite like Star Wars: Empire at War mods. The game does not get updates anymore, the API is mostly frozen, and even old mods still work.
I'm surprised this hasn't become a malware distribution channel yet.
All of the same supply chain issues you have with packages in a programming ecosystem exist in VS's mod system.
Most serious servers only allow players with valid paid Minecraft accounts to join, because it allows the server owner to ban people or otherwise keep track of people. I don't see any reason why this would change just because the game client was made open source.
Sure a different approach might be possible, but would likely also hinder adoption of such a 3rd party account system.
> Once sales start dying and a minimum time has passed, I will release the game source code as some kind of open source.
https://web.archive.org/web/20100301103851/http://www.minecr...
Has that part ever happened?
Back then he couldn't have foreseen the size of the money printing factory that the game would become.
Since then they've made that back on game copies alone, and god only knows how much from movie/merch rights and microtransactions.
A lot of QAnon rants and other conspiracy things. Just goes to show that sometimes it is best you don't get what you wish for.
https://www.youtube.com/watch?v=PmTUW-owa2w
He is developing a new voxel-like game called "levers and chests", and before that he has shown us a few cool webgl demos I find interesting.
I started playing Minecraft again recently and while it sounds like it’s the same artist, and it’s still somewhat contemplative, it’s not dissonant anymore.
https://www.loc.gov/static/programs/national-recording-prese...
I turn it off but only because I have great difficulty with multiple sound sources at the same time. I will happily listen to C418's output for hours whilst doing something else.
(And also Touhou because who doesn't love an electric trumpet?)
OSS would probably also just mean "read the source e.g. on github", not really specific as to all the four essential freedoms.
They have been open sourcing some of their older IPs, they recently open sourced their Command & Conquer games for example:
https://www.ea.com/games/command-and-conquer/command-and-con...
That said, I never had any interest in playing on a server that was populated by anyone but my small circle of friends.
Now my kids are growing up doing the same which I find great because I know exactly with whom they are interacting and have no worries about it.
And once there were mods and mod loaders built on the obscured source, it became easier to not disrupt the toolchains than to bite the bullet; I think Mojang now wants to make moving mods easier (someone somewhere has to be a bit sad that there are famous modpacks running old versions of Minecraft because it's easier to backport everything to 1.7.10 (including running on newer Javas) than it is to update mods).
I think one of the reasons Vision Pro and metaverse have been struggling is because their engines are bad. Not just locked down, but hard to develop on (although I don't have personal experience, I've heard this about VR in general). If you want to build a community, you must make development easy for hobbyists and small users*. I believe this has held even for the biggest companies, case in point the examples above.
* Though you also need existing reputation, hence small companies struggle to build communities even with good engines.
I believe, though, that what you actually need as a big or small company is a good game first and foremost; the engine is secondary. When the community around a game reaches a critical mass, the very small percentage of its members who have the skills to modify things becomes significant as well.
For instance, Richard Burns Rally was not intended to be modded at all, yet the fans added new cars, new tracks, online scoreboards, etc.
In the Luanti [1] community (a voxel game engine/platform, designed to be moddable nearly from the start), one begins to see something similar as well: notable games get mods, others don't (the former default game is a particular case; it is not exactly good but got tons of mods because of its status, and games based on it benefit from that ecosystem). Yet all use the same engine (perhaps Roblox is similar in that respect; I'm not sure if they have "reified" whole games like Luanti did).
What it did do right was be very open-ended and be conducive to modding, both of which were amplified by multiplayer capabilities.
I would wager that most of the fun players have had in Minecraft is from experiences that were built on top of Minecraft, not from the game’s own gameplay.
That made it a great game. I think it was inevitable that the first game to combine these two, infinite procedural worlds and free modifiability, would be a huge success. Worth noting also that Infiniminer, despite the name, didn't have the infinite part worked out!
Battle royale games were almost certainly heavily inspired by the Minecraft minigame which predates them. Factorio has the old industrialcraft mod as an acknowledged inspiration. Vintage Story is basically standalone Terrafirmacraft (and by a dev from that, as I recall).
Last man standing formats were perfectly possible in traditional FPS formats too, but they weren't really a thing because to actually be fun, the format needs
1. Big maps and lots of players (more than the typical FPS)
2. A "searching for loot" mechanic, where you can increase your chances of survival by looking for good items, making interesting risk/reward tradeoffs and discouraging just turtling up in the most defensible location.
3. Shrinking borders, to prevent an anticlimactic endgame of powerful players searching for hiding stragglers.
Minecraft basically had all three since 2014, and there were quite popular last-man-standing formats like UHC even before it had a world border (and before the Hunger Games film came out).
In 2006, I could download the Roblox app and bam, I would play thousands of 3D multiplayer games for free that loaded near instantly. With fully destructible buildings and dynamic terrain. Somehow I didn't get viruses from remote code execution.
That was groundbreaking at the time. In that era, I'd have to download Steam, buy individual games like Counterstrike, and the wackiest thing would be the "surf" gamemode. Most games I'd buy on CDs. I certainly couldn't knock down entire buildings with grenades.
If you contrast with Second Life/Habbo Hotel, you could walk around and talk to people I guess?
The community that sprang up around it eventually carried it into total dominance of gaming for American children, but the basic parts of the engine, like "click button, load into game, blow stuff up", were a decade ahead of the curve.
Also Blockland cost money, Roblox was free.
It's interesting that you chose Counter-Strike as an example, as that is a Half Life mod itself, and by 2006 there was a large ecosystem [1] of Half Life modifications using Metamod and AMX Mod (X). The last one in a weird C-like language called Small or Pawn, which was my first programming language that I made serious programs with.
Especially the War3FT mod where users gained server-bound XP in combination with a reserved slots plugins which allowed top-XP users to join a full server really created a tight community of players on my tiny DSL home-hosted server.
[1] https://www.amxmodx.org/compiler.php?mod=1&cat=0&plugin=&aut...
It's challenging to get networking right, and the effort required doesn't get all that much smaller just because your game is smaller.
Most engines do come with a networking framework or layer these days but Roblox gets to assume a bunch of things an engine can't, and as such provide a complete solution out of the box.
Everything was replicated in the client and server. So you could open Cheat Engine, modify your total $$$ on the client, and it would propagate to the server and everyone else playing.
They only fixed this in 2014 with FilteringEnabled/RemoteFunctions but that was opt-in until 2018 and fully rolled out in 2021 (breaking most classic Roblox games). This also made games much harder to develop.
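The fix (server-authoritative state) is conceptually simple: the server owns the value and only applies changes it can verify, rather than accepting whatever the client replicates. A hypothetical sketch of the two models, with all names invented for illustration:

```java
public class ServerWallet {
    private int money = 100; // authoritative value lives on the server

    // Pre-FilteringEnabled model: blindly accept the client's replicated
    // value. A Cheat Engine edit on the client propagates straight through.
    public void acceptReplicatedValue(int clientValue) {
        money = clientValue;
    }

    // Server-authoritative model: the client may only *request* a purchase;
    // the server checks the price against its own state before committing.
    public boolean requestPurchase(int price) {
        if (price < 0 || price > money) return false; // reject impossible requests
        money -= price;
        return true;
    }

    public int balance() { return money; }
}
```

The "harder to develop" complaint follows directly: every interaction that used to be a direct property write now needs a request/validate round trip on the server.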
> In that era, I'd have to download Steam, buy individual games like Counterstrike, and the wackiest thing would be the "surf" gamemode.
You could also play any Source mod. Also WC3 maps were insane at the time.
To give an example, Roblox added user-created cosmetic t-shirts as a way to monetize the platform. Developers immediately scripted their games to recognize special "VIP t-shirts" that would provide in-game benefits. And quickly created idle games called "tycoons" where you could wait 2 hours to accumulate money to buy a fortress, or buy the t-shirt to skip all that.
I don't think there were any modding systems with mtx support.
I don't think I am alone in saying this. IIRC the game was making millions while still in alpha.
If they released a cheap or impressive enough VR headset, I doubt desktop or face-tracking would matter. But I think the next best thing, a decent headset with an open platform that enabled such things, would’ve saved them.
I am glad they don't, the headset should be a general computing device first and foremost, launching apps you choose to participate in.
(Meta, I think, fails to understand that the people that most want a virtual space to interact with, to the point of putting up with the limitations of VR tech, mostly want to not look like regular people in that space, because they keep pushing a vision that seems to be a uniform 'normality' even more extreme than the real world)
The VRChat community should consider forming and funding an open source group to re-implement the platform as it will eventually get regulated.
For what it's worth I don't use VRChat, I've just been around the internet for long enough to know the pattern.
There are currently two much smaller competitors that are perfectly usable but lacking community buy-in. Chillout, which is similar to VRChat, with some improvements the community has wanted for years, but missing some of VRChat's (admittedly excellent) homemade functionality, such as better IK code, better bone dynamics, etc. And Resonite, which is more similar to SecondLife, possessing a cross-world inventory system and in-game content authoring tools.
Unity and UE have pretty good VR support nowadays, and even godot is getting there. Plus making a custom engine for VR was never that much harder than for a normal 3D game (well, once some API like OpenXR got normalized).
The big issue with VR right now is that it is more costly to develop for than normal apps and games, while having less user. It makes it a hard sell. For some indie dev, I allow them to profit from a market that is not yet saturated (right now, with no good marketing, you just get buried on steam, any app store, etc). There are many factors that make it more costly, like having to support several mobility and accessibility features for games (for example smooth and jump locomotion, reduce fov when moving the view, etc), that you usually don't have to care for in other plateform. And there is the issue of interactivity. UX (and in many ways UI) is still very far from ideal. Most VR apps and games just try things out, but there is still a world of pattern and good practice to build up. This makes using anything VR often an annoying experience. Especially since some issue can be an absolute no-go for some user. As an example, displaying subtitle in a 6dof environment can be tricky. Some game put it at a fix point of your view, which can cause nausea and readability problem, some move still follows the head/view but with a delay, which reduce nausea issue but can be distracting and also has readability issue (the subs can go out of view).
In a “free for all” setting, anyone (including kids) could potentially learn enough (or even just download pre-made scripts) and try their hand at modding software/games.
In a modern situation with developer registration, etc someone would need some sort of established identity, potentially going through age verification, paying some nominal fee for a license, accepting an EULA and so forth. This is a huge barrier to entry for kids/teenagers just wanting to tweak the game experience for themselves/their friends. I remember my first time trying to install Apache on Windows I guess around 2008-09, and the (very well-made!) install wizard asked me for a domain name. At the time I wasn’t aware of how DNS/etc worked and was scared to continue, thinking I would either take up some other company’s name or not being “allowed” to use a random name I’d pick and get myself/my parents in trouble.
All these “regulated” ecosystems make it scarier for well-meaning but inexperienced devs to get started, while doing little to deter dedicated attackers who know the game and know actual cybercrime enforcement is both lacking and trivial to defeat in any case.
The “free for all” environment made me the developer & sysadmin (or DevOps person as the techbros call it) I am today despite no formal training/education and I am sad to see this opportunity go for the younger generations.
Diverging even slightly from the demo use case would quickly feel like Sisyphus; so close, but never succeeding in getting over the hill.
Good for marketing in certain cases (to be the first), but bad for the community of builders
To me it's interesting when a game succeeds despite its community. As if people can endure a lot of toxicity as long as the game is good.
Curious to know to what degree the "Creative" maps have fueled Fortnite's success as opposed to the 1st and 2nd party developed experiences.
> So one of the big efforts that we're making for Unreal Engine 6 is improving the networking model, where we both have servers supporting lots of players, but also the ability to seamlessly move players between servers and to enable all the servers in a data center or in multiple data centers, to talk to each other and coordinate a simulation of the scale of millions or in the future, perhaps even a billion concurrent players. That's got to be one of the goals of the technology. Otherwise, many genres of games just can never exist because the technology isn't there to support them. And further, we've seen massively multiplayer online games that have built parts of this kind of server technology. They've done it by imposing enormous costs on every programmer who writes code for the system. As a programmer you would write your code twice, one version for doing the thing locally when the player's on your server and another for negotiating across the network when the player's on another server. Every interaction in the game devolves into this complicated networking protocol every programmer has to make work. And when they have any bugs, you see item duplication bugs and cheating and all kinds of exploits. Our aim is to build a networking model that retains the really simple Verse programming model that we have in Fortnite today using technology that was made practical in the early 2000's by Simon Marlow, Simon Peyton Jones and others called Software Transactional Memory.
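The core idea of software transactional memory mentioned in the quote can be sketched with a compare-and-swap retry loop over a single variable (a crude stand-in for illustration only, not Unreal's or Verse's actual model, which handles many variables per transaction):

```java
import java.util.concurrent.atomic.AtomicReference;
import java.util.function.UnaryOperator;

public class TinyStm {
    // A "transactional" update: the change is expressed as a pure function
    // of a snapshot, and only commits if nothing changed underneath it.
    public static <T> T atomically(AtomicReference<T> ref, UnaryOperator<T> tx) {
        while (true) {
            T snapshot = ref.get();          // read a consistent snapshot
            T updated = tx.apply(snapshot);  // run the transaction against it
            if (ref.compareAndSet(snapshot, updated)) return updated; // commit
            // another thread committed first: retry the whole transaction
        }
    }

    public static void main(String[] args) {
        AtomicReference<Integer> gold = new AtomicReference<>(100);
        atomically(gold, g -> g - 30); // e.g. a purchase, with no manual locking
        System.out.println(gold.get()); // 70
    }
}
```

The appeal for networked games is that gameplay code is written as if it runs locally and atomically, while the runtime handles conflicts by retrying, instead of every programmer hand-writing a networking protocol per interaction.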
The lockdown is a big part of it, though. The industry has cross-platform VR/AR SDKs like OpenXR that Apple refuses to implement. A big reason their platform isn't supported day-and-date with multiplat VR releases is Apple's insistence on reinventing the wheel with every platform they make.
If the rumors of Valve's VR headset being able to run flatscreen games are true, it's more-or-less Game Over for the Vision Pro. The appetite for an iPad-like experience with six DOF is already handled by much cheaper machines.
You can, pretty much, get the Minecraft experience by downloading mods. Or just use the VoxeLibre game mod.
https://content.luanti.org/packages/Wuzzy/mineclone2/
The mods are written in lua and you can find the source code for most of them.
One I like is Zoonami which turns the experience into a Pokemon like game.
It differentiates between mods and games. A game changes the core experience much more substantially, though sometimes a game is just a collection of other mods.
https://content.luanti.org/packages/?type=game
Personally, I find it more fun to just go and click on about 6 to 8 mods that are interesting and see how the game goes.
https://content.luanti.org/packages/?type=mod
Some of my picks are...
https://content.luanti.org/packages/ElCeejo/animalia/
https://content.luanti.org/packages/random-wizard/gear_up/
Hell, they could even make it Open Source with a clause preventing other companies from using the code to make a profit. It's too big to fail.
Such a clause would immediately make it Source Available not Open Source.
Nowadays GitHub is filled with so-called open source projects under GPLv3 whose maintainers want you to pay for a dual license
> The license must allow modifications and derived works, and must allow them to be distributed under the same terms as the license of the original software.
> No Discrimination Against Fields of Endeavor
You cannot restrict selling derivatives of the software (a field of endeavor), and the derived work must be able to be shared under a similar license.
GPL does not add any restrictions counter to that. It allows you to redistribute and sell copies if you want; you just have to respect your users' freedom by also giving them the modified source code.
Indeed, why did they even bother with this half-measure in the first place?
Now I'm bracing for them to drop support for Java Edition entirely and go strictly Bedrock in a couple of years.
Perhaps Minecraft 2.0 is finally nearing release.
Relevant wiki link: https://minecraft.wiki/w/Java_Edition_2.0
[1] https://www.anthropic.com/news/claude-code-plugins#:~:text=P...
However, the source information in the logs was always missing or strange, which made matching some messages difficult. Hopefully this will make more messages unique so that I can easily match the ones I am interested in.
before the judge would have to admit it was just coincidence.
Not to mention doing so would basically kill the game, as one of the biggest reasons people even still play Minecraft is the modding scene, not the minimum-viable effort that the official updates have been for the last several years.
Or is the argument that only source code is copyrighted, but not binaries so it only matters if the name matches the original source code? That doesn’t seem possible because it’s copyright infringement to share a retail game binary, so they’re clearly copyrighted as well.
So I’m really unclear how the risk here is any different regardless of obfuscation since the mod needs to use method names from the copyrighted binary either way.
Reading a little closer, the decision was that even assuming the API copyright claim was valid, Google's use of the API was fair use.
> In April 2021, the Supreme Court ruled in a 6–2 decision that Google's use of the Java APIs served an organizing function and fell within the four factors of fair use, bypassing the question on the copyrightability of the APIs. The decision reversed the Federal Circuit ruling and remanded the case for further review.
https://en.wikipedia.org/wiki/Google_LLC_v._Oracle_America,_....
Source? It seems that if they wanted people to get their lost accounts back, there's more efficient, less expensive ways of doing it?
> 11. Choice of Law and Place to Resolve Disputes. If you live in (or, if a business, your principal place of business is in) the United States, the laws of the state where you live (or, if a business, where your principal place of business is located) govern all claims, regardless of conflict of laws principles, except that the Federal Arbitration Act governs all provisions relating to arbitration. You and we irrevocably consent to the exclusive jurisdiction and venue of the state or federal courts in King County, Washington, for all disputes arising out of or relating to these Terms or the Services that are not heard in arbitration or small claims court.
This has been a pain to work around for years as the modding scene has gotten bigger. Hopefully this makes modding a bit more accessible.
This already changed A LOT when Forge and later Fabric came out, with a simple patch system akin to BepinEx and a mods folder.
On the other, I'd assume this means that any official modding support is now stone dead and will never happen.
So instead they have been structuring the code in ways that help mod development, and have been talking directly to the devs of mods and mod loaders to try and reduce friction with Minecraft updates.
> I'd assume this means that any official modding support is now stone dead and will never happen.
I was a bit surprised to read this because talk of modding support had been on the radar since notch days, it's wild to me that this hasn't happened yet.
The original issue with official modding support, from my perspective, has always been a legal one. But the Mojang EULA explicitly allows modding now. So I would see this decision as one in a long line of decisions by Mojang to both normalise the legal relationship with modders, and beyond that giving a "thumbs up" to the community.
Why did they keep it obfuscated for so long even after it became readily apparent that almost everyone buys Minecraft to (eventually) play the mods?
Why did they keep it obfuscated even though they acknowledged it didn't really stop modders (or anyone else) from understanding the program?
What occurred recently that caused them to change their mind?
It's great to make this step.
Thanks for the info.
(But no, I don't think they're going to stop JE development. I'd bet it's still the far more popular version, and they probably still make plenty of money from sales)
But I agree Java Edition is not ending any time soon.
Exactly...? How much content is built with Bedrock edition and Marketplace Add-on's?
The inner-platform effect is when, in an effort to spare people from using the original programming language because programming is complicated, you create a worse programming language and make people use that. In Minecraft, it's data and resource packs. The Java code isn't just a function on the block that renders it any more - there's a bunch of indirection through resource packs, and they've gone full abstraction hell with that too, adding unnecessary abstractions in the way of the actual abstraction they want.
Can you elaborate on this? This seems like a strange way of saying "it's easier to mod little things with data/resource packs" - and mods are still absolutely necessary, as data/resource packs can't do everything. But they're great for, say, adding tags to random items (something I do regularly) or - the most obvious use case - texture packs
    public class MyBlock extends Block {
        public int getTexture() { return 0; } // index into the texture atlas
        public String getTextureAtlasPath() { return "/mymod.png"; }
    }
Later it was
    public class MyBlock extends Block {
        Icon icon;
        public void registerIcons(IconRegistry r) { icon = r.register("mymod:myblock"); }
        public Icon getTexture() { return icon; }
    }
You need a little bit more code and you have to know that "mymod:myblock" really means "/assets/mymod/icons/blocks/myblock.png" but it's not too bad. (Why not specify the actual path?)
But now it takes the Java class, plus about 5 different JSON files that are magically linked based on strings like the above (interpreted differently in each context), and if you want to simply set the icon in a few lines of code like before, you can't because all the code is specialized for handling JSON files. https://docs.minecraftforge.net/en/1.12.x/models/files/
You could argue it's better because it handles more block shapes, but the story for shapes isn't much better - you used to be able to write if(thingAboutItem) renderCertainWay(); but now you can write {"when":{"certain_condition":"true"}, "apply":{"model":"certain_model"}} and there's a whole bunch of code to write to map "certain_condition" to the condition you want, and woe betide you if your model isn't a bunch of textured axis-aligned cuboids. https://docs.minecraftforge.net/en/1.12.x/models/using/ https://docs.minecraftforge.net/en/1.12.x/models/advanced/ex...
If you know the inner-platform effect, it's the inner-platform effect: creating a poor replica of part of your programming environment in the quest for "configurability" or "no-code". https://en.wikipedia.org/wiki/Inner-platform_effect https://thedailywtf.com/articles/the_inner-platform_effect https://news.ycombinator.com/item?id=39412321
Modding with data packs is harder than modding with Java used to be, and modding with Java now is also harder than modding with Java used to be, because of data packs.
The only time I encountered it was when I was working for the government, we were working on the rules that decide who gets audited in depth by the tax police. The .jar it compiled to was obfuscated.
Most of the stuff is like naming every method a or b and relying on overloading; giving one-letter names, or even reserved keywords like 'if', to class names (or packages) was popular too. Pretty much constant-pool modifications without too much bytecode editing.
Overall it was cheap, unnecessary, and has not stopped anyone.
Microsoft logs us out every damn time we close the software, which means my grade schoolers have my (now guessable) MS account password (and I scorch-earthed the account, because this is so dumb I won’t trust or use their crap moving forward).
Has anyone figured out how to pirate the binaries? I’d like to remove the yellow sticky note with the password from my monitor.
You'd still need to log in to update the game (I think), but for your purposes it'd probably work pretty well.
Also note that monthly active users for Roblox and Fortnite equate to monthly revenue, whereas I doubt there are as many people buying Minecraft in-app purchases.
Minecraft has made press releases detailing active player counts (“up to” 139,000,000 MAU in 2021). See here:
https://web.archive.org/web/20210809155838/https://news.xbox...
This does claim that 238,000,000 copies have been sold but that there are 400,000,000 registered Minecraft players in China (this would be about 1/3rd of the population so I think it’s probably a typo).
Epic is privately held, so I suspect they wouldn't bother reporting official player counts. Roblox actually does have numbers listed in their annual report, it just wasn't showing up when I Googled it: $3.6 billion in revenue and 82.9 million daily active users. So that would put it in the same ballpark as Minecraft's playerbase, but still about 20,000,000 short.
It may well be that all the kids are playing it over Minecraft, though, since the document I linked above claims the average Minecraft player in North America and Europe is 27. I have no idea what those numbers look like for Roblox but from what I understand the playerbase has always skewed substantially towards minors.
> You have literally just invented a conspiracy theory to affirm your biases around this matter.
This was incredibly abrasive.
I guess Microsoft wouldn't want to deal with the licensing issues of publishing the loader part.
I bought the game after they added fences and fishing rods and before the Nether. The nether ruined the game, beds ruined the game, hunger ruined the game, potions, enchantments, villager trading, and hoppers ruined the game, but redstone and minecarts and dungeons didn't ruin the game because those were added before I bought it, see? If you bought it today, you wouldn't think hunger ruined the game, you'd rather think I took away a good feature if I showed you a version without hunger.
But a lot of things Mojang has added, if they had been mods some random developer made, we probably wouldn't have been putting in our modpacks. A new tier of armor, which requires a tedious grind in the nether to get? That's like baby's first mod. Happy ghasts? Pretty fun, and impressive that you can stand on them, but like the morph mod, kinda ridiculous and definitively doesn't belong in every pack. Eventually, if they keep doing it like this, Minecraft will be as ridiculous as the old kitchen sink modpacks.
Yeah, 1.7.10 is many modders' favorite, I know. If you did stop at 1.7.10, I guess you know about the GTNH people's crazy work in keeping that version running? For a while, you could run 1.7.10 on a newer version of Java than the current latest Minecraft version supported.
And for what it's worth, I think I've visited The End once? What makes it a sandbox is that you can play however you want - let The End be your goal, if you'd like, or just mine and build big castles, or mess around in Creative mode. That's the brilliance of it.
In current Minecraft a lot of resources are renewable via villager trading. The best way to get many resources is to enslave some villagers and find a trading loop that nets a profit in emeralds on each cycle, then spend some of that profit on the thing you want. If you want the player to dig up coal to make electricity, they can make a trading hall or a wither skeleton farm instead. If you want the player to dig up iron to expand their factory they'll make an iron golem farm which produces it at a high rate for free. The section of the design space that you wanted to access is blocked off by the mechanics of the base game.
Or maybe it wasn't intentional, but emergent... sort of like, you know, playing in a sandbox?
Villager trading is another new(ish) mechanic I don't partake in. I know others who don't either. I don't think we're playing the game wrong.
And if a mod/pack developer doesn't want players to use vanilla mechanics, they can disable those, as many, many developers do.
Probably the same with Java. There's no point in doing so with our beyond-fast and powerful computers.
No, it has been obfuscated since around 1.8, when Microsoft bought up Mojang Studios. Before that? Meh, it wasn't. That's the main reason JE had a broader mod ecosystem from the start, the result being that 1.7.2 is one of the most active modded versions, since most mods couldn't get past around 1.8.
The motive behind this was probably that they found out people couldn't get their mods/server software updated in time (due to the extra work required), leading people to be really reluctant to update their versions.
It was definitely already obfuscated by then, the Microsoft acquisition had nothing to do with it.
If anything, looking back all the years, Microsoft has largely delivered on the promise to not fuck up the game and its community. They’ve mostly kept their hands off it, besides the Microsoft account stuff (which makes sense why they did it, but a lot of people are still understandably annoyed). Hell, they’ve kept up two separate codebases in fairly close feature parity for _years_. I doubt they’d have kept JE if there weren’t people in that team who genuinely cared.
Huh? This is not true. The very first version released in 2009 was obfuscated with ProGuard, the same obfuscator used today.
The reason Minecraft 1.7 was a popular version for modding was because Forge was taking too long to come out, and the block model system was changed in a fundamental way in the next update. Has nothing to do with obfuscation.
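For reference, a ProGuard setup of the kind that produces those single-letter names is only a few lines. This is a generic sketch: the jar paths and the keep rule for the entry point are chosen for illustration, not taken from Mojang's actual build:

```
# Obfuscate names only; leave shrinking and optimization alone.
-dontshrink
-dontoptimize

-injars  client.jar
-outjars client-obf.jar
-libraryjars <java.home>/lib/rt.jar

# Keep the entry point readable so the launcher can still find it;
# everything else gets renamed to a, b, c, ...
-keep public class net.minecraft.client.main.Main {
    public static void main(java.lang.String[]);
}
```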
> The motive behind this is probably due to them finding out people can not get their mods/server software updated in-time (due to extra work required) and this leading people being really reluctant to update their versions.
Not really accurate. The Minecraft modding and custom server software ecosystem has more agility right now than it ever had in the past. In the past 5 years, a remarkable shift has occurred: people have actually started updating their game & targeting the latest version. Minecraft 1.21 has the highest number of mods in the history of the game.
The best thing to happen to Minecraft is 1.7.10 backporting; the second best thing has been breaking the Forge monopoly on modding.
(The code quality of mods back in the 1.7 days ranges from "pretty decent" to "absolutely horrendous" mind you.)
You can easily see that versions prior to Beta 1.8 were obfuscated just by downloading the .jar for the older versions on minecraft.wiki.
You can even view some of the old MCP mappings here: https://archive.org/details/minecraftcoderpack
It’s disinformation if it’s deliberate.