So much goodness in this release. Struct redefinition combined with Revise.jl makes development much smoother. Package apps are also an amazing (and long awaited) feature!

I can't wait to try out trimming and see how well it actually works in its current experimental instantiation.

How's the Julia ecosystem these days? I used it for a couple of years in the early days (2013-2016ish) and things initially felt like they were going somewhere, but since then I haven't seen it make many inroads.

Any thoughts from someone more plugged in to the community today?

My company (a hedge fund) has been using Julia for our major data/numeric pipelines for 4 years. It's been great. Very easy to translate math/algorithms into code, lots of syntactical niceties, parallelism/concurrency is easy, macros for the very rare cases you need them. It's easy to get high performance and possible to get extremely high performance.
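
As a rough illustration of the parallelism point, a minimal sketch with made-up names like `simulate_price`, not our actual pipeline code:

using Base.Threads

# Stand-in for whatever per-item numeric work the pipeline does.
simulate_price(x) = sum(sin, x .* (1:1_000))

function price_all(inputs)
    out = Vector{Float64}(undef, length(inputs))
    @threads for i in eachindex(inputs)   # one chunk of the loop per thread
        out[i] = simulate_price(inputs[i])
    end
    return out
end

price_all(rand(16))   # start Julia with `julia -t auto` to use all cores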

It does have some well-known issues (like slow startup/compilation time) but if you're using it for long-running data pipelines it's great.

Disclaimer: I am not plugged into the community.

The other day that old article "Why I no longer recommend Julia" got passed around. On the very same day I encountered my own bug in the Julia ecosystem, in JuliaFormatter, that silently poisoned my results. I went to the GitHub issues and someone else had encountered it on the same day. I'm sure they will fix it (they haven't yet; JuliaFormatter at this very moment is a subtle codebase-destroyer), but as a newcomer to the ecosystem I am not prepared to understand which bog-standard packages can be trusted and which cannot.

As an experiment I switched to R, and the language is absolute filth compared to Julia, but I haven't seen anyone complain about bugs (the opposite, in fact) and the packages install fast without needing to ship prebuilt sysimages like I do in Julia. Those are the only two good things about R, but they're really important.
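
(For context, the sysimage workaround I mean looks roughly like this with PackageCompiler.jl; the package names and output path are just examples.)

using PackageCompiler

# Bake slow-to-load packages into a custom system image once, so later
# sessions skip their load/compile cost.
create_sysimage(["DataFrames", "CSV"]; sysimage_path = "DataSysimage.so")

Launching with `julia --sysimage DataSysimage.so` then starts with those packages already compiled in.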

I think Julia will get there once everything has had more time in the oven to stabilize and become battle-hardened, and then Julia will be a force to be reckoned with. An actually good language for analysis! Amazing!

Just to be fair, the very first words in the README for JuliaFormatter are a warning that v2 is broken and users should stick to v1. So it is not a "subtle" codebase-destroyer so much as a "loud" codebase-destroyer.
That's fair, and my bug was in 2.x, but it doesn't really make me feel better. It's OffsetArrays again--the language made cross-cutting changes that its ecosystem isn't prepared to absorb, so everything is just buggy everywhere as a result.

Not super loud, though. Obviously I missed it despite using JuliaFormatter constantly. It doesn't get printed when you install the package nor when you use it. It's not on the docs webpage for JuliaFormatter. It's only in the GitHub readme. I was reading the docs. What other packages have disclaimers that I'm not seeing?

Going well, regardless of the regular doom and gloom comments on HN.

https://juliahub.com/case-studies

I do wonder in particular about the startup time "time-to-plot" issue. I last used Julia about 2021-ish to develop some signal processing code, and restarting the entire application could have easily taken tens of seconds. Both static precompilation and hot reloading were in early development and did not really work well at the time.
That was fixed in 1.9. Indeed, it makes a huge difference now that the first run is quick.
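
A lot of that comes from packages caching native code during precompilation since 1.9. A hedged sketch of the PrecompileTools.jl pattern a package might use (the module and workload here are invented; this is not how Plots.jl itself is structured):

module TinyPlotsDemo

using PrecompileTools

# Stand-in for real plotting work.
make_series(xs) = [(x, sin(x)) for x in xs]

@setup_workload begin
    xs = range(0, 2π; length = 100)
    @compile_workload begin
        # Runs while the package precompiles; the compiled native code is
        # cached in the package image, so a user's first call is already fast.
        make_series(xs)
    end
end

end # module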
On a Mac mini (i.e. fast RAM), time to display:

- Plots.jl, 1.4 seconds (including package loading)

- CairoMakie.jl, 4 seconds (including package loading)

julia> @time @eval (using Plots; display(plot(rand(3))))

  1.477268 seconds (1.40 M allocations: 89.648 MiB, 2.70% gc time, 7.16% compilation time: 5% of which was recompilation)
My shop just moved back to Julia for digital signal processing and it’s accelerated development considerably over our old but mature internal C++ ecosystem.
Mine did the same for image processing, but coming from python/numpy/numba. We initially looked at using Rust or C++, but I'm glad we chose to stick it out with Julia despite some initial setbacks. Numerical code flows and reads so nicely in Julia. It's also awesome seeing the core language continuously improve so much.
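
As a small, hedged illustration of the "flows and reads nicely" point (an invented normalization step, not our actual pipeline):

using Statistics

# Per-pixel standardization, written the way the math reads.
standardize(img) = (img .- mean(img)) ./ std(img)

# Clamp to [0, 1] and gamma-correct; the dots fuse everything into one pass.
adjust(img; gamma = 2.2) = clamp.(img, 0, 1) .^ (1 / gamma)

img = rand(Float32, 256, 256)          # fake grayscale image
out = adjust(standardize(img) .* 0.1f0 .+ 0.5f0)
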
> For example, the all-inference benchmarks improve by about 10%, an LLVM-heavy workload shows a similar ~10% gain, and building corecompiler.ji improves by 13–16% with BOLT. When combined with PGO and LTO, total improvements of up to ~23% have been observed.

> To build a BOLT-optimized Julia, run the following commands

Is BOLT the default build (e.g. fetched by juliaup) on the supported Linux x86_64 and aarch64 platforms? I'm assuming not, based on the wording here, but I'm interested in what the blocker is and whether there are plans to make it part of the default build process. Is it still considered immature? Are there other downsides beyond the harmless warnings the post mentions?

I wish a) that I were a Julia programmer and b) that Julia had taken off instead of Python for ML. I’m always jealous when I scan the docs.
Python predates Julia by 3 decades. In many ways Julia is a response to Python's shortcomings. Julia could've never taken off "instead of" python but it clearly hopes to become the mature and more performant alternative eventually
Some small additional details: 23 years, not 30. Also, I think Julia was started as much in response to Octave/Matlab’s shortcomings. I don’t know if it is written down, but I was told a big impetus was that Edelman had just sold his Star-P company to Microsoft, and Star-P was based around Octave/Matlab.

- https://julialang.org/blog/2012/02/why-we-created-julia/

When Julia came out, neither Python nor data science and ML had the popularity they have today. Even 7-8 years ago people were still having Python vs R debates.
In 2012, Python was already well-established in ML, though not as dominant as it is today. scikit-learn was mature and Theano was pretty popular. Most of the top entries on Kaggle were C++ or Python.
Being able to redefine structs is what I always wanted when prototyping using Revise.jl :) great to have it
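For anyone who hasn't tried it yet, a minimal REPL sketch of what the new behavior allows (on earlier versions the second definition raises an "invalid redefinition of constant" error):

julia> struct Point
           x::Float64
       end

julia> struct Point          # 1.12 accepts the new definition outright
           x::Float64
           y::Float64
       end

julia> Point(1.0, 2.0)
Point(1.0, 2.0)
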
Has anyone tried the `--trim` option? I wonder how well it works in "real life".
I've tried it on some of my Julia code. The lack of support for dynamic dispatch severely limits the use of existing libraries. I spent a couple of days pruning out dependencies that caused problems before hitting some that I decided would be more effort to re-implement than I wanted to spend.

So for now we will continue rewriting code that needs to run on small systems rather than deploying the entire Julia environment, but I am excited about the progress that has been made in creating standalone executables, and I can't wait to see what the next release holds.
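
To make the dynamic-dispatch limitation concrete, here is a rough, illustrative sketch (invented types, not our real code) of the pattern that gives `--trim` trouble versus one that resolves statically:

abstract type Shape end
struct Circle <: Shape; r::Float64; end
struct Square <: Shape; s::Float64; end
area(c::Circle) = π * c.r^2
area(s::Square) = s.s^2

# The element type is abstract, so each `area` call is resolved at run time;
# this is the kind of dynamic dispatch that trimming struggles with.
total_area(shapes::Vector{Shape}) = sum(area, shapes)

# With a concrete element type, every call resolves statically.
total_area(shapes::Vector{Circle}) = sum(area, shapes)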

I think Julia missed the boat with Python totally dominating the AI area.

Which is a shame, because now Python has all the same problems with long startup times. On my computer, it takes almost 15 seconds just to import all the machine-learning libraries. And I have to do that on every app relaunch.