Rust vs. Julia in scientific computing (mo8it.com)
88 points by EvgeniyZh on July 24, 2023 | 102 comments


As a scientist who codes in both Rust and Julia, I don't think Rust is anywhere near as good as Julia for scientific computing (as I elaborate in the posts on the Julia Discourse about this blog post, linked below).

Briefly, for scientific computing, rapid iteration, prototyping and redesign is one of the main requirements. So is "open internals", i.e. generic code and/or the ability to open up someone else's code and reuse parts. Speed is somewhere further down the list - which is why Python (and previously, Perl) are the dominant languages, not C++ or Rust.

Julia excels at all of these and as such is nearly perfect for science, whereas Rust is unusually bad at the "prototyping and redesign" part. Like, you change the ownership model of some struct, or some single type, and now you have to unravel half your code base to change type signatures and fix borrow checker woes. It's also just slow to write.
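
A toy illustration of the kind of ripple I mean (hypothetical types, not from any real project): change one field from owned to borrowed and a lifetime parameter spreads into every struct and signature that touches it.

    // Before: self-contained, no lifetimes anywhere.
    struct Sample { label: String, values: Vec<f64> }

    fn mean(s: &Sample) -> f64 {
        s.values.iter().sum::<f64>() / s.values.len() as f64
    }

    // After: `label` becomes a borrow to avoid a copy, and now a lifetime
    // parameter has to be threaded through every struct and fn that holds one.
    struct SampleRef<'a> { label: &'a str, values: Vec<f64> }

    fn mean_ref(s: &SampleRef<'_>) -> f64 {
        s.values.iter().sum::<f64>() / s.values.len() as f64
    }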

Rust is nice for scientific code that is more "application-like", such as command-line tools whose task is very well-defined and mature, as well as for large-scale scientific applications. It's also nice for very low-level code that needs to be close to the metal (e.g. scientific code running without an OS inside scientific instruments). But in most scientific use cases, dynamic languages are just way better, as they have been for the last 30 years, and Rust doesn't change that, nice as it is.

1. https://discourse.julialang.org/t/blog-post-rust-vs-julia-in...

2. https://discourse.julialang.org/t/blog-post-rust-vs-julia-in...

3. https://discourse.julialang.org/t/blog-post-rust-vs-julia-in...


As an aside: I do a lot of numerical research and have mainly been using python with Jax due to how trivial it is to run my code on the GPU with Jax. However, I’ve gotten increasingly fed up with Python as a language and Julia is basically my ideal language, feature wise. Do you think it’s worth the switch and, if so, is it easy to run stuff on the GPU these days? Last I checked it still required some care to do that.


I think it depends on a few things. Python is light years ahead of Julia when it comes to machine learning, so if your GPU code is that, don't pick Julia yet. Do you need a lot of niche libraries? Then maybe not: Julia has many fewer packages than Python. Also, what is your tolerance for "rough edges"? Despite Julia being 10 years old at this point, I feel like I run into rough edges in Julia quite often: underspecified behaviour of functions in edge cases, a compiler whose optimisations are somewhat unreliable at producing the most efficient code, and some language nice-to-haves that are conspicuously missing, such as a good linter.

Nonetheless, I've found it to be better than Python for my use cases. I also hear that Julia's GPU capabilities are excellent


I think the author misses one of the most important points: how easy/fast is it to write a program?

Julia is dedicated to scientific computing, where it is very normal to start out with solving a simple problem to verify that your approach can work. That code is usually very bad, because all it needs to do is function. The goal of Julia is to:

- Make writing that first prototype very easy

- Use that prototype to create a fast version, without reimplementing everything

That is what rust has to be compared to. And while I absolutely do like what rust is, it is not a language that lends itself to quick prototyping. For the prototype you don't care that your program might crash with the wrong parameter or that the generated matrix is not invertible or that you access an array out of bounds or ... By any measure rust is a complex language, which takes significant understanding to write and requires the programmer to engage with the code a lot.


Exactly. There is no need to shoehorn rust into absolutely everything. Right tools for the job.


> it is not a language that lends itself to quick prototyping

Maybe it depends on your perspective, or maybe it is simply personal preference, but I find rust pleasant and quick to prototype in.


> Maybe it depends on your perspective, or maybe it is simply personal preference, but I find rust pleasant and quick to prototype in.

Or, perhaps more importantly, on use case.

It’s hard for me to imagine (though certainly possible, I suppose) someone finding prototyping e.g. sensor processing algorithms in Rust more enjoyable than Julia, or faster for that matter.


It's one of the things that annoys me the most about these discussions. The best language to prototype in is the one you know the most. For me and probably for you too, that's rust. So it's trivial for me to start a new project and have everything I need very quickly. With pretty much any other language I need to relearn some syntax or APIs, which is a massive waste of time when prototyping.


I'm not here to doubt your comfort in rust, but I do think that prototyping scientific software involves more than just iterating on code. Personally, my development process is just different for that kind of code because it's easier and faster to debug through plots than traditional logging/breakpoints/whatever. I feel more proficient in c++ than python, but I will still prototype in python. Half the reason for this is that I can set a breakpoint anywhere in the program, and produce an interactive plot in about 3 seconds. Julia has a similar advantage here. Maybe there are tricks I don't know, but it's a massive pain to do this in languages like rust and c++.


I think so as well. And it's quite easy to go from prototype to production by starting from removing unwraps.
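
A minimal sketch of what I mean, with a made-up helper (the prototype panics on bad input; the hardened version propagates errors with `?` instead):

    use std::error::Error;
    use std::fs;

    // Prototype: fine for a quick experiment, panics on malformed input.
    fn read_count_prototype(path: &str) -> usize {
        fs::read_to_string(path).unwrap().trim().parse().unwrap()
    }

    // Production: same logic, but errors bubble up to the caller via `?`.
    fn read_count(path: &str) -> Result<usize, Box<dyn Error>> {
        Ok(fs::read_to_string(path)?.trim().parse()?)
    }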

To be fair I'm not familiar with Julia so I'm not making any comparisons.


This is quite possibly the first good language comparison I've seen on HN. Good job!

The upshot:

1. If you need to eke out that last extra drop of performance, and can't deal with any mistakes, Rust is good because it makes it very difficult to write slow code. Julia makes it easy to accidentally write slow code, because Julia needs to be easy to write.

2. Julia's greater permissiveness means the compiler can't catch as many bugs as Rust's (although it still catches far more than Python's static analysis tools or C, where every other line has some undefined behavior).

3. If you need interactive+dynamic code, Julia is your best bet. It's a lot like Python in this regard, but with a much better UX in the form of a better REPL, package management, etc. Well-written Julia code will also be just as fast as Rust or C++, meaning you can use it for high-performance computing.

4. If you want a scientific/machine learning ecosystem, go with Julia.

I think this post is strongest in hitting Julia where its problems are -- Julia can be a bit too permissive/promiscuous with letting you do dumb things.

Where it's weakest is letting Rust off the hook a bit too easy. Rust is good at making it hard to write bad code, but the cost is it makes it hard to write code in general, unless you handle every detail that might slow your code down. You're not going to get users happily writing Rust when they just want to make a plot. It takes about 5-10 lines of Rust to create one line of Julia code.


On 1, not quite, because Julia makes it way easier to offload a computationally demanding inner loop to the GPU, which is a big win for a lot of simple scientific programs.


> It's a lot like Python in this regard, but with a much better UX in the form of a better REPL, package management, etc.

Personally, I have found the package management to be somewhat more difficult to navigate than Python's (which I also have little love for...). But perhaps this is a learning curve thing...


Strange headline. Compiled languages can't compete for scientific computing except as an engine language for some calculator (a ray tracer, a finite element solver, etc). Model setup and analysis will always be done in some kind of interactive scripting language, and these days it will be done in a notebook interface. So you're right back at the two language problem.


Julia's explicit goal is being a "scripting language" with extremely high performance able to run the "engine" code at near native speed. You can also use it in a notebook.


This is honestly very 2010s brained. With modern LLVM-based compiled languages, we really can have our cake and eat it too in terms of doing most if not all of the productive things scripting languages will do, without the slowness of said scripting languages. Name one thing that Python can do that Rust/Julia/Crystal/etc can't (other than statements about the availability of some package), that researchers _need_ to have, that is also a _good idea and not a footgun_...


> other than statements about the availability of some package

But that is probably the most important thing. I don't have time to re-write 40+ years of code in a new language just because it's trendy or safer or whatever. I have new science to do!

In some ways science suffers because of this, but it is also nice to have a relatively common ecosystem and institutional knowledge that doesn't change every 3 years.


Right, but we're comparing languages, not their ecosystems. It's not a language's fault if the community hasn't yet realized it is better for some purpose and overcome the market effects of tons of existing packages already being in the popular language by migrating them to the less popular language. This is simply using inefficiency as a justification for subsequent and ongoing inefficiency. The whole point of an argument like this is to compel people to put effort into porting things out of a sub-optimal ecosystem.

In the short term it is of course convenient to just use the most popular thing always (in which case we'd all still be using PHP and Flash), but eventually things do switch on a large scale, and it benefits all of us to push this along when we can.

The more we double-down on an inherently sub-optimal ecosystem, the more we are trapped by it.

Imagine if all the effort making Python usable over the last decade had instead been spent on giving a compiled language better dev UX and GPU support...

I'm not even a fan of Julia, but I probably would be if it had received the same attention Python has for the last ten years.

Conversely, I'm _still_ not a fan of Python, even after all this time and effort has been expended, because the foundation being built upon is just simply a bad one for high performance and high security domains. Anything they manage to get working is akin to a hack, and is working in spite of the language in which it is built.


Comparing languages requires comparing their ecosystem in 99% of cases, since most people don't have the luxury of having enough time to reimplement whatever they need from scratch.


Right!

It's one thing to compare languages in the abstract, e.g. to compare and contrast iteration or exceptions or whatever. It's a different thing to compare languages for doing something specific -- in this case, scientific computing. If someone wants to get something done in a realistic (or affordable) time frame, then things like libraries, community, and documentation become especially important.


If we take Python as an example, how many man-centuries have been spent trying to speed it up enough so that it becomes usable for something it was never meant to be used for? If 'new science' meant writing a small script to automate something instead of writing it in bash, then yes, I think this language is probably the best language for the task.


Right, but how many of those man-centuries could have been better spent just writing X in a suitable language where you get that for free, and how many subsequent man-centuries will be wasted before everyone switches? Imagine where the AI community would be today if everything had been ported the moment one of these languages became stable and generally available: we'd now be several years into iterating on things in a more suitable environment, and that environment would in turn be that much more advanced by this point.


There are two problems with this thinking.

1.) The rewrite never seems to have all the features of the original (for many reasons), so you end up keeping the original around because science can be very niche. Now you have two or more packages to deal with.

2.) There is always a better language. Science (or at least parts of it) has switched before - from Fortran to C to C++ to Python. Some of the gains have materialized (some safety, performance, borrowing stuff from outside science). But it has come at a cost (language fragmentation, and also packaging is an absolute shit show right now, partly because of #1).

But I'm sure the next batch of languages will finally solve all our problems once and for all, and we will never have to switch again.

(In general, I am talking about non-AI type science. I am a computational chemist, and our code really does date back > 40 years at times. That is not always a bad thing).


> That is not always a bad thing

Sure and that's reasonable, but in the cases where performance/throughput/training time/RAM usage is the main limiting factor of a field, suddenly this stuff matters a lot. There's plenty of useful code written in COBOL and Basic from back in the day, but you don't see people using those things to train LLMs

Imagine if instead of letting Basic effectively die, we had improved it with a myriad of extensions to the point where you can run LLMs and GPU code and things in a performant-ish way on it. Now replace the word Basic with Python


How do you, or "you" as the industry, decide that new code and new techniques are worth moving away from existing code?


> Name one thing that Python can do that Rust/Julia/Crystal/etc can't

This is honestly very 1990s brained, based on the idea that compute power is expensive and constrained.

All Turing complete languages can theoretically do the same things. That doesn't make them equally useful across all domains.

As long as developer hours are much cheaper than compute hours, we're going to be doing certain types of work in high level scripting languages.


Compute power can directly influence developer hours. Developers/researchers spend a lot of time twiddling their thumbs waiting for a simulation/calculation to finish and plots to render. It directly costs time, and it also messes with your focus and progress in general.


Julia is a high-level scripting language; it's just also fast. That speed matters: even if your code only takes 1 second to run, making it 100x faster opens the door to doing all sorts of things you wouldn't have done before (e.g. fancy parameter optimization).


> This is honesty very 1990's brained, based on the idea that compute power is expensive and constrained.

This is honestly very 2015 brained. Compute power is back to being more expensive and constrained than developer hours. :)


This is very 2023 brained tbh


Just so :)


Turing computability makes no guarantees about constant factors affecting execution time. If I say "do X in less than N nanoseconds", some languages can, and some cannot, regardless of what hardware you run on. If our bar for usable is Turing computable, then yes, by all means, let's re-implement AI systems using CSS media filters or TypeScript's type system.


Computer power is expensive and constrained for many scientists...


There's no mention of distribution, which is a huge advantage for rust since it compiles to a static binary.

There are many applications where the algorithms are written by a specialist and then compiled into a static binary for use in another application or UI. This is why Fortran, C, and C++ are so popular in the scientific computing field for writing numerical algorithms. It's not just about speed; there is a clear separation of concerns here.
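
A rough sketch of that pattern in Rust (hypothetical routine; assumes `crate-type = ["staticlib", "cdylib"]` in Cargo.toml): the numerical kernel is exposed with a C ABI so a host application or UI can simply link against the compiled library.

    /// Dot product with a C ABI, callable from C, Fortran, Python (ctypes), etc.
    #[no_mangle]
    pub extern "C" fn dot(x: *const f64, y: *const f64, n: usize) -> f64 {
        // Safety: the caller must pass valid pointers to at least n elements each.
        let (x, y) = unsafe {
            (std::slice::from_raw_parts(x, n), std::slice::from_raw_parts(y, n))
        };
        x.iter().zip(y).map(|(a, b)| a * b).sum()
    }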

Julia is still far behind here as you can't compile to a GC free static binary yet. There is PackageCompiler.jl but it appears it makes large binaries.

Scientific analysis and plotting is just one aspect, ultimately your algorithms would have to be distributed for use by others.


Hardly relevant to a community used to R, Python, Mathematica and MATLAB.

Fortran, C and C++ are popular because of the way they let you exploit HPC infrastructure, which Rust is still years away from supporting, and where Julia is already ahead.

MPI, SIMD, OpenMP, NUMA algorithms across the computation cluster.


Plus vendor compilers often outperform GCC and LLVM… on Cray systems especially


Can Julia take advantage of that?


In theory, but it’s usually specific optimisations that work well on HPC codes that they’d need to replicate at the LLVM level: https://www.umr-cnrm.fr/aladin/IMG/pdf/compilers_elsinore_re...


Awesome, thanks for the link!


Cray is pushing their own language as well, Chapel.

https://chapel-lang.org/

As for Julia on Cray,

"Julia — The Newest Petaflop Family Language We Have Started to Love"

https://www.avenga.com/magazine/julia-programming-language

> Julia is one of the few languages that are in the so-called PetaFlop family; the other languages are C, C++ and Fortran. It achieved 1.54 petaflops with 1.3 million threads on the Cray XC40 supercomputer.


But specifically because of the proprietary compiler? Or is it just calling Fortran / C / C++ libraries that were compiled with their compiler?


The program is Celeste: https://juliahub.com/case-studies/celeste/index.html

It seems to be pure Julia.

Julia uses the LLVM compiler; I’m pretty sure that’s the only platform, aside from experiments compiling to WASM.


My understanding is that the Julia Petaflops run executed a Julia program per node, communicating via MPI. For some, that's probably obvious/expected for HPC; for others, it might not be considered "pure Julia".


That's how it works in any language on a supercomputer. MPI is pretty much the only game in town for inter-node communication.


@cbkeller: Though MPI is dominant in HPC by a very large margin, it's definitely not the only game in town. SHMEM is an MPI alternative with a smaller but very dedicated following. UPC, Fortran 2008, UPC++, and Chapel are all alternatives that support inter-node communication without relying on MPI or explicit library communication calls. Chapel has the additional advantage of not imposing the SPMD programming model on the user and supporting asynchronous dynamic tasking.

It's my understanding that Julia aspires to join this group of languages if it is able to do so, which is why the Petaflops announcement was originally enticing to me, and then became somewhat less so once I learned that it was relying on MPI.


The compiler is free and open source, not proprietary. It is built on LLVM, which is also FOSS.

And Julia code is not Fortran/C/C++, not sure what you are asking.


The comment I replied to said: "Plus vendor compilers often outperform GCC and LLVM… on Cray systems especially".

I said "Can Julia take advantage of that?". So the "that" in my question was "vendor compilers [which] often outperform GCC and LLVM... on Cray systems especially".

It seems like the answer is, no, Julia cannot be compiled with those vendor compilers that often outperform LLVM.

But it seems that Julia nonetheless has been shown to perform well, just not specifically by using those vendor compilers that I was asking about.


Is there something like OpenMP for rust? (there must be...) But I imagine MPI is probably not a great fit...


There's rayon, which in my opinion makes writing multi-threaded code even simpler than OpenMP. It's also nice that it doesn't rely on any compiler feature. It's just a library.
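
For example, a parallel map-reduce is mostly a matter of swapping `iter()` for `par_iter()` (a rough sketch with made-up data):

    use rayon::prelude::*;

    fn main() {
        let xs: Vec<f64> = (1..=1_000_000).map(|i| i as f64).collect();
        // The closure runs on rayon's work-stealing thread pool;
        // no pragmas or special compiler flags needed.
        let norm: f64 = xs.par_iter().map(|x| x * x).sum::<f64>().sqrt();
        println!("{norm}");
    }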



> There's no mention of distribution, which is a huge advantage for rust since it compiles to a static binary.

Frustrating article. Not only does it levy unfair criticism against Julia, it misses on one obvious legitimate complaint.

Rust can be compiled today, no caveats.


Yes. I work on a project that calls into Julia from Python, and the dynamic packaging has been (IMO) a nightmare. It would be so excellent if I had a static library to link against instead!


C and fortran win here because of ABI... (and C++ sorta...)


I think Julia solves the two-language problem within a range of project complexity. If I am writing a simple machine learning model to classify MNIST digits, Python and its libraries are still the easiest way to go (although Julia is fine too). If I'm writing a large language model and processing large amounts of data, perhaps with some fancy plotting downstream, Julia makes more sense. If you're building a large, stable production application then you still might be more inclined to use Rust or C/C++ or the like.


Weird opening example. Trying to multi-thread a loop and then forcing everything through an atomic is worse than pointless: the atomic will effectively make it single-threaded anyway except now you have worse performance from cache non-locality, branch mis-prediction, and the overhead of using atomics. I get that it's just an example to show how the rust compiler can catch data-races but the section concludes "Moore's law is almost dead, at least for single core performance. Therefore, we need a language that makes concurrency not only easy, but also correct" without mentioning that many algorithms and procedures cannot be parallelized and that trying to force them to be will actually just make things worse.
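
For what it's worth, the idiomatic fix for that specific example is not a shared atomic at all but a parallel reduction, where each worker accumulates locally and rayon merges the partial results (a sketch, not the article's code):

    use rayon::prelude::*;

    fn main() {
        // Each worker counts its own chunk; rayon merges the partial counts,
        // so there is no shared counter for the threads to serialize on.
        let count: u64 = (0..10_000).into_par_iter().map(|_| 1u64).sum();
        println!("count {count}");
    }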


Dumb question: how is Julia better than Fortran to write numerical models (for example, finite element simulations) ?

In the lab where I work there are plenty of people really good at the physics/maths but not that good at writing code. Since they need performance to run their stuff, they often go to Fortran (because that's what has always been used here). How much easier would Julia make their lives?

(I'm the computer guy in the lab: I optimize their stuff when they need it but I'm not in the mindset of "model first, code quality after", I'm much more a rustacean :-))


Both Fortran and Julia have a concise syntax for array operations, and both, with a little care, can produce fast programs. Unless you want to program in C or C++, those are your only options¹ for high-level languages that have already proven themselves in high performance computing.

I wouldn’t advise rewriting a substantial Fortran project in Julia; but for new projects, Julia is the better choice. Fortran is excellent in the scientific/numerical high-performance niche, but it’s still awkward to move beyond that; it’s still stuck in the Formula Translator role. So the non-numerical parts of the program, those that deal with input/output, for example, are awkward; Julia is more pleasant for these tasks.

Julia lets you build your program interactively, by trying things out in the REPL. This alone may be sufficient reason to prefer it over Fortran.

Although most scientists won’t become sophisticated as programmers, for those that will, Julia can grow with them. It has extensive support for metaprogramming (for example), including real macros. Its type/dispatch system, and other features, allow you to write far more concise and better organized code than you can in Fortran. Sharing code and composing libraries² is easier in Julia.

Really, Julia is just more fun to program in.

1 HPCWire (2017) ‘Julia Joins Petaflop Club’. Available from: https://cacm.acm.org/news/221003-julia-joins-petaflop-club/f...

2 Phillips, Lee (2020) ‘The Unreasonable Effectiveness of the Julia Programming Language’. Ars Technica. Available from: https://arstechnica.com/science/2020/10/the-unreasonable-eff...


Except for the fact that your Julia code will always suffer from correctness problems, something that would rarely if ever happen in Fortran. https://yuri.is/not-julia/


No, it probably won’t, and certainly not “always”. A majority of the issues raised in that article concern errors in packages that have been fixed, most rather a long time ago.


That "always" holds for the Julia language almost surely, as long as interfaces are missing from the language. It is just a matter of time before newer issues are found.


My Julia programs do not “always suffer from correctness problems”; therefore you are incorrect.

More generally, saying that programs written in a language that happens not to include a feature you think is important (in this case, interfaces) will “always suffer from correctness problems” doesn’t seem to be a serious point of view.


All Julia codes are arbitrarily extensible. Any Julia code can always be readily extended to silently yield incorrect results. That is the whole point of the blog post shared above. Justifying the indefensible is different from wanting or liking it.


I’m afraid I don’t understand anything in this comment.


Thanks for this answer. Looks very balanced to me.


It's actually very easy to make Fortran bindings for Julia, where you can call your Fortran code from Julia.

https://docs.julialang.org/en/v1/manual/calling-c-and-fortra...

So unless you think the Julia version will compile to faster code than the Fortran code (probably not if you use a fancy paid license Fortran compiler), then definitely don't rewrite.


Might be (significantly) easier to run idiomatic Julia on GPU(s)? (Unsure of current Fortran here).


The fact that Rust boosters need to write blogs about why we should all use Rust for use case XYZ is an automatic tell that Rust isn't that great for use case XYZ.

Python didn't become the lingua franca of data science and analytics because people on Reddit, Hacker News, the blogosphere etc. pushed Python and wrote persuasive pieces on why it was so awesome (there was some of this, but absolutely nothing compared to the cult of Rust). It succeeded because it turned out to be a very good tool for that particular job.

I don't know about others on HN, but the constant push for "Write everything in Rust" has really turned me off from the language. I know it's petty, but I now tend to associate Rust with being nagged and harangued.


People did make exactly that kind of claim about Python. The creator of this very website wrote exactly that, in 2004: http://www.paulgraham.com/pypar.html It is one of the most famous and popular things he has ever written.

It turns out that people write posts about things they like. Sometimes they like things for good reasons, sometimes they like things for bad reasons. But that has no direct bearing on how good a tool is.


I really don't relate to this. When I was in high school and college and for the first few years of my career, Python was not "a very good tool for that particular job". A number of the Ruby programmers I first knew and respected were using Ruby for doing computational / scientific computing with more iteration and velocity than they could achieve with Fortran, C, or C++ (the common scientific languages at the time). There was no reason back then to think that Python as a language was any better suited for this purpose than Ruby or Perl or any other language really. (And what we now call data scientists were already mostly doing this with R.)

It was indeed through a significant amount of writing and advocacy and persuasion that Python - through numpy and scipy and pandas - won out here.

For a time it looked like it was going to fade, as it was too slow and all the "big" tooling was built on the jvm instead. But it got faster and people made more interfaces to underlying faster components, and people did more writing and advocacy and persuasion, and now it's mostly python still, with a bunch of native or jvm code under the hood sometimes.

But none of this happened just through python being clearly better for this niche. People always write and discuss and advocate for the tools they like. It's normal. It's same as it ever was.


Until now I've been convinced that people don't code in Rust for higher performance; it's a cult, and people use it to look cool. Rust is not some kind of magic wand that can zap your 100k lines of C/C++ code into 100 lines of code with the same speed. If you want high performance, it's not just the language that has to be fast, it's the way you implement your idea.

People are always talking shit about things to make others think they are smart, like the people around me who think they are Larry Wall and bash Java for being slow, but don't know that a lot of important, performant things are coded in Java.

If you want raw performance, use C/C++. If you want simplicity, then use Python. If you want reliability and safety, then use Java/Go.


Well said. Now replace "Rust" with "Julia" in your first sentence for a moment of enlightenment for everyone.


Fair enough.


Julia is fast, but the community is too small. I once had a bug where I couldn't modify a column in DataFrames.jl. No response on SO or GitHub, just a bunch of devs who had the same problem as me. Dropped the project and switched to Python. Rust is not beginner friendly for creating a high-performance program, but somehow it became a cult, and coding in Rust is some kind of flexing, like finishing a Dark Souls game.

There must be a reason why C, Python (and Erlang/Elixir) are popular.


If you’re inclined to be informed by this you might also want to check the corrections and author’s concessions at the Julia discourse: https://discourse.julialang.org/


That would seem to be this post, specifically https://discourse.julialang.org/t/blog-post-rust-vs-julia-in...


The thread is very long. Any impressions from it that you're willing to summarize? (Recognizing that they will be your own perspective, of course.)


I think the consensus response from the Julia community is that the author is spot-on with a key point: that Rust is exceptional when you need/want static analysis. And Julia isn't great there.

This doesn't mean, however, that Julia doesn't solve the 2-language problem. There are large classes of problems and applications — especially in scientific computing — where a complete static analysis isn't required to move into production. It is definitely helpful for maintenance and security (and a requirement in some industries), and Julia's static analysis tooling is continuing to improve... especially with ongoing investments from key companies pushing us towards this goal.

Rust is a great language when you know exactly what you want to write. When you know exactly what the inputs are and what the outputs are and what the algorithm is. And it's really painful when you're not sure. Pick the right tool for the right job!


Thank you :)


> If it compiles, it is data-race free

False:

    use rayon::prelude::*;
    use std::sync::atomic::{AtomicUsize, Ordering};

    let counter = AtomicUsize::new(0);
    (0..10_000).into_par_iter().for_each(|_| {
        counter.store(counter.load(Ordering::SeqCst) + 1, Ordering::SeqCst);
    });
    println!("counter {}", counter.load(Ordering::SeqCst));
Rustaceans who think the borrow checker is some kind of miracle cure against every type of bug there is get under my skin.


A race condition is not the same as a data race, and nondeterminism is not the same as undefined behavior.


Fine, but in practice a data race is just a special case of a race condition so my point still stands. Most multi-threaded code with races is more like my example than the author's and Rust will not magically prevent such race conditions.


Indeed, I like your example. Usually I go after external resources being accessed from multiple threads, database accesses without proper transaction management, or OS IPC, where the Rust threads don't have any visibility into what the other processes are doing.


As a Rust user, I find the RESF-type no-effort comments very frustrating. Hopefully it dies down in a few years.


I'm still not sold on Julia. Compared to R/Python, the supposed benefit doesn't outweigh the cost of implementation and learning (my perception). I do a lot of "big data" stuff in R/Python, and I just use libraries written in C or C++.


This works fine as long as you only do things for which someone else has written the c/c++ parts you need already.

Really the premise of Julia is to make their life easier, while leaving yours equivalently easy (or better).


This is spot on. I love Julia, however little I have dabbled in it. The history of deep learning would have been slightly different if Julia had taken off as the implementation language. It might have encouraged more hackability and understandability. Wrapping our heads around the PyTorch or JAX source is tough.

But this introspection gave me a reality check. In my own work, as much as I love Julia, I fall back to quickly using Python libs. In industry, most people's focus is on using someone else's already-written code. The byproduct is inheriting the Python and C/C++ interface mess around dependency management. The people who need to write their own algorithms are mostly academics and researchers, which is where Julia has flourished.

The incentives of corporate work are set up to build on top of Python's ecosystem, which means (sadly) Julia will stay in its niche.


"Just use C++ libraries" is good enough for "simple" operations on "simple" types that have been implemented in C++. If you want C-like performance for generic types without relying on someone having already implemented that in C++, use Julia.


I think you’re missing one of the key points of Julia, that others have also pointed out: you’re reliant on libraries others write in C/C++.

Julia’s proposition is that the entire community should be able to read and contribute to these libraries, unlike the Python community, where the vast majority of users could not read the source behind those libraries, let alone contribute.


Generally this doesn't feel like a problem with Python, due to its massive user base advantage. Meta, Google, Microsoft, and many others contribute tens of billions of dollars annually to Python-ecosystem projects (PyTorch, Tensorflow, faster-cpython). Unless that funding differential changes I can't see Julia competing.


Contributing to libraries may not be a goal for most people


I agree with the observation (most people do not contribute to libraries), but I think the arrow of causality is a little different. Python is used as glue code in scientific computing on top of libraries written in C/C++. The development ergonomics of these languages are intimidating for many people, with both languages having enough footguns. The net result is that no one wants to peek under the hood and see how things are working.

A fresher take on scientific computing like Julia, if it becomes mainstream, might enable more contributions and, in general, better understandability of black-box algorithms.


Well put


Personally I think a lot of the points in this article are pretty decent, but the solution is not Rust but something like LFortran (https://lfortran.org), which has an interactive REPL, can be AOT compiled, has no performance footguns, and is Fortran (a big plus for adoption by scientists).


I like the comparison, but I think Scientist Time is an important missing part of the equation. (Also, disclosure: I wrote my dissertation project in Julia, which made use both of multi-threading and the ability to parallelize easily in an HPC environment.)

Not knowing rust, a lot of this material looks like it would have a high initial investment (to learn rust) and an ongoing variable cost to implement (i.e., figuring out all of these complexities in a particular use case). There is surely a trade-off on accuracy, but everything in science is about trade-offs.

If you can publish three papers working in julia while the rust programmer gets through only one… we know who is going to get tenure.


I'd rather have a language that directly targets the GPU. In fact, I started using WebGPU wgsl and compute shaders for prototyping and it suits me extremely well.


Julia does have really nice GPU support, being able to directly compile Julia code for CUDA, ROCm, Metal, or other accelerators. (Being GPU code, it's limited to a subset of the main language.)


Sorry, do you have an example of using wgsl for prototyping? I'm having trouble imagining something like that, since WebGPU is not exactly terse.


Scientific computing has got to at least be interactive. Julia provides that in spades, but Rust is basically painful.


A great essay that showcases Rust's greatest strength: its ability to eliminate race conditions.


Only between threads, there are plenty of other ones.


Mojo


Not really an option yet



