This is an amazing achievement for the WebRender team, but users should manage their expectations with regard to performance improvements.
The reality is that Firefox's current rendering engine is highly tuned, and switching to a new engine without major performance regressions is impressive. Keep in mind, there may be performance improvements coming down the line.
As a desktop Linux user, I'd guess it's harder to make it work right (because of, among other things, driver issues), and it affects a smaller portion of Firefox's user base. Oh well.
Fortunately it's easy to opt in to, and in my experience works well on an oldish Intel chip.
Unfortunately, in my experience it fails hard with Wayland, putting no pixels on the screen. But I'm using it happily through XWayland.
No, only tried on stable. Looking forward to testing 67 when I get the update.
I should clarify that I am talking about running Firefox with GDK_BACKEND=wayland set, i.e., as a native Wayland client rather than through XWayland.
UPDATE: Ooh! Firefox 67 in my package manager! I am happy to report that I am now able to use Firefox with WebRender on Wayland directly. One glitch so far: It leaves a one-pixel row transparent at the bottom of the screen. Weird :)
Distro shouldn't matter one bit. Just scan for hardware acceleration support and enable it if found. Plenty of other applications do this without any per-distro changes.
Tremendous effort on a very complicated piece of software. I wish Firefox would also catch up on the less complicated but greatly needed features present in other browsers, like filling in credit card info and autocompletion for autocomplete=email input fields.
WebRender has come out of Mozilla's Servo browser engine project, which has had influence on the development of Rust itself. The language and the browser engine have been developed together, so this and other parts of the Quantum project [1] (mostly bringing parts of Servo into Firefox) are like the culmination of all this work, and a validation of the language.
Rust makes it easier to write bug-free concurrent code. WebRender relies on this, and I'm not sure Mozilla could have pulled it off in C++ in the same time frame, if at all.
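To make that concrete, here's a minimal sketch (plain std, nothing WebRender-specific) of the kind of data race the compiler rejects outright, and the sanctioned fix:

    use std::sync::{Arc, Mutex};
    use std::thread;

    fn main() {
        // let mut count = 0;
        // thread::spawn(|| count += 1); // rejected at compile time: the
        // closure borrows `count` mutably and may outlive it
        let count = Arc::new(Mutex::new(0));
        let handles: Vec<_> = (0..4)
            .map(|_| {
                let count = Arc::clone(&count);
                thread::spawn(move || {
                    *count.lock().unwrap() += 1; // exclusive access enforced
                })
            })
            .collect();
        for h in handles {
            h.join().unwrap();
        }
        assert_eq!(*count.lock().unwrap(), 4);
    }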
I think that's pretty presumptuous. My experience is that Rust's checking is great, but comparably easy and safe concurrency can still be achieved in other languages with some work on data structures and program architecture.
Mozilla engineers have specifically said Stylo (Quantum CSS) would be impossible in C++, because they actually tried it in C++. Presumably it'd be the same with WebRender.
Impossible is a strong word. Architecture makes a very big difference. They can claim it is impossible, but it really doesn't make sense. I'm surprised anyone would just take their word for it.
Trying to use raw threads and ad-hoc futures is going to be difficult, but fundamentally concurrency is about separating data by dependencies.
Dependency graphs that pass data around combined with lock free data structures can be used to isolate parts of the program so that dealing with concurrency is one generic part of the program.
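A toy sketch of that shape in Rust, with a plain channel standing in for the lock-free structures (the same pattern works in C++ with any concurrent queue library):

    use std::sync::mpsc;
    use std::thread;

    fn main() {
        // Two stages of a dependency graph: each stage owns its data
        // outright; the only "sharing" is the handoff on the channel.
        let (tx, rx) = mpsc::channel();
        let producer = thread::spawn(move || {
            for i in 0..10 {
                tx.send(i * i).unwrap(); // pass data, don't share it
            }
            // `tx` dropped here, closing the channel
        });
        let consumer = thread::spawn(move || rx.iter().sum::<i32>());
        producer.join().unwrap();
        println!("sum: {}", consumer.join().unwrap());
    }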
EDIT: Parent commenter edited the comment to append "with the same performance as C", so you can ignore my reply.
So are Python, JavaScript, Java, OCaml, Haskell...
The more distinguishing feature of Rust is not just that it is memory safe, but that it achieves this without a GC.
In any case, the presence of a single unsafe block can make the program no longer memory safe. Of course, when unsafe is used sparingly, it vastly reduces the audit surface compared to something like C++.
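A contrived sketch (not from any real codebase) of how a single unsafe block reintroduces exactly the bug class safe Rust rules out:

    fn main() {
        let dangling: &i32;
        {
            let x = 42;
            // Safe Rust rejects `dangling = &x;` here, because `x`
            // doesn't live long enough. Raw pointers opt out:
            let p = &x as *const i32;
            dangling = unsafe { &*p }; // compiles fine, but...
        }
        // ...`dangling` now points at dead stack memory. Reading it is
        // undefined behaviour, even though only one line was unsafe.
        println!("{}", dangling);
    }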
Is GC a negative? I’ve yet to see anyone go “well I would’ve written this in Golang but it’s got GC”. Even that new Windows driver style, where it’s split in two, can have half of it being GC’d, right? Looking at https://en.wikipedia.org/wiki/Windows_Driver_Frameworks at least.
Yes, GC is a negative. There is a huge literature on measures to mitigate its costs; these always involve trading one off against another, and libraries are frequently incompatible with any particular mix of them.
I'm confused as to why someone would even want them to change the codebase - is this just an example of the "newer is better" problem we have here in tech? Is there something glaringly obvious about Rust that would make the effort worthwhile?
Web browsers are almost unique in how hard it is to make them resistant enough to attacks to fully protect the interests of their users. I know that when my local copy of Firefox got a malware infection, I reacted by switching to Chrome. That was about 2.5 years ago, and I have yet to switch back.
(More on why I switched. I like to tweak things, and had gotten used to about 8 or so about:config customizations and a handful of Firefox extensions. The infection added tons of phoney about:config customizations, and it would have been tedious to identify which customizations were mine and which were the infection's. IIRC, if I had had a backup of my customizations unmixed with the phoney ones, I wouldn't have switched. I.e., I basically switched to avoid having to learn how to back up my about:config settings and my choice of extensions. I recall spending some time with a search engine, unsuccessfully trying to learn how.)
TL;DR: Rust's concurrency guarantees made parallelizing CSS styling feasible. They tried it in C++ twice, and failed.
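For context, Stylo gets its parallelism from the Rayon library; a toy sketch of the shape, with a flat Vec standing in for the DOM tree (assumes the rayon crate as a dependency):

    use rayon::prelude::*;

    fn main() {
        let nodes: Vec<u32> = (0..1_000).collect();
        // If the closure tried to mutate shared state without
        // synchronization, this would fail to compile rather than race
        // at runtime -- that's the guarantee in question.
        let styled: Vec<u32> = nodes
            .par_iter()
            .map(|&n| n * 2) // stand-in for computing one node's style
            .collect();
        assert_eq!(styled.len(), nodes.len());
    }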
One of the early justifications for pursuing Rust at all was security; an internal survey of security issues found that over half of them were memory-safety related. At the same time, browsers need speed. Traditionally, you get either speed or memory safety from a language; Rust gives you both.
Because within that codebase there's an untold number of exploitable memory errors which put users at risk. Browsers have critical security issues all the time.
A few of them. It won't stop logic errors (although enums help here) but it will prevent buffer overflows, use-after-free, some cases of type confusion, uninitialised reads, and a few other problems.
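The use-after-free case, for instance, dies at compile time; this minimal example won't build:

    fn main() {
        let v = vec![1, 2, 3];
        let first = &v[0];
        drop(v); // error[E0505]: cannot move out of `v`
                 // because it is borrowed
        println!("{}", first);
    }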
If you look at the CVEs for browsers, I would say a great majority is due to errors that would've been prevented in any memory safe language - not specifically Rust.
However due to the other requirements (like performance) and its adoption at Mozilla, Rust is a real contender.
Exactly. And besides, writing secure software is mostly a development-process thing, not just a question of which programming language you write it in.
But how much of that is in the existing compositor, renderer, or other components that have been or will be replaced by Firefox Quantum components and eventually deprecated and removed?
Here's hoping it is easy to switch off. Firefox has become extremely crashy running under Qubes, where GPU access is forbidden, despite my disabling every attempt at shader use I can find. E.g., even turning off "smooth scrolling" made a substantial difference. But it still crashes, even just loading a DDG search results page.
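For reference, the about:config prefs I've toggled so far (names from memory, worth double-checking):

    layers.acceleration.disabled = true    (block GPU compositing)
    gfx.webrender.all = false              (keep WebRender off)
    general.smoothScroll = false           (the smooth-scrolling one)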
Advice welcome. (No, abandoning Qubes is not an option.)