Graphics Team ships WebRender MVP (mozillagfx.wordpress.com)
139 points by jandem on May 21, 2019 | 50 comments


This is an amazing achievement for the WebRender team, but users should manage their expectations with regard to performance improvements.

The reality is that Firefox's current rendering engine is highly tuned, and switching to a new engine without major performance regressions is impressive. Keep in mind, there may be performance improvements coming down the line.


Shipping for Linux is waaay down the list:

https://github.com/orgs/FirefoxGraphics/projects/1

I assume due to driver issues?


As a desktop Linux user, I'd guess it's harder to make it work right (because of, among other things, driver issues) and it affects a smaller portion of the user base of Firefox. Oh well.

Fortunately it's easy to opt in to, and in my experience works well on an oldish Intel chip.

Unfortunately, it fails hard with Wayland, putting no pixels on the screen. (In my experience) But I'm using it happily through XWayland.


> Unfortunately, it fails hard with Wayland, putting no pixels on the screen. (In my experience)

It works for me in nightly, at least. Have you tried there?


No, only tried on stable. Looking forward to testing 67 when I get the update.

I should clarify that I am talking about the situation when running Firefox with GDK_BACKEND=wayland.

UPDATE: Ooh! Firefox 67 in my package manager! I am happy to report that I am now able to use Firefox with WebRender on Wayland directly. One glitch so far: It leaves a one-pixel row transparent at the bottom of the screen. Weird :)


Nightly user with a Radeon, running Sway on Wayland and Firefox with Webrender. Been working quite nicely for some months now already.


I tried it for a while and it worked fine, but I did notice that it significantly increased idle CPU load, enough to affect battery life.


FWIW, WebRender is enabled by default on Nightly on Linux with a subset of modern Intel hardware.


Seeing as Linux Firefox still doesn't use any hardware acceleration, it's probably because there's no developer bandwidth for it.


It doesn't use it by default, but you can manually enable it. It's the only way I could watch twitch streams on a 1440p monitor at 60 FPS.


There are plenty of developers using Linux, though. Time for someone to jump in?


It's not only that. There's Ubuntu Linux/Fedora Linux/Arch Linux, there's Wayland and a whole slew of compositors...


As always, the solution would be to not attempt to support "Linux", but to pick one distribution (or a manageable set) to support officially.


Distro shouldn't matter one bit. Just scan for hardware acceleration support and enable it if found. Plenty of other applications do this without any per-distro changes.


So Windows 10 + Nvidia GPU is only 4%? Huh. I would have thought this demographic would be larger.

I guess most users are on laptops where integrated Intel GPUs are more common.


I'm sorta in that 4%: I have an RTX 2080 I use for gaming on my home desktop, but I'm only in Win10 for gaming.

Hopefully I'll be able to use the RTX with Firefox and Fedora down the line though (stably).


Tremendous effort on a very complicated piece of software. I wish Firefox would also catch up on less complicated but greatly needed features that are present in other browsers, like filling out credit card info and autocompletion for autocomplete=email input fields.



The article forgot to mention the important fact that it's written in Rust!


Naive question: why is it important, compared to the developed features and new architecture?


WebRender has come out of Mozilla's Servo browser engine project, which has had influence on the development of Rust itself. The language and the browser engine have been developed together, so this and other parts of the Quantum project [1] (mostly bringing parts of Servo into Firefox) are like the culmination of all this work, and a validation of the language.

[1] https://wiki.mozilla.org/Quantum


Because development of this particular project has been a large driving force of Mozilla's interaction with Rust.


    <rust-evangelism-strike-force>
        FEARLESS CONCURRENCY
    </rust-evangelism-strike-force>
Rust makes it easier to write bug-free concurrent code. WebRender relies on this, and I'm not sure Mozilla could have pulled it off in C++ in the same time frame, if at all.
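
To make "fearless concurrency" concrete, here's a minimal generic sketch (plain Rust, nothing to do with WebRender's actual code): sharing a counter across threads simply doesn't compile unless you state how it's shared (Arc) and how it's synchronized (Mutex), so the data race is ruled out before the program ever runs.

    use std::sync::{Arc, Mutex};
    use std::thread;

    fn main() {
        // The type system forces us to say how the counter is shared (Arc)
        // and how it is synchronized (Mutex); drop either piece and this
        // becomes a compile error instead of a latent data race.
        let counter = Arc::new(Mutex::new(0u32));

        let handles: Vec<_> = (0..8)
            .map(|_| {
                let counter = Arc::clone(&counter);
                thread::spawn(move || {
                    *counter.lock().unwrap() += 1;
                })
            })
            .collect();

        for handle in handles {
            handle.join().unwrap();
        }

        assert_eq!(*counter.lock().unwrap(), 8);
    }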


I think that's pretty presumptuous. My experience is that Rust's checking is great, but much easier and safer concurrency can still be achieved in other languages with some work on data structures and program architecture.


Mozilla engineers have specifically said Stylo (Quantum Style, whatever) would be impossible in C++, because they actually tried it in C++. Presumably it'd be the same with WebRender.


Impossible is a strong word. Architecture makes a very big difference. They can claim it is impossible, but it really doesn't make sense. I'm surprised anyone would just take their word for it.

Trying to use raw threads and ad-hoc futures is going to be difficult, but fundamentally concurrency is about separating data by dependencies.

Dependency graphs that pass data around combined with lock free data structures can be used to isolate parts of the program so that dealing with concurrency is one generic part of the program.
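
For what that looks like in practice, here's a toy sketch of that pipeline style (written in Rust for brevity, though the same shape is expressible in C++): each stage owns its data outright, the only point of contact is a channel, and dealing with concurrency is confined to that one generic mechanism.

    use std::sync::mpsc;
    use std::thread;

    // Toy two-stage pipeline: a "parse" stage feeds a "layout" stage.
    // Each stage owns its data exclusively; the only sharing is the
    // channel, so no locks around shared state are needed.
    fn main() {
        let (parsed_tx, parsed_rx) = mpsc::channel::<String>();
        let (laid_out_tx, laid_out_rx) = mpsc::channel::<usize>();

        let parser = thread::spawn(move || {
            for doc in ["<p>hi</p>", "<p>hello</p>"] {
                parsed_tx.send(doc.to_string()).unwrap();
            }
            // Dropping parsed_tx here closes the channel and ends the next stage.
        });

        let layout = thread::spawn(move || {
            for doc in parsed_rx {
                // Stand-in for real work: "layout" is just measuring length.
                laid_out_tx.send(doc.len()).unwrap();
            }
        });

        for width in laid_out_rx {
            println!("laid out with width {width}");
        }

        parser.join().unwrap();
        layout.join().unwrap();
    }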


Since Rust is a memory safe language with nearly the same performance as C


EDIT: Parent commenter edited the comment to append "with the same performance as C", so you can ignore my reply.

So is Python, JavaScript, Java, OCaml, Haskell...

A more distinguishing feature of Rust is not only that it is memory safe but also that it doesn’t have a GC.

In any case, the presence of a single unsafe block can make the program no longer memory safe. Of course, if unsafe is used appropriately, Rust still vastly reduces the audit surface compared to something like C++.
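
A toy illustration of the audit-surface point (generic code, not anything from Firefox): the single unsafe block below is the only place a reviewer has to verify the bounds reasoning; everything outside it stays in the safe subset and can't cause memory unsafety on its own.

    /// Sums every other element of `data`.
    fn sum_every_other(data: &[u32]) -> u32 {
        let mut total = 0;
        let mut i = 0;
        while i < data.len() {
            // SAFETY: the loop condition guarantees `i < data.len()`,
            // so the unchecked access cannot go out of bounds. If this
            // reasoning were wrong, the bug would live here and only here.
            total += unsafe { *data.get_unchecked(i) };
            i += 2;
        }
        total
    }

    fn main() {
        assert_eq!(sum_every_other(&[1, 2, 3, 4, 5]), 9);
    }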


None of those have nearly the same performance as C.


GP edited the comment, so now my reply looks silly.


OCaml can have nearly the same performance as C.


Is GC a negative? I’ve yet to see anyone go “well I would’ve written this in Golang but it’s got GC”. Even that new Windows driver style, where it’s split in two, can have half of it being GC’d, right? Looking at https://en.wikipedia.org/wiki/Windows_Driver_Frameworks at least.


> I’ve yet to see anyone go “well I would’ve written this in Golang but it’s got GC”.

This is a dealbreaker in a lot of contexts where garbage collection pauses would be unacceptable.


Yes, GC is a negative. There is a huge literature on measures to mitigate its negatives. These always involve trading off one against another, and libraries are frequently incompatible with any particular mix of them.


A GC is literally the only reason I haven't tried go.


Can Rust target the GPU?


Sort of; it's not trivial, but some people have done it. See https://bheisler.github.io/post/rust-on-the-gpu-with-accel/ for example. I think https://blog.theincredibleholk.org/blog/2012/12/05/compiling... is the earliest one I know of; two and a half years before Rust 1.0!


Hopefully they will rewrite the rest of Firefox in Rust as soon as possible.


... That is a gargantuan task. According to https://www.openhub.net/p/firefox/analyses/latest/languages_... they have over 5.5 million lines of code written in C++.

That is an insane amount of code that would need to be rewritten, rechecked, retested and so on. Not going to happen anytime soon, if ever.


I'm confused as to why someone would even want them to change the codebase - is this just an example of the "newer is better" problem we have here in tech? Is there something glaringly obvious about Rust that would make the effort worthwhile?


Web browsers are almost unique in how hard it is to make them resistant enough to attacks to fully protect the interests of their users. I know that when my local copy of Firefox got a malware infection, I reacted by switching to Chrome. That was about 2.5 years ago, and I have yet to switch back.

(More on why I switched. I like to tweak things, and had gotten used to about 8 or so about:config customizations and a handful of Firefox extensions. The infection added tons of phoney about:config customizations, and it would have been tedious to identify which customizations were mine and which were the infection's. IIRC, if I had had a backup of my customizations unmixed with the phoney ones, I wouldn't have switched. Basically, I switched to avoid having to learn how to back up my about:config settings and my choice of extensions; I recall spending some time with a search engine unsuccessfully trying to learn how.)


For a case study here, see https://www.youtube.com/watch?v=Y6SSTRr2mFU

TL;DR: Rust's concurrency guarantees made parallelizing CSS layout feasible. They tried it in C++ twice, and failed.

One of the early justifications for pursuing Rust at all was security; an internal survey of security issues was done, and over half of them were memory-safety related. At the same time, browsers need speed. Traditionally, you get either speed or memory safety in a language; Rust gives you both.


Because within that codebase there's an untold number of exploitable memory errors which put users at risk. Browsers have critical security issues all the time.


So would Rust remove all the potential exploits?


A few of them. It won't stop logic errors (although enums help here) but it will prevent buffer overflows, use-after-free, some cases of type confusion, uninitialised reads, and a few other problems.
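
A tiny generic illustration of the difference in failure mode (not browser code): the out-of-bounds read that could silently corrupt memory in C or C++ either returns None or panics deterministically, and the use-after-free doesn't compile at all.

    fn main() {
        let buf = vec![1u8, 2, 3, 4];

        // Out-of-bounds access: `get` returns None instead of reading past
        // the allocation, and direct indexing (`buf[10]`) would panic
        // deterministically rather than silently corrupting memory.
        match buf.get(10) {
            Some(byte) => println!("byte: {byte}"),
            None => println!("index 10 is out of bounds, handled safely"),
        }

        // Use-after-free: the lines below do not compile, because `slice`
        // borrows from `buf` and the borrow checker refuses to let `buf`
        // be dropped while that borrow is still live.
        //
        //     let slice = &buf[..2];
        //     drop(buf);                 // error[E0505]: cannot move out of `buf`
        //     println!("{:?}", slice);
        println!("first byte: {}", buf[0]);
    }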


If you look at the CVEs for browsers, I would say the great majority are due to errors that would've been prevented in any memory-safe language - not specifically Rust.

However due to the other requirements (like performance) and its adoption at Mozilla, Rust is a real contender.


Exactly, and besides, writing secure software is also mostly a development-process thing, not just a question of which programming language you write it in.


But how much of that is the existing compositor, renderer, or other components that have been (or will be) replaced by Firefox Quantum components and eventually deprecated and removed?


Here's hoping it is easy to switch off. Firefox has become extremely crashy running under Qubes, where the GPU is forbidden, despite my turning off every attempt at shader use I can find. E.g., even turning off "smooth scrolling" made a substantial difference. But it still crashes, even when just loading a DDG search results page.

Advice welcome. (No, abandoning Qubes is not an option.)


Mozregression - https://mozilla.github.io/mozregression/ should help you figure out where it broke. You can open a bug with the commit you found as "regressed by" - https://wiki.mozilla.org/BMO/UserGuide/BugFields#regressed_b... to help developers track down what they need to change to resolve the issue.



