Not really. All major browsers just use the value the OS gives them, which is usually rounded to a multiple of 0.25 or 0.5; that helps keep integer CSS px values an integer number of device pixels. So you could be off by as much as 12% if you are on a device that rounded down from 1.12 to 1.0.
Also, even if they didn't, there's no standard for what the correct DPI should be for a device; it theoretically should depend on viewing distance, but it's impractical to constantly change the screen DPI depending on how far away the user's eyes are :)
OP could, however, use a better default than 96 DPI for mobile devices. Most of them target ~160.
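Something like this is roughly what I have in mind (a heuristic sketch of my own, not OP's code; the coarse-pointer check is just a stand-in for "probably a phone or tablet"):

```ts
// The platform never exposes physical PPI, so pick a baseline and scale it
// by devicePixelRatio. 96 is the desktop reference; Android's density
// buckets are anchored at 160, hence the mobile default.
const isMobileLike = window.matchMedia("(pointer: coarse)").matches;
const baselinePpi = isMobileLike ? 160 : 96;
const approxPhysicalPpi = baselinePpi * window.devicePixelRatio; // a guess, nothing more
```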
> All major browsers round this to 0.5 or 0.25, which helps keep integer CSS px values an integer number of device pixels.
This is completely false. No browser that I know of does any such thing, nor would it make any sense to do so (nor would it achieve the goal you specify to any meaningful extent).
The closest thing that does happen is that browsers use integer fractions of pixels as their basic layout unit: Firefox and its kin use sixtieths, Chrome and its kin sixty-fourths.
But the rest of your answer is correct; and to add a proper citation: “the reference pixel is the visual angle of one pixel on a device with a device pixel density of 96dpi and a distance from the reader of an arm’s length” <https://drafts.csswg.org/css-values-4/#reference-pixel>.
Maybe it's better to say that browsers just take what the OS tells them, rather than actually deriving a device pixel ratio from first principles according to the CSS spec. Because, yeah, there are some weird devices with DPRs like 2.625, though _most_ are multiples of 0.25: https://yesviz.com/viewport/. But note how the same DPR can give a varying CSS PPI, which makes using it useless for this purpose.
1.8 probably would produce a non-integer number of CSS pixels. The browser needs (wants?) to pick a number that divides both the width and height without remainders. For 1920x1080, 1.8 doesn't (it works for the height, but not the width), but 1.8̅1̅ does.
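For concreteness, the arithmetic I mean (a quick illustration, not anything a browser actually runs; 1.8̅1̅ is 20/11):

```ts
const w18 = 1920 / 1.8;       // 1066.66… — width is not a whole number of CSS px
const h18 = 1080 / 1.8;       // 600      — height is
const w2011 = 1920 * 11 / 20; // 1056     — 20/11 divides both dimensions evenly
const h2011 = 1080 * 11 / 20; // 594
```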
Although that sounds plausible, I don’t think that’s it: the window size is what matters, not the screen size, and you can’t guarantee any sort of divisibility for either anyway. For example, my screen is actually 2560×1440, which is 1706⅔×960 in CSS pixels given the real devicePixelRatio of 1.5. The established rule when you need an integer is, at least on Wayland, to round things down to the nearest integer; I’m not certain about other platforms. Certainly everything that deals in integer pixels sees 1706×960.
Chromium doesn’t exhibit this behaviour; it’s just Firefox on some of its zoom levels. And when I saw 90% being 0.9090909090909091 (90⁄99 instead of 90⁄100), it triggered a memory of observing this five or seven years ago on my Surface Book (3000×2000 @ 2×). I think it is just that they’ve chosen to display different, slightly inaccurate percentage labels.
It's pretty straightforward to include the output of `wasm-pack` in a vite project. The output is a node module in a folder with the wasm files and a single "main" JS file.
Because I wanted to load WASM in a web worker for my project [1], I needed vite-plugin-wasm and `wasm-pack build --target web`, but without that constraint you should be able to import the main JS file from the wasm-pack output directory using wasm-pack's default `bundler` target and no vite plugins.
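For the worker route, the wiring looks roughly like this (a sketch only; `my_crate` and `do_work` are placeholder names, not from my project):

```ts
// worker.ts — assumes `wasm-pack build --target web` emitted ./pkg/my_crate.js
import init, { do_work } from "./pkg/my_crate.js";

// `init` fetches and instantiates the .wasm binary; kick it off once up front.
const ready = init();

self.onmessage = async (e: MessageEvent<number>) => {
  await ready;
  postMessage(do_work(e.data));
};
```

The worker itself gets spawned the usual vite way, e.g. `new Worker(new URL("./worker.ts", import.meta.url), { type: "module" })`.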
To me, the interesting thing is the logic programming "rules" and their overlap with game rules. Inspired by work at Stanford on the logic-programming-based Game Description Language [1], I implemented Tic Tac Toe in Datascript yesterday: https://github.com/kasbah/datascript-games/blob/e06a37025bf9...
I am still not sure whether there is a more succinct rule definition than what I have there. In the Stanford paper you have rules like:
This is a consequence of the Datomic information model, which is focused on handling a single universe of triples (to facilitate natural schema growth) instead of independent n-ary relations. However, unlike when building serious business applications, for most simple games you probably won't ever care about the ease of handling the schema growth of persisted data. You would likely care more once you think about supporting long-lived multiplayer environments or introducing player-defined concepts within the game.
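To make the modelling difference concrete, here's a rough sketch (attribute names are made up): the same ternary "move" fact as one row of an n-ary relation versus the entity/attribute/value triples it gets decomposed into:

```ts
// One fact per row: the n-ary (here ternary) relation view.
type MoveRow = [player: string, row: number, col: number];
const moves: MoveRow[] = [
  ["x", 0, 0],
  ["o", 1, 1],
];

// The same two facts flattened into the single universe of triples,
// one entity per move with one attribute per column.
const triples: [number, string, string | number][] = [
  [1, "move/player", "x"], [1, "move/row", 0], [1, "move/col", 0],
  [2, "move/player", "o"], [2, "move/row", 1], [2, "move/col", 1],
];
```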
Interesting, thanks. I am especially interested in the idea of introducing player-defined concepts.
Would you be able to recommend a Datalog implementation that allows independent n-ary relations? Ideally one I can use from Python or JavaScript in a sort of sandboxed way, as I am doing with Datascript, but if you have any recommendation at all it would be helpful to me.
DataScript already supports processing n-ary relations; it's just not how the data is naturally stored when you use `d/transact!`. Even though it's all in-memory anyway (ignoring the recent addition of durable storage on the JVM), the main benefit you get when 'storing' data is the suite of persistent B-Tree EAVT indexes. DataScript also lets you store plain vectors (and most other objects) as values, which you can access from the Datalog, so it's very flexible really. And learning Clojure is good fun.
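A sketch of what I mean, using the JS API since that's what you mentioned (relation and names are made up, and I'm assuming the `datascript` npm package):

```ts
import * as d from "datascript";

// A plain array of 3-tuples acts as an ad-hoc ternary relation.
const moves: [string, number, number][] = [
  ["x", 0, 0],
  ["o", 1, 1],
  ["x", 0, 1],
];

// Bind the collection in :in and destructure each tuple positionally;
// no transact!/EAVT storage involved at all.
const xMoves = d.q(
  '[:find ?row ?col :in [[?player ?row ?col]] :where [(= ?player "x")]]',
  moves
);
// => [[0, 0], [0, 1]] (a set, so order isn't guaranteed)
```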
If you want to try something more exotic I would be tempted by https://logica.dev/ + some flavour of SQLite (potentially in-memory/WASM).
https://developer.mozilla.org/en-US/docs/Web/API/Window/devi...