How granular are CSS units?

I am developing a scalable, mobile- and desktop-friendly website using CSS rem units sized around the average viewport of the site's visitors.

I set the edge sizes of certain add-ons and fields to 0.001rem, and that works fine in my Firefox browser ... but will it work in all modern browsers?

I'm asking about the ability to use 0.001rem values because the finest granularity I had seen before thousandths was hundredths, for opacity ... for example, opacity: 0.25.

How fine-grained can rem units be? Is 0.00000001rem an acceptable value in modern browsers?

+7
units-of-measurement css css3
3 answers

Ultimately, your granularity is 1 physical pixel (well, technically a subpixel in modern browsers, but I'll ignore that for the purposes of this discussion). You can compute different pixel values from em or rem to several digits of precision, but you then run into the real-world problem that this decimal precision is lost at render time, when the browser ultimately rounds to the pixels available at the device's pixel density relative to the reference pixel density (96ppi).

In essence, this reference pixel is 1/96 of an inch, so 1px in CSS terms basically means 1/96" at 96ppi. On screens with a higher pixel density (for example, the 326ppi of many Apple Retina screens), a scaling factor is applied to convert CSS reference pixels to physical pixels; in this case the factor is roughly 3.4. So if a CSS rule sets something to 10px, a Retina-display browser would render it at about 34 physical pixels (assuming nothing else in the HTML, such as meta elements, changes the display behavior). Because of this scaling, the physical size of the element is still 10/96", exactly the same physical size as if it were displayed on a 96ppi screen.
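For reference, browsers expose this scaling factor directly as window.devicePixelRatio. A rough sketch of the arithmetic (the exact ratio varies by device and zoom level):

    // devicePixelRatio is the physical-to-CSS-pixel ratio the answer
    // estimates above (~3.4 for a 326ppi screen); actual values vary.
    var cssPixels = 10;
    var physicalPixels = cssPixels * window.devicePixelRatio;
    console.log(cssPixels + "px in CSS is about " + physicalPixels + " physical pixels here");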

Now add em and rem to the mix. Let's use an example root element font size of 10px with a 0.001rem declaration on some other element. That means you are asking for 0.01 reference pixels (10px * 0.001rem), which translates to 0.034 physical pixels on the Retina screen. You can clearly see that a rem value of 0.001 is at least an order of magnitude away from making a significant difference in the physical display, since 0.01rem in this case translates to 0.34 physical pixels: after rounding for display, it renders no differently than the more precise 0.001rem specification.
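The same chain of calculations as a sketch (the variable names and the 3.4 factor are just the answer's illustration, not browser APIs):

    // Sketch of the arithmetic above.
    var rootFontSizePx = 10;    // font-size of the root element
    var remValue = 0.001;       // declared length in rem
    var scalingFactor = 3.4;    // ~326ppi / 96ppi, per the answer's estimate
    var cssPixels = rootFontSizePx * remValue;       // 0.01 CSS (reference) pixels
    var physicalPixels = cssPixels * scalingFactor;  // ~0.034 physical pixels
    console.log(cssPixels + " CSS px -> " + physicalPixels + " physical px");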

So I think you are defining CSS rules with far more precision than can actually be realized when physical pixels are drawn, unless you have a very large root element size and/or a physical screen whose pixel density is an order of magnitude greater than a Retina screen. I assume that last case does not apply.

Just because CSS values can be calculated to ten decimal places, or whatever, does not mean that physical rendering can happen at the same level of precision.

+5

A simple JSBin example shows that a height of 1.001rem renders as 18.0156px, while a height of 1.0001rem renders as 18px (the same as using just 1rem).

This means you get three decimal places of precision (at least in the desktop version of Chrome and with an ordinary font size).

I also tried to write a JavaScript test to measure the precision, but element.offsetHeight is an integer, so it is useless for that, unless there is another way to measure element sizes in pixels with decimal places.
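One such way is Element.getBoundingClientRect(), which the answer below also uses; a minimal sketch, with the element id box assumed for the example:

    // offsetHeight rounds to an integer, while getBoundingClientRect().height
    // keeps fractional CSS pixels. The id "box" is assumed for this sketch.
    var box = document.getElementById("box");
    box.style.height = "1.001rem";
    console.log(box.offsetHeight);                   // integer, e.g. 16
    console.log(box.getBoundingClientRect().height); // e.g. 16.015625 in Chrome with a 16px root font-size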

EDIT 1: According to the CSS specification (see this and this), there seems to be no limit on the number of decimal places.

But in the end, I think you are limited by the pixel density of the device. Physical pixels on the screen are indivisible, so all calculated sizes are rounded to the nearest integer.
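A minimal sketch of that final rounding step, under the simplifying assumption that the browser snaps each length to a whole number of device pixels (real engines also keep intermediate subpixel layout units):

    // Simplified model: snap a CSS length to the nearest whole device pixel.
    function snapToDevicePixels(cssPx) {
      var dpr = window.devicePixelRatio; // physical pixels per CSS pixel
      return Math.round(cssPx * dpr) / dpr;
    }
    console.log(snapToDevicePixels(0.01));   // 0 on a 1x or 2x screen
    console.log(snapToDevicePixels(16.016)); // 16 on a 1x screen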

+1

"CSS theoretically supports infinite precision and infinite ranges for all types of values, but in practice, implementations have finite capacity. UA should support reasonably useful ranges and prefixes." w3.org

In practice, however, my conclusion is this: in Chrome you get what is effectively unlimited precision, until the fractional part of the resulting pixel value drops below 0.015.

EDIT: My original conclusion turned out to be wrong once the font size was increased to a larger value.

Check it out yourself (I used Element.getBoundingClientRect()):

    var l = ["normal", "tens", "hundreds", "thousands", "ten-thousands", "hundred-thousands"];
    var o = document.getElementById("output");
    for (var i = 0; i < l.length; i++) {
      o.innerHTML += l[i] + ": " + document.getElementById(l[i]).getBoundingClientRect().height + "px<br>";
    }
    body, html { font-size: 1000px; }
    #normal { height: 1rem; }
    #tens { height: 1.1rem; }
    #hundreds { height: 1.01rem; }
    #thousands { height: 1.001rem; }
    #ten-thousands { height: 1.0001rem; }
    #hundred-thousands { height: 1.00001rem; }
    #output { font-size: 16px; }
    <div id="output"></div>
    <br>
    <div id="normal">Hello</div>
    <div id="tens">Hello</div>
    <div id="hundreds">Hello</div>
    <div id="thousands">Hello</div>
    <div id="ten-thousands">Hello</div>
    <div id="hundred-thousands">Hello</div>

My result:

    normal: 1000px
    tens: 1100px
    hundreds: 1010px
    thousands: 1001px
    ten-thousands: 1000.09375px
    hundred-thousands: 1000px

An indication that the hundred-thousandths place (0.00001) exceeds the precision my browser supports.

After increasing the font size and playing around with it further, I cannot find a value that yields a pixel value with a fractional part below 0.015. (Try setting font-size: 1556px and 1557px.) This is consistent with Chrome rounding layout values to multiples of 1/64px = 0.015625px, which would also explain the 1000.09375px (1000 + 6/64 px) result above.

+1
