One inch is not an inch


#1

It’d be really cool if, on devices whose displays expose valid EDID info, 1in actually measured one real-world inch (and likewise for the other physical units), but that’s not currently the case in any browser.

Any thoughts?


#2

Browsers are explicitly allowed to do so. If they don’t, it’s because having the px be some particular value is more useful to them (which is also explicitly allowed). The only requirement is that 96px == 1in. There’s nothing to do in specs here - you just need to convince browser engineers that having realistic ‘in’ values (when possible) is more useful than whatever they’re currently doing with ‘px’ values.


#3

Could you give a few examples of how this would be useful?


#4

There are a few I can think of:

  1. “Actual size” images/diagrams (e.g. a web shop for jewellery items, or hardware parts)
  2. Accurate zoom levels for content that can be made (or represents something that is) physical, e.g:
    1. “view at 100%” for an A4 Google document actually displays the page as it prints — allows for more accurate choice of appropriate font sizes.
    2. “2000%” view of a microscope image.
  3. An on-screen ruler.
  4. Measure lengths by letting the user manipulate a photo on-screen, e.g: “look at the camera from 1/2 metre away” ==> better results for someone buying headgear/glasses online.

#5

Another potential use case is displaying images of real-world objects (e.g. items in online shops) at their actual size.


#6

Because not all screens report their size accurately anyway, if you want a truly accurate display you need to ask the user for help with calibration.

If you don’t need precisely accurate measurements, the error is bounded. Assume the browser rounds the screen’s true pixel ratio to an integer; the worst case is a 144dpi screen, exactly halfway between 1x and 2x. Treated as 1x, the rendered “1in” is 2/3 of a real inch, so the true inch is 50% larger than what’s drawn. The worst-case error shrinks as density grows: 25% for a 2.5x screen treated as 2x, then about 17% at 3.5x, etc.
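A small sketch of that worst-case-error argument, under the assumption that the browser treats a screen whose true ratio is n + 0.5 as an integer n (the error, measured relative to the rendered length, simplifies to 0.5 / n):

```javascript
// Assumption: browser rounds a midpoint screen (true ratio n + 0.5) down to n.
// Physical size of CSS "1in" on such a screen: n / (n + 0.5) inches.
// Error of the true inch relative to the rendered one simplifies to 0.5 / n.
function worstCaseError(assumedRatio) {
  return 0.5 / assumedRatio;
}

console.log(worstCaseError(1)); // 0.5  -> 50% on a 1.5x (144dpi) screen
console.log(worstCaseError(2)); // 0.25 -> 25% on a 2.5x screen
console.log(worstCaseError(3)); // ~0.17 on a 3.5x screen
```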


#7

Chromium’s response was to “engage with the CSS WG”.


#8

Do we know how prevalent that problem actually is? Not trying to force the issue; just wondering if this is something that would work in 90%+ of cases, in which case it might still be useful (maybe combined with some media query).

In any case, given the existing behaviour, even if this were to be “fixed” it’d probably be wiser to have either new units (rin, rcm…) or some sort of modifier (real-physical-units: yes-please).


#9

Sure, for the addition of a “real inch” unit. That’s different from changing 1in to be a real inch, which is what I was responding to. :slight_smile:

The WG has in the past agreed not to add “true” physical units that require accurate lengths, because the use-cases appear minimal, and we can’t guarantee that it’s possible on any given device.


#10

I don’t think so. Especially on mobile devices, physical units would be practical for defining the minimum touch area of buttons. I’ve stumbled over this several times. Is there any official proposal regarding this topic?


#11

There have been suggestions to address that particular use-case more explicitly, by adding a “touch” unit. You could write `button { min-width: 1touch; min-height: 1touch; }` and solve the issue easily.
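A sketch of how such a hypothetical unit could be adopted defensively; note that `touch` does not exist in any shipped engine, and the 48px fallback is just a common touch-target guideline I’ve picked for illustration, not part of the proposal:

```css
/* Fallback first: 48px is a common touch-target recommendation. */
button {
  min-width: 48px;
  min-height: 48px;
}

/* Hypothetical: 'touch' is not a real CSS unit today, so this
   @supports block would simply never match in current browsers. */
@supports (min-width: 1touch) {
  button {
    min-width: 1touch;
    min-height: 1touch;
  }
}
```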

The proposal wasn’t pushed, but I’d be happy to revive it.


Physical unit redux: ‘tip’
#12

@tabatkins

But for screens that do report accurately we shouldn’t need to ask for calibration (or at least a calibration step would be simpler).

Either way (changing in to be real, or adding new “real” units) I’d be happy, as long as there was just some way.

@nemzes

It’s prevalent enough that “native” frameworks provide ways of getting the actual values we seek. For example, the Qt project provides a Screen class that exposes the device’s actual pixel density (among other APIs for getting EDID info; plus, since it’s a C++ framework, we can also read EDID info from displays directly through OS APIs). I can draw things on the screen at real-world sizes in Qt.

@jhnns

Yes!! Being able to specify display-size-independent designs on any device using real-world units would be incredibly nice! I was able to do this with Qt. The web needs this too.

Imagine a designer who designs books. In book design, the measurements for things like font size and page size are typically specified in real-world units, since the thing being made is a physical object (nothing digital). We should be able to do the same with digital designs when the hardware allows it (i.e. when the display supplies accurate EDID info).

Besides, even if not all displays reported accurate info, the mere fact that the web platform had such a way to report actual screen metrics would be a strong incentive for display manufacturers to provide that information. So adding such a feature to browsers seems like a good thing anyway, even if displays aren’t currently accurate. For displays that aren’t, it’s easy to fall back to the current behaviour (a “real” unit can simply resolve to the same length as its “fake” equivalent).


#13

Déjà vu


#14

As a side note, browser/device/OS manufacturers can currently set their ideal viewport dimensions (in CSS pixels) to whatever they deem best in light of not only the physical screen size, but also the (estimated) viewing distance. If we now required that an inch actually be a physical inch, we’d immediately have a problem with designs made for a particular viewing distance. Think, for instance, of a design made for mobile that is then displayed on a web-capable TV: where a 1"x1" control may be perfect on mobile, it’ll be ridiculously tiny on a 42" TV at the idealised 10' viewing distance that’s normally bandied about. So, while it would open one door (being able to accurately design something in real-world sizes), it would immediately create new problems for responsive design that needs to work across different device classes (and no, I hope we’re not now moving to “we just need device-specific media query features”…)

(partly related, my musings on TV and ideal viewport, from aeons ago http://patrickhlauke.github.io/web-tv/ideal-viewport/)


#15

I disagree. This is why we need to have access to pixel density and the screen size of the display so that we can make informed decisions. For example, if my app knows that the screen size is 30 inches by 40 inches and the pixel density is X, then my app would know it is running on a huge display, and it will adapt accordingly. My app therefore wouldn’t display a tiny 1-inch thing, but instead something bigger.

For those of us who wouldn’t want to take advantage of that luxury (you, for example), the spec would require browsers to behave as they currently do by default. The only difference would be that someone like me (who does want this) would have the option to take advantage of such information.

In the current state of affairs, “1 inch” not being an inch is just plain wrong, IMHO.


#16

Browsers would by default continue to behave like they currently do. This new feature I’m imagining would be opt-in, so that if people want to take advantage of it they can.

The “new problems” that you mention would only be there for people who opt-in, and I would be one of those people dealing with those new problems, but by default, you certainly wouldn’t have to.

Cheers and happy new year! :}


#17

would only be there for people who opt-in

and those who just cargo-cult/copy-paste from other projects without understanding the potential implications of this foot-gun

and I would be one of those people dealing with those new problems

as there’s no real solution to it, though (unless you make separate sites for mobile, desktop, and TV, and do browser/UA sniffing to try and redirect people to the correct one), it still sounds dangerous to me to add this particular feature while the means to deal with it correctly across different scenarios aren’t available.


#18

That applies to anything in any programming language, so it isn’t a good argument against this feature. It’s never a good idea to paste code without knowing what it does.

As I mentioned, the default behavior would be the current behavior, and there’d need to be a way to opt-in. I personally don’t care if people who go pasting things blindly accidentally opt themselves in. That’s what they get for pasting blindly.


#19

Idea: what about a set of new units like rin, rcm, etc., where the r prefix stands for “real” (“real inch”, “real centimeter”, and so on)? Those units would be truly physical when the hardware supports it, and would otherwise fall back to the current behavior. Simple!

With such a unit, it’d also become possible to do all sorts of things: make an invisible 1rin x 1rin box, calculate its pixel size, and use that info to derive the screen density. It would then be possible to pass real pixel values to other JavaScript libraries (for example canvas 2D or WebGL drawing libraries) that work in pixels (assuming the canvas-to-document pixel ratio is 1).
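A sketch of that probe idea. To be clear, `rin` is purely hypothetical (no browser implements it, so today the probe would just resolve to 1in = 96px everywhere); the DOM part is shown in comments, and only the arithmetic is real:

```javascript
// In a browser supporting a hypothetical 'rin' unit, you might measure it:
//
//   const probe = document.createElement('div');
//   probe.style.cssText = 'position:absolute;visibility:hidden;width:1rin;';
//   document.body.appendChild(probe);
//   const cssPxPerRealInch = probe.getBoundingClientRect().width;
//
// Turning that measurement into a device-pixel density:
function devicePixelsPerInch(cssPxPerRealInch, devicePixelRatio) {
  return cssPxPerRealInch * devicePixelRatio;
}

// e.g. a probe measuring 72 CSS px on a 2x screen implies 144 device px/inch
console.log(devicePixelsPerInch(72, 2)); // 144
```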


#20

Please don’t use other threads to continue to push for your “get the hardware pixel density” proposal.