A partial archive of discourse.wicg.io as of Saturday February 24, 2024.

MQ for Userʼs Population Group or Browsing Environment

Crissov
2019-11-08

Many sites vary their content or style or both based on the “maturity” of the user and the acceptability of “mature content” in their browsing environment:

  • Sites for children have parent sections or modes, where the style may be less playful and concentrate more on dull (and smaller) text than on colorful graphics. Typography may be adapted accordingly to aid either reading many paragraphs or deciphering a couple of words.
  • Entertainment sites tag delicate (user-generated) content as not safe for work (NSFW) and let users either opt in or opt out of accessing it.
  • Search engines offer a safe mode for content that may be offensive, often with three levels: on, moderate, off.
  • Several countries require age verification (by various methods) for some content or only allow it during specific local times (i. e. usually during childrenʼs nominal bedtime).
  • Many countries have voluntary or mandatory content rating systems (issuing labels) or even censorship agencies (altering or restricting content) for graphic, lyric or interactive content, but exact age groups and criteria depend a lot on culture and tradition. Bleeping out curse words, blurring genitals or replacing Nazi symbols belong in this category.
  • “Adult” sites, i. e. porn in most cases, hide or obscure hardcore content from guests, i. e. possibly underage, unverified visitors that are not logged-in paying members.
  • Some sites also vary their styles with the userʼs gender, if known, e. g. pastel colors and cursive fonts for girls vs. earthen tones and stencil lettering for boys.
  • Parental controls and company policies may use filters, whitelists or blacklists to restrict or prohibit user access to certain content or sites.

Iʼm not sure whether or how MQ should address this, but I tend to think it is in scope. This is a sensitive topic and usually affects content more than style, but media queries are not limited to selecting stylesheets anyway.

Authors might want to know the userʼs exact age (at least to the year) to determine appropriate style and content, but birthdates are not usually collected by the OS or browser, so they arenʼt available at all; even if they were, they would probably be unreliable and should not be reported in detail for privacy reasons.

However, an age group or related social role could possibly be determined implicitly or selected explicitly. Possible values could include: baby, toddler, child, pre-teen, teen, youth, junior, young, minor, twen, adult, mature, senior, old; kid, parent, pupil, student, teacher, professor; assistant, manager, admin(istrator), worker, minion, client, customer, guest, member, host, master, servant, clerk; girl, boy, man, woman, son, daughter, mother, father.

The user environment could be derived from some parameters that may be available to the browser or at least to the operating system. Possible values could include home, work, office, school, library, cinema, restaurant; public, private, protected, kiosk, presentation, shared, supervision, conference, team; transport, car, bus, train, plane, ship; desk, counter, bed, couch, chair, seat, floor, wall.
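For illustration only, a purely hypothetical sketch of how such media features might be written – neither age-group nor environment exists in any CSS specification, and the names and values here are made up:

    /* Hypothetical syntax: no such media features are specified anywhere. */
    @media (age-group: child) {
      body { font-size: 1.25em; }      /* larger, simpler text for young readers */
    }
    @media (environment: work) and (age-group: adult) {
      .nsfw { visibility: hidden; }    /* tone down delicate content at the office */
    }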

patrick_h_lauke
2019-11-12

but media queries are not limited to selecting stylesheets anyway

…but they are. They’re a CSS mechanism that allows you to adapt your visual presentation/layout to different factors (device capabilities, viewport metrics, user preferences). What you’re proposing (leaving aside the potentially huge privacy and fingerprinting concerns) seems more geared towards being able to present/restrict/adapt actual content, not just styles. For examples like “hiding hardcore content”, that’s really something you want to do on the content side, not just some way of setting display:none or whatever. So fundamentally, this is more something that should be tackled primarily via request headers or similar (so the server itself tailors its content), and then once THAT is in place, sure … some possible JavaScript API or similar (with permissions handling, possibly) to be able to query the user’s specifically set characteristics, maybe.
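(For concreteness, a rough sketch of what that header-based negotiation could look like, modeled on the existing Client Hints mechanism; Accept-CH and the Sec-CH- prefix are real conventions, but the Sec-CH-Content-Maturity hint is invented here purely for illustration:)

    HTTP/1.1 200 OK
    Accept-CH: Sec-CH-Content-Maturity

    GET /gallery HTTP/1.1
    Sec-CH-Content-Maturity: restricted

The server would opt in via the response header, the browser would send the user’s setting on subsequent requests, and the server could then tailor the markup it returns rather than relying on client-side hiding.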

Crissov
2019-11-12
dauwhe
2019-11-14

The social value of the Web is that it enables human communication, commerce, and opportunities to share knowledge. One of W3C’s primary goals is to make these benefits available to all people, whatever their hardware, software, network infrastructure, native language, culture, geographical location, or physical or mental ability.

Giving web sites the capability to serve different content or styles to girls or boys, to servants or masters (please, please don’t provide a use case) goes against the fundamental principle that we are all equal on the web. I don’t want my browser or your web site to know my age or gender or the numerous, overlapping, fluid, ever-changing social roles I take on.

Crissov
2019-11-14

I donʼt want websites to know more about me than they need to, either, and I want to know and control what they do know. Nevertheless, I provided several examples where such information is already needed – and hence provided – in some way. Iʼm just trying to explore whether this is something Media Queries should be used for, e. g. whether that would perhaps even improve privacy, or at least the user experience.

(And no, I donʼt have a use case for the master / servant distinction, but I could come up with one for many of the other roles.)

plinss
2019-11-14

I cannot possibly state how much I object to filtering web content by population group. Seriously, words fail me.

Selecting web content (or style) by age group or social role is completely unethical and illegal in many jurisdictions. The fact that you included gender roles is additionally problematic.

I suggest a review of the TAG Ethical Web Principles is in order. If it doesn’t make it clear why this is a bad idea, please let me know as we’d then need to revise the finding to make it more obvious.

I accept that you’re acting in good faith trying to solve a real problem, but there are existing solutions to identifying and handling mature content. Filtering by population group can only be problematic.

marcosc
2019-11-15

I’d like to echo the concerns that have been mentioned above. Although well intentioned, this could lead to various forms of discrimination.

I’m also concerned that this is going to be a highly controversial topic. So, as Chair, I’m going to take preemptive action and lock this thread before the internet finds this and it blows up.
