Can you elaborate on why this would be necessary? Currently there’s no standard mapping between how browsers perceive resource priority and H2 dependency trees (e.g. Firefox and Chrome mark critical JS and async JS in very different ways, both in dependencies and weights).
Browsers might have different ways of using it, but there’s a standard data model - H2 priority trees (if we ignore the problem of H1 for now). If this effort introduces a different model, then we need a mapping (which is likely to be lossy).
If we’re going to abandon priority trees in favour of something else, this might make sense, but I haven’t seen serious suggestions that we do so yet.
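For concreteness, the model I mean boils down to very little data per stream (RFC 7540, section 5.3). A rough TypeScript illustration of the existing model, not a proposal for a new API:

```ts
// The existing H2 priority model reduced to data: every stream carries a
// parent dependency, a weight in 1..256, and an exclusive flag.
interface H2Priority {
  streamId: number;
  dependsOnStreamId: number; // 0 means "depends on the root of the tree"
  weight: number;            // 1..256
  exclusive: boolean;
}

// e.g. a critical stylesheet hung directly off the root with maximum weight:
const cssPriority: H2Priority = {
  streamId: 3,
  dependsOnStreamId: 0,
  weight: 256,
  exclusive: false,
};
```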
I believe we can tackle the stated use cases by providing something simpler, such as “upgrade/downgrade” semantics as well as optional dependencies or deference indications (“critical” vs. “important” vs. “other”).
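To sketch what I mean (hypothetical names throughout; none of this is specced anywhere):

```ts
// Hypothetical sketch: coarse buckets plus upgrade/downgrade semantics as
// seen from the author's side. "deference", upgrade() and downgrade() are
// made-up names for illustration only.
type Deference = 'critical' | 'important' | 'other';
const buckets: Deference[] = ['other', 'important', 'critical'];

class PrioritizedLoad {
  readonly response: Promise<Response>;
  constructor(public url: string, public deference: Deference = 'other') {
    // Real prioritization would happen inside the UA; fetch() is a stand-in here.
    this.response = fetch(url);
  }
  upgrade(): void {   // e.g. promote "other" -> "important"
    const i = buckets.indexOf(this.deference);
    this.deference = buckets[Math.min(i + 1, buckets.length - 1)];
  }
  downgrade(): void { // e.g. demote "important" -> "other"
    const i = buckets.indexOf(this.deference);
    this.deference = buckets[Math.max(i - 1, 0)];
  }
}

// Start a load as "other", then upgrade it once it turns out to block rendering.
const metrics = new PrioritizedLoad('/metrics.json', 'other');
metrics.upgrade();
```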
Would “upgrade” change the weight, or re-root the dependency? If the latter, where?
If we define a new model like this, my concern is that not only will different browsers have different ways of using the dependency tree, but they’ll also now have different ways of mapping this new model into it. Early days, of course, but it feels like a mess in the making.
Allowing authors to query and express dependencies (with optional weights) might be a way around this. It’s easy and intuitive to say “resource load A depends on B” and so forth.
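Something like the following, to make it concrete (all names hypothetical; modelled as a plain in-memory graph rather than anything a browser exposes today):

```ts
// Hypothetical sketch: a minimal in-memory model of "resource load A depends
// on B", with an optional weight loosely mirroring H2's 1..256 range. A UA
// (or a library) could consult a graph like this when scheduling fetches.
interface DependencyEdge { dependsOn: string; weight: number }

const deps = new Map<string, DependencyEdge[]>();

// "dependent should be scheduled after dependsOn", with an optional weight.
function declareDependency(dependent: string, dependsOn: string, weight = 16): void {
  const edges = deps.get(dependent) ?? [];
  edges.push({ dependsOn, weight });
  deps.set(dependent, edges);
}

// Query side: what does a given resource load depend on?
function queryDependencies(url: string): DependencyEdge[] {
  return deps.get(url) ?? [];
}

// "app.js depends on vendor.js", weighted heavily:
declareDependency('/app.js', '/vendor.js', 220);
console.log(queryDependencies('/app.js'));
```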
That would work well for JS, but not for markup (as it would be pretty verbose), unless we assign some sort of semantics to ordering / containment, etc.
I’d suggest that the best way forward might be to go Extensible Web on it – i.e., provide a low-level, primitive but powerful JS API, and let libraries figure out higher-level abstractions.
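Roughly (purely hypothetical names; the point is the layering, not the specific primitive):

```ts
// Sketch of the Extensible Web split, assuming a hypothetical low-level
// primitive (setLoadPriority) that exposes something close to the raw H2
// model. Nothing here is a real API; it only illustrates how a library could
// layer a friendlier abstraction (loadCritical) on top of such a primitive.
interface RawPriority { weight: number; parent?: string; exclusive?: boolean }

// Stub standing in for the platform primitive; a real UA would feed this
// straight into its internal scheduler / H2 priority frames.
const intents = new Map<string, RawPriority>();
function setLoadPriority(url: string, priority: RawPriority): void {
  intents.set(url, priority);
}

// Higher-level abstraction a library might expose: authors say "these are
// critical" and never touch weights or dependency trees directly.
function loadCritical(urls: string[]): Promise<Response[]> {
  return Promise.all(
    urls.map(url => {
      setLoadPriority(url, { weight: 256 }); // max H2 weight
      return fetch(url);
    })
  );
}

loadCritical(['/critical.css', '/app.js']);
```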
I agree that defining a full mapping of resources to H2 priorities would be potentially more powerful, but:
Servers will rarely have knowledge of all the resources a certain page will trigger, and won’t necessarily know the ideal dependency tree.
If they do, and all of those resources come from a single server (again, rare), the server could apply that ideal dependency tree itself while ignoring the priorities the browser sends.
Sorry, you lost me here. I’m not suggesting that this be done statically server-side; this is client-side code running. Or are you saying that we ought only define “hints” that the client can use to adjust its priorities at runtime (but it might decide to ignore them, or only partially follow them)? If so, I wonder if we can get it right the first time, and how much work fixing it after that would take.