I would like this topic to contain discussion related to standardizing nonvisual mapping techniques. If the discussion becomes too big, we can split the topic, but for now, there are quite a few similar requirements for nonvisual maps.
Here are some links from the presentations:
Digital Auditory Maps:
Audiom - The cross-sensory web map widget builder.
TEAM - An OSM data viewer with a first-person view.
iSonic - A heatmap tool.
SAS Graphics Accelerator - An audio graphics tool that allows the creation of heatmaps and maps with points (as well as other sonifications).
Grmapa - A grid-based presentation of OSM data.
Vibro-Haptic Maps:
VEMI Lab - A lab that makes vibro-haptic diagrams.
Commonalities Between the Above Maps
- All features need a name property along with additional attributes
- All maps use an egocentric (first-person) viewpoint rather than an allocentric (overview) viewpoint
- All the maps use a type property, like “Restaurant” or “Metro”
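To make these commonalities concrete, here is a minimal sketch in TypeScript of what a shared feature shape could look like. The interface and field names are my own assumptions for illustration; they are not taken from any of the tools above.

```typescript
// Hypothetical common shape for a nonvisual map feature.
// Every feature carries a human-readable name (spoken or displayed)
// and a category type such as "Restaurant" or "Metro", plus any
// number of free-form string attributes.
interface NonvisualMapFeature {
  name: string; // label announced to the user, e.g. "Central Cafe"
  type: string; // category used for filtering or sonification
  [attribute: string]: string; // additional attributes, e.g. cuisine
}

// Illustrative example feature (names are made up).
const example: NonvisualMapFeature = {
  name: "Central Cafe",
  type: "Restaurant",
  cuisine: "Italian",
};
```

A schema along these lines could be a starting point for standardization, since each tool already expects a name, a type, and extra attributes per feature.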
Things to Consider for the Future
WebXR is here, and with it come peripherals and multimodal displays. HaptX is an example of a tactile display that could bring a tactile 3D-model map or raised-line map viewer to the web.