# Evaluation Topics: Wayfinding

![article 1: Creating Impactful Spatial Experiences ](https://1763729276-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LiPPFFn8713gkOyphHP%2F-LicRc51Gtn1yRecya1H%2F-LicUsdejpyCJ8N1bq_O%2FScreen%20Shot%202019-06-30%20at%2015.06.54.png?alt=media\&token=c35a23bf-5d92-4910-86ee-ba666338bba8)

[**article 1: Creating Impactful Spatial Experiences**](https://www.slideshare.net/FrostSydney/creating-impactful-spatial-experiences)

**Wayfinding** is the *cognitive component of moving around in an environment*. It is a **decision-making process** that requires clarity, and metaphors can be useful in supporting it. Wayfinding demands high-level thinking, planning, and decision-making related to user movement. It involves:

* *spatial understanding* (see chapters about spatial dimensions).
* *planning tasks*, such as determining the current location within the environment, determining a path from the current location to a goal location, and building a mental map of the environment.

In virtual worlds, wayfinding can be crucial: an efficient travel technique needs to be combined with an overview of possible routes and a sense of where to go.\
&#x20;<img src="https://1763729276-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LiPPFFn8713gkOyphHP%2F-LicRc51Gtn1yRecya1H%2F-LicTg1mjVvCrE7BleCs%2FScreen%20Shot%202019-06-30%20at%2014.58.44.png?alt=media&#x26;token=1832b04b-95a9-4bf0-a953-f2f2c67a569e" alt="" data-size="original"> <img src="https://1763729276-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LiPPFFn8713gkOyphHP%2F-LicRc51Gtn1yRecya1H%2F-LicTcvZUeh9BCY1ee4d%2FScreen%20Shot%202019-06-30%20at%2014.58.28.png?alt=media&#x26;token=151167b9-3b49-43d3-b3f1-064a4e58b882" alt="" data-size="original">&#x20;
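The planning tasks listed above (locating oneself, finding a path to a goal, building a mental map) are what any wayfinding aid ultimately has to support. As a toy illustration, not taken from the article, here is a minimal sketch of "determining a path from the current location to a goal location" as a breadth-first search over a small walkable grid; the `find_path` function and the grid encoding are assumptions for the example:

```python
from collections import deque

def find_path(grid, start, goal):
    """Breadth-first search over a walkable grid (0 = free, 1 = blocked).

    Returns the list of cells from start to goal, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}          # predecessor map, also marks visited cells
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Walk the chain of predecessors back to the start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None

grid = [
    [0, 0, 0],
    [1, 1, 0],    # a wall forces a detour through the right-hand column
    [0, 0, 0],
]
path = find_path(grid, (0, 0), (2, 0))
```

A real VR environment would of course use a navigation mesh rather than a grid, but the planning step is the same idea.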

## Wayfinding Cues

#### Environment-Centered

{% embed url="https://vimeo.com/338651211" %}

* ***Environment legibility***: think of paths (linear), edges (enclosing), districts (quickly identifiable), nodes (gathering points), and landmarks (static objects).
* ***Landmarks***: global landmarks provide directional cues; local landmarks support decision-making by providing information at key points.
* ***Maps***: the most common wayfinding aid in daily life, but very complex to design for virtual environments. A map need not be a spatial representation; it can also put categorisation and hierarchical structure at its core. [Environmental clutter](https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcSl8446rIEJqZDpCMb_j0PgT89tVSey7klb5oUeQqKGgaSylJiwtA) vs [Neat & Empty](https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcRSViF2QzxKSGQhepjHujN_hYTdc2VV2t4yhaS4ZRVWcWopGrArrQ)
* ***Compasses***: provide directional cues, great for implementation in maps.
* ***Signs***: give explicit directions or identify locations.
* ***Trails***: help users retrace their steps.
* ***Reference Objects***: well-known objects such as chairs and human figures that help users judge size and scale in virtual reality.
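To make the *trails* cue concrete, here is a minimal, engine-agnostic sketch (not from the article; the `Trail` class and `spacing` parameter are illustrative assumptions) of a breadcrumb recorder that drops a marker each time the user has moved a fixed distance, so their steps can later be retraced:

```python
import math

class Trail:
    """Drop a breadcrumb whenever the user has moved at least `spacing` metres."""

    def __init__(self, spacing=1.0):
        self.spacing = spacing
        self.crumbs = []               # recorded (x, y, z) positions

    def update(self, position):
        # Call once per frame with the user's (x, y, z) world position.
        if not self.crumbs or self._distance(self.crumbs[-1], position) >= self.spacing:
            self.crumbs.append(position)

    @staticmethod
    def _distance(a, b):
        return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

trail = Trail(spacing=2.0)
for step in range(11):                 # user walks 10 m along the x axis
    trail.update((float(step), 0.0, 0.0))
```

In a real application the recorded crumbs would be rendered as small markers in the scene; spacing them by distance rather than by time keeps the trail readable at any walking speed.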

#### User-Controller-Centered

![](https://1.bp.blogspot.com/-1P0iEg6PdUg/Vv6K6o7UBYI/AAAAAAAAACY/e3_MSgirCCY9mGEbpbyupQpyF2g2NyAlw/s1600/touch%2Bmapper.jpg)

There are several strategies that designers and developers can apply to address the challenges that the human perceptual system and VR hardware will keep facing.

* ***Field of View (FOV)***: a larger FOV (beyond 40–80 degrees) reduces head movement, while a small FOV can lead to cybersickness.
* ***Motion Cues***:
  * peripheral vision provides strong motion cues (direction, velocity, and orientation during movement);
  * additional vestibular cues ([inertia](https://www.youtube.com/watch?v=NYD8ZG3W31k) & balance, which are usually related to embodied self-motion cues) are necessary as well.
* ***Multi-sensory Output***:
  * [Tactile maps](https://1.bp.blogspot.com/-1P0iEg6PdUg/Vv6K6o7UBYI/AAAAAAAAACY/e3_MSgirCCY9mGEbpbyupQpyF2g2NyAlw/s1600/touch%2Bmapper.jpg): maps whose contours are raised so they can be sensed by touch as well as sight. Tactile cues can aid the formation and use of spatial memory.
  * Audio cues can be used as well.
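The FOV point above can be made tangible with a small geometric test. The sketch below (illustrative only; the function name and 2D ground-plane simplification are assumptions) checks whether an object lies within the user's horizontal field of view by comparing the angle between the forward direction and the direction to the object against half the FOV:

```python
import math

def in_field_of_view(forward, to_object, fov_degrees):
    """True if `to_object` lies within the horizontal FOV centred on `forward`.

    Both arguments are 2D direction vectors (x, z) on the ground plane.
    """
    def norm(v):
        length = math.hypot(*v)
        return (v[0] / length, v[1] / length)

    f, t = norm(forward), norm(to_object)
    # Angle between the two unit vectors, clamped against rounding error.
    dot = max(-1.0, min(1.0, f[0] * t[0] + f[1] * t[1]))
    angle = math.degrees(math.acos(dot))
    return angle <= fov_degrees / 2

ahead = in_field_of_view((0.0, 1.0), (1.0, 1.0), 120.0)  # 45 degrees off-axis: inside
side = in_field_of_view((0.0, 1.0), (1.0, 0.0), 120.0)   # 90 degrees off-axis: outside
```

A check like this is what lets a design fall back to off-screen indicators (arrows, audio cues) when a wayfinding target leaves the user's FOV.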

