This is the client for the WalkingTourEditor, which is available at:
https://github.com/JGL/WalkingTourEditor
WalkingTourClient is currently hosted at https://jgl-walkingtour.glitch.me/
- Install Homebrew if you're on macOS, or Homebrew on Linux if you're on Linux or Windows (via the Windows Subsystem for Linux). Run "brew update" to ensure Homebrew is up to date, then run "brew doctor" to make sure your system is ready to brew. Finally, add Homebrew's location to your $PATH in your .bash_profile or .zshrc file: export PATH="/usr/local/bin:$PATH"
- Install git via Homebrew: "brew install git"
- Install node via Homebrew: "brew install node"
- Clone the client repository: "git clone https://github.com/JGL/WalkingTourClient.git"
- Change into the freshly created folder: "cd WalkingTourClient"
- Run "npm install" to install the dependencies
- Run "node server.js" to start the server
- View the site in your favourite web browser (the server likely defaults to http://localhost:3000)
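For orientation, here is a minimal sketch of what a static-file server started with "node server.js" could look like, assuming Express and a "public" folder for the client files. This is an illustration only, not the actual server.js from the WalkingTourClient repository.

```javascript
// Hypothetical sketch of a minimal static-file server, assuming Express.
// This is an illustration only, not the actual server.js from WalkingTourClient.
const express = require('express');
const path = require('path');

const app = express();
const port = process.env.PORT || 3000; // matches the localhost:3000 default noted above

// Serve the client files (the "public" folder name is an assumption).
app.use(express.static(path.join(__dirname, 'public')));

app.listen(port, () => {
  console.log(`WalkingTourClient listening on http://localhost:${port}`);
});
```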
While looking at the context and use cases of the augmented reality tour, we reexamined some initial experience mapping that was done at the start of the project. We want to design an experience that does not emphasise looking at the screen, since staring at a screen takes away from the physical experience of the place and, most importantly, leaves little to the imagination.
We've mapped out the experience on two axes: the vertical moving from analog to digital, and the horizontal from understanding content to enjoyment. The horizontal axis is the one we mean to support through the vertical: we want to create an enjoyable experience and see how we can mix that with providing historical context.
You can see the happy path here: a curve from the bottom left up to the middle of the map, where the user is at the apex of digital experience, context awareness and visibility, and then finally rounding out to the bottom right, where they have learned about the situation, made the journey, know where they are and what happened there, and can enjoy the local wine and food.
We're looking at creating an extensible, accessible, open source experience platform where location activates media for the user. Here is an example of what it would look like when a user on site opens media. Our first step is to have location open an image in a mobile browser, then audio, then video, and finally mixed reality overlays; a rough sketch of the first step follows below.
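As a rough sketch of how location could activate media in a mobile browser, the page might watch the device's position and reveal an image once the visitor is within a given radius of a site. The coordinates, radius and image URL below are hypothetical placeholders, not values from the project.

```javascript
// Illustrative browser-side sketch: reveal an image when the visitor is near a site.
// The coordinates, radius and image URL are hypothetical placeholders.
const site = { lat: 43.4643, lon: 11.8796, radiusMetres: 50, imageUrl: 'site-photo.jpg' };
let revealed = false;

// Haversine distance between two latitude/longitude points, in metres.
function distanceMetres(lat1, lon1, lat2, lon2) {
  const toRad = (deg) => (deg * Math.PI) / 180;
  const earthRadius = 6371000; // metres
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * earthRadius * Math.asin(Math.sqrt(a));
}

// Watch the device position and show the media once the user enters the radius.
navigator.geolocation.watchPosition(
  (position) => {
    const { latitude, longitude } = position.coords;
    if (!revealed && distanceMetres(latitude, longitude, site.lat, site.lon) <= site.radiusMetres) {
      revealed = true;
      const img = document.createElement('img');
      img.src = site.imageUrl;
      document.body.appendChild(img);
    }
  },
  (error) => console.warn('Geolocation unavailable:', error.message),
  { enableHighAccuracy: true }
);
```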
We're also thinking about ways to connect external or remote visitors, through the experience of wine, to the sites that produce that wine.