Experiential Design - Weekly Journal





Week 1 (23/4/2025)

Today is the first class of this module, and we were given the module information and assignment brief. Based on my understanding, the learning outcome of the Experiential Design module is to learn how to develop AR using Unity, by integrating both physical and digital elements to solve a real-life problem. I believe that this module will be quite heavy, as it requires a lot of technical skills, similar to what we did in our Game Development module. However, I believe that this module will be very beneficial and will push me to learn new skills that I haven’t explored before!


Lecture Note

  • MVP (minimum viable product): the simplest version of your product that can still be sold.
  • Vertical slice: showing one section of your app, but fully finished: complete visuals, functionality, and animation for that section.
  • Vuforia: a plugin that simplifies AR development in Unity.
  • Don't rely on image scanning alone for our assignment.
  • No need to integrate real APIs; it is enough to show the app's UI and how the user interacts with it.


Week 2 (30/4/2025)

In today’s class, we were given a theoretical lecture on experiential design. We mainly learned about user mapping and journey mapping, which serve as preliminary drafts to understand the user flow, the problems users encounter, the solutions provided, and more. We also participated in a class activity to further enhance our understanding of this topic.


Lecture Note

  • Fundamentals of experiential design: User Mapping, Journey Map
  • User experience (UX) - how a user interacts with the app/product
  • Experience Design (XD) - where and how elements are positioned in space, following spatial design principles
  • Customer Experience (CX) - includes all interactions a customer has with a brand
  • Brand Experience (BX) - cultural impact your brand leaves on people, the values, tone, and personality of the brand across all platforms
  • Information architecture (IA) - foundation of a digital experience, organizing content and structure

Figure 2.1 Modern Design Field




Figure 2.2 User Experience vs Experiential Design


  • Empathy map - articulate what we know about a particular user, to create a shared understanding and aid in decision making. It consists of four quadrants: says, thinks, does, feels
  • Journey map - tracks the user’s experience over time across different touchpoints, example: Pre-boarding experience → In-flight experience
  • UX Mapping Reference: Link
  • Journey Mapping Reference: Link

Figure 2.3 Empathy Map



Figure 2.4 Journey Map


Class Activity

The task for this class activity was to form a group of 5–6 people and identify the gain points, pain points, and solutions for users in a selected location. We chose our campus, Taylor's University, as the location since everyone is familiar with it and is also a user of the space. We came up with an AR-based solution, as it aligns with our module and upcoming assignment. This encouraged us to brainstorm how AR can be used to address the pain points we identified.

We all worked on this together in Miro! Here is the Link


Figure 2.5 Journey Map in Taylor's University


Figure 2.6 Future Journey Map




Week 3 (7/5/2025)

In today’s lecture, we learned about the design process and participated in a group activity to deepen our understanding of AR, specifically how AR can be used to solve everyday pain points. Additionally, we were given a tutorial on how to create a simple image-based AR experience using the Vuforia Engine and Unity.


Lecture Note

  • Augmented Reality (AR) - less immersive; combines real and virtual objects to enhance the physical world, viewed through a screen device
  • Mixed Reality (MR) - in the middle; virtual objects behave within the physics of the physical world, and your hands are free to interact with and manipulate them
  • Virtual Reality (VR) - most immersive; a fully computer-generated world with a sense of presence
  • Tracking approaches: marker-based vs. marker-less
  • Design components - UX design + usability + interaction design


Class Activity

We were tasked to work in groups to create a simple AR mockup that addresses a specific problem statement. Our group chose the zoo as our target location and identified the pain points experienced there.


Location: Zoo Negara

Problem Statement:
  • Limited Interactivity: static information boards are unengaging, making it hard, especially for children, to absorb the information.
  • Language Barrier: foreign tourists may not understand the local languages, making the info boards difficult to understand.
  • Maintenance: signage exposed to outdoor conditions is subject to weather damage and fading.
  • Outdated Information: information boards are not kept up to date, which can lead to misunderstandings.
  • Space Constraint: info boards are difficult to access during peak times due to crowds.
  • Limited Readability: small, cluttered text can be difficult to read from a distance.

AR Mockup:

Figure 3.1 AR Navigation


Figure 3.2 AR Informative Board


Unity
  • License Manager > generate a Basic license key
  • Target Manager > add a database > open it > add a target (upload the tracking image)
  • Rating - how easy the image is to scan; 4/5 stars = good
  • Download the database > import it into Unity
  • Right click in the Hierarchy > Vuforia Engine > AR Camera > open the Vuforia Engine configuration > paste the license key
  • Vuforia Engine > add an Image Target > set the type, database, and image target
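Once the Image Target is set up, you can also react to it from code instead of only through the Inspector. Below is a minimal sketch assuming Vuforia 10.x, where every target exposes an `ObserverBehaviour` with an `OnTargetStatusChanged` event; the class name `TargetStatusLogger` is my own, and older Vuforia versions use a different API.

```csharp
using UnityEngine;
using Vuforia;

// Attach to the Image Target. Logs whenever the target is found or lost.
// Sketch based on the Vuforia 10.x ObserverBehaviour API.
public class TargetStatusLogger : MonoBehaviour
{
    ObserverBehaviour observer;

    void Start()
    {
        observer = GetComponent<ObserverBehaviour>();
        if (observer != null)
            observer.OnTargetStatusChanged += OnStatusChanged;
    }

    void OnDestroy()
    {
        if (observer != null)
            observer.OnTargetStatusChanged -= OnStatusChanged;
    }

    void OnStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
    {
        bool found = status.Status == Status.TRACKED ||
                     status.Status == Status.EXTENDED_TRACKED;
        Debug.Log(found ? "Target found" : "Target lost");
    }
}
```

This is the same found/lost logic that the default event handler on the Image Target drives in the Inspector, just written out by hand.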


Figure 3.3 Image Target Behaviour


Figure 3.4 My First AR


Week 4 (14/5/2025)

In today’s lecture, we were given tutorials on how to set conditions, create simple UI, and add animations for our AR project. Fortunately, we already have the basics from the Games Development module, so this lecture felt quite easy and manageable for me. It even made me feel that our assignment won’t be as difficult as the one from the previous module.


Lecture Note

  • Right click in the Hierarchy > UI > Canvas

Figure 4.1 Screen Space - overlay


  • Canvas Scaler > set to "Scale With Screen Size"
  • To toggle the cube (active/inactive): go to the Button > scroll down to "On Click ()" > add the game object > choose SetActive (bool)
  • To toggle an animation: add the Animator instead > choose the enabled (bool) property
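The same toggling can be done from a small script whose public methods are wired to the Button's "On Click ()" event. This is a sketch with made-up names: `cube` and the animator parameter `"IsOpen"` are placeholders for whatever exists in your scene.

```csharp
using UnityEngine;

// Hook ToggleCube / ToggleAnimation to a UI Button's "On Click ()" event
// in the Inspector. Field names and "IsOpen" are illustrative.
public class CubeToggler : MonoBehaviour
{
    [SerializeField] GameObject cube;     // object to show/hide
    [SerializeField] Animator animator;   // animator whose bool we toggle

    public void ToggleCube()
    {
        // Flip between active and inactive each click
        cube.SetActive(!cube.activeSelf);
    }

    public void ToggleAnimation()
    {
        // Flip an animator bool parameter to start/stop the animation state
        bool current = animator.GetBool("IsOpen");
        animator.SetBool("IsOpen", !current);
    }
}
```

Using a script instead of the built-in On Click list becomes handy once one button needs to do several things at once.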


Figure 4.2 Set active boolean


Figure 4.3 AR Testing


Figure 4.4 Animation boolean




Week 5 (21/5/2025)

In today’s class, we gained deeper insights into enhancing our AR app and explored alternative methods to achieve similar features. As part of the learning outcomes, we learned how to add videos into AR, script using Visual Studio Code, use visual scripting, and set conditions.


Lecture Note

  • How to add a video: under the Image Target, add a "3D Object" > "Plane" > on the Plane, go to "Add Component" > add a "Video Player"
  • To play the video when the image is detected and stop it when it is not: go to the Image Target > scroll to the bottom > set the condition as below:
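The play/stop condition can also be expressed as a tiny script: expose two public methods and drag them into the "On Target Found ()" and "On Target Lost ()" fields at the bottom of the Image Target. This is a minimal sketch; the class name `VideoToggle` is my own.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Wire PlayVideo to the Image Target's "On Target Found ()" event and
// StopVideo to "On Target Lost ()" in the Inspector.
public class VideoToggle : MonoBehaviour
{
    [SerializeField] VideoPlayer videoPlayer; // the Video Player on the Plane

    public void PlayVideo()
    {
        videoPlayer.Play();
    }

    public void StopVideo()
    {
        videoPlayer.Stop();
    }
}
```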


Figure 5.1 Set condition


Figure 5.2 Script for video toggle


Figure 5.3 Set condition from script


Figure 5.4 Add SFX


  • Another method without using code: Select plane > "add component" > "visual scripting" > "script machine"
  • Change the "source" from "graph" to "embed" > "edit graph"
  • Right click to add nodes


Figure 5.5 Visual scripting


Figure 5.6 Final Outcome



Week 6 (28/5/2025)

Today's lecture was pretty light since it was an online class. Most of what the lecturer covered had already been taught in last semester's Game Development module, such as adding text, UI elements, scene controllers, and so on.


Lecture Note

  • Import the UI elements > change the texture type to Sprite (2D and UI)
  • Tick "Preserve Aspect" to retain the ratio and scale of the UI element
  • Button > scroll down to "On Click ()" > "+" > drag in the canvas menu > Scene Controller > gotoARScene
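The scene controller behind `gotoARScene` is a one-liner around Unity's `SceneManager`. A minimal sketch, assuming the AR scene is called "ARScene" (a placeholder; use your scene's actual name, and make sure it is added to the build settings first):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Attach to the canvas menu object; drag gotoARScene onto the Button's
// "On Click ()" list. "ARScene" is a placeholder scene name and must be
// included in the project's build settings.
public class SceneController : MonoBehaviour
{
    public void gotoARScene()
    {
        SceneManager.LoadScene("ARScene");
    }
}
```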

Figure 6.1 Outcome




Week 7 (4/6/2025)

In this class, we learned how to build and run our AR app on a phone from Unity. Unfortunately, iOS builds can only be made from macOS, while Android builds work fine from Windows. As an iPhone user with a Windows PC, I need access to an external macOS device to build and deploy the app.


  • File > Build Profiles > switch to the iOS platform
  • In iOS > click "Player Settings" > set the necessary settings
  • Tick "Metal API Validation", set the iOS version to 15.0, tick "Render Over Native UI"
  • After the Player Settings, Build and Run > create a new folder for the build
  • Youtube recap: Link


Figure 7.1 AR on app



Week 8 (11/6/2025)

In today's class, we learned about the ground plane, where we explored how to interact with it by placing elements on the surface. We also experimented with this feature using our phones.


Lecture Note

  • Open Vuforia in Safari > Download database under target manager
  • Import the database into unity
  • Right click > Vuforia Engine > Ground plane > Add "Ground plane stage" and "Plane finder"
  • Drag "Ground plane stage" into "Plane finder"
  • Drag the object under "Ground plane stage" to make a child 
  • To add material > Right click to "Create" > "Material" > Set your own material > Then drag the material into the object in the scene 
  • Go to "Vuforia Engine" at the panel below > "Vuforia" > "Database" > "For print" > "Emulator"


Figure 8.1 Ground plane stage


Figure 8.2 Testing on emulator




Week 9 (18/6/2025)

Lecture Note

  • Package Manager > Unity Registry > download ProBuilder
  • Tools > ProBuilder > Editors > Create Shape
  • On the ground plane > click and drag to make a wall
  • Plane Finder > uncheck "Duplicate stage"



Week 10 (25/6/2025)

  • Add a Canvas > scale: ~0.3 > set the render mode to "World Space" > drag it under the "Ground Plane Stage"
  • Add a raycast script to the AR Camera > create a new layer named "layer mask" and assign it as the raycast's layer mask
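The raycast script mentioned above can be sketched roughly as below: cast a ray from the screen tap through the camera and only accept hits on the chosen layer. This is my own minimal version (class name `TapRaycaster` is made up); it assumes the objects on the ground plane stage have colliders and sit on the layer assigned to `layerMask`.

```csharp
using UnityEngine;

// Attach to the AR Camera. Casts a ray from a tap/click and reports hits
// only on the layer(s) selected in "layerMask" in the Inspector.
public class TapRaycaster : MonoBehaviour
{
    [SerializeField] LayerMask layerMask;
    Camera cam;

    void Start()
    {
        cam = Camera.main;
    }

    void Update()
    {
        if (Input.GetMouseButtonDown(0)) // also fires on a touch tap
        {
            Ray ray = cam.ScreenPointToRay(Input.mousePosition);
            if (Physics.Raycast(ray, out RaycastHit hit, 100f, layerMask))
                Debug.Log("Hit: " + hit.collider.name);
        }
    }
}
```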





