Welcome to my laundry store!
Scroll or use the slider bar to see more work!
My adventure as a Creative Technologist started in London.
Check out some of my work in the machines.
I like to do personal work in my free time.
I also have some game work on ArtStation.
Sometimes I work as a freelancer on projects.
On the right is a VR game that I worked on.
Thanks for checking out my laundry store!
Contact me at andrewtw.chang@gmail.com


Master of Arts final project. Real-time face tracking and projection mapping for an immersive narrative experience.


The objective of this project is to utilise technology to create a live performance that conveys the story of schizophrenia in an immersive manner. Projections highlight the dancers' movements and thoroughly immerse the audience in the story of schizophrenia, a misunderstood and frequently stigmatised mental disorder that affects both sufferers and their loved ones. By portraying the condition from a different angle, "PANTOMIMUS" hopes to increase public awareness of schizophrenia.



The project combines 3D modelling and 2D imagery to create immersive visual effects. One challenge encountered was the computational expense of face tracking, which demands a high-performance GPU. The initial intention was to render the face-hacking visuals entirely in real time; however, to ensure stability and maintain a consistent frame rate throughout the performance, a combination of real-time and pre-rendered visual effects was chosen. Regardless, the visual effects remain seamlessly integrated with the live performance.
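The trade-off between real-time and pre-rendered effects comes down to a frame budget: at 30 fps, every frame must finish in roughly 33 ms. The snippet below is a minimal illustrative sketch of that decision rule in Python; the function name, threshold, and timings are assumptions for illustration, not the project's actual TouchDesigner logic.

```python
FRAME_BUDGET = 1.0 / 30  # target: 30 fps for a stable live show

def choose_pipeline(recent_frame_times, margin=0.8):
    """Fall back to pre-rendered clips when live rendering cannot
    reliably stay inside the frame budget (with a safety margin)."""
    avg = sum(recent_frame_times) / len(recent_frame_times)
    return "realtime" if avg < FRAME_BUDGET * margin else "prerendered"

# Illustrative timings: 10 ms/frame is comfortably real-time,
# while 50 ms/frame would drop below 30 fps.
print(choose_pipeline([0.010] * 10))  # realtime
print(choose_pipeline([0.050] * 10))  # prerendered
```

In a live show this check would run continuously, so a heavy scene can degrade gracefully to a pre-rendered clip instead of stuttering mid-performance.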

patomimus_picture_2 patomimus_picture_3

The results

The result is a live performance at Goldsmiths, University of London. Using a custom-built face projection mapping application developed in TouchDesigner, live data is seamlessly incorporated into the performance. The visual components are displayed on a specially designed mask worn by the performer, immersing the audience in a multi-sensory experience. The performance is divided into five distinct phases, each accompanied by a dynamic stereo audio setting. The unique combination of performance, visual, and audio elements creates an unparalleled immersive experience, transporting the audience to a new realm of storytelling. The intimate setting further enhances this, allowing the audience to be fully engrossed in the performance.
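The project's TouchDesigner network is not shown here, but the core geometric step in face projection mapping — warping content from camera space into projector space — can be sketched with a planar homography. Everything below (function names, the four corner correspondences, the projector resolution) is an illustrative assumption, not the project's actual code.

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 planar homography H (dst ~ H @ src) from four
    (x, y) correspondences via the direct linear transform (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # H is the null vector of A: the right singular vector with the
    # smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalise so H[2, 2] == 1

def project(H, point):
    """Map a camera-space point into projector space."""
    x, y, w = H @ np.array([point[0], point[1], 1.0])
    return (x / w, y / w)

# Illustrative: four tracked face-landmark corners (camera pixels)
# mapped to the corners of the projected mask texture.
camera_corners = [(100, 80), (300, 85), (310, 260), (95, 255)]
projector_corners = [(0, 0), (1024, 0), (1024, 768), (0, 768)]
H = homography_from_points(camera_corners, projector_corners)
print(project(H, (100, 80)))  # ≈ (0.0, 0.0)
```

Re-estimating this transform every frame from the live tracking data is what keeps the projected visuals locked to the moving mask.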

patomimus_picture_4 patomimus_picture_5

Produced By: Andrew Chang
Mask Design: Ray Zheng
Performer: Kristia Morabito
Audio: Christina Karpodini


Master of Arts project. 3D scanning with augmented reality and data visualisation for a digital story show.


The advancement of 3D scanning technology has made capturing detailed images of objects and people increasingly accessible. As a result, it is becoming increasingly common for people to manipulate their digital images to craft false narratives about their lives. This raises questions about the nature of reality and the validity of our perceptions. What happens when these digital narratives are brought into the physical realm? The traditional standard for verifying the authenticity of something has been to rely on our own senses. But what if we stitch fragments of memories together to create a new, composite memory? This project explores these questions by bringing a single memory to life in both virtual and physical forms, resulting in a unique exploration of the memory.



The goal was to investigate the usage of 3D scanning technologies. A Kinect V1 was employed as an economical and straightforward scanning device to capture objects such as the room and the individual performing the project. However, it has certain limitations, such as the inability to scan outdoor objects and the need for a constant power supply. The Display.land software was used to capture small organic items; the results were initially messy and required time to clean up, but eventually produced a well-formed 3D-scanned tree sample. The Kinect V1 scans could be equally messy. Nonetheless, both methods provided a convenient and efficient way to acquire data for this project.
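Cleaning a messy scan usually means discarding stray points that sit far from their neighbours. As a rough sketch of that clean-up step, the snippet below implements statistical outlier removal on a point cloud with NumPy; this is an illustration of the general technique, not the tool actually used in the project, and the thresholds and synthetic data are assumptions.

```python
import numpy as np

def remove_outliers(points, k=8, std_ratio=2.0):
    """Drop points whose mean distance to their k nearest neighbours
    is more than std_ratio standard deviations above the average."""
    # Brute-force pairwise distances (fine for small clouds;
    # use a KD-tree for real scan data).
    diff = points[:, None, :] - points[None, :, :]
    dists = np.linalg.norm(diff, axis=-1)
    dists.sort(axis=1)
    mean_knn = dists[:, 1:k + 1].mean(axis=1)  # skip self-distance (0)
    threshold = mean_knn.mean() + std_ratio * mean_knn.std()
    return points[mean_knn <= threshold]

# Illustrative: a dense cluster plus a few stray "scan noise" points.
rng = np.random.default_rng(0)
cloud = rng.normal(0.0, 0.05, size=(200, 3))   # scanned surface
noise = rng.uniform(5.0, 6.0, size=(5, 3))     # stray artefacts
cleaned = remove_outliers(np.vstack([cloud, noise]))
print(len(cleaned))  # stray artefacts removed
```

The same idea scales to full Kinect scans: compute a per-point neighbourhood distance, then cut everything beyond a statistical threshold before meshing or printing.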


The results

The project used 3D scanning technology for a pop-up exhibition, where visitors can also print their own memories or dreams. Engaging with the 3D sculptures and projection mapping creates an immersive experience that elicits emotional responses from the audience.

The end result is a 3D composite item for a digital story show. The sculpture, made from a tree and the creator's body, was meant to be printed and kept in a space. A synchronised point cloud animation would be projected onto the sculpture, offering spectators a visually immersive experience. For further development, a 1:1 model will be created and displayed in a dark room to provide a more immersive experience.

tree_picture_3 tree_picture_4

Produced By: Andrew Chang


Real-time visuals for fabric interactions.

Home Grown

Home Grown is a bio-based collection of objects created as alternative tools to interact with our digital products. Inspired by the lack of mindfulness regarding the sourcing and recycling of materials for technology, Home Grown has been sourced from local materials from the UK & Ireland and created using traditional craft techniques. It uses ambient technology as a visual language to uncover the relationship between these objects and the user, exploring how levels of care towards these objects within our daily interactions can create a deeper understanding and longevity within the everyday object.

This work explores local conductive and bio-based materials that could create speculative alternatives for the current multinational ingredient list within technologies today, replacing the metal components and interfaces whilst reducing the material, and the making process, to a localised area.

The project generates stories and thoughts on mindfulness towards the everyday objects in our homes, treating objects not as ‘us and them’ but as cohabitants, and bringing mindfulness into our daily interactions with the environment.

Essentially, Home Grown is not about generating a ‘hero’ solution or idea to save the planet, but about exploring alternative materials, methods and stories of interaction to create more mindful approaches towards the everyday object.


Material Language

A key development was the visual language: what would communication between the user and the object look like?

Sound interactions required extra components such as a speaker, whilst a written language seemed sterile compared to a visual language that would change colour and movement over time.

When it came to creating these material conversations between the object and the user, Emma tried to capture the movement of the sea, developed from her research and mark making, with the material's emotional ‘states’ changing and developing as the user interacted with the object over time.

homeGrown_pic_2 homeGrown_pic_3

Produced By: Emma Harriet Wright
Technical Support: Andrew Chang



Standalone VR experience


The Vitality District VR experience is a brief journey, presented at the Paris VR Summit, showcasing the potential of virtual environments to improve physical and mental health. Players engage in health-enhancing activities such as learning about fruits and flowers, solving a stone-balance puzzle, climbing exercises, and meditating. The goal is to introduce players to the potential of virtual environments for personal growth and health promotion.


Game screenshot


Game screenshot


Game screenshot


Game screenshot


Game screenshot


Paris VR Summit

Game DEV: Hammed Arowosegbe
Game & Environment Design / Model: Andrew Chang
Model: Cheslav Sukstul
UI: Oluwatobi