I wish to apply my broad experience and proven technical and creative skills in a dynamic work environment that challenges me to continue learning in new and interesting ways.
Largely self-taught, I am a fast and enthusiastic learner. My interests lie in new and evolving technologies, VR/AR, computer graphics, human-computer interaction, and information discovery using a wide range of tools, including game engines and libraries, interaction devices, spatial data, and 2D/3D data visualisation. I'm looking forward to exploring where these technologies can take us, while having fun doing it.
Visualisation and eResearch (ViseR)•http://viser.net.au
Institute for Future Environments (IFE)•Queensland University of Technology (QUT)
March 2010 – Present
IFE•Biogeosciences•High Performance Computing and Research Support (HPC)•QUT
January 2008 – April 2013
Mathematical Sciences•QUT
July 2009 – December 2009
Mathematical Sciences•HPC•QUT
January 2006 – December 2007
Brisbane North Point Institute of TAFE
January 2003 – December 2006
Southbank Institute of TAFE•Online Learning Centre
June 2005 – December 2005
2009•Australian Institute of Management
2004 – 2007•Queensland University of Technology
2005•Martin College Brisbane
2001 – 2002•Brisbane North Institute of TAFE
1998 – 1999•Yeronga Institute of TAFE
An Android/GearVR application for expert elicitation, crowd-sourcing information for statistical models to better monitor the health of the Great Barrier Reef. An online map is used to locate geo-tagged 360° images; a QR code is then scanned with the mobile camera to load the 360° image and begin an elicitation survey about the selected location. Results are then uploaded to a central database.
Developed for a project called Monitoring Through Many Eyes, a collaboration between QUT, QLD local government, CRC for Spatial Information and ARC Centre of Excellence for Mathematical & Statistical Frontiers.
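A minimal sketch of the 360° image loading step, assuming an equirectangular image URL (for example, one resolved from a scanned QR code) and a skybox material using a panoramic shader. The class, field and shader property names are illustrative, not the project's actual code, and the UnityWebRequest.Result check assumes a recent Unity version.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Illustrative sketch only: downloads an equirectangular 360° image and
// applies it to the skybox so the user is surrounded by the panorama.
public class PanoramaLoader : MonoBehaviour
{
    [SerializeField] private Material skyboxMaterial; // material using a panoramic/equirect shader

    public void LoadPanorama(string imageUrl)
    {
        StartCoroutine(DownloadAndApply(imageUrl));
    }

    private IEnumerator DownloadAndApply(string imageUrl)
    {
        using (UnityWebRequest request = UnityWebRequestTexture.GetTexture(imageUrl))
        {
            yield return request.SendWebRequest();
            if (request.result != UnityWebRequest.Result.Success)
            {
                Debug.LogError("Failed to load 360 image: " + request.error);
                yield break;
            }

            Texture2D panorama = DownloadHandlerTexture.GetContent(request);
            skyboxMaterial.SetTexture("_MainTex", panorama); // property name depends on the skybox shader
            RenderSettings.skybox = skyboxMaterial;
        }
    }
}
```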
An early investigation into control panel interaction in VR using hand-tracked controllers, with Oculus Rift DK1 and DK2 headsets and Razer Hydra controllers. Included humanoid animation blending and IK.
Demonstrated to the public at QUT's Robotronica event in 2015, and briefly featured on ABC's 7:30 program (from 1:50).
A web-based portal for QUT called QUT VR, delivering a range of 360° imagery, videos, and other university-related VR content to a range of audiences, including staff, current and future students and the general public.
Working in a small team, my role included developing flocking behaviours for a VR reef scene, video editing and rendering in Premiere, assisting with evaluation of the Ionic JavaScript framework, and deployment and testing of the application on various mobile devices.
An early investigation into multi-user VR using Oculus Rift DK1 and DK2, and Razer Hydra controllers. This example shows a university building model with the ability to direct attention using a shared 'laser' pointer.
An Android/GearVR application and fly-through video showcasing the design of a new university building at QUT, with a focus on proposed large display walls.
Developed in Unity using an architectural model exported from Autodesk Revit. Navigation was implemented using gaze-based teleporting between pre-defined viewpoints.
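A minimal sketch of a gaze-based teleport of this kind, assuming viewpoint markers placed on their own physics layer and a dwell-time trigger; the field names and dwell time are assumptions rather than the project's actual implementation.

```csharp
using UnityEngine;

// Illustrative sketch: teleport the camera rig to a pre-defined viewpoint
// after the user has gazed at its marker for a short dwell time.
public class GazeTeleporter : MonoBehaviour
{
    [SerializeField] private Transform cameraRig;       // the rig that is moved on teleport
    [SerializeField] private float dwellSeconds = 2f;   // how long the user must gaze at a marker
    [SerializeField] private LayerMask viewpointLayer;  // layer containing viewpoint markers

    private float gazeTimer;
    private Transform gazedViewpoint;

    private void Update()
    {
        // Cast a ray straight ahead from the headset camera.
        Ray gazeRay = new Ray(Camera.main.transform.position, Camera.main.transform.forward);

        if (Physics.Raycast(gazeRay, out RaycastHit hit, 100f, viewpointLayer))
        {
            if (hit.transform != gazedViewpoint)
            {
                gazedViewpoint = hit.transform; // started gazing at a new marker
                gazeTimer = 0f;
            }

            gazeTimer += Time.deltaTime;
            if (gazeTimer >= dwellSeconds)
            {
                // Teleport the rig to the viewpoint, keeping its current orientation.
                cameraRig.position = gazedViewpoint.position;
                gazeTimer = 0f;
            }
        }
        else
        {
            gazedViewpoint = null;
            gazeTimer = 0f;
        }
    }
}
```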
A cross-disciplinary project spanning software engineering, statistical modelling, web development and graphic design. My role involved working with a small team on framework, data model and database design, and VR expert elicitation integration, including the use of QR codes for initiating survey sessions.
Developed for a project called Monitoring Through Many Eyes, a partnership between QUT, QLD local government, CRC for Spatial Information and ARC Centre of Excellence for Mathematical & Statistical Frontiers.
A prototype desktop, touchscreen and WebVR application for precinct planning and interactive wayfinding at QUT. The precinct planning application was developed for high-level campus layout, decision making and communication, and included TUIO-based touch screen interaction.
The WebGL wayfinding application used Unity's NavMesh functionality to highlight the shortest route between selected locations. A basic HoloLens demo was also developed as a learning exercise.
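A minimal sketch of the NavMesh route-highlighting idea, assuming a baked NavMesh and a LineRenderer on the same GameObject; the component and field names are illustrative only.

```csharp
using UnityEngine;
using UnityEngine.AI;

// Illustrative sketch: compute the shortest NavMesh path between two points
// and draw it as a polyline over the campus model.
[RequireComponent(typeof(LineRenderer))]
public class RouteHighlighter : MonoBehaviour
{
    private LineRenderer line;

    private void Awake()
    {
        line = GetComponent<LineRenderer>();
    }

    public void ShowRoute(Vector3 start, Vector3 destination)
    {
        NavMeshPath path = new NavMeshPath();
        if (NavMesh.CalculatePath(start, destination, NavMesh.AllAreas, path)
            && path.status == NavMeshPathStatus.PathComplete)
        {
            // The path corners form the highlighted route.
            line.positionCount = path.corners.Length;
            line.SetPositions(path.corners);
        }
        else
        {
            line.positionCount = 0; // no valid route found
        }
    }
}
```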
I developed GVS as an interactive visualisation tool for surface and groundwater data/processes and complex subsurface geology. It can animate and interrogate 3D conceptual models and spatial data, encouraging an understanding of groundwater systems and their interaction with the environment.
Built on VTK, GVS also supports Leap Motion input, WebSocket-based remote web rendering and interaction in an HTML canvas for Federation University's VVG website, touchscreen and distributed rendering on The Cube at QUT, and IR-tracked hand control for camera navigation using Oblong's g-speak system.
Various groundwater and geological visualisations produced using the Groundwater Visualisation System (GVS). Implemented using a wide range of data including borehole logs, GIS layers and groundwater simulation outputs.
Projects involved collaboration and consultation with geology/hydrogeology experts and stakeholders from across academia, industry, government and community groups.
Remote interactive rendering over a network involves a rendering server that takes user commands from a remote 'light' client application, renders frames offscreen and sends the updated image back. The client performs no 3D rendering itself, instead displaying image frames from the server and capturing user interactions to send back to it.
This example shows an implementation for Federation University's VVG website, utilising GVS-based rendering, WebSocket communication, and an HTML canvas for client interaction and display.
Previous prototypes included UDP/TCP socket communication, a Java-based client, and rendering from VTK, OpenSceneGraph and OpenGL-based server implementations.
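The sketch below illustrates the shape of the command → offscreen render → frame loop on the server side. It is written in C# purely for illustration: the actual GVS/VVG server is VTK-based, and the endpoint, message format and rendering stub here are assumptions.

```csharp
using System;
using System.Net;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

// Illustrative sketch of a remote rendering server loop over WebSockets.
// Handles one client at a time for simplicity.
class RemoteRenderServer
{
    static async Task Main()
    {
        var listener = new HttpListener();
        listener.Prefixes.Add("http://localhost:8080/render/"); // placeholder endpoint
        listener.Start();

        while (true)
        {
            HttpListenerContext context = await listener.GetContextAsync();
            if (!context.Request.IsWebSocketRequest) { context.Response.Close(); continue; }

            WebSocket socket = (await context.AcceptWebSocketAsync(null)).WebSocket;
            var buffer = new byte[4096];

            while (socket.State == WebSocketState.Open)
            {
                // 1. Receive an interaction command (e.g. a camera update) from the light client.
                var result = await socket.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);
                if (result.MessageType == WebSocketMessageType.Close) break;
                string command = Encoding.UTF8.GetString(buffer, 0, result.Count);

                // 2. Render a frame offscreen for the updated view (stubbed here).
                byte[] frameJpeg = RenderOffscreen(command);

                // 3. Send the encoded frame back; the client simply draws it into a canvas.
                await socket.SendAsync(new ArraySegment<byte>(frameJpeg),
                                       WebSocketMessageType.Binary, true, CancellationToken.None);
            }
        }
    }

    // Placeholder for the actual offscreen renderer (VTK-based in GVS).
    static byte[] RenderOffscreen(string command) => new byte[0];
}
```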
A story-focused spatial data platform, the Cube Globe presents a Cesium-based interactive globe, bringing together a range of data visualisation techniques, spatial maps, 3D models, and multimedia.
Designed and developed as part of a small team; my role included development of the backend CMS (data model, database and website), spatial data manipulation and styling, GeoServer setup and configuration, Google Charts creation and rendering, and various 3D visualisation techniques on the globe.
Originally for display on The Cube at QUT, the platform now supports desktop, mobile, and large-scale display systems.
I developed a prototype web-based distributed interface for interacting with Microsoft's World Wide Telescope (WWT), for display on The Cube at QUT.
Based around a networked 'jukebox' mechanism, users select a planet, moon, panorama or tour, which is added to a timed queue. A single user selection is active at a time; it transitions the WWT scene on the main projector wall and provides an interface to the active user, allowing touchscreen interaction to control the main visualisation.
Developed using HTML5 canvas and JavaScript, and interacting with WWT over a REST-based interface.
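A minimal sketch of the 'jukebox' queue mechanism described above. The real implementation was HTML5/JavaScript talking to WWT over REST; this C# sketch only illustrates the queue-and-timed-slot logic, and the 90-second slot is an assumption.

```csharp
using System;
using System.Collections.Generic;

// Illustrative sketch: selections are queued, one is active at a time,
// and control passes to the next queued selection when the slot expires.
class JukeboxQueue
{
    private readonly Queue<(string userId, string sceneId)> pending = new Queue<(string, string)>();
    private (string userId, string sceneId)? active;
    private DateTime activeSince;
    private readonly TimeSpan slot = TimeSpan.FromSeconds(90); // assumed slot length

    public void Enqueue(string userId, string sceneId) => pending.Enqueue((userId, sceneId));

    // Called periodically; returns the selection that should drive the main display.
    public (string userId, string sceneId)? Tick(DateTime now)
    {
        bool expired = active.HasValue && now - activeSince >= slot;

        if ((!active.HasValue || expired) && pending.Count > 0)
        {
            active = pending.Dequeue(); // promote the next queued selection
            activeSince = now;          // the active user now controls the WWT scene
        }
        else if (expired)
        {
            active = null;              // nothing queued: release control
        }
        return active;
    }
}
```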
Working as part of a small team, we developed a projection mapping presentation on Old Government House for the Liberact 2016 conference. Incorporating user interaction through a network of SICK LiDAR scanners, the mapping presented a sequence of interactive scenes accompanied by a custom soundtrack.
Developed using Unity and an OpenCV-based projection mapping implementation, and designed in collaboration with Deloitte Digital.
An investigation into flocking and steering behaviours in Unity, using the UnitySteer library. This example shows an underwater scene with a range of animated species, each with different flocking behaviours.
A networked prototype was also developed to run across a large number of screens on The Cube at QUT.
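A minimal, generic boids sketch (separation, alignment and cohesion) to illustrate the kind of flocking behaviour involved. This is not UnitySteer's API; the weights, neighbour radius and brute-force neighbour search are assumptions for illustration only.

```csharp
using UnityEngine;

// Illustrative boids sketch: each fish steers away from close neighbours,
// matches their heading, and moves toward the local group centre.
public class SimpleBoid : MonoBehaviour
{
    public float speed = 2f;
    public float neighbourRadius = 5f;
    public float separationWeight = 1.5f;
    public float alignmentWeight = 1f;
    public float cohesionWeight = 1f;

    private void Update()
    {
        Vector3 separation = Vector3.zero, alignment = Vector3.zero, cohesion = Vector3.zero;
        int neighbours = 0;

        foreach (SimpleBoid other in FindObjectsOfType<SimpleBoid>())
        {
            if (other == this) continue;
            Vector3 offset = other.transform.position - transform.position;
            if (offset.magnitude > neighbourRadius) continue;

            separation -= offset / Mathf.Max(offset.sqrMagnitude, 0.01f); // steer away from close neighbours
            alignment += other.transform.forward;                         // match neighbours' heading
            cohesion += other.transform.position;                         // move toward the local centre
            neighbours++;
        }

        if (neighbours > 0)
        {
            alignment /= neighbours;
            cohesion = (cohesion / neighbours) - transform.position;

            Vector3 steer = separation * separationWeight
                          + alignment * alignmentWeight
                          + cohesion * cohesionWeight;
            if (steer.sqrMagnitude > 0.001f)
                transform.forward = Vector3.Slerp(transform.forward, steer.normalized, Time.deltaTime);
        }

        transform.position += transform.forward * speed * Time.deltaTime;
    }
}
```

A real implementation would cache neighbours (or use spatial partitioning) rather than calling FindObjectsOfType every frame.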
A networked prototype of fish flocking and steering implemented in Unity using the UnitySteer library, and utilising Unity's networking system. Developed to run across a large number of nodes/screens on The Cube at QUT. Includes synchronised fish positions and animation frames.
The Cube is a networked cluster: each pair of portrait-oriented touch screens shown below is driven by a separate node in the cluster, and the upper projection space runs on its own node on the network.
This project, called Sensing SEC, enables live and historical data from a range of building sensors and other sources to be viewed and downloaded in CSV or JSON format via a REST interface. Data sources include water, gas and energy usage, solar yield from photovoltaic (PV) panels, structural vibration, weather data, room temperatures, crowd counting and even user touch data from the touch screens on The Cube at QUT.
I developed integration of PV, structural vibration, CCTV-based crowd counting and touch screen data, which included data collection/management using the Splunk platform, and REST interfaces implemented using C# and the ASP.NET Web API.
A website was also developed to showcase a range of demos and visualisations using the data, and a Unity-based prototype demonstrated its application to 3D visualisation.
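A hedged sketch of what a Sensing SEC-style endpoint might look like with ASP.NET Web API. The route, model and data-access stub are assumptions (the real data sits behind the Splunk platform); JSON output comes from Web API's default content negotiation, and CSV would require a custom MediaTypeFormatter (not shown).

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web.Http;

// Illustrative data model for a single sensor reading.
public class SensorReading
{
    public DateTime Timestamp { get; set; }
    public string Sensor { get; set; }
    public double Value { get; set; }
}

// Illustrative REST endpoint: GET api/readings?sensor=pv_yield&start=...&end=...
public class ReadingsController : ApiController
{
    public IEnumerable<SensorReading> Get(string sensor, DateTime start, DateTime end)
    {
        // Placeholder query; a real implementation would query the Splunk-backed store.
        return FakeStore.Query(sensor, start, end)
                        .OrderBy(r => r.Timestamp);
    }
}

// Stand-in for the actual data access layer.
static class FakeStore
{
    public static IEnumerable<SensorReading> Query(string sensor, DateTime start, DateTime end)
        => Enumerable.Empty<SensorReading>();
}
```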
A prototype LiDAR point renderer I developed in Unity. It utilises an octree LOD structure in which nodes are dynamically loaded and unloaded based on the camera frustum, allowing interactive rendering of hundreds of millions of points or more. Rendering is achieved via a custom shader which uses Unity compute buffers.
LiDAR data stored in LAS format is read and preprocessed using the LASlib library. Points are segmented into octree nodes and stored as individual files in a simple, custom binary format. The Unity interactive renderer then loads and renders the nodes on demand based on user camera interaction.
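A minimal sketch of drawing a single octree node's points with a compute buffer and a procedural draw call, assuming a point shader that reads a StructuredBuffer of positions. The shader property name, point layout and Graphics.DrawProceduralNow call (recent Unity versions) are assumptions, not the renderer's actual code.

```csharp
using UnityEngine;

// Illustrative sketch: one octree node's points uploaded to a compute buffer
// and drawn with a custom point shader.
public class PointCloudNodeRenderer : MonoBehaviour
{
    [SerializeField] private Material pointMaterial; // shader reads StructuredBuffer<float3> _Points

    private ComputeBuffer pointBuffer;
    private int pointCount;

    // Called when a node becomes visible in the camera frustum.
    public void LoadNode(Vector3[] points)
    {
        ReleaseBuffer();
        pointCount = points.Length;
        pointBuffer = new ComputeBuffer(pointCount, sizeof(float) * 3);
        pointBuffer.SetData(points);
        pointMaterial.SetBuffer("_Points", pointBuffer);
    }

    private void OnRenderObject()
    {
        if (pointBuffer == null) return;
        pointMaterial.SetPass(0);
        // Each vertex index looks up its position from the buffer in the vertex shader.
        Graphics.DrawProceduralNow(MeshTopology.Points, pointCount, 1);
    }

    // Called when the node leaves the frustum or its LOD is no longer needed.
    public void ReleaseBuffer()
    {
        pointBuffer?.Release();
        pointBuffer = null;
        pointCount = 0;
    }

    private void OnDestroy() => ReleaseBuffer();
}
```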
You can contact me using the details below, or use the form ...