Half Cheetah is a series of artworks using Reinforcement Learning behaviours generated by quadruped and ‘half cheetah’ agent models in simulated 3D space. These behaviours are then transposed onto real human 3D scans and integrated into a real-time environment.
The dm_control model is a self-learning agent that learns to move its body in an ‘imagined’ simulated environment before those movements can be transferred to physical space. The model is run with parameters similar to those used in commercial robotics and military testing. The performance-optimisation process takes days and involves millions of simulation steps as the agent progresses from incapability to competence.
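For readers curious about the underlying system, the sketch below shows how the standard ‘cheetah run’ task from the DeepMind Control Suite (dm_control) is loaded and stepped. It is a minimal illustration only: the random-action policy is a placeholder, and the actual learning algorithm and training parameters used for the work are not specified here.

```python
# Minimal sketch: load dm_control's 'cheetah run' task and run one episode.
# The random-action policy stands in for the (unspecified) learning algorithm.
import numpy as np
from dm_control import suite

env = suite.load(domain_name="cheetah", task_name="run")
action_spec = env.action_spec()

time_step = env.reset()
total_reward = 0.0
while not time_step.last():
    # Placeholder policy: sample uniformly within the actuator limits.
    action = np.random.uniform(action_spec.minimum,
                               action_spec.maximum,
                               size=action_spec.shape)
    time_step = env.step(action)
    total_reward += time_step.reward
print("episode return:", total_reward)
```

In a full training run, the placeholder policy would be replaced by a reinforcement-learning agent whose behaviour improves over millions of such steps, as described above.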
The behaviours from these learning agents are then transposed onto 3D models of humans whose physical bodies have been scanned, reduced to an efficient file size and sold online as licensed assets, usually for use in games or advertising. The body motions are adapted where required to fit the human frame.
These combined behavioural elements are then wrapped in a custom-designed real-time dynamic environment, in which a second machine-learning model is turned inwards to analyse the generated movements. This second model breaks down the linear progression of the performance and re-orders the motion according to formal aesthetic goals, resulting in a displaced flow of body actions in space.
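The second model itself is not specified in this text, so the following is only an illustrative sketch of the general idea: motion clips exported from the trained agents are detached from their chronological (training) order and re-sequenced by a scoring function. The ‘aesthetic’ score used here is a hypothetical placeholder, not the model used in the work.

```python
# Illustrative sketch only: re-ordering motion clips by a placeholder score
# instead of their original training order.
import numpy as np

def aesthetic_score(clip: np.ndarray) -> float:
    # Hypothetical criterion: favour wide but smooth joint movement.
    amplitude = clip.max(axis=0) - clip.min(axis=0)
    jerkiness = np.abs(np.diff(clip, n=2, axis=0)).mean()
    return amplitude.mean() - jerkiness

# Each clip has shape (frames, joints); random data stands in for
# movements exported from the trained agents.
rng = np.random.default_rng(0)
clips = [rng.standard_normal((120, 6)) for _ in range(8)]

# Discard the linear progression and sequence the clips by score instead.
reordered = sorted(clips, key=aesthetic_score, reverse=True)
```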
Half Cheetah demo