We collaborated with artist Yinka Shonibare and the Royal Academy, leveraging new machine learning techniques to bring a fully immersive VR art exhibit to life.
As AI continues to evolve, its role in artistic applications has only just begun to be explored. For this experimental project, our team merged machine learning and virtual reality to generate a style-matched, immersive art experience. The VR art walkthrough piece was created for the Royal Academy’s high-profile From Life exhibition, a curation of works exploring how the human form has been represented by artists over the years.
As the basis of our experience, we replicated the 2D environment of a Neoclassical painting (Venus Presenting Helen To Paris, 1785) in a 3D space, allowing viewers to walk around and explore. The original painter, Gavin Hamilton, had come up in Yinka’s personal research: he once excavated a statue of the goddess Venus, the subject of Yinka’s recent work.
A VR version of Yinka’s ‘Townley Venus’ sculpture, a vibrantly modernised take on the classical statue, can be discovered by the viewer by walking through Hamilton’s scene to the courtyard beyond.
One pivotal goal of the experience was to match the original artwork’s overall aesthetic and replicate the artist’s unique style. To do this, we applied pioneering ‘Style Transfer’ techniques based on Deep Learning Neural Networks to reproduce Hamilton’s painting style and generate the unseen textures, such as the backs of the figures and occluded parts of the environment. The result is a fusion of old and new art that utilises tech both in the creation and viewing of the experience.
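The post does not detail the exact pipeline used, but style-transfer methods in this family (following Gatys et al.) typically work by matching the Gram matrices of convolutional feature maps, which capture an image's "style" independently of its layout. As a minimal, hypothetical sketch of that core idea, assuming feature maps have already been extracted from a pretrained network:

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (channels, height, width) feature map.

    Channel-to-channel correlations summarise texture/style while
    discarding the spatial arrangement of the content.
    """
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (c * h * w)

def style_loss(generated_features, style_features):
    """Mean squared difference between the two Gram matrices.

    Minimising this loss (by gradient descent on the generated image)
    pushes the generated image toward the reference painting's style.
    """
    diff = gram_matrix(generated_features) - gram_matrix(style_features)
    return float(np.mean(diff ** 2))

# Toy stand-ins for real convolutional feature maps
rng = np.random.default_rng(0)
style_feats = rng.standard_normal((8, 16, 16))
generated_feats = rng.standard_normal((8, 16, 16))

print(style_loss(style_feats, style_feats))  # identical style -> 0.0
print(style_loss(generated_feats, style_feats))  # mismatched style -> positive
```

In a full pipeline this loss would be summed over several layers of a pretrained network and combined with a content loss; the names and shapes above are illustrative only.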