The third and last XR_Unites OPEN CALL is now open!
Two info sessions on the Open Call with Maja Stark and one techie from the XR_Unites team are planned via Zoom (no registration required) to present the call and answer all your questions:
The body and embodiment in virtual reality (VR) play a central role in XBPMMM. Embodiment is understood to mean that the environment is perceived not only by the brain but also by the body: body and mind cannot be clearly separated. For the VR experience, this means that both the physical body and the virtual avatar are significantly involved in the perception of one's own body and of the virtual world.
In XBPMMM, the body as well as the body images and ideals of Western society are reflected on different levels: in theory, on the level of emotions, and in materiality.
The avatar in XBPMMM will change its shape in the course of the game – and it already underwent a metamorphosis during the conceptual development, from a standardized robot to a bubble-shaped amorphous form. The beautiful bubble body is preserved here for posterity – it may yet have to give way to another body shape in the end.
To be continued …
Mixed reality installation with HoloLens 2
Collaboration by reVerb, Chitrasena Dance Company, Sri Lanka,
and XR_Unites, HTW Berlin
Media Theatre, Humboldt-Universität, Berlin, 2021
Dance: Thaji Dias
Choreography: Heshma Wignaraja
Dramaturgy: Susanne Vincenz
Sound: Mareike Trillhaas
3D scans & management: Umadanthi Dias
Developers: Christoph Holtmann, Laura Magdaleno Amaro and Ekaterina Losik
Video & Scenography: Isabel Robson
Within XR_Unites, funded by the European Regional Development Fund (ERDF) in the INP-II program. The work is also supported by the national performance network (npn) – stepping out, funded by the Federal Government Commissioner for Culture and the Media as part of the NEUSTART KULTUR initiative, Aid Program Dance.
Exciting! A first meeting took place on 7 October, which mainly served to get to know each other and make initial plans for cooperation. The next step will be a joint hackathon at HTW Berlin on 27 October!
From 3 September to 5 October 2021, the mixed reality installation TRANSIENT EXPOSURE can be experienced in Berlin-Mitte! It is the result of the collaboration between the INKA project XR_Unites at the HTW Berlin, the artists’ collective reVerb and the Chitrasena Dance Company – the result of XR_Unites’ first OPEN CALL.
The performance venue is the Media Theatre of the Faculty of Humanities and Social Sciences at the Humboldt-Universität Berlin.
Two to three visitors at a time could enter the installation with the HoloLens mixed reality glasses and immerse themselves in a 15-minute multimedia experience: in the physical world, the installation was consciously kept very minimal, with rattan blinds, a fan and a large metal box. Archive material from the Chitrasena Dance Company, in combination with a 3D sound collage from Colombo and partly interactive 2D and 3D elements, formed the digital layer of the installation, which thrilled many visitors.
»How seemingly simple and natural analogue and digital space interpenetrate here, how skilfully you guide the visitors, how lovingly and yet astutely you curate and stage the archive material – it all makes you want more!« wrote one visitor shortly after the experience.
The development with the VFX Graph is part of our digital media production. In TRANSIENT EXPOSURE, the graph is combined with Kinect recordings of the dancer Thaji Chitrasena. The VFX Graph is a powerful free tool from Unity that is quite easy to learn even if you are not a computer scientist – especially if you already know software like Blender or Bolt, because the VFX Graph is likewise node-based. That means you don't have to work with code – instead, the user interface shows blocks (nodes) that can be connected to each other via edges. And what's the point of all this?
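The node idea can be sketched in a few lines of Python: a toy pipeline in which each "block" is a function and each "edge" passes the particle state on to the next block. The block names and particle attributes below are invented for illustration – they are not actual VFX Graph nodes.

```python
# Toy node graph: each "block" is a function, each "edge" passes the
# particle state on to the next block -- the same idea behind the VFX
# Graph's visual UI. Block names/attributes are invented for illustration.

def spawn(state, count):
    # Create `count` particles at the origin.
    return [{"pos": (0.0, 0.0, 0.0)} for _ in range(count)]

def set_color(state, color):
    # Give every particle in the incoming state a color attribute.
    for particle in state:
        particle["color"] = color
    return state

# "Wiring up" the graph: an ordered list of (block, parameters) pairs.
graph = [(spawn, {"count": 3}), (set_color, {"color": "cyan"})]

state = None
for block, params in graph:
    state = block(state, **params)

print(len(state), state[0]["color"])  # 3 cyan
```

In the real VFX Graph this wiring happens visually, by dragging edges between blocks, and the particle update runs on the GPU.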
As a depth camera, the Kinect generates data that are not visible at first. The VFX Graph, on the other hand, provides a visual particle system whose properties – such as the shape, color or number of particles – can be edited via the nodes. This creates an effect through which the Kinect data can be made visible, e.g. as a pixel cloud.
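What "making the data visible" means can be illustrated with a small Python sketch: every valid depth pixel is back-projected into 3D space and becomes one particle of the cloud. The camera intrinsics below are made-up values, not the real Azure Kinect calibration.

```python
def depth_to_points(depth_mm, fx, fy, cx, cy):
    """Back-project a depth image (in millimetres) into a 3D point cloud.

    Each valid pixel becomes one particle in camera space -- roughly what
    happens when the Kinect data are rendered as a pixel cloud. The
    intrinsics fx/fy/cx/cy are illustrative, not a real calibration.
    """
    points = []
    for v, row in enumerate(depth_mm):
        for u, d in enumerate(row):
            if d <= 0:            # 0 means "no reading" on the sensor
                continue
            z = d / 1000.0        # millimetres -> metres
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points

# A tiny fake 4x4 depth frame: a single pixel at 2 m, the rest empty.
depth = [[0, 0, 0, 0],
         [0, 0, 0, 0],
         [0, 2000, 0, 0],
         [0, 0, 0, 0]]
points = depth_to_points(depth, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
print(points)  # one particle: [(-0.004, 0.0, 2.0)]
```

A real frame has hundreds of thousands of pixels, which is why the VFX Graph does this work on the GPU rather than pixel by pixel on the CPU.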
An avatar serves as the mediator between the VFX Graph and the Kinect data: it is synchronized with both the Kinect data and the particle system of the VFX Graph. For the final effect to resemble the dancer, the avatar also has to be edited so that its anatomy roughly corresponds to Thaji's.
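A minimal sketch of that mediation step, with two loudly labeled assumptions: the joint names are invented (the real Azure Kinect Body Tracking SDK uses its own joint enumeration), and a single uniform scale factor stands in for the anatomical editing described above.

```python
# Hypothetical mapping from Kinect joint names to the avatar's bones;
# the real Azure Kinect Body Tracking SDK uses its own joint enumeration.
KINECT_TO_AVATAR = {
    "SPINE_CHEST": "chest",
    "ELBOW_LEFT": "elbow_l",
    "HAND_LEFT": "hand_l",
}

def retarget(kinect_joints, scale):
    """Map tracked joint positions onto the avatar's skeleton.

    `scale` is a crude stand-in for the editing step: adapting the
    avatar's proportions so they roughly match the dancer's anatomy.
    """
    return {
        KINECT_TO_AVATAR[name]: tuple(coord * scale for coord in pos)
        for name, pos in kinect_joints.items()
        if name in KINECT_TO_AVATAR            # drop unmapped joints
    }

frame = {"ELBOW_LEFT": (0.2, 1.1, 0.5), "HEAD": (0.0, 1.7, 0.5)}
result = retarget(frame, scale=0.9)
print(result)
```

In the actual pipeline the synchronization runs per frame, and the particle system then samples the avatar's surface instead of reading raw joint positions.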
In TRANSIENT EXPOSURE, we work with the Azure Kinect, a sensor bar that contains advanced AI sensors, a depth sensor and a spatial microphone array. The device can track body movements (keyword: body tracking) and save them as a data set. The AI combines all collected data into a rough skeleton model.
To enable our team members in Colombo, Sri Lanka, to easily make recordings with the Azure Kinect for Unity, a recorder is being developed in XR_Unites which currently includes the following features:
- Capturing motion via the Kinect's body tracking feature
- Transferring these movements to a human 3D model
- Saving the motion as an animation of the 3D model in a separate file
- Simultaneously recording audio via a microphone
- Saving the audio in a separate (.wav) file
- Specifying a concrete recording time
- Playing back audio and animation simultaneously
- Loading and deleting saved recordings via a menu
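Structurally, the feature list might look roughly like the following Python sketch. The real recorder is being built for Unity, so the class name, file layout and formats here are assumptions for illustration only; the one deliberate match with the list above is that animation and audio end up in separate files.

```python
import json
import os
import tempfile
import wave

class KinectRecorder:
    """Sketch of the recorder's feature set (hypothetical names/layout;
    the actual tool is a Unity application, not this Python class)."""

    def __init__(self, out_dir):
        self.out_dir = out_dir

    def save_take(self, name, animation_frames, audio_samples, sample_rate=16000):
        # Animation and audio go to *separate* files, as in the feature list.
        anim_path = os.path.join(self.out_dir, name + ".anim.json")
        wav_path = os.path.join(self.out_dir, name + ".wav")
        with open(anim_path, "w") as f:
            json.dump(animation_frames, f)
        with wave.open(wav_path, "wb") as w:
            w.setnchannels(1)            # mono
            w.setsampwidth(2)            # 16-bit PCM
            w.setframerate(sample_rate)
            w.writeframes(audio_samples)
        return anim_path, wav_path

    def list_takes(self):
        # The "menu": every saved animation file corresponds to one take.
        return sorted(p[:-len(".anim.json")] for p in os.listdir(self.out_dir)
                      if p.endswith(".anim.json"))

    def delete_take(self, name):
        for suffix in (".anim.json", ".wav"):
            path = os.path.join(self.out_dir, name + suffix)
            if os.path.exists(path):
                os.remove(path)

out_dir = tempfile.mkdtemp()
rec = KinectRecorder(out_dir)
# One animation frame (invented joint name) plus 160 frames of silent audio.
rec.save_take("take1", [{"t": 0.0, "ELBOW_LEFT": [0.2, 1.1, 0.5]}],
              b"\x00\x00" * 160)
takes = rec.list_takes()
print(takes)  # ['take1']
rec.delete_take("take1")
print(rec.list_takes())  # []
```

Features like the fixed recording time and synchronized playback live in the Unity runtime and have no counterpart in this file-handling sketch.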
TRANSIENT EXPOSURE is an artistic experiment with the HoloLens 2, the latest mixed reality glasses from Microsoft, which some may already know from artistic works such as BLOOM: Open Space by Brian Eno and Peter Chilvers or Concrete Storm by Studio Drift (both 2018). Usually it is used in industry – for example, in manual production, by digitally overlaying additional information and instructions. So far it has not been used much in art, which may also be due to its high price of over €3,000 – however, Microsoft sometimes lends it out for artistic projects, as in the two cases mentioned above. The HoloLens 2 features 6 degrees of freedom, spatial and surface recognition, as well as gesture, hand and speech recognition. It is also suitable for people who wear glasses. In XR_Unites, the mixed reality application for TRANSIENT EXPOSURE is being developed with the Unity game engine.