Two info sessions on the Open Call with Maja Stark and one techie from the XR_Unites team are planned via Zoom (no registration required) to present the call and answer all your questions:
The third and last XR_Unites OPEN CALL is now open!
The body and embodiment in virtual reality (VR) play a central role in XBPMMM. Embodiment here means that the environment is perceived not only by the brain but also by the body: body and mind cannot be clearly separated. For the VR experience, this implies that both the physical body and the virtual avatar are significantly involved in the perception of one's own body and of the virtual world.
In XBPMMM, the body as well as the body images and ideals of Western society are reflected on different levels: in theory, on the level of emotions, and in materiality.
The avatar in XBPMMM will change its shape in the course of the game – and it has already undergone a metamorphosis during the conceptual development, from a standardized robot to a bubble-shaped amorphous form. The beautiful bubble body is preserved for now – but it may yet have to give way to another body shape in the end.
To be continued …
The XBPMMM meeting on 01.12.21 started with a guest presentation by AURORA developer and media computer scientist Leonid Barsht about locomotion, interaction and user interface in virtual reality.
During the following discussion it became clear that we will probably test some VR applications as references for locomotion and interaction in the near future – we will keep you posted! For XBPMMM a combination of different locomotion and interaction possibilities (possibly depending on the level of the multiplayer) is planned.
And traraaaa: At this meeting the Unity project was restructured based on the preliminary work of Janne and Anton. That means that now – parallel to the development of the storyboard – we can start with the development!
The focus of storyboard and development is first on level 0, where players arrive, and on the third and thus final scene – both in combination with the softrobots and the MQTT protocol. The goal is to conceive and develop the (browser-based) WebGL and VR multiplayer in parallel.
At this meeting, our new expert board member Sebastian Keppler presented the BMBF-funded research project VitraS, in which he is involved as a project collaborator (lead at HTW: Prof. Habakuk Israel, also expert on the board of XR_Unites).
The focus of VitraS is on virtual reality therapy by stimulating modulated body perception, for example in obesity patients – it is thus a true embodiment project in a medical context. Based on Sebastian’s talk, a mutually fruitful exchange developed at the intersection of art, humanities, informatics and health.
Common topics included normative body images and the near-absence of »marginal body forms« in the digital world, the facial expressions of avatars, counter-gendered hand models, body awareness, and the potential of the mirror – a symbol of self-knowledge in visual art – to mediate the perception of one's own (avatar) body. It was also discussed that body images are always mediated by media – a fact of enormous relevance for body perception in times of Instagram face filters and remote conferencing, and one that reinforces the often unhealthy orientation towards idealized norms in contemporary Western society.
VitraS is a cooperation of HTW Berlin with the University of Würzburg (lead), University of Bielefeld, SRH University of Health Gera, The Captury GmbH, TU Munich, brainboost GmbH, and CBMI.
We thank Sebastian Keppler for the great lecture and exchange!
For more info on the VitraS project, see here.
XBPMMM (AT) touches on a huge range of exciting computer science areas – most of which have to do with the experience of embodiment in VR. On 10/27/21, the second meeting on this took place at the Research Center for Culture and Informatics (FKI) at HTW Berlin. The artists shared their experiences with the MQTT protocol, we played the already developed WebGL multiplayer, Janne Kummer brought along softrobot forms cast in silicone, and skills, responsibilities and necessary technologies were important topics.
Potential work areas for further development:
- Cross Platform Development (XR)
- Multiplayer in WebGL and VR (bring together)
- Integration of MQTT on different hardware and software and the interpolation of data between devices
- Physicality and locomotion in VR
- Deformability of meshes (avatar body)
- Materials, textures (size, scaling, export from Blender)
- Handling light in Unity and WebGL
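One of the work areas above – MQTT on different hardware and the interpolation of data between devices – can be sketched in a few lines of Python. The topic name in the comment and the message format are illustrative assumptions, not the actual XBPMMM protocol; the sketch only shows the interpolation idea: positions that arrive as discrete network samples are smoothed between updates.

```python
def lerp(a, b, t):
    """Linear interpolation between two values for t in [0, 1]."""
    return a + (b - a) * t

class PositionInterpolator:
    """Smooths positions that arrive as discrete (timestamp, xyz) samples,
    e.g. from a hypothetical MQTT topic like 'xbpmmm/avatar/1/position'."""

    def __init__(self):
        self.prev = None   # older sample: (timestamp, (x, y, z))
        self.last = None   # newest sample

    def on_message(self, timestamp, position):
        # Called whenever a new sample arrives (e.g. from an MQTT callback).
        self.prev, self.last = self.last, (timestamp, position)

    def sample(self, now):
        # Returns the interpolated position at time `now`.
        if self.last is None:
            return None                # nothing received yet
        if self.prev is None:
            return self.last[1]        # only one sample so far
        t0, p0 = self.prev
        t1, p1 = self.last
        if t1 == t0:
            return p1
        # Clamp so we never extrapolate past the newest sample.
        t = min(max((now - t0) / (t1 - t0), 0.0), 1.0)
        return tuple(lerp(a, b, t) for a, b in zip(p0, p1))
```

In a real setup the `on_message` call would sit inside the MQTT client's receive callback, while the render loop calls `sample()` every frame – so each device sees smooth motion even when messages arrive at a much lower rate.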
Mixed reality installation with HoloLens 2
Collaboration by reVerb, Chitrasena Dance Company, Sri Lanka,
and XR_Unites, HTW Berlin
Media Theatre, Humboldt-Universität, Berlin, 2021
Dance: Thaji Dias
Choreography: Heshma Wignaraja
Dramaturgy: Susanne Vincenz
Sound: Mareike Trillhaas
3D scans & management: Umadanthi Dias
Developers: Christoph Holtmann, Laura Magdaleno Amaro
and Ekaterina Losik
Video & Scenography: Isabel Robson
Within XR_Unites, funded by the European Regional Development Fund (ERDF) in the INP-II program. It is also supported by the national performance network (npn) – stepping out, funded by the Federal Government Commissioner for Culture and the Media as part of the initiative NEUSTART KULTUR, Aid Program Dance.
Exciting! A first meeting took place on 7 October, which mainly served to get to know each other and make initial plans for cooperation. The next step will be a joint hackathon at HTW Berlin on 27 October!
From 3 September to 5 October 2021, the mixed reality installation TRANSIENT EXPOSURE can be experienced in Berlin-Mitte! It is the result of the collaboration between the INKA project XR_Unites at the HTW Berlin, the artists’ collective reVerb and the Chitrasena Dance Company – the result of XR_Unites’ first OPEN CALL.
The performance venue is the Media Theatre of the Faculty of Humanities and Social Sciences at the Humboldt-Universität Berlin.
Two to three visitors at a time can enter the installation with the mixed reality glasses HoloLens and immerse themselves in a 15-minute multimedia experience: in the physical world, the installation is consciously kept very minimal, with rattan blinds, a fan and a large metal box. Archive material from the Chitrasena Dance Company, combined with a 3D sound collage from Colombo and partly interactive 2D and 3D elements, forms the digital level of the installation, which has thrilled many visitors.
»How seemingly simple and natural analogue and digital space interpenetrate here, how skilfully you guide the visitors, how lovingly and yet astutely you curate and stage the archive material – it all makes you want more!« wrote one visitor shortly after the experience.
The development with the VFX Graph is part of our digital media production; in TRANSIENT EXPOSURE, the graph is combined with Kinect recordings of the dancer Thaji Chitrasena. The VFX Graph is a powerful free tool from Unity that is quite easy to learn even if you are not a computer scientist – especially if you have previous knowledge of node-based software like Blender or Bolt, because the VFX Graph is also based on nodes. That means you don't have to work with code – instead, the user interface shows blocks (nodes) that can be connected to each other via edges. And what's the point of all this?
As a depth camera, the Kinect generates data that are not visible at first. The VFX Graph, on the other hand, provides a visual particle system whose particles can be edited via the nodes, e.g. in shape, color or number. This creates an effect through which the Kinect data can be made visible – e.g. as a point cloud.
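The step from invisible depth data to visible points can be illustrated with a pinhole-camera back-projection, sketched here in Python. The camera intrinsics used as defaults are placeholder values, not real Kinect calibration data; the idea is simply that each depth pixel becomes a 3D point that a particle system like the VFX Graph can then render.

```python
# Sketch: back-projecting depth pixels into a 3D point cloud – the kind
# of data a particle system such as Unity's VFX Graph can make visible.
# The intrinsics (fx, fy, cx, cy) are placeholder illustration values.

def depth_to_points(depth, fx=525.0, fy=525.0, cx=1.5, cy=1.0):
    """depth: 2D list of depth values in meters (0 = no reading).
    Returns a list of (x, y, z) points via the pinhole camera model."""
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z <= 0:          # skip pixels without a depth reading
                continue
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points
```

Each resulting point can drive one particle, so the cloud of particles traces the shape the depth camera saw.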
An avatar serves as the medium between the VFX Graph and the Kinect data: it is synchronized with both the Kinect data and the particle system of the VFX Graph. So that the effect ultimately resembles the dancer, the avatar still has to be edited until its anatomy roughly corresponds to Thaji's.
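Conceptually, adjusting the avatar's anatomy amounts to scaling each bone of a template rig by the ratio between the dancer's measurements and the template's. The bone names and lengths below are hypothetical illustration values; in practice this edit happens on the rig in a 3D tool, not in code.

```python
# Sketch: per-bone scale factors that bring a template avatar's
# proportions closer to a target anatomy. Bone names and lengths
# are hypothetical; the real adjustment is done in a 3D tool.

def scale_bones(template, target):
    """template/target: dicts mapping bone name -> length (meters).
    Returns per-bone scale factors to apply to the template rig."""
    return {bone: target[bone] / length
            for bone, length in template.items() if bone in target}
```

A factor of 1.0 means the template bone already matches; factors above or below 1.0 lengthen or shorten it.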