We work at the Institute for Computer Science and Control, Hungarian Academy of Sciences, where we are developing an open-source, scalable, distributed VR/AR engine called ApertusVR. Our research group was founded about one and a half years ago, in August 2016, but its predecessor group conducted a ten-year research programme on collaborative virtual-reality technologies. Building on those results, our group consolidated and published them as a software system called ApertusVR (http://apertusvr.org).
ApertusVR is a loosely coupled, modular software system that lets you create collaborative virtual spaces. You can build virtual-reality scenes and share them with your colleagues. A good example is a meeting room where you can communicate via text and voice chat regardless of your geolocation. At first hearing, this may not sound new, but until now you had to choose a VR platform or framework and build your application within the limitations of that choice, which also dictated the hardware you had to use. With ApertusVR these limitations simply disappear: it creates a new abstraction layer over the hardware vendors so you can rapidly integrate virtual- and augmented-reality technologies into your developments and products.
The modular structure of ApertusVR allows you to run the ApertusCore on any device. Written in C++11, it runs on smartphones, tablets, and desktops. The core is responsible for synchronizing the scene using NAT punch-through technology. The typical host-guest relationship can vary with the situation: if the original host drops out, a guest takes over the host role and runs all the software components needed to maintain the shared virtual space.
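The host-takeover behaviour described above can be illustrated with a minimal sketch. The class and member names here are purely illustrative, not the actual ApertusVR API: the idea is simply that every peer keeps an ordered view of the participants, and whichever peer is first in that list acts as host.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Illustrative sketch of host-guest failover in a shared scene.
// Names are hypothetical; the real ApertusVR internals differ.
class SharedScene {
public:
    void join(const std::string& peer) { peers_.push_back(peer); }

    // The first peer in the join order acts as host.
    std::string host() const { return peers_.empty() ? "" : peers_.front(); }

    // When a peer drops out it is removed; if it was the host,
    // the next guest implicitly becomes the new host.
    void drop(const std::string& peer) {
        for (auto it = peers_.begin(); it != peers_.end(); ++it) {
            if (*it == peer) { peers_.erase(it); break; }
        }
    }

private:
    std::vector<std::string> peers_;
};
```

In this simplified model no election protocol is needed: the deterministic join order means every remaining peer agrees on who the new host is once the dropout is observed.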
Plugins extend the functionality of the core according to your needs. Take an example where a computationally intensive particle-system simulation has to be developed and displayed on a smartphone. In this case, ApertusCore runs on both an HPC server and the smartphone to synchronize the scene. Alongside ApertusCore, a plugin that calculates the particle physics is implemented for the HPC server, while another plugin on the smartphone handles the visualization. Until now, VR engines forced the user to run every feature on a single device, which demanded considerable computing power; with ApertusVR you can split the workload easily.
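The HPC/smartphone split above can be sketched as a simple plugin interface, where each device loads only the plugins it needs alongside the core. This is a hypothetical illustration; the interface, class names, and per-tick work are assumptions, not the real ApertusVR plugin API.

```cpp
#include <cassert>
#include <memory>
#include <string>
#include <vector>

// Hypothetical plugin interface sketch (not the actual ApertusVR API).
struct Plugin {
    virtual ~Plugin() {}
    virtual std::string name() const = 0;
    virtual void step() = 0;  // called once per simulation/render tick
};

// Would run on the HPC server: heavy particle-physics computation.
struct ParticlePhysicsPlugin : Plugin {
    int particlesSimulated = 0;
    std::string name() const override { return "particle-physics"; }
    void step() override { particlesSimulated += 10000; }  // stand-in for real work
};

// Would run on the smartphone: lightweight visualization only.
struct VisualizationPlugin : Plugin {
    int framesDrawn = 0;
    std::string name() const override { return "visualization"; }
    void step() override { ++framesDrawn; }
};

// Each device's core instance ticks only its own plugins;
// scene state is synchronized between the core instances.
struct Core {
    std::vector<std::unique_ptr<Plugin>> plugins;
    void tick() {
        for (auto& p : plugins) p->step();
    }
};
```

The smartphone's core would hold only a `VisualizationPlugin`, while the server's core holds the `ParticlePhysicsPlugin`; neither device pays for functionality it does not use.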
Our business logic
Our Institute for Computer Science and Control is among the prominent computer-science research institutes in Europe. The results of the last decade made it clear that, although our institute runs numerous financially successful research projects, we also need to be open to the market and start pre-competitive research.
One such effort is ApertusVR. As it is an open-source toolkit, we are looking for B2B partners. We deliver R&D services to our partners so they can benefit from advances in VR/AR technologies. Although we are a research institute, we remain open to founding start-up enterprises in order to attract investors who see market potential in ApertusVR and would like to jointly develop a future B2C product. We are also looking for partners who could supply hardware for our future clients, meaning we intend to recommend the best hardware for the VR/AR solution we deliver. We cannot be official resellers, but if your product meets our client's requirements, we will gladly recommend it to them.
Projects we are working on
Production line virtualization, remote control
Within the framework of the Symbio-tic project, we collaborate with the Research Laboratory on Engineering and Management Intelligence. Our task in this joint project is to virtualize the physically built production environment in order to visualize real-time data, so that a person can take control of the production environment regardless of his or her geolocation. This virtualized production environment serves as a sandbox for experimenting with and fine-tuning production procedures, and as a space for consulting with colleagues. All participants in the factory (humans, software, and hardware) communicate and share knowledge at the highest cognitive level: the level of visualization.
Multi-doctor medical examination
This application enables doctors to participate in a medical examination regardless of their location, with the help of shared virtual reality. Doctors who join the examination can draw on and annotate the examination material and attach other documents. By streaming the examination from a device, we create a virtual canvas that matches the field of view of the examination device's camera, giving doctors the sense of being inside the patient.
Visual stimulus for grid cell research in human navigation
We developed an application using ApertusVR for the University of Texas at Austin. In this research, the patient has to chase bubbles within a certain time limit. The application thus produces a constant visual stimulus while the patient plans a route to catch as many bubbles as possible. The task is monitored with clinical EEG devices and the data are processed for further study. The application runs on both a tablet and an Oculus Rift from the same codebase, loading the necessary plugins.
The scenario in this research programme was a multiplayer environment in which the participants had to solve a problem together. The participants were in different geolocations and were monitored with EEG devices. The EEG data were processed with big-data solutions for further study. Our task was to deliver the IT solution for this research project.