
How to use

This page shows how to use the Filament render plugin on Android.

On this page, we will walk through how to use the render plugin in a simple Android app that contains only a MainActivity. Basic usage of the plugin is easy, but we have to pay attention to several tasks.

Add dependency

The render plug-in lives in a separate module called apefilamentrender. This means that when building our application, we have to add not only the apertusvr module to our dependency list, but apefilamentrender as well:

build.gradle (Module: app)
dependencies {
    // ... 

    implementation project(":apertusvr")
    implementation project(":apefilamentrender")
}
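
For these project(...) references to resolve, both modules must also be included in the project's settings.gradle. The following is only a minimal sketch, assuming the modules sit in the repository root under their default names:

settings.gradle
// Assumed module layout; adjust the paths if your checkout differs.
include ':app', ':apertusvr', ':apefilamentrender'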

Import classes

The plug-in itself contains several classes, but when using this module, the only one we need to care about is apeFilamentRenderPlugin. This single class contains everything we need to render on Android:

public final class apeFilamentRenderPlugin implements apePlugin {

// ...

}

So in our MainActivity.java file we import this class:

import org.apertusvr.render.apeFilamentRenderPlugin;

Init the plugin

It is no surprise that we have to create every resource and initialize the plug-in before using it. Assuming we want to use the plugin right from application startup, we have to do our work in the MainActivity.onCreate(/*...*/) function. Let us gather what resources we have to obtain there:

  1. The context, which is a reference to the activity that started the plugin,

  2. the context's lifecycle,

  3. a SurfaceView, which we can render onto,

  4. the context's resources folder,

  5. the context's asset folder,

  6. an optional userNode.

The first five are fairly trivial to obtain if our MainActivity is derived from AppCompatActivity:

SurfaceView surfaceView = new SurfaceView(this);

apeFilamentRenderPlugin renderPlugin = new apeFilamentRenderPlugin(
                this, getLifecycle(), surfaceView,
                getResources(), getAssets(), /* ... */);
  1. The context is a reference to our MainActivity.

  2. AppCompatActivity implements the LifecycleOwner interface, therefore it has a getLifecycle() method.

  3. We can easily get the activity's SurfaceView with new SurfaceView(this).

  4. The resource folder can be obtained with the getResources() method.

  5. The asset folder can be obtained with the getAssets() method.

So what about the 6th parameter, the userNode? We need it in order to propagate our camera's position into ApertusVR. If we do not need this feature in our application, we can simply set the 6th parameter to null. Otherwise, we have to use the ApertusVR Java API to create a userNode, possibly with some avatar geometry (e.g. a cone). Without further explanation, the following code block shows an example of this:

String userName = /* get userName */;
String GUID = apeCoreConfig.getNetworkGUID();

// Create the node that represents this user in the shared scene.
apeNode userNode = apeSceneManager.createNode(userName, true, GUID);

if (userNode != null && userNode.isValid()) {
    // A manual material gives the avatar its color.
    apeManualMaterial userMaterial = Objects.requireNonNull(
            apeSceneManager.createEntity(
                    userName + "_Material",
                    apeEntity.Type.MATERIAL_MANUAL,
                    true, GUID))
            .cast(new apeManualMaterial.apeManualMaterialBuilder());

    if (userMaterial != null && userMaterial.isValid()) {
        apeColor color = new apeColor(255f, 105f, 180f);

        userMaterial.setDiffuseColor(color);
        userMaterial.setSpecularColor(color);

        // The cone avatar gets its own child node under userNode.
        apeNode userConeNode = apeSceneManager.createNode(
                userName + "_ConeNode", true, GUID);

        if (userConeNode != null && userConeNode.isValid()) {
            userConeNode.setParentNode(userNode);

            apeConeGeometry userCone = Objects.requireNonNull(
                    apeSceneManager.createEntity(
                            userName + "_ConeGeometry",
                            apeEntity.Type.GEOMETRY_CONE,
                            true, GUID))
                    .cast(new apeConeGeometry.apeConeBuilder());

            if (userCone != null && userCone.isValid()) {
                userCone.setParentNode(userConeNode);
                userCone.setMaterial(userMaterial);
            }
        }
    }
}

Now that we have a userNode, we simply pass it as the 6th parameter.
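
Putting it together, the constructor call shown earlier then becomes:

apeFilamentRenderPlugin renderPlugin = new apeFilamentRenderPlugin(
                this, getLifecycle(), surfaceView,
                getResources(), getAssets(), userNode);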

Great! What is left? The only remaining task is to add our plugin as a lifecycle observer:

getLifecycle().addObserver(renderPlugin);
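
Putting everything on this page together, a complete minimal MainActivity might look like the sketch below. It is only a sketch: it passes null for the userNode, and it assumes the SurfaceView is the activity's entire content view (attached via setContentView()), which your app may well do differently.

MainActivity.java
import android.os.Bundle;
import android.view.SurfaceView;
import androidx.appcompat.app.AppCompatActivity;

import org.apertusvr.render.apeFilamentRenderPlugin;

public class MainActivity extends AppCompatActivity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // The SurfaceView the plugin renders onto; here we assume it is
        // the activity's whole content view.
        SurfaceView surfaceView = new SurfaceView(this);
        setContentView(surfaceView);

        // Construct the plugin from the five "trivial" resources, with a
        // null userNode (no camera propagation into ApertusVR).
        apeFilamentRenderPlugin renderPlugin = new apeFilamentRenderPlugin(
                this, getLifecycle(), surfaceView,
                getResources(), getAssets(), null);

        // Let the plugin react to the activity's lifecycle events.
        getLifecycle().addObserver(renderPlugin);
    }
}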

If you did everything as discussed and also paid attention to how to set up the ApertusVR system for applications, then the render plugin will respond to all the events occurring in ApertusVR while MainActivity is running.

For basic usage, this is all you need to know. However, to be able to customize the plug-in to your own needs, see the next page on how to configure it.
