Writing an application

In this section we discuss how to create your own application with ApertusVR embedded in it.



If you have read the previous section, you know how to query events and handle them through entities. Now we can discuss how to set up ApertusVR in your Android application with the support of the JNI plugin.

On Android, we build our applications from Activity classes, and each activity has its own lifecycle. To learn more, see the Android guide on activity lifecycles. In code, the lifecycle appears as five functions we have to override:

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // ...
}

@Override
protected void onStart() {
    super.onStart();
    // ...
}

@Override
protected void onResume() {
    super.onResume();
    // ...
}

@Override
protected void onPause() {
    super.onPause();
    // ...
}

@Override
protected void onDestroy() {
    super.onDestroy();
    // ...
}

When the activity is launched by Android (and the onCreate(...) function is called), we have to do three things:

  1. Get an instance of the Android choreographer and store it for later use.

  2. Start the ApertusVR system and optionally connect to a room.

  3. Initialize the plug-ins for our application (more about them in the Plugins section).

The 1st is done by typing choreographer = Choreographer.getInstance(). Not a big deal. The 2nd needs further explanation, and we will go through the 3rd in the next section.
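Putting these three steps together, a minimal onCreate(...) could look like the following sketch. Only Choreographer.getInstance() is standard Android API here; the apeSystem wrapper is the JNI wrapper class discussed in this guide, and the activity name and plug-in initialization placeholder are illustrative assumptions:

```java
import android.app.Activity;
import android.os.Bundle;
import android.view.Choreographer;

// Hypothetical activity name, for illustration only.
public class ApertusActivity extends Activity {

    private Choreographer choreographer;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // 1. Get the choreographer instance and store it for later use.
        choreographer = Choreographer.getInstance();

        // 2. Start the ApertusVR system through the apeSystem JNI wrapper.
        String configFolderPath = ""; // a path inside the assets folder
        if (!apeSystem.isRunning()) {
            apeSystem.start(configFolderPath, getAssets());
        }

        // 3. Initialize the plug-ins for this application
        //    (covered in the Plugins section).
    }
}
```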

Start ApertusVR

Before doing anything with ApertusVR, we have to call its C++ start method in the ape::System namespace:

void Start(const char* configFolderPath, bool isBlocking, std::function<void()> userThreadFunction = std::function<void()>());

When we do that, we have to tell ApertusVR where it can find the apeCore.json file (configFolderPath), which contains the necessary configuration. On Android it is stored in the assets folder of the given application. Therefore, when calling this method through the JNI interface and the apeSystem wrapper class, we type:

String configFolderPath = /* some path in the assets folder */;

if (!apeSystem.isRunning()) {
    apeSystem.start(configFolderPath, getAssets());
}

For reading the assets folder from C++, the Android build provides a shared library called apeAAssetOpen. The apeCoreConfig module uses this shared library to open the assets folder. The JNI method startApertusVR(...) also initializes the apeAAssetOpen library.
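As a rough sketch of what the Java side of such a wrapper can look like (the class shape, field names, and native library name below are assumptions for illustration, not the actual ApertusVR sources):

```java
import android.content.res.AssetManager;

// Hypothetical shape of the apeSystem JNI wrapper described above.
public final class apeSystem {

    static {
        // Load the native library exposing the JNI entry point;
        // the library name here is an assumption.
        System.loadLibrary("apertusVR");
    }

    private static boolean running = false;

    // Implemented in C++; forwards to ape::System::Start(...) and
    // initializes the apeAAssetOpen library with the AssetManager.
    private static native void startApertusVR(String configFolderPath,
                                              AssetManager assets);

    public static boolean isRunning() {
        return running;
    }

    public static void start(String configFolderPath, AssetManager assets) {
        if (!running) {
            startApertusVR(configFolderPath, assets);
            running = true;
        }
    }
}
```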

Connect to room

ApertusVR users on Android/Java can set the room they want to connect to either pre-runtime in apeCore.json, or at runtime with the help of the apeSceneNetwork wrapper. So basically everything works as in the C++ case:

/* start ApertusVR */

String roomName = /* existing room name */;
apeSceneNetwork.connectToRoom(roomName);
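For the pre-runtime route, the room name is set in apeCore.json. The key below is only an illustrative guess at the schema, not the actual ApertusVR configuration format — check the apeCore.json shipped with your application for the real key names:

```json
{
  "roomName": "exampleRoom"
}
```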


As you can see, we have to provide the Android AssetManager (getAssets()) for the start function. This will call the JNI function startApertusVR(...), which will call the actual ape::System::Start(...) function.
