AR Library Weekly #5

The Processing Foundation

Contributions Week-V

Hello World!
This is my fifth weekly report on the AR Library for Processing-Android, a library that will help create Augmented Reality applications using Processing through Android Mode.

Continuing from the previous week, this week began with refining the Plane Renderer, which renders the detected planes. Last week the ability to add colors to a plane was added; this week, as an extension of that, the ability to set basic textures on the detected planes was added, as shown in the following pictures.
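To give a rough idea of what setting a texture on a detected plane involves under the hood, here is a minimal sketch of loading an image from the app's assets and turning it into an OpenGL texture that a plane renderer can bind while drawing. The class name, method name, and asset handling shown here are purely illustrative and are not the library's actual API; only the standard Android GLES20/GLUtils calls are real.

    import android.content.Context;
    import android.graphics.Bitmap;
    import android.graphics.BitmapFactory;
    import android.opengl.GLES20;
    import android.opengl.GLUtils;

    import java.io.IOException;

    class PlaneTextureLoader {
        // Hypothetical helper: loads an asset and returns an OpenGL texture id
        // that a plane renderer could bind when drawing detected planes.
        static int createPlaneTexture(Context context, String assetName) throws IOException {
            Bitmap bitmap = BitmapFactory.decodeStream(context.getAssets().open(assetName));

            int[] textures = new int[1];
            GLES20.glGenTextures(1, textures, 0);
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textures[0]);

            // Repeat the texture across the plane and filter it smoothly.
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_REPEAT);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_REPEAT);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER,
                    GLES20.GL_LINEAR_MIPMAP_LINEAR);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

            GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
            GLES20.glGenerateMipmap(GLES20.GL_TEXTURE_2D);
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0);

            bitmap.recycle();
            return textures[0];
        }
    }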



The build scripts that were migrated from the Ant build system to the Gradle build system have been pushed to a separate branch named new-build, so the master branch is not disturbed yet. The instructions for building the library with Gradle will be added to the README soon.

GitHub Link: https://github.com/SyamSundarKirubakaran/processing-ar/tree/new-build



Milestone:

As suggested by Jesus during the previous evaluation, I've created a milestone for the release of the stable version of the AR Library.



Point Cloud Renderer:

The Point Cloud Renderer draws a set of observed 3D points together with their confidence values. Point clouds serve several purposes in AR; above all, they are the collections of points that give rise to the feature points (also known as anchors) over which objects are placed. Feature points give a better understanding of a detected plane by identifying unique features of that plane. These points usually form in regions of higher contrast, i.e., where a color change is evident.

  • Image (1) - shows a simple AR scene on which the Point Cloud Renderer is acting.
  • Image (2) - shows a zoomed-in view of (1), where you can see the point clouds and the places where they form.
  • Image (3) - gives a better picture of how they form: the scene has high contrast and switches colors quite often, so you can see that more points are formed here than in the previous images for exactly that reason (contrast).

YouTube Link: https://youtu.be/YLghOMM2fow
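For a concrete picture of where these points come from in code, the sketch below reads the point cloud out of an ARCore frame; each point is stored as x, y, z plus a confidence value. The surrounding class is just for illustration and is not the library's renderer, but acquirePointCloud(), getPoints(), and release() are the standard ARCore calls a point cloud renderer builds on.

    import com.google.ar.core.Frame;
    import com.google.ar.core.PointCloud;

    import java.nio.FloatBuffer;

    class PointCloudReader {
        // frame is the com.google.ar.core.Frame returned by session.update() each draw call.
        void readPoints(Frame frame) {
            PointCloud pointCloud = frame.acquirePointCloud();
            FloatBuffer points = pointCloud.getPoints(); // packed as x, y, z, confidence
            int numPoints = points.remaining() / 4;
            for (int i = 0; i < numPoints; i++) {
                float x = points.get(4 * i);
                float y = points.get(4 * i + 1);
                float z = points.get(4 * i + 2);
                float confidence = points.get(4 * i + 3);
                // A real renderer uploads these to a vertex buffer and draws GL_POINTS;
                // the confidence value can be used to drop weak feature points.
            }
            pointCloud.release(); // every acquirePointCloud() must be balanced with a release()
        }
    }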



Integration with Processing core:

Calling sketch.handleDraw() currently turns the AR renderer blank. Andres was able to give me a clear picture of why this happens: the Processing renderer is unaware of the AR sub-renderers implemented so far. This matters because the main objective is a library that can render 3D objects in the AR scene and bring it to life through simple Processing sketches, for example by importing objects into the AR scene from a sketch. This can be accomplished by using the VR Library and the standard P3D renderer, which is used to import objects into an app through a Processing sketch, as references. I've opened an issue for this integration; feel free to leave your valuable comments there, which might help me in this process of integration: [ENHANCE] Integration with Processing Renderers

I've also started on the initial stage of the integration right away. In the end, instead of clearing the screen with background(), the library should use the background renderer to set the camera image as the background, and then use the session, frame, and camera objects to place the shapes drawn with Processing in the right place.
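As a very rough sketch of what that draw loop could look like (an assumption about where the work is heading, not the finished implementation), the snippet below pulls the Frame and Camera out of the ARCore Session and computes the projection and view matrices that the Processing renderer would need. The class name and the comments about handing control over to Processing are illustrative; the Session/Frame/Camera calls themselves are standard ARCore API.

    import com.google.ar.core.Camera;
    import com.google.ar.core.Frame;
    import com.google.ar.core.Session;
    import com.google.ar.core.exceptions.CameraNotAvailableException;

    class ARDrawLoop {
        void drawFrame(Session session) throws CameraNotAvailableException {
            Frame frame = session.update();
            Camera camera = frame.getCamera();

            // 1. Instead of clearing the screen with background(), the camera image
            //    would be drawn first (this is what the background renderer does).

            // 2. Get the matrices that place virtual content correctly in the scene.
            float[] projection = new float[16];
            float[] view = new float[16];
            camera.getProjectionMatrix(projection, 0, 0.1f, 100.0f);
            camera.getViewMatrix(view, 0);

            // 3. These matrices would then be handed to the Processing (P3D-style)
            //    renderer before sketch.handleDraw() runs, so that shapes drawn in
            //    the sketch's draw() end up in the right place in the AR scene.
        }
    }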



Useful Link: https://github.com/processing/processing-android/pull/346
  • This has the collection of commits that Andres made while creating the VR library, and it definitely gives a better view of how I should make progress on this integration.


Summary:

  • Additional features to Plane Renderer.
  • Implementation of Point Cloud Renderer.
  • Creation of Milestone v1.0 for the AR Library.
  • Initial stage of integration with the Processing core.

Thanks to
  • Jesus Duran - Mentor
  • Andres Colubri
for guiding me through this!

That's it for the week. Report week - V check.