**APK link:** https://drive.google.com/file/d/1Zm0X-yP8oDmcPf9aibUZxIxlfe0nlkZf/view?usp=sharing
Android + OpenCV (C++) + OpenGL ES + TypeScript Web Viewer
A real-time edge detection system that captures live camera frames, processes them using OpenCV in C++ via JNI, and renders the processed frames using OpenGL ES 2.0. Additionally, a lightweight TypeScript web viewer displays a sample processed frame, showcasing integration between native image processing and web visualization.
- Live Camera Feed using TextureView (Camera2 API).
- JNI-based Frame Processing – each frame is sent to native C++ code for processing.
- Canny Edge Detection & Grayscale Filters implemented with OpenCV C++.
- OpenGL ES Renderer – displays the processed frames as textures in real time.
- Achieves a smooth 10–20 FPS during processing and rendering.
- Optional toggle between raw feed and edge-detected view.
- FPS Counter overlay for performance monitoring.
- Minimal TypeScript + HTML web app that displays a sample processed frame (static/base64).
- Shows basic frame metadata – FPS, resolution, processing time.
- Demonstrates ability to connect native output to a web visualization layer.
```
📦 RealTime-EdgeDetection
├── app/                     # Java/Kotlin Android code (UI + Camera)
│   ├── MainActivity.kt
│   ├── CameraHandler.kt
│   └── JNIInterface.kt
├── jni/                     # Native C++ OpenCV processing
```

Open the project in Android Studio (Hedgehog or newer):
Launch Android Studio.
Select File → Open and choose the project directory.
Wait for Gradle to sync.
Install NDK and CMake via SDK Manager:
Go to Tools → SDK Manager → SDK Tools tab.
Check and install:
- NDK (Side by side)
- CMake
- LLDB (optional, for debugging native code)
Click Apply and wait for installation to complete.
Download and Link OpenCV SDK:
Go to the official OpenCV releases page: 👉 https://opencv.org/releases/
Download the latest OpenCV Android SDK (ZIP file).
Extract it anywhere (for example: C:/OpenCV-android-sdk/).
In Android Studio, update your CMakeLists.txt to include OpenCV paths:
```cmake
set(OpenCV_DIR "C:/OpenCV-android-sdk/sdk/native/jni")
find_package(OpenCV REQUIRED)
include_directories(${OpenCV_INCLUDE_DIRS})
target_link_libraries(native-lib ${OpenCV_LIBS} log)
```

Also make sure build.gradle includes:
```groovy
externalNativeBuild {
    cmake {
        path "src/main/cpp/CMakeLists.txt"
    }
}
```
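Putting those pieces together, a complete CMakeLists.txt might look like the sketch below (the CMake version, C++ standard, and source file names are assumptions based on the project structure):

```cmake
cmake_minimum_required(VERSION 3.22)
project(edgeviewer)

set(CMAKE_CXX_STANDARD 17)

# Point CMake at the extracted OpenCV Android SDK
set(OpenCV_DIR "C:/OpenCV-android-sdk/sdk/native/jni")
find_package(OpenCV REQUIRED)
include_directories(${OpenCV_INCLUDE_DIRS})

add_library(native-lib SHARED
        native-lib.cpp
        edge_processor.cpp)

target_link_libraries(native-lib ${OpenCV_LIBS} log)
```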
Run the Project:
Connect your Android device or launch an emulator.
Click Run (▶) to build and deploy the app.
You should see a real-time camera feed with edges highlighted.
💻 Web Viewer Setup
Navigate to the web folder:
```bash
cd web
npm install typescript
npx tsc
```

Open index.html in your browser to view the sample processed frame.
The viewer displays:
- A static base64-encoded image (representing a processed frame)
- Frame metadata, such as FPS and resolution
🧠 Architecture Overview
- Camera Feed (Java/Kotlin) – captured frames are passed to native code via JNI.
- Native Processing (C++) – OpenCV applies the Canny edge detection or grayscale filter.
- Rendering (OpenGL ES) – processed frames are displayed as textures in real time.
- Web Layer (TypeScript) – displays a sample processed frame, simulating transmission of results to the web.
Data Flow:
Camera Frame → JNI Bridge → OpenCV (C++) → Processed Output → OpenGL Texture → Display
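The processing stage of this flow can be sketched without OpenCV as a plain gradient-magnitude pass. The real project calls OpenCV's Canny implementation, which adds Gaussian smoothing, non-maximum suppression, and hysteresis thresholding on top of this idea; the function and parameter names below are assumptions:

```cpp
#include <cstdint>
#include <cstdlib>
#include <vector>

// Hypothetical sketch: mark a pixel as an edge when the Sobel gradient
// magnitude of an 8-bit grayscale image exceeds a threshold.
std::vector<uint8_t> detectEdges(const std::vector<uint8_t>& gray,
                                 int width, int height, int threshold) {
    std::vector<uint8_t> out(gray.size(), 0);
    auto at = [&](int x, int y) {
        return static_cast<int>(gray[y * width + x]);
    };
    for (int y = 1; y < height - 1; ++y) {
        for (int x = 1; x < width - 1; ++x) {
            // 3x3 Sobel kernels for horizontal and vertical gradients
            int gx = -at(x-1, y-1) + at(x+1, y-1)
                   - 2 * at(x-1, y) + 2 * at(x+1, y)
                   - at(x-1, y+1) + at(x+1, y+1);
            int gy = -at(x-1, y-1) - 2 * at(x, y-1) - at(x+1, y-1)
                   + at(x-1, y+1) + 2 * at(x, y+1) + at(x+1, y+1);
            int magnitude = std::abs(gx) + std::abs(gy);  // L1 approximation
            out[y * width + x] = magnitude > threshold ? 255 : 0;
        }
    }
    return out;
}
```

The resulting single-channel buffer maps directly onto a luminance texture for the OpenGL ES renderer.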
🧩 Technologies Used
- Android SDK (Kotlin/Java)
- NDK + JNI (C++)
- OpenCV (C++ for image processing)
- OpenGL ES 2.0 (for rendering)
- TypeScript (for web visualization)

Project structure (continued):

```
│   ├── CMakeLists.txt
│   ├── edge_processor.cpp
│   └── native-lib.cpp
├── gl/                      # OpenGL renderer classes
│   ├── GLRenderer.cpp
│   ├── GLRenderer.h
│   └── shaders/
│       └── basic_fragment.glsl
├── web/                     # TypeScript-based web viewer
│   ├── index.html
│   ├── main.ts
│   └── tsconfig.json
└── README.md
```
- Clone the repository:
```bash
git clone https://github.com/RealCifer/edgeviewer.git
cd edgeviewer
```
| Raw Image | Processed Image |
|---|---|
| ![]() | ![]() |