12. Android RT Doorbell
For the next labs, we are writing a digital doorbell application. It consists of two components:
- The Embedded Linux System, which includes an application reading camera images and waiting for a GPIO to be pressed, as well as a GPIO responsible for controlling a door lock.
- The Android Application shows a notification if the door is rung and a camera feed from the camera attached to the Embedded Linux System.
The main goal of the application is to maintain minimal latency for the following:
- Video Feed
- Notification when the Door Bell is rung
- Opening the Door from the Android app
Android Application
The goal of this lab is to write an Android Application acting as the user-facing component of a Realtime Doorbell. The Android application displays a notification when the door is rung.
The Android Application is going to be built in Java, and the GUI is put together using `ConstraintLayout`. The following goals need to be achieved with the Android application:
- Read JPEG Images over UDP Stream
- Display the Images in the UI
- Display a Button for opening the Door
- Show a message if the Door Bell has been pressed
Here are a couple of key points to consider when building the Application:
- When the Activity is started, a background thread should be created listening for UDP packets containing JPEG Images.
- After receiving an image, it should be decoded using the `BitmapFactory` class.
- The UI contains a `SurfaceView`; its `SurfaceHolder` can be acquired using `getHolder()`. The holder offers methods for locking and releasing a `Canvas` from a non-UI thread.
- Once the `SurfaceHolder` is available (install the `SurfaceHolder.Callback`), start the thread which continuously receives UDP packets.
- To test the process, instead of expecting data over UDP, create an Android asset folder and store a demo JPEG image. Use `Context.getAssets()` to get an `InputStream` of an asset.
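Stripped of the Android specifics, the receive loop can be sketched in plain Java. The port (5005) and the loopback self-send are stand-ins for the real Embedded Linux sender, and the `BitmapFactory` decode step is indicated as a comment:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class UdpReceiveSketch {
    public static void main(String[] args) throws Exception {
        // Receiver socket; 5005 is an arbitrary demo port.
        try (DatagramSocket socket = new DatagramSocket(5005);
             DatagramSocket sender = new DatagramSocket()) {

            // Stand-in for the Embedded Linux sender: loop back a tiny payload.
            byte[] fakeJpeg = {(byte) 0xFF, (byte) 0xD8, (byte) 0xFF, (byte) 0xD9};
            sender.send(new DatagramPacket(fakeJpeg, fakeJpeg.length,
                    InetAddress.getLoopbackAddress(), 5005));

            // The background thread in the Activity would block here in a loop.
            byte[] buf = new byte[65535];
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            socket.receive(packet);

            // On Android, the received bytes would now be decoded:
            // Bitmap frame = BitmapFactory.decodeByteArray(buf, 0, packet.getLength());
            System.out.println("received " + packet.getLength() + " bytes");
        }
    }
}
```

In the real application the loop runs until interrupted and replaces `currentFrame` after each decode; the sketch receives a single packet so it can run standalone.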
Key Considerations for Android Application
There are some considerations to make when building an application that does background work. Also, when coming from Java AWT/Swing, there are key differences in how applications are handled inside Android.
Lifecycle of an Application and Activity
An Application Context often survives the creation and destruction of an Activity. Activity instances are most of the time short-lived. The best example is screen rotation - an Activity is then destroyed, and a new Activity of the same type is created.
When doing background operations such as accepting UDP packets in a separate thread, it is important to not hold references to destroyed activities or their UI components. There are two options to handle such a situation:
- When the Activity is destroyed, also shut down background threads gracefully and start them when the next Activity is created.
- Maintain threads in the Application context by either extending `Application` or creating a custom `ViewModel`. Register and unregister listeners when an Activity is created/destroyed.

See https://developer.android.com/guide/components/activities/activity-lifecycle to learn more about lifecycles. For more information about obtaining a long-lived Context object, see https://yakivmospan.com/blog/best-practice-application/
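The second option, keeping the thread in a long-lived object while short-lived activities only register and unregister listeners, can be sketched in plain Java. The class name `FrameHub` is a hypothetical stand-in for an `Application` subclass or `ViewModel`:

```java
import java.util.concurrent.CopyOnWriteArrayList;

// Stand-in for a long-lived holder (an Application subclass or ViewModel on
// Android). It owns the background work; short-lived activities only
// register/unregister listeners, so no destroyed Activity is ever referenced.
public class FrameHub {
    public interface FrameListener { void onFrame(byte[] jpeg); }

    private final CopyOnWriteArrayList<FrameListener> listeners = new CopyOnWriteArrayList<>();

    public void register(FrameListener l) { listeners.add(l); }
    public void unregister(FrameListener l) { listeners.remove(l); }

    // Called from the UDP thread whenever a frame arrives.
    public void publish(byte[] jpeg) {
        for (FrameListener l : listeners) l.onFrame(jpeg);
    }

    public static void main(String[] args) {
        FrameHub hub = new FrameHub();
        FrameListener activity = jpeg -> System.out.println("frame of " + jpeg.length + " bytes");
        hub.register(activity);           // Activity.onCreate
        hub.publish(new byte[]{1, 2, 3}); // delivered to the listener
        hub.unregister(activity);         // Activity.onDestroy
        hub.publish(new byte[]{4});       // no listener left, nothing printed
    }
}
```

After rotation, the newly created Activity simply registers a fresh listener with the same hub, and the UDP thread keeps running untouched.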
Context and its dangers
The Context object inside an Android application is responsible for giving access to services of the OS, such as notifications, Bluetooth, USB, broadcasts, etc. It is, therefore, critical when writing applications. Both `Activity` and `Application` are themselves subclasses of `Context`.
When passing the Context object down your class hierarchy for convenient access to system services, consider the following: code holding a Context reference cannot easily be unit tested. A Context anywhere in a class makes it harder to quickly run and test application components and ties them to an Android environment.

One way to avoid this is, instead of passing a Context directly, to define small interfaces exposing only the subset of system resources a component needs, and to implement them on top of a Context in a single place.
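A minimal sketch of that idea, with hypothetical names: the doorbell logic depends on a narrow `Notifier` interface rather than a `Context`, so it can run and be tested on the host JVM:

```java
public class ContextAbstraction {
    // Narrow interface instead of android.content.Context: a component only
    // declares what it actually needs from the system.
    interface Notifier { void showDoorbellNotification(String message); }

    // Pure logic class: no Context, trivially testable on the host.
    static class DoorbellMonitor {
        private final Notifier notifier;
        DoorbellMonitor(Notifier notifier) { this.notifier = notifier; }
        void onPacket(byte type) {
            // Hypothetical protocol: packet type 0x01 means "bell pressed".
            if (type == 0x01) notifier.showDoorbellNotification("Ding dong!");
        }
    }

    public static void main(String[] args) {
        // In the app, the implementation would wrap NotificationManager;
        // on the host, a fake simply records the call.
        StringBuilder log = new StringBuilder();
        DoorbellMonitor monitor = new DoorbellMonitor(log::append);
        monitor.onPacket((byte) 0x01);
        monitor.onPacket((byte) 0x00);
        System.out.println(log);
    }
}
```

Only the single class implementing `Notifier` ever touches a real `Context`; everything else stays framework-free.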
Quickly running Android Code on the Host
Take a look at the unit test class that is automatically generated when starting the Android project from a template. Right-clicking within the source in Android Studio shows an option to run the test immediately.
This way, certain parts (those not requiring a Context) can be tested independently of Android. This eases project development and allows for quick iteration.
UI Thread
The Activity's UI components may only be accessed from the UI thread. Mutating UI state from any other thread may cause the application to misbehave. The utility method `runOnUiThread` accepts a `Runnable` that is posted to the UI thread's message queue.

A notable exception to this rule is the `SurfaceView`: its `Canvas` can be locked and drawn to from a background thread, which makes it an ideal candidate for rendering data provided by external sources that arrives asynchronously.
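The posting pattern can be mimicked in plain Java, with a single-threaded executor standing in for the UI thread's message queue (a sketch of the idea, not the Android API itself):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class UiThreadSketch {
    public static void main(String[] args) throws InterruptedException {
        // Stand-in for the UI thread: all "UI" mutations are funneled through
        // one queue, just like runOnUiThread does on Android.
        ExecutorService uiThread = Executors.newSingleThreadExecutor();
        StringBuilder label = new StringBuilder(); // pretend UI state

        Thread network = new Thread(() ->
                // The background thread never touches 'label' directly;
                // it posts a Runnable to the "UI" queue instead.
                uiThread.execute(() -> label.append("Door bell rung!")));
        network.start();
        network.join();

        uiThread.shutdown();
        uiThread.awaitTermination(1, TimeUnit.SECONDS);
        System.out.println(label);
    }
}
```

Because only one thread ever mutates the state, no locking is needed; the queue serializes all updates.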
Working with primitive bytes
In Java, the type `byte` is always interpreted as a signed value. In arithmetic expressions a `byte` is promoted to `int` with sign extension, so to obtain its unsigned value it must be masked with `0xFF`. Consider the following example:

```java
public class Main {
    public static void main(String[] args) {
        byte len = (byte) 0xFF;
        System.out.printf("%02X\n", len);
        System.out.printf("%d\n", len);
        System.out.printf("%d\n", (int) len);
        int unsigned = len & 0xFF;
        System.out.printf("%d\n", unsigned);
    }
}
```

This leads to the output:

```
FF
-1
-1
255
```
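The mask also matters when combining several bytes into a larger unsigned value, for example a 16-bit big-endian length field in a packet header (the header layout here is an assumption for illustration):

```java
public class UnsignedParse {
    // Combine two bytes into an unsigned 16-bit value (big-endian).
    static int u16(byte hi, byte lo) {
        return ((hi & 0xFF) << 8) | (lo & 0xFF);
    }

    public static void main(String[] args) {
        // Without the masks, (byte) 0xFF would sign-extend to 0xFFFFFFFF
        // and corrupt the combined value.
        System.out.println(u16((byte) 0x01, (byte) 0xFF)); // 511
        System.out.println(u16((byte) 0xFF, (byte) 0xFF)); // 65535
    }
}
```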
IP Fragmentation
Sending UDP packets with a large payload will cause IP fragmentation (unless the 'don't fragment' (DF) flag in IP is set, then the packet will be rejected on fragmentation). The main reason for this is passing links with Layer 2 frame sizes that do not fit the UDP payload entirely.
The most commonly found safe payload size that will not cause IP fragmentation (MTU) is 1500
Bytes or 1492
(ref https://de.wikipedia.org/wiki/Maximum_Transmission_Unit, PPPoE).
IP fragmentation may be undesirable in low latency applications since re-assembly will be out of control of the application.
To optimize this issue, writing a protocol for sequencing would resolve the problem.
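Such a sequencing protocol can be sketched as follows; the one-byte sequence header and the chunk size are assumptions, and loss handling is omitted:

```java
import java.util.Arrays;

// Sketch of an application-level sequencing protocol: each chunk carries a
// 1-byte sequence number so the receiver can reassemble a frame from
// MTU-sized UDP packets without relying on IP fragmentation.
public class ChunkProtocol {
    static final int CHUNK = 1400; // payload per packet, below the 1472-byte limit

    static byte[][] split(byte[] frame) {
        int n = (frame.length + CHUNK - 1) / CHUNK;
        byte[][] packets = new byte[n][];
        for (int i = 0; i < n; i++) {
            int len = Math.min(CHUNK, frame.length - i * CHUNK);
            byte[] p = new byte[len + 1];
            p[0] = (byte) i; // sequence number in the first byte
            System.arraycopy(frame, i * CHUNK, p, 1, len);
            packets[i] = p;
        }
        return packets;
    }

    static byte[] reassemble(byte[][] packets, int totalLen) {
        byte[] frame = new byte[totalLen];
        for (byte[] p : packets) {
            int seq = p[0] & 0xFF; // unsigned sequence number
            System.arraycopy(p, 1, frame, seq * CHUNK, p.length - 1);
        }
        return frame;
    }

    public static void main(String[] args) {
        byte[] frame = new byte[3000]; // pretend JPEG
        Arrays.fill(frame, (byte) 0x42);
        byte[][] packets = split(frame);
        byte[] back = reassemble(packets, frame.length);
        System.out.println(packets.length + " packets, equal=" + Arrays.equals(frame, back));
    }
}
```

A real protocol would also carry a frame ID and total length so the receiver can detect when a frame is complete or a chunk was lost.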
Exercise: write the Android application
- Create an Android Project based on an "Empty Views Activity".
- Create the UI with a `SurfaceView` and a `Button`.
- Write a class that starts a thread reading UDP packets. Helpful hints over here: https://www.baeldung.com/udp-in-java
- Install a `SurfaceHolder.Callback` on the `SurfaceView` in `onCreate` of the `Activity` to be notified when the holder is available. Then pass the reference to the thread, so it can access the holder's `Canvas`. See https://developer.android.com/reference/android/view/SurfaceHolder.Callback
- Receive UDP packets, decode them with `BitmapFactory`, and draw them to the `Canvas` after locking.
- Test by sending a static JPEG image over UDP from the host with a simple Python application.
Progress: June 4, 2024
```java
package com.nexuscomputing.android.doorbell;

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Canvas;
import android.graphics.Paint;
import android.graphics.Rect;
import android.graphics.RectF;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

import androidx.activity.EdgeToEdge;
import androidx.annotation.NonNull;
import androidx.appcompat.app.AppCompatActivity;
import androidx.core.graphics.Insets;
import androidx.core.view.ViewCompat;
import androidx.core.view.WindowInsetsCompat;

import java.io.InputStream;

public class MainActivity extends AppCompatActivity {

    private static final String TAG = "MainActivity";

    private volatile SurfaceHolder videoSurfaceHolder = null;
    private Bitmap currentFrame = null;
    private Thread imageThread = null;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        EdgeToEdge.enable(this);
        setContentView(R.layout.activity_main);
        ViewCompat.setOnApplyWindowInsetsListener(findViewById(R.id.main), (v, insets) -> {
            Insets systemBars = insets.getInsets(WindowInsetsCompat.Type.systemBars());
            v.setPadding(systemBars.left, systemBars.top, systemBars.right, systemBars.bottom);
            return insets;
        });

        // For now, load a static demo frame from the assets folder.
        // Later this will be replaced by images received over UDP.
        try (InputStream is = getAssets().open("photo.jpg")) {
            currentFrame = BitmapFactory.decodeStream(is);
        } catch (Exception e) {
            Log.e(TAG, "failed to load demo frame", e);
        }

        SurfaceView surfaceView = findViewById(R.id.video_view);
        surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
            @Override
            public void surfaceCreated(@NonNull SurfaceHolder surfaceHolder) {
                videoSurfaceHolder = surfaceHolder;
                imageThread = new Thread(MainActivity.this::renderLoop);
                imageThread.start();
            }

            @Override
            public void surfaceChanged(@NonNull SurfaceHolder surfaceHolder,
                                       int format, int width, int height) {
            }

            @Override
            public void surfaceDestroyed(@NonNull SurfaceHolder surfaceHolder) {
                // Stop the render thread before the surface goes away.
                videoSurfaceHolder = null;
                imageThread.interrupt();
                try {
                    imageThread.join();
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        });
    }

    private void renderLoop() {
        Paint paint = new Paint();
        paint.setStyle(Paint.Style.FILL);
        while (!Thread.currentThread().isInterrupted()) {
            // TODO: read a UDP packet here and decode it into currentFrame
            // with BitmapFactory.decodeByteArray(). For now, redraw the
            // demo frame at roughly 60 fps.
            try {
                Thread.sleep(16);
            } catch (InterruptedException e) {
                break;
            }
            SurfaceHolder holder = videoSurfaceHolder;
            if (holder == null || currentFrame == null) {
                continue;
            }
            Canvas canvas = holder.lockCanvas();
            if (canvas == null) {
                continue; // surface not ready for drawing
            }
            try {
                // Scale the frame to fill the surface.
                canvas.drawBitmap(currentFrame,
                        new Rect(0, 0, currentFrame.getWidth(), currentFrame.getHeight()),
                        new RectF(0f, 0f, canvas.getWidth(), canvas.getHeight()),
                        paint);
            } finally {
                holder.unlockCanvasAndPost(canvas);
            }
        }
    }
}
```