Have you ever tried to read a book or an article like this one on a bus or while walking down the street? I bet you have! And you surely noticed that reading text this way is not a great idea, since the screen constantly shakes. Screen shaking seems to be a big enough issue, and eliminating it promises a noticeably better UX. My idea is to use the device's accelerometer to compensate for and smooth out the shaking of screen content, in the same way a DSLR camera stabilizes its sensor or lenses. Technically this is possible, so why not try it yourself!
Existing solutions
First, let's look at existing solutions. There are several interesting articles on the subject.
- NoShake: Content Stabilization for Shaking Screens of Mobile Devices by Lin Zhong, Ahmad Rahmati and Clayton Shepard covers screen stabilization for the iPhone 3 and was published in 2009. The article concludes that screen stabilization works and gives noticeable results, but the algorithm uses "on average 30% of the 620 MHz ARM processor". This makes the implementation almost useless in real life. Although a modern iPhone could handle this much better, there are no source codes or a compiled application to try it out.
- Walking with your Smartphone: Stabilizing Screen Content by Kevin Jeisy, published in 2014. It has a good mathematical background and concludes that "Using a hidden Markov model, we have achieved a good stabilization in theory". Unfortunately, there are no sources or a compiled application, so we can't try it out.
- Shake-Free Screen. An investigation of the same question, but unfortunately with no ready-to-try results.
These documents provide a very good explanation of the subject, but there are no source codes or a pre-built application to try the solutions out. Let's try to reinvent the wheel and implement it in our own manner.
Theory
The accelerometer can be used to detect device movement. However, note that an accelerometer is, obviously, intended to measure acceleration. To answer the question of how to calculate movement from acceleration, let's look at a device with motion sensors:
There are three axes, so the accelerometer provides three output values. Technically, it consists of three separate accelerometers aligned with the corresponding axes, but let's consider it a single unit.
The accelerometer output is three numeric values representing the acceleration along the corresponding axes:
The acceleration is measured in m/s². As you can see, there is some acceleration along the Y axis. It is actually caused by Earth's gravity, and simply rotating the device will change all three values:
You can picture it as a ball tied to the device with a wire. This analogy is good enough: if you replace the ball with an arrow, you get the acceleration vector.
OK, but what about detecting device movement?
I can't show you a good screenshot here, but if you move your device a bit in space, the vector will change: it will now be the sum of two vectors: 1) the Earth's gravity vector, as before; 2) the device's acceleration vector caused by movement along the axis (or axes). The part that interests us is solely the device's own acceleration vector. It is easy to subtract the gravity vector from the resulting acceleration vector, but how does one get the true gravity vector? This task can be solved in different ways, but luckily Android has a special Linear Acceleration Sensor that does exactly what we need. Normally this sensor outputs zeros, and only moving the device produces non-zero values. Here is its source code if you're interested. We are one step closer to detecting device movements. Let's start coding something.
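To illustrate the "different ways" mentioned above: if TYPE_LINEAR_ACCELERATION were not available, gravity could be isolated manually with a low-pass filter, as in this minimal sketch (the smoothing factor 0.8 is an assumption and would have to be tuned empirically):

private final float[] gravity = new float[3];
private final float[] linear = new float[3];

private void removeGravity(float[] values)
{
    final float alpha = 0.8f; // assumed smoothing factor

    for (int i = 0; i < 3; i++)
    {
        // Low-pass filter: the slowly changing component is gravity.
        gravity[i] = alpha * gravity[i] + (1 - alpha) * values[i];

        // High-pass remainder: the device's own acceleration.
        linear[i] = values[i] - gravity[i];
    }
}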
Implementation
To work out how to calculate the device movement, let's start by coding a simple application with a single activity. The application will listen for accelerometer changes and move a special view accordingly. It will also show the raw accelerometer output on a graph:
I will only show the key snippets here. The complete code can be found in the Git repository. The key things are:
1. The special view that we will move. It is the blue block with text inside the view container:
<FrameLayout
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:layout_above="@id/graph1"
    android:background="@drawable/dots_repeat_bg"
    android:clipChildren="false">

    <LinearLayout
        android:id="@+id/layout_sensor"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:layout_margin="20dp"
        android:orientation="vertical"
        android:background="#5050FF">

        <ImageView
            android:id="@+id/img_test"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:src="@mipmap/ic_launcher"/>

        <TextView
            android:id="@+id/txt_test"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:textSize="15sp"
            android:text="@string/test"/>
    </LinearLayout>
</FrameLayout>
To move the layout_sensor view we will use the View.setTranslationX and View.setTranslationY methods.
We also subscribe to a click event on any view to reset the calculated values to zero, because at the first stage they will drift wildly:
private void reset()
{
    position[0] = position[1] = position[2] = 0;
    velocity[0] = velocity[1] = velocity[2] = 0;
    timestamp = 0;

    layoutSensor.setTranslationX(0);
    layoutSensor.setTranslationY(0);
}
2. Subscribe to accelerometer changes:
sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION);
sensorManager.registerListener(sensorEventListener, accelerometer, SensorManager.SENSOR_DELAY_FASTEST);
3. And the main thing: the accelerometer listener. Here is its basic implementation:
private final float[] velocity = new float[3];
private final float[] position = new float[3];
private long timestamp = 0;

private final SensorEventListener sensorEventListener = new SensorEventListener()
{
    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {}

    @Override
    public void onSensorChanged(SensorEvent event)
    {
        if (timestamp != 0)
        {
            // Time delta since the previous event, converted from ns to s.
            float dt = (event.timestamp - timestamp) * Constants.NS2S;

            for (int index = 0; index < 3; ++index)
            {
                velocity[index] += event.values[index] * dt;
                position[index] += velocity[index] * dt * 10000;
            }

            // Apply the integrated position to the view (the screen Y axis
            // points down, hence the inverted sign).
            layoutSensor.setTranslationX(position[0]);
            layoutSensor.setTranslationY(-position[1]);
        }
        else
        {
            velocity[0] = velocity[1] = velocity[2] = 0f;
            position[0] = position[1] = position[2] = 0f;
        }

        timestamp = event.timestamp;
    }
};
Let's examine what is going on here. The onSensorChanged method is called by the system every time the acceleration values change. First, we check whether this is the initial call, where the timestamp field is not yet initialized; in that case we just zero the main fields. On subsequent calls we perform the calculation using the following formula:
deltaT = time() - lastTime;
velocity += acceleration * deltaT;
position += velocity * deltaT;
lastTime = time();
You may have noticed the interesting number 10000. Treat it as a magic number: the integration produces a position in meters, while the view translation expects pixels, so a large empirical scale factor is needed to make the movement visible.
And here is the result:
As you can see, the current implementation has two problems:
- movement drifting;
- the view does not return to zero.
Actually, one fix addresses both issues: we need to introduce friction. The modified formula is as follows:
deltaT = time() - lastTime;
velocity += acceleration * deltaT - VEL_FRICTION * velocity;
position += velocity * deltaT - POS_FRICTION * position;
lastTime = time();
Well, the current implementation looks good enough. I would add a few "cosmetic" touches such as a low-pass filter, clamping of extreme values, and application settings.
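For illustration, here is a minimal sketch of the friction-modified update combined with those cosmetic touches; VEL_FRICTION, POS_FRICTION, LOW_PASS_ALPHA and MAX_OFFSET are assumed constants that would have to be tuned empirically:

private static final float VEL_FRICTION = 0.2f;   // assumed value, tune empirically
private static final float POS_FRICTION = 0.1f;   // assumed value, tune empirically
private static final float LOW_PASS_ALPHA = 0.5f; // assumed smoothing factor
private static final float MAX_OFFSET = 80f;      // max view offset, in pixels

private final float[] filtered = new float[3];

private void integrate(float[] values, float dt)
{
    for (int i = 0; i < 3; i++)
    {
        // Low-pass filter the raw acceleration to suppress sensor noise.
        filtered[i] += LOW_PASS_ALPHA * (values[i] - filtered[i]);

        // Integrate with friction so velocity and position decay back to zero.
        velocity[i] += filtered[i] * dt - VEL_FRICTION * velocity[i];
        position[i] += velocity[i] * dt * 10000 - POS_FRICTION * position[i];

        // Clamp extreme values so the view never leaves the screen.
        position[i] = Math.max(-MAX_OFFSET, Math.min(MAX_OFFSET, position[i]));
    }
}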
The final application can be found in the repository branch “standalone_app”.
AOSP
We have established a basic stabilization algorithm and developed a proof-of-concept application which demonstrates that screen content stabilization is possible. Now we can apply this groundwork to the whole device. That is a challenging task, but it only makes things more interesting.
This task requires some experience in building AOSP. Google provides all the necessary documentation. In essence, you need to download the Android sources for the selected Nexus device, build the image, and flash it. Don't forget to integrate the corresponding binary drivers before building.
Once you manage to get a working stock ROM image you can proceed to integrating screen stabilization.
The implementation plan is as follows:
- Find a way to shift the screen content on the device.
- Develop an API in the AOSP internals that allows a standard Android application to shift the screen content.
- Implement a sensor data processing service in our demo application. It will use the algorithm implemented earlier together with the API from the previous point to perform the stabilization. The service will be auto-started, so screen stabilization will work right after the device boots.
Now let me explain how I accomplished these tasks.
1. The first file to look at is DisplayDevice.cpp, which controls the device display parameters. The method of interest is void DisplayDevice::setProjection(int orientation, const Rect& newViewport, const Rect& newFrame), and the most interesting line for us is line 483:
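In the AOSP sources of that generation the line looks roughly like this (quoted from memory; the exact form may differ between Android versions):

mGlobalTransform = R * TP * S * TL;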
Here the final transformation matrix is composed from other components. All these variables are instances of the Transform class, which represents transformations as matrices and has several overloaded operators (such as *). To introduce a shift, add a new component there:
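A sketch of the patch, assuming translateX and translateY are new integer members of DisplayDevice (Transform::set(tx, ty) sets a pure translation):

Transform translate;
translate.set(translateX, translateY);
mGlobalTransform = translate * R * TP * S * TL;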
If you compile this and flash it onto your device, it will shift the screen by translateX pixels horizontally and translateY pixels vertically. Finally, we introduce a new method, void setTranslate(int x, int y);, which controls the translate matrix.
2. The second file to look at is SurfaceFlinger.cpp. This file is the key to exposing an API for accessing DisplayDevice. Just add a new method:
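A sketch of such a method, assuming it simply walks SurfaceFlinger's display list and triggers a repaint:

void SurfaceFlinger::setTranslate(int x, int y) {
    for (size_t dpy = 0; dpy < mDisplays.size(); dpy++) {
        // Forward the shift to every known display device.
        const sp<DisplayDevice>& hw(mDisplays[dpy]);
        hw->setTranslate(x, y);
    }
    repaintEverything();
}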
which will call setTranslate for all displays. The next piece may look odd, but I will explain it later: we need to modify the status_t SurfaceFlinger::onTransact(uint32_t code, const Parcel& data, Parcel* reply, uint32_t flags) method by adding the following switch branch:
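A sketch of the branch, using 2020 as the custom transaction code (the parameter reading mirrors how the existing debug codes in onTransact are handled):

case 2020: {
    // Read the desired shift and apply it to all displays.
    int x = data.readInt32();
    int y = data.readInt32();
    setTranslate(x, y);
    return NO_ERROR;
}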
This piece of code is the entry point to our modification.
3. The sensor data processing service is pretty simple: it uses the same algorithm developed earlier to calculate the screen offset values. Finally, it uses IPC to pass these values to SurfaceFlinger:
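A minimal sketch of that call; the interface descriptor string and the transaction code 2020 must match the SurfaceFlinger patch above:

private void setTranslate(int x, int y)
{
    try
    {
        // ServiceManager is a hidden API; see the note below.
        IBinder flinger = ServiceManager.getService("SurfaceFlinger");
        if (flinger != null)
        {
            Parcel data = Parcel.obtain();
            data.writeInterfaceToken("android.ui.ISurfaceComposer");
            data.writeInt(x);
            data.writeInt(y);
            flinger.transact(2020, data, null, 0);
            data.recycle();
        }
    }
    catch (RemoteException e)
    {
        e.printStackTrace();
    }
}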
ServiceManager is not recognized by Android Studio because it is hidden and not available to non-core applications. A core application has to be built together with AOSP using the makefile build system; this allows our application to get some mandatory extra permissions and access hidden Android APIs. To obtain the SurfaceFlinger service, the application needs the "android.permission.ACCESS_SURFACE_FLINGER" permission, which can be granted only to system applications (see below). To call our API with code 2020, the application needs the "android.permission.HARDWARE_TEST" permission, which is also restricted to system applications. And finally, to make our application a system one, modify its manifest in the following way:
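A sketch of the relevant manifest part; the package name is a placeholder, and android:sharedUserId="android.uid.system" is the standard way to run an app under the system user (it requires signing with the platform key, see the makefile below):

<!-- The package name below is a placeholder. -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.screenstabilization"
    android:sharedUserId="android.uid.system">
    ...
</manifest>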
and create a corresponding makefile:
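A sketch of the Android.mk, assuming the standard system-app layout; LOCAL_CERTIFICATE := platform signs the APK with the platform key:

LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)

LOCAL_MODULE_TAGS := optional
LOCAL_SRC_FILES := $(call all-java-files-under, src)
LOCAL_PACKAGE_NAME := ScreenStabilization
LOCAL_CERTIFICATE := platform

include $(BUILD_PACKAGE)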
Everything else in the application (the boot broadcast receiver, settings, etc.) is standard, and I will not cover it here. The only thing to note is how to make this application preinstalled (i.e. built into the ROM image). Just place the application source code into the {aosp}/packages/apps directory and modify the core.mk file to include the application:
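A sketch of the core.mk change; adding the package to PRODUCT_PACKAGES is the standard way to include an app built from source in the image:

PRODUCT_PACKAGES += \
    ScreenStabilization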
The final demo:
You can find all detailed information and the source code on GitHub.
There is the ScreenStabilization application, which should be placed into the {aosp}/packages/apps directory, and two AOSP patches: 0001-ScreenStabilization-application-added.patch, to be applied to the {aosp}/build directory, and 0001-Translate-methods-added.patch, to be applied to the {aosp}/frameworks/native directory.
The Nexus 7 2013 Mobile ROM image is built in the "userdebug" configuration, so it is intended mostly for testing. To flash the image, reboot the device into the bootloader by holding "volume down" and pressing the "power" button at the same time, then type:
fastboot -w update aosp_deb_screen_stabilization.zip
This procedure will erase all data on your device. Note that to flash any custom ROM, you first need to unlock the device bootloader with the command:
fastboot oem unlock
Conclusion
This article shows how to implement a simple screen stabilization algorithm and apply it to a whole device by customizing the Android source code and building a custom ROM. The algorithm is not polished yet, but it is sufficient for demo purposes. We created a custom ROM for the Nexus 7 2013 Mobile device, but the resulting source code can be applied to any Nexus device, or even to other AOSP-based systems such as CyanogenMod, which makes it possible to integrate this screen stabilization solution into new devices.