Android MotionLayout Tutorial – Collapsing View

MotionLayout is a layout class that extends from ConstraintLayout. MotionLayout has all the features of ConstraintLayout. On top of that, it also provides the ability to easily animate changes to your UI, without needing to know much about UI interactions and the Android Animation Frameworks.

Take, for example, this subtle animation of a view being scrolled while the profile picture shrinks. Before MotionLayout, this would have been a tedious task to complete. We might have needed a CollapsingToolbar and some other custom animation code to ensure the profile picture scales correctly. Now, with MotionLayout, this is really easy to achieve with one extra XML file.

In this article, we will look at implementing a simple swipe action on a RecyclerView and how we can achieve the scaling animation with MotionLayout.

Add MotionLayout as a Gradle Dependency

To get started with MotionLayout, make sure you have the latest version in your build.gradle file. (Note: MotionLayout is still in alpha at the time of writing.)

 implementation 'androidx.constraintlayout:constraintlayout:2.0.0-alpha3'

Create your layout XML as per usual

The great part about MotionLayout is that it uses the same constructs as ConstraintLayout. Everything you’ve previously learnt about ConstraintLayout (i.e. barriers, chains, etc.) is applicable to layouts that we build with MotionLayout.

To get started, open up your editor and change the root element of your layout to use MotionLayout. Add a RecyclerView and an ImageView to your layout. Make sure the RecyclerView is constrained to the bottom of the ImageView.

RecyclerView with ImageView above

The XML behind the layout above should look similar to this:

<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.motion.widget.MotionLayout
        xmlns:android="http://schemas.android.com/apk/res/android"
        xmlns:app="http://schemas.android.com/apk/res-auto"
        xmlns:tools="http://schemas.android.com/tools"
        android:orientation="vertical"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        app:showPaths="false"
        app:layoutDescription="@xml/motion_layout_example">
    <View android:layout_width="0dp" android:layout_height="0dp"
          app:layout_constraintStart_toStartOf="parent"
          android:background="@color/colorPrimary"
          android:id="@+id/background"
          app:layout_constraintBottom_toBottomOf="@id/space"
          app:layout_constraintTop_toTopOf="parent"
          app:layout_constraintEnd_toEndOf="parent" android:alpha="0"
    />

    <ImageView
            android:layout_width="140dp"
            android:layout_height="0dp"
            android:scaleType="centerCrop"
            android:id="@+id/imageViewAvatar"
            android:layout_marginTop="16dp"
            app:layout_constraintTop_toTopOf="parent"
            app:layout_constraintStart_toStartOf="parent"
            android:layout_marginStart="16dp"
            app:layout_constraintDimensionRatio="h,1:1"
            app:srcCompat="@drawable/veruca"/>


    <TextView
            android:text="@string/veruca_salt_name"
            android:layout_width="0dp"
            android:layout_height="wrap_content"
            android:id="@+id/textViewName"
            android:fontFamily="@font/willywonka"
            app:layout_constraintStart_toEndOf="@+id/imageViewAvatar"
            android:layout_marginStart="16dp"
            app:layout_constraintEnd_toEndOf="parent"
            android:layout_marginEnd="16dp"
            android:textAppearance="@style/TextAppearance.AppCompat.Display1"
            android:layout_marginTop="8dp"
            app:layout_constraintTop_toTopOf="@+id/imageViewAvatar" 
            android:layout_marginBottom="8dp"
            app:layout_constraintBottom_toTopOf="@+id/space"/>

    <androidx.recyclerview.widget.RecyclerView
            android:layout_width="0dp"
            android:layout_height="0dp"
            app:layout_constraintEnd_toEndOf="parent"
            android:layout_marginEnd="8dp"
            app:layout_constraintStart_toStartOf="parent"
            android:layout_marginStart="8dp"
            app:layout_constraintBottom_toBottomOf="parent"
            tools:listitem="@layout/list_item_status"
            android:id="@+id/recyclerViewStatus"
            android:layout_marginTop="8dp"
            app:layout_constraintTop_toBottomOf="@+id/imageViewAvatar">

    </androidx.recyclerview.widget.RecyclerView>
    <Space
            android:layout_width="0dp"
            android:layout_height="8dp"
            android:id="@+id/space"
            app:layout_constraintTop_toBottomOf="@id/imageViewAvatar"
            app:layout_constraintEnd_toEndOf="@+id/imageViewAvatar"
            app:layout_constraintStart_toStartOf="@+id/imageViewAvatar"/>
</androidx.constraintlayout.motion.widget.MotionLayout>

Create the MotionScene XML

In order to animate this layout, we need to describe how the views in the layout should animate. To do this, create an XML file in the res/xml folder of your application. We will call it motion_layout_example.xml.

The first thing we will do is define the Transition for this MotionScene. We set references to the start and end ConstraintSets on the Transition object. We can also set the duration of the transition.

<?xml version="1.0" encoding="utf-8"?>
<MotionScene
        xmlns:app="http://schemas.android.com/apk/res-auto"
        xmlns:android="http://schemas.android.com/apk/res/android">

    <Transition
            app:constraintSetStart="@id/start"
            app:constraintSetEnd="@id/end"
            app:duration="1000">
        <OnSwipe
                app:touchAnchorId="@+id/recyclerViewStatus"
                app:touchAnchorSide="top"
                app:dragDirection="dragUp" />
    </Transition>
    <ConstraintSet android:id="@+id/start">
    </ConstraintSet>

    <ConstraintSet android:id="@+id/end">
    </ConstraintSet>
</MotionScene>

The next step, as seen in the code snippet above, is to create the OnSwipe declaration. This tells the MotionLayout to monitor the layout for a swipe movement. When the user performs a dragUp gesture on the view specified by touchAnchorId, the MotionScene will start interpolating between the two defined states (start and end). In this case, it is the recyclerViewStatus view that will be monitored for the dragUp gesture.

In the code snippet above, you may notice that there are two ConstraintSet tags defined in the MotionScene. These tags are for the start and end constraints of the view. The beautiful part about MotionLayout is that it will automatically interpolate between these two states and produce some pretty magical animations. At the moment, we don’t have any constraint changes applied, so let’s change that.

The first view we want to animate is the ImageView. We want the size of the view to decrease as the user scrolls up on the RecyclerView. To do that, we will add an end Constraint to the ImageView that adjusts the width and height to 40dp.

<?xml version="1.0" encoding="utf-8"?>
<MotionScene
        xmlns:app="http://schemas.android.com/apk/res-auto"
        xmlns:android="http://schemas.android.com/apk/res/android">

    <Transition
            app:constraintSetStart="@id/start"
            app:constraintSetEnd="@id/end"
            app:duration="1000">
        <OnSwipe
                app:touchAnchorId="@+id/recyclerViewStatus"
                app:touchAnchorSide="top"
                app:dragDirection="dragUp" />
    </Transition>

    <ConstraintSet android:id="@+id/start">

    </ConstraintSet>

    <ConstraintSet android:id="@+id/end">
        
        <Constraint android:id="@id/imageViewAvatar"
                    android:layout_width="40dp"
                    android:layout_height="40dp"
                    android:layout_marginTop="16dp"
                    app:layout_constraintTop_toTopOf="parent"
                    app:layout_constraintStart_toStartOf="parent"
                    android:layout_marginStart="16dp">
        </Constraint>

    </ConstraintSet>
</MotionScene>

Linking the MotionScene to the Layout

In order to link the MotionScene to the layout, set the property app:layoutDescription="@xml/motion_layout_example" on the root MotionLayout element so that it points to your newly created XML file.
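On the root element of the layout, the attribute looks like this (other attributes from the full layout above omitted for brevity):

<androidx.constraintlayout.motion.widget.MotionLayout
        xmlns:android="http://schemas.android.com/apk/res/android"
        xmlns:app="http://schemas.android.com/apk/res-auto"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        app:layoutDescription="@xml/motion_layout_example">

    <!-- ImageView, RecyclerView and other views as before -->

</androidx.constraintlayout.motion.widget.MotionLayout>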

Running this on a device, you will observe that as you scroll up on the RecyclerView, the ImageView decreases in size.

Animating Visibility of a View

Now, if you want to animate the background from invisible to visible, you can add a PropertySet to the start and end constraints. The final XML will look as follows:

<?xml version="1.0" encoding="utf-8"?>
<MotionScene
        xmlns:app="http://schemas.android.com/apk/res-auto"
        xmlns:android="http://schemas.android.com/apk/res/android">

    <Transition
            app:constraintSetStart="@id/start"
            app:constraintSetEnd="@id/end"
            app:duration="1000">
        <OnSwipe
                app:touchAnchorId="@+id/recyclerViewStatus"
                app:touchAnchorSide="top"
                app:dragDirection="dragUp" />
    </Transition>

    <ConstraintSet android:id="@+id/start">

        <Constraint android:id="@id/background">
            <PropertySet app:alpha="0"/>
        </Constraint>
    </ConstraintSet>

    <ConstraintSet android:id="@+id/end">

        <Constraint android:id="@id/imageViewAvatar"
                    android:layout_width="40dp"
                    android:layout_height="40dp"
                    android:layout_marginTop="16dp"
                    app:layout_constraintTop_toTopOf="parent"
                    app:layout_constraintStart_toStartOf="parent"
                    android:layout_marginStart="16dp">
        </Constraint>

        <Constraint android:id="@id/background">
            <PropertySet app:alpha="1"/>
        </Constraint>
    </ConstraintSet>
</MotionScene>

It’s a wrap!

As you can see from the short example above, using MotionLayout can decrease the amount of code you need to write in order to achieve delightful animations. In the next few posts, we will cover more of the features of MotionLayout, as this article has only scratched the surface. Till next time!

Follow me on Twitter – @riggaroo.

ConstraintLayout 2.0: ImageFilterView

Whilst browsing through the various examples online with the new ConstraintLayout 2.0, I stumbled upon ImageFilterView. This got my attention immediately and I decided to investigate further.

An ImageFilterView allows you to perform some common image filtering techniques on an ImageView, including saturation, contrast, warmth and crossfade. 

If you have tried to implement these image filters before, you may have run into ColorMatrix. If you look at the source of ImageFilterView, you will see that these techniques have simply been wrapped up in a simpler API.

For example, if you would like to adjust the warmth, contrast or saturation, all you need to do is set a property on the ImageFilterView:

<androidx.constraintlayout.utils.widget.ImageFilterView
        android:layout_width="0dp"
        android:layout_height="0dp"
        android:id="@+id/imageView"
        android:layout_marginStart="8dp"
        android:layout_marginTop="8dp"
        android:layout_marginEnd="8dp"
        app:warmth="1.2"
        app:contrast="1.0"
        app:saturation="2.0"
        app:layout_constraintTop_toTopOf="parent"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintDimensionRatio="16:9"
        tools:srcCompat="@tools:sample/avatars"
        />

You can also access this programmatically, so you could add a SeekBar to control these values.

seekBar.setOnSeekBarChangeListener(object : SeekBar.OnSeekBarChangeListener {
    override fun onProgressChanged(seekBar: SeekBar?, progress: Int, fromUser: Boolean) {
        // Map progress (0-100) to a saturation value between 1 and 2.
        val percentage = progress / 100.0f
        imageView.saturation = percentage + 1
    }

    override fun onStartTrackingTouch(seekBar: SeekBar?) {
        // No-op
    }

    override fun onStopTrackingTouch(seekBar: SeekBar?) {
        // No-op
    }
})

There is also the ability to crossfade between two different images using the crossfade property defined on ImageFilterView. This allows you to blend two images together.
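For example, a sketch of what this might look like in XML, assuming the app:altSrc attribute for the second image and placeholder drawable names:

<androidx.constraintlayout.utils.widget.ImageFilterView
        android:id="@+id/imageViewCrossfade"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        app:srcCompat="@drawable/first_image"
        app:altSrc="@drawable/second_image"
        app:crossfade="0.5" />

A crossfade of 0 should show only the first image, 1 only the second, and values in between blend the two; the same value can also be set programmatically on the view.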

If you are looking for a quick way to add some basic image effects, ImageFilterView is definitely something to consider. It is fast to use and execute since it is backed by ColorMatrix which uses the GPU (and not the CPU) to process the resultant image.

Here is an example of ImageFilterView in action:

Realtime Image Processing with ImageFilterView


The downside to using this approach is that you are not in full control of the exact pixel values that are going to be used, which could be problematic if you are developing an image editing application.

Overall, I’m really excited about the ImageFilterView class! I hope it is the start of some awesome Image effects offered by the Android Team.

Check out the ConstraintLayout demo repository for the code used in the above example.

Follow me on Twitter for more.


Building a Custom Machine Learning Model on Android with TensorFlow Lite

Building a custom TensorFlow Lite model sounds really scary. As it turns out, you don’t need to be a Machine Learning or TensorFlow expert to add Machine Learning capabilities to your Android/iOS App. 

One of the simplest ways to add Machine Learning capabilities is to use the new ML Kit from Firebase recently announced at Google I/O 2018. 

ML Kit is a set of Firebase APIs that provide Face Detection, Barcode Scanning, Text Recognition, Landmark Detection and Image Labelling. Some of these APIs provide an offline mode, which enables you to use these features without worrying about whether a user has an internet connection.

ML Kit is great for the common use cases described above, but what if you have a very specific use case? For example, you want to classify different kinds of candy boxes, or differentiate between different potato chip packets. This is where TensorFlow Lite comes in.

Nik Naks are a popular South African brand of Cheese Puffs

What is TensorFlow Lite?

TensorFlow Lite is TensorFlow’s solution for running lightweight models on mobile and embedded devices. It allows you to run a trained model on device. On Android, it can also make use of hardware acceleration via the Neural Networks API.

How do I train my own custom model?

There are a few steps we are going to take in order to build our own custom TensorFlow Lite model.

6 Steps to Retrain Mobile Image Classifier

Training a TensorFlow model can take a long time and requires a large corpus of data. Luckily, there is a way to make this process shorter, one that does not require gigabytes of images or tons of GPU processing power.

Transfer Learning is the process of using an already trained model and retraining it to produce a new model.

In this example, we will take the MobileNet_V1 model and retrain it on our own set of images.

This example is an adaptation of these two codelabs (1 and 2) and this talk from Yufeng Guo.

Prerequisites:

We need to install TensorFlow in order to run this example. You will also need to make sure Pillow is installed.

pip install --upgrade "tensorflow==1.7.*"

pip install Pillow

If the installation of TensorFlow doesn’t work, follow the instructions here.

Clone the following repository and cd into the directory:

git clone https://github.com/googlecodelabs/tensorflow-for-poets-2

cd tensorflow-for-poets-2

Step 1: Gather Training Data

For this part of the process, because we don’t have a large set of data to work with, taking a video recording of each chip packet will work well enough for our use case. In each video, we need to make sure we capture different angles of the chip packet and, if possible, different lighting conditions.

Here is an example of a video taken of a chip packet:

We would need a video of each packet of chips that we want to identify. 

Step 2: Convert Training Data into useful images

Once we have our videos from the previous step, we need to convert them into images. Using FFmpeg (a command-line tool for video processing), we can batch-convert a video into images by running this command for each video, substituting the name of the mp4 file and the folder and image name:

ffmpeg -i flings.mp4 flings/flings_%04d.jpg

Step 3: Folders of Images 

Once you have cut all your videos up into images, make sure you have a folder with all the training data. Inside the folder, group all the related images into labelled folders (this will already be the case if you followed the step above). It should look something like this:

Folder Structure for retraining TensorFlow Lite Model 
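If your frames all landed in one folder instead, a small Python helper can sort them into labelled folders based on their file name prefix. This is a hypothetical convenience script (not part of the codelab), assuming frames are named like flings_0001.jpg as produced by the FFmpeg command above:

```python
import shutil
from pathlib import Path

def group_images_by_prefix(source_dir: str, dest_dir: str) -> None:
    """Move images named like 'flings_0001.jpg' into dest_dir/flings/."""
    dest = Path(dest_dir)
    for image in Path(source_dir).glob("*.jpg"):
        # "flings_0001" -> label "flings"
        label = image.stem.rsplit("_", 1)[0]
        target = dest / label
        target.mkdir(parents=True, exist_ok=True)
        shutil.move(str(image), str(target / image.name))
```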

Step 4: Retrain the Model with the new images 

Once we’ve got our training data, we need to retrain the MobileNet_V1 model with our new images. The following Python script is run from the folder we checked out in the prerequisites.

# Base model to retrain (from the codelab): MobileNet at 50% size, 224px input
IMAGE_SIZE=224
ARCHITECTURE="mobilenet_0.50_${IMAGE_SIZE}"

python -m scripts.retrain \
--bottleneck_dir=tf_files/bottlenecks \
--how_many_training_steps=500 \
--model_dir=tf_files/models/ \
--summaries_dir=tf_files/training_summaries/"${ARCHITECTURE}" \
--output_graph=tf_files/retrained_graph.pb \
--output_labels=tf_files/retrained_labels.txt \
--architecture="${ARCHITECTURE}" \
--image_dir=training_data/south_african_chips

We run the scripts.retrain Python script with our new training data referenced as the image_dir. This step produces a retrained_graph.pb file.

Step 5: Optimise the Model for Mobile Devices

Once we are done retraining our model, we need to optimise the file to run on mobile devices. TOCO, the “TensorFlow Lite Optimizing Converter”, is a tool provided by the TensorFlow library that optimises the graph to run on mobile devices.

We pass the retrained_graph.pb file that we created in the previous step to this tool.

IMAGE_SIZE=224
toco \
--input_file=AgencyDay/retrained_graph.pb \
--output_file=AgencyDay/chips_optimized_graph.tflite \
--input_format=TENSORFLOW_GRAPHDEF \
--output_format=TFLITE \
--input_shape=1,${IMAGE_SIZE},${IMAGE_SIZE},3 \
--input_array=input \
--output_array=final_result \
--inference_type=FLOAT \
--input_data_type=FLOAT

After running this step, we have a chips_optimized_graph.tflite file, along with the labels stored in the retrained_labels.txt file from Step 4.

Side note: this step honestly took me a while to get working. There were a lot of issues, and I ended up having to dive deep into the TensorFlow libraries and build the whole TensorFlow library from source to be able to run TOCO. 🤷🏻‍ Apparently a tool is coming soon to the Firebase Console that will help developers easily optimise their models for Android without having to build TensorFlow from source. I suggest reading the codelab here if you are also struggling.

Step 6: Embed .tflite file into App or distribute via ML Kit on Firebase

Now open the android folder from the checked-out repository in Android Studio to build and run the project. Once you have it open, navigate to the class called ImageClassifier. Inside it, there are two fields you need to update with the new TensorFlow Lite model that we have created.

The MODEL_PATH and LABEL_PATH fields need to be updated with the names of the new files you created. You can place these files inside the assets folder of the app.

Step 7: Profit 🤑

Once we have retrained our model to our needs and embedded it in our app, we can run the app locally and see if it detects the correct chip packets. Here is an example of it working:

Things to consider

  • If you need to update your model, you would need to ship a new app update and hope that people download it. Another way to do this, without requiring an app update, is to host the model on Firebase. Have a read here for more information on how to do this.
  • TensorFlow Mobile is the older version of TensorFlow for Android/mobile devices. Make sure any tutorial you are following uses the new TensorFlow Lite and not TensorFlow Mobile.

Hopefully, this inspires you to train your own image classifier and ship some cool features in your apps! Find me on Twitter @riggaroo.

References:

On-Device Machine Learning: TensorFlow for Android https://youtu.be/EnFyneRScQ8

Codelabs with more information:

Teaching High School Girls about the Different Careers in Software Engineering

Yesterday I was invited to speak at St. Mary’s Diocesan School for Girls in Pretoria about Software Engineering and the different aspects of my everyday job. I was really excited to share my story with them. When I was in High School, we didn’t have this kind of opportunity. We had a Career Expo, but nothing like this: our Career Expo involved a bunch of stands in the school hall with people handing out brochures. I remember walking around and being way too scared to talk to anyone. I collected a few brochures and still had no clue what I wanted to do with my life.

Read More

Google Developer Launchpad Build SSA – Nairobi and Cape Town Events

I was lucky enough to be invited to speak in Nairobi and Cape Town this past week for the Google Developer Launchpad Build Series events.

The theme this year was Firebase. The event was a huge success and I had the best time! I gave a talk about Firebase Remote Config and Test Lab. Here are the slides from my talk:

Read More