Learning Library

Edge Containerization on Android

Key Points

  • Edge computing moves data processing from centralized clouds to powerful mobile devices, enabling faster decisions and smarter data collection without heavy network latency.
  • Samsung and IBM are collaborating to bring containerization to Android devices, allowing entire applications with their dependencies to run securely and consistently at the edge.
  • Modern smartphones, like the Samsung Galaxy, combine high‑resolution sensors, 5G connectivity, and robust security (e.g., Samsung Knox) to serve as capable edge platforms for AI‑driven tasks.
  • A practical use case, such as a plant safety inspector using on‑device AI visual analysis, demonstrates how edge containers let users run sophisticated models locally and sync results back to the cloud for storage and future model training.

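The last key point describes a round trip: run a model on the device for each observation, keep results locally, then sync them to the cloud in one batch once the inspection is done. A minimal Python sketch of that flow is below. Everything here (`EdgeInspector`, the stub model, the stub upload) is an illustrative stand-in, not a Samsung or IBM API:

```python
# Illustrative sketch of the edge workflow: analyze each frame locally
# (no network round-trip per frame), then batch-sync results to the cloud.
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class InspectionResult:
    frame_id: int
    defect_found: bool


@dataclass
class EdgeInspector:
    results: List[InspectionResult] = field(default_factory=list)

    def inspect(self, frame_id: int, model: Callable[[int], bool]) -> InspectionResult:
        # The model runs on-device, so each frame is analyzed immediately.
        result = InspectionResult(frame_id, model(frame_id))
        self.results.append(result)
        return result

    def sync_to_cloud(self, upload: Callable[[List[InspectionResult]], int]) -> int:
        # One batched upload after the inspection, for storage and
        # potential model-training data, as described above.
        sent = upload(self.results)
        self.results.clear()
        return sent


# Usage with stub implementations:
inspector = EdgeInspector()
for fid in range(3):
    inspector.inspect(fid, model=lambda f: f % 2 == 0)  # stub "AI" check
uploaded = inspector.sync_to_cloud(upload=len)  # stub upload: count items
```

The point of the shape is the split: per-frame work stays on the device, and only the accumulated results cross the network.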
Full Transcript

# Edge Containerization on Android

**Source:** [https://www.youtube.com/watch?v=LBx56Q10G3Q](https://www.youtube.com/watch?v=LBx56Q10G3Q)
**Duration:** 00:05:32

## Sections

- [00:00:00](https://www.youtube.com/watch?v=LBx56Q10G3Q&t=0s) **Edge Computing on Android Devices** - The segment explains how Samsung and IBM are leveraging Android smartphones as powerful edge platforms—using containerization, AI, and built-in sensors—to process data locally, avoid cloud latency, and enable faster, smarter decision-making.
- [00:03:07](https://www.youtube.com/watch?v=LBx56Q10G3Q&t=187s) **Four Key Benefits of Containers** - The speaker outlines four primary advantages of using containers over virtual machines—easier deployment, streamlined development, improved performance, and enhanced security—especially for edge and cloud environments.

## Full Transcript
[0:00] When you think about enterprise processing power, you probably think about supercomputers. Or you might visualize a data center with racks of servers going from ceiling to floor. But as any mobile user probably knows, these devices have lots of processing power. This means they can handle the analysis of data directly. This power shift has the potential to change the way we work with the technology we use every day.

[0:26] In this video, I'll explain an exciting new emerging capability Samsung and IBM are working on together: the ability to use containerization at the edge on Android. As an example, I'll use a mobile phone. But what makes processing on the edge so important? Well, it doesn't require data to be sent across the network to a centralized cloud to be analyzed. Instead, it brings applications closer to the data, which enables faster decision making and smarter data collection. With a device like this Samsung Galaxy mobile phone, edge computing helps us close that last mile between tech and the people that use it.

[1:04] Modern mobile phones also have several beneficial functions as edge platforms. That includes high-res cameras, microphones, accelerometer sensors and geospatial capabilities. Not to mention they have significant processing power, 5G connectivity and advanced security suites like Samsung Knox.

[1:22] So, let's start with an example of how, when combined with AI, devices on the edge can provide rapid results right in someone's hands. Let's say, for example, I'm a plant safety inspector. My job would be a lot easier if I could offload routine inspection work to AI-based visual analysis tools. But these visual analysis tools don't have to be hosted in some faraway data center. That's where a really smart phone and camera come in.

In my job as plant inspector, I can take applications built in the cloud and have them sent directly to my device from a centralized management tool. So I walk into the plant and, for my initial inspection, start up an app and start recording what I see. I can then immediately apply the AI model running on my device as I work in the plant. That takes the guesswork out of my role and uses tech like AI where it can make the biggest difference. Once the inspection is done, the results can be sent back to the cloud. That way, they can be stored for historical purposes and as potential model training data.

[2:27] But how do we get our applications out to the edge and onto the devices we use every day in a way that is scalable and secure? One answer is containers. But what is a container in the first place? And why might we want to run one on something like this Samsung phone?

[2:45] In simple language, a container is a single executable package of software that includes an application and all the dependencies it needs to run. That means the application, its scripts and its libraries are all together in one place. So, in order to take advantage of containers on Samsung devices at the edge, we use the Docker runtime.

So how do containers compare with virtual machines? With virtual machines, each instance includes a separate operating system. Containers, on the other hand, run at the operating system level, so they share an operating system kernel. That means containers can spin up really fast, with little overhead per instance.

[3:25] There are other advantages to running containers on these devices at the edge, as well as across the cloud, but I'll focus on four main benefits. First, easier deployment: containers are portable, lightweight and easy to distribute through an orchestration platform like Kubernetes, without having to rebuild the core application each time.

Number two, easier development: using containers means we can tap into cloud-native developer skills. That means AI workloads in the cloud can be moved to the edge using existing development skills.

Number three, application performance: because OS-level resources are shared across containers in real time, devices can quickly adjust and maintain application resource requirements as demand rises and declines.

And finally, number four, security: while containers might share the OS kernel, it appears to each container that it has its own operating system, since it can access only what was installed within the container itself. This isolates processes to control what each application has access to and prevents access to other data or resources on the device.

[4:36] Now, with lots of applications going to lots of devices, this could quickly get pretty complex. But luckily, we can use software to set policies to help. So instead of needing someone working at a particular location to manage which applications go on which device for which tasks, the IT team can centralize this. With preset policies, they can automate app management. In our example, we can set a software policy so the container with the AI application is automatically sent to the plant safety inspector's mobile device, with the right models for the plant they're inspecting that day, without anyone in the plant managing this process.

[5:16] This simplifies the management and orchestration of our applications, and helps us bring tech out of the cloud and onto the devices we use every day in a scalable and secure way. Thanks for watching. Hey, before you leave, remember to hit "like" and "subscribe".
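The transcript's definition of a container (an application together with its scripts and libraries, run by the Docker runtime) can be sketched as a minimal Dockerfile. This is an illustrative example only; the base image, file names, and module name are hypothetical, not from the video:

```dockerfile
# Hypothetical packaging of an inspection app like the one described.
# The application, its scripts, and its libraries travel as one unit.
FROM python:3.12-slim

WORKDIR /app

# Dependencies are baked into the image, so the target device
# needs nothing preinstalled beyond a container runtime.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# The application code lives alongside its dependencies.
COPY inspect_app/ ./inspect_app/

CMD ["python", "-m", "inspect_app"]
```

Built once with `docker build`, the resulting image runs identically wherever a compatible container runtime is available, which is the portability that the four benefits above rely on.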