Patricia Campbell
Computer Science

Exploring Machine Learning on an Internet of Things Edge Cluster

By Patricia Campbell
Cohort 2021-2022

Overview

My proposal for the Community of Practice was to explore Machine Learning (ML) on an IoT (Internet of Things) edge cluster. In a nutshell, instead of using the power of cloud computing to train ML models, we use tiny devices that consume fewer resources to do the training. The real breakthroughs in the use of and research into AI/ML were made possible by the collective resources and power of cloud computing. Before cloud computing became generally available we did not have the capacity to move forward, which is why AI/ML is a fairly recent phenomenon. Due to the limitations of using the cloud, training on the edge is being explored as a way to supercharge ML.

My experiments took me in a slightly different direction than I expected. It may be possible to explore my original premise in the future, but the time and equipment limitations made me realize that, in order to have results suitable for a Computer Science Tech student, I had to pivot.

As I experimented, I created some basic supporting slide decks and some code repositories; below is a blurb on each. Note that the slides were written with a target audience of Computer Science students.

General Information

I started this as part of my project and I am still working on it; hopefully it provides some clearer definitions of terminology in a central place: .

Background

The first slide deck I created was to present students with an understanding of computing on "the edge": it gives an overview of what edge computing means, what can be done there, and why we need ML for decision making.

The second slide deck presents some foundational ideas behind Machine Learning. It is an overview of the high-level concepts, including contrasting training on the edge versus using a pre-trained model.
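To make that contrast concrete, here is a minimal sketch of my own (not taken from the slides) that trains a tiny Keras model from scratch and, as the alternative, simply loads a model someone else already trained; the toy data, layer sizes, and the choice of MobileNetV2 are all assumptions for illustration.

```python
# Minimal sketch: training your own model vs. reusing a pre-trained one.
import numpy as np
import tensorflow as tf

# Option 1: train a tiny model yourself (toy data, arbitrary architecture).
x = np.random.rand(200, 4).astype("float32")
y = (x.sum(axis=1) > 2.0).astype("float32")
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, y, epochs=5, verbose=0)          # the training step happens here

# Option 2: skip training entirely and load a pre-trained model.
pretrained = tf.keras.applications.MobileNetV2(weights="imagenet")
print(model.count_params(), "parameters trained vs", pretrained.count_params(), "reused")
```

The point of the comparison is simply that option 2 does no training at all, which is why pre-trained models are attractive on resource-constrained edge devices.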

Algorithms & ML & Training

Computer Science students write algorithms from their first courses through the whole program. The next slide deck builds on the ideas they have been working with as they create their own algorithms, in order to illustrate the need for ML as complexity and the number of data factors increase.

The final slide deck introduces the ideas behind training ML algorithms, with simple introductory examples using some common algorithms: a very simplified Linear Regression, k nearest neighbours, and a neural network. Below you can find links to repositories where the code is available.

Try the tensorflow:

Code

As part of my exploration of this topic I worked with some code to illustrate a few of the simpler ML algorithms and their training, in order to gently introduce students with a programming background to the concepts of ML. The whole repository, referred to in slide deck 04, can be found here:

kNN

Scikit Learn: kNN

Three variants: 1. using a CSV to train, 2. generating our own dataset, 3. plain old Python:
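To give a feel for what such code looks like before opening the repository, here is a rough sketch of the "generate our own dataset" variant; the dataset size, number of features, split, and k value are assumptions, and the repository code may differ. The plain-Python variant presumably replaces the KNeighborsClassifier call with a hand-written distance-and-vote loop.

```python
# Sketch of kNN with scikit-learn on a generated dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Generate a small two-class dataset instead of loading a CSV.
X, y = make_classification(n_samples=200, n_features=4, n_classes=2, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=1)

# k nearest neighbours: classify a point by a vote of its k closest training points.
knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X_train, y_train)
print("accuracy:", knn.score(X_test, y_test))
```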

Linear Regression

Scikit Learn: Linear Regression

Three variants: 1. generating a dataset & results to train, 2. plain old Python, 3. both of the previous with timing:
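As a sketch of the generated-dataset version with timing (again my own illustration, not the repository's code; the true slope, intercept, noise level, and the use of time.perf_counter are assumptions):

```python
# Sketch of linear regression with scikit-learn on generated data, with timing.
import time
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(200, 1))
y = 3.0 * X[:, 0] + 2.0 + rng.normal(0, 1.0, size=200)   # y = 3x + 2 plus noise

start = time.perf_counter()
model = LinearRegression().fit(X, y)
elapsed = time.perf_counter() - start

print("learned slope:", model.coef_[0], "intercept:", model.intercept_)
print("fit time (s):", elapsed)
```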

Neural Net

DIY NN (using NumPy & Matplotlib for error chart):
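For a flavour of what a from-scratch network involves, here is a compact sketch of my own (not the repository's code): one hidden layer with sigmoid activations, trained on XOR, with the training error charted via Matplotlib. The layer sizes, learning rate, and iteration count are arbitrary.

```python
# DIY neural net sketch: NumPy for the math, Matplotlib for the error chart.
import numpy as np
import matplotlib.pyplot as plt

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)           # XOR targets

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))
errors, lr = [], 0.5

for _ in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    errors.append(np.mean((y - out) ** 2))
    # backward pass: gradient of the squared error through the sigmoids
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0, keepdims=True)

plt.plot(errors)
plt.xlabel("iteration")
plt.ylabel("mean squared error")
plt.show()
```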

Using Raspberry Pis & Edge Impulse

As mentioned, I initially wanted to do model training at the edge. I started by setting up a cluster to attempt federated learning, but I was unsuccessful. Given the same time and equipment limitations, I had to pivot; that is when I turned to Edge Impulse.

Here is a basis for a lab and a log of the process used:

Supporting the lab: ; also:
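For readers who want to see what the inference side of such a lab can look like in code, here is a minimal sketch: Edge Impulse can export a trained model in TensorFlow Lite format, which a Raspberry Pi can run with the tflite_runtime package. The file name model.tflite and the zeroed-out sample input are placeholders, and this is not necessarily how the lab itself proceeds (Edge Impulse also provides its own Linux SDK).

```python
# Sketch: running an exported TensorFlow Lite model on a Raspberry Pi.
import numpy as np
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="model.tflite")      # hypothetical file name
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Fake input matching the model's expected shape, just to show the call sequence.
shape = input_details[0]["shape"]
sample = np.zeros(shape, dtype=input_details[0]["dtype"])

interpreter.set_tensor(input_details[0]["index"], sample)
interpreter.invoke()
print(interpreter.get_tensor(output_details[0]["index"]))
```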

Summary

Any of my work posted here may be used as base material; all of it is licensed (except where stated otherwise). My own written work is licensed under . My own code is licensed under .

As always, there is more to learn and more to adjust for the CEGEP student, depending on the level of the students and the course, but please consider it all a work in progress! Any errors or omissions are my own.