Abstract:
The rapid expansion of the Internet of Things (IoT) has resulted in a paradigm
shift of computing from centralised cloud to edge environments, where data processing
is performed closer to the source. However, the deployment of intelligent IoT
applications within this edge-cloud continuum presents unique challenges, including
resource management, data processing efficiency, and system reliability. This
thesis focuses on enhancing the performance of intelligent applications by designing
approaches for optimising task allocation, load distribution, Machine Learning (ML)
operations, and data management in the IoT infrastructure.
The thesis aims to design a framework that supports efficient and cost-effective
operation of IoT applications across the edge-cloud continuum. I first propose an
algorithm for dynamic task allocation that emphasises minimising the completion
time while maximising the task execution performance. By formulating the algorithm
that dynamically allocates tasks based on real-time analytics and system state, the
approach effectively reduces execution latency and enhances the accuracy of real-time
decision-making processes.
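To make the allocation idea concrete, a minimal sketch is given below of a greedy rule that sends each task to the node with the smallest estimated completion time; the cost model (backlog, processing rate, network latency) and all names are illustrative assumptions, not the formulation developed in the thesis.

    # Illustrative sketch only: greedy allocation of tasks to edge/cloud nodes by
    # minimum estimated completion time. The cost model below (backlog / rate +
    # network latency) is an assumption, not the thesis formulation.
    from dataclasses import dataclass

    @dataclass
    class Node:
        name: str
        rate: float              # work units processed per second
        latency: float           # network latency to reach the node (seconds)
        backlog: float = 0.0     # work units already queued on the node

        def estimated_completion(self, work: float) -> float:
            # time = transfer latency + (queued work + new work) / processing rate
            return self.latency + (self.backlog + work) / self.rate

    def allocate(task_work: float, nodes: list) -> Node:
        # pick the node with the smallest estimated completion time,
        # then update the system state for the next decision
        best = min(nodes, key=lambda n: n.estimated_completion(task_work))
        best.backlog += task_work
        return best

    if __name__ == "__main__":
        nodes = [Node("edge-1", rate=5.0, latency=0.01),
                 Node("edge-2", rate=8.0, latency=0.02),
                 Node("cloud", rate=50.0, latency=0.15)]
        for work in [2.0, 4.0, 1.0, 8.0]:
            print(f"task({work}) -> {allocate(work, nodes).name}")
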
distribution framework for IoT applications deployed on edge computing infrastructure.
The mechanism prioritises completion time, waiting time, resource utilisation, evaluation
overhead, and failure rate, and provides a strategic approach that classifies tasks and
computational resources into categories such as restricted, public, private, and shared.
This results in a security-aware load distribution mechanism that handles IoT-based
tasks in real time.
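A minimal sketch of such a security-aware placement rule follows; the specific labels and the eligibility policy (restricted tasks confined to private resources, public tasks allowed anywhere) are assumptions used only to convey the idea of the classification.

    # Illustrative sketch only: a security-aware placement rule. The labels and
    # the eligibility policy are assumptions, not the thesis mechanism.
    TASK_CLASSES = {"restricted", "public"}
    RESOURCE_CLASSES = {"private", "shared"}

    def eligible(task_class: str, resource_class: str) -> bool:
        if task_class not in TASK_CLASSES or resource_class not in RESOURCE_CLASSES:
            raise ValueError("unknown class label")
        # restricted tasks are confined to private resources;
        # public tasks may be placed on any resource
        return resource_class == "private" or task_class == "public"

    def place(task: dict, resources: list):
        # among eligible resources, prefer the least loaded one
        candidates = [r for r in resources if eligible(task["class"], r["class"])]
        return min(candidates, key=lambda r: r["load"]) if candidates else None

    if __name__ == "__main__":
        resources = [{"name": "edge-private", "class": "private", "load": 0.7},
                     {"name": "edge-shared", "class": "shared", "load": 0.2}]
        print(place({"id": 1, "class": "restricted"}, resources)["name"])  # edge-private
        print(place({"id": 2, "class": "public"}, resources)["name"])      # edge-shared
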
operations, the thesis introduces an approach to select layers for model training using
a genetic algorithm. This method determines the optimal configuration of active and
inactive layers which enhances the model efficiency and adaptability during training
phases. A pruning mechanism is also developed which utilises heatmap to identify
A pruning mechanism is also developed that utilises heatmaps to identify
performance-critical features and simplifies the model by eliminating non-essential
features. This dual approach significantly reduces computational overhead and execution
time while preserving the essential analytical capabilities of the model and maintaining
its accuracy.
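The following sketch conveys the pruning idea under the assumption that the heatmap is a per-feature importance score (here approximated by absolute correlation with the target); the scoring rule and the keep-ratio threshold are illustrative, not the criteria derived in the thesis.

    # Illustrative sketch only: prune features whose importance score (a simple
    # "heatmap" of per-feature relevance) falls below a keep-ratio threshold.
    # The correlation-based score is an assumption for illustration.
    import numpy as np

    def importance_heatmap(X, y):
        # one importance score per feature: absolute correlation with the target
        return np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                         for j in range(X.shape[1])])

    def prune(X, y, keep_ratio=0.5):
        scores = importance_heatmap(X, y)
        k = max(1, int(keep_ratio * X.shape[1]))
        keep = np.sort(np.argsort(scores)[-k:])   # indices of the most critical features
        return keep, X[:, keep]

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 8))
        y = 3 * X[:, 2] - 2 * X[:, 5] + rng.normal(scale=0.1, size=200)
        kept, X_small = prune(X, y, keep_ratio=0.25)
        print("kept feature indices:", kept)       # expected to include 2 and 5
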
To handle IoT-based data, the thesis also proposes a methodology that ensures optimal
storage, access, and recovery of data and model files in the event of data loss or
system failure.
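One way to picture such a recovery mechanism is sketched below, where a data or model file is stored with a replica and a SHA-256 checksum and restored from the replica if the primary copy is missing or corrupted; the paths and layout are assumptions, not the storage scheme designed in the thesis.

    # Illustrative sketch only: checksummed storage with a replica, and recovery
    # of the primary copy on loss or corruption. Paths and layout are assumptions.
    import hashlib, shutil
    from pathlib import Path

    def checksum(path: Path) -> str:
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def store(src: Path, primary: Path, backup: Path) -> str:
        primary.parent.mkdir(parents=True, exist_ok=True)
        backup.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, primary)
        shutil.copy2(src, backup)
        return checksum(src)                       # reference checksum kept by the system

    def recover(primary: Path, backup: Path, expected: str) -> Path:
        # restore the primary copy from the replica on loss or corruption
        if not primary.exists() or checksum(primary) != expected:
            shutil.copy2(backup, primary)
        return primary

    if __name__ == "__main__":
        src = Path("model.bin"); src.write_bytes(b"weights")
        ref = store(src, Path("store/model.bin"), Path("replica/model.bin"))
        Path("store/model.bin").unlink()           # simulate data loss
        recover(Path("store/model.bin"), Path("replica/model.bin"), ref)
        print("recovered:", checksum(Path("store/model.bin")) == ref)
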
All these methods are designed to enhance the resilience of the IoT system, ensuring
that its performance, data integrity, and availability are maintained even under
adverse conditions. Through mathematical formulation of the
problems and implementation via simulation and testbed, I validate the feasibility and
performance of the proposed frameworks on an agricultural (weed detection) use-case scenario.