Hardware Fairness

Hardware design and the fairness of a neural network

Ensuring the fairness of neural networks is crucial when applying deep learning techniques to critical applications such as medical diagnosis and vital-sign monitoring. However, maintaining fairness becomes increasingly challenging when these models are deployed on platforms with limited hardware resources, as existing fairness-aware neural network designs typically overlook the impact of resource constraints. Here we analyse how the underlying hardware affects the pursuit of fairness, using neural network accelerators with compute-in-memory architectures as examples. We first investigate the relationship between the hardware platform and fairness-aware neural network design. We then discuss how advances in emerging computing-in-memory devices, in terms of on-chip memory capacity and device variability management, affect neural network fairness. We also identify challenges in designing fairness-aware neural networks on such resource-constrained hardware and consider potential approaches to overcome them.
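The connection between device variability and fairness can be made concrete with a small simulation. The sketch below is an illustrative assumption rather than the article's method: it perturbs the weights of a toy linear classifier with multiplicative noise, a stand-in for conductance variation in compute-in-memory cells, and reports the accuracy gap between two hypothetical demographic groups as the variability level grows. The data, group definitions, and fairness proxy (largest pairwise accuracy difference) are all assumptions made for illustration.

```python
import numpy as np

# Illustrative sketch only (assumed setup, not taken from the article):
# simulate compute-in-memory (CIM) weight variability as multiplicative
# noise on a toy linear classifier and track the accuracy gap between groups.
rng = np.random.default_rng(0)
n_features = 16

w_true = rng.normal(size=n_features)                 # ground-truth weights
w_trained = w_true + rng.normal(0, 0.1, n_features)  # "trained" (ideal) weights

def make_group(shift, scale, n=1000):
    """Hypothetical subpopulation with its own feature statistics."""
    X = rng.normal(shift, scale, (n, n_features))
    y = (X @ w_true > 0).astype(int)
    return X, y

groups = {"group_A": make_group(0.0, 1.0),
          "group_B": make_group(0.3, 1.5)}

def accuracy(w, X, y):
    return float(((X @ w > 0).astype(int) == y).mean())

def accuracy_gap(w):
    """Largest pairwise accuracy difference across groups (fairness proxy)."""
    accs = [accuracy(w, X, y) for X, y in groups.values()]
    return max(accs) - min(accs)

# Sweep the assumed relative variability of the CIM weight conductances.
for sigma in (0.0, 0.05, 0.1, 0.2):
    gaps = [accuracy_gap(w_trained * (1 + rng.normal(0, sigma, n_features)))
            for _ in range(200)]                      # Monte Carlo over devices
    print(f"sigma={sigma:.2f}  mean accuracy gap = {np.mean(gaps):.3f}")
```

In a real study, the same Monte Carlo loop would wrap a full accelerator noise model and a trained network, but the measurement idea is the same: fairness must be evaluated over the distribution of fabricated devices, not only on the ideal weights.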

https://doi.org/10.1038/s41928-024-01213-0
