The Internet of Things (IoT) continues to connect household appliances and electronic devices via wired and wireless networks. But how should the embedded systems that underpin IoT provide security and privacy for online users in light of the rapidly evolving malware landscape? Lu and Lysecky provide techniques for the timely detection of malware-induced anomalies based on the timing characteristics of instruction cache misses, data cache misses, and intrinsic software execution.
The authors highlight the deficiencies of the many existing algorithms in the literature designed to cope with emerging malware. Clearly, IoT devices that rely on embedded systems, with limited processing resources and insufficient security at network interfaces, will continue to challenge network service providers charged with limiting users' exposure to malware and containing its vulnerabilities. To address the difficulty of isolating and resolving anomalous malware in a timely manner, the authors investigate the advantages and drawbacks of alternative models and systems for dynamically identifying and mitigating malware in real time on IoT devices with constrained processing resources.
Indeed, it is difficult to predict the behaviors and execution patterns of malware in embedded systems because the malware attributes required for clustering and classification are not known in advance. Moreover, mission-critical embedded devices demand low detection delay, which leaves little room for collecting the runtime data needed for real-time malware detection. Consequently, the authors offer: (1) a model that decomposes a system's lumped execution time into separate timing components for “instruction cache, data cache, and intrinsic software execution”; (2) new insights into the appropriate application of “range-based, distance-based, and one-class support vector machine[s]” for incorporating timing characteristics into effective malware detection; and (3) reliable experimental results from applying data-driven anomaly detection algorithms to model malware attacks using real-world benchmark application data (for example, gain scheduling, key leakage, and camera obfuscation).
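To make the first two contributions concrete, the range-based variant can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the per-component timing features, the cycle counts, and the tolerance margin are all assumptions introduced here for exposition. The idea is that ranges of normal timing are learned per component (intrinsic execution, instruction cache misses, data cache misses), and a runtime sample falling outside any learned range is flagged as anomalous.

```python
def train_ranges(normal_traces):
    """Learn per-feature [min, max] ranges from malware-free timing samples.

    Each sample is a tuple of timing components (hypothetical units, e.g. cycles):
    (intrinsic_exec_time, icache_miss_time, dcache_miss_time).
    """
    mins = [min(s[i] for s in normal_traces) for i in range(3)]
    maxs = [max(s[i] for s in normal_traces) for i in range(3)]
    return mins, maxs

def is_anomalous(sample, mins, maxs, tolerance=0.05):
    """Flag a sample if any timing component falls outside its learned range,
    widened by a small tolerance margin to limit false positives."""
    for x, lo, hi in zip(sample, mins, maxs):
        margin = tolerance * (hi - lo)
        if x < lo - margin or x > hi + margin:
            return True
    return False

# Hypothetical normal traces: (intrinsic, icache-miss, dcache-miss) timings.
normal = [(1000, 120, 200), (1010, 125, 210), (990, 118, 195), (1005, 122, 205)]
mins, maxs = train_ranges(normal)

print(is_anomalous((1002, 121, 203), mins, maxs))  # within learned ranges -> False
print(is_anomalous((1400, 300, 500), mins, maxs))  # inflated timings -> True
```

The distance-based and one-class SVM detectors the authors evaluate follow the same train-on-normal, flag-outliers pattern, trading the simple per-feature ranges here for a distance metric or a learned decision boundary over the joint timing distribution.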
The authors present clear strategies for efficiently implementing unobtrusive hardware malware sensors, selecting an appropriate anomaly detector based on the trade-off between hardware requirements and the required detection ratio. They also raise two challenging (and unresolved) research questions. First, how should “methods ... assist designers in auto-optimizing training parameters and monitored event selection, subject to constraints on area power and false-positive rates”? And second, how should runtime detection be integrated “with adaptive system risk models to enable automated mitigation of malicious threats”? I invite all security experts to read this paper and offer solutions to these thought-provoking questions.