Cloud computing, the information technology paradigm of delivering computing and storage resources over the Internet, has earned a good deal of recognition in the last few years, whether we’re talking about public clouds gaining traction among individual users or private and even hybrid cloud architectures increasingly becoming the corporate storage solution. In fact, cloud computing is laying the groundwork for new startups and job opportunities. This phenomenon is in turn fostering a slew of research initiatives and much investment.
In the US, the University of Arizona’s Center for Cloud and Autonomic Computing (CAC) is widely considered one of the field’s leading research departments. The National Science Foundation recently contributed $500,000 to the CAC’s research endeavors.
In the words of Dr. Salim Hariri, the CAC’s director, “The CAC mission is to conduct near-term research projects to advance the knowledge of how to design cloud and autonomic computing systems that are capable of self-healing, self-protecting, and self-optimizing themselves with little involvement of users or system administrators.”
Its advocates make the case that, unlike other research initiatives, which are often narrowly focused, the CAC’s approach tackles the various aspects of cloud computing – performance, security, faults, availability, energy, and more – in a holistic manner.
This does not mean, however, that cloud computing’s individual elements and their attendant problems receive short shrift. Consider the pressing matter of cybersecurity. The CAC is currently working on designing cybersecurity models that “monitor, manage, and analyze systems, applications, and cyber resources” in an autonomous and therefore self-regulating and even self-reinventing way, so that the technology develops pari passu with new threats as they surface.
Here’s a partial list of projects the CAC is undertaking, some of which are joint initiatives with industry titans and/or US government agencies:
1. Autonomic Cyber Security: A paradigm shift in cybersecurity
Autonomic Cyber Security, or ACS, is a form of autonomic computing cybersecurity. Its design is inspired by the human nervous system. In addition to possessing a cyber infrastructure that is largely self-manageable and self-optimizing, ACS self-corrects in the case of software/hardware faults. Ultimately, it will be equipped to protect itself from cyber threats.
The methodology utilized here is referred to as Anomaly Behavior Analysis (ABA), which will “perform data-driven analytics on operations of cyber components” before “determin[ing] what automated or semi-automated actions to take in order to respond in a proactive manner to the detected anomalous events.”
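The quoted pipeline – data-driven analytics followed by an automated or semi-automated response – can be sketched minimally. Everything concrete below (the metrics, the z-score test, the "quarantine" response) is an illustrative assumption rather than the CAC's actual design:

```python
from statistics import mean, stdev

def build_baseline(observations):
    """Offline step: learn per-metric mean/stdev from normal operation data.
    observations maps a metric name to a list of historical readings."""
    return {m: (mean(vals), stdev(vals)) for m, vals in observations.items()}

def anomaly_score(baseline, sample):
    """Largest z-score across metrics; a high value suggests the component
    is behaving unlike its learned baseline."""
    scores = []
    for metric, value in sample.items():
        mu, sigma = baseline[metric]
        scores.append(abs(value - mu) / sigma if sigma else 0.0)
    return max(scores)

def respond(baseline, sample, threshold=3.0):
    """Semi-automated response: take action when the score exceeds the
    threshold, otherwise let the operation proceed."""
    return "quarantine" if anomaly_score(baseline, sample) > threshold else "allow"
```

The threshold controls the trade-off between false alarms and missed detections; a production system would tune it per component.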
2. Internet of Things security framework for smart cyber infrastructures
This strand of research looks closely at Building Automation Systems (BAS) and Supervisory Control and Data Acquisition (SCADA): in other words, the infrastructure for managing smart homes and buildings. The ever-growing connectivity through the Internet of Things (IoT) makes these systems much more liable to cyberattacks.
Here, an IoT security framework comprising four layers – end devices (nodes), networks, services, and applications – is erected. The threat model devised in this case is known as the Anomaly Behavior Analysis Intrusion Detection System, and it serves to “better recognize the vulnerabilities in each layer and the possible countermeasures that can be deployed to mitigate their exploitation.”
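As a rough illustration of that four-layer structure, the framework's layers can be modeled as a lookup from layer to threats and countermeasures. The layer names come from the project description; the example threats and countermeasures are generic textbook instances, not the CAC's actual catalog:

```python
# Illustrative model of the four-layer IoT security framework.
IOT_THREAT_MODEL = {
    "end devices":  {"threats": ["node tampering", "firmware modification"],
                     "countermeasures": ["secure boot", "device attestation"]},
    "networks":     {"threats": ["man-in-the-middle", "traffic flooding"],
                     "countermeasures": ["TLS/DTLS", "rate limiting"]},
    "services":     {"threats": ["API abuse", "credential theft"],
                     "countermeasures": ["authentication", "audit logging"]},
    "applications": {"threats": ["code injection", "data leakage"],
                     "countermeasures": ["input validation", "encryption at rest"]},
}

def countermeasures_for(layer):
    """Look up the mitigations that can be deployed at a given layer."""
    return IOT_THREAT_MODEL[layer]["countermeasures"]
```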
3. Anomaly Behavior Analysis of websites’ vulnerabilities
This is all about research, carried out online, that develops optimal “anomaly behavior analysis of websites,” with a particular focus on HTML files. The approach here utilizes “feature selection, data mining, data analytics and statistical techniques” to home in on affected web content so as to identify and eventually treat “phishing attacks, cross site scripting attacks, html injection attacks, malware insertion attacks.”
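The quoted techniques aren't detailed here, but the first step – turning an HTML file into features that data mining can act on – might look like the following sketch. The specific features and thresholds are illustrative assumptions, not the project's actual feature set:

```python
import re

def extract_features(html):
    """Count a few indicators commonly used in web-content analysis."""
    return {
        "script_tags": len(re.findall(r"<script\b", html, re.I)),
        "iframes": len(re.findall(r"<iframe\b", html, re.I)),
        "event_handlers": len(re.findall(r"\bon\w+\s*=", html, re.I)),
        "external_forms": len(re.findall(
            r"<form[^>]+action\s*=\s*[\"']https?://", html, re.I)),
    }

def looks_suspicious(html, limits={"script_tags": 20, "iframes": 3,
                                   "event_handlers": 30, "external_forms": 0}):
    """Rule-based screen: any feature over its limit flags the page, e.g. a
    form posting credentials to an external host (a phishing indicator)."""
    feats = extract_features(html)
    return any(feats[k] > limits[k] for k in limits)
```

A statistical or learned model would replace the fixed limits; the feature extraction step stays the same.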
4. Anomaly Behavior Analysis for Building Automation Systems
Here, ABA’s Intrusion Detection System is once again utilized, this time to monitor BAS protocols and sensors. Information from BAS is filtered through “behavior analysis methods including Discrete Wavelets Transform (DWT) and rule based abnormal behavior analysis” and is extracted and collected into two categories of data structures: Protocol Context Aware Data Structure (PCADS) and Sensor-DNA (s-DNA).
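As an illustration of the wavelet side of this pipeline, a one-level Haar DWT splits a sensor signal into a slow-moving trend and high-frequency detail coefficients, and a rule on the detail coefficients then flags abrupt jumps. The threshold and the sensor semantics below are assumptions, not the project's actual parameters:

```python
def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform: pairwise averages
    (approximation/trend) and pairwise half-differences (detail)."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def abnormal(signal, max_detail=5.0):
    """Rule-based check: a large detail coefficient means an abrupt jump
    between adjacent readings, which a slowly varying BAS sensor
    (e.g. a room-temperature probe) should not produce."""
    _, detail = haar_dwt(signal)
    return any(abs(d) > max_detail for d in detail)
```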
5. Mining Internet Relay Chat for information on hackers
This attempt to strike at the heart of cybercrime takes into consideration the human element, with its cognitive behaviors and goals, and how it interacts with ever-evolving tools.
By analyzing data on hackers collected in Internet Relay Chat (IRC), researchers can understand behavioral patterns and end goals. Besides deploying bots that aid in the achievement of this task, a hacking language module based on Stanford CoreNLP has been developed to analyze hacker activity.
6. Vehicle Information and Management Portal
This project looks at malicious breaches targeting smart vehicles, and seeks to develop a Vehicle Information and Management Portal (VIMP). Such a creation would connect the various components in play and make them universally accessible through a portal specific to each vehicle, all made readily available online using cloud and Internet technologies.
“By connecting cars to VIMP services, we can offer revolutionary information services in entertainment, communication, collaboration...[and] increase safety by proactively and reactively warning about...dangerous conditions, continuous access to field data, on-line firmware update, just to name a few.”
7. An Autonomic Workflow Performance Manager for researching and forecasting the weather
This project looks at one of the world’s most destructive natural hazards: tropical cyclones. The models emulating cyclones are fed with a near-continuous stream of real-time data obtained through observation. This enhances their potential for accurate forecasts while enabling on-the-ground emergency services to mitigate the risks posed by the gargantuan storms.
“As a part of this research, we are developing an autonomic workflow performance manager (AWPM) to test an integrated dynamic hurricane modeling environment for an end-to-end predictive tool to inform interested actors of real hazards associated with a landfalling hurricane.”
8. Resilient Software-Defined Radios and Moving Target Defense
The purpose of this research is the development of “resilient wireless communications architecture” for Software-Defined Radios (SDRs).
The method for achieving such resilience is “randomly changing the runtime characteristics of the wireless communications channels between the different wireless nodes in order to make it extremely difficult to succeed in launching attacks.” No wonder it’s called the Moving Target Defense.
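One common way to realize such random channel changes – an assumption on our part, since the project's actual mechanism isn't described – is to have both legitimate nodes derive the next channel from a shared secret and the current time slot:

```python
import hmac, hashlib

def next_channel(shared_key: bytes, time_slot: int, num_channels: int = 16) -> int:
    """Derive the channel for a given time slot from a keyed hash, so the
    hop sequence is deterministic for the two nodes holding the key but
    unpredictable to an outside attacker. The key and the 16-channel
    radio are illustrative assumptions."""
    digest = hmac.new(shared_key, time_slot.to_bytes(8, "big"), hashlib.sha256).digest()
    return int.from_bytes(digest[:4], "big") % num_channels
```

Both nodes compute the same hop sequence; an attacker who cannot recover the key has to chase the link across channels, which is exactly the moving-target effect.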
9. HeartCyPert: Design and analysis of a heart cyber expert system
The aim here, a task of considerable complexity, is to develop a 3D computational model that predicts the risk of ventricular arrhythmia in highly susceptible patients. The model relies on data-driven engines that monitor both animals and humans suffering from chronic heart failure. This helps regulate parameters for the 3D cardiac model, and, as symptoms are detected, indicates the likelihood of cardiac complications. As a result, researchers will emerge better-equipped to tailor treatment to individual patients.
10. Autonomic Cloud Management System
Power consumption and operating costs of large data centers and clouds are a growing concern: management systems must strike a balance between the resources they provision and the performance the system delivers. As workloads fluctuate, the computing services consuming that power must prove elastic and smart.
Research in this area “presents an autonomic power and performance management system that utilizes AppFlow-based reasoning to configure virtual machine (VM) resource allocation dynamically during runtime.” As a result, the power the VM draws is kept free of unwanted fluctuations.
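AppFlow-based reasoning itself isn't detailed here, but the core feedback loop – shrink the allocation when power is being wasted, grow it when performance is at risk – can be sketched as follows. The utilization band and vCPU limits are illustrative assumptions:

```python
def adjust_vcpus(current_vcpus, utilization, low=0.3, high=0.8,
                 min_vcpus=1, max_vcpus=16):
    """One step of an autonomic controller: keep VM utilization inside
    [low, high], avoiding both wasted power (over-provisioning) and
    degraded performance (under-provisioning)."""
    if utilization > high and current_vcpus < max_vcpus:
        return current_vcpus + 1   # performance at risk: add capacity
    if utilization < low and current_vcpus > min_vcpus:
        return current_vcpus - 1   # power being wasted: release capacity
    return current_vcpus           # inside the band: hold steady
```

Run periodically against live utilization metrics, a controller like this keeps resource allocation tracking the workload instead of a static worst-case estimate.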
11. Cybersecurity Lab as a Service (CLaaS)
This strand of research uses cloud technology to fashion a cybersecurity environment, one that serves as an arena for experimentation and education. It is called Cybersecurity Lab as a Service (CLaaS). What happens is that the cloud, accessible from any smart device, becomes a virtual cybersecurity space where experiments and training are carried out.
By testing these spaces and uncovering vulnerabilities, researchers can shed light on how they are “exploited to launch cyberattacks, how they can be removed, and how cyber resources and services can be hardened or better protected.”
12. Scalable feature subset selection and learning in dynamic environments
The Neyman-Pearson Feature Selection (NPFS) scheme is key to such an endeavor. NPFS is a parallel feature selection process that scales to handle large quantities of data, “allowing a user the choice of a filter-based objective function” and providing the ability to “identify the number of relevant features in a data set.”
This research relaxes two assumptions common in machine learning: 1) that “the training and testing data are sampled from a fixed probability distribution” (when this fails, the data exhibit what is known as concept drift), and 2) that “there are an equal number of samples from all classes” (when this fails, the data suffer from class imbalance).
The end goal is to combine theoretical frameworks with empirical observations so as to learn with multiple classifier systems in unknown or nonstationary environments.
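Based on the description above, a compact sketch of NPFS: run a filter-based selector on bootstrap samples (the embarrassingly parallel part), then use a binomial test – the Neyman-Pearson flavor of the scheme – to decide which selection counts exceed chance. The correlation-based objective function and all parameters below are illustrative choices, not the authors' exact configuration:

```python
import math, random

def binom_critical(n, p, alpha=0.01):
    """Smallest z with P(X >= z) <= alpha for X ~ Binomial(n, p)."""
    tail = 1.0  # P(X >= 0)
    for z in range(n + 1):
        if tail <= alpha:
            return z
        tail -= math.comb(n, z) * p**z * (1 - p)**(n - z)
    return n + 1

def top_k_by_correlation(X, y, k):
    """Filter-based objective: rank features by |correlation| with the label."""
    n, d = len(X), len(X[0])
    def score(f):
        xs = [row[f] for row in X]
        mx, my = sum(xs) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(xs, y))
        vx = sum((a - mx) ** 2 for a in xs)
        vy = sum((b - my) ** 2 for b in y)
        return abs(cov) / math.sqrt(vx * vy) if vx and vy else 0.0
    return sorted(range(d), key=score, reverse=True)[:k]

def npfs(data, labels, n_bootstraps=30, k=1, alpha=0.01, seed=0):
    """Run the filter on bootstrap samples (each run is independent, hence
    parallelizable), count how often each feature is chosen, and keep the
    features whose counts beat chance under a Binomial(n, k/d) null."""
    rng = random.Random(seed)
    n, d = len(data), len(data[0])
    counts = [0] * d
    for _ in range(n_bootstraps):
        rows = [rng.randrange(n) for _ in range(n)]
        X = [data[i] for i in rows]
        y = [labels[i] for i in rows]
        for f in top_k_by_correlation(X, y, k):
            counts[f] += 1
    z_star = binom_critical(n_bootstraps, k / d, alpha)
    return [f for f in range(d) if counts[f] >= z_star]
```

Because the relevance decision comes from the test rather than from k itself, the scheme can report a different (data-driven) number of relevant features than the user guessed.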
13. Just In Time Architecture: A new paradigm for designing agile data centers
Just in Time Architecture (JITA) attempts to find practical solutions to nonlinear scaling and unpredictable data growth.
Because energy consumption fluctuates drastically and rapidly, provisioning resources poses a challenge, particularly when new hardware configurations must be devised to fit each workload. The proposed solution is a malleable JITA whose system is built from flexible, automated building blocks, reducing inefficiency and enabling a protean data center infrastructure.
14. Resilient Applications as a Service (RAaaS)
CAC head Dr. Hariri has teamed up with Dr. Cihan Tunc to explore resilient cloud services, specifically Service Oriented Architecture (SOA) and Autonomic Computing. The project emphasizes four objectives:
- “Develop an SOA based architecture to design adaptive and resilient cloud applications”
- “Develop Adaptive Resilient Cloud Algorithms”
- “Autonomic Control and Management”
- “Experimental Testbed and Evaluation”
15. Cyber Security Assistant (CSA) technology for everyone
This is another project spearheaded by Hariri and Tunc. The goal is to develop innovative cybersecurity technologies and create cyber assistants that are accessible to all – not just large-scale companies with big budgets.
The envisaged Cyber Security Assistant (CSA) would function around the clock to provide protection while also answering questions about cybersecurity. In conjunction with this, a cybersecurity laboratory instantly accessible from any smart device hooked up to the Internet would “provide cybersecurity information, recommendations, and tools.”
The objectives of this research are to:
- “Implement a real-time 24/7 automated cybersecurity’s question/answering system (Ask CyPert) that will be powered by IBM Watson cognitive services”
- “Develop a Cybersecurity Lab as a Service (CLaaS) to allow any user to conduct sophisticated cybersecurity experiments without the need to acquire the physical resources and software required to build such experiments by exploring cloud and virtualization technologies”
16. Anomaly detection in Internet of Things sensors
This research deals with sensors: the components of an IoT environment that represent the physical world in the virtual one. Because sensors are vulnerable, it’s important to have a system in place that can detect compromised sensors – and take action.
“We developed an algorithm to create a sensor-DNA data structure that uniquely defines the correct operations of the sensor and can be used to detect sensor compromises and attacks.” In other words, each sensor is identified by a DNA data structure built from feedback on its own distinctive pattern of operation. The methodology employed here brings together offline training and online testing.
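A minimal sketch of that offline-training / online-testing split; the profile used here (value range plus maximum step size) is an assumed stand-in for the project's actual s-DNA structure:

```python
class SensorDNA:
    """Offline training: learn the range and maximum step size of a sensor's
    normal readings. Online testing: flag any reading that violates the
    learned profile, which may indicate a compromised sensor."""

    def __init__(self, readings):
        self.low, self.high = min(readings), max(readings)
        self.max_step = max(abs(b - a) for a, b in zip(readings, readings[1:]))
        self.last = readings[-1]

    def check(self, reading):
        """Return True if the reading is consistent with this sensor's DNA."""
        ok = (self.low <= reading <= self.high and
              abs(reading - self.last) <= self.max_step)
        self.last = reading
        return ok
```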
17. Big data analytics applied to anomaly detection in user behavior
The objective of such research is to guard individual users against malicious attacks by studying these users’ cyber activities and tendencies, and subsequently creating a user DNA for individuals. (Big data technologies such as Hadoop are employed to create a scalable environment for analyzing huge volumes of data across clusters.) Processing information such as the user’s “hours of connections, the number of times he/she connects to an address and a predictive model of his/her action” is key to this undertaking.
Ultimately, the goal is to go beyond simply relying on an I.P. address as the sole identifier of a user and to develop a DNA based on cyber usage tendencies. If a significant change in patterns of usage is detected, a warning is triggered. Obviously, the system is designed to allow for some leeway, given that humans are not robots, and we tend to deviate from patterns to some extent.
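The real system runs Hadoop-scale analytics, but the core idea – profile usage features, then warn only when behavior falls well outside the profile, leaving leeway for human variation – can be sketched simply. The features and the leeway factor are illustrative assumptions:

```python
from statistics import mean, stdev

def build_user_dna(sessions):
    """Learn a per-feature (mean, stdev) profile from a user's history.
    sessions: list of dicts such as {"hour": 9, "connections": 10}."""
    features = sessions[0].keys()
    return {f: (mean(s[f] for s in sessions), stdev(s[f] for s in sessions))
            for f in features}

def deviates(dna, session, leeway=3.0):
    """Warn only when a feature falls far outside the profile; the leeway
    factor tolerates the normal variation of human behavior."""
    for f, (mu, sigma) in dna.items():
        if abs(session[f] - mu) > leeway * max(sigma, 1e-9):
            return True
    return False
```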