Computers and the Internet have become essential to businesses and households alike, and our dependence on them grows every day, whether for the home user, for vital areas such as power grid management and medical applications, or for corporate finance systems. Delivering these services reliably and continuously is an increasingly difficult problem for organisations. Cyber security now sits at the top of the list of threats organisations face, with a majority of respondents rating it higher than the risk of terrorist attacks or natural disasters.
Despite all the attention cyber security has received, it has been a difficult journey so far. Global spending on IT security is expected to exceed $120 billion in 2017, and it is one of the few areas where most companies’ IT budgets stayed flat or even increased during the recent financial crisis. That spending, however, has not dramatically reduced the number of vulnerabilities in software or the attacks mounted by criminal organisations.
The US Government has been preparing for a “Cyber Pearl Harbour”, an all-out attack that could paralyse essential services and even cause physical destruction of lives and property, most likely orchestrated by the criminal underworld operating from countries such as China, Russia or North Korea.
The economic cost of cyber crime is estimated at $100 billion annually in the United States alone.
It is time to fundamentally rethink how we go about safeguarding our IT infrastructure. Our current approach to security is siloed, focusing on point solutions for particular threats: anti-virus, anti-spam filters, intrusion detection, firewalls and so on. But we have reached a stage where cyber systems are much more than tin-and-wire and software. They have systemic problems with economic, social and political dimensions. The interconnectedness of systems, intertwined with the human element, makes IT systems inseparable from the human factor. Complex cyber systems today almost have a life of their own; they are complex adaptive systems that we have been trying to understand and tackle using traditional theories.
Before going into the motivations for treating cyber systems as complex systems, here is a brief explanation of what a complex system is. Note that a “system” can be any combination of people, processes or technologies that fulfils a certain purpose: the wristwatch you are wearing, sub-oceanic coral reefs, or the economy of a country are all examples of a “system”.
In a complex system, as Aristotle stated, “the whole is more than the sum of its parts”. One of the most popular examples used in this context is a city traffic system and the emergence of traffic jams: analysing individual cars and their drivers cannot explain the patterns and the formation of congestion.
A Complex Adaptive System (CAS) also exhibits characteristics of self-learning, emergence and evolution among its participating agents. The agents or participants in a CAS exhibit heterogeneous behaviour, and their behaviour and interactions with other agents are continuously changing. A distinction worth making before classifying a system as Complex Adaptive is that between “complex” and “complicated” processes.
A complex process is one whose outcome is unpredictable, however simple the steps might seem. A complicated process, by contrast, is one with many intricate steps and demanding pre-requisites, yet a predictable final result. An often-used example: making tea is complex (at least in my experience, I have never managed to make two cups that taste the same), while making a car is complicated. David Snowden’s Cynefin framework gives a more formal explanation of these concepts.
The field of study is not new; its origins can be traced back to work on the Metaphysics of Aristotle [8]. Complexity theory is largely inspired by biological systems and has been used in social science, epidemiology and natural science research for some time. It has been applied to economic systems and free-market systems, and is gaining acceptance in financial risk analysis as well (refer to my article on complexity in financial risk analysis here [19]). It has not received much attention in the cyber security field so far, but there is growing acceptance of complexity thinking in applied science and computing.
Today’s IT systems are designed and built by us (the human community of IT workers in an organisation plus its suppliers), and collectively we have all the knowledge there is to have about these systems. Why, then, do we see new attacks on IT systems every day that we never anticipated, exploiting weaknesses we never knew existed? One reason is that every IT system is designed by thousands of individuals across the whole technology stack, from the business application down to the underlying network components and hardware. That introduces a strong human element into the design of cyber systems, and opportunities to introduce flaws that become vulnerabilities are everywhere.
Most organisations have multiple layers of defence around their critical systems (layers of firewalls, IDS, hardened operating systems, strong authentication and so on), yet attacks still happen. More often than not, computer break-ins result from a collusion of circumstances rather than a single standalone vulnerability being exploited. In other words, it is the “whole” of the circumstances and the actions of the attackers that cause the damage.
Reductionism and holism are two contrasting philosophical approaches to the analysis and design of any object or system. Reductionists argue that any system can be understood by “reducing” it to its constituent parts and analysing those parts, while holists argue that the whole is greater than the sum of its parts, so a system cannot be understood purely by analysing its components.
Reductionists claim that all systems and machines can be understood by examining their constituent parts. Most of modern science and analysis methods are built on the reductionist approach and, to be fair, they have served us well so far. By understanding what each part does you really can understand what a wristwatch will do; by designing each part separately you really can make a car behave the way you want; and by analysing the positions of celestial objects we can accurately predict the next solar eclipse. Reductionism has a strong focus on causality: every effect has a cause.
But that is the extent to which the reductionist view can explain the behaviour of a system. When it comes to emergent systems such as human behaviour, socio-economic systems, biological systems or socio-cyber systems, the reductionist approach has its limitations. Simple examples, such as the human body, the response of a mob to a political stimulus, the reaction of the financial markets to the news of a merger, or even a traffic jam, cannot be predicted even by studying the behaviour of the individual members of each of these “systems”.
We have traditionally looked at cyber security through the reductionist lens, with specific point solutions for individual problems, trying to anticipate the attacks a cyber-criminal might launch against known vulnerabilities. It is time we started looking at cyber security from a holistic perspective as well.
Computer break-ins are more like viral or bacterial infections than a home or car burglary. A burglar who breaks into a house cannot use that break-in to enter the neighbour’s house as well, nor can the weakness in one car’s lock be exploited against millions of other locks across the globe simultaneously. Computer break-ins are more like microbial infections of the human body: they can propagate the infection the way humans do, they can affect large swathes of a population as long as the systems are “connected” to each other, and in the case of severe infections the affected systems are generally isolated, just as people are placed in quarantine to stop the spread [9]. Even the lexicon of cyber systems uses biological metaphors: viruses, worms, infections and so on. There are many parallels with epidemiology, but the design principles followed in cyber systems are not aligned with natural selection. Cyber systems rely heavily on uniformity of processes and technology, in contrast to the diversity of genes within a species that makes the species more resilient to epidemic attacks.
The flu pandemic of 1918 killed more than 50 million people, more than the Great War itself. Almost all of humanity was infected, but why did it hit 20-40 year olds harder than other age groups? Perhaps a difference in body structure caused a different reaction to the attack?
Complexity theory has gained great traction and proven extremely useful in epidemiology, in understanding the patterns by which infections spread, and in devising strategies for controlling them. Researchers are now turning to applying those lessons to cyber systems.
Traditionally, two distinct and complementary approaches have been used to mitigate security threats to cyber systems, and they are embedded in most modern systems in use today.
The first, formal validation and testing, relies mostly on the testing teams of an IT system to discover any faults in the system that could expose a vulnerability and be exploited by attackers. This could be functional testing to validate that the system gives the correct answers, penetration testing to validate its resilience to specific attacks, or availability and resilience testing. The scope of this testing is generally the system itself, not the defences around it.
This is a useful approach for reasonably simple, self-contained systems where the possible user journeys are relatively straightforward. For most interconnected systems, however, formal validation alone is not sufficient, because it is never possible to “test everything”.
Test automation is a popular way of reducing the dependence on humans in the validation process but, as Turing’s Halting problem of undecidability[*] proves, it is impossible to build a machine that tests another one in the general case. Testing only provides evidence that a system behaves correctly in the specific scenarios it has been tested for, and automation simply helps gather that evidence faster.
The second approach is to add layers of defence around systems that cannot be fully validated by formal testing, in the form of firewalls, network segregation, or encapsulation inside virtual machines hidden from the rest of the network. Other common defence mechanisms include intrusion prevention systems, anti-virus software and the like.
This layered approach is widely used by organisations as a defence against unknown attacks, since it is practically impossible to guarantee that a piece of software is free of security flaws and will remain so.
Approaches drawing on complexity science could prove quite useful as a complement to the more conventional methods. The sheer complexity of modern computer systems makes them unpredictable, or capable of emergent behaviour that cannot be foreseen without “running the system” [11]. Moreover, running it in a test lab is not the same as running the real system in its intended environment, because it is the collision of multiple events that triggers the emergent behaviour (recalling the notion of holism!).
Resilience to perturbation is an important emergent property in biological systems. Imagine a species in which every organism had exactly the same genetic structure, body composition, immune system and antibodies: the outbreak of a single viral disease could have wiped out the entire population. That does not happen precisely because we are all formed differently, and each of us differs in our resistance to disease.
Similarly, some mission-critical cyber systems, especially in the aerospace and medical industries, implement “diverse implementations” of the same functionality, with a centralised “voting” function deciding the response to the requester if the results of the diverse implementations do not match.
It is fairly common to have redundant copies of mission-critical systems in organisations, but these copies are homogeneous rather than diverse, which leaves them just as susceptible to the same faults and vulnerabilities as the primary systems. If the implementation of a redundant system is made different from the primary, for instance by using a different operating system, a different version of the database or a different application container, the two variants will differ in their resilience to certain attacks. Even a change in the sequence of memory stack access could change how the variants respond to a buffer-overflow attack, alerting the central voting system that something is wrong somewhere. As long as the input data and the business function of the implementations are the same, any deviation in their responses can be an indicator of a potential attack. If a true service-based architecture is used, every “service” could have multiple (but a small number of) diverse implementations, and the overall business function could randomly select which implementation of a service it uses for each new user request. A large number of distinct execution paths can be created this way, increasing the resilience of the whole system.
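As a rough illustration of this idea, the Python sketch below uses three hypothetical, independently written implementations of the same business rule. A request can either be routed to a randomly chosen implementation, or all of them can be run with their answers compared, any disagreement being flagged as a possible fault or attack. The names and the rule itself are illustrative assumptions, not a real system.

```python
import random
from collections import Counter

# Three hypothetical, independently built implementations of the same
# business rule (here: "is this payment amount acceptable?").
def impl_a(amount: float) -> bool:
    return 0 < amount <= 10_000

def impl_b(amount: float) -> bool:
    return amount > 0 and amount <= 10_000

def impl_c(amount: float) -> bool:
    return not (amount <= 0 or amount > 10_000)

IMPLEMENTATIONS = [impl_a, impl_b, impl_c]

def handle_request(amount: float) -> bool:
    """Route each request to a randomly chosen diverse implementation,
    so an attacker cannot rely on a single, predictable execution path."""
    chosen = random.choice(IMPLEMENTATIONS)
    return chosen(amount)

def voted_response(amount: float) -> bool:
    """Run all diverse implementations and return the majority answer.
    Any disagreement is flagged as a potential fault or attack."""
    results = [impl(amount) for impl in IMPLEMENTATIONS]
    answer, votes = Counter(results).most_common(1)[0]
    if votes < len(IMPLEMENTATIONS):
        print(f"WARNING: implementations disagree for input {amount}: {results}")
    return answer

if __name__ == "__main__":
    print(handle_request(250.0))   # one randomly chosen implementation
    print(voted_response(250.0))   # majority vote across all of them
```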
Multi-Variant Execution Environments (MVEEs) are being developed, in which applications with slight variations in their implementation run in lockstep and their responses to each request are compared [12]. These have proven quite effective at detecting intrusions that try to alter the behaviour of the application, and even at identifying existing flaws where the variants respond differently to a request.
In a similar vein to N-version programming, an N-version antivirus was developed at the University of Michigan, using diverse implementations that scanned new files for matches against virus signatures. The result was a more resilient anti-virus system, less prone to attacks on itself and 35% better at detection across the estate.
One of the major areas of study in complexity research is Agent Based Modelling, a simulation modelling technique.
Agent Based Modelling is a simulation modelling technique used to understand and analyse the behaviour of complex systems, specifically complex adaptive systems. The individuals or groups interacting with each other in the complex system are represented by artificial “agents” that act according to a predefined set of rules. The agents can evolve their behaviour and adapt to changing circumstances. Contrary to deductive reasoning, which has been most commonly used to explain the behaviour of social and economic systems, simulation does not try to generalise the system and the behaviour of its agents.
ABMs have become quite popular for studying things such as crowd management during fire evacuations, the spread of epidemics, understanding market behaviour and, more recently, risk analysis in financial markets. It is a bottom-up modelling technique in which the behaviour of each agent is programmed separately and can differ from that of all the other agents. The evolutionary and self-learning behaviour of agents can be implemented using various techniques, with genetic algorithm implementations being among the most popular.
Cyber systems are interconnections of software modules, the wiring of logic circuits and microchips, the Internet, and a number of users (system consumers or end users). These agents and their interactions can be implemented in a simulation model to do what-if analysis and to predict the impact of changing parameters and interactions between the agents of the model. Simulation models have long been used to analyse the performance characteristics of applications based on application characteristics and user behaviour, and some of the popular capacity and performance management tools use this technique. Similar techniques can be applied to analyse the response of cyber systems to threats, to design fault-tolerant architectures, and to analyse the extent of emergent resilience due to diversity of implementation.
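As a minimal agent-based sketch of this idea (every host, stack label and parameter below is a made-up assumption for illustration), the simulation models a piece of malware that can only exploit one particular technology stack, and compares how far it spreads in a monoculture versus a diverse estate:

```python
import random

class Host:
    """A simple agent: a host with a given technology stack and an infection state."""
    def __init__(self, host_id: int, stack: str):
        self.host_id = host_id
        self.stack = stack        # e.g. "stack_a", "stack_b", "stack_c"
        self.infected = False

def run_simulation(num_hosts: int = 100, diverse: bool = True,
                   steps: int = 20, seed: int = 42) -> int:
    rng = random.Random(seed)
    stacks = ["stack_a", "stack_b", "stack_c"] if diverse else ["stack_a"]
    hosts = [Host(i, rng.choice(stacks)) for i in range(num_hosts)]

    vulnerable_stack = "stack_a"   # the (hypothetical) exploit only works here
    hosts[0].infected = True       # patient zero

    for _ in range(steps):
        currently_infected = [h for h in hosts if h.infected]
        for attacker in currently_infected:
            target = rng.choice(hosts)
            # Infection only succeeds if the target runs the vulnerable stack.
            if target.stack == vulnerable_stack:
                target.infected = True

    return sum(h.infected for h in hosts)

if __name__ == "__main__":
    print("Infected hosts (monoculture):", run_simulation(diverse=False))
    print("Infected hosts (diverse):    ", run_simulation(diverse=True))
```

Even this toy model exhibits the emergent effect discussed above: the monoculture estate is overrun, while the diverse estate limits the spread to the fraction of hosts running the vulnerable stack.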
One of the areas of interest in agent based modelling is the “self-learning” of agents. In the real world, an attacker’s behaviour evolves with experience. This aspect of an agent’s behaviour is implemented through a learning process for agents, and genetic algorithms are among the most widely used techniques for it. Genetic algorithms have been used in automobile and aeronautics design, in optimising the performance of Formula One cars [17], and in simulating investor learning behaviour in simulated stock markets (implemented using agent based models).
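To make the mechanism concrete, here is a minimal, self-contained genetic-algorithm sketch (the bit-string encoding and the toy fitness function are illustrative assumptions, not a model of any real attacker): a population of candidate “strategies” evolves through selection, crossover and mutation towards whatever the fitness function rewards.

```python
import random

GENOME_LENGTH = 16     # each strategy is a 16-bit string (purely illustrative)
POPULATION = 30
GENERATIONS = 50
MUTATION_RATE = 0.02

def fitness(genome):
    """Toy fitness: reward genomes with many 1s (a stand-in for a real
    objective such as 'probability of evading detection' in a simulation)."""
    return sum(genome)

def crossover(parent_a, parent_b):
    # Single-point crossover: splice the two parents at a random position.
    point = random.randint(1, GENOME_LENGTH - 1)
    return parent_a[:point] + parent_b[point:]

def mutate(genome):
    # Flip each bit with a small probability.
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

def evolve():
    population = [[random.randint(0, 1) for _ in range(GENOME_LENGTH)]
                  for _ in range(POPULATION)]
    for _ in range(GENERATIONS):
        # Rank by fitness and keep the better half as parents.
        population.sort(key=fitness, reverse=True)
        parents = population[:POPULATION // 2]
        children = []
        while len(children) < POPULATION - len(parents):
            a, b = random.sample(parents, 2)
            children.append(mutate(crossover(a, b)))
        population = parents + children
    return max(population, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("Best strategy:", best, "fitness:", fitness(best))
```

In an agent based model, a loop like this would sit inside each attacker agent, so that its strategy improves over successive interactions with the simulated environment.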
Complexity science in cyber systems, in particular the use of agent based modelling to assess the emergent behaviour of systems, is a relatively young field of study with very little research done on it so far. There is still some way to go before agent based modelling becomes a commercially viable proposition for organisations. But given the growing focus on cyber security and the inadequacies of our current approach, complexity science is certainly an avenue that practitioners and academia are paying increasing attention to.