What are Cyber-Physical Systems?

Cyber-physical systems (CPS) are technical systems of networked computers, robots and artificial intelligence that interact with the physical world, for example intelligent robotic systems linked with the Internet of Things.

The project 'Ethical aspects of CPS' aims to provide insights into the potential ethical concerns and related unintended impacts of the possible evolution of CPS technology by 2050. The overarching purpose is to support the European Parliament, the parliamentary bodies, and the individual Members in their anticipation of possible future concerns regarding developments in CPS, robotics and artificial intelligence.

The Scientific Foresight study was conducted in three phases:

1. A 'technical horizon scan', in the form of briefing papers describing the technical trends and their possible societal, ethical, economic, environmental, political/legal and demographic impacts across seven application domains.

2. The 'soft impact and scenario phase', which analysed the soft impacts of CPS on the basis of the technical horizon scan, identifying possible future public concerns through an envisioning exercise and exploratory scenarios.

3. The 'legal backcasting' phase, which resulted in a briefing for the European Parliament identifying the legal instruments that may need to be modified or reviewed, including — where appropriate — areas identified for anticipatory parliamentary work, in accordance with the conclusions reached within the project.

The outcome of the study is a policy briefing for MEPs describing legal instruments to anticipate impacts of future developments in the area of cyber-physical systems, such as intelligent robotics systems, linked with the Internet of Things.

It is important to note that not all impacts of CPS are easily translated into legislation, as it is often contested whether they are in effect harmful, who is to be held accountable, and to what extent these impacts constitute a public rather than a private concern.

Executive summary

Cyber-physical systems (CPS) are defined as technical systems of networked computers, robots and artificial intelligence that interact with the physical world. The aim of the project 'Ethical aspects of CPS' was to
(i) examine future development paths of CPS technology up to the year 2050;
(ii) highlight potential unintended impacts and ethical concerns; and
(iii) support the European Parliament, the parliamentary committees and other parliamentary bodies, as well as the individual Members, in their anticipation of possible future concerns regarding developments in CPS, robotics and artificial intelligence.

Context

The study was launched by the STOA Panel at the request of the European Parliament's Committee on Legal Affairs (the JURI Committee) to provide evidence for its Working Group on legal questions related to the development of robotics and artificial intelligence. This evidence should feed into Members' reflection on the need for civil law rules by providing specific information, enabling an exchange of views with experts from many fields of academic expertise, and allowing Members to conduct an in-depth examination of the challenges and prospects at stake. The input gathered by the Working Group will form the basis for an INI report and possible future legislative activities. The INI report will also be discussed by other Committees before being voted upon in plenary. The present STOA study will lead to a final policy briefing paper which aims to support these parliamentary bodies by providing an analysis of the legal instruments available for dealing proactively with possible future concerns regarding developments in CPS, robotics and artificial intelligence.

Methodology

The Scientific Foresight study was conducted in three phases:
1. a 'technical horizon scan', in which briefing papers described the key technical developments, including short- and long-term trends with a reflection upon their societal, ethical and other impacts;
2. a 'soft impacts and scenario development phase', which analysed soft impacts of CPS to highlight possible public concerns. Two workshops were organised to identify these soft impacts, to develop a set of possible future scenarios, and to identify areas of possible public or ethical concern;
3. a 'legal backcasting' phase, which identified the legal instruments that may need to be modified or reviewed and, where appropriate, areas where anticipative parliamentary work may be required. In this phase, the outcomes from the previous steps were transformed into a forward-looking strategy to support the legislative activities of the European Parliament, the parliamentary committees and the Members of the European Parliament.

Process summary

Step 1: Request for the study: 'Ethics of Cyber-Physical Systems', for the JURI Committee

Step 2: Technical Horizon Scanning, unravelling the complexity of CPS in seven areas;

Step 3: Envisioning phase ('Soft impact' phase), identifying possible future impacts of CPS;

Step 4: Scenario Phase, resulting in areas of societal concern raised by CPS;

Step 5: Legal backcasting, identifying legal instruments that may need to be reviewed or modified and, where appropriate, areas where anticipatory parliamentary work may be required in response to the future concerns identified;

Step 6: Sense-making phase in which the outcomes are transformed into briefings supporting the Members of the European Parliament in their anticipation of possible future concerns regarding developments in CPS, robotics and artificial intelligence.


Lay summary

The present Scientific Foresight study on 'Ethics of Cyber-Physical Systems' was conducted for the European Parliament's STOA Panel (Science and Technology Options Assessment Panel).

What are cyber-physical systems?

Cyber-physical systems (CPS) are technical systems in which networked computers and robots interact with the physical world. By 2050, these systems may interact with us in many domains, driving on our roads, moving alongside us in our daily lives and working within our industries. Due to the wide range of situations where we will be interacting with CPS, understanding the impacts of these systems is essential.

Expected benefits and core promises

The integration of CPS into society promises many benefits, including increasing the efficiency and sustainability of many of our current practices, and creating new markets and growth.

These promises include:

• automated cars that enhance traffic flow, reduce pollution and allow drivers to work or relax while in transit;

• mass-customisation of products that closely match consumers' preferences and reduce waste during production;

• telecare alarm systems and CPS treatment tools that help to care for sick and elderly people while enabling them to live with more independence;

• smart technological aids for disabled citizens that enable them to become more active members of society;

• CPS in agriculture that reduce the need for pesticides, prevent food waste and optimise food production, all while reducing water and energy footprints;

• drones and search-and-rescue robots that perform missions in hazardous environments, thereby reducing the risk for the operating personnel.

Unintended impacts and policy implications

While the many potential benefits of CPS raise high expectations, past experience has taught us that the effects of newly introduced technologies can never be completely predicted. There are always unintended effects, some of which are good, some bad, and others that are never truly realised.

One such unintended consequence may be that 3D printing changes consumer habits: goods become so easy to produce that we start producing more and become less attached to them. As a result, we may become more inclined to discard goods and thus generate more waste.

We need to think ahead and avoid such possible unintended consequences while ensuring that these technologies can benefit everyone.

Employment and delegation of tasks

As we delegate more tasks to CPS, old jobs will be lost while new ones are created, such as repairing robots and mediating between robots and humans. In these circumstances, will humans leave routine decisions to robots in order to remain focused on tasks that demand creative thinking and decision-making? If so, will we be able to integrate our knowledge with data from these systems? Is it desirable to delegate meaningful tasks to robots when robots can do these tasks better than, or at least as well as, humans? Consider, for example, taking care of our loved ones: if robots perform these tasks, would we lose a certain degree of meaning in our lives? Do we want to live our lives without experiencing the satisfaction that comes from unconditionally helping others? These are the sort of possible effects we need to keep in mind.

Safety, responsibility and liability

Safety aspects, i.e., finding ways for robots and humans to work together without accidents, should be one of our primary concerns. This is especially important as robots increasingly operate in close proximity to humans.

CPS are large and complex, intelligent and self-learning. Who should be held responsible when such a system fails? Finding the initial cause and attributing liability will prove very difficult, so who can we hold accountable should these systems malfunction?

In healthcare, is it the doctor, caregiver or patient who is responsible for failure? Or is it the developer or producer of the CPS?

As factories, energy grids and transport systems become digital networks, how can we prevent outsiders from hacking and infiltrating these systems for nefarious purposes?

Privacy concerns

CPS require vast amounts of data to operate effectively, and this raises several privacy questions. For instance, in order to optimise energy usage, smart home systems might keep track of the times residents are away, information that is also valuable to burglars. Will robots spy on the working habits of their human co-workers, perhaps even manipulating them to work harder? Should the code of conduct on medical professional secrecy be reviewed, given that the health data stored on connected parts of medico-technical systems can also be accessed by third parties?

Collecting data on a person's lifestyle and physical parameters can certainly improve their health, but should we also discuss how to prevent others from taking advantage of the data shared on medico-technical systems?

Social relations

CPS will influence our relations with machines, and might even lead to new controversies: Should robots acquire some form of moral sense if they are to interact with us in our comfort zone? And as we humanise the robot, how will this affect our self-understanding? And what if robots one day become emotionally — or even affectionately — involved with humans, how should this be managed?

Also, with CPS we can build smart prostheses for the disabled. At what point will they turn into cyborgs, possibly even exceeding human abilities? And how will we define 'disabled' or 'able-bodied' in the future?

Conclusion

Exploring the future effects of CPS shows that they could have considerable impacts on various areas of our personal and professional lives. The deployment of interconnected autonomous working machines in complicated data environments touches on a number of legal areas, such as responsibility, liability, data ownership and privacy. Designing CPS for operation in proximity to humans means that current safety regulations need to be updated to ensure that individuals are not harmed and that the desired benefits outweigh the potential unintended consequences.


Conclusions

The overarching purpose of this study is to support the European Parliament's bodies, as well as the individual Members, in strengthening their anticipatory knowledge and developing insights into the dynamics of change and future long-term challenges and options, in view of the rapid developments in the field of robotics. The main outcome of this foresight study is a policy briefing aimed at translating the identified technical trends in the area of robotics, as well as the respective impacts and concerns, into legal and regulatory terms.

During the first phase of this study, a 'technical horizon scan' was conducted in seven different domains of possible cyber-physical system (CPS) application, covering short- and longer-term trends and their societal impacts. These domains are:

1. Disabled people and daily life

2. Healthcare

3. Agriculture and food supply

4. Manufacturing

5. Energy and critical infrastructure

6. Logistics and transport

7. Security and safety

For each domain, the key technical developments and the short- and long-term trends were highlighted, together with reflections on the most important social, technological, environmental, economic, political, ethical and demographic impacts identified.

The development and implementation of CPS for the disabled and the elderly may lead to higher risks to data protection and privacy, and to a shift in the focus of medicine from treatment to prevention. This shift may help relieve the burden on medical professionals, allowing more time to focus on patient care and lowering the cost of medicine.

CPS may create major changes in the healthcare sector. These changes will trigger discussions on patient privacy, including medical professional secrecy, data ownership and patient acceptance of CPS, and on civil liability in cases of medical shortcomings.

In the area of farming and food, CPS may result in greater food safety and hygiene. Through a combination of the Internet of Things, autonomous robots and sensors, we may experience an improvement in working conditions in the agricultural sector, and achieve optimised harvests and increased production. These changes will require discussions, for instance, on liability issues and the terms of the relationship between farmers and machines.

As CPS continue to be developed and deployed in manufacturing, with smart factories, we will see a radical change in the way manufacturing occurs, primarily through new business models and the customisation of products. These changes will have profound implications for the economy, and legal action concerning data ownership, privacy, certification and safety will therefore be needed.

CPS will become a key component of future energy systems and of the critical infrastructure underpinning the energy grid. However, it is important to discuss liability, data collection and the ownership of that data to ensure that the rollout of new energy systems delivers the intended benefits while mitigating, and potentially eliminating, the negative side effects.

CPS are already changing the transport and logistics sectors. In the future, these changes will profoundly affect the way we move both goods and people, with major impacts on safety, emissions and the mobility of older citizens. To accommodate the implementation of CPS, discussion will be needed on appropriate regulatory policy actions and their timing, the standardisation of laws, liability and privacy.

CPS will not generally make the world a more or less safe or secure place. They may, however, create a more complex world in which we will need to improve our ability to predict and understand the machines and their effect on security and safety. We need to ensure that those coming into contact with these new technologies – whether bystanders or operators – are able to understand the risks to their safety and security. These changes will require discussions on liability, data protection and the impact CPS will have on employment.

The next step of the study was based on and inspired by the outcomes of the technical horizon scan. Within this frame, a series of potential soft impacts of CPS were taken into consideration along with some publicly expressed concerns and fears. Two workshops were organised to identify these soft impacts, to develop a set of possible future scenarios, and to identify areas of possible public or ethical concern. A list of possible societal impacts and concerns related to a future in which CPS would be integrated in society was the main outcome of this phase.

Examples of these outcomes cover both rather obvious and less likely robotics-related futures. Some of the examples below illustrate how entrenched meanings may be challenged by technological developments in the domain of CPS, raising questions about how to act in view of the following challenges and developments:

• Co-living and co-working between humans and robots, and the possibility of intelligent robots emerging from developments in artificial intelligence.

• The incorporation of smart technologies, which raises the question of where to draw the borderline between assistive technologies and human enhancement: will human enhancement become widespread? Will we understand the concept of 'disability' differently tomorrow? In this area, we are also confronted with issues such as self-determination and physical integrity.

• Normative conceptions of 'nature' will be constantly challenged, to an even greater extent than is the case today. For instance, do we perceive precision farming using CPS as conflicting with the concept of 'nature'? Opinions already differ on whether organic farming is compatible with the use of agricultural robots.

• CPS in farming may allow farmers to work further from the land they are farming. The image of farmers might change drastically. Will CPS in farming make farming more or less attractive to young farmers?

• Currently, security as a value is most often applied to questions regarding physical health and safety, and to a lesser extent to employment and finances. With CPS and the Internet of Things, data streams become ubiquitous, making all domains of life possible candidates for security risks. As a possible consequence, it is foreseen that debates will occur about how much security it is reasonable to expect or claim in any given domain, and what is seen as (ir)responsible behaviour.

• Where the border is drawn between public and private is of the utmost relevance for policy-making. However, CPS technology is bound to bring these borders constantly into question, because it shifts responsibility from the collective to the individual and back again. For example: is human enhancement a private or public responsibility?

There are a number of examples where CPS may invite novel behaviour more directly, for instance:

• If technologies become easier to use, more people will use them. One example of this mechanism is drones. If you are the only one to have them, that gives you a competitive edge. However, other parties are bound to acquire the technology too, and then what seemed 'smart for one' may become 'dumb for all'.

• If things or services get cheaper, more people buy and use them. If we make energy cheaper through the use of CPS, this may well lead to a net increase in energy use, which could, for example, exacerbate global warming. Or, more in the domain of soft impacts: might scarcity be a good thing? If energy is cheap and clean, do we still need to be conscientious about using it? Are there other positive practices that we have developed in the light of scarcity?

For this analysis of possible future impacts and concerns, the study made use of four exploratory scenarios in which the possible futures identified earlier were considered. These scenarios were developed after the envisioning meeting, during which a working group of technical experts, social scientists and stakeholders brainstormed about the possible future impacts of CPS. They covered:

• CPS and health and disability

• CPS and farming

• CPS and manufacturing and security

• CPS and energy and security

Each of these scenarios is an imagined account of a future in which CPS have developed and matured in various aspects of our lives. Based upon these exploratory scenarios, the study identified future concerns regarding CPS that can be considered for anticipatory action by European policymakers.

While the scenarios are speculative and ultimately fictional, it is important to note that they are systematically based upon concrete research conducted by top experts in the field, including technical trend analysis, horizon scanning and expert workshops. As such, every detail of the scenarios presented — from the habits, hopes, fears and values of the characters to the social, legal and ethical tensions evident in their lives — is based upon substantial research and analysis.

The scenarios do not aim to predict the future, but to highlight how technology development might affect society and the public and private lives of EU citizens. These imagined scenarios (published in Annex 2 to this report) are meant to provide an accessible means for the reader to understand the social, ethical and legal tensions that were identified in the research process. They were designed to support committees and individual MEPs in exploring, anticipating and responding to potential CPS development paths and their associated impacts, and to aid reflection on anticipatory policy and agenda setting at the European Parliament.

The final step of the foresight process, the 'legal backcasting' phase, was performed entirely in-house and aimed at translating the findings of the foresight phase into legal terms so as to pave the way for possible parliamentary reflection and work. During this phase, the outcomes of the previous steps were translated into a forward-looking legal instrument for the European Parliament, the parliamentary committees and the Members of the European Parliament.

The analysis consisted of the following phases:

1. Identification and analysis of areas of possible future concern regarding CPS that may trigger EU legal interest

2. Identification of those relevant EP committees and intergroups that may have a stake or interest in these areas

3. Identification of those legal instruments that may need to be reviewed, modified or further specified

4. Identification of possible horizontal issues of a legal nature (not committee-specific, wider questions to think about)

The legal backcasting covered the following areas:

• Transport

• Trade (dual-use / misuse)

• Data protection

• Safety (including risk assessment, etc.)

• Health (clinical trials/medical devices/E-health devices)

• Energy and environment

• Horizontal legal issues (cross-committee considerations).

The analysis looked at all stages of contact between robots, AI and humans. In this process, special emphasis was given to human safety, privacy, integrity, dignity, autonomy, data ownership and the need to provide a clear and predictable legal framework of an anticipatory nature. Special attention was given to the legal framework for data protection, owing to the expected massive flow of data arising from the use of robotics and AI. Moreover, consumer concerns over safety and security in the use of robots and AI were discussed. The analysis also shed light on legal concerns arising during the testing and development of robots, including the risks associated with the terms of interaction with robots, given their potential to profoundly affect physical and moral relations in our society.

Beyond identifying the main areas of potential legal concern, the associated challenges and the respective pieces of EU legislation that may need to be reviewed or considered, the analysis leads to several rather conceptual conclusions of a structural nature. Firstly, every attempt to conceive of and tackle the legal challenges associated with such a multifaceted technology needs to be designed in a reflective manner, so that individual adjustments can be made on a case-by-case basis. Moreover, special emphasis should be placed on the need for a clear definition of CPS, and more specifically of smart autonomous robots, for reasons of legal certainty, at least at EU level. Such a definition should be open to future modification in the form of delegated acts.

Beyond the identified points of legal reflection, a risk analysis strategy should be devised in order to provide a plausible instrument of regulatory importance that will have a horizontal and technology-driven perspective.

Last but not least, any attempt to regulate emerging technology of this kind should be accompanied by ethical standards and procedures that address the needs and ethical dilemmas of researchers, practitioners, users and designers alike. Such an ethical framework does not need to take a legally binding form, but would be better established as an EU code of conduct. Finally, it should be emphasised that not all the concerns identified in the previous steps can be 'translated' into legislative terms. Following such an extensive backcasting analysis, and looking simultaneously at the rapid pace of technological trends and developments, the regulatory and protective limits of the law become rather evident.

In the light of some of these foresight scenarios, laws appear to be significantly inept at fulfilling their protective or even precautionary function. When carrying out these forward-looking technological reflections, the fragility of traditional legal instruments and the limits of law and legal optimism become rather clear. At the same time, however, multidisciplinary exercises of this kind can facilitate the technological embodiment of law and help to shape a pluralist conception of law and technology.

To illustrate this, below we list some examples of concerns which, given the current EU legislative acquis and the particular status of EU competences but also the nature of some of these challenges, cannot fall within the scope of law in general or EU law in particular.

• The affordability of CPS services

• The possible digital divide between those using CPS and those not doing so

• The terms of the interface between the authority of the doctor, the patient and the authority of AI

• Avoiding data concentration

• The shortage of skills required for working with robots (e.g. as a person with a disability, as a user of an autonomous vehicle or as a farmer)

• Empathy with robots

• Control of super smart, quick, strong cyborgs
