The ARCADIAN-IoT project, funded under the same call as SPATIAL (H2020-SU-DS-2018-2019-2020, Digital Security), involves using different types of technologies, particularly Internet of Things (IoT) devices, to collect and process different categories of personal data. This raises numerous data protection issues and potential challenges that must be assessed and addressed.

Using such technological components (i.e., blockchain, AI, biometric technologies, IoT medical devices, and drones) may raise data protection issues. This blog post aims to provide general considerations about IoT devices and their integration with blockchain and AI technologies, followed by issue-specific information about the different IoT technologies used within the ARCADIAN-IoT project (i.e., medical IoT devices and drones).

IoT and data protection challenges

The concept of IoT refers to infrastructures designed to record, process, store and transfer an extensive amount of data while interacting with other devices. Thus, IoT devices constantly collect vast amounts of data – such as location and health data – that relate to identified or identifiable natural persons and, therefore, qualify as personal data under Article 4, paragraph 1, no. 1 of Regulation (EU) 2016/679 (the so-called “GDPR”).

The processing of personal data in this context also involves the coordinated intervention of the different partners developing IoT technologies to provide functionalities or interfaces.

In light of the above, the development of IoT raises significant privacy issues and challenges, already identified by the Article 29 Data Protection Working Party (“WP29”, now the European Data Protection Board, “EDPB”) in its Opinion 8/2014 on the “Recent Developments on the Internet of Things”. In particular, according to the Opinion, the following major risk categories can be identified:

  1. Lack of control and information asymmetry: as a result of the need to provide pervasive services, users may be subject to third-party monitoring and may lose all control over the dissemination of their data, depending on the level of transparency the data controller provides about the processing.
  2. Quality of users’ consent: the information asymmetry mentioned above constitutes a significant barrier to demonstrating valid (i.e., informed and freely given) consent under Article 7, GDPR. Moreover, classical mechanisms used to obtain an individual’s consent may be difficult to apply in IoT, resulting, according to the WP29, in a “low-quality” consent, given without adequate information or in the factual impossibility of providing fine-tuned consent in line with the actual preferences of data subjects.
  3. Intrusive inference of behaviour patterns and profiling: even though different objects separately collect isolated pieces of data, further analysis can reveal specific aspects of data subjects’ lives, such as habits, behaviours and preferences.
  4. Inferences derived from data and repurposing of original processing: apparently insignificant data originally collected through an IoT device (e.g., an accelerometer or gyroscope) and, more generally, the sheer amount of data generated by the IoT, may lend themselves to secondary uses.

In addition to the above-mentioned concerns, the heterogeneous nature of the IoT and its critical uses make cybersecurity an essential aspect, which is even more important when considering that every IoT developer deals with supply chain activities aimed at transforming raw materials and components into a finished IoT product. Therefore, it is essential to identify and implement privacy by design and by default measures.

Blockchain in IoT

In brief, a blockchain is a decentralised system (i.e., there is no master computer managing the entire chain) that keeps a record of an ever-growing set of data. Blockchain technologies are deemed immutable and secure because the database can only be extended and previous records cannot be changed.

IoT presents security concerns that the blockchain can address. In particular, using a private permissioned blockchain system (i.e., a special-purpose blockchain implementation that only works within a given system) provides immutable auditability and traceability properties to the data under management.
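The immutable auditability described above rests on a simple data structure: each new record is hashed together with the hash of its predecessor, so that altering any past record invalidates every later hash. The following toy Python sketch (not the ARCADIAN-IoT implementation; the `AuditChain` class and its records are hypothetical) illustrates this tamper-evident chaining:

```python
import hashlib
import json

def block_hash(record: dict, prev_hash: str) -> str:
    """Hash a record together with the previous block's hash."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

class AuditChain:
    """Toy append-only log: each entry is chained to its predecessor."""

    def __init__(self):
        self.blocks = []  # list of (record, hash) pairs

    def append(self, record: dict) -> None:
        prev = self.blocks[-1][1] if self.blocks else "0" * 64
        self.blocks.append((record, block_hash(record, prev)))

    def verify(self) -> bool:
        """Recompute every hash; any change to past records breaks the chain."""
        prev = "0" * 64
        for record, h in self.blocks:
            if block_hash(record, prev) != h:
                return False
            prev = h
        return True

chain = AuditChain()
chain.append({"event": "device_registered", "device": "sensor-01"})
chain.append({"event": "data_access", "device": "sensor-01"})
assert chain.verify()
chain.blocks[0][0]["device"] = "sensor-99"  # tamper with history
assert not chain.verify()
```

The same property that makes the chain auditable is what creates the GDPR tension discussed next: a record cannot be modified or removed without breaking the chain.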

However, while securing the processing of users’ personal data, the use of blockchain raises risks concerning GDPR requirements. In particular, the “immutability” of the data, inherent in the very nature of the blockchain, constitutes a critical point of tension between this technology and the provisions of the GDPR, including, inter alia:

  1. The purpose limitation and data minimisation principles under Article 5 of the GDPR, according to which personal data must only be collected for specified, explicit and legitimate purposes and shall be adequate, relevant and limited to what is necessary in relation to those purposes. In the case of blockchain technology, the problem arises because, once added to the database, the data will continue to be processed indefinitely;
  2. The possibility of exercising rights that the GDPR grants to data subjects and which should always be exercisable by them, including the right to rectification (Article 16, GDPR) and the right to erasure (so-called “right to be forgotten” under Article 17, GDPR).

Using AI technologies in IoT devices

Artificial Intelligence (AI) is a set of technologies that combines data, algorithms and computing power. As pointed out by the EU Commission in its “White Paper on Artificial Intelligence”, AI is rapidly developing and will transform society and the way people act in it, improving, for example, health care and increasing the safety of citizens. In this regard, the European Parliament has pointed out that the increasing use of AI systems also entails risks, including threats to fundamental rights such as:

  1. The risk that a bias (unconsciously set by the programmers) negatively influences machine learning and then affects the AI results (e.g., the AI could “make decisions” influenced by ethnicity, gender, age, etc.);
  2. The opacity of the algorithms: the steps through which the data are interpreted are not always explainable (transparent);
  3. Privacy and sharing of data, given that the AI feeds on data which is indispensable for the training of the machine;
  4. Consent and autonomy: the data subject must be adequately informed of the technology and of the developments it may have. Moreover, the comprehensibility of what is being communicated must be guaranteed.

Back in 2018, the European Commission set out its vision of ethical, safe and state-of-the-art AI “made in Europe”. To support the implementation of this vision, the Commission has set up a High-Level Expert Group on Artificial Intelligence, which has developed the “Ethics Guidelines for Trustworthy Artificial Intelligence”, to promote trustworthy AI. Starting from a fundamental rights-based approach, the Group identifies ethical principles and values that must be respected in the development, deployment and use of AI systems.

In particular, the Group provides key indications (such as paying special attention to situations involving vulnerable subjects, taking appropriate measures to mitigate risks, etc.), as well as indications on how to achieve trustworthy AI by listing seven requirements that AI systems should meet, namely:

  1. human agency and oversight;
  2. technical robustness and safety;
  3. privacy and data governance;
  4. transparency;
  5. diversity, non-discrimination and fairness;
  6. societal and environmental well-being;
  7. accountability.

Finally, it should be noted that, in the context of the European Strategy for AI, the European Commission published, on 21 April 2021, a proposal for a Regulation on the European approach to AI, proposing a European legal framework on AI.

IoT medical devices and the processing of health data

The IoT nowadays covers everyday tools that have become smart through technological evolution, incorporating smart sensors that collect a wide variety of information and transmit it to the network. In medicine, this trend – so-called “digital health” – refers to sensors that detect real-time information from the human body (heartbeat, temperature, movement, etc.).

On the one hand, digital health makes healthcare better, safer and more efficient, enabling new ways of managing individuals’ health. On the other hand, the continuous monitoring of patients’ conditions entails many risks to their rights and freedoms (e.g., the consequences of incorrect monitoring due to the collection of inaccurate, incomplete, ambiguous or contradictory data). The need to address such risks is even more critical when considering that the use of medical IoT systems involves the processing of health-related data, which therefore falls into the special categories of personal data under Article 9 of the GDPR.

Drones and facial recognition mechanisms

From a strictly data protection perspective, drone operations can be classified into two main categories: operations whose purpose involves the processing of personal data, and operations whose purpose does not.

With specific reference to the first type of operation, it must be noted that drones are often equipped with cameras or video cameras and may record images, which can then be processed by software with further applications (including high-power zoom, facial recognition, behaviour profiling, movement detection, night vision, GPS systems processing the location of the persons filmed, etc.). This implies the collection, recording, organisation, storage, use and combination of data allowing the identification of persons.

It must be noted that regulations for the use of airspace apply in parallel with personal data protection rules. In particular, Regulations (EU) 2019/947 and 2019/945 set out the framework for the safe operation of civil drones in European skies through a risk-based approach.

In particular, this balance should consider national security strategies and the necessity of not stepping back in protecting the privacy and security of individuals. This is a crucial issue, as new technologies (and, among them, drones) may impact several aspects of individuals’ lives.

This precondition, and the potential clash between the fundamental rights of individuals and the need of the European Union and of the Member States to monitor emerging security threats, has guided the approach of ARCADIAN-IoT and its legal and ethical outcomes.

According to EU Regulations 2019/947 and 2019/945, there is no distinction between leisure and commercial civil drone activities. What is relevant for the EU regulations is the weight and the specifications of the civil drone and the operation it is intended to conduct.

Regulation (EU) 2019/947, which has been applicable since 31 December 2020 in all EU Member States, as well as in Norway and Liechtenstein, caters for most types of civil drone operations and their levels of risk. It defines three categories of civil drone operations:

  1. The “open” category (Article 4) addresses the lower-risk civil drone operations, where safety is ensured provided the civil drone operator complies with the relevant requirements for its intended operation. This category is subdivided into three subcategories, namely A1, A2 and A3. Operational risks in the open category are considered low, and, therefore, no operational authorisation is required before starting a flight;
  2. The “specific” category (Article 5) covers riskier civil drone operations, where the drone operator ensures safety by obtaining operational authorisation from the competent national authority before starting the operation. To obtain the operational authorisation, the drone operator is required to conduct a risk assessment, which will determine the requirements necessary for the safe operation of the civil drone(s);
  3. The “certified” category (Article 6), in which the safety risk is considerably high; therefore, the certification of the drone operator and its drone, as well as the licensing of the remote pilot(s), is always required to ensure safety.

The regulation also emphasises that all drone operators and remote pilots must comply with European and national privacy and data protection rules. Drone operations must be carried out with minimal interference with the privacy and personal data of individuals on the ground, and any personal data collected must be handled in compliance with the principles, requirements and individual rights laid down in the GDPR.

Processing of special categories of personal data under GDPR

As stated in the previous paragraphs, components processing special categories of personal data under Article 9, GDPR will also be used in ARCADIAN-IoT activities. This applies specifically to biometrics components identifying persons through face recognition AI and to health data processed by IoT medical devices.

Both biometric and health data fall within the special categories of personal data regulated by Article 9, GDPR, which states that “processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data to uniquely identify a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation shall be prohibited” unless one of the conditions laid down in Article 9(2) is met, in particular, if the data subject has given explicit consent to the processing.

While processing this type of data, it is essential to keep in mind that the risk-based approach of the GDPR requires data controllers to exercise greater care, because collecting and using such data is more likely to interfere with fundamental rights or expose individuals to discrimination.

The involvement of minors

Particular protection is necessary when collecting and processing children’s personal data, because they may be less aware of the risks involved.

Protection of children’s data in the GDPR

GDPR provides guidance for specific circumstances and risks related to children when their personal data is collected and processed, emphasising the need for clear communication with children about the processing of their personal data and the related risks. Moreover, the GDPR recognises the possibility for children to exercise their data protection rights.

As mentioned, the GDPR requires data controllers and processors to implement higher protection standards when processing children’s personal data. In particular, Recital 38 states that children “merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply […] when using services offered directly to a child. The consent of the holder of parental responsibility should not be necessary in the context of preventive or counselling services offered directly to a child”.

Given that children merit specific protection, Recital 58 provides that “any information and communication, where processing is addressed to a child, should be in such a clear and plain language that the child can easily understand”.

Legal bases for processing children’s personal information

Under the GDPR, data controllers have an obligation to process personal data on a legal basis, irrespective of whether it belongs to a child or an adult. Article 6 of the GDPR sets out the six possible legal bases for processing personal data, i.e.:

  1. Performance of a contract or taking steps to enter into a contract;
  2. Compliance with a legal obligation;
  3. Protecting vital interests of a data subject or another person;
  4. Performance of a task carried out in the public interest or through official authority;
  5. Legitimate interests of the data controller or another party; and
  6. The consent of the data subject.

In particular, under Article 6(1)(a) of the GDPR, the processing is lawful if “the data subject has given consent to the processing of his or her personal data for one or more specific purposes”, consent that must be freely given, specific, informed and unambiguous, made by way of a clear statement or affirmative action by the data subject.

While processing children’s personal data, data controllers should ensure that children are given a real choice over how their personal data is used and that they have the capacity to understand precisely what they are consenting to, taking into account:

  1. Age;
  2. Any imbalance of power that might be inherent in their relationship with the child.

Finally, special restrictions apply where organisations provide online services. In fact, Article 8, in combination with Member States’ law, sets limitations as to the minimum age upon which online service providers can rely.

Technical measures and anonymisation

The GDPR, in light of the accountability principle, has introduced the concepts of privacy by design and privacy by default.

According to Recital 78 of the GDPR “To be able to demonstrate compliance with this Regulation, the controller should adopt internal policies and implement measures which meet in particular the principles of data protection by design and data protection by default. Such measures could consist, inter alia, of minimising the processing of personal data, pseudonymising personal data as soon as possible, transparency concerning the functions and processing of personal data, enabling the data subject to monitor the data processing, enabling the controller to create and improve security features. When developing, designing, selecting and using applications, services and products that are based on the processing of personal data or process personal data to fulfil their task, producers of the products, services and applications should be encouraged to take into account the right to data protection when developing and designing such products, services and applications and, with due regard to state of the art, to make sure that controllers and processors can fulfil their data protection obligations”.

According to Article 25, paragraph 2 of the GDPR “the controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage and their accessibility”.

As provided by Article 5, par. 1, lett. f), personal data shall be processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures (‘integrity and confidentiality’). In line with this principle, Article 32 provides that appropriate technical and organisational measures shall be implemented to ensure a level of security appropriate to the risk, including, inter alia, as appropriate:

  1. The pseudonymisation and encryption of personal data.
  2. The ability to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems and services.
  3. The ability to restore the availability and access to personal data in a timely manner in the event of a physical or technical incident.
  4. A process for regularly testing, assessing and evaluating the effectiveness of technical and organisational measures for ensuring the security of the processing.

With specific reference to pseudonymisation or, if possible, anonymisation of data, it is also worth remembering that the GDPR only applies to personal data: this implies that where data continues to be processed beyond its initial purpose, but only in an anonymised form, then this processing no longer falls within the scope of the GDPR. Therefore, another requirement could be data anonymisation in such cases.
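To make the pseudonymisation/anonymisation distinction concrete, one common technique (a sketch only, not the ARCADIAN-IoT design; the `pseudonymise` function, the key name and the record fields are hypothetical) is to replace a direct identifier with a keyed hash. Records can still be linked for analysis, but the identifier cannot be recovered without the key. Note that, under Recital 26 of the GDPR, such data remains personal data as long as re-identification is possible with the key, which is why the key must be held separately:

```python
import hashlib
import hmac

# Hypothetical secret, to be stored separately from the data set
# (e.g., in a key vault); without it, pseudonyms cannot be linked
# back to the original identifiers.
PSEUDONYM_KEY = b"stored-in-a-separate-key-vault"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed, deterministic pseudonym.

    The same input always maps to the same pseudonym, so records can
    still be linked for analysis without exposing the identifier.
    """
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"patient_id": "ES-1234", "heart_rate": 72}
safe_record = {
    "patient_ref": pseudonymise(record["patient_id"]),  # pseudonym, not the id
    "heart_rate": record["heart_rate"],
}
```

Destroying the key (together with any lookup table) moves such data towards anonymisation, provided no other means of re-identification remain.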

Ensure the exercise of data subjects’ rights

GDPR provides data subjects – inter alia – with the following rights:

  1. The right of access, i.e., to know whether their data are processed and to obtain a readable copy in an understandable format; it is notably used to check data accuracy;
  2. The right to rectification, i.e., to modify, correct or update data concerning them to reduce the spread or use of inaccurate information;
  3. The right to erasure, i.e., to delete their data;
  4. The right to temporary restriction of processing;
  5. The right to withdraw previously provided consent;
  6. The right to human intervention concerning profiling or a decision based solely on automated processing.

Regarding components using blockchain technology, it is necessary to ensure that citizens can exercise their data subject rights under the GDPR, giving control back to the data subject by letting them choose to have their identifiers “remembered” or “forgotten”, in particular to comply with the “right to be forgotten” when the user stops using the services (or upon request).
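One widely discussed way to reconcile an immutable ledger with the right to erasure (a sketch of a general pattern, not necessarily the ARCADIAN-IoT approach; the `ForgettableStore` class and its fields are hypothetical) is to keep personal data off-chain and record only a hash of it on-chain. Deleting the off-chain copy leaves an on-chain fingerprint that can no longer be linked back to the person:

```python
import hashlib
import json

class ForgettableStore:
    """Off-chain pattern: the immutable ledger stores only hashes of
    personal data; the data itself lives in a mutable off-chain store.
    Deleting the off-chain record approximates erasure, since the
    remaining on-chain hash reveals nothing by itself."""

    def __init__(self):
        self.ledger = []     # append-only: hashes only, no personal data
        self.off_chain = {}  # mutable: user id -> personal data

    def register(self, user_id: str, personal_data: dict) -> str:
        digest = hashlib.sha256(
            json.dumps(personal_data, sort_keys=True).encode()
        ).hexdigest()
        self.ledger.append(digest)          # immutable commitment
        self.off_chain[user_id] = personal_data
        return digest

    def forget(self, user_id: str) -> None:
        """'Right to be forgotten': remove the only copy of the data;
        the ledger entry remains, but is no longer linkable."""
        self.off_chain.pop(user_id, None)

store = ForgettableStore()
digest = store.register("user-1", {"name": "Alice"})
store.forget("user-1")
```

Whether a residual hash counts as fully erased is still debated (see, e.g., the discussion of blockchain and the GDPR by EU institutions), so this pattern should be treated as a mitigation, not a guaranteed route to compliance.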

For more articles from ARCADIAN-IoT please visit: https://www.arcadian-iot.eu/blog/