This document presents a new ODRL profile, the Privacy Paradigm ODRL Profile (PPOP), which extends ODRL, DPV and other specifications to bridge the gap in the representation of transparency-related information in privacy and data protection across the knowledge representation, legal and ethical fields. PPOP also addresses the data processing requirements of personal datastores, envisaged as core elements of the data economy.
This research has been supported by European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 813497 (PROTECT).
Privacy Paradigm ODRL Profile - PPOP | |
---|---|
1. Purpose | The purpose of this profile is to support the specification of transparency measures in the context of data sharing activities and data-intensive flows between multiple data subjects, controllers and processors on decentralised data storage environments. |
2. Scope | The scope of this profile is limited to the definition of data sharing policies aligned with European data protection regulations and with ethical guidelines related to the transparency of Artificial Intelligence. |
3. Implementation Language | RDF, RDFS |
4. Intended End-Users | Developers of decentralised data storage and sharing solutions |
5. Intended Uses | Use 1 - Classify transparency practices of a PIMS. Use 2 - Define access control policies for legal and ethical access to group and individual personal data stores. Use 3 - Model agreements / contracts between data subjects and intermediaries in new data governance schemes. Use 4 - Model safeguards for the trustworthiness of AI systems and respective rights and duties. Use 5 - Create a machine- and human-readable policy notice template. |
6. Ontology Requirements | |
a. Non-Functional Requirements | NFR 1. The ontology shall be published online with standard documentation. |
b. Functional Requirements: Groups of Competency Questions | |
Related to Safeguards | |
Trustworthiness / Reliability | CQTS1 - To what extent did you ensure that the system would function reliably under harsh conditions? |
Safety | CQSF1 - What considerations did you take to prioritise the safety and the mental and physical integrity of people when scanning horizons of technological possibility and when conceiving of and deploying AI applications? |
Security | CQSC1 - What strategies did you establish to ensure that the system continuously remains functional and accessible to its authorised users? CQSC2 - What protocols did you use to keep confidential and private information secure even under hostile or adversarial conditions? CQSC3 - To what extent is the system capable of maintaining the integrity of the information that constitutes it, including protecting its architectures from the unauthorised modification or damage of any of its component parts? |
Privacy | CQPR1 - What measures did you take to enhance privacy? |
Explainability | CQEX1 - What mechanism did you consider to provide explanation and justification of both the content of algorithmically supported decisions and the processes behind their production in plain, understandable, and coherent language? Did you research and try to use the simplest and most interpretable model possible for the application in question? CQEX2 - What considerations were taken into account when the rationale behind a specific decision or behaviour was communicated and clarified? Did you establish mechanisms to inform (end-)users on the reasons and criteria behind the AI system's outcomes? CQEX3 - What strategies did you use to provide a formal or logical explanation? CQEX4 - What strategies did you use to provide a semantic explanation (explanation of technical rationale behind the outcome)? |
Traceability | CQTR1 - What measures can ensure traceability of outcomes and decisions? CQTR2 - What methods are used to ensure traceability of designing and developing algorithmic systems trained by personal data? |
Auditability | CQAU1 - What measures did you take to ensure that every step of the process of designing and implementing AI is accessible for audit, oversight, and review? Did builders and implementers of algorithmic systems keep records and make accessible information that enable monitoring from the stages of collection, pre-processing, and modelling to training, testing, and deploying? |
Avoid Bias | CQFR1 - How to ensure that the system has been sufficiently trained to develop and implement responsibility without bias? CQFR2 - Did you ensure that model architectures did not include target variables, features, processes, or analytical structures (correlations, interactions, and inferences) which are unreasonable, morally objectionable, or unjustifiable? CQFR3 - What considerations were taken into account to encourage all voices to be heard and all opinions to be weighed seriously and sincerely throughout the production and use lifecycle? |
Transparency | CQTP1 - What considerations were taken into account when considering the transparency of an AI system? CQTP2 - If data was collected from the data subject, has the controller provided the data subject with the mandatory information? CQTP3 - If data wasn't collected from the data subject, has the controller provided the data subject with the mandatory information? CQTP4 - Is it the first time that the data subject is contacted? CQTP5 - Is there any applicable exemption to the information obligation? CQTP6 - Has the user of data intermediary services been provided with the mandatory information? CQTP7 - Has the user of a mere conduit service been provided with information about the restrictions on the service? CQTP8 - Were the purposes associated with each particular category of personal data informed? CQTP9 - Was the particular legitimate interest associated with each particular category of personal data informed? CQTP10 - Were the data recipients associated with each particular category of personal data informed? CQTP11 - Were the reasons for the data transfer associated with each particular category of personal data informed? CQTP12 - Were the data retention periods associated with each particular category of personal data informed? CQTP13 - Were the conditions and restrictions related to the use of the service informed? |
Related to Rights
Non-discrimination | CQDS1 - How to ensure that the decisions of the system do not have discriminatory or inequitable impacts on the lives of the people they affect? |
Autonomy / Informed Decisions | CQAT1 - How to ensure that the users are able to make free and informed decisions (in interaction with a system)? |
Right to Privacy | CQRP1 - Did you build in mechanisms for notice and control over personal data? |
Related to Duties
Accuracy | CQAC1 - Did you ensure that the system generates a correct output? CQAC2 - Did you assess whether you can analyse your training and testing data? Can you change and update this over time? |
Other CQs
Intended Purpose | CQPP1 - Did you clarify the purpose of the AI system and who or what may benefit from the product/service? |
Accountability | CQRE1 - How to establish a continuous chain of human responsibility across the whole AI projects delivery flow: from the design of an AI system to its algorithmically steered outcomes? |
Impacts on business | CQBU1 - If the organisation's business model relies on personal data, where does the data come from to create value for the organisation? |
Well-being | CQWE1 - To what extent did you ensure that the use of technology fosters and cultivates the welfare and well-being of data subjects whose interests are impacted by its use? |
Informed data subjects | CQIN1 - Did you enable people to understand how an AI system is developed, trained, operates, and deployed in the relevant application domain, so that consumers, for example, can make more informed choices? |
The base concepts specified by the Privacy Paradigm ODRL Profile are shown in the figure below.
Prefix | Namespace | Description |
---|---|---|
odrl | http://www.w3.org/ns/odrl/2/ | [odrl-vocab] [odrl-model] |
rdf | http://www.w3.org/1999/02/22-rdf-syntax-ns# | [rdf11-concepts] |
rdfs | http://www.w3.org/2000/01/rdf-schema# | [rdf-schema] |
owl | http://www.w3.org/2002/07/owl# | [owl2-overview] |
dct | http://purl.org/dc/terms/ | [dct] |
vann | http://purl.org/vocab/vann/ | [vann] |
xsd | http://www.w3.org/2001/XMLSchema# | [xsd] |
skos | http://www.w3.org/2004/02/skos/core# | [skos] |
dpv | http://www.w3.org/ns/dpv# | [dpv] |
ppop | https://w3id.org/ppop | [ppop] |
oac | https://w3id.org/oac | [oac] |
dpv-gdpr | http://www.w3.org/ns/dpv-gdpr# | [dpv-gdpr] |
dpv-tech | https://w3id.org/dpv/dpv-tech# | [dpv-tech] |
This ODRL profile relies on the invocation of legal and ethical concepts. Where relevant, each term indicates the legal and ethical sources used to define it. Its core concepts are the new entities involved in the data economy, individual and group rights, organisational duties, and the measures to safeguard them.
In-force regulations and proposals of the European Commission were taken into account, as well as existing case law and guidelines by the European Data Protection Board (EDPB). The sources used in this respect were the following: (i) the General Data Protection Regulation [GDPR], (ii) the Data Governance Act [DGA], (iii) the European Digital Identity regulation amendment [eIDAS 2], (iv) the Digital Services Act [DSA], (v) EDPB's guidelines on consent [consent], (vi) Article 29 Working Party guidelines on transparency [transparency], and (vii) the WhatsApp Ireland decision of the Irish data protection supervisory authority [whatsapp]. The work was further complemented by scholarly legal literature on the matter where gaps were identified.
Existing ethical guidelines related to the transparency of Artificial Intelligence were also used for the collection of requirements for the profile. As such, the sources used in this respect were the following: (i) Ethics Guidelines for Trustworthy AI [AI HLEG], (ii) Understanding artificial intelligence ethics and safety [AI Turing], (iii) Recommendation of the Council on AI [OECD], (iv) First draft of the recommendation on the ethics of artificial intelligence [UNESCO], and (v) European Convention on Human Rights [ECHR]. In addition to these sources, various ethical and philosophical literature was used.
Group | Data Sharing Entity | Data Intermediary | Data Sharing Service Provider | Data Altruism Organisation | Data Holder | Data User | Data Trust Provider
Term: | Group |
---|---|
Definition: | Collection of individuals who may or may not share a common purpose or intention; or a formal organization with formal goals and formal organizational rules. |
Instance of: | odrl:Party |
Legal source: | [pagallo-17], [puri-21], [puri-22] |
Ethical source: | [copp-84], [french-84], [newman-04] |
Usage examples: | Family Pod |
Term: | Data Sharing Entity |
---|---|
Definition: | A legal or natural person that can be a data intermediary, data holder or data user |
Subclass of: | dpv:LegalEntity |
Legal source: | [DGA Art.9], [eIDAS Art.3.19] |
Usage examples: |
Term: | Data Intermediary |
---|---|
Definition: | A legal person that engages in intermediation services between data holders which are legal persons and potential data users, including making available the technical or other means to enable such services |
Subclass of: | ppop:DataSharingEntity |
Legal source: | [DGA Art.9.1] |
Usage examples: |
Term: | Data Sharing Service Provider |
---|---|
Definition: | A legal person that engages in intermediation services between data subjects that seek to make their personal data available and potential data users, including making available the technical or other means to enable such services, in the exercise of the rights provided in Regulation (EU) 2016/679 |
Subclass of: | ppop:DataIntermediary |
Legal source: | [DGA Art.9.2] |
Usage examples: | Family Pod |
Term: | Data Altruism Organisation |
---|---|
Definition: | A legal person that performs the activities related to data altruism |
Subclass of: | ppop:DataIntermediary |
Legal source: | [DGA Art.14-16] |
Usage examples: |
Term: | Data Holder |
---|---|
Definition: | A legal person or data subject who has the right to grant / share access to data under its control |
Subclass of: | ppop:DataSharingEntity |
Legal source: | [DGA Art.2.5] |
Usage examples: | Family Pod |
Term: | Data User |
---|---|
Definition: | A natural or legal person who has lawful access to data and is authorised to use it |
Subclass of: | ppop:DataSharingEntity |
Legal source: | [DGA Art.2.6] |
Usage examples: |
Term: | Data Trust Provider |
---|---|
Definition: | A natural or a legal person who provides one or more trust services either as a qualified or as a non-qualified trust service provider |
Subclass of: | dpv:LegalEntity |
Legal source: | [eIDAS Art.3.19] |
Usage examples: |
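To illustrate how these entity terms combine, the following minimal, non-normative sketch (all ex: resources are hypothetical) declares a group whose members rely on a data sharing service provider as their intermediary, reusing the membership and intermediary properties that also appear in the Family Pod example at the end of this document:
ex:researchPool a ppop:Group ;                       # a collection of individuals
ppop:hasDataIntermediary ex:podProvider ;
ppop:hasVoluntaryMembership ex:alice .
ex:podProvider a ppop:DataSharingServiceProvider .   # intermediary under DGA Art. 9.2
ex:alice a ppop:DataHolder, dpv:DataSubject ;
ppop:isDataHolderFor ex:alice .                      # holds the rights over her own data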
has charge price | collects metadata | is fair | is transparent | is non discriminatory | converts data | prevents fraudulent access | prevents abusive access | ensures reasonable continuity | has competition procedures | has service limitation
Term: | has charge price |
---|---|
Definition: | Indicates whether the data sharing entity charges a price for their service |
Domain: | ppop:DataSharingEntity |
Range: | xsd:boolean |
Legal source: | [DGA Art. 11.3] |
Usage examples: |
Term: | collects metadata |
---|---|
Definition: | Indicates whether the data sharing entity has disclosed if the data sharing implies collecting metadata |
Domain: | ppop:DataSharingEntity |
Range: | xsd:boolean |
Legal source: | [DGA Art. 11.2] |
Usage examples: |
Term: | is fair |
---|---|
Definition: | Indicates whether the data sharing entity has disclosed if the data sharing is fair |
Domain: | ppop:DataSharingEntity |
Range: | xsd:boolean |
Legal source: | [DGA Art. 11.3] |
Usage examples: |
Term: | is transparent |
---|---|
Definition: | Indicates whether the data sharing entity has provided transparency measures |
Domain: | ppop:DataSharingEntity |
Range: | xsd:boolean |
Legal source: | [DGA Art. 11.3] |
Usage examples: |
Term: | is non discriminatory |
---|---|
Definition: | Indicates whether the data sharing entity has disclosed that the data sharing is non discriminatory |
Domain: | ppop:DataSharingEntity |
Range: | xsd:boolean |
Legal source: | [DGA Art. 11.3] |
Usage examples: |
Term: | converts data |
---|---|
Definition: | Indicates whether the data sharing entity has disclosed that the data sharing implies changing the data format |
Domain: | ppop:DataSharingEntity |
Range: | xsd:boolean |
Legal source: | [DGA Art. 11.4] |
Usage examples: |
Term: | prevents fraudulent access |
---|---|
Definition: | Indicates whether the data sharing entity has disclosed if the data sharing has technical and organisational measures in place to prevent fraudulent access |
Domain: | ppop:DataSharingEntity |
Range: | xsd:boolean |
Legal source: | [DGA Art. 11.5] |
Usage examples: |
Term: | prevents abusive access |
---|---|
Definition: | Indicates whether the data sharing entity has disclosed if the data sharing has technical and organisational measures to avoid abusive access |
Domain: | ppop:DataSharingEntity |
Range: | xsd:boolean |
Legal source: | [DGA Art. 11.5] |
Usage examples: |
Term: | ensures reasonable continuity |
---|---|
Definition: | Indicates whether the data sharing entity has disclosed if the data sharing has technical and organisational measures to ensure continuity in the data sharing |
Domain: | ppop:DataSharingEntity |
Range: | xsd:boolean |
Legal source: | [DGA Art. 11.6] |
Usage examples: |
Term: | has competition procedures |
---|---|
Definition: | Indicates whether the data sharing entity has disclosed if the data sharing has technical and organisational measures to ensure competitive practices |
Domain: | ppop:DataSharingEntity |
Range: | xsd:boolean |
Legal source: | [DGA Art. 11.9] |
Usage examples: |
Term: | has service limitation |
---|---|
Definition: | Indicates whether the data sharing entity has disclosed the data retention period of the data sharing service |
Domain: | ppop:DataSharingEntity |
Range: | xsd:boolean |
Legal source: | [eIDAS Art. 2] |
Usage examples: |
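As a minimal sketch of how these disclosure properties could be asserted, the snippet below describes a hypothetical data sharing entity; the property IRIs (e.g. ppop:hasChargePrice for "has charge price") are assumed to follow the profile's camel-case naming convention:
ex:podProvider a ppop:DataSharingEntity ;
ppop:hasChargePrice "true"^^xsd:boolean ;            # a price is charged for the service (DGA Art. 11.3)
ppop:collectsMetadata "false"^^xsd:boolean ;         # no metadata is collected on the data sharing (DGA Art. 11.2)
ppop:isTransparent "true"^^xsd:boolean ;             # transparency measures are provided (DGA Art. 11.3)
ppop:preventsFraudulentAccess "true"^^xsd:boolean .  # measures against fraudulent access (DGA Art. 11.5)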
Personal Information Management System | Personal Data Store | Identity Wallet
Term: | Personal Information Management System |
---|---|
Definition: | System that helps to give individuals more control over their personal data by managing their personal data in secure, on-premises or online storage systems and sharing it when and with whomever they choose |
Subclass of: | dpv:Technology |
Legal source: | [EDPS PIMS] |
Usage examples: |
Term: | Personal Data Store |
---|---|
Definition: | Service that lets an individual store, manage and deploy their personal data |
Subclass of: | ppop:PIMS |
Usage examples: |
Term: | Identity Wallet |
---|---|
Definition: | Service that allows the user to store identity data, credentials and attributes linked to her/his identity, to provide them to relying parties on request and to use them for authentication, online and offline, and to create qualified electronic signatures and seals |
Subclass of: | ppop:PIMS |
Legal source: | [eIDAS Art.3.42] |
Usage examples: |
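These storage concepts can be referenced directly in policies, for instance to constrain where data may be stored. The sketch below (with a hypothetical ex: resource and pod URL) types a pod as a personal data store, the same pattern used by the storage constraint in Example 4.3A further below:
ex:alicePod a ppop:PersonalDataStore ;
dpv:hasLocation <https://pod-provider/alice/> .      # the pod where the data resides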
Measure | Technical and Organisational Measure | Safeguard for Trustworthiness | Safeguard for General Safety | Safeguard for Security | Safeguard for Privacy | Safeguard for Explainability | Safeguard for Traceability | Safeguard for Auditability | Safeguard to Avoid Bias | Stakeholder Participation | Transparency Measure | Service Conditions
Term: | Measure |
---|---|
Definition: | Any action deployed by an entity involved in a data processing activity, due to the existence of a legal obligation, to guarantee that the personal data involved is not compromised in any way and, consequently, does not cause harm to the data subject |
Legal source: | [GDPR Arts. 5.1.d, 6.4.e, 12-22, 25, 32, 43-46, 77-79, 82, 89.1] |
Usage examples: |
Term: | Technical and Organisational Measure |
---|---|
Definition: | Any measure to ensure data’s confidentiality, integrity, and availability |
Instance of: | dpv:TechnicalOrganisationalMeasure |
Legal source: | [GDPR Arts. 5.1.d, 6.4.e, 12-22, 25, 32, 43-46, 77-79, 82, 89.1] |
Usage examples: |
Term: | Safeguard for Trustworthiness |
---|---|
Definition: | Safeguards to ensure that the technology itself is (statistically) reliable |
Comment: | Alternative definition: Factor that conveys trust in a system over future uncertainties |
Subclass of: | dpv:Safeguard |
Legal source: | [bodo-21] |
Ethical source: | [AI HLEG] |
Usage examples: |
Term: | Safeguard for General Safety |
---|---|
Definition: | Precautionary measures to prevent vulnerabilities such as data pollution, attacks on physical infrastructure, and cyber security attacks; to ensure the integrity and resilience of the AI system against potential attacks; and to protect public health against the accidental release of hazardous biological agents |
Subclass of: | dpv:SafeguardForTrustworthiness |
Legal source: | [GDPR Art. 32] |
Ethical source: | [AI HLEG], [SIENNA], [AI Turing] |
Usage examples: |
Term: | Safeguard for Security |
---|---|
Definition: | Actions taken to maintain the integrity of the information that constitutes the system, and to ensure that the system continuously remains functional and accessible to its authorised users |
Subclass of: | dpv:SafeguardForTrustworthiness |
Legal source: | [GDPR Art. 32] |
Ethical source: | [AI Turing] |
Usage examples: |
Term: | Safeguard for Privacy |
---|---|
Definition: | Technical measures or tools for de-identification or anonymization of data |
Subclass of: | dpv:SafeguardForTrustworthiness |
Ethical source: | [ohm-09] |
Usage examples: |
Term: | Safeguard for Explainability |
---|---|
Definition: | Safeguards used to provide a formal, logical, or semantic explanation to ensure that the rationale behind a specific decision or behaviour is communicated to (end-)users, and to make explicit and clarify the meaning of the content of the outcome |
Subclass of: | dpv:SafeguardForTrustworthiness |
Legal source: | [GDPR Arts. 12-14] |
Ethical source: | [AI HLEG], [AI Turing], [UNESCO] |
Usage examples: |
Term: | Safeguard for Traceability |
---|---|
Definition: | Technical measures to trace all phases of algorithmic system design and development from data collection to selection, model building, and outcome of / decisions taken by algorithms |
Subclass of: | dpv:SafeguardForTrustworthiness |
Legal source: | [GDPR Art. 5] |
Ethical source: | [AI HLEG] |
Usage examples: |
Term: | Safeguard for Auditability |
---|---|
Definition: | Safeguards to ensure that organisations and AI systems are consistent with relevant principles or norms; checks and balances in place to ensure that a system can be reviewed by independent third parties |
Subclass of: | dpv:SafeguardForTrustworthiness |
Legal source: | [GDPR Arts. 5, 24] |
Ethical source: | [capAI] |
Usage examples: |
Term: | Safeguard to Avoid Bias |
---|---|
Definition: | Safeguards to avoid underrepresenting or overrepresenting specific groups or samples in data collection |
Subclass of: | dpv:SafeguardForTrustworthiness |
Ethical source: | [dalessandro-17] |
Usage examples: |
Term: | Stakeholder Participation |
---|---|
Definition: | Diverse stakeholder participation is required to hear all voices and opinions throughout the production and use lifecycle of technologies |
Subclass of: | dpv:SafeguardToAvoidBias |
Legal source: | [GDPR Art. 22] |
Ethical source: | [AI Turing] |
Usage examples: |
Term: | Transparency Measure |
---|---|
Definition: | Measures to identify which information can or should be disclosed and the most appropriate way to make it available; and conditions of its accessibility |
Subclass of: | dpv:Measure |
Ethical source: | [turilli-floridi-09] |
Usage examples: |
Term: | Service Conditions |
---|---|
Definition: | Any piece of information provided to natural persons regarding the activity where their data is involved |
Subclass of: | dpv:TransparencyMeasure |
Legal source: | [DGA Art. 11] |
Usage examples: |
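In policies, measures are typically invoked through ODRL constraints or refinements with ppop:Measure as the left operand. The following sketch, mirroring the pattern of Example 4.3B below, requires encryption as a technical measure:
ex:measure a odrl:Constraint ;
odrl:leftOperand ppop:Measure ;                      # the kind of measure being constrained
odrl:operator odrl:isA ;
odrl:rightOperand dpv:Encryption .                   # encryption must be applied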
Group Right | Right to Group Privacy | Right to Non-Discrimination | Right to Dignity | Right to Privacy | Right to Autonomy | Right to Security
Term: | Group Right |
---|---|
Definition: | Rights held by a group itself rather than by its members severally |
Subclass of: | dpv:Right |
Legal source: | [GDPR Arts. 12-22] |
Ethical source: | [raz-88], [mcdonald-91] |
Usage examples: |
Term: | Right To Group Privacy |
---|---|
Definition: | A group has rights to privacy that are not reducible to the privacy of individuals who comprise that group |
Comment: | 2) Where information is shared (e.g. in family groups): absolutely private information about the group is shared only between insiders but not with outsiders; a joint individual right to privacy is required to keep the shared information private. 3) Where aggregated information is analysed (e.g. in non-voluntary groups): a group right to privacy applies. This kind of group is identified by any feature (or combination of features) that individuals have in common, which is represented as the result of applying an algorithm, or used in an algorithmic decision. The right to privacy for these groups is defined as a right to reasonable inferences. |
Subclass of: | ppop:GroupRight |
Ethical source: | [floridi-17], [mantelero-17], [taylor-17], [sloot-15] |
Usage examples: |
Term: | Right to Non-Discrimination |
---|---|
Definition: | Right to be protected from the unfair and harmful use of group-related information |
Subclass of: | ppop:GroupRight |
Ethical source: | [muhlhoff-21], [vandyke-77] |
Usage examples: |
Term: | Right to Dignity |
---|---|
Definition: | A group has a right to dignity that is not reducible to the dignity of individuals who comprise that group |
Subclass of: | ppop:GroupRight |
Usage examples: |
Term: | Right to Privacy |
---|---|
Definition: | Data subjects have a right to exercise control over their personal information |
Comment: | Individuals (end-users) should have access to and control over their personal information. 1) Where information is collected: individual right to privacy: data subject should exercise control over their personal information. |
Subclass of: | dpv:DataSubjectRight |
Legal source: | [GDPR Arts. 12-22] |
Ethical source: | [roessler-05], [westin-67] |
Usage examples: |
Term: | Right to Autonomy |
---|---|
Definition: | Data subjects have a right to reflect on, deliberate on, and justify decisions made in interaction with a system |
Subclass of: | dpv:DataSubjectRight |
Legal source: | [GDPR Arts. 12-22] |
Ethical source: | [AI Turing], [UNESCO], [OECD] |
Usage examples: |
Term: | Right to Security |
---|---|
Definition: | Data subjects are entitled to have their data processed, by both controllers and processors, in an appropriate manner that does not affect their freedom, including preventing inappropriate access to it |
Subclass of: | dpv:DataSubjectRight |
Legal source: | [GDPR Arts. 12-22] |
Ethical source: | [ECHR], [lundgren-18] |
Usage examples: |
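Rights can be the target of an ODRL obligation stating that the right must be realised through a given safeguard. The sketch below (with a hypothetical ex:policy) follows the obligation pattern of Example 4.3B:
ex:policy a odrl:Policy ;
odrl:obligation [
odrl:target ppop:RightToNonDiscrimination ;          # the right to be realised
odrl:action [
rdf:value ppop:realize ;
odrl:refinement [
odrl:leftOperand ppop:Measure ;
odrl:operator odrl:isA ;
odrl:rightOperand ppop:SafeguardForPrivacy ] ] ] .   # realised through a privacy safeguard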
Right Exemption | Right to be Informed Exemption | Data Subject Already Informed | Extraordinary Effort | Affects Processing | Legal Disclosure | Confidentiality Obligation | Expression of Opinion | Investigation Prevention | Third Party Rights | Confidentiality of Opinion | Statistical or Research Purpose | Legal Privilege | National Security | Defence | Public Security | Judicial Independence or Proceedings
Term: | Right Exemption |
---|---|
Definition: | Organisations can prevent a data subject from exercising their rights where it is necessary and proportionate, and where it is allowed by the relevant regulation |
Legal source: | [GDPR Arts. 13.4, 14.5, 23] |
Usage examples: |
Term: | Right to be Informed Exemption |
---|---|
Definition: | Reasons why the data controller should not provide the data subject with the relevant information, according to Arts. 13 or 14 as applicable, about an intended data processing activity |
Subclass of: | ppop:RightExemption |
Legal source: | [GDPR Arts. 13.4, 14.5] |
Usage examples: |
Term: | Data Subject Already Informed |
---|---|
Definition: | The data subject already has the relevant information about the intended data processing activity |
Subclass of: | ppop:RightToBeInformedExemption |
Legal source: | [GDPR Arts. 13.4, 14.5.a] |
Usage examples: |
Term: | Extraordinary Effort |
---|---|
Definition: | Providing the data subject with the relevant information would imply an impossible or disproportionate effort for the data controller |
Subclass of: | ppop:RightToBeInformedExemption |
Legal source: | [GDPR Art. 14.5.b] |
Usage examples: |
Term: | Affects Processing |
---|---|
Definition: | Providing the data subject with the relevant information would render impossible or seriously impair the processing |
Subclass of: | ppop:RightToBeInformedExemption |
Legal source: | [GDPR Art. 14.5.b] |
Usage examples: |
Term: | Legal Disclosure |
---|---|
Definition: | The information due to the data subject is already disclosed in Union or Member State law |
Subclass of: | ppop:RightToBeInformedExemption |
Legal source: | [GDPR Art. 14.5.c] |
Usage examples: |
Term: | Confidentiality Obligation |
---|---|
Definition: | The data subject is not informed about a data processing activity due to the existence of a confidentiality obligation that covers the processing activity |
Subclass of: | ppop:RightToBeInformedExemption |
Legal source: | [GDPR Art. 14.5.d] |
Usage examples: |
Term: | Expression of Opinion |
---|---|
Definition: | The personal data relating to the data subject consists of an expression of opinion about the data subject given by another person in confidence, or on the understanding that it would be treated as confidential, to a person who has a legitimate interest in receiving it |
Subclass of: | ppop:RightExemption |
Legal source: | [GDPR Art. 23.1.a - 23.1.j] |
Usage examples: |
Term: | Investigation Prevention |
---|---|
Definition: | There is an allegation being made against the data subject and it is felt that the disclosure of data in the context of the request could in some way hinder the investigation |
Subclass of: | ppop:RightExemption |
Legal source: | [GDPR Art. 23.1.a - 23.1.j] |
Usage examples: |
Term: | Third Party Rights |
---|---|
Definition: | The data subject is only allowed to seek data relating to themselves. Where another person may be identifiable from the information, the third-party data should be redacted unless the third party has given consent |
Subclass of: | ppop:RightExemption |
Legal source: | [GDPR Art. 23.1.a - 23.1.j] |
Usage examples: |
Term: | Confidentiality of Opinion |
---|---|
Definition: | There is a confidential opinion expressed about the data subject by a member of staff |
Subclass of: | ppop:RightExemption |
Legal source: | [GDPR Art. 23.1.a - 23.1.j] |
Usage examples: |
Term: | Statistical or Research Purpose |
---|---|
Definition: | The request of the data subject can be refused if the exercise of rights would be likely to render impossible or seriously impair the achievement of the statistical, research or archiving purposes, or if such restriction is necessary for the fulfilment of those purposes |
Subclass of: | ppop:RightExemption |
Legal source: | [GDPR Art. 23.1.a - 23.1.j] |
Usage examples: |
Term: | Legal Privilege |
---|---|
Definition: | Documents containing personal data of the data subject that are exempt from disclosure in court proceedings are also exempt in relation to a Subject Access Request; this applies to both legal advice privilege and litigation privilege |
Subclass of: | ppop:RightExemption |
Legal source: | [GDPR Art. 23.1.a - 23.1.j] |
Usage examples: |
Term: | National Security |
---|---|
Definition: | The exercise of the right by the data subject can be refused to safeguard national security where accepting the request of the right poses a threat to it |
Subclass of: | ppop:RightExemption |
Legal source: | [GDPR Art. 23.1.a - 23.1.j] |
Usage examples: |
Term: | Defence |
---|---|
Definition: | The exercise of the right by the data subject can be refused to safeguard defence where accepting the request of the right poses a threat to it |
Subclass of: | ppop:RightExemption |
Legal source: | [GDPR Art. 23.1.a - 23.1.j] |
Usage examples: |
Term: | Public Security |
---|---|
Definition: | The exercise of the right by the data subject can be refused to safeguard public security where accepting the request of the right poses a threat to it |
Subclass of: | ppop:RightExemption |
Legal source: | [GDPR Art. 23.1.a - 23.1.j] |
Usage examples: |
Term: | Judicial Independence or Proceedings |
---|---|
Definition: | The exercise of the right by the data subject can be refused to safeguard judicial independence or proceedings where accepting the request of the right poses a threat to it |
Subclass of: | ppop:RightExemption |
Legal source: | [GDPR Art. 23.1.a - 23.1.j] |
Usage examples: |
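The profile does not prescribe here how an exemption is attached to a processing activity. As a minimal illustration (the ex: resource, the rdfs:comment text, and the class IRI derived from the term name are assumptions), an applicable exemption can at least be typed and documented, which supports answering CQTP5:
ex:exemption1 a ppop:ExtraordinaryEffort ;           # exemption under GDPR Art. 14.5.b
rdfs:comment "Individually informing the data subjects in this archival dataset would require disproportionate effort." .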
Organisation Duty | Explainability | Traceability | Auditability | Accuracy
Term: | Organisation Duty |
---|---|
Definition: | Moral or legal duties of organisations corresponding to the rights of the data subject |
Legal source: | [GDPR Art. 24] |
Ethical source: | [HCR] |
Usage examples: |
Term: | Explainability |
---|---|
Definition: | Organisations have a duty to explain and justify what is happening and why it is happening based on known facts and logical steps |
Subclass of: | ppop:OrganisationDuty |
Legal source: | [GDPR Art. 12] |
Ethical source: | [gall-21] |
Usage examples: |
Term: | Traceability |
---|---|
Definition: | Organisations have a duty to assign responsibilities and document decisions to enable follow-up |
Subclass of: | ppop:OrganisationDuty |
Legal source: | [GDPR Art. 5] |
Ethical source: | [capAI] |
Usage examples: |
Term: | Auditability |
---|---|
Definition: | Organisations have a duty to operationalise conformity assessment (to mitigate objective risks) and to ensure independent third party review (to mitigate subjective risks) |
Subclass of: | ppop:OrganisationDuty |
Legal source: | [GDPR Art. 24] |
Ethical source: | [capAI] |
Usage examples: |
Term: | Accuracy |
---|---|
Definition: | Ensure that systems generate correct and up-to-date outputs/outcomes |
Subclass of: | ppop:OrganisationDuty |
Legal source: | [GDPR Art. 5] |
Ethical source: | [AI Turing] |
Usage examples: |
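Analogously, duty instances can be typed with the classes above and attached to the responsible organisation; the linking property ppop:hasDuty below is hypothetical and shown only as one possible modelling:
ex:hospital a oac:DataController ;
ppop:hasDuty ex:explainPredictions .                 # hypothetical linking property
ex:explainPredictions a ppop:Explainability ;
rdfs:comment "Explain and justify each algorithmic outcome to the affected patient." .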
A family with two parents and two children allows the processing of their medical health data for the purpose of research and development, and has a data sharing service provider as data intermediary.
ex:familyPod a odrl:Policy ;
odrl:profile ppop:, oac: ;
odrl:uid <https://pod-provider/familyA/policy1> ;
dct:issued "2022-02-22" ;
odrl:permission [
odrl:assigner ex:familyPool ;
odrl:target oac:MedicalHealth ;
odrl:action oac:Read, oac:Write ;
odrl:constraint ex:purpose
] .
ex:familyPool a ppop:Group ;
ppop:hasDataIntermediary ex:dataIntermediary ;
ppop:hasVoluntaryMembership ex:parent1, ex:parent2 ;
ppop:hasNonVoluntaryMembership ex:child1, ex:child2 .
ex:dataIntermediary a ppop:DataSharingServiceProvider .
ex:parent1 a ppop:DataHolder, dpv:DataSubject ;
ppop:isDataHolderFor ex:child1, ex:child2, ex:parent1 .
ex:parent2 a ppop:DataHolder, dpv:DataSubject ;
ppop:isDataHolderFor ex:child1, ex:child2, ex:parent2 .
ex:child1 a dpv:Child .
ex:child2 a dpv:Child .
ex:purpose a odrl:Constraint ;
odrl:leftOperand oac:Purpose ;
odrl:operator odrl:isA ;
odrl:rightOperand dpv:ResearchAndDevelopment .
...
In order to develop a model that identifies different phenotypes of asthma and thus leads to the provision of appropriate medical treatment, a server in a hospital processes encrypted patient records stored in individuals' Pods in conjunction with anonymised patient records from public health care systems. A validated model is developed in which a pattern between two variables, sex and BMI, is discovered and labelled as Type A asthma. The developed model is used to treat patients differently based on the type of asthma that has been identified. The discovered knowledge may result in discriminatory requests for the care of obese females.
To realise the task of the AI system, i.e., prediction of the asthma phenotype, a new sample is sent to the AI model for inference. The patient whose data is sent to the AI model has already been informed that an AI system is being used to make a prediction based on what it was trained on. Following the transmission of data to the model, the result, i.e., the label of the identified type of asthma, is sent to the diagnostic workstation for evaluation and assessment in terms of reliability and accuracy. Finally, the patient is aware of the type of asthma s/he has (because the clinician explains the outcome of the AI system in simple language while complying with transparency measures), and the clinician takes the appropriate actions to treat the patient.
Example 4.3A presents patient A's data sharing policy related to their health record for the provision of asthma treatment, while Example 4.3B presents the hospital's privacy policy regarding this particular service.
ex:patientA-policy a odrl:Policy ;
odrl:profile ppop:, oac: ;
odrl:uid <https://pod-provider/patientA/policy3> ;
dct:issued "2022-01-13" ;
odrl:assigner ex:patientA ;
odrl:target oac:HealthRecord ;
odrl:permission [ odrl:action oac:Use ; odrl:constraint ex:purpose ] ;
odrl:permission [ odrl:action oac:Store ; odrl:constraint ex:storage ] ;
odrl:obligation [
odrl:action ex:discloseCodeImplementer ;
ppop:accountableParty ex:codeImplementer ] .
ex:patientA a ppop:DataHolder, oac:DataSubject .
ex:AsthmaTreatment a dpv:Purpose ; skos:broaderTransitive dpv:ServiceProvision ;
rdfs:label "Provision of medical asthma treatment" .
ex:purpose odrl:leftOperand oac:Purpose ;
odrl:operator odrl:isA ;
odrl:rightOperand ex:AsthmaTreatment .
ex:storage odrl:leftOperand ppop:ProcessingContext ;
odrl:operator odrl:eq ;
odrl:rightOperand [ a ppop:PersonalDataStore ;
dpv:hasLocation <https://pod-provider/patientA/> ] .
ex:discloseCodeImplementer a dpv:Processing ; skos:broaderTransitive dpv:Disclose ;
rdfs:label "Disclose the implementer of the predictive model" .
ex:codeImplementer a ppop:DataUser .
ex:pp-usecase-3 a odrl:Privacy ;
odrl:profile ppop:, oac: ;
odrl:uid <https://hospitala.com/policy3> ;
dct:issued "2022-01-15" ;
odrl:assigner ex:controller ;
odrl:constraint ex:usedTechnology ;
odrl:permission [
ppop:accountableParty ex:controller ;
odrl:assignee ex:patients ;
odrl:target oac:HealthRecord ;
odrl:action [
rdf:value oac:Use ;
odrl:refinement [
odrl:and ex:purpose, ex:legalBasis, ex:measure
]
]
] ;
odrl:permission [
odrl:target ex:anonymisedHealthRecords ;
odrl:action [
rdf:value oac:Use ;
odrl:refinement ex:purpose
]
] ;
odrl:obligation [
odrl:target ppop:RightToNonDiscrimination ;
odrl:action [
rdf:value ppop:realize ;
odrl:refinement [
odrl:leftOperand ppop:Measure ;
odrl:operator odrl:isA ;
odrl:rightOperand ppop:SafeguardForPrivacy
]
]
] .
ex:controller a oac:DataController, ppop:DataUser ;
dpv:hasName "Hospital A" ;
dpv:hasAddress "Hospital Street, City, Country" ;
dpv:hasContact "contact@hospitala.com" ;
dpv:hasDataProtectionOfficer [
a dpv:DataProtectionOfficer ;
dpv:hasContact "dpo@hospitala.com" ;
] .
ex:Server a dpv:Technology ;
skos:broaderTransitive dpv-tech:DataStorageTechnology, dpv-tech:DataManagementTechnology ;
rdfs:label "Server used to store and develop models with patients' data" .
ex:usedTechnology a odrl:Constraint ;
odrl:leftOperand ppop:Technology ;
odrl:operator odrl:isA ;
odrl:rightOperand ex:Server .
ex:patients a ppop:Group ;
ppop:hasNonVoluntaryMembership ex:patientA, ex:patientB, ex:patientC .
ex:purpose a odrl:Constraint ;
odrl:leftOperand oac:Purpose ;
odrl:operator odrl:isA ;
odrl:rightOperand ex:AsthmaTreatment .
ex:legalBasis a odrl:Constraint ;
odrl:leftOperand ppop:LegalBasis ;
odrl:operator odrl:isA ;
odrl:rightOperand ex:consent .
ex:consent a dpv-gdpr:A9-2-a ;
dpv:hasWithdrawalMethod "withdraw@hospitala.com" .
ex:measure a odrl:Constraint ;
odrl:leftOperand ppop:Measure ;
odrl:operator odrl:isA ;
odrl:rightOperand dpv:Encryption .
ex:AnonymisedHealthRecord a dpv:AnonymisedData, dpv:HealthRecord ;
rdfs:label "Anonymised health record data" .
ex:anonymisedHealthRecords a ex:AnonymisedHealthRecord ;
ppop:collectedFromOtherDataSources "public health care systems" .