There is an interference with the right to privacy whenever personal data is collected, processed, and/or transferred. For an interference with privacy to be justified, and thus not a violation of the right, the interference must be necessary to achieve a legitimate aim and the measures taken must be proportionate.1 Throughout the data collection effort, Collectors must regularly review the necessity and proportionality of their actions in light of the right to privacy.2 Whether data was collected in violation of privacy rights can be relevant to a court’s assessment of the data’s admissibility as evidence.

Collectors should operate under the presumption that the collected data could contain personal data, and that consequently privacy protections apply. Collectors should regularly review necessity and proportionality even if the data they are collecting will not be reviewed by human eyes, for example if the collection effort’s objective is only to collect and store data.3 This applies equally if the data is collected solely for the purposes of algorithmic training.

Reviewing the necessity of a collection effort requires the Collector to ask: could the collection effort objectives be achieved by less privacy-intrusive means? For example, by structuring the collection effort to avoid collecting personal data, or by anonymising any collected personal data? If the answer is yes, the collection effort must be adjusted accordingly; if the answer is no, the Collector can then proceed to a proportionality assessment.4

The proportionality assessment should be guided by the questions listed in section 4.2.2.C. of the Legal Framework. This list is non-exhaustive and may need to be expanded depending on the collection effort’s objectives and specific context.

Tech Specs & Resources

Anonymisation of audio data can be achieved through redaction techniques including, among others, noise addition, speech transformation, and voice conversion (e.g., zero-shot voice conversion, a machine-learning technique for voice conversion that requires little or no training data).
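Noise addition, the simplest of the techniques above, can be sketched in a few lines. The snippet below is a minimal illustration, not a vetted anonymisation tool: it uses NumPy, a synthetic tone stands in for recorded speech, and the function name and target-SNR parameter are illustrative assumptions. On its own, additive noise offers only weak protection against speaker re-identification.

```python
import numpy as np

def add_noise(waveform: np.ndarray, snr_db: float = 20.0, seed: int = 0) -> np.ndarray:
    """Add Gaussian noise to a waveform at a target signal-to-noise ratio (dB).

    Illustrative only -- additive noise alone is a weak anonymiser.
    """
    rng = np.random.default_rng(seed)
    signal_power = np.mean(waveform ** 2)
    # Target SNR (dB) -> required noise power: SNR = 10*log10(Ps/Pn)
    noise_power = signal_power / (10.0 ** (snr_db / 10.0))
    noise = rng.normal(0.0, np.sqrt(noise_power), size=waveform.shape)
    return waveform + noise

# A 1-second 220 Hz tone at 16 kHz stands in for recorded speech.
t = np.linspace(0.0, 1.0, 16000, endpoint=False)
clean = 0.5 * np.sin(2.0 * np.pi * 220.0 * t)
noisy = add_noise(clean, snr_db=10.0)
```

A step like this can run entirely on local hardware, so the audio never leaves the Collector's machine.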

Because the information undergoing anonymisation is likely to be sensitive, any anonymisation tools should preferably run locally on the Collector’s own machine rather than in the cloud, in order to minimise additional risks to security and privacy.

The Collector should bear in mind the existing scepticism concerning the permanence of anonymisation techniques, particularly where other data can be linked to the anonymised data to re-identify a person. See e.g., NIST, Interagency Report 8387, Digital Evidence Preservation: Considerations for Evidence Handlers (2022), page 11.

Legal Framework

See section 4.2. on when data is classified as personal data and section 4.2.1. on the applicability of privacy in military communications.

See section 4.2.2. on the three-part test for determining whether an interference with the right to privacy is justified, and in particular section 4.2.2.B. and section 4.2.2.C. on the necessity and proportionality aspects of this test.

Applicable Ethical Principles

Do No Harm; Legal Awareness.

Footnotes

  1. ECHR, Article 8; Convention 108, Article 11.

  2. P.N. v Germany (ECtHR), Judgment, para 85; Catt v United Kingdom (ECtHR), Judgment, paras 119-120; Big Brother Watch and Others v United Kingdom (ECtHR), Judgment, paras 350, 356; S. and Marper v United Kingdom (ECtHR), Judgment, para 119.

  3. ECtHR, Guide to the Case-Law of the European Court of Human Rights: Data Protection (2022), para 8, citing Kırdök and Others v Turkey (ECtHR), Judgment.

  4. ECHR, Article 8(2); Convention 108, Article 11.