<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>personal data - KIELTYKA GLADKOWSKI LEGAL | CROSS BORDER POLISH LAW FIRM RANKED IN THE LEGAL 500 EMEA SINCE 2019</title>
	<atom:link href="https://www.kg-legal.eu/info/tag/personal-data/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.kg-legal.eu/info/tag/personal-data/</link>
	<description>KIELTYKA GLADKOWSKI LEGAL &#124; CROSS BORDER POLISH LAW FIRM RANKED IN THE LEGAL 500 EMEA SINCE 2019</description>
	<lastBuildDate>Wed, 12 Nov 2025 10:22:53 +0000</lastBuildDate>
	<language>en-GB</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	
	<item>
		<title>National Healthcare and the processing of personal data by means of AI</title>
		<link>https://www.kg-legal.eu/info/pharmaceutical-healthcare-life-sciences-law/national-healthcare-and-the-processing-of-personal-data-by-means-of-ai/</link>
					<comments>https://www.kg-legal.eu/info/pharmaceutical-healthcare-life-sciences-law/national-healthcare-and-the-processing-of-personal-data-by-means-of-ai/#respond</comments>
		
		<dc:creator><![CDATA[jakub]]></dc:creator>
		<pubDate>Wed, 12 Nov 2025 10:22:53 +0000</pubDate>
				<category><![CDATA[PHARMACEUTICAL, HEALTHCARE & LIFE SCIENCES LAW]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Artificial intelligence]]></category>
		<category><![CDATA[gdpr]]></category>
		<category><![CDATA[National Health Fund]]></category>
		<category><![CDATA[National Healthcare]]></category>
		<category><![CDATA[nfz]]></category>
		<category><![CDATA[personal data]]></category>
		<category><![CDATA[Poland]]></category>
		<category><![CDATA[processing of personal data]]></category>
		<guid isPermaLink="false">https://www.kg-legal.eu/?p=8478</guid>

					<description><![CDATA[<p>Publication date: November 12, 2025 Artificial intelligence (AI) is currently finding widespread use in healthcare. A prime example is the Polish National Health Fund (NFZ) initiative, which utilizes AI to analyze patient data stored in the Fund&#8217;s databases. This data is then analyzed with the support of machine learning tools to make strategic decisions regarding [&#8230;]</p>
<p>The article <a href="https://www.kg-legal.eu/info/pharmaceutical-healthcare-life-sciences-law/national-healthcare-and-the-processing-of-personal-data-by-means-of-ai/">National Healthcare and the processing of personal data by means of AI</a> originally appeared on <a href="https://www.kg-legal.eu">KIELTYKA GLADKOWSKI LEGAL | CROSS BORDER POLISH LAW FIRM RANKED IN THE LEGAL 500 EMEA SINCE 2019</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p><strong><mark style="background-color:rgba(0, 0, 0, 0)" class="has-inline-color has-vivid-cyan-blue-color">Publication date: November 12, 2025</mark></strong></p>



<p>Artificial intelligence (AI) is finding widespread use in healthcare. A prime example is the Polish National Health Fund (NFZ) initiative, which uses AI to analyze patient data stored in the Fund&#8217;s databases. The data are analyzed with the support of machine learning tools to inform strategic decisions regarding the health of Poles. This approach should simplify doctors&#8217; work by retrieving and analyzing the relevant information, reducing their workload. However, such a solution raises a number of legal issues under the regulations governing the protection and processing of personal data.</p>



<span id="more-8478"></span>



<h2 class="wp-block-heading"><strong>What is personal data?</strong></h2>



<p>The most common definition of personal data is contained in EU Regulation 2016/679 (GDPR), according to which personal data is any information about an identified or identifiable natural person. Identification may be direct (e.g., by name and surname) or indirect (e.g., by job description or nationality). The Polish Act of December 14, 2018 on the Protection of Personal Data Processed in Connection with the Prevention and Combating of Crime expands this concept by applying it expressly to health: health data means personal data relating to the physical or mental health of an individual, including data on the use of healthcare services that reveal information about their health.</p>



<p class="has-luminous-vivid-amber-background-color has-background has-medium-font-size"><strong>Patients&#8217; rights</strong></p>



<p>A number of patient rights are listed in EU Regulation 2025/327 (the Regulation on the European Health Data Space). The fundamental right is the right of individuals to access their electronically collected data (especially &#8220;priority data,&#8221; e.g., electronic prescriptions or imaging test results), which should be granted immediately after the data are registered in the system. Individuals can also add their own information to the data already visible in the system and correct it. Furthermore, patients can grant access to their data or request its transfer to another provider. They can also restrict healthcare professionals&#8217; access to their data (in such cases, the patient should be informed of the potential impact of the restriction on the quality of care provided). Institutions collecting patient data should be aware of these rights: if they are not respected, the patient may file a complaint (provided that the rights or interests of the individual are adversely affected) and demand appropriate compensation.</p>



<p>The GDPR provides similar rights and adds one crucial privilege: the right to object. Under the regulation, an individual may object at any time, for reasons relating to their particular situation, to the processing of their data, including processing connected with the performance of healthcare tasks. In such a case, the data may no longer be processed unless the controller demonstrates compelling legitimate grounds for further processing. A patient may also request the deletion of their personal data if, for example, the data are no longer necessary for the purpose for which they were collected or were processed unlawfully. The regulation also provides for administrative fines of up to €20 million for a controller&#8217;s violation of these guaranteed rights, and Article 79 of the GDPR grants the right to an effective judicial remedy if the individual (patient) believes that the processing of their personal data violated the law.</p>



<p class="has-luminous-vivid-amber-background-color has-background has-medium-font-size"><strong>Obligations of entities storing and processing data</strong></p>



<p>Pursuant to Article 24 of the GDPR, the data controller is obligated to implement appropriate technical and organizational measures to ensure that data processing is carried out in compliance with legal provisions and with the rights and freedoms of others. The controller must also review and update these measures as necessary. In the case of <strong><u>AI-based patient data processing</u></strong>, the obligation to design the measures described above is also crucial, ensuring that only information necessary to protect the patient&#8217;s life and health is processed by default.</p>

<p>In the event of a personal data breach, the controller should notify the relevant supervisory authority within 72 hours of becoming aware of the breach. The controller is exempt from this obligation only if the breach is unlikely to result in a risk to the rights and freedoms of natural persons; conversely, if that risk is high, the controller should also notify the affected individual. Furthermore, before processing begins, particularly where new technologies (including AI) are used and the processing may result in a high risk to the rights and freedoms of natural persons, the controller must assess the impact of the planned processing on personal data protection. If such an assessment reveals a high risk and the controller fails to implement measures to mitigate it, the controller must consult the relevant supervisory authority (in Poland, the President of the Personal Data Protection Office [President of the UODO]), which then provides the controller with a written recommendation and may also temporarily restrict or prohibit processing or issue a warning to the controller. The general obligations, under which personal data must be processed lawfully, fairly and transparently, and limited to what is necessary for the purposes for which they are processed, are also important.</p>



<p>Regulation 2024/1689 (the &#8220;AI Act&#8221;) adds a noteworthy requirement in this context. Under Article 4 thereof, healthcare entities using AI systems to make strategic decisions about patients are responsible for maintaining an appropriate level of AI literacy among their staff, taking into account the purpose for which the system is used and the persons on whom it is to be used.</p>



<p class="has-luminous-vivid-amber-background-color has-background has-medium-font-size"><strong>Requirements for the AI systems themselves</strong></p>



<p>The basic requirements that AI systems used for data processing would have to meet are set out in the aforementioned AI Act. The act classifies AI systems of this kind as &#8220;high-risk systems&#8221;, and the requirements it lays down therefore apply to them. Primarily, this requires maintaining appropriate documentation for the system: technical documentation regarding the quality management system (including data acquisition, collection, analysis, and labeling) and an EU declaration of conformity confirming the system&#8217;s compliance with the requirements set out in the regulation. This documentation should be kept at the disposal of the competent national authorities for 10 years after the system is placed on the market or put into service. Such systems should also meet transparency requirements, meaning they should be designed to facilitate proper use and interpretation of their output (and should come with clear operating instructions). They must also allow for effective human oversight, so that a person can intervene if necessary, especially if the system&#8217;s operation were to get out of control and harm others. Finally, high-risk AI systems are subject to certain formal requirements, such as undergoing a pre-market conformity assessment and registration in a dedicated EU database for high-risk AI systems, and they fall under the general CE marking rules. Once a system is placed on the market, its provider is required to establish a post-market monitoring system to ensure its continued legal compliance.</p>



<p><strong><u>The issue of non-personal data</u></strong></p>



<p>It is also worth raising the issue of non-personal data, for example where a hospital, in order to decide on the appropriate medication for a patient, requests information about certain medications from a pharmacy. A public sector body may request such information only to the extent that the lack of the data would prevent it from performing its public interest tasks, or when the body has no other available means of obtaining the data. A request for this purpose should specify, in particular, the purpose for which the information is requested. The recipient of the request may, however, refuse to provide the data if it has no control over the requested information or if a similar request for the same purpose has already been submitted by another public sector body.</p>

<p>Once the requested information is in the possession of the requesting body, it must not be used in a manner inconsistent with the purpose for which the data was provided; the body must take measures to protect its confidentiality and integrity, and must delete the data as soon as it is no longer needed for the specified purpose. The body is also prohibited from using the information obtained to improve a competitive product, and from disclosing any such information to third parties. A public sector body may, however, share the data received with individuals or organisations for the purposes of scientific research or analyses consistent with the purpose for which the data was requested, or with national statistical offices (e.g. the Central Statistical Office), provided that these organisations are not commercial in nature and are not related to entities that are.</p>



<p><strong><u>The status in Poland</u></strong></p>



<p>As mentioned above, Poland has established a special supervisory authority for personal data protection, the President of the Personal Data Protection Office (UODO), acting with the assistance of the Office for Personal Data Protection. Among other things, this authority is responsible for consultations on data processing that poses a significant risk of violating the rights of others. It also conducts proceedings in cases of violations of personal data protection regulations and establishes a plan for monitoring compliance with these regulations.</p>



<p>It is also worth remembering the regulations of the Polish Act on Patients&#8217; Rights and the Patient Ombudsman. It stipulates that patients have the right to access medical records concerning their health and the services provided to them. The entity storing this documentation is obligated to disclose the data contained therein only to the patient themselves or to an authorized person (or, for example, to a university or research institute for scientific purposes, but without any data allowing for the identification of the individual). Furthermore, the entity providing services is obligated to retain medical records only for a specified period (generally 20 years), after which they should be destroyed in a way that prevents the identification of the patient to whom they pertained. The Act on the Healthcare Information System additionally limits access to these records to medical professionals, including physicians.</p>



<p>Also important are the provisions of the Act on the Computerisation of the Operations of Entities Performing Public Tasks, under which an entity maintaining a public register (i.e. any type of records used to carry out public tasks on the basis of the relevant provisions) should provide another public entity with access to the data in its possession to the extent necessary to carry out public tasks.</p>



<p>It is also important to remember the Polish Code of Medical Ethics, which, in Article 14, requires physicians to inform patients about the benefits and risks associated with proposed diagnostic procedures and, where appropriate, about the possibility of using other methods. Furthermore, according to Article 12, the use of AI in treatment may only occur after the following conditions are met: informing the patient that artificial intelligence will be used in the diagnosis or therapeutic process; obtaining the patient&#8217;s informed consent to the use of artificial intelligence in the diagnostic or therapeutic process; and using AI algorithms that are approved for medical use and have the appropriate certifications. However, the final decision always rests with the physician.</p>



<p>Draft government legislation is currently being prepared to adapt the national legal system to the requirements imposed by the AI Act. The proposals primarily envisage the establishment of an Artificial Intelligence Development and Security Commission, which will oversee the AI market within the scope specified in Article 2 of Regulation 2024/1689. The second main body will be the President of the Personal Data Protection Office (UODO), which will oversee high-risk AI systems, including those related to healthcare.</p>



<p><strong><u>Summary</u></strong></p>



<p>Processing personal data for healthcare purposes, additionally supported by artificial intelligence, is undoubtedly a convenient and practical solution, but it entails a number of legal obligations. These are intended to ensure the security of the data used (e.g., using the acquired data only for a strictly defined purpose), the safety of patients themselves (e.g., the obligation to inform the patient of the intention to use artificial intelligence in the treatment process), or simply to satisfy formalities (e.g., the requirement to register the artificial intelligence system in an EU database). At present, EU regulations are considerably more detailed in this matter.</p>
<p>The article <a href="https://www.kg-legal.eu/info/pharmaceutical-healthcare-life-sciences-law/national-healthcare-and-the-processing-of-personal-data-by-means-of-ai/">National Healthcare and the processing of personal data by means of AI</a> originally appeared on <a href="https://www.kg-legal.eu">KIELTYKA GLADKOWSKI LEGAL | CROSS BORDER POLISH LAW FIRM RANKED IN THE LEGAL 500 EMEA SINCE 2019</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.kg-legal.eu/info/pharmaceutical-healthcare-life-sciences-law/national-healthcare-and-the-processing-of-personal-data-by-means-of-ai/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Voice Cloning as a Global New Technology and its Challenges for EU and Polish Law</title>
		<link>https://www.kg-legal.eu/info/it-new-technologies-media-and-communication-technology-law/voice-cloning-as-a-global-new-technology-and-its-challenges-for-eu-and-polish-law/</link>
					<comments>https://www.kg-legal.eu/info/it-new-technologies-media-and-communication-technology-law/voice-cloning-as-a-global-new-technology-and-its-challenges-for-eu-and-polish-law/#respond</comments>
		
		<dc:creator><![CDATA[jakub]]></dc:creator>
		<pubDate>Fri, 07 Jul 2017 10:56:08 +0000</pubDate>
				<category><![CDATA[IT, NEW TECHNOLOGIES, MEDIA AND COMMUNICATION TECHNOLOGY LAW]]></category>
		<category><![CDATA[data protection]]></category>
		<category><![CDATA[personal data]]></category>
		<category><![CDATA[personal data protection]]></category>
		<category><![CDATA[voice cloning]]></category>
		<guid isPermaLink="false">https://www.kg-legal.eu/?p=1051</guid>

					<description><![CDATA[<p>Voice Cloning as a Global New Technology and its Challenges for EU and Polish Law</p>
<p>The article <a href="https://www.kg-legal.eu/info/it-new-technologies-media-and-communication-technology-law/voice-cloning-as-a-global-new-technology-and-its-challenges-for-eu-and-polish-law/">Voice Cloning as a Global New Technology and its Challenges for EU and Polish Law</a> originally appeared on <a href="https://www.kg-legal.eu">KIELTYKA GLADKOWSKI LEGAL | CROSS BORDER POLISH LAW FIRM RANKED IN THE LEGAL 500 EMEA SINCE 2019</a>.</p>
]]></description>
										<content:encoded><![CDATA[<div class="wp-block-image">
<figure class="alignleft size-large is-resized"><img decoding="async" src="https://www.kg-legal.eu/wp-content/uploads/2016/12/pay-per-view.jpg" alt="" style="width:302px;height:auto"/></figure></div>

<p>Siri, Cortana, Google and other applications use the human voice to do a variety of things, e.g. searching for information, sending emails or calling somebody. Voice-based technologies are increasingly applied in the legal environment and in legal services, for example in legal advice rendered online and in legal translations. At the same time, new applications of innovative technologies have made it necessary to define the approach to privacy issues anew. The cases of Edward Snowden and Julian Assange showed us how meaningful privacy and its protection are, and made us realize the excessive amount of personal data processed and stored daily. This is why privacy and its protection will soon become one of the most important personal rights. The issue of voice protection comes to the fore in this context. Voice is, obviously, a personal right. What is more, voice is becoming a tool used by many applications for mundane as well as more complex activities, like ROSS AI operating on IBM’s Watson, which can do legal research and learns to understand law with every search it conducts. What if it were possible for such applications as Watson to use the voice of a specific lawyer and, from a voice sample, produce speech with a different content, for example in the form of legal advice? In practice it is possible: in November 2016 Adobe presented Adobe VoCo to the world, which (given a voice sample) is able to read out content differing from the content sampled. The present article will try to shed some light on the risks involved in voice cloning technology in the legal environment and will analyse whether the law can adequately protect the human voice as a personal right.</p>
<p><span id="more-1051"></span></p>
<h4><strong>Background of Voice Cloning Technology</strong></h4>
<p>Voice cloning technology is based on copying and reusing recorded speech. In the future, such software will be able to record voice samples and then produce an infinite number of combined syllables, leading to an unlimited number of sentences, without the participation of the human being who provided the voice sample. Given the latest developments, we may soon witness the creation of such software for commercial use. The first project worth mentioning is Google DeepMind&#8217;s WaveNet. It is a deep neural network for generating raw audio waveforms, including speech and music. WaveNet has outperformed other text-to-speech systems, but the product has not yet been made available to consumers.<a href="#_edn1" name="_ednref1">[i]</a> Also of importance is Adobe Project VoCo, presented during the Adobe MAX 2016 Sneak Peeks. It is software able to create a voice model of a speaker from a voice sample of about 20 minutes provided by that speaker.<a href="#_edn2" name="_ednref2">[ii]</a> VoCo can construct new words and sentences which did not occur in the provided recordings.<a href="#_edn3" name="_ednref3">[iii]</a> This potential, combined with the plans to release VoCo to the consumer market, raises considerable concerns, including legal ones, in respect of data and privacy protection.</p>
<h4>Voice as a Personal Right and its Protection in Polish Jurisdiction</h4>
<p>In order to determine whether European or Polish law can adequately protect the use of voice technology-based applications and word-building software, we first need to establish the legal status of the human voice. From the legal point of view, the human voice should generally be classified as a personal right and, more specifically, as a non-pecuniary attribute of every human being connected with his or her individual existence, which is effective against everyone, inalienable and non-inheritable. Polish provisions (art. 23–24 of the <a href="http://www.ebrd.com/downloads/legal/core/poland.pdf">Polish Civil Code</a>) include an open, illustrative catalogue of personal rights and their protection, irrespective of other regulations. Polish jurisprudence and the majority of law practitioners<a href="#_edn4" name="_ednref4">[iv]</a> approve of the most essential judgement in this regard, delivered by the Polish Court of Appeals in Gdańsk on 21 June 1991 (case citation: <a href="http://prawo.legeo.pl/prawo/i-acr-127-91/">I ACr 127/91, LEX</a>), where the Court acknowledged that the voice shall be regarded as a personal right (as defined in art. 23 of the Polish Civil Code)<a href="#_edn5" name="_ednref5">[v]</a> and protected pursuant to art. 24 of the Polish Civil Code.<a href="#_edn6" name="_ednref6">[vi]</a> Voice serves the same purpose as a human image, namely identification. It is an element of appearance, in that it relates to individual voice modulation, pitch, sound and the way someone speaks, i.e. intonation and characteristic words. The violation of this right could occur, e.g., by duplication of voice recordings or their modification and, moreover, by imitation of distinctive voices, if it can be demonstrated that such use was intended to deceive listeners as to the identity of the person speaking.<a href="#_edn7" name="_ednref7">[vii]</a> If one accepts that a person can be recognized by sound just as well as through an external image, the principles relating to images apply analogously to voices, provided that the voice is protected as a separate personal right.<a href="#_edn8" name="_ednref8">[viii]</a></p>
<p>In accordance with the latter provision, a person whose personal right is threatened by the activity of a third party may demand that this activity cease, unless it is lawful (art. 24 par. 1 of the Polish Civil Code). Nevertheless, there are also legal experts who would recognize the voice not as a separate personal right but rather as part of the human image, or as an «audio-image» / «sound-image» that makes it possible to identify a person by the sense of hearing. If this view is adopted, the voice is protected not only on the basis of the Polish Civil Code but also within the framework of copyright law (art. 24 par. 3 of the Polish Civil Code).</p>
<p>The system of protection of personal rights should be analysed in depth with regard to new technologies based on the use of the human voice, since new ways of using it (e.g. for online legal advice) or modifying it (e.g. in order to circumvent voice recognition technologies used by banks when executing payment orders) may not be adequately protected.</p>
<p>Under the Polish Civil Code, the conditions for legal protection of the voice as a personal right are a breach, or threat of a breach, of personal rights and the unlawfulness of such breach or threat. The person who provides his or her voice may therefore demand, among other things, that the consequences of the breach be removed and that monetary compensation be paid on this account. In this context, a controversy arises: where a person voluntarily and with consent provides a voice sample, can the element of unlawfulness be demonstrated when specific software clones the voice in an unintended manner? Accordingly, the open question is whether the means mentioned above provide sufficient protection in this respect. It appears that new technologies nowadays use the subjects of personal rights (protected by the existing legal methods) in pioneering ways, so that the effects of those activities require new concepts. For instance, applications editing an attorney&#8217;s voice (such as VoCo by Adobe) could be used to provide unfounded legal advice, in which case we deal not only with a breach of the personal right related to the voice but also with breaches related to the image, scientific activity, freedom of conscience or other implied legal consequences. On the other hand, the VoicePass technology, developed by the Polish University of Science and Technology in Cracow, which can identify our voice and verify our identity, e.g. in banks, insurance offices or public authorities,<a href="#_edn9" name="_ednref9">[ix]</a> is not only a great invention that simplifies various official procedures but also poses a potential risk to our personal data.</p>
<h4>Voice Cloning in the Light of Penal Liability (in the Polish Copyright Law)</h4>
<p>It is to be expected that manufacturers of computer programs which allow voice cloning will provide adequate protection in the form of tags, digital watermarks or other means, so that it can be shown that somebody&#8217;s voice used in bad faith has been created by the program. But what if somebody circumvents the effective technical devices applied to protect the software in order to remove digital watermarks and use somebody&#8217;s voice unlawfully? This has to be viewed as cracking, and the person responsible can be treated as a cracker or hacker. <a href="http://www.wipo.int/wipolex/en/text.jsp?file_id=129377">Polish Copyright Law</a> indicates penalties in its art. 118 para. 1, stating that «anyone who produces devices or components of devices for the purpose of unauthorised removal or circumvention of effective technical devices applied to protect a work or the subject matter of related rights against replaying, copying or reproduction or trades in such devices or components of such devices, or advertises their sale or rental, is liable to a fine, restriction of personal liberty or imprisonment for up to 3 years». In turn, para. 2 of said article sets forth that «anyone who owns, stores or uses devices or components of devices as referred to in paragraph 1, is liable to a fine, restriction of personal liberty or imprisonment for up to a year».</p>
<p>First of all, the term «effective technical devices» must be clarified: it means that the technical security introduced is objectively capable of fulfilling its function and that, without it being removed or bypassed, replaying, copying or reproducing is impossible.<a href="#_edn10" name="_ednref10">[x]</a> The problem arising from the wording of the quoted legal provision concerns computer programs and whether the provision also applies to computer programs aimed at the illegal neutralization of security. It is worth pointing out that computer programs are not devices, since, according to the Polish Language Dictionary, a device is a mechanism or a set of mechanisms performing specific actions,<a href="#_edn11" name="_ednref11">[xi]</a> meaning that devices must be material; apart from that, computer programs constitute intangible rights. In the literature, it is proposed that computer programs may, at most, be treated as components of devices.<a href="#_edn12" name="_ednref12">[xii]</a> This is a significant issue, since the removal or circumvention of the effective technical devices applied (in this case, protecting a voice cloning computer program) is usually performed by special computer programs, and the appropriate interpretation will decide whether art. 118 para. 1 applies in this regard.</p>
<h4>Voice Cloning Software and its Risks</h4>
<p>It seems that the main concern in this area relates to the use of audio recordings as evidence in court. Audio recordings can be used as valid evidence in the course of litigation under Polish jurisdiction.<a href="#_edn13" name="_ednref13">[xiii]</a> Some restrictions apply to recordings acquired illegally, but as a general rule such recordings are also admitted in court if they support reaching a fair ruling.<a href="#_edn14" name="_ednref14">[xiv]</a> Software enabling the creation of statements which sound, for example, like the defendant can pose a considerable risk to the fairness of the trial. Accordingly, one idea for protection against fake statements generated by means of voice cloning technology is adding audio watermarks to every output of such software. Digital watermarking is the process of imperceptibly embedding watermarks into digital media as a permanent sign to assure its authenticity.<a href="#_edn15" name="_ednref15">[xv]</a></p>
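To make the idea concrete, a deliberately naive sketch of least-significant-bit (LSB) watermark embedding is shown below in Python. This is purely illustrative and not taken from any of the cited sources; real audio watermarking schemes are far more robust, and the function names are invented for this example:

```python
def embed_watermark(samples, bits):
    """Embed watermark bits into the least significant bit of PCM samples."""
    out = list(samples)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit  # clear the LSB, then set it to the watermark bit
    return out

def extract_watermark(samples, n):
    """Recover the first n embedded bits from watermarked samples."""
    return [s & 1 for s in samples[:n]]

# Toy example: each sample changes by at most 1, i.e. imperceptibly.
audio = [1000, -2000, 3000, 4001, -5000, 600, 7000, -8000]
mark = [1, 0, 1, 1, 0, 0, 1, 0]
marked = embed_watermark(audio, mark)
assert extract_watermark(marked, len(mark)) == mark
```

A mark of this kind, added to every output of a voice cloning program, would let a court-appointed expert check whether a disputed recording was machine-generated, which is the protective function described above.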
<h4>Data Protection in the Light of Voice Cloning</h4>
<p>Voice cloning technology requires the obtained data to be stored, so it is also necessary to look at this new technology from a personal data protection point of view. In the European Union, this issue is regulated by a number of instruments, e.g. the Data Protection Directive (<a href="http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:31995L0046:en:PDF">95/46/EC</a>), the Electronic Communications Data Protection Directive (<a href="http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32002L0058:en:PDF">2002/58/EC</a>) and, in Poland, the Telecommunications Act of 16 July 2004 (unified text of 2016, item 1489, as amended). In spite of the tightening of personal data protection (not only internationally but also at the national level), multiple problems occur in practice, e.g. when the data controller entrusts data to countries with insufficient data protection standards. The sufficiency of data protection standards shall be assessed in the light of all circumstances surrounding a data transfer operation, in particular the nature of the data, its purpose and the duration of the proposed processing operation. According to the Polish Data Protection Act of 29 August 1997 (unified text Journal of Laws of 2016, item 922), the above doubts arise whenever personal data is transferred to a country not belonging to the European Economic Area. However, the issue of voice files as personal data requires a more extensive description and exceeds the scope of this paper.</p>
<h4>Voice Biometrics vs Voice Cloning</h4>
<p>Voice cloning technology differs from voice biometrics technology, since the latter is used to identify people by their voices. Biometrics refers to metrics related to human characteristics. Nowadays, this technology is used increasingly, especially in matters of security (e.g., at the airport, where one can choose a facial recognition system instead of the traditional way of checking in).</p>
<p>The human voice is as unique as a fingerprint. Moreover, everyone articulates sentences in an original way: emphasis is placed differently, and the rate of speech and the intonation vary. The system records and picks out all these differences, taking into account details such as the size and shape of the throat, mouth cavity and nasal cavity, and the length and tension of the vocal cords. Every recorded voiceprint is stored as a mathematical model. To avoid mistakes during voice recording, the commands are dictated by a speech synthesizer. The verification process consists of comparing samples of new recordings with previously stored ones. Such systems are currently equipped with technologies removing ambient noise and can therefore recognize voices in most cases.</p>
<p>Companies using voiceprint checks to verify their customers are also exposed to the risks of voice cloning technology. It has been claimed that biometric systems would not be tricked by it, as the features they inspect differ from what humans listen for when identifying people.<a href="#_edn16" name="_ednref16">[xvi]</a> The authors of VoicePIN, a new start-up from Poland, claim that their voice authentication product is resistant to spoofing and can detect whether a sample is original or replayed.<a href="#_edn17" name="_ednref17">[xvii]</a> If this assumption is correct, it seems reasonable that biometric systems could likewise be protected from voice cloning software. The final answer will be known only after specific tests and experiments.</p>
<p>The above remarks lead to the conclusion that voice cloning technology may give rise to new types of legal liability, both civil and criminal, for the entities applying it. Consequently, voice cloning, like any other new technology, will involve the need to amend and adjust the existing legal provisions. Nevertheless, these should not restrict the application of this technology in areas such as the provision of services, e.g. legal advice and legal translation. The said changes are particularly required in the area of administrative law when defining the authority and supervisory competence of entities protecting personal data. Moreover, an important postulate would be to precisely define the human voice as a specific personal interest. Furthermore, an unauthorised modification of such a voice by means of computer devices should be classified as a specific type of infringement of the human voice as a personal interest. Nevertheless, despite the potential risks, when properly safeguarded by legal provisions, voice cloning software can indeed positively influence the effectiveness and cost-efficiency of legal services.</p>
<p><a href="#_ednref1" name="_edn1">[i]</a> Aaron van den Oord / Karen Simonyan / Nal Kalchbrenner / Sander Dieleman / Oriol Vinyals / Andrew Senior / Heiga Zen / Alex Graves / Koray Kavukcuoglu, WaveNet: A Generative Model for Raw Audio, 19 September 2016, <a href="http://www.arxiv.org/pdf/1609.03499.pdf">www.arxiv.org/pdf/1609.03499.pdf</a> (all internet addresses last accessed 18 April 2017).</p>
<p><a href="#_ednref2" name="_edn2">[ii]</a> Official live presentation during the Adobe MAX 2016 Sneak Peeks, co-hosted by Jordan Peele, <a href="http://www.youtube.com/watch?v=I3l4XLZ59iw">www.youtube.com/watch?v=I3l4XLZ59iw</a>.</p>
<p><a href="#_ednref3" name="_edn3">[iii]</a> Sebastian Anthony, Adobe demos «photoshop for audio,» lets you edit speech as easily as text, arsTECHNICA, 11 July 2016, <a href="http://www.arstechnica.com/information-technology/2016/11/adobe-voco-photoshop-for-audio-speech-editing">www.arstechnica.com/information-technology/2016/11/adobe-voco-photoshop-for-audio-speech-editing</a>.</p>
<p><a href="#_ednref4" name="_edn4">[iv]</a> Janusz Barta / Ryszard Markiewicz / Andrzej Matlak, Media Law, LexisNexis, Warsaw 2005; Justyna Balcarczyk, The right to image and its commercialization, Oficyna Wolter Kluwer Business, Warsaw 2009, pp. 52–54; Justyna Balcarczyk, Voice right – outline of basis issues, Zeszyty Naukowe Uniwersytetu Jagiellońskiego 2010/2/115–126, LEX; Maksymilian Pazdan, Commentary on Article 23 of the Civil Code, in: Krzysztof Pietrzkowski (ed<em>.</em>),<em> </em>Civil Code. Commentary on Articles 1–449[10]<em>,</em> Volume 1, Legalis.</p>
<p><a href="#_ednref5" name="_edn5">[v]</a> Art. 23 of the Polish Civil Code dated on 23 April 1964, Journal of Laws No 16.94 as amended.</p>
<p><a href="#_ednref6" name="_edn6">[vi]</a> Art. 24 of the Polish Civil Code dated on 23 April 1964, Journal of Laws No 16.94 as amended.</p>
<p><a href="#_ednref7" name="_edn7">[vii]</a> Małgorzata Pyziak-Szafnicka / Paweł Księżak, Civil Code – Comment. General Part. Edition II. LEX, 2014.</p>
<p><a href="#_ednref8" name="_edn8">[viii]</a> Justyna Balcarczyk, The right to image and its commercialization, Oficyna Wolter Kluwer Business, Warsaw 2009, pp. 52–54.</p>
<p><a href="#_ednref9" name="_edn9">[ix]</a> Polish Press Agency, You know your neighbour by his voice, 31 March 2014, <a href="http://naukawpolsce.pap.pl/aktualnosci/news,399802,poznasz-blizniego-po-glosie-jego.html">http://naukawpolsce.pap.pl/aktualnosci/news,399802,poznasz-blizniego-po-glosie-jego.html</a>.</p>
<p><a href="#_ednref10" name="_edn10">[x]</a> Zbigniew Ćwiąkalski, Commentary on Article 118(1) of the Copyright Law, in: Barta Janusz / Markiewicz Ryszard (eds.), Copyright Law. Commentary, Volume 5, LEX no. 8545, 2011.</p>
<p><a href="#_ednref11" name="_edn11">[xi]</a> Polish Language Dictionary, <a href="http://sjp.pwn.pl/sjp/urzadzenie;2533403.html">http://sjp.pwn.pl/sjp/urzadzenie;2533403.html</a>.</p>
<p><a href="#_ednref12" name="_edn12">[xii]</a> Janusz Raglewski, Commentary on Article 118(1) of the Copyright Law, in: Damian Flisak (ed.), Copyright Law. Commentary, LEX no. 9083, 2015.</p>
<p><a href="#_ednref13" name="_edn13">[xiii]</a> Article 308 §1 of <a href="http://www.wipo.int/wipolex/en/details.jsp?id=3511">Polish Code of Civil Procedure</a> of 17 November 1964, Journal of Laws 2016.1822 as amended.</p>
<p><a href="#_ednref14" name="_edn14">[xiv]</a> Resolution of the Supreme Court of 22 April 2016, ref. no. II CSK 478/15.</p>
<p><a href="#_ednref15" name="_edn15">[xv]</a> Yiqing Lin / Waleed H. Abdulla, Audio Watermark: A Comprehensive Foundation Using MATLAB, Springer, 2014, ISBN: 9783319079745.</p>
<p><a href="#_ednref16" name="_edn16">[xvi]</a> British Broadcasting Corporation, Adobe VoCo «Photoshop-for-voice» causes concern, 7 November 2016, <a href="http://www.bbc.com/news/technology-37899902">www.bbc.com/news/technology-37899902</a>.</p>
<p><a href="#_ednref17" name="_edn17">[xvii]</a> Information given by CEO on VoicePIN in the interview for Business Insider, 28 March 2017, <a href="http://www.businessinsider.com.pl/technologie/nowe-technologie/voicepin-zabezpieczenia-biometryczne-thing-big-upc/l20w4f3">www.businessinsider.com.pl/technologie/nowe-technologie/voicepin-zabezpieczenia-biometryczne-thing-big-upc/l20w4f3</a>.</p>


<p>The article <a href="https://www.kg-legal.eu/info/it-new-technologies-media-and-communication-technology-law/voice-cloning-as-a-global-new-technology-and-its-challenges-for-eu-and-polish-law/">Voice Cloning as a Global New Technology and its Challenges for EU and Polish Law</a> first appeared on <a href="https://www.kg-legal.eu">KIELTYKA GLADKOWSKI LEGAL | CROSS BORDER POLISH LAW FIRM RANKED IN THE LEGAL 500 EMEA SINCE 2019</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.kg-legal.eu/info/it-new-technologies-media-and-communication-technology-law/voice-cloning-as-a-global-new-technology-and-its-challenges-for-eu-and-polish-law/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
