!csFYqQwEzkhBXOMzLF:matrix.org

Offtopic

127 Members
libregaming.org | osg - Let's talk about everything except libre gaming!
7 Servers



29 Mar 2023
[23:31:46] Rampoina: also it contains code from github licensed as MIT
[23:32:00] Rampoina: which isn't copyleft
[23:32:35] Rampoina: and public domain works and others
[23:32:50] Rampoina: the training dataset, I meant
[23:44:44] L29Ah: sure it does apply
[23:46:43] L29Ah: people take down torrent indexer web sites that contain hashes of copyrighted material because these are derived from the copyrighted material using some algorithm; it's no different in the case of neural network weights
30 Mar 2023
[00:08:35] Rampoina: I don't think that applies legally; I don't think the weights are considered a derivative
[00:08:46] Rampoina: but I'm not a lawyer
[00:08:59] Rampoina: and neither are you, as far as I know
[00:09:53] Rampoina: in fact, as I said earlier, I'm pretty sure that whether weights are copyrightable at all is yet to be determined
[00:11:12] Rampoina: in any case, as already mentioned, the training dataset contains more than copyleft-licensed works
[01:15:22] poVoq: Yeah, it might be that copyright gets adjusted somehow, but right now it looks like these trained models cannot be considered derivative works, as they don't actually contain any of the training data.
[10:49:26] L29Ah: poVoq: do gz-compressed files contain any of the compressed data?
[10:50:34] poVoq: Of course they do, but that's a false analogy
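
The gzip question, at least, has a definite answer: gzip is lossless, so the compressed file fully determines the original bytes. A quick check in Python (the sample input is arbitrary):

    import gzip

    original = b"an arbitrary sample of bytes"
    compressed = gzip.compress(original)

    # Lossless round trip: every byte of the original comes back.
    assert gzip.decompress(compressed) == original
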
[10:51:00] L29Ah: poVoq: do JPEG-lossy-compressed files contain any of the compressed data?
[10:51:39] poVoq: Same. A ML model doesn't do compression
[10:52:03] L29Ah: a ML model does lossy compression
[10:52:17] poVoq: No, you are wrong
[10:53:16] L29Ah: you literally adjust numbers to produce a closer match to the training data set, the same as in some other lossy compression algorithms
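
To make L29Ah's description concrete: training does mean nudging numbers to better match the training set. A minimal gradient-descent sketch (model, data, and learning rate are invented for illustration):

    # Fit y = w * x to a tiny training set by repeatedly adjusting
    # the single weight w to reduce the mismatch with the data.
    data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # invented (x, y) pairs

    w = 0.0    # the "weight"
    lr = 0.01  # learning rate
    for _ in range(1000):
        # Gradient of the mean squared error with respect to w.
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad  # nudge w toward a closer match

    print(w)  # approaches 2.0, the value that best reproduces the data

Whether that optimization amounts to "compression" of the dataset or to forming rules about it is exactly what the two sides go on to dispute.
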
[10:53:49] poVoq: It forms rules about how something should look. That is often compared to compression algorithms, as they do have some conceptual similarities, but in the end it isn't the same.
[10:54:06] L29Ah: poVoq: do SHA1 hashes contain any of the hashed data?
[10:54:36] poVoq: Still a false analogy
[10:54:41] emorrp1: L29Ah: I agree that intuitively they should be considered derivative works, but here's a thread from much more knowledgeable people arguing that if something is sufficiently "transformative" then that no longer applies:
[10:54:55] L29Ah: poVoq: yes or no?
[10:55:01] poVoq: No
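
poVoq's "No" is verifiable in the same way as the gzip case: a SHA1 digest is a fixed 20 bytes however large the input, so the hashed data is not in general recoverable from it, and yet, as L29Ah noted earlier, such hashes are still treated as grounds for takedowns. A quick sketch:

    import hashlib

    # The digest is always 20 bytes (40 hex chars), however big the
    # input, so the original data cannot be reconstructed from it.
    digest = hashlib.sha1(b"any amount of input data").hexdigest()
    print(digest, len(digest))
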
[10:57:04] emorrp1: https://lists.debian.org/debian-project/2023/02/msg00017.html
[10:57:07] poVoq: Your brain isn't a derivative work of the Matrix movie, even though you likely know the story and remember some scenes 😅
[17:23:05] L29Ah left the room.
[19:41:59] L29Ah joined the room.
[19:42:15] L29Ah: https://s3.eu-central-1.wasabisys.com/lor-sh/lor-sh/cache/media_attachments/files/110/113/823/481/865/894/original/1f7905b8f0b82079.jpg


