{
  "target_layers": "10,20",
  "transform_layers": "-1",
  "lorra_alpha": 10.0,
  "trainsets": null,
  "valsets": null,
  "full_layers": false
}