Looks like Google’s brand-new Bard image generation software is going to keep even more IP lawyers gainfully employed this year…
Not every image looks like this, but after a few minutes of experimentation by a friend it's pretty clear that (a) Bard makes fantastic images, (b) it is drawing on copyrighted sources, and (c) it is as capable as the other systems Reid Southen and I looked at of returning derivative artwork that appears to infringe without attribution.
Results from “draw a videogame plumber”, “draw a videogame hedgehog”, and “draw a yellow 3d cartoon character with goggles” below:
Lawyers in the audience will no doubt have already taken note of Alphabet’s deep pockets.
Update: Minutes after I posted the above images here and on X, an MSc student in Uruguay, Santi Góngora, published a stunning extension to the observations above:
comparing Bard's output to the real character, drawn from the video game Resident Evil.
Gary Marcus is standing firmly by his prediction that 2024 will be the year of Generative AI litigation.
The majority of people producing copyrighted material are independent artists, and they will be the ones disproportionately affected by AI theft. As Gary Marcus and the comments below note, with the well-known players it's rinse and repeat: overreach, get sued, settle, and do it again.
I noticed this in the Bard email today: “Using a technology called SynthID, all unique images generated on Bard will have embedded watermarking to indicate that it was created by AI. The watermark is directly added into the pixels of an AI-generated image, meaning it’s imperceptible to the human eye but it can still be detectable with SynthID. It’s important that we approach the creation of images with AI responsibly.”
So the output is stamped with a watermark, but the input is taken scot-free (so far). Interesting.
"Being able to identify AI-generated content is critical to promoting trust in information. While not a silver bullet for addressing the problem of misinformation, SynthID is an early and promising technical solution to this pressing AI safety issue."
https://deepmind.google/technologies/synthid/
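For intuition about what "directly added into the pixels... imperceptible to the human eye but still detectable" can mean, here is a toy sketch. To be clear, this is *not* SynthID's actual algorithm (which Google has not published); it's a classic least-significant-bit scheme, shown only to illustrate how a signature can hide in pixel values while changing each one by at most 1:

```python
def embed_watermark(pixels, signature):
    """Write signature bits into the least significant bit of each pixel.
    `pixels` is a flat list of 0-255 ints; `signature` is a bit string.
    (Toy LSB scheme for illustration -- not SynthID's real method.)"""
    out = list(pixels)
    for i, bit in enumerate(signature):
        # Clear the LSB, then set it to the signature bit:
        # each pixel value moves by at most 1, invisible to the eye.
        out[i] = (out[i] & ~1) | int(bit)
    return out

def detect_watermark(pixels, signature):
    """Check whether the signature bits are present in the pixel LSBs."""
    return all((pixels[i] & 1) == int(b) for i, b in enumerate(signature))

image = [200, 13, 77, 154, 9, 240, 33, 101]   # fake 8-pixel "image"
marked = embed_watermark(image, "10110010")
print(detect_watermark(marked, "10110010"))            # True
print(max(abs(a - b) for a, b in zip(image, marked)))  # 1
```

A real system like SynthID presumably embeds the signal far more robustly (surviving crops, compression, and edits, where naive LSB marks do not), but the basic trade-off is the same: the mark must be invisible to humans yet reliably machine-readable.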
Hey Google – if you're so smart, how about developing a system for identifying where your inputs *and* outputs came from (even when relatively morphed) – some kind of end-to-end IP chain-of-custody (CoC) technology? IPCoC?... Now *that* would be cool... cooler than more Mario look-alikes...