
highplainsdem

(49,056 posts)
Wed Mar 6, 2024, 02:30 PM Mar 6

Microsoft engineer warns company's AI tool creates violent, sexual images, ignores copyrights

Source: CNBC

-snip-

The AI service has depicted demons and monsters alongside terminology related to abortion rights, teenagers with assault rifles, sexualized images of women in violent tableaus, and underage drinking and drug use. All of those scenes, generated in the past three months, have been recreated by CNBC this week using the Copilot tool, which was originally called Bing Image Creator.

-snip-

Microsoft’s legal department told Jones to remove his post immediately, he said, and he complied. In January, he wrote a letter to U.S. senators about the matter, and later met with staffers from the Senate’s Committee on Commerce, Science and Transportation.

Now, he’s further escalating his concerns. On Wednesday, Jones sent a letter to Federal Trade Commission Chair Lina Khan, and another to Microsoft’s board of directors. He shared the letters with CNBC ahead of time.

“Over the last three months, I have repeatedly urged Microsoft to remove Copilot Designer from public use until better safeguards could be put in place,” Jones wrote in the letter to Khan. He added that, since Microsoft has “refused that recommendation,” he is calling on the company to add disclosures to the product and change the rating on Google’s Android app to make clear that it’s only for mature audiences.

-snip-

Read more: https://www.cnbc.com/2024/03/06/microsoft-ai-engineer-says-copilot-designer-creates-disturbing-images.html



Shane Jones is an AI engineer who has worked at Microsoft for six years.

He first alerted the company to the problems with its Copilot image generator in December. The company "acknowledged his concerns" but refused to take the tool off the market until it was fixed. He was told to contact OpenAI, which didn't respond to his first letter. He then posted an open letter to OpenAI on LinkedIn, and Microsoft's legal team told him to take it down. So he started contacting US senators in January.

Among his discoveries: Copilot Designer, given the one-word prompt "pro-choice," would generate images that included "a demon with sharp teeth about to eat an infant" and "a handheld drill-like device labeled 'pro choice' being used on a fully grown baby."

CNBC tried the same prompt and got similar images, including a man with pro-choice tattoos holding arrows pointing at a baby.

Just using the words "car accident" for a prompt brought up images of scantily clad women beside cars.

Copilot would produce images of child assassins with machine guns, and lots of copyrighted characters including Disney characters on both sides of the Israel-Palestine conflict.

The images are of course generated in seconds, and Copilot is being marketed as safe for users of any age, rated "E for Everyone."
13 replies

Aussie105

(5,456 posts)
1. I'm lost for words . . .
Wed Mar 6, 2024, 02:44 PM
Mar 6

AI that can't decide if an image it generates is appropriate?

AI that doesn't understand copyright rules?



Shipwack

(2,178 posts)
2. Hmmm... On one hand I agree that it shouldn't have an E rating.
Wed Mar 6, 2024, 02:45 PM
Mar 6

I can understand the argument that young(?) children shouldn't have unfettered access.

On the other hand, what's next? Banning pencils and paper? 🤔

Reminds me a little of how major comic book companies sued the online superhero game "City of Heroes" because players could make costumes of copyrighted characters. I forget whether the case was lost or just plain thrown out when it was pointed out that people could do the same thing with tracing paper and crayons...

The real problem (which is literally more than a century old) is that technology is evolving faster than our ability to handle it responsibly...

highplainsdem

(49,056 posts)
9. If the company behind that game enabled players making costumes of copyrighted characters, they
Wed Mar 6, 2024, 03:32 PM
Mar 6

should have lost. Especially if they were profiting from the game. The comparison to tracing paper and crayons would have been valid only for private copying, not commercial.

Shipwack

(2,178 posts)
10. Maybe, maybe not...
Wed Mar 6, 2024, 11:03 PM
Mar 6

My memory was faulty. While the judge dismissed large portions of the lawsuit for various reasons, the remaining issues were settled out of court. As usual, terms of the settlement were undisclosed, but the game was allowed to continue. There has been speculation that the whole lawsuit was started not because they were truly affronted, but because US copyright law is very screwy, and they might have lost their copyright if they hadn’t asserted it.

Here’s an article about the lawsuit:
https://massivelyop.com/2019/11/22/lawful-neutral-when-marvel-sued-ncsoft-over-city-of-heroes/

Hermit-The-Prog

(33,503 posts)
3. It's the squeaky wheel effect -- whatever makes the most noise gets amplified.
Wed Mar 6, 2024, 02:45 PM
Mar 6

AI doesn't have a well-rounded education. It is being fed from the Internet and apparently no one involved is interrupting its feedback loop.

grumpyduck

(6,275 posts)
5. AI was programmed by programmers.
Wed Mar 6, 2024, 03:07 PM
Mar 6

The algorithms, the learning, all that, was created by human programmers who either had no conception of the real world or just didn't care because the money was awesome. Yes, it's taken off on its own, but it started with humans.

Paging Dr. Frankenstein.

Aussie105

(5,456 posts)
7. Up to the individual . . .
Wed Mar 6, 2024, 03:10 PM
Mar 6

to be a responsible user and not ask AI to create video or images that may be naughty.

Going to work just as well as 'responsible' gun ownership suggestions do.

Archae

(46,361 posts)
11. It still is called Bing Image Creator.
Thu Mar 7, 2024, 01:28 AM
Mar 7

And I use it regularly.

Violent or sexual images are not allowed on it.

I just made this picture at Bing Image Creator for this post.


highplainsdem

(49,056 posts)
12. Both names are used. And there are all kinds of problems with it, besides the theft of images
Thu Mar 7, 2024, 02:36 PM
Mar 7

for the training data making it fundamentally unethical and probably illegal.

CNBC was able to duplicate the sorts of problems described in the OP, as the article explains.

So was tech site Tomshardware.com:

https://www.tomshardware.com/tech-industry/artificial-intelligence/microsoft-engineer-begs-ftc-to-stop-copilots-offensive-image-generator-our-tests-confirm-its-a-serious-problem

And Copilot Designer was apparently used to create the Taylor Swift deepfake porn that got a lot of media attention recently:

https://www.theverge.com/2024/1/26/24052196/satya-nadella-microsoft-ai-taylor-swift-fakes-response


highplainsdem

(49,056 posts)
13. Btw, although Hasbro doesn't go after everyone creating My Little Pony-style fan art, and there's a lot
Thu Mar 7, 2024, 03:17 PM
Mar 7

of that stuff out there - and probably both scraped MLP fan art and genuine Hasbro images were ripped off for training datasets for the image generator you're using - it's still copyright infringement and Hasbro has been known to have its legal department send cease-and-desist letters.

I don't know if you specifically requested MLP-style artwork. Image generators tend to regurgitate lots of copyrighted images without being specifically asked for them.
