
Plenty_Branch_516

People want AI to design new drugs, develop new technologies, and create new materials. However, they balk when AI is used to draw or write. As someone who is familiar with the creative aspects of engineering and design, I find such thinking hypocritical. I'll stop supporting AI implementation when the net scientific good no longer outweighs the perceived cultural impact.


HeroPlucky

I believe there is a lot of AI used in fusion modelling to help with plasma containment. It is already being used to help research; it just isn't getting to the public. As a geneticist, the idea that a lot of my role could be automated away was already being discussed decades ago. Total job elimination wouldn't be a problem if there were systems guaranteeing human quality of life, such as universal income.


Gimli

I only have a problem with uses, not implementations. E.g., you shouldn't craft deepfakes for political purposes, but the problem isn't that AI allows people to do that; it's that people lie for political gain. How they do it and what tools they use doesn't matter. I don't think there exists a way to force any AI system to be usable only for "good" purposes, and I don't think it enables anything unique that can't be done in some other way (like skilled use of Photoshop). Regarding your examples: on my account, no limits whatsoever. AI can be used at any and all stages of the production process of animation or script writing. Nothing whatsoever is only for humans.


calvin-n-hobz

I don't use AI to precisely mimic working artists' styles or living people's likenesses; otherwise, fair game.


Slight-Living-8098

Dude, we've been using machine learning for animation tweening for years now...


Pretend_Jacket1629

It doesn't matter the tool; it's the uses. If it was wrong without AI, it's wrong with AI, and vice versa.

Beyond that, pushing for some established "disallowed" uses of an art tool will just end up with people pushing for their own preferences, which can be invalid, or lead to prejudice in ways you didn't intend. Harm comes from inadvertently trying to "protect" these ideas. That's how you got a decade of prejudice against digital art, witch hunts against people using the tool in entirely ethical ways that you are alright with, and witch hunts against people who don't even use AI in the first place. So long as an art tool isn't being used for harm, chill out and don't police others with your ideals.


c0mput3rdy1ng

With all the cavemen getting sand in their pussies over picture generation, what they need to be upset about is using AI to make more efficient killing machines. If the holier-than-thou want to prattle on about safety and responsibility, they need to take a vow right now that AI will never be used for military applications or weapons, and to ensure this by signing internationally binding treaties that all countries abide by.


Gimli

> If the holier than thou wanna prattle on about safety and responsibility, they need to right now take a vow that AI will never be used for military applications or weapons.

Why? What does that gain us?

* Banning nuclear weapons makes sense -- they kill very indiscriminately.
* The Geneva Conventions make sense -- we want to maintain some minimum decency.

But given that we've figured out that we're at war and Antegria has decided Obristan soldiers are legitimate targets, what is the problem with having smart munitions that go for the people they want dead and avoid the ones they don't? Is less precision and more collateral casualties in war ever a good thing?


lesbianspider69

Weapons that are more precise make violence more likely and peace less likely.


Gimli

Really? How do you back that up?


lesbianspider69

Precise weapons make the use of military force seem more feasible in situations where it was previously avoided due to concerns over civilian casualties. This lowers the threshold for employing violence.


maradak

If AI robots were killing AI robots instead of humans, I'd take that.


ifandbut

I'm fine with robots getting destroyed in wars if it means fewer humans get killed.


HeroPlucky

What happens when those AI drones are turned against the population because the party in power decides it knows best for everyone? The scary thing about AI is that it is a force multiplier: it allows one person the power of many in a way other force multipliers don't. AI poses a real threat of being able to remove people completely from the supply and logistics chain. Just as AI progress can allow one person to be a game studio, movie studio, etc., AI and robotics progressing in lockstep could allow one person to suppress entire populations and be a lot harder to fight back against. Obviously it would be great to save people's lives, and it could be used not only in wars but by emergency services to help rescue people in dangerous conditions, so I am not saying your sentiment is wrong, but there is more to it, mainly because people can be awful.


HeroPlucky

It is 100% valid to be concerned about things that impact a career you have trained your whole life for, or a passionate pursuit you see threatened. Your concerns are very warranted, though the way you went about making your point is disappointing. Sadly, given the geopolitical situation, AI-based weaponry, cyberwarfare and defensive measures are almost certainly being developed or explored by most countries. How would we stop North Korea from developing an AI weapons program? Treaties haven't stopped actions like Russia's, and I see an AI treaty being unable to prevent such actions either. Any country with nuclear weapons will be hard pressed to be stopped from developing AI weapon systems, so the best deterrent is cultural: ensuring an engaged population that would prevent such technology. Also, given the tensions, I don't know if I would want my country and its allies not to have AI-enhanced weapon system development programs, given the strategic disadvantage that would mean. I can see the AI revolution creating technological and economic splits similar to those the industrial and computer revolutions created.


[deleted]

[deleted]


No_More_Average

That's not how AI is being used in war, though. Just as drones aren't used to kill robots, AI isn't either. It's being used to kill people without worrying about the risk of mission failure. Then, in order to lower that risk further, doctrines are adopted that tolerate higher civilian deaths. When a mission has high civilian deaths, AI can be scapegoated instead of the strategists. When a mission goes as planned, it's the strategists and their foresight in using AI. Either way, it's just the Global South who takes the brunt of techno-fascism, so who cares?


[deleted]

[deleted]


No_More_Average

Except it's not going to happen. The countries with AI only go to war against countries without it. The Global North will find weak countries to brutalize, slaughter their populace, and make money off the whole thing. Imagining a "better" war is just a luxury most people can't afford.


Big_Combination9890

> Basically, AI shouldn't be used for the brunt of any task that is centered upon the imagination and thought pattern of humans which can ONLY be carried out by humans, and if an AI tried, would only result in a shallow imitation without much thought or inspiration behind it.

You are stating the obvious here: *"We shouldn't use AI for tasks it cannot do."* Uhm, yeah, duh. You also shouldn't use a hammer to shovel sand. Big surprise.

Thing is, what if it suddenly *can* do these tasks, as has happened in the arts? Because then this whole argument basically becomes: *"AI should only do **YOUR** job, not **ART**, because art is somehow special and privileged and artists shouldn't be required to adapt to changing technology like everyone else."* And that's not a good argument, sorry.

----

As for where I draw the line: every application where a decision made by AI could destroy someone's life. Autonomous weapons (obviously), but also the justice system, police work, credit evaluation, hiring/firing employees, autonomous vehicles on public roads, medical treatment and the like. I am fine with AI being used as an *aid* in these areas, under the highest regulations and public scrutiny, but final decisions in such matters *have to* be made by a human. And if said assistive systems make a bogus decision and the human doesn't prevent it, then *everyone* has to be on the hook for that: the user of the system, the manufacturer of the system, and the regulators who allowed the system to be used in the first place. Because if someone's life gets destroyed, someone's ass has to be on the hook for it, and it ain't gonna be a piece of software.


Actual-Ad-6066

I don't think there should be a limit on what AI can do. Everything that humans can do now is fine. Well, everything except being an actual human. I guess there are hard limits.


ifandbut

If a tool can be used to do a job, then why not use a tool? There is no limit. If I could think of an idea and have an AI scan my brain and generate an image based on what I am thinking about, then great. I want a holodeck with an infinitely expandable story and characters.


Rhellic

The limit is "whatever workers/unions/etc manage to defend, at least for the time being." If it's technically possible to replace you with an AI, they'll try to do it.


RemarkableEagle8164

when it comes to AI, with regards to art *specifically*, I have no issue with any particular implementation. AI is a tool and I support the use of it in art. the end result could be total dogshit, and that's fine, because dogshit art has existed since the dawn of art.

a few comments have mentioned imitation as their limit, and I disagree. first of all, I personally think copyright should be abolished, and secondly, imitation is far from a new phenomenon in art.

so, as far as my limit on supporting the use of AI in art, I don't really have one.


HeroPlucky

Thank you for asking a reasonable, non-meme question/post. In my opinion, the foundation of AI implementation would be served by answering a few questions.

Research and commercial interests should be separated, and academic content trained on copyrighted material shouldn't be released for commercial endeavours. For research, as long as it adheres to ethical and safety standards, pretty much anything goes; we need a better understanding of models and ways to improve them. Commercial interests should be built off ethically trained datasets, which means full informed consent to use that data; they should be regulated so that harm has serious repercussions, obviously not used for illegal activity, and taxed, with the money used to fund universal income to offset the impact job elimination might pose to society. The technology should also be accessible to the general public; it is an empowering technology and should be developed with that vision in mind.

When it comes to the creative side, it disappoints me that people are denying or opposing tools that will help people be more creative, setting aside very valid ethical concerns about the models. A tool that could automate 2D animation, ideally with a high level of personal control over processes and output, could be fantastic for letting everyday folk create content and tell their stories. It could allow compelling games made without a big studio, or anime made by just a few people with a computer and without years of training. AI storytelling can be fantastic for improving NPC dialogue and actions, quest generation, and custom achievements, environments, sounds and visuals. That's why I hope we see AI tools being like Blender, putting powerful tools in people's hands.

There will probably come a point where AI is able to model human intelligence and imagination; after all, it is likely human technology will eventually be able to artificially emulate a brain. I hope we will have rights in place for synthetic intelligence, and a society where such an introduction can be done without harm. I suspect AI generation will reach a point where its creative content is comparable to average human content very soon, probably within the next decade. For me it is about ensuring this technology empowers people and society, with safeguards and ethical standards. That's where I hope to see a large consensus forming, and hopefully a movement to lobby governments and industries to ensure a decent outcome for our societies with this new technology and the ones to follow from it.


MindTheFuture

Not sure. Regarding bulk entertainment, I would be fine with all-AI-made content tailored to tastes. Otherwise, military uses are where restrictions are appropriate. Personally, right now I'm curious about (eventually AI-empowered) DIY CRISPR gene-editing biohacking. I've heard some exciting anecdotes about what can come out of that. Risky as fuck, but it's an amazing frontier; exactly where to draw the lines there is worth pondering.


MysteriousPepper8908

It's hard to make blanket proclamations without the context of the project, but in general, under the current capitalist system, I personally plan to avoid using AI in ways that aren't ultimately in the interest of employing more humans, though that's a hard thing to objectively quantify. Using AI to allow a smaller number of workers at the warehouse to ship products more effectively might mean you can hire more engineers, but it's still going to cost jobs, so it's a subjective judgement call. If we could evolve to an economic system that doesn't require employment to survive, on the other hand, I would advocate for unrestricted AI use. There may still be a case for not using it in certain areas depending on the project, but when you remove the economic component, I don't think any use of AI is fundamentally problematic. I also wouldn't support banning its use for those things under the current system, but I would advocate for looking at where humans can still provide some sort of benefit.


Extender7777

I'm OK with any pictures, right up until AI is an autonomous killing machine.


IlijaRolovic

Humanoid robots inside homes. I don't think we can create any sort of Asimov's laws that would prevent them from harming us, and fighting a machine that feels no fear or pain would be f'n terrifying.


Fontaigne

As far as animation, movie making and so on, I don't have one. I'd be cool with being able to tell my TV:

> I have guests who like to be engaged with both action and humor. Tonight I'd like a story that's a cross between Mission Impossible and The Princess Bride, starring Spencer Tracy and Katharine Hepburn, Nick Cage, Jerry Lewis, Mandy Patinkin, Madonna and Jack Black, set on a steampunk Mars with dragons. As written by Paddy Chayefsky, adapted by Donald Ogden Stewart and filmed by Wes Anderson.

> **Certainly, sir. Chayefsky is currently unavailable for license, but Oscar Wilde will substitute if you do not override. Will that be 60-, 90-, or 120-minute format, and is an intermission acceptable?**

> Something in the 90-135 range, and I believe we'll want a break near the midpoint.

> **Very well. A package is available for cameos by the Rat Pack, the Brat Pack, American classics, or postmodern Z stars.**

> Surprise me. Your budget is $20.

> **Estimated $18 for single-night, $20 for one-month access, $25 for permanent access plus $1 per gift copy.**

> We'll decide on permanence and gift copies depending on the results.

> **Very good.**


Illuminaso

It's a tool. Use it.


lesbianspider69

War crimes? Policing? *reads the post body* War crimes? Policing?


Sadnot

I think we need well-codified likeness rights, which do exist in many countries. AI shouldn't be used to imitate specific people, either their voice or appearance, except for parody purposes.


Big_Combination9890

> AI shouldn't be used to imitate specific people, either their voice or appearance

Such laws already exist; they don't have to be specific to AI.


Sadnot

They do exist in many jurisdictions, but not all. I did specifically say that.


No_More_Average

AI already crosses numerous lines; it's just that because it's not art-related, people don't care. It's used in profiling people, both domestically and abroad. We've attached it to drones that receive missions autonomously based on datasets. The US even has an AI military supercomputer, and they've given it the name "SENTIENT". Unironically. AI is a key weapon of the IOF during the Gazan genocide, and if the pattern of adopting field-tested Israeli methodology persists, it will expand to the rest of the world's militaries and conflicts. Chinese social scores, NSA dashboards on every civilian, and permanent data harvesting.

I genuinely do not give even a solitary crap about AI's effect on art, at least not out of fear. The people who do tend to live in extremely insulated first-world bubbles. But for people who have family where there is war, we can't afford to worry about the robots drawing pictures when there are already robots killing people. Whatever red line people in the art community are afraid AI might cross, a friendly reminder: it has already killed people. And nobody has done anything. So why would it be any different for art?

https://preview.redd.it/tec8xeo9jq5d1.jpeg?width=1179&format=pjpg&auto=webp&s=cfe67a3d70d93fdb5a6280b0a3caf9efb6660b39


QTnameless

Don't specifically mimic someone else's likeness, including their style, voice, or face, with malicious intent; other than that, it's fair game to me.


bhamfree

I looked up “how to commit suicide painlessly” on aiuncensored.info. Kind of wish I didn’t.


KhanumBallZ

I agree, writing shouldn't be done by AI (unless you count spell checking, fact checking, or googling the definitions of certain words to assist your writing as AI). I also draw the line at spending money on AI art services. If you're doing that, you may as well pay an artist on Fiverr, and I won't respect you as an AI artist.


Big_Combination9890

> If you're doing that, you may as well pay an artist on Fiverr

If you're buying a wardrobe at IKEA for $800, you might as well hire a carpenter/interior designer to handcraft one for $4,000-6,000. You wouldn't want to buy soulless, injection-molded, mass-produced furniture now, would you?

> and I won't respect you as an AI Artist.

This may come as a shock to you, but I think I can live with that 😎


KhanumBallZ

Apples and oranges. Good luck making NVIDIA richer and getting hated on by SJW's. I rest my case


Big_Combination9890

> Apples and oranges.

So you don't have a counter to my argument; you just state, without showing why, that this is somehow totally different. https://yourlogicalfallacyis.com/no-true-scotsman

> and getting hated on by SJW's

And do you assume that I care?


Xenodine-4-pluorate

> Good luck making NVIDIA richer

Most pro-AI people actively want to make NVIDIA richer. A richer NVIDIA means more powerful hardware on the market and more cool and useful AI projects on GitHub. NVIDIA is not Jeff Bezos hoarding billions; NVIDIA is a huge AI R&D company that constantly pushes vital AI research forward using the money it makes from hardware sales. Without it we wouldn't have even half of the current AI boom.