I know a lot of people want to interpret copyright law so that allowing a machine to learn concepts from a copyrighted work counts as copyright infringement, but I think people need to consider what that would actually do: keep AI out of the hands of regular people and place it squarely in the hands of the people and organizations wealthy and powerful enough to train it for their own use.
If this isn’t actually what you want, then what’s your game plan for placing copyright restrictions on AI training that will actually work? Have you considered how it’s likely to play out? Are you going to be able to stop Elon Musk, Mark Zuckerberg, and the NSA from training an AI on whatever they want and using it to push propaganda on the public? As far as I can tell, all that copyright restrictions will accomplish is to concentrate the power of AI (which we’re only beginning to explore) in the hands of the sorts of people who are the least likely to want to do anything good with it.
I know I’m posting this in a hostile space, and I’m sure a lot of people here disagree with my opinion on how copyright should (and should not) apply to AI training, and that’s fine (the jury is literally still out on that). What I’m interested in is what your end game is. How do you expect things to actually work out if you get the laws that you want? I would personally argue that an outcome where Mark Zuckerberg gets AI and the rest of us don’t is the absolute worst possibility.
You have not clearly defined the danger. You just said “AI is here”. Well, lawyers are here too, and they have the law on their side. AI also threatens their business model, so they will probably show no mercy and will work on this full time.
Wealthy and powerful corporations fear the law above all else. A single parliament can shut down their activity more effectively than anyone else on the planet.
Maybe you’re speaking from the point of view of a corrupt country like the USA, but the EU Parliament, which BTW doesn’t host any of the GAFAM companies, is entirely ready to come down hard on businesses built on AI.
See, people don’t want to lose their jobs to a robot, and they will fight for them. That creates a major threat to AI: people destroying data centers. They will do it. Their interests will converge with those of the people who care about global warming. Don’t treat AI as something inevitable. AI depends heavily on resources, generates unemployment and pollution, and delivers questionable value.
An AI requires:
Energy
Water
High-tech hardware
Network
Security
Stability
Investments
It’s like a nuclear power plant but more fragile. If an activist group takes down a data center hosting an AI, who will blame them? The jury will take turns high-fiving them.
Wow, you have this all planned out, don’t you?
If that’s what Europe is like, they’ll build their data centers somewhere else. Like the corrupt USA. Again, you’ll be taking away your access to AI, not theirs.
I don’t think the EU is so lawless as to allow blatant property destruction, and if it is, I can’t imagine such a lack of rule of law will do much for the EU’s future economic prosperity.
I’m probably just a dumb hick American though.
That economic prosperity came bundled with an ecological debt caused by the overuse of oil. Oil is cheap and makes everything cheap. Remove oil and everything increases in price. The “prosperity” is behind us now. I don’t see what an AI like the one described above would bring in terms of prosperity.
There is a debate in France about the morality of breaking the law when protesting against global warming. And a data center falls squarely within the purview of the people fighting against global warming.
We should not take order for granted. Keep in mind that the temperature will ramp up slowly each year, destroying our environment a little bit more each year. When the time for sacrifices comes, I bet AI will be very high on the list.
Why do you think people will build data centers in Europe when they can build them elsewhere?
You tell us, I don’t know. All I know is that when a data center requires more water than the environment can provide, there will be conflicts over water, and the people living nearby will protest. And the most active among them will pull the plug at night, or funny stuff like that. Data centers are fragile things.
As the technology improves, data centers that run AI will require significantly less cooling. GPUs aren’t very power-efficient for doing AI stuff because they have to move a lot of data around from their memory to their processor cores. There are AI-specific cards being worked on that will allow the huge matrix multiplications to happen in place without that movement happening, which will mean drastically lower power and cooling requirements.
Also, these kinds of protestors are the same general group of people who stopped nuclear power from becoming a bigger player back in the 1960s and 70s. If we’d gone nuclear and replaced coal, we almost certainly wouldn’t be sitting here at the beginning of what looks to be a major global warming event that’s unlike anything we’ve ever seen before. It wouldn’t have completely solved the problem, but it would have bought us time. An AI may be able to help us develop ideas to mitigate global warming, and it seems ridiculous to me to go all luddite and smash the machines over what will be a minuscule overall contribution to it given the possibility that it could help us solve the problem.
But let’s be real here; these hypothetical people smashing the machines are doing it because they’ve bought into AI panic, not because they’re afraid of global warming. If they really want to commit acts of ecoterrorism, there are much bigger targets.
I can’t believe that you are blaming the green people! Those people are the ones who consume the least and who begged you to consume less. Did you do it? No, you didn’t. Had people like you listened, we wouldn’t be in our current situation. You wanted the ultimate comfort no matter what, and you listened to nothing. We’ve been talking about the greenhouse effect since the last century.
You will never move a boat with nuclear power, you will never move an airplane with nuclear power, you will never fertilize a field with nuclear power. Stop dreaming.
A short-sighted view of the problem. First, there is not enough uranium for everyone.
Second, nuclear power is reserved for stable countries.
Third, there is no uranium in the EU, making it yet another tool for pressuring countries.
HAHAHA!
“The AI will save us!”
Eat less meat! How hard is that to compute? So turn off your stupid AI and eat less meat. Do it now, stop eating meat.
You know exactly what to do, you just DON’T WANT TO DO IT BECAUSE YOU ARE LAZY AND ADDICTED TO COMFORT.
If you won’t do what tens of thousands of scientists are telling you to do right now, then you will never do what a robot tells you to do. Your face when the AI tells you to stop eating meat: “But this is not possible, we can’t do this, the AI is wrong! We need a bigger AI!!”
omg, the denial.
Like the tires of your car.
I assume you haven’t heard of aircraft carriers and nuclear submarines.
Also, electricity from nuclear plants can be stored in batteries and capacitors and then used to move electric vehicles (including boats, planes, and tractors), so I don’t know what the hell you’re even talking about.
I’ve actually cut my meat consumption way down.
That being said, a person using AI consumes an absolutely minuscule amount of power compared to a person eating a steak. One steak (~20 kWh) is equivalent to about 67 hours of full-time AI usage (300 W for an NVIDIA A100 at max capacity), and most of the time a person spends using an AI is spent idling while they type and read, so realistically it’s a lot longer than that.
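Here’s a quick back-of-the-envelope check of that arithmetic, assuming the ~20 kWh-per-steak and 300 W A100 figures above (both rough estimates, not measurements):

```python
# Rough sanity check of the steak vs. GPU comparison above.
# Both inputs are the assumed figures from the comment, not measured values.

steak_energy_kwh = 20.0   # assumed energy cost of producing one steak, in kWh
gpu_power_kw = 0.300      # assumed draw of one NVIDIA A100 at max load, in kW

hours_per_steak = steak_energy_kwh / gpu_power_kw
print(f"One steak ≈ {hours_per_steak:.0f} hours of a fully loaded A100")
# -> One steak ≈ 67 hours (and far more in practice, since the GPU idles
#    while the user types and reads)
```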
Again, your hypothetical data center smashers are going after AI because they hate AI, not because they care about the environment. There are better targets for ecoterrorism. Like my car’s tires, internet tough guy.
You are talking about military equipment. I’m talking about trade. There are more than 100k cargo ships today.
You will never, ever, EVER make 100k cargo ships move on battery power.
The latest buzzword.