06-05-23, 04:58 AM   #5
Skybird
Soaring
 
 
Join Date: Sep 2001
Location: the mental asylum named Germany
Posts: 40,496
Downloads: 9
Uploads: 0



https://www.zeit.de/digital/datensch...e.google.de%2F

Quote:
(...)
Whittaker: The big tech companies have built up a surveillance business model over the years: they provide us with their services free of charge, while in the background they build profiles from the gigantic amounts of data that are generated and stored, and sell access to those profiles to the advertising industry. This is the engine that drives the entire technology industry. Around 2010, these companies asked themselves how they could further increase their profits. How they could open up even more markets in which to use the data they had already stored anyway. How they could use their computing capacities - the huge data centers and server farms they operate anyway. The answer at the time was: you can use all these resources to train what we now call artificial intelligence, and then sell that across many different markets as a super-capable, intelligent solution - from education to health and so on.

ZEIT ONLINE: What role did technological development play in this?

Whittaker: Around the same time, it turned out that you could do entirely new things with older so-called AI technologies, such as machine learning and artificial neural networks, if you combined them with massive amounts of data and computing power. Artificial intelligence, to my mind, is a marketing term rather than a technical one. The more you turn to it, the more it justifies and exacerbates the concentration of power and social control in the hands of the few companies that build large language models.

ZEIT ONLINE: So, by implication, you are saying that today you cannot build and train a large language model at all that is not based on the logic of these surveillance business models, as you call them?

Whittaker: I think it's impossible to build a large language model that defies that logic, because it takes gigantic amounts of computing power to train these systems - and that's very expensive. It's also very complex and expensive to make these systems usable as an interface for users, as we see with ChatGPT or Microsoft.
(...)
__________________
If you feel nuts, consult an expert.