Galactocosmic Ontological Disorder
"We will arrive at a moment of sufficient self-alienation where we can contemplate our own destruction as in a static spectacle..." -Walter Benjamin
"Like many intellectuals, he was incapable of saying a simple thing in a simple way."
Marcel Proust
Repost from Anticapitalist Surrealism 🇵🇸🔻
Dumb YouTuber tries to post Zionist content, gets punished by Allah less than six hours later
A summary of the key points from the video transcript:
- Training AI models requires a massive amount of energy, with GPT-3 estimated to have consumed over 1,300 MWh of energy, enough to power 130 US homes for a year. Training even larger models like GPT-4 is estimated to cost over $100 million.
- Operating and using AI models requires significantly less energy than training, but image generation still requires around 1,000 times as much energy as text generation per task.
- Data centers currently account for 1-2% of global electricity usage, around 400 TWh per year. AI and crypto mining are likely to increase this substantially by 2026.
- Building large AIs is incredibly expensive, requiring billions per year. This will likely lead to only a few massive AIs controlled by big companies or governments.
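The headline figures above can be cross-checked with back-of-envelope arithmetic. This sketch assumes an average US household electricity consumption of ~10 MWh per year and a global electricity total of ~25,000 TWh per year; both round figures are assumptions introduced here, not numbers from the video:

```python
# Cross-check: GPT-3 training energy vs. household consumption.
# Assumed: avg US home uses ~10 MWh of electricity per year.
GPT3_TRAINING_MWH = 1_300
HOME_MWH_PER_YEAR = 10
homes_powered = GPT3_TRAINING_MWH / HOME_MWH_PER_YEAR
print(f"GPT-3 training ~ {homes_powered:.0f} US homes for a year")

# Cross-check: data centers' share of global electricity.
# Assumed: ~25,000 TWh global annual electricity use.
DATACENTER_TWH = 400
GLOBAL_TWH = 25_000
share_pct = DATACENTER_TWH / GLOBAL_TWH * 100
print(f"Data-center share ~ {share_pct:.1f}% of global electricity")
```

Under these assumptions the numbers are internally consistent: 1,300 MWh works out to about 130 home-years, and 400 TWh lands in the stated 1-2% band.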
Galactocosmic Ontological Disorder in the style of Zdzisław Beksiński
By Playground-v2.5
Galactocosmic Ontological Disorder in the style of Zdzisław Beksiński
By StableDiffusionXL