A Mammoth Undertaking: Interviewing the Future of Art

The following is my interview with DeepAI and Stable Diffusion, running on an AMD Ryzen with an NVIDIA GeForce RTX 3090.

Thank you for taking the time to discuss this important and interesting topic. Why are mammoths so important to the art world?

That the mammoths were domesticated before the start of the human era seems implausible from a historical perspective. In fact the majority of mammoths that people find in museums, libraries, and other collections are not domesticated mammoths. This suggests there might have been a greater need for mammoths within culture than before. This would make sense, considering that Neanderthals likely hunted other types of other species before settling in our world more than 400 years ago. Because they were not domesticated, they were only adapted to hunting rodents and other plant and animal species that were less domesticated in their wild form such as rabbits, snakes, etc.

Starting at the earliest part of your collection, the mammoth is an unusual theme for Rembrandt, isn’t it?

It’s hard to believe that after all the success of their latest work, The Lord’s Prayer doesn’t want to let their “biggest hit yet”, the classic The Lord’s Prayer, disappear. The actor’s latest film, The Wrestler (via Warner Bros., which produces on their own, and in the U.K.), has garnered more than 15 million views on Amazon Instant Video (the company’s preferred store for Netflix-wielding subscribers, not the usual suspects Google Video), and has become so popular that when the film premiered at the Venice Film Festival, it was deemed the top film at the fest. The star-studded line-up has included the likes of Peter Lorre, Michael J Freberg, Colin Farrell, and Peter Sellers (both of whom made it to a Hollywood winery).

One Shot, Few Shot, Radical Shot

Exunoplura is back up after a sad excursion through the challenges of hosting providers. To be blunt, they mostly suck. Between systems that just don’t work right (SSL certificate provisioning, in this case) and bad-to-counterproductive support experiences, it’s enough to make one want to host it oneself. But hosting is mostly, as they say of war, long boring periods punctuated by moments of terror as things go frustratingly sideways. In any case, we are back up again after two hosting-provider side trips!

Honestly, I’d like to see an AI agent effectively navigate technological challenges like these. When even human performance is fleeting and imperfect, the notion that an AI could learn to deal with the uncertain corners of the process strikes me as currently unthinkable. Still, there are some interesting recent developments worth noting and discussing on the journey toward what is called “general AI”: a framework as flexible as people are, rather than narrowly tied to a specific task like visually inspecting welds or answering a few questions about weather, music, and so forth.

First, there is the work by the OpenAI folks testing massive language models against one-shot and few-shot learning problems. In these problems, the number of presentations of the training cases is limited, rather than presenting huge numbers of exemplars and “fine-tuning” the model’s responses. What is a language model? It varies across approaches, but typically it is a weighted context of words of varying length, with the weights reflecting the probabilities of those words in those contexts over a massive collection of text corpora. For the OpenAI model, GPT-3, the total number of parameters (the learned weights over those word contexts) is an astonishing 175 billion, trained on roughly 45 TB of text.
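The “weighted context of words” idea can be made concrete with a toy sketch: a bigram model that estimates the probability of a word given the previous word from raw counts. This is a deliberately minimal, hypothetical illustration of the probabilistic intuition only; GPT-3 itself learns dense neural-network weights over long contexts, not explicit count tables, and the tiny corpus here is invented for the example.

```python
# Toy bigram "language model": P(word | previous word) from counts.
# A minimal sketch of the weighted-context idea, NOT how GPT-3 works.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ran".split()

# Count how often each word follows each context word.
following = defaultdict(Counter)
for prev, word in zip(corpus, corpus[1:]):
    following[prev][word] += 1

def prob(word, context):
    """P(word | context) as the relative frequency of observed continuations."""
    counts = following[context]
    total = sum(counts.values())
    return counts[word] / total if total else 0.0

print(prob("cat", "the"))  # "cat" follows "the" in 2 of 3 occurrences
```

Scale the context window up from one word to thousands of tokens, and the count table up to billions of learned weights, and you have the flavor of what these massive models encode.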