counterpoint

Individual quibbles:

> How good are those descriptions? Here's what I got from this command:
>
>     llm -m gemini-1.5-flash-8b-latest describe -a IMG_1825.jpeg
>
> Against this photo of butterflies at the California Academy of Sciences:

Fuckin' if the photo is from the California Academy of Sciences it's been in everyone's training data since Google's Deep Dream. The whole point is "how is it at novel images", not "how is it at recognizing canned content from 2005."

> An interesting point of comparison here could be the way railways rolled out around the world in the 1800s. Constructing these required enormous investments and had a massive environmental impact, and many of the lines that were built turned out to be unnecessary—sometimes multiple lines from different companies serving the exact same routes! The resulting bubbles contributed to several financial crashes, see Wikipedia for Panic of 1873, Panic of 1893, Panic of 1901 and the UK's Railway Mania. They left us with a lot of useful infrastructure and a great deal of bankruptcies and environmental damage.

Kinda weird to mention railroad tycoons and not, you know, talk about railroad tycoons. Particularly when the marginal utility of a railroad is obvious while the marginal utility of "caption the 68,000 photos on my hard drive" is not. Photo metadata has been big business since the dawn of digital photography, and to the best of my knowledge AI has not moved the big houses an inch: all the AI tagging you see is coder dipshits selling SaaS to a market they don't understand, while Getty, iStock and the rest have their well-refined neural model that goes back to eighteen diggity two sitting there doing exactly what they need it to, with no interference from OpenAI whatsoever. The big impact of railroads was poor people being crushed by rich people, not better URL scraping.
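For reference, the "caption the photos on my hard drive" use case the post is gesturing at is basically a loop around the same `llm` command quoted above. A rough sketch only: it assumes you have `llm` with the llm-gemini plugin installed and an API key configured, and the `~/Pictures` path and `.txt` sidecar naming are placeholders, not anyone's actual setup.

```sh
# Sketch: batch-describe JPEGs with the same llm CLI invocation quoted above.
# Assumes the llm-gemini plugin is installed and a Gemini API key is configured.
# ~/Pictures and the .txt sidecar naming are illustrative placeholders.
for f in ~/Pictures/*.jpeg; do
  # One description per image, written alongside the file as a text sidecar.
  llm -m gemini-1.5-flash-8b-latest 'describe' -a "$f" > "${f%.jpeg}.txt"
done
```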