I have been a casual camera user for decades, ever since I got a simple box camera in my teens. I actually worked as a professional photographer when I got out of college, and built my own darkroom in several homes for printing black-and-white pictures. This was obviously during the pre-digital era. For the past few years I have been enjoying my smartphone’s camera, which keeps getting more and more capable. Going digital meant no more darkroom; instead, I could upload my images to an online photo processor for printing. Here is a photo that I took recently.
Well, I come bringing bad news, although it initially doesn’t seem that way. First, cameras are getting smarter. It isn’t just a matter of higher-resolution images, but of better software. The new Google Pixel 9 has something called Reimagine, which has some very neat tricks for in-phone picture editing, as shown in this Threads post by Chris Welch. This used to be the domain of a skilled Photoshop operator. Now you just tap the right buttons.
Second, AI is moving into more “original” photography. Google’s ImageFX is now available in preview here: you type in a description of a photo you are interested in, and within a few seconds it creates a few images for you to choose from. My prompt of “high-resolution photo-realistic interior of ornate teal art deco living room” produced the following image:
That is a pretty nice setup. Note that the lighting effects and the reflections in the glass table and mirror are pretty darn good.
So what is the bad news, you may ask? Well, as someone reminded me, what we are seeing here are the worst images AI will ever produce; the quality will only get better, much better, given the pace of AI development. Soon the lenses on the back of your phone will be redundant. Those travel photos you took on your last trip to someplace exotic? Chances are someone else has already been there, posted the pix, and some AI engine has gobbled them up. Just a few clicks and you can be added to the foreground. What about pictures of things? Google Lens has been improved: it is now part of the Android OS, and you can do all sorts of tricks with it to identify what you are seeing on a web page or IRL. No need to set up a pesky array of lights or to compose the perfect image. Just crop with your finger.
While you mull that over, I want to leave you with one last image, of a real electric power station near Budapest. At least, I think it is real and not some AI construct. But soon it won’t matter much, just as we have forgotten all about using stop bath and learning the Zone System.