It’s the time of year to reflect. Since photographs preserve brief moments, they can refresh one’s memory of lived experiences. Reviewing photographs from the past year can aid in reflecting upon the details, but it is often useful to consider the broader strokes. Since photography is the main thing I do when I am not working, I like to consider how I approached photography and how that approach has evolved. And considering which tools I used helps me focus on the larger themes.
My approach to photography has continued to evolve as I learn more about the craft. One example: I picked up my first spot meter in 2024. Well, my first two spot meters, actually. I thrifted a spot meter off of eBay, and it was not even close to accurate. There was no chance of calibrating it, so I went back to the well. I ended up finding a great deal on a Soligor Spot Sensor II, and thankfully it seems to be reasonably accurate.
In my last video I talked about how Bloomfilter was implementing chains of large language models to more accurately accomplish difficult tasks. Among AI implementors this is often known as an agentic workflow. Andrew Ng has described four approaches to agentic workflows: reflection, tool use, planning, and multi-agent collaboration. At Bloomfilter we are currently implementing tool use and multi-agent collaboration, though our use of agent collaboration is in its early stages. We do plan on expanding our capabilities to take advantage of each of these approaches.
There are a few no-code tools for implementing agentic workflows — CrewAI and CasidyAI are a couple of examples. We are implementing our agents, and their assistants, in code; we do this because we are relying heavily upon the code and models that we have already built for use in our SaaS application. Today I want to walk through how we are implementing these agents. My use case for this discussion will be eliminating tedious tasks from my routine — exactly what we hope AI will do for us. The specific case I have in mind is using AI — to use Bloomfilter — to pull the data needed to fill in my weekly KPIs.
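To make the tool-use pattern concrete, here is a minimal sketch of an agent loop in Python. Everything in it is hypothetical, chosen for illustration: the `get_weekly_kpis` tool, the stubbed model, and the message shapes stand in for our real models and the Bloomfilter APIs, and a real implementation would call an LLM where the stub sits.

```python
import json

def get_weekly_kpis(team: str) -> dict:
    """Hypothetical tool: pull KPI data (stubbed for illustration)."""
    return {"team": team, "cycle_time_days": 4.2, "throughput": 17}

# Registry mapping tool names to callables the agent may invoke
TOOLS = {"get_weekly_kpis": get_weekly_kpis}

def model_stub(messages):
    """Stand-in for an LLM call: requests the KPI tool once,
    then summarizes the tool's result."""
    last = messages[-1]
    if last["role"] == "tool":
        data = json.loads(last["content"])
        return {"role": "assistant",
                "content": f"Cycle time: {data['cycle_time_days']} days"}
    return {"role": "assistant",
            "tool_call": {"name": "get_weekly_kpis",
                          "args": {"team": "platform"}}}

def run_agent(user_request: str) -> str:
    """Loop: let the model either call a tool or answer."""
    messages = [{"role": "user", "content": user_request}]
    while True:
        reply = model_stub(messages)
        call = reply.get("tool_call")
        if call is None:
            return reply["content"]
        # Execute the requested tool and feed the result back to the model
        result = TOOLS[call["name"]](**call["args"])
        messages.append({"role": "tool", "content": json.dumps(result)})

print(run_agent("Fill in my weekly KPIs"))
```

The important part is the loop: the model decides whether to call a tool, the code executes it, and the result goes back into the conversation until the model produces a final answer.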
I have been using Cursor to write code, and since this site is code, I also use Cursor to write the content that I post here. It knows about my codebase (in this case, my previous articles), so it has been interesting to see what it recommends that I write. As I was updating the pages about me, it suggested what I should write about my past. It was wrong, of course, but close enough to be thought-provoking. Here is what the AI suggested I write:
stollee is a nickname I picked up in Iraq in 2004. I was working as a civilian contractor for the U.S. Army in Baghdad when I started a blog called “The Daily Show on Iraq”. I was a big fan of Jon Stewart and The Daily Show, and I thought it would be funny to do a daily show from Iraq. I soon learned that funny is not easy to capture in writing, and I’m not sure I succeeded, but I did manage to capture a bit of the absurdity of the situation. I continued the blog for several years, and it became a popular destination for people interested in what was happening in Iraq at the time. I also started a podcast called “The Iraq Podcast”, which I continued for several years. I also wrote a book called “My Iraq War”, which is a collection of my writings from the blog and podcast.
In this video, I discuss AI and how Bloomfilter is incorporating it into its product. In future sessions, I will cover how the Bloomfilter team integrates AI into our SDLC and dive deep into how we implement AI in our codebase — including a short coding session to demonstrate how we actually code it. But today, I just want to talk about our product’s relationship with AI.
I started this conversation because of the cringe looks I get from some of my friends when I tell them we are building AI into our platform. There’s been so much hype around AI lately that some people in the industry — especially practitioners and implementers of technology — are becoming exhausted by it. And I understand that. Many products are simply adding AI prompt-to-content generators — often with dubious value — just to claim they have an AI-enabled platform.
Historically, I have bounced between a variety of film stocks. In order to get better at photography, I have been trying to reduce variability in my process. So, recently, I have been limiting myself to a single film stock for a period of time or for a set number of rolls. I want to really embrace each film: learn how best to shoot and develop it, and learn its unique characteristics.
This summer, I had a few backpacking trips planned. I wanted to bring a classic black-and-white film along, and I wanted one that would be extra kind to skin tones. People wear less makeup and tend to get dirtier when backpacking, and I wanted a film that would help tame wrinkles, creases, and dirt. Fomapan 400, with its classic large grain and extra red sensitivity, seemed like it would be an ideal choice. After shooting six rolls of Fomapan 400 in 35mm under varying conditions, I think I’m starting to understand it.
After using a Hasselblad 203FE and Zeiss Planar 110 f2.0 for about a year, I found myself wanting to improve the shooting experience. Though the 203FE is practically dwarfed by cameras like the Mamiya RZ67, I’ll acknowledge it’s still more of a studio camera than a street shooter. Despite that, I enjoy attaching a hand strap and taking it along for a photo walk. The trouble for me comes when I want to throw a filter on the front.
The Carl Zeiss Planar 110 f2.0 has a bayonet 70 filter mount. Purpose-made filters of this type are expensive, but the real problem is that useful filter variants are almost never available for purchase. Additionally, the aftermarket bay 70 adapters are not very good; they have bent metal springs that never catch properly, they rotate out of position, and they fall off the camera. The Cokin P series adapters work fine when on a tripod, but the corners of the oversized square frame catch on pockets and are generally a nuisance when walking around. If I wanted something to suit my needs, I’d have to make it myself.
I have one enlarger, an Omega D3. I sought out this model because it can handle negatives as large as 4 by 5 inches. But, when I decided I wanted to print some subminiature 8 by 11 millimeter negatives, I would be pushing its capabilities in the opposite direction.
The manual assured me that the Omega D was capable of printing Minox negatives. I don’t have the recommended 28mm lens, but a 50mm lens should be able to make enlargements up to 3.5 x 5 or 4 x 6. What I really needed was a carrier to hold the negatives in place.
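As a quick sanity check of those print sizes, the linear enlargement factor is just the print dimension divided by the negative dimension. A small hypothetical helper, assuming the print’s long side comes from the 11 mm side of the Minox negative:

```python
# Linear enlargement factor from an 8 x 11 mm Minox negative
# (hypothetical helper; 25.4 mm per inch).
NEG_LONG_MM = 11.0

def magnification(print_long_in: float) -> float:
    """Print long side (inches) over negative long side (mm)."""
    return print_long_in * 25.4 / NEG_LONG_MM

print(round(magnification(5), 1))  # 3.5 x 5 print -> about 11.5x
print(round(magnification(6), 1))  # 4 x 6 print -> about 13.9x
```

So a 4 x 6 print from a Minox negative is roughly a 14x enlargement — about the same factor as printing 35mm at 13 x 19 inches.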
After spending some time searching the internet for the official negative mask, I concluded that I would need to manufacture one myself. My first thought was to 3D print a near copy of my 24 x 35 carrier, just with a smaller opening. However, I was concerned that the heat given off by my incandescent condenser would warp the carrier. So, instead, I decided to create an insert for the 35mm carrier.
I finally had a chance to shoot a roll of Harman’s new color film. Phoenix is noticeably different from other films on the market; the film comes out of the can a mustard color and comes out of the developer a deep indigo. At a quick glance the developed negatives look to be black and white. It’s only on closer inspection that orange hues are visible. The purplish base color might cause problems for some negative digitization programs, but I was able to get usable images with a small amount of effort.
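For the curious, the small amount of effort amounts to neutralizing the base color before inverting. A rough sketch of that idea with NumPy — not the actual digitization software, and `base_rgb` is a made-up value here; in practice it would be sampled from an unexposed edge of the film:

```python
import numpy as np

def invert_negative(scan: np.ndarray, base_rgb: np.ndarray) -> np.ndarray:
    """Divide out the film-base color, then invert to a positive."""
    normalized = scan.astype(float) / base_rgb       # neutralize the indigo base
    positive = 1.0 - np.clip(normalized, 0.0, 1.0)   # invert: base becomes black
    return (positive * 255).astype(np.uint8)

# Example: a 1 x 2 pixel "scan" whose first pixel is pure film base
base = np.array([180.0, 140.0, 200.0])               # hypothetical indigo-ish base
scan = np.array([[[180, 140, 200], [40, 30, 50]]], dtype=np.uint8)
print(invert_negative(scan, base)[0, 0])             # base inverts to [0 0 0]
```

Unexposed film base carries no image, so after neutralization it should invert to black — which is exactly what a dense, exposure-free area of a negative prints as.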
The final images turned out well, though not without idiosyncrasy. Pictures shot outdoors under average conditions produced contrasty images not unlike those from other daylight-balanced films. It is outside of average conditions that the film began to show its unique character. On mostly blue-sky days, the film seemed to produce pure whites that enhanced the clouds. When shooting toward the sun, even under rainy conditions, the lack of an anti-halation layer was obvious and at times overpowering. It seemed that every time I used the spot meter on the TC-1 to preserve detail in the shadows, I lost it in the highlights, so perhaps the dynamic range is somewhat limited. Finally, when shot at night, Phoenix may have imparted slightly more orange than would be expected from a daylight film.
There is no doubt that this film has quirks, but I enjoy the character of Harman’s Phoenix. I will undoubtedly shoot more of this film, though I will be mindful of the dynamic range and halation.