For several months now I’ve watched my KAGE buddies Bert Stefani and Jonas Rask sharing GFX 50S images shot using various vintage lenses. Actually, Jonas is the mad doctor/collector in the gang—at this point I believe we can safely call it an addiction. Needless to say, I’ve been curious about trying this myself.

I had an old Pentax K1000 (my “college” camera) that a friend had given me a while back, and I realized the lens on this thing was a 50mm f/1.7—faster than I thought. So on a whim I ordered a Fotodiox Pentax-K to GF adapter. I wasn’t expecting much but man...colour me impressed.

Obviously this is manual focus only, but the various options on the camera (focus peaking, zooming, picture-in-picture) make this extremely usable, even wide open. This is a full-frame lens on a much larger sensor, which does introduce some vignetting—easily corrected in post, or an interesting effect on certain images. But what most surprises me is the sharpness: I wasn’t counting on this at all. I don’t mind a softer look and I was in fact expecting this combination to err on the side of “imperfection”...the reality is quite different. Just for kicks I shot a couple of studio images and although it’s probably not obvious at these resolutions, believe me, we’re in pixel-peeping territory here. I mostly don’t care about that sort of thing but it is surprising.

The K1000 shot with its own lens. 

I don’t expect to fall into the rabbit hole of vintage collectibles. Honestly, I have way too much gear as it is. I’ve even taken steps to clean up my approach and better identify my needs at this point—the two systems can certainly work in parallel but there are clear strengths on either side that I now see much more clearly.

This lens gives me a slightly wider 40mm-ish FOV—which isn’t negligible. Basically, the Pentax is a new, unexpected tool in my GFX arsenal.
The experiment has paid off. 
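For the curious, the “40mm-ish” figure falls out of simple crop-factor arithmetic—a minimal sketch, using the published sensor dimensions (the helper names are my own):

```python
# Crop-factor math behind the "40mm-ish" equivalent field of view.
# Sensor dimensions are published specs; everything else is illustrative.
import math

def diagonal(width_mm, height_mm):
    """Sensor diagonal in mm."""
    return math.hypot(width_mm, height_mm)

FULL_FRAME = diagonal(36.0, 24.0)   # ~43.3 mm
GFX_50S = diagonal(43.8, 32.9)      # ~54.8 mm

# Ratio below 1 means the larger sensor sees a wider field of view
crop = FULL_FRAME / GFX_50S         # ~0.79

print(round(50 * crop, 1))          # the 50mm behaves like ~39.5mm
```

So a full-frame 50mm on the GFX sensor frames roughly like a 40mm would on the K1000—hence that slightly wider feel.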



I need street and colour and a giant gasp of visual oxygen. Been browsing through Philip-Lorca diCorcia’s Tokyo work and it almost hurts. I can sense the crowds, the movements, hear the murmurs.

February is when starvation kicks in.


diCorcia but also Crewdson...I’m on a rampage. It’s a new ritual, making a point of searching for images, seeing the work of others—photographers or painters or sculptors. Forcing it as an essential part of my day. Feeding, really. I keep a running screenshot scrapbook in Bear—I started this a long time ago but I’m now trying to add content to it daily. It’s a vicarious way to get behind someone else’s eyes and understand their impulse. A deconstruction of intent. Ultimately I’m creating a lookbook based on my own personal triggers, without direction or afterthought.

I invaded our bathroom yesterday morning, with a flash and a c-stand, a tripod and my GFX. I noticed colours, however dilapidated, and couldn’t help myself. Cynthia called it “our crap as art” when I showed her the results later in the evening. “Good hashtag”, I joked. Crap/treasure...we all know how that goes, right?

For now it’s all an antidote to whiteness.


A few moments ago I deleted Facebook from both my iPhone and iPad. There was no collapse of the upper atmosphere, no seismic shift in the ground beneath me. I was reading the comments section on yet another Trump-related piece of crap and I suddenly saw the spiral, with perfect clarity. I saw time squandered in this daily drip of insanity, in a flux of news that was always worse, always more infuriating in its idiocy. The politics south of our border are unraveling so completely that it’s hard to look elsewhere. Well, it’s hard for me. I’m just built that way. But I know it’s become psychologically damaging and as much as I wish to stay informed, it all needs to stop. I can’t exist in this perpetual, silent scream.

So enough already. Detox. Peace, please. And maybe some sunshine...eventually.

A few images shot throughout the day—as I await delivery of a Pentax K adapter. But that’s for another story.

Apple, glitter & joy.


This is one of those diverging posts, so feel free to skip it and grab your camera instead. I won’t mind one bit.

Apple announced another blowout quarter this week, the new iPhone X pushing revenues forward (regardless of naysayers). Cool. I guess. Truth is I’m not feeling much of a buzz these days. I’m not moving to another platform or ditching anything. I still love my iPad Pro to death and get a kick out of iOS developments. But I’m realizing a big chunk of the Apple magic is gone for me. And it’s a little like discovering Santa isn’t real: it’s not the end of the world but it’s still a small emptiness, like a layer of glitter scratched away from reality. I could point to misfires—from the Touch Bar/MacBook debacle to the lack of updated Macs for way too long. I have friends—longtime Apple users—who ended up actually moving to Windows as a result. But those missteps haven’t directly affected me. I could go back to Aperture as the very first blow to my relationship with the company—losing a tool I loved, one that was instrumental in my becoming a full-time photographer. But I’ve long since moved on and I don’t tend to hold grudges. I guess it’s a combination of various elements. Today I want to look at two aspects of the Apple experience that, while on the periphery, have been bugging me for some time. In my mind, they’re also symptomatic of a certain erosion: Apple Music and Siri.


We’ve also been building Apple Music into a great service and bringing more intelligence to Siri to help you discover new music based on your tastes and preferences.
— Phil Schiller, Sound & Vision, January 28, 2018


I subscribed to Apple Music (family plan) the day it became available. Before this, I’d been on iTunes Match—Apple’s paid service that allowed a personal iTunes library to exist in the cloud—for several years. My existing songs were pretty much all rated. In the new Apple Music-enabled app I filled out the bubbles to let Apple know my likes and dislikes. And I’ve been painstakingly “loving” and “disliking” songs for over two years, in an effort to help the service understand my tastes which, I admit, can be a bit complex. Long story short, Apple has a boatload of musical data on me: my entire music library, everything I’ve ever purchased from iTunes and all I’ve since added through their streaming platform. And yet...the New Music Mix is still, to this day, either seriously off, wrong or downright insulting. Is it all bad? No. I’ve discovered a handful of artists along the way, thank God. But the overwhelming majority of what Apple Music proposes through the supposed power of its “knowledge” is just plain awful. I get current pop music when I’ve explicitly disabled the genre several times already; when I’ve disliked every single song remotely tied to it. I get Quebec artists that don’t even remotely fit anything I’ve ever listened to (because it’s where I happen to live?). It’s gotten to the point where listening to this mix is a chore, something I do out of contempt, in the desperate hope that it will, maybe, eventually, pay off. For a service that claims intelligence this feels a lot like a failure in my world.

In the For You section, I’ve been presented with the same rotation of playlists since the day I subscribed. Until recently, most of these were labeled as “updated two years ago”. This has now been removed, probably out of shame. No real refresh though.

I also believe there’s a fundamental problem with the binary choice of either “Love” or “Dislike”. It allows for zero nuance in a user’s tastes and I think this effectively contributes to Apple Music’s shortcomings: with all the buzz around machine learning and AI, I have to believe there’s an advantage to gathering as much micro data as possible. But even if we allow for the current dual-choice setup, the system should identify what effectively triggers loves and dislikes, beyond artist and genre. Do I favour songs in certain keys over others? A certain type of mood or instrumentation? Tempo? Intensity? Is there a link between Radiohead’s Fake Plastic Trees, The Cure’s One Hundred Years and Josefin Öhrn’s Sister Green Eyes? Between Miles Davis, Beach House and Grizzly Bear? There’s so much here that could help Apple Music tailor itself to a user’s preferences. And yet from what I can tell, if I love a Public Enemy song or a Kendrick Lamar album, then I’ve sent a signal for hip-hop/rap, period. Perhaps I’m wrong but it certainly feels like a dumb flag. It feels like there’s no actual evaluation of content, of finer sonic characteristics or even a simple analysis against the rest of my library. So every time I hear a song that I happen to enjoy on Apple Music—regardless of genre—I find myself questioning the effect a “love” will have on future recommendations, often skipping the process entirely. Which ends up defeating the purpose.

This notion of “intelligence” isn’t new. When iOS 9 was announced, it was the big buzzword of the day: Siri would learn to anticipate our needs and preferences. This might be anecdotal but after updating, every time I got in the car and plugged in my iPhone I was greeted by the same Nat King Cole Christmas album. For months, all through the following summer. A Christmas album. And every time I’d look at my iPhone and think wow, you’re really dumb.

Which brings us to Siri.


In Apple’s world Siri is everywhere: iOS, macOS, watchOS, tvOS. So I think it’s normal to expect this ubiquitous technology to be the glue that links all devices together—a kind of common, unifying personality. But it isn’t.

Here’s another quote from Phil Schiller’s Sound & Vision interview:

From the very beginning, we engineered HomeKit to work great with Siri, so it understands all of your HomeKit accessories and how you want to control each of them.
— Phil Schiller, Sound & Vision, January 28, 2018

I’m sitting at my Mac and my iOS devices are upstairs. So I ask Siri (on my Mac) to make the lights brighter:


“Mac Siri” can’t control HomeKit. Ok. But here’s another one: an Apple TV 4 or later (or iPad, but that’s beside the point here) is required if you intend to do things such as automation with HomeKit devices. It says so right there in the iOS Home app. The Apple TV is an essential part of the HomeKit ecosystem. Now, Siri is also on Apple TV. So here’s a new scenario: I’m sitting on the couch and I want to dim the lights. I pick up the Siri remote and ask Siri to, you know...dim the lights. Nope. “Apple TV Siri” CANNOT CONTROL HOMEKIT.

When Apple talks about Siri as a singular entity...they’re not being honest. Siri is utterly fragmented. It’s actually up to us to decipher what Siri can or cannot do, depending on the device we’re using. And this segregation of function per device extends to other areas in ways I simply don’t understand: the HomePod is soon to be shipping, but on this device Siri only understands English (which is probably the reason why it’s not available in Canada). What’s going on here? Is this a new “HomePod Siri” that can’t understand what “iPhone Siri” already knows? Doesn’t “iPhone Siri” speak a ton of languages by now?

What’s the point of development if you can’t build on what already exists or—in AI terms—use knowledge that’s already been acquired?


I bought an Echo Dot last December. Probably no big deal for many of you but in Canada, this was our very first “official” taste of Alexa. Honestly it was an impulse buy: we’re already Amazon Prime members and the Dot was something like $45 CAD during the holidays. I was curious so I figured what the hell. If it sucks, it sucks. I won’t be losing much sleep over it.

Holy shit, what an awakening.

With Siri, no matter which device I use (Apple Watch, various iPads, iPhones, even iPods) I’m never entirely certain of the results. Sometimes it will answer immediately and do exactly what I asked. Other times I just stand there like a moron, repeating myself to no avail. Or the reply will be wrong. When Siri works it’s fine...but consistency is incredibly important in digital assistants. Because when they don’t work you feel like a fool—and you just stop using them. And then it doesn’t matter if they eventually improve because you’ve stopped trying.

When I received the Echo Dot it was already the evening, so I didn’t go much beyond the initial setup. But I was somewhat impressed when I paired it to my iPhone, only to hear Alexa explain how I could just ask her to connect next time (*1). Then I moved downstairs to read (Cynthia was out playing volleyball), turned on the aforementioned Apple TV and asked Siri to play some classical music. I got David Bowie. Bowie...classic rock maybe? Nope. This was Blackstar. I asked again. Still no go. And btw: I could see the query typed out correctly on-screen so it’s not like Siri understood something different. On a whim I went back upstairs and asked Alexa the exact same thing: “Here’s a station called Classical Music, from Amazon Music”, she replied. Huh (*2).

Alexa works every. single. time. It’s like a bloody revelation. If I ask for Cool Jazz I get a Cool Jazz station—the specific genre. Siri gives me recent jazz songs or top jazz songs, as if “cool” is a synonym for “popular”. Alexa understands the kids, regardless of their accents and sometimes imperfect English grammar. She understands when I’m halfway down the stairs and I ask her to turn on the lights in the studio—no pause, no “hold on”, just an “ok” and the lights are on, instantly. One night during dinner Anaïs, our oldest daughter, asked Alexa to “say meow”...don’t ask me why, kids are kids. But just as I was about to tell her she was being silly...cats started meowing. Everyone’s eyes went very, very wide, our actual cat went completely berserk and we had a pretty good laugh. Useless? Oh totally. But behind the scenes Alexa heard and understood the query, found a “skill” that would do the job, enabled and launched it—all of this before I had time to utter a single word. Call me crazy but for me this is technology at its most magical: frictionless. It just works? Damn right it does. As for Siri...I tried the same thing just for kicks:


I understand and appreciate Apple’s stance on privacy. I’m also well aware of the way Amazon can and probably will use some of the information it gathers. But Apple has repeatedly said their approach doesn’t hinder their capacity to compete in this space, so I should be able to compare along the same lines and not grade on a curve. Do I care that Siri can’t meow? Of course not. But what I’m witnessing with the Echo Dot is lifestyle integration. Amazon’s strategy of anything and everything results in a device that, almost invisibly, weaves itself into the fabric of our lives. In the morning I ask Alexa to “start my day”: I get the weather, driving conditions and the latest news bulletin from CBC. I then ask for a specific radio station and she obliges (through the integrated TuneIn). If I ask “what’s up” I get: the time, a weather update, a check of any upcoming events on my calendar and a random trending bit of information—usually something light and fun. And if I ask for classical music, I get bloody classical music.

It’s clear that the lack of a visual interface initially forced Amazon into providing any information in audio form. They had no choice. There’s no “Here’s what I found on the web” cop out...because there’s no web to look at (*3). So when I was cooking chicken one night, had my hands dirty, and asked Alexa for the safe cooking temperature, I got the answer I needed. I didn’t get the following response (which would’ve been useless unless I was ok with putting down the chicken I was holding and getting grease all over my screen):


So because it reacts, because it responds with actual real-world answers, the Echo feels “normal”. And this is creating a very different type of “relationship” in our household. Actually, it’s not unlike how the Mac used to feel when other computers were overly complex and intimidating. The smiling Mac on boot up was derided by those who dismissed it as a toy, but what it represented eventually changed our world: approachability, ease of use, a rejection of the arcane. A natural extension of possibilities.

Siri, in its current form, feels arcane in 2018. Of course the threshold for this isn’t the same as it was in 1984, but the friction is still too great. Siri should simply work better than it does, in small ways: it should provide the information we ask for, not a list of random web links; it should integrate with every type of application, not just a trickling subset that Apple expands once a year if we’re lucky. And most importantly: Siri should be the same Siri across all Apple devices. One of HomePod’s bullet-point features is Siri as musicologist: deep knowledge, or just a new incarnation stuck in a speaker?

And while we’re on the subject: Siri should have a voice on all devices—that’s its identity. I can understand the lack of audio feedback on first-gen Apple Watches but not, for instance, on a $199 CAD Apple TV. Not when a cheap plastic puck from Amazon can speak to me just like my iPhone can.

In the above quotes Phil Schiller mentions Siri—not “Siri on iOS” or “Siri on HomePod”. Just Siri. If that’s how Apple presents the technology then that’s how it should work. The entire platform would be that much stronger. I’m at a complete loss to understand why this isn’t the case.

I type these words on an iPad Pro, with Ulysses, and the experience still makes me smile. But beyond Siri and music, it’s an accumulation of disappointments that has soured my view. The universally hated Siri Remote infuriates me daily—I still can’t understand how any company could release this with a straight face, let alone Apple (*4). The Apple News app is still unavailable in Canada three years after its introduction (it’s a news reader ffs); iBooks Author, an application that could’ve been HyperCard for books, never got past a few lacklustre releases, was never ported to iOS, and now seems dead in the water. Both were visions I enthusiastically embraced—in vain, ultimately.

The DNA is still there: the recent 10.4 update to Logic Pro X was nothing short of amazing, reminding me of the glory days of Apple’s creative software suite. But it highlighted how much I miss those days. I miss Apple The Enabler. I miss watching keynotes that sparked a thousand ideas.

I miss the glitter and joy.


1. Unfortunately, you lose this ability as soon as you add more than one device.

2. My iPhone 8 Plus gave me the correct results last night. But this wasn’t the first time I’d been unable to get classical music on the Apple TV or older devices in the past year.

3. It’ll be interesting to see how the HomePod handles this type of information. Will it simply be as clueless as the Apple TV?

4. We finally added a cheap silicone sleeve that helps with grip and orientation. But what an awful design. Hard to imagine Jony Ive using this and not throwing it against a wall. I really don’t get it.

The curse of molecules


I wonder if living further south would alleviate the symptoms. I profoundly hate this time of year and I’m not even certain it has anything to do with the weather—as I write these words, the sun is shining as bright as it can possibly shine. But then I guess it’s the amount of it that counts. Can something as simple as a lack of melatonin seriously screw with our brains this way? Our thoughts so tightly bound to chemical compounds that we just stop moving without the right combination of molecules in our bloodstream?

I should work on ten thousand projects but I’m frozen. I should share something, somewhere, build and be productive and proactive and a bright, cheery and winning face to the world. But the very thought of pretending makes me sick to my stomach. I know how this all sounds, how privileged I am and how bloody ungrateful I must appear. Still, the air these days feels like carbonite. It happens and it’ll pass, I know; it always does. All hail the four seasons. But it scares the hell out of me each and every time.

I should probably also keep all of this to myself but meh...right now this seems way more relevant than posting on Instagram, waiting for likes to come in. And maybe in their own strange way these times are necessary. Maybe the depths allow for the highs.


I spent the weekend reading Laura Wilson’s Avedon at Work, an account of the six-year journey that resulted in the seminal In the American West exhibit at the Amon Carter Museum. It’s a quick read and an amazing book, recounting how the series came about but more importantly, the encounters and often poignant stories behind Avedon’s subjects. And yes, the portraits were shot with an 8x10 camera and printed life-sized—a first for this sort of work at the time...but in the end, these are people against a roll of white seamless that’s been taped to a wall or a trailer or a truck. In the end it’s one camera, one lens and open shade. What matters is the humanity captured, in a moment. The right moment. One chosen moment out of many.

This is where we weave our tale.

I write this by a window, hoping for rays to stir my atoms. I do love the transformative powers of light...I guess it’s only fitting that it should guide my soul as much as my eye.

P.S. Images shot with the X100F. Old friend...