Still mostly-staying-at-home, still rummaging through the archives.
I rarely delete a photo: what's uninteresting today could be interesting tomorrow and what's unusable today could be rescued by tomorrow's technology.
One technology that has unequivocally improved is stitching, a.k.a. panoramas.
Down the years I've used free tools such as hugin and autostitch - both still active to some degree - but once Lightroom added support I never looked back.
Recently, its stitching dialog saw an intriguing addition: a "fill edges" checkbox.
As promised, "fill edges" eliminates those pesky jagged borders around panoramas: they often contain just sky or grass, which in theory you could dodge and burn, but in practice that means a million fiddly little edits and the result still never looks quite right...so you just crop.
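Adobe doesn't document how "fill edges" works under the hood, but the basic idea can be sketched: treat the empty border as a mask and propagate nearby pixel values into it. Here's a toy diffusion-style fill in Python/numpy — emphatically not Adobe's actual algorithm (which is presumably far smarter), just an illustration of the naive baseline it improves on:

```python
import numpy as np

def fill_edges(image, mask, iterations=50):
    """Naively fill masked (empty) pixels by repeatedly averaging
    their known neighbours -- a crude diffusion-style inpaint.
    `mask` is True where the panorama has no data (the jagged edges)."""
    img = image.astype(float).copy()
    img[mask] = 0.0
    known = ~mask
    for _ in range(iterations):
        # Pad with edge replication so border pixels have neighbours too.
        padded = np.pad(img, 1, mode="edge")
        kpad = np.pad(known.astype(float), 1, mode="edge")
        # Sum of the four axis-aligned neighbours that currently hold
        # data, and a count of how many such neighbours each pixel has.
        nsum = (padded[:-2, 1:-1] * kpad[:-2, 1:-1]
                + padded[2:, 1:-1] * kpad[2:, 1:-1]
                + padded[1:-1, :-2] * kpad[1:-1, :-2]
                + padded[1:-1, 2:] * kpad[1:-1, 2:])
        ncount = (kpad[:-2, 1:-1] + kpad[2:, 1:-1]
                  + kpad[1:-1, :-2] + kpad[1:-1, 2:])
        # Fill any still-empty pixel that touches at least one known one.
        fillable = mask & (ncount > 0)
        img[fillable] = nsum[fillable] / ncount[fillable]
        known |= fillable
        mask = ~known
    return img
```

Averaging like this just smears the surrounding colours inward, which is exactly why the hand-edited results never looked right — and why a content-aware, ML-driven fill is the notable upgrade.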
Maybe some machine learning would help?
Well, it's several years now since Google could reliably detect kittens and I've been wondering when all that would trickle down to Lightroom: here we are.
What I like about this example is that it:
On second glance you'll notice the artifacts in the huge block of former negative space but otherwise it's (finally) how I saw it that afternoon on West 23rd.
Revisiting this set was timely because it's from when I visited New York just prior to relocating here back in 2011.
That week was a test run of how I would happily spend the next ~decade: jazz, theatre, walking the streets, eating and drinking - and some work.
But that routine ended abruptly in March: with little reason to think week 19 is much closer to a return-to-normal than week 1, we've also had to revisit exactly what's still keeping us here.