breakfast
Posted on February 26th, 2026
A Cooper’s hawk perched outside the living room window, finishing off his meal (another bird). Personally, I prefer to start my day off with a coffee.

We seem to be approaching the 6-month mark since the previous installment so, ladies and gentlemen, without further ado I bring you a bespoke version of the most elegant:
Despite its obvious advances in everything from fashion to medicine, one may perhaps be tempted to dismiss the Toronto of over a hundred years ago as an inconsequential anachronism in the context of the modern metropolis. For example, one may point to the seemingly ubiquitous springtime intestinal troubles experienced by locals, as evidenced by the apparent popularity of certain products that appeared in advertisements of the period.
Pish posh, I say! Can one honestly claim that we don’t have to deal with different types of shit in Toronto every season?
Besides, perhaps their physical ailments were related to the introduction of inferior arsenic and strychnine into various products, or perhaps to the complete absence of such healthifying ingredients in fake products (shame on the flim-flammers!), but I’m certainly no doctor so we can just go ahead and file that under “speculation”. Moreover, such an analysis fails to take into consideration the countering health benefits provided by certain yeasty tablets which, along with vitamins A, B, C, and calcium, contained only the finest and most refined naturally occurring strychnine.
There are, it must be said, many traditional concepts that we should like to dispense with but that have held through to the modern era due to their enduring aptness. Is it for me to say that they’re wrong?
Why, even non-scientific, which is to say artistic, endeavours from bygone years have stood the test of time. Should I claim that my tastes in decorative motifs are the sole and correct ones?
It’s precisely for these reasons that one should occasionally glance to the past and say, “Gee whiz, that sure was something.”

tl;dr — the story’s far from over.
Long version — yup, it’s done.
Let me explain.
Technically, /sectionb was done well over a month ago but I wanted to give it one more thorough read/edit before officially calling it a day. I suspect that an error or incongruence or two snuck in regardless but, at this point, I’ll just have to live with it or them.
That being said, I welcome you to read the first /sectionb novel online, but as someone who spends time ingesting content on an ageing offline tablet, I thought it might be useful to produce a few offline versions as well. You can download them, in full and for free, here:
With my bona fides in place, I wish to assure you that the dedicated /sectionb website will continue to be a place to catch up on updates and to get extra content. I make no promises, but as work on the second part continues you may find some other stuff there too.
Here’s my remix of Brad Turcotte‘s (a.k.a. Brad Sucks) “Feel Free! Plastic Surgery!” from his 2012 album “Guess Who’s a Mess“. Brad’s track is awesome but it doesn’t match my musical sensibilities (abilities?) so you’ll find that it’s a bit of a departure from the original.
This tune is the first of what I hope will become a collection or album of some kind. The idea for the Boreds of Canada name can be heard near the end of my version of the song and, besides Brad, the ditty was loosely inspired by Doxent Zsimond’s “acoustic” rendition (quotes and their contents mine). Rest assured that as soon as I discover other Canadians worthy of my deft acoustic touch I’ll be adding them to the Boreds repertoire.
FEEL FREE! TO DOWNLOAD THE MP3!
(It’s licensed under Creative Commons)
There’s also an instrumental version
(From Toronto to Substack)

About a month ago, IEEE Spectrum magazine published an online piece by Matthew Smith entitled “Your Laptop Isn’t Ready for LLMs. That’s about to change.”
In the article, Matthew laments that “for the average laptop that’s over a year old, the number of useful AI models you can run locally on your PC is close to zero. This laptop might have a four- to eight-core processor (CPU), no dedicated graphics chip (GPU) or neural-processing unit (NPU), and 16 gigabytes of RAM, leaving it underpowered for LLMs.”
🤔 “That’s odd,” I thought to myself. “It sure seems like I’ve been using considerably more than ‘close to zero’ useful models on my setup.”
For comparison, I’m running a dual-core (multi-threaded) system with 128MB integrated Intel UHD graphics, definitely no NPU, and by modern standards a measly 8 gigs of RAM. The machine is about 3 years old and it was already a “budget-friendly” laptop back when I got it. As a gaming machine in 2004 it would’ve been pretty badass. Today, not so much.
Admittedly, most of the models I run locally are not (by modern standards) considered large, but they’re pretty much on par with my daily needs. There appears to be a good variety of minimal desktop models to choose from and, although they’re not all used for interactive chat, even within my limited specs the number of choices is still quite large.
While Matthew makes mention of the Small Language Models that I employ, his only criticism is that these models “either scale back these features or omit them entirely” without actually defining what “these features” are (unless the ginormous size of LLMs is considered a “feature”?).
I’ll grant that generating responses on my hardware is noticeably slower than when using larger (remote) models but that just means that my (fully local) agentic sidekick needs to wake up a bit earlier in the morning in order to complete its high-priority tasks before my first coffee of the day. After that there are plenty of assignments that it can accomplish in the background while I finish another high-quality, fullscreen mission in “Psi-Ops: The Mindgate Conspiracy”.
All told, a 3-to-6 billion parameter model is probably the upper limit for my setup but even then I’ve got some great options like Google’s Gemma, Microsoft’s Phi, or Alibaba’s Qwen. All three come in a variety of quantized flavours that include thinking/reasoning and integrated software tool use.
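If you’re curious what that looks like in practice, here’s a minimal sketch of running one of those quantized models through llama-cpp-python (any similar local runner would do). The GGUF filename and the settings are placeholders for whichever quantization you happen to download, not a prescription:

```python
# Minimal sketch: run a small quantized chat model on a low-end laptop.
# Assumes llama-cpp-python is installed and a ~4-bit GGUF file has already
# been downloaded; the path below is a placeholder.
from llama_cpp import Llama

llm = Llama(
    model_path="models/qwen2.5-3b-instruct-q4_k_m.gguf",  # hypothetical path
    n_ctx=2048,     # modest context window to stay inside 8 GB of RAM
    n_threads=2,    # matches a dual-core CPU
    verbose=False,
)

reply = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize today’s to-do list in one sentence."}],
    max_tokens=128,
)
print(reply["choices"][0]["message"]["content"])
```

Slow on hardware like mine, sure, but it runs, and that’s the whole point.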
If I want to use a model that’s not specifically trained for out-of-the-box tool use, I can provide it with programmatic rules, not unlike how llama.cpp operates.
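To make “programmatic rules” a little more concrete, here’s a rough sketch of the idea: the model is simply told to answer with a small JSON object and the surrounding script does the actual dispatching. The tools and the prompt wording below are hypothetical stand-ins, not part of any particular library:

```python
# Sketch of prompt-driven tool use for a model with no built-in tool training.
# The "system rules" tell the model to emit JSON; Python parses it and runs
# the requested tool. The tools here are illustrative stubs.
import json

TOOLS = {
    "get_weather": lambda city: f"It is -5 C and snowing in {city}.",  # stub
    "read_file": lambda path: open(path, encoding="utf-8").read(),
}

SYSTEM_RULES = (
    "You may call exactly one tool. Reply with JSON only, in the form "
    '{"tool": "<name>", "args": {...}}. Available tools: get_weather(city), '
    "read_file(path)."
)

def dispatch(raw_reply: str) -> str:
    """Parse the model's JSON reply and run the requested tool."""
    call = json.loads(raw_reply)
    return TOOLS[call["tool"]](**call["args"])

# Pretend the model replied with this (in practice it would come from the
# local completion call shown earlier, primed with SYSTEM_RULES):
raw_reply = '{"tool": "get_weather", "args": {"city": "Toronto"}}'
print(dispatch(raw_reply))
```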
Should I need to tighten my resource belt I can hot-swap down to slimmer language models like Liquid AI’s LFM or IBM’s Granite. Additionally, there are many derived and tweaked models available for deeply “underpowered” machines like mine.
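The “hot-swap” is less exotic than it sounds; as a sketch, it can be as simple as checking free memory before deciding which file to load. The model paths and thresholds below are purely illustrative:

```python
# Rough sketch: pick a model file based on currently available RAM.
# Paths and thresholds are illustrative placeholders, not recommendations.
import psutil

# (minimum free GB, model to load), largest first
MODELS = [
    (6.0, "models/phi-3.5-mini-q4.gguf"),
    (3.0, "models/granite-3.0-2b-q4.gguf"),
    (0.0, "models/lfm2-350m-q4.gguf"),
]

def pick_model() -> str:
    """Return the largest model that fits in the RAM currently available."""
    free_gb = psutil.virtual_memory().available / 1024 ** 3
    for min_free_gb, path in MODELS:
        if free_gb >= min_free_gb:
            return path
    return MODELS[-1][1]  # fallback; the 0.0 entry makes this unreachable

print("Loading:", pick_model())
```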
Point being, I think that Mr. Smith got it wrong on this one. Laptops like mine are more than sufficient to run modern (albeit smaller) models. Even geriatric machines and browsers can contribute to the effort, depending on your requirements and your ability to split up the workload.
For example, there are certain tasks, like generative image and video creation, that my setup can’t reasonably handle, but in those cases either I or my agentic buddy can farm the work out to a public interface like Google’s Colab.
There are limits, of course, but fully local agentic natural-language AI, as of late 2025, can definitely help with some of the day’s heavy lifting. In conclusion, Mr. Smith, I must judge your information to be a smidge out of date.
P.S. Regular TCL readers may recall a live example of how even browsers can run (very) limited models.
… in which the mercenaries are subdued, Section B et al. set out for a fateful rendezvous, and the first part of the story is concluded.
Little Norway Park isn’t specifically what I had in mind when I was writing “Brush Pass” but it has a similar feel: damp, deserted, and seemingly benign — until it isn’t.

As of this post, Google Street View’s only capture of Housey Street at approximately this spot is from 2018. One step in the opposite direction and you’re back in 2009. Besides the interesting visual contrasts, I’m curious about why the Street View car appears to be avoiding this street. Consider that there are more recent captures at each of the roads that connect to Housey, yet the small avenue itself hasn’t been fully traversed by Google since 2009 (and even that “traversal” is arguable).
With a little over two months since my last devastating revelation I’d bet that the Raisin Gang thought I’d forgotten about my efforts to expose them to the world. Well, they’re wrong!
Cue the next target: Kye Fox
Living up to his sly title as the “Fox”, Kye has proven to be considerably more elusive than his fellow Gang members. Although he’s occasionally included in the credits of the Gang’s roster of videos, to the best of my knowledge he’s only ever appeared in an “official” Raisin Gang capacity one time:
I know I’ve included this video in a previous post but given the wily nature of this particular individual it’s the best I could muster. At this point I can only speculate but maybe Kye’s hidden-behind-the-scenes strategy has something to do with regret and contrition. This may help to explain why he ended up ostensibly parting ways with the Gang and becoming a certified addiction recovery coach.
Truth be told, I suppose that after witnessing what the Gang was capable of I too may have encountered problems with pain-and-memory-killing drugs. Would I be open to vicariously sharing such a history with the world? Hard to say, but then again I was never a nefarious Gang member.
It’s difficult to estimate exactly how long Kye has donned this phoenix-like persona but according to his LinkedIn profile he’s been “helping people” for a few years. Between the periods of his contemporary “rebirth” and his membership in the Raisin Gang the only reference I was able to find was an old online ad for a seven-bucks-a-pop “1-2-3 Laughter!” show that doesn’t even mention any of the Gang members by name. Not sure why it popped up in the search results, and maybe the timeline’s a bit wobbly, but all of it spells “evasive” to my mind.
Other than what appears to be Kye’s reading list there’s little else that I was able to find but you may be able to extrapolate more information about Mr. Fox from his linktr.ee profile. Nevertheless, I would caution that what you see may not necessarily be what you get. Call me crazy but I believe that a fox never changes his stripes.
So what’s it to be, Raisin Gang? Are you prepared for more public infamy, shaming, and anthropomorphising, or are you ready to come clean?
My patience is wearing as thin as a molting coat.