I’ve come up with a fun way to practice written conversations in norsk—by taunting my AI practice partner.
If that sounds like fun, just step behind this curtain and I’ll show you the game.
As with the Ad-hoc Text problem, another technique I use to improve my Norwegian is starting each day by scanning norsk news headlines. But can I make that even easier?
◇◆◇◆◇◆◇◆◇◆◇◆◇◆◇◆◇◆◇
For years I’ve used an RSS aggregator app to pull all my news, blog, and web sources into a single convenient interface that I scan each morning. For a while I added a couple of norsk sources to the mix, but a few wrinkles have made that suboptimal.
Norsk actually has two distinct official written standards - Bokmål and Nynorsk - but none of my aggregator tools let me filter out the one I’m not studying.
Having experienced LingQ’s sentence-by-sentence reading mode, I find that I much prefer it for reading norsk.
I want to be able to filter out articles that don’t interest me - sports, finance, farm reports, fashion, and so on - but my RSS aggregator doesn’t support that kind of filtering.
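Both of those wrinkles boil down to classifying a headline from its words. Here’s a minimal sketch of the idea - the marker words, topic keywords, and function names are my own illustrative guesses, not anything my aggregator (or Frankie) actually uses:

```python
import re

# Illustrative guesses: a handful of function words that differ between
# the two written standards, and a few topic keywords to skip.
NYNORSK_MARKERS = {"ikkje", "eg", "korleis", "noko", "dei", "fleire"}
BOKMAL_MARKERS = {"ikke", "jeg", "hvordan", "noe", "de", "flere"}
SKIP_TOPICS = {"sport", "fotball", "børs", "aksjer", "mote"}  # sports, football, stocks, shares, fashion

def words(text: str) -> set[str]:
    """Lowercase word tokens with punctuation stripped."""
    return set(re.findall(r"\w+", text.lower()))

def looks_like_nynorsk(text: str) -> bool:
    """Crude heuristic: more Nynorsk marker words than Bokmål ones."""
    w = words(text)
    return len(w & NYNORSK_MARKERS) > len(w & BOKMAL_MARKERS)

def keep_headline(headline: str) -> bool:
    """Keep a headline only if it matches no skip topic and reads as Bokmål."""
    w = words(headline)
    return not (w & SKIP_TOPICS) and not looks_like_nynorsk(headline)
```

A real filter would want a much larger marker list (or a proper language-ID library), but even this word-overlap trick separates most headlines, since the two standards diverge on their most common function words.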
Because of this, I wrote my own “headline grabber” script a while back, intending to feed it into the ad-hoc document ingester via the clipboard. Then I realized I could go a step further and build the RSS scanner and filter right into Frankie.
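I haven’t reproduced the real script here, but a stdlib-only sketch of what a headline grabber does - fetch a feed, pull out the item titles - might look like this (the function names are mine, not Frankie’s):

```python
import urllib.request
import xml.etree.ElementTree as ET

def parse_headlines(rss_xml: str) -> list[str]:
    """Extract the <title> text of every <item> in an RSS 2.0 document."""
    root = ET.fromstring(rss_xml)
    return [item.findtext("title", default="").strip()
            for item in root.iter("item")]

def grab_headlines(feed_url: str) -> list[str]:
    """Fetch a feed and return its headlines, ready for filtering."""
    with urllib.request.urlopen(feed_url, timeout=10) as resp:
        return parse_headlines(resp.read().decode("utf-8"))

# headlines = grab_headlines(feed_url)  # then filter, then hand off to Frankie
```

From there, piping the surviving headlines into a note is just string-joining and a clipboard (or, once it’s built in, a direct call).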
Step 1 was simply to use the new Ad-Hoc Note feature, which I’m calling Jot, to create the note by cutting and pasting from the existing headline grabber. Works a treat. Notice the highlighted document.
But even so, executing it directly inside Frankie, at the touch of a button, will be sweeter.
Every language course I’ve ever taken began with how to have a simple conversation, but I don’t think I’ve ever been taught what to do when those conversations break down. And they do break down. All the time. Especially for beginners.
This post recaps a conversation I had with ChatGPT about what I think is a crucial - yet often missing - first lesson in language learning: How to keep conversations moving when the bottom falls out.
I call it The Rip-Cord Protocol.
As I focus more specifically on ear-training, I’m noticing stages of progress in my ability to unpack the noise into recognizable chunks, but how many stages should I expect on this journey? And what do they look like?