A Return to Status Blogging

Facebook does that thing where it reminds you of what you posted on a given day. Sometimes I get to be reminded of a period from roughly 2007-2010, what I think of as the golden age of status blogging. I was a heavy identi.ca user, cross-posting to Twitter and Facebook — after all, I’m one person, not several — and engaging in conversations, community, and snark.

I backed the Micro.blog Kickstarter largely out of nostalgia for that period. Yes, I often overshared, and can’t believe some of the crap I posted, but it felt like I was part of a community in a way Twitter never has. Since Manton started rolling out invitations to the hosted service, I’ve been slowly exploring it, uncertain if I’d really re-commit to status blogging. What I’m discovering, though, is that this isn’t just about nostalgia. Instead it’s prompted me to think about what I’m sharing out to the world, where it lives, and how much say I have in where it goes.

The Micro.blog interface is pretty stripped down, but at its core you have the features you’d expect: post, profile, timeline, discover. What makes it different is that it’s also a communication hub: you can configure it to cross-post to Twitter or Facebook from within the profile. Shortly after I started experimenting with it, I also set up IFTTT to pull my posts via RSS and publish them to my personal blog. I’m not sure this is something I want, but it’s an experiment.

My Status Blog Setup

More interesting than the cross-posting to Twitter, though, are the features that make Micro.blog more of a hub than a publisher. Instead of syndicating from Micro.blog to my personal blog, I can configure Micro.blog to consume an RSS feed from my personal site. That brings my status posts from my own site to the budding Micro.blog community, and from there to Twitter. The IndieWeb community has branded this POSSE: Publish on your Own Site, Syndicate Elsewhere.

Exploring the work the IndieWeb community is doing reminds me of my time at Creative Commons, when we were trying to leverage the distributed nature of the web as a feature of our work.

I loved my time on identi.ca, and there’s probably more self-reflection to be done about why I stopped posting. (I logged in as I wrote this and saw that some of my former colleagues are still active, although it still feels like a ghost town to me.) And I’m enjoying exploring Micro.blog, thinking about the tradeoffs between different approaches, and basically taking things much less seriously than I used to.

Work in progress from the studio this weekend, homage to Matisse’s “Seated Pink Nude”. Reduction linocut print with selective inking. Hoping to get it finished this week!
Loving the Van Dyke Brown in the background and Rubine Red; a little Payne’s Gray and we’ll have all my faves.
#wip #linocut #printmaking #relief #matisse #color #layering

As someone who still has 5.25” floppies of Brøderbund’s Print Shop, Print Shop Companion, etc., it’s interesting to think about when my expectations around software upgrades changed.

Building WebExtensions in TypeScript

I spent yesterday evening doing something I haven’t done in a while: tinkering. You may have seen the news that there’s a big change coming in Firefox. The short version: later this year, the old extension model will be retired permanently, and extensions that use it will stop working. As someone with an extension on addons.mozilla.org, I’ve received more than a few emails warning me that it’s about to go dark. This isn’t the first time Mozilla has tried to entice folks to move on from XUL Overlays: Jetpack was a similar effort to get extensions to play better within a sandbox. This time I think it’s going to stick: the performance benefits seem undeniable, and as a developer the prospect of writing a single extension that supports multiple browsers is pretty appealing.

Over a year ago I took a stab at porting OpenAttribute to browser WebExtensions. I read the Firefox code and basically understood it, but only because it was the third iteration of something I’d built. The Chrome code — which should be close to a proper WebExtension — was almost inscrutable to me. So naturally I wanted to start with tests, but a year ago I couldn’t quite make the pieces connect. WebExtensions split your code between the page (content scripts) and the background process: long-running work belongs in the background, and the two sides communicate via message passing. After reading about the coming XUL-pocalypse, I decided to take another run at it.
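For the unfamiliar, that split looks roughly like this. A minimal sketch using Chrome’s callback-style runtime messaging; the file names and message shape are just illustrative, and it assumes the extension API type definitions are installed:

    // content-script.ts: runs in the page and hands work to the background.
    interface ParseRequest {
      type: 'PARSE_PAGE';
      url: string;
    }

    const request: ParseRequest = { type: 'PARSE_PAGE', url: window.location.href };
    chrome.runtime.sendMessage(request, (response) => {
      // The background's reply arrives asynchronously.
      console.log('background replied:', response);
    });

    // background.ts: long-running work lives here, behind a message listener.
    chrome.runtime.onMessage.addListener(
      (message: ParseRequest, sender, sendResponse) => {
        if (message.type === 'PARSE_PAGE') {
          sendResponse({ ok: true, url: message.url });
        }
      }
    );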

Last night, though, I focused on something far smaller: just understanding how to put together a WebExtension using the technologies I’m familiar with — React and Redux — and the one I’m interested in — TypeScript. The result is an extension that doesn’t do much, but it is written in TypeScript, and it does work in both Firefox and Chrome from a single code base.
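The single code base part relies on the fact that Firefox and Chrome expose nearly the same extension API under different globals. A rough sketch of the kind of shim involved; the declarations here are placeholders for real type definitions:

    // browser-api.ts: paper over the Firefox/Chrome naming split.
    // Firefox exposes a `browser` global; Chrome exposes `chrome`.
    // In a real project these types would come from proper definitions.
    declare const browser: any;
    declare const chrome: any;

    // Prefer the standards-track `browser` namespace, fall back to `chrome`.
    export const extensionApi: any =
      typeof browser !== 'undefined' ? browser : chrome;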

The attribution extensions I’ve written have always had a data flow problem. There’s the question of what triggers parsing, where the extracted data is stored, and how you update the display. Not to mention how to do all of that without dragging down overall browser performance. I’ve had good luck with React in other projects: it forces me to think about things more functionally, which makes them easier to test. Does this component do the right thing with the right data? Does this other thing send the right signals with the right input? Cool. But how do you do that across the process boundary between background and content scripts?
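By “functionally” I mean components that are pure functions of their props. A hypothetical example (the props and names are mine, not from the actual extension):

    import * as React from 'react';

    // Hypothetical props for displaying an extracted attribution.
    interface AttributionProps {
      title: string;
      author: string;
      license: string;
    }

    // A pure presentational component: same props in, same markup out,
    // which is what makes it easy to unit test with plain data.
    export const Attribution = ({ title, author, license }: AttributionProps) => (
      <div className="attribution">
        <span>{title}</span> by <span>{author}</span>, licensed {license}
      </div>
    );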

webext-redux is a package that makes it easy to manage a Redux store in both processes and keep it in sync. The only real wrinkle is that the actions you fire on the content side have to be mapped to actions on the background process, which is where the mutations all take place.
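The shape of it, as I understand the docs (the action types, reducer, and file layout here are placeholders, not the actual OpenAttribute code):

    import { createStore, applyMiddleware, AnyAction } from 'redux';
    import { wrapStore, Store, alias } from 'webext-redux';

    // Placeholder reducer; the real one would hold the extension's state.
    const reducer = (state = { parsed: false }, action: AnyAction) => state;

    // background.ts: the real store lives here. Actions fired on the content
    // side arrive by name and get remapped by the alias middleware.
    const aliases = {
      PARSE_PAGE: (action: AnyAction) => ({ ...action, type: 'PARSE_PAGE_IN_BACKGROUND' }),
    };
    const backgroundStore = createStore(reducer, applyMiddleware(alias(aliases)));
    wrapStore(backgroundStore);

    // content-script.ts: a proxy store that stays in sync with the background.
    const proxyStore = new Store();
    proxyStore.ready().then(() => {
      proxyStore.dispatch({ type: 'PARSE_PAGE' }); // remapped by the alias above
    });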

So why TypeScript? I’ve been enjoying ES6 and the changes it brings to JavaScript, but I’ve still missed the types you get in Go, or from mypy in Python. TypeScript is interesting: it’s duck typed, but the ducks seem to quack louder than they do in Python.
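A toy example of what I mean by louder quacking; the compiler checks the shape of the duck before anything runs (all the names here are made up):

    interface Duck {
      quack(): string;
    }

    function poke(d: Duck): string {
      return d.quack();
    }

    // Structural typing: this object was never declared to be a Duck,
    // but it has the right shape, so the compiler accepts it.
    poke({ quack: () => 'quack!' });

    // Unlike Python's runtime duck typing, the wrong shape fails at compile
    // time. Uncommenting this is a type error, not a runtime surprise:
    // poke({ bark: () => 'woof' });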

I was particularly intrigued by ambient modules, which are how TypeScript provides type information for third-party JavaScript libraries you may want to integrate. Luckily type definitions already exist for the web extension API, and it’s easy to write a (useless) one to quell the compiler warnings.
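The useless version really is that short. Here some-untyped-lib stands in for any package that ships without definitions:

    // some-untyped-lib.d.ts
    // The shorthand form: everything imported from the module is typed `any`,
    // which quiets the "could not find a declaration file" error and no more.
    declare module 'some-untyped-lib';

    // A slightly less useless version, sketching one exported function.
    declare module 'some-untyped-lib/parser' {
      export function parse(html: string): object;
    }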

I think the biggest shift I’ve been trying to make is understanding imports. import * as actions from './actions' feels weird to write, and to be honest I’m not sure how it differs from import actions from './actions' when there’s not a default export.
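For what it’s worth, the difference shows up as soon as the compiler weighs in. Given a hypothetical ./actions with only named exports:

    // actions.ts: named exports only, no default export.
    export const parsePage = () => ({ type: 'PARSE_PAGE' });

    // elsewhere.ts
    // The namespace import binds all of the named exports to a single object:
    import * as actions from './actions';
    actions.parsePage(); // ok

    // A default import instead asks for an export named `default`. Since
    // ./actions has none, this fails at compile time (unless interop flags
    // like esModuleInterop synthesize one):
    //
    //   import actions from './actions';
    //   // error: Module '"./actions"' has no default export.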

I like TypeScript enough to try another experiment in the future. The compiler already pointed out a couple of errors that would have been hard to track down.

Up next: figuring out how to test web extensions and build a single code base that runs under Chrome, Firefox, Edge, and Opera.