Rust – A trap to ensnare unwary web crawlers, by Tim McNamara. It generates pathological patterns of input data that slow down naive robots through the sheer volume of processing required, whilst using minimal resources on the server thanks to elegant event-driven code. It's effectively a denial-of-service attack in reverse, designed to overwhelm malicious or thoughtless crawlers of your site. Well-written, robust robot scripts will of course cope with malformed input, but the odds are that any crawler bringing your site to its knees with an unreasonable number of requests won't be a masterpiece of engineering!
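This isn't McNamara's actual code, but the core idea of a crawler tarpit can be sketched in a few lines: serve pages that cost the server almost nothing to generate but flood a naive crawler's to-visit queue with endless fresh URLs (the `/trap/` path and the 50-links-per-page figure here are made up for illustration).

```python
import itertools
import random

def tarpit_pages(seed=0):
    """Endlessly generate cheap-to-make, expensive-to-crawl HTML.

    Each page links to many never-before-seen URLs, so a naive
    breadth-first crawler's frontier grows without bound, while the
    server only ever formats a small string on demand.
    """
    rng = random.Random(seed)
    for page_id in itertools.count():
        links = "".join(
            f'<a href="/trap/{rng.getrandbits(64):x}">more</a>\n'
            for _ in range(50)
        )
        yield f"<html><body><!-- page {page_id} -->\n{links}</body></html>"

# Peek at the first few pages: each is tiny for the server, but each
# one adds 50 previously unseen URLs to a crawler's queue.
pages = list(itertools.islice(tarpit_pages(), 3))
new_urls = sum(page.count('<a href="/trap/') for page in pages)
print(new_urls)  # 150
```

Because the pages come from a lazy generator, the server's cost stays constant per request no matter how deep a crawler wanders into the trap.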
Seeing like a database – Written by another fan of Seeing Like a State, this has a great quote from Jay Owens at the end, noting "the asymmetry of personal data, open for the 99% & deep analytics for the 1%".
HttpBin – Echoes back information about the HTTP requests you send it, including things like headers, data, and forced status codes. I'm just thankful it introduced me to the 418 (I'm a teapot) status code; I can't believe I've been writing web code for so long without checking for that possibility.
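To avoid hammering the real service, here's a self-contained sketch of what HttpBin does, using a throwaway local server: one handler echoes request headers back as JSON (like HttpBin's /headers endpoint) and another forces an arbitrary status code (like /status/418). The handler class and paths are my own stand-ins, not HttpBin's implementation.

```python
import http.server
import json
import threading
import urllib.error
import urllib.request

class EchoHandler(http.server.BaseHTTPRequestHandler):
    """A toy stand-in for httpbin: echo headers, or force a status code."""

    def do_GET(self):
        if self.path.startswith("/status/"):
            # Respond with whatever code the client asked for, e.g. /status/418
            self.send_response(int(self.path.rsplit("/", 1)[1]))
            self.end_headers()
            return
        # Echo the request headers back as JSON, like httpbin's /headers
        body = json.dumps({"headers": dict(self.headers)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo's output quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), EchoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# See our own request headers echoed back to us
req = urllib.request.Request(f"http://127.0.0.1:{port}/headers",
                             headers={"X-Debug": "hello"})
with urllib.request.urlopen(req) as resp:
    echoed = {k.lower(): v for k, v in json.loads(resp.read())["headers"].items()}

# Force the famous 418 "I'm a teapot" response
try:
    urllib.request.urlopen(f"http://127.0.0.1:{port}/status/418")
    status = 200
except urllib.error.HTTPError as err:
    status = err.code

print(echoed["x-debug"], status)
server.shutdown()
```

Swap the local URL for http://httpbin.org and the same client code exercises the real service.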
Drone landscapes, intelligent geotextiles, geographic countermeasures – I'd never realized how deeply adding processing to landscape structures could change our world. This is a compelling exploration of some of the possibilities, and I'm especially struck by what it could mean for a robot-readable world.
An end to bad heir days – The copyright on James Joyce's work finally expired! The enforcement process became a poster child for how the combination of insanely long copyright terms and ornery heirs can derail the enjoyment and exploration of an artist's work. Thankfully scholars are now free to quote Joyce's work and letters, and I've just downloaded A Portrait of the Artist as a Young Man to re-read in celebration.