Category Archives: Gadgets & Toys

Gardening tip of the day!

This may also be my shortest YouTube video yet!

Now you’ll be able to explain if anybody says, “I think Quentin has finally cracked. I saw him jet-washing his lawn the other day!”

Twice, while I was clearing the driveway, as if by way of defiance, a leaf came down and landed on my head! It occurred to me that if I pointed the lance upwards, I could probably have blasted the last few leaves from the tree and swept them all up together, rather than having to wait for the next gale…

Rockin’, Rollin’, Ridin’

“Model railways”, someone once told me, “are a lot like breasts. They’re meant to be there for the children, but it’s always the men who want to play with them.”

Well, though I’ve always liked and admired them, they’re not something I ever went in for very much myself. Model railways, I mean.

It must be much more fun these days, though, since I gather you can get engines with cameras in them, giving a driver’s-eye view of your carefully-constructed world. I’d love to see one of those in action!

But, lest you should think that model trains are purely frivolous, Tom Scott’s latest video shows that they can have serious uses too.

Now that must have been great fun to build! And, as Tom mentions, and as some of my colleagues in the Computer Lab discovered a few years back, users exhibit a lot more engagement with something if there’s likely to be a physical crash when they get it wrong or lose concentration. Even if that crash only involves a model, it’s a great deal more compelling than a simulator on a screen.

I don’t think, by the way, that I’ve ever seen one of Tom’s YouTube videos that wasn’t worth watching. Subscription definitely recommended.

Audio sometimes preferred!

I had an interesting start to the day. Regular readers of this blog will probably have heard quite enough about webcams and coffee pots, but that’s apparently not true for everybody in the world…

Thirty years ago today, Sir Tim turned on his first public webserver, which means that this is one of the days that people have chosen to label as the 30th anniversary of the Web. As it happens, we’re also not too far from the 30th anniversary of the day when we turned on the Trojan Room coffee pot camera, which would be connected to the web a couple of years later and so become the first webcam.

Anyway, I sometimes get wheeled out as a suitable relic to display from this era, and I had an email yesterday from BBC Radio Cambridgeshire asking if I was willing to be on the Louise Hulland show first thing this morning. I said yes, and they were going to contact me with further details… but I heard nothing more, so presumed it wasn’t going ahead. Until, that is, I emerged from my shower this morning, draped in my dressing gown but dripping slightly, to hear my phone ringing… and answered it only to be dropped into a live interview. However much I like networked video, there are times when audio really is the best medium! Anyway, it’s here, for the record.

Perhaps better is an interview that was actually recorded quite some time ago by Jim Boulton for the Centre for Computing History, but which they first published today as part of the local Web@30 event. In it, I am (a) slightly more compos mentis, since it was recorded later in the day and I had consumed more coffee, and (b) rather better attired.

MeetingBuster and the Christmas Call Diary

There was a period of a few years when I played quite extensively with VOIP, which, for the uninitiated, stands for ‘Voice over Internet Protocol’, sometimes called ‘IP Telephony’. This isn’t about Zoom and Skype and FaceTime, but about traditional phone calls: the things your parents used to make (and maybe still do), often using devices attached to the wall with wires!

It all seems very obvious now, but there was a point between about 20 and 10 years ago when the typical office phone changed from being an audio device plugged into a landline-style connection with analogue voltages talking to a phone exchange, to being something digital that plugged into the ethernet and had an IP address. Telephone calls, hitherto controlled by large national monopolies with expensive proprietary equipment and hideously complex signalling protocols, started to become something ordinary users could manage with their own software, even Open Source software.

Companies that had previously paid vast sums of money to buy or lease a PBX (the ‘Private Branch Exchange’ that gave you internal phone numbers and routed calls to and from external numbers), could now just install software on a cheap PC and route calls to phone handsets over the local network. If you also routed calls over the wider internet, limitations of most broadband connections meant that the quality and reliability left something to be desired, but, as one perceptive observer commented at the time, “The great thing about mobile networks is that they have lowered people’s expectations of telephony to the point where VOIP is a viable solution.”

Phone Phun

And what you could do in an office, you could also do at home, just for fun. I loved this stuff, because in my youth telephony had embodied the quintessence of big faceless corporations: you paid them, they told you what you could and couldn’t do with the socket in your wall, you lived with the one phone number they decided to give you, and could only plug in the equipment that they approved. Any variations on this theme rapidly became very expensive.

With VOIP, however, you could now get multiple phone numbers in your own house and configure how they were handled yourself. I had one number that was registered in Seattle (because I was doing lots of work there), but it rang a phone in my home office in Cambridge — the same one that also had one Cambridge number and one London one — with the calls routed halfway around the world over the internet, basically for free. All of a sudden, you could do things that the Post Office, BT, AT&T, or whoever, would never have let you do in the past. It was fun!

Part of my interest came from the clear parallels between how phone calls were handled in this new world, and the way HTTP requests were handled on the web. I first got involved in telephony with the AT&T Broadband Phone project back in 1999, when my friends and I had to write our own telephony stack based on the new SIP protocol, and build our own custom hardware to connect our SIP network to real-world phone lines.

But, as with the early days of the web, Open Source servers soon emerged so you didn’t have to write your own! The Asterisk and, a little later, FreeSwitch packages were very much analogous to Apache and Nginx in the web world. Calls came in, and you decided what to do with them using a set of configuration rules similar to those that might determine what page or image to return for a particular URL. Voice prompts and keypad button presses were a bit like forms and submit buttons on web pages… and so on.
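To make the analogy concrete, here’s roughly what such a routing rule looks like as a fragment of an Asterisk dialplan — though the context name, channel names and mailbox number below are invented for illustration. An incoming call matches a pattern and then falls through a list of steps, much as a web server’s rules decide what to return for a particular URL:

```ini
; Hypothetical Asterisk dialplan fragment (extensions.conf).
; Channel names (SIP/office1 etc.) and the mailbox number are invented.
[incoming]
exten => s,1,Answer()                          ; answer the incoming call
exten => s,n,Dial(SIP/office1&SIP/office2,20)  ; ring both office phones for 20 seconds
exten => s,n,Voicemail(100@default,u)          ; no answer: send the caller to mailbox 100
exten => s,n,Hangup()
```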

Anyway, there were a couple of quick hacks that I put together at the time which turned out to be rather useful, so if you’re still with me after the history lesson above, I’ll describe them.

The Christmas Call Diary

We were a young startup company, with about half-a-dozen employees, operating primarily out of a garden shed in Cambridge. But we had sold products to real customers who expected a decent level of support. As Christmas approached, we realised that the office was going to be empty for about a fortnight, and started to wonder what would happen if anybody had technical support issues and needed urgent help.

So I set up a shared Google calendar, and asked everyone to volunteer to be available for particular periods of time over the holiday, just in case any customers called; a possibility that was, we hoped, pretty unlikely, but it would improve our reputation no end if somebody did answer. All we had to do was put entries in the calendar that contained our mobile or home number during times when we didn’t mind being disturbed. People valiantly signed up.

We were running a VOIP exchange on an old Dell PC, and I wrote a script to handle incoming calls, which worked like this:

  • When a call comes in, ring all the phones in the office for a short while.
  • If nobody picks up, then look at the special Google Calendar to see if there’s a current entry, and if its contents look like a phone number. If so, then divert the call to that number.
  • If it isn’t answered after a short while, send the caller to our voicemail system, and email the resulting message to all of us.
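The calendar-lookup step in the middle can be sketched in Python. This is only an illustration: the real script fetched entries from the Google Calendar feed, whereas here they’re passed in as simple (start, end, contents) tuples, and the pattern is just a guess at what “looks like a phone number”:

```python
import re
from datetime import datetime

def divert_number(entries, now):
    """Given calendar entries as (start, end, contents) tuples, return a
    phone number from the entry covering 'now', or None to fall through
    to voicemail."""
    for start, end, contents in entries:
        if start <= now <= end:
            # Look for something phone-number-shaped: an optional '+',
            # then at least seven digits (spaces and dashes allowed).
            match = re.search(r"\+?[0-9][0-9 \-]{6,}", contents)
            if match:
                # Strip the separators before handing it to the dialler.
                return re.sub(r"[ \-]", "", match.group())
    return None
```

An entry whose contents don’t contain anything number-like is simply skipped, so a note such as “Away skiing” in the calendar does no harm.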

In the end, I don’t think anybody did call, but the script worked as intended, and allowed us to have a more worry-free Christmas break, which was perhaps its most important achievement!


MeetingBuster

Back in 2006, I registered the domain, and thanks to the wonderful Internet Archive, I can see once again what the front page looked like, which neatly explains its purpose (click if you need a larger image):

A later update allowed you to call MeetingBuster and press a number key within 10 seconds, and your callback would then happen that many tens of minutes later, so pressing ‘3’ just before going into a meeting would give you an option to escape from it after half an hour. (Remember this was all well before the iPhone was released, so all such interactions had to be based on DTMF tones.)
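The digit-to-delay arithmetic amounts to no more than this (the function name is mine, for illustration):

```python
def callback_delay_minutes(dtmf_digit):
    """Map a DTMF key press to a callback delay in minutes:
    pressing '3' means 'call me back in 30 minutes'."""
    if len(dtmf_digit) != 1 or not dtmf_digit.isdigit() or dtmf_digit == "0":
        raise ValueError("expected a single digit 1-9")
    return int(dtmf_digit) * 10
```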

Anyway, MeetingBuster was just for fun, and there are probably better ways to escape from today’s virtual meetings. But if/when we go back to face-to-face meetings again, and you need an excuse to say, “Oh, I’m sorry, I really ought to answer that; do you mind?”, then let me know and perhaps I can revive it!

Eye in the Sky

I had owned my little drone for a while before I discovered one of its cleverer tricks: taking 360 panoramic views. You just put it in the right mode and press the button, and it turns round on the spot taking 26 photos at various pitch angles, then stitches them together. In some ways I find these interactive views more compelling than videos.

This was one of my first: Houghton Mill, on the River Great Ouse. (If you just see a blank space below, I’m afraid you may need to try another browser, and if you get these posts by email, you’ll probably have to view it on the web.)

Or here’s a view of the University’s Computer Lab, where I used to work back in the days when we had physical offices. The big building site opposite is the Physics Department’s new Cavendish Lab (the third of that name), which is also known as the Ray Dolby Centre, since that little button you used to press on your cassette deck is paying for a lot of this:

People who are interested in the West Cambridge Site may want to look at other shots from the same evening. And people who remember when cassette decks started having Dolby C as well as Dolby B may be inspired by the title of this post to hum tunes from the Alan Parsons Project.

There may be more of these to come.

Location, location, location (or, ‘How technology saved me a few hundred quid yesterday’)

Yesterday, I lost my glasses. This is perfectly normal, and happens on a regular basis. One of my roles in life is to provide the opticians of South Cambridgeshire with a healthy and predictable revenue stream. What was much less typical about yesterday, though, was that I found them again!

Despite some of my recent posts, this was nothing to do with Apple AirTags, because I don’t currently have those attached to my spectacles. (I thought I looked rather dashing with them dangling about the ears, and many of the chaps at the Drones agreed, but I noticed that Jeeves had become particularly frosty recently, and found himself unable to extract me from a dashed sticky situation involving Madeline Bassett… but I digress. No AirTags currently adorn your correspondent’s brow.)

Anyway, yesterday afternoon, I was out walking Tilly, in a gentle rain, and I decided to take a photo of the view across the field in the mist. “This shot would be easier to compose”, I thought, “if I were wearing my glasses”, and I reached into the pocket of my coat… to discover that they weren’t there.

“Bother!”, said I, contemplating the last couple of miles that we had walked, the branches I had ducked under and the ditches I had leaped, any of which might be clues to their likely location. But then I remembered I’d received a phone call just a few hundred metres back, and had been able to read the name of the caller with ease, so I must have been wearing them then. I started to retrace my steps, gazing without much hope at the long wet grass, and thinking how easy this would have been if it weren’t for Jeeves.

After a couple of further passes over the relevant stretch, in the fading light, with a bemused (but useless) scent-hound trotting behind, I was about to give up hope, when I suddenly remembered: Hang on a minute! I had taken one other photo! It was after the call, but before I had noticed the glasses were missing. Surely I must have been wearing them then, and removed them afterwards because of the raindrops gathering on them. I bet I dropped the glasses at the location of the photo!

The problem was that it hadn’t really been a great photo; not many distinguishing features in the view. I was along one side of a big field, on a wide path, and I had paid little attention to my surroundings. Looking back at it now, I realise that I could probably have got those two rows of distant trees in the same relative alignment and located my former position quite accurately, but I was trying to view it on a small and decidedly damp screen… without my glasses. I was having enough fun just trying to tap on the right photo.

However, of course, all my photos are geotagged. I spent a while trying to work out how to see its location in the Photos app: not easy, even when you can read the text and make out the icons. (Hint: you need to swipe up from the bottom.) But eventually I found it. Not entirely helpful for precise location.

By tapping random blurry things, I managed to get into satellite view, and that was much better:

However, you have to remember that, at the time, it looked more like this:

And there are quite a lot of bends in the path with big trees beside them.

The Photos app doesn’t show your current location, only the location of the photo. I switched into Google maps, where I could see the moving blue dot, but the trees and crops looked completely different; the photo had been taken at a different time of year, several years before. Not much help.

But Apple Maps, of course, used the same satellite imagery as Apple Photos, and I was able to switch to and fro between the apps as I walked, until the blurry image with a blue dot seemed perfectly aligned with the blurry image with a yellow square.

I looked down, and there, nestled in a tuft of long grass, were my glasses. I had passed them three times that afternoon since dropping them, as had Tilly, who can track a pheasant at a considerable distance, but takes little notice of her master’s most valuable possessions right under her nose.

“At last!”, said she. “Come on! It’s dinner time.” And we scampered off towards the car.

Getting 3-Dimensional

Some quick thoughts after my first couple of days of owning a 3D printer.

Windowizer continued

I’ve had lots of fun comments about The Windowizer. People asked things like:

  • I like the Mac version – do you make one for Windows?
  • Where’s the Mute button?
  • Does it cut you off after 40 mins if you haven’t paid?

…and so on.

Amidst these customer support questions, I’ve been working on a conference-call version to help you communicate with groups of other people, but if there are more than about three or four participants, it becomes a lot less portable, because they also need some scaffolding to appear in the correct layout. Work needed there.

My friend Shaw also sent me this cartoon:

I think the spirit of Heath Robinson is still alive…

The search for warmth

Here’s a nice story on the BBC. I had never really thought about the value of drones for search and rescue, but on the River Foyle they have one equipped with a thermal camera, which can really help find people in the river, especially at night.

Looking down from on high

Yesterday evening, I got a toy that many of my friends and family were surprised a gadget enthusiast like me hadn’t been seduced by many years ago!

And today, I took it for a walk. I’m very pleased to discover that Tilly doesn’t seem at all fazed by the drone, only by the fact that I’m not paying enough attention to her.

It’s a tribute to how good the technology is, that a complete amateur like me can produce a pretty video on the first day. Having the sunshine and a light dusting of snow helped a lot too, though!

Fireworks of the Future

I missed this at the time, but there was a lovely presentation put together for the Hogmanay New Year’s Celebrations in Edinburgh, using a swarm of drones carrying lights.

A short summary video is here:

but it is really worth watching the full presentation, which you can find here, with music and narration.

You don’t want to think too much about the fact that the effect was really only visible from one location, or that they weren’t actually allowed to film the Edinburgh scenes over the city itself, so they had to do it in a remote bit of the Highlands and overlay it on images of the Edinburgh skyline… It’s still a lovely combination of software, hardware and poetry.

P.S. It turns out that there have been quite a few displays of this type in recent months, if you search YouTube for ‘drone light show’. But most of them aren’t narrated by David Tennant 🙂

Activating Home Automations using NFC tags on iOS

Now that I have a shiny new iPhone, I’ve realised that I can finally start playing with NFC tags and, in particular, make them do interesting things around the house by having them trigger actions in my Home Assistant system.

I do already have various Zigbee buttons around the house, and in general these are more convenient, since you can just press them without needing a phone in your hand! There are a couple in the sitting room, for example, which toggle our ‘movie mode’. When movie mode is switched on, the lights in the hall, kitchen and sitting room dim to a low warm glow, any lights that reflect in the TV screen turn off completely, the temperature in the room is raised by a degree or two, and the TV & DVD player switch on. When movie mode is switched off, everything reverts to its previous state. I don’t want to have to pull out a phone to do this; it’s much easier to turn it on and off with a button, or to use voice. “Alexa, it’s movie time!”
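As a rough illustration of how the ‘revert to previous state’ trick works — and this is only a sketch, with invented entity names rather than my actual configuration — a Home Assistant script can snapshot the lights into a scene before changing them, and restore that scene on the way out:

```yaml
# Hypothetical 'movie mode' scripts -- entity names are invented.
script:
  movie_mode_on:
    sequence:
      # Snapshot the lights so movie_mode_off can restore them.
      - service: scene.create
        data:
          scene_id: before_movie
          snapshot_entities:
            - light.hall
            - light.kitchen
            - light.sitting_room
      # Dim everything to a low, warm glow.
      - service: light.turn_on
        target:
          entity_id:
            - light.hall
            - light.kitchen
            - light.sitting_room
        data:
          brightness_pct: 15
          kelvin: 2200
      - service: media_player.turn_on
        target:
          entity_id: media_player.tv
  movie_mode_off:
    sequence:
      # Restore the snapshot taken when movie mode was switched on.
      - service: scene.turn_on
        target:
          entity_id: scene.before_movie
```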

A Xiaomi Zigbee button on the left; one of my NFC tags on the right. The NFC tag is an inch in diameter.

But if you don’t mind pulling out your phone, NFC tags have some key advantages: they’re small, weatherproof, require no battery, and can do more things. You can also arrange for them to do different things depending on who’s scanning them. You could stick one beside your garage door, for example; when you scan it, it unlocks your car; when your spouse scans it, it unlocks theirs; and when anyone else scans it, it does nothing (or perhaps causes your security camera to take a photo of them!)

Some tips

NFC tags each have a fixed unique ID, and for simple interactions you can just arrange that your phone does something when a particular ID is scanned.

But they can also be programmed with custom data using a protocol/format known as NDEF. There are standard ways of storing URLs, phone numbers, etc, much as you would with a QR code. So if you want a tag to take you to a web page, for example, without your phone needing to know anything about the tag in advance, this is a good way to do it.
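As an aside for the curious: the standard NDEF encoding of a URL is pleasingly compact, because a single ‘prefix code’ byte stands in for the common start of the URL. Here’s a sketch in Python of building one short-form URI record, as I understand the NFC Forum’s URI Record Type Definition — actually writing the bytes to a tag is then the job of an app or library:

```python
# A few of the NDEF URI prefix codes (NFC Forum URI Record Type Definition).
URI_PREFIXES = {
    "http://www.": 0x01,
    "https://www.": 0x02,
    "http://": 0x03,
    "https://": 0x04,
}

def ndef_uri_record(url):
    """Build a single short-form NDEF record holding a URL."""
    code, rest = 0x00, url  # 0x00 means 'no abbreviation'
    # Try the longest prefixes first so 'https://www.' beats 'https://'.
    for prefix, c in sorted(URI_PREFIXES.items(), key=lambda p: -len(p[0])):
        if url.startswith(prefix):
            code, rest = c, url[len(prefix):]
            break
    payload = bytes([code]) + rest.encode("utf-8")
    header = bytes([
        0xD1,          # MB|ME|SR flags, TNF=0x01 (well-known type)
        0x01,          # type length: 1 byte
        len(payload),  # payload length (short record: a single byte)
    ])
    return header + b"U" + payload  # record type 'U' = URI
```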

If you want to experiment with this, then the Simply NFC app is a good place to start. Another good and completely free one is NFC TagWriter by NXP, but for the particular issue of reading things with an iPhone, I had more luck with Simply NFC. And a key thing to know if you’re using small tags is that the NFC reader is at the top of the back of your phone near the camera, and this needs to be within about a centimetre of the tag.

Recent iPhones will read a subset of these tag types in the background (i.e. without you having to run an app). As an example, I’ve just programmed a tag here with my email address (a mailto: link), and if I scan it, a notification pops up offering to take me to the Mail app to send a message. I can do this with my iPhone at the home screen, or even the lock screen. More complex email variants, though (for example, ones including a subject line), don’t seem to work without running a special app.

Home Assistant – the simple way, and doing it better

Recent versions of the Home Assistant app know how to program NFC tags, and scan them, and associate them with Home Assistant actions. This is very cool, and gives you lots of information about who’s doing the scanning, etc.

But it has a problem on iOS: Apple doesn’t let an NFC tag perform an action on your phone without your confirmation. So instead of just pulling out your phone and tapping it on the tag, you also need to look for the resulting notification and confirm that you want the action to take place, which spoils the magic a bit. This isn’t an issue, I gather, on Android, but Apple are more cautious about doing things behind your back, especially, I guess, since an NFC tag could be hidden and yet still accidentally scannable.

However, there is one way to allow tags to perform actions on an iPhone without requiring your confirmation each time.

If you create an ‘automation’ on your iPhone using the Shortcuts app (not to be confused with a Home Assistant automation), you can choose to trigger this with an NFC tag.

You don’t need to program the tag: this just uses its ID, I think.

Now, an iPhone automation can do all sorts of things, including requesting a URL. And Home Assistant allows you to create webhooks which can trigger Home Assistant automations in response to a URL being requested.

Setting up a webhook

You can find information on how to create a Home Assistant webhook online, depending on whether you create your automations through the GUI or using YAML. Here’s my simple example called study_toggle, which toggles both ceiling lights in my study:

- alias: Toggle study lights
  trigger:
    - platform: webhook
      webhook_id: study_toggle
  action:
    - service: homeassistant.toggle
      entity_id: light.q_study_back
    - service: homeassistant.toggle
      entity_id: light.q_study_front

I can cause this automation to be run using the URL `/api/webhook/study_toggle` on my Home Assistant server.
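Incidentally, you can test such a webhook from a computer before involving the phone at all. Here’s a sketch using Python’s standard library — the `homeassistant.local:8123` address below is just an example, and you’d use whatever your own server answers to:

```python
from urllib import request

def build_webhook_request(base_url, webhook_id, payload=None):
    """Build a POST request for a Home Assistant webhook.
    Webhooks accept a POST with an (optional) body."""
    url = f"{base_url.rstrip('/')}/api/webhook/{webhook_id}"
    data = payload.encode("utf-8") if payload else b""
    return request.Request(url, data=data, method="POST")

# To actually fire it:
#   request.urlopen(build_webhook_request("http://homeassistant.local:8123",
#                                         "study_toggle"))
```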

NOTE: It’s important to remember that webhooks don’t require authentication, so if your server is at all accessible to the outside world you should make sure you use suitably obscure URLs! Please don’t give one an obvious, guessable name!

Calling the webhook

OK, back to the iPhone. Now, your phone will need to make an HTTP POST request to that URL, but fortunately, this is easy to do. When adding an action to your automation, go into the ‘Web’ section and use ‘Get contents of URL’:

Then you can put in the URL and expand the ‘Show more’ section, which will let you change the HTTP method from GET to POST.

There’s no need to send any data in the request body, but you can add some JSON if you wish to make use of it in Home Assistant.

And that’s basically it! Make sure you turn off the ‘Ask Before Running’ option on the automation.

Now, the first time you scan the tag, it will still ask you for confirmation, but it’ll also give you the option not to be asked in future, at which point you can just tap the tag to run the action. Your phone does need to be unlocked.

Some hints

If you use Nabu Casa’s Home Assistant Cloud, they make it easy to get a long obscure URL which will link to your webhook and which will be accessible from anywhere. (If you set this up on your Mac, you’ll really want your ‘Universal Clipboard’ enabled so you can copy on the Mac and paste on the phone!)

This is handy if you might want to put the tag somewhere away from your home, e.g. if it’s the last thing you scan before you leave the office to notify your spouse that you’re on the way. I’ve also heard of people sticking tags to their car dashboard which will open or close the garage door.

But if you’re only using the tag to control things when you’re actually at home, you’ll make it a lot snappier if you keep everything on your local network and avoid going via lots of proxies; you could even use an IP address to avoid a DNS lookup. So my actual tag to toggle my study lights calls a URL which is something like:

and it’s pretty much instantaneous.

© Copyright Quentin Stafford-Fraser