Everything is muddy at the moment. Some of us just know how to glide above it.
In my post yesterday, I forgot to mention the final twist to my open-air Teams meeting, which made it even more surreal.
Just after pressing the ‘Leave meeting’ button on the app, I walked through our village churchyard and fell into conversation with a gravedigger. No, really. He was filling in a hole, and, leaning on his spade, told me that the heavy clay around here was nothing compared to that around Lavenham. It was a strangely Shakespearian encounter; I half-expected him to bend down, pick up a skull and ask if I recognised it.
After a brief but cheery discussion, I bade him good day and departed, thinking that I should probably have tossed him half a crown for good luck, or something.
Definitely not my typical office meeting, I thought to myself, as Tilly and I walked home debating the whims of Lady Fortune in iambic pentameter.
Every Wednesday afternoon during term, we have a departmental meeting for the senior staff, which used to take place in an efficient but not-very-inspiring and rather windowless room in the Lab. There are typically 50–100 attendees, and so, now that it has moved into the virtual world, we don’t in general use video; most people only turn on their cameras when they’re talking.
Well, this week, a rather wonderful thought occurred to me.
Since this meeting is essentially an audio-only experience, I realised I didn’t need to postpone my dog-walk until after it had finished. Why not do them at the same time? Especially since I was more likely to be in the role of audience than presenter for the duration of this one. Much more efficient.
So I fired up Microsoft Teams on my phone, put it in my jacket breast pocket where I knew the speaker would be clearly audible (since that’s how I normally listen to podcasts and audiobooks), and headed out.
Now, it’s rare for me to say anything good about Teams — actually, it’s rare for anyone to say anything good about Teams, as far as I can see — but on this occasion it performed beautifully: the audio quality was excellent, and the video, when people did turn on their cameras, was very good too, albeit slightly blurred by the raindrops.
At the end of the meeting, as people were saying goodbye, I turned on my camera to reveal that I was in fact wrapped up and squelching through the mud in pursuit of my spaniel, something nobody had been aware of up to that point. And for me, it had been a thoroughly enjoyable meeting. Just imagine what it would be like in sunshine!
Anyway, strongly recommended, if you have the option. Combine your meetings with your daily exercise. Go and watch the rabbits. I promise you it’ll be a more pleasant experience than sitting in your average office meeting room.
And remember, there’s no such thing as bad weather, only inappropriate clothing.
Looking back through my posts about electric vehicles, I came across my brief entry from five years ago, when I got my first electric car. How different things were back then! Those who have seen my more recent posts or YouTube videos will know that I’ve just exchanged my BMW i3 for a Tesla.
It’s perhaps worth mentioning, in case you associate the word Tesla with extraordinary wealth, that this was a Model 3, and, though they are very far from being cheap, they are also about half the price of a Model S or Model X, so if you have figures in mind from old episodes of Top Gear, they might need to be revised downwards a bit! In my case, this — my first-ever brand new car — was bought almost entirely with the combined proceeds of selling a second-hand i3 and a second-hand campervan. Well, third-hand, by the time I sold them!
But I always like trying to live in the future, and the Tesla is several years ahead of most of its competition on almost any metric, especially when you think of it not so much as buying a car but buying into a transport ecosystem; combining an OK car with the best software and the best charging network available. So I took the ridiculous step of buying a brand new car — something that sane people don’t usually do — and of buying a car without a hatchback — something no sane person should do, and certainly no sane person with a dog.
Even after five years of electric driving, though, I thought I was still doing something slightly unusual and pioneering. But it turns out I was mistaken. In December 2020, the Tesla Model 3 was the top-selling car in the UK. No, you didn’t read that wrong: not the top-selling EV, but the top-selling car overall, ahead of the VW Golf and the Ford Fiesta. Here’s the list from the SMMT:
Now, there are all sorts of factors to take into account when interpreting this.
Car sales as a whole were significantly down last year; EV sales, by contrast, tripled their 2019 numbers. It’s worth noting that the Tesla doesn’t appear at all in the top 10 for the year as a whole, though it also topped the chart in April, so this isn’t just a one-off occurrence. And Tesla had a big push at the end of the month because they wanted to hit the magic figure of half a million cars produced globally in 2020, helped on by their new production facilities in Shanghai.
It’s also encouraging to see that the VW ID.3 — another fine vehicle — came in at number 4, so soon after its general release. This no doubt also reduced the Golf numbers significantly.
So the figures need some interpretation, but any street cred I might once have had as an EV pioneer who had to write his own software to interface to his car (e.g. here and here) is clearly long gone. Everybody’s getting ’em.
Now, I can just say that it’s one of the nicest computers I’ve ever driven.
Apparently, lots of people are leaving WhatsApp, or at least looking for alternatives. (So say articles like this and this, at least.) I’ve only rarely used it, since most of my close friends and family are on iMessage and both my work-related groups use Zulip. It’s only the occasional extended-family discussion that ends up on WhatsApp.
But if you’ve missed the story, this is because they changed their Terms of Service recently, and lots of people are shocked to discover that it now says they will share your details — location, phone number, etc — with the rest of the Facebook group.
I actually read, or at least skimmed, the Terms when they came out, and didn’t blink an eye, because I’ve always assumed that’s what they did anyway! I deleted my Facebook account many years ago, but I was aware that they still knew a lot about me because I do still use WhatsApp and Instagram (though only about once a month). Still, that will give them things like my name, phone number and location (from my photos if not from the apps).
In the early days, by the way, WhatsApp traded, as BlackBerry had done before, on the fact that it was secure messaging — encrypted end-to-end at least for one-on-one conversations. My understanding from those who follow these things more closely is that the security services tolerate this because the accounts are so closely tied to phone numbers, which means that, though they can only see metadata, they can get lots of it and related information because of older laws allowing phone-tracing etc. But there may be some people out there who thought that the use of WhatsApp was giving them a decent level of security, in which case this would perhaps be more of a shock.
Anyway, I too now have a Signal account, alongside Telegram, Skype, Messages… and all the others on all my devices. Actually, that was one of the reasons I disliked WhatsApp: the pain of using it on my iPad, desktop and laptop. And who wants to type things on a phone keypad when they have an alternative? You could run clients on those other devices, but (presumably because of the regulatory issues above) they had to be tied to the account running on your phone, and that connection seemed a bit fragile and had to be oft-renewed.
Signal, which I installed last night, works on a similar principle; it’ll be interesting to see whether it does it better! But it looks OK on my iPad; time to go and try it on my Macs… In the meantime, you can find me on Signal, if you know my phone number (like the FBI, GCHQ and Mark Zuckerberg do). If not, they can tell you where to find me.
I missed this at the time, but there was a lovely presentation put together for the Hogmanay New Year’s Celebrations in Edinburgh, using a swarm of drones carrying lights.
A short summary video is here:
but it is really worth watching the full presentation, which you can find here, with music and narration.
You don’t want to think too much about the fact that the effect was really only visible from one location, or that they weren’t actually allowed to film the Edinburgh scenes over the city, so they had to do it in a remote bit of the Highlands and overlay it on images of the Edinburgh skyline… It’s still a lovely combination of software, hardware and poetry.
P.S. It turns out that there have been quite a few displays of this type in recent months, if you search YouTube for ‘drone light show’. But most of them aren’t narrated by David Tennant 🙂
Here are some lovely examples of what can be achieved with a combination of technological prowess and human patience. Denis Shiryaev takes old, noisy, shaky black and white videos, and adds stabilisation, resolution enhancement, facial feature enhancement and some light colourisation. Then he adds sound. This is far from a fully-automatic process: he takes weeks over each one, but without the help of neural networks, it would take months or years if it were even possible.
Here’s a collection of old Lumière Brothers’ films that have had his treatment. Even though, by modern YouTube standards, almost nothing happens in them, I found them surprisingly compelling, yet also calming.
https://www.youtube.com/watch?v=YZuP41ALx_Q
Oh, and this, though less convincing, is also fun:
More information on his YouTube channel and at his new company site.
“Those who can make you believe absurdities, can make you commit atrocities.”
Now that I have a shiny new iPhone, I’ve realised that I can finally start playing with NFC tags, and, in particular, they can do interesting things around the house by making them trigger actions in my Home Assistant system.
I do already have various Zigbee buttons around the house, and in general these are more convenient, since you can just press them without needing a phone in your hand! There are a couple in the sitting room, for example, which toggle our ‘movie mode’. When movie mode is switched on, the lights in the hall, kitchen and sitting room dim to a low warm glow, any lights that reflect in the TV screen turn off completely, the temperature in the room is raised by a degree or two, and the TV & DVD player switch on. When movie mode is switched off, everything reverts to its previous state. I don’t want to have to pull out a phone to do this; it’s much easier to turn it on and off with a button, or to use voice. “Alexa, it’s movie time!”
A Xiaomi Zigbee button on the left; one of my NFC tags on the right. The NFC tag is an inch in diameter.
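Incidentally, for those curious how a ‘movie mode’ toggle might be expressed in Home Assistant YAML, here’s a hypothetical sketch. The entity names are made up, and the snapshot-scene approach is just one way of implementing ‘everything reverts to its previous state’:

```yaml
# Hypothetical sketch — entity names are invented for illustration.
# scene.create snapshots the listed entities, so movie_mode_off can
# restore them all later with a single scene.turn_on.
script:
  movie_mode_on:
    sequence:
      - service: scene.create
        data:
          scene_id: before_movie
          snapshot_entities:
            - light.hall
            - light.kitchen
            - light.sitting_room
            - climate.sitting_room
      - service: light.turn_on
        target:
          entity_id: light.sitting_room
        data:
          brightness_pct: 15
      - service: media_player.turn_on
        target:
          entity_id: media_player.tv
  movie_mode_off:
    sequence:
      - service: scene.turn_on
        target:
          entity_id: scene.before_movie
```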
But if you don’t mind pulling out your phone, NFC tags have some key advantages: they’re small, weatherproof, require no battery and can do more things. You can also arrange that they do different things depending on who’s scanning them, so, for example, you could stick one beside your garage door; when you scan it, it unlocks your car, when your spouse scans it, it unlocks theirs, and when anyone else scans it, it does nothing (or perhaps causes your security camera to take a photo of them!)
NFC tags each have a fixed unique ID, and for simple interactions you can just arrange that your phone does something when a particular ID is scanned.
But they can also be programmed with custom data using a protocol/format known as NDEF. There are standard ways of storing URLs, phone numbers, etc, much as you would with a QR code. So if you want a tag to take you to a web page, for example, without your phone needing to know anything about the tag in advance, this is a good way to do it.
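To make the NDEF idea concrete, here’s a minimal Python sketch of how a single short URI record is laid out at the byte level. (This is illustrative only — the real specification also covers multi-record messages, long records and many other record types.)

```python
# The NFC Forum URI record type ('U') abbreviates common URI prefixes
# to a single byte; these are a few of the standard codes.
URI_PREFIXES = {
    0x01: "http://www.",
    0x02: "https://www.",
    0x03: "http://",
    0x04: "https://",
    0x06: "mailto:",
}

def encode_uri_record(uri: str) -> bytes:
    """Encode one URI as a single short NDEF record."""
    code, rest = 0x00, uri
    for c, prefix in URI_PREFIXES.items():
        # pick the longest matching abbreviation
        if uri.startswith(prefix) and len(prefix) > len(uri) - len(rest):
            code, rest = c, uri[len(prefix):]
    payload = bytes([code]) + rest.encode("utf-8")
    assert len(payload) < 256, "short records hold at most 255 payload bytes"
    # Header byte 0xD1 = MB | ME | SR (short record) | TNF=0x01 (well-known),
    # followed by type length, payload length, type 'U', then the payload.
    return bytes([0xD1, 1, len(payload)]) + b"U" + payload

# e.g. a tag that opens a web page:
msg = encode_uri_record("https://example.com")
```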
If you want to experiment with this, then the Simply NFC app is a good place to start. Another good and completely free one is NFC TagWriter by NXP, but for the particular issue of reading things with an iPhone, I had more luck with Simply NFC. And a key thing to know if you’re using small tags is that the NFC reader is at the top of the back of your phone near the camera, and this needs to be within about a centimetre of the tag.
Recent iPhones will read a subset of these tag types in the background (i.e. without you having to run an app). As an example, I’ve just programmed a tag here with my email address (a mailto: link), and if I scan it, a notification pops up offering to take me to the mail app to send a message. I can do this with my iPhone at the home screen, or even the lock screen. More complex email variants, though (ones including a subject line, for example), don’t seem to work without running a special app.
Recent versions of the Home Assistant app know how to program NFC tags, and scan them, and associate them with Home Assistant actions. This is very cool, and gives you lots of information about who’s doing the scanning, etc.
But it has a problem on iOS: Apple doesn’t let an NFC tag perform an action on your phone without your confirmation. So instead of just pulling out your phone and tapping it on the tag, you also need to look for the resulting notification and confirm that you want the action to take place, which spoils the magic a bit. This isn’t an issue, I gather, on Android, but Apple are more cautious about doing things behind your back, especially, I guess, since an NFC tag could be hidden and yet still accidentally scannable.
However, there is one way to allow tags to perform actions on an iPhone without requiring your confirmation each time.
If you create an ‘automation’ on your iPhone using the Shortcuts app (not to be confused with a Home Assistant automation), you can choose to trigger this with an NFC tag.
You don’t need to program the tag: this just uses its ID, I think.
Now, an iPhone automation can do all sorts of things, including requesting a URL. And Home Assistant allows you to create webhooks which can trigger Home Assistant automations in response to a URL being requested.
You can find information on how to create a Home Assistant webhook online, depending on whether you create your automations through the GUI or using YAML. Here’s my simple example called study_toggle, which toggles both ceiling lights in my study:
- alias: Toggle study lights
  trigger:
    - platform: webhook
      webhook_id: study_toggle
  action:
    - service: homeassistant.toggle
      entity_id: light.q_study_back
    - service: homeassistant.toggle
      entity_id: light.q_study_front
I can cause this automation to be run using the URL ‘/api/webhook/study_toggle’ on my Home Assistant server.
NOTE: It’s important to remember that webhooks don’t require authentication, so if your server is at all accessible to the outside world you should make sure you use more obscure URLs! Please don’t have one called http://homeassistant.me.org/api/webhook/open_garage!
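If you want an unguessable webhook id, the random suffix doesn’t have to be anything clever — here’s one way to generate one in Python (the ‘study_toggle_’ naming is just my convention, not anything Home Assistant requires):

```python
import secrets

# Append ~22 URL-safe random characters to a human-readable name,
# making the webhook URL effectively unguessable.
webhook_id = "study_toggle_" + secrets.token_urlsafe(16)
print(f"/api/webhook/{webhook_id}")
```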
OK, back to the iPhone. Now, your phone will need to make an HTTP POST request to that URL, but fortunately, this is easy to do. When adding an action to your automation, go into the ‘Web’ section and use ‘Get contents of URL’:
Then you can put in the URL and expand the ‘Show more’ section, which will let you change the HTTP method from GET to POST.
There’s no need to send any data in the request body, but you can add some JSON if you wish to make use of it in Home Assistant.
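For reference, here’s a rough Python equivalent of what the Shortcut’s ‘Get contents of URL’ action is doing: an HTTP POST to the webhook URL, with an optional JSON body. (The address and webhook id are just the examples from this post; yours will differ.)

```python
import json
import urllib.request

# The example webhook URL from this post (an assumption — substitute your own).
HOOK_URL = "http://192.168.0.30:8123/api/webhook/study_toggle"

def make_webhook_request(url, payload=None):
    """Build the POST request that triggers a Home Assistant webhook."""
    data = json.dumps(payload).encode("utf-8") if payload is not None else b""
    req = urllib.request.Request(url, data=data, method="POST")
    if payload is not None:
        req.add_header("Content-Type", "application/json")
    return req

# To actually fire it (with Home Assistant reachable):
#   urllib.request.urlopen(make_webhook_request(HOOK_URL))
```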
And that’s basically it! Make sure you turn off the ‘Ask Before Running’ option on the automation.
Now, the first time you scan the tag, it will still ask you for confirmation, but it’ll also give you the option not to be asked in future, at which point you can just tap the tag to run the action. Your phone does need to be unlocked.
If you use Nabu Casa’s Home Assistant Cloud, they make it easy to get a long obscure URL which will link to your webhook and which will be accessible from anywhere. (If you set this up on your Mac, you’ll really want your ‘Universal Clipboard’ enabled so you can copy on the Mac and paste on the phone!)
This is handy if you might want to put the tag somewhere away from your home, e.g. if it’s the last thing you scan before you leave the office to notify your spouse that you’re on the way. I’ve also heard of people sticking tags to their car dashboard which will open or close the garage door.
But if you’re only using the tag to control things when you’re actually at home, you’ll make it a lot more snappy if you keep everything on your local network, don’t go via lots of proxies, and you could even use an IP address to avoid a DNS lookup. So my actual tag to toggle my study lights calls a URL which is something like:
http://192.168.0.30:8123/api/webhook/study_toggle_x65fedwibble
and it’s pretty much instantaneous.
This is the best cartoon I’ve seen over the last few days.
It was one of those shared-on-WhatsApp things, so I’m afraid I don’t know whom to credit.
Christmas breakfast on the edge of the Cairngorms, 2019
Hello Everybody, and Happy New Year! I’ve been doing something very foolish in 2020, and now I’ve stopped.
Let me explain…
This time last year, over the Christmas and New Year period, Rose was visiting her family in the States, so after dropping her at Heathrow, I turned our little campervan around, and headed north, accompanied by my cocker spaniel. The only thing I knew at the time was that we were spending the first night in the Lakes, and that we were probably heading for Scotland. The rest would be decided en route, mostly based on the weather forecast. I’m not sure if the Dark Sky app is often used as a route planner…
Anyway, I recorded quite a large chunk of our journey with my GoPro, and came back with a ridiculous amount of video footage, some of which had technical issues to overcome, and I discovered I had a mammoth editing task on my hands. I feared it could be well into the spring before I was able to share any of it. And then we had a spring unlike any other. So then I hoped that lockdown would give me more time to work on projects like this, but actually 2020 has been really quite a busy year for me, and it was only once we got back towards Christmas again that I was actually able to devote any time to it.
“At least”, I said to myself, “I have to finish it before the end of the year.” And I did! I clicked ‘upload’ on the final episodes just before midnight last night. 🙂
Now, let’s be clear here: You’ll note I say ‘episodes’ above. There are, in fact, nine of them, and that’s after I’d edited out enough material for at least four more! This is perhaps the most extreme let-me-bore-you-with-my-holiday-snaps variant one can come up with, and I don’t expect the average Status-Q reader to be interested in watching one, let alone nine of these little narratives.
An AirBnB for New Year’s Eve, December 2019
The van is visible in the bottom right. Click for a larger version.
Amazingly, though, there are people who will enjoy my holiday snaps! Some are watching already.
Those longing for the open road amidst Covid restrictions, or those planning their next motorhome trip in more normal times, do like to get ideas for their next adventure, or relive the memories of journeys past, and road trip videos are very popular on YouTube. I’ve watched a lot of them, and some were partly responsible for me buying the van in the first place.
That’s before you get into the experiences, hints and tips of the full-time motorhomers: try searching YouTube for ‘van life’ if you want to enter another world.
But, even though producing this has, in some ways, been a burden that I wanted to get off my shoulders for a whole year, it’s also been a joy. Rewatching my holiday several times over means that some of the best bits are burned into my memory; there are sights, sounds and places that I would otherwise have forgotten in a month, and that I’ll now remember for ever.
And, in the unlikely event that you want to experience any of it too, there’s a YouTube playlist, and the journey begins here:
Rose suggested a better rhyme for the old carol:
It works better if you pronounce ‘grass’ the way she does, rather than the way I do!
© Copyright Quentin Stafford-Fraser