Tag Archives: Security

Abandoning my principles

Two quotations occurred to me this morning.  The first was from Edmund Blackadder, talking to Prince George:

“Well, it is so often the way, sir: too late one thinks of what one should have said.

“Sir Thomas More, for instance — burned alive for refusing to recant his Catholicism — must have been kicking himself, as the flames licked higher, that it never occurred to him to say, ‘I recant my Catholicism.’”

Leaving aside for a moment a somewhat rare error on the part of the writers — Thomas More was beheaded, not burned — the topic of when to abandon one’s principles was in my mind, because I was reinstalling WhatsApp on my phone, having deleted it several years ago.

I have written enough here in the past about why I consider Facebook not to be a force for good in the world, and why I think that all of their apps — Facebook, WhatsApp, Instagram, and now presumably Threads — go a step too far in the privacy-infringement arena because, for example, they capture the details not only of the person using the app, but of all their contacts too.  I have a few friends who could be considered celebrities, for example, and, now that I’m running WhatsApp again, their details are on the Meta servers…

Except, of course, that they aren’t… at least, not really because of me.  

I was taking a stand to alert people to what Meta were doing, but it’s clear that most of my friends didn’t really care that much.  Many who actively didn’t like Facebook didn’t realise that WhatsApp and Instagram were owned by the same company.  But lots of them had Gmail accounts, or used Android phones, anyway, so security & privacy weren’t too high up their list of priorities.  And it turns out, of course, that most of them are already on WhatsApp and Facebook and Instagram themselves, so not only were their details already known to the servers, but so were mine, because of them.  My virtuous stance was a bit of an empty gesture. (Besides, I hadn’t been quite as pure in my dedication to the cause as I suggest, because I did still have a rarely-used Instagram account, so all bets were really off anyway.)

And so I am now accessible on WhatsApp again, which will make certain social interactions easier.  I still think Signal is superior in almost every way, and will continue to prefer it, and other such services, where possible.  But in the end, it came down to my second quotation of the day: the famous observation made by Scott McNealy, the CEO of Sun Microsystems, nearly a quarter of a century ago, long before Facebook and its siblings even existed:

“You have zero privacy anyway. Get over it.”

Perhaps he was right. 🙂

Sign of the times: might ChatGPT re-invigorate GPG?

It’s important to keep finding errors in LLM systems like ChatGPT, to remind us that, however eloquent they may be, they actually have very little knowledge of the real world.

A few days ago, I asked ChatGPT to describe the range of blog posts available on Status-Q. As part of the response it told me that ‘the website “statusq.org” was founded in 2017 by journalist and author Ben Hammersley.’ Now, Ben is a splendid fellow, but he’s not me. And this blog has been going a lot longer than that!

I corrected the date and the author, and it apologised. (It seems to be doing that a lot recently.) I asked if it learned when people corrected it, and it said yes. I then asked it my original question again, and it got the author right this time.

Later that afternoon, it told me that StatusQ.org was the personal website of Neil Lawrence.


Neil is also a friend, so I forwarded it to him, complaining of identity theft!

A couple of days later, my friend Nicholas asked a similar question and was informed that “based on publicly available information, I can tell you that Status-Q is the personal blog of Simon Wardley”.  Where is this publicly-available information, I’d like to know!

The moral of the story is not to believe anything you read on the Net, especially if you suspect some kind of AI system may be involved.  Don’t necessarily assume that they’re a tool to make us smarter!

When the web breaks, how will we fix it?

So I was thinking about the whole question of attribution, and ownership of content, when I came across this post, which was written by Fred Wilson way back in the distant AI past (i.e. in December).  An excerpt:

I attended a dinner this past week with USV portfolio founders and one who works in education told us that ChatGPT has effectively ended the essay as a way for teachers to assess student progress. It will be easier for a student to prompt ChatGPT to write the essay than to write it themselves.

It is not just language models that are making huge advances. AIs can produce incredible audio and video as well. I am certain that an AI can produce a podcast or video of me saying something I did not say and would not say. I haven’t seen it yet, but it is inevitable.

So what do we do about this world we are living in where content can be created by machines and ascribed to us?

His solution: we need to sign things cryptographically.

Now this is something that geeks have been able to do for a long time.  You can take a chunk of text (or any data) and produce a signature using a secret key to which only you have access.  If I take the start of this post: the plain text version of everything starting from “It’s important” at the top down to “sign things cryptographically.” in the above paragraph, I can sign it using my GPG private key. This produces a signature which looks like this:

-----BEGIN PGP SIGNATURE-----
iQEzBAEBCgAdFiEENvIIPyk+1P2DhHuDCTKOi/lGS18FAmRJq1oACgkQCTKOi/lG
S1/E8wgAx1LSRLlge7Ymk9Ru5PsEPMUZdH/XLhczSOzsdSrnkDa4nSAdST5Gf7ju
pWKKDNfeEMuiF1nA1nraV7jHU5twUFITSsP2jJm91BllhbBNjjnlCGa9kZxtpqsO
T80Ow/ZEhoLXt6kDD6+2AAqp7eRhVCS4pnDCqayz0r0GPW13X3DprmMpS1bY4FWu
fJZxokpG99kb6J2Ldw6V90Cynufq3evnWpEbZfCkCl8K3xjEwrKqxHQWhxiWyDEv
opHxpV/Q7Vk5VsHZozBdDXSIqawM/HVGPObLCoHMbhIKTUN9qKMYPlP/d8XTTZfi
1nyWI247coxlmKzyq9/3tJkRaCQ/Aw==
=Wmam
-----END PGP SIGNATURE-----

If you were so inclined, you could easily find my corresponding public key online and use it to verify that signature.  What would that tell you?

Well, it would say that I have definitely asserted something about the above text: in this case, I’m asserting that I wrote it.  It wouldn’t tell you whether that was true, but it would tell you two things:

  • It was definitely me making the assertion, because nobody else could produce that signature.  This is partly because nobody else has access to my private key file, and even if they did, using it also requires a password that only I know. So they couldn’t  produce that signature without me. It’s way, way harder than faking my handwritten signature.

  • I definitely had access to that bit of text when I did so, because the signature is generated from it. This is another big improvement on a handwritten signature: if I sign page 6 of a contract and you then go and attach that signature page to a completely new set of pages 1-5, who is to know? Here, the signature is tied to the thing it’s signing.

Now, I could take any bit of text that ChatGPT (or William Shakespeare) had written and sign it too, so this doesn’t actually prove that I wrote it.  

But the key thing is that you can’t do it the other way around: somebody using an AI system could produce a blog post, or a video or audio file which claims to be created by me, but they could never assert that convincingly using a digital signature without my cooperation.  And I wouldn’t sign it. (Unless it was really good, of course.)
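
If you’d like to see the mechanics laid bare, here’s a rough sketch of that sign-and-verify round trip in code. To keep it self-contained I’ve used an Ed25519 key from Python’s cryptography library rather than my actual GPG setup, and the key and the text are made up for illustration; the principle, though, is exactly the same.

    # A sketch of the sign/verify round trip, using an Ed25519 key from Python's
    # 'cryptography' library in place of GPG -- the principle is the same.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    private_key = Ed25519PrivateKey.generate()   # kept secret, like a GPG private key
    public_key = private_key.public_key()        # published, like a GPG public key

    post = b"It's important to keep finding errors in LLM systems..."
    signature = private_key.sign(post)           # only the private key can produce this

    # Anyone holding the public key can check the signature against the exact bytes signed.
    public_key.verify(signature, post)           # raises InvalidSignature on a mismatch
    print("Signature checks out: the key's owner signed exactly this text.")

    # Change a single word and verification fails: the signature is tied to the
    # content, unlike a handwritten signature on a detachable page.
    try:
        public_key.verify(signature, post.replace(b"errors", b"virtues"))
    except InvalidSignature:
        print("Tampered text rejected.")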

Gordon Brander goes into this idea in more detail in a post entitled “LLMs break the internet. Signing everything fixes it.”   The gist is that if I always signed all of my blog posts, then you could at least treat with suspicion anything that claimed to be by me but wasn’t signed.  And that soon, we’ll need to do this in order to separate human-generated content from machine-generated.

A tipping point?

This digital signature technology has been around for decades, and is the behind-the-scenes core of many technologies we all use.  But it’s never been widely, consciously adopted by ordinary computer users.  Enthusiasts have been using it to sign their email messages since the last millennium… but I know few people who do that, outside the confines of security research groups and similar organisations.  For most of us, the tools introduce just a little bit too much friction for the perceived benefits.

But digital identities are quickly becoming more widespread: Estonia has long been way ahead of the curve on this, and other countries are following along.  State-wide public key directories may eventually take us to the point where it becomes a matter of course for us automatically to sign everything we create or approve.

At which point, perhaps I’ll be able to confound those of my friends and colleagues who, according to ChatGPT, keep wanting to pinch the credit for my blog.

Signalling virtue

Dear Reader,

Can I encourage you to try something today? Go to Signal.org and get hold of the Signal messaging app, and/or go to your app store and download Signal for your phone. And while it’s downloading, come back here and I’ll tell you why I’ve become so fond of it, and why you might actually want another messaging app.

To put it in a nutshell, Signal is like WhatsApp but without selling your soul. Imagine what a good time Faust would have had without that awkward business with the Devil, and you get the idea. Well, OK… you don’t quite have to sell your soul to Facebook to use WhatsApp, but you do have to give away your privacy and your friends’ privacy, endure a lot of advertising, and so forth. (More info in an earlier post.)

For Apple users, Signal is rather like Messages, which I also like and use a lot, but you can use Signal with your non-Apple friends too, on all of your, and all of their, devices.

Signal:

  • is well-designed and nice to use.
  • runs on iOS, Android, Windows, Mac and Linux, on phones, tablets and desktops alike.
  • uses proper end-to-end encrypted communications, unlike some alternatives such as Telegram.
  • is Open Source, so if you doubt any aspect of it, you can go and see how it works.
  • is free: supported by grants and donations. No advertisements.
  • allows most of the interactions you expect on a modern messaging service: group chats, sharing files and images, audio and video chat, etc.

Now, of course, it has the problem that all networks initially have: what happens if none of my friends are on it? And yes, that can be an issue, but it’s becoming less so. When I first signed up, I think I knew about three other users. Now, over 100 of my contacts are there, and more arrive every week. When I see them pop up, I send them a quick hello message just to welcome them and let them know I’m here too. It’s a bit like wondering if you’re at the wrong party because you know so few people here, and then over time more and more of your friends walk through the door.

How do you find them? Well, like WhatsApp, Signal works on phone numbers, and when you sign up you have the option to let it scan your contacts list and see if any of them are on Signal too. Unlike Facebook/WhatsApp, however, your contacts’ details aren’t transmitted to the company’s servers and used to build the kind of personal profiles that FB keeps even on people who aren’t members.

Signal instead encrypts (hashes) the phone numbers in your contacts, truncates the hashed form so it can’t be used to recover the full phone number, and sends those truncated versions to its servers; if any of them match the truncated numbers of registered accounts, it sends the encrypted possible matches back to you for your app to check. Security experts will realise that this isn’t perfect either, but it’s so much better than most of the alternatives that you can be much more comfortable doing it. Here’s a page talking about it with a link to more detailed technical descriptions of how they’re trying to make it even more secure. And here’s the source code for all their software in case you don’t trust what they say and want to check it out for yourself.
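
If you like to see ideas in code, here’s a very rough sketch of that hash-and-truncate exchange in Python. It is purely illustrative: the hash function, the truncation length and the shape of the data flow are my own simplifying assumptions, not Signal’s actual contact-discovery implementation.

    # Illustrative sketch only -- not Signal's real contact-discovery code.
    # The hash choice, truncation length and data flow are simplified assumptions.
    import hashlib

    def truncated_hash(phone_number, prefix_bytes=2):
        """Hash a phone number and keep only a short prefix of the digest.

        A short prefix maps many different numbers to the same value, so the
        server sees only a coarse 'bucket' rather than the number itself."""
        return hashlib.sha256(phone_number.encode("utf-8")).digest()[:prefix_bytes]

    # Client: hash and truncate the numbers in your address book, and send those.
    my_contacts = ["+441223123456", "+14155550123"]
    query = {truncated_hash(n) for n in my_contacts}

    # Server (simulated): return registered accounts whose truncated hashes match.
    registered_accounts = ["+14155550123", "+442071838750"]
    possible_matches = [n for n in registered_accounts if truncated_hash(n) in query]

    # Client: discard any false positives introduced by the truncation.
    confirmed = [n for n in possible_matches if n in my_contacts]
    print(confirmed)   # ['+14155550123']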

So in recent months, if I’ve wanted to set up group chat sessions to discuss the care of an elderly relative, or plan a boating holiday with friends, or discuss software development with colleagues in another timezone, I tell people that I disconnected from Facebook a few years back so I don’t do WhatsApp, but have you tried Signal? It’s pretty much the same, with all the bad bits taken out, and works much better on the desktop and on tablets, in my now-rather-dated experience, than WhatsApp ever did.

So give it a try, and if you find that not many friends are there, don’t delete it. Just wait a bit… and tell all your friends about this post, of course!

Keep it secret! Keep it safe!

Some very smart friends of mine have created a rather neat device called EnCloak. It looks and acts just like a normal USB drive, but it can encrypt and decrypt files in cunning ways as you save and retrieve them.

“So what?”, you may say, “There are lots of encrypted storage devices on the market.”

Yes, but this one has some particularly smart attributes, most notably that the hardware just uses standard USB file storage operations, so you don’t need any software or drivers on the machine to make use of it. And if you drop it in the car park and somebody picks it up and plugs it in, they’ll just see a small standard flash drive and won’t even know there are also secret files on it without having the appropriate credentials, let alone be able to read them.

Need to take those super-secret exam questions to the publishing company without wanting to trust any intervening networks? Or keep a backup copy of the things you normally store in your password manager, which you could get at anywhere in future without access to that bit of software? This might be the thing for you.

There are lots of other ways to get encrypted data from place to place, so you may not need this. But hey, the printing company may not know about your GPG keys, and the examination board may not want to install your decryption software, and you know the Feds will get at anything you have in the cloud. If they don’t, Facebook will. Besides, gadgets are fun!

Anyway, they’ve been working on this for quite a while; I saw an early prototype over two years ago, so I can vouch that it worked even back then. Now they’ve just launched a Kickstarter project to fund the initial production run, so you can sign up for one — either for yourself, or to get your Christmas presents sorted out nice and early for your geeky friends!

A Day in the Life of Your Data

This is a nicely-written document from Apple which is intended to give people an idea of the amount of data that can be gathered about them as they go about their normal lives.

It is also, of course, intended to persuade you that it’s a good idea for your phone to run software from Apple, rather than from a company that makes its money from selling data about you. But it’s pretty balanced overall, and might be useful if you have non-technical friends who haven’t considered this stuff.

As a photographer, I have quite a few photo-related apps, and I often give them access to my entire photo library, because I may want to use them to edit any of my images. And even though the article doesn’t highlight this directly, it did make me realise that, by doing so, I’m also giving them access to a great deal of my location history, because all of my photos are geotagged. Something to consider.

iOS tip of the day

When you’re setting up your fingerprints for Touch ID, take the time after registering each one to go in and give it a name. Then if you find one finger becoming unreliable, you know which one to delete and reprogram.

Preparing for the cybercrime of the future

My friend Frank helped organise what looked like a great event at the Computer Lab recently – called Cambridge2Cambridge, it’s a joint initiative between us and MIT, and they’ve done a splendid video about it.

More information here.

Always look on the dark side of life

I love these nihilistic security questions from Soheil Rezayazdi…

[Image: nihilistic security questions]

Thanks to Rory C-J for the link.

Not as secure as it SIMs

If you knew, or cared, anything about the way your mobile phone communicates with the mobile network, you may have believed that your calls were secure and private, at least as far as the core of your provider’s network. They should be, too, if you’re on a 3G or 4G network: the SIM in your phone includes encryption keys known only to it and the mobile provider, and these are used to encode the voice and text traffic so that anyone snooping on the radio signal, or on the backhaul network between the base station and the provider’s headquarters, would not be able to make head or tail of the stream of bytes flowing by. To do so on any scale would need vast amounts of computing power.
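
As a rough illustration of that shared-secret idea (and only an illustration: the real mobile-network ciphers and key derivation are quite different), here is the principle in a few lines of Python using AES-GCM. The SIM and the operator hold the same key, so everyone in between sees only noise.

    # Illustration of the shared-key principle only; the real GSM/3G/4G ciphers
    # and key derivation (Ki, session keys, etc.) are different from this.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    sim_key = AESGCM.generate_key(bit_length=128)  # provisioned into the SIM, kept by the operator
    nonce = os.urandom(12)

    # The phone encrypts its traffic with the key held in the SIM...
    ciphertext = AESGCM(sim_key).encrypt(nonce, b"Hello from my phone", None)

    # ...a snooper on the radio link or the backhaul sees only opaque bytes...
    print(ciphertext.hex())

    # ...and the operator, holding the same key, decrypts it at the far end.
    print(AESGCM(sim_key).decrypt(nonce, ciphertext, None))

    # Stealing the keys in bulk therefore bypasses the cryptography entirely:
    # nobody needs to break the cipher if they already have the key.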

However, if this article in The Intercept, The Great SIM Heist, is correct, the NSA and GCHQ have a much better approach. To quote the article:

Adi Shamir famously asserted: “Cryptography is typically bypassed, not penetrated.” In other words, it is much easier (and sneakier) to open a locked door when you have the key than it is to break down the door using brute force.

So that’s what they allegedly did, according to the latest revelations from Ed Snowden: they hacked into the networks of the SIM card manufacturers, most notably Gemalto, the largest in this field and a supplier to 450 mobile providers around the world, and just stole copies of the keys before they were shipped to the mobile providers. They focused on the activities of employees who used email encryption and those exploring more secure methods of file transfer, since they were more likely to have valuable information to hide.

Perhaps the most shocking thing about these thoroughly illegal activities is that the companies and individuals targeted were not in any way suspected of being engaged in illicit activities. They were innocents going about their daily business, who just happened to have information that was of potential use to the authorities.

Snowden’s information is from 2009/10, so it is to be presumed that this has been going on for some time. Meanwhile, this is what it did to poor old Gemalto’s stock price when the news came out a couple of days ago:

[Chart: Gemalto’s share price]

Learning from the disaster

Most of you have probably heard by now about how the technology reporter Mat Honan’s accounts were hacked and how he lost his Google Mail, his Apple and Amazon accounts, his Twitter account and the contents of his iPhone and laptop. All in under one hour.

What’s fascinating about this story is that we know how it was done: there was no heavy brute-force attack on weakly-encrypted passwords, no SQL injections on his company’s website. The hackers had no animosity towards him; they didn’t know who he was, they just liked his three-letter @mat Twitter ID. In other words, this could easily happen to you too!

If you haven’t heard the story, then I recommend listening to episode 364 of Security Now, which you can get from here or here. The discussion starts 30 mins into the programme.

You should probably listen to this if you, say, use the Internet…

Brand confusion

An elderly colleague turned to me at lunch yesterday.

“Tell me,” he said, “you’re a computer expert… All of these leaks must mean that nobody in government will be able to use email ever again. Just what are the political motivations of an organisation like Wikipedia?”

© Copyright Quentin Stafford-Fraser