jm + cryptography   11

Attack of the week: searchable encryption and the ever-expanding leakage function
In all seriousness: database encryption has been a controversial subject in our field. I wish I could say that there’s been an actual debate, but it’s more that different researchers have fallen into different camps, and nobody has really had the data to make their case in a compelling way. There have actually been some very personal arguments made about it. The schools of thought are as follows:

The first holds that any kind of database encryption is better than storing records in plaintext and we should stop demanding things be perfect, when the alternative is a world of constant data breaches and sadness.

To me this is a supportable position, given that the current attack model for plaintext databases is something like “copy the database files, or just run a local SELECT * query”, and the threat model for an encrypted database is “gain persistence on the server and run sophisticated statistical attacks.” Most attackers are pretty lazy, so even a weak system is probably better than nothing.

The countervailing school of thought has two points: sometimes the good is much worse than the perfect, particularly if it gives application developers an outsized degree of confidence in the security that their encryption system is going to provide them.

If even the best encryption protocol is only throwing a tiny roadblock in the attacker’s way, why risk this at all? Just let the database community come up with some kind of ROT13 encryption that everyone knows to be crap and stop throwing good research time into a problem that has no good solution.

I don’t really know who is right in this debate. I’m just glad to see we’re getting closer to having it.

(via Jerry Connolly)
cryptography  attacks  encryption  database  crypto  security  storage  ppi  gdpr  search  databases  via:ecksor 
7 days ago by jm
A Guide to Post-Quantum Cryptography
Post-quantum cryptography is an incredibly exciting area of research that has seen an immense amount of growth over the last decade. While the four types of cryptosystems described in this post have received lots of academic attention, none have been approved by NIST and as a result are not recommended for general use yet. Many of the schemes are not performant in their original form, and have been subject to various optimizations that may or may not affect security. Indeed, several attempts to use more space-efficient codes for the McEliece system have been shown to be insecure. As it stands, getting the best security from post-quantum cryptosystems requires a sacrifice of some amount of either space or time. Ring lattice-based cryptography is the most promising avenue of work in terms of flexibility (both signatures and KEM, also fully homomorphic encryption), but the assumptions that it is based on have only been studied intensely for several years. Right now, the safest bet is to use McEliece with Goppa codes since it has withstood several decades of cryptanalysis.
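Here's a minimal KEM round trip sketched with the liboqs Python bindings (the oqs package and the "Classic-McEliece-348864" mechanism name are my assumptions, not something the guide prescribes):

    import oqs

    KEM_NAME = "Classic-McEliece-348864"  # assumed liboqs mechanism name

    with oqs.KeyEncapsulation(KEM_NAME) as receiver:
        # McEliece public keys are enormous: the "space" side of the trade-off
        public_key = receiver.generate_keypair()
        with oqs.KeyEncapsulation(KEM_NAME) as sender:
            ciphertext, secret_tx = sender.encap_secret(public_key)
        secret_rx = receiver.decap_secret(ciphertext)

    assert secret_tx == secret_rx  # both sides now hold the same shared secret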
cryptography  crypto  post-quantum-crypto  pqc  quantum-computing  via:el33th4xor  security  algorithms 
october 2018 by jm
Tahoe-LAFS accidentally loses Bitcoin wallet with loads of donations in it, gets it back
But ECDSA private keys don't trigger the same protective instincts that we'd apply to, say, a bar of gold. One sequence of 256 random bits looks just as worthless as any other. And the cold hard unforgeability of these keys means we can't rely upon other humans to get our money back when we lose them. Plus, we have no experience at all with things that grow in value by four orders of magnitude, without any attention, in just three years.

So we have a cryptocurrency-tool UX task in front of us: to avoid mistakes like the one we made, we must either move these digital assets into solid-feeling physical containers, or retrain our perceptions to attach value to the key strings themselves.
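A toy illustration of that point in Python, using only the standard library; the whole "asset" is 32 random bytes, and nothing about them looks valuable:

    import secrets

    # A secp256k1 private key is essentially 256 random bits (modulo a
    # range check against the curve order, omitted here for brevity).
    private_key = secrets.token_bytes(32)
    print(private_key.hex())  # looks just as worthless as any other hex string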
backups  cryptography  bitcoin  cryptocurrency  ecdsa  private-keys  ux  money 
march 2016 by jm
The Moral Failure of Computer Scientists - The Atlantic
Phillip Rogaway, a professor of CS at UC Davis, contends that computer scientists should stand up against the construction of surveillance states built using their work:
Waddell: In your paper, you compare the debate over nuclear science in the 1950s to the current debate over cryptography. Nuclear weapons are one of the most obvious threats to humanity today — do you think surveillance presents a similar type of danger?

Rogaway: I do. It’s of a different nature, obviously. The threat is more indirect and more subtle. So with nuclear warfare, there was this visually compelling and frightening risk of going up in a mushroom cloud. And with the transition to a state of total surveillance, what we have is just the slow forfeiture of democracy.
ethics  cryptography  crypto  surveillance  politics  phillip-rogaway  morals  speaking-out  government 
december 2015 by jm
ImperialViolet - No, don't enable revocation checking
...because it doesn't stop attacks. Turning it on does nothing but slow things down. You can tell when something is security theater because you need some absurdly specific situation in order for it to be useful.
cryptography  crypto  heartbleed  ssl  security  tls  https  internet  revocation  crls 
april 2014 by jm
Akamai's "Secure Heap" patch wasn't good enough
'Having the private keys inaccessible is a good defense in depth move. For this patch to work you have to make sure all sensitive values are stored in the secure area, not just check that the area looks inaccessible. You can't do that by keeping the private key in the same process. A review by a security engineer would have prevented a false sense of security. A version where the private key and the calculations are in a separate process would be more secure. If you decide to write that version, I'll gladly see if I can break that too.'

Akamai's response: https://blogs.akamai.com/2014/04/heartbleed-update-v3.html -- to their credit, they recognise that they need to take further action.
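For illustration, a stdlib-only Python sketch of the separate-process design the reviewer suggests; HMAC stands in for the RSA private-key operation, so this shows the process boundary rather than anything resembling Akamai's actual patch:

    import hashlib
    import hmac
    import os
    from multiprocessing import Pipe, Process

    def key_holder(conn):
        # The secret exists only in this process; a memory-disclosure bug
        # in the frontend process cannot reach it.
        key = os.urandom(32)
        while True:
            msg = conn.recv()
            if msg is None:
                break
            conn.send(hmac.new(key, msg, hashlib.sha256).digest())

    if __name__ == "__main__":
        parent_end, child_end = Pipe()
        worker = Process(target=key_holder, args=(child_end,))
        worker.start()
        parent_end.send(b"hello")         # the frontend never holds the key
        print(parent_end.recv().hex())
        parent_end.send(None)
        worker.join()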

(via Tony Finch)
via:fanf  cryptography  openssl  heartbleed  akamai  security  ssl  tls 
april 2014 by jm
A looming breakthrough in indistinguishability obfuscation
'The team’s obfuscator works by transforming a computer program into what Sahai calls a “multilinear jigsaw puzzle.” Each piece of the program gets obfuscated by mixing in random elements that are carefully chosen so that if you run the garbled program in the intended way, the randomness cancels out and the pieces fit together to compute the correct output. But if you try to do anything else with the program, the randomness makes each individual puzzle piece look meaningless. This obfuscation scheme is unbreakable, the team showed, provided that a certain newfangled problem about lattices is as hard to solve as the team thinks it is. Time will tell if this assumption is warranted, but the scheme has already resisted several attempts to crack it, and Sahai, Barak and Garg, together with Yael Tauman Kalai of Microsoft Research New England and Omer Paneth of Boston University, have proved that the most natural types of attacks on the system are guaranteed to fail. And the hard lattice problem, though new, is closely related to a family of hard problems that have stood up to testing and are used in practical encryption schemes.'

(via Tony Finch)
obfuscation  cryptography  via:fanf  security  hard-lattice-problem  crypto  science 
february 2014 by jm
How Advanced Is the NSA's Cryptanalysis — And Can We Resist It?
Bruce Schneier's suggestions:
Assuming the hypothetical NSA breakthroughs don’t totally break public-key cryptography — and that’s a very reasonable assumption — it’s pretty easy to stay a few steps ahead of the NSA by using ever-longer keys. We’re already trying to phase out 1024-bit RSA keys in favor of 2048-bit keys. Perhaps we need to jump even further ahead and consider 3072-bit keys. And maybe we should be even more paranoid about elliptic curves and use key lengths above 500 bits.

One last blue-sky possibility: a quantum computer. Quantum computers are still toys in the academic world, but have the theoretical ability to quickly break common public-key algorithms — regardless of key length — and to effectively halve the key length of any symmetric algorithm. I think it extraordinarily unlikely that the NSA has built a quantum computer capable of performing the magnitude of calculation necessary to do this, but it’s possible. The defense is easy, if annoying: stick with symmetric cryptography based on shared secrets, and use 256-bit keys.
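Those parameter choices translate to a few lines with the pyca/cryptography package (the library is my pick, not Schneier's):

    import os
    from cryptography.hazmat.primitives.asymmetric import ec, rsa

    rsa_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)  # beyond 2048-bit
    ec_key = ec.generate_private_key(ec.SECP521R1())  # a curve above 500 bits
    symmetric_key = os.urandom(32)  # 256-bit key for symmetric crypto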
bruce-schneier  cryptography  wired  nsa  surveillance  snooping  gchq  cryptanalysis  crypto  future  key-lengths 
september 2013 by jm
Applied Cryptography, Cryptography Engineering, and how they need to be updated
Whoa, I had no idea my knowledge of crypto was so out of date! For example:
ECC is going to replace RSA within the next 10 years. New systems probably shouldn’t use RSA at all.


This blogpost is full of similar useful guidelines and rules of thumb. Here's hoping I don't need to work on a low-level cryptosystem any time soon, as the risk of screwing it up is always high, but if I do this is a good reference for how it needs to be done nowadays.
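For instance, the "no RSA in new systems" guideline might look like this with pyca/cryptography, using Ed25519 as one modern ECC option (the library and curve choice are mine, not the post's):

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    signing_key = Ed25519PrivateKey.generate()
    message = b"new systems: ECC, not RSA"
    signature = signing_key.sign(message)
    # verify() raises InvalidSignature if the signature doesn't check out
    signing_key.public_key().verify(signature, message)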
thomas-ptacek  crypto  cryptography  coding  design  security  aes  cbc  ctr  ecb  hmac  side-channels  rsa  ecc 
july 2013 by jm
tcpcrypt
opportunistic encryption of TCP connections. not the simplest to set up, though
cryptography  encryption  tcp  security  internet  tcpcrypt  opportunistic  from delicious
august 2010 by jm
