**jm + gradient-descent**

I am blown away by this -- given that recurrent neural networks are Turing-complete, with sufficient resources they can actually automate cryptanalysis, at least to the degree of learning to simulate the internal workings of the Enigma algorithm given plaintext, ciphertext and key:

> The model needed to be very large to capture all the Enigma’s transformations. I had success with a single-celled LSTM model with 3000 hidden units. Training involved about a million steps of batched gradient descent: after a few days on a k40 GPU, I was getting 96-97% accuracy!

machine-learning
deep-learning
rnns
enigma
crypto
cryptanalysis
turing
history
gpus
gradient-descent
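To see what the network is up against, the transformation it has to learn can be sketched in pure Python. The rotor wirings below are the historical Enigma I rotors I–III and reflector B; everything else is a deliberate simplification for illustration (odometer-style stepping with no turnover notches or double-step, no plugboard, no ring settings), so this is a toy rotor machine in the Enigma family, not a faithful Enigma:

```python
# Simplified Enigma-style rotor machine. Historical wirings (rotors I-III,
# reflector B); stepping is simplified: plain odometer, no notches,
# no double-step, no plugboard, no ring settings.
from string import ascii_uppercase as ABC

ROTORS = {
    "I":   "EKMFLGDQVZNTOWYHXUSPAIBRCJ",
    "II":  "AJDKSIRUXBLHWTMCQGZNPYFVOE",
    "III": "BDFHJLCPRTXVZNYEIWGAKMUSQO",
}
REFLECTOR_B = "YRUHQSLDPXNGOKMIEBFZCWVJAT"

def encipher(text, rotor_order=("I", "II", "III"), positions=(0, 0, 0)):
    """Encrypt or decrypt `text` -- the same call inverts itself, because
    the reflector makes every per-keypress mapping an involution."""
    pos = list(positions)
    out = []
    for ch in text.upper():
        if ch not in ABC:
            continue
        # step the rightmost rotor on every keypress (simplified odometer)
        pos[2] = (pos[2] + 1) % 26
        if pos[2] == 0:
            pos[1] = (pos[1] + 1) % 26
            if pos[1] == 0:
                pos[0] = (pos[0] + 1) % 26
        c = ABC.index(ch)
        # forward pass: signal enters at the rightmost rotor
        for name, p in zip(reversed(rotor_order), reversed(pos)):
            c = (ABC.index(ROTORS[name][(c + p) % 26]) - p) % 26
        c = ABC.index(REFLECTOR_B[c])
        # return pass: back through each rotor's inverse wiring
        for name, p in zip(rotor_order, pos):
            c = (ROTORS[name].index(ABC[(c + p) % 26]) - p) % 26
        out.append(ABC[c])
    return "".join(out)
```

Running `encipher` twice with the same settings round-trips the plaintext, and no letter ever encrypts to itself -- the reflector-induced property the real Enigma is famous for, and part of the structure a sequence model has to internalize.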
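The "batched gradient descent" in the quote can be illustrated on a much smaller scale. This is a toy linear-regression example, not the author's LSTM setup -- the model, data, and hyperparameters (`w`, `b`, `lr`, `batch_size`) are all hypothetical -- but each loop iteration is the same kind of step the quoted training run performed about a million times:

```python
# Toy batched gradient descent: fit y = 2x + 1 from noisy samples.
# Each step estimates the MSE gradient on a random mini-batch.
import random

random.seed(0)
data = [(x, 2 * x + 1 + random.gauss(0, 0.1))
        for x in (random.uniform(-1, 1) for _ in range(256))]

w, b = 0.0, 0.0           # parameters, initialized at zero
lr, batch_size = 0.1, 32  # hypothetical hyperparameters

def mse(points):
    return sum((w * x + b - y) ** 2 for x, y in points) / len(points)

loss_before = mse(data)
for step in range(500):
    batch = random.sample(data, batch_size)
    # gradient of mean squared error w.r.t. w and b on this batch
    gw = sum(2 * (w * x + b - y) * x for x, y in batch) / batch_size
    gb = sum(2 * (w * x + b - y) for x, y in batch) / batch_size
    w -= lr * gw
    b -= lr * gb
loss_after = mse(data)
```

After 500 such steps `w` and `b` land close to the true values 2 and 1; the quoted run is the same loop shape, just with an LSTM's parameters in place of `w` and `b` and a GPU doing the batch arithmetic.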

27 days ago by jm
