
Page 1: Generative Adversarial Networks-Based Pseudo-Random Number

Generative Adversarial Networks-Based Pseudo-Random Number Generator for Embedded Processors

Hyunji Kim, Yongbeen Kwon, Minjoo Sim, Sejin Lim, Hwajeong Seo
IT Department, Hansung University, Seoul, Korea

Page 2: Generative Adversarial Networks-Based Pseudo-Random Number

Contents
• Introduction
• Background
• Proposed Method
• Evaluation
• Conclusion

Page 3: Generative Adversarial Networks-Based Pseudo-Random Number

Motivation and Contribution
• Motivation
• Improve the randomness of the previous work.
• Build a Cryptographically Secure Pseudo-Random Number Generator (CSPRNG) for embedded processors.
• Contribution
• Novel GAN-based PRNG (DRBG) mechanism designed for embedded processors.
• High randomness validated through the NIST test suite.

Page 4: Generative Adversarial Networks-Based Pseudo-Random Number

Random Number Generator
• Random Number Generator (RNG)
• Produces a sequence of numbers that cannot be predicted better than by random chance.
• True Random Number Generator (TRNG)
• Must produce unpredictable bits even if every detail of the generator is available.
• Pseudo Random Number Generator (PRNG)
• Deterministic Random Bit Generator (DRBG): generates random numbers by producing a random sequence with a perfect balance between 0's and 1's.

Page 5: Generative Adversarial Networks-Based Pseudo-Random Number

TensorFlow and TensorFlow Lite
• TensorFlow
• Open-source software library for machine learning applications, such as neural networks.
• TensorFlow Lite
• Official framework for running TensorFlow model inference on edge devices.

Page 6: Generative Adversarial Networks-Based Pseudo-Random Number

Edge TPU
• USB-type hardware accelerator.
• ASIC designed to run inference at the edge.
• Supports TensorFlow Lite.
• Small footprint, low power.

Page 7: Generative Adversarial Networks-Based Pseudo-Random Number

Previous GAN-Based PRNG Implementation
• Generator
• Generates random decimal numbers.
• Output range: [0, 2^16 - 1].
• Predictor
• Used as the discriminator; training data is not required.
• Consists of 4 Conv1D layers.

Page 8: Generative Adversarial Networks-Based Pseudo-Random Number

System Configuration - Training & Inference

[Diagram] Training phase: the generator produces a random bit stream from a random seed; the stream is split into two parts (Split_0 and Split_1); the predictor predicts the bit stream that comes after Split_0, and the prediction is compared with Split_1. Inference phase: the trained generator is converted to TensorFlow Lite, then compiled and deployed to the Edge TPU on an IoT device; entropy collected on the device provides the random seed, and inference produces the random bit stream.

Page 9: Generative Adversarial Networks-Based Pseudo-Random Number

The generator model
• n, k are adjustable hyperparameters (see the sketch below the diagram).
• They determine the number of bits to train.
• Sigmoid activation function
• Numbers in the desired range are obtained through bit-wise training (outputs of 0 or 1) instead of training on a specific range of numbers.

[Diagram] Generator architecture: random seed input, four Dense + LeakyReLU blocks, then a Dense + Sigmoid output layer producing the n x k bit matrix.
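To make the layer stack concrete, below is a minimal Keras sketch of a generator with this shape. The seed length, layer width, and the n = k = 16 output size are illustrative assumptions; the slides only fix the Dense + LeakyReLU blocks and the Dense + Sigmoid output.

```python
import tensorflow as tf
from tensorflow.keras import layers

SEED_DIM = 16   # assumed length of the random seed vector
N, K = 16, 16   # n, k: adjustable hyperparameters (output is an n x k bit matrix)

def build_generator(units=128):
    seed = layers.Input(shape=(SEED_DIM,), name="random_seed")
    x = seed
    # Four Dense + LeakyReLU blocks, as drawn on the slide.
    for _ in range(4):
        x = layers.Dense(units)(x)
        x = layers.LeakyReLU()(x)
    # Dense + sigmoid output: each of the n*k values is a soft bit in (0, 1).
    x = layers.Dense(N * K, activation="sigmoid")(x)
    bits = layers.Reshape((N, K), name="bit_matrix")(x)
    return tf.keras.Model(seed, bits, name="generator")

generator = build_generator()
```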

Page 10: Generative Adversarial Networks-Based Pseudo-Random Number

The predictor model
• Split the generated bit stream into 2 parts (see the sketch below):
• split_0: used for training (input to the predictor).
• split_1: used for comparison with the predicted bit stream.

๐‘ ๐‘ ๐‘†๐‘†๐‘†๐‘†๐‘†๐‘†๐‘†๐‘† 1 (๐‘Ÿ๐‘Ÿ โˆ’ 1)๐‘Ÿ๐‘Ÿ

๐‘ ๐‘ ๐‘†๐‘†๐‘†๐‘†๐‘†๐‘†๐‘†๐‘† 0 (๐‘Ÿ๐‘Ÿ โˆ’ 2)1 โ‹ฏ 0

โ‹ฎ โ‹ฎ0 โ‹ฏ 1

1 โ‹ฏ 0

โ‹ฎ0 โ‹ฏ 1

generated bit stream
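As a small illustration, the split can be done row-wise; keeping only the last row in split_1 is an assumption, since the slides do not pin down the exact boundary.

```python
import numpy as np

# Stand-in for one generated n x k bit matrix (n = k = 16 assumed).
bit_stream = np.random.randint(0, 2, size=(16, 16), dtype=np.uint8)

split_0 = bit_stream[:-1, :]   # leading rows: input to the predictor (training)
split_1 = bit_stream[-1:, :]   # remaining row(s): compared with the prediction
```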

Page 11: Generative Adversarial Networks-Based Pseudo-Random Number

The predictor model
• Using an RNN (see the sketch below)
• Time-series analysis with only CNN layers has difficulty capturing interactions as the distance between data points increases.
• An RNN is used because it can predict data that follows a random walk and captures long-term dependencies.
• Loss_P = mean(|split_1 - Res_P|)

[Diagram] Predictor architecture: split_0 is fed through three Conv1D + LeakyReLU blocks, an RNN layer, and a Dense + Sigmoid output that yields the prediction Res_P, which is compared with split_1.
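A minimal Keras sketch of such a predictor follows. The filter count, kernel size, and the choice of SimpleRNN (rather than GRU or LSTM) are assumptions; the slide only specifies three Conv1D + LeakyReLU blocks, an RNN layer, a Dense + Sigmoid head, and a loss of the form Loss_P = mean(|split_1 - Res_P|).

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_predictor(timesteps=15, features=16, filters=64):
    split_0 = layers.Input(shape=(timesteps, features), name="split_0")
    x = split_0
    # Three Conv1D + LeakyReLU blocks over the bit sequence.
    for _ in range(3):
        x = layers.Conv1D(filters, kernel_size=3, padding="same")(x)
        x = layers.LeakyReLU()(x)
    # Recurrent layer to capture long-range structure in the sequence.
    x = layers.SimpleRNN(64)(x)
    res_p = layers.Dense(features, activation="sigmoid", name="Res_P")(x)
    return tf.keras.Model(split_0, res_p, name="predictor")

predictor = build_predictor()
# Loss_P = mean(|split_1 - Res_P|)
predictor.compile(optimizer="adam",
                  loss=lambda split_1, res_p: tf.reduce_mean(tf.abs(split_1 - res_p)))
```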

Page 12: Generative Adversarial Networks-Based Pseudo-Random Number

GAN based PRNG
• Training the generator
• Done through the combined model (see the sketch below the diagram).
• The loss is calculated from split_1 and the prediction Res_P:
  Loss_G = mean(|1 - |split_1 - Res_P|| · 0.5)
• Convert the generated bits to a decimal number:
  p ← Σ_{i=0}^{m+t-1} 2^i · Res_i, num ← p mod n
• The range of the number is determined by setting m and n.

[Diagram] Combined model (Generator + Predictor): the random seed is fed to the generator, the generated bit stream is split into Split_0 and Split_1, the predictor outputs the prediction Res_P, and Loss_G is computed from Split_1 and Res_P.

* … parameter (t), range of the random number (n), the number of bits needed to represent the random number (m)
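The sketch below illustrates the generator loss and the bit-to-integer conversion as reconstructed above; the exact bracketing of Loss_G and the roles of m, t, and n are my reading of the slide, so treat them as assumptions rather than the authors' exact formulation.

```python
import tensorflow as tf

def generator_loss(split_1, res_p):
    # Loss_G = mean(|1 - |split_1 - Res_P|| * 0.5): minimizing this drives the
    # prediction Res_P away from the held-out bits split_1.
    return tf.reduce_mean(tf.abs(1.0 - tf.abs(split_1 - res_p)) * 0.5)

def bits_to_number(bits, m, t, n):
    # p <- sum_{i=0}^{m+t-1} 2^i * bit_i, then num <- p mod n.
    p = sum((1 << i) * int(b) for i, b in enumerate(bits[: m + t]))
    return p % n
```

During this step the predictor's weights would typically be frozen, so that only the generator is updated through the combined model.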

Page 13: Generative Adversarial Networks-Based Pseudo-Random Number

GAN based PRNG for Embedded Processors
• Deploy only the generator model (see the conversion sketch below).
• The predictor is not required to generate the random bit stream.
• Simple architecture for resource-constrained environments.

[Diagram] The trained generator is converted to TensorFlow Lite, then compiled and deployed to the Edge TPU on the IoT device.
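A possible conversion path, following the standard TensorFlow Lite / Edge TPU workflow; the quantization settings, file names, and the final edgetpu_compiler step are the usual recipe rather than details taken from the slides.

```python
import numpy as np
import tensorflow as tf

SEED_DIM = 16  # must match the generator's seed length

# Load the trained generator (path is a placeholder).
generator = tf.keras.models.load_model("trained_generator")

def representative_seeds():
    # Example seeds so the converter can calibrate integer quantization.
    for _ in range(100):
        yield [np.random.rand(1, SEED_DIM).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(generator)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_seeds
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]

with open("generator.tflite", "wb") as f:
    f.write(converter.convert())

# On the host, `edgetpu_compiler generator.tflite` then produces
# generator_edgetpu.tflite, which is deployed to the IoT device.
```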

Page 14: Generative Adversarial Networks-Based Pseudo-Random Number

GAN based PRNG for Embedded Processors
• Entropy for the random seed
• The trained generator is a PRNG with a fixed internal state, so a random seed with sufficiently high entropy is required.
• The entropy is collected from the IoT device (e.g., sensor data); see the inference sketch below.

[Diagram] Inference on the IoT device: entropy collected on the device forms the random seed, which is input to the trained generator (a TensorFlow Lite model running on the Edge TPU); inference produces the random bit stream.
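A sketch of the on-device inference step, assuming the tflite_runtime package and the standard Edge TPU delegate; conditioning the sensor entropy with SHA-256 is an illustrative choice, not the authors' method.

```python
import hashlib
import numpy as np
from tflite_runtime.interpreter import Interpreter, load_delegate

def seed_from_sensors(raw_bytes, dim=16):
    # Condition raw sensor noise into a fixed-length seed vector in [0, 1).
    digest = hashlib.sha256(raw_bytes).digest()
    return (np.frombuffer(digest, dtype=np.uint8)[:dim] / 255.0).astype(np.float32)

interpreter = Interpreter(
    model_path="generator_edgetpu.tflite",
    experimental_delegates=[load_delegate("libedgetpu.so.1")],
)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

seed = seed_from_sensors(b"raw sensor readings go here")  # placeholder entropy
interpreter.set_tensor(inp["index"], seed.reshape(inp["shape"]))
interpreter.invoke()
random_bits = (interpreter.get_tensor(out["index"]) > 0.5).astype(np.uint8)
```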

Page 15: Generative Adversarial Networks-Based Pseudo-Random Number

Comparison with the previous work

Page 16: Generative Adversarial Networks-Based Pseudo-Random Number

Visualization
• After training, the internal state changes.
• The generated bit stream is distributed without a pattern.

Visualization of the random bit stream generated by the generator: before training (left) and after training (right).

Page 17: Generative Adversarial Networks-Based Pseudo-Random Number

NIST SP 800-22: Randomness test for PRNG
• Improving the randomness of the PRNG.
• In the previous work, tests such as Frequency and Cumulative Sums failed because only convolutional layers were used.

Final analysis report of the NIST test suite: previous work (left), this work (right).

Page 18: Generative Adversarial Networks-Based Pseudo-Random Number

NIST SP 800-22: Randomness test for PRNG
• The rate of failed test instances (F_I, %) is reduced by about 1.91%.
• There are no failed p-values (F_P) in this work.
• The rate of failed individual tests (F, %) is reduced by about 2.5%.

Page 19: Generative Adversarial Networks-Based Pseudo-Random Number

Unpredictability for CSPRNG
• Next-bit test
• The (n+1)-st bit cannot be predicted from the preceding n bits.
• The training process corresponds to this test, so if the loss is minimized, the next bit will be unpredictable.

Page 20: Generative Adversarial Networks-Based Pseudo-Random Number

Unpredictability for CSPRNG
• State compromise attack resistance
• If the internal state of the PRNG is known at some point in time, the output after (or before) that point can be predicted.
• Reseed for each batch to ensure resistance (see the sketch below).

๐‘Ÿ๐‘Ÿ๐‘Ÿ๐‘Ÿ๐‘Ÿ๐‘Ÿ๐‘Ÿ๐‘Ÿ๐‘Ÿ๐‘Ÿ๐‘Ÿ๐‘Ÿ๐‘ ๐‘ ๐‘ ๐‘ ๐‘ ๐‘ ๐‘Ÿ๐‘Ÿ

Generator

reseed
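An illustrative generation loop under this reseeding policy; generate_batch() and collect_entropy() are placeholders for the deployed generator and the device's entropy source.

```python
def generate_stream(generate_batch, collect_entropy, batches):
    """Draw a fresh entropy-based seed before every batch of output."""
    stream = []
    for _ in range(batches):
        seed = collect_entropy()          # reseed: new seed per batch
        stream.append(generate_batch(seed))
    return stream
```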

Page 21: Generative Adversarial Networks-Based Pseudo-Random Number

Comparison With Existing PRNGs
• Execution environment
• The PRNGs on desktop: Intel Core i5-8259 CPU x 8, 16GB RAM.
• MPCG64: STM32F4.
• This work: Edge TPU.

Page 22: Generative Adversarial Networks-Based Pseudo-Random Number

Conclusion and Future work
• Conclusion
• GAN-based PRNG (DRBG) for embedded processors.
• High randomness validation through the NIST test suite.
• Future work
• Optimizing to maintain high randomness while being more efficient for resource-constrained environments.
• Applying other GAN models for high randomness and efficiency.
• Designing a lightweight model through pruning.
• Efficient entropy collection.

Page 23: Generative Adversarial Networks-Based Pseudo-Random Number

Thank you for your attention!