I'm fooling around with a new Swift Cocoa project. The Encoder app does encryption and decryption. I use the phrase "fooling around" deliberately because an important rule of cryptography is to use standard libraries. If you don't really know what you're doing, you certainly shouldn't "roll your own."
Here is a screenshot of the current version of the app.
At the heart of it, we are doing something trivial:

return Zip2Sequence(a1, a2).map { $0 ^ $1 }

where a1 and a2 are binary data, i.e., arrays of integers with values between 0 and 255 (UInt8). The xor operator ^ is just what we want for both encoding and decoding.
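Because xor is its own inverse ((b ^ k) ^ k == b), applying the same operation twice with the same keystream recovers the original. A minimal sketch (the byte values are invented for illustration):

let key: [UInt8] = [13, 7, 255]
let plain: [UInt8] = [72, 105, 33]
let cipher = Zip2Sequence(plain, key).map { $0 ^ $1 }
let decoded = Zip2Sequence(cipher, key).map { $0 ^ $1 }
// decoded == plain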
The trick, of course, is to generate a keystream of pseudo-random numbers given a key, which is a String.
Foundation exposes the standard C library functions for obtaining random numbers: rand, random, arc4random, and arc4random_uniform. arc4random and arc4random_uniform are preferred when you want the best pseudo-randomness, but they lack the ability to be seeded (by the caller). Seeding is needed so that the same pseudo-random keystream sequence can be regenerated, given a relatively short key.
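For example, arc4random_uniform hands back an unbiased value below a given bound, but there is no way to reproduce its stream:

let b = UInt8(arc4random_uniform(256))   // a random byte, different on every run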
The rand function returns an Int32 (try rand() is Int32 in a Swift playground), so in theory it might return both positive and negative values. However, it only seems to return positive numbers, which is good, because, say, -3 % 256 is equal to -3, which would cause trouble when trying to convert to UInt8. According to this (very old) file, the OS X rand uses unsigned values.
The maximum value returned is RAND_MAX, which is equal to 2147483647 = 2^31 - 1, which is also equal to Int32.max. What we want are integers in the interval [0,255], so I just do:

Int(rand()) % 256

A nice property: since rand produces 2^31 possible values and 2^31 divides evenly by 256, taking the result mod 256 introduces no modulo bias (assuming rand itself is uniform over its range).
I'm not an expert, of course, but I looked at the distribution of the output of rand and it seemed sufficiently random for my purposes.
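One quick, informal way to check is to tally a large sample into 256 buckets and compare the counts against the expected value (the seed and sample size here are arbitrary):

srand(12345)
var counts = [Int](count: 256, repeatedValue: 0)
for _ in 0..<1_000_000 {
    counts[Int(rand()) % 256] += 1
}
// each bucket should hold roughly 1_000_000/256 ≈ 3906 values
print(counts.minElement()!, counts.maxElement()!)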
The next question is that of seeding the PRNG. You might do:
srand(UInt32(time(nil)))
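Seeding with the clock gives a different stream on every run. For a keystream we want the opposite: reseeding with the same value must regenerate the same sequence. A quick check in a playground:

srand(42)
let first = (0..<4).map { _ in rand() }
srand(42)
let second = (0..<4).map { _ in rand() }
// first == second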
The seed function srand takes a UInt32. The question is, how do we turn our key into a UInt32? What I've done so far is to use the String property hashValue. This gives an Int, which is 64 bits and, of course, may also be negative.
Here is my approach:
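In outline, it looks something like this (a sketch rather than the project's exact code; the name Keystream and the use of truncatingBitPattern are my choices here):

class Keystream {
    init(key: String) {
        // hashValue is a 64-bit Int and may be negative;
        // keep only the low 32 bits to get a valid UInt32 seed
        let seed = UInt32(truncatingBitPattern: key.hashValue)
        srand(seed)
    }
}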
Having initialized everything, we just call
func next() -> UInt8 {
    // low byte of the next pseudo-random value
    return UInt8(Int(rand()) % 256)
}
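Encoding and decoding are then just a matter of xor-ing each input byte against the keystream, and reseeding rewinds the stream. A sketch, reusing next() from above:

let seed = UInt32(truncatingBitPattern: "my key".hashValue)
let plain: [UInt8] = [72, 105, 33]

srand(seed)
let cipher = plain.map { $0 ^ next() }

srand(seed)   // reseed to rewind the keystream
let decoded = cipher.map { $0 ^ next() }
// decoded == plain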
One issue I'm still working on is that of loading binary data. I know how to turn an [UInt8] into data:

let a: [UInt8] = Array(0..<4)
let data = NSData(bytes: a, length: 4)

and load data from a file with NSData(contentsOfFile: fn), but I haven't been able to figure out how to turn that back into an array of UInt8 in Swift.
I puzzled out a way around the problem, however. String will take NSData in an initializer. Given that, I get the individual characters from the string and use a Dictionary to map each one back to its byte value.
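Something like this (a sketch; the choice of NSISOLatin1StringEncoding, which maps each byte value 0-255 to exactly one character, is my assumption):

import Foundation

let a: [UInt8] = [104, 105]   // "hi"
let data = NSData(bytes: a, length: a.count)

if let s = NSString(data: data, encoding: NSISOLatin1StringEncoding) as? String {
    // table from character back to byte value
    var table = [Character: UInt8]()
    for i in 0..<256 {
        table[Character(UnicodeScalar(UInt32(i)))] = UInt8(i)
    }
    let bytes: [UInt8] = s.characters.map { table[$0]! }
    // bytes == [104, 105]
}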
As usual these days, the project is on github here. Thanks to samol for explaining how to do the gist thing.
UPDATE:
I found a way to do the data conversion mentioned above. It looks like it is probably the natural way to do this in Swift.
We use NSInputStream initialized with the NSData object. We allocate the necessary space (the buffer must be typed [UInt8]; otherwise the literal 0 would make it an array of 64-bit Ints and the read call would not type-check):

var buffer = [UInt8](count: n, repeatedValue: 0)

and then read the data into the buffer, passing it by reference:

stream.read(&buffer, maxLength: n)
The last call returns a result, which is the number of bytes read. Alternatively, one could read byte by byte until stream.hasBytesAvailable returns false.
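Putting the whole round trip together (a sketch, in the Swift 2-era Foundation style used above):

import Foundation

let a: [UInt8] = [0, 1, 2, 3]
let data = NSData(bytes: a, length: a.count)

let n = data.length
var buffer = [UInt8](count: n, repeatedValue: 0)

let stream = NSInputStream(data: data)
stream.open()                                   // must open before reading
let bytesRead = stream.read(&buffer, maxLength: n)
stream.close()
// bytesRead == 4 and buffer == [0, 1, 2, 3]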
The only thing that confused me for a while was that I had Array(0..<n), which gives Int values of 64 bits each. Without mapping each value down to UInt8, the data looks like:
[0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 2,
which I interpret as the groups 0 0 0 0 0 0 0 0, 1 0 0 0 0 0 0 0, 2 0 0 0 0 0 0 0, and so on: eight bytes per Int, shown as decimal values, with little-endian ordering, the low-value byte having the first memory address.
Intel x86 processors store a two-byte integer with the least significant byte first, followed by the most significant byte. This is called little-endian byte ordering.

(From the Apple docs.) If I write the 64-bit data to a file and examine it with hexdump:
> hexdump x.bin
0000000 00 00 00 00 00 00 00 00 01 00 00 00 00 00 00 00