pig-monkey.com - cryptohttps://pig-monkey.com/2024-06-28T18:47:31-07:00YubiKey Replacement2024-06-28T00:00:00-07:002024-06-28T18:47:31-07:00Pig Monkeytag:pig-monkey.com,2024-06-28:/2024/06/yubikey-replacement/<p>Since I <a href="/2015/05/key/">began using a YubiKey for PGP operations in 2015</a>, I’ve always kept a spare <a href="https://www.yubico.com/">YubiKey</a> locked away with <a href="/2017/06/armory/">my USB Armory</a>, in case the one on <a href="/2013/08/keychain/">my keychain</a> failed. While performing my <a href="/2018/05/key-renewal/">annual key renewal</a> this month I decided it was time to switch to the spare YubiKey. My old one still works, but it often takes a few attempts to read.</p>
<p><a href="https://www.flickr.com/photos/pigmonkey/53821667807/in/dateposted/" title="YubiKey NEO"><img src="https://live.staticflickr.com/65535/53821667807_ef3be6f90b_c.jpg" width="800" height="600" alt="YubiKey NEO"/></a></p>
<p>Both YubiKeys are 9 years old. But one has spent those 9 years locked away, while the other spent every day of those 9 years in my pocket (and saw repeated use on most of those days). The new one always works on the first attempt, and it fits into USB ports with a comforting amount of friction. The old one has been worn down so much that it often just falls out of ports if it isn’t being held in. (My calipers measure the front contact area of the old YubiKey at 2.26mm thick, where the new one is 2.40mm.) I’m glad to know that YubiKeys can reliably work for nigh a decade, but next time maybe I’ll start to think about replacing this one after around 5 years of EDC rather than 10.</p>
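<p>Switching GnuPG over to the spare was the only software step. If the agent is still fixated on the old card’s serial number, forcing it to re-learn the currently inserted card usually sorts it out (a sketch; <code>gpg-connect-agent</code> ships with GnuPG):</p>

```shell
$ gpg-connect-agent "scd serialno" "learn --force" /bye
```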
<p>I was pleasantly surprised to discover that modern versions of GnuPG are happy to use different cards for the same key, so you no longer need to <a href="https://security.stackexchange.com/a/258837">delete keygrip files when switching cards</a>.</p>YubiKey Cleaning2020-09-13T00:00:00-07:002020-09-13T11:53:44-07:00Pig Monkeytag:pig-monkey.com,2020-09-13:/2020/09/yubikey-cleaning/<p>I’ve carried the same <a href="https://support.yubico.com/support/solutions/articles/15000006494-yubikey-neo">YubiKey NEO</a> on <a href="/2013/08/keychain/">my keychain</a> for five years. On average it gets used dozens of times per day, via USB, as an <a href="https://en.wikipedia.org/wiki/OpenPGP_card">OpenPGP card</a>. The YubiKey looks a little worse for wear, but it almost always works flawlessly.</p>
<p>Occasionally, it requires a few insertions to be read. When this happens I clean the contacts by rubbing them gently with a <a href="https://www.pentel.com/collections/erasers/products/clic-eraser-grip">Pentel Clic Eraser</a>, wiping off the dust, spraying them with <a href="/2020/01/hid-cleaning/">isopropyl alcohol</a>, and then wiping them dry. Afterwards, the YubiKey is registered immediately on the first insert. I perform this procedure about once or twice per year.</p>
<p><a href="https://www.flickr.com/photos/pigmonkey/50338458991/in/dateposted/" title="YubiKey Cleaning"><img src="https://live.staticflickr.com/65535/50338458991_98162acffc_c.jpg" width="800" height="533" alt="YubiKey Cleaning"></a></p>
<p>Using the eraser is <a href="https://electronics.stackexchange.com/a/169030">potentially dangerous</a>, but I’ve had good luck with it over the years. The white vinyl in the Pentel Clic feels very smooth compared to the abrasiveness of the rubber found on the tops of most pencils.</p>Optical Backups of Financial Archives2019-06-29T00:00:00-07:002019-06-29T14:49:50-07:00Pig Monkeytag:pig-monkey.com,2019-06-29:/2019/06/optical-financal-backups/<p>Every year I burn an optical archive of my financial documents, similar to how (and why) I <a href="/2013/05/optical-photo-backups/">create optical backups of photos</a>. I schedule this financial archive for the spring, after the previous year’s taxes have been submitted and accepted. <a href="https://taskwarrior.org/">Taskwarrior</a> solves the problem of remembering to complete the archive.</p>
<div class="highlight"><pre><span></span><code>$ task add project:finance due:2019-04-30 recur:yearly wait:due-4weeks <span class="s2">"burn optical financial archive with parity"</span>
</code></pre></div>
<p>The archive includes two <a href="https://git-annex.branchable.com/">git-annex</a> repositories.</p>
<p>The first is my <a href="https://www.ledger-cli.org/">ledger</a> repository. Ledger is the double-entry accounting system I began using in 2012 to record the movement of every penny that crosses one of my bank accounts (small cash transactions, less than about $20, are usually-but-not-always exempt from being recorded). In addition to the plain-text ledger files, this repository also holds PDF or JPG images of receipts.</p>
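<p>For reference, a ledger entry is just a few lines of plain text in the journal format (the accounts and amounts here are hypothetical):</p>

```text
2019/06/28 Grocery Store
    Expenses:Food:Groceries        $42.17
    Assets:Checking
```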
<p>The second repository holds my tax information. Each tax year gets a <a href="https://git.zx2c4.com/ctmg/about/">ctmg</a> container which contains any documents used to complete my tax returns, the returns themselves, and any notifications of those returns being accepted.</p>
<p>The yearly optical archive that I create holds the entirety of these two repositories – not just the information from the previous year – so really each disc only needs to have a shelf life of 12 months. Keeping the older discs around just provides redundancy for prior years.</p>
<h2>Creating the Archive</h2>
<p>The process of creating the archive is very similar to the process I outlined six years ago for the photo archives.</p>
<p>The two repositories, combined, are about 2GB (most of that is the directory of receipts from the ledger repository). I burn these to a 25GB BD-R disc, so file size is not a concern. I’ll <code>tar</code> them, but skip any compression, which would just add extra complexity for no gain.</p>
<div class="highlight"><pre><span></span><code>$ mkdir ~/tmp/archive
$ <span class="nb">cd</span> ~/library
$ tar cvf ~/tmp/archive/ledger.tar ledger
$ tar cvf ~/tmp/archive/tax.tar tax
</code></pre></div>
<p>The ledger archive will get signed and encrypted with my PGP key. The contents of the tax repository are already encrypted, so I’ll skip encryption and just sign the archive. I like using detached signatures for this.</p>
<div class="highlight"><pre><span></span><code>$ <span class="nb">cd</span> ~/tmp/archive
$ gpg -e -r peter@havenaut.net -o ledger.tar.gpg ledger.tar
$ gpg -bo ledger.tar.gpg.sig ledger.tar.gpg
$ gpg -bo tax.tar.sig tax.tar
$ rm ledger.tar
</code></pre></div>
<p>Previously, when creating optical photo archives, I used <a href="https://web.archive.org/web/20160427222800/http://dvdisaster.net/en/index.html">DVDisaster</a> to create the disc image with parity. DVDisaster no longer exists. The code can still be found, and the program still works, but nobody is developing it and it doesn’t even have an official web presence. This makes me uncomfortable for a tool that is part of my long-term archiving plans. As a result, I’ve moved back to using <a href="https://parchive.github.io/">Parchive</a> for parity. Parchive also does not have much in the way of active development around it, but it <a href="https://github.com/Parchive/par2cmdline/commits/master">is still maintained</a>, has been around for a long time, is still used by a wide community, and will probably continue to exist as long as people share files on less-than-perfectly-reliable media.</p>
<p>As previously mentioned, I’m not worried about the storage space for these files, so I tell <code>par2create</code> to create PAR2 files with 30% redundancy. I suppose I could go even higher, but 30% seems like a good number. By default this process will be allowed to use 16MB of memory, which is cute, but RAM is cheap and I usually have enough to spare so I’ll give it permission to use up to 8GB.</p>
<div class="highlight"><pre><span></span><code>$ par2create -r30 -m8000 recovery.par2 *
</code></pre></div>
<p>Next I’ll use <a href="http://md5deep.sourceforge.net/">hashdeep</a> to generate message digests for all the files in the archive.</p>
<div class="highlight"><pre><span></span><code>$ hashdeep * > hashes
</code></pre></div>
<p>At this point all the file processing is completed. I’ll put a blank disc in my burner (a <a href="https://pioneerelectronics.com/PUSA/Computer/Computer+Drives/BDR-XD05B">Pioneer BDR-XD05B</a>) and burn the directory using <a href="http://fy.chalmers.se/~appro/linux/DVD+RW/">growisofs</a>.</p>
<div class="highlight"><pre><span></span><code>$ growisofs -Z /dev/sr0 -V <span class="s2">"Finances 2019"</span> -r *
</code></pre></div>
<h2>Verification</h2>
<p>The final step is to verify the disc. I have a few options on this front. These are the same steps I’d take years down the road if I actually needed to recover data from the archive.</p>
<p>I can use the previous hashes to find any files that do not match, which is a quick way to identify bit rot.</p>
<div class="highlight"><pre><span></span><code>$ hashdeep -x -k hashes *.<span class="o">{</span>gpg,tar,sig,par2<span class="o">}</span>
</code></pre></div>
<p>I can check the integrity of the PGP signatures.</p>
<div class="highlight"><pre><span></span><code>$ gpg --verify ledger.tar.gpg<span class="o">{</span>.sig,<span class="o">}</span>
$ gpg --verify tax.tar<span class="o">{</span>.sig,<span class="o">}</span>
</code></pre></div>
<p>I can use the PAR2 files to verify the original data files.</p>
<div class="highlight"><pre><span></span><code>$ par2 verify recovery.par2
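<span class="c1"># and, if verification reports damage, the same parity data can attempt a repair</span>
$ par2 repair recovery.par2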
</code></pre></div>PGP Key Renewal2018-05-06T00:00:00-07:002018-11-22T12:50:54-08:00Pig Monkeytag:pig-monkey.com,2018-05-06:/2018/05/key-renewal/<p>Last year I demonstrated <a href="/2017/06/armory/">setting up the USB Armory for PGP key management</a>. The two management operations I perform on the Armory are key signing and key renewal. I set my keys to expire each year, so that each year I need to confirm that I am not dead, still control the keys, and still consider them trustworthy.</p>
<p>After booting up the Armory, I first verify that NTP is disabled and set the current UTC date and time. Time is critical for any cryptography operations, and the Armory has no battery to maintain a clock.</p>
<div class="highlight"><pre><span></span><code>$ timedatectl set-ntp <span class="nb">false</span>
$ timedatectl set-time <span class="s2">"yyyy-mm-dd hh:mm:ss"</span>
</code></pre></div>
<p>My keys are stored on an encrypted microSD card, which I mount and decrypt.</p>
<div class="highlight"><pre><span></span><code>$ mkdir /mnt/sdcard
$ cryptsetup luksOpen /dev/sda sdcrypt
$ mount /dev/mapper/sdcrypt /mnt/sdcard
</code></pre></div>
<p>Next I’ll export a few environment variables to make things less redundant later on.</p>
<div class="highlight"><pre><span></span><code>$ <span class="nb">export</span> <span class="nv">YEAR</span><span class="o">=</span><span class="k">$(</span>date +%Y<span class="k">)</span>
$ <span class="nb">export</span> <span class="nv">PREVYEAR</span><span class="o">=</span><span class="k">$((</span><span class="nv">$YEAR</span><span class="o">-</span><span class="m">1</span><span class="k">))</span>
$ <span class="nb">export</span> <span class="nv">GNUPGHOME</span><span class="o">=</span><span class="s2">"/mnt/sdcard/gpg/</span><span class="nv">$YEAR</span><span class="s2">-renewal/.gnupg"</span>
$ <span class="nb">export</span> <span class="nv">KEYID</span><span class="o">=</span><span class="s2">"0x70B220FF8D2ACF29"</span>
</code></pre></div>
<p>I perform each renewal in a directory specific to the current year, but the <code>GNUPGHOME</code> directory I set for this year’s renewal doesn’t exist yet. Better create it.</p>
<div class="highlight"><pre><span></span><code>$ mkdir -p <span class="nv">$GNUPGHOME</span>
$ chmod <span class="m">700</span> <span class="nv">$GNUPGHOME</span>
</code></pre></div>
<p>I keep a copy of my <a href="https://github.com/pigmonkey/dotfiles/blob/master/gnupg/gpg.conf">gpg.conf</a> on the microSD card. That needs to be copied into the new directory, and I’ll need to tell GnuPG which pinentry program to use.</p>
<div class="highlight"><pre><span></span><code>$ cp /mnt/sdcard/gpg/gpg.conf <span class="nv">$GNUPGHOME</span>
$ <span class="nb">echo</span> <span class="s2">"pinentry-program /usr/bin/pinentry-curses"</span> > <span class="nv">$GNUPGHOME</span>/gpg-agent.conf
</code></pre></div>
<p>At the end of the previous year’s renewal, I exported the master key and subkeys. I’ll now import those backups.</p>
<div class="highlight"><pre><span></span><code>$ gpg --import /mnt/sdcard/gpg/<span class="nv">$PREVYEAR</span>-renewal/backup/peter<span class="se">\@</span>havenaut.net.master.gpg-key
$ gpg --import /mnt/sdcard/gpg/<span class="nv">$PREVYEAR</span>-renewal/backup/peter<span class="se">\@</span>havenaut.net.subkeys.gpg-key
</code></pre></div>
<p>When performing the actual renewal, I’ll set the expiration to 13 months. This needs to be done for the master key, the signing subkey, the encryption subkey, and the authentication subkey.</p>
<div class="highlight"><pre><span></span><code>$ gpg --edit-key <span class="nv">$KEYID</span>
trust
<span class="m">5</span>
expire
13m
y
key <span class="m">1</span>
key <span class="m">2</span>
key <span class="m">3</span>
expire
y
13m
y
save
</code></pre></div>
<p>That’s the renewal. I’ll list the keys to make sure they look as expected.</p>
<div class="highlight"><pre><span></span><code>$ gpg --list-keys
</code></pre></div>
<p>Before moving the subkeys to my Yubikey, I back everything up. This will be what I import the following year.</p>
<div class="highlight"><pre><span></span><code>$ mkdir /mnt/sdcard/gpg/<span class="nv">$YEAR</span>-renewal/backup
$ gpg --armor --export-secret-keys <span class="nv">$KEYID</span> > /mnt/sdcard/gpg/<span class="nv">$YEAR</span>-renewal/backup/peter<span class="se">\@</span>havenaut.net.master.gpg-key
$ gpg --armor --export-secret-subkeys <span class="nv">$KEYID</span> > /mnt/sdcard/gpg/<span class="nv">$YEAR</span>-renewal/backup/peter<span class="se">\@</span>havenaut.net.subkeys.gpg-key
</code></pre></div>
<p>Now I can insert my Yubikey, struggle to remember the admin PIN I set on it, and move over the subkeys.</p>
<div class="highlight"><pre><span></span><code>$ gpg --edit-key <span class="nv">$KEYID</span>
toggle
key <span class="m">1</span> <span class="c1"># signature</span>
keytocard
<span class="m">1</span>
key <span class="m">1</span>
key <span class="m">2</span> <span class="c1"># encryption</span>
keytocard
<span class="m">2</span>
key <span class="m">2</span>
key <span class="m">3</span> <span class="c1"># authentication</span>
keytocard
<span class="m">3</span>
save
</code></pre></div>
<p>When I list the secret keys, I expect them to all be stubs (showing as <code>ssb></code>).</p>
<div class="highlight"><pre><span></span><code>$ gpg --list-secret-keys
</code></pre></div>
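<p>The output should look something like this (key IDs abbreviated here, dates elided); the <code>&gt;</code> after <code>ssb</code> indicates that the secret material now lives on the card:</p>

```text
sec   rsa4096/0x70B220FF8D2ACF29 ...
ssb>  rsa4096/0x... [S]
ssb>  rsa4096/0x... [E]
ssb>  rsa4096/0x... [A]
```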
<p>Of course, for this to be useful I need to export my renewed public key and copy it to some place where it can be brought to a networked machine for dissemination.</p>
<div class="highlight"><pre><span></span><code>$ gpg --armor --export <span class="nv">$KEYID</span> > /mnt/sdcard/gpg/<span class="nv">$YEAR</span>-renewal/peter<span class="se">\@</span>havenaut.net.public.gpg-key
$ mkdir /mnt/usb
$ mount /dev/sdb1 /mnt/usb
$ cp /mnt/sdcard/gpg/<span class="nv">$YEAR</span>-renewal/peter<span class="se">\@</span>havenaut.net.public.gpg-key /mnt/usb/
</code></pre></div>
<p>That’s it. Clean up, shutdown, and lock the Armory up until next year.</p>
<div class="highlight"><pre><span></span><code>$ umount /mnt/usb
$ umount /mnt/sdcard
$ cryptsetup luksClose sdcrypt
$ systemctl poweroff
</code></pre></div>LUKS Header Backup2017-07-16T00:00:00-07:002017-07-16T11:03:56-07:00Pig Monkeytag:pig-monkey.com,2017-07-16:/2017/07/luks/<p>I’d neglected backing up <a href="https://en.wikipedia.org/wiki/Linux_Unified_Key_Setup">LUKS</a> headers until <a href="https://www.gwern.net/Notes#november-2016-data-loss-postmortem">Gwern’s data loss postmortem</a> last year. After reading his post I dumped the headers of the drives I had accessible, but I never got around to performing the task on my less frequently accessed drives. Last month I had trouble mounting one of those drives. It turned out I was simply using the wrong passphrase, but the experience prompted me to make sure I had completed the header backup procedure for all drives.</p>
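<p>Before dumping anything, <code>luksDump</code> offers a quick sanity check that a drive’s header and keyslots look intact (the device name here is an example):</p>

```shell
$ sudo cryptsetup luksDump /dev/sdc
```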
<p>I dump the header to a ramfs using <a href="https://wiki.archlinux.org/index.php/Dm-crypt/Device_encryption#Backup_using_cryptsetup">the procedure from the Arch wiki</a>. Keeping the dump in memory rather than on disk is probably unnecessary, but it only takes a few extra steps. The header is stored in my password store, which is obsessively backed up.</p>
<div class="highlight"><pre><span></span><code>$ sudo mkdir /mnt/tmp
$ sudo mount ramfs /mnt/tmp -t ramfs
$ sudo cryptsetup luksHeaderBackup /dev/sdc --header-backup-file /mnt/tmp/dump
$ sudo chown pigmonkey:pigmonkey /mnt/tmp/dump
$ pass insert -m crypt/luksheader/themisto < /mnt/tmp/dump
$ sudo umount /mnt/tmp
$ sudo rmdir /mnt/tmp
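<span class="c1"># restoring later is the reverse (this overwrites the existing header; device is an example)</span>
$ sudo cryptsetup luksHeaderRestore /dev/sdc --header-backup-file /mnt/tmp/dump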
</code></pre></div>Borg Assimilation2017-07-05T00:00:00-07:002017-11-15T10:10:07-08:00Pig Monkeytag:pig-monkey.com,2017-07-05:/2017/07/borg/<p>For years the core of my backup strategy has been <a href="http://rsnapshot.org/">rsnapshot</a> via <a href="https://github.com/pigmonkey/cryptshot">cryptshot</a> to various external drives for local backups, and <a href="https://www.tarsnap.com/">Tarsnap</a> for remote backups.</p>
<p>Tarsnap, however, can be slow. It tends to take between 15 and 20 minutes to create my dozen or so archives, even if little has changed since the last run. My impression is that this is simply due to the number of archives I have stored and the number of files I ask it to archive. Once it has decided what to do, the time spent transferring data is negligible. I run Tarsnap hourly. Twenty minutes out of every hour seems like a lot of time spent Tarsnapping.</p>
<p>I’ve eyed <a href="https://github.com/borgbackup/borg">Borg</a> for a while (and before that, <a href="https://attic-backup.org/">Attic</a>), but avoided using it due to the rapid pace of development in its earlier days. While activity is nice, too many changes too close together do not create a reassuring image of a backup project. Borg seems to have stabilized now and has a large enough user base that I feel comfortable with it. About a month ago, I began using it to back up my laptop to <a href="http://www.rsync.net/products/attic.html">rsync.net</a>.</p>
<p>Initially I played with <a href="https://torsion.org/borgmatic/">borgmatic</a> to perform and maintain the backups. Unfortunately it seems to have issues with signal handling, which caused me to end up with annoying lock files left over from interrupted backups. Borg itself has <a href="https://borgbackup.readthedocs.io/en/stable/">good documentation</a> and is <a href="https://borgbackup.readthedocs.io/en/stable/usage.html">easy to use</a>, and I think it is useful to build familiarity with the program itself instead of only interacting with it through something else. So I did away with borgmatic and wrote a small bash script to handle my use case.</p>
<p><a href="https://borgbackup.readthedocs.io/en/stable/usage.html#borg-create">Creating the backups</a> is simple enough. Borg disables compression by default, but after a little experimentation I found that LZ4 seemed to be a decent compromise between compression and performance.</p>
<p><a href="https://borgbackup.readthedocs.io/en/stable/usage.html#borg-prune">Pruning backups</a> is equally easy. I knew I wanted to match roughly what I had with Tarsnap: hourly backups for a day or so, daily backups for a week or so, then a month or two of weekly backups, and finally a year or so of monthly backups.</p>
<p>My only hesitation was in how to maintain the health of the backups. Borg provides the convenient <a href="https://borgbackup.readthedocs.io/en/stable/usage.html#borg-check">borg check</a> command, which is able to verify the consistency of both a repository and the archives themselves. Unsurprisingly, this is a slow process. I didn’t want to run it with my hourly backups. Daily, or perhaps even weekly, seemed more reasonable, but I did want to make sure that both checks were completed successfully with some frequency. Luckily this is just the problem that I wrote <a href="https://github.com/pigmonkey/backitup">backitup</a> to solve.</p>
<p>Because the consistency checks take a while and consume some resources, I thought it would also be a good idea to avoid performing them when I’m running on battery. Giving backitup the ability to detect if the machine is on battery or AC power was <a href="https://github.com/pigmonkey/backitup/commit/0cd4d3a45df02a5f592617f8a4ad3811a02c9a38">a simple hack</a>. The script now features the <code>-a</code> switch to specify that the program should only be executed when on AC power.</p>
<p>My completed Borg wrapper is thus:</p>
<div class="highlight"><table class="highlighttable"><tr><td class="linenos"><div class="linenodiv"><pre><span class="normal"> 1</span>
<span class="normal"> 2</span>
<span class="normal"> 3</span>
<span class="normal"> 4</span>
<span class="normal"> 5</span>
<span class="normal"> 6</span>
<span class="normal"> 7</span>
<span class="normal"> 8</span>
<span class="normal"> 9</span>
<span class="normal">10</span>
<span class="normal">11</span>
<span class="normal">12</span>
<span class="normal">13</span>
<span class="normal">14</span>
<span class="normal">15</span>
<span class="normal">16</span>
<span class="normal">17</span>
<span class="normal">18</span>
<span class="normal">19</span>
<span class="normal">20</span>
<span class="normal">21</span>
<span class="normal">22</span>
<span class="normal">23</span>
<span class="normal">24</span>
<span class="normal">25</span>
<span class="normal">26</span>
<span class="normal">27</span>
<span class="normal">28</span>
<span class="normal">29</span>
<span class="normal">30</span>
<span class="normal">31</span>
<span class="normal">32</span>
<span class="normal">33</span>
<span class="normal">34</span>
<span class="normal">35</span></pre></div></td><td class="code"><div><pre><span></span><code><span class="ch">#!/bin/sh</span>
<span class="nb">export</span> <span class="nv">BORG_PASSPHRASE</span><span class="o">=</span><span class="s1">'supers3cr3t'</span>
<span class="nb">export</span> <span class="nv">BORG_REPO</span><span class="o">=</span><span class="s1">'borg-rsync:borg/nous'</span>
<span class="nb">export</span> <span class="nv">BORG_REMOTE_PATH</span><span class="o">=</span><span class="s1">'borg1'</span>
<span class="c1"># Create backups</span>
<span class="nb">echo</span> <span class="s2">"Creating backups..."</span>
borg create --verbose --stats --compression<span class="o">=</span>lz4 <span class="se">\</span>
--exclude ~/projects/foo/bar/baz <span class="se">\</span>
--exclude ~/projects/xyz/bigfatbinaries <span class="se">\</span>
::<span class="s1">'{hostname}-{user}-{utcnow:%Y-%m-%dT%H:%M:%S}'</span> <span class="se">\</span>
~/documents <span class="se">\</span>
~/projects <span class="se">\</span>
~/mail <span class="se">\</span>
<span class="c1"># ...etc</span>
<span class="c1"># Prune backups</span>
<span class="nb">echo</span> <span class="s2">"Pruning backups..."</span>
borg prune --verbose --list --prefix <span class="s1">'{hostname}-{user}-'</span> <span class="se">\</span>
--keep-within<span class="o">=</span>2d <span class="se">\</span>
--keep-daily<span class="o">=</span><span class="m">14</span> <span class="se">\</span>
--keep-weekly<span class="o">=</span><span class="m">8</span> <span class="se">\</span>
--keep-monthly<span class="o">=</span><span class="m">12</span> <span class="se">\</span>
<span class="c1"># Check backups</span>
<span class="nb">echo</span> <span class="s2">"Checking repository..."</span>
backitup -a <span class="se">\</span>
-p <span class="m">172800</span> <span class="se">\</span>
-l ~/.borg_check-repo.lastrun <span class="se">\</span>
-b <span class="s2">"borg check --verbose --repository-only"</span>
<span class="nb">echo</span> <span class="s2">"Checking archives..."</span>
backitup -a <span class="se">\</span>
-p <span class="m">259200</span> <span class="se">\</span>
-l ~/.borg_check-arch.lastrun <span class="se">\</span>
-b <span class="s2">"borg check --verbose --archives-only --last 24"</span>
</code></pre></div></td></tr></table></div>
<p>This is executed by a <a href="https://github.com/pigmonkey/dotfiles/blob/master/config/systemd/user/borg.service">systemd service</a>.</p>
<div class="highlight"><pre><span></span><code><span class="k">[Unit]</span><span class="w"></span>
<span class="na">Description</span><span class="o">=</span><span class="s">Borg Backup</span><span class="w"></span>
<span class="k">[Service]</span><span class="w"></span>
<span class="na">Type</span><span class="o">=</span><span class="s">oneshot</span><span class="w"></span>
<span class="na">ExecStart</span><span class="o">=</span><span class="s">/home/pigmonkey/bin/borgwrapper.sh</span><span class="w"></span>
<span class="k">[Install]</span><span class="w"></span>
<span class="na">WantedBy</span><span class="o">=</span><span class="s">multi-user.target</span><span class="w"></span>
</code></pre></div>
<p>The service is called hourly by a <a href="https://github.com/pigmonkey/dotfiles/blob/master/config/systemd/user/borg.timer">systemd timer</a>.</p>
<div class="highlight"><pre><span></span><code><span class="k">[Unit]</span><span class="w"></span>
<span class="na">Description</span><span class="o">=</span><span class="s">Borg Backup Timer</span><span class="w"></span>
<span class="k">[Timer]</span><span class="w"></span>
<span class="na">Unit</span><span class="o">=</span><span class="s">borg.service</span><span class="w"></span>
<span class="na">OnCalendar</span><span class="o">=</span><span class="s">hourly</span><span class="w"></span>
<span class="na">Persistent</span><span class="o">=</span><span class="s">True</span><span class="w"></span>
<span class="k">[Install]</span><span class="w"></span>
<span class="na">WantedBy</span><span class="o">=</span><span class="s">timers.target</span><span class="w"></span>
</code></pre></div>
<p>I don’t enable the timer directly, but add it to <code>/usr/local/etc/trusted_units</code> so that <a href="https://github.com/pigmonkey/nmtrust">nmtrust</a> activates it when I’m connected to trusted networks.</p>
<div class="highlight"><pre><span></span><code>$ <span class="nb">echo</span> <span class="s2">"borg.timer,user:pigmonkey"</span> >> /usr/local/etc/trusted_units
</code></pre></div>
<p>I’ve been running this for about a month now and have been pleased with the results. It averages about 30 seconds to create the backups every hour, and another 30 seconds or so to prune the old ones. As with Tarsnap, deduplication is great.</p>
<div class="highlight"><pre><span></span><code>------------------------------------------------------------------------------
                       Original size      Compressed size    Deduplicated size
This archive:               19.87 GB             18.41 GB             10.21 MB
All archives:              836.02 GB            773.35 GB             19.32 GB
                       Unique chunks         Total chunks
Chunk index:                  371527             14704634
------------------------------------------------------------------------------
</code></pre></div>
<p>The most recent repository consistency check took about 30 minutes, but only runs every 172800 seconds, or once every other day. The most recent archive consistency check took about 40 minutes, but only runs every 259200 seconds, or once per 3 days. I’m not sure that those schedules are the best option for the consistency checks. I may tweak their frequencies, but because I know they will only be executed when I am on a trusted network and AC power, I’m less concerned about the length of time.</p>
<p>With Borg running hourly, I’ve reduced Tarsnap to run only once per day. Time will tell if Borg will slow as the number of stored archives increases, but for now running Borg hourly and Tarsnap daily seems like a great setup. Tarsnap and Borg both target the same files (with a few exceptions). Tarsnap runs in the AWS us-east-1 region. I’ve always kept my rsync.net account in their Zurich datacenter. This provides the kind of redundancy that lets me rest easy.</p>
<p>Contrary to what you might expect given the <a href="/tag/backups/">number of blog posts on the subject</a>, I actually spend close to no time worrying about data loss in my day to day life, thanks to stuff like this. An ounce of prevention, and all that. (Maybe a few kilograms of prevention in my case.)</p>The USB Armory for PGP Key Management2017-06-28T00:00:00-07:002018-05-05T21:11:27-07:00Pig Monkeytag:pig-monkey.com,2017-06-28:/2017/06/armory/<p>I use a <a href="https://www.yubico.com/products/yubikey-hardware/yubikey-neo/">Yubikey Neo</a> for day-to-day PGP operations. For managing the secret key itself, such as during renewal or key signing, I use a <a href="https://inversepath.com/usbarmory">USB Armory</a> with <a href="https://github.com/inversepath/usbarmory/wiki/Host-adapter">host adapter</a>. In host mode, the Armory provides a trusted, open source platform that is compact and easily secured, making it ideal …</p><p>I use a <a href="https://www.yubico.com/products/yubikey-hardware/yubikey-neo/">Yubikey Neo</a> for day-to-day PGP operations. For managing the secret key itself, such as during renewal or key signing, I use a <a href="https://inversepath.com/usbarmory">USB Armory</a> with <a href="https://github.com/inversepath/usbarmory/wiki/Host-adapter">host adapter</a>. In host mode, the Armory provides a trusted, open source platform that is compact and easily secured, making it ideal for key management.</p>
<p>Setting up the Armory is fairly straightforward. The <a href="https://archlinuxarm.org/">Arch Linux ARM</a> project provides <a href="https://archlinuxarm.org/platforms/armv7/freescale/usb-armory">prebuilt images</a>. From my laptop, I follow their instructions to prepare the micro SD card, where <code>/dev/sdX</code> is the SD card.</p>
<div class="highlight"><pre><span></span><code><span class="n">$</span><span class="w"> </span><span class="n">dd</span><span class="w"> </span><span class="k">if</span><span class="o">=/</span><span class="n">dev</span><span class="o">/</span><span class="n">zero</span><span class="w"> </span><span class="k">of</span><span class="o">=/</span><span class="n">dev</span><span class="o">/</span><span class="n">sdX</span><span class="w"> </span><span class="n">bs</span><span class="o">=</span><span class="n">1M</span><span class="w"> </span><span class="n">count</span><span class="o">=</span><span class="mi">8</span><span class="w"></span>
<span class="n">$</span><span class="w"> </span><span class="n">fdisk</span><span class="w"> </span><span class="o">/</span><span class="n">dev</span><span class="o">/</span><span class="n">sdX</span><span class="w"></span>
<span class="c1"># `o` to clear any partitions</span><span class="w"></span>
<span class="c1"># `n`, `p`, `1`, `2048`, `enter` to create a new primary partition in the first position with a first sector of 2048 and the default last sector</span><span class="w"></span>
<span class="c1"># `w` to write</span><span class="w"></span>
<span class="n">$</span><span class="w"> </span><span class="n">mkfs</span><span class="p">.</span><span class="n">ext4</span><span class="w"> </span><span class="o">/</span><span class="n">dev</span><span class="o">/</span><span class="n">sdX1</span><span class="w"></span>
<span class="n">$</span><span class="w"> </span><span class="n">mkdir</span><span class="w"> </span><span class="o">/</span><span class="n">mnt</span><span class="o">/</span><span class="n">sdcard</span><span class="w"></span>
<span class="n">$</span><span class="w"> </span><span class="n">mount</span><span class="w"> </span><span class="o">/</span><span class="n">dev</span><span class="o">/</span><span class="n">sdX1</span><span class="w"> </span><span class="o">/</span><span class="n">mnt</span><span class="o">/</span><span class="n">sdcard</span><span class="w"></span>
</code></pre></div>
<p>And then extract the image, doing whatever verification is necessary after downloading.</p>
<div class="highlight"><pre><span></span><code>$ wget http://os.archlinuxarm.org/os/ArchLinuxARM-usbarmory-latest.tar.gz
$ bsdtar -xpf ArchLinuxARM-usbarmory-latest.tar.gz -C /mnt/sdcard
$ sync
</code></pre></div>
<p>Followed by installing the bootloader.</p>
<div class="highlight"><pre><span></span><code>$ sudo dd <span class="k">if</span><span class="o">=</span>/mnt/sdcard/boot/u-boot.imx <span class="nv">of</span><span class="o">=</span>/dev/sdX <span class="nv">bs</span><span class="o">=</span><span class="m">512</span> <span class="nv">seek</span><span class="o">=</span><span class="m">2</span> <span class="nv">conv</span><span class="o">=</span>fsync
$ sync
</code></pre></div>
<p>The bootloader must be tweaked to enable host mode.</p>
<div class="highlight"><pre><span></span><code>$ sed -i <span class="s1">'/#setenv otg_host/s/^#//'</span> /mnt/sdcard/boot/boot.txt
$ <span class="nb">cd</span> /mnt/sdcard/boot
$ ./mkscr
</code></pre></div>
<p>For display I use a <a href="https://www.amazon.com/dp/B004AIJE9G">Plugable USB 2.0 UGA-165</a> adapter. To set up <a href="https://wiki.archlinux.org/index.php/DisplayLink">DisplayLink</a>, one must configure the correct modules.</p>
<div class="highlight"><pre><span></span><code><span class="o">$</span><span class="w"> </span><span class="n">sed</span><span class="w"> </span><span class="o">-</span><span class="n">i</span><span class="w"> </span><span class="s1">'/blacklist drm_kms_helper/s/^/#/g'</span><span class="w"> </span><span class="o">/</span><span class="n">mnt</span><span class="o">/</span><span class="n">sdcard</span><span class="o">/</span><span class="n">etc</span><span class="o">/</span><span class="n">modprobe</span><span class="o">.</span><span class="n">d</span><span class="o">/</span><span class="n">no</span><span class="o">-</span><span class="n">drm</span><span class="o">.</span><span class="n">conf</span><span class="w"></span>
<span class="o">$</span><span class="w"> </span><span class="n">echo</span><span class="w"> </span><span class="s2">"blacklist udlfb"</span><span class="w"> </span><span class="o">>></span><span class="w"> </span><span class="o">/</span><span class="n">mnt</span><span class="o">/</span><span class="n">sdcard</span><span class="o">/</span><span class="n">etc</span><span class="o">/</span><span class="n">modprobe</span><span class="o">.</span><span class="n">d</span><span class="o">/</span><span class="n">no</span><span class="o">-</span><span class="n">drm</span><span class="o">.</span><span class="n">conf</span><span class="w"></span>
<span class="o">$</span><span class="w"> </span><span class="n">echo</span><span class="w"> </span><span class="n">udl</span><span class="w"> </span><span class="o">></span><span class="w"> </span><span class="o">/</span><span class="n">mnt</span><span class="o">/</span><span class="n">sdcard</span><span class="o">/</span><span class="n">etc</span><span class="o">/</span><span class="n">modules</span><span class="o">-</span><span class="nb">load</span><span class="o">.</span><span class="n">d</span><span class="o">/</span><span class="n">udl</span><span class="o">.</span><span class="n">conf</span><span class="w"></span>
</code></pre></div>
<p>Finally, I copy over <a href="https://www.passwordstore.org/">pass</a> and <a href="https://git.zx2c4.com/ctmg/about/">ctmg</a> so that I have them available on the Armory and unmount the SD card.</p>
<div class="highlight"><pre><span></span><code>$ cp /usr/bin/pass /mnt/sdcard/bin/
$ cp /usr/bin/ctmg /mnt/sdcard/bin/
$ umount /mnt/sdcard
</code></pre></div>
<p>The SD card can then be inserted into the Armory. At no time during this process – or at any point in the future – is the Armory connected to a network. It is entirely air-gapped. As long as the image was not compromised and the Armory is stored securely, the platform should remain trusted.</p>
<p>Note that because the Armory is never on a network, and it has no internal battery, it does not keep time. Upon first boot, NTP should be disabled and the time and date set.</p>
<div class="highlight"><pre><span></span><code>$ timedatectl set-ntp <span class="nb">false</span>
$ timedatectl set-time <span class="s2">"yyyy-mm-dd hh:mm:ss"</span> <span class="c1"># UTC</span>
</code></pre></div>
<p>On subsequent boots, the time and date should be set with <code>timedatectl set-time</code> before performing any cryptographic operations.</p>Cold Storage2016-08-26T00:00:00-07:002016-08-27T18:19:34-07:00Pig Monkeytag:pig-monkey.com,2016-08-26:/2016/08/storage/<p>This past spring I mentioned my <a href="/2016/03/backup/">cold storage setup</a>: a number of encrypted 2.5” drives in external enclosures, stored inside a <a href="http://www.pelican.com/us/en/product/watertight-protector-hard-cases/small-case/standard/1200/">Pelican 1200</a> case, secured with <a href="https://securitysnobs.com/Abloy-Protec2-PL-321-Padlock.html">Abloy Protec2 321</a> locks. Offline, secure, and infrequently accessed storage is an important component of any strategy for resilient data. The ease with …</p><p>This past spring I mentioned my <a href="/2016/03/backup/">cold storage setup</a>: a number of encrypted 2.5” drives in external enclosures, stored inside a <a href="http://www.pelican.com/us/en/product/watertight-protector-hard-cases/small-case/standard/1200/">Pelican 1200</a> case, secured with <a href="https://securitysnobs.com/Abloy-Protec2-PL-321-Padlock.html">Abloy Protec2 321</a> locks. Offline, secure, and infrequently accessed storage is an important component of any strategy for resilient data. The ease with which this can be managed with <a href="https://git-annex.branchable.com/">git-annex</a> only increases <a href="/tag/annex/">my infatuation with the software</a>.</p>
<p><a href="https://www.flickr.com/photos/pigmonkey/29168947362/in/dateposted/" title="Data Data Data Data Data"><img src="https://c3.staticflickr.com/9/8405/29168947362_2c7ecc9a97_c.jpg" width="800" height="450" alt="Data Data Data Data Data"></a></p>
<p>I’ve been happy with the <a href="https://www.amazon.com/gp/product/B00MPWYLHO/">Seagate ST2000LM003</a> drives for this application. Unfortunately the enclosures I first purchased did not work out so well. I had two die within a few weeks. They’ve been replaced with the <a href="https://www.amazon.com/gp/product/B00YT6TOJO/">SIG JU-SA0Q12-S1</a>. These claim to be compatible with drives up to 8TB (someday I’ll be able to buy 8TB 2.5” drives) and support USB 3.1. They’re also a bit thinner than the previous enclosures, so I can easily fit five in my box. The Seagate drives offer about 1.7 terabytes of usable space, giving this setup a total capacity of 8.5 terabytes.</p>
<p>Setting up git-annex to support this type of cold storage is fairly straightforward, but does necessitate some familiarity with how the program works. Personally, I prefer to do all my setup manually. I’m happy to let the <a href="http://git-annex.branchable.com/assistant/">assistant</a> watch my repositories and manage them after the setup, and I’ll occasionally fire up the <a href="https://git-annex.branchable.com/design/assistant/webapp/">web app</a> to see what the assistant daemon is doing, but I like the control and understanding provided by a manual setup. The power and flexibility of git-annex is deceptive. Using it solely through the simplified interface of the web app greatly limits what can be accomplished with it.</p>
<h2>Encryption</h2>
<p>Before even getting into git-annex, the drive should be encrypted with <a href="https://en.wikipedia.org/wiki/Linux_Unified_Key_Setup">LUKS</a>/<a href="https://en.wikipedia.org/wiki/Dm-crypt">dm-crypt</a>. The need for this could be avoided by using something like <a href="https://git-annex.branchable.com/special_remotes/gcrypt/">gcrypt</a>, but LUKS/dm-crypt is an ingrained habit and part of my workflow for all external drives. Assuming the drive is <code>/dev/sdc</code>, pass <code>cryptsetup</code> some sane defaults:</p>
<div class="highlight"><pre><span></span><code>$ sudo cryptsetup --cipher aes-xts-plain64 --key-size <span class="m">512</span> --hash sha512 luksFormat /dev/sdc
</code></pre></div>
<p>With the drive encrypted, it can then be opened and formatted. I’ll give the drive a human-friendly label of <code>themisto</code>.</p>
<div class="highlight"><pre><span></span><code>$ sudo cryptsetup luksOpen /dev/sdc themisto_crypt
$ sudo mkfs.ext4 -L themisto /dev/mapper/themisto_crypt
</code></pre></div>
<p>At this point the drive is ready. I close it and then mount it with <a href="https://github.com/coldfix/udiskie">udiskie</a> to make sure everything is working. How the drive is mounted doesn’t matter, but I like udiskie because it can <a href="https://github.com/pigmonkey/dotfiles/blob/master/config/udiskie/config.yml#L5">integrate with my password manager</a> to get the drive passphrase.</p>
<div class="highlight"><pre><span></span><code>$ sudo cryptsetup luksClose /dev/mapper/themisto_crypt
$ udiskie-mount -r /dev/sdc
</code></pre></div>
<h2>Git-Annex</h2>
<p>With the encryption handled, the drive should now be mounted at <code>/media/themisto</code>. For the first few steps, we’ll basically follow the <a href="https://git-annex.branchable.com/walkthrough/">git-annex walkthrough</a>. Let’s assume that we are setting up this drive to be a repository of the annex <code>~/video</code>. The first step is to go to the drive, clone the repository, and initialize the annex. When initializing the annex I prepend the name of the remote with <code>satellite :</code>. My cold storage drives are all named after satellites, and doing this allows me to easily identify them when looking at a list of remotes.</p>
<div class="highlight"><pre><span></span><code>$ <span class="nb">cd</span> /media/themisto
$ git clone ~/video
$ <span class="nb">cd</span> video
$ git annex init <span class="s2">"satellite : themisto"</span>
</code></pre></div>
<h3>Disk Reserve</h3>
<p>Whenever dealing with a repository that is bigger (or may become bigger) than the drive it is being stored on, it is important to set a disk reserve. This tells git-annex to always keep some free space around. I generally like to set this to 1 GB, which is way larger than it needs to be.</p>
<div class="highlight"><pre><span></span><code>$ git config annex.diskreserve <span class="s2">"1 gb"</span>
</code></pre></div>
<h3>Adding Remotes</h3>
<p>I’ll then tell this new repository where the original repository is located. In this case I’ll refer to the original using the name of my computer, <code>nous</code>.</p>
<div class="highlight"><pre><span></span><code>$ git remote add nous ~/video
</code></pre></div>
<p>If other remotes already exist, now is a good time to add them. These could be <a href="https://git-annex.branchable.com/special_remotes/">special remotes</a> or normal ones. For this example, let’s say that we have already completed this whole process for another cold storage drive called <code>sinope</code>, and that we have an <a href="https://git-annex.branchable.com/special_remotes/S3/">s3</a> remote creatively named <code>s3</code>.</p>
<div class="highlight"><pre><span></span><code>$ git remote add sinope /media/sinope/video
$ <span class="nb">export</span> <span class="nv">AWS_ACCESS_KEY_ID</span><span class="o">=</span><span class="s2">"..."</span>
$ <span class="nb">export</span> <span class="nv">AWS_SECRET_ACCESS_KEY</span><span class="o">=</span><span class="s2">"..."</span>
$ git annex enableremote s3
</code></pre></div>
<h3>Trust</h3>
<p><a href="https://git-annex.branchable.com/trust/">Trust</a> is a critical component of how git-annex works. Any new annex will default to being semi-trusted, which means that when running operations within the annex on the main computer – say, dropping a file – git-annex will want to confirm that <code>themisto</code> has the files that it is supposed to have. In the case of <code>themisto</code> being a USB drive that is rarely connected, this is not very useful. I tell git-annex to trust my cold storage drives, which means that if git-annex has a record of a certain file being on the drive, it will be satisfied with that. This increases the risk of data loss, but for this application I feel it is appropriate.</p>
<div class="highlight"><pre><span></span><code>$ git annex trust .
</code></pre></div>
<h3>Preferred Content</h3>
<p>The final step that needs to be taken on the new repository is to tell it what files it should want. This is done using <a href="https://git-annex.branchable.com/preferred_content/">preferred content</a>. The <a href="https://git-annex.branchable.com/preferred_content/standard_groups/">standard groups</a> that git-annex ships with cover most of the bases. Of interest for this application is the <code>archive</code> group, which wants all content except that which has already found its way to another archive. This is the behaviour I want, but I will duplicate it into a custom group called <code>satellite</code>. This keeps my cold storage drives as standalone things that do not influence any other remotes where I may want to use the default <code>archive</code>.</p>
<div class="highlight"><pre><span></span><code>$ git annex groupwanted satellite <span class="s2">"(not copies=satellite:1) or approxlackingcopies=1"</span>
$ git annex group . satellite
$ git annex wanted . groupwanted
</code></pre></div>
<p>For other repositories, I may want to store the data on multiple cold storage drives. In that case I would create a <code>redundantsatellite</code> group that wants all content which is not already present in two other members of the group.</p>
<div class="highlight"><pre><span></span><code>$ git annex groupwanted redundantsatellite <span class="s2">"(not copies=redundantsatellite:2) or approxlackingcopies=1"</span>
$ git annex group . redundantsatellite
$ git annex wanted . groupwanted
</code></pre></div>
<h3>Syncing</h3>
<p>With everything set up, the new repository is ready to sync and start ingesting content from the remotes it knows about!</p>
<div class="highlight"><pre><span></span><code>$ git annex sync --content
</code></pre></div>
<p>However, the original repository also needs to know about the new remote.</p>
<div class="highlight"><pre><span></span><code>$ <span class="nb">cd</span> ~/video
$ git remote add themisto /media/themisto/video
$ git annex sync
</code></pre></div>
<p>The same is the case for any other previously existing repository, such as <code>sinope</code>.</p>Cryptographic Identity2016-05-17T00:00:00-07:002016-05-17T20:56:35-07:00Pig Monkeytag:pig-monkey.com,2016-05-17:/2016/05/id/<p>Despite its shortcomings, I think PGP is still one of the better ways to verify a person’s identity. Because of this – and because I use my PGP key daily<sup class="footnote-ref" id="fnref:key-use"><a rel="footnote" href="#fn:key-use" title="see footnote">1</a></sup> – I make an effort to properly secure my private key. Verifying a PGP key is a fairly straightforward process …</p><p>Despite its shortcomings, I think PGP is still one of the better ways to verify a person’s identity. Because of this – and because I use my PGP key daily<sup class="footnote-ref" id="fnref:key-use"><a rel="footnote" href="#fn:key-use" title="see footnote">1</a></sup> – I make an effort to properly secure my private key. Verifying a PGP key is a fairly straightforward process for fellow PGP users, and my hope is that anyone who does verify my key can maintain a high confidence in its signature.</p>
<p>However, I also use other cryptographic channels to communicate – XMPP/OTR and Signal chief among them. I consider these keys more transient than PGP. The OTR keys on my computer are backed up because it takes no effort to do so, but I have no qualms about creating new ones if I feel like it. I don’t bother to port the same keys to other devices, like my phone. My Signal key is guaranteed to change anytime I rebuild or replace my phone. Given the nature of these keys and how I handle them, I don’t expect others to put the same amount of effort into verifying their fingerprints.</p>
<p>The solution to this is to maintain a simple text file, signed via PGP, containing the fingerprints of my other keys. With a copy of the file and a trusted copy of my public PGP key, anyone can verify my identity on other networks or communication channels. If a key is replaced, I simply add the new fingerprint to the file, sign it and distribute. Contacts download the file, check its signature, and thus easily trust the new fingerprint without additional rigmarole.</p>
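<p>A minimal sketch of that workflow with GnuPG, assuming an identity file named <code>id.txt</code>:</p>

```shell
# Sign the identity file; produces id.txt.asc containing the text plus an inline signature.
$ gpg --clearsign id.txt
# What a contact runs after downloading the file, against a trusted copy of the public key.
$ gpg --verify id.txt.asc
```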
<p>The first examples of this that I saw were from <a href="http://web.mit.edu/zyan/www/zyan.txt">Yan</a> and <a href="https://tomlowenthal.com/id">Tom Lowenthal</a>. I thought it seemed like a great idea and began to maintain a list of examples, adding to it whenever I stumbled across them, with a note that I should do that someday<sup class="footnote-ref" id="fnref:keybase"><a rel="footnote" href="#fn:keybase" title="see footnote">2</a></sup>.</p>
<p>Today I decided to stop procrastinating on this and create my own identity file. It is located at <a href="/id.txt">pig-monkey.com/id.txt</a>. The file, along with the rest of this website, is <a href="https://github.com/pigmonkey/pig-monkey.com/blob/master/content/id.txt">in git</a> so that changes to it may be tracked over time.</p>
<p>Inspired by some of the examples I had collected, I added a couple pieces of related information to the file. The section on PGP key signing should provide others some context for what it means when they see my signature on a different key. Even if no one cares, I found it useful to enunciate the policy simply to clear up my own thinking about what the different certification levels should mean. Finally, the section on key management gives others a rough idea about how I manage my key, which should help them to maintain their confidence in it. If I verify that someone’s identity and fingerprint match their key, I will have high confidence in its signature initially. But if I know that the person keeps their secret key on their daily driver machine without any additional effort to protect it, my confidence in it will degrade over time. Less so if I know that they take great care in protecting their key.</p>
<p>A file like this should also provide a good mechanism for creating a transition and revocation statement for my PGP key, should the need arise. One hopes that it does not.</p>
<div id="footnotes">
<h2>Notes</h2>
<ol>
<li id="fn:key-use"><a rev="footnote" href="#fnref:key-use" class="footnote-return" title="return to article">↵</a> Realistically, I use PGP multiple times per hour when I'm on my computer.</li>
<li id="fn:keybase"><a rev="footnote" href="#fnref:keybase" class="footnote-return" title="return to article">↵</a> Since I began my list, <a href="https://keybase.io/">Keybase</a> has become a thing. It addresses a similar problem, although seems to promote using services like Twitter as the root of trust. Assuming that you want to stubbornly stick with a PGP key as the root of trust, I don't see the advantage of using Keybase for this problem, except that it offers a centralized lookup repository.</li>
</ol>
</div>I celebrated World Backup Day by increasing the resiliency of data in my life.2016-03-31T00:00:00-07:002016-08-19T20:03:18-07:00Pig Monkeytag:pig-monkey.com,2016-03-31:/2016/03/backup/<p>Four <a href="https://wiki.archlinux.org/index.php/Dm-crypt">encrypted</a> 2TB hard drives, stored in a <a href="http://www.pelican.com/us/en/product/watertight-protector-hard-cases/small-case/standard/1200/">Pelican 1200</a>, with <a href="https://securitysnobs.com/Abloy-Protec2-PL-321-Padlock.html">Abloy Protec2 PL 321</a> padlocks as tamper-evident seals. Having everything that matters stored in <a href="https://git-annex.branchable.com/">git-annex</a> makes projects like this simple: just clone the repositories, define the <a href="https://git-annex.branchable.com/preferred_content/">preferred content expressions</a>, and watch the magic happen.</p>
<p><a href="https://www.flickr.com/photos/pigmonkey/25889491200/in/dateposted/" title="Cold Storage"><img src="https://farm2.staticflickr.com/1624/25889491200_7b962ddfd0_c.jpg" width="800" height="450" alt="Cold Storage"></a></p>My GPG key has been superseded.2015-05-25T00:00:00-07:002019-01-04T16:49:07-08:00Pig Monkeytag:pig-monkey.com,2015-05-25:/2015/05/key/<p>I’ve migrated to using a <a href="https://www.yubico.com/products/yubikey-hardware/yubikey-neo/">Yubikey Neo</a> as a smart card and decided to replace the old key as part of the process. The new key can be found <a href="/key.asc">in the usual location</a> or on your keyserver of choice.</p>Optical Backups of Photo Archives2013-05-29T00:00:00-07:002013-05-29T00:00:00-07:00Pig Monkeytag:pig-monkey.com,2013-05-29:/2013/05/optical-photo-backups/<p>I store my photos in <a href="http://git-annex.branchable.com/">git-annex</a>. A full copy of the annex exists on my laptop and on an external drive. Encrypted copies of all of my photos are stored on <a href="https://aws.amazon.com/s3/">Amazon S3</a> (which I pay for) and <a href="https://www.box.com/">box.com</a> (which provides 50GB for free) via git-annex <a href="http://git-annex.branchable.com/special_remotes/">special remotes</a>. The …</p><p>I store my photos in <a href="http://git-annex.branchable.com/">git-annex</a>. A full copy of the annex exists on my laptop and on an external drive. Encrypted copies of all of my photos are stored on <a href="https://aws.amazon.com/s3/">Amazon S3</a> (which I pay for) and <a href="https://www.box.com/">box.com</a> (which provides 50GB for free) via git-annex <a href="http://git-annex.branchable.com/special_remotes/">special remotes</a>. The photos are backed-up to an external drive daily with the rest of my laptop hard drive via <a href="/2012/10/back-it-up/">backitup.sh</a> and <a href="/2012/09/cryptshot-automated-encrypted-backups-rsnapshot/">cryptshot</a>. 
My entire laptop hard drive is also mirrored monthly to an external drive stored off-site.</p>
<p>(The majority of my photos are also <a href="http://www.flickr.com/photos/pigmonkey/">on Flickr</a>, but I don’t consider that a backup or even reliable storage.)</p>
<p>All of this is what I consider to be the bare minimum for any redundant data storage. Photos have special value, above the value that I assign to most other data. This value only increases with age. As such they require an additional backup method, but due to the size of my collection I want to avoid backup methods that involve paying for more online storage, such as <a href="/2012/09/tarsnapper-managing-tarsnap-backups/">Tarsnap</a>.</p>
<p>I choose optical discs as the medium for my photo backups. This has the advantage of being read-only, which makes it more difficult for accidental deletions or corruption to propagate through the backup system. DVD-Rs have a capacity of 4.7 GB and a cost of around $0.25 per disc. Their life expectancy varies, but 10 years seems to be a reasonable low estimate.</p>
<h2>Preparation</h2>
<p>I keep all of my photos in year-based directories. At the beginning of every year, the previous year’s directory is burned to a DVD.</p>
<p>Certain years contain few enough photos that the entire year can fit on a single DVD. More recent years have enough photos of a high enough resolution that they require multiple DVDs.</p>
<h3>Archive</h3>
<p>My first step is to build a compressed archive of each year. I choose <a href="http://www.gnu.org/software/tar/">tar</a> and <a href="http://en.wikipedia.org/wiki/Bzip2">bzip2</a> compression for this because they’re simple and reliable.</p>
<div class="highlight"><table class="highlighttable"><tr><td class="linenos"><div class="linenodiv"><pre><span class="normal">1</span>
<span class="normal">2</span></pre></div></td><td class="code"><div><pre><span></span><code>$ <span class="nb">cd</span> ~/pictures
$ tar cjhf ~/tmp/pictures/2012.tar.bz <span class="m">2012</span>
</code></pre></div></td></tr></table></div>
<p>If the archive is larger than 3.7 GB, it needs to be split into multiple files. The resulting files will be burned to different discs. The capacity of a DVD is 4.7 GB, but I place the upper file limit at 3.7 GB so that the DVD has a minimum of 20% of its capacity available. This will be filled with parity information later on for redundancy.</p>
<div class="highlight"><table class="highlighttable"><tr><td class="linenos"><div class="linenodiv"><pre><span class="normal">1</span></pre></div></td><td class="code"><div><pre><span></span><code>$ split -d -b 3700M <span class="m">2012</span>.tar.bz <span class="m">2012</span>.tar.bz.
</code></pre></div></td></tr></table></div>
<h3>Encrypt</h3>
<p>Leaving unencrypted data around is <a href="http://www.youtube.com/watch?v=OwHrlM4oVSI">bad form</a>. The archive (or each of the files resulting from splitting the large archive) is next encrypted and signed with <a href="http://www.gnupg.org/">GnuPG</a>.</p>
<div class="highlight"><table class="highlighttable"><tr><td class="linenos"><div class="linenodiv"><pre><span class="normal">1</span>
<span class="normal">2</span></pre></div></td><td class="code"><div><pre><span></span><code>$ gpg -eo <span class="m">2012</span>.tar.bz.gpg <span class="m">2012</span>.tar.bz
$ gpg -bo <span class="m">2012</span>.tar.bz.gpg.sig <span class="m">2012</span>.tar.bz.gpg
</code></pre></div></td></tr></table></div>
<h2>Imaging</h2>
<p>The encrypted archive and the detached signature of the encrypted archive are what will be burned to the disc. (Or, in the case of a large archive, the encrypted splits of the full archive and the associated signatures will be burned to one disc per split/signature combination.) Rather than burning them directly, an image is created first.</p>
<div class="highlight"><table class="highlighttable"><tr><td class="linenos"><div class="linenodiv"><pre><span class="normal">1</span></pre></div></td><td class="code"><div><pre><span></span><code>$ mkisofs -V <span class="s2">"Photos: 2012 1/1"</span> -r -o <span class="m">2012</span>.iso <span class="m">2012</span>.tar.bz.gpg <span class="m">2012</span>.tar.bz.gpg.sig
</code></pre></div></td></tr></table></div>
<p>If the year has a split archive requiring multiple discs, I modify the sequence number in the volume label. For example, a year requiring 3 discs will have the label <code>Photos: 2012 1/3</code>.</p>
<h3>Parity</h3>
<p>When I began this project I knew that I wanted some sort of parity information for each disc so that I could potentially recover data from slightly damaged media. My initial idea was to use <a href="http://en.wikipedia.org/wiki/Parchive">parchive</a> via <a href="https://github.com/BlackIkeEagle/par2cmdline">par2cmdline</a>. Further research led me to <a href="http://dvdisaster.net/en/index.html">dvdisaster</a> which, despite being a GUI-only program, seemed more appropriate for this use case.</p>
<p>Both dvdisaster and parchive use the same <a href="http://en.wikipedia.org/wiki/Reed–Solomon_error_correction">Reed–Solomon error correction codes</a>. Dvdisaster is aimed at optical media and has the ability to place the error correction data on the disc by <a href="http://dvdisaster.net/en/howtos30.html">augmenting the disc image</a>, as well as <a href="http://dvdisaster.net/en/howtos20.html">storing the data separately</a>. It can also <a href="http://dvdisaster.net/en/howtos10.html">scan media for errors</a> and assist in judging when the media is in danger of becoming defective. This makes it an attractive option for long-term storage.</p>
<p>I use dvdisaster with the <a href="http://dvdisaster.net/en/howtos32.html">RS02</a> error correction method, which augments the image before burning. Depending on the size of the original image, this will result in the disc having anywhere from 20% to 200% redundancy.</p>
<h3>Verify</h3>
<p>After the image has been augmented, I mount it and verify the signature of the encrypted file on the disc against the local copy of the signature. I’ve never had the signatures not match, but performing this step makes me feel better.</p>
<div class="highlight"><table class="highlighttable"><tr><td class="linenos"><div class="linenodiv"><pre><span class="normal">1</span>
<span class="normal">2</span>
<span class="normal">3</span></pre></div></td><td class="code"><div><pre><span></span><code>$ sudo mount -o loop <span class="m">2012</span>.iso /mnt/disc
$ gpg --verify <span class="m">2012</span>.tar.bz.gpg.sig /mnt/disc/2012.tar.bz.gpg
$ sudo umount /mnt/disc
</code></pre></div></td></tr></table></div>
<h3>Burn</h3>
<p>The final step is to burn the augmented image. I always burn discs at low speeds to diminish the chance of errors during the process.</p>
<div class="highlight"><table class="highlighttable"><tr><td class="linenos"><div class="linenodiv"><pre><span class="normal">1</span></pre></div></td><td class="code"><div><pre><span></span><code>$ cdrecord -v <span class="nv">speed</span><span class="o">=</span><span class="m">4</span> <span class="nv">dev</span><span class="o">=</span>/dev/sr0 <span class="m">2012</span>.iso
</code></pre></div></td></tr></table></div>
<p>Similar to the optical backups of my <a href="/2013/04/password-management-vim-gnupg/">password database</a>, I burn two copies of each disc. One copy is stored off-site. This provides a reasonable level of assurance against any loss of my photos.</p>Password Management with Vim and GnuPG2013-04-04T00:00:00-07:002013-06-30T00:00:00-07:00Pig Monkeytag:pig-monkey.com,2013-04-04:/2013/04/password-management-vim-gnupg/<p>The first password manager I ever used was a simple text file encrypted with <a href="http://www.gnupg.org/">GnuPG</a>. When I needed a password I would decrypt the file, read it in <a href="http://www.vim.org/">Vim</a>, and copy the required entry to the system clipboard. This system didn’t last. At the time I wasn’t using …</p><p>The first password manager I ever used was a simple text file encrypted with <a href="http://www.gnupg.org/">GnuPG</a>. When I needed a password I would decrypt the file, read it in <a href="http://www.vim.org/">Vim</a>, and copy the required entry to the system clipboard. This system didn’t last. At the time I wasn’t using GnuPG for much else, and this was in the very beginning of my Vim days, when the program seemed cumbersome and daunting. I shortly moved to other, purpose-built password managers.</p>
<p>After some experimentation I landed on <a href="http://www.keepassx.org/">KeePassX</a>, which I used for a number of years. Some time ago I decided that I wanted to move to a command-line solution. KeePassX and a web browser were the only graphical applications that I was using with any regularity. I could see no need for a password manager to have a graphical interface, and the GUI’s dependency on a mouse decreased my productivity. After a cursory look at the available choices I landed right back where I started all those years ago: Vim and GnuPG.</p>
<p>These days Vim is my most used program outside of a web browser and I use GnuPG daily for handling the majority of my encryption needs. My greater familiarity with both of these tools is one of the reasons I’ve been successful with the system this time around. I believe the other reason is my more systematic approach.</p>
<h2>Structure</h2>
<p>The power of this system comes from its simplicity: passwords are stored in plain text files that have been encrypted with GnuPG. Every platform out there has some implementation of the <a href="https://en.wikipedia.org/wiki/Pretty_Good_Privacy#OpenPGP">PGP protocol</a>, so the files can easily be decrypted anywhere. After they’ve been decrypted, there are no fancy file formats to deal with. It’s all just text, which can be manipulated with a <a href="https://en.wikipedia.org/wiki/GNU_Core_Utilities">plethora of powerful tools</a>. I favor reading the text in Vim, but any text editor will do the job.</p>
<p>All passwords are stored within a directory called <code>~/pw</code>. Within this directory are multiple files. Each of these files can be thought of as a separate password database. I store bank information in <code>financial.gpg</code>. Login information for various shopping websites are in <code>ecommerce.gpg</code>. My email credentials are in <code>email.gpg</code>. All of these entries could very well be stored in a single file, but breaking it out into multiple files allows me some measure of access control.</p>
<h3>Access</h3>
<p>I regularly use two computers: my laptop at home and a desktop machine at work. I trust my laptop. It has my GnuPG key on it and it should have access to all password database files. I do not place complete trust in my machine at work. I don’t trust it enough to give it access to my GnuPG key, and as such I have a different GnuPG key on that machine that I use for encryption at work.</p>
<p>Having passwords segregated into multiple database files allows me to encrypt the different files to different keys. Every file is encrypted to my primary GnuPG key, but only some are encrypted with my work key. Login credentials needed for work are encrypted to the work key. I have no need to log in to my bank accounts at work, and it wouldn’t be prudent to do so on a machine that I do not fully trust, so the <code>financial.gpg</code> file is not encrypted to my work key. If someone compromises my work computer, they will still be no closer to accessing my banking credentials.</p>
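<p>The encryption itself is handled transparently by Vim (more on that below), but the idea can be sketched with plain <code>gpg</code> commands. Here a throwaway keyring and illustrative key UIDs stand in for my real keys:</p>

```shell
# Sketch only: throwaway keys in a temporary keyring stand in for the
# real primary and work keys. UIDs and file contents are illustrative.
export GNUPGHOME="$(mktemp -d)"
cd "$(mktemp -d)"
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-generate-key 'primary@example.com' default default never
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-generate-key 'work@example.com' default default never

echo 'pass: supers3cr3t' > ecommerce.txt
echo 'pass: b4nk' > financial.txt

# ecommerce.gpg is encrypted to both keys, so either machine can read it...
gpg --batch --trust-model always --encrypt \
    -r primary@example.com -r work@example.com \
    -o ecommerce.gpg ecommerce.txt

# ...while financial.gpg is encrypted to the primary key only.
gpg --batch --trust-model always --encrypt \
    -r primary@example.com \
    -o financial.gpg financial.txt
```

Only the machine holding the primary secret key can ever decrypt <code>financial.gpg</code>; the work machine is simply not a recipient.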
<h3>Git</h3>
<p>The <code>~/pw</code> directory is a <a href="http://git-scm.com/">git</a> repository. This gives me version control on all of my passwords. If I accidentally delete an entry I can always get it back. It also provides syncing and redundant storage without depending on a third-party like Dropbox.</p>
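<p>Setting this up is nothing more than a standard git workflow. A sketch, with temporary directories standing in for <code>~/pw</code> and the remote host:</p>

```shell
# Sketch only: temporary directories stand in for ~/pw and for a bare
# repository on a self-hosted remote. The identity is illustrative.
PASSDIR="$(mktemp -d)"
REMOTE="$(mktemp -d)/pw.git"

git init --bare "$REMOTE"

cd "$PASSDIR"
git init
git config user.email 'pw@example.com'
git config user.name 'pw'
echo 'stand-in for encrypted data' > financial.gpg
git add financial.gpg
git commit -m 'Add financial database'
git remote add origin "$REMOTE"
git push -u origin HEAD
```

Since every file in the repository is already encrypted, the remote never sees plain text.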
<h3>Keys</h3>
<p>An advantage of using a directory full of encrypted files as my password manager is that I’m not limited to only storing usernames and passwords. Any file can be added to the repository. I keep keys for backups, SSH keys, and SSL keys (all of which have been encrypted with my GnuPG key) in the directory. This gives me one location for all of my authentication credentials, which simplifies the locating and backing up of these important files.</p>
<h2>Markup</h2>
<p>Each file is structured with <a href="http://vimdoc.sourceforge.net/htmldoc/fold.html">Vim folds</a> and indentation. There are various ways for Vim to fold text. I use markers, sticking with the default <code>{{{</code>/<code>}}}</code> characters. A typical password entry will look like this:</p>
<div class="highlight"><table class="highlighttable"><tr><td class="linenos"><div class="linenodiv"><pre><span class="normal">1</span>
<span class="normal">2</span>
<span class="normal">3</span>
<span class="normal">4</span>
<span class="normal">5</span></pre></div></td><td class="code"><div><pre><span></span><code>Amazon{{{
user: foo@bar.com
pass: supers3cr3t
url: https://amazon.com
}}}
</code></pre></div></td></tr></table></div>
<p>Each file is full of entries like this. Some entries are grouped together within other folds for organization. Some entries have comments so that I have a record of the false personally identifiable information the service requested when I registered.</p>
<div class="highlight"><table class="highlighttable"><tr><td class="linenos"><div class="linenodiv"><pre><span class="normal">1</span>
<span class="normal">2</span>
<span class="normal">3</span>
<span class="normal">4</span>
<span class="normal">5</span>
<span class="normal">6</span>
<span class="normal">7</span>
<span class="normal">8</span></pre></div></td><td class="code"><div><pre><span></span><code>Super Ecommerce{{{
user: foobar
pass: g0d
Comments{{{
birthday: 1/1/1911
first car: delorean
}}}
}}}
</code></pre></div></td></tr></table></div>
<p>Following a consistent structure like this makes the file easier to navigate and allows for the possibility of the file being parsed by a script. The fold markers come into play with my Vim configuration.</p>
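<p>As an illustration of that parseability, a small shell function (hypothetical, not something I actually use) can walk the fold markers and pull a single field out of a decrypted database read on stdin:</p>

```shell
# pwgrep ENTRY FIELD - print FIELD (e.g. "pass") of the named entry,
# reading decrypted database text on stdin. Hypothetical helper.
pwgrep() {
    awk -v entry="$1" -v field="$2:" '
        index($0, entry "{{{") { found = 1 }
        found && $1 == field  { print $2; exit }
    '
}

# Typical use, piping through gpg and into the clipboard:
#   gpg --quiet --decrypt ~/pw/ecommerce.gpg | pwgrep Amazon pass | xsel -b
```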
<h2>Vim</h2>
<p>I use Vim with the <a href="https://github.com/jamessan/vim-gnupg">vim-gnupg</a> plugin. This makes editing of encrypted files seamless. When opening existing files, the contents are decrypted. When opening new files, the plugin asks which recipients the file should be encrypted to. When a file is open, leaking the clear text is avoided by disabling <a href="http://vimdoc.sourceforge.net/htmldoc/starting.html#viminfo">viminfo</a>, <a href="http://vimdoc.sourceforge.net/htmldoc/options.html#%27swapfile%27">swapfile</a>, and <a href="http://vimdoc.sourceforge.net/htmldoc/options.html#%27undofile%27">undofile</a>. I run <code>gpg-agent</code> so that my passphrase is remembered for a short period of time after I use it. This makes it easy and secure to work with (and create) the encrypted files with Vim. I define a few extra options in my <a href="https://github.com/pigmonkey/dotfiles/blob/master/vimrc">vimrc</a> to facilitate working with passwords.</p>
<div class="highlight"><table class="highlighttable"><tr><td class="linenos"><div class="linenodiv"><pre><span class="normal"> 1</span>
<span class="normal"> 2</span>
<span class="normal"> 3</span>
<span class="normal"> 4</span>
<span class="normal"> 5</span>
<span class="normal"> 6</span>
<span class="normal"> 7</span>
<span class="normal"> 8</span>
<span class="normal"> 9</span>
<span class="normal">10</span>
<span class="normal">11</span>
<span class="normal">12</span>
<span class="normal">13</span>
<span class="normal">14</span>
<span class="normal">15</span>
<span class="normal">16</span>
<span class="normal">17</span>
<span class="normal">18</span>
<span class="normal">19</span>
<span class="normal">20</span>
<span class="normal">21</span>
<span class="normal">22</span>
<span class="normal">23</span>
<span class="normal">24</span>
<span class="normal">25</span>
<span class="normal">26</span>
<span class="normal">27</span></pre></div></td><td class="code"><div><pre><span></span><code><span class="c">""""""""""""""""""""</span>
<span class="c">" GnuPG Extensions "</span>
<span class="c">""""""""""""""""""""</span>
<span class="c">" Tell the GnuPG plugin to armor new files.</span>
<span class="k">let</span> <span class="k">g</span>:GPGPreferArmor<span class="p">=</span><span class="m">1</span>
<span class="c">" Tell the GnuPG plugin to sign new files.</span>
<span class="k">let</span> <span class="k">g</span>:GPGPreferSign<span class="p">=</span><span class="m">1</span>
augroup GnuPGExtra
<span class="c">" Set extra file options.</span>
autocmd <span class="nb">BufReadCmd</span><span class="p">,</span><span class="nb">FileReadCmd</span> *.\<span class="p">(</span>gpg\<span class="p">|</span><span class="k">asc</span>\<span class="p">|</span>pgp\<span class="p">)</span> <span class="k">call</span> SetGPGOptions<span class="p">()</span>
<span class="c">" Automatically close unmodified files after inactivity.</span>
autocmd <span class="nb">CursorHold</span> *.\<span class="p">(</span>gpg\<span class="p">|</span><span class="k">asc</span>\<span class="p">|</span>pgp\<span class="p">)</span> quit
augroup END
<span class="k">function</span> SetGPGOptions<span class="p">()</span>
<span class="c">" Set updatetime to 1 minute.</span>
<span class="k">set</span> <span class="nb">updatetime</span><span class="p">=</span><span class="m">60000</span>
<span class="c">" Fold at markers.</span>
<span class="k">set</span> <span class="nb">foldmethod</span><span class="p">=</span>marker
<span class="c">" Automatically close all folds.</span>
<span class="k">set</span> <span class="k">foldclose</span><span class="p">=</span><span class="k">all</span>
<span class="c">" Only open folds with insert commands.</span>
<span class="k">set</span> <span class="k">foldopen</span><span class="p">=</span>insert
<span class="k">endfunction</span>
</code></pre></div></td></tr></table></div>
<p>The first two options simply tell vim-gnupg to always ASCII-armor and sign new files. These have nothing particular to do with password management, but are good practices for all encrypted files.</p>
<p>The first <code>autocmd</code> calls a function which holds the options that I wanted applied to my password files. I have these options apply to all encrypted files, although they’re intended primarily for use when Vim is acting as my password manager.</p>
<h3>Folding</h3>
<p>The primary shortcoming with using an encrypted text file as a password database is the lack of protection against shoulder-surfing. After the file has been decrypted and opened, anyone standing behind you can look over your shoulder and view all the entries. This is solved with <a href="http://vim.wikia.com/wiki/Folding">folds</a> and is what most of these extra options address.</p>
<p>I set <a href="http://vimdoc.sourceforge.net/htmldoc/options.html#%27foldmethod%27">foldmethod</a> to <code>marker</code> so that Vim knows to look for all the <code>{{{</code>/<code>}}}</code> characters and use them to build the folds. Then I set <a href="http://vimdoc.sourceforge.net/htmldoc/options.html#%27foldclose%27">foldclose</a> to <code>all</code>. This closes all folds unless the cursor is in them. This way only one fold can be open at a time – or, to put it another way, only one password entry is ever visible at once.</p>
<p>The final fold option instructs Vim when it is allowed to open folds. Folds can always be opened manually, but by default Vim will also open them for many other cases: if you navigate to a fold, jump to a mark within a fold or search for a pattern within a fold, they will open. By setting <a href="http://vimdoc.sourceforge.net/htmldoc/options.html#%27foldopen%27">foldopen</a> to <code>insert</code> I instruct Vim that the only time it should automatically open a fold is if my cursor is in a fold and I change to insert mode. The effect of this is that when I open a file, all folds are closed by default. I can navigate through the file, search and jump through matches, all without opening any of the folds and inadvertently exposing the passwords on my screen. The fold will open if I change to insert mode within it, but it is difficult to do that by mistake.</p>
<p>I have my <a href="https://github.com/pigmonkey/dotfiles/blob/master/vimrc#L116">spacebar set up to toggle folds</a> within Vim. After I have navigated to the desired entry, I can simply whack the spacebar to open it and copy the credential that I need to the system clipboard. At that point I can whack the spacebar again to close the fold, or I can quit Vim. Or I can simply wait.</p>
<h3>Locking</h3>
<p>The other special option I set is <a href="http://vimdoc.sourceforge.net/htmldoc/options.html#%27updatetime%27">updatetime</a>. Vim uses this option to determine when it should write swap files for crash recovery. Since vim-gnupg disables swap files for decrypted files, this has no effect. I use it for something else.</p>
<p>In the second <code>autocmd</code> I tell Vim to close itself on <a href="http://vimdoc.sourceforge.net/htmldoc/autocmd.html#CursorHold">CursorHold</a>. <code>CursorHold</code> is triggered whenever no key has been pressed for the time specified by <code>updatetime</code>. So the effect of this is that my password files are automatically closed after 1 minute of inactivity. This is similar to KeePassX’s behaviour of “locking the workspace” after a set period of inactivity.</p>
<h3>Clipboard</h3>
<p>To easily copy a credential to the system clipboard from Vim I have two <a href="https://github.com/pigmonkey/dotfiles/blob/master/vimrc#L175">shortcuts</a> mapped.</p>
<div class="highlight"><table class="highlighttable"><tr><td class="linenos"><div class="linenodiv"><pre><span class="normal">1</span>
<span class="normal">2</span>
<span class="normal">3</span>
<span class="normal">4</span>
<span class="normal">5</span></pre></div></td><td class="code"><div><pre><span></span><code>" Yank WORD to system clipboard in normal mode
nmap &lt;leader&gt;y "+yE
" Yank selection to system clipboard in visual mode
vmap &lt;leader&gt;y "+y
</code></pre></div></td></tr></table></div>
<p>Vim can access the system clipboard using both the <code>*</code> and <code>+</code> registers. I opt to use <code>+</code> because <a href="http://vimdoc.sourceforge.net/htmldoc/gui_x11.html#x11-selection">X treats it as a selection rather than a cut-buffer</a>. As the Vim documentation explains:</p>
<blockquote>
<p>Selections are “owned” by an application, and disappear when that application (e.g., Vim) exits, thus losing the data, whereas cut-buffers, are stored within the X-server itself and remain until written over or the X-server exits (e.g., upon logging out).</p>
</blockquote>
<p>The result is that I can copy a username or password by placing the cursor on its first character and hitting <code>&lt;leader&gt;y</code>. I can paste the credential wherever it is needed. After I close Vim, or after Vim closes itself after 1 minute of inactivity, the credential is removed from the clipboard. This replicates KeePassX’s behaviour of clearing the clipboard so many seconds after a username or password has been copied.</p>
<h2>Generation</h2>
<p>Passwords should be long and unique. To satisfy this, any password manager needs some sort of password generator. Vim provides this with its ability to <a href="http://vim.wikia.com/wiki/Append_output_of_an_external_command.">call and read external commands</a>. I can tell Vim to call the standard-issue <a href="http://linux.die.net/man/1/pwgen">pwgen</a> program to generate a secure 24-character password utilizing special characters and insert the output at the cursor, like this:</p>
<div class="highlight"><table class="highlighttable"><tr><td class="linenos"><div class="linenodiv"><pre><span class="normal">1</span></pre></div></td><td class="code"><div><pre><span></span><code><span class="p">:</span><span class="k">r</span><span class="p">!</span>pwgen <span class="p">-</span><span class="k">sy</span> <span class="m">24</span> <span class="m">1</span>
</code></pre></div></td></tr></table></div>
<h2>Backups</h2>
<p>The <code>~/pw</code> directory is backed up in the same way as most other things on my hard drive: to <a href="http://www.tarsnap.com/">Tarsnap</a> via <a href="/2012/09/tarsnapper-managing-tarsnap-backups/">Tarsnapper</a>, to an external drive via <a href="http://www.rsnapshot.org/">rsnapshot</a> and <a href="/2012/09/cryptshot-automated-encrypted-backups-rsnapshot/">cryptshot</a>, and via <a href="https://wiki.archlinux.org/index.php/Full_System_Backup_with_rsync">rsync to a mirror drive</a>. The issue with these standard backups is that they’re all encrypted and the keys to decrypt them are stored in the password manager. If I lose <code>~/pw</code> I’ll have plenty of backups around, but none that I can actually access. I address this problem with regular backups to optical media.</p>
<p>At the beginning of every month I burn the password directory to two CDs. One copy is stored at home and the other at an off-site location. I began these optical media backups in December, so I currently have two sets consisting of five discs each. Any one of these discs will provide me with the keys I need to access a backup made with one of the more frequent methods.</p>
<p>Of course, all the files being burned to these discs are still encrypted with my GnuPG key. If I lose that key or passphrase I will have no way to decrypt any of these files. Protecting one’s GnuPG key is another problem entirely. I’ve taken steps that make me feel confident in my ability to always be able to recover a copy of my key, but none that I’m comfortable discussing publicly.</p>
<h2>Shell</h2>
<p>I’ve defined a <a href="https://github.com/pigmonkey/dotfiles/blob/master/shellrc#L70">shell function</a>, <code>pw()</code>, that operates exactly like the function I use for <a href="/2012/12/notes-unix/">notes on Unix</a>.</p>
<div class="highlight"><table class="highlighttable"><tr><td class="linenos"><div class="linenodiv"><pre><span class="normal"> 1</span>
<span class="normal"> 2</span>
<span class="normal"> 3</span>
<span class="normal"> 4</span>
<span class="normal"> 5</span>
<span class="normal"> 6</span>
<span class="normal"> 7</span>
<span class="normal"> 8</span>
<span class="normal"> 9</span>
<span class="normal">10</span>
<span class="normal">11</span></pre></div></td><td class="code"><div><pre><span></span><code><span class="c1"># Set the password database directory.</span>
<span class="nv">PASSDIR</span><span class="o">=</span>~/pw
<span class="c1"># Create or edit password databases.</span>
pw<span class="o">()</span> <span class="o">{</span>
<span class="nb">cd</span> <span class="s2">"</span><span class="nv">$PASSDIR</span><span class="s2">"</span>
<span class="k">if</span> <span class="o">[</span> ! -z <span class="s2">"</span><span class="nv">$1</span><span class="s2">"</span> <span class="o">]</span><span class="p">;</span> <span class="k">then</span>
<span class="nv">$EDITOR</span> <span class="k">$(</span>buildfile <span class="s2">"</span><span class="nv">$1</span><span class="s2">"</span><span class="k">)</span>
<span class="nb">cd</span> <span class="s2">"</span><span class="nv">$OLDPWD</span><span class="s2">"</span>
<span class="k">fi</span>
<span class="o">}</span>
</code></pre></div></td></tr></table></div>
<p>This allows me to easily open any password file from wherever I am in the filesystem without specifying the full path. These two commands are equivalent, but the one utilizing <code>pw()</code> requires fewer keystrokes:</p>
<div class="highlight"><table class="highlighttable"><tr><td class="linenos"><div class="linenodiv"><pre><span class="normal">1</span>
<span class="normal">2</span></pre></div></td><td class="code"><div><pre><span></span><code>$ vim ~/pw/financial.gpg
$ pw financial
</code></pre></div></td></tr></table></div>
<p>The function changes to the password directory before opening the file so that while I’m in Vim I can drop down to a shell with <code>:sh</code> and already be in the proper directory to manipulate the files. After I close Vim the function returns me to the previous working directory.</p>
<p>This still required a few more keystrokes than I liked, so I configured my shell to <a href="https://github.com/pigmonkey/dotfiles/blob/master/zshrc#L44">perform autocompletion in the directory</a>. If <code>financial.gpg</code> is the only file in the directory beginning with an “f”, typing <code>pw f&lt;tab&gt;</code> is all that is required to open the file.</p>
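<p>In zsh, a completion definition along these lines does the job (an illustrative sketch, not necessarily the exact line from my zshrc):</p>

```shell
# Complete arguments to pw with the .gpg file names found in $PASSDIR,
# minus their extension. The (:r) glob modifier strips the extension.
compctl -W "$PASSDIR" -g '*.gpg(:r)' pw
```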
<h2>Simplicity</h2>
<p>This setup provides <a href="https://wiki.archlinux.org/index.php/The_Arch_Way#Simplicity">simplicity</a>, power, and portability. It uses the same tools that I already employ in my daily life, and does not require the use of the mouse or any graphical windows. I’ve been happily utilizing it for about 6 months now.</p>
<p>Initially I had thought I would supplement the setup with a script that would search the databases for a desired entry, using some combination of <code>grep</code>, <code>awk</code> and <code>cut</code>, and then copy it to my clipboard via <code>xsel</code>. As it turns out, I haven’t felt the desire to do this. Simply opening the file in Vim, searching for the desired entry, opening the fold and copying the credential to the system clipboard is quick enough. The whole process, absent of typing in my passphrase, takes me only a couple of seconds.</p>
<h2>Resources</h2>
<p>I’m certainly not the first to come up with the idea of managing passwords with Vim. These resources were particularly useful to me when I was researching the possibilities:</p>
<ul>
<li><a href="http://connermcd.com/blog/2012/05/01/file-encryption-and-password-management/">File encryption and password management</a> by Conner McDaniel</li>
<li><a href="http://vim.wikia.com/wiki/Keep_passwords_in_encrypted_file">Keep passwords in encrypted file</a> on the Vim Wiki</li>
<li><a href="http://www.noah.org/wiki/Password_Safe_with_Vim_and_OpenSSL">Password Safe with Vim and OpenSSL</a> by Noah</li>
</ul>
<p>If you’re interested in other ideas for password management, <a href="http://zx2c4.com/projects/password-store/">password-store</a> and <a href="http://raymontag.github.com/keepassc/">KeePassC</a> are both neat projects that I follow.</p>
<div class="notice">
<p>2013 June 30: <a href="http://blog.oddbit.com/">larsks</a> has hacked together a <a href="https://gist.github.com/larsks/5868076">Python script</a> to convert KeepassX XML exports to the plain-text markup format that I use.</p>
</div>Gwern offers an excellent overview of Silk Road.2013-02-23T00:00:00-08:002013-02-23T00:00:00-08:00Pig Monkeytag:pig-monkey.com,2013-02-23:/2013/02/silk-road/<p>In the essay he <a href="http://www.gwern.net/Silk%20Road">introduces the website and describes his experience as a user</a> purchasing illegal drugs. It is well worth the read. I’ve spent hours on <a href="http://www.gwern.net/">his website</a> perusing his other works.</p>When mentioning the cypherpunks, I like to point to Moxie Marlinkspike's explanation of their failure.2012-12-12T00:00:00-08:002012-12-12T00:00:00-08:00Pig Monkeytag:pig-monkey.com,2012-12-12:/2012/12/moxie-marlinkspike-defcon-18/<p>In <a href="https://www.youtube.com/watch?v=eG0KrT6pBPk">his talk from Defcon 18</a> (<a href="http://privacy-pc.com/news/changing-threats-to-privacy-moxie-marlinspike-on-privacy-threats.html">transcript available</a>), <a href="http://www.thoughtcrime.org/">Moxie</a> argues that what we were preparing for was fascism and what we got was social democracy. For me it was an eye-opening explanation, and one that I think is important to understand given the ever-increasing <a href="https://en.wikipedia.org/wiki/Network_effect">network effect</a> of technologies that are …</p><p>In <a href="https://www.youtube.com/watch?v=eG0KrT6pBPk">his talk from Defcon 18</a> (<a href="http://privacy-pc.com/news/changing-threats-to-privacy-moxie-marlinspike-on-privacy-threats.html">transcript available</a>), <a href="http://www.thoughtcrime.org/">Moxie</a> argues that what we were preparing for was fascism and what we got was social democracy. 
For me it was an eye-opening explanation, and one that I think is important to understand given the ever-increasing <a href="https://en.wikipedia.org/wiki/Network_effect">network effect</a> of technologies that are not only a danger to personal privacy but can also grow to <a href="https://www.youtube.com/watch?v=sKOk4Y4inVY">threaten free thought</a>.</p>Currently reading This Machine Kills Secrets by Andy Greenberg2012-12-11T00:00:00-08:002012-12-22T00:00:00-08:00Pig Monkeytag:pig-monkey.com,2012-12-11:/2012/12/currently-reading-machine-kills-secrets-andy-greenberg/<p>When <a href="http://www.amazon.com/This-Machine-Kills-Secrets-WikiLeakers/dp/0525953205">the book</a> was first published I assumed it would be just another entry into the media hubbub around <a href="http://wikileaks.org/">WikiLeaks</a>. When I saw that <a href="http://cryptome.org/">John Young</a> – cranky old man of the cypherpunk movement – gave it <a href="http://www.amazon.com/review/R6ZIEYW08RO13/">a positive review</a> I decided that it would be worth a read. While the book …</p><p>When <a href="http://www.amazon.com/This-Machine-Kills-Secrets-WikiLeakers/dp/0525953205">the book</a> was first published I assumed it would be just another entry into the media hubbub around <a href="http://wikileaks.org/">WikiLeaks</a>. When I saw that <a href="http://cryptome.org/">John Young</a> – cranky old man of the cypherpunk movement – gave it <a href="http://www.amazon.com/review/R6ZIEYW08RO13/">a positive review</a> I decided that it would be worth a read. 
While the book does center on <a href="https://en.wikipedia.org/wiki/Julian_Assange">Assange</a>, <a href="http://blogs.forbes.com/andygreenberg/">Greenberg</a> does an admirable job of tracing the history of the <a href="https://en.wikipedia.org/wiki/Cypherpunk">cypherpunks</a> and describing what in the future we will probably refer to as a sequel to <a href="http://www.amazon.com/Crypto-Rebels-Government-Privacy-Digital/dp/0140244328">the cryptowars</a>. It is a recommended read.</p>