Education in Futility: WarpWallet Brute Forcing

So, WarpWallet is a so-called brain wallet for Bitcoin: you only have to remember a relatively short passphrase, and it deterministically generates the corresponding private key. It uses memory- and CPU-hard cryptographic algorithms so that brute-forcing is slowed way down; generating a single private key takes considerable time. Their Javascript implementation takes over 10 seconds per key on my machine.
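
Under the hood (as I understand the published scheme; the exact parameters here are from memory, so treat this as a sketch rather than a verified reimplementation) it runs scrypt and PBKDF2-HMAC-SHA256 over the passphrase and a salt, each with a distinct suffix byte, and XORs the two outputs to form the 256-bit private-key seed. Roughly, in Go:

package main

import (
	"crypto/sha256"
	"fmt"

	"golang.org/x/crypto/pbkdf2"
	"golang.org/x/crypto/scrypt"
)

// warpSeed derives the 256-bit private-key seed from a passphrase and salt.
// Parameters (N=2^18 for scrypt, 2^16 PBKDF2 iterations, suffix bytes 0x01
// and 0x02) are my recollection of the published scheme, not gospel.
func warpSeed(passphrase, salt string) ([]byte, error) {
	// scrypt half: this is the memory- and CPU-hard part.
	s1, err := scrypt.Key(append([]byte(passphrase), 0x01),
		append([]byte(salt), 0x01), 1<<18, 8, 1, 32)
	if err != nil {
		return nil, err
	}
	// PBKDF2-HMAC-SHA256 half.
	s2 := pbkdf2.Key(append([]byte(passphrase), 0x02),
		append([]byte(salt), 0x02), 1<<16, 32, sha256.New)
	// XOR the two halves to get the seed the keypair is derived from.
	seed := make([]byte, 32)
	for i := range seed {
		seed[i] = s1[i] ^ s2[i]
	}
	return seed, nil
}

func main() {
	seed, err := warpSeed("example passphrase", "user@example.com")
	if err != nil {
		panic(err)
	}
	fmt.Printf("%x\n", seed)
}

The scrypt cost (N = 2^18) is what makes each attempt so expensive, which is the whole point of the design.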

So the makers of it had set up cracking challenges. By the time I stumbled on it, only the last challenge was left, with six months remaining. For that challenge, the reward for cracking an 8-character alphanumeric passphrase was 20 BTC (and the matching BCH and BTG!), worth over $100,000 USD as of the challenge's end date.

Since their Javascript implementation is terribly slow, I wondered if anyone had ported it to another language, and found a Go version, but it was outdated and would not compile. So, as my first exercise in Go, I updated it and got it compiling. Instead of 10+ seconds per keypair, it took about one second. But it took its input from the command line, so I decided to build a brute forcer around this newly updated generator: it would feed in the passphrase and salt, store the result (the private key and public key), and let me check the results later.

So the basic design was this:
My WarpWallet Brute Forcer (using Go WarpWallet implementation) -> SQL Database

The brute forcer underwent many revisions. At first it checked the history of passphrases to ensure no duplicates were stored, but that check eventually took longer than generating the keypairs themselves, slowing the whole process down. So it was eliminated; with 62^8 (roughly 2.2 × 10^14) possible 8-character alphanumeric passphrases, the chance of generating the same one twice was virtually nil, about the same odds as finding the correct passphrase.
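
Drawing the candidates is the easy part. A minimal sketch of how a generator for 8-character alphanumeric candidates might look (the real one may well have differed): it just picks uniformly from the 62-character alphabet using crypto/rand.

package main

import (
	"crypto/rand"
	"fmt"
	"math/big"
)

const alphabet = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"

// randomPassphrase draws n characters uniformly from the 62-character
// alphanumeric alphabet using the OS CSPRNG.
func randomPassphrase(n int) (string, error) {
	out := make([]byte, n)
	for i := range out {
		idx, err := rand.Int(rand.Reader, big.NewInt(int64(len(alphabet))))
		if err != nil {
			return "", err
		}
		out[i] = alphabet[idx.Int64()]
	}
	return string(out), nil
}

func main() {
	p, err := randomPassphrase(8)
	if err != nil {
		panic(err)
	}
	fmt.Println(p) // e.g. "aB3xK9qZ"
}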

At first it also stored nothing besides the date and the passphrase; the client checked each public key against the target and discarded the rest. That meant if the client was killed before I could check its output, I was out of luck! Later improvements added the private key, the public key, and the hostname of the computer that generated it (since I put every idle personal computer I had on the job).
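
In spirit, the final storage path looked something like the sketch below. The table layout, column names, and MySQL driver are illustrative guesses, not the real schema, but the idea is the same: record when and where each keypair was generated along with the passphrase and keys, so results survive a dead client.

package main

import (
	"database/sql"
	"os"
	"time"

	_ "github.com/go-sql-driver/mysql" // driver choice is illustrative
)

// result is one generated keypair plus enough context to check it later.
type result struct {
	When       time.Time
	Passphrase string
	PrivKey    string
	PubKey     string
	Host       string
}

// store writes a result row so nothing is lost if the client dies.
func store(db *sql.DB, r result) error {
	_, err := db.Exec(
		`INSERT INTO keypairs (generated_at, passphrase, privkey, pubkey, host)
		 VALUES (?, ?, ?, ?, ?)`,
		r.When, r.Passphrase, r.PrivKey, r.PubKey, r.Host)
	return err
}

func main() {
	db, err := sql.Open("mysql", "user:pass@/warpwallet") // hypothetical DSN
	if err != nil {
		panic(err)
	}
	defer db.Close()

	host, _ := os.Hostname()
	if err := store(db, result{
		When:       time.Now(),
		Passphrase: "aB3xK9qZ", // from the random generator
		PrivKey:    "...",      // private key from the generator
		PubKey:     "...",      // corresponding public address
		Host:       host,
	}); err != nil {
		panic(err)
	}
}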

Another misstep was making the Go pipeline's select loop sleep. First it slept 100 ms if no channel had data or their buffers were full, then I increased it to 250 ms inexplicably, then I realized that a select without a default case simply blocks until a channel is ready. The sleep was leaving processing power on the table: removing it on my main desktop gave a ~20% improvement in performance (from 5.12 to 6.14 keypairs/s on an i7).
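
The mistake in miniature (channel names here are illustrative, not the real pipeline): a select with a default case never blocks, so the loop either spins or, with the sleep I added, sits idle even while work is waiting. Dropping the default case lets select block until something is ready.

package main

import (
	"fmt"
	"time"
)

// pollingLoop is roughly what I had: the default case means select never
// blocks, so the loop needs a sleep to avoid spinning, and that sleep
// wastes time whenever work is actually available.
func pollingLoop(jobs <-chan string, done <-chan struct{}) {
	for {
		select {
		case j := <-jobs:
			fmt.Println("queued", j)
		case <-done:
			return
		default:
			time.Sleep(250 * time.Millisecond) // arbitrary, and costly
		}
	}
}

// blockingLoop is the fix: with no default case, select simply blocks until
// one of the channels is ready, leaving no processing power on the table.
func blockingLoop(jobs <-chan string, done <-chan struct{}) {
	for {
		select {
		case j := <-jobs:
			fmt.Println("queued", j)
		case <-done:
			return
		}
	}
}

func main() {
	jobs := make(chan string, 2)
	done := make(chan struct{})
	jobs <- "aB3xK9qZ"
	jobs <- "Zz81QkPm"
	go func() { time.Sleep(time.Second); close(done) }()
	blockingLoop(jobs, done)
	_ = pollingLoop // kept only for comparison
}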

Below are the contributions from various machines. The IPs at the end are AWS servers, the largest chunk of which came from a single c4.xlarge over one day! [Chart: keypairs generated per machine]

And then, on January 1st, 2018, the challenge expired. I was left with just over 24 million rows, about 4.5 GB of data, and it takes a few seconds to check any single result. I investigated testing each public address to see if it had a balance, but on my local Bitcoin node it takes minutes to rescan the blockchain for transactions involving a newly added address, and web APIs rate-limit you to the point where it would take a year or so to test them all (less if I spread requests across API providers). So, in the end, I just deleted all the results. It was fun, I learned a lot about Go and cryptocurrency nodes, and I’m ready for the next, hopefully more fruitful, project.


Reverse Engineering the Firmware on a Kenwood DDX9903S

I bought and really like my Kenwood Excelon DDX9903S headunit. I had it in my WRX, and moved it to my LS430. It supports Android Auto and CarPlay, which I find really useful when driving.

However, it shows a nag screen every time it boots. That got me curious about how it works, and whether it could be patched to skip the disclaimer. I figured it probably ran Linux on an SoC, as pretty much everything does nowadays. So I grabbed the latest firmware for it (mine was already updated to it) and started probing.

S_V2_7_0008_0600_AT1.zip

Extract that and you get 3 folders under S_V2_7_0008_0600/:

BOOT_V2_7_0008_0600_release/
MAIN_V1_0_2758_0400/
SOC_V2_7_0008_0600/

In each there’s a .nfu file, an extension I’d never encountered before. I ran binwalk on each:

[BOOT_V2_7_0008_0600_release]$ binwalk Boot_2.7.0008.0600.nfu
DECIMAL       HEXADECIMAL     DESCRIPTION
--------------------------------------------------------------------------------
248776        0x3CBC8         Android bootimg, kernel size: 0 bytes, kernel addr: 0x4F525245, ramdisk size: 1226848850 bytes, ramdisk addr: 0x6C61766E, product name: "ERROR: Cannot read kernel image"
1571592       0x17FB08        ELF, 64-bit LSB shared object, AMD x86-64, version 1 (SYSV)
2358024       0x23FB08        ELF, 64-bit LSB shared object, AMD x86-64, version 1 (SYSV)
3209992       0x30FB08        ELF, 64-bit LSB shared object, AMD x86-64, version 1 (SYSV)

Surprise, surprise, it runs Android. But I suspect this image is just the firmware updater, and not what I’m looking for.
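
To poke at those embedded files, binwalk’s -e extraction would do, but carving by hand from the offsets above is easy enough. A quick Go sketch along those lines (the length is just a guess up to the next hit; trim the result afterwards):

package main

import (
	"io"
	"os"
)

// carve copies length bytes starting at offset from src into dst.
func carve(src string, offset, length int64, dst string) error {
	in, err := os.Open(src)
	if err != nil {
		return err
	}
	defer in.Close()
	if _, err := in.Seek(offset, io.SeekStart); err != nil {
		return err
	}
	out, err := os.Create(dst)
	if err != nil {
		return err
	}
	defer out.Close()
	_, err = io.CopyN(out, in, length)
	return err
}

func main() {
	// First ELF is at 0x17FB08; carve up to the next binwalk hit at 0x23FB08.
	if err := carve("Boot_2.7.0008.0600.nfu", 0x17FB08, 0x23FB08-0x17FB08, "blob1.elf"); err != nil {
		panic(err)
	}
}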


Oculus Rift and Touch

I’m super excited for the imminent onslaught of consumer-grade VR equipment and game support. I think it will change how games are played from now on. Not all games, mind you; it takes a certain level of involvement to strap on a head-mounted display and get into the experience. The military already uses similar technology extensively for training. It’s just a matter of months before the equipment hits shelves.

That said, there’s already a serious problem with it. When you strap on a VR headset, your mouse and keyboard disappear. They seem antiquated as you tilt and peek below the headset to find the right keys. HIDs are going to need a total revamp to work well with head-mounted VR.

Enter the Oculus Touch. Basically two Wii motion controllers. It makes sense. I can’t say it’s the best solution, because no one really knows what is (well, aside from 100% perfect hand and finger motion tracking without any device attached). It’s a great start, and I can see it working well.

(Image via Engadget)

For example, one thing I’m really excited for is VR support in my favorite combat flight sim, Eagle Dynamics’ DCS. I’ve seen some footage of the Oculus in use with it, and I’ve used the Oculus DK2 for development and with other games. But DCS involves some serious keyboard use while playing. Even if you have a nice HOTAS, you can’t map every function down to its toggle switches and buttons, and even then you have to know them by feel. What DCS does offer (somewhat uniquely, if I’m not mistaken) is the ability to use the in-cockpit, on-screen controls, currently with the mouse.

The new Oculus Touch should be able to handle that. Reach out, and you see a hand reach out on screen. Move your hand over to the landing-gear switch and press a button. How’s that for integration? This isn’t anything new; the technology has been around for years, but no one has made a solid controller for the PC, nor has any game I’m aware of supported it.

I was already excited for the Oculus (and other VR HMDs). First-person shooters are also about to see a big change. The current heavy reliance on mouse input for looking and keyboard input for moving makes a lot of FPSs all about mouse/keyboard coordination. HMDs will allow a more realistic experience, if that’s what’s desired. I’m more about the simulation than kill counts (or “360 no-scopes”), so I can’t wait.

Action Camera Review: Garmin Virb Elite

After seeing a fellow MSNE autocrosser post a video with nice data overlays, I asked what data logger and software he used. It turns out it wasn’t just a data logger; it was a camera that gathered his speed, acceleration, and so on for him: the Garmin Virb Elite. The ‘Elite’ distinguishes it from the base variant, which lacks the sensors and data logging.

I’d never heard of it, and I thought I kept at least a cursory eye on the action camera market. I was familiar with, but not impressed by, GoPro’s Hero line (and always confused about which version had what features); the Hero prices always seemed steep for what they were. I’d talked with other autocrossers about their cameras, and I’d looked at the Sonys, the knock-offs, and the halfway knock-offs. But then again, I’d never actually bought a real action camera, and I’m definitely not a professional video producer. I am decently tech-savvy, though.

So, I immediately looked it up. $260 on Amazon at the time (more on that later). Good reviews. I read the review sites. A couple of days of contemplation later, I decided, “Hell, at worst it’s a good starter camera,” and bought one. Another couple of days later, and I’m happy to say I’m thoroughly impressed.

So, let me quickly point out some pros and cons that mattered or were interesting to me:

Pros:

  • Records 720p60, 1080p30, and lower. That’s about as far as my video needs go.
  • Wireless control via phone app (iOS & Android)
  • Easy record/off slider (glove-friendly), with LED feedback
  • Onboard LCD viewfinder (not backlit)
  • Wireless viewfinder
  • GoPro mount adapter (to use GoPro accessories)
  • Long battery life
  • GPS and accelerometer data logging to the non-proprietary GPX (XML) format
  • Decent video editing software at no extra cost
  • Can do dash-cam duty (overwrite oldest recordings)
  • Can invert video by 180° via setting

Cons:

  • No suction cup mount
  • No tripod mount (1/4″-20 threaded hole)

For another $5 or so, you can purchase a tripod mount on Amazon, so it’s no big deal. It still would have been nice to have one in the box anyway.

What sealed the deal for me was the sensor data and the GPX output format. It just so happens that the post-production video software I use, RaceRender, handles GPX in addition to the MSL (MegaSquirt log file) format. So I can get RPM, boost level, and throttle position from the MSL log and combine them with speed and G-force from the camera. Cool! That’s a lot of data, more than I can display in the video frame at one time.
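
Part of why GPX is so convenient is that it’s plain XML: the standard trackpoint fields below are all any tool needs for position and timing, and the Virb’s extra sensor channels ride along in vendor extension elements (whose names I won’t guess at here). A minimal Go sketch of reading the standard fields, with a hypothetical filename:

package main

import (
	"encoding/xml"
	"fmt"
	"os"
	"time"
)

// Only the standard GPX structure is modeled here.
type gpxFile struct {
	Tracks []struct {
		Segments []struct {
			Points []trkpt `xml:"trkpt"`
		} `xml:"trkseg"`
	} `xml:"trk"`
}

type trkpt struct {
	Lat  float64   `xml:"lat,attr"`
	Lon  float64   `xml:"lon,attr"`
	Ele  float64   `xml:"ele"`
	Time time.Time `xml:"time"`
}

func main() {
	f, err := os.Open("run.gpx") // hypothetical filename
	if err != nil {
		panic(err)
	}
	defer f.Close()

	var g gpxFile
	if err := xml.NewDecoder(f).Decode(&g); err != nil {
		panic(err)
	}
	for _, trk := range g.Tracks {
		for _, seg := range trk.Segments {
			for _, p := range seg.Points {
				fmt.Printf("%s  %.6f,%.6f  ele %.1f m\n",
					p.Time.Format(time.RFC3339), p.Lat, p.Lon, p.Ele)
			}
		}
	}
}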
