Bitcoin Mining on Mac OS X – Bitfury ASICs

A new batch of ASICs is hitting the market based on the 55nm Bitfury ASIC chip (abbreviated BF1). The most popular of these are USB sticks branded as either Blue Fury or Red Fury, depending on the color of the onboard LED. The BF1 Fury sticks look very similar to the ASICMINER Block Erupter sticks. However, while the Block Erupters hash at 335Mh/s, the new BF1 Fury sticks hash at anywhere from 2.2Gh/s up to 3.0Gh/s.

Red Fury

Miner Installation

As with the ASICMINER Block Erupters, the first step is to install either bfgminer or cgminer on OS X. There are several ways to go about this, from compiling them yourself, to using Homebrew, to downloading precompiled binaries.

There is a thread here on the Bitcoin Talk forums which discusses various ways to install cgminer and bfgminer on Mac OS X. One of the more foolproof methods is to use Homebrew:

  1. Launch Terminal.app from Spotlight or from your Applications folder
  2. Install Homebrew by entering the following command:
    ruby -e "$(curl -fsSL https://raw.github.com/mxcl/homebrew/go/install)"
  3. Run the following command and fix any reported issues:
    brew doctor
  4. Tap this Homebrew repository so that you can install packages from it:
    brew tap nwoolls/xgminer
  5. Finally, install either cgminer or bfgminer:
    brew install cgminer

If you’d like to install the miner using the latest source from Github rather than the latest official package, use the --HEAD parameter, e.g.:

brew install bfgminer --HEAD

Driver Installation

As with the ASICMINER Block Erupters, bfgminer requires the correct kernel extension to be loaded in order to detect the BF1 Fury stick. And, as with the Block Erupters, cgminer will fail to detect the BF1 Fury stick unless that same kernel extension is unloaded. Unlike with the Block Erupter, there is no driver or software to download. When using bfgminer, the Apple Communication Device Class (CDC) driver is used. When using cgminer, that driver (kernel extension) must be unloaded.

To load the required Apple drivers (if you intend to use bfgminer) execute the following commands:

sudo kextload -b com.apple.driver.AppleUSBCDC
sudo kextload -b com.apple.driver.AppleUSBCDCACMData

To unload those drivers (if you intend to use cgminer) execute the following:

sudo kextunload -b com.apple.driver.AppleUSBCDC
sudo kextunload -b com.apple.driver.AppleUSBCDCACMData
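Since the same two kexts have to be toggled every time you switch miners, a small wrapper script can make the switch less error-prone. The script below is my own sketch (the script name, `miner_kext` function, and DRYRUN switch are illustrative conventions, not part of either miner):

```shell
#!/bin/bash
# miner-kext.sh -- load or unload the Apple CDC kexts for the chosen miner.
# Set DRYRUN=1 to print the kext commands instead of running them.
KEXTS="com.apple.driver.AppleUSBCDC com.apple.driver.AppleUSBCDCACMData"

run() { if [ -n "$DRYRUN" ]; then echo "$@"; else sudo "$@"; fi; }

miner_kext() {
  case "$1" in
    bfgminer) action=kextload ;;    # bfgminer talks to the stick via the CDC driver
    cgminer)  action=kextunload ;;  # cgminer needs direct USB access instead
    *) echo "usage: miner_kext bfgminer|cgminer" >&2; return 1 ;;
  esac
  for kext in $KEXTS; do
    run "$action" -b "$kext"
  done
}

if [ $# -gt 0 ]; then
  miner_kext "$1"
fi
```

Run it as `sudo ./miner-kext.sh bfgminer` before starting bfgminer, or `sudo ./miner-kext.sh cgminer` before starting cgminer.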

Detection

Once you have the proper Bitcoin mining software installed and the kernel extensions loaded (or unloaded), you can use the -d? argument to list available devices with bfgminer:

bfgminer -d? -S bigpic:all
[2013-11-21 16:38:38] Started bfgminer 3.6.0
[2013-11-21 16:38:38] Devices detected:
[2013-11-21 16:38:38] 0. BPM 0 (driver: bigpic)
1 devices listed

or with cgminer:

cgminer -d?
[2013-11-21 16:49:06] Started cgminer 3.8.2
[2013-11-21 16:49:09] Devices detected:
[2013-11-21 16:49:09] 0. BF1 0 (driver: bitfury)
[2013-11-21 16:49:09] 1 devices listed

Mining

Once the BF1 Blue Fury or Red Fury is detected you can fire up your chosen miner using the -o, -u and -p arguments to start hashing away:

cgminer -o hostname -u username -p password

Running bfgminer requires an additional -S argument:

bfgminer -S bigpic:all -o hostname -u username -p password

bfgminer Red Fury

Feel free to leave any questions below or on this thread on the Bitcoin Talk forums. And happy mining!

Cryptocoin Mining on Mac OS X – Wallet Backups

If you’ve taken cryptocoin mining beyond the hobby stages, you probably have at least one digital wallet on your system – possibly several. What happens if your OS were to crash? If you were to reinstall? Where would your accumulated cryptocurrency be?

Gone.

It is very important that you back up the private keys that secure your digital wallet so that you can restore them at a later time. Otherwise the keys required to claim ownership of your coins will be gone forever. It is also important that you back up your wallets regularly. It is not enough to back up your keys just once. See this entry on securing the bitcoind wallet for more details.

So what steps can you take on Mac OS X to keep your wallets backed up? Luckily OS X comes with several tools that help along the way. Additionally, the Unix foundation of OS X allows us to fall back on some time-tested solutions for this project.

Step One – Create an Encrypted Disk Image

Step one will be to create an encrypted disk image used to store the wallet.dat backups. This is what Steve Gibson refers to as “Pre-Internet Encryption” – encrypting important data before backing it up to the Web.

  1. Launch Disk Utility from the Applications folder
  2. Click New Image
  3. Save the image somewhere that is itself backed up (e.g. Dropbox or Wuala)
  4. For the name, specify Wallet Backups (so that the below scripts match)
  5. For Encryption, select either 128 or 256 (256 is more secure and should be plenty fast for this)
  6. Click Create

New Disk Image

Step Two – Create a Backup Script

Now that we have a secure place to store wallet backups, the next step is to create a script that will do the grunt work. Namely, it should mount the encrypted disk image, back up any wallet.dat files (we could have Bitcoin, Litecoin, and who knows what other digital wallets – get them all), and then unmount the encrypted image.

We can do all of this with Automator and a couple of shell scripts.

  1. Launch Automator from the Applications folder
  2. Select Application and click Choose
  3. Add a Get Specified Finder Items action
  4. Select the DMG file created in Step One
  5. Add a Mount Disk Image action
  6. Add a Run Shell Script action and enter:
    cd ~/Library/Application\ Support && rsync -R ./*/wallet.dat /Volumes/Wallet\ Backups/
  7. Add another Run Shell Script action and enter:
    diskutil unmount /Volumes/Wallet\ Backups/
  8. Click File, Save
  9. Save as BackupWallets.app in Applications

Complete Automator Application

Step Three – Schedule the Backup

Armed with encrypted storage and a backup script, the only thing left is to schedule the backup. The Unix foundation of OS X means we can do this by editing the system crontab, a configuration file that specifies the commands to run for cron, the Unix job scheduler.

  1. Launch Terminal from the Applications folder
  2. Enter the following and press Return:
    export EDITOR=nano && crontab -e
  3. Enter the following into the nano editor:
    # backup wallets at midnight every Sunday
    0 0 * * 0 open /Applications/BackupWallets.app
  4. Type Ctrl+X, Y, Return

crontab

Conclusion

And we’re done. You can check your disk image each Sunday after midnight to ensure that your wallets have been backed up. You can also run the BackupWallets.app application to back up your wallets on demand.

Delphi Shelf Life Reaches a New Low

I am going to try to keep this post short and to the point. I don’t want to rant (too much) or say anything I regret (too much), but something has to be said.

When it comes to iOS development, Delphi XE4, a major product released by Embarcadero five months ago, is now obsolete. If you want support for iOS 7 you must buy their new product, Delphi XE5.

Let’s take a step back and look at the facts when it comes to Delphi and Embarcadero chasing the mobile landscape:

  • Delphi XE2 – released September 2011. Claims support for iOS development but, by all reports, fails to deliver. iOS development makes use of the FreePascal compiler and cannot use traditional Delphi code.
  • Delphi XE3 – released September 2012. Support for iOS development completely removed. Anyone who built their iOS app on the foundation of XE2 was left out in the cold.
  • Delphi XE4 – released April 2013. Claims support for iOS development (again). Anyone who wants the iOS development support promised in XE2 must now buy XE4, released as a new product only seven months after XE3.

And now Delphi XE5 has been released only five months after Delphi XE4. It’s another new version and another paid upgrade.

Here’s the real rub though. iOS 7 was just released by Apple. It features a new UI and new navigation elements for apps. Anyone using Xcode (the traditional tool for iOS development which is free) could simply recompile their apps and resubmit them to support iOS 7.

What about Delphi XE4 customers? The ones who just put down their hard-earned money for iOS support, again, five months ago? They are left out in the cold. Again. If a Delphi XE4 customer wants to support iOS 7 they must now purchase Delphi XE5. I confirmed this myself with a back-and-forth on Twitter with Jim McKeeth from Embarcadero:

Jim goes on to point out that, if you forgo using the bundled UI kit in your iOS app, you can still leverage the new iOS 7 elements using Delphi XE4:

However, this is basically suggesting the customer not use the same UI technology that is the very heart of Embarcadero’s marketing strategy for several releases now: FireMonkey.

To be clear, I am not upset by a six month release cycle. A lot of companies do that and it’s a smart move for some of them. However, Embarcadero is releasing Delphi like a subscription while selling it like a packaged product. While they offer “Software Assurance” this is a far cry from a subscription. This is an added fee on top of your purchase that allows you to get major upgrades that happen to be released within a year. It’s insurance. It’s the type of thing most of us pass up when checking out at Best Buy.

All in all, this has just left a horrible taste in my mouth and the mouths of many other developers. My advice? If Delphi has turned into a subscription then charge a subscription. Stop packaging it as something that will be obsolete in five months without another purchase.

Resources for the Self Employed Software Developer

After a year of working for myself as a software consultant, this Monday I begin a new position at IDMWORKS. And, while I’ve had a blast being self-employed, I’m very excited to start this new chapter in my career with a lot of really cool ladies and gentlemen.

As I wind down my consulting I thought I’d do a blog post describing some of the resources I’ve used for the past few years in order to work with my customers as a software consultant and freelance developer. One of the fun parts of this venture was learning about all of the really awesome services out there – at amazing prices – that help the solo consultant hit the ground running.

Time Tracking & Invoicing

Harvest

For tracking time and invoicing customers, I really dig Harvest. It does everything I need and then some, and it’s priced right. Harvest is very easy to use and lets you manage:

  • Clients
  • Projects
  • Time Sheets
  • Invoices
  • Retainers
  • Payments

It also lets you accept payments via PayPal, Stripe, or Authorize.Net, sends out automatic invoice reminders, and more. When it comes to time tracking, they have a very nice HTML5 page for that, or you can use mobile apps and desktop widgets.

And, you can use it for free until you need more than two projects or more than four clients. After that, if you are the only user you are looking at a whopping $12/month.

Contracts

Contractually

In order to get projects under way you’ll eventually need to draw up some contracts and get them signed. I’m a fan of Contractually for getting this done. They have a library of contract templates available to customize, and you can save your customized templates for re-use later. From there you can invite folks to review and, optionally, edit the contract online with full version control. Once both parties accept the contract, both can sign the contract digitally. With the latest changes from the team at Contractually, the party you invite to review and sign no longer has to create a Contractually account.

Like Harvest and the rest of the resources on this list, Contractually is priced right. The price has gone up since they launched, but you can still get a solo account for $49/year, which is a bargain for getting this level of ease when it comes to the contract process.

Project/Task Management

Asana

So everyone’s all “Trello”! Honestly, I really like Asana for project/task management. It’s a very straightforward “traditional” task management system that lets you break things down into workspaces, then projects, and finally tasks. Tasks can have sub-task lists, and it’s very easy to invite customers to participate in individual workspaces.

Asana is completely free for teams up to 15 people. After that their pricing model scales up nicely.

Hosted Source Control

Bitbucket

As with project management, there’s already a strong incumbent in this category: GitHub. And I love GitHub, especially for working on collaborative, open source projects. The workflow is just superb. But it costs money to host private repositories, and you must pay more as you add repositories. To me this discourages version controlling projects and keeping them offsite.

Bitbucket is a really wonderful product. The only real weakness is that it’s not GitHub. And everyone uses GitHub for collaborative projects. But if you need somewhere to store your private projects with great features and the ability to easily invite your customers, Bitbucket gives you that and is free – including unlimited private repositories – for up to 5 users.

And while we’re on the topic, Atlassian also provides a wonderful Git client for OS X (and Windows) called SourceTree.

Hosted Servers/Services

Windows Azure

If you follow my blog or my Twitter account you’ll know I’m a fan (if sometimes a critic) of the Windows Azure services. To me, there is no single stronger tool a self-employed software consultant can have under her belt. Eventually you are going to need to host things somewhere that isn’t your machine. In my experience, the hosting options out there come in two flavors: cheap and horrible, or expensive and great.

Windows Azure gives you hosted environments for many different things, from websites to full virtual machines (Windows and Linux) to SQL data, off-site storage and APIs for mobile applications. And the pricing is very attractive. All of the services let you start off for free and the portal and services are structured in such a way that you will be warned before you are ever billed. From there, the pricing scales very nicely.

Most importantly, Windows Azure is absolutely a high priority for Microsoft. This is obvious from their recent developer conferences and product releases. For now it looks like Windows Azure is more of an Xbox than a Silverlight.

Training/Education

Pluralsight

The independent software consultant must constantly stay up-to-date on the available technologies in the field and how (and when) to exploit them. And Pluralsight is just a fantastic resource for training and education on the top technologies in development today. They go far beyond just how-to and include great details on the whys of what you are watching.

And to stick with our established pattern, the pricing doesn’t suck. Starting at $29/month you get access to their entire catalog of courses. This one is a no-brainer, folks.

Conclusion

I’ll still be blogging here, plus I’ll be contributing to the IDMWORKS blog going forward. Feel free to share any resources you’ve found useful in the comments and good luck!

Painless File Backups to Azure Storage

Windows Azure

In my previous post I discussed steps and utilities for backing up Azure SQL Databases in order to guard against data loss due to user- or program-error. Since then I’ve started investigating options for backing up files – specifically those in Azure Virtual Machines – to the same Azure Storage service used previously.

Just like before I was delighted to find an existing app that makes this super easy. The AzCopy utility makes it possible to copy files to and from Azure Storage and local storage, or from Azure Storage to Azure Storage, with a nice set of arguments.

For instance, the following command will copy all of the files in a local folder, recursively, over to Azure Storage. Files that already exist at the destination are skipped unless the local copy is newer, in which case they are overwritten. Perfect.

AzCopy.exe C:\Here\Be\Important\Things http://yourstoragename.blob.core.windows.net/yourstoragecontainer /destKey:YourSuperLongAzureStorageKey /S /V /Y /XO

The Azure storage account name and access key can be accessed in the Storage section of the portal, by clicking the Manage Access Keys button at the bottom of the Windows Azure Portal.

Azure Storage Information

This command took under a minute to back up 3,000 files to Azure Storage from an Azure Virtual Machine. From there I can keep running the command and it will copy only new or updated files over to Azure Storage.

As in my previous post, my little utility AzureStorageCleanup is a nice companion to this process. I have updated the source on Github to include a -recursive argument, which will remove files within the virtual hierarchy found in blob storage (created by the recursive option in AzCopy).

AzureStorageCleanup.exe -storagename yourstoragename -storagekey YourSuperLongAzureStorageKey -container yourstoragecontainer -mindaysold 60 -recursive

By scheduling AzureStorageCleanup to run with the -recursive option, you can remove old files to keep storage use in-check.

Painless Azure SQL Database Backups

Windows Azure

While the SQL Database service from Windows Azure provides resiliency and redundancy, there is no built in backup feature to guard against data loss due to user- or program-error. The advised way to handle this is to take a three-step approach:

  1. Make a copy of the SQL Database
  2. Backup the database copy to Azure Storage
  3. Maintain & remove any outdated backups on blob storage

The process in Windows Azure that backs up a SQL Database to blob storage is not transactionally consistent, which is why the initial database copy is required.

Richard Astbury has provided an excellent tool, SQLDatabaseBackup, that takes care of the first two steps with little fuss:

SQLDatabaseBackup.exe -datacenter eastus -server hghtd75jf9 -database MyDatabase -user DbUser -pwd DbPassword -storagename mybackups -storagekey YourSuperLongAzureStorageKey -cleanup

The data center and server name can be obtained from the SQL Databases section of the Windows Azure Portal.

SQL Database Information

The Azure storage account name and access key can be accessed in the Storage section of the portal, by clicking the Manage Access Keys button at the bottom of the portal.

Azure Storage Information

Finally, by specifying the -cleanup argument, the utility will delete the SQL Database copy it creates after the backup is successfully created.

And while the pricing for Azure blob storage is very affordable, you may want to automate the process of deleting old backups. I’ve created a very simple utility that does just that. AzureStorageCleanup uses command line arguments that mirror the SQLDatabaseBackup project (as it is meant to complement its use):

AzureStorageCleanup.exe -storagename mybackups -storagekey YourSuperLongAzureStoragekey -container sqlbackup -mindaysold 60

The above command will remove files equal-to-or-older-than sixty days from the container “sqlbackup” – the default container used by SQLDatabaseBackup. The details of each file deleted are printed to the console.

By scheduling these two utilities on an available machine you’ll have painless, affordable backups for any of your Windows Azure SQL Databases.

Bitcoin Mining on Mac OS X – ASICs

While it’s been possible to purchase ASICs (Application Specific Integrated Circuits – chips specifically created to mine Bitcoins in this case) for several months, it has been a difficult and risky process involving auctions on forums with one-way exchanges of money for promises of future hardware. However, recently ASICMINER has made it much easier to purchase their Blades and USB miners. You can pick up one of the USB ASIC miners for around $90 USD (at current exchange rates). One of these will hash at around 333 Mh/s at a fraction of the power usage of modern GPUs.

USB Block Erupter

Installation

To get started using a USB ASIC under OS X, such as the Block Erupter from ASICMINER, you will first need to install either cgminer or bfgminer. You can refer to my previous article for details on installing cgminer or bfgminer under OS X. The basic steps are:

  1. Launch Terminal.app from Spotlight or your Applications folder
  2. Install Homebrew by entering the following command:
    ruby -e "$(curl -fsSL https://raw.github.com/mxcl/homebrew/go/install)"
  3. Run the following command and fix any reported issues:
    brew doctor
  4. Tap this Homebrew repository so that you can install packages from it:
    brew tap nwoolls/xgminer
  5. Finally, install either cgminer or bfgminer:
    brew install cgminer

If you’d like to install the miner using the latest source from Github rather than the latest official package, use the --HEAD parameter, e.g.:

brew install bfgminer --HEAD

NOTE: if you are going to use bfgminer, at this time you must use the above --HEAD parameter to get the latest changes from Github. Otherwise USB devices may not be recognized automatically.

Also, if you are using bfgminer you must install the CP210x USB to UART Bridge VCP driver found here. This driver will break USB support in cgminer. Click here for instructions on removing the driver. If you get a page saying your session timed out, visit the main knowledge base site first, and then navigate to the removal instructions. I’ve also reproduced their instructions below:

To uninstall the VCP driver from a Mac OS machine, just drag the driver to the trash from the System/Library/Extensions folder (SilabsUSBDriver) and then reboot the machine.

To verify if a driver is present, plug in a CP210x device and check the /dev directory for a device named “tty.SLAB_USBtoUART”. If this is not present, it means no VCP driver is active, or that the CP210x device has a VID/PID combination that does not match the driver installed. The VID/PID can be found in the device listing in the System Profiler (even if a corresponding driver is not installed).
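That verification step can be written as a quick shell check. The messages are my own wording; the device path comes from the instructions above:

```shell
# Quick check: has the Silicon Labs VCP driver claimed a CP210x device?
check_vcp() {
  if [ -e /dev/tty.SLAB_USBtoUART ]; then
    echo "VCP driver is active"
  else
    echo "no VCP driver loaded (or VID/PID mismatch)"
  fi
}
check_vcp
```

Remember the VCP driver should be present for bfgminer and absent for cgminer.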

Detection

Once you have successfully installed one of the above Bitcoin miners, use the following commands to probe for the ASIC USB devices:

cgminer --ndevs

The above command should show USB details for each Erupter ASIC:

[2013-07-01 17:16:30] USB all: found 18 devices - listing known devices
.USB dev 0: Bus 58 Device 2 ID: 10c4:ea60
Manufacturer: 'Silicon Labs'
Product: 'CP2102 USB to UART Bridge Controller'
.USB dev 1: Bus 93 Device 2 ID: 10c4:ea60
Manufacturer: 'Silicon Labs'
Product: 'CP2102 USB to UART Bridge Controller'
[2013-07-01 17:16:30] 2 known USB devices

For bfgminer, use the following command:

bfgminer -S all -d?

The bfgminer output should show ICA devices for the Erupter ASIC:

[2013-07-01 17:15:12] Started bfgminer 3.1.1
[2013-07-01 17:15:17] Devices detected:
[2013-07-01 17:15:17] 0. OCL 0 (driver: opencl)
[2013-07-01 17:15:17] 1. OCL 1 (driver: opencl)
[2013-07-01 17:15:17] 2. ICA 0 (driver: icarus)
[2013-07-01 17:15:17] 3. ICA 1 (driver: icarus)
[2013-07-01 17:15:17] 4 devices listed

Mining

Finally, once you have ensured the miner is recognizing your ASIC, you can start mining with it. You can start cgminer without any special parameters:

cgminer -o hostname -u username -p password

Or start bfgminer with the -S all parameter:

bfgminer -o hostname -u username -p password -S all

cgminer USB Erupter

If you run into any issues you can leave comments below. However, if your issues are specific to cgminer or bfgminer you can find specific forum threads for them here and here.

UPDATE: Several readers have asked how to mine using only their ASIC devices and not their GPUs. Both cgminer and bfgminer support the --disable-gpu argument:

cgminer -o hostname -u username -p password --disable-gpu
bfgminer -o hostname -u username -p password -S all --disable-gpu

Integration Testing ASP.NET MVC Projects with SpecsFor.Mvc

SpecsFor.Mvc

I’ve recently spent some time looking into different frameworks for doing integration testing for ASP.NET MVC projects. One that caught my eye almost immediately was SpecsFor.Mvc. Unlike other solutions I found for writing integration tests, SpecsFor.Mvc lets you write tests in a fashion that is very similar to writing unit tests, using strongly-typed access to your application’s data without having to script or dig into the DOM.

Some nice things that SpecsFor.Mvc provides out-of-the-box:

  • Hosts your ASP.NET MVC project, building a specified configuration of your project and then hosting it automatically under an instance of IIS Express
  • Provides strongly-typed methods for navigating to controllers and actions, checking route results, and filling out and submitting forms
  • Provides access to validation data, including access to the validation summary as well as the validity of each property in your view’s model

SpecsFor.Mvc uses Selenium WebDriver internally in order to drive the browser. You can still access the Selenium IWebDriver interface any time you need to dig further into your page.

Let’s take a look at these things in practice by writing a few hypothetical tests written against the stock ASP.NET MVC Internet Application.

Starting the Project

To get started, create a new ASP.NET MVC 4 project in Visual Studio.

New ASP.NET MVC 4 Web Application

Select the Internet Application project template, check the option to create a unit test project and click OK.

Internet Application Project Template

Once both the ASP.NET MVC project and the unit test project have been created, right-click the References folder under your Tests project and click Manage Nuget Packages.

Manage Nuget packages

Under Online, search for and install the official SpecsFor.Mvc Nuget package.

SpecsFor.Mvc Nuget Package

Initializing the Hosting Environment

The next thing that we need to add to the Tests project is some code that will initialize the IIS Express hosting environment using the classes provided by SpecsFor.Mvc. To do this, create a new class called MvcAppConfig with the following contents (adjust the namespace as needed):

using Microsoft.VisualStudio.TestTools.UnitTesting;
using SpecsFor.Mvc;

namespace SpecsForMvcDemo2.IntegrationTests
{
    [TestClass]
    public class MvcAppConfig
    {
        private static SpecsForIntegrationHost integrationHost;

        [AssemblyInitialize()]
        public static void MyAssemblyInitialize(TestContext testContext)
        {
            var config = new SpecsForMvcConfig();

            config.UseIISExpress()
                .With(Project.Named("SpecsForMvcDemo"))
                .ApplyWebConfigTransformForConfig("Debug");

            config.BuildRoutesUsing(r => RouteConfig.RegisterRoutes(r));

            config.UseBrowser(BrowserDriver.Chrome);

            integrationHost = new SpecsForIntegrationHost(config);
            integrationHost.Start();
        }

        [AssemblyCleanup()]
        public static void MyAssemblyCleanup()
        {
            integrationHost.Shutdown();
        }
    }
}

The class is marked as a TestClass even though there are no explicit methods to test. This is required: the AssemblyInitialize and AssemblyCleanup attributes only take effect when their class is marked with the TestClass attribute. With this code in place, MyAssemblyInitialize() will run once before all of the test methods in the project and MyAssemblyCleanup() will run once after they all complete.

The code found in MyAssemblyInitialize() is fairly straightforward given the clarity of the SpecsFor.Mvc API. A new SpecsForMvcConfig instance is created and set to use IIS Express with a given project name and configuration name. Next, a call to BuildRoutesUsing is made in order to register the various controllers and actions with SpecsFor.Mvc. Finally, the browser is specified and the configuration is used to start a new instance of the SpecsForIntegrationHost.

The MyAssemblyCleanup() method, paired with the AssemblyCleanup attribute, is used to shut down the integration host after all the tests have completed.

Initializing the Browser

Now that we have code in place to host the ASP.NET MVC site before any tests are run, we need some code in place to create an instance of our MVC application in a browser. Right-click the Tests project and add a new Unit Test.

Add Unit Test

Add the following code to the top of your new UnitTest1 class, before the TestMethod1 declaration:

private static MvcWebApp app;

[ClassInitialize]
public static void MyClassInitialize(TestContext testContext)
{
    //arrange
    app = new MvcWebApp();
}

This will require adding a using statement for SpecsFor.Mvc.

Using SpecsFor.Mvc

This new method, MyClassInitialize(), will run before all of the tests in the new UnitTest1 class. It will create a new instance of the MvcWebApp class, which will launch the browser with your application loaded.

If you go ahead and run the tests for UnitTest1 now you’ll see that two console windows are opened, one for IIS Express hosting the ASP.NET application and one for the Selenium WebDriver that is driving your application. In addition, after the Selenium WebDriver console window is opened, the browser specified in the MvcAppConfig class will be launched.

Note that you may get a prompt from Windows Firewall that you’ll need to allow.

Firewall Alert

Because we haven’t actually written any tests yet, all these windows will close after they are opened, but this demonstrates that these few lines of code used to bootstrap the environment are working.

Authentication Tests

Now that all the setup work is done, let’s see what some actual integration tests look like using SpecsFor.Mvc. The first test will ensure that, if a user tries to navigate to the /account/manage route of the ASP.NET MVC application without logging in, they will be redirected to the login screen.

[TestMethod]
public void AccountManage_WithoutSession_RedirectsToLogin()
{
    //act
    AccountController.ManageMessageId? messageId = null;
    app.NavigateTo<AccountController>(c => c.Manage(messageId));

    //assert
    const string returnUrl = "%2fAccount%2fManage";
    app.Route.ShouldMapTo<AccountController>(c => c.Login(returnUrl));
}

This test will require adding two new items to the using statements: YourProjectName.Controllers and MvcContrib.TestHelper (MvcContrib.TestHelper is needed for the call to ShouldMapTo).

And that’s it for the first integration test. I love it. It’s clear, concise, and (aside from the return URL path) it’s strongly typed. The call to NavigateTo will navigate to the URL corresponding to the AccountController and the Manage action, specified in the lambda expression. The call to ShouldMapTo will ensure that the resulting route corresponds to the AccountController and Login action (with the proper ReturnUrl parameter).

Let’s add two more tests to illustrate a few more examples using SpecsFor.Mvc:

[TestMethod]
public void Login_InvalidInput_TriggersValidation()
{
    //act
    app.NavigateTo<AccountController>(c => c.Login(string.Empty));
    app.FindFormFor<LoginModel>()
        .Field(f => f.UserName).SetValueTo(string.Empty)
        .Field(f => f.Password).SetValueTo(string.Empty)
        .Submit();

    //assert
    app.FindFormFor<LoginModel>()
        .Field(f => f.UserName).ShouldBeInvalid();
    app.FindFormFor<LoginModel>()
        .Field(f => f.Password).ShouldBeInvalid();
}

[TestMethod]
public void Login_InvalidCredentials_TriggersValidation()
{
    //act
    app.NavigateTo<AccountController>(c => c.Login(string.Empty));
    app.FindFormFor<LoginModel>()
        .Field(f => f.UserName).SetValueTo(Guid.NewGuid().ToString())
        .Field(f => f.Password).SetValueTo(Guid.NewGuid().ToString())
        .Submit();

    //assert
    app.ValidationSummary.Text.AssertStringContains("incorrect");
}

These tests will require adding a using statement for YourProjectName.Models so that the LoginModel class can be accessed.

Again, looking at the code, I love the simplicity and clarity in the SpecsFor.Mvc tests. I can use NavigateTo to navigate to my controller and action, and then use FindFormFor to access my view’s model. Finally I can submit the form with easy access to the resulting validation data.

Unfortunately, if you try to run these new tests right now they will fail. The reason is that the SpecsFor.Mvc initialization code compiles and deploys a fresh copy of the ASP.NET MVC project to a TestSite folder within the Debug folder. The contents of the App_Data folder are not included in the ASP.NET MVC Visual Studio project, so the database files are not deployed to the TestSite folder and the site will throw a YSOD (Yellow Screen of Death) if you try to do anything that requires the database.

No DB YSOD

To fix this, right-click the App_Data folder in your main MVC project and click Add>Existing Item.

Add Existing Item

Then, add the two files found in your physical App_Data folder to the project (you’ll need to have run the MVC site and accessed the database at least once so that these files exist).

After adding the MDF and LDF files to the project you should be able to run all of the authentication integration tests without error.

The Big But

Pee-Wee Big But

Now this all sounds great, but

At the time I’m writing this, SpecsFor.Mvc tests run great under third-party test runners such as TestDriven.Net and CodeRush. However, the tests don’t run under Visual Studio’s MSTest runner. Trying to run the tests using Visual Studio’s built in test runner will result in a “Build failed” error. The author of SpecsFor.Mvc has reproduced the issue and is hoping to have it fixed within a couple of days.

UPDATE: This issue has since been resolved by Matt and is no longer a problem in version 2.4.0. No more buts!

Resources

Bitcoin & Litecoin Proxy Mining on Mac OS X

Bitcoin

When it comes to mining Bitcoins and Litecoins there are two major protocols involved: the older Getwork protocol and the newer Stratum protocol. At this point Stratum has all but replaced Getwork: all major mining pools and mining software support Stratum, and Getwork is deprecated.

This is usually not a problem; however, some older mining utilities (for example the Litecoin CPU miner) do not support Stratum, and some pools, such as Slush’s Bitcoin pool or WeMineLTC, have limited-to-no support for Getwork. This is where a very nice Python utility called the Stratum Mining Proxy can help.

You can use the Stratum Mining Proxy on any computer to connect to your desired Bitcoin or Litecoin pool. Then you can connect your mining software to the IP address of the computer the proxy mining software is running on. All of the Getwork network requests from the mining software will be reshaped into the Stratum protocol and then forwarded on to the mining pool.
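Stratum itself is line-delimited JSON-RPC over a plain TCP socket, and the proxy’s job is essentially to translate each Getwork HTTP request into messages of this form. As a rough illustration only (this is not the proxy’s actual code, and the helper name is mine), here is what building a couple of Stratum messages might look like in Python:

```python
import json

def make_stratum_request(req_id, method, params):
    """Build one line-delimited JSON-RPC message as used by Stratum."""
    return json.dumps({"id": req_id, "method": method, "params": params}) + "\n"

# A miner (or the proxy, on behalf of a Getwork client) starts by subscribing:
subscribe = make_stratum_request(1, "mining.subscribe", [])

# ...and then authorizes a worker with the pool credentials:
authorize = make_stratum_request(2, "mining.authorize",
                                 ["pool-username", "pool-password"])

print(subscribe.strip())
print(authorize.strip())
```

Each message is a single line terminated by a newline, which is what makes the protocol cheap for the proxy to multiplex over one upstream connection.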

Installing the Xcode Command Line Tools

In order to install the mining proxy you’ll need a set of command line utilities for compiling software. You have two options:

  1. Install Xcode for free from the App Store and then use Xcode to install the Command Line Tools package
  2. Install only the Command Line Tools package from an unofficial (non-Apple) source

If you would rather play it safe and stick to as many trusted sources as possible, or if you plan to make use of the Xcode IDE, go with the first option. If you’d rather save some disk space and don’t mind using a widely used – if unofficial – source, go with option two.

Option 1: Install Xcode and Command Line Tools

To get started with this option you will need to install Xcode, which is a free download from the Apple App Store.

Xcode on App Store

Next you’ll use Xcode to install the Command Line Tools. Launch Xcode and then click the Xcode>Preferences menu item. Click the Downloads tab and then click Install next to Command Line Tools.

Xcode Downloads

If the text next to Xcode Command Line Tools says Installed, carry on to the next step.

Option 2: Install just the Command Line Tools

If you would like to install only the GCC Command Line Tools you can download the package for your version of OS X here.

Standalone GCC Installer

Simply run the setup package after downloading and step through the installer.

Downloading the Mining Proxy

The original Stratum Mining Proxy project was created by Slush, who also runs one of the more popular mining pools and is deeply involved in Bitcoin development, having proposed both the concept of pooled mining and the Stratum protocol itself.

However, his original project supports Bitcoin miners and pools only. To support both Bitcoin and Litecoin mining you’ll want to use this fork, which adds support for the scrypt algorithm used by Litecoin. You can either download the latest source as a zip file or use Git to clone the repository.

Installing Scrypt (Litecoin) Support

If you downloaded a zip file of the mining proxy source rather than using Git to clone the repository, extract the zip file contents by double-clicking.

Next, open Terminal.app from the Applications folder or using Spotlight. Use the cd command to navigate to the mining proxy source. For instance, if you downloaded the source as a zip file:

cd ~/Downloads/stratum-mining-proxy-master

Next, change directory to the Litecoin scrypt module:

cd litecoin_scrypt

Finally, install the scrypt module for Litecoin support:

sudo python setup.py install

Installing the Stratum Mining Proxy

Now, in the Terminal app window change the working directory back up to the root of the mining proxy source:

cd ..

And finally install the mining proxy:

sudo python setup.py install

Install Mining Proxy

Using the Mining Proxy

Once the mining proxy is installed, using it is fairly straightforward. Normally your Bitcoin or Litecoin miner connects directly to a pool’s host and port with a specific username and password. For instance:

minerd -o http://pool-host:pool-port -u pool-username -p pool-password -t thread-count

With the mining proxy you’ll want to first run the proxy itself, connecting to the pool’s host and port:

mining_proxy.py -nm -o pool-host -p pool-port

If you are mining Litecoins, add the -pa parameter to select the scrypt algorithm:

mining_proxy.py -nm -pa scrypt -o pool-host -p pool-port

Mining Proxy Output

Then run your miner, but specify your computer’s IP address and the port number reported by the mining proxy output:

minerd -o http://your-ip:port-from-output -u pool-username -p pool-password -t thread-count

Mining Proxy Output - Clients
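Before pointing several miners at the proxy, it can be handy to confirm the proxy’s port is actually reachable from the mining machine. A minimal sketch (the address and port below are placeholders; use your proxy machine’s IP and the port reported in the proxy’s startup output):

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Replace with your proxy machine's IP and the port from the proxy output.
if port_open("127.0.0.1", 8332):
    print("proxy is reachable")
else:
    print("proxy is not reachable")
```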

Other Benefits

There are a couple of other benefits of the Stratum Mining Proxy that are worth noting:

  1. The mining proxy supports both Getwork and Stratum clients. This means you can use the proxy to consolidate network traffic regardless of the mining protocol.
  2. Using the mining proxy gives you a simple way to get a heads-up view of which machines are alive and which are submitting accepted work by referring to the Terminal output.

Resources

To learn more about mining Bitcoins and Litecoins on OS X, see my previous blog posts:

Bitcoin Mining – Part 1
Bitcoin Mining – Part 2
Bitcoin Mining – Part 3
Litecoin Mining – Part 1
Litecoin Mining – Part 2

Litecoin Mining on Mac OS X – GPU Mining

Litecoin Logo

In my previous article on Litecoin mining I discussed how you can get started mining Litecoins, an alternative to the Bitcoin crypto-currency, using your spare CPU cycles. If your GPU is already mining Bitcoins, using your CPU to mine Litecoins may be an obvious choice. However, if you are “all in” on Litecoin then you can get a sizable performance increase by using your GPU to mine Litecoins. Whether that trade-off is worthwhile is a decision you will need to make: Litecoin is not as mature as Bitcoin and is just as volatile. Some see it as a doomed clone, others as the next Bitcoin, poised to take off.

GPU Mining

In order to start using your GPU to mine Litecoins on OS X, you’ll need to first install cgminer or bfgminer. Please read my previous article on Bitcoin mining for step-by-step instructions and installation packages for cgminer and bfgminer. The Homebrew formulas I shared in my previous article include the configuration settings necessary to mine both Bitcoins and Litecoins. If you are compiling the applications yourself, make sure to pass the --enable-scrypt parameter when running ./configure.

Armed with a copy of cgminer or bfgminer with scrypt enabled, you can now start mining Litecoins with your GPU. A fair warning: finding a good set of parameters for mining Litecoins with cgminer or bfgminer is far more finicky and time-consuming than it is for Bitcoin. With Bitcoins you can basically run either miner and just specify your pool; the miner will then tune itself and eventually reach a good hash rate.

This isn’t really the case when mining Litecoins. For instance, here is the basic command line I use to mine Litecoins with my ATI 5770 (note that I am using the -d parameter to restrict mining to a single GPU to simplify these examples):

cgminer --scrypt -o host:port -u username -p password -d 0

cgminer No Params

With no additional tuning my 5770 gets about 5Kh/s; my CPU alone gets around 29Kh/s. Using bfgminer with the basic command-line parameters yields similar results. As noted in my previous article, I use --no-opencl-binaries to work around a crash in bfgminer with multiple GPUs on OS X.

bfgminer --scrypt -o host:port -u username -p password -d 0 --no-opencl-binaries

bfgminer No Params

So how can we get better performance mining Litecoins on the GPU? The first parameter to focus on is the -I parameter, or the intensity. With Bitcoin this parameter does help, but honestly not a ton. If you aren’t using the PC, and you leave the intensity at “Desktop” (-I d), both cgminer and bfgminer will eventually reach around the same performance you’d get by specifying something like -I 9.

This is not the case with Litecoin mining. With scrypt (Litecoin’s hashing algorithm) you need to specify a high intensity, in fact higher than is suggested for Bitcoin. I’d start at around 10 and go up from there. On my system, 15-16 is about the best I can do.

So let’s try cgminer with -I 13 to see what that does:

cgminer --scrypt -o host:port -u username -p password -d 0 -I 13

cgminer with I

That change alone triples the Kh/s, but from 5Kh/s to 15Kh/s honestly isn’t much to brag about. Let’s try with bfgminer:

bfgminer --scrypt -o host:port -u username -p password -d 0 --no-opencl-binaries -I 13

bfgminer with I

Similarly I get around 14Kh/s using the -I 13 parameter with bfgminer.

The next parameter to focus on is the -w parameter, which specifies a worksize in multiples of 64 up to 256. This parameter is what really makes a difference on my OS X rig:

cgminer --scrypt -o host:port -u username -p password -d 0 -I 13 -w 64

cgminer with I w

Now we’re cooking with Crisco! Using -I 13 -w 64 on my 5770 gets me from the original 5Kh/s up to 83Kh/s with cgminer. And with bfgminer?

bfgminer --scrypt -o host:port -u username -p password -d 0 --no-opencl-binaries -I 13 -w 64

bfgminer with I w

This gets us up to around 80Kh/s with bfgminer as well. By removing the -d parameter so that both GPUs are used, and cranking the -I parameter up to 16, I can get my GPUs up to around 230Kh/s:

cgminer --scrypt -o host:port -u username -p password -I 16 -w 64

cgminer Two GPUs
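Since the best -I and -w values vary from card to card, it can help to sweep a small grid and compare the hash rates each combination reports. This is only a sketch for generating candidate command lines to try by hand (the pool details are placeholders, and the helper is mine, not part of cgminer):

```python
def candidate_commands(host="host", port="port", user="username", pw="password"):
    """Yield cgminer command lines over a small -I / -w tuning grid."""
    base = "cgminer --scrypt -o {h}:{p} -u {u} -p {pw} -d 0 -I {i} -w {w}"
    for intensity in range(10, 17):       # scrypt wants a higher -I than SHA-256
        for worksize in (64, 128, 256):   # -w takes multiples of 64 up to 256
            yield base.format(h=host, p=port, u=user, pw=pw,
                              i=intensity, w=worksize)

for cmd in candidate_commands():
    print(cmd)
```

Run each candidate for a few minutes, note the Kh/s, and keep the winner; on my 5770 that process is how I landed on -I 13 -w 64.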

The last parameter I want to note is --thread-concurrency. If cgminer or bfgminer report hardware errors (the HW number in the output), this indicates that you need to set the --thread-concurrency parameter; these are not actual hardware failures. You can find guidelines for the value to use here:

57xx cards: 2368-4096 (3200 is common)
58xx cards: 4096-8192 (5600, 7168, and 8000 are common)
5970 cards: 4096-8192 (5632 or 8000 are common)

67xx cards: 2368-4096 (3200 is common)
68xx cards: 3008-6720 (4800 is common)
69xx cards: 4096-8192 (5600, 7168, and 8000 are common)
6990 cards: 4096-8192 (5632 or 8000 are common)

7xxx cards: 64 * bus-width-of-card-in-bits. So, for a 7950, that would be 64 * 384 = 24576. Ideal values are 21712 or 24000. Find your bus width here.

I verified that 3200 works well with a 5770 and 4800 works well with a 6870.
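For the 7xxx-series rule above, the arithmetic is simple enough to sketch (the function name is mine, not part of cgminer):

```python
def thread_concurrency_7xxx(bus_width_bits):
    """Suggested --thread-concurrency for 7xxx cards: 64 * bus width in bits."""
    return 64 * bus_width_bits

# A 7950 has a 384-bit memory bus:
print(thread_concurrency_7xxx(384))  # 24576
```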

Conclusion

And that about covers it for my series of articles on mining Bitcoins and Litecoins using Mac OS X and Apple hardware. The growing landscape of crypto-currency is fascinating from many perspectives. If your interest is piqued and you are running OS X, hopefully you now have the information you need in order to get started mining this new form of currency.