The bulk email test: An (expected) “fail”

A few weeks ago, I reported here about my decision to develop a tool/system to handle larger-volume email distribution outside the rules of conventional third-party email management services such as Constant Contact and Mailchimp. The commercial services have rightfully set rather stringent permission rules for email communications. They don’t want to be blocked by anti-spam resources, or associated with spam emails.

However, we anticipated some circumstances where we might wish to use specialized email lists outside the rules of these services to send (in our opinion) relevant news information rather than commercial advertising. The goal: to see if we could reach out and develop some relationships.

After some effort, I discovered a service provider who would set up the system on a different server from our general corporate email and business accounts. The provider included (as part of the deal) access to some truly giant, and totally “spammy”, lists of supposed opportunity seekers and the like. I passed on using these lists, as they are obviously not relevant to our specialized readership.

Then the provider said he had built his own list for construction-related businesses. I could purchase (not rent) this list for $75.00. I decided to “bite” and purchased the 6,000-name list.

There are several rules regarding email marketing. One is that you should test carefully before launching, identifying statistically relevant variables that you can test further. I didn’t bother with these formalities, instead setting up a simple and direct “broadcast” using the entire list. (With the data I have now, I don’t think it would have mattered.)

So, a few hours ago, I sent out the email — promoting this blog and my book. It isn’t a work of art, but the links seemed to work on the trial email, and I wanted to see what would happen.

The results:

Bounce, bounce, and more bounce. I cannot tell how many emails were actually delivered, and how many actually bounced, because the bounce measurement tool within the email software doesn’t seem to have been set up properly (my fault for not testing that feature beforehand). However, I know the number of non-delivered returns is far greater than anything I’ve experienced on a regular list.

Of course, it is outside business hours as I write this, but the results so far are still revealing. Only two individuals have opened any of the links associated with the email, and eight have “unsubscribed”. The latter number is low, perhaps, for the size and type of the mailing, but I wonder if that is simply because very few emails actually reached their destination.

Based on what I see here, I won’t use this list again.

I’m reminded, directly, of spam’s costs and harmful impact. Spammers, of course, don’t care about what is right and wrong; they will broadcast material that is fraudulent, unethical and often outright criminal (such as schemes to infect your computer with trojans, or to gather and steal your personal banking information). Extremely low response rates don’t matter to them, as they aren’t paying for the bandwidth, so they flood the Internet with more and more of their crap. Anti-spam measures and the controls imposed by mailing list services have reduced much activity that could be interpreted as spam in some ways but might be justified in others (such as my experiment here). The bad guys don’t care, though. They’ll spam as much as they want, when they want, and the experience of setting up the test I’ve just run took me to the edge of the dark side where these schemes operate.

Nevertheless, I can’t now see why any legitimate business would play around with this sort of thing. The reward doesn’t justify the risk and effort required to work around the rules within business norms, and when we stretch further, we get into uncomfortable areas where I think few of us wish to go. Lessons learned.