The simplest and cleanest way to access the SparkPost API from Ruby or from Rails.
Add this line to your application's Gemfile:
gem 'simple_spark'
And then execute:
$ bundle
Or install it yourself as:
$ gem install simple_spark
The official gem was somewhat lacking in functionality. With the demise of Mandrill it seemed SparkPost had decided to restart development on it, but they abandoned that as of 17th May 2016.
As we would have had to write wrappers around all the functions our app needed from SparkPost anyway, it seemed much easier to write the wrapper as a gem and allow others to use it too.
Breaking change: initialising the client is now done with a hash instead of ordered parameters, as there were getting to be too many after adding support for Subaccounts and user-specified headers.
First you need to ensure you are requiring the library
require 'simple_spark'
The simplest version of the client is to just provide your API key from SparkPost
simple_spark = SimpleSpark::Client.new(api_key: 'your_api_key')
You can also use ENV vars to configure the key; setting ENV['SPARKPOST_API_KEY'] will allow you to just use
simple_spark = SimpleSpark::Client.new
You can also override the other options if you need to in advanced scenarios. The full set of options is api_key, api_host, base_path, debug, subaccount_id, headers and logger, i.e.
simple_spark = SimpleSpark::Client.new(api_key: 'your_api_key', api_host: 'https://api.sparkpost.com', base_path: '/api/v1/', debug: false, subaccount_id: 'my_subaccount')
Setting debug to true will cause Excon to output full debug information to the log.
This will default to true if you are running under Rails and are in a development environment, otherwise it will default to false (setting other values to nil will cause them to use their defaults)
You can also pass a Logger into the client options to have SimpleSpark log there. By default Rails.logger will be used when running under Rails, and STDOUT will be used otherwise
simple_spark = SimpleSpark::Client.new(api_key: 'your_api_key', debug: true, logger: Rails.logger)
By setting subaccount_id on your client you are telling Simple Spark to use that subaccount for all calls made on this instance of the client.
Not all SparkPost calls support the Subaccount feature, and their API will throw an unauthorized error if you use a subaccount_id on an unsupported call. Depending on your code this may mean you need to instantiate two instances of the Simple Spark client, one for subaccount calls and one for everything else. This is a less than ideal solution, but given the rapid pace of SparkPost's API development it is the option that least requires Simple Spark to be updated as their API changes.
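For example, a minimal sketch of keeping two clients side by side (the API key and subaccount id here are illustrative):
# client scoped to a subaccount, for calls that support the Subaccount feature
subaccount_client = SimpleSpark::Client.new(api_key: 'your_api_key', subaccount_id: 'my_subaccount')
# unscoped client, for calls that do not accept a subaccount
master_client = SimpleSpark::Client.new(api_key: 'your_api_key')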
Should you have any need to override the headers that are sent by default, then you can specify headers as an option. The headers specified here will override any of the generated headers that the library creates. In normal operation there should be no reason to use this option, but it is provided for convenience and to allow for SparkPost updating their API in unexpected ways.
simple_spark = SimpleSpark::Client.new(api_key: 'your_api_key', headers: { 'NewSparkpostHeader' => 'hello'})
SimpleSpark wraps all the common errors from the SparkPost API
If the API takes too long to respond (times out in Excon) a GatewayTimeoutExceeded will be raised
Status 400 raises Exceptions::BadRequest
Status 404 raises Exceptions::NotFound
Status 422 raises Exceptions::UnprocessableEntity
Status 420/429 raises Exceptions::ThrottleLimitExceeded
Other response status codes raise Exceptions::UnprocessableEntity
In some cases it is possible to send faster than the API can (apparently) handle; in this case the SparkPost API returns a 504 status with an empty body, which SimpleSpark raises as Exceptions::GatewayTimeoutExceeded
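A minimal sketch of rescuing these errors, assuming the exception classes above live under the SimpleSpark::Exceptions namespace:
begin
  simple_spark.transmissions.create(properties)
rescue SimpleSpark::Exceptions::ThrottleLimitExceeded
  # rate limited (420/429) - back off and retry later
rescue SimpleSpark::Exceptions::GatewayTimeoutExceeded
  # Excon timed out, or the API returned an empty 504 body
rescue SimpleSpark::Exceptions::UnprocessableEntity => e
  # 422 or other unexpected status codes
  puts e.message
end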
Get your account details
simple_spark.account.retrieve
An optional argument can be specified as a comma-separated list; the only valid value is currently usage.
simple_spark.account.retrieve("usage")
see SparkPost API Documentation
properties = {
company_name: "SparkPost",
options: {
smtp_tracking_default: true,
rest_tracking_default: true,
transactional_unsub: true,
transactional_default: true
}
}
simple_spark.account.update(properties)
see SparkPost API Documentation
List the links to the available Metrics resources
simple_spark.metrics.discoverability_links
see SparkPost API Documentation
Summary of metrics
properties = {
from: '2013-04-20T07:12',
to: '2018-04-20T07:12',
metrics: 'count_accepted',
timezone: 'America/New_York'
}
simple_spark.metrics.deliverability_metrics_summary(properties)
see SparkPost API Documentation
Metrics grouped by Domain
properties = {
from: '2013-04-20T07:12',
to: '2018-04-20T07:12',
metrics: 'count_accepted',
timezone: 'America/New_York'
}
simple_spark.metrics.deliverability_metrics_by_domain(properties)
see SparkPost API Documentation
Metrics grouped by Sending Domain
properties = {
from: '2013-04-20T07:12',
to: '2018-04-20T07:12',
metrics: 'count_accepted',
timezone: 'America/New_York'
}
simple_spark.metrics.deliverability_metrics_by_sending_domain(properties)
see SparkPost API Documentation
Metrics grouped by Subaccount
properties = {
from: '2013-04-20T07:12',
to: '2018-04-20T07:12',
metrics: 'count_accepted',
timezone: 'America/New_York'
}
simple_spark.metrics.deliverability_metrics_by_subaccount(properties)
see SparkPost API Documentation
Metrics grouped by Campaign
properties = {
from: '2013-04-20T07:12',
to: '2018-04-20T07:12',
metrics: 'count_accepted',
timezone: 'America/New_York'
}
simple_spark.metrics.deliverability_metrics_by_campaign(properties)
see SparkPost API Documentation
Metrics grouped by Template
properties = {
from: '2013-04-20T07:12',
to: '2018-04-20T07:12',
metrics: 'count_accepted',
timezone: 'America/New_York'
}
simple_spark.metrics.deliverability_metrics_by_template(properties)
see SparkPost API Documentation
Metrics across a Time Series
properties = {
from: '2013-04-20T07:12',
to: '2018-04-20T07:12',
metrics: 'count_accepted',
timezone: 'America/New_York',
precision: 'day'
}
simple_spark.metrics.deliverability_time_series(properties)
Returns an array of metrics with time stamps:
[{ "count_targeted"=>2, "ts"=>"2011-06-01T00:00:00+00:00" }, { "count_targeted"=>3, "ts"=>"2011-06-02T00:00:00+00:00" }]
see SparkPost API Documentation
List all Transmissions
When messages are sent the Transmission will be deleted, so this will only return transmissions that are about to be sent or are scheduled for the future
simple_spark.transmissions.list
see SparkPost API Documentation
Create a new Transmission
properties = {
options: { open_tracking: true, click_tracking: true },
campaign_id: 'christmas_campaign',
return_path: 'bounces-christmas-campaign@sp.yourdomain.com',
metadata: {user_type: 'students'},
substitution_data: { sender: 'Big Store Team' },
recipients: [
{ address: { email: 'yourcustomer@theirdomain.com', name: 'Your Customer' },
tags: ['greeting', 'sales'],
metadata: { place: 'Earth' }, substitution_data: { address: '123 Their Road' } }
],
content:
{ from: { name: 'Your Name', email: 'you@yourdomain.com' },
subject: 'I am a test email',
reply_to: 'Sales <sales@yourdomain.com>',
headers: { 'X-Customer-CampaignID' => 'christmas_campaign' },
text: 'Hi from {{sender}} ... this is a test, and here is your address {{address}}',
html: '<p>Hi from {{sender}}</p><p>This is a test</p>'
}
# Or to use a template, change the content key to be:
# content: { template_id: 'first-template-id' }
}
simple_spark.transmissions.create(properties)
To send attachments, they need to be Base64 encoded
require 'base64'
# load your file contents first, then use Base64 to encode them
encoded_attachment = Base64.encode64('My file contents')
properties = {
  recipients: [{ address: { email: 'yourcustomer@theirdomain.com', name: 'Your Customer' } }],
  content:
    { from: { name: 'Your Name', email: 'you@yourdomain.com' },
      subject: 'I am a test email',
      html: '<p>Hi from {{sender}}</p><p>This is a test</p>',
      attachments: [{ name: "attachment.txt", type: "text/plain", data: encoded_attachment }]
    }
}
simple_spark.transmissions.create(properties)
Delete Transmissions by Campaign ID
simple_spark.transmissions.delete_campaign("white-christmas")
see SparkPost API Documentation
List all Subaccounts
simple_spark.subaccounts.list
see SparkPost API Documentation
Create a new Subaccount
properties = {
name: 'Sparkle Ponies', key_label: 'API Key for Sparkle Ponies Subaccount',
key_grants: ['smtp/inject', 'sending_domains/manage', 'message_events/view', 'suppression_lists/manage']
}
simple_spark.subaccounts.create(properties)
see SparkPost API Documentation
Retrieves a Subaccount by its id
simple_spark.subaccounts.retrieve(123)
see SparkPost API Documentation
Updates a Subaccount with new values
properties = { name: "new name" }
simple_spark.subaccounts.update(123, properties)
see SparkPost API Documentation
List an example of the event data that will be included in a response from the Message Events search endpoint
simple_spark.message_events.samples
To limit to just some events
simple_spark.message_events.samples('bounce')
see SparkPost API Documentation
Perform a filtered search for message event data. The response is sorted by descending timestamp. For full options you should consult the SparkPost API documentation
simple_spark.message_events.search(campaign_ids: 'christmas-campaign, summer-campaign')
see SparkPost API Documentation
List an example of the event data that will be included in a response from the Events search endpoint
simple_spark.events.samples
To limit to just some events
simple_spark.events.samples('bounce')
see SparkPost API Documentation
Perform a filtered search for event data. The response is sorted by descending timestamp. For full options you should consult the SparkPost API documentation
simple_spark.events.search(campaign_ids: 'christmas-campaign, summer-campaign')
see SparkPost API Documentation
List all Webhooks, optionally providing a timezone property
simple_spark.webhooks.list('America/New_York')
see SparkPost API Documentation
Create a new Webhook
simple_spark.webhooks.create(values)
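For illustration, a sketch of what the values hash might look like (field names follow the SparkPost Webhooks API; the name, target URL and events shown here are only examples):
values = {
  name: 'Example Webhook',
  target: 'https://example.com/sparkpost-webhook',
  auth_type: 'none',
  events: ['delivery', 'bounce']
}
simple_spark.webhooks.create(values)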
see SparkPost API Documentation
Retrieves a Webhook
simple_spark.webhooks.retrieve(webhook_id)
see SparkPost API Documentation
Updates a Webhook with new values
properties = { "name" => "New name" }
simple_spark.webhooks.update(webhook_id, properties)
see SparkPost API Documentation
Validates a Webhook by sending an example message event batch from the Webhooks API to the target URL
simple_spark.webhooks.validate(webhook_id)
see SparkPost API Documentation
Retrieve the Batch Status Information for a Webhook
simple_spark.webhooks.batch_status(webhook_id)
see SparkPost API Documentation
List an example of the event data that will be sent from a webhook
simple_spark.webhooks.samples
To limit to just some events
simple_spark.webhooks.samples('bounce')
see SparkPost API Documentation
List all Sending Domains
simple_spark.sending_domains.list
see SparkPost API Documentation
Create a new Sending Domain
simple_spark.sending_domains.create({domain: 'mail.mydomain.com'})
see SparkPost API Documentation
Retrieves a Sending Domain by its domain name
simple_spark.sending_domains.retrieve('mail.mydomain.com')
see SparkPost API Documentation
Updates a Sending Domain with new values
properties = { "tracking_domain" => "new.tracking.domain" }
simple_spark.sending_domains.update('mail.mydomain.com', properties)
see SparkPost API Documentation
Forces verification of a Sending Domain.
Including the fields "dkim_verify" and/or "spf_verify" in the request initiates a check against the associated DNS record type for the specified sending domain.
Including the fields "postmaster_at_verify" and/or "abuse_at_verify" in the request results in an email sent to the specified sending domain's postmaster@ and/or abuse@ mailbox, where a verification link can be clicked.
Including the fields "postmaster_at_token" and/or "abuse_at_token" in the request initiates a check of the provided token(s) against the stored token(s) for the specified sending domain.
properties = { "dkim_verify": true, "spf_verify": true }
simple_spark.sending_domains.verify('mail.mydomain.com', properties)
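Similarly, a sketch of triggering the mailbox-based verification described above (using the field names from that description):
properties = { "postmaster_at_verify": true, "abuse_at_verify": true }
simple_spark.sending_domains.verify('mail.mydomain.com', properties)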
see SparkPost API Documentation
Deletes a Sending Domain permanently
simple_spark.sending_domains.delete('mail.mydomain.com')
see SparkPost API Documentation
List all Inbound Domains
simple_spark.inbound_domains.list
see SparkPost API Documentation
Create a new Inbound Domain
simple_spark.inbound_domains.create('mail.mydomain.com')
see SparkPost API Documentation
Retrieves an Inbound Domain by its domain name
simple_spark.inbound_domains.retrieve('mail.mydomain.com')
see SparkPost API Documentation
Deletes an Inbound Domain permanently
simple_spark.inbound_domains.delete('mail.mydomain.com')
see SparkPost API Documentation
Find suppression list entries
params = {
from: '2013-04-20T07:12',
to: '2018-04-20T07:12'
}
simple_spark.suppression_list.search(params)
see SparkPost API Documentation
Bulk update suppression list entries
recipients = [
{
recipient: "rcpt_1@example.com",
type: "transactional",
description: "User requested to not receive any transactional emails."
},
{
recipient: "rcpt_2@example.com",
type: "non_transactional"
}
]
simple_spark.suppression_list.create_or_update(recipients)
see SparkPost API Documentation
Retrieve a suppression list entry for a recipient email address
simple_spark.suppression_list.retrieve("rcpt_1@example.com")
see SparkPost API Documentation
Delete a suppression list entry for a recipient email address
simple_spark.suppression_list.delete("rcpt_1@example.com")
see SparkPost API Documentation
List all Relay Webhooks
simple_spark.relay_webhooks.list
see SparkPost API Documentation
Create a new Relay Webhook
properties = {
name: "Replies Webhook",
target: "https://webhooks.customer.example/replies",
auth_token: "",
match: {
protocol: "SMTP",
domain: "email.example.com"
}
}
simple_spark.relay_webhooks.create(properties)
see SparkPost API Documentation
Retrieves a Relay Webhook by its id
simple_spark.relay_webhooks.retrieve(id)
see SparkPost API Documentation
Updates a Relay Webhook with new values
properties = { name: "Replies Webhook" }
simple_spark.relay_webhooks.update(id, properties)
see SparkPost API Documentation
Deletes a Relay Webhook permanently
simple_spark.relay_webhooks.delete(id)
see SparkPost API Documentation
List all templates
simple_spark.templates.list
see SparkPost API Documentation
Create a new Template
properties = { "name" => "Summer Sale!",
"content"=> { "from" => "marketing@yourdomain.com",
"subject"=> "Summer deals",
"html"=> "<b>Check out these deals!</b>"
}
}
simple_spark.templates.create(properties)
see SparkPost API Documentation
Retrieves a Template by its ID
draft = nil
simple_spark.templates.retrieve(yourtemplateid, draft)
see SparkPost API Documentation
Updates a Template with new values
properties = { "name" => "Sorry, the Winter Sale!" }
update_published = false
simple_spark.templates.update(yourtemplateid, properties, update_published)
see SparkPost API Documentation
Merges the template with the Substitution data and returns the result
properties = { substitution_data: { name: 'Mr test User' } }
draft = nil
simple_spark.templates.preview(yourtemplateid, properties, draft)
see SparkPost API Documentation
Deletes a template permanently
simple_spark.templates.delete(yourtemplateid)
see SparkPost API Documentation
List all recipient lists
simple_spark.recipient_lists.list
see SparkPost API Documentation
Create a new Recipient list
properties = { "name" => "Small List",
"recipients"=> [
{
"address" => {
"email" => "somemail@somedomain.com"
}
}
]
}
num_rcpt_errors = 1
simple_spark.recipient_lists.create(properties, num_rcpt_errors)
see SparkPost API Documentation
Retrieves a Recipient list by its ID
show_recipients = true
simple_spark.recipient_lists.retrieve(your_list_id, show_recipients)
see SparkPost API Documentation
Updates a Recipient list with new values
properties = { "name" => "New List Name" }
simple_spark.recipient_lists.update(your_list_id, properties)
see SparkPost API Documentation
Deletes a Recipient list permanently
simple_spark.recipient_lists.delete(your_list_id)
see SparkPost API Documentation
1.0.12 Fix param name on suppression list #27
- Add Recipients List Endpoint
- Add Events Endpoint
- Exceptions now return SparkPost results object too
- Add Account API Endpoint
- Add Delete Campaign
- Bug fixes
- Add Suppression List Endpoint
Add sparkpost error code into exception message to allow more specific error handling
Suppress Excon warning for using :debug parameter
As SparkPost have now stopped development on their own gem, and have recommended this one as being a better alternative, bumping version to 1.0.0 - the code has been running in production for a while now and seems stable and near feature complete.
Adding status error code to message as SparkPost uses a wide range of status codes
Check :progname before assigning
- long day ... bug in 504 exception syntax came back with merge
- Bug in 504 exception syntax
- Breaking change: 204 responses now return an empty hash to simplify consuming code
- Added logging, if debug is set then SimpleSpark will log its options and calls in addition to Excon.
- Improved exception handling
- Added Time Series to Metrics
- Fixed accidental bug
- Subaccounts endpoint added
- Metrics main endpoints added
- Merged pull request to fix Rails development check for debug
- Breaking change: client parameters are now a hash of options instead of ordered params
- Added Subaccount support to client
- Added Headers support to client
Passing tests are encouraged going forwards, and generally code should follow most of the standard rules that Rubocop checks for.
- Fork it ( https://github.com/leadmachineapp/simple_spark/fork )
- Create your feature branch (git checkout -b my-new-feature)
- Commit your changes (git commit -am 'Add some feature')
- Push to the branch (git push origin my-new-feature)
- Create a new Pull Request