Friday, July 24, 2015

Using the SQL DELETE function in ExactTarget Queries

The Salesforce Marketing Cloud (aka ExactTarget) offers you the ability to write your own SQL statements to pull together data from multiple Data Extensions (tables) into a single table.  It's really handy and quite powerful, letting you write complicated JOINs. (It's also great for writing small queries against a Data Extension to update that same Data Extension - that is, you don't have to do it all in one massive SELECT - instead, you can write lots of little ones, each drawing from a different table if you need to.)

However, everything you want to do has to be done with a SELECT statement.  Depending on where you target the output (the same Data Extension or a different one) and the write method (Append, Overwrite, Update), you can approximate UPDATE and INSERT, but there's no native DELETE functionality.
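To make those write methods concrete, here's a sketch (the Data Extension name "subscribers" and its columns are invented for illustration).  A query that targets its own source DE with the Update write method approximates an UPDATE statement, matching rows on the DE's primary key:

select SubscriberKey, LOWER(EmailAddress) as EmailAddress from subscribers

(target: "subscribers", write method: Update).  The same kind of query pointed at a different target with Append approximates an INSERT.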

That's probably a way to protect us from ourselves - DELETE is a pretty dangerous command.

You could always overwrite the data with something you don't want or something that would be ignored by your AmpScript or the mailing engine, but what if you truly want to delete some content within your table?

Here's my quick trick.

Scenario (completely made-up for the purposes of this post):

I have a number of different emails that I want to send my subscribers.

  • Balance Transfer Confirmations
  • Loan Approvals
  • Loan Rejections
  • Statements Available Online

They don't need to be instantaneous (they're not a password reset), and usually there will be a batch. The new records could come from an automated process (some other system drops records on the FTP server) or a manual one (a person uploads files to the FTP server).

The business could create new message types (maybe "non-sufficient funds" or something) at any time.  They may have an audience ready to go at the same time they first inform me about the new message.  They need to be able to tell me that they've uploaded a new audience and provide me the design of the email.  (They can't be expected to wait for a new process with new instructions on how to upload and keep track of lots of different methods of uploading.)

I don't want to use an API because the sends don't need to be instantaneous, and because I want more visibility and prefer the reporting of automated sends over triggered sends.

How to Manage:

I need a pool or queue where new recipients can wait (with their data) until sent.  Once sent, they need to be removed from the queue so that the queue remains lean and there's no risk of double-mailing - it should only contain people not-yet-mailed. (This will also show me if a new message type has been introduced that they didn't tell me about or if a particular message is failing.)

So I have a Data Extension with "Message ID", "Subscriber Key", "Data1", "Data2", "Data3", etc. An automation pours in new records as it receives them on the FTP site (whether placed there by an external process or by a person).
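Since Query Activities run against a SQL Server back end, that queue DE behaves roughly like the table below.  This is only a sketch - Data Extensions are actually defined through the UI or API, not DDL, and the field lengths here are assumptions:

create table queue (
    MessageID varchar(50),
    SubscriberKey varchar(254),
    Data1 varchar(255),
    Data2 varchar(255),
    Data3 varchar(255)
)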

Process:

So essentially, I need:

(1) Find new records for a message type in the queue.
(2) Mail those records.
(3) Remove them from the queue.

How?

(1a) Select matching records by message ID
(1b) and copy them into a message-specific table as an overwrite (the table is blanked out so only new records appear in it)
(2) Mail that table (with optional suppression from 3b)
(3a) Remove them from the queue.
(3b) Optionally add them to a suppression (if it's a message they can only receive once ever) or to a historic log (so that I can see what I sent when for compliance purposes)
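Steps 1 and 3b are plain SELECTs.  Here's a sketch using invented target names ("send_balancexferconf" and "sent_log"). For step 1a/1b, copy the new records out of the queue (target "send_balancexferconf", write method Overwrite):

select MessageID, SubscriberKey, Data1, Data2, Data3 from queue where messageID = 'balancexferconf'

For step 3b, log what was sent (target "sent_log", write method Append):

select MessageID, SubscriberKey, GETDATE() as SentDate from send_balancexferconf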

Everything about this is super-easy except 3a: Remove them from the queue.

Actually, that's easy as well.

Create the Effect of a DELETE function using only SELECT

You'll do this using two tables: copy the records you want to keep from table A to table B, and then copy table B back to table A.
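In ordinary T-SQL, that two-step copy accomplishes the same thing as this single statement, which Query Activities won't accept:

delete from queue where messageID = 'balancexferconf'

Because only SELECT is allowed, the steps below rebuild the queue without the unwanted rows instead of deleting them directly.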

Step 1: Create a new data extension.  It should be based on your existing data extension.  If your original DE is "queue", then call it "queue_transit".

Step 2: Create a new query called "queue_reduce_balancexferconf".  It should simply be
select * from queue where messageID<>'balancexferconf'
(where 'balancexferconf' is your messageID)
and it should be set to "OVERWRITE" and target "queue_transit"

Step 3: Create a new query called "queue_update".  It should simply be
select * from queue_transit
and it should be set to OVERWRITE and target "queue"

Step 4: Create a program (or add to your existing program) that runs
- query: queue_reduce_balancexferconf
- query: queue_update

And that's it.  These will run really quickly because your pool will remain small.

You will need a new query for step 2 for each message, but it beats re-creating the intake process (file locations, trigger, import, data extension, documentation, training, meetings, etc.) for each and every message that the business wants to quickly add.

