How to export newsletters from SendFox
How to move newsletters from SendFox to Ghost? What do you mean there's no newsletter export?
I had a client who was moving his newsletter from SendFox to Ghost Pro. He wanted all of his old newsletters posted in Ghost, and that's when we discovered that SendFox has no export functionality for newsletters. (Subscribers were easy to export, but he wanted the old newsletters on his new site, too.)
Here's my solution: it uses node-fetch to grab the page content and cheerio for some parsing help, then creates the newsletters as posts in Ghost via the Ghost Admin API.
import fetch from 'node-fetch';
import * as cheerio from 'cheerio';
import GhostAdminAPI from '@tryghost/admin-api';
// This script uses the SendFox account owner's credentials to retrieve all previous newsletters and convert them to Ghost posts.
// Note that it does NOT make new copies of any images - it leaves them hosted on SendFox.
// If someone would like to commission that work, please let me know!
// Update the five lines below.
let ghostAPI = "PUT YOUR GHOST API URL PROVIDED ON THE INTEGRATIONS PAGE";
let ghostKey = "PUT YOUR ADMIN API KEY FROM THE GHOST INTEGRATIONS PAGE HERE.";
let baseURL = "https://sendfox.com/dashboard/emails?page=";
let foxCookie = "LOG IN TO SENDFOX AND USE DEV TOOLS TO GET THE VALUE OF THE sendfox_session COOKIE";
let numPages = 20; // check the number of newsletter pages visible at https://sendfox.com/dashboard/emails and adjust to match.
// See the grabOneEmail function below for possible changes to post tagging and some clean-up. That may need to be customized for your specific newsletter.
const api = new GhostAdminAPI({
    url: ghostAPI,
    version: "v5.0",
    key: ghostKey
});
let parseThese = []
let options = {
    method: "GET",
    headers: {
        cookie: `sendfox_session=${foxCookie}`
    }
}
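// Fetch one page of the SendFox emails dashboard and collect the link to each newsletter it lists.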
async function grabOnePage(baseURL, pageNo) {
    let data = await fetch(baseURL + pageNo, options)
    let text = await data.text()
    const $ = cheerio.load(text);
    $('table tr > th.align-middle.pl-0.py-4.font-weight-normal > a').each((i, el) => parseThese.push($(el).attr('href')))
}
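// Fetch a single newsletter's JSON from SendFox, tidy the HTML, and publish it as a Ghost post.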
async function grabOneEmail(URL) {
    let data = await fetch(URL, options)
    let json = await data.json()
    // prevent lots of extra line breaks.
    let html = json['campaign']['html'].replaceAll('<p><br></p>', '')
    // My client's posts all start this way. Adjust to suit your purposes.
    html = html.replaceAll('<p>Hey {{contact.first_name}},</p>', '')
    let title = json['campaign']['subject']
    let sent_at = '2020-01-01T12:00:00Z'
    if (json['campaign']['sent_at']) {
        sent_at = json['campaign']['sent_at'].split(' ')[0] + 'T00:00:00Z'
        // if you want to change the tag(s) for the import, do so below.
        api.posts
            .add(
                { title: title, published_at: sent_at, tags: ['Newsletter'], status: 'published', html: html },
                { source: 'html' }
            )
            .then(response => console.log('done'))
            .catch(error => console.error(error));
    } else {
        console.log('no date for ', title, ", which probably means it isn't published")
    }
}
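// Walk through every dashboard page, then import each newsletter link that was collected.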
for (let i = 1; i <= numPages; i++) {
    console.log(i)
    await grabOnePage(baseURL, i)
}
for (let one of parseThese) {
    let oneArray = one.split('/')
    await grabOneEmail('https://sendfox.com/emails/' + oneArray[5])
}
Slightly sketchy directions for anyone who needs them:
- You need node installed. You need npm installed. How to do that is beyond the scope of this page, but googling "node and npm install on YourOperatingSystem" should get you lots of tutorials!
- Save the code above as getposts.js (or whatever you like) in a folder somewhere. Open it in a text editor and fill in the required values at the top of the file.
- Make a file called package.json and put it in that folder. See below for contents.
- In the directory you made (using whatever shell tool you have), run "npm install". Wait while npm finds and installs everything required.
- Run "node getposts.js". Wait. Check the console for any errors.
Hey, before you go... If your finances allow you to keep this tea-drinking ghost and the freelancer behind her supplied with our hot beverage of choice, we'd both appreciate it!