Tweet a video or audio clip, trimmed from a longer media file, with an optional text status.
If the input is an audio file, it creates an animated waveform video.
If captions are provided, for either video or audio, they are burned onto the clip.
Originally developed as part of textAV 2018, for "Full Fact - tweet that clip" by Pietro & James.
Part of the textAV reusable components Trello board.
Subsequently integrated into autoEdit.io, to enable tweeting audio or video quotes.
- node v10.0.0
- npm 6.1.0
- Get Twitter credentials.
- Create a .env file and add it to the root of the app.
- See "Authentication - Access Tokens" in the Twitter docs to get credentials.
```
TWITTER_CONSUMER_KEY=""
TWITTER_CONSUMER_SECRET=""
TWITTER_CALLBACK=""
```

and user credentials, to post on user timelines:

```
TWITTER_ACCESS_TOKEN=""
TWITTER_ACCESS_TOKEN_SECRET=""
```
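If your app doesn't already load environment variables from a file, a minimal sketch using the dotenv package (dotenv is an assumption here, not a dependency of this module):

```js
// Hypothetical: load the .env variables into process.env with dotenv,
// in case your app doesn't already do so. The Twitter credentials are read
// from the environment (or from an explicit credentials option, see below).
require('dotenv').config();

console.log(Boolean(process.env.TWITTER_CONSUMER_KEY)); // true once the key is loaded
```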
Install the npm module tweet-that-clip:

```
npm install tweet-that-clip
```

Require and use it in your code:
```js
const path = require('path');
const tweetThatClip = require('tweet-that-clip');
const ffmpeg = require('ffmpeg-static-electron');

const opts = {
  inputFile: path.join(__dirname, './assets/test.mp4'),
  mediaType: 'video', // 'audio' or 'video'
  outputFile: path.join(__dirname, '/example/test-clipped.mp4'),
  inputSeconds: 10, // in seconds
  durationSeconds: 20, // in seconds. Up to 2 min duration
  // Twitter text status, 280 character limit.
  tweetText: 'The Trussell Trust found that food bank use increased by 52% in a year in areas where Universal Credit has been rolled out. The National Audit Office observed similar findings https://fullfact.org/economy/universal-credit-driving-people-food-banks/',
  // tmp directory for creating intermediate clips when processing media
  tmpDir: path.join(__dirname, '/assets'),
  // Optional path to an ffmpeg binary. To burn captions it needs an ffmpeg built with libass.
  // If not provided, it uses the system ffmpeg, if present.
  // If in doubt you can give the path to https://www.npmjs.com/package/ffmpeg-static-electron
  ffmpegBin: ffmpeg.path,
  // Optional caption file - if burning captions provide an srtFilePath.
  srtFilePath: path.join(__dirname, './assets/captions.srt')
};

tweetThatClip(opts)
  .then((res) => {
    console.log('in example-usage for video', res.outputFile);
    // console.log(res.resTwitter);
  })
  .catch((error) => {
    console.log('Error in example-usage for video', error);
  });
```
Also see the ./example-usage-video.js and ./example-usage-audio.js files.
As seen in the example above, you need to provide an ffmpeg binary, e.g. ffmpeg-static or ffmpeg-static-electron.
Especially when using the option to burn captions, you need to provide an ffmpeg built with --enable-libass. The two binaries mentioned above have been tested and work.
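For example, a minimal sketch swapping in ffmpeg-static instead of ffmpeg-static-electron (note that recent versions of ffmpeg-static export the binary path directly, rather than exposing a .path attribute):

```js
// Sketch using ffmpeg-static instead of ffmpeg-static-electron.
// Recent ffmpeg-static versions export the binary path string directly;
// older versions exposed it as require('ffmpeg-static').path.
const ffmpegPath = require('ffmpeg-static');

const opts = {
  // ... same options as in the example above
  ffmpegBin: ffmpegPath
};
```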
For some use cases, such as electron, you might want to pass in an optional credentials object attribute, see the example below.
```js
const opts = {
  ...
  // optional credentials
  credentials: {
    consumerKey: "",
    consumerSecret: "",
    accessToken: "",
    accessTokenSecret: ""
  }
};
```
If you provide the path to a caption file for the selection you want to trim, it is used to burn captions onto the clip.
Note that timecodes and text need to be relative to the selection only, as if the selection started at 00:00:00,000.
```js
const opts = {
  ...
  // Optional caption file - if burning captions provide an srtFilePath.
  srtFilePath: path.join(__dirname, './assets/captions.srt')
};
```
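For example, even if the selection starts several minutes into the original file, the captions.srt timecodes still start from zero. A hypothetical captions.srt for a selection like the tweet above:

```
1
00:00:00,500 --> 00:00:04,000
The Trussell Trust found that food bank use

2
00:00:04,000 --> 00:00:08,000
increased by 52% in a year
```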
- At a high level it uses fluent-ffmpeg to trim the clip and convert it to Twitter video specs (see the sketch after this list):
  - duration between 0.5 seconds and 30 seconds (sync) / 140 seconds (async)
  - file size not exceeding 15 MB (sync) / 512 MB (async)
- For the Twitter video upload and status post it uses a script by @jcipriano, refactored into a module.
- It creates a tmp clipped/trimmed file and deletes it once the tweet is sent.
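A minimal sketch of what that trim step looks like with fluent-ffmpeg directly (not this module's internals, just the underlying library call):

```js
const ffmpeg = require('fluent-ffmpeg');

// Trim a 20 second clip starting at second 10 and write it out as an mp4.
ffmpeg('./assets/test.mp4')
  .setStartTime(10) // same role as inputSeconds above
  .setDuration(20)  // same role as durationSeconds above
  .output('./assets/test-clipped.mp4')
  .on('end', () => console.log('clip ready'))
  .on('error', (err) => console.error(err))
  .run();
```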
In more detail, the main index.js pulls in the modules from lib (see the sketch after this list):
- Trim video
- If audio, create waveform
- Burn captions (optional, if an srt is provided)
- Tweet clip
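A simplified, hypothetical sketch of that flow; the require paths and function names below are assumptions for illustration, not the module's actual lib layout:

```js
// Hypothetical orchestration mirroring the steps listed above.
// The require paths are assumptions, not the real lib/ file names.
const trimClip = require('./lib/trim-clip');
const makeWaveform = require('./lib/make-waveform');
const burnCaptions = require('./lib/burn-captions');
const tweetClip = require('./lib/tweet-clip');

async function run(opts) {
  let clip = await trimClip(opts);                                        // 1. trim the selection
  if (opts.mediaType === 'audio') clip = await makeWaveform(clip, opts);  // 2. audio only: waveform video
  if (opts.srtFilePath) clip = await burnCaptions(clip, opts);            // 3. optional: burn captions
  return tweetClip(clip, opts);                                           // 4. upload media and post status
}
```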
No build step.
No tests for now, just the ./example-usage-video.js and ./example-usage-audio.js files.
No deployment step; as a node module it is available on npm as tweet-that-clip.