Compare commits


48 Commits

Author SHA1 Message Date
Ed Robbins
7f1543a0f3 Add ability to enable/disable Azure audio logging via azureOptions (#1432) 2025-11-30 11:56:56 -05:00
Hoan Luu Huu
83955ba972 SoundHound support audio endpoint from speech credential (#1446)
* SoundHound support audio endpoint from speech credential

* add requestInfo and sampleRate to houndify channel variable

* add requestInfo and sampleRate to houndify channel variable

* wip

* wip

* wip

* wip

* wip

* wip

* wip
2025-11-30 11:55:20 -05:00
Hoan Luu Huu
a5fa5fce5b Fixed transcribe with 2 legs cannot fall back (#1451)
* fixed transcribe cannot fall back for a specific endpoint

* wip

* wip

* wip

* wip

* wip

* wip

* wip

* wip
2025-11-28 21:43:05 -05:00
Dave Horton
cc1751f500 fix race condition where gather resolves with speech transcript but t… (#1449)
* fix race condition where gather resolves with speech transcript but timeout timer gets set after the resolve and is left running after gather completes

* remove unneeded line of code
2025-11-27 11:44:49 -06:00
Ed Robbins
1a1f53aede Compare sdp to determine if transcoding is being used. (#1444)
* compare sdp for transcoding

* refactor sdp check for leading codec

* fix reference to epOther

* minor changes

* minor

* fix #1447

* fix security issue

* use convenience getter appIsUsingWebsockets in CallSession

---------

Co-authored-by: Dave Horton <daveh@beachdognet.com>
2025-11-24 10:50:41 -06:00
Hoan Luu Huu
1984b6d3ea allow say verb to fail as NonFatalTaskError for File Not Found (#1443)
* allow say verb to fail as NonFatalTaskError for File Not Found

* wip
2025-11-20 07:22:28 -05:00
Hoan Luu Huu
769b66f57e fixed playbackIds are not in the correct order compared with say.text array (#1439)
* fixed playbackIds are not in the correct order compared with say.text array

* wip

* wip
2025-11-19 19:00:44 -05:00
Hoan Luu Huu
98b845f489 fix say verb does not close streaming when finish say (#1412)
* fix say verb does not close streaming when finish say

* wip

* wip

* ttsStreamingBuffer reset eventHandlerCount after remove listeners

* only send tokens to module if connected

* wip

* sent stream_open when successfully connected to vendor
2025-11-17 08:56:09 -05:00
Ed Robbins
f92b1dbc97 Add ability to override certain tts streaming options via the config … (#1429)
* Add ability to override certain tts streaming options via the config verb.

* Update to null operator(??), support parameter override via config
2025-11-12 13:54:01 -05:00
Dave Horton
0442144793 fix bug escaping backspace character 2025-11-03 15:33:59 -05:00
Hoan Luu Huu
2de24af169 fixed gather does not start timeout on barge-in (#1421)
* fixed gather does not start timeout on barge-in

* with previous change, no need to emit playDone since nowhere in the code are we listening for it

---------

Co-authored-by: Dave Horton <daveh@beachdognet.com>
2025-11-03 13:11:59 -05:00
Dave Horton
a884880321 fix for #1422 (#1423)
* fix for #1422

* fix prev commit
2025-11-03 12:53:43 -05:00
Dave Horton
b307df79d0 update deps (#1417) 2025-10-31 07:31:32 -04:00
Hoan Luu Huu
77bd11dd47 update speech util 0.2.26 (#1416) 2025-10-31 07:14:38 -04:00
Hoan Luu Huu
46d56fe546 fd_1574: should not send only whitespace to streaming tts engine (#1415) 2025-10-30 20:59:25 -04:00
Hoan Luu Huu
30ab281ea2 support disableTtsCache from config verb (#1410) 2025-10-28 08:19:03 -04:00
Sam Machin
0869a73052 add distributeDtmf to conference (#1401)
* add distributeDtmf to conference

* lint

* bump verb specs
2025-10-21 11:20:12 -04:00
Sam Machin
a0a579ccee escape json special chars in metadata (#1399) 2025-10-20 10:30:03 -04:00
Sam Machin
4218653852 add customerData on transferred calls (#1391)
* add customerData on transferred calls

* change to if statement
2025-10-20 09:20:12 -04:00
Hoan Luu Huu
89cc39f726 support gladia stt (#1397)
* support gladia stt

* wip

* wip

* update verb specification
2025-10-20 04:56:39 -04:00
Sam Machin
b231593bff bump dbhelpers for cache change (#1396) 2025-10-15 11:38:43 -04:00
Sam Machin
4309d25376 don't encode querystring if its the filename (#1395)
* don't encode querystring if its the filename

* lint

* update link to issue

u
2025-10-14 10:48:50 -04:00
Hoan Luu Huu
a00703a067 support houndify stt (#1364)
* support houndify stt

* wip

* wip

* wip

* update houndify stt parameters

* wip

* wip
2025-10-14 00:55:21 -04:00
Hoan Luu Huu
89c985b564 fixed does not send final status callback if call is canceled quickly (#1393)
* fixed callsession should cleanup resource if call was canceled while fetching app

* wip

* wip

* wip

* wip

* wip
2025-10-11 03:44:42 -04:00
Dave Horton
b4ed4c8c46 #1385: Gather - dont start the continuous asr timer when we first start listening if this is a background gather (#1386) 2025-10-09 08:47:51 -04:00
Hoan Luu Huu
581d309f36 support elevenlabs different endpoint (#1387)
* support elevenlabs different endpoint

* wip

* wip
2025-10-09 08:19:40 -04:00
Sam Machin
d1baf2fe37 if call is transferred from another FS then always answer (#1383)
Currently, if the call being transferred was originally an outbound call, the direction retrieved from redis is outbound and the INVITE for the REFER from the other FS is never answered.
However, a transferredCall will always need to be answered regardless of CallDirection
2025-10-07 07:19:11 -04:00
Dave Horton
28bf0d3477 send eager_eot events (#1382) 2025-10-06 16:50:20 -04:00
Dave Horton
d2d3b4583e Fix/flux cleanup (#1379)
* for deepgram flux include the turn taking events in the transcription payload

* for deepgram flux, including turn_taking_event in the speech payload

* fix prev commit which used wrong field
2025-10-04 20:06:38 -04:00
Hoan Luu Huu
854c26db11 support deepgramflux (#1373)
* support deepgramflux

* wip

* wip

* wip

* wip

* update verb specification
2025-10-03 10:38:39 -04:00
Dave Horton
e77666a1a7 update speech-utils and drachtio-srf (#1377) 2025-10-03 10:06:37 -04:00
Ed Robbins
5acb19225b If an error occurs during initial TTS request, propagate the error (#1369)
* If an error occurs during initial TTS request, propagate the error

* fix missing semicolon

* fix jslint error

* add null check to r.playbackMilliseconds
2025-10-01 00:02:51 -04:00
Dave Horton
1d6f84c2d7 add event handler for when deepgram closes with an error (#1372) 2025-09-28 14:18:56 -04:00
Vinod Dharashive
de9b970a93 Update example-voicemail-greetings.json (#1367)
Add JP greeting
2025-09-23 09:30:40 -04:00
rammohan-y
ec786ef1dd Fix for sending synthesized-audio verb:status event when using TTS streaming (#1366)
https://github.com/jambonz/jambonz-feature-server/issues/1365
2025-09-23 09:30:05 -04:00
Sam Machin
a95a6d1683 clear main timeout when interdigit timeout is started (#1351)
* clear main timeout when interdigit timeout is started,

also clear the asrTimeout when dtmf has taken priority

* lint

* lint
2025-09-12 09:01:51 -04:00
Dave Horton
65b3066866 catch exceptions from req.cancel() (#1359)
* catch exceptions from req.cancel()

* catch other instances of req.cancel

* fix prev commit
2025-09-11 12:25:36 -04:00
Hoan Luu Huu
057f52e56c speech_util v0.2.23 (#1358) 2025-09-11 01:30:31 -04:00
Hoan Luu Huu
b46be57eba singleDialer should create ConfirmCallSession with correct tmpFiles (#1357) 2025-09-10 22:37:41 -04:00
Hoan Luu Huu
f950d19d1c fix ConfirmCallSession in placeCall does not have access to tmpFiles for removing tmp file later (#1356) 2025-09-10 19:39:43 -04:00
pk32495
859132bb1c Fixed token missing log line. (#1354) 2025-09-10 15:03:56 -04:00
Dave Horton
acaadceaa2 fix exception when receiving REFER but dial task ended (#1353) 2025-09-10 12:23:32 -04:00
Hoan Luu Huu
add8d63e8e tts stream should not print speech credential (#1352) 2025-09-09 19:18:49 -04:00
Sam Machin
a05b72a420 Fix/1345 (#1349)
* don't try and guess carrier if LCR is set

* lint

* update dbHelpers dep
2025-09-06 14:15:16 -04:00
rammohan-y
28ff85225f Fixed issue for punctuation (#1344)
https://github.com/jambonz/jambonz-feature-server/issues/1343
2025-09-03 13:33:38 -04:00
Dave Horton
f2fe7c4d24 Fix/playback race by fs generates playback (#1331)
* update to speech-utils that generates playback id

* modify tts and say task to track current playback id and match against start and stop events

* bump speech utils

* wip

* wip

* fix race condition where say with playbackId gets stop event from previous play from cache file

* logging

* wip

* fix comparison when playing cached files

* logging
2025-08-26 09:39:25 -04:00
Dave Horton
97408c7d3b fix uncaught exception with llm streaming 2025-08-22 13:24:59 -04:00
Dave Horton
db5f0a0dce Feat/startup logging (#1333)
* turn down some logging

* add startup logging

* wip

* lint
2025-08-21 14:09:20 -04:00
30 changed files with 3379 additions and 3342 deletions

app.js
View File

@@ -29,6 +29,12 @@ const {LifeCycleEvents, FS_UUID_SET_NAME, SystemState, FEATURE_SERVER} = require
const installSrfLocals = require('./lib/utils/install-srf-locals');
const createHttpListener = require('./lib/utils/http-listener');
const healthCheck = require('@jambonz/http-health-check');
const ProcessMonitor = require('./lib/utils/process-monitor');
const monitor = new ProcessMonitor(logger);
// Log startup
monitor.logStartup();
monitor.setupSignalHandlers();
logger.on('level-change', (lvl, _val, prevLvl, _prevVal, instance) => {
if (logger !== instance) {

View File

@@ -174,5 +174,61 @@
"non è raggiungibile",
"lascia pure un messaggio",
"puoi lasciare un messaggio"
],
"ja-JP": [
"この通話は留守番電話に転送されました",
"発信先は現在電話に出ることができません",
"発信音の後でメッセージを録音してください",
"録音を完了したら電話を切ることができます",
"只今電話に出ることができません",
"ただ今電話に出ることができません",
"ただいま電話に出ることができません",
"ピーという発信音の後にお名前とご用件をお話しください",
"ファックスを送られる方はスタートボタンを押してください",
"FAXを送られる方はスタートボタンを押してください",
"おかけになった電話をお呼びしましたが",
"お出になりません",
"おでになりません",
"お掛けになった電話番号は",
"おかけになった電話番号は",
"お掛けになった電話は",
"おかけになった電話は",
"現在使われておりません",
"番号をお確かめになって",
"お掛け直し下さい",
"おかけ直し下さい",
"おかけ直しください",
"こちらはNTTドコモです",
"こちらはエーユーです",
"こちらはソフトバンクです",
"電波の届かない",
"電源が入っていない",
"掛かりません",
"かかりません",
"お繋ぎすることが出来ません",
"お繋ぎ出来ません",
"お繋ぎすることができません",
"お繋ぎできません",
"おつなぎすることができません",
"おつなぎできません",
"メッセージを録音",
"留守番電話",
"お留守番サービス",
"留守番",
"留守電",
"留守",
"接続します",
"合図の音",
"ピーと",
"発信音",
"ご用件",
"伝言",
"お話しください",
"ファックス",
"FAX",
"終了",
"終了しました",
"終了いたしました",
"営業時間"
]
}

View File

@@ -147,7 +147,7 @@ router.post('/',
// find handling sbc sip for called user
if (JAMBONES_DIAL_SBC_FOR_REGISTERED_USER && target.type === 'user') {
const { registrar } = srf.locals.dbHelpers;
const { registrar} = srf.locals.dbHelpers;
const reg = await registrar.query(target.name);
if (reg) {
sbcAddress = selectHostPort(logger, reg.sbcAddress, 'tcp')[1];
@@ -159,7 +159,9 @@ router.post('/',
* trunk isn't specified,
* check if from-number matches any existing numbers on Jambonz
* */
if (target.type === 'phone' && !target.trunk) {
const { lookupLcrByAccount} = srf.locals.dbHelpers;
const lcrs = await lookupLcrByAccount(req.body.account_sid);
if (target.type === 'phone' && !target.trunk && lcrs.length == 0) {
const str = restDial.from || '';
const callingNumber = str.startsWith('+') ? str.substring(1) : str;
const voip_carrier_sid = await lookupCarrierByPhoneNumber(req.body.account_sid, callingNumber);

View File

@@ -112,6 +112,14 @@ module.exports = function(srf, logger) {
req.locals.callingNumber = sipURIs[1];
}
}
// Feature server INVITE request pipelines taking time to finish,
// while connecting and fetch application from db and invoking webhook.
// call can be canceled without any handling, so we add a listener here
req.once('cancel', (sipMsg) => {
logger.info(`${callId} got CANCEL request`);
req.locals.canceled = true;
});
next();
}
@@ -362,13 +370,14 @@ module.exports = function(srf, logger) {
});
// if transferred call contains callInfo, let update original data to newly created callInfo in this instance.
if (app.transferredCall && app.callInfo) {
const {direction, callerName, from, to, originatingSipIp, originatingSipTrunkName} = app.callInfo;
const {direction, callerName, from, to, originatingSipIp, originatingSipTrunkName, customerData} = app.callInfo;
req.locals.callInfo.direction = direction;
req.locals.callInfo.callerName = callerName;
req.locals.callInfo.from = from;
req.locals.callInfo.to = to;
req.locals.callInfo.originatingSipIp = originatingSipIp;
req.locals.callInfo.originatingSipTrunkName = originatingSipTrunkName;
if (customerData) req.locals.callInfo.customerData = customerData;
delete app.callInfo;
}
next();

View File

@@ -658,6 +658,15 @@ class CallSession extends Emitter {
}
}
// disableTtsCache
get disableTtsCache() {
return this._disableTtsCache || false;
}
set disableTtsCache(d) {
this._disableTtsCache = d;
}
getTsStreamingVendor() {
let v;
if (this.currentTask?.isStreamingTts) {
@@ -710,7 +719,7 @@ class CallSession extends Emitter {
}
hasGlobalSttPunctuation() {
get hasGlobalSttPunctuation() {
return this._globalSttPunctuation !== undefined;
}
@@ -941,6 +950,14 @@ class CallSession extends Emitter {
this.ttsStreamingBuffer?.start();
}
stopTtsStream() {
if (this.appIsUsingWebsockets) {
this.requestor?.request('tts:streaming-event', '/streaming-event', {event_type: 'stream_closed'})
.catch((err) => this.logger.info({err}, 'CallSession:clearTtsStream - Error sending user_interruption'));
this.ttsStreamingBuffer?.stop();
}
}
async enableBotMode(gather, autoEnable) {
try {
let task;
@@ -964,7 +981,7 @@ class CallSession extends Emitter {
task.sticky = autoEnable;
// listen to the bargein-done from background manager
this.backgroundTaskManager.on('bargeIn-done', () => {
if (this.requestor instanceof WsRequestor) {
if (this.appIsUsingWebsockets) {
try {
this.kill(true);
} catch (err) {}
@@ -1017,8 +1034,6 @@ class CallSession extends Emitter {
(type === 'tts' && credential.use_for_tts) ||
(type === 'stt' && credential.use_for_stt)
)) {
this.logger.debug(
`${type}: ${credential.vendor} ${credential.label ? `, label: ${credential.label}` : ''} `);
if ('google' === vendor) {
if (type === 'tts' && !credential.tts_tested_ok ||
type === 'stt' && !credential.stt_tested_ok) {
@@ -1088,6 +1103,13 @@ class CallSession extends Emitter {
deepgram_stt_use_tls: credential.deepgram_stt_use_tls
};
}
else if ('gladia' === vendor) {
return {
speech_credential_sid: credential.speech_credential_sid,
api_key: credential.api_key,
region: credential.region,
};
}
else if ('soniox' === vendor) {
return {
speech_credential_sid: credential.speech_credential_sid,
@@ -1119,6 +1141,7 @@ class CallSession extends Emitter {
return {
api_key: credential.api_key,
model_id: credential.model_id,
api_uri: credential.api_uri,
options: credential.options
};
}
@@ -1167,7 +1190,16 @@ class CallSession extends Emitter {
service_version: credential.service_version
};
}
else if ('deepgramriver' === vendor) {
else if ('houndify' === vendor) {
return {
speech_credential_sid: credential.speech_credential_sid,
client_id: credential.client_id,
client_key: credential.client_key,
user_id: credential.user_id,
houndify_server_uri: credential.houndify_server_uri
};
}
else if ('deepgramflux' === vendor) {
return {
speech_credential_sid: credential.speech_credential_sid,
api_key: credential.api_key,
@@ -1249,6 +1281,7 @@ class CallSession extends Emitter {
this.ttsStreamingBuffer.on(TtsStreamingEvents.Pause, this._onTtsStreamingPause.bind(this));
this.ttsStreamingBuffer.on(TtsStreamingEvents.Resume, this._onTtsStreamingResume.bind(this));
this.ttsStreamingBuffer.on(TtsStreamingEvents.ConnectFailure, this._onTtsStreamingConnectFailure.bind(this));
this.ttsStreamingBuffer.on(TtsStreamingEvents.Connected, this._onTtsStreamingConnected.bind(this));
}
else {
this.logger.info(`CallSession:exec - not a normal call session: ${this.constructor.name}`);
@@ -1307,7 +1340,7 @@ class CallSession extends Emitter {
}
if (0 === this.tasks.length &&
this.requestor instanceof WsRequestor &&
this.appIsUsingWebsockets &&
!this.requestor.closedGracefully &&
!this.callGone &&
!this.isConfirmCallSession
@@ -1417,7 +1450,11 @@ class CallSession extends Emitter {
}
else {
if (this.req && !this.dlg) {
this.req.cancel();
try {
this.req.cancel();
} catch (err) {
this.logger.error({err}, 'CallSession:_lccCallStatus error cancelling request');
}
this._callReleased();
}
}
@@ -1864,7 +1901,7 @@ Duration=${duration} `
return;
}
else if (tokens === undefined) {
this.logger.info({opts}, 'CallSession:_lccTtsTokens - invalid command since id is missing');
this.logger.info({opts}, 'CallSession:_lccTtsTokens - invalid command since tokens is missing');
return this.requestor.request('tts:tokens-result', '/tokens-result', {
id,
status: 'failed',
@@ -2397,7 +2434,7 @@ Duration=${duration} `
this.logger.debug(`endpoint was destroyed!! ${this.ep.uuid}`);
});
if (this.direction === CallDirection.Inbound) {
if (this.direction === CallDirection.Inbound || this.application?.transferredCall) {
if (task.earlyMedia && !this.req.finalResponseSent) {
this.res.send(183, {body: ep.local.sdp});
return {ep};
@@ -2529,7 +2566,7 @@ Duration=${duration} `
this.backgroundTaskManager.stopAll();
this.clearOrRestoreActionHookDelayProcessor().catch((err) => {});
this.ttsStreamingBuffer?.stop();
this.stopTtsStream();
this.sttLatencyCalculator?.stop();
}
@@ -2683,7 +2720,7 @@ Duration=${duration} `
*/
_onRefer(req, res) {
const task = this.currentTask;
const sd = task.sd;
const sd = task?.sd;
if (task && TaskName.Dial === task.name && sd && task.referHook) {
task.handleRefer(this, req, res);
}
@@ -2989,14 +3026,14 @@ Duration=${duration} `
*/
_notifyTaskError(obj) {
if (this.requestor instanceof WsRequestor) {
if (this.appIsUsingWebsockets) {
this.requestor.request('jambonz:error', '/error', obj)
.catch((err) => this.logger.debug({err}, 'CallSession:_notifyTaskError - Error sending'));
}
}
_notifyTaskStatus(task, evt) {
if (this.notifyEvents && this.requestor instanceof WsRequestor) {
if (this.notifyEvents && this.appIsUsingWebsockets) {
const obj = {...evt, id: task.id, name: task.name};
this.requestor.request('verb:status', '/status', obj)
.catch((err) => this.logger.debug({err}, 'CallSession:_notifyTaskStatus - Error sending'));
@@ -3048,7 +3085,7 @@ Duration=${duration} `
}
_clearTasks(backgroundGather, evt) {
if (this.requestor instanceof WsRequestor && !backgroundGather.cleared) {
if (this.appIsUsingWebsockets && !backgroundGather.cleared) {
this.logger.debug({evt}, 'CallSession:_clearTasks on event from background gather');
try {
backgroundGather.cleared = true;
@@ -3076,13 +3113,18 @@ Duration=${duration} `
}
}
_onTtsStreamingConnected() {
this.requestor?.request('tts:streaming-event', '/streaming-event', {event_type: 'stream_open'})
.catch((err) => this.logger.info({err}, 'CallSession:_onTtsStreamingConnected - Error sending'));
}
_onTtsStreamingEmpty() {
const task = this.currentTask;
if (task && TaskName.Say === task.name) {
task.notifyTtsStreamIsEmpty();
} else if (
// If Gather nested say task is streaming
TaskName.Gather === task.name && task.sayTask && task.sayTask.isStreamingTts) {
task && TaskName.Gather === task.name && task.sayTask && task.sayTask.isStreamingTts) {
const sayTask = task.sayTask;
sayTask.notifyTtsStreamIsEmpty();
}

View File

@@ -22,6 +22,12 @@ class InboundCallSession extends CallSession {
this.req = req;
this.res = res;
// if the call was canceled before we got here, handle it
if (this.req.locals.canceled) {
req.locals.logger.info('InboundCallSession: constructor - call was already canceled');
this._onCancel();
}
req.once('cancel', this._onCancel.bind(this));
this.on('callStatusChange', this._notifyCallStatusChange.bind(this));

View File

@@ -49,7 +49,8 @@ class Conference extends Task {
this.confName = this.data.name;
[
'beep', 'startConferenceOnEnter', 'endConferenceOnExit', 'joinMuted',
'maxParticipants', 'waitHook', 'statusHook', 'endHook', 'enterHook', 'endConferenceDuration'
'maxParticipants', 'waitHook', 'statusHook', 'endHook', 'enterHook',
'endConferenceDuration', 'distributeDtmf'
].forEach((attr) => this[attr] = this.data[attr]);
this.record = this.data.record || {};
this.statusEvents = [];
@@ -356,6 +357,7 @@ class Conference extends Task {
//https://developer.signalwire.com/freeswitch/FreeSWITCH-Explained/Modules/mod_conference_3965534/
// mute | Enter conference muted
...((this.joinMuted || this.speakOnlyTo) && {mute: true}),
...(this.distributeDtmf && {'dist-dtmf': true})
}});
/**

View File

@@ -18,7 +18,8 @@ class TaskConfig extends Task {
'boostAudioSignal',
'vad',
'ttsStream',
'autoStreamTts'
'autoStreamTts',
'disableTtsCache'
].forEach((k) => this[k] = this.data[k] || {});
if ('notifyEvents' in this.data) {
@@ -88,6 +89,7 @@ class TaskConfig extends Task {
get hasReferHook() { return Object.keys(this.data).includes('referHook'); }
get hasNotifySttLatency() { return Object.keys(this.data).includes('notifySttLatency'); }
get hasTtsStream() { return Object.keys(this.ttsStream).length; }
get hasDisableTtsCache() { return Object.keys(this.data).includes('disableTtsCache'); }
get summary() {
const phrase = [];
@@ -125,6 +127,7 @@ class TaskConfig extends Task {
phrase.push(`${this.ttsStream.enable ? 'enable' : 'disable'} ttsStream`);
}
if ('autoStreamTts' in this.data) phrase.push(`enable Say.stream value ${this.data.autoStreamTts ? 'on' : 'off'}`);
if (this.hasDisableTtsCache) phrase.push(`disableTtsCache ${this.data.disableTtsCache ? 'on' : 'off'}`);
return `${this.name}{${phrase.join(',')}}`;
}
@@ -357,6 +360,11 @@ class TaskConfig extends Task {
this.logger.info('Config: disabling ttsStream');
cs.disableTtsStream();
}
if (this.hasDisableTtsCache) {
this.logger.info(`set disableTtsCache = ${this.disableTtsCache}`);
cs.disableTtsCache = this.data.disableTtsCache;
}
}
async kill(cs) {

View File

@@ -21,7 +21,7 @@ const {parseUri} = require('drachtio-srf');
const {ANCHOR_MEDIA_ALWAYS,
JAMBONZ_DIAL_PAI_HEADER,
JAMBONES_DIAL_SBC_FOR_REGISTERED_USER} = require('../config');
const { isOnhold, isOpusFirst } = require('../utils/sdp-utils');
const { isOnhold, isOpusFirst, getLeadingCodec } = require('../utils/sdp-utils');
const { normalizeJambones } = require('@jambonz/verb-specifications');
const { selectHostPort } = require('../utils/network');
const { sleepFor } = require('../utils/helpers');
@@ -158,6 +158,7 @@ class TaskDial extends Task {
get canReleaseMedia() {
const keepAnchor = this.data.anchorMedia ||
this.weAreTranscoding ||
this.cs.isBackGroundListen ||
this.cs.onHoldMusic ||
ANCHOR_MEDIA_ALWAYS ||
@@ -641,7 +642,9 @@ class TaskDial extends Task {
* trunk isn't specified,
* check if number matches any existing numbers
* */
if (t.type === 'phone' && !t.trunk) {
const { lookupLcrByAccount} = srf.locals.dbHelpers;
const lcrs = await lookupLcrByAccount(cs.accountSid);
if (t.type === 'phone' && !t.trunk && lcrs.length == 0) {
const str = this.callerId || req.callingNumber || '';
const callingNumber = str.startsWith('+') ? str.substring(1) : str;
const voip_carrier_sid = await lookupCarrierByPhoneNumber(cs.accountSid, callingNumber);
@@ -674,7 +677,8 @@ class TaskDial extends Task {
rootSpan: cs.rootSpan,
startSpan: this.startSpan.bind(this),
dialTask: this,
onHoldMusic: this.cs.onHoldMusic
onHoldMusic: this.cs.onHoldMusic,
tmpFiles: this.cs.tmpFiles,
});
this.dials.set(sd.callSid, sd);
@@ -775,6 +779,9 @@ class TaskDial extends Task {
this.epOther.api('uuid_break', this.epOther.uuid);
this.epOther.bridge(sd.ep);
}
else {
this.logger.error('Dial:_connectSingleDial - no other endpoint to bridge!');
}
this.bridged = true;
}
@@ -923,7 +930,13 @@ class TaskDial extends Task {
this.logger.info({err}, 'Dial:_selectSingleDial - Error boosting audio signal');
}
}
/* basic determination to see if call is being transcoded */
const codecA = getLeadingCodec(this.epOther.local.sdp);
const codecB = getLeadingCodec(this.ep.remote.sdp);
this.weAreTranscoding = (codecA !== codecB);
if (this.weAreTranscoding) {
this.logger.info(`Dial:_selectSingleDial - transcoding from ${codecA} (A leg) to ${codecB} (B leg)`);
}
/* if we can release the media back to the SBC, do so now */
if (this.canReleaseMedia || this.shouldExitMediaPathEntirely) {
setTimeout(this._releaseMedia.bind(this, cs, sd, this.shouldExitMediaPathEntirely), 200);
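
A hedged reading of the transcoding check above: getLeadingCodec (imported from sdp-utils) is presumed to return the name of the first negotiated codec in each leg's m=audio line, so for hypothetical SDPs:

// getLeadingCodec(aLegLocalSdp)  => 'PCMU'
// getLeadingCodec(bLegRemoteSdp) => 'OPUS'
// codecA !== codecB, so weAreTranscoding = true; since weAreTranscoding feeds
// keepAnchor in canReleaseMedia, media stays anchored on the feature server.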

View File

@@ -5,13 +5,15 @@ const {
AwsTranscriptionEvents,
AzureTranscriptionEvents,
DeepgramTranscriptionEvents,
GladiaTranscriptionEvents,
SonioxTranscriptionEvents,
CobaltTranscriptionEvents,
IbmTranscriptionEvents,
NvidiaTranscriptionEvents,
JambonzTranscriptionEvents,
AssemblyAiTranscriptionEvents,
DeepgramRiverTranscriptionEvents,
HoundifyTranscriptionEvents,
DeepgramfluxTranscriptionEvents,
VoxistTranscriptionEvents,
CartesiaTranscriptionEvents,
OpenAITranscriptionEvents,
@@ -93,6 +95,8 @@ class TaskGather extends SttTask {
get needsStt() { return this.input.includes('speech'); }
get isBackgroundGather() { return this.bugname_prefix === 'background_bargeIn_'; }
get wantsSingleUtterance() {
return this.data.recognizer?.singleUtterance === true;
}
@@ -227,7 +231,9 @@ class TaskGather extends SttTask {
const startListening = async(cs, ep) => {
this._startTimer();
if (this.isContinuousAsr && 0 === this.timeout) this._startAsrTimer();
if (this.isContinuousAsr && 0 === this.timeout && !this.isBackgroundGather) {
this._startAsrTimer();
}
if (this.input.includes('speech') && !this.listenDuringPrompt) {
try {
await this._setSpeechHandlers(cs, ep);
@@ -252,7 +258,7 @@ class TaskGather extends SttTask {
startDtmfListener();
}
this._stopVad();
if (!this.killed) {
if (!this.killed && !this.resolved) {
startListening(cs, ep);
if (this.input.includes('speech') && this.vendor === 'nuance' && this.listenDuringPrompt) {
this.logger.debug('Gather:exec - starting transcription timers after say completes');
@@ -264,19 +270,21 @@ class TaskGather extends SttTask {
};
this.sayTask.span = span;
this.sayTask.ctx = ctx;
this.sayTask.exec(cs, {ep}) // kicked off, _not_ waiting for it to complete
this.sayTask
.exec(cs, {ep}) // kicked off, _not_ waiting for it to complete
.then(() => {
if (this.sayTask.isStreamingTts) return;
this.logger.debug('Gather:exec - nested say task completed');
span.end();
process();
return;
})
.catch((err) => {
process();
});
if (this.sayTask.isStreamingTts && !this.sayTask.closeOnStreamEmpty) {
// if streaming tts, we do not wait for it to complete if it is not closing the stream automatically
process();
} else {
this.sayTask.on('playDone', (err) => {
span.end();
if (err) this.logger.error({err}, 'Gather:exec Error playing tts');
process();
});
}
}
else if (this.playTask) {
@@ -288,7 +296,7 @@ class TaskGather extends SttTask {
startDtmfListener();
}
this._stopVad();
if (!this.killed) {
if (!this.killed && !this.resolved) {
startListening(cs, ep);
if (this.input.includes('speech') && this.vendor === 'nuance' && this.listenDuringPrompt) {
this.logger.debug('Gather:exec - starting transcription timers after play completes');
@@ -300,15 +308,17 @@ class TaskGather extends SttTask {
};
this.playTask.span = span;
this.playTask.ctx = ctx;
this.playTask.exec(cs, {ep}) // kicked off, _not_ waiting for it to complete
this.playTask
.exec(cs, {ep}) // kicked off, _not_ waiting for it to complete
.then(() => {
this.logger.debug('Gather:exec - nested play task completed');
span.end();
process();
return;
})
.catch((err) => {
process();
});
this.playTask.on('playDone', (err) => {
span.end();
if (err) this.logger.error({err}, 'Gather:exec Error playing url');
process();
});
}
else {
if (this.killed) {
@@ -395,6 +405,7 @@ class TaskGather extends SttTask {
if (this.digitBuffer.length === 0 && this.needsStt) {
// DTMF is higher priority than STT.
this.removeCustomEventListeners();
this._clearAsrTimer(); //clear ASR timer as we're now using dtmf
this._stopTranscribing(ep);
}
this.digitBuffer += evt.dtmf;
@@ -409,6 +420,7 @@ class TaskGather extends SttTask {
const ms = this.interDigitTimeout * 1000;
this.logger.debug(`starting interdigit timer of ${ms}`);
this.interDigitTimer = setTimeout(() => this._resolve('dtmf-interdigit-timeout'), ms);
this._clearTimer(); //clear main timer as we're now using interdigit dtmf timer
}
}
@@ -466,16 +478,28 @@ class TaskGather extends SttTask {
this.addCustomEventListener(ep, DeepgramTranscriptionEvents.Connect, this._onVendorConnect.bind(this, cs, ep));
this.addCustomEventListener(ep, DeepgramTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep));
this.addCustomEventListener(ep, DeepgramTranscriptionEvents.Error, this._onVendorError.bind(this, cs, ep));
break;
case 'deepgramriver':
this.bugname = `${this.bugname_prefix}deepgramriver_transcribe`;
case 'deepgramflux':
this.bugname = `${this.bugname_prefix}deepgramflux_transcribe`;
this.addCustomEventListener(
ep, DeepgramRiverTranscriptionEvents.Transcription, this._onTranscription.bind(this, cs, ep));
ep, DeepgramfluxTranscriptionEvents.Transcription, this._onTranscription.bind(this, cs, ep));
this.addCustomEventListener(
ep, DeepgramRiverTranscriptionEvents.Connect, this._onVendorConnect.bind(this, cs, ep));
this.addCustomEventListener(ep, DeepgramRiverTranscriptionEvents.ConnectFailure,
ep, DeepgramfluxTranscriptionEvents.Connect, this._onVendorConnect.bind(this, cs, ep));
this.addCustomEventListener(ep, DeepgramfluxTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep));
this.addCustomEventListener(ep, DeepgramfluxTranscriptionEvents.Error, this._onVendorError.bind(this, cs, ep));
break;
case 'gladia':
this.bugname = `${this.bugname_prefix}gladia_transcribe`;
this.addCustomEventListener(
ep, GladiaTranscriptionEvents.Transcription, this._onTranscription.bind(this, cs, ep));
this.addCustomEventListener(ep, GladiaTranscriptionEvents.Connect, this._onVendorConnect.bind(this, cs, ep));
this.addCustomEventListener(ep, GladiaTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep));
this.addCustomEventListener(ep, GladiaTranscriptionEvents.Error, this._onVendorError.bind(this, cs, ep));
break;
case 'soniox':
@@ -555,6 +579,18 @@ class TaskGather extends SttTask {
this._onVendorConnectFailure.bind(this, cs, ep));
break;
case 'houndify':
this.bugname = `${this.bugname_prefix}houndify_transcribe`;
this.addCustomEventListener(ep, HoundifyTranscriptionEvents.Transcription,
this._onTranscription.bind(this, cs, ep));
this.addCustomEventListener(ep, HoundifyTranscriptionEvents.Error,
this._onVendorError.bind(this, cs, ep));
this.addCustomEventListener(ep, HoundifyTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep));
this.addCustomEventListener(ep, HoundifyTranscriptionEvents.Connect,
this._onVendorConnect.bind(this, cs, ep));
break;
case 'voxist':
this.bugname = `${this.bugname_prefix}voxist_transcribe`;
this.addCustomEventListener(ep, VoxistTranscriptionEvents.Transcription,
@@ -702,8 +738,8 @@ class TaskGather extends SttTask {
this.logger.debug(`Starting timoutTimer of ${this.timeout}ms`);
this._clearTimer();
this._timeoutTimer = setTimeout(() => {
// If continuousASR in use then extend by the asr window for more transcripts.
if (this.isContinuousAsr) this._startAsrTimer();
if (this.interDigitTimer) return; // let the inter-digit timer complete
else {
this._resolve(this.digitBuffer.length >= this.minDigits ? 'dtmf-num-digits' : 'timeout');
}
@@ -850,12 +886,10 @@ class TaskGather extends SttTask {
return;
}
if (this.sayTask && !this.sayTask.killed) {
this.sayTask.removeAllListeners('playDone');
this.sayTask.kill(cs);
this.sayTask = null;
}
if (this.playTask && !this.playTask.killed) {
this.playTask.removeAllListeners('playDone');
this.playTask.kill(cs);
this.playTask = null;
}
@@ -910,7 +944,7 @@ class TaskGather extends SttTask {
evt = this.normalizeTranscription(evt, this.vendor, 1, this.language,
this.shortUtterance, this.data.recognizer.punctuation);
//this.logger.debug({evt, bugname, finished, vendor: this.vendor}, 'Gather:_onTranscription normalized transcript');
this.logger.debug({evt, bugname, finished, vendor: this.vendor}, 'Gather:_onTranscription normalized transcript');
if (evt.alternatives.length === 0) {
this.logger.info({evt}, 'TaskGather:_onTranscription - got empty transcript, continue listening');
@@ -1076,6 +1110,11 @@ class TaskGather extends SttTask {
this.cs.requestor.request('verb:hook', this.partialResultHook, Object.assign({speech: evt},
this.cs.callInfo, httpHeaders));
}
else if (this.vendor === 'deepgramflux' &&
['EagerEndOfTurn', 'TurnResumed'].includes(evt.vendor.evt?.event)) {
this.logger.debug(`Gather:_onTranscription - deepgramflux event detected: ${evt.event}`);
this.performAction({speech: evt, reason: 'speechDetected'}, false);
}
if (this.vendor === 'soniox') {
if (evt.vendor.finalWords.length) {
this.logger.debug({evt}, 'TaskGather:_onTranscription - buffering soniox transcript');
@@ -1122,7 +1161,7 @@ class TaskGather extends SttTask {
}
async _startFallback(cs, ep, evt) {
if (this.canFallback) {
if (this.canFallback()) {
this._stopTranscribing(ep);
try {
this.logger.debug('gather:_startFallback');

View File

@@ -5,6 +5,17 @@ const moment = require('moment');
const MAX_PLAY_AUDIO_QUEUE_SIZE = 10;
const DTMF_SPAN_NAME = 'dtmf';
function escapeString(str) {
return str
.replace(/\\/g, '\\\\') // Escape backslashes
.replace(/"/g, '\\"') // Escape double quotes
.replace(/[\b]/g, '\\b') // Escape backspace (NOTE: [\b] not \b)
.replace(/\f/g, '\\f') // Escape formfeed
.replace(/\n/g, '\\n') // Escape newlines
.replace(/\r/g, '\\r') // Escape carriage returns
.replace(/\t/g, '\\t'); // Escape tabs
}
class TaskListen extends Task {
constructor(logger, opts, parentTask) {
super(logger, opts);
@@ -16,10 +27,21 @@ class TaskListen extends Task {
this.preconditions = TaskPreconditions.Endpoint;
[
'action', 'auth', 'method', 'url', 'finishOnKey', 'maxLength', 'metadata', 'mixType', 'passDtmf', 'playBeep',
'action', 'auth', 'method', 'url', 'finishOnKey', 'maxLength', 'mixType', 'passDtmf', 'playBeep',
'sampleRate', 'timeout', 'transcribe', 'wsAuth', 'disableBidirectionalAudio', 'channel'
].forEach((k) => this[k] = this.data[k]);
//Escape JSON special characters in metadata
if (this.data.metadata) {
this.metadata = {};
for (const key in this.data.metadata) {
if (this.data.metadata.hasOwnProperty(key)) {
const value = this.data.metadata[key];
this.metadata[key] = typeof value === 'string' ? escapeString(value) : value;
}
}
}
this.mixType = this.mixType || 'mono';
this.sampleRate = this.sampleRate || 8000;
this.earlyMedia = this.data.earlyMedia === true;
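
An illustrative pass through the escapeString helper above, with a hypothetical metadata value containing a double quote, a literal newline and a tab:

// input value:   line1<newline>says "hi"<tab>end
// escaped value: line1\nsays \"hi\"\tend
// (special characters become two-character escape sequences so the value can be
//  embedded safely when the metadata is serialized to JSON)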

View File

@@ -218,7 +218,7 @@ class TaskLlmUltravox_S2S extends Task {
async _onServerEvent(_ep, evt) {
let endConversation = false;
const type = evt.type;
this.logger.debug({evt}, 'TaskLlmUltravox_S2S:_onServerEvent');
//this.logger.debug({evt}, 'TaskLlmUltravox_S2S:_onServerEvent');
/* server errors of some sort */
if (type === 'error') {

View File

@@ -6,9 +6,21 @@ class TaskPlay extends Task {
super(logger, opts);
this.preconditions = TaskPreconditions.Endpoint;
this.url = this.data.url.includes('?')
? this.data.url.split('?')[0] + '?' + this.data.url.split('?')[1].replaceAll('.', '%2E')
: this.data.url;
//Cleanup URLs that contain a querystring with a . unless that querystring is the filename
// see https://github.com/jambonz/jambonz-feature-server/pull/1293
// and https://github.com/jambonz/jambonz-feature-server/issues/1394 for background
if (this.data.url.includes('?')) {
if (['.mp3', '.wav'].includes(this.data.url.slice(-4))) {
this.url = this.data.url;
}
else {
this.url = this.data.url.split('?')[0] + '?' + this.data.url.split('?')[1].replaceAll('.', '%2E');
}
}
else {
this.url = this.data.url;
}
this.seekOffset = this.data.seekOffset || -1;
this.timeoutSecs = this.data.timeoutSecs || -1;
this.loop = this.data.loop || 1;
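
To illustrate the querystring handling above (URLs are hypothetical):

// querystring is the filename (URL ends in .mp3/.wav), so it is left untouched:
//   https://example.com/prompt?file=greeting.mp3  -> unchanged
// any other querystring has its dots encoded, per #1293 and #1394:
//   https://example.com/prompt?v=1.5&name=x.y     -> https://example.com/prompt?v=1%2E5&name=x%2Ey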

View File

@@ -19,6 +19,7 @@ class TaskRestDial extends Task {
this.timeout = this.data.timeout || 60;
this.sipRequestWithinDialogHook = this.data.sipRequestWithinDialogHook;
this.referHook = this.data.referHook;
this.recentCallStatus = 0;
this.on('connect', this._onConnect.bind(this));
this.on('callStatus', this._onCallStatus.bind(this));
@@ -57,7 +58,11 @@ class TaskRestDial extends Task {
this._clearCallTimer();
if (this.canCancel) {
this.canCancel = false;
cs?.req?.cancel();
try {
cs?.req?.cancel();
} catch (err) {
this.logger.error({err}, 'TaskRestDial: error cancelling call');
}
}
this.notifyTaskDone();
}
@@ -118,7 +123,8 @@ class TaskRestDial extends Task {
}
_onCallStatus(status) {
this.logger.debug(`CallStatus: ${status}`);
this.logger.debug(`RestDial CallStatus: ${status}`);
this.recentCallStatus = status;
if (status >= 200) {
this.canCancel = false;
this._clearCallTimer();
@@ -136,11 +142,16 @@ class TaskRestDial extends Task {
}
_onCallTimeout() {
this.logger.debug('TaskRestDial: timeout expired without answer, killing task');
this.logger.debug(`TaskRestDial: timeout expired without answer, last status ${this.recentCallStatus}`);
this.timer = null;
if (this.canCancel) {
if (this.canCancel && this.recentCallStatus < 200) {
this.logger.debug('TaskRestDial: cancelling call attempt');
this.canCancel = false;
this.cs?.req?.cancel();
try {
this.cs?.req?.cancel();
} catch (err) {
this.logger.error({err}, 'TaskRestDial: error cancelling call');
}
}
}

View File

@@ -2,8 +2,31 @@ const assert = require('assert');
const TtsTask = require('./tts-task');
const {TaskName, TaskPreconditions} = require('../utils/constants');
const pollySSMLSplit = require('polly-ssml-split');
const { SpeechCredentialError } = require('../utils/error');
const { SpeechCredentialError, NonFatalTaskError } = require('../utils/error');
const { sleepFor } = require('../utils/helpers');
const { NON_FANTAL_ERRORS } = require('../utils/constants.json');
/**
* Discard unmatching responses:
* (1) I sent a playback id but get a response with a different playback id
* (2) I sent a playback id but get a response with no playback id
* (3) I did not send a playback id but get a response with a playback id
* (4) I sent a cache file but get a response with a different cache file
*/
const isMatchingEvent = (logger, filename, playbackId, evt) => {
if (!!playbackId && !!evt.variable_tts_playback_id && evt.variable_tts_playback_id === playbackId) {
//logger.debug({filename, playbackId, evt}, 'Say:isMatchingEvent - playbackId matched');
return true;
}
if (!!filename && !!evt.file && evt.file === filename) {
//logger.debug({filename, playbackId, evt}, 'Say:isMatchingEvent - filename matched');
return true;
}
logger.info({filename, playbackId, evt}, 'Say:isMatchingEvent - no match');
return false;
};
const breakLengthyTextIfNeeded = (logger, text) => {
// As The text can be used for tts streaming, we need to break lengthy text into smaller chunks
@@ -98,13 +121,11 @@ class TaskSay extends TtsTask {
}
if (this.isStreamingTts) await this.handlingStreaming(cs, obj);
else await this.handling(cs, obj);
this.emit('playDone');
} catch (error) {
if (error instanceof SpeechCredentialError) {
// if say failed due to speech credentials, alarm is writtern and error notification is sent
// finished this say to move to next task.
this.logger.info({error}, 'Say failed due to SpeechCredentialError, finished!');
this.emit('playDone');
return;
}
throw error;
@@ -125,9 +146,6 @@ class TaskSay extends TtsTask {
await cs.startTtsStream();
cs.requestor?.request('tts:streaming-event', '/streaming-event', {event_type: 'stream_open'})
.catch((err) => this.logger.info({err}, 'TaskSay:handlingStreaming - Error sending'));
if (this.text.length !== 0) {
this.logger.info('TaskSay:handlingStreaming - sending text to TTS stream');
for (const t of this.text) {
@@ -259,40 +277,32 @@ class TaskSay extends TtsTask {
while (!this.killed && (this.loop === 'forever' || this.loop--) && ep?.connected) {
let segment = 0;
while (!this.killed && segment < filepath.length) {
const filename = filepath[segment];
if (cs.isInConference) {
const {memberId, confName, confUuid} = cs;
await this.playToConfMember(ep, memberId, confName, confUuid, filepath[segment]);
await this.playToConfMember(ep, memberId, confName, confUuid, filename);
}
else {
let playbackId;
const isStreaming = filepath[segment].startsWith('say:{');
const isStreaming = filename.startsWith('say:{');
if (isStreaming) {
const arr = /^say:\{.*\}\s*(.*)$/.exec(filepath[segment]);
if (arr) this.logger.debug(`Say:exec sending streaming tts request: ${arr[1].substring(0, 64)}..`);
}
else {
this.logger.debug(`Say:exec sending ${filepath[segment].substring(0, 64)}`);
const arr = /^say:\{.*\}\s*(.*)$/.exec(filename);
if (arr) this.logger.debug(`Say:exec sending streaming tts request ${arr[1].substring(0, 64)}..`);
else this.logger.debug(`Say:exec sending ${filename.substring(0, 64)}`);
}
const onPlaybackStop = (evt) => {
try {
this.logger.debug({evt},
`Say got playback-stop ${evt.variable_tts_playback_id ? evt.variable_tts_playback_id : ''}`);
/**
* If we got a playback id on both the start and stop events, and they don't match,
* then we must have received a playback-stop event for an earlier play request.
*/
const unmatchedResponse = (!!playbackId && !!evt.variable_tts_playback_id) &&
evt.variable_tts_playback_id !== playbackId;
if (unmatchedResponse) {
this.logger.info({currentPlaybackId: playbackId, stopPPlaybackId: evt.variable_tts_playback_id},
const playbackId = this.getPlaybackId(segment);
const isMatch = isMatchingEvent(this.logger, filename, playbackId, evt);
if (!isMatch) {
this.logger.info({currentPlaybackId: playbackId, stopPlaybackId: evt.variable_tts_playback_id},
'Say:exec discarding playback-stop for earlier play');
ep.once('playback-stop', this._boundOnPlaybackStop);
return;
}
this.logger.debug({evt},
`Say got playback-stop ${evt.variable_tts_playback_id ? evt.variable_tts_playback_id : ''}`);
this.notifyStatus({event: 'stop-playback'});
this.notifiedPlayBackStop = true;
const tts_error = evt.variable_tts_error;
@@ -331,6 +341,7 @@ class TaskSay extends TtsTask {
!this.disableTtsCache
) {
const text = parseTextFromSayString(this.text[segment]);
this.logger.debug({text, cacheFile: evt.variable_tts_cache_filename}, 'Say:exec cache tts');
addFileToCache(evt.variable_tts_cache_filename, {
account_sid,
vendor,
@@ -358,9 +369,17 @@ class TaskSay extends TtsTask {
};
this._boundOnPlaybackStop = onPlaybackStop.bind(this);
ep.once('playback-start', (evt) => {
const onPlaybackStart = (evt) => {
try {
playbackId = evt.variable_tts_playback_id;
const playbackId = this.getPlaybackId(segment);
const isMatch = isMatchingEvent(this.logger, filename, playbackId, evt);
if (!isMatch) {
this.logger.info({currentPlaybackId: playbackId, startPlaybackId: evt.variable_tts_playback_id},
'Say:exec playback-start - unmatched playback_id');
ep.once('playback-start', this._boundOnPlaybackStart);
return;
}
ep.once('playback-stop', this._boundOnPlaybackStop);
this.logger.debug({evt},
`Say got playback-start ${evt.variable_tts_playback_id ? evt.variable_tts_playback_id : ''}`);
if (this.otelSpan) {
@@ -374,16 +393,29 @@ class TaskSay extends TtsTask {
} catch (err) {
this.logger.info({err}, 'Error handling playback-start event');
}
});
ep.once('playback-stop', this._boundOnPlaybackStop);
};
this._boundOnPlaybackStart = onPlaybackStart.bind(this);
ep.once('playback-start', this._boundOnPlaybackStart);
// wait for playback-stop event received to confirm if the playback is successful
this._playPromise = new Promise((resolve, reject) => {
this._playResolve = resolve;
this._playReject = reject;
});
const r = await ep.play(filepath[segment]);
this.logger.debug({r}, 'Say:exec play result');
try {
const r = await ep.play(filename);
this.logger.debug({r}, 'Say:exec play result');
if (r.playbackSeconds == null && r.playbackMilliseconds == null && r.playbackLastOffsetPos == null) {
this._playReject(new Error('Playback failed to start'));
}
} catch (err) {
if (NON_FANTAL_ERRORS.includes(err.message)) {
throw new NonFatalTaskError(err.message);
}
throw err;
}
try {
// wait for playback-stop event received to confirm if the playback is successful
await this._playPromise;
@@ -400,12 +432,12 @@ class TaskSay extends TtsTask {
this._playResolve = null;
this._playReject = null;
}
if (filepath[segment].startsWith('say:{')) {
const arr = /^say:\{.*\}\s*(.*)$/.exec(filepath[segment]);
if (filename.startsWith('say:{')) {
const arr = /^say:\{.*\}\s*(.*)$/.exec(filename);
if (arr) this.logger.debug(`Say:exec complete playing streaming tts request: ${arr[1].substring(0, 64)}..`);
} else {
// This log will print spech credentials in say command for tts stream mode
this.logger.debug(`Say:exec completed play file ${filepath[segment]}`);
this.logger.debug(`Say:exec completed play file ${filename}`);
}
}
segment++;
@@ -421,8 +453,8 @@ class TaskSay extends TtsTask {
const {memberId, confName} = cs;
this.killPlayToConfMember(this.ep, memberId, confName);
} else if (this.isStreamingTts) {
this.logger.debug('TaskSay:kill - clearing TTS stream for streaming audio');
cs.clearTtsStream();
this.logger.debug('TaskSay:kill - stopping TTS stream for streaming audio');
cs.stopTtsStream();
} else {
if (!this.notifiedPlayBackStop) {
this.notifyStatus({event: 'stop-playback'});

View File

@@ -171,7 +171,7 @@ class SttTask extends Task {
try {
this.sttCredentials = await this._initSpeechCredentials(this.cs, this.vendor, this.label);
} catch (error) {
if (this.canFallback) {
if (this.canFallback()) {
this.notifyError(
{
msg: 'ASR error', details:`Invalid vendor ${this.vendor}, Error: ${error}`,
@@ -203,6 +203,56 @@ class SttTask extends Task {
if (cs.hasGlobalSttPunctuation && !this.data.recognizer.punctuation) {
this.data.recognizer.punctuation = cs.globalSttPunctuation;
}
if (this.vendor === 'gladia') {
const { api_key, region } = this.sttCredentials;
const {url} = await this.createGladiaLiveSession({
api_key, region,
model: this.data.recognizer.model || 'solaria-1',
options: this.data.recognizer.gladiaOptions || {}
});
const {host, pathname, search} = new URL(url);
this.sttCredentials.host = host;
this.sttCredentials.path = `${pathname}${search}`;
}
}
async createGladiaLiveSession({
api_key,
region = 'us-west',
model = 'solaria-1',
options = {},
}) {
const url = `https://api.gladia.io/v2/live?region=${region}`;
const response = await fetch(url, {
method: 'POST',
headers: {
'x-gladia-key': api_key,
'Content-Type': 'application/json'
},
body: JSON.stringify({
encoding: 'wav/pcm',
bit_depth: 16,
sample_rate: 8000,
channels: 1,
model,
...options,
messages_config: {
receive_final_transcripts: true,
receive_speech_events: true,
receive_errors: true,
}
})
});
if (!response.ok) {
const error = await response.text();
this.logger.error({url, status: response.status, error}, 'Error creating Gladia live session');
throw new Error(`Error creating Gladia live session: ${response.status} ${error}`);
}
const data = await response.json();
this.logger.debug({url: data.url}, 'Gladia Call registered');
return data;
}
addCustomEventListener(ep, event, handler) {
@@ -210,8 +260,19 @@ class SttTask extends Task {
ep.addCustomEventListener(event, handler);
}
removeCustomEventListeners() {
this.eventHandlers.forEach((h) => h.ep.removeCustomEventListener(h.event, h.handler));
removeCustomEventListeners(ep) {
if (ep) {
// for specific endpoint
this.eventHandlers.filter((h) => h.ep === ep).forEach((h) => {
h.ep.removeCustomEventListener(h.event, h.handler);
});
this.eventHandlers = this.eventHandlers.filter((h) => h.ep !== ep);
return;
} else {
// for all endpoints
this.eventHandlers.forEach((h) => h.ep.removeCustomEventListener(h.event, h.handler));
this.eventHandlers = [];
}
}
async _initSpeechCredentials(cs, vendor, label) {
@@ -279,11 +340,13 @@ class SttTask extends Task {
return credentials;
}
get canFallback() {
canFallback() {
return this.fallbackVendor && this.isHandledByPrimaryProvider && !this.cs.hasFallbackAsr;
}
async _initFallback() {
// ep is optional for gather or any verb that have single ep,
// but transcribe does need as it might has 2 eps
async _initFallback(ep) {
assert(this.fallbackVendor, 'fallback failed without fallbackVendor configuration');
this.logger.info(`Failed to use primary STT provider, fallback to ${this.fallbackVendor}`);
this.isHandledByPrimaryProvider = false;
@@ -296,7 +359,7 @@ class SttTask extends Task {
this.data.recognizer.label = this.label;
this.sttCredentials = await this._initSpeechCredentials(this.cs, this.vendor, this.label);
// cleanup previous listener from previous vendor
this.removeCustomEventListeners();
this.removeCustomEventListeners(ep);
}
async compileHintsForCobalt(ep, hostport, model, token, hints) {
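
A hedged sketch of a gather recognizer that would exercise the Gladia path added above (field names follow the diff; the values are illustrative):

{
  "verb": "gather",
  "input": ["speech"],
  "recognizer": {
    "vendor": "gladia",
    "model": "solaria-1"
  }
}

Any vendor-specific fields placed under recognizer.gladiaOptions are spread into the body of the live-session request, since createGladiaLiveSession POSTs them to api.gladia.io/v2/live and the returned WebSocket URL is then split into host and path on sttCredentials.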

View File

@@ -6,7 +6,8 @@ const {
AwsTranscriptionEvents,
AzureTranscriptionEvents,
DeepgramTranscriptionEvents,
DeepgramRiverTranscriptionEvents,
GladiaTranscriptionEvents,
DeepgramfluxTranscriptionEvents,
SonioxTranscriptionEvents,
CobaltTranscriptionEvents,
IbmTranscriptionEvents,
@@ -14,6 +15,7 @@ const {
JambonzTranscriptionEvents,
TranscribeStatus,
AssemblyAiTranscriptionEvents,
HoundifyTranscriptionEvents,
VoxistTranscriptionEvents,
CartesiaTranscriptionEvents,
OpenAITranscriptionEvents,
@@ -68,6 +70,9 @@ class TaskTranscribe extends SttTask {
this._bufferedTranscripts = [ [], [] ]; // for channel 1 and 2
this.bugname_prefix = 'transcribe_';
this.paused = false;
// fallback flags
this.isHandledByPrimaryProviderForEp1 = true;
this.isHandledByPrimaryProviderForEp2 = true;
}
get name() { return TaskName.Transcribe; }
@@ -237,19 +242,35 @@ class TaskTranscribe extends SttTask {
this._onVendorConnect.bind(this, cs, ep));
this.addCustomEventListener(ep, DeepgramTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep, channel));
this.addCustomEventListener(ep, DeepgramTranscriptionEvents.Error, this._onVendorError.bind(this, cs, ep));
/* if app sets deepgramOptions.utteranceEndMs they essentially want continuous asr */
//if (opts.DEEPGRAM_SPEECH_UTTERANCE_END_MS) this.isContinuousAsr = true;
break;
case 'deepgramriver':
this.bugname = `${this.bugname_prefix}deepgramriver_transcribe`;
this.addCustomEventListener(ep, DeepgramRiverTranscriptionEvents.Transcription,
case 'deepgramflux':
this.bugname = `${this.bugname_prefix}deepgramflux_transcribe`;
this.addCustomEventListener(ep, DeepgramfluxTranscriptionEvents.Transcription,
this._onTranscription.bind(this, cs, ep, channel));
this.addCustomEventListener(ep, DeepgramRiverTranscriptionEvents.Connect,
this.addCustomEventListener(ep, DeepgramfluxTranscriptionEvents.Connect,
this._onVendorConnect.bind(this, cs, ep));
this.addCustomEventListener(ep, DeepgramRiverTranscriptionEvents.ConnectFailure,
this.addCustomEventListener(ep, DeepgramfluxTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep, channel));
this.addCustomEventListener(ep, DeepgramfluxTranscriptionEvents.Error, this._onVendorError.bind(this, cs, ep));
break;
case 'gladia':
this.bugname = `${this.bugname_prefix}gladia_transcribe`;
this.addCustomEventListener(ep, GladiaTranscriptionEvents.Transcription,
this._onTranscription.bind(this, cs, ep, channel));
this.addCustomEventListener(ep, GladiaTranscriptionEvents.Connect,
this._onVendorConnect.bind(this, cs, ep));
this.addCustomEventListener(ep, GladiaTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep, channel));
this.addCustomEventListener(ep, GladiaTranscriptionEvents.Error, this._onVendorError.bind(this, cs, ep));
break;
case 'soniox':
this.bugname = `${this.bugname_prefix}soniox_transcribe`;
@@ -320,6 +341,18 @@ class TaskTranscribe extends SttTask {
this._onVendorConnectFailure.bind(this, cs, ep, channel));
break;
case 'houndify':
this.bugname = `${this.bugname_prefix}houndify_transcribe`;
this.addCustomEventListener(ep, HoundifyTranscriptionEvents.Transcription,
this._onTranscription.bind(this, cs, ep, channel));
this.addCustomEventListener(ep, HoundifyTranscriptionEvents.Error,
this._onVendorError.bind(this, cs, ep));
this.addCustomEventListener(ep, HoundifyTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep, channel));
this.addCustomEventListener(ep, HoundifyTranscriptionEvents.Connect,
this._onVendorConnect.bind(this, cs, ep));
break;
case 'voxist':
this.bugname = `${this.bugname_prefix}voxist_transcribe`;
this.addCustomEventListener(ep, VoxistTranscriptionEvents.Transcription,
@@ -746,7 +779,7 @@ class TaskTranscribe extends SttTask {
}
async _startFallback(cs, _ep, evt) {
if (this.canFallback) {
if (this.canFallback(_ep)) {
_ep.stopTranscription({
vendor: this.vendor,
bugname: this.bugname,
@@ -756,7 +789,7 @@ class TaskTranscribe extends SttTask {
try {
this.notifyError({ msg: 'ASR error',
details:`STT Vendor ${this.vendor} error: ${evt.error || evt.reason}`, failover: 'in progress'});
await this._initFallback();
await this._initFallback(_ep);
let channel = 1;
if (this.ep !== _ep) {
channel = 2;
@@ -865,6 +898,41 @@ class TaskTranscribe extends SttTask {
if (this._asrTimer) clearTimeout(this._asrTimer);
this._asrTimer = null;
}
// We need to keep track the fallback is happened for each endpoint
// override the canFallback and _initFallback methods to make sure that
// we only fallback once per endpoint
// we want to keep track this on task level instead of endpoint level
// because the endpoint instance is used across multiple tasks.
canFallback(ep) {
let isHandledByPrimaryProvider = this.isHandledByPrimaryProvider;
if (ep === this.ep) {
isHandledByPrimaryProvider = this.isHandledByPrimaryProviderForEp1;
} else if (ep === this.ep2) {
isHandledByPrimaryProvider = this.isHandledByPrimaryProviderForEp2;
}
const isOneOfEndpointAlreadyFallenBack = !!this.ep && !!this.ep2 &&
this.isHandledByPrimaryProviderForEp1 !== this.isHandledByPrimaryProviderForEp2;
// fallback is configured
return this.fallbackVendor &&
// has this endpoint already fallen back
isHandledByPrimaryProvider &&
// in global level, is there any fallback is already happened
// one fallen endpoint will mark cs.hasFallbackAsr to true,
// so if one endpoint was fallen, the other endpoint would be able to fallback.
(isOneOfEndpointAlreadyFallenBack || !this.cs.hasFallbackAsr);
}
_initFallback(ep) {
if (ep === this.ep) {
this.isHandledByPrimaryProviderForEp1 = false;
} else if (ep === this.ep2) {
this.isHandledByPrimaryProviderForEp2 = false;
}
return super._initFallback(ep);
}
}
module.exports = TaskTranscribe;

View File

@@ -3,6 +3,16 @@ const { TaskPreconditions } = require('../utils/constants');
const { SpeechCredentialError } = require('../utils/error');
const dbUtils = require('../utils/db-utils');
const extractPlaybackId = (str) => {
// Match say:{...} and capture the content inside braces
const match = str.match(/say:\{([^}]*)\}/);
if (!match) return null;
// Look for playback_id=value within the captured content
const playbackMatch = match[1].match(/playback_id=([^,]*)/);
return playbackMatch ? playbackMatch[1] : null;
};
class TtsTask extends Task {
constructor(logger, data, parentTask) {
@@ -22,10 +32,19 @@ class TtsTask extends Task {
this.disableTtsCache = this.data.disableTtsCache;
this.options = this.synthesizer.options || {};
this.instructions = this.data.instructions;
this.playbackIds = [];
}
getPlaybackId(offset) {
return this.playbackIds[offset];
}
async exec(cs) {
super.exec(cs);
// update disableTtsCache from call session if not set in task
if (this.data.disableTtsCache == null) {
this.disableTtsCache = cs.disableTtsCache;
}
if (cs.synthesizer) {
this.options = {...cs.synthesizer.options, ...this.options};
this.data.synthesizer = this.data.synthesizer || {};
@@ -66,55 +85,66 @@ class TtsTask extends Task {
}
async setTtsStreamingChannelVars(vendor, language, voice, credentials, ep) {
const {api_key, model_id, custom_tts_streaming_url, auth_token} = credentials;
let obj;
const {api_key, model_id, api_uri, custom_tts_streaming_url, auth_token, options} = credentials;
// api_key, model_id, api_uri, custom_tts_streaming_url, and auth_token are encoded in the credentials
// allow them to be overriden via config, using options
// give preference to options passed in via config
const local_options = {...JSON.parse(options), ...this.options};
const local_voice_settings = {...JSON.parse(options).voice_settings, ...this.options.voice_settings};
const local_api_key = local_options.api_key ?? api_key;
const local_model_id = local_options.model_id ?? model_id;
const local_api_uri = local_options.api_uri ?? api_uri;
const local_custom_tts_streaming_url = local_options.custom_tts_streaming_url ?? custom_tts_streaming_url;
const local_auth_token = local_options.auth_token ?? auth_token;
this.logger.debug({credentials},
`setTtsStreamingChannelVars: vendor: ${vendor}, language: ${language}, voice: ${voice}`);
let obj;
switch (vendor) {
case 'deepgram':
obj = {
DEEPGRAM_API_KEY: api_key,
DEEPGRAM_API_KEY: local_api_key,
DEEPGRAM_TTS_STREAMING_MODEL: voice
};
break;
case 'cartesia':
obj = {
CARTESIA_API_KEY: api_key,
CARTESIA_TTS_STREAMING_MODEL_ID: model_id,
CARTESIA_API_KEY: local_api_key,
CARTESIA_TTS_STREAMING_MODEL_ID: local_model_id,
CARTESIA_TTS_STREAMING_VOICE_ID: voice,
CARTESIA_TTS_STREAMING_LANGUAGE: language || 'en',
};
break;
case 'elevenlabs':
const {stability, similarity_boost, use_speaker_boost, style, speed} = this.options.voice_settings || {};
// eslint-disable-next-line max-len
const {stability, similarity_boost, use_speaker_boost, style, speed} = local_voice_settings || {};
obj = {
ELEVENLABS_API_KEY: api_key,
ELEVENLABS_TTS_STREAMING_MODEL_ID: model_id,
ELEVENLABS_API_KEY: local_api_key,
...(api_uri && {ELEVENLABS_API_URI: local_api_uri}),
ELEVENLABS_TTS_STREAMING_MODEL_ID: local_model_id,
ELEVENLABS_TTS_STREAMING_VOICE_ID: voice,
// 20/12/2024 - only eleven_turbo_v2_5 supports multiple languages
...(['eleven_turbo_v2_5'].includes(model_id) && {ELEVENLABS_TTS_STREAMING_LANGUAGE: language}),
...(['eleven_turbo_v2_5'].includes(local_model_id) && {ELEVENLABS_TTS_STREAMING_LANGUAGE: language}),
...(stability && {ELEVENLABS_TTS_STREAMING_VOICE_SETTINGS_STABILITY: stability}),
...(similarity_boost && {ELEVENLABS_TTS_STREAMING_VOICE_SETTINGS_SIMILARITY_BOOST: similarity_boost}),
...(use_speaker_boost && {ELEVENLABS_TTS_STREAMING_VOICE_SETTINGS_USE_SPEAKER_BOOST: use_speaker_boost}),
...(style && {ELEVENLABS_TTS_STREAMING_VOICE_SETTINGS_STYLE: style}),
// speed ranges from 0.7 to 1.2 (default 1.0); make sure we send the value even if it's 0
...(speed !== null && speed !== undefined && {ELEVENLABS_TTS_STREAMING_VOICE_SETTINGS_SPEED: `${speed}`}),
...(this.options.pronunciation_dictionary_locators &&
Array.isArray(this.options.pronunciation_dictionary_locators) && {
...(local_options.pronunciation_dictionary_locators &&
Array.isArray(local_options.pronunciation_dictionary_locators) && {
ELEVENLABS_TTS_STREAMING_PRONUNCIATION_DICTIONARY_LOCATORS:
JSON.stringify(this.options.pronunciation_dictionary_locators)
JSON.stringify(local_options.pronunciation_dictionary_locators)
}),
};
break;
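
The explicit null/undefined check on speed matters because a truthy-guard spread would silently drop a zero value; a small hypothetical demonstration:

```js
const speed = 0;
const truthyGuard = {...(speed && {SPEED: `${speed}`})};                                   // drops 0
const explicitGuard = {...(speed !== null && speed !== undefined && {SPEED: `${speed}`})};
console.log(truthyGuard);   // {}
console.log(explicitGuard); // { SPEED: '0' }
```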
case 'rimelabs':
const {
pauseBetweenBrackets, phonemizeBetweenBrackets, inlineSpeedAlpha, speedAlpha, reduceLatency
} = this.options;
} = local_options;
obj = {
RIMELABS_API_KEY: api_key,
RIMELABS_TTS_STREAMING_MODEL_ID: model_id,
RIMELABS_API_KEY: local_api_key,
RIMELABS_TTS_STREAMING_MODEL_ID: local_model_id,
RIMELABS_TTS_STREAMING_VOICE_ID: voice,
RIMELABS_TTS_STREAMING_LANGUAGE: language || 'en',
...(pauseBetweenBrackets && {RIMELABS_TTS_STREAMING_PAUSE_BETWEEN_BRACKETS: pauseBetweenBrackets}),
@@ -129,8 +159,8 @@ class TtsTask extends Task {
if (vendor.startsWith('custom:')) {
const use_tls = custom_tts_streaming_url.startsWith('wss://');
obj = {
CUSTOM_TTS_STREAMING_HOST: custom_tts_streaming_url.replace(/^(ws|wss):\/\//, ''),
CUSTOM_TTS_STREAMING_API_KEY: auth_token,
CUSTOM_TTS_STREAMING_HOST: local_custom_tts_streaming_url.replace(/^(ws|wss):\/\//, ''),
CUSTOM_TTS_STREAMING_API_KEY: local_auth_token,
CUSTOM_TTS_STREAMING_VOICE_ID: voice,
CUSTOM_TTS_STREAMING_LANGUAGE: language || 'en',
CUSTOM_TTS_STREAMING_USE_TLS: use_tls
@@ -249,9 +279,9 @@ class TtsTask extends Task {
}
/* produce an audio segment from the provided text */
const generateAudio = async(text) => {
if (this.killed) return;
if (text.startsWith('silence_stream://')) return text;
const generateAudio = async(text, index) => {
if (this.killed) return {index, filePath: null};
if (text.startsWith('silence_stream://')) return {index, filePath: text};
/* otel: trace time for tts */
if (!preCache && !this._disableTracing) {
@@ -307,13 +337,21 @@ class TtsTask extends Task {
'id': this.id
});
}
return {index, filePath, playbackId: null};
}
else {
const playbackId = extractPlaybackId(filePath);
this.logger.debug('Say: a streaming tts api will be used');
const modifiedPath = filePath.replace('say:{', `say:{session-uuid=${ep.uuid},`);
return modifiedPath;
this.notifyStatus({
event: 'synthesized-audio',
vendor,
language,
servedFromCache,
'id': this.id
});
return {index, filePath: modifiedPath, playbackId};
}
return filePath;
} catch (err) {
this.logger.info({err}, 'Error synthesizing tts');
if (this.otelSpan) this.otelSpan.end();
@@ -328,8 +366,20 @@ class TtsTask extends Task {
}
};
const arr = this.text.map((t) => (this._validateURL(t) ? t : generateAudio(t)));
return (await Promise.all(arr)).filter((fp) => fp && fp.length);
// processing all text segments in parallel can cause ordering issues,
// so we attach an index to each promise result and sort them later
const arr = this.text.map((t, index) => (this._validateURL(t) ?
Promise.resolve({index, filePath: t, playbackId: null}) : generateAudio(t, index)));
const results = await Promise.all(arr);
const sorted = results.sort((a, b) => a.index - b.index);
return sorted
.filter((fp) => fp.filePath && fp.filePath.length)
.map((r) => {
this.playbackIds.push(r.playbackId);
return r.filePath;
});
} catch (err) {
this.logger.info(err, 'TaskSay:exec error');
throw err;
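
A minimal sketch, with made-up synthesis results, of the indexed-result pattern above: each segment carries its original position so the returned file list and the parallel playbackIds array stay aligned with the say.text order.

```js
// made-up async producer standing in for the real tts synthesis
const fakeSynthesize = (text, index) =>
  new Promise((resolve) => setTimeout(
    () => resolve({index, filePath: `/tmp/${index}.wav`, playbackId: `pb-${index}`}),
    Math.random() * 50));

(async() => {
  const text = ['hello', 'silence_stream://500', 'goodbye'];
  const results = await Promise.all(text.map((t, i) => fakeSynthesize(t, i)));
  const sorted = results.sort((a, b) => a.index - b.index);
  const playbackIds = [];
  const files = sorted.map((r) => {
    playbackIds.push(r.playbackId);
    return r.filePath;
  });
  console.log(files);       // ['/tmp/0.wav', '/tmp/1.wav', '/tmp/2.wav']
  console.log(playbackIds); // ['pb-0', 'pb-1', 'pb-2']
})();
```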

View File

@@ -94,12 +94,20 @@
"DeepgramTranscriptionEvents": {
"Transcription": "deepgram_transcribe::transcription",
"ConnectFailure": "deepgram_transcribe::connect_failed",
"Connect": "deepgram_transcribe::connect"
"Connect": "deepgram_transcribe::connect",
"Error": "deepgram_transcribe::error"
},
"DeepgramRiverTranscriptionEvents": {
"Transcription": "deepgramriver_transcribe::transcription",
"ConnectFailure": "deepgramriver_transcribe::connect_failed",
"Connect": "deepgramriver_transcribe::connect"
"DeepgramfluxTranscriptionEvents": {
"Transcription": "deepgramflux_transcribe::transcription",
"ConnectFailure": "deepgramflux_transcribe::connect_failed",
"Connect": "deepgramflux_transcribe::connect",
"Error": "deepgramflux_transcribe::error"
},
"GladiaTranscriptionEvents": {
"Transcription": "gladia_transcribe::transcription",
"ConnectFailure": "gladia_transcribe::connect_failed",
"Connect": "gladia_transcribe::connect",
"Error": "gladia_transcribe::error"
},
"SonioxTranscriptionEvents": {
"Transcription": "soniox_transcribe::transcription",
@@ -167,6 +175,12 @@
"ConnectFailure": "assemblyai_transcribe::connect_failed",
"Connect": "assemblyai_transcribe::connect"
},
"HoundifyTranscriptionEvents": {
"Transcription": "houndify_transcribe::transcription",
"Error": "houndify_transcribe::error",
"ConnectFailure": "houndify_transcribe::connect_failed",
"Connect": "houndify_transcribe::connect"
},
"VoxistTranscriptionEvents": {
"Transcription": "voxist_transcribe::transcription",
"Error": "voxist_transcribe::error",
@@ -321,7 +335,8 @@
"Empty": "tts_streaming::empty",
"Pause": "tts_streaming::pause",
"Resume": "tts_streaming::resume",
"ConnectFailure": "tts_streaming::connect_failed"
"ConnectFailure": "tts_streaming::connect_failed",
"Connected": "tts_streaming::connected"
},
"TtsStreamingConnectionStatus": {
"NotConnected": "not_connected",
@@ -341,5 +356,8 @@
"WS_CLOSE_CODES": {
"NormalClosure": 1000,
"GoingAway": 1001
}
},
"NON_FANTAL_ERRORS": [
"File Not Found"
]
}

View File

@@ -81,7 +81,12 @@ const speechMapper = (cred) => {
obj.deepgram_tts_uri = o.deepgram_tts_uri;
obj.deepgram_stt_use_tls = o.deepgram_stt_use_tls;
}
else if ('deepgramriver' === obj.vendor) {
else if ('gladia' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.api_key = o.api_key;
obj.region = o.region;
}
else if ('deepgramflux' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.api_key = o.api_key;
}
@@ -101,6 +106,7 @@ const speechMapper = (cred) => {
const o = JSON.parse(decrypt(credential));
obj.api_key = o.api_key;
obj.model_id = o.model_id;
obj.api_uri = o.api_uri;
obj.options = o.options;
}
else if ('playht' === obj.vendor) {
@@ -141,6 +147,13 @@ const speechMapper = (cred) => {
obj.api_key = o.api_key;
obj.service_version = o.service_version;
}
else if ('houndify' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.client_id = o.client_id;
obj.client_key = o.client_key;
obj.user_id = o.user_id;
obj.houndify_server_uri = o.houndify_server_uri;
}
else if ('voxist' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.api_key = o.api_key;

View File

@@ -173,7 +173,8 @@ function installSrfLocals(srf, logger, {
lookupAccountCapacitiesBySid,
lookupSmppGateways,
lookupClientByAccountAndUsername,
lookupSystemInformation
lookupSystemInformation,
lookupLcrByAccount
} = require('@jambonz/db-helpers')({
host: JAMBONES_MYSQL_HOST,
user: JAMBONES_MYSQL_USER,
@@ -279,7 +280,8 @@ function installSrfLocals(srf, logger, {
retrieveByPatternSortedSet,
sortedSetLength,
sortedSetPositionByPattern,
getVerbioAccessToken
getVerbioAccessToken,
lookupLcrByAccount
},
parentLogger: logger,
getSBC,

View File

@@ -20,7 +20,7 @@ const { createMediaEndpoint } = require('./media-endpoint');
class SingleDialer extends Emitter {
constructor({logger, sbcAddress, target, opts, application, callInfo, accountInfo, rootSpan, startSpan, dialTask,
onHoldMusic}) {
onHoldMusic, tmpFiles}) {
super();
assert(target.type);
@@ -44,6 +44,7 @@ class SingleDialer extends Emitter {
this.callSid = crypto.randomUUID();
this.dialTask = dialTask;
this.onHoldMusic = onHoldMusic;
this.tmpFiles = tmpFiles;
this.on('callStatusChange', this._notifyCallStatusChange.bind(this));
}
@@ -328,7 +329,13 @@ class SingleDialer extends Emitter {
*/
async kill(Reason) {
this.killed = true;
if (this.inviteInProgress) await this.inviteInProgress.cancel();
if (this.inviteInProgress) {
try {
await this.inviteInProgress.cancel();
} catch (err) {
this.logger.error({err}, 'SingleDialer:kill error cancelling invite');
}
}
else if (this.dlg && this.dlg.connected) {
const duration = moment().diff(this.dlg.connectTime, 'seconds');
this.logger.debug('SingleDialer:kill hanging up called party');
@@ -402,7 +409,7 @@ class SingleDialer extends Emitter {
tasks,
rootSpan: this.rootSpan,
req: this.req,
tmpFiles: cs.tmpFiles,
tmpFiles: this.tmpFiles,
});
await cs.exec();
@@ -536,12 +543,12 @@ class SingleDialer extends Emitter {
function placeOutdial({
logger, srf, ms, sbcAddress, target, opts, application, callInfo, accountInfo, rootSpan, startSpan, dialTask,
onHoldMusic
onHoldMusic, tmpFiles
}) {
const myOpts = deepcopy(opts);
const sd = new SingleDialer({
logger, sbcAddress, target, opts: myOpts, application, callInfo,
accountInfo, rootSpan, startSpan, dialTask, onHoldMusic
accountInfo, rootSpan, startSpan, dialTask, onHoldMusic, tmpFiles
});
sd.exec(srf, ms, myOpts);
return sd;

View File

@@ -0,0 +1,91 @@
// lib/utils/process-monitor.js
const fs = require('fs');
const path = require('path');
class ProcessMonitor {
constructor(logger) {
this.logger = logger;
this.packageInfo = this.getPackageInfo();
this.processName = this.packageInfo.name || 'unknown-app';
}
getPackageInfo() {
try {
const packagePath = path.join(process.cwd(), 'package.json');
return JSON.parse(fs.readFileSync(packagePath, 'utf8'));
} catch (e) {
return { name: 'unknown', version: 'unknown' };
}
}
logStartup(additionalInfo = {}) {
const startupInfo = {
msg: `${this.processName} started`,
app_name: this.processName,
app_version: this.packageInfo.version,
pid: process.pid,
ppid: process.ppid,
pm2_instance_id: process.env.NODE_APP_INSTANCE || 'not_pm2',
pm2_id: process.env.pm_id,
is_pm2: !!process.env.PM2,
node_version: process.version,
uptime: process.uptime(),
timestamp: new Date().toISOString(),
...additionalInfo
};
this.logger.info(startupInfo);
return startupInfo;
}
setupSignalHandlers() {
// Log when we receive signals that would cause restart
process.on('SIGINT', () => {
this.logger.info({
msg: 'SIGINT received',
app_name: this.processName,
pid: process.pid,
ppid: process.ppid,
uptime: process.uptime(),
timestamp: new Date().toISOString()
});
process.exit(0);
});
process.on('SIGTERM', () => {
this.logger.info({
msg: 'SIGTERM received',
app_name: this.processName,
pid: process.pid,
ppid: process.ppid,
uptime: process.uptime(),
timestamp: new Date().toISOString()
});
process.exit(0);
});
process.on('uncaughtException', (error) => {
this.logger.error({
msg: 'Uncaught exception - process will restart',
app_name: this.processName,
error: error.message,
stack: error.stack,
pid: process.pid,
timestamp: new Date().toISOString()
});
process.exit(1);
});
process.on('unhandledRejection', (reason, promise) => {
this.logger.error({
msg: 'Unhandled rejection',
app_name: this.processName,
reason,
pid: process.pid,
timestamp: new Date().toISOString()
});
});
}
}
module.exports = ProcessMonitor;
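
A hypothetical usage sketch, assuming a pino logger (pino is already a dependency) and the module path from the comment above; the extra startup field is made up.

```js
const pino = require('pino');
const ProcessMonitor = require('./lib/utils/process-monitor');

const logger = pino();
const monitor = new ProcessMonitor(logger);
monitor.logStartup({drachtio_host: '127.0.0.1'}); // extra fields are merged into the startup log line
monitor.setupSignalHandlers();                    // logs SIGINT/SIGTERM/uncaught errors before exiting
```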

View File

@@ -55,11 +55,28 @@ const extractSdpMedia = (sdp) => {
}
};
const getLeadingCodec = (sdp) => {
if (!sdp) {
return null;
}
const parsed = sdpTransform.parse(sdp);
const audio = parsed.media?.find((m) => m.type === 'audio');
if (!audio) {
return null;
}
return audio.rtp?.[0]?.codec || null;
};
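
A quick, self-contained check of the helper above; the SDP fragment is hand-written and illustrative.

```js
const sdpTransform = require('sdp-transform');

const getLeadingCodec = (sdp) => {
  if (!sdp) return null;
  const parsed = sdpTransform.parse(sdp);
  const audio = parsed.media?.find((m) => m.type === 'audio');
  return audio?.rtp?.[0]?.codec || null;
};

const sdp = [
  'v=0', 'o=- 0 0 IN IP4 127.0.0.1', 's=-', 'c=IN IP4 127.0.0.1', 't=0 0',
  'm=audio 4000 RTP/AVP 111 0',
  'a=rtpmap:111 opus/48000/2',
  'a=rtpmap:0 PCMU/8000'
].join('\r\n');

console.log(getLeadingCodec(sdp)); // 'opus'
console.log(getLeadingCodec(''));  // null
```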
module.exports = {
isOnhold,
mergeSdpMedia,
extractSdpMedia,
isOpusFirst,
makeOpusFirst,
removeVideoSdp
removeVideoSdp,
getLeadingCodec
};

View File

@@ -131,6 +131,43 @@ const stickyVars = {
'OPENAI_TURN_DETECTION_PREFIX_PADDING_MS',
'OPENAI_TURN_DETECTION_SILENCE_DURATION_MS',
],
houndify: [
'HOUNDIFY_CLIENT_ID',
'HOUNDIFY_CLIENT_KEY',
'HOUNDIFY_USER_ID',
'HOUNDIFY_MAX_SILENCE_SECONDS',
'HOUNDIFY_MAX_SILENCE_AFTER_FULL_QUERY_SECONDS',
'HOUNDIFY_MAX_SILENCE_AFTER_PARTIAL_QUERY_SECONDS',
'HOUNDIFY_VAD_SENSITIVITY',
'HOUNDIFY_VAD_TIMEOUT',
'HOUNDIFY_VAD_MODE',
'HOUNDIFY_VAD_VOICE_MS',
'HOUNDIFY_VAD_SILENCE_MS',
'HOUNDIFY_VAD_DEBUG',
'HOUNDIFY_AUDIO_FORMAT',
'HOUNDIFY_ENABLE_NOISE_REDUCTION',
'HOUNDIFY_AUDIO_ENDPOINT',
'HOUNDIFY_ENABLE_PROFANITY_FILTER',
'HOUNDIFY_ENABLE_PUNCTUATION',
'HOUNDIFY_ENABLE_CAPITALIZATION',
'HOUNDIFY_CONFIDENCE_THRESHOLD',
'HOUNDIFY_ENABLE_DISFLUENCY_FILTER',
'HOUNDIFY_MAX_RESULTS',
'HOUNDIFY_ENABLE_WORD_TIMESTAMPS',
'HOUNDIFY_MAX_ALTERNATIVES',
'HOUNDIFY_PARTIAL_TRANSCRIPT_INTERVAL',
'HOUNDIFY_SESSION_TIMEOUT',
'HOUNDIFY_CONNECTION_TIMEOUT',
'HOUNDIFY_LATITUDE',
'HOUNDIFY_LONGITUDE',
'HOUNDIFY_CITY',
'HOUNDIFY_STATE',
'HOUNDIFY_COUNTRY',
'HOUNDIFY_TIMEZONE',
'HOUNDIFY_DOMAIN',
'HOUNDIFY_CUSTOM_VOCABULARY',
'HOUNDIFY_LANGUAGE_MODEL'
],
};
/**
@@ -339,20 +376,56 @@ const normalizeDeepgram = (evt, channel, language, shortUtterance) => {
};
};
const normalizeDeepgramRiver = (evt, channel, language, shortUtterance) => {
const normalizeGladia = (evt, channel, language, shortUtterance) => {
const copy = JSON.parse(JSON.stringify(evt));
// Handle Gladia transcript format
if (evt.type === 'transcript' && evt.data && evt.data.utterance) {
const utterance = evt.data.utterance;
const alternatives = [{
confidence: utterance.confidence || 0,
transcript: utterance.text || '',
}];
return {
language_code: utterance.language || language,
channel_tag: channel,
is_final: evt.data.is_final || false,
alternatives,
vendor: {
name: 'gladia',
evt: copy
}
};
}
};
const normalizeDeepgramFlux = (evt, channel, language) => {
const copy = JSON.parse(JSON.stringify(evt));
let turnTakingEvent;
if (['StartOfTurn', 'EagerEndOfTurn', 'TurnResumed', 'EndOfTurn'].includes(evt.event)) {
turnTakingEvent = evt.event;
}
/* calculate total confidence based on word-level confidence */
const realWords = (evt.words || [])
.filter((w) => !['.', ',', '!', '?', ';'].includes(w.word));
const confidence = realWords.length > 0 ? realWords.reduce((acc, w) => acc + w.confidence, 0) / realWords.length : 0;
return {
language_code: language,
channel_tag: channel,
is_final: evt.event === 'EndOfTurn',
alternatives: [
{
confidence: evt.end_of_turn_confidence,
confidence,
end_of_turn_confidence: evt.end_of_turn_confidence,
transcript: evt.transcript,
...(turnTakingEvent && {turn_taking_event: turnTakingEvent})
}
],
vendor: {
name: 'deepgramriver',
name: 'deepgramflux',
evt: copy
}
};
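
A worked example of the confidence averaging above, with a hypothetical Deepgram Flux event: punctuation-only tokens are dropped, then the remaining word confidences are averaged and reported alongside end_of_turn_confidence.

```js
const evt = {
  event: 'EndOfTurn',
  transcript: 'hello world.',
  end_of_turn_confidence: 0.91,
  words: [
    {word: 'hello', confidence: 0.9},
    {word: 'world', confidence: 0.8},
    {word: '.', confidence: 0.1}
  ]
};
const realWords = evt.words.filter((w) => !['.', ',', '!', '?', ';'].includes(w.word));
const confidence = realWords.reduce((acc, w) => acc + w.confidence, 0) / realWords.length;
console.log(confidence); // 0.85
```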
@@ -570,6 +643,30 @@ const normalizeAssemblyAi = (evt, channel, language) => {
};
};
const normalizeHoundify = (evt, channel, language) => {
const copy = JSON.parse(JSON.stringify(evt));
const alternatives = [];
const is_final = evt.ResultsAreFinal && evt.ResultsAreFinal[0] === true;
if (evt.Disambiguation && evt.Disambiguation.ChoiceData && evt.Disambiguation.ChoiceData.length > 0) {
// Handle Houndify Voice Search Result format
const choiceData = evt.Disambiguation.ChoiceData[0];
alternatives.push({
confidence: choiceData.ConfidenceScore || choiceData.ASRConfidence || 0.0,
transcript: choiceData.FormattedTranscription || choiceData.Transcription || '',
});
}
return {
language_code: language,
channel_tag: channel,
is_final,
alternatives,
vendor: {
name: 'houndify',
evt: copy
}
};
};
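
A hypothetical Houndify result shaped the way the parser above expects, walked through by hand to show the normalized form it would roughly produce.

```js
// made-up Houndify Voice Search result
const evt = {
  ResultsAreFinal: [true],
  Disambiguation: {
    ChoiceData: [{ConfidenceScore: 0.93, FormattedTranscription: 'What is the weather today?'}]
  }
};

const choice = evt.Disambiguation.ChoiceData[0];
const normalized = {
  language_code: 'en-US',
  channel_tag: 1,
  is_final: evt.ResultsAreFinal?.[0] === true,
  alternatives: [{
    confidence: choice.ConfidenceScore || choice.ASRConfidence || 0.0,
    transcript: choice.FormattedTranscription || choice.Transcription || ''
  }],
  vendor: {name: 'houndify', evt}
};
console.log(JSON.stringify(normalized, null, 2));
```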
const normalizeVoxist = (evt, channel, language) => {
const copy = JSON.parse(JSON.stringify(evt));
return {
@@ -669,8 +766,10 @@ module.exports = (logger) => {
switch (vendor) {
case 'deepgram':
return normalizeDeepgram(evt, channel, language, shortUtterance);
case 'deepgramriver':
return normalizeDeepgramRiver(evt, channel, language, shortUtterance);
case 'gladia':
return normalizeGladia(evt, channel, language, shortUtterance);
case 'deepgramflux':
return normalizeDeepgramFlux(evt, channel, language, shortUtterance);
case 'microsoft':
return normalizeMicrosoft(evt, channel, language, punctuation);
case 'google':
@@ -689,6 +788,8 @@ module.exports = (logger) => {
return normalizeCobalt(evt, channel, language);
case 'assemblyai':
return normalizeAssemblyAi(evt, channel, language, shortUtterance);
case 'houndify':
return normalizeHoundify(evt, channel, language, shortUtterance);
case 'voxist':
return normalizeVoxist(evt, channel, language);
case 'cartesia':
@@ -819,7 +920,7 @@ module.exports = (logger) => {
...(rOpts.initialSpeechTimeoutMs > 0 &&
{AZURE_INITIAL_SPEECH_TIMEOUT_MS: rOpts.initialSpeechTimeoutMs}),
...(rOpts.requestSnr && {AZURE_REQUEST_SNR: 1}),
...(rOpts.audioLogging && {AZURE_AUDIO_LOGGING: 1}),
...(azureOptions.audioLogging && {AZURE_AUDIO_LOGGING: 1}),
...{AZURE_USE_OUTPUT_FORMAT_DETAILED: 1},
...(azureOptions.speechSegmentationSilenceTimeoutMs &&
{AZURE_SPEECH_SEGMENTATION_SILENCE_TIMEOUT_MS: azureOptions.speechSegmentationSilenceTimeoutMs}),
@@ -965,19 +1066,30 @@ module.exports = (logger) => {
...(entityPrompt && {DEEPGRAM_SPEECH_ENTITY_PROMPT: entityPrompt}),
};
}
else if ('deepgramriver' === vendor) {
else if ('deepgramflux' === vendor) {
const {
preflightThreshold,
eotThreshold,
eotTimeoutMs,
mipOptOut
mipOptOut,
model,
eagerEotThreshold,
keyterms
} = rOpts.deepgramOptions || {};
opts = {
DEEPGRAM_API_KEY: sttCredentials.api_key,
...(preflightThreshold && {DEEPGRAM_SPEECH_PRELIGHT_THRESHOLD: preflightThreshold}),
...(eotThreshold && {DEEPGRAM_SPEECH_EOT_THRESHOLD: eotThreshold}),
...(eotTimeoutMs && {DEEPGRAM_SPEECH_EOT_TIMEOUT_MS: eotTimeoutMs}),
...(mipOptOut && {DEEPGRAM_SPEECH_MIP_OPT_OUT: mipOptOut}),
DEEPGRAMFLUX_API_KEY: sttCredentials.api_key,
DEEPGRAMFLUX_SPEECH_MODEL: model || 'flux-general-en',
...(eotThreshold && {DEEPGRAMFLUX_SPEECH_EOT_THRESHOLD: eotThreshold}),
...(eotTimeoutMs && {DEEPGRAMFLUX_SPEECH_EOT_TIMEOUT_MS: eotTimeoutMs}),
...(mipOptOut && {DEEPGRAMFLUX_SPEECH_MIP_OPT_OUT: mipOptOut}),
...(eagerEotThreshold && {DEEPGRAMFLUX_SPEECH_EAGER_EOT_THRESHOLD: eagerEotThreshold}),
...(keyterms && keyterms.length > 0 && {DEEPGRAMFLUX_SPEECH_KEYTERMS: keyterms.join(',')}),
};
}
else if ('gladia' === vendor) {
const {host, path} = sttCredentials;
opts = {
GLADIA_SPEECH_HOST: host,
GLADIA_SPEECH_PATH: path,
};
}
else if ('soniox' === vendor) {
@@ -1106,6 +1218,61 @@ module.exports = (logger) => {
{ASSEMBLYAI_WORD_BOOST: JSON.stringify(rOpts.hints)})
};
}
else if ('houndify' === vendor) {
const {
latitude, longitude, city, state, country, timeZone, domain, audioEndpoint,
maxSilenceSeconds, maxSilenceAfterFullQuerySeconds, maxSilenceAfterPartialQuerySeconds,
vadSensitivity, vadTimeout, vadMode, vadVoiceMs, vadSilenceMs, vadDebug,
audioFormat, enableNoiseReduction, enableProfanityFilter, enablePunctuation,
enableCapitalization, confidenceThreshold, enableDisfluencyFilter,
maxResults, enableWordTimestamps, maxAlternatives, partialTranscriptInterval,
sessionTimeout, connectionTimeout, customVocabulary, languageModel,
requestInfo, sampleRate
} = rOpts.houndifyOptions || {};
const audioEndpointUri = audioEndpoint || sttCredentials.houndify_server_uri;
opts = {
...opts,
HOUNDIFY_CLIENT_ID: sttCredentials.client_id,
HOUNDIFY_CLIENT_KEY: sttCredentials.client_key,
HOUNDIFY_USER_ID: sttCredentials.user_id,
HOUNDIFY_MAX_SILENCE_SECONDS: maxSilenceSeconds || 5,
HOUNDIFY_MAX_SILENCE_AFTER_FULL_QUERY_SECONDS: maxSilenceAfterFullQuerySeconds || 1,
HOUNDIFY_MAX_SILENCE_AFTER_PARTIAL_QUERY_SECONDS: maxSilenceAfterPartialQuerySeconds || 1.5,
...(vadSensitivity && {HOUNDIFY_VAD_SENSITIVITY: vadSensitivity}),
...(vadTimeout && {HOUNDIFY_VAD_TIMEOUT: vadTimeout}),
...(vadMode && {HOUNDIFY_VAD_MODE: vadMode}),
...(vadVoiceMs && {HOUNDIFY_VAD_VOICE_MS: vadVoiceMs}),
...(vadSilenceMs && {HOUNDIFY_VAD_SILENCE_MS: vadSilenceMs}),
...(vadDebug && {HOUNDIFY_VAD_DEBUG: vadDebug}),
...(audioFormat && {HOUNDIFY_AUDIO_FORMAT: audioFormat}),
...(enableNoiseReduction && {HOUNDIFY_ENABLE_NOISE_REDUCTION: enableNoiseReduction}),
...(enableProfanityFilter && {HOUNDIFY_ENABLE_PROFANITY_FILTER: enableProfanityFilter}),
...(enablePunctuation && {HOUNDIFY_ENABLE_PUNCTUATION: enablePunctuation}),
...(enableCapitalization && {HOUNDIFY_ENABLE_CAPITALIZATION: enableCapitalization}),
...(confidenceThreshold && {HOUNDIFY_CONFIDENCE_THRESHOLD: confidenceThreshold}),
...(enableDisfluencyFilter && {HOUNDIFY_ENABLE_DISFLUENCY_FILTER: enableDisfluencyFilter}),
...(maxResults && {HOUNDIFY_MAX_RESULTS: maxResults}),
...(enableWordTimestamps && {HOUNDIFY_ENABLE_WORD_TIMESTAMPS: enableWordTimestamps}),
...(maxAlternatives && {HOUNDIFY_MAX_ALTERNATIVES: maxAlternatives}),
...(partialTranscriptInterval && {HOUNDIFY_PARTIAL_TRANSCRIPT_INTERVAL: partialTranscriptInterval}),
...(sessionTimeout && {HOUNDIFY_SESSION_TIMEOUT: sessionTimeout}),
...(connectionTimeout && {HOUNDIFY_CONNECTION_TIMEOUT: connectionTimeout}),
...(latitude && {HOUNDIFY_LATITUDE: latitude}),
...(longitude && {HOUNDIFY_LONGITUDE: longitude}),
...(city && {HOUNDIFY_CITY: city}),
...(state && {HOUNDIFY_STATE: state}),
...(country && {HOUNDIFY_COUNTRY: country}),
...(timeZone && {HOUNDIFY_TIMEZONE: timeZone}),
...(domain && {HOUNDIFY_DOMAIN: domain}),
...(audioEndpointUri && {HOUNDIFY_AUDIO_ENDPOINT: audioEndpointUri}),
...(customVocabulary && {HOUNDIFY_CUSTOM_VOCABULARY:
Array.isArray(customVocabulary) ? customVocabulary.join(',') : customVocabulary}),
...(languageModel && {HOUNDIFY_LANGUAGE_MODEL: languageModel}),
...(requestInfo && {HOUNDIFY_REQUEST_INFO: JSON.stringify(requestInfo)}),
...(sampleRate && {HOUNDIFY_SAMPLING_RATE: sampleRate}),
};
}
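
A hypothetical recognizer snippet for a transcribe or gather verb showing options that feed the HOUNDIFY_* channel variables above; the keys follow the destructuring in the diff and the values are made up.

```js
const recognizer = {
  vendor: 'houndify',
  language: 'en-US',
  houndifyOptions: {
    maxSilenceSeconds: 3,
    vadSensitivity: 0.6,
    enablePunctuation: true,
    city: 'Austin',
    state: 'TX',
    country: 'USA',
    requestInfo: {PartialTranscriptsDesired: true}, // JSON-stringified into HOUNDIFY_REQUEST_INFO
    sampleRate: 8000                                // becomes HOUNDIFY_SAMPLING_RATE
  }
};
```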
else if ('voxist' === vendor) {
opts = {
...opts,

View File

@@ -80,7 +80,7 @@ class TtsStreamingBuffer extends Emitter {
clearTimeout(this.timer);
this.removeCustomEventListeners();
if (this.ep) {
this._api(this.ep, [this.ep.uuid, 'close'])
this._api(this.ep, [this.ep.uuid, 'stop'])
.catch((err) =>
this.logger.info({ err }, 'TtsStreamingBuffer:stop Error closing TTS streaming')
);
@@ -193,10 +193,7 @@ class TtsStreamingBuffer extends Emitter {
this.logger.debug('TtsStreamingBuffer:_feedQueue TTS stream is not open or no endpoint available');
return;
}
if (
this._connectionStatus === TtsStreamingConnectionStatus.NotConnected ||
this._connectionStatus === TtsStreamingConnectionStatus.Failed
) {
if (this._connectionStatus !== TtsStreamingConnectionStatus.Connected) {
this.logger.debug('TtsStreamingBuffer:_feedQueue TTS stream is not connected');
return;
}
@@ -278,6 +275,14 @@ class TtsStreamingBuffer extends Emitter {
}
const chunk = combinedText.slice(0, chunkEnd);
// Check if the chunk is only whitespace before processing the queue
// If so, wait for more meaningful text
if (isWhitespace(chunk)) {
this.logger.debug('TtsStreamingBuffer:_feedQueue chunk is only whitespace, waiting for more text');
this._setTimerIfNeeded();
return;
}
// Now we iterate over the queue items
// and deduct their lengths until we've accounted for chunkEnd characters.
let remaining = chunkEnd;
@@ -301,6 +306,14 @@ class TtsStreamingBuffer extends Emitter {
this.bufferedLength -= chunkEnd;
const modifiedChunk = chunk.replace(/\n\n/g, '\n \n');
if (isWhitespace(modifiedChunk)) {
this.logger.debug('TtsStreamingBuffer:_feedQueue modified chunk is only whitespace, restoring queue');
this.queue.unshift({ type: 'text', value: chunk });
this.bufferedLength += chunkEnd;
this._setTimerIfNeeded();
return;
}
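
A small sketch of the guard above, assuming an isWhitespace helper along these lines (it is referenced in the diff but not shown): whitespace-only chunks are held back rather than streamed to the TTS vendor.

```js
const isWhitespace = (s) => !s || /^\s*$/.test(s); // assumed implementation

const chunks = ['  \n ', 'Hello, ', 'world.'];
for (const chunk of chunks) {
  if (isWhitespace(chunk)) {
    console.log('holding back whitespace-only chunk, waiting for more text');
    continue;
  }
  console.log(`would send to tts: ${chunk.replace(/\n\n/g, '\n \n')}`);
}
```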
this.logger.debug(`TtsStreamingBuffer:_feedQueue sending chunk to tts: ${modifiedChunk}`);
try {
@@ -349,6 +362,7 @@ class TtsStreamingBuffer extends Emitter {
if (this.queue.length > 0) {
await this._feedQueue();
}
this.emit(TtsStreamingEvents.Connected, { vendor });
}
_onConnectFailure(vendor) {
@@ -399,6 +413,7 @@ class TtsStreamingBuffer extends Emitter {
removeCustomEventListeners() {
this.eventHandlers.forEach((h) => h.ep.removeCustomEventListener(h.event, h.handler));
this.eventHandlers.length = 0;
}
_initHandlers(ep) {

View File

@@ -293,7 +293,7 @@ class WsRequestor extends BaseRequestor {
/* send the message */
this.ws.send(JSON.stringify(obj), async() => {
this.logger.debug({obj}, `WsRequestor:request websocket: sent (${url})`);
if (obj.type !== 'llm:event') this.logger.debug({obj}, `WsRequestor:request websocket: sent (${url})`);
// If session:reconnect is waiting for ack, hold here until ack to send queuedMsgs
if (this._reconnectPromise) {
try {

package-lock.json (generated; 5589 changed lines)

File diff suppressed because it is too large

View File

@@ -27,14 +27,14 @@
"dependencies": {
"@aws-sdk/client-auto-scaling": "^3.549.0",
"@aws-sdk/client-sns": "^3.549.0",
"@jambonz/db-helpers": "^0.9.16",
"@jambonz/db-helpers": "^0.9.18",
"@jambonz/http-health-check": "^0.0.1",
"@jambonz/mw-registrar": "^0.2.7",
"@jambonz/realtimedb-helpers": "^0.8.15",
"@jambonz/speech-utils": "^0.2.19",
"@jambonz/speech-utils": "^0.2.26",
"@jambonz/stats-collector": "^0.1.10",
"@jambonz/time-series": "^0.2.14",
"@jambonz/verb-specifications": "^0.0.113",
"@jambonz/verb-specifications": "^0.0.122",
"@modelcontextprotocol/sdk": "^1.9.0",
"@opentelemetry/api": "^1.8.0",
"@opentelemetry/exporter-jaeger": "^1.23.0",
@@ -49,12 +49,12 @@
"debug": "^4.3.4",
"deepcopy": "^2.1.0",
"drachtio-fsmrf": "^4.1.2",
"drachtio-srf": "^5.0.5",
"drachtio-srf": "^5.0.14",
"express": "^4.19.2",
"express-validator": "^7.0.1",
"moment": "^2.30.1",
"parse-url": "^9.2.0",
"pino": "^8.20.0",
"pino": "^10.1.0",
"polly-ssml-split": "^0.1.0",
"sdp-transform": "^2.15.0",
"short-uuid": "^5.1.0",

View File

@@ -83,7 +83,8 @@ test('invalid jambonz json create alert tests', async(t) => {
{account_sid: 'bb845d4b-83a9-4cde-a6e9-50f3743bab3f', page: 1, page_size: 25, days: 7});
let checked = false;
for (let i = 0; i < data.total; i++) {
checked = data.data[i].message === 'malformed jambonz payload: must be array'
checked = data.data[i].message === 'malformed jambonz payload: must be array';
if (checked) break;
}
t.ok(checked, 'alert is raised as expected');
disconnect();